Science.gov

Sample records for postcalibration uncertainty analysis

  1. Practical postcalibration uncertainty analysis: Yucca Mountain, Nevada.

    PubMed

    James, Scott C; Doherty, John E; Eddebbarh, Al-Aziz

    2009-01-01

    The values of parameters in a groundwater flow model govern the precision of predictions of future system behavior. Predictive precision, thus, typically depends on an ability to infer values of system properties from historical measurements through calibration. When such data are scarce, or when their information content with respect to parameters that are most relevant to predictions of interest is weak, predictive uncertainty may be high, even if the model is "calibrated." Recent advances help recognize this condition, quantitatively evaluate predictive uncertainty, and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observations will most effectively reduce this uncertainty. We demonstrate linear and nonlinear predictive error/uncertainty analyses as applied to a groundwater flow model of Yucca Mountain, Nevada, the United States' proposed site for disposal of high-level radioactive waste. Linear and nonlinear uncertainty analyses are readily implemented as an adjunct to model calibration with medium to high parameterization density. Linear analysis yields contributions made by each parameter to a prediction's uncertainty and the worth of different observations, both existing and yet-to-be-gathered, toward reducing this uncertainty. Nonlinear analysis provides more accurate characterization of the uncertainty of model predictions while yielding their (approximate) probability distribution functions. This article applies the above methods to a prediction of specific discharge and confirms the uncertainty bounds on specific discharge supplied in the Yucca Mountain Project License Application. Copyright © 2009 Author(s). Journal Compilation © 2009 National Ground Water Association.
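
    A minimal sketch of the linear (first-order, second-moment) step described above, under invented numbers: given a post-calibration parameter covariance C and a vector y of prediction sensitivities, the prediction variance is y^T C y, and each parameter's contribution can be gauged by the variance drop when that parameter is treated as perfectly known. The covariance, sensitivities, and parameter names (k_unit1, etc.) are hypothetical placeholders, not Yucca Mountain values.

      import numpy as np

      # Hypothetical post-calibration covariance of three log-permeability parameters
      C = np.array([[0.30, 0.05, 0.00],
                    [0.05, 0.20, 0.02],
                    [0.00, 0.02, 0.50]])

      # Hypothetical sensitivities of the prediction (specific discharge) to each parameter
      y = np.array([1.2, -0.4, 0.8])

      var_pred = y @ C @ y          # linear propagation: sigma^2 = y^T C y
      print("prediction std dev:", np.sqrt(var_pred))

      # Contribution of each parameter: drop in variance when that parameter is "perfectly known"
      for i, name in enumerate(["k_unit1", "k_unit2", "k_unit3"]):
          C_i = C.copy()
          C_i[i, :] = 0.0
          C_i[:, i] = 0.0
          print(name, "contribution:", var_pred - y @ C_i @ y)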

  2. Practical post-calibration uncertainty analysis: Yucca Mountain, Nevada, USA

    NASA Astrophysics Data System (ADS)

    James, S. C.; Doherty, J.; Eddebbarh, A.

    2009-12-01

    The values of parameters in a groundwater flow model govern the precision of predictions of future system behavior. Predictive precision, thus, typically depends on an ability to infer values of system properties from historical measurements through calibration. When such data are scarce, or when their information content with respect to parameters that are most relevant to predictions of interest is weak, predictive uncertainty may be high, even if the model is “calibrated.” Recent advances help recognize this condition, quantitatively evaluate predictive uncertainty, and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observations will most effectively reduce this uncertainty. We demonstrate linear and nonlinear predictive error/uncertainty analyses as applied to a groundwater flow model of Yucca Mountain, Nevada, the US’s proposed site for disposal of high-level radioactive waste. Both of these types of uncertainty analysis are readily implemented as an adjunct to model calibration with medium to high parameterization density. Linear analysis yields contributions made by each parameter to a prediction’s uncertainty and the worth of different observations, both existing and yet-to-be-gathered, toward reducing this uncertainty. Nonlinear analysis provides more accurate characterization of the uncertainty of model predictions while yielding their (approximate) probability distribution functions. This paper applies the above methods to a prediction of specific discharge and confirms the uncertainty bounds on specific discharge supplied in the Yucca Mountain Project License Application. Furthermore, Monte Carlo simulations confirm that hydrogeologic units thought to be flow barriers have probability distributions skewed toward lower permeabilities.
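
    As a companion to the linear analysis, a toy Monte Carlo sketch of the nonlinear approach: parameters are sampled from their post-calibration distributions, a forward model is run for each realization, and the empirical distribution of the prediction, including any skew, is examined. The forward model, parameter distributions, and numbers below are invented stand-ins, not the Yucca Mountain model.

      import numpy as np

      rng = np.random.default_rng(0)
      n_real = 5000

      # Placeholder: log10 permeability of a barrier unit, post-calibration mean/std assumed
      log_k = rng.normal(loc=-14.0, scale=0.5, size=n_real)

      # Placeholder forward model: specific discharge proportional to permeability times gradient
      gradient = rng.normal(loc=0.01, scale=0.002, size=n_real)
      q = (10.0 ** log_k) * gradient

      lo, med, hi = np.percentile(q, [5, 50, 95])
      print(f"5th/50th/95th percentiles of q: {lo:.2e} {med:.2e} {hi:.2e}")

      # Sample skewness of the prediction distribution
      skew = np.mean((q - q.mean()) ** 3) / q.std() ** 3
      print("skewness:", skew)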

  3. Uncertainty analysis

    SciTech Connect

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
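
    For the linear case Ax = b with a scalar response y = c^T x mentioned above, the adjoint idea is that the sensitivities of y to every entry of b follow from a single adjoint solve A^T λ = c, rather than one perturbed forward solve per input. A small numpy sketch with an arbitrary illustrative matrix and response vector:

      import numpy as np

      A = np.array([[4.0, 1.0, 0.0],
                    [1.0, 3.0, 1.0],
                    [0.0, 1.0, 2.0]])
      b = np.array([1.0, 2.0, 0.5])
      c = np.array([0.0, 0.0, 1.0])        # response y = c^T x picks out x[2]

      x = np.linalg.solve(A, b)
      lam = np.linalg.solve(A.T, c)        # single adjoint solve

      # Sensitivities of y with respect to every entry of b, from one adjoint solve
      dy_db = lam
      print("adjoint sensitivities:", dy_db)

      # Check one coefficient by finite difference
      eps = 1e-6
      b_pert = b.copy(); b_pert[0] += eps
      y_pert = c @ np.linalg.solve(A, b_pert)
      print("finite-difference check:", (y_pert - c @ x) / eps)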

  4. Antarctic Photochemistry: Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Stewart, Richard W.; McConnell, Joseph R.

    1999-01-01

    Understanding the photochemistry of the Antarctic region is important for several reasons. Analysis of ice cores provides historical information on several species such as hydrogen peroxide and sulfur-bearing compounds. The former can potentially provide information on the history of oxidants in the troposphere and the latter may shed light on DMS-climate relationships. Extracting such information requires that we be able to model the photochemistry of the Antarctic troposphere and relate atmospheric concentrations to deposition rates and sequestration in the polar ice. This paper deals with one aspect of the uncertainty inherent in photochemical models of the high latitude troposphere: that arising from imprecision in the kinetic data used in the calculations. Such uncertainties in Antarctic models tend to be larger than those in models of mid to low latitude clean air. One reason is the lower temperatures which result in increased imprecision in kinetic data, assumed to be best characterized at 298K. Another is the inclusion of a DMS oxidation scheme in the present model. Many of the rates in this scheme are less precisely known than are rates in the standard chemistry used in many stratospheric and tropospheric models.
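
    A sketch of how temperature-dependent imprecision in a rate coefficient can be propagated, using the convention common in kinetics data evaluations that the uncertainty factor grows away from 298 K, f(T) = f(298)·exp(g·|1/T − 1/298|), and treating k as lognormally distributed. The Arrhenius parameters and uncertainty factors below are illustrative, not an evaluated reaction.

      import numpy as np

      rng = np.random.default_rng(1)

      def sample_rate(A, E_over_R, T, f298, g, n=10000):
          """Sample an Arrhenius rate k = A*exp(-E/R/T) with a lognormal
          uncertainty factor that widens at temperatures away from 298 K."""
          k0 = A * np.exp(-E_over_R / T)
          f_T = f298 * np.exp(g * abs(1.0 / T - 1.0 / 298.0))
          # Treat ln(f_T) as one standard deviation of ln(k)
          return k0 * np.exp(rng.normal(0.0, np.log(f_T), size=n))

      # Illustrative parameters (not an evaluated reaction)
      k_240 = sample_rate(A=1.0e-12, E_over_R=1500.0, T=240.0, f298=1.2, g=100.0)
      k_298 = sample_rate(A=1.0e-12, E_over_R=1500.0, T=298.0, f298=1.2, g=100.0)

      for label, k in [("240 K", k_240), ("298 K", k_298)]:
          print(label, "2.5-97.5% range:", np.percentile(k, [2.5, 97.5]))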

  5. FRAM's Isotopic Uncertainty Analysis.

    SciTech Connect

    Vo, Duc T.

    2005-01-01

    The Fixed-Energy Response-Function Analysis with Multiple Efficiency (FRAM) code was developed at Los Alamos National Laboratory to determine the isotopic composition of plutonium, uranium, and other actinides by gamma-ray spectrometry. The authors have studied and identified two different kinds of errors in FRAM analysis: random and systematic. The random errors come mainly from counting statistics and are easily determined. The systematic errors can come from a variety of sources and can be very difficult to determine. The authors carefully examined the FRAM analytical results of the archival plutonium data and of the data specifically acquired for this isotopic uncertainty analysis project, and found the relationship between the systematic errors and other parameters. They determined that FRAM's systematic errors could be expressed as functions of the peak resolution and shape, region of analysis, and burnup (for plutonium) or enrichment (for uranium). All other parameters, such as weight, matrix material, shape, size, container, electronics, detector, and input rate, contribute little to the systematic error, or they contribute to the peak resolution and shape, in which case their contributions can be determined from the peak resolution and shape.

  6. Uncertainties in offsite consequence analysis

    SciTech Connect

    Young, M.L.; Harper, F.T.; Lui, C.H.

    1996-03-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the U.S. Nuclear Regulatory Commission and the European Commission began co-sponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables using a formal expert judgment elicitation and evaluation process. This paper focuses on the methods used in and results of this on-going joint effort.

  7. Approaches to highly parameterized inversion: A guide to using PEST for model-parameter and predictive-uncertainty analysis

    USGS Publications Warehouse

    Doherty, John E.; Hunt, Randall J.; Tonkin, Matthew J.

    2010-01-01

    Analysis of the uncertainty associated with parameters used by a numerical model, and with predictions that depend on those parameters, is fundamental to the use of modeling in support of decisionmaking. Unfortunately, predictive uncertainty analysis with regard to models can be very computationally demanding, due in part to complex constraints on parameters that arise from expert knowledge of system properties on the one hand (knowledge constraints) and from the necessity for the model parameters to assume values that allow the model to reproduce historical system behavior on the other hand (calibration constraints). Enforcement of knowledge and calibration constraints on parameters used by a model does not eliminate the uncertainty in those parameters. In fact, in many cases, enforcement of calibration constraints simply reduces the uncertainties associated with a number of broad-scale combinations of model parameters that collectively describe spatially averaged system properties. The uncertainties associated with other combinations of parameters, especially those that pertain to small-scale parameter heterogeneity, may not be reduced through the calibration process. To the extent that a prediction depends on system-property detail, its postcalibration variability may be reduced very little, if at all, by applying calibration constraints; knowledge constraints remain the only limits on the variability of predictions that depend on such detail. Regrettably, in many common modeling applications, these constraints are weak. Though the PEST software suite was initially developed as a tool for model calibration, recent developments have focused on the evaluation of model-parameter and predictive uncertainty. As a complement to functionality that it provides for highly parameterized inversion (calibration) by means of formal mathematical regularization techniques, the PEST suite provides utilities for linear and nonlinear error-variance and uncertainty analysis in
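
    A sketch of the kind of linear data-worth calculation that such post-calibration utilities perform: starting from a prior (knowledge-constraint) parameter covariance and the Jacobian of existing observations, the worth of a candidate observation is measured by how much it reduces the predictive variance. All matrices below are small invented placeholders, not PEST output.

      import numpy as np

      def posterior_cov(C_prior, J, obs_var):
          """Linear-Bayes parameter covariance after assimilating observations
          with Jacobian J (rows = observations) and independent noise obs_var."""
          R_inv = np.diag(1.0 / np.asarray(obs_var))
          return np.linalg.inv(np.linalg.inv(C_prior) + J.T @ R_inv @ J)

      C_prior = np.diag([1.0, 1.0, 2.0])                 # knowledge constraints
      J_exist = np.array([[1.0, 0.5, 0.0]])              # existing head observation
      y = np.array([0.3, -0.2, 1.5])                     # prediction sensitivities

      C_cal = posterior_cov(C_prior, J_exist, [0.01])
      var_cal = y @ C_cal @ y

      # Worth of a candidate new observation sensitive to the third parameter
      J_new = np.vstack([J_exist, [0.0, 0.1, 1.0]])
      C_new = posterior_cov(C_prior, J_new, [0.01, 0.01])
      var_new = y @ C_new @ y

      print("predictive variance, calibrated:      ", var_cal)
      print("predictive variance, with new obs:    ", var_new)
      print("variance reduction (worth of new obs):", var_cal - var_new)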

  8. ENHANCED UNCERTAINTY ANALYSIS FOR SRS COMPOSITE ANALYSIS

    SciTech Connect

    Smith, F.; Phifer, M.

    2011-06-30

    The Composite Analysis (CA) performed for the Savannah River Site (SRS) in 2009 (SRS CA 2009) included a simplified uncertainty analysis. The uncertainty analysis in the CA (Smith et al. 2009b) was limited to considering at most five sources in a separate uncertainty calculation performed for each POA. To perform the uncertainty calculations in a reasonable amount of time, the analysis was limited to 400 realizations and 2,000 years of simulated transport time, and the time steps used for the uncertainty analysis were increased from those used in the CA base case analysis. As part of the CA maintenance plan, the Savannah River National Laboratory (SRNL) committed to improving the CA uncertainty/sensitivity analysis. The previous uncertainty analysis was constrained by the standard GoldSim licensing, which limits the user to running at most four Monte Carlo uncertainty calculations (also called realizations) simultaneously. Some of the limitations on the number of realizations that could be practically run and on the simulation time steps were removed by building a cluster of three HP ProLiant Windows servers with a total of 36 64-bit processors and by licensing the GoldSim DP-Plus distributed processing software. This allowed running as many as 35 realizations simultaneously (one processor is reserved as a master process that controls running the realizations). These enhancements to SRNL computing capabilities made it feasible to perform an uncertainty analysis using 1,000 realizations, the time steps employed in the base case CA calculations, more sources, and a 10,000-year simulated radionuclide transport time. In addition, an importance screening analysis was performed to identify the class of stochastic variables that have the most significant impact on model uncertainty. This analysis ran the uncertainty model separately, testing the response to variations in the following five sets of model parameters: (a) K{sub d} values (72 parameters for the 36 CA elements in
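
    The scaling idea described above (many realizations farmed out to many processors) can be sketched with Python's standard multiprocessing pool; the "transport model" here is a trivial invented stand-in for the GoldSim calculation, and the sampled parameter and dose response are placeholders.

      import numpy as np
      from multiprocessing import Pool

      def run_realization(seed):
          """Stand-in for one stochastic transport realization."""
          rng = np.random.default_rng(seed)
          kd = rng.lognormal(mean=1.0, sigma=0.5)        # sampled sorption coefficient
          dose = 100.0 / (1.0 + kd)                      # toy dose response
          return dose

      if __name__ == "__main__":
          n_realizations = 1000
          with Pool(processes=8) as pool:                # one worker per available core
              doses = pool.map(run_realization, range(n_realizations))
          print("mean dose:", np.mean(doses), " 95th percentile:", np.percentile(doses, 95))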

  9. Uncertainty analysis in seismic tomography

    NASA Astrophysics Data System (ADS)

    Owoc, Bartosz; Majdański, Mariusz

    2017-04-01

    The velocity field from seismic travel-time tomography depends on several factors such as regularization, inversion path, and model parameterization. The result also strongly depends on the initial velocity model and on the precision of travel-time picking. In this research we test the dependence on the starting model in layered tomography and compare it with the effect of picking precision. Moreover, in our analysis the uncertainty distribution for manual travel-time picking is asymmetric, an effect that shifts the results toward faster velocities. For the calculations we use the JIVE3D travel-time tomographic code. We used data from geo-engineering and industrial-scale investigations, which were collected by our team from IG PAS.

  10. Risk Analysis and Uncertainty: Implications for Counselling

    ERIC Educational Resources Information Center

    Hassenzahl, David

    2004-01-01

    Over the past two decades, the risk analysis community has made substantial advances in understanding and describing uncertainty. Uncertainty is ubiquitous, complex, both quantitative and qualitative in nature, and often irreducible. Uncertainty thus creates a challenge when using risk analysis to evaluate the rationality of group and individual…

  11. Facility Measurement Uncertainty Analysis at NASA GRC

    NASA Technical Reports Server (NTRS)

    Stephens, Julia; Hubbard, Erin

    2016-01-01

    This presentation provides an overview of the measurement uncertainty analysis currently being implemented in various facilities at NASA GRC. It includes examples pertinent to the turbine engine community (mass flow and fan efficiency calculation uncertainties).
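
    A sketch of the kind of measurement-uncertainty propagation mentioned for mass flow: for an orifice-type relation mdot = C*A*sqrt(2*rho*dp), first-order relative sensitivities are combined in root-sum-square. The coefficients and uncertainty values below are illustrative only, not facility numbers.

      import numpy as np

      # Illustrative measured values and standard uncertainties
      C, u_C = 0.62, 0.01          # discharge coefficient
      A, u_A = 1.0e-3, 5e-6        # throat area, m^2
      rho, u_rho = 1.20, 0.01      # density, kg/m^3
      dp, u_dp = 5000.0, 25.0      # differential pressure, Pa

      def mdot(C, A, rho, dp):
          return C * A * np.sqrt(2.0 * rho * dp)

      m = mdot(C, A, rho, dp)

      # Relative sensitivity coefficients for this functional form: 1, 1, 1/2, 1/2
      rel_u = np.sqrt((u_C / C) ** 2 + (u_A / A) ** 2
                      + (0.5 * u_rho / rho) ** 2 + (0.5 * u_dp / dp) ** 2)
      print(f"mass flow = {m:.4f} kg/s, relative standard uncertainty = {100*rel_u:.2f}%")
      print(f"expanded uncertainty (k=2, ~95%): {2*rel_u*m:.5f} kg/s")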

  12. Uncertainty Analysis in Environmental Modeling Made Easy

    NASA Astrophysics Data System (ADS)

    Pappenberger, Florian; Harvey, Hamish; Beven, Keith; Hall, Jim

    2007-01-01

    Uncertainty analysis assesses the uncertainty in numerical model outputs that arises from ambiguity in model structures, parameters, boundary conditions, and evaluation data. An analysis of the impact of uncertainties should be undertaken in every environmental modeling exercise. Many techniques exist, however, and each requires an investment of time and resources to learn. The potential analyst is therefore faced with the difficult question of which technique is best to use, and may be put off entirely.

  13. Uncertainty Analysis Principles and Methods

    DTIC Science & Technology

    2007-09-01

    ... NAVAL AIR WARFARE CENTER WEAPONS DIVISION, CHINA LAKE; NAVAL AIR WARFARE CENTER AIRCRAFT DIVISION, PATUXENT RIVER; NAVAL UNDERSEA WARFARE CENTER DIVISION ... total systematic uncertainties be combined in RSS. In many instances, the Student's t-statistic, t95, is set equal to 2 and U_RSS is replaced by U95 ... GUM, the total uncertainty (U_ADD, U_RSS, or U95) was offered as a type of confidence limit: x − U95 ≤ true value ≤ x + U95. In some respects, these limits
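
    The snippet above refers to combining systematic and random components in root-sum-square and expanding with t95 ≈ 2; a minimal worked example with invented elemental uncertainties:

      import math

      # Invented elemental standard uncertainties
      random_components = [0.10, 0.05]        # e.g., repeatability, resolution
      systematic_components = [0.08, 0.12]    # e.g., calibration, installation bias

      s_random = math.sqrt(sum(u**2 for u in random_components))
      b_systematic = math.sqrt(sum(u**2 for u in systematic_components))

      u_combined = math.sqrt(s_random**2 + b_systematic**2)   # RSS combination
      t95 = 2.0                                               # large-sample approximation
      U95 = t95 * u_combined                                  # expanded uncertainty

      x = 10.00                                               # measured value
      print(f"U95 = {U95:.3f};  interval: {x - U95:.3f} <= true value <= {x + U95:.3f}")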

  14. Uncertainty

    USGS Publications Warehouse

    Hunt, Randall J.

    2012-01-01

    Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.

  15. Sensitivity analysis of uncertainty in model prediction.

    PubMed

    Russi, Trent; Packard, Andrew; Feeley, Ryan; Frenklach, Michael

    2008-03-27

    Data Collaboration is a framework designed to make inferences from experimental observations in the context of an underlying model. In the prior studies, the methodology was applied to prediction on chemical kinetics models, consistency of a reaction system, and discrimination among competing reaction models. The present work advances Data Collaboration by developing sensitivity analysis of uncertainty in model prediction with respect to uncertainty in experimental observations and model parameters. Evaluation of sensitivity coefficients is performed alongside the solution of the general optimization ansatz of Data Collaboration. The obtained sensitivity coefficients allow one to determine which experiment/parameter uncertainty contributes the most to the uncertainty in model prediction, rank such effects, consider new or even hypothetical experiments to perform, and combine the uncertainty analysis with the cost of uncertainty reduction, thereby providing guidance in selecting an experimental/theoretical strategy for community action.

  16. Uncertainty quantification and error analysis

    SciTech Connect

    Higdon, Dave M; Anderson, Mark C; Habib, Salman; Klein, Richard; Berliner, Mark; Covey, Curt; Ghattas, Omar; Graziani, Carlo; Seager, Mark; Sefcik, Joseph; Stark, Philip

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  17. Uncertainty analysis for Ulysses safety evaluation report

    NASA Technical Reports Server (NTRS)

    Frank, Michael V.

    1991-01-01

    As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermal Generator, the Interagency Nuclear Safety Review Panel (INSRP) performed an integrated, quantitative analysis of the uncertainties of the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty of the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release associated with calculated launch and deployment accident scenarios is low.

  18. Uncertainty Analysis of Instrument Calibration and Application

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimation of both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
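
    A sketch of expressing calibration uncertainty as a function of the measurement: fit a calibration curve by least squares, keep the coefficient covariance, and propagate it to any reading. This uses numpy's polynomial fit with covariance output; the simulated sensor data are invented, and reading noise is ignored for brevity.

      import numpy as np

      rng = np.random.default_rng(2)

      # Simulated calibration: applied load vs. sensor output with noise
      applied = np.linspace(0.0, 100.0, 21)
      output = 0.05 * applied + 0.2 + rng.normal(0.0, 0.05, applied.size)

      # Linear calibration fit and coefficient covariance
      coef, cov = np.polyfit(output, applied, deg=1, cov=True)

      def calibrated(reading):
          """Calibrated value and its standard uncertainty from the fit covariance."""
          g = np.array([reading, 1.0])                  # gradient of slope*reading + intercept
          value = g @ coef
          u = np.sqrt(g @ cov @ g)
          return value, u

      for r in (0.5, 2.5, 5.0):
          v, u = calibrated(r)
          print(f"reading {r:4.1f} -> {v:7.2f} +/- {2*u:.2f} (approx. 95%)")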

  19. The need for model uncertainty analysis

    USDA-ARS?s Scientific Manuscript database

    Phosphorous (P) loss models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. All P loss models, however, have an inherent amount of uncertainty associated with them. In this study, we conducted an uncertainty analysis with ...

  20. Dealing with Uncertainty in Chemical Risk Analysis

    DTIC Science & Technology

    1988-12-01

    DEALING WITH UNCERTAINTY IN CHEMICAL RISK ANALYSIS. Thesis by David S. Clement, Captain, USAF (AFIT/GOR/MA/88D-2). Approved for public release; distribution unlimited. ... Presented to the Faculty

  1. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to get an accurate result, so convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
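
    A sketch of the interpolation part of the convergence question: evaluate a known dose-depth curve on progressively finer thickness grids, interpolate back to a fine reference, and watch the interpolation error converge. The exponential dose-depth curve below is an invented stand-in for a transport-code result.

      import numpy as np

      def dose(depth):
          """Stand-in dose vs. shield depth curve (depth in g/cm^2, arbitrary dose units)."""
          return 50.0 * np.exp(-depth / 20.0) + 5.0

      reference = np.linspace(0.0, 100.0, 2001)
      truth = dose(reference)

      for n_thicknesses in (5, 9, 17, 33, 65):
          grid = np.linspace(0.0, 100.0, n_thicknesses)
          interp = np.interp(reference, grid, dose(grid))
          err = np.max(np.abs(interp - truth))
          print(f"{n_thicknesses:3d} thicknesses -> max interpolation error {err:.4f}")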

  2. Uncertainty Prediction in Passive Target Motion Analysis

    DTIC Science & Technology

    2016-05-12

    UNCERTAINTY PREDICTION IN PASSIVE TARGET MOTION ANALYSIS. STATEMENT OF GOVERNMENT INTEREST: The invention described herein ... uncertainty. (2) Description of the Prior Art: In the bearing-only target motion analysis (TMA) problem, one must estimate the position and ... very often inadequate for characterizing the accuracy of estimates when data quality is low and the estimation problem is nonlinear. In recent years

  3. Uncertainty Analysis for Photovoltaic Degradation Rates (Poster)

    SciTech Connect

    Jordan, D.; Kurtz, S.; Hansen, C.

    2014-04-01

    Dependable and predictable energy production is the key to the long-term success of the PV industry. Over their lifetime of exposure, PV systems show a gradual decline in performance that depends on many different factors such as module technology, module type, mounting configuration, climate, etc. When degradation rates are determined from continuous data, the statistical uncertainty is easily calculated from the regression coefficients. However, the total uncertainty, which includes measurement uncertainty and instrumentation drift, is far more difficult to determine. A Monte Carlo simulation approach was chosen to investigate a comprehensive uncertainty analysis. The most important factor for degradation rates is to avoid instrumentation that changes over time in the field. For instance, a drifting irradiance sensor, which can be avoided through regular calibration, can lead to substantially erroneous degradation rates. However, the accuracy of the irradiance sensor has negligible impact on degradation rate uncertainty, emphasizing that precision (relative accuracy) is more important than absolute accuracy.
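
    A toy Monte Carlo version of the comparison described above: the statistical uncertainty of the regression slope is compared with the total uncertainty when a slow sensor drift is added to synthetic performance data. All numbers are illustrative, not measured PV data.

      import numpy as np

      rng = np.random.default_rng(3)
      years = np.linspace(0.0, 5.0, 61)                    # monthly data, 5 years
      true_rate = -0.8                                      # %/year degradation

      def fitted_rate(drift_per_year):
          perf = 100.0 + true_rate * years + rng.normal(0.0, 0.5, years.size)
          perf += drift_per_year * years                    # instrumentation drift
          slope, _ = np.polyfit(years, perf, 1)
          return slope

      no_drift = np.array([fitted_rate(0.0) for _ in range(2000)])
      with_drift = np.array([fitted_rate(rng.normal(0.0, 0.3)) for _ in range(2000)])

      print("rate, no drift   : %.2f +/- %.2f %%/yr" % (no_drift.mean(), no_drift.std()))
      print("rate, with drift : %.2f +/- %.2f %%/yr" % (with_drift.mean(), with_drift.std()))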

  4. Approach to uncertainty in risk analysis

    SciTech Connect

    Rish, W.R.

    1988-08-01

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities and on effectively communicating uncertainty analysis results is included. Examples from actual applications are presented.

  5. Robustness analysis for real parametric uncertainty

    NASA Technical Reports Server (NTRS)

    Sideris, Athanasios

    1989-01-01

    Some key results in the literature in the area of robustness analysis for linear feedback systems with structured model uncertainty are reviewed. Some new results are given. Model uncertainty is described as a combination of real uncertain parameters and norm bounded unmodeled dynamics. Here the focus is on the case of parametric uncertainty. An elementary and unified derivation of the celebrated theorem of Kharitonov and the Edge Theorem is presented. Next, an algorithmic approach for robustness analysis in the cases of multilinear and polynomic parametric uncertainty (i.e., the closed loop characteristic polynomial depends multilinearly and polynomially respectively on the parameters) is given. The latter cases are most important from practical considerations. Some novel modifications in this algorithm, which result in a procedure with polynomial-time behavior in the number of uncertain parameters, are outlined. Finally, it is shown how the more general problem of robustness analysis for combined parametric and dynamic (i.e., unmodeled dynamics) uncertainty can be reduced to the case of polynomic parametric uncertainty, and thus be solved by means of the algorithm.
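
    For the interval-polynomial case covered by Kharitonov's theorem, robust stability reduces to checking four extreme polynomials. A small sketch (coefficient intervals invented) that builds the four Kharitonov polynomials and checks that all of their roots lie in the open left half-plane:

      import numpy as np
      from itertools import cycle

      def kharitonov_polys(lower, upper):
          """Four Kharitonov polynomials for coefficient intervals given in
          ascending powers of s: p(s) = a0 + a1*s + a2*s^2 + ..."""
          patterns = ["lluu", "uull", "luul", "ullu"]     # repeating lower/upper selection patterns
          polys = []
          for pat in patterns:
              sel = cycle(pat)
              coeffs = [lower[i] if next(sel) == "l" else upper[i]
                        for i in range(len(lower))]
              polys.append(coeffs)
          return polys

      # Invented coefficient intervals, ascending powers: a0 + a1*s + a2*s^2 + a3*s^3
      lower = [1.0, 3.0, 2.5, 0.8]
      upper = [2.0, 4.0, 3.5, 1.2]

      robustly_stable = True
      for coeffs in kharitonov_polys(lower, upper):
          roots = np.roots(coeffs[::-1])                  # np.roots expects descending powers
          stable = np.all(roots.real < 0)
          robustly_stable &= bool(stable)
          print(coeffs, "stable:", stable)
      print("interval polynomial robustly stable:", robustly_stable)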

  6. Uncertainty analysis for Ulysses safety evaluation report

    SciTech Connect

    Frank, M.V.

    1991-01-01

    As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermal Generator (GPHS-RTG), the Interagency Nuclear Safety Review Panel (INSRP) and the author performed an integrated, quantitative analysis of the uncertainties of the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty of the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release associated with calculated launch and deployment accident scenarios is low.

  7. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    SciTech Connect

    Haihua Zhao; Vincent A. Mousseau

    2011-09-01

    Verification and validation (V&V) are playing more important roles in quantifying uncertainties and realizing high-fidelity simulations in engineering system analyses, such as transients in a complex nuclear reactor system. Traditional V&V in reactor system analysis focused more on the validation part or did not differentiate verification from validation. The traditional approach to uncertainty quantification is based on a 'black box' approach. The simulation tool is treated as an unknown signal generator: a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. The 'black box' method mixes numerical errors with all other uncertainties. It is also inefficient for sensitivity analysis. In contrast to the 'black box' method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In these types of approaches, equations for the propagation of uncertainty are constructed and the sensitivities are directly solved for as variables in the simulation. This paper presents forward sensitivity analysis as a method to help uncertainty quantification. By including the time step and potentially the spatial step as special sensitivity parameters, the forward sensitivity method is extended as one method to quantify numerical errors. Note that by integrating local truncation errors over the whole system through the forward sensitivity analysis process, the generated time step and spatial step sensitivity information reflect global numerical errors. The discretization errors can be systematically compared against uncertainties due to other physical parameters. This extension makes the forward sensitivity method a much more powerful tool to help uncertainty quantification. By knowing the relative sensitivity of time and space steps with other interested physical parameters, the simulation is allowed
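
    A minimal sketch of forward sensitivity analysis on a single ODE: the state equation is augmented with the sensitivity equation dS/dt = (df/dy)·S + df/dp, and both are integrated together so the sensitivity of the solution to a parameter comes out of the same run. The decay model and parameter values are placeholders, not a reactor system code.

      import numpy as np
      from scipy.integrate import solve_ivp

      def rhs(t, z, lam):
          """Augmented system: y' = -lam*y + 1;  S = dy/dlam obeys S' = -lam*S - y."""
          y, S = z
          dy = -lam * y + 1.0
          dS = -lam * S - y
          return [dy, dS]

      lam = 0.5
      sol = solve_ivp(rhs, (0.0, 10.0), [2.0, 0.0], args=(lam,), rtol=1e-8, atol=1e-10)

      y_end, S_end = sol.y[:, -1]
      print("y(10) =", y_end, "  dy(10)/dlam =", S_end)

      # Finite-difference check of the computed sensitivity
      eps = 1e-6
      y_pert = solve_ivp(rhs, (0.0, 10.0), [2.0, 0.0], args=(lam + eps,),
                         rtol=1e-8, atol=1e-10).y[0, -1]
      print("finite-difference estimate:", (y_pert - y_end) / eps)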

  8. Identifying sources of uncertainty using covariance analysis

    NASA Astrophysics Data System (ADS)

    Hyslop, N. P.; White, W. H.

    2010-12-01

    Atmospheric aerosol monitoring often includes performing multiple analyses on a collected sample. Some common analyses resolve suites of elements or compounds (e.g., spectrometry, chromatography). Concentrations are determined through multi-step processes involving sample collection, physical or chemical analysis, and data reduction. Uncertainties in the individual steps propagate into uncertainty in the calculated concentration. The assumption in most treatments of measurement uncertainty is that errors in the various species concentrations measured in a sample are random and therefore independent of each other. This assumption is often not valid in speciated aerosol data because some errors can be common to multiple species. For example, an error in the sample volume will introduce a common error into all species concentrations determined in the sample, and these errors will correlate with each other. Measurement programs often use paired (collocated) measurements to characterize the random uncertainty in their measurements. Suites of paired measurements provide an opportunity to go beyond the characterization of measurement uncertainties in individual species to examine correlations amongst the measurement uncertainties in multiple species. This additional information can be exploited to distinguish sources of uncertainty that affect all species from those that only affect certain subsets or individual species. Data from the Interagency Monitoring of Protected Visual Environments (IMPROVE) program are used to illustrate these ideas. Nine analytes commonly detected in the IMPROVE network were selected for this analysis. The errors in these analytes can be reasonably modeled as multiplicative, and the natural log of the ratio of concentrations measured on the two samplers provides an approximation of the error. Figure 1 shows the covariation of these log ratios among the different analytes for one site. Covariance is strongest amongst the dust element (Fe, Ca, and
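
    The collocated-sampler idea can be sketched in a few lines: for each analyte, take the natural log of the ratio of concentrations measured on the two samplers, then look at the correlation of these log ratios across analytes; shared errors (e.g., in sample volume) show up as positive correlations. The data below are simulated, not IMPROVE measurements.

      import numpy as np

      rng = np.random.default_rng(4)
      n_samples, analytes = 500, ["Fe", "Ca", "Si", "S"]

      true = rng.lognormal(mean=0.0, sigma=1.0, size=(n_samples, len(analytes)))
      vol_err_a = rng.normal(0.0, 0.05, size=(n_samples, 1))   # shared volume error, sampler A
      vol_err_b = rng.normal(0.0, 0.05, size=(n_samples, 1))   # shared volume error, sampler B
      noise_a = rng.normal(0.0, 0.03, size=true.shape)          # analyte-specific noise
      noise_b = rng.normal(0.0, 0.03, size=true.shape)

      conc_a = true * np.exp(vol_err_a + noise_a)
      conc_b = true * np.exp(vol_err_b + noise_b)

      log_ratio = np.log(conc_a / conc_b)          # multiplicative error model
      corr = np.corrcoef(log_ratio, rowvar=False)  # correlation of errors across analytes
      print(analytes)
      print(np.round(corr, 2))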

  9. Uncertainty Assessment in Life Cycle Cost Analysis.

    DTIC Science & Technology

    1985-05-01

    ... Unclassified ... DISTRIBUTION STATEMENT: Approved for public release ... data base oriented. 7. Risk Analysis and Decision Models in the Planning of Housing Projects, by Jorge A. Machado, Report No. R72-44, Structures ... (1979). Lewis, L., "Range Estimating -- Managing Uncertainty," AACE Bulletin, Vol. 19, No. 6, Nov/Dec 1977. Machado, J. A., "Risk Analysis and Decision

  10. Uncertainty Analysis for a Jet Flap Airfoil

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Cruz, Josue

    2006-01-01

    An analysis of variance (ANOVA) study was performed to quantify the potential uncertainties of lift and pitching moment coefficient calculations from a computational fluid dynamics code, relative to an experiment, for a jet flap airfoil configuration. Uncertainties due to a number of factors, including grid density, angle of attack, and jet flap blowing coefficient, were examined. The ANOVA software produced a numerical model of the input coefficient data, as functions of the selected factors, to a user-specified order (linear, 2-factor interaction, quadratic, or cubic). Residuals between the model and actual data were also produced at each of the input conditions, and uncertainty confidence intervals (in the form of Least Significant Differences or LSD) for experimental, computational, and combined experimental / computational data sets were computed. The LSD bars indicate the smallest resolvable differences in the functional values (lift or pitching moment coefficient) attributable solely to changes in the independent variables, given just the input data points from selected data sets. The software also provided a collection of diagnostics which evaluate the suitability of the input data set for use within the ANOVA process, and which examine the behavior of the resultant data, possibly suggesting transformations which should be applied to the data to reduce the LSD. The results illustrate some of the key features of, and results from, the uncertainty analysis studies, including the use of both numerical (continuous) and categorical (discrete) factors, the effects of the number and range of the input data points, and the effects of the number of factors considered simultaneously.
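
    A sketch of the ANOVA step described above, fitting lift coefficient as a quadratic response-surface function of two factors and producing an analysis-of-variance table. The input data are synthetic, the true coefficients are invented, and pandas/statsmodels are assumed to be available.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf
      from statsmodels.stats.anova import anova_lm

      rng = np.random.default_rng(5)

      # Synthetic stand-in for CFD/experimental lift data over two factors
      alpha = np.repeat(np.array([0.0, 2.0, 4.0, 6.0]), 4)        # angle of attack, deg
      c_mu = np.tile(np.array([0.00, 0.02, 0.04, 0.06]), 4)        # blowing coefficient
      cl = 0.1 + 0.09 * alpha + 8.0 * c_mu + rng.normal(0, 0.01, alpha.size)

      df = pd.DataFrame({"alpha": alpha, "c_mu": c_mu, "cl": cl})

      # Quadratic response-surface model with a two-factor interaction term
      model = smf.ols("cl ~ alpha + c_mu + alpha:c_mu + I(alpha**2) + I(c_mu**2)", data=df).fit()
      print(anova_lm(model, typ=2))
      print("residual std dev:", np.sqrt(model.scale))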

  11. Systematic Analysis Of Ocean Colour Uncertainties

    NASA Astrophysics Data System (ADS)

    Lavender, Samantha

    2013-12-01

    This paper reviews current research into the estimation of uncertainties as a pixel-based measure to aid non-specialist users of remote sensing products. An example MERIS image, captured on 28 March 2012, was processed with an above-water atmospheric correction code. This was initially based on both the Antoine & Morel Standard Atmospheric Correction, with a Bright Pixel correction component, and the Doerffer Neural Network coastal waters approach. It was shown that analysis of the atmospheric by-products yields important information about the separation of the atmospheric and in-water signals, helping to signpost possible uncertainties in the atmospheric correction results. Further analysis has concentrated on implementing a 'simplistic' atmospheric correction so that the impact of changing the input auxiliary data can be analysed; the influence of changing surface pressure is demonstrated. Future work will focus on automating the analysis, so that the methodology can be implemented within an operational system.

  12. Uncertainty Analysis of Decomposing Polyurethane Foam

    NASA Technical Reports Server (NTRS)

    Hobbs, Michael L.; Romero, Vicente J.

    2000-01-01

    Sensitivity/uncertainty analyses are necessary to determine where to allocate resources for improved predictions in support of our nation's nuclear safety mission. Yet, sensitivity/uncertainty analyses are not commonly performed on complex combustion models because the calculations are time consuming, CPU intensive, nontrivial exercises that can lead to deceptive results. To illustrate these ideas, a variety of sensitivity/uncertainty analyses were used to determine the uncertainty associated with thermal decomposition of polyurethane foam exposed to high radiative flux boundary conditions. The polyurethane used in this study is a rigid closed-cell foam used as an encapsulant. Related polyurethane binders such as Estane are used in many energetic materials of interest to the JANNAF community. The complex, finite element foam decomposition model used in this study has 25 input parameters that include chemistry, polymer structure, and thermophysical properties. The response variable was selected as the steady-state decomposition front velocity calculated as the derivative of the decomposition front location versus time. An analytical mean value sensitivity/uncertainty (MV) analysis was used to determine the standard deviation by taking numerical derivatives of the response variable with respect to each of the 25 input parameters. Since the response variable is also a derivative, the standard deviation was essentially determined from a second derivative that was extremely sensitive to numerical noise. To minimize the numerical noise, 50-micrometer element dimensions and approximately 1-msec time steps were required to obtain stable uncertainty results. As an alternative method to determine the uncertainty and sensitivity in the decomposition front velocity, surrogate response surfaces were generated for use with a constrained Latin Hypercube Sampling (LHS) technique. Two surrogate response surfaces were investigated: 1) a linear surrogate response surface (LIN) and 2
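
    A compact sketch of the constrained-sampling idea using a Latin hypercube: scipy's quasi-Monte Carlo module draws a space-filling sample over the input parameters, which is then propagated through a cheap surrogate of the response. The surrogate function, parameter names, and ranges are invented, not the foam decomposition model.

      import numpy as np
      from scipy.stats import qmc

      sampler = qmc.LatinHypercube(d=3, seed=6)
      unit_sample = sampler.random(n=200)                      # 200 points in [0, 1]^3

      # Scale to illustrative parameter ranges: kinetic rate, conductivity, emissivity
      lower, upper = [1.0e3, 0.02, 0.7], [5.0e3, 0.06, 0.95]
      params = qmc.scale(unit_sample, lower, upper)

      def front_velocity_surrogate(p):
          """Invented linear surrogate for decomposition-front velocity (mm/s)."""
          rate, cond, emis = p.T
          return 1.0e-4 * rate + 5.0 * cond - 0.2 * emis + 0.5

      v = front_velocity_surrogate(params)
      print("mean front velocity:", v.mean(), " std dev:", v.std())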

  13. LCA data quality: sensitivity and uncertainty analysis.

    PubMed

    Guo, M; Murphy, R J

    2012-10-01

    Life cycle assessment (LCA) data quality issues were investigated by using case studies on products from starch-polyvinyl alcohol based biopolymers and petrochemical alternatives. The time horizon chosen for the characterization models was shown to be an important sensitive parameter for the environmental profiles of all the polymers. In the global warming potential and the toxicity potential categories, the comparison between biopolymers and petrochemical counterparts altered as the time horizon extended from 20 years to infinite time. These case studies demonstrated that the use of a single time horizon provides only one perspective on the LCA outcomes, which could introduce an inadvertent bias into LCA outcomes, especially in toxicity impact categories; thus dynamic LCA characterization models with varying time horizons are recommended as a measure of robustness for LCAs, especially comparative assessments. This study also presents an approach to integrate statistical methods into LCA models for analyzing uncertainty in industrial and computer-simulated datasets. We calibrated probabilities for the LCA outcomes for biopolymer products arising from uncertainty in the inventory and from data variation characteristics; this has enabled assigning confidence to the LCIA outcomes in specific impact categories for the biopolymer vs. petrochemical polymer comparisons undertaken. Uncertainty analysis combined with the sensitivity analysis carried out in this study has led to a transparent increase in confidence in the LCA findings. We conclude that LCAs lacking explicit interpretation of the degree of uncertainty and sensitivities are of limited value as robust evidence for decision making or comparative assertions.

  14. Parameter Uncertainty for Repository Thermal Analysis

    SciTech Connect

    Hardin, Ernest; Hadgu, Teklu; Greenberg, Harris; Dupont, Mark

    2015-10-01

    This report is one follow-on to a study of reference geologic disposal design concepts (Hardin et al. 2011a). Based on an analysis of maximum temperatures, that study concluded that certain disposal concepts would require extended decay storage prior to emplacement, or the use of small waste packages, or both. The study used nominal values for thermal properties of host geologic media and engineered materials, demonstrating the need for uncertainty analysis to support the conclusions. This report is a first step that identifies the input parameters of the maximum temperature calculation, surveys published data on measured values, uses an analytical approach to determine which parameters are most important, and performs an example sensitivity analysis. Using results from this first step, temperature calculations planned for FY12 can focus on only the important parameters, and can use the uncertainty ranges reported here. The survey of published information on thermal properties of geologic media and engineered materials is intended to be sufficient for use in generic calculations to evaluate the feasibility of reference disposal concepts. A full compendium of literature data is beyond the scope of this report. The term “uncertainty” is used here to represent both measurement uncertainty and spatial variability, or variability across host geologic units. For the most important parameters (e.g., buffer thermal conductivity) the extent of literature data surveyed samples these different forms of uncertainty and variability. Finally, this report is intended to be one chapter or section of a larger FY12 deliverable summarizing all the work on design concepts and thermal load management for geologic disposal (M3FT-12SN0804032, due 15Aug2012).

  15. Confronting deep uncertainties in risk analysis.

    PubMed

    Cox, Louis Anthony

    2012-10-01

    How can risk analysts help to improve policy and decision making when the correct probabilistic relation between alternative acts and their probable consequences is unknown? This practical challenge of risk management with model uncertainty arises in problems from preparing for climate change to managing emerging diseases to operating complex and hazardous facilities safely. We review constructive methods for robust and adaptive risk analysis under deep uncertainty. These methods are not yet as familiar to many risk analysts as older statistical and model-based methods, such as the paradigm of identifying a single "best-fitting" model and performing sensitivity analyses for its conclusions. They provide genuine breakthroughs for improving predictions and decisions when the correct model is highly uncertain. We demonstrate their potential by summarizing a variety of practical risk management applications.

  16. Representing uncertainty on model analysis plots

    NASA Astrophysics Data System (ADS)

    Smith, Trevor I.

    2016-12-01

    Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao's original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.

  17. Coupled semivariogram uncertainty of hydrogeological and geophysical data on capture zone uncertainty analysis

    USGS Publications Warehouse

    Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.

    2008-01-01

    This study investigates capture zone uncertainty that relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of structural parameters. The probability distributions of structural parameters were further updated through the Bayesian approach with the Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and pseudo-cross-semivariogram. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in Southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. © 2008 ASCE.
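
    A sketch of propagating structural-parameter (range, sill, nugget) uncertainty into the semivariogram itself: parameters are drawn from scaled beta distributions, echoing the prior model described above, and the spread of an exponential semivariogram at a fixed lag is examined. The bounds, shape parameters, and semivariogram form are invented illustrations, not the Chicot aquifer values.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 2000

      def scaled_beta(a, b, lo, hi, size):
          """Beta(a, b) prior rescaled to the interval [lo, hi]."""
          return lo + (hi - lo) * rng.beta(a, b, size)

      rang = scaled_beta(2.0, 2.0, 500.0, 3000.0, n)    # correlation range, m
      sill = scaled_beta(2.0, 3.0, 0.2, 1.0, n)         # sill of the log-K semivariogram
      nugget = scaled_beta(1.5, 4.0, 0.0, 0.1, n)       # nugget

      def exp_semivariogram(h, rang, sill, nugget):
          return nugget + sill * (1.0 - np.exp(-3.0 * h / rang))

      gamma_1km = exp_semivariogram(1000.0, rang, sill, nugget)
      print("semivariance at 1 km lag, 5th/50th/95th percentiles:",
            np.percentile(gamma_1km, [5, 50, 95]))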

  18. Pretest uncertainty analysis for chemical rocket engine tests

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.

    1987-01-01

    A parametric pretest uncertainty analysis has been performed for a chemical rocket engine test at a unique 1000:1 area ratio altitude test facility. Results from the parametric study provide the error limits required in order to maintain a maximum uncertainty of 1 percent on specific impulse. Equations used in the uncertainty analysis are presented.

  19. Representation of analysis results involving aleatory and epistemic uncertainty.

    SciTech Connect

    Johnson, Jay Dean; Helton, Jon Craig; Oberkampf, William Louis; Sallaberry, Cedric J.

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
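
    A sketch of the double-loop sampling that produces the family of CDFs/CCDFs described above when epistemic uncertainty is itself represented probabilistically: the outer loop samples epistemic quantities (fixed but poorly known), the inner loop samples aleatory variability, and each outer sample yields one CCDF. All distributions and quantities are invented.

      import numpy as np

      rng = np.random.default_rng(8)
      n_epistemic, n_aleatory = 50, 2000
      thresholds = np.linspace(0.0, 10.0, 101)

      ccdfs = []
      for _ in range(n_epistemic):
          # Epistemic: a fixed but poorly known scale parameter
          scale = rng.uniform(1.0, 3.0)
          # Aleatory: inherent randomness in the load for this epistemic sample
          load = rng.gamma(shape=2.0, scale=scale, size=n_aleatory)
          ccdf = [(load > t).mean() for t in thresholds]
          ccdfs.append(ccdf)

      ccdfs = np.array(ccdfs)
      # Envelope and median of the CCDF family at one consequence level (threshold = 5.0)
      idx = 50
      print("P(load > 5.0): min %.3f, median %.3f, max %.3f"
            % (ccdfs[:, idx].min(), np.median(ccdfs[:, idx]), ccdfs[:, idx].max()))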

  1. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for increased hazard estimates which have resulted from some recent large-scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic of a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods, leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflating uncertainties in PSHA results. Other, more data-driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  2. Risk uncertainty analysis methods for NUREG-1150

    SciTech Connect

    Benjamin, U.S.; Boyd, G.J.

    1986-01-01

    Evaluation and display of risk uncertainties for NUREG-1150 constitute a principal focus of the Severe Accident Risk Rebaselining/Risk Reduction Program (SARRP). Some of the principal objectives of the uncertainty evaluation are: (1) to provide a quantitative estimate that reflects, for those areas considered, a credible and realistic range of uncertainty in risk; (2) to rank the various sources of uncertainty with respect to their importance for various measures of risk; and (3) to characterize the state of understanding of each aspect of the risk assessment for which major uncertainties exist. This paper describes the methods developed to fulfill these objectives.

  3. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments in order to understand correctly the strength and limits of its results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how best to use uncertainty analysis to generate a more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters, and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies that there are general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties.

  4. Uncertainty Budget Analysis for Dimensional Inspection Processes (U)

    SciTech Connect

    Valdez, Lucas M.

    2012-07-26

    This paper is intended to provide guidance on, and describe how to prepare, an uncertainty analysis of a dimensional inspection process through the use of an uncertainty budget analysis. The uncertainty analysis follows the same methodology as the ISO GUM standard for calibration and testing. A specific distinction is made between how Type A and Type B uncertainty analysis is used in a general and in a specific process. All theory and applications are used to represent both a generalized approach to estimating measurement uncertainty and how to report and present these estimates for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the necessary assumptions in place for best possible results.

  5. A TIERED APPROACH TO PERFORMING UNCERTAINTY ANALYSIS IN CONDUCTING EXPOSURE ANALYSIS FOR CHEMICALS

    EPA Science Inventory

    The WHO/IPCS draft Guidance Document on Characterizing and Communicating Uncertainty in Exposure Assessment provides guidance on recommended strategies for conducting uncertainty analysis as part of human exposure analysis. Specifically, a tiered approach to uncertainty analysis ...

  6. mu analysis with real parametric uncertainty

    NASA Technical Reports Server (NTRS)

    Young, Peter M.; Newlin, Matthew P.; Doyle, John C.

    1991-01-01

    The authors give a broad overview, from an LFT (linear fractional transformation)/mu perspective, of some of the theoretical and practical issues associated with robustness in the presence of real parametric uncertainty, with a focus on computation. Recent results on the properties of mu in the mixed case are reviewed, including issues of NP completeness, continuity, computation of bounds, the equivalence of mu and its bounds, and some direct comparisons with Kharitonov-type analysis methods. In addition, some advances in the computational aspects of the problem, including a novel branch and bound algorithm, are briefly presented together with numerical results. The results suggest that while the mixed mu problem may have inherently combinatoric worst-case behavior, practical algorithms with modest computational requirements can be developed for problems of medium size (less than 100 parameters) that are of engineering interest.

  7. An uncertainty analysis of wildfire modeling [Chapter 13

    Treesearch

    Karin Riley; Matthew Thompson

    2017-01-01

    Before fire models can be understood, evaluated, and effectively applied to support decision making, model-based uncertainties must be analyzed. In this chapter, we identify and classify sources of uncertainty using an established analytical framework, and summarize results graphically in an uncertainty matrix. Our analysis facilitates characterization of the...

  8. Uncertainty Analysis with Site Specific Groundwater Models: Experiences and Observations

    SciTech Connect

    Brewer, K.

    2003-07-15

    Groundwater flow and transport predictions are a major component of remedial action evaluations for contaminated groundwater at the Savannah River Site. Because all groundwater modeling results are subject to uncertainty from various causes, quantification of the level of uncertainty in the modeling predictions is beneficial to project decision makers. Complex site-specific models present formidable challenges for implementing an uncertainty analysis.

  9. The challenges on uncertainty analysis for pebble bed HTGR

    SciTech Connect

    Hao, C.; Li, F.; Zhang, H.

    2012-07-01

    Uncertainty analysis is widely used and important, and much work has been done for Light Water Reactors (LWRs), although experience with uncertainty analysis in High Temperature Gas cooled Reactor (HTGR) modeling is still at an early stage. The IAEA will soon launch a Coordinated Research Project (CRP) on this topic. This paper addresses some challenges for uncertainty analysis in HTGR modeling, based on the experience of the OECD LWR Uncertainty Analysis in Modeling (UAM) activities and taking into account the peculiarities of pebble bed HTGR designs. The main challenges for HTGR UAM are the lack of experience, the very different code packages, and the coupling of the power, temperature, and burnup distributions through temperature feedback and pebble flow. The most serious challenge is how to deal with the uncertainty in pebble flow, the uncertainty in pebble bed flow modeling, and their contribution to the uncertainty of the maximum fuel temperature, which is the parameter of greatest interest for the modular HTGR. (authors)

  10. Uncertainty Analysis of Simulated Hydraulic Fracturing

    NASA Astrophysics Data System (ADS)

    Chen, M.; Sun, Y.; Fu, P.; Carrigan, C. R.; Lu, Z.

    2012-12-01

    Artificial hydraulic fracturing is being used widely to stimulate production of oil, natural gas, and geothermal reservoirs with low natural permeability. Optimization of field design and operation is limited by the incomplete characterization of the reservoir, as well as the complexity of hydrological and geomechanical processes that control the fracturing. Thus, there are a variety of uncertainties associated with the pre-existing fracture distribution, rock mechanics, and hydraulic-fracture engineering that require evaluation of their impact on the optimized design. In this study, a multiple-stage scheme was employed to evaluate the uncertainty. We first define the ranges and distributions of 11 input parameters that characterize the natural fracture topology, in situ stress, geomechanical behavior of the rock matrix and joint interfaces, and pumping operation, to cover a wide spectrum of potential conditions expected for a natural reservoir. These parameters were then sampled 1,000 times in an 11-dimensional parameter space constrained by the specified ranges using the Latin-hypercube method. These 1,000 parameter sets were fed into the fracture simulators, and the outputs were used to construct three designed objective functions, i.e. fracture density, opened fracture length and area density. Using PSUADE, three response surfaces (11-dimensional) of the objective functions were developed and global sensitivity was analyzed to identify the most sensitive parameters for the objective functions representing fracture connectivity, which are critical for sweep efficiency of the recovery process. The second-stage high resolution response surfaces were constructed with dimension reduced to the number of the most sensitive parameters. An additional response surface with respect to the objective function of the fractal dimension for fracture distributions was constructed in this stage. Based on these response surfaces, comprehensive uncertainty analyses were conducted
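
    The sampling step described above can be sketched briefly. The following is a minimal, hedged illustration of Latin-hypercube sampling with SciPy's QMC module; the parameter names, ranges, and dimensionality are placeholders, not the 11 parameters or ranges used in the study.

    ```python
    import numpy as np
    from scipy.stats import qmc

    # Illustrative ranges for a few hypothetical input parameters; the study
    # sampled 11 parameters describing fractures, in situ stress, rock/joint
    # mechanics, and pumping operation.
    lower = np.array([0.1, 10.0, 0.2])   # e.g. fracture density, stress (MPa), joint friction
    upper = np.array([2.0, 40.0, 0.8])

    sampler = qmc.LatinHypercube(d=len(lower), seed=42)
    unit_samples = sampler.random(n=1000)            # 1000 points in [0, 1)^d
    samples = qmc.scale(unit_samples, lower, upper)  # map to physical ranges

    # Each row would be fed to the fracture simulator; the outputs then drive
    # response-surface construction and global sensitivity analysis.
    print(samples.shape)  # (1000, 3)
    ```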

  11. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  12. Micro-Pulse Lidar Signals: Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Welton, Ellsworth J.; Campbell, James R.; Starr, David OC. (Technical Monitor)

    2002-01-01

    Micro-pulse lidar (MPL) systems are small, autonomous, eye-safe lidars used for continuous observations of the vertical distribution of cloud and aerosol layers. Since the construction of the first MPL in 1993, procedures have been developed to correct for various instrument effects present in MPL signals. The primary instrument effects include afterpulse (laser-detector cross-talk) and overlap (poor near-range, less than 6 km, focusing). The accurate correction of both afterpulse and overlap effects is required to study both clouds and aerosols. Furthermore, the outgoing energy of the laser pulses and the statistical uncertainty of the MPL detector must also be correctly determined in order to assess the accuracy of MPL observations. The uncertainties associated with the afterpulse, overlap, pulse energy, detector noise, and all remaining quantities affecting measured MPL signals are determined in this study. The uncertainties are propagated through the entire MPL correction process to give a net uncertainty on the final corrected MPL signal. The results show that in the near range, the overlap uncertainty dominates. At altitudes above the overlap region, the dominant source of uncertainty is the uncertainty in the pulse energy. However, if the laser energy is low, then during mid-day, high solar background levels can significantly reduce the signal-to-noise of the detector. In such a case, the statistical uncertainty of the detector count rate becomes dominant at altitudes above the overlap region.

  13. Critical analysis of uncertainties during particle filtration

    NASA Astrophysics Data System (ADS)

    Badalyan, Alexander; Carageorgos, Themis; Bedrikovetsky, Pavel; You, Zhenjiang; Zeinijahromi, Abbas; Aji, Keyiseer

    2012-09-01

    Using the law of propagation of uncertainties, we show how equipment- and measurement-related uncertainties contribute to the overall combined standard uncertainty (CSU) in filter permeability and in modelling the results for polystyrene latex microspheres filtered through a borosilicate glass filter at various injection velocities. Standard uncertainties in the dynamic viscosity and volumetric flowrate of the microsphere suspension have the greatest influence on the overall CSU in filter permeability, which agrees excellently with results obtained from Monte Carlo simulations. Two model parameters, "maximum critical retention concentration" and "minimum injection velocity", and their uncertainties were calculated by fitting two quadratic mathematical models to the experimental data using a weighted least squares approximation. Uncertainty in the internal cake porosity has the highest impact on modelling uncertainties in critical retention concentration. The model with the internal cake porosity reproduces the experimental "critical retention concentration vs velocity" data better than the second model, which contains the total electrostatic force, whose value and uncertainty could not be reliably calculated due to the lack of experimental dielectric data.
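
    As a hedged sketch of the propagation step, the snippet below applies the GUM law of propagation of uncertainty to a Darcy-law permeability, k = QµL/(AΔp), and cross-checks it against Monte Carlo, mirroring the comparison made in the paper; all numerical values and relative uncertainties are illustrative assumptions, not the paper's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    vals = dict(Q=1.0e-8, mu=1.0e-3, L=5.0e-3, A=2.0e-4, dP=5.0e4)   # SI units (assumed)
    rel_u = dict(Q=0.02, mu=0.015, L=0.005, A=0.005, dP=0.01)        # relative standard uncertainties (assumed)

    def k(Q, mu, L, A, dP):
        return Q * mu * L / (A * dP)

    # First-order (GUM) propagation: for a pure product/quotient model the
    # relative variances simply add in quadrature.
    rel_var = sum(u**2 for u in rel_u.values())
    u_k_gum = k(**vals) * np.sqrt(rel_var)

    # Monte Carlo cross-check of the first-order result.
    draws = {p: rng.normal(vals[p], rel_u[p] * vals[p], 100_000) for p in vals}
    u_k_mc = np.std(k(**draws))

    print(f"k = {k(**vals):.3e} m^2, u_GUM = {u_k_gum:.2e}, u_MC = {u_k_mc:.2e}")
    ```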

  14. An optimization based sampling approach for multiple metrics uncertainty analysis using generalized likelihood uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng

    2016-09-01

    This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach, with the aim of improving sampling efficiency for multiple-metric uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII based sampling is demonstrated in comparison with Latin hypercube sampling (LHS) through analysis of sampling efficiency, multiple-metric performance, parameter uncertainty, and flood forecasting uncertainty, with a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for the Qing River reservoir, China. The results demonstrate the following advantages of the ɛ-NSGAII based sampling approach in comparison to LHS: (1) the former is more effective and efficient than LHS; for example, the simulation time required to generate 1000 behavioral parameter sets is about nine times shorter; (2) the Pareto tradeoffs between metrics are demonstrated clearly by the solutions from ɛ-NSGAII based sampling, and their Pareto optimal values are better than those of LHS, indicating better forecasting accuracy of the ɛ-NSGAII parameter sets; (3) the parameter posterior distributions from ɛ-NSGAII based sampling are concentrated in the appropriate ranges rather than being uniform, which accords with their physical significance, and parameter uncertainties are reduced significantly; (4) the forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB), and average deviation amplitude (D). The flood forecasting uncertainty is also reduced considerably with ɛ-NSGAII based sampling. This study provides a new sampling approach to improve multiple-metric uncertainty analysis under the framework of GLUE, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
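
    Independently of the ɛ-NSGAII sampler, the GLUE step itself can be sketched as follows: score sampled parameter sets with a likelihood measure, retain the "behavioral" sets above a threshold, and derive prediction bounds from their simulations. The toy reservoir model, threshold, and data below are assumptions for illustration only; full GLUE would additionally weight the behavioral simulations by their likelihoods.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def run_model(theta, forcing):
        # Placeholder rainfall-runoff model: a leaky linear reservoir (illustrative only).
        k, frac = theta
        flows, store = [], 0.0
        for p in forcing:
            store += frac * p
            out = store / k
            store -= out
            flows.append(out)
        return np.array(flows)

    forcing = rng.gamma(2.0, 2.0, 200)
    obs = run_model((5.0, 0.6), forcing) + rng.normal(0, 0.05, 200)

    # Sample candidate parameter sets and score them with Nash-Sutcliffe efficiency.
    thetas = np.column_stack([rng.uniform(1, 20, 2000), rng.uniform(0.1, 0.9, 2000)])
    sims = np.array([run_model(t, forcing) for t in thetas])
    nse = 1 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)

    behavioral = nse > 0.7                      # behavioral threshold (illustrative)
    lo, hi = np.quantile(sims[behavioral], [0.05, 0.95], axis=0)
    print(behavioral.sum(), "behavioral sets; mean 90% band width:", np.mean(hi - lo))
    ```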

  15. Uncertainties in landscape analysis and ecosystem service assessment.

    PubMed

    Hou, Y; Burkhard, B; Müller, F

    2013-09-01

    Landscape analysis and ecosystem service assessment have drawn increasing concern from research and application at the landscape scale. Thanks to the continuously emerging assessments as well as studies aiming at evaluation method improvement, policy makers and landscape managers have an increasing interest in integrating ecosystem services into their decisions. However, the plausible assessments carry numerous sources of uncertainties, which regrettably tend to be ignored or disregarded by the actors or researchers. In order to cope with uncertainties and make them more transparent for landscape managers, we demonstrate them by reviewing literature, describing an example and proposing approaches for uncertainty analysis. Additionally, we conclude with potential actions to reduce the insecurities accompanying landscape analysis and ecosystem service assessments. As for landscape analysis, the fundamental uncertainty origins are landscape complexity and methodological uncertainties. Concerning the uncertainty sources of ecosystem service assessments, the complexity of the natural system, respondents' preferences and technical problems play essential roles. By analyzing the assessment process, we find that initial data uncertainty pervades the whole assessment and argue that the limited knowledge about the complexity of ecosystems is the focal origin of uncertainties. For analyzing uncertainties in assessments, we propose systems analysis, scenario simulation and the comparison method as promising strategies. To reduce uncertainties, we assume that actions should integrate continuous learning, expanding respondent numbers and sources, considering representativeness, improving and standardizing assessment methods and optimizing spatial and geobiophysical data. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Uncertainty Analysis of Historical Hurricane Data

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.

    2007-01-01

    An analysis of variance (ANOVA) study was conducted for historical hurricane data dating back to 1851, obtained from the U.S. Department of Commerce National Oceanic and Atmospheric Administration (NOAA). The data set was chosen because it is a large, publicly available collection of information exhibiting great variability, which has made the forecasting of future states from current and previous states difficult. The availability of substantial, high-fidelity validation data, however, made for an excellent uncertainty assessment study. Several factors (independent variables) were identified from the data set which could potentially influence the track and intensity of the storms. The values of these factors, along with the values of responses of interest (dependent variables), were extracted from the database and provided to a commercial software package for processing via the ANOVA technique. The primary goal of the study was to document the ANOVA modeling uncertainty and predictive errors in making predictions about hurricane location and intensity 24 to 120 hours beyond known conditions, as reported by the data set. A secondary goal was to expose the ANOVA technique to a broader community within NASA. The independent factors considered to have an influence on the hurricane track included the current and starting longitudes and latitudes (measured in degrees), the current and starting maximum sustained wind speeds (measured in knots), the storm starting date, its duration since first appearance, and the year fraction of each reading, all measured in years. The year fraction and starting date were included in an attempt to account for long-duration cyclic behaviors, such as seasonal weather patterns and years in which the sea or atmosphere were unusually warm or cold. The effect of short-duration weather patterns and ocean conditions could not be examined with the current data set. The responses analyzed were the storm
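
    A hedged sketch of this kind of ANOVA screening is shown below using statsmodels; synthetic data stand in for the NOAA records, and the factor and response names are illustrative rather than the exact fields used in the study.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 500
    df = pd.DataFrame({
        "lat0": rng.uniform(10, 35, n),          # starting latitude (deg), illustrative
        "lon0": rng.uniform(-90, -20, n),        # starting longitude (deg), illustrative
        "wind0": rng.uniform(30, 140, n),        # starting max sustained wind (kt), illustrative
        "yearfrac": rng.uniform(0.5, 1.0, n),    # year fraction of the reading, illustrative
    })
    # Synthetic response: 48-hour-ahead intensity with noise (placeholder model).
    df["wind48"] = (0.8 * df["wind0"] + 5 * np.sin(2 * np.pi * df["yearfrac"])
                    + 0.3 * df["lat0"] + rng.normal(0, 10, n))

    model = smf.ols("wind48 ~ lat0 + lon0 + wind0 + yearfrac", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))       # which factors explain the response variance
    print("residual std (proxy for predictive error):", np.sqrt(model.mse_resid))
    ```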

  17. Radiometer Design Analysis Based Upon Measurement Uncertainty

    NASA Technical Reports Server (NTRS)

    Racette, Paul E.; Lang, Roger H.

    2004-01-01

    This paper introduces a method for predicting the performance of a radiometer design based on calculating the measurement uncertainty. The variety in radiometer designs and the demand for improved radiometric measurements justify the need for a more general and comprehensive method to assess system performance. Radiometric resolution, or sensitivity, is a figure of merit that has been commonly used to characterize the performance of a radiometer. However when evaluating the performance of a calibration design for a radiometer, the use of radiometric resolution has limited application. These limitations are overcome by considering instead the measurement uncertainty. A method for calculating measurement uncertainty for a generic radiometer design including its calibration algorithm is presented. The result is a generalized technique by which system calibration architectures and design parameters can be studied to optimize instrument performance for given requirements and constraints. Example applications demonstrate the utility of using measurement uncertainty as a figure of merit.

  18. Estimation of basic uncertainties in clinical analysis.

    PubMed

    Ertas, Ozlem Sogut; Kayali, Aycil

    2002-01-01

    Clinical analyses have a vital importance in human health. Thus, it is necessary to have reliable analytical information, which depends on a good assessment of accuracy; this can be obtained by minimizing uncertainty values. The uncertainty values are complex and arise from different sources such as the sample, the method, the instrumentation, and the data processing. Some of these components may be evaluated from the distribution of results of a series of measurements and can be characterized by standard deviations. In our laboratory, for example, the balance calibration uncertainty for 1 g was found to be 0.00414 g. The uncertainties for the volume of solution contained in a 100 mL volumetric flask and a 0.5 mL glass pipette were calculated to be 0.0843 mL and 0.0071 mL, respectively.
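
    The component uncertainties quoted above can be combined into a relative combined standard uncertainty for a prepared concentration. The combination scenario below (a mass dissolved in the flask, with an aliquot taken by the pipette) is an illustrative assumption, while the component values are those quoted in the abstract.

    ```python
    import math

    mass, u_mass = 1.0, 0.00414          # g (value from the abstract)
    v_flask, u_flask = 100.0, 0.0843     # mL (value from the abstract)
    v_pipette, u_pip = 0.5, 0.0071       # mL (value from the abstract)

    # For a multiplicative model c ∝ m / V_flask, with an aliquot taken by the
    # pipette, relative standard uncertainties combine in quadrature
    # (GUM, uncorrelated inputs).
    rel_terms = [u_mass / mass, u_flask / v_flask, u_pip / v_pipette]
    rel_combined = math.sqrt(sum(t**2 for t in rel_terms))

    print(f"relative combined standard uncertainty: {rel_combined:.4f} "
          f"({100 * rel_combined:.2f} %)")   # the pipette term dominates (~1.4 %)
    ```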

  19. Geoengineering to Avoid Overshoot: An Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Tanaka, K.

    2009-04-01

    Geoengineering (or climate engineering) using stratospheric sulfur injections (Crutzen, 2006) has been proposed for research in case of an urgent need to stop global warming once other mitigation efforts are exhausted. Although there are a number of concerns over this idea (e.g. Robock, 2008), it is still useful to consider geoengineering as a possible method to limit warming caused by overshoot. Overshoot is a feature of low stabilization scenarios aiming for a stringent target (Rao et al., 2008), in which total radiative forcing temporarily exceeds the target before reaching it. Scenarios achieving a 50% emission reduction by 2050 produce overshoot. Overshoot could cause sustained warming for decades due to the inertia of the climate system. If stratospheric sulfur injections were to be used as a "last resort" to avoid overshoot, what would be the suitable start-year and injection profile of such an intervention? Wigley (2006) examined climate response to combined mitigation/geoengineering scenarios with the intent to avert overshoot. Wigley's analysis demonstrated a basic potential of such a combined mitigation/geoengineering approach to avoid temperature overshoot; however, it considered only simplistic sulfur injection profiles (all started in 2010) and just one mitigation scenario, and did not examine the sensitivity of the climate response to any underlying uncertainties. This study builds upon Wigley's premise of the combined mitigation/geoengineering approach and brings the associated uncertainty into the analysis. First, this study addresses how much geoengineering intervention would be needed to avoid overshoot when the associated uncertainty is considered. Then, would a geoengineering intervention of such a magnitude, including uncertainty, be permissible when all the other side effects are considered? This study begins from the supposition that geoengineering could be employed to cap warming at 2.0°C since preindustrial. A few

  20. Uncertainty Analysis of Thermal Comfort Parameters

    NASA Astrophysics Data System (ADS)

    Ribeiro, A. Silva; Alves e Sousa, J.; Cox, Maurice G.; Forbes, Alistair B.; Matias, L. Cordeiro; Martins, L. Lages

    2015-08-01

    International Standard ISO 7730:2005 defines thermal comfort as that condition of mind that expresses the degree of satisfaction with the thermal environment. Although this definition is inevitably subjective, the Standard gives formulae for two thermal comfort indices, predicted mean vote (PMV) and predicted percentage dissatisfied (PPD). The PMV formula is based on principles of heat balance and experimental data collected in a controlled climate chamber under steady-state conditions. The PPD formula depends only on PMV. Although these formulae are widely recognized and adopted, little has been done to establish the measurement uncertainties associated with their use, bearing in mind that the formulae depend on measured values and on tabulated values given to limited numerical accuracy. Knowledge of these uncertainties is invaluable when values provided by the formulae are used in making decisions in various health and civil engineering situations. This paper examines these formulae, giving a general mechanism for evaluating the uncertainties associated with values of the quantities on which the formulae depend. Further, consideration is given to the propagation of these uncertainties through the formulae to provide uncertainties associated with the values obtained for the indices. Current international guidance on uncertainty evaluation is utilized.
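
    As a hedged illustration of the propagation step, the sketch below pushes an assumed PMV standard uncertainty through the ISO 7730 PPD formula, PPD = 100 − 95·exp(−0.03353·PMV⁴ − 0.2179·PMV²), by Monte Carlo; the PMV value and its uncertainty are assumptions, not values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def ppd(pmv):
        # ISO 7730 predicted percentage dissatisfied as a function of PMV.
        return 100.0 - 95.0 * np.exp(-0.03353 * pmv**4 - 0.2179 * pmv**2)

    pmv_best, u_pmv = 0.5, 0.2                 # assumed measured PMV and its standard uncertainty
    pmv_samples = rng.normal(pmv_best, u_pmv, 200_000)
    ppd_samples = ppd(pmv_samples)

    print(f"PPD at best estimate: {ppd(pmv_best):.1f} %")
    print(f"PPD mean ± std from propagation: {ppd_samples.mean():.1f} ± {ppd_samples.std():.1f} %")
    ```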

  1. Spatial Uncertainty Analysis of Ecological Models

    SciTech Connect

    Jager, H.I.; Ashwood, T.L.; Jackson, B.L.; King, A.W.

    2000-09-02

    The authors evaluated the sensitivity of a habitat model and a source-sink population model to spatial uncertainty in landscapes with different statistical properties and for hypothetical species with different habitat requirements. Sequential indicator simulation generated alternative landscapes from a source map. Their results showed that spatial uncertainty was highest for landscapes in which suitable habitat was rare and spatially uncorrelated. Although they were able to exert some control over the degree of spatial uncertainty by varying the sampling density drawn from the source map, intrinsic spatial properties (i.e., average frequency and degree of spatial autocorrelation) played a dominant role in determining variation among realized maps. To evaluate the ecological significance of landscape variation, they compared the variation in predictions from a simple habitat model to variation among landscapes for three species types. Spatial uncertainty in predictions of the amount of source habitat depended on both the spatial life history characteristics of the species and the statistical attributes of the synthetic landscapes. Species differences were greatest when the landscape contained a high proportion of suitable habitat. The predicted amount of source habitat was greater for edge-dependent (interior) species in landscapes with spatially uncorrelated (correlated) suitable habitat. A source-sink model demonstrated that, although variation among landscapes resulted in relatively little variation in overall population growth rate, this spatial uncertainty was sufficient in some situations to produce qualitatively different predictions about population viability (i.e., population decline vs. increase).

  2. Uncertainty Analysis in Space Radiation Protection

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2011-01-01

    Space radiation is comprised of high energy and charge (HZE) nuclei, protons, and secondary radiation including neutrons. The uncertainties in estimating the health risks from galactic cosmic rays (GCR) are a major limitation to the length of space missions, the evaluation of potential risk mitigation approaches, and the application of the As Low As Reasonably Achievable (ALARA) principle. For long duration space missions, risks may approach radiation exposure limits; therefore, the uncertainties in risk projections become a major safety concern, and methodologies used for ground-based work are not deemed sufficient. NASA limits astronaut exposures to a 3% risk of exposure induced death (REID) and protects against uncertainties in risk projections using an assessment of 95% confidence intervals in the projection model. We discuss NASA's approach to space radiation uncertainty assessments and applications for the International Space Station (ISS) program and design studies of future missions to Mars and other destinations. Several features of NASA's approach will be discussed. Radiation quality descriptions are based on the properties of radiation tracks rather than LET, with probability distribution functions (PDFs) for uncertainties derived from radiobiology experiments at particle accelerators. The application of age- and gender-specific models for individual astronauts is described. Because more than 90% of astronauts are never-smokers, an alternative risk calculation for never-smokers is used and will be compared to estimates for an average U.S. population. Because the high energies of the GCR limit the benefits of shielding, and because only a limited role is expected for pharmaceutical countermeasures, uncertainty reduction continues to be the optimal approach to improve radiation safety for space missions.

  3. Contributions to Physics-Based Aeroservoelastic Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Wu, Sang

    The thesis presents the development of a new, fully integrated, MATLAB-based simulation capability for aeroservoelastic (ASE) uncertainty analysis that accounts for uncertainties in all disciplines as well as discipline interactions. This new capability allows probabilistic studies of complex configurations at a scope and depth not available before. Several statistical tools and methods have been integrated into the capability to guide tasks such as parameter prioritization, uncertainty reduction, and risk mitigation. (Abstract shortened by ProQuest.)

  4. Controllable set analysis for planetary landing under model uncertainties

    NASA Astrophysics Data System (ADS)

    Long, Jiateng; Gao, Ai; Cui, Pingyuan

    2015-07-01

    Controllable set analysis is a useful method in planetary landing mission design: it supports the selection of feasible entry states in order to achieve landing accuracy and satisfy entry path constraints. In view of the severe impact of model uncertainties on planetary landing safety and accuracy, the purpose of this paper is to investigate the controllable set under uncertainties between the on-board model and the real situation. Controllable set analysis under model uncertainties is composed of controllable union set (CUS) analysis and controllable intersection set (CIS) analysis. Definitions of the CUS and CIS are given, and a computational method for them based on the Gauss pseudospectral method is presented. Their application to the analysis of entry state distributions under uncertainties, and to the robustness of nominal entry state selection to uncertainties, is illustrated for Mars entry under ballistic coefficient, lift-to-drag ratio, and atmospheric uncertainties. With analysis of the CUS and CIS, the robustness of entry state selection and of the entry trajectory to model uncertainties can be guaranteed, thus enhancing safety, reliability, and accuracy under model uncertainties during planetary entry and landing.

  5. Uncertainty Modeling for Structural Control Analysis and Synthesis

    NASA Technical Reports Server (NTRS)

    Campbell, Mark E.; Crawley, Edward F.

    1996-01-01

    The development of an accurate model of uncertainties for the control of structures that undergo a change in operational environment, based solely on modeling and experimentation in the original environment is studied. The application used throughout this work is the development of an on-orbit uncertainty model based on ground modeling and experimentation. A ground based uncertainty model consisting of mean errors and bounds on critical structural parameters is developed. The uncertainty model is created using multiple data sets to observe all relevant uncertainties in the system. The Discrete Extended Kalman Filter is used as an identification/parameter estimation method for each data set, in addition to providing a covariance matrix which aids in the development of the uncertainty model. Once ground based modal uncertainties have been developed, they are localized to specific degrees of freedom in the form of mass and stiffness uncertainties. Two techniques are presented: a matrix method which develops the mass and stiffness uncertainties in a mathematical manner; and a sensitivity method which assumes a form for the mass and stiffness uncertainties in macroelements and scaling factors. This form allows the derivation of mass and stiffness uncertainties in a more physical manner. The mass and stiffness uncertainties of the ground based system are then mapped onto the on-orbit system, and projected to create an analogous on-orbit uncertainty model in the form of mean errors and bounds on critical parameters. The Middeck Active Control Experiment is introduced as experimental verification for the localization and projection methods developed. In addition, closed loop results from on-orbit operations of the experiment verify the use of the uncertainty model for control analysis and synthesis in space.

  6. Monte Carlo analysis of uncertainty propagation in a stratospheric model. 2: Uncertainties due to reaction rates

    NASA Technical Reports Server (NTRS)

    Stolarski, R. S.; Butler, D. M.; Rundel, R. D.

    1977-01-01

    A concise stratospheric model was used in a Monte Carlo analysis of the propagation of reaction rate uncertainties through the calculation of an ozone perturbation due to the addition of chlorine. Two thousand Monte Carlo cases were run with 55 reaction rates being varied. Excellent convergence was obtained in the output distributions because the model is sensitive to the uncertainties in only about 10 reactions. For a 1 ppbv chlorine perturbation added to a 1.5 ppbv chlorine background, the resulting 1 sigma uncertainty on the ozone perturbation is a factor of 1.69 on the high side and 1.80 on the low side. The corresponding 2 sigma factors are 2.86 and 3.23. Results are also given for the uncertainties, due to reaction rates, in the ambient concentrations of stratospheric species.
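
    A minimal sketch of this kind of Monte Carlo propagation is shown below; the uncertainty factors and log-log sensitivities are invented placeholders, not the study's chemistry, and serve only to show how multiplicative rate uncertainties translate into uncertainty factors on an output.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_rates, n_cases = 55, 2000
    unc_factor = rng.uniform(1.2, 3.0, n_rates)     # assumed 1-sigma uncertainty factors per rate
    sens = rng.normal(0, 0.2, n_rates)              # assumed log-log sensitivities of the output

    # Sample each rate as a lognormal multiplier and combine via the sensitivities.
    log_mult = rng.normal(0.0, np.log(unc_factor), size=(n_cases, n_rates))
    output_mult = np.exp(log_mult @ sens)           # relative perturbation multiplier on the output

    lo, hi = np.percentile(output_mult, [15.9, 84.1])   # 1-sigma band of the output distribution
    print(f"1-sigma uncertainty factors on the output: x{1/lo:.2f} (low side), x{hi:.2f} (high side)")
    ```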

  7. Calculational methodology and associated uncertainties: Sensitivity and uncertainty analysis of reactor performance parameters

    SciTech Connect

    Kujawski, E.; Weisbin, C.R.

    1982-01-01

    This chapter considers the calculational methodology and associated uncertainties both for the design of large LMFBR's and the analysis of critical assemblies (fast critical experiments) as performed by several groups within the US. Discusses cross-section processing; calculational methodology for the design problem; core physics computations; design-oriented approximations; benchmark analyses; and determination of calculational corrections and associated uncertainties for a critical assembly. Presents a detailed analysis of the sources of calculational uncertainties for the critical assembly ZPR-6/7 to illustrate the quantitative assessment of calculational correction factors and uncertainties. Examines calculational uncertainties that arise from many different sources including intrinsic limitations of computational methods; design-oriented approximations related to reactor modeling; computational capability and code availability; economic limitations; and the skill of the reactor analyst. Emphasizes that the actual design uncertainties in most of the parameters, with the possible exception of burnup, are likely to be less than might be indicated by the results presented in this chapter because reactor designers routinely apply bias factors (usually derived from critical experiments) to their calculated results.

  8. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  9. Uncertainty Analysis of Air Radiation for Lunar Return Shock Layers

    NASA Technical Reports Server (NTRS)

    Kleb, Bil; Johnston, Christopher O.

    2008-01-01

    By leveraging a new uncertainty markup technique, two risk analysis methods are used to compute the uncertainty of lunar-return shock layer radiation predicted by the High temperature Aerothermodynamic Radiation Algorithm (HARA). The effects of epistemic uncertainty, or uncertainty due to a lack of knowledge, are considered for the following modeling parameters: atomic line oscillator strengths, atomic line Stark broadening widths, atomic photoionization cross sections, negative ion photodetachment cross sections, molecular band oscillator strengths, and electron impact excitation rates. First, a simplified shock layer problem consisting of two constant-property equilibrium layers is considered. The results of this simplified problem show that the atomic nitrogen oscillator strengths and Stark broadening widths in both the vacuum ultraviolet and infrared spectral regions, along with the negative ion continuum, are the dominant uncertainty contributors. Next, three variable-property stagnation-line shock layer cases are analyzed: a typical lunar return case and two Fire II cases. For the near-equilibrium lunar return and Fire 1643-second cases, the resulting uncertainties are very similar to the simplified case. Conversely, the relatively nonequilibrium 1636-second case shows significantly larger influence from electron impact excitation rates of both atoms and molecules. For all cases, the total uncertainty in radiative heat flux to the wall due to epistemic uncertainty in modeling parameters is 30%, as opposed to the erroneously small uncertainty levels (plus or minus 6%) found when treating model parameter uncertainties as aleatory (due to chance) instead of epistemic (due to lack of knowledge).

  10. Uncertainty Analysis of Power Systems Using Collocation

    DTIC Science & Technology

    2008-05-01

    [Abstract not available; the record text consists of list-of-figures fragments describing comparisons of full-grid Gaussian quadrature with Legendre polynomials, sparse grids with Gauss-Kronrod-Patterson points (SG-KP), full grids with Gaussian quadrature (FG-GQ), and dimension-adaptive Gauss-Kronrod-Patterson collocation (DA-KP) in three dimensions, plotted as error versus number of function evaluations.]

  11. Detailed Uncertainty Analysis of the ZEM-3 Measurement System

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    The measurement of the Seebeck coefficient and electrical resistivity is critical to the investigation of all thermoelectric systems. It follows that the measurement uncertainty must be well understood in order to report ZT values that are accurate and trustworthy. A detailed uncertainty analysis of the ZEM-3 measurement system has been performed. The uncertainty analysis calculates error in the electrical resistivity measurement as a result of sample geometry tolerance, probe geometry tolerance, statistical error, and multimeter uncertainty. The uncertainty on the Seebeck coefficient includes probe wire correction factors, statistical error, multimeter uncertainty, and, most importantly, the cold-finger effect. The cold-finger effect plagues all potentiometric (four-probe) Seebeck measurement systems, as heat transfers parasitically through the thermocouple probes. The effect leads to an asymmetric over-estimation of the Seebeck coefficient. A thermal finite element analysis allows for quantification of the phenomenon and provides an estimate of the uncertainty of the Seebeck coefficient. The thermoelectric power factor was found to have an uncertainty of 9-14% at high temperature and 9% near room temperature.

  12. Multimodel simulation of water flow: uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Guber, A. K.; Pachepsky, Y. A.; van Genuchten, T. M.; Rowland, R. A.; Nicholson, T. J.; Cady, R. E.

    2009-04-01

    Simulations of soil water flow require measurements of soil hydraulic properties, which are particularly difficult at field scale. Laboratory measurements provide hydraulic properties at scales finer than the field scale, whereas pedotransfer functions (PTFs) integrate information on hydraulic properties at larger scales. One way of downscaling large-scale data is to generate hydraulic properties with each of several PTFs and to obtain a multimodel prediction of soil water flow by using weighted averages of the simulation results obtained with the individual PTFs. Since its introduction, multimodel prediction has been the subject of much debate: whether a multimodel prediction is better than the single best forecast, and what the best method is to weight predictions obtained with the different models. The objective of this work was to evaluate the errors and uncertainty of different weighting methods in multimodel prediction of soil water content. Data on soil water content were collected at four locations at the USDA-ARS Beltsville OPE3 field site from January to November 2007. The locations were instrumented with Multisensor Capacitance Probes (SENTEK) to measure soil water content at depths from 10 to 100 cm in 10 cm increments. Standard meteorological data were measured in the vicinity of the site. Undisturbed soil samples were taken from the same depths to measure soil bulk density (BD), organic carbon content (OC), and soil texture at all locations. Fourteen PTFs that had been developed from relatively large datasets (>200) were used to calculate soil hydraulic properties for each individual depth from the measured BD, OC, and soil texture. Thus, 14 sets of hydraulic parameters were obtained for each location. We then solved the Richards equation with each set of hydraulic parameters for each location. The following multimodel prediction methods were compared in our study: (i) using only the best model; (ii) assigning equal weights to all models; (iii
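
    The weighting comparison can be sketched as below; the synthetic predictions and observations are illustrative stand-ins for the PTF-driven simulations and sensor data, and the weighting schemes (best single model, equal weights, inverse-RMSE weights) are generic examples rather than the exact set evaluated in the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_models, n_times = 14, 300
    truth = 0.25 + 0.05 * np.sin(np.linspace(0, 6 * np.pi, n_times))   # synthetic water content
    obs = truth + rng.normal(0, 0.005, n_times)                        # synthetic observations
    preds = (truth + rng.normal(0, 0.02, (n_models, n_times))
                   + rng.normal(0, 0.01, (n_models, 1)))               # per-model bias + noise

    rmse = np.sqrt(np.mean((preds - obs) ** 2, axis=1))

    schemes = {
        "best single model": preds[np.argmin(rmse)],
        "equal weights": preds.mean(axis=0),
        "inverse-RMSE weights": np.average(preds, axis=0, weights=1.0 / rmse),
    }
    for name, combo in schemes.items():
        print(f"{name:22s} RMSE = {np.sqrt(np.mean((combo - obs) ** 2)):.4f}")
    ```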

  13. An Approach of Uncertainty Evaluation for Thermal-Hydraulic Analysis

    SciTech Connect

    Katsunori Ogura; Hisashi Ninokata

    2002-07-01

    An approach to evaluate uncertainty systematically for thermal-hydraulic analysis programs is demonstrated. The approach is applied to the Peach Bottom Unit 2 Turbine Trip 2 Benchmark and is validated. (authors)

  14. MOUSE (MODULAR ORIENTED UNCERTAINTY SYSTEM): A COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM (FOR MICRO- COMPUTERS)

    EPA Science Inventory

    Environmental engineering calculations involving uncertainties; either in the model itself or in the data, are far beyond the capabilities of conventional analysis for any but the simplest of models. There exist a number of general-purpose computer simulation languages, using Mon...

  16. A Stochastic Collocation Algorithm for Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Mathelin, Lionel; Hussaini, M. Yousuff; Zang, Thomas A. (Technical Monitor)

    2003-01-01

    This report describes a stochastic collocation method to adequately handle physically intrinsic uncertainty in the variables of a numerical simulation. For instance, while the standard Galerkin approach to Polynomial Chaos requires multi-dimensional summations over the stochastic basis functions, the stochastic collocation method allows those summations to be collapsed into a single one-dimensional summation. This report furnishes the essential algorithmic details of the new stochastic collocation method and provides, as a numerical example, the solution of the Riemann problem with the stochastic collocation method used for the discretization of the stochastic parameters.
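
    A minimal collocation sketch along these lines: evaluate a model only at the Gauss-Hermite nodes of a Gaussian uncertain input and recover the output mean and variance from a single one-dimensional quadrature sum; the model f and the input distribution are illustrative assumptions.

    ```python
    import numpy as np

    def f(x):
        return np.exp(0.3 * x) + 0.1 * x**2          # placeholder "simulation"

    mu, sigma = 1.0, 0.4                              # uncertain input ~ N(mu, sigma^2), assumed
    nodes, weights = np.polynomial.hermite_e.hermegauss(8)   # probabilists' Hermite rule
    x = mu + sigma * nodes                            # map standard-normal nodes to the input
    w = weights / np.sqrt(2 * np.pi)                  # normalize to probability weights

    mean = np.sum(w * f(x))
    var = np.sum(w * (f(x) - mean) ** 2)

    # Monte Carlo cross-check of the collocation result.
    samples = np.random.default_rng(6).normal(mu, sigma, 200_000)
    print(f"collocation: mean={mean:.4f}, std={np.sqrt(var):.4f}")
    print(f"Monte Carlo: mean={f(samples).mean():.4f}, std={f(samples).std():.4f}")
    ```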

  17. Uncertainty Analysis of Knowledge Reductions in Rough Sets

    PubMed Central

    Zhang, Nan

    2014-01-01

    Uncertainty analysis is a vital issue in intelligent information processing, especially in the age of big data. Rough set theory has attracted much attention in this field since it was proposed. Relative reduction is an important problem in rough set theory. Different relative reductions have been investigated for preserving specific classification abilities in various applications. This paper examines the uncertainty analysis of five different relative reductions in four aspects, namely, the relationships among reducts, boundary region granularity, rule variance, and uncertainty measures, based on a constructed decision table. PMID:25258725

  18. Uncertainty analysis technique for OMEGA Dante measurements.

    PubMed

    May, M J; Widmann, K; Sorce, C; Park, H-S; Schneider, M

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
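
    The Monte Carlo parameter-variation idea can be sketched as follows: perturb each channel signal by an assumed one-sigma calibration error, re-run the unfold, and take statistics of the resulting fluxes. The unfold below is a trivial placeholder, and the channel voltages and error levels are assumptions, not Dante calibration data.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_channels, n_trials = 18, 1000
    v_measured = rng.uniform(0.5, 5.0, n_channels)         # illustrative channel voltages
    rel_err = np.full(n_channels, 0.08)                    # assumed 8% one-sigma error per channel

    def unfold(volts):
        # Placeholder for the real unfold algorithm: a weighted sum standing in
        # for the spectrally integrated flux.
        return np.sum(volts * np.linspace(1.0, 2.0, volts.size))

    fluxes = np.array([
        unfold(v_measured * (1.0 + rng.normal(0.0, rel_err)))
        for _ in range(n_trials)
    ])
    print(f"flux = {fluxes.mean():.3f} ± {fluxes.std():.3f} (1-sigma from {n_trials} trials)")
    ```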

  19. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Haskin, F.E.; Harper, F.T.; Goossens, L.H.J.; Kraan, B.C.P.; Grupa, J.B.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models.

  20. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harper, F.T.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models.

  1. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 1: Main report

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models.

  2. Uncertainty Analysis via Failure Domain Characterization: Polynomial Requirement Functions

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Munoz, Cesar A.; Narkawicz, Anthony J.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are comprised of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. A Bernstein expansion approach is used to size hyper-rectangular subsets while a sum of squares programming approach is used to size quasi-ellipsoidal subsets. These methods are applicable to requirement functions whose functional dependency on the uncertainty is a known polynomial. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the uncertainty model assumed (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.

  3. The Role of Uncertainty in Aerospace Vehicle Analysis and Design

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Crespo, Luis G.

    2011-01-01

    Effective uncertainty quantification (UQ) begins at the earliest design phase for which there are adequate models and continues, tightly integrated with the analysis and design cycles, as the refinement of the models and the fidelity of the tools increase. It is essential that uncertainty quantification strategies provide objective information to support the processes of identifying, analyzing, and accommodating the effects of uncertainty. Assessments of uncertainty should never make the results more difficult for engineers and decision makers to comprehend, but should instead provide them with critical information to assist with resource utilization decisions and risk mitigation strategies. Success would be measured by the ability of the tools to enable engineers and decision makers to effectively balance critical project resources against system requirements while accounting for the impact of uncertainty.

  4. Uncertainty Analysis via Failure Domain Characterization: Unrestricted Requirement Functions

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are comprised of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. The methods developed herein, which are based on nonlinear constrained optimization, are applicable to requirement functions whose functional dependency on the uncertainty is arbitrary and whose explicit form may even be unknown. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the assumed uncertainty model (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.

  5. Analysis of automated highway system risks and uncertainties. Volume 5

    SciTech Connect

    Sicherman, A.

    1994-10-01

    This volume describes a risk analysis performed to help identify important Automated Highway System (AHS) deployment uncertainties and quantify their effect on costs and benefits for a range of AHS deployment scenarios. The analysis identified a suite of key factors affecting vehicle and roadway costs, capacities and market penetrations for alternative AHS deployment scenarios. A systematic protocol was utilized for obtaining expert judgments of key factor uncertainties in the form of subjective probability percentile assessments. Based on these assessments, probability distributions on vehicle and roadway costs, capacity and market penetration were developed for the different scenarios. The cost/benefit risk methodology and analysis provide insights by showing how uncertainties in key factors translate into uncertainties in summary cost/benefit indices.

  6. Uncertainty analysis of geothermal energy economics

    NASA Astrophysics Data System (ADS)

    Sener, Adil Caner

    This dissertation explores geothermal energy economics by assessing and quantifying the uncertainties associated with the nature of geothermal energy and with energy investments overall. The study introduces a stochastic geothermal cost model and a valuation approach for different geothermal power plant development scenarios. The Monte Carlo simulation technique is employed to obtain probability distributions of geothermal energy development costs and project net present values. In the study, a stochastic cost model with an incorporated dependence structure is defined and compared with a model in which the random variables are treated as independent inputs. One of the goals of the study is to shed light on the long-standing problem of modeling dependence between random input variables. The dependence between random input variables is modeled using copulas. The study focuses on four main types of geothermal power generation technologies and introduces a stochastic levelized cost model for each technology. Moreover, we also compare the levelized costs of natural gas combined cycle and coal-fired power plants with those of geothermal power plants. The input data used in the model rely on cost data recently reported by government agencies and non-profit organizations, such as the Department of Energy, national laboratories, the California Energy Commission, and the Geothermal Energy Association. The second part of the study introduces the stochastic discounted cash flow valuation model for the geothermal technologies analyzed in the first phase. In this phase of the study, the Integrated Planning Model (IPM) software was used to forecast the revenue streams of geothermal assets under different price and regulation scenarios. These results are then combined to create a stochastic revenue forecast for the power plants. The uncertainties in gas prices and environmental regulations will be modeled and their potential impacts will be
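
    The copula-based dependence step can be sketched as below with a Gaussian copula: draw correlated normals, transform them to uniforms, and map the uniforms through marginal distributions for two cost inputs. The marginals, correlation, and the toy levelized-cost proxy are illustrative assumptions, not the dissertation's model.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)
    rho = 0.6                                               # assumed correlation between inputs
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=50_000)
    u = stats.norm.cdf(z)                                   # correlated uniforms (the Gaussian copula)

    drill_cost = stats.lognorm(s=0.3, scale=4.0e6).ppf(u[:, 0])   # $/well, illustrative marginal
    cap_factor = stats.beta(a=8, b=2).ppf(u[:, 1])                # plant capacity factor, illustrative

    # Toy levelized-cost proxy for a hypothetical 30 MW plant ($/kWh); not a real LCOE model.
    lcoe_proxy = drill_cost / (cap_factor * 8760 * 30e3)
    print(f"median ~ {np.median(lcoe_proxy):.4f}, 90% interval "
          f"{np.percentile(lcoe_proxy, 5):.4f}-{np.percentile(lcoe_proxy, 95):.4f} $/kWh")
    ```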

  7. Uncertainty analysis for water supply reservoir yields

    NASA Astrophysics Data System (ADS)

    Kuria, Faith; Vogel, Richard

    2015-10-01

    Understanding the variability of water supply reservoir yields is central for planning purposes. The basis of this study is an empirical global relationship between reservoir storage capacity, water supply yield, and reliability based on a global database of 729 rivers. Monte Carlo simulations reveal that the coefficient of variation of estimates of water supply reservoir yield depends only on the length of the streamflow record and the coefficient of variation of the streamflows used to estimate the yield. We compare the results of those Monte Carlo experiments with an analytical uncertainty method, the First Order Variance Approximation (FOVA). FOVA is shown to produce a general, accurate, and useful expression for estimating the coefficient of variation of water supply reservoir yield estimates. We also document how the FOVA analytical model can be used to determine the minimum length of streamflow record required during the design of water supply reservoirs so as to ensure that the yield delivered from the reservoir falls within a prespecified margin of error.
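
    A generic first-order variance approximation can be sketched as below: propagate input variances through a function using numerically estimated first derivatives. The yield function, input values, and standard errors are placeholders, not the paper's empirical storage-yield-reliability relationship.

    ```python
    import numpy as np

    def fova_std(f, x0, u):
        """First-order standard deviation of f at x0 given input standard deviations u."""
        x0, u = np.asarray(x0, float), np.asarray(u, float)
        grads = np.empty_like(x0)
        for i in range(x0.size):
            dx = np.zeros_like(x0)
            dx[i] = 1e-6 * max(abs(x0[i]), 1.0)
            grads[i] = (f(x0 + dx) - f(x0 - dx)) / (2 * dx[i])   # central differences
        return np.sqrt(np.sum((grads * u) ** 2))                  # uncorrelated inputs assumed

    # Placeholder yield model: yield decreasing with streamflow variability.
    def reservoir_yield(x):
        mean_flow, cv_flow = x
        return mean_flow * (1.0 - 0.5 * cv_flow)

    x0 = np.array([100.0, 0.4])        # mean annual flow, CV of annual flows (illustrative)
    u = np.array([5.0, 0.05])          # standard errors of those estimates (illustrative)
    print(f"yield = {reservoir_yield(x0):.1f} ± {fova_std(reservoir_yield, x0, u):.1f}")
    ```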

  8. Proceedings of the CEC/USDOE workshop on uncertainty analysis

    SciTech Connect

    Elderkin, C.E. ); Kelly, G.N. )

    1990-09-01

    In recent years it has become increasingly important to specify the uncertainty inherent in consequence assessments and in the models that trace radionuclides from their source, through the environment, to their impacts on human health. European and US scientists have been independently developing and applying methods for analyzing uncertainty. It recently became apparent that a scientific exchange on this subject would be beneficial as improvements are sought and as uncertainty methods find broader application. The Commission of the European Communities (CEC) and the Office of Health and Environmental Research of the US Department of Energy (OHER/DOE), through their continuing agreement for cooperation, decided to co-sponsor the CEC/USDOE Workshop on Uncertainty Analysis. CEC's Radiation Protection Research Programme and OHER's Atmospheric Studies in Complex Terrain Program collaborated in planning and organizing the workshop, which was held in Santa Fe, New Mexico, on November 13 through 16, 1989. As the workshop progressed, the perspectives of individual participants, each with their particular background and interests in some segment of consequence assessment and its uncertainties, contributed to a broader view of how uncertainties are introduced and handled. These proceedings contain, first, the editors' introduction to the problem of uncertainty analysis and their general summary and conclusions. These are followed by the results of the working groups and the abstracts of individual presentations.

  9. Monte Carlo based analysis of confocal peak extraction uncertainty

    NASA Astrophysics Data System (ADS)

    Liu, Chenguang; Liu, Yan; Zheng, Tingting; Tan, Jiubin; Liu, Jian

    2017-10-01

    Localisation of axial peaks is essential for height determination in confocal microscopy. Several algorithms have been proposed for reliable height extraction in surface topography measurements. However, most of these algorithms use nonlinear processing, which precludes analytical estimation of the peak-height uncertainty. A Monte Carlo based standard uncertainty analysis model is developed here to evaluate the precision of height extraction algorithms. The key parameters of this model are the vertical sampling deviation and the size of the scanning pitch. The height extraction uncertainties of the centroid algorithm and of nonlinear fitting algorithms were calculated using simulations. Our results offer a reference for selecting algorithms for confocal metrology.
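
    A minimal Monte Carlo experiment in the spirit of the abstract: a noisy axial confocal response sampled at a fixed pitch is reduced to a height by a centroid estimator, and the spread over many noise realisations gives the standard uncertainty of that height. The Gaussian response shape, pitch and noise level are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

def axial_response(z, z0=0.0, fwhm=1.0):
    """Idealised confocal axial response (Gaussian stand-in for sinc^2)."""
    sigma = fwhm / 2.355
    return np.exp(-0.5 * ((z - z0) / sigma) ** 2)

def centroid_height(z, intensity):
    """Centroid (intensity-weighted mean) estimate of the peak position."""
    w = intensity - intensity.min()
    return np.sum(w * z) / np.sum(w)

pitch = 0.1          # axial scanning pitch (same unit as the FWHM), assumed
noise = 0.02         # relative intensity noise level, assumed
z = np.arange(-2.0, 2.0 + pitch, pitch)

true_z0 = 0.03       # true surface height, deliberately off-grid
estimates = []
for _ in range(20_000):
    signal = axial_response(z, z0=true_z0) + noise * rng.standard_normal(z.size)
    estimates.append(centroid_height(z, signal))

estimates = np.array(estimates)
print(f"bias  = {estimates.mean() - true_z0:+.4f}")
print(f"u(z0) = {estimates.std(ddof=1):.4f}  (standard uncertainty of the centroid)")
```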

  10. New challenges on uncertainty propagation assessment of flood risk analysis

    NASA Astrophysics Data System (ADS)

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards such as floods cause considerable damage to human life and to material and functional assets every year around the world. Risk assessment procedures carry a set of uncertainties of two main types: natural uncertainties, derived from the stochastic character inherent in flood process dynamics, and epistemic uncertainties, associated with a lack of knowledge or with inadequate procedures employed in the study of these processes. There is abundant scientific and technical literature on uncertainty estimation for each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience with the propagation of uncertainties along the full flood risk assessment. Epistemic uncertainties are therefore the main goal of this work: in particular, understanding how far uncertainties propagate through the process, from flood inundation studies to risk analysis, and how much a proper flood risk analysis varies as a result. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments and Monte Carlo simulation are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation) and socio-demographic data (damage estimation), in order to evaluate the uncertainty propagation (UP) in design flood risk estimation, in both numerical and cartographic form. In order to account for the total uncertainty and to understand which factors contribute most to the final uncertainty, we used Polynomial Chaos Theory (PCT). It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process: PCT allows a probabilistic model of the system to be developed in a deterministic setting, using random variables and polynomials to handle the effects of uncertainty. Results of applying the method are more robust than those of a traditional analysis.
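
    A compact illustration of the polynomial chaos idea invoked in the abstract: a model with a single standard-normal input is projected onto probabilists' Hermite polynomials by Gauss quadrature, and the expansion coefficients give the output mean and variance directly. The model function and truncation order are placeholders; a real flood-risk chain involves many inputs and a sparser construction.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

def model(xi):
    """Placeholder model of one standard-normal input (stand-in for the
    rainfall -> hydraulics -> damage chain)."""
    return np.exp(0.3 * xi) + 0.1 * xi ** 2

order = 6
# Gauss nodes/weights for the weight exp(-x^2/2); normalise to the N(0,1) density.
nodes, weights = He.hermegauss(order + 1)
weights = weights / math.sqrt(2.0 * math.pi)

# Projection onto probabilists' Hermite polynomials: c_k = E[model(X) He_k(X)] / k!
coeffs = np.array([
    np.sum(weights * model(nodes) * He.hermeval(nodes, np.eye(order + 1)[k]))
    / math.factorial(k)
    for k in range(order + 1)
])

mean_pc = coeffs[0]
var_pc = sum(math.factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))

# Monte Carlo check of the PC moments.
xi = np.random.default_rng(0).standard_normal(200_000)
y = model(xi)
print(f"PC  mean {mean_pc:.4f}  var {var_pc:.4f}")
print(f"MC  mean {y.mean():.4f}  var {y.var():.4f}")
```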

  11. Uncertainty of calculation results in vehicle collision analysis.

    PubMed

    Wach, Wojciech; Unarski, Jan

    2007-04-11

    In the analysis of road accidents, two types of calculation result uncertainty can be distinguished: modelling uncertainty and uncertainty in calculation results [R.M. Brach, M. Brach, Vehicle Accident Analysis & Reconstruction Methods, SAE International Publisher, Warrendale, 2005]. The problem becomes especially important when minor modifications of input parameters or the application of different models of the phenomenon lead to fundamentally different answers to the question posed by the court. The aim of the paper was to prove the necessity of including the problem of uncertainty in calculations related to vehicle collision mechanics and to justify the application of different error analysis methods recommendable in vehicle collision reconstruction. The data file from crash test No. 7 [H. Burg, M. Lindenmann, Unfallversuche, Verlag Information Ambs, Kippenheim, 1982] was used, with the selection restricted to the range typical of average police records of the collision scene. Collision speeds were calculated using two methods: reconstruction and simulation. An analysis of uncertainty was carried out. Maximum and mean-square uncertainties were calculated by means of the total differential of the relevant formulas. Since the reconstruction resulted in very broad error intervals of uniform distribution, additional calculations were performed by the Monte Carlo method using the algorithm described in [W. Wach, J. Unarski, Determination of vehicle velocities and collision location by means of Monte Carlo simulation method, Special Publication Accident Reconstruction SP-1999, SAE Paper No. 2006-01-0907, 2006].
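
    A small sketch of the two error measures mentioned in the abstract, a worst-case bound from the total differential and a root-sum-of-squares (mean-square) combination, applied to an illustrative common-velocity collision formula. The formula and input tolerances are assumptions for demonstration, not the crash-test values.

```python
import numpy as np

def post_impact_speed(m1, m2, v1, v2):
    """Illustrative common-velocity (perfectly plastic) collision formula."""
    return (m1 * v1 + m2 * v2) / (m1 + m2)

# Nominal inputs and their assumed half-width tolerances.
x0 = dict(m1=1200.0, m2=1500.0, v1=15.0, v2=5.0)   # kg, kg, m/s, m/s
dx = dict(m1=50.0, m2=50.0, v1=1.5, v2=1.0)

# Total-differential terms from numerical partial derivatives.
terms = {}
for name in x0:
    h = 1e-6 * max(abs(x0[name]), 1.0)
    hi, lo = dict(x0), dict(x0)
    hi[name] += h
    lo[name] -= h
    dfdx = (post_impact_speed(**hi) - post_impact_speed(**lo)) / (2.0 * h)
    terms[name] = dfdx * dx[name]

u_max = sum(abs(t) for t in terms.values())           # worst-case (maximum) uncertainty
u_rss = np.sqrt(sum(t ** 2 for t in terms.values()))  # mean-square combination

v = post_impact_speed(**x0)
print(f"speed = {v:.2f} m/s,  max ±{u_max:.2f} m/s,  RSS ±{u_rss:.2f} m/s")
```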

  12. Uncertainty analysis of displacement measurement with Imetrum Video Gauge.

    PubMed

    Liu, Chenlong; Yuan, Yongbo; Zhang, Mingyuan

    2016-11-01

    The Imetrum Video Gauge is a widely used commercial image-based two-dimensional displacement measurement system. Its uncertainty analysis is presented in this paper. First, the procedure for using the Video Gauge is introduced. Then, based on the measurement model, two major sources of uncertainty are identified: (1) the uncertainty associated with the calibration procedure u(C), which is composed of the uncertainty of the known length used in calibration u(L) and the uncertainty of the projection of the known length in the image u(D), and (2) the uncertainty associated with the measurement system itself u(P). Following the Guide to the Expression of Uncertainty in Measurement, these uncertainties can be quantified. Finally, 60 experiments are performed to analyze the relationship between the measurement uncertainty and the working parameters, namely working distance, acquisition frequency and focal length of the lens. In order to ensure the validity of the calculation, two calculation methods are used. The main results of this paper are as follows: (1) the displacement measurement uncertainty increases with working distance and decreases with increasing focal length; the results also indicate that using a longer known length in calibration can reduce the measurement uncertainty. (2) u(C) is greatly influenced by the known length used in the calibration procedure and can be reduced when u(L) is reduced. (3) Under laboratory conditions, reducing u(C) can greatly reduce the total measurement uncertainty. (4) The displacement measurement uncertainty is more sensitive to the measurement uncertainty of the known length used in calibration than to that of its projection in the image. (5) As the working distance grows, the sensitivity to the known length weakens and the sensitivity to the projection of the known length in the image strengthens. (6) When a longer focal length lens is used, the influence
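
    A sketch of how the three components named in the abstract might combine under the GUM, assuming a simple scale-factor measurement model d = (L/D)·p, where L is the known calibration length, D its projection in the image and p the tracked displacement in pixels. The model and all numbers are illustrative, not the Video Gauge's internal algorithm.

```python
import math

# Illustrative values (not from the paper).
L, u_L = 100.0, 0.05          # known calibration length and its uncertainty, mm
D, u_D = 2000.0, 0.5          # projection of that length in the image, px
p, u_p = 40.0, 0.1            # tracked displacement and system repeatability, px

d = (L / D) * p               # measured displacement, mm

# Relative standard uncertainties of the multiplicative model d = (L / D) * p.
rel_cal = math.hypot(u_L / L, u_D / D)     # calibration component u(C)
rel_sys = u_p / p                          # measurement-system component u(P)
u_d = d * math.hypot(rel_cal, rel_sys)     # combined standard uncertainty

print(f"d = {d:.4f} mm, u(d) = {u_d:.4f} mm (k=1)")
print(f"calibration share of the variance: {(rel_cal**2) / (rel_cal**2 + rel_sys**2):.1%}")
```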

  13. Uncertainty and sensitivity analysis and its applications in OCD measurements

    NASA Astrophysics Data System (ADS)

    Vagos, Pedro; Hu, Jiangtao; Liu, Zhuan; Rabello, Silvio

    2009-03-01

    This article describes an Uncertainty & Sensitivity Analysis package, a mathematical tool that can be an effective time-shortcut for optimizing OCD models. By including real system noises in the model, an accurate method for predicting measurement uncertainties is shown. Assessing, at an early stage, the uncertainties, sensitivities and correlations of the parameters to be measured guides the user in optimizing the OCD measurement strategy. Real examples are discussed, revealing common pitfalls such as hidden correlations, and simulation results are compared with real measurements. Special emphasis is given to two different cases: (1) the optimization of the data set of multi-head metrology tools (NI-OCD, SE-OCD), and (2) the optimization of the azimuth measurement angle in SE-OCD. With the uncertainty and sensitivity analysis results, the right data set and measurement mode (NI-OCD, SE-OCD or NI+SE OCD) can be easily selected to achieve the best OCD model performance.

  14. Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    In order to provide a complete description of a material's thermoelectric power factor, an uncertainty interval is required in addition to the measured nominal value. The uncertainty may contain sources of measurement error including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error including sample preparation tolerance, measurement probe placement, the thermocouple cold-finger effect, and measurement parameters, in addition to uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.
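
    A small propagation sketch consistent with the abstract's aim: the power factor PF = S²/ρ inherits relative uncertainty from the Seebeck coefficient S and the resistivity ρ. The component values below are placeholders, not ZEM-3 results.

```python
import math

# Illustrative measurement results (not instrument data).
S, u_S_rel = 180e-6, 0.03        # Seebeck coefficient (V/K) and 3 % relative uncertainty
rho, u_rho_rel = 1.2e-5, 0.05    # electrical resistivity (ohm*m) and 5 % relative uncertainty

pf = S ** 2 / rho                # power factor, W/(m*K^2)

# First-order propagation for PF = S^2 / rho (uncorrelated inputs assumed).
u_pf_rel = math.sqrt((2.0 * u_S_rel) ** 2 + u_rho_rel ** 2)
print(f"PF = {pf:.2e} W/(m*K^2)  ±{u_pf_rel:.1%} (k=1)")
```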

  15. Analysis of uncertainties in turbine metal temperature predictions

    NASA Technical Reports Server (NTRS)

    Stepka, F. S.

    1980-01-01

    An analysis was conducted to examine the extent to which various factors influence the accuracy of analytically predicting turbine blade metal temperatures and to determine the uncertainties in these predictions for several accuracies of the influence factors. The advanced turbofan engine gas conditions of 1700 K and 40 atmospheres were considered along with those of a highly instrumented high temperature turbine test rig and a low temperature turbine rig that simulated the engine conditions. The analysis showed that the uncertainty in analytically predicting local blade temperature was as much as 98 K, or 7.6 percent of the metal absolute temperature, with current knowledge of the influence factors. The expected reductions in uncertainties in the influence factors with additional knowledge and tests should reduce the uncertainty in predicting blade metal temperature to 28 K, or 2.1 percent of the metal absolute temperature.

  16. INSPECTION SHOP: PLAN TO PROVIDE UNCERTAINTY ANALYSIS WITH MEASUREMENTS

    SciTech Connect

    Nederbragt, W W

    2006-12-20

    The LLNL inspection shop is chartered to make dimensional measurements of components for critical programmatic experiments. These measurements ensure that components are within tolerance and provide geometric details that can be used to further refine simulations. For these measurements to be useful, they must be significantly more accurate than the tolerances that are being checked. For example, if a part has a specified dimension of 100 millimeters and a tolerance of 1 millimeter, then the precision and/or accuracy of the measurement should be less than 1 millimeter. Using the "10-to-1 gaugemaker's rule of thumb", the desired precision of the measurement should be less than 100 micrometers. Currently, the process for associating measurement uncertainty with data is not standardized, nor is the uncertainty based on a thorough uncertainty analysis. The goal of this project is to begin providing measurement uncertainty statements with critical measurements performed in the inspection shop. To accomplish this task, the underlying sources of uncertainty for each measurement instrument need to be understood and quantified. Moreover, the elemental uncertainties from each physical source need to be combined in a meaningful way to obtain an overall measurement uncertainty.

  17. The use of fuzzy mathematics in subjective uncertainty analysis

    SciTech Connect

    Cooper, J.A.

    1996-06-01

    We have been investigating the applicability of fuzzy mathematics in probabilistic safety assessments (PSAs). It is a very efficient approach, both in terms of methodology development time and program execution time. Most importantly, it processes subjective information subjectively, not as if it were based on measured data. One of the most useful results of this work is that we have shown the potential for significant differences (especially in perceived margin relative to a decision threshold) between fuzzy mathematics analysis and conventional PSA analysis. This difference is due to subtle factors inherent in the choice of probability distributions for modeling uncertainty. Since subjective uncertainty, stochastic variability, and dependence are all parts of most practical situations, a technique has been developed for combining the three effects. The methodology is based on hybrid numbers and on Frechet inequality dependency bounds analysis. Some new results have also been obtained in the areas of efficient disjoint set representations and constrained uncertainty and variability analysis.
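
    An elementary illustration of the fuzzy-arithmetic idea behind the abstract: two triangular fuzzy numbers are combined by interval arithmetic on their α-cuts, so subjective 'about x' judgments propagate without being recast as probability distributions. The membership functions are invented for the example.

```python
import numpy as np

def tri_alpha_cut(a, b, c, alpha):
    """Alpha-cut [lo, hi] of a triangular fuzzy number (a, b, c)."""
    return a + alpha * (b - a), c - alpha * (c - b)

# Two subjective inputs, e.g. "about 0.02" and "about 0.05".
A = (0.01, 0.02, 0.04)
B = (0.03, 0.05, 0.08)

for alpha in np.linspace(0.0, 1.0, 6):
    a_lo, a_hi = tri_alpha_cut(*A, alpha)
    b_lo, b_hi = tri_alpha_cut(*B, alpha)
    # Interval addition gives the alpha-cut of the fuzzy sum.
    s_lo, s_hi = a_lo + b_lo, a_hi + b_hi
    print(f"alpha={alpha:.1f}  sum in [{s_lo:.3f}, {s_hi:.3f}]")
```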

  18. Analysis and Reduction of Complex Networks Under Uncertainty

    SciTech Connect

    Knio, Omar M

    2014-04-09

    This is a collaborative proposal that aims at developing new methods for the analysis and reduction of complex multiscale networks under uncertainty. The approach is based on combining methods of computational singular perturbation (CSP) and probabilistic uncertainty quantification. In deterministic settings, CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty raises fundamentally new issues, particularly concerning its impact on the topology of slow manifolds, and means to represent and quantify associated variability. To address these challenges, this project uses polynomial chaos (PC) methods to reformulate uncertain network models, and to analyze them using CSP in probabilistic terms. Specific objectives include (1) developing effective algorithms that can be used to illuminate fundamental and unexplored connections among model reduction, multiscale behavior, and uncertainty, and (2) demonstrating the performance of these algorithms through applications to model problems.

  19. Effect of material uncertainties on dynamic analysis of piezoelectric fans

    NASA Astrophysics Data System (ADS)

    Srivastava, Swapnil; Yadav, Shubham Kumar; Mukherjee, Sujoy

    2015-04-01

    A piezofan is a resonant device that uses a piezoceramic material to induce oscillations in a cantilever beam. In this study, lumped-mass modelling is used to analyze a piezoelectric fan. Uncertainties are associated with piezoelectric structures for several reasons, such as variation during the manufacturing process, temperature, and the presence of an adhesive layer between the piezoelectric actuator/sensor and the shim stock. The presence of uncertainty in the piezoelectric materials can influence the dynamic behavior of the piezoelectric fan, such as its natural frequency and tip deflection, and these quantities in turn affect the performance parameters of the fan. Uncertainty analysis is performed using classical Monte Carlo Simulation (MCS). It is found that the propagation of uncertainty causes significant deviations from the baseline deterministic predictions, which also affect the achievable performance of the piezofan. The numerical results in this paper provide useful bounds on several performance parameters of the cooling fan and will enhance confidence in the design process.
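
    A minimal Monte Carlo sketch of the kind of lumped-parameter study the abstract describes: the first natural frequency f = (1/2π)·sqrt(k/m) of a single-degree-of-freedom surrogate is sampled with uncertain stiffness and mass. The nominal values and coefficients of variation are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

# Nominal lumped parameters of the surrogate beam/patch system (illustrative).
k_nom, m_nom = 250.0, 2.0e-3          # stiffness (N/m), effective mass (kg)
cv_k, cv_m = 0.08, 0.03               # assumed manufacturing/material scatter

k = rng.normal(k_nom, cv_k * k_nom, n)
m = rng.normal(m_nom, cv_m * m_nom, n)

f = np.sqrt(k / m) / (2.0 * np.pi)    # natural frequency, Hz

print(f"deterministic f = {np.sqrt(k_nom / m_nom) / (2.0 * np.pi):.1f} Hz")
print(f"MCS mean {f.mean():.1f} Hz, std {f.std(ddof=1):.1f} Hz, "
      f"95% interval [{np.percentile(f, 2.5):.1f}, {np.percentile(f, 97.5):.1f}] Hz")
```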

  20. Uncertainty analysis for common Seebeck and electrical resistivity measurement systems.

    PubMed

    Mackey, Jon; Dynys, Frederick; Sehirlioglu, Alp

    2014-08-01

    This work establishes the level of uncertainty for electrical measurements commonly made on thermoelectric samples. The analysis targets measurement systems based on the four probe method. Sources of uncertainty for both electrical resistivity and Seebeck coefficient were identified and evaluated. Included are reasonable estimates on the magnitude of each source, and cumulative propagation of error. Uncertainty for the Seebeck coefficient includes the cold-finger effect which has been quantified with thermal finite element analysis. The cold-finger effect, which is a result of parasitic heat transfer down the thermocouple probes, leads to an asymmetric over-estimation of the Seebeck coefficient. A silicon germanium thermoelectric sample has been characterized to provide an understanding of the total measurement uncertainty. The electrical resistivity was determined to contain uncertainty of ±7.0% across any measurement temperature. The Seebeck coefficient of the system is +1.0%/-13.1% at high temperature and ±1.0% near room temperature. The power factor has a combined uncertainty of +7.3%/-27.0% at high temperature and ±7.5% near room temperature. These ranges are calculated to be typical values for a general four probe Seebeck and resistivity measurement configuration.

  1. Parameter uncertainty analysis of a biokinetic model of caesium

    DOE PAGES

    Li, W. B.; Klein, W.; Blanchardon, Eric; ...

    2014-04-17

    Parameter uncertainties for the biokinetic model of caesium (Cs) developed by Leggett et al. were inventoried and evaluated. The methods of parameter uncertainty analysis were used to assess the uncertainties of model predictions with the assumptions of model parameter uncertainties and distributions. Furthermore, the importance of individual model parameters was assessed by means of sensitivity analysis. The calculated uncertainties of model predictions were compared with human data of Cs measured in blood and in the whole body. It was found that propagating the derived uncertainties in model parameter values reproduced the range of bioassay data observed in human subjects at different times after intake. The maximum ranges, expressed as uncertainty factors (UFs) (defined as a square root of ratio between 97.5th and 2.5th percentiles) of blood clearance, whole-body retention and urinary excretion of Cs predicted at earlier time after intake were, respectively: 1.5, 1.0 and 2.5 at the first day; 1.8, 1.1 and 2.4 at Day 10 and 1.8, 2.0 and 1.8 at Day 100; for the late times (1000 d) after intake, the UFs were increased to 43, 24 and 31, respectively. The model parameters of transfer rates between kidneys and blood, muscle and blood and the rate of transfer from kidneys to urinary bladder content are most influential to the blood clearance and to the whole-body retention of Cs. For the urinary excretion, the parameters of transfer rates from urinary bladder content to urine and from kidneys to urinary bladder content impact mostly. The implication and effect on the estimated equivalent and effective doses of the larger uncertainty of 43 in whole-body retention in the later time, say, after Day 500 will be explored in a successive work in the framework of EURADOS.
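
    A small helper showing the uncertainty-factor convention the abstract uses, UF = sqrt(97.5th/2.5th percentile), evaluated here on a synthetic lognormal sample rather than on the biokinetic model's output.

```python
import numpy as np

def uncertainty_factor(samples):
    """UF = sqrt(p97.5 / p2.5), the convention quoted in the abstract."""
    p2_5, p97_5 = np.percentile(samples, [2.5, 97.5])
    return np.sqrt(p97_5 / p2_5)

# Synthetic stand-in for a predicted whole-body retention at one time point.
rng = np.random.default_rng(7)
retention = rng.lognormal(mean=np.log(0.4), sigma=0.3, size=100_000)

print(f"UF = {uncertainty_factor(retention):.2f}")
```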

  2. Parameter uncertainty analysis of a biokinetic model of caesium

    SciTech Connect

    Li, W. B.; Klein, W.; Blanchardon, Eric; Puncher, M; Leggett, Richard Wayne; Oeh, U.; Breustedt, B.; Nosske, Dietmar; Lopez, M.

    2014-04-17

    Parameter uncertainties for the biokinetic model of caesium (Cs) developed by Leggett et al. were inventoried and evaluated. The methods of parameter uncertainty analysis were used to assess the uncertainties of model predictions with the assumptions of model parameter uncertainties and distributions. Furthermore, the importance of individual model parameters was assessed by means of sensitivity analysis. The calculated uncertainties of model predictions were compared with human data of Cs measured in blood and in the whole body. It was found that propagating the derived uncertainties in model parameter values reproduced the range of bioassay data observed in human subjects at different times after intake. The maximum ranges, expressed as uncertainty factors (UFs) (defined as a square root of ratio between 97.5th and 2.5th percentiles) of blood clearance, whole-body retention and urinary excretion of Cs predicted at earlier time after intake were, respectively: 1.5, 1.0 and 2.5 at the first day; 1.8, 1.1 and 2.4 at Day 10 and 1.8, 2.0 and 1.8 at Day 100; for the late times (1000 d) after intake, the UFs were increased to 43, 24 and 31, respectively. The model parameters of transfer rates between kidneys and blood, muscle and blood and the rate of transfer from kidneys to urinary bladder content are most influential to the blood clearance and to the whole-body retention of Cs. For the urinary excretion, the parameters of transfer rates from urinary bladder content to urine and from kidneys to urinary bladder content impact mostly. The implication and effect on the estimated equivalent and effective doses of the larger uncertainty of 43 in whole-body retention in the later time, say, after Day 500 will be explored in a successive work in the framework of EURADOS.

  3. Parameter uncertainty analysis of a biokinetic model of caesium.

    PubMed

    Li, W B; Klein, W; Blanchardon, E; Puncher, M; Leggett, R W; Oeh, U; Breustedt, B; Noßke, D; Lopez, M A

    2015-01-01

    Parameter uncertainties for the biokinetic model of caesium (Cs) developed by Leggett et al. were inventoried and evaluated. The methods of parameter uncertainty analysis were used to assess the uncertainties of model predictions with the assumptions of model parameter uncertainties and distributions. Furthermore, the importance of individual model parameters was assessed by means of sensitivity analysis. The calculated uncertainties of model predictions were compared with human data of Cs measured in blood and in the whole body. It was found that propagating the derived uncertainties in model parameter values reproduced the range of bioassay data observed in human subjects at different times after intake. The maximum ranges, expressed as uncertainty factors (UFs) (defined as a square root of ratio between 97.5th and 2.5th percentiles) of blood clearance, whole-body retention and urinary excretion of Cs predicted at earlier time after intake were, respectively: 1.5, 1.0 and 2.5 at the first day; 1.8, 1.1 and 2.4 at Day 10 and 1.8, 2.0 and 1.8 at Day 100; for the late times (1000 d) after intake, the UFs were increased to 43, 24 and 31, respectively. The model parameters of transfer rates between kidneys and blood, muscle and blood and the rate of transfer from kidneys to urinary bladder content are most influential to the blood clearance and to the whole-body retention of Cs. For the urinary excretion, the parameters of transfer rates from urinary bladder content to urine and from kidneys to urinary bladder content impact mostly. The implication and effect on the estimated equivalent and effective doses of the larger uncertainty of 43 in whole-body retention in the later time, say, after Day 500 will be explored in a successive work in the framework of EURADOS.

  4. Propagation of variance uncertainty calculation for an autopsy tissue analysis

    SciTech Connect

    Bruckner, L.A.

    1994-07-01

    When a radiochemical analysis is reported, it is often accompanied by an uncertainty value that simply reflects the natural variation in the observed counts due to radioactive decay, the so-called counting statistics. However, when the assay procedure is complex or when the number of counts is large, there are usually other important contributors to the total measurement uncertainty that need to be considered. An assay value is almost useless unless it is accompanied by a measure of the uncertainty associated with that value. The uncertainty value should reflect all the major sources of variation and bias affecting the assay and should provide a specified level of confidence. An approach to uncertainty calculation that includes the uncertainty due to instrument calibration, values of the standards, and intermediate measurements as well as counting statistics is presented and applied to the analysis of an autopsy tissue. This approach, usually called propagation of variance, attempts to clearly distinguish between errors that have systematic (bias) effects and those that have random effects on the assays. The effects of these different types of errors are then propagated to the assay using formal statistical techniques. The result is an uncertainty on the assay that has a defensible level of confidence and which can be traced to individual major contributors. However, since only measurement steps are readily quantified and since all models are approximations, it is emphasized that without empirical verification, a propagation of uncertainty model may be just a fancy model with no connection to reality. 5 refs., 1 fig., 2 tab.
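
    A sketch of the propagation-of-variance bookkeeping the abstract describes, for a toy assay A = N/(ε·m·t): counting statistics on N and the mass uncertainty are random components, while the calibration (efficiency) uncertainty is a shared component that is still combined in quadrature for the total. Every value is illustrative.

```python
import math

# Toy assay: activity = counts / (efficiency * mass * count_time).
N = 10_000                    # net counts
eps, u_eps_rel = 0.25, 0.03   # detector efficiency from calibration, 3 % rel. uncertainty
m, u_m_rel = 1.50, 0.01       # tissue mass (g), 1 % rel. uncertainty
t = 600.0                     # count time (s), assumed exact

A = N / (eps * m * t)                                     # assay result (counts/(s*g) stand-in)

u_N_rel = math.sqrt(N) / N                                # counting statistics alone
u_random_rel = math.sqrt(u_N_rel ** 2 + u_m_rel ** 2)     # random components
u_total_rel = math.sqrt(u_random_rel ** 2 + u_eps_rel ** 2)  # plus calibration component

print(f"A = {A:.3e},  counting only ±{u_N_rel:.1%},  all sources ±{u_total_rel:.1%}")
```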

  5. Uncertainty Analysis and Parameter Estimation For Nearshore Hydrodynamic Models

    NASA Astrophysics Data System (ADS)

    Ardani, S.; Kaihatu, J. M.

    2012-12-01

    Numerical models represent deterministic approaches to the relevant physical processes in the nearshore. The complexity of the model physics and the uncertainty involved in the model inputs compel us to apply a stochastic approach to analyze the robustness of the model. The Bayesian inverse problem is one powerful way to estimate the important input model parameters (determined by a priori sensitivity analysis) and can be used for uncertainty analysis of the outputs. Bayesian techniques can be used to find the range of most probable parameters based on the probability of the observed data and the residual errors. In this study, the effect of input data involving lateral (Neumann) boundary conditions, bathymetry and offshore wave conditions on nearshore numerical models is considered. Monte Carlo simulation is applied to a deterministic numerical model (the Delft3D modeling suite for coupled waves and flow) for the resulting uncertainty analysis of the outputs (wave height, flow velocity, mean sea level, etc.). Uncertainty analysis of the outputs is performed by randomly sampling from the input probability distribution functions and running the model as many times as required until the results converge. The case study used in this analysis is the Duck94 experiment, which was conducted at the U.S. Army Field Research Facility at Duck, North Carolina, USA in the fall of 1994. The joint probability of the model parameters relevant for the Duck94 experiments will be found using the Bayesian approach. We will further show that, by using Bayesian techniques to estimate the optimized model parameters as inputs and applying them in the uncertainty analysis, we can obtain more consistent results than by using the prior information for the input data: the variation of the uncertain parameters decreases and the probability of the observed data improves as well. Keywords: Monte Carlo Simulation, Delft3D, uncertainty analysis, Bayesian techniques

  6. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. An analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is acceptable for traditional plate-count methods.

  7. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through the uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of a model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of model development, so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses). Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping
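
    A compact example of the variance-based global sensitivity analysis the abstract advocates: first-order Sobol indices of the standard Ishigami test function, estimated with a brute-force conditional-variance approach. The test function stands in for an environmental model; the estimator is the simplest possible, not the one used by the authors.

```python
import numpy as np

rng = np.random.default_rng(0)

def ishigami(x1, x2, x3, a=7.0, b=0.1):
    """Standard global-sensitivity test function (stand-in for a coupled model)."""
    return np.sin(x1) + a * np.sin(x2) ** 2 + b * x3 ** 4 * np.sin(x1)

# Total output variance from a plain Monte Carlo sample, inputs ~ U(-pi, pi).
var_total = np.var(ishigami(*rng.uniform(-np.pi, np.pi, (200_000, 3)).T))

def first_order_index(which, n_outer=200, n_inner=2000):
    """Brute-force S_i = Var(E[Y | X_i]) / Var(Y)."""
    conditional_means = []
    for xi in rng.uniform(-np.pi, np.pi, n_outer):
        block = rng.uniform(-np.pi, np.pi, (n_inner, 3))
        block[:, which] = xi                  # freeze the input of interest
        conditional_means.append(ishigami(*block.T).mean())
    return np.var(conditional_means) / var_total

for i in range(3):
    print(f"S{i + 1} ~ {first_order_index(i):.2f}")   # analytic values: 0.31, 0.44, 0.00
```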

  8. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Formal expert elicitation, with the experts developing their distributions independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data, along with short biographies of the 16 experts who participated in the project.

  9. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.

  10. Adaptive framework for uncertainty analysis in electromagnetic field measurements.

    PubMed

    Prieto, Javier; Alonso, Alonso A; de la Rosa, Ramón; Carrera, Albano

    2015-04-01

    Misinterpretation of uncertainty in the measurement of the electromagnetic field (EMF) strength may lead to an underestimation of exposure risk or an overestimation of required measurements. The Guide to the Expression of Uncertainty in Measurement (GUM) has internationally been adopted as a de facto standard for uncertainty assessment. However, analyses under such an approach commonly assume unrealistic static models or neglect relevant prior information, resulting in non-robust uncertainties. This study proposes a principled and systematic framework for uncertainty analysis that fuses information from current measurements and prior knowledge. Such a framework dynamically adapts to data by exploiting a likelihood function based on kernel mixtures and incorporates flexible choices of prior information by applying importance sampling. The validity of the proposed techniques is assessed from measurements performed with a broadband radiation meter and an isotropic field probe. The developed framework significantly outperforms the GUM approach, achieving a reduction of 28% in measurement uncertainty. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. Planning for robust reserve networks using uncertainty analysis

    USGS Publications Warehouse

    Moilanen, A.; Runge, M.C.; Elith, J.; Tyre, A.; Carmel, Y.; Fegraus, E.; Wintle, B.A.; Burgman, M.; Ben-Haim, Y.

    2006-01-01

    Planning land-use for biodiversity conservation frequently involves computer-assisted reserve selection algorithms. Typically such algorithms operate on matrices of species presence-absence in sites, or on species-specific distributions of model predicted probabilities of occurrence in grid cells. There are practically always errors in input data: erroneous species presence-absence data, structural and parametric uncertainty in predictive habitat models, and lack of correspondence between temporal presence and long-run persistence. Despite these uncertainties, typical reserve selection methods proceed as if there is no uncertainty in the data or models. Having two conservation options of apparently equal biological value, one would prefer the option whose value is relatively insensitive to errors in planning inputs. In this work we show how uncertainty analysis for reserve planning can be implemented within a framework of information-gap decision theory, generating reserve designs that are robust to uncertainty. Consideration of uncertainty involves modifications to the typical objective functions used in reserve selection. Search for robust-optimal reserve structures can still be implemented via typical reserve selection optimization techniques, including stepwise heuristics, integer-programming and stochastic global search.

  12. Approaches to uncertainty analysis in probabilistic risk assessment

    SciTech Connect

    Bohn, M.P.; Wheeler, T.A.; Parry, G.W.

    1988-01-01

    An integral part of any probabilistic risk assessment (PRA) is the performance of an uncertainty analysis to quantify the uncertainty in the point estimates of the risk measures considered. While a variety of classical methods of uncertainty analysis exist, applying these methods and developing new techniques consistent with existing PRA data bases and the need for expert (subjective) input have been areas of considerable interest since the pioneering Reactor Safety Study (WASH-1400) in 1975. This report presents the results of a critical review of existing methods for performing uncertainty analyses for PRAs, with special emphasis on identifying data base limitations of the various methods. Both classical and Bayesian approaches have been examined. This work was funded by the US Nuclear Regulatory Commission in support of its ongoing full-scope PRA of the LaSalle nuclear power station. Thus, in addition to the review, this report contains recommendations for a suitable uncertainty analysis methodology for the LaSalle PRA.

  13. Uncertainty Analysis for RELAP5-3D

    SciTech Connect

    Aaron J. Pawel; Dr. George L. Mesina

    2011-08-01

    In its current state, RELAP5-3D is a 'best-estimate' code; it is one of our most reliable programs for modeling what occurs within reactor systems in transients from given initial conditions. This code, however, remains an estimator. A statistical analysis has been performed that begins to lay the foundation for a full uncertainty analysis. By varying the inputs over assumed probability density functions, the output parameters were shown to vary. Using statistical tools such as means, variances, and tolerance intervals, a picture has been obtained of how uncertain the results are, given the uncertainty of the inputs.
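
    One way to make the tolerance-interval step concrete is the classical order-statistic (Wilks-type) argument sketched below; this is offered as a general illustration of that idea, not as a statement of what the RELAP5-3D study actually implemented.

```python
import math
import numpy as np

def min_runs_one_sided(coverage=0.95, confidence=0.95):
    """Smallest n such that the sample maximum bounds the `coverage` quantile
    with probability >= `confidence` (first-order Wilks argument)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

n = min_runs_one_sided()          # 59 runs for a 95 % / 95 % one-sided bound

# Toy use: the largest of n code outputs serves as the upper tolerance bound.
rng = np.random.default_rng(3)
peak_output = rng.normal(600.0, 15.0, n)   # placeholder output samples
print(f"runs needed: {n}")
print(f"95%/95% upper tolerance bound ~ {peak_output.max():.1f}")
```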

  14. Uncertainty analysis for absorption and first-derivative EPR spectra.

    PubMed

    Tseitlin, Mark; Eaton, Sandra S; Eaton, Gareth R

    2012-11-01

    Electron paramagnetic resonance (EPR) experimental techniques produce absorption or first-derivative spectra. Uncertainty analysis provides the basis for comparison of spectra obtained by different methods. In this study it was used to derive analytical equations to relate uncertainties for integrated intensity and line widths obtained from absorption or first-derivative spectra to the signal-to-noise ratio (SNR), with the assumption of white noise. Predicted uncertainties for integrated intensities and line widths are in good agreement with Monte Carlo calculations for Lorentzian and Gaussian lineshapes. Conservative low-pass filtering changes the noise spectrum, which can be modeled in the Monte Carlo simulations. When noise is close to white, the analytical equations provide useful estimates of uncertainties. For example, for a Lorentzian line with white noise, the uncertainty in the number of spins obtained from the first-derivative spectrum is 2.6 times greater than from the absorption spectrum at the same SNR. Uncertainties in line widths obtained from absorption and first-derivative spectra are similar. The impact of integration or differentiation on SNR and on uncertainties in fitting parameters was analyzed. Although integration of the first-derivative spectrum improves the apparent smoothness of the spectrum, it also changes the frequency distribution of the noise. If the lineshape of the signal is known, the integrated intensity can be determined more accurately by fitting the first-derivative spectrum than by first integrating and then fitting the absorption spectrum. Uncertainties in integrated intensities and line widths are less when the parameters are determined from the original data than from spectra that have been either integrated or differentiated.

  15. Uncertainty and sensitivity analysis for photovoltaic system modeling.

    SciTech Connect

    Hansen, Clifford W.; Pohl, Andrew Phillip; Jordan, Dirk

    2013-12-01

    We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprised of a single module using either crystalline silicon or CdTe cells, and located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice among these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of upwards of 5% of daily energy, which translates directly to a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to uncertainty arising from each model. We found the residuals arising from the POA irradiance and effective irradiance models to be the dominant contributors to the residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.

  16. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Haskin, F.E.; Harper, F.T.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  17. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Boardman, J.; Jones, J.A.; Harper, F.T.; Young, M.L.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  18. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  19. Application of Model Based Uncertainty Analysis to Hydrocarbon Reservoirs

    NASA Astrophysics Data System (ADS)

    Lyons, S. L.; Lee, L. W.

    2007-12-01

    Model-Based Uncertainty Analysis (MBUA) is a method to evaluate reservoir performance uncertainty using multiple 3D models. Experimental Design techniques are used to determine what 3D models are built (unique geologic models which are then simulated) based on the number of identified uncertain factors. This method captures main effects and factor interactions and results in a multivariable response surface for each desired outcome (e.g. expected ultimate recovery, original hydrocarbon in place) which can be used for Monte Carlo simulations. The entire MBUA process results in tornado plots, exceedence curves, and a method to build representative models. A key strength of the MBUA process is the ability to capture dynamic responses such as water rate, production at early times, and plateau length. These responses provide project teams a greater understanding of how their subsurface uncertainties impact field performance and can help guide additional technical work, development planning decisions and representative model building. We have applied either a full or partial MBUA process to several fields for the purposes of identifying key technical uncertainties and improving the development plan across likely outcomes. Once an initial base case model has been built, subsequent models (totaling 16-54 models) can be rapidly built and simulated using normal geologic and flow simulation model workflows. While the analysis is quick, all model results are required to complete the analysis, thus simulation time is the biggest bottleneck in the process. Production data have been incorporated through either history-matching the initial base case or examining Monte Carlo simulation results for potential trends. The uncertainty analysis provides a streamlined process to examine potential uncertainty factors and allows a project to make informed decisions regarding future technical work or potential mitigation plans. A process overview will be presented along with sample results
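
    A minimal sketch of an MBUA-style workflow: a two-level full factorial design over three coded factors stands in for the suite of simulated geologic models, a response surface with two-factor interactions is fitted to their outputs, and Monte Carlo sampling of the surface yields exceedence percentiles. The factor names, ranges and response function are invented.

```python
import itertools
import numpy as np

rng = np.random.default_rng(5)

# Three illustrative uncertain factors, coded to the interval [-1, +1].
factors = ["net-to-gross", "aquifer strength", "fault seal"]
design = np.array(list(itertools.product([-1.0, 1.0], repeat=3)))   # 2^3 full factorial

def simulator(x):
    """Stand-in for building and flow-simulating one geologic model."""
    ntg, aq, fault = x
    return 100.0 + 12.0 * ntg + 6.0 * aq - 4.0 * fault + 3.0 * ntg * aq   # 'recovery'

responses = np.array([simulator(row) for row in design])

# Fit a response surface (main effects plus two-factor interactions) by least squares.
def basis(x):
    ntg, aq, fault = x
    return [1.0, ntg, aq, fault, ntg * aq, ntg * fault, aq * fault]

X = np.array([basis(row) for row in design])
coef, *_ = np.linalg.lstsq(X, responses, rcond=None)

# Monte Carlo on the fitted surface: sample the coded factors, read off exceedence levels.
samples = rng.uniform(-1.0, 1.0, (20_000, 3))
predicted = np.array([basis(row) for row in samples]) @ coef
p90, p50, p10 = np.percentile(predicted, [10, 50, 90])   # P90 low case, P10 high case
print(f"P90 {p90:.1f}   P50 {p50:.1f}   P10 {p10:.1f}")
```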

  20. Visual Scanning Hartmann Optical Tester (VSHOT) Uncertainty Analysis (Milestone Report)

    SciTech Connect

    Gray, A.; Lewandowski, A.; Wendelin, T.

    2010-10-01

    In 1997, an uncertainty analysis was conducted of the Video Scanning Hartmann Optical Tester (VSHOT). In 2010, we have completed a new analysis, based primarily on the geometric optics of the system, and it shows sensitivities to various design and operational parameters. We discuss sources of error with measuring devices, instrument calibrations, and operator measurements for a parabolic trough mirror panel test. These help to guide the operator in proper setup, and help end-users to understand the data they are provided. We include both the systematic (bias) and random (precision) errors for VSHOT testing and their contributions to the uncertainty. The contributing factors we considered in this study are: target tilt; target face to laser output distance; instrument vertical offset; laser output angle; distance between the tool and the test piece; camera calibration; and laser scanner. These contributing factors were applied to the calculated slope error, focal length, and test article tilt that are generated by the VSHOT data processing. Results show the estimated 2-sigma uncertainty in slope error for a parabolic trough line scan test to be +/-0.2 milliradians; uncertainty in the focal length is +/- 0.1 mm, and the uncertainty in test article tilt is +/- 0.04 milliradians.

  1. Measures, Uncertainties, and Significance Test in Operational ROC Analysis

    PubMed Central

    Wu, Jin Chu; Martin, Alvin F.; Kacker, Raghu N.

    2011-01-01

    In receiver operating characteristic (ROC) analysis, the sampling variability can result in uncertainties of performance measures. Thus, while evaluating and comparing the performances of algorithms, the measurement uncertainties must be taken into account. The key issue is how to calculate the uncertainties of performance measures in ROC analysis. Our ultimate goal is to perform the significance test in evaluation and comparison using the standard errors computed. From the operational perspective, based on fingerprint-image matching algorithms on large datasets, the measures and their uncertainties are investigated in the three scenarios: 1) the true accept rate (TAR) of genuine scores at a specified false accept rate (FAR) of impostor scores, 2) the TAR and FAR at a given threshold, and 3) the equal error rate. The uncertainties of measures are calculated using the nonparametric two-sample bootstrap based on our extensive studies of bootstrap variability on large datasets. The significance test is carried out to determine whether the difference between the performance of one algorithm and a hypothesized value, or the difference between the performances of two algorithms where the correlation is taken into account is statistically significant. Examples are provided. PMID:26989582
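
    A minimal illustration of the nonparametric two-sample bootstrap the abstract relies on: genuine and impostor scores are resampled independently, the TAR at a fixed FAR is recomputed for each replicate, and the spread of the replicates gives its standard error. The score distributions here are synthetic, not fingerprint data.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic matcher scores (stand-ins for fingerprint comparison scores).
genuine = rng.normal(2.0, 1.0, 3000)
impostor = rng.normal(0.0, 1.0, 30000)

def tar_at_far(gen, imp, far=0.001):
    """TAR at a fixed FAR: threshold at the (1 - FAR) quantile of impostor scores."""
    threshold = np.quantile(imp, 1.0 - far)
    return np.mean(gen >= threshold)

point = tar_at_far(genuine, impostor)

# Nonparametric two-sample bootstrap: resample each score set independently.
reps = np.array([
    tar_at_far(rng.choice(genuine, genuine.size, replace=True),
               rng.choice(impostor, impostor.size, replace=True))
    for _ in range(2000)
])

print(f"TAR@FAR=0.1% = {point:.3f} ± {reps.std(ddof=1):.3f} (bootstrap SE)")
```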

  2. Solar model uncertainties, MSW analysis, and future solar neutrino experiments

    NASA Astrophysics Data System (ADS)

    Hata, Naoya; Langacker, Paul

    1994-07-01

    Various theoretical uncertainties in the standard solar model and in the Mikheyev-Smirnov-Wolfenstein (MSW) analysis are discussed. It is shown that two methods give consistent estimates of the solar neutrino flux uncertainties: (a) a simple parametrization of the uncertainties using the core temperature and the nuclear production cross sections; (b) the Monte Carlo method of Bahcall and Ulrich. In the MSW analysis, we emphasize proper treatment of correlations of theoretical uncertainties between flux components and between different detectors, the Earth effect, and multiple solutions in a combined χ2 procedure. In particular, the large-angle solution of the combined observation is allowed at 95% C.L. only when the theoretical uncertainties are included; if their correlations were ignored, the region would be overestimated. The MSW solutions for various standard and nonstandard solar models are also shown. The MSW predictions of the global solutions for the future solar neutrino experiments are given, emphasizing the measurement of the energy spectrum and the day-night effect in the Sudbury Neutrino Observatory and Super-Kamiokande to distinguish the two solutions.

  3. FCF metallic waste data uncertainty analysis.

    SciTech Connect

    Yacout, A. M.

    1998-02-02

    Metallic waste is a residual of the electrometallurgical treatment of the Experimental Breeder Reactor II (EBR-II) spent fuel. The treatment is based on electrorefining the fuel in molten salt, and currently it remains in the demonstration phase at the Fuel Conditioning Facility (FCF) at Argonne National Laboratory-West (ANL-W). A reference metallic waste form is produced by mixing 15% zirconium with the stainless steel cladding hulls, which remain in the fuel dissolution baskets (FDB's) after electrorefining, to form a metal alloy. Estimates of uranium in this waste form are of importance to both operations and sensitive materials control and accountability (MC and A). Accurate estimates of the uranium in this product provide important information regarding the dissolution of uranium and the efficiency of the electrorefining process. Under certain operating conditions, non-negligible amounts of uranium were found in this stream, which made it an area of interest for MC and A. The estimates of uranium in this waste stream are currently provided through analysis of cladding hulls samples. The collected cladding hulls data and the errors associated with the data are discussed here, in addition to the effects of these errors on the overall facility ID variance.

  4. Uncertainty Analysis and Control in Nonlinear, Multiscale, Interconnected Systems

    DTIC Science & Technology

    2009-10-22

    Exchange of information with researchers from the United Technologies Research Center on uncertainty analysis and control of thermoacoustic instabilities. Transitions: Performer: I. Mezic. Customer: United Technologies Research Center, Hartford, Connecticut. Contact: Dr. Andraej Banaszuk. Result: Joint work on coupled oscillator models of a thermoacoustic instability control concept derived from coupled

  5. Managing Uncertainty: Environmental Analysis/Forecasting in Academic Planning.

    ERIC Educational Resources Information Center

    Morrison, James L.; Mecca, Thomas V.

    An approach to environmental analysis and forecasting that educational policymakers can employ in dealing with the level of uncertainty in strategic decision making is presented. Traditional planning models are weak in identifying environmental changes and assessing their organizational impact. The proposed approach does not lead decision makers…

  6. Practitioner Representations of Environmental Uncertainty: An Application of Discriminant Analysis.

    ERIC Educational Resources Information Center

    Acharya, Lalit

    Multiple discriminant analysis was used to analyze the structure of a perceived environmental uncertainty variable employed previously in research on public relations roles. Data came from a subset (N=229) of a national sample of public relations practitioners belonging to the Public Relations Society of America, who completed a set of scaled…

  7. Estimating annual bole biomass production using uncertainty analysis

    Treesearch

    Travis J. Woolley; Mark E. Harmon; Kari B. O' Connell

    2007-01-01

    Two common sampling methodologies coupled with a simple statistical model were evaluated to determine the accuracy and precision of annual bole biomass production (BBP) and inter-annual variability estimates using this type of approach. We performed an uncertainty analysis using Monte Carlo methods in conjunction with radial growth core data from trees in three Douglas...

  8. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    ERIC Educational Resources Information Center

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…

  9. Evaluating an Information System for Policy Modeling and Uncertainty Analysis.

    ERIC Educational Resources Information Center

    Henrion, Max; And Others

    1986-01-01

    The purpose of this evaluation of DEMOS, a system for decision modeling and uncertainty analysis, was to identify some generic issues of interest in the design of such information systems, obtain insights about the causes of problems in their use, and to suggest hypotheses about how to deal with them. (EM)

  10. Sensitivity and Uncertainty Analysis of the GFR MOX Fuel Subassembly

    NASA Astrophysics Data System (ADS)

    Lüley, J.; Vrban, B.; Čerba, Š.; Haščík, J.; Nečas, V.; Pelloni, S.

    2014-04-01

    We performed a sensitivity and uncertainty analysis as well as a benchmark similarity assessment of the MOX fuel subassembly designed for the Gas-Cooled Fast Reactor (GFR) as a representative material of the core. The material composition was defined for each assembly ring separately, allowing us to decompose the sensitivities not only for isotopes and reactions but also for spatial regions. This approach was confirmed by direct perturbation calculations for chosen materials and isotopes. The similarity assessment identified only ten partly comparable benchmark experiments that can be utilized in the field of GFR development. Based on the determined uncertainties, we also identified the main contributors to the calculation bias.

  11. The effects of uncertainty on the analysis of atmospheric deposition

    SciTech Connect

    Bloyd, C.N. ); Small, M.J.; Henrion, M.; Rubin, E.S. )

    1988-01-01

    Research efforts on the problem of acid rain are directed at improving current scientific understanding in critical areas, including sources of precursor emissions, the transport and transformation of pollutants in the atmosphere, the deposition of acidic species, and the chemical and biological effects of acid deposition on aquatic systems, materials, forests, crops and human health. The general goal of these research efforts is to characterize the current situation and to develop analytical models which can be used to predict the response of various systems to changes in critical parameters. This paper describes a framework which enables one to characterize uncertainty at each major stage of the modeling process. Following a general presentation of the modeling framework, a description is given of the methods chosen to characterize uncertainty for each major step. Analysis is then performed to illustrate the effects of uncertainty on future lake acidification in the Adirondacks Park area of upstate New York.

  12. Statistical uncertainty analysis of radon transport in nonisothermal, unsaturated soils

    SciTech Connect

    Holford, D.J.; Owczarski, P.C.; Gee, G.W.; Freeman, H.D.

    1990-10-01

    To accurately predict radon fluxes from soils to the atmosphere, we must know more than the radium content of the soil. Radon flux from soil is affected not only by soil properties, but also by meteorological factors such as air pressure and temperature changes at the soil surface, as well as the infiltration of rainwater. Natural variations in meteorological factors and soil properties contribute to uncertainty in subsurface model predictions of radon flux, which, when coupled with a building transport model, will also add uncertainty to predictions of radon concentrations in homes. A statistical uncertainty analysis using our Rn3D finite-element numerical model was conducted to assess the relative importance of these meteorological factors and the soil properties affecting radon transport. 10 refs., 10 figs., 3 tabs.

  13. Calibration of solar cells' photoelectric properties and related uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Meng, Haifeng; Xiong, Limin; He, Yingwei; Zhang, Junchao; Tian, Wei; Liu, Dingpu; Zhang, Jieyu; Xie, Linlin; Lei, Liu

    2014-07-01

    Solar cells' photoelectric properties calibration, i.e., current-voltage (I-V) characteristics, is critical for both fundamental research and the photovoltaic production line. This paper presents the calibration of solar cells' I-V characteristics by a substitution method under a simulated light source. Considering the calibration uncertainty and measurement accuracy, reliable measurement procedures developed at NIM, together with the associated uncertainty analysis, are also demonstrated. By controlling the influencing factors, relative expanded combined uncertainties (Urel) of 2.1% (Isc), 1.0% (Voc) and 3.1% (Pmax) were obtained, with a coverage factor k = 2. The measurement system meets all requirements of IEC 60904-1 and IEC 60904-9, and it has been applied to the calibration of I-V curves for numerous solar cells from research institutes as well as industrial plants, addressing the shortage of domestic metrology capability in the photovoltaic field.

  14. Uncertainty Analysis in Humidity Measurements by the Psychrometer Method

    PubMed Central

    Chen, Jiunyuan; Chen, Chiachung

    2017-01-01

    The most common and cheapest indirect technique for measuring relative humidity is the psychrometer, based on a dry and a wet temperature sensor. In this study, the measurement uncertainty of relative humidity obtained by this indirect method was evaluated for several empirical equations used to calculate relative humidity. Among the six equations tested, the Penman equation had the best predictive ability for the dry bulb temperature range of 15–50 °C. At a fixed dry bulb temperature, an increase in the wet bulb depression increased the error. A new equation for the psychrometer constant was established by regression analysis; it can be computed with a hand calculator, and its average predictive error for relative humidity was <0.1%. The measurement uncertainty of the relative humidity as affected by the accuracy of the dry and wet bulb temperatures was evaluated, and numeric values of the measurement uncertainty were obtained for various conditions. The uncertainty of the wet bulb temperature was the main factor in the RH measurement uncertainty. PMID:28216599
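
    As a worked illustration of the psychrometric calculation discussed above, the sketch below (Python, using the Magnus saturation-vapour-pressure approximation and a generic psychrometer constant; the paper's own regression equation for the constant is not reproduced) propagates assumed thermometer uncertainties into RH by Monte Carlo:

        import numpy as np

        def e_sat(t_c):
            """Saturation vapour pressure (hPa), Magnus approximation."""
            return 6.112 * np.exp(17.62 * t_c / (243.12 + t_c))

        def rel_humidity(t_dry, t_wet, pressure_hpa=1013.25, a=6.6e-4):
            """RH (%) from the psychrometric equation e = e_s(Tw) - A*P*(Td - Tw)."""
            e = e_sat(t_wet) - a * pressure_hpa * (t_dry - t_wet)
            return 100.0 * e / e_sat(t_dry)

        # Assumed 1-sigma thermometer uncertainties: 0.1 degC (dry), 0.2 degC (wet).
        rng = np.random.default_rng(0)
        t_dry = rng.normal(30.0, 0.1, 100_000)
        t_wet = rng.normal(25.0, 0.2, 100_000)
        rh = rel_humidity(t_dry, t_wet)
        print(f"RH = {rh.mean():.1f} % +/- {rh.std(ddof=1):.2f} %")

    Re-running the propagation with one input fixed at a time shows, for these assumed sigmas, the dominant contribution of the wet bulb sensor, in line with the abstract's conclusion.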

  15. Uncertainty analysis of a SFR core with sodium plenum

    SciTech Connect

    Canuti, E.; Ivanov, E.; Tiberi, V.; Pignet, S.

    2012-07-01

    New concepts for Sodium-cooled Fast Reactors (SFRs) have to meet the Generation IV safety objectives. In this regard, the Sodium Void Effect (SVE) has to be minimized for future large-size SFR projects, as do the uncertainties on it. The Institute of Radiological Protection and Nuclear Safety (IRSN), as technical support to the French public authorities, is in charge of the safety assessment of operating and under-construction reactors, as well as of future projects. In order to assess the safety of new SFR designs, IRSN must be able to evaluate core parameters and their uncertainties. In this frame, a sensitivity and uncertainty study has been performed to evaluate the impact of nuclear data uncertainty on the sodium void effect for the benchmark model of the large SFR BN-800. The benchmark parameters (effective multiplication factor and sodium void effect) have been evaluated using two codes, the deterministic code ERANOS and the Monte Carlo code SCALE, while the S/U analysis has been performed only with SCALE. The results of these studies point out the most relevant cross-section uncertainties that affect the SVE and where efforts should be made to improve the accuracy of existing nuclear data. (authors)

  16. Uncertainty Analysis in Humidity Measurements by the Psychrometer Method.

    PubMed

    Chen, Jiunyuan; Chen, Chiachung

    2017-02-14

    The most common and cheapest indirect technique for measuring relative humidity is the psychrometer, based on a dry and a wet temperature sensor. In this study, the measurement uncertainty of relative humidity obtained by this indirect method was evaluated for several empirical equations used to calculate relative humidity. Among the six equations tested, the Penman equation had the best predictive ability for the dry bulb temperature range of 15-50 °C. At a fixed dry bulb temperature, an increase in the wet bulb depression increased the error. A new equation for the psychrometer constant was established by regression analysis; it can be computed with a hand calculator, and its average predictive error for relative humidity was <0.1%. The measurement uncertainty of the relative humidity as affected by the accuracy of the dry and wet bulb temperatures was evaluated, and numeric values of the measurement uncertainty were obtained for various conditions. The uncertainty of the wet bulb temperature was the main factor in the RH measurement uncertainty.

  17. Uncertainty analysis for computer model projections of hurricane losses.

    PubMed

    Iman, Ronald L; Johnson, Mark E; Watson, Charles C

    2005-10-01

    Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainties. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates grew from 2 to 3 billion dollars in losses late on August 12 to a peak of 50 billion dollars for a brief time as the storm appeared to be headed for the Tampa Bay area. The storm hit the resort areas of Charlotte Harbor near Punta Gorda and then went on to Orlando in the central part of the state, with early poststorm estimates converging on a damage estimate in the 28 to 31 billion dollars range. Comparable damage to central Florida had not been seen since Hurricane Donna in 1960. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has recognized the role of computer models in projecting losses from hurricanes. The FCHLPM established a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To influence future such analyses, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a Holland B parameter hurricane wind field model. Uncertainty analysis quantifies the expected percentage reduction in the uncertainty of wind speed and loss that is attributable to each of the input variables.
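
    A minimal sketch (Python, with hypothetical storm parameters; not any of the audited proprietary loss models) of the kind of input-uncertainty propagation described above, here pushing an assumed distribution of the Holland B parameter through the Holland (1980) gradient wind profile:

        import numpy as np

        def holland_wind(r_km, b, dp_hpa=50.0, rmax_km=30.0, rho=1.15, f=5e-5):
            """Gradient wind speed (m/s) from the Holland (1980) profile."""
            r = r_km * 1000.0
            rmax = rmax_km * 1000.0
            dp = dp_hpa * 100.0                      # central pressure deficit, hPa -> Pa
            x = (rmax / r) ** b
            return np.sqrt(b * dp / rho * x * np.exp(-x) + (r * f / 2) ** 2) - r * f / 2

        # Propagate an assumed uncertainty in B (normal, mean 1.3, sd 0.2) at r = 50 km.
        rng = np.random.default_rng(0)
        b_samples = rng.normal(1.3, 0.2, 100_000)
        v = holland_wind(50.0, b_samples)
        print(f"wind speed: mean {v.mean():.1f} m/s, sd {v.std(ddof=1):.1f} m/s")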

  18. Uncertainty Analysis of the Grazing Flow Impedance Tube

    NASA Technical Reports Server (NTRS)

    Brown, Martha C.; Jones, Michael G.; Watson, Willie R.

    2012-01-01

    This paper outlines a methodology to identify the measurement uncertainty of NASA Langley's Grazing Flow Impedance Tube (GFIT) over its operating range, and to identify the parameters that most significantly contribute to the acoustic impedance prediction. Two acoustic liners are used for this study. The first is a single-layer, perforate-over-honeycomb liner that is nonlinear with respect to sound pressure level. The second consists of a wire-mesh facesheet and a honeycomb core, and is linear with respect to sound pressure level. These liners allow for evaluation of the effects of measurement uncertainty on impedances educed with linear and nonlinear liners. In general, the measurement uncertainty is observed to be larger for the nonlinear liners, with the largest uncertainty occurring near anti-resonance. A sensitivity analysis of the aerodynamic parameters (Mach number, static temperature, and static pressure) used in the impedance eduction process is also conducted using a Monte-Carlo approach. This sensitivity analysis demonstrates that the impedance eduction process is virtually insensitive to each of these parameters.

  19. Uncertainty analysis for seismic hazard in Northern and Central Italy

    USGS Publications Warehouse

    Lombardi, A.M.; Akinci, A.; Malagnini, L.; Mueller, C.S.

    2005-01-01

    In this study we examine uncertainty and parametric sensitivity of Peak Ground Acceleration (PGA) and 1-Hz Spectral Acceleration (1-Hz SA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years) of Northern and Central Italy. The uncertainty in hazard is estimated using a Monte Carlo approach to randomly sample a logic tree that has three input-variable branch points representing alternative values for b-value, maximum magnitude (Mmax) and attenuation relationships. Uncertainty is expressed in terms of the 95% confidence band and the Coefficient Of Variation (COV). The overall variability of ground motions and their sensitivity to each parameter of the logic tree are investigated. The largest values of the overall 95% confidence band are around 0.15 g for PGA in the Friuli and Northern Apennines regions and around 0.35 g for 1-Hz SA in the Central Apennines. The sensitivity analysis shows that the largest contributor to seismic hazard variability is uncertainty in the choice of ground-motion attenuation relationships, especially in the Friuli Region (~0.10 g) for PGA and in the Friuli and Central Apennines regions (~0.15 g) for 1-Hz SA. This is followed by the variability of the b-value: its main contribution is evident in the Friuli and Central Apennines regions for both 1-Hz SA (~0.15 g) and PGA (~0.10 g). We observe that the contribution of Mmax to seismic hazard variability is negligible, at least for the 10% exceedance in 50-years hazard. The overall COV map for PGA shows that the uncertainty in the hazard is larger in the Friuli and Northern Apennine regions, around 20-30%, than in the Central Apennines and Northwestern Italy, around 10-20%. The overall uncertainty is larger for the 1-Hz SA map and reaches 50-60% in the Central Apennines and Western Alps.
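
    A minimal sketch (Python, with made-up branch values, weights and a toy stand-in for the hazard computation; not the study's PSHA code) of Monte Carlo sampling of a three-branch-point logic tree to obtain a confidence band and COV:

        import numpy as np

        rng = np.random.default_rng(0)

        # Branch points: alternative b-values, maximum magnitudes and attenuation
        # relations (illustrative values and weights only).
        b_values = ([0.9, 1.0, 1.1], [0.3, 0.4, 0.3])
        m_max    = ([6.5, 7.0, 7.5], [0.2, 0.6, 0.2])
        atten    = (["rel_A", "rel_B"], [0.5, 0.5])

        def toy_hazard(b, mmax, relation):
            """Stand-in for the PGA at 10% in 50 yr; NOT a real attenuation model."""
            scale = {"rel_A": 1.0, "rel_B": 1.2}[relation]
            return scale * 0.05 * (mmax - 5.0) / b

        samples = []
        for _ in range(10_000):
            b = rng.choice(b_values[0], p=b_values[1])
            m = rng.choice(m_max[0], p=m_max[1])
            a = rng.choice(atten[0], p=atten[1])
            samples.append(toy_hazard(b, m, a))

        samples = np.array(samples)
        lo, hi = np.percentile(samples, [2.5, 97.5])
        print(f"PGA ~ {samples.mean():.3f} g, 95% band [{lo:.3f}, {hi:.3f}] g, "
              f"COV {samples.std(ddof=1) / samples.mean():.2f}")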

  20. Treatment of uncertainties in the IPCC: a philosophical analysis

    NASA Astrophysics Data System (ADS)

    Jebeile, J.; Drouet, I.

    2014-12-01

    The IPCC produces scientific reports out of findings on climate and climate change. Because the findings are uncertain in many respects, the production of reports requires aggregating assessments of uncertainties of different kinds. This difficult task is currently regulated by the Guidance note for lead authors of the IPCC fifth assessment report on consistent treatment of uncertainties. The note recommends that two metrics—i.e. confidence and likelihood—be used for communicating the degree of certainty in findings. Confidence is expressed qualitatively "based on the type, amount, quality, and consistency of evidence […] and the degree of agreement", while likelihood is expressed probabilistically "based on statistical analysis of observations or model results, or expert judgment". Therefore, depending on the evidence evaluated, authors have the choice to present either an assigned level of confidence or a quantified measure of likelihood. But assessments of uncertainties of these two different kinds express distinct and conflicting methodologies, so aggregating them is problematic, and the question arises whether the treatment of uncertainties in the IPCC is rationally justified. In order to answer this question, it is worth comparing the IPCC procedures with the formal normative theories of epistemic rationality which have been developed by philosophers. These theories—which include contributions to the philosophy of probability and to Bayesian probabilistic confirmation theory—are relevant for our purpose because they are commonly used to assess the rationality of collective judgement formation based on uncertain knowledge. In this paper we make the comparison and pursue the following objectives: i/ we determine whether the IPCC confidence and likelihood can be compared with the notions of uncertainty targeted by or underlying the formal normative theories of epistemic rationality; ii/ we investigate whether the formal normative theories of epistemic rationality justify

  1. Compositional Analysis of Lignocellulosic Feedstocks. 2. Method Uncertainties

    PubMed Central

    2010-01-01

    The most common procedures for characterizing the chemical components of lignocellulosic feedstocks use a two-stage sulfuric acid hydrolysis to fractionate biomass for gravimetric and instrumental analyses. The uncertainty (i.e., dispersion of values from repeated measurement) in the primary data is of general interest to those with technical or financial interests in biomass conversion technology. The composition of a homogenized corn stover feedstock (154 replicate samples in 13 batches, by 7 analysts in 2 laboratories) was measured along with a National Institute of Standards and Technology (NIST) reference sugar cane bagasse, as a control, using this laboratory's suite of laboratory analytical procedures (LAPs). The uncertainty was evaluated by the statistical analysis of these data and is reported as the standard deviation of each component measurement. Censored and uncensored versions of these data sets are reported, as evidence was found for intermittent instrumental and equipment problems. The censored data are believed to represent the “best case” results of these analyses, whereas the uncensored data show how small method changes can strongly affect the uncertainties of these empirical methods. Relative standard deviations (RSD) of 1−3% are reported for glucan, xylan, lignin, extractives, and total component closure with the other minor components showing 4−10% RSD. The standard deviations seen with the corn stover and NIST bagasse materials were similar, which suggests that the uncertainties reported here are due more to the analytical method used than to the specific feedstock type being analyzed. PMID:20669952

  2. Luminous efficiency estimates of meteors -I. Uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Subasinghe, Dilini; Campbell-Brown, Margaret; Stokan, Edward

    2017-09-01

    The luminous efficiency of meteors is poorly known, but critical for determining the meteoroid mass. We present an uncertainty analysis of the luminous efficiency as determined by the classical ablation equations, and suggest a possible method for determining the luminous efficiency of real meteor events. We find that a two-term exponential fit to simulated lag data is able to reproduce simulated luminous efficiencies reasonably well.

  3. A systematic uncertainty analysis for liner impedance eduction technology

    NASA Astrophysics Data System (ADS)

    Zhou, Lin; Bodén, Hans

    2015-11-01

    The so-called impedance eduction technology is widely used for obtaining acoustic properties of liners used in aircraft engines. The measurement uncertainties for this technology are still not well understood, even though such understanding is essential for data quality assessment and model validation. A systematic framework based on multivariate analysis is presented in this paper to provide 95 percent confidence interval uncertainty estimates in the process of impedance eduction. The analysis is made using a straightforward single-mode method based on transmission coefficients involving the classic Ingard-Myers boundary condition. The multivariate technique makes it possible to obtain an uncertainty analysis for the possibly correlated real and imaginary parts of the complex quantities. The results show that the errors in impedance results at low frequency mainly depend on the variability of transmission coefficients, while the mean Mach number accuracy is the most important source of error at high frequencies. The effect of Mach numbers used in the wave dispersion equation and in the Ingard-Myers boundary condition has been separated for comparison of the outcome of impedance eduction. A local Mach number based on friction velocity is suggested as a way to reduce the inconsistencies found when estimating impedance using upstream and downstream acoustic excitation.

  4. Uncertainty analysis of doses from ingestion of plutonium and americium.

    PubMed

    Puncher, M; Harrison, J D

    2012-02-01

    Uncertainty analyses have been performed on the biokinetic model for americium currently used by the International Commission on Radiological Protection (ICRP), and the model for plutonium recently derived by Leggett, considering acute intakes by ingestion by adult members of the public. The analyses calculated distributions of doses per unit intake. Those parameters having the greatest impact on prospective doses were identified by sensitivity analysis; the most important were the fraction absorbed from the alimentary tract, f(1), and rates of uptake from blood to bone surfaces. Probability distributions were selected based on the observed distribution of plutonium and americium in human subjects where possible; the distributions for f(1) reflected uncertainty on the average value of this parameter for non-specified plutonium and americium compounds ingested by adult members of the public. The calculated distributions of effective doses for ingested (239)Pu and (241)Am were well described by log-normal distributions, with doses varying by around a factor of 3 above and below the central values; the distributions contain the current ICRP Publication 67 dose coefficients for ingestion of (239)Pu and (241)Am by adult members of the public. Uncertainty on f(1) values had the greatest impact on doses, particularly effective dose. It is concluded that: (1) more precise data on f(1) values would have a greater effect in reducing uncertainties on doses from ingested (239)Pu and (241)Am, than reducing uncertainty on other model parameter values and (2) the results support the dose coefficients (Sv Bq(-1) intake) derived by ICRP for ingestion of (239)Pu and (241)Am by adult members of the public.
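
    A deliberately simplified sketch (Python; a toy linear dose model with an assumed lognormal f(1), not the ICRP or Leggett biokinetic models) showing how uncertainty in the absorbed fraction alone produces a roughly lognormal spread of dose per unit intake:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        # Assumed lognormal gut absorption fraction f1: geometric mean 5e-4,
        # geometric standard deviation 3 (illustrative values only).
        f1 = rng.lognormal(mean=np.log(5e-4), sigma=np.log(3.0), size=n)

        # Toy linear relation between absorbed fraction and committed effective dose;
        # the constant is hypothetical and only sets the scale (Sv per Bq ingested).
        dose_per_bq = 0.5 * f1

        gm = np.exp(np.mean(np.log(dose_per_bq)))
        p2p5, p97p5 = np.percentile(dose_per_bq, [2.5, 97.5])
        print(f"geometric mean {gm:.2e} Sv/Bq, 95% interval [{p2p5:.2e}, {p97p5:.2e}]")
        print(f"spread above/below centre: x{p97p5 / gm:.1f}, /{gm / p2p5:.1f}")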

  5. Uncertainty analysis of atmospheric variations from ground-based observations

    NASA Astrophysics Data System (ADS)

    Chang, K. L.; Petropavlovskikh, I. V.

    2016-12-01

    In 2016, total column ozone levels continue to be influenced by the remaining man-made ozone-depleting substances in the atmosphere. Ozone recovery is a much more subtle process than the ozone depletion of the 1980s, and the quality of ozone observations is important for understanding and interpreting the trends. The interannual ozone variability is driven by various natural and climate-related forcings, and the sampling limitations of the ground-based networks complicate analysis of the state of ozone recovery globally and locally. Therefore, the detection of recovery rates needs to be addressed with an understanding of the measurement uncertainties. Total column ozone variations estimated using Dobson ground-based stations have been challenged by data inhomogeneity in time and by the irregular spatial distribution of stations, as well as by interruptions in observation records. Understanding and measuring the inherent uncertainty in long-term ozone changes is crucial for understanding the recovery of the ozone layer and the effects of climate change on that recovery. In this talk we estimate errors resulting from the spatial interpolation of ozone records and construct long-term zonal means using a stochastic partial differential equation (SPDE) framework, an extension of Chang, Gullias and Fioletov (2015). We further present the uncertainty analysis of long-term ozone changes. Generalized additive models (changes associated with well-known proxies) and stochastic volatility models (i.e. unexpected changes or interruptions of the record), including the integrated nested Laplace approximation (INLA) approach, will be used for analysis of the ozone records. The connection between spatial nonstationarity (a non-constant relation with space) and the statistical estimation of volatility uncertainty will be assessed. The robustness of the space-time prediction of total column ozone trends will be addressed.

  6. Final Report. Analysis and Reduction of Complex Networks Under Uncertainty

    SciTech Connect

    Marzouk, Youssef M.; Coles, T.; Spantini, A.; Tosatto, L.

    2013-09-30

    The project was a collaborative effort among MIT, Sandia National Laboratories (local PI Dr. Habib Najm), the University of Southern California (local PI Prof. Roger Ghanem), and The Johns Hopkins University (local PI Prof. Omar Knio, now at Duke University). Our focus was the analysis and reduction of large-scale dynamical systems emerging from networks of interacting components. Such networks underlie myriad natural and engineered systems. Examples important to DOE include chemical models of energy conversion processes, and elements of national infrastructure—e.g., electric power grids. Time scales in chemical systems span orders of magnitude, while infrastructure networks feature both local and long-distance connectivity, with associated clusters of time scales. These systems also blend continuous and discrete behavior; examples include saturation phenomena in surface chemistry and catalysis, and switching in electrical networks. Reducing size and stiffness is essential to tractable and predictive simulation of these systems. Computational singular perturbation (CSP) has been effectively used to identify and decouple dynamics at disparate time scales in chemical systems, allowing reduction of model complexity and stiffness. In realistic settings, however, model reduction must contend with uncertainties, which are often greatest in large-scale systems most in need of reduction. Uncertainty is not limited to parameters; one must also address structural uncertainties—e.g., whether a link is present in a network—and the impact of random perturbations, e.g., fluctuating loads or sources. Research under this project developed new methods for the analysis and reduction of complex multiscale networks under uncertainty, by combining computational singular perturbation (CSP) with probabilistic uncertainty quantification. CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing

  7. Orbit uncertainty propagation and sensitivity analysis with separated representations

    NASA Astrophysics Data System (ADS)

    Balducci, Marc; Jones, Brandon; Doostan, Alireza

    2017-09-01

    Most approximations for stochastic differential equations with high-dimensional, non-Gaussian inputs suffer from a rapid (e.g., exponential) increase of computational cost, an issue known as the curse of dimensionality. In astrodynamics, this results in reduced accuracy when propagating an orbit-state probability density function. This paper considers the application of separated representations for orbit uncertainty propagation, where future states are expanded into a sum of products of univariate functions of initial states and other uncertain parameters. An accurate generation of separated representation requires a number of state samples that is linear in the dimension of input uncertainties. The computation cost of a separated representation scales linearly with respect to the sample count, thereby improving tractability when compared to methods that suffer from the curse of dimensionality. In addition to detailed discussions on their construction and use in sensitivity analysis, this paper presents results for three test cases of an Earth orbiting satellite. The first two cases demonstrate that approximation via separated representations produces a tractable solution for propagating the Cartesian orbit-state uncertainty with up to 20 uncertain inputs. The third case, which instead uses Equinoctial elements, reexamines a scenario presented in the literature and employs the proposed method for sensitivity analysis to more thoroughly characterize the relative effects of uncertain inputs on the propagated state.

  8. ProbCD: enrichment analysis accounting for categorization uncertainty.

    PubMed

    Vêncio, Ricardo Z N; Shmulevich, Ilya

    2007-10-12

    As in many other areas of science, systems biology makes extensive use of statistical association and significance estimates in contingency tables, a type of categorical data analysis known in this field as enrichment (also over-representation or enhancement) analysis. In spite of efforts to create probabilistic annotations, especially in the Gene Ontology context, or to deal with uncertainty in high throughput-based datasets, current enrichment methods largely ignore this probabilistic information since they are mainly based on variants of the Fisher Exact Test. We developed an open-source R-based software to deal with probabilistic categorical data analysis, ProbCD, that does not require a static contingency table. The contingency table for the enrichment problem is built using the expectation of a Bernoulli Scheme stochastic process given the categorization probabilities. An on-line interface was created to allow usage by non-programmers and is available at: http://xerad.systemsbiology.net/ProbCD/. We present an analysis framework and software tools to address the issue of uncertainty in categorical data analysis. In particular, concerning the enrichment analysis, ProbCD can accommodate: (i) the stochastic nature of the high-throughput experimental techniques and (ii) probabilistic gene annotation.
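
    A minimal sketch (Python, with invented probabilities; not the ProbCD R package itself) of the core idea of building the expected contingency table from per-gene category-membership probabilities rather than hard assignments:

        import numpy as np

        # p_category[i]: probability that gene i belongs to the category of interest.
        # selected[i]:   1 if gene i is in the differentially expressed list, else 0.
        p_category = np.array([0.9, 0.7, 0.2, 0.05, 0.8, 0.1])
        selected   = np.array([1,   1,   0,   0,    1,   0  ])

        # Expected counts under independent Bernoulli draws for category membership.
        a = np.sum(selected * p_category)              # selected and in category
        b = np.sum(selected * (1 - p_category))        # selected, not in category
        c = np.sum((1 - selected) * p_category)        # not selected, in category
        d = np.sum((1 - selected) * (1 - p_category))  # neither
        print(np.array([[a, b], [c, d]]))

    The resulting expected table can then be tested in place of the hard-count table used by a conventional Fisher-type enrichment test.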

  9. Methodological considerations with data uncertainty in road safety analysis.

    PubMed

    Schlögl, Matthias; Stütz, Rainer

    2017-02-16

    The analysis of potential influencing factors that affect the likelihood of road accident occurrence has been of major interest to safety researchers throughout recent decades. Even though steady methodological progress was made over the years, several impediments pertaining to the statistical analysis of crash data remain. While issues related to methodological approaches have been subject to constructive discussion, uncertainties inherent in the most fundamental part of any analysis have been widely neglected: the data. This paper scrutinizes data from various sources that are commonly used in road safety studies with respect to their actual suitability for applications in this area. Issues related to spatial and temporal aspects of data uncertainty are pointed out and their implications for road safety analysis are discussed in detail. These general methodological considerations are illustrated with data from Austria, providing suggestions and methods for overcoming these obstacles. Considering these aspects is of major importance for expediting further advances in road safety data analysis and thus for increasing road safety.

  10. MOUSE (MODULAR ORIENTED UNCERTAINTY SYSTEM): A COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM. OPERATIONAL MANUAL.

    EPA Science Inventory

    MOUSE (Modular Oriented Uncertainty SystEm) deals with the problem of uncertainties in models that consist of one or more algebraic equations. It was especially designed for use by those with little or no knowledge of computer languages or programming. It is compact (and thus can...

  11. MOUSE (MODULAR ORIENTED UNCERTAINTY SYSTEM): A COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM. OPERATIONAL MANUAL.

    EPA Science Inventory

    MOUSE (Modular Oriented Uncertainty SystEm) deals with the problem of uncertainties in models that consist of one or more algebraic equations. It was especially designed for use by those with little or no knowledge of computer languages or programming. It is compact (and thus can...

  12. Uncertainty analysis of wind-wave predictions in Lake Michigan

    NASA Astrophysics Data System (ADS)

    Nekouee, Navid; Ataie-Ashtiani, Behzad; Hamidi, Sajad Ahmad

    2016-10-01

    With all the improvements in wave and hydrodynamic numerical models, the question arises of how the accuracy of the forcing functions and their inputs affects the results. In this paper, a commonly used third-generation numerical wave model, SWAN, is applied to predict waves in Lake Michigan. Wind data are analyzed to determine the frequency of wind variation over Lake Michigan. The uncertainty of wave predictions due to local wind effects is examined during a period when the wind had a fairly constant speed and direction over the northern and southern basins. The study shows that, despite model calibration in the Lake Michigan area, model deficiencies arise from ignoring wind effects at small scales. The wave predictions also show that small-scale turbulence in the meteorological forcing can increase prediction errors by 38%. Wave frequency and coherence analyses show that both models can predict the wave variation time scale with the same accuracy. An insufficient number of meteorological stations can result in neglecting local wind effects and discrepancies in current predictions. The uncertainty of numerical wave models due to input uncertainties and model principles should be taken into account in design risk factors.

  13. Bayesian analysis of input uncertainty in hydrological modeling: 2. Application

    NASA Astrophysics Data System (ADS)

    Kavetski, Dmitri; Kuczera, George; Franks, Stewart W.

    2006-03-01

    The Bayesian total error analysis (BATEA) methodology directly addresses both input and output errors in hydrological modeling, requiring the modeler to make explicit, rather than implicit, assumptions about the likely extent of data uncertainty. This study considers a BATEA assessment of two North American catchments: (1) French Broad River and (2) Potomac basins. It assesses the performance of the conceptual Variable Infiltration Capacity (VIC) model with and without accounting for input (precipitation) uncertainty. The results show the considerable effects of precipitation errors on the predicted hydrographs (especially the prediction limits) and on the calibrated parameters. In addition, the performance of BATEA in the presence of severe model errors is analyzed. While BATEA allows a very direct treatment of input uncertainty and yields some limited insight into model errors, it requires the specification of valid error models, which are currently poorly understood and require further work. Moreover, it leads to computationally challenging, high-dimensional problems. For some types of models, including the VIC implemented using robust numerical methods, the computational cost of BATEA can be reduced using Newton-type methods.

  14. Global land cover mapping: a review and uncertainty analysis

    USGS Publications Warehouse

    Congalton, Russell G.; Gu, Jianyu; Yadav, Kamini; Thenkabail, Prasad S.; Ozdogan, Mutlu

    2014-01-01

    Given the advances in remotely sensed imagery and associated technologies, several global land cover maps have been produced in recent times, including IGBP DISCover, UMD Land Cover, Global Land Cover 2000 and GlobCover 2009. However, the utility of these maps for specific applications has often been hampered by considerable uncertainties and inconsistencies. A thorough review of these global land cover projects, including an evaluation of the sources of error and uncertainty, is therefore prudent and enlightening. This paper describes our work in which we compared, summarized and conducted an uncertainty analysis of the four global land cover mapping projects using an error budget approach. The results showed that the classification scheme and the validation methodology had the highest error contribution and implementation priority. A comparison of the classification schemes showed that there are many inconsistencies between the definitions of the map classes. This is especially true for the mixed-type classes, for which thresholds vary for the attributes/discriminators used in the classification process. Examination of these four global mapping projects provided important lessons for future global mapping projects, including the need for clear and uniform definitions of the classification scheme and an efficient, practical, and valid design of the accuracy assessment.

  15. Sensitivity and uncertainty analysis of a polyurethane foam decomposition model

    SciTech Connect

    HOBBS,MICHAEL L.; ROBINSON,DAVID G.

    2000-03-14

    Sensitivity/uncertainty analyses are not commonly performed on complex finite-element engineering models because the analyses are time consuming, CPU intensive, nontrivial exercises that can lead to deceptive results. To illustrate these ideas, an analytical sensitivity/uncertainty analysis is used to determine the standard deviation and the primary factors affecting the burn velocity of polyurethane foam exposed to fire-like radiative boundary conditions. The complex finite-element model has 25 input parameters that include chemistry, polymer structure, and thermophysical properties. The response variable was selected as the steady-state burn velocity, calculated as the derivative of the burn front location versus time. The standard deviation of the burn velocity was determined by taking numerical derivatives of the response variable with respect to each of the 25 input parameters. Since the response variable is also a derivative, the standard deviation is essentially determined from a second derivative that is extremely sensitive to numerical noise. To minimize the numerical noise, 50-micron elements and approximately 1-msec time steps were required to obtain stable uncertainty results. The primary effect variable was shown to be the emissivity of the foam.
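
    A minimal sketch (Python, with a generic stand-in response function and assumed parameter values; not the finite-element foam model) of the derivative-based propagation described above, approximating the response standard deviation from central finite differences:

        import numpy as np

        def burn_velocity(params):
            """Stand-in for the finite-element model response (cm/s); illustrative only."""
            emissivity, conductivity, density = params
            return 0.1 * emissivity * np.sqrt(conductivity) / density

        nominal = np.array([0.9, 0.05, 0.35])    # assumed nominal parameter values
        sigmas  = np.array([0.05, 0.005, 0.02])  # assumed parameter standard deviations

        # Central finite differences for d(response)/d(parameter_i).
        grads = np.empty_like(nominal)
        for i in range(nominal.size):
            h = 1e-4 * nominal[i]
            up, dn = nominal.copy(), nominal.copy()
            up[i] += h
            dn[i] -= h
            grads[i] = (burn_velocity(up) - burn_velocity(dn)) / (2 * h)

        var_contrib = (grads * sigmas) ** 2      # first-order variance contributions
        print("std dev of burn velocity:", np.sqrt(var_contrib.sum()))
        print("largest contributor: parameter index", int(np.argmax(var_contrib)))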

  16. Eye tracker uncertainty analysis and modelling in real time

    NASA Astrophysics Data System (ADS)

    Fornaser, A.; De Cecco, M.; Leuci, M.; Conci, N.; Daldoss, M.; Armanini, A.; Maule, L.; De Natale, F.; Da Lio, M.

    2017-01-01

    Eye-tracking techniques have been developed over several decades for applications ranging from the military to education, entertainment and clinics. Existing systems fall broadly into two categories: precise but intrusive, or comfortable but less accurate. The idea of this work is to calibrate an eye tracker of the second category. In particular, we have estimated the uncertainty both in nominal and in variable operating conditions, taking into consideration influencing factors such as head movement and rotation, which eyes are detected, target position on the screen, illumination, and objects in front of the eyes. Results proved that the 2D uncertainty can be modelled as a circular confidence interval, since there are no stable principal directions in either the systematic or the repeatability effects. This confidence region was also modelled as a function of the current working conditions. In this way we obtain a value of the uncertainty as a function of the operating conditions estimated in real time, opening the field to new applications that reconfigure the human-machine interface as a function of the operating conditions. Examples range from reshaping option buttons, dynamically adjusted local zoom, and speed optimization to regulate interface responsiveness, to the possibility of taking into account the uncertainty associated with a particular interaction. Furthermore, in the analysis of visual scanning patterns, the resulting point-of-regard maps would be associated with proper confidence levels, thus allowing accurate conclusions to be drawn. We conducted an experimental campaign to estimate and validate the overall modelling procedure, obtaining valid results in 86% of the cases.
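
    A minimal sketch (Python, with synthetic gaze samples; not the authors' calibration procedure) of one way to model the 2D uncertainty as a circular confidence region, taking the 95th percentile of the radial distance from the mean point of regard:

        import numpy as np

        rng = np.random.default_rng(0)
        # Synthetic repeated gaze estimates (pixels) for a single fixation target.
        gaze = rng.normal(loc=[640, 360], scale=[12, 15], size=(500, 2))

        center = gaze.mean(axis=0)
        radial = np.linalg.norm(gaze - center, axis=1)
        r95 = np.percentile(radial, 95)          # radius of the 95% confidence circle
        print(f"centre {center.round(1)}, 95% confidence radius {r95:.1f} px")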

  17. The Uncertainty in the Local Seismic Response Analysis

    SciTech Connect

    Pasculli, A.; Pugliese, A.; Romeo, R. W.; Sano, T.

    2008-07-08

    This paper shows the influence exerted on local seismic response analysis by considering dispersion and uncertainty in the seismic input as well as in the dynamic properties of soils. In a first attempt, a 1D numerical model is developed accounting for both the aleatory nature of the input motion and the stochastic variability of the dynamic properties of soils. The seismic input is introduced in a non-conventional way through a power spectral density, for which an elastic response spectrum, derived for instance from a conventional seismic hazard analysis, is required with an appropriate level of reliability. The uncertainty in the geotechnical properties of soils is instead investigated through a well-known simulation technique (the Monte Carlo method) for the construction of statistical ensembles. The result of a conventional local seismic response analysis, given by a deterministic elastic response spectrum, is replaced in our approach by a set of statistical elastic response spectra, each one characterized by an appropriate level of probability of being reached or exceeded. The analyses have been carried out for a well-documented real case study. Lastly, we anticipate a 2D numerical analysis to investigate also the spatial variability of soil properties.

  18. Sensitivity and uncertainty analysis of regional marine ecosystem services value

    NASA Astrophysics Data System (ADS)

    Shi, Honghua; Zheng, Wei; Wang, Zongling; Ding, Dewen

    2009-06-01

    Marine ecosystem services are the benefits which people obtain from the marine ecosystem, including provisioning services, regulating services, cultural services and supporting services. The human species, while buffered against environmental changes by culture and technology, is fundamentally dependent on the flow of ecosystem services. Marine ecosystem services become increasingly valuable as terrestrial resources become scarce. The value of marine ecosystem services is the monetary flow of ecosystem services on specific temporal and spatial scales, which often changes due to variation in goods prices, yields and the status of marine exploitation. Sensitivity analysis studies the relationship between the value of marine ecosystem services and the main factors that affect it. Uncertainty analysis based on varying prices, yields and status of marine exploitation was carried out. Through uncertainty analysis, a more credible value range, instead of a fixed value, of marine ecosystem services was obtained in this study. Moreover, sensitivity analysis of the marine ecosystem services value revealed the relative importance of different factors.

  19. Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation

    PubMed Central

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a prompt analysis, and a correlation between the mean values and maximum sampling errors of the two methods was not observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications. PMID:20717532

  20. Case studies of uncertainty analysis and explosives cleanup

    SciTech Connect

    Elmore, A.C.; Graff, T.

    1999-07-01

    Decision making for ground water restoration projects is sometimes difficult or complicated because of the large uncertainties inherent to evaluating hydrogeologic and contaminant conditions which cannot be directly observed over the large scale. Fiscal responsibility and professional ethics require that decision makers use best science approaches instead of relying on arbitrary selection processes. Three case studies illustrate how decision tree models and Monte Carlo analysis can be used by the practicing professional to systematically make important environmental restoration decisions. Each case study is based on formerly used defense site projects in the Midwest where explosives contamination is present in the ground water.

  1. Results of a 24-inch Hybrid Motor Performance Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Sims, Joseph D.; Coleman, Hugh W.

    1998-01-01

    The subscale (11- and 24-inch) hybrid motors at the Marshall Space Flight Center (MSFC) have been used as versatile and cost-effective testbeds for developing new technology. Comparisons between motor configurations, ignition systems, feed systems, fuel formulations, and nozzle materials have been carried out without detailed consideration as to how "good" the motor performance data were. For the 250,000-lb-thrust motor developed by the Hybrid Propulsion Demonstration Program consortium, this shortcoming is particularly risky because motor performance will likely be used as part of a set of downselect criteria to choose between competing ignition and feed systems under development. This analysis directly addresses that shortcoming by applying uncertainty analysis techniques to the experimental determination of the characteristic velocity, theoretical characteristic velocity, and characteristic velocity efficiency for a 24-inch motor firing. With the adoption of fuel-lined headends, flow restriction, and aft mixing chambers, state-of-the-art 24-inch hybrid motors have become very efficient. However, impossibly high combustion efficiencies (some computed as high as 108%) have been measured in some tests with 11-inch motors. This analysis has given new insight into explaining how these efficiencies were measured to be so high, and into which experimental measurements contribute the most to the overall uncertainty.
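
    A minimal sketch (Python, with illustrative measured values and uncertainties; not the motor data) of first-order uncertainty propagation into the experimental characteristic velocity c* = Pc*At/mdot, where the relative uncertainties of a product/quotient combine in root-sum-square fashion:

        import numpy as np

        # Assumed measured values and 1-sigma uncertainties (illustrative only).
        p_c,  u_pc   = 3.0e6, 0.03e6     # chamber pressure, Pa
        a_t,  u_at   = 8.0e-3, 0.08e-3   # nozzle throat area, m^2
        mdot, u_mdot = 10.0,  0.3        # total propellant mass flow rate, kg/s

        c_star = p_c * a_t / mdot
        # First-order propagation: relative uncertainties add in quadrature.
        u_rel = np.sqrt((u_pc / p_c) ** 2 + (u_at / a_t) ** 2 + (u_mdot / mdot) ** 2)
        print(f"c* = {c_star:.0f} m/s +/- {100 * u_rel:.1f} % ({c_star * u_rel:.0f} m/s)")

    The same bookkeeping applied to the theoretical c* gives the uncertainty of the characteristic velocity efficiency, their ratio.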

  2. Productivity of Northern Eurasian forests: Analysis of uncertainties

    NASA Astrophysics Data System (ADS)

    Shvidenko, Anatoly; Schepaschenko, Dmitry; McCallum, Ian

    2010-05-01

    Indicators of the biological productivity of forests (live and dead biomass, net primary production, net and gross growth) are crucial both for assessing the impacts of terrestrial ecosystems on major biogeochemical cycles and for the practice of sustainable forest management. However, differences in information and the diversity of methods used in assessments of forest productivity cause substantial variation in reported estimates. The paper contains a systems analysis of the existing methods, their uncertainties, and a description of the available information. With respect to Northern Eurasian forests, the major reasons for uncertainties can be categorized as follows: (1) significant biases that are inherent in a number of important sources of available information (e.g., forest inventory data, results of measurements of some indicators in situ); (2) inadequacy and oversimplification of models of different types (empirical aggregations, process-based models); (3) lack of data for some regions; and (4) the upscaling procedure of 'point' observations. Adhering as comprehensively as possible to the principles of systems analysis, we attempted to provide a reanalysis of indicators of forest productivity for Russia, aiming at results whose uncertainties can be estimated in a reliable and transparent way. Within a landscape-ecosystem approach this has required (1) development of an expert system for refinement of initial data, including elimination of recognized biases; (2) delineation of ecological regions based on gradients of major indicators of productivity; (3) transition to multidimensional models (e.g., for calculation of spatially distributed biomass expansion factors); (4) use of process-based elements in empirical models; and (5) development of some approaches which presumably do not have recognized biases. However, taking into account the fuzzy character of the problem, the above approach (as well as any other individually used method) is

  3. Additional challenges for uncertainty analysis in river engineering

    NASA Astrophysics Data System (ADS)

    Berends, Koen; Warmink, Jord; Hulscher, Suzanne

    2016-04-01

    the proposed intervention. The implicit assumption underlying such analysis is that both models are commensurable. We hypothesize that they are commensurable only to a certain extent. In an idealised study we have demonstrated that a loss of prediction performance should be expected with increasingly large engineering works. When accounting for parametric uncertainty of floodplain roughness in model identification, we see uncertainty bounds for predicted effects of interventions increase with increasing intervention scale. Calibration of these types of models therefore seems to have a shelf-life, beyond which calibration no longer improves prediction. Therefore a qualification scheme for model use is required that can be linked to model validity. In this study, we characterize model use along three dimensions: extrapolation (using the model with different external drivers), extension (using the model for different output or indicators) and modification (using modified models). Such use of models is also expected to have implications for the applicability of surrogate modelling for efficient uncertainty analysis, which is recommended for future research. Warmink, J. J.; Straatsma, M. W.; Huthoff, F.; Booij, M. J. & Hulscher, S. J. M. H. 2013. Uncertainty of design water levels due to combined bed form and vegetation roughness in the Dutch river Waal. Journal of Flood Risk Management 6, 302-318. DOI: 10.1111/jfr3.12014

  4. Statistical analysis of the uncertainty related to flood hazard appraisal

    NASA Astrophysics Data System (ADS)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice: it provides an evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and of the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of the potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrices). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis, so hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainties intrinsic to flood risk appraisal can be related to the hazard evaluation, due to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.

  5. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    NASA Astrophysics Data System (ADS)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task in the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF (if SF is below some specified threshold, failure is possible). The objective of the stability analysis is then to estimate the failure probability P that SF falls below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced; however, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by e.g., increasing the number of tests (lab or in situ survey), improving the measurement methods or evaluating the calculation procedure with model tests, and confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis usually restricts itself to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we propose to review the major criticisms available in the literature against the systematic use of probability in situations of a high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely Possibility distributions (e

  6. Sensitivity and uncertainty analysis of a regulatory risk model

    SciTech Connect

    Kumar, A.; Manocha, A.; Shenoy, T.

    1999-07-01

    Health Risk Assessments (H.R.A.s) are increasingly being used in the environmental decision-making process, from problem identification through the final cleanup activities. A key issue concerning the results of these risk assessments is the uncertainty associated with them. In past studies this uncertainty has been associated with highly conservative estimates of risk assessment parameters. The primary purpose of this study was to investigate error propagation through a risk model. A hypothetical glass plant situated in the state of California was studied. Air emissions from this plant were modeled using the ISCST2 model and the risk was calculated using the ACE2588 model. Downwash was also considered during the concentration calculations. A sensitivity analysis on the risk computations identified five parameters--mixing depth for human consumption, deposition velocity, weathering constant, interception factor for vine crops and the average leaf vegetable consumption--which had the greatest impact on the calculated risk. A Monte Carlo analysis using these five parameters resulted in a distribution with a smaller percentage deviation than the percentage standard deviation of the input parameters.
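
    As an illustration of the error-propagation step described above, the following Python sketch propagates five uncertain inputs through a toy multiplicative risk chain by Monte Carlo sampling; the parameter names follow the abstract, but the distributions and the model itself are illustrative placeholders, not the ISCST2/ACE2588 calculations.

      import numpy as np

      rng = np.random.default_rng(0)
      N = 100_000

      # Hypothetical lognormal distributions for the five most sensitive inputs
      # (names follow the abstract; the spreads are illustrative only).
      params = {
          "mixing_depth":      rng.lognormal(mean=0.0, sigma=0.3, size=N),
          "deposition_vel":    rng.lognormal(mean=0.0, sigma=0.3, size=N),
          "weathering_const":  rng.lognormal(mean=0.0, sigma=0.3, size=N),
          "interception_frac": rng.lognormal(mean=0.0, sigma=0.3, size=N),
          "leaf_veg_intake":   rng.lognormal(mean=0.0, sigma=0.3, size=N),
      }

      # Toy multiplicative risk chain standing in for the ACE2588 pathway model.
      risk = (params["deposition_vel"] * params["interception_frac"]
              * params["leaf_veg_intake"]
              / (params["mixing_depth"] * params["weathering_const"]))

      cv_out = risk.std() / risk.mean()
      cv_in = {k: v.std() / v.mean() for k, v in params.items()}
      print(f"output CV: {cv_out:.2f}")
      print("input CVs:", {k: round(v, 2) for k, v in cv_in.items()})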

  7. Reducing spatial uncertainty in climatic maps through geostatistical analysis

    NASA Astrophysics Data System (ADS)

    Pesquer, Lluís; Ninyerola, Miquel; Pons, Xavier

    2014-05-01

    Climatic maps from meteorological stations and geographical co-variables can be obtained through correlative models (Ninyerola et al., 2000)*. Nevertheless, the spatial uncertainty of the resulting maps could be reduced. The present work builds on those approaches, aiming to obtain better results while characterizing spatial uncertainty. The study area is Catalonia (32000 km2), a region with highly variable relief (0 to 3143 m). We have used 217 stations (321 to 1244 mm) to model the annual precipitation in two steps: 1/ multiple regression using geographical variables (elevation, distance to the coast, latitude, etc.) and 2/ refinement of the results by adding the spatial interpolation of the regression residuals with inverse distance weighting (IDW), regularized splines with tension (SPT) or ordinary kriging (OK). The spatial uncertainty analysis is based on an independent subsample (test set), randomly selected in previous works. The main contribution of this work is the analysis of this test set as well as the search for an optimal process of dividing (splitting) the stations into two sets, one used to perform the multiple regression and residual interpolation (fit set), and another used to compute the quality (test set); an optimal division should reduce spatial uncertainty and improve the overall quality. Two methods have been evaluated against classical methods (random selection RS and leave-one-out cross-validation LOOCV): selection by Euclidean 2D distance, and selection by anisotropic 2D distance combined with a 3D contribution (suitably weighted) from the most representative independent variable. Both methods define a minimum threshold distance between samples, obtained by variogram analysis. Main preliminary results for LOOCV, RS (average of 10 executions), the Euclidean criterion (EU), and the anisotropic criterion (with a 1.1 value, the UTMY coordinate has slightly more weight than UTMX) combined with the 3D criterion (A3D) (1000 factor for elevation

  8. Uncertainty analysis for fluorescence tomography with Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Reinbacher-Köstinger, Alice; Freiberger, Manuel; Scharfetter, Hermann

    2011-07-01

    Fluorescence tomography seeks to image an inaccessible fluorophore distribution inside an object, such as a small animal, by injecting light at the boundary and measuring the light emitted by the fluorophore. Optical parameters (e.g. the conversion efficiency or the fluorescence life-time) of certain fluorophores depend on physiologically interesting quantities like the pH value or the oxygen concentration in the tissue, which allows functional rather than just anatomical imaging. To reconstruct the concentration and the life-time from the boundary measurements, a nonlinear inverse problem has to be solved. It is, however, difficult to estimate the uncertainty of the reconstructed parameters in the case of iterative algorithms and a large number of degrees of freedom. Uncertainties in fluorescence tomography applications arise from model inaccuracies, discretization errors, data noise and a priori errors. Thus, a Markov chain Monte Carlo (MCMC) method was used to consider all these uncertainty factors, exploiting the Bayesian formulation of conditional probabilities. A 2-D simulation experiment was carried out for a circular object with two inclusions. Both inclusions had a 2-D Gaussian distribution of the concentration and a constant life-time inside a representative area of the inclusion. Forward calculations were done with the diffusion approximation of Boltzmann's transport equation. The reconstruction results show that the percent estimation error of the life-time parameter is lower by a factor of approximately 10 than that of the concentration. This finding suggests that life-time imaging may provide more accurate information than concentration imaging only. The results must be interpreted with caution, however, because the chosen simulation setup represents a special case and a more detailed analysis remains to be done in the future to clarify whether the findings can be generalized.

  9. Thermal hydraulic limits analysis using statistical propagation of parametric uncertainties

    SciTech Connect

    Chiang, K. Y.; Hu, L. W.; Forget, B.

    2012-07-01

    The MIT Research Reactor (MITR) is evaluating the conversion from highly enriched uranium (HEU) to low enrichment uranium (LEU) fuel. In addition to the fuel element re-design, a reactor power upgrade from 6 MW to 7 MW is proposed in order to maintain the same reactor performance as the HEU core. The previous approach of analyzing the impact of engineering uncertainties on thermal hydraulic limits via the use of engineering hot channel factors (EHCFs) was unable to explicitly quantify the uncertainty and confidence level in reactor parameters. The objective of this study is to develop a methodology for MITR thermal hydraulic limits analysis by statistically combining engineering uncertainties, with the aim of eliminating unnecessary conservatism inherent in traditional analyses. This method was employed to analyze the Limiting Safety System Settings (LSSS) for the MITR, defined by the avoidance of the onset of nucleate boiling (ONB). Key parameters, such as coolant channel tolerances and heat transfer coefficients, were treated as normal distributions, and Oracle Crystal Ball was used to calculate the ONB limit. The LSSS power is determined with a 99.7% confidence level. The LSSS power calculated using this new methodology is 9.1 MW, based on a core outlet coolant temperature of 60 deg. C and a primary coolant flow rate of 1800 gpm, compared to 8.3 MW obtained from the analytical method using the EHCFs with the same operating conditions. The same methodology was also used to calculate the safety limit (SL) for the MITR, conservatively determined using the onset of flow instability (OFI) as the criterion, to verify that an adequate safety margin exists between the LSSS and the SL. The calculated SL is 10.6 MW, which is 1.5 MW higher than the LSSS. (authors)
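
    A minimal sketch of the statistical-propagation idea described above: sample the key parameters from assumed normal distributions, evaluate an ONB-limited power for each sample, and quote the power that remains acceptable in 99.7% of cases. The heat-transfer relation and numerical values below are placeholders, not the MITR correlations or the Crystal Ball setup.

      import numpy as np

      rng = np.random.default_rng(1)
      N = 200_000

      # Placeholder normal distributions for key parameters (illustrative values).
      channel_gap = rng.normal(2.0, 0.05, N)     # mm, coolant channel tolerance
      htc         = rng.normal(30.0, 1.5, N)     # kW/(m2 K), heat transfer coefficient
      flow        = rng.normal(1800.0, 30.0, N)  # gpm, primary coolant flow rate

      # Toy ONB-limited power model: power scales with heat transfer capability,
      # flow and channel gap.  Stands in for the real thermal hydraulic correlations.
      p_onb = 9.0 * (htc / 30.0) * (flow / 1800.0) ** 0.8 * (channel_gap / 2.0) ** 0.2

      # Power that remains below the ONB limit in 99.7% of the sampled cases.
      lsss_power = np.percentile(p_onb, 0.3)
      print(f"LSSS power at 99.7% confidence: {lsss_power:.2f} MW")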

  10. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  11. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis in the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all at once. Instead, only one variable is identified first, and then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all four variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and the methodology is then applied to the NASA Langley Uncertainty Quantification Challenge problem.
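
    The variance-based global sensitivity step can be illustrated with a pick-freeze (Saltelli-type) estimator of first-order Sobol' indices, as sketched below; the four-input test function merely stands in for the GTM subsystem models and is not part of the challenge problem.

      import numpy as np

      def model(x):
          # Stand-in system response; x has shape (N, 4).
          return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 2] * x[:, 3]

      rng = np.random.default_rng(2)
      N, d = 50_000, 4
      A = rng.uniform(0.0, 1.0, (N, d))
      B = rng.uniform(0.0, 1.0, (N, d))
      yA, yB = model(A), model(B)
      var_y = np.var(np.concatenate([yA, yB]))

      # First-order Sobol' indices, Saltelli (2010) pick-freeze estimator:
      # S_i = mean(yB * (y(A with column i from B) - yA)) / Var(Y).
      S = np.empty(d)
      for i in range(d):
          ABi = A.copy()
          ABi[:, i] = B[:, i]
          S[i] = np.mean(yB * (model(ABi) - yA)) / var_y

      print("first-order indices:", np.round(S, 3))
      print("variable to refine first:", int(np.argmax(S)))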

  12. Geoengineering to Avoid Overshoot: An Analysis of Uncertainty

    NASA Astrophysics Data System (ADS)

    Tanaka, Katsumasa; Cho, Cheolhung; Krey, Volker; Patt, Anthony; Rafaj, Peter; Rao-Skirbekk, Shilpa; Wagner, Fabian

    2010-05-01

    ., 2009) is employed to calculate climate responses, including associated uncertainty, and to estimate geoengineering profiles that cap the warming at 2°C since preindustrial times. The inversion setup for the model ACC2 is used to estimate the uncertain parameters (e.g. climate sensitivity) against associated historical observations (e.g. global-mean surface air temperature). Our preliminary results show that, under climate and scenario uncertainties, a geoengineering intervention to avoid an overshoot would be of medium intensity in the latter half of this century (≈ 1 Mt. Pinatubo eruption every 4 years in terms of stratospheric sulfur injections). The start year of the geoengineering intervention does not significantly influence the long-term geoengineering profile. However, a geoengineering intervention of medium intensity could bring about substantial environmental side effects such as the destruction of stratospheric ozone. Our results point to the necessity of persistently pursuing mainstream mitigation efforts. 2) Pollution Abatement and Geoengineering The second study examines the potential of geoengineering combined with clean air policy. A drastic air pollution abatement might result in an abrupt warming because it would suddenly remove the tropospheric aerosols which partly offset the background global warming (e.g. Andreae et al., 2005; Raddatz and Tanaka, 2010). This study investigates the magnitude of unrealized warming under a range of policy assumptions and associated uncertainties. The profile of geoengineering is then estimated to suppress the warming that would accompany clean air policy. This study is the first attempt to explore uncertainty in the warming caused by clean air policy; Kloster et al. (2009), which assesses regional changes in climate and the hydrological cycle, did not, however, include associated uncertainties in the analysis. A variety of policy assumptions will be devised to represent various degrees of air pollution abatement. These

  13. Dynamic wake prediction and visualization with uncertainty analysis

    NASA Technical Reports Server (NTRS)

    Holforty, Wendy L. (Inventor); Powell, J. David (Inventor)

    2005-01-01

    A dynamic wake avoidance system utilizes aircraft and atmospheric parameters readily available in flight to model and predict airborne wake vortices in real time. A novel combination of algorithms allows for a relatively simple yet robust wake model to be constructed based on information extracted from a broadcast. The system predicts the location and movement of the wake based on the nominal wake model and correspondingly performs an uncertainty analysis on the wake model to determine a wake hazard zone (no fly zone), which comprises a plurality of wake planes, each moving independently from another. The system selectively adjusts dimensions of each wake plane to minimize spatial and temporal uncertainty, thereby ensuring that the actual wake is within the wake hazard zone. The predicted wake hazard zone is communicated in real time directly to a user via a realistic visual representation. In an example, the wake hazard zone is visualized on a 3-D flight deck display to enable a pilot to visualize or see a neighboring aircraft as well as its wake. The system substantially enhances the pilot's situational awareness and allows for a further safe decrease in spacing, which could alleviate airport and airspace congestion.

  14. Selection of Representative Models for Decision Analysis Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.

    2016-03-01

    The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that are supposed to be analyzed so that an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. To do so, first a mathematical function was developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.

  15. Impact of Model Uncertainties on Quantitative Analysis of FUV Auroral Images: Peak Production Height

    NASA Technical Reports Server (NTRS)

    Germany, G. A.; Lummerzheim, D.; Parks, G. K.; Brittnacher, M. J.; Spann, James F., Jr.; Richards, Phil G.

    1999-01-01

    We demonstrate that small uncertainties in the modeled height of peak production for FUV emissions can lead to significant uncertainties in the analysis of these same emissions. In particular, an uncertainty of only 3 km in the peak production height can lead to a 50% uncertainty in the mean auroral energy deduced from the images. This altitude uncertainty is comparable to the differences between auroral deposition models currently used for UVI analysis. Consequently, great care must be taken in quantitative photometric analysis and interpretation of FUV auroral images.

  16. Computational Methods for Sensitivity and Uncertainty Analysis in Criticality Safety

    SciTech Connect

    Broadhead, B.L.; Childs, R.L.; Rearden, B.T.

    1999-09-20

    Interest in the sensitivity methods that were developed and widely used in the 1970s (the FORSS methodology at ORNL among others) has increased recently as a result of potential use in the area of criticality safety data validation procedures to define computational bias, uncertainties and area(s) of applicability. Functional forms of the resulting sensitivity coefficients can be used as formal parameters in the determination of applicability of benchmark experiments to their corresponding industrial application areas. In order for these techniques to be generally useful to the criticality safety practitioner, the procedures governing their use had to be updated and simplified. This paper will describe the resulting sensitivity analysis tools that have been generated for potential use by the criticality safety community.

  17. Sensitivity and uncertainty analysis of the recharge boundary condition

    NASA Astrophysics Data System (ADS)

    Jyrkama, M. I.; Sykes, J. F.

    2006-01-01

    The reliability analysis method is integrated with MODFLOW to study the impact of recharge on the groundwater flow system at a study area in New Jersey. The performance function is formulated in terms of head or flow rate at a pumping well, while the recharge sensitivity vector is computed efficiently by implementing the adjoint method in MODFLOW. The developed methodology not only quantifies the reliability of head at the well in terms of uncertainties in the recharge boundary condition, but it also delineates areas of recharge that have the highest impact on the head and flow rate at the well. The results clearly identify the most important land use areas that should be protected in order to maintain the head and hence production at the pumping well. These areas extend far beyond the steady state well capture zone used for land use planning and management within traditional wellhead protection programs.

  18. Uncertainty Analysis of the NASA Glenn 8x6 Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Stephens, Julia; Hubbard, Erin; Walter, Joel; McElroy, Tyler

    2016-01-01

    This paper presents methods and results of a detailed measurement uncertainty analysis that was performed for the 8- by 6-foot Supersonic Wind Tunnel located at the NASA Glenn Research Center. The statistical methods and engineering judgments used to estimate elemental uncertainties are described. The Monte Carlo method of propagating uncertainty was selected to determine the uncertainty of calculated variables of interest. A detailed description of the Monte Carlo method as applied for this analysis is provided. Detailed uncertainty results for the uncertainty in average free stream Mach number as well as other variables of interest are provided. All results are presented as random (variation in observed values about a true value), systematic (potential offset between observed and true value), and total (random and systematic combined) uncertainty. The largest sources contributing to uncertainty are determined and potential improvement opportunities for the facility are investigated.

  19. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 1: Main report

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Boardman, J.; Jones, J.A.; Harper, F.T.; Young, M.L.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models.

  20. Uncertainty importance analysis using parametric moment ratio functions.

    PubMed

    Wei, Pengfei; Lu, Zhenzhou; Song, Jingwen

    2014-02-01

    This article presents a new importance analysis framework, called the parametric moment ratio function, for measuring the reduction of model output uncertainty when the distribution parameters of the inputs are changed; the emphasis is put on the mean and variance ratio functions with respect to the variances of the model inputs. The proposed concepts efficiently guide the analyst to achieve a targeted reduction of the model output mean and variance by operating on the variances of the model inputs. Unbiased and progressive unbiased Monte Carlo estimators are also derived for the parametric mean and variance ratio functions, respectively. Only a single set of samples is needed to implement the proposed importance analysis with these estimators; thus the computational cost is independent of the input dimensionality. An analytical test example with highly nonlinear behavior is introduced to illustrate the engineering significance of the proposed importance analysis technique and to verify the efficiency and convergence of the derived Monte Carlo estimators. Finally, the moment ratio function is applied to a planar 10-bar structure to achieve a targeted 50% reduction of the model output variance.

  1. [Parameter uncertainty analysis for urban rainfall runoff modelling].

    PubMed

    Huang, Jin-Liang; Lin, Jie; Du, Peng-Fei

    2012-07-01

    An urban watershed in Xiamen was selected to perform parameter uncertainty analysis for urban stormwater runoff modeling, in terms of parameter identification and sensitivity analysis, based on the storm water management model (SWMM) using Monte Carlo sampling and the regionalized sensitivity analysis (RSA) algorithm. Results show that Dstore-Imperv, Dstore-Perv and Curve Number (CN) are the identifiable parameters with larger K-S values in the hydrological and hydraulic modules, and the rank of K-S values in these modules is Dstore-Imperv > CN > Dstore-Perv > N-Perv > conductivity > Con-Mann > N-Imperv. With regard to the water quality module, the parameters of the exponential washoff model (Coefficient and Exponent) and the Max. Buildup parameter of the saturation buildup model in three land cover types are the identifiable parameters with the larger K-S values. In comparison, the K-S value of the rate constant in the three land use/cover types is smaller than those of Max. Buildup, Coefficient and Exponent.
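
    A minimal sketch of the regionalized sensitivity analysis (RSA) idea used above: sample the parameters, split the runs into behavioral and non-behavioral sets with an objective criterion, and rank the parameters by the two-sample Kolmogorov-Smirnov distance between the two sets. The toy error function and threshold are assumptions standing in for the SWMM runs.

      import numpy as np
      from scipy.stats import ks_2samp

      rng = np.random.default_rng(3)
      N = 20_000
      names = ["Dstore-Imperv", "Dstore-Perv", "CN", "N-Imperv"]
      X = rng.uniform(0.0, 1.0, (N, len(names)))   # normalized parameter samples

      # Toy model-error criterion standing in for the SWMM goodness of fit.
      err = np.abs(0.8 * X[:, 0] + 0.4 * X[:, 2] - 0.6) + 0.05 * rng.standard_normal(N)
      behavioral = err < np.percentile(err, 20)    # best 20% of runs

      # K-S distance between behavioral and non-behavioral marginal distributions:
      # larger values indicate more identifiable (sensitive) parameters.
      for j, name in enumerate(names):
          ks = ks_2samp(X[behavioral, j], X[~behavioral, j]).statistic
          print(f"{name:14s} K-S = {ks:.3f}")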

  2. Regional Frequency and Uncertainty Analysis of Extreme Precipitation in Bangladesh

    NASA Astrophysics Data System (ADS)

    Mortuza, M. R.; Demissie, Y.; Li, H. Y.

    2014-12-01

    The increased frequency of extreme precipitation events, especially those with multiday durations, is responsible for recent urban floods and associated significant losses of lives and infrastructure in Bangladesh. Reliable and routinely updated estimation of the frequency of occurrence of such extreme precipitation events is thus important for designing up-to-date hydraulic structures and stormwater drainage systems that can effectively minimize future risk from similar events. In this study, we have updated the intensity-duration-frequency (IDF) curves for Bangladesh using daily precipitation data from 1961 to 2010 and quantified the associated uncertainties. Regional frequency analysis based on L-moments is applied to 1-day, 2-day and 5-day annual maximum precipitation series because of its advantages over at-site estimation. The regional frequency approach pools information from climatologically similar sites to make reliable estimates of quantiles, provided that the pooling group is homogeneous and of reasonable size. We have used the region of influence (ROI) approach along with an L-moment-based homogeneity measure to identify the homogeneous pooling groups for each site. Five 3-parameter distributions (Generalized Logistic, Generalized Extreme Value, Generalized Normal, Pearson Type III, and Generalized Pareto) are used for a thorough selection of appropriate models that fit the sample data. Uncertainties related to the selection of the distributions and to the historical data are quantified using the Bayesian Model Averaging and balanced bootstrap approaches, respectively. The results from this study can be used to update the current design and management of hydraulic structures as well as to explore spatio-temporal variations of extreme precipitation and the associated risk.
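
    The regional approach builds on sample L-moments. The sketch below computes the first two L-moments and the L-skewness from probability-weighted moments for a synthetic annual-maximum series; the data are illustrative, not the Bangladesh record.

      import numpy as np

      def sample_lmoments(x):
          """Return (l1, l2, t3) from unbiased probability-weighted moments."""
          x = np.sort(np.asarray(x, dtype=float))
          n = x.size
          j = np.arange(1, n + 1)
          b0 = x.mean()
          b1 = np.sum((j - 1) / (n - 1) * x) / n
          b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
          l1 = b0                       # L-mean
          l2 = 2.0 * b1 - b0            # L-scale
          l3 = 6.0 * b2 - 6.0 * b1 + b0
          return l1, l2, l3 / l2        # L-skewness t3 = l3 / l2

      rng = np.random.default_rng(4)
      annual_max = rng.gumbel(loc=120.0, scale=35.0, size=50)  # synthetic 1-day maxima, mm
      l1, l2, t3 = sample_lmoments(annual_max)
      print(f"l1 = {l1:.1f} mm, l2 = {l2:.1f} mm, L-skewness t3 = {t3:.3f}")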

  3. Uncertainty Analysis of the Three Pagodas Fault-Source Geometry

    NASA Astrophysics Data System (ADS)

    Haller, K. M.

    2015-12-01

    Probabilistic seismic-hazard assessment generally relies on an earthquake catalog (to estimate future seismicity from the locations and rates of past earthquakes) and fault sources (to estimate future seismicity from the known paleoseismic history of surface rupture). The paleoseismic history of potentially active faults in Southeast Asia is addressed at few locations and spans only a few complete recurrence intervals; many faults remain unstudied. Even where the timing of a surface-rupturing earthquake is known, the extent of rupture may not be well constrained. Therefore, subjective judgment of experts is often used to define the three-dimensional size of future ruptures; limited paleoseismic data can lead to large uncertainties in ground-motion hazard from fault sources due to the preferred models that underlie these judgments. The 300-km-long, strike-slip Three Pagodas fault in western Thailand is possibly one of the most active faults in the country. The fault parallels the plate boundary and may be characterized by a slip rate high enough to result in measurable ground motion at periods of interest for building design. The known paleoseismic history is limited and likely does not include the largest possible earthquake on the fault. This lack of knowledge raises the question of what sizes of earthquakes are to be expected. Preferred rupture models constrain possible magnitude-frequency distributions, and alternative rupture models can result in different ground-motion hazard near the fault. This analysis includes alternative rupture models for the Three Pagodas fault, a first-level check against gross modeling assumptions to assure that the source model is a reasonable reflection of observed data, and the resulting ground-motion hazard for each alternative. Inadequate paleoseismic data are an important source of uncertainty that could be compensated for by considering alternative rupture models for poorly known seismic sources.

  4. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    SciTech Connect

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  5. Incorporating parametric uncertainty into population viability analysis models

    USGS Publications Warehouse

    McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.

    2011-01-01

    Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
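
    A minimal sketch of the two-loop simulation structure described above: parametric uncertainty is drawn once per replicate in the outer (replication) loop, while temporal variance is drawn at every time step in the inner loop. The vital rates and thresholds are illustrative, not the piping plover estimates.

      import numpy as np

      rng = np.random.default_rng(5)
      n_reps, n_years, n0, quasi_ext = 5_000, 50, 200, 20

      extinct = 0
      for _ in range(n_reps):
          # Replication loop: draw the mean growth rate once per replicate to
          # represent parametric (sampling/expert-judgment) uncertainty.
          mean_log_lambda = rng.normal(-0.01, 0.03)
          n = float(n0)
          for _ in range(n_years):
              # Time-step loop: temporal (environmental) variance around that mean.
              n *= np.exp(rng.normal(mean_log_lambda, 0.10))
              if n < quasi_ext:
                  extinct += 1
                  break

      print(f"quasi-extinction probability: {extinct / n_reps:.3f}")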

  6. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.

  7. Theoretical Analysis of Positional Uncertainty in Direct Georeferencing

    NASA Astrophysics Data System (ADS)

    Coskun Kiraci, Ali; Toz, Gonul

    2016-10-01

    A GNSS/INS system, composed of a Global Navigation Satellite System and an Inertial Navigation System, can provide orientation parameters directly from the observations collected during the flight. Thus orientation parameters can be obtained through the GNSS/INS integration process without any need for aerotriangulation after the flight. In general, positional uncertainty can be estimated with known coordinates of Ground Control Points (GCPs), which require field work such as marker construction and GNSS measurement, adding cost to the project. The question then arises of what the theoretical uncertainty of point coordinates should be, given the uncertainties of the orientation parameters. In this study, the contribution of each orientation parameter to positional uncertainty is examined, and the theoretical positional uncertainty is computed without GCP measurements for direct georeferencing using a graphical user interface developed in MATLAB.

  8. Experimental investigations for uncertainty quantification in brake squeal analysis

    NASA Astrophysics Data System (ADS)

    Renault, A.; Massa, F.; Lallemand, B.; Tison, T.

    2016-04-01

    The aim of this paper is to improve the correlation between experimental and numerical predictions of unstable frequencies for automotive brake systems, considering uncertainty. First, an experimental quantification of uncertainty and a discussion analysing the contributions of uncertainty to a numerical squeal simulation are proposed. Frequency and transient simulations are performed considering nominal values of model parameters, determined experimentally. The obtained results are compared with those derived from experimental tests to highlight the limitations of deterministic simulations. The effects of the different kinds of uncertainty detected in the working conditions of the brake system (the pad boundary condition, the brake system material properties and the pad surface topography) are discussed by defining different unstable mode classes. Finally, a correlation between experimental and numerical results considering uncertainty is successfully proposed for an industrial brake system. Results from the different comparisons also reveal a major influence of the pad topography and, consequently, of the contact distribution.

  9. Uncertainty analysis of fission fraction for reactor antineutrino experiments

    NASA Astrophysics Data System (ADS)

    Ma, X. B.; Lu, F.; Wang, L. Z.; Chen, Y. X.; Zhong, W. L.; An, F. P.

    2016-06-01

    Reactor simulation is an important source of uncertainties for a reactor neutrino experiment. How to evaluate the antineutrino flux uncertainty resulting from reactor simulation is therefore an important question. In this study, a method for evaluating the antineutrino flux uncertainty resulting from reactor simulation was proposed that takes the correlation coefficients into account. In order to use this method in the Daya Bay antineutrino experiment, the open source code DRAGON was improved and used to obtain the fission fractions and correlation coefficients. The average fission fractions from the DRAGON and SCIENCE codes were compared, and the difference was less than 5% for all four isotopes. The uncertainty of the fission fraction was evaluated by comparing the simulated atomic densities of the four main isotopes with Takahama-3 experimental measurements. After that, the uncertainty of the antineutrino flux resulting from reactor simulation was evaluated as 0.6% per core for the Daya Bay antineutrino experiment.
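
    A minimal sketch of how per-isotope fission-fraction uncertainties can be combined into a flux uncertainty once correlation coefficients are available; the sensitivities, uncertainties and correlation matrix below are placeholders, not the DRAGON/Daya Bay values.

      import numpy as np

      isotopes = ["U235", "U238", "Pu239", "Pu241"]
      # Placeholder flux sensitivities to each fission fraction (relative units).
      sens = np.array([0.55, 0.07, 0.30, 0.08])
      # Placeholder relative uncertainties of the fission fractions (5% each).
      sigma = np.array([0.05, 0.05, 0.05, 0.05])
      # Placeholder correlation matrix; the fractions are partly anti-correlated
      # because they sum to one.
      corr = np.array([[ 1.0, -0.2, -0.7, -0.3],
                       [-0.2,  1.0, -0.1, -0.1],
                       [-0.7, -0.1,  1.0, -0.2],
                       [-0.3, -0.1, -0.2,  1.0]])

      cov = np.outer(sigma, sigma) * corr
      flux_var = sens @ cov @ sens     # first-order (sandwich) propagation
      print(f"relative flux uncertainty: {np.sqrt(flux_var):.4f}")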

  10. Novel techniques for the analysis of the TOA radiometric uncertainty

    NASA Astrophysics Data System (ADS)

    Gorroño, Javier; Banks, Andrew; Gascon, Ferran; Fox, Nigel P.; Underwood, Craig I.

    2016-10-01

    In the framework of the European Copernicus programme, the European Space Agency (ESA) has launched the Sentinel-2 (S2) Earth Observation (EO) mission, which provides optical high-spatial-resolution imagery over land and coastal areas. As part of this mission, a tool (named S2-RUT, from Sentinel-2 Radiometric Uncertainty Tool) estimates the radiometric uncertainties associated with each pixel using as input the top-of-atmosphere (TOA) reflectance factor images provided by ESA. The initial version of the tool has been implemented (code and user guide available) and integrated as part of the Sentinel Toolbox. The tool required the study of several radiometric uncertainty sources as well as the calculation and validation of the combined standard uncertainty in order to estimate the TOA reflectance factor uncertainty per pixel. Here we describe recent research aimed at accommodating novel uncertainty contributions to the TOA reflectance uncertainty estimates in future versions of the tool. The two contributions that we explore are the radiometric impact of the spectral knowledge and the uncertainty propagation of the resampling associated with the orthorectification process. The former is produced by the uncertainty associated with the spectral calibration as well as the spectral variations across the instrument focal plane and the instrument degradation. The latter results from the propagation of the focal plane image into the provided orthoimage. The uncertainty propagation depends on the radiance levels in the pixel neighbourhood and the pixel correlation in the temporal and spatial dimensions. Special effort has been made to study non-stable scenarios and to compare different interpolation methods.

  11. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    SciTech Connect

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  12. Sensitivity and Uncertainty Analysis to Burnup Estimates on ADS using the ACAB Code

    SciTech Connect

    Cabellos, O.; Sanz, J.; Rodriguez, A.; Gonzalez, E.; Embid, M.; Alvarez, F.; Reyes, S.

    2005-05-24

    Within the scope of the Accelerator Driven System (ADS) concept for nuclear waste management applications, the burnup uncertainty estimates due to uncertainty in the activation cross sections (XSs) are important regarding both the safety and the efficiency of the waste burning process. We have applied both sensitivity analysis and Monte Carlo methodology to actinides burnup calculations in a lead-bismuth cooled subcritical ADS. The sensitivity analysis is used to identify the reaction XSs and the dominant chains that contribute most significantly to the uncertainty. The Monte Carlo methodology gives the burnup uncertainty estimates due to the synergetic/global effect of the complete set of XS uncertainties. These uncertainty estimates are valuable to assess the need of any experimental or systematic re-evaluation of some uncertainty XSs for ADS.

  13. Sensitivity and Uncertainty Analysis to Burn-up Estimates on ADS Using ACAB Code

    SciTech Connect

    Cabellos, O; Sanz, J; Rodriguez, A; Gonzalez, E; Embid, M; Alvarez, F; Reyes, S

    2005-02-11

    Within the scope of the Accelerator Driven System (ADS) concept for nuclear waste management applications, the burnup uncertainty estimates due to uncertainty in the activation cross sections (XSs) are important regarding both the safety and the efficiency of the waste burning process. We have applied both sensitivity analysis and Monte Carlo methodology to actinides burnup calculations in a lead-bismuth cooled subcritical ADS. The sensitivity analysis is used to identify the reaction XSs and the dominant chains that contribute most significantly to the uncertainty. The Monte Carlo methodology gives the burnup uncertainty estimates due to the synergetic/global effect of the complete set of XS uncertainties. These uncertainty estimates are valuable to assess the need of any experimental or systematic reevaluation of some uncertainty XSs for ADS.

  14. Accounting for Calibration Uncertainties in X-ray Analysis: Effective Areas in Spectral Fitting

    NASA Astrophysics Data System (ADS)

    Lee, Hyunsook; Kashyap, Vinay L.; van Dyk, David A.; Connors, Alanna; Drake, Jeremy J.; Izem, Rima; Meng, Xiao-Li; Min, Shandong; Park, Taeyoung; Ratzlaff, Pete; Siemiginowska, Aneta; Zezas, Andreas

    2011-04-01

    While considerable advances have been made in accounting for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have generally been ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
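
    The principal-component summary of calibration samples can be sketched as follows: build an ensemble of plausible calibration curves, decompose the deviations from the mean curve with an SVD, and draw new realizations from the leading components. The synthetic effective-area curves below are illustrative, not the Chandra calibration products.

      import numpy as np

      rng = np.random.default_rng(6)
      n_samp, n_bins = 400, 300
      energy = np.linspace(0.3, 8.0, n_bins)          # keV grid

      # Synthetic ensemble of plausible effective-area curves: a nominal shape
      # perturbed by two smooth systematic modes with random amplitudes.
      nominal = 400.0 * np.exp(-0.5 * ((energy - 1.5) / 1.8) ** 2)
      a1 = 0.03 * rng.standard_normal((n_samp, 1))
      a2 = 0.02 * rng.standard_normal((n_samp, 1))
      ensemble = nominal * (1.0 + a1 * np.sin(energy) + a2 * (energy / 8.0))

      # Principal-component summary of the calibration uncertainty.
      mean_curve = ensemble.mean(axis=0)
      U, s, Vt = np.linalg.svd(ensemble - mean_curve, full_matrices=False)
      n_pc = 2
      explained = (s[:n_pc] ** 2).sum() / (s ** 2).sum()

      # Draw one new plausible calibration realization from the leading PCs.
      coeff = rng.standard_normal(n_pc) * s[:n_pc] / np.sqrt(n_samp - 1)
      realization = mean_curve + coeff @ Vt[:n_pc]
      print(f"{n_pc} PCs explain {explained:.1%} of the ensemble variance")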

  15. CASMO5/TSUNAMI-3D spent nuclear fuel reactivity uncertainty analysis

    SciTech Connect

    Ferrer, R.; Rhodes, J.; Smith, K.

    2012-07-01

    The CASMO5 lattice physics code is used in conjunction with the TSUNAMI-3D sequence in ORNL's SCALE 6 code system to estimate the uncertainties in hot-to-cold reactivity changes due to cross-section uncertainty for PWR assemblies at various burnup points. The goal of the analysis is to establish the multiplication factor uncertainty similarity between various fuel assemblies at different conditions in a quantifiable manner and to obtain a bound on the hot-to-cold reactivity uncertainty over the various assembly types and burnup attributed to fundamental cross-section data uncertainty. (authors)

  16. Approximate Uncertainty Modeling in Risk Analysis with Vine Copulas

    PubMed Central

    Bedford, Tim; Daneshkhah, Alireza

    2015-01-01

    Many applications of risk analysis require us to jointly model multiple uncertain quantities. Bayesian networks and copulas are two common approaches to modeling joint uncertainties with probability distributions. This article focuses on new methodologies for copulas by developing work of Cooke, Bedford, Kurowica, and others on vines as a way of constructing higher dimensional distributions that do not suffer from some of the restrictions of alternatives such as the multivariate Gaussian copula. The article provides a fundamental approximation result, demonstrating that we can approximate any density as closely as we like using vines. It further operationalizes this result by showing how minimum information copulas can be used to provide parametric classes of copulas that have such good levels of approximation. We extend previous approaches using vines by considering nonconstant conditional dependencies, which are particularly relevant in financial risk modeling. We discuss how such models may be quantified, in terms of expert judgment or by fitting data, and illustrate the approach by modeling two financial data sets. PMID:26332240

  17. Approximate Uncertainty Modeling in Risk Analysis with Vine Copulas.

    PubMed

    Bedford, Tim; Daneshkhah, Alireza; Wilson, Kevin J

    2016-04-01

    Many applications of risk analysis require us to jointly model multiple uncertain quantities. Bayesian networks and copulas are two common approaches to modeling joint uncertainties with probability distributions. This article focuses on new methodologies for copulas by developing work of Cooke, Bedford, Kurowica, and others on vines as a way of constructing higher dimensional distributions that do not suffer from some of the restrictions of alternatives such as the multivariate Gaussian copula. The article provides a fundamental approximation result, demonstrating that we can approximate any density as closely as we like using vines. It further operationalizes this result by showing how minimum information copulas can be used to provide parametric classes of copulas that have such good levels of approximation. We extend previous approaches using vines by considering nonconstant conditional dependencies, which are particularly relevant in financial risk modeling. We discuss how such models may be quantified, in terms of expert judgment or by fitting data, and illustrate the approach by modeling two financial data sets.

  18. Uncertainty Analysis of NASA Glenn's 8- by 6-Foot Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Stephens, Julia E.; Hubbard, Erin P.; Walter, Joel A.; McElroy, Tyler

    2016-01-01

    An analysis was performed to determine the measurement uncertainty of the Mach number of the 8- by 6-foot Supersonic Wind Tunnel at the NASA Glenn Research Center. This paper details the analysis process used, including methods for handling limited data and complicated data correlations. Due to the complexity of the equations used, a Monte Carlo method was utilized for this uncertainty analysis. A summary of the findings is presented, covering what the uncertainties are, how they impact various research tests in the facility, and methods of reducing the uncertainties in the future.

  19. ANALYSIS OF MEASUREMENT UNCERTAINTIES IN THE NULLING TEST FOR AIR LEAKAGE FROM RESIDENTIAL DUCTS.

    SciTech Connect

    ANDREWS,J.W.

    2001-04-01

    An analysis of measurement uncertainties in a recently proposed method of measuring air leakage in residential duct systems has been carried out. The uncertainties in supply and return leakage rates are expressed in terms of the value of the envelope leakage flow coefficient and the uncertainties in measured pressures and air flow rates. Results of the analysis are compared with data published by two research groups.

  20. Methodology for Uncertainty Analysis of Dynamic Computational Toxicology Models

    EPA Science Inventory

    The task of quantifying the uncertainty in both parameter estimates and model predictions has become more important with the increased use of dynamic computational toxicology models by the EPA. Dynamic toxicological models include physiologically-based pharmacokinetic (PBPK) mode...

  2. Model parameter uncertainty analysis for an annual field-scale P loss model

    NASA Astrophysics Data System (ADS)

    Bolster, Carl H.; Vadas, Peter A.; Boykin, Debbie

    2016-08-01

    Phosphorus (P) fate and transport models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. Because all models are simplifications of complex systems, there is an inherent amount of uncertainty associated with their predictions. It is therefore important that efforts be directed at identifying, quantifying, and communicating the different sources of model uncertainty. In this study, we conducted an uncertainty analysis with the Annual P Loss Estimator (APLE) model. Our analysis included calculating parameter uncertainties and confidence and prediction intervals for five internal regression equations in APLE. We also estimated uncertainties of the model input variables based on values reported in the literature. We then predicted P loss for a suite of fields under different management and climatic conditions while accounting for uncertainties in the model parameters and inputs, and compared the relative contributions of these two sources of uncertainty to the overall uncertainty associated with predictions of P loss. Both the overall magnitude of the prediction uncertainties and the relative contributions of the two sources of uncertainty varied depending on management practices and field characteristics. This was due to differences in the number of model input variables and the uncertainties in the regression equations associated with each P loss pathway. Inspection of the uncertainties in the five regression equations brought attention to a previously unrecognized limitation with the equation used to partition surface-applied fertilizer P between leaching and runoff losses. As a result, an alternate equation was identified that provided similar predictions with much less uncertainty. Our results demonstrate how a thorough uncertainty and model residual analysis can be used to identify limitations with a model. Such insight can then be used to guide future data collection and model
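
    The parameter-uncertainty and prediction-interval calculation for a single regression equation can be sketched with ordinary least squares as below; the synthetic data and variable names are placeholders, not the APLE equations.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      n = 60
      x = rng.uniform(0.0, 10.0, n)                  # hypothetical predictor (e.g. soil test P)
      y = 1.5 + 0.8 * x + rng.normal(0.0, 1.2, n)    # hypothetical response (e.g. dissolved P loss)

      X = np.column_stack([np.ones(n), x])
      beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
      dof = n - X.shape[1]
      s2 = np.sum((y - X @ beta) ** 2) / dof          # residual variance
      cov_beta = s2 * np.linalg.inv(X.T @ X)          # parameter covariance matrix

      # 95% prediction interval for a new observation at x0 = 5.
      x0 = np.array([1.0, 5.0])
      y0 = x0 @ beta
      se_pred = np.sqrt(s2 + x0 @ cov_beta @ x0)
      t = stats.t.ppf(0.975, dof)
      print("parameter standard errors:", np.sqrt(np.diag(cov_beta)).round(3))
      print(f"prediction at x = 5: {y0:.2f} +/- {t * se_pred:.2f} (95% PI)")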

  3. Uncertainty analysis with reliability techniques of fluvial hydraulic simulations

    NASA Astrophysics Data System (ADS)

    Oubennaceur, K.; Chokmani, K.; Nastev, M.

    2016-12-01

    Flood inundation models are commonly used to simulate hydraulic and floodplain inundation processes, a prerequisite to successful floodplain management and the preparation of appropriate flood risk mitigation plans. Selecting statistically significant ranges of the variables involved in inundation modelling is crucial for model performance. This involves various levels of uncertainty, which, due to their cumulative nature, can lead to considerable uncertainty in the final results. Therefore, in addition to the validation of the model results, there is a need for a clear understanding and identification of the sources of uncertainty and for measuring the model uncertainty. A reliability approach called the Point Estimate Method is presented to quantify the uncertainty effects of the input data and to calculate the propagation of uncertainty through the inundation modelling process. The Point Estimate Method is a special case of numerical quadrature based on orthogonal polynomials. It allows evaluating the low-order moments of performance functions of independent random variables such as the water depth. The variables considered in the analyses include elevation data, flow rate and Manning's coefficient n, each given with its own probability distribution. The approach is applied to a 45 km reach of the Richelieu River, Canada, between Rouses Point and Fryers Rapids. The finite element hydrodynamic model H2D2 was used to solve the shallow water equations (SWE) and provide maps of expected water depths and associated spatial distributions of standard deviations as a measure of uncertainty. The results indicate that for the simulated flow rates of 1113, 1206, and 1282, the uncertainties in water depths have ranges of 25 cm, 30 cm, and 60 cm, respectively. This kind of information is useful for decision making and risk management in the context of flood risk assessment.
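
    A minimal sketch of the Point Estimate Method for independent, symmetric inputs (Rosenblueth's two-point scheme): evaluate the performance function at the 2^n combinations of mean +/- one standard deviation with equal weights and recover the output mean and standard deviation. The toy depth function stands in for the H2D2 hydrodynamic model, and the input statistics are illustrative.

      import numpy as np
      from itertools import product

      def water_depth(elev_err, flow, n_manning):
          # Toy stand-in for the hydrodynamic model: Manning-type normal depth
          # on a wide channel, shifted by a bed-elevation error term.
          slope, width = 5e-4, 200.0
          return (n_manning * flow / (width * np.sqrt(slope))) ** 0.6 + elev_err

      # Means and standard deviations of the independent inputs (illustrative).
      means = np.array([0.0, 1200.0, 0.030])   # elevation error (m), flow, Manning n
      stds  = np.array([0.15, 60.0, 0.004])

      # Rosenblueth two-point estimate: evaluate at all 2**3 combinations of
      # mean +/- one standard deviation, each with weight 1/2**3 (symmetric,
      # uncorrelated inputs), then recover the output mean and standard deviation.
      vals = np.array([water_depth(*(means + np.array(signs) * stds))
                       for signs in product([-1.0, 1.0], repeat=3)])

      mean_d = vals.mean()
      std_d = np.sqrt(np.mean(vals ** 2) - mean_d ** 2)
      print(f"water depth: mean {mean_d:.2f} m, standard deviation {std_d:.2f} m")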

  4. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1988-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center for over two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach was developed to quantify the effects of the random uncertainties. The results indicate that only the variations in geometry have significant effects.

  5. Uncertainty analysis of thermal quantities measurement in a centrifugal compressor

    NASA Astrophysics Data System (ADS)

    Hurda, Lukáš; Matas, Richard

    2017-09-01

    The compressor performance characteristics evaluation process, based on the measurement of pressure, temperature and other quantities, is examined to find the uncertainties of directly measured and derived quantities. CFD is used as a tool to quantify the influence of different sources of measurement uncertainty for single- and multi-thermocouple total temperature probes. The heat conduction through the body of the thermocouple probe and the heat-up of the air in the intake piping are the main phenomena of interest.

  6. Uncertainty Analysis of the Single-Vector Force Balance Calibration System

    NASA Technical Reports Server (NTRS)

    Parker, Peter A.; Liu, Tianshu

    2002-01-01

    This paper presents an uncertainty analysis of the Single-Vector Force Balance Calibration System (SVS). This study is focused on the uncertainty involved in setting the independent variables during the calibration experiment. By knowing the uncertainty in the calibration system, the fundamental limits of the calibration accuracy of a particular balance can be determined. A brief description of the SVS mechanical system is provided. A mathematical model is developed to describe the mechanical system elements. A sensitivity analysis of these parameters is carried out through numerical simulations to assess the sensitivity of the total uncertainty to the elemental error sources. These sensitivity coefficients provide valuable information regarding the relative significance of the elemental sources of error. An example calculation of the total uncertainty for a specific balance is provided. Results from this uncertainty analysis are specific to the Single-Vector System, but the approach is broad in nature and therefore applicable to other measurement and calibration systems.
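
    The sensitivity-coefficient step described above can be sketched generically: perturb each parameter of a measurement model numerically, form sensitivity coefficients, and combine the elemental uncertainties by root-sum-square. The load model and uncertainty values below are placeholders, not the SVS model.

      import numpy as np

      def indicated_load(params):
          # Placeholder measurement model: load indicated by the balance as a
          # function of applied mass (kg), load-line misalignment angle (rad)
          # and sensor gain factor.  Not the actual SVS model.
          mass, angle, gain = params
          g = 9.80665
          return gain * mass * g * np.cos(angle)

      nominal = np.array([10.0, 0.010, 1.000])
      elemental_u = np.array([0.002, 0.0005, 0.0002])   # standard uncertainties

      # Sensitivity coefficients by central finite differences.
      sens = np.empty_like(nominal)
      for i in range(nominal.size):
          dp = np.zeros_like(nominal)
          dp[i] = 1e-6 * max(abs(nominal[i]), 1.0)
          sens[i] = (indicated_load(nominal + dp) - indicated_load(nominal - dp)) / (2 * dp[i])

      contrib = (sens * elemental_u) ** 2
      total_u = np.sqrt(contrib.sum())                  # root-sum-square combination
      print("relative contributions:", np.round(contrib / contrib.sum(), 3))
      print(f"combined standard uncertainty: {total_u:.4f} N")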

  7. Comprehensive neutron cross-section and secondary energy distribution uncertainty analysis for a fusion reactor

    SciTech Connect

    Gerstl, S.A.W.; LaBauve, R.J.; Young, P.G.

    1980-05-01

    Using the example of General Atomic's well-documented Power Generating Fusion Reactor (PGFR) design, this report carries out a comprehensive neutron cross-section and secondary energy distribution (SED) uncertainty analysis. The LASL sensitivity and uncertainty analysis code SENSIT is used to calculate reaction cross-section sensitivity profiles and integral SED sensitivity coefficients. These are then folded with covariance matrices and integral SED uncertainties to obtain the resulting uncertainties of three calculated neutronics design parameters: two critical radiation damage rates and a nuclear heating rate. The report documents the first sensitivity-based data uncertainty analysis that incorporates a quantitative treatment of the effects of SED uncertainties. The results demonstrate quantitatively that the ENDF/B-V cross-section data files for C, H, and O, including their SED data, are fully adequate for this design application, while the data for Fe and Ni are at best marginally adequate because they give rise to response uncertainties of up to 25%. Much higher response uncertainties are caused by cross-section and SED data uncertainties in Cu (26 to 45%), tungsten (24 to 54%), and Cr (up to 98%). Specific recommendations are given for re-evaluations of certain reaction cross-sections, secondary energy distributions, and uncertainty estimates.

  8. LCOE Uncertainty Analysis for Hydropower using Monte Carlo Simulations

    SciTech Connect

    O'Connor, Patrick W; Uria Martinez, Rocio; Kao, Shih-Chieh

    2015-01-01

    Levelized Cost of Energy (LCOE) is an important metric for evaluating the cost and performance of electricity generation alternatives and, combined with other measures, can be used to assess the economics of future hydropower development. Multiple assumptions on input parameters are required to calculate the LCOE, each of which carries some level of uncertainty, in turn affecting the accuracy of LCOE results. This paper explores these uncertainties, their sources, and ultimately the level of variability they introduce at the screening level of project evaluation for non-powered dams (NPDs) across the U.S. Owing to site-specific differences in design, the LCOE for hydropower varies significantly from project to project, unlike technologies with more standardized configurations such as wind and gas. Therefore, to assess the impact of LCOE input uncertainty on the economics of U.S. hydropower resources, these uncertainties must be modeled across the population of potential opportunities. To demonstrate the impact of uncertainty, resource data from a recent nationwide non-powered dam (NPD) resource assessment (Hadjerioua et al., 2012) and screening-level predictive cost equations (O'Connor et al., 2015) are used to quantify and evaluate uncertainties in project capital and operations & maintenance costs and generation potential at broad scale. LCOE dependence on financial assumptions is also evaluated on a sensitivity basis to explore ownership/investment implications on project economics for the U.S. hydropower fleet. The results indicate that the LCOE for U.S. NPDs varies substantially. The LCOE estimates for potential NPD projects with capacity greater than 1 MW range from 40 to 182 $/MWh, with an average of 106 $/MWh. 4,000 MW could be developed through projects with individual LCOE values below 100 $/MWh. The results also indicate that typically 90% of LCOE uncertainty can be attributed to uncertainties in capital costs and energy production; however
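
    A minimal Monte Carlo LCOE sketch in the spirit of the study is shown below, using the textbook simple-LCOE formula (annualized capital plus O&M divided by annual generation). The distributions and the discount-rate/lifetime assumptions are invented for illustration; they are not the ORNL cost equations or resource data.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 100_000

      # Hypothetical input distributions for one candidate NPD site
      capital_cost = rng.lognormal(mean=np.log(60e6), sigma=0.25, size=n)  # $
      om_cost      = rng.normal(1.5e6, 0.3e6, size=n)                      # $/yr
      energy       = rng.normal(90_000, 9_000, size=n)                     # MWh/yr
      rate, years  = 0.05, 50                                              # financing assumptions
      crf = rate * (1 + rate)**years / ((1 + rate)**years - 1)             # capital recovery factor

      lcoe = (capital_cost * crf + om_cost) / energy                       # $/MWh
      print(f"median LCOE: {np.median(lcoe):.0f} $/MWh, "
            f"90% interval: {np.percentile(lcoe, 5):.0f}-{np.percentile(lcoe, 95):.0f} $/MWh")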

  9. A POTENTIAL APPLICATION OF UNCERTAINTY ANALYSIS TO DOE-STD-3009-94 ACCIDENT ANALYSIS

    SciTech Connect

    Palmrose, D E; Yang, J M

    2007-05-10

    The objective of this paper is to assess proposed transuranic waste accident analysis guidance and recent software improvements in a Windows-OS version of MACCS2 that allows the inputting of parameter uncertainty. With this guidance and code capability, there is the potential to perform a quantitative uncertainty assessment of unmitigated accident releases with respect to the 25 rem Evaluation Guideline (EG) of DOE-STD-3009-94 CN3 (STD-3009). Historically, the classification of safety systems in a U.S. Department of Energy (DOE) nuclear facility's safety basis has involved how subject matter experts qualitatively view uncertainty in the STD-3009 Appendix A accident analysis methodology, specifically whether consequence uncertainty could be larger than previously evaluated such that site-specific accident consequences might challenge the EG. This paper assesses whether a potential uncertainty capability for MACCS2 could provide a stronger technical basis as to when the consequences from a design basis accident (DBA) truly challenge the 25 rem EG.

  10. Principles and applications of measurement and uncertainty analysis in research and calibration

    SciTech Connect

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

  11. Principles and applications of measurement and uncertainty analysis in research and calibration

    SciTech Connect

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that ``The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value.`` Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

  12. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  13. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Convergence of the Uncertainty Results

    SciTech Connect

    Bixler, Nathan E.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie; Eckert-Gallup, Aubrey Celia; Mattie, Patrick D.; Ghosh, S. Tina

    2014-02-01

    This paper describes the convergence of MELCOR Accident Consequence Code System, Version 2 (MACCS2) probabilistic results of offsite consequences for the uncertainty analysis of the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout scenario at the Peach Bottom Atomic Power Station. The consequence metrics evaluated are individual latent-cancer fatality (LCF) risk and individual early fatality risk. Consequence results are presented as conditional risk (i.e., assuming the accident occurs, risk per event) to individuals of the public as a result of the accident. In order to verify convergence for this uncertainty analysis, as recommended by the Nuclear Regulatory Commission’s Advisory Committee on Reactor Safeguards, a ‘high’ source term from the original population of Monte Carlo runs has been selected to be used for: (1) a study of the distribution of consequence results stemming solely from epistemic uncertainty in the MACCS2 parameters (i.e., separating the effect from the source term uncertainty), and (2) a comparison between Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) in order to validate the original results obtained with LHS. Three replicates (each using a different random seed) of size 1,000 each using LHS and another set of three replicates of size 1,000 using SRS are analyzed. The results show that the LCF risk results are well converged with either LHS or SRS sampling. The early fatality risk results are less well converged at radial distances beyond 2 miles, and this is expected due to the sparse data (predominance of “zero” results).
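
    The replicate comparison described above can be mimicked on a toy problem: draw three replicates of size 1,000 with Latin hypercube sampling and three with simple random sampling and compare the stability of a consequence-like statistic. The three-parameter function below is invented for illustration and has nothing to do with the MACCS2 parameter set.

      import numpy as np
      from scipy import stats
      from scipy.stats import qmc

      def consequence(x):
          # Toy stand-in for a consequence metric of three uncertain parameters
          return np.exp(x[:, 0]) * (1 + 0.5 * x[:, 1]**2) + 0.1 * x[:, 2]

      def sample(n, method, seed):
          if method == "LHS":
              u = qmc.LatinHypercube(d=3, seed=seed).random(n)      # stratified design
          else:                                                     # simple random sampling
              u = np.random.default_rng(seed).random((n, 3))
          return stats.norm.ppf(u)                                  # map uniforms to N(0, 1) inputs

      for method in ("LHS", "SRS"):
          means = [consequence(sample(1000, method, seed)).mean() for seed in (1, 2, 3)]
          print(method, "replicate means:", np.round(means, 3))

    With a well-behaved statistic such as the mean, the spread among the three replicate means is the practical indicator of convergence, mirroring the comparison performed in the study.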

  14. Uncertainty Analysis of RELAP5-3D

    SciTech Connect

    Alexandra E Gertman; Dr. George L Mesina

    2012-07-01

    As world-wide energy consumption continues to increase, so does the demand for the use of alternative energy sources, such as nuclear energy. Nuclear power plants currently supply over 370 gigawatts of electricity, and more than 60 new nuclear reactors have been commissioned by 15 different countries. The primary concern for nuclear power plant operation and licensing has been safety. The safe operation of nuclear power plants is no simple matter; it involves the training of operators, the design of the reactor, and equipment and design upgrades throughout the lifetime of the reactor. To safely design, operate, and understand nuclear power plants, industry and government alike have relied upon the use of best-estimate simulation codes, which allow an accurate model of any given plant to be created with well-defined margins of safety. The most widely used of these best-estimate simulation codes in the nuclear power industry is RELAP5-3D. Our project focused on improving the modeling capabilities of RELAP5-3D by developing uncertainty estimates for its calculations. This work involved analyzing high-, medium-, and low-ranked phenomena from an INL PIRT on a small-break Loss-Of-Coolant Accident as well as an analysis of a large-break Loss-Of-Coolant Accident. Statistical analyses were performed using correlation coefficients. To perform the studies, computer programs were written that modify a template RELAP5 input deck to produce one deck for each combination of key input parameters. Python scripting enabled the running of the generated input files with RELAP5-3D on INL's massively parallel cluster system. Data from the studies were collected and analyzed with SAS. A summary of the results of our studies is presented.
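
    The deck-generation step can be sketched as a simple template-substitution loop. The placeholder names, parameter values, and file layout below are hypothetical; the actual study's scripts, PIRT parameters, and RELAP5 deck structure will differ.

      import itertools
      from pathlib import Path

      # Assumes a template deck "template.i" containing placeholder tokens such as @BREAK_AREA@
      template = Path("template.i").read_text()

      params = {
          "@BREAK_AREA@":     [0.9, 1.0, 1.1],     # multipliers on nominal values (illustrative)
          "@DISCHARGE_COEF@": [0.8, 1.0, 1.2],
          "@CORE_POWER@":     [0.98, 1.0, 1.02],
      }

      # Write one RELAP5 input deck per combination of the key parameters
      for i, combo in enumerate(itertools.product(*params.values())):
          deck = template
          for placeholder, value in zip(params.keys(), combo):
              deck = deck.replace(placeholder, f"{value:g}")
          Path(f"run_{i:04d}.i").write_text(deck)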

  15. Fuzzy-algebra uncertainty analysis for abnormal-environment safety assessment

    SciTech Connect

    Cooper, J.A.

    1994-01-01

    Many safety (risk) analyses depend on uncertain inputs and on mathematical models chosen from various alternatives, but give fixed results (implying no uncertainty). Conventional uncertainty analyses help, but are also based on assumptions and models, the accuracy of which may be difficult to assure. Some of the models and assumptions that on cursory examination seem reasonable can be misleading. As a result, quantitative assessments, even those accompanied by uncertainty measures, can give unwarranted impressions of accuracy. Since analysis results can be a major contributor to a safety-measure decision process, risk management depends on relating uncertainty to only the information available. The uncertainties due to abnormal environments are even more challenging than those in normal-environment safety assessments, and therefore require an even more cautious approach. A fuzzy algebra analysis is proposed in this report that has the potential to appropriately reflect the information available and portray uncertainties well, especially for abnormal environments.

  16. New approaches to uncertainty analysis for use in aggregate and cumulative risk assessment of pesticides.

    PubMed

    Kennedy, Marc C; van der Voet, Hilko; Roelofs, Victoria J; Roelofs, Willem; Glass, C Richard; de Boer, Waldo J; Kruisselbrink, Johannes W; Hart, Andy D M

    2015-05-01

    Risk assessments for human exposures to plant protection products (PPPs) have traditionally focussed on single routes of exposure and single compounds. Extensions to estimate aggregate (multi-source) and cumulative (multi-compound) exposure from PPPs present many new challenges and additional uncertainties that should be addressed as part of risk analysis and decision-making. A general approach is outlined for identifying and classifying the relevant uncertainties and variabilities. The implementation of uncertainty analysis within the MCRA software, developed as part of the EU-funded ACROPOLIS project to address some of these uncertainties, is demonstrated. An example is presented for dietary and non-dietary exposures to the triazole class of compounds. This demonstrates the chaining of models, linking variability and uncertainty generated from an external model for bystander exposure with variability and uncertainty in MCRA dietary exposure assessments. A new method is also presented for combining pesticide usage survey information with limited residue monitoring data, to address non-detect uncertainty. The results show that incorporating usage information reduces uncertainty in parameters of the residue distribution but that in this case quantifying uncertainty is not a priority, at least for UK grown crops. A general discussion of alternative approaches to treat uncertainty, either quantitatively or qualitatively, is included.

  17. NASTRAN variance analysis and plotting of HBDY elements. [analysis of uncertainties of the computer results as a function of uncertainties in the input data

    NASA Technical Reports Server (NTRS)

    Harder, R. L.

    1974-01-01

    The NASTRAN Thermal Analyzer has been extended to perform variance analysis and to plot thermal boundary (HBDY) elements. The objective of the variance analysis addition is to assess the sensitivity of temperature variances resulting from uncertainties inherent in input parameters for heat conduction analysis. The plotting capability provides the ability to check the geometry (location, size, and orientation) of the boundary elements of a model in relation to the conduction elements. Variance analysis is the study of uncertainties in the computed results as a function of uncertainties in the input data. To study this problem using NASTRAN, a solution is made for the expected values of all inputs, plus another solution for each uncertain variable. A variance analysis module subtracts the results to form derivatives and then determines the expected deviations of the output quantities.
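
    In the notation sketched here (not NASTRAN's own), subtracting the perturbed and nominal solutions is simply one-at-a-time finite differencing, and the expected output deviation follows from first-order propagation:

      \frac{\partial T_j}{\partial x_i} \approx
        \frac{T_j(\bar{x} + \Delta x_i \, e_i) - T_j(\bar{x})}{\Delta x_i},
      \qquad
      \sigma_{T_j}^2 \approx \sum_i \left( \frac{\partial T_j}{\partial x_i} \right)^2 \sigma_{x_i}^2,

    where T_j is a computed temperature, x_i an uncertain input with standard deviation sigma_{x_i}, x-bar the vector of expected input values, and e_i the unit vector for input i.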

  18. [Measurement uncertainty in drinking water analysis. Conclusions from interlaboratory tests].

    PubMed

    Koch, M

    2006-10-01

    The problems of measurement uncertainty, its estimation, and its connection with the requirements that drinking water legislation places on analytical methods are described. The difficulties with these requirements are shown. On the basis of numerous interlaboratory test data, the concentration dependences of the reproducibility standard deviation for 75 drinking water parameters were calculated using a variance function described in DIN 38402-45. From this function, mean standard deviations at the legal limit could be calculated with high confidence. These values are compared with the requirements of the legislation. These data can be used on the one hand for the estimation of uncertainties in the laboratories or for the plausibility check of uncertainties already estimated. On the other hand, these data can be helpful for deriving an official interpretation or creating new requirements for the drinking water legislation.

  19. Design and Uncertainty Analysis for a PVTt Gas Flow Standard

    PubMed Central

    Wright, John D.; Johnson, Aaron N.; Moldover, Michael R.

    2003-01-01

    A new pressure, volume, temperature, and time (PVTt) primary gas flow standard at the National Institute of Standards and Technology has an expanded uncertainty (k = 2) of between 0.02 % and 0.05 %. The standard spans the flow range of 1 L/min to 2000 L/min using two collection tanks and two diverter valve systems. The standard measures flow by collecting gas in a tank of known volume during a measured time interval. We describe the significant and novel features of the standard and analyze its uncertainty. The gas collection tanks have a small diameter and are immersed in a uniform, stable, thermostatted water bath. The collected gas achieves thermal equilibrium rapidly and the uncertainty of the average gas temperature is only 7 mK (22 × 10−6 T). A novel operating method leads to essentially zero mass change in, and very low uncertainty contributions from, the inventory volume. Gravimetric and volume expansion techniques were used to determine the tank and the inventory volumes. Gravimetric determinations of collection tank volume made with nitrogen and argon agree with a standard deviation of 16 × 10−6 VT. The largest source of uncertainty in the flow measurement is drift of the pressure sensor over time, which contributes a relative standard uncertainty of 60 × 10−6 to the determinations of the volumes of the collection tanks and to the flow measurements. Throughout the range 3 L/min to 110 L/min, flows were measured independently using the 34 L and the 677 L collection systems, and the two systems agreed within a relative difference of 150 × 10−6. Double diversions were used to evaluate the 677 L system over a range of 300 L/min to 1600 L/min, and the relative differences between single and double diversions were less than 75 × 10−6. PMID:27413592
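
    The core data reduction amounts to differencing the gas density in the collection tank before and after the diversion and combining the dominant relative uncertainties in quadrature. The sketch below uses invented start/stop states and rounded uncertainty components loosely inspired by the figures quoted above; it is not the NIST data-reduction code.

      import numpy as np

      R = 296.8            # J/(kg K), specific gas constant for nitrogen
      V = 0.677            # m^3, nominal collection tank volume (illustrative)

      def density(p_pa, t_k, z=1.0):
          return p_pa / (z * R * t_k)

      # Hypothetical start/stop states of one collection
      rho_i, rho_f = density(101_000.0, 296.15), density(180_000.0, 296.20)
      dt = 120.0           # s, collection time

      mass_flow = (rho_f - rho_i) * V / dt     # kg/s, averaged over the collection

      # Combine illustrative relative standard uncertainties by root-sum-square
      rel = {"tank volume": 30e-6, "pressure": 60e-6, "temperature": 22e-6, "timing": 5e-6}
      u_rel = np.sqrt(sum(v**2 for v in rel.values()))
      print(f"mass flow = {mass_flow*1e3:.3f} g/s "
            f"(relative standard uncertainty ~ {u_rel*1e6:.0f} x 10^-6)")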

  20. Measurement uncertainty analysis of low-dose-rate prostate seed brachytherapy: post-implant dosimetry.

    PubMed

    Gregory, Kent J; Pattison, John E; Bibbo, Giovanni

    2015-03-01

    The minimal dose covering 90% of the prostate volume (D90) is arguably the most important dosimetric parameter in low-dose-rate prostate seed brachytherapy. In this study an analysis of the measurement uncertainties in D90 from low-dose-rate prostate seed brachytherapy was conducted for two common treatment procedures with two different post-implant dosimetry methods. The analysis was undertaken in order to determine the magnitude of D90 uncertainty, how the magnitude of the uncertainty varied when D90 was calculated using different dosimetry methods, and which factors were the major contributors to the uncertainty. The analysis considered the prostate as being homogeneous and tissue equivalent and made use of published data, as well as original data collected specifically for this analysis, and was performed according to the Guide to the expression of uncertainty in measurement (GUM). It was found that when prostate imaging and seed implantation were conducted in two separate sessions using only CT images for post-implant analysis, the expanded uncertainty in D90 values was about 25% at the 95% confidence interval. When prostate imaging and seed implantation were conducted during a single session using CT and ultrasound images for post-implant analysis, the expanded uncertainty in D90 values was about 33%. Methods for reducing these uncertainty levels are discussed. It was found that variations in contouring the target tissue made the largest contribution to D90 uncertainty, while the uncertainty in seed source strength made only a small contribution. It is important that clinicians appreciate the overall magnitude of D90 uncertainty and understand the factors that affect it so that clinical decisions are soundly based, and resources are appropriately allocated.

  1. RECONSTRUCTING EXPOSURE SCENARIOS USING DOSE BIOMARKERS - AN APPLICATION OF BAYESIAN UNCERTAINTY ANALYSIS

    EPA Science Inventory

    We use Bayesian uncertainty analysis to explore how to estimate pollutant exposures from biomarker concentrations. The growing number of national databases with exposure data makes such an analysis possible. They contain datasets of pharmacokinetic biomarkers for many polluta...

  2. RECONSTRUCTING EXPOSURE SCENARIOS USING DOSE BIOMARKERS - AN APPLICATION OF BAYESIAN UNCERTAINTY ANALYSIS

    EPA Science Inventory

    We use Bayesian uncertainty analysis to explore how to estimate pollutant exposures from biomarker concentrations. The growing number of national databases with exposure data makes such an analysis possible. They contain datasets of pharmacokinetic biomarkers for many polluta...

  3. Uncertainty analysis for a field-scale P loss model

    USDA-ARS?s Scientific Manuscript database

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study we assessed the effect of model input error on predic...

  4. Natural hazard modeling and uncertainty analysis [Chapter 2

    Treesearch

    Matthew Thompson; Jord J. Warmink

    2017-01-01

    Modeling can play a critical role in assessing and mitigating risks posed by natural hazards. These modeling efforts generally aim to characterize the occurrence, intensity, and potential consequences of natural hazards. Uncertainties surrounding the modeling process can have important implications for the development, application, evaluation, and interpretation of...

  5. Isokinetic TWC Evaporator Probe: Calculations and Systemic Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Davison, Craig R.; Strapp, J. Walter; Lilie, Lyle; Ratvasky, Thomas P.; Dumont, Christopher

    2016-01-01

    A new Isokinetic Total Water Content Evaporator (IKP2) was downsized from a prototype instrument, specifically to make airborne measurements of hydrometeor total water content (TWC) in deep tropical convective clouds to assess the new ice crystal Appendix D icing envelope. The probe underwent numerous laboratory and wind tunnel investigations to ensure reliable operation under the difficult high altitude/speed/TWC conditions under which other TWC instruments have been known to either fail or have unknown performance characteristics; those results are presented in a companion paper. This paper presents the equations used to determine the total water content (TWC) of the sampled atmosphere from the values measured by the IKP2 or from necessary ancillary data from other instruments. The uncertainty in the final TWC is determined by propagating the uncertainty in the measured values through the calculations to the final result. Two techniques were used and the results compared. The first is a typical analytical method of propagating uncertainty and the second performs a Monte Carlo simulation. The results are very similar, with differences that are insignificant for practical purposes. The uncertainty is between 2 percent and 3 percent at most practical operating conditions. The capture efficiency of the IKP2 was also examined based on a computational fluid dynamic simulation of the original IKP and scaled down to the IKP2. Particles above 24 microns were found to have a capture efficiency greater than 99 percent at all operating conditions.

  6. Isokinetic TWC Evaporator Probe: Calculations and Systemic Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Davison, Craig R.; Strapp, John W.; Lilie, Lyle E.; Ratvasky, Thomas P.; Dumont, Christopher

    2016-01-01

    A new Isokinetic Total Water Content Evaporator (IKP2) was downsized from a prototype instrument, specifically to make airborne measurements of hydrometeor total water content (TWC) in deep tropical convective clouds to assess the new ice crystal Appendix D icing envelope. The probe underwent numerous laboratory and wind tunnel investigations to ensure reliable operation under the difficult high altitude/speed/TWC conditions under which other TWC instruments have been known to either fail or have unknown performance characteristics; those results are presented in a companion paper (Ref. 1). This paper presents the equations used to determine the total water content (TWC) of the sampled atmosphere from the values measured by the IKP2 or from necessary ancillary data from other instruments. The uncertainty in the final TWC is determined by propagating the uncertainty in the measured values through the calculations to the final result. Two techniques were used and the results compared. The first is a typical analytical method of propagating uncertainty and the second performs a Monte Carlo simulation. The results are very similar, with differences that are insignificant for practical purposes. The uncertainty is between 2 and 3 percent at most practical operating conditions. The capture efficiency of the IKP2 was also examined based on a computational fluid dynamic simulation of the original IKP and scaled down to the IKP2. Particles above 24 micrometers were found to have a capture efficiency greater than 99 percent at all operating conditions.
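
    The agreement between the two propagation techniques can be reproduced on a simple stand-in for the TWC data reduction (a ratio of measured quantities). The function and uncertainty values below are invented; they are not the IKP2 equations.

      import numpy as np

      def twc(m_water, flow, dt):
          return m_water / (flow * dt)            # grams of water per m^3 of sampled air

      nominal = np.array([2.0, 0.05, 60.0])       # g, m^3/s, s (illustrative)
      u       = np.array([0.03, 0.0008, 0.1])     # standard uncertainties (illustrative)

      # 1) First-order analytical propagation via finite-difference sensitivities
      y0 = twc(*nominal)
      sens = np.array([(twc(*(nominal + np.eye(3)[i] * 1e-6 * nominal[i])) - y0)
                       / (1e-6 * nominal[i]) for i in range(3)])
      u_analytic = np.sqrt(np.sum((sens * u) ** 2))

      # 2) Monte Carlo propagation
      rng = np.random.default_rng(0)
      samples = rng.normal(nominal, u, size=(200_000, 3))
      u_mc = twc(samples[:, 0], samples[:, 1], samples[:, 2]).std()

      print(f"TWC = {y0:.3f} g/m^3, analytic u = {u_analytic:.4f}, Monte Carlo u = {u_mc:.4f}")

    Because the data reduction is nearly linear over the range of the input uncertainties, the two estimates agree closely, which is the same behaviour reported for the IKP2 analysis.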

  7. An Analysis of Uncertainty and Satisfaction: A Hospital Case Study.

    ERIC Educational Resources Information Center

    Salem, Philip; Williams, M. Lee

    As part of a project investigating how communication within hospital systems differs from communication in other types of organizations, the employees of a 40-bed hospital were surveyed for their attitudes on perceived uncertainty, internal patterns of communication in the hospital, and worker satisfaction. Variables that were studied included…

  8. Uncertainty analysis and robust trajectory linearization control of a flexible air-breathing hypersonic vehicle

    NASA Astrophysics Data System (ADS)

    Pu, Zhiqiang; Tan, Xiangmin; Fan, Guoliang; Yi, Jianqiang

    2014-08-01

    Flexible air-breathing hypersonic vehicles feature significant uncertainties which pose huge challenges to robust controller designs. In this paper, four major categories of uncertainties are analyzed, that is, uncertainties associated with flexible effects, aerodynamic parameter variations, external environmental disturbances, and control-oriented modeling errors. A uniform nonlinear uncertainty model is explored for the first three uncertainties which lumps all uncertainties together and consequently is beneficial for controller synthesis. The fourth uncertainty is additionally considered in stability analysis. Based on these analyses, the starting point of the control design is to decompose the vehicle dynamics into five functional subsystems. Then a robust trajectory linearization control (TLC) scheme consisting of five robust subsystem controllers is proposed. In each subsystem controller, TLC is combined with the extended state observer (ESO) technique for uncertainty compensation. The stability of the overall closed-loop system with the four aforementioned uncertainties and additional singular perturbations is analyzed. Particularly, the stability of nonlinear ESO is also discussed from a Liénard system perspective. At last, simulations demonstrate the great control performance and the uncertainty rejection ability of the robust scheme.

  9. An analysis of uncertainties and skill in forecasts of winter storm losses

    NASA Astrophysics Data System (ADS)

    Pardowitz, Tobias; Osinski, Robert; Kruschke, Tim; Ulbrich, Uwe

    2016-11-01

    This paper describes an approach to derive probabilistic predictions of local winter storm damage occurrences from a global medium-range ensemble prediction system (EPS). Predictions of storm damage occurrences are subject to large uncertainty due to meteorological forecast uncertainty (typically addressed by means of ensemble predictions) and uncertainties in modelling weather impacts. The latter uncertainty arises from the fact that local vulnerabilities are not known in sufficient detail to allow for a deterministic prediction of damages, even if the forecasted gust wind speed contains no uncertainty. Thus, to estimate the damage model uncertainty, a statistical model based on logistic regression analysis is employed, relating meteorological analyses to historical damage records. A quantification of the two individual contributions (meteorological and damage model uncertainty) to the total forecast uncertainty is achieved by neglecting individual uncertainty sources and analysing resulting predictions. Results show an increase in forecast skill measured by means of a reduced Brier score if both meteorological and damage model uncertainties are taken into account. It is demonstrated that skilful predictions on district level (dividing the area of Germany into 439 administrative districts) are possible on lead times of several days. Skill is increased through the application of a proper ensemble calibration method, extending the range of lead times for which skilful damage predictions can be made.
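
    A toy version of the damage-model step (not the authors' model or data) can be put together with an off-the-shelf logistic regression and verified with the Brier score; the gust climatology, vulnerability curve, and ensemble below are all synthetic.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import brier_score_loss

      rng = np.random.default_rng(3)

      # Synthetic "historical" records: district-maximum gust speed (m/s) vs. damage occurrence
      gust = rng.gamma(shape=6.0, scale=4.0, size=2000)
      damage = rng.random(2000) < 1 / (1 + np.exp(-(gust - 30.0) / 4.0))

      model = LogisticRegression().fit(gust.reshape(-1, 1), damage)
      p_train = model.predict_proba(gust.reshape(-1, 1))[:, 1]
      print("Brier score of the fitted damage model:", round(brier_score_loss(damage, p_train), 3))

      # Probabilistic damage forecast for one district from an ensemble of gust forecasts:
      # meteorological uncertainty enters through the ensemble spread, damage-model
      # uncertainty through the fitted occurrence probabilities.
      ensemble_gusts = rng.normal(32.0, 5.0, size=50).reshape(-1, 1)
      p_damage = model.predict_proba(ensemble_gusts)[:, 1].mean()
      print(f"forecast damage probability: {p_damage:.2f}")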

  10. Use of paired simple and complex models to reduce predictive bias and quantify uncertainty

    NASA Astrophysics Data System (ADS)

    Doherty, John; Christensen, Steen

    2011-12-01

    Modern environmental management and decision-making is based on the use of increasingly complex numerical models. Such models have the advantage of allowing representation of complex processes and heterogeneous system property distributions inasmuch as these are understood at any particular study site. The latter are often represented stochastically, this reflecting knowledge of the character of system heterogeneity at the same time as it reflects a lack of knowledge of its spatial details. Unfortunately, however, complex models are often difficult to calibrate because of their long run times and sometimes questionable numerical stability. Analysis of predictive uncertainty is also a difficult undertaking when using models such as these. Such analysis must reflect a lack of knowledge of spatial hydraulic property details. At the same time, it must be subject to constraints on the spatial variability of these details born of the necessity for model outputs to replicate observations of historical system behavior. In contrast, the rapid run times and general numerical reliability of simple models often promote good calibration and ready implementation of sophisticated methods of calibration-constrained uncertainty analysis. Unfortunately, however, many system and process details on which uncertainty may depend are, by design, omitted from simple models. This can lead to underestimation of the uncertainty associated with many predictions of management interest. The present paper proposes a methodology that attempts to overcome the problems associated with complex models on the one hand and simple models on the other hand, while allowing access to the benefits each of them offers. It provides a theoretical analysis of the simplification process from a subspace point of view, this yielding insights into the costs of model simplification, and into how some of these costs may be reduced. It then describes a methodology for paired model usage through which predictive

  11. 42 CFR 81.11 - Use of uncertainty analysis in NIOSH-IREP.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES GUIDELINES FOR DETERMINING PROBABILITY OF CAUSATION... Models Used To Estimate Probability of Causation § 81.11 Use of uncertainty analysis in NIOSH-IREP. (a) EEOICPA requires use of the uncertainty associated with the probability of causation...

  12. 42 CFR 81.11 - Use of uncertainty analysis in NIOSH-IREP.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES GUIDELINES FOR DETERMINING PROBABILITY OF CAUSATION... Models Used To Estimate Probability of Causation § 81.11 Use of uncertainty analysis in NIOSH-IREP. (a) EEOICPA requires use of the uncertainty associated with the probability of causation...

  13. Model parameter uncertainty analysis for annual field-scale P loss model

    USDA-ARS?s Scientific Manuscript database

    Phosphorous (P) loss models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. All P loss models, however, have an inherent amount of uncertainty associated with them. In this study, we conducted an uncertainty analysis with ...

  14. Model parameter uncertainty analysis for an annual field-scale phosphorus loss model

    USDA-ARS?s Scientific Manuscript database

    Phosphorous (P) loss models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. All P loss models, however, have an inherent amount of uncertainty associated with them. In this study, we conducted an uncertainty analysis with ...

  15. Uncertainty analysis on simple mass balance model to calculate critical loads for soil acidity

    Treesearch

    Harbin Li; Steven G. McNulty

    2007-01-01

    Simple mass balance equations (SMBE) of critical acid loads (CAL) in forest soil were developed to assess potential risks of air pollutants to ecosystems. However, to apply SMBE reliably at large scales, SMBE must be tested for adequacy and uncertainty. Our goal was to provide a detailed analysis of uncertainty in SMBE so that sound strategies for scaling up CAL...

  16. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    SciTech Connect

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J. PhD.; Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.
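
    A bare-bones example of the workflow surveyed above, assuming a toy three-input model and uniform epistemic distributions, is sketched here; the sensitivity step uses rank (Spearman) correlations, one of the procedures listed.

      import numpy as np
      from scipy import stats
      from scipy.stats import qmc

      def model(x):
          # Toy analysis: a nonlinear function of three epistemically uncertain inputs
          return x[:, 0] ** 2 + 5.0 * x[:, 1] + 0.1 * x[:, 2]

      n, d = 500, 3
      u = qmc.LatinHypercube(d=d, seed=42).random(n)      # (2) generate a stratified sample
      x = stats.uniform.ppf(u, loc=0.0, scale=2.0)        # (1) inputs characterized as U(0, 2)
      y = model(x)                                        # (3) propagate through the analysis

      # (4) uncertainty analysis results: empirical distribution of the output
      print(f"mean = {y.mean():.2f}, 5th-95th percentile: "
            f"{np.percentile(y, 5):.2f} to {np.percentile(y, 95):.2f}")

      # (5) sensitivity analysis results: rank correlation of each input with the output
      for i in range(d):
          rho, _ = stats.spearmanr(x[:, i], y)
          print(f"input x{i+1}: Spearman rank correlation = {rho:+.2f}")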

  17. Analysis of Uncertainties with the Example of Direct and Inverse Heat Metering Problems

    NASA Astrophysics Data System (ADS)

    Chernukho, A. V.

    2016-03-01

    The unique features of the arithmetic, algebra, logic, and analysis of uncertainty functions as mathematical objects have been investigated. Means of overcoming the deficiencies of interval analysis and fuzzy logic that arise from the failure to account for the variability of the density function form are shown. An example of the application of the arithmetic of uncertainties to the analysis of the results of a multifactor experiment is given.

  18. Incorporating Externalities and Uncertainty into Life-Cycle Cost Analysis

    DTIC Science & Technology

    2012-03-01

    ...that humanity has a right to a safe and healthy environment and that this right has been surrendered involuntarily due to a lack of oversight of the... also responsible for producing ground-level ozone, which has a number of human health effects, and destroying stratospheric ozone, which protects...

  19. Real options analysis for photovoltaic project under climate uncertainty

    NASA Astrophysics Data System (ADS)

    Kim, Kyeongseok; Kim, Sejong; Kim, Hyoungkwan

    2016-08-01

    The decision to invest in a photovoltaic project depends on the prevailing climate conditions. Changes in temperature and insolation affect photovoltaic output. It is important for investors to consider future climate conditions when determining investments in photovoltaic projects. We propose a real options-based framework to assess the economic feasibility of photovoltaic projects under climate change. The framework supports investors in evaluating the impact of climate change on photovoltaic projects under future climate uncertainty.

  20. A python framework for environmental model uncertainty analysis

    USGS Publications Warehouse

    White, Jeremy; Fienen, Michael; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a python framework for Environmental Modeling Uncertainty analyses, an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment (FOSM)) and nonlinear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and nonlinear analyses are documented in example notebooks implemented using Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
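
    The linear (FOSM) machinery that pyEMU wraps reduces to a Schur-complement update of the prior parameter covariance and a quadratic form for each forecast. The generic numpy sketch below uses tiny made-up matrices and deliberately avoids pyEMU's own API, which should be consulted in its repository and notebooks.

      import numpy as np

      J     = np.array([[1.0, 0.4], [0.2, 1.5], [0.7, 0.3]])  # Jacobian: 3 observations x 2 parameters
      C_p   = np.diag([1.0, 0.5])                              # prior parameter covariance
      C_eps = np.diag([0.1, 0.1, 0.2])                         # observation noise covariance

      # Posterior (calibration-conditioned) parameter covariance via the Schur complement
      C_post = C_p - C_p @ J.T @ np.linalg.inv(J @ C_p @ J.T + C_eps) @ J @ C_p

      # Prior and posterior variance of a forecast with parameter sensitivity vector s
      s = np.array([0.5, 2.0])
      print("prior forecast variance:    ", s @ C_p @ s)
      print("posterior forecast variance:", s @ C_post @ s)

    Repeating the posterior calculation with individual observations removed (or hypothetical ones added) is the essence of the data worth analyses mentioned above.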

  1. Parameter uncertainty and nonstationarity in regional extreme rainfall frequency analysis in Qu River Basin, East China

    NASA Astrophysics Data System (ADS)

    Zhu, Q.; Xu, Y. P.; Gu, H.

    2014-12-01

    Traditionally, regional frequency analysis methods were developed for stationary environmental conditions. Nevertheless, recent studies have identified significant changes in hydrological records, leading to the 'death' of stationarity. Besides, uncertainty in hydrological frequency analysis is persistent. This study aims to investigate the impact of one of the most important uncertainty sources, parameter uncertainty, together with nonstationarity, on design rainfall depth in Qu River Basin, East China. A spatial bootstrap is first proposed to analyze the uncertainty of design rainfall depth estimated by regional frequency analysis based on L-moments and estimated at the at-site scale. Meanwhile, a method combining generalized additive models with a 30-year moving window is employed to analyze the non-stationarity in the extreme rainfall regime. The results show that the uncertainties of design rainfall depth with a 100-year return period under stationary conditions estimated by the regional spatial bootstrap can reach 15.07% and 12.22% with GEV and PE3, respectively. At the at-site scale, the uncertainties can reach 17.18% and 15.44% with GEV and PE3, respectively. Under non-stationary conditions, the uncertainties of maximum rainfall depth (corresponding to design rainfall depth) with 0.01 annual exceedance probability (corresponding to a 100-year return period) are 23.09% and 13.83% with GEV and PE3, respectively. Comparing the 90% confidence intervals, the uncertainty of design rainfall depth resulting from parameter uncertainty is less than that from non-stationary frequency analysis with GEV, but slightly larger with PE3. This study indicates that the spatial bootstrap can be successfully applied to analyze the uncertainty of design rainfall depth on both regional and at-site scales. The non-stationary analysis shows that the differences between non-stationary quantiles and their stationary equivalents are important for decision makers in water resources management.
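
    An ordinary (non-spatial) single-site bootstrap of a 100-year GEV design depth conveys the flavour of the parameter-uncertainty part of the study; the synthetic annual-maximum series and the distribution parameters below are invented, and the paper's spatial bootstrap and L-moment regionalization are not reproduced here.

      import numpy as np
      from scipy.stats import genextreme

      rng = np.random.default_rng(11)

      # Synthetic 60-year annual-maximum rainfall record (mm), standing in for one station
      annual_max = genextreme.rvs(c=-0.1, loc=80.0, scale=20.0, size=60, random_state=11)

      def design_depth(sample, return_period=100):
          c, loc, scale = genextreme.fit(sample)
          return genextreme.ppf(1.0 - 1.0 / return_period, c, loc=loc, scale=scale)

      boot = [design_depth(rng.choice(annual_max, size=annual_max.size, replace=True))
              for _ in range(1000)]
      lo, hi = np.percentile(boot, [5, 95])
      print(f"100-yr design depth: {design_depth(annual_max):.0f} mm, "
            f"90% interval: {lo:.0f}-{hi:.0f} mm")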

  2. Photovoltaic Calibrations at the National Renewable Energy Laboratory and Uncertainty Analysis Following the ISO 17025 Guidelines

    SciTech Connect

    Emery, Keith

    2016-09-01

    The measurement of photovoltaic (PV) performance with respect to reference conditions requires measuring current versus voltage for a given tabular reference spectrum, junction temperature, and total irradiance. This report presents the procedures implemented by the PV Cell and Module Performance Characterization Group at the National Renewable Energy Laboratory (NREL) to achieve the lowest practical uncertainty. A rigorous uncertainty analysis of these procedures is presented, which follows the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement. This uncertainty analysis is required for the team’s laboratory accreditation under ISO standard 17025, “General Requirements for the Competence of Testing and Calibration Laboratories.” The report also discusses additional areas where the uncertainty can be reduced.

  3. 'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling

    NASA Astrophysics Data System (ADS)

    Sawicka, Kasia; Heuvelink, Gerard

    2017-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability or for the ability to deal with case studies involving spatial models and spatial model inputs. Due to the growing popularity and applicability of the open-source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in

  4. Global uncertainty analysis of suspended sediment monitoring using turbidimeter in a small mountainous river catchment

    NASA Astrophysics Data System (ADS)

    Navratil, O.; Esteves, M.; Legout, C.; Gratiot, N.; Nemery, J.; Willmore, S.; Grangeon, T.

    2011-02-01

    A major challenge confronting the scientific community is to understand both the patterns of and controls over the spatial and temporal variability of suspended sediment dynamics in rivers, as these sediments govern nutrient export, river morphology, siltation of downstream reservoirs and degradation of water quality. High-frequency suspended sediment monitoring programs are required to meet this goal, particularly in highly erodible mountainous catchments, which supply the sediment load of the entire downstream fluvial network. However, in this context, analysis of the data and their interpretation are generally limited by many sources of uncertainty in river monitoring. This paper proposes to estimate the global uncertainty of suspended sediment monitoring using a turbidimeter in a small mountainous river catchment (22 km2; Southern French Alps). We first conducted a detailed analysis of the main uncertainty components associated with the turbidity approach, i.e. a widely used method to continuously survey the suspended sediment concentration (SSC). These uncertainty components were then propagated with Monte Carlo simulations. For individual records, SSC uncertainties are found to be on average less than 10%, but they can reach 70%. At the flood scale, the mean and the maximum SSC uncertainties are on average 20% (range, 1-30%), whereas sediment yield uncertainty averages 30% (range, 20-50% depending on the flood considered; discharge error, 20%). Annual specific sediment yield (SSY*) was then 360 ± 100 t km-2 year-1. Uncertainty components associated with the automatic pumping procedure, discharge measurement and turbidity fluctuation at the short time scale were found to be the greatest uncertainties. SSC and SSY uncertainties were found to be highly site- and time-dependent as they vary significantly with the hydro-sedimentary conditions. This study demonstrates that global uncertainty accounts for only a small part of inter-flood SSC and SSY variability

  5. Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)

    SciTech Connect

    BABA,T.; ISHIGURO,K.; ISHIHARA,Y.; SAWADA,A.; UMEKI,H.; WAKASUGI,K.; WEBB,ERIK K.

    1999-08-30

    Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference and many alternative cases representing various groups of FEPs were defined and individual numerical simulations performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters as well as a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment.

  6. UNCERTAINTY ANALYSIS OF TCE USING THE DOSE EXPOSURE ESTIMATING MODEL (DEEM) IN ACSL

    EPA Science Inventory

    The ACSL-based Dose Exposure Estimating Model (DEEM) under development by EPA is used to perform an uncertainty analysis of a physiologically based pharmacokinetic (PBPK) model of trichloroethylene (TCE). This model involves several circulating metabolites such as trichloroacet...

  7. UNCERTAINTY ANALYSIS OF TCE USING THE DOSE EXPOSURE ESTIMATING MODEL (DEEM) IN ACSL

    EPA Science Inventory

    The ACSL-based Dose Exposure Estimating Model (DEEM) under development by EPA is used to perform an uncertainty analysis of a physiologically based pharmacokinetic (PBPK) model of trichloroethylene (TCE). This model involves several circulating metabolites such as trichloroacet...

  8. Video Scanning Hartmann Optical Tester (VSHOT) Uncertainty Analysis: Preprint

    SciTech Connect

    Lewandowski, A.; Gray, A.

    2010-10-01

    This purely analytical work is based primarily on the geometric optics of the system and shows sensitivities to various design and operational parameters. We discuss sources of error with measuring devices, instrument calibrations, and operator measurements for a parabolic trough test. In this paper, we include both the random (precision) and systematic (bias) errors for VSHOT testing and their contributions to the uncertainty. The contributing factors that we considered in this study are target tilt, target face to laser output distance, instrument vertical offset, scanner tilt, distance between the tool and the test piece, camera calibration, and scanner/calibration.

  9. PROBABILISTIC SENSITIVITY AND UNCERTAINTY ANALYSIS WORKSHOP SUMMARY REPORT

    SciTech Connect

    Seitz, R

    2008-06-25

    Stochastic or probabilistic modeling approaches are being applied more frequently in the United States and globally to quantify uncertainty and enhance understanding of model response in performance assessments for disposal of radioactive waste. This increased use has resulted in global interest in sharing results of research and applied studies that have been completed to date. This technical report reflects the results of a workshop that was held to share results of research and applied work related to performance assessments conducted at United States Department of Energy sites. Key findings of this research and applied work are discussed and recommendations for future activities are provided.

  10. Uncertainty analysis of diffuse-gray radiation enclosure problems: A hypersensitive case study

    NASA Technical Reports Server (NTRS)

    Taylor, Robert P.; Luck, Rogelio; Hodge, B. K.; Steele, W. Glenn

    1993-01-01

    An uncertainty analysis of diffuse-gray enclosure problems is presented. The genesis was a diffuse-gray enclosure problem which proved to be hypersensitive to the specification of view factors. This genesis is discussed in some detail. The uncertainty analysis is presented for the general diffuse-gray enclosure problem and applied to the hypersensitive case study. It was found that the hypersensitivity could be greatly reduced by enforcing both closure and reciprocity for the view factors. The effects of uncertainties in the surface emissivities and temperatures are also investigated.
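
    A small numpy sketch of a diffuse-gray radiosity solve shows where closure and reciprocity enter: the view-factor matrix is lightly adjusted before the radiosity system is solved. The three-surface geometry, emissivities, and slightly inconsistent view factors are invented, and the adjustment shown is a simple approximation rather than the paper's procedure.

      import numpy as np

      SIGMA = 5.670e-8                          # W m^-2 K^-4

      A   = np.array([1.0, 1.0, 2.0])           # surface areas, m^2 (illustrative)
      eps = np.array([0.8, 0.5, 0.9])           # emissivities (illustrative)
      T   = np.array([600.0, 400.0, 300.0])     # temperatures, K (illustrative)

      # View factors as they might come from noisy numerical estimates
      F = np.array([[0.00, 0.52, 0.49],
                    [0.51, 0.00, 0.48],
                    [0.24, 0.25, 0.50]])

      # Approximately enforce reciprocity (A_i F_ij = A_j F_ji), then closure (rows sum to 1)
      AF = np.diag(A) @ F
      AF = (AF + AF.T) / 2.0
      F  = AF / A[:, None]
      F  = F / F.sum(axis=1, keepdims=True)

      # Radiosity system: (I - diag(1 - eps) F) J = eps * sigma * T^4
      J = np.linalg.solve(np.eye(3) - np.diag(1 - eps) @ F, eps * SIGMA * T**4)
      q = (SIGMA * T**4 - J) * eps / (1 - eps)  # net radiative flux leaving each surface, W/m^2
      print("net surface heat fluxes [W/m^2]:", np.round(q, 1))

    Repeating the solve with and without the view-factor adjustment, or with perturbed emissivities and temperatures, gives a direct feel for the sensitivities examined in the study.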

  11. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    PubMed

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.

  12. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
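
    For orientation, the sketch below derives AHP criteria weights from a hypothetical pairwise comparison matrix via the principal eigenvector and then perturbs the weights in a small Monte Carlo loop to show how a weighted-sum susceptibility score responds; the criteria, judgments, and perturbation size are invented and much simpler than the paper's GIS workflow.

      import numpy as np

      # Hypothetical pairwise comparisons for three criteria (e.g., slope, lithology, land use)
      P = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      vals, vecs = np.linalg.eig(P)
      w = np.real(vecs[:, np.argmax(np.real(vals))])
      w = w / w.sum()                              # AHP criteria weights
      print("AHP weights:", np.round(w, 3))

      # Monte Carlo perturbation of the weights for one map cell with criteria scores x
      rng = np.random.default_rng(5)
      x = np.array([0.7, 0.4, 0.9])                # standardized criteria scores (illustrative)
      w_mc = np.abs(rng.normal(w, 0.05 * w, size=(10_000, 3)))
      w_mc = w_mc / w_mc.sum(axis=1, keepdims=True)
      score = w_mc @ x
      print(f"susceptibility score: {score.mean():.3f} +/- {score.std():.3f}")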

  13. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method uses state-of-the-art uncertainty analysis, applies different turbulence models, and draws conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward-facing step.

  14. A Framework to Support Generator Ramping Uncertainty Analysis and Visualization

    SciTech Connect

    2015-12-01

    Power system operation requires maintaining a continuous balance between system demand and generation within certain constraints. Traditionally, the balancing processes are based on deterministic models, which do not consider possible random deviations of system generation and load from their predicted values. With the increasing penetration of renewable generation, unexpected balancing problems can happen due to these deviations, which can result in serious risks to system reliability and efficiency. When the available balancing reserve is not enough to cover the predicted net load range with uncertainty, a deficiency of balancing capability occurs. In this case, it is necessary to commit or de-commit additional conventional generators to achieve the desired confidence level for the balancing needs. The framework presented here is built for solving this problem. The ramping tool engine is used to predict additional balancing requirements caused by the variability and uncertainty of the renewable energy, under the constraints of the generation ramping capability and interchange schedule. The web-browser-based GUI is used to visualize the data in a web environment, which provides the flexibility to allow users to see the ramping outputs on any platform. The GOSS structure provides strong support for easy communication between the ramping engine, the system inputs, and the GUI.

  15. Positional uncertainty of isocontours: condition analysis and probabilistic measures.

    PubMed

    Pöthkow, Kai; Hege, Hans-Christian

    2011-10-01

    Uncertainty is ubiquitous in science, engineering and medicine. Drawing conclusions from uncertain data is the normal case, not an exception. While the field of statistical graphics is well established, only a few 2D and 3D visualization and feature extraction methods have been devised that consider uncertainty. We present mathematical formulations for uncertain equivalents of isocontours based on standard probability theory and statistics and employ them in interactive visualization methods. As input data, we consider discretized uncertain scalar fields and model these as random fields. To create a continuous representation suitable for visualization we introduce interpolated probability density functions. Furthermore, we introduce numerical condition as a general means in feature-based visualization. The condition number, which potentially diverges in the isocontour problem, describes how errors in the input data are amplified in feature computation. We show how the average numerical condition of isocontours aids the selection of thresholds that correspond to robust isocontours. Additionally, we introduce the isocontour density and the level crossing probability field; these two measures for the spatial distribution of uncertain isocontours are directly based on the probabilistic model of the input data. Finally, we adapt interactive visualization methods to evaluate and display these measures and apply them to 2D and 3D data sets. © 2011 IEEE
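
    A minimal sketch of the level crossing probability idea, under the simplifying assumption of independent Gaussian samples at neighboring grid points; the paper itself works with correlated random fields and interpolated probability densities, and all values below are illustrative.

```python
import numpy as np
from math import erf, sqrt

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def edge_crossing_probability(mu1, sigma1, mu2, sigma2, theta):
    """Probability that the isovalue theta is crossed on the edge between two
    independent Gaussian-distributed samples (a simplification of the paper's
    correlated random-field model)."""
    p1 = phi((theta - mu1) / sigma1)   # P(sample 1 below theta)
    p2 = phi((theta - mu2) / sigma2)   # P(sample 2 below theta)
    return p1 * (1.0 - p2) + (1.0 - p1) * p2

# Illustrative uncertain 1D scalar field: per-sample means and standard deviations.
mu = np.array([0.2, 0.4, 0.55, 0.8])
sigma = np.array([0.05, 0.10, 0.10, 0.05])
theta = 0.5
for i in range(len(mu) - 1):
    p = edge_crossing_probability(mu[i], sigma[i], mu[i + 1], sigma[i + 1], theta)
    print(f"edge {i}-{i+1}: level-crossing probability = {p:.3f}")
```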

  16. Analysis and Reduction of Complex Networks Under Uncertainty.

    SciTech Connect

    Ghanem, Roger G

    2014-07-31

    This effort was a collaboration with Youssef Marzouk of MIT, Omar Knio of Duke University (at the time at Johns Hopkins University) and Habib Najm of Sandia National Laboratories. The objective of this effort was to develop the mathematical and algorithmic capacity to analyze complex networks under uncertainty. Of interest were chemical reaction networks and smart grid networks. The statements of work for USC focused on the development of stochastic reduced models for uncertain networks. The USC team was led by Professor Roger Ghanem and consisted of one graduate student and a postdoc. The contributions completed by the USC team consisted of 1) methodology and algorithms to address the eigenvalue problem, a problem of significance in the stability of networks under stochastic perturbations; 2) methodology and algorithms to characterize probability measures on graph structures with random flows, an important problem in characterizing random demand (encountered in smart grid) and random degradation (encountered in infrastructure systems), as well as modeling errors in Markov Chains (with ubiquitous relevance!); and 3) methodology and algorithms for treating inequalities in uncertain systems, an important problem in the context of models for material failure and network flows under uncertainty where conditions of failure or flow are described in the form of inequalities between the state variables.

  17. Analysis of uncertainty in equilibrium reconstruction in the EAST superconducting tokamak.

    PubMed

    Liu, G J; Wan, B N; Sun, Y W; Xiao, B J; Wang, Y; Luo, Zh P; Qian, J P; Liu, D M

    2013-07-01

    The analysis of uncertainties of magnetic measurements in equilibrium reconstruction is carried out on the EAST (Experimental Advanced Superconducting Tokamak) tokamak. It is shown that uncertainties of the magnetic diagnostics are about 0.2% and 10 mWb for flux loops and 0.6% and 20 G for magnetic probes. A sensitivity analysis of the plasma shape reconstruction to the magnetic data uncertainty is presented, based on the EFIT fixed-boundary and fitting modes and applying the overall uncertainty as the fitting weight in EFIT. It is found that reconstruction uncertainties range from 0.5 to 1.4 cm for the 6 control segments, are less than 0.8 cm for X-points, and range from 1.0 to 1.6 cm for strike points with 95% confidence, based on the last experimental campaign on the EAST tokamak.

  18. Analysis of uncertainty in equilibrium reconstruction in the EAST superconducting tokamak

    NASA Astrophysics Data System (ADS)

    Liu, G. J.; Wan, B. N.; Sun, Y. W.; Xiao, B. J.; Wang, Y.; Luo, Zh. P.; Qian, J. P.; Liu, D. M.

    2013-07-01

    The analysis of uncertainties of magnetic measurements in equilibrium reconstruction is carried out on the EAST (Experimental Advanced Superconducting Tokamak) tokamak. It is shown that uncertainties of the magnetic diagnostics are about 0.2% and 10 mWb for flux loops and 0.6% and 20 G for magnetic probes. A sensitivity analysis of the plasma shape reconstruction to the magnetic data uncertainty is presented, based on the EFIT fixed-boundary and fitting modes and applying the overall uncertainty as the fitting weight in EFIT. It is found that reconstruction uncertainties range from 0.5 to 1.4 cm for the 6 control segments, are less than 0.8 cm for X-points, and range from 1.0 to 1.6 cm for strike points with 95% confidence, based on the last experimental campaign on the EAST tokamak.

  19. Uncertainty analysis on simple mass balance model to calculate critical loads for soil acidity.

    PubMed

    Li, Harbin; McNulty, Steven G

    2007-10-01

    Simple mass balance equations (SMBE) of critical acid loads (CAL) in forest soil were developed to assess potential risks of air pollutants to ecosystems. However, to apply SMBE reliably at large scales, SMBE must be tested for adequacy and uncertainty. Our goal was to provide a detailed analysis of uncertainty in SMBE so that sound strategies for scaling up CAL estimates to the national scale could be developed. Specifically, we wanted to quantify CAL uncertainty under natural variability in 17 model parameters, and determine their relative contributions in predicting CAL. Results indicated that uncertainty in CAL came primarily from components of base cation weathering (BC(w); 49%) and acid neutralizing capacity (46%), whereas the most critical parameters were BC(w) base rate (62%), soil depth (20%), and soil temperature (11%). Thus, improvements in estimates of these factors are crucial to reducing uncertainty and successfully scaling up SMBE for national assessments of CAL.

  20. Uncertainty analysis of impacts of climate change on snow processes: Case study of interactions of GCM uncertainty and an impact model

    NASA Astrophysics Data System (ADS)

    Kudo, Ryoji; Yoshida, Takeo; Masumoto, Takao

    2017-05-01

    The impact of climate change on snow water equivalent (SWE) and its uncertainty were investigated in snowy areas of subarctic and temperate climate zones in Japan by using a snow process model and climate projections derived from general circulation models (GCMs). In particular, we examined how the uncertainty due to GCMs propagated through the snow model, which contained nonlinear processes defined by thresholds, as an example of the uncertainty caused by interactions among multiple sources of uncertainty. An assessment based on the climate projections in Coupled Model Intercomparison Project Phase 5 indicated that heavy-snowfall areas in the temperate zone (especially in low-elevation areas) were markedly vulnerable to temperature change, showing a large SWE reduction even under slight changes in winter temperature. The uncertainty analysis demonstrated that the uncertainty associated with snow processes (1) can be accounted for mainly by the interactions between GCM uncertainty (in particular, the differences of projected temperature changes between GCMs) and the nonlinear responses of the snow model and (2) depends on the balance between the magnitude of projected temperature changes and present climates dominated largely by climate zones and elevation. Specifically, when the peaks of the distributions of daily mean temperature projected by GCMs cross the key thresholds set in the model, the GCM uncertainty, even if tiny, can be amplified by the nonlinear propagation through the snow process model. This amplification results in large uncertainty in projections of CC impact on snow processes.
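
    The threshold amplification effect described above can be illustrated with a toy degree-day-style snow model: a small spread in projected warming is magnified once daily temperatures start crossing the rain/snow threshold. The threshold, melt rate, and synthetic climate below are assumptions for illustration, not the model or projections used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def peak_swe(daily_temp, daily_precip, t_thresh=0.0, melt_rate=3.0):
    """Toy snow model: precipitation accumulates as snow below t_thresh (degC);
    the pack melts at melt_rate (mm per degC per day) above the threshold.
    Returns the peak snow water equivalent over the season (mm)."""
    swe = peak = 0.0
    for t, p in zip(daily_temp, daily_precip):
        if t < t_thresh:
            swe += p
        else:
            swe = max(0.0, swe - melt_rate * (t - t_thresh))
        peak = max(peak, swe)
    return peak

# Illustrative present-climate winter (90 days) close to the rain/snow threshold.
days = 90
temp_now = rng.normal(-0.5, 2.0, days)
precip = rng.gamma(2.0, 3.0, days)

# A small "GCM ensemble" spread in projected winter warming (1.8-2.2 degC).
warming = np.linspace(1.8, 2.2, 11)
swe_future = np.array([peak_swe(temp_now + dT, precip) for dT in warming])

print("present peak SWE (mm):", round(peak_swe(temp_now, precip), 1))
print("future peak SWE across the ensemble (mm):", np.round(swe_future, 1))
print("relative spread introduced by the threshold response:",
      round(np.ptp(swe_future) / swe_future.mean(), 2))
```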

  1. Propagation of uncertainty in test-analysis correlation of substructured spacecraft

    NASA Astrophysics Data System (ADS)

    Kammer, Daniel C.; Nimityongskul, Sonny

    2011-03-01

    Organizations such as the Air Force and NASA make critical decisions on spacecraft performance and survivability based on the results of test-analysis correlation metrics. In order to ensure the success of a new paradigm in finite element model validation where there is no system-level test, uncertainty in the substructures must be propagated into the system-level correlation metrics. The objective of this work is to quantify the level of accuracy required at the substructure level to produce acceptable analytical model accuracy at the system level. In preparation for future synthesized system-level uncertainty analysis, a framework is presented for propagating analytical model uncertainty from a fixed-interface Craig-Bampton substructure representation into a free-free substructure. Model uncertainty is parameterized in terms of test- or truth-analysis correlation metrics that are dictated by the Air Force. A statistical model is presented for these correlation metrics such that an analyst can specify a covariance matrix for uncertainty in model correlation at the fixed substructure level, and then propagate it into correlation uncertainty at the free substructure level. Development of the forward propagation approach then allows propagation of correlation uncertainty in the reverse direction, from the free substructure into the fixed-interface Craig-Bampton representation. The proposed methods are applied to a typical spacecraft representation.

  2. Monte Carlo analysis of uncertainties in the Netherlands greenhouse gas emission inventory for 1990-2004

    NASA Astrophysics Data System (ADS)

    Ramírez, Andrea; de Keizer, Corry; Van der Sluijs, Jeroen P.; Olivier, Jos; Brandes, Laurens

    This paper presents an assessment of the value added of a Monte Carlo analysis of the uncertainties in the Netherlands inventory of greenhouse gases over a Tier 1 analysis. It also examines which parameters contributed the most to the total emission uncertainty and identified areas of high priority for the further improvement of the accuracy and quality of the inventory. The Monte Carlo analysis resulted in an uncertainty range in total GHG emissions of 4.1% in 2004 and 5.4% in 1990 (with LUCF) and 5.3% (in 1990) and 3.9% (in 2004) for GHG emissions without LUCF. Uncertainty in the trend was estimated at 4.5%. The values are in the same order of magnitude as those estimated in the Tier 1. The results show that accounting for correlation among parameters is important, and for the Netherlands inventory it has a larger impact on the uncertainty in the trend than on the uncertainty in the total GHG emissions. The main contributors to overall uncertainty are found to be related to N2O emissions from agricultural soils, the N2O implied emission factors of Nitric Acid Production, CH4 from managed solid waste disposal on land, and the implied emission factor of CH4 from manure management from cattle.
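
    A minimal sketch of a Monte Carlo propagation that accounts for correlation between source categories, which is the key point of difference from a simple Tier 1 error combination; the category totals, uncertainties, and correlation matrix below are illustrative, not the Netherlands inventory values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative emission estimates (Tg CO2-eq) and relative 1-sigma uncertainties.
mean = np.array([10.0, 6.0, 4.0])          # three source categories
rel_sd = np.array([0.20, 0.50, 0.30])
sd = mean * rel_sd

# Assumed correlation between categories that share an emission factor.
corr = np.array([[1.0, 0.6, 0.0],
                 [0.6, 1.0, 0.0],
                 [0.0, 0.0, 1.0]])
cov = np.outer(sd, sd) * corr

# Monte Carlo sample of the inventory total.
samples = rng.multivariate_normal(mean, cov, size=100_000)
total = samples.sum(axis=1)
lo, hi = np.percentile(total, [2.5, 97.5])
print(f"total = {total.mean():.2f}, 95% range = [{lo:.2f}, {hi:.2f}]")
print(f"half-width as % of mean: {100 * (hi - lo) / 2 / total.mean():.1f}%")
```

    Setting the off-diagonal correlations to zero and re-running shows how much of the total uncertainty is attributable to the correlation assumption alone.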

  3. Sensitivity analysis of a two-dimensional quantitative microbiological risk assessment: keeping variability and uncertainty separated.

    PubMed

    Busschaert, Pieter; Geeraerd, Annemie H; Uyttendaele, Mieke; Van Impe, Jan F

    2011-08-01

    The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and if possible it is advised to characterize variability separately from uncertainty. Sensitivity analysis indicates to which of the input variables the outcome of a quantitative microbiological risk assessment is most sensitive. Although a number of methods exist to apply sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, storage duration, etc.), it is challenging to perform sensitivity analysis in the case where a risk assessment includes a separate characterization of variability and uncertainty of input variables. A procedure is proposed that focuses on the relation between risk estimates obtained by Monte Carlo simulation and the location of pseudo-randomly sampled input variables within the uncertainty and variability distributions. Within this procedure, two methods are used, namely an ANOVA-like model and Sobol sensitivity indices, to obtain and compare the impact of variability and of uncertainty of all input variables, and of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment to estimate the risk of contracting listeriosis due to consumption of deli meats.
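
    A compact sketch of a two-dimensional (nested) Monte Carlo in which uncertainty is sampled in an outer loop and variability in an inner loop, which is the separation the sensitivity procedure above relies on; the exponential dose-response form and all parameter values are placeholders, not the listeriosis model of the study, and the ANOVA/Sobol sensitivity step is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)

def dose_response(dose, r):
    """Exponential dose-response: probability of illness per serving."""
    return 1.0 - np.exp(-r * dose)

# Outer loop: uncertainty about parameters (dose-response parameter r and the
# mean log10 contamination) -- values are illustrative.
n_uncertainty, n_variability = 200, 5_000
risk_per_draw = np.empty(n_uncertainty)
for i in range(n_uncertainty):
    r = rng.lognormal(mean=np.log(1e-12), sigma=0.5)      # uncertain
    mu_log10 = rng.normal(2.0, 0.3)                        # uncertain
    # Inner loop: variability between servings (contamination level).
    log10_cfu = rng.normal(mu_log10, 1.0, n_variability)   # variable
    dose = 10.0 ** log10_cfu
    risk_per_draw[i] = dose_response(dose, r).mean()       # mean over variability

lo, med, hi = np.percentile(risk_per_draw, [2.5, 50, 97.5])
print(f"mean risk per serving: median {med:.2e}, 95% uncertainty [{lo:.2e}, {hi:.2e}]")
```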

  4. Technique for direct measurement of thermal conductivity of elastomers and a detailed uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Ralphs, Matthew I.; Smith, Barton L.; Roberts, Nicholas A.

    2016-11-01

    High thermal conductivity thermal interface materials (TIMs) are needed to extend the life and performance of electronic circuits. A stepped bar apparatus system has been shown to work well for thermal resistance measurements with rigid materials, but most TIMs are elastic. This work studies the uncertainty of using a stepped bar apparatus to measure the thermal resistance and a tensile/compression testing machine to estimate the compressed thickness of polydimethylsiloxane for a measurement of the thermal conductivity, k_eff. An a priori, zeroth-order analysis is used to estimate the random uncertainty from the instrumentation; a first-order analysis is used to estimate the statistical variation in samples; and an a posteriori, Nth-order analysis is used to provide an overall uncertainty on k_eff for this measurement method. Bias uncertainty in the thermocouples is found to be the largest single source of uncertainty. The a posteriori uncertainty of the proposed method is 6.5% relative uncertainty (68% confidence), but could be reduced through calibration and correlated biases in the temperature measurements.
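
    A minimal sketch of a first-order (root-sum-square) uncertainty combination for an effective conductivity of the form k_eff = Q*t/(A*dT), assuming uncorrelated inputs; the quantities and uncertainty values are illustrative, and the paper's stepped-bar data reduction and correlated-bias treatment may differ.

```python
import numpy as np

# Illustrative measured quantities (value, standard uncertainty).
Q, u_Q = 1.50, 0.02        # heat flow through the sample, W
t, u_t = 1.0e-3, 2.0e-5    # compressed sample thickness, m
A, u_A = 1.0e-4, 1.0e-6    # cross-sectional area, m^2
dT, u_dT = 5.0, 0.25       # temperature drop across the sample, K

k = Q * t / (A * dT)

# First-order (Taylor series) combination of relative uncertainties for a
# product/quotient model, assuming the inputs are uncorrelated.
rel = np.sqrt((u_Q / Q) ** 2 + (u_t / t) ** 2 + (u_A / A) ** 2 + (u_dT / dT) ** 2)
print(f"k_eff = {k:.3f} W/m-K, relative standard uncertainty = {100 * rel:.1f}%")
```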

  5. Uncertainty analysis for probabilistic pipe fracture evaluations in LBB applications

    SciTech Connect

    Rahman, S.; Ghadiali, N.; Wilkowski, G.

    1997-04-01

    During the NRC's Short Cracks in Piping and Piping Welds Program at Battelle, a probabilistic methodology was developed to conduct fracture evaluations of circumferentially cracked pipes for application to leak-rate detection. Later, in the IPIRG-2 program, several parameters that may affect leak-before-break and other pipe flaw evaluations were identified. This paper presents new results from several uncertainty analyses to evaluate the effects of normal operating stresses, normal plus safe-shutdown earthquake stresses, off-centered cracks, restraint of pressure-induced bending, and dynamic and cyclic loading rates on the conditional failure probability of piping systems in BWRs and PWRs. For each parameter, the sensitivity of the conditional probability of failure, and hence its importance to probabilistic leak-before-break evaluations, was determined.

  6. Modelling System Processes to Support Uncertainty Analysis and Robustness Evaluation

    NASA Technical Reports Server (NTRS)

    Blackwell, Charles; Cuzzi, Jeffrey (Technical Monitor)

    1996-01-01

    The use of advanced systems control techniques in the development of a dynamic system requires effective mathematical modelling. Historically, in some cases the use of a model which only reflects the "expected" or "nominal" information about the system's internal processes has resulted in acceptable system performance, but it should be recognized that for those cases success was due to a combination of the remarkable inherent potential of feedback control for robustness and fortuitously wide margins between system performance requirements and system performance capability. In the case of CELSS development, no such fortuitous combinations should be expected, and it should be expected that the uncertainty in the information on the system's processes will have to be taken into account in order to generate a performance-robust design. In this paper, we develop one perspective on the issue of providing robustness as mathematical modelling impacts it, and present some examples of model formats which serve the needed purpose.

  7. How does uncertainty shape patient experience in advanced illness? A secondary analysis of qualitative data

    PubMed Central

    Etkind, Simon Noah; Bristowe, Katherine; Bailey, Katharine; Selman, Lucy Ellen; Murtagh, Fliss EM

    2016-01-01

    Background: Uncertainty is common in advanced illness but is infrequently studied in this context. If poorly addressed, uncertainty can lead to adverse patient outcomes. Aim: We aimed to understand patient experiences of uncertainty in advanced illness and develop a typology of patients’ responses and preferences to inform practice. Design: Secondary analysis of qualitative interview transcripts. Studies were assessed for inclusion and interviews were sampled using maximum-variation sampling. Analysis used a thematic approach with 10% of coding cross-checked to enhance reliability. Setting/participants: Qualitative interviews from six studies including patients with heart failure, chronic obstructive pulmonary disease, renal disease, cancer and liver failure. Results: A total of 30 transcripts were analysed. Median age was 75 years (range, 43–95), and 12 patients were women. The impact of uncertainty was frequently discussed: the main related themes were engagement with illness, information needs, patient priorities and the period of time that patients mainly focused their attention on (temporal focus). A typology of patient responses to uncertainty was developed from these themes. Conclusion: Uncertainty influences patient experience in advanced illness through affecting patients’ information needs, preferences and future priorities for care. Our typology aids understanding of how patients with advanced illness respond to uncertainty. Assessment of these three factors may be a useful starting point to guide clinical assessment and shared decision making. PMID:27129679

  8. Status of XSUSA for Sampling Based Nuclear Data Uncertainty and Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Zwermann, W.; Gallner, L.; Klein, M.; Krzykacz-Hausmann, B.; Pasichnyk, I.; Pautz, A.; Velkov, K.

    2013-03-01

    In the present contribution, an overview of the sampling-based XSUSA method for sensitivity and uncertainty analysis with respect to nuclear data is given. The focus is on recent developments and applications of XSUSA. These applications include calculations for critical assemblies, fuel assembly depletion calculations, and steady-state as well as transient reactor core calculations. The analyses are partially performed in the framework of international benchmark working groups (UACSA - Uncertainty Analyses for Criticality Safety Assessment, UAM - Uncertainty Analysis in Modelling). It is demonstrated that, particularly for full-scale reactor calculations, the influence of the nuclear data uncertainties on the results can be substantial. For instance, for the radial fission rate distributions of mixed UO2/MOX light water reactor cores, the 2σ uncertainties in the core centre and periphery can reach values exceeding 10%. For a fast transient, the resulting time behaviour of the reactor power was covered by a wide uncertainty band. Overall, the results confirm the necessity of adding systematic uncertainty analyses to best-estimate reactor calculations.

  9. Uncertainty analysis of statistical downscaling models using general circulation model over an international wetland

    NASA Astrophysics Data System (ADS)

    Etemadi, H.; Samadi, S.; Sharifikia, M.

    2014-06-01

    The regression-based statistical downscaling model (SDSM) is an appropriate method that is broadly used to resolve the coarse spatial resolution of general circulation models (GCMs). Nevertheless, the assessment of uncertainty propagation linked with climatic variables is essential to any climate change impact study. This study presents a procedure to characterize the uncertainty of two GCM models linked with the Long Ashton Research Station Weather Generator (LARS-WG) and SDSM in one of the most vulnerable international wetlands, namely "Shadegan", in an arid region of Southwest Iran. In the case of daily temperature, uncertainty is estimated by comparing the monthly means and variances of downscaled and observed daily data at a 95% confidence level. Uncertainties were then evaluated by comparing monthly mean dry and wet spell lengths and their 95% CI in daily precipitation downscaling using the 1987-2005 interval. The uncertainty results indicated that LARS-WG is the most proficient model at reproducing various statistical characteristics of the observed data at the 95% uncertainty bounds, while the SDSM model is the least capable in this respect. The uncertainty analyses at three different climate stations produced significantly different climate change responses at the 95% CI. Finally, the range of plausible climate change projections suggests a need for decision makers to augment their long-term wetland management plans to reduce the wetland's vulnerability to climate change impacts.

  10. The IAEA Coordinated Research Program on HTGR Uncertainty Analysis: Phase I Status and Initial Results

    SciTech Connect

    Strydom, Gerhard; Bostelmann, Friederike; Ivanov, Kostadin

    2014-10-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. One way to address the uncertainties in the HTGR analysis tools is to assess the sensitivity of critical parameters (such as the calculated maximum fuel temperature during loss of coolant accidents) to a few important input uncertainties. The input parameters were identified by engineering judgement in the past but are today typically based on a Phenomena Identification Ranking Table (PIRT) process. The input parameters can also be derived from sensitivity studies and are then varied in the analysis to find a spread in the parameter of importance. However, there is often no easy way to compensate for these uncertainties. In engineering system design, a common approach for addressing performance uncertainties is to add compensating margins to the system, but with passive properties credited it is not so clear how to apply it in the case of modular HTGR heat removal path. Other more sophisticated uncertainty modelling approaches, including Monte Carlo analysis, have also been proposed and applied. Ideally one wishes to apply a more fundamental approach to determine the predictive capability and accuracies of coupled neutronics/thermal-hydraulics and depletion simulations used for reactor design and safety assessment. Today there is a broader acceptance of the use of uncertainty analysis even in safety studies, and it has been accepted by regulators in some cases to replace the traditional conservative analysis. Therefore some safety analysis calculations may use a mixture of these approaches for different parameters depending upon the particular requirements of the analysis problem involved. Sensitivity analysis can for example be used to provide information as part of an uncertainty analysis to determine best estimate plus uncertainty results to the

  11. Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model

    SciTech Connect

    Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.

    2001-11-09

    Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.

  12. Factors Affecting Pollutant Load Reduction with Uncertainty Analysis in Urban Stormwater BMP Systems

    NASA Astrophysics Data System (ADS)

    Park, D.

    2015-12-01

    This study incorporates uncertainty analysis into a model of the performance of stormwater best management practices (BMPs) to characterize the uncertainty in stormwater BMP effluent load that results from uncertainty in the BMP performance modeling in an urban stormwater system. Detention basins are used as BMPs in the urban stormwater systems, and total suspended solids (TSS) are used as an urban nonpoint source pollutant in Los Angeles, CA. The k-C* model, which incorporates uncertainty analysis, is applied to the uncertainty of the stormwater effluent concentration in urban stormwater systems. This study presents a frequency analysis of the runoff volume and BMP overflows to characterize the uncertainty of BMP effluent loads, and the load frequency curve (LFC) is simulated with and without BMP conditions and verified using the observed TSS load. Finally, the effects of imperviousness, BMP volume, and BMP surface area are investigated using a reliability analysis. The results of this study can be used to determine the appropriate BMP size to achieve a specific watershed runoff pollutant load. The result of this evaluation method can support the adequate sizing of a BMP to meet the defined nonpoint source pollutant regulations. Acknowledgments: This research was supported by a grant (14AWMP-B082564-01) from the Advanced Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
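
    A minimal sketch of propagating input uncertainty through a k-C* style first-order areal removal model by Monte Carlo; the model form (effluent approaching a background concentration C* with areal rate constant k and hydraulic loading q) and all distributions below are assumptions for illustration, not the calibrated Los Angeles values.

```python
import numpy as np

rng = np.random.default_rng(3)

def k_c_star(c_in, k, c_star, q):
    """First-order areal removal (k-C*) form: effluent concentration for an
    influent c_in (mg/L), areal rate constant k (m/yr), background c_star
    (mg/L), and hydraulic loading rate q (m/yr)."""
    return c_star + (c_in - c_star) * np.exp(-k / q)

# Illustrative inputs with assumed uncertainty (not calibrated values).
n = 20_000
c_in = rng.lognormal(np.log(120.0), 0.3, n)   # influent TSS, mg/L
k = rng.lognormal(np.log(3000.0), 0.4, n)     # areal rate constant, m/yr
c_star = rng.uniform(5.0, 15.0, n)            # background concentration, mg/L
q = rng.lognormal(np.log(400.0), 0.2, n)      # hydraulic loading rate, m/yr

c_out = k_c_star(c_in, k, c_star, q)
lo, med, hi = np.percentile(c_out, [5, 50, 95])
print(f"effluent TSS: median {med:.1f} mg/L, 90% interval [{lo:.1f}, {hi:.1f}] mg/L")
```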

  13. Information Theory and the Analysis of Uncertainties in a Spatial Geological Context

    NASA Astrophysics Data System (ADS)

    Wellmann, Florian; Jessell, Mark

    2014-05-01

    The interpretation of uncertainties in a spatial context is of fundamental importance for the generation of structural geological models; this applies to models for mineral exploration, to scientific structural geological studies and fundamental geological evaluations. With our work, we are addressing uncertainties in this spatial geological context. Encouraged by the interdisciplinary and interactive aspect of the session, we would like to present our method to other branches of geosciences. Structural geological models, here understood as structural representations of the dominant geological units in the subsurface, always contain uncertainties. The analysis of these uncertainties is intricate as these models are usually constructed on the basis of greatly varying data quality and spatial distribution. An additional complication is that, in most cases, the general distribution of uncertainties in space is of interest, and not a single outcome as, for example, the flow at a well. In the context of structural geological uncertainties, we therefore face two problems: (i) how can we estimate uncertainties in a complex 3-D geological model, and (ii) what is a meaningful measure to visualise and analyse these uncertainties quantitatively? In recent years, several approaches have been developed to solve the first problem. We show here an approach based on implicit stochastic geological modelling techniques, capable of handling complex geological settings. To address the second problem, we apply measures from information theory. We consider each subspace in a discretised model domain as a random variable. Based on the probability functions estimated from a suite of generated models, we evaluate the information entropy at each location in the subsurface as a measure of uncertainty. We subsequently estimate multivariate conditional entropy and mutual information between a set of locations and other regions in space, to determine spatial uncertainty correlations, and the
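
    A minimal sketch of the entropy measure described above: given an ensemble of discretized geological models, estimate per-cell unit probabilities and compute the Shannon entropy at each location as the uncertainty measure. The synthetic ensemble below is generated purely for illustration and does not come from a stochastic geological modelling workflow.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic ensemble of discretized models: each cell holds a unit index.
# Per-cell unit probabilities are drawn so some cells are near-certain and
# others ambiguous, then the ensemble is sampled from those probabilities.
n_models, n_cells, n_units = 100, 200, 3
p_true = rng.dirichlet(alpha=(0.5, 0.5, 0.5), size=n_cells)
ensemble = np.array([[rng.choice(n_units, p=p_true[c]) for c in range(n_cells)]
                     for _ in range(n_models)])

# Estimate per-cell unit probabilities from the ensemble, then Shannon entropy.
counts = np.stack([(ensemble == u).sum(axis=0) for u in range(n_units)])
p = counts / n_models
entropy = -np.sum(p * np.log2(np.where(p > 0, p, 1.0)), axis=0)  # bits per cell

print("mean cell entropy (bits):", entropy.mean().round(3))
print("fraction of cells with entropy > 0.5 bit:", (entropy > 0.5).mean().round(3))
```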

  14. Analysis of Model Uncertainties to Support Risk-Based Decisions Regarding Groundwater Contamination

    NASA Astrophysics Data System (ADS)

    Birdsell, K. H.; Vesselinov, V. V.; Davis, P.; Hollis, D.; Newman, B. D.; Echohawk, J. C.

    2005-12-01

    Model simulations are widely used in environmental management decision processes. However, there are various sources of uncertainty that commonly impact the model results. Consequently, it is crucial to account for all the possible model uncertainties that impact the model results so that they are adequately considered in the management decision process. Here we discuss an uncertainty analysis of model simulations related to a contamination site located within Los Alamos National Laboratory, NM. We describe how uncertainties are quantified and propagated through a series of coupled groundwater models and then used in a risk-based decision analysis to identify and rank alternative actions to protect the environment and water users from potential impacts of groundwater contamination from former liquid-effluent discharges. Uncertainties in the contaminant source, infiltration distribution, and transport through the unsaturated and saturated zones are analyzed using a series of alternative conceptual models and stochastic model parameters. Alternative conceptual models and uncertain model parameters are defined to encompass a large range of possible uncertainties associated with potential groundwater flow and transport based on existing data and expert knowledge about the system. In all, eight alternative conceptual models using 38 uncertain parameters were analyzed. For each conceptual model and related stochastic parameter realization, we simulate contaminant transport from the contaminant outfall to water-supply wells over the next 1000 years. Based on the simulated contaminant concentrations in the groundwater pumped by water-supply wells, we evaluate health risk for the receptors. Based on the model results, sensitivity analysis is applied to identify the parameters and conceptual model elements causing high concentrations at the water-supply wells. Decision analysis is applied to define the optimal course(s) of action, which may include clean-up, stabilization

  15. Sensitivity and Uncertainty Analysis in Chemical Mechanisms for Air Quality Modeling

    NASA Astrophysics Data System (ADS)

    Gao, Dongfen

    1995-01-01

    Ambient ozone in urban and regional air pollution is a serious environmental problem. Air quality models can be used to predict ozone concentrations and explore control strategies. One important component of such air quality models is a chemical mechanism. Sensitivity and uncertainty analysis play an important role in the evaluation of the performance of air quality models. The uncertainties associated with the RADM2 chemical mechanism in predicted concentrations of O3, HCHO, H2O2, PAN, and HNO3 were estimated. Monte Carlo simulations with Latin Hypercube Sampling were used to estimate the overall uncertainties in concentrations of species of interest, due to uncertainties in chemical parameters. The parameters that were treated as random variables were identified through first-order sensitivity and uncertainty analyses. Recent estimates of uncertainties in rate parameters and product yields were used. The results showed the relative uncertainties in ozone predictions are +/-23-50% (1-sigma relative to the mean) in urban cases, and less than +/-20% in rural cases. Uncertainties in HNO3 concentrations are the smallest, followed by HCHO, O3 and PAN. Predicted H2O2 concentrations have the highest uncertainties. Uncertainties in the differences of peak ozone concentrations between base and control cases were also studied. The results show that the uncertainties in the fractional reductions in ozone concentrations were 9-12% with NOx control at an ROG/NOx ratio of 24:1 and 11-33% with ROG control at an ROG/NOx ratio of 6:1. Linear regression analysis of the Monte Carlo results showed that uncertainties in rate parameters for the formation of HNO3, for the reaction of HCHO + hv to 2HO2 + CO, for PAN chemistry, and for the photolysis of NO2 are most influential to ozone concentrations and differences of ozone. The parameters that are important to ozone concentrations also tend to be relatively influential to other key species
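
    A small sketch of Latin Hypercube Sampling as it might be used to draw uncertain mechanism parameters; the implementation and the parameter ranges below are illustrative placeholders, not the RADM2 rate-constant or product-yield uncertainties analyzed in the study.

```python
import numpy as np

rng = np.random.default_rng(5)

def latin_hypercube(n_samples, n_params, rng):
    """Stratified uniform(0,1) design: exactly one sample per stratum for each
    parameter, with the strata assigned in an independent random order per column."""
    u = np.empty((n_samples, n_params))
    for j in range(n_params):
        strata = rng.permutation(n_samples)
        u[:, j] = (strata + rng.random(n_samples)) / n_samples
    return u

# Illustrative uncertain-parameter ranges (placeholder lower/upper bounds).
low = np.array([1.0e-12, 6.0e-3, 0.15])
high = np.array([3.0e-12, 1.2e-2, 0.35])

design = low + latin_hypercube(50, 3, rng) * (high - low)
print("min of each sampled parameter:", design.min(axis=0))
print("max of each sampled parameter:", design.max(axis=0))
```

    Each column covers its full range with one point per stratum, which is what lets a modest number of mechanism runs stand in for a much larger simple random sample.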

  16. Influences of system uncertainties on the numerical transfer path analysis of engine systems

    NASA Astrophysics Data System (ADS)

    Acri, A.; Nijman, E.; Acri, A.; Offner, G.

    2017-10-01

    Practical mechanical systems operate with some degree of uncertainty. In numerical models uncertainties can result from poorly known or variable parameters, from geometrical approximation, from discretization or numerical errors, from uncertain inputs or from rapidly changing forcing that can be best described in a stochastic framework. Recently, random matrix theory was introduced to take parameter uncertainties into account in numerical modeling problems. In particular in this paper, Wishart random matrix theory is applied on a multi-body dynamic system to generate random variations of the properties of system components. Multi-body dynamics is a powerful numerical tool largely implemented during the design of new engines. In this paper the influence of model parameter variability on the results obtained from the multi-body simulation of engine dynamics is investigated. The aim is to define a methodology to properly assess and rank system sources when dealing with uncertainties. Particular attention is paid to the influence of these uncertainties on the analysis and the assessment of the different engine vibration sources. Examples of the effects of different levels of uncertainties are illustrated by means of examples using a representative numerical powertrain model. A numerical transfer path analysis, based on system dynamic substructuring, is used to derive and assess the internal engine vibration sources. The results obtained from this analysis are used to derive correlations between parameter uncertainties and statistical distribution of results. The derived statistical information can be used to advance the knowledge of the multi-body analysis and the assessment of system sources when uncertainties in model parameters are considered.

  17. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex, Hydrogeologic Systems

    NASA Astrophysics Data System (ADS)

    Wolfsberg, A.; Kang, Q.; Li, C.; Ruskauff, G.; Bhark, E.; Freeman, E.; Prothro, L.; Drellack, S.

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  18. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    SciTech Connect

    Sig Drellack, Lance Prothro

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  19. Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.

    2013-12-01

    Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to their improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a detrimental source for human exposures to potential carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies to better understand the process and guide field investigation. While multiple analytical and numerical models were developed to simulate the vapor intrusion process, detailed validation of these models against well controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and soil gas flux and indoor air concentration measurement. In this work, we present an effort to validate a three-dimensional vapor intrusion model based on a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration based on the most uncertain input parameters.

  20. Uncertainty Analysis of Historic Water Resources Availability in Africa

    NASA Astrophysics Data System (ADS)

    McNally, A.; Arsenault, K. R.; Narapusetty, B.; Peters-Lidard, C. D.

    2014-12-01

    Seeing how current agrometeorological conditions measure up to historic events helps analysts and decision-makers judge the potential impact that anomalous rainfall and temperatures will have on the availability and accessibility of food and water resources. We present results from the Famine Early Warning Systems Network (FEWS NET) Land Data Assimilation System (FLDAS), which is used to produce multi-model and rainfall ensembles of the water balance over semi-arid Africa from 1982-2014. The ensemble approach allows us to assess confidence in our estimates, which is critical given that food- and water-insecure regions in Africa are data-poor and are characterized by complex interactions and feedbacks that cause deterministic hydrologic modeling approaches to fall short. We then use the ensemble of water balance estimates to calculate drought severity (derived from modeled soil moisture) and the Water Requirement Satisfaction Index (a function of atmospheric water demand). We compare these indices to the GIMMS 30-year vegetation data product from AVHRR and the ESA ECV 30-year microwave soil moisture. These historical time series (with confidence bounds) allow us to improve our quantitative understanding of drought thresholds, to explore sources of parameter and model uncertainty, and to better contextualize current operational drought monitoring efforts in Africa.

  1. A global water supply reservoir yield model with uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Kuria, Faith W.; Vogel, Richard M.

    2014-09-01

    Understanding the reliability and uncertainty associated with water supply yields derived from surface water reservoirs is central for planning purposes. Using a global dataset of monthly river discharge, we introduce a generalized model for estimating the mean and variance of the water supply yield, Y, expected from a reservoir for a prespecified reliability, R, and storage capacity, S, assuming a flow record of length n. The generalized storage-reliability-yield (SRY) relationships reported here have numerous water resource applications ranging from preliminary water supply investigations to economic and climate change impact assessments. An example indicates how our generalized SRY relationship can be combined with a hydroclimatic model to determine the impact of climate change on surface reservoir water supply yields. We also document that the variability of estimates of water supply yield is invariant to characteristics of the reservoir system, including its storage capacity and reliability. Standardized metrics of the variability of water supply yields are shown to depend only on the sample size of the inflows and the statistical characteristics of the inflow series.
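
    A minimal sketch of how a single storage-reliability-yield point can be computed by simulating a simple monthly reservoir mass balance; the synthetic inflows, storage value, and yield fractions are placeholders, not the global discharge dataset or the generalized SRY model of the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

def reliability(inflow, storage_capacity, yield_demand):
    """Simulate a simple monthly reservoir mass balance (start full) and return
    the fraction of months in which the full yield can be delivered."""
    s, failures = storage_capacity, 0
    for q in inflow:
        s = min(s + q - yield_demand, storage_capacity)
        if s < 0:
            failures += 1
            s = 0.0
    return 1.0 - failures / len(inflow)

# Illustrative synthetic monthly inflows (lognormal), in volume units (50 years).
inflow = rng.lognormal(mean=np.log(100.0), sigma=0.6, size=600)

# Sweep yields for a fixed storage to trace part of a storage-reliability-yield curve.
S = 300.0
for alpha in (0.4, 0.6, 0.8):          # yield as a fraction of the mean inflow
    y = alpha * inflow.mean()
    print(f"yield = {y:6.1f} ({alpha:.0%} of mean inflow): "
          f"reliability = {reliability(inflow, S, y):.3f}")
```

    Repeating this calculation over resampled inflow records of length n is one simple way to see the sampling variability in yield estimates that the paper standardizes.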

  2. Reducing capture zone uncertainty with a systematic sensitivity analysis.

    PubMed

    Esling, Steven P; Keller, John E; Miller, Kenneth J

    2008-01-01

    The U.S. Environmental Protection Agency has established several methods to delineate wellhead protection areas (WHPAs) around community wells in order to protect them from surface contamination sources. Delineating a WHPA often requires defining the capture zone for a well. Generally, analytical models or arbitrary setback zones have been used to define the capture zone in areas where little is known about the distribution of hydraulic head, hydraulic conductivity, or recharge. Numerical modeling, however, even in areas of sparse data, offers distinct advantages over the more simplified analytical models or arbitrary setback zones. The systematic approach discussed here calibrates a numerical flow model to regional topography and then applies a matrix of plausible recharge to hydraulic conductivity ratios (R/K) to investigate the impact on the size and shape of the capture zone. This approach does not attempt to determine the uncertainty of the model but instead yields several possible capture zones, the composite of which is likely to contain the actual capture zone. A WHPA based on this composite capture zone will protect ground water resources better than one based on any individual capture zone. An application of the method to three communities illustrates development of the R/K matrix and demonstrates that the method is particularly well suited for determining capture zones in alluvial aquifers.

  3. Probabilistic uncertainty analysis of epidemiological modeling to guide public health intervention policy.

    PubMed

    Gilbert, Jennifer A; Meyers, Lauren Ancel; Galvani, Alison P; Townsend, Jeffrey P

    2014-03-01

    Mathematical modeling of disease transmission has provided quantitative predictions for health policy, facilitating the evaluation of epidemiological outcomes and the cost-effectiveness of interventions. However, typical sensitivity analyses of deterministic dynamic infectious disease models focus on model architecture and the relative importance of parameters but neglect parameter uncertainty when reporting model predictions. Consequently, model results that identify point estimates of intervention levels necessary to terminate transmission yield limited insight into the probability of success. We apply probabilistic uncertainty analysis to a dynamic model of influenza transmission and assess global uncertainty in outcome. We illustrate that when parameter uncertainty is not incorporated into outcome estimates, levels of vaccination and treatment predicted to prevent an influenza epidemic will only have an approximately 50% chance of terminating transmission and that sensitivity analysis alone is not sufficient to obtain this information. We demonstrate that accounting for parameter uncertainty yields probabilities of epidemiological outcomes based on the degree to which data support the range of model predictions. Unlike typical sensitivity analyses of dynamic models that only address variation in parameters, the probabilistic uncertainty analysis described here enables modelers to convey the robustness of their predictions to policy makers, extending the power of epidemiological modeling to improve public health. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
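
    A minimal sketch of the central point above, namely that an intervention level sized from point estimates can have only a modest probability of success once parameter uncertainty is sampled. The simple threshold model R_eff = R0*(1 - efficacy*coverage) and the distributions below are illustrative assumptions, not the influenza transmission model or calibrated parameters of the study.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative parameter uncertainty (placeholder distributions).
n = 100_000
R0 = rng.normal(1.8, 0.3, n).clip(min=1.01)      # basic reproduction number
efficacy = rng.beta(20, 5, n)                     # vaccine efficacy

def prob_transmission_terminated(coverage):
    """Probability that R_eff = R0 * (1 - efficacy * coverage) falls below 1
    across the sampled parameter uncertainty."""
    r_eff = R0 * (1.0 - efficacy * coverage)
    return float((r_eff < 1.0).mean())

# Critical coverage computed from the mean parameters (the "point estimate").
point_coverage = (1.0 - 1.0 / R0.mean()) / efficacy.mean()
print(f"point-estimate critical coverage: {point_coverage:.2f}")
for c in (point_coverage, 0.7, 0.9):
    print(f"coverage {c:.2f}: P(transmission terminated) = "
          f"{prob_transmission_terminated(c):.2f}")
```

    With these assumed distributions the point-estimate coverage succeeds only about half the time, which mirrors the qualitative message of the abstract without reproducing its model.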

  4. An uncertainty analysis of mean flow velocity measurements used to quantify emissions from stationary sources.

    PubMed

    Bryant, Rodney; Sanni, Olatunde; Moore, Elizabeth; Bundy, Matthew; Johnson, Aaron

    2014-06-01

    Point velocity measurements conducted by traversing a Pitot tube across the cross section of a flow conduit continue to be the standard practice for evaluating the accuracy of continuous flow-monitoring devices. Such velocity traverses were conducted in the exhaust duct of a reduced-scale analog of a stationary source, and mean flow velocity was computed using several common integration techniques. Sources of random and systematic measurement uncertainty were identified and applied in the uncertainty analysis. When applicable, the minimum requirements of the standard test methods were used to estimate measurement uncertainty due to random sources. Estimates of the systematic measurement uncertainty due to discretized measurements of the asymmetric flow field were determined by simulating point velocity traverse measurements in a flow distribution generated using computational fluid dynamics. For the evaluated flow system, estimates of relative expanded uncertainty for the mean flow velocity ranged from +/- 1.4% to +/- 9.3% and depended on the number of measurement locations and the method of integration. Accurate flow measurements in smokestacks are critical for quantifying the levels of greenhouse gas emissions from fossil-fuel-burning power plants, the largest emitters of carbon dioxide. A systematic uncertainty analysis is necessary to evaluate the accuracy of these measurements. This study demonstrates such an analysis and its application to identify specific measurement components and procedures needing focused attention to improve the accuracy of mean flow velocity measurements in smokestacks.

  5. Impact of model, methodological, and parameter uncertainty in the economic analysis of vaccination programs.

    PubMed

    Brisson, M; Edmunds, W J

    2006-01-01

    Guidelines for economic evaluations insist that the sensitivity of model results to alternative parameter values should be thoroughly explored. However, differences in model construction and analytical choices (such as the choice of a cost-effectiveness or cost-benefit framework) also introduce uncertainty in results, though these are rarely subjected to a thorough sensitivity analysis. In this article, the authors quantify the effect of model, methodological, and parameter uncertainty, taking varicella vaccination as an example. They used 3 different models (a static model, a dynamic model that only looks at the effect of vaccination on varicella, and a dynamic model that also assesses the implications of vaccination for zoster epidemiology) and 2 forms of analysis (cost-benefit and cost-utility). They also varied the discount rate and time frame of analysis. Probabilistic sensitivity analyses were performed to estimate the impact of parameter uncertainty. In their example, model and methodological choice had a profound effect on estimated cost-effectiveness, but parameter uncertainty played a relatively minor role. Under cost-utility analysis, the probabilistic sensitivity analysis suggested that there was a near certainty that vaccination dominates no vaccination, or the other way around, depending on model choice and perspective. Under cost-benefit analysis, vaccination always appeared to be attractive. Thus, the authors clearly show that model and methodological assumptions can have greater impact on results than parameter estimates, although sensitivity analyses are rarely performed on these sources of uncertainty.

  6. Unscented transform-based uncertainty analysis of rotating coil transducers for field mapping

    NASA Astrophysics Data System (ADS)

    Arpaia, P.; De Matteis, E.; Schiano Lo Moriello, R.

    2016-03-01

    The uncertainty of a rotating coil transducer for magnetic field mapping is analyzed. The unscented transform and statistical design of experiments are combined to determine the magnetic field expectation, the standard uncertainty, and the separate contributions of the uncertainty sources. For nonlinear measurement models, the unscented transform-based approach is more error-proof than the linearization underlying the "Guide to the Expression of Uncertainty in Measurement" (GUM), owing to the absence of model approximations and derivative computations. When the GUM assumptions are not met, the deterministic sampling strategy strongly reduces the computational burden with respect to the Monte Carlo-based methods proposed by Supplement 1 of the GUM. Furthermore, the design of experiments and the associated statistical analysis allow the domain of the uncertainty sources to be explored efficiently, as well as their significance and individual contributions to be assessed for an effective setup configuration. A straightforward experimental case study highlights that a one-order-of-magnitude reduction in the relative uncertainty of the coil area produces a decrease in the uncertainty of the field mapping transducer by a factor of 25 with respect to the worst condition. Moreover, about 700 trials and the related processing achieve results corresponding to 5 × 10^6 brute-force Monte Carlo simulations.
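
    A minimal sketch of the unscented transform applied to a nonlinear measurement model and checked against brute-force Monte Carlo; the classic Julier-Uhlmann (unscaled) form is used here, and the flux/area measurement model and uncertainty values are placeholders, not the paper's transducer model.

```python
import numpy as np

def unscented_transform(mean, cov, f, kappa=None):
    """Propagate (mean, cov) through a nonlinear function f using the classic
    unscented transform (2n+1 sigma points); kappa defaults to 3 - n."""
    n = len(mean)
    if kappa is None:
        kappa = 3.0 - n
    L = np.linalg.cholesky((n + kappa) * cov)
    sigma = np.vstack([mean, mean + L.T, mean - L.T])   # rows are sigma points
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    y = np.array([f(s) for s in sigma])
    y_mean = w @ y
    d = y - y_mean
    y_cov = (w[:, None] * d).T @ d
    return y_mean, y_cov

# Illustrative nonlinear model: field estimate = measured flux / coil area.
f = lambda x: np.array([x[0] / x[1]])
mean = np.array([1.0e-3, 1.0e-2])                       # flux (Wb), area (m^2)
cov = np.diag([(2.0e-5) ** 2, (1.0e-4) ** 2])           # assumed uncertainties

m, c = unscented_transform(mean, cov, f)
print("UT mean:", m, "UT std:", np.sqrt(np.diag(c)))

# Brute-force Monte Carlo for comparison.
rng = np.random.default_rng(8)
xs = rng.multivariate_normal(mean, cov, 200_000)
ys = xs[:, 0] / xs[:, 1]
print("MC mean:", ys.mean(), "MC std:", ys.std())
```

    The five deterministic sigma points reproduce the Monte Carlo mean and standard deviation closely for this mildly nonlinear model, which is the efficiency argument the abstract makes.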

  7. Uncertainty analysis of multi-rate kinetics of uranium desorption from sediments

    SciTech Connect

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.; Zhang, Guannan

    2014-01-01

    A multi-rate expression for uranyl [U(VI)] surface complexation reactions has been proposed to describe diffusion-limited U(VI) sorption/desorption in heterogeneous subsurface sediments. An important assumption in the rate expression is that its rate constants follow a certain type of probability distribution. In this paper, a Bayes-based Differential Evolution Markov Chain method was used to assess the distribution assumption and to analyze parameter and model structure uncertainties. U(VI) desorption from a contaminated sediment at the US Hanford 300 Area, Washington, was used as an example for detailed analysis. The results indicated that: 1) the rate constants in the multi-rate expression contain uneven uncertainties, with slower rate constants having relatively larger uncertainties; 2) the lognormal distribution is an effective assumption for the rate constants in the multi-rate model to simulate U(VI) desorption; 3) however, long-term prediction and its uncertainty may be significantly biased by the lognormal assumption for the smaller rate constants; and 4) both parameter and model structure uncertainties can affect the extrapolation of the multi-rate model, with a larger uncertainty arising from the model structure. The results provide important insights into the factors contributing to the uncertainties of the multi-rate expression commonly used to describe the diffusion- or mixing-limited sorption/desorption of both organic and inorganic contaminants in subsurface sediments.
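
    A minimal sketch of a multi-rate first-order desorption curve whose site-specific rate constants follow a lognormal distribution, with the lognormal parameters themselves treated as uncertain and propagated by Monte Carlo; all distributions and values are illustrative, not the Hanford 300 Area estimates or the Bayesian posterior of the study.

```python
import numpy as np

rng = np.random.default_rng(9)

def fraction_remaining(t, mu_ln, sigma_ln, n_sites=50):
    """Sorbed fraction remaining for a multi-rate first-order model whose rate
    constants follow a lognormal distribution, discretized on a grid of the
    underlying normal variable with density weights."""
    z = np.linspace(-3.0, 3.0, n_sites)
    k = np.exp(mu_ln + sigma_ln * z)        # site-specific rate constants (1/h)
    w = np.exp(-0.5 * z ** 2)
    w /= w.sum()
    return np.exp(-np.outer(t, k)) @ w

t = np.array([0.0, 10.0, 50.0, 100.0, 250.0, 500.0])   # hours (illustrative)

# Parameter uncertainty: the lognormal parameters are themselves uncertain.
curves = []
for _ in range(500):
    mu_ln = rng.normal(np.log(1e-2), 0.4)
    sigma_ln = rng.uniform(1.0, 2.5)
    curves.append(fraction_remaining(t, mu_ln, sigma_ln))
curves = np.array(curves)

for ti, lo, med, hi in zip(t, *np.percentile(curves, [5, 50, 95], axis=0)):
    print(f"t = {ti:5.0f} h: remaining fraction {med:.2f} [{lo:.2f}, {hi:.2f}]")
```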

  8. Uncertainty analysis of the standard delay-and-sum beamformer and array calibration

    NASA Astrophysics Data System (ADS)

    Yardibi, T.; Bahr, C.; Zawodny, N.; Liu, F.; Cattafesta, L. N., III; Li, J.

    2010-06-01

    Beamforming has become a ubiquitous task in aeroacoustic noise measurements for source localization and power estimation. The standard delay-and-sum (DAS) beamformer is the most commonly used beamforming algorithm due to its simplicity and robustness and also serves as the basis for more sophisticated algorithms, such as the deconvolution approach for the mapping of acoustic sources (DAMAS). The DAS data reduction equation is a function of many parameters including the microphone locations, microphone transfer functions, temperature and the cross-spectral matrix (CSM), where each one of these parameters has a unique uncertainty associated with it. This paper provides a systematic uncertainty analysis of the DAS beamformer and Dougherty's widely used calibration procedure under the assumption that the underlying mathematical model of incoherent, monopole sources is correct. An analytical multivariate method based on a first-order Taylor series expansion and a numerical Monte-Carlo method based on assumed uncertainty distributions for the input variables are considered. The uncertainty of calibration is analyzed using the Monte-Carlo method, whereas the uncertainty of the DAS beamformer is analyzed using both the complex multivariate and the Monte-Carlo methods. It is shown that the multivariate uncertainty analysis method fails when the perturbations are relatively large and/or the output distribution is non-Gaussian, and therefore the Monte-Carlo analysis should be used in the general case. The calibration procedure is shown to greatly reduce the uncertainties in the DAS power estimates. In particular, 95 percent confidence intervals for the DAS power estimates are presented with simulated data for various scenarios. Moreover, the 95 percent confidence intervals for the integrated DAS levels at different frequencies are computed using experimental data. It is shown that with experimental data, the 95 percent confidence intervals for the integrated power levels are
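
    A minimal frequency-domain delay-and-sum sketch under the monopole-source assumption: a cross-spectral matrix is formed from simulated microphone pressures and scanned with steering vectors that compensate for delay and spherical spreading. The array geometry, frequency, and source location are invented for illustration, not the setup of the paper:

      import numpy as np

      c = 343.0                      # speed of sound [m/s]
      f = 4000.0                     # analysis frequency [Hz]
      k = 2 * np.pi * f / c

      mics = np.stack([np.linspace(-0.5, 0.5, 16), np.zeros(16), np.zeros(16)], axis=1)
      src = np.array([0.1, 0.0, 1.0])                 # true source location
      r_src = np.linalg.norm(mics - src, axis=1)
      p = np.exp(-1j * k * r_src) / r_src             # simulated microphone pressures
      C = np.outer(p, p.conj())                       # cross-spectral matrix (one snapshot)

      # Scan a line of steering points and form the DAS power map.
      xs = np.linspace(-0.5, 0.5, 101)
      power = np.empty_like(xs)
      for i, x in enumerate(xs):
          r = np.linalg.norm(mics - np.array([x, 0.0, 1.0]), axis=1)
          e = r * np.exp(-1j * k * r)                 # steering vector (undoes spreading/delay)
          power[i] = np.real(e.conj() @ C @ e) / len(mics) ** 2

      print(f"peak at x = {xs[np.argmax(power)]:.2f} m")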

  9. Analysis of uncertainties in CRAC2 calculations: the inhalation pathway

    SciTech Connect

    Killough, G.G.; Dunning, D.E. Jr.

    1984-01-01

    CRAC2 is a computer code for estimating the health effects and economic costs that might result from a release of radioactivity from a nuclear reactor to the environment. This paper describes tests of sensitivity of the predicted health effects to uncertainties in parameters associated with inhalation of the released radionuclides. These parameters are the particle size of the carrier aerosol and, for each element in the release, the clearance parameters for the lung model on which the code's dose conversion factors for inhalation are based. CRAC2 uses hourly meteorological data and a straight-line Gaussian plume model to predict the transport of airborne radioactivity; it includes models for plume depletion and population evacuation, and data for the distributions of population and land use. The code can compute results for single weather sequences, or it can perform random sampling of weather sequences from the meteorological data file and compute results for each weather sequence in the sample. For the work described in this paper, we concentrated on three fixed weather sequences that represent a range of conditions. For each fixed weather sequence, we applied random sampling to joint distributions of the inhalation parameters in order to estimate the sensitivity of the predicted health effects. All sampling runs produced coefficients of variation that were less than 50%, but some differences of means between weather sequences were substantial, as were some differences between means and the corresponding CRAC2 results without random sampling. Early injuries showed differences of as much as 1 to 2 orders of magnitude, while the differences in early fatalities were less than a factor of 2. Latent cancer fatalities varied by less than 10%. 19 references, 6 figures, 3 tables.

  10. Uncertainties in Cancer Risk Coefficients for Environmental Exposure to Radionuclides. An Uncertainty Analysis for Risk Coefficients Reported in Federal Guidance Report No. 13

    SciTech Connect

    Pawel, David; Leggett, Richard Wayne; Eckerman, Keith F; Nelson, Christopher

    2007-01-01

    Federal Guidance Report No. 13 (FGR 13) provides risk coefficients for estimation of the risk of cancer due to low-level exposure to each of more than 800 radionuclides. Uncertainties in risk coefficients were quantified in FGR 13 for 33 cases (exposure to each of 11 radionuclides by each of three exposure pathways) on the basis of sensitivity analyses in which various combinations of plausible biokinetic, dosimetric, and radiation risk models were used to generate alternative risk coefficients. The present report updates the uncertainty analysis in FGR 13 for the cases of inhalation and ingestion of radionuclides and expands the analysis to all radionuclides addressed in that report. The analysis indicates that most risk coefficients for inhalation or ingestion of radionuclides are determined within a factor of 5 or less by current information. That is, application of alternate plausible biokinetic and dosimetric models and radiation risk models (based on the linear, no-threshold hypothesis with an adjustment for the dose and dose rate effectiveness factor) is unlikely to change these coefficients by more than a factor of 5. In this analysis the assessed uncertainty in the radiation risk model was found to be the main determinant of the uncertainty category for most risk coefficients, but conclusions concerning the relative contributions of risk and dose models to the total uncertainty in a risk coefficient may depend strongly on the method of assessing uncertainties in the risk model.

  11. Predictive Uncertainty Analysis for the Yucca Mountain Saturated Zone Flow Model

    NASA Astrophysics Data System (ADS)

    James, S. C.; Eddebbarh, A.; Doherty, J. E.

    2008-12-01

    Parameterization of a hydrogeologic flow and transport model is a principal factor governing the accuracy of predictions of future system behavior. Precision (minimal uncertainty or error variance), in turn, typically depends on an ability to infer the values of distributed system properties from historical measurements of system state through the model calibration process. When such data are scarce, or when their information content with respect to parameters that are most salient to predictions of interest is weak, the uncertainty associated with these predictions may be high, even if the model is "calibrated." Current modeling practice must recognize this condition and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observation types will most reduce this uncertainty. The present paper illustrates a combination of methods that can be used for this purpose, all of which are readily implemented as an adjunct to calibration of highly parameterized models. Both linear and nonlinear methods of analysis are discussed. The former furnish a variety of statistics pertinent to predictive uncertainty levels, including specific and unique contributions made by different parameter types, and the worth of different observation types toward reducing this uncertainty. The latter accurately characterize the uncertainty of critical model predictions while helping identify mechanisms for how relatively unlikely (though possible) system behavior can still occur. The methods can be applied to the prediction of specific discharge made for a saturated zone flow model. The model predictive uncertainty, and by inference the uncertainty of other predictions related to site scale flow and transport, is thereby assessed.

  12. Development and Uncertainty Analysis of an Automatic Testing System for Diffusion Pump Performance

    NASA Astrophysics Data System (ADS)

    Zhang, S. W.; Liang, W. S.; Zhang, Z. J.

    A newly developed automatic testing system used in the laboratory for diffusion pump performance measurement is introduced in this paper. By using two optical fiber sensors to indicate the oil level in a glass buret and a needle valve driven by a stepper motor to regulate the pressure in the test dome, the system can automatically test the ultimate pressure and pumping speed of a diffusion pump in accordance with ISO 1608. Uncertainty analysis theory is applied to the pumping speed measurement results. Based on the test principle and system structure, the influence of each component and test step on the final uncertainty is studied. Using the differential method, the mathematical model for the systematic uncertainty transfer function is established. Finally, in a case study, the combined uncertainties of manual and automatic operation are compared with each other (6.11% and 5.87%, respectively). The reasonableness and practicality of this newly developed automatic testing system are thus demonstrated.

  13. IAEA CRP on HTGR Uncertainty Analysis: Benchmark Definition and Test Cases

    SciTech Connect

    Gerhard Strydom; Frederik Reitsma; Hans Gougar; Bismark Tyobeka; Kostadin Ivanov

    2012-11-01

    Uncertainty and sensitivity studies are essential elements of the reactor simulation code verification and validation process. Although several international uncertainty quantification activities have been launched in recent years in the LWR, BWR and VVER domains (e.g. the OECD/NEA BEMUSE program [1], from which the current OECD/NEA LWR Uncertainty Analysis in Modelling (UAM) benchmark [2] effort was derived), the systematic propagation of uncertainties in cross-section, manufacturing and model parameters for High Temperature Gas-cooled Reactor (HTGR) designs has not yet been attempted. This paper summarises the scope, objectives and exercise definitions of the IAEA Coordinated Research Project (CRP) on HTGR UAM [3]. Note that no results will be included here, as the HTGR UAM benchmark was only launched formally in April 2012, and the specification is currently still under development.

  14. Coupled Probabilistic And Possibilistic Uncertainty Estimation In Rule-Based Analysis Systems

    NASA Astrophysics Data System (ADS)

    Tsoukalas, L.; Ragheb, M.

    1987-05-01

    A methodology is developed for estimating the performance of monitored engineering devices. Inferencing and decision-making under uncertainty are considered in production-rule analysis systems where the knowledge about the system is both probabilistic and possibilistic. In this case, uncertainty is considered as consisting of two components: randomness, describing the uncertainty of occurrence of an object, and fuzziness, describing the imprecision of the meaning of the object. The concepts of information granularity and of the probability of a fuzzy event are used. Propagation of the coupled probabilistic and possibilistic uncertainty is carried out over model-based systems using the rule-based paradigm. The approach provides a measure of both the performance level and the reliability of a device.

  15. Propagation of uncertainty and sensitivity analysis in an integral oil-gas plume model

    NASA Astrophysics Data System (ADS)

    Wang, Shitao; Iskandarani, Mohamed; Srinivasan, Ashwanth; Thacker, W. Carlisle; Winokur, Justin; Knio, Omar M.

    2016-05-01

    Polynomial Chaos expansions are used to analyze uncertainties in an integral oil-gas plume model simulating the Deepwater Horizon oil spill. The study focuses on six uncertain input parameters—two entrainment parameters, the gas to oil ratio, two parameters associated with the droplet-size distribution, and the flow rate—that impact the model's estimates of the plume's trap and peel heights, and of its various gas fluxes. The ranges of the uncertain inputs were determined by experimental data. Ensemble calculations were performed to construct polynomial chaos-based surrogates that describe the variations in the outputs due to variations in the uncertain inputs. The surrogates were then used to estimate reliably the statistics of the model outputs, and to perform an analysis of variance. Two experiments were performed to study the impacts of high and low flow rate uncertainties. The analysis shows that in the former case the flow rate is the largest contributor to output uncertainties, whereas in the latter case, with the uncertainty range constrained by a posteriori analyses, the flow rate's contribution becomes negligible. The uncertainties in the trap and peel heights are then mainly due to uncertainties in the 95th percentile of the droplet size and in the entrainment parameters.
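
    A minimal non-intrusive polynomial chaos sketch with two uniform inputs: an ensemble of model runs is fitted with a tensor-product Legendre basis by least squares, and main-effect sensitivity indices are read off the coefficients. The toy model f, the degree, and the sample size are stand-ins, not the plume model or setup of the study:

      import numpy as np
      from numpy.polynomial import legendre as L
      from itertools import product

      rng = np.random.default_rng(1)
      f = lambda x1, x2: np.exp(0.3 * x1) + 0.5 * x1 * x2   # toy "model"

      deg = 3
      xi = rng.uniform(-1.0, 1.0, size=(200, 2))            # ensemble of input samples
      y = f(xi[:, 0], xi[:, 1])

      # Tensor-product Legendre basis up to total degree `deg`.
      multi = [(i, j) for i, j in product(range(deg + 1), repeat=2) if i + j <= deg]
      def basis(pts):
          cols = []
          for i, j in multi:
              ci = np.zeros(i + 1); ci[i] = 1.0
              cj = np.zeros(j + 1); cj[j] = 1.0
              cols.append(L.legval(pts[:, 0], ci) * L.legval(pts[:, 1], cj))
          return np.column_stack(cols)

      coef, *_ = np.linalg.lstsq(basis(xi), y, rcond=None)  # surrogate coefficients

      # Variance decomposition from coefficients (E[P_n^2] = 1/(2n+1) for uniform inputs).
      norm = np.array([1.0 / ((2 * i + 1) * (2 * j + 1)) for i, j in multi])
      var_terms = coef**2 * norm
      total_var = var_terms[1:].sum()                       # exclude the mean term (0, 0)
      s1 = var_terms[[m != (0, 0) and m[1] == 0 for m in multi]].sum() / total_var
      s2 = var_terms[[m != (0, 0) and m[0] == 0 for m in multi]].sum() / total_var
      print(f"main-effect indices: S1 = {s1:.2f}, S2 = {s2:.2f}")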

  16. Accounting for uncertainty in ecological analysis: the strengths and limitations of hierarchical statistical modeling.

    PubMed

    Cressie, Noel; Calder, Catherine A; Clark, James S; Ver Hoef, Jay M; Wikle, Christopher K

    2009-04-01

    Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.

  17. Using Global Sensitivity Analysis to Understand the Implications of Epistemic Uncertainty in Earth Systems Modelling

    NASA Astrophysics Data System (ADS)

    Wagener, T.; Pianosi, F.; Almeida, S.; Holcombe, E.

    2016-12-01

    We can define epistemic uncertainty as those uncertainties that are not well determined by historical observations. This lack of determination can be because the future is not like the past, because the historical data are unreliable (imperfectly recorded from proxies or missing), or because they are scarce (either because measurements are not available at the right scale or there is simply no observation or network available). This kind of uncertainty is typical for earth system modelling, but our approaches to address it are poorly developed. Because epistemic uncertainties cannot easily be characterised by probability distributions, traditional uncertainty analysis techniques based on Monte Carlo simulation and forward propagation of uncertainty are not adequate. Global Sensitivity Analysis (GSA) can provide an alternative approach where, rather than quantifying the impact of poorly defined or even unknown uncertainties on model predictions, one can investigate at what level such uncertainties would start to matter and whether this level is likely to be reached within the relevant time period analysed. The underlying objective of GSA in this case lies in mapping the uncertain input factors onto critical regions of the model output, e.g. when the output exceeds a certain threshold. Methods to implement this mapping step have so far received less attention and significant improvement is needed. We will present an example from landslide modelling - a field where observations are scarce, sub-surface characteristics are poorly constrained, and potential future rainfall triggers can be highly uncertain due to climate change. We demonstrate an approach that combines GSA and advanced Classification and Regression Trees (CART) to understand the risk of slope failure for an application in the Caribbean. We close with a discussion of opportunities for further methodological advancement.
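
    A minimal sketch of the mapping idea using Monte Carlo filtering (regional sensitivity analysis): uncertain factors are sampled, runs are split according to whether a toy factor-of-safety output crosses a critical threshold, and factors are ranked by the Kolmogorov-Smirnov distance between the two sub-samples. The stability model, factor names, and ranges are invented for illustration (the paper itself combines GSA with CART):

      import numpy as np
      from scipy.stats import ks_2samp

      rng = np.random.default_rng(2)
      n = 5000
      cohesion = rng.uniform(2.0, 20.0, n)       # [kPa]
      friction = rng.uniform(20.0, 40.0, n)      # friction angle [deg]
      rain = rng.uniform(0.0, 200.0, n)          # trigger rainfall [mm]

      # Toy factor of safety: increases with strength, decreases with rainfall.
      fos = 0.05 * cohesion + 0.03 * friction - 0.004 * rain + rng.normal(0, 0.05, n)
      failed = fos < 1.0                         # critical region of the model output

      for name, x in [("cohesion", cohesion), ("friction", friction), ("rain", rain)]:
          d, _ = ks_2samp(x[failed], x[~failed])  # KS distance between the two sub-samples
          print(f"{name:8s}  KS = {d:.2f}")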

  18. A Monte Carlo Uncertainty Analysis of Ozone Trend Predictions in a Two Dimensional Model. Revision

    NASA Technical Reports Server (NTRS)

    Considine, D. B.; Stolarski, R. S.; Hollandsworth, S. M.; Jackman, C. H.; Fleming, E. L.

    1998-01-01

    We use Monte Carlo analysis to estimate the uncertainty in predictions of total O3 trends between 1979 and 1995 made by the Goddard Space Flight Center (GSFC) two-dimensional (2D) model of stratospheric photochemistry and dynamics. The uncertainty is caused by gas-phase chemical reaction rates, photolysis coefficients, and heterogeneous reaction parameters which are model inputs. The uncertainty represents a lower bound to the total model uncertainty assuming the input parameter uncertainties are characterized correctly. Each of the Monte Carlo runs was initialized in 1970 and integrated for 26 model years through the end of 1995. This was repeated 419 times using input parameter sets generated by Latin Hypercube Sampling. The standard deviation (σ) of the Monte Carlo ensemble of total O3 trend predictions is used to quantify the model uncertainty. The 34% difference between the model trend in globally and annually averaged total O3 using nominal inputs and atmospheric trends calculated from Nimbus 7 and Meteor 3 total ozone mapping spectrometer (TOMS) version 7 data is less than the 46% calculated 1σ model uncertainty, so there is no significant difference between the modeled and observed trends. In the northern hemisphere midlatitude spring the modeled and observed total O3 trends differ by more than 1σ but less than 2σ, which we refer to as marginal significance. We perform a multiple linear regression analysis of the runs which suggests that only a few of the model reactions contribute significantly to the variance in the model predictions. The lack of significance in these comparisons suggests that they are of questionable use as guides for continuing model development. Large model/measurement differences which are many multiples of the input parameter uncertainty are seen in the meridional gradients of the trend and the peak-to-peak variations in the trends over an annual cycle. These discrepancies unambiguously indicate model formulation
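
    A minimal sketch of the Latin Hypercube Sampling step: stratified uniform samples are mapped through inverse CDFs to lognormal rate-constant multipliers and pushed through a stand-in response. The uncertainty factors, the number of runs (419, echoing the abstract), and the toy trend model are illustrative assumptions:

      import numpy as np
      from scipy.stats import qmc, norm

      n_runs = 419
      factors = np.array([1.2, 1.3, 1.5])          # assumed 1-sigma uncertainty factor per rate

      sampler = qmc.LatinHypercube(d=3, seed=3)
      u = sampler.random(n_runs)                    # stratified uniform samples in [0, 1)
      multipliers = np.exp(norm.ppf(u) * np.log(factors))   # lognormal rate multipliers

      # Stand-in for the model: a toy ozone-trend response to the perturbed rates.
      trend = -3.0 * multipliers[:, 0] + 1.0 * multipliers[:, 1] - 0.5 * multipliers[:, 2]
      print(f"trend = {trend.mean():.2f} +/- {trend.std(ddof=1):.2f} (toy units)")

    The stratification guarantees that each input's range is covered evenly even with a few hundred runs, which is why LHS is preferred over simple random sampling when each model run is expensive.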

  19. Uncertainty Analysis in the Decadal Survey Era: A Hydrologic Application using the Land Information System (LIS)

    NASA Astrophysics Data System (ADS)

    Harrison, K.; Kumar, S.; Peters-Lidard, C. D.; Santanello, J. A.

    2010-12-01

    Computing and algorithmic advancements are making possible a more complete accounting of errors and uncertainties in earth science modeling. Knowledge of uncertainty can be critical in many application areas and can help to guide scientific research efforts. Here, we describe a plan and progress to date for a fuller accounting of hydrologic modeling uncertainties that addresses the challenges posed by decadal survey missions. These challenges include the need to account for a wide range of error sources (e.g., model error, stochastically varying inputs, observational error, downscaling) and uncertainties (model parameters, error parameters, model selection). In addition, there is a need to incorporate into an assessment all available data, which for decadal survey missions includes the wealth of data from ground, air and satellite observing systems. Our core tool is NASA’s Land Information System (LIS), a high-resolution, high-performance, land surface modeling and data assimilation system that supports a wide range of land surface research and applications. Support for parameter and uncertainty estimation was recently incorporated into the software architecture, and to date three optimization algorithms (Levenberg-Marquardt, Genetic Algorithm, and SCE-UA) and two Markov chain Monte Carlo algorithms for Bayesian analysis (random walk, Differential Evolution-Monte Carlo) have been added. Results and discussion center on a case study that was the focus of Santanello et al. (2007) who demonstrated the use of remotely sensed soil moisture for hydrologic parameter estimation in the Walnut Gulch Experimental Watershed. We contrast results from uncertainty estimation to those from parameter estimation alone. We demonstrate considerable but not complete uncertainty reduction. From this analysis, we identify remaining challenges to a more complete accounting of uncertainties.

  20. Robust stability analysis of linear systems with parametric uncertainty

    NASA Astrophysics Data System (ADS)

    Zhai, Ding; Zhang, Qing-Ling; Liu, Guo-Yi

    2012-09-01

    This article is concerned with the problem of robust stability analysis of linear systems with uncertain parameters. By constructing an equivalent system with positive uncertain parameters and using the properties of these parameters, a new stability analysis condition is derived. Because it exploits the properties of the uncertain parameters, the proposed method has the potential to give less conservative results than existing approaches. A numerical example is given to illustrate the effectiveness of the proposed method.

  1. Overall uncertainty measurement for near infrared analysis of cryptotanshinone in tanshinone extract

    NASA Astrophysics Data System (ADS)

    Xue, Zhong; Xu, Bing; Shi, Xinyuan; Yang, Chan; Cui, Xianglong; Luo, Gan; Qiao, Yanjiang

    2017-01-01

    This study presented a new strategy of overall uncertainty measurement for near infrared (NIR) quantitative analysis of cryptotanshinone in tanshinone extract powders. The overall uncertainty of the NIR analysis, derived from validation data of precision, trueness and robustness studies, was fully investigated and discussed. Quality by design (QbD) elements, such as risk assessment and design of experiment (DOE), were utilized to organize the validation data. An "I × J × K" (series I, number of repetitions J and levels of concentration K) full factorial design was used to calculate uncertainty from the precision and trueness data, and a 2^(7-4) Plackett-Burman matrix with four different influence factors resulting from the failure mode and effect analysis (FMEA) was adopted for the robustness study. The overall uncertainty profile was introduced as a graphical decision-making tool to evaluate the validity of the NIR method over the predefined concentration range. In comparison with the method of T. Saffaj (Analyst, 2013, 138, 4677) for overall uncertainty assessment, the proposed approach gave almost the same results, demonstrating that the proposed method was reasonable and valid. Moreover, the proposed method can help identify critical factors that influence the NIR prediction performance, which could be used for further optimization of the NIR analytical procedures in routine use.

  2. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Knowledge Advancement.

    SciTech Connect

    Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.; Ross, Kyle; Cardoni, Jeffrey N; Kalinich, Donald A.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie; Ghosh, S. Tina

    2014-02-01

    This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences of each of the selected uncertain parameters, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as the failure rate of the safety relief valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependence of the effect of the cesium chemical form on the accident progression. (auth)

  3. Probability and possibility-based representations of uncertainty in fault tree analysis.

    PubMed

    Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje

    2013-01-01

    Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context.
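
    A minimal sketch of one building block mentioned above, a probability-to-possibility transformation for a discrete distribution (the classical optimal transformation of Dubois and Prade); the basic-event probability values are illustrative:

      import numpy as np

      def prob_to_poss(p):
          """pi_i = sum of all p_j with p_j <= p_i (preserves the probability ordering)."""
          p = np.asarray(p, dtype=float)
          order = np.argsort(p)[::-1]                    # indices by decreasing probability
          poss = np.empty_like(p)
          poss[order] = np.cumsum(p[order][::-1])[::-1]  # tail sums of the sorted probabilities
          return poss

      p = np.array([0.5, 0.3, 0.15, 0.05])               # basic-event probability estimates
      print(prob_to_poss(p))                             # -> [1.0, 0.5, 0.2, 0.05]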

  4. Overall uncertainty measurement for near infrared analysis of cryptotanshinone in tanshinone extract.

    PubMed

    Xue, Zhong; Xu, Bing; Shi, Xinyuan; Yang, Chan; Cui, Xianglong; Luo, Gan; Qiao, Yanjiang

    2017-01-05

    This study presented a new strategy of overall uncertainty measurement for near infrared (NIR) quantitative analysis of cryptotanshinone in tanshinone extract powders. The overall uncertainty of the NIR analysis, derived from validation data of precision, trueness and robustness studies, was fully investigated and discussed. Quality by design (QbD) elements, such as risk assessment and design of experiment (DOE), were utilized to organize the validation data. An "I×J×K" (series I, number of repetitions J and levels of concentration K) full factorial design was used to calculate uncertainty from the precision and trueness data, and a 2^(7-4) Plackett-Burman matrix with four different influence factors resulting from the failure mode and effect analysis (FMEA) was adopted for the robustness study. The overall uncertainty profile was introduced as a graphical decision-making tool to evaluate the validity of the NIR method over the predefined concentration range. In comparison with the method of T. Saffaj (Analyst, 2013, 138, 4677) for overall uncertainty assessment, the proposed approach gave almost the same results, demonstrating that the proposed method was reasonable and valid. Moreover, the proposed method can help identify critical factors that influence the NIR prediction performance, which could be used for further optimization of the NIR analytical procedures in routine use.

  5. Uncertainty and Sensitivity Analysis of Afterbody Radiative Heating Predictions for Earth Entry

    NASA Technical Reports Server (NTRS)

    West, Thomas K., IV; Johnston, Christopher O.; Hosder, Serhat

    2016-01-01

    The objective of this work was to perform sensitivity analysis and uncertainty quantification for afterbody radiative heating predictions of the Stardust capsule during Earth entry at peak afterbody radiation conditions. The radiation environment in the afterbody region poses significant challenges for accurate uncertainty quantification and sensitivity analysis due to the complexity of the flow physics, computational cost, and large number of uncertain variables. In this study, first a sparse collocation non-intrusive polynomial chaos approach along with global non-linear sensitivity analysis was used to identify the most significant uncertain variables and reduce the dimensions of the stochastic problem. Then, a total order stochastic expansion was constructed over only the important parameters for an efficient and accurate estimate of the uncertainty in radiation. Based on previous work, 388 uncertain parameters were considered in the radiation model, which came from the thermodynamics, flow field chemistry, and radiation modeling. The sensitivity analysis showed that only four of these variables contributed significantly to afterbody radiation uncertainty, accounting for almost 95% of the uncertainty. These included the electronic-impact excitation rate for N between level 2 and level 5 and the rates of three chemical reactions influencing N, N(+), O, and O(+) number densities in the flow field.

  6. Seismic tomography and MASW as tools improving imaging - uncertainty analysis.

    NASA Astrophysics Data System (ADS)

    Marciniak, Artur; Majdański, Mariusz

    2017-04-01

    In recent years, near-surface seismic imaging has become a topic of interest for geoengineers and geologists. In connection with other seismic methods such as MASW and travel time tomography, seismic imaging can provide a more complete model of shallow structures together with an analysis of uncertainty. Often forgotten, uncertainty analysis provides useful information for data interpretation, reducing the possibility of mistakes in projects to which the model is applied. Moreover, the application of different methods allows complete utilization of the acquired data for in-depth interpretation, or for solving problems in other surveys. Applying different processing methods to the same raw data allowed the authors to obtain a more accurate final result, with an uncertainty analysis based on a more complete dataset in comparison to the classical survey scheme.

  7. Uncertainty Analysis of LROC NAC Derived Elevation Models

    NASA Astrophysics Data System (ADS)

    Burns, K.; Yates, D. G.; Speyerer, E.; Robinson, M. S.

    2012-12-01

    One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) [1] is to gather stereo observations with the Narrow Angle Camera (NAC) to generate digital elevation models (DEMs). From an altitude of 50 km, the NAC acquires images with a pixel scale of 0.5 meters, and a dual NAC observation covers approximately 5 km cross-track by 25 km down-track. This low altitude was common from September 2009 to December 2011. Images acquired during the commissioning phase and those acquired from the fixed orbit (after 11 December 2011) have pixel scales that range from 0.35 meters at the south pole to 2 meters at the north pole. Altimetric observations obtained by the Lunar Orbiter Laser Altimeter (LOLA) provide measurements of ±0.1 m between the spacecraft and the surface [2]. However, uncertainties in the spacecraft positioning can result in offsets (±20 m) between altimeter tracks over many orbits. The LROC team is currently developing a tool to automatically register altimetric observations to NAC DEMs [3]. Using a generalized pattern search (GPS) algorithm, the new automatic registration adjusts the spacecraft position and pointing information during times when NAC images, as well as LOLA measurements, of the same region are acquired to provide an absolute reference frame for the DEM. This information is then imported into SOCET SET to aid in creating controlled NAC DEMs. For every DEM, a figure of merit (FOM) map is generated using SOCET SET software. This is a valuable tool for determining the relative accuracy of a specific pixel in a DEM. Each pixel in a FOM map is assigned a value indicating its "quality," determined by whether the specific pixel was shadowed, saturated, suspicious, interpolated/extrapolated, or successfully correlated. The overall quality of a NAC DEM is a function of both the absolute and relative accuracies. LOLA altimetry provides the most accurate absolute geodetic reference frame with which the NAC DEMs can be compared. Offsets

  8. HYDROLOGIC MODEL CALIBRATION AND UNCERTAINTY IN SCENARIO ANALYSIS

    EPA Science Inventory

    A systematic analysis of model performance during simulations based on observed land-cover/use change is used to quantify error associated with water-yield simulations for a series of known landscape conditions over a 24-year period with the goal of evaluatin...

  9. Integrated Uncertainty Analysis for Ambient Pollutant Health Risk Assessment: A Case Study of Ozone Mortality Risk.

    PubMed

    Smith, Anne E; Glasgow, Garrett

    2017-05-18

    The U.S. Environmental Protection Agency (EPA) uses health risk assessment to help inform its decisions in setting national ambient air quality standards (NAAQS). EPA's standard approach is to make epidemiologically-based risk estimates based on a single statistical model selected from the scientific literature, called the "core" model. The uncertainty presented for "core" risk estimates reflects only the statistical uncertainty associated with that one model's concentration-response function parameter estimate(s). However, epidemiologically-based risk estimates are also subject to "model uncertainty," which is a lack of knowledge about which of many plausible model specifications and data sets best reflects the true relationship between health and ambient pollutant concentrations. In 2002, a National Academies of Sciences (NAS) committee recommended that model uncertainty be integrated into EPA's standard risk analysis approach. This article discusses how model uncertainty can be taken into account with an integrated uncertainty analysis (IUA) of health risk estimates. It provides an illustrative numerical example based on risk of premature death from respiratory mortality due to long-term exposures to ambient ozone, which is a health risk considered in the 2015 ozone NAAQS decision. This example demonstrates that use of IUA to quantitatively incorporate key model uncertainties into risk estimates produces a substantially altered understanding of the potential public health gain of a NAAQS policy decision, and that IUA can also produce more helpful insights to guide that decision, such as evidence of decreasing incremental health gains from progressive tightening of a NAAQS. © 2017 Society for Risk Analysis.

  10. A new algorithm for five-hole probe calibration, data reduction, and uncertainty analysis

    NASA Technical Reports Server (NTRS)

    Reichert, Bruce A.; Wendt, Bruce J.

    1994-01-01

    A new algorithm for five-hole probe calibration and data reduction using a non-nulling method is developed. The significant features of the algorithm are: (1) two components of the unit vector in the flow direction replace pitch and yaw angles as flow direction variables; and (2) symmetry rules are developed that greatly simplify Taylor's series representations of the calibration data. In data reduction, four pressure coefficients allow total pressure, static pressure, and flow direction to be calculated directly. The new algorithm's simplicity permits an analytical treatment of the propagation of uncertainty in five-hole probe measurement. The objectives of the uncertainty analysis are to quantify uncertainty of five-hole results (e.g., total pressure, static pressure, and flow direction) and determine the dependence of the result uncertainty on the uncertainty of all underlying experimental and calibration measurands. This study outlines a general procedure that other researchers may use to determine five-hole probe result uncertainty and provides guidance to improve measurement technique. The new algorithm is applied to calibrate and reduce data from a rake of five-hole probes. Here, ten individual probes are mounted on a single probe shaft and used simultaneously. Use of this probe is made practical by the simplicity afforded by this algorithm.
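
    A minimal sketch of first-order (Taylor-series) uncertainty propagation through a data-reduction function, using central-difference sensitivity coefficients and uncorrelated input uncertainties. The pressure-coefficient form, port pressures, and uncertainties are hypothetical placeholders, not the calibration data or exact coefficients of the paper:

      import numpy as np

      def propagate(r, x, u_x, rel_step=1e-6):
          """Return r(x) and its standard uncertainty from uncorrelated inputs u_x."""
          x = np.asarray(x, dtype=float)
          u_x = np.asarray(u_x, dtype=float)
          r0 = r(x)
          sens = np.empty_like(x)
          for i in range(len(x)):                    # central-difference sensitivity coefficients
              h = rel_step * max(abs(x[i]), 1.0)
              xp, xm = x.copy(), x.copy()
              xp[i] += h
              xm[i] -= h
              sens[i] = (r(xp) - r(xm)) / (2 * h)
          return r0, np.sqrt(np.sum((sens * u_x) ** 2))

      # Example data-reduction function: a pitch-type pressure coefficient
      # Cp = (p2 - p3) / (p1 - p_bar), with p_bar the mean of the four side ports.
      def cp_pitch(p):
          p1, p2, p3, p4, p5 = p                     # centre port and four side ports
          p_bar = (p2 + p3 + p4 + p5) / 4.0
          return (p2 - p3) / (p1 - p_bar)

      p = np.array([101800.0, 101350.0, 101300.0, 101310.0, 101290.0])  # [Pa]
      u_p = np.full(5, 15.0)                         # standard uncertainty of each port [Pa]
      value, u_value = propagate(cp_pitch, p, u_p)
      print(f"Cp = {value:.3f} +/- {u_value:.3f}")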

  11. Uncertainty analysis and risk-based design of detention basin without damage function

    NASA Astrophysics Data System (ADS)

    Tung, Yeou-Koung

    2017-05-01

    Risk-based analysis provides an economically defensible framework for determining the optimal design of hydrosystems with the minimum total cost including project cost (installation plus operation/maintenance/repair) and failure-induced expected damage cost. However, failure-related damage function with good quality may not be widely available in practical applications for assessing annual expected damage cost. In addition to aleatory uncertainty representing natural randomness of hydrologic events, there exists a variety of epistemic uncertainties due to knowledge deficiency from the use of inadequate models, inaccurate model parameters, etc. The presence of epistemic uncertainties could affect the loads and capacity of hydrosystem facilities which, in turn, would affect the value of failure-induced physical performance indicators. Using detention basin design as an example, this paper presents a systematic framework to integrate aleatory and epistemic uncertainties for the risk-based design under the condition of no monetary damage function. For illustration, aleatory uncertainty due to randomness of rainfall intensity and epistemic uncertainties caused by runoff coefficient and curve number are considered in risk-based design of an example detention basin.

  12. Uncertainty analysis and global sensitivity analysis of techno-economic assessments for biodiesel production.

    PubMed

    Tang, Zhang-Chun; Zhenzhou, Lu; Zhiwen, Liu; Ningcong, Xiao

    2015-01-01

    There are various uncertain parameters in the techno-economic assessments (TEAs) of biodiesel production, including capital cost, interest rate, feedstock price, maintenance rate, biodiesel conversion efficiency, glycerol price and operating cost. However, few studies have focused on the influence of these parameters on TEAs. This paper investigated the effects of these parameters on the life cycle cost (LCC) and the unit cost (UC) in the TEAs of biodiesel production. The results show that LCC and UC exhibit considerable variation when uncertain parameters are involved. Based on the uncertainty analysis, three global sensitivity analysis (GSA) methods are utilized to quantify the contribution of an individual uncertain parameter to LCC and UC. The GSA results reveal that the feedstock price and the interest rate produce considerable effects on the TEAs. These results can provide a useful guide for entrepreneurs when they plan plants. Copyright © 2014 Elsevier Ltd. All rights reserved.
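
    A minimal sketch of the uncertainty-propagation step for a toy techno-economic model: a few uncertain inputs are sampled, a unit cost is computed, and the inputs are ranked with standardized regression coefficients as a simple stand-in for the GSA methods of the paper. All ranges and the cost model below are illustrative assumptions:

      import numpy as np

      rng = np.random.default_rng(4)
      n = 20_000
      feedstock = rng.uniform(0.60, 1.00, n)     # feedstock price [$/kg]
      interest = rng.uniform(0.04, 0.12, n)      # interest rate [-]
      efficiency = rng.uniform(0.90, 0.98, n)    # biodiesel conversion efficiency [-]

      capital, years, output = 5e6, 20, 8e6      # capital [$], lifetime [yr], output [kg/yr]
      annuity = capital * interest / (1 - (1 + interest) ** -years)  # annualized capital cost
      unit_cost = (annuity + feedstock * output / efficiency * 1.05) / output  # [$ per kg]

      X = np.column_stack([feedstock, interest, efficiency])
      Xs = (X - X.mean(axis=0)) / X.std(axis=0)
      ys = (unit_cost - unit_cost.mean()) / unit_cost.std()
      src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)   # standardized regression coefficients
      for name, b in zip(["feedstock", "interest", "efficiency"], src):
          print(f"{name:10s}  SRC = {b:+.2f}")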

  13. Uncertainty Analysis for a Virtual Flow Meter Using an Air-Handling Unit Chilled Water Valve

    SciTech Connect

    Song, Li; Wang, Gang; Brambley, Michael R.

    2013-04-28

    A virtual water flow meter is developed that uses the chilled water control valve on an air-handling unit as a measurement device. The flow rate of water through the valve is calculated using the differential pressure across the valve and its associated coil, the valve command, and an empirically determined valve characteristic curve. Thus, the probability of error in the measurements is significantly greater than for conventionally manufactured flow meters. In this paper, mathematical models are developed and used to conduct uncertainty analysis for the virtual flow meter, and the results from the virtual meter are compared to measurements made with an ultrasonic flow meter. Theoretical uncertainty analysis shows that the total uncertainty in flow rates from the virtual flow meter is 1.46% with 95% confidence; comparison of virtual flow meter results with measurements from an ultrasonic flow meter yielded an uncertainty of 1.46% with 99% confidence. The comparable results from the theoretical uncertainty analysis and empirical comparison with the ultrasonic flow meter corroborate each other, and tend to validate the approach to computationally estimating uncertainty for virtual sensors introduced in this study.

  14. Uncertainty Analysis of Sonic Boom Levels Measured in a Simulator at NASA Langley

    NASA Technical Reports Server (NTRS)

    Rathsam, Jonathan; Ely, Jeffry W.

    2012-01-01

    A sonic boom simulator has been constructed at NASA Langley Research Center for testing the human response to sonic booms heard indoors. Like all measured quantities, sonic boom levels in the simulator are subject to systematic and random errors. To quantify these errors, and their net influence on the measurement result, a formal uncertainty analysis is conducted. Knowledge of the measurement uncertainty, or range of values attributable to the quantity being measured, enables reliable comparisons among measurements at different locations in the simulator as well as comparisons with field data or laboratory data from other simulators. The analysis reported here accounts for acoustic excitation from two sets of loudspeakers: one loudspeaker set at the facility exterior that reproduces the exterior sonic boom waveform and a second set of interior loudspeakers for reproducing indoor rattle sounds. The analysis also addresses the effect of pressure fluctuations generated when exterior doors of the building housing the simulator are opened. An uncertainty budget is assembled to document each uncertainty component, its sensitivity coefficient, and the combined standard uncertainty. The latter quantity will be reported alongside measurement results in future research reports to indicate data reliability.
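
    A minimal sketch of assembling an uncertainty budget and combining independent components into combined and expanded standard uncertainties (GUM-style root-sum-of-squares); the component names and values are placeholders, not the budget reported for the simulator:

      import numpy as np

      budget = {
          "loudspeaker reproduction":         0.40,   # standard uncertainty [dB]
          "microphone calibration":           0.20,
          "position in seating area":         0.30,
          "background pressure fluctuations": 0.25,
      }
      u_c = np.sqrt(sum(u**2 for u in budget.values()))   # combined standard uncertainty
      U = 2.0 * u_c                                       # expanded uncertainty (k = 2, ~95 %)
      for name, u in budget.items():
          print(f"{name:34s} {u:.2f} dB  ({(u / u_c) ** 2:5.1%} of variance)")
      print(f"combined: {u_c:.2f} dB, expanded (k=2): {U:.2f} dB")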

  15. Uncertainty Analysis of Inertial Model Attitude Sensor Calibration and Application with a Recommended New Calibration Method

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Statistical tools, previously developed for nonlinear least-squares estimation of multivariate sensor calibration parameters and the associated calibration uncertainty analysis, have been applied to single- and multiple-axis inertial model attitude sensors used in wind tunnel testing to measure angle of attack and roll angle. The analysis provides confidence and prediction intervals of calibrated sensor measurement uncertainty as functions of applied input pitch and roll angles. A comparative performance study of various experimental designs for inertial sensor calibration is presented along with corroborating experimental data. The importance of replicated calibrations over extended time periods has been emphasized; replication provides independent estimates of calibration precision and bias uncertainties, statistical tests for calibration or modeling bias uncertainty, and statistical tests for sensor parameter drift over time. A set of recommendations for a new standardized model attitude sensor calibration method and usage procedures is included. The statistical information provided by these procedures is necessary for the uncertainty analysis of aerospace test results now required by users of industrial wind tunnel test facilities.

  16. Bayesian uncertainty analysis compared with the application of the GUM and its supplements

    NASA Astrophysics Data System (ADS)

    Elster, Clemens

    2014-08-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) has proven to be a major step towards the harmonization of uncertainty evaluation in metrology. Its procedures contain elements from both classical and Bayesian statistics. The recent supplements 1 and 2 to the GUM appear to move the guidelines towards the Bayesian point of view, and they produce a probability distribution that shall encode one's state of knowledge about the measurand. In contrast to a Bayesian uncertainty analysis, however, Bayes' theorem is not applied explicitly. Instead, a distribution is assigned for the input quantities which is then ‘propagated’ through a model that relates the input quantities to the measurand. The resulting distribution for the measurand may coincide with a distribution obtained by the application of Bayes' theorem, but this is not true in general. The relation between a Bayesian uncertainty analysis and the application of the GUM and its supplements is investigated. In terms of a simple example, similarities and differences in the approaches are illustrated. Then a general class of models is considered and conditions are specified for which the distribution obtained by supplement 1 to the GUM is equivalent to a posterior distribution resulting from the application of Bayes' theorem. The corresponding prior distribution is identified and assessed. Finally, we briefly compare the GUM approach with a Bayesian uncertainty analysis in the context of regression problems.

  17. IAEA Coordinated Research Project on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis

    SciTech Connect

    Strydom, Gerhard; Bostelmann, F.

    2015-09-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, and modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained.) SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on stochastic sampling methods or on derivative-based methods such as generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed to quantify the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on

  18. Uncertainty analysis of signal deconvolution using a measured instrument response function.

    PubMed

    Hartouni, E P; Beeman, B; Caggiano, J A; Cerjan, C; Eckart, M J; Grim, G P; Hatarik, R; Moore, A S; Munro, D H; Phillips, T; Sayre, D B

    2016-11-01

    A common analysis procedure minimizes the ln-likelihood that a set of experimental observables matches a parameterized model of the observation. The model includes a description of the underlying physical process as well as the instrument response function (IRF). In the case investigated here, the National Ignition Facility (NIF) neutron time-of-flight (nTOF) spectrometers, the IRF is constructed from measurements and models. IRF measurements have a finite precision that can make significant contributions to determine the uncertainty estimate of the physical model's parameters. We apply a Bayesian analysis to properly account for IRF uncertainties in calculating the ln-likelihood function used to find the optimum physical parameters.
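
    A minimal sketch of the forward-modeling idea: a parameterized signal is convolved with a measured IRF and fitted by minimizing a negative ln-likelihood under an assumed Gaussian noise model. The pulse shape, IRF, and noise level are invented for illustration and are not NIF nTOF data:

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(5)
      t = np.linspace(0.0, 100.0, 400)

      irf = np.exp(-0.5 * ((t - 10.0) / 2.0) ** 2)        # "measured" IRF, normalized below
      irf /= irf.sum()

      def model(params):
          amp, t0, width = params
          signal = amp * np.exp(-0.5 * ((t - t0) / width) ** 2)
          return np.convolve(signal, irf, mode="full")[: len(t)]  # forward-model the observation

      true = (5.0, 40.0, 6.0)
      sigma = 0.1                                          # assumed Gaussian noise level
      data = model(true) + rng.normal(0.0, sigma, len(t))

      def neg_ln_like(params):
          resid = data - model(params)
          return 0.5 * np.sum((resid / sigma) ** 2)

      fit = minimize(neg_ln_like, x0=(4.0, 35.0, 5.0), method="Nelder-Mead")
      print("fitted parameters:", np.round(fit.x, 2))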

  19. Development of a statistical sampling method for uncertainty analysis with SCALE

    SciTech Connect

    Williams, M.; Wiarda, D.; Smith, H.; Jessee, M. A.; Rearden, B. T.; Zwermann, W.; Klein, M.; Pautz, A.; Krzykacz-Hausmann, B.; Gallner, L.

    2012-07-01

    A new statistical sampling sequence called Sampler has been developed for the SCALE code system. Random values for the input multigroup cross sections are determined by using the XSUSA program to sample uncertainty data provided in the SCALE covariance library. Using these samples, Sampler computes perturbed self-shielded cross sections and propagates the perturbed nuclear data through any specified SCALE analysis sequence, including those for criticality safety, lattice physics with depletion, and shielding calculations. Statistical analysis of the output distributions provides uncertainties and correlations in the desired responses. (authors)
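
    A minimal sketch of the statistical-sampling idea: correlated random perturbations of nuclear data are drawn from a covariance matrix, each sample is run through a toy response calculation, and the output distribution yields the response uncertainty. The covariance values and the response model are illustrative assumptions, not SCALE/XSUSA data:

      import numpy as np

      rng = np.random.default_rng(6)
      nominal = np.array([1.00, 0.30])                    # e.g. fission and capture [arb. units]
      rel_cov = np.array([[0.02**2, 0.5 * 0.02 * 0.03],   # relative covariance (2 %, 3 %, rho = 0.5)
                          [0.5 * 0.02 * 0.03, 0.03**2]])

      samples = rng.multivariate_normal(nominal, rel_cov * np.outer(nominal, nominal), size=300)

      def k_eff(xs):                                      # toy response: production / absorption
          fission, capture = xs
          return 2.4 * fission / (fission + capture)

      k = np.array([k_eff(s) for s in samples])
      print(f"k_eff = {k.mean():.4f} +/- {k.std(ddof=1):.4f}")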

  20. Uncertainty analysis of signal deconvolution using a measured instrument response function

    NASA Astrophysics Data System (ADS)

    Hartouni, E. P.; Beeman, B.; Caggiano, J. A.; Cerjan, C.; Eckart, M. J.; Grim, G. P.; Hatarik, R.; Moore, A. S.; Munro, D. H.; Phillips, T.; Sayre, D. B.

    2016-11-01

    A common analysis procedure minimizes the ln-likelihood that a set of experimental observables matches a parameterized model of the observation. The model includes a description of the underlying physical process as well as the instrument response function (IRF). In the case investigated here, the National Ignition Facility (NIF) neutron time-of-flight (nTOF) spectrometers, the IRF is constructed from measurements and models. IRF measurements have a finite precision that can make significant contributions to determine the uncertainty estimate of the physical model's parameters. We apply a Bayesian analysis to properly account for IRF uncertainties in calculating the ln-likelihood function used to find the optimum physical parameters.

  1. Sensitivity and Uncertainty Analysis for Nuclear Criticality Safety Using KENO in the SCALE Code System

    NASA Astrophysics Data System (ADS)

    Rearden, B. T.

    Sensitivity and uncertainty methods have been developed to aid in the establishment of areas of applicability and validation of computer codes and nuclear data for nuclear criticality safety studies. A key component in this work is the generation of sensitivity and uncertainty parameters for typically several hundred benchmark experiments used in validation exercises. Previously, only one-dimensional sensitivity tools were available for this task, which necessitated the remodeling of multidimensional inputs in order for such an analysis to be performed. This paper describes the development of the SEN3 Monte Carlo based sensitivity analysis sequence for SCALE.

  2. Detailed Uncertainty Analysis for Ares I Ascent Aerodynamics Wind Tunnel Database

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.; Hanke, Jeremy L.; Walker, Eric L.; Houlden, Heather P.

    2008-01-01

    A detailed uncertainty analysis for the Ares I ascent aero 6-DOF wind tunnel database is described. While the database itself is determined using only the test results for the latest configuration, the data used for the uncertainty analysis comes from four tests on two different configurations at the Boeing Polysonic Wind Tunnel in St. Louis and the Unitary Plan Wind Tunnel at NASA Langley Research Center. Four major error sources are considered: (1) systematic errors from the balance calibration curve fits and model + balance installation, (2) run-to-run repeatability, (3) boundary-layer transition fixing, and (4) tunnel-to-tunnel reproducibility.

  3. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    SciTech Connect

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2015-06-29

    The effect of soil property uncertainties on permafrost thaw projections are studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year to year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume and time integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although significant, is
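
    A minimal sketch of the Null-Space Monte Carlo idea for a linearized toy problem: random parameter sets are projected onto the null space of the calibration Jacobian so that, to first order, each realization reproduces the calibrated fit. The Jacobian, prior spread, and dimensions are invented for illustration, not the permafrost model of the paper:

      import numpy as np

      rng = np.random.default_rng(7)
      n_par, n_obs, n_real = 6, 3, 200

      J = rng.normal(size=(n_obs, n_par))          # sensitivity (Jacobian) of obs w.r.t. parameters
      p_cal = rng.normal(size=n_par)               # calibrated parameter values

      _, s, Vt = np.linalg.svd(J)
      n_sol = np.sum(s > 1e-8)                     # dimension of the solution space
      V_null = Vt[n_sol:].T                        # basis of the calibration null space

      prior_draws = p_cal + rng.normal(scale=0.5, size=(n_real, n_par))
      delta = prior_draws - p_cal
      realizations = p_cal + (delta @ V_null) @ V_null.T   # project deltas onto the null space

      # To first order, all realizations give the same simulated observations:
      spread = np.abs(J @ (realizations - p_cal).T).max()
      print(f"max change in simulated observations across realizations: {spread:.2e}")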

  4. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    SciTech Connect

    Harp, Dylan R.; Atchley, Adam L.; Painter, Scott L.; Coon, Ethan T.; Wilson, Cathy J.; Romanovsky, Vladimir E.; Rowland, Joel C.

    2016-02-11

    Here, the effect of soil property uncertainties on permafrost thaw projections are studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year to year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume and time integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although

  5. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    DOE PAGES

    Harp, Dylan R.; Atchley, Adam L.; Painter, Scott L.; ...

    2016-02-11

    Here, the effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume- and time-integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties

  6. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    DOE PAGES

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; ...

    2015-06-29

    The effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume- and time-integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. As a result, by comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties

  7. Uncertainty analysis for absorbed dose from a brain receptor imaging agent

    SciTech Connect

    Aydogan, B.; Miller, L.F.; Sparks, R.B.; Stubbs, J.B.

    1999-01-01

    Absorbed dose estimates are known to contain uncertainties. A recent literature search indicates that prior to this study no rigorous investigation of uncertainty associated with absorbed dose has been undertaken. A method of uncertainty analysis for absorbed dose calculations has been developed and implemented for the brain receptor imaging agent ¹²³I-IPT. The two major sources of uncertainty considered were the uncertainty associated with the determination of residence time and that associated with the determination of the S values. There are many sources of uncertainty in the determination of the S values, but only the inter-patient organ mass variation was considered in this work. The absorbed dose uncertainties were determined for lung, liver, heart and brain. Ninety-five percent confidence intervals of the organ absorbed dose distributions for each patient and for a seven-patient population group were determined by the "Latin Hypercube Sampling" method. For an individual patient, the upper bound of the 95% confidence interval of the absorbed dose was found to be about 2.5 times larger than the estimated mean absorbed dose. For the seven-patient population the upper bound of the 95% confidence interval of the absorbed dose distribution was around 45% more than the estimated population mean. For example, the 95% confidence interval of the population liver dose distribution was found to be between 1.49E+07 Gy/MBq and 4.65E+07 Gy/MBq with a mean of 2.52E+07 Gy/MBq. This study concluded that patients in a population receiving ¹²³I-IPT could receive absorbed doses as much as twice as large as the standard estimated absorbed dose due to these uncertainties.
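
    As a rough illustration of the sampling step described above, the sketch below propagates assumed residence-time and S-value distributions through a simple dose = residence time × S product using Latin Hypercube Sampling; the distributions, parameter values, and the simplified dose equation are hypothetical and are not taken from the study.

```python
import numpy as np
from scipy.stats import norm, lognorm, qmc

# Illustrative (assumed) uncertain inputs: organ residence time [h] and S value [Gy/MBq-h]
sampler = qmc.LatinHypercube(d=2, seed=0)
u = sampler.random(n=10_000)                          # stratified uniform samples in [0, 1)^2

residence_time = norm.ppf(u[:, 0], loc=2.0, scale=0.4)    # hypothetical distribution
s_value = lognorm.ppf(u[:, 1], s=0.3, scale=1.5e-5)       # hypothetical distribution

dose = residence_time * s_value                       # simplified MIRD-style dose estimate

lo, hi = np.percentile(dose, [2.5, 97.5])             # 95% confidence interval
print(f"mean dose = {dose.mean():.2e} Gy/MBq, 95% CI = [{lo:.2e}, {hi:.2e}]")
```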

  8. Return on investment (ROI) analysis in the face of uncertainty

    PubMed

    East

    2000-01-01

    Calculation of return on investment should be as easy as knowing the cost and the benefit and doing the math. Unfortunately, in medical informatics we rarely know the benefit with absolute certainty. Often the benefit is a combination of "fuzzy", ill-defined concepts. In order to support the ROI analysis for medical information systems, we developed a probabilistic time-varying model that allows each of the fuzzy concepts of savings to be expressed as a likelihood distribution rather than a fixed value. The model and an interactive simulator were created using the graphical programming language G (National Instruments, Dallas, TX). This tool allows rapid prototyping and drag-and-drop ease of customizing the model to the particular clinical setting. The resulting ROI is expressed in terms of likelihoods, potential risk and confidence intervals, providing a unique view of the financial analysis not typically available. In this paper an example is illustrated for perioperative information systems.
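
    The original simulator was written in the G graphical language; a comparable Monte Carlo treatment of "fuzzy" savings can be sketched in Python as below, with all cost and savings figures and the triangular likelihoods purely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical annual savings categories, each expressed as a triangular likelihood (min, mode, max)
nursing_time_savings = rng.triangular(20_000, 60_000, 120_000, n)
supply_savings       = rng.triangular(5_000, 15_000, 40_000, n)
billing_improvement  = rng.triangular(0, 25_000, 80_000, n)

investment = 250_000          # assumed up-front system cost
years = 5

total_benefit = years * (nursing_time_savings + supply_savings + billing_improvement)
roi = (total_benefit - investment) / investment

print(f"median ROI = {np.median(roi):.2f}")
print(f"P(ROI < 0) = {np.mean(roi < 0):.3f}")                  # downside risk
print(f"90% interval = {np.percentile(roi, [5, 95]).round(2)}")
```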

  9. Interactive Planning under Uncertainty with Causal Modeling and Analysis

    DTIC Science & Technology

    2006-01-01

    Tool (CAT), a system for creating and analyzing causal models similar to Bayes networks. In order to use CAT as a tool for planning, users go through ... an iterative process in which they use CAT to create and analyze alternative plans. One of the biggest difficulties is that the number of possible ... Causal Analysis Tool (CAT), which is a tool for representing and analyzing causal networks similar to Bayesian networks. In order to represent plans

  10. Advanced uncertainty modelling for container port risk analysis.

    PubMed

    Alyami, Hani; Yang, Zaili; Riahi, Ramin; Bonsall, Stephen; Wang, Jin

    2016-08-13

    Globalization has led to a rapid increase of container movements in seaports. Risks in seaports need to be appropriately addressed to ensure economic wealth, operational efficiency, and personnel safety. As a result, the safety performance of a Container Terminal Operational System (CTOS) plays a growing role in improving the efficiency of international trade. This paper proposes a novel method to facilitate the application of Failure Mode and Effects Analysis (FMEA) in assessing the safety performance of CTOS. The new approach is developed through incorporating a Fuzzy Rule-Based Bayesian Network (FRBN) with Evidential Reasoning (ER) in a complementary manner. The former provides a realistic and flexible method to describe input failure information for risk estimates of individual hazardous events (HEs) at the bottom level of a risk analysis hierarchy. The latter is used to aggregate HEs safety estimates collectively, allowing dynamic risk-based decision support in CTOS from a systematic perspective. The novel feature of the proposed method, compared to those in traditional port risk analysis lies in a dynamic model capable of dealing with continually changing operational conditions in ports. More importantly, a new sensitivity analysis method is developed and carried out to rank the HEs by taking into account their specific risk estimations (locally) and their Risk Influence (RI) to a port's safety system (globally). Due to its generality, the new approach can be tailored for a wide range of applications in different safety and reliability engineering and management systems, particularly when real time risk ranking is required to measure, predict, and improve the associated system safety performance. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Implementation of a Bayesian Engine for Uncertainty Analysis

    SciTech Connect

    Leng Vang; Curtis Smith; Steven Prescott

    2014-08-01

    In probabilistic risk assessment, it is important to have an environment where analysts have access to shared and secured high-performance computing and a statistical analysis tool package. As part of the advanced small modular reactor probabilistic risk analysis framework implementation, we have identified the need for advanced Bayesian computations. However, in order to make this technology available to non-specialists, there is also a need for a simplified tool that allows users to author models and evaluate them within this framework. As a proof-of-concept, we have implemented an advanced open-source Bayesian inference tool, OpenBUGS, within the browser-based cloud risk analysis framework that is under development at the Idaho National Laboratory. This development, the "OpenBUGS Scripter," has been implemented as a client-side, visual, web-based integrated development environment for creating OpenBUGS language scripts. It depends on the shared server environment to execute the generated scripts and to transmit results back to the user. The visual models are in the form of linked diagrams, from which we automatically create the applicable OpenBUGS script that matches the diagram. These diagrams can be saved locally or stored on the server environment to be shared with other users.
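
    A minimal sketch of the script-generation idea: turning a flat description of diagram nodes into an OpenBUGS-language model string. The node structure and helper function are invented for illustration and do not reflect the actual OpenBUGS Scripter implementation.

```python
# Hypothetical node list extracted from a visual diagram: (name, distribution expression, observed?)
nodes = [
    ("lambda", "dgamma(0.5, 1.0)", False),       # prior on a failure rate
    ("x[i]",   "dpois(lambda)",    True),        # observed counts, i = 1..N
]

def to_openbugs(nodes, n_obs):
    """Emit an OpenBUGS model block from a flat node description (illustrative only)."""
    lines = ["model {"]
    for name, dist, observed in nodes:
        if observed:
            lines.append(f"  for (i in 1:{n_obs}) {{")
            lines.append(f"    {name} ~ {dist}")
            lines.append("  }")
        else:
            lines.append(f"  {name} ~ {dist}")
    lines.append("}")
    return "\n".join(lines)

print(to_openbugs(nodes, n_obs=10))   # prints a runnable OpenBUGS model block
```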

  12. Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California

    NASA Astrophysics Data System (ADS)

    Mahdyiar, M.; Guin, J.

    2005-12-01

    Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. It is the objective of this study to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and sites' spatial correlation on typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuations and regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. From the hazard side, there are uncertainties in the magnitudes, rates and mechanisms of the seismic sources and local site conditions and ground motion site amplifications. From the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. From an analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and integrating thousands of loss distribution curves with different degrees of correlation. In this paper we propose to address some of these issues by conducting loss analyses of a typical small portfolio in southern California, taking into consideration various source and ground

  13. Effect of soil property uncertainties on permafrost thaw projections: a calibration-constrained analysis

    NASA Astrophysics Data System (ADS)

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2016-02-01

    The effects of soil property uncertainties on permafrost thaw projections are studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The null-space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of predictive uncertainty (due to soil property (parametric) uncertainty) and the inter-annual climate variability due to year to year differences in CESM climate forcings. After calibrating to measured borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant predictive uncertainties in projected active layer thickness and annual thaw depth-duration even with a specified future climate. Inter-annual climate variability in projected soil moisture content and Stefan number are small. A volume- and time-integrated Stefan number decreases significantly, indicating a shift in subsurface energy utilization in the future climate (latent heat of phase change becomes more important than heat conduction). Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we quantify the relative magnitude of soil

  14. Use of Forward Sensitivity Analysis Method to Improve Code Scaling, Applicability, and Uncertainty (CSAU) Methodology

    SciTech Connect

    Haihua Zhao; Vincent A. Mousseau; Nam T. Dinh

    2010-10-01

    The Code Scaling, Applicability, and Uncertainty (CSAU) methodology was developed in the late 1980s by the US NRC to systematically quantify reactor simulation uncertainty. Based on the CSAU methodology, Best Estimate Plus Uncertainty (BEPU) methods have been developed and widely used for new reactor designs and power uprates of existing LWRs. In spite of these successes, several aspects of CSAU have been criticized as needing further improvement: (1) subjective judgement in the PIRT process; (2) high cost due to heavy reliance on a large experimental database, many expert man-years of work, and very high computational overhead; (3) mixing of numerical errors with other uncertainties; (4) grid dependence and use of the same numerical grids for both scaled experiments and real plant applications; (5) user effects. Although a large amount of effort has been devoted to improving the CSAU methodology, the above issues still exist. With the effort to develop next-generation safety analysis codes, new opportunities appear to take advantage of new numerical methods, better physical models, and modern uncertainty quantification methods. Forward sensitivity analysis (FSA) directly solves the PDEs for parameter sensitivities (defined as the derivative of the physical solution with respect to any constant parameter). When the parameter sensitivities are available in a new advanced system analysis code, CSAU could be significantly improved: (1) Quantifying numerical errors: new codes that are fully implicit and of higher-order accuracy can run much faster, with numerical errors quantified by FSA. (2) Quantitative PIRT (Q-PIRT) to reduce subjective judgement and improve efficiency: treat numerical errors as special sensitivities alongside other physical uncertainties; only parameters having large uncertainty effects on design criteria are considered. (3) Greatly reducing computational costs for uncertainty quantification by (a) choosing optimized time steps and spatial sizes; (b) using gradient information
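
    The core of FSA can be illustrated on a toy ODE: the sensitivity equation for s = du/dp is obtained by differentiating the governing equation with respect to the parameter and is integrated alongside the state. The decay model and parameter values below are illustrative only, not part of the methodology described above.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy model: du/dt = -p * u.  The forward sensitivity s = du/dp obeys
# ds/dt = d/dp(-p*u) = -u - p*s, integrated together with the state.
def rhs(t, y, p):
    u, s = y
    return [-p * u, -u - p * s]

p = 0.8
sol = solve_ivp(rhs, (0.0, 5.0), [1.0, 0.0], args=(p,))

u_end, s_end = sol.y[:, -1]
print(f"u(5) = {u_end:.4f}, du/dp at t=5 = {s_end:.4f}")

# Analytic check: u = exp(-p t), so du/dp = -t exp(-p t)
print(f"analytic du/dp = {-5.0 * np.exp(-p * 5.0):.4f}")
```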

  15. Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.

    2013-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This proposal describes an approach to validate the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft. The research described here is absolutely cutting edge. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. The proposed research project includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT and OPEN FOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around

  16. Analysis of Uncertainty and Variability in Finite Element Computational Models for Biomedical Engineering: Characterization and Propagation.

    PubMed

    Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González

    2016-01-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering.

  17. Analysis of Uncertainty and Variability in Finite Element Computational Models for Biomedical Engineering: Characterization and Propagation

    PubMed Central

    Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González

    2016-01-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering. PMID:27872840

  18. Testability requirement uncertainty analysis in the sensor selection and optimization model for PHM

    NASA Astrophysics Data System (ADS)

    Yang, S. M.; Qiu, J.; Liu, G. J.; Yang, P.; Zhang, Y.

    2012-05-01

    Prognostics and health management (PHM) plays an important part in guaranteeing the reliability and safety of complex systems. Design for testability (DFT), developed concurrently with system design, is considered a fundamental way to improve PHM performance, and sensor selection and optimization (SSO) is one of the important parts of DFT. To address the problem that testability requirement analysis in existing SSO models does not take test uncertainty in actual scenarios into account, fault detection uncertainty is analyzed qualitatively from the viewpoint of fault attributes, sensor attributes and fault-sensor matching attributes. Then, a quantitative uncertainty analysis is given, which assigns a rational confidence level to fault size. A case is presented to demonstrate the proposed methodology for an electromechanical servo-controlled system, and application results show the proposed approach is reasonable and feasible.

  19. Order Under Uncertainty: Robust Differential Expression Analysis Using Probabilistic Models for Pseudotime Inference

    PubMed Central

    Campbell, Kieran R.

    2016-01-01

    Single cell gene expression profiling can be used to quantify transcriptional dynamics in temporal processes, such as cell differentiation, using computational methods to label each cell with a ‘pseudotime’ where true time series experimentation is too difficult to perform. However, owing to the high variability in gene expression between individual cells, there is an inherent uncertainty in the precise temporal ordering of the cells. Pre-existing methods for pseudotime estimation have predominantly given point estimates precluding a rigorous analysis of the implications of uncertainty. We use probabilistic modelling techniques to quantify pseudotime uncertainty and propagate this into downstream differential expression analysis. We demonstrate that reliance on a point estimate of pseudotime can lead to inflated false discovery rates and that probabilistic approaches provide greater robustness and measures of the temporal resolution that can be obtained from pseudotime inference. PMID:27870852

  20. Using Predictive Uncertainty Analysis to Assess Hydrologic Model Performance for a Watershed in Oregon

    NASA Astrophysics Data System (ADS)

    Brannan, K. M.; Somor, A.

    2016-12-01

    A variety of statistics are used to assess watershed model performance, but these statistics do not directly answer the question: what is the uncertainty of my prediction? Understanding predictive uncertainty is important when using a watershed model to develop a Total Maximum Daily Load (TMDL). TMDLs are a key component of the US Clean Water Act and specify the amount of a pollutant that can enter a waterbody when the waterbody meets water quality criteria. TMDL developers use watershed models to estimate pollutant loads from nonpoint sources of pollution. We are developing a TMDL for bacteria impairments in a watershed in the Coastal Range of Oregon. We set up an HSPF model of the watershed and used the calibration software PEST to estimate HSPF hydrologic parameters and then perform predictive uncertainty analysis of stream flow. We used Monte Carlo simulation to run the model with 1,000 different parameter sets and assess predictive uncertainty. In order to reduce the chance of specious parameter sets, we accounted for the relationships among parameter values by using mathematically-based regularization techniques and an estimate of the parameter covariance when generating random parameter sets. We used a novel approach to select flow data for predictive uncertainty analysis. We set aside flow data that occurred on days that bacteria samples were collected. We did not use these flows in the estimation of the model parameters. We calculated a percent uncertainty for each flow observation based on 1,000 model runs. We also used several methods to visualize results with an emphasis on making the data accessible to both technical and general audiences. We will use the predictive uncertainty estimates in the next phase of our work, simulating bacteria fate and transport in the watershed.
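
    A minimal sketch of covariance-aware random parameter generation of the kind described: parameter sets are drawn from a multivariate normal centered on the calibrated values with an assumed parameter covariance, then run through a model to summarize per-observation spread. The placeholder model and covariance values stand in for the actual HSPF/PEST setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Calibrated parameter values and an (assumed) posterior parameter covariance,
# e.g. as estimated by a linear analysis after regularized inversion.
theta_cal = np.array([0.35, 1.8, 0.05])
cov = np.array([[0.0025, 0.0010, 0.0001],
                [0.0010, 0.0400, 0.0005],
                [0.0001, 0.0005, 0.0004]])

theta_sets = rng.multivariate_normal(theta_cal, cov, size=1000)

def simulate_flow(theta, t):
    """Placeholder watershed model: stands in for a full HSPF run."""
    a, b, c = theta
    return a * np.exp(-c * t) + b

t = np.arange(0, 30.0)                       # days with held-out flow observations
flows = np.array([simulate_flow(th, t) for th in theta_sets])

median = np.median(flows, axis=0)
spread = np.percentile(flows, 97.5, axis=0) - np.percentile(flows, 2.5, axis=0)
pct_uncertainty = 100.0 * spread / median    # percent uncertainty per observation
print(pct_uncertainty.round(1))
```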

  1. Performance evaluation of passive cooling in office buildings based on uncertainty and sensitivity analysis

    SciTech Connect

    Breesch, H.; Janssens, A.

    2010-08-15

    Natural night ventilation is an interesting passive cooling method in moderate climates. Driven by wind and stack generated pressures, it cools down the exposed building structure at night, in which the heat of the previous day is accumulated. The performance of natural night ventilation highly depends on the external weather conditions and especially on the outdoor temperature. An increase of this outdoor temperature is noticed over the last century and the IPCC predicts an additional rise to the end of this century. A methodology is needed to evaluate the reliable operation of the indoor climate of buildings in case of warmer and uncertain summer conditions. The uncertainty on the climate and on other design data can be very important in the decision process of a building project. The aim of this research is to develop a methodology to predict the performance of natural night ventilation using building energy simulation taking into account the uncertainties in the input. The performance evaluation of natural night ventilation is based on uncertainty and sensitivity analysis. The results of the uncertainty analysis showed that thermal comfort in a single office cooled with single-sided night ventilation had the largest uncertainty. The uncertainties on thermal comfort in case of passive stack and cross ventilation were substantially smaller. However, since wind, as the main driving force for cross ventilation, is highly variable, the cross ventilation strategy required larger louvre areas than the stack ventilation strategy to achieve a similar performance. The differences in uncertainty between the orientations were small. Sensitivity analysis was used to determine the most dominant set of input parameters causing the uncertainty on thermal comfort. The internal heat gains, solar heat gain coefficient of the sunblinds, internal convective heat transfer coefficient, thermophysical properties related to thermal mass, set-point temperatures controlling the natural

  2. A Parallel Disintegrated Model for Uncertainty Analysis in Estimating Electrical Power Outage Areas

    NASA Astrophysics Data System (ADS)

    Omitaomu, O. A.

    2008-05-01

    extreme events may lead to model uncertainty, parameter uncertainty, and/or decision uncertainty. The type and source of uncertainty can dictate the methods for characterizing the uncertainty and its impact on effective disaster management strategies. Several techniques, including sensitivity analysis, fuzzy set theory, and Bayes' theorem, have been used for quantifying specific sources of uncertainty in various studies. However, these studies focus on individual areas of uncertainty and extreme weather. In this paper, we present some preliminary results in developing a parallel disintegrated model for uncertainty analysis with application to estimating electric power outage areas. The proposed model is disintegrated in the sense that each element of the impact assessment framework is assessed separately, and parallel in the sense that, for each source of uncertainty, a number of equivalent estimating models are implemented and evaluated. The objectives of the model include identifying the sources of uncertainty to be included in the assessment model and determining the trade-offs in reducing the uncertainty due to major sources. The model would also be useful for uncertainty analysis of extreme weather impact assessments for other critical infrastructures.

  3. Uncertainty and sensitivity analysis of the retrieved essential climate variables from remotely sensed observations

    NASA Astrophysics Data System (ADS)

    Djepa, Vera; Badii, Atta

    2016-04-01

    The sensitivity of the weather and climate system to sea ice thickness (SIT), Sea Ice Draft (SID) and Snow Depth (SD) in the Arctic is recognized from various studies. A decrease in SIT will affect atmospheric circulation, temperature, precipitation and wind speed in the Arctic and beyond. Ice thermodynamics and dynamic properties depend strongly on sea Ice Density (ID) and SD. SIT, SID, ID and SD are sensitive to environmental changes in the Polar region and impact the climate system. For accurate forecasts of climate change, sea ice mass balance, ocean circulation and sea-atmosphere interactions, it is required to have long-term records of SIT, SID, SD and ID with error and uncertainty analyses. The SID, SIT, ID and freeboard (F) have been retrieved from the Radar Altimeter (RA) (on board ENVISAT) and the IceBridge Laser Altimeter (LA) and validated using over 10 years of collocated observations of SID and SD in the Arctic, provided by the European Space Agency (ESA CCI sea ice ECV project). Improved algorithms to retrieve SIT from LA and RA have been derived, applying statistical analysis. The snow depth is obtained from AMSR-E/Aqua and NASA IceBridge Snow Depth radar. The sea ice properties of pancake ice have been retrieved from ENVISAT/Synthetic Aperture Radar (ASAR). The uncertainties of the retrieved climate variables have been analysed and the impact of snow depth and sea ice density on retrieved SIT has been estimated. The sensitivity analysis illustrates the impact of uncertainties of input climate variables (ID and SD) on the accuracy of the retrieved output variables (SIT and SID). The developed methodology of uncertainty and sensitivity analysis is essential for assessment of the impact of environmental variables on climate change and better understanding of the relationship between input and output variables. The uncertainty analysis quantifies the uncertainties of the model results and the sensitivity analysis evaluates the contribution of each input variable to

  4. A Study of the Critical Uncertainty Contributions in the Analysis of PCBs in Ambient Air

    PubMed Central

    Brown, Andrew S.; Brown, Richard J. C.

    2008-01-01

    The measurement of polychlorinated biphenyls (PCBs) in ambient air requires a complex, multistep sample preparation procedure prior to analysis by gas chromatography—mass spectrometry (GC-MS). Although routine analytical laboratories regularly carry out these measurements, they are often undertaken with little regard to the accurate calculation of measurement uncertainty, or appreciation of the sensitivity of the accuracy of the measurement to each step of the analysis. A measurement equation is developed for this analysis, and the contributory sources to the overall uncertainty when preparing calibration standards and other solutions by gravimetric and volumetric approaches are discussed and compared. For the example analysis presented, it is found that the uncertainty of the measurement is dominated by the repeatability of the GC-MS analysis and suggested that volumetric (as opposed to gravimetric) preparation of solutions does not adversely affect the overall uncertainty. The methodology presented in this work can also be applied to analogous methods for similar analytes, for example, those used to measure polycyclic aromatic hydrocarbons (PAHs), pesticides, dioxins, or furans in ambient air. PMID:18528517

  5. Characterization and Uncertainty Analysis of a Reference Pressure Measurement System for Wind Tunnels

    NASA Technical Reports Server (NTRS)

    Amer, Tahani; Tripp, John; Tcheng, Ping; Burkett, Cecil; Sealey, Bradley

    2004-01-01

    This paper presents the calibration results and uncertainty analysis of a high-precision reference pressure measurement system currently used in wind tunnels at the NASA Langley Research Center (LaRC). Sensors, calibration standards, and measurement instruments are subject to errors due to aging, drift with time, environmental effects, transportation, the mathematical model, the calibration experimental design, and other factors. Errors occur at every link in the chain of measurements and data reduction from the sensor to the final computed results. At each link of the chain, bias and precision uncertainties must be separately estimated for facility use, and are combined to produce overall calibration and prediction confidence intervals for the instrument, typically at a 95% confidence level. The uncertainty analysis and calibration experimental designs used herein, based on techniques developed at LaRC, employ replicated experimental designs for efficiency, separate estimation of bias and precision uncertainties, and detection of significant parameter drift with time. Final results are presented, including calibration confidence intervals and prediction intervals given as functions of the applied inputs rather than as a fixed percentage of the full-scale value. System uncertainties are propagated from the initial reference pressure standard to the calibrated instrument as a working standard in the facility. Among the several parameters that can affect the overall results are operating temperature, atmospheric pressure, humidity, and facility vibration. Effects of factors such as initial zeroing and temperature are investigated. The effects of the identified parameters on system performance and accuracy are discussed.

  6. Damage functions for climate-related hazards: unification and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Prahl, Boris F.; Rybski, Diego; Boettle, Markus; Kropp, Jürgen P.

    2016-05-01

    Most climate change impacts manifest in the form of natural hazards. Damage assessment typically relies on damage functions that translate the magnitude of extreme events to a quantifiable damage. In practice, the availability of damage functions is limited due to a lack of data sources and a lack of understanding of damage processes. The study of the characteristics of damage functions for different hazards could strengthen the theoretical foundation of damage functions and support their development and validation. Accordingly, we investigate analogies of damage functions for coastal flooding and for wind storms and identify a unified approach. This approach has general applicability for granular portfolios and may also be applied, for example, to heat-related mortality. Moreover, the unification enables the transfer of methodology between hazards and a consistent treatment of uncertainty. This is demonstrated by a sensitivity analysis on the basis of two simple case studies (for coastal flood and storm damage). The analysis reveals the relevance of the various uncertainty sources at varying hazard magnitude and on both the microscale and the macroscale level. Main findings are the dominance of uncertainty from the hazard magnitude and the persistent behaviour of intrinsic uncertainties on both scale levels. Our results shed light on the general role of uncertainties and provide useful insight for the application of the unified approach.

  7. Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate

    NASA Technical Reports Server (NTRS)

    Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.

    2013-01-01

    There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With pressed budgets, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis only from computational data. The method uses current CFD uncertainty techniques coupled with the Student-t distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied by a specified tolerance or bias error, and the differences in the results are used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations and conclusions drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data is unavailable.
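
    A sketch of the general recipe: perturb each uncertain input by its tolerance, collect the resulting output differences, rank them, and form a small-sample interval with the Student-t distribution. The heat-transfer values below are placeholders rather than actual solver results, and the interval construction is a simplified stand-in for the full methodology.

```python
import numpy as np
from scipy import stats

# Hypothetical heat transfer coefficients [W/m^2-K] from CFD runs in which each
# uncertain input (inlet velocity, temperature, turbulence intensity, ...) was
# perturbed by its assumed tolerance or bias error, one at a time.
baseline = 52.0
perturbed = np.array([53.1, 50.8, 52.6, 51.2, 54.0, 51.7])

deltas = perturbed - baseline
ranking = np.argsort(-np.abs(deltas))               # most influential input first
print("input importance ranking:", ranking)

n = len(deltas)
s = deltas.std(ddof=1)
t95 = stats.t.ppf(0.975, df=n - 1)                  # small-sample Student-t factor
print(f"h = {baseline:.1f} +/- {t95 * s:.1f} W/m^2-K (95%, computational data only)")
```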

  8. Linking trading ratio with TMDL (total maximum daily load) allocation matrix and uncertainty analysis.

    PubMed

    Zhang, H X

    2008-01-01

    An innovative approach for total maximum daily load (TMDL) allocation and implementation is watershed-based pollutant trading. Given the inherent scientific uncertainty in the tradeoffs between point and nonpoint sources, setting of trading ratios can be a contentious issue and was already listed as an obstacle by several pollutant trading programs. One of the fundamental reasons that a trading ratio is often set higher (e.g. greater than 2) is to allow for uncertainty in the level of control needed to attain water quality standards, and to provide a buffer in case traded reductions are less effective than expected. However, most of the available studies did not provide an approach to explicitly address the determination of the trading ratio. Uncertainty analysis has rarely been linked to determination of the trading ratio. This paper presents a practical methodology for estimating the "equivalent trading ratio" (ETR) and links uncertainty analysis with trading ratio determination in the TMDL allocation process. Determination of ETR can provide a preliminary evaluation of "tradeoffs" between various combinations of point and nonpoint source control strategies on ambient water quality improvement. A greater portion of NPS load reduction in the overall TMDL load reduction generally correlates with greater uncertainty and thus requires a greater trading ratio. The rigorous quantification of the trading ratio will enhance the scientific basis, and thus public perception, for more informed decisions in an overall watershed-based pollutant trading program.

  9. A Methodology For Performing Global Uncertainty And Sensitivity Analysis In Systems Biology

    PubMed Central

    Marino, Simeone; Hogue, Ian B.; Ray, Christian J.; Kirschner, Denise E.

    2008-01-01

    Accuracy of results from mathematical and computer models of biological systems is often complicated by the presence of uncertainties in experimental data that are used to estimate parameter values. Current mathematical modeling approaches typically use either single-parameter or local sensitivity analyses. However, these methods do not accurately assess uncertainty and sensitivity in the system as, by default, they hold all other parameters fixed at baseline values. Using techniques described within, we demonstrate how a multi-dimensional parameter space can be studied globally so all uncertainties can be identified. Further, uncertainty and sensitivity analysis techniques can help to identify and ultimately control uncertainties. In this work we develop methods for applying existing analytical tools to perform analyses on a variety of mathematical and computer models. We compare two specific types of global sensitivity analysis indexes that have proven to be among the most robust and efficient. Through familiar and new examples of mathematical and computer models, we provide a complete methodology for performing these analyses, both in deterministic and stochastic settings, and propose novel techniques to handle problems encountered during this type of analysis.
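
    One of the global techniques commonly paired with Latin Hypercube Sampling is the partial rank correlation coefficient (PRCC); the sketch below computes PRCCs for a toy three-parameter model. The model, parameter bounds, and sample size are assumptions for illustration, not the examples used in the work described above.

```python
import numpy as np
from scipy.stats import qmc, rankdata

def prcc(X, y):
    """Partial rank correlation coefficient of each column of X with y (illustrative)."""
    R = np.column_stack([rankdata(col) for col in X.T])
    ry = rankdata(y)
    coeffs = []
    for j in range(R.shape[1]):
        others = np.delete(R, j, axis=1)
        Z = np.column_stack([np.ones(len(ry)), others])
        # Residuals of parameter j and of the output after removing the other parameters
        res_x = R[:, j] - Z @ np.linalg.lstsq(Z, R[:, j], rcond=None)[0]
        res_y = ry - Z @ np.linalg.lstsq(Z, ry, rcond=None)[0]
        coeffs.append(np.corrcoef(res_x, res_y)[0, 1])
    return np.array(coeffs)

# Toy model with three parameters sampled by Latin Hypercube
sampler = qmc.LatinHypercube(d=3, seed=2)
X = qmc.scale(sampler.random(500), [0.1, 0.1, 0.1], [1.0, 2.0, 5.0])
y = X[:, 0] ** 2 + 0.5 * X[:, 1] + 0.01 * np.random.default_rng(2).normal(size=500)

print(prcc(X, y).round(2))   # the third parameter should show negligible influence
```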

  10. Dynamic analysis of parametrically excited system under uncertainties and multi-frequency excitations

    NASA Astrophysics Data System (ADS)

    Wei, Sha; Han, Qinkai; Peng, Zhike; Chu, Fulei

    2016-05-01

    Some system parameters in mechanical systems are always uncertain due to uncertainties in geometric and material properties, lubrication condition and wear. For a more reasonable dynamic analysis of a parametrically excited system, the effect of uncertain parameters should be taken into account. This paper presents a new non-probabilistic analysis method for solving the dynamic responses of parametrically excited systems under uncertainties and multi-frequency excitations. By using the multi-dimensional harmonic balance method (MHBM) and the Chebyshev inclusion function (CIF), an interval multi-dimensional harmonic balance method (IMHBM) is obtained. To illustrate the accuracy of the proposed method, a time-varying geared system of a wind turbine with different kinds of uncertainties is demonstrated. By comparing with the results of the scanning method, it is shown that the presented method is valid and effective for the parametrically excited system with uncertainties and multi-frequency excitations. The effects of some uncertain system parameters, including uncertain mesh stiffnesses and uncertain bearing stiffnesses, on the frequency responses of the system are also discussed in detail. It is shown that the dynamic responses of the system are insensitive to the uncertain mesh stiffness and bearing stiffnesses of the planetary gear stage. The uncertain bearing stiffnesses of the intermediate and high-speed stages will lead to relatively large uncertainties in the dynamic responses around resonant regions. This will provide valuable guidance for the optimal design and condition monitoring of wind turbine gearboxes.

  11. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    NASA Astrophysics Data System (ADS)

    Pastore, Giovanni; Swiler, L. P.; Hales, J. D.; Novascone, S. R.; Perez, D. M.; Spencer, B. W.; Luzzi, L.; Van Uffelen, P.; Williamson, R. L.

    2015-01-01

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code with a recently implemented physics-based model for fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information in the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior predictions with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, significantly higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  12. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    SciTech Connect

    Pastore, Giovanni; Swiler, L. P.; Hales, Jason D.; Novascone, Stephen R.; Perez, Danielle M.; Spencer, Benjamin W.; Luzzi, Lelio; Uffelen, Paul Van; Williamson, Richard L.

    2014-10-12

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for the coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  13. Uncertainty analysis of primary water pollutant control in China's pulp and paper industry.

    PubMed

    Wen, Zong-guo; Di, Jing-han; Zhang, Xue-ying

    2016-03-15

    The total emission control target of water pollutants (e.g., COD and NH4-N) for a certain industrial sector can be predicted and analysed using the popular technology-based bottom-up modelling. However, this methodology has obvious uncertainty regarding the attainment of mitigation targets. The primary uncertainty comes from macro-production, pollutant reduction roadmap, and technical parameters. This research takes the paper and pulp industry in China as an example, and builds 5 mitigation scenarios via different combinations of raw material structure, scale structure, procedure mitigation technology, and end-of-pipe treatment technology. Using the methodology of uncertainty analysis via Monte Carlo, random sampling was conducted over a hundred thousand times. According to key parameters, sensitive parameters that impact total emission control targets such as industrial output, technique structure, cleaner production technology, and end-of-pipe treatment technology are discussed in this article. It appears that scenario uncertainty has a larger influence on COD emission than NH4-N, hence it is recommended that a looser total emission control target for COD is necessary to increase its feasibility and availability while maintaining the status quo of NH4-N. Consequently, from uncertainty analysis, this research recognizes the sensitive products, techniques, and technologies affecting industrial water pollution. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Parameter uncertainty analysis of non-point source pollution from different land use types.

    PubMed

    Shen, Zhen-yao; Hong, Qian; Yu, Hong; Niu, Jun-feng

    2010-03-15

    Land use type is one of the most important factors that affect the uncertainty in non-point source (NPS) pollution simulation. In this study, seventeen sensitive parameters were screened from the Soil and Water Assessment Tool (SWAT) model for parameter uncertainty analysis for different land use types in the Daning River Watershed of the Three Gorges Reservoir area, China. First-Order Error Analysis (FOEA) method was adopted to analyze the effect of parameter uncertainty on model outputs under three types of land use, namely, plantation, forest and grassland. The model outputs selected in this study consisted of runoff, sediment yield, organic nitrogen (N), and total phosphorus (TP). The results indicated that the uncertainty conferred by the parameters differed among the three land use types. In forest and grassland, the parameter uncertainty in NPS pollution was primarily associated with runoff processes, but in plantation, the main uncertain parameters were related to runoff process and soil properties. Taken together, the study suggested that adjusting the structure of land use and controlling fertilizer use are helpful methods to control the NPS pollution in the Daning River Watershed.
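
    For reference, FOEA propagates input variances through first-order (linearized) sensitivities, Var(y) ~ sum_i (dy/dx_i)^2 Var(x_i). The sketch below applies this with numerical partial derivatives to a made-up sediment-yield surrogate; the function, parameter values, and variances are hypothetical stand-ins for SWAT outputs and parameters.

```python
import numpy as np

def foea_variance(model, x0, var_x, eps=1e-4):
    """First-Order Error Analysis: propagate input variances through numerical
    partial derivatives, Var(y) ~= sum_i (dy/dx_i)^2 * Var(x_i)."""
    y0 = model(x0)
    grads = np.empty(len(x0))
    for i in range(len(x0)):
        xp = x0.copy()
        xp[i] += eps * max(abs(x0[i]), 1.0)
        grads[i] = (model(xp) - y0) / (xp[i] - x0[i])
    return y0, np.sum(grads ** 2 * var_x)

# Toy sediment-yield surrogate (illustrative stand-in for a watershed-model output)
def sediment_yield(x):
    curve_number, usle_k, slope = x
    return 0.02 * curve_number ** 1.5 * usle_k * (1.0 + 3.0 * slope)

x0 = np.array([75.0, 0.3, 0.08])                  # assumed parameter values
var_x = np.array([25.0, 0.002, 0.0004])           # assumed parameter variances

y0, var_y = foea_variance(sediment_yield, x0, var_x)
print(f"yield = {y0:.1f}, standard deviation from FOEA = {np.sqrt(var_y):.1f}")
```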

  15. Event-scale power law recession analysis: quantifying methodological uncertainty

    NASA Astrophysics Data System (ADS)

    Dralle, David N.; Karst, Nathaniel J.; Charalampous, Kyriakos; Veenstra, Andrew; Thompson, Sally E.

    2017-01-01

    The study of single streamflow recession events is receiving increasing attention following the presentation of novel theoretical explanations for the emergence of power law forms of the recession relationship, and drivers of its variability. Individually characterizing streamflow recessions often involves describing the similarities and differences between model parameters fitted to each recession time series. Significant methodological sensitivity has been identified in the fitting and parameterization of models that describe populations of many recessions, but the dependence of estimated model parameters on methodological choices has not been evaluated for event-by-event forms of analysis. Here, we use daily streamflow data from 16 catchments in northern California and southern Oregon to investigate how combinations of commonly used streamflow recession definitions and fitting techniques impact parameter estimates of a widely used power law recession model. Results are relevant to watersheds that are relatively steep, forested, and rain-dominated. The highly seasonal mediterranean climate of northern California and southern Oregon ensures study catchments explore a wide range of recession behaviors and wetness states, ideal for a sensitivity analysis. In such catchments, we show the following: (i) methodological decisions, including ones that have received little attention in the literature, can impact parameter value estimates and model goodness of fit; (ii) the central tendencies of event-scale recession parameter probability distributions are largely robust to methodological choices, in the sense that differing methods rank catchments similarly according to the medians of these distributions; (iii) recession parameter distributions are method-dependent, but roughly catchment-independent, such that changing the choices made about a particular method affects a given parameter in similar ways across most catchments; and (iv) the observed correlative relationship
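
    A minimal sketch of event-scale power-law recession fitting, -dQ/dt = a Q^b, estimated by log-log regression on a synthetic event. The finite-difference scheme, point pairing, and event definition are exactly the kinds of methodological choices the study examines; the ones below are arbitrary assumptions, not the study's preferred settings.

```python
import numpy as np

def fit_recession(q):
    """Fit -dQ/dt = a * Q^b to one recession event by log-log regression."""
    q = np.asarray(q, dtype=float)
    dqdt = np.diff(q)                      # daily backward difference (one choice of many)
    qmid = 0.5 * (q[:-1] + q[1:])          # pair each derivative with the interval mean
    mask = dqdt < 0                        # keep strictly receding steps
    b, log_a = np.polyfit(np.log(qmid[mask]), np.log(-dqdt[mask]), 1)
    return np.exp(log_a), b

# Synthetic recession event (not real gauge data); true exponent is 1.5
t = np.arange(20)
q = 10.0 * (1 + 0.3 * t) ** -2.0
a, b = fit_recession(q)
print(f"a = {a:.3f}, b = {b:.2f}")
```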

  16. Quantifying model-structure- and parameter-driven uncertainties in spring wheat phenology prediction with Bayesian analysis

    DOE PAGES

    Alderman, Phillip D.; Stanfill, Bryan

    2016-10-06

    Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. Here, this study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.

  17. Quantifying model-structure- and parameter-driven uncertainties in spring wheat phenology prediction with Bayesian analysis

    SciTech Connect

    Alderman, Phillip D.; Stanfill, Bryan

    2016-10-06

    Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. Here, this study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.
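
    A bare-bones random walk Metropolis sampler of the kind referred to, applied to a toy Gaussian likelihood; the observations, prior treatment, step sizes, and burn-in length are illustrative assumptions rather than the study's phenology models.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical observed days to heading for one genotype (toy data, not trial data)
obs = np.array([48.0, 52.0, 50.0, 55.0, 47.0])

def log_post(theta):
    """Unnormalized log posterior for (mu, log_sigma) with a flat-ish prior."""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    return -0.5 * np.sum(((obs - mu) / sigma) ** 2) - len(obs) * log_sigma

def random_walk_metropolis(log_post, theta0, step, n_iter=20_000):
    theta = np.array(theta0, dtype=float)
    lp = log_post(theta)
    chain = np.empty((n_iter, len(theta)))
    for i in range(n_iter):
        prop = theta + step * rng.normal(size=len(theta))
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:       # Metropolis accept/reject
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

chain = random_walk_metropolis(log_post, theta0=[50.0, 1.0], step=np.array([0.8, 0.2]))
posterior = chain[5_000:]                              # discard burn-in
print(posterior.mean(axis=0), np.percentile(posterior[:, 0], [2.5, 97.5]))
```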

  18. Rainfall or parameter uncertainty? The power of sensitivity analysis on grouped factors

    NASA Astrophysics Data System (ADS)

    Nossent, Jiri; Pereira, Fernando; Bauwens, Willy

    2017-04-01

    Hydrological models are typically used to study and represent (a part of) the hydrological cycle. In general, the output of these models mostly depends on their input rainfall and parameter values. Both the model parameters and the input precipitation, however, are characterized by uncertainties and therefore lead to uncertainty in the model output. Sensitivity analysis (SA) makes it possible to assess and compare the importance of the different factors for this output uncertainty. To this end, the rainfall uncertainty can be incorporated into the SA by representing it as a probabilistic multiplier. Such a multiplier can be defined for the entire time series, or several of these factors can be determined for every recorded rainfall pulse or for hydrologically independent storm events. As a consequence, the number of parameters included in the SA related to the rainfall uncertainty can be (much) lower or (much) higher than the number of model parameters. Although such analyses can yield interesting results, it remains challenging to determine which type of uncertainty will affect the model output most, because of the different weights the two types will have within the SA. In this study, we apply the variance-based Sobol' sensitivity analysis method to two different hydrological simulators (NAM and HyMod) for four diverse watersheds. Besides the different number of model parameters (NAM: 11 parameters; HyMod: 5 parameters), the setup of our combined sensitivity and uncertainty analysis is also varied by defining a variety of scenarios with different numbers of rainfall multipliers. To overcome the issue of the different number of factors and, thus, the different weights of the two types of uncertainty, we build on one of the advantageous properties of the Sobol' SA, i.e., treating grouped parameters as a single parameter. The latter results in a setup with a single factor for each uncertainty type and allows for a straightforward comparison of their importance. In general, the results show a clear
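
    The grouped-factor trick described above can be reproduced with a from-scratch Sobol' estimator. The sketch below is hypothetical: a toy function stands in for NAM or HyMod, three inputs play the role of rainfall multipliers and two the role of model parameters, and the Saltelli/Jansen estimators are applied to the two groups as if each were a single factor.

      import numpy as np

      rng = np.random.default_rng(1)
      N, d = 20000, 5
      groups = {"rainfall_multipliers": [0, 1, 2], "model_parameters": [3, 4]}

      def toy_model(x):
          # Stand-in for a rainfall-runoff simulator; all inputs scaled to [0, 1].
          return (x[:, 0] + x[:, 1] + x[:, 2]) * x[:, 3] + x[:, 4] ** 2

      A, B = rng.uniform(size=(N, d)), rng.uniform(size=(N, d))
      yA, yB = toy_model(A), toy_model(B)
      var_y = np.var(np.concatenate([yA, yB]))

      for name, cols in groups.items():
          AB = A.copy()
          AB[:, cols] = B[:, cols]                        # substitute the whole group at once
          yAB = toy_model(AB)
          first = np.mean(yB * (yAB - yA)) / var_y        # Saltelli (2010) first-order estimator
          total = 0.5 * np.mean((yA - yAB) ** 2) / var_y  # Jansen total-effect estimator
          print(f"{name}: S1 = {first:.2f}, ST = {total:.2f}")

    Because each group is perturbed as a block, the comparison no longer depends on how many multipliers or parameters the group happens to contain.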

  19. On the potential of uncertainty analysis for prediction of brake squeal propensity

    NASA Astrophysics Data System (ADS)

    Zhang, Zhi; Oberst, Sebastian; Lai, Joseph C. S.

    2016-09-01

    Brake squeal is a source of significant warranty-related claims for automotive manufacturers because it is annoying and is often perceived by customers as a safety concern. A brake squeal analysis is complex due to changing environmental and operating conditions, high sensitivity to manufacturing and assembly tolerances, and the poorly understood role of nonlinearities. Although brake squeal is essentially a nonlinear problem, the standard analysis tool in industry is the linear complex eigenvalue analysis (CEA), which may under-predict or over-predict the number of unstable vibration modes. A nonlinear instability analysis is more predictive than CEA but is still computationally too expensive to be used routinely in industry for a full brake finite element model. Also, although the net work analysis of a linearised brake system has shown potential in predicting the origin of brake squeal, it has not been extensively used. In this study, the net work of an analytical viscously damped self-excited 4-dof friction oscillator with cubic contact force nonlinearity is compared with the instability prediction using the CEA and a nonlinear instability analysis. Results show that both the net work analysis and CEA under-predict the instability because of their inability to detect the sub-critical Hopf bifurcation. Uncertainty analysis is then applied to examine whether it can improve the instability prediction of a nonlinear system using linear methods, and to establish its limitations. By applying a variance-based global sensitivity analysis to parameters of the oscillator, suitable candidates for an uncertainty analysis are identified. Results of uncertainty analyses obtained by applying polynomial chaos expansions to the net work and the CEA correlate well with those of the nonlinear analysis, hence demonstrating the potential of an uncertainty analysis in improving the prediction of brake squeal propensity using a linear method.
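
    The abstract's pairing of complex eigenvalue analysis with uncertainty analysis can be illustrated with a deliberately small model. The sketch below is not the paper's 4-dof oscillator or its polynomial chaos expansion: it uses a generic 2-dof mode-coupling model with an assumed friction coefficient and coupling stiffness, sampled by plain Monte Carlo, to estimate how often the linearised system is flutter-unstable.

      import numpy as np

      rng = np.random.default_rng(4)

      # Minimal 2-dof mode-coupling model: friction makes the stiffness matrix asymmetric,
      # and CEA flags instability when any eigenvalue has a positive real part.
      M = np.eye(2)
      C = 0.02 * np.eye(2)

      def unstable(mu, k_c):
          K = np.array([[3.0, -1.0 + mu * k_c],
                        [-1.0, 3.0]])
          A = np.block([[np.zeros((2, 2)), np.eye(2)],
                        [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
          return np.max(np.linalg.eigvals(A).real) > 0.0

      # Uncertainty analysis wrapped around the CEA: sample the friction coefficient and
      # the coupling stiffness, then estimate the probability of a squeal-prone design.
      mu_samples = rng.normal(0.5, 0.08, 5000)
      kc_samples = rng.normal(2.0, 0.3, 5000)
      p_unstable = np.mean([unstable(m, k) for m, k in zip(mu_samples, kc_samples)])
      print(f"estimated probability of linear instability: {p_unstable:.2f}")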

  20. Coherent uncertainty analysis of aerosol measurements from multiple satellite sensors

    NASA Astrophysics Data System (ADS)

    Petrenko, M.; Ichoku, C.

    2013-02-01

    Aerosol retrievals from multiple spaceborne sensors, including MODIS (on Terra and Aqua), MISR, OMI, POLDER, CALIOP, and SeaWiFS - altogether, a total of 11 different aerosol products - were comparatively analyzed using data collocated with ground-based aerosol observations from the Aerosol Robotic Network (AERONET) stations within the Multi-sensor Aerosol Products Sampling System (MAPSS, http://giovanni.gsfc.nasa.gov/mapss/ and http://giovanni.gsfc.nasa.gov/aerostat/). The analysis was performed by comparing quality-screened satellite aerosol optical depth or thickness (AOD or AOT) retrievals during 2006-2010 to available collocated AERONET measurements globally, regionally, and seasonally, and deriving a number of statistical measures of accuracy. We used a robust statistical approach to detect and remove possible outliers in the collocated data that can bias the results of the analysis. Overall, the proportion of outliers in each of the quality-screened AOD products was within 12%. Squared correlation coefficient (R2) values of the satellite AOD retrievals relative to AERONET exceeded 0.6, with R2 for most of the products exceeding 0.7 over land and 0.8 over ocean. Root mean square error (RMSE) values for most of the AOD products were within 0.15 over land and 0.09 over ocean. We have been able to generate global maps showing regions where the different products present advantages over the others, as well as the relative performance of each product over different landcover types. It was observed that while MODIS, MISR, and SeaWiFS provide accurate retrievals over most of the landcover types, multi-angle capabilities make MISR the only sensor to retrieve reliable AOD over barren and snow/ice surfaces. Likewise, active sensing enables CALIOP to retrieve aerosol properties over bright-surface shrublands more accurately than the
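
    The accuracy measures quoted above (outlier fraction, R2, RMSE) are straightforward to compute once satellite and AERONET AOD are collocated. The helper below is a hypothetical sketch, not the MAPSS code; it uses a median-absolute-deviation screen as one possible "robust statistical approach" to outlier removal before the statistics are derived.

      import numpy as np

      def validate_aod(satellite, aeronet, mad_threshold=3.5):
          # Drop points whose residual lies more than `mad_threshold` robust standard
          # deviations from the median residual, then compute accuracy statistics.
          resid = satellite - aeronet
          mad = np.median(np.abs(resid - np.median(resid)))
          robust_sd = 1.4826 * mad                      # MAD scaled to a Gaussian sigma
          keep = np.abs(resid - np.median(resid)) <= mad_threshold * robust_sd

          s, a = satellite[keep], aeronet[keep]
          rmse = np.sqrt(np.mean((s - a) ** 2))
          r2 = np.corrcoef(s, a)[0, 1] ** 2
          return {"n_outliers": int((~keep).sum()), "RMSE": rmse, "R2": r2}

      rng = np.random.default_rng(0)
      truth = rng.gamma(2.0, 0.1, size=500)             # synthetic AERONET AOD
      sat = truth + rng.normal(0.0, 0.05, size=500)     # synthetic satellite retrievals
      sat[:10] += 0.8                                   # a few gross outliers
      print(validate_aod(sat, truth))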

  1. Coherent uncertainty analysis of aerosol measurements from multiple satellite sensors

    NASA Astrophysics Data System (ADS)

    Petrenko, M.; Ichoku, C.

    2013-07-01

    Aerosol retrievals from multiple spaceborne sensors, including MODIS (on Terra and Aqua), MISR, OMI, POLDER, CALIOP, and SeaWiFS - altogether, a total of 11 different aerosol products - were comparatively analyzed using data collocated with ground-based aerosol observations from the Aerosol Robotic Network (AERONET) stations within the Multi-sensor Aerosol Products Sampling System (MAPSS, http://giovanni.gsfc.nasa.gov/mapss/ and http://giovanni.gsfc.nasa.gov/aerostat/). The analysis was performed by comparing quality-screened satellite aerosol optical depth or thickness (AOD or AOT) retrievals during 2006-2010 to available collocated AERONET measurements globally, regionally, and seasonally, and deriving a number of statistical measures of accuracy. We used a robust statistical approach to detect and remove possible outliers in the collocated data that can bias the results of the analysis. Overall, the proportion of outliers in each of the quality-screened AOD products was within 7%. Squared correlation coefficient (R2) values of the satellite AOD retrievals relative to AERONET exceeded 0.8 for many of the analyzed products, while root mean square error (RMSE) values for most of the AOD products were within 0.15 over land and 0.07 over ocean. We have been able to generate global maps showing regions where the different products present advantages over the others, as well as the relative performance of each product over different land cover types. It was observed that while MODIS, MISR, and SeaWiFS provide accurate retrievals over most of the land cover types, multi-angle capabilities make MISR the only sensor to retrieve reliable AOD over barren and snow/ice surfaces. Likewise, active sensing enables CALIOP to retrieve aerosol properties over bright-surface closed shrublands more accurately than the other sensors, while POLDER, which

  2. Coherent Uncertainty Analysis of Aerosol Measurements from Multiple Satellite Sensors

    NASA Technical Reports Server (NTRS)

    Petrenko, M.; Ichoku, C.

    2013-01-01

    Aerosol retrievals from multiple spaceborne sensors, including MODIS (on Terra and Aqua), MISR, OMI, POLDER, CALIOP, and SeaWiFS - altogether, a total of 11 different aerosol products - were comparatively analyzed using data collocated with ground-based aerosol observations from the Aerosol Robotic Network (AERONET) stations within the Multi-sensor Aerosol Products Sampling System (MAPSS, http://giovanni.gsfc.nasa.gov/mapss/ and http://giovanni.gsfc.nasa.gov/aerostat/). The analysis was performed by comparing quality-screened satellite aerosol optical depth or thickness (AOD or AOT) retrievals during 2006-2010 to available collocated AERONET measurements globally, regionally, and seasonally, and deriving a number of statistical measures of accuracy. We used a robust statistical approach to detect and remove possible outliers in the collocated data that can bias the results of the analysis. Overall, the proportion of outliers in each of the quality-screened AOD products was within 12%. Squared correlation coefficient (R2) values of the satellite AOD retrievals relative to AERONET exceeded 0.6, with R2 for most of the products exceeding 0.7 over land and 0.8 over ocean. Root mean square error (RMSE) values for most of the AOD products were within 0.15 over land and 0.09 over ocean. We have been able to generate global maps showing regions where the different products present advantages over the others, as well as the relative performance of each product over different landcover types. It was observed that while MODIS, MISR, and SeaWiFS provide accurate retrievals over most of the landcover types, multi-angle capabilities make MISR the only sensor to retrieve reliable AOD over barren and snow/ice surfaces. Likewise, active sensing enables CALIOP to retrieve aerosol properties over bright-surface shrublands more accurately than the other sensors, while POLDER, which is the only one of the sensors capable of measuring polarized aerosols, outperforms other sensors in

  3. Uncertainty quantification applied to the analysis and design of a Hypersonic Inflatable Aerodynamic Decelerator for spacecraft reentry

    NASA Astrophysics Data System (ADS)

    Brune, Andrew J.

    The primary objective of this research is to investigate the uncertainty in the multidisciplinary analysis of a Hypersonic Inflatable Aerodynamic Decelerator configuration for Mars entry, subject to uncertainty sources in the high-fidelity computational models and the operating conditions. Efficient uncertainty quantification methods based on stochastic expansions are applied to the analysis of the hypersonic flowfield, fluid-structure interaction, and flexible thermal protection system response of a deformable inflatable decelerator. Uncertainty analysis is first applied to the hypersonic flowfield simulations to quantify the uncertainty in the surface convective and radiative heat flux, pressure, and shear stress of a fixed inflatable decelerator, subject to uncertainties in the binary collision integrals of the transport properties, chemical kinetics, non-Boltzmann radiation modeling, and the freestream conditions. The uncertainty analysis for fluid-structure interaction modeling is conducted to quantify the uncertainty in the deflection and resulting surface heat flux, shear stress, and pressure of a deformable inflatable decelerator, subject to uncertainties in material structural properties, inflation pressure, and important flowfield uncertain variables identified in the initial study. The deflection uncertainty is shown to be primarily driven by the structural modeling uncertain variables and found to be insignificant in contributing to the resulting surface condition uncertainties. Uncertainty analysis is finally applied to the flexible thermal protection system bondline temperature for a ballistic Mars entry trajectory, subject to uncertainties in the material thermal properties and important flowfield variables from the initial study. The uncertainty in the bondline temperature is compared to its allowable temperature limit and shown to be primarily driven by the material thermal properties of the outer fabric and insulator layers, and the freestream

  4. Uncertainty in projected point precipitation extremes for hydrological impact analysis of climate change

    NASA Astrophysics Data System (ADS)

    Van Uytven, Els; Willems, Patrick

    2017-04-01

    Current trends in the hydro-meteorological variables indicate the potential impact of climate change on hydrological extremes. They therefore give increased importance to climate adaptation strategies in water management. The impact of climate change on hydro-meteorological and hydrological extremes is, however, highly uncertain. This is due to uncertainties introduced by the climate models, the internal variability inherent to the climate system, the greenhouse gas scenarios and the statistical downscaling methods. In view of the need to define sustainable climate adaptation strategies, there is a need to assess these uncertainties. This is commonly done by means of ensemble approaches. Because more and more climate models and statistical downscaling methods are becoming available, there is a need to facilitate the climate impact and uncertainty analysis. A Climate Perturbation Tool has been developed for that purpose, which combines a set of statistical downscaling methods including weather typing, weather generator, transfer function and advanced perturbation-based approaches. By use of an interactive interface, climate impact modelers can apply these statistical downscaling methods in a semi-automatic way to an ensemble of climate model runs. The tool is applicable to any region, but has so far been demonstrated for cases in Belgium, Suriname, Vietnam and Bangladesh. Time series representing future local-scale precipitation, temperature and potential evapotranspiration (PET) conditions were obtained, starting from time series of historical observations. Uncertainties in the future meteorological conditions are represented in two different ways: through an ensemble of time series, and a reduced set of synthetic scenarios. Both aim to span the full uncertainty range as assessed from the ensemble of climate model runs and downscaling methods. For Belgium, for instance, use was made of 100-year time series of 10-minute precipitation observations and daily

  5. U.S. Environmental Protection Agency radiogenic risk projections: uncertainty analysis.

    PubMed

    Pawel, David J

    2013-01-01

    The U.S. Environmental Protection Agency (EPA) has updated its estimates of cancer risks due to low doses of ionizing radiation for the U.S. population, as well as their scientific basis. For the most part, these estimates were calculated using models recommended in the recent National Academy of Sciences' (BEIR VII) report on health effects from low levels of ionizing radiation. The new risk assessment includes uncertainty bounds associated with the projections for gender and cancer site-specific lifetime attributable risks. For most cancer sites, these uncertainty bounds were calculated using probability distributions for BEIR VII model parameter values, derived from a novel Bayesian analysis of cancer incidence data from the atomic bomb survivor lifespan study (LSS) cohort and subjective distributions for other relevant sources of uncertainty. This approach allowed for quantification of uncertainties associated with: 1) the effect of sampling variability on inferences drawn from the LSS cohort about the linear dose response and its dependence on temporal factors such as age-at-exposure, 2) differences in the radiogenic risks in the Japanese LSS cohort versus the U.S. population, 3) dosimetry errors, and 4) several other non-sampling sources. Some of the uncertainty associated with how risk depends on dose and dose rate was also quantified. For uniform whole-body exposures of low-dose gamma radiation to the entire population, EPA's cancer incidence risk coefficients and corresponding 90% uncertainty intervals (Gy) are 9.55 × 10 (4.3 × 10 to 1.8 × 10) for males and 1.35 × 10 (6.5 × 10 to 2.5 × 10) for females, where the numbers in parentheses represent an estimated 90% uncertainty interval. For many individual cancer sites, risk coefficients differ from corresponding uncertainty bounds by factors of about three to five, although uncertainties are larger for cancers of the stomach, prostate, liver, and uterus. Uncertainty intervals for many, but not all

  6. Probabilistic and deterministic evaluation of uncertainty in a local scale multi-risk analysis

    NASA Astrophysics Data System (ADS)

    Lari, S.; Frattini, P.; Crosta, G. B.

    2009-04-01

    We performed a probabilistic multi-risk analysis (QPRA) at the local scale for a 420 km2 area surrounding the town of Brescia (Northern Italy). We calculated the expected annual loss in terms of economic damage and life loss, for a set of risk scenarios of flood, earthquake and industrial accident with different occurrence probabilities and different intensities. The territorial unit used for the study was the census parcel, of variable area, for which a large amount of data was available. Due to the lack of information related to the evaluation of the hazards, to the value of the exposed elements (e.g., residential and industrial area, population, lifelines, sensitive elements such as schools and hospitals) and to the process-specific vulnerability, and to a lack of knowledge of the processes (floods, industrial accidents, earthquakes), we assigned an uncertainty to the input variables of the analysis. For some variables, a homogeneous uncertainty was assigned over the whole study area, as for instance for the number of buildings of various typologies, and for the event occurrence probability. In other cases, as for phenomena intensity (e.g., depth of water during a flood) and probability of impact, the uncertainty was defined in relation to the census parcel area. In fact, by assuming some variables to be homogeneously distributed or averaged over the census parcels, we introduce a larger error for larger parcels. We propagated the uncertainty in the analysis using three different models, describing the reliability of the output (risk) as a function of the uncertainty of the inputs (scenarios and vulnerability functions). We developed a probabilistic approach based on Monte Carlo simulation, and two deterministic models, namely First Order Second Moment (FOSM) and Point Estimate (PE). In general, similar values of expected losses are obtained with the three models. The uncertainty of the final risk value is, in all three cases, around 30% of the expected value. Each of the models
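
    The comparison of Monte Carlo against First Order Second Moment propagation reported above can be mimicked on a toy loss model. In the sketch below, the three inputs (occurrence probability, exposed value, vulnerability) and their uncertainties are invented for illustration and are not the Brescia data; FOSM linearises the model about the mean with finite-difference sensitivities.

      import numpy as np

      # Toy annual-loss model: loss = occurrence probability * exposed value * vulnerability.
      mu = np.array([0.01, 5.0e6, 0.3])
      sd = np.array([0.003, 1.0e6, 0.08])

      def loss(x):
          return x[..., 0] * x[..., 1] * x[..., 2]

      # Monte Carlo propagation (inputs kept simple as independent normals).
      rng = np.random.default_rng(3)
      samples = rng.normal(mu, sd, size=(100000, 3))
      mc = loss(samples)
      print("MC   :", mc.mean(), mc.std())

      # First Order Second Moment (FOSM): linearise around the mean values.
      grad = np.empty(3)
      for i in range(3):
          h = 1e-6 * max(abs(mu[i]), 1.0)
          xp, xm = mu.copy(), mu.copy()
          xp[i] += h
          xm[i] -= h
          grad[i] = (loss(xp) - loss(xm)) / (2 * h)
      fosm_mean = loss(mu)
      fosm_sd = np.sqrt(np.sum((grad * sd) ** 2))
      print("FOSM :", fosm_mean, fosm_sd)

    For a smooth, nearly multiplicative model such as this one, the two estimates of the output spread stay close, which mirrors the study's finding that the different propagation models gave similar expected losses.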

  7. Uncertainty analysis of the Measured Performance Rating (MPR) method. Final report

    SciTech Connect

    Not Available

    1993-11-01

    A report was commissioned by the New York State Energy Research and Development Authority and the Electric Power Research Institute to evaluate the uncertainties in the energy monitoring method known as measured performance rating (MPR). The work is intended to help further development of the MPR system by quantitatively analyzing the uncertainties in estimates of the heat loss coefficients and heating system efficiencies. The analysis indicates that the MPR should be able to detect as little as a 7 percent change in the heat loss coefficients and heating system efficiencies at the 95 percent confidence level. MPR appears sufficiently robust for characterizing common weatherization treatments; e.g., increasing attic insulation from R-7 to R-19 in a typical single-story, 1,100 sq. ft. house resulting in a 19 percent reduction in heat loss coefficient. Furnace efficiency uncertainties ranged up to three times those of the heat loss coefficients. Measurement uncertainties (at the 95 percent confidence level) were estimated to be from 1 to 5 percent for heat loss coefficients and 1.5 percent for a typical furnace efficiency. The analysis also shows a limitation in applying MPR to houses with heating ducts in slabs on grade and to those with very large thermal mass. Most of the uncertainties encountered in the study were due more to the methods of estimating the "true" heat loss coefficients, furnace efficiency, and furnace fuel consumption (by collecting fuel bills and simulating two actual houses) than to the MPR approach. These uncertainties in the true parameter values become arguments in favor of the need for empirical measures of heat loss coefficient and furnace efficiency, like the MPR method, rather than arguments against.

  8. Sensitivity and uncertainty analysis of estimated soil hydraulic parameters for simulating soil water content

    NASA Astrophysics Data System (ADS)

    Gupta, Manika; Garg, Naveen Kumar; Srivastava, Prashant K.

    2014-05-01

    A sensitivity and uncertainty analysis was carried out for the scalar parameters (soil hydraulic parameters, SHPs) that govern the simulation of soil water content in the unsaturated soil zone. The study involves field experiments, which were conducted in real field conditions for a wheat crop in Roorkee, India, under irrigated conditions. Soil samples were taken over a 60 cm deep soil profile at 15 cm intervals in the experimental field to determine soil water retention curves (SWRCs). These experimentally determined SWRCs were used to estimate the SHPs by least-squares optimization under constrained conditions. The sensitivity of the SHPs estimated by various pedotransfer functions (PTFs), which relate easily measurable soil properties such as soil texture, bulk density and organic carbon content, is compared with that of the laboratory-derived parameters in simulating the respective soil water retention curves. Sensitivity analysis was carried out using Monte Carlo simulations and the one-factor-at-a-time approach. The different sets of SHPs, along with experimentally determined saturated permeability, are then used as input parameters in a physically based root water uptake model to ascertain the uncertainties in simulating soil water content. The generalised likelihood uncertainty estimation (GLUE) procedure was subsequently used to estimate the uncertainty bounds (UB) on the model predictions. It was found that the experimentally obtained SHPs were able to simulate the soil water contents with efficiencies of 70-80% at all the depths for the three irrigation treatments. The SHPs obtained from the PTFs performed with varying uncertainties in simulating the soil water contents. Keywords: Sensitivity analysis, Uncertainty estimation, Pedotransfer functions, Soil hydraulic parameters, Hydrological modelling
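
    The GLUE step mentioned above follows a standard recipe: sample many parameter sets, score each with an informal likelihood, keep the "behavioural" ones, and form likelihood-weighted quantiles of the simulations. The sketch below is hypothetical throughout; an exponential drydown curve stands in for the soil-water model and the Nash-Sutcliffe efficiency is used as the informal likelihood.

      import numpy as np

      rng = np.random.default_rng(7)

      def simulate_swc(params, t):
          # Stand-in for the soil-water model: exponential drydown after irrigation.
          theta_s, k = params
          return theta_s * np.exp(-k * t)

      t = np.linspace(0, 30, 31)                                  # days after irrigation
      observed = simulate_swc((0.35, 0.05), t) + rng.normal(0, 0.01, t.size)

      # 1. Sample candidate soil hydraulic parameters from broad prior ranges.
      n = 20000
      candidates = np.column_stack([rng.uniform(0.25, 0.45, n),   # saturated water content
                                    rng.uniform(0.01, 0.10, n)])  # drydown rate

      # 2. Informal likelihood (Nash-Sutcliffe efficiency); keep behavioural sets only.
      sims = np.array([simulate_swc(p, t) for p in candidates])
      nse = 1 - np.sum((sims - observed) ** 2, axis=1) / np.sum((observed - observed.mean()) ** 2)
      behavioural = nse > 0.7
      weights = nse[behavioural] / nse[behavioural].sum()

      # 3. Likelihood-weighted 5-95% uncertainty bounds on the simulated series.
      def weighted_quantile(values, q, w):
          order = np.argsort(values)
          cdf = np.cumsum(w[order])
          return np.interp(q, cdf, values[order])

      bounds = np.array([[weighted_quantile(sims[behavioural, i], q, weights)
                          for q in (0.05, 0.95)] for i in range(t.size)])
      print(bounds[:3])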

  9. FORMAL UNCERTAINTY ANALYSIS OF A LAGRANGIAN PHOTOCHEMICAL AIR POLLUTION MODEL. (R824792)

    EPA Science Inventory

    This study applied Monte Carlo analysis with Latin hypercube sampling to evaluate the effects of uncertainty in air parcel trajectory paths, emissions, rate constants, deposition affinities, mixing heights, and atmospheric stability on predictions from a vertically...
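
    Latin hypercube sampling of the kind used in this study is routinely available in scientific libraries. A minimal, hypothetical setup (the input names, ranges, and placeholder model below are not the study's) using scipy's quasi-Monte Carlo module might look like this:

      import numpy as np
      from scipy.stats import qmc

      names = ["emission_scale", "rate_constant_scale", "mixing_height_m", "deposition_scale"]
      lower = [0.5, 0.7, 300.0, 0.5]
      upper = [1.5, 1.3, 1500.0, 2.0]

      sampler = qmc.LatinHypercube(d=len(names), seed=0)
      unit = sampler.random(n=200)                 # 200 stratified points in [0, 1)^d
      design = qmc.scale(unit, lower, upper)       # rescale to the physical ranges

      def air_quality_model(x):
          # Placeholder for the photochemical model: any scalar response works here.
          return x[:, 0] * x[:, 1] * 1000.0 / x[:, 2] * x[:, 3]

      response = air_quality_model(design)
      print(np.percentile(response, [5, 50, 95]))

    Stratifying each input in this way usually gives stable output percentiles with far fewer model runs than simple random sampling.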

  10. Bayesian Uncertainty Analysis of PBPK Model Predictions for Permethrin in Rats

    EPA Science Inventory

    Uncertainty analysis of human physiologically-based pharmacokinetic (PBPK) model predictions can pose a significant challenge due to data limitations. As a result of these limitations, human models are often derived from extrapolated animal PBPK models, for which there is usuall...

  11. Overview and application of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) toolbox

    USDA-ARS?s Scientific Manuscript database

    For several decades, optimization and sensitivity/uncertainty analysis of environmental models has been the subject of extensive research. Although much progress has been made and sophisticated methods developed, the growing complexity of environmental models to represent real-world systems makes it...

  12. Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin

    EPA Science Inventory

    We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...

  13. A PROBABILISTIC APPROACH FOR ANALYSIS OF UNCERTAINTY IN THE EVALUATION OF WATERSHED MANAGEMENT PRACTICES

    EPA Science Inventory

    A computational framework is presented for analyzing the uncertainty in model estimates of water quality benefits of best management practices (BMPs) in two small (<10 km2) watersheds in Indiana. The analysis specifically recognizes the significance of the difference b...

  14. Uncertainty analysis of an irrigation scheduling model for water management in crop production

    USDA-ARS?s Scientific Manuscript database

    Irrigation scheduling tools are critical to allow producers to manage water resources for crop production in an accurate and timely manner. To be useful, these tools need to be accurate, complete, and relatively reliable. The current work presents the uncertainty analysis and its results for the Mis...

  15. Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin

    EPA Science Inventory

    We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...

  16. A PROBABILISTIC APPROACH FOR ANALYSIS OF UNCERTAINTY IN THE EVALUATION OF WATERSHED MANAGEMENT PRACTICES

    EPA Science Inventory

    A computational framework is presented for analyzing the uncertainty in model estimates of water quality benefits of best management practices (BMPs) in two small (<10 km2) watersheds in Indiana. The analysis specifically recognizes the significance of the difference b...

  17. Uncertainty analysis for multivariate state estimation in safety-critical and mission-critical maintenance applications

    SciTech Connect

    Zavaljevski, N.; Gross, K. C.

    2000-04-03

    The Multivariate State Estimation Technique (MSET) has been developed at Argonne National Laboratory (ANL) and applied for real time surveillance applications for the purposes of signal validation, sensor operability validation, equipment health monitoring, incipient component fault annunciation, and process anomaly identification. Although MSET was originally developed for applications in the commercial nuclear industry, it has recently been spun off for applications in fields such as aerospace, manufacturing, transportation, robotics, and ship propulsion. Notwithstanding these types of successful applications of MSET in industry, it is necessary for safety-critical and mission-critical applications of MSET to have reliability analysis methods, including a propagation-of-uncertainty tool, which is needed to support safety evaluations in a variety of industries, and technical-specification-change requests in the case of the nuclear industry. For these and related applications, a general purpose uncertainty analysis tool for MSET has been developed that uses Monte Carlo simulation with Latin Hypercube Sampling. For any new application of MSET, the uncertainty analysis tool developed here may be used to investigate quantitative propagation-of-uncertainty behavior for all sensors under surveillance. In addition to supporting safety analysis of surveillance systems that are based on MSET, the tool developed here can be employed in parametric studies to support system designers in evaluating the relative value of adding new sensors to an engineering system during early design stages or for equipment or facility upgrades.

  18. Calibration of Uncertainty Analysis of the SWAT Model Using Genetic Algorithms and Bayesian Model Averaging

    USDA-ARS?s Scientific Manuscript database

    In this paper, the Genetic Algorithms (GA) and Bayesian model averaging (BMA) were combined to simultaneously conduct calibration and uncertainty analysis for the Soil and Water Assessment Tool (SWAT). In this hybrid method, several SWAT models with different structures are first selected; next GA i...

  19. FORMAL UNCERTAINTY ANALYSIS OF A LAGRANGIAN PHOTOCHEMICAL AIR POLLUTION MODEL. (R824792)

    EPA Science Inventory

    This study applied Monte Carlo analysis with Latin hypercube sampling to evaluate the effects of uncertainty in air parcel trajectory paths, emissions, rate constants, deposition affinities, mixing heights, and atmospheric stability on predictions from a vertically...

  20. The Model Optimization, Uncertainty, and SEnsitivity analysis (MOUSE) toolbox: overview and application

    USDA-ARS?s Scientific Manuscript database

    For several decades, optimization and sensitivity/uncertainty analysis of environmental models has been the subject of extensive research. Although much progress has been made and sophisticated methods developed, the growing complexity of environmental models to represent real-world systems makes it...

  1. Efficiency of Analytical Methodologies in Uncertainty Analysis of Seismic Core Damage Frequency

    NASA Astrophysics Data System (ADS)

    Kawaguchi, Kenji; Uchiyama, Tomoaki; Muramatsu, Ken

    Fault Tree and Event Tree analysis is almost exclusively relied upon in assessments of seismic Core Damage Frequency (CDF). In this approach, the Direct Quantification of Fault trees using Monte Carlo simulation (DQFM) method, also simply called the Monte Carlo (MC) method, and the Binary Decision Diagram (BDD) method were introduced as alternatives to the traditional approximation method, namely the Minimal Cut Set (MCS) method. However, there is still no agreement as to which method should be used in a risk assessment of seismic CDF, especially for uncertainty analysis. The purpose of this study is to examine the efficiency of the three methods in uncertainty analysis as well as in point estimation, so that a proper method can be selected effectively. The results show that the most efficient method is the BDD method in terms of accuracy and computational time. However, it is discussed that the BDD method is not always applicable to PSA models, whereas the MC method is in theory. In turn, the MC method was confirmed to agree with the exact solution obtained by the BDD method, but it required a large amount of computation time, in particular for uncertainty analysis. On the other hand, it was shown that the approximation error of the MCS method may not be as severe in uncertainty analysis as it is in point estimation. Based on these results and previous works, this paper proposes a scheme for selecting an appropriate analytical method for a seismic PSA study. Throughout this study, the SECOM2-DQFM code was extended to utilize the BDD method and to conduct uncertainty analysis with both the MC and BDD methods.
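
    The gap between the minimal-cut-set approximation and direct Monte Carlo quantification discussed above is easy to see on a toy fault tree with the high basic-event probabilities typical of seismic PSA. The sketch below is illustrative only (it is not the SECOM2-DQFM model and omits the BDD alternative): the rare-event approximation sums cut-set probabilities, while the Monte Carlo estimate evaluates the tree logic exactly on sampled basic-event states.

      import numpy as np

      rng = np.random.default_rng(11)

      # Small illustrative fault tree: TOP = (E1 and E2) or (E1 and E3) or E4.
      p = np.array([0.3, 0.2, 0.25, 0.05])          # basic-event failure probabilities
      cut_sets = [(0, 1), (0, 2), (3,)]

      # Minimal-cut-set (rare-event) approximation: sum of cut-set probabilities.
      mcs_approx = sum(np.prod(p[list(cs)]) for cs in cut_sets)

      # Direct quantification by Monte Carlo: sample basic-event states and evaluate
      # the tree logic exactly (the DQFM idea in miniature).
      n = 1_000_000
      states = rng.uniform(size=(n, p.size)) < p
      top = np.zeros(n, dtype=bool)
      for cs in cut_sets:
          top |= states[:, list(cs)].all(axis=1)
      mc_estimate = top.mean()

      print(f"MCS approximation: {mcs_approx:.4f}")
      print(f"Monte Carlo      : {mc_estimate:.4f}")

    With probabilities this large the cut-set sum noticeably overestimates the top-event probability, which is why the choice of quantification method matters for seismic CDF.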

  2. A single-loop optimization method for reliability analysis with second order uncertainty

    NASA Astrophysics Data System (ADS)

    Xie, Shaojun; Pan, Baisong; Du, Xiaoping

    2015-08-01

    Reliability analysis may involve random variables and interval variables. In addition, some of the random variables may have interval distribution parameters owing to limited information. This kind of uncertainty is called second order uncertainty. This article develops an efficient reliability method for problems involving the three aforementioned types of uncertain input variables. The analysis produces the maximum and minimum reliability and is computationally demanding because two loops are needed: a reliability analysis loop with respect to random variables and an interval analysis loop for extreme responses with respect to interval variables. The first order reliability method and nonlinear optimization are used for the two loops, respectively. For computational efficiency, the two loops are combined into a single loop by treating the Karush-Kuhn-Tucker (KKT) optimal conditions of the interval analysis as constraints. Three examples are presented to demonstrate the proposed method.

  3. Parametric uncertainty analysis of pulse wave propagation in a model of a human arterial network

    SciTech Connect

    Xiu, Dongbin; Sherwin, Spencer J.

    2007-10-01

    Reduced models of human arterial networks are an efficient approach to analyze quantitative macroscopic features of human arterial flows. The justification for such models typically arises from the significantly long wavelength associated with the system in comparison to the lengths of arteries in the networks. Although these types of models have been employed extensively and many issues associated with their implementations have been widely researched, the issue of data uncertainty has received comparatively little attention. Similar to many biological systems, a large amount of uncertainty exists in the value of the parameters associated with the models. Clearly, a reliable assessment of the system behaviour cannot be made unless the effect of such data uncertainty is quantified. In this paper we present a study of parametric data uncertainty in reduced modelling of human arterial networks, which is governed by a hyperbolic system. The uncertain parameters are modelled as random variables and the governing equations for the arterial network therefore become stochastic. This type of stochastic hyperbolic system has not previously been studied systematically, owing to the difficulties introduced by the uncertainty, such as a potential change in the mathematical character of the system and the imposition of boundary conditions. We demonstrate how the application of a high-order stochastic collocation method based on the generalized polynomial chaos expansion, combined with a discontinuous Galerkin spectral/hp element discretization in physical space, can successfully simulate this type of hyperbolic system subject to uncertain inputs with bounds. Building upon a numerical study of propagation of uncertainty and sensitivity in a simplified model with a single bifurcation, a systematic parameter sensitivity analysis is conducted on the wave dynamics in a multiple bifurcating human arterial network. Using the physical understanding of the dynamics of pulse waves in these types of
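
    A one-dimensional stochastic collocation calculation conveys the flavour of the generalized polynomial chaos approach described above without the spatial discretization. The sketch below is hypothetical: a Moens-Korteweg-type wave-speed formula with an assumed uniformly distributed elastic modulus replaces the full arterial-network solver, and Gauss-Legendre nodes serve as the collocation points.

      import numpy as np

      # Pulse wave speed c = sqrt(E * h / (rho * D)) with E ~ Uniform(a, b); values illustrative.
      h, D, rho = 1.5e-3, 2.0e-2, 1060.0            # wall thickness (m), diameter (m), density (kg/m3)
      a, b = 3.0e5, 5.0e5                           # bounds on the elastic modulus E (Pa)

      def wave_speed(E):
          return np.sqrt(E * h / (rho * D))

      # Stochastic collocation: run the model at Gauss-Legendre nodes and combine the
      # results with quadrature weights to obtain output moments.
      nodes, weights = np.polynomial.legendre.leggauss(5)
      E_nodes = 0.5 * (b - a) * nodes + 0.5 * (a + b)        # map [-1, 1] to [a, b]
      c_nodes = wave_speed(E_nodes)
      mean = 0.5 * np.sum(weights * c_nodes)                 # expectation under the uniform density
      var = 0.5 * np.sum(weights * c_nodes ** 2) - mean ** 2

      # Monte Carlo reference.
      rng = np.random.default_rng(5)
      c_mc = wave_speed(rng.uniform(a, b, 200000))
      print(mean, np.sqrt(var), "| MC:", c_mc.mean(), c_mc.std())

    Five deterministic model runs reproduce the moments that Monte Carlo needs many thousands of samples to match, which is the practical appeal of collocation for expensive solvers.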

  4. Uncertainty Determination Methodology, Sampling Maps Generation and Trend Studies with Biomass Thermogravimetric Analysis

    PubMed Central

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    This paper investigates a method for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG analysis) for several lignocellulosic materials (ground olive stone, almond shell, pine pellets and oak pellets), completing previous work of the same authors. A comparison has been made between results of TG analysis and prompt analysis. Levels of uncertainty and errors were obtained, demonstrating that properties evaluated by TG analysis were representative of the overall fuel composition, and no correlation between prompt and TG analysis exists. Additionally, a study of trends and time correlations is indicated. These results are particularly interesting for biomass energy applications. PMID:21152292

  5. Uncertainty determination methodology, sampling maps generation and trend studies with biomass thermogravimetric analysis.

    PubMed

    Pazó, Jose A; Granada, Enrique; Saavedra, Angeles; Eguía, Pablo; Collazo, Joaquín

    2010-09-28

    This paper investigates a method for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG analysis) for several lignocellulosic materials (ground olive stone, almond shell, pine pellets and oak pellets), completing previous work of the same authors. A comparison has been made between results of TG analysis and prompt analysis. Levels of uncertainty and errors were obtained, demonstrating that properties evaluated by TG analysis were representative of the overall fuel composition, and no correlation between prompt and TG analysis exists. Additionally, a study of trends and time correlations is indicated. These results are particularly interesting for biomass energy applications.

  6. Dissertation Defense: Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    NASA Technical Reports Server (NTRS)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics predictions
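
    The three-grid procedure referred to above is commonly summarised through Richardson extrapolation and a grid convergence index (GCI). The sketch below is a generic grid-convergence calculation with invented numbers, not the dissertation's results or the exact procedure of the cited reference.

      import math

      # Three systematically refined grids (coarse, medium, fine) and the peak airflow
      # speed predicted on each; the numbers are illustrative.
      h = [4.0, 2.0, 1.0]                 # representative cell sizes
      f = [12.9, 11.8, 11.4]              # predicted airflow speed (m/s)

      r = h[0] / h[1]                     # refinement ratio (assumed constant)
      # Observed order of accuracy from the three solutions.
      p = math.log(abs((f[0] - f[1]) / (f[1] - f[2]))) / math.log(r)

      # Richardson extrapolation to zero cell size and a grid convergence index on the
      # fine grid, using the customary safety factor of 1.25.
      f_exact = f[2] + (f[2] - f[1]) / (r ** p - 1)
      gci_fine = 1.25 * abs((f[1] - f[2]) / f[2]) / (r ** p - 1)

      print(f"observed order p   = {p:.2f}")
      print(f"extrapolated value = {f_exact:.2f} m/s")
      print(f"GCI (fine grid)    = {100 * gci_fine:.1f}%")

    The GCI then serves as the numerical-error band placed around the fine-grid airflow prediction.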

  7. Dissertation Defense Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    NASA Technical Reports Server (NTRS)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics

  8. Analysis of the uncertainties in velocity measurements with triple hot-wire probes

    NASA Technical Reports Server (NTRS)

    Frota, M. N.; Moffat, R. J.

    1984-01-01

    A detailed computerized sensitivity analysis of the triple hot-wire equations has been performed in order to delineate the uncertainties associated with measurements of the velocity components. Absolute and relative uncertainties for the instantaneous hot-wire outputs are calculated as functions of roll and pitch angles, based on a constant probability combination of the uncertainties in the inputs. From the results, it is concluded that the small inherent difficulties associated with the triple hot-wire data do not reflect artifacts introduced by the data processing. Fixed errors present in the V and W channels of the output are due to the nonzero measuring volume of the triple wire probe, and are entirely predictable.
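
    The "constant probability combination of the uncertainties in the inputs" mentioned above is the classic root-sum-square propagation of measurement uncertainties. The sketch below applies it to a generic single-wire King's-law velocity with finite-difference sensitivities; the calibration constants and uncertainty values are invented, and the full triple-wire equations are not reproduced.

      import numpy as np

      # King's-law velocity from a single hot-wire: U = ((E^2 - A) / B) ** (1 / n).
      def velocity(x):
          E, A, B, n = x
          return ((E ** 2 - A) / B) ** (1.0 / n)

      x0 = np.array([2.1, 1.6, 0.9, 0.45])      # nominal E (V), A, B, n
      u = np.array([0.01, 0.02, 0.02, 0.01])    # uncertainties in the same order

      # Root-sum-square (constant-odds) combination with central-difference sensitivities.
      grad = np.empty_like(x0)
      for i in range(x0.size):
          h = 1e-6 * max(abs(x0[i]), 1.0)
          xp, xm = x0.copy(), x0.copy()
          xp[i] += h
          xm[i] -= h
          grad[i] = (velocity(xp) - velocity(xm)) / (2 * h)

      U_nom = velocity(x0)
      U_unc = np.sqrt(np.sum((grad * u) ** 2))
      print(f"U = {U_nom:.2f} m/s, uncertainty = {U_unc:.2f} m/s "
            f"({100 * U_unc / U_nom:.1f}% relative)")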

  9. Developing Uncertainty Models for Robust Flutter Analysis Using Ground Vibration Test Data

    NASA Technical Reports Server (NTRS)

    Potter, Starr; Lind, Rick; Kehoe, Michael W. (Technical Monitor)

    2001-01-01

    A ground vibration test can be used to obtain information about structural dynamics that is important for flutter analysis. Traditionally, this information, such as natural frequencies of modes, is used to update analytical models used to predict flutter speeds. The ground vibration test can also be used to obtain uncertainty models, such as natural frequencies and their associated variations, that can update analytical models for the purpose of predicting robust flutter speeds. Analyzing test data using the ∞-norm, rather than the traditional 2-norm, is shown to lead to a minimum-size uncertainty description and, consequently, a least-conservative robust flutter speed. This approach is demonstrated using ground vibration test data for the Aerostructures Test Wing. Different norms are used to formulate uncertainty models and their associated robust flutter speeds to evaluate which norm is least conservative.

  10. How uncertainty analysis in ecological risk assessment is used in the courtroom

    SciTech Connect

    Hacker, C.; Watson, J.

    1995-12-31

    The prevalence of uncertainty analysis in environmental decision-making is increasing. Specific methods for estimating and expressing uncertainty are available and continually being improved. Although these methods are intended to provide a measure of the suitability of the data upon which a decision is based, their application in litigation may result in outcomes that are unanticipated by some in the scientific community. This divergence between those estimating uncertainty in assessing ecological risk and those judging its application can be attributed in part to the different ways evidence is used in science and law. This presentation will explain how scientific evidence is used in the courtroom. This explanation will use examples from case law to describe how courts decide who can be qualified to present evidence, what evidence can be presented, and how this evidence will be used in reaching a decision.

  11. Use of SUSA in Uncertainty and Sensitivity Analysis for INL VHTR Coupled Codes

    SciTech Connect

    Gerhard Strydom

    2010-06-01

    The need for a defensible and systematic Uncertainty and Sensitivity approach that conforms to the Code Scaling, Applicability, and Uncertainty (CSAU) process, and that could be used for a wide variety of software codes, was defined in 2008. The GRS (Gesellschaft für Anlagen- und Reaktorsicherheit) company of Germany has developed one type of CSAU approach that is particularly well suited for legacy coupled core analysis codes, and a trial version of their commercial software product SUSA (Software for Uncertainty and Sensitivity Analyses) was acquired on May 12, 2010. This interim milestone report provides an overview of the current status of the implementation and testing of SUSA at the INL VHTR Project Office.

  12. An Uncertainty Analysis for Predicting Soil Profile Salinity Using EM Induction Data

    NASA Astrophysics Data System (ADS)

    Huang, Jingyi; Monteiro Santos, Fernando; Triantafilis, John

    2016-04-01

    Proximal soil sensing techniques such as electromagnetic (EM) induction have been used to identify and map the areal variation of average soil properties. However, soil varies with depth owing to the action of various soil forming factors (e.g., parent material and topography). In this work we collected EM data using an EM38 and an EM34 meter along a 22-km transect in the Trangie District, Australia. We jointly inverted these data using EM4Soil software and compared our two-dimensional model of true electrical conductivity (sigma - mS/m) with depth against measured electrical conductivity of a saturated soil-paste extract (ECe - dS/m) at depths of 0-16 m. Through the use of a linear regression (LR) model and by varying forward modelling algorithms (cumulative function and full solution), inversion algorithms (S1 and S2), and damping factor (lambda), we determined a suitable electromagnetic conductivity image (EMCI), which was optimal when using the full solution, S2 and lambda = 0.6. To evaluate the uncertainty of the inversion process and the LR model, we conducted an uncertainty analysis. The distribution of the model misfit shows that the largest uncertainty caused by inversion (mostly due to EM34-40) occurs in the deeper parts of the profiles, while the largest uncertainty of the LR model occurs where the soil profile is most saline. These uncertainty maps also illustrate how the model accuracy can be improved in the future.

  13. A methodology for formulating a minimal uncertainty model for robust control system design and analysis

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert

    1989-01-01

    In the design and analysis of robust control systems for uncertain plants, the technique of formulating what is termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents the transfer function matrix M(s) of the nominal system, and delta represents an uncertainty matrix acting on M(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or multiple unstructured uncertainties from unmodeled dynamics and other neglected phenomena. In general, delta is a block diagonal matrix, and for real parameter variations the diagonal elements are real. As stated in the literature, this structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the literature addresses methods for obtaining this structure, and none of this literature addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty. Since having a delta matrix of minimum order would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta model would be useful. A generalized method of obtaining a minimal M-delta structure for systems with real parameter variations is given.

  14. Synthesis of superheavy elements: Uncertainty analysis to improve the predictive power of reaction models

    NASA Astrophysics Data System (ADS)

    Lü, Hongliang; Boilley, David; Abe, Yasuhisa; Shen, Caiwan

    2016-09-01

    Background: Synthesis of superheavy elements is performed by heavy-ion fusion-evaporation reactions. However, fusion is known to be hindered with respect to what can be observed with lighter ions. Thus some delicate ambiguities remain on the fusion mechanism that eventually lead to severe discrepancies in the calculated formation probabilities coming from different fusion models. Purpose: In the present work, we propose a general framework based upon uncertainty analysis in the hope of constraining fusion models. Method: To quantify uncertainty associated with the formation probability, we propose to propagate uncertainties in data and parameters using the Monte Carlo method in combination with a cascade code called kewpie2, with the aim of determining the associated uncertainty, namely the 95 % confidence interval. We also investigate the impact of different models or options, which cannot be modeled by continuous probability distributions, on the final results. An illustrative example is presented in detail and then a systematic study is carried out for a selected set of cold-fusion reactions. Results: It is rigorously shown that, at the 95 % confidence level, the total uncertainty of the empirical formation probability appears comparable to the discrepancy between calculated values. Conclusions: The results obtained from the present study provide direct evidence for predictive limitations of the existing fusion-evaporation models. It is thus necessary to find other ways to assess such models for the purpose of establishing a more reliable reaction theory, which is expected to guide future experiments on the production of superheavy elements.

  15. Uncertainty Analysis on Heat Transfer Correlations for RP-1 Fuel in Copper Tubing

    NASA Technical Reports Server (NTRS)

    Driscoll, E. A.; Landrum, D. B.

    2004-01-01

    NASA is studying kerosene (RP-1) for application in Next Generation Launch Technology (NGLT). Accurate heat transfer correlations in narrow passages at high temperatures and pressures are needed. Hydrocarbon fuels, such as RP-1, produce carbon deposition (coke) along the inside of tube walls when heated to high temperatures. A series of tests to measure the heat transfer using RP-1 fuel and examine the coking were performed in NASA Glenn Research Center's Heated Tube Facility. The facility models regenerative cooling by flowing room temperature RP-1 through resistively heated copper tubing. A regression analysis is performed on the data to determine a heat transfer correlation giving the Nusselt number as a function of the Reynolds and Prandtl numbers. Each measurement and calculation is analyzed to identify sources of uncertainty, including RP-1 property variations. Monte Carlo simulation is used to determine how each uncertainty source propagates through the regression and to estimate the overall uncertainty in the predicted heat transfer coefficient. The implications of these uncertainties for engine design and ways to minimize them are discussed.
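
    The regression-plus-Monte-Carlo workflow described above can be sketched compactly. Everything below is synthetic: the "measurements" are generated from the familiar Dittus-Boelter form rather than the RP-1 data set, and the assumed relative uncertainties on Re, Pr, and Nu are placeholders for the measurement and property uncertainties identified in the study.

      import numpy as np

      rng = np.random.default_rng(8)

      # Synthetic data standing in for the heated-tube tests.
      Re = rng.uniform(2e4, 2e5, 40)
      Pr = rng.uniform(8.0, 14.0, 40)
      Nu = 0.023 * Re ** 0.8 * Pr ** 0.4 * rng.lognormal(0.0, 0.05, 40)

      def fit(Re, Pr, Nu):
          # Log-linear least squares for log Nu = log a + b log Re + c log Pr.
          X = np.column_stack([np.ones_like(Re), np.log(Re), np.log(Pr)])
          coef, *_ = np.linalg.lstsq(X, np.log(Nu), rcond=None)
          return np.exp(coef[0]), coef[1], coef[2]

      # Monte Carlo: perturb the measurements by assumed relative uncertainties
      # (2% on Re, 5% on Pr, 4% on Nu) and repeat the regression each time.
      preds = []
      for _ in range(2000):
          a, b, c = fit(Re * rng.normal(1, 0.02, Re.size),
                        Pr * rng.normal(1, 0.05, Pr.size),
                        Nu * rng.normal(1, 0.04, Nu.size))
          preds.append(a * 1.0e5 ** b * 11.0 ** c)          # predicted Nu at Re = 1e5, Pr = 11
      lo, hi = np.percentile(preds, [2.5, 97.5])
      print(f"predicted Nu at Re = 1e5, Pr = 11: 95% interval [{lo:.0f}, {hi:.0f}]")

    The spread of the repeated-regression predictions is what translates input uncertainty into an uncertainty band on the correlated heat transfer coefficient.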

  16. Uncertainty Analysis and Order-by-Order Optimization of Chiral Nuclear Interactions

    DOE PAGES

    Carlsson, Boris; Forssen, Christian; Fahlin Strömberg, D.; ...

    2016-02-24

    Chiral effective field theory (χEFT) provides a systematic approach to describe low-energy nuclear forces. Moreover, χEFT is able to provide well-founded estimates of statistical and systematic uncertainties, although this unique advantage has not yet been fully exploited. We fill this gap by performing an optimization and statistical analysis of all the low-energy constants (LECs) up to next-to-next-to-leading order. Our optimization protocol corresponds to a simultaneous fit to scattering and bound-state observables in the pion-nucleon, nucleon-nucleon, and few-nucleon sectors, thereby utilizing the full model capabilities of χEFT. Finally, we study the effect on other observables by demonstrating forward-error-propagation methods that can easily be adopted by future works. We employ mathematical optimization and implement automatic differentiation to attain efficient and machine-precise first- and second-order derivatives of the objective function with respect to the LECs. This is also vital for the regression analysis. We use power-counting arguments to estimate the systematic uncertainty that is inherent to χEFT and we construct chiral interactions at different orders with quantified uncertainties. Statistical error propagation is compared with Monte Carlo sampling, showing that statistical errors are in general small compared to systematic ones. In conclusion, we find that a simultaneous fit to different sets of data is critical to (i) identify the optimal set of LECs, (ii) capture all relevant correlations, (iii) reduce the statistical uncertainty, and (iv) attain order-by-order convergence in χEFT. Furthermore, certain systematic uncertainties in the few-nucleon sector are shown to get substantially magnified in the many-body sector, in particular when varying the cutoff in the chiral potentials. The methodology and results presented in this paper open a new frontier for uncertainty quantification in ab initio nuclear theory.

  17. Dynamic analysis of global copper flows. Global stocks, postconsumer material flows, recycling indicators, and uncertainty evaluation.

    PubMed

    Glöser, Simon; Soulier, Marcel; Tercero Espinoza, Luis A

    2013-06-18

    We present a dynamic model of global copper stocks and flows which allows a detailed analysis of recycling efficiencies, copper stocks in use, and dissipated and landfilled copper. The model is based on historical mining and refined copper production data (1910-2010) enhanced by a unique data set of recent global semifinished goods production and copper end-use sectors provided by the copper industry. To enable the consistency of the simulated copper life cycle in terms of a closed mass balance, particularly the matching of recycled metal flows to reported historical annual production data, a method was developed to estimate the yearly global collection rates of end-of-life (postconsumer) scrap. Based on this method, we provide estimates of 8 different recycling indicators over time. The main indicator for the efficiency of global copper recycling from end-of-life (EoL) scrap--the EoL recycling rate--was estimated to be 45% on average, ± 5% (one standard deviation) due to uncertainty and variability over time in the period 2000-2010. As uncertainties of specific input data--mainly concerning assumptions on end-use lifetimes and their distribution--are high, a sensitivity analysis with regard to the effect of uncertainties in the input data on the calculated recycling indicators was performed. The sensitivity analysis included a stochastic (Monte Carlo) uncertainty evaluation with 10^5 simulation runs.

  18. Analysis of uncertainties in the estimates of nitrous oxide and methane emissions in the UK's greenhouse gas inventory for agriculture

    NASA Astrophysics Data System (ADS)

    Milne, Alice E.; Glendining, Margaret J.; Bellamy, Pat; Misselbrook, Tom; Gilhespy, Sarah; Rivas Casado, Monica; Hulin, Adele; van Oijen, Marcel; Whitmore, Andrew P.

    2014-01-01

    The UK's greenhouse gas inventory for agriculture uses a model based on the IPCC Tier 1 and Tier 2 methods to estimate the emissions of methane and nitrous oxide from agriculture. The inventory calculations are disaggregated at country level (England, Wales, Scotland and Northern Ireland). Before now, no detailed assessment of the uncertainties in the estimates of emissions had been done. We used Monte Carlo simulation to do such an analysis. We collated information on the uncertainties of each of the model inputs. The uncertainties propagate through the model and result in uncertainties in the estimated emissions. Using a sensitivity analysis, we found that in England and Scotland the uncertainty in the emission factor for emissions from N inputs (EF1) affected uncertainty the most, but that in Wales and Northern Ireland, the emission factor for N leaching and runoff (EF5) had greater influence. We showed that if the uncertainty in any one of these emission factors is reduced by 50%, the uncertainty in emissions of nitrous oxide reduces by 10%. The uncertainty in the methane emission factors for enteric fermentation in cows and sheep most affected the uncertainty in methane emissions. When inventories are disaggregated (as that for the UK is), correlation between separate instances of each emission factor will affect the uncertainty in emissions. As more countries move towards inventory models with disaggregation, it is important that the IPCC give firm guidance on this topic.
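
    A stripped-down version of the Monte Carlo inventory calculation makes the EF1/EF5 sensitivity experiment concrete. The sketch below uses a generic Tier-1-style relationship with invented activity data and distributions (not the UK inventory model or its input values) and simply halves the spread of EF1 to see how the output spread responds.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 100000

      # Illustrative direct and leaching-related N2O emissions from applied nitrogen.
      n_applied = rng.normal(1.0e6, 0.05e6, n)            # t N applied per year
      frac_leach = rng.uniform(0.1, 0.3, n)               # fraction of applied N leached
      ef1 = rng.lognormal(np.log(0.01), 0.5, n)           # kg N2O-N per kg N input
      ef5 = rng.lognormal(np.log(0.0075), 0.7, n)         # kg N2O-N per kg N leached

      def n2o(n_applied, frac_leach, ef1, ef5):
          return n_applied * ef1 + n_applied * frac_leach * ef5

      base = n2o(n_applied, frac_leach, ef1, ef5)

      # Crude sensitivity check: resample EF1 around the same median with half the
      # log-scale spread and compare the coefficient of variation of the emissions.
      ef1_tight = rng.lognormal(np.log(0.01), 0.25, n)
      tight = n2o(n_applied, frac_leach, ef1_tight, ef5)

      print(f"CV of emissions, baseline   : {base.std() / base.mean():.2%}")
      print(f"CV of emissions, EF1 halved : {tight.std() / tight.mean():.2%}")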

  19. Quantitative uncertainty and sensitivity analysis of a PWR control rod ejection accident

    SciTech Connect

    Pasichnyk, I.; Perin, Y.; Velkov, K.

    2013-07-01

    The paper describes the results of the quantitative Uncertainty and Sensitivity (U/S) Analysis of a Rod Ejection Accident (REA) which is simulated by the coupled system code ATHLET-QUABOX/CUBBOX applying the GRS tool for U/S analysis SUSA/XSUSA. For the present study, a UOX/MOX mixed core loading based on a generic PWR is modeled. A control rod ejection is calculated for two reactor states: Hot Zero Power (HZP) and 30% of nominal power. The worst cases for the rod ejection are determined by steady-state neutronic simulations taking into account the maximum reactivity insertion in the system and the power peaking factor. For the U/S analysis 378 uncertain parameters are identified and quantified (thermal-hydraulic initial and boundary conditions, input parameters and variations of the two-group cross sections). Results for uncertainty and sensitivity analysis are presented for safety important global and local parameters. (authors)

  20. Uncertainty Analysis of Power Grid Investment Capacity Based on Monte Carlo

    NASA Astrophysics Data System (ADS)

    Qin, Junsong; Liu, Bingyi; Niu, Dongxiao

    We analyze the factors that influence power grid investment capacity and build an investment capacity analysis model with depreciation cost, sales price and sales quantity, net profit, financing, and GDP of the secondary industry as the explanatory variables. After carrying out Kolmogorov-Smirnov tests, the probability distribution of each influence factor is obtained. Finally, the uncertainty of grid investment capacity is analyzed by Monte Carlo simulation.
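
    A minimal sketch of this workflow (all data synthetic, and the linear capacity model is only a stand-in): fit a candidate distribution to each influence factor, check it with a Kolmogorov-Smirnov test, then propagate the fitted distributions through the investment-capacity model by Monte Carlo.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    history = {                                  # synthetic historical observations
        "net_profit": rng.normal(100.0, 15.0, size=30),
        "depreciation": rng.normal(40.0, 5.0, size=30),
        "financing": rng.normal(60.0, 10.0, size=30),
    }

    fitted = {}
    for name, sample in history.items():
        mu, sigma = stats.norm.fit(sample)
        d, p = stats.kstest(sample, "norm", args=(mu, sigma))
        print(f"{name}: KS statistic={d:.3f}, p-value={p:.2f}")
        fitted[name] = (mu, sigma)

    n_runs = 100_000
    draws = {k: rng.normal(mu, sigma, n_runs) for k, (mu, sigma) in fitted.items()}
    # hypothetical capacity model: internal funds plus a share of external financing
    capacity = draws["net_profit"] + draws["depreciation"] + 0.8 * draws["financing"]
    print("5th/50th/95th percentiles:", np.percentile(capacity, [5, 50, 95]))
    ```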

  1. Uncertainty analysis based on probability bounds (p-box) approach in probabilistic safety assessment.

    PubMed

    Karanki, Durga Rao; Kushwaha, Hari Shankar; Verma, Ajit Kumar; Ajit, Srividya

    2009-05-01

    A wide range of uncertainties will be introduced inevitably during the process of performing a safety assessment of engineering systems. The impact of all these uncertainties must be addressed if the analysis is to serve as a tool in the decision-making process. Uncertainties present in the components of the model (input parameters or basic events) are propagated to quantify their impact on the final results. There are several methods available in the literature, namely, method of moments, discrete probability analysis, Monte Carlo simulation, fuzzy arithmetic, and Dempster-Shafer theory. The methods differ in how uncertainty is characterized at the component level and in how it is propagated to the system level. All these methods have different desirable and undesirable features, making them more or less useful in different situations. In the probabilistic framework, which is most widely used, a probability distribution is used to characterize uncertainty. However, in situations in which one cannot specify (1) parameter values for input distributions, (2) precise probability distributions (shape), and (3) dependencies between input parameters, these methods have limitations and are not effective. In order to address some of these limitations, the article presents uncertainty analysis in the context of level-1 probabilistic safety assessment (PSA) based on a probability bounds (PB) approach. PB analysis combines probability theory and interval arithmetic to produce probability boxes (p-boxes), structures that allow uncertainty to be propagated through calculations comprehensively and rigorously. A practical case study is also carried out with the developed code based on the PB approach and compared with two-phase Monte Carlo simulation results.
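
    A small sketch of the probability-bounds idea (illustrative only; the bounding distributions, the OR gate, and the simplified dependence treatment are assumptions, not the paper's case study): a basic-event probability whose median is known only as an interval is represented by a pair of bounding CDFs, and the bounds are carried through a monotone system model quantile by quantile.

    ```python
    import numpy as np
    from scipy.stats import lognorm

    q = np.linspace(0.001, 0.999, 999)            # common probability grid

    def pbox(median_lo, median_hi, gsd):
        """Lower/upper bounding quantile curves for a lognormal with interval-valued median."""
        lo = lognorm.ppf(q, s=np.log(gsd), scale=median_lo)
        hi = lognorm.ppf(q, s=np.log(gsd), scale=median_hi)
        return lo, hi

    # two basic events with imprecise medians (hypothetical numbers)
    a_lo, a_hi = pbox(1e-4, 3e-4, gsd=2.0)
    b_lo, b_hi = pbox(5e-4, 8e-4, gsd=1.5)

    # OR gate: p = 1 - (1 - pa)(1 - pb) is monotone increasing in both arguments, so the
    # lower/upper envelopes map to lower/upper result curves.  This quantile-by-quantile
    # treatment is a crude simplification; the full p-box calculus handles dependence rigorously.
    top_lo = 1.0 - (1.0 - a_lo) * (1.0 - b_lo)
    top_hi = 1.0 - (1.0 - a_hi) * (1.0 - b_hi)
    i95 = q.searchsorted(0.95)
    print(f"95th percentile bounds on the top-event probability: [{top_lo[i95]:.2e}, {top_hi[i95]:.2e}]")
    ```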

  2. Global sensitivity analysis in wastewater treatment plant model applications: prioritizing sources of uncertainty.

    PubMed

    Sin, Gürkan; Gernaey, Krist V; Neumann, Marc B; van Loosdrecht, Mark C M; Gujer, Willi

    2011-01-01

    This study demonstrates the usefulness of global sensitivity analysis in wastewater treatment plant (WWTP) design to prioritize sources of uncertainty and quantify their impact on performance criteria. The study, which is performed with the Benchmark Simulation Model no. 1 plant design, complements a previous paper on input uncertainty characterisation and propagation (Sin et al., 2009). A sampling-based sensitivity analysis is conducted to compute standardized regression coefficients. It was found that this method is able to decompose satisfactorily the variance of plant performance criteria (with R^2 > 0.9) for effluent concentrations, sludge production and energy demand. This high extent of linearity means that the plant performance criteria can be described as linear functions of the model inputs under the defined plant conditions. In effect, the system of coupled ordinary differential equations can be replaced by multivariate linear models, which can be used as surrogate models. The importance ranking based on the sensitivity measures demonstrates that the most influential factors involve ash content and influent inert particulate COD, among others, which are largely responsible for the uncertainty in predicting sludge production and effluent ammonium concentration. While these results were in agreement with process knowledge, the added value is that the global sensitivity methods can quantify the contribution of significant parameters to the variance, e.g., ash content explains 70% of the variance in sludge production. Furthermore, the importance of formulating sensitivity analysis scenarios that match the purpose of the model application needs to be highlighted. Overall, the global sensitivity analysis proved a powerful tool for explaining and quantifying uncertainties as well as providing insight into devising useful ways for reducing uncertainties in the plant performance. This information can help engineers design robust WWTPs.
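
    A sketch of a sampling-based sensitivity analysis with standardized regression coefficients (SRCs), the method described above, on a toy model (the inputs, ranges, and response are illustrative stand-ins, not the BSM1 plant):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 500
    # Monte Carlo sample of three uncertain inputs (names and ranges are illustrative)
    X = np.column_stack([
        rng.uniform(0.1, 0.3, n),      # ash content of influent solids
        rng.uniform(20.0, 60.0, n),    # influent inert particulate COD
        rng.uniform(0.5, 0.8, n),      # heterotrophic yield
    ])
    # toy "plant" response standing in for sludge production
    y = 4.0 * X[:, 0] + 0.02 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(0, 0.05, n)

    # standardize inputs and output, then fit a multivariate linear surrogate model
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    ys = (y - y.mean()) / y.std()
    src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    r2 = 1.0 - np.sum((ys - Xs @ src) ** 2) / np.sum(ys ** 2)
    print("SRCs:", np.round(src, 2), " R^2 =", round(r2, 2))
    # when R^2 is close to 1, src[i]**2 approximates the fraction of output variance
    # explained by input i
    ```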

  3. A detailed description of the uncertainty analysis for High Area Ratio Rocket Nozzle tests at the NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac

    1987-01-01

    A preliminary uncertainty analysis has been performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis are presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.
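
    For context, a toy example of the root-sum-square combination typically used in this kind of measurement uncertainty analysis (the percentages below are invented, and this is not the paper's full equation set): since specific impulse is Isp = F / (mdot * g0), the relative uncertainties of the thrust and propellant mass-flow measurements combine in quadrature.

    ```python
    import math

    u_thrust = 0.009     # 0.9% relative uncertainty in measured thrust (hypothetical)
    u_mdot = 0.005       # 0.5% relative uncertainty in measured mass flow (hypothetical)

    # root-sum-square combination of independent relative uncertainties
    u_isp = math.sqrt(u_thrust**2 + u_mdot**2)
    print(f"relative uncertainty in specific impulse: {u_isp:.2%}")
    ```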

  4. puma 3.0: improved uncertainty propagation methods for gene and transcript expression analysis

    PubMed Central

    2013-01-01

    Background Microarrays have been a popular tool for gene expression profiling at genome-scale for over a decade due to the low cost, short turn-around time, excellent quantitative accuracy and ease of data generation. The Bioconductor package puma incorporates a suite of analysis methods for determining uncertainties from Affymetrix GeneChip data and propagating these uncertainties to downstream analysis. As isoform-level expression profiling has received more and more interest within genomics in recent years, exon microarray technology offers an important tool to quantify expression level of the majority of exons and enables the possibility of measuring isoform level expression. However, puma does not include methods for the analysis of exon array data. Moreover, the current expression summarisation method for Affymetrix 3’ GeneChip data suffers from instability for low expression genes. For the downstream analysis, the method for differential expression detection is computationally intensive and the original expression clustering method does not consider the variance across the replicated technical and biological measurements. It is therefore necessary to develop improved uncertainty propagation methods for gene and transcript expression analysis. Results We extend the previously developed Bioconductor package puma with a new method especially designed for GeneChip Exon arrays and a set of improved downstream approaches. The improvements include: (i) a new gamma model for exon arrays which calculates isoform and gene expression measurements and a level of uncertainty associated with the estimates, using the multi-mappings between probes, isoforms and genes, (ii) a variant of the existing approach for the probe-level analysis of Affymetrix 3’ GeneChip data to produce more stable gene expression estimates, (iii) an improved method for detecting differential expression which is computationally more efficient than the existing approach in the package and (iv) an

  5. Data uncertainties in material flow analysis: Municipal solid waste management system in Maputo City, Mozambique.

    PubMed

    Dos Muchangos, Leticia Sarmento; Tokai, Akihiro; Hanashima, Atsuko

    2017-01-01

    Material flow analysis can effectively trace and quantify the flows and stocks of materials such as solid wastes in urban environments. However, the integrity of material flow analysis results is compromised by data uncertainties, an occurrence that is particularly acute in low-and-middle-income study contexts. This article investigates the uncertainties in the input data and their effects in a material flow analysis study of municipal solid waste management in Maputo City, the capital of Mozambique. The analysis is based on data collected in 2007 and 2014. Initially, the uncertainties and their ranges were identified by the data classification model of Hedbrant and Sörme, followed by the application of sensitivity analysis. The average lower and upper bounds were 29% and 71%, respectively, in 2007, increasing to 41% and 96%, respectively, in 2014. This indicates higher data quality in 2007 than in 2014. Results also show not only that data are partially missing from established flows such as waste generation to final disposal, but also that they are limited and inconsistent for emerging flows and processes such as waste generation to material recovery (hence the wider variation in the 2014 parameters). The sensitivity analysis further clarified the most influential parameter and the degree of influence of each parameter on the waste flows and the interrelations among the parameters. The findings highlight the need for an integrated municipal solid waste management approach to avoid transferring or worsening the negative impacts among the parameters and flows.

  6. SENSIT: a cross-section and design sensitivity and uncertainty analysis code. [In FORTRAN for CDC-7600, IBM 360]

    SciTech Connect

    Gerstl, S.A.W.

    1980-01-01

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections of standard multigroup cross section sets and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties SENSIT computes the resulting variance and estimated standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT attempts to be more comprehensive than earlier sensitivity analysis codes, such as SWANLAKE.
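
    The variance calculation described above follows the standard first-order "sandwich rule" of perturbation theory; the sketch below illustrates it with a small, made-up four-group example (the sensitivities and covariance values are not SENSIT data).

    ```python
    import numpy as np

    S = np.array([0.10, 0.25, 0.40, 0.15])            # relative sensitivities of the response (%/%)
    std = np.array([0.05, 0.08, 0.10, 0.12])          # relative standard deviations per group
    corr = np.array([[1.0, 0.6, 0.3, 0.1],
                     [0.6, 1.0, 0.6, 0.3],
                     [0.3, 0.6, 1.0, 0.6],
                     [0.1, 0.3, 0.6, 1.0]])           # assumed inter-group correlations
    C = np.outer(std, std) * corr                     # relative covariance matrix

    var_R = S @ C @ S                                 # first-order "sandwich rule" variance
    print(f"relative standard deviation of the integral response: {np.sqrt(var_R):.3%}")
    ```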

  7. Spline analysis of Holocene sediment magnetic records: Uncertainty estimates for field modeling

    NASA Astrophysics Data System (ADS)

    Panovska, S.; Finlay, C. C.; Donadini, F.; Hirt, A. M.

    2012-02-01

    Sediment and archeomagnetic data spanning the Holocene enable us to reconstruct the evolution of the geomagnetic field on time scales of centuries to millennia. In global field modeling the reliability of data is taken into account by weighting according to uncertainty estimates. Uncertainties in sediment magnetic records arise from (1) imperfections in the paleomagnetic recording processes, (2) coring and (sub) sampling methods, (3) adopted averaging procedures, and (4) uncertainties in the age-depth models. We take a step toward improved uncertainty estimates by performing a comprehensive statistical analysis of the available global database of Holocene magnetic records. Smoothing spline models that capture the robust aspects of individual records are derived. This involves a cross-validation approach, based on an absolute deviation measure of misfit, to determine the smoothing parameter for each spline model, together with the use of a minimum smoothing time derived from the sedimentation rate and assumed lock-in depth. Departures from the spline models provide information concerning the random variability in each record. Temporal resolution analysis reveals that 50% of the records have smoothing times between 80 and 250 years. We also perform comparisons among the sediment magnetic records and archeomagnetic data, as well as with predictions from the global historical and archeomagnetic field models. Combining these approaches, we arrive at individual uncertainty estimates for each sediment record. These range from 2.5° to 11.2° (median: 5.9°; interquartile range: 5.4° to 7.2°) for inclination, 4.1° to 46.9° (median: 13.4°; interquartile range: 11.4° to 18.9°) for relative declination, and 0.59 to 1.32 (median: 0.93; interquartile range: 0.86 to 1.01) for standardized relative paleointensity. These values suggest that uncertainties may have been underestimated in previous studies. No compelling evidence for systematic inclination shallowing is

  8. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    NASA Technical Reports Server (NTRS)

    Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, this calculation approach produces an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from the exercise of this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from or upgraded to a serious risk designation on the basis of consideration of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.

  9. Damage functions for climate-related hazards: unification and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Prahl, B. F.; Rybski, D.; Boettle, M.; Kropp, J. P.

    2015-11-01

    Most climate change impacts manifest in the form of natural hazards. For example, sea-level rise and changes in storm climatology are expected to increase the frequency and magnitude of flooding events. In practice there is a need for comprehensive damage assessment at an intermediate level of complexity. Answering this need, we reveal the common grounds of macroscale damage functions employed in storm damage, coastal-flood damage, and heat mortality assessment. The universal approach offers both bottom-up and top-down damage evaluation, employing either an explicit or an implicit portfolio description. Putting emphasis on the treatment of data uncertainties, we perform a sensitivity analysis across different scales. We find that intrinsic uncertainties at the microscale level (i.e. a single item) still persist at the macroscale level (i.e. the portfolio). Furthermore, the analysis of uncertainties can reveal their specific relevance, allowing for simplification of the modelling chain. Our results shed light on the role of uncertainties and provide useful insight for the application of a unified damage function.

  10. SWEPP PAN assay system uncertainty analysis: Active mode measurements of solidified aqueous sludge waste

    SciTech Connect

    Blackwood, L.G.; Harker, Y.D.; Meachum, T.R.

    1997-12-01

    The Idaho National Engineering and Environmental Laboratory is being used as a temporary storage facility for transuranic waste generated by the US Nuclear Weapons program at the Rocky Flats Plant (RFP) in Golden, Colorado. Currently, there is a large effort in progress to prepare to ship this waste to the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. In order to meet the TRU Waste Characterization Quality Assurance Program Plan nondestructive assay compliance requirements and quality assurance objectives, it is necessary to determine the total uncertainty of the radioassay results produced by the Stored Waste Examination Pilot Plant (SWEPP) Passive Active Neutron (PAN) radioassay system. This paper is one of a series of reports quantifying the results of the uncertainty analysis of the PAN system measurements for specific waste types and measurement modes. In particular this report covers active mode measurements of weapons grade plutonium-contaminated aqueous sludge waste contained in 208 liter drums (item description codes 1, 2, 7, 800, 803, and 807). Results of the uncertainty analysis for PAN active mode measurements of aqueous sludge indicate that a bias correction multiplier of 1.55 should be applied to the PAN aqueous sludge measurements. With the bias correction, the uncertainty bounds on the expected bias are 0 ± 27%. These bounds meet the Quality Assurance Program Plan requirements for radioassay systems.

  11. Intolerance of Uncertainty in Eating Disorders: A Systematic Review and Meta-Analysis.

    PubMed

    Brown, Melanie; Robinson, Lauren; Campione, Giovanna Cristina; Wuensch, Kelsey; Hildebrandt, Tom; Micali, Nadia

    2017-09-01

    Intolerance of uncertainty is an empirically supported transdiagnostic construct that may have relevance in understanding eating disorders. We conducted a meta-analysis and systematic review of intolerance of uncertainty in eating disorders using Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. We calculated random-effects standardised mean differences (SMD) for studies utilising the Intolerance of Uncertainty Scale (IUS) and summarised additional studies descriptively. Women with eating disorders have significantly higher IUS scores compared with healthy controls (SMD = 1.90; 95% C.I. 1.24 to 2.56; p < 0.001). Post hoc meta-analysis revealed significant differences when comparing women with anorexia nervosa with controls (SMD = 2.16; 95% C.I. 1.14 to 3.18; p < 0.001) and women with bulimia nervosa with controls (SMD = 2.03; 95% C.I. 1.30 to 2.75; p < 0.001). Our synthesis of findings suggests that intolerance of uncertainty may represent a vulnerability and maintenance factor for eating disorders and potential target of cognitive, behavioural, interoceptive and affective symptoms. Copyright © 2017 John Wiley & Sons, Ltd and Eating Disorders Association.
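
    For readers unfamiliar with the pooling step, here is a minimal sketch of a random-effects meta-analysis of standardised mean differences using the DerSimonian-Laird estimator (the per-study SMDs and standard errors below are invented, not the studies in this review).

    ```python
    import numpy as np
    from scipy.stats import norm

    smd = np.array([1.4, 2.3, 1.8, 2.6, 1.1])      # per-study standardised mean differences (toy)
    se = np.array([0.35, 0.40, 0.30, 0.50, 0.45])  # per-study standard errors (toy)

    w = 1.0 / se**2                                # fixed-effect weights
    q = np.sum(w * (smd - np.sum(w * smd) / np.sum(w))**2)   # Cochran's Q
    df = len(smd) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                  # between-study variance (DerSimonian-Laird)

    w_star = 1.0 / (se**2 + tau2)                  # random-effects weights
    pooled = np.sum(w_star * smd) / np.sum(w_star)
    se_pooled = np.sqrt(1.0 / np.sum(w_star))
    lo, hi = pooled + np.array([-1, 1]) * norm.ppf(0.975) * se_pooled
    print(f"pooled SMD = {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f})")
    ```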

  12. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    DOE PAGES

    Pastore, Giovanni; Swiler, L. P.; Hales, Jason D.; ...

    2014-10-12

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for the coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  13. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    NASA Astrophysics Data System (ADS)

    Newman, L.; Hejduk, M.; Johnson, L.

    2016-09-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, this calculation approach produces an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from the exercise of this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from or upgraded to a serious risk designation on the basis of consideration of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.

  14. Variability and Uncertainty in Probabilistic Seismic Hazard Analysis for the Island of Montreal

    NASA Astrophysics Data System (ADS)

    Elkady, Ahmed Mohamed Ahmed

    The current seismic design process for structures in Montreal is based on the 2005 edition of the National Building Code of Canada (NBCC 2005), which corresponds to a hazard level with a probability of exceedance of 2% in 50 years. The code is based on the Uniform Hazard Spectrum (UHS) and deaggregation values obtained by the Geological Survey of Canada (GSC) using a modified version of the F-RISK software, through a process that did not formally consider epistemic uncertainty. Epistemic uncertainty is related to the uncertainty in model formulation. A seismological model consists of seismic sources (source geometry, source location, recurrence rate, magnitude distribution, and maximum magnitude) and a Ground-Motion Prediction Equation (GMPE). In general, and particularly for Montreal, GMPEs are the main source of epistemic uncertainty relative to the other variables of the seismological model. The objective of this thesis is to use the CRISIS software to investigate the effect of epistemic uncertainty on probabilistic seismic hazard analysis (PSHA) products such as the UHS and deaggregation values by incorporating different new GMPEs. The epsilon parameter, which represents the departure of the target ground motion from that predicted by the GMPE, is also discussed, as it is not well documented in Eastern Canada. A method is proposed to calculate epsilon values for Montreal relative to a given GMPE and to calculate robust weighted modal epsilon values when epistemic uncertainty is considered. Epsilon values are commonly used in seismic performance evaluations for identifying design events and selecting ground motion records for vulnerability and liquefaction studies. A brief overview of record epsilons, which account for the spectral shape of the ground motion time history, is also presented.

  15. Biophysical and Economic Uncertainty in the Analysis of Poverty Impacts of Climate Change

    NASA Astrophysics Data System (ADS)

    Hertel, T. W.; Lobell, D. B.; Verma, M.

    2011-12-01

    This paper seeks to understand the main sources of uncertainty in assessing the impacts of climate change on agricultural output, international trade, and poverty. We incorporate biophysical uncertainty by sampling from a distribution of global climate model predictions for temperature and precipitation for 2050. The implications of these realizations for crop yields around the globe are estimated using the recently published statistical crop yield functions provided by Lobell, Schlenker and Costa-Roberts (2011). By comparing these yields to those predicted under current climate, we obtain the likely change in crop yields owing to climate change. The economic uncertainty in our analysis relates to the response of the global economic system to these biophysical shocks. We use a modified version of the GTAP model to elicit the impact of the biophysical shocks on global patterns of production, consumption, trade and poverty. Uncertainty in these responses is reflected in the econometrically estimated parameters governing the responsiveness of international trade, consumption, production (and hence the intensive margin of supply response), and factor supplies (which govern the extensive margin of supply response). We sample from the distributions of these parameters as specified by Hertel et al. (2007) and Keeney and Hertel (2009). We find that, even though it is difficult to predict where in the world agricultural crops will be favorably affected by climate change, the responses of economic variables, including output and exports can be far more robust (Table 1). This is due to the fact that supply and demand decisions depend on relative prices, and relative prices depend on productivity changes relative to other crops in a given region, or relative to similar crops in other parts of the world. We also find that uncertainty in poverty impacts of climate change appears to be almost entirely driven by biophysical uncertainty.

  16. Uncertainty analysis of multi-rate kinetics of uranium desorption from sediments

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.; Zhang, Guannan

    2014-01-01

    Multi-rate surface complexation models have been proposed to describe the kinetics of uranyl (U(VI)) surface complexation reactions (SCR) rate-limited by diffusive mass transfer to and from intragranular sorption sites in subsurface sediments. In this study, a Bayesian-based, Differential Evolution Markov Chain method was used to assess the uncertainty and to identify factors controlling the uncertainties of the multi-rate SCR model. The rate constants in the multi-rate SCR were estimated with and without assumption of a specified lognormal distribution to test the lognormal assumption typically used to minimize the number of the rate constants in the multi-rate model. U(VI) desorption under variable chemical conditions from a contaminated sediment at US Hanford 300 Area, Washington was used as an example. The results indicated that the estimated rate constants without a specified lognormal assumption approximately followed a lognormal distribution, indicating that the lognormal is an effective assumption for the rate constants in the multi-rate SCR model. However, those rate constants with their corresponding half-lives longer than the experimental durations for model characterization had larger uncertainties and could not be reliably estimated. The uncertainty analysis revealed that the time-scale of the experiments for calibrating the multi-rate SCR model, the assumption for the rate constant distribution, the geochemical conditions involved in predicting U(VI) desorption, and equilibrium U(VI) speciation reaction constants were the major factors contributing to the extrapolation uncertainties of the multi-rate SCR model. Overall, the results from this study demonstrated that the multi-rate SCR model with a lognormal distribution of its rate constants is an effective approach for describing rate-limited U(VI) desorption; however, the model contains uncertainties, especially for those smaller rate constants, that require careful consideration for predicting U

  17. Evaluation of Uncertainty in Runoff Analysis Incorporating Theory of Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshimi, Kazuhiro; Wang, Chao-Wen; Yamada, Tadashi

    2015-04-01

    The aim of this paper is to provide a theoretical framework for uncertainty estimation in rainfall-runoff analysis based on the theory of stochastic processes. SDEs (stochastic differential equations) based on this theory have been widely used in mathematical finance, for example to model stock price movements, and some researchers in civil engineering have also applied them (e.g. Kurino et al., 1999; Higashino and Kanda, 2001). However, there have been no studies that evaluate the uncertainty of runoff by comparing the SDE with the corresponding Fokker-Planck equation. The Fokker-Planck equation is a partial differential equation that describes the temporal evolution of the PDF (probability density function), and it is mathematically equivalent to the associated SDE. In this paper, therefore, the uncertainty of discharge arising from the uncertainty of rainfall is derived theoretically and mathematically by introducing the theory of stochastic processes. The lumped rainfall-runoff model is written as an SDE in difference form, with the temporal variation of rainfall expressed as its average plus a deviation that is approximated by a Gaussian distribution; this approximation is supported by rainfall observed at rain-gauge stations and by radar rain-gauge systems. As a result, this paper shows that it is possible to evaluate the uncertainty of discharge by using the relationship between the SDE and the Fokker-Planck equation. Moreover, the results show that the uncertainty of discharge increases as rainfall intensity rises and as the nonlinearity of the resistance term becomes stronger. These results are clarified by the PDFs (probability density functions) of discharge that satisfy the Fokker-Planck equation. It means the reasonable discharge can be
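
    A minimal sketch of the idea (a toy linear-reservoir model with made-up parameters, not the paper's model): the storage equation is written as an SDE whose rainfall forcing is a mean plus a Gaussian deviation and integrated with the Euler-Maruyama scheme; the resulting ensemble of discharges approximates the PDF whose evolution the Fokker-Planck equation describes.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    dt, n_steps, n_paths = 0.1, 1000, 2000   # time step (h), number of steps, ensemble members
    k = 0.05                                 # storage constant, 1/h (hypothetical)
    r_mean, r_sigma = 2.0, 0.8               # mean rainfall and its deviation, mm/h (hypothetical)

    S = np.zeros(n_paths)                    # storage, mm
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)          # Brownian increments
        S = S + (r_mean - k * S) * dt + r_sigma * dW        # Euler-Maruyama step
        S = np.maximum(S, 0.0)                              # storage cannot go negative

    Q = k * S                                # discharge of the linear reservoir, mm/h
    print(f"mean discharge {Q.mean():.2f} mm/h, spread (std) {Q.std():.2f} mm/h")
    # the histogram of Q is the Monte Carlo counterpart of the PDF governed by the
    # Fokker-Planck equation; its spread grows with the rainfall deviation r_sigma
    ```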

  18. Bayesian factor analysis to calculate a deprivation index and its uncertainty.

    PubMed

    Marí-Dell'Olmo, Marc; Martínez-Beneito, Miguel Angel; Borrell, Carme; Zurriaga, Oscar; Nolasco, Andreu; Domínguez-Berjón, M Felicitas

    2011-05-01

    Procedures for calculating deprivation indices in epidemiologic studies often show some common problems because the spatial dependence between units of analysis and the uncertainty of the estimates are not usually accounted for. This work highlights these problems and illustrates how spatial factor Bayesian modeling could alleviate them. This study applies a cross-sectional ecological design to analyze the census tracts of 3 Spanish cities. To calculate the deprivation index, we used 5 socioeconomic indicators that comprise the deprivation index calculated in the MEDEA project. The deprivation index was estimated by a Bayesian factor analysis using hierarchical models, which takes the spatial dependence of the study units into account. We studied the relationship between this index and the one obtained using principal component analysis. Various analyses were carried out to assess the uncertainty obtained in the index. A high correlation was observed between the index obtained and the non-Bayesian index, but this relationship is not linear and there is disagreement between the methods when the areas are grouped according to quantiles. When the deprivation index is calculated using summary statistics based on the posterior distributions, the uncertainty of the index in each census tract is not taken into account. Failure to take this uncertainty into account may result in misclassification bias when census tracts are grouped according to quantiles of the deprivation index, and this bias could interfere with subsequent analyses that include the index. Our proposal provides another tool for identifying groups with greater deprivation and for improving decision-making for public policy planning.

  19. Demonstration of Uncertainty Quantification and Sensitivity Analysis for PWR Fuel Performance with BISON

    SciTech Connect

    Zhang, Hongbin; Ladd, Jacob; Zhao, Haihua; Zou, Ling; Burns, Douglas

    2015-11-01

    BISON is an advanced fuels performance code being developed at Idaho National Laboratory and is the code of choice for fuels performance by the U.S. Department of Energy (DOE)’s Consortium for Advanced Simulation of Light Water Reactors (CASL) Program. An approach to uncertainty quantification and sensitivity analysis with BISON was developed and a new toolkit was created. A PWR fuel rod model was developed and simulated by BISON, and uncertainty quantification and sensitivity analysis were performed with eighteen uncertain input parameters. The maximum fuel temperature and gap conductance were selected as the figures of merit (FOM). Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis.
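
    A small sketch of the three sensitivity measures named above, computed on a synthetic sample (the inputs and response are toy stand-ins, not BISON outputs); the partial correlation is obtained by correlating residuals after regressing out the other inputs.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(11)
    n = 1000
    X = rng.normal(size=(n, 3))                       # three standardized uncertain inputs (toy)
    y = 2.0 * X[:, 0] + 0.5 * X[:, 1] ** 3 + rng.normal(scale=0.3, size=n)   # toy figure of merit

    for j in range(X.shape[1]):
        pearson = stats.pearsonr(X[:, j], y)[0]
        spearman = stats.spearmanr(X[:, j], y)[0]
        # partial correlation: correlate the residuals of x_j and y after removing
        # the linear influence of the other inputs
        others = np.delete(X, j, axis=1)
        A = np.column_stack([others, np.ones(n)])
        rx = X[:, j] - A @ np.linalg.lstsq(A, X[:, j], rcond=None)[0]
        ry = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
        partial = stats.pearsonr(rx, ry)[0]
        print(f"input {j}: Pearson={pearson:.2f} Spearman={spearman:.2f} partial={partial:.2f}")
    ```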

  20. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    SciTech Connect

    Brown, C. S.; Zhang, Hongbin

    2016-05-24

    Uncertainty quantification and sensitivity analysis are important for nuclear reactor safety design and analysis. A 2x2 fuel assembly core design was developed and simulated by the Virtual Environment for Reactor Applications, Core Simulator (VERA-CS) coupled neutronics and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed and a new toolkit was created to perform uncertainty quantification and sensitivity analysis with fourteen uncertain input parameters. Furthermore, the minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the selected figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.

  1. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    DOE PAGES

    Brown, C. S.; Zhang, Hongbin

    2016-05-24

    Uncertainty quantification and sensitivity analysis are important for nuclear reactor safety design and analysis. A 2x2 fuel assembly core design was developed and simulated by the Virtual Environment for Reactor Applications, Core Simulator (VERA-CS) coupled neutronics and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed and a new toolkit was created to perform uncertainty quantification and sensitivity analysis with fourteen uncertain input parameters. Furthermore, the minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the selected figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.

  2. Uncertainty optimization applied to the Monte Carlo analysis of planetary entry trajectories

    NASA Astrophysics Data System (ADS)

    Way, David Wesley

    2001-10-01

    Future robotic missions to Mars, as well as any human missions, will require precise entries to ensure safe landings near science objectives and pre-deployed assets. Planning for these missions will depend heavily on Monte Carlo analyses to evaluate active guidance algorithms, assess the impact of off-nominal conditions, and account for uncertainty. The dependability of Monte Carlo forecasts, however, is limited by the accuracy and completeness of the assumed uncertainties. This is because Monte Carlo analysis is a forward-driven problem, beginning with the input uncertainties and proceeding to the forecast output statistics. An improvement to the Monte Carlo analysis is needed that will allow the problem to be worked in reverse. In this way, the largest allowable dispersions that achieve the required mission objectives can be determined quantitatively. This thesis proposes a methodology to optimize the uncertainties in the Monte Carlo analysis of spacecraft landing footprints. A metamodel is used to first write polynomial expressions for the size of the landing footprint as functions of the independent uncertainty extrema. The coefficients of the metamodel are determined by performing experiments. The metamodel is then used in a constrained optimization procedure to minimize a cost-tolerance function. First, a two-dimensional proof-of-concept problem was used to evaluate the feasibility of this optimization method. Next, the optimization method was further demonstrated on the Mars Surveyor Program 2001 Lander. The purpose of this example was to demonstrate that the methodology developed during the proof-of-concept could be scaled to solve larger, more complicated, "real world" problems. This research has shown that it is possible to control the size of the landing footprint and establish tolerances for mission uncertainties. A simplified metamodel was developed, which enables application to realistic problems with more than just a few uncertainties. A confidence interval on
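
    A compact sketch of this reverse, metamodel-based approach (the response function, cost model, and footprint limit are all hypothetical stand-ins): a polynomial metamodel is fit to designed experiments, then a constrained optimization searches for the largest allowable dispersions that still meet the footprint requirement.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(2)

    def footprint_truth(x):
        """Stand-in for a Monte Carlo campaign: footprint size vs. two dispersion extrema."""
        return 10.0 + 4.0 * x[0] + 2.5 * x[1] + 1.5 * x[0] * x[1]

    # design of experiments over the two uncertainty extrema, then a simple polynomial metamodel
    X = rng.uniform(0.0, 2.0, size=(30, 2))
    y = np.array([footprint_truth(x) for x in X])
    basis = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1], X[:, 0] * X[:, 1]])
    coef, *_ = np.linalg.lstsq(basis, y, rcond=None)

    footprint = lambda x: coef @ np.array([1.0, x[0], x[1], x[0] * x[1]])
    cost = lambda x: 1.0 / (0.1 + x[0]) + 1.0 / (0.1 + x[1])   # tighter tolerances cost more

    res = minimize(cost, x0=[1.0, 1.0],
                   constraints=[{"type": "ineq", "fun": lambda x: 18.0 - footprint(x)}],
                   bounds=[(0.0, 2.0), (0.0, 2.0)])
    print("largest allowable dispersions:", np.round(res.x, 2),
          " footprint at optimum:", round(float(footprint(res.x)), 2))
    ```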

  3. Uncertainty Analysis of Ozone-Depleting Substances: Mixing Ratios, EESC, ODPs, and GWPs

    NASA Astrophysics Data System (ADS)

    Velders, G. J.; Daniel, J. S.

    2013-12-01

    Important for the recovery of the ozone layer from depletion by ozone-depleting substances (ODSs) is the rate at which ODSs are removed from the atmosphere, that is, their lifetimes. Recently the WCRP/SPARC project conducted an assessment of lifetimes of ODSs [SPARC, 2013] and presented a new set of recommended lifetimes as well as their uncertainties. We present here a comprehensive uncertainty analysis of ODS mixing ratios, levels of equivalent effective stratospheric chlorine (EESC), radiative forcing, ozone depletion potentials (ODPs), and global warming potentials (GWPs), using the new lifetimes and their uncertainties as well as uncertainties on all other relevant parameters. Using a box model the year EESC returns to pre-1980 levels, a metric commonly used to indicate a level of recovery for ODS induced ozone depletion, is 2048 for mid-latitudes based on the new lifetimes, which is 2 years later than that based on the lifetimes from WMO [2011]. The uncertainty in this return time is much larger than this change, however. The year EESC returns to pre-1980 levels ranges from 2038 to 2064 (95% CI) for mid-latitudes and 2060 to 2104 for the Antarctic. The largest contribution to these ranges comes from the uncertainties in the lifetimes, since the current atmospheric burden of CFCs is much larger than the amounts present in existing equipment or still being produced. The earlier end of the recovery times is comparable to the return time in a hypothetical scenario with a cease in anthropogenic ODS emissions in 2014. The upper end of the range corresponds with an extra emission of about 7 MtCFC-11-eq in 2015, or about twice the cumulative anthropogenic emissions of all ODSs from 2014 to 2050. Semi-empirical ODPs calculated using the lifetimes from SPARC [2013] are up to 25% lower than the data reported in WMO [2011] for most species, mainly as a result of the increase in the estimated lifetime of CFC-11. The ODP of Halon-2402 increases by 20%, while the only

  4. Numerical daemons in hydrological modeling: Effects on uncertainty assessment, sensitivity analysis and model predictions

    NASA Astrophysics Data System (ADS)

    Kavetski, D.; Clark, M. P.; Fenicia, F.

    2011-12-01

    Hydrologists often face sources of uncertainty that dwarf those normally encountered in many engineering and scientific disciplines. Especially when representing large scale integrated systems, internal heterogeneities such as stream networks, preferential flowpaths, vegetation, etc, are necessarily represented with a considerable degree of lumping. The inputs to these models are themselves often the products of sparse observational networks. Given the simplifications inherent in environmental models, especially lumped conceptual models, does it really matter how they are implemented? At the same time, given the complexities usually found in the response surfaces of hydrological models, increasingly sophisticated analysis methodologies are being proposed for sensitivity analysis, parameter calibration and uncertainty assessment. Quite remarkably, rather than being caused by the model structure/equations themselves, in many cases model analysis complexities are consequences of seemingly trivial aspects of the model implementation - often, literally, whether the start-of-step or end-of-step fluxes are used! The extent of problems can be staggering, including (i) degraded performance of parameter optimization and uncertainty analysis algorithms, (ii) erroneous and/or misleading conclusions of sensitivity analysis, parameter inference and model interpretations and, finally, (iii) poor reliability of a calibrated model in predictive applications. While the often nontrivial behavior of numerical approximations has long been recognized in applied mathematics and in physically-oriented fields of environmental sciences, it remains a problematic issue in many environmental modeling applications. Perhaps detailed attention to numerics is only warranted for complicated engineering models? Would not numerical errors be an insignificant component of total uncertainty when typical data and model approximations are present? Is this really a serious issue beyond some rare isolated

  5. Bayesian dose-response analysis for epidemiological studies with complex uncertainty in dose estimation.

    PubMed

    Kwon, Deukwoo; Hoffman, F Owen; Moroz, Brian E; Simon, Steven L

    2016-02-10

    Most conventional risk analysis methods rely on a single best estimate of exposure per person, which does not allow for adjustment for exposure-related uncertainty. Here, we propose a Bayesian model averaging method to properly quantify the relationship between radiation dose and disease outcomes by accounting for shared and unshared uncertainty in estimated dose. Our Bayesian risk analysis method utilizes multiple realizations of sets (vectors) of doses generated by a two-dimensional Monte Carlo simulation method that properly separates shared and unshared errors in dose estimation. The exposure model used in this work is taken from a study of the risk of thyroid nodules among a cohort of 2376 subjects who were exposed to fallout from nuclear testing in Kazakhstan. We assessed the performance of our method through an extensive series of simulations and comparisons against conventional regression risk analysis methods. When the estimated doses contain relatively small amounts of uncertainty, the Bayesian method using multiple a priori plausible draws of dose vectors gave similar results to the conventional regression-based methods of dose-response analysis. However, when large and complex mixtures of shared and unshared uncertainties are present, the Bayesian method using multiple dose vectors had significantly lower relative bias than conventional regression-based risk analysis methods and better coverage, that is, a markedly increased capability to include the true risk coefficient within the 95% credible interval of the Bayesian-based risk estimate. An evaluation of the dose-response using our method is presented for an epidemiological study of thyroid disease following radiation exposure.

  6. Adjoint-based uncertainty quantification and sensitivity analysis for reactor depletion calculations

    NASA Astrophysics Data System (ADS)

    Stripling, Hayes Franklin

    Depletion calculations for nuclear reactors model the dynamic coupling between the material composition and neutron flux and help predict reactor performance and safety characteristics. In order to be trusted as reliable predictive tools and inputs to licensing and operational decisions, the simulations must include an accurate and holistic quantification of errors and uncertainties in their outputs. Uncertainty quantification is a formidable challenge in large, realistic reactor models because of the large number of unknowns and myriad sources of uncertainty and error. We present a framework for performing efficient uncertainty quantification in depletion problems using an adjoint approach, with emphasis on high-fidelity calculations using advanced massively parallel computing architectures. This approach calls for a solution to two systems of equations: (a) the forward, engineering system that models the reactor, and (b) the adjoint system, which is mathematically related to but different from the forward system. We use the solutions of these systems to produce sensitivity and error estimates at a cost that does not grow rapidly with the number of uncertain inputs. We present the framework in a general fashion and apply it to both the source-driven and k-eigenvalue forms of the depletion equations. We describe the implementation and verification of solvers for the forward and adjoint equations in the PDT code, and we test the algorithms on realistic reactor analysis problems. We demonstrate a new approach for reducing the memory and I/O demands on the host machine, which can be overwhelming for typical adjoint algorithms. Our conclusion is that adjoint depletion calculations using full transport solutions are not only computationally tractable, they are the most attractive option for performing uncertainty quantification on high-fidelity reactor analysis problems.

  7. Uncertainty evaluation of copula analysis of hydrological droughts in the East River basin, China

    NASA Astrophysics Data System (ADS)

    Zhang, Qiang; Xiao, Mingzhong; Singh, Vijay P.

    2015-06-01

    The joint probability behaviors of extreme hydro-meteorological events, such as droughts, have been receiving increasing attention in recent years. Since extreme hydro-meteorological events are reflected by more than one variable, such as duration and intensity, copula functions have been widely applied. However, uncertainties of copula-based analysis of hydrological droughts, which arise from the selection of marginal distributions and copulas, have not yet received significant attention. The aim of this study is to evaluate such uncertainties based on a Bayesian approach. The method is used to analyze hydrological drought in the East River basin (China), which is the principal supplier of water resources for megacities in the Pearl River Delta and also for Hong Kong. The results indicate that the credible intervals of the most likely design drought events with a return period of 20 years, in terms of drought severity and duration, are considerably large at all stations in the East River basin, covering the drought event curves at return periods of 10 and 50 years. The influence of heavy-tailed marginal distributions on the uncertainty of the joint distribution was also investigated. Results show that the heavier the tail of the marginal distribution, the greater the uncertainty of the joint distribution, especially for extreme events. Results of this study provide a technical reference for uncertainty evaluation in copula-based analysis of drought events at regional and global scales. The large credible intervals of drought severity and duration pose a serious challenge for drought-hazard mitigation and water resources management.
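
    A minimal sketch of the copula construction at the core of such analyses (the marginal distributions, copula parameter, and design values below are invented, not the East River fits): drought duration and severity marginals are joined with a Gumbel copula and the joint exceedance probability of a candidate design drought is computed.

    ```python
    import numpy as np
    from scipy.stats import gamma

    def gumbel_copula(u, v, theta):
        """Gumbel-Hougaard copula C(u, v); theta >= 1 controls upper-tail dependence."""
        return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

    # hypothetical fitted marginals for duration (months) and severity (index)
    dur = gamma(a=2.0, scale=3.0)
    sev = gamma(a=1.5, scale=4.0)
    theta = 2.0                       # hypothetical copula parameter

    d0, s0 = 9.0, 10.0                # a candidate design drought (duration, severity)
    u, v = dur.cdf(d0), sev.cdf(s0)
    # probability that duration AND severity both exceed the design values
    p_and = 1.0 - u - v + gumbel_copula(u, v, theta)
    print(f"joint exceedance probability: {p_and:.3f} (return period ~ {1.0 / p_and:.0f} events)")
    ```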

  8. Bio-physical vs. Economic Uncertainty in the Analysis of Climate Change Impacts on World Agriculture

    NASA Astrophysics Data System (ADS)

    Hertel, T. W.; Lobell, D. B.

    2010-12-01

    Accumulating evidence suggests that agricultural production could be greatly affected by climate change, but there remains little quantitative understanding of how these agricultural impacts would affect economic livelihoods in poor countries. The recent paper by Hertel, Burke and Lobell (GEC, 2010) considers three scenarios of agricultural impacts of climate change, corresponding to the fifth, fiftieth, and ninety-fifth percentiles of projected yield distributions for the world’s crops in 2030. They evaluate the resulting changes in global commodity prices, national economic welfare, and the incidence of poverty in a set of 15 developing countries. Although the small price changes under the medium scenario are consistent with previous findings, their low productivity scenario reveals the potential for much larger food price changes than reported in recent studies which have hitherto focused on the most likely outcomes. The poverty impacts of price changes under the extremely adverse scenario are quite heterogeneous and very significant in some population strata. They conclude that it is critical to look beyond central case climate shocks and beyond a simple focus on yields and highly aggregated poverty impacts. In this paper, we conduct a more formal, systematic sensitivity analysis (SSA) with respect to uncertainty in the biophysical impacts of climate change on agriculture, by explicitly specifying joint distributions for global yield changes - this time focusing on 2050. This permits us to place confidence intervals on the resulting price impacts and poverty results which reflect the uncertainty inherited from the biophysical side of the analysis. We contrast this with the economic uncertainty inherited from the global general equilibrium model (GTAP), by undertaking SSA with respect to the behavioral parameters in that model. This permits us to assess which type of uncertainty is more important for regional price and poverty outcomes. Finally, we undertake a

  9. Uncertainty Analysis and Order-by-Order Optimization of Chiral Nuclear Interactions

    NASA Astrophysics Data System (ADS)

    Carlsson, B. D.; Ekström, A.; Forssén, C.; Strömberg, D. Fahlin; Jansen, G. R.; Lilja, O.; Lindby, M.; Mattsson, B. A.; Wendt, K. A.

    2016-01-01

    Chiral effective field theory (χEFT) provides a systematic approach to describe low-energy nuclear forces. Moreover, χEFT is able to provide well-founded estimates of statistical and systematic uncertainties—although this unique advantage has not yet been fully exploited. We fill this gap by performing an optimization and statistical analysis of all the low-energy constants (LECs) up to next-to-next-to-leading order. Our optimization protocol corresponds to a simultaneous fit to scattering and bound-state observables in the pion-nucleon, nucleon-nucleon, and few-nucleon sectors, thereby utilizing the full model capabilities of χEFT. Finally, we study the effect on other observables by demonstrating forward-error-propagation methods that can easily be adopted by future works. We employ mathematical optimization and implement automatic differentiation to attain efficient and machine-precise first- and second-order derivatives of the objective function with respect to the LECs. This is also vital for the regression analysis. We use power-counting arguments to estimate the systematic uncertainty that is inherent to χEFT, and we construct chiral interactions at different orders with quantified uncertainties. Statistical error propagation is compared with Monte Carlo sampling, showing that statistical errors are, in general, small compared to systematic ones. In conclusion, we find that a simultaneous fit to different sets of data is critical to (i) identify the optimal set of LECs, (ii) capture all relevant correlations, (iii) reduce the statistical uncertainty, and (iv) attain order-by-order convergence in χEFT. Furthermore, certain systematic uncertainties in the few-nucleon sector are shown to get substantially magnified in the many-body sector, in particular when varying the cutoff in the chiral potentials. The methodology and results presented in this paper open a new frontier for uncertainty quantification in ab initio nuclear theory.
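
    A toy sketch of the two statistical error-propagation routes mentioned above (the LEC values, covariance, and observable are invented stand-ins, not the paper's interactions): linear propagation of the LEC covariance through the Jacobian versus direct Monte Carlo sampling of the LECs.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    mean = np.array([1.2, -0.4, 0.7])                   # fitted LECs (illustrative)
    C = np.diag([0.01, 0.02, 0.05])                     # LEC covariance from the fit (illustrative)

    def observable(lec):
        return lec[0] ** 2 + 0.5 * lec[1] * lec[2]      # stand-in for a derived observable

    # linear (forward) propagation: sigma^2 = J C J^T with the analytic Jacobian at the fit
    J = np.array([2 * mean[0], 0.5 * mean[2], 0.5 * mean[1]])
    sigma_linear = np.sqrt(J @ C @ J)

    # Monte Carlo propagation: sample the LECs and evaluate the observable directly
    samples = rng.multivariate_normal(mean, C, size=50_000)
    sigma_mc = np.array([observable(s) for s in samples]).std()
    print(f"statistical uncertainty, linear: {sigma_linear:.3f}   Monte Carlo: {sigma_mc:.3f}")
    ```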

  10. A comparison of uncertainty analysis methods using a groundwater flow model

    SciTech Connect

    Doctor, P.G.; Jacobson, E.A.; Buchanan, J.A.

    1988-06-01

    This report evaluates three uncertainty analysis methods that are proposed for use in performance assessment activities within the OCRWM and Nuclear Regulatory Commission (NRC) communities. The three methods are Monte Carlo simulation with unconstrained sampling, Monte Carlo simulation with Latin Hypercube sampling, and first-order analysis. Monte Carlo simulation with unconstrained sampling is a generally accepted uncertainty analysis method, but it has the disadvantage of being costly and time consuming. Latin Hypercube sampling was proposed to make Monte Carlo simulation more efficient. Although it was originally formulated for independent variables, which is a major drawback in performance assessment modeling, Latin Hypercube can be used to generate correlated samples. The first-order method is efficient to implement because it is based on the first-order Taylor series expansion; however, there is concern that it does not adequately describe the variability for complex models. These three uncertainty analysis methods were evaluated using a calibrated groundwater flow model of an unconfined aquifer in southern Arizona. The two simulation methods produced similar results, although the Latin Hypercube method tends to produce samples whose estimates of statistical parameters are closer to the desired parameters. The mean travel times for the first-order method do not agree with those of the simulations. In addition, the first-order method produces estimates of variance in travel times that are more variable than those produced by the simulation methods, resulting in nonconservative tolerance intervals. 13 refs., 33 figs.
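
    A small sketch of the two sampling strategies being compared (the travel-time model and parameter ranges are illustrative, not the Arizona aquifer model): simple random Monte Carlo versus Latin Hypercube sampling of the same two inputs.

    ```python
    import numpy as np
    from scipy.stats import qmc

    rng = np.random.default_rng(4)
    n = 100
    low = np.array([1e-5, 0.1])            # hydraulic conductivity (m/s), porosity (lower bounds)
    high = np.array([1e-4, 0.3])           # upper bounds (hypothetical)

    def travel_time(k, phi, length=1000.0, gradient=0.01):
        """Toy advective travel time along a flow path, in days."""
        return length * phi / (k * gradient) / 86400.0

    # unconstrained (simple random) Monte Carlo sample on the unit hypercube
    u_mc = rng.uniform(size=(n, 2))
    # Latin Hypercube sample: one point per equal-probability stratum of each input
    u_lhs = qmc.LatinHypercube(d=2, seed=4).random(n)

    for name, u in [("simple MC", u_mc), ("Latin Hypercube", u_lhs)]:
        x = qmc.scale(u, low, high)
        t = travel_time(x[:, 0], x[:, 1])
        print(f"{name}: mean travel time {t.mean():.0f} d, std {t.std():.0f} d")
    ```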

  11. Uncertainty analysis for an effluent trading system in a typical nonpoint-sources-polluted watershed

    NASA Astrophysics Data System (ADS)

    Chen, Lei; Han, Zhaoxing; Wang, Guobo; Shen, Zhenyao

    2016-07-01

    Conventional effluent trading systems (ETSs) between point sources (PSs) and nonpoint sources (NPSs) are often unreliable because of the uncertain characteristics of NPSs. In this study, a new framework was established for PS-NPS ETSs, and a comprehensive analysis was conducted by quantifying the impacts of the uncertainties associated with the water assimilative capacity (WAC), NPS emissions, and measurement effectiveness. The results indicate that the uncertain characteristics of NPSs result in a less cost-effective PS-NPS ETS during most hydrological periods, and that a clear transition occurs from the WAC constraint to the water quality constraint when these stochastic factors are considered. Specifically, the emission uncertainty had a greater impact on PSs, but an increase in the emission or abatement uncertainty caused the abatement efforts to shift from NPSs toward PSs. Moreover, the error transitivity from the WAC to conventional ETS approaches is more obvious than that to the WEFZ-based ETS. When NPS emissions are relatively high, structural BMPs should be considered for trading, and vice versa. These results are critical to understanding the impacts of uncertainty on the functionality of PS-NPS ETSs and to providing a trade-off between the confidence level and abatement efforts.

  12. Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model

    SciTech Connect

    Urrego-Blanco, Jorge Rolando; Urban, Nathan Mark; Hunke, Elizabeth Clare; Turner, Adrian Keith; Jeffery, Nicole

    2016-04-01

    Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. Lastly, it is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.
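
    The variance-based (Sobol') approach described above can be illustrated on a small stand-alone function. In the sketch below, the three-parameter toy model is an assumed stand-in for the sea ice model, and first-order and total indices are computed with the standard Saltelli/Jansen estimators on a Sobol' sequence.

      import numpy as np
      from scipy.stats import qmc

      # Toy model with three parameters on [0, 1] (stand-ins for, e.g., snow conductivity,
      # snow grain size and melt-pond drainage); purely illustrative, not the CICE parameterization.
      def model(x):
          return np.sin(x[:, 0]) + 2.0 * x[:, 1] ** 2 + 0.3 * x[:, 0] * x[:, 2]

      d, N = 3, 2 ** 12
      sob = qmc.Sobol(d=2 * d, scramble=True, seed=0).random(N)
      A, B = sob[:, :d], sob[:, d:]                  # two independent sample matrices
      fA, fB = model(A), model(B)
      var = np.var(np.concatenate([fA, fB]))

      for i in range(d):
          ABi = A.copy()
          ABi[:, i] = B[:, i]                        # replace column i of A by column i of B
          fABi = model(ABi)
          S1 = np.mean(fB * (fABi - fA)) / var       # first-order Sobol' index (Saltelli estimator)
          ST = 0.5 * np.mean((fA - fABi) ** 2) / var # total-effect index (Jansen estimator)
          print(f"parameter {i}: S1 = {S1:.3f}, ST = {ST:.3f}")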

  13. Uncertainty analysis for an effluent trading system in a typical nonpoint-sources-polluted watershed

    PubMed Central

    Chen, Lei; Han, Zhaoxing; Wang, Guobo; Shen, Zhenyao

    2016-01-01

    Conventional effluent trading systems (ETSs) between point sources (PSs) and nonpoint sources (NPSs) are often unreliable because of the uncertain characteristics of NPSs. In this study, a new framework was established for PS-NPS ETSs, and a comprehensive analysis was conducted by quantifying the impacts of the uncertainties associated with the water assimilative capacity (WAC), NPS emissions, and measurement effectiveness. The results indicate that the uncertain characteristics of NPSs result in a less cost-effective PS-NPS ETS during most hydrological periods, and that a clear transition occurs from the WAC constraint to the water quality constraint when these stochastic factors are considered. Specifically, the emission uncertainty had a greater impact on PSs, but an increase in the emission or abatement uncertainty caused the abatement efforts to shift from NPSs toward PSs. Moreover, the error transitivity from the WAC to conventional ETS approaches is more obvious than that to the WEFZ-based ETS. When NPS emissions are relatively high, structural BMPs should be considered for trading, and vice versa. These results are critical to understanding the impacts of uncertainty on the functionality of PS-NPS ETSs and to providing a trade-off between the confidence level and abatement efforts. PMID:27406070

  14. Uncertainty in the Bayesian meta-analysis of normally distributed surrogate endpoints.

    PubMed

    Bujkiewicz, Sylwia; Thompson, John R; Spata, Enti; Abrams, Keith R

    2015-08-13

    We investigate the effect of the choice of parameterisation of meta-analytic models and related uncertainty on the validation of surrogate endpoints. Different meta-analytical approaches take into account different levels of uncertainty which may impact on the accuracy of the predictions of treatment effect on the target outcome from the treatment effect on a surrogate endpoint obtained from these models. A range of Bayesian as well as frequentist meta-analytical methods are implemented using illustrative examples in relapsing-remitting multiple sclerosis, where the treatment effect on disability worsening is the primary outcome of interest in healthcare evaluation, while the effect on relapse rate is considered as a potential surrogate to the effect on disability progression, and in gastric cancer, where the disease-free survival has been shown to be a good surrogate endpoint to the overall survival. Sensitivity analysis was carried out to assess the impact of distributional assumptions on the predictions. Also, sensitivity to modelling assumptions and performance of the models were investigated by simulation. Although different methods can predict mean true outcome almost equally well, inclusion of uncertainty around all relevant parameters of the model may lead to less certain and hence more conservative predictions. When investigating endpoints as candidate surrogate outcomes, a careful choice of the meta-analytical approach has to be made. Models underestimating the uncertainty of available evidence may lead to overoptimistic predictions which can then have an effect on decisions made based on such predictions. © The Author(s) 2015.

  15. Uncertainty analysis for an effluent trading system in a typical nonpoint-sources-polluted watershed.

    PubMed

    Chen, Lei; Han, Zhaoxing; Wang, Guobo; Shen, Zhenyao

    2016-07-11

    Conventional effluent trading systems (ETSs) between point sources (PSs) and nonpoint sources (NPSs) are often unreliable because of the uncertain characteristics of NPSs. In this study, a new framework was established for PS-NPS ETSs, and a comprehensive analysis was conducted by quantifying the impacts of the uncertainties associated with the water assimilative capacity (WAC), NPS emissions, and measurement effectiveness. The results indicate that the uncertain characteristics of NPSs result in a less cost-effective PS-NPS ETS during most hydrological periods, and that a clear transition occurs from the WAC constraint to the water quality constraint when these stochastic factors are considered. Specifically, the emission uncertainty had a greater impact on PSs, but an increase in the emission or abatement uncertainty caused the abatement efforts to shift from NPSs toward PSs. Moreover, the error transitivity from the WAC to conventional ETS approaches is more obvious than that to the WEFZ-based ETS. When NPS emissions are relatively high, structural BMPs should be considered for trading, and vice versa. These results are critical to understanding the impacts of uncertainty on the functionality of PS-NPS ETSs and to providing a trade-off between the confidence level and abatement efforts.

  16. [Application of uncertainty assessment in NIR quantitative analysis of traditional Chinese medicine].

    PubMed

    Xue, Zhong; Xu, Bing; Liu, Qian; Shi, Xin-Yuan; Li, Jian-Yu; Wu, Zhi-Sheng; Qiao, Yan-Jiang

    2014-10-01

    The near infrared (NIR) spectra of Liuyi San samples were collected during the mixing process and quantitative models were generated by the PLS (partial least squares) method for the quantification of the concentration of glycyrrhizin. The PLS quantitative model had good calibration and prediction performances (r(cal) = 0.9985, RMSEC = 0.044 mg · g(-1); r(val) = 0.9474, RMSEP = 0.124 mg · g(-1)), indicating that NIR spectroscopy can be used as a rapid method for determining the concentration of glycyrrhizin in Liuyi San powder. After the validation tests were designed, the Liao-Lin-Iyer approach based on Monte Carlo simulation was used to estimate β-content-γ-confidence tolerance intervals. The uncertainty was then calculated and the uncertainty profile was drawn. The NIR analytical method was considered valid when the concentration of glycyrrhizin is above 1.56 mg · g(-1), since the uncertainty falls within the acceptable limits (λ = ± 20%). The results showed that uncertainty assessment can be applied to NIR quantitative models of glycyrrhizin at different concentrations and provide a reference for completing uncertainty assessments of NIR quantitative analyses of other traditional Chinese medicines.
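
    The tolerance-interval idea can be illustrated with a generic Monte Carlo check of whether a candidate tolerance factor delivers β content with γ confidence. This is not the Liao-Lin-Iyer implementation used in the paper; the normal measurement model and all numbers are assumed.

      import numpy as np
      from scipy.stats import norm

      # Generic Monte Carlo check of a beta-content, gamma-confidence tolerance factor.
      rng = np.random.default_rng(1)
      beta, gamma, n = 0.90, 0.95, 20          # content, confidence, validation sample size
      mu, sd = 2.0, 0.10                       # assumed true mean/SD of recovered concentration [mg/g]
      kfac = 2.2                               # candidate tolerance factor to be tested

      n_mc = 5000
      contains_beta = 0
      for _ in range(n_mc):
          x = rng.normal(mu, sd, n)            # simulated validation measurements
          m, s = x.mean(), x.std(ddof=1)
          content = norm.cdf(m + kfac * s, mu, sd) - norm.cdf(m - kfac * s, mu, sd)
          contains_beta += content >= beta     # does the interval cover >= beta of the population?

      print(f"confidence achieved by k = {kfac}: {contains_beta / n_mc:.3f} (target gamma = {gamma})")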

  17. Adjoint-Based Sensitivity and Uncertainty Analysis for Density and Composition: A User’s Guide

    DOE PAGES

    Favorite, Jeffrey A.; Perkó, Zoltán; Kiedrowski, Brian C.; ...

    2017-03-01

    The evaluation of uncertainties is essential for criticality safety. Our paper deals with material density and composition uncertainties and provides guidance on how traditional first-order sensitivity methods can be used to predict their effects. Unlike problems that deal with traditional cross-section uncertainty analysis, material density and composition-related problems are often characterized by constraints that do not allow arbitrary and independent variations of the input parameters. Their proper handling requires constrained sensitivities that take into account the interdependence of the inputs. This paper discusses how traditional unconstrained isotopic density sensitivities can be calculated using the adjoint sensitivity capabilities of the popular Monte Carlo codes MCNP6 and SCALE 6.2, and we also present the equations to be used when forward and adjoint flux distributions are available. Subsequently, we show how the constrained sensitivities can be computed using the unconstrained (adjoint-based) sensitivities as well as by applying central differences directly. We present three distinct procedures for enforcing the constraint on the input variables, each leading to different constrained sensitivities. As a guide, the sensitivity and uncertainty formulas for several frequently encountered specific cases involving densities and compositions are given. One analytic k∞ example highlights the relationship between constrained sensitivity formulas and central differences, and a more realistic numerical problem reveals similarities among the computer codes used and differences among the three methods of enforcing the constraint.
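
    The distinction between unconstrained and constrained sensitivities can be seen in a two-isotope toy problem. The k(N) function below is hypothetical, and only one of several possible ways of enforcing the constraint (rescaling the remaining densities so the total stays fixed) is shown.

      import numpy as np

      # Hypothetical multiplication-factor model k(N) of two isotopic densities (illustration only).
      def k_eff(N):
          return 1.0 + 0.3 * N[0] - 0.8 * N[1] + 0.1 * N[0] * N[1]

      N0 = np.array([0.04, 0.01])                      # assumed nominal densities [atoms/b-cm]
      h = 1e-4                                         # relative perturbation for central differences

      for i in range(2):
          dN = np.zeros(2)
          dN[i] = h * N0[i]

          # Unconstrained relative sensitivity: only N_i is varied.
          S_u = (k_eff(N0 + dN) - k_eff(N0 - dN)) / (2 * h * k_eff(N0))

          # Constrained sensitivity: N_i is varied while the other density is rescaled
          # so that the total density stays fixed (one of several possible constraints).
          def rescaled(Np):
              other = [j for j in range(2) if j != i]
              Nc = Np.copy()
              Nc[other] *= (N0.sum() - Np[i]) / N0[other].sum()
              return Nc
          S_c = (k_eff(rescaled(N0 + dN)) - k_eff(rescaled(N0 - dN))) / (2 * h * k_eff(N0))

          print(f"isotope {i}: unconstrained S = {S_u:+.4f}, constrained S = {S_c:+.4f}")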

  18. Implementation and Validation of Uncertainty Analysis of Available Energy and Available Power

    SciTech Connect

    Jon P. Christophersen; John L. Morrison; B. J. Schubert; Shawn Allred

    2007-04-01

    The Idaho National Laboratory does extensive testing and evaluation of state-of-the-art batteries and ultracapacitors for hybrid-electric vehicle applications as part of the FreedomCAR and Vehicle Technologies Program. Significant parameters of interest include Available Energy and Available Power. Documenting the uncertainty analysis of these derived parameters is a very complex problem. The error is an unknown combination of both linearity and offset; the analysis presented in this paper computes the uncertainty both ways and then assumes the most conservative result (the worst-case scenario). Each method requires the use of over 134 equations, some of which are derived and some of which are based on measured values. These include the measurement device error (calibration error) as well as the bit-resolution and analog-noise errors (standard deviation errors). The equations were implemented in Matlab (an array-based programming language) to obtain a closed-form answer, which was then validated using Monte Carlo simulations.

  19. Cramér-Rao analysis of orientation estimation: influence of target model uncertainties.

    PubMed

    Gerwe, David R; Hill, Jennifer L; Idell, Paul S

    2003-05-01

    We explore the use of Cramér-Rao bound calculations for predicting fundamental limits on the accuracy with which target characteristics can be determined by using imaging sensors. In particular, estimation of satellite orientation from high-resolution sensors is examined. The analysis role that such bounds provide for sensor/experiment design, operation, and upgrade is discussed. Emphasis is placed on the importance of including all relevant target/sensor uncertainties in the analysis. Computer simulations are performed that illustrate that uncertainties in target features (e.g., shape, reflectance, and relative orientation) have a significant impact on the bounds and provide considerable insight as to how details of the three-dimensional target structure may influence the estimation process. The simulations also address the impact that a priori information has on the bounds.
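
    A one-dimensional toy version of such a calculation is sketched below: the Fisher information matrix for a Gaussian-noise "image" is built from numerical derivatives of the mean image, and the Cramér-Rao bound on an orientation-like parameter is compared with and without an uncertain reflectance-like nuisance parameter. The signal model and all numbers are assumed, not the sensor model of the paper.

      import numpy as np

      # Toy 1-D "image": pixel means mu_j = a * g(x_j; theta), Gaussian noise of variance sigma^2.
      # theta plays the role of the orientation angle, a of an uncertain reflectance (nuisance).
      x = np.linspace(-1.0, 1.0, 64)
      sigma = 0.05

      def mean_image(theta, a):
          return a * np.exp(-((x - 0.3 * np.sin(theta)) ** 2) / 0.02)

      theta0, a0, eps = 0.4, 1.0, 1e-6
      d_theta = (mean_image(theta0 + eps, a0) - mean_image(theta0 - eps, a0)) / (2 * eps)
      d_a = (mean_image(theta0, a0 + eps) - mean_image(theta0, a0 - eps)) / (2 * eps)

      # Fisher information for Gaussian noise: I_kl = (1/sigma^2) * sum_j dmu/dp_k dmu/dp_l
      J = np.vstack([d_theta, d_a])
      I = (J @ J.T) / sigma ** 2

      crb_known_a = 1.0 / I[0, 0]                  # reflectance known exactly
      crb_unknown_a = np.linalg.inv(I)[0, 0]       # reflectance treated as a nuisance parameter

      print(f"CRB on theta, a known  : {crb_known_a:.3e}")
      print(f"CRB on theta, a unknown: {crb_unknown_a:.3e}  (never smaller)")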

  20. Cramér-Rao analysis of orientation estimation: influence of target model uncertainties

    NASA Astrophysics Data System (ADS)

    Gerwe, David R.; Hill, Jennifer L.; Idell, Paul S.

    2003-05-01

    We explore the use of Cramér-Rao bound calculations for predicting fundamental limits on the accuracy with which target characteristics can be determined by using imaging sensors. In particular, estimation of satellite orientation from high-resolution sensors is examined. The analysis role that such bounds provide for sensor/experiment design, operation, and upgrade is discussed. Emphasis is placed on the importance of including all relevant target/sensor uncertainties in the analysis. Computer simulations are performed that illustrate that uncertainties in target features (e.g., shape, reflectance, and relative orientation) have a significant impact on the bounds and provide considerable insight as to how details of the three-dimensional target structure may influence the estimation process. The simulations also address the impact that a priori information has on the bounds.

  1. Heterogenic Solid Biofuel Sampling Methodology and Uncertainty Associated with Prompt Analysis

    PubMed Central

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Patiño, David; Collazo, Joaquín

    2010-01-01

    Accurate determination of the properties of biomass is of particular interest in studies on biomass combustion or cofiring. The aim of this paper is to develop a methodology for prompt analysis of heterogeneous solid fuels with an acceptable degree of accuracy. Special care must be taken with the sampling procedure to achieve an acceptable degree of error and low statistical uncertainty. A sampling and error determination methodology for prompt analysis is presented and validated. Two approaches for the propagation of errors are also given and some comparisons are made in order to determine which may be better in this context. Results show in general low, acceptable levels of uncertainty, demonstrating that the samples obtained in the process are representative of the overall fuel composition. PMID:20559506

  2. Ecological risk analysis and genetically modified salmon: management in the face of uncertainty.

    PubMed

    Moreau, Darek T R

    2014-02-01

    The commercialization of growth hormone transgenic Atlantic salmon for aquaculture has become a controversial public policy issue. Concerns exist over the potential ecological effects of this biotechnology should animals escape captivity. From within an ecological risk-analysis framework, science has been sought to provide decision makers with evidence upon which to base regulatory decisions pertaining to genetically modified salmon. Here I review the available empirical information on the potential ecological and genetic effects of transgenic salmon and discuss the underlying eco-evolutionary science behind the topic. I conclude that data gaps and irreducible epistemic uncertainties limit the role of scientific inference in support of ecological risk management for transgenic salmon. I argue that predictive uncertainties are pervasive in complex eco-evolutionary systems and that it behooves those involved in the risk-analysis process to accept and communicate these limitations in the interest of timely, clear, and cautious risk-management options.

  3. Uncertainty Optimization Applied to the Monte Carlo Analysis of Planetary Entry Trajectories

    NASA Technical Reports Server (NTRS)

    Olds, John; Way, David

    2001-01-01

    Recently, strong evidence of liquid water under the surface of Mars and a meteorite that might contain ancient microbes have renewed interest in Mars exploration. With this renewed interest, NASA plans to send spacecraft to Mars approx. every 26 months. These future spacecraft will return higher-resolution images, make precision landings, engage in longer-ranging surface maneuvers, and even return Martian soil and rock samples to Earth. Future robotic missions and any human missions to Mars will require precise entries to ensure safe landings near science objectives and pre-deployed assets. Potential sources of water and other interesting geographic features are often located near hazards, such as within craters or along canyon walls. In order for more accurate landings to be made, spacecraft entering the Martian atmosphere need to use lift to actively control the entry. This active guidance results in much smaller landing footprints. Planning for these missions will depend heavily on Monte Carlo analysis. Monte Carlo trajectory simulations have been used with a high degree of success in recent planetary exploration missions. These analyses ascertain the impact of off-nominal conditions during a flight and account for uncertainty. Uncertainties generally stem from limitations in manufacturing tolerances, measurement capabilities, analysis accuracies, and environmental unknowns. Thousands of off-nominal trajectories are simulated by randomly dispersing uncertainty variables and collecting statistics on forecast variables. The dependability of Monte Carlo forecasts, however, is limited by the accuracy and completeness of the assumed uncertainties. This is because Monte Carlo analysis is a forward-driven problem, beginning with the input uncertainties and proceeding to the forecast outputs. It lacks a mechanism to affect or alter the uncertainties based on the forecast results. If the results are unacceptable, the current practice is to use an iterative, trial

  4. Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems

    NASA Technical Reports Server (NTRS)

    He, Yuning; Davies, Misty Dawn

    2014-01-01

    The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.
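
    A minimal sketch of such an iterative detect-and-refine loop is given below, using a Gaussian process classifier and a simple "most uncertain point next" rule. The disc-shaped safe region standing in for the simulator, and all settings, are assumed for illustration and are not the authors' method.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessClassifier
      from sklearn.gaussian_process.kernels import RBF

      # Hypothetical "simulator": safe iff the operating point lies inside a disc (illustration only).
      def is_safe(p):
          return float(np.linalg.norm(p - 0.5) < 0.35)

      # Initial design: a coarse 3x3 grid guarantees both safe and unsafe outcomes.
      g = np.linspace(0.1, 0.9, 3)
      X = np.array([[a, b] for a in g for b in g])
      y = np.array([is_safe(p) for p in X])

      rng = np.random.default_rng(0)
      candidates = rng.random((2000, 2))            # pool of candidate simulation settings

      for it in range(15):                          # active-learning loop
          gpc = GaussianProcessClassifier(kernel=RBF(0.2)).fit(X, y)
          p_safe = gpc.predict_proba(candidates)[:, 1]
          nxt = candidates[np.argmin(np.abs(p_safe - 0.5))]   # most uncertain candidate (p ~ 0.5)
          X = np.vstack([X, nxt])
          y = np.append(y, is_safe(nxt))

      print("final training set size:", len(X))
      print("estimated P(safe) at (0.5, 0.8):",
            gpc.predict_proba(np.array([[0.5, 0.8]]))[0, 1].round(3))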

  5. A novel primary system for compressible flow calibration uncertainty analysis for the preliminary design

    SciTech Connect

    Kegel, T.

    1995-08-01

    The operation of a primary system for compressible flow calibration is typically based on either a gravimetric or volumetric method of mass determination. The gravimetric method provides direct determination of mass while the volumetric method utilizes measurements of density and volume. This paper describes the preliminary design of a primary system that features both gravimetric and volumetric mass determination. The emphasis is on the presentation of an uncertainty analysis procedure to be used for preliminary design decisions.

  6. Stability analysis of thermo-acoustic nonlinear eigenproblems in annular combustors. Part II. Uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Magri, Luca; Bauerheim, Michael; Nicoud, Franck; Juniper, Matthew P.

    2016-11-01

    Monte Carlo and Active Subspace Identification methods are combined with first- and second-order adjoint sensitivities to perform (forward) uncertainty quantification analysis of the thermo-acoustic stability of two annular combustor configurations. This method is applied to evaluate the risk factor, i.e., the probability for the system to be unstable. It is shown that the adjoint approach reduces the number of nonlinear-eigenproblem calculations by as much as the number of Monte Carlo samples.
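
    The core of the adjoint shortcut can be shown on a toy linear stability problem: the eigenvalue gradient is obtained from the left and right eigenvectors, and the risk factor is then estimated by cheap Monte Carlo sampling of the linearized eigenvalue instead of repeated eigenproblem solves. The matrices and the parameter uncertainty below are assumed; a real thermo-acoustic network yields a nonlinear eigenproblem, but the idea is the same.

      import numpy as np

      # Toy stability problem: eigenvalues of A(p) = A0 + p1*E1 + p2*E2 (illustration only).
      rng = np.random.default_rng(0)
      A0 = np.array([[-0.2, 1.0], [-1.0, -0.1]])
      E1 = np.array([[0.5, 0.0], [0.0, 0.0]])
      E2 = np.array([[0.0, 0.0], [0.3, 0.0]])

      lam, V = np.linalg.eig(A0)
      k = np.argmax(lam.real)                          # least stable eigenvalue
      x = V[:, k]                                      # right eigenvector
      mu, W = np.linalg.eig(A0.conj().T)               # adjoint problem
      j = np.argmin(np.abs(mu - lam[k].conj()))        # matching left eigenvector
      y = W[:, j]

      # First-order (adjoint) sensitivities of the eigenvalue to the two parameters
      g = np.array([y.conj() @ E1 @ x, y.conj() @ E2 @ x]) / (y.conj() @ x)

      # Risk factor: probability that the growth rate becomes positive under parameter uncertainty
      p_samples = rng.normal(0.0, 0.4, size=(100000, 2))    # assumed parameter uncertainty
      growth = lam[k].real + (p_samples @ g).real           # linearized eigenvalue
      print("risk factor P(growth rate > 0):", (growth > 0).mean())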

  7. Quantitative uncertainty analysis of Life Cycle Assessment for algal biofuel production.

    PubMed

    Sills, Deborah L; Paramita, Vidia; Franke, Michael J; Johnson, Michael C; Akabas, Tal M; Greene, Charles H; Tester, Jefferson W

    2013-01-15

    As a result of algae's promise as a renewable energy feedstock, numerous studies have used Life Cycle Assessment (LCA) to quantify the environmental performance of algal biofuels, yet there is no consensus of results among them. Our work, motivated by the lack of comprehensive uncertainty analysis in previous studies, uses a Monte Carlo approach to estimate ranges of expected values of LCA metrics by incorporating parameter variability with empirically specified distribution functions. Results show that large uncertainties exist at virtually all steps of the biofuel production process. Although our findings agree with a number of earlier studies on matters such as the need for wet lipid extraction, nutrients recovered from waste streams, and high energy coproducts, the ranges of reported LCA metrics show that uncertainty analysis is crucial for developing technologies, such as algal biofuels. In addition, the ranges of energy return on (energy) invested (EROI) values resulting from our analysis help explain the high variability in EROI values from earlier studies. Reporting results from LCA models as ranges, and not single values, will more reliably inform industry and policy makers on expected energetic and environmental performance of biofuels produced from microalgae.

  8. Wavelet-Monte Carlo Hybrid System for HLW Nuclide Migration Modeling and Sensitivity and Uncertainty Analysis

    SciTech Connect

    Nasif, Hesham; Neyama, Atsushi

    2003-02-26

    This paper presents results of an uncertainty and sensitivity analysis for the performance of the different barriers of high-level radioactive waste repositories. SUA is a tool that performs uncertainty and sensitivity analysis on the output of the Wavelet Integrated Repository System (WIRS) model, which is developed to solve a system of nonlinear partial differential equations arising from the model formulation of radionuclide transport through the repository. SUA performs sensitivity analysis (SA) and uncertainty analysis (UA) on a sample output from Monte Carlo simulation. The sample is generated by WIRS and contains the output values of the maximum release rate in the form of time series, together with the values of the input variables, for a set of different simulations (runs), which are realized by varying the model input parameters. The Monte Carlo sample is generated with SUA as a pure random sample or using the Latin Hypercube sampling technique. Tchebycheff and Kolmogorov confidence bounds are computed on the maximum release rate for UA, and effective non-parametric statistics are used to rank the influence of the model input parameters for SA. Based on the results, we point out parameters that have primary influences on the performance of the engineered barrier system of a repository. The parameters found to be key contributors to the release rate are the selenium and cesium distribution coefficients in both the geosphere and the major water conducting fault (MWCF), and the diffusion depth and water flow rate in the excavation-disturbed zone (EDZ).
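
    The distribution-free bounds mentioned above can be written down in a few lines. The sketch below applies a Tchebycheff-type bound on the mean and a Kolmogorov (DKW) band on the empirical CDF to a synthetic sample of peak release rates; it is not the SUA tool itself, and the sample is assumed.

      import numpy as np

      rng = np.random.default_rng(3)
      release = rng.lognormal(mean=-18.0, sigma=1.2, size=200)   # assumed MC sample [arbitrary units]
      n, alpha = release.size, 0.05

      # Tchebycheff-type bound on the mean (uses the sample SD as a plug-in estimate)
      k = 1.0 / np.sqrt(alpha)
      half = k * release.std(ddof=1) / np.sqrt(n)
      print(f"mean = {release.mean():.3e}, 95% Tchebycheff bound = +/- {half:.3e}")

      # Kolmogorov (DKW) confidence band on the empirical CDF
      eps = np.sqrt(np.log(2.0 / alpha) / (2.0 * n))
      ecdf = np.arange(1, n + 1) / n
      band_lo, band_hi = np.clip(ecdf - eps, 0, 1), np.clip(ecdf + eps, 0, 1)
      print(f"DKW band half-width on the CDF: {eps:.3f}")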

  9. Uncertainty and sensitivity analysis of early exposure results with the MACCS Reactor Accident Consequence Model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; McKay, M.D.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 34 imprecisely known input variables on the following reactor accident consequences are studied: number of early fatalities, number of cases of prodromal vomiting, population dose within 10 mi of the reactor, population dose within 1000 mi of the reactor, individual early fatality probability within 1 mi of the reactor, and maximum early fatality distance. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: scaling factor for horizontal dispersion, dry deposition velocity, inhalation protection factor for nonevacuees, groundshine shielding factor for nonevacuees, early fatality hazard function alpha value for bone marrow exposure, and scaling factor for vertical dispersion.
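
    The sampling-and-regression machinery described above can be illustrated with partial rank correlation coefficients computed from a Latin hypercube sample of a toy consequence model. The model and its three inputs are assumed stand-ins (e.g., for a dispersion scaling factor, a deposition velocity and a shielding factor), not the MACCS variables.

      import numpy as np
      from scipy.stats import qmc, rankdata

      # Toy consequence model with three imprecisely known inputs (illustration only).
      def consequence(x):
          return 3.0 * x[:, 0] ** 2 + 0.5 * x[:, 1] + 0.1 * x[:, 2] * x[:, 1]

      n, d = 500, 3
      X = qmc.LatinHypercube(d=d, seed=0).random(n)      # Latin hypercube sample on [0, 1]^3
      Y = consequence(X)

      # Partial rank correlation coefficients: rank-transform, then correlate the residuals of
      # x_i and y after regressing out all the other inputs.
      R = np.column_stack([rankdata(c) for c in X.T])
      ry = rankdata(Y)
      for i in range(d):
          others = np.column_stack([np.ones(n), np.delete(R, i, axis=1)])
          res_x = R[:, i] - others @ np.linalg.lstsq(others, R[:, i], rcond=None)[0]
          res_y = ry - others @ np.linalg.lstsq(others, ry, rcond=None)[0]
          prcc = np.corrcoef(res_x, res_y)[0, 1]
          print(f"input {i}: PRCC = {prcc:+.3f}")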

  10. Report on INL Activities for Uncertainty Reduction Analysis of FY10

    SciTech Connect

    G. Palmiotti; H. Hiruta; M. Salvatores

    2010-09-01

    The work scope of this project, related to the "Uncertainty Reduction Analyses" Work Packages aimed at reducing nuclear data uncertainties, is to produce a set of improved nuclear data to be used both for a wide range of validated advanced fast reactor design calculations and for providing guidelines for further improvements of the ENDF/B files (i.e. ENDF/B-VII, and future releases). This report presents the status of activities performed at INL under the FC R&D Work Package previously mentioned. First, an analysis of uncertainty evaluation is presented using the new covariance data (AFCI version 1.2) made available by BNL. Then, analyses of a number of experiments, among those selected in the previous fiscal year and available, are presented making use of ENDF/B-VII data. These experiments include: updating of the ZPR-6/7 assembly (improved model and spectral indices), the ZPPR-9 assembly (only a simplified model available), ZPPR-10 (full detailed model), and irradiation experiments. The latter include PROFIL-1, for which a new methodology was employed in the Monte Carlo calculations and a deterministic analysis was also performed. This is the first time the Monte Carlo approach and ENDF/B-VII have been used for the PROFIL experiments. The PROFIL-2 and TRAPU experiments have so far only been modeled; a full analysis of the irradiation results will be finalized next fiscal year.

  11. The Importance of Behavioral Thresholds and Objective Functions in Contaminant Transport Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Sykes, J. F.; Kang, M.; Thomson, N. R.

    2007-12-01

    The TCE release from The Lockformer Company in Lisle, Illinois, resulted in a plume in a confined aquifer that is more than 4 km long and impacted more than 300 residential wells. Many of the wells are on the fringe of the plume and have concentrations that did not exceed 5 ppb. The settlement for the Chapter 11 bankruptcy protection of Lockformer involved the establishment of a trust fund that compensates individuals with cancers, with payments being based on cancer type, estimated TCE concentration in the well, and the duration of exposure to TCE. The estimation of early arrival times, and hence low-likelihood events, is critical in the determination of the eligibility of an individual for compensation. Thus, an emphasis must be placed on the accuracy of the leading tail region in the likelihood distribution of possible arrival times at a well. The estimation of TCE arrival time, using a three-dimensional analytical solution, involved parameter estimation and uncertainty analysis. Parameters in the model included TCE source parameters, groundwater velocities, dispersivities, and the TCE decay coefficient for both the confining layer and the bedrock aquifer. Numerous objective functions, which include the well-known L2-estimator, robust estimators (L1-estimators and M-estimators), penalty functions, and dead zones, were incorporated in the parameter estimation process to treat insufficiencies in both the model and observational data due to errors, biases, and limitations. The concept of equifinality was adopted and multiple maximum likelihood parameter sets were accepted if pre-defined physical criteria were met. The criteria ensured that a valid solution predicted TCE concentrations for all TCE-impacted areas. Monte Carlo sampling is found to be inadequate for uncertainty analysis of this case study because of its inability to find parameter sets that meet the predefined physical criteria. Successful results are achieved using a Dynamically-Dimensioned Search sampling
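
    The effect of swapping the L2 objective for a robust M-estimator is easy to demonstrate on synthetic data with a single gross outlier, as in the sketch below; the linear "model" and the data are assumed for illustration and are unrelated to the actual transport model.

      import numpy as np
      from scipy.optimize import least_squares

      # Synthetic calibration data with one gross outlier.
      rng = np.random.default_rng(2)
      t = np.linspace(0.0, 10.0, 30)
      obs = 2.0 * t + 1.0 + rng.normal(0.0, 0.3, t.size)
      obs[5] += 15.0                                   # a biased / erroneous observation

      def residuals(p):
          return p[0] * t + p[1] - obs

      fit_l2 = least_squares(residuals, x0=[1.0, 0.0])                              # classical L2 objective
      fit_rob = least_squares(residuals, x0=[1.0, 0.0], loss="huber", f_scale=1.0)  # M-estimator

      print("L2 estimate    :", fit_l2.x.round(3))
      print("Huber estimate :", fit_rob.x.round(3), "(less affected by the outlier)")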

  12. Uncertainty analysis of the Operational Simplified Surface Energy Balance (SSEBop) model at multiple flux tower sites

    NASA Astrophysics Data System (ADS)

    Chen, Mingshi; Senay, Gabriel B.; Singh, Ramesh K.; Verdin, James P.

    2016-05-01

    Evapotranspiration (ET) is an important component of the water cycle - ET from the land surface returns approximately 60% of the global precipitation back to the atmosphere. ET also plays an important role in energy transport among the biosphere, atmosphere, and hydrosphere. Current regional to global and daily to annual ET estimation relies mainly on surface energy balance (SEB) ET models or statistical and empirical methods driven by remote sensing data and various climatological databases. These models have uncertainties due to inevitable input errors, poorly defined parameters, and inadequate model structures. The eddy covariance measurements on water, energy, and carbon fluxes at the AmeriFlux tower sites provide an opportunity to assess the ET modeling uncertainties. In this study, we focused on uncertainty analysis of the Operational Simplified Surface Energy Balance (SSEBop) model for ET estimation at multiple AmeriFlux tower sites with diverse land cover characteristics and climatic conditions. The 8-day composite 1-km MODerate resolution Imaging Spectroradiometer (MODIS) land surface temperature (LST) was used as input land surface temperature for the SSEBop algorithms. The other input data were taken from the AmeriFlux database. Results of statistical analysis indicated that the SSEBop model performed well in estimating ET with an R2 of 0.86 between estimated ET and eddy covariance measurements at 42 AmeriFlux tower sites during 2001-2007. It was encouraging to see that the best performance was observed for croplands, where R2 was 0.92 with a root mean square error of 13 mm/month. The uncertainties or random errors from input variables and parameters of the SSEBop model led to monthly ET estimates with relative errors less than 20% across multiple flux tower sites distributed across different biomes. This uncertainty of the SSEBop model lies within the error range of other SEB models, suggesting systematic error or bias of the SSEBop model is within the

  13. Uncertainty analysis of the Operational Simplified Surface Energy Balance (SSEBop) model at multiple flux tower sites

    USGS Publications Warehouse

    Chen, Mingshi; Senay, Gabriel B.; Singh, Ramesh K.; Verdin, James P.

    2016-01-01

    Evapotranspiration (ET) is an important component of the water cycle – ET from the land surface returns approximately 60% of the global precipitation back to the atmosphere. ET also plays an important role in energy transport among the biosphere, atmosphere, and hydrosphere. Current regional to global and daily to annual ET estimation relies mainly on surface energy balance (SEB) ET models or statistical and empirical methods driven by remote sensing data and various climatological databases. These models have uncertainties due to inevitable input errors, poorly defined parameters, and inadequate model structures. The eddy covariance measurements on water, energy, and carbon fluxes at the AmeriFlux tower sites provide an opportunity to assess the ET modeling uncertainties. In this study, we focused on uncertainty analysis of the Operational Simplified Surface Energy Balance (SSEBop) model for ET estimation at multiple AmeriFlux tower sites with diverse land cover characteristics and climatic conditions. The 8-day composite 1-km MODerate resolution Imaging Spectroradiometer (MODIS) land surface temperature (LST) was used as input land surface temperature for the SSEBop algorithms. The other input data were taken from the AmeriFlux database. Results of statistical analysis indicated that the SSEBop model performed well in estimating ET with an R2 of 0.86 between estimated ET and eddy covariance measurements at 42 AmeriFlux tower sites during 2001–2007. It was encouraging to see that the best performance was observed for croplands, where R2 was 0.92 with a root mean square error of 13 mm/month. The uncertainties or random errors from input variables and parameters of the SSEBop model led to monthly ET estimates with relative errors less than 20% across multiple flux tower sites distributed across different biomes. This uncertainty of the SSEBop model lies within the error range of other SEB models, suggesting systematic error or bias of the SSEBop model is within

  14. Spatial uncertainty analysis: Propagation of interpolation errors in spatially distributed models

    USGS Publications Warehouse

    Phillips, D.L.; Marks, D.G.

    1996-01-01

    In simulation modelling, it is desirable to quantify model uncertainties and provide not only point estimates for output variables but confidence intervals as well. Spatially distributed physical and ecological process models are becoming widely used, with runs being made over a grid of points that represent the landscape. This requires input values at each grid point, which often have to be interpolated from irregularly scattered measurement sites, e.g., weather stations. Interpolation introduces spatially varying errors which propagate through the model. We extended established uncertainty analysis methods to a spatial domain for quantifying spatial patterns of input variable interpolation errors and how they propagate through a model to affect the uncertainty of the model output. We applied this to a model of potential evapotranspiration (PET) as a demonstration. We modelled PET for three time periods in 1990 as a function of temperature, humidity, and wind on a 10-km grid across the U.S. portion of the Columbia River Basin. Temperature, humidity, and wind speed were interpolated using kriging from 700-1000 supporting data points. Kriging standard deviations (SD) were used to quantify the spatially varying interpolation uncertainties. For each of 5693 grid points, 100 Monte Carlo simulations were done, using the kriged values of temperature, humidity, and wind, plus random error terms determined by the kriging SDs and the correlations of interpolation errors among the three variables. For the spring season example, kriging SDs averaged 2.6 °C for temperature, 8.7% for relative humidity, and 0.38 m s-1 for wind. The resultant PET estimates had coefficients of variation (CVs) ranging from 14% to 27% for the 10-km grid cells. Maps of PET means and CVs showed the spatial patterns of PET with a measure of its uncertainty due to interpolation of the input variables. This methodology should be applicable to a variety of spatially distributed models using interpolated
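
    For a single grid cell, the Monte Carlo step amounts to drawing correlated interpolation errors from the kriging SDs and pushing them through the model. The sketch below uses the kriging SDs quoted above, but the error correlation matrix and the simplified stand-in for the PET calculation are assumed.

      import numpy as np

      # Simplified stand-in for the PET calculation (not the actual model of the study).
      def pet(T, RH, u):
          return np.maximum(0.0, 0.3 * T * (1.0 - RH / 100.0) * (1.0 + 0.2 * u))

      kriged = np.array([12.0, 55.0, 3.0])          # kriged T [degC], RH [%], wind [m/s] (assumed)
      sd = np.array([2.6, 8.7, 0.38])               # kriging standard deviations (as in the abstract)
      corr = np.array([[1.0, -0.4, 0.1],            # assumed correlations of interpolation errors
                       [-0.4, 1.0, -0.2],
                       [0.1, -0.2, 1.0]])
      cov = corr * np.outer(sd, sd)

      rng = np.random.default_rng(0)
      draws = rng.multivariate_normal(kriged, cov, size=100)   # 100 Monte Carlo realizations
      vals = pet(draws[:, 0], draws[:, 1], draws[:, 2])
      print(f"PET mean = {vals.mean():.2f}, CV = {100 * vals.std() / vals.mean():.1f}%")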

  15. A guide to uncertainty quantification and sensitivity analysis for cardiovascular applications.

    PubMed

    Eck, Vinzenz Gregor; Donders, Wouter Paulus; Sturdy, Jacob; Feinberg, Jonathan; Delhaas, Tammo; Hellevik, Leif Rune; Huberts, Wouter

    2016-08-01

    As we shift from population-based medicine towards a more precise patient-specific regime guided by predictions of verified and well-established cardiovascular models, an urgent question arises: how sensitive are the model predictions to errors and uncertainties in the model inputs? To make our models suitable for clinical decision-making, precise knowledge of prediction reliability is of paramount importance. Efficient and practical methods for uncertainty quantification (UQ) and sensitivity analysis (SA) are therefore essential. In this work, we explain the concepts of global UQ and global, variance-based SA along with two often-used methods that are applicable to any model without requiring model implementation changes: Monte Carlo (MC) and polynomial chaos (PC). Furthermore, we propose a guide for UQ and SA according to a six-step procedure and demonstrate it for two clinically relevant cardiovascular models: model-based estimation of the fractional flow reserve (FFR) and model-based estimation of the total arterial compliance (CT). Both MC and PC produce identical results and may be used interchangeably to identify most significant model inputs with respect to uncertainty in model predictions of FFR and CT. However, PC is more cost-efficient as it requires an order of magnitude fewer model evaluations than MC. Additionally, we demonstrate that targeted reduction of uncertainty in the most significant model inputs reduces the uncertainty in the model predictions efficiently. In conclusion, this article offers a practical guide to UQ and SA to help move the clinical application of mathematical models forward. Copyright © 2015 John Wiley & Sons, Ltd.
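
    The MC-versus-PC comparison can be reproduced on a scalar toy model with one standard-normal input: plain Monte Carlo needs many model runs, while a small non-intrusive polynomial chaos (Hermite) surrogate fitted by regression needs only a handful. The model below is assumed and is not one of the cardiovascular models of the paper.

      import numpy as np
      from math import factorial

      # Toy scalar model of one uncertain input xi ~ N(0, 1); illustration only.
      def model(xi):
          return np.exp(0.3 * xi) + 0.1 * xi ** 2

      # Monte Carlo: many model evaluations
      rng = np.random.default_rng(0)
      xs = rng.normal(size=20000)
      y_mc = model(xs)
      print(f"MC : mean = {y_mc.mean():.4f}, var = {y_mc.var():.4f}  ({xs.size} runs)")

      # Polynomial chaos: fit a degree-4 Hermite expansion from only a handful of runs
      deg, n_train = 4, 12
      x_tr = rng.normal(size=n_train)
      Phi = np.polynomial.hermite_e.hermevander(x_tr, deg)     # probabilists' Hermite basis
      coef = np.linalg.lstsq(Phi, model(x_tr), rcond=None)[0]
      pc_mean = coef[0]                                        # higher modes have zero mean
      pc_var = sum(coef[k] ** 2 * factorial(k) for k in range(1, deg + 1))
      print(f"PC : mean = {pc_mean:.4f}, var = {pc_var:.4f}  ({n_train} runs)")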

  16. Predictive uncertainty analysis of plume distribution for geological carbon sequestration using sparse-grid Bayesian method

    NASA Astrophysics Data System (ADS)

    Shi, X.; Zhang, G.

    2013-12-01

    Because of the extensive computational burden, parametric uncertainty analyses are rarely conducted for geological carbon sequestration (GCS) process-based multi-phase models. The difficulty of predictive uncertainty analysis for the CO2 plume migration in realistic GCS models arises not only from the spatial distribution of the caprock and reservoir (i.e., heterogeneous model parameters), but also from the multiple local minima of the GCS parameter estimation problem caused by the complex nonlinear multi-phase (gas and aqueous), multi-component (water, CO2, salt) transport equations. The geological model built by Doughty and Pruess (2004) for the Frio pilot site (Texas) was selected and assumed to represent the 'true' system, which was composed of seven different facies (geological units) distributed among 10 layers. We chose to calibrate the permeabilities of these facies. Pressure and gas saturation values from this true model were then extracted and used as observations for subsequent model calibration. Random noise was added to the observations to approximate realistic field conditions. Each simulation of the model takes about 2 hours. In this study, we develop a new approach that improves the computational efficiency of Bayesian inference by constructing a surrogate system based on an adaptive sparse-grid stochastic collocation method. The surrogate-based global optimization algorithm is first used to calibrate the model parameters; the prediction uncertainty of the CO2 plume position arising from the propagation of parametric uncertainty is then quantified in the numerical experiments and compared with the actual plume from the 'true' model. Results show that the approach is computationally efficient for multi-modal optimization and prediction uncertainty quantification for computationally expensive simulation models. Both our inverse methodology and findings are broadly applicable to GCS in heterogeneous storage formations.

  17. Initial Implementation of the Hanford Site-Wide Groundwater Flow and Uncertainty Analysis Framework

    NASA Astrophysics Data System (ADS)

    Cole, C. R.; Vermeul, V. R.; Freedman, V. L.; Bergeron, M. P.

    2002-05-01

    Since Hanford operations began in 1943, large volumes of wastewater have been discharged into the subsurface, creating groundwater mounds (> 20 m) and regional-scale contaminant plumes that will require monitoring at least through site closure. Since the cessation of wastewater disposal activities in 1988, many of the ~700 monitoring wells that previously documented mounding and contaminant movement are currently going dry. An initial implementation of the Hanford Site uncertainty methodology, presented in this paper and a companion poster, investigates which of the ~700 monitoring wells are likely to go dry between now and 2050. The long-term goals of the Pacific Northwest National Laboratory effort at Hanford include the development and implementation of an uncertainty methodology with the site-wide groundwater flow and transport model. Results are presented for two different conceptual models of the base of the unconfined aquifer. Model parameter uncertainty was determined through transient inverse modeling (1943-1996) using UCODE and ~76,000 historical observations of head. Since an analysis of model linearity using Beale's measure indicated that the model was sufficiently linear, the uncertainty in predicted future water levels was determined using linear confidence and prediction intervals. Both a steady-state and a transient case (1996-2050) were investigated in order to determine which of the current monitoring wells are likely to go dry. Results demonstrated that the uncertainty methodology can be used to evaluate the potential loss of existing monitoring wells in strategic locations, and to assist in the development of a long-term strategy for their replacement. Pacific Northwest National Laboratory is operated for the U.S. Department of Energy under Contract DE-AC06-76RL01830.
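
    The linear confidence and prediction intervals referred to above follow the standard regression formulas. The sketch below calibrates an assumed declining-head model to synthetic observations and propagates the parameter covariance to a 2050 water-level prediction; it is illustrative only and unrelated to the Hanford model.

      import numpy as np
      from scipy.stats import t as t_dist

      rng = np.random.default_rng(4)
      years = np.arange(1996, 2021)
      heads = 120.0 - 0.15 * (years - 1996) + rng.normal(0.0, 0.25, years.size)   # synthetic observations

      # Calibrate h(t) = a + b*(t - 1996) by least squares
      X = np.column_stack([np.ones(years.size), years - 1996])
      p, res, *_ = np.linalg.lstsq(X, heads, rcond=None)
      dof = years.size - X.shape[1]
      s2 = float(res[0]) / dof                                  # calibrated error variance

      # Prediction for 2050 with linear 95% confidence and prediction intervals
      z = np.array([1.0, 2050 - 1996])                          # sensitivity of the prediction to a, b
      h_pred = z @ p
      var_conf = s2 * z @ np.linalg.inv(X.T @ X) @ z            # parameter uncertainty only
      var_pred = var_conf + s2                                  # plus measurement/model noise
      tcrit = t_dist.ppf(0.975, dof)
      print(f"predicted head in 2050: {h_pred:.2f} m")
      print(f"95% confidence interval : +/- {tcrit * np.sqrt(var_conf):.2f} m")
      print(f"95% prediction interval : +/- {tcrit * np.sqrt(var_pred):.2f} m")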

  18. Probabilistic Parameter Uncertainty Analysis of Single Input Single Output Control Systems

    NASA Technical Reports Server (NTRS)

    Smith, Brett A.; Kenny, Sean P.; Crespo, Luis G.

    2005-01-01

    The current standards for handling uncertainty in control systems use interval bounds for the definition of the uncertain parameters. This approach gives no information about the likelihood of system performance, but simply gives the response bounds. When used in design, current methods of μ-analysis can lead to overly conservative controller designs. With these methods, worst-case conditions are weighted equally with the most likely conditions. This research explores a unique approach for probabilistic analysis of control systems. Current reliability methods are examined, showing the strong areas of each in handling probability. A hybrid method is developed using these reliability tools for efficiently propagating probabilistic uncertainty through classical control analysis problems. The method developed is applied to classical response analysis as well as analysis methods that explore the effects of the uncertain parameters on stability and performance metrics. The benefits of using this hybrid approach for calculating the mean, variance, and cumulative distribution functions of responses are shown. Results of the probabilistic analysis of a missile pitch control system and a non-collocated mass-spring system show the added information provided by this hybrid analysis.
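
    A minimal probabilistic (Monte Carlo) version of such a response analysis is sketched below for a second-order system with uncertain natural frequency and damping; the plant, the distributions, and the 25% overshoot requirement are all assumed for illustration and are not from the study.

      import numpy as np
      from scipy import signal

      rng = np.random.default_rng(0)
      n_runs = 500
      t = np.linspace(0.0, 10.0, 600)
      overshoot = np.empty(n_runs)

      for k in range(n_runs):
          wn = rng.normal(2.0, 0.2)          # natural frequency [rad/s], assumed distribution
          zeta = rng.normal(0.5, 0.05)       # damping ratio, assumed distribution
          sys = signal.TransferFunction([wn ** 2], [1.0, 2.0 * zeta * wn, wn ** 2])
          _, y = signal.step(sys, T=t)
          overshoot[k] = 100.0 * (y.max() - 1.0)     # percent overshoot of the unit step response

      print(f"overshoot: mean = {overshoot.mean():.1f}%, sd = {overshoot.std():.1f}%, "
            f"P(overshoot > 25%) = {(overshoot > 25.0).mean():.3f}")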

  19. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    PubMed Central

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
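
    The weight-sensitivity step can be illustrated by perturbing assumed AHP weights with Monte Carlo noise, renormalizing, and checking how often the susceptibility ranking of a handful of cells changes, as in the sketch below; the weights, their uncertainty, and the tiny 5-cell "raster" are all assumed.

      import numpy as np

      rng = np.random.default_rng(0)
      criteria = rng.random((5, 4))                     # 5 map cells x 4 standardized criteria (assumed)
      w0 = np.array([0.40, 0.30, 0.20, 0.10])           # assumed AHP weights

      base_rank = np.argsort(-criteria @ w0)            # baseline susceptibility ranking

      n_mc, changed = 2000, 0
      for _ in range(n_mc):
          w = np.clip(w0 + rng.normal(0.0, 0.05, 4), 1e-6, None)
          w /= w.sum()                                  # renormalize perturbed weights
          changed += not np.array_equal(np.argsort(-criteria @ w), base_rank)

      print(f"probability that the cell ranking changes: {changed / n_mc:.2f}")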

  20. Predictive uncertainty analysis of a saltwater intrusion model using null-space Monte Carlo

    USGS Publications Warehouse

    Herckenrath, Daan; Langevin, Christian D.; Doherty, John

    2011-01-01

    Because of the extensive computational burden and perhaps a lack of awareness of existing methods, rigorous uncertainty analyses are rarely conducted for variable-density flow and transport models. For this reason, a recently developed null-space Monte Carlo (NSMC) method for quantifying prediction uncertainty was tested for a synthetic saltwater intrusion model patterned after the Henry problem. Saltwater intrusion caused by a reduction in fresh groundwater discharge was simulated for 1000 randomly generated hydraulic conductivity distributions, representing a mildly heterogeneous aquifer. From these 1000 simulations, the hydraulic conductivity distribution giving rise to the most extreme case of saltwater intrusion was selected and was assumed to represent the "true" system. Head and salinity values from this true model were then extracted and used as observations for subsequent model calibration. Random noise was added to the observations to approximate realistic field conditions. The NSMC method was used to calculate 1000 calibration-constrained parameter fields. If the dimensionality of the solution space was set appropriately, the estimated uncertainty range from the NSMC analysis encompassed the truth. Several variants of the method were implemented to investigate their effect on the efficiency of the NSMC method. Reducing the dimensionality of the null-space for the processing of the random parameter sets did not result in any significant gains in efficiency and compromised the ability of the NSMC method to encompass the true prediction value. The addition of intrapilot point heterogeneity to the NSMC process was also tested. According to a variogram comparison, this provided the same scale of heterogeneity that was used to generate the truth. However, incorporation of intrapilot point variability did not make a noticeable difference to the uncertainty of the prediction. With this higher level of heterogeneity, however, the computational burden of
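
    The heart of the NSMC construction, projecting stochastic parameter fields onto the calibration null space so that the fit is (approximately) preserved, can be written in a few lines for a toy linear model; the Jacobian, dimensions, and truncation level below are assumed.

      import numpy as np

      rng = np.random.default_rng(0)
      n_obs, n_par, n_sol = 4, 10, 3                     # observations, parameters, solution-space dim
      J = rng.normal(size=(n_obs, n_par))                # sensitivity (Jacobian) matrix at calibration
      p_cal = rng.normal(size=n_par)                     # calibrated parameter field

      _, _, Vt = np.linalg.svd(J, full_matrices=True)
      V2 = Vt[n_sol:].T                                  # columns spanning the (approximate) null space

      # Project stochastic parameter fields so that their solution-space components come from the
      # calibrated field: the model fit is (approximately) preserved.
      p_rand = rng.normal(size=(1000, n_par))
      p_nsmc = p_cal + (p_rand - p_cal) @ V2 @ V2.T

      misfit_rand = np.linalg.norm(J @ (p_rand - p_cal).T, axis=0).mean()
      misfit_nsmc = np.linalg.norm(J @ (p_nsmc - p_cal).T, axis=0).mean()
      print(f"mean change in simulated obs: random fields = {misfit_rand:.2f}, NSMC fields = {misfit_nsmc:.2f}")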

  1. Measuring, using, and reducing experimental and computational uncertainty in reliability analysis of composite laminates

    NASA Astrophysics Data System (ADS)

    Smarslok, Benjamin P.

    The failure of the composite hydrogen tanks on the X-33 Reusable Launch Vehicle (RLV) from combined thermal and mechanical failure modes created a situation where the design weight was highly sensitive to uncertainties. Through previous research of sensitivity and reliability analysis on this problem, three areas of potential uncertainty reduction were recognized and became the focal points for this dissertation. The transverse elastic modulus and coefficient of thermal expansion were cited as being particularly sensitive input parameters with respect to weight. Measurement uncertainty analysis was performed on transverse modulus experiments, where the intermediate thickness measurements proved to be the greatest contributor to uncertainty. Data regarding correlations in the material properties of composite laminates is not always available, however the significance of correlated properties on probability of failure was detected. Therefore, a model was develo