Science.gov

Sample records for postcalibration uncertainty analysis

  1. Practical postcalibration uncertainty analysis: Yucca Mountain, Nevada.

    PubMed

    James, Scott C; Doherty, John E; Eddebbarh, Al-Aziz

    2009-01-01

    The values of parameters in a groundwater flow model govern the precision of predictions of future system behavior. Predictive precision, thus, typically depends on an ability to infer values of system properties from historical measurements through calibration. When such data are scarce, or when their information content with respect to parameters that are most relevant to predictions of interest is weak, predictive uncertainty may be high, even if the model is "calibrated." Recent advances help recognize this condition, quantitatively evaluate predictive uncertainty, and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observations will most effectively reduce this uncertainty. We demonstrate linear and nonlinear predictive error/uncertainty analyses as applied to a groundwater flow model of Yucca Mountain, Nevada, the United States' proposed site for disposal of high-level radioactive waste. Linear and nonlinear uncertainty analyses are readily implemented as an adjunct to model calibration with medium to high parameterization density. Linear analysis yields contributions made by each parameter to a prediction's uncertainty and the worth of different observations, both existing and yet-to-be-gathered, toward reducing this uncertainty. Nonlinear analysis provides more accurate characterization of the uncertainty of model predictions while yielding their (approximate) probability distribution functions. This article applies the above methods to a prediction of specific discharge and confirms the uncertainty bounds on specific discharge supplied in the Yucca Mountain Project License Application. PMID:19744249

  2. Practical post-calibration uncertainty analysis: Yucca Mountain, Nevada, USA

    NASA Astrophysics Data System (ADS)

    James, S. C.; Doherty, J.; Eddebbarh, A.

    2009-12-01

    The values of parameters in a groundwater flow model govern the precision of predictions of future system behavior. Predictive precision, thus, typically depends on an ability to infer values of system properties from historical measurements through calibration. When such data are scarce, or when their information content with respect to parameters that are most relevant to predictions of interest is weak, predictive uncertainty may be high, even if the model is “calibrated.” Recent advances help recognize this condition, quantitatively evaluate predictive uncertainty, and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observations will most effectively reduce this uncertainty. We demonstrate linear and nonlinear predictive error/uncertainty analyses as applied to a groundwater flow model of Yucca Mountain, Nevada, the US’s proposed site for disposal of high-level radioactive waste. Both of these types of uncertainty analysis are readily implemented as an adjunct to model calibration with medium to high parameterization density. Linear analysis yields contributions made by each parameter to a prediction’s uncertainty and the worth of different observations, both existing and yet-to-be-gathered, toward reducing this uncertainty. Nonlinear analysis provides more accurate characterization of the uncertainty of model predictions while yielding their (approximate) probability distribution functions. This paper applies the above methods to a prediction of specific discharge and confirms the uncertainty bounds on specific discharge supplied in the Yucca Mountain Project License Application. Furthermore, Monte Carlo simulations confirm that hydrogeologic units thought to be flow barriers have probability distributions skewed toward lower permeabilities.
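
    The snippet below is a minimal numerical sketch of the linear (first-order) post-calibration predictive uncertainty calculation described in this abstract. All matrices are synthetic placeholders, not the Yucca Mountain model; the Jacobian, prior covariance, and prediction sensitivities are arbitrary example values, and the per-parameter "contribution" follows one common convention (freezing a parameter's prior uncertainty and comparing results).

```python
import numpy as np

# Minimal sketch of linear post-calibration predictive uncertainty (FOSM).
# All matrices are synthetic placeholders, not the Yucca Mountain model.
rng = np.random.default_rng(0)

n_par, n_obs = 8, 20
J = rng.normal(size=(n_obs, n_par))       # observation sensitivities (Jacobian)
y = rng.normal(size=n_par)                # prediction sensitivities to parameters
C_p = np.diag(np.full(n_par, 1.0))        # prior parameter covariance
C_e = np.diag(np.full(n_obs, 0.1))        # measurement-noise covariance

# Prior (pre-calibration) predictive variance: y^T C_p y
var_prior = y @ C_p @ y

# Post-calibration predictive variance (Bayesian / Schur-complement update):
# var_post = y^T [C_p - C_p J^T (J C_p J^T + C_e)^{-1} J C_p] y
S = J @ C_p @ J.T + C_e
var_post = var_prior - y @ C_p @ J.T @ np.linalg.solve(S, J @ C_p @ y)

print(f"prior predictive std : {np.sqrt(var_prior):.3f}")
print(f"post-calibration std : {np.sqrt(var_post):.3f}")

# Per-parameter contribution (one common convention: recompute the result with
# that parameter's prior uncertainty removed and compare to the full result).
for i in range(n_par):
    C_red = C_p.copy()
    C_red[i, i] = 1e-12                   # "freeze" parameter i
    S_red = J @ C_red @ J.T + C_e
    v = y @ C_red @ y - y @ C_red @ J.T @ np.linalg.solve(S_red, J @ C_red @ y)
    print(f"parameter {i}: uncertainty contribution ~ {var_post - v:.3e}")
```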

  3. Uncertainty analysis

    SciTech Connect

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
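
    As a toy illustration of the claim above that, for a linear model Ax = b, a complete adjoint sensitivity analysis requires no separate adjoint formulation beyond a single extra solve: the sensitivity of a scalar output c^T x to the entries of b (and of A) follows from one adjoint solve A^T λ = c. The matrices below are arbitrary placeholders, not the Cork and Bottle problem.

```python
import numpy as np

# Toy illustration: adjoint sensitivities for a linear model A x = b.
# Output of interest: g = c^T x. One adjoint solve gives all sensitivities.
rng = np.random.default_rng(1)
n = 5
A = rng.normal(size=(n, n)) + n * np.eye(n)   # well-conditioned placeholder
b = rng.normal(size=n)
c = rng.normal(size=n)

x = np.linalg.solve(A, b)                     # primal solve
lam = np.linalg.solve(A.T, c)                 # adjoint solve: A^T lam = c

# dg/db_i = lam_i  and  dg/dA_ij = -lam_i * x_j
dg_db = lam
dg_dA = -np.outer(lam, x)

# Check one entry against a finite difference.
eps = 1e-6
b2 = b.copy(); b2[2] += eps
fd = (c @ np.linalg.solve(A, b2) - c @ x) / eps
print(f"adjoint dg/db_2 = {dg_db[2]:.6f}, finite difference = {fd:.6f}")
```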

  4. MOUSE UNCERTAINTY ANALYSIS SYSTEM

    EPA Science Inventory

    The original MOUSE (Modular Oriented Uncertainty System) system was designed to deal with the problem of uncertainties in environmental engineering calculations, such as a set of engineering cost or risk analysis equations. It was especially intended for use by individuals with li...

  5. Antarctic Photochemistry: Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Stewart, Richard W.; McConnell, Joseph R.

    1999-01-01

    Understanding the photochemistry of the Antarctic region is important for several reasons. Analysis of ice cores provides historical information on several species such as hydrogen peroxide and sulfur-bearing compounds. The former can potentially provide information on the history of oxidants in the troposphere and the latter may shed light on DMS-climate relationships. Extracting such information requires that we be able to model the photochemistry of the Antarctic troposphere and relate atmospheric concentrations to deposition rates and sequestration in the polar ice. This paper deals with one aspect of the uncertainty inherent in photochemical models of the high latitude troposphere: that arising from imprecision in the kinetic data used in the calculations. Such uncertainties in Antarctic models tend to be larger than those in models of mid to low latitude clean air. One reason is the lower temperatures which result in increased imprecision in kinetic data, assumed to be best characterized at 298K. Another is the inclusion of a DMS oxidation scheme in the present model. Many of the rates in this scheme are less precisely known than are rates in the standard chemistry used in many stratospheric and tropospheric models.

  6. Analysis of Infiltration Uncertainty

    SciTech Connect

    R. McCurley

    2003-10-27

    The primary objectives of this uncertainty analysis are: (1) to develop and justify a set of uncertain parameters along with associated distributions; and (2) to use the developed uncertain parameter distributions and the results from selected analog site calculations done in ''Simulation of Net Infiltration for Modern and Potential Future Climates'' (USGS 2001 [160355]) to obtain the net infiltration weighting factors for the glacial transition climate. These weighting factors are applied to unsaturated zone (UZ) flow fields in Total System Performance Assessment (TSPA), as outlined in the ''Total System Performance Assessment-License Application Methods and Approach'' (BSC 2002 [160146], Section 3.1) as a method for the treatment of uncertainty. This report is a scientific analysis because no new mathematical or physical models are developed herein, and it is based on the use of the models developed in or for ''Simulation of Net Infiltration for Modern and Potential Future Climates'' (USGS 2001 [160355]). Any use of the term model refers to those developed in the infiltration numerical model report. TSPA License Application (LA) has included three distinct climate regimes in the comprehensive repository performance analysis for Yucca Mountain: present-day, monsoon, and glacial transition. Each climate regime was characterized using three infiltration-rate maps, including a lower- and upper-bound and a mean value (equal to the average of the two boundary values). For each of these maps, which were obtained based on analog site climate data, a spatially averaged value was also calculated by the USGS. For a more detailed discussion of these infiltration-rate maps, see ''Simulation of Net Infiltration for Modern and Potential Future Climates'' (USGS 2001 [160355]). For this Scientific Analysis Report, spatially averaged values were calculated for the lower-bound, mean, and upper-bound climate analogs only for the glacial transition climate regime, within the

  7. Critical evaluation of parameter consistency and predictive uncertainty in hydrological modeling: A case study using Bayesian total error analysis

    NASA Astrophysics Data System (ADS)

    Thyer, Mark; Renard, Benjamin; Kavetski, Dmitri; Kuczera, George; Franks, Stewart William; Srikanthan, Sri

    2009-12-01

    The lack of a robust framework for quantifying the parametric and predictive uncertainty of conceptual rainfall-runoff (CRR) models remains a key challenge in hydrology. The Bayesian total error analysis (BATEA) methodology provides a comprehensive framework to hypothesize, infer, and evaluate probability models describing input, output, and model structural error. This paper assesses the ability of BATEA and standard calibration approaches (standard least squares (SLS) and weighted least squares (WLS)) to address two key requirements of uncertainty assessment: (1) reliable quantification of predictive uncertainty and (2) reliable estimation of parameter uncertainty. The case study presents a challenging calibration of the lumped GR4J model to a catchment with ephemeral responses and large rainfall gradients. Postcalibration diagnostics, including checks of predictive distributions using quantile-quantile analysis, suggest that while still far from perfect, BATEA satisfied its assumed probability models better than SLS and WLS. In addition, WLS/SLS parameter estimates were highly dependent on the selected rain gauge and calibration period. This will obscure potential relationships between CRR parameters and catchment attributes and prevent the development of meaningful regional relationships. Conversely, BATEA provided consistent, albeit more uncertain, parameter estimates and thus overcomes one of the obstacles to parameter regionalization. However, significant departures from the calibration assumptions remained even in BATEA, e.g., systematic overestimation of predictive uncertainty, especially in validation. This is likely due to the inferred rainfall errors compensating for simplified treatment of model structural error.
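
    A schematic of the quantile-quantile check of predictive distributions mentioned above (not the BATEA code itself): for each observation, compute where it falls within its predictive ensemble (a probability integral transform); if the predictive distributions are reliable, these values should follow a uniform distribution. The ensemble below is synthetic and deliberately mis-specified to show what the diagnostic detects.

```python
import numpy as np

# Schematic quantile-quantile check of predictive distributions (not BATEA).
# For each observation, find its quantile within the predictive ensemble (PIT);
# if the predictive distributions are reliable, these values are ~Uniform(0, 1).
rng = np.random.default_rng(2)

n_obs, n_ens = 200, 500
truth = rng.normal(size=n_obs)
# Hypothetical predictive ensembles; spread deliberately 1.5x the true obs noise.
ensembles = truth[:, None] + 1.5 * rng.normal(size=(n_obs, n_ens))
observations = truth + rng.normal(size=n_obs)

pit = (ensembles < observations[:, None]).mean(axis=1)   # empirical quantiles
theoretical = (np.arange(1, n_obs + 1) - 0.5) / n_obs

# A reliable model gives sorted PIT values close to the 1:1 line; systematic
# bowing indicates over- or under-estimated predictive uncertainty.
max_dev = np.max(np.abs(np.sort(pit) - theoretical))
print(f"max deviation from uniform quantiles: {max_dev:.3f}")
```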

  8. Uncertainties in offsite consequence analysis

    SciTech Connect

    Young, M.L.; Harper, F.T.; Lui, C.H.

    1996-03-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the U.S. Nuclear Regulatory Commission and the European Commission began co-sponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables using a formal expert judgment elicitation and evaluation process. This paper focuses on the methods used in and results of this on-going joint effort.

  9. Approaches to highly parameterized inversion: A guide to using PEST for model-parameter and predictive-uncertainty analysis

    USGS Publications Warehouse

    Doherty, John E.; Hunt, Randall J.; Tonkin, Matthew J.

    2010-01-01

    Analysis of the uncertainty associated with parameters used by a numerical model, and with predictions that depend on those parameters, is fundamental to the use of modeling in support of decisionmaking. Unfortunately, predictive uncertainty analysis with regard to models can be very computationally demanding, due in part to complex constraints on parameters that arise from expert knowledge of system properties on the one hand (knowledge constraints) and from the necessity for the model parameters to assume values that allow the model to reproduce historical system behavior on the other hand (calibration constraints). Enforcement of knowledge and calibration constraints on parameters used by a model does not eliminate the uncertainty in those parameters. In fact, in many cases, enforcement of calibration constraints simply reduces the uncertainties associated with a number of broad-scale combinations of model parameters that collectively describe spatially averaged system properties. The uncertainties associated with other combinations of parameters, especially those that pertain to small-scale parameter heterogeneity, may not be reduced through the calibration process. To the extent that a prediction depends on system-property detail, its postcalibration variability may be reduced very little, if at all, by applying calibration constraints; knowledge constraints remain the only limits on the variability of predictions that depend on such detail. Regrettably, in many common modeling applications, these constraints are weak. Though the PEST software suite was initially developed as a tool for model calibration, recent developments have focused on the evaluation of model-parameter and predictive uncertainty. As a complement to functionality that it provides for highly parameterized inversion (calibration) by means of formal mathematical regularization techniques, the PEST suite provides utilities for linear and nonlinear error-variance and uncertainty analysis in
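
    The sketch below illustrates the linear "data worth" idea described above: re-evaluate the post-calibration predictive variance with and without a candidate observation and report the difference. It uses synthetic matrices as stand-ins for the Jacobian and covariance inputs; it is not the PEST utilities themselves.

```python
import numpy as np

# Sketch of linear "data worth" analysis: how much does each (existing or
# yet-to-be-gathered) observation reduce a prediction's variance?
# Synthetic matrices stand in for the Jacobian and covariance inputs.
rng = np.random.default_rng(3)
n_par, n_obs = 10, 15
J = rng.normal(size=(n_obs, n_par))      # observation sensitivities
y = rng.normal(size=n_par)               # prediction sensitivities
C_p = np.eye(n_par)                      # prior parameter covariance
var_eps = 0.1                            # observation error variance

def post_var(rows):
    """Post-calibration predictive variance using only the given obs rows."""
    if len(rows) == 0:
        return y @ C_p @ y
    Jr = J[rows]
    S = Jr @ C_p @ Jr.T + var_eps * np.eye(len(rows))
    return y @ C_p @ y - y @ C_p @ Jr.T @ np.linalg.solve(S, Jr @ C_p @ y)

all_rows = list(range(n_obs))
base = post_var(all_rows)
for i in all_rows:
    without = post_var([r for r in all_rows if r != i])
    print(f"obs {i:2d}: variance increase if dropped = {without - base:.4f}")
```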

  10. Uncertainty Analysis of Model Coupling

    NASA Astrophysics Data System (ADS)

    Held, H.; Knopf, B.; Schneider von Deimling, T.; Schellnhuber, H.-J.

    The Earth System is a highly complex system that is often modelled by coupling several nonlinear submodules. For predicting the climate with these models, the following uncertainties play an essential role: parameter uncertainty, uncertainty in initial conditions or model uncertainty. Here we will address uncertainty in initial conditions as well as model uncertainty. As the process of coupling is an important part of modelling, the main aspect of this work is the investigation of uncertainties that are due to the coupling process. For this study we use conceptual models that, compared to GCMs, have the advantage that the model itself as well as the output can be treated in a mathematically elaborated way. As the time for running the model is much shorter, the investigation is also possible for a longer period, e.g. for paleo runs. In consideration of these facts it is feasible to analyse the whole phase space of the model. The process of coupling is investigated by using different methods of examining low order coupled atmosphere-ocean systems. In the dynamical approach a fully coupled system of the two submodules can be compared to a system where one submodule forces the other. For a particular atmosphere-ocean system, based on the Lorenz model for the atmosphere, significant differences in the predictability of a forced system can be shown, depending on whether the subsystems are coupled in a linear or a nonlinear way. In [1] it is shown that in the linear case the forcing cannot represent the coupling, but in the nonlinear case, which we investigated in our study, the variability and the statistics of the coupled system can be reproduced by the forcing. Another approach to analyse the coupling is to carry out a bifurcation analysis. Here the bifurcation diagram of a single atmosphere system is compared to that of a coupled atmosphere-ocean system. Again it can be seen from the different behaviour of the coupled and the uncoupled system, that the

  11. ENHANCED UNCERTAINTY ANALYSIS FOR SRS COMPOSITE ANALYSIS

    SciTech Connect

    Smith, F.; Phifer, M.

    2011-06-30

    The Composite Analysis (CA) performed for the Savannah River Site (SRS) in 2009 (SRS CA 2009) included a simplified uncertainty analysis. The uncertainty analysis in the CA (Smith et al. 2009b) was limited to considering at most five sources in a separate uncertainty calculation performed for each POA. To perform the uncertainty calculations in a reasonable amount of time, the analysis was limited to 400 realizations and 2,000 years of simulated transport time, and the time steps used for the uncertainty analysis were increased from those used in the CA base case analysis. As part of the CA maintenance plan, the Savannah River National Laboratory (SRNL) committed to improving the CA uncertainty/sensitivity analysis. The previous uncertainty analysis was constrained by the standard GoldSim licensing which limits the user to running at most four Monte Carlo uncertainty calculations (also called realizations) simultaneously. Some of the limitations on the number of realizations that could be practically run and the simulation time steps were removed by building a cluster of three HP ProLiant Windows servers with a total of 36 64-bit processors and by licensing the GoldSim DP-Plus distributed processing software. This allowed running as many as 35 realizations simultaneously (one processor is reserved as a master process that controls running the realizations). These enhancements to SRNL computing capabilities made it feasible to perform uncertainty analysis using 1,000 realizations, the time steps employed in the base case CA calculations, more sources, and 10,000 years of simulated radionuclide transport. In addition, an importance screening analysis was performed to identify the class of stochastic variables that have the most significant impact on model uncertainty. This analysis ran the uncertainty model separately, testing the response to variations in the following five sets of model parameters: (a) Kd values (72 parameters for the 36 CA elements in

  12. Risk Analysis and Uncertainty: Implications for Counselling

    ERIC Educational Resources Information Center

    Hassenzahl, David

    2004-01-01

    Over the past two decades, the risk analysis community has made substantial advances in understanding and describing uncertainty. Uncertainty is ubiquitous, complex, both quantitative and qualitative in nature, and often irreducible. Uncertainty thus creates a challenge when using risk analysis to evaluate the rationality of group and individual…

  13. Sensitivity and Uncertainty Analysis Shell

    Energy Science and Technology Software Center (ESTSC)

    1999-04-20

    SUNS (Sensitivity and Uncertainty Analysis Shell) is a 32-bit application that runs under Windows 95/98 and Windows NT. It is designed to aid in statistical analyses for a broad range of applications. The class of problems for which SUNS is suitable is generally defined by two requirements: 1. A computer code is developed or acquired that models some processes for which input is uncertain and the user is interested in statistical analysis of the output of that code. 2. The statistical analysis of interest can be accomplished using the Monte Carlo analysis. The implementation then requires that the user identify which input to the process model is to be manipulated for statistical analysis. With this information, the changes required to loosely couple SUNS with the process model can be completed. SUNS is then used to generate the required statistical sample and the user-supplied process model analyses the sample. The SUNS post processor displays statistical results from any existing file that contains sampled input and output values.
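
    SUNS itself is a Windows GUI tool; the sketch below only illustrates the loose-coupling workflow the description outlines (generate a statistical sample of the uncertain inputs, run a user-supplied process model on each sample, then post-process inputs and outputs). The process model and distributions are made up for demonstration.

```python
import numpy as np

# Illustration of the loose-coupling workflow described above (not SUNS itself):
# 1) generate a statistical sample of the uncertain inputs,
# 2) run the user's process model on each sample,
# 3) post-process the sampled inputs and outputs.
rng = np.random.default_rng(4)

def process_model(x1, x2):
    """Stand-in for the user-supplied engineering code."""
    return x1 ** 2 + 3.0 * x2

n = 1000
samples = np.column_stack([
    rng.normal(10.0, 1.0, n),     # uncertain input 1
    rng.uniform(0.0, 5.0, n),     # uncertain input 2
])
outputs = np.array([process_model(*row) for row in samples])

print(f"mean = {outputs.mean():.2f}, std = {outputs.std(ddof=1):.2f}")
print("5th/95th percentiles:", np.percentile(outputs, [5, 95]).round(2))
```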

  14. Uncertainty analysis for Ulysses safety evaluation report

    NASA Technical Reports Server (NTRS)

    Frank, Michael V.

    1991-01-01

    As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermal Generator, the Interagency Nuclear Safety Review Panel (INSRP) performed an integrated, quantitative analysis of the uncertainties of the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty of the key parameters of the risk analysis. The results show that INSRP had high confidence that risk of fatal cancers from potential plutonium release associated with calculated launch and deployment accident scenarios is low.

  15. Uncertainty quantification and error analysis

    SciTech Connect

    Higdon, Dave M; Anderson, Mark C; Habib, Salman; Klein, Richard; Berliner, Mark; Covey, Curt; Ghattas, Omar; Graziani, Carlo; Seager, Mark; Sefcik, Joseph; Stark, Philip

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  16. Uncertainty Analysis of Instrument Calibration and Application

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimation of both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
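
    A minimal sketch of the idea that calibration uncertainty becomes a function of the measured value, restricted to the linear univariate case (not the paper's full multivariate treatment with correlated errors): fit a linear calibration by least squares, keep the coefficient covariance, and evaluate the curve's prediction standard deviation at each point. The calibration data below are synthetic.

```python
import numpy as np

# Minimal sketch: instrument calibration uncertainty expressed as a function
# of the measurement (linear univariate case only; synthetic data).
rng = np.random.default_rng(5)

# Calibration data: known standards vs. instrument readings with noise.
standards = np.linspace(0.0, 10.0, 12)
readings = 2.0 * standards + 1.0 + rng.normal(0.0, 0.05, standards.size)

# Least-squares fit of reading = a*standard + b, with coefficient covariance.
X = np.column_stack([standards, np.ones_like(standards)])
coef, res, *_ = np.linalg.lstsq(X, readings, rcond=None)
dof = standards.size - 2
sigma2 = res[0] / dof
cov = sigma2 * np.linalg.inv(X.T @ X)

# Calibration-curve uncertainty at a given point: sqrt(x^T C x).
for s in (0.0, 5.0, 10.0):
    x = np.array([s, 1.0])
    print(f"standard = {s:4.1f}: curve std = {np.sqrt(x @ cov @ x):.4f}")
```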

  17. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to get an accurate result. So convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
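
    A generic sketch of the convergence-testing idea described above (the dose-versus-depth curve below is made up and stands in for the radiation transport code): evaluate an interpolated dose at increasing numbers of shield-thickness points until successive estimates stop changing appreciably.

```python
import numpy as np

# Generic interpolation-convergence test (hypothetical dose vs. depth curve).
def dose_vs_depth(t):
    """Hypothetical dose attenuation with shield thickness t [g/cm^2]."""
    return 50.0 * np.exp(-0.15 * t) + 5.0

target_thickness = 7.3
reference = dose_vs_depth(target_thickness)

prev = None
for n_points in (3, 5, 9, 17, 33, 65):
    grid = np.linspace(0.0, 20.0, n_points)        # thicknesses actually "run"
    estimate = np.interp(target_thickness, grid, dose_vs_depth(grid))
    change = None if prev is None else abs(estimate - prev)
    err = abs(estimate - reference)
    print(f"{n_points:3d} points: dose = {estimate:7.4f}, "
          f"error = {err:.2e}, change vs. previous = {change}")
    prev = estimate
```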

  18. Uncertainty Analysis for Photovoltaic Degradation Rates (Poster)

    SciTech Connect

    Jordan, D.; Kurtz, S.; Hansen, C.

    2014-04-01

    Dependable and predictable energy production is the key to the long-term success of the PV industry. Over the lifetime of their exposure, PV systems show a gradual decline that depends on many different factors such as module technology, module type, mounting configuration, climate, etc. When degradation rates are determined from continuous data, the statistical uncertainty is easily calculated from the regression coefficients. However, total uncertainty that includes measurement uncertainty and instrumentation drift is far more difficult to determine. A Monte Carlo simulation approach was chosen to investigate a comprehensive uncertainty analysis. The most important factor for degradation rates is to avoid instrumentation that changes over time in the field. For instance, a drifting irradiance sensor, which can be avoided through regular calibration, can lead to substantially erroneous degradation rates. However, the accuracy of the irradiance sensor has negligible impact on degradation rate uncertainty, emphasizing that precision (relative accuracy) is more important than absolute accuracy.
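
    A small Monte Carlo sketch of the drift effect described above: generate a performance time series with a known degradation rate, superimpose a random sensor drift, fit a linear trend, and look at the spread of recovered rates. All numbers are illustrative and not taken from the poster.

```python
import numpy as np

# Illustrative Monte Carlo: effect of irradiance-sensor drift on a fitted
# PV degradation rate (all numbers are made up for demonstration).
rng = np.random.default_rng(6)

years = np.arange(0, 10, 1.0 / 12.0)          # monthly data for 10 years
true_rate = -0.5                              # %/year true degradation

fitted = []
for _ in range(2000):
    drift = rng.normal(0.0, 0.3)              # sensor drift, %/year (random)
    noise = rng.normal(0.0, 1.0, years.size)  # seasonal/measurement scatter, %
    perf = 100.0 + (true_rate + drift) * years + noise
    slope = np.polyfit(years, perf, 1)[0]     # fitted degradation rate
    fitted.append(slope)

fitted = np.array(fitted)
print(f"true rate     : {true_rate:.2f} %/yr")
print(f"fitted median : {np.median(fitted):.2f} %/yr")
print(f"fitted 95% CI : [{np.percentile(fitted, 2.5):.2f}, "
      f"{np.percentile(fitted, 97.5):.2f}] %/yr")
```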

  19. Robustness analysis for real parametric uncertainty

    NASA Technical Reports Server (NTRS)

    Sideris, Athanasios

    1989-01-01

    Some key results in the literature in the area of robustness analysis for linear feedback systems with structured model uncertainty are reviewed. Some new results are given. Model uncertainty is described as a combination of real uncertain parameters and norm bounded unmodeled dynamics. Here the focus is on the case of parametric uncertainty. An elementary and unified derivation of the celebrated theorem of Kharitonov and the Edge Theorem is presented. Next, an algorithmic approach for robustness analysis in the cases of multilinear and polynomic parametric uncertainty (i.e., the closed loop characteristic polynomial depends multilinearly and polynomially respectively on the parameters) is given. The latter cases are most important from practical considerations. Some novel modifications of this algorithm, which result in a procedure with polynomial time behavior in the number of uncertain parameters, are outlined. Finally, it is shown how the more general problem of robustness analysis for combined parametric and dynamic (i.e., unmodeled dynamics) uncertainty can be reduced to the case of polynomic parametric uncertainty, and thus be solved by means of the algorithm.
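
    A compact illustration of Kharitonov's theorem mentioned above: for an interval polynomial with independent coefficient bounds, robust Hurwitz stability can be checked from just four fixed "corner" polynomials. The coefficient intervals below are arbitrary example values, not from the report.

```python
import numpy as np

# Kharitonov's theorem (illustrative): an interval polynomial
#   p(s) = a0 + a1 s + ... + an s^n,  a_i in [lo_i, hi_i],
# is robustly Hurwitz iff four fixed "corner" polynomials are Hurwitz.
lo = np.array([1.0, 3.0, 4.0, 1.0])   # a0..a3 lower bounds (example values)
hi = np.array([2.0, 4.0, 5.0, 1.5])   # a0..a3 upper bounds (example values)

def kharitonov_polys(lo, hi):
    # Coefficient patterns in ascending powers, period 4 (0 = lower, 1 = upper).
    patterns = [(0, 0, 1, 1), (1, 1, 0, 0), (0, 1, 1, 0), (1, 0, 0, 1)]
    polys = []
    for pat in patterns:
        coeffs = [hi[i] if pat[i % 4] else lo[i] for i in range(len(lo))]
        polys.append(np.array(coeffs))           # ascending powers a0..an
    return polys

def is_hurwitz(ascending_coeffs):
    roots = np.roots(ascending_coeffs[::-1])     # np.roots expects descending
    return bool(np.all(roots.real < 0))

stable = all(is_hurwitz(k) for k in kharitonov_polys(lo, hi))
print("robustly stable over the whole coefficient box:", stable)
```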

  20. Uncertainty analysis for Ulysses safety evaluation report

    SciTech Connect

    Frank, M.V.

    1991-01-01

    As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermal Generator (GPHS-RTG), the Interagency Nuclear Safety Review Panel (INSRP) and the author performed an integrated, quantitative analysis of the uncertainties of the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty of the key parameters of the risk analysis. The results show that INSRP had high confidence that risk of fatal cancers from potential plutonium release associated with calculated launch and deployment accident scenarios is low.

  1. Spatial uncertainty analysis of population models

    SciTech Connect

    Jager, Yetta; King, Anthony Wayne; Schumaker, Nathan; Ashwood, Tom L; Jackson, Barbara L

    2004-01-01

    This paper describes an approach for conducting spatial uncertainty analysis of spatial population models, and illustrates the ecological consequences of spatial uncertainty for landscapes with different properties. Spatial population models typically simulate birth, death, and migration on an input map that describes habitat. Typically, only a single reference map is available, but we can imagine that a collection of other, slightly different, maps could be drawn to represent a particular species' habitat. As a first approximation, our approach assumes that spatial uncertainty (i.e., the variation among values assigned to a location by such a collection of maps) is constrained by characteristics of the reference map, regardless of how the map was produced. Our approach produces lower levels of uncertainty than alternative methods used in landscape ecology because we condition our alternative landscapes on local properties of the reference map. Simulated spatial uncertainty was higher near the borders of patches. Consequently, average uncertainty was highest for reference maps with equal proportions of suitable and unsuitable habitat, and no spatial autocorrelation. We used two population viability models to evaluate the ecological consequences of spatial uncertainty for landscapes with different properties. Spatial uncertainty produced larger variation among predictions of a spatially explicit model than those of a spatially implicit model. Spatially explicit model predictions of final female population size varied most among landscapes with enough clustered habitat to allow persistence. In contrast, predictions of population growth rate varied most among landscapes with only enough clustered habitat to support a small population, i.e., near a spatially mediated extinction threshold. We conclude that spatial uncertainty has the greatest effect on persistence when the amount and arrangement of suitable habitat are such that habitat capacity is near the minimum

  2. Uncertainty Analysis of Composite Structures

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Starnes, James H., Jr.; Peters, Jeanne M.

    2000-01-01

    A two-phase approach and a computational procedure are presented for predicting the variability in the nonlinear response of composite structures associated with variations in the geometric and material parameters of the structure. In the first phase, hierarchical sensitivity analysis is used to identify the major parameters, which have the most effect on the response quantities of interest. In the second phase, the major parameters are taken to be fuzzy parameters, and a fuzzy set analysis is used to determine the range of variation of the response, associated with preselected variations in the major parameters. The effectiveness of the procedure is demonstrated by means of a numerical example of a cylindrical panel with four T-shaped stiffeners and a circular cutout.

  3. MOUSE - A COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM

    EPA Science Inventory

    Environmental engineering calculations involving uncertainties, either in the model itself or in the data, are far beyond the capabilities of conventional analysis for any but the simplest of models. There exist a number of general-purpose computer simulation languages, using Mont...

  4. Uncertainty Propagation in Hypersonic Vehicle Aerothermoelastic Analysis

    NASA Astrophysics Data System (ADS)

    Lamorte, Nicolas Etienne

    Hypersonic vehicles face a challenging flight environment. The aerothermoelastic analysis of their components requires numerous simplifying approximations. Identifying and quantifying the effect of uncertainties pushes the limits of the existing deterministic models, and is pursued in this work. An uncertainty quantification framework is used to propagate the effects of identified uncertainties on the stability margins and performance of the different systems considered. First, the aeroelastic stability of a typical section representative of a control surface on a hypersonic vehicle is examined. Variability in the uncoupled natural frequencies of the system is modeled to mimic the effect of aerodynamic heating. Next, the stability of an aerodynamically heated panel representing a component of the skin of a generic hypersonic vehicle is considered. Uncertainty in the location of transition from laminar to turbulent flow and the heat flux prediction is quantified using CFD. In both cases significant reductions of the stability margins are observed. A loosely coupled airframe-integrated scramjet engine is considered next. The elongated body and cowl of the engine flow path are subject to harsh aerothermodynamic loading which causes it to deform. Uncertainty associated with deformation prediction is propagated to the engine performance analysis. The cowl deformation is the main contributor to the sensitivity of the propulsion system performance. Finally, a framework for aerothermoelastic stability boundary calculation for hypersonic vehicles using CFD is developed. The usage of CFD enables one to consider different turbulence conditions, laminar or turbulent, and different models of the air mixture, in particular a real gas model which accounts for dissociation of molecules at high temperature. The system is found to be sensitive to turbulence modeling as well as the location of the transition from laminar to turbulent flow. Real gas effects play a minor role in the

  5. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    SciTech Connect

    Haihua Zhao; Vincent A. Mousseau

    2011-09-01

    Verification and validation (V&V) are playing more important roles in quantifying uncertainties and realizing high-fidelity simulations in engineering system analyses, such as transients occurring in a complex nuclear reactor system. Traditional V&V in reactor system analysis has focused more on the validation part or has not differentiated verification from validation. The traditional approach to uncertainty quantification is based on a 'black box' approach. The simulation tool is treated as an unknown signal generator: a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. The 'black box' method mixes numerical errors with all other uncertainties. It is also not efficient to perform sensitivity analysis. Contrary to the 'black box' method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In these types of approaches, equations for the propagation of uncertainty are constructed and the sensitivities are directly solved for as variables in the simulation. This paper presents the forward sensitivity analysis as a method to help uncertainty quantification. By including time step and potentially spatial step as special sensitivity parameters, the forward sensitivity method is extended as one method to quantify numerical errors. Note that by integrating local truncation errors over the whole system through the forward sensitivity analysis process, the generated time step and spatial step sensitivity information reflect global numerical errors. The discretization errors can be systematically compared against uncertainties due to other physical parameters. This extension makes the forward sensitivity method a much more powerful tool to help uncertainty quantification. By knowing the relative sensitivity of time and space steps compared with other physical parameters of interest, the simulation is allowed

  6. Uncertainty Analysis for a Jet Flap Airfoil

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Cruz, Josue

    2006-01-01

    An analysis of variance (ANOVA) study was performed to quantify the potential uncertainties of lift and pitching moment coefficient calculations from a computational fluid dynamics code, relative to an experiment, for a jet flap airfoil configuration. Uncertainties due to a number of factors including grid density, angle of attack, and jet flap blowing coefficient were examined. The ANOVA software produced a numerical model of the input coefficient data, as functions of the selected factors, to a user-specified order (linear, 2-factor interaction, quadratic, or cubic). Residuals between the model and actual data were also produced at each of the input conditions, and uncertainty confidence intervals (in the form of Least Significant Differences or LSD) for experimental, computational, and combined experimental/computational data sets were computed. The LSD bars indicate the smallest resolvable differences in the functional values (lift or pitching moment coefficient) attributable solely to changes in the independent variable, given just the input data points from selected data sets. The software also provided a collection of diagnostics which evaluate the suitability of the input data set for use within the ANOVA process, and which examine the behavior of the resultant data, possibly suggesting transformations which should be applied to the data to reduce the LSD. The results illustrate some of the key features of, and results from, the uncertainty analysis studies, including the use of both numerical (continuous) and categorical (discrete) factors, the effects of the number and range of the input data points, and the effects of the number of factors considered simultaneously.

  7. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    SciTech Connect

    Haihua Zhao; Vincent A. Mousseau

    2008-09-01

    This report presents the forward sensitivity analysis method as a means for quantification of uncertainty in system analysis. The traditional approach to uncertainty quantification is based on a “black box” approach. The simulation tool is treated as an unknown signal generator: a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. This approach requires a large number of simulation runs and therefore has a high computational cost. Contrary to the “black box” method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In this approach, equations for the propagation of uncertainty are constructed and the sensitivity is solved for as variables in the same simulation. This “glass box” method can generate similar sensitivity information as the above “black box” approach with only a couple of runs to cover a large uncertainty region. Because only a small number of runs is required, those runs can be done with high accuracy in space and time, ensuring that the uncertainty of the physical model is being measured and not simply the numerical error caused by the coarse discretization. In the forward sensitivity method, the model is differentiated with respect to each parameter to yield an additional system of the same size as the original one, the result of which is the solution sensitivity. The sensitivity of any output variable can then be directly obtained from these sensitivities by applying the chain rule of differentiation. We extend the forward sensitivity method to include time and spatial steps as special parameters so that the numerical errors can be quantified against other physical parameters. This extension makes the forward sensitivity method a much more powerful tool to help uncertainty analysis. By knowing the relative sensitivity of time and space steps with other
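
    A minimal concrete example of the forward ("glass box") sensitivity idea for a single ODE: augment the state with its sensitivity to a parameter and integrate both together. The decay model is only a placeholder for a full system code, not the report's reactor equations.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal forward-sensitivity example. For dy/dt = f(y, p), the sensitivity
# s = dy/dp obeys ds/dt = (df/dy) s + df/dp, so y and s can be integrated
# together as one augmented system.
p = 0.7          # decay-rate parameter (placeholder model)
y0 = 2.0

def rhs(t, z):
    y, s = z
    f = -p * y                      # model equation
    dfdy, dfdp = -p, -y             # partial derivatives of f
    return [f, dfdy * s + dfdp]

sol = solve_ivp(rhs, (0.0, 5.0), [y0, 0.0], rtol=1e-8, atol=1e-10)
y_end, s_end = sol.y[:, -1]

# Analytic check: y = y0 exp(-p t), so dy/dp = -t y0 exp(-p t).
t_end = 5.0
print(f"sensitivity dy/dp at t=5: forward = {s_end:.6f}, "
      f"analytic = {-t_end * y0 * np.exp(-p * t_end):.6f}")
```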

  8. Uncertainty Analysis of Decomposing Polyurethane Foam

    NASA Technical Reports Server (NTRS)

    Hobbs, Michael L.; Romero, Vicente J.

    2000-01-01

    Sensitivity/uncertainty analyses are necessary to determine where to allocate resources for improved predictions in support of our nation's nuclear safety mission. Yet, sensitivity/uncertainty analyses are not commonly performed on complex combustion models because the calculations are time consuming, CPU intensive, nontrivial exercises that can lead to deceptive results. To illustrate these ideas, a variety of sensitivity/uncertainty analyses were used to determine the uncertainty associated with thermal decomposition of polyurethane foam exposed to high radiative flux boundary conditions. The polyurethane used in this study is a rigid closed-cell foam used as an encapsulant. Related polyurethane binders such as Estane are used in many energetic materials of interest to the JANNAF community. The complex, finite element foam decomposition model used in this study has 25 input parameters that include chemistry, polymer structure, and thermophysical properties. The response variable was selected as the steady-state decomposition front velocity calculated as the derivative of the decomposition front location versus time. An analytical mean value sensitivity/uncertainty (MV) analysis was used to determine the standard deviation by taking numerical derivatives of the response variable with respect to each of the 25 input parameters. Since the response variable is also a derivative, the standard deviation was essentially determined from a second derivative that was extremely sensitive to numerical noise. To minimize the numerical noise, 50-micrometer element dimensions and approximately 1-msec time steps were required to obtain stable uncertainty results. As an alternative method to determine the uncertainty and sensitivity in the decomposition front velocity, surrogate response surfaces were generated for use with a constrained Latin Hypercube Sampling (LHS) technique. Two surrogate response surfaces were investigated: 1) a linear surrogate response surface (LIN) and 2
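
    A small sketch of the analytical mean value (MV) method described above: take central-difference derivatives of a response function with respect to each input and combine them as the root-sum-square of (derivative × input standard deviation). The response function and input statistics below are toy stand-ins, not the foam decomposition model.

```python
import numpy as np

# Sketch of the mean value (MV) first-order uncertainty method using numerical
# derivatives. The response function is a toy stand-in for the finite-element
# foam decomposition model.
def response(params):
    a, b, c = params
    return a * np.exp(-b) + 0.5 * c ** 2   # hypothetical "front velocity" surrogate

nominal = np.array([1.0, 0.3, 2.0])        # nominal input values (made up)
stdevs = np.array([0.05, 0.02, 0.10])      # input standard deviations (made up)

var = 0.0
for i in range(nominal.size):
    h = 1e-5 * max(abs(nominal[i]), 1.0)
    up, dn = nominal.copy(), nominal.copy()
    up[i] += h
    dn[i] -= h
    dfdp = (response(up) - response(dn)) / (2.0 * h)   # central difference
    var += (dfdp * stdevs[i]) ** 2                     # first-order term
    print(f"param {i}: d(response)/dp = {dfdp:+.4f}")

print(f"MV standard deviation of response: {np.sqrt(var):.4f}")
```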

  9. LCA data quality: sensitivity and uncertainty analysis.

    PubMed

    Guo, M; Murphy, R J

    2012-10-01

    Life cycle assessment (LCA) data quality issues were investigated by using case studies on products from starch-polyvinyl alcohol based biopolymers and petrochemical alternatives. The time horizon chosen for the characterization models was shown to be an important sensitive parameter for the environmental profiles of all the polymers. In the global warming potential and the toxicity potential categories the comparison between biopolymers and petrochemical counterparts altered as the time horizon extended from 20 years to infinite time. These case studies demonstrated that the use of a single time horizon provides only one perspective on the LCA outcomes, which could introduce an inadvertent bias, especially in toxicity impact categories; dynamic LCA characterization models with varying time horizons are therefore recommended as a measure of robustness for LCAs, especially comparative assessments. This study also presents an approach to integrate statistical methods into LCA models for analyzing uncertainty in industrial and computer-simulated datasets. We calibrated probabilities for the LCA outcomes for biopolymer products arising from uncertainty in the inventory and from data variation characteristics; this has enabled assigning confidence to the LCIA outcomes in specific impact categories for the biopolymer vs. petrochemical polymer comparisons undertaken. Uncertainty analysis combined with the sensitivity analysis carried out in this study has led to a transparent increase in confidence in the LCA findings. We conclude that LCAs lacking explicit interpretation of the degree of uncertainty and sensitivities are of limited value as robust evidence for decision making or comparative assertions. PMID:22854094

  10. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    SciTech Connect

    Haihua Zhao; Vincent A. Mousseau

    2013-01-01

    This paper presents the extended forward sensitivity analysis as a method to help uncertainty quantification. By including time step and potentially spatial step as special sensitivity parameters, the forward sensitivity method is extended as one method to quantify numerical errors. Note that by integrating local truncation errors over the whole system through the forward sensitivity analysis process, the generated time step and spatial step sensitivity information reflect global numerical errors. The discretization errors can be systematically compared against uncertainties due to other physical parameters. This extension makes the forward sensitivity method a much more powerful tool to help uncertainty quantification. By knowing the relative sensitivity of time and space steps compared with other physical parameters of interest, the simulation is allowed to run at optimized time and space steps without affecting the confidence of the physical parameter sensitivity results. The time and space steps forward sensitivity analysis method can also replace the traditional time step and grid convergence study with much less computational cost. Two well-defined benchmark problems with manufactured solutions are utilized to demonstrate the method.

  11. Coupled semivariogram uncertainty of hydrogeological and geophysical data on capture zone uncertainty analysis

    USGS Publications Warehouse

    Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.

    2008-01-01

    This study investigates capture zone uncertainty that relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of structural parameters. The probability distributions of structural parameters were further updated through the Bayesian approach with the Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and pseudo-cross-semivariogram. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in Southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. © 2008 ASCE.

  12. Pretest uncertainty analysis for chemical rocket engine tests

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.

    1987-01-01

    A parametric pretest uncertainty analysis has been performed for a chemical rocket engine test at a unique 1000:1 area ratio altitude test facility. Results from the parametric study provide the error limits required in order to maintain a maximum uncertainty of 1 percent on specific impulse. Equations used in the uncertainty analysis are presented.
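
    The abstract only states that error limits were derived so that specific-impulse uncertainty stays below 1 percent; the equations themselves are not reproduced here. As a hedged illustration, assuming the common definition Isp = F / (mdot * g0), first-order propagation combines the thrust and mass-flow relative uncertainties in quadrature, and one can screen candidate measurement error limits against the 1 percent target.

```python
import numpy as np

# Hedged illustration (not the paper's actual equations): for Isp = F/(mdot*g0),
# first-order propagation gives (dIsp/Isp)^2 = (dF/F)^2 + (dmdot/mdot)^2.
def isp_rel_uncertainty(rel_thrust, rel_mdot):
    return np.sqrt(rel_thrust ** 2 + rel_mdot ** 2)

# Candidate per-measurement error limits (illustrative values).
for rel_f, rel_m in [(0.005, 0.005), (0.007, 0.007), (0.008, 0.006)]:
    u = isp_rel_uncertainty(rel_f, rel_m)
    ok = "meets" if u <= 0.01 else "exceeds"
    print(f"thrust {rel_f:.1%}, mass flow {rel_m:.1%} -> "
          f"Isp uncertainty {u:.2%} ({ok} 1% limit)")
```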

  13. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for increased hazard estimates which have resulted from some recent large scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic for a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflating uncertainties in PSHA results. Other, more data driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  14. Application of uncertainty analysis to cooling tower thermal performance tests

    SciTech Connect

    Yost, J.G.; Wheeler, D.E.

    1986-01-01

    The purpose of this paper is to provide an overview of uncertainty analyses. The following topics are addressed: 1. A review and summary of the basic constituents of an uncertainty analysis with definitions and discussion of basic terms; 2. A discussion of the benefits and uses of uncertainty analysis; and 3. Example uncertainty analyses with emphasis on the problems, limitations, and site-specific complications.

  15. Representation of analysis results involving aleatory and epistemic uncertainty.

    SciTech Connect

    Johnson, Jay Dean; Helton, Jon Craig; Oberkampf, William Louis; Sallaberry, Cedric J.

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
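
    A minimal nested-sampling sketch of the structure described above for the case where epistemic uncertainty is itself represented probabilistically: each outer (epistemic) sample fixes the poorly known quantities and yields one CCDF from the inner (aleatory) loop, so the result is a family of CCDFs rather than a single curve. The response model and distributions are toy placeholders.

```python
import numpy as np

# Nested-sampling sketch: epistemic uncertainty (outer loop) produces a family
# of CCDFs, each built from aleatory sampling (inner loop).
rng = np.random.default_rng(7)

thresholds = np.linspace(0.0, 10.0, 50)
n_epistemic, n_aleatory = 20, 2000

family = []
for _ in range(n_epistemic):
    # Epistemic: a fixed-but-poorly-known mean, sampled from its stated interval.
    mu = rng.uniform(2.0, 4.0)
    # Aleatory: inherent randomness of the system for that epistemic realization.
    outcomes = rng.normal(mu, 1.0, n_aleatory) ** 2 / 2.0
    ccdf = [(outcomes > t).mean() for t in thresholds]
    family.append(ccdf)

family = np.array(family)
# Summaries often reported across the family of CCDFs at a given threshold:
idx = int(np.argmin(np.abs(thresholds - 5.0)))
print(f"P(outcome > {thresholds[idx]:.1f}): min = {family[:, idx].min():.3f}, "
      f"median = {np.median(family[:, idx]):.3f}, max = {family[:, idx].max():.3f}")
```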

  16. A TIERED APPROACH TO PERFORMING UNCERTAINTY ANALYSIS IN CONDUCTING EXPOSURE ANALYSIS FOR CHEMICALS

    EPA Science Inventory

    The WHO/IPCS draft Guidance Document on Characterizing and Communicating Uncertainty in Exposure Assessment provides guidance on recommended strategies for conducting uncertainty analysis as part of human exposure analysis. Specifically, a tiered approach to uncertainty analysis ...

  17. Uncertainty Budget Analysis for Dimensional Inspection Processes (U)

    SciTech Connect

    Valdez, Lucas M.

    2012-07-26

    This paper is intended to provide guidance and describe how to prepare an uncertainty analysis of a dimensional inspection process through the use of an uncertainty budget analysis. The uncertainty analysis follows the same methodology as the ISO GUM standard for calibration and testing. A specific distinction is made between how Type A and Type B uncertainty analyses are used in general and specific processes. All theory and applications are utilized to represent both a generalized approach to estimating measurement uncertainty and how to report and present these estimations for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the necessary assumptions in place for best possible results.
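
    A bare-bones numerical example of combining Type A and Type B components in the GUM style (root-sum-square of standard uncertainties, then an expanded uncertainty with a coverage factor). The readings and the Type B bound are illustrative values, not taken from the report.

```python
import numpy as np

# Bare-bones GUM-style uncertainty budget (illustrative values only).
readings_mm = np.array([10.003, 10.001, 10.004, 10.002, 10.003])  # repeat measurements

# Type A: statistical, from repeated measurements (standard error of the mean).
u_typeA = readings_mm.std(ddof=1) / np.sqrt(readings_mm.size)

# Type B: other knowledge, e.g. a +/-0.002 mm calibration bound treated as a
# rectangular distribution, giving a standard uncertainty of a / sqrt(3).
u_typeB = 0.002 / np.sqrt(3.0)

u_combined = np.sqrt(u_typeA ** 2 + u_typeB ** 2)  # root-sum-square
U_expanded = 2.0 * u_combined                      # coverage factor k = 2

print(f"Type A standard uncertainty:   {u_typeA * 1000:.2f} um")
print(f"Type B standard uncertainty:   {u_typeB * 1000:.2f} um")
print(f"combined standard uncertainty: {u_combined * 1000:.2f} um")
print(f"expanded uncertainty (k=2):    {U_expanded * 1000:.2f} um")
```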

  18. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments in order to correctly understand the strengths and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how to best use uncertainty analysis to generate a more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters, and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies that there are general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. PMID:25890086

  19. Geoengineering to Avoid Overshoot: An Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Tanaka, K.

    2009-04-01

    Geoengineering (or climate engineering) using stratospheric sulfur injections (Crutzen, 2006) has been suggested for research in case of an urgent need to stop global warming when other mitigation efforts are exhausted. Although there are a number of concerns over this idea (e.g. Robock, 2008), it is still useful to consider geoengineering as a possible method to limit warming caused by overshoot. Overshoot is a feature of low stabilization scenarios aiming for a stringent target (Rao et al., 2008) in which total radiative forcing temporarily exceeds the target before reaching it. Scenarios achieving a 50% emission reduction by 2050 produce overshoot. Overshoot could cause sustained warming for decades due to the inertia of the climate system. If stratospheric sulfur injections were to be used as a "last resort" to avoid overshoot, what would be the suitable start-year and injection profile of such an intervention? Wigley (2006) examined climate response to combined mitigation/geoengineering scenarios with the intent to avert overshoot. Wigley's analysis demonstrated a basic potential of such a combined mitigation/geoengineering approach to avoid temperature overshoot; however, it considered only simplistic sulfur injection profiles (all started in 2010), just one mitigation scenario, and did not examine the sensitivity of the climate response to any underlying uncertainties. This study builds upon Wigley's premise of the combined mitigation/geoengineering approach and brings associated uncertainty into the analysis. First, this study addresses the question of how much geoengineering intervention would be needed to avoid overshoot when associated uncertainty is considered. Then, would a geoengineering intervention of such a magnitude, including uncertainty, be permissible considering all the other side effects? This study begins from the supposition that geoengineering could be employed to cap warming at 2.0°C since preindustrial. A few

  20. The challenges on uncertainty analysis for pebble bed HTGR

    SciTech Connect

    Hao, C.; Li, F.; Zhang, H.

    2012-07-01

    Uncertainty analysis is widely applied and important, and much work has been done for Light Water Reactors (LWRs), although experience with uncertainty analysis in High Temperature Gas cooled Reactor (HTGR) modeling is still at an early stage. The IAEA will soon launch a Coordinated Research Project (CRP) on this topic. This paper addresses some challenges for uncertainty analysis in HTGR modeling, based on the experience of the OECD LWR Uncertainty Analysis in Modeling (UAM) activities and taking into account the peculiarities of pebble bed HTGR designs. The main challenges for HTGR UAM are the lack of experience, the very different code packages, and the coupling of the power, temperature, and burnup distributions through the temperature feedback and pebble flow. The most serious challenge is how to deal with the uncertainty in pebble flow and in pebble bed flow modeling, and their contribution to the uncertainty of the maximum fuel temperature, which is the parameter of greatest interest for the modular HTGR. (authors)

  1. Uncertainty Analysis with Site Specific Groundwater Models: Experiences and Observations

    SciTech Connect

    Brewer, K.

    2003-07-15

    Groundwater flow and transport predictions are a major component of remedial action evaluations for contaminated groundwater at the Savannah River Site. Because all groundwater modeling results are subject to uncertainty from various causes, quantification of the level of uncertainty in the modeling predictions is beneficial to project decision makers. Complex site-specific models present formidable challenges for implementing an uncertainty analysis.

  2. Uncertainty Analysis of Simulated Hydraulic Fracturing

    NASA Astrophysics Data System (ADS)

    Chen, M.; Sun, Y.; Fu, P.; Carrigan, C. R.; Lu, Z.

    2012-12-01

    Artificial hydraulic fracturing is widely used to stimulate production from oil, natural gas, and geothermal reservoirs with low natural permeability. Optimization of field design and operation is limited by the incomplete characterization of the reservoir, as well as the complexity of the hydrological and geomechanical processes that control the fracturing. Thus, there are a variety of uncertainties associated with the pre-existing fracture distribution, rock mechanics, and hydraulic-fracture engineering that require evaluation of their impact on the optimized design. In this study, a multiple-stage scheme was employed to evaluate the uncertainty. We first defined the ranges and distributions of 11 input parameters that characterize the natural fracture topology, in situ stress, geomechanical behavior of the rock matrix and joint interfaces, and pumping operation, to cover a wide spectrum of potential conditions expected for a natural reservoir. These parameters were then sampled 1,000 times in an 11-dimensional parameter space constrained by the specified ranges using the Latin hypercube method. The 1,000 parameter sets were fed into the fracture simulators, and the outputs were used to construct three objective functions: fracture density, opened fracture length, and area density. Using PSUADE, three response surfaces (11-dimensional) of the objective functions were developed, and a global sensitivity analysis was performed to identify the parameters most influential for the objective functions representing fracture connectivity, which is critical for the sweep efficiency of the recovery process. Second-stage, high-resolution response surfaces were then constructed with the dimension reduced to the number of most sensitive parameters. An additional response surface, for the objective function given by the fractal dimension of the fracture distributions, was constructed in this stage. Based on these response surfaces, comprehensive uncertainty analyses were conducted
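
    As a hedged illustration of the sampling stage described above, the sketch below draws 1,000 Latin-hypercube samples from an 11-dimensional parameter space with specified ranges using SciPy's quasi-Monte Carlo module; the bounds are placeholders, not the study's actual parameter ranges.

```python
import numpy as np
from scipy.stats import qmc

n_params = 11      # fracture topology, in situ stress, rock/joint mechanics, pumping
n_samples = 1000

# Hypothetical lower/upper bounds for the 11 parameters (placeholders).
lower = np.zeros(n_params)
upper = np.ones(n_params)

sampler = qmc.LatinHypercube(d=n_params, seed=42)
unit_samples = sampler.random(n=n_samples)           # samples in the unit hypercube
param_sets = qmc.scale(unit_samples, lower, upper)   # rescale to physical ranges

# Each row of param_sets would be passed to the fracture simulator, and the
# outputs used to build the response surfaces discussed in the abstract.
print(param_sets.shape)  # (1000, 11)
```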

  3. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.
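
    As a hedged illustration of combining uncertainty components for a measurement such as PV module power, the sketch below sums hypothetical systematic and random components in quadrature and applies a coverage factor; the component names and values are illustrative assumptions, not taken from the presentation.

```python
import math

# Hypothetical relative uncertainty components (1-sigma, in percent) of a
# PV module power measurement; the names and values are illustrative only.
components = {
    "irradiance_reference_cell": 1.0,
    "spectral_mismatch": 0.5,
    "temperature_correction": 0.4,
    "data_acquisition": 0.2,
}

# Root-sum-square combination of independent components.
combined_1sigma = math.sqrt(sum(u**2 for u in components.values()))
expanded_95 = 2.0 * combined_1sigma   # coverage factor k = 2 for ~95% confidence

print(f"Combined standard uncertainty: {combined_1sigma:.2f} %")
print(f"Expanded uncertainty (k=2):    {expanded_95:.2f} %")
```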

  4. Micro-Pulse Lidar Signals: Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Welton, Ellsworth J.; Campbell, James R.; Starr, David OC. (Technical Monitor)

    2002-01-01

    Micro-pulse lidar (MPL) systems are small, autonomous, eye-safe lidars used for continuous observations of the vertical distribution of cloud and aerosol layers. Since the construction of the first MPL in 1993, procedures have been developed to correct for various instrument effects present in MPL signals. The primary instrument effects include afterpulse (laser-detector cross-talk) and overlap (poor near-range, less than 6 km, focusing). Accurate correction of both afterpulse and overlap effects is required to study both clouds and aerosols. Furthermore, the outgoing energy of the laser pulses and the statistical uncertainty of the MPL detector must also be correctly determined in order to assess the accuracy of MPL observations. The uncertainties associated with the afterpulse, overlap, pulse energy, detector noise, and all remaining quantities affecting measured MPL signals are determined in this study. The uncertainties are propagated through the entire MPL correction process to give a net uncertainty on the final corrected MPL signal. The results show that in the near range, the overlap uncertainty dominates. At altitudes above the overlap region, the dominant source of uncertainty is the uncertainty in the pulse energy. However, if the laser energy is low, then during mid-day, high solar background levels can significantly reduce the signal-to-noise ratio of the detector. In such a case, the statistical uncertainty of the detector count rate becomes dominant at altitudes above the overlap region.
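
    As a hedged sketch of the propagation step described above, the example below carries independent uncertainty sources through a simplified correction equation (background- and afterpulse-corrected signal divided by overlap and pulse energy) using first-order quadrature propagation; the correction form and all numbers are assumptions for illustration, not the paper's processing chain.

```python
import numpy as np

# Hypothetical profile values (arbitrary units); names and numbers are illustrative.
raw = np.array([120.0, 80.0, 40.0])       # background-subtracted count rates at 3 ranges
afterpulse = np.array([2.0, 1.5, 1.0])    # afterpulse correction
overlap = np.array([0.4, 0.9, 1.0])       # overlap correction factor
energy = 6.0                              # outgoing pulse energy

u_raw = np.sqrt(raw)                      # Poisson counting uncertainty
u_afterpulse = 0.1 * afterpulse           # assumed 10% afterpulse uncertainty
u_overlap = np.array([0.05, 0.02, 0.0])   # assumed overlap uncertainty
u_energy = 0.2                            # assumed pulse-energy uncertainty

signal = (raw - afterpulse) / (overlap * energy)

# First-order propagation for independent error sources (relative variances add).
rel_var = ((u_raw**2 + u_afterpulse**2) / (raw - afterpulse)**2
           + (u_overlap / overlap)**2
           + (u_energy / energy)**2)
u_signal = signal * np.sqrt(rel_var)
print(np.round(u_signal, 3))
```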

  5. An optimization based sampling approach for multiple metrics uncertainty analysis using generalized likelihood uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng

    2016-09-01

    This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach, with the aim of improving sampling efficiency for multiple-metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII based sampling is demonstrated in comparison with Latin hypercube sampling (LHS) by analyzing sampling efficiency, multiple-metrics performance, parameter uncertainty, and flood forecasting uncertainty, with a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for the Qing River reservoir, China. The results demonstrate the following advantages of the ɛ-NSGAII based sampling approach in comparison to LHS: (1) it is more effective and efficient than LHS; for example, the simulation time required to generate 1000 behavioral parameter sets is about nine times shorter; (2) the Pareto tradeoffs between metrics are demonstrated clearly by the solutions from ɛ-NSGAII based sampling, and their Pareto optimal values are better than those of LHS, which indicates better forecasting accuracy of the ɛ-NSGAII parameter sets; (3) the parameter posterior distributions from ɛ-NSGAII based sampling are concentrated in appropriate ranges rather than being uniform, which accords with their physical significance, and parameter uncertainties are reduced significantly; (4) the forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB), and average deviation amplitude (D). The flood forecasting uncertainty is also reduced considerably with ɛ-NSGAII based sampling. This study provides a new sampling approach to improve multiple-metrics uncertainty analysis under the framework of GLUE, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
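
    To make the GLUE step concrete, the sketch below applies a behavioral threshold and likelihood weighting to candidate parameter sets for a toy model; the model, likelihood measure (Nash-Sutcliffe efficiency), threshold, and all numbers are illustrative assumptions, and the candidates could equally come from ɛ-NSGAII or LHS.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a rainfall-runoff model (the paper uses the Xinanjiang model).
def simulate(theta, t):
    return theta[0] * np.exp(-theta[1] * t)

t = np.linspace(0.0, 10.0, 50)
observed = simulate((2.0, 0.3), t) + rng.normal(0.0, 0.05, t.size)

# Candidate parameter sets; in the paper these come from ɛ-NSGAII or LHS sampling.
candidates = rng.uniform([0.5, 0.05], [4.0, 1.0], size=(5000, 2))

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, used here as the GLUE likelihood measure."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

likelihood = np.array([nse(simulate(th, t), observed) for th in candidates])

# GLUE: retain 'behavioral' sets above a threshold and rescale their likelihoods.
threshold = 0.7
keep = likelihood > threshold
behavioral = candidates[keep]
weights = likelihood[keep] - threshold
weights /= weights.sum()

print(f"{len(behavioral)} behavioral parameter sets")
# Likelihood-weighted mean prediction, a simple summary of the GLUE ensemble.
ensemble = np.array([simulate(th, t) for th in behavioral])
print((weights @ ensemble)[:5].round(3))
```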

  6. Statistical Uncertainty Analysis Applied to Criticality Calculation

    SciTech Connect

    Hartini, Entin; Andiwijayakusuma, Dinan; Susmikanti, Mike; Nursinta, A. W.

    2010-06-22

    In this paper, we present an uncertainty methodology based on a statistical approach for assessing uncertainties in criticality prediction with the Monte Carlo method due to uncertainties in the isotopic composition of the fuel. The methodology has been applied to criticality calculations with MCNP5 with additional stochastic input of the isotopic fuel composition. The stochastic inputs were generated using the Latin hypercube sampling method, based on the probability density function of each nuclide composition. The automatic passing of the stochastic inputs to MCNP and the repeated criticality calculations are made possible by a Python script that links MCNP and our Latin hypercube sampling code.
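
    As a hedged sketch of generating such stochastic inputs, the example below draws Latin-hypercube samples, maps them to normal marginals for each nuclide fraction, and renormalizes; the nuclides, nominal fractions, and uncertainties are illustrative placeholders, and writing the MCNP material cards is only indicated in a comment.

```python
import numpy as np
from scipy.stats import norm, qmc

# Hypothetical nominal isotopic fractions and 1-sigma uncertainties of a fuel mix.
nominal = {"U235": 0.045, "U238": 0.955}
sigma = {"U235": 0.0005, "U238": 0.0005}

n_runs = 100
sampler = qmc.LatinHypercube(d=len(nominal), seed=1)
u = sampler.random(n_runs)            # uniform LHS samples in the unit square

# Transform each column to a normal marginal via the inverse CDF.
names = list(nominal)
samples = np.column_stack([
    norm.ppf(u[:, i], loc=nominal[k], scale=sigma[k]) for i, k in enumerate(names)
])
samples /= samples.sum(axis=1, keepdims=True)   # renormalize fractions to sum to 1

# Each row would be written into an MCNP material card and the case re-run;
# the spread of the resulting k-eff values estimates the criticality uncertainty.
for i, row in enumerate(samples[:3]):
    print(f"run {i}: " + ", ".join(f"{k}={v:.5f}" for k, v in zip(names, row)))
```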

  7. Uncertainty Analysis of Historical Hurricane Data

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.

    2007-01-01

    An analysis of variance (ANOVA) study was conducted for historical hurricane data dating back to 1851, obtained from the U.S. Department of Commerce National Oceanic and Atmospheric Administration (NOAA). The data set was chosen because it is a large, publicly available collection of information exhibiting great variability, which has made the forecasting of future states from current and previous states difficult. The availability of substantial, high-fidelity validation data, however, made for an excellent uncertainty assessment study. Several factors (independent variables) were identified from the data set which could potentially influence the track and intensity of the storms. The values of these factors, along with the values of responses of interest (dependent variables), were extracted from the database and provided to a commercial software package for processing via the ANOVA technique. The primary goal of the study was to document the ANOVA modeling uncertainty and predictive errors in making predictions about hurricane location and intensity 24 to 120 hours beyond known conditions, as reported by the data set. A secondary goal was to expose the ANOVA technique to a broader community within NASA. The independent factors considered to have an influence on the hurricane track included the current and starting longitudes and latitudes (measured in degrees), the current and starting maximum sustained wind speeds (measured in knots), and the storm starting date, its current duration since first appearance, and the current year fraction of each reading, all measured in years. The year fraction and starting date were included in order to attempt to account for long-duration cyclic behaviors, such as seasonal weather patterns, and years in which the sea or atmosphere were unusually warm or cold. The effect of short-duration weather patterns and ocean conditions could not be examined with the current data set. The responses analyzed were the storm
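
    As a hedged illustration of the ANOVA step, the sketch below fits a linear model on a synthetic data set with hurricane-like factor names and prints the ANOVA table with statsmodels; the variables and the generating relationship are invented for illustration and are not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(3)

# Synthetic stand-in for the hurricane records: a response plus candidate factors.
n = 500
df = pd.DataFrame({
    "start_lat": rng.uniform(10, 35, n),
    "current_wind": rng.uniform(30, 140, n),
    "year_fraction": rng.uniform(0.4, 0.95, n),
})
df["wind_change_24h"] = (0.15 * df["current_wind"]
                         - 0.8 * df["start_lat"]
                         + rng.normal(0, 10, n))

# Fit a linear model and produce the ANOVA table (Type II sums of squares).
model = ols("wind_change_24h ~ start_lat + current_wind + year_fraction", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```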

  8. Characterization and evaluation of uncertainty in probabilistic risk analysis

    SciTech Connect

    Parry, G.W.; Winter, P.W.

    1981-01-01

    The sources of uncertainty in probabilistic risk analysis are discussed, using the event- and fault-tree methodology as an example. The role of statistics in quantifying these uncertainties is investigated. A class of uncertainties is identified which is, at present, unquantifiable using either classical or Bayesian statistics. It is argued that Bayesian statistics is the more appropriate vehicle for the probabilistic analysis of rare events, and a short review is given with some discussion on the representation of ignorance.
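
    As a hedged illustration of the Bayesian treatment of rare events mentioned above, the sketch below updates a conjugate Gamma prior on a Poisson event rate with sparse count data; the prior and the observed counts are illustrative numbers, not from the report.

```python
from scipy import stats

# Poisson failure count with a conjugate Gamma(alpha, beta) prior on the rate.
prior_shape, prior_rate = 0.5, 10.0       # weakly informative prior (illustrative)
failures, exposure_years = 1, 2500.0      # e.g. 1 event observed in 2500 component-years

post_shape = prior_shape + failures
post_rate = prior_rate + exposure_years
posterior = stats.gamma(a=post_shape, scale=1.0 / post_rate)

print(f"posterior mean rate: {posterior.mean():.2e} per year")
print(f"90% credible interval: [{posterior.ppf(0.05):.2e}, {posterior.ppf(0.95):.2e}]")
```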

  9. Radiometer Design Analysis Based Upon Measurement Uncertainty

    NASA Technical Reports Server (NTRS)

    Racette, Paul E.; Lang, Roger H.

    2004-01-01

    This paper introduces a method for predicting the performance of a radiometer design based on calculating the measurement uncertainty. The variety in radiometer designs and the demand for improved radiometric measurements justify the need for a more general and comprehensive method to assess system performance. Radiometric resolution, or sensitivity, is a figure of merit that has been commonly used to characterize the performance of a radiometer. However, when evaluating the performance of a calibration design for a radiometer, the use of radiometric resolution has limited application. These limitations are overcome by considering instead the measurement uncertainty. A method for calculating measurement uncertainty for a generic radiometer design, including its calibration algorithm, is presented. The result is a generalized technique by which system calibration architectures and design parameters can be studied to optimize instrument performance for given requirements and constraints. Example applications demonstrate the utility of using measurement uncertainty as a figure of merit.

  10. Uncertainty analysis of statistical downscaling methods

    NASA Astrophysics Data System (ADS)

    Khan, Mohammad Sajjad; Coulibaly, Paulin; Dibike, Yonas

    2006-03-01

    Three downscaling models, namely the Statistical Down-Scaling Model (SDSM), the Long Ashton Research Station Weather Generator (LARS-WG) model, and an Artificial Neural Network (ANN) model, have been compared in terms of various uncertainty assessments of their downscaled results for daily precipitation and daily maximum and minimum temperatures. In the case of daily maximum and minimum temperature, uncertainty is assessed by comparing the monthly mean and variance of downscaled and observed daily maximum and minimum temperature for each month of the year at the 95% confidence level. In addition, uncertainties of the monthly means and variances of downscaled daily temperature have been calculated using 95% confidence intervals, which are compared with the observed uncertainties of means and variances. In daily precipitation downscaling, in addition to comparing means and variances, uncertainties have been assessed by comparing monthly mean dry and wet spell lengths and their confidence intervals, cumulative frequency distributions (CDFs) of monthly means of daily precipitation, and the distributions of monthly wet and dry days for observed and downscaled daily precipitation. The study has been carried out using 40 years (1961 to 2000) of observed and downscaled daily precipitation and daily maximum and minimum temperature data, with NCEP (National Centers for Environmental Prediction) reanalysis predictors. The uncertainty assessment results indicate that the SDSM is the most capable of reproducing the various statistical characteristics of the observed data in its downscaled results at the 95% confidence level, the ANN is the least capable in this respect, and the LARS-WG lies between the SDSM and the ANN.
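
    As a hedged illustration of the monthly comparison described above, the sketch below computes means with 95% confidence intervals for one month of observed and downscaled temperatures and applies a two-sample test; the data are synthetic placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic stand-in for one month of observed vs. downscaled daily maximum temperature.
observed = rng.normal(24.0, 3.0, 30)
downscaled = rng.normal(24.5, 3.5, 30)

def mean_ci(x, conf=0.95):
    """Sample mean and its confidence interval from the t distribution."""
    m, se = x.mean(), stats.sem(x)
    h = se * stats.t.ppf(0.5 + conf / 2.0, df=x.size - 1)
    return m, (m - h, m + h)

for name, data in [("observed", observed), ("downscaled", downscaled)]:
    m, ci = mean_ci(data)
    print(f"{name:10s} mean {m:5.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")

# A two-sample t-test flags months where the downscaled mean differs from the
# observed mean at the 95% level; an analogous F-test can be used for variances.
print(stats.ttest_ind(observed, downscaled, equal_var=False))
```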

  11. Uncertainty Analysis of Thermal Comfort Parameters

    NASA Astrophysics Data System (ADS)

    Ribeiro, A. Silva; Alves e Sousa, J.; Cox, Maurice G.; Forbes, Alistair B.; Matias, L. Cordeiro; Martins, L. Lages

    2015-08-01

    International Standard ISO 7730:2005 defines thermal comfort as that condition of mind that expresses the degree of satisfaction with the thermal environment. Although this definition is inevitably subjective, the Standard gives formulae for two thermal comfort indices, predicted mean vote (PMV) and predicted percentage dissatisfied (PPD). The PMV formula is based on principles of heat balance and experimental data collected in a controlled climate chamber under steady-state conditions. The PPD formula depends only on PMV. Although these formulae are widely recognized and adopted, little has been done to establish the measurement uncertainties associated with their use, bearing in mind that the formulae depend on measured values and on tabulated values given to limited numerical accuracy. Knowledge of these uncertainties is invaluable when values provided by the formulae are used in making decisions in various health and civil engineering situations. This paper examines these formulae, giving a general mechanism for evaluating the uncertainties associated with values of the quantities on which the formulae depend. Further, consideration is given to the propagation of these uncertainties through the formulae to provide uncertainties associated with the values obtained for the indices. Current international guidance on uncertainty evaluation is utilized.
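
    As a hedged sketch of the propagation step, the example below pushes an uncertain PMV value through the PPD relation by Monte Carlo sampling (in the spirit of GUM Supplement 1); the PPD expression is the form commonly quoted from ISO 7730, and the nominal PMV and its standard uncertainty are illustrative values, not results from the paper.

```python
import numpy as np

rng = np.random.default_rng(11)

def ppd(pmv):
    # PPD as a function of PMV, in the form commonly quoted from ISO 7730.
    return 100.0 - 95.0 * np.exp(-0.03353 * pmv**4 - 0.2179 * pmv**2)

# Assume the evaluated PMV is 0.5 with a standard uncertainty of 0.15 (illustrative);
# in practice this would itself come from propagating the measured input quantities.
pmv_samples = rng.normal(0.5, 0.15, 200_000)
ppd_samples = ppd(pmv_samples)

print(f"PPD at nominal PMV:    {ppd(0.5):.1f} %")
print(f"mean PPD:              {ppd_samples.mean():.1f} %")
print(f"standard uncertainty:  {ppd_samples.std(ddof=1):.1f} %")
print(f"95% coverage interval: {np.percentile(ppd_samples, [2.5, 97.5]).round(1)}")
```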

  12. Spatial Uncertainty Analysis of Ecological Models

    SciTech Connect

    Jager, H.I.; Ashwood, T.L.; Jackson, B.L.; King, A.W.

    2000-09-02

    The authors evaluated the sensitivity of a habitat model and a source-sink population model to spatial uncertainty in landscapes with different statistical properties and for hypothetical species with different habitat requirements. Sequential indicator simulation generated alternative landscapes from a source map. Their results showed that spatial uncertainty was highest for landscapes in which suitable habitat was rare and spatially uncorrelated. Although they were able to exert some control over the degree of spatial uncertainty by varying the sampling density drawn from the source map, intrinsic spatial properties (i.e., average frequency and degree of spatial autocorrelation) played a dominant role in determining variation among realized maps. To evaluate the ecological significance of landscape variation, they compared the variation in predictions from a simple habitat model to variation among landscapes for three species types. Spatial uncertainty in predictions of the amount of source habitat depended on both the spatial life history characteristics of the species and the statistical attributes of the synthetic landscapes. Species differences were greatest when the landscape contained a high proportion of suitable habitat. The predicted amount of source habitat was greater for edge-dependent (interior) species in landscapes with spatially uncorrelated (correlated) suitable habitat. A source-sink model demonstrated that, although variation among landscapes resulted in relatively little variation in overall population growth rate, this spatial uncertainty was sufficient in some situations to produce qualitatively different predictions about population viability (i.e., population decline vs. increase).

  13. Uncertainty Analysis in Space Radiation Protection

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2011-01-01

    Space radiation is comprised of high energy and charge (HZE) nuclei, protons, and secondary radiation including neutrons. The uncertainties in estimating the health risks from galactic cosmic rays (GCR) are a major limitation on the length of space missions, the evaluation of potential risk mitigation approaches, and the application of the As Low As Reasonably Achievable (ALARA) principle. For long duration space missions, risks may approach radiation exposure limits; therefore the uncertainties in risk projections become a major safety concern, and methodologies used for ground-based work are not deemed to be sufficient. NASA limits astronaut exposures to a 3% risk of exposure induced death (REID) and protects against uncertainties in risk projections using an assessment of 95% confidence intervals in the projection model. We discuss NASA's approach to space radiation uncertainty assessments and its applications for the International Space Station (ISS) program and design studies of future missions to Mars and other destinations. Several features of NASA's approach will be discussed. Radiation quality descriptions are based on the properties of radiation tracks rather than LET, with probability distribution functions (PDFs) for uncertainties derived from radiobiology experiments at particle accelerators. The application of age- and gender-specific models for individual astronauts is described. Because more than 90% of astronauts are never-smokers, an alternative risk calculation for never-smokers is used and is compared to estimates for an average U.S. population. Because the high energies of the GCR limit the benefits of shielding, and because of the limited role expected for pharmaceutical countermeasures, uncertainty reduction continues to be the optimal approach to improve radiation safety for space missions.

  14. Controllable set analysis for planetary landing under model uncertainties

    NASA Astrophysics Data System (ADS)

    Long, Jiateng; Gao, Ai; Cui, Pingyuan

    2015-07-01

    Controllable set analysis is a beneficial method in planetary landing mission design: it supports the selection of feasible entry states in order to achieve landing accuracy and satisfy entry path constraints. In view of the severe impact of model uncertainties on planetary landing safety and accuracy, the purpose of this paper is to investigate the controllable set under uncertainties between the on-board model and the real situation. Controllable set analysis under model uncertainties is composed of controllable union set (CUS) analysis and controllable intersection set (CIS) analysis. Definitions of the CUS and CIS are given, and a computational method for them based on the Gauss pseudospectral method is presented. Their application to the analysis of entry state distributions under uncertainty and to the robustness of nominal entry state selection is illustrated for Mars entry cases with ballistic coefficient, lift-to-drag ratio, and atmospheric uncertainties. With the analysis of the CUS and CIS, the robustness of entry state selection and of the entry trajectory to model uncertainties can be guaranteed, thus enhancing safety, reliability, and accuracy under model uncertainties during planetary entry and landing.

  15. Uncertainty Modeling for Structural Control Analysis and Synthesis

    NASA Technical Reports Server (NTRS)

    Campbell, Mark E.; Crawley, Edward F.

    1996-01-01

    The development of an accurate model of uncertainties for the control of structures that undergo a change in operational environment, based solely on modeling and experimentation in the original environment is studied. The application used throughout this work is the development of an on-orbit uncertainty model based on ground modeling and experimentation. A ground based uncertainty model consisting of mean errors and bounds on critical structural parameters is developed. The uncertainty model is created using multiple data sets to observe all relevant uncertainties in the system. The Discrete Extended Kalman Filter is used as an identification/parameter estimation method for each data set, in addition to providing a covariance matrix which aids in the development of the uncertainty model. Once ground based modal uncertainties have been developed, they are localized to specific degrees of freedom in the form of mass and stiffness uncertainties. Two techniques are presented: a matrix method which develops the mass and stiffness uncertainties in a mathematical manner; and a sensitivity method which assumes a form for the mass and stiffness uncertainties in macroelements and scaling factors. This form allows the derivation of mass and stiffness uncertainties in a more physical manner. The mass and stiffness uncertainties of the ground based system are then mapped onto the on-orbit system, and projected to create an analogous on-orbit uncertainty model in the form of mean errors and bounds on critical parameters. The Middeck Active Control Experiment is introduced as experimental verification for the localization and projection methods developed. In addition, closed loop results from on-orbit operations of the experiment verify the use of the uncertainty model for control analysis and synthesis in space.

  16. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.

  17. Analysis of the Uncertainty in Microbubble Characterization.

    PubMed

    Harfield, Caroline; Fury, Christopher R; Memoli, Gianluca; Jones, Philip; Ovenden, Nick; Stride, Eleanor

    2016-06-01

    There is increasing interest in the use of microbubble contrast agents for quantitative imaging applications such as perfusion and blood pressure measurement. The response of a microbubble to ultrasound excitation is, however, extremely sensitive to its size, the properties of its coating and the characteristics of the sound field and surrounding environment. Hence the results of microbubble characterization experiments can be significantly affected by experimental uncertainties, and this can limit their utility in predictive modelling. The aim of this study was to attempt to quantify these uncertainties and their influence upon measured microbubble characteristics. Estimates for the parameters characterizing the microbubble coating were obtained by fitting model data to numerical simulations of microbubble dynamics. The effect of uncertainty in different experimental parameters was gauged by modifying the relevant input values to the fitting process. The results indicate that even the minimum expected uncertainty in, for example, measurements of microbubble radius using conventional optical microscopy, leads to variations in the estimated coating parameters of ∼20%. This should be taken into account in designing microbubble characterization experiments and in the use of data obtained from them. PMID:26993799

  18. New Programming Environments for Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Hill, M. C.; Poeter, E. P.; Banta, E. R.; Christensen, S.; Cooley, R. L.; Ely, D. M.; Babendreier, J.; Leavesley, G.; Tonkin, M.; Julich, R.

    2005-12-01

    We live in a world of faster computers, better GUIs and visualization technology, increasing international cooperation made possible by new digital infrastructure, new agreements between US federal agencies (such as ISCMEM), new European Union programs (such as Harmoniqua), and greater collaboration between US university scientists through CUAHSI. These changes provide new resources for tackling the difficult job of quantifying how well our models perform. This talk introduces new programming environments that take advantage of these developments and will change the paradigm of how we develop methods for uncertainty evaluation. For example, the programming environments provided by the COSU API, the JUPITER API, and the Sensitivity/Optimization Toolbox provide enormous opportunities for faster and more meaningful evaluation of uncertainties. Instead of waiting years for ideas and theories to be compared in the complex circumstances of interest to resource managers, these new programming environments will expedite the process. In the new paradigm, unproductive ideas and theories will be revealed more quickly, and productive ideas and theories will more quickly be used to address our increasingly difficult water resources problems. As examples, two ideas implemented in JUPITER API applications are presented: uncertainty correction factors that account for system complexities not represented in models, and the PPR and OPR statistics used to identify new data needed to reduce prediction uncertainty.

  19. Uncertainty Analysis of Air Radiation for Lunar Return Shock Layers

    NASA Technical Reports Server (NTRS)

    Kleb, Bil; Johnston, Christopher O.

    2008-01-01

    By leveraging a new uncertainty markup technique, two risk analysis methods are used to compute the uncertainty of lunar-return shock layer radiation predicted by the High temperature Aerothermodynamic Radiation Algorithm (HARA). The effects of epistemic uncertainty, or uncertainty due to a lack of knowledge, are considered for the following modeling parameters: atomic line oscillator strengths, atomic line Stark broadening widths, atomic photoionization cross sections, negative ion photodetachment cross sections, molecular band oscillator strengths, and electron impact excitation rates. First, a simplified shock layer problem consisting of two constant-property equilibrium layers is considered. The results of this simplified problem show that the atomic nitrogen oscillator strengths and Stark broadening widths in both the vacuum ultraviolet and infrared spectral regions, along with the negative ion continuum, are the dominant uncertainty contributors. Next, three variable-property stagnation-line shock layer cases are analyzed: a typical lunar return case and two Fire II cases. For the near-equilibrium lunar return and Fire 1643-second cases, the resulting uncertainties are very similar to the simplified case. Conversely, the relatively nonequilibrium 1636-second case shows significantly larger influence from electron impact excitation rates of both atoms and molecules. For all cases, the total uncertainty in radiative heat flux to the wall due to epistemic uncertainty in modeling parameters is 30%, as opposed to the erroneously small uncertainty levels (plus or minus 6%) found when treating model parameter uncertainties as aleatory (due to chance) instead of epistemic (due to lack of knowledge).

  20. INTEGRATED UNCERTAINTY ANALYSIS TO SUPPORT EFFECTIVE ENVIRONMENTAL DECISION-MAKING

    EPA Science Inventory

    The expected results of this research are to 1) contribute to our understanding of dominant uncertainties in models typically used across many RIAs, 2) identify integrated uncertainty analysis strategies that the EPA and other regulatory agencies can use to evaluate overall un...

  1. AN IMPROVEMENT TO THE MOUSE COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM

    EPA Science Inventory

    The original MOUSE (Modular Oriented Uncertainty System) was designed to deal with the problem of uncertainties in environmental engineering calculations, such as a set of engineering cost or risk analysis equations. It was especially intended for use by individuals with l...

  2. Analysis of Hydrogeologic Conceptual Model and Parameter Uncertainty

    SciTech Connect

    Meyer, Philip D.; Nicholson, Thomas J.; Mishra, Srikanta

    2003-06-24

    A systematic methodology for assessing hydrogeologic conceptual model, parameter, and scenario uncertainties is being developed to support technical reviews of environmental assessments related to decommissioning of nuclear facilities. The first major task being undertaken is to produce a coupled parameter and conceptual model uncertainty assessment methodology. This task is based on previous studies that have primarily dealt individually with these two types of uncertainties. Conceptual model uncertainty analysis is based on the existence of alternative conceptual models that are generated using a set of clearly stated guidelines targeted at the needs of NRC staff. Parameter uncertainty analysis makes use of generic site characterization data as well as site-specific characterization and monitoring data to evaluate parameter uncertainty in each of the alternative conceptual models. Propagation of parameter uncertainty will be carried out through implementation of a general stochastic model of groundwater flow and transport in the saturated and unsaturated zones. Evaluation of prediction uncertainty will make use of Bayesian model averaging and visualization of model results. The goal of this study is to develop a practical tool to quantify uncertainties in the conceptual model and parameters identified in performance assessments.
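
    As a hedged sketch of how prediction uncertainty can be combined across alternative conceptual models, the example below computes BIC-based model weights and a model-averaged prediction with within- and between-model variance terms, in the spirit of Bayesian model averaging; the model names, likelihoods, and predictions are illustrative assumptions, not values from the methodology.

```python
import numpy as np

# Hypothetical alternative conceptual models with calibration results (illustrative).
models = ["layered aquifer", "fracture flow", "dual porosity"]
log_likelihood = np.array([-120.3, -118.9, -125.6])   # maximized log-likelihoods
n_params = np.array([6, 9, 11])                       # calibrated parameters per model
n_obs = 240                                           # number of observations

# BIC-based approximation to posterior model probabilities (equal prior weights).
bic = n_params * np.log(n_obs) - 2.0 * log_likelihood
weights = np.exp(-0.5 * (bic - bic.min()))
weights /= weights.sum()

# Model-averaged prediction and its variance (within-model plus between-model terms).
pred_mean = np.array([3.2, 2.7, 4.1])     # predicted concentration per model (illustrative)
pred_var = np.array([0.4, 0.3, 0.9])      # predictive variance within each model
avg = weights @ pred_mean
var = weights @ (pred_var + (pred_mean - avg) ** 2)

print(dict(zip(models, weights.round(3))))
print(f"model-averaged prediction: {avg:.2f} ± {np.sqrt(var):.2f}")
```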

  3. Automated uncertainty analysis methods in the FRAP computer codes. [PWR

    SciTech Connect

    Peck, S O

    1980-01-01

    A user-oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in their computed outputs as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts.

  4. Development of a special-purpose test surface guided by uncertainty analysis - Introduction of a new uncertainty analysis step

    NASA Technical Reports Server (NTRS)

    Wang, T.; Simon, T. W.

    1988-01-01

    Development of a recent experimental program to investigate the effects of streamwise curvature on boundary layer transition required making a bendable, heated, and instrumented test wall, a rather nonconventional surface. The present paper describes this surface, the design choices made in its development, and how uncertainty analysis was used, beginning early in the test program, to make such design choices. Published uncertainty analysis techniques were found to be of great value, but it became clear that another step, one herein called the pre-test analysis, would aid the program development. Finally, it is shown how the uncertainty analysis was used to determine whether the test surface was qualified for service.

  5. Detailed Uncertainty Analysis of the ZEM-3 Measurement System

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    The measurement of the Seebeck coefficient and electrical resistivity is critical to the investigation of all thermoelectric systems. It follows that the measurement uncertainty must be well understood in order to report ZT values which are accurate and trustworthy. A detailed uncertainty analysis of the ZEM-3 measurement system has been performed. The uncertainty analysis calculates the error in the electrical resistivity measurement as a result of sample geometry tolerance, probe geometry tolerance, statistical error, and multimeter uncertainty. The uncertainty on the Seebeck coefficient includes probe wire correction factors, statistical error, multimeter uncertainty, and, most importantly, the cold-finger effect. The cold-finger effect plagues all potentiometric (four-probe) Seebeck measurement systems, as heat parasitically transfers through the thermocouple probes. The effect leads to an asymmetric over-estimation of the Seebeck coefficient. A thermal finite element analysis allows for quantification of the phenomenon and provides an estimate of the uncertainty of the Seebeck coefficient. The thermoelectric power factor has been found to have an uncertainty of +9-14 at high temperature and 9 near room temperature.

  6. Including uncertainty in hazard analysis through fuzzy measures

    SciTech Connect

    Bott, T.F.; Eisenhawer, S.W.

    1997-12-01

    This paper presents a method for capturing the uncertainty expressed by a Hazard Analysis (HA) expert team when estimating the frequencies and consequences of accident sequences, and it provides a sound mathematical framework for propagating this uncertainty to the risk estimates for these accident sequences. The uncertainty is readily expressed as distributions that can visually aid the analyst in determining the extent and source of risk uncertainty in HA accident sequences. The results can also be expressed as single statistics of the distribution in a manner analogous to expressing a probabilistic distribution as a point-value statistic such as a mean or median. The study discussed here used data collected during the elicitation portion of an HA on a high-level waste transfer process to demonstrate the techniques for capturing uncertainty. These data came from observations of the uncertainty that HA team members expressed in assigning frequencies and consequences to accident sequences during an actual HA. This uncertainty was captured and manipulated using ideas from possibility theory. The result of this study is a practical method for displaying and assessing the uncertainty in the HA team estimates of the frequency and consequences of accident sequences. This uncertainty provides potentially valuable information about accident sequences that typically is lost in the HA process.

  7. An Approach of Uncertainty Evaluation for Thermal-Hydraulic Analysis

    SciTech Connect

    Katsunori Ogura; Hisashi Ninokata

    2002-07-01

    An approach to evaluate uncertainty systematically for thermal-hydraulic analysis programs is demonstrated. The approach is applied to the Peach Bottom Unit 2 Turbine Trip 2 Benchmark and is validated. (authors)

  8. A Stochastic Collocation Algorithm for Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Mathelin, Lionel; Hussaini, M. Yousuff; Zang, Thomas A. (Technical Monitor)

    2003-01-01

    This report describes a stochastic collocation method to adequately handle physically intrinsic uncertainty in the variables of a numerical simulation. For instance, while the standard Galerkin approach to Polynomial Chaos requires multi-dimensional summations over the stochastic basis functions, the stochastic collocation method allows those summations to be collapsed into a single one-dimensional summation. This report furnishes the essential algorithmic details of the new stochastic collocation method and provides, as a numerical example, the solution of the Riemann problem with the stochastic collocation method used for the discretization of the stochastic parameters.
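
    As a hedged, one-dimensional illustration of collocation, the sketch below evaluates a toy model at Gauss-Hermite nodes and recovers the output mean and variance for a standard normal input; the model function is an invented example, not the report's Riemann problem.

```python
import numpy as np

# Toy model of a single uncertain parameter xi ~ N(0, 1); purely illustrative.
def model(xi):
    return np.exp(0.3 * xi) + 0.1 * xi**2

# Nodes/weights for the probabilists' Hermite (Gauss-HermiteE) quadrature rule,
# whose weight function matches the standard normal density up to a constant.
nodes, weights = np.polynomial.hermite_e.hermegauss(8)
weights = weights / np.sqrt(2.0 * np.pi)   # normalize so the weights sum to 1

values = model(nodes)                       # one deterministic solve per collocation node
mean = np.sum(weights * values)
var = np.sum(weights * (values - mean) ** 2)
print(f"mean ≈ {mean:.4f}, variance ≈ {var:.4f}")
```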

  9. A stochastic approach to uncertainty quantification in residual moveout analysis

    NASA Astrophysics Data System (ADS)

    Johng-Ay, T.; Landa, E.; Dossou-Gbété, S.; Bordes, L.

    2015-06-01

    Oil and gas exploration and production usually rely on the interpretation of a single seismic image, which is obtained from observed data. However, the statistical nature of seismic data and the various approximations and assumptions are sources of uncertainties which may corrupt the evaluation of parameters. The quantification of these uncertainties is a major issue, intended to support decisions that have important social and commercial implications. The residual moveout analysis, which is an important step in seismic data processing, is usually performed by a deterministic approach. In this paper we discuss a Bayesian approach to the uncertainty analysis.

  10. Uncertainty Analysis of Knowledge Reductions in Rough Sets

    PubMed Central

    Zhang, Nan

    2014-01-01

    Uncertainty analysis is a vital issue in intelligent information processing, especially in the age of big data. Rough set theory has attracted much attention in this field since it was proposed. Relative reduction is an important problem in rough set theory. Different relative reductions have been investigated for preserving specific classification abilities in various applications. This paper examines the uncertainty analysis of five different relative reductions in four aspects, that is, the relationship between reducts, boundary region granularity, rule variance, and uncertainty measure, according to a constructed decision table. PMID:25258725

  11. Deconvolution of variability and uncertainty in the Cassini safety analysis

    NASA Astrophysics Data System (ADS)

    Kampas, Frank J.; Loughin, Stephen

    1998-01-01

    The standard method for propagation of uncertainty in a risk analysis requires rerunning the risk calculation numerous times with model parameters chosen from their uncertainty distributions. This was not practical for the Cassini nuclear safety analysis, due to the computationally intense nature of the risk calculation. A less computationally intense procedure was developed which requires only two calculations for each accident case. The first of these is the standard ``best-estimate'' calculation. In the second calculation, variables and parameters change simultaneously. The mathematical technique of deconvolution is then used to separate out an uncertainty multiplier distribution, which can be used to calculate distribution functions at various levels of confidence.

  12. Sensitivity analysis for handling uncertainty in an economic evaluation.

    PubMed

    Limwattananon, Supon

    2014-05-01

    To meet updated international standards, this paper revises the previous Thai guidelines for conducting sensitivity analyses as part of the decision analysis model for health technology assessment. It recommends both deterministic and probabilistic sensitivity analyses to handle uncertainty in the model parameters, which are best represented graphically. Two new methodological issues are introduced: a threshold analysis of medicines' unit prices for fulfilling the National Lists of Essential Medicines' requirements, and the expected value of information for delaying decision-making in contexts where there are high levels of uncertainty. Further research is recommended where parameter uncertainty is significant and where the cost of conducting the research is not prohibitive. PMID:24964700
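
    As a hedged illustration of the expected value of information idea mentioned above, the sketch below runs a probabilistic sensitivity analysis for two hypothetical strategies and computes the per-patient expected value of perfect information (EVPI); the net monetary benefit distributions are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(21)

# Simulated net monetary benefit (NMB) draws for two hypothetical strategies.
n = 100_000
nmb = np.column_stack([
    rng.normal(52_000, 6_000, n),   # strategy A
    rng.normal(50_000, 9_000, n),   # strategy B (cheaper on average but more uncertain)
])

expected_per_strategy = nmb.mean(axis=0)
value_with_current_info = expected_per_strategy.max()   # choose the best strategy on average
value_with_perfect_info = nmb.max(axis=1).mean()        # choose the best strategy in every draw

evpi = value_with_perfect_info - value_with_current_info
print(f"best strategy under current information: {expected_per_strategy.argmax()}")
print(f"EVPI per patient: {evpi:,.0f}")
```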

  13. Uncertainty analysis technique for OMEGA Dante measurements

    NASA Astrophysics Data System (ADS)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
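
    As a hedged sketch of the Monte Carlo parameter-variation idea described above, the example below perturbs 18 channel voltages within assumed one-sigma errors, applies a placeholder unfold, and takes statistics of the resulting fluxes; the 'unfold' here is a stand-in weighted sum, not the actual Dante algorithm, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

n_channels = 18
measured_v = rng.uniform(0.1, 1.0, n_channels)   # nominal channel voltages (placeholder)
rel_sigma = np.full(n_channels, 0.05)            # assumed 1-sigma fractional error per channel
response = rng.uniform(0.5, 1.5, n_channels)     # placeholder channel response weights

def unfold(voltages):
    """Placeholder for the spectral unfold: returns a scalar 'flux'."""
    return np.sum(voltages / response)

# One thousand test voltage sets, each perturbed within the channel error functions.
trials = np.array([
    unfold(measured_v * (1.0 + rng.normal(0.0, rel_sigma)))
    for _ in range(1000)
])
print(f"flux = {trials.mean():.3f} ± {trials.std(ddof=1):.3f} (1-sigma)")
```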

  14. Uncertainty Analysis Technique for OMEGA Dante Measurements

    SciTech Connect

    May, M J; Widmann, K; Sorce, C; Park, H; Schneider, M

    2010-05-07

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums, etc.) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  15. Uncertainty analysis technique for OMEGA Dante measurements

    SciTech Connect

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-15

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  16. MOUSE (MODULAR ORIENTED UNCERTAINTY SYSTEM): A COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM (FOR MICROCOMPUTERS)

    EPA Science Inventory

    Environmental engineering calculations involving uncertainties, either in the model itself or in the data, are far beyond the capabilities of conventional analysis for any but the simplest of models. There exist a number of general-purpose computer simulation languages, using Mon...

  17. Uncertainty Analysis via Failure Domain Characterization: Polynomial Requirement Functions

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Munoz, Cesar A.; Narkawicz, Anthony J.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are comprised of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. A Bernstein expansion approach is used to size hyper-rectangular subsets while a sum of squares programming approach is used to size quasi-ellipsoidal subsets. These methods are applicable to requirement functions whose functional dependency on the uncertainty is a known polynomial. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the uncertainty model assumed (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.

  18. Uncertainty Analysis via Failure Domain Characterization: Unrestricted Requirement Functions

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are comprised of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. The methods developed herein, which are based on nonlinear constrained optimization, are applicable to requirement functions whose functional dependency on the uncertainty is arbitrary and whose explicit form may even be unknown. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the assumed uncertainty model (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.

  19. The Role of Uncertainty in Aerospace Vehicle Analysis and Design

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Crespo, Luis G.

    2011-01-01

    Effective uncertainty quantification (UQ) begins at the earliest design phase for which there are adequate models and continues, tightly integrated with the analysis and design cycles, as the refinement of the models and the fidelity of the tools increase. It is essential that uncertainty quantification strategies provide objective information to support the processes of identifying, analyzing, and accommodating the effects of uncertainty. Assessments of uncertainty should never render the results more difficult for engineers and decision makers to comprehend, but instead provide them with critical information to assist with resource utilization decisions and risk mitigation strategies. Success would be measured by tools that enable engineers and decision makers to effectively balance critical project resources against system requirements while accounting for the impact of uncertainty.

  20. Uncertainty analysis of geothermal energy economics

    NASA Astrophysics Data System (ADS)

    Sener, Adil Caner

    This dissertation research explores geothermal energy economics by assessing and quantifying the uncertainties associated with the nature of geothermal energy and with energy investments overall. The study introduces a stochastic geothermal cost model and a valuation approach for different geothermal power plant development scenarios. The Monte Carlo simulation technique is employed to obtain probability distributions of geothermal energy development costs and project net present values. In the study, a stochastic cost model with an incorporated dependence structure is defined and compared with a model in which random variables are treated as independent inputs. One of the goals of the study is to shed light on the long-standing problem of modeling dependence between random input variables; this dependence is modeled by employing the method of copulas. The study focuses on four main types of geothermal power generation technologies and introduces a stochastic levelized cost model for each technology. Moreover, the levelized costs of natural gas combined cycle and coal-fired power plants are compared with those of geothermal power plants. The input data used in the model rely on cost data recently reported by government agencies and non-profit organizations, such as the Department of Energy, national laboratories, the California Energy Commission, and the Geothermal Energy Association. The second part of the study introduces the stochastic discounted cash flow valuation model for the geothermal technologies analyzed in the first phase. In this phase of the study, the Integrated Planning Model (IPM) software was used to forecast the revenue streams of geothermal assets under different price and regulation scenarios. These results are then combined to create a stochastic revenue forecast for the power plants. The uncertainties in gas prices and environmental regulations will be modeled and their potential impacts will be
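
    As a hedged sketch of copula-based dependence between cost model inputs, the example below uses a Gaussian copula to couple two hypothetical inputs of a levelized-cost calculation (well cost and well productivity) and propagates them with Monte Carlo sampling; the marginals, correlation, and cost proxy are illustrative assumptions, not values from the dissertation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(13)

n = 50_000
rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])

# Gaussian copula: correlated standard normals -> uniforms -> desired marginals.
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
u = stats.norm.cdf(z)
well_cost = stats.lognorm(s=0.3, scale=4.0e6).ppf(u[:, 0])            # USD per well
productivity = stats.triang(c=0.4, loc=2.0, scale=6.0).ppf(u[:, 1])   # MW per well

# Toy levelized-cost proxy: capital cost per unit of installed capacity.
cost_per_mw = well_cost / productivity
p5, p95 = np.percentile(cost_per_mw, [5, 95])
print(f"median {np.median(cost_per_mw):,.0f} USD/MW, P5-P95 [{p5:,.0f}, {p95:,.0f}]")
```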

  1. Analysis of automated highway system risks and uncertainties. Volume 5

    SciTech Connect

    Sicherman, A.

    1994-10-01

    This volume describes a risk analysis performed to help identify important Automated Highway System (AHS) deployment uncertainties and quantify their effect on costs and benefits for a range of AHS deployment scenarios. The analysis identified a suite of key factors affecting vehicle and roadway costs, capacities and market penetrations for alternative AHS deployment scenarios. A systematic protocol was utilized for obtaining expert judgments of key factor uncertainties in the form of subjective probability percentile assessments. Based on these assessments, probability distributions on vehicle and roadway costs, capacity and market penetration were developed for the different scenarios. The cost/benefit risk methodology and analysis provide insights by showing how uncertainties in key factors translate into uncertainties in summary cost/benefit indices.

  2. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 1: Main report

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models.

  3. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harper, F.T.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models.

  4. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Haskin, F.E.; Harper, F.T.; Goossens, L.H.J.; Kraan, B.C.P.; Grupa, J.B.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models.

  5. Proceedings of the CEC/USDOE workshop on uncertainty analysis

    SciTech Connect

    Elderkin, C.E. ); Kelly, G.N. )

    1990-09-01

    In recent years it has become increasingly important to specify the uncertainty inherent in consequence assessments and in the models that trace radionuclides from their source, through the environment, to their impacts on human health. European and US scientists have been independently developing and applying methods for analyzing uncertainty. It recently became apparent that a scientific exchange on this subject would be beneficial as improvements are sought and as uncertainty methods find broader application. The Commission of the European Communities (CEC) and the Office of Health and Environmental Research of the US Department of Energy (OHER/DOE), through their continuing agreement for cooperation, decided to co-sponsor the CEC/USDOE Workshop on Uncertainty Analysis. CEC's Radiation Protection Research Programme and OHER's Atmospheric Studies in Complex Terrain Program collaborated in planning and organizing the workshop, which was held in Santa Fe, New Mexico, on November 13 through 16, 1989. As the workshop progressed, the perspectives of individual participants, each with their particular background and interests in some segment of consequence assessment and its uncertainties, contributed to a broader view of how uncertainties are introduced and handled. These proceedings contain, first, the editors' introduction to the problem of uncertainty analysis and their general summary and conclusions. These are then followed by the results of the working groups and the abstracts of individual presentations.

  6. Uncertainty of calculation results in vehicle collision analysis.

    PubMed

    Wach, Wojciech; Unarski, Jan

    2007-04-11

    In the analysis of road accidents two types of calculation result uncertainty can be distinguished: modelling uncertainty and uncertainty in calculation results [R.M. Brach, M. Brach, Vehicle Accident Analysis & Reconstruction Methods, SAE International Publisher, Warrendale, 2005]. The problem becomes especially important when minor modifications of input parameters or the application of different models of the phenomenon lead to a fundamentally different answer to the question posed by the court. The aim of the paper was to prove the necessity of including the problem of uncertainty in calculations related to vehicle collision mechanics and to justify the application of different error analysis methods recommendable in vehicle collision reconstruction. The data file from crash test No. 7 [H. Burg, M. Lindenmann, Unfallversuche, Verlag Information Ambs, Kippenheim, 1982] was used, with the selection restricted to the range of data typical of average police records of a collision scene. Collision speeds were calculated using two methods: reconstruction and simulation. The analysis of uncertainty was carried out. Maximum and mean square uncertainty were calculated by means of the total differential of the relevant formulas. Since the reconstruction resulted in very broad error intervals of uniform distribution, additional calculations were performed by the Monte Carlo method using the algorithm described in [W. Wach, J. Unarski, Determination of vehicle velocities and collision location by means of Monte Carlo simulation method, Special Publication Accident Reconstruction SP-1999, SAE Paper No. 2006-01-0907, 2006]. PMID:16884874
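
    A minimal sketch of the two propagation styles mentioned in the abstract: maximum and mean-square (root-sum-square) uncertainty from a total differential, plus a Monte Carlo alternative. The skid-to-speed relation and the parameter values are simplified placeholders, not the reconstruction model used in the paper.

        import numpy as np

        # Toy relation: pre-impact speed from skid length d and friction mu,
        # v = sqrt(2 * mu * g * d). Values and tolerances are illustrative only.
        g = 9.81
        d, dd = 25.0, 2.0      # skid length [m] and its uncertainty
        mu, dmu = 0.7, 0.1     # friction coefficient and its uncertainty

        v = np.sqrt(2 * mu * g * d)
        dv_dd = mu * g / v     # partial derivatives for the total differential
        dv_dmu = g * d / v

        dv_max = abs(dv_dd) * dd + abs(dv_dmu) * dmu  # maximum uncertainty
        dv_rss = np.hypot(dv_dd * dd, dv_dmu * dmu)   # mean-square (RSS) uncertainty

        # Monte Carlo alternative with uniform input intervals.
        rng = np.random.default_rng(0)
        d_s = rng.uniform(d - dd, d + dd, 100_000)
        mu_s = rng.uniform(mu - dmu, mu + dmu, 100_000)
        v_s = np.sqrt(2 * mu_s * g * d_s)

        print(f"v = {v:.2f} m/s, max +/-{dv_max:.2f}, rss +/-{dv_rss:.2f}")
        print("MC 2.5th-97.5th percentiles:", np.percentile(v_s, [2.5, 97.5]).round(2))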

  7. New challenges on uncertainty propagation assessment of flood risk analysis

    NASA Astrophysics Data System (ADS)

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards such as floods cause considerable damage to human life and to material and functional assets every year around the world. Risk assessment procedures carry a set of uncertainties, mainly of two types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with a lack of knowledge or with shortcomings in the procedures used to study these processes. There is abundant scientific and technical literature on estimating uncertainties in each individual step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience on the propagation of uncertainties through the full flood risk assessment. Epistemic uncertainties are therefore the main goal of this work: in particular, understanding how far uncertainties propagate through the process, from inundation studies to risk analysis, and how much a proper flood risk analysis can vary as a result. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments or Monte Carlo are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation) and socio-demographic data (damage estimation), in order to evaluate the uncertainty propagation (UP) in design flood risk estimation, in both its numerical and cartographic expression. To account for the total uncertainty and to understand which factors contribute most to the final uncertainty, we used the method of Polynomial Chaos Theory (PCT). It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process: PCT allows the development of a probabilistic model of the system in a deterministic setting, using random variables and polynomials to handle the effects of uncertainty. Results of the method application show better robustness than traditional analysis

  8. Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    In order to provide a complete description of a material's thermoelectric power factor, in addition to the measured nominal value, an uncertainty interval is required. The uncertainty may contain sources of measurement error including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error including sample preparation tolerance, measurement probe placement, thermocouple cold-finger effect, and measurement parameters, in addition to uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.

  9. Uncertainty and sensitivity analysis and its applications in OCD measurements

    NASA Astrophysics Data System (ADS)

    Vagos, Pedro; Hu, Jiangtao; Liu, Zhuan; Rabello, Silvio

    2009-03-01

    This article describes an Uncertainty & Sensitivity Analysis package, a mathematical tool that can be an effective time-shortcut for optimizing OCD models. By including real system noises in the model, an accurate method for predicting measurement uncertainties is shown. The assessment, at an early stage, of the uncertainties, sensitivities and correlations of the parameters to be measured guides the user in the optimization of the OCD measurement strategy. Real examples are discussed, revealing common pitfalls such as hidden correlations, and simulation results are compared with real measurements. Special emphasis is given to two different cases: (1) the optimization of the data set of multi-head metrology tools (NI-OCD, SE-OCD), and (2) the optimization of the azimuth measurement angle in SE-OCD. With the uncertainty and sensitivity analysis results, the right data set and measurement mode (NI-OCD, SE-OCD or NI+SE OCD) can be easily selected to achieve the best OCD model performance.

  10. Analysis of uncertainties in turbine metal temperature predictions

    NASA Technical Reports Server (NTRS)

    Stepka, F. S.

    1980-01-01

    An analysis was conducted to examine the extent to which various factors influence the accuracy of analytically predicting turbine blade metal temperatures and to determine the uncertainties in these predictions for several accuracies of the influence factors. The advanced turbofan engine gas conditions of 1700 K and 40 atmospheres were considered along with those of a highly instrumented high temperature turbine test rig and a low temperature turbine rig that simulated the engine conditions. The analysis showed that the uncertainty in analytically predicting local blade temperature was as much as 98 K, or 7.6 percent of the metal absolute temperature, with current knowledge of the influence factors. The expected reductions in uncertainties in the influence factors with additional knowledge and tests should reduce the uncertainty in predicting blade metal temperature to 28 K, or 2.1 percent of the metal absolute temperature.

  11. INSPECTION SHOP: PLAN TO PROVIDE UNCERTAINTY ANALYSIS WITH MEASUREMENTS

    SciTech Connect

    Nederbragt, W W

    2006-12-20

    The LLNL inspection shop is chartered to make dimensional measurements of components for critical programmatic experiments. These measurements ensure that components are within tolerance and provide geometric details that can be used to further refine simulations. For these measurements to be useful, they must be significantly more accurate than the tolerances that are being checked. For example, if a part has a specified dimension of 100 millimeters and a tolerance of 1 millimeter, then the precision and/or accuracy of the measurement should be less than 1 millimeter. Using the "10-to-1 gaugemaker's rule of thumb", the desired precision of the measurement should be less than 100 micrometers. Currently, the process for associating measurement uncertainty with data is not standardized, nor is the uncertainty based on a thorough uncertainty analysis. The goal of this project is to begin providing measurement uncertainty statements with critical measurements performed in the inspection shop. To accomplish this task, the underlying sources of uncertainty for measurement instruments need to be comprehensively understood and quantified. Moreover, measurements of elemental uncertainties for each physical source need to be combined in a meaningful way to obtain an overall measurement uncertainty.

  12. Uncertainty analysis for common Seebeck and electrical resistivity measurement systems.

    PubMed

    Mackey, Jon; Dynys, Frederick; Sehirlioglu, Alp

    2014-08-01

    This work establishes the level of uncertainty for electrical measurements commonly made on thermoelectric samples. The analysis targets measurement systems based on the four probe method. Sources of uncertainty for both electrical resistivity and Seebeck coefficient were identified and evaluated. Included are reasonable estimates of the magnitude of each source, and the cumulative propagation of error. Uncertainty for the Seebeck coefficient includes the cold-finger effect, which has been quantified with thermal finite element analysis. The cold-finger effect, which is a result of parasitic heat transfer down the thermocouple probes, leads to an asymmetric over-estimation of the Seebeck coefficient. A silicon germanium thermoelectric sample has been characterized to provide an understanding of the total measurement uncertainty. The electrical resistivity was determined to contain an uncertainty of ±7.0% at any measurement temperature. The uncertainty of the Seebeck coefficient is +1.0%/-13.1% at high temperature and ±1.0% near room temperature. The power factor has a combined uncertainty of +7.3%/-27.0% at high temperature and ±7.5% near room temperature. These ranges are calculated to be typical values for a general four probe Seebeck and resistivity measurement configuration. PMID:25173324
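
    The power-factor figures quoted above are consistent with the usual propagation rule for PF = S^2 / rho, in which the relative Seebeck uncertainty enters twice. The sketch below reproduces that root-sum-square combination; it is a generic illustration of the rule, not the authors' code.

        import numpy as np

        def power_factor_uncertainty(seebeck_rel_pct, resistivity_rel_pct):
            # Root-sum-square relative uncertainty (in %) of PF = S^2 / rho.
            # The Seebeck term is doubled because PF depends on S squared.
            return np.hypot(2.0 * seebeck_rel_pct, resistivity_rel_pct)

        # Values quoted in the abstract: Seebeck +1.0%/-13.1% at high temperature,
        # resistivity +/-7.0% at any temperature.
        upper = power_factor_uncertainty(1.0, 7.0)    # approx. +7.3%
        lower = power_factor_uncertainty(13.1, 7.0)   # approx. -27%
        print(f"power factor uncertainty: +{upper:.1f}% / -{lower:.1f}%")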

  13. Effect of material uncertainties on dynamic analysis of piezoelectric fans

    NASA Astrophysics Data System (ADS)

    Srivastava, Swapnil; Yadav, Shubham Kumar; Mukherjee, Sujoy

    2015-04-01

    A piezofan is a resonant device that uses a piezoceramic material to induce oscillations in a cantilever beam. In this study, lumped-mass modelling is used to analyze a piezoelectric fan. Uncertainties are associated with piezoelectric structures for several reasons, such as variation during the manufacturing process, temperature, and the presence of an adhesive layer between the piezoelectric actuator/sensor and the shim stock. The presence of uncertainty in the piezoelectric materials can influence the dynamic behavior of the piezoelectric fan, such as its natural frequency and tip deflection. Moreover, these quantities will also affect the performance parameters of the piezoelectric fan. Uncertainty analysis is performed using classical Monte Carlo simulation (MCS). It is found that the propagation of uncertainty causes significant deviations from the baseline deterministic predictions, which also affect the achievable performance of the piezofan. The numerical results in this paper provide useful bounds on several performance parameters of the cooling fan and will enhance confidence in the design process.

  14. Analysis and Reduction of Complex Networks Under Uncertainty

    SciTech Connect

    Knio, Omar M

    2014-04-09

    This is a collaborative proposal that aims at developing new methods for the analysis and reduction of complex multiscale networks under uncertainty. The approach is based on combining methods of computational singular perturbation (CSP) and probabilistic uncertainty quantification. In deterministic settings, CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty raises fundamentally new issues, particularly concerning its impact on the topology of slow manifolds, and means to represent and quantify associated variability. To address these challenges, this project uses polynomial chaos (PC) methods to reformulate uncertain network models, and to analyze them using CSP in probabilistic terms. Specific objectives include (1) developing effective algorithms that can be used to illuminate fundamental and unexplored connections among model reduction, multiscale behavior, and uncertainty, and (2) demonstrating the performance of these algorithms through applications to model problems.

  15. Uncertainty Analysis and Parameter Estimation For Nearshore Hydrodynamic Models

    NASA Astrophysics Data System (ADS)

    Ardani, S.; Kaihatu, J. M.

    2012-12-01

    Numerical models are deterministic representations of the relevant physical processes in the nearshore. The complexity of the model physics and the uncertainty in the model inputs compel us to apply a stochastic approach to analyze the robustness of the model. The Bayesian inverse problem is one powerful way to estimate the important input model parameters (identified by an a priori sensitivity analysis) and can be used for uncertainty analysis of the outputs. Bayesian techniques can be used to find the range of most probable parameters based on the probability of the observed data and the residual errors. In this study, the effects of input data involving lateral (Neumann) boundary conditions, bathymetry and offshore wave conditions on nearshore numerical models are considered. Monte Carlo simulation is applied to a deterministic numerical model (the Delft3D modeling suite for coupled waves and flow) for uncertainty analysis of the outputs (wave height, flow velocity, mean sea level, etc.). Uncertainty analysis of the outputs is performed by random sampling from the input probability distribution functions and running the model as many times as required until consistent, converged results are achieved. The case study used in this analysis is the Duck94 experiment, conducted at the U.S. Army Field Research Facility at Duck, North Carolina, USA in the fall of 1994. The joint probability of the model parameters relevant for the Duck94 experiments will be found using the Bayesian approach. We will further show that, by using Bayesian techniques to estimate the optimized model parameters as inputs and applying them in the uncertainty analysis, we can obtain more consistent results than by using prior information for the input data: the variation of the uncertain parameters decreases and the probability of the observed data improves as well. Keywords: Monte Carlo Simulation, Delft3D, uncertainty analysis, Bayesian techniques

  16. Parameter uncertainty analysis of a biokinetic model of caesium.

    PubMed

    Li, W B; Klein, W; Blanchardon, E; Puncher, M; Leggett, R W; Oeh, U; Breustedt, B; Noßke, D; Lopez, M A

    2015-01-01

    Parameter uncertainties for the biokinetic model of caesium (Cs) developed by Leggett et al. were inventoried and evaluated. The methods of parameter uncertainty analysis were used to assess the uncertainties of model predictions under assumed model parameter uncertainties and distributions. Furthermore, the importance of individual model parameters was assessed by means of sensitivity analysis. The calculated uncertainties of model predictions were compared with human data of Cs measured in blood and in the whole body. It was found that propagating the derived uncertainties in model parameter values reproduced the range of bioassay data observed in human subjects at different times after intake. The maximum ranges, expressed as uncertainty factors (UFs) (defined as the square root of the ratio between the 97.5th and 2.5th percentiles) of blood clearance, whole-body retention and urinary excretion of Cs predicted at early times after intake were, respectively: 1.5, 1.0 and 2.5 on the first day; 1.8, 1.1 and 2.4 at Day 10; and 1.8, 2.0 and 1.8 at Day 100. For late times (1000 d) after intake, the UFs increased to 43, 24 and 31, respectively. The transfer rates between kidneys and blood and between muscle and blood, and the rate of transfer from kidneys to urinary bladder content, are the parameters most influential on the blood clearance and on the whole-body retention of Cs. For the urinary excretion, the transfer rates from urinary bladder content to urine and from kidneys to urinary bladder content have the greatest impact. The implications of the larger uncertainty factor of 43 in whole-body retention at later times (after Day 500) for the estimated equivalent and effective doses will be explored in subsequent work within the framework of EURADOS. PMID:24743755
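
    The uncertainty factor used above has a simple operational definition: the square root of the ratio of the 97.5th to the 2.5th percentile of the predicted quantity. The fragment below shows how it would be computed from Monte Carlo samples of a prediction; the lognormal stand-in for the prediction is illustrative only.

        import numpy as np

        def uncertainty_factor(samples):
            # UF = sqrt(P97.5 / P2.5) for a positive-valued model prediction.
            p2_5, p97_5 = np.percentile(samples, [2.5, 97.5])
            return np.sqrt(p97_5 / p2_5)

        # Illustrative stand-in for sampled whole-body retention at one time point.
        rng = np.random.default_rng(1)
        prediction_samples = rng.lognormal(mean=0.0, sigma=0.3, size=50_000)
        print(f"UF = {uncertainty_factor(prediction_samples):.2f}")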

  17. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimated microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors but also their interactions to be estimated. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate count methods. PMID:26456251

  18. Uncertainty Analysis for RELAP5-3D

    SciTech Connect

    Aaron J. Pawel; Dr. George L. Mesina

    2011-08-01

    In its current state, RELAP5-3D is a 'best-estimate' code; it is one of our most reliable programs for modeling what occurs within reactor systems in transients from given initial conditions. This code, however, remains an estimator. A statistical analysis has been performed that begins to lay the foundation for a full uncertainty analysis. By varying the inputs over assumed probability density functions, the output parameters were shown to vary. Using statistical tools such as means, variances, and tolerance intervals, a picture was obtained of how uncertain the results are, given the uncertainty of the inputs.
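
    One common recipe for turning repeated code runs into the statistics mentioned above (means, variances, tolerance intervals) is Wilks' order-statistics rule: for a one-sided 95/95 tolerance bound, 59 runs with randomly sampled inputs suffice and the bound is the largest observed output. The sketch below illustrates only that bookkeeping with a stand-in model function; it is not RELAP5-3D, and the cited analysis may have used a different recipe.

        import numpy as np

        def surrogate_code(power, flow):
            # Stand-in for one run of the thermal-hydraulics code: a single
            # figure of merit as a function of two uncertain inputs (illustrative).
            return 600.0 + 50.0 * power / flow

        rng = np.random.default_rng(7)
        n_runs = 59  # Wilks' 95/95 one-sided sample size for the maximum order statistic

        # Vary the inputs over assumed probability density functions.
        powers = rng.normal(loc=1.0, scale=0.03, size=n_runs)
        flows = rng.normal(loc=1.0, scale=0.05, size=n_runs)
        outputs = surrogate_code(powers, flows)

        print(f"mean: {outputs.mean():.2f}, std: {outputs.std(ddof=1):.2f}")
        print(f"95/95 one-sided upper tolerance bound: {outputs.max():.2f}")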

  19. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through the uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value of and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of the model development, so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses). Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping

  20. Assessment of Uncertainties Related to Seismic Hazard Using Fuzzy Analysis

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, N.; Yokoi, T.; Javakhishvili, Z.

    2013-05-01

    Seismic hazard analysis has become a very important issue in the last few decades. Recently, improved technologies and newly available data have helped many scientists to understand where and why earthquakes happen, the physics of earthquakes, etc., and to appreciate the role of uncertainty in seismic hazard analysis. However, there is still a significant problem of how to handle existing uncertainty; the same lack of information makes it difficult to quantify uncertainty accurately. Attenuation curves are usually obtained in a statistical way, by regression analysis. Statistical and probabilistic analysis shows overlapping results for the site coefficients. This overlapping takes place not only at the border between two neighboring classes, but also among more than three classes. Although the analysis starts by classifying sites using geological terms, these site coefficients are not classified at all. In the present study, this problem is addressed using fuzzy set theory: by using membership functions, the ambiguities at the border between neighboring classes can be avoided. Fuzzy set theory is applied to southern California and compared with the conventional approach. In this study, the standard deviations that show the variation within each site class obtained by fuzzy set theory and by the classical approach are compared. The results show that, when data are insufficient for hazard assessment, site classification based on fuzzy set theory yields standard deviations smaller than those obtained by the classical approach, which is direct evidence of less uncertainty.

  1. Planning for robust reserve networks using uncertainty analysis

    USGS Publications Warehouse

    Moilanen, A.; Runge, M.C.; Elith, J.; Tyre, A.; Carmel, Y.; Fegraus, E.; Wintle, B.A.; Burgman, M.; Ben-Haim, Y.

    2006-01-01

    Planning land-use for biodiversity conservation frequently involves computer-assisted reserve selection algorithms. Typically such algorithms operate on matrices of species presence-absence in sites, or on species-specific distributions of model predicted probabilities of occurrence in grid cells. There are practically always errors in input data: erroneous species presence-absence data, structural and parametric uncertainty in predictive habitat models, and lack of correspondence between temporal presence and long-run persistence. Despite these uncertainties, typical reserve selection methods proceed as if there is no uncertainty in the data or models. Having two conservation options of apparently equal biological value, one would prefer the option whose value is relatively insensitive to errors in planning inputs. In this work we show how uncertainty analysis for reserve planning can be implemented within a framework of information-gap decision theory, generating reserve designs that are robust to uncertainty. Consideration of uncertainty involves modifications to the typical objective functions used in reserve selection. Search for robust-optimal reserve structures can still be implemented via typical reserve selection optimization techniques, including stepwise heuristics, integer-programming and stochastic global search.

  2. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.

  3. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990; these codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, with experts developing their distributions independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

  4. Visual Scanning Hartmann Optical Tester (VSHOT) Uncertainty Analysis (Milestone Report)

    SciTech Connect

    Gray, A.; Lewandowski, A.; Wendelin, T.

    2010-10-01

    In 1997, an uncertainty analysis was conducted of the Video Scanning Hartmann Optical Tester (VSHOT). In 2010, we completed a new analysis, based primarily on the geometric optics of the system, and it shows sensitivities to various design and operational parameters. We discuss sources of error with measuring devices, instrument calibrations, and operator measurements for a parabolic trough mirror panel test. These help to guide the operator in proper setup, and help end-users to understand the data they are provided. We include both the systematic (bias) and random (precision) errors for VSHOT testing and their contributions to the uncertainty. The contributing factors we considered in this study are: target tilt; target face to laser output distance; instrument vertical offset; laser output angle; distance between the tool and the test piece; camera calibration; and laser scanner. These contributing factors were applied to the calculated slope error, focal length, and test article tilt that are generated by the VSHOT data processing. Results show the estimated 2-sigma uncertainty in slope error for a parabolic trough line scan test to be +/-0.2 milliradians; uncertainty in the focal length is +/- 0.1 mm, and the uncertainty in test article tilt is +/- 0.04 milliradians.
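
    Quoted figures like the 2-sigma slope-error uncertainty typically come from combining independent systematic and random contributors in quadrature. The sketch below shows that combination with hypothetical contributor magnitudes; the actual magnitudes for each contributor are in the report, not here, and the report's combination procedure may differ.

        import numpy as np

        # Hypothetical 1-sigma slope-error contributions [mrad] (placeholders only;
        # the report lists the real contributors and their magnitudes).
        contributions_mrad = {
            "target tilt": 0.04,
            "target-to-laser distance": 0.03,
            "instrument vertical offset": 0.02,
            "laser output angle": 0.05,
            "tool-to-test-piece distance": 0.03,
            "camera calibration": 0.04,
            "laser scanner": 0.05,
        }

        one_sigma = np.sqrt(sum(v**2 for v in contributions_mrad.values()))
        print(f"combined slope-error uncertainty: +/-{2 * one_sigma:.2f} mrad (2-sigma)")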

  5. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  6. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Boardman, J.; Jones, J.A.; Harper, F.T.; Young, M.L.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  7. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Haskin, F.E.; Harper, F.T.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  8. MOUSE: A COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM OPERATIONAL MANUAL (DISKETTE)

    EPA Science Inventory

    Environmental engineering calculations involving uncertainties, either in the model itself or in the data, are far beyond the capabilities of conventional analysis for any but the simplest of models. There exist a number of general-purpose computer simulation languages, using Mon...

  9. UNCERTAINTY ANALYSIS OF RUNOFF ESTIMATES FROM A RUNOFF CONTOUR MAP

    EPA Science Inventory

    The US EPA, in cooperation with the USGS, conducted an analysis to quantify the uncertainty associated with interpolating runoff to specific sites using a runoff contour map. We interpolated runoff to 93 gaged watersheds from a runoff contour map using 1) hand interpolation to the w...

  10. Practitioner Representations of Environmental Uncertainty: An Application of Discriminant Analysis.

    ERIC Educational Resources Information Center

    Acharya, Lalit

    Multiple discriminant analysis was used to analyze the structure of a perceived environmental uncertainty variable employed previously in research on public relations roles. Data came from a subset (N=229) of a national sample of public relations practitioners belonging to the Public Relations Society of America, who completed a set of scaled…

  11. An educational model for ensemble streamflow simulation and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    AghaKouchak, A.; Nakhjiri, N.; Habib, E.

    2013-02-01

    This paper presents the hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. The HBV-Ensemble can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insights into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.

  12. The effects of uncertainty on the analysis of atmospheric deposition

    SciTech Connect

    Bloyd, C.N. ); Small, M.J.; Henrion, M.; Rubin, E.S. )

    1988-01-01

    Research efforts on the problem of acid rain are directed at improving current scientific understanding in critical areas, including sources of precursor emissions, the transport and transformation of pollutants in the atmosphere, the deposition of acidic species, and the chemical and biological effects of acid deposition on aquatic systems, materials, forests, crops and human health. The general goal of these research efforts is to characterize the current situation and to develop analytical models which can be used to predict the response of various systems to changes in critical parameters. This paper describes a framework which enables one to characterize uncertainty at each major stage of the modeling process. Following a general presentation of the modeling framework, a description is given of the methods chosen to characterize uncertainty for each major step. Analysis is then performed to illustrate the effects of uncertainty on future lake acidification in the Adirondack Park area of upstate New York.

  13. Uncertainty analysis on photogrammetry-derived national shoreline

    NASA Astrophysics Data System (ADS)

    Yao, Fang

    Photogrammetric shoreline mapping remains the primary method used by the National Geodetic Survey (NGS) of the National Oceanic and Atmospheric Administration (NOAA) for mapping the national shoreline. To date, NGS has not conducted a statistical analysis of the photogrammetry-derived shoreline uncertainty. The aim of this thesis is to develop and test a rigorous total propagated uncertainty (TPU) model for shoreline compiled from both tide-coordinated and non-tide-coordinated aerial imagery using photogrammetric methods. Survey imagery collected over a study site in northeast Maine was used to test the TPU model. The TPU model developed in this thesis can easily be extended to other areas and may facilitate estimation of uncertainty in inundation models and marsh migration models.

  14. Uncertainty analysis of a SFR core with sodium plenum

    SciTech Connect

    Canuti, E.; Ivanov, E.; Tiberi, V.; Pignet, S.

    2012-07-01

    New concepts of Sodium-cooled Fast Reactors (SFR) have to meet the Generation IV safety objectives. In this regard, the sodium void effect, as well as the uncertainties on it, has to be minimized for future large-size SFR projects. The Institute of Radiological Protection and Nuclear Safety (IRSN), as technical support to the French public authorities, is in charge of the safety assessment of operating and under-construction reactors, as well as of future projects. In order to give a statement on the safety of new SFR designs, IRSN must be able to evaluate core parameters and their uncertainties. In this framework, a sensitivity and uncertainty study has been performed to evaluate the impact of nuclear data uncertainties on the sodium void effect for the benchmark model of the large SFR BN-800. The benchmark parameters (effective multiplication factor and sodium void effect) have been evaluated using two codes, the deterministic code ERANOS and the Monte Carlo code SCALE, while the S/U analysis has been performed only with SCALE. The results of these studies point out the cross-section uncertainties that most affect the sodium void effect and where effort should be directed to improve existing nuclear data accuracy. (authors)

  15. A convolution integral approach for performance assessments with uncertainty analysis

    SciTech Connect

    Dawoud, E.; Miller, L.F.

    1999-09-01

    Performance assessments that include uncertainty analyses and risk assessments are typically not obtained for time-dependent releases of radioactive contaminants to the geosphere when a series of sequentially coupled transport models is required for determining results. This is due, in part, to the geophysical complexity of the site, and to the numerical complexity of the fate and transport models. The lack of a practical tool for linking the transport models in a fashion that facilitates uncertainty analysis is another reason for not performing uncertainty analyses in these studies. The multiconvolution integral (MCI) approach presented herein greatly facilitates the practicality of incorporating uncertainty analyses into performance assessments. In this research an MCI approach is developed, and the decoupling of fate and transport processes into an independent system is described. A conceptual model, extracted from the Inactive Tanks project at the Oak Ridge National Laboratory (ORNL), is used to demonstrate the approach. Numerical models are used for transport of 90Sr from a disposal facility, WC-1 at ORNL, through the vadose and saturated zones to a downgradient point at Fifth Creek, and an analytical surface water model is used to transport the contaminants to a downstream potential receptor point at White Oak Creek. The probability density functions of the final concentrations obtained by the MCI approach are in excellent agreement with those obtained by a Monte Carlo approach that propagated uncertainties through all submodels for each random sample.
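
    The core of the multiconvolution idea is that, once the coupled transport legs are treated as linear time-invariant systems, the breakthrough at a downstream point is the convolution of the upstream release history with each leg's unit-response function. The sketch below chains two discrete convolutions with made-up response curves; it is a schematic of the approach, not the ORNL site models.

        import numpy as np

        dt = 1.0                      # time step [yr]
        t = np.arange(0, 200, dt)

        # Hypothetical release history from the disposal facility [arbitrary units/yr].
        release = np.exp(-t / 30.0)

        def unit_response(t, mean, spread):
            # Illustrative unit-response (transfer) function of one transport leg,
            # normalized to unit area.
            h = np.exp(-0.5 * ((t - mean) / spread) ** 2)
            return h / (h.sum() * dt)

        h_groundwater = unit_response(t, mean=40.0, spread=10.0)
        h_surface_water = unit_response(t, mean=15.0, spread=5.0)

        # Chain the convolutions: source -> groundwater leg -> surface-water leg.
        leg1 = np.convolve(release, h_groundwater)[: t.size] * dt
        receptor = np.convolve(leg1, h_surface_water)[: t.size] * dt

        print(f"peak at receptor: {receptor.max():.3f} at t = {t[receptor.argmax()]:.0f} yr")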

  16. Probabilistic uncertainty analysis of laser/material thermal interactions

    NASA Astrophysics Data System (ADS)

    Pelaccio, Dennis George

    Performance of a system during heat-flux (laser-type) irradiation is of increasing importance to a variety of defense and commercial applications. For laser irradiation of spacecraft components, such as a laser power or propulsion system receiver, predicting the time and type of failure with accuracy is difficult. These difficulties arise from the inherently nonlinear nature of the problem, because surface reradiation heat transport mechanisms come into play as the system is heated. Additionally, there are uncertainties associated with the irradiation source intensity, interaction cross-section and view angle; the property state of the material(s) being heated; and the effective emissivity/absorptivity and surface radiation view factor(s). The physical properties of the materials on a spacecraft may also change greatly over time due to exposure to the space environment. To better understand the uncertainties associated with these issues, a study was performed at the University of New Mexico's Institute for Space and Nuclear Power Studies, under U.S. Air Force Phillips Laboratory sponsorship, to develop and apply uncertainty computer models for generic laser heating problems that incorporate probabilistic (Monte Carlo sampling based) design assessment methods. This work discusses in detail the background of the laser irradiation/material thermal interaction process; past work in related technical areas; the research objectives of the study; the technical approach employed; and the development and application of the generic one- and two-dimensional laser/material heating uncertainty interaction analysis models. This study successfully demonstrated an efficient uncertainty assessment methodology for simple laser irradiation/material thermal heating problems. Key parameter uncertainties were characterized and ranked for numerous example problem applications, and the influence of various Monte Carlo sampling

  17. Uncertainty analysis for regional-scale reserve selection.

    PubMed

    Moilanen, Atte; Wintle, Brendan A; Elith, Jane; Burgman, Mark

    2006-12-01

    Methods for reserve selection and conservation planning often ignore uncertainty. For example, presence-absence observations and predictions of habitat models are used as inputs but commonly assumed to be without error. We applied information-gap decision theory to develop uncertainty analysis methods for reserve selection. Our proposed method seeks a solution that is robust in achieving a given conservation target, despite uncertainty in the data. We maximized robustness in reserve selection through a novel method, "distribution discounting," in which the site- and species-specific measure of conservation value (related to species-specific occupancy probabilities) was penalized by an error measure (in our study, related to accuracy of statistical prediction). Because distribution discounting can be implemented as a modification of input files, it is a computationally efficient solution for implementing uncertainty analysis into reserve selection. Thus, the method is particularly useful for high-dimensional decision problems characteristic of regional conservation assessment. We implemented distribution discounting in the zonation reserve-selection algorithm that produces a hierarchy of conservation priorities throughout the landscape. We applied it to reserve selection for seven priority fauna in a landscape in New South Wales, Australia. The distribution discounting method can be easily adapted for use with different kinds of data (e.g., probability of occurrence or abundance) and different landscape descriptions (grid or patch based) and incorporated into other reserve-selection algorithms and software. PMID:17181804
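
    Because distribution discounting only modifies the input matrix handed to the reserve-selection algorithm, it can be expressed in a few lines: each site-by-species value is penalized by its error measure before selection. The sketch below applies a simple subtractive penalty scaled by a robustness parameter; the penalty form and the synthetic numbers are illustrative, not Zonation's internals.

        import numpy as np

        rng = np.random.default_rng(3)
        n_sites, n_species = 200, 7

        # Synthetic predicted occupancy probabilities and per-prediction error measures
        # (e.g. standard errors of the statistical habitat model).
        occupancy = rng.beta(2, 5, size=(n_sites, n_species))
        pred_error = rng.uniform(0.0, 0.15, size=(n_sites, n_species))

        # Distribution discounting: subtract alpha * error, where alpha sets how much
        # robustness to prediction error is demanded (an info-gap style horizon).
        alpha = 1.0
        discounted = np.clip(occupancy - alpha * pred_error, 0.0, None)

        # `discounted` then replaces the raw matrix as input to the selection algorithm.
        print("mean conservation value before/after discounting:",
              round(float(occupancy.mean()), 3), round(float(discounted.mean()), 3))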

  18. Uncertainty Analysis of the Grazing Flow Impedance Tube

    NASA Technical Reports Server (NTRS)

    Brown, Martha C.; Jones, Michael G.; Watson, Willie R.

    2012-01-01

    This paper outlines a methodology to identify the measurement uncertainty of NASA Langley's Grazing Flow Impedance Tube (GFIT) over its operating range, and to identify the parameters that most significantly contribute to the acoustic impedance prediction. Two acoustic liners are used for this study. The first is a single-layer, perforate-over-honeycomb liner that is nonlinear with respect to sound pressure level. The second consists of a wire-mesh facesheet and a honeycomb core, and is linear with respect to sound pressure level. These liners allow for evaluation of the effects of measurement uncertainty on impedances educed with linear and nonlinear liners. In general, the measurement uncertainty is observed to be larger for the nonlinear liners, with the largest uncertainty occurring near anti-resonance. A sensitivity analysis of the aerodynamic parameters (Mach number, static temperature, and static pressure) used in the impedance eduction process is also conducted using a Monte-Carlo approach. This sensitivity analysis demonstrates that the impedance eduction process is virtually insensitive to each of these parameters.

  19. Treatment of uncertainties in the IPCC: a philosophical analysis

    NASA Astrophysics Data System (ADS)

    Jebeile, J.; Drouet, I.

    2014-12-01

    The IPCC produces scientific reports from findings on climate and climate change. Because the findings are uncertain in many respects, producing the reports requires aggregating assessments of uncertainties of different kinds. This difficult task is currently regulated by the Guidance note for lead authors of the IPCC fifth assessment report on consistent treatment of uncertainties. The note recommends that two metrics, confidence and likelihood, be used for communicating the degree of certainty in findings. Confidence is expressed qualitatively "based on the type, amount, quality, and consistency of evidence […] and the degree of agreement", while likelihood is expressed probabilistically "based on statistical analysis of observations or model results, or expert judgment". Therefore, depending on the evidence evaluated, authors can choose to present either an assigned level of confidence or a quantified measure of likelihood. But these two kinds of uncertainty assessment reflect distinct and conflicting methodologies, and they must nonetheless be aggregated. So the question arises whether the treatment of uncertainties in the IPCC is rationally justified. In order to answer this question, it is worth comparing the IPCC procedures with the formal normative theories of epistemic rationality which have been developed by philosophers. These theories, which include contributions to the philosophy of probability and to Bayesian probabilistic confirmation theory, are relevant for our purpose because they are commonly used to assess the rationality of collective judgement formation based on uncertain knowledge. In this paper we make the comparison and pursue the following objectives: (i) we determine whether the IPCC confidence and likelihood can be compared with the notions of uncertainty targeted by or underlying the formal normative theories of epistemic rationality; (ii) we investigate whether the formal normative theories of epistemic rationality justify

  20. UNCERTAINTY ANALYSIS FOR THE TECHA RIVER DOSIMETRY SYSTEM

    SciTech Connect

    Napier, Bruce A.; Degteva, M. O.; Shagina, N. B.; Anspaugh, L. R.

    2013-04-01

    Uncertainties in the doses estimated for the members of the Techa River Cohort (TRC) are being estimated with a two-dimensional Monte Carlo approach. In order to provide more accurate and precise estimates of individual dose (and thus more precise estimates of radiation risk) for the members of the TRC, a new dosimetric calculation system, the Techa River Dosimetry System-2009 (TRDS-2009) has been prepared. The deterministic version of the improved dosimetry system TRDS-2009D was basically completed in April 2009. Recent developments in evaluation of dose-response models in light of uncertain dose have highlighted the importance of different types of uncertainties in the development of individual dose estimates. These include uncertain parameters that may be either shared (common to some or all individuals) or unshared (a unique value for each person whose dose is to be estimated) within the dosimetric cohort. The nature of the type of uncertainty may be aleatory (random variability of true values due to stochastic processes) or epistemic (due to lack of complete knowledge about a unique quantity). Finally, there is a need to identify whether the structure of the errors is either related to measurement (the estimate differs from the true value by an error that is stochastically independent of the true value; frequently called classical uncertainty) or related to grouping (the true value varies from the estimate by an error that is random and is independent of the estimate; frequently called Berkson uncertainty). An approach has been developed that identifies the nature of the various input parameters and calculational methods incorporated in the Techa River Dosimetry System (based on the TRDS-2009D implementation), and a stochastic calculation model has been prepared to estimate the uncertainties in the dose estimates. This article reviews the concepts of uncertainty analysis, the equations, and input parameters, and then identifies the authors’ interpretations
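
    The two-dimensional Monte Carlo structure referred to above separates parameters that are shared across cohort members from those that are unique to each person: shared values are drawn once per outer realization, unshared values once per person within each realization. The toy dose model below shows only that nesting; the parameter names and distributions are invented for illustration and are not part of TRDS-2009.

        import numpy as np

        rng = np.random.default_rng(11)
        n_outer = 200    # realizations of shared (typically epistemic) parameters
        n_people = 500   # cohort members with unshared (individual) parameters

        dose = np.empty((n_outer, n_people))
        for i in range(n_outer):
            # Shared parameter: e.g. a source-term scaling common to everyone (hypothetical).
            source_scale = rng.lognormal(mean=0.0, sigma=0.2)
            # Unshared parameters: e.g. individual intake and residence factors (hypothetical).
            intake = rng.lognormal(mean=-1.0, sigma=0.5, size=n_people)
            residence = rng.uniform(0.5, 1.0, size=n_people)
            dose[i] = source_scale * intake * residence

        # Spread across rows reflects shared uncertainty; spread within a row reflects
        # person-to-person variability. Both are available for dose-response modelling.
        print("cohort median of person-mean doses:",
              round(float(np.median(dose.mean(axis=0))), 4))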

  1. Compositional Analysis of Lignocellulosic Feedstocks. 2. Method Uncertainties

    PubMed Central

    2010-01-01

    The most common procedures for characterizing the chemical components of lignocellulosic feedstocks use a two-stage sulfuric acid hydrolysis to fractionate biomass for gravimetric and instrumental analyses. The uncertainty (i.e., dispersion of values from repeated measurement) in the primary data is of general interest to those with technical or financial interests in biomass conversion technology. The composition of a homogenized corn stover feedstock (154 replicate samples in 13 batches, by 7 analysts in 2 laboratories) was measured along with a National Institute of Standards and Technology (NIST) reference sugar cane bagasse, as a control, using this laboratory's suite of laboratory analytical procedures (LAPs). The uncertainty was evaluated by the statistical analysis of these data and is reported as the standard deviation of each component measurement. Censored and uncensored versions of these data sets are reported, as evidence was found for intermittent instrumental and equipment problems. The censored data are believed to represent the “best case” results of these analyses, whereas the uncensored data show how small method changes can strongly affect the uncertainties of these empirical methods. Relative standard deviations (RSD) of 1−3% are reported for glucan, xylan, lignin, extractives, and total component closure with the other minor components showing 4−10% RSD. The standard deviations seen with the corn stover and NIST bagasse materials were similar, which suggests that the uncertainties reported here are due more to the analytical method used than to the specific feedstock type being analyzed. PMID:20669952

  2. INTEGRATION OF SYSTEM COMPONENTS AND UNCERTAINTY ANALYSIS - HANFORD EXAMPLES

    SciTech Connect

    WOOD MI

    2009-07-09

    - Deterministic 'one off' analyses as the basis for evaluating sensitivity and uncertainty relative to the reference case
    - Spatial coverage identical to the reference case
    - Two types of analysis assumptions: minimax parameter values around reference case conditions, and 'what if' cases that change the reference case condition and associated parameter values
    - No conclusions about the likelihood of the estimated result other than a qualitative expectation that the actual outcome should tend toward the reference case estimate

  3. Uncertainty analysis of penicillin V production using Monte Carlo simulation.

    PubMed

    Biwer, Arno; Griffith, Steve; Cooney, Charles

    2005-04-20

    Uncertainty and variability affect economic and environmental performance in the production of biotechnology and pharmaceutical products. However, commercial process simulation software typically provides analysis that assumes deterministic rather than stochastic process parameters and thus is not capable of dealing with the complexities created by variance that arise in the decision-making process. Using the production of penicillin V as a case study, this article shows how uncertainty can be quantified and evaluated. The first step is construction of a process model, as well as analysis of its cost structure and environmental impact. The second step is identification of uncertain variables and determination of their probability distributions based on available process and literature data. Finally, Monte Carlo simulations are run to see how these uncertainties propagate through the model and affect key economic and environmental outcomes. Thus, the overall variation of these objective functions is quantified, the technical, supply chain, and market parameters that contribute most to the existing variance are identified, and the differences between economic and ecological evaluation are analyzed. In our case study analysis, we show that final penicillin and biomass concentrations in the fermenter have the highest contribution to variance for both unit production cost and environmental impact. The penicillin selling price dominates return on investment variance as well as the variance for other revenue-dependent parameters. PMID:15742389
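
    As a rough illustration of the Monte Carlo propagation step described above, the sketch below pushes a few uncertain inputs through a toy unit-cost and return calculation; all distributions and numbers are invented for illustration and are not the study's process model.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 10_000  # number of Monte Carlo trials

        # Hypothetical input distributions (illustrative values only)
        penicillin_titre = rng.normal(45.0, 3.0, n)                     # g/L final concentration
        batch_volume = rng.normal(100.0, 2.0, n)                        # m^3 per batch
        raw_material_cost = rng.triangular(40_000, 50_000, 65_000, n)   # $ per batch
        selling_price = rng.normal(18.0, 1.5, n)                        # $ per kg

        # Simple deterministic model evaluated for each sampled input vector
        product_per_batch = penicillin_titre * batch_volume             # kg (1 g/L = 1 kg/m^3)
        unit_cost = raw_material_cost / product_per_batch               # $ per kg
        revenue_per_batch = selling_price * product_per_batch
        return_ratio = (revenue_per_batch - raw_material_cost) / raw_material_cost

        for name, values in [("unit cost ($/kg)", unit_cost), ("return ratio", return_ratio)]:
            lo, hi = np.percentile(values, [5, 95])
            print(f"{name}: mean {values.mean():.2f}, 90% interval [{lo:.2f}, {hi:.2f}]")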

  4. Uncertainty analysis in WWTP model applications: a critical discussion using an example from design.

    PubMed

    Sin, Gürkan; Gernaey, Krist V; Neumann, Marc B; van Loosdrecht, Mark C M; Gujer, Willi

    2009-06-01

    This study focuses on uncertainty analysis of WWTP models and analyzes the issue of framing and how it affects the interpretation of uncertainty analysis results. As a case study, the prediction of uncertainty involved in model-based design of a wastewater treatment plant is studied. The Monte Carlo procedure is used for uncertainty estimation, for which the input uncertainty is quantified through expert elicitation and the sampling is performed using the Latin hypercube method. Three scenarios from engineering practice are selected to examine the issue of framing: (1) uncertainty due to stoichiometric, biokinetic and influent parameters; (2) uncertainty due to hydraulic behaviour of the plant and mass transfer parameters; (3) uncertainty due to the combination of (1) and (2). The results demonstrate that depending on the way the uncertainty analysis is framed, the estimated uncertainty of design performance criteria differs significantly. The implication for the practical applications of uncertainty analysis in the wastewater industry is profound: (i) as the uncertainty analysis results are specific to the framing used, the results must be interpreted within the context of that framing; and (ii) the framing must be crafted according to the particular purpose of the uncertainty analysis/model application. Finally, it needs to be emphasised that uncertainty analysis is no doubt a powerful tool for model-based design, among other applications; however, clear guidelines for good uncertainty analysis in wastewater engineering practice are still needed. PMID:19447462
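
    The Latin hypercube sampling used in the Monte Carlo procedure stratifies each input range so that a modest number of model runs covers the input space evenly. A minimal hand-rolled sketch follows; the parameter names and ranges are hypothetical stand-ins for expert-elicited WWTP inputs, not the study's values.

        import numpy as np

        def latin_hypercube(n_samples, n_dims, rng):
            """Stratify [0, 1) into n_samples intervals per dimension, draw one point
            per interval, then shuffle the strata independently in each dimension."""
            cut = (np.arange(n_samples) + rng.random((n_dims, n_samples))) / n_samples
            for d in range(n_dims):
                rng.shuffle(cut[d])
            return cut.T  # shape (n_samples, n_dims), values in [0, 1)

        rng = np.random.default_rng(1)
        u = latin_hypercube(500, 3, rng)

        # Map the uniform strata to (hypothetical) parameter ranges from elicited bounds
        mu_max = 3.0 + u[:, 0] * (6.0 - 3.0)          # maximum growth rate, 1/d
        k_la = 120.0 + u[:, 1] * (240.0 - 120.0)      # oxygen transfer coefficient, 1/d
        cod_in = 350.0 + u[:, 2] * (550.0 - 350.0)    # influent COD, g/m^3

        # Each row (mu_max, k_la, cod_in) would be one WWTP model run in the Monte Carlo procedure
        print(u.shape, mu_max[:3], k_la[:3], cod_in[:3])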

  5. Geoengineering to Avoid Overshoot: An Analysis of Uncertainty

    NASA Astrophysics Data System (ADS)

    Tanaka, Katsumasa; Cho, Cheolhung; Krey, Volker; Patt, Anthony; Rafaj, Peter; Rao-Skirbekk, Shilpa; Wagner, Fabian

    2010-05-01

    ., 2009) is employed to calculate climate responses including associated uncertainty and to estimate geoengineering profiles to cap the warming at 2°C since preindustrial. The inversion setup for the model ACC2 is used to estimate the uncertain parameters (e.g. climate sensitivity) against associated historical observations (e.g. global-mean surface air temperature). Our preliminary results show that under climate and scenario uncertainties, a geoengineering intervention to avoid an overshoot would be of medium intensity in the latter half of this century (≈ 1 Mt. Pinatubo eruption every 4 years in terms of stratospheric sulfur injections). The start year of the geoengineering intervention does not significantly influence the long-term geoengineering profile. However, a geoengineering intervention of medium intensity could bring about substantial environmental side effects such as the destruction of stratospheric ozone. Our results point to the necessity of persistently pursuing mainstream mitigation efforts. 2) Pollution Abatement and Geoengineering The second study examines the potential of geoengineering combined with clean air policy. A drastic air pollution abatement might result in an abrupt warming because it would suddenly remove the tropospheric aerosols that partly offset the background global warming (e.g. Andreae et al, 2005, Raddatz and Tanaka, 2010). This study investigates the magnitude of unrealized warming under a range of policy assumptions and associated uncertainties. Then the profile of geoengineering is estimated to suppress the warming that would accompany clean air policy. This study is the first attempt to explore uncertainty in the warming caused by clean air policy - Kloster et al. (2009), which assesses regional changes in climate and the hydrological cycle, did not, however, include associated uncertainties in the analysis. A variety of policy assumptions will be devised to represent various degrees of air pollution abatement. These

  6. Final Report. Analysis and Reduction of Complex Networks Under Uncertainty

    SciTech Connect

    Marzouk, Youssef M.; Coles, T.; Spantini, A.; Tosatto, L.

    2013-09-30

    The project was a collaborative effort among MIT, Sandia National Laboratories (local PI Dr. Habib Najm), the University of Southern California (local PI Prof. Roger Ghanem), and The Johns Hopkins University (local PI Prof. Omar Knio, now at Duke University). Our focus was the analysis and reduction of large-scale dynamical systems emerging from networks of interacting components. Such networks underlie myriad natural and engineered systems. Examples important to DOE include chemical models of energy conversion processes, and elements of national infrastructure—e.g., electric power grids. Time scales in chemical systems span orders of magnitude, while infrastructure networks feature both local and long-distance connectivity, with associated clusters of time scales. These systems also blend continuous and discrete behavior; examples include saturation phenomena in surface chemistry and catalysis, and switching in electrical networks. Reducing size and stiffness is essential to tractable and predictive simulation of these systems. Computational singular perturbation (CSP) has been effectively used to identify and decouple dynamics at disparate time scales in chemical systems, allowing reduction of model complexity and stiffness. In realistic settings, however, model reduction must contend with uncertainties, which are often greatest in large-scale systems most in need of reduction. Uncertainty is not limited to parameters; one must also address structural uncertainties—e.g., whether a link is present in a network—and the impact of random perturbations, e.g., fluctuating loads or sources. Research under this project developed new methods for the analysis and reduction of complex multiscale networks under uncertainty, by combining computational singular perturbation (CSP) with probabilistic uncertainty quantification. CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing

  7. UCNA Systematic Uncertainties: Developments in Analysis and Method

    NASA Astrophysics Data System (ADS)

    Zeck, Bryan

    2012-10-01

    The UCNA experiment is an effort to measure the beta-decay asymmetry parameter A of the correlation between the electron momentum and the neutron spin, using bottled polarized ultracold neutrons in a homogeneous 1 T magnetic field. Continued improvements in both analysis and method are helping to push the measurement uncertainty to the limits of the current statistical sensitivity (less than 0.4%). The implementation of thinner decay trap windows will be discussed, as will the use of a tagged beta particle calibration source to measure angle-dependent scattering effects and energy loss. Additionally, improvements in position reconstruction and polarization measurements using a new shutter system will be introduced. A full accounting of the current systematic uncertainties will be given.

  8. Global land cover mapping: a review and uncertainty analysis

    USGS Publications Warehouse

    Congalton, Russell G.; Gu, Jianyu; Yadav, Kamini; Thenkabail, Prasad S.; Ozdogan, Mutlu

    2014-01-01

    Given the advances in remotely sensed imagery and associated technologies, several global land cover maps have been produced in recent times, including IGBP DISCover, UMD Land Cover, Global Land Cover 2000 and GlobCover 2009. However, the utility of these maps for specific applications has often been hampered by considerable uncertainties and inconsistencies. A thorough review of these global land cover projects, including evaluating the sources of error and uncertainty, is prudent and enlightening. Therefore, this paper describes our work in which we compared, summarized and conducted an uncertainty analysis of the four global land cover mapping projects using an error budget approach. The results showed that the classification scheme and the validation methodology had the highest error contribution and implementation priority. A comparison of the classification schemes showed that there are many inconsistencies between the definitions of the map classes. This is especially true for the mixed type classes, for which thresholds vary for the attributes/discriminators used in the classification process. Examination of these four global mapping projects provided quite a few important lessons for future global mapping projects, including the need for clear and uniform definitions of the classification scheme and an efficient, practical, and valid design of the accuracy assessment.

  9. Bayesian analysis of input uncertainty in hydrological modeling: 2. Application

    NASA Astrophysics Data System (ADS)

    Kavetski, Dmitri; Kuczera, George; Franks, Stewart W.

    2006-03-01

    The Bayesian total error analysis (BATEA) methodology directly addresses both input and output errors in hydrological modeling, requiring the modeler to make explicit, rather than implicit, assumptions about the likely extent of data uncertainty. This study considers a BATEA assessment of two North American catchments: (1) the French Broad River and (2) the Potomac basins. It assesses the performance of the conceptual Variable Infiltration Capacity (VIC) model with and without accounting for input (precipitation) uncertainty. The results show the considerable effects of precipitation errors on the predicted hydrographs (especially the prediction limits) and on the calibrated parameters. In addition, the performance of BATEA in the presence of severe model errors is analyzed. While BATEA allows a very direct treatment of input uncertainty and yields some limited insight into model errors, it requires the specification of valid error models, which are currently poorly understood and require further work. Moreover, it leads to computationally challenging, high-dimensional problems. For some types of models, including the VIC implemented using robust numerical methods, the computational cost of BATEA can be reduced using Newton-type methods.

  10. The Uncertainty in the Local Seismic Response Analysis

    SciTech Connect

    Pasculli, A.; Pugliese, A.; Romeo, R. W.; Sano, T.

    2008-07-08

    This paper shows the influence exerted on local seismic response analysis by considering dispersion and uncertainty in the seismic input as well as in the dynamic properties of soils. In a first attempt, a 1D numerical model is developed accounting for both the aleatory nature of the input motion and the stochastic variability of the dynamic properties of soils. The seismic input is introduced in a non-conventional way through a power spectral density, for which an elastic response spectrum, derived, for instance, from a conventional seismic hazard analysis, is required with an appropriate level of reliability. The uncertainty in the geotechnical properties of soils is instead investigated through a well-known simulation technique (the Monte Carlo method) for the construction of statistical ensembles. The result of a conventional local seismic response analysis, given by a deterministic elastic response spectrum, is replaced, in our approach, by a set of statistical elastic response spectra, each one characterized by an appropriate level of probability of being reached or exceeded. The analyses have been carried out for a well-documented real case study. Lastly, we anticipate a 2D numerical analysis to investigate also the spatial variability of soil properties.

  11. Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation

    PubMed Central

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a prompt analysis, and a correlation between the mean values and maximum sampling errors of the methods was not observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications. PMID:20717532
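
    The reported standard deviations and relative standard deviations follow directly from replicate measurements; the sketch below shows that basic calculation, including a Student's t confidence interval on the mean, using invented glucan replicates rather than the study's data.

        import numpy as np
        from scipy import stats

        # Hypothetical replicate measurements of glucan content (% dry mass) from repeated runs
        glucan = np.array([34.9, 35.4, 35.1, 34.6, 35.3, 35.0, 34.8, 35.2, 35.5, 34.7])

        mean = glucan.mean()
        sd = glucan.std(ddof=1)             # sample standard deviation
        rsd = 100.0 * sd / mean             # relative standard deviation, %

        # 95% confidence interval on the mean using Student's t
        n = glucan.size
        t_crit = stats.t.ppf(0.975, df=n - 1)
        half_width = t_crit * sd / np.sqrt(n)

        print(f"mean = {mean:.2f}%, SD = {sd:.2f}, RSD = {rsd:.1f}%")
        print(f"95% CI on the mean: [{mean - half_width:.2f}, {mean + half_width:.2f}]")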

  12. MOUSE (MODULAR ORIENTED UNCERTAINTY SYSTEM): A COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM. OPERATIONAL MANUAL.

    EPA Science Inventory

    MOUSE (Modular Oriented Uncertainty SystEm) deals with the problem of uncertainties in models that consist of one or more algebraic equations. It was especially designed for use by those with little or no knowledge of computer languages or programming. It is compact (and thus can...

  13. Results of a 24-inch Hybrid Motor Performance Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Sims, Joseph D.; Coleman, Hugh W.

    1998-01-01

    The subscale (11 and 24-inch) hybrid motors at the Marshall Space Flight Center (MSFC) have been used as versatile and cost-effective testbeds for developing new technology. Comparisons between motor configurations, ignition systems, feed systems, fuel formulations, and nozzle materials have been carried out without detailed consideration as to how "good" the motor performance data were. For the 250,000 lb-thrust motor developed by the Hybrid Propulsion Demonstration Program consortium, this shortcoming is particularly risky because motor performance will likely be used as part of a set of downselect criteria to choose between competing ignition and feed systems under development. This analysis directly addresses that shortcoming by applying uncertainty analysis techniques to the experimental determination of the characteristic velocity, theoretical characteristic velocity, and characteristic velocity efficiency for a 24-inch motor firing. With the adoption of fuel-lined headends, flow restriction, and aft mixing chambers, state-of-the-art 24-inch hybrid motors have become very efficient. However, impossibly high combustion efficiencies (some computed as high as 108%) have been measured in some tests with 11-inch motors. This analysis has given new insight into explaining how these efficiencies were measured to be so high, and into which experimental measurements contribute the most to the overall uncertainty.
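
    Experimental characteristic velocity is commonly formed as c* = Pc * At / mdot, so a first-order (Taylor-series) uncertainty analysis combines the relative uncertainties of the measured quantities in root-sum-square fashion. The sketch below illustrates that propagation with assumed values, not the MSFC measurements.

        import numpy as np

        # Hypothetical measured values and their standard uncertainties
        p_c, u_pc = 3.45e6, 0.02e6    # chamber pressure, Pa
        a_t, u_at = 7.0e-3, 0.05e-3   # nozzle throat area, m^2
        mdot, u_md = 14.0, 0.25       # total propellant mass flow rate, kg/s

        # Experimental characteristic velocity c* = Pc * At / mdot
        c_star = p_c * a_t / mdot

        # First-order propagation: for a product/quotient, relative uncertainties
        # combine in root-sum-square fashion
        rel_u = np.sqrt((u_pc / p_c) ** 2 + (u_at / a_t) ** 2 + (u_md / mdot) ** 2)
        u_cstar = rel_u * c_star

        print(f"c* = {c_star:.1f} m/s +/- {u_cstar:.1f} m/s ({100 * rel_u:.2f}%)")

        # The same pattern applied to efficiency = c*_exp / c*_theo shows why small
        # measurement errors can push computed efficiencies above 100%.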

  14. Additional challenges for uncertainty analysis in river engineering

    NASA Astrophysics Data System (ADS)

    Berends, Koen; Warmink, Jord; Hulscher, Suzanne

    2016-04-01

    the proposed intervention. The implicit assumption underlying such analysis is that both models are commensurable. We hypothesize that they are commensurable only to a certain extent. In an idealised study we have demonstrated that prediction performance loss should be expected with increasingly large engineering works. When accounting for parametric uncertainty of floodplain roughness in model identification, we see uncertainty bounds for predicted effects of interventions increase with increasing intervention scale. Calibration of these types of models therefore seems to have a shelf-life, beyond which calibration no longer improves prediction. Therefore a qualification scheme for model use is required that can be linked to model validity. In this study, we characterize model use along three dimensions: extrapolation (using the model with different external drivers), extension (using the model for different output or indicators) and modification (using modified models). Such use of models is expected to have implications for the applicability of surrogate modelling for efficient uncertainty analysis as well, which is recommended for future research. Warmink, J. J.; Straatsma, M. W.; Huthoff, F.; Booij, M. J. & Hulscher, S. J. M. H. 2013. Uncertainty of design water levels due to combined bed form and vegetation roughness in the Dutch river Waal. Journal of Flood Risk Management 6, 302-318. DOI: 10.1111/jfr3.12014

  15. Productivity of Northern Eurasian forests: Analysis of uncertainties

    NASA Astrophysics Data System (ADS)

    Shvidenko, Anatoly; Schepaschenko, Dmitry; McCallum, Ian

    2010-05-01

    Indicators of biological productivity of forests (live and dead biomass, net primary production, net and gross growth) are crucial both for assessment of the impacts of terrestrial ecosystems on major biogeochemical cycles and for the practice of sustainable forest management. However, differing information and the diversity of methods used in assessments of forest productivity cause substantial variation in reported estimates. The paper contains a systems analysis of the existing methods, their uncertainties, and a description of available information. With respect to Northern Eurasian forests, the major reasons for uncertainties can be categorized as follows: (1) significant biases that are inherent in a number of important sources of available information (e.g., forest inventory data, results of measurements of some indicators in situ); (2) inadequacy and oversimplification of models of different types (empirical aggregations, process-based models); (3) lack of data for some regions; and (4) the upscaling procedure for 'point' observations. Based on adherence to the principles of systems analysis that is as comprehensive as possible, we attempted to provide a reanalysis of indicators of forest productivity in Russia, aiming to obtain results for which uncertainties could be estimated in a reliable and transparent way. Within a landscape-ecosystem approach, this has required (1) development of an expert system for refinement of initial data, including elimination of recognized biases; (2) delineation of ecological regions based on gradients of major indicators of productivity; (3) transition to multidimensional models (e.g., for calculation of spatially distributed biomass expansion factors); (4) use of process-based elements in empirical models; and (5) development of some approaches which presumably do not have recognized biases. However, taking into account the fuzzy character of the problem, the above approach (as well as any other individually used method) is

  16. Statistical analysis of the uncertainty related to flood hazard appraisal

    NASA Astrophysics Data System (ADS)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides the evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency can cause flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify some hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard corresponding to the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrixes). However, flooding data are usually unavailable or piecemeal, preventing a reliable flood hazard analysis from being carried out; therefore, hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainties intrinsic to flood risk appraisal can be related to the hazard evaluation, due to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.

  17. Foundational methods for model verification and uncertainty analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Jakeman, A. J.; Croke, B. F.; Guillaume, J. H.; Jakeman, J. D.; Shin, M.

    2013-12-01

    Before embarking on formal methods of uncertainty analysis that may entail unnecessarily restrictive assumptions and sophisticated treatment, prudence dictates exploring one's data, model candidates and applicable objective functions with a mixture of methods as a first step. It seems that there are several foundational methods that warrant more attention in practice and that there is scope for the development of new ones. Through the case of four lumped rainfall-runoff models of varying complexity from several watersheds, we illustrate that there are valuable methods, many of them already in open source software and others we have recently developed, which can be invoked to yield valuable insights into model veracity and uncertainty. We show results of using methods of global sensitivity analysis that help: determine whether insensitive parameters impact on predictions and therefore cannot be fixed; and identify which combinations of objective function, dataset and model structure allow insensitive parameters to be estimated. We apply response surface and polynomial chaos methods to yield knowledge of the models' response surfaces and parameter interactions, thereby informing model redesign. A new approach to model structure discrimination is presented based on Pareto methods and cross-validation. It reveals which model structures are acceptable in the sense that they are non-dominated by other structures across calibration and validation periods and across catchments according to specified performance criteria. Finally, we present and demonstrate a falsification approach that shows the value of examining scenarios of model structures and parameters to identify any change that might have a specified effect on a prediction.
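
    Variance-based global sensitivity analysis of the kind referred to above apportions output variance among inputs. The sketch below estimates first-order Sobol' indices with a pick-freeze estimator for a toy function standing in for a rainfall-runoff objective function; it is an illustration of the technique, not the authors' implementation.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 20_000

        def model(x):
            # Toy stand-in for a lumped rainfall-runoff model's objective function;
            # x columns are three hypothetical parameters scaled to [0, 1]
            return np.sin(2 * np.pi * x[:, 0]) + 0.7 * x[:, 1] ** 2 + 0.1 * x[:, 2]

        d = 3
        A = rng.random((n, d))
        B = rng.random((n, d))
        yA, yB = model(A), model(B)
        var_y = np.var(np.concatenate([yA, yB]), ddof=1)

        # First-order Sobol' indices via the pick-freeze estimator:
        # C_i takes column i from A and all other columns from B
        for i in range(d):
            C = B.copy()
            C[:, i] = A[:, i]
            yC = model(C)
            s1 = np.mean(yA * (yC - yB)) / var_y
            print(f"parameter {i}: first-order index ~ {s1:.2f}")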

  18. Reducing spatial uncertainty in climatic maps through geostatistical analysis

    NASA Astrophysics Data System (ADS)

    Pesquer, Lluís; Ninyerola, Miquel; Pons, Xavier

    2014-05-01

    Climatic maps from meteorological stations and geographical co-variables can be obtained through correlative models (Ninyerola et al., 2000)*. Nevertheless, the spatial uncertainty of the resulting maps could be reduced. The present work is a new stage beyond those approaches, aiming to study how to obtain better results while characterizing spatial uncertainty. The study area is Catalonia (32000 km2), a region with highly variable relief (0 to 3143 m). We have used 217 stations (321 to 1244 mm) to model the annual precipitation in two steps: 1/ multiple regression using geographical variables (elevation, distance to the coast, latitude, etc.) and 2/ refinement of the results by adding the spatial interpolation of the regression residuals with inverse distance weighting (IDW), regularized splines with tension (SPT) or ordinary kriging (OK). Spatial uncertainty analysis is based on an independent subsample (test set), randomly selected in previous works. The main contribution of this work is the analysis of this test set as well as the search for an optimal process of division (split) of the stations into two sets, one used to perform the multiple regression and residuals interpolation (fit set), and another used to compute the quality (test set); an optimal division should reduce spatial uncertainty and improve the overall quality. Two methods have been evaluated against classical methods (random selection, RS, and leave-one-out cross-validation, LOOCV): selection by Euclidean 2D distance, and selection by anisotropic 2D distance combined with a 3D contribution (suitably weighted) from the most representative independent variable. Both methods define a minimum threshold distance, obtained by variogram analysis, between samples. Main preliminary results for LOOCV, RS (average from 10 executions), the Euclidean criterion (EU), and the anisotropic criterion (with a 1.1 value, the UTMY coordinate has a bit more weight than UTMX) combined with 3D criteria (A3D) (1000 factor for elevation
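
    The two-step mapping scheme (multiple regression on geographic co-variables, then interpolation of the residuals) can be sketched compactly. The code below uses synthetic stations and IDW for the residual step; it is only an illustration of the workflow, not the study's dataset or settings.

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic "stations": coordinates (km), elevation (m), distance to coast (km), precipitation (mm)
        n = 200
        x, y = rng.uniform(0, 300, n), rng.uniform(0, 250, n)
        elev = rng.uniform(0, 3000, n)
        dist_coast = rng.uniform(0, 200, n)
        precip = 400 + 0.25 * elev + 1.2 * dist_coast + rng.normal(0, 60, n)

        # Step 1: multiple linear regression on geographic co-variables
        X = np.column_stack([np.ones(n), elev, dist_coast])
        coeffs, *_ = np.linalg.lstsq(X, precip, rcond=None)
        residuals = precip - X @ coeffs

        # Step 2: interpolate the regression residuals with inverse distance weighting (IDW)
        def idw(x0, y0, power=2.0):
            d = np.hypot(x - x0, y - y0)
            if np.any(d < 1e-9):                 # exactly on a station
                return residuals[np.argmin(d)]
            w = 1.0 / d ** power
            return np.sum(w * residuals) / np.sum(w)

        # Prediction at an arbitrary map location (co-variables assumed known there)
        x0, y0, elev0, dc0 = 150.0, 120.0, 1200.0, 80.0
        prediction = coeffs @ np.array([1.0, elev0, dc0]) + idw(x0, y0)
        print(f"predicted annual precipitation: {prediction:.0f} mm")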

  19. Geoengineering to Avoid Overshoot: An Analysis of Uncertainty

    NASA Astrophysics Data System (ADS)

    Tanaka, Katsumasa; Cho, Cheolhung; Krey, Volker; Patt, Anthony; Rafaj, Peter; Rao-Skirbekk, Shilpa; Wagner, Fabian

    2010-05-01

    ., 2009) is employed to calculate climate responses including associated uncertainty and to estimate geoengineering profiles to cap the warming at 2°C since preindustrial. The inversion setup for the model ACC2 is used to estimate the uncertain parameters (e.g. climate sensitivity) against associated historical observations (e.g. global-mean surface air temperature). Our preliminary results show that under climate and scenario uncertainties, a geoengineering intervention to avoid an overshoot would be of medium intensity in the latter half of this century (≈ 1 Mt. Pinatubo eruption every 4 years in terms of stratospheric sulfur injections). The start year of the geoengineering intervention does not significantly influence the long-term geoengineering profile. However, a geoengineering intervention of medium intensity could bring about substantial environmental side effects such as the destruction of stratospheric ozone. Our results point to the necessity of persistently pursuing mainstream mitigation efforts. 2) Pollution Abatement and Geoengineering The second study examines the potential of geoengineering combined with clean air policy. A drastic air pollution abatement might result in an abrupt warming because it would suddenly remove the tropospheric aerosols that partly offset the background global warming (e.g. Andreae et al, 2005, Raddatz and Tanaka, 2010). This study investigates the magnitude of unrealized warming under a range of policy assumptions and associated uncertainties. Then the profile of geoengineering is estimated to suppress the warming that would accompany clean air policy. This study is the first attempt to explore uncertainty in the warming caused by clean air policy - Kloster et al. (2009), which assesses regional changes in climate and the hydrological cycle, did not, however, include associated uncertainties in the analysis. A variety of policy assumptions will be devised to represent various degrees of air pollution abatement. These

  20. Dynamic wake prediction and visualization with uncertainty analysis

    NASA Technical Reports Server (NTRS)

    Holforty, Wendy L. (Inventor); Powell, J. David (Inventor)

    2005-01-01

    A dynamic wake avoidance system utilizes aircraft and atmospheric parameters readily available in flight to model and predict airborne wake vortices in real time. A novel combination of algorithms allows for a relatively simple yet robust wake model to be constructed based on information extracted from a broadcast. The system predicts the location and movement of the wake based on the nominal wake model and correspondingly performs an uncertainty analysis on the wake model to determine a wake hazard zone (no fly zone), which comprises a plurality of wake planes, each moving independently from another. The system selectively adjusts dimensions of each wake plane to minimize spatial and temporal uncertainty, thereby ensuring that the actual wake is within the wake hazard zone. The predicted wake hazard zone is communicated in real time directly to a user via a realistic visual representation. In an example, the wake hazard zone is visualized on a 3-D flight deck display to enable a pilot to visualize or see a neighboring aircraft as well as its wake. The system substantially enhances the pilot's situational awareness and allows for a further safe decrease in spacing, which could alleviate airport and airspace congestion.

  1. Selection of Representative Models for Decision Analysis Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.

    2016-03-01

    The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that must be analyzed so that an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. To do so, first a mathematical function was developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute-levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.

  2. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all-at-once. Instead, only one variable is first identified, and then, Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all 4 variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and then applied to the NASA Langley Uncertainty Quantification Challenge problem.

  3. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  4. Uncertainty analysis of planar laser-induced fluorescence measurements

    NASA Astrophysics Data System (ADS)

    Tavoularis, Stavros; Vanderwel, Christina

    2014-11-01

    We present a thorough analysis of the uncertainty of the planar laser-induced fluorescence (PLIF) method. We consider the measurement of concentration maps in cross-sections parallel to and normal to the axis of a slender plume containing Rhodamine 6G as a passive scalar tracer and transported by a turbulent shear flow. In particular, we identify two previously unexplored sources of error contributed by non-uniformity of the concentration across the laser sheet and by secondary fluorescence. We propose new methods to evaluate and correct for these sources of error and demonstrate that the corrected concentration measurements accurately determined the injected dye mass flow rate of the plume in the far field. Supported by NSERC.

  5. Handling of Uncertainty in Analysis of Chest Pain

    PubMed Central

    Hudson, D. L.; Cohen, M. E.

    1990-01-01

    Over the last 15 years, a large research effort has been devoted to the development of rule-based expert systems for medical decision making. While many of these systems provide competent medical advice in their domains of expertise, few are employed in actual clinical use for a variety of reasons. One of the challenges confronting these systems is the task of dealing with uncertain information. Although much attention has been devoted to this problem, no completely satisfactory solution has been found. A strategy for coping with uncertainty based on a modified production rule format as well as an automated technique for extracting relative degrees of importance of contributing factors is discussed as applied to an existing medical expert system for the analysis of chest pain.

  6. Impact of Model Uncertainties on Quantitative Analysis of FUV Auroral Images: Peak Production Height

    NASA Technical Reports Server (NTRS)

    Germany, G. A.; Lummerzheim, D.; Parks, G. K.; Brittnacher, M. J.; Spann, James F., Jr.; Richards, Phil G.

    1999-01-01

    We demonstrate that small uncertainties in the modeled height of peak production for FUV emissions can lead to significant uncertainties in the analysis of these same emissions. In particular, an uncertainty of only 3 km in the peak production height can lead to a 50% uncertainty in the mean auroral energy deduced from the images. This altitude uncertainty is comparable to the differences between auroral deposition models currently used for UVI analysis. Consequently, great care must be taken in quantitative photometric analysis and interpretation of FUV auroral images.

  7. [Parameter uncertainty analysis for urban rainfall runoff modelling].

    PubMed

    Huang, Jin-Liang; Lin, Jie; Du, Peng-Fei

    2012-07-01

    An urban watershed in Xiamen was selected for parameter uncertainty analysis of urban stormwater runoff modelling, in terms of parameter identification and sensitivity analysis, based on the storm water management model (SWMM) using Monte Carlo sampling and the regionalized sensitivity analysis (RSA) algorithm. Results show that Dstore-Imperv, Dstore-Perv and Curve Number (CN) are the identifiable parameters with larger K-S values in the hydrological and hydraulic module, and the rank of K-S values in the hydrological and hydraulic module is Dstore-Imperv > CN > Dstore-Perv > N-Perv > conductivity > Con-Mann > N-Imperv. With regard to the water quality module, the parameters of the exponential washoff model, including Coefficient and Exponent, and the Max. Buildup parameter of the saturation buildup model in the three land cover types are the identifiable parameters with the larger K-S values. In comparison, the K-S value of the rate constant in the three land use/cover types is smaller than that of Max. Buildup, Coefficient and Exponent. PMID:23002595
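
    Regionalized sensitivity analysis ranks parameters by how strongly the 'behavioural' and 'non-behavioural' subsets of Monte Carlo samples differ, typically via the Kolmogorov-Smirnov (K-S) statistic. The sketch below illustrates the idea with a toy surrogate in place of SWMM; parameter ranges and the acceptance threshold are invented.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(11)
        n = 5_000

        # Monte Carlo samples of (hypothetical) SWMM-like parameters
        params = {
            "Dstore-Imperv": rng.uniform(0.2, 3.0, n),    # depression storage, mm
            "CN":            rng.uniform(60.0, 95.0, n),  # curve number
            "N-Imperv":      rng.uniform(0.01, 0.03, n),  # Manning's n, impervious
        }

        # Toy surrogate for a runoff-volume error metric (stand-in for running the model)
        error = (0.6 * params["Dstore-Imperv"] - 0.02 * (params["CN"] - 75)
                 + 5.0 * params["N-Imperv"] + rng.normal(0, 0.3, n))

        behavioural = np.abs(error) < 0.5   # acceptance threshold on model performance

        # Rank parameters by the K-S distance between behavioural and non-behavioural samples
        for name, values in params.items():
            ks = stats.ks_2samp(values[behavioural], values[~behavioural]).statistic
            print(f"{name}: K-S = {ks:.3f}")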

  8. Improving uncertainty analysis in European Union risk assessment of chemicals.

    PubMed

    Verdonck, Frederik A M; Souren, Astrid; van Asselt, Marjolein B A; Van Sprang, Patrick A; Vanrolleghem, Peter A

    2007-07-01

    Handling uncertainty in current European Union (EU) risk assessment of new and existing substances is problematic for several reasons. Mainly the known or quantifiable sources of uncertainty are considered. Uncertainty is not sufficiently or explicitly communicated to risk managers and decision makers; instead, it is hidden and concealed in risk quotient numbers that appear to be certain and, therefore, create a false sense of certainty and protectiveness. The new EU chemical policy legislation, REACH, is an opportunity to learn from interdisciplinary thinking in order to evolve to smart risk assessment: an assessment in which awareness of and openness to uncertainty is used to produce better characterizations and evaluations of risks. In a smart risk assessment context, quantifying uncertainty is not an aim but just a productive means to refine the assessment or to find alternative solutions for the problem at stake. Guidance and examples are given on how to differentiate, assess, and use uncertainty. PMID:17695106

  9. Regional Frequency and Uncertainty Analysis of Extreme Precipitation in Bangladesh

    NASA Astrophysics Data System (ADS)

    Mortuza, M. R.; Demissie, Y.; Li, H. Y.

    2014-12-01

    An increased frequency of extreme precipitation events, especially those with multiday durations, is responsible for recent urban floods and associated significant losses of lives and infrastructure in Bangladesh. Reliable and routinely updated estimation of the frequency of occurrence of such extreme precipitation events is thus important for developing up-to-date hydraulic structures and stormwater drainage systems that can effectively minimize future risk from similar events. In this study, we have updated the intensity-duration-frequency (IDF) curves for Bangladesh using daily precipitation data from 1961 to 2010 and quantified the associated uncertainties. Regional frequency analysis based on L-moments is applied to 1-day, 2-day and 5-day annual maximum precipitation series due to its advantages over at-site estimation. The regional frequency approach pools the information from climatologically similar sites to make reliable estimates of quantiles, given that the pooling group is homogeneous and of reasonable size. We have used the region of influence (ROI) approach along with a homogeneity measure based on L-moments to identify the homogeneous pooling groups for each site. Five 3-parameter distributions (i.e., Generalized Logistic, Generalized Extreme Value, Generalized Normal, Pearson Type Three, and Generalized Pareto) are used for a thorough selection of appropriate models that fit the sample data. Uncertainties related to the selection of the distributions and to the historical data are quantified using the Bayesian Model Averaging and balanced bootstrap approaches, respectively. The results from this study can be used to update the current design and management of hydraulic structures as well as in exploring spatio-temporal variations of extreme precipitation and associated risk.
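
    Bootstrap resampling of the annual-maximum series gives sampling uncertainty on design quantiles. The sketch below uses an ordinary bootstrap and a method-of-moments Gumbel fit as a simplified stand-in for the regional L-moment procedure and balanced bootstrap used in the study; the series is synthetic.

        import numpy as np

        rng = np.random.default_rng(5)

        # Synthetic annual-maximum 1-day precipitation series (mm), stand-in for station data
        amax = rng.gumbel(loc=95.0, scale=28.0, size=50)

        def gumbel_quantile(sample, return_period):
            """Method-of-moments Gumbel fit and quantile for the given return period."""
            beta = np.sqrt(6.0) * sample.std(ddof=1) / np.pi
            mu = sample.mean() - 0.5772 * beta
            p_exceed = 1.0 / return_period
            return mu - beta * np.log(-np.log(1.0 - p_exceed))

        q100 = gumbel_quantile(amax, 100)

        # Ordinary bootstrap (the study itself uses a balanced bootstrap) of the 100-year quantile
        boot = np.array([gumbel_quantile(rng.choice(amax, size=amax.size, replace=True), 100)
                         for _ in range(2000)])
        lo, hi = np.percentile(boot, [5, 95])
        print(f"100-year 1-day precipitation: {q100:.0f} mm "
              f"(90% bootstrap interval {lo:.0f}-{hi:.0f} mm)")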

  10. Uncertainty Analysis of the Three Pagodas Fault-Source Geometry

    NASA Astrophysics Data System (ADS)

    Haller, K. M.

    2015-12-01

    Probabilistic seismic-hazard assessment generally relies on an earthquake catalog (to estimate future seismicity from the locations and rates of past earthquakes) and fault sources (to estimate future seismicity from the known paleoseismic history of surface rupture). The paleoseismic history of potentially active faults in Southeast Asia is addressed at few locations and spans only a few complete recurrence intervals; many faults remain unstudied. Even where the timing of a surface-rupturing earthquake is known, the extent of rupture may not be well constrained. Therefore, subjective judgment of experts is often used to define the three-dimensional size of future ruptures; limited paleoseismic data can lead to large uncertainties in ground-motion hazard from fault sources due to the preferred models that underlie these judgments. The 300-km-long, strike-slip Three Pagodas fault in western Thailand is possibly one of the most active faults in the country. The fault parallels the plate boundary and may be characterized by a slip rate high enough to result in measurable ground motion at periods of interest for building design. The known paleoseismic history is limited and likely does not include the largest possible earthquake on the fault. This lack of knowledge begs the question: what sizes of earthquakes are expected? Preferred rupture models constrain possible magnitude-frequency distributions, and alternative rupture models can result in different ground-motion hazard near the fault. This analysis includes alternative rupture models for the Three Pagodas fault, a first-level check against gross modeling assumptions to assure the source model is a reasonable reflection of observed data, and the resulting ground-motion hazard for each alternative. Inadequate paleoseismic data are an important source of uncertainty that could be compensated for by considering alternative rupture models for poorly known seismic sources.

  11. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    SciTech Connect

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  12. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    SciTech Connect

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi Unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and the Nuclear Regulatory Commission (NRC). However, that study only examined a handful of model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  13. Use of probabilistic methods for analysis of cost and duration uncertainties in a decision analysis framework

    SciTech Connect

    Boak, D.M.; Painton, L.

    1995-12-08

    Probabilistic forecasting techniques have been used in many risk assessment and performance assessment applications on radioactive waste disposal projects such as Yucca Mountain and the Waste Isolation Pilot Plant (WIPP). Probabilistic techniques such as Monte Carlo and Latin Hypercube sampling methods are routinely used to treat uncertainties in physical parameters important in simulating radionuclide transport in a coupled geohydrologic system and assessing the ability of that system to comply with regulatory release limits. However, the use of probabilistic techniques in the treatment of uncertainties in the cost and duration of programmatic alternatives on risk and performance assessment projects is less common. Where significant uncertainties exist and where programmatic decisions must be made despite existing uncertainties, probabilistic techniques may yield important insights into decision options, especially when used in a decision analysis framework and when properly balanced with deterministic analyses. For relatively simple evaluations, these types of probabilistic evaluations can be made using personal computer-based software.

  14. Incorporating parametric uncertainty into population viability analysis models

    USGS Publications Warehouse

    McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.

    2011-01-01

    Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analysis, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
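
    The two-loop structure described above can be sketched directly: uncertain parameters are drawn once per replicate in the outer loop, while temporal variance is drawn at every time step in the inner loop. The demographic rates below are hypothetical placeholders, not the piping plover estimates.

        import numpy as np

        rng = np.random.default_rng(2024)

        N_REPS, N_YEARS = 1000, 50
        QUASI_EXTINCTION = 25       # abundance threshold
        extinct = 0

        for rep in range(N_REPS):
            # Outer (replication) loop: draw uncertain parameters once per replicate
            # (hypothetical means/SEs standing in for estimated demographic rates)
            mean_growth = rng.normal(1.00, 0.04)     # parametric uncertainty in mean growth rate
            env_sd = max(rng.normal(0.10, 0.02), 0.01)  # parametric uncertainty in environmental SD

            n = 200.0                                # initial abundance
            for year in range(N_YEARS):
                # Inner (time-step) loop: temporal variance drawn every year
                lam = rng.normal(mean_growth, env_sd)
                n = max(n * lam, 0.0)
                if n < QUASI_EXTINCTION:
                    extinct += 1
                    break

        print(f"quasi-extinction probability over {N_YEARS} years: {extinct / N_REPS:.2f}")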

  15. Experimental investigations for uncertainty quantification in brake squeal analysis

    NASA Astrophysics Data System (ADS)

    Renault, A.; Massa, F.; Lallemand, B.; Tison, T.

    2016-04-01

    The aim of this paper is to improve the correlation between the experimental and numerical prediction of unstable frequencies for automotive brake systems considering uncertainty. First, an experimental quantification of uncertainty and a discussion analysing the contributions of uncertainty to a numerical squeal simulation are proposed. Frequency and transient simulations are performed considering nominal values of model parameters, determined experimentally. The obtained results are compared with those derived from experimental tests to highlight the limitations of deterministic simulations. The effects of the different kinds of uncertainty detected in the working conditions of the brake system (the pad boundary condition, the brake system material properties and the pad surface topography) are discussed by defining different unstable mode classes. Finally, a correlation between experimental and numerical results considering uncertainty is successfully proposed for an industrial brake system. Results from the different comparisons also reveal a major influence of the pad topography and, consequently, of the contact distribution.

  16. Uncertainty analysis of fission fraction for reactor antineutrino experiments

    NASA Astrophysics Data System (ADS)

    Ma, X. B.; Lu, F.; Wang, L. Z.; Chen, Y. X.; Zhong, W. L.; An, F. P.

    2016-06-01

    Reactor simulation is an important source of uncertainties for a reactor neutrino experiment. Therefore, how to evaluate the antineutrino flux uncertainty resulting from reactor simulation is an important question. In this study, a method for evaluating the antineutrino flux uncertainty arising from reactor simulation was proposed that takes the correlation coefficient into account. In order to use this method in the Daya Bay antineutrino experiment, the open source code DRAGON was improved and used to obtain the fission fractions and correlation coefficients. The average fission fractions from the DRAGON and SCIENCE codes were compared, and the difference was less than 5% for all four isotopes. The uncertainty of the fission fraction was evaluated by comparing the simulated atomic densities of the four main isotopes with Takahama-3 experimental measurements. After that, the uncertainty of the antineutrino flux resulting from reactor simulation was evaluated as 0.6% per core for the Daya Bay antineutrino experiment.

  17. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.

  18. ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING

    SciTech Connect

    Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.; Ratzlaff, Pete; Siemiginowska, Aneta

    2011-04-20

    While considerable advances have been made in accounting for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have generally been ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
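
    Summarizing calibration uncertainty with a principal component analysis amounts to decomposing an ensemble of plausible calibration curves and regenerating new plausible curves from a few leading components. The sketch below does this for a synthetic effective-area ensemble; it is a schematic of the idea, not the Chandra implementation.

        import numpy as np

        rng = np.random.default_rng(8)

        # Synthetic ensemble of plausible effective-area curves on a common energy grid
        n_curves, n_bins = 300, 128
        energy = np.linspace(0.3, 8.0, n_bins)                               # keV
        nominal = 600.0 * np.exp(-0.5 * ((energy - 1.5) / 1.2) ** 2) + 50.0  # cm^2
        ensemble = nominal + (rng.normal(0, 12, (n_curves, 1))               # normalization wiggle
                              + rng.normal(0, 8, (n_curves, 1)) * (energy - 2.0))

        # Principal component analysis of the deviations from the ensemble mean
        mean_curve = ensemble.mean(axis=0)
        deviations = ensemble - mean_curve
        U, S, Vt = np.linalg.svd(deviations, full_matrices=False)
        explained = S ** 2 / np.sum(S ** 2)
        n_keep = int(np.searchsorted(np.cumsum(explained), 0.99) + 1)

        # A new plausible calibration curve = mean + random combination of leading components
        coeffs = rng.normal(0, S[:n_keep] / np.sqrt(n_curves - 1))
        simulated_curve = mean_curve + coeffs @ Vt[:n_keep]

        print(f"{n_keep} components explain >= 99% of the calibration variance")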

  19. Sensitivity and Uncertainty Analysis to Burn-up Estimates on ADS Using ACAB Code

    SciTech Connect

    Cabellos, O; Sanz, J; Rodriguez, A; Gonzalez, E; Embid, M; Alvarez, F; Reyes, S

    2005-02-11

    Within the scope of the Accelerator Driven System (ADS) concept for nuclear waste management applications, burnup uncertainty estimates due to uncertainty in the activation cross sections (XSs) are important for both the safety and the efficiency of the waste burning process. We have applied both sensitivity analysis and Monte Carlo methodology to actinide burnup calculations in a lead-bismuth cooled subcritical ADS. The sensitivity analysis is used to identify the reaction XSs and the dominant chains that contribute most significantly to the uncertainty. The Monte Carlo methodology gives the burnup uncertainty estimates due to the synergetic/global effect of the complete set of XS uncertainties. These uncertainty estimates are valuable for assessing the need for experimental or systematic reevaluation of particular uncertain XSs for ADS.
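    A hedged sketch of the Monte Carlo side of such a study: sample cross sections within assumed relative uncertainties and propagate them through a toy two-nuclide transmutation chain. The chain, flux, and uncertainty values are invented for the sketch and are unrelated to the ACAB data or the actual ADS design:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative chain N1 --sigma1--> N2 --sigma2--> (removal) under a constant flux.
phi = 1.0e15                     # neutron flux, n/cm^2/s
t = 3.0e7                        # irradiation time, s (about one year)
sigma1, rel_u1 = 2.0e-24, 0.10   # cm^2 and assumed 10 % relative uncertainty
sigma2, rel_u2 = 5.0e-24, 0.15   # cm^2 and assumed 15 % relative uncertainty
n_mc = 10_000

def n2_end(s1, s2):
    """Bateman solution for N2(t) with N1(0) = 1 and N2(0) = 0."""
    return s1 / (s2 - s1) * (np.exp(-s1 * phi * t) - np.exp(-s2 * phi * t))

# Sample the cross sections (lognormal keeps them positive), then propagate
s1 = sigma1 * np.exp(rng.normal(0.0, rel_u1, n_mc))
s2 = sigma2 * np.exp(rng.normal(0.0, rel_u2, n_mc))
samples = n2_end(s1, s2)

print(f"N2(t): mean = {samples.mean():.4e}, relative std = {samples.std() / samples.mean():.1%}")
```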

  20. A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.

    2012-08-01

    Integrating economic and groundwater models for groundwater management can help improve understanding of the trade-offs between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and of decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used, including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling, and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.

  1. Sensitivity and Uncertainty Analysis to Burnup Estimates on ADS using the ACAB Code

    SciTech Connect

    Cabellos, O.; Sanz, J.; Rodriguez, A.; Gonzalez, E.; Embid, M.; Alvarez, F.; Reyes, S.

    2005-05-24

    Within the scope of the Accelerator Driven System (ADS) concept for nuclear waste management applications, burnup uncertainty estimates due to uncertainty in the activation cross sections (XSs) are important for both the safety and the efficiency of the waste burning process. We have applied both sensitivity analysis and Monte Carlo methodology to actinide burnup calculations in a lead-bismuth cooled subcritical ADS. The sensitivity analysis is used to identify the reaction XSs and the dominant chains that contribute most significantly to the uncertainty. The Monte Carlo methodology gives the burnup uncertainty estimates due to the synergetic/global effect of the complete set of XS uncertainties. These uncertainty estimates are valuable for assessing the need for experimental or systematic re-evaluation of particular uncertain XSs for ADS.

  2. Approximate Uncertainty Modeling in Risk Analysis with Vine Copulas

    PubMed Central

    Bedford, Tim; Daneshkhah, Alireza

    2015-01-01

    Many applications of risk analysis require us to jointly model multiple uncertain quantities. Bayesian networks and copulas are two common approaches to modeling joint uncertainties with probability distributions. This article focuses on new methodologies for copulas by developing the work of Cooke, Bedford, Kurowicka, and others on vines as a way of constructing higher dimensional distributions that do not suffer from some of the restrictions of alternatives such as the multivariate Gaussian copula. The article provides a fundamental approximation result, demonstrating that we can approximate any density as closely as we like using vines. It further operationalizes this result by showing how minimum information copulas can be used to provide parametric classes of copulas that achieve such levels of approximation. We extend previous approaches using vines by considering nonconstant conditional dependencies, which are particularly relevant in financial risk modeling. We discuss how such models may be quantified, in terms of expert judgment or by fitting data, and illustrate the approach by modeling two financial data sets. PMID:26332240
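    A minimal sketch of the pair-copula (vine) construction for three uniform variables, using constant-parameter Gaussian pair copulas only; the article's contribution covers richer, nonconstant conditional dependencies, and the correlation values here are arbitrary illustrations:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

def h(u, v, rho):
    """Gaussian pair-copula h-function: conditional CDF of u given v."""
    return norm.cdf((norm.ppf(u) - rho * norm.ppf(v)) / np.sqrt(1 - rho ** 2))

def h_inv(w, v, rho):
    """Inverse of the h-function in its first argument."""
    return norm.cdf(norm.ppf(w) * np.sqrt(1 - rho ** 2) + rho * norm.ppf(v))

# Illustrative C-vine on (U1, U2, U3): pair copulas 12 and 13, plus conditional copula 23|1
rho12, rho13, rho23_1 = 0.6, 0.4, 0.3

n = 50_000
w = rng.uniform(size=(n, 3))
u1 = w[:, 0]
u2 = h_inv(w[:, 1], u1, rho12)
u3 = h_inv(h_inv(w[:, 2], h(u2, u1, rho12), rho23_1), u1, rho13)

# Map the uniforms to margins of interest (standard normal here, just to inspect correlations)
x = np.column_stack([norm.ppf(u1), norm.ppf(u2), norm.ppf(u3)])
print(np.corrcoef(x, rowvar=False).round(2))
```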

  3. Measurement uncertainty sources analysis for parasitic time grating sensors

    NASA Astrophysics Data System (ADS)

    Yang, Hongtao; Zhou, Jiao; Fan, Bin; Fei, Yetai; Peng, Donglin; Wu, Tianfeng

    2016-01-01

    The signal quality of the traveling wave, and hence the measurement accuracy of a parasitic time grating, can be improved by optimizing its structure. This optimization can be guided by building the electrical traveling wave equation from the sensor structure and the traveling wave signal generation principle. Based on Ansoft Maxwell simulations, the important electromagnetic parameters and the main uncertainty sources were analyzed and determined. In the simulations, parameters such as the excitation signal frequency, the gap width, the relative area of the probe, the number of coils, the excitation signal amplitude, and the core length were set to different values. The simulation results show that the excitation signal frequency, the gap width, and the relative area between the probe and the rotor are the major factors influencing the angular measurement accuracy of the parasitic time grating sensor, while the number of coils, the excitation signal amplitude, and the core length are secondary factors. These results can be used to optimize the structure of the parasitic time grating sensor and to correct measurement error.

  4. CASMO5/TSUNAMI-3D spent nuclear fuel reactivity uncertainty analysis

    SciTech Connect

    Ferrer, R.; Rhodes, J.; Smith, K.

    2012-07-01

    The CASMO5 lattice physics code is used in conjunction with the TSUNAMI-3D sequence in ORNL's SCALE 6 code system to estimate the uncertainties in hot-to-cold reactivity changes due to cross-section uncertainty for PWR assemblies at various burnup points. The goal of the analysis is to establish the multiplication factor uncertainty similarity between various fuel assemblies at different conditions in a quantifiable manner and to obtain a bound on the hot-to-cold reactivity uncertainty over the various assembly types and burnup attributed to fundamental cross-section data uncertainty. (authors)

  5. Uncertainty Analysis of NASA Glenn's 8- by 6-Foot Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Stephens, Julia E.; Hubbard, Erin P.; Walter, Joel A.; McElroy, Tyler

    2016-01-01

    An analysis was performed to determine the measurement uncertainty of the Mach number in the 8- by 6-foot Supersonic Wind Tunnel at the NASA Glenn Research Center. This paper details the analysis process used, including methods for handling limited data and complicated data correlations. Due to the complexity of the equations involved, a Monte Carlo method was utilized for this uncertainty analysis. A summary of the findings is presented, covering what the uncertainties are, how they affect various research tests in the facility, and how they might be reduced in the future.
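    A hedged illustration of the Monte Carlo approach for a single operating point, using the standard isentropic Mach relation from total and static pressure. The pressure values and standard uncertainties are invented for the sketch and are not the facility's instrumentation data:

```python
import numpy as np

rng = np.random.default_rng(3)
gamma = 1.4
n_mc = 100_000

# Illustrative measured values and standard uncertainties (not facility data)
p_total = rng.normal(101_325.0, 150.0, n_mc)     # total pressure, Pa
p_static = rng.normal(19_400.0, 80.0, n_mc)      # static pressure, Pa

# Isentropic relation: M = sqrt( 2/(gamma-1) * ((p0/p)^((gamma-1)/gamma) - 1) )
mach = np.sqrt(2.0 / (gamma - 1.0)
               * ((p_total / p_static) ** ((gamma - 1.0) / gamma) - 1.0))

mean, std = mach.mean(), mach.std(ddof=1)
lo, hi = np.percentile(mach, [2.5, 97.5])
print(f"Mach = {mean:.4f} +/- {std:.4f} (95 % interval {lo:.4f} to {hi:.4f})")
```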

  6. ANALYSIS OF MEASUREMENT UNCERTAINTIES IN THE NULLING TEST FOR AIR LEAKAGE FROM RESIDENTIAL DUCTS.

    SciTech Connect

    ANDREWS,J.W.

    2001-04-01

    An analysis of measurement uncertainties in a recently proposed method of measuring air leakage in residential duct systems has been carried out. The uncertainties in supply and return leakage rates are expressed in terms of the value of the envelope leakage flow coefficient and the uncertainties in measured pressures and air flow rates. Results of the analysis are compared with data published by two research groups.

  7. Analysis and reduction of chemical models under uncertainty.

    SciTech Connect

    Oxberry, Geoff; Debusschere, Bert J.; Najm, Habib N.

    2008-08-01

    While models of combustion processes have been successful in developing engines with improved fuel economy, more costly simulations are required to accurately model pollution chemistry. These simulations will also involve significant parametric uncertainties. Computational singular perturbation (CSP) and polynomial chaos-uncertainty quantification (PC-UQ) can be used to mitigate the additional computational cost of modeling combustion with uncertain parameters. PC-UQ was used to interrogate and analyze the Davis-Skodje model, where the deterministic parameter in the model was replaced with an uncertain parameter. In addition, PC-UQ was combined with CSP to explore how model reduction could be combined with uncertainty quantification to understand how reduced models are affected by parametric uncertainty.
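    A small sketch of the polynomial chaos idea on its own, with the Davis-Skodje system replaced by a toy scalar response of one uncertain standard-normal parameter; the coefficients are computed by non-intrusive projection with Gauss-Hermite quadrature. Everything here is an illustrative stand-in, not the study's model:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi

# Toy model response with one uncertain parameter xi ~ N(0, 1)
def model(xi):
    return np.exp(0.3 * xi)                       # stand-in for an expensive chemistry run

order = 6                                         # PCE truncation order
nodes, weights = hermegauss(order + 4)            # probabilists' Gauss-Hermite rule
weights = weights / sqrt(2.0 * pi)                # normalize to a standard-normal expectation

y = model(nodes)
coeffs = np.array([np.sum(weights * y * hermeval(nodes, np.eye(order + 1)[k]))
                   / factorial(k) for k in range(order + 1)])

pce_mean = coeffs[0]
pce_var = sum(coeffs[k] ** 2 * factorial(k) for k in range(1, order + 1))
print(f"PCE mean = {pce_mean:.4f}, PCE std = {np.sqrt(pce_var):.4f}")
# exact lognormal values for comparison: mean ~ 1.0460, std ~ 0.3210
```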

  8. Problem Solving Environment for Uncertainty Analysis and Design Exploration

    Energy Science and Technology Software Center (ESTSC)

    2009-09-02

    PSUADE is a software system used to study the relationships between the inputs and outputs of general simulation models for the purpose of performing uncertainty and sensitivity analyses on those models.

  9. Problem Solving Environment for Uncertainty Analysis and Design Exploration

    Energy Science and Technology Software Center (ESTSC)

    2011-10-26

    PSUADE is a software system used to study the relationships between the inputs and outputs of general simulation models for the purpose of performing uncertainty and sensitivity analyses on those models.

  10. Problem Solving Environment for Uncertainty Analysis and Design Exploration

    Energy Science and Technology Software Center (ESTSC)

    2008-05-31

    PSUADE is a software system used to study the relationships between the inputs and outputs of general simulation models for the purpose of performing uncertainty and sensitivity analyses on those models.

  11. Methodology for Uncertainty Analysis of Dynamic Computational Toxicology Models

    EPA Science Inventory

    The task of quantifying the uncertainty in both parameter estimates and model predictions has become more important with the increased use of dynamic computational toxicology models by the EPA. Dynamic toxicological models include physiologically-based pharmacokinetic (PBPK) mode...

  12. INCORPORATING UNCERTAINTY ANALYSIS INTO INTEGRATED AIR QUALITY PLANNING

    EPA Science Inventory

    The proposed research will develop methods by which air quality planners could formally consider the uncertainty of models that inform control strategy development. We will identify key photochemical model inputs, epidemiological parameters, and other assumptions that most in...

  13. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1988-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center over the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach was developed to quantify the effects of the random uncertainties. The results indicate that only the variations in geometry have significant effects.

  14. Model parameter uncertainty analysis for an annual field-scale P loss model

    NASA Astrophysics Data System (ADS)

    Bolster, Carl H.; Vadas, Peter A.; Boykin, Debbie

    2016-08-01

    Phosphorus (P) fate and transport models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. Because all models are simplifications of complex systems, there is an inherent amount of uncertainty associated with their predictions. It is therefore important that efforts be directed at identifying, quantifying, and communicating the different sources of model uncertainties. In this study, we conducted an uncertainty analysis with the Annual P Loss Estimator (APLE) model. Our analysis included calculating parameter uncertainties and confidence and prediction intervals for five internal regression equations in APLE. We also estimated uncertainties of the model input variables based on values reported in the literature. We then predicted P loss for a suite of fields under different management and climatic conditions while accounting for uncertainties in the model parameters and inputs, and compared the relative contributions of these two sources of uncertainty to the overall uncertainty associated with predictions of P loss. Both the overall magnitude of the prediction uncertainties and the relative contributions of the two sources of uncertainty varied depending on management practices and field characteristics. This was due to differences in the number of model input variables and the uncertainties in the regression equations associated with each P loss pathway. Inspection of the uncertainties in the five regression equations brought attention to a previously unrecognized limitation with the equation used to partition surface-applied fertilizer P between leaching and runoff losses. As a result, an alternate equation was identified that provided similar predictions with much less uncertainty. Our results demonstrate how a thorough uncertainty and model residual analysis can be used to identify limitations with a model. Such insight can then be used to guide future data collection and model

  15. Uncertainty Analysis of the Single-Vector Force Balance Calibration System

    NASA Technical Reports Server (NTRS)

    Parker, Peter A.; Liu, Tianshu

    2002-01-01

    This paper presents an uncertainty analysis of the Single-Vector Force Balance Calibration System (SVS). This study is focused on the uncertainty involved in setting the independent variables during the calibration experiment. By knowing the uncertainty in the calibration system, the fundamental limits of the calibration accuracy of a particular balance can be determined. A brief description of the SVS mechanical system is provided. A mathematical model is developed to describe the mechanical system elements. A sensitivity analysis of these parameters is carried out through numerical simulations to assess the sensitivity of the total uncertainty to the elemental error sources. These sensitivity coefficients provide valuable information regarding the relative significance of the elemental sources of error. An example calculation of the total uncertainty for a specific balance is provided. Results from this uncertainty analysis are specific to the Single-Vector System, but the approach is broad in nature and therefore applicable to other measurement and calibration systems.

  16. A POTENTIAL APPLICATION OF UNCERTAINTY ANALYSIS TO DOE-STD-3009-94 ACCIDENT ANALYSIS

    SciTech Connect

    Palmrose, D E; Yang, J M

    2007-05-10

    The objective of this paper is to assess proposed transuranic waste accident analysis guidance and recent software improvements in a Windows-OS version of MACCS2 that allows parameter uncertainty to be input. With this guidance and code capability, there is the potential to perform a quantitative uncertainty assessment of unmitigated accident releases with respect to the 25 rem Evaluation Guideline (EG) of DOE-STD-3009-94 CN3 (STD-3009). Historically, the classification of safety systems in a U.S. Department of Energy (DOE) nuclear facility's safety basis has depended on how subject matter experts qualitatively view uncertainty in the STD-3009 Appendix A accident analysis methodology, specifically, whether consequence uncertainty could be larger than previously evaluated such that site-specific accident consequences might challenge the EG. This paper assesses whether a potential uncertainty capability for MACCS2 could provide a stronger technical basis as to when the consequences from a design basis accident (DBA) truly challenge the 25 rem EG.

  17. LCOE Uncertainty Analysis for Hydropower using Monte Carlo Simulations

    SciTech Connect

    Chalise, Dol Raj; O'Connor, Patrick W; DeNeale, Scott T; Uria Martinez, Rocio; Kao, Shih-Chieh

    2015-01-01

    Levelized Cost of Energy (LCOE) is an important metric for evaluating the cost and performance of electricity generation alternatives and, combined with other measures, can be used to assess the economics of future hydropower development. Calculating the LCOE requires multiple assumptions about input parameters, each of which carries some level of uncertainty, in turn affecting the accuracy of LCOE results. This paper explores these uncertainties, their sources, and ultimately the level of variability they introduce at the screening level of project evaluation for non-powered dams (NPDs) across the U.S. Owing to site-specific differences in design, the LCOE for hydropower varies significantly from project to project, unlike technologies with more standardized configurations such as wind and gas. Therefore, to assess the impact of LCOE input uncertainty on the economics of U.S. hydropower resources, these uncertainties must be modeled across the population of potential opportunities. To demonstrate the impact of uncertainty, resource data from a recent nationwide non-powered dam (NPD) resource assessment (Hadjerioua et al., 2012) and screening-level predictive cost equations (O'Connor et al., 2015) are used to quantify and evaluate uncertainties in project capital and operations & maintenance costs, and generation potential at broad scale. LCOE dependence on financial assumptions is also evaluated on a sensitivity basis to explore ownership/investment implications on project economics for the U.S. hydropower fleet. The results indicate that the LCOE for U.S. NPDs varies substantially. The LCOE estimates for the potential NPD projects of capacity greater than 1 MW range from 40 to 182 $/MWh, with an average of 106 $/MWh. 4,000 MW could be developed through projects with individual LCOE values below 100 $/MWh. The results also indicate that typically 90 % of LCOE uncertainty can be attributed to uncertainties in capital costs and energy production; however
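    A minimal sketch of a screening-level Monte Carlo of this kind for a single candidate site, using the standard capital-recovery form of the LCOE. All distributions, cost figures, and financing assumptions below are illustrative placeholders, not the report's cost equations or resource data:

```python
import numpy as np

rng = np.random.default_rng(4)
n_mc = 50_000

# Illustrative input distributions for one candidate non-powered dam (not report values)
capex = rng.lognormal(mean=np.log(60e6), sigma=0.30, size=n_mc)      # installed cost, $
fixed_om = rng.lognormal(mean=np.log(1.2e6), sigma=0.20, size=n_mc)  # fixed O&M, $/yr
energy = rng.normal(90_000.0, 9_000.0, n_mc)                         # generation, MWh/yr

rate, years = 0.07, 30                                                # financing assumptions
crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)          # capital recovery factor

lcoe = (crf * capex + fixed_om) / energy                              # $/MWh
print(f"LCOE: median = {np.median(lcoe):.0f} $/MWh, "
      f"90 % interval = {np.percentile(lcoe, 5):.0f}-{np.percentile(lcoe, 95):.0f} $/MWh")
```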

  18. Uncertainty analysis of the AEDC 7V chamber

    NASA Astrophysics Data System (ADS)

    Crider, Dustin; Lowry, Heard; Nicholson, Randy; Mead, Kimberly

    2005-05-01

    For over 30 years, the Space Systems Test Facility and space chambers at the Arnold Engineering Development Center (AEDC) have been used to perform space sensor characterization, calibration, and mission simulation testing of space-based, interceptor, and airborne sensors. In partnership with the Missile Defense Agency (MDA), capability upgrades are continuously pursued to keep pace with evolving sensor technologies. Upgrades to sensor test facilities require rigorous facility characterization and calibration activities that are part of AEDC's annual activities to comply with Major Range Test Facility Base processes to ensure quality metrology and test data. This paper discusses the ongoing effort to characterize and quantify Aerospace Chamber 7V measurement uncertainties. The 7V Chamber is a state-of-the-art cryogenic/vacuum facility providing calibration and high-fidelity mission simulation for infrared seekers and sensors against a low-infrared background. One of its key features is the high fidelity of the radiometric calibration process. Calibration of the radiometric sources used is traceable to the National Institute of Standards and Technology and provides relative uncertainties on the order of two to three percent, based on measurement data acquired during many test periods. Three types of sources of measurement error and top-level uncertainties have been analyzed; these include radiometric calibration, target position, and spectral output. The approach used and presented is to quantify uncertainties of each component in the optical system and then build uncertainty diagrams and easily updated databases to detail the uncertainty for each optical system. The formalism, equations, and corresponding analyses are provided to help describe how the specific quantities are derived and currently used. This paper presents the uncertainty methodology used and current results.

  19. Sensitivity and uncertainty analysis applied to the JHR reactivity prediction

    SciTech Connect

    Leray, O.; Vaglio-Gaudard, C.; Hudelot, J. P.; Santamarina, A.; Noguere, G.; Di-Salvo, J.

    2012-07-01

    The on-going AMMON program in the EOLE reactor at CEA Cadarache (France) provides experimental results to qualify the HORUS-3D/N neutronics calculation scheme used for the design and safety studies of the new Material Testing Jules Horowitz Reactor (JHR). This paper presents the determination of technological and nuclear data uncertainties on the core reactivity and the propagation of the latter from the AMMON experiment to JHR. The technological uncertainty propagation was performed with a direct perturbation methodology using the 3D French stochastic code TRIPOLI4 and a statistical methodology using the 2D French deterministic code APOLLO2-MOC, which leads to a value of 289 pcm (1σ). The nuclear data uncertainty propagation relies on a sensitivity study on the main isotopes and the use of a retroactive marginalization method applied to the JEFF 3.1.1 ²⁷Al evaluation in order to obtain a realistic multi-group covariance matrix associated with the considered evaluation. This nuclear data uncertainty propagation leads to a keff uncertainty of 624 pcm for the JHR core and 684 pcm for the AMMON reference configuration core. Finally, transposition and reduction of the prior uncertainty were made using the representativity method, which demonstrates the similarity of the AMMON experiment with JHR (the representativity factor is 0.95). The final impact of JEFF 3.1.1 nuclear data on the Begin Of Life (BOL) JHR reactivity calculated by HORUS-3D/N V4.0 is a bias of +216 pcm with an associated posterior uncertainty of 304 pcm (1σ). (authors)

  20. Uncertainty Analysis of RELAP5-3D

    SciTech Connect

    Alexandra E Gertman; Dr. George L Mesina

    2012-07-01

    As world-wide energy consumption continues to increase, so does the demand for the use of alternative energy sources, such as nuclear energy. Nuclear power plants currently supply over 370 gigawatts of electricity, and more than 60 new nuclear reactors have been commissioned by 15 different countries. The primary concern for nuclear power plant operation and licensing has been safety. The safe operation of nuclear power plants is no simple matter; it involves the training of operators and the design of the reactor, as well as equipment and design upgrades throughout the lifetime of the reactor. To safely design, operate, and understand nuclear power plants, industry and government alike have relied upon the use of best-estimate simulation codes, which allow an accurate model of any given plant to be created with well-defined margins of safety. The most widely used of these best-estimate simulation codes in the nuclear power industry is RELAP5-3D. Our project focused on improving the modeling capabilities of RELAP5-3D by developing uncertainty estimates for its calculations. This work involved analyzing high-, medium-, and low-ranked phenomena from an INL PIRT on a small break Loss-Of-Coolant Accident, as well as an analysis of a large break Loss-Of-Coolant Accident. Statistical analyses were performed using correlation coefficients. To perform the studies, computer programs were written that modify a template RELAP5 input deck to produce one deck for each combination of key input parameters. Python scripting enabled the running of the generated input files with RELAP5-3D on INL's massively parallel cluster system. Data from the studies were collected and analyzed with SAS. A summary of the results of our studies is presented.

  1. NASTRAN variance analysis and plotting of HBDY elements. [analysis of uncertainties of the computer results as a function of uncertainties in the input data

    NASA Technical Reports Server (NTRS)

    Harder, R. L.

    1974-01-01

    The NASTRAN Thermal Analyzer has been extended to perform variance analysis and to plot the thermal boundary (HBDY) elements. The objective of the variance analysis addition is to assess the sensitivity of temperature variances resulting from uncertainties inherent in input parameters for heat conduction analysis. The plotting capability provides the ability to check the geometry (location, size, and orientation) of the boundary elements of a model in relation to the conduction elements. Variance analysis is the study of uncertainties in the computed results as a function of uncertainties in the input data. To study this problem using NASTRAN, a solution is made for the expected values of all inputs, plus another solution for each uncertain variable. A variance analysis module subtracts the results to form derivatives, and then determines the expected deviations of the output quantities.
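    A hedged sketch of the same scheme outside NASTRAN: one solution at the expected inputs, one extra solution per uncertain input, differences forming derivatives, and a first-order estimate of the output deviation. The toy thermal model and all numbers are invented for illustration:

```python
import numpy as np

def model(params):
    """Stand-in for a thermal solution returning one output temperature rise (K)."""
    k, h, q = params                       # conductivity, film coefficient, heat load
    return q / h + q * 0.05 / k            # toy steady-state response

nominal = np.array([50.0, 20.0, 1.0e3])    # expected input values (assumed)
std_dev = np.array([5.0, 3.0, 50.0])       # input standard deviations (assumed)

base = model(nominal)
variance = 0.0
for i, sigma in enumerate(std_dev):
    perturbed = nominal.copy()
    perturbed[i] += sigma                  # one extra solution per uncertain variable
    derivative = (model(perturbed) - base) / sigma
    variance += (derivative * sigma) ** 2  # first-order (linear) contribution

print(f"T = {base:.1f} K, expected deviation = {np.sqrt(variance):.1f} K")
```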

  2. Principles and applications of measurement and uncertainty analysis in research and calibration

    SciTech Connect

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

  3. Principles and applications of measurement and uncertainty analysis in research and calibration

    SciTech Connect

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

  4. Uncertainty and Sensitivity Analysis in Performance Assessment for the Waste Isolation Pilot Plant

    SciTech Connect

    Helton, J.C.

    1998-12-17

    The Waste Isolation Pilot Plant (WIPP) is under development by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. This development has been supported by a sequence of performance assessments (PAs) carried out by Sandia National Laboratories (SNL) to assess what is known about the WIPP and to provide guidance for future DOE research and development activities. Uncertainty and sensitivity analysis procedures based on Latin hypercube sampling and regression techniques play an important role in these PAs by providing an assessment of the uncertainty in important analysis outcomes and identifying the sources of this uncertainty. Performance assessments for the WIPP are conceptually and computationally interesting due to regulatory requirements to assess and display the effects of both stochastic (i.e., aleatory) and subjective (i.e., epistemic) uncertainty, where stochastic uncertainty arises from the possible disruptions that could occur over the 10,000 yr regulatory period associated with the WIPP and subjective uncertainty arises from an inability to unambiguously characterize the many models and associated parameters required in a PA for the WIPP. The interplay between uncertainty analysis, sensitivity analysis, stochastic uncertainty, and subjective uncertainty is discussed and illustrated in the context of a recent PA carried out by SNL to support an application by the DOE to the U.S. Environmental Protection Agency for the certification of the WIPP for the disposal of TRU waste.

  5. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Convergence of the Uncertainty Results

    SciTech Connect

    Bixler, Nathan E.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie; Eckert-Gallup, Aubrey Celia; Mattie, Patrick D.; Ghosh, S. Tina

    2014-02-01

    This paper describes the convergence of MELCOR Accident Consequence Code System, Version 2 (MACCS2) probabilistic results of offsite consequences for the uncertainty analysis of the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout scenario at the Peach Bottom Atomic Power Station. The consequence metrics evaluated are individual latent-cancer fatality (LCF) risk and individual early fatality risk. Consequence results are presented as conditional risk (i.e., assuming the accident occurs, risk per event) to individuals of the public as a result of the accident. In order to verify convergence for this uncertainty analysis, as recommended by the Nuclear Regulatory Commission’s Advisory Committee on Reactor Safeguards, a ‘high’ source term from the original population of Monte Carlo runs has been selected to be used for: (1) a study of the distribution of consequence results stemming solely from epistemic uncertainty in the MACCS2 parameters (i.e., separating the effect from the source term uncertainty), and (2) a comparison between Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) in order to validate the original results obtained with LHS. Three replicates (each using a different random seed) of size 1,000 each using LHS and another set of three replicates of size 1,000 using SRS are analyzed. The results show that the LCF risk results are well converged with either LHS or SRS sampling. The early fatality risk results are less well converged at radial distances beyond 2 miles, and this is expected due to the sparse data (predominance of “zero” results).
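    A small sketch of the sampling comparison itself: Latin hypercube versus simple random sampling for the mean of a toy response, repeated over replicates so the spread of the replicate means can be compared. The toy model and sample sizes are invented and do not represent MACCS2 or the SOARCA parameter set:

```python
import numpy as np

rng = np.random.default_rng(5)

def consequence(x):
    """Toy nonlinear consequence model of three uniform(0,1) inputs."""
    return np.exp(x[:, 0]) * (1.0 + 0.5 * x[:, 1] ** 2) + 0.2 * x[:, 2]

def srs(n, d):
    return rng.uniform(size=(n, d))

def lhs(n, d):
    # One stratum per sample in each dimension, jittered within the stratum, then permuted
    cut = (np.arange(n)[:, None] + rng.uniform(size=(n, d))) / n
    for j in range(d):
        cut[:, j] = rng.permutation(cut[:, j])
    return cut

n, d, replicates = 1000, 3, 50
means_srs = [consequence(srs(n, d)).mean() for _ in range(replicates)]
means_lhs = [consequence(lhs(n, d)).mean() for _ in range(replicates)]

print(f"spread of the estimated mean: SRS {np.std(means_srs):.5f}, "
      f"LHS {np.std(means_lhs):.5f}")   # LHS replicates typically agree more closely
```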

  6. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  7. Fuzzy-algebra uncertainty analysis for abnormal-environment safety assessment

    SciTech Connect

    Cooper, J.A.

    1994-01-01

    Many safety (risk) analyses depend on uncertain inputs and on mathematical models chosen from various alternatives, but give fixed results (implying no uncertainty). Conventional uncertainty analyses help, but are also based on assumptions and models, the accuracy of which may be difficult to assure. Some of the models and assumptions that on cursory examination seem reasonable can be misleading. As a result, quantitative assessments, even those accompanied by uncertainty measures, can give unwarranted impressions of accuracy. Since analysis results can be a major contributor to a safety-measure decision process, risk management depends on relating uncertainty to only the information available. The uncertainties due to abnormal environments are even more challenging than those in normal-environment safety assessments, and therefore require an even more cautious approach. A fuzzy algebra analysis is proposed in this report that has the potential to appropriately reflect the information available and portray uncertainties well, especially for abnormal environments.
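    The abstract does not spell out the report's particular fuzzy-algebra formulation, but alpha-cut interval propagation of triangular fuzzy numbers is one standard way such an analysis can be operationalized. The sketch below uses that generic approach with arbitrary placeholder quantities, not the report's safety measures:

```python
import numpy as np

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (low, mode, high) at membership level alpha."""
    low, mode, high = tri
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def mul(a, b):
    """Interval multiplication (valid in general via endpoint products)."""
    products = [a[i] * b[j] for i in range(2) for j in range(2)]
    return (min(products), max(products))

# Two uncertain inputs expressed as triangular fuzzy numbers (illustrative values only)
failure_rate = (1e-4, 5e-4, 2e-3)     # per demand
demands = (5.0, 10.0, 30.0)           # demands per year

for alpha in (0.0, 0.5, 1.0):
    interval = mul(alpha_cut(failure_rate, alpha), alpha_cut(demands, alpha))
    print(f"alpha = {alpha:.1f}: expected failures/yr in "
          f"[{interval[0]:.2e}, {interval[1]:.2e}]")
```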

  8. RECONSTRUCTING EXPOSURE SCENARIOS USING DOSE BIOMARKERS - AN APPLICATION OF BAYESIAN UNCERTAINTY ANALYSIS

    EPA Science Inventory

    We use Bayesian uncertainty analysis to explore how to estimate pollutant exposures from biomarker concentrations. The growing number of national databases with exposure data makes such an analysis possible. They contain datasets of pharmacokinetic biomarkers for many polluta...

  9. New approaches to uncertainty analysis for use in aggregate and cumulative risk assessment of pesticides.

    PubMed

    Kennedy, Marc C; van der Voet, Hilko; Roelofs, Victoria J; Roelofs, Willem; Glass, C Richard; de Boer, Waldo J; Kruisselbrink, Johannes W; Hart, Andy D M

    2015-05-01

    Risk assessments for human exposures to plant protection products (PPPs) have traditionally focussed on single routes of exposure and single compounds. Extensions to estimate aggregate (multi-source) and cumulative (multi-compound) exposure from PPPs present many new challenges and additional uncertainties that should be addressed as part of risk analysis and decision-making. A general approach is outlined for identifying and classifying the relevant uncertainties and variabilities. The implementation of uncertainty analysis within the MCRA software, developed as part of the EU-funded ACROPOLIS project to address some of these uncertainties, is demonstrated. An example is presented for dietary and non-dietary exposures to the triazole class of compounds. This demonstrates the chaining of models, linking variability and uncertainty generated from an external model for bystander exposure with variability and uncertainty in MCRA dietary exposure assessments. A new method is also presented for combining pesticide usage survey information with limited residue monitoring data, to address non-detect uncertainty. The results show that incorporating usage information reduces uncertainty in parameters of the residue distribution but that in this case quantifying uncertainty is not a priority, at least for UK grown crops. A general discussion of alternative approaches to treat uncertainty, either quantitatively or qualitatively, is included. PMID:25688423

  10. Design and Uncertainty Analysis for a PVTt Gas Flow Standard

    PubMed Central

    Wright, John D.; Johnson, Aaron N.; Moldover, Michael R.

    2003-01-01

    A new pressure, volume, temperature, and time (PVTt) primary gas flow standard at the National Institute of Standards and Technology has an expanded uncertainty (k = 2) of between 0.02 % and 0.05 %. The standard spans the flow range of 1 L/min to 2000 L/min using two collection tanks and two diverter valve systems. The standard measures flow by collecting gas in a tank of known volume during a measured time interval. We describe the significant and novel features of the standard and analyze its uncertainty. The gas collection tanks have a small diameter and are immersed in a uniform, stable, thermostatted water bath. The collected gas achieves thermal equilibrium rapidly and the uncertainty of the average gas temperature is only 7 mK (22 × 10−6 T). A novel operating method leads to essentially zero mass change in and very low uncertainty contributions from the inventory volume. Gravimetric and volume expansion techniques were used to determine the tank and the inventory volumes. Gravimetric determinations of collection tank volume made with nitrogen and argon agree with a standard deviation of 16 × 10−6 VT. The largest source of uncertainty in the flow measurement is drift of the pressure sensor over time, which contributes a relative standard uncertainty of 60 × 10−6 to the determinations of the volumes of the collection tanks and to the flow measurements. Throughout the range 3 L/min to 110 L/min, flows were measured independently using the 34 L and the 677 L collection systems, and the two systems agreed within a relative difference of 150 × 10−6. Double diversions were used to evaluate the 677 L system over a range of 300 L/min to 1600 L/min, and the relative differences between single and double diversions were less than 75 × 10−6. PMID:27413592

  11. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1987-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center over the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach has been developed to quantify the effects of the random uncertainties. The results of this study indicate that only the variations in geometry have significant effects.

  12. Population Uncertainty in Model Ecosystem: Analysis by Stochastic Differential Equation

    NASA Astrophysics Data System (ADS)

    Morita, Satoru; Tainaka, Kei-ichi; Nagata, Hiroyasu; Yoshimura, Jin

    2008-09-01

    Perturbation experiments are carried out by the numerical simulations of a contact process and its mean-field version. Here, the mortality rate increases or decreases suddenly. It is known that fluctuation enhancement (FE) occurs after perturbation, where FE indicates population uncertainty. In the present paper, we develop a new theory of stochastic differential equation. The agreement between the theory and the mean-field simulation is almost perfect. This theory enables us to find a much stronger FE than that reported previously. We discuss the population uncertainty in the recovering process of endangered species.

  13. Thorough approach to measurement uncertainty analysis applied to immersed heat exchanger testing

    SciTech Connect

    Farrington, R B; Wells, C V

    1986-04-01

    This paper discusses the value of an uncertainty analysis, discusses how to determine measurement uncertainty, and then details the sources of error in instrument calibration, data acquisition, and data reduction for a particular experiment. Methods are discussed to determine both the systematic (or bias) error and the random (or precision) error in an experiment. The detailed analysis is applied to two sets of conditions in measuring the effectiveness of an immersed coil heat exchanger. It shows the value of such an analysis as well as an approach to reduce overall measurement uncertainty and to improve the experiment. This paper outlines how to perform an uncertainty analysis and then provides a detailed example of how to apply the methods discussed in the paper. The authors hope this paper will encourage researchers and others to become more concerned with their measurement processes and to report measurement uncertainty with all of their test results.

  14. Measurement uncertainty analysis of low-dose-rate prostate seed brachytherapy: post-implant dosimetry.

    PubMed

    Gregory, Kent J; Pattison, John E; Bibbo, Giovanni

    2015-03-01

    The minimal dose covering 90 % of the prostate volume, D90, is arguably the most important dosimetric parameter in low-dose-rate prostate seed brachytherapy. In this study an analysis of the measurement uncertainties in D90 from low-dose-rate prostate seed brachytherapy was conducted for two common treatment procedures with two different post-implant dosimetry methods. The analysis was undertaken in order to determine the magnitude of D90 uncertainty, how the magnitude of the uncertainty varied when D90 was calculated using different dosimetry methods, and which factors were the major contributors to the uncertainty. The analysis considered the prostate as being homogeneous and tissue equivalent and made use of published data, as well as original data collected specifically for this analysis, and was performed according to the Guide to the expression of uncertainty in measurement (GUM). It was found that when prostate imaging and seed implantation were conducted in two separate sessions using only CT images for post-implant analysis, the expanded uncertainty in D90 values was about 25 % at the 95 % confidence interval. When prostate imaging and seed implantation were conducted during a single session using CT and ultrasound images for post-implant analysis, the expanded uncertainty in D90 values was about 33 %. Methods for reducing these uncertainty levels are discussed. It was found that variations in contouring the target tissue made the largest contribution to D90 uncertainty, while the uncertainty in seed source strength made only a small contribution. It is important that clinicians appreciate the overall magnitude of D90 uncertainty and understand the factors that affect it so that clinical decisions are soundly based, and resources are appropriately allocated. PMID:25555753

  15. Uncertainty Analysis in Large Area Aboveground Biomass Mapping

    NASA Astrophysics Data System (ADS)

    Baccini, A.; Carvalho, L.; Dubayah, R.; Goetz, S. J.; Friedl, M. A.

    2011-12-01

    Satellite and aircraft-based remote sensing observations are increasingly being used to generate spatially explicit estimates of the aboveground carbon stock of forest ecosystems. Because deforestation and forest degradation account for roughly 10% of anthropogenic carbon emissions to the atmosphere, policy mechanisms that reduce these emissions are increasingly recognized as a low-cost mitigation option. They are, however, contingent upon the capacity to accurately measure carbon stored in forests. Here we examine the sources of uncertainty and error propagation in generating maps of aboveground biomass. We focus on characterizing uncertainties associated with maps at the pixel and spatially aggregated national scales. We pursue three strategies to describe the error and uncertainty properties of aboveground biomass maps: (1) model-based assessment using confidence intervals derived from linear regression methods; (2) data-mining algorithms such as regression trees and ensembles of these; and (3) empirical assessments using independently collected data sets. The latter effort explores error propagation using field data acquired within satellite-based lidar (GLAS) acquisitions versus alternative in situ methods that rely upon field measurements that have not been systematically collected for this purpose (e.g., from forest inventory data sets). A key goal of our effort is to provide multi-level characterizations that provide both pixel and biome-level estimates of uncertainties at different scales.

  16. Uncertainty analysis for a field-scale P loss model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study we assessed the effect of model input error on predic...

  17. A probability density function method for acoustic field uncertainty analysis

    NASA Astrophysics Data System (ADS)

    James, Kevin R.; Dowling, David R.

    2005-11-01

    Acoustic field predictions, whether analytical or computational, rely on knowledge of the environmental, boundary, and initial conditions. When knowledge of these conditions is uncertain, acoustic field predictions will also be uncertain, even if the techniques for field prediction are perfect. Quantifying acoustic field uncertainty is important for applications that require accurate field amplitude and phase predictions, like matched-field techniques for sonar, nondestructive evaluation, bio-medical ultrasound, and atmospheric remote sensing. Drawing on prior turbulence research, this paper describes how an evolution equation for the probability density function (PDF) of the predicted acoustic field can be derived and used to quantify predicted-acoustic-field uncertainties arising from uncertain environmental, boundary, or initial conditions. Example calculations are presented in one and two spatial dimensions for the one-point PDF for the real and imaginary parts of a harmonic field, and show that predicted field uncertainty increases with increasing range and frequency. In particular, at 500 Hz in an ideal 100 m deep underwater sound channel with a 1 m root-mean-square depth uncertainty, the PDF results presented here indicate that at a range of 5 km, all phases and a 10 dB range of amplitudes will have non-negligible probability. Evolution equations for the two-point PDF are also derived.

  18. Uncertainty analysis and robust trajectory linearization control of a flexible air-breathing hypersonic vehicle

    NASA Astrophysics Data System (ADS)

    Pu, Zhiqiang; Tan, Xiangmin; Fan, Guoliang; Yi, Jianqiang

    2014-08-01

    Flexible air-breathing hypersonic vehicles feature significant uncertainties which pose huge challenges to robust controller designs. In this paper, four major categories of uncertainties are analyzed, that is, uncertainties associated with flexible effects, aerodynamic parameter variations, external environmental disturbances, and control-oriented modeling errors. A uniform nonlinear uncertainty model is explored for the first three uncertainties which lumps all uncertainties together and consequently is beneficial for controller synthesis. The fourth uncertainty is additionally considered in stability analysis. Based on these analyses, the starting point of the control design is to decompose the vehicle dynamics into five functional subsystems. Then a robust trajectory linearization control (TLC) scheme consisting of five robust subsystem controllers is proposed. In each subsystem controller, TLC is combined with the extended state observer (ESO) technique for uncertainty compensation. The stability of the overall closed-loop system with the four aforementioned uncertainties and additional singular perturbations is analyzed. Particularly, the stability of nonlinear ESO is also discussed from a Liénard system perspective. At last, simulations demonstrate the great control performance and the uncertainty rejection ability of the robust scheme.

  19. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    SciTech Connect

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J. PhD.; Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.
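    A short sketch of the first few steps of such a sampling-based study: characterize and sample the uncertain inputs, propagate the sample, then present uncertainty results and simple correlation-based sensitivity measures after a rank transformation. The input distributions and response are invented placeholders, and the listed procedures go well beyond this minimal version:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(6)
n = 2_000

# (1)-(2) characterize and sample epistemically uncertain inputs (illustrative choices)
x = np.column_stack([
    rng.uniform(0.5, 1.5, n),        # x1
    rng.lognormal(0.0, 0.4, n),      # x2
    rng.normal(10.0, 2.0, n),        # x3
])

# (3) propagate the sample through the analysis (toy response)
y = x[:, 0] ** 2 * x[:, 1] + 0.1 * x[:, 2]

# (4)-(5) present uncertainty and sensitivity results
print(f"y: mean {y.mean():.2f}, 5th-95th percentile "
      f"[{np.percentile(y, 5):.2f}, {np.percentile(y, 95):.2f}]")
for j in range(x.shape[1]):
    rho, _ = spearmanr(x[:, j], y)
    print(f"rank correlation of x{j + 1} with y: {rho:+.2f}")
```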

  20. Model parameter uncertainty analysis for an annual field-scale phosphorus loss model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Phosphorus (P) loss models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. All P loss models, however, have an inherent amount of uncertainty associated with them. In this study, we conducted an uncertainty analysis with ...

  1. Parameter uncertainty analysis for the annual phosphorus loss estimator (APLE) model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Phosphorus (P) loss models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. All P loss models, however, have an inherent amount of uncertainty associated with them. In this study, we conducted an uncertainty analysis with ...

  2. Linguistic uncertainty in qualitative risk analysis and how to minimize it.

    PubMed

    Carey, Janet M; Burgman, Mark A

    2008-04-01

    Most risk assessments assume uncertainty may be decomposed into variability and incertitude. Language is often overlooked as a source of uncertainty, but linguistic uncertainty may be pervasive in workshops, committees, and other face-to-face language-based settings where it can result in misunderstanding and arbitrary disagreement. Here we present examples of linguistic uncertainty drawn from qualitative risk analysis undertaken in stakeholder workshops and describe how the uncertainties were treated. We used a process of iterative re-assessment of likelihoods and consequences, interspersed with facilitated discussion, to assist in the reduction of language-based uncertainty. The effects of this process were evident as changes in the level of agreement among groups of assessors in the ranking of hazards. PMID:18469210

  3. Use of paired simple and complex models to reduce predictive bias and quantify uncertainty

    NASA Astrophysics Data System (ADS)

    Doherty, John; Christensen, Steen

    2011-12-01

    Modern environmental management and decision-making is based on the use of increasingly complex numerical models. Such models have the advantage of allowing representation of complex processes and heterogeneous system property distributions inasmuch as these are understood at any particular study site. The latter are often represented stochastically, this reflecting knowledge of the character of system heterogeneity at the same time as it reflects a lack of knowledge of its spatial details. Unfortunately, however, complex models are often difficult to calibrate because of their long run times and sometimes questionable numerical stability. Analysis of predictive uncertainty is also a difficult undertaking when using models such as these. Such analysis must reflect a lack of knowledge of spatial hydraulic property details. At the same time, it must be subject to constraints on the spatial variability of these details born of the necessity for model outputs to replicate observations of historical system behavior. In contrast, the rapid run times and general numerical reliability of simple models often promulgates good calibration and ready implementation of sophisticated methods of calibration-constrained uncertainty analysis. Unfortunately, however, many system and process details on which uncertainty may depend are, by design, omitted from simple models. This can lead to underestimation of the uncertainty associated with many predictions of management interest. The present paper proposes a methodology that attempts to overcome the problems associated with complex models on the one hand and simple models on the other hand, while allowing access to the benefits each of them offers. It provides a theoretical analysis of the simplification process from a subspace point of view, this yielding insights into the costs of model simplification, and into how some of these costs may be reduced. It then describes a methodology for paired model usage through which predictive

  4. Measurement uncertainties in regression analysis with scarcity of data

    NASA Astrophysics Data System (ADS)

    Sousa, J. A.; Ribeiro, A. S.; Cox, M. G.; Harris, P. M.; Sousa, J. F. V.

    2010-07-01

    The evaluation of measurement uncertainty, in certain fields of science, faces the problem of scarcity of data. This is certainly the case in the testing of geological soils in civil engineering, where tests can take several days or weeks and where the same sample is not available for further testing, being destroyed during the experiment. In this particular study attention will be paid to triaxial compression tests used to typify particular soils. The purpose of the testing is to determine two parameters that characterize the soil, namely, cohesion and friction angle. These parameters are defined in terms of the intercept and slope of a straight line fitted to a small number of points (usually three) derived from experimental data. The use of ordinary least squares to obtain uncertainties associated with estimates of the two parameters would be unreliable if there were only three points (and no replicates) and hence only one degree of freedom.
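
    A minimal sketch of the issue: fit a straight line to three hypothetical (normal stress, shear strength) points with ordinary least squares and propagate the single-degree-of-freedom residual variance to the intercept (cohesion) and slope (tangent of the friction angle). The data values, variable names, and 95% level are illustrative assumptions, not the authors' test data.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical triaxial-derived points: normal stress vs. shear strength (kPa)
    sigma_n = np.array([100.0, 200.0, 300.0])
    tau = np.array([85.0, 142.0, 205.0])

    # Ordinary least squares fit: tau = c + sigma_n * tan(phi)
    X = np.column_stack([np.ones_like(sigma_n), sigma_n])
    beta, *_ = np.linalg.lstsq(X, tau, rcond=None)
    c, slope = beta                       # cohesion and tan(friction angle)

    # Residual variance with n - p = 3 - 2 = 1 degree of freedom
    dof = len(tau) - 2
    s2 = np.sum((tau - X @ beta) ** 2) / dof
    cov = s2 * np.linalg.inv(X.T @ X)     # parameter covariance matrix

    # 95% confidence half-widths use t(0.975, dof=1) ~ 12.7, so intervals explode
    t_crit = stats.t.ppf(0.975, dof)
    print("cohesion c =", c, "+/-", t_crit * np.sqrt(cov[0, 0]), "kPa")
    print("friction angle =", np.degrees(np.arctan(slope)),
          "deg, slope +/-", t_crit * np.sqrt(cov[1, 1]))
    ```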

  5. Uncertainty analysis of densities and isotopics: Handling correlations

    SciTech Connect

    Favorite, J. A.; Armstrong, J. C.; Burr, T.

    2013-07-01

    This paper discusses two cases of correlated parameters in uncertainty analyses: (1) the case of measured mass, density, and volume or spatial dimension correlations; and (2) the case of measured material isotopics, where increasing one atom fraction must cause the others to decrease. In the first case, an equation is derived that has a term due to uncertain density, a term due to uncertain dimensions, and a term due to the correlation between density and dimensions. In a numerical test problem, this equation gives the same result as the standard equation that treats mass and dimensions independently. In the case of isotopics, an equation is derived relating the uncertainty due to uncertain isotopic fractions to the sensitivities to isotopic densities, which are easier to calculate. The equation is verified in a test problem. (authors)
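
    As a rough numerical illustration of the first case, the sketch below propagates uncertainty for the mass of a sphere computed from a measured density and radius, with and without the density-dimension correlation term; the values and correlation coefficient are invented for illustration and are not from the paper's test problem.

    ```python
    import numpy as np

    # Illustrative sphere: mass derived from measured density and radius
    rho, r = 18.7, 4.0            # g/cm^3, cm
    s_rho, s_r = 0.1, 0.02        # standard uncertainties
    corr = -0.5                   # assumed density-radius correlation

    V = 4.0 / 3.0 * np.pi * r**3
    m = rho * V

    # Sensitivities of mass to density and radius
    dm_drho = V
    dm_dr = rho * 4.0 * np.pi * r**2

    # Variance without and with the correlation cross term
    var_indep = (dm_drho * s_rho) ** 2 + (dm_dr * s_r) ** 2
    var_corr = var_indep + 2.0 * corr * dm_drho * dm_dr * s_rho * s_r

    print("mass =", m, "g")
    print("u(m), density and radius treated as independent:", np.sqrt(var_indep))
    print("u(m), correlation term included:               ", np.sqrt(var_corr))
    ```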

  6. The combined analysis of uncertainty and patient heterogeneity in medical decision models.

    PubMed

    Groot Koerkamp, Bas; Stijnen, Theo; Weinstein, Milton C; Hunink, M G Myriam

    2011-01-01

    The analysis of both patient heterogeneity and parameter uncertainty in decision models is increasingly recommended. In addition, the complexity of current medical decision models commonly requires simulating individual subjects, which introduces stochastic uncertainty. The combined analysis of uncertainty and heterogeneity often involves complex nested Monte Carlo simulations to obtain the model outcomes of interest. In this article, the authors distinguish eight model types, each dealing with a different combination of patient heterogeneity, parameter uncertainty, and stochastic uncertainty. The analyses that are required to obtain the model outcomes are expressed in equations, explained in stepwise algorithms, and demonstrated in examples. Patient heterogeneity is represented by frequency distributions and analyzed with Monte Carlo simulation. Parameter uncertainty is represented by probability distributions and analyzed with 2nd-order Monte Carlo simulation (aka probabilistic sensitivity analysis). Stochastic uncertainty is analyzed with 1st-order Monte Carlo simulation (i.e., trials or random walks). This article can be used as a reference for analyzing complex models with more than one type of uncertainty and patient heterogeneity. PMID:20974904
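
    The nested structure can be pictured as an outer second-order loop over parameter draws and an inner first-order loop over simulated patients; the toy outcome model, distributions, and sample sizes below are placeholders rather than any model from the article.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_param_draws, n_patients = 200, 500   # outer (parameter) and inner (patient) loops

    outer_means = []
    for _ in range(n_param_draws):
        # 2nd-order MC: draw uncertain parameters (probabilistic sensitivity analysis)
        p_event = rng.beta(20, 80)               # uncertain annual event probability
        utility = rng.normal(0.75, 0.05)         # uncertain utility weight

        # 1st-order MC over heterogeneous patients (trials / random walks)
        ages = rng.normal(65, 10, size=n_patients)          # patient heterogeneity
        risk = np.clip(p_event * (1 + 0.01 * (ages - 65)), 0, 1)
        event = rng.random(n_patients) < risk               # stochastic uncertainty
        qalys = np.where(event, 5.0, 10.0) * utility

        outer_means.append(qalys.mean())         # expected outcome for this parameter set

    outer_means = np.asarray(outer_means)
    print("mean outcome:", outer_means.mean())
    print("95% credible interval:", np.percentile(outer_means, [2.5, 97.5]))
    ```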

  7. Probabilistic Fracture Analysis of Functionally Graded Materials--Part I: Uncertainty and Probabilistic Analysis Method

    SciTech Connect

    Song, Junho; Nguyen, Tam H.; Paulino, Glaucio H.

    2008-02-15

    Probabilistic fracture analysis is performed for predicting uncertain fracture responses of Functionally Graded Material (FGM) structures. The uncertainties in material properties including Young's modulus and fracture toughness are considered. The limit state function for a crack initiation event is defined in terms of the J-integral for FGMs. The First-Order-Reliability-Method (FORM) is used in conjunction with a finite element code that computes the J-integral with high accuracy. A two-step probabilistic analysis procedure is proposed to investigate the effects of the uncertainties in the spatial distribution of Young's modulus on the probability of crack initiation in FGMs. First, we investigate the effects of the uncertainties in the shape of the spatial distribution by considering the slope and the location of the inflection point of a spatial distribution profile as random quantities. Second, we investigate the effects of the spatial fluctuations of Young's modulus by making use of a discretized random field. The companion paper (Part II) implements this method into a finite element fracture analysis code and presents numerical examples.

  8. Bayesian Neural Networks for Uncertainty Analysis of Hydrologic Modeling: A Comparison of Two Schemes

    SciTech Connect

    Zhang, Xuesong; Zhao, Kaiguang

    2012-06-01

    Bayesian Neural Networks (BNNs) have been shown to be useful tools for analyzing the modeling uncertainty of Neural Networks (NNs). This research focuses on the comparison of two BNNs. The first (BNN-I) uses statistical methods to describe the characteristics of different uncertainty sources (input, parameter, and model structure) and integrates these uncertainties into a Markov Chain Monte Carlo (MCMC) framework to estimate total uncertainty. The second (BNN-II) lumps all uncertainties into a single error term (i.e., the residual between model prediction and measurement). In this study, we propose a simple BNN-II, which uses Genetic Algorithms (GA) and Bayesian Model Averaging (BMA) to calibrate Neural Networks with different structures (numbers of hidden units) and combines the predictions from the different NNs to derive predictions and uncertainty estimates. We tested these two BNNs in two watersheds for daily and monthly hydrologic simulation. The BMA-based BNN-II developed in this study outperforms BNN-I in the two watersheds in terms of both prediction accuracy and uncertainty estimation. These results show that, given incomplete understanding of the characteristics associated with each uncertainty source, the simple lumped-error approach may yield better prediction and uncertainty estimation.

  9. Incorporating Fuzzy Systems Modeling and Possibility Theory in Hydrogeological Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Faybishenko, B.

    2008-12-01

    Hydrogeological predictions are subject to numerous uncertainties, including the development of conceptual, mathematical, and numerical models, as well as determination of their parameters. Stochastic simulations of hydrogeological systems and the associated uncertainty analysis are usually based on the assumption that the data characterizing spatial and temporal variations of hydrogeological processes are random, and the output uncertainty is quantified using a probability distribution. However, hydrogeological systems are often characterized by imprecise, vague, inconsistent, incomplete or subjective information. One of the modern approaches to modeling and uncertainty quantification of such systems is based on using a combination of statistical and fuzzy-logic uncertainty analyses. The aims of this presentation are to: (1) present evidence of fuzziness in developing conceptual hydrogeological models, and (2) give examples of the integration of the statistical and fuzzy-logic analyses in modeling and assessing both aleatoric uncertainties (e.g., caused by vagueness in assessing the subsurface system heterogeneities of fractured-porous media) and epistemic uncertainties (e.g., caused by the selection of different simulation models) involved in hydrogeological modeling. The author will discuss several case studies illustrating the application of fuzzy modeling for assessing the water balance and water travel time in unsaturated-saturated media. These examples will include the evaluation of associated uncertainties using the main concepts of possibility theory, a comparison between the uncertainty evaluation using probabilistic and possibility theories, and a transformation of the probabilities into possibilities distributions (and vice versa) for modeling hydrogeological processes.

  10. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
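
    The weight-perturbation step can be sketched generically: resample criteria weights (here from a Dirichlet distribution centred on assumed AHP weights), recompute the weighted linear combination for each map cell, and take the spread of the resulting scores as a per-cell, weight-induced uncertainty. Criteria values, weights, and the Dirichlet concentration below are illustrative assumptions, not the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Standardized criteria values (rows: map cells; cols: e.g. slope, lithology, land use, distance to faults)
    criteria = rng.random((5, 4))
    ahp_weights = np.array([0.40, 0.30, 0.20, 0.10])   # base AHP weights (sum to 1)

    # Monte Carlo over weights: a Dirichlet keeps each sample non-negative and summing to 1
    n_runs = 2000
    weight_samples = rng.dirichlet(ahp_weights * 100, size=n_runs)

    # Weighted linear combination for every weight sample and cell
    scores = weight_samples @ criteria.T               # shape (n_runs, n_cells)

    susceptibility_mean = scores.mean(axis=0)
    susceptibility_sd = scores.std(axis=0)             # per-cell uncertainty from weights
    print("mean susceptibility per cell:", np.round(susceptibility_mean, 3))
    print("std (weight-induced uncertainty):", np.round(susceptibility_sd, 3))
    ```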

  11. UNCERTAINTY ANALYSIS OF TCE USING THE DOSE EXPOSURE ESTIMATING MODEL (DEEM) IN ACSL

    EPA Science Inventory

    The ACSL-based Dose Exposure Estimating Model (DEEM) under development by EPA is used to perform an uncertainty analysis of a physiologically based pharmacokinetic (PBPK) model of trichloroethylene (TCE). This model involves several circulating metabolites such as trichloroacet...

  12. Video Scanning Hartmann Optical Tester (VSHOT) Uncertainty Analysis: Preprint

    SciTech Connect

    Lewandowski, A.; Gray, A.

    2010-10-01

    This purely analytical work is based primarily on the geometric optics of the system and shows sensitivities to various design and operational parameters. We discuss sources of error with measuring devices, instrument calibrations, and operator measurements for a parabolic trough test. In this paper, we include both the random (precision) and systematic (bias) errors for VSHOT testing and their contributions to the uncertainty. The contributing factors that we considered in this study are target tilt, target face to laser output distance, instrument vertical offset, scanner tilt, distance between the tool and the test piece, camera calibration, and scanner/calibration.

  13. PROBABILISTIC SENSITIVITY AND UNCERTAINTY ANALYSIS WORKSHOP SUMMARY REPORT

    SciTech Connect

    Seitz, R

    2008-06-25

    Stochastic or probabilistic modeling approaches are being applied more frequently in the United States and globally to quantify uncertainty and enhance understanding of model response in performance assessments for disposal of radioactive waste. This increased use has resulted in global interest in sharing results of research and applied studies that have been completed to date. This technical report reflects the results of a workshop that was held to share results of research and applied work related to performance assessments conducted at United States Department of Energy sites. Key findings of this research and applied work are discussed and recommendations for future activities are provided.

  14. CALIBRATION AND UNCERTAINTY ANALYSIS OF SWAT MODEL IN A JAPANESE RIVER CATCHMENT

    NASA Astrophysics Data System (ADS)

    Luo, Pingping; Takara, Kaoru; He, Bin; Cao, Wenqiang; Yamashiki, Yosuke; Nover, Daniel

    Calibration and uncertainty analysis are necessary for the best estimation and uncertainty identification of hydrological models. This paper uses the Soil and Water Assessment Tool-Calibration and Uncertainty Procedures (SWAT-CUP) to analyze the uncertainty of the SWAT model in a Japanese river catchment. The GLUE and SUFI-2 techniques used in this analysis show quite good results, with high R2 values of 0.98 and 0.95 for the monthly simulation. Daily simulation results during calibration and validation are also good, with R2 values of 0.86 and 0.80. For the uncertainty results, the 95% prediction uncertainty (95PPU) brackets the observations very well. The p-factors of the uncertainty analysis for the calibration and validation periods are 92% and 94%. The calibration result using GLUE is better than that using SUFI-2. However, the processing time of the GLUE approach is longer than that of the SUFI-2 approach when run in SWAT-CUP. The uncertainty analysis indicates that the parameters of effective hydraulic conductivity in the main channel alluvium (CH_K2) and the base-flow alpha factor for bank storage (ALPHA_BNK) play important roles in the calibration and validation of the SWAT model.

  15. Sensitivity and Uncertainty Analysis for a Minor-actinide Transmuter with JENDL-4.0

    NASA Astrophysics Data System (ADS)

    Iwamoto, H.; Nishihara, K.; Sugawara, T.; Tsujimoto, K.

    2014-04-01

    A sensitivity and uncertainty analysis was performed for the minor-actinide transmuter proposed by the Japan Atomic Energy Agency with JENDL-4.0. Analysis with sensitivity coefficients and the JENDL-4.0 covariance data showed that the covariances of the capture cross sections and fission-related parameters of MAs and Pu isotopes have considerable impact on the uncertainties of reactor physics parameters, and covariances of the inelastic scattering cross section of lead-bismuth eutectic (LBE) materials significantly affect the uncertainty of coolant-void reactivity.
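
    The propagation step behind such analyses is typically the first-order "sandwich" rule, var(R) ≈ S^T C S; the sketch below applies it to a made-up sensitivity vector and covariance matrix, not to JENDL-4.0 data or the transmuter model.

    ```python
    import numpy as np

    # Hypothetical relative sensitivities of a reactor response (e.g. k_eff) to three
    # nuclear-data parameters such as capture, fission, and inelastic scattering
    S = np.array([0.45, -0.20, 0.10])

    # Hypothetical relative covariance matrix of those parameters (diagonal = variances)
    C = np.array([[4.0e-4, 1.0e-4, 0.0],
                  [1.0e-4, 9.0e-4, 0.0],
                  [0.0,    0.0,    2.5e-3]])

    # First-order ("sandwich") propagation: var(response) = S^T C S
    var_response = S @ C @ S
    print("relative uncertainty of the response: {:.3%}".format(np.sqrt(var_response)))
    ```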

  16. Uncertainty analysis of diffuse-gray radiation enclosure problems: A hypersensitive case study

    NASA Technical Reports Server (NTRS)

    Taylor, Robert P.; Luck, Rogelio; Hodge, B. K.; Steele, W. Glenn

    1993-01-01

    An uncertainty analysis of diffuse-gray enclosure problems is presented. The genesis was a diffuse-gray enclosure problem which proved to be hypersensitive to the specification of view factors. This genesis is discussed in some detail. The uncertainty analysis is presented for the general diffuse-gray enclosure problem and applied to the hypersensitive case study. It was found that the hypersensitivity could be greatly reduced by enforcing both closure and reciprocity for the view factors. The effects of uncertainties in the surface emissivities and temperatures are also investigated.
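
    One simple way to impose closure (each row of view factors summing to one) and reciprocity (A_i F_ij = A_j F_ji) on an inconsistent view-factor matrix is the iterative nudging heuristic sketched below; it is a generic illustration, not the enforcement scheme used in the paper, and the matrix and areas are invented.

    ```python
    import numpy as np

    def enforce_closure_reciprocity(F, areas, n_iter=50):
        """Iteratively nudge a view-factor matrix toward closure (rows sum to 1)
        and reciprocity (A_i F_ij = A_j F_ji). A simple heuristic sketch, not the
        paper's algorithm."""
        A = np.asarray(areas, dtype=float)
        F = np.array(F, dtype=float)
        for _ in range(n_iter):
            # Reciprocity: average A_i F_ij with A_j F_ji, then convert back
            G = A[:, None] * F
            G = 0.5 * (G + G.T)
            F = G / A[:, None]
            # Closure: rescale each row so its view factors sum to one
            F = F / F.sum(axis=1, keepdims=True)
        return F

    # Slightly inconsistent view factors for a hypothetical 3-surface enclosure
    F_raw = np.array([[0.48, 0.27, 0.24],
                      [0.52, 0.00, 0.49],
                      [0.47, 0.52, 0.00]])
    areas = np.array([2.0, 1.0, 1.0])

    F_fixed = enforce_closure_reciprocity(F_raw, areas)
    print(F_fixed)
    print("row sums:", F_fixed.sum(axis=1))
    ```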

  17. PRACTICAL SENSITIVITY AND UNCERTAINTY ANALYSIS TECHNIQUES APPLIED TO AGRICULTURAL SYSTEMS MODELS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We present a practical evaluation framework for analysis of two complex, process-based agricultural system models, WEPP and RZWQM. The evaluation framework combines sensitivity analysis and the uncertainty analysis techniques of first order error analysis (FOA) and Monte Carlo simulation with Latin ...

  18. Analysis and Reduction of Complex Networks Under Uncertainty.

    SciTech Connect

    Ghanem, Roger G

    2014-07-31

    This effort was a collaboration with Youssef Marzouk of MIT, Omar Knio of Duke University (at the time at Johns Hopkins University) and Habib Najm of Sandia National Laboratories. The objective of this effort was to develop the mathematical and algorithmic capacity to analyze complex networks under uncertainty. Of interest were chemical reaction networks and smart grid networks. The statements of work for USC focused on the development of stochastic reduced models for uncertain networks. The USC team was led by Professor Roger Ghanem and consisted of one graduate student and a postdoc. The contributions completed by the USC team consisted of 1) methodology and algorithms to address the eigenvalue problem, a problem of significance in the stability of networks under stochastic perturbations, 2) methodology and algorithms to characterize probability measures on graph structures with random flows. This is an important problem in characterizing random demand (encountered in smart grid) and random degradation (encountered in infrastructure systems), as well as modeling errors in Markov Chains (with ubiquitous relevance!). 3) methodology and algorithms for treating inequalities in uncertain systems. This is an important problem in the context of models for material failure and network flows under uncertainty where conditions of failure or flow are described in the form of inequalities between the state variables.

  19. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method uses state-of-the-art uncertainty analysis applying different turbulence models and draws conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward facing step.

  20. Monte Carlo analysis of uncertainties in the Netherlands greenhouse gas emission inventory for 1990-2004

    NASA Astrophysics Data System (ADS)

    Ramírez, Andrea; de Keizer, Corry; Van der Sluijs, Jeroen P.; Olivier, Jos; Brandes, Laurens

    This paper presents an assessment of the value added of a Monte Carlo analysis of the uncertainties in the Netherlands inventory of greenhouse gases over a Tier 1 analysis. It also examines which parameters contributed the most to the total emission uncertainty and identified areas of high priority for the further improvement of the accuracy and quality of the inventory. The Monte Carlo analysis resulted in an uncertainty range in total GHG emissions of 4.1% in 2004 and 5.4% in 1990 (with LUCF), and 3.9% in 2004 and 5.3% in 1990 (without LUCF). Uncertainty in the trend was estimated at 4.5%. The values are in the same order of magnitude as those estimated in the Tier 1. The results show that accounting for correlation among parameters is important, and for the Netherlands inventory it has a larger impact on the uncertainty in the trend than on the uncertainty in the total GHG emissions. The main contributors to overall uncertainty are found to be related to N2O emissions from agricultural soils, the N2O implied emission factors of Nitric Acid Production, CH4 from managed solid waste disposal on land, and the implied emission factor of CH4 from manure management from cattle.
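
    A stripped-down Monte Carlo treatment of an emission trend, including the year-to-year correlation introduced by a shared emission factor, might look like the sketch below; the source categories, uncertainty ranges, and correlation structure are invented for illustration and do not reproduce the Netherlands inventory.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Two illustrative source categories. Activity data are drawn independently for
    # each year, while the emission factor is shared between years, which correlates
    # the two years and largely cancels in the trend.
    act90 = rng.normal([100.0, 50.0], [5.0, 10.0], size=(n, 2))
    act04 = rng.normal([ 90.0, 70.0], [4.5, 14.0], size=(n, 2))
    ef = rng.lognormal(mean=np.log([1.0, 2.0]), sigma=[0.1, 0.3], size=(n, 2))

    em90 = (act90 * ef).sum(axis=1)
    em04 = (act04 * ef).sum(axis=1)
    trend = (em04 - em90) / em90 * 100.0      # % change 1990 -> 2004

    # Half of the 95% range, relative to the mean, as a simple uncertainty measure
    rel_unc = (np.percentile(em04, 97.5) - np.percentile(em04, 2.5)) / 2 / em04.mean()
    trend_unc = (np.percentile(trend, 97.5) - np.percentile(trend, 2.5)) / 2

    print("uncertainty in 2004 total: {:.1%}".format(rel_unc))
    print("uncertainty in trend: {:.2f} percentage points".format(trend_unc))
    ```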

  1. Uncertainty estimation in organic elemental analysis using information from proficiency tests.

    PubMed

    Companyó, R; Rubio, R; Sahuquillo, A; Boqué, R; Maroto, A; Riu, J

    2008-12-01

    We evaluate the uncertainty in organic elemental analysis of C, H, N, and S. We use data from six proficiency tests (PTs), in which some 35 Spanish laboratories participated. The uncertainty of the technique is estimated from the relative within-laboratory and between-laboratory variances for pure substances and samples with complex matrices (soil, powdered milk, oil, ash, and petroleum coke). We also calculate the relative standard uncertainties for individual laboratories when analysing pure substances using historical data from the participation of each laboratory in different editions of PTs. The uncertainty values obtained for the individual laboratories are comparable with the uncertainty of the technique and correlate with the combined z-scores. The evolution over time of those laboratories participating in common editions of PTs is also evaluated. PMID:18853149

  2. Monte carlo uncertainty analysis for photothermal radiometry measurements using a curve fit process

    NASA Astrophysics Data System (ADS)

    Horne, Kyle; Fleming, Austin; Timmins, Ben; Ban, Heng

    2015-12-01

    Photothermal radiometry (PTR) has become a popular method to measure thermal properties of layered materials. Much research has been done to determine the capabilities of PTR, but with little uncertainty analysis. This study reports a Monte Carlo uncertainty analysis to quantify uncertainty of film diffusivity and effusivity measurements, presents a sensitivity study for each input parameter, compares linear and logarithmic spacing of data points on frequency scans, and investigates the validity of a one-dimensional heat transfer assumption. Logarithmic spacing of frequencies when taking data is found to be unequivocally superior to linear spacing, while the use of a higher-dimensional heat transfer model is only needed for certain measurement configurations. The sensitivity analysis supports the frequency spacing conclusion, as well as explains trends seen in the uncertainty data.
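
    The Monte Carlo step for a curve-fit measurement can be sketched generically: perturb synthetic frequency-scan data within an assumed noise level, refit each realization, and take the spread of the fitted parameters as their uncertainty. The single-layer attenuation model, thickness, noise level, and log-spaced frequencies below are placeholder assumptions, not the paper's PTR model.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(7)

    # Generic thermal-wave amplitude model: attenuation across a layer of thickness L
    L = 50e-6                                    # layer thickness (m), assumed known
    def model(f, alpha, A0):
        return A0 * np.exp(-L * np.sqrt(np.pi * f / alpha))

    true_alpha, true_A0 = 1.0e-6, 1.0            # "true" diffusivity (m^2/s) and scale
    freqs = np.logspace(1, 5, 30)                # log-spaced modulation frequencies (Hz)
    clean = model(freqs, true_alpha, true_A0)
    noise_level = 0.01

    # Monte Carlo: perturb the data within its noise, refit, collect parameters
    fits = []
    for _ in range(500):
        noisy = clean + rng.normal(0.0, noise_level, size=freqs.size)
        popt, _ = curve_fit(model, freqs, noisy, p0=[5e-7, 0.8], maxfev=5000)
        fits.append(popt)
    fits = np.asarray(fits)

    print("alpha = {:.3e} +/- {:.1e} m^2/s".format(fits[:, 0].mean(), fits[:, 0].std()))
    print("A0    = {:.3f} +/- {:.3f}".format(fits[:, 1].mean(), fits[:, 1].std()))
    ```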

  3. Uncertainty analysis for probabilistic pipe fracture evaluations in LBB applications

    SciTech Connect

    Rahman, S.; Ghadiali, N.; Wilkowski, G.

    1997-04-01

    During the NRC's Short Cracks in Piping and Piping Welds Program at Battelle, a probabilistic methodology was developed to conduct fracture evaluations of circumferentially cracked pipes for application to leak-rate detection. Later, in the IPIRG-2 program, several parameters that may affect leak-before-break and other pipe flaw evaluations were identified. This paper presents new results from several uncertainty analyses to evaluate the effects of normal operating stresses, normal plus safe-shutdown earthquake stresses, off-centered cracks, restraint of pressure-induced bending, and dynamic and cyclic loading rates on the conditional failure probability of piping systems in BWRs and PWRs. For each parameter, the sensitivity of the conditional probability of failure, and hence its importance to probabilistic leak-before-break evaluations, was determined.

  4. Modelling System Processes to Support Uncertainty Analysis and Robustness Evaluation

    NASA Technical Reports Server (NTRS)

    Blackwell, Charles; Cuzzi, Jeffrey (Technical Monitor)

    1996-01-01

    In the use of advanced systems control techniques in the development of a dynamic system, results from effective mathematical modelling are required. Historically, in some cases the use of a model which only reflects the "expected" or "nominal" important information about the system's internal processes has resulted in acceptable system performance, but it should be recognized that for those cases success was due to a combination of the remarkable inherent potential of feedback control for robustness and fortuitously wide margins between system performance requirements and system performance capability. In the case of CELSS development, no such fortuitous combinations should be expected, and it should be expected that the uncertainty in the information on the system's processes will have to be taken into account in order to generate a performance-robust design. In this paper, we develop one perspective on the issue of providing robustness as mathematical modelling impacts it, and present some examples of model formats which serve the needed purpose.

  5. Model calibration and uncertainty analysis in signaling networks.

    PubMed

    Heinemann, Tim; Raue, Andreas

    2016-06-01

    For a long time the biggest challenges in modeling cellular signal transduction networks have been the inference of crucial pathway components and the qualitative description of their interactions. As a result of the emergence of powerful high-throughput experiments, it is now possible to measure data of high temporal and spatial resolution and to analyze signaling dynamics quantitatively. In addition, this increase in high-quality data is the basis for a better understanding of model limitations and their influence on the predictive power of models. We review established approaches in signal transduction network modeling with a focus on ordinary differential equation models as well as related developments in model calibration. As central aspects of the calibration process we discuss possibilities of model adaptation based on data-driven parameter optimization and the concomitant objective of reducing model uncertainties. PMID:27085224

  6. The IAEA Coordinated Research Program on HTGR Uncertainty Analysis: Phase I Status and Initial Results

    SciTech Connect

    Strydom, Gerhard; Bostelmann, Friederike; Ivanov, Kostadin

    2014-10-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. One way to address the uncertainties in the HTGR analysis tools is to assess the sensitivity of critical parameters (such as the calculated maximum fuel temperature during loss of coolant accidents) to a few important input uncertainties. The input parameters were identified by engineering judgement in the past but are today typically based on a Phenomena Identification Ranking Table (PIRT) process. The input parameters can also be derived from sensitivity studies and are then varied in the analysis to find a spread in the parameter of importance. However, there is often no easy way to compensate for these uncertainties. In engineering system design, a common approach for addressing performance uncertainties is to add compensating margins to the system, but with passive properties credited, it is not so clear how to apply this in the case of the modular HTGR heat removal path. Other more sophisticated uncertainty modelling approaches, including Monte Carlo analysis, have also been proposed and applied. Ideally one wishes to apply a more fundamental approach to determine the predictive capability and accuracies of coupled neutronics/thermal-hydraulics and depletion simulations used for reactor design and safety assessment. Today there is a broader acceptance of the use of uncertainty analysis even in safety studies, and it has been accepted by regulators in some cases to replace the traditional conservative analysis. Therefore some safety analysis calculations may use a mixture of these approaches for different parameters depending upon the particular requirements of the analysis problem involved. Sensitivity analysis can for example be used to provide information as part of an uncertainty analysis to determine best estimate plus uncertainty results to the

  7. Uncertainty analysis of statistical downscaling models using general circulation model over an international wetland

    NASA Astrophysics Data System (ADS)

    Etemadi, H.; Samadi, S.; Sharifikia, M.

    2014-06-01

    The regression-based statistical downscaling model (SDSM) is an appropriate method that is broadly used to resolve the coarse spatial resolution of general circulation models (GCMs). Nevertheless, the assessment of uncertainty propagation linked with climatic variables is essential to any climate change impact study. This study presents a procedure to characterize the uncertainty of two GCMs linked with the Long Ashton Research Station Weather Generator (LARS-WG) and SDSM in one of the most vulnerable international wetlands, namely "Shadegan" in an arid region of Southwest Iran. In the case of daily temperature, uncertainty is estimated by comparing the monthly mean and variance of downscaled and observed daily data at a 95% confidence level. Uncertainties were then evaluated by comparing monthly mean dry and wet spell lengths and their 95% CI in daily precipitation downscaling over the 1987-2005 interval. The uncertainty results indicated that LARS-WG is the most proficient model at reproducing various statistical characteristics of the observed data within the 95% uncertainty bounds, while the SDSM model is the least capable in this respect. The uncertainty analyses at three different climate stations produced significantly different climate change responses at the 95% CI. Finally, the range of plausible climate change projections suggested a need for decision makers to augment their long-term wetland management plans to reduce the wetland's vulnerability to climate change impacts.

  8. Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model

    SciTech Connect

    Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.

    2001-11-09

    Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.

  9. A Comparative Review of Sensitivity and Uncertainty Analysis of Large-Scale Systems - II: Statistical Methods

    SciTech Connect

    Cacuci, Dan G.; Ionescu-Bujor, Mihaela

    2004-07-15

    Part II of this review paper highlights the salient features of the most popular statistical methods currently used for local and global sensitivity and uncertainty analysis of both large-scale computational models and indirect experimental measurements. These statistical procedures represent sampling-based methods (random sampling, stratified importance sampling, and Latin Hypercube sampling), first- and second-order reliability algorithms (FORM and SORM, respectively), variance-based methods (correlation ratio-based methods, the Fourier Amplitude Sensitivity Test, and the Sobol Method), and screening design methods (classical one-at-a-time experiments, global one-at-a-time design methods, systematic fractional replicate designs, and sequential bifurcation designs). It is emphasized that all statistical uncertainty and sensitivity analysis procedures first commence with the 'uncertainty analysis' stage and only subsequently proceed to the 'sensitivity analysis' stage; this path is the exact reverse of the conceptual path underlying the methods of deterministic sensitivity and uncertainty analysis where the sensitivities are determined prior to using them for uncertainty analysis. By comparison to deterministic methods, statistical methods for uncertainty and sensitivity analysis are relatively easier to develop and use but cannot yield exact values of the local sensitivities. Furthermore, current statistical methods have two major inherent drawbacks as follows: 1. Since many thousands of simulations are needed to obtain reliable results, statistical methods are at best expensive (for small systems) or, at worst, impracticable (e.g., for large time-dependent systems). 2. Since the response sensitivities and parameter uncertainties are inherently and inseparably amalgamated in the results produced by these methods, improvements in parameter uncertainties cannot be directly propagated to improve response uncertainties; rather, the entire set of simulations and
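
    As a pocket example of the sampling-based path described here (uncertainty analysis first, then sensitivity analysis), the sketch below draws a Latin Hypercube sample with scipy's qmc module, propagates it through an arbitrary two-parameter test model, and reports the output spread plus a crude correlation-based sensitivity measure; the model and parameter ranges are assumptions.

    ```python
    import numpy as np
    from scipy.stats import qmc, pearsonr

    # Arbitrary test model standing in for an expensive simulation
    def model(x):
        return x[:, 0] ** 2 + 0.5 * x[:, 1]

    # Latin Hypercube sample of the two uncertain parameters
    sampler = qmc.LatinHypercube(d=2, seed=0)
    unit_sample = sampler.random(n=1000)
    x = qmc.scale(unit_sample, l_bounds=[0.0, -1.0], u_bounds=[2.0, 1.0])

    # Uncertainty analysis stage: spread of the output
    y = model(x)
    print("mean = {:.3f}, std = {:.3f}".format(y.mean(), y.std()))

    # Sensitivity analysis stage: simple sampling-based importance measure
    for i, name in enumerate(["x1", "x2"]):
        r, _ = pearsonr(x[:, i], y)
        print(name, "correlation with output: {:.2f}".format(r))
    ```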

  10. Uncertainty analysis of the modelling chain from GCM to flood inundation

    NASA Astrophysics Data System (ADS)

    Wetterhall, F.; He, Y.; Freer, J. F.; Cloke, H.; Pappenberger, F.; Wilson, M.; McGregor, G.

    2010-05-01

    This study aims to set up novel techniques for tracking uncertainties through a modelling framework of extreme floods under climate change. More specifically, the study attempts to (1) assess future flood inundation impacts and extent as well as the associated hazards and (2) quantify the cascading uncertainties in a modelling framework. The modelling framework consists of statistically and dynamically downscaled meteorological input from an ensemble of GCMs and RCMs. The climate input in turn drives a set of rainfall-runoff models, in this case LISFLOOD-RR and HBV. The hydrological models provide modelled discharges which are fed through two flood inundation models, LISFLOOD-FP and HECRAS. Uncertainties in climate impact modelling are many, for example input errors in observations, impact model parameter and structural uncertainties, parameterisation and resolution errors in climate models, and the underlying future scenarios. These uncertainties cascade through the modelling chain, and it is important to rigorously estimate them at all levels. The main aim of this project is to incorporate all these uncertainties at the very end of the chain in a flood risk map. The main research questions of this study are (1) how sensitive is the cascade setup to the downscaled meteorological input from the GCMs, particularly with respect to extreme events; (2) how is the climate change signal affected by the downscaling technique; (3) how can we quantify the sources and magnitude of uncertainties when simulating flood inundation within the context of climate change; (4) how do we deal with multi-scale, multi-source uncertainties whilst taking into account the limitations of our observed measurements; (5) how do we develop strategies that improve the efficiency of sampling such a cascaded modelling structure to characterise the uncertainties; and, most importantly, (6) how do we convey the information to stakeholders and policy makers? The analysis is done in three

  11. Sensitivity and Uncertainty Analysis in Chemical Mechanisms for Air Quality Modeling

    NASA Astrophysics Data System (ADS)

    Gao, Dongfen

    1995-01-01

    Ambient ozone in urban and regional air pollution is a serious environmental problem. Air quality models can be used to predict ozone concentrations and explore control strategies. One important component of such air quality models is a chemical mechanism. Sensitivity and uncertainty analysis play an important role in the evaluation of the performance of air quality models. The uncertainties associated with the RADM2 chemical mechanism in predicted concentrations of O3, HCHO, H2O2, PAN, and HNO3 were estimated. Monte Carlo simulations with Latin Hypercube Sampling were used to estimate the overall uncertainties in concentrations of species of interest, due to uncertainties in chemical parameters. The parameters that were treated as random variables were identified through first-order sensitivity and uncertainty analyses. Recent estimates of uncertainties in rate parameters and product yields were used. The results showed the relative uncertainties in ozone predictions are ±23-50% (1σ relative to the mean) in urban cases, and less than ±20% in rural cases. Uncertainties in HNO3 concentrations are the smallest, followed by HCHO, O3 and PAN. Predicted H2O2 concentrations have the highest uncertainties. Uncertainties in the differences of peak ozone concentrations between base and control cases were also studied. The results show that the uncertainties in the fractional reductions in ozone concentrations were 9-12% with NOx control at an ROG/NOx ratio of 24:1 and 11-33% with ROG control at an ROG/NOx ratio of 6:1. Linear regression analysis of the Monte Carlo results showed that uncertainties in rate parameters for the formation of HNO3, for the reaction of HCHO + hν → 2HO2 + CO, for PAN chemistry and for the photolysis of NO2 are most influential to ozone concentrations and differences of ozone. The parameters that are important to ozone concentrations also tend to be relatively influential to other key species

  12. Uncertainty Analysis of Historic Water Resources Availability in Africa

    NASA Astrophysics Data System (ADS)

    McNally, A.; Arsenault, K. R.; Narapusetty, B.; Peters-Lidard, C. D.

    2014-12-01

    Seeing how current agrometeorological conditions measure up to historic events helps analysts and decision-makers judge the potential impact that anomalous rainfall and temperatures will have on the availability and accessibility of food and water resources. We present results from the Famine Early Warning Systems Network (FEWS NET) Land Data Assimilation System (FLDAS), which is used to produce multi-model and rainfall ensembles of the water balance over semi-arid Africa from 1982-2014. The ensemble approach allows us to assess confidence in our estimates, which is critical given that food and water insecure regions in Africa are data-poor and are characterized by complex interactions and feedbacks that cause deterministic hydrologic modeling approaches to fall short. We then use the ensemble of water balance estimates to calculate drought severity (derived from modeled soil moisture), and the Water Requirement Satisfaction Index (a function of atmospheric water demand). We compare these indices to the GIMMS 30-year vegetation data product from AVHRR, and the ESA ECV 30-year microwave soil moisture. These historical time series (with confidence bounds) allow us to improve our quantitative understanding of drought thresholds, to explore sources of parameter and model uncertainty, and to better contextualize current operational drought monitoring efforts in Africa.

  13. Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.

    2013-12-01

    Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to their improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a detrimental source for human exposures to potential carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies to better understand the process and guide field investigation. While multiple analytical and numerical models were developed to simulate the vapor intrusion process, detailed validation of these models against well controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and soil gas flux and indoor air concentration measurement. In this work, we present an effort to validate a three-dimensional vapor intrusion model based on a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration based on the most uncertain input parameters.

  14. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex, Hydrogeologic Systems

    NASA Astrophysics Data System (ADS)

    Wolfsberg, A.; Kang, Q.; Li, C.; Ruskauff, G.; Bhark, E.; Freeman, E.; Prothro, L.; Drellack, S.

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  15. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    SciTech Connect

    Drellack, Sig; Prothro, Lance

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  16. Probabilistic uncertainty analysis of epidemiological modeling to guide public health intervention policy.

    PubMed

    Gilbert, Jennifer A; Meyers, Lauren Ancel; Galvani, Alison P; Townsend, Jeffrey P

    2014-03-01

    Mathematical modeling of disease transmission has provided quantitative predictions for health policy, facilitating the evaluation of epidemiological outcomes and the cost-effectiveness of interventions. However, typical sensitivity analyses of deterministic dynamic infectious disease models focus on model architecture and the relative importance of parameters but neglect parameter uncertainty when reporting model predictions. Consequently, model results that identify point estimates of intervention levels necessary to terminate transmission yield limited insight into the probability of success. We apply probabilistic uncertainty analysis to a dynamic model of influenza transmission and assess global uncertainty in outcome. We illustrate that when parameter uncertainty is not incorporated into outcome estimates, levels of vaccination and treatment predicted to prevent an influenza epidemic will only have an approximately 50% chance of terminating transmission and that sensitivity analysis alone is not sufficient to obtain this information. We demonstrate that accounting for parameter uncertainty yields probabilities of epidemiological outcomes based on the degree to which data support the range of model predictions. Unlike typical sensitivity analyses of dynamic models that only address variation in parameters, the probabilistic uncertainty analysis described here enables modelers to convey the robustness of their predictions to policy makers, extending the power of epidemiological modeling to improve public health. PMID:24593920
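
    A toy version of the point made in the abstract: sample the reproduction number from an assumed uncertainty distribution, compute the vaccination coverage implied by its median, and then ask how often that coverage actually prevents take-off across the sampled parameter values. The epidemic criterion and the distribution below are illustrative, not the authors' influenza model.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def epidemic_takes_off(R0, vacc_cov):
        # Threshold criterion: the epidemic grows if the effective reproduction
        # number after vaccination exceeds 1
        return R0 * (1.0 - vacc_cov) > 1.0

    # Parameter uncertainty: R0 is not known exactly
    R0_samples = rng.lognormal(mean=np.log(1.4), sigma=0.15, size=10_000)

    # Point-estimate reasoning: coverage that just controls the *median* R0
    vacc_point = 1.0 - 1.0 / np.median(R0_samples)

    # Probabilistic reasoning: chance that this coverage still fails
    fails = epidemic_takes_off(R0_samples, vacc_point)
    print("coverage from point estimate: {:.1%}".format(vacc_point))
    print("probability the epidemic still takes off: {:.1%}".format(fails.mean()))
    ```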

  17. Uncertainty analysis of multi-rate kinetics of uranium desorption from sediments

    SciTech Connect

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.; Zhang, Guannan

    2014-01-01

    A multi-rate expression for uranyl [U(VI)] surface complexation reactions has been proposed to describe diffusion-limited U(VI) sorption/desorption in heterogeneous subsurface sediments. An important assumption in the rate expression is that its rate constants follow a certain type of probability distribution. In this paper, a Bayes-based, Differential Evolution Markov Chain method was used to assess the distribution assumption and to analyze parameter and model structure uncertainties. U(VI) desorption from a contaminated sediment at the US Hanford 300 Area, Washington, was used as an example for detailed analysis. The results indicated that: 1) the rate constants in the multi-rate expression contain uneven uncertainties, with slower rate constants having relatively larger uncertainties; 2) the lognormal distribution is an effective assumption for the rate constants in the multi-rate model to simulate U(VI) desorption; 3) however, long-term prediction and its uncertainty may be significantly biased by the lognormal assumption for the smaller rate constants; and 4) both parameter and model structure uncertainties can affect the extrapolation of the multi-rate model, with a larger uncertainty from the model structure. The results provide important insights into the factors contributing to the uncertainties of the multi-rate expression commonly used to describe the diffusion or mixing-limited sorption/desorption of both organic and inorganic contaminants in subsurface sediments.

  18. Unscented transform-based uncertainty analysis of rotating coil transducers for field mapping

    NASA Astrophysics Data System (ADS)

    Arpaia, P.; De Matteis, E.; Schiano Lo Moriello, R.

    2016-03-01

    The uncertainty of a rotating coil transducer for magnetic field mapping is analyzed. Unscented transform and statistical design of experiments are combined to determine magnetic field expectation, standard uncertainty, and separate contributions of the uncertainty sources. For nonlinear measurement models, the unscented transform-based approach is more error-proof than the linearization underlying the "Guide to the expression of Uncertainty in Measurements" (GUMs), owing to the absence of model approximations and derivatives computation. When GUM assumptions are not met, the deterministic sampling strategy strongly reduces computational burden with respect to Monte Carlo-based methods proposed by the Supplement 1 of the GUM. Furthermore, the design of experiments and the associated statistical analysis allow the uncertainty sources domain to be explored efficiently, as well as their significance and single contributions to be assessed for an effective setup configuration. A straightforward experimental case study highlights that a one-order-of-magnitude reduction in the relative uncertainty of the coil area produces a decrease in uncertainty of the field mapping transducer by a factor of 25 with respect to the worst condition. Moreover, about 700 trials and the related processing achieve results corresponding to 5 × 10⁶ brute-force Monte Carlo simulations.

  19. Unscented transform-based uncertainty analysis of rotating coil transducers for field mapping.

    PubMed

    Arpaia, P; De Matteis, E; Schiano Lo Moriello, R

    2016-03-01

    The uncertainty of a rotating coil transducer for magnetic field mapping is analyzed. Unscented transform and statistical design of experiments are combined to determine magnetic field expectation, standard uncertainty, and separate contributions of the uncertainty sources. For nonlinear measurement models, the unscented transform-based approach is more error-proof than the linearization underlying the "Guide to the expression of Uncertainty in Measurements" (GUMs), owing to the absence of model approximations and derivatives computation. When GUM assumptions are not met, the deterministic sampling strategy strongly reduces computational burden with respect to Monte Carlo-based methods proposed by the Supplement 1 of the GUM. Furthermore, the design of experiments and the associated statistical analysis allow the uncertainty sources domain to be explored efficiently, as well as their significance and single contributions to be assessed for an effective setup configuration. A straightforward experimental case study highlights that a one-order-of-magnitude reduction in the relative uncertainty of the coil area produces a decrease in uncertainty of the field mapping transducer by a factor of 25 with respect to the worst condition. Moreover, about 700 trials and the related processing achieve results corresponding to 5 × 10(6) brute-force Monte Carlo simulations. PMID:27036810
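
    For readers unfamiliar with the deterministic sampling idea, the sketch below pushes a two-dimensional uncertain input through a nonlinear toy measurement model using a standard symmetric unscented-transform sigma-point set and compares the result with brute-force Monte Carlo; the model, nominal values, and covariance are assumptions, not the coil-transducer model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy nonlinear measurement model (not the transducer model)
    def h(x):
        return x[0] ** 2 * np.sin(x[1])

    mean = np.array([1.0, 0.8])                  # nominal input values
    cov = np.diag([0.05 ** 2, 0.10 ** 2])        # input covariance

    # Unscented transform: 2n+1 symmetric sigma points (kappa = 0)
    n = mean.size
    L = np.linalg.cholesky(n * cov)              # matrix square root of n*cov
    sigma_pts = np.vstack([mean, mean + L.T, mean - L.T])
    weights = np.full(2 * n + 1, 1.0 / (2 * n))
    weights[0] = 0.0                             # center weight is zero for kappa = 0

    y = np.array([h(p) for p in sigma_pts])
    y_mean = np.dot(weights, y)
    y_var = np.dot(weights, (y - y_mean) ** 2)

    # Brute-force Monte Carlo for comparison
    samples = rng.multivariate_normal(mean, cov, size=1_000_000)
    y_mc = samples[:, 0] ** 2 * np.sin(samples[:, 1])
    print("UT :", y_mean, np.sqrt(y_var))
    print("MC :", y_mc.mean(), y_mc.std())
    ```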

  20. Uncertainty Analysis Using BMA for Hydrologic Projections under Future Climate Change

    NASA Astrophysics Data System (ADS)

    Beigi, E.; Tsai, F. T. C.

    2014-12-01

    This study conducts uncertainty analysis on future region-scale hydrologic projections under the uncertain climate change projections of the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report. The choice of the global climate models (GCMs), the greenhouse gas concentration trajectories and the GCM initial conditions are considered to be three major sources of uncertainty in the projected precipitation and temperature. This study uses the 133 sets of downscaled precipitation and temperature of the 1/8 degree BCCA projections for uncertainty analysis, which are derived from 22 CMIP5 GCMs, four emissions paths (RCP2.6, RCP4.5, RCP6.0, and RCP8.5), and different number of GCM initial conditions. The downscaled precipitation and temperature are used in the hydrologic model HELP3 to derive high-resolution spatiotemporal distributions of surface runoff, evapotranspiration and groundwater recharge from 2010 to 2099. The hierarchical Bayesian model averaging (HBMA) method is adopted to segregate and prioritize the three sources of climate projection uncertainty, obtain the ensemble mean of hydrologic projections, and quantify the hydrologic projection uncertainty. Posterior model probabilities in the BMA are calculated based on a performance criterion and a convergence criterion. The performance criterion is the GCM performance in reproducing the historical climate. The convergence criterion is the closeness of GCM simulation to the ensemble mean of future projections. Different likelihood functions are used to investigate their impacts on the posterior model probabilities. The methodology is applied to the study of hydrologic projections and uncertainty for the area of the Southern Hills aquifer system, southwestern Mississippi and southeastern Louisiana. The study area is divided into more than 2.6 million subdivisions by intersecting various datasets through the ArcGIS. The analysis is computationally intensive. Parallel computation is used to

  1. Uncertainties in Cancer Risk Coefficients for Environmental Exposure to Radionuclides. An Uncertainty Analysis for Risk Coefficients Reported in Federal Guidance Report No. 13

    SciTech Connect

    Pawel, David; Leggett, Richard Wayne; Eckerman, Keith F; Nelson, Christopher

    2007-01-01

    Federal Guidance Report No. 13 (FGR 13) provides risk coefficients for estimation of the risk of cancer due to low-level exposure to each of more than 800 radionuclides. Uncertainties in risk coefficients were quantified in FGR 13 for 33 cases (exposure to each of 11 radionuclides by each of three exposure pathways) on the basis of sensitivity analyses in which various combinations of plausible biokinetic, dosimetric, and radiation risk models were used to generate alternative risk coefficients. The present report updates the uncertainty analysis in FGR 13 for the cases of inhalation and ingestion of radionuclides and expands the analysis to all radionuclides addressed in that report. The analysis indicates that most risk coefficients for inhalation or ingestion of radionuclides are determined within a factor of 5 or less by current information. That is, application of alternate plausible biokinetic and dosimetric models and radiation risk models (based on the linear, no-threshold hypothesis with an adjustment for the dose and dose rate effectiveness factor) is unlikely to change these coefficients by more than a factor of 5. In this analysis the assessed uncertainty in the radiation risk model was found to be the main determinant of the uncertainty category for most risk coefficients, but conclusions concerning the relative contributions of risk and dose models to the total uncertainty in a risk coefficient may depend strongly on the method of assessing uncertainties in the risk model.

  2. Large-scale estimation and uncertainty analysis of gross primary production in Tibetan alpine grasslands

    NASA Astrophysics Data System (ADS)

    He, Honglin; Liu, Min; Xiao, Xiangming; Ren, Xiaoli; Zhang, Li; Sun, Xiaomin; Yang, Yuanhe; Li, Yingnian; Zhao, Liang; Shi, Peili; Du, Mingyuan; Ma, Yaoming; Ma, Mingguo; Zhang, Yu; Yu, Guirui

    2014-03-01

    Gross primary production (GPP) is an important parameter for carbon cycle and climate change research. Previous estimations of GPP on the Tibetan Plateau were usually reported without quantitative uncertainty analyses. This study sought to quantify the uncertainty and its partitioning in GPP estimation across Tibetan alpine grasslands during 2003-2008 with the modified Vegetation Photosynthesis Model (VPM). Monte Carlo analysis was used to provide a quantitative assessment of the uncertainty in model simulations, and Sobol' variance decomposition method was applied to determine the relative contribution of each source of uncertainty to the total uncertainty. The results showed that the modified VPM successfully reproduced the seasonal dynamics and magnitude of GPP of 10 flux tower sites on the plateau (R2 = 0.77 - 0.95, p < 0.001). The 6 year mean GPP in Tibetan alpine grasslands was estimated at 223.3 Tg C yr⁻¹ (312.3 g C m⁻² yr⁻¹). The mean annual GPP increased from western to eastern plateau, with the increase of annual temperature and precipitation and the decrease of elevation, while the decrease of GPP from southern to northern plateau was primarily driven by air temperature. Furthermore, the mean relative uncertainty of the annual GPP was 18.30%, with larger uncertainty occurring in regions with lower GPP. Photosynthetic active radiation, enhanced vegetation index, and the maximum light use efficiency (LUE) are the primary sources of uncertainty in GPP estimation, contributing 36.84%, 26.86%, and 21.99%, respectively. This emphasizes the importance of uncertainty in driving variables as well as that of maximum LUE in LUE model simulation.

  3. Uncertainty analysis of modeled carbon and water fluxes in a subtropical coniferous plantation

    NASA Astrophysics Data System (ADS)

    Ren, Xiaoli; He, Honglin; Moore, David J. P.; Zhang, Li; Liu, Min; Li, Fan; Yu, Guirui; Wang, Huimin

    2013-12-01

    Estimating the exchanges of carbon and water between vegetation and the atmosphere requires process-based ecosystem models; however, uncertainty in model predictions is inevitable due to the uncertainties in model structure, model parameters, and driving variables. This paper proposes a methodological framework for analyzing prediction uncertainty of ecosystem models caused by parameters and applies it to the Qianyanzhou subtropical coniferous plantation using the Simplified Photosynthesis and Evapotranspiration model. We selected 20 key parameters from the 42 parameters of the model using the one-at-a-time sensitivity analysis method and estimated their posterior distributions using the Markov chain Monte Carlo technique. Prediction uncertainty was quantified with the Monte Carlo method and partitioned with the Sobol' method by decomposing the total variance of model predictions into different components. The uncertainty in predicted net ecosystem CO2 exchange (NEE), gross primary production (GPP), ecosystem respiration (RE), evapotranspiration (ET), and transpiration (T), defined as the coefficient of variation, was 61.0%, 20.6%, 12.7%, 14.2%, and 19.9%, respectively. Modeled carbon and water fluxes were highly sensitive to two parameters, the maximum net CO2 assimilation rate (Amax) and the specific leaf weight (SLWC). They contributed more than two thirds of the uncertainty in predicted NEE, GPP, ET, and T, and almost one third of the uncertainty in predicted RE; these two parameters should therefore be the focus of further efforts to reduce uncertainty. The results indicated a direction for future model development and data collection. Although there were still limitations in the framework illustrated here, it did provide a paradigm for systematic quantification of ecosystem model prediction uncertainty.
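
    For readers unfamiliar with the variance partitioning step, a bare-bones pick-freeze (Saltelli-type) estimator of first-order Sobol' indices is sketched below. The toy response function and the parameter distributions (labelled Amax, SLWC, and a Q10-like parameter) are invented stand-ins, not the Simplified Photosynthesis and Evapotranspiration model or the study's posteriors.

        import numpy as np

        def sobol_first_order(model, sample_inputs, n=20_000, seed=0):
            """Estimate first-order Sobol' indices with the pick-freeze scheme.

            model          : f(X) -> Y, vectorized over rows of X
            sample_inputs  : callable returning an (n, d) matrix of independent samples
            """
            rng = np.random.default_rng(seed)
            A, B = sample_inputs(n, rng), sample_inputs(n, rng)
            fA, fB = model(A), model(B)
            var_y = np.var(np.concatenate([fA, fB]), ddof=1)
            d = A.shape[1]
            S = np.empty(d)
            for i in range(d):
                ABi = A.copy()
                ABi[:, i] = B[:, i]          # freeze all columns of A except column i
                S[i] = np.mean(fB * (model(ABi) - fA)) / var_y
            return S

        # Toy stand-in for an ecosystem model response with three uncertain parameters.
        def toy_model(X):
            amax, slw, q10 = X[:, 0], X[:, 1], X[:, 2]
            return amax * slw + 0.1 * q10**2             # hypothetical response

        def sample(n, rng):
            return np.column_stack([rng.normal(15, 3, n),       # "Amax"
                                    rng.normal(0.03, 0.005, n), # "SLWC"
                                    rng.normal(2.0, 0.2, n)])   # a Q10-like parameter

        print(sobol_first_order(toy_model, sample))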

  4. Propagation of uncertainty and sensitivity analysis in an integral oil-gas plume model

    NASA Astrophysics Data System (ADS)

    Wang, Shitao; Iskandarani, Mohamed; Srinivasan, Ashwanth; Thacker, W. Carlisle; Winokur, Justin; Knio, Omar M.

    2016-05-01

    Polynomial Chaos expansions are used to analyze uncertainties in an integral oil-gas plume model simulating the Deepwater Horizon oil spill. The study focuses on six uncertain input parameters (two entrainment parameters, the gas-to-oil ratio, two parameters associated with the droplet-size distribution, and the flow rate) that impact the model's estimates of the plume's trap and peel heights, and of its various gas fluxes. The ranges of the uncertain inputs were determined by experimental data. Ensemble calculations were performed to construct polynomial chaos-based surrogates that describe the variations in the outputs due to variations in the uncertain inputs. The surrogates were then used to reliably estimate the statistics of the model outputs, and to perform an analysis of variance. Two experiments were performed to study the impacts of high and low flow rate uncertainties. The analysis shows that in the former case the flow rate is the largest contributor to output uncertainties, whereas in the latter case, with the uncertainty range constrained by a posteriori analyses, the flow rate's contribution becomes negligible. The trap and peel height uncertainties are then mainly due to uncertainties in the 95th percentile of the droplet size and in the entrainment parameters.
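
    As a minimal illustration of a regression-based polynomial chaos surrogate and the analysis of variance it enables, the sketch below fits a total-degree-2 Legendre expansion to a made-up two-parameter stand-in for the plume model. The response function, sample size, and resulting variance split are illustrative only and do not reproduce the study's six-parameter setup.

        import numpy as np

        # Normalized Legendre polynomials on [-1, 1] (orthonormal w.r.t. the uniform density 1/2).
        def psi(n, x):
            P = [np.ones_like(x), x, 0.5 * (3 * x**2 - 1)][n]
            return np.sqrt(2 * n + 1) * P

        # Hypothetical stand-in for the plume model: two uniform inputs in [-1, 1]
        # (e.g. scaled flow rate and an entrainment coefficient), one scalar output.
        def plume_model(q, alpha):
            return 300 + 80 * q + 25 * alpha + 10 * q * alpha - 5 * q**2

        rng = np.random.default_rng(1)
        n_train = 200
        q, a = rng.uniform(-1, 1, n_train), rng.uniform(-1, 1, n_train)
        y = plume_model(q, a)

        # Total-degree-2 polynomial chaos basis in two dimensions.
        multi_index = [(0, 0), (1, 0), (0, 1), (2, 0), (1, 1), (0, 2)]
        Phi = np.column_stack([psi(i, q) * psi(j, a) for i, j in multi_index])
        coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # regression-based PCE fit

        mean = coef[0]                    # orthonormal basis: mean is the constant coefficient
        variance = np.sum(coef[1:] ** 2)  # variance is the sum of the remaining squared coefficients
        # First-order variance contributions (analysis of variance from the surrogate):
        S_q = (coef[1]**2 + coef[3]**2) / variance
        S_a = (coef[2]**2 + coef[5]**2) / variance
        print(mean, variance, S_q, S_a)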

  5. Uncertainties in the governance of animal disease: an interdisciplinary framework for analysis

    PubMed Central

    Fish, Robert; Austin, Zoe; Christley, Robert; Haygarth, Philip M.; Heathwaite, Louise A.; Latham, Sophia; Medd, William; Mort, Maggie; Oliver, David M.; Pickup, Roger; Wastling, Jonathan M.; Wynne, Brian

    2011-01-01

    Uncertainty is an inherent feature of strategies to contain animal disease. In this paper, an interdisciplinary framework for representing strategies of containment, and analysing how uncertainties are embedded and propagated through them, is developed and illustrated. Analysis centres on persistent, periodic and emerging disease threats, with a particular focus on cryptosporidiosis, foot and mouth disease and avian influenza. Uncertainty is shown to be produced at strategic, tactical and operational levels of containment, and across the different arenas of disease prevention, anticipation and alleviation. The paper argues for more critically reflexive assessments of uncertainty in containment policy and practice. An interdisciplinary approach has an important contribution to make, but is absent from current real-world containment policy. PMID:21624922

  6. Incorporation of Uncertainty and Variability of Drip Shield and Waste Package Degradation in WAPDEG Analysis

    SciTech Connect

    J.C. Helton

    2000-04-19

    This presentation investigates the incorporation of uncertainty and variability of drip shield and waste package degradation in analyses with the Waste Package Degradation (WAPDEG) program (CRWMS M&O 1998). This plan was developed in accordance with Development Plan TDP-EBS-MD-000020 (CRWMS M&O 1999a). Topics considered include (1) the nature of uncertainty and variability (Section 6.1), (2) incorporation of variability and uncertainty into analyses involving individual patches, waste packages, groups of waste packages, and the entire repository (Section 6.2), (3) computational strategies (Section 6.3), (4) incorporation of multiple waste package layers (i.e., drip shield, Alloy 22, and stainless steel) into an analysis (Section 6.4), (5) uncertainty in the characterization of variability (Section 6.5), and (6) Gaussian variance partitioning (Section 6.6). The presentation ends with a brief concluding discussion (Section 7).

  7. IAEA CRP on HTGR Uncertainty Analysis: Benchmark Definition and Test Cases

    SciTech Connect

    Gerhard Strydom; Frederik Reitsma; Hans Gougar; Bismark Tyobeka; Kostadin Ivanov

    2012-11-01

    Uncertainty and sensitivity studies are essential elements of the reactor simulation code verification and validation process. Although several international uncertainty quantification activities have been launched in recent years in the LWR, BWR and VVER domains (e.g. the OECD/NEA BEMUSE program [1], from which the current OECD/NEA LWR Uncertainty Analysis in Modelling (UAM) benchmark [2] effort was derived), the systematic propagation of uncertainties in cross-section, manufacturing and model parameters for High Temperature Reactor (HTGR) designs has not been attempted yet. This paper summarises the scope, objectives and exercise definitions of the IAEA Coordinated Research Project (CRP) on HTGR UAM [3]. Note that no results will be included here, as the HTGR UAM benchmark was only launched formally in April 2012, and the specification is currently still under development.

  8. Development and Uncertainty Analysis of an Automatic Testing System for Diffusion Pump Performance

    NASA Astrophysics Data System (ADS)

    Zhang, S. W.; Liang, W. S.; Zhang, Z. J.

    A newly developed automatic testing system used in the laboratory for diffusion pump performance measurement is introduced in this paper. By using two optical fiber sensors to indicate the oil level in a glass buret and a needle valve driven by a stepper motor to regulate the pressure in the test dome, the system can automatically test the ultimate pressure and pumping speed of a diffusion pump in accordance with ISO 1608. Uncertainty analysis theory is applied to analyze the pumping speed measurement results. Based on the test principle and system structure, the contribution of each component and test step to the final uncertainty is studied. Using the differential method, a mathematical model of the systematic uncertainty transfer function is established. Finally, in a case study, the combined uncertainties of manual and automatic operation are compared (6.11% and 5.87%, respectively). The reasonableness and practicality of the newly developed automatic testing system are thus demonstrated.
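
    A minimal sketch of the differential (first-order Taylor) propagation step is given below. The measurement equation and the numerical values are invented placeholders; the actual ISO 1608 buret relation and the system's error budget are not given in the abstract.

        import sympy as sp

        # Generic measurement equation for a pumping speed S; the specific ISO 1608
        # buret relation is not stated in the abstract, so this form is only illustrative.
        V, t, p0, p = sp.symbols('V t p0 p', positive=True)
        S = V * p0 / (t * p)

        inputs = {V:  (1.0e-6, 2.0e-8),    # value, standard uncertainty (m^3)
                  t:  (10.0, 0.05),        # s
                  p0: (101325.0, 100.0),   # Pa
                  p:  (1.0e-3, 3.0e-5)}    # Pa

        # Differential propagation: u_c^2 = sum_i (dS/dx_i * u_i)^2
        values = {x: v for x, (v, _) in inputs.items()}
        u_c2 = sum((sp.diff(S, x).subs(values) * u)**2 for x, (_, u) in inputs.items())
        S_val = S.subs(values)
        # value, combined standard uncertainty, relative combined uncertainty
        print(float(S_val), float(sp.sqrt(u_c2)), float(sp.sqrt(u_c2) / S_val))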

  9. Uncertainty analysis for 3D geological modeling using the Kriging variance

    NASA Astrophysics Data System (ADS)

    Choi, Yosoon; Choi, Younjung; Park, Sebeom; Um, Jeong-Gi

    2014-05-01

    The credible estimation of geological properties is critical in many geoscience fields, including geotechnical, environmental, mining, and petroleum engineering. Many interpolation techniques have been developed to estimate geological properties from limited sampling data such as borehole logs. Kriging is an interpolation technique that gives the best linear unbiased prediction of intermediate values. It also provides the Kriging variance, which quantifies the uncertainty of the Kriging estimates. This study provides a new method to analyze the uncertainty in 3D geological modeling using the Kriging variance. The cut-off values determined by the Kriging variance were used to effectively visualize the 3D geological models with different confidence levels. This presentation describes the method for uncertainty analysis and a case study which evaluates the amount of recoverable resources by considering the uncertainty.
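
    A minimal sketch of an ordinary Kriging estimate and its Kriging variance is given below; the spherical variogram parameters and the borehole data are invented for illustration, and the study's actual 3D workflow and cut-off selection are not reproduced here.

        import numpy as np

        def spherical_variogram(h, sill=1.0, rng_=100.0, nugget=0.0):
            h = np.asarray(h, dtype=float)
            g = nugget + sill * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)
            return np.where(h >= rng_, nugget + sill, g)

        def ordinary_kriging(xy, z, x0, variogram=spherical_variogram):
            """Ordinary Kriging estimate and Kriging variance at a single location x0."""
            n = len(z)
            d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)  # pairwise distances
            A = np.zeros((n + 1, n + 1))
            A[:n, :n] = variogram(d)
            A[n, :n] = A[:n, n] = 1.0                                     # unbiasedness constraint
            b = np.append(variogram(np.linalg.norm(xy - x0, axis=1)), 1.0)
            sol = np.linalg.solve(A, b)
            weights, mu = sol[:n], sol[n]
            estimate = weights @ z
            kriging_variance = weights @ b[:n] + mu                       # local uncertainty measure
            return estimate, kriging_variance

        # Hypothetical borehole data (coordinates in m, property value such as a grade):
        xy = np.array([[0., 0.], [50., 10.], [20., 80.], [90., 60.]])
        z = np.array([2.1, 2.8, 1.9, 3.3])
        est, kvar = ordinary_kriging(xy, z, np.array([40., 40.]))
        print(est, kvar)  # a cut-off on kvar can mask low-confidence cells in a 3D model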

  10. Uncertainty and Sensitivity Analysis of Afterbody Radiative Heating Predictions for Earth Entry

    NASA Technical Reports Server (NTRS)

    West, Thomas K., IV; Johnston, Christopher O.; Hosder, Serhat

    2016-01-01

    The objective of this work was to perform sensitivity analysis and uncertainty quantification for afterbody radiative heating predictions of the Stardust capsule during Earth entry at peak afterbody radiation conditions. The radiation environment in the afterbody region poses significant challenges for accurate uncertainty quantification and sensitivity analysis due to the complexity of the flow physics, computational cost, and large number of uncertain variables. In this study, first a sparse collocation non-intrusive polynomial chaos approach along with global non-linear sensitivity analysis was used to identify the most significant uncertain variables and reduce the dimensions of the stochastic problem. Then, a total order stochastic expansion was constructed over only the important parameters for an efficient and accurate estimate of the uncertainty in radiation. Based on previous work, 388 uncertain parameters were considered in the radiation model, which came from the thermodynamics, flow field chemistry, and radiation modeling. The sensitivity analysis showed that only four of these variables contributed significantly to afterbody radiation uncertainty, accounting for almost 95% of the uncertainty. These included the electron-impact excitation rate for N between level 2 and level 5 and the rates of three chemical reactions influencing the N, N(+), O, and O(+) number densities in the flow field.

  11. Overall uncertainty measurement for near infrared analysis of cryptotanshinone in tanshinone extract.

    PubMed

    Xue, Zhong; Xu, Bing; Shi, Xinyuan; Yang, Chan; Cui, Xianglong; Luo, Gan; Qiao, Yanjiang

    2017-01-01

    This study presented a new strategy of overall uncertainty measurement for near infrared (NIR) quantitative analysis of cryptotanshinone in tanshinone extract powders. The overall uncertainty of the NIR analysis, derived from validation data from precision, trueness, and robustness studies, was fully investigated and discussed. Quality by design (QbD) elements, such as risk assessment and design of experiments (DOE), were utilized to organize the validation data. An "I×J×K" (number of series I, number of repetitions J, and number of concentration levels K) full factorial design was used to calculate uncertainty from the precision and trueness data. A 2^(7-4) Plackett-Burman design matrix with four influence factors identified by failure mode and effect analysis (FMEA) was adopted for the robustness study. The overall uncertainty profile was introduced as a graphical decision making tool to evaluate the validity of the NIR method over the predefined concentration range. In comparison with T. Saffaj's method (Analyst, 2013, 138, 4677) for overall uncertainty assessment, the proposed approach gave almost the same results, demonstrating that the proposed method was reasonable and valid. Moreover, the proposed method can help identify critical factors that influence the NIR prediction performance, which could be used for further optimization of the NIR analytical procedures in routine use. PMID:27404670

  12. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Knowledge Advancement.

    SciTech Connect

    Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.; Ross, Kyle; Cardoni, Jeffrey N; Kalinich, Donald A.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie; Ghosh, S. Tina

    2014-02-01

    This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input to potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences of each selected uncertain parameter, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as the failure rate of the safety relief valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependence of the effect of cesium chemical form on the accident progression.

  13. An example uncertainty and sensitivity analysis at the Horonobe site for performance assessment calculations.

    SciTech Connect

    James, Scott Carlton

    2004-08-01

    Given pre-existing Groundwater Modeling System (GMS) models of the Horonobe Underground Research Laboratory (URL) at both the regional and site scales, this work performs an example uncertainty analysis for performance assessment (PA) applications. After a general overview of uncertainty and sensitivity analysis techniques, the existing GMS site-scale model is converted to a PA model of the steady-state conditions expected after URL closure. This is done to examine the impact of uncertainty in site-specific data in conjunction with conceptual model uncertainty regarding the location of the Oomagari Fault. In addition, a quantitative analysis of the ratio of dispersive to advective forces, the F-ratio, is performed for stochastic realizations of each conceptual model. All analyses indicate that accurate characterization of the Oomagari Fault with respect to both location and hydraulic conductivity is critical to PA calculations. This work defines and outlines typical uncertainty and sensitivity analysis procedures and demonstrates them with example PA calculations relevant to the Horonobe URL.

  14. Uncertainty analysis and global sensitivity analysis of techno-economic assessments for biodiesel production.

    PubMed

    Tang, Zhang-Chun; Zhenzhou, Lu; Zhiwen, Liu; Ningcong, Xiao

    2015-01-01

    There are various uncertain parameters in the techno-economic assessments (TEAs) of biodiesel production, including capital cost, interest rate, feedstock price, maintenance rate, biodiesel conversion efficiency, glycerol price and operating cost. However, few studies have focused on the influence of these parameters on TEAs. This paper investigated the effects of these parameters on the life cycle cost (LCC) and the unit cost (UC) in the TEAs of biodiesel production. The results show that LCC and UC exhibit considerable variation when the uncertain parameters are taken into account. Based on the uncertainty analysis, three global sensitivity analysis (GSA) methods are utilized to quantify the contribution of each individual uncertain parameter to LCC and UC. The GSA results reveal that the feedstock price and the interest rate produce considerable effects on the TEAs. These results can provide a useful guide for entrepreneurs when they plan biodiesel plants. PMID:25459861

  15. A Monte Carlo Uncertainty Analysis of Ozone Trend Predictions in a Two Dimensional Model. Revision

    NASA Technical Reports Server (NTRS)

    Considine, D. B.; Stolarski, R. S.; Hollandsworth, S. M.; Jackman, C. H.; Fleming, E. L.

    1998-01-01

    We use Monte Carlo analysis to estimate the uncertainty in predictions of total O3 trends between 1979 and 1995 made by the Goddard Space Flight Center (GSFC) two-dimensional (2D) model of stratospheric photochemistry and dynamics. The uncertainty is caused by gas-phase chemical reaction rates, photolysis coefficients, and heterogeneous reaction parameters, which are model inputs. The uncertainty represents a lower bound to the total model uncertainty assuming the input parameter uncertainties are characterized correctly. Each of the Monte Carlo runs was initialized in 1970 and integrated for 26 model years through the end of 1995. This was repeated 419 times using input parameter sets generated by Latin Hypercube Sampling. The standard deviation (σ) of the Monte Carlo ensemble of total O3 trend predictions is used to quantify the model uncertainty. The 34% difference between the model trend in globally and annually averaged total O3 using nominal inputs and atmospheric trends calculated from Nimbus 7 and Meteor 3 total ozone mapping spectrometer (TOMS) version 7 data is less than the calculated 1σ model uncertainty of 46%, so there is no significant difference between the modeled and observed trends. In the northern hemisphere midlatitude spring the modeled and observed total O3 trends differ by more than 1σ but less than 2σ, which we refer to as marginal significance. We perform a multiple linear regression analysis of the runs which suggests that only a few of the model reactions contribute significantly to the variance in the model predictions. The lack of significance in these comparisons suggests that they are of questionable use as guides for continuing model development. Large model/measurement differences which are many multiples of the input parameter uncertainty are seen in the meridional gradients of the trend and the peak-to-peak variations in the trends over an annual cycle. These discrepancies unambiguously indicate model formulation
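
    A minimal Latin Hypercube Sampling sketch is shown below. The sample count of 419 mirrors the abstract, but the three lognormal "rate factor" distributions are invented stand-ins for the model's actual input parameter uncertainties.

        import numpy as np
        from scipy import stats

        def latin_hypercube(n_samples, n_params, seed=None):
            """Stratified (Latin hypercube) samples on the unit hypercube."""
            rng = np.random.default_rng(seed)
            # one random point inside each of n_samples equal-probability strata, per parameter
            u = (rng.random((n_samples, n_params)) + np.arange(n_samples)[:, None]) / n_samples
            for j in range(n_params):                       # shuffle strata per parameter
                u[:, j] = u[rng.permutation(n_samples), j]
            return u

        # Example: map unit-cube samples to lognormal uncertainty factors on, say,
        # three reaction rates (illustrative distributions, not the GSFC model's inputs).
        u = latin_hypercube(419, 3, seed=7)
        rate_factors = stats.lognorm.ppf(u, s=np.log(1.3))  # ~30% 1-sigma uncertainty factors
        print(rate_factors.shape)                            # (419, 3): one input set per model run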

  16. HYDROLOGIC MODEL CALIBRATION AND UNCERTAINTY IN SCENARIO ANALYSIS

    EPA Science Inventory

    A systematic analysis of model performance during simulations based on observed land-cover/use change is used to quantify error associated with water-yield simulations for a series of known landscape conditions over a 24-year period with the goal of evaluatin...

  17. Uncertainty Analysis of LROC NAC Derived Elevation Models

    NASA Astrophysics Data System (ADS)

    Burns, K.; Yates, D. G.; Speyerer, E.; Robinson, M. S.

    2012-12-01

    One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) [1] is to gather stereo observations with the Narrow Angle Camera (NAC) to generate digital elevation models (DEMs). From an altitude of 50 km, the NAC acquires images with a pixel scale of 0.5 meters, and a dual NAC observation covers approximately 5 km cross-track by 25 km down-track. This low altitude was common from September 2009 to December 2011. Images acquired during the commissioning phase and those acquired from the fixed orbit (after 11 December 2011) have pixel scales that range from 0.35 meters at the south pole to 2 meters at the north pole. Altimetric observations obtained by the Lunar Orbiter Laser Altimeter (LOLA) provide spacecraft-to-surface range measurements to ±0.1 m [2]. However, uncertainties in the spacecraft positioning can result in offsets (±20 m) between altimeter tracks over many orbits. The LROC team is currently developing a tool to automatically register altimetric observations to NAC DEMs [3]. Using a generalized pattern search (GPS) algorithm, the new automatic registration adjusts the spacecraft position and pointing information during times when NAC images, as well as LOLA measurements, of the same region are acquired to provide an absolute reference frame for the DEM. This information is then imported into SOCET SET to aid in creating controlled NAC DEMs. For every DEM, a figure of merit (FOM) map is generated using SOCET SET software. This is a valuable tool for determining the relative accuracy of a specific pixel in a DEM. Each pixel in a FOM map is assigned a value indicating its "quality," based on whether the pixel was shadowed, saturated, suspicious, interpolated/extrapolated, or successfully correlated. The overall quality of a NAC DEM is a function of both the absolute and relative accuracies. LOLA altimetry provides the most accurate absolute geodetic reference frame with which the NAC DEMs can be compared. Offsets

  18. Implementation of a Bayesian Engine for Uncertainty Analysis

    SciTech Connect

    Leng Vang; Curtis Smith; Steven Prescott

    2014-08-01

    In probabilistic risk assessment, it is important to have an environment where analysts have access to a shared and secure high-performance computing environment and a statistical analysis tool package. As part of the advanced small modular reactor probabilistic risk analysis framework implementation, we have identified the need for advanced Bayesian computations. However, in order to make this technology available to non-specialists, there is also a need for a simplified tool that allows users to author models and evaluate them within this framework. As a proof-of-concept, we have implemented an advanced open source Bayesian inference tool, OpenBUGS, within the browser-based cloud risk analysis framework that is under development at the Idaho National Laboratory. This development, the "OpenBUGS Scripter," has been implemented as a client-side, visual, web-based integrated development environment for creating OpenBUGS language scripts. It depends on the shared server environment to execute the generated scripts and to transmit results back to the user. The visual models are in the form of linked diagrams, from which we automatically create the applicable OpenBUGS script that matches the diagram. These diagrams can be saved locally or stored on the server environment to be shared with other users.

  19. Development of a statistical sampling method for uncertainty analysis with SCALE

    SciTech Connect

    Williams, M.; Wiarda, D.; Smith, H.; Jessee, M. A.; Rearden, B. T.; Zwermann, W.; Klein, M.; Pautz, A.; Krzykacz-Hausmann, B.; Gallner, L.

    2012-07-01

    A new statistical sampling sequence called Sampler has been developed for the SCALE code system. Random values for the input multigroup cross sections are determined by using the XSUSA program to sample uncertainty data provided in the SCALE covariance library. Using these samples, Sampler computes perturbed self-shielded cross sections and propagates the perturbed nuclear data through any specified SCALE analysis sequence, including those for criticality safety, lattice physics with depletion, and shielding calculations. Statistical analysis of the output distributions provides uncertainties and correlations in the desired responses. (authors)
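
    As a minimal sketch of the statistical sampling idea behind Sampler (not its actual implementation), the snippet below draws correlated multiplicative perturbation factors from a covariance matrix. The 3×3 relative covariance values are invented for illustration; real multigroup covariance libraries are far larger.

        import numpy as np

        def sample_correlated_perturbations(cov, n_samples, seed=None):
            """Draw multiplicative perturbation factors for nuclear data values whose
            relative uncertainties and correlations are given by a covariance matrix."""
            rng = np.random.default_rng(seed)
            L = np.linalg.cholesky(cov)               # cov = L @ L.T
            z = rng.standard_normal((n_samples, cov.shape[0]))
            return 1.0 + z @ L.T                      # factors centered on the nominal data

        # Illustrative 3x3 relative covariance for three multigroup cross sections
        # (values invented for the example, not from the SCALE covariance library):
        rel_cov = np.array([[0.02**2,        0.5 * 0.02 * 0.03, 0.0],
                            [0.5 * 0.02 * 0.03, 0.03**2,        0.0],
                            [0.0,               0.0,            0.01**2]])
        factors = sample_correlated_perturbations(rel_cov, n_samples=300, seed=11)
        # each row scales the nominal cross sections for one perturbed transport calculation
        print(factors.mean(axis=0), factors.std(axis=0))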

  20. Detailed Uncertainty Analysis for Ares I Ascent Aerodynamics Wind Tunnel Database

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.; Hanke, Jeremy L.; Walker, Eric L.; Houlden, Heather P.

    2008-01-01

    A detailed uncertainty analysis for the Ares I ascent aero 6-DOF wind tunnel database is described. While the database itself is determined using only the test results for the latest configuration, the data used for the uncertainty analysis comes from four tests on two different configurations at the Boeing Polysonic Wind Tunnel in St. Louis and the Unitary Plan Wind Tunnel at NASA Langley Research Center. Four major error sources are considered: (1) systematic errors from the balance calibration curve fits and model + balance installation, (2) run-to-run repeatability, (3) boundary-layer transition fixing, and (4) tunnel-to-tunnel reproducibility.

  1. Uncertainty Analysis for a Virtual Flow Meter Using an Air-Handling Unit Chilled Water Valve

    SciTech Connect

    Song, Li; Wang, Gang; Brambley, Michael R.

    2013-04-28

    A virtual water flow meter is developed that uses the chilled water control valve on an air-handling unit as a measurement device. The flow rate of water through the valve is calculated using the differential pressure across the valve and its associated coil, the valve command, and an empirically determined valve characteristic curve. Thus, the probability of error in the measurements is significantly greater than for conventionally manufactured flow meters. In this paper, mathematical models are developed and used to conduct uncertainty analysis for the virtual flow meter, and the results from the virtual meter are compared to measurements made with an ultrasonic flow meter. Theoretical uncertainty analysis shows that the total uncertainty in flow rates from the virtual flow meter is 1.46% with 95% confidence; comparison of virtual flow meter results with measurements from an ultrasonic flow meter yielded anuncertainty of 1.46% with 99% confidence. The comparable results from the theoretical uncertainty analysis and empirical comparison with the ultrasonic flow meter corroborate each other, and tend to validate the approach to computationally estimating uncertainty for virtual sensors introduced in this study.

  2. Uncertainty analysis in environmental radioactivity measurements using the Monte Carlo code MCNP5

    NASA Astrophysics Data System (ADS)

    Gallardo, S.; Querol, A.; Ortiz, J.; Ródenas, J.; Verdú, G.; Villanueva, J. F.

    2015-11-01

    High Purity Germanium (HPGe) detectors are widely used for environmental radioactivity measurements due to their excellent energy resolution. Monte Carlo (MC) codes are a useful tool to complement experimental measurements in calibration procedures at the laboratory. However, the efficiency curve of the detector can vary due to uncertainties associated with measurements. These uncertainties can be classified into several categories: geometrical parameters of the measurement (source-detector distance, source volume), properties of the radiation source (radionuclide activity, branching ratio), and detector characteristics (Ge dead layer, active volume, end cap thickness). The Monte Carlo simulation can also be affected by other kinds of uncertainties, mainly related to cross sections and to the calculation itself. Normally, these uncertainties are not well known, and a detailed analysis is required to determine their effect on the detector efficiency. In this work, the Noether-Wilks formula is used to carry out the uncertainty analysis. A Probability Density Function (PDF) is assigned to each variable involved in the sampling process. The size of the sampling is determined from the characteristics of the tolerance intervals by applying the Noether-Wilks formula. The analysis transforms the efficiency curve into a region of possible values bounded by the tolerance intervals. Results show a good agreement between experimental measurements and simulations for two different matrices (water and sand).
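
    The sample-size step can be illustrated in a few lines of code. The sketch below uses the standard first-order (non-parametric) Wilks relations for one- and two-sided tolerance intervals; this is a simplification of the Noether-Wilks treatment cited in the abstract.

        import math

        def wilks_sample_size(coverage=0.95, confidence=0.95, two_sided=False):
            """Smallest sample size for a first-order non-parametric tolerance interval."""
            n = 1
            while True:
                if two_sided:
                    beta = 1 - coverage**n - n * (1 - coverage) * coverage**(n - 1)
                else:
                    beta = 1 - coverage**n
                if beta >= confidence:
                    return n
                n += 1

        print(wilks_sample_size())                 # 59 runs for a one-sided 95%/95% statement
        print(wilks_sample_size(two_sided=True))   # 93 runs for a two-sided 95%/95% statement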

  3. Uncertainty Analysis of Inertial Model Attitude Sensor Calibration and Application with a Recommended New Calibration Method

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Statistical tools, previously developed for nonlinear least-squares estimation of multivariate sensor calibration parameters and the associated calibration uncertainty analysis, have been applied to single- and multiple-axis inertial model attitude sensors used in wind tunnel testing to measure angle of attack and roll angle. The analysis provides confidence and prediction intervals of calibrated sensor measurement uncertainty as functions of applied input pitch and roll angles. A comparative performance study of various experimental designs for inertial sensor calibration is presented along with corroborating experimental data. The importance of replicated calibrations over extended time periods has been emphasized; replication provides independent estimates of calibration precision and bias uncertainties, statistical tests for calibration or modeling bias uncertainty, and statistical tests for sensor parameter drift over time. A set of recommendations for a new standardized model attitude sensor calibration method and usage procedures is included. The statistical information provided by these procedures is necessary for the uncertainty analysis of aerospace test results now required by users of industrial wind tunnel test facilities.

  4. Uncertainty Analysis of Sonic Boom Levels Measured in a Simulator at NASA Langley

    NASA Technical Reports Server (NTRS)

    Rathsam, Jonathan; Ely, Jeffry W.

    2012-01-01

    A sonic boom simulator has been constructed at NASA Langley Research Center for testing the human response to sonic booms heard indoors. Like all measured quantities, sonic boom levels in the simulator are subject to systematic and random errors. To quantify these errors, and their net influence on the measurement result, a formal uncertainty analysis is conducted. Knowledge of the measurement uncertainty, or range of values attributable to the quantity being measured, enables reliable comparisons among measurements at different locations in the simulator as well as comparisons with field data or laboratory data from other simulators. The analysis reported here accounts for acoustic excitation from two sets of loudspeakers: one loudspeaker set at the facility exterior that reproduces the exterior sonic boom waveform and a second set of interior loudspeakers for reproducing indoor rattle sounds. The analysis also addresses the effect of pressure fluctuations generated when exterior doors of the building housing the simulator are opened. An uncertainty budget is assembled to document each uncertainty component, its sensitivity coefficient, and the combined standard uncertainty. The latter quantity will be reported alongside measurement results in future research reports to indicate data reliability.

  5. IAEA Coordinated Research Project on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis

    SciTech Connect

    Strydom, Gerhard; Bostelmann, F.

    2015-09-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained). SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on derivative-based methods such as stochastic sampling methods or on generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties) extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on

  6. A new algorithm for five-hole probe calibration, data reduction, and uncertainty analysis

    NASA Technical Reports Server (NTRS)

    Reichert, Bruce A.; Wendt, Bruce J.

    1994-01-01

    A new algorithm for five-hole probe calibration and data reduction using a non-nulling method is developed. The significant features of the algorithm are: (1) two components of the unit vector in the flow direction replace pitch and yaw angles as flow direction variables; and (2) symmetry rules are developed that greatly simplify Taylor's series representations of the calibration data. In data reduction, four pressure coefficients allow total pressure, static pressure, and flow direction to be calculated directly. The new algorithm's simplicity permits an analytical treatment of the propagation of uncertainty in five-hole probe measurement. The objectives of the uncertainty analysis are to quantify uncertainty of five-hole results (e.g., total pressure, static pressure, and flow direction) and determine the dependence of the result uncertainty on the uncertainty of all underlying experimental and calibration measurands. This study outlines a general procedure that other researchers may use to determine five-hole probe result uncertainty and provides guidance to improve measurement technique. The new algorithm is applied to calibrate and reduce data from a rake of five-hole probes. Here, ten individual probes are mounted on a single probe shaft and used simultaneously. Use of this probe is made practical by the simplicity afforded by this algorithm.

  7. Uncertainty in the Specification of Surface Characteristics, Part ii: Hierarchy of Interaction-Explicit Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Niyogi, Devdutta S.; Raman, Sethu; Alapaty, Kiran

    The uncertainty in the specification of surface characteristics in soil-vegetation-atmosphere-transfer (SVAT) schemes within planetary boundary-layer (PBL) or mesoscale models is addressed. The hypothesis to be tested is whether the errors in the specification of the individual parameters are cumulative or whether they tend to balance each other in the overall sense for the system. A hierarchy of statistical applications is developed: classical one-at-a-time (OAT) approach, level 1; linear analysis of variance (ANOVA), level 1.5; fractional factorial (FF), level 2; two-factor interaction (TFI) technique, level 2.5; and non-linear response surface methodology (RSM), level 3. Using the First ISLSCP Field Experiment (FIFE) observations for June 6, 1987 as the initial condition for a SVAT scheme dynamically coupled to a PBL model, the interactions between uncertainty errors are analyzed. A secondary objective addresses the temporal changes in the uncertainty pattern using data for morning, afternoon, and evening conditions. It is found that the outcome from the level 1 OAT-like studies can be considered as the limiting uncertainty values for the majority of mesoscale cases. From the higher-level analyses, it is concluded that for most of the moderate surface scenarios, the effective uncertainty from the individual parameters is balanced and thus lowered. However, for the extreme cases, such as near wilting or saturation soil moisture, the uncertainties add up synergistically and these effects can be even greater than those from the outcomes of the OAT-like studies. Thus, parameter uncertainty cannot be simply related to its deviation alone, but is also dependent on other parameter settings. Also, from the temporal changes in the interaction pattern studies, it is found that, for the morning case soil texture is the important parameter, for the afternoon vegetation parameters are crucial, while for the evening case soil moisture is capable of propagating

  8. Effect of soil property uncertainties on permafrost thaw projections: a calibration-constrained analysis

    NASA Astrophysics Data System (ADS)

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2015-06-01

    The effects of soil property uncertainties on permafrost thaw projections are studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume and time integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although significant, is
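
    Since this and several of the following records rely on the Null-Space Monte Carlo approach, a highly simplified sketch of its core idea is given below: perturb the calibrated parameters only along directions that the calibration data cannot constrain. The Jacobian, parameter values, and prior spread are invented for illustration; the full workflow (for example, as implemented in PEST) also re-runs the model and re-minimizes residuals for each candidate set.

        import numpy as np

        def null_space_samples(jacobian, p_calibrated, p_prior_samples, energy=0.999):
            """Project random prior parameter sets onto the calibration null space (NSMC idea).

            jacobian        : (n_obs, n_par) sensitivities of observations to parameters
            p_calibrated    : (n_par,) calibrated parameter vector
            p_prior_samples : (n_samples, n_par) parameter sets drawn from the prior
            """
            U, s, Vt = np.linalg.svd(jacobian, full_matrices=True)
            # keep the singular directions that carry most of the observation information
            k = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), energy)) + 1
            V_null = Vt[k:].T                              # columns span the null space
            dp = p_prior_samples - p_calibrated
            # retain only the null-space component of each random deviation so the
            # perturbed sets still honour the calibration data (to first order)
            return p_calibrated + dp @ V_null @ V_null.T

        # Tiny illustrative problem (3 parameters, 2 observations -> 1-D null space):
        J = np.array([[1.0, 0.5, 0.0],
                      [0.2, 1.0, 0.0]])
        p_cal = np.array([0.3, 0.45, 0.8])
        rng = np.random.default_rng(3)
        priors = rng.normal(p_cal, 0.2, size=(500, 3))
        ensemble = null_space_samples(J, p_cal, priors)
        print(ensemble.mean(axis=0))  # each row would feed one forward projection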

  9. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    SciTech Connect

    Harp, Dylan; Atchley, Adam; Painter, Scott L; Coon, Ethan T.; Wilson, Cathy; Romanovsky, Vladimir E; Rowland, Joel

    2016-01-01

    The effects of soil property uncertainties on permafrost thaw projections are studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume and time integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although significant

  10. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    SciTech Connect

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2015-06-29

    The effects of soil property uncertainties on permafrost thaw projections are studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume and time integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although significant, is

  11. Uncertainty analysis for absorbed dose from a brain receptor imaging agent

    SciTech Connect

    Aydogan, B.; Miller, L.F.; Sparks, R.B.; Stubbs, J.B.

    1999-01-01

    Absorbed dose estimates are known to contain uncertainties. A recent literature search indicates that prior to this study no rigorous investigation of uncertainty associated with absorbed dose has been undertaken. A method of uncertainty analysis for absorbed dose calculations has been developed and implemented for the brain receptor imaging agent {sup 123}I-IPT. The two major sources of uncertainty considered were the uncertainty associated with the determination of residence time and that associated with the determination of the S values. There are many sources of uncertainty in the determination of the S values, but only the inter-patient organ mass variation was considered in this work. The absorbed dose uncertainties were determined for lung, liver, heart and brain. Ninety-five percent confidence intervals of the organ absorbed dose distributions for each patient and for a seven-patient population group were determined by the ``Latin Hypercube Sampling`` method. For an individual patient, the upper bound of the 95% confidence interval of the absorbed dose was found to be about 2.5 times larger than the estimated mean absorbed dose. For the seven-patient population the upper bound of the 95% confidence interval of the absorbed dose distribution was around 45% more than the estimated population mean. For example, the 95% confidence interval of the population liver dose distribution was found to be between 1.49E+0.7 Gy/MBq and 4.65E+07 Gy/MBq with a mean of 2.52E+07 Gy/MBq. This study concluded that patients in a population receiving {sup 123}I-IPT could receive absorbed doses as much as twice as large as the standard estimated absorbed dose due to these uncertainties.

  12. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    DOE PAGESBeta

    Harp, Dylan R.; Atchley, Adam L.; Painter, Scott L.; Coon, Ethan T.; Wilson, Cathy J.; Romanovsky, Vladimir E.; Rowland, Joel C.

    2016-02-11

    Here, the effects of soil property uncertainties on permafrost thaw projections are studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume and time integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties

  13. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    DOE PAGESBeta

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2015-06-29

    The effects of soil property uncertainties on permafrost thaw projections are studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume and time integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although

  14. How protective are respirator assigned protection factors: an uncertainty analysis.

    PubMed

    Nelson, T J; Jayjock, M A; Colton, C E

    2000-01-01

    This investigation evaluated the risk of overexposure for a selected assigned protection factor by performing Monte Carlo simulations. A model was constructed to assess respirator performance by calculating the concentration inside the respirator. Estimates of the factors that affect respirator performance were described as distributions. The distributions used a worst case estimate for concentration in the workplace, the worst case for respirator performance (the fifth percentile person), and the worst case for exhalation valve leakage. A Monte Carlo analysis then provided estimates of the percentage of time that concentration inside the respirator exceeded the occupational exposure limit (OEL). For a half-facepiece respirator with an APF of 10, the calculations indicated a low risk of being exposed above an OEL, with mean exposures being controlled well below an OEL. PMID:10885889
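
    A minimal sketch of this kind of Monte Carlo check is shown below: it draws a workplace concentration and a simulated protection factor, forms the in-facepiece concentration, and reports how often it exceeds the OEL. The lognormal parameters are invented placeholders and are not the worst-case distributions fitted in the study.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        # Illustrative distributions in multiples of the OEL (not the paper's fitted values):
        C_out = rng.lognormal(mean=np.log(5.0),  sigma=np.log(1.8), size=n)  # workplace concentration
        PF    = rng.lognormal(mean=np.log(25.0), sigma=np.log(1.6), size=n)  # simulated protection factor

        C_in = C_out / PF                        # concentration inside the facepiece
        print(f"P(C_in > OEL) ~ {100 * np.mean(C_in > 1.0):.1f}%, "
              f"mean in-mask exposure ~ {C_in.mean():.2f} x OEL")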

  15. Interval arithmetic operations for uncertainty analysis with correlated interval variables

    NASA Astrophysics Data System (ADS)

    Jiang, Chao; Fu, Chun-Ming; Ni, Bing-Yu; Han, Xu

    2016-08-01

    A new interval arithmetic method is proposed to solve interval functions with correlated intervals, through which the overestimation problem in interval analysis can be significantly alleviated. The correlation between interval parameters is defined by the multidimensional parallelepiped model, which is convenient for describing correlated and independent interval variables in a unified framework. The original interval variables with correlation are transformed into the standard space without correlation, and then the relationship between the original variables and the standard interval variables is obtained. The expressions of four basic interval arithmetic operations, namely addition, subtraction, multiplication, and division, are given in the standard space. Finally, several numerical examples and a two-step bar are used to demonstrate the effectiveness of the proposed method.
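
    For reference, the four basic operations of conventional (uncorrelated) interval arithmetic are easy to write down; the sketch below also illustrates the dependency-induced overestimation (a - a is [-1, 1] rather than [0, 0]) that correlation-aware formulations such as the parallelepiped model are designed to reduce. It does not implement the proposed correlated transformation itself.

        from dataclasses import dataclass

        @dataclass
        class Interval:
            lo: float
            hi: float

            def __add__(self, other):
                return Interval(self.lo + other.lo, self.hi + other.hi)

            def __sub__(self, other):
                return Interval(self.lo - other.hi, self.hi - other.lo)

            def __mul__(self, other):
                p = [self.lo * other.lo, self.lo * other.hi,
                     self.hi * other.lo, self.hi * other.hi]
                return Interval(min(p), max(p))

            def __truediv__(self, other):
                if other.lo <= 0.0 <= other.hi:
                    raise ZeroDivisionError("divisor interval contains zero")
                return self * Interval(1.0 / other.hi, 1.0 / other.lo)

        a, b = Interval(2.0, 3.0), Interval(1.0, 2.0)
        print(a + b, a - b, a * b, a / b)
        print(a - a)   # [-1, 1]: the dependency (overestimation) problem in naive interval analysis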

  16. Effect of soil property uncertainties on permafrost thaw projections: a calibration-constrained analysis

    NASA Astrophysics Data System (ADS)

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2016-02-01

    The effects of soil property uncertainties on permafrost thaw projections are studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The null-space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of predictive uncertainty (due to soil property (parametric) uncertainty) and the inter-annual climate variability due to year-to-year differences in CESM climate forcings. After calibrating to measured borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant predictive uncertainties in projected active layer thickness and annual thaw depth-duration even with a specified future climate. Inter-annual climate variability in projected soil moisture content and Stefan number is small. A volume- and time-integrated Stefan number decreases significantly, indicating a shift in subsurface energy utilization in the future climate (latent heat of phase change becomes more important than heat conduction). Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we quantify the relative magnitude of soil

  17. Use of Forward Sensitivity Analysis Method to Improve Code Scaling, Applicability, and Uncertainty (CSAU) Methodology

    SciTech Connect

    Haihua Zhao; Vincent A. Mousseau; Nam T. Dinh

    2010-10-01

    The Code Scaling, Applicability, and Uncertainty (CSAU) methodology was developed in the late 1980s by the US NRC to systematically quantify reactor simulation uncertainty. Based on the CSAU methodology, Best Estimate Plus Uncertainty (BEPU) methods have been developed and widely used for new reactor designs and for power uprates of existing LWRs. In spite of these successes, several aspects of CSAU have been criticized as needing further improvement: (1) subjective judgement in the PIRT process; (2) high cost due to heavy reliance on a large experimental database, many expert man-years of work, and very high computational overhead; (3) mixing of numerical errors with other uncertainties; (4) grid dependence and the use of the same numerical grids for both scaled experiments and real plant applications; and (5) user effects. Although a large amount of effort has been devoted to improving the CSAU methodology, the above issues still exist. With the effort to develop next-generation safety analysis codes, new opportunities appear to take advantage of new numerical methods, better physical models, and modern uncertainty quantification methods. Forward sensitivity analysis (FSA) directly solves the PDEs for parameter sensitivities (defined as the differential of the physical solution with respect to any constant parameter). When parameter sensitivities are available in a new advanced system analysis code, CSAU could be significantly improved: (1) quantifying numerical errors: new codes which are fully implicit and of higher-order accuracy can run much faster, with numerical errors quantified by FSA; (2) quantitative PIRT (Q-PIRT) to reduce subjective judgement and improve efficiency: treat numerical errors as special sensitivities alongside other physical uncertainties, and consider only parameters having large uncertainty effects on design criteria; (3) greatly reducing computational costs for uncertainty quantification by (a) choosing optimized time steps and spatial sizes; (b) using gradient information
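
    To make the forward sensitivity analysis idea concrete, the sketch below augments a toy ODE with its sensitivity equation and integrates both together. The system, parameter, and values are invented for illustration and are unrelated to any reactor analysis code.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Toy system: dy/dt = -p * y.  The forward sensitivity s = dy/dp obeys the
        # auxiliary equation ds/dt = (df/dy) * s + df/dp = -p * s - y, solved alongside y.
        def rhs(t, state, p):
            y, s = state
            return [-p * y, -p * s - y]

        p, y0 = 0.7, 10.0
        sol = solve_ivp(rhs, (0.0, 5.0), [y0, 0.0], args=(p,), dense_output=True)

        t = 5.0
        y_num, s_num = sol.sol(t)
        print(s_num, -t * y0 * np.exp(-p * t))  # numerical vs analytical sensitivity dy/dp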

  18. Testability requirement uncertainty analysis in the sensor selection and optimization model for PHM

    NASA Astrophysics Data System (ADS)

    Yang, S. M.; Qiu, J.; Liu, G. J.; Yang, P.; Zhang, Y.

    2012-05-01

    Prognostics and health management (PHM) has been an important part of guaranteeing the reliability and safety of complex systems. Design for testability (DFT), developed concurrently with system design, is considered a fundamental way to improve PHM performance, and sensor selection and optimization (SSO) is one of the important parts of DFT. To address the problem that testability requirement analysis in the existing SSO models does not take test uncertainty in actual scenarios into account, fault detection uncertainty is first analyzed qualitatively from the viewpoints of fault attributes, sensor attributes, and fault-sensor matching attributes. A quantitative uncertainty analysis is then given, which assigns a rational confidence level to fault size. A case is presented to demonstrate the proposed methodology for an electromechanical servo-controlled system, and application results show that the proposed approach is reasonable and feasible.

  19. Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.

    2013-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This proposal describes an approach to validate the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft. The research described here is absolutely cutting edge. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. The proposed research project includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around
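
    The three-grid verification step referenced above is commonly carried out with Richardson extrapolation and a grid convergence index (GCI); a minimal Python sketch follows, with hypothetical airflow-speed values and refinement ratio (the referenced methodology may differ in detail).

      import math

      # Same quantity (e.g., a local airflow speed in m/s) computed on three
      # systematically refined grids: f1 = fine, f2 = medium, f3 = coarse.
      f1, f2, f3 = 9.85, 9.95, 10.35   # hypothetical values
      r = 2.0                          # grid refinement ratio (assumed constant)
      Fs = 1.25                        # safety factor often used for three-grid studies

      # Observed order of accuracy from the three solutions.
      p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)

      # Richardson-extrapolated estimate and the fine-grid GCI (relative error band).
      f_extrap = f1 + (f1 - f2) / (r**p - 1.0)
      gci_fine = Fs * abs((f2 - f1) / f1) / (r**p - 1.0)

      print(f"observed order p = {p:.2f}")
      print(f"extrapolated value = {f_extrap:.3f} m/s")
      print(f"fine-grid GCI = {100.0 * gci_fine:.2f} %")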

  20. On the potential of uncertainty analysis for prediction of brake squeal propensity

    NASA Astrophysics Data System (ADS)

    Zhang, Zhi; Oberst, Sebastian; Lai, Joseph C. S.

    2016-09-01

    Brake squeal is a source of significant warranty-related claims for automotive manufacturers because it is annoying and is often perceived by customers as a safety concern. A brake squeal analysis is complex due to changing environmental and operating conditions, high sensitivity to manufacturing and assembly tolerances, as well as the not yet well understood role of nonlinearities. Although brake squeal is essentially a nonlinear problem, the standard analysis tool in industry is the linear complex eigenvalue analysis (CEA), which may under-predict or over-predict the number of unstable vibration modes. A nonlinear instability analysis is more predictive than CEA but is still computationally too expensive to be used routinely in industry for a full brake finite element model. Also, although the net work analysis of a linearised brake system has shown potential in predicting the origin of brake squeal, it has not been extensively used. In this study, the net work of an analytical viscously damped self-excited 4-dof friction oscillator with cubic contact force nonlinearity is compared with the instability prediction using the CEA and a nonlinear instability analysis. Results show that both the net work analysis and CEA under-predict the instability because of their inability to detect the sub-critical Hopf bifurcation. Uncertainty analysis is then applied to examine whether it can improve the instability prediction of a nonlinear system using linear methods, and to identify its limitations. By applying a variance-based global sensitivity analysis to parameters of the oscillator, suitable candidates for an uncertainty analysis are identified. Results of uncertainty analyses obtained by applying polynomial chaos expansions to net work and CEA correlate well with those of the nonlinear analysis, hence demonstrating the potential of an uncertainty analysis in improving the prediction of brake squeal propensity using a linear method.

  1. Coherent uncertainty analysis of aerosol measurements from multiple satellite sensors

    NASA Astrophysics Data System (ADS)

    Petrenko, M.; Ichoku, C.

    2013-02-01

    Aerosol retrievals from multiple spaceborne sensors, including MODIS (on Terra and Aqua), MISR, OMI, POLDER, CALIOP, and SeaWiFS - altogether, a total of 11 different aerosol products - were comparatively analyzed using data collocated with ground-based aerosol observations from the Aerosol Robotic Network (AERONET) stations within the Multi-sensor Aerosol Products Sampling System (MAPSS, http://giovanni.gsfc.nasa.gov/mapss/ and http://giovanni.gsfc.nasa.gov/aerostat/). The analysis was performed by comparing quality-screened satellite aerosol optical depth or thickness (AOD or AOT) retrievals during 2006-2010 to available collocated AERONET measurements globally, regionally, and seasonally, and deriving a number of statistical measures of accuracy. We used a robust statistical approach to detect and remove possible outliers in the collocated data that can bias the results of the analysis. Overall, the proportion of outliers in each of the quality-screened AOD products was within 12%. Squared correlation coefficient (R2) values of the satellite AOD retrievals relative to AERONET exceeded 0.6, with R2 for most of the products exceeding 0.7 over land and 0.8 over ocean. Root mean square error (RMSE) values for most of the AOD products were within 0.15 over land and 0.09 over ocean. We have been able to generate global maps showing regions where the different products present advantages over the others, as well as the relative performance of each product over different landcover types. It was observed that while MODIS, MISR, and SeaWiFS provide accurate retrievals over most of the landcover types, multi-angle capabilities make MISR the only sensor to retrieve reliable AOD over barren and snow/ice surfaces. Likewise, active sensing enables CALIOP to retrieve aerosol properties over bright-surface shrublands more accurately than the
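
    The accuracy statistics quoted above (R2 and RMSE after a robust outlier screen) can be reproduced for any pair of collocated series with a few lines of Python; the values below are hypothetical stand-ins for quality-screened satellite AOD and matching AERONET observations, and the simple median-absolute-deviation screen only illustrates the idea, it is not the MAPSS procedure.

      import numpy as np

      # Hypothetical collocated aerosol optical depth values.
      aod_sat = np.array([0.12, 0.25, 0.31, 0.08, 0.44, 0.19, 0.27])
      aod_aeronet = np.array([0.10, 0.22, 0.35, 0.09, 0.40, 0.21, 0.30])

      # Simple robust outlier screen on the differences (illustrative only).
      diff = aod_sat - aod_aeronet
      mad = np.median(np.abs(diff - np.median(diff)))
      keep = np.abs(diff - np.median(diff)) <= 3.0 * 1.4826 * mad

      sat, ref = aod_sat[keep], aod_aeronet[keep]
      rmse = np.sqrt(np.mean((sat - ref) ** 2))
      r2 = np.corrcoef(sat, ref)[0, 1] ** 2
      print(f"kept {keep.sum()} of {keep.size} points, RMSE = {rmse:.3f}, R2 = {r2:.3f}")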

  2. Coherent Uncertainty Analysis of Aerosol Measurements from Multiple Satellite Sensors

    NASA Technical Reports Server (NTRS)

    Petrenko, M.; Ichoku, C.

    2013-01-01

    Aerosol retrievals from multiple spaceborne sensors, including MODIS (on Terra and Aqua), MISR, OMI, POLDER, CALIOP, and SeaWiFS (altogether, a total of 11 different aerosol products) were comparatively analyzed using data collocated with ground-based aerosol observations from the Aerosol Robotic Network (AERONET) stations within the Multi-sensor Aerosol Products Sampling System (MAPSS, http://giovanni.gsfc.nasa.gov/mapss/ and http://giovanni.gsfc.nasa.gov/aerostat/). The analysis was performed by comparing quality-screened satellite aerosol optical depth or thickness (AOD or AOT) retrievals during 2006-2010 to available collocated AERONET measurements globally, regionally, and seasonally, and deriving a number of statistical measures of accuracy. We used a robust statistical approach to detect and remove possible outliers in the collocated data that can bias the results of the analysis. Overall, the proportion of outliers in each of the quality-screened AOD products was within 12%. Squared correlation coefficient (R2) values of the satellite AOD retrievals relative to AERONET exceeded 0.6, with R2 for most of the products exceeding 0.7 over land and 0.8 over ocean. Root mean square error (RMSE) values for most of the AOD products were within 0.15 over land and 0.09 over ocean. We have been able to generate global maps showing regions where the different products present advantages over the others, as well as the relative performance of each product over different landcover types. It was observed that while MODIS, MISR, and SeaWiFS provide accurate retrievals over most of the landcover types, multi-angle capabilities make MISR the only sensor to retrieve reliable AOD over barren and snow/ice surfaces. Likewise, active sensing enables CALIOP to retrieve aerosol properties over bright-surface shrublands more accurately than the other sensors, while POLDER, which is the only one of the sensors capable of measuring polarized aerosols, outperforms other sensors in

  3. Non-probabilistic interval analysis method for dynamic response analysis of nonlinear systems with uncertainty

    NASA Astrophysics Data System (ADS)

    Qiu, Zhiping; Ma, Lihong; Wang, Xiaojun

    2009-01-01

    Effects of uncertainties on the dynamic response of nonlinear vibration systems of general form are investigated. Based on interval mathematics, with the uncertain parameters modeled as interval numbers, a non-probabilistic interval analysis method is presented that estimates the range of the nonlinear dynamic response with the help of a Taylor series expansion, where the partial derivatives of the dynamic response with respect to the uncertain parameters are treated as interval numbers. The sensitivity matrices of the dynamic response with respect to the uncertain parameters are derived. For the presented method, only the bounds on the uncertain parameters are needed, instead of probability density functions or statistical quantities. Numerical examples are used to illustrate the validity and feasibility of the presented method.
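
    A hedged, minimal sketch of the first-order interval idea in Python is given below for a static scalar response (the paper itself treats dynamic responses of general nonlinear systems); the response function, interval midpoints, and radii are hypothetical. The response range is bounded by the nominal value plus or minus the sum of the absolute sensitivities times the parameter half-widths.

      import numpy as np

      # Response y(b1, b2) with two uncertain parameters known only as intervals.
      def response(b1, b2):
          return b1 * np.sin(b2) + 0.5 * b1 ** 2

      b1_c, b1_r = 2.0, 0.1    # interval midpoint and radius (half-width)
      b2_c, b2_r = 1.0, 0.05

      # First-order Taylor expansion about the interval midpoints; the partial
      # derivatives are approximated here with forward finite differences.
      eps = 1e-6
      y0 = response(b1_c, b2_c)
      dy_db1 = (response(b1_c + eps, b2_c) - y0) / eps
      dy_db2 = (response(b1_c, b2_c + eps) - y0) / eps

      radius = abs(dy_db1) * b1_r + abs(dy_db2) * b2_r
      print(f"response interval ~ [{y0 - radius:.3f}, {y0 + radius:.3f}]")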

  4. Uncertainty and sensitivity analysis of the retrieved essential climate variables from remotely sensed observations

    NASA Astrophysics Data System (ADS)

    Djepa, Vera; Badii, Atta

    2016-04-01

    The sensitivity of the weather and climate system to sea ice thickness (SIT), Sea Ice Draft (SID) and Snow Depth (SD) in the Arctic is recognized from various studies. A decrease in SIT will affect atmospheric circulation, temperature, precipitation and wind speed in the Arctic and beyond. Ice thermodynamics and dynamic properties depend strongly on sea Ice Density (ID) and SD. SIT, SID, ID and SD are sensitive to environmental changes in the Polar region and impact the climate system. Accurate forecasts of climate change, sea ice mass balance, ocean circulation and sea-atmosphere interactions require long-term records of SIT, SID, SD and ID, together with error and uncertainty analyses. The SID, SIT, ID and freeboard (F) have been retrieved from the Radar Altimeter (RA) (on board ENVISAT) and the IceBridge Laser Altimeter (LA) and validated using over 10 years of collocated observations of SID and SD in the Arctic, provided by the European Space Agency (ESA CCI sea ice ECV project). Improved algorithms to retrieve SIT from LA and RA have been derived, applying statistical analysis. The snow depth is obtained from AMSR-E/Aqua and the NASA IceBridge Snow Depth radar. The sea ice properties of pancake ice have been retrieved from ENVISAT/Synthetic Aperture Radar (ASAR). The uncertainties of the retrieved climate variables have been analysed and the impact of snow depth and sea ice density on retrieved SIT has been estimated. The sensitivity analysis illustrates the impact of uncertainties of the input climate variables (ID and SD) on the accuracy of the retrieved output variables (SIT and SID). The developed methodology of uncertainty and sensitivity analysis is essential for assessment of the impact of environmental variables on climate change and better understanding of the relationship between input and output variables. The uncertainty analysis quantifies the uncertainties of the model results and the sensitivity analysis evaluates the contribution of each input variable to

  5. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    NASA Astrophysics Data System (ADS)

    Pastore, Giovanni; Swiler, L. P.; Hales, J. D.; Novascone, S. R.; Perez, D. M.; Spencer, B. W.; Luzzi, L.; Van Uffelen, P.; Williamson, R. L.

    2015-01-01

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code with a recently implemented physics-based model for fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information in the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior predictions with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, significantly higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  6. Dynamic analysis of parametrically excited system under uncertainties and multi-frequency excitations

    NASA Astrophysics Data System (ADS)

    Wei, Sha; Han, Qinkai; Peng, Zhike; Chu, Fulei

    2016-05-01

    Some system parameters in mechanical systems are always uncertain due to uncertainties in geometric and material properties, lubrication condition and wear. For a more reasonable estimation of dynamic analysis of the parametrically excited system, the effect of uncertain parameters should be taken into account. This paper presents a new non-probabilistic analysis method for solving the dynamic responses of parametrically excited systems under uncertainties and multi-frequency excitations. By using the multi-dimensional harmonic balance method (MHBM) and the Chebyshev inclusion function (CIF), an interval multi-dimensional harmonic balance method (IMHBM) is obtained. To illustrate the accuracy of the proposed method, a time-varying geared system of wind turbine with different kinds of uncertainties is demonstrated. By comparing with the results of the scanning method, it is shown that the presented method is valid and effective for the parametrically excited system with uncertainties and multi-frequency excitations. The effects of some uncertain system parameters including uncertain mesh stiffnesses and uncertain bearing stiffnesses on the frequency responses of the system are also discussed in detail. It is shown that the dynamic responses of the system are insensitive to the uncertain mesh stiffness and bearing stiffnesses of the planetary gear stage. The uncertain bearing stiffnesses of the intermediate and high-speed stages will lead to relatively large uncertainties in the dynamic responses around resonant regions. It will provide valuable guidance for the optimal design and condition monitoring of wind turbine gearboxes.

  7. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    SciTech Connect

    Pastore, Giovanni; Swiler, L. P.; Hales, Jason D.; Novascone, Stephen R.; Perez, Danielle M.; Spencer, Benjamin W.; Luzzi, Lelio; Uffelen, Paul Van; Williamson, Richard L.

    2014-10-12

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for the coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  8. Linking trading ratio with TMDL (total maximum daily load) allocation matrix and uncertainty analysis.

    PubMed

    Zhang, H X

    2008-01-01

    An innovative approach for total maximum daily load (TMDL) allocation and implementation is watershed-based pollutant trading. Given the inherent scientific uncertainty in the tradeoffs between point and nonpoint sources, the setting of trading ratios can be a contentious issue and was already listed as an obstacle by several pollutant trading programs. One of the fundamental reasons that a trading ratio is often set higher (e.g. greater than 2) is to allow for uncertainty in the level of control needed to attain water quality standards, and to provide a buffer in case traded reductions are less effective than expected. However, most of the available studies did not provide an approach to explicitly address the determination of the trading ratio, and uncertainty analysis has rarely been linked to trading ratio determination. This paper presents a practical methodology for estimating an "equivalent trading ratio" (ETR) and links uncertainty analysis with trading ratio determination in the TMDL allocation process. Determination of the ETR can provide a preliminary evaluation of the "tradeoffs" between various combinations of point and nonpoint source control strategies on ambient water quality improvement. A greater portion of NPS load reduction in the overall TMDL load reduction generally correlates with greater uncertainty and thus requires a greater trading ratio. The rigorous quantification of the trading ratio will enhance the scientific basis, and thus the public perception, needed for more informed decisions in an overall watershed-based pollutant trading program. PMID:18653943

  9. A Methodology For Performing Global Uncertainty And Sensitivity Analysis In Systems Biology

    PubMed Central

    Marino, Simeone; Hogue, Ian B.; Ray, Christian J.; Kirschner, Denise E.

    2008-01-01

    Accuracy of results from mathematical and computer models of biological systems is often complicated by the presence of uncertainties in experimental data that are used to estimate parameter values. Current mathematical modeling approaches typically use either single-parameter or local sensitivity analyses. However, these methods do not accurately assess uncertainty and sensitivity in the system as, by default, they hold all other parameters fixed at baseline values. Using the techniques described within, we demonstrate how a multi-dimensional parameter space can be studied globally so all uncertainties can be identified. Further, uncertainty and sensitivity analysis techniques can help to identify and ultimately control uncertainties. In this work we develop methods for applying existing analytical tools to perform analyses on a variety of mathematical and computer models. We compare two specific types of global sensitivity analysis indexes that have proven to be among the most robust and efficient. Through familiar and new examples of mathematical and computer models, we provide a complete methodology for performing these analyses, both in deterministic and stochastic settings, and propose novel techniques to handle problems encountered during this type of analysis. PMID:18572196
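
    One widely used pairing for this kind of global analysis is Latin hypercube sampling with partial rank correlation coefficients (PRCC); the Python sketch below is generic rather than taken from the article, and its toy model, parameter ranges, and sample size are assumptions.

      import numpy as np
      from scipy.stats import qmc, rankdata

      # Toy model standing in for a biological simulation output.
      def model(x):
          k1, k2, k3 = x.T
          return k1 / (1.0 + k2) + 0.1 * k3

      # Latin hypercube sample over hypothetical parameter ranges.
      unit = qmc.LatinHypercube(d=3, seed=1).random(n=500)
      X = qmc.scale(unit, [0.1, 0.5, 0.0], [2.0, 5.0, 1.0])
      y = model(X)

      def prcc(X, y):
          # Rank-transform, then correlate the residuals of each parameter and of
          # the output after regressing out all the other parameters.
          R = np.column_stack([rankdata(col) for col in X.T])
          ry = rankdata(y)
          coeffs = []
          for i in range(R.shape[1]):
              others = np.column_stack([np.ones(ry.size), np.delete(R, i, axis=1)])
              rx_res = R[:, i] - others @ np.linalg.lstsq(others, R[:, i], rcond=None)[0]
              ry_res = ry - others @ np.linalg.lstsq(others, ry, rcond=None)[0]
              coeffs.append(np.corrcoef(rx_res, ry_res)[0, 1])
          return np.array(coeffs)

      print(prcc(X, y))   # one coefficient per parameter, each in [-1, 1]

    Monotonic, strongly influential parameters show PRCC magnitudes near 1, while values near 0 indicate parameters whose uncertainty barely affects the output.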

  10. Uncertainty analysis of primary water pollutant control in China's pulp and paper industry.

    PubMed

    Wen, Zong-guo; Di, Jing-han; Zhang, Xue-ying

    2016-03-15

    The total emission control target for water pollutants (e.g., COD and NH4-N) in a given industrial sector can be predicted and analysed using popular technology-based bottom-up modelling. However, this methodology carries obvious uncertainty regarding the attainment of mitigation targets. The primary uncertainties come from macro-level production, the pollutant reduction roadmap, and technical parameters. This research takes the pulp and paper industry in China as an example and builds 5 mitigation scenarios via different combinations of raw material structure, scale structure, procedure mitigation technology, and end-of-pipe treatment technology. Using the methodology of uncertainty analysis via Monte Carlo, random sampling was conducted over a hundred thousand times. Sensitive parameters that impact the total emission control targets, such as industrial output, technique structure, cleaner production technology, and end-of-pipe treatment technology, are discussed in this article. It appears that scenario uncertainty has a larger influence on COD emissions than on NH4-N; hence, a looser total emission control target is recommended for COD to increase its feasibility and attainability, while the status quo is maintained for NH4-N. Consequently, through uncertainty analysis, this research identifies the sensitive products, techniques, and technologies affecting industrial water pollution. PMID:26722715

  11. Quasi-Stochastic Analysis of Uncertainty for Modelling Structurally Controlled Failures

    NASA Astrophysics Data System (ADS)

    Elmouttie, M. K.; Poropat, G. V.

    2014-03-01

    In one approach to predicting the behaviour of rock masses, effort is being devoted to the use of probabilistic methods to model structures interior to a rock mass (sometimes referred to as `inferred' or `stochastic' structures). The physical properties of these structures (e.g. position, orientation, size) are modelled as random parameters, the statistical properties of which are derived from the measurements of a sample of the population (sometimes referred to as `deterministic' structures). Relatively little attention has been devoted to the uncertainty associated with the deterministic structures. Typical geotechnical analyses rely on either an entirely stochastic analysis, or deterministic analyses representing the structures with a fixed shape (i.e. disc), position, size, and orientation. The simplifications assumed for this model introduce both epistemic and stochastic uncertainties. In this paper, it is shown that these uncertainties should be quantified and propagated to the predictions of behaviour derived from subsequent analyses. We demonstrate a methodology which we have termed quasi-stochastic analysis to perform this propagation. It is shown that relatively small levels of uncertainty can have large influence on the uncertainties associated with geotechnical analyses, such as predictions of block size and block stability, and therefore this methodology can provide the practitioner with a method for better interpretation of these results.

  12. Damage functions for climate-related hazards: unification and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Prahl, Boris F.; Rybski, Diego; Boettle, Markus; Kropp, Jürgen P.

    2016-05-01

    Most climate change impacts manifest in the form of natural hazards. Damage assessment typically relies on damage functions that translate the magnitude of extreme events to a quantifiable damage. In practice, the availability of damage functions is limited due to a lack of data sources and a lack of understanding of damage processes. The study of the characteristics of damage functions for different hazards could strengthen the theoretical foundation of damage functions and support their development and validation. Accordingly, we investigate analogies of damage functions for coastal flooding and for wind storms and identify a unified approach. This approach has general applicability for granular portfolios and may also be applied, for example, to heat-related mortality. Moreover, the unification enables the transfer of methodology between hazards and a consistent treatment of uncertainty. This is demonstrated by a sensitivity analysis on the basis of two simple case studies (for coastal flood and storm damage). The analysis reveals the relevance of the various uncertainty sources at varying hazard magnitude and on both the microscale and the macroscale level. Main findings are the dominance of uncertainty from the hazard magnitude and the persistent behaviour of intrinsic uncertainties on both scale levels. Our results shed light on the general role of uncertainties and provide useful insight for the application of the unified approach.

  13. Characterization and Uncertainty Analysis of a Reference Pressure Measurement System for Wind Tunnels

    NASA Technical Reports Server (NTRS)

    Amer, Tahani; Tripp, John; Tcheng, Ping; Burkett, Cecil; Sealey, Bradley

    2004-01-01

    This paper presents the calibration results and uncertainty analysis of a high-precision reference pressure measurement system currently used in wind tunnels at the NASA Langley Research Center (LaRC). Sensors, calibration standards, and measurement instruments are subject to errors due to aging, drift with time, environmental effects, transportation, the mathematical model, the calibration experimental design, and other factors. Errors occur at every link in the chain of measurements and data reduction, from the sensor to the final computed results. At each link of the chain, bias and precision uncertainties must be separately estimated for facility use, and they are combined to produce overall calibration and prediction confidence intervals for the instrument, typically at a 95% confidence level. The uncertainty analysis and calibration experimental designs used herein, based on techniques developed at LaRC, employ replicated experimental designs for efficiency, separate estimation of bias and precision uncertainties, and detection of significant parameter drift with time. Final results are presented, including calibration confidence intervals and prediction intervals given as functions of the applied inputs rather than as a fixed percentage of the full-scale value. System uncertainties are propagated beginning with the initial reference pressure standard to the calibrated instrument as a working standard in the facility. Among the several parameters that can affect the overall results are operating temperature, atmospheric pressure, humidity, and facility vibration. Effects of factors such as initial zeroing and temperature are investigated. The effects of the identified parameters on system performance and accuracy are discussed.
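
    Separately estimated bias and precision contributions are typically combined into a 95% uncertainty as U = sqrt(B^2 + (t*S/sqrt(N))^2); the short Python sketch below uses hypothetical repeated readings and an assumed bias limit, not LaRC calibration data.

      import numpy as np
      from scipy import stats

      # Hypothetical repeated readings of a reference pressure (kPa) and an assumed
      # systematic (bias) uncertainty for the standard.
      readings = np.array([101.32, 101.35, 101.30, 101.33, 101.34, 101.31])
      bias_limit = 0.05

      n = readings.size
      mean = readings.mean()
      s_mean = readings.std(ddof=1) / np.sqrt(n)       # precision of the mean
      t95 = stats.t.ppf(0.975, df=n - 1)               # two-sided 95% coverage factor

      u95 = np.sqrt(bias_limit ** 2 + (t95 * s_mean) ** 2)
      print(f"{mean:.3f} +/- {u95:.3f} kPa (95% confidence)")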

  14. Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate

    NASA Technical Reports Server (NTRS)

    Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.

    2013-01-01

    There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With pressed budgets, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis only from computational data. The method uses current CFD uncertainty techniques coupled with the Student-t distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied by a specified tolerance or bias error, and the differences in the results are used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations and conclusions drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data are unavailable.
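
    A hedged sketch of the final comparison step in Python: a small set of CFD-derived heat transfer coefficients (hypothetical values standing in for the perturbed-input runs) is summarized with a Student-t confidence interval and compared against a textbook turbulent flat-plate correlation, Nu = 0.037 Re^0.8 Pr^(1/3), with assumed air properties and plate length.

      import numpy as np
      from scipy import stats

      # Hypothetical CFD heat transfer coefficients (W/m^2-K) from runs with inputs
      # perturbed within their tolerances.
      h_cfd = np.array([108.2, 111.0, 106.5, 109.3, 107.8, 110.4])

      n = h_cfd.size
      mean = h_cfd.mean()
      half_width = stats.t.ppf(0.975, n - 1) * h_cfd.std(ddof=1) / np.sqrt(n)

      # Textbook correlation for turbulent flow over a flat plate (assumed Re, Pr,
      # air conductivity k and plate length L).
      Re, Pr, k_air, L = 1.0e6, 0.71, 0.026, 0.5
      h_corr = 0.037 * Re ** 0.8 * Pr ** (1.0 / 3.0) * k_air / L

      print(f"CFD: {mean:.1f} +/- {half_width:.1f} W/m^2-K (95%); correlation: {h_corr:.1f}")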

  15. Uncertainty Determination Methodology, Sampling Maps Generation and Trend Studies with Biomass Thermogravimetric Analysis

    PubMed Central

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    This paper investigates a method for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG analysis) for several lignocellulosic materials (ground olive stone, almond shell, pine pellets and oak pellets), completing previous work of the same authors. A comparison has been made between results of TG analysis and prompt analysis. Levels of uncertainty and errors were obtained, demonstrating that properties evaluated by TG analysis were representative of the overall fuel composition, and no correlation between prompt and TG analysis exists. Additionally, a study of trends and time correlations is indicated. These results are particularly interesting for biomass energy applications. PMID:21152292

  16. A single-loop optimization method for reliability analysis with second order uncertainty

    NASA Astrophysics Data System (ADS)

    Xie, Shaojun; Pan, Baisong; Du, Xiaoping

    2015-08-01

    Reliability analysis may involve random variables and interval variables. In addition, some of the random variables may have interval distribution parameters owing to limited information. This kind of uncertainty is called second order uncertainty. This article develops an efficient reliability method for problems involving the three aforementioned types of uncertain input variables. The analysis produces the maximum and minimum reliability and is computationally demanding because two loops are needed: a reliability analysis loop with respect to random variables and an interval analysis loop for extreme responses with respect to interval variables. The first order reliability method and nonlinear optimization are used for the two loops, respectively. For computational efficiency, the two loops are combined into a single loop by treating the Karush-Kuhn-Tucker (KKT) optimal conditions of the interval analysis as constraints. Three examples are presented to demonstrate the proposed method.
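
    The reliability-analysis loop mentioned above, the first order reliability method (FORM), can be sketched compactly in Python; the limit state below is a generic linear resistance-minus-load example with assumed means and standard deviations, and the article's interval outer loop and KKT-based single-loop treatment are not reproduced.

      import numpy as np
      from scipy.stats import norm

      # Limit state g = R - S in standard normal space, with assumed R ~ N(5, 0.8)
      # and S ~ N(3, 0.6); failure corresponds to g < 0.
      def g(u):
          return (5.0 + 0.8 * u[0]) - (3.0 + 0.6 * u[1])

      def grad_g(u, eps=1e-6):
          g0 = g(u)
          return np.array([(g(u + eps * e) - g0) / eps for e in np.eye(u.size)])

      # Hasofer-Lind / Rackwitz-Fiessler iteration for the most probable failure point.
      u = np.zeros(2)
      for _ in range(50):
          gr = grad_g(u)
          u_new = gr * (gr @ u - g(u)) / (gr @ gr)
          if np.linalg.norm(u_new - u) < 1e-8:
              u = u_new
              break
          u = u_new

      beta = np.linalg.norm(u)            # reliability index
      print(f"beta = {beta:.3f}, Pf = {norm.cdf(-beta):.4f}")

    For this linear limit state the result matches the closed form beta = (5 - 3)/sqrt(0.8^2 + 0.6^2) = 2; with interval distribution parameters, an outer loop would then search for the parameter values giving the maximum and minimum reliability, which is the step the article collapses into a single loop via the KKT conditions.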

  17. The Model Optimization, Uncertainty, and SEnsitivity analysis (MOUSE) toolbox: overview and application

    Technology Transfer Automated Retrieval System (TEKTRAN)

    For several decades, optimization and sensitivity/uncertainty analysis of environmental models has been the subject of extensive research. Although much progress has been made and sophisticated methods developed, the growing complexity of environmental models to represent real-world systems makes it...

  18. A PROBABILISTIC APPROACH FOR ANALYSIS OF UNCERTAINTY IN THE EVALUATION OF WATERSHED MANAGEMENT PRACTICES

    EPA Science Inventory

    A computational framework is presented for analyzing the uncertainty in model estimates of water quality benefits of best management practices (BMPs) in two small (<10 km2) watersheds in Indiana. The analysis specifically recognizes the significance of the difference b...

  19. Bayesian Uncertainty Analysis of PBPK Model Predictions for Permethrin in Rats

    EPA Science Inventory

    Uncertainty analysis of human physiologically-based pharmacokinetic (PBPK) model predictions can pose a significant challenge due to data limitations. As a result of these limitations, human models are often derived from extrapolated animal PBPK models, for which there is usuall...

  20. Cross Section Sensitivity and Uncertainty Analysis Including Secondary Neutron Energy and Angular Distributions.

    Energy Science and Technology Software Center (ESTSC)

    1991-03-12

    Version 00 SUSD calculates sensitivity coefficients for one- and two-dimensional transport problems. Variance and standard deviation of detector responses or design parameters can be obtained using cross-section covariance matrices. In neutron transport problems, this code can perform sensitivity-uncertainty analysis for secondary angular distribution (SAD) or secondary energy distribution (SED).

  1. Uncertainty analysis of an irrigation scheduling model for water management in crop production

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Irrigation scheduling tools are critical to allow producers to manage water resources for crop production in an accurate and timely manner. To be useful, these tools need to be accurate, complete, and relatively reliable. The current work presents the uncertainty analysis and its results for the Mis...

  2. FORMAL UNCERTAINTY ANALYSIS OF A LAGRANGIAN PHOTOCHEMICAL AIR POLLUTION MODEL. (R824792)

    EPA Science Inventory

    This study applied Monte Carlo analysis with Latin hypercube sampling to evaluate the effects of uncertainty in air parcel trajectory paths, emissions, rate constants, deposition affinities, mixing heights, and atmospheric stability on predictions from a vertically...

  3. Preliminary uncertainty and sensitivity analysis for basic transport parameters at the Horonobe Site, Hokkaido, Japan.

    SciTech Connect

    James, Scott Carlton; Zimmerman, Dean Anthony

    2003-10-01

    Incorporating results from a previously developed finite element model, an uncertainty and parameter sensitivity analysis was conducted using preliminary site-specific data from Horonobe, Japan (data available from five boreholes as of 2003). Latin Hypercube Sampling was used to draw random parameter values from the site-specific measured, or approximated, physicochemical uncertainty distributions. Using pathlengths and groundwater velocities extracted from the three-dimensional, finite element flow and particle tracking model, breakthrough curves for multiple realizations were calculated with the semi-analytical, one-dimensional, multirate transport code, STAMMT-L. A stepwise linear regression analysis using the 5, 50, and 95% breakthrough times as the dependent variables and LHS sampled site physicochemical parameters as the independent variables was used to perform a sensitivity analysis. Results indicate that the distribution coefficients and hydraulic conductivities are the parameters responsible for most of the variation among simulated breakthrough times. This suggests that researchers and data collectors at the Horonobe site should focus on accurately assessing these parameters and quantifying their uncertainty. Because the Horonobe Underground Research Laboratory is in an early phase of its development, this work should be considered as a first step toward an integration of uncertainty and sensitivity analyses with decision analysis.
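
    The combination of Latin hypercube sampling with a regression-based sensitivity measure can be sketched as follows; the breakthrough-time function, parameter ranges, and sample size are hypothetical stand-ins for the STAMMT-L simulations and the Horonobe data, and a simple forward-selection regression replaces the study's full stepwise procedure.

      import numpy as np
      from scipy.stats import qmc

      # Hypothetical surrogate for the transport model: median breakthrough time as a
      # function of hydraulic conductivity K, distribution coefficient Kd, porosity n
      # and dispersivity a (ranges below are assumptions, not site values).
      def breakthrough_time(K, Kd, n, a):
          return 1.0e3 * n * (1.0 + 5.0 * Kd) / K + 0.1 * a

      sample = qmc.LatinHypercube(d=4, seed=2).random(400)
      K  = 10.0 ** qmc.scale(sample[:, [0]], [-7.0], [-5.0]).ravel()   # m/s, log-uniform
      Kd = qmc.scale(sample[:, [1]], [0.0], [2.0]).ravel()             # mL/g
      n  = qmc.scale(sample[:, [2]], [0.2], [0.5]).ravel()
      a  = qmc.scale(sample[:, [3]], [1.0], [50.0]).ravel()            # m
      t50 = breakthrough_time(K, Kd, n, a)

      # Forward stepwise regression on standardized variables: at each step add the
      # predictor that most increases R^2.
      X = np.column_stack([K, Kd, n, a])
      X = (X - X.mean(axis=0)) / X.std(axis=0)
      y = (t50 - t50.mean()) / t50.std()
      names, chosen, remaining = ["K", "Kd", "n", "a"], [], [0, 1, 2, 3]
      while remaining:
          scores = []
          for j in remaining:
              A = np.column_stack([X[:, chosen + [j]], np.ones(y.size)])
              resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
              scores.append(1.0 - resid.var() / y.var())
          best = remaining[int(np.argmax(scores))]
          chosen.append(best)
          remaining.remove(best)
          print(f"added {names[best]}: cumulative R^2 = {max(scores):.3f}")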

  4. Overview and application of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) toolbox

    Technology Transfer Automated Retrieval System (TEKTRAN)

    For several decades, optimization and sensitivity/uncertainty analysis of environmental models has been the subject of extensive research. Although much progress has been made and sophisticated methods developed, the growing complexity of environmental models to represent real-world systems makes it...

  5. Calibration of Uncertainty Analysis of the SWAT Model Using Genetic Algorithms and Bayesian Model Averaging

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this paper, the Genetic Algorithms (GA) and Bayesian model averaging (BMA) were combined to simultaneously conduct calibration and uncertainty analysis for the Soil and Water Assessment Tool (SWAT). In this hybrid method, several SWAT models with different structures are first selected; next GA i...

  6. Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin

    EPA Science Inventory

    We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...

  7. Comparing uncertainty analysis techniques for a SWAT application to the Chaohe Basin in China

    NASA Astrophysics Data System (ADS)

    Yang, Jing; Reichert, Peter; Abbaspour, K. C.; Xia, Jun; Yang, Hong

    2008-08-01

    Distributed watershed models are increasingly being used to support decisions about alternative management strategies in the areas of land use change, climate change, water allocation, and pollution control. For this reason it is important that these models pass through a careful calibration and uncertainty analysis. To fulfil this demand, in recent years, scientists have come up with various uncertainty analysis techniques for watershed models. To determine the differences and similarities of these techniques we compared five uncertainty analysis procedures: Generalized Likelihood Uncertainty Estimation (GLUE), Parameter Solution (ParaSol), Sequential Uncertainty Fitting algorithm (SUFI-2), and a Bayesian framework implemented using Markov chain Monte Carlo (MCMC) and Importance Sampling (IS) techniques. As these techniques are different in their philosophies and leave the user some freedom in formulating the generalized likelihood measure, objective function, or likelihood function, a literal comparison between these techniques is not possible. As there is a small spectrum of different applications in hydrology for the first three techniques, we made this choice according to their typical use in hydrology. For Bayesian inference, we used a recently developed likelihood function that does not obviously violate the statistical assumptions, namely a continuous-time autoregressive error model. We implemented all these techniques for the Soil and Water Assessment Tool (SWAT) and applied them to the Chaohe Basin in China. We compared the results with respect to the posterior parameter distributions, performances of their best estimates, prediction uncertainty, conceptual bases, computational efficiency, and difficulty of implementation. The comparison results for these categories are listed and the advantages and disadvantages are analyzed. From the point of view of the authors, if computationally feasible, Bayesian-based approaches are most recommendable
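
    Of the compared procedures, GLUE is the simplest to sketch: sample parameter sets, score each with an informal likelihood (the Nash-Sutcliffe efficiency is used here purely for illustration), keep the "behavioral" sets above a subjective threshold, and form likelihood-weighted prediction bounds. The toy model, synthetic observations, prior ranges, and threshold below are all assumptions, not the SWAT/Chaohe setup.

      import numpy as np

      rng = np.random.default_rng(3)

      # Toy model and synthetic "observations" (illustration only).
      t = np.linspace(0.0, 10.0, 30)
      def model(a, b):
          return a * np.exp(-b * t)
      obs = model(2.0, 0.35) + rng.normal(0.0, 0.05, t.size)

      # 1. Sample parameters from assumed prior ranges.
      a = rng.uniform(0.5, 4.0, 5000)
      b = rng.uniform(0.05, 1.0, 5000)
      sims = np.array([model(ai, bi) for ai, bi in zip(a, b)])

      # 2. Informal likelihood: Nash-Sutcliffe efficiency of each simulation.
      nse = 1.0 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)

      # 3. Keep behavioral sets (the threshold is a subjective GLUE choice).
      behavioral = nse > 0.7
      w = nse[behavioral] / nse[behavioral].sum()
      sb = sims[behavioral]

      # 4. Likelihood-weighted 5-95% prediction bounds at each time step.
      lower, upper = [], []
      for j in range(t.size):
          idx = np.argsort(sb[:, j])
          cw = np.cumsum(w[idx])
          lower.append(np.interp(0.05, cw, sb[idx, j]))
          upper.append(np.interp(0.95, cw, sb[idx, j]))
      print(f"{behavioral.sum()} behavioral sets; bounds at t=0: "
            f"[{lower[0]:.2f}, {upper[0]:.2f}]")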

  8. Uncertainty analysis of the Measured Performance Rating (MPR) method. Final report

    SciTech Connect

    Not Available

    1993-11-01

    A report was commissioned by the New York State Energy Research and Development Authority and the Electric Power Research Institute to evaluate the uncertainties in the energy monitoring method known as measured performance rating (MPR). The work is intended to help further development of the MPR system by quantitatively analyzing the uncertainties in estimates of the heat loss coefficients and heating system efficiencies. The analysis indicates that the MPR should be able to detect as little as a 7 percent change in the heat loss coefficient and heating system efficiency at the 95 percent confidence level. MPR appears sufficiently robust for characterizing common weatherization treatments; e.g., increasing attic insulation from R-7 to R-19 in a typical single-story, 1,100 sq. ft. house, resulting in a 19 percent reduction in heat loss coefficient. Furnace efficiency uncertainties ranged up to three times those of the heat loss coefficients. Measurement uncertainties (at the 95 percent confidence level) were estimated to be from 1 to 5 percent for heat loss coefficients and 1.5 percent for a typical furnace efficiency. The analysis also shows a limitation in applying MPR to houses with heating ducts in slabs on grade and to those with very large thermal mass. Most of the uncertainties encountered in the study were due more to the methods of estimating the "true" heat loss coefficients, furnace efficiency, and furnace fuel consumption (by collecting fuel bills and simulating two actual houses) than to the MPR approach. These uncertainties in the true parameter values become evidence for arguments in favor of the need for empirical measures of the heat loss coefficient and furnace efficiency, like the MPR method, rather than arguments against.

  9. Flood damage maps: ranking sources of uncertainty with variance-based sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Saint-Geours, N.; Grelot, F.; Bailly, J.-S.; Lavergne, C.

    2012-04-01

    In order to increase the reliability of flood damage assessment, we need to question the uncertainty associated with the whole flood risk modeling chain. Using a case study on the basin of the Orb River, France, we demonstrate how variance-based sensitivity analysis can be used to quantify uncertainty in flood damage maps at different spatial scales and to identify the sources of uncertainty which should be reduced first. Flood risk mapping is recognized as an effective tool in flood risk management and the elaboration of flood risk maps is now required for all major river basins in the European Union (European directive 2007/60/EC). Flood risk maps can be based on the computation of the Mean Annual Damages indicator (MAD). In this approach, potential damages due to different flood events are estimated for each individual stake over the study area, then averaged over time - using the return period of each flood event - and finally mapped. The issue of uncertainty associated with these flood damage maps should be carefully scrutinized, as they are used to inform the relevant stakeholders or to design flood mitigation measures. Maps of the MAD indicator are based on the combination of hydrological, hydraulic, geographic and economic modeling efforts: as a result, numerous sources of uncertainty arise in their elaboration. Many recent studies describe these various sources of uncertainty (Koivumäki 2010, Bales 2009). Some authors propagate these uncertainties through the flood risk modeling chain and estimate confidence bounds around the resulting flood damage estimates (de Moel 2010). It would now be of great interest to go a step further and to identify which sources of uncertainty account for most of the variability in Mean Annual Damages estimates. We demonstrate the use of variance-based sensitivity analysis to rank sources of uncertainty in flood damage mapping and to quantify their influence on the accuracy of flood damage estimates. We use a quasi
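
    Variance-based ranking of this kind is often computed with the Saltelli "pick-freeze" estimator of first-order Sobol indices; in the Python sketch below the standard Ishigami test function stands in for the damage model, so the function, indices, and sample size are purely illustrative.

      import numpy as np

      rng = np.random.default_rng(4)

      # Ishigami benchmark function, standing in for the damage model.
      def f(x):
          return (np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2
                  + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0]))

      n, k = 20000, 3
      A = rng.uniform(-np.pi, np.pi, (n, k))
      B = rng.uniform(-np.pi, np.pi, (n, k))
      yA, yB = f(A), f(B)
      var_y = np.var(np.concatenate([yA, yB]))

      # First-order index S_i: replace column i of A with the values from B and
      # measure how much of the output variance that single input explains.
      for i in range(k):
          ABi = A.copy()
          ABi[:, i] = B[:, i]
          S_i = np.mean(yB * (f(ABi) - yA)) / var_y
          print(f"S_{i + 1} ~ {S_i:.2f}")

    Ranking the resulting indices is what identifies the uncertainty source to reduce first, which is the role assigned above to the variance-based analysis of the flood damage modeling chain.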

  10. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    NASA Astrophysics Data System (ADS)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used in place of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.

  11. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    NASA Astrophysics Data System (ADS)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-03-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used in place of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: 1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; 2) numerical computation of tsunami generation and propagation up to a given offshore isobath; 3) (optional) site-specific quantification of inundation; 4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1), the event tree, specifically for SPTHA, focussing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that: i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.

  12. Uncertainty Analysis of Flash Flood Guidance: Topographic Data and Model Parameter Errors

    NASA Astrophysics Data System (ADS)

    Georgakakos, K. P.; Ntelekos, A. A.; Krajewski, W. F.

    2004-05-01

    Flash Flood Guidance (FFG) is the volume of rainfall required to generate bankfull flows at the outlet of a basin over a specified time interval, given the initial soil moisture conditions. Operationally, the soil moisture conditions are generated every 6 hours by executing the Sacramento Soil Moisture Accounting (SAC-SMA) model at the River Forecast Centers (RFCs). This guidance is used with actual radar rainfall data over the basin to assist with the production of flash flood warnings. The backbone of the FFG system is the Threshold Runoff (Thresh-R), the calculation of which is done offline as a one-time task. Thresh-R is the volume of effective rainfall of a given duration needed to cause bankfull flows at the basin outlet. In this study, bankfull conditions from uniform steady flow and the Geomorphologic Unit Hydrograph theory are used for the calculation of Thresh-R for a basin located on the Illinois River in Oklahoma. The uncertainty related to the GIS and channel data used in the calculation of Thresh-R is introduced, and an ensemble of threshold runoff values is produced. Then, the FFG is modeled with the use of a time-continuous approximation of the upper zone of the SAC-SMA hydrologic model and quadratic function approximations. The Thresh-R ensemble is fed into the FFG model to study the uncertainty in the FFG values arising from the uncertainty in the GIS and channel data that propagates into the threshold runoff. The numerical experiments are then repeated with additional uncertainty in the key parameters of the analytical Sacramento model solution, to study the synergistic effect of both uncertainties. The results of the analysis are presented, and the parameters that most affect the FFG uncertainty are identified. The need to transform the currently deterministic operational FFG system into a probabilistic or ensemble one is also discussed.

  13. Dissertation Defense: Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    NASA Technical Reports Server (NTRS)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics predictions

  14. Dissertation Defense Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    NASA Technical Reports Server (NTRS)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics

  15. How uncertainty analysis in ecological risk assessment is used in the courtroom

    SciTech Connect

    Hacker, C.; Watson, J.

    1995-12-31

    The prevalence of uncertainty analysis in environmental decision-making is increasing. Specific methods for estimating and expressing uncertainty are available and continually being improved. Although these methods are intended to provide a measure of the suitability of the data upon which a decision is based, their application in litigation may result in outcomes that are unanticipated by some in the scientific community. This divergence between those estimating uncertainty in assessing ecological risk and those judging its application can be attributed in part to the different ways evidence is used in science and law. This presentation will explain how scientific evidence is used in the courtroom. This explanation will use examples from case law to describe how courts decide who can be qualified to present evidence, what evidence can be presented, and how this evidence will be used in reaching a decision.

  16. Use of SUSA in Uncertainty and Sensitivity Analysis for INL VHTR Coupled Codes

    SciTech Connect

    Gerhard Strydom

    2010-06-01

    The need for a defendable and systematic Uncertainty and Sensitivity approach that conforms to the Code Scaling, Applicability, and Uncertainty (CSAU) process, and that could be used for a wide variety of software codes, was defined in 2008. The GRS (Gesellschaft für Anlagen und Reaktorsicherheit) company of Germany has developed one type of CSAU approach that is particularly well suited for legacy coupled core analysis codes, and a trial version of their commercial software product SUSA (Software for Uncertainty and Sensitivity Analyses) was acquired on May 12, 2010. This interim milestone report provides an overview of the current status of the implementation and testing of SUSA at the INL VHTR Project Office.

  17. Developing Uncertainty Models for Robust Flutter Analysis Using Ground Vibration Test Data

    NASA Technical Reports Server (NTRS)

    Potter, Starr; Lind, Rick; Kehoe, Michael W. (Technical Monitor)

    2001-01-01

    A ground vibration test can be used to obtain information about structural dynamics that is important for flutter analysis. Traditionally, this information, such as natural frequencies of modes, is used to update analytical models used to predict flutter speeds. The ground vibration test can also be used to obtain uncertainty models, such as natural frequencies and their associated variations, that can update analytical models for the purpose of predicting robust flutter speeds. Analyzing test data using the -norm, rather than the traditional 2-norm, is shown to lead to a minimum-size uncertainty description and, consequently, a least-conservative robust flutter speed. This approach is demonstrated using ground vibration test data for the Aerostructures Test Wing. Different norms are used to formulate uncertainty models and their associated robust flutter speeds to evaluate which norm is least conservative.

  18. Quantitative uncertainty and sensitivity analysis of a PWR control rod ejection accident

    SciTech Connect

    Pasichnyk, I.; Perin, Y.; Velkov, K.

    2013-07-01

    The paper describes the results of the quantitative Uncertainty and Sensitivity (U/S) Analysis of a Rod Ejection Accident (REA) which is simulated by the coupled system code ATHLET-QUABOX/CUBBOX applying the GRS tool for U/S analysis SUSA/XSUSA. For the present study, a UOX/MOX mixed core loading based on a generic PWR is modeled. A control rod ejection is calculated for two reactor states: Hot Zero Power (HZP) and 30% of nominal power. The worst cases for the rod ejection are determined by steady-state neutronic simulations taking into account the maximum reactivity insertion in the system and the power peaking factor. For the U/S analysis 378 uncertain parameters are identified and quantified (thermal-hydraulic initial and boundary conditions, input parameters and variations of the two-group cross sections). Results for uncertainty and sensitivity analysis are presented for safety important global and local parameters. (authors)

  19. A methodology for formulating a minimal uncertainty model for robust control system design and analysis

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert

    1989-01-01

    In the design and analysis of robust control systems for uncertain plants, the technique of formulating what is termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents the transfer function matrix M(s) of the nominal system, and delta represents an uncertainty matrix acting on M(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or multiple unstructured uncertainties from unmodeled dynamics and other neglected phenomena. In general, delta is a block diagonal matrix, and for real parameter variations the diagonal elements are real. As stated in the literature, this structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the literature addresses methods for obtaining this structure, and none of this literature addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty. Since having a delta matrix of minimum order would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta model would be useful. A generalized method of obtaining a minimal M-delta structure for systems with real parameter variations is given.

  20. Uncertainty Analysis on Heat Transfer Correlations for RP-1 Fuel in Copper Tubing

    NASA Technical Reports Server (NTRS)

    Driscoll, E. A.; Landrum, D. B.

    2004-01-01

    NASA is studying kerosene (RP-1) for application in Next Generation Launch Technology (NGLT). Accurate heat transfer correlations in narrow passages at high temperatures and pressures are needed. Hydrocarbon fuels, such as RP-1, produce carbon deposition (coke) along the inside of tube walls when heated to high temperatures. A series of tests to measure the heat transfer using RP-1 fuel and examine the coking were performed in NASA Glenn Research Center's Heated Tube Facility. The facility models regenerative cooling by flowing room temperature RP-1 through resistively heated copper tubing. A regression analysis is performed on the data to determine the heat transfer correlation for Nusselt number as a function of Reynolds and Prandtl numbers. Each measurement and calculation is analyzed to identify sources of uncertainty, including RP-1 property variations. Monte Carlo simulation is used to determine how each uncertainty source propagates through the regression to yield an overall uncertainty in the predicted heat transfer coefficient. The implications of these uncertainties on engine design and ways to minimize existing uncertainties are discussed.
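
    The Monte Carlo step described above can be illustrated with a short sketch: measured quantities entering a Nusselt-number correlation are perturbed according to assumed uncertainties, and the spread of the resulting heat transfer coefficient is examined. The correlation form (a Dittus-Boelter-type power law), the nominal values, and the uncertainty magnitudes below are illustrative assumptions, not values from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 100_000

    # Illustrative nominal measurements and 1-sigma uncertainties (not the study's values)
    Re = rng.normal(2.0e5, 0.04e5, N)     # Reynolds number, ~2% uncertainty
    Pr = rng.normal(12.0, 0.6, N)         # Prandtl number, ~5% uncertainty (RP-1 property variation)
    k  = rng.normal(0.11, 0.005, N)       # fluid thermal conductivity, W/(m K)
    D  = rng.normal(2.0e-3, 2.0e-5, N)    # tube inner diameter, m

    # A Dittus-Boelter-type correlation stands in for the fitted regression
    Nu = 0.023 * Re**0.8 * Pr**0.4
    h = Nu * k / D                        # heat transfer coefficient, W/(m^2 K)

    print(f"h = {h.mean():.0f} W/m^2K +/- {h.std():.0f} (1 sigma, {100*h.std()/h.mean():.1f}%)")
    ```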

  1. An Uncertainty Analysis for Predicting Soil Profile Salinity Using EM Induction Data

    NASA Astrophysics Data System (ADS)

    Huang, Jingyi; Monteiro Santos, Fernando; Triantafilis, John

    2016-04-01

    Proximal soil sensing techniques such as electromagnetic (EM) induction have been used to identify and map the areal variation of average soil properties. However, soil varies with depth owing to the action of various soil forming factors (e.g., parent material and topography). In this work we collected EM data using an EM38 and EM34 meter along a 22-km transect in the Trangie District, Australia. We jointly inverted these data using EM4Soil software and compared our 2-dimensional model of true electrical conductivity (sigma - mS/m) with depth against measured electrical conductivity of a saturated soil-paste extract (ECe - dS/m) at depths of 0-16 m. Through the use of a linear regression (LR) model and by varying forward modelling algorithms (cumulative function and full solution), inversion algorithms (S1 and S2), and damping factor (lambda), we determined a suitable electromagnetic conductivity image (EMCI) which was optimal when using the full solution, S2 and lambda = 0.6. To evaluate uncertainty of the inversion process and the LR model, we conducted an uncertainty analysis. The distribution of the model misfit shows the largest uncertainty caused by inversion (mostly due to EM34-40) occurs at deeper profiles while the largest uncertainty of the LR model occurs where the soil profile is most saline. These uncertainty maps also illustrate how the model accuracy can be improved in the future.

  2. Greenhouse gas emissions from shifting cultivation in the tropics, including uncertainty and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Silva, J. M. N.; Carreiras, J. M. B.; Rosa, I.; Pereira, J. M. C.

    2011-10-01

    Annual emissions of CO2, CH4, CO, N2O, and NOx from biomass burning in shifting cultivation systems in tropical Asia, Africa, and America were estimated at national and continental levels as the product of area burned, aboveground biomass, combustion completeness, and emission factor. The total area of shifting cultivation in each country was derived from the Global Land Cover 2000 map, while the area cleared and burned annually was obtained by multiplying the total area by the rotation cycle of shifting cultivation, calculated using cropping and fallow lengths reported in the literature. Aboveground biomass accumulation was estimated as a function of the duration and mean temperature of the growing season, soil texture type, and length of the fallow period. The uncertainty associated with each model variable was estimated, and an uncertainty and sensitivity analysis of greenhouse gas estimates was performed with Monte Carlo and variance decomposition techniques. Our results reveal large uncertainty in emission estimates for all five gases. In the case of CO2, mean (standard deviation) emissions from shifting cultivation in Asia, Africa, and America were estimated at 241 (132), 205 (139), and 295 (197) Tg yr-1, respectively. Combustion completeness and emission factors were the model inputs that contributed the most to the uncertainty of estimates. Our mean estimates are lower than the literature values for atmospheric emission from biomass burning in shifting cultivation systems. Only mean values could be compared since other studies do not provide any measure of uncertainty.
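
    As a rough illustration of the product model described above (emission = area burned x aboveground biomass x combustion completeness x emission factor) with Monte Carlo propagation and a crude variance decomposition; all distributions below are hypothetical placeholders for a single region and gas, not the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 50_000

    # Hypothetical input distributions for one region, CO2 only
    area_burned = rng.normal(10.0e6, 2.0e6, N)        # ha per year
    biomass     = rng.normal(60.0, 15.0, N)           # Mg dry matter per ha
    combustion  = rng.uniform(0.2, 0.5, N)            # combustion completeness (fraction)
    ef_co2      = rng.normal(1580.0, 90.0, N)         # g CO2 per kg dry matter burned

    # Emission in Tg CO2 per year: (ha * Mg/ha * fraction) Mg -> kg (x1e3), times g/kg, then g -> Tg (/1e12)
    emis_tg = area_burned * biomass * combustion * ef_co2 * 1e3 / 1e12

    print(f"CO2 emission: mean {emis_tg.mean():.0f} Tg/yr, sd {emis_tg.std():.0f} Tg/yr")

    # Crude first-order variance decomposition: squared correlation with the output
    for name, x in [("area", area_burned), ("biomass", biomass),
                    ("combustion", combustion), ("emission factor", ef_co2)]:
        r = np.corrcoef(x, emis_tg)[0, 1]
        print(f"  {name:16s} ~{100 * r**2:.0f}% of output variance")
    ```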

  3. Uncertainty and Sensitivity analysis of a physically-based landslide model

    NASA Astrophysics Data System (ADS)

    Yatheendradas, Soni; Kirschbaum, Dalia

    2015-04-01

    Rainfall-induced landslides are hazardous to life and property. Rain data sources like satellite remote sensors combined with physically-based models of landslide initiation are a potentially economical solution for anticipating and early warning of possible landslide activity. In this work, we explore the output uncertainty of the physically-based USGS model, TRIGRS (Transient Rainfall Infiltration and Grid-Based Regional Slope-Stability) under both an a priori model parameter specification scenario and a model calibration scenario using a powerful stochastic optimization algorithm. We study a set of 50+ historic landslides over Macon County in North Carolina as an example of a robust regional analysis. We then conduct a robust multivariate sensitivity analysis of the modeled output to various factors including rainfall forcing, initial and boundary conditions, and model parameters including topographic slope. Satellite rainfall uncertainty distributions are prescribed based on stochastic regressions to benchmark rain values at each location. Information about the most influential factors from sensitivity analysis will help to preferentially direct field work efforts towards associated observations. This will contribute to reducing output uncertainty in future modeling efforts. We also show how we can conveniently reduce model complexity by neglecting the least influential factors while maintaining required levels of predictive accuracy and uncertainty.

  4. Dynamic analysis of global copper flows. Global stocks, postconsumer material flows, recycling indicators, and uncertainty evaluation.

    PubMed

    Glöser, Simon; Soulier, Marcel; Tercero Espinoza, Luis A

    2013-06-18

    We present a dynamic model of global copper stocks and flows which allows a detailed analysis of recycling efficiencies, copper stocks in use, and dissipated and landfilled copper. The model is based on historical mining and refined copper production data (1910-2010) enhanced by a unique data set of recent global semifinished goods production and copper end-use sectors provided by the copper industry. To enable the consistency of the simulated copper life cycle in terms of a closed mass balance, particularly the matching of recycled metal flows to reported historical annual production data, a method was developed to estimate the yearly global collection rates of end-of-life (postconsumer) scrap. Based on this method, we provide estimates of 8 different recycling indicators over time. The main indicator for the efficiency of global copper recycling from end-of-life (EoL) scrap--the EoL recycling rate--was estimated to be 45% on average, ± 5% (one standard deviation) due to uncertainty and variability over time in the period 2000-2010. As uncertainties of specific input data--mainly concerning assumptions on end-use lifetimes and their distribution--are high, a sensitivity analysis with regard to the effect of uncertainties in the input data on the calculated recycling indicators was performed. The sensitivity analysis included a stochastic (Monte Carlo) uncertainty evaluation with 10^5 simulation runs. PMID:23725041

  5. Efficient Methods for Bayesian Uncertainty Analysis and Global Optimization of Computationally Expensive Environmental Models

    NASA Astrophysics Data System (ADS)

    Shoemaker, Christine; Espinet, Antoine; Pang, Min

    2015-04-01

    Models of complex environmental systems can be computationally expensive in order to describe the dynamic interactions of the many components over a sizeable time period. Diagnostics of these systems can include forward simulations of calibrated models under uncertainty and analysis of alternatives of systems management. This discussion will focus on applications of new surrogate optimization and uncertainty analysis methods to environmental models that can enhance our ability to extract information and understanding. For complex models, optimization and especially uncertainty analysis can require a large number of model simulations, which is not feasible for computationally expensive models. Surrogate response surfaces can be used in Global Optimization and Uncertainty methods to obtain accurate answers with far fewer model evaluations, which makes the methods practical for computationally expensive models for which conventional methods are not feasible. In this paper we will discuss the application of the SOARS surrogate method for estimating Bayesian posterior density functions for model parameters for a TOUGH2 model of geologic carbon sequestration. We will also briefly discuss a new parallel surrogate global optimization algorithm, applied to two groundwater remediation sites, that was implemented on a supercomputer with up to 64 processors. The applications will illustrate the use of these methods to predict the impact of monitoring and management on subsurface contaminants.

  6. Uncertainty Analysis of Power Grid Investment Capacity Based on Monte Carlo

    NASA Astrophysics Data System (ADS)

    Qin, Junsong; Liu, Bingyi; Niu, Dongxiao

    By analyzing the factors that influence the investment capacity of the power grid, an investment capacity analysis model is built with depreciation cost, sales price and sales quantity, net profit, financing, and GDP of the secondary industry as the dependent variables. After carrying out Kolmogorov-Smirnov tests, the probability distribution of each influencing factor is obtained. Finally, the uncertainty analysis results for grid investment capacity are obtained by Monte Carlo simulation.
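
    A minimal sketch of the workflow described above: fit a distribution to one influencing factor, check the fit with a Kolmogorov-Smirnov test, then propagate all factors through a capacity model by Monte Carlo. The data, the linear capacity model, and its coefficients are hypothetical.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)

    # Hypothetical historical observations of one influencing factor (e.g., sales quantity growth, %)
    obs = rng.normal(6.0, 1.5, 30)

    # Fit a normal distribution and check the fit with a Kolmogorov-Smirnov test
    # (note: fitting and testing on the same sample makes the p-value optimistic)
    mu, sigma = stats.norm.fit(obs)
    ks_stat, p_value = stats.kstest(obs, "norm", args=(mu, sigma))
    print(f"KS statistic {ks_stat:.3f}, p-value {p_value:.3f}")

    # Monte Carlo simulation of a hypothetical investment-capacity model
    N = 100_000
    sales_growth = rng.normal(mu, sigma, N)          # fitted factor
    net_profit   = rng.normal(120.0, 20.0, N)        # illustrative, billion CNY
    financing    = rng.normal(80.0, 15.0, N)         # illustrative, billion CNY
    capacity = 0.4 * net_profit + 0.6 * financing + 2.0 * sales_growth  # toy model

    lo, hi = np.percentile(capacity, [5, 95])
    print(f"Investment capacity: mean {capacity.mean():.1f}, 90% interval [{lo:.1f}, {hi:.1f}]")
    ```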

  7. Global sensitivity analysis in wastewater treatment plant model applications: prioritizing sources of uncertainty.

    PubMed

    Sin, Gürkan; Gernaey, Krist V; Neumann, Marc B; van Loosdrecht, Mark C M; Gujer, Willi

    2011-01-01

    This study demonstrates the usefulness of global sensitivity analysis in wastewater treatment plant (WWTP) design to prioritize sources of uncertainty and quantify their impact on performance criteria. The study, which is performed with the Benchmark Simulation Model no. 1 plant design, complements a previous paper on input uncertainty characterisation and propagation (Sin et al., 2009). A sampling-based sensitivity analysis is conducted to compute standardized regression coefficients. It was found that this method is able to decompose satisfactorily the variance of plant performance criteria (with R^2 > 0.9) for effluent concentrations, sludge production and energy demand. This high extent of linearity means that the plant performance criteria can be described as linear functions of the model inputs under the defined plant conditions. In effect, the system of coupled ordinary differential equations can be replaced by multivariate linear models, which can be used as surrogate models. The importance ranking based on the sensitivity measures demonstrates that the most influential factors involve ash content and influent inert particulate COD among others, largely responsible for the uncertainty in predicting sludge production and effluent ammonium concentration. While these results were in agreement with process knowledge, the added value is that the global sensitivity methods can quantify the contribution of the variance of significant parameters, e.g., ash content explains 70% of the variance in sludge production. Further, the importance of formulating appropriate sensitivity analysis scenarios that match the purpose of the model application needs to be highlighted. Overall, the global sensitivity analysis proved a powerful tool for explaining and quantifying uncertainties as well as providing insight into devising useful ways for reducing uncertainties in the plant performance. This information can help engineers design robust WWTPs. PMID:20828785
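
    Standardized regression coefficients (SRCs) from a sampling-based sensitivity analysis can be computed as sketched below; a toy linear response stands in for the Benchmark Simulation Model, and the input names and ranges are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N = 2000

    # Sample uncertain inputs (illustrative ranges, not the BSM1 factors)
    X = np.column_stack([
        rng.uniform(0.1, 0.3, N),    # e.g., ash content of influent solids
        rng.uniform(20.0, 60.0, N),  # e.g., influent inert particulate COD
        rng.uniform(0.5, 0.7, N),    # e.g., heterotrophic yield
    ])

    # Toy model response standing in for sludge production
    y = 4.0 * X[:, 0] + 0.02 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 0.05, N)

    # Ordinary least squares, then standardize: SRC_i = b_i * std(x_i) / std(y)
    A = np.column_stack([np.ones(N), X])
    b = np.linalg.lstsq(A, y, rcond=None)[0]
    src = b[1:] * X.std(axis=0) / y.std()

    print("R^2 :", 1 - np.var(y - A @ b) / np.var(y))
    print("SRCs:", np.round(src, 3), " (SRC^2 approximates each factor's variance share)")
    ```

    When the regression R^2 is high, as reported in the abstract, the squared SRCs sum to roughly one and can be read directly as variance shares.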

  8. A detailed description of the uncertainty analysis for High Area Ratio Rocket Nozzle tests at the NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac

    1987-01-01

    A preliminary uncertainty analysis has been performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis are presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.
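
    The propagation of measurement uncertainties into specific impulse can be illustrated, in highly simplified form, by a root-sum-square combination of relative uncertainties for Isp = F / (mdot * g0). The component values below are illustrative and do not reproduce the report's detailed multi-equation analysis.

    ```python
    import math

    # Simplified relative-uncertainty combination for Isp = F / (mdot * g0)
    u_thrust   = 0.009   # 0.9% relative uncertainty in thrust measurement (illustrative)
    u_mdot     = 0.006   # 0.6% relative uncertainty in propellant mass flow (illustrative)
    u_pressure = 0.004   # 0.4% relative effect of capsule pressure on the thrust correction (illustrative)

    u_isp = math.sqrt(u_thrust**2 + u_mdot**2 + u_pressure**2)
    print(f"Relative uncertainty in specific impulse: {100 * u_isp:.2f} %")
    ```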

  9. A detailed description of the uncertainty analysis for high area ratio rocket nozzle tests at the NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac

    1987-01-01

    A preliminary uncertainty analysis was performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis are presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.

  10. MODEL UNCERTAINTY ANALYSIS, FIELD DATA COLLECTION AND ANALYSIS OF CONTAMINATED VAPOR INTRUSION INTO BUILDINGS

    EPA Science Inventory

    To address uncertainty associated with the evaluation of vapor intrusion problems, we are working on a three-part strategy that includes: evaluation of uncertainty in model-based assessments; collection of field data; and assessment of sites using EPA and state protocols.

  11. Spline analysis of Holocene sediment magnetic records: Uncertainty estimates for field modeling

    NASA Astrophysics Data System (ADS)

    Panovska, S.; Finlay, C. C.; Donadini, F.; Hirt, A. M.

    2012-02-01

    Sediment and archeomagnetic data spanning the Holocene enable us to reconstruct the evolution of the geomagnetic field on time scales of centuries to millennia. In global field modeling the reliability of data is taken into account by weighting according to uncertainty estimates. Uncertainties in sediment magnetic records arise from (1) imperfections in the paleomagnetic recording processes, (2) coring and (sub) sampling methods, (3) adopted averaging procedures, and (4) uncertainties in the age-depth models. We take a step toward improved uncertainty estimates by performing a comprehensive statistical analysis of the available global database of Holocene magnetic records. Smoothing spline models that capture the robust aspects of individual records are derived. This involves a cross-validation approach, based on an absolute deviation measure of misfit, to determine the smoothing parameter for each spline model, together with the use of a minimum smoothing time derived from the sedimentation rate and assumed lock-in depth. Departures from the spline models provide information concerning the random variability in each record. Temporal resolution analysis reveals that 50% of the records have smoothing times between 80 and 250 years. We also perform comparisons among the sediment magnetic records and archeomagnetic data, as well as with predictions from the global historical and archeomagnetic field models. Combining these approaches, we arrive at individual uncertainty estimates for each sediment record. These range from 2.5° to 11.2° (median: 5.9°; interquartile range: 5.4° to 7.2°) for inclination, 4.1° to 46.9° (median: 13.4°; interquartile range: 11.4° to 18.9°) for relative declination, and 0.59 to 1.32 (median: 0.93; interquartile range: 0.86 to 1.01) for standardized relative paleointensity. These values suggest that uncertainties may have been underestimated in previous studies. No compelling evidence for systematic inclination shallowing is

  12. Sensitivity analysis of a global aerosol model to understand how parametric uncertainties affect model predictions

    NASA Astrophysics Data System (ADS)

    Lee, L. A.; Carslaw, K. S.; Pringle, K. J.

    2012-04-01

    Global aerosol contributions to radiative forcing (and hence climate change) are persistently subject to large uncertainty in successive Intergovernmental Panel on Climate Change (IPCC) reports (Schimel et al., 1996; Penner et al., 2001; Forster et al., 2007). As such, more complex global aerosol models are being developed to simulate aerosol microphysics in the atmosphere. The uncertainty in global aerosol model estimates is currently estimated by measuring the diversity amongst different models (Textor et al., 2006, 2007; Meehl et al., 2007). The uncertainty at the process level due to the need to parameterise in such models is not yet understood and it is difficult to know whether the added model complexity comes at a cost of high model uncertainty. In this work the model uncertainty and its sources due to the uncertain parameters is quantified using variance-based sensitivity analysis. Due to the complexity of a global aerosol model we use Gaussian process emulation with a sufficient experimental design to make such a sensitivity analysis possible. The global aerosol model used here is GLOMAP (Mann et al., 2010) and we quantify the sensitivity of numerous model outputs to 27 expertly elicited uncertain model parameters describing emissions and processes such as growth and removal of aerosol. Using the R package DiceKriging (Roustant et al., 2010) along with the package sensitivity (Pujol, 2008), it has been possible to produce monthly global maps of model sensitivity to the uncertain parameters over the year 2008. Global model outputs estimated by the emulator are shown to be consistent with previously published estimates (Spracklen et al. 2010, Mann et al. 2010) but now we have an associated measure of parameter uncertainty and its sources. It can be seen that globally some parameters have no effect on the model predictions and any further effort in their development may be unnecessary, although a structural error in the model might also be identified. The

  13. Characterization of Sealed Radioactive Sources: An Uncertainty Analysis to Improve Detection Methods

    SciTech Connect

    Daniel G. Cummings; James D. Sommers; Mary L. Adamic; Marcos Jimenez; Jeffrey G. Giglio; Kevin P. Carney; Karl Grimm

    2009-12-01

    The characterization of several types of sealed radiation sources has been accomplished. Specifically, cesium and strontium sources have been chemically characterized by inductively coupled plasma mass spectrometry (ICP-MS). One of the important pieces of information coming from the characterization is the “age” since purification. The age refers to the time since purification of the Cs or Sr components. A detailed error analysis of the mass spectrometric work has been undertaken to identify areas of improvement, as well as to quantify the effect the errors have on the “age” determined. This paper reports an uncertainty analysis associated with the measurements, identifying areas of improvement and alternative techniques that may reduce the uncertainties. In particular, work on isotope dilution using ICP-MS for the “age” determination of sealed sources will be presented, along with a detailed error analysis. The results will be compared to the original work done with simple instrument calibration.

  14. Uncertainty in the estimates of peak ground acceleration in seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Pavlenko, V. A.

    2015-11-01

    Probabilistic seismic hazard analysis has become a standard procedure preceding antiseismic construction. An important component of the relevant calculations is the allowance for the uncertainty in the strong motion parameters (e.g., peak ground acceleration (PGA)). In present-day approaches to probabilistic analysis, this uncertainty is modeled by a random variable (a residual) which has a lognormal distribution. With this model, the extrapolation into the area of long return periods yields nonzero probabilities of unrealistically high PGA. In the present work, the distribution of the logarithmic PGA residuals is modeled by different parametric distributions. From the set of these distributions, the one which provides the closest approximation of the empirical data is selected by statistical criteria. The analysis shows that the generalized extreme value distribution (GEVD) most accurately reproduces the residuals of the logarithmic PGA, and the tail of the distribution is approximated by the generalized Pareto distribution (GPD).
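
    The distribution-selection step can be sketched by fitting several candidate distributions to logarithmic PGA residuals and comparing them with an information criterion. The synthetic, skewed residuals and the use of AIC below are illustrative choices, not the paper's data or its exact selection statistic.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)

    # Synthetic logarithmic PGA residuals standing in for real ground-motion data
    residuals = stats.skewnorm.rvs(a=2.0, loc=-0.3, scale=0.6, size=500, random_state=rng)

    candidates = {
        "normal":   stats.norm,
        "GEV":      stats.genextreme,
        "logistic": stats.logistic,
    }

    # Fit each candidate by maximum likelihood and compare with AIC (lower is better)
    for name, dist in candidates.items():
        params = dist.fit(residuals)
        loglik = np.sum(dist.logpdf(residuals, *params))
        aic = 2 * len(params) - 2 * loglik
        print(f"{name:8s}  AIC = {aic:8.1f}  params = {np.round(params, 3)}")
    ```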

  15. Damage functions for climate-related hazards: unification and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Prahl, B. F.; Rybski, D.; Boettle, M.; Kropp, J. P.

    2015-11-01

    Most climate change impacts manifest in the form of natural hazards. For example, sea-level rise and changes in storm climatology are expected to increase the frequency and magnitude of flooding events. In practice there is a need for comprehensive damage assessment at an intermediate level of complexity. Answering this need, we reveal the common grounds of macroscale damage functions employed in storm damage, coastal-flood damage, and heat mortality assessment. The universal approach offers both bottom-up and top-down damage evaluation, employing either an explicit or an implicit portfolio description. Putting emphasis on the treatment of data uncertainties, we perform a sensitivity analysis across different scales. We find that the behaviour of intrinsic uncertainties on the microscale level (i.e. single item) does still persist on the macroscale level (i.e. portfolio). Furthermore, the analysis of uncertainties can reveal their specific relevance, allowing for simplification of the modelling chain. Our results shed light on the role of uncertainties and provide useful insight for the application of a unified damage function.

  16. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    DOE PAGESBeta

    Pastore, Giovanni; Swiler, L. P.; Hales, Jason D.; Novascone, Stephen R.; Perez, Danielle M.; Spencer, Benjamin W.; Luzzi, Lelio; Uffelen, Paul Van; Williamson, Richard L.

    2014-10-12

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for the coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  17. Practical issues in handling data input and uncertainty in a budget impact analysis.

    PubMed

    Nuijten, M J C; Mittendorf, T; Persson, U

    2011-06-01

    The objective of this paper was to address the importance of dealing systematically and comprehensively with uncertainty in a budget impact analysis (BIA) in more detail. The handling of uncertainty in health economics was used as a point of reference for addressing the uncertainty in a BIA. This overview shows that standard methods of sensitivity analysis, which are used for the standard data set in a health economic model (clinical probabilities, treatment patterns, resource utilisation and prices/tariffs), cannot always be used for the input data for the BIA model beyond the health economic data set for various reasons. Whereas in a health economic model, only limited data may come from a Delphi panel, a BIA model often relies on a majority of data taken from a Delphi panel. In addition, the dataset in a BIA model also includes forecasts (e.g. annual growth, uptake curves, substitution effects, changes in prescription restrictions and guidelines, future distribution of the available treatment modalities, off-label use). As a consequence, the use of standard sensitivity analyses for the BIA data set might be limited because of the lack of appropriate distributions as data sources are limited, or because of the need for forecasting. Therefore, scenario analyses might be more appropriate to capture the uncertainty in the BIA data set in the overall BIA model. PMID:20364289

  18. Uncertainty based analysis of the impact of watershed phosphorus load on reservoir phosphorus concentration

    NASA Astrophysics Data System (ADS)

    Karamouz, Mohammad; Taheriyoun, Masoud; Seyedabadi, Mohammadreza; Nazif, Sara

    2015-02-01

    In many regions of the world that depend on surface reservoirs as a source of water supply, eutrophication is a major water quality problem. Developing simulation models to evaluate the impact of watershed nutrient loads on the reservoir's water quality is an essential step in eutrophication management. In this regard, analysis of model uncertainty gives an opportunity to assess the reliability and the margin of safety of the model predictions for Total Maximum Daily Load (TMDL) from the watershed nutrient load. In this study, a computational procedure has been proposed for the analysis of the model uncertainties in simulation of watershed phosphorus load and reservoir phosphorus concentration. Data from the Aharchai watershed, which is located upstream of the Satarkhan reservoir in the northwestern part of Iran, is used as the study area to test the effectiveness of the proposed methodology. The Soil and Water Assessment Tool (SWAT) is utilized for assessment of watershed phosphorus load as the main agent resulting in the reservoir eutrophication in the region. The most effective parameters in model performance are identified by a global sensitivity analysis technique named modified Fourier Amplitude Sensitivity Test (FAST) which can incorporate parameter interdependencies. The Generalized Likelihood Uncertainty Estimation (GLUE) technique is also applied to set up behavioral ranges of the parameters that are relevant to the actual observations. Finally, the cumulative weighted-likelihood distribution functions (CWLDF) are derived for outputs of the SWAT. They are used jointly for estimation of results uncertainty limits using the Copula method. To assess the effectiveness of applying Best Management Practices (BMPs) in the watershed, two scenarios, with and without BMP application, are tested. The results showed the effectiveness of the proposed model in uncertainty estimation of watershed phosphorus load and reservoir phosphorus concentration as well as the
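
    The GLUE step can be illustrated independently of SWAT with a toy exponential export model: parameter sets are sampled from prior ranges, a likelihood (here the Nash-Sutcliffe efficiency) is computed against observations, behavioral sets are retained, and likelihood-weighted prediction bounds are formed. All data, the model form, and the behavioral threshold are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Synthetic "observed" phosphorus loads and a toy 2-parameter export model in place of SWAT
    t = np.arange(1, 25)
    obs = 5.0 * np.exp(-0.08 * t) + rng.normal(0, 0.2, t.size)

    def model(a, k):
        return a * np.exp(-k * t)

    # GLUE: sample parameters from prior ranges, keep behavioral sets (NSE above a threshold)
    N = 20_000
    a = rng.uniform(1.0, 10.0, N)
    k = rng.uniform(0.01, 0.20, N)
    sims = model(a[:, None], k[:, None])                        # shape (N, time)
    nse = 1 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)

    behavioral = nse > 0.6
    w = nse[behavioral] / nse[behavioral].sum()                 # likelihood weights
    print(f"{behavioral.sum()} behavioral parameter sets out of {N}")

    # Likelihood-weighted 5-95% prediction bounds at selected time steps
    for it in (0, 11, 23):
        s = sims[behavioral][:, it]
        idx = np.argsort(s)
        cdf = np.cumsum(w[idx])
        lo = s[idx][np.searchsorted(cdf, 0.05)]
        hi = s[idx][np.searchsorted(cdf, 0.95)]
        print(f"t={t[it]:2d}: obs={obs[it]:.2f}  5-95% bounds [{lo:.2f}, {hi:.2f}]")
    ```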

  19. Uncertainty analysis of multi-rate kinetics of uranium desorption from sediments

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.; Zhang, Guannan

    2014-01-01

    Multi-rate surface complexation models have been proposed to describe the kinetics of uranyl (U(VI)) surface complexation reactions (SCR) rate-limited by diffusive mass transfer to and from intragranular sorption sites in subsurface sediments. In this study, a Bayesian-based, Differential Evolution Markov Chain method was used to assess the uncertainty and to identify factors controlling the uncertainties of the multi-rate SCR model. The rate constants in the multi-rate SCR were estimated with and without assumption of a specified lognormal distribution to test the lognormal assumption typically used to minimize the number of the rate constants in the multi-rate model. U(VI) desorption under variable chemical conditions from a contaminated sediment at the US Hanford 300 Area, Washington, was used as an example. The results indicated that the estimated rate constants without a specified lognormal assumption approximately followed a lognormal distribution, indicating that the lognormal is an effective assumption for the rate constants in the multi-rate SCR model. However, those rate constants with their corresponding half-lives longer than the experimental durations for model characterization had larger uncertainties and could not be reliably estimated. The uncertainty analysis revealed that the time-scale of the experiments for calibrating the multi-rate SCR model, the assumption for the rate constant distribution, the geochemical conditions involved in predicting U(VI) desorption, and equilibrium U(VI) speciation reaction constants were the major factors contributing to the extrapolation uncertainties of the multi-rate SCR model. Overall, the results from this study demonstrated that the multi-rate SCR model with a lognormal distribution of its rate constants is an effective approach for describing rate-limited U(VI) desorption; however, the model contains uncertainties, especially for those smaller rate constants, that require careful consideration for predicting U
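
    The lognormal multi-rate idea can be sketched as a weighted sum of first-order desorption terms whose rate constants are an equal-probability discretization of a lognormal distribution; the parameters below are illustrative assumptions, not the Hanford calibration.

    ```python
    import numpy as np
    from scipy import stats

    # Multi-rate first-order desorption: sorption sites partitioned among rate constants k_j
    # drawn from a lognormal distribution (illustrative parameters only)
    mu_lnk, sigma_lnk = np.log(1e-3), 1.5      # mean and sd of ln(k), k in 1/h
    n_sites = 50

    # Equal-probability discretization of the lognormal rate-constant distribution
    q = (np.arange(n_sites) + 0.5) / n_sites
    k_j = stats.lognorm.ppf(q, s=sigma_lnk, scale=np.exp(mu_lnk))
    f_j = np.full(n_sites, 1.0 / n_sites)      # fraction of sites in each rate class

    def desorbed_fraction(t_hours):
        """Cumulative fraction of sorbed U(VI) released after t hours."""
        return np.sum(f_j * (1.0 - np.exp(-k_j * t_hours)))

    for t in (10, 100, 1000, 10000):
        print(f"t = {t:6d} h : released fraction = {desorbed_fraction(t):.3f}")
    ```

    The slowest rate classes barely contribute within a short experiment, which mirrors the abstract's point that small rate constants are poorly constrained by experiments of limited duration.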

  20. Biophysical and Economic Uncertainty in the Analysis of Poverty Impacts of Climate Change

    NASA Astrophysics Data System (ADS)

    Hertel, T. W.; Lobell, D. B.; Verma, M.

    2011-12-01

    This paper seeks to understand the main sources of uncertainty in assessing the impacts of climate change on agricultural output, international trade, and poverty. We incorporate biophysical uncertainty by sampling from a distribution of global climate model predictions for temperature and precipitation for 2050. The implications of these realizations for crop yields around the globe are estimated using the recently published statistical crop yield functions provided by Lobell, Schlenker and Costa-Roberts (2011). By comparing these yields to those predicted under current climate, we obtain the likely change in crop yields owing to climate change. The economic uncertainty in our analysis relates to the response of the global economic system to these biophysical shocks. We use a modified version of the GTAP model to elicit the impact of the biophysical shocks on global patterns of production, consumption, trade and poverty. Uncertainty in these responses is reflected in the econometrically estimated parameters governing the responsiveness of international trade, consumption, production (and hence the intensive margin of supply response), and factor supplies (which govern the extensive margin of supply response). We sample from the distributions of these parameters as specified by Hertel et al. (2007) and Keeney and Hertel (2009). We find that, even though it is difficult to predict where in the world agricultural crops will be favorably affected by climate change, the responses of economic variables, including output and exports can be far more robust (Table 1). This is due to the fact that supply and demand decisions depend on relative prices, and relative prices depend on productivity changes relative to other crops in a given region, or relative to similar crops in other parts of the world. We also find that uncertainty in poverty impacts of climate change appears to be almost entirely driven by biophysical uncertainty.

  1. Evaluation of Uncertainty in Runoff Analysis Incorporating Theory of Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshimi, Kazuhiro; Wang, Chao-Wen; Yamada, Tadashi

    2015-04-01

    The aim of this paper is to provide a theoretical framework for uncertainty estimation in rainfall-runoff analysis based on the theory of stochastic processes. SDEs (stochastic differential equations) based on this theory have been widely used in mathematical finance to predict stock price movements. Meanwhile, some researchers in civil engineering have applied this knowledge of SDEs to hydrological problems (e.g. Kurino et al., 1999; Higashino and Kanda, 2001). However, there have been no studies that evaluate the uncertainty in runoff phenomena based on comparisons between an SDE and the Fokker-Planck equation. The Fokker-Planck equation is a partial differential equation that describes the temporal variation of a PDF (probability density function), and SDEs and Fokker-Planck equations are mathematically equivalent. In this paper, therefore, the dependence of the uncertainty of discharge on the uncertainty of rainfall is explained theoretically and mathematically by introducing the theory of stochastic processes. The lumped rainfall-runoff model is represented by an SDE, written in difference form, because the temporal variation of rainfall is expressed as its average plus a deviation that is approximated by a Gaussian distribution. This representation is based on rainfall observed by rain-gauge stations and a radar rain-gauge system. As a result, this paper shows that it is possible to evaluate the uncertainty of discharge by using the relationship between the SDE and the Fokker-Planck equation. Moreover, the results of this study show that the uncertainty of discharge increases as rainfall intensity rises and as the non-linearity of the flow resistance grows stronger. These results are clarified by the PDFs of discharge that satisfy the Fokker-Planck equation. It means the reasonable discharge can be
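
    The SDE view can be illustrated with an Euler-Maruyama ensemble for a hypothetical lumped linear-reservoir model; the empirical distribution of simulated discharge approximates the PDF governed by the corresponding Fokker-Planck equation. The model form, parameter values, and noise level are assumptions made for illustration only.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Hypothetical lumped linear-reservoir SDE (not the paper's exact model):
    #   dS = (r_mean - S / k) dt + sigma dW,   discharge Q = S / k
    r_mean, sigma = 5.0, 2.0        # mean rainfall and its fluctuation intensity (mm/h)
    k = 10.0                        # storage constant (h)
    dt, T, n_paths = 0.1, 48.0, 20_000

    S = np.zeros(n_paths)
    for _ in range(int(T / dt)):
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)
        S += (r_mean - S / k) * dt + sigma * dW   # Euler-Maruyama step

    Q = S / k
    print(f"Discharge after {T:.0f} h: mean {Q.mean():.2f} mm/h, sd {Q.std():.2f} mm/h")
    # The ensemble histogram of Q approximates the stationary PDF of the Fokker-Planck
    # equation for this linear model (a Gaussian with mean r_mean and sd sigma / sqrt(2 k)).
    ```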

  2. Variability and Uncertainty in Probabilistic Seismic Hazard Analysis for the Island of Montreal

    NASA Astrophysics Data System (ADS)

    Elkady, Ahmed Mohamed Ahmed

    The current seismic design process for structures in Montreal is based on the 2005 edition of the National Building Code of Canada (NBCC 2005), which is based on a hazard level corresponding to a probability of exceedance of 2% in 50 years. The code is based on the Uniform Hazard Spectrum (UHS) and deaggregation values obtained with the Geological Survey of Canada (GSC) modified version of the F-RISK software, through a process that did not formally consider epistemic uncertainty. Epistemic uncertainty is related to the uncertainty in model formulation. A seismological model consists of seismic sources (source geometry, source location, recurrence rate, magnitude distribution, and maximum magnitude) and a Ground-Motion Prediction Equation (GMPE). In general, and particularly in Montreal, GMPEs are the main source of epistemic uncertainty relative to the other variables of the seismological model. The objective of this thesis is to use CRISIS software to investigate the effect of epistemic uncertainty on probabilistic seismic hazard analysis (PSHA) products like the UHS and deaggregation values by incorporating different new GMPEs. The epsilon parameter, which represents the departure of the target ground motion from that predicted by the GMPE, is also discussed, as it is not very well documented in Eastern Canada. A method is proposed to calculate epsilon values for Montreal relative to a given GMPE and to calculate robust weighted modal epsilon values when epistemic uncertainty is considered. Epsilon values are commonly used in seismic performance evaluations for identifying design events and selecting ground motion records for vulnerability and liquefaction studies. A brief overview of record epsilons, which account for the spectral shape of the ground motion time history, is also presented.
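
    The epsilon parameter described above is the number of logarithmic standard deviations separating a target ground motion from the GMPE median prediction. A sketch with hypothetical numbers (not Montreal-specific values):

    ```python
    import math

    def epsilon(target_pga_g, median_pga_g, sigma_ln):
        """Epsilon: logarithmic standard deviations between the target ground motion
        and the GMPE median prediction for the controlling magnitude-distance pair."""
        return (math.log(target_pga_g) - math.log(median_pga_g)) / sigma_ln

    # Hypothetical values: a UHS target of 0.33 g where a GMPE predicts a median
    # of 0.12 g with a logarithmic standard deviation of 0.65
    print(f"epsilon = {epsilon(0.33, 0.12, 0.65):.2f}")
    ```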

  3. Numerical daemons in hydrological modeling: Effects on uncertainty assessment, sensitivity analysis and model predictions

    NASA Astrophysics Data System (ADS)

    Kavetski, D.; Clark, M. P.; Fenicia, F.

    2011-12-01

    Hydrologists often face sources of uncertainty that dwarf those normally encountered in many engineering and scientific disciplines. Especially when representing large scale integrated systems, internal heterogeneities such as stream networks, preferential flowpaths, vegetation, etc, are necessarily represented with a considerable degree of lumping. The inputs to these models are themselves often the products of sparse observational networks. Given the simplifications inherent in environmental models, especially lumped conceptual models, does it really matter how they are implemented? At the same time, given the complexities usually found in the response surfaces of hydrological models, increasingly sophisticated analysis methodologies are being proposed for sensitivity analysis, parameter calibration and uncertainty assessment. Quite remarkably, rather than being caused by the model structure/equations themselves, in many cases model analysis complexities are consequences of seemingly trivial aspects of the model implementation - often, literally, whether the start-of-step or end-of-step fluxes are used! The extent of problems can be staggering, including (i) degraded performance of parameter optimization and uncertainty analysis algorithms, (ii) erroneous and/or misleading conclusions of sensitivity analysis, parameter inference and model interpretations and, finally, (iii) poor reliability of a calibrated model in predictive applications. While the often nontrivial behavior of numerical approximations has long been recognized in applied mathematics and in physically-oriented fields of environmental sciences, it remains a problematic issue in many environmental modeling applications. Perhaps detailed attention to numerics is only warranted for complicated engineering models? Would not numerical errors be an insignificant component of total uncertainty when typical data and model approximations are present? Is this really a serious issue beyond some rare isolated

  4. A comparison of uncertainty analysis methods using a groundwater flow model

    SciTech Connect

    Doctor, P.G.; Jacobson, E.A.; Buchanan, J.A.

    1988-06-01

    This report evaluates three uncertainty analysis methods that are proposed for use in performance assessment activities within the OCRWM and Nuclear Regulatory Commission (NRC) communities. The three methods are Monte Carlo simulation with unconstrained sampling, Monte Carlo simulation with Latin Hypercube sampling, and first-order analysis. Monte Carlo simulation with unconstrained sampling is a generally accepted uncertainty analysis method, but it has the disadvantage of being costly and time consuming. Latin Hypercube sampling was proposed to make Monte Carlo simulation more efficient. Although it was originally formulated for independent variables, which is a major drawback in performance assessment modeling, Latin Hypercube can be used to generate correlated samples. The first-order method is efficient to implement because it is based on the first-order Taylor series expansion; however, there is concern that it does not adequately describe the variability for complex models. These three uncertainty analysis methods were evaluated using a calibrated groundwater flow model of an unconfined aquifer in southern Arizona. The two simulation methods produced similar results, although the Latin Hypercube method tends to produce samples whose estimates of statistical parameters are closer to the desired parameters. The mean travel times for the first-order method do not agree with those of the simulations. In addition, the first-order method produces estimates of variance in travel times that are more variable than those produced by the simulation methods, resulting in nonconservative tolerance intervals. 13 refs., 33 figs.
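
    The difference between unconstrained Monte Carlo sampling and Latin Hypercube sampling can be illustrated with a small sketch; the stratified construction below is a generic Latin Hypercube sample of a single normal variable with illustrative statistics, not the report's groundwater model inputs.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(7)

    def lhs_normal(n, mean, sd):
        """Latin Hypercube sample of a normal variable: one point per probability stratum."""
        u = (rng.permutation(n) + rng.random(n)) / n   # stratified uniforms, shuffled
        return norm.ppf(u, loc=mean, scale=sd)

    n = 50
    mean, sd = 1e-4, 2e-5          # e.g., hydraulic conductivity statistics (illustrative)

    mc  = rng.normal(mean, sd, n)          # unconstrained Monte Carlo sample
    lhs = lhs_normal(n, mean, sd)          # Latin Hypercube sample

    for name, x in (("Monte Carlo", mc), ("Latin Hypercube", lhs)):
        print(f"{name:16s}: mean error {abs(x.mean() - mean) / mean:6.2%}, "
              f"sd error {abs(x.std(ddof=1) - sd) / sd:6.2%}")
    ```

    Because each probability stratum contributes exactly one point, the Latin Hypercube sample typically reproduces the target mean and variance more closely for the same sample size, which is the behavior the report attributes to it.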

  5. Deterministic methods for sensitivity and uncertainty analysis in large-scale computer models

    SciTech Connect

    Worley, B.A.; Oblow, E.M.; Pin, F.G.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.; Lucius, J.L.

    1987-01-01

    The fields of sensitivity and uncertainty analysis are dominated by statistical techniques when large-scale modeling codes are being analyzed. This paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability into existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions to obtain result probability distributions. The paper demonstrates the deterministic approach to sensitivity and uncertainty analysis as applied to a sample problem that models the flow of water through a borehole. The sample problem is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. The DUA method gives a more accurate result based upon only two model executions compared to fifty executions in the statistical case.
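
    The borehole example can be sketched with the standard borehole test function, comparing first-order (derivative-based) variance propagation against a Monte Carlo reference. The nominal values and uncertainties below are illustrative, and the function may differ in detail from the exact problem used in the report.

    ```python
    import numpy as np

    def borehole(x):
        """Standard borehole test function: water flow rate (m^3/yr) through a borehole."""
        rw, r, Tu, Hu, Tl, Hl, L, Kw = x
        log_ratio = np.log(r / rw)
        num = 2.0 * np.pi * Tu * (Hu - Hl)
        den = log_ratio * (1.0 + 2.0 * L * Tu / (log_ratio * rw**2 * Kw) + Tu / Tl)
        return num / den

    # Nominal values and illustrative 1-sigma uncertainties for the eight inputs
    mean = np.array([0.10, 3000.0, 89000.0, 1050.0, 89.0, 760.0, 1400.0, 11000.0])
    sd   = np.array([0.01,  300.0,  8000.0,   20.0,  8.0,  20.0,   90.0,   500.0])

    # First-order (derivative-based) propagation: var(f) ~ sum_i (df/dx_i * sd_i)^2
    grad = np.zeros_like(mean)
    for i in range(mean.size):
        h = 1e-4 * mean[i]
        xp, xm = mean.copy(), mean.copy()
        xp[i] += h
        xm[i] -= h
        grad[i] = (borehole(xp) - borehole(xm)) / (2.0 * h)
    sd_first_order = np.sqrt(np.sum((grad * sd) ** 2))

    # Monte Carlo reference
    rng = np.random.default_rng(8)
    samples = rng.normal(mean, sd, size=(100_000, mean.size))
    f_mc = borehole(samples.T)

    print(f"first-order: mean {borehole(mean):.1f}, sd {sd_first_order:.1f}")
    print(f"Monte Carlo: mean {f_mc.mean():.1f}, sd {f_mc.std():.1f}")
    ```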

  6. Physical characterization of explosive volcanic eruptions based on tephra deposits: Propagation of uncertainties and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Bonadonna, Costanza; Biass, Sébastien; Costa, Antonio

    2015-04-01

    Regardless of the recent advances in geophysical monitoring and real-time quantitative observations of explosive volcanic eruptions, the characterization of tephra deposits remains one of the largest sources of information on Eruption Source Parameters (ESPs) (i.e. plume height, erupted volume/mass, Mass Eruption Rate - MER, eruption duration, Total Grain-Size Distribution - TGSD). ESPs are crucial for the characterization of volcanic systems and for the compilation of comprehensive hazard scenarios but are naturally associated with various degrees of uncertainties that are traditionally not well quantified. Recent studies have highlighted the uncertainties associated with the estimation of ESPs mostly related to: i) the intrinsic variability of the natural system, ii) the observational error and iii) the strategies used to determine physical parameters. Here we review recent studies focused on the characterization of these uncertainties and we present a sensitivity analysis for the determination of ESPs and a systematic investigation to quantify the propagation of uncertainty applied to two case studies. In particular, we highlight the dependence of ESPs on specific observations used as input parameters (i.e. diameter of the largest clasts, thickness measurements, area of isopach contours, deposit density, downwind and crosswind range of isopleth maps, and empirical constants and wind speed for the determination of MER). The highest uncertainty is associated to the estimation of MER and eruption duration and is related to the determination of crosswind range of isopleth maps and the empirical constants used in the empirical parameterization relating MER and plume height. Given the exponential nature of the relation between MER and plume height, the propagation of uncertainty is not symmetrical, and both an underestimation of the empirical constant and an overestimation of plume height have the highest impact on the final outcome. A ± 20% uncertainty on thickness

  7. Uncertainty Analysis of Ozone-Depleting Substances: Mixing Ratios, EESC, ODPs, and GWPs

    NASA Astrophysics Data System (ADS)

    Velders, G. J.; Daniel, J. S.

    2013-12-01

    Important for the recovery of the ozone layer from depletion by ozone-depleting substances (ODSs) is the rate at which ODSs are removed from the atmosphere, that is, their lifetimes. Recently the WCRP/SPARC project conducted an assessment of lifetimes of ODSs [SPARC, 2013] and presented a new set of recommended lifetimes as well as their uncertainties. We present here a comprehensive uncertainty analysis of ODS mixing ratios, levels of equivalent effective stratospheric chlorine (EESC), radiative forcing, ozone depletion potentials (ODPs), and global warming potentials (GWPs), using the new lifetimes and their uncertainties as well as uncertainties on all other relevant parameters. Using a box model, the year EESC returns to pre-1980 levels, a metric commonly used to indicate a level of recovery for ODS induced ozone depletion, is 2048 for mid-latitudes based on the new lifetimes, which is 2 years later than that based on the lifetimes from WMO [2011]. The uncertainty in this return time is much larger than this change, however. The year EESC returns to pre-1980 levels ranges from 2038 to 2064 (95% CI) for mid-latitudes and 2060 to 2104 for the Antarctic. The largest contribution to these ranges comes from the uncertainties in the lifetimes, since the current atmospheric burden of CFCs is much larger than the amounts present in existing equipment or still being produced. The earlier end of the recovery times is comparable to the return time in a hypothetical scenario with a cease in anthropogenic ODS emissions in 2014. The upper end of the range corresponds with an extra emission of about 7 MtCFC-11-eq in 2015, or about twice the cumulative anthropogenic emissions of all ODSs from 2014 to 2050. Semi-empirical ODPs calculated using the lifetimes from SPARC [2013] are up to 25% lower than the data reported in WMO [2011] for most species, mainly as a result of the increase in the estimated lifetime of CFC-11. The ODP of Halon-2402 increases by 20%, while the only

  8. Adjoint-based uncertainty quantification and sensitivity analysis for reactor depletion calculations

    NASA Astrophysics Data System (ADS)

    Stripling, Hayes Franklin

    Depletion calculations for nuclear reactors model the dynamic coupling between the material composition and neutron flux and help predict reactor performance and safety characteristics. In order to be trusted as reliable predictive tools and inputs to licensing and operational decisions, the simulations must include an accurate and holistic quantification of errors and uncertainties in their outputs. Uncertainty quantification is a formidable challenge in large, realistic reactor models because of the large number of unknowns and myriad sources of uncertainty and error. We present a framework for performing efficient uncertainty quantification in depletion problems using an adjoint approach, with emphasis on high-fidelity calculations using advanced massively parallel computing architectures. This approach calls for a solution to two systems of equations: (a) the forward, engineering system that models the reactor, and (b) the adjoint system, which is mathematically related to but different from the forward system. We use the solutions of these systems to produce sensitivity and error estimates at a cost that does not grow rapidly with the number of uncertain inputs. We present the framework in a general fashion and apply it to both the source-driven and k-eigenvalue forms of the depletion equations. We describe the implementation and verification of solvers for the forward and adjoint equations in the PDT code, and we test the algorithms on realistic reactor analysis problems. We demonstrate a new approach for reducing the memory and I/O demands on the host machine, which can be overwhelming for typical adjoint algorithms. Our conclusion is that adjoint depletion calculations using full transport solutions are not only computationally tractable, they are the most attractive option for performing uncertainty quantification on high-fidelity reactor analysis problems.

  9. Uncertainty Analysis and Order-by-Order Optimization of Chiral Nuclear Interactions

    NASA Astrophysics Data System (ADS)

    Carlsson, B. D.; Ekström, A.; Forssén, C.; Strömberg, D. Fahlin; Jansen, G. R.; Lilja, O.; Lindby, M.; Mattsson, B. A.; Wendt, K. A.

    2016-01-01

    Chiral effective field theory (χEFT) provides a systematic approach to describe low-energy nuclear forces. Moreover, χEFT is able to provide well-founded estimates of statistical and systematic uncertainties—although this unique advantage has not yet been fully exploited. We fill this gap by performing an optimization and statistical analysis of all the low-energy constants (LECs) up to next-to-next-to-leading order. Our optimization protocol corresponds to a simultaneous fit to scattering and bound-state observables in the pion-nucleon, nucleon-nucleon, and few-nucleon sectors, thereby utilizing the full model capabilities of χEFT. Finally, we study the effect on other observables by demonstrating forward-error-propagation methods that can easily be adopted by future works. We employ mathematical optimization and implement automatic differentiation to attain efficient and machine-precise first- and second-order derivatives of the objective function with respect to the LECs. This is also vital for the regression analysis. We use power-counting arguments to estimate the systematic uncertainty that is inherent to χEFT, and we construct chiral interactions at different orders with quantified uncertainties. Statistical error propagation is compared with Monte Carlo sampling, showing that statistical errors are, in general, small compared to systematic ones. In conclusion, we find that a simultaneous fit to different sets of data is critical to (i) identify the optimal set of LECs, (ii) capture all relevant correlations, (iii) reduce the statistical uncertainty, and (iv) attain order-by-order convergence in χEFT. Furthermore, certain systematic uncertainties in the few-nucleon sector are shown to get substantially magnified in the many-body sector, in particular when varying the cutoff in the chiral potentials. The methodology and results presented in this paper open a new frontier for uncertainty quantification in ab initio nuclear theory.

  10. Heterogenic Solid Biofuel Sampling Methodology and Uncertainty Associated with Prompt Analysis

    PubMed Central

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Patiño, David; Collazo, Joaquín

    2010-01-01

    Accurate determination of the properties of biomass is of particular interest in studies on biomass combustion or cofiring. The aim of this paper is to develop a methodology for prompt analysis of heterogeneous solid fuels with an acceptable degree of accuracy. Special care must be taken with the sampling procedure to achieve an acceptable degree of error and low statistical uncertainty. A sampling and error determination methodology for prompt analysis is presented and validated. Two approaches for the propagation of errors are also given and some comparisons are made in order to determine which may be better in this context. Results show in general low, acceptable levels of uncertainty, demonstrating that the samples obtained in the process are representative of the overall fuel composition. PMID:20559506

  11. Recommendations for damping and treatment of modeling uncertainty in seismic analysis of CANDU nuclear power plant

    SciTech Connect

    Usmani, S.A.; Baughman, P.D.

    1996-12-01

    The seismic analysis of the CANDU nuclear power plant is governed by the Canadian Standard series N289. However, the dynamic analysis of some equipment and systems, such as the CANDU reactor and fueling machine, must treat unique components not directly covered by the broad recommendations of these standards. This paper looks at the damping values and treatment of modeling uncertainty recommended by CSA N289.3, the current state of knowledge and expert opinion as reflected in several current standards, testing results, and the unique aspects of the CANDU system. Damping values are recommended for the component parts of the CANDU reactor and fueling machine system: reactor building, calandria vault, calandria, fuel channel, pressure tube, fueling machine and support structure. Recommendations for treatment of modeling and other uncertainties are also presented.

  12. Implementation and Validation of Uncertainty Analysis of Available Energy and Available Power

    SciTech Connect

    Jon P. Christophersen; John L. Morrison; B. J. Schubert; Shawn Allred

    2007-04-01

    The Idaho National Laboratory does extensive testing and evaluation of state-of-the-art batteries and ultracapacitors for hybrid-electric vehicle applications as part of the FreedomCAR and Vehicle Technologies Program. Significant parameters of interest include Available Energy and Available Power. Documenting the uncertainty analysis of these derived parameters is a very complex problem. The error is an unknown combination of both linearity and offset; the analysis presented in this paper computes the uncertainty both ways and then adopts the more conservative result (the worst-case scenario). Each method requires the use of over 134 equations, some of which are derived and some of which are measured values. This includes the measurement device error (calibration error) and the bit resolution and analog noise error (standard deviation error). The implementation of these equations to acquire a closed-form answer was done using Matlab (an array-based programming language) and validated using Monte Carlo simulations.

  13. Uncertainty analysis for an effluent trading system in a typical nonpoint-sources-polluted watershed.

    PubMed

    Chen, Lei; Han, Zhaoxing; Wang, Guobo; Shen, Zhenyao

    2016-01-01

    Conventional effluent trading systems (ETSs) between point sources (PSs) and nonpoint sources (NPSs) are often unreliable because of the uncertain characteristics of NPSs. In this study, a new framework was established for PS-NPS ETSs, and a comprehensive analysis was conducted by quantifying the impacts of the uncertainties associated with the water assimilative capacity (WAC), NPS emissions, and measurement effectiveness. On the basis of these results, the uncertain characteristics of NPSs would result in a less cost-effective PS-NPS ETS during most hydrological periods, and a clear transition occurs from the WAC constraint to the water quality constraint if these stochastic factors are considered. Specifically, the emission uncertainty had a greater impact on PSs, but an increase in the emission or abatement uncertainty caused the abatement efforts to shift from NPSs toward PSs. Moreover, the error transitivity from the WAC to conventional ETS approaches is more obvious than that to the WEFZ-based ETS. When NPS emissions are relatively high, structural BMPs should be considered for trading, and vice versa. These results are critical to understanding the impacts of uncertainty on the functionality of PS-NPS ETSs and to providing a trade-off between the confidence level and abatement efforts. PMID:27406070

  14. Sensitivity and uncertainty analysis for Abreu & Johnson numerical vapor intrusion model.

    PubMed

    Ma, Jie; Yan, Guangxu; Li, Haiyan; Guo, Shaohui

    2016-03-01

    This study conducted one-at-a-time (OAT) sensitivity and uncertainty analysis for a numerical vapor intrusion model for nine input parameters, including soil porosity, soil moisture, soil air permeability, aerobic biodegradation rate, building depressurization, crack width, floor thickness, building volume, and indoor air exchange rate. Simulations were performed for three soil types (clay, silt, and sand), two source depths (3 and 8 m), and two source concentrations (1 and 400 g/m³). Model sensitivity and uncertainty for shallow and high-concentration vapor sources (3 m and 400 g/m³) are much smaller than for deep and low-concentration sources (8 m and 1 g/m³). For high-concentration sources, soil air permeability, indoor air exchange rate, and building depressurization (for highly permeable soil like sand) are key contributors to model output uncertainty. For low-concentration sources, soil porosity, soil moisture, aerobic biodegradation rate and soil gas permeability are key contributors to model output uncertainty. Another important finding is that impacts of aerobic biodegradation on vapor intrusion potential of petroleum hydrocarbons are negligible when vapor source concentration is high, because of insufficient oxygen supply that limits aerobic biodegradation activities. PMID:26619051
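
    A minimal one-at-a-time (OAT) loop of the kind described above might look as follows; the parameter names echo the abstract, but the ranges and the attenuation-factor model are invented placeholders rather than the actual numerical vapor intrusion model.

        import numpy as np

        # OAT sensitivity sketch: sweep each input over its range while holding the
        # others at baseline, and record the spread of the model output.
        baseline = {"soil_porosity": 0.35, "soil_moisture": 0.15,
                    "air_exchange_rate": 0.5, "building_depressurization": 2.0}
        ranges = {"soil_porosity": (0.25, 0.45), "soil_moisture": (0.05, 0.30),
                  "air_exchange_rate": (0.2, 1.0), "building_depressurization": (0.5, 5.0)}

        def attenuation_factor(p):          # placeholder for the vapor intrusion model
            return (1e-4 * p["soil_porosity"] * (1.0 - p["soil_moisture"])
                    * p["building_depressurization"] / p["air_exchange_rate"])

        for name in baseline:
            lo, hi = ranges[name]
            outs = [attenuation_factor({**baseline, name: v})
                    for v in np.linspace(lo, hi, 11)]
            print(f"{name:28s} output range: {min(outs):.2e} .. {max(outs):.2e}")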

  15. Error Analysis and Measurement Uncertainty for a Fiber Grating Strain-Temperature Sensor

    PubMed Central

    Tang, Jaw-Luen; Wang, Jian-Neng

    2010-01-01

    A fiber grating sensor capable of distinguishing between temperature and strain, using a reference and a dual-wavelength fiber Bragg grating, is presented. Error analysis and measurement uncertainty for this sensor are studied theoretically and experimentally. The measured root mean squared errors for temperature T and strain ε were estimated to be 0.13 °C and 6 με, respectively. The maximum errors for temperature and strain were calculated as 0.00155 T + 2.90 × 10−6 ε and 3.59 × 10−5 ε + 0.01887 T, respectively. Using the estimation of expanded uncertainty at 95% confidence level with a coverage factor of k = 2.205, temperature and strain measurement uncertainties were evaluated as 2.60 °C and 32.05 με, respectively. For the first time, to our knowledge, we have demonstrated the feasibility of estimating the measurement uncertainty for simultaneous strain-temperature sensing with such a fiber grating sensor. PMID:22163567
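
    The expanded-uncertainty calculation referred to above follows the usual pattern of combining standard uncertainty components in quadrature and multiplying by a coverage factor; the sketch below reuses the coverage factor k = 2.205 quoted in the abstract, but the component values are invented for illustration.

        import numpy as np

        # Combined and expanded uncertainty for a temperature reading (illustrative
        # component values, not the paper's budget).
        components = {
            "wavelength_fit": 0.6,   # deg C, standard uncertainties of individual sources
            "calibration":    0.9,
            "repeatability":  0.5,
        }
        u_combined = np.sqrt(sum(u ** 2 for u in components.values()))
        k = 2.205                    # coverage factor quoted in the abstract (~95% level)
        U_expanded = k * u_combined
        print(f"u_c = {u_combined:.2f} degC, U = {U_expanded:.2f} degC")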

  16. Uncertainty analysis for an effluent trading system in a typical nonpoint-sources-polluted watershed

    PubMed Central

    Chen, Lei; Han, Zhaoxing; Wang, Guobo; Shen, Zhenyao

    2016-01-01

    Conventional effluent trading systems (ETSs) between point sources (PSs) and nonpoint sources (NPSs) are often unreliable because of the uncertain characteristics of NPSs. In this study, a new framework was established for PS-NPS ETSs, and a comprehensive analysis was conducted by quantifying the impacts of the uncertainties associated with the water assimilative capacity (WAC), NPS emissions, and measurement effectiveness. On the basis of these results, the uncertain characteristics of NPSs would result in a less cost-effective PS-NPS ETS during most hydrological periods, and a clear transition occurs from the WAC constraint to the water quality constraint if these stochastic factors are considered. Specifically, the emission uncertainty had a greater impact on PSs, but an increase in the emission or abatement uncertainty caused the abatement efforts to shift from NPSs toward PSs. Moreover, the error transitivity from the WAC to conventional ETS approaches is more obvious than that to the WEFZ-based ETS. When NPS emissions are relatively high, structural BMPs should be considered for trading, and vice versa. These results are critical to understanding the impacts of uncertainty on the functionality of PS-NPS ETSs and to providing a trade-off between the confidence level and abatement efforts. PMID:27406070

  17. Uncertainty analysis of continuum scale ferroelectric energy landscapes using density functional theory

    NASA Astrophysics Data System (ADS)

    Oates, William S.; Miles, Paul; Leon, Lider; Smith, Ralph

    2016-04-01

    Density functional theory (DFT) provides exceptional predictions of material properties of ideal crystal structures such as elastic modulus and dielectric constants. This includes ferroelectric crystals where excellent predictions of spontaneous polarization, lattice strain, and elastic moduli have been predicted using DFT. Less analysis has focused on quantifying uncertainty of the energy landscape over a broad range of polarization states in ferroelectric materials. This is non-trivial because the degrees of freedom contained within a unit cell are reduced to a single vector order parameter which is normally polarization. For example, lead titanate contains five atoms and 15 degrees of freedom of atomic nuclei motion which contribute to the overall unit cell polarization. Bayesian statistics is used to identify the uncertainty and propagation of error of a continuum scale, Landau energy function for lead titanate. Uncertainty in different parameters is quantified and this uncertainty is propagated through the model to illustrate error propagation over the energy surface. Such results are shown to have an impact in integration of quantum simulations within a ferroelectric phase field continuum modeling framework.

  18. Uncertainty analysis for an effluent trading system in a typical nonpoint-sources-polluted watershed

    NASA Astrophysics Data System (ADS)

    Chen, Lei; Han, Zhaoxing; Wang, Guobo; Shen, Zhenyao

    2016-07-01

    Conventional effluent trading systems (ETSs) between point sources (PSs) and nonpoint sources (NPSs) are often unreliable because of the uncertain characteristics of NPSs. In this study, a new framework was established for PS-NPS ETSs, and a comprehensive analysis was conducted by quantifying the impacts of the uncertainties associated with the water assimilative capacity (WAC), NPS emissions, and measurement effectiveness. On the basis of these results, the uncertain characteristics of NPSs would result in a less cost-effective PS-NPS ETS during most hydrological periods, and a clear transition occurs from the WAC constraint to the water quality constraint if these stochastic factors are considered. Specifically, the emission uncertainty had a greater impact on PSs, but an increase in the emission or abatement uncertainty caused the abatement efforts to shift from NPSs toward PSs. Moreover, the error transitivity from the WAC to conventional ETS approaches is more obvious than that to the WEFZ-based ETS. When NPS emissions are relatively high, structural BMPs should be considered for trading, and vice versa. These results are critical to understanding the impacts of uncertainty on the functionality of PS-NPS ETSs and to providing a trade-off between the confidence level and abatement efforts.

  19. Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model

    NASA Astrophysics Data System (ADS)

    Urrego-Blanco, Jorge R.; Urban, Nathan M.; Hunke, Elizabeth C.; Turner, Adrian K.; Jeffery, Nicole

    2016-04-01

    Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. It is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.
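
    A variance-based first-order Sobol' index of the kind computed above can be estimated with a pick-freeze (Saltelli-style) scheme applied to a fast emulator; the sketch below uses a cheap stand-in function and invented parameters, not the CICE emulator itself.

        import numpy as np

        # Pick-freeze estimate of first-order Sobol' indices for a cheap surrogate.
        rng = np.random.default_rng(2)

        def emulator(x):                        # stand-in for a fast surrogate of the model
            return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

        n, d = 50_000, 3
        A = rng.uniform(0.0, 1.0, size=(n, d))
        B = rng.uniform(0.0, 1.0, size=(n, d))
        yA, yB = emulator(A), emulator(B)
        var_y = yA.var()

        S1 = []
        for i in range(d):
            AB_i = B.copy()
            AB_i[:, i] = A[:, i]                # parameter i taken from A, the rest from B
            S1.append(np.mean(yA * (emulator(AB_i) - yB)) / var_y)
        print(np.round(S1, 3))                  # first-order indices; larger means more influential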

  20. Uncertainty Optimization Applied to the Monte Carlo Analysis of Planetary Entry Trajectories

    NASA Technical Reports Server (NTRS)

    Olds, John; Way, David

    2001-01-01

    Recently, strong evidence of liquid water under the surface of Mars and a meteorite that might contain ancient microbes have renewed interest in Mars exploration. With this renewed interest, NASA plans to send spacecraft to Mars approx. every 26 months. These future spacecraft will return higher-resolution images, make precision landings, engage in longer-ranging surface maneuvers, and even return Martian soil and rock samples to Earth. Future robotic missions and any human missions to Mars will require precise entries to ensure safe landings near science objectives and pre-deployed assets. Potential sources of water and other interesting geographic features are often located near hazards, such as within craters or along canyon walls. In order for more accurate landings to be made, spacecraft entering the Martian atmosphere need to use lift to actively control the entry. This active guidance results in much smaller landing footprints. Planning for these missions will depend heavily on Monte Carlo analysis. Monte Carlo trajectory simulations have been used with a high degree of success in recent planetary exploration missions. These analyses ascertain the impact of off-nominal conditions during a flight and account for uncertainty. Uncertainties generally stem from limitations in manufacturing tolerances, measurement capabilities, analysis accuracies, and environmental unknowns. Thousands of off-nominal trajectories are simulated by randomly dispersing uncertainty variables and collecting statistics on forecast variables. The dependability of Monte Carlo forecasts, however, is limited by the accuracy and completeness of the assumed uncertainties. This is because Monte Carlo analysis is a forward-driven problem, beginning with the input uncertainties and proceeding to the forecast outputs. It lacks a mechanism to affect or alter the uncertainties based on the forecast results. If the results are unacceptable, the current practice is to use an iterative, trial

  1. Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems

    NASA Technical Reports Server (NTRS)

    He, Yuning; Davies, Misty Dawn

    2014-01-01

    The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.

  2. Bayesian uncertainty analysis for complex physical systems modelled by computer simulators with applications to tipping points

    NASA Astrophysics Data System (ADS)

    Caiado, C. C. S.; Goldstein, M.

    2015-09-01

    In this paper we present and illustrate basic Bayesian techniques for the uncertainty analysis of complex physical systems modelled by computer simulators. We focus on emulation and history matching and also discuss the treatment of observational errors and structural discrepancies in time series. We exemplify such methods using a four-box model for the thermohaline circulation. We show how these methods may be applied to systems containing tipping points and how to treat possible discontinuities using multiple emulators.

  3. Quantitative uncertainty analysis of Life Cycle Assessment for algal biofuel production.

    PubMed

    Sills, Deborah L; Paramita, Vidia; Franke, Michael J; Johnson, Michael C; Akabas, Tal M; Greene, Charles H; Tester, Jefferson W

    2013-01-15

    As a result of algae's promise as a renewable energy feedstock, numerous studies have used Life Cycle Assessment (LCA) to quantify the environmental performance of algal biofuels, yet there is no consensus of results among them. Our work, motivated by the lack of comprehensive uncertainty analysis in previous studies, uses a Monte Carlo approach to estimate ranges of expected values of LCA metrics by incorporating parameter variability with empirically specified distribution functions. Results show that large uncertainties exist at virtually all steps of the biofuel production process. Although our findings agree with a number of earlier studies on matters such as the need for wet lipid extraction, nutrients recovered from waste streams, and high energy coproducts, the ranges of reported LCA metrics show that uncertainty analysis is crucial for developing technologies, such as algal biofuels. In addition, the ranges of energy return on (energy) invested (EROI) values resulting from our analysis help explain the high variability in EROI values from earlier studies. Reporting results from LCA models as ranges, and not single values, will more reliably inform industry and policy makers on expected energetic and environmental performance of biofuels produced from microalgae. PMID:23237457

  4. Formal analysis of values under conditions of uncertainty applied to the siting of hazardous waste facilities

    SciTech Connect

    Hatfield, T.H.

    1985-01-01

    A major impediment to the siting of hazardous waste facilities is the reaction of the public to such a locally unwanted land use. The siting controversy is exacerbated by uncertainty in the information available to evaluate candidate sites. In response to this problem, a formal analysis of values has been developed that treats the siting of hazardous waste facilities as a multiattribute group decision under conditions of uncertainty. The design phase of the formal analysis includes the selection of participants, the structuring of objectives and criteria, and the verification of independence conditions that determine the aggregation rules for the evaluation criteria. The evaluation phase of the formal analysis utilizes a questionnaire to assess the following attitudes of each participant for each criterion: the relative weighting and ranking of each criterion; the participant's values under conditions of certainty for the various levels of a criterion, given as a measurable value function v(x); values under conditions of uncertainty, given as a von Neumann-Morgenstern utility function, u(x); and risk attitudes towards the uncertain performance of a site, given as the relative risk premium or RRP, where RRP = u(x) - v(x). The aggregation of individual evaluations is also a part of the evaluation phase.
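
    The relative risk premium defined above, RRP = u(x) - v(x), can be tabulated directly once v(x) and u(x) have been elicited for a criterion; the functions below are illustrative stand-ins for the questionnaire-derived curves, both rescaled to the unit interval.

        import numpy as np

        # Relative risk premium RRP(x) = u(x) - v(x) for one siting criterion.
        x = np.linspace(0.0, 1.0, 6)          # normalized level of the criterion
        v = x ** 0.5                          # measurable value function under certainty
        u = 1.0 - np.exp(-2.0 * x)            # von Neumann-Morgenstern utility
        u = u / u.max()                       # rescale so u(1) = 1, matching v
        rrp = u - v                           # sign pattern characterizes the risk attitude
        for xi, ri in zip(x, rrp):
            print(f"x = {xi:.1f}  RRP = {ri:+.3f}")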

  5. Wavelet-Monte Carlo Hybrid System for HLW Nuclide Migration Modeling and Sensitivity and Uncertainty Analysis

    SciTech Connect

    Nasif, Hesham; Neyama, Atsushi

    2003-02-26

    This paper presents results of an uncertainty and sensitivity analysis of the performance of the different barriers of high-level radioactive waste repositories. SUA is a tool that performs uncertainty and sensitivity analysis on the output of the Wavelet Integrated Repository System model (WIRS), which is developed to solve a system of nonlinear partial differential equations arising from the model formulation of radionuclide transport through the repository. SUA performs sensitivity analysis (SA) and uncertainty analysis (UA) on a sample output from Monte Carlo simulation. The sample is generated by WIRS and contains the values of the maximum release rate in the form of time series and the values of the input variables for a set of different simulations (runs), which are realized by varying the model input parameters. The Monte Carlo sample is generated with SUA as a pure random sample or using the Latin hypercube sampling technique. Tchebycheff and Kolmogorov confidence bounds are computed on the maximum release rate for UA, and effective non-parametric statistics are used to rank the influence of the model input parameters for SA. Based on the results, we point out parameters that have primary influences on the performance of the engineered barrier system of a repository. The parameters found to be key contributors to the release rate are the selenium and cesium distribution coefficients in both the geosphere and the major water conducting fault (MWCF), and the diffusion depth and water flow rate in the excavation-disturbed zone (EDZ).
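
    A minimal sketch of the sampling-and-ranking workflow described above (Latin hypercube sample, distribution-free Chebyshev-type bound, non-parametric influence ranking) is given below; the release-rate model and parameter ranges are invented placeholders, not WIRS or SUA, and scipy >= 1.7 is assumed for the qmc module.

        import numpy as np
        from scipy.stats import qmc, spearmanr

        rng_seed = 3
        sampler = qmc.LatinHypercube(d=2, seed=rng_seed)
        u = sampler.random(n=500)                       # LHS points on the unit square
        kd_se = 10 ** (-2 + 3 * u[:, 0])                # Se distribution coefficient, 1e-2..1e1
        kd_cs = 10 ** (-1 + 3 * u[:, 1])                # Cs distribution coefficient, 1e-1..1e2

        release = 1.0 / (1.0 + 5.0 * kd_se + 2.0 * kd_cs)   # placeholder release-rate model

        mean, std, n = release.mean(), release.std(ddof=1), release.size
        k = np.sqrt(20.0)                               # Chebyshev: >= 95% within k sigma
        print("mean release +/-", k * std / np.sqrt(n)) # conservative, distribution-free bound
        for name, x in [("Kd(Se)", kd_se), ("Kd(Cs)", kd_cs)]:
            rho = spearmanr(x, release).correlation     # non-parametric influence ranking
            print(name, "Spearman rho =", round(rho, 2))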

  6. Uncertainty and sensitivity analysis of early exposure results with the MACCS Reactor Accident Consequence Model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; McKay, M.D.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 34 imprecisely known input variables on the following reactor accident consequences are studied: number of early fatalities, number of cases of prodromal vomiting, population dose within 10 mi of the reactor, population dose within 1000 mi of the reactor, individual early fatality probability within 1 mi of the reactor, and maximum early fatality distance. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: scaling factor for horizontal dispersion, dry deposition velocity, inhalation protection factor for nonevacuees, groundshine shielding factor for nonevacuees, early fatality hazard function alpha value for bone marrow exposure, and scaling factor for vertical dispersion.

  7. Probabilistic Parameter Uncertainty Analysis of Single Input Single Output Control Systems

    NASA Technical Reports Server (NTRS)

    Smith, Brett A.; Kenny, Sean P.; Crespo, Luis G.

    2005-01-01

    The current standards for handling uncertainty in control systems use interval bounds for definition of the uncertain parameters. This approach gives no information about the likelihood of system performance, but simply gives the response bounds. When used in design, current methods such as μ-analysis can lead to overly conservative controller designs. With these methods, worst-case conditions are weighted equally with the most likely conditions. This research explores a unique approach for probabilistic analysis of control systems. Current reliability methods are examined, showing the strong areas of each in handling probability. A hybrid method is developed using these reliability tools for efficiently propagating probabilistic uncertainty through classical control analysis problems. The method developed is applied to classical response analysis as well as analysis methods that explore the effects of the uncertain parameters on stability and performance metrics. The benefits of using this hybrid approach for calculating the mean and variance of response cumulative distribution functions are shown. Results of the probabilistic analysis of a missile pitch control system and a non-collocated mass-spring system show the added information provided by this hybrid analysis.

  8. The Importance of Behavioral Thresholds and Objective Functions in Contaminant Transport Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Sykes, J. F.; Kang, M.; Thomson, N. R.

    2007-12-01

    The TCE release from The Lockformer Company in Lisle, Illinois resulted in a plume in a confined aquifer that is more than 4 km long and impacted more than 300 residential wells. Many of the wells are on the fringe of the plume and have concentrations that did not exceed 5 ppb. The settlement for the Chapter 11 bankruptcy protection of Lockformer involved the establishment of a trust fund that compensates individuals with cancers, with payments based on cancer type, estimated TCE concentration in the well, and the duration of exposure to TCE. The estimation of early arrival times and hence low-likelihood events is critical in the determination of the eligibility of an individual for compensation. Thus, an emphasis must be placed on the accuracy of the leading tail region in the likelihood distribution of possible arrival times at a well. The estimation of TCE arrival time, using a three-dimensional analytical solution, involved parameter estimation and uncertainty analysis. Parameters in the model included TCE source parameters, groundwater velocities, dispersivities and the TCE decay coefficient for both the confining layer and the bedrock aquifer. Numerous objective functions, which include the well-known L2-estimator, robust estimators (L1-estimators and M-estimators), penalty functions, and dead zones, were incorporated in the parameter estimation process to treat insufficiencies in both the model and observational data due to errors, biases, and limitations. The concept of equifinality was adopted and multiple maximum likelihood parameter sets were accepted if pre-defined physical criteria were met. The criteria ensured that a valid solution predicted TCE concentrations for all TCE impacted areas. Monte Carlo sampling is found to be inadequate for uncertainty analysis of this case study due to its inability to find parameter sets that meet the predefined physical criteria. Successful results are achieved using a Dynamically-Dimensioned Search sampling

  9. Uncertainty analysis of the Operational Simplified Surface Energy Balance (SSEBop) model at multiple flux tower sites

    NASA Astrophysics Data System (ADS)

    Chen, Mingshi; Senay, Gabriel B.; Singh, Ramesh K.; Verdin, James P.

    2016-05-01

    Evapotranspiration (ET) is an important component of the water cycle - ET from the land surface returns approximately 60% of the global precipitation back to the atmosphere. ET also plays an important role in energy transport among the biosphere, atmosphere, and hydrosphere. Current regional to global and daily to annual ET estimation relies mainly on surface energy balance (SEB) ET models or statistical and empirical methods driven by remote sensing data and various climatological databases. These models have uncertainties due to inevitable input errors, poorly defined parameters, and inadequate model structures. The eddy covariance measurements on water, energy, and carbon fluxes at the AmeriFlux tower sites provide an opportunity to assess the ET modeling uncertainties. In this study, we focused on uncertainty analysis of the Operational Simplified Surface Energy Balance (SSEBop) model for ET estimation at multiple AmeriFlux tower sites with diverse land cover characteristics and climatic conditions. The 8-day composite 1-km MODerate resolution Imaging Spectroradiometer (MODIS) land surface temperature (LST) was used as input land surface temperature for the SSEBop algorithms. The other input data were taken from the AmeriFlux database. Results of statistical analysis indicated that the SSEBop model performed well in estimating ET with an R2 of 0.86 between estimated ET and eddy covariance measurements at 42 AmeriFlux tower sites during 2001-2007. It was encouraging to see that the best performance was observed for croplands, where R2 was 0.92 with a root mean square error of 13 mm/month. The uncertainties or random errors from input variables and parameters of the SSEBop model led to monthly ET estimates with relative errors less than 20% across multiple flux tower sites distributed across different biomes. This uncertainty of the SSEBop model lies within the error range of other SEB models, suggesting systematic error or bias of the SSEBop model is within the

  10. Bayesian analysis of stage-fall-discharge rating curves and their uncertainties

    NASA Astrophysics Data System (ADS)

    Mansanarez, Valentin; Le Coz, Jérôme; Renard, Benjamin; Lang, Michel; Pierrefeu, Gilles; Le Boursicaud, Raphaël; Pobanz, Karine

    2016-04-01

    Stage-fall-discharge (SFD) rating curves are traditionally used to compute streamflow records at sites where the energy slope of the flow is variable due to variable backwater effects. Building on existing Bayesian approaches, we introduce an original hydraulics-based method for developing SFD rating curves used at twin gauge stations and estimating their uncertainties. Conventional power functions for channel and section controls are used, and transition to a backwater-affected channel control is computed based on a continuity condition, solved either analytically or numerically. The difference between the reference levels at the two stations is estimated as another uncertain parameter of the SFD model. The method proposed in this presentation incorporates information from both the hydraulic knowledge (equations of channel or section controls) and the information available in the stage-fall-discharge observations (gauging data). The obtained total uncertainty combines the parametric uncertainty and the remnant uncertainty related to the model of rating curve. This method provides a direct estimation of the physical inputs of the rating curve (roughness, width, slope bed, distance between twin gauges, etc.). The performance of the new method is tested using an application case affected by the variable backwater of a run-of-the-river dam: the Rhône river at Valence, France. In particular, a sensitivity analysis to the prior information and to the gauging dataset is performed. At that site, the stage-fall-discharge domain is well documented with gaugings conducted over a range of backwater affected and unaffected conditions. The performance of the new model was deemed to be satisfactory. Notably, transition to uniform flow when the overall range of the auxiliary stage is gauged is correctly simulated. The resulting curves are in good agreement with the observations (gaugings) and their uncertainty envelopes are acceptable for computing streamflow records. Similar

  11. A guide to uncertainty quantification and sensitivity analysis for cardiovascular applications.

    PubMed

    Eck, Vinzenz Gregor; Donders, Wouter Paulus; Sturdy, Jacob; Feinberg, Jonathan; Delhaas, Tammo; Hellevik, Leif Rune; Huberts, Wouter

    2016-08-01

    As we shift from population-based medicine towards a more precise patient-specific regime guided by predictions of verified and well-established cardiovascular models, an urgent question arises: how sensitive are the model predictions to errors and uncertainties in the model inputs? To make our models suitable for clinical decision-making, precise knowledge of prediction reliability is of paramount importance. Efficient and practical methods for uncertainty quantification (UQ) and sensitivity analysis (SA) are therefore essential. In this work, we explain the concepts of global UQ and global, variance-based SA along with two often-used methods that are applicable to any model without requiring model implementation changes: Monte Carlo (MC) and polynomial chaos (PC). Furthermore, we propose a guide for UQ and SA according to a six-step procedure and demonstrate it for two clinically relevant cardiovascular models: model-based estimation of the fractional flow reserve (FFR) and model-based estimation of the total arterial compliance (CT ). Both MC and PC produce identical results and may be used interchangeably to identify most significant model inputs with respect to uncertainty in model predictions of FFR and CT . However, PC is more cost-efficient as it requires an order of magnitude fewer model evaluations than MC. Additionally, we demonstrate that targeted reduction of uncertainty in the most significant model inputs reduces the uncertainty in the model predictions efficiently. In conclusion, this article offers a practical guide to UQ and SA to help move the clinical application of mathematical models forward. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26475178

  12. Spatial uncertainty analysis: Propagation of interpolation errors in spatially distributed models

    USGS Publications Warehouse

    Phillips, D.L.; Marks, D.G.

    1996-01-01

    In simulation modelling, it is desirable to quantify model uncertainties and provide not only point estimates for output variables but confidence intervals as well. Spatially distributed physical and ecological process models are becoming widely used, with runs being made over a grid of points that represent the landscape. This requires input values at each grid point, which often have to be interpolated from irregularly scattered measurement sites, e.g., weather stations. Interpolation introduces spatially varying errors which propagate through the model. We extended established uncertainty analysis methods to a spatial domain for quantifying spatial patterns of input variable interpolation errors and how they propagate through a model to affect the uncertainty of the model output. We applied this to a model of potential evapotranspiration (PET) as a demonstration. We modelled PET for three time periods in 1990 as a function of temperature, humidity, and wind on a 10-km grid across the U.S. portion of the Columbia River Basin. Temperature, humidity, and wind speed were interpolated using kriging from 700-1000 supporting data points. Kriging standard deviations (SD) were used to quantify the spatially varying interpolation uncertainties. For each of 5693 grid points, 100 Monte Carlo simulations were done, using the kriged values of temperature, humidity, and wind, plus random error terms determined by the kriging SDs and the correlations of interpolation errors among the three variables. For the spring season example, kriging SDs averaged 2.6 °C for temperature, 8.7% for relative humidity, and 0.38 m/s for wind. The resultant PET estimates had coefficients of variation (CVs) ranging from 14% to 27% for the 10-km grid cells. Maps of PET means and CVs showed the spatial patterns of PET with a measure of its uncertainty due to interpolation of the input variables. This methodology should be applicable to a variety of spatially distributed models using interpolated
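
    The per-grid-cell Monte Carlo step described above can be sketched as follows; the kriging SDs are taken from the abstract, while the error correlations and the PET formula are invented placeholders rather than the study's actual model.

        import numpy as np

        # One grid cell: perturb kriged temperature, humidity and wind with correlated
        # errors scaled by the kriging SDs, propagate through a PET-like function, and
        # report the coefficient of variation of the output.
        rng = np.random.default_rng(4)
        kriged = np.array([12.0, 55.0, 3.0])          # T (degC), RH (%), wind (m/s) at this cell
        sd = np.array([2.6, 8.7, 0.38])               # kriging SDs quoted in the abstract
        corr = np.array([[1.0, -0.3, 0.1],
                         [-0.3, 1.0, 0.0],
                         [0.1, 0.0, 1.0]])            # assumed error correlations
        cov = corr * np.outer(sd, sd)

        def pet(t, rh, wind):                         # placeholder evapotranspiration model
            return np.maximum(0.0, 0.3 * t * (1.0 - rh / 100.0) * (1.0 + 0.2 * wind))

        draws = rng.multivariate_normal(kriged, cov, size=10_000)
        vals = pet(draws[:, 0], draws[:, 1], draws[:, 2])
        print("PET CV = {:.0%}".format(vals.std() / vals.mean()))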

  13. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    PubMed Central

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987

  14. Reduced Uncertainties in the Flutter Analysis of the Aerostructures Test Wing

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Lung, Shun-fat

    2010-01-01

    Tuning the finite element model using measured data to minimize the model uncertainties is a challenging task in the area of structural dynamics. A test validated finite element model can provide a reliable flutter analysis to define the flutter placard speed to which the aircraft can be flown prior to flight flutter testing. Minimizing the difference between numerical and experimental results is a type of optimization problem. Through the use of the National Aeronautics and Space Administration Dryden Flight Research Center's (Edwards, California, USA) multidisciplinary design, analysis, and optimization tool to optimize the objective function and constraints; the mass properties, the natural frequencies, and the mode shapes are matched to the target data and the mass matrix orthogonality is retained. The approach in this study has been applied to minimize the model uncertainties for the structural dynamic model of the aerostructures test wing, which was designed, built, and tested at the National Aeronautics and Space Administration Dryden Flight Research Center. A 25-percent change in flutter speed has been shown after reducing the uncertainties.

  15. Reduced Uncertainties in the Flutter Analysis of the Aerostructures Test Wing

    NASA Technical Reports Server (NTRS)

    Pak, Chan-Gi; Lung, Shun Fat

    2011-01-01

    Tuning the finite element model using measured data to minimize the model uncertainties is a challenging task in the area of structural dynamics. A test validated finite element model can provide a reliable flutter analysis to define the flutter placard speed to which the aircraft can be flown prior to flight flutter testing. Minimizing the difference between numerical and experimental results is a type of optimization problem. Through the use of the National Aeronautics and Space Administration Dryden Flight Research Center's (Edwards, California) multidisciplinary design, analysis, and optimization tool to optimize the objective function and constraints; the mass properties, the natural frequencies, and the mode shapes are matched to the target data, and the mass matrix orthogonality is retained. The approach in this study has been applied to minimize the model uncertainties for the structural dynamic model of the aerostructures test wing, which was designed, built, and tested at the National Aeronautics and Space Administration Dryden Flight Research Center. A 25 percent change in flutter speed has been shown after reducing the uncertainties.

  16. Designing, operating and maintaining artificial recharge pond under uncertainty: a probabilistic risk analysis

    NASA Astrophysics Data System (ADS)

    Pedretti, D.; Sanchez-Vila, X.; Fernandez-Garcia, D.; Bolster, D.; Tartakovsky, D. M.; Barahona-Palomo, M.

    2011-12-01

    Decision makers require long-term effective hydraulic criteria to optimize the design of artificial recharge ponds. However, uncontrolled multiscale pore clogging effects in heterogeneous soils determine uncertainties which must be quantified. One of the most remarkable effects is the reduction of infiltration capacity over time, which affects the quantity and quality of the water recharging the aquifer. We developed a probabilistic (engineering) risk analysis where pore clogging is modeled as an exponential decay with time and where clogging mechanisms are differently sensitive to some properties of the soils, which are heterogeneously organized in space. We studied both a real case and some synthetic infiltration ponds. The risk is defined as the infiltration capacity dropping below a target value at a specific time after the facility begins operating. We can account for a variety of maintenance strategies that target different clogging mechanisms. In our analysis, physical clogging mechanisms induce the greatest uncertainty, and maintenance targeted at these mechanisms can yield optimal results. However, considering the fundamental role of the spatial variability in the initial properties, we conclude that an adequate initial characterization of the surface infiltration ponds is strategically critical to determine the degree of uncertainty of different maintenance solutions and thus to make cost-effective and reliable decisions.
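
    The risk metric defined above (the probability that infiltration capacity has dropped below a target at a given time, with exponential clogging decay) can be estimated by simple Monte Carlo; all distributions and numbers below are illustrative assumptions, not the study's data.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 50_000
        I0 = rng.lognormal(mean=np.log(1.0), sigma=0.3, size=n)    # initial capacity (m/day)
        lam = rng.lognormal(mean=np.log(0.01), sigma=0.5, size=n)  # clogging decay rate (1/day)
        T, target = 365.0, 0.2                                     # horizon (days) and target (m/day)

        infiltration_T = I0 * np.exp(-lam * T)                     # exponential decay of capacity
        risk = np.mean(infiltration_T < target)                    # probability of missing the target
        print(f"P(infiltration at day {T:.0f} < {target} m/day) = {risk:.2f}")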

  17. Predictive uncertainty analysis of a saltwater intrusion model using null-space Monte Carlo

    USGS Publications Warehouse

    Herckenrath, Daan; Langevin, Christian D.; Doherty, John

    2011-01-01

    Because of the extensive computational burden and perhaps a lack of awareness of existing methods, rigorous uncertainty analyses are rarely conducted for variable-density flow and transport models. For this reason, a recently developed null-space Monte Carlo (NSMC) method for quantifying prediction uncertainty was tested for a synthetic saltwater intrusion model patterned after the Henry problem. Saltwater intrusion caused by a reduction in fresh groundwater discharge was simulated for 1000 randomly generated hydraulic conductivity distributions, representing a mildly heterogeneous aquifer. From these 1000 simulations, the hydraulic conductivity distribution giving rise to the most extreme case of saltwater intrusion was selected and was assumed to represent the "true" system. Head and salinity values from this true model were then extracted and used as observations for subsequent model calibration. Random noise was added to the observations to approximate realistic field conditions. The NSMC method was used to calculate 1000 calibration-constrained parameter fields. If the dimensionality of the solution space was set appropriately, the estimated uncertainty range from the NSMC analysis encompassed the truth. Several variants of the method were implemented to investigate their effect on the efficiency of the NSMC method. Reducing the dimensionality of the null-space for the processing of the random parameter sets did not result in any significant gains in efficiency and compromised the ability of the NSMC method to encompass the true prediction value. The addition of intrapilot point heterogeneity to the NSMC process was also tested. According to a variogram comparison, this provided the same scale of heterogeneity that was used to generate the truth. However, incorporation of intrapilot point variability did not make a noticeable difference to the uncertainty of the prediction. With this higher level of heterogeneity, however, the computational burden of

  18. Uncertainty analysis of projections of ozone-depleting substances: mixing ratios, EESC, ODPs, and GWPs

    NASA Astrophysics Data System (ADS)

    Velders, G. J. M.; Daniel, J. S.

    2014-03-01

    The rates at which ozone-depleting substances (ODSs) are removed from the atmosphere, which determine the lifetimes of these ODSs, are key factors for determining the rate of ozone layer recovery in the coming decades. We present here a comprehensive uncertainty analysis of future mixing ratios of ODSs, levels of equivalent effective stratospheric chlorine (EESC), ozone depletion potentials, and global warming potentials (GWPs), using, among other information, the 2013 WCRP/SPARC (World Climate Research Programme/Stratospheric Processes and their Role in Climate) assessment of lifetimes of ODSs and their uncertainties. The year EESC returns to pre-1980 levels, a metric commonly used to indicate a level of recovery from ODS-induced ozone depletion, is 2048 for midlatitudes and 2075 for Antarctic conditions based on the lifetimes from the SPARC assessment, which is about 2 and 4 yr, respectively, later than based on the lifetimes from the WMO (World Meteorological Organization) assessment of 2011. However, the uncertainty in this return to 1980 levels is much larger than the shift due to this change in lifetimes. The year EESC returns to pre-1980 levels ranges from 2039 to 2064 (95% confidence interval) for midlatitudes and from 2061 to 2105 for the Antarctic spring. The primary contribution to these ranges comes from the uncertainty in the lifetimes, with smaller contributions from uncertainties in other modeled parameters. The earlier years of the return estimates derived by the uncertainty analysis, i.e., 2039 for midlatitudes and 2061 for Antarctic spring, are comparable to a hypothetical scenario in which emissions of ODSs cease in 2014. The later end of the range, i.e., 2064 for midlatitudes and 2105 for Antarctic spring, can also be obtained by a scenario with an additional emission of about 7 Mt CFC-11 eq. (eq. - equivalent) in 2015, which is the same as about 2 times the projected cumulative anthropogenic emissions of all ODSs from 2014 to 2050, or about 12

  19. Uncertainty analysis of least-cost modeling for designing wildlife linkages.

    PubMed

    Beier, Paul; Majka, Daniel R; Newell, Shawn L

    2009-12-01

    Least-cost models for focal species are widely used to design wildlife corridors. To evaluate the least-cost modeling approach used to develop 15 linkage designs in southern California, USA, we assessed the robustness of the largest and least constrained linkage. Species experts parameterized models for eight species with weights for four habitat factors (land cover, topographic position, elevation, road density) and resistance values for each class within a factor (e.g., each class of land cover). Each model produced a proposed corridor for that species. We examined the extent to which uncertainty in factor weights and class resistance values affected two key conservation-relevant outputs, namely, the location and modeled resistance to movement of each proposed corridor. To do so, we compared the proposed corridor to 13 alternative corridors created with parameter sets that spanned the plausible ranges of biological uncertainty in these parameters. Models for five species were highly robust (mean overlap 88%, little or no increase in resistance). Although the proposed corridors for the other three focal species overlapped as little as 0% (mean 58%) of the alternative corridors, resistance in the proposed corridors for these three species was rarely higher than resistance in the alternative corridors (mean difference was 0.025 on a scale of 1-10; worst difference was 0.39). As long as the model had the correct rank order of resistance values and factor weights, our results suggest that the predicted corridor is robust to uncertainty. The three carnivore focal species, alone or in combination, were not effective umbrellas for the other focal species. The carnivore corridors failed to overlap the predicted corridors of most other focal species and provided relatively high resistance for the other focal species (mean increase of 2.7 resistance units). Least-cost modelers should conduct uncertainty analysis so that decision-makers can appreciate the potential impact of

  20. Advanced Simulation Capability for Environmental Management (ASCEM): Developments in Uncertainty Quantification and Sensitivity Analysis.

    NASA Astrophysics Data System (ADS)

    McKinney, S. W.

    2015-12-01

    Effectiveness of uncertainty quantification (UQ) and sensitivity analysis (SA) has been improved in ASCEM by choosing from a variety of methods to best suit each model. Previously, ASCEM had a small toolset for UQ and SA, leaving out the benefits of the many methods not included. Many UQ and SA methods are useful only for models with specific characteristics; therefore, programming each of these methods directly into ASCEM would have been inefficient. Embedding the R programming language into ASCEM grants access to a plethora of UQ and SA methods. As a result, the programming required is drastically decreased, while runtime efficiency and analysis effectiveness are increased for each unique model.

  1. Idealization, uncertainty and heterogeneity : game frameworks defined with formal concept analysis.

    SciTech Connect

    Racovitan, M. T.; Sallach, D. L.; Decision and Information Sciences; Northern Illinois Univ.

    2006-01-01

    The present study begins with Formal Concept Analysis, and undertakes to demonstrate how a succession of game frameworks may, by design, address increasingly complex and interesting social phenomena. We develop a series of multi-agent exchange games, each of which incorporates an additional dimension of complexity. All games are based on coalition patterns in exchanges where diverse cultural markers provide a basis for trust and reciprocity. The first game is characterized by an idealized concept of trust. A second game framework introduces uncertainty regarding the reciprocity of prospective transactions. A third game framework retains idealized trust and uncertainty, and adds additional agent heterogeneity. Cultural markers are not equally salient in conferring or withholding trust, and the result is a richer transactional process.

  2. Uncertainty Analysis for a De-pressurised Loss of Forced Cooling Event of the PBMR Reactor

    SciTech Connect

    Jansen van Rensburg, Pieter A.; Sage, Martin G.

    2006-07-01

    This paper presents an uncertainty analysis for a De-pressurised Loss of Forced Cooling (DLOFC) event that was performed with the systems CFD (Computational Fluid Dynamics) code Flownex for the PBMR reactor. An uncertainty analysis was performed to determine the variation in maximum fuel, core barrel and reactor pressure vessel (RPV) temperature due to variations in model input parameters. Some of the input parameters that were varied are: thermo-physical properties of helium and the various solid materials, decay heat, neutron and gamma heating, pebble bed pressure loss, pebble bed Nusselt number and pebble bed bypass flows. The Flownex model of the PBMR reactor is a 2-dimensional axisymmetrical model. It is simplified in terms of geometry and some other input values. However, it is believed that the model adequately indicates the effect of changes in certain input parameters on the fuel temperature and other components during a DLOFC event. Firstly, a sensitivity study was performed where input variables were varied individually according to predefined uncertainty ranges and the results were sorted according to the effect on maximum fuel temperature. In the sensitivity study, only seven variables had a significant effect on the maximum fuel temperature (greater than 5 deg. C). The most significant are power distribution profile, decay heat, reflector properties and effective pebble bed conductivity. Secondly, Monte Carlo analyses were performed in which twenty variables were varied simultaneously within predefined uncertainty ranges. For a one-tailed 95% confidence level, the conservatism that should be added to the best estimate calculation of the maximum fuel temperature for a DLOFC was determined as 53 deg. C. This value will probably increase after some model refinements in the future. Flownex was found to be a valuable tool for uncertainty analyses, facilitating both sensitivity studies and Monte Carlo analyses. (authors)

  3. Polynomial chaos expansions for uncertainty propagation and moment independent sensitivity analysis of seawater intrusion simulations

    NASA Astrophysics Data System (ADS)

    Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad; Simmons, Craig T.

    2015-01-01

    Real-world models of seawater intrusion (SWI) require high computational effort. This creates computational difficulties for the uncertainty propagation (UP) analysis of these models due to the need for repeated numerical simulations in order to adequately capture the underlying statistics that describe the uncertainty in model outputs. Moreover, despite the obvious advantages of moment-independent global sensitivity analysis (SA) methods, these methods have rarely been employed for SWI and other complex groundwater models. The reason is that moment-independent global SA methods involve repeated UP analysis, which becomes even more computationally demanding. This study proposes the use of non-intrusive polynomial chaos expansions (PCEs) as a means to significantly accelerate UP analysis in SWI numerical modeling studies and shows that despite the highly non-linear and non-smooth input/output relationship that exists in SWI models, non-intrusive PCEs provide a reliable and yet computationally efficient surrogate of the original numerical model. The study illustrates that for the considered two- and six-dimensional UP problems, PCEs offer a more accurate estimation of the statistics describing the uncertainty in model outputs compared to Monte Carlo simulations based on the original numerical model. This study also shows that the use of non-intrusive PCEs in the estimation of the moment-independent sensitivity indices (i.e. delta indices) decreases the computational time by several orders of magnitude without causing significant loss of accuracy. The use of non-intrusive PCEs for the generation of SWI hazard maps is proposed to extend the practical applications of UP analysis in coastal aquifer management studies.
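
    A non-intrusive PCE of the kind described above can be built by least-squares regression on a modest number of model runs and then evaluated cheaply for uncertainty propagation; the sketch below fits a degree-2 Hermite chaos in two standard-normal inputs to a placeholder function, not an actual SWI simulator.

        import numpy as np
        from numpy.polynomial.hermite_e import hermeval

        rng = np.random.default_rng(6)

        def model(x1, x2):                          # stand-in for an expensive SWI simulation
            return np.exp(0.3 * x1) + 0.5 * x1 * x2 + 0.1 * x2 ** 2

        def basis(x1, x2):                          # Hermite tensor basis, total degree <= 2
            he = lambda n, x: hermeval(x, [0] * n + [1])
            return np.column_stack([np.ones_like(x1), he(1, x1), he(1, x2),
                                    he(2, x1), he(2, x2), he(1, x1) * he(1, x2)])

        x_train = rng.standard_normal((60, 2))      # 60 "expensive" runs for the regression
        y_train = model(x_train[:, 0], x_train[:, 1])
        coef, *_ = np.linalg.lstsq(basis(x_train[:, 0], x_train[:, 1]), y_train, rcond=None)

        x_big = rng.standard_normal((100_000, 2))   # cheap surrogate evaluations
        y_pce = basis(x_big[:, 0], x_big[:, 1]) @ coef
        y_ref = model(x_big[:, 0], x_big[:, 1])
        print("mean:", y_pce.mean(), "vs", y_ref.mean(), " std:", y_pce.std(), "vs", y_ref.std())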

  4. Sensitivity and uncertainty analysis within a methodology for evaluating environmental restoration technologies

    NASA Astrophysics Data System (ADS)

    Zio, Enrico; Apostolakis, George E.

    1999-03-01

    This paper illustrates an application of sensitivity and uncertainty analysis techniques within a methodology for evaluating environmental restoration technologies. The methodology consists of two main parts: the first part ("analysis") integrates a wide range of decision criteria and impact evaluation techniques in a framework that emphasizes and incorporates input from stakeholders in all aspects of the process. Its products are the rankings of the alternative options for each stakeholder using, essentially, expected utility theory. The second part ("deliberation") utilizes the analytical results of the "analysis" and attempts to develop consensus among the stakeholders in a session in which the stakeholders discuss and evaluate the analytical results. This paper deals with the analytical part of the approach and the uncertainty and sensitivity analyses that were carried out in preparation for the deliberative process. The objective of these investigations was that of testing the robustness of the assessments and of pointing out possible existing sources of disagreements among the participating stakeholders, thus providing insights for the successive deliberative process. Standard techniques, such as differential analysis, Monte Carlo sampling and a two-dimensional policy region analysis proved sufficient for the task.

  5. Global uncertainty assessment in hydrological forecasting by means of statistical analysis of forecast errors

    NASA Astrophysics Data System (ADS)

    Montanari, A.; Grossi, G.

    2007-12-01

    It is well known that uncertainty assessment in hydrological forecasting is a topical issue. Already in 1905 W.E. Cooke, who was issuing daily weather forecasts in Australia, stated: "It seems to me that the condition of confidence or otherwise form a very important part of the prediction, and ought to find expression". Uncertainty assessment in hydrology involves the analysis of multiple sources of error. The contribution of these error sources to the global uncertainty cannot be quantified independently, unless (a) one is willing to introduce subjective assumptions about the nature of the individual error components or (b) independent observations are available for estimating input error, model error, parameter error and state error. An alternative approach, which is applied in this study and still requires the introduction of some assumptions, is to quantify the global hydrological uncertainty in an integrated way, without attempting to quantify each independent contribution. This methodology can be applied in situations characterized by limited data availability and is therefore gaining increasing attention from end users. This work proposes a statistically based approach for assessing the global uncertainty in hydrological forecasting, by building a statistical model for the forecast error x_{t,d}, where t is the forecast time and d is the lead time. Accordingly, the probability distribution of x_{t,d} is inferred through a nonlinear multiple regression, depending on an arbitrary number of selected conditioning variables. These include the current forecast issued by the hydrological model, the past forecast error and internal state variables of the model. The final goal is to indirectly relate the forecast error to the sources of uncertainty, through a probabilistic link with the conditioning variables. Any statistical model is based on assumptions whose fulfilment is to be checked in order to assure the validity of the underlying theory. Statistical
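
    A minimal sketch of the general idea (synthetic data; the conditioning variables, their functional form, and the Gaussian residual assumption are illustrative, not the authors' specification): regress the forecast error on the current forecast and the previous error, then derive an approximate uncertainty band from the residual spread.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 500
        q_hat = rng.gamma(2.0, 10.0, n)      # synthetic hydrological forecasts
        e_prev = rng.normal(0.0, 2.0, n)     # synthetic lagged forecast errors
        err = 0.05 * q_hat + 0.4 * e_prev + rng.normal(0.0, 1.0 + 0.02 * q_hat)

        # Regression of the error on the conditioning variables; quadratic terms give
        # a simple nonlinear-in-inputs model
        A = np.column_stack([np.ones(n), q_hat, e_prev, q_hat**2, q_hat * e_prev])
        beta, *_ = np.linalg.lstsq(A, err, rcond=None)
        resid = err - A @ beta

        def error_interval(q, e, z=1.96):
            # approximate 95% band for the forecast error, conditioned on (q, e)
            x = np.array([1.0, q, e, q**2, q * e])
            mu = x @ beta
            return mu - z * resid.std(), mu + z * resid.std()

        print(error_interval(25.0, 1.0))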

  6. Uncertainty quantification based on pillars of experiment, theory, and computation. Part I: Data analysis

    NASA Astrophysics Data System (ADS)

    Elishakoff, I.; Sarlin, N.

    2016-06-01

    In this paper we provide a general methodology for the analysis and design of systems involving uncertainties. Available experimental data are enclosed by geometric figures (triangle, rectangle, ellipse, parallelogram, super ellipse) of minimum area. These areas are then inflated by resorting to the Chebyshev inequality in order to take into account forecasted data. The next step consists of evaluating the response of the system when uncertainties are confined to one of the above five suitably inflated geometric figures. This step involves a combined theoretical and computational analysis. We evaluate the maximum response of the system subjected to variation of uncertain parameters in each hypothesized region. The results of triangular, interval, ellipsoidal, parallelogram, and super-ellipsoidal calculi are compared with a view to identifying the region that leads to the minimum of the maximum response. That response is identified as a result of the suggested predictive inference. The methodology thus synthesizes the probabilistic notion with each of the five calculi. Use of the term "pillar" in the title was inspired by the News Release (2013) on the award of the Honda Prize to J. Tinsley Oden, stating, among other things, that "Dr. Oden refers to computational science as the "third pillar" of scientific inquiry, standing beside theoretical and experimental science. Computational science serves as a new paradigm for acquiring knowledge and informing decisions important to humankind". Analysis of systems with uncertainties necessitates employment of all three pillars. The analysis is based on the assumption that the five shapes are each different conservative estimates of the true bounding region. The smallest of the maximal displacements in the x and y directions (for a 2D system) therefore provides the closest estimate of the true displacements based on the above assumption.
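
    A minimal sketch of the min-of-max idea for a linear response and two of the candidate enclosures (interval box and axis-aligned ellipse); the data, response vector, and enclosure construction are illustrative assumptions, not the paper's procedure:

        import numpy as np

        rng = np.random.default_rng(2)
        data = rng.normal([1.0, 2.0], [0.2, 0.3], size=(30, 2))  # scattered measurements
        g = np.array([3.0, -1.5])                                # assumed linear response u = g . x

        # Interval (rectangle) enclosure: the maximum of a linear form sits at a vertex
        lo, hi = data.min(axis=0), data.max(axis=0)
        box_max = np.sum(np.where(g > 0, g * hi, g * lo))

        # Axis-aligned ellipse with the same centre and half-widths: analytic maximum
        c = 0.5 * (lo + hi)
        a = 0.5 * (hi - lo)
        ellipse_max = g @ c + np.sqrt(np.sum((g * a) ** 2))

        print("box max:", box_max, "ellipse max:", ellipse_max,
              "-> smaller (preferred) bound:", min(box_max, ellipse_max))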

  7. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping

    PubMed Central

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-01-01

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster–Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster–Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC
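
    A minimal sketch of the first two stages described above for the WLC case only (criterion layers, nominal weights, and the weight-error model are all illustrative assumptions): criterion weights are perturbed by Monte Carlo simulation, and the spread of the weighted linear combination score gives a per-cell uncertainty measure.

        import numpy as np

        rng = np.random.default_rng(3)
        n_cells, n_criteria, n_sim = 1000, 4, 500
        criteria = rng.random((n_cells, n_criteria))   # standardized criterion layers (0-1)
        w_mean = np.array([0.4, 0.3, 0.2, 0.1])        # nominal weights (e.g. from AHP)
        w_sd = 0.05                                    # assumed weight uncertainty

        scores = np.empty((n_sim, n_cells))
        for k in range(n_sim):
            w = np.clip(rng.normal(w_mean, w_sd), 0.0, None)
            w /= w.sum()                               # weights kept summing to one
            scores[k] = criteria @ w                   # WLC susceptibility score per cell

        # Per-cell mean susceptibility and its Monte Carlo uncertainty
        print(scores.mean(axis=0)[:5])
        print(scores.std(axis=0)[:5])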

  8. Using time-varying global sensitivity analysis to understand the importance of different uncertainty sources in hydrological modelling

    NASA Astrophysics Data System (ADS)

    Pianosi, Francesca; Wagener, Thorsten

    2016-04-01

    Simulations from environmental models are affected by potentially large uncertainties stemming from various sources, including model parameters and observational uncertainty in the input/output data. Understanding the relative importance of such sources of uncertainty is essential to support model calibration, validation and diagnostic evaluation, and to prioritize efforts for uncertainty reduction. Global Sensitivity Analysis (GSA) provides the theoretical framework and the numerical tools to gain this understanding. However, in traditional applications of GSA, model outputs are an aggregation of the full set of simulated variables. This aggregation of propagated uncertainties prior to GSA may lead to a significant loss of information and may cover up local behaviour that could be of great interest. In this work, we propose a time-varying version of a recently developed density-based GSA method, called PAWN, as a viable option to reduce this loss of information. We apply our approach to a medium-complexity hydrological model in order to address two questions: [1] Can we distinguish between the relative importance of parameter uncertainty versus data uncertainty in time? [2] Do these influences change in catchments with different characteristics? The results present the first quantitative investigation on the relative importance of parameter and data uncertainty across time. They also provide a demonstration of the value of time-varying GSA to investigate the propagation of uncertainty through numerical models and therefore guide additional data collection needs and model calibration/assessment.
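
    A minimal, static (not time-varying) sketch of the density-based PAWN index on an assumed test function: for each input, the index is a summary statistic (here the median) of the Kolmogorov-Smirnov distance between the unconditional output distribution and the distributions obtained with that input fixed.

        import numpy as np
        from scipy.stats import ks_2samp

        def model(x):
            # stand-in for the simulator output (illustrative test function)
            return np.sin(x[:, 0]) + 5.0 * x[:, 1] ** 2 + 0.1 * x[:, 2]

        rng = np.random.default_rng(4)
        d, n_unc, n_cond, n_c = 3, 2000, 300, 10
        X = rng.random((n_unc, d))
        y_unc = model(X)                           # unconditional output sample

        def pawn_index(i):
            ks = []
            for xc in rng.random(n_c):             # conditioning values for input i
                Xc = rng.random((n_cond, d))
                Xc[:, i] = xc                      # fix input i, vary the others
                ks.append(ks_2samp(y_unc, model(Xc)).statistic)
            return np.median(ks)

        print([round(pawn_index(i), 3) for i in range(d)])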

  9. Uncertainty Modeling for Robustness Analysis of Control Upset Prevention and Recovery Systems

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Khong, Thuan H.; Shin, Jong-Yeob; Kwatny, Harry; Chang, Bor-Chin; Balas, Gary J.

    2005-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems (developed for failure detection, identification, and reconfiguration, as well as upset recovery) need to be evaluated over broad regions of the flight envelope and under extreme flight conditions, and should include various sources of uncertainty. However, formulation of linear fractional transformation (LFT) models for representing system uncertainty can be very difficult for complex parameter-dependent systems. This paper describes a preliminary LFT modeling software tool which uses a matrix-based computational approach that can be directly applied to parametric uncertainty problems involving multivariate matrix polynomial dependencies. Several examples are presented (including an F-16 at an extreme flight condition, a missile model, and a generic example with numerous crossproduct terms), and comparisons are given with other LFT modeling tools that are currently available. The LFT modeling method and preliminary software tool presented in this paper are shown to compare favorably with these methods.

  10. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harper, F.T.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  11. Estimating uncertainty in policy analysis: health effects from inhaled sulfur oxides

    SciTech Connect

    Amaral, D.A.L.

    1983-01-01

    This study presents methods for the incorporation of uncertainty into quantitative analysis of the problem of estimating health risks from coal-fired power plants. Probabilistic long-range models of sulfur material balance and sets of plume trajectories are combined to produce probabilistic estimates of population exposure to sulfur air pollution for the addition of a hypothetical coal-burning power plant in the Ohio River Valley. In another segment, the change in population exposure which might occur if ambient sulfate were to be reduced everywhere in the northeastern United States is calculated. A third case is made up of a set of hypothetical urban and rural scenarios representing typical northeastern situations. Models of health impacts obtained through the elicitation of subjective expert judgment are applied to each of these population exposure estimates. Seven leading experts in the field of sulfur air pollution and health participated, yielding five quantitative models for morbidity and/or mortality effects from human exposure to ambient sulfate. In each case analyzed, the predictions based on probability distributions provided by the experts spanned several orders of magnitude, including some predictions of zero effects and some of up to a few percent of the total mortality. It is concluded that uncertainty about whether sulfate has adverse effects dominates the scientific uncertainty about the atmospheric processes which generate and transport this pollutant.

  12. pyNSMC: A Python Module for Null-Space Monte Carlo Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    White, J.; Brakefield, L. K.

    2015-12-01

    The null-space Monte Carlo technique is a nonlinear uncertainty analysis technique that is well suited to high-dimensional inverse problems. While the technique is powerful, the existing workflow for completing null-space Monte Carlo is cumbersome, requiring the use of multiple command-line utilities, several sets of intermediate files, and even a text editor. pyNSMC is an open-source Python module that automates the workflow of null-space Monte Carlo uncertainty analyses. The module is fully compatible with the PEST and PEST++ software suites and leverages existing functionality of pyEMU, a Python framework for linear-based uncertainty analyses. pyNSMC greatly simplifies the existing workflow for null-space Monte Carlo by taking advantage of object-oriented design facilities in Python. The core of pyNSMC is the ensemble class, which draws and stores realized random vectors and also provides functionality for exporting and visualizing results. By relieving users of the tedium associated with file handling and command-line utility execution, pyNSMC instead focuses the user on the important steps and assumptions of null-space Monte Carlo analysis. Furthermore, pyNSMC facilitates learning through flow charts and results visualization, which are available at many points in the algorithm. The ease of use of the pyNSMC workflow is compared to the existing workflow for null-space Monte Carlo for a synthetic groundwater model with hundreds of estimable parameters.
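
    A minimal sketch of the null-space Monte Carlo idea itself (this is not the pyNSMC API; the Jacobian, rank cutoff, and draw distribution are illustrative): random parameter vectors are projected onto the null space of the Jacobian so that each realization departs from the calibrated parameters only in directions the calibration data cannot constrain.

        import numpy as np

        rng = np.random.default_rng(5)
        n_obs, n_par, n_real = 20, 50, 100
        J = rng.normal(size=(n_obs, n_par))    # Jacobian of observations w.r.t. parameters
        p_cal = rng.normal(size=n_par)         # calibrated parameter vector

        # Split parameter space into solution and null space via the SVD
        U, s, Vt = np.linalg.svd(J, full_matrices=True)
        rank = int(np.sum(s > 1e-8 * s[0]))
        V_null = Vt[rank:].T                   # basis of the calibration null space

        # Stochastic draws with their solution-space component removed
        draws = rng.normal(size=(n_real, n_par))
        realizations = p_cal + (draws - p_cal) @ V_null @ V_null.T
        print(realizations.shape)              # each row is a calibration-compatible realization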

  13. Analysis of parameter uncertainty of a flow and quality stormwater model.

    PubMed

    Dotto, C B S; Deletic, A; Fletcher, T D

    2009-01-01

    Uncertainty is intrinsic to all monitoring programs and all models. It cannot realistically be eliminated, but it is necessary to understand the sources of uncertainty, and their consequences on models and decisions. The aim of this paper is to evaluate uncertainty in a flow and water quality stormwater model, due to the model parameters and the availability of data for calibration and validation of the flow model. The MUSIC model, widely used in Australian stormwater practice, has been investigated. Frequentist and Bayesian methods were used for calibration and sensitivity analysis, respectively. It was found that out of 13 calibration parameters of the rainfall/runoff model, only two matter (the model results were not sensitive to the other 11). This suggests that the model can be simplified without losing its accuracy. The evaluation of the water quality models proved to be much more difficult. For the specific catchment and model tested, we argue that for rainfall/runoff, 6 months of data for calibration and 6 months of data for validation are required to produce reliable predictions. Further work is needed to make similar recommendations for modelling water quality. PMID:19657167

  14. Fast eutrophication assessment for stormwater wet detention ponds via fuzzy probit regression analysis under uncertainty.

    PubMed

    Tahsin, Subrina; Chang, Ni-Bin

    2016-02-01

    Stormwater wet detention ponds have been a commonly employed best management practice for stormwater management throughout the world for many years. In the past, trophic state index values have been used to evaluate seasonal changes in water quality and rank lakes within a region or between several regions; yet, to date, there is no similar index for stormwater wet detention ponds. This study aimed to develop a new multivariate trophic state index (MTSI) suitable for conducting a rapid eutrophication assessment of stormwater wet detention ponds under uncertainty with respect to three typical physical and chemical properties. Six stormwater wet detention ponds in Florida were selected for demonstration of the new MTSI with respect to total phosphorus (TP), total nitrogen (TN), and Secchi disk depth (SDD) as cognitive assessment metrics to sense eutrophication potential collectively and inform the environmental impact holistically. Due to the involvement of multiple endogenous variables (i.e., TN, TP, and SDD) in the eutrophication assessment simultaneously under uncertainty, fuzzy synthetic evaluation was applied first to standardize and synchronize the sources of uncertainty in the decision analysis. The ordered probit regression model was then formulated for assessment based on the concept of MTSI with the inputs from the fuzzy synthetic evaluation. The results indicate that severe eutrophication conditions are present during fall, which might be due to frequent heavy summer storm events contributing high-nutrient inputs to these six ponds. PMID:26733470

  15. A novel risk-based analysis for the production system under epistemic uncertainty

    NASA Astrophysics Data System (ADS)

    Khalaj, Mehran; Khalaj, Fereshteh; Khalaj, Amineh

    2013-11-01

    Risk analysis of a production system when actual and appropriate data are not available can lead to incorrect prediction of system parameters and to wrong decisions. Under uncertainty, there are no appropriate measures for decision making. Under epistemic uncertainty, we are confronted by a lack of data. Therefore, in calculating the system risk, we encounter vagueness and must use methods that are more efficient for decision making. In this research, using the Dempster-Shafer method and a risk assessment diagram, the researchers have developed a better method of calculating tool failure risk. Traditional statistical methods for recognizing and evaluating systems are not always appropriate, especially when enough data are not available. The goal of this research was to present a more modern and applicable method for real-world organizations. The findings of this research were used in a case study, and an appropriate framework and constraints for tool risk were provided. The research presents a promising concept for the calculation of production system risk, and its results show that under uncertainty, or in case of a lack of knowledge, the selection of an appropriate method will facilitate the decision-making process.

  16. Uncertainty and sensitivity analysis of biokinetic models for radiopharmaceuticals used in nuclear medicine.

    PubMed

    Li, W B; Hoeschen, C

    2010-01-01

    Mathematical models for the kinetics of radiopharmaceuticals in humans were developed and are used to estimate the radiation absorbed dose for patients in nuclear medicine by the International Commission on Radiological Protection and the Medical Internal Radiation Dose (MIRD) Committee. However, because the residence times used were derived from different subjects, partially even with different ethnic backgrounds, a large variation in the model parameters propagates to a high uncertainty in the dose estimation. In this work, a method was developed for analysing the uncertainty and sensitivity of biokinetic models that are used to calculate the residence times. The biokinetic model of (18)F-FDG (FDG) developed by the MIRD Committee was analysed with the developed method. The sources of uncertainty of all model parameters were evaluated based on the experiments. The Latin hypercube sampling technique was used to sample the parameters for model input. Kinetic modelling of FDG in humans was performed. Sensitivity of model parameters was indicated by combining the model input and output, using regression and partial correlation analysis. The transfer rate parameter from plasma to the 'other tissue fast' compartment is the parameter with the greatest influence on the residence time of plasma. Optimisation of biokinetic data acquisition in clinical practice by exploitation of the sensitivity of model parameters obtained in this study is discussed. PMID:20185457
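
    A minimal sketch of the sampling-and-sensitivity pattern described above, applied to a toy two-parameter residence-time expression rather than the MIRD FDG model (parameter names, ranges, and the use of a Spearman rank correlation as the sensitivity measure are all assumptions):

        import numpy as np
        from scipy.stats import qmc, spearmanr

        def residence_time(k_out, k_tissue):
            # crude plasma residence time for a toy one-compartment clearance model
            return 1.0 / (k_out + k_tissue)

        sampler = qmc.LatinHypercube(d=2, seed=6)
        unit = sampler.random(n=500)
        lo = np.array([0.05, 0.01])            # assumed lower bounds (1/min)
        hi = np.array([0.5, 0.2])              # assumed upper bounds (1/min)
        params = qmc.scale(unit, lo, hi)

        out = residence_time(params[:, 0], params[:, 1])
        for name, col in zip(["k_out", "k_tissue"], params.T):
            rho, _ = spearmanr(col, out)
            print(name, round(rho, 3))         # rank correlation as a sensitivity measure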

  17. Uncertainty analysis and validation of the estimation of effective hydraulic properties at the Darcy scale

    NASA Astrophysics Data System (ADS)

    Mesgouez, A.; Buis, S.; Ruy, S.; Lefeuve-Mesgouez, G.

    2014-05-01

    The determination of the hydraulic properties of heterogeneous soils or porous media remains challenging. In the present study, we focus on determining the effective properties of heterogeneous porous media at the Darcy scale with an analysis of their uncertainties. As a preliminary step, experimental measurements of the hydraulic properties of each component of the heterogeneous medium are obtained. The properties of the effective medium, representing an equivalent homogeneous material, are determined numerically by simulating a water flow in a three-dimensional representation of the heterogeneous medium, under steady-state scenarios and using its component properties. One of the major aspects of this study is to take into account the uncertainties of these properties in the computation and evaluation of the effective properties. This is done using a bootstrap method. Numerical evaporation experiments are conducted both on the heterogeneous and on the effective homogeneous materials to evaluate the effectiveness of the proposed approach. First, the impact of the uncertainties of the component properties on the simulated water matric potential is found to be high for the heterogeneous material configuration. Second, it is shown that the strategy developed herein leads to a reduction of this impact. Finally, the adequacy between the mean of the simulations for the two configurations confirms the suitability of the homogenization approach, even in the case of dynamic scenarios. Although it is applied to green roof substrates, a two-component medium composed of bark compost and pozzolan used in the construction of buildings, the methodology proposed in this study is generic.

  18. Perspectives Gained in an Evaluation of Uncertainty, Sensitivity, and Decision Analysis Software

    SciTech Connect

    Davis, F.J.; Helton, J.C.

    1999-02-24

    The following software packages for uncertainty, sensitivity, and decision analysis were reviewed and also tested with several simple analysis problems: Crystal Ball, RiskQ, SUSA-PC, Analytica, PRISM, Ithink, Stella, LHS, STEPWISE, and JMP. Results from the review and test problems are presented. The study resulted in the recognition of the importance of four considerations in the selection of a software package: (1) the availability of an appropriate selection of distributions, (2) the ease with which data flows through the input sampling, model evaluation, and output analysis process, (3) the type of models that can be incorporated into the analysis process, and (4) the level of confidence in the software modeling and results.

  19. Uncertainty analysis for an updated dose assessment for a US nuclear test site: Bikini Atoll

    SciTech Connect

    Bogen, K.T.; Conrado, C.L.; Robison, W.L.

    1995-11-01

    A detailed analysis of uncertainty and interindividual variability in estimated doses was conducted for a rehabilitation scenario for Bikini Island at Bikini Atoll, in which the top 40 cm of soil would be removed in the housing and village area, and the rest of the island is treated with potassium fertilizer, prior to an assumed resettlement date of 1999. Predicted doses were considered for the following fallout-related exposure pathways: ingested Cesium-137 and Strontium-90, external gamma exposure, and inhalation and ingestion of Americium-241 + Plutonium-239+240. Two dietary scenarios were considered: (1) imported foods are available (IA), and (2) imported foods are unavailable (only local foods are consumed) (IUA). Corresponding calculations of uncertainty in estimated population-average dose showed that after approximately 5 y of residence on Bikini, the upper and lower 95% confidence limits with respect to uncertainty in this dose are estimated to be approximately 2-fold higher and lower than its population-average value, respectively (under both IA and IUA assumptions). Corresponding calculations of interindividual variability in the expected value of dose with respect to uncertainty showed that after approximately 5 y of residence on Bikini, the upper and lower 95% confidence limits with respect to interindividual variability in this dose are estimated to be approximately 2-fold higher and lower than its expected value, respectively (under both IA and IUA assumptions). For reference, the expected values of population-average dose at age 70 were estimated to be 1.6 and 5.2 cSv under the IA and IUA dietary assumptions, respectively. Assuming that 200 Bikini resettlers would be exposed to local foods (under both IA and IUA assumptions), the maximum 1-y dose received by any Bikini resident is most likely to be approximately 2 and 8 mSv under the IA and IUA assumptions, respectively.

  20. Error modeling based on geostatistics for uncertainty analysis in crop mapping using Gaofen-1 multispectral imagery

    NASA Astrophysics Data System (ADS)

    You, Jiong; Pei, Zhiyuan

    2015-01-01

    With the development of remote sensing technology, its applications in agriculture monitoring systems, crop mapping accuracy, and spatial distribution are increasingly being explored by administrators and users. Uncertainty in crop mapping is profoundly affected by the spatial pattern of spectral reflectance values obtained from the applied remote sensing data. Errors in remotely sensed crop cover information and their propagation into derivative products need to be quantified and handled correctly. Therefore, this study discusses methods of error modeling for uncertainty characterization in crop mapping using GF-1 multispectral imagery. An error modeling framework based on geostatistics is proposed, which introduces the sequential Gaussian simulation algorithm to explore the relationship between classification errors and the spectral signature from the remote sensing data source. On this basis, a misclassification probability model to produce a spatially explicit classification error probability surface for the map of a crop is developed, which realizes the uncertainty characterization for crop mapping. In this process, trend surface analysis was carried out to generate a spatially varying mean response and the corresponding residual response with spatial variation for the spectral bands of GF-1 multispectral imagery. Variogram models were employed to measure the spatial dependence in the spectral bands and the derived misclassification probability surfaces. Simulated spectral data and classification results were quantitatively analyzed. Through experiments using data sets from a region in the low rolling country located at the Yangtze River valley, it was found that GF-1 multispectral imagery can be used for crop mapping with a good overall performance, the proposed error modeling framework can be used to quantify the uncertainty in crop mapping, and the misclassification probability model can summarize the spatial variation in map accuracy and is helpful for

  1. Uncertainty analysis in regulatory programs: Application factors versus probabilistic methods in ecological risk assessments of chemicals

    SciTech Connect

    Moore, D.R.J.; Elliot, B.

    1995-12-31

    In assessments of toxic chemicals, sources of uncertainty may be dealt with by two basic approaches: application factors and probabilistic methods. In regulatory programs, the most common approach is to calculate a quotient by dividing the predicted environmental concentration (PEC) by the predicted no effects concentration (PNEC). PNECs are usually derived from laboratory bioassays, thus requiring the use of application factors to account for uncertainty introduced by the extrapolation from the laboratory to the field, and from measurement to assessment endpoints. Using this approach, often with worst-case assumptions about exposure and species sensitivities, the hope is that chemicals with a quotient of less than one will have a very low probability of causing adverse ecological effects. This approach has received widespread criticism recently, particularly because it tends to be overly conservative and does not adequately estimate the magnitude and probability of causing adverse effects. On the plus side, application factors are simple to use, accepted worldwide, and may be used with limited effects data in a quotient calculation. The alternative approach is to use probabilistic methods such as Monte Carlo simulation, Bayes' theorem or other techniques to estimate risk. Such methods often have rigorous statistical assumptions and may have large data requirements. Stating an effect in probabilistic terms, however, forces the identification of sources of uncertainty and quantification of their impact on risk estimation. In this presentation the authors discuss the advantages and disadvantages of using application factors and probabilistic methods in dealing with uncertainty in ecological risk assessments of chemicals. Based on this analysis, recommendations are presented to assist in choosing the appropriate approach for different types of regulatory programs dealing with toxic chemicals.
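
    A minimal sketch contrasting the two approaches discussed above (the concentrations, distributions, and the application factor of 100 are illustrative assumptions): a deterministic PEC/PNEC quotient versus a Monte Carlo estimate of the probability that exposure exceeds the effect threshold.

        import numpy as np

        rng = np.random.default_rng(7)

        # Quotient approach with an application factor
        pec = 12.0                       # predicted environmental concentration (ug/L)
        lab_noec = 400.0                 # laboratory no-effect concentration (ug/L)
        application_factor = 100.0       # lab-to-field / endpoint extrapolation
        pnec = lab_noec / application_factor
        print("PEC/PNEC quotient:", pec / pnec)

        # Probabilistic approach: propagate distributions instead of point values
        pec_mc = rng.lognormal(mean=np.log(12.0), sigma=0.5, size=100_000)
        noec_mc = rng.lognormal(mean=np.log(400.0), sigma=0.8, size=100_000)
        print("P(exposure > effect threshold):", np.mean(pec_mc > noec_mc))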

  2. Uncertainty analysis of projections of ozone-depleting substances: mixing ratios, EESC, ODPs, and GWPs

    NASA Astrophysics Data System (ADS)

    Velders, G. J. M.; Daniel, J. S.

    2013-10-01

    The rates at which ozone depleting substances (ODSs) are removed from the atmosphere, that is, their lifetimes, are key factors for determining the rate of ozone layer recovery in the coming decades. We present here a comprehensive uncertainty analysis of future mixing ratios of ODSs, levels of equivalent effective stratospheric chlorine (EESC), ozone depletion potentials, and global warming potentials, using, among other information, the 2013 WCRP/SPARC assessment of lifetimes of ODSs and their uncertainties. The year EESC returns to pre-1980 levels, a metric commonly used to indicate a level of recovery from ODS-induced ozone depletion, is 2048 for mid-latitudes based on the lifetimes from the SPARC assessment, which is about 2 yr later than based on the lifetimes from the WMO assessment of 2011. However, the uncertainty in this return to 1980 levels is much larger than the 2 yr change. The year EESC returns to pre-1980 levels ranges from 2039 to 2064 (95% confidence interval) for mid-latitudes and 2061 to 2105 for the Antarctic spring. The primary contribution to these ranges comes from the uncertainty in the lifetimes. The earlier years of the return estimates are comparable to a hypothetical scenario in which emissions of ODSs cease in 2014. The later end of the range corresponds to a scenario containing an additional emission of about 7 Mt CFC-11-eq in 2015, which is the same as about 2 times the cumulative anthropogenic emissions of all ODSs from 2014 to 2050, or about 12 times the cumulative HCFC emissions from 2014 to 2050.

  3. Analysis of uncertainties in the regional acid deposition model, version 2 (RADM2), gas-phase chemical mechanism. Final report

    SciTech Connect

    Gao, D.; Milford, J.B.; Stockwell, W.R.

    1996-04-01

    This report describes the results of a detailed analysis of uncertainties in the RADM2 chemical mechanism, which was developed by Stockwell et al. (1990) for use in urban and regional scale models of the formation and transport of ozone and other photochemical air pollutants. The uncertainty analysis was conducted for box model simulations of chemical conditions representing summertime smog episodes in polluted rural and urban areas. Estimated uncertainties in the rate parameters and product yields of the mechanism were propagated through the simulations using Monte Carlo analysis with a Latin Hypercube Sampling scheme. Uncertainty estimates for the mechanism parameters were compiled from published reviews, supplemented as necessary by original estimates. Correlations between parameters were considered in the analysis as appropriate.

  4. Reduction and Uncertainty Analysis of Chemical Mechanisms Based on Local and Global Sensitivities

    NASA Astrophysics Data System (ADS)

    Esposito, Gaetano

    Numerical simulations of critical reacting flow phenomena in hypersonic propulsion devices require accurate representation of finite-rate chemical kinetics. The chemical kinetic models available for hydrocarbon fuel combustion are rather large, involving hundreds of species and thousands of reactions. As a consequence, they cannot be used in multi-dimensional computational fluid dynamic calculations in the foreseeable future due to the prohibitive computational cost. In addition to the computational difficulties, it is also known that some fundamental chemical kinetic parameters of detailed models have significant level of uncertainty due to limited experimental data available and to poor understanding of interactions among kinetic parameters. In the present investigation, local and global sensitivity analysis techniques are employed to develop a systematic approach of reducing and analyzing detailed chemical kinetic models. Unlike previous studies in which skeletal model reduction was based on the separate analysis of simple cases, in this work a novel strategy based on Principal Component Analysis of local sensitivity values is presented. This new approach is capable of simultaneously taking into account all the relevant canonical combustion configurations over different composition, temperature and pressure conditions. Moreover, the procedure developed in this work represents the first documented inclusion of non-premixed extinction phenomena, which is of great relevance in hypersonic combustors, in an automated reduction algorithm. The application of the skeletal reduction to a detailed kinetic model consisting of 111 species in 784 reactions is demonstrated. The resulting reduced skeletal model of 37--38 species showed that the global ignition/propagation/extinction phenomena of ethylene-air mixtures can be predicted within an accuracy of 2% of the full detailed model. The problems of both understanding non-linear interactions between kinetic parameters and
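
    A minimal sketch of ranking reactions by Principal Component Analysis of a pooled local-sensitivity matrix (random numbers replace real kinetic sensitivities, and the number of retained components and reactions are arbitrary choices):

        import numpy as np

        rng = np.random.default_rng(8)
        n_conditions, n_reactions = 40, 120
        S = rng.normal(size=(n_conditions, n_reactions))  # local sensitivities over many cases

        # PCA via SVD of the column-centered sensitivity matrix
        Sc = S - S.mean(axis=0)
        U, sing, Vt = np.linalg.svd(Sc, full_matrices=False)
        n_pc = 3                                          # retained principal components
        importance = np.sqrt(np.sum((Vt[:n_pc].T * sing[:n_pc]) ** 2, axis=1))

        # Reactions loading strongly on the leading components are kept in the skeletal model
        keep = np.argsort(importance)[::-1][:30]
        print(sorted(keep.tolist())[:10])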

  5. Sensitivity Analysis and Uncertainty Characterization of Subnational Building Energy Demand in an Integrated Assessment Model

    NASA Astrophysics Data System (ADS)

    Scott, M. J.; Daly, D.; McJeon, H.; Zhou, Y.; Clarke, L.; Rice, J.; Whitney, P.; Kim, S.

    2012-12-01

    Residential and commercial buildings are a major source of energy consumption and carbon dioxide emissions in the United States, accounting for 41% of energy consumption and 40% of carbon emissions in 2011. Integrated assessment models (IAMs) historically have been used to estimate the impact of energy consumption on greenhouse gas emissions at the national and international level. Increasingly they are being asked to evaluate mitigation and adaptation policies that have a subnational dimension. In the United States, for example, building energy codes are adopted and enforced at the state and local level. Adoption of more efficient appliances and building equipment is sometimes directed or actively promoted by subnational governmental entities for mitigation or adaptation to climate change. The presentation reports on new example results from the Global Change Assessment Model (GCAM) IAM, one of a flexibly-coupled suite of models of human and earth system interactions known as the integrated Regional Earth System Model (iRESM) system. iRESM can evaluate subnational climate policy in the context of the important uncertainties represented by national policy and the earth system. We have added a 50-state detailed U.S. building energy demand capability to GCAM that is sensitive to national climate policy, technology, regional population and economic growth, and climate. We are currently using GCAM in a prototype stakeholder-driven uncertainty characterization process to evaluate regional climate mitigation and adaptation options in a 14-state pilot region in the U.S. upper Midwest. The stakeholder-driven decision process involves several steps, beginning with identifying policy alternatives and decision criteria based on stakeholder outreach, identifying relevant potential uncertainties, then performing sensitivity analysis, characterizing the key uncertainties from the sensitivity analysis, and propagating and quantifying their impact on the relevant decisions. In the

  6. Uncertainty propagation analysis applied to volcanic ash dispersal at Mt. Etna by using a Lagrangian model

    NASA Astrophysics Data System (ADS)

    de'Michieli Vitturi, Mattia; Pardini, Federica; Spanu, Antonio; Neri, Augusto; Vittoria Salvetti, Maria

    2015-04-01

    Volcanic ash clouds represent a major hazard for populations living near volcanic centers, producing a risk for humans and a potential threat to crops, ground infrastructures, and aviation traffic. Lagrangian particle dispersal models are commonly used for tracking ash particles emitted from volcanic plumes and transported under the action of atmospheric wind fields. In this work, we present the results of an uncertainty propagation analysis applied to volcanic ash dispersal from weak plumes with specific focus on the uncertainties related to the grain-size distribution of the mixture. To this aim, the Eulerian fully compressible mesoscale non-hydrostatic model WRF was used to generate the driving wind, representative of the atmospheric conditions occurring during the event of November 24, 2006 at Mt. Etna. Then, the Lagrangian particle model LPAC (de' Michieli Vitturi et al., JGR 2010) was used to simulate the transport of mass particles under the action of atmospheric conditions. The particle motion equations were derived by expressing the Lagrangian particle acceleration as the sum of the forces acting along its trajectory, with drag forces calculated as a function of particle diameter, density, shape and Reynolds number. The simulations were representative of weak plume events of Mt. Etna and aimed to quantify the effect on the dispersal process of the uncertainty in the particle sphericity and in the mean and variance of a log-normal distribution function describing the grain-size of ash particles released from the eruptive column. In order to analyze the sensitivity of particle dispersal to these uncertain parameters with a reasonable number of simulations, and therefore with affordable computational costs, response surfaces in the parameter space were built by using the generalized polynomial chaos technique. The uncertainty analysis made it possible to quantify the most probable values, as well as their pdfs, of the number of particles as well as of the mean and

  7. A Grounded Analysis of Career Uncertainty Perceived by College Students in Taiwan

    ERIC Educational Resources Information Center

    Tien, Hsiu-Lan Shelley; Lin, Chia-Huei; Chen, Shu-Chi

    2005-01-01

    The authors examined career-related uncertainties perceived by college students in Taiwan. Five hundred thirty-two Taiwanese students responded to a free-response instrument containing 3 questions related to career uncertainties: (1) the sources of career uncertainty; (2) the experiences at the moment of feeling uncertainty; and (3) coping…

  8. Uncertainty and sensitivity analysis of food pathway results with the MACCS Reactor Accident Consequence Model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the food pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 87 imprecisely-known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, milk growing season dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, area dependent cost, crop disposal cost, milk disposal cost, condemnation area, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: fraction of cesium deposition on grain fields that is retained on plant surfaces and transferred directly to grain, maximum allowable ground concentrations of Cs-137 and Sr-90 for production of crops, ground concentrations of Cs-134, Cs-137 and I-131 at which the disposal of milk will be initiated due to accidents that occur during the growing season, ground concentrations of Cs-134, I-131 and Sr-90 at which the disposal of crops will be initiated due to accidents that occur during the growing season, rate of depletion of Cs-137 and Sr-90 from the root zone, transfer of Sr-90 from soil to legumes, transfer of Cs-137 from soil to pasture, transfer of cesium from animal feed to meat, and the transfer of cesium, iodine and strontium from animal feed to milk.

  9. A multimedia environmental model of chemical distribution: fate, transport, and uncertainty analysis.

    PubMed

    Luo, Yuzhou; Yang, Xiusheng

    2007-01-01

    This paper presented a framework for analysis of chemical concentration in the environment and evaluation of variance propagation within the model. This framework was illustrated through a case study of the selected organic compounds benzo[a]pyrene (BAP) and hexachlorobenzene (HCB) in the Great Lakes region. A multimedia environmental fate model was applied to perform stochastic simulations of chemical concentrations in various media. Both uncertainty in chemical properties and variability in hydrometeorological parameters were included in the Monte Carlo simulation, resulting in a distribution of concentrations in each medium. Parameters of compartmental dimensions, densities, emissions, and background concentrations were assumed to be constant in this study. The predicted concentrations in air, surface water and sediment were compared to reported data for validation purposes. Based on rank correlations, a sensitivity analysis was conducted to determine the influence of individual input parameters on the output variance for concentration in each environmental medium and for the basin-wide total mass inventory. Results of model validation indicated that the model predictions were in reasonable agreement with spatial distribution patterns, among the five lake basins, of reported data in the literature. For the chemical and environmental parameters given in this study, parameters associated with air-ground partitioning (such as moisture in surface soil, vapor pressure, and deposition velocity) and chemical distribution in soil solid (such as organic carbon partition coefficient and organic carbon content in root-zone soil) were targeted to reduce the uncertainty in basin-wide mass inventory. The results of the sensitivity analysis in this study also indicated that the model sensitivity to an input parameter might be affected by the magnitudes of input parameters defined by the parameter settings in the simulation scenario. Therefore, uncertainty and sensitivity analyses

  10. Uncertainty and sensitivity analysis of chronic exposure results with the MACCS reactor accident consequence model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the chronic exposure pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 75 imprecisely known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, water ingestion dose, milk growing season dose, long-term groundshine dose, long-term inhalation dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, total latent cancer fatalities, area-dependent cost, crop disposal cost, milk disposal cost, population-dependent cost, total economic cost, condemnation area, condemnation population, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: dry deposition velocity, transfer of cesium from animal feed to milk, transfer of cesium from animal feed to meat, ground concentration of Cs-134 at which the disposal of milk products will be initiated, transfer of Sr-90 from soil to legumes, maximum allowable ground concentration of Sr-90 for production of crops, fraction of cesium entering surface water that is consumed in drinking water, groundshine shielding factor, scale factor defining resuspension, dose reduction associated with decontamination, and ground concentration of I-131 at which disposal of crops will be initiated due to accidents that occur during the growing season.

  11. AN OVERVIEW OF THE UNCERTAINTY ANALYSIS, SENSITIVITY ANALYSIS, AND PARAMETER ESTIMATION (UA/SA/PE) API AND HOW TO IMPLEMENT IT

    EPA Science Inventory

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and
    Parameter Estimation (UA/SA/PE API) (also known as Calibration, Optimization and Sensitivity and Uncertainty (CUSO)) was developed in a joint effort between several members of both ...

  12. Response Model Based Analysis of Climate Model Sensitivities and Uncertainties using the LLNL UQ Pipeline

    NASA Astrophysics Data System (ADS)

    Brandon, S. T.; Domyancic, D. M.; Johnson, B. J.; Nimmakayala, R.; Lucas, D. D.; Tannahill, J.; Christianson, G.; McEnerney, J.; Klein, R.

    2011-12-01

    A Lawrence Livermore National Laboratory (LLNL) multi-directorate strategic initiative is developing uncertainty quantification (UQ) tools and techniques that are being applied to climate research. The LLNL UQ Pipeline and corresponding computational tools support the ensemble-of-models approach to UQ, and these tools have enabled the production of a comprehensive set of present-day climate calculations using the Community Atmospheric Model (CAM) and, more recently, the Community Earth System Model (CESM) codes. Statistical analysis of the ensemble is made possible by fitting a response surface, or surrogate model, to the ensemble-of-models data. We describe the LLNL UQ Pipeline and techniques that enable the execution and analysis of climate UQ and sensitivities studies on LLNL's high performance computing (HPC) resources. The analysis techniques are applied to an ensemble consisting of 1,000 CAM4 simulations. We also present two methods, direct sampling and bootstrapping, that quantify the errors in the ability of the response function to model the CAM4 ensemble. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013.

  13. Numerical analysis of step-drawdown tests: Parameter identification and uncertainty

    NASA Astrophysics Data System (ADS)

    Louwyck, A.; Vandenbohede, A.; Lebbe, L.

    2010-01-01

    An inverse, numerical model is presented as an interpretation method for step-drawdown tests. The model consists of an axi-symmetric model to simulate radial flow in a horizontally stratified aquifer, and a nonlinear regression analysis to identify hydraulic parameters and quantify parameter uncertainty. Compared to commonly used methods based on analytical solutions, the inverse model allows a more representative aquifer schematisation, usage of all drawdown observations, and a comprehensive analysis of parameter uncertainty. Data sets from two step-drawdown tests performed in a homogeneous, confined aquifer are interpreted and it is concluded that transmissivity and well loss coefficient are identifiable. Additional data sampled from an observation well are recommended to estimate aquifer storativity. Reliability of parameter estimates decreases with a decrease in the number of steps and therefore it is recommended to conduct step-drawdown tests having at least four steps. Analysis of data from a third test performed in an unconfined, layered aquifer reveals that only the well loss coefficient is identifiable and that prior information about hydraulic parameters of extracted and adjacent layers is required. In general, it is concluded that the inverse, numerical model is a viable interpretation method to identify the well loss coefficient and additional aquifer parameters from step-drawdown test data.

  14. Sensitivity Analysis and Uncertainty Propagation in a General-Purpose Thermal Analysis Code

    SciTech Connect

    Blackwell, Bennie F.; Dowding, Kevin J.

    1999-08-04

    Methods are discussed for computing the sensitivity of field variables to changes in material properties and initial/boundary condition parameters for heat transfer problems. The method we focus on is termed the "Sensitivity Equation Method" (SEM). It involves deriving field equations for sensitivity coefficients by differentiating the original field equations with respect to the parameters of interest and numerically solving the resulting sensitivity field equations. Uncertainties in the model parameters are then propagated through the computational model using results derived from first-order perturbation theory; this technique is identical to the methodology typically used to propagate experimental uncertainty. Numerical results are presented for the design of an experiment to estimate the thermal conductivity of stainless steel using transient temperature measurements made on prototypical hardware of a companion contact conductance experiment. Comments are made relative to extending the SEM to conjugate heat transfer problems.
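
    A minimal sketch of the first-order propagation step only, using analytic sensitivity coefficients for a toy 1-D conduction problem instead of solving sensitivity field equations (nominal values and parameter standard deviations are assumptions): the predicted temperature variance is the sum of squared sensitivities times parameter variances.

        import numpy as np

        # Toy 1-D steady conduction: T_surface = T_back + q * L / k
        q, L, k, T_back = 5.0e4, 0.01, 15.0, 300.0   # W/m^2, m, W/(m K), K (nominal values)
        T = T_back + q * L / k

        # Sensitivity coefficients dT/dp and assumed parameter standard deviations
        dT_dq, dT_dL, dT_dk = L / k, q / k, -q * L / k**2
        sig_q, sig_L, sig_k = 2.0e3, 2.0e-4, 0.8

        # First-order (perturbation-theory) variance propagation
        var_T = (dT_dq * sig_q) ** 2 + (dT_dL * sig_L) ** 2 + (dT_dk * sig_k) ** 2
        print("T =", round(T, 1), "K  +/-", round(np.sqrt(var_T), 2), "K (1 sigma)")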

  15. How to assess the Efficiency and "Uncertainty" of Global Sensitivity Analysis?

    NASA Astrophysics Data System (ADS)

    Haghnegahdar, Amin; Razavi, Saman

    2016-04-01

    Sensitivity analysis (SA) is an important paradigm for understanding model behavior, characterizing uncertainty, improving model calibration, etc. Conventional "global" SA (GSA) approaches are rooted in different philosophies, resulting in different and sometimes conflicting and/or counter-intuitive assessments of sensitivity. Moreover, most global sensitivity techniques are highly computationally demanding when required to generate robust and stable sensitivity metrics over the entire model response surface. Accordingly, a novel sensitivity analysis method called Variogram Analysis of Response Surfaces (VARS) is introduced to overcome the aforementioned issues. VARS uses the variogram concept to efficiently provide a comprehensive assessment of global sensitivity across a range of scales within the parameter space. Based on the VARS principles, in this study we present innovative ideas to assess (1) the efficiency of GSA algorithms and (2) the level of confidence we can assign to a sensitivity assessment. We use multiple hydrological models with different levels of complexity to explain the new ideas.

  16. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    NASA Astrophysics Data System (ADS)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but also address basic questions like what uncertainty is and how uncertainty can be quantified or treated in a reliable and reproducible way.

  17. Numerical Uncertainty Analysis for Computational Fluid Dynamics using Student T Distribution -- Application of CFD Uncertainty Analysis Compared to Exact Analytical Solution

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.; Ilie, Marcel; Shallhorn, Paul A.

    2014-01-01

    Computational Fluid Dynamics (CFD) is the standard numerical tool used by fluid dynamicists to estimate solutions to many problems in academia, government, and industry. CFD is known to have errors and uncertainties and there is no universally adopted method to estimate such quantities. This paper describes an approach to estimate CFD uncertainties strictly numerically using inputs and the Student-T distribution. The approach is compared to an exact analytical solution of fully developed, laminar flow between infinite, stationary plates. It is shown that treating all CFD input parameters as oscillatory uncertainty terms coupled with the Student-T distribution can encompass the exact solution.
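
    A minimal sketch of using the Student-t distribution to bound the spread of a small sample of outputs (the eight synthetic "runs" stand in for CFD results from perturbed inputs; they are not from the paper):

        import numpy as np
        from scipy.stats import t

        rng = np.random.default_rng(9)
        runs = 9.81 + rng.normal(0.0, 0.05, size=8)   # e.g. pressure drops from 8 perturbed runs

        mean = runs.mean()
        sem = runs.std(ddof=1) / np.sqrt(runs.size)   # standard error of the mean
        half_width = t.ppf(0.975, df=runs.size - 1) * sem
        print(f"estimate = {mean:.3f} +/- {half_width:.3f} (95% confidence)")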

  18. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining a connection to, and communicating the uncertainty in, the underlying spatial datasets. The VGM outputs a simultaneous visualization representative of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, or uncertainty calculated from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation

  19. Two-Stage Automatic Calibration and Predictive Uncertainty Analysis of a Semi-distributed Watershed Model

    NASA Astrophysics Data System (ADS)

    Lin, Z.; Radcliffe, D. E.; Doherty, J.

    2004-12-01

    monthly flow produced a very good fit to the measured data. Nash and Sutcliffe coefficients for daily and monthly flow over the calibration period were 0.60 and 0.86, respectively; they were 0.61 and 0.87, respectively, over the validation period. Regardless of the level of model-to-measurement fit, nonuniqueness of the optimal parameter values makes uncertainty analysis necessary for model prediction. The nonlinear prediction uncertainty analysis showed that caution must be exercised when using the SWAT model to predict instantaneous peak flows. The PEST (Parameter Estimation) free software was used to conduct the two-stage automatic calibration and prediction uncertainty analysis of the SWAT model.
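
    For context on the fit statistics quoted above, the Nash-Sutcliffe efficiency can be computed from paired observed and simulated flows as in this minimal sketch; the flow values are hypothetical, not the study's data.

        import numpy as np

        def nash_sutcliffe(observed, simulated):
            """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 is no better than the observed mean."""
            observed = np.asarray(observed, dtype=float)
            simulated = np.asarray(simulated, dtype=float)
            return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

        # Hypothetical monthly flows (m^3/s)
        obs = [12.0, 30.5, 22.1, 8.4, 5.2, 15.7]
        sim = [10.8, 28.9, 24.0, 9.1, 6.0, 14.2]
        print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")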

  20. Transparent tools for uncertainty analysis in high level waste disposal facilities safety

    SciTech Connect

    Lemos, Francisco Luiz de; Helmuth, Karl-Heinz; Sullivan, Terry

    2007-07-01

    In this paper some results of a further development of a technical cooperation project, initiated in 2004, between the CDTN/CNEN, the Brazilian National Nuclear Energy Commission, and STUK, the Finnish Radiation and Nuclear Safety Authority, are presented. The objective of this project is to study applications of fuzzy logic and artificial intelligence methods to uncertainty analysis in the safety assessment of high level waste disposal facilities. Uncertainty analysis is an essential part of the study of the complex interactions of the features, events and processes which will affect the performance of the HLW disposal system over thousands of years in the future. Very often the development of conceptual and computational models requires simplifications and the selection of overly conservative parameters that can lead to unrealistic results. These results can mask the existing uncertainties which, consequently, can be an obstacle to a better understanding of the natural processes. A correct evaluation of uncertainties and their role in data interpretation is an important step toward improved confidence in the calculations and public acceptance. This study focuses on dissolution (source), solubility and sorption (sink) as key processes for determination of release and migration of radionuclides. These factors are affected by a number of parameters that characterize the near and far fields, such as pH, temperature, redox conditions, and other groundwater properties. On the other hand, these parameters are also consequences of other processes and conditions such as water-rock interaction and pH and redox buffering. Fuzzy logic tools have proved to be well suited for dealing with the interpretation of complex, and sometimes conflicting, data. For example, although some parameters, such as pH and carbonate, are treated as independent, they influence each other and the solubility. The technique of fuzzy cognitive mapping is used for analysis of

  1. Identifying the Potential Loss of Monitoring Wells Using an Uncertainty Analysis

    SciTech Connect

    Freedman, Vicky L.; Waichler, Scott R.; Cole, Charles R.; Vermeul, Vince R.; Bergeron, Marcel P.

    2005-11-01

    From the mid-1940s through the 1980s, large volumes of wastewater were discharged at the Hanford Site in southeastern Washington State, causing a large-scale rise (in excess of 20 m) in the water table. When wastewater discharges ceased in 1988, groundwater mounds began to dissipate. This caused a large number of wells to go dry and has made it difficult to monitor contaminant plume migration. To identify the wells that could potentially go dry, a first order uncertainty analysis was performed using a three-dimensional, finite element code (CFEST) coupled with UCODE, a nonlinear parameter estimation code. The analysis was conducted in four steps. First, key parameter values were identified by calibrating to historical hydraulic head data. Second, the model was tested for linearity, a strict requirement for representing output uncertainty. Third, results from the calibration period were used to verify model predictions by comparing monitoring wells' wet/dry status with field data. In the final step, predictions on the number and locations of dry wells were made through the year 2048. A non-physically based model that extrapolated trends at each individual well was also tested as a predictor of a well's wet/dry status. Results demonstrated that when uncertainty in both parameter estimates and measurement error was considered, the CFEST-based model successfully predicted the majority of dry wells, outperforming the trend model. Predictions made through the year 2048 indicated that approximately 50% of the wells in the monitoring well network are likely to go dry, which can aid in decisions for their replacement.
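
    A minimal sketch of first-order (linear) propagation of calibrated-parameter uncertainty to a prediction, in the spirit of the analysis described above; the sensitivity vector, covariance matrix, and head values are hypothetical placeholders, not CFEST or UCODE output.

        import numpy as np

        # Hypothetical sensitivities of a predicted head (m) to three calibrated parameters
        J = np.array([0.8, -0.3, 1.5])

        # Hypothetical posterior parameter covariance from calibration
        C = np.array([[0.04, 0.01, 0.00],
                      [0.01, 0.09, 0.02],
                      [0.00, 0.02, 0.16]])

        pred_var = J @ C @ J          # first-order prediction variance
        sigma_meas = 0.10             # hypothetical measurement-error standard deviation (m)
        total_sd = np.sqrt(pred_var + sigma_meas**2)

        head = 4.75                   # hypothetical predicted head above the well screen (m)
        print(f"predicted head = {head:.2f} m +/- {2 * total_sd:.2f} m (approx. 95%)")
        # A well would be flagged as potentially dry if the lower bound falls below the screen elevation.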

  2. Stochastic modelling of landfill leachate and biogas production incorporating waste heterogeneity. Model formulation and uncertainty analysis.

    PubMed

    Zacharof, A I; Butler, A P

    2004-01-01

    A mathematical model simulating the hydrological and biochemical processes occurring in landfilled waste is presented and demonstrated. The model combines biochemical and hydrological models into an integrated representation of the landfill environment. Waste decomposition is modelled using traditional biochemical waste decomposition pathways combined with a simplified methodology for representing the rate of decomposition. Water flow through the waste is represented using a statistical velocity model capable of representing the effects of waste heterogeneity on leachate flow through the waste. Given the limitations in data capture from landfill sites, significant emphasis is placed on improving parameter identification and reducing parameter requirements. A sensitivity analysis is performed, highlighting the model's response to changes in input variables. A model test run is also presented, demonstrating the model's capabilities. A parameter perturbation sensitivity analysis was also performed; this showed that, although the model is sensitive to certain key parameters, its overall intuitive response provides a good basis for making reasonable predictions of the future state of the landfill system. Finally, due to the high uncertainty associated with landfill data, a tool for handling input data uncertainty is incorporated in the model's structure. It is concluded that the model can be used as a reasonable tool for modelling landfill processes and that further work should be undertaken to assess the model's performance. PMID:15120429

  3. An improved method of fuzzy support degree based on uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Huang, Yuan; Wu, Jing; Wu, Lihua; Sheng, Weidong

    2015-10-01

    Most multisensor association algorithms based on fuzzy set theory form the opinion of a fuzzy proposition using a simple triangular membership function, which does not take the randomness of measurements into account. Moreover, the sensor variance is assumed to be known in the triangular function, but in practice the exact variance is difficult to acquire. This paper discusses two situations, with known and with unknown sensor variance. First, with known variance and known mean, a method is proposed that uses the probability ratio to calculate the fuzzy support degree; the interaction between the two objects is considered. Second, with unknown variance and known mean, the sample mean in the grey autocorrelation function is replaced with the true sensor mean in order to analyze the uncertainty, which is in effect the correlation coefficient between targets and measurements; in this way, the method can deal with the case of small samples. Finally, the opinion about the fuzzy proposition is formed by weighting the opinions of all the sensors based on the result of the uncertainty analysis. Sufficient simulations on some typical scenarios are performed, and the results indicate that the presented method is efficient.

  4. Uncertainty Analysis for Assessing Leakage Through Water Tunnels: A Case from Nepal Himalaya

    NASA Astrophysics Data System (ADS)

    Panthi, Krishna Kanta; Nilsen, Bjørn

    2010-09-01

    Water leakage problems in unlined or shotcrete lined water tunnels are not new issues. On many occasions, severe water leakage problems have been faced that not only reduced the stability of the rock mass but also caused valuable water to be lost, posing safety risks as well as large economic losses to the projects. Hence, making tunnels water tight plays an important role in improving the stability and safety of underground excavations. The real challenge, however, is accurate prediction and quantification of possible water leakage, so that cost consequences can be incorporated during planning of a water conveying tunnel project. The main purposes of this paper are to analyze extensive data on leakage tests carried out through the exploratory drillholes used to define the need for pre-injection grouting of the Khimti headrace tunnel, and to carry out a probabilistic uncertainty analysis based on the relationships established between leakage, hydrostatic head and selected Q-value parameters. The authors believe that the new approach to uncertainty analysis of leakage presented in this paper will improve the understanding of the leakage characteristics of the rock mass, and hope this will lead to a better understanding concerning quantification of possible water leakage from unlined and shotcrete lined water tunnels.

  5. A seismic hazard uncertainty analysis for the New Madrid seismic zone

    USGS Publications Warehouse

    Cramer, C.H.

    2001-01-01

    A review of the scientific issues relevant to characterizing earthquake sources in the New Madrid seismic zone has led to the development of a logic tree of possible alternative parameters. A variability analysis, using Monte Carlo sampling of this consensus logic tree, is presented and discussed. The analysis shows that for 2%-exceedance-in-50-year hazard, the best-estimate seismic hazard map is similar to previously published seismic hazard maps for the area. For peak ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 s (0.2 and 1.0 s Sa), the coefficient of variation (COV) representing the knowledge-based uncertainty in seismic hazard can exceed 0.6 over the New Madrid seismic zone and diminishes to about 0.1 away from areas of seismic activity. Sensitivity analyses show that the largest contributor to PGA, 0.2 and 1.0 s Sa seismic hazard variability is the uncertainty in the location of future 1811-1812 New Madrid sized earthquakes. This is followed by the variability due to the choice of ground motion attenuation relation, the magnitude for the 1811-1812 New Madrid earthquakes, and the recurrence interval for M>6.5 events. Seismic hazard is not very sensitive to the variability in seismogenic width and length. Published by Elsevier Science B.V.
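
    A minimal sketch of Monte Carlo sampling of a hazard logic tree of the kind described above; the branch values and weights are hypothetical placeholders, not the New Madrid consensus values.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical logic-tree branches: (values, weights)
        magnitude   = ([7.3, 7.7, 8.1],    [0.3, 0.5, 0.2])    # characteristic magnitude
        recurrence  = ([350., 500., 750.], [0.25, 0.5, 0.25])  # mean recurrence interval (yr)
        attenuation = (["A", "B", "C"],    [0.4, 0.4, 0.2])    # ground-motion attenuation relation

        def draw(branches):
            values, weights = branches
            return values[rng.choice(len(values), p=weights)]

        samples = [(draw(magnitude), draw(recurrence), draw(attenuation)) for _ in range(10000)]
        # Each sampled branch combination would feed one hazard calculation; the spread of the
        # resulting hazard values gives the coefficient of variation (COV) of the hazard.
        print(samples[:3])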

  6. Uncertainty Analysis for the Miniaturized Laser Heterodyne Radiometer (mini-LHR)

    NASA Technical Reports Server (NTRS)

    Clarke, G. B.; Wilson, E. L.; Miller, J. H.; Melroy, H. R.

    2014-01-01

    Presented here is a sensitivity analysis for the miniaturized laser heterodyne radiometer (mini-LHR). This passive, ground-based instrument measures carbon dioxide (CO2) in the atmospheric column and has been under development at NASA/GSFC since 2009. The goal of this development is to produce a low-cost, easily-deployable instrument that can extend current ground measurement networks in order to (1) validate column satellite observations, (2) provide coverage in regions of limited satellite observations, (3) target regions of interest such as thawing permafrost, and (4) support the continuity of a long-term climate record. In this paper an uncertainty analysis of the instrument performance is presented and compared with results from three sets of field measurements. The signal-to-noise ratio (SNR) and corresponding uncertainty for a single scan are calculated to be 329.4+/-1.3 by propagating errors through the equation governing the SNR. An absorbance noise of 0.0024 is reported for 6 averaged scans of field data, corresponding to an instrument precision of approximately 0.2 ppmv for CO2.
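
    A minimal sketch of first-order error propagation through a governing expression, of the kind used to attach an uncertainty to the SNR; the stand-in SNR model and the input values and uncertainties are hypothetical, not the mini-LHR equations.

        import numpy as np

        def propagate(f, x, sigmas, eps=1e-6):
            """First-order error propagation (independent inputs) via numerical partial derivatives."""
            x = np.asarray(x, dtype=float)
            grads = np.empty_like(x)
            for i in range(x.size):
                dx = np.zeros_like(x)
                dx[i] = eps * max(abs(x[i]), 1.0)
                grads[i] = (f(x + dx) - f(x - dx)) / (2 * dx[i])
            return f(x), np.sqrt(np.sum((grads * np.asarray(sigmas)) ** 2))

        # Hypothetical SNR expression: signal level over noise level, scaled by sqrt of averaged scans
        snr = lambda p: p[0] / p[1] * np.sqrt(p[2])
        value, sigma = propagate(snr, x=[260.0, 1.0, 1.0], sigmas=[1.0, 0.003, 0.0])
        print(f"SNR = {value:.1f} +/- {sigma:.1f}")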

  7. The Multi-Sensor Aerosol Products Sampling System (MAPSS) for Integrated Analysis of Satellite Retrieval Uncertainties

    NASA Technical Reports Server (NTRS)

    Ichoku, Charles; Petrenko, Maksym; Leptoukh, Gregory

    2010-01-01

    Among the known atmospheric constituents, aerosols represent the greatest uncertainty in climate research. Although satellite-based aerosol retrieval has practically become routine, especially during the last decade, there is often disagreement between similar aerosol parameters retrieved from different sensors, leaving users confused as to which sensors to trust for answering important science questions about the distribution, properties, and impacts of aerosols. As long as there is no consensus and the inconsistencies are not well characterized and understood, there will be no way of developing reliable climate data records from satellite aerosol measurements. Fortunately, the most globally representative well-calibrated ground-based aerosol measurements corresponding to the satellite-retrieved products are available from the Aerosol Robotic Network (AERONET). To adequately utilize the advantages offered by this vital resource, an online Multi-sensor Aerosol Products Sampling System (MAPSS) was recently developed. The aim of MAPSS is to facilitate detailed comparative analysis of satellite aerosol measurements from different sensors (Terra-MODIS, Aqua-MODIS, Terra-MISR, Aura-OMI, Parasol-POLDER, and Calipso-CALIOP) based on the collocation of these data products over AERONET stations. In this presentation, we will describe the strategy of the MAPSS system, its potential advantages for the aerosol community, and the preliminary results of an integrated comparative uncertainty analysis of aerosol products from multiple satellite sensors.

  8. Accounting for uncertainty in the analysis of overlap layer mean velocity models

    NASA Astrophysics Data System (ADS)

    Oliver, Todd A.; Moser, Robert D.

    2012-07-01

    When assessing the veracity of mathematical models, it is important to consider the uncertainties in the data used for the assessment. In this paper, we study the impact of data uncertainties on the analysis of overlap layer models for the mean velocity in wall-bounded turbulent flows. Specifically, the tools of Bayesian statistics are used to calibrate and compare six competing models of the mean velocity profile, including multiple logarithmic and power law forms, using velocity profile measurements from a zero-pressure-gradient turbulent boundary layer and fully developed turbulent pipe flow. The calibration problem is formulated as a Bayesian update of the joint probability density function for the calibration parameters, which are treated as random variables to characterize incomplete knowledge about their values. This probabilistic formulation provides a natural treatment of uncertainty and gives insight into the quality of the fit, features that are not easily obtained in deterministic calibration procedures. The model comparison also relies on a Bayesian update. In particular, the relative probabilities of the competing models are updated using the calibration data. The resulting posterior probabilities quantify the relative plausibility of the competing models given the data. For the boundary layer, results are shown for five subsets of the turbulent boundary layer data due to Österlund, including different Reynolds number and wall distance ranges, and multiple assumptions regarding the magnitude of the uncertainty in the velocity measurements. For most choices, multiple models have relatively high posterior probability, indicating that it is difficult to distinguish between the models. For the most inclusive data sets—i.e., the largest ranges of Reynolds number and wall distance—the first-order logarithmic law due to Buschmann and Gad-el-Hak is significantly more probable, given the data, than the other models evaluated. For the pipe flow, data from
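
    A minimal sketch of the Bayesian model-comparison step described above: posterior model probabilities follow from each model's evidence (marginal likelihood) and its prior probability. The model names and log-evidence values below are hypothetical placeholders for quantities that would come from the calibration, not the paper's results.

        import numpy as np

        # Hypothetical log-evidences (marginal likelihoods) of competing mean-velocity models
        log_evidence = {"log-law A": -120.4, "log-law B": -118.9, "power law": -125.2}
        prior = {m: 1.0 / len(log_evidence) for m in log_evidence}   # equal prior plausibility

        # Posterior probability of each model given the data (normalized in a numerically stable way)
        max_le = max(log_evidence.values())
        unnorm = {m: prior[m] * np.exp(le - max_le) for m, le in log_evidence.items()}
        total = sum(unnorm.values())
        posterior = {m: p / total for m, p in unnorm.items()}
        for m, p in posterior.items():
            print(f"{m}: {p:.3f}")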

  9. Analysis of uncertainties in infrared camera measurements of a turbofan engine in an altitude test cell

    NASA Astrophysics Data System (ADS)

    Morris, T. A.; Marciniak, M. A.; Wollenweber, G. C.; Turk, J. A.

    2006-06-01

    m bands using a parametric analysis. Two LWIR bands are considered to provide insight into specific previous measurements made with a quantum-well IR photo-detector (QWIP, roughly 8-9 μm), as well as potential future measurements made using broader band imagers (e.g., HgCdTe at 8-12 μm). A sensitivity analysis in the style of a Monte Carlo simulation is also performed to gauge the uncertainty in the radiometric model calculations.

  10. Grid and basis adaptive polynomial chaos techniques for sensitivity and uncertainty analysis

    SciTech Connect

    Perkó, Zoltán; Gilli, Luca; Lathouwers, Danny; Kloosterman, Jan Leen

    2014-03-01

    The demand for accurate and computationally affordable sensitivity and uncertainty techniques is constantly on the rise and has become especially pressing in the nuclear field with the shift to Best Estimate Plus Uncertainty methodologies in the licensing of nuclear installations. Besides traditional, already well developed methods – such as first order perturbation theory or Monte Carlo sampling – Polynomial Chaos Expansion (PCE) has been given a growing emphasis in recent years due to its simple application and good performance. This paper presents new developments of the research done at TU Delft on such Polynomial Chaos (PC) techniques. Our work is focused on the Non-Intrusive Spectral Projection (NISP) approach and adaptive methods for building the PCE of responses of interest. Recent efforts resulted in a new adaptive sparse grid algorithm designed for estimating the PC coefficients. The algorithm is based on Gerstner's procedure for calculating multi-dimensional integrals but proves to be computationally significantly cheaper, while at the same time it retains a similar accuracy as the original method. More importantly, the issue of basis adaptivity has been investigated and two techniques have been implemented for constructing the sparse PCE of quantities of interest. Not using the traditional full PC basis set leads to a further reduction in computational time, since the high order grids necessary for accurately estimating the near-zero expansion coefficients of polynomial basis vectors not needed in the PCE can be excluded from the calculation. Moreover, the sparse PC representation of the response is easier to handle when used for sensitivity analysis or uncertainty propagation due to the smaller number of basis vectors. The developed grid and basis adaptive methods have been implemented in Matlab as the Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm and were tested on four analytical problems. These show consistently good performance both
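
    A minimal one-dimensional sketch of non-intrusive spectral projection: PCE coefficients of a response are estimated by Gauss-Hermite quadrature over a standard normal input. The response function here is a hypothetical stand-in for a code run, not the FANISP algorithm or its adaptive sparse grids.

        import numpy as np
        from math import factorial
        from numpy.polynomial.hermite_e import hermegauss, hermeval

        def response(xi):
            """Hypothetical model response to a standard normal input (stand-in for a code run)."""
            return np.exp(0.3 * xi) + 0.1 * xi**2

        order = 4
        nodes, weights = hermegauss(order + 1)        # probabilists' Hermite (He) quadrature rule
        weights = weights / np.sqrt(2.0 * np.pi)      # normalize against the standard normal density

        coeffs = []
        for k in range(order + 1):
            basis_k = hermeval(nodes, [0.0] * k + [1.0])          # He_k evaluated at the nodes
            # Spectral projection: c_k = E[response * He_k] / E[He_k^2], with E[He_k^2] = k!
            coeffs.append(np.sum(weights * response(nodes) * basis_k) / factorial(k))

        mean = coeffs[0]
        variance = sum(c**2 * factorial(k) for k, c in enumerate(coeffs[1:], start=1))
        print(f"PCE mean = {mean:.4f}, PCE variance = {variance:.4f}")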

  11. Probabilistic floodplain hazard mapping: managing uncertainty by using a bivariate approach for flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2014-05-01

    Floods are a global problem and are considered the most frequent natural disaster world-wide. Many studies show that the severity and frequency of floods have increased in recent years and underline the difficulty of separating the effects of natural climatic changes from human influences such as land management practices, urbanization etc. Flood risk analysis and assessment is required to provide information on current or future flood hazard and risks in order to accomplish flood risk mitigation and to propose, evaluate and select measures to reduce it. Both components of risk can be mapped individually and are affected by multiple uncertainties, as is the joint estimate of flood risk. Major sources of uncertainty include the statistical analysis of extreme events, the definition of the hydrological input, the representation of channel and floodplain topography, and the choice of effective hydraulic roughness coefficients. The classical procedure to estimate flood discharge for a chosen probability of exceedance is to use a rainfall-runoff model, associating with the risk the same return period as the original rainfall, in accordance with the iso-frequency criterion. Alternatively, a flood frequency analysis is applied to a given record of discharge data, but again the same probability is associated with flood discharges and the respective risk. Moreover, since flood peaks and corresponding flood volumes are variables of the same phenomenon, they are directly correlated and, consequently, multivariate statistical analyses must be applied. This study presents an innovative approach to obtain flood hazard maps in which the hydrological input (synthetic flood design event) to a 2D hydraulic model has been defined by generating flood peak discharges and volumes from: a) a classical univariate approach, b) a bivariate statistical analysis, through the use of copulas. The univariate approach considers flood hydrograph generation by an indirect approach (rainfall-runoff transformation using input rainfall
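
    A minimal sketch of the bivariate step: jointly sampling flood peak and volume through a copula with fitted marginals. A Gaussian copula is used here purely for illustration, and the marginal distributions and correlation are hypothetical, not the study's fitted values.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        # Hypothetical marginals: Gumbel-distributed peak (m^3/s) and lognormal volume (10^6 m^3)
        peak_marginal = stats.gumbel_r(loc=250.0, scale=80.0)
        volume_marginal = stats.lognorm(s=0.4, scale=12.0)

        # Gaussian copula with a hypothetical correlation between peak and volume
        rho = 0.7
        cov = [[1.0, rho], [rho, 1.0]]
        z = rng.multivariate_normal([0.0, 0.0], cov, size=5000)
        u = stats.norm.cdf(z)                             # copula samples on the unit square

        peaks = peak_marginal.ppf(u[:, 0])
        volumes = volume_marginal.ppf(u[:, 1])
        # Each (peak, volume) pair defines one synthetic design hydrograph for the 2D hydraulic model.
        print(peaks[:3], volumes[:3])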

  12. Evaluating the use of uncertainty visualization for exploratory analysis of land cover change: A qualitative expert user study

    NASA Astrophysics Data System (ADS)

    Kinkeldey, Christoph; Schiewe, Jochen; Gerstmann, Henning; Götze, Christian; Kit, Oleksandr; Lüdeke, Matthias; Taubenböck, Hannes; Wurm, Michael

    2015-11-01

    Extensive research on geodata uncertainty has been conducted in the past decades, mostly related to modeling, quantifying, and communicating uncertainty. But findings on whether and how users can incorporate this information into spatial analyses are still rare. In this paper we address these questions with a focus on land cover change analysis. We conducted semi-structured interviews with three expert groups dealing with change analysis in the fields of climate research, urban development, and vegetation monitoring. During the interviews we used a software prototype to show change scenarios that the experts had analyzed before, extended by a visual depiction of uncertainty related to land cover change. This paper describes the study, summarizes results, and discusses findings as well as the study method. Participants came up with several ideas for applications that could be supported by uncertainty information, for example, identification of erroneous change, description of change detection algorithm characteristics, or optimization of change detection parameters. Regarding the aspect of reasoning with uncertainty in land cover change data, the interviewees saw potential in better-informed hypotheses and insights about change. Communication of uncertainty information to users was seen as critical, depending on the users' role and expertise. We judge semi-structured interviews to be suitable for the purpose of this study and emphasize the potential of qualitative methods (workshops, focus groups etc.) for future uncertainty visualization studies.

  13. Statistical uncertainty analysis applied to the DRAGONv4 code lattice calculations and based on JENDL-4 covariance data

    SciTech Connect

    Hernandez-Solis, A.; Demaziere, C.; Ekberg, C.; Oedegaard-Jensen, A.

    2012-07-01

    In this paper, multi-group microscopic cross-section uncertainty is propagated through the DRAGON (Version 4) lattice code, in order to perform uncertainty analysis on k-infinity and 2-group homogenized macroscopic cross-section predictions. A statistical methodology is employed for such purposes, where cross-sections of certain isotopes of various elements belonging to the 172-group DRAGLIB library format are considered as normal random variables. This library is based on JENDL-4 data, because JENDL-4 contains the largest amount of isotopic covariance matrices among the different major nuclear data libraries. The aim is to propagate multi-group nuclide uncertainty by running the DRAGONv4 code 500 times, and to assess the output uncertainty of a test case corresponding to a 17 x 17 PWR fuel assembly segment without poison. The chosen sampling strategy for the current study is Latin Hypercube Sampling (LHS). The quasi-random LHS allows a much better coverage of the input uncertainties than simple random sampling (SRS) because it densely stratifies across the range of each input probability distribution. Output uncertainty assessment is based on the tolerance limits concept, where the sample formed by the code calculations is inferred to cover 95% of the output population with at least 95% confidence. This analysis is the first attempt to propagate parameter uncertainties of modern multi-group libraries, which are used to feed advanced lattice codes that perform state of the art resonant self-shielding calculations such as DRAGONv4. (authors)
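
    A minimal sketch of the Latin Hypercube Sampling strategy named above, applied to normally distributed inputs; the parameter means and standard deviations are hypothetical perturbation factors, not the JENDL-4 covariance data.

        import numpy as np
        from scipy import stats

        def latin_hypercube(n_samples, n_params, rng):
            """One stratified uniform sample per equal-probability stratum, for each parameter."""
            samples = np.empty((n_samples, n_params))
            for j in range(n_params):
                perm = rng.permutation(n_samples)                  # decouple strata across parameters
                samples[:, j] = (perm + rng.random(n_samples)) / n_samples
            return samples

        rng = np.random.default_rng(42)
        u = latin_hypercube(n_samples=500, n_params=3, rng=rng)

        # Map the uniform LHS design onto hypothetical normal cross-section perturbation factors
        means = np.array([1.0, 1.0, 1.0])
        sigmas = np.array([0.02, 0.05, 0.01])
        samples = stats.norm.ppf(u) * sigmas + means   # 500 input sets, one lattice-code run each
        print(samples.shape)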

  14. An uncertainty analysis of the flood-stage upstream from a bridge.

    PubMed

    Sowiński, M

    2006-01-01

    The paper begins with the formulation of the problem in the form of a general performance function. Next, the Latin hypercube sampling (LHS) technique, a modified version of the Monte Carlo method, is briefly described. The actual uncertainty analysis of the flood-stage upstream from a bridge starts with a description of the hydraulic model. The model concept is based on the HEC-RAS model developed for subcritical flow under a bridge without piers, in which the energy equation is applied. The next section characterizes the basic variables, including a specification of their statistics (means and variances). Next, the problem of correlated variables is discussed and assumptions concerning correlation among the basic variables are formulated. The analysis of results is based on LHS ranking lists obtained from the computer package UNCSAM. Results for two examples are given: one for independent and the other for correlated variables. PMID:16532737

  15. Effect of soil property uncertainties on permafrost thaw projections: a calibration-constrained analysis: Modeling Archive

    DOE Data Explorer

    J.C. Rowland; D.R. Harp; C.J. Wilson; A.L. Atchley; V.E. Romanovsky; E.T. Coon; S.L. Painter

    2016-02-02

    This Modeling Archive is in support of an NGEE Arctic publication available at doi:10.5194/tc-10-341-2016. This dataset contains an ensemble of thermal-hydro soil parameters including porosity, thermal conductivity, thermal conductivity shape parameters, and residual saturation of peat and mineral soil. The ensemble was generated using a Null-Space Monte Carlo analysis of parameter uncertainty based on a calibration to soil temperatures collected at the Barrow Environmental Observatory site by the NGEE team. The micro-topography of ice wedge polygons present at the site is included in the analysis using three 1D column models to represent polygon center, rim and trough features. The Arctic Terrestrial Simulator (ATS) was used in the calibration to model multiphase thermal and hydrological processes in the subsurface.

  16. Efficient uncertainty quantification in stochastic finite element analysis based on functional principal components

    NASA Astrophysics Data System (ADS)

    Bianchini, Ilaria; Argiento, Raffaele; Auricchio, Ferdinando; Lanzarone, Ettore

    2015-09-01

    The great influence of uncertainties on the behavior of physical systems has always drawn attention to the importance of a stochastic approach to engineering problems. Accordingly, in this paper, we address the problem of solving a Finite Element analysis in the presence of uncertain parameters. We consider an approach in which several solutions of the problem are obtained in correspondence with samples of the parameters, and propose a novel non-intrusive method, which exploits functional principal component analysis, to keep the computational effort acceptable. Indeed, the proposed approach allows constructing an optimal basis of the solution space and projecting the full Finite Element problem into a smaller space spanned by this basis. Solving the problem in this reduced space is computationally convenient, and very good approximations are obtained, as verified by upper bounding the error between the full Finite Element solution and the reduced one. Finally, we assess the applicability of the proposed approach through different test cases, obtaining satisfactory results.

  17. Fission Spectrum Related Uncertainties

    SciTech Connect

    G. Aliberti; I. Kodeli; G. Palmiotti; M. Salvatores

    2007-10-01

    The paper presents a preliminary uncertainty analysis related to potential uncertainties on the fission spectrum data. Consistent results are shown for a reference fast reactor design configuration and for experimental thermal configurations. However the results obtained indicate the need for further analysis, in particular in terms of fission spectrum uncertainty data assessment.

  18. Uncertainty analysis in 3D global models: Aerosol representation in MOZART-4

    NASA Astrophysics Data System (ADS)

    Gasore, J.; Prinn, R. G.

    2012-12-01

    The Probabilistic Collocation Method (PCM) has been proven to be an efficient general method of uncertainty analysis in atmospheric models (Tatang et al. 1997; Cohen & Prinn 2011). However, its application has been mainly limited to urban- and regional-scale models and chemical source-sink models, because of the drastic increase in computational cost when the dimension of uncertain parameters increases. Moreover, the high-dimensional output of global models has to be reduced to allow a computationally reasonable number of polynomials to be generated. This dimensional reduction has been mainly achieved by grouping the model grids into a few regions based on prior knowledge and expectations, urban versus rural for instance. As the model output is used to estimate the coefficients of the polynomial chaos expansion (PCE), the arbitrariness in the regional aggregation can generate problems in estimating uncertainties. To address these issues in a complex model, we apply the probabilistic collocation method of uncertainty analysis to the aerosol representation in MOZART-4, which is a 3D global chemical transport model (Emmons et al., 2010). Thereafter, we deterministically delineate the model output surface into regions of homogeneous response using the method of Principal Component Analysis. This allows the quantification of the uncertainty associated with the dimensional reduction. Because only a bulk mass is calculated online in MOZART-4, a lognormal number distribution is assumed with a priori fixed scale and location parameters, to calculate the surface area for heterogeneous reactions involving tropospheric oxidants. We have applied the PCM to the six parameters of the lognormal number distributions of Black Carbon, Organic Carbon and Sulfate. We have carried out a Monte-Carlo sampling from the probability density functions of the six uncertain parameters, using the reduced PCE model. The global mean concentration of major tropospheric oxidants did not show a

  19. Voxel-based statistical analysis of uncertainties associated with deformable image registration

    NASA Astrophysics Data System (ADS)

    Li, Shunshan; Glide-Hurst, Carri; Lu, Mei; Kim, Jinkoo; Wen, Ning; Adams, Jeffrey N.; Gordon, James; Chetty, Indrin J.; Zhong, Hualiang

    2013-09-01

    Deformable image registration (DIR) algorithms have inherent uncertainties in their displacement vector fields (DVFs). The purpose of this study is to develop an optimal metric to estimate DIR uncertainties. Six computational phantoms have been developed from the CT images of lung cancer patients using a finite element method (FEM). The FEM-generated DVFs were used as a standard for registrations performed on each of these phantoms. A mechanics-based metric, unbalanced energy (UE), was developed to evaluate these registration DVFs. The potential correlation between UE and DIR errors was explored using multivariate analysis, and the results were validated by a landmark approach and compared with two other error metrics: DVF inverse consistency (IC) and image intensity difference (ID). Landmark-based validation was performed using the POPI model. The results show that the Pearson correlation coefficient between UE and DIR error is r(UE, error) = 0.50. This is higher than r(IC, error) = 0.29 for IC and DIR error and r(ID, error) = 0.37 for ID and DIR error. The Pearson correlation coefficient between UE and the product of the DIR displacements and errors is r(UE, error × DVF) = 0.62 for the six patients and r(UE, error × DVF) = 0.73 for the POPI-model data. It has been demonstrated that UE has a strong correlation with DIR errors, and the UE metric outperforms the IC and ID metrics in estimating DIR uncertainties. The quantified UE metric can be a useful tool for adaptive treatment strategies, including probability-based adaptive treatment planning.

  20. Uncertainty Analysis And Synergy Of Aerosol Products From Multiple Satellite Sensors For Advanced Atmospheric Research

    NASA Astrophysics Data System (ADS)

    Ichoku, C. M.; Petrenko, M.

    2013-05-01

    Aerosols are tiny particles suspended in the air, and can be made up of wind-blown dust, smoke from fires, and particulate emissions from automobiles, industries, and other natural and man-made sources. Aerosols can have significant impacts on air quality, and can interact with clouds and solar radiation in such a way as to affect the water cycle and climate. However, the extent and scale of these impacts are still poorly understood, and this represents one of the greatest uncertainties in climate research to date. To fill this gap in our knowledge, the global and local properties of atmospheric aerosols are being extensively observed and measured, especially during the last decade, using both satellite and ground-based instruments, including such spaceborne sensors as MODIS on the Terra and Aqua satellites, MISR on Terra, OMI on Aura, POLDER on PARASOL, CALIOP on CALIPSO, SeaWiFS on SeaStar, and the ground-based Aerosol Robotic Network (AERONET) of sunphotometers. The aerosol measurements collected by these instruments over the last decade contribute to an unprecedented availability of the most complete set of complementary aerosol measurements ever acquired. Still, to be able to utilize these measurements synergistically, they have to be carefully and uniformly analyzed and inter-compared, in order to understand the uncertainties and limitations of the products - a process that is greatly complicated by the diversity of differences that exist among them. In this presentation, we will show results of a coherent comparative uncertainty analysis of aerosol measurements from the above-named satellite sensors relative to AERONET. We use these results to demonstrate how these sensors perform in different parts of the world over different landcover types as well as their performance relative to one another, thereby facilitating product selection and integration for specific research and applications needs.

  1. A Variance Decomposition Approach to Uncertainty Quantification and Sensitivity Analysis of the J&E Model

    PubMed Central

    Moradi, Ali; Tootkaboni, Mazdak; Pennell, Kelly G.

    2015-01-01

    The Johnson and Ettinger (J&E) model is the most widely used vapor intrusion model in the United States. It is routinely used as part of hazardous waste site assessments to evaluate the potential for vapor intrusion exposure risks. This study incorporates mathematical approaches that allow sensitivity and uncertainty of the J&E model to be evaluated. In addition to performing Monte Carlo simulations to examine the uncertainty in the J&E model output, a powerful global sensitivity analysis technique based on Sobol indices is used to evaluate J&E model sensitivity to variations in the input parameters. The results suggest that the J&E model is most sensitive to the building air exchange rate, regardless of soil type and source depth. Building air exchange rate is not routinely measured during vapor intrusion investigations, but clearly improved estimates and/or measurements of the air exchange rate would lead to improved model predictions. It is also found that the J&E model is more sensitive to effective diffusivity than to effective permeability. Field measurements of effective diffusivity are not commonly collected during vapor intrusion investigations; however, consideration of this parameter warrants additional attention. Finally, the effects of input uncertainties on model predictions for different scenarios (e.g. sandy soil as compared to clayey soil, and “shallow” sources as compared to “deep” sources) are evaluated. Our results not only identify the range of variability to be expected depending on the scenario at hand, but also mark the important cases where special care is needed when estimating the input parameters to which the J&E model is most sensitive. PMID:25947051
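
    A minimal sketch of estimating Sobol sensitivity indices by Monte Carlo with a pick-and-freeze design, applied to a hypothetical stand-in function rather than the J&E model itself.

        import numpy as np

        rng = np.random.default_rng(7)

        def model(x):
            """Hypothetical stand-in response (not the J&E model)."""
            return 5.0 * x[:, 0] + x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

        n, d = 20000, 3
        A = rng.random((n, d))
        B = rng.random((n, d))
        yA, yB = model(A), model(B)
        var_y = np.concatenate([yA, yB]).var()

        for i in range(d):
            ABi = A.copy()
            ABi[:, i] = B[:, i]                              # A with column i taken from B
            yABi = model(ABi)
            Si = np.mean(yB * (yABi - yA)) / var_y           # first-order index (Saltelli estimator)
            STi = 0.5 * np.mean((yA - yABi) ** 2) / var_y    # total-effect index (Jansen estimator)
            print(f"parameter {i + 1}: S1 = {Si:.2f}, ST = {STi:.2f}")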

  2. Analysis of regression confidence intervals and Bayesian credible intervals for uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Lu, Dan; Ye, Ming; Hill, Mary C.

    2012-09-01

    Confidence intervals based on classical regression theories augmented to include prior information and credible intervals based on Bayesian theories are conceptually different ways to quantify parametric and predictive uncertainties. Because both confidence and credible intervals are used in environmental modeling, we seek to understand their differences and similarities. This is of interest in part because calculating confidence intervals typically requires tens to thousands of model runs, while Bayesian credible intervals typically require tens of thousands to millions of model runs. Given multi-Gaussian distributed observation errors, our theoretical analysis shows that, for linear or linearized-nonlinear models, confidence and credible intervals are always numerically identical when consistent prior information is used. For nonlinear models, nonlinear confidence and credible intervals can be numerically identical if parameter confidence regions defined using the approximate likelihood method and parameter credible regions estimated using Markov chain Monte Carlo realizations are numerically identical and predictions are a smooth, monotonic function of the parameters. Both occur if intrinsic model nonlinearity is small. While the conditions of Gaussian errors and small intrinsic model nonlinearity are violated by many environmental models, heuristic tests using analytical and numerical models suggest that linear and nonlinear confidence intervals can be useful approximations of uncertainty even under significantly nonideal conditions. In the context of epistemic model error for a complex synthetic nonlinear groundwater problem, the linear and nonlinear confidence and credible intervals for individual models performed similarly enough to indicate that the computationally frugal confidence intervals can be useful in many circumstances. Experiences with these groundwater models are expected to be broadly applicable to many environmental models. We suggest that for
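
    A minimal sketch contrasting the two interval types for a one-parameter linear model with Gaussian errors, known error variance, and a flat prior, a setting in which the two intervals coincide as the paper notes for linear models with consistent priors; the data are hypothetical.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical data from y = theta * x + Gaussian noise with known sigma
        x = np.linspace(1.0, 10.0, 20)
        sigma = 0.5
        y = 2.0 * x + rng.normal(0.0, sigma, size=x.size)

        theta_hat = np.sum(x * y) / np.sum(x * x)        # least-squares / maximum-likelihood estimate
        se = sigma / np.sqrt(np.sum(x * x))              # standard error with known sigma

        # Classical 95% confidence interval (linear model, Gaussian errors)
        ci = (theta_hat - 1.96 * se, theta_hat + 1.96 * se)

        # Bayesian 95% credible interval with a flat prior: the posterior is N(theta_hat, se^2),
        # so Monte Carlo sampling from the posterior reproduces the same interval numerically.
        posterior_draws = rng.normal(theta_hat, se, size=200000)
        cred = np.percentile(posterior_draws, [2.5, 97.5])

        print(f"confidence interval: {ci[0]:.3f} to {ci[1]:.3f}")
        print(f"credible interval:   {cred[0]:.3f} to {cred[1]:.3f}")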

  3. A variance decomposition approach to uncertainty quantification and sensitivity analysis of the Johnson and Ettinger model.

    PubMed

    Moradi, Ali; Tootkaboni, Mazdak; Pennell, Kelly G

    2015-02-01

    The Johnson and Ettinger (J&E) model is the most widely used vapor intrusion model in the United States. It is routinely used as part of hazardous waste site assessments to evaluate the potential for vapor intrusion exposure risks. This study incorporates mathematical approaches that allow sensitivity and uncertainty of the J&E model to be evaluated. In addition to performing Monte Carlo simulations to examine the uncertainty in the J&E model output, a powerful global sensitivity analysis technique based on Sobol indices is used to evaluate J&E model sensitivity to variations in the input parameters. The results suggest that the J&E model is most sensitive to the building air exchange rate, regardless of soil type and source depth. Building air exchange rate is not routinely measured during vapor intrusion investigations, but clearly improved estimates and/or measurements of the air exchange rate would lead to improved model predictions. It is also found that the J&E model is more sensitive to effective diffusivity than to effective permeability. Field measurements of effective diffusivity are not commonly collected during vapor intrusion investigations; however, consideration of this parameter warrants additional attention. Finally, the effects of input uncertainties on model predictions for different scenarios (e.g., sandy soil as compared to clayey soil, and "shallow" sources as compared to "deep" sources) are evaluated. Our results not only identify the range of variability to be expected depending on the scenario at hand, but also mark the important cases where special care is needed when estimating the input parameters to which the J&E model is most sensitive. PMID:25947051

  4. Development code for sensitivity and uncertainty analysis of input on the MCNPX for neutronic calculation in PWR core

    SciTech Connect

    Hartini, Entin; Andiwijayakusuma, Dinan

    2014-09-30

    This research concerns the development of a code for uncertainty analysis based on a statistical approach for assessing uncertainty in input parameters. In the fuel burn-up calculation, uncertainty analysis is performed for the input parameters fuel density, coolant density and fuel temperature. This calculation is performed during irradiation using the Monte Carlo N-Particle Transport code. The uncertainty method is based on the probability density functions of the parameters. The code is developed as a Python script that couples with MCNPX for criticality and burn-up calculations. Simulation is done by modeling the geometry of the PWR core with MCNPX at a power of 54 MW with UO2 pellet fuel. The calculation is done using the continuous-energy cross-section data library ENDF/B-VI. MCNPX requires nuclear data in ACE format. Interfaces were developed for obtaining nuclear data in ACE format from ENDF through a special NJOY calculation process for temperature changes over a certain range.

  5. COMPUTATIONAL METHODS FOR SENSITIVITY AND UNCERTAINTY ANALYSIS FOR ENVIRONMENTAL AND BIOLOGICAL MODELS

    EPA Science Inventory

    This work introduces a computationally efficient alternative method for uncertainty propagation, the Stochastic Response Surface Method (SRSM). The SRSM approximates uncertainties in model outputs through a series expansion in normal random variables (polynomial chaos expansion)...

  6. Climate change impacts on extreme events in the United States: an uncertainty analysis

    EPA Science Inventory

    Extreme weather and climate events, such as heat waves, droughts and severe precipitation events, have substantial impacts on ecosystems and the economy. However, future climate simulations display large uncertainty in mean changes. As a result, the uncertainty in future changes ...

  7. Multi-Dimensional, Discrete-Ordinates Based Cross Section Sensitivity and Uncertainty Analysis Code System.

    Energy Science and Technology Software Center (ESTSC)

    2008-05-22

    are the latest versions available from NEA-DB). o The memory and data management were updated, as well as the language level (the code was rewritten from Fortran-77 to Fortran-95). SUSD3D is coupled to several discrete-ordinates codes via binary interface files. SUSD3D can use the flux moment files produced by the discrete ordinates codes ANISN, DORT, TORT, ONEDANT, TWODANT, and THREEDANT. In some of these codes minor modifications are required. Variable dimensions used in the TORT-DORT system are supported. In 3D analysis the geometry and material composition are taken directly from the TORT-produced VARSCL binary file, reducing in this way the user's input to SUSD3D. Multigroup cross-section sets are read in the GENDF format of the NJOY/GROUPR code system, and the covariance data are expected in the COVFIL format of NJOY/ERRORR or the COVERX format of PUFF-2. The ZZ-VITAMIN-J/COVA cross section covariance matrix library can be used as an alternative to the NJOY code system. The package includes the ANGELO code to produce the covariance data in the required energy structure in the COVFIL format. The following cross section processing modules to be added to the NJOY-94 code system are included in the package: o ERR34: an extension of the ERRORR module of the NJOY code system for File-34 processing, used to prepare multigroup SAD cross-section covariance matrices. o GROUPSR: an additional code module for the preparation of partial cross sections for SAD sensitivity analysis; an updated version of the same code from SUSD, extended to the ENDF-6 format. o SEADR: an additional code module to prepare group covariance matrices for SAD/SED uncertainty analysis.

  8. Automation and uncertainty analysis of a method for in-vivo range verification in particle therapy

    NASA Astrophysics Data System (ADS)

    Frey, K.; Unholtz, D.; Bauer, J.; Debus, J.; Min, C. H.; Bortfeld, T.; Paganetti, H.; Parodi, K.

    2014-10-01

    We introduce the automation of the range difference calculation deduced from particle-irradiation induced β+-activity distributions with the so-called most-likely-shift approach, and evaluate its reliability via the monitoring of algorithm- and patient-specific uncertainty factors. The calculation of the range deviation is based on the minimization of the absolute profile differences in the distal part of two activity depth profiles shifted against each other. Depending on the workflow of positron emission tomography (PET)-based range verification, the two profiles under evaluation can correspond to measured and simulated distributions, or only measured data from different treatment sessions. In comparison to previous work, the proposed approach includes an automated identification of the distal region of interest for each pair of PET depth profiles and under consideration of the planned dose distribution, resulting in the optimal shift distance. Moreover, it introduces an estimate of uncertainty associated to the identified shift, which is then used as weighting factor to ‘red flag’ problematic large range differences. Furthermore, additional patient-specific uncertainty factors are calculated using available computed tomography (CT) data to support the range analysis. The performance of the new method for in-vivo treatment verification in the clinical routine is investigated with in-room PET images for proton therapy as well as with offline PET images for proton and carbon ion therapy. The comparison between measured PET activity distributions and predictions obtained by Monte Carlo simulations or measurements from previous treatment fractions is performed. For this purpose, a total of 15 patient datasets were analyzed, which were acquired at Massachusetts General Hospital and Heidelberg Ion-Beam Therapy Center with in-room PET and offline PET/CT scanners, respectively. Calculated range differences between the compared activity distributions are reported in
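
    A minimal sketch of the shift-finding idea described above: the range difference is taken as the shift that minimizes the summed absolute difference between two activity depth profiles over a distal window. The profiles, window, and shift grid below are hypothetical, not the clinical data or the full most-likely-shift implementation.

        import numpy as np

        def most_likely_shift(depths, reference, measured, window, shifts):
            """Return the shift minimizing the mean absolute profile difference in the distal window."""
            costs = []
            for s in shifts:
                shifted = np.interp(depths - s, depths, measured)   # measured profile moved by s
                costs.append(np.mean(np.abs(reference[window] - shifted[window])))
            costs = np.array(costs)
            return shifts[int(np.argmin(costs))], costs

        depths = np.arange(0.0, 200.0, 1.0)                     # mm
        reference = np.exp(-((depths - 120.0) / 25.0) ** 2)     # hypothetical predicted activity profile
        measured = np.exp(-((depths - 123.0) / 25.0) ** 2)      # hypothetical measured profile, 3 mm deeper
        distal = (depths > 110.0) & (depths < 160.0)            # hypothetical distal region of interest
        shifts = np.arange(-10.0, 10.5, 0.5)

        best, costs = most_likely_shift(depths, reference, measured, distal, shifts)
        print(f"estimated range difference = {best:.1f} mm")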

  9. TRITIUM UNCERTAINTY ANALYSIS FOR SURFACE WATER SAMPLES AT THE SAVANNAH RIVER SITE

    SciTech Connect

    Atkinson, R.

    2012-07-31

    Radiochemical analyses of surface water samples, in the framework of Environmental Monitoring, have associated uncertainties for the radioisotopic results reported. These uncertainty analyses pertain to the tritium results from surface water samples collected at five locations on the Savannah River near the U.S. Department of Energy's Savannah River Site (SRS). Uncertainties can result from the field-sampling routine, can be incurred during transport due to the physical properties of the sample, and can arise from equipment limitations and from the measurement instrumentation used. The uncertainty reported by the SRS in its Annual Site Environmental Report currently considers only the counting uncertainty in the measurements, which is the standard reporting protocol for radioanalytical chemistry results. The focus of this work is to provide an overview of all uncertainty components associated with SRS tritium measurements, estimate the total uncertainty according to ISO 17025, and propose additional experiments to verify some of the estimated uncertainties. The main uncertainty components discovered and investigated in this paper are tritium absorption or desorption in the sample container, the HTO/H2O isotopic effect during distillation, pipette volume, and tritium standard uncertainty. The goal is to quantify these uncertainties and to establish a combined uncertainty in order to increase the scientific depth of the SRS Annual Site Environmental Report.
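
    A minimal sketch of combining independent uncertainty components in quadrature into a combined and expanded uncertainty, in the general spirit of an ISO 17025 uncertainty budget; the component names and values are hypothetical, not the SRS budget.

        import math

        # Hypothetical relative standard uncertainty components for a tritium result (fractions)
        components = {
            "counting statistics":          0.020,
            "container sorption losses":    0.010,
            "distillation isotope effect":  0.008,
            "pipette volume":               0.004,
            "tritium standard":             0.012,
        }

        combined = math.sqrt(sum(u ** 2 for u in components.values()))   # root-sum-of-squares
        expanded = 2.0 * combined                                        # coverage factor k = 2 (~95%)

        result = 850.0                                                   # hypothetical tritium activity (pCi/L)
        print(f"combined relative uncertainty = {combined:.3f}")
        print(f"reported value = {result:.0f} +/- {expanded * result:.0f} pCi/L (k=2)")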

  10. A comparison of five forest interception models using global sensitivity and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Linhoss, Anna C.; Siegert, Courtney M.

    2016-07-01

    Interception by the forest canopy plays a critical role in the hydrologic cycle by removing a significant portion of incoming precipitation from the terrestrial component. While there are a number of existing physical models of forest interception, few studies have summarized or compared these models. The objective of this work is to use global sensitivity and uncertainty analysis to compare five mechanistic interception models including the Rutter, Rutter Sparse, Gash, Sparse Gash, and Liu models. Using parameter probability distribution functions of values from the literature, our results show that on average storm duration [Dur], gross precipitation [PG], canopy storage [S] and solar radiation [Rn] are the most important model parameters. On the other hand, empirical parameters used in calculating evaporation and drip (i.e. trunk evaporation as a proportion of evaporation from the saturated canopy [ɛ], the empirical drainage parameter [b], the drainage partitioning coefficient [pd], and the rate of water dripping from the canopy when canopy storage has been reached [Ds]) have relatively low levels of importance in interception modeling. As such, future modeling efforts should aim to decompose parameters that are the most influential in determining model outputs into easily measurable physical components. Because this study compares models, the choices regarding the parameter probability distribution functions are applied across models, which enables a more definitive ranking of model uncertainty.

  11. The Null Space Monte Carlo Uncertainty Analysis of Heterogeneity for Preferential Flow Simulation

    NASA Astrophysics Data System (ADS)

    Ghasemizade, M.; Radny, D.; Schirmer, M.

    2014-12-01

    Preferential flow paths can have a huge impact on the amount and timing of runoff generation, particularly in areas where subsurface flow dominates this process. Many different approaches have been suggested for simulating preferential flow mechanisms. However, the efficiency of such approaches is rarely investigated in a predictive sense. The main reason is that the models which are used to simulate preferential flows require many parameters. This can lead to a dramatic increase in model run times, especially in the context of highly nonlinear models which are themselves demanding. In this research we attempted to simulate the daily recharge values of a weighing lysimeter, including preferential flows, with the 3-D physically based model HydroGeoSphere. To accomplish that, we used the matrix pore concept with varying hydraulic conductivities within the lysimeter to represent heterogeneity. It was assumed that spatially correlated heterogeneity is the main driver triggering preferential flow paths. In order to capture the spatial distribution of hydraulic conductivity values we used pilot points and geostatistical model structures. Since the hydraulic conductivity values at each pilot point act as parameters, the model is highly parameterized. For this reason, we used the robust and newly developed null-space Monte Carlo method for analyzing the uncertainty of the model outputs. Results of the uncertainty analysis show that the pilot point method is reliable for representing preferential flow paths.

  12. Risk-cost-benefit analysis for transportation corridors with interval uncertainties of heterogeneous data.

    PubMed

    Xu, Junrui; Lambert, James H

    2015-04-01

    Access management, which systematically limits opportunities for egress and ingress of vehicles to highway lanes, is critical to protect trillions of dollars of current investment in transportation. This article addresses allocating resources for access management with incomplete and partially relevant data on crash rates, travel speeds, and other factors. While access management can be effective to avoid crashes, reduce travel times, and increase route capacities, the literature suggests a need for performance metrics to guide investments in resource allocation across large corridor networks and several time horizons. In this article, we describe a quantitative decision model to support an access management program via risk-cost-benefit analysis under data uncertainties from diverse sources of data and expertise. The approach quantifies potential benefits, including safety improvement and travel time savings, and costs of access management through functional relationships of input parameters including crash rates, corridor access point densities, and traffic volumes. Parameter uncertainties, which vary across locales and experts, are addressed via numerical interval analyses. This approach is demonstrated at several geographic scales across 7,000 kilometers of highways in a geographic region and several subregions. The demonstration prioritizes route segments that would benefit from risk management, including (i) additional data or elicitation, (ii) right-of-way purchases, (iii) restriction or closing of access points, (iv) new alignments, (v) developer proffers, and (vi) other measures. The approach ought to be of wide interest to analysts, planners, policymakers, and stakeholders who rely on heterogeneous data and expertise for risk management. PMID:24924626

  13. Efficient Calibration/Uncertainty Analysis Using Paired Complex/Surrogate Models.

    PubMed

    Burrows, Wesley; Doherty, John

    2015-01-01

    The use of detailed groundwater models to simulate complex environmental processes can be hampered by (1) long run-times and (2) a penchant for solution convergence problems. Collectively, these can undermine the ability of a modeler to reduce and quantify predictive uncertainty, and therefore limit the use of such detailed models in the decision-making context. We explain and demonstrate a novel approach to calibration, and to the exploration of posterior predictive uncertainty, of a complex model that can overcome these problems in many modelling contexts. The methodology relies on the conjunctive use of a simplified surrogate version of the complex model in combination with the complex model itself. The methodology employs gradient-based subspace analysis and is thus readily adapted for use in highly parameterized contexts. In its most basic form, one or more surrogate models are used for calculation of the partial derivatives that collectively comprise the Jacobian matrix. Meanwhile, testing of parameter upgrades and the making of predictions is done by the original complex model. The methodology is demonstrated using a density-dependent seawater intrusion model in which the model domain is characterized by a heterogeneous distribution of hydraulic conductivity. PMID:25142272

  14. Uncertainty Analysis for the Evaluation of a Passive Runway Arresting System

    NASA Technical Reports Server (NTRS)

    Deloach, Richard; Marlowe, Jill M.; Yager, Thomas J.

    2009-01-01

    This paper considers the stopping distance of an aircraft involved in a runway overrun incident when the runway has been provided with an extension comprised of a material engineered to induce high levels of rolling friction and drag. A formula for stopping distance is derived that is shown to be the product of a known formula for the case of friction without drag, and a dimensionless constant between 0 and 1 that quantifies the further reduction in stopping distance when drag is introduced. This additional quantity, identified as the Drag Reduction Factor, D, is shown to depend on the ratio of drag force to friction force experienced by the aircraft as it enters the overrun area. The specific functional form of D is shown to depend on how drag varies with speed. A detailed uncertainty analysis is presented which reveals how the uncertainty in estimates of stopping distance is influenced by experimental error in the force measurements that are acquired in a typical evaluation experiment conducted to assess candidate overrun materials.
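
    A minimal sketch of propagating force-measurement error into a stopping-distance estimate of the structure described above (friction-only distance times a drag reduction factor D); the speeds, forces, and the placeholder form used for D are hypothetical and stand in for the paper's derived expressions.

        import numpy as np

        rng = np.random.default_rng(5)

        g = 9.81
        v0 = 70.0                     # hypothetical entry speed (m/s)
        weight = 6.0e5                # hypothetical aircraft weight (N)

        # Hypothetical force measurements from an arrestor-material evaluation test
        friction_force = rng.normal(3.0e5, 0.15e5, size=10000)   # N, with experimental scatter
        drag_force     = rng.normal(1.2e5, 0.20e5, size=10000)   # N, with experimental scatter

        mu = friction_force / weight                   # effective rolling-friction coefficient
        s_friction_only = v0**2 / (2.0 * mu * g)       # stopping distance with friction alone

        # Placeholder drag reduction factor D = 1 / (1 + drag/friction), standing in for the
        # speed-dependence-specific expressions derived in the paper.
        D = 1.0 / (1.0 + drag_force / friction_force)
        s = s_friction_only * D

        print(f"stopping distance = {s.mean():.0f} m, std. dev. from force errors = {s.std():.0f} m")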

  15. Bayesian analysis of stage-discharge relationships affected by hysteresis and quantification of the associated uncertainties

    NASA Astrophysics Data System (ADS)

    Mansanarez, Valentin; Le Coz, Jérôme; Renard, Benjamin; Lang, Michel; Birgand, François

    2015-04-01

    Moreover, the kinematic wave celerity yielded less uncertain discharges than the constant celerity option. In the absence of rating shifts, the hysteretic rating curve estimated during a given flood event can be applied to subsequent events with the same accuracy. The calibration can also be made using gaugings from different events. Furthermore, this method does not detect hysteresis when it is applied to a well-known and well-identifiable univocal stage-discharge relation. Finally, an analysis of the best gauging strategy demonstrates that, for a hysteretic flow event, the most common strategy, i.e. gauging during the falling limb near the peak flow, yields high uncertainties in the rising limb and a biased identification of the hysteresis amplitude. The best strategy is to gauge near a few remarkable points of the flood wave (minimum and maximum stage, maximum discharge, minimum and maximum stage gradient), not necessarily during a single event.
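    For context, one widely used stage-gradient correction of this kind is the kinematic-wave (Jones-type) formula, shown here only to illustrate how the stage gradient enters; it is not necessarily the exact formulation adopted in this study:

$$ Q(h,t) \;=\; Q_0(h)\,\sqrt{\,1 + \frac{1}{S_0\,c}\,\frac{\partial h}{\partial t}\,} $$

    where Q_0(h) is the steady (univocal) rating curve, S_0 the bed slope, c the kinematic wave celerity (constant or stage-dependent), and ∂h/∂t the stage gradient; the rising limb (∂h/∂t > 0) then carries more discharge than the falling limb at the same stage.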

  16. A review and classification of approaches for dealing with uncertainty in multi-criteria decision analysis for healthcare decisions.

    PubMed

    Broekhuizen, Henk; Groothuis-Oudshoorn, Catharina G M; van Til, Janine A; Hummel, J Marjan; IJzerman, Maarten J

    2015-05-01

    Multi-criteria decision analysis (MCDA) is increasingly used to support decisions in healthcare involving multiple and conflicting criteria. Although uncertainty is usually carefully addressed in health economic evaluations, less is known about whether, how, and with what methods the different sources of uncertainty are dealt with in MCDA. The objective of this study is to review how uncertainty can be explicitly taken into account in MCDA and to discuss which approach may be appropriate for healthcare decision makers. A literature review was conducted in the Scopus and PubMed databases. Two reviewers independently categorized studies according to research areas, the type of MCDA used, and the approach used to quantify uncertainty. Selected full text articles were read for methodological details. The search strategy identified 569 studies. The five approaches most identified were fuzzy set theory (45% of studies), probabilistic sensitivity analysis (15%), deterministic sensitivity analysis (31%), Bayesian framework (6%), and grey theory (3%). A large number of papers considered the analytic hierarchy process in combination with fuzzy set theory (31%). Only 3% of studies were published in healthcare-related journals. In conclusion, our review identified five different approaches to take uncertainty into account in MCDA. The deterministic approach is most likely sufficient for most healthcare policy decisions because of its low complexity and straightforward implementation. However, more complex approaches may be needed when multiple sources of uncertainty must be considered simultaneously. PMID:25630758
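    A minimal sketch of the deterministic (one-way) sensitivity analysis that the review finds sufficient for many healthcare policy decisions, applied to a weighted-sum MCDA; the alternatives, criteria scores, and weights are invented for illustration:

```python
# One-way sensitivity analysis of criterion weights in a weighted-sum MCDA.
import numpy as np

scores = np.array([        # rows: alternatives, columns: criteria (0-1 scale)
    [0.8, 0.4, 0.6],       # treatment A
    [0.5, 0.9, 0.7],       # treatment B
])
weights = np.array([0.5, 0.3, 0.2])

def ranking(w):
    """Rank alternatives by weighted-sum score (weights re-normalised)."""
    return np.argsort(-scores @ (w / w.sum()))

base = ranking(weights)
# Vary each weight by +/-20% and check whether the preferred alternative changes.
for i in range(len(weights)):
    for factor in (0.8, 1.2):
        w = weights.copy()
        w[i] *= factor
        if ranking(w)[0] != base[0]:
            print(f"Ranking is sensitive to weight {i} (factor {factor})")
```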

  17. Uncertainty Analysis of Radar and Gauge Rainfall Estimates in the Russian River Basin

    NASA Astrophysics Data System (ADS)

    Cifelli, R.; Chen, H.; Willie, D.; Reynolds, D.; Campbell, C.; Sukovich, E.

    2013-12-01

    Radar Quantitative Precipitation Estimation (QPE) has been a very important application of weather radar since it was introduced and made widely available after World War II. Although great progress has been made over the last two decades, it is still a challenging process, especially in regions of complex terrain such as the western U.S. It is also extremely difficult to make direct use of radar precipitation data in quantitative hydrologic forecasting models. To improve the understanding of rainfall estimation and distributions in the NOAA Hydrometeorology Testbed in northern California (HMT-West), extensive evaluation of radar and gauge QPE products has been performed using a set of independent rain gauge data. This study focuses on the rainfall evaluation in the Russian River Basin. The statistical properties of the different gridded QPE products will be compared quantitatively. The main emphasis of this study will be on the analysis of uncertainties of the radar and gauge rainfall products that are subject to various sources of error. The spatial variation analysis of the radar estimates is performed by measuring the statistical distribution of the radar base data, such as reflectivity, and by comparison with a rain gauge cluster. The application of mean field bias values to the radar rainfall data will also be described. The uncertainty analysis of the gauge rainfall will focus on the comparison of traditional kriging and conditional bias penalized kriging (Seo 2012) methods. This comparison is performed with the retrospective Multisensor Precipitation Estimator (MPE) system installed at the NOAA Earth System Research Laboratory. The independent gauge set will again be used as the verification tool for the newly generated rainfall products.
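    A minimal sketch of the mean-field-bias adjustment mentioned above, using synthetic placeholder values for radar and gauge accumulations; the array sizes and numbers are illustrative only:

```python
# Mean-field-bias correction of a radar QPE grid using collocated rain gauges.
import numpy as np

radar_at_gauges = np.array([4.1, 7.9, 2.6, 5.4])   # radar rainfall at gauge pixels (mm)
gauge_obs       = np.array([5.0, 9.2, 3.1, 6.0])   # gauge accumulations (mm)

# Mean field bias: ratio of total gauge rainfall to total radar rainfall
mfb = gauge_obs.sum() / radar_at_gauges.sum()

rng = np.random.default_rng(0)
radar_field = rng.gamma(2.0, 2.0, size=(100, 100))  # synthetic full radar grid (mm)
adjusted_field = mfb * radar_field                   # bias-corrected QPE grid
print(f"mean field bias = {mfb:.2f}")
```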

  18. Report on INL Activities for Uncertainty Reduction Analysis of FY11

    SciTech Connect

    G. Palmiotti; H. Hiruta; M. Salvatores

    2011-09-01

    This report presents the status of activities performed at INL under the ARC Work Package on 'Uncertainty Reduction Analyses', whose main goal is the reduction of uncertainties associated with nuclear data on neutronic integral parameters of interest for the design of advanced fast reactors under consideration by the ARC program. First, an analysis of experiments was carried out. For both JOYO (the first Japanese fast reactor) and ZPPR-9 (a large size zero power plutonium fueled experiment performed at ANL-W in Idaho) the performance of ENDF/B-VII.0 is quite satisfactory, except for the sodium void configurations of ZPPR-9, for which one has to take into account the approximations of the modeling. In fact, when a more detailed model is used (calculations performed at ANL in a companion WP), more reasonable results are obtained. A large effort was devoted to the analysis of the irradiation experiments PROFIL-1 and -2 and TRAPU, performed at the French fast reactor PHENIX. For these experiments a pre-release of the ENDF/B-VII.1 cross section files was also used, in order to provide validation feedback to the CSEWG nuclear data evaluation community. In the PROFIL experiments improvements can be observed for the ENDF/B-VII.1 capture data in 238Pu, 241Am, 244Cm, 97Mo, 151Sm, 153Eu, and for 240Pu(n,2n). On the other hand, 240,242Pu, 95Mo, 133Cs and 145Nd capture C/E results are worse. For the major actinides 235U and especially 239Pu, capture C/E's are underestimated. For fission products, 105,106Pd, 143,144Nd and 147,149Sm are significantly underestimated, while 101Ru and 151Sm are overestimated. Other C/E deviations from unity are within the combined experimental and calculated statistical uncertainty. From the TRAPU analysis, the major improvement is in the predicted 243Cm build-up, presumably due to an improved 242Cm capture evaluation. The COSMO experiment was also analyzed in order to provide useful feedback on fission cross sections. It was found out that ENDF

  19. Estimation of the uncertainties of extraction and clean-up steps in pesticide residue analysis of plant commodities.

    PubMed

    Omeroglu, P Yolci; Ambrus, A; Boyacioglu, D

    2013-01-01

    Extraction and clean-up constitute important steps in pesticide residue analysis. For the correct interpretation of analytical results, uncertainties of extraction and clean-up steps should be taken into account when the combined uncertainty of the analytical result is estimated. In the scope of this study, uncertainties of extraction and clean-up steps were investigated by spiking ¹⁴C-labelled chlorpyrifos to analytical portions of tomato, orange, apple, green bean, cucumber, jackfruit, papaya and starfruit. After each step, replicate measurements were carried out with a liquid scintillation counter. Uncertainties in extraction and clean-up steps were estimated separately for every matrix and method combination by using within-laboratory reproducibility standard deviation and were characterised with the CV of recoveries. It was observed that the uncertainty of the ethyl acetate extraction step varied between 0.8% and 5.9%. The relative standard uncertainty of the clean-up step with dispersive SPE used in the method known as QuEChERS was estimated to be around 1.5% for tomato, apple and green beans. The highest variation of 4.8% was observed in cucumber. The uncertainty of the clean-up step with gel permeation chromatography ranged between 5.3% and 13.1%, and it was relatively higher than that obtained with the dispersive SPE method. PMID:23216411
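    A minimal sketch of how the relative standard uncertainties of the two steps can be combined into a single contribution, assuming the steps are independent; the figures are taken from the ranges quoted in the abstract, not from the full paper:

```python
# Root-sum-of-squares combination of step uncertainties (CVs of recoveries).
import math

u_extraction = 0.059   # 5.9%: upper end of the ethyl acetate extraction uncertainty
u_cleanup    = 0.048   # 4.8%: dispersive SPE clean-up uncertainty for cucumber

u_combined = math.sqrt(u_extraction**2 + u_cleanup**2)
print(f"combined relative uncertainty of extraction + clean-up: {u_combined:.1%}")
```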

  20. Determination of combined measurement uncertainty via Monte Carlo analysis for the imaging spectrometer ROSIS.

    PubMed

    Lenhard, Karim

    2012-06-20

    To enable traceability of imaging spectrometer data, the associated measurement uncertainties have to be provided reliably. Here, a new tool is described for Monte-Carlo-type propagation of the measurement uncertainties that originate from the spectrometer itself. For this, an instrument model of the imaging spectrometer ROSIS is used. Combined uncertainties are then derived for radiometrically and spectrally calibrated data using a synthetic at-sensor radiance spectrum as input. By coupling this new software tool with an inverse modeling program, the measurement uncertainties are propagated for an exemplary water data product. PMID:22722281
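    A minimal sketch of Monte-Carlo-type propagation through an instrument model, in the spirit of the approach described above; the toy radiometric model (gain and dark offset) and all numbers are assumptions, and ROSIS itself involves many more effects:

```python
# Monte Carlo propagation of instrument uncertainties through a toy calibration model.
import numpy as np

rng = np.random.default_rng(42)
n_trials = 10_000

dn = 1200.0                             # raw digital number of one spectral channel
gain_nominal, gain_u = 0.05, 0.001      # calibration gain and its standard uncertainty
dark_nominal, dark_u = 40.0, 2.0        # dark signal and its standard uncertainty

gains = rng.normal(gain_nominal, gain_u, n_trials)
darks = rng.normal(dark_nominal, dark_u, n_trials)
radiance = gains * (dn - darks)         # instrument model applied per trial

print(f"radiance = {radiance.mean():.2f} +/- {radiance.std(ddof=1):.2f} (combined, k=1)")
```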

  1. Strain Gauge Balance Uncertainty Analysis at NASA Langley: A Technical Review

    NASA Technical Reports Server (NTRS)

    Tripp, John S.

    1999-01-01

    This paper describes a method to determine the uncertainties of measured forces and moments from multi-component force balances used in wind tunnel tests. A multivariate regression technique is first employed to estimate the uncertainties of the six balance sensitivities and 156 interaction coefficients derived from established balance calibration procedures. These uncertainties are then employed to calculate the uncertainties of force-moment values computed from observed balance output readings obtained during tests. Confidence and prediction intervals are obtained for each computed force and moment as functions of the actual measurands. Techniques are discussed for separate estimation of balance bias and precision uncertainties.
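    A minimal sketch of the regression step: fit calibration coefficients by least squares and propagate their covariance into the uncertainty of a computed load. A single-component linear calibration stands in for the full six-component balance with interaction terms, and all data are synthetic:

```python
# Least-squares calibration with confidence/prediction-type uncertainty propagation.
import numpy as np

applied_load = np.linspace(0, 100, 21)                    # known calibration loads
readings = 0.02 * applied_load + np.random.default_rng(1).normal(0, 0.01, 21)

X = np.column_stack([np.ones_like(readings), readings])   # design matrix (intercept, reading)
beta, res, *_ = np.linalg.lstsq(X, applied_load, rcond=None)
dof = len(applied_load) - 2
sigma2 = res[0] / dof                                      # residual variance
cov_beta = sigma2 * np.linalg.inv(X.T @ X)                 # covariance of fitted coefficients

x_new = np.array([1.0, 1.2])                               # new balance reading (with intercept)
load_est = x_new @ beta
u_conf = np.sqrt(x_new @ cov_beta @ x_new)                 # uncertainty of the fitted mean
u_pred = np.sqrt(sigma2 + u_conf**2)                       # adds single-reading scatter
print(f"load = {load_est:.2f} +/- {u_pred:.2f} (prediction, k=1)")
```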

  2. Uncertainty analysis on the design of thermal conductivity measurement by a guarded cut-bar technique

    NASA Astrophysics Data System (ADS)

    Xing, Changhu; Jensen, Colby; Ban, Heng; Phillips, Jeffrey

    2011-07-01

    A technique adapted from the guarded-comparative-longitudinal heat flow method was selected for the measurement of the thermal conductivity of a nuclear fuel compact over a temperature range characteristic of its usage. This technique fulfills the requirement for non-destructive measurement of the composite compact. Although numerous measurement systems have been created based on the guarded-comparative method, the systematic (bias) and measurement (precision) uncertainties associated with this technique have not been comprehensively analyzed. In addition to the geometric effects on the bias error, which have been analyzed previously, this paper studies the working conditions, another potential source of error. Using finite element analysis, this study shows the effect of these two types of error source on the thermal conductivity measurement and the limitations they impose on the selection of various design parameters, considering their effect on the precision error. The results and conclusions provide a valuable reference for designing and operating an experimental measurement system based on this technique.

  3. The uncertainty recovery analysis for interdependent infrastructure systems using the dynamic inoperability input-output model

    NASA Astrophysics Data System (ADS)

    Xu, Wenping; Wang, Zongjun; Hong, Liu; He, Ligang; Chen, Xueguang

    2015-05-01

    In this paper, an innovative modelling framework is proposed to conduct the uncertainty recovery analysis for interdependent infrastructure sectors based on the dynamic inoperability input-output model (DIIM). The DIIM captures the inoperability of infrastructure systems, and can therefore easily analyse how perturbations propagate among interconnected infrastructures and how to implement effective mitigation efforts after a disaster. In this paper, based on the random recovery time distribution, we apply Monte Carlo simulation to obtain the distributions of the economic losses for the critical interdependent infrastructure sectors after a disaster. The proposed method can provide decision-makers with guidance on making suitable risk-management decisions and on how the risks can be mitigated, if the disaster cannot be avoided in the first place.
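    A minimal sketch of a Monte Carlo recovery analysis built on a discrete-time DIIM recursion; the interdependency matrix, resilience coefficients, sector outputs, and recovery-time distribution are invented for illustration and do not reproduce the paper's data:

```python
# Monte Carlo economic-loss distribution with a two-sector dynamic inoperability
# input-output model: q(k+1) = q(k) + K [A* q(k) - q(k)] once the external
# perturbation has passed.
import numpy as np

rng = np.random.default_rng(0)
A_star = np.array([[0.0, 0.3],          # interdependency matrix: how inoperability
                   [0.2, 0.0]])         # propagates between the two sectors
x_hat = np.array([100.0, 80.0])         # "as-planned" sector outputs (economic scale)
q0 = np.array([0.4, 0.1])               # initial inoperability after the disaster

losses = []
for _ in range(2000):
    recovery_time = rng.uniform(20, 60, size=2)   # random recovery times per sector
    K = np.diag(1.0 / recovery_time)              # resilience (recovery-rate) matrix
    q = q0.copy()
    total_loss = 0.0
    for _ in range(200):                          # discrete-time DIIM recursion
        total_loss += float(x_hat @ q)
        q = np.clip(q + K @ (A_star @ q - q), 0.0, 1.0)
    losses.append(total_loss)

losses = np.array(losses)
print(f"economic loss: mean {losses.mean():.0f}, 95th percentile {np.percentile(losses, 95):.0f}")
```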

  4. Uncertainty analysis using Monte Carlo method in the measurement of phase by ESPI

    SciTech Connect

    Anguiano Morales, Marcelino; Martinez, Amalia; Rayas, J. A.; Cordero, Raul R.

    2008-04-15

    A method for simultaneously measuring whole-field in-plane displacements using optical fibers, based on dual-beam illumination electronic speckle pattern interferometry (ESPI), is presented in this paper. A set of single-mode optical fibers and a beamsplitter are employed to split the laser beam into four beams of equal intensity. One pair of fibers is utilized to illuminate the sample in the horizontal plane so it is sensitive only to horizontal in-plane displacement. Another pair of optical fibers is set to be sensitive only to vertical in-plane displacement. Each pair of optical fibers differs in length to avoid unwanted interference. By means of a Fourier-transform method of fringe-pattern analysis (Takeda method), we obtain quantitative whole-field displacement data. We estimated the uncertainty associated with the phases by means of a Monte Carlo-based technique.

  5. An introductory guide to uncertainty analysis in environmental and health risk assessment. Environmental Restoration Program

    SciTech Connect

    Hammonds, J.S.; Hoffman, F.O.; Bartell, S.M.

    1994-12-01

    This report presents guidelines for evaluating uncertainty in mathematical equations and computer models applied to assess human health and environmental risk. Uncertainty analyses involve the propagation of uncertainty in model parameters and model structure to obtain confidence statements for the estimate of risk and to identify the model components of dominant importance. Uncertainty analyses are required when there is no a priori knowledge about uncertainty in the risk estimate and when there is a chance that the failure to assess uncertainty may lead to the selection of wrong options for risk reduction. Uncertainty analyses are effective when they are conducted in an iterative mode. When the uncertainty in the risk estimate is intolerable for decision-making, additional data are acquired for the dominant model components that contribute most to uncertainty. This process is repeated until the level of residual uncertainty can be tolerated. Analytical and numerical methods for error propagation are presented, along with methods for identifying the most important contributors to uncertainty. Monte Carlo simulation with either Simple Random Sampling (SRS) or Latin Hypercube Sampling (LHS) is proposed as the most robust method for propagating uncertainty through either simple or complex models. A distinction is made between simulating a stochastically varying assessment endpoint (i.e., the distribution of individual risks in an exposed population) and quantifying uncertainty due to lack of knowledge about a fixed but unknown quantity (e.g., a specific individual, the maximally exposed individual, or the mean, median, or 95th percentile of the distribution of exposed individuals). Emphasis is placed on the need for subjective judgement to quantify uncertainty when relevant data are absent or incomplete.
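    A minimal sketch contrasting Simple Random Sampling and Latin Hypercube Sampling for propagating parameter uncertainty through a model; the two-parameter "risk model" and its distributions are toy assumptions standing in for a real assessment model:

```python
# SRS vs LHS Monte Carlo propagation through a toy risk model.
import numpy as np
from scipy.stats import norm, qmc

def risk_model(intake, slope_factor):
    return intake * slope_factor            # placeholder dose-response calculation

n = 500
rng = np.random.default_rng(7)

# SRS: independent random draws from each parameter distribution
srs_intake = norm(1.0, 0.3).rvs(n, random_state=rng)
srs_slope  = norm(0.05, 0.01).rvs(n, random_state=rng)

# LHS: stratified draws covering each marginal distribution more evenly
lhs = qmc.LatinHypercube(d=2, seed=7).random(n)
lhs_intake = norm(1.0, 0.3).ppf(lhs[:, 0])
lhs_slope  = norm(0.05, 0.01).ppf(lhs[:, 1])

for label, intake, slope in [("SRS", srs_intake, srs_slope),
                             ("LHS", lhs_intake, lhs_slope)]:
    risk = risk_model(intake, slope)
    print(f"{label}: mean risk {risk.mean():.4f}, 95th pct {np.percentile(risk, 95):.4f}")
```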

  6. Propagating Water Quality Analysis Uncertainty Into Resource Management Decisions Through Probabilistic Modeling

    NASA Astrophysics Data System (ADS)

    Gronewold, A. D.; Wolpert, R. L.; Reckhow, K. H.

    2007-12-01

    Most probable number (MPN) and colony-forming-unit (CFU) are two estimates of fecal coliform bacteria concentration commonly used as measures of water quality in United States shellfish harvesting waters. The MPN is the maximum likelihood estimate (or MLE) of the true fecal coliform concentration based on counts of non-sterile tubes in serial dilution of a sample aliquot, indicating bacterial metabolic activity. The CFU is the MLE of the true fecal coliform concentration based on the number of bacteria colonies emerging on a growth plate after inoculation from a sample aliquot. Each estimating procedure has intrinsic variability and is subject to additional uncertainty arising from minor variations in experimental protocol. Several versions of each procedure (using different sized aliquots or different numbers of tubes, for example) are in common use, each with its own levels of probabilistic and experimental error and uncertainty. It has been observed empirically that the MPN procedure is more variable than the CFU procedure, and that MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the observed variability in, and discrepancy between, MPN and CFU measurements. We then explore how this variability and uncertainty might propagate into shellfish harvesting area management decisions through a two-phased modeling strategy. First, we apply our probabilistic model in a simulation-based analysis of future water quality standard violation frequencies under alternative land use scenarios, such as those evaluated under guidelines of the total maximum daily load (TMDL) program. Second, we apply our model to water quality data from shellfish harvesting areas which at present are closed (either conditionally or permanently) to shellfishing, to determine if alternative laboratory analysis procedures might have led to different
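    A minimal sketch of how an MPN is obtained as a maximum likelihood estimate from a serial-dilution tube test, as described above; the dilution design (three volumes, five tubes each) and the positive-tube counts are illustrative assumptions:

```python
# MPN as the maximum likelihood estimate of concentration from tube counts.
import numpy as np
from scipy.optimize import minimize_scalar

volumes   = np.array([10.0, 1.0, 0.1])   # sample volume per tube at each dilution (mL)
n_tubes   = np.array([5, 5, 5])          # tubes inoculated at each dilution
positives = np.array([5, 3, 1])          # tubes showing growth

def neg_log_likelihood(log_c):
    c = np.exp(log_c)                    # concentration (organisms per mL)
    p = 1.0 - np.exp(-c * volumes)       # probability that a tube is positive
    p = np.clip(p, 1e-12, 1 - 1e-12)
    ll = positives * np.log(p) + (n_tubes - positives) * np.log(1 - p)
    return -ll.sum()

result = minimize_scalar(neg_log_likelihood, bounds=(-10, 10), method="bounded")
print(f"MPN ~= {np.exp(result.x):.2f} organisms per mL")
```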

  7. A Bayesian analysis of uncertainties on lung doses resulting from occupational exposures to uranium.

    PubMed

    Puncher, M; Birchall, A; Bull, R K

    2013-09-01

    In a recent epidemiological study, Bayesian estimates of lung doses were calculated in order to determine a possible association between lung dose and lung cancer incidence resulting from occupational exposures to uranium. These calculations, which produce probability distributions of doses, used the human respiratory tract model (HRTM) published by the International Commission on Radiological Protection (ICRP) with a revised particle transport clearance model. In addition to the Bayesian analyses, point estimates (PEs) of doses were also provided for that study using the existing HRTM as it is described in ICRP Publication 66. The PEs are to be used in a preliminary analysis of risk. To explain the differences between the PEs and Bayesian analysis, in this paper the methodology was applied to former UK nuclear workers who constituted a subset of the study cohort. The resulting probability distributions of lung doses calculated using the Bayesian methodology were compared with the PEs obtained for each worker. Mean posterior lung doses were on average 8-fold higher than PEs and the uncertainties on doses varied over a wide range, being greater than two orders of magnitude for some lung tissues. It is shown that it is the prior distributions of the parameters describing absorption from the lungs to blood that are responsible for the large difference between posterior mean doses and PEs. Furthermore, it is the large prior uncertainties on these parameters that are mainly responsible for the large uncertainties on lung doses. It is concluded that accurate determination of the chemical form of inhaled uranium, as well as the absorption parameter values for these materials, is important for obtaining unbiased estimates of lung doses from occupational exposures to uranium for epidemiological studies. Finally, it should be noted that the inferences regarding the PEs described here apply only to the assessments of cases provided for the epidemiological study, where central

  8. Accounting for Multiple Sources of Uncertainty in the Statistical Analysis of Holocene Sea Levels

    NASA Astrophysics Data System (ADS)

    Cahill, N.; Parnell, A. C.; Kemp, A.; Horton, B.

    2014-12-01

    We perform a Bayesian statistical analysis on historical and late Holocene rates of sea-level change. The data that form the input to the statistical model are tide-gauge measurements and proxy reconstructions from cores of coastal sediment. The aims are to estimate rates of sea-level change, to determine when modern rates of rise began and to observe how these rates have evolved over time. Many current methods for doing this use simple linear regression to estimate rates. This is often inappropriate as it is too rigid and it can ignore uncertainties that arise as part of the data collection exercise. This can lead to over-confidence in the sea-level trends being characterized. The proposed model places a Gaussian process prior on the rate process (i.e. the process that determines how rates of sea-level change evolve over time). The likelihood of the observed data is the integral of this process. When dealing with proxy reconstructions, the model is set in an errors-in-variables framework so as to take account of age uncertainty. It is also necessary to account for glacio-isostatic adjustment, which introduces a covariance between individual age and sea-level observations. This method allows for the estimation of the rate process with full consideration of all sources of uncertainty. The model captures the continuous and dynamic evolution of sea-level change and results show that modern rates of rise are consistently increasing. Analysis of a global tide-gauge record (Church and White, 2011) indicated that the rate of sea-level rise has increased continuously since 1880 AD and is currently 1.9 mm/yr (95% credible interval of 1.84 to 2.03 mm/yr). Applying the model to a proxy reconstruction from North Carolina (Kemp et al., 2011) indicated that the mean rate of rise in this locality since the middle of the 19th century (current rate of 2.44 mm/yr with a 95% credible interval of 1.91 to 3.01 mm/yr) is unprecedented in at least the last 2000 years.

  9. Uncertainty in the analysis of the overall equipment effectiveness on the shop floor

    NASA Astrophysics Data System (ADS)

    Rößler, M. P.; Abele, E.

    2013-06-01

    In this article, an approach is presented that supports transparency regarding the effectiveness of manufacturing equipment by combining fuzzy set theory with overall equipment effectiveness analysis. One of the key principles of lean production, and also a fundamental task in production optimization projects, is the prior analysis of the current state of a production system using key performance indicators to derive possible future states. In the current state of the art, overall equipment effectiveness analysis is usually performed by accumulating different machine states through decentralized data collection, without consideration of uncertainty. With manual or semi-automated plant data collection systems, the quality of the derived data often varies and leads optimization teams to distorted conclusions about the real optimization potential of manufacturing equipment. The method discussed in this paper is intended to help practitioners obtain more reliable results in the analysis phase and thus better outcomes of optimization projects. The results obtained are discussed in the context of a case study.
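    A minimal sketch of an OEE calculation with triangular fuzzy numbers to reflect uncertain, manually collected machine-state data, in the spirit of combining fuzzy set theory with OEE analysis as described above; the membership values are illustrative assumptions:

```python
# Fuzzy OEE: Availability x Performance x Quality with triangular fuzzy numbers.
# A triangular fuzzy number is represented as (lower, modal, upper).
def tfn_mul(a, b):
    """Multiply two positive triangular fuzzy numbers (standard approximate rule)."""
    return (a[0] * b[0], a[1] * b[1], a[2] * b[2])

availability = (0.78, 0.82, 0.86)   # uncertain share of planned time actually running
performance  = (0.70, 0.75, 0.83)   # uncertain speed losses
quality      = (0.95, 0.97, 0.99)   # uncertain first-pass yield

oee = tfn_mul(tfn_mul(availability, performance), quality)
print("fuzzy OEE (lower, modal, upper): "
      f"({oee[0]:.2f}, {oee[1]:.2f}, {oee[2]:.2f})")
```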

  10. Uncertainty in patient set-up margin analysis in radiation therapy

    PubMed Central

    Suzuki, Junji; Tateoka, Kunihiko; Shima, Katsumi; Yaegashi, Yuji; Fujimoto, Kazunori; Saitoh, Yuichi; Nakata, Akihiro; Abe, Tadanori; Nakazawa, Takuya; Sakata, Kouichi; Hareyama, Masato

    2012-01-01

    We investigated the uncertainty in patient set-up margin analysis with a small dataset consisting of a limited number of clinical cases over a short time period, and propose a method for determining the optimum set-up margin. Patient set-up errors from 555 registration images of 15 patients with prostate cancer were tested for normality using a quantile-quantile (Q-Q) plot and a Kolmogorov–Smirnov test under the hypothesis that the data were not normally distributed. The ranges of set-up errors, defined as the 95% interval of the entire patient data histogram, were compared with their equivalent normal distributions. The patient set-up error was not normally distributed. When the patient set-up error distribution was assumed to have a normal distribution, an underestimate of the actual set-up error occurred in some patients but an overestimate occurred in others. When using a limited dataset for patient set-up errors, which consists of only a small number of cases over a short period of time in clinical practice, the 2.5% and 97.5% percentiles of the actual patient data histogram (the percentile method) should be used for estimating the set-up margin. Since set-up error data are usually not normally distributed, these intervals should provide a more accurate estimate of the set-up margin. In this way, the uncertainty in patient set-up margin analysis in radiation therapy can be reduced. PMID:22843628
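    A minimal sketch of the percentile-based margin estimate recommended above, compared with the interval implied by a normal-distribution assumption, using synthetic (deliberately skewed) set-up errors along one axis; the distribution and its parameters are assumptions:

```python
# Percentile-method margin interval vs. normal-assumption interval for skewed set-up errors.
import numpy as np

rng = np.random.default_rng(3)
setup_errors = rng.gamma(shape=2.0, scale=1.5, size=555) - 2.0   # mm, non-normal

# Percentile method: 2.5% and 97.5% points of the observed histogram
lo, hi = np.percentile(setup_errors, [2.5, 97.5])

# Normal assumption: mean +/- 1.96 standard deviations
mu, sd = setup_errors.mean(), setup_errors.std(ddof=1)
lo_n, hi_n = mu - 1.96 * sd, mu + 1.96 * sd

print(f"percentile interval: [{lo:.2f}, {hi:.2f}] mm")
print(f"normal assumption:   [{lo_n:.2f}, {hi_n:.2f}] mm")
```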

  11. Model complexity in carbon sequestration:A design of experiment and response surface uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Li, S.