Sample records for evaluating model uncertainties

  1. A Bayesian Framework of Uncertainties Integration in 3D Geological Model

    NASA Astrophysics Data System (ADS)

    Liang, D.; Liu, X.

    2017-12-01

    A 3D geological model can describe complicated geological phenomena in an intuitive way, but its application may be limited by uncertain factors. Although great progress has been made over the years, many studies decompose the uncertainties of a geological model and analyze each source separately, ignoring the combined impact of multi-source uncertainties. To evaluate this overall uncertainty, we quantify uncertainty with probability distributions and propose a Bayesian framework for uncertainty integration. Within this framework, we integrate data errors, spatial randomness, and cognitive information into a posterior distribution that evaluates the overall uncertainty of the geological model. Because uncertainties propagate and accumulate during the modeling process, the gradual integration of multi-source uncertainty is itself a simulation of uncertainty propagation, and Bayesian inference accomplishes the uncertainty updating. The maximum entropy principle is effective for estimating the prior probability distribution: it ensures that the prior satisfies the constraints supplied by the given information while introducing minimum prejudice. The resulting posterior distribution represents the combined impact of all uncertain factors on the spatial structure of the geological model. The framework thus provides a way to evaluate the combined impact of multi-source uncertainties on a geological model, and a starting point for studying the mechanism of uncertainty propagation in geological modeling.
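
The workflow sketched in this abstract (a maximum-entropy prior constrained only by what is known, updated by Bayes' rule with uncertain data) can be illustrated with a minimal numerical sketch. The quantity (an interface depth), the grid, and the error values below are hypothetical, chosen only to show the mechanics; with nothing but bounds available, the maximum-entropy prior is uniform.

```python
import numpy as np

# Hypothetical example: uncertain depth (m) of a geological interface.
depth = np.linspace(80.0, 120.0, 401)        # discretised support

# Maximum-entropy prior: with only the bounds known, the least-prejudiced
# prior is uniform over the support.
prior = np.full_like(depth, 1.0 / (depth[-1] - depth[0]))

# Borehole observation with measurement error (data uncertainty); values assumed.
obs, sigma_obs = 103.0, 2.5
likelihood = np.exp(-0.5 * ((depth - obs) / sigma_obs) ** 2)

# Bayesian updating: posterior is proportional to prior times likelihood
posterior = prior * likelihood
posterior /= posterior.sum()                 # normalise as a discrete pmf

mean = (depth * posterior).sum()
std = np.sqrt(((depth - mean) ** 2 * posterior).sum())
```

Integrating further sources (spatial randomness, cognitive information) amounts to repeating the update with additional likelihood terms, which is how the gradual integration mimics uncertainty propagation.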

  2. Evaluation of the ²³⁹Pu prompt fission neutron spectrum induced by neutrons of 500 keV and associated covariances

    DOE PAGES

    Neudecker, D.; Talou, P.; Kawano, T.; ...

    2015-08-01

    We present evaluations of the prompt fission neutron spectrum (PFNS) of ²³⁹Pu induced by 500 keV neutrons, and associated covariances. In a previous evaluation by Talou et al. (2010), surprisingly low evaluated uncertainties were obtained, partly due to simplifying assumptions in the quantification of uncertainties from experiment and model. Therefore, special emphasis is placed here on a thorough uncertainty quantification of the experimental data and of the Los Alamos model predicted values entering the evaluation. In addition, the Los Alamos model was extended, and an evaluation technique was employed that takes into account the qualitative differences between normalized model predicted values and experimental shape data. These improvements lead to changes in the evaluated PFNS and overall larger evaluated uncertainties than in the previous work. However, these evaluated uncertainties are still smaller than those obtained in a statistical analysis using experimental information only, due to strong model correlations. Hence, suggestions to estimate model defect uncertainties are presented, which lead to more reasonable evaluated uncertainties. The calculated k_eff of selected criticality benchmarks obtained with these new evaluations agree with each other within their uncertainties despite the different approaches to estimate model defect uncertainties. The k_eff one-standard-deviation intervals overlap with some of those obtained using ENDF/B-VII.1, albeit their mean values are further from unity. Spectral indices for the Jezebel critical assembly calculated with the newly evaluated PFNS agree with the experimental data for selected (n,γ) and (n,f) reactions, and show improvements for high-energy-threshold (n,2n) reactions compared to ENDF/B-VII.1.
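
The interplay described here (evaluated uncertainties falling below purely experimental ones because of strong model correlations) can be imitated with a small generalised-least-squares sketch. The five-point "spectrum", the 10 %/5 % uncertainties, and the 0.9 correlation are all invented for illustration; this is not evaluated nuclear data or the paper's actual method.

```python
import numpy as np

n = 5
x_model = np.array([1.00, 0.90, 0.75, 0.55, 0.30])   # model spectrum (a.u.)

# Strongly correlated model covariance: 10 % uncertainty, correlation 0.9
u_mod = 0.10 * x_model
corr = np.full((n, n), 0.9) + 0.1 * np.eye(n)
C_model = np.outer(u_mod, u_mod) * corr

# Uncorrelated experimental covariance: 5 % uncertainty
y_exp = np.array([0.98, 0.93, 0.72, 0.57, 0.31])
C_exp = np.diag((0.05 * y_exp) ** 2)

# GLS update (Kalman form, identity observation operator)
K = C_model @ np.linalg.inv(C_model + C_exp)
x_eval = x_model + K @ (y_exp - x_model)
C_eval = (np.eye(n) - K) @ C_model

# Evaluated uncertainties end up below both input uncertainties
u_eval = np.sqrt(np.diag(C_eval))
```

Inflating C_model with a model-defect term is then the natural way to obtain the "more reasonable" (larger) evaluated uncertainties the abstract argues for.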

  3. Evaluation of the ²³⁹Pu prompt fission neutron spectrum induced by neutrons of 500 keV and associated covariances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neudecker, D.; Talou, P.; Kawano, T.

    2015-08-01

    We present evaluations of the prompt fission neutron spectrum (PFNS) of ²³⁹Pu induced by 500 keV neutrons, and associated covariances. In a previous evaluation by Talou et al. (2010), surprisingly low evaluated uncertainties were obtained, partly due to simplifying assumptions in the quantification of uncertainties from experiment and model. Therefore, special emphasis is placed here on a thorough uncertainty quantification of the experimental data and of the Los Alamos model predicted values entering the evaluation. In addition, the Los Alamos model was extended, and an evaluation technique was employed that takes into account the qualitative differences between normalized model predicted values and experimental shape data. These improvements lead to changes in the evaluated PFNS and overall larger evaluated uncertainties than in the previous work. However, these evaluated uncertainties are still smaller than those obtained in a statistical analysis using experimental information only, due to strong model correlations. Hence, suggestions to estimate model defect uncertainties are presented, which lead to more reasonable evaluated uncertainties. The calculated k_eff of selected criticality benchmarks obtained with these new evaluations agree with each other within their uncertainties despite the different approaches to estimate model defect uncertainties. The k_eff one-standard-deviation intervals overlap with some of those obtained using ENDF/B-VII.1, albeit their mean values are further from unity. Spectral indices for the Jezebel critical assembly calculated with the newly evaluated PFNS agree with the experimental data for selected (n,γ) and (n,f) reactions, and show improvements for high-energy-threshold (n,2n) reactions compared to ENDF/B-VII.1. (C) 2015 Elsevier B.V. All rights reserved.

  4. Uncertainty evaluation of dead zone of diagnostic ultrasound equipment

    NASA Astrophysics Data System (ADS)

    Souza, R. M.; Alvarenga, A. V.; Braz, D. S.; Petrella, L. I.; Costa-Felix, R. P. B.

    2016-07-01

    This paper presents a model for evaluating the measurement uncertainty of a feature used in the assessment of ultrasound images: the dead zone. The dead zone was measured by two technicians of INMETRO's Laboratory of Ultrasound using a phantom, following IEC/TS 61390. The uncertainty model was developed based on the Guide to the Expression of Uncertainty in Measurement. For the tested equipment, results indicate a dead zone of 1.01 mm with an expanded uncertainty of 0.17 mm. The proposed uncertainty model offers a novel approach to the metrological evaluation of diagnostic ultrasound imaging.
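
A GUM-style budget behind such a result can be sketched as follows. The repeated readings and the two Type B contributions (display resolution, phantom tolerance) are assumed values for illustration, not the paper's actual budget.

```python
import math

# Repeated dead-zone observations in mm (assumed values)
readings_mm = [1.00, 1.03, 0.99, 1.02, 1.01, 1.01]

n = len(readings_mm)
mean = sum(readings_mm) / n
s = math.sqrt(sum((x - mean) ** 2 for x in readings_mm) / (n - 1))
u_typeA = s / math.sqrt(n)                 # standard uncertainty of the mean

# Type B contributions (assumed): display resolution and phantom tolerance,
# both modelled as rectangular distributions of half-width a, so u = a / sqrt(3)
u_resolution = 0.05 / math.sqrt(3)
u_phantom = 0.10 / math.sqrt(3)

# Combined standard uncertainty (root sum of squares), then expanded with k = 2
u_combined = math.sqrt(u_typeA**2 + u_resolution**2 + u_phantom**2)
U_expanded = 2.0 * u_combined              # coverage factor k = 2 (~95 %)
```

The reported result would then read "dead zone = mean ± U_expanded (k = 2)", with each Type B half-width taken from the instrument specification or the phantom certificate.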

  5. A model-averaging method for assessing groundwater conceptual model uncertainty.

    PubMed

    Ye, Ming; Pohlmann, Karl F; Chapman, Jenny B; Pohll, Greg M; Reeves, Donald M

    2010-01-01

    This study evaluates alternative groundwater models with different recharge and geologic components at the northern Yucca Flat area of the Death Valley Regional Flow System (DVRFS), USA. Recharge over the DVRFS has been estimated using five methods, and five geological interpretations are available at the northern Yucca Flat area. Combining the recharge and geological components together with additional modeling components that represent other hydrogeological conditions yields a total of 25 groundwater flow models. As all the models are plausible given available data and information, evaluating model uncertainty becomes inevitable. On the other hand, hydraulic parameters (e.g., hydraulic conductivity) are uncertain in each model, giving rise to parametric uncertainty. Propagation of the uncertainty in the models and model parameters through groundwater modeling causes predictive uncertainty in model predictions (e.g., hydraulic head and flow). Parametric uncertainty within each model is assessed using Monte Carlo simulation, and model uncertainty is evaluated using the model averaging method. Two model-averaging techniques (on the basis of information criteria and GLUE) are discussed. This study shows that contribution of model uncertainty to predictive uncertainty is significantly larger than that of parametric uncertainty. For the recharge and geological components, uncertainty in the geological interpretations has more significant effect on model predictions than uncertainty in the recharge estimates. In addition, weighted residuals vary more for the different geological models than for different recharge models. Most of the calibrated observations are not important for discriminating between the alternative models, because their weighted residuals vary only slightly from one model to another.
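
The information-criterion branch of the model averaging can be sketched in a few lines: each model's criterion value is turned into a posterior weight, and predictions are averaged with those weights. The criterion values and head predictions below are invented, not DVRFS results.

```python
import numpy as np

# Information-criterion values for four alternative models (illustrative)
ic = np.array([210.3, 212.1, 215.8, 211.0])
delta = ic - ic.min()
weights = np.exp(-0.5 * delta)     # smaller criterion -> larger weight
weights /= weights.sum()           # normalise to posterior model weights

# Per-model predictions of a hydraulic head (m), also illustrative
pred = np.array([102.5, 101.8, 103.9, 102.1])
avg = (weights * pred).sum()                        # model-averaged prediction
between_model_var = (weights * (pred - avg) ** 2).sum()
```

The between-model variance term is the quantity that, in studies like this one, turns out to dominate the within-model (parametric) contribution to predictive uncertainty.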

  6. [The uncertainty evaluation of analytical results of 27 elements in geological samples by X-ray fluorescence spectrometry].

    PubMed

    Wang, Yi-Ya; Zhan, Xiu-Chun

    2014-04-01

    The uncertainty of analytical results for 165 geological samples measured by polarized energy-dispersive X-ray fluorescence spectrometry (P-EDXRF) was evaluated according to internationally accepted guidelines. One hundred sixty-five pressed pellets of geological samples with similar matrices and reliable reference values were analyzed by P-EDXRF. The samples were divided into several concentration sections spanning the concentration range of each component, and the relative uncertainties arising from precision and from accuracy were evaluated for 27 components. For each element, the relative uncertainty due to precision in a concentration section was calculated as the average relative standard deviation over the concentration levels in that section (n = 6 replicate results per level). The relative uncertainty due to accuracy in a concentration section was evaluated as the relative standard deviation of the relative deviations over the concentration levels in that section. Following error propagation theory, the precision and accuracy uncertainties were combined into a global uncertainty, which serves as the method uncertainty. This evaluation model resolves a series of difficulties in uncertainty evaluation, such as uncertainties caused by the complex matrix of geological samples, the calibration procedure, standard samples, unknown samples, matrix correction, overlap correction, sample preparation, instrument condition, and the mathematical model. The uncertainty of results obtained by this method can serve as the uncertainty for unknown samples of similar matrix within the same concentration section. This evaluation model is a basic statistical method of practical value and provides a foundation for building an uncertainty evaluation function. However, the model requires a large number of samples and cannot simply be transferred to sample types with different matrices. We aim to use this study as a basis for establishing a statistically sound functional model applicable to different sample types.
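
The combination step described above reduces to a sum in quadrature of the two relative uncertainties. The numbers are illustrative placeholders, not values from the 165-sample study.

```python
import math

# Within one concentration section (values assumed for illustration):
rel_u_precision = 0.018   # mean relative std. dev. of repeated runs (n = 6)
rel_u_accuracy = 0.025    # rel. std. dev. of relative deviations vs reference

# Error propagation: fold both contributions into one method uncertainty
rel_u_method = math.sqrt(rel_u_precision**2 + rel_u_accuracy**2)

# Applied to an unknown sample in the same concentration section:
concentration = 4.20      # e.g. a major-oxide content in wt% (assumed)
u_abs = concentration * rel_u_method
```

The same two-term combination is repeated per element and per concentration section, which is what lets the section's method uncertainty stand in for the uncertainty of any similar-matrix unknown in that section.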

  7. A Hierarchical Multi-Model Approach for Uncertainty Segregation, Prioritization and Comparative Evaluation of Competing Modeling Propositions

    NASA Astrophysics Data System (ADS)

    Tsai, F. T.; Elshall, A. S.; Hanor, J. S.

    2012-12-01

    Subsurface modeling is challenging because each uncertain model component may have many competing propositions. How can we judge that we are selecting the correct proposition for an uncertain model component out of numerous competing ones? And how can we bridge the gap between abstract principles, such as mathematical expressions, on the one hand and empirical observations, such as field data, on the other when uncertainty exists on both sides? In this study, we introduce hierarchical Bayesian model averaging (HBMA) as a multi-model (multi-proposition) framework to represent our current state of knowledge and decision-making for hydrogeological structure modeling. The HBMA framework allows for segregating and prioritizing different sources of uncertainty, and for comparative evaluation of competing propositions for each source of uncertainty. We applied HBMA to a study of hydrostratigraphy and uncertainty propagation of the Southern Hills aquifer system in the Baton Rouge area, Louisiana. We used geophysical data for hydrogeological structure construction through an indicator hydrostratigraphy method, and lithologic data from drillers' logs for model structure calibration. However, due to uncertainty in model data, structure and parameters, multiple possible hydrostratigraphic models were produced and calibrated. The study considered four sources of uncertainty. To evaluate mathematical structure uncertainty, the study considered three different variogram models and two geological stationarity assumptions. With respect to geological structure uncertainty, the study considered two alternative structures for the Denham Springs-Scotlandville fault. With respect to data uncertainty, the study considered two calibration data sets. These four sources of uncertainty with their corresponding competing modeling propositions resulted in 24 calibrated models.
The results showed that by segregating different sources of uncertainty, HBMA analysis provided insights on uncertainty priorities and propagation. In addition, it assisted in evaluating the relative importance of competing modeling propositions for each uncertain model component. By being able to dissect the uncertain model components and provide weighted representation of the competing propositions for each uncertain model component based on the background knowledge, the HBMA functions as an epistemic framework for advancing knowledge about the system under study.
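
The bookkeeping behind a hierarchy like this one (3 variogram models × 2 stationarity assumptions × 2 fault structures × 2 data sets = 24 models) can be sketched by propagating conditional weights down the tree; a model's overall weight is the product of the weights along its branch. The weights below are invented, not the study's calibrated values.

```python
import math
from itertools import product

# One (proposition, conditional weight) list per hierarchy level; weights invented.
levels = [
    [("exponential", 0.5), ("spherical", 0.3), ("gaussian", 0.2)],  # variogram
    [("stationary", 0.6), ("non-stationary", 0.4)],                 # stationarity
    [("sealing fault", 0.6), ("conductive fault", 0.4)],            # structure
    [("calibration set A", 0.5), ("calibration set B", 0.5)],       # data
]

# Overall model weights: product of conditional weights along each branch.
model_weights = {
    tuple(name for name, _ in combo): math.prod(w for _, w in combo)
    for combo in product(*levels)
}

n_models = len(model_weights)         # 3 x 2 x 2 x 2 = 24 models
total = sum(model_weights.values())   # weights over all models sum to 1
```

Averaging predictions with these weights level by level is what lets HBMA attribute a share of the predictive variance to each source of uncertainty and rank the sources.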

  8. Uncertainty in BMP evaluation and optimization for watershed management

    NASA Astrophysics Data System (ADS)

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    The use of computer simulation models has increased substantially for making watershed management decisions and developing strategies for water quality improvement. These models are often used to evaluate the potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, the use of simulation models in optimizing the selection and placement of BMPs under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One limitation of currently available assessment and optimization approaches is that BMP strategies are treated as deterministic. Uncertainties in input data (e.g., measured precipitation, streamflow, sediment, nutrient and pesticide losses, and land use) and in model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management, and have applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We used a watershed model (the Soil and Water Assessment Tool, SWAT) to simulate the hydrology and water quality of a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent the BMPs needed to improve water quality in the watershed. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT-simulated crop yields. Considerable uncertainties in net cost and water quality improvements resulted from uncertainties in land use, climate change, and model parameter values.
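
The move from a deterministic to a probabilistic BMP assessment can be sketched generically: sample the uncertain parameters and report percentiles of the predicted load reduction instead of a single number. The toy response function and parameter ranges below are invented and merely stand in for a SWAT BMP representation.

```python
import random

random.seed(42)

def sediment_reduction(usle_c, buffer_eff):
    """Toy response: fractional sediment load reduction from a filter strip."""
    remaining = (1.0 - buffer_eff) * (usle_c / 0.20)  # fraction of load left
    return 1.0 - remaining

samples = []
for _ in range(5000):
    usle_c = random.uniform(0.15, 0.25)      # uncertain cover-management factor
    buffer_eff = random.gauss(0.55, 0.08)    # uncertain trapping efficiency
    samples.append(sediment_reduction(usle_c, buffer_eff))

samples.sort()
p05, p50, p95 = (samples[int(q * len(samples))] for q in (0.05, 0.50, 0.95))
```

An optimizer fed p05 (or another risk-averse statistic) instead of the deterministic value then selects and places BMPs with their uncertainty priced in.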

  9. A modified F-test for evaluating model performance by including both experimental and simulation uncertainties

    USDA-ARS?s Scientific Manuscript database

    Experimental and simulation uncertainties have not been included in many of the statistics used in assessing agricultural model performance. The objectives of this study were to develop an F-test that can be used to evaluate model performance considering experimental and simulation uncertainties, an...

  10. Evaluating a multispecies adaptive management framework: Must uncertainty impede effective decision-making?

    USGS Publications Warehouse

    Smith, David R.; McGowan, Conor P.; Daily, Jonathan P.; Nichols, James D.; Sweka, John A.; Lyons, James E.

    2013-01-01

    Application of adaptive management to complex natural resource systems requires careful evaluation to ensure that the process leads to improved decision-making. As part of that evaluation, adaptive policies can be compared with alternative nonadaptive management scenarios. Also, the value of reducing structural (ecological) uncertainty for achieving management objectives can be quantified. A multispecies adaptive management framework was recently adopted by the Atlantic States Marine Fisheries Commission for sustainable harvest of Delaware Bay horseshoe crabs Limulus polyphemus, while maintaining adequate stopover habitat for migrating red knots Calidris canutus rufa, the focal shorebird species. The predictive model set encompassed the structural uncertainty in the relationships between horseshoe crab spawning, red knot weight gain and red knot vital rates. Stochastic dynamic programming was used to generate a state-dependent strategy for harvest decisions given that uncertainty. In this paper, we employed a management strategy evaluation approach to evaluate the performance of this adaptive management framework. Active adaptive management was used by including model weights as state variables in the optimization and reducing structural uncertainty by model weight updating. We found that the value of information for reducing structural uncertainty is expected to be low, because the uncertainty does not appear to impede effective management. Harvest policy responded to abundance levels of both species regardless of uncertainty in the specific relationship that generated those abundances. Thus, the expected horseshoe crab harvest and red knot abundance were similar whether the population generating model was uncertain or known, and harvest policy was robust to structural uncertainty as specified. Synthesis and applications. The combination of management strategy evaluation with state-dependent strategies from stochastic dynamic programming was an informative approach to evaluate adaptive management performance and the value of learning. Although natural resource decisions are characterized by uncertainty, not all uncertainty will cause decisions to be altered substantially, as we found in this case. It is important to incorporate uncertainty into the decision framing and to evaluate the effect of reducing that uncertainty on achieving the desired outcomes.

  11. Evaluating Predictive Uncertainty of Hyporheic Exchange Modelling

    NASA Astrophysics Data System (ADS)

    Chow, R.; Bennett, J.; Dugge, J.; Wöhling, T.; Nowak, W.

    2017-12-01

    Hyporheic exchange is the interaction of water between rivers and groundwater, and is difficult to predict. One of the largest contributions to predictive uncertainty for hyporheic fluxes has been attributed to the representation of heterogeneous subsurface properties. This research aims to evaluate which aspect of the subsurface representation most influences the predictive uncertainty: the spatial distribution of hydrofacies, or the model for local-scale (within-facies) heterogeneity. We also seek to identify the data types that best reduce this uncertainty. For this investigation, we conduct a modelling study of the Steinlach River meander in Southwest Germany, an experimental site established in 2010 to monitor hyporheic exchange at the meander scale. We use HydroGeoSphere, a fully integrated surface water-groundwater model, to model hyporheic exchange and to assess the predictive uncertainty of hyporheic exchange transit times (HETT). A highly parameterized complex model is built and treated as a `virtual reality', which is in turn modelled with simpler subsurface parameterization schemes (Figure). We then conduct Monte Carlo simulations with these models to estimate the predictive uncertainty. Results indicate that: uncertainty in HETT is relatively small for early times and increases with transit time; uncertainty from local-scale heterogeneity is negligible compared to uncertainty in the hydrofacies distribution; introducing more data to a poor model structure may reduce predictive variance, but does not reduce predictive bias; and hydraulic head observations alone cannot constrain the uncertainty of HETT, whereas an estimate of hyporheic exchange flux proves more effective at reducing this uncertainty. Figure: Approach for evaluating predictive model uncertainty. A conceptual model is first developed from the field investigations. 
A complex model (`virtual reality') is then developed based on that conceptual model. This complex model then serves as the basis to compare simpler model structures. Through this approach, predictive uncertainty can be quantified relative to a known reference solution.

  12. Evaluating uncertainty in environmental life-cycle assessment. A case study comparing two insulation options for a Dutch one-family dwelling.

    PubMed

    Huijbregts, Mark A J; Gilijamse, Wim; Ragas, Ad M J; Reijnders, Lucas

    2003-06-01

    The evaluation of uncertainty is relatively new in environmental life-cycle assessment (LCA). It provides useful information to assess the reliability of LCA-based decisions and to guide future research toward reducing uncertainty. Most uncertainty studies in LCA quantify only one type of uncertainty, i.e., uncertainty due to input data (parameter uncertainty). However, LCA outcomes can also be uncertain due to normative choices (scenario uncertainty) and the mathematical models involved (model uncertainty). The present paper outlines a new methodology that quantifies parameter, scenario, and model uncertainty simultaneously in environmental life-cycle assessment. The procedure is illustrated in a case study that compares two insulation options for a Dutch one-family dwelling. Parameter uncertainty was quantified by means of Monte Carlo simulation. Scenario and model uncertainty were quantified by resampling different decision scenarios and model formulations, respectively. Although scenario and model uncertainty were not quantified comprehensively, the results indicate that both types of uncertainty influence the case study outcomes. This stresses the importance of quantifying parameter, scenario, and model uncertainty simultaneously. The two insulation options studied were found to have significantly different impact scores for global warming, stratospheric ozone depletion, and eutrophication. The thickest insulation option has the lowest impact on global warming and eutrophication, and the highest impact on stratospheric ozone depletion.
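
The parameter-uncertainty part of such an analysis can be sketched by Monte Carlo: propagate lognormal input uncertainty and inspect the distribution of the difference between the two options' scores. The inventories and the characterization factor below are invented, not the case study's data.

```python
import math
import random

random.seed(1)

def gwp_score(material_kg, factor_median, gsd):
    """One Monte Carlo draw of a global warming score (kg CO2-eq)."""
    factor = random.lognormvariate(math.log(factor_median), math.log(gsd))
    return material_kg * factor

# Hypothetical inventories: the thin option needs more heating energy over the
# dwelling's life, so it carries the larger inventory here.
diffs = [gwp_score(120.0, 2.1, 1.2) - gwp_score(80.0, 2.1, 1.2)
         for _ in range(2000)]

# Fraction of draws in which the thin option scores worse (higher impact)
frac_thin_worse = sum(d > 0 for d in diffs) / len(diffs)
```

Scenario and model uncertainty are handled analogously, by re-running the comparison under resampled normative choices and model formulations rather than resampled parameters.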

  13. MODEL UNCERTAINTY ANALYSIS, FIELD DATA COLLECTION AND ANALYSIS OF CONTAMINATED VAPOR INTRUSION INTO BUILDINGS

    EPA Science Inventory

    To address uncertainty associated with the evaluation of vapor intrusion problems we are working on a three part strategy that includes: evaluation of uncertainty in model-based assessments; collection of field data and assessment of sites using EPA and state protocols.

  14. Monte-Carlo-based uncertainty propagation with hierarchical models—a case study in dynamic torque

    NASA Astrophysics Data System (ADS)

    Klaus, Leonard; Eichstädt, Sascha

    2018-04-01

    For a dynamic calibration, a torque transducer is described by a mechanical model, and the corresponding model parameters are to be identified from measurement data. A measuring device for the primary calibration of dynamic torque, and a corresponding model-based calibration approach, have recently been developed at PTB. The complete mechanical model of the calibration set-up is very complex, and involves several calibration steps—making a straightforward implementation of a Monte Carlo uncertainty evaluation tedious. With this in mind, we here propose to separate the complete model into sub-models, with each sub-model being treated with individual experiments and analysis. The uncertainty evaluation for the overall model then has to combine the information from the sub-models in line with Supplement 2 of the Guide to the Expression of Uncertainty in Measurement. In this contribution, we demonstrate how to carry this out using the Monte Carlo method. The uncertainty evaluation involves various input quantities of different origin and the solution of a numerical optimisation problem.
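
The sub-model idea can be sketched with an invented two-part mechanical model: each sub-model contributes its own sample set, and the overall model combines the samples, in the spirit of a GUM Supplement 2 Monte Carlo evaluation. The quantities and uncertainties are placeholders, not PTB's calibration model.

```python
import random

random.seed(0)
N = 10_000

# Sub-model 1: mass moment of inertia from geometry measurements (kg·m²)
inertia = [random.gauss(2.00e-4, 1.0e-6) for _ in range(N)]

# Sub-model 2: angular acceleration amplitude from interferometric data (rad/s²)
alpha = [random.gauss(150.0, 0.8) for _ in range(N)]

# Overall model: dynamic torque M = J·alpha, combining the sub-model samples
torque = [J * a for J, a in zip(inertia, alpha)]

mean_M = sum(torque) / N
u_M = (sum((m - mean_M) ** 2 for m in torque) / (N - 1)) ** 0.5
```

Keeping the sub-model samples separate (rather than collapsing each to a mean and standard uncertainty) preserves any non-Gaussian shape when the outputs feed the next stage.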

  15. Observational uncertainty and regional climate model evaluation: A pan-European perspective

    NASA Astrophysics Data System (ADS)

    Kotlarski, Sven; Szabó, Péter; Herrera, Sixto; Räty, Olle; Keuler, Klaus; Soares, Pedro M.; Cardoso, Rita M.; Bosshard, Thomas; Pagé, Christian; Boberg, Fredrik; Gutiérrez, José M.; Jaczewski, Adam; Kreienkamp, Frank; Liniger, Mark A.; Lussana, Cristian; Szepszo, Gabriella

    2017-04-01

    Local and regional climate change assessments based on downscaling methods crucially depend on the existence of accurate and reliable observational reference data. In dynamical downscaling via regional climate models (RCMs), observational data can influence model development itself and, later on, model evaluation, parameter calibration and added-value assessment. In empirical-statistical downscaling, observations serve as predictand data and directly influence model calibration, with corresponding effects on downscaled climate change projections. Focusing on the evaluation of RCMs, we here analyze the influence of uncertainties in observational reference data on evaluation results, within a well-defined performance assessment framework and on a European scale. For this purpose we employ three different gridded observational reference datasets: (1) the well-established EOBS dataset, (2) the recently developed EURO4M-MESAN regional re-analysis, and (3) several national high-resolution, quality-controlled gridded datasets that have recently become available. In terms of climate models, five reanalysis-driven experiments carried out by five different RCMs within the EURO-CORDEX framework are used. Two variables (temperature and precipitation) and a range of evaluation metrics that reflect different aspects of RCM performance are considered. We furthermore include an illustrative model ranking exercise and relate observational spread to RCM spread. The results obtained indicate a varying influence of observational uncertainty on model evaluation depending on the variable, the season, the region and the specific performance metric considered. Over most parts of the continent, the influence of the choice of reference dataset on temperature evaluation is rather small for seasonal mean values and inter-annual variability. Here, model uncertainty (as measured by the spread between the five RCM simulations considered) is typically much larger than reference data uncertainty. 
For parameters of the daily temperature distribution and for the spatial pattern correlation, however, important dependencies on the reference dataset can arise. The related evaluation uncertainties can be as large or even larger than model uncertainty. For precipitation the influence of observational uncertainty is, in general, larger than for temperature. It often dominates model uncertainty especially for the evaluation of the wet day frequency, the spatial correlation and the shape and location of the distribution of daily values. But even the evaluation of large-scale seasonal mean values can be considerably affected by the choice of the reference. When employing a simple and illustrative model ranking scheme on these results it is found that RCM ranking in many cases depends on the reference dataset employed.

  16. The NIST Simple Guide for Evaluating and Expressing Measurement Uncertainty

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio

    2016-11-01

    NIST has recently published guidance on the evaluation and expression of the uncertainty of NIST measurement results [1, 2], supplementing but not replacing B. N. Taylor and C. E. Kuyatt's (1994) Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results (NIST Technical Note 1297) [3], which tracks closely the Guide to the expression of uncertainty in measurement (GUM) [4], originally published in 1995 by the Joint Committee for Guides in Metrology of the International Bureau of Weights and Measures (BIPM). The scope of this Simple Guide, however, is much broader than the scope of both NIST Technical Note 1297 and the GUM, because it attempts to address several of the uncertainty evaluation challenges that have arisen at NIST since the 1990s, for example to include molecular biology, greenhouse gases and climate science measurements, and forensic science. The Simple Guide also expands the scope of those two other guidance documents by recognizing observation equations (that is, statistical models) as bona fide measurement models. These models are indispensable to reduce data from interlaboratory studies, to combine measurement results for the same measurand obtained by different methods, and to characterize the uncertainty of calibration and analysis functions used in the measurement of force, temperature, or composition of gas mixtures. This presentation reviews the salient aspects of the Simple Guide, illustrates the use of models and methods for uncertainty evaluation not contemplated in the GUM, and also demonstrates the NIST Uncertainty Machine [5] and the NIST Consensus Builder, which are web-based applications accessible worldwide that facilitate evaluations of measurement uncertainty and the characterization of consensus values in interlaboratory studies.

  17. Model parameter uncertainty analysis for an annual field-scale P loss model

    NASA Astrophysics Data System (ADS)

    Bolster, Carl H.; Vadas, Peter A.; Boykin, Debbie

    2016-08-01

    Phosphorus (P) fate and transport models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. Because all models are simplifications of complex systems, an inherent amount of uncertainty is associated with their predictions. It is therefore important that efforts be directed at identifying, quantifying, and communicating the different sources of model uncertainty. In this study, we conducted an uncertainty analysis with the Annual P Loss Estimator (APLE) model. Our analysis included calculating parameter uncertainties and confidence and prediction intervals for five internal regression equations in APLE. We also estimated uncertainties of the model input variables based on values reported in the literature. We then predicted P loss for a suite of fields under different management and climatic conditions while accounting for uncertainties in the model parameters and inputs, and compared the relative contributions of these two sources of uncertainty to the overall uncertainty associated with predictions of P loss. Both the overall magnitude of the prediction uncertainties and the relative contributions of the two sources of uncertainty varied depending on management practices and field characteristics. This was due to differences in the number of model input variables and the uncertainties in the regression equations associated with each P loss pathway. Inspection of the uncertainties in the five regression equations brought attention to a previously unrecognized limitation with the equation used to partition surface-applied fertilizer P between leaching and runoff losses. As a result, an alternate equation was identified that provided similar predictions with much less uncertainty. Our results demonstrate how a thorough uncertainty and model residual analysis can be used to identify limitations with a model. 
Such insight can then be used to guide future data collection and model development and evaluation efforts.
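
    The comparison of parameter- and input-uncertainty contributions described above can be sketched with a toy Monte Carlo exercise. The regression form, coefficients, and input distributions below are invented for illustration and are not the actual APLE equations; each uncertainty source is frozen in turn to isolate its contribution.

```python
import random
import statistics

random.seed(42)

# Toy P-loss regression: loss = a * runoff + b * soil_P (illustrative form,
# not the APLE equations). Parameter and input uncertainties are propagated
# separately so their contributions can be compared.
A_MEAN, A_SD = 0.12, 0.02            # hypothetical regression coefficients
B_MEAN, B_SD = 0.05, 0.01
RUNOFF_MEAN, RUNOFF_SD = 30.0, 6.0   # hypothetical field inputs
SOILP_MEAN, SOILP_SD = 50.0, 10.0

def p_loss(a, b, runoff, soil_p):
    return a * runoff + b * soil_p

def mc_sd(vary_params=True, vary_inputs=True, n=20000):
    """Std. dev. of predicted P loss, varying only the chosen sources."""
    draws = []
    for _ in range(n):
        a = random.gauss(A_MEAN, A_SD) if vary_params else A_MEAN
        b = random.gauss(B_MEAN, B_SD) if vary_params else B_MEAN
        r = random.gauss(RUNOFF_MEAN, RUNOFF_SD) if vary_inputs else RUNOFF_MEAN
        s = random.gauss(SOILP_MEAN, SOILP_SD) if vary_inputs else SOILP_MEAN
        draws.append(p_loss(a, b, r, s))
    return statistics.stdev(draws)

sd_params = mc_sd(vary_inputs=False)   # parameter uncertainty only
sd_inputs = mc_sd(vary_params=False)   # input uncertainty only
sd_total = mc_sd()                     # both sources together
print(sd_params, sd_inputs, sd_total)
```

    As in the study, the relative sizes of the two contributions depend entirely on the assumed coefficient and input variances, which is why they varied by management practice and field characteristics.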

  18. An uncertainty analysis of wildfire modeling [Chapter 13]

    Treesearch

    Karin Riley; Matthew Thompson

    2017-01-01

    Before fire models can be understood, evaluated, and effectively applied to support decision making, model-based uncertainties must be analyzed. In this chapter, we identify and classify sources of uncertainty using an established analytical framework, and summarize results graphically in an uncertainty matrix. Our analysis facilitates characterization of the...

  19. A systematic uncertainty analysis of an evaluative fate and exposure model.

    PubMed

    Hertwich, E G; McKone, T E; Pease, W S

    2000-08-01

    Multimedia fate and exposure models are widely used to regulate the release of toxic chemicals, to set cleanup standards for contaminated sites, and to evaluate emissions in life-cycle assessment. CalTOX, one of these models, is used to calculate the potential dose, an outcome that is combined with the toxicity of the chemical to determine the Human Toxicity Potential (HTP), used to aggregate and compare emissions. The comprehensive assessment of the uncertainty in the potential dose calculation in this article serves to provide the information necessary to evaluate the reliability of decisions based on the HTP. A framework for uncertainty analysis in multimedia risk assessment is proposed and evaluated with four types of uncertainty. Parameter uncertainty is assessed through Monte Carlo analysis. The variability in landscape parameters is assessed through a comparison of potential dose calculations for different regions in the United States. Decision rule uncertainty is explored through a comparison of the HTP values under open and closed system boundaries. Model uncertainty is evaluated through two case studies, one using alternative formulations for calculating the plant concentration and the other testing the steady state assumption for wet deposition. This investigation shows that steady state conditions for the removal of chemicals from the atmosphere are not appropriate and result in an underestimate of the potential dose for 25% of the 336 chemicals evaluated.
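
    The separation of parameter uncertainty (Monte Carlo over an uncertain input) from model uncertainty (alternative formulations) can be illustrated with a toy sketch. The two "dose" formulations and all values below are invented stand-ins, not CalTOX equations.

```python
import math
import random
import statistics

random.seed(1)

# Two alternative (hypothetical) formulations of a plant-uptake factor,
# playing the role of the alternative model formulations in the study.
def dose_model_a(half_life):
    return half_life / (half_life + 5.0)

def dose_model_b(half_life):
    return 1.0 - math.exp(-half_life / 5.0)

samples_a, samples_b = [], []
for _ in range(5000):
    # Parameter uncertainty: the half-life is sampled from a lognormal.
    hl = random.lognormvariate(math.log(10.0), 0.5)
    samples_a.append(dose_model_a(hl))
    samples_b.append(dose_model_b(hl))

# Parameter uncertainty shows up as spread within one model...
cv_a = statistics.stdev(samples_a) / statistics.mean(samples_a)
# ...model uncertainty as a shift between alternative formulations.
model_ratio = statistics.mean(samples_b) / statistics.mean(samples_a)
print(cv_a, model_ratio)
```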

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knudsen, J.K.; Smith, C.L.

    The steps involved in incorporating parameter uncertainty into the Nuclear Regulatory Commission (NRC) accident sequence precursor (ASP) models are covered in this paper. Three different uncertainty distributions (i.e., lognormal, beta, gamma) were evaluated to determine the most appropriate distribution. From the evaluation, it was determined that the lognormal distribution will be used for the ASP model uncertainty parameters. Selection of the uncertainty parameters for the basic events is also discussed. This paper covers the process of determining uncertainty parameters for the supercomponent basic events (i.e., basic events that comprise more than one component, each of which can have more than one failure mode) that are utilized in the ASP models. Once this is completed, the ASP model is ready to be utilized to propagate parameter uncertainty for event assessments.
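
    A minimal sketch of the approach, under assumed numbers: basic-event probabilities are given lognormal distributions specified by a median and an error factor (a common PRA convention), then propagated by Monte Carlo through a toy two-cut-set expression. The medians, error factors, and cut-set structure are all hypothetical.

```python
import math
import random

random.seed(7)

def lognormal_from_ef(median, ef):
    """Sample a lognormal given its median and 95th/50th error factor EF."""
    sigma = math.log(ef) / 1.645   # 1.645 = z for the 95th percentile
    return random.lognormvariate(math.log(median), sigma)

# Toy event-tree expression with two minimal cut sets: P = p1*p2 + p3.
draws = []
for _ in range(20000):
    p1 = lognormal_from_ef(1e-3, 3.0)
    p2 = lognormal_from_ef(1e-2, 5.0)
    p3 = lognormal_from_ef(1e-6, 10.0)
    draws.append(p1 * p2 + p3)

draws.sort()
median = draws[len(draws) // 2]
p95 = draws[int(0.95 * len(draws))]
print(median, p95)
```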

  1. Metrics for evaluating performance and uncertainty of Bayesian network models

    Treesearch

    Bruce G. Marcot

    2012-01-01

    This paper presents a selected set of existing and new metrics for gauging Bayesian network model performance and uncertainty. Selected existing and new metrics are discussed for conducting model sensitivity analysis (variance reduction, entropy reduction, case file simulation); evaluating scenarios (influence analysis); depicting model complexity (numbers of model...

  2. Uncertainty Modeling and Evaluation of CMM Task Oriented Measurement Based on SVCMM

    NASA Astrophysics Data System (ADS)

    Li, Hongli; Chen, Xiaohuai; Cheng, Yinbao; Liu, Houde; Wang, Hanbin; Cheng, Zhenying; Wang, Hongtao

    2017-10-01

    Due to the variety of measurement tasks and the complexity of the errors of coordinate measuring machines (CMMs), it is very difficult to reasonably evaluate the uncertainty of CMM measurement results, and this has limited the application of CMMs. Task-oriented uncertainty evaluation has become a difficult problem to be solved. Taking dimension measurement as an example, this paper puts forward a practical method of uncertainty modeling and evaluation for CMM task-oriented measurement (called the SVCMM method). This method makes full use of the CMM acceptance or reinspection report and the Monte Carlo computer simulation method (MCM). An evaluation example is presented, and its results are evaluated by the traditional method given in the GUM and by the proposed method, respectively. The SVCMM method is verified to be feasible and practical. It can help CMM users conveniently complete measurement uncertainty evaluation through a single measurement cycle.
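
    A hedged sketch of the Monte Carlo ingredient: the CMM's maximum permissible error from an (assumed) acceptance test is treated as a rectangular error bound and combined with repeatability, in the style of GUM Supplement 1. The MPE form and all numbers are hypothetical, not the SVCMM procedure itself.

```python
import random
import statistics

random.seed(3)

L_IND = 100.0          # indicated length, mm (hypothetical)
A_UM, K = 1.5, 333.0   # hypothetical MPE parameters: MPE = A + L/K [um]
REPEAT_SD = 0.0004     # mm, hypothetical repeatability

mpe_mm = (A_UM + L_IND / K) / 1000.0   # length-dependent MPE, in mm

results = []
for _ in range(50000):
    geom_err = random.uniform(-mpe_mm, mpe_mm)  # CMM geometric error bound
    noise = random.gauss(0.0, REPEAT_SD)        # repeatability
    results.append(L_IND + geom_err + noise)

mean = statistics.mean(results)
u = statistics.stdev(results)                   # standard uncertainty, mm
print(round(mean, 4), round(u * 1000, 2), "um")
```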

  3. Multivariate Probabilistic Analysis of an Hydrological Model

    NASA Astrophysics Data System (ADS)

    Franceschini, Samuela; Marani, Marco

    2010-05-01

    Model predictions based on rainfall measurements and hydrological model results are often limited by the systematic error of measuring instruments, by the intrinsic variability of the natural processes and by the uncertainty of the mathematical representation. We propose a means to identify such sources of uncertainty and to quantify their effects based on point-estimate approaches, as a valid alternative to cumbersome Monte Carlo methods. We present uncertainty analyses on the hydrologic response to selected meteorological events, in the mountain streamflow-generating portion of the Brenta basin at Bassano del Grappa, Italy. The Brenta river catchment has a relatively uniform morphology and a quite heterogeneous rainfall pattern. In the present work, we evaluate two sources of uncertainty: data uncertainty (the uncertainty due to data handling and analysis) and model uncertainty (the uncertainty related to the formulation of the model). We thus evaluate the effects of the measurement error of tipping-bucket rain gauges, the uncertainty in estimating spatially-distributed rainfall through block kriging, and the uncertainty associated with estimated model parameters. To this end, we coupled a deterministic model based on the geomorphological theory of the hydrologic response to probabilistic methods. In particular, we compare the results of Monte Carlo Simulations (MCS) to the results obtained, in the same conditions, using Li's Point Estimate Method (LiM). The LiM is a probabilistic technique that approximates the continuous probability distribution function of the considered stochastic variables by means of discrete points and associated weights. This allows results to be reproduced satisfactorily with only a few evaluations of the model function. The comparison between the LiM and MCS results highlights the pros and cons of using an approximating method. 
LiM is less computationally demanding than MCS, but has limited applicability, especially when the model response is highly nonlinear. Higher-order approximations can provide more accurate estimates, but reduce the numerical advantage of the LiM. The results of the uncertainty analysis identify the main sources of uncertainty in the computation of river discharge. In this particular case, the spatial variability of rainfall and the uncertainty in model parameters are shown to have the greatest impact on discharge evaluation. This, in turn, highlights the need to support any estimated hydrological response with probability information and risk analysis results in order to provide a robust, systematic framework for decision making.
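
    The flavour of a point-estimate method can be conveyed with Rosenblueth's classic two-point scheme, used here as a hedged stand-in for the LiM: each input distribution is replaced by two points at mu +/- sigma with equal weights (for independent symmetric inputs), so a two-input model needs only four evaluations instead of thousands. The runoff function and input moments are invented.

```python
import random
import statistics
from itertools import product

random.seed(11)

def discharge(rain, cn):               # toy nonlinear model (invented)
    return 0.01 * rain ** 1.5 * cn

MU = {"rain": 50.0, "cn": 70.0}
SD = {"rain": 10.0, "cn": 5.0}

# Two-point estimate: 2^n model evaluations, each with weight 1/2^n.
evals = []
for signs in product((-1, 1), repeat=2):
    r = MU["rain"] + signs[0] * SD["rain"]
    c = MU["cn"] + signs[1] * SD["cn"]
    evals.append(discharge(r, c))
pe_mean = sum(evals) / len(evals)

# Monte Carlo reference needing many more model evaluations.
mc = [discharge(max(0.0, random.gauss(MU["rain"], SD["rain"])),
                random.gauss(MU["cn"], SD["cn"])) for _ in range(50000)]
mc_mean = statistics.mean(mc)
print(pe_mean, mc_mean)
```

    Four evaluations recover the mean of this mildly nonlinear response to within a few percent of the Monte Carlo estimate; for strongly nonlinear responses the approximation degrades, which is the limitation noted above.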

  4. Predicting long-range transport: a systematic evaluation of two multimedia transport models.

    PubMed

    Bennett, D H; Scheringer, M; McKone, T E; Hungerbühler, K

    2001-03-15

    The United Nations Environment Program has recently developed criteria to identify and restrict chemicals with a potential for persistence and long-range transport (persistent organic pollutants or POPs). There are many stakeholders involved, and the issues are not only scientific but also include social, economic, and political factors. This work focuses on one aspect of the POPs debate, the criteria for determining the potential for long-range transport (LRT). Our goal is to determine if current models are reliable enough to support decisions that classify a chemical based on the LRT potential. We examine the robustness of two multimedia fate models for determining the relative ranking and absolute spatial range of various chemicals in the environment. We also consider the effect of parameter uncertainties and the model uncertainty associated with the selection of an algorithm for gas-particle partitioning on the model results. Given the same chemical properties, both models give virtually the same ranking. However, when chemical parameter uncertainties and model uncertainties such as particle partitioning are considered, the spatial range distributions obtained for the individual chemicals overlap, preventing a distinct rank order. The absolute values obtained for the predicted spatial range or travel distance differ significantly between the two models for the uncertainties evaluated. We find that to evaluate a chemical when large and unresolved uncertainties exist, it is more informative to use two or more models and include multiple types of uncertainty. Model differences and uncertainties must be explicitly confronted to determine how the limitations of scientific knowledge impact predictions in the decision-making process.

  5. Measures of GCM Performance as Functions of Model Parameters Affecting Clouds and Radiation

    NASA Astrophysics Data System (ADS)

    Jackson, C.; Mu, Q.; Sen, M.; Stoffa, P.

    2002-05-01

    This abstract is one of three related presentations at this meeting dealing with several issues surrounding optimal parameter and uncertainty estimation of model predictions of climate. Uncertainty in model predictions of climate depends in part on the uncertainty produced by model approximations or parameterizations of unresolved physics. Evaluating these uncertainties is computationally expensive because one needs to evaluate how arbitrary choices for any given combination of model parameters affect model performance. Because the computational effort grows exponentially with the number of parameters being investigated, it is important to choose parameters carefully. Evaluating whether a parameter is worth investigating depends on two considerations: 1) do reasonable choices of parameter values produce a large range in model response relative to observational uncertainty? and 2) does the model response depend non-linearly on various combinations of model parameters? We have decided to narrow our attention to parameters that affect clouds and radiation, as it is likely that these parameters will dominate uncertainties in model predictions of future climate. We present preliminary results of ~20 to 30 AMIP II-style climate model integrations using NCAR's CCM3.10 that show model performance as a function of individual parameters controlling 1) critical relative humidity for cloud formation (RHMIN), and 2) boundary layer critical Richardson number (RICR). We also explore various definitions of model performance that include some or all observational data sources (surface air temperature and pressure, meridional and zonal winds, clouds, long- and short-wave cloud forcings, etc.) and evaluate in a few select cases whether the model's response depends non-linearly on the parameter values we have selected.
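
    The parameter-sweep idea can be sketched as follows: evaluate a model "cost" (mismatch to observations) on a grid over the two parameters and test for non-additivity, which signals a nonlinear joint response. The response surface below is an invented stand-in for actual CCM3 integrations.

```python
# Hypothetical cost surface over RHMIN and RICR; the cross term makes the
# response non-additive, mimicking parameter interaction in a GCM.
def cost(rhmin, ricr):
    return (rhmin - 0.8) ** 2 + (ricr - 0.3) ** 2 + rhmin * ricr

rhmins = [0.6 + 0.05 * i for i in range(7)]   # 0.60 .. 0.90
ricrs = [0.1 + 0.05 * j for j in range(7)]    # 0.10 .. 0.40
grid = {(r, c): cost(r, c) for r in rhmins for c in ricrs}

best = min(grid, key=grid.get)                # best-performing combination

# Interaction check: compare the effect of changing RHMIN at low vs high
# RICR; a nonzero difference indicates a nonlinear (interacting) response.
a0, a1, b0, b1 = rhmins[0], rhmins[-1], ricrs[0], ricrs[-1]
interaction = ((grid[(a1, b1)] - grid[(a0, b1)])
               - (grid[(a1, b0)] - grid[(a0, b0)]))
print(best, round(interaction, 3))
```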

  6. Blast Load Simulator Experiments for Computational Model Validation: Report 2

    DTIC Science & Technology

    2017-02-01

    Comparisons were made among the replicated experiments to evaluate repeatability. The uncertainty in the experimental pressures and impulses was evaluated for the five replicate experiments by computing 95% confidence intervals on the results.

  7. Multi-objective calibration and uncertainty analysis of hydrologic models; A comparative study between formal and informal methods

    NASA Astrophysics Data System (ADS)

    Shafii, M.; Tolson, B.; Matott, L. S.

    2012-04-01

    Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in higher levels of complexity being built into hydrologic models, which makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementation of multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically-based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on the use of formal likelihood functions based on statistical assumptions; moreover, Bayesian inference structured on MCMC samplers requires a considerably large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study aims at exploring a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality, which quantifies the parameter uncertainty using the Pareto solutions; (ii) DDS-AU, which uses the weighted sum of objective functions to derive the prediction limits; and (iii) GLUE, which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden and predictive capacity, which are evaluated based on multiple comparative measures. 
The measures for comparison are calculated both for calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model, called HYMOD.
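
    Of the informal methods listed, Pareto optimality is the simplest to sketch: from a set of candidate parameter sets scored on two calibration objectives (both to be minimised), keep those not dominated by any other candidate. The objective values below are random stand-ins for real model runs.

```python
import random

random.seed(5)

# Each candidate is an (objective1, objective2) pair, e.g. RMSE on high
# flows vs RMSE on low flows, here generated at random for illustration.
candidates = [(random.random(), random.random()) for _ in range(200)]

def dominates(a, b):
    """a dominates b if it is no worse in both objectives and a != b."""
    return a[0] <= b[0] and a[1] <= b[1] and a != b

pareto = [c for c in candidates
          if not any(dominates(o, c) for o in candidates)]

# The spread of the Pareto set is one informal measure of parameter
# uncertainty under the multi-criteria trade-off.
f1_range = (min(p[0] for p in pareto), max(p[0] for p in pareto))
print(len(pareto), f1_range)
```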

  8. Modifying climate change habitat models using tree species-specific assessments of model uncertainty and life history-factors

    Treesearch

    Stephen N. Matthews; Louis R. Iverson; Anantha M. Prasad; Matthew P. Peters; Paul G. Rodewald

    2011-01-01

    Species distribution models (SDMs) to evaluate trees' potential responses to climate change are essential for developing appropriate forest management strategies. However, there is a great need to better understand these models' limitations and evaluate their uncertainties. We have previously developed statistical models of suitable habitat, based on both...

  9. An Information Search Model of Evaluative Concerns in Intergroup Interaction

    ERIC Educational Resources Information Center

    Vorauer, Jacquie D.

    2006-01-01

    In an information search model, evaluative concerns during intergroup interaction are conceptualized as a joint function of uncertainty regarding and importance attached to out-group members' views of oneself. High uncertainty generally fosters evaluative concerns during intergroup exchanges. Importance depends on whether out-group members'…

  10. Applications of explicitly-incorporated/post-processing measurement uncertainty in watershed modeling

    USDA-ARS?s Scientific Manuscript database

    The importance of measurement uncertainty in terms of calculation of model evaluation error statistics has been recently stated in the literature. The impact of measurement uncertainty on calibration results indicates the potential vague zone in the field of watershed modeling where the assumption ...

  11. Parallel Computing and Model Evaluation for Environmental Systems: An Overview of the Supermuse and Frames Software Technologies

    EPA Science Inventory

    ERD’s Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) is a key to enhancing quality assurance in environmental models and applications. Uncertainty analysis and sensitivity analysis remain critical, though often overlooked steps in the development and e...

  12. Quantifying measurement uncertainty and spatial variability in the context of model evaluation

    NASA Astrophysics Data System (ADS)

    Choukulkar, A.; Brewer, A.; Pichugina, Y. L.; Bonin, T.; Banta, R. M.; Sandberg, S.; Weickmann, A. M.; Djalalova, I.; McCaffrey, K.; Bianco, L.; Wilczak, J. M.; Newman, J. F.; Draxl, C.; Lundquist, J. K.; Wharton, S.; Olson, J.; Kenyon, J.; Marquis, M.

    2017-12-01

    In an effort to improve wind forecasts for the wind energy sector, the Department of Energy and NOAA funded the second Wind Forecast Improvement Project (WFIP2). As part of the WFIP2 field campaign, a large suite of in-situ and remote sensing instrumentation was deployed in the Columbia River Gorge in Oregon and Washington from October 2015 to March 2017. The array of instrumentation deployed included 915-MHz wind profiling radars, sodars, wind-profiling lidars, and scanning lidars. The role of these instruments was to provide wind measurements at high spatial and temporal resolution for model evaluation and improvement of model physics. To properly determine model errors, the uncertainties in instrument-model comparisons need to be quantified accurately. These uncertainties arise from several factors, such as measurement uncertainty, spatial variability, and interpolation of model output to instrument locations, to name a few. In this presentation, we will introduce a formalism to quantify measurement uncertainty and spatial variability. The accuracy of this formalism will be tested using existing datasets such as the eXperimental Planetary boundary layer Instrumentation Assessment (XPIA) campaign. Finally, the uncertainties in wind measurement and the spatial variability estimates from the WFIP2 field campaign will be discussed to understand the challenges involved in model evaluation.

  13. Prediction of seismic collapse risk of steel moment frame mid-rise structures by meta-heuristic algorithms

    NASA Astrophysics Data System (ADS)

    Jough, Fooad Karimi Ghaleh; Şensoy, Serhan

    2016-12-01

    Different performance levels may be obtained for sidesway collapse evaluation of steel moment frames depending on the procedure used to handle uncertainties. In this article, the process of representing modelling uncertainties, record-to-record (RTR) variations and cognitive uncertainties for moment-resisting steel frames of various heights is discussed in detail. RTR uncertainty is treated through incremental dynamic analysis (IDA), modelling uncertainties are considered through backbone curves and hysteresis loops of components, and cognitive uncertainty is represented at three levels of material quality. IDA is used to evaluate RTR uncertainty based on strong ground motion records selected by the k-means algorithm, which is favoured over Monte Carlo selection for its time-saving appeal. Analytical equations of the response surface method are obtained from the IDA results by the Cuckoo algorithm, which predicts the mean and standard deviation of the collapse fragility curve. The Takagi-Sugeno-Kang model is used to represent material quality based on the response surface coefficients. Finally, collapse fragility curves incorporating the various sources of uncertainty mentioned are derived from a large number of material quality values and meta variables inferred by the Takagi-Sugeno-Kang fuzzy model based on response surface method coefficients. It is concluded that, in countries where material quality control is weak, a better risk management strategy is to account for cognitive uncertainties in fragility curves and the mean annual frequency.
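
    Fitting a lognormal collapse fragility curve to IDA output can be sketched as follows; the 40 synthetic "collapse Sa" values stand in for real record-by-record IDA results, and the fit uses the method of moments on log intensities rather than the response-surface machinery of the article.

```python
import math
import random
import statistics

random.seed(13)

# Synthetic collapse intensities (Sa at collapse, in g) for 40 records.
collapse_sa = [random.lognormvariate(math.log(1.2), 0.4) for _ in range(40)]

logs = [math.log(s) for s in collapse_sa]
theta = math.exp(statistics.mean(logs))   # median collapse capacity
beta = statistics.stdev(logs)             # lognormal dispersion (RTR)

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def fragility(sa):
    """P(collapse | Sa) under the fitted lognormal fragility curve."""
    return phi(math.log(sa / theta) / beta)

print(round(theta, 3), round(beta, 3), round(fragility(theta), 3))
```

    By construction the fitted curve passes through probability 0.5 at the median capacity; additional uncertainty sources (modelling, cognitive) are commonly folded in by inflating beta.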

  14. Equifinality and process-based modelling

    NASA Astrophysics Data System (ADS)

    Khatami, S.; Peel, M. C.; Peterson, T. J.; Western, A. W.

    2017-12-01

    Equifinality is understood as one of the fundamental difficulties in the study of open complex systems, including catchment hydrology. A review of the hydrologic literature reveals that the term equifinality has been widely used, but in many cases inconsistently and without coherent recognition of the various facets of equifinality, which can lead not only to ambiguity but also to methodological fallacies. Therefore, in this study we first characterise the term equifinality within the context of hydrological modelling by reviewing the genesis of the concept and then presenting a theoretical framework. During past decades, equifinality has mainly been studied as a subset of aleatory (arising due to randomness) uncertainty and for the assessment of model parameter uncertainty. Although the connection between parameter uncertainty and equifinality is undeniable, we argue there is more to equifinality than just aleatory parameter uncertainty. That is, the importance of equifinality and epistemic uncertainty (arising due to lack of knowledge) and their implications are overlooked in our current practice of model evaluation. Equifinality and epistemic uncertainty in studying, modelling, and evaluating hydrologic processes are treated as if they can be simply discussed in (or often reduced to) probabilistic terms (as for aleatory uncertainty). The deficiencies of this approach to conceptual rainfall-runoff modelling are demonstrated for selected Australian catchments by examination of parameter and internal flux distributions and interactions within SIMHYD. On this basis, we present a new approach that expands the equifinality concept beyond model parameters to inform epistemic uncertainty. The new approach potentially facilitates the identification and development of more physically plausible models and model evaluation schemes, particularly within the multiple working hypotheses framework, and is generalisable to other fields of environmental modelling as well.

  15. Evaluation of Uncertainty in Constituent Input Parameters for Modeling the Fate of IMX 101 Components

    DTIC Science & Technology

    2017-05-01

    ERDC/EL TR-17-7, Environmental Security Technology Certification Program (ESTCP), May 2017: Evaluation of Uncertainty in Constituent Input Parameters... The ...Environmental Evaluation and Characterization System (TREECS™) was applied to a groundwater site and a surface water site to evaluate the sensitivity

  16. Uncertainty assessment of a model for biological nitrogen and phosphorus removal: Application to a large wastewater treatment plant

    NASA Astrophysics Data System (ADS)

    Mannina, Giorgio; Cosenza, Alida; Viviani, Gaspare

    In the last few years, the use of mathematical models in WasteWater Treatment Plant (WWTP) processes has become a common way to predict WWTP behaviour. However, mathematical models generally demand advanced input for their implementation that must be evaluated by an extensive data-gathering campaign, which cannot always be carried out. This fact, together with the intrinsic complexity of the model structure, leads to model results that may be very uncertain, so quantification of the uncertainty is imperative. However, despite the importance of uncertainty quantification, only a few studies have been carried out in the wastewater treatment field, and those studies included only a few of the sources of model uncertainty. Seeking to develop the area, this paper presents the uncertainty assessment of a mathematical model simulating biological nitrogen and phosphorus removal. The uncertainty assessment was conducted according to the Generalised Likelihood Uncertainty Estimation (GLUE) methodology, which has scarcely been applied in the wastewater field. The model was based on Activated Sludge Models 1 (ASM1) and 2 (ASM2). Different approaches can be used for uncertainty analysis. The GLUE methodology requires a large number of Monte Carlo simulations in which a random sampling of individual parameters drawn from probability distributions is used to determine a set of parameter values. Using this approach, model reliability was evaluated based on its capacity to globally limit the uncertainty. The method was applied to a large full-scale WWTP for which quantity and quality data were gathered. The analysis provided useful insights for WWTP modelling, identifying the crucial aspects where uncertainty is highest and where, therefore, more effort should be devoted to both data gathering and modelling practice.
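
    The GLUE recipe described above can be sketched in a few lines: Monte Carlo parameter sampling, an informal likelihood (a Nash-Sutcliffe-type score here), a behavioural threshold, and prediction limits from the behavioural set. The one-parameter "model" and observations are invented stand-ins for an ASM-type simulator.

```python
import random
import statistics

random.seed(21)

obs = [2.0, 3.5, 5.0, 4.0, 2.5]          # invented observations

def model(k):                             # toy 1-parameter model
    return [k * x for x in [1.0, 1.8, 2.4, 2.0, 1.3]]

def nse(sim):
    """Nash-Sutcliffe efficiency, used as an informal likelihood."""
    mean_obs = statistics.mean(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

behavioural = []
for _ in range(5000):
    k = random.uniform(0.5, 4.0)          # prior sampling of the parameter
    if nse(model(k)) > 0.5:               # behavioural threshold
        behavioural.append(k)

# 5-95% limits over the behavioural parameter sets.
ks = sorted(behavioural)
lower, upper = ks[int(0.05 * len(ks))], ks[int(0.95 * len(ks))]
print(len(behavioural), round(lower, 2), round(upper, 2))
```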

  17. An improved method to represent DEM uncertainty in glacial lake outburst flood propagation using stochastic simulations

    NASA Astrophysics Data System (ADS)

    Watson, Cameron S.; Carrivick, Jonathan; Quincey, Duncan

    2015-10-01

    Modelling glacial lake outburst floods (GLOFs), or 'jökulhlaups', necessarily involves the propagation of large and often stochastic uncertainties throughout the source-to-impact process chain. Since flood routing is primarily a function of underlying topography, communication of digital elevation model (DEM) uncertainty should accompany such modelling efforts. Here, a new stochastic first-pass assessment technique was evaluated against an existing GIS-based model and an existing 1D hydrodynamic model, using three DEMs with different spatial resolutions. The analysis revealed the effect of DEM uncertainty and model choice on several flood parameters and on the prediction of socio-economic impacts. Our new model, which we call MC-LCP (Monte Carlo Least Cost Path) and which is distributed in the supplementary information, demonstrated enhanced 'stability' when compared to the two existing methods, and this 'stability' was independent of DEM choice. The MC-LCP model outputs an uncertainty continuum within its extent, from which relative socio-economic risk can be evaluated. In a comparison of all DEM and model combinations, results derived from the Shuttle Radar Topography Mission (SRTM) DEM exhibited fewer artefacts than those derived from the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM), and were comparable to those derived from a finer-resolution Advanced Land Observing Satellite Panchromatic Remote-sensing Instrument for Stereo Mapping (ALOS PRISM) DEM. Overall, we contend that the variability we find between flood routing model results suggests that consideration of DEM uncertainty and pre-processing methods is important when assessing flow routing and when evaluating the potential socio-economic implications of a GLOF event. Incorporation of a stochastic variable provides an illustration of uncertainty that is important when modelling and communicating assessments of an inherently complex process.

  18. Accounting for uncertainty in marine reserve design.

    PubMed

    Halpern, Benjamin S; Regan, Helen M; Possingham, Hugh P; McCarthy, Michael A

    2006-01-01

    Ecosystems and the species and communities within them are highly complex systems that defy predictions with any degree of certainty. Managing and conserving these systems in the face of uncertainty remains a daunting challenge, particularly with respect to developing networks of marine reserves. Here we review several modelling frameworks that explicitly acknowledge and incorporate uncertainty, and then use these methods to evaluate reserve spacing rules given increasing levels of uncertainty about larval dispersal distances. Our approach finds similar spacing rules as have been proposed elsewhere - roughly 20-200 km - but highlights several advantages provided by uncertainty modelling over more traditional approaches to developing these estimates. In particular, we argue that uncertainty modelling can allow for (1) an evaluation of the risk associated with any decision based on the assumed uncertainty; (2) a method for quantifying the costs and benefits of reducing uncertainty; and (3) a useful tool for communicating to stakeholders the challenges in managing highly uncertain systems. We also argue that incorporating rather than avoiding uncertainty will increase the chances of successfully achieving conservation and management goals.
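
    The spacing evaluation can be sketched by Monte Carlo: given an assumed distribution for larval dispersal distance (a lognormal here, purely as an assumption), the "risk" of a candidate reserve spacing is the probability that dispersal falls short of it. All distribution parameters are invented for illustration.

```python
import math
import random

random.seed(17)

def connectivity_risk(spacing_km, median_km=60.0, sigma=0.8, n=20000):
    """P(dispersal distance < spacing) under an assumed lognormal."""
    short = sum(
        1 for _ in range(n)
        if random.lognormvariate(math.log(median_km), sigma) < spacing_km)
    return short / n

# Evaluate candidate spacings spanning the 20-200 km range discussed above.
risks = {s: connectivity_risk(s) for s in (20, 100, 200)}
for spacing, risk in risks.items():
    print(spacing, round(risk, 3))
```

    Framing each spacing as a risk level, rather than a single "best" distance, is the kind of output the authors argue is useful for communicating with stakeholders.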

  19. Approaches to highly parameterized inversion: A guide to using PEST for model-parameter and predictive-uncertainty analysis

    USGS Publications Warehouse

    Doherty, John E.; Hunt, Randall J.; Tonkin, Matthew J.

    2010-01-01

    Analysis of the uncertainty associated with parameters used by a numerical model, and with predictions that depend on those parameters, is fundamental to the use of modeling in support of decisionmaking. Unfortunately, predictive uncertainty analysis with regard to models can be very computationally demanding, due in part to complex constraints on parameters that arise from expert knowledge of system properties on the one hand (knowledge constraints) and from the necessity for the model parameters to assume values that allow the model to reproduce historical system behavior on the other hand (calibration constraints). Enforcement of knowledge and calibration constraints on parameters used by a model does not eliminate the uncertainty in those parameters. In fact, in many cases, enforcement of calibration constraints simply reduces the uncertainties associated with a number of broad-scale combinations of model parameters that collectively describe spatially averaged system properties. The uncertainties associated with other combinations of parameters, especially those that pertain to small-scale parameter heterogeneity, may not be reduced through the calibration process. To the extent that a prediction depends on system-property detail, its postcalibration variability may be reduced very little, if at all, by applying calibration constraints; knowledge constraints remain the only limits on the variability of predictions that depend on such detail. Regrettably, in many common modeling applications, these constraints are weak. Though the PEST software suite was initially developed as a tool for model calibration, recent developments have focused on the evaluation of model-parameter and predictive uncertainty. 
As a complement to functionality that it provides for highly parameterized inversion (calibration) by means of formal mathematical regularization techniques, the PEST suite provides utilities for linear and nonlinear error-variance and uncertainty analysis in these highly parameterized modeling contexts. Availability of these utilities is particularly important because, in many cases, a significant proportion of the uncertainty associated with model parameters, and with the predictions that depend on them, arises from differences between the complex properties of the real world and the simplified representation of those properties that is expressed by the calibrated model. This report is intended to guide intermediate to advanced modelers in the use of capabilities available with the PEST suite of programs for evaluating model predictive error and uncertainty. A brief theoretical background is presented on sources of parameter and predictive uncertainty and on the means for evaluating this uncertainty. Applications of PEST tools are then discussed for overdetermined and underdetermined problems, both linear and nonlinear. PEST tools for calculating contributions to model predictive uncertainty, as well as optimization of data acquisition for reducing parameter and predictive uncertainty, are presented. The appendixes list the relevant PEST variables, files, and utilities required for the analyses described in the document.
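
    The linear error-variance idea behind such utilities can be sketched for a two-parameter problem: combine a prior parameter covariance with the information carried by a Jacobian of observation sensitivities, then compare prior and posterior variances of a prediction. The matrices and values below are invented, and this is generic FOSM-style arithmetic, not PEST itself.

```python
# Posterior parameter covariance under the linear Bayesian formula
#   C_post = (J^T J / s2 + C_prior^-1)^-1
# then the variance of a prediction y = v^T p under each covariance.

def inv2(m):
    """Invert a 2x2 matrix directly."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

J = [[1.0, 0.5], [0.2, 1.0], [0.8, 0.3]]  # sensitivities: 3 obs, 2 params
s2 = 0.25                                  # observation error variance
C_prior = [[1.0, 0.0], [0.0, 1.0]]         # prior parameter covariance

JtJ = [[sum(J[k][i] * J[k][j] for k in range(3)) for j in range(2)]
       for i in range(2)]
info = [[JtJ[i][j] / s2 + inv2(C_prior)[i][j] for j in range(2)]
        for i in range(2)]
C_post = inv2(info)

v = [1.0, -1.0]                            # prediction sensitivity vector
def pred_var(C):
    return sum(v[i] * C[i][j] * v[j] for i in range(2) for j in range(2))

print(pred_var(C_prior), pred_var(C_post))
```

    The posterior predictive variance is smaller than the prior one because the calibration data are informative about this particular prediction direction; for predictions depending on detail the data do not constrain, the two variances would be nearly equal, which is the report's central caution.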

  20. Evaluating the uncertainty of input quantities in measurement models

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Elster, Clemens

    2014-06-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. 
We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in uncertainty propagation exercises. In this we deviate markedly and emphatically from the GUM Supplement 1, which gives pride of place to the Principle of Maximum Entropy as a means to assign probability distributions to input quantities.
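
    The Monte Carlo propagation described in GUM Supplement 1 can be sketched for a toy measurement model R = V/I, assigning a rectangular (maximum-entropy) PDF to the input known only by bounds and a Gaussian to the input known by an estimate and standard uncertainty. All values are illustrative.

```python
import random
import statistics

random.seed(1)

# Monte Carlo propagation in the style of GUM Supplement 1 for a toy
# measurement model R = V / I. The distributions are illustrative
# maximum-entropy assignments: rectangular when only bounds are known,
# Gaussian when a best estimate and standard uncertainty are known.
N = 200_000
R = []
for _ in range(N):
    V = random.uniform(4.98, 5.02)      # bounds only -> rectangular PDF
    I = random.gauss(0.100, 0.0005)     # estimate +/- standard uncertainty
    R.append(V / I)

r_hat = statistics.fmean(R)
u_r = statistics.stdev(R)               # standard uncertainty of the output
print(f"R = {r_hat:.3f} ohm, u(R) = {u_r:.3f} ohm")
```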

  1. Plurality of Type A evaluations of uncertainty

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Pintar, Adam L.

    2017-10-01

    The evaluations of measurement uncertainty involving the application of statistical methods to measurement data (Type A evaluations as specified in the Guide to the Expression of Uncertainty in Measurement, GUM) comprise the following three main steps: (i) developing a statistical model that captures the pattern of dispersion or variability in the experimental data, and that relates the data either to the measurand directly or to some intermediate quantity (input quantity) that the measurand depends on; (ii) selecting a procedure for data reduction that is consistent with this model and that is fit for the purpose that the results are intended to serve; (iii) producing estimates of the model parameters, or predictions based on the fitted model, and evaluations of uncertainty that qualify either those estimates or these predictions, and that are suitable for use in subsequent uncertainty propagation exercises. We illustrate these steps in uncertainty evaluations related to the measurement of the mass fraction of vanadium in a bituminous coal reference material, including the assessment of the homogeneity of the material, and to the calibration and measurement of the amount-of-substance fraction of a hydrochlorofluorocarbon in air, and of the age of a meteorite. Our goal is to expose the plurality of choices that can reasonably be made when taking each of the three steps outlined above, and to show that different choices typically lead to different estimates of the quantities of interest, and to different evaluations of the associated uncertainty. In all the examples, the several alternatives considered represent choices that comparably competent statisticians might make, but who differ in the assumptions that they are prepared to rely on, and in their selection of approach to statistical inference. They represent also alternative treatments that the same statistician might give to the same data when the results are intended for different purposes.
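
    The plurality argument can be made concrete with a small synthetic data set: two defensible Type A evaluations of the same observations, one Gaussian (mean-based) and one robust (median-based), yield different estimates and different uncertainties. The data and the MAD scaling constant are illustrative assumptions, not taken from the paper.

```python
import statistics

# Two defensible Type A evaluations of the same (made-up) data: a
# Gaussian model reduced by the arithmetic mean, and a robust reduction
# by the median, as in the "plurality" argument above.
x = [10.1, 10.3, 9.9, 10.2, 10.0, 12.4]   # last value is a mild outlier

n = len(x)
mean = statistics.fmean(x)
u_mean = statistics.stdev(x) / n ** 0.5   # standard uncertainty of the mean

med = statistics.median(x)
mad = statistics.median(abs(v - med) for v in x)
u_med = 1.4826 * mad / n ** 0.5           # rough Gaussian-consistent scale

print(mean, u_mean)   # pulled upward by the outlier
print(med, u_med)     # barely affected by it
```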

  2. Modelling of the X,Y,Z positioning errors and uncertainty evaluation for the LNE’s mAFM using the Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Ceria, Paul; Ducourtieux, Sebastien; Boukellal, Younes; Allard, Alexandre; Fischer, Nicolas; Feltin, Nicolas

    2017-03-01

    In order to evaluate the uncertainty budget of the LNE’s mAFM, a reference instrument dedicated to the calibration of nanoscale dimensional standards, a numerical model has been developed to evaluate the measurement uncertainty of the metrology loop involved in the XYZ positioning of the tip relative to the sample. The objective of this model is to overcome difficulties experienced when trying to evaluate some uncertainty components which cannot be experimentally determined and more specifically, the one linked to the geometry of the metrology loop. The model is based on object-oriented programming and developed under Matlab. It integrates one hundred parameters that allow the control of the geometry of the metrology loop without using analytical formulae. The created objects, mainly the reference and the mobile prism and their mirrors, the interferometers and their laser beams, can be moved and deformed freely to take into account several error sources. The Monte Carlo method is then used to determine the positioning uncertainty of the instrument by randomly drawing the parameters according to their associated tolerances and their probability density functions (PDFs). The whole process follows Supplement 2 to ‘The Guide to the Expression of the Uncertainty in Measurement’ (GUM). Some advanced statistical tools like Morris design and Sobol indices are also used to provide a sensitivity analysis by identifying the most influential parameters and quantifying their contribution to the XYZ positioning uncertainty. The approach validated in the paper shows that the actual positioning uncertainty is about 6 nm. As the final objective is to reach 1 nm, we engage in a discussion to estimate the most effective way to reduce the uncertainty.
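
    A single geometric term of such a Monte Carlo budget can be sketched in a few lines: the cosine error of a slightly misaligned interferometer beam. The displacement and the alignment tolerance below are invented stand-ins, not the instrument's actual budget.

```python
import math
import random
import statistics

random.seed(7)

# Monte Carlo evaluation of one geometric term of a metrology loop: an
# interferometer beam misaligned by a small angle reads L*cos(theta)
# instead of L (the cosine error). Tolerance values are illustrative.
L = 60e-6                     # 60 um displacement
errors = []
for _ in range(100_000):
    theta = random.gauss(0.0, 1e-4)              # alignment tolerance (rad)
    errors.append(L * (1.0 - math.cos(theta)))   # reading shortfall (m)

errors.sort()
u = statistics.stdev(errors)
p95 = errors[int(0.975 * len(errors))]
print(f"std = {u * 1e12:.2f} pm, 97.5th percentile = {p95 * 1e12:.2f} pm")
```

    In the full model, each of the hundred-odd parameters would contribute a draw like this per trial, and sensitivity measures (Morris, Sobol) would then rank their contributions.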

  3. Evaluating Uncertainty in Integrated Environmental Models: A Review of Concepts and Tools

    EPA Science Inventory

    This paper reviews concepts for evaluating integrated environmental models and discusses a list of relevant software-based tools. A simplified taxonomy for sources of uncertainty and a glossary of key terms with standard definitions are provided in the context of integrated appro...

  4. Uncertainty evaluation with increasing borehole drilling in subsurface hydrogeological explorations

    NASA Astrophysics Data System (ADS)

    Amano, K.; Ohyama, T.; Kumamoto, S.; Shimo, M.

    2016-12-01

The number of boreholes to drill has long been a difficult question for field investigators in subsurface hydrogeological explorations. The problem becomes harder in heterogeneous formations or rock masses, so quantitative criteria are needed for evaluating uncertainties during borehole investigations. To test how uncertainty is reduced as boreholes are added, we prepared a simple hydrogeological model and carried out virtual hydraulic tests on it. The model consists of 125,000 elements, whose hydraulic conductivities are generated randomly from a log-normal distribution, in a 2-kilometer cube. Uncertainties were calculated from the difference in head distributions between the original model and the interim models built from the virtual hydraulic tests one by one. The results show that the level and the variance of uncertainty are strongly correlated with the average and variance of the hydraulic conductivities. Similar trends can also be seen in actual field data obtained from deep borehole investigations in Horonobe Town, northern Hokkaido, Japan. Here, a new approach using fractional bias (FB) and normalized mean square error (NMSE) for evaluating uncertainty characteristics is introduced, and its possible use as an indicator for decision making (i.e. whether to stop or continue borehole drilling) in field investigations is discussed.
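
    FB and NMSE, in their common model-evaluation forms, are straightforward to compute: FB = 2(mean_obs - mean_mod)/(mean_obs + mean_mod) and NMSE = mean((obs - mod)^2)/(mean_obs * mean_mod). The head values below are invented.

```python
import statistics

# Fractional bias (FB) and normalized mean square error (NMSE) in their
# common model-evaluation forms; the head values are invented.
def fb(obs, mod):
    mo, mm = statistics.fmean(obs), statistics.fmean(mod)
    return 2.0 * (mo - mm) / (mo + mm)

def nmse(obs, mod):
    mo, mm = statistics.fmean(obs), statistics.fmean(mod)
    return statistics.fmean((o - m) ** 2 for o, m in zip(obs, mod)) / (mo * mm)

heads_obs = [120.0, 95.0, 110.0, 101.0]   # observed heads (m)
heads_mod = [118.0, 99.0, 104.0, 97.0]    # model built from few boreholes

print(fb(heads_obs, heads_mod), nmse(heads_obs, heads_mod))
# A perfect model gives FB = 0 and NMSE = 0; both should shrink as more
# borehole data constrain the model.
```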

  5. Uncertainty in eddy covariance measurements and its application to physiological models

    Treesearch

    D.Y. Hollinger; A.D. Richardson

    2005-01-01

    Flux data are noisy, and this uncertainty is largely due to random measurement error. Knowledge of uncertainty is essential for the statistical evaluation of modeled and measured fluxes, for comparison of parameters derived by fitting models to measured fluxes and in formal data-assimilation efforts. We used the difference between simultaneous measurements from two...
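
    The paired-measurement idea can be sketched with synthetic data: if two instruments observe the same true flux plus independent noise, the standard deviation of their paired differences exceeds the per-measurement error by a factor of sqrt(2). The flux and noise values below are invented.

```python
import random
import statistics

random.seed(3)

# Estimating random measurement error from simultaneous paired
# measurements: both "towers" see the same true flux plus independent
# noise, so sd(difference) = sqrt(2) * per-measurement error.
true_flux = [random.gauss(5.0, 2.0) for _ in range(5000)]
noise_sd = 0.8
t1 = [f + random.gauss(0, noise_sd) for f in true_flux]
t2 = [f + random.gauss(0, noise_sd) for f in true_flux]

diff_sd = statistics.stdev(a - b for a, b in zip(t1, t2))
per_measurement_error = diff_sd / 2 ** 0.5
print(per_measurement_error)   # should recover roughly noise_sd
```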

  6. Evaluation of Uncertainty in Constituent Input Parameters for Modeling the Fate of RDX

    DTIC Science & Technology

    2015-07-01

    exercise was to evaluate the importance of chemical-specific model input parameters, the impacts of their uncertainty, and the potential benefits of... chemical-specific inputs for RDX that were determined to be sensitive with relatively high uncertainty: these included the soil-water linear...Koc for organic chemicals. The EFS values provided for log Koc of RDX were 1.72 and 1.95. OBJECTIVE: TREECS™ (http://el.erdc.usace.army.mil/treecs

  7. Impacts of uncertainties in weather and streamflow observations in calibration and evaluation of an elevation distributed HBV-model

    NASA Astrophysics Data System (ADS)

    Engeland, K.; Steinsland, I.; Petersen-Øverleir, A.; Johansen, S.

    2012-04-01

    The aim of this study is to assess the uncertainties in streamflow simulations when uncertainties in both the observed inputs (precipitation and temperature) and the streamflow observations used to calibrate the hydrological model are explicitly accounted for. To achieve this goal we applied the elevation-distributed HBV model, operating on daily time steps, to a small high-elevation catchment in Southern Norway where the seasonal snow cover is important. The uncertainties in precipitation inputs were quantified using conditional simulation. This procedure accounts for the uncertainty related to the density of the precipitation network, but neglects uncertainties related to measurement bias/errors and possible elevation gradients in precipitation. The uncertainties in temperature inputs were quantified using a Bayesian temperature interpolation procedure in which the temperature lapse rate is re-estimated every day. The uncertainty in the lapse rate was accounted for, whereas the sampling uncertainty related to network density was neglected. For every day, a random sample of precipitation and temperature inputs was drawn and applied as input to the hydrologic model. The uncertainties in observed streamflow were assessed based on the uncertainties in the rating curve model. A Bayesian procedure was applied to estimate the probability of rating curve models with 1 to 3 segments and the uncertainties in their parameters. This method neglects uncertainties related to errors in observed water levels. Note that one rating curve was drawn to generate one realisation of a whole streamflow time series; the rating curve errors therefore lead to a systematic bias in the streamflow observations. All these uncertainty sources were linked together in both calibration and evaluation of the hydrologic model using a DREAM-based MCMC routine. The effects of having less information (e.g. missing one streamflow measurement for defining the rating curve, or missing one precipitation station) were also investigated.
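
    The rating-curve part of such a scheme can be sketched with an invented power-law curve Q = a(h - h0)^b and invented parameter spreads; drawing one curve per realisation reproduces the systematic (rather than day-to-day independent) character of the streamflow error.

```python
import random

random.seed(11)

# One realisation of a full streamflow series per sampled rating curve:
# Q = a * (h - h0)^b with uncertain (a, b). Drawing the curve once per
# realisation makes the error systematic in time rather than independent
# day to day. All parameter spreads are invented.
stages = [0.8, 1.1, 1.5, 2.0, 1.3]        # observed water levels (m)
h0 = 0.2                                  # cease-to-flow stage (assumed known)

realisations = []
for _ in range(1000):
    a = random.gauss(12.0, 1.0)           # uncertain curve coefficient
    b = random.gauss(1.8, 0.05)           # uncertain curve exponent
    realisations.append([a * (h - h0) ** b for h in stages])

# Spread across realisations at the highest stage:
q_high = sorted(r[3] for r in realisations)
print(q_high[25], q_high[975])            # ~95% band for Q at h = 2.0 m
```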

  8. Evaluating land cover influences on model uncertainties—A case study of cropland carbon dynamics in the Mid-Continent Intensive Campaign region

    USGS Publications Warehouse

    Li, Zhengpeng; Liu, Shuguang; Zhang, Xuesong; West, Tristram O.; Ogle, Stephen M.; Zhou, Naijun

    2016-01-01

    Quantifying spatial and temporal patterns of carbon sources and sinks and their uncertainties across agriculture-dominated areas remains challenging for understanding regional carbon cycles. Characteristics of local land cover inputs can affect regional carbon estimates, but this effect has not been fully evaluated in the past. Within the North American Carbon Program Mid-Continent Intensive (MCI) Campaign, three models were developed to estimate carbon fluxes on croplands: an inventory-based model, the Environmental Policy Integrated Climate (EPIC) model, and the General Ensemble biogeochemical Modeling System (GEMS) model. They all provided estimates of three major carbon fluxes on cropland: net primary production (NPP), net ecosystem production (NEP), and soil organic carbon (SOC) change. Using data mining and spatial statistics, we studied the spatial distribution of the carbon flux uncertainties and the relationships between the uncertainties and the land cover characteristics. Results indicated that uncertainties for all three carbon fluxes were not randomly distributed, but instead formed multiple clusters within the MCI region. We investigated the impacts of three land cover characteristics on the flux uncertainties: cropland percentage, cropland richness, and cropland diversity. The results indicated that cropland percentage significantly influenced the uncertainties of NPP and NEP, but not the uncertainties of SOC change. Greater uncertainties of NPP and NEP were found in counties with small cropland percentage than in counties with large cropland percentage. Cropland species richness and diversity also showed negative correlations with the model uncertainties. Our study demonstrated that the land cover characteristics contributed to the uncertainties of regional carbon flux estimates. 
The approaches we used in this study can be applied to other ecosystem models to identify the areas with high uncertainties and where models can be improved to reduce overall uncertainties for regional carbon flux estimates.

  9. Evaluating measurement uncertainty in fluid phase equilibrium calculations

    NASA Astrophysics Data System (ADS)

    van der Veen, Adriaan M. H.

    2018-04-01

    The evaluation of measurement uncertainty in accordance with the ‘Guide to the expression of uncertainty in measurement’ (GUM) has not yet become widespread in physical chemistry. With only the law of the propagation of uncertainty from the GUM, many of these uncertainty evaluations would be cumbersome, as models are often non-linear and require iterative calculations. The methods from GUM supplements 1 and 2 enable the propagation of uncertainties under most circumstances. Experimental data in physical chemistry are used, for example, to derive reference property data and support trade—all applications where measurement uncertainty plays an important role. This paper aims to outline how the methods for evaluating and propagating uncertainty can be applied to some specific cases with a wide impact: deriving reference data from vapour pressure data, a flash calculation, and the use of an equation-of-state to predict the properties of both phases in a vapour-liquid equilibrium. The three uncertainty evaluations demonstrate that the methods of GUM and its supplements are a versatile toolbox that enable us to evaluate the measurement uncertainty of physical chemical measurements, including the derivation of reference data, such as the equilibrium thermodynamical properties of fluids.

  10. A Review On Accuracy and Uncertainty of Spatial Data and Analyses with special reference to Urban and Hydrological Modelling

    NASA Astrophysics Data System (ADS)

    Devendran, A. A.; Lakshmanan, G.

    2014-11-01

    Data quality for GIS processing and analysis is becoming an increasing concern due to the accelerated application of GIS technology to problem solving and decision making roles. Uncertainty in the geographic representation of the real world arises because these representations are incomplete. Identification of the sources of these uncertainties, and of the ways in which they operate in GIS-based representations, becomes crucial in any spatial data representation and in geospatial analysis applied to any field of application. This paper reviews the articles on the various components of spatial data quality and the uncertainties inherent in them, with special focus on two fields of application: urban simulation and hydrological modelling. Urban growth is a complicated process involving the spatio-temporal changes of all socio-economic and physical components at different scales. The Cellular Automata (CA) model is one of the simulation models; it randomly selects potential cells for urbanisation, and its transition rules evaluate the properties of each cell and its neighbours. Uncertainty arising from CA modelling is assessed mainly using sensitivity analysis, including the Monte Carlo simulation method. Likewise, the importance of hydrological uncertainty analysis has been emphasized in recent years and there is an urgent need to incorporate uncertainty estimation into water resources assessment procedures. The Soil and Water Assessment Tool (SWAT) is a continuous-time watershed model used to evaluate various impacts of land use management and climate on hydrology and water quality. Hydrological model uncertainties using the SWAT model are dealt with primarily by the Generalized Likelihood Uncertainty Estimation (GLUE) method.
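
    A minimal GLUE loop, with a toy linear model and an arbitrary behavioural threshold standing in for a real hydrological model and likelihood choice, can be sketched as follows; all data and thresholds are invented.

```python
import random

random.seed(5)

# Minimal GLUE sketch: sample parameters, keep "behavioural" sets whose
# likelihood measure beats a threshold, and use the normalised
# likelihoods as weights. Model, data, and threshold are toy stand-ins.
obs = [2.0, 3.5, 5.0, 6.5]

def model(p):                      # toy "hydrological" model: Q = p * t
    return [p * t for t in (1, 2, 3, 4)]

def likelihood(sim):               # Nash-Sutcliffe-style efficiency
    mo = sum(obs) / len(obs)
    sse = sum((s - o) ** 2 for s, o in zip(sim, obs))
    sst = sum((o - mo) ** 2 for o in obs)
    return 1.0 - sse / sst

behavioural = []
for _ in range(10_000):
    p = random.uniform(0.5, 3.0)
    L = likelihood(model(p))
    if L > 0.7:                    # behavioural threshold (a GLUE choice)
        behavioural.append((L, p))

wsum = sum(L for L, _ in behavioural)
p_hat = sum(L * p for L, p in behavioural) / wsum
print(len(behavioural), p_hat)     # likelihood-weighted parameter estimate
```

    Changing the likelihood measure or the threshold changes both the behavioural set and the resulting predictive bounds, which is exactly the subjectivity GLUE makes explicit.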

  11. Branch-and-Bound algorithm applied to uncertainty quantification of a Boiling Water Reactor Station Blackout

    DOE PAGES

    Nielsen, Joseph; Tokuhiro, Akira; Hiromoto, Robert; ...

    2015-11-13

    Evaluation of the impacts of uncertainty and sensitivity in modeling presents a significant set of challenges, in particular to high-fidelity modeling. Computational costs and validation of models create a need for cost-effective decision making with regard to experiment design. Experiments designed to validate computational models can be used to reduce uncertainty in the physical model. In some cases, large uncertainty in a particular aspect of the model may or may not have a large impact on the final results. For example, modeling of a relief valve may result in large uncertainty; however, the actual effects on the final peak clad temperature in a reactor transient may be small, and the large uncertainty with respect to valve modeling may be considered acceptable. Additionally, the ability to determine the adequacy of a model and the validation supporting it should be considered within a risk-informed framework. Low-fidelity modeling with large uncertainty may be considered adequate if the uncertainty is considered acceptable with respect to risk. In other words, models that are used to evaluate the probability of failure should be evaluated more rigorously with the intent of increasing safety margin. Probabilistic risk assessment (PRA) techniques have traditionally been used to identify accident conditions and transients. Traditional classical event tree methods utilize analysts’ knowledge and experience to identify the important timing of events in coordination with thermal-hydraulic modeling. These methods lack the capability to evaluate complex dynamic systems. In these systems, time and energy scales associated with transient events may vary as a function of transition times and energies to arrive at a different physical state. Dynamic PRA (DPRA) methods provide a more rigorous analysis of complex dynamic systems. Unfortunately, DPRA methods introduce issues associated with the combinatorial explosion of states. 
This study presents a methodology to address combinatorial explosion using a Branch-and-Bound algorithm applied to Dynamic Event Trees (DET), which utilizes LENDIT (L – Length, E – Energy, N – Number, D – Distribution, I – Information, and T – Time) as well as set theory to describe system, state, resource, and response (S2R2) sets to create bounding functions for the DET. The optimization of the DET in identifying high-probability failure branches is extended to create a Phenomenological Identification and Ranking Table (PIRT) methodology to evaluate modeling parameters important to the safety of those failure branches that have a high probability of failure. The PIRT can then be used as a tool to identify and evaluate the need for experimental validation of models that have the potential to reduce risk. Finally, in order to demonstrate this methodology, a Boiling Water Reactor (BWR) Station Blackout (SBO) case study is presented.
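
    The pruning idea can be sketched on a toy binary event tree: sub-trees whose path probability already falls below a cutoff are never expanded. The systems, probabilities, and cutoff below are invented and unrelated to the BWR case study or to LENDIT.

```python
# Toy branch-and-bound over a binary event tree: each depth is a safety
# system that fails with the probability below; branches whose path
# probability can no longer exceed the cutoff are pruned instead of
# enumerated. All probabilities are illustrative, not BWR data.
p_fail = [0.1, 0.2, 0.05, 0.3]     # failure probability per system
CUTOFF = 5e-3                      # ignore branches less likely than this

explored = 0
failures = []                      # (path, probability) of end states

def expand(depth, path, prob):
    global explored
    explored += 1
    if prob < CUTOFF:              # bound: no child can exceed its parent
        return
    if depth == len(p_fail):
        if any(path):              # at least one system failed
            failures.append((tuple(path), prob))
        return
    expand(depth + 1, path + [True],  prob * p_fail[depth])
    expand(depth + 1, path + [False], prob * (1 - p_fail[depth]))

expand(0, [], 1.0)
failures.sort(key=lambda fp: -fp[1])
print(explored, failures[0])       # nodes visited; most probable failure
```

    A full enumeration of this tree would visit 31 nodes; the bound skips the sub-trees rooted at low-probability states, and the surviving high-probability failure branches are the ones a PIRT-style ranking would then examine.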

  12. Evaluating critical uncertainty thresholds in a spatial model of forest pest invasion risk

    Treesearch

    Frank H. Koch; Denys Yemshanov; Daniel W. McKenney; William D. Smith

    2009-01-01

    Pest risk maps can provide useful decision support in invasive species management, but most do not adequately consider the uncertainty associated with predicted risk values. This study explores how increased uncertainty in a risk model’s numeric assumptions might affect the resultant risk map. We used a spatial stochastic model, integrating components for...

  13. Uncertainty estimation of simulated water levels for the Mitch flood event in Tegucigalpa

    NASA Astrophysics Data System (ADS)

    Fuentes Andino, Diana Carolina; Halldin, Sven; Beven, Keith; Xu, Chong-Yu

    2013-04-01

    Hurricane Mitch in 1998 left a devastating flood in Tegucigalpa, the capital city of Honduras. Due to the extremely large magnitude of the Mitch flood, hydrometric measurements were not taken during the event. However, post-event indirect measurements of the discharge were obtained by the U.S. Geological Survey (USGS), and post-event surveyed high water marks were obtained by the Japan International Cooperation Agency (JICA). This work proposes a methodology to simulate the water level during the Mitch event when the available data are associated with large uncertainty. The results of the two-dimensional hydrodynamic model LISFLOOD-FP will be evaluated using the Generalized Likelihood Uncertainty Estimation (GLUE) framework. The main challenge in the proposed methodology is to formulate an approach to evaluate the model results when there are large uncertainties coming from both the model parameters and the evaluation data.

  14. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method will use state-of-the-art uncertainty analysis, applying different turbulence models, and draw conclusions on which models provide the least uncertainty and which models most accurately predict the flow of a backward facing step.

  15. Using prediction uncertainty analysis to design hydrologic monitoring networks: Example applications from the Great Lakes water availability pilot project

    USGS Publications Warehouse

    Fienen, Michael N.; Doherty, John E.; Hunt, Randall J.; Reeves, Howard W.

    2010-01-01

    The importance of monitoring networks for resource-management decisions is becoming more recognized, in both theory and application. Quantitative computer models provide a science-based framework to evaluate the efficacy and efficiency of existing and possible future monitoring networks. In the study described herein, two suites of tools were used to evaluate the worth of new data for specific predictions, which in turn can support efficient use of resources needed to construct a monitoring network. The approach evaluates the uncertainty of a model prediction and, by using linear propagation of uncertainty, estimates how much uncertainty could be reduced if the model were calibrated with additional information (increased a priori knowledge of parameter values or new observations). The theoretical underpinnings of the two suites of tools addressing this technique are compared, and their application to a hypothetical model based on a local model inset into the Great Lakes Water Availability Pilot model is described. Results show that meaningful guidance for monitoring network design can be obtained by using the methods explored. The validity of this guidance depends substantially on the parameterization as well; hence, parameterization must be considered not only when designing the parameter-estimation paradigm but also, importantly, when designing the prediction-uncertainty paradigm.
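
    The linear data-worth calculation described above can be sketched with invented sensitivities: each candidate new observation is scored by how much a one-observation Bayesian update would shrink the variance of a chosen prediction. The well names, covariances, and sensitivities are all hypothetical.

```python
# Data-worth ranking sketch: for each candidate new observation, compute
# the variance reduction a linear (Bayesian) update would give for one
# prediction, then rank candidates. All numbers are invented.

def quad(v, C, w):
    """Compute v^T C w."""
    return sum(v[i] * sum(C[i][k] * w[k] for k in range(len(w)))
               for i in range(len(v)))

C = [[2.0, 0.5],
     [0.5, 1.0]]                      # prior parameter covariance
y = [1.0, -0.5]                       # prediction sensitivity vector
noise = 0.1                           # variance of any new observation

candidates = {"well_A": [1.0, 0.0],   # hypothetical obs sensitivities
              "well_B": [0.0, 1.0],
              "well_C": [0.7, -0.7]}

base = quad(y, C, y)                  # pre-data prediction variance
worth = {}
for name, j in candidates.items():
    # variance reduction from one scalar observation (Schur complement)
    cross = quad(y, C, j)
    worth[name] = cross ** 2 / (quad(j, C, j) + noise)

best = max(worth, key=worth.get)
print(base, worth, best)
```

    An observation whose sensitivity is orthogonal to the prediction (well_B here) is worthless for this prediction no matter how precise it is, which is why data worth must be evaluated per prediction.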

  16. Hierarchical Bayesian Model Averaging for Chance Constrained Remediation Designs

    NASA Astrophysics Data System (ADS)

    Chitsazan, N.; Tsai, F. T.

    2012-12-01

    Groundwater remediation designs rely heavily on simulation models, which are subject to various sources of uncertainty in their predictions. To develop a robust remediation design, it is crucial to understand the effect of these uncertainty sources. In this research, we introduce a hierarchical Bayesian model averaging (HBMA) framework to segregate and prioritize sources of uncertainty in a multi-layer framework, where each layer targets one source of uncertainty. The HBMA framework provides insight into uncertainty priorities and propagation. In addition, HBMA allows evaluating model weights at different hierarchy levels and assessing the relative importance of models at each level. To account for uncertainty, we employ chance constrained (CC) programming for stochastic remediation design. Chance constrained programming has traditionally been implemented to account for parameter uncertainty. Recently, many studies have suggested that model structure uncertainty is not negligible compared to parameter uncertainty. Using chance constrained programming along with HBMA can provide a rigorous tool for groundwater remediation designs under uncertainty. In this research, the HBMA-CC approach was applied to a remediation design in a synthetic aquifer. The design was to develop a scavenger well approach to mitigate saltwater intrusion toward production wells. HBMA was employed to assess uncertainties from model structure, parameter estimation, and kriging interpolation. An improved harmony search optimization method was used to find the optimal location of the scavenger well. We evaluated prediction variances of chloride concentration at the production wells through the HBMA framework. The results showed that choosing the single best model may lead to a significant error in evaluating prediction variances, for two reasons. First, considering only the single best model, variances that stem from uncertainty in the model structure will be ignored. 
Second, considering the best model when its weight is not dominant may underestimate or overestimate prediction variances by ignoring other plausible propositions. Chance constraints allow developing a remediation design with a desired reliability. However, considering only the single best model, the calculated reliability will differ from the desired reliability. We calculated the reliability of the design for the models at different levels of HBMA. The results showed that, moving toward the top layers of HBMA, the calculated reliability converges to the chosen reliability. We employed chance constrained optimization along with the HBMA framework to find the optimal location and pumpage for the scavenger well. The results showed that, using models at different levels in the HBMA framework, the optimal location of the scavenger well remained the same, but the optimal extraction rate was altered. Thus, we concluded that the optimal pumping rate was sensitive to the prediction variance. Also, the prediction variance changed with the extraction rate used. Using a very high extraction rate will cause prediction variances of chloride concentration at the production wells to approach zero regardless of which HBMA models are used.
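
    The "single best model" point can be illustrated with the standard BMA variance decomposition: the averaged prediction variance adds a between-model term that the best model alone omits. The weights, predictions, and variances below are invented.

```python
# Bayesian model averaging sketch: the BMA prediction variance is the
# weighted within-model variance plus a between-model variance term.
# Weights and predictions are invented for illustration.
weights = [0.5, 0.3, 0.2]          # posterior model probabilities
means = [120.0, 150.0, 90.0]       # each model's predicted concentration
variances = [25.0, 30.0, 20.0]     # each model's own prediction variance

bma_mean = sum(w * m for w, m in zip(weights, means))
within = sum(w * v for w, v in zip(weights, variances))
between = sum(w * (m - bma_mean) ** 2 for w, m in zip(weights, means))
bma_var = within + between

print(bma_mean, bma_var)
# The best model (weight 0.5) reports variance 25.0; the averaged
# variance is larger because model-structure uncertainty is included.
```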

  17. Factoring uncertainty into restoration modeling of in-situ leach uranium mines

    USGS Publications Warehouse

    Johnson, Raymond H.; Friedel, Michael J.

    2009-01-01

    Postmining restoration is one of the greatest concerns for uranium in-situ leach (ISL) mining operations. The ISL-affected aquifer needs to be returned to conditions specified in the mining permit (either premining or other specified conditions). When uranium ISL operations are completed, postmining restoration is usually achieved by injecting reducing agents into the mined zone. The objective of this process is to restore the aquifer to premining conditions by reducing the solubility of uranium and other metals in the ground water. Reactive transport modeling is a potentially useful method for simulating the effectiveness of proposed restoration techniques. While reactive transport models can be useful, they are a simplification of reality that introduces uncertainty through the model conceptualization, parameterization, and calibration processes. For this reason, quantifying the uncertainty in simulated temporal and spatial hydrogeochemistry is important for postremedial risk evaluation of metal concentrations and mobility. Quantifying the range of uncertainty in key predictions (such as uranium concentrations at a specific location) can be achieved using forward Monte Carlo or other inverse modeling techniques (trial-and-error parameter sensitivity, calibration-constrained Monte Carlo). These techniques provide simulated values of metal concentrations at specified locations that can be presented as nonlinear uncertainty limits or probability density functions. Decisionmakers can use these results to better evaluate environmental risk, treating future metal concentrations as a limited range of possibilities based on a scientific evaluation of uncertainty.

  18. Uncertainty Aware Structural Topology Optimization Via a Stochastic Reduced Order Model Approach

    NASA Technical Reports Server (NTRS)

    Aguilo, Miguel A.; Warner, James E.

    2017-01-01

    This work presents a stochastic reduced order modeling strategy for the quantification and propagation of uncertainties in topology optimization. Uncertainty aware optimization problems can be computationally complex due to the substantial number of model evaluations that are necessary to accurately quantify and propagate uncertainties. This computational complexity is greatly magnified if a high-fidelity, physics-based numerical model is used for the topology optimization calculations. Stochastic reduced order model (SROM) methods are applied here to effectively 1) alleviate the prohibitive computational cost associated with an uncertainty aware topology optimization problem; and 2) quantify and propagate the inherent uncertainties due to design imperfections. A generic SROM framework that transforms the uncertainty aware, stochastic topology optimization problem into a deterministic optimization problem that relies only on independent calls to a deterministic numerical model is presented. This approach facilitates the use of existing optimization and modeling tools to accurately solve the uncertainty aware topology optimization problems in a fraction of the computational demand required by Monte Carlo methods. Finally, an example in structural topology optimization is presented to demonstrate the effectiveness of the proposed uncertainty aware structural topology optimization approach.
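
    A minimal SROM can be written down by hand for three samples: the probabilities are chosen to match the input's first two moments, after which the deterministic model is run once per sample and the outputs are combined with those weights. The sample locations and the toy model below are assumptions for illustration.

```python
# Minimal stochastic reduced order model (SROM) sketch: represent a
# standard-normal input by three fixed samples whose probabilities match
# its first two moments, then propagate each sample through a
# deterministic model independently. Locations and model are invented.
samples = [-1.5, 0.0, 1.5]          # chosen SROM sample locations

# For these symmetric samples we can match sum(p) = 1, mean = 0, and
# E[X^2] = 1 exactly: symmetry gives p0 = p2, and 2 * p0 * 1.5**2 = 1.
p0 = 1.0 / (2 * 1.5 ** 2)
probs = [p0, 1.0 - 2 * p0, p0]

def model(x):                       # deterministic model, called per sample
    return 3.0 * x * x + 2.0

# SROM estimate of the output mean: a weighted sum of deterministic runs.
out_mean = sum(p * model(x) for p, x in zip(probs, samples))
print(probs, out_mean)
```

    Because only independent deterministic runs are needed, existing solvers and optimizers can be reused unchanged, which is the point the abstract makes about avoiding Monte Carlo cost.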

  19. Multi-criteria evaluation of wastewater treatment plant control strategies under uncertainty.

    PubMed

    Flores-Alsina, Xavier; Rodríguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2008-11-01

    The evaluation of activated sludge control strategies in wastewater treatment plants (WWTP) via mathematical modelling is a complex activity because several objectives (e.g. economic, environmental, technical and legal) must be taken into account at the same time, i.e. the evaluation of the alternatives is a multi-criteria problem. Activated sludge models are not well characterized and some of the parameters can present uncertainty, e.g. the influent fractions arriving to the facility and the effect of either temperature or toxic compounds on the kinetic parameters, which has a strong influence on the model predictions used during the evaluation of the alternatives and affects the resulting rank of preferences. Using a simplified version of the IWA Benchmark Simulation Model No. 2 as a case study, this article shows the variations in the decision making when the uncertainty in activated sludge model (ASM) parameters is either included or not during the evaluation of WWTP control strategies. This paper comprises two main sections. Firstly, six WWTP control strategies are evaluated using multi-criteria decision analysis with the ASM parameters set at their default values. In the following section, the uncertainty is introduced, i.e. input uncertainty, which is characterized by probability distribution functions based on the available process knowledge. Next, Monte Carlo simulations are run to propagate input uncertainty through the model and quantify its effect on the different outcomes. Thus (i) the variation in the overall degree of satisfaction of the control objectives for the generated WWTP control strategies is quantified, (ii) the contributions of environmental, legal, technical and economic objectives to the existing variance are identified and finally (iii) the influence of the relative importance of the control objectives during the selection of alternatives is analyzed.
The results show that the control strategies with an external carbon source reduce the output uncertainty in the criteria used to quantify the degree of satisfaction of environmental, technical and legal objectives, but increase the economic costs and their variability as a trade-off. Also, it is shown how a preliminarily selected alternative with a cascade ammonium controller becomes less desirable when input uncertainty is included, while simpler alternatives have a greater chance of success.

  20. Analytic uncertainty and sensitivity analysis of models with input correlations

    NASA Astrophysics Data System (ADS)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in model response is usually analyzed by assuming that input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is built for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. The method is also applied to the uncertainty and sensitivity analysis of a deterministic HIV model.
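    The first-order analytic result can be sketched directly: with gradient g at the mean and input covariance matrix Σ, Var(Y) ≈ gᵀΣg, and the off-diagonal (correlation) terms quantify exactly what is lost if independence is assumed. The model below is an illustrative toy function, not the paper's HIV model:

```python
import numpy as np

# Illustrative model: y = x1^2 + 3*x1*x2 (assumed for this sketch).
mu = np.array([1.0, 2.0])
sigma = np.array([0.1, 0.2])
rho = 0.6                                   # input correlation coefficient
cov = np.array([[sigma[0] ** 2, rho * sigma[0] * sigma[1]],
                [rho * sigma[0] * sigma[1], sigma[1] ** 2]])

# Analytic gradient of y at the mean: [2*x1 + 3*x2, 3*x1].
g = np.array([2.0 * mu[0] + 3.0 * mu[1], 3.0 * mu[0]])

var_independent = np.sum(g ** 2 * sigma ** 2)  # correlations ignored
var_correlated = g @ cov @ g                   # full first-order variance
print(f"Var (independent): {var_independent:.4f}")
print(f"Var (correlated):  {var_correlated:.4f}")
```

Comparing the two variances shows at a glance whether the input correlations matter enough to be carried through the analysis.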

  1. Uncertainty in urban flood damage assessment due to urban drainage modelling and depth-damage curve estimation.

    PubMed

    Freni, G; La Loggia, G; Notaro, V

    2010-01-01

    Due to the increased occurrence of flooding events in urban areas, many procedures for flood damage quantification have been defined in recent decades. The lack of large databases in most cases is overcome by combining the output of urban drainage models and damage curves linking flooding to expected damage. The application of advanced hydraulic models as diagnostic, design and decision-making support tools has become a standard practice in hydraulic research and application. Flooding damage functions are usually evaluated by a priori estimation of potential damage (based on the value of exposed goods) or by interpolating real damage data (recorded during historical flooding events). Hydraulic models have undergone continuous advancements, pushed forward by increasing computer capacity. The details of the flooding propagation process on the surface and the details of the interconnections between underground and surface drainage systems have been studied extensively in recent years, resulting in progressively more reliable models. The same level of advancement has not been reached with regard to damage curves, for which improvements are highly connected to data availability; this remains the main bottleneck in the expected flooding damage estimation. Such functions are usually affected by significant uncertainty intrinsically related to the collected data and to the simplified structure of the adopted functional relationships. The present paper aimed to evaluate this uncertainty by comparing the intrinsic uncertainty connected to the construction of the damage-depth function to the hydraulic model uncertainty. In this way, the paper sought to evaluate the role of hydraulic model detail level in the wider context of flood damage estimation. This paper demonstrated that the use of detailed hydraulic models might not be justified because of the higher computational cost and the significant uncertainty in damage estimation curves. 
This is mainly because a large part of the total uncertainty stems from the depth-damage curves. Improving the estimation of these curves may provide better results in terms of uncertainty reduction than the adoption of detailed hydraulic models.

  2. Global Sensitivity Analysis for Identifying Important Parameters of Nitrogen Nitrification and Denitrification under Model and Scenario Uncertainties

    NASA Astrophysics Data System (ADS)

    Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.

    2017-12-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis only considers parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is entangled with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.
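    The idea of variance-based sensitivity under model averaging can be illustrated with a minimal sketch: compute a first-order index Var(E[Y|Xi])/Var(Y) for each input under each alternative model, then average the indices with assumed model probabilities. The two toy "models" and equal weights below are illustrative assumptions, not the paper's reduction functions or weighting:

```python
import numpy as np

rng = np.random.default_rng(1)
n, nbins = 200_000, 50
x1, x2 = rng.uniform(size=n), rng.uniform(size=n)

# Two alternative model structures (stand-ins for different reduction functions).
models = {"linear": x1 + 2.0 * x2, "nonlinear": x1 + 2.0 * x2 ** 2}
weights = {"linear": 0.5, "nonlinear": 0.5}   # assumed model probabilities

def first_order_index(x, y, nbins):
    # Var(E[Y|X]) / Var(Y), estimated by binning x and averaging y per bin.
    bins = np.clip((x * nbins).astype(int), 0, nbins - 1)
    counts = np.bincount(bins, minlength=nbins)
    bin_means = np.bincount(bins, weights=y, minlength=nbins) / counts
    return np.var(bin_means) / np.var(y)

for xi, name in [(x1, "S1"), (x2, "S2")]:
    avg = sum(weights[m] * first_order_index(xi, y, nbins) for m, y in models.items())
    print(f"model-averaged {name}: {avg:.3f}")
```

If the per-model indices disagree strongly, ranking parameters from a single model risks the inaccurate selection the abstract warns about.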

  3. Lognormal Approximations of Fault Tree Uncertainty Distributions.

    PubMed

    El-Shanawany, Ashraf Ben; Ardron, Keith H; Walker, Simon P

    2018-01-26

    Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly, but are modeled as probability distributions: therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: namely, the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and Wilks's method appear attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models. © 2018 Society for Risk Analysis.
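    The Wilks order-statistics method mentioned above determines, without any distributional assumption, how many Monte Carlo runs are needed so that the largest observed output bounds a given quantile with given confidence. A minimal sketch of the first-order, one-sided sample-size formula:

```python
import math

def wilks_one_sided(beta, gamma):
    """Smallest N such that the maximum of N runs bounds the gamma-quantile
    of the output distribution with confidence beta (first-order Wilks):
    P(max >= gamma-quantile) = 1 - gamma**N >= beta."""
    return math.ceil(math.log(1.0 - beta) / math.log(gamma))

print(wilks_one_sided(0.95, 0.95))  # the classic 95%/95% criterion -> 59 runs
print(wilks_one_sided(0.99, 0.95))
```

This is why the Wilks bound is cheap compared with full Monte Carlo: 59 runs suffice for a 95%/95% upper bound regardless of how complex the fault tree is.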

  4. Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool

    NASA Astrophysics Data System (ADS)

    Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.

    2018-06-01

    Air quality has significantly improved in Europe over the past few decades. Nonetheless we still find high concentrations in measurements, mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement EU-wide existing policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support of regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model, and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify uncertainty in the model output, and (2) sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.

  5. A Reliability Estimation in Modeling Watershed Runoff With Uncertainties

    NASA Astrophysics Data System (ADS)

    Melching, Charles S.; Yen, Ben Chie; Wenzel, Harry G., Jr.

    1990-10-01

    The reliability of simulation results produced by watershed runoff models is a function of uncertainties in nature, data, model parameters, and model structure. A framework is presented here for using a reliability analysis method (such as first-order second-moment techniques or Monte Carlo simulation) to evaluate the combined effect of the uncertainties on the reliability of output hydrographs from hydrologic models. For a given event the prediction reliability can be expressed in terms of the probability distribution of the estimated hydrologic variable. The peak discharge probability for a watershed in Illinois using the HEC-1 watershed model is given as an example. The study of the reliability of predictions from watershed models provides useful information on the stochastic nature of output from deterministic models subject to uncertainties and identifies the relative contribution of the various uncertainties to unreliability of model predictions.
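    The first-order second-moment (FOSM) technique named above propagates parameter means and variances through the model via its gradient, and Monte Carlo simulation provides a check on the approximation. The peak-discharge function and parameter values below are illustrative assumptions, not the HEC-1 model of the study:

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative peak-discharge relation (not HEC-1): Q = c * 10 * i**1.5
def peak_q(params):
    c, i = params
    return c * 10.0 * i ** 1.5

mu = np.array([0.8, 2.0])   # mean runoff coefficient, rainfall intensity
sd = np.array([0.05, 0.2])

# FOSM: mean at the parameter means, variance via the (finite-difference) gradient.
eps = 1e-6
grad = np.array([
    (peak_q(mu + eps * np.eye(2)[k]) - peak_q(mu - eps * np.eye(2)[k])) / (2 * eps)
    for k in range(2)
])
fosm_mean = peak_q(mu)
fosm_sd = np.sqrt(np.sum(grad ** 2 * sd ** 2))

# Monte Carlo check of the FOSM approximation.
samples = rng.normal(mu, sd, size=(100_000, 2))
q = peak_q(samples.T)
print(f"FOSM: {fosm_mean:.2f} +/- {fosm_sd:.2f}")
print(f"MC:   {q.mean():.2f} +/- {q.std():.2f}")
```

The resulting distribution of Q is exactly the "prediction reliability expressed as a probability distribution of the estimated hydrologic variable" that the abstract describes.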

  6. PC-BASED SUPERCOMPUTING FOR UNCERTAINTY AND SENSITIVITY ANALYSIS OF MODELS

    EPA Science Inventory

    Evaluating uncertainty and sensitivity of multimedia environmental models that integrate assessments of air, soil, sediments, groundwater, and surface water is a difficult task. It can be an enormous undertaking even for simple, single-medium models (i.e. groundwater only) descr...

  7. Methods for evaluating the predictive accuracy of structural dynamic models

    NASA Technical Reports Server (NTRS)

    Hasselman, Timothy K.; Chrostowski, Jon D.

    1991-01-01

    Modeling uncertainty is defined in terms of the difference between predicted and measured eigenvalues and eigenvectors. Data compiled from 22 sets of analysis/test results were used to create statistical databases for large truss-type space structures and both pretest and posttest models of conventional satellite-type space structures. Modeling uncertainty is propagated through the model to produce intervals of uncertainty on frequency response functions, both amplitude and phase. This methodology was used successfully to evaluate the predictive accuracy of several structures, including the NASA CSI Evolutionary Structure tested at Langley Research Center. Test measurements for this structure were within ± one-sigma intervals of predicted accuracy for the most part, demonstrating the validity of the methodology and computer code.

  8. Computer-Based Model Calibration and Uncertainty Analysis: Terms and Concepts

    DTIC Science & Technology

    2015-07-01

    uncertainty analyses throughout the lifecycle of planning, designing, and operating of Civil Works flood risk management projects as described in... value 95% of the time. In the frequentist approach to PE, model parameters are regarded as having true values, and their estimate is based on the...

  9. Natural hazard modeling and uncertainty analysis [Chapter 2

    Treesearch

    Matthew Thompson; Jord J. Warmink

    2017-01-01

    Modeling can play a critical role in assessing and mitigating risks posed by natural hazards. These modeling efforts generally aim to characterize the occurrence, intensity, and potential consequences of natural hazards. Uncertainties surrounding the modeling process can have important implications for the development, application, evaluation, and interpretation of...

  10. Integrating model behavior, optimization, and sensitivity/uncertainty analysis: overview and application of the MOUSE software toolbox

    USDA-ARS?s Scientific Manuscript database

    This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...

  11. Uncertainty of climate change impact on groundwater reserves - Application to a chalk aquifer

    NASA Astrophysics Data System (ADS)

    Goderniaux, Pascal; Brouyère, Serge; Wildemeersch, Samuel; Therrien, René; Dassargues, Alain

    2015-09-01

    Recent studies have evaluated the impact of climate change on groundwater resources for different geographical and climatic contexts. However, most studies have either not estimated the uncertainty around projected impacts or have limited the analysis to the uncertainty related to climate models. In this study, the uncertainties around impact projections from several sources (climate models, natural variability of the weather, hydrological model calibration) are calculated and compared for the Geer catchment (465 km2) in Belgium. We use a surface-subsurface integrated model implemented using the finite element code HydroGeoSphere, coupled with climate change scenarios (2010-2085) and the UCODE_2005 inverse model, to assess the uncertainty related to the calibration of the hydrological model. This integrated model provides a more realistic representation of the water exchanges between surface and subsurface domains and further constrains the calibration through the use of both surface and subsurface observed data. Sensitivity and uncertainty analyses were performed on predictions. The linear uncertainty analysis is approximate for this nonlinear system, but it provides some measure of uncertainty for computationally demanding models. Results show that, for the Geer catchment, the most important uncertainty is related to calibration of the hydrological model. The total uncertainty associated with the prediction of groundwater levels remains large. By the end of the century, however, the uncertainty becomes smaller than the predicted decline in groundwater levels.

  12. Constraining ozone-precursor responsiveness using ambient measurements

    EPA Science Inventory

    This study develops probabilistic estimates of ozone (O3) sensitivities to precursoremissions by incorporating uncertainties in photochemical modeling and evaluating modelperformance based on ground-level observations of O3 and oxides of nitrogen (NOx).Uncertainties in model form...

  13. Evaluation of pollutant loads from stormwater BMPs to receiving water using load frequency curves with uncertainty analysis.

    PubMed

    Park, Daeryong; Roesner, Larry A

    2012-12-15

    This study examined pollutant loads released to receiving water from a typical urban watershed in the Los Angeles (LA) Basin of California by applying a best management practice (BMP) performance model that includes uncertainty. This BMP performance model uses the k-C model and incorporates uncertainty analysis and the first-order second-moment (FOSM) method to assess the effectiveness of BMPs for removing stormwater pollutants. Uncertainties were considered for the influent event mean concentration (EMC) and the aerial removal rate constant of the k-C model. The storage treatment overflow and runoff model (STORM) was used to simulate the flow volume from watershed, the bypass flow volume and the flow volume that passes through the BMP. Detention basins and total suspended solids (TSS) were chosen as representatives of stormwater BMP and pollutant, respectively. This paper applies load frequency curves (LFCs), which replace the exceedance percentage with an exceedance frequency as an alternative to load duration curves (LDCs), to evaluate the effectiveness of BMPs. An evaluation method based on uncertainty analysis is suggested because it applies a water quality standard exceedance based on frequency and magnitude. As a result, the incorporation of uncertainty in the estimates of pollutant loads can assist stormwater managers in determining the degree of total daily maximum load (TMDL) compliance that could be expected from a given BMP in a watershed. Copyright © 2012 Elsevier Ltd. All rights reserved.
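    The BMP performance calculation can be sketched with the commonly cited form of the k-C* relationship (first-order decay toward an irreducible background concentration), propagated by Monte Carlo over the uncertain influent EMC and removal rate. The functional form and all parameter values below are assumptions for illustration, not the study's calibrated values:

```python
import numpy as np

rng = np.random.default_rng(3)

def kc_effluent(c_in, c_star, k, q):
    # First-order decay toward an irreducible background concentration C*
    # (commonly written C_out = C* + (C_in - C*) * exp(-k/q); assumed here).
    return c_star + (c_in - c_star) * np.exp(-k / q)

n = 50_000
c_in = rng.lognormal(np.log(100.0), 0.4, n)  # influent TSS EMC, mg/L (assumed)
k = rng.normal(30.0, 6.0, n)                 # areal removal rate, m/yr (assumed)
c_star, q = 12.0, 50.0                       # background conc., hydraulic loading

c_out = kc_effluent(c_in, c_star, k, q)
exceed = np.mean(c_out > 60.0)               # hypothetical 60 mg/L standard
print(f"median effluent: {np.median(c_out):.1f} mg/L, exceedance freq: {exceed:.2%}")
```

Expressing the result as an exceedance frequency is the load-frequency-curve perspective the abstract advocates: it reports how often a standard is violated, not just the average load.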

  14. Reliability of a new biokinetic model of zirconium in internal dosimetry: part I, parameter uncertainty analysis.

    PubMed

    Li, Wei Bo; Greiter, Matthias; Oeh, Uwe; Hoeschen, Christoph

    2011-12-01

    The reliability of biokinetic models is essential in internal dose assessments and radiation risk analysis for the public, occupational workers, and patients exposed to radionuclides. In this paper, a method for assessing the reliability of biokinetic models by means of uncertainty and sensitivity analysis was developed. The paper is divided into two parts. In the first part of the study published here, the uncertainty sources of the model parameters for zirconium (Zr), developed by the International Commission on Radiological Protection (ICRP), were identified and analyzed. Furthermore, the uncertainty of the biokinetic experimental measurement performed at the Helmholtz Zentrum München-German Research Center for Environmental Health (HMGU) for developing a new biokinetic model of Zr was analyzed according to the Guide to the Expression of Uncertainty in Measurement, published by the International Organization for Standardization. The confidence interval and distribution of model parameters of the ICRP and HMGU Zr biokinetic models were evaluated. As a result of the computational biokinetic modeling, the mean, standard uncertainty, and confidence interval of model prediction calculated based on the model parameter uncertainty were presented and compared to the plasma clearance and urinary excretion measured after intravenous administration. It was shown that for the most important compartment, the plasma, the uncertainty evaluated for the HMGU model was much smaller than that for the ICRP model; that phenomenon was observed for other organs and tissues as well. The uncertainty of the integral of the radioactivity of Zr up to 50 y calculated by the HMGU model after ingestion by adult members of the public was shown to be smaller by a factor of two than that of the ICRP model. 
It was also shown that the distribution type of the model parameters strongly influences the model prediction, and that the correlation of the model input parameters affects the model prediction to an extent that depends on the strength of the correlation. In the case of model prediction, the qualitative comparison of the model predictions with the measured plasma and urinary data showed the HMGU model to be more reliable than the ICRP model; quantitatively, the uncertainty of the model prediction by the HMGU systemic biokinetic model is smaller than that of the ICRP model. The uncertainty information on the model parameters analyzed in this study is used in the second part of the paper, which presents a sensitivity analysis of the Zr biokinetic models.

  15. Effects of uncertainties in hydrological modelling. A case study of a mountainous catchment in Southern Norway

    NASA Astrophysics Data System (ADS)

    Engeland, Kolbjørn; Steinsland, Ingelin; Johansen, Stian Solvang; Petersen-Øverleir, Asgeir; Kolberg, Sjur

    2016-05-01

    In this study, we explore the effect of uncertainty and poor observation quality on hydrological model calibration and predictions. The Osali catchment in Western Norway was selected as a case study and an elevation-distributed HBV model was used. We systematically evaluated the effect of accounting for uncertainty in parameters, precipitation input, temperature input and streamflow observations. For precipitation and temperature we accounted for the interpolation uncertainty, and for streamflow we accounted for rating curve uncertainty. Further, the effects of poorer quality of precipitation input and streamflow observations were explored. Less information about precipitation was obtained by excluding the nearest precipitation station from the analysis, while reduced information about the streamflow was obtained by omitting the highest and lowest streamflow observations when estimating the rating curve. The results showed that including uncertainty in the precipitation and temperature inputs has a negligible effect on the posterior distribution of parameters and on the Nash-Sutcliffe (NS) efficiency for the predicted flows, while the reliability and the continuous rank probability score (CRPS) improve. Less information in precipitation input resulted in a shift in the water balance parameter Pcorr and a model producing smoother streamflow predictions, giving poorer NS and CRPS but higher reliability. The effect of calibrating the hydrological model using streamflow observations based on different rating curves is mainly seen as variability in the water balance parameter Pcorr. When evaluating predictions, the best evaluation scores were not achieved for the rating curve used for calibration, but for rating curves giving smoother streamflow observations. Less information in streamflow influenced the water balance parameter Pcorr, and increased the spread in evaluation scores by giving both better and worse scores.
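    The CRPS used as an evaluation score above has a simple empirical estimator for an ensemble forecast, CRPS = E|X - y| - 0.5·E|X - X'|, which rewards both accuracy and sharpness. A minimal sketch with synthetic ensembles (the numbers are illustrative, not the study's streamflow data):

```python
import numpy as np

def crps_ensemble(ens, obs):
    """Empirical CRPS for one observation: E|X - y| - 0.5 * E|X - X'|."""
    ens = np.asarray(ens, dtype=float)
    term1 = np.mean(np.abs(ens - obs))
    term2 = 0.5 * np.mean(np.abs(ens[:, None] - ens[None, :]))
    return term1 - term2

sharp = np.random.default_rng(0).normal(10.0, 0.5, 500)  # sharp ensemble
wide = np.random.default_rng(0).normal(10.0, 3.0, 500)   # diffuse ensemble
obs = 10.2
print(f"CRPS sharp: {crps_ensemble(sharp, obs):.3f}")
print(f"CRPS wide:  {crps_ensemble(wide, obs):.3f}")
```

For a degenerate (single-valued) ensemble the score collapses to the absolute error, so CRPS generalizes deterministic skill scores to probabilistic predictions.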

  16. Bayesian Model Averaging of Artificial Intelligence Models for Hydraulic Conductivity Estimation

    NASA Astrophysics Data System (ADS)

    Nadiri, A.; Chitsazan, N.; Tsai, F. T.; Asghari Moghaddam, A.

    2012-12-01

    This research presents a Bayesian artificial intelligence model averaging (BAIMA) method that incorporates multiple artificial intelligence (AI) models to estimate hydraulic conductivity and evaluate estimation uncertainties. Uncertainty in the AI model outputs stems from error in model input as well as non-uniqueness in selecting different AI methods. Using one single AI model tends to bias the estimation and underestimate uncertainty. BAIMA employs Bayesian model averaging (BMA) technique to address the issue of using one single AI model for estimation. BAIMA estimates hydraulic conductivity by averaging the outputs of AI models according to their model weights. In this study, the model weights were determined using the Bayesian information criterion (BIC) that follows the parsimony principle. BAIMA calculates the within-model variances to account for uncertainty propagation from input data to AI model output. Between-model variances are evaluated to account for uncertainty due to model non-uniqueness. We employed Takagi-Sugeno fuzzy logic (TS-FL), artificial neural network (ANN) and neurofuzzy (NF) to estimate hydraulic conductivity for the Tasuj plain aquifer, Iran. BAIMA combined three AI models and produced better fitting than individual models. While NF was expected to be the best AI model owing to its utilization of both TS-FL and ANN models, the NF model is nearly discarded by the parsimony principle. The TS-FL model and the ANN model showed equal importance although their hydraulic conductivity estimates were quite different. This resulted in significant between-model variances that are normally ignored by using one AI model.
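    The BIC-based weighting at the heart of the BMA step can be sketched in a few lines: each model's weight is proportional to exp(-ΔBIC/2), so a model heavily penalized for complexity (as NF was here) receives a near-zero weight. The BIC scores and predictions below are hypothetical numbers for illustration, not the study's results:

```python
import numpy as np

def bic_weights(bic):
    """Approximate posterior model weights from BIC values (BMA)."""
    bic = np.asarray(bic, dtype=float)
    delta = bic - bic.min()          # shift to stabilize the exponentials
    w = np.exp(-0.5 * delta)
    return w / w.sum()

# Hypothetical BIC scores for three AI models (e.g. TS-FL, ANN, NF).
bic = [120.4, 120.9, 131.7]
w = bic_weights(bic)
print(np.round(w, 3))                # the third model is effectively discarded

# BMA point estimate: weighted average of the models' predictions.
preds = np.array([2.1, 2.9, 2.4])    # hypothetical hydraulic conductivity values
print(f"BMA estimate: {np.dot(w, preds):.3f}")
```

The spread among the individual predictions relative to the weighted mean is what drives the between-model variance the abstract highlights.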

  17. Benchmark Evaluation of Start-Up and Zero-Power Measurements at the High-Temperature Engineering Test Reactor

    DOE PAGES

    Bess, John D.; Fujimoto, Nozomu

    2014-10-09

    Benchmark models were developed to evaluate six cold-critical and two warm-critical, zero-power measurements of the HTTR. Additional measurements of a fully-loaded subcritical configuration, core excess reactivity, shutdown margins, six isothermal temperature coefficients, and axial reaction-rate distributions were also evaluated as acceptable benchmark experiments. Insufficient information is publicly available to develop finely-detailed models of the HTTR as much of the design information is still proprietary. However, the uncertainties in the benchmark models are judged to be of sufficient magnitude to encompass any biases and bias uncertainties incurred through the simplification process used to develop the benchmark models. Dominant uncertainties in the experimental keff for all core configurations come from uncertainties in the impurity content of the various graphite blocks that comprise the HTTR. Monte Carlo calculations of keff are between approximately 0.9 % and 2.7 % greater than the benchmark values. Reevaluation of the HTTR models as additional information becomes available could improve the quality of this benchmark and possibly reduce the computational biases. High-quality characterization of graphite impurities would significantly improve the quality of the HTTR benchmark assessment. Simulations of the other reactor physics measurements are in good agreement with the benchmark experiment values. The complete benchmark evaluation details are available in the 2014 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments.

  18. Assessing the relative importance of parameter and forcing uncertainty and their interactions in conceptual hydrological model simulations

    NASA Astrophysics Data System (ADS)

    Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.

    2016-11-01

    Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
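    The analysis-of-variance decomposition used above splits total simulation variance into forcing, parameter, and interaction contributions. For a full factorial of forcing ensembles by parameter sets, the standard two-factor sums of squares give that split directly. The synthetic "streamflow" grid below is an illustrative stand-in, not the Irish catchment data:

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated outputs on a grid of forcing ensembles x parameter sets.
n_forcing, n_params = 20, 15
a = rng.normal(0.0, 1.0, n_forcing)[:, None]   # forcing main effect
b = rng.normal(0.0, 0.5, n_params)[None, :]    # parameter main effect
y = 5.0 + a + b + 0.3 * a * b                  # includes an interaction term

grand = y.mean()
ss_total = ((y - grand) ** 2).sum()
ss_forcing = n_params * ((y.mean(axis=1) - grand) ** 2).sum()
ss_params = n_forcing * ((y.mean(axis=0) - grand) ** 2).sum()
ss_inter = ss_total - ss_forcing - ss_params   # residual = interaction

for name, ss in [("forcing", ss_forcing), ("parameters", ss_params),
                 ("interaction", ss_inter)]:
    print(f"{name:12s}: {100 * ss / ss_total:5.1f}% of total variance")
```

Because the three sums of squares are orthogonal for a complete factorial design, the percentage contributions add up to 100% by construction.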

  19. Determining the nuclear data uncertainty on MONK10 and WIMS10 criticality calculations

    NASA Astrophysics Data System (ADS)

    Ware, Tim; Dobson, Geoff; Hanlon, David; Hiles, Richard; Mason, Robert; Perry, Ray

    2017-09-01

    The ANSWERS Software Service is developing a number of techniques to better understand and quantify uncertainty on calculations of the neutron multiplication factor, k-effective, in nuclear fuel and other systems containing fissile material. The uncertainty on the calculated k-effective arises from a number of sources, including nuclear data uncertainties, manufacturing tolerances, modelling approximations and, for Monte Carlo simulation, stochastic uncertainty. For determining the uncertainties due to nuclear data, a set of application libraries have been generated for use with the MONK10 Monte Carlo and the WIMS10 deterministic criticality and reactor physics codes. This paper overviews the generation of these nuclear data libraries by Latin hypercube sampling of JEFF-3.1.2 evaluated data based upon a library of covariance data taken from JEFF, ENDF/B, JENDL and TENDL evaluations. Criticality calculations have been performed with MONK10 and WIMS10 using these sampled libraries for a number of benchmark models of fissile systems. Results are presented which show the uncertainty on k-effective for these systems arising from the uncertainty on the input nuclear data.
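    The Latin hypercube sampling used to generate the perturbed libraries can be sketched in a few lines: each of the n samples falls in a distinct equal-probability stratum of every dimension, which covers the input space far more evenly than plain random sampling. The mapping to ±5% parameter perturbations below is an illustrative assumption, not actual covariance data:

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """n points in [0, 1)^d with exactly one point per equal-probability stratum."""
    u = rng.uniform(size=(n, d))                         # position within stratum
    perm = np.argsort(rng.uniform(size=(n, d)), axis=0)  # random stratum order
    return (perm + u) / n

rng = np.random.default_rng(11)
pts = latin_hypercube(8, 2, rng)

# Map unit-hypercube samples to +/-5% variation around nominal parameter
# values (illustrative scaling, not evaluated nuclear data).
nominal = np.array([2.0, 0.5])
sampled = nominal * (0.95 + 0.10 * pts)
print(np.round(sampled, 4))
```

Each sampled row defines one perturbed library; running the criticality code once per row yields the distribution of k-effective from which the nuclear data uncertainty is read off.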

  20. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier-Stokes simulations: A data-driven, physics-informed Bayesian approach

    NASA Astrophysics Data System (ADS)

    Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C. J.

    2016-11-01

    Despite their well-known limitations, Reynolds-Averaged Navier-Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. 
This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes.
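    The analysis step of an iterative ensemble Kalman method can be sketched as below; the variable names and the use of perturbed observations are generic assumptions rather than the authors' exact scheme:

```python
import numpy as np

def enkf_update(X, Y, y_obs, r_obs, rng):
    """One ensemble Kalman analysis step.

    X: (n_par, n_ens) ensemble of (e.g. Reynolds-stress) perturbation parameters
    Y: (n_obs, n_ens) model-predicted observations for each member
    y_obs: observed values; r_obs: observation-error variance (assumed scalar)."""
    n_ens = X.shape[1]
    Xp = X - X.mean(axis=1, keepdims=True)           # parameter anomalies
    Yp = Y - Y.mean(axis=1, keepdims=True)           # predicted-observation anomalies
    C_xy = Xp @ Yp.T / (n_ens - 1)
    C_yy = Yp @ Yp.T / (n_ens - 1) + r_obs * np.eye(Y.shape[0])
    K = C_xy @ np.linalg.inv(C_yy)                   # Kalman gain
    # perturbed observations keep the posterior ensemble spread consistent
    d = y_obs[:, None] + rng.normal(0.0, np.sqrt(r_obs), Y.shape)
    return X + K @ (d - Y)
```

    Iterating this update with repeated forward-model evaluations pulls the prior ensemble toward the sparse observations while retaining the prior's physical constraints.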

  1. Uncertainty Quantification in Climate Modeling and Projection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qian, Yun; Jackson, Charles; Giorgi, Filippo

    The projection of future climate is one of the most complex problems undertaken by the scientific community. Although scientists have been striving to better understand the physical basis of the climate system and to improve climate models, the overall uncertainty in projections of future climate has not been significantly reduced (e.g., from the IPCC AR4 to AR5). With the rapid increase of complexity in Earth system models, reducing uncertainties in climate projections becomes extremely challenging. Since uncertainties always exist in climate models, interpreting the strengths and limitations of future climate projections is key to evaluating risks, and climate change informationmore » for use in Vulnerability, Impact, and Adaptation (VIA) studies should be provided with both well-characterized and well-quantified uncertainty. The workshop aimed at providing participants, many of them from developing countries, information on strategies to quantify the uncertainty in climate model projections and assess the reliability of climate change information for decision-making. The program included a mixture of lectures on fundamental concepts in Bayesian inference and sampling, applications, and hands-on computer laboratory exercises employing software packages for Bayesian inference, Markov Chain Monte Carlo methods, and global sensitivity analyses. The lectures covered a range of scientific issues underlying the evaluation of uncertainties in climate projections, such as the effects of uncertain initial and boundary conditions, uncertain physics, and limitations of observational records. Progress in quantitatively estimating uncertainties in hydrologic, land surface, and atmospheric models at both regional and global scales was also reviewed. The application of Uncertainty Quantification (UQ) concepts to coupled climate system models is still in its infancy. 
The Coupled Model Intercomparison Project (CMIP) multi-model ensemble currently represents the primary data for assessing reliability and uncertainties of climate change information. An alternative approach is to generate similar ensembles by perturbing parameters within a single-model framework. One of workshop’s objectives was to give participants a deeper understanding of these approaches within a Bayesian statistical framework. However, there remain significant challenges still to be resolved before UQ can be applied in a convincing way to climate models and their projections.« less

  2. Quantifying the impact of the longitudinal dispersion coefficient parameter uncertainty on the physical transport processes in rivers

    NASA Astrophysics Data System (ADS)

    Camacho Suarez, V. V.; Shucksmith, J.; Schellart, A.

    2016-12-01

    Analytical and numerical models can be used to represent the advection-dispersion processes governing the transport of pollutants in rivers (Fan et al., 2015; Van Genuchten et al., 2013). Simplifications, assumptions and parameter estimations in these models result in various uncertainties within the modelling process and in the estimated pollutant concentrations. In this study, we explore both: 1) the structural uncertainty due to the one-dimensional simplification of the Advection Dispersion Equation (ADE) and 2) the parameter uncertainty due to the semi-empirical estimation of the longitudinal dispersion coefficient. The relative significance of these uncertainties has not previously been examined. By analysing both the relative structural uncertainty of analytical solutions of the ADE and the parameter uncertainty due to the longitudinal dispersion coefficient via a Monte Carlo analysis, we evaluate the dominant uncertainties over a range of spatial scales for a case study in the river Chillan, Chile.
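    The parameter-uncertainty part of such an analysis can be sketched by sampling the dispersion coefficient and propagating it through the 1-D analytical solution for an instantaneous point source. The lognormal prior and every numeric value below are illustrative assumptions, not values from the study:

```python
import numpy as np

def ade_solution(x, t, u, D, mass=1.0, area=1.0):
    """1-D advection-dispersion solution for an instantaneous point source:
    C(x, t) = M / (A * sqrt(4*pi*D*t)) * exp(-(x - u*t)^2 / (4*D*t))."""
    return (mass / (area * np.sqrt(4.0 * np.pi * D * t))
            * np.exp(-(x - u * t) ** 2 / (4.0 * D * t)))

rng = np.random.default_rng(42)
D = rng.lognormal(mean=np.log(5.0), sigma=0.5, size=2000)  # m^2/s, assumed prior
peak = ade_solution(x=500.0, t=500.0, u=1.0, D=D)          # at the peak, x = u*t
lo, hi = np.percentile(peak, [5, 95])                      # 90% uncertainty band
```

    Comparing the width of this band with the discrepancy between 1-D and higher-dimensional solutions gives the relative weight of parameter versus structural uncertainty at a given distance downstream.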

  3. Effects of model structural uncertainty on carbon cycle projections: biological nitrogen fixation as a case study

    NASA Astrophysics Data System (ADS)

    Wieder, William R.; Cleveland, Cory C.; Lawrence, David M.; Bonan, Gordon B.

    2015-04-01

    Uncertainties in terrestrial carbon (C) cycle projections increase uncertainty of potential climate feedbacks. Efforts to improve model performance often include increased representation of biogeochemical processes, such as coupled carbon-nitrogen (N) cycles. In doing so, models are becoming more complex, generating structural uncertainties in model form that reflect incomplete knowledge of how to represent underlying processes. Here, we explore structural uncertainties associated with biological nitrogen fixation (BNF) and quantify their effects on C cycle projections. We find that alternative plausible structures to represent BNF result in nearly equivalent terrestrial C fluxes and pools through the twentieth century, but the strength of the terrestrial C sink varies by nearly a third (50 Pg C) by the end of the twenty-first century under a business-as-usual climate change scenario representative concentration pathway 8.5. These results indicate that actual uncertainty in future C cycle projections may be larger than previously estimated, and this uncertainty will limit C cycle projections until model structures can be evaluated and refined.

  4. Systematic Uncertainties in High-Energy Hadronic Interaction Models

    NASA Astrophysics Data System (ADS)

    Zha, M.; Knapp, J.; Ostapchenko, S.

    2003-07-01

    Hadronic interaction models for cosmic ray energies are uncertain since our knowledge of hadronic interactions is extrapolated from accelerator experiments at much lower energies. At present most high-energy models are based on Gribov-Regge theory of multi-Pomeron exchange, which provides a theoretical framework to evaluate cross-sections and particle production. While experimental data constrain some of the model parameters, others are not well determined and are therefore a source of systematic uncertainties. In this paper we evaluate the variation of results obtained with the QGSJET model, when modifying parameters relating to three major sources of uncertainty: the form of the parton structure function, the role of diffractive interactions, and the string hadronisation. Results on inelastic cross sections, on secondary particle production and on the air shower development are discussed.

  5. Bayesian uncertainty analysis for complex systems biology models: emulation, global parameter searches and evaluation of gene functions.

    PubMed

    Vernon, Ian; Liu, Junli; Goldstein, Michael; Rowe, James; Topping, Jen; Lindsey, Keith

    2018-01-02

    Many mathematical models have now been employed across every area of systems biology. These models increasingly involve large numbers of unknown parameters, have complex structure which can result in substantial evaluation time relative to the needs of the analysis, and need to be compared to observed data of various forms. The correct analysis of such models usually requires a global parameter search, over a high dimensional parameter space, that incorporates and respects the most important sources of uncertainty. This can be an extremely difficult task, but it is essential for any meaningful inference or prediction to be made about any biological system. It hence represents a fundamental challenge for the whole of systems biology. Bayesian statistical methodology for the uncertainty analysis of complex models is introduced, which is designed to address the high dimensional global parameter search problem. Bayesian emulators that mimic the systems biology model but which are extremely fast to evaluate are embedded within an iterative history match: an efficient method to search high dimensional spaces within a more formal statistical setting, while incorporating major sources of uncertainty. The approach is demonstrated via application to a model of hormonal crosstalk in Arabidopsis root development, which has 32 rate parameters, for which we identify the sets of rate parameter values that lead to acceptable matches between model output and observed trend data. The multiple insights into the model's structure that this analysis provides are discussed. The methodology is applied to a second related model, and the biological consequences of the resulting comparison, including the evaluation of gene functions, are described. Bayesian uncertainty analysis for complex models using both emulators and history matching is shown to be a powerful technique that can greatly aid the study of a large class of systems biology models.
It both provides insight into model behaviour and identifies the sets of rate parameters of interest.
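    The core of a history match is an implausibility measure that compares emulator output with data while accounting for every quantified uncertainty; the generic sketch below is not the authors' exact formulation:

```python
import numpy as np

def implausibility(em_mean, em_var, z_obs, obs_var, disc_var):
    """Standardised distance between the emulator's prediction and the
    observation, inflated by emulator variance, observation-error variance
    and model-discrepancy variance."""
    return np.abs(em_mean - z_obs) / np.sqrt(em_var + obs_var + disc_var)

# candidate rate-parameter vectors with implausibility below ~3 are retained
# as "non-implausible"; the emulator is refit on the reduced space each wave
```

    Because the emulator is cheap, millions of 32-dimensional candidates can be screened this way between expensive runs of the full model.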

  6. Evaluate the seasonal cycle and interannual variability of carbon fluxes and the associated uncertainties using modeled and observed data

    NASA Astrophysics Data System (ADS)

    Zeng, F.; Collatz, G. J.; Ivanoff, A.

    2013-12-01

    We assessed the performance of the Carnegie-Ames-Stanford Approach - Global Fire Emissions Database (CASA-GFED3) terrestrial carbon cycle model in simulating seasonal cycle and interannual variability (IAV) of global and regional carbon fluxes and uncertainties associated with model parameterization. Key model parameters were identified from sensitivity analyses and their uncertainties were propagated through model processes using the Monte Carlo approach to estimate the uncertainties in carbon fluxes and pool sizes. Three independent flux data sets, the global gross primary productivity (GPP) upscaled from eddy covariance flux measurements by Jung et al. (2011), the net ecosystem exchange (NEE) estimated by CarbonTracker, and the eddy covariance flux observations, were used to evaluate modeled fluxes and the uncertainties. Modeled fluxes agree well with both Jung's GPP and CarbonTracker NEE in the amplitude and phase of seasonal cycle, except in the case of GPP in tropical regions where Jung et al. (2011) showed larger fluxes and seasonal amplitude. Modeled GPP IAV is positively correlated (p < 0.1) with Jung's GPP IAV except in the tropics and temperate South America. The correlations between modeled NEE IAV and CarbonTracker NEE IAV are weak at regional to continental scales but stronger when fluxes are aggregated to >40°N latitude. At regional to continental scales flux uncertainties were larger than the IAV in the fluxes for both Jung's GPP and CarbonTracker NEE. Comparisons with eddy covariance flux observations are focused on sites within regions and years of recorded large-scale climate anomalies. We also evaluated modeled biomass using other independent continental biomass estimates and found good agreement. From the comparisons we identify the strengths and weaknesses of the model to capture the seasonal cycle and IAV of carbon fluxes and highlight ways to improve model performance.

  7. Development and comparison of Bayesian modularization method in uncertainty assessment of hydrological models

    NASA Astrophysics Data System (ADS)

    Li, L.; Xu, C.-Y.; Engeland, K.

    2012-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, different approaches have been used in hydrological models. The Bayesian method is one of the most widely used methods for uncertainty assessment of hydrological models; it incorporates different sources of information into a single analysis through Bayes' theorem. However, none of these applications adequately treats the uncertainty in extreme flows of hydrological model simulations. This study proposes a Bayesian modularization approach to uncertainty assessment of conceptual hydrological models that considers the extreme flows. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and traditional Bayesian approaches using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used in combination with the traditional Bayesian approach: the AR(1) plus Normal, time-period-independent model (Model 1); the AR(1) plus Normal, time-period-dependent model (Model 2); and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimates of entire flows and in terms of application and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models via Bayesian methods. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
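    The traditional Bayesian runs rest on a Metropolis-Hastings sampler. A minimal random-walk version is sketched below, with the log-posterior left abstract since the WASMOD likelihoods are not reproduced here:

```python
import numpy as np

def metropolis(log_post, theta0, n_iter, step, rng):
    """Random-walk Metropolis-Hastings for a scalar parameter.

    log_post: callable returning the (unnormalised) log posterior density."""
    chain = [theta0]
    lp = log_post(theta0)
    for _ in range(n_iter):
        prop = chain[-1] + rng.normal(0.0, step)     # symmetric proposal
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:      # accept with min(1, ratio)
            chain.append(prop)
            lp = lp_prop
        else:
            chain.append(chain[-1])
    return np.array(chain)
```

    In a modularized setting, separate chains (or separate likelihood modules) handle the bulk of the flow record and the extreme flows, so that a few extreme residuals cannot dominate the whole posterior.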

  8. Uncertainty analysis of hydrological modeling in a tropical area using different algorithms

    NASA Astrophysics Data System (ADS)

    Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh

    2018-01-01

    Hydrological modeling outputs are subject to uncertainty resulting from different sources of error (e.g., errors in input data, model structure, and model parameters), making quantification of uncertainty in hydrological modeling imperative and a means to improve the reliability of modeling results. Uncertainty analysis must address difficulties in the calibration of hydrological models, which increase further in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantifying different sources of uncertainty (especially parameter uncertainty) and evaluating their performance. In this study, the Soil and Water Assessment Tool (SWAT) eco-hydrological model was implemented for a watershed in central Vietnam. The sensitivity of parameters was analyzed, and the model was calibrated. The uncertainty analysis for the hydrological model was conducted based on four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), the Parameter Solution method (ParaSol) and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using the P-factor and R-factor, the coefficient of determination (R²), the Nash-Sutcliffe coefficient of efficiency (NSE) and Percent Bias (PBIAS). The results showed the high performance of SUFI and PSO, with P-factor > 0.83, R-factor < 0.56, R² > 0.91, NSE > 0.89, and 0.18
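    As an illustration of one of the four algorithms, GLUE keeps Monte Carlo parameter sets whose likelihood measure exceeds a behavioural threshold; the NSE-based likelihood and the unweighted percentile band below are simplifying assumptions:

```python
import numpy as np

def glue(sim_ensemble, obs, threshold=0.5):
    """Minimal GLUE screen.

    sim_ensemble: (n_sets, n_times) simulated flows, one row per parameter set.
    Returns the behavioural mask, informal likelihood weights, and an
    (unweighted, for brevity) 95% prediction band across behavioural runs."""
    sse = ((sim_ensemble - obs) ** 2).sum(axis=1)
    nse = 1.0 - sse / ((obs - obs.mean()) ** 2).sum()
    keep = nse > threshold                      # behavioural parameter sets
    weights = nse[keep] / nse[keep].sum()       # informal likelihood weights
    lo, hi = np.percentile(sim_ensemble[keep], [2.5, 97.5], axis=0)
    return keep, weights, lo, hi
```

    The P-factor then counts the fraction of observations falling inside the `[lo, hi]` band, and the R-factor measures the band's average width relative to the observed variability.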

  9. Methods for handling uncertainty within pharmaceutical funding decisions

    NASA Astrophysics Data System (ADS)

    Stevenson, Matt; Tappenden, Paul; Squires, Hazel

    2014-01-01

    This article provides a position statement regarding decision making under uncertainty within the economic evaluation of pharmaceuticals, with a particular focus upon the National Institute for Health and Clinical Excellence context within England and Wales. This area is of importance as funding agencies have a finite budget from which to purchase a selection of competing health care interventions. The objective function generally used is that of maximising societal health, with an explicit acknowledgement that there will be opportunity costs associated with purchasing a particular intervention. Three components of uncertainty are discussed within a pharmaceutical funding perspective: methodological uncertainty, parameter uncertainty and structural uncertainty, alongside a discussion of challenges that are particularly pertinent to health economic evaluation. The discipline has focused primarily on handling methodological and parameter uncertainty, and a clear reference case has been developed for consistency across evaluations. However, uncertainties still remain. Less attention has been given to methods for handling structural uncertainty. The lack of adequate methods to explicitly incorporate this aspect of model development may result in the true uncertainty surrounding health care investment decisions being underestimated. Research in this area is ongoing, as we review here.

  10. A multiphysical ensemble system of numerical snow modelling

    NASA Astrophysics Data System (ADS)

    Lafaysse, Matthieu; Cluzet, Bertrand; Dumont, Marie; Lejeune, Yves; Vionnet, Vincent; Morin, Samuel

    2017-05-01

    Physically based multilayer snowpack models suffer from various modelling errors. To represent these errors, we built the new multiphysical ensemble system ESCROC (Ensemble System Crocus) by implementing new representations of different physical processes in the deterministic coupled multilayer ground/snowpack model SURFEX/ISBA/Crocus. This ensemble was driven and evaluated at Col de Porte (1325 m a.s.l., French Alps) over 18 years with a high-quality meteorological and snow data set. A total of 7776 simulations were evaluated separately, accounting for the uncertainties in the evaluation data. The ability of the ensemble to capture the uncertainty associated with modelling errors is assessed for snow depth, snow water equivalent, bulk density, albedo and surface temperature. Different sub-ensembles of the ESCROC system were studied with probabilistic tools to compare their performance. Results show that optimal members of the ESCROC system are able to explain more than half of the total simulation errors. Integrating members with biases exceeding the range corresponding to observational uncertainty is necessary to obtain an optimal dispersion, though this may also be a consequence of the fact that meteorological forcing uncertainties were not accounted for. The ESCROC system opens the way to integrating numerical snow-modelling errors into ensemble forecasting and ensemble assimilation systems, in support of avalanche hazard forecasting and other snowpack-modelling applications.
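    One standard probabilistic tool for comparing such sub-ensembles is the continuous ranked probability score (CRPS); the single-time sketch below is a generic estimator, not the paper's exact scoring setup:

```python
import numpy as np

def crps_ensemble(members, obs):
    """CRPS estimate for an ensemble forecast at one time and site:
    mean absolute error of the members minus half the mean pairwise spread.
    Lower is better; it reduces to absolute error for a single member."""
    m = np.asarray(members, dtype=float)
    term1 = np.abs(m - obs).mean()                          # accuracy term
    term2 = 0.5 * np.abs(m[:, None] - m[None, :]).mean()    # spread term
    return term1 - term2
```

    Averaging this score over the 18-year record rewards sub-ensembles whose spread matches their actual error, which is exactly the dispersion property discussed above.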

  11. Improved parameter inference in catchment models: 1. Evaluating parameter uncertainty

    NASA Astrophysics Data System (ADS)

    Kuczera, George

    1983-10-01

    A Bayesian methodology is developed to evaluate parameter uncertainty in catchment models fitted to a hydrologic response such as runoff, the goal being to improve the chance of successful regionalization. The catchment model is posed as a nonlinear regression model with stochastic errors possibly being both autocorrelated and heteroscedastic. The end result of this methodology, which may use Box-Cox power transformations and ARMA error models, is the posterior distribution, which summarizes what is known about the catchment model parameters. This can be simplified to a multivariate normal provided a linearization in parameter space is acceptable; means of checking and improving this assumption are discussed. The posterior standard deviations give a direct measure of parameter uncertainty, and study of the posterior correlation matrix can indicate what kinds of data are required to improve the precision of poorly determined parameters. Finally, a case study involving a nine-parameter catchment model fitted to monthly runoff and soil moisture data is presented. It is shown that use of ordinary least squares when its underlying error assumptions are violated gives an erroneous description of parameter uncertainty.
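    The multivariate-normal simplification amounts to a local linearization of the catchment model in parameter space. A sketch of the resulting parameter covariance for iid Gaussian errors (omitting the Box-Cox and ARMA refinements discussed above) is:

```python
import numpy as np

def linearized_covariance(jacobian, sigma2):
    """Posterior covariance approx. sigma2 * (J^T J)^-1 near the optimum,
    where J holds derivatives of model output with respect to each parameter
    and sigma2 is the residual error variance."""
    return sigma2 * np.linalg.inv(jacobian.T @ jacobian)

# posterior standard deviations (parameter uncertainty) are the square roots
# of the diagonal; off-diagonal terms give the posterior correlation structure
```

    Large off-diagonal correlations flag parameter pairs that the fitted response cannot separate, indicating what additional data (e.g. soil moisture alongside runoff) would sharpen poorly determined parameters.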

  12. Uncertainty Analysis of Coupled Socioeconomic-Cropping Models: Building Confidence in Climate Change Decision-Support Tools for Local Stakeholders

    NASA Astrophysics Data System (ADS)

    Malard, J. J.; Rojas, M.; Adamowski, J. F.; Gálvez, J.; Tuy, H. A.; Melgar-Quiñonez, H.

    2015-12-01

    While cropping models represent the biophysical aspects of agricultural systems, system dynamics modelling offers the possibility of representing the socioeconomic (including social and cultural) aspects of these systems. The two types of models can then be coupled in order to include the socioeconomic dimensions of climate change adaptation in the predictions of cropping models. We develop a dynamically coupled socioeconomic-biophysical model of agricultural production and its repercussions on food security in two case studies from Guatemala (a market-based, intensive agricultural system and a low-input, subsistence crop-based system). Through the specification of the climate inputs to the cropping model, the impacts of climate change on the entire system can be analysed, and the participatory nature of the system dynamics model-building process, in which stakeholders from NGOs to local governmental extension workers were included, helps ensure local trust in and use of the model. However, the analysis of climate variability's impacts on agroecosystems includes uncertainty, especially in the case of joint physical-socioeconomic modelling, and the explicit representation of this uncertainty in the participatory development of the models is important to ensure appropriate use of the models by the end users. In addition, standard model calibration, validation, and uncertainty interval estimation techniques used for physically-based models are impractical in the case of socioeconomic modelling. We present a methodology for the calibration and uncertainty analysis of coupled biophysical (cropping) and system dynamics (socioeconomic) agricultural models, using survey data and expert input to calibrate and evaluate the uncertainty of the system dynamics as well as of the overall coupled model.
This approach offers an important tool for local decision makers to evaluate the potential impacts of climate change and their feedbacks through the associated socioeconomic system.

  13. Sensitivity and Uncertainty Analysis for Streamflow Prediction Using Different Objective Functions and Optimization Algorithms: San Joaquin California

    NASA Astrophysics Data System (ADS)

    Paul, M.; Negahban-Azar, M.

    2017-12-01

    Hydrologic models usually need to be calibrated against observed streamflow at the outlet of a particular drainage area. However, a large number of parameters must be fitted in the model because field measurements of them are unavailable, making it difficult to calibrate the model over many potentially uncertain parameters. This becomes even more challenging if the model covers a large watershed with multiple land uses and varied geophysical characteristics. Sensitivity analysis (SA) can be used as a tool to identify the most sensitive model parameters, which affect the calibrated model's performance. Many different calibration and uncertainty analysis algorithms can be run with different objective functions. By incorporating sensitive parameters in streamflow simulation, the effect of a suitable algorithm in improving model performance can be demonstrated with Soil and Water Assessment Tool (SWAT) modeling. In this study, SWAT was applied in the San Joaquin Watershed in California, covering 19,704 km2, to calibrate daily streamflow. Recently, severe water stress has been escalating in this watershed due to intensified climate variability, prolonged drought and groundwater depletion for agricultural irrigation. It is therefore important to perform a proper uncertainty analysis, given the uncertainties inherent in hydrologic modeling, to predict the spatial and temporal variation of the hydrologic processes and to evaluate the impacts of different hydrologic variables. The purpose of this study was to evaluate the sensitivity and uncertainty of the calibrated parameters for predicting streamflow.
To evaluate the sensitivity of the calibrated parameters, three different optimization algorithms (Sequential Uncertainty Fitting, SUFI-2; Generalized Likelihood Uncertainty Estimation, GLUE; and Parameter Solution, ParaSol) were used with four different objective functions (coefficient of determination, R²; Nash-Sutcliffe efficiency, NSE; percent bias, PBIAS; and Kling-Gupta efficiency, KGE). The preliminary results showed that using the SUFI-2 algorithm with the NSE and KGE objective functions significantly improved the calibration (e.g., R² = 0.52 and NSE = 0.47 for daily streamflow calibration).
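    The two objective functions that performed best can be written compactly. These are the standard textbook definitions; the exact calibration-tool settings used in the study are not reproduced here:

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, below 0 means the
    simulation is a worse predictor than the observed mean."""
    return 1.0 - ((sim - obs) ** 2).sum() / ((obs - obs.mean()) ** 2).sum()

def kge(sim, obs):
    """Kling-Gupta efficiency: distance in (correlation, variability ratio,
    bias ratio) space from the ideal point (1, 1, 1)."""
    r = np.corrcoef(sim, obs)[0, 1]
    alpha = sim.std() / obs.std()
    beta = sim.mean() / obs.mean()
    return 1.0 - np.sqrt((r - 1.0) ** 2 + (alpha - 1.0) ** 2 + (beta - 1.0) ** 2)
```

    Because KGE penalizes bias and variability errors separately rather than folding them into one squared-error term, calibrations driven by NSE and KGE can favour noticeably different parameter sets.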

  14. Modeling Input Errors to Improve Uncertainty Estimates for Sediment Transport Model Predictions

    NASA Astrophysics Data System (ADS)

    Jung, J. Y.; Niemann, J. D.; Greimann, B. P.

    2016-12-01

    Bayesian methods using Markov chain Monte Carlo algorithms have recently been applied to sediment transport models to assess the uncertainty in the model predictions due to the parameter values. Unfortunately, the existing approaches can only attribute overall uncertainty to the parameters. This limitation is critical because no model can produce accurate forecasts if forced with inaccurate input data, even if the model is well founded in physical theory. In this research, an existing Bayesian method is modified to consider the potential errors in input data during the uncertainty evaluation process. The input error is modeled using Gaussian distributions, and the means and standard deviations are treated as uncertain parameters. The proposed approach is tested by coupling it to the Sedimentation and River Hydraulics - One Dimension (SRH-1D) model and simulating a 23-km reach of the Tachia River in Taiwan. The Wu equation in SRH-1D is used for computing the transport capacity for a bed material load of non-cohesive material. Three types of input data are considered uncertain: (1) the input flowrate at the upstream boundary, (2) the water surface elevation at the downstream boundary, and (3) the water surface elevation at a hydraulic structure in the middle of the reach. The benefits of modeling the input errors in the uncertainty analysis are evaluated by comparing the accuracy of the most likely forecast and the coverage of the observed data by the credible intervals to those of the existing method. The results indicate that the internal boundary condition has the largest uncertainty among those considered. Overall, the uncertainty estimates from the new method are notably different from those of the existing method for both the calibration and forecast periods.
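    The input-error treatment can be sketched as follows: each realisation perturbs the observed inflow series with a Gaussian error whose mean and standard deviation would themselves be sampled by the MCMC chain. The function name and shapes are illustrative, not the study's implementation:

```python
import numpy as np

def perturb_inflow(q_obs, mu_err, sd_err, n_samples, rng):
    """Realisations of the upstream inflow series with Gaussian input errors.

    mu_err and sd_err play the role of the uncertain input-error parameters
    that the Bayesian method treats alongside the model parameters."""
    return q_obs[None, :] + rng.normal(mu_err, sd_err, (n_samples, q_obs.size))
```

    Running the sediment model over such realisations lets the posterior attribute part of the total predictive spread to the boundary conditions rather than to the transport parameters alone.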

  15. Modeling responses of large-river fish populations to global climate change through downscaling and incorporation of predictive uncertainty

    USGS Publications Warehouse

    Wildhaber, Mark L.; Wikle, Christopher K.; Anderson, Christopher J.; Franz, Kristie J.; Moran, Edward H.; Dey, Rima; Mader, Helmut; Kraml, Julia

    2012-01-01

    Climate change operates over a broad range of spatial and temporal scales. Understanding its effects on ecosystems requires multi-scale models. For understanding effects on fish populations of riverine ecosystems, climate predicted by coarse-resolution Global Climate Models must be downscaled to Regional Climate Models to watersheds to river hydrology to population response. An additional challenge is quantifying sources of uncertainty given the highly nonlinear nature of interactions between climate variables and community level processes. We present a modeling approach for understanding and accommodating uncertainty by applying multi-scale climate models and a hierarchical Bayesian modeling framework to Midwest fish population dynamics and by linking models for system components together by formal rules of probability. The proposed hierarchical modeling approach will account for sources of uncertainty in forecasts of community or population response. The goal is to evaluate the potential distributional changes in an ecological system, given distributional changes implied by a series of linked climate and system models under various emissions/use scenarios. This understanding will aid evaluation of management options for coping with global climate change. In our initial analyses, we found that predicted pallid sturgeon population responses were dependent on the climate scenario considered.

  16. Uncertainty of streamwater solute fluxes in five contrasting headwater catchments including model uncertainty and natural variability (Invited)

    NASA Astrophysics Data System (ADS)

    Aulenbach, B. T.; Burns, D. A.; Shanley, J. B.; Yanai, R. D.; Bae, K.; Wild, A.; Yang, Y.; Dong, Y.

    2013-12-01

    There are many sources of uncertainty in estimates of streamwater solute flux. Flux is the product of discharge and concentration (summed over time), each of which has measurement uncertainty of its own. Discharge can be measured almost continuously, but concentrations are usually determined from discrete samples, which increases uncertainty dependent on sampling frequency and how concentrations are assigned for the periods between samples. Gaps between samples can be estimated by linear interpolation or by models that use the relations between concentration and continuously measured or known variables such as discharge, season, temperature, and time. For this project, developed in cooperation with QUEST (Quantifying Uncertainty in Ecosystem Studies), we evaluated uncertainty for three flux estimation methods and three different sampling frequencies (monthly, weekly, and weekly plus event). The constituents investigated were dissolved NO3, Si, SO4, and dissolved organic carbon (DOC), solutes whose concentration dynamics exhibit strongly contrasting behavior. The evaluation was completed for a 10-year period at five small, forested watersheds in Georgia, New Hampshire, New York, Puerto Rico, and Vermont. Concentration regression models were developed for each solute at each of the three sampling frequencies for all five watersheds. Fluxes were then calculated using (1) a linear interpolation approach, (2) a regression-model method, and (3) the composite method, which combines the regression-model method for estimating concentrations and the linear interpolation method for correcting model residuals to the observed sample concentrations. We considered the best estimates of flux to be those derived using the composite method at the highest sampling frequencies.
We also evaluated the influence of sampling frequency and estimation method on flux estimate uncertainty; flux uncertainty depended on the variability characteristics of each solute and varied across reporting periods (e.g. the 10-year study period vs. annual vs. monthly). The usefulness of the two regression-model-based flux estimation approaches depended on the amount of variance in concentrations the regression models could explain. Our results can guide the development of optimal sampling strategies by weighing sampling frequency against improvements in the uncertainty of stream flux estimates for solutes with particular characteristics of variability. The appropriate flux estimation method depends on a combination of sampling frequency and the strength of the concentration regression models. Sites: Biscuit Brook (Frost Valley, NY), Hubbard Brook Experimental Forest and LTER (West Thornton, NH), Luquillo Experimental Forest and LTER (Luquillo, Puerto Rico), Panola Mountain (Stockbridge, GA), Sleepers River Research Watershed (Danville, VT)
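The three flux estimation methods compared above can be sketched in a few lines. This is a minimal illustration, assuming a log-log concentration-discharge regression as a stand-in for the full concentration models; the function names and the power-law form are this sketch's assumptions, not the authors' implementation:

```python
import numpy as np

def interpolation_flux(t_s, c_s, t_all, q_all):
    """(1) Linearly interpolate concentrations between samples, then sum c*q."""
    return np.sum(np.interp(t_all, t_s, c_s) * q_all)

def regression_flux(t_s, c_s, q_s, t_all, q_all):
    """(2) Regress log concentration on log discharge; predict c at every step."""
    slope, intercept = np.polyfit(np.log(q_s), np.log(c_s), 1)
    return np.sum(np.exp(intercept + slope * np.log(q_all)) * q_all)

def composite_flux(t_s, c_s, q_s, t_all, q_all):
    """(3) Regression-model concentrations plus linearly interpolated
    residuals, so the estimate passes through the observed samples."""
    slope, intercept = np.polyfit(np.log(q_s), np.log(c_s), 1)
    c_model = np.exp(intercept + slope * np.log(q_all))     # model everywhere
    c_model_s = np.exp(intercept + slope * np.log(q_s))     # model at samples
    resid = np.interp(t_all, t_s, c_s - c_model_s)          # residual correction
    return np.sum((c_model + resid) * q_all)
```

When the regression explains little of the concentration variance, method (3) degrades gracefully toward method (1), which is one way to read the paper's conclusion that the best method depends on regression strength.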

  17. Using Ecosystem Experiments to Improve Vegetation Models

    DOE PAGES

    Medlyn, Belinda; Zaehle, S; DeKauwe, Martin G.; ...

    2015-05-21

Ecosystem responses to rising CO2 concentrations are a major source of uncertainty in climate change projections. Data from ecosystem-scale Free-Air CO2 Enrichment (FACE) experiments provide a unique opportunity to reduce this uncertainty. The recent FACE Model–Data Synthesis project aimed to use the information gathered in two forest FACE experiments to assess and improve land ecosystem models. A new 'assumption-centred' model intercomparison approach was used, in which participating models were evaluated against experimental data based on the ways in which they represent key ecological processes. By identifying and evaluating the main assumptions that caused differences among models, the assumption-centred approach produced a clear roadmap for reducing model uncertainty. We explain this approach and summarize the resulting research agenda. We encourage the application of this approach in other model intercomparison projects to fundamentally improve predictive understanding of the Earth system.

  18. Uncertainty evaluation of nuclear reaction model parameters using integral and microscopic measurements. Covariances evaluation with CONRAD code

    NASA Astrophysics Data System (ADS)

    de Saint Jean, C.; Habert, B.; Archier, P.; Noguere, G.; Bernard, D.; Tommasi, J.; Blaise, P.

    2010-10-01

In the [eV;MeV] energy range, modelling of neutron-induced reactions is based on parameterised nuclear reaction models. Estimating covariances on cross sections or on nuclear reaction model parameters is a recurrent puzzle in nuclear data evaluation, and major breakthroughs have been requested by nuclear reactor physicists so that proper uncertainties can be used in applications. In this paper, mathematical methods developed in the CONRAD code [2] are presented that treat all types of uncertainties, including experimental ones (statistical and systematic), and propagate them to nuclear reaction model parameters or cross sections. The marginalization procedure is presented using analytical or Monte-Carlo solutions. Furthermore, one major drawback identified by reactor physicists is that integral or analytical experiments (reactor mock-ups or simple integral experiments, e.g. ICSBEP, …) were not taken into account early enough in the evaluation process to remove discrepancies. In this paper, we describe a mathematical framework to take this kind of information into account properly.
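The Monte-Carlo side of such a propagation can be sketched as follows. This is a generic illustration, not CONRAD's actual formalism: the two-parameter exponential "model", the parameter values, and their covariance are invented stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)

def cross_section(E, a, b):
    """Toy reaction model standing in for a real evaluation: sigma = a*exp(-E/b)."""
    return a * np.exp(-E / b)

def propagate(E, mean, cov, n=5000):
    """Monte-Carlo propagation of the parameter covariance to the cross
    section: sample parameters, evaluate the model, take sample statistics."""
    pars = rng.multivariate_normal(mean, cov, size=n)
    sigma = np.array([cross_section(E, a, b) for a, b in pars])
    return sigma.mean(axis=0), np.cov(sigma, rowvar=False)

# Energy grid and an assumed parameter posterior (mean and covariance)
E = np.linspace(0.1, 2.0, 5)
mean_sig, cov_sig = propagate(E, mean=[10.0, 1.0],
                              cov=[[0.04, 0.0], [0.0, 0.0025]])
```

The off-diagonal terms of `cov_sig` are the energy-energy correlations of the evaluated cross section, which is exactly the kind of object the marginalization procedure is meant to produce.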

  19. Uncertainty estimation of a complex water quality model: The influence of Box-Cox transformation on Bayesian approaches and comparison with a non-Bayesian method

    NASA Astrophysics Data System (ADS)

    Freni, Gabriele; Mannina, Giorgio

In urban drainage modelling, uncertainty analysis is of undoubted necessity. However, uncertainty analysis in urban water-quality modelling is still in its infancy and only a few studies have been carried out. Therefore, several methodological aspects still need to be investigated and clarified, especially regarding water quality modelling. The use of the Bayesian approach for uncertainty analysis has been stimulated by its rigorous theoretical framework and by the possibility of evaluating the impact of new knowledge on the modelling predictions. Nevertheless, the Bayesian approach relies on some restrictive hypotheses that are not present in less formal methods like the Generalised Likelihood Uncertainty Estimation (GLUE). One crucial point in the application of the Bayesian method is the formulation of a likelihood function that is conditioned by the hypotheses made regarding model residuals. Statistical transformations, such as the Box-Cox equation, are generally used to ensure the homoscedasticity of residuals. However, this practice may affect the reliability of the analysis, leading to incorrect uncertainty estimates. The present paper explores the influence of the Box-Cox transformation for environmental water quality models. To this end, five cases were considered, one of which was the "real" residual distribution (i.e. drawn from available data). The analysis was applied to the Nocella experimental catchment (Italy), an agricultural and semi-urbanised basin where two sewer systems, two wastewater treatment plants and a river reach were monitored during both dry and wet weather periods. The results show that the uncertainty estimation is greatly affected by the residual transformation, and a wrong assumption may also affect the evaluation of model uncertainty.
The less formal methods always provide an overestimation of modelling uncertainty with respect to the Bayesian method, but this effect is reduced if a wrong assumption is made regarding the residual distribution. If residuals are not normally distributed, uncertainty is over-estimated when the Box-Cox transformation is not applied or a non-calibrated transformation parameter is used.
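The role the Box-Cox transformation plays in this argument can be shown with a minimal sketch. The synthetic data below assume multiplicative (lognormal) observation error, for which lambda = 0, i.e. the log transform, is the homoscedasticity-restoring choice; nothing here comes from the paper's catchment data:

```python
import numpy as np

def boxcox(y, lam):
    """Box-Cox transform: (y**lam - 1)/lam, with the log transform at lam = 0."""
    y = np.asarray(y, dtype=float)
    return np.log(y) if lam == 0.0 else (y ** lam - 1.0) / lam

rng = np.random.default_rng(1)
m = np.linspace(1.0, 20.0, 2000)                 # modelled concentrations
y = m * rng.lognormal(0.0, 0.1, m.size)          # observations, multiplicative error

raw_resid = y - m                                # spread grows with m (heteroscedastic)
bc_resid = boxcox(y, 0.0) - boxcox(m, 0.0)       # roughly constant spread
```

A likelihood built on `raw_resid` would violate the constant-variance assumption that the formal Bayesian approach relies on; `bc_resid` satisfies it, which is why the choice (and calibration) of the transformation parameter matters for the resulting uncertainty bands.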

  20. Evaluating the impacts of agricultural land management practices on water resources: A probabilistic hydrologic modeling approach.

    PubMed

    Prada, A F; Chu, M L; Guzman, J A; Moriasi, D N

    2017-05-15

Evaluating the effectiveness of agricultural land management practices in minimizing environmental impacts using models is challenged by the presence of inherent uncertainties during the model development stage. One issue faced during the model development stage is the uncertainty involved in model parameterization. Using a single optimized set of parameters (one snapshot) to represent baseline conditions of the system limits the applicability and robustness of the model to properly represent future or alternative scenarios. The objective of this study was to develop a framework that facilitates model parameter selection while evaluating uncertainty to assess the impacts of land management practices at the watershed scale. The model framework was applied to the Lake Creek watershed located in southwestern Oklahoma, USA. A two-step probabilistic approach was implemented to parameterize the Agricultural Policy/Environmental eXtender (APEX) model using global uncertainty and sensitivity analysis to estimate the full spectrum of total monthly water yield (WYLD) and total monthly nitrogen loads (N) in the watershed under different land management practices. Twenty-seven models were found to represent the baseline scenario, in which uncertainty of up to 29% in WYLD and up to 400% in N is plausible. Changing the land cover to pasture produced the largest decrease in N, up to 30% for full pasture coverage, while changing to full winter wheat cover can increase N by up to 11%. The methodology developed in this study was able to quantify the full spectrum of system responses, the uncertainty associated with them, and the most important parameters that drive their variability. Results from this study can be used to develop strategic decisions on the risks and tradeoffs associated with different management alternatives that aim to increase productivity while also minimizing their environmental impacts. Copyright © 2017 Elsevier Ltd. All rights reserved.
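The idea of retaining an ensemble of acceptable parameter sets, rather than one optimized snapshot, can be sketched as follows. The "watershed model" here is a deliberately trivial stand-in for APEX, and the parameter names, thresholds, and numbers are all assumptions of this sketch:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy watershed response standing in for APEX (hypothetical parameters):
# monthly water yield from a runoff coefficient and a constant baseflow term.
def water_yield(rain, runoff_coef, baseflow):
    return runoff_coef * rain + baseflow

rain = rng.gamma(2.0, 40.0, 120)                      # 120 months of rainfall, mm
obs = water_yield(rain, 0.35, 12.0) + rng.normal(0.0, 4.0, rain.size)

# Step 1: global sampling of the parameter space.
cand = rng.uniform([0.1, 0.0], [0.6, 30.0], size=(5000, 2))
rmse = np.array([np.sqrt(np.mean((water_yield(rain, a, b) - obs) ** 2))
                 for a, b in cand])

# Step 2: keep every parameter set within a behavioural threshold of the best
# fit, not a single optimum, so scenario runs carry the parameter spread.
behavioral = cand[rmse < 1.5 * rmse.min()]
scenario = np.array([water_yield(rain * 0.9, a, b).mean() for a, b in behavioral])
```

Running the management scenario through every retained set (here, a 10% rainfall reduction) yields a distribution of outcomes rather than a point estimate, which is the "full spectrum of system responses" the abstract refers to.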

  1. Methodologies for evaluating performance and assessing uncertainty of atmospheric dispersion models

    NASA Astrophysics Data System (ADS)

    Chang, Joseph C.

This thesis describes methodologies to evaluate the performance and to assess the uncertainty of atmospheric dispersion models, tools that predict the fate of gases and aerosols upon their release into the atmosphere. Because of the large economic and public-health impacts often associated with the use of dispersion model results, these models should be properly evaluated, and their uncertainty should be properly accounted for and understood. The CALPUFF, HPAC, and VLSTRACK dispersion modeling systems were applied to the Dipole Pride (DP26) field data (~20 km in scale) in order to demonstrate the evaluation and uncertainty assessment methodologies. Dispersion model performance was found to be strongly dependent on the wind models used to generate gridded wind fields from observed station data. This is because, despite the fact that the test site was a flat area, the observed surface wind fields still showed considerable spatial variability, partly because of the surrounding mountains. The two components of total uncertainty (random variability and input-data uncertainty) were found to be comparable for the DP26 field data, with variability more important than uncertainty closer to the source, and less important farther away from the source. Therefore, reducing errors in the input meteorology may not necessarily increase model accuracy, because of random turbulence. DP26 was a research-grade field experiment, where the source, meteorological, and concentration data were all well-measured. Another typical application of dispersion modeling is a forensic study, where the data are usually quite scarce. An example is the modeling of the alleged releases of chemical warfare agents during the 1991 Persian Gulf War, where the source data had to rely on intelligence reports, and where Iraq had stopped reporting weather data to the World Meteorological Organization after the Iran-Iraq war began in 1981.
Therefore the meteorological fields inside Iraq had to be estimated by models such as prognostic mesoscale meteorological models, based on observational data from areas outside Iraq and using the global fields simulated by global meteorological models as the initial and boundary conditions for the mesoscale models. It was found that, when comparing model predictions to observations in areas outside Iraq, the predicted surface wind directions had errors between 30 and 90 deg, but the inter-model differences (or uncertainties) in the predicted surface wind directions inside Iraq, where there were no onsite data, were fairly constant at about 70 deg. (Abstract shortened by UMI.)
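Dispersion-model evaluation of this kind typically rests on a small set of scalar performance measures. The three below (fractional bias, normalised mean square error, and the factor-of-two fraction) are standard in the dispersion-model evaluation literature; they are offered here as plausible examples of the methodology, not as an extract from the thesis:

```python
import numpy as np

def fractional_bias(obs, pred):
    """FB = 2*(mean_obs - mean_pred) / (mean_obs + mean_pred); 0 is unbiased."""
    return 2.0 * (np.mean(obs) - np.mean(pred)) / (np.mean(obs) + np.mean(pred))

def nmse(obs, pred):
    """Normalised mean square error; 0 for a perfect model."""
    return np.mean((obs - pred) ** 2) / (np.mean(obs) * np.mean(pred))

def fac2(obs, pred):
    """Fraction of predictions within a factor of two of the observations."""
    r = np.asarray(pred, dtype=float) / np.asarray(obs, dtype=float)
    return np.mean((r >= 0.5) & (r <= 2.0))
```

Because concentrations span orders of magnitude, relative measures such as FAC2 are usually read alongside FB and NMSE rather than any one metric alone.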

  2. How should epistemic uncertainty in modelling water resources management problems shape evaluations of their operations?

    NASA Astrophysics Data System (ADS)

    Dobson, B.; Pianosi, F.; Reed, P. M.; Wagener, T.

    2017-12-01

In previous work, we have found that water supply companies are typically hesitant to use reservoir operation tools to inform their release decisions. We believe that this is, in part, due to a lack of faith in the fidelity of the optimization exercise with regards to its ability to represent the real world. In an attempt to quantify this, recent literature has studied the impact on performance of uncertainty arising in: forcing (e.g. reservoir inflows), parameters (e.g. parameters for the estimation of evaporation rate) and objectives (e.g. worst first percentile or worst case). We suggest that there is also epistemic uncertainty in the choices made during model creation, for example in the formulation of an evaporation model or in aggregating regional storages. We create 'rival framings' (a methodology originally developed to demonstrate the impact of uncertainty arising from alternate objective formulations), each with different modelling choices, and determine their performance impacts. We identify the Pareto approximate set of policies for several candidate formulations and then make them compete with one another in a large ensemble re-evaluation in each other's modelled spaces. This enables us to distinguish the impacts of different structural changes in the model used to evaluate system performance, in an effort to generalize the validity of the optimized performance expectations.
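The cross-evaluation step can be sketched with two toy framings. Each framing below is a made-up one-parameter performance function, not the authors' reservoir models; the point is only the mechanics of optimising a policy under one framing and re-evaluating it in the other's modelled space:

```python
import numpy as np

# Two rival framings of the same system (illustrative stand-ins): each maps a
# release-policy parameter x in [0, 1] to a performance score to maximise.
framings = {
    "detailed_evap": lambda x: -(x - 0.6) ** 2,
    "aggregated_storage": lambda x: -(x - 0.4) ** 2,
}

x_grid = np.linspace(0.0, 1.0, 101)
best = {name: x_grid[np.argmax(f(x_grid))] for name, f in framings.items()}

# Cross-evaluation: performance lost when a policy optimised under framing
# `opt` is re-evaluated in framing `ev` (regret relative to ev's own optimum).
regret = {(opt, ev): framings[ev](best[ev]) - framings[ev](best[opt])
          for opt in framings for ev in framings}
```

Off-diagonal regret values quantify how much the structural modelling choice, rather than the optimisation itself, drives expected performance.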

  3. Carbon cycle confidence and uncertainty: Exploring variation among soil biogeochemical models

    DOE PAGES

    Wieder, William R.; Hartman, Melannie D.; Sulman, Benjamin N.; ...

    2017-11-09

Emerging insights into factors responsible for soil organic matter stabilization and decomposition are being applied in a variety of contexts, but new tools are needed to facilitate the understanding, evaluation, and improvement of soil biogeochemical theory and models at regional to global scales. To isolate the effects of model structural uncertainty on the global distribution of soil carbon stocks and turnover times we developed a soil biogeochemical testbed that forces three different soil models with consistent climate and plant productivity inputs. The models tested here include a first-order, microbially implicit approach (CASA-CNP) and two recently developed microbially explicit models that can be run at global scales (MIMICS and CORPSE). When forced with common environmental drivers, the soil models generated similar estimates of initial soil carbon stocks (roughly 1,400 Pg C globally, 0–100 cm), but each model shows a different functional relationship between mean annual temperature and inferred turnover times. Subsequently, the models made divergent projections about the fate of these soil carbon stocks over the 20th century, with models either gaining or losing over 20 Pg C globally between 1901 and 2010. Single-forcing experiments with changed inputs, temperature, and moisture suggest that uncertainty associated with freeze-thaw processes as well as soil textural effects on soil carbon stabilization were larger than direct temperature uncertainties among models. Finally, the models generated distinct projections about the timing and magnitude of seasonal heterotrophic respiration rates, again reflecting structural uncertainties that were related to environmental sensitivities and assumptions about physicochemical stabilization of soil organic matter.
Here, by providing a computationally tractable and numerically consistent framework to evaluate models, we aim to better understand uncertainties among models and generate insights about factors regulating the turnover of soil organic matter.
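The "inferred turnover time" diagnostic used in this comparison is simple to state: at steady state it is the carbon stock divided by the input (or respiration) flux. The sketch below uses illustrative numbers, and the Q10 form is just one of the rival structural assumptions the testbed is designed to expose:

```python
import numpy as np

def turnover_time(stock_pg_c, annual_input_pg_c):
    """At steady state, inferred turnover time (years) = stock / input flux."""
    return stock_pg_c / annual_input_pg_c

def q10_turnover(tau_ref, mat, t_ref=15.0, q10=2.0):
    """One possible structural assumption: decomposition speeds up by a factor
    q10 per 10 C of mean annual temperature (MAT), shortening turnover."""
    return tau_ref / q10 ** ((mat - t_ref) / 10.0)
```

Two models can agree on the global stock (here ~1,400 Pg C) yet diverge on `q10` or its functional form, and therefore on how those stocks respond to 20th-century warming, which is exactly the divergence the abstract describes.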

  4. Carbon cycle confidence and uncertainty: Exploring variation among soil biogeochemical models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wieder, William R.; Hartman, Melannie D.; Sulman, Benjamin N.

Emerging insights into factors responsible for soil organic matter stabilization and decomposition are being applied in a variety of contexts, but new tools are needed to facilitate the understanding, evaluation, and improvement of soil biogeochemical theory and models at regional to global scales. To isolate the effects of model structural uncertainty on the global distribution of soil carbon stocks and turnover times we developed a soil biogeochemical testbed that forces three different soil models with consistent climate and plant productivity inputs. The models tested here include a first-order, microbially implicit approach (CASA-CNP) and two recently developed microbially explicit models that can be run at global scales (MIMICS and CORPSE). When forced with common environmental drivers, the soil models generated similar estimates of initial soil carbon stocks (roughly 1,400 Pg C globally, 0–100 cm), but each model shows a different functional relationship between mean annual temperature and inferred turnover times. Subsequently, the models made divergent projections about the fate of these soil carbon stocks over the 20th century, with models either gaining or losing over 20 Pg C globally between 1901 and 2010. Single-forcing experiments with changed inputs, temperature, and moisture suggest that uncertainty associated with freeze-thaw processes as well as soil textural effects on soil carbon stabilization were larger than direct temperature uncertainties among models. Finally, the models generated distinct projections about the timing and magnitude of seasonal heterotrophic respiration rates, again reflecting structural uncertainties that were related to environmental sensitivities and assumptions about physicochemical stabilization of soil organic matter.
Here, by providing a computationally tractable and numerically consistent framework to evaluate models, we aim to better understand uncertainties among models and generate insights about factors regulating the turnover of soil organic matter.

  5. Performance assessment of a Bayesian Forecasting System (BFS) for real-time flood forecasting

    NASA Astrophysics Data System (ADS)

    Biondi, D.; De Luca, D. L.

    2013-02-01

The paper evaluates, for a number of flood events, the performance of a Bayesian Forecasting System (BFS), with the aim of evaluating total uncertainty in real-time flood forecasting. The predictive uncertainty of future streamflow is estimated through the Bayesian integration of two separate processors. The former evaluates the propagation of input uncertainty to simulated river discharge; the latter computes the hydrological uncertainty of actual river discharge associated with all other possible sources of error. A stochastic model and a distributed rainfall-runoff model were assumed, respectively, for rainfall and hydrological response simulations. A case study was carried out for a small basin in the Calabria region (southern Italy). The performance assessment of the BFS was performed with verification tools suited for probabilistic forecasts of continuous variables such as streamflow. Graphical tools and scalar metrics were used to evaluate several attributes of the forecast quality of the entire time-varying predictive distributions: calibration, sharpness, accuracy, and the continuous ranked probability score (CRPS). Besides the overall system, which incorporates both sources of uncertainty, other hypotheses resulting from the BFS properties were examined, corresponding to (i) a perfect hydrological model; (ii) a non-informative rainfall forecast for predicting streamflow; and (iii) a perfect input forecast. The results emphasize the importance of using different diagnostic approaches to perform comprehensive analyses of predictive distributions, to arrive at a multifaceted view of the attributes of the prediction.
For the case study, the selected criteria revealed the interaction of the different sources of error, in particular the crucial role of the hydrological uncertainty processor when compensating, at the cost of wider forecast intervals, for the unreliable and biased predictive distribution resulting from the Precipitation Uncertainty Processor.
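Of the scalar metrics named above, the CRPS is the least standard to compute by hand. A common sample-based estimator for an ensemble (or sampled predictive distribution) is CRPS = E|X - y| - 0.5 E|X - X'|; the sketch below is a generic implementation of that estimator, not the authors' code:

```python
import numpy as np

def crps_ensemble(members, obs):
    """Sample-based CRPS: E|X - y| - 0.5 * E|X - X'|, where X, X' are
    independent draws from the predictive distribution and y the observation."""
    members = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return term1 - term2
```

For a single deterministic "ensemble" the score reduces to the absolute error, which is why the CRPS is a natural bridge between deterministic and probabilistic verification.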

  6. Performance of uncertainty quantification methodologies and linear solvers in cardiovascular simulations

    NASA Astrophysics Data System (ADS)

    Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison

    2017-11-01

Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solvers and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from the National Institutes of Health (R01 EB018302) is greatly appreciated.
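The pairing of an iterative solver with a preconditioner, central to the comparison described above, can be illustrated generically. The sketch below is a Jacobi-preconditioned conjugate gradient in plain NumPy, a stand-in for the solver stack compared in the work (it is not their implementation, and CG itself applies only to symmetric positive definite systems):

```python
import numpy as np

def pcg(A, b, M_inv_diag, tol=1e-8, max_iter=500):
    """Conjugate gradient with a diagonal (Jacobi) preconditioner M = diag(A):
    each iteration applies A once and the cheap preconditioner solve once."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv_diag * r          # preconditioner solve: z = M^{-1} r
    p = z.copy()
    rz = r @ z
    for it in range(1, max_iter + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            return x, it
        z = M_inv_diag * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_iter
```

In a UQ setting the same system is solved for many right-hand sides or parameter draws, so shaving iterations per solve via the preconditioner multiplies across the whole ensemble.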

  7. Receiving water quality assessment: comparison between simplified and detailed integrated urban modelling approaches.

    PubMed

    Mannina, Giorgio; Viviani, Gaspare

    2010-01-01

Urban water quality management often requires use of numerical models allowing the evaluation of the cause-effect relationship between the input(s) (i.e. rainfall, pollutant concentrations on the catchment surface and in the sewer system) and the resulting water quality response. The conventional approach to the system (i.e. sewer system, wastewater treatment plant and receiving water body), considering each component separately, does not enable optimisation of the whole system. However, recent gains in understanding and modelling make it possible to represent the system as a whole and optimise its overall performance. Indeed, integrated urban drainage modelling is of growing interest as a tool to cope with Water Framework Directive requirements. Two different approaches can be employed for modelling the whole urban drainage system: detailed and simplified. Each has its advantages and disadvantages. Specifically, detailed approaches can offer a higher level of reliability in the model results, but can be very time consuming from the computational point of view. Simplified approaches are faster but may lead to greater model uncertainty due to over-simplification. To gain insight into this problem, two different modelling approaches have been compared with respect to their uncertainty. The first integrated urban drainage model uses the Saint-Venant equations and the 1D advection-dispersion equations for the quantity and quality aspects, respectively. The second is a simplified reservoir model. The analysis used a parsimonious bespoke model developed in previous studies. For the uncertainty analysis, the Generalised Likelihood Uncertainty Estimation (GLUE) procedure was used. Model reliability was evaluated on the basis of its capacity to globally limit the uncertainty. Both models fit the experimental data well, suggesting that the adopted approaches are equivalent for both quantity and quality.
The detailed model approach is more robust and yields narrower uncertainty bands. On the other hand, the simplified river water quality model approach shows higher uncertainty and may be unsuitable for receiving water body quality assessment.
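The GLUE procedure used for this comparison can be sketched in a few lines. The synthetic ensemble, the Nash-Sutcliffe-style likelihood, and the 0.5 behavioural threshold below are common choices assumed for illustration, not taken from the paper:

```python
import numpy as np

def glue_bands(simulations, obs, threshold=0.5):
    """GLUE: score each Monte-Carlo run with a Nash-Sutcliffe efficiency,
    keep 'behavioural' runs above the threshold, and form likelihood-weighted
    5-95% uncertainty bands on the simulated output at every time step."""
    nse = 1.0 - np.mean((simulations - obs) ** 2, axis=1) / np.var(obs)
    keep = nse > threshold
    sims, w = simulations[keep], nse[keep] / nse[keep].sum()
    lower, upper = np.empty(obs.size), np.empty(obs.size)
    for t in range(obs.size):
        order = np.argsort(sims[:, t])
        s, cw = sims[order, t], np.cumsum(w[order])
        lower[t] = s[np.searchsorted(cw, 0.05)]   # weighted 5th percentile
        upper[t] = s[np.searchsorted(cw, 0.95)]   # weighted 95th percentile
    return lower, upper, int(keep.sum())

rng = np.random.default_rng(4)
t = np.linspace(0.0, 6.28, 50)
obs = 1.0 + np.sin(t)
# 500 runs with random noise plus a per-run bias (standing in for parameter sets)
runs = obs + rng.normal(0.0, 0.1, (500, 50)) + rng.normal(0.0, 0.4, (500, 1))
lower, upper, n_behavioural = glue_bands(runs, obs)
```

"Capacity to globally limit the uncertainty" then amounts to comparing the width of these bands between the detailed and simplified models while checking that they still bracket the observations.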

  8. Improving snow density estimation for mapping SWE with Lidar snow depth: assessment of uncertainty in modeled density and field sampling strategies in NASA SnowEx

    NASA Astrophysics Data System (ADS)

    Raleigh, M. S.; Smyth, E.; Small, E. E.

    2017-12-01

    The spatial distribution of snow water equivalent (SWE) is not sufficiently monitored with either remotely sensed or ground-based observations for water resources management. Recent applications of airborne Lidar have yielded basin-wide mapping of SWE when combined with a snow density model. However, in the absence of snow density observations, the uncertainty in these SWE maps is dominated by uncertainty in modeled snow density rather than in Lidar measurement of snow depth. Available observations tend to have a bias in physiographic regime (e.g., flat open areas) and are often insufficient in number to support testing of models across a range of conditions. Thus, there is a need for targeted sampling strategies and controlled model experiments to understand where and why different snow density models diverge. This will enable identification of robust model structures that represent dominant processes controlling snow densification, in support of basin-scale estimation of SWE with remotely-sensed snow depth datasets. The NASA SnowEx mission is a unique opportunity to evaluate sampling strategies of snow density and to quantify and reduce uncertainty in modeled snow density. In this presentation, we present initial field data analyses and modeling results over the Colorado SnowEx domain in the 2016-2017 winter campaign. We detail a framework for spatially mapping the uncertainty in snowpack density, as represented across multiple models. Leveraging the modular SUMMA model, we construct a series of physically-based models to assess systematically the importance of specific process representations to snow density estimates. We will show how models and snow pit observations characterize snow density variations with forest cover in the SnowEx domains. Finally, we will use the spatial maps of density uncertainty to evaluate the selected locations of snow pits, thereby assessing the adequacy of the sampling strategy for targeting uncertainty in modeled snow density.

  9. Assessing and reporting uncertainties in dietary exposure analysis - Part II: Application of the uncertainty template to a practical example of exposure assessment.

    PubMed

    Tennant, David; Bánáti, Diána; Kennedy, Marc; König, Jürgen; O'Mahony, Cian; Kettler, Susanne

    2017-11-01

A previous publication described methods for assessing and reporting uncertainty in dietary exposure assessments. This follow-up publication uses a case study to develop proposals for representing and communicating uncertainty to risk managers. The food ingredient aspartame is used as the case study in a simple deterministic model (the EFSA FAIM template) and with more sophisticated probabilistic exposure assessment software (FACET). Parameter and model uncertainties are identified for each modelling approach and tabulated. The relative importance of each source of uncertainty is then evaluated using a semi-quantitative scale and the results expressed using two different forms of graphical summary. The value of this approach in expressing uncertainties in a manner that is relevant to the exposure assessment and useful to risk managers is then discussed. It was observed that the majority of uncertainties are often associated with data sources rather than the model itself. However, differences in modelling methods can have the greatest impact on uncertainties overall, particularly when the underlying data are the same. It was concluded that improving methods for communicating uncertainties to risk managers is the research area where future effort would best be placed. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  10. BioVapor Model Evaluation

    EPA Science Inventory

    General background on modeling and specifics of modeling vapor intrusion are given. Three classical model applications are described and related to the problem of petroleum vapor intrusion. These indicate the need for model calibration and uncertainty analysis. Evaluation of Bi...

  11. Climate change impacts on tree ranges: model intercomparison facilitates understanding and quantification of uncertainty.

    PubMed

    Cheaib, Alissar; Badeau, Vincent; Boe, Julien; Chuine, Isabelle; Delire, Christine; Dufrêne, Eric; François, Christophe; Gritti, Emmanuel S; Legay, Myriam; Pagé, Christian; Thuiller, Wilfried; Viovy, Nicolas; Leadley, Paul

    2012-06-01

Model-based projections of shifts in tree species range due to climate change are becoming an important decision support tool for forest management. However, poorly evaluated sources of uncertainty require more scrutiny before relying heavily on models for decision-making. We evaluated uncertainty arising from differences in model formulations of tree response to climate change based on a rigorous intercomparison of projections of tree distributions in France. We compared eight models ranging from niche-based to process-based models. On average, models project large range contractions of temperate tree species in lowlands due to climate change. There was substantial disagreement between models for temperate broadleaf deciduous tree species, but differences in the capacity of models to account for rising CO2 impacts explained much of the disagreement. There was good quantitative agreement among models concerning the range contractions for Scots pine. For the dominant Mediterranean tree species, Holm oak, all models foresee substantial range expansion. © 2012 Blackwell Publishing Ltd/CNRS.

  12. Sensitivity of Polar Stratospheric Ozone Loss to Uncertainties in Chemical Reaction Kinetics

    NASA Technical Reports Server (NTRS)

    Kawa, S. Randolph; Stolarksi, Richard S.; Douglass, Anne R.; Newman, Paul A.

    2008-01-01

Several recent observational and laboratory studies of processes involved in polar stratospheric ozone loss have prompted a reexamination of aspects of our understanding of this key indicator of global change. To a large extent, our confidence in understanding and projecting changes in polar and global ozone is based on our ability to simulate these processes in numerical models of chemistry and transport. The fidelity of the models is assessed in comparison with a wide range of observations. These models depend on laboratory-measured kinetic reaction rates and photolysis cross sections to simulate molecular interactions. A typical stratospheric chemistry mechanism has on the order of 50-100 species undergoing over a hundred intermolecular reactions and several tens of photolysis reactions. The rates of all of these reactions are subject to uncertainty, some substantial. Given the complexity of the models, however, it is difficult to quantify uncertainties in many aspects of the system. In this study we use a simple box-model scenario for Antarctic ozone to estimate the uncertainty in loss attributable to known reaction kinetic uncertainties. Following the method of earlier work, rates and uncertainties from the latest laboratory evaluations are applied in random combinations. We determine the key reactions and rates contributing the largest potential errors, and compare the results to observations to evaluate which combinations are consistent with atmospheric data. Implications for our theoretical and practical understanding of polar ozone loss are assessed.
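The "random combinations" approach can be sketched with a deliberately tiny box model. Everything below (two reactions instead of the 50-100 species mechanism, the rate constants, densities, and uncertainty factors) is invented for illustration; only the sampling pattern, lognormal multipliers applied jointly to all rates, reflects the method described:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy two-reaction ozone-loss box model (an illustrative stand-in):
# d[O3]/dt = -(k1*c1 + k2*c2) * [O3].
k_ref = np.array([3.0e-15, 8.0e-16])   # recommended rate constants, cm^3/s
f = np.array([1.3, 2.0])               # 1-sigma uncertainty factors per rate
c = np.array([1.0e8, 1.0e8])           # reaction-partner densities, cm^-3

def ozone_remaining(k, days=30.0):
    """Fraction of ozone left after integrating first-order loss."""
    return np.exp(-np.sum(k * c) * days * 86400.0)

# Apply the rates in random combinations within their evaluated uncertainties
samples = k_ref * f ** rng.standard_normal((2000, 2))
remaining = np.array([ozone_remaining(k) for k in samples])
spread = np.percentile(remaining, [5, 50, 95])
```

Comparing which sampled combinations reproduce observed loss is then a way to constrain the kinetics from atmospheric data, as the study proposes.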

  13. Uncertainty evaluation in normalization of isotope delta measurement results against international reference materials.

    PubMed

    Meija, Juris; Chartrand, Michelle M G

    2018-01-01

Isotope delta measurements are normalized against international reference standards. Although multi-point normalization is becoming a standard practice, existing uncertainty evaluation practices are either undocumented or incomplete. For multi-point normalization, we present errors-in-variables regression models that explicitly account for the measurement uncertainty of the international standards along with the uncertainty attributed to their assigned values. This manuscript presents a framework to account for the uncertainty that arises from a small number of replicate measurements, and discusses multi-laboratory data reduction while accounting for the inevitable correlations between laboratories due to the use of identical reference materials for calibration. Both frequentist and Bayesian methods of uncertainty analysis are discussed.
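The simplest case, two-point normalization with both uncertainty sources propagated by Monte Carlo, can be sketched as follows. The delta values and uncertainties are illustrative numbers, and this brute-force propagation is a stand-in for the errors-in-variables regression treatment the paper develops:

```python
import numpy as np

rng = np.random.default_rng(6)

def normalize(d_sample, std_meas, std_assigned):
    """Two-point normalisation: map a measured delta onto the scale defined
    by the assigned values of two international reference standards."""
    b = (std_assigned[1] - std_assigned[0]) / (std_meas[1] - std_meas[0])
    return std_assigned[0] + b * (d_sample - std_meas[0])

# Monte-Carlo propagation of both the measurement uncertainty of the
# standards and the uncertainty of their assigned values (numbers illustrative)
std_meas = np.array([-29.9, 0.2])       # measured deltas of the two standards
std_assigned = np.array([-30.0, 0.0])   # their assigned values
u_meas, u_assigned = 0.1, 0.05
draws = [normalize(-15.2 + rng.normal(0.0, u_meas),
                   std_meas + rng.normal(0.0, u_meas, 2),
                   std_assigned + rng.normal(0.0, u_assigned, 2))
         for _ in range(5000)]
result, uncertainty = float(np.mean(draws)), float(np.std(draws))
```

Treating the standards' measured values as uncertain, as done here, is what distinguishes an errors-in-variables view from ordinary regression, where the x-values are assumed exact.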

  14. Statistical error model for a solar electric propulsion thrust subsystem

    NASA Technical Reports Server (NTRS)

    Bantell, M. H.

    1973-01-01

    The solar electric propulsion thrust subsystem statistical error model was developed as a tool for investigating the effects of thrust subsystem parameter uncertainties on navigation accuracy. The model is currently being used to evaluate the impact of electric engine parameter uncertainties on navigation system performance for a baseline mission to Encke's Comet in the 1980s. The data given represent the next generation in statistical error modeling for low-thrust applications. Principal improvements include the representation of thrust uncertainties and random process modeling in terms of random parametric variations in the thrust vector process for a multi-engine configuration.

  15. Multi-Hypothesis Modelling Capabilities for Robust Data-Model Integration

    NASA Astrophysics Data System (ADS)

    Walker, A. P.; De Kauwe, M. G.; Lu, D.; Medlyn, B.; Norby, R. J.; Ricciuto, D. M.; Rogers, A.; Serbin, S.; Weston, D. J.; Ye, M.; Zaehle, S.

    2017-12-01

    Large uncertainty is often inherent in model predictions due to imperfect knowledge of how to describe the mechanistic processes (hypotheses) that a model is intended to represent. Yet this model hypothesis uncertainty (MHU) is often overlooked or informally evaluated, as methods to quantify and evaluate it are limited. MHU increases as models become more complex, because each additional process added to a model brings inherent MHU as well as parametric uncertainty. With the current trend of adding more processes to Earth System Models (ESMs), we are adding uncertainty that can be quantified for parameters but not for MHU. Model inter-comparison projects allow some consideration of hypothesis uncertainty, but in an ad hoc and non-independent fashion. This has stymied efforts to evaluate ecosystem models against data and interpret the results mechanistically: because such models combine sub-models of many systems and processes, each of which may be conceptualised and represented mathematically in various ways, it is not simple to determine exactly why a model produces the results it does or to identify which model assumptions are key. We present a novel modelling framework, the multi-assumption architecture and testbed (MAAT), that automates the combination, generation, and execution of a model ensemble built with different representations of processes. We will present the argument that multi-hypothesis modelling needs to be considered in conjunction with other capabilities (e.g. the Predictive Ecosystem Analyzer, PEcAn) and statistical methods (e.g. sensitivity analysis, data assimilation) to aid efforts in robust data-model integration and to enhance our predictive understanding of biological systems.
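
    The core idea, enumerating every combination of alternative process representations as an ensemble, can be sketched as follows. The process functions, formulas, and parameter values are invented for illustration and are not the MAAT code:

```python
import itertools

# Two alternative representations for each of two processes (invented forms).
def photosynthesis_v1(par): return 0.05 * par                      # linear light response
def photosynthesis_v2(par): return 12.0 * par / (par + 200.0)      # saturating response

def respiration_v1(t): return 0.5 * 2.0 ** ((t - 25.0) / 10.0)     # Q10 form
def respiration_v2(t): return 0.1 + 0.02 * t                       # linear form

processes = {
    "photosynthesis": [photosynthesis_v1, photosynthesis_v2],
    "respiration": [respiration_v1, respiration_v2],
}

# Every combination of process representations is one ensemble member.
names = list(processes)
ensemble = [dict(zip(names, combo))
            for combo in itertools.product(*processes.values())]

for member in ensemble:
    nee = member["photosynthesis"](par=400.0) - member["respiration"](t=20.0)
    print({k: f.__name__ for k, f in member.items()}, round(nee, 3))
```

    Spread across such an ensemble isolates hypothesis uncertainty from parametric uncertainty, since parameters can be held fixed while representations vary.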

  16. Impact of uncertainties in inorganic chemical rate constants on tropospheric composition and ozone radiative forcing

    NASA Astrophysics Data System (ADS)

    Newsome, Ben; Evans, Mat

    2017-12-01

    Chemical rate constants determine the composition of the atmosphere and how this composition has changed over time. They are central to our understanding of climate change and air quality degradation. Atmospheric chemistry models, whether online or offline, box, regional or global, use these rate constants. Expert panels evaluate laboratory measurements, making recommendations for the rate constants that should be used. This results in very similar or identical rate constants being used by all models. The inherent uncertainties in these recommendations are, in general, therefore ignored. We explore the impact of these uncertainties on the composition of the troposphere using the GEOS-Chem chemistry transport model. Based on the Jet Propulsion Laboratory (JPL) and International Union of Pure and Applied Chemistry (IUPAC) evaluations, we assess the influence of 50 mainly inorganic rate constants and 10 photolysis rates on tropospheric composition. We assess the impact on four standard metrics: annual mean tropospheric ozone burden, surface ozone and tropospheric OH concentrations, and tropospheric methane lifetime. Uncertainties in the rate constants for NO2 + OH + M → HNO3 + M and O3 + NO → NO2 + O2 are the two largest sources of uncertainty in these metrics. The absolute magnitude of the change in the metrics is similar whether rate constants are increased or decreased by their σ values. We investigate two methods of assessing these uncertainties, addition in quadrature and a Monte Carlo approach, and conclude that they give similar outcomes. Combining the uncertainties across the 60 reactions gives overall uncertainties on the annual mean tropospheric ozone burden, surface ozone and tropospheric OH concentrations, and tropospheric methane lifetime of 10, 11, 16 and 16 %, respectively. These are larger than the spread between models in recent model intercomparisons.
Remote regions such as the tropics, poles and upper troposphere are the most uncertain. This chemical uncertainty is sufficiently large to suggest that rate constant uncertainty should be considered alongside other processes when model results disagree with measurements. Calculations for the pre-industrial simulation allow a tropospheric ozone radiative forcing of 0.412 ± 0.062 W m-2 to be calculated. This uncertainty (13 %) is comparable to the inter-model spread in ozone radiative forcing found in previous model intercomparison studies in which the rate constants used in the models were all identical or very similar. Thus, the uncertainty of tropospheric ozone radiative forcing should be expanded to include this additional source of uncertainty. These rate constant uncertainties are significant and suggest that refinement of supposedly well-known chemical rate constants should be considered alongside other improvements to enhance our understanding of atmospheric processes.
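
    The two combination methods compared above can be illustrated with a toy example, assuming each rate-constant perturbation shifts a metric independently and linearly; the per-reaction effect sizes below are invented, not values from the study:

```python
import random

random.seed(1)

# Fractional changes in a metric (e.g. ozone burden) from perturbing each
# rate constant individually by 1 sigma (invented values).
individual_effects = [0.06, 0.05, 0.03, 0.02, 0.02]

# Method 1: addition in quadrature.
quadrature = sum(e ** 2 for e in individual_effects) ** 0.5

# Method 2: Monte Carlo, assuming effects combine linearly with independent
# N(0, 1) scalings of each perturbation.
def one_draw():
    return sum(e * random.gauss(0.0, 1.0) for e in individual_effects)

draws = [one_draw() for _ in range(20000)]
mc_mean = sum(draws) / len(draws)
mc = (sum((d - mc_mean) ** 2 for d in draws) / len(draws)) ** 0.5

print(f"quadrature: {quadrature:.4f}, Monte Carlo: {mc:.4f}")
```

    When the response is linear and the perturbations are independent, the two estimates converge, which is consistent with the paper's conclusion that the methods give similar outcomes.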

  17. Using statistical model to simulate the impact of climate change on maize yield with climate and crop uncertainties

    NASA Astrophysics Data System (ADS)

    Zhang, Yi; Zhao, Yanxia; Wang, Chunyi; Chen, Sining

    2017-11-01

    Assessing the impact of climate change on crop production while accounting for uncertainties is essential for identifying and deciding on agricultural practices that are sustainable. In this study, we employed 24 climate projections, consisting of the combinations of eight GCMs and three emission scenarios, to represent climate projection uncertainty, and two crop statistical models with 100 sets of parameters each to represent parameter uncertainty within the crop models. The goal of this study was to evaluate the impact of climate change on maize (Zea mays L.) yield at three locations (Benxi, Changling, and Hailun) across Northeast China (NEC) in the periods 2010-2039 and 2040-2069, taking 1976-2005 as the baseline period. The multi-model ensemble method is an effective way to deal with these uncertainties. The results of ensemble simulations showed that maize yield reductions were less than 5 % in both future periods relative to the baseline. To further understand the contributions of the individual sources of uncertainty, such as climate projections and crop model parameters, to ensemble yield simulations, variance decomposition was performed. The results indicated that the uncertainty from climate projections was much larger than that contributed by crop model parameters. Increased ensemble yield variance revealed increasing uncertainty in the yield simulation in the future periods.
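
    The variance decomposition step can be sketched with a synthetic factorial ensemble, attributing variance to each factor via the variance of its level means (a simple main-effects decomposition; all numbers below are invented):

```python
import random
import statistics

random.seed(2)

# Synthetic ensemble: 24 climate projections x 100 parameter sets, with the
# climate signal deliberately given a larger spread than the parameter signal.
climate_effects = [random.gauss(0.0, 0.8) for _ in range(24)]
param_effects = [random.gauss(0.0, 0.3) for _ in range(100)]
yields = {(c, p): 8.0 + ce + pe
          for c, ce in enumerate(climate_effects)
          for p, pe in enumerate(param_effects)}

# Main-effect variance of each factor = variance of the factor-level means.
clim_means = [statistics.mean(yields[(c, p)] for p in range(100)) for c in range(24)]
par_means = [statistics.mean(yields[(c, p)] for c in range(24)) for p in range(100)]
var_climate = statistics.pvariance(clim_means)
var_params = statistics.pvariance(par_means)

print(f"climate share of variance: {var_climate / (var_climate + var_params):.2f}")
```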

  18. Assessing model uncertainty using hexavalent chromium and ...

    EPA Pesticide Factsheets

    Introduction: The National Research Council recommended quantitative evaluation of uncertainty in effect estimates for risk assessment. This analysis considers uncertainty across model forms and model parameterizations, with hexavalent chromium [Cr(VI)] and lung cancer mortality as an example. The objective of this analysis is to characterize model uncertainty by evaluating the variance in estimates across several epidemiologic analyses. Methods: This analysis compared 7 publications analyzing two different chromate production sites in Ohio and Maryland. The Ohio cohort consisted of 482 workers employed from 1940-72, while the Maryland site employed 2,357 workers from 1950-74. Cox and Poisson models were the only model forms considered by study authors to assess the effect of Cr(VI) on lung cancer mortality. All models adjusted for smoking and included a 5-year exposure lag; however, other latency periods and model covariates such as age and race were also considered. Published effect estimates were standardized to the same units and normalized by their variances to produce a standardized metric for comparing variability in estimates across and within model forms. A total of 7 similarly parameterized analyses were considered across model forms, and 23 analyses with alternative parameterizations were considered within model form (14 Cox; 9 Poisson). Results: Across Cox and Poisson model forms, adjusted cumulative exposure coefficients for 7 similar analyses ranged from 2.47
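
    The standardization step can be illustrated as follows: dividing each published coefficient by its standard error puts estimates from different model forms on a comparable, variance-normalized scale. The coefficients below are invented, not values from the seven publications:

```python
# Variance-normalized comparison of effect estimates across model forms.
# "beta" is an exposure coefficient per unit cumulative exposure (invented).
estimates = [
    {"study": "Cox A", "beta": 0.051, "se": 0.012},
    {"study": "Cox B", "beta": 0.043, "se": 0.015},
    {"study": "Poisson A", "beta": 0.060, "se": 0.020},
]

for e in estimates:
    e["z"] = e["beta"] / e["se"]   # standardized (variance-normalized) score

spread = max(e["z"] for e in estimates) - min(e["z"] for e in estimates)
print([round(e["z"], 2) for e in estimates], "spread:", round(spread, 2))
```

    The spread of these standardized scores, within and across model forms, is one simple summary of model uncertainty.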

  19. Opinion: The use of natural hazard modeling for decision making under uncertainty

    Treesearch

    David E. Calkin; Mike Mentis

    2015-01-01

    Decision making to mitigate the effects of natural hazards is a complex undertaking fraught with uncertainty. Models to describe risks associated with natural hazards have proliferated in recent years. Concurrently, there is a growing body of work focused on developing best practices for natural hazard modeling and to create structured evaluation criteria for complex...

  20. Uncertainties in Predicting Rice Yield by Current Crop Models Under a Wide Range of Climatic Conditions

    NASA Technical Reports Server (NTRS)

    Li, Tao; Hasegawa, Toshihiro; Yin, Xinyou; Zhu, Yan; Boote, Kenneth; Adam, Myriam; Bregaglio, Simone; Buis, Samuel; Confalonieri, Roberto; Fumoto, Tamon; hide

    2014-01-01

    Predicting rice (Oryza sativa) productivity under future climates is important for global food security. Ecophysiological crop models in combination with climate model outputs are commonly used in yield prediction, but uncertainties associated with crop models remain largely unquantified. We evaluated 13 rice models against multi-year experimental yield data at four sites with diverse climatic conditions in Asia and examined whether different modeling approaches to major physiological processes contribute to the uncertainties in predicting field-measured yields and in the sensitivity to changes in temperature and CO2 concentration [CO2]. We also examined whether use of an ensemble of crop models can reduce these uncertainties. Individual models did not consistently reproduce both experimental and regional yields well, and uncertainty was larger at the warmest and coolest sites. The variation in yield projections was larger among crop models than the variation resulting from 16 global climate model-based scenarios. However, the mean of the predictions of all crop models reproduced the experimental data, with an uncertainty of less than 10 percent of measured yields. Using an ensemble of eight models calibrated only for phenology, or of five models calibrated in detail, resulted in uncertainty equivalent to that of the measured yield in well-controlled agronomic field experiments. Sensitivity analysis indicates the need to improve the accuracy of predicting both biomass and harvest index in response to increasing [CO2] and temperature.

  1. A new robust adaptive controller for vibration control of active engine mount subjected to large uncertainties

    NASA Astrophysics Data System (ADS)

    Fakhari, Vahid; Choi, Seung-Bok; Cho, Chang-Hyun

    2015-04-01

    This work presents a new robust model reference adaptive control (MRAC) scheme for controlling vibration caused by the vehicle engine, using an electromagnetic active engine mount. Vibration isolation performance of the active mount with the robust controller is evaluated in the presence of large uncertainties. As a first step, an active mount with a linear solenoid actuator is prepared and its dynamic model is identified via experimental tests. Subsequently, a new robust MRAC based on the gradient method with σ-modification is designed by selecting a proper reference model. In designing the robust adaptive control, structured (parametric) uncertainties in the stiffness of the passive part of the mount and in the damping ratio of the active part of the mount are considered to investigate the robustness of the proposed controller. Experimental and simulation results are presented to evaluate performance, focusing on the robustness of the controller in the face of large uncertainties. The results show that the proposed controller provides robust vibration control, maintaining effective vibration isolation even in the presence of large uncertainties.

  2. Evaluating Uncertainty of Runoff Simulation using SWAT model of the Feilaixia Watershed in China Based on the GLUE Method

    NASA Astrophysics Data System (ADS)

    Chen, X.; Huang, G.

    2017-12-01

    In recent years, distributed hydrological models have been widely used in storm water management, water resources protection and related fields. How to evaluate the uncertainty of such models reasonably and efficiently has therefore become an important topic. In this paper, the soil and water assessment tool (SWAT) model is constructed for China's Feilaixia watershed, and the uncertainty of the runoff simulation is analyzed in depth using the GLUE method. Taking the initial parameter range of the GLUE method as the core of the study, the influence of different initial parameter ranges on model uncertainty is examined. Two sets of parameter ranges are chosen as the object of study: the first (range 1) is recommended by SWAT-CUP, and the second (range 2) is calibrated by SUFI-2. The results showed that, for the same number of simulations (10,000), the overall uncertainty obtained with range 2 is less than with range 1. Specifically, the number of "behavioral" parameter sets is 10,000 for range 2 and 4,448 for range 1. In calibration and validation, the ratio of P-factor to R-factor is 1.387 and 1.391 for range 1, and 1.405 and 1.462 for range 2, respectively. In addition, the simulation results for range 2 are better, with NS and R2 slightly higher than for range 1. It can therefore be concluded that using the parameter range calibrated by SUFI-2 as the initial parameter range for GLUE is an effective way to capture and evaluate simulation uncertainty.
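
    The GLUE procedure itself is simple to sketch: draw parameter sets from an initial range, score each against observations with a likelihood measure, and retain only the "behavioral" sets above a threshold. The toy model, parameter ranges, and threshold below are invented and unrelated to the SWAT setup:

```python
import math
import random

random.seed(3)

# Synthetic "observations" from a known signal.
obs = [math.sin(t / 5.0) + 2.0 for t in range(50)]

def model(t, amp, offset):
    # Toy two-parameter model standing in for a hydrological model.
    return amp * math.sin(t / 5.0) + offset

def nash_sutcliffe(sim):
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst

behavioral = []
for _ in range(2000):
    amp = random.uniform(0.5, 1.5)       # initial parameter range (invented)
    offset = random.uniform(1.5, 2.5)
    ns = nash_sutcliffe([model(t, amp, offset) for t in range(50)])
    if ns > 0.7:                          # behavioral threshold (invented)
        behavioral.append((amp, offset, ns))

print(f"{len(behavioral)} behavioral sets out of 2000")
```

    Narrowing the initial range (as SUFI-2 calibration does for range 2) raises the fraction of behavioral sets, which is exactly the effect reported above.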

  3. External Peer Review Team Report for Corrective Action Unit 97: Yucca Flat/Climax Mine, Nevada National Security Site, Nye County, Nevada, Revision 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marutzky, Sam J.; Andrews, Robert

    The peer review team commends the Navarro-Intera, LLC (N-I), team for its efforts in using limited data to model the fate of radionuclides in groundwater at Yucca Flat. Recognizing the key uncertainties and related recommendations discussed in Section 6.0 of this report, the peer review team has concluded that the U.S. Department of Energy (DOE) is ready for a transition to model evaluation studies in the corrective action decision document (CADD)/corrective action plan (CAP) stage. The DOE, National Nuclear Security Administration Nevada Field Office (NNSA/NFO) clarified the charge to the peer review team in a letter dated October 9, 2014, from Bill R. Wilborn, NNSA/NFO Underground Test Area (UGTA) Activity Lead, to Sam J. Marutzky, N-I UGTA Project Manager: “The model and supporting information should be sufficiently complete that the key uncertainties can be adequately identified such that they can be addressed by appropriate model evaluation studies. The model evaluation studies may include data collection and model refinements conducted during the CADD/CAP stage. One major input to identifying ‘key uncertainties’ is the detailed peer review provided by independent qualified peers.” The key uncertainties that the peer review team recognized and potential concerns associated with each are outlined in Section 6.0, along with recommendations corresponding to each uncertainty. The uncertainties, concerns, and recommendations are summarized in Table ES-1. The number associated with each concern refers to the section in this report where the concern is discussed in detail.

  4. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited.
This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes. - Highlights: • Proposed a physics-informed framework to quantify uncertainty in RANS simulations. • Framework incorporates physical prior knowledge and observation data. • Based on a rigorous Bayesian framework yet fully utilizes physical model. • Applicable for many complex physical systems beyond turbulent flows.

  5. Two-Stage Modeling of Formaldehyde-Induced Tumor Incidence in the Rat—analysis of Uncertainties

    EPA Science Inventory

    This works extends the 2-stage cancer modeling of tumor incidence in formaldehyde-exposed rats carried out at the CIIT Centers for Health Research. We modify key assumptions, evaluate the effect of selected uncertainties, and develop confidence bounds on parameter estimates. Th...

  6. Status, Numbers and Influence

    ERIC Educational Resources Information Center

    Melamed, David; Savage, Scott V.

    2013-01-01

    We develop a theoretical model of social influence in n-person groups. We argue that disagreement between group members introduces uncertainty into the social situation, and this uncertainty motivates people to use status characteristics to evaluate the merits of a particular opinion. Our model takes the numerical distribution of opinions and the…

  7. A general Bayesian framework for calibrating and evaluating stochastic models of annual multi-site hydrological data

    NASA Astrophysics Data System (ADS)

    Frost, Andrew J.; Thyer, Mark A.; Srikanthan, R.; Kuczera, George

    2007-07-01

    Multi-site simulation of hydrological data is required for drought risk assessment of large multi-reservoir water supply systems. In this paper, a general Bayesian framework is presented for the calibration and evaluation of multi-site hydrological data at annual timescales. Models included within this framework are the hidden Markov model (HMM) and the widely used lag-1 autoregressive (AR(1)) model. These models are extended by the inclusion of a Box-Cox transformation and a spatial correlation function in a multi-site setting. Parameter uncertainty is evaluated using Markov chain Monte Carlo techniques. Models are evaluated by their ability to reproduce a range of important extreme statistics and compared using Bayesian model selection techniques that evaluate model probabilities. The case study, using multi-site annual rainfall data from catchments which contribute to Sydney's main water supply, provided the following results. Firstly, in terms of model probabilities and diagnostics, the inclusion of the Box-Cox transformation was preferred. Secondly, the AR(1) and HMM performed similarly, while some other proposed AR(1)/HMM models with regionally pooled parameters had greater posterior probability than these two models. The practical significance of parameter and model uncertainty was illustrated using a case study involving drought security analysis for urban water supply. It was shown that ignoring parameter uncertainty resulted in a significant overestimate of reservoir yield and an underestimation of system vulnerability to severe drought.
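
    A single-site sketch of the transformed AR(1) component may help: simulate in the Box-Cox domain, then back-transform to the flow domain. The lambda and AR(1) parameters below are invented, and the spatial correlation and Bayesian calibration are omitted:

```python
import random

random.seed(8)

# Invented parameters: Box-Cox lambda, AR(1) coefficient, mean and noise sd
# in the transformed domain.
lam, phi, mu, sigma = 0.3, 0.4, 6.5, 0.8

def inv_boxcox(z):
    # Inverse of the Box-Cox transform z = (y**lam - 1) / lam.
    return (lam * z + 1.0) ** (1.0 / lam)

z = mu
annual_flows = []
for _ in range(20):
    z = mu + phi * (z - mu) + random.gauss(0.0, sigma)  # AR(1) in transformed space
    annual_flows.append(inv_boxcox(z))

print([round(f, 1) for f in annual_flows[:5]])
```

    The Box-Cox step lets a Gaussian AR(1) process generate the skewed, strictly positive annual totals typical of rainfall and streamflow.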

  8. The effects of geometric uncertainties on computational modelling of knee biomechanics

    NASA Astrophysics Data System (ADS)

    Meng, Qingen; Fisher, John; Wilcox, Ruth

    2017-08-01

    The geometry of the articular components of the knee is an important factor in predicting joint mechanics in computational models. There are a number of uncertainties in the definition of the geometry of the cartilage and meniscus, and evaluating the effects of these uncertainties is fundamental to understanding the reliability of the models. In this study, the sensitivity of knee mechanics to geometric uncertainties was investigated by comparing polynomial-based and image-based knee models and by varying the size of the meniscus. The results suggest that the geometric uncertainties in cartilage and meniscus arising from the resolution of MRI and the accuracy of segmentation have considerable effects on the predicted knee mechanics. Moreover, even when the mathematical geometric descriptors closely match the image-based articular surfaces, the detailed contact pressure distribution they produce is not the same as that of the image-based model. However, the trends predicted by the models based on mathematical geometric descriptors were similar to those of the image-based models.

  9. An integrated uncertainty analysis and data assimilation approach for improved streamflow predictions

    NASA Astrophysics Data System (ADS)

    Hogue, T. S.; He, M.; Franz, K. J.; Margulis, S. A.; Vrugt, J. A.

    2010-12-01

    The current study presents an integrated uncertainty analysis and data assimilation approach to improve streamflow predictions while simultaneously providing meaningful estimates of the associated uncertainty. Study models include the National Weather Service (NWS) operational snow model (SNOW17) and rainfall-runoff model (SAC-SMA). The proposed approach uses the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm to simultaneously estimate uncertainties in model parameters, forcing, and observations. An ensemble Kalman filter (EnKF) is configured with the DREAM-identified uncertainty structure and applied to assimilate snow water equivalent data into the SNOW17 model for improved snowmelt simulations. Snowmelt estimates then serve as input to the SAC-SMA model to provide streamflow predictions at the basin outlet. The robustness and usefulness of the approach are evaluated for a snow-dominated watershed in the northern Sierra Mountains. This presentation describes the implementation of DREAM and the EnKF with the coupled SNOW17 and SAC-SMA models and summarizes study results and findings.
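
    The EnKF analysis step at the heart of such an assimilation can be sketched for a scalar state; the ensemble size, spreads, and observation below are invented, not values from the study:

```python
import random
import statistics

random.seed(4)

# Forecast ensemble of a scalar state: snow water equivalent in mm (invented).
ensemble = [random.gauss(100.0, 15.0) for _ in range(50)]
obs, obs_err = 120.0, 5.0   # observed SWE and its error standard deviation

forecast_var = statistics.pvariance(ensemble)
gain = forecast_var / (forecast_var + obs_err ** 2)   # Kalman gain (scalar case)

# Perturbed-observation update: each member is nudged toward its own noisy
# copy of the observation, preserving ensemble spread statistics.
analysis = [x + gain * (obs + random.gauss(0.0, obs_err) - x) for x in ensemble]

print(f"forecast mean {statistics.mean(ensemble):.1f} -> "
      f"analysis mean {statistics.mean(analysis):.1f}")
```

    Because the gain weights the forecast and observation by their variances, the DREAM-identified uncertainty structure directly controls how strongly the observation pulls the ensemble.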

  10. Framework for Uncertainty Assessment - Hanford Site-Wide Groundwater Flow and Transport Modeling

    NASA Astrophysics Data System (ADS)

    Bergeron, M. P.; Cole, C. R.; Murray, C. J.; Thorne, P. D.; Wurstner, S. K.

    2002-05-01

    Pacific Northwest National Laboratory is developing and implementing an uncertainty estimation methodology, addressing parameter uncertainty as well as uncertainties related to the groundwater conceptual model, for use in future site assessments and analyses made with the Hanford site-wide groundwater model. The basic approach in the framework developed for uncertainty assessment consists of: 1) Alternate conceptual model (ACM) identification to identify and document the major features and assumptions of each conceptual model. The process must also include a periodic review of the existing and proposed new conceptual models as data or understanding become available. 2) ACM development of each identified conceptual model through inverse modeling with historical site data. 3) ACM evaluation to identify which of the conceptual models are plausible and should be included in any subsequent uncertainty assessments. 4) ACM uncertainty assessments will only be carried out for those ACMs determined to be plausible through comparison with historical observations and model structure identification measures. The parameter uncertainty assessment process generally involves: a) Model Complexity Optimization - to identify the important or relevant parameters for the uncertainty analysis; b) Characterization of Parameter Uncertainty - to develop the pdfs for the important uncertain parameters including identification of any correlations among parameters; c) Propagation of Uncertainty - to propagate parameter uncertainties (e.g., by first order second moment methods if applicable or by a Monte Carlo approach) through the model to determine the uncertainty in the model predictions of interest.
5) Estimation of combined ACM and scenario uncertainty by a double sum with each component of the inner sum (an individual CCDF) representing parameter uncertainty associated with a particular scenario and ACM and the outer sum enumerating the various plausible ACM and scenario combinations in order to represent the combined estimate of uncertainty (a family of CCDFs). A final important part of the framework includes identification, enumeration, and documentation of all the assumptions, which include those made during conceptual model development, required by the mathematical model, required by the numerical model, made during the spatial and temporal discretization process, needed to assign the statistical model and associated parameters that describe the uncertainty in the relevant input parameters, and finally those assumptions required by the propagation method. Pacific Northwest National Laboratory is operated for the U.S. Department of Energy under Contract DE-AC06-76RL01830.

  11. A Cascade Approach to Uncertainty Estimation for the Hydrological Simulation of Droughts

    NASA Astrophysics Data System (ADS)

    Smith, Katie; Tanguy, Maliko; Parry, Simon; Prudhomme, Christel

    2016-04-01

    Uncertainty poses a significant challenge in environmental research, and the characterisation and quantification of uncertainty has become a research priority over the past decade. Studies of extreme events are particularly affected by issues of uncertainty. This study focusses on the sources of uncertainty in the modelling of streamflow droughts in the United Kingdom. Droughts are a poorly understood natural hazard with no universally accepted definition. Meteorological, hydrological and agricultural droughts have different meanings and vary both spatially and temporally, yet each is inextricably linked. The work presented here is part of two extensive interdisciplinary projects investigating drought reconstruction and drought forecasting capabilities in the UK. Lumped catchment models are applied to simulate streamflow drought, and uncertainties from 5 different sources are investigated: climate input data, potential evapotranspiration (PET) method, hydrological model, within-model structure, and model parameterisation. Latin Hypercube sampling is applied to develop large parameter ensembles for each model structure, which are run using parallel computing on a high performance computer cluster. Parameterisations are assessed using multi-objective evaluation criteria which include both general and drought performance metrics. The effect of different climate input data and PET methods on model output is then considered using the accepted model parameterisations. The uncertainty from each of the sources creates a cascade, and when presented as such the relative importance of each aspect of uncertainty can be determined.
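
    Latin Hypercube sampling, used above to build the parameter ensembles, guarantees that each parameter's range is stratified evenly. A minimal implementation, with hypothetical parameter bounds, looks like this:

```python
import random

random.seed(5)

def latin_hypercube(n_samples, bounds):
    """Stratified sampling: one sample per equal-probability stratum per parameter."""
    columns = []
    for lo, hi in bounds:
        # One jittered point in each of n_samples strata, then shuffle the strata
        # so parameters are paired randomly.
        strata = [(i + random.random()) / n_samples for i in range(n_samples)]
        random.shuffle(strata)
        columns.append([lo + u * (hi - lo) for u in strata])
    return list(zip(*columns))  # one tuple of parameter values per sample

# Hypothetical ranges for two hydrological model parameters.
samples = latin_hypercube(10, [(0.0, 1.0), (5.0, 500.0)])
for s in samples[:3]:
    print(tuple(round(v, 3) for v in s))
```

    Compared with plain random sampling, this covers each marginal range with far fewer model runs, which matters when every run is an expensive catchment simulation.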

  12. Evaluation of habitat suitability index models by global sensitivity and uncertainty analyses: a case study for submerged aquatic vegetation

    USGS Publications Warehouse

    Zajac, Zuzanna; Stith, Bradley M.; Bowling, Andrea C.; Langtimm, Catherine A.; Swain, Eric D.

    2015-01-01

    Habitat suitability index (HSI) models are commonly used to predict habitat quality and species distributions and are used to develop biological surveys, assess reserve and management priorities, and anticipate possible change under different management or climate change scenarios. Important management decisions may be based on model results, often without a clear understanding of the level of uncertainty associated with model outputs. We present an integrated methodology to assess the propagation of uncertainty from both the inputs and the structure of the HSI models to model outputs (uncertainty analysis: UA) and the relative importance of uncertain model inputs and their interactions to model output uncertainty (global sensitivity analysis: GSA). We illustrate the GSA/UA framework using simulated hydrology input data from a hydrodynamic model representing sea level changes and HSI models for two species of submerged aquatic vegetation (SAV) in southwest Everglades National Park: Vallisneria americana (tape grass) and Halodule wrightii (shoal grass). We found considerable spatial variation in uncertainty for both species, but distributions of HSI scores still allowed discrimination of sites with good versus poor conditions. Ranking of input parameter sensitivities also varied spatially for both species, with high habitat quality sites showing higher sensitivity to different parameters than low-quality sites. HSI models may be especially useful when species distribution data are unavailable, providing a means of exploiting widely available environmental datasets to model past, current, and future habitat conditions. The GSA/UA approach provides a general method for better understanding HSI model dynamics, the spatial and temporal variation in uncertainties, and the parameters that contribute most to model uncertainty. Including an uncertainty and sensitivity analysis in modeling efforts as part of the decision-making framework will result in better-informed, more robust decisions.
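    The paired UA/GSA workflow described above can be sketched with Monte Carlo sampling and a "pick-freeze" estimator of first-order Sobol indices. The two-input suitability model below is a hypothetical stand-in, not the study's HSI formulation:

```python
import numpy as np

# Toy 2-input habitat suitability model (hypothetical depth and salinity
# curves; the study's SAV models are more complex). Additive form so the
# first-order indices sum to ~1.
def hsi(depth, salinity):
    s1 = np.exp(-(depth - 1.0) ** 2)      # depth suitability in [0, 1]
    s2 = np.exp(-(salinity - 0.5) ** 2)   # salinity suitability in [0, 1]
    return 0.5 * (s1 + s2)

rng = np.random.default_rng(0)
n = 50_000
A = rng.uniform(0.0, 2.0, size=(n, 2))   # first Monte Carlo sample matrix
B = rng.uniform(0.0, 2.0, size=(n, 2))   # independent second matrix

yA = hsi(A[:, 0], A[:, 1])

# Uncertainty analysis: spread of HSI scores over the input distributions
p10, p90 = np.percentile(yA, [10, 90])

# Global sensitivity analysis: first-order Sobol indices by pick-freeze
S = []
for i in range(2):
    C = B.copy()
    C[:, i] = A[:, i]                    # freeze input i at the A values
    yC = hsi(C[:, 0], C[:, 1])
    S.append((np.mean(yA * yC) - yA.mean() * yC.mean()) / yA.var())
```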

  13. Can agent based models effectively reduce fisheries management implementation uncertainty?

    NASA Astrophysics Data System (ADS)

    Drexler, M.

    2016-02-01

    Uncertainty is an inherent feature of fisheries management. Implementation uncertainty remains a challenge to quantify, often due to unintended responses of users to management interventions. This problem will continue to plague both single-species and ecosystem-based fisheries management advice unless the mechanisms driving these behaviors are properly understood. Equilibrium models, where each actor in the system is treated as uniform and predictable, are not well suited to forecast the unintended behaviors of individual fishers. Alternatively, agent-based models (ABMs) can simulate the behaviors of each individual actor driven by differing incentives and constraints. This study evaluated the feasibility of using ABMs to capture macro-scale behaviors of the US West Coast groundfish fleet. Agent behavior was specified at the vessel level. Agents made daily fishing decisions using knowledge of their own cost structure, catch history, and the histories of catch and quota markets. By adding only a relatively small number of incentives, the model was able to reproduce highly realistic macro patterns of expected outcomes in response to management policies (catch restrictions, MPAs, ITQs) while preserving vessel heterogeneity. These simulations indicate that agent-based modeling approaches hold much promise for simulating fisher behaviors and reducing implementation uncertainty. Additional processes affecting behavior, informed by surveys, are continually being added to the fisher behavior model. Further coupling of the fisher behavior model to a spatial ecosystem model will provide a fully integrated social, ecological, and economic model capable of performing management strategy evaluations to properly consider implementation uncertainty in fisheries management.
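    A minimal sketch of the agent-based idea: heterogeneous vessels make daily profit-driven decisions under a catch restriction. All numbers and the decision rule are illustrative assumptions, far simpler than the groundfish fleet model described above:

```python
import numpy as np

rng = np.random.default_rng(7)

n_vessels = 100
costs = rng.uniform(50, 150, n_vessels)         # heterogeneous daily cost per vessel
catch_rates = rng.uniform(0.5, 2.0, n_vessels)  # expected tons per vessel-day
price = 80.0                                    # ex-vessel price per ton (hypothetical)
quota = 60.0                                    # total allowable catch (tons/day)

def simulate_day(price, quota):
    """Each vessel fishes only if its expected profit is positive; fleet
    landings are then capped by the quota (a simple catch restriction)."""
    fishes = price * catch_rates - costs > 0
    landings = catch_rates[fishes].sum()
    return min(landings, quota), int(fishes.sum())

landings, active = simulate_day(price, quota)
```

    Even this crude rule produces a macro response: raising the price draws more vessels into the fishery until the quota binds.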

  14. Monitoring and modeling as a continuing learning process: the use of hydrological models in a general probabilistic framework.

    NASA Astrophysics Data System (ADS)

    Baroni, G.; Gräff, T.; Reinstorf, F.; Oswald, S. E.

    2012-04-01

    Nowadays, uncertainty and sensitivity analysis are considered basic tools for the assessment of hydrological models and the evaluation of the most important sources of uncertainty. In this context, several methods have been developed and applied in different hydrological conditions over the last decades. However, in most cases, studies have mainly investigated the influence of parameter uncertainty on the simulated outputs, and few approaches have also considered other sources of uncertainty, i.e. input and model structure. Moreover, several constraints arise when spatially distributed parameters are involved. To overcome these limitations, a general probabilistic framework based on Monte Carlo simulations and the Sobol method has been proposed. In this study, the general probabilistic framework was applied at field scale using a 1D physically based hydrological model (SWAP). Furthermore, the framework was extended to catchment scale in combination with a spatially distributed hydrological model (SHETRAN). The models are applied at two different experimental sites in Germany: a relatively flat cropped field close to Potsdam (Brandenburg) and a small mountainous catchment with agricultural land use (Schaefertal, Harz Mountains). For both cases, input and parameters are considered the major sources of uncertainty. Evaluation of the models was based on soil moisture measured at plot scale at different depths and, for the catchment site, also on daily discharge values. The study shows how the framework can take into account all the various sources of uncertainty, i.e. input data, parameters (either in scalar or spatially distributed form), and model structures. The framework can be used in a loop in order to optimize further monitoring activities used to improve the performance of the model. In the particular applications, the results show how the sources of uncertainty are specific to each process considered. The influence of the input data, as well as the presence of compensating errors, becomes clear from the different processes simulated.
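    The separation of input and parameter uncertainty can be illustrated with a nested Monte Carlo design and the law of total variance; the one-parameter runoff model below is a toy stand-in for SWAP or SHETRAN:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy runoff model Q = c * P: runoff coefficient (parameter) times
# precipitation (input forcing). Distributions are illustrative.
n_outer, n_inner = 400, 400
c = rng.normal(0.5, 0.05, size=(n_outer, 1))    # uncertain parameter draws
P = rng.normal(100.0, 10.0, size=(1, n_inner))  # uncertain input draws

Q = c * P                                       # all parameter/input combinations
var_total = Q.var()

# Law of total variance: share of Var(Q) explained by each source alone
share_param = Q.mean(axis=1).var() / var_total  # Var_c(E_P[Q | c])
share_input = Q.mean(axis=0).var() / var_total  # Var_P(E_c[Q | P])
```

    With these numbers both sources contribute roughly equally (~0.5 each), with a small residual interaction term.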

  15. Crop Model Improvement Reduces the Uncertainty of the Response to Temperature of Multi-Model Ensembles

    NASA Technical Reports Server (NTRS)

    Maiorano, Andrea; Martre, Pierre; Asseng, Senthold; Ewert, Frank; Mueller, Christoph; Roetter, Reimund P.; Ruane, Alex C.; Semenov, Mikhail A.; Wallach, Daniel; Wang, Enli

    2016-01-01

    To improve climate change impact estimates and to quantify their uncertainty, multi-model ensembles (MMEs) have been suggested. Model improvements can improve the accuracy of simulations and reduce the uncertainty of climate change impact assessments. Furthermore, they can reduce the number of models needed in a MME. Herein, 15 wheat growth models of a larger MME were improved through re-parameterization and/or incorporating or modifying heat stress effects on phenology, leaf growth and senescence, biomass growth, and grain number and size using detailed field experimental data from the USDA Hot Serial Cereal experiment (calibration data set). Simulation results from before and after model improvement were then evaluated with independent field experiments from a CIMMYT worldwide field trial network (evaluation data set). Model improvements decreased the variation (10th to 90th model ensemble percentile range) of grain yields simulated by the MME on average by 39% in the calibration data set and by 26% in the independent evaluation data set for crops grown in mean seasonal temperatures greater than 24 °C. MME mean squared error in simulating grain yield decreased by 37%. A reduction in MME uncertainty range by 27% increased MME prediction skills by 47%. Results suggest that the mean level of variation observed in field experiments and used as a benchmark can be reached with half the number of models in the MME. Improving crop models is therefore important to increase the certainty of model-based impact assessments and allow more practical, i.e. smaller, MMEs to be used effectively.
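    The 10th-90th percentile spread used above to quantify MME uncertainty is straightforward to compute; the ensemble yields here are made-up numbers for illustration, not the study's data:

```python
import numpy as np

# Hypothetical simulated grain yields (t/ha) across ensemble members,
# before and after model improvement
before = np.array([0.0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
after = np.array([3.0, 4, 5, 6, 7])

def spread(yields):
    """10th-90th percentile range of the model ensemble."""
    lo, hi = np.percentile(yields, [10, 90])
    return hi - lo

reduction = 1.0 - spread(after) / spread(before)  # fraction of spread removed
```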

  16. Uncertainties in Past and Future Global Water Availability

    NASA Astrophysics Data System (ADS)

    Sheffield, J.; Kam, J.

    2014-12-01

    Understanding how water availability changes on inter-annual to decadal time scales, and how it may change in the future under climate change, is a key part of understanding future stresses on water and food security. Historic evaluations of water availability on regional to global scales are generally based on large-scale model simulations with their associated uncertainties, in particular for long-term changes. Uncertainties are due to model errors and missing processes, parameter uncertainty, and errors in meteorological forcing data. Recent multi-model inter-comparisons and impact studies have highlighted large differences for past reconstructions, due to different simplifying assumptions in the models or the inclusion of physical processes such as CO2 fertilization. Modeling of direct anthropogenic factors such as water and land management also carries large uncertainties in their physical representation and from lack of socio-economic data. Furthermore, there is little understanding of the impact of uncertainties in the meteorological forcings that underpin these historic simulations. Similarly, future changes in water availability are highly uncertain due to climate model diversity, natural variability, and scenario uncertainty, each of which dominates at different time scales. In particular, natural climate variability is expected to dominate any externally forced signal over the next several decades. We present results from multi-land surface model simulations of the historic global availability of water in the context of natural variability (droughts) and long-term changes (drying). The simulations take into account the impact of uncertainties in the meteorological forcings and the incorporation of water management in the form of reservoirs and irrigation. The results indicate that model uncertainty is important for short-term drought events, and forcing uncertainty is particularly important for long-term changes, especially uncertainty in precipitation due to reduced gauge density in recent years. We also discuss uncertainties in future projections from these models as driven by bias-corrected and downscaled CMIP5 climate projections, in the context of the balance between climate model robustness and climate model diversity.

  17. GUM-compliant uncertainty propagations for Pu and U concentration measurements using the 1st-prototype XOS/LANL hiRX instrument; an SRNL H-Canyon Test Bed performance evaluation project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, Michael K.; O'Rourke, Patrick E.

    An SRNL H-Canyon Test Bed performance evaluation project was completed jointly by SRNL and LANL on a prototype monochromatic energy dispersive x-ray fluorescence instrument, the hiRX. A series of uncertainty propagations were generated based upon plutonium and uranium measurements performed using the alpha-prototype hiRX instrument. Data reduction and uncertainty modeling provided in this report were performed by the SRNL authors. Observations and lessons learned from this evaluation were also used to predict the expected uncertainties that should be achievable at multiple plutonium and uranium concentration levels provided instrument hardware and software upgrades being recommended by LANL and SRNL are performed.
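    A GUM-compliant propagation combines input standard uncertainties through sensitivity coefficients: u_c² = Σ (∂f/∂x_i)² u_i². The generic sketch below (uncorrelated inputs, numerically estimated partial derivatives) illustrates the principle; the calibration function and values are hypothetical, not the hiRX data reduction:

```python
import numpy as np

def combined_standard_uncertainty(f, x, u, h=1e-6):
    """GUM law of propagation of uncertainty for uncorrelated inputs:
    u_c^2 = sum_i (df/dx_i)^2 * u_i^2, with sensitivity coefficients
    estimated by central finite differences."""
    x = np.asarray(x, float)
    u = np.asarray(u, float)
    c = np.empty_like(x)
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = h * max(1.0, abs(x[i]))
        c[i] = (f(x + dx) - f(x - dx)) / (2 * dx[i])  # df/dx_i
    return float(np.sqrt(np.sum((c * u) ** 2)))

# Hypothetical calibration: concentration = slope * measured intensity,
# with standard uncertainties on both slope and intensity
u_c = combined_standard_uncertainty(lambda v: v[0] * v[1], [2.0, 3.0], [0.1, 0.2])
```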

  18. The epistemic and aleatory uncertainties of the ETAS-type models: an application to the Central Italy seismicity.

    PubMed

    Lombardi, A M

    2017-09-18

    Stochastic models provide quantitative evaluations of earthquake occurrence. A basic component of this type of model is the treatment of uncertainties in defining the main features of an intrinsically random process. Even if, at a very basic level, any attempt to distinguish between types of uncertainty is questionable, a usual way to deal with this topic is to separate epistemic uncertainty, due to lack of knowledge, from aleatory variability, due to randomness. In the present study this problem is addressed in the narrow context of short-term modeling of earthquakes and, specifically, of ETAS modeling. By means of an application of a specific version of the ETAS model to the seismicity of Central Italy, recently struck by a sequence with a main event of Mw 6.5, the aleatory and epistemic (parametric) uncertainties are separated and quantified. The main result of the paper is that the parametric uncertainty of the ETAS-type model adopted here is much lower than the aleatory variability in the process. This result points out two main aspects: an analyst has good chances of setting up an ETAS-type model, but may still describe and forecast earthquake occurrences with limited precision and accuracy.
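    For reference, the conditional intensity of a standard ETAS model takes the form λ(t) = μ + Σ_{t_i<t} K·exp(α(m_i − m0))·(t − t_i + c)^(−p). A direct sketch, with illustrative parameter values rather than those fitted for Central Italy:

```python
import numpy as np

def etas_intensity(t, events, mu=0.2, K=0.05, alpha=1.2, c=0.01, p=1.1, m0=3.0):
    """Conditional intensity of a standard ETAS model: background rate mu
    plus Omori-type aftershock contributions from all past events.
    Parameter values are illustrative, not fitted values."""
    rate = mu
    for t_i, m_i in events:
        if t_i < t:
            rate += K * np.exp(alpha * (m_i - m0)) * (t - t_i + c) ** (-p)
    return rate

background = etas_intensity(10.0, [])           # no past events: just mu
triggered = etas_intensity(10.0, [(9.9, 6.5)])  # a recent Mw 6.5 raises the rate
```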

  19. A unified approach for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties

    NASA Astrophysics Data System (ADS)

    Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie

    2017-09-01

    Automotive brake systems are always subjected to various types of uncertainties, and two types of random-fuzzy uncertainties may exist in the brakes. In this paper, a unified approach is proposed for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties. In the proposed approach, two uncertainty analysis models with mixed variables are introduced to model the random-fuzzy uncertainties. The first one is the random and fuzzy model, in which random variables and fuzzy variables exist simultaneously and independently. The second one is the fuzzy random model, in which uncertain parameters are all treated as random variables while their distribution parameters are expressed as fuzzy numbers. Firstly, the fuzziness is discretized using the α-cut technique and the two uncertainty analysis models are simplified into random-interval models. Afterwards, by temporarily neglecting interval uncertainties, the random-interval models are degraded into random models, in which the expectations, variances, reliability indexes and reliability probabilities of system stability functions are calculated. Then, by reconsidering the interval uncertainties, the bounds of the expectations, variances, reliability indexes and reliability probabilities are computed based on Taylor series expansion. Finally, by recomposing the analysis results at each α-cut level, the fuzzy reliability indexes and probabilities can be obtained, by which the brake squeal instability can be evaluated. The proposed approach gives a general framework to deal with both types of random-fuzzy uncertainties that may exist in the brakes, and its effectiveness is demonstrated by numerical examples. It will be a valuable supplement to the systematic study of brake squeal considering uncertainty.
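    The α-cut discretization step can be illustrated for a triangular fuzzy number: each α level yields an interval, which can then be propagated through a monotone response function to obtain bounds. The response function here is a hypothetical placeholder for the stability functions in the paper:

```python
def alpha_cut(tri, alpha):
    """alpha-cut interval of a triangular fuzzy number (a, b, c):
    membership rises linearly from a to the peak b, then falls to c."""
    a, b, c = tri
    return (a + alpha * (b - a), c - alpha * (c - b))

def propagate(f, interval):
    """Interval bounds through a monotone increasing function f."""
    lo, hi = interval
    return (f(lo), f(hi))

cut = alpha_cut((1.0, 2.0, 3.0), 0.5)            # interval at membership 0.5
margin = propagate(lambda x: 2 * x + 1, cut)     # hypothetical response bounds
```

    Repeating this over a grid of α levels and recomposing the resulting intervals recovers a fuzzy description of the output, as in the paper's final step.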

  20. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1).

    PubMed

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2009-01-01

    The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1 when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions, considering the ASM1 bio-kinetic parameters and influent fractions as input uncertainties, while the Effluent Quality Index (EQI) and the Operating Cost Index (OCI) are focused on as model outputs. The resulting Monte Carlo simulations are presented using descriptive statistics indicating the degree of uncertainty in the predicted EQI and OCI. Next, the Standardized Regression Coefficients (SRC) method is used for sensitivity analysis to identify which input parameters influence the uncertainty in the EQI predictions the most. The results show that control strategies including an ammonium (S_NH) controller reduce uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S_NO) uncertainty at the expense of increased economic cost and cost variability. Finally, the maximum specific autotrophic growth rate (μ_A) causes most of the variance in the effluent for all the evaluated control strategies. The influence of denitrification-related parameters, e.g. η_g (anoxic growth rate correction factor) and η_h (anoxic hydrolysis rate correction factor), becomes less important when an S_NO controller manipulating an external carbon source addition is implemented.
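    The SRC method amounts to fitting a linear regression to the Monte Carlo sample and standardizing the coefficients by input and output standard deviations. A minimal sketch on a synthetic response (the inputs are stand-ins for ASM1 parameters, not the actual BSM1 setup):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000

# Two Monte Carlo-sampled inputs and a toy "EQI-like" response in which
# the first input is twice as influential as the second
x1 = rng.normal(0, 1, n)
x2 = rng.normal(0, 1, n)
y = 2.0 * x1 + 1.0 * x2 + rng.normal(0, 0.1, n)

# Ordinary least-squares fit of y on [1, x1, x2]
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Standardized regression coefficient: beta_i * std(x_i) / std(y)
src = beta[1:] * np.array([x1.std(), x2.std()]) / y.std()
```

    For a nearly linear model the squared SRCs sum to about one, and their magnitudes rank the inputs by contribution to output variance.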

  1. Evaluating Precipitation from Orbital Data Products of TRMM and GPM over the Indian Subcontinent

    NASA Astrophysics Data System (ADS)

    Jayaluxmi, I.; Kumar, D. N.

    2015-12-01

    The rapidly growing records of microwave-based precipitation data made available from various earth observation satellites have instigated a pressing need to evaluate the associated uncertainty, which arises from different sources such as retrieval error, spatial/temporal sampling error, and sensor-dependent error. Pertaining to microwave remote sensing, most studies in the literature focus on gridded data products; fewer studies exist on evaluating the uncertainty inherent in orbital data products. Evaluation of the latter is essential, as they potentially cause large uncertainties during real-time flood forecasting studies, especially at the watershed scale. The present study evaluates the uncertainty of precipitation data derived from the orbital data products of the Tropical Rainfall Measuring Mission (TRMM) satellite, namely the 2A12, 2A25, and 2B31 products. Case study results over the flood-prone basin of Mahanadi, India, are analyzed for precipitation uncertainty through four facets, viz.: a) uncertainty quantification using the volumetric metrics from the contingency table [AghaKouchak and Mehran 2014]; b) error characterization using additive and multiplicative error models; c) error decomposition to identify systematic and random errors; d) comparative assessment with the orbital data from the GPM mission. The homoscedastic random errors from multiplicative error models justify a better representation of precipitation estimates by the 2A12 algorithm. It can be concluded that although the radiometer-derived 2A12 precipitation data are known to suffer from many sources of uncertainty, spatial analysis over the case study region of India testifies that they are in excellent agreement with the reference estimates for the data period considered [Indu and Kumar 2015]. References: A. AghaKouchak and A. Mehran (2014), Extended contingency table: Performance metrics for satellite observations and climate model simulations, Water Resources Research, vol. 49, 7144-7149; J. Indu and D. Nagesh Kumar (2015), Evaluation of Precipitation Retrievals from Orbital Data Products of TRMM over a Subtropical Basin in India, IEEE Transactions on Geoscience and Remote Sensing, in press, doi: 10.1109/TGRS.2015.2440338.
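    The systematic/random error decomposition in facet (c) can be sketched, under an additive error model, as splitting the mean squared error into squared mean bias plus error variance:

```python
import numpy as np

def decompose_error(estimate, reference):
    """Split mean squared error into a systematic part (squared mean bias)
    and a random part (error variance): MSE = bias**2 + var(error)."""
    err = np.asarray(estimate, float) - np.asarray(reference, float)
    bias = err.mean()
    return bias ** 2, err.var()

# Illustrative values: a satellite estimate with a purely systematic offset
ref = np.array([1.0, 2.0, 3.0, 4.0])
est = ref + 1.0
systematic, random_part = decompose_error(est, ref)
```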

  2. Accuracy assessment for a multi-parameter optical calliper in on line automotive applications

    NASA Astrophysics Data System (ADS)

    D'Emilia, G.; Di Gasbarro, D.; Gaspari, A.; Natale, E.

    2017-08-01

    In this work, a methodological approach based on the evaluation of measurement uncertainty is applied to an experimental test case from the automotive sector. The uncertainty model for different measurement procedures of a high-accuracy optical gauge is discussed in order to identify the best measuring performance of the system for on-line applications, as measurement requirements become more stringent. In particular, with reference to the industrial production and control strategies of high-performing turbochargers, two uncertainty models to be used with the optical calliper are proposed, discussed, and compared. The models are based on an integrated approach between measurement methods and production best practices to emphasize their mutual coherence. The paper shows the possible advantages of measurement uncertainty modelling in keeping the uncertainty propagation under control for all the indirect measurements useful for production statistical control, on which further improvements can be based.

  3. Sensitivity of Polar Stratospheric Ozone Loss to Uncertainties in Chemical Reaction Kinetics

    NASA Technical Reports Server (NTRS)

    Kawa, S. Randolph; Stolarski, Richard S.; Douglass, Anne R.; Newman, Paul A.

    2008-01-01

    Several recent observational and laboratory studies of processes involved in polar stratospheric ozone loss have prompted a reexamination of aspects of our understanding of this key indicator of global change. To a large extent, our confidence in understanding and projecting changes in polar and global ozone is based on our ability to simulate these processes in numerical models of chemistry and transport. These models depend on laboratory-measured kinetic reaction rates and photolysis cross sections to simulate molecular interactions. In this study we use a simple box-model scenario for Antarctic ozone to estimate the uncertainty in loss attributable to known reaction kinetic uncertainties. Following the method of earlier work, rates and uncertainties from the latest laboratory evaluation are applied in random combinations. We determine the key reactions and rates contributing the largest potential errors and compare the results to observations to evaluate which combinations are consistent with atmospheric data. Implications for our theoretical and practical understanding of polar ozone loss are assessed.
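    The random-combination approach can be sketched by sampling each rate constant as k·f^z with z ~ N(0, 1), where f is the evaluated uncertainty factor, and propagating the samples through a loss calculation. All values below are illustrative, and the single first-order loss step is a drastic simplification of a chemical box model:

```python
import numpy as np

rng = np.random.default_rng(5)

# Laboratory evaluations report a rate constant k and an uncertainty
# factor f; a common sampling choice is k * f**z with z ~ N(0, 1).
# Numbers here are hypothetical, not evaluated kinetics.
k_nominal = 1.0e-12   # cm^3 molecule^-1 s^-1
f = 1.3               # uncertainty factor
n = 1000

z = rng.standard_normal(n)
k_samples = k_nominal * f ** z

# Toy first-order ozone loss after time T with reactant concentration c
c, T, o3_0 = 1.0e9, 1.0e3, 1.0
o3_final = o3_0 * np.exp(-k_samples * c * T)
```

    The spread of `o3_final` across the sampled rate constants is the kinetics-attributable uncertainty in the simulated loss.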

  4. Spatial planning using probabilistic flood maps

    NASA Astrophysics Data System (ADS)

    Alfonso, Leonardo; Mukolwe, Micah; Di Baldassarre, Giuliano

    2015-04-01

    Probabilistic flood maps account for uncertainty in flood inundation modelling and convey a degree of certainty in the outputs. Major sources of uncertainty include input data, topographic data, model structure, observation data, and parametric uncertainty. Decision makers prefer less ambiguous information from modellers; this implies that uncertainty is suppressed to yield binary flood maps, though suppressing information may lead to surprises or misleading decisions. Including uncertain information in the decision-making process is therefore desirable and more transparent. To this end, we utilise prospect theory and information from a probabilistic flood map to evaluate potential decisions. Consequences related to the decisions were evaluated using flood risk analysis. Prospect theory explains how choices are made given options for which probabilities of occurrence are known, and accounts for decision makers' characteristics such as loss aversion and risk seeking. Our results show that decision making is most pronounced when gains and losses are high, implying higher payoffs and penalties and therefore a higher gamble. The methodology may thus be appropriately considered when making decisions based on uncertain information.
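    Prospect theory evaluates options with a loss-averse value function and a nonlinear probability weighting function. A sketch using the standard Tversky-Kahneman functional forms and parameter estimates; the flood-decision numbers are hypothetical:

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Tversky-Kahneman value function: concave over gains,
    convex and loss-averse (lam > 1) over losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def weight(p, gamma=0.61):
    """Inverse-S probability weighting: small probabilities overweighted."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Hypothetical choice: a small certain protection cost versus gambling
# on a large flood loss whose probability comes from a probabilistic map
protect = value(-5.0)
gamble = weight(0.1) * value(-100.0)
```

    Comparing `protect` with `gamble` (choose the less negative prospect) is the decision step; loss aversion and probability overweighting both push toward protection.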

  5. THE ATMOSPHERIC MODEL EVALUATION TOOL (AMET): METEOROLOGY MODULE

    EPA Science Inventory

    An Atmospheric Model Evaluation Tool (AMET), composed of meteorological and air quality components, is being developed to examine the error and uncertainty in the model simulations. AMET matches observations with the corresponding model-estimated values in space and time, and the...

  6. Multivariate Geostatistical Analysis of Uncertainty for the Hydrodynamic Model of a Geological Trap for Carbon Dioxide Storage. Case study: Multilayered Geological Structure Vest Valcele, ROMANIA

    NASA Astrophysics Data System (ADS)

    Scradeanu, D.; Pagnejer, M.

    2012-04-01

    The purpose of this work is to evaluate the uncertainty of the hydrodynamic model for a multilayered geological structure, a potential trap for carbon dioxide storage. The hydrodynamic model is based on a conceptual model of the multilayered hydrostructure with three components: 1) a spatial model; 2) a parametric model; and 3) an energy model. The data needed to build the three components of the conceptual model are obtained from 240 boreholes explored by geophysical logging and seismic investigation, for the first two components, and from an experimental water injection test for the last one. The hydrodynamic model is a finite difference numerical model based on a 3D stratigraphic model with nine stratigraphic units (Badenian and Oligocene) and a 3D multiparameter model (porosity, permeability, hydraulic conductivity, storage coefficient, leakage, etc.). The uncertainty of the two 3D models was evaluated using multivariate geostatistical tools: a) the cross-semivariogram for structural analysis, especially the study of anisotropy, and b) cokriging to reduce estimation variances in the specific situation where there is a cross-correlation between a variable and one or more variables that are undersampled. Important differences were identified between univariate and bivariate anisotropy. The minimised uncertainty of the parametric model (by cokriging) was transferred to the hydrodynamic model. The uncertainty distribution of the pressures generated by the water injection test was additionally filtered by the sensitivity of the numerical model. The obtained relative errors of the pressure distribution in the hydrodynamic model are 15-20%. The scientific research was performed in the frame of the European FP7 project "A multiple space and time scale approach for the quantification of deep saline formations for CO2 storage (MUSTANG)".

  7. Uncertainty in solid precipitation and snow depth prediction for Siberia using the Noah and Noah-MP land surface models

    NASA Astrophysics Data System (ADS)

    Suzuki, Kazuyoshi; Zupanski, Milija

    2018-01-01

    In this study, we investigate the uncertainties associated with land surface processes in an ensemble prediction context. Specifically, we compare the uncertainties produced by a coupled atmosphere-land modeling system with two different land surface models, the Noah-MP land surface model (LSM) and the Noah LSM, by using the Maximum Likelihood Ensemble Filter (MLEF) data assimilation system as a platform for ensemble prediction. We carried out 24-hour prediction simulations in Siberia with 32 ensemble members beginning at 00:00 UTC on 5 March 2013. We then compared the model prediction uncertainty of snow depth and solid precipitation with observation-based research products and evaluated the standard deviation of the ensemble spread. The prediction skill and ensemble spread exhibited high positive correlation for both LSMs, indicating a realistic uncertainty estimation. The inclusion of a multi-layer snow model in the Noah-MP LSM was beneficial for reducing the uncertainties of snow depth and snow depth change compared to the Noah LSM, but the uncertainty in daily solid precipitation showed minimal difference between the two LSMs. The impact of LSM choice in reducing temperature uncertainty was limited to surface layers of the atmosphere. In summary, we found that the more sophisticated Noah-MP LSM reduces uncertainties associated with land surface processes compared to the Noah LSM. Thus, using prediction models with improved skill implies improved predictability and greater certainty of prediction.

  8. Accounting for observation uncertainties in an evaluation metric of low latitude turbulent air-sea fluxes: application to the comparison of a suite of IPSL model versions

    NASA Astrophysics Data System (ADS)

    Servonnat, Jérôme; Găinuşă-Bogdan, Alina; Braconnot, Pascale

    2017-09-01

    Turbulent momentum and heat (sensible heat and latent heat) fluxes at the air-sea interface are key components of the whole energetics of the Earth's climate. The evaluation of these fluxes in climate models is still difficult because of the large uncertainties associated with the reference products. In this paper we present an objective metric accounting for reference uncertainties to evaluate the annual cycle of the low latitude turbulent fluxes of a suite of IPSL climate models. This metric consists of a Hotelling T² test between the simulated and observed fields in a reduced space characterized by the dominant modes of variability that are common to both the model and the reference, taking into account the observational uncertainty. The test is thus more severe when uncertainties are small, as is the case for sea surface temperature (SST). The results of the test show that for almost all variables and all model versions the model-reference differences are not zero. It is not possible to distinguish between model versions for sensible heat and meridional wind stress, certainly due to the large observational uncertainties. All model versions share similar biases for the different variables. There is no improvement between the reference versions of the IPSL model used for CMIP3 and CMIP5. The test also reveals that the higher horizontal resolution fails to improve the representation of the turbulent surface fluxes compared to the other versions. The representation of the fluxes is further degraded in a version with improved atmospheric physics, with an amplification of some of the biases in the Indian Ocean and in the intertropical convergence zone. The ranking of the model versions for the turbulent fluxes is not correlated with the ranking found for SST. This highlights that, despite the fact that SST gradients are important for the large-scale atmospheric circulation patterns, other factors such as wind speed and air-sea temperature contrast play an important role in the representation of turbulent fluxes.
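    The core of such a metric is a one-sample Hotelling T² statistic on the vector of model-minus-reference differences in the reduced space. A minimal sketch (without the mode-of-variability projection or the F-distribution significance step; the difference vectors are made-up):

```python
import numpy as np

def hotelling_t2(diffs):
    """One-sample Hotelling T^2 statistic testing whether the mean
    model-minus-reference difference vector is zero:
    T^2 = n * dbar^T S^{-1} dbar, with S the sample covariance."""
    diffs = np.asarray(diffs, float)
    n = diffs.shape[0]
    d = diffs.mean(axis=0)
    S = np.cov(diffs, rowvar=False)      # sample covariance (ddof=1)
    return float(n * d @ np.linalg.solve(S, d))

# Mean difference exactly zero -> the statistic vanishes
t2_zero = hotelling_t2([[1, 0], [-1, 0], [0, 1], [0, -1]])
# A systematic bias in the first component -> positive statistic
t2_bias = hotelling_t2([[1.1, 0.2], [0.9, 0.1], [1.0, 0.3], [1.2, 0.2]])
```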

  9. Evaluation of global fine-resolution precipitation products and their uncertainty quantification in ensemble discharge simulations

    NASA Astrophysics Data System (ADS)

    Qi, W.; Zhang, C.; Fu, G.; Sweetapple, C.; Zhou, H.

    2016-02-01

    The applicability of six fine-resolution precipitation products, including precipitation radar, infrared, microwave, and gauge-based products using different precipitation computation recipes, is evaluated using statistical and hydrological methods in northeastern China. In addition, a framework quantifying the uncertainty contributions of precipitation products, hydrological models, and their interactions to uncertainties in ensemble discharges is proposed. The investigated precipitation products are the Tropical Rainfall Measuring Mission (TRMM) products (TRMM3B42 and TRMM3B42RT), Global Land Data Assimilation System (GLDAS)/Noah, Asian Precipitation - Highly-Resolved Observational Data Integration Towards Evaluation of Water Resources (APHRODITE), Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN), and a Global Satellite Mapping of Precipitation (GSMAP-MVK+) product. Two hydrological models of different complexities, i.e. a water and energy budget-based distributed hydrological model and a physically based semi-distributed hydrological model, are employed to investigate the influence of hydrological models on simulated discharges. Results show that APHRODITE has high accuracy at a monthly scale compared with the other products, and GSMAP-MVK+ shows a clear advantage over TRMM3B42 in relative bias (RB), Nash-Sutcliffe coefficient of efficiency (NSE), root mean square error (RMSE), correlation coefficient (CC), false alarm ratio, and critical success index. These findings could be very useful for the validation, refinement, and future development of satellite-based products (e.g. NASA Global Precipitation Measurement). Although large uncertainty exists in heavy precipitation, hydrological models contribute most of the uncertainty in extreme discharges. Interactions between precipitation products and hydrological models can contribute to discharge uncertainty with a magnitude similar to that of the hydrological models. A better precipitation product does not guarantee a better discharge simulation because of these interactions. It is also found that a good discharge simulation depends on a good combination of a hydrological model and a precipitation product, suggesting that, although the satellite-based precipitation products are not as accurate as the gauge-based products, they can perform better in discharge simulations when appropriately combined with hydrological models. This information is revealed for the first time and is very beneficial for precipitation product applications.
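    The evaluation statistics named above (RB, NSE, RMSE, CC) are standard and easy to reproduce; a compact sketch with illustrative discharge values:

```python
import numpy as np

def metrics(sim, obs):
    """Standard discharge evaluation statistics."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    rb = (sim.sum() - obs.sum()) / obs.sum()                        # relative bias
    rmse = np.sqrt(np.mean((sim - obs) ** 2))                       # root mean square error
    nse = 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)  # Nash-Sutcliffe
    cc = np.corrcoef(sim, obs)[0, 1]                                # correlation coefficient
    return rb, rmse, nse, cc

obs = [10.0, 20.0, 30.0, 25.0]          # hypothetical observed discharges
rb, rmse, nse, cc = metrics(obs, obs)   # a perfect simulation as a sanity check
```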

  10. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty and ignore the model uncertainty in process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general and can be applied to a wide range of problems in hydrology and beyond.

  11. Sensitivity analysis of respiratory parameter uncertainties: impact of criterion function form and constraints.

    PubMed

    Lutchen, K R

    1990-08-01

    A sensitivity analysis based on weighted least-squares regression is presented to evaluate alternative methods for fitting lumped-parameter models to respiratory impedance data. The goal is to maintain parameter accuracy simultaneously with practical experiment design. The analysis focuses on predicting parameter uncertainties using a linearized approximation for joint confidence regions. It is applied to four-element parallel and viscoelastic models for 0.125- to 4-Hz data and to a six-element model with separate tissue and airway properties for input and transfer impedance data from 2 to 64 Hz. The criterion function form was evaluated by comparing parameter uncertainties when data are fit as magnitude and phase, dynamic resistance and compliance, or real and imaginary parts of input impedance. The proper choice of weighting can make all three criterion variables comparable. For the six-element model, parameter uncertainties were predicted when both input impedance and transfer impedance are acquired and fit simultaneously. A fit to both data sets from 4 to 64 Hz could reduce parameter estimate uncertainties considerably from those achievable by fitting either alone. For the four-element models, use of an independent, but noisy, measure of static compliance was assessed as a constraint on model parameters. This may allow acceptable parameter uncertainties for a minimum frequency of 0.275-0.375 Hz rather than 0.125 Hz, which reduces data acquisition requirements from a 16-s to a 5.33- to 8-s breath-holding period. These results are approximations, and the impact of using the linearized approximation for the confidence regions is discussed.
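
    The linearized parameter-uncertainty machinery can be illustrated in a few lines: the covariance of weighted least-squares estimates is approximated by (JᵀWJ)⁻¹, where J is the Jacobian of model predictions with respect to the parameters and W the weighting matrix. The model and numbers below (a simple resistance-compliance magnitude model with relative weighting) are illustrative assumptions, not the paper's actual respiratory models.

```python
import math

def model(f, params):
    """|Z| of a series resistance-compliance element (illustrative only)."""
    R, C = params
    return math.sqrt(R ** 2 + (1.0 / (2 * math.pi * f * C)) ** 2)

def jacobian(params, freqs, eps=1e-6):
    """Finite-difference Jacobian: rows = data points, columns = parameters."""
    base = [model(f, params) for f in freqs]
    cols = []
    for k in range(len(params)):
        p = list(params)
        p[k] += eps
        cols.append([(model(f, p) - b) / eps for f, b in zip(freqs, base)])
    return [[cols[0][i], cols[1][i]] for i in range(len(freqs))]

def param_sd(J, weights, noise_var=1.0):
    """Linearised standard uncertainties from the 2x2 normal matrix J^T W J."""
    a = b = c = 0.0
    for (j0, j1), w in zip(J, weights):
        a += w * j0 * j0
        b += w * j0 * j1
        c += w * j1 * j1
    det = a * c - b * b                   # closed-form inverse of a 2x2 matrix
    return (math.sqrt(noise_var * c / det), math.sqrt(noise_var * a / det))

freqs = [0.125 * 2 ** k for k in range(6)]              # 0.125 ... 4 Hz
params = (2.0, 0.05)                                    # R, C (made-up values)
weights = [1.0 / model(f, params) ** 2 for f in freqs]  # relative weighting
sd_R, sd_C = param_sd(jacobian(params, freqs), weights)
print(f"u(R) = {sd_R:.3g}, u(C) = {sd_C:.3g}")
```

    Changing the frequency range or the weighting scheme and re-running this calculation reproduces, in miniature, the kind of experiment-design comparison the abstract describes.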

  12. Lessons from Climate Modeling on the Design and Use of Ensembles for Crop Modeling

    NASA Technical Reports Server (NTRS)

    Wallach, Daniel; Mearns, Linda O.; Ruane, Alexander C.; Roetter, Reimund P.; Asseng, Senthold

    2016-01-01

    Working with ensembles of crop models is a recent but important development in crop modeling which promises to lead to better uncertainty estimates for model projections and predictions, better predictions using the ensemble mean or median, and closer collaboration within the modeling community. There are numerous open questions about the best way to create and analyze such ensembles. Much can be learned from the field of climate modeling, given its much longer experience with ensembles. We draw on that experience to identify questions and make propositions that should help make ensemble modeling with crop models more rigorous and informative. The propositions include: defining criteria for acceptance of models into a crop multi-model ensemble (MME); exploring criteria for evaluating the degree of relatedness of models in an MME; studying the effect of the number of models in the ensemble; developing a statistical model of model sampling; creating a repository for MME results; studying possible differential weighting of models in an ensemble; creating single-model ensembles, based on sampling from the uncertainty distribution of parameter values or inputs, specifically oriented toward uncertainty estimation; creating super-ensembles that sample more than one source of uncertainty; analyzing super-ensemble results to obtain information on total uncertainty and the separate contributions of different sources of uncertainty; and, finally, further investigating the use of the multi-model mean or median as a predictor.

  13. Assessing uncertainty in published risk estimates using ...

    EPA Pesticide Factsheets

    Introduction: The National Research Council recommended quantitative evaluation of uncertainty in effect estimates for risk assessment. This analysis considers uncertainty across model forms and model parameterizations, with hexavalent chromium [Cr(VI)] and lung cancer mortality as an example. The objective is to characterize model uncertainty by evaluating estimates across published epidemiologic studies of the same cohort. Methods: This analysis was based on 5 studies analyzing a cohort of 2,357 workers employed from 1950-74 in a chromate production plant in Maryland. Cox and Poisson models were the only model forms considered by study authors to assess the effect of Cr(VI) on lung cancer mortality. All models adjusted for smoking and included a 5-year exposure lag; however, other latency periods and model covariates, such as age and race, were also considered. Published effect estimates were standardized to the same units and normalized by their variances to produce a standardized metric for comparing variability within and between model forms. A total of 5 similarly parameterized analyses were considered across model form, and 16 analyses with alternative parameterizations were considered within model form (10 Cox; 6 Poisson). Results: Across Cox and Poisson model forms, adjusted cumulative exposure coefficients (betas) for 5 similar analyses ranged from 2.47 to 4.33 (mean=2.97, σ2=0.63). Within the 10 Cox models, coefficients ranged from 2.53 to 4.42 (mean=3.29, σ2=0.

  14. Evaluation of risk from acts of terrorism: the adversary/defender model using belief and fuzzy sets.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Darby, John L.

    Risk from an act of terrorism is a combination of the likelihood of an attack, the likelihood of success of the attack, and the consequences of the attack. The considerable epistemic uncertainty in each of these three factors can be addressed using the belief/plausibility measure of uncertainty from the Dempster/Shafer theory of evidence. The adversary determines the likelihood of the attack. The success of the attack and the consequences of the attack are determined by the security system and mitigation measures put in place by the defender. This report documents a process for evaluating risk of terrorist acts using an adversary/defender model with belief/plausibility as the measure of uncertainty. Also, the adversary model is a linguistic model that applies belief/plausibility to fuzzy sets used in an approximate reasoning rule base.
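
    The belief/plausibility bracket from Dempster/Shafer theory has a compact concrete form: a basic probability assignment spreads mass over subsets of a frame of discernment; the belief of a set sums the masses of its subsets, while its plausibility sums the masses of every set that intersects it. The consequence categories and mass values below are invented for illustration, not values from the report.

```python
# Frame of discernment for attack consequence (hypothetical categories).
frame = {"low", "medium", "high"}

# Basic probability assignment over subsets; the masses must sum to 1.
masses = {
    frozenset({"low"}): 0.2,
    frozenset({"medium", "high"}): 0.5,  # epistemic: cannot tell medium from high
    frozenset(frame): 0.3,               # mass assigned to total ignorance
}

def belief(A):
    """Sum of masses of all focal sets contained in A."""
    return sum(m for s, m in masses.items() if s <= A)

def plausibility(A):
    """Sum of masses of all focal sets that intersect A."""
    return sum(m for s, m in masses.items() if s & A)

A = frozenset({"high"})
print(belief(A), plausibility(A))  # belief <= P(A) <= plausibility
```

    The interval [belief, plausibility] brackets the unknown probability; with precise (singleton-only) masses the two coincide and ordinary probability is recovered.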

  15. Measurement uncertainty evaluation of conicity error inspected on CMM

    NASA Astrophysics Data System (ADS)

    Wang, Dongxia; Song, Aiguo; Wen, Xiulan; Xu, Youxiong; Qiao, Guifang

    2016-01-01

    The cone is widely used in mechanical design for rotation, centering and fixing. Whether the conicity error can be measured and evaluated accurately will directly influence its assembly accuracy and working performance. According to the new-generation geometrical product specification (GPS), the error and its measurement uncertainty should be evaluated together. The mathematical model of the minimum zone conicity error is established and an improved immune evolutionary algorithm (IIEA) is proposed to search for the conicity error. In the IIEA, initial antibodies are first generated by using quasi-random sequences and two kinds of affinities are calculated. Then, antibody clones are generated and self-adaptively mutated so as to maintain diversity; similar antibodies are suppressed and new random antibodies are generated. Because the mathematical model of conicity error is strongly nonlinear and the input quantities are not independent, it is difficult to use the Guide to the Expression of Uncertainty in Measurement (GUM) method to evaluate measurement uncertainty. An adaptive Monte Carlo method (AMCM) is proposed to estimate measurement uncertainty, in which the number of Monte Carlo trials is selected adaptively and the quality of the numerical results is directly controlled. The cone part was machined on a CK6140 lathe and measured on a Miracle NC 454 Coordinate Measuring Machine (CMM). The experimental results confirm that the proposed method not only can search for the approximate solution of the minimum zone conicity error (MZCE) rapidly and precisely, but also can evaluate measurement uncertainty and give control variables with an expected numerical tolerance. The conicity errors computed by the proposed method are 20%-40% less than those computed by the NC454 CMM software and the evaluation accuracy improves significantly.
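
    The core idea of an adaptive Monte Carlo method, growing the number of trials until the numerical quality of the results is under control, can be sketched generically. The measurement model below is a stand-in (a length under a small angle error), not the conicity model, and the stopping rule follows the spirit of GUM Supplement 1: keep adding blocks of trials until the estimate and its standard uncertainty are numerically stable.

```python
import math
import random
import statistics

rng = random.Random(7)

def measurement():
    """Hypothetical measurement model: a length combined with an angle error."""
    length = rng.gauss(50.0, 0.02)   # mm
    angle = rng.gauss(0.0, 0.001)    # rad
    return length * math.cos(angle)

def adaptive_mc(block=10_000, tol=1e-4, max_blocks=50):
    """Add blocks of trials until the standard errors of the mean and of the
    standard uncertainty fall below the numerical tolerance."""
    block_means, block_sds = [], []
    for _ in range(max_blocks):
        ys = [measurement() for _ in range(block)]
        block_means.append(statistics.fmean(ys))
        block_sds.append(statistics.stdev(ys))
        if len(block_means) >= 2:
            n = len(block_means)
            se_mean = statistics.stdev(block_means) / math.sqrt(n)
            se_sd = statistics.stdev(block_sds) / math.sqrt(n)
            if 2 * se_mean < tol and 2 * se_sd < tol:
                break   # results stable to within the requested tolerance
    return statistics.fmean(block_means), statistics.fmean(block_sds)

estimate, std_uncertainty = adaptive_mc()
print(f"estimate = {estimate:.4f} mm, standard uncertainty = {std_uncertainty:.4f} mm")
```

    Unlike a fixed-trial Monte Carlo run, the trial count here adapts to the problem, which is what lets the method guarantee a stated numerical tolerance on the reported uncertainty.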

  16. Development and comparison in uncertainty assessment based Bayesian modularization method in hydrological modeling

    NASA Astrophysics Data System (ADS)

    Li, Lu; Xu, Chong-Yu; Engeland, Kolbjørn

    2013-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, various regression and probabilistic approaches are used in hydrological modeling. A family of Bayesian methods, which incorporates different sources of information into a single analysis through Bayes' theorem, is widely used for uncertainty assessment. However, none of these approaches can adequately treat the impact of high flows in hydrological modeling. This study proposes a Bayesian modularization uncertainty assessment approach in which the highest streamflow observations are treated as suspect information that should not influence the inference of the main bulk of the model parameters. This study includes a comprehensive comparison and evaluation of uncertainty assessments by our new Bayesian modularization method and standard Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions were used in combination with the standard Bayesian method: the AR(1) plus Normal model independent of time (Model 1), the AR(1) plus Normal model dependent on time (Model 2) and the AR(1) plus Multi-normal model (Model 3). The results reveal that the Bayesian modularization method provides the most accurate streamflow estimates, as measured by the Nash-Sutcliffe efficiency, and provides the best uncertainty estimates for low, medium and entire flows compared to the standard Bayesian methods. The study thus provides a new approach for reducing the impact of high flows on the discharge uncertainty assessment of hydrological models via Bayesian methods.
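
    A minimal Metropolis-Hastings loop shows the standard Bayesian machinery that the modularization method builds on. The one-parameter rating model, flat prior, and independent Gaussian errors below are simplifying assumptions for illustration only; the paper's likelihoods are AR(1)-based and its hydrological model is WASMOD.

```python
import math
import random

rng = random.Random(3)

# Synthetic "observed" streamflow from a one-parameter toy model y = k * x.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
k_true, sigma = 2.0, 0.1
obs = [k_true * x + rng.gauss(0.0, sigma) for x in xs]

def log_likelihood(k):
    # Independent Gaussian errors: the simplest stand-in for the AR(1)-plus-
    # Normal likelihood families used in the study.
    return sum(-0.5 * ((y - k * x) / sigma) ** 2 for x, y in zip(xs, obs))

def metropolis_hastings(n=5000, step=0.05, burn_in=1000):
    k = 1.0                        # arbitrary starting point
    ll = log_likelihood(k)
    chain = []
    for _ in range(n):
        k_new = k + rng.gauss(0.0, step)       # symmetric random-walk proposal
        ll_new = log_likelihood(k_new)
        if math.log(rng.random()) < ll_new - ll:   # flat prior: likelihood ratio
            k, ll = k_new, ll_new
        chain.append(k)
    return chain[burn_in:]

samples = metropolis_hastings()
post_mean = sum(samples) / len(samples)
print(f"posterior mean of k: {post_mean:.2f}")
```

    The modularization idea then amounts to deciding which observations are allowed to enter `log_likelihood` when the bulk of the parameters are inferred, so that suspect high-flow points do not distort the posterior.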

  17. Impacts of uncertainties in European gridded precipitation observations on regional climate analysis

    PubMed Central

    Gobiet, Andreas

    2016-01-01

    Gridded precipitation data sets are frequently used to evaluate climate models or to remove model output biases. Although precipitation data are error prone due to the high spatio-temporal variability of precipitation and due to considerable measurement errors, relatively few attempts have been made to account for observational uncertainty in model evaluation or in bias correction studies. In this study, we compare three types of European daily data sets featuring two Pan-European data sets and a set that combines eight very high-resolution station-based regional data sets. Furthermore, we investigate seven widely used, larger scale global data sets. Our results demonstrate that the differences between these data sets have the same magnitude as precipitation errors found in regional climate models. Therefore, including observational uncertainties is essential for climate studies, climate model evaluation, and statistical post-processing. Following our results, we suggest the following guidelines for regional precipitation assessments. (1) Include multiple observational data sets from different sources (e.g. station, satellite, reanalysis based) to estimate observational uncertainties. (2) Use data sets with high station densities to minimize the effect of precipitation undersampling (may induce about 60% error in data sparse regions). The information content of a gridded data set is mainly related to its underlying station density and not to its grid spacing. (3) Consider undercatch errors of up to 80% in high latitudes and mountainous regions. (4) Analyses of small-scale features and extremes are especially uncertain in gridded data sets. For higher confidence, use climate-mean and larger scale statistics. In conclusion, neglecting observational uncertainties potentially misguides climate model development and can severely affect the results of climate change impact assessments. PMID:28111497

  19. Developing Uncertainty Models for Robust Flutter Analysis Using Ground Vibration Test Data

    NASA Technical Reports Server (NTRS)

    Potter, Starr; Lind, Rick; Kehoe, Michael W. (Technical Monitor)

    2001-01-01

    A ground vibration test can be used to obtain information about structural dynamics that is important for flutter analysis. Traditionally, this information, such as natural frequencies of modes, is used to update analytical models used to predict flutter speeds. The ground vibration test can also be used to obtain uncertainty models, such as natural frequencies and their associated variations, that can update analytical models for the purpose of predicting robust flutter speeds. Analyzing test data using the ∞-norm, rather than the traditional 2-norm, is shown to lead to a minimum-size uncertainty description and, consequently, a least-conservative robust flutter speed. This approach is demonstrated using ground vibration test data for the Aerostructures Test Wing. Different norms are used to formulate uncertainty models and their associated robust flutter speeds to evaluate which norm is least conservative.

  20. Neural network uncertainty assessment using Bayesian statistics: a remote sensing application

    NASA Technical Reports Server (NTRS)

    Aires, F.; Prigent, C.; Rossow, W. B.

    2004-01-01

    Neural network (NN) techniques have proved successful for many regression problems, in particular for remote sensing; however, uncertainty estimates are rarely provided. In this article, a Bayesian technique to evaluate uncertainties of the NN parameters (i.e., synaptic weights) is first presented. In contrast to more traditional approaches based on point estimation of the NN weights, we assess uncertainties on such estimates to monitor the robustness of the NN model. These theoretical developments are illustrated by applying them to the problem of retrieving surface skin temperature, microwave surface emissivities, and integrated water vapor content from a combined analysis of satellite microwave and infrared observations over land. The weight uncertainty estimates are then used to compute analytically the uncertainties in the network outputs (i.e., error bars and correlation structure of these errors). Such quantities are very important for evaluating any application of an NN model. The uncertainties on the NN Jacobians are then considered in the third part of this article. Used for regression fitting, NN models can be used effectively to represent highly nonlinear, multivariate functions. In this situation, most emphasis is put on estimating the output errors, but almost no attention has been given to errors associated with the internal structure of the regression model. The complex structure of dependency inside the NN is the essence of the model, and assessing its quality, coherency, and physical character makes all the difference between a blackbox model with small output errors and a reliable, robust, and physically coherent model. Such dependency structures are described to the first order by the NN Jacobians: they indicate the sensitivity of one output with respect to the inputs of the model for given input data. We use a Monte Carlo integration procedure to estimate the robustness of the NN Jacobians. 
A regularization strategy based on principal component analysis is proposed to suppress the multicollinearities in order to make these Jacobians robust and physically meaningful.
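
    The Monte Carlo treatment of Jacobian robustness can be miniaturised: sample the network weights from their (here assumed Gaussian) uncertainty distribution and examine the induced spread of the input-output derivative. The one-hidden-unit network and the weight standard deviations below are invented for illustration, not taken from the retrieval application.

```python
import math
import random
import statistics

rng = random.Random(5)

# Point estimates of a tiny "network" y = w2 * tanh(w1 * x), together with
# hypothetical posterior standard deviations for each weight.
w1_hat, w2_hat = 1.2, 0.8
w1_sd, w2_sd = 0.1, 0.05

def jacobian(x, w1, w2):
    """Analytic dy/dx for y = w2 * tanh(w1 * x)."""
    return w2 * w1 / math.cosh(w1 * x) ** 2

# Monte Carlo over the weight distribution: spread of the Jacobian at x = 0.5.
samples = [jacobian(0.5, rng.gauss(w1_hat, w1_sd), rng.gauss(w2_hat, w2_sd))
           for _ in range(2000)]

jac_mean = statistics.fmean(samples)
jac_sd = statistics.stdev(samples)
print(f"Jacobian at x=0.5: {jac_mean:.3f} +/- {jac_sd:.3f}")
```

    A small spread indicates a robust, physically interpretable sensitivity; a large spread flags a dependency structure that the data do not constrain, which is exactly the distinction between a black-box fit and a reliable model that the article draws.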

  1. Impacts of Future Climate Change on California Perennial Crop Yields: Model Projections with Climate and Crop Uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lobell, D; Field, C; Cahill, K

    2006-01-10

    Most research on the agricultural impacts of climate change has focused on the major annual crops, yet perennial cropping systems are less adaptable and thus potentially more susceptible to damage. Improved assessments of yield responses to future climate are needed to prioritize adaptation strategies in the many regions where perennial crops are economically and culturally important. These impact assessments, in turn, must rely on climate and crop models that contain often poorly defined uncertainties. We evaluated the impact of climate change on six major perennial crops in California: wine grapes, almonds, table grapes, oranges, walnuts, and avocados. Outputs from multiple climate models were used to evaluate climate uncertainty, while multiple statistical crop models, derived by resampling historical databases, were used to address crop response uncertainties. We find that, despite these uncertainties, climate change in California is very likely to put downward pressure on yields of almonds, walnuts, avocados, and table grapes by 2050. Without CO2 fertilization or adaptation measures, projected losses range from 0 to >40% depending on the crop and the trajectory of climate change. Climate change uncertainty generally had a larger impact on projections than crop model uncertainty, although the latter was substantial for several crops. Opportunities for expansion into cooler regions are identified, but this adaptation would require substantial investments and may be limited by non-climatic constraints. Given the long time scales for growth and production of orchards and vineyards (~30 years), climate change should be an important factor in selecting perennial varieties and deciding whether and where perennials should be planted.

  2. Evaluation of measurement uncertainty of glucose in clinical chemistry.

    PubMed

    Berçik Inal, B; Koldas, M; Inal, H; Coskun, C; Gümüs, A; Döventas, Y

    2007-04-01

    The International Vocabulary of Basic and General Terms in Metrology (VIM) defines the uncertainty of measurement as a parameter, associated with the result of a measurement, that characterizes the dispersion of the values that could reasonably be attributed to the measurand. Uncertainty of measurement comprises many components. In addition to reporting every parameter, all accredited institutions should state a measurement uncertainty value; this value indicates the reliability of the measurement. The Guide to the Expression of Uncertainty in Measurement (GUM) contains directions for uncertainty evaluation, and Eurachem/CITAC Guide CG4 was published by the Eurachem/CITAC Working Group in 2000. Both offer a mathematical model by which uncertainty can be calculated. There are two types of uncertainty evaluation in measurement: type A evaluates uncertainty through statistical analysis, and type B evaluates uncertainty through other means, for example, a certified reference material. The Eurachem Guide uses four types of distribution functions: (1) a rectangular distribution, when a certificate gives limits without specifying a level of confidence (u(x) = a/√3); (2) a triangular distribution, when values cluster near the same point (u(x) = a/√6); (3) a normal distribution, in which an uncertainty is given as a standard deviation s, a relative standard deviation s/√n, or a coefficient of variation CV% without specifying the distribution (a = certificate value, u = standard uncertainty); and (4) a confidence interval.
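
    These distribution-to-standard-uncertainty conversions are one-liners; the certificate value below is invented for illustration.

```python
import math

def u_rectangular(a):
    """Limits +/- a with no stated confidence level: u = a / sqrt(3)."""
    return a / math.sqrt(3)

def u_triangular(a):
    """Values expected to cluster near the centre: u = a / sqrt(6)."""
    return a / math.sqrt(6)

def u_from_expanded(U, k=2):
    """Expanded uncertainty U with coverage factor k back to standard form."""
    return U / k

# e.g. a hypothetical certificate stating "+/- 0.3 mg/L", distribution unknown:
print(round(u_rectangular(0.3), 4))  # 0.1732
```

    Note that for the same half-width a, the triangular assumption yields a smaller standard uncertainty than the rectangular one, so the rectangular distribution is the conservative default when the certificate gives no further information.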

  3. Assessing Uncertainties in Gridded Emissions: A Case Study for Fossil Fuel Carbon Dioxide (FFCO2) Emission Data

    NASA Technical Reports Server (NTRS)

    Oda, T.; Ott, L.; Lauvaux, T.; Feng, S.; Bun, R.; Roman, M.; Baker, D. F.; Pawson, S.

    2017-01-01

    Fossil fuel carbon dioxide (CO2) emissions (FFCO2) are the largest input to the global carbon cycle on a decadal time scale. Because total emissions are assumed to be reasonably well constrained by fuel statistics, FFCO2 often serves as a reference in order to deduce carbon uptake by poorly understood terrestrial and ocean sinks. Conventional atmospheric CO2 flux inversions solve for spatially explicit regional sources and sinks and estimate land and ocean fluxes by subtracting FFCO2. Thus, errors in FFCO2 can propagate into the final inferred flux estimates. Gridded emissions are often based on disaggregation of emissions estimated at national or regional level. Although national and regional total FFCO2 are well known, gridded emission fields are subject to additional uncertainties due to the emission disaggregation. Assessing such uncertainties is often challenging because of the lack of physical measurements for evaluation. We first review difficulties in assessing uncertainties associated with gridded FFCO2 emission data and present several approaches for evaluation of such uncertainties at multiple scales. Given known limitations, inter-emission data differences are often used as a proxy for the uncertainty. The popular approach allows us to characterize differences in emissions, but does not allow us to fully quantify emission disaggregation biases. Our work aims to vicariously evaluate FFCO2 emission data using atmospheric models and measurements. We show a global simulation experiment where uncertainty estimates are propagated as an atmospheric tracer (uncertainty tracer) alongside CO2 in NASA's GEOS model and discuss implications of FFCO2 uncertainties in the context of flux inversions. We also demonstrate the use of high resolution urban CO2 simulations as a tool for objectively evaluating FFCO2 data over intense emission regions. 
Though this study focuses on FFCO2 emission data, the outcome of this study could also help improve the knowledge of similar gridded emissions data for non-CO2 compounds with similar emission characteristics.

  5. Constructing Surrogate Models of Complex Systems with Enhanced Sparsity: Quantifying the Influence of Conformational Uncertainty in Biomolecular Solvation

    DOE PAGES

    Lei, Huan; Yang, Xiu; Zheng, Bin; ...

    2015-11-05

    Biomolecules exhibit conformational fluctuations near equilibrium states, inducing uncertainty in various biological properties in a dynamic way. We have developed a general method to quantify the uncertainty of target properties induced by conformational fluctuations. Using a generalized polynomial chaos (gPC) expansion, we construct a surrogate model of the target property with respect to varying conformational states. We also propose a method to increase the sparsity of the gPC expansion by defining a set of conformational “active space” random variables. With the increased sparsity, we employ the compressive sensing method to accurately construct the surrogate model. We demonstrate the performance of the surrogate model by evaluating fluctuation-induced uncertainty in the solvent-accessible surface area for the bovine trypsin inhibitor protein system and show that the new approach offers more accurate statistical information than standard Monte Carlo approaches. Furthermore, the constructed surrogate model also enables us to directly evaluate the target property under various conformational states, yielding a more accurate response surface than standard sparse grid collocation methods. In particular, the new method provides higher accuracy in high-dimensional systems, such as biomolecules, where sparse grid performance is limited by the accuracy of the computed quantity of interest. Finally, our new framework is generalizable and can be used to investigate the uncertainty of a wide variety of target properties in biomolecular systems.

  6. The effects of geometric uncertainties on computational modelling of knee biomechanics

    PubMed Central

    Fisher, John; Wilcox, Ruth

    2017-01-01

    The geometry of the articular components of the knee is an important factor in predicting joint mechanics in computational models. There are a number of uncertainties in the definition of the geometry of cartilage and meniscus, and evaluating the effects of these uncertainties is fundamental to understanding the level of reliability of the models. In this study, the sensitivity of knee mechanics to geometric uncertainties was investigated by comparing polynomial-based and image-based knee models and varying the size of the meniscus. The results suggested that the geometric uncertainties in cartilage and meniscus resulting from the resolution of MRI and the accuracy of segmentation caused considerable effects on the predicted knee mechanics. Moreover, even if the mathematical geometric descriptors can be very close to the image-based articular surfaces, the detailed contact pressure distribution produced by the mathematical geometric descriptors was not the same as that of the image-based model. However, the trends predicted by the models based on mathematical geometric descriptors were similar to those of the image-based models. PMID:28879008

  7. A new UK fission yield evaluation UKFY3.7

    NASA Astrophysics Data System (ADS)

    Mills, Robert William

    2017-09-01

    The JEFF neutron induced and spontaneous fission product yield evaluation is currently unchanged from JEFF-3.1.1, also known by its UK designation UKFY3.6A. It is based upon experimental data combined with empirically fitted mass, charge and isomeric state models, which are then adjusted within the experimental and model uncertainties to conform to the physical constraints of the fission process. A new evaluation has been prepared for JEFF, called UKFY3.7, that incorporates new experimental data and replaces the current empirical models (multi-Gaussian fits of the mass distribution and the Wahl Zp model for the charge distribution, combined with parameter extrapolation) with predictions from GEF. The GEF model has the advantage that one set of parameters allows the prediction of many different fissioning nuclides at different excitation energies, unlike previous models where each fissioning nuclide at a specific excitation energy had to be fitted individually to the relevant experimental data. The new UKFY3.7 evaluation, submitted for testing as part of JEFF-3.3, is described alongside initial results of testing. In addition, initial ideas for future developments allowing inclusion of new measurement types and changing from any neutron spectrum type to true neutron energy dependence are discussed. Also, a method is proposed to propagate uncertainties of fission product yields based upon the experimental data that underlies the fission yield evaluation; the covariance terms are determined from the evaluated cumulative and independent yields combined with the experimental uncertainties on the cumulative yield measurements.

  8. Mesoscale modelling methodology based on nudging to increase accuracy in WRA

    NASA Astrophysics Data System (ADS)

    Mylonas Dirdiris, Markos; Barbouchi, Sami; Hermmann, Hugo

    2016-04-01

    Offshore wind energy has recently become a rapidly growing renewable energy resource worldwide, with several offshore wind projects in different planning stages of development. Despite this, a better understanding of the atmospheric interaction within the marine atmospheric boundary layer (MABL) is needed in order to contribute to better energy capture and cost-effectiveness. Observational nudging has recently attracted attention as an innovative method to increase the accuracy of wind flow modelling. This particular study focuses on the observational nudging capability of the Weather Research and Forecasting (WRF) model and on ways in which the uncertainty of wind flow modelling in wind resource assessment (WRA) can be reduced. Finally, an alternative way to calculate the model uncertainty is pinpointed. Approach: The WRF mesoscale model will be nudged with observations from FINO3 at three different heights. The model simulations with and without observational nudging will be verified against FINO1 measurement data at 100 m. In order to evaluate the observational nudging capability of WRF, two ways to derive the model uncertainty will be described: one global uncertainty, and an uncertainty per wind speed bin derived using the recommended practice of the IEA, in order to link the model uncertainty to a wind energy production uncertainty. This study assesses the observational data assimilation capability of the WRF model within the same vertical gridded atmospheric column. The principal aim is to investigate whether having observations up to one height could improve the simulation at a higher vertical level. The study will use objective analysis, implementing a Cressman-scheme interpolation, to interpolate the observations in time and in space (keeping the horizontal component constant) onto the gridded analysis. The WRF model core will then incorporate the interpolated variables into the "first guess" to develop a nudged simulation.
    Consequently, WRF with and without observational nudging will be validated against the higher level of the FINO1 met mast using verification statistical metrics such as root mean square error (RMSE), standard deviation of the mean error (ME Std), mean error (bias), and Pearson correlation coefficient (R). The same process will be followed for different atmospheric stratification regimes in order to evaluate the sensitivity of the method to atmospheric stability. Finally, since wind speed does not have an equally distributed impact on the power yield, the uncertainty will be estimated in two ways, yielding a global uncertainty and one per wind speed bin based on a wind turbine power curve, in order to evaluate WRF for the purposes of wind power generation. Conclusion: This study shows the higher accuracy of the WRF model after nudging with observational data. In a next step these results will be compared with traditional vertical extrapolation methods such as the power and log laws. The larger aim of this work is to nudge observations from a short offshore met mast so that WRF can accurately reconstruct the entire wind profile of the atmosphere up to hub height. This is an important step towards reducing the cost of offshore WRA. Learning objectives: 1. The audience will get a clear view of the added value of observational nudging; 2. An interesting way to calculate WRF uncertainty will be described, linking wind speed uncertainty to energy uncertainty.
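
    The verification metrics listed above (RMSE, bias, ME Std, and Pearson R) can be sketched as follows; this is an illustrative implementation, not the study's code, and the function name is ours.

```python
import numpy as np

def verification_metrics(model, obs):
    """Compare modelled wind speeds against met-mast observations using
    the four metrics named in the abstract (illustrative implementation)."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    err = model - obs
    return {
        "rmse": float(np.sqrt(np.mean(err ** 2))),   # root mean square error
        "bias": float(np.mean(err)),                 # mean error average
        "me_std": float(np.std(err, ddof=1)),        # std. dev. of mean error
        "r": float(np.corrcoef(model, obs)[0, 1]),   # Pearson correlation
    }
```

    Binning the errors by observed wind speed before applying these metrics would give the per-bin uncertainty the abstract links to energy production.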

  9. Effects of uncertain topographic input data on two-dimensional flow modeling in a gravel-bed river

    USGS Publications Warehouse

    Legleiter, C.J.; Kyriakidis, P.C.; McDonald, R.R.; Nelson, J.M.

    2011-01-01

    Many applications in river research and management rely upon two-dimensional (2D) numerical models to characterize flow fields, assess habitat conditions, and evaluate channel stability. Predictions from such models are potentially highly uncertain due to the uncertainty associated with the topographic data provided as input. This study used a spatial stochastic simulation strategy to examine the effects of topographic uncertainty on flow modeling. Many, equally likely bed elevation realizations for a simple meander bend were generated and propagated through a typical 2D model to produce distributions of water-surface elevation, depth, velocity, and boundary shear stress at each node of the model's computational grid. Ensemble summary statistics were used to characterize the uncertainty associated with these predictions and to examine the spatial structure of this uncertainty in relation to channel morphology. Simulations conditioned to different data configurations indicated that model predictions became increasingly uncertain as the spacing between surveyed cross sections increased. Model sensitivity to topographic uncertainty was greater for base flow conditions than for a higher, subbankfull flow (75% of bankfull discharge). The degree of sensitivity also varied spatially throughout the bend, with the greatest uncertainty occurring over the point bar where the flow field was influenced by topographic steering effects. Uncertain topography can therefore introduce significant uncertainty to analyses of habitat suitability and bed mobility based on flow model output. In the presence of such uncertainty, the results of these studies are most appropriately represented in probabilistic terms using distributions of model predictions derived from a series of topographic realizations. Copyright 2011 by the American Geophysical Union.
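
    The ensemble summary statistics described above amount to node-wise moments computed across the stack of realizations. A minimal sketch, with synthetic array sizes and values standing in for the study's actual realizations:

```python
import numpy as np

# Synthetic stand-in for flow-model output: one row per equally likely
# bed-elevation realization, one column per computational-grid node.
rng = np.random.default_rng(0)
n_realizations, n_nodes = 100, 50
depth = 1.0 + 0.1 * rng.standard_normal((n_realizations, n_nodes))

# Node-wise ensemble summaries: the mean characterizes the expected
# prediction, the standard deviation its topographically induced uncertainty.
ensemble_mean = depth.mean(axis=0)
ensemble_std = depth.std(axis=0)
```

    Mapping `ensemble_std` back onto the channel geometry is what reveals where (e.g. over the point bar) predictions are most sensitive to topographic uncertainty.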

  10. The importance of parameterization when simulating the hydrologic response of vegetative land-cover change

    NASA Astrophysics Data System (ADS)

    White, Jeremy; Stengel, Victoria; Rendon, Samuel; Banta, John

    2017-08-01

    Computer models of hydrologic systems are frequently used to investigate the hydrologic response of land-cover change. If the modeling results are used to inform resource-management decisions, then providing robust estimates of uncertainty in the simulated response is an important consideration. Here we examine the importance of parameterization, a necessarily subjective process, on uncertainty estimates of the simulated hydrologic response of land-cover change. Specifically, we applied the soil water assessment tool (SWAT) model to a 1.4 km2 watershed in southern Texas to investigate the simulated hydrologic response of brush management (the mechanical removal of woody plants), a discrete land-cover change. The watershed was instrumented before and after brush-management activities were undertaken, and estimates of precipitation, streamflow, and evapotranspiration (ET) are available; these data were used to condition and verify the model. The role of parameterization in brush-management simulation was evaluated by constructing two models, one with 12 adjustable parameters (reduced parameterization) and one with 1305 adjustable parameters (full parameterization). Both models were subjected to global sensitivity analysis as well as Monte Carlo and generalized likelihood uncertainty estimation (GLUE) conditioning to identify important model inputs and to estimate uncertainty in several quantities of interest related to brush management. Many realizations from both parameterizations were identified as behavioral in that they reproduce daily mean streamflow acceptably well according to Nash-Sutcliffe model efficiency coefficient, percent bias, and coefficient of determination. 
However, the total volumetric ET difference resulting from simulated brush management remains highly uncertain after conditioning to daily mean streamflow, indicating that streamflow data alone are not sufficient to inform the model inputs that influence the simulated outcomes of brush management the most. Additionally, the reduced-parameterization model grossly underestimates uncertainty in the total volumetric ET difference compared to the full-parameterization model; total volumetric ET difference is a primary metric for evaluating the outcomes of brush management. The failure of the reduced-parameterization model to provide robust uncertainty estimates demonstrates the importance of parameterization when attempting to quantify uncertainty in land-cover change simulations.
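
    A GLUE-style behavioral screen of the kind described above can be sketched with the Nash-Sutcliffe efficiency; the 0.5 threshold below is a common illustrative choice, not the study's criterion.

```python
import numpy as np

def nash_sutcliffe(sim, obs):
    """Nash-Sutcliffe model efficiency: 1 is a perfect fit, values below 0
    mean the observed mean is a better predictor than the simulation."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def is_behavioral(sim, obs, nse_min=0.5):
    """Keep a realization only if it reproduces daily mean streamflow
    acceptably well (illustrative single-metric screen)."""
    return nash_sutcliffe(sim, obs) >= nse_min
```

    The study applies this idea with several metrics jointly (NSE, percent bias, and coefficient of determination) before examining the surviving realizations' ET predictions.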

  11. The importance of parameterization when simulating the hydrologic response of vegetative land-cover change

    USGS Publications Warehouse

    White, Jeremy; Stengel, Victoria G.; Rendon, Samuel H.; Banta, John

    2017-01-01

    Computer models of hydrologic systems are frequently used to investigate the hydrologic response of land-cover change. If the modeling results are used to inform resource-management decisions, then providing robust estimates of uncertainty in the simulated response is an important consideration. Here we examine the importance of parameterization, a necessarily subjective process, on uncertainty estimates of the simulated hydrologic response of land-cover change. Specifically, we applied the soil water assessment tool (SWAT) model to a 1.4 km2 watershed in southern Texas to investigate the simulated hydrologic response of brush management (the mechanical removal of woody plants), a discrete land-cover change. The watershed was instrumented before and after brush-management activities were undertaken, and estimates of precipitation, streamflow, and evapotranspiration (ET) are available; these data were used to condition and verify the model. The role of parameterization in brush-management simulation was evaluated by constructing two models, one with 12 adjustable parameters (reduced parameterization) and one with 1305 adjustable parameters (full parameterization). Both models were subjected to global sensitivity analysis as well as Monte Carlo and generalized likelihood uncertainty estimation (GLUE) conditioning to identify important model inputs and to estimate uncertainty in several quantities of interest related to brush management. Many realizations from both parameterizations were identified as behavioral in that they reproduce daily mean streamflow acceptably well according to Nash–Sutcliffe model efficiency coefficient, percent bias, and coefficient of determination. 
However, the total volumetric ET difference resulting from simulated brush management remains highly uncertain after conditioning to daily mean streamflow, indicating that streamflow data alone are not sufficient to inform the model inputs that influence the simulated outcomes of brush management the most. Additionally, the reduced-parameterization model grossly underestimates uncertainty in the total volumetric ET difference compared to the full-parameterization model; total volumetric ET difference is a primary metric for evaluating the outcomes of brush management. The failure of the reduced-parameterization model to provide robust uncertainty estimates demonstrates the importance of parameterization when attempting to quantify uncertainty in land-cover change simulations.

  12. Space Radiation Cancer Risk Projections and Uncertainties - 2010

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Kim, Myung-Hee Y.; Chappell, Lori J.

    2011-01-01

    Uncertainties in estimating health risks from galactic cosmic rays greatly limit space mission lengths and potential risk mitigation evaluations. NASA limits astronaut exposures to a 3% risk of exposure-induced death and protects against uncertainties using an assessment of 95% confidence intervals in the projection model. Revisions to this model for lifetime cancer risks from space radiation and new estimates of model uncertainties are described here. We review models of space environments and transport code predictions of organ exposures, and characterize uncertainties in these descriptions. We summarize recent analysis of low linear energy transfer radio-epidemiology data, including revision to Japanese A-bomb survivor dosimetry, longer follow-up of exposed cohorts, and reassessments of dose and dose-rate reduction effectiveness factors. We compare these projections and uncertainties with earlier estimates. Current understanding of radiation quality effects and recent data on factors of relative biological effectiveness and particle track structure are reviewed. Recent radiobiology experiment results provide new information on solid cancer and leukemia risks from heavy ions. We also consider deviations from the paradigm of linearity at low doses of heavy ions motivated by non-targeted effects models. New findings and knowledge are used to revise the NASA risk projection model for space radiation cancer risks.

  13. On the uncertainty of phenological responses to climate change and its implication for terrestrial biosphere models

    NASA Astrophysics Data System (ADS)

    Migliavacca, M.; Sonnentag, O.; Keenan, T. F.; Cescatti, A.; O'Keefe, J.; Richardson, A. D.

    2012-01-01

    Phenology, the timing of recurring life cycle events, controls numerous land surface feedbacks to the climate system through the regulation of exchanges of carbon, water and energy between the biosphere and atmosphere. Land surface models, however, are known to have systematic errors in the simulation of spring phenology, which potentially could propagate to uncertainty in modeled responses to future climate change. Here, we analyzed the Harvard Forest phenology record to investigate and characterize the sources of uncertainty in phenological forecasts and the subsequent impacts on model forecasts of carbon and water cycling in the future. Using a model-data fusion approach, we combined information from 20 yr of phenological observations of 11 North American woody species with 12 phenological models of different complexity to predict leaf bud-burst. The evaluation of different phenological models indicated support for spring warming models with photoperiod limitations and, to a lesser extent, for chilling models based on the alternating model structure. We assessed three different sources of uncertainty in phenological forecasts: parameter uncertainty, model uncertainty, and driver uncertainty. The latter was characterized by running the models to 2099 under two IPCC climate scenarios (A1fi vs. B1, i.e. high vs. low CO2 emissions scenarios). Parameter uncertainty was the smallest (average 95% CI: 2.4 day century-1 for scenario B1 and 4.5 day century-1 for A1fi), whereas driver uncertainty was the largest (up to 8.4 day century-1 in the simulated trends). The uncertainty related to model structure was also large, and the predicted bud-burst trends as well as the shape of the smoothed projections varied somewhat among models (±7.7 day century-1 for A1fi, ±3.6 day century-1 for B1). The forecast sensitivity of bud-burst to temperature (i.e. 
days bud-burst advanced per degree of warming) varied between 2.2 day °C-1 and 5.2 day °C-1 depending on model structure. We quantified the impact of uncertainties in bud-burst forecasts on simulated carbon and water fluxes using a process-based terrestrial biosphere model. Uncertainty in phenology model structure led to uncertainty in the description of the seasonality of processes, which accumulated to uncertainty in annual model estimates of gross primary productivity (GPP) and evapotranspiration (ET) of 9.6% and 2.9% respectively. A sensitivity analysis shows that a variation of ±10 days in bud-burst dates led to a variation of ±5.0% for annual GPP and about ±2.0% for ET. For phenology models, differences among future climate scenarios represent the largest source of uncertainty, followed by uncertainties related to model structure, and finally, uncertainties related to model parameterization. The uncertainties we have quantified will affect the description of the seasonality of processes and in particular the simulation of carbon uptake by forest ecosystems, with a larger impact of uncertainties related to phenology model structure, followed by uncertainties related to phenological model parameterization.
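
    A minimal spring-warming (thermal-time) phenology model of the kind evaluated above can be sketched as follows; the base temperature and critical forcing values are illustrative, not fitted parameters from the study.

```python
def budburst_doy(daily_tmean, t_base=5.0, f_crit=150.0, t0=1):
    """Predict the bud-burst day of year: accumulate growing degree days
    above t_base from day t0 until the critical forcing f_crit is reached.
    Parameter values here are illustrative defaults."""
    gdd = 0.0
    for doy, t in enumerate(daily_tmean, start=1):
        if doy < t0:
            continue
        gdd += max(t - t_base, 0.0)
        if gdd >= f_crit:
            return doy
    return None  # forcing requirement never met this year
```

    More complex variants add photoperiod limitation or chilling requirements, which is exactly the model-structure axis along which the abstract's forecasts diverge.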

  14. A multi-model assessment of terrestrial biosphere model data needs

    NASA Astrophysics Data System (ADS)

    Gardella, A.; Cowdery, E.; De Kauwe, M. G.; Desai, A. R.; Duveneck, M.; Fer, I.; Fisher, R.; Knox, R. G.; Kooper, R.; LeBauer, D.; McCabe, T.; Minunno, F.; Raiho, A.; Serbin, S.; Shiklomanov, A. N.; Thomas, A.; Walker, A.; Dietze, M.

    2017-12-01

    Terrestrial biosphere models provide us with the means to simulate the impacts of climate change and their uncertainties. Going beyond direct observation and experimentation, models synthesize our current understanding of ecosystem processes and can give us insight into the data needed to constrain model parameters. In previous work, we leveraged the Predictive Ecosystem Analyzer (PEcAn) to assess the contribution of different parameters to the uncertainty of the Ecosystem Demography model v2 (ED) model outputs across various North American biomes (Dietze et al., JGR-G, 2014). While this analysis identified key research priorities, the extent to which these priorities were model- and/or biome-specific was unclear. Furthermore, because the analysis only studied one model, we were unable to comment on the effect of variability in model structure on overall predictive uncertainty. Here, we expand this analysis to all biomes globally and a wide sample of models that vary in complexity: BioCro, CABLE, CLM, DALEC, ED2, FATES, G'DAY, JULES, LANDIS, LINKAGES, LPJ-GUESS, MAESPA, PRELES, SDGVM, SIPNET, and TEM. Prior to performing uncertainty analyses, model parameter uncertainties were assessed by assimilating all available trait data from the combination of the BETYdb and TRY trait databases, using an updated multivariate version of PEcAn's Hierarchical Bayesian meta-analysis. Next, sensitivity analyses were performed for all models across a range of sites globally to assess sensitivities for a range of different outputs (GPP, ET, SH, Ra, NPP, Rh, NEE, LAI) at multiple time scales from the sub-annual to the decadal. Finally, parameter uncertainties and model sensitivities were combined to evaluate the fractional contribution of each parameter to the predictive uncertainty for a specific variable at a specific site and timescale. 
Facilitated by PEcAn's automated workflows, this analysis represents the broadest assessment of the sensitivities and uncertainties in terrestrial models to date, and provides a comprehensive roadmap for constraining model uncertainties through model development and data collection.

  15. Uncertainty Quantification for Robust Control of Wind Turbines using Sliding Mode Observer

    NASA Astrophysics Data System (ADS)

    Schulte, Horst

    2016-09-01

    A new quantification method of uncertain models for robust wind turbine control using sliding-mode techniques is presented with the objective to improve active load mitigation. This approach is based on the so-called equivalent output injection signal, which corresponds to the average behavior of the discontinuous switching term, establishing and maintaining a motion on a so-called sliding surface. The injection signal is directly evaluated to obtain estimates of the uncertainty bounds of external disturbances and parameter uncertainties. The applicability of the proposed method is illustrated by the quantification of a four degree-of-freedom model of the NREL 5MW reference turbine containing uncertainties.

  16. Value of Information Analysis Applied to the Economic Evaluation of Interventions Aimed at Reducing Juvenile Delinquency: An Illustration.

    PubMed

    Eeren, Hester V; Schawo, Saskia J; Scholte, Ron H J; Busschbach, Jan J V; Hakkaart, Leona

    2015-01-01

    To investigate whether a value of information analysis, commonly applied in health care evaluations, is feasible and meaningful in the field of crime prevention. Interventions aimed at reducing juvenile delinquency are increasingly being evaluated according to their cost-effectiveness. Results of cost-effectiveness models are subject to uncertainty in their cost and effect estimates. Further research can reduce that parameter uncertainty. The value of such further research can be estimated using a value of information analysis, as illustrated in the current study. We built upon an earlier published cost-effectiveness model that demonstrated the comparison of two interventions aimed at reducing juvenile delinquency. Outcomes were presented as costs per criminal activity free year. At a societal willingness-to-pay of €71,700 per criminal activity free year, further research to eliminate parameter uncertainty was valued at €176 million. Therefore, in this illustrative analysis, the value of information analysis determined that society should be willing to spend a maximum of €176 million in reducing decision uncertainty in the cost-effectiveness of the two interventions. Moreover, the results suggest that reducing uncertainty in some specific model parameters might be more valuable than in others. Using a value of information framework to assess the value of conducting further research in the field of crime prevention proved to be feasible. The results were meaningful and can be interpreted according to health care evaluation studies. This analysis can be helpful in justifying additional research funds to further inform the reimbursement decision in regard to interventions for juvenile delinquents.
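
    The core expected value of perfect information (EVPI) calculation behind a value of information analysis can be sketched with Monte Carlo draws; the net-benefit distributions below are invented for illustration and are not the study's model.

```python
import numpy as np

# Illustrative per-decision EVPI for choosing between two interventions.
rng = np.random.default_rng(1)
n = 100_000
nb_a = rng.normal(100.0, 40.0, n)   # uncertain incremental net benefit, A
nb_b = rng.normal(110.0, 60.0, n)   # uncertain incremental net benefit, B

# Value of deciding now, under current uncertainty: pick the option with
# the best expected net benefit.
ev_current = max(nb_a.mean(), nb_b.mean())
# Value with perfect information: pick the best option in every draw,
# then average over the uncertainty.
ev_perfect = np.maximum(nb_a, nb_b).mean()
evpi = ev_perfect - ev_current  # upper bound on the worth of more research
```

    Scaling this per-decision EVPI by the affected population and decision horizon is what yields a society-level ceiling like the EUR 176 million reported above.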

  17. Harnessing the theoretical foundations of the exponential and beta-Poisson dose-response models to quantify parameter uncertainty using Markov Chain Monte Carlo.

    PubMed

    Schmidt, Philip J; Pintar, Katarina D M; Fazil, Aamir M; Topp, Edward

    2013-09-01

    Dose-response models are the essential link between exposure assessment and computed risk values in quantitative microbial risk assessment, yet the uncertainty that is inherent to computed risks because the dose-response model parameters are estimated using limited epidemiological data is rarely quantified. Second-order risk characterization approaches incorporating uncertainty in dose-response model parameters can provide more complete information to decisionmakers by separating variability and uncertainty to quantify the uncertainty in computed risks. Therefore, the objective of this work is to develop procedures to sample from posterior distributions describing uncertainty in the parameters of exponential and beta-Poisson dose-response models using Bayes's theorem and Markov Chain Monte Carlo (in OpenBUGS). The theoretical origins of the beta-Poisson dose-response model are used to identify a decomposed version of the model that enables Bayesian analysis without the need to evaluate Kummer confluent hypergeometric functions. Herein, it is also established that the beta distribution in the beta-Poisson dose-response model cannot address variation among individual pathogens, criteria to validate use of the conventional approximation to the beta-Poisson model are proposed, and simple algorithms to evaluate actual beta-Poisson probabilities of infection are investigated. The developed MCMC procedures are applied to analysis of a case study data set, and it is demonstrated that an important region of the posterior distribution of the beta-Poisson dose-response model parameters is attributable to the absence of low-dose data. This region includes beta-Poisson models for which the conventional approximation is especially invalid and in which many beta distributions have an extreme shape with questionable plausibility. © Her Majesty the Queen in Right of Canada 2013. Reproduced with the permission of the Minister of the Public Health Agency of Canada.
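
    For reference, the exponential model and the conventional approximation to the beta-Poisson model discussed above take the following closed forms (the parameter values in the test are illustrative):

```python
import math

def p_infection_exponential(dose, r):
    """Exact exponential dose-response model: every pathogen has the same
    per-organism infection probability r."""
    return 1.0 - math.exp(-r * dose)

def p_infection_beta_poisson_approx(dose, alpha, beta):
    """Conventional approximation to the beta-Poisson model, commonly
    stated to hold when beta >> 1 and beta >> alpha."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)
```

    The paper's point is that this approximation can be badly invalid in parts of the posterior parameter space, which is why it proposes validation criteria and algorithms for the actual beta-Poisson probabilities.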

  18. Performance metrics and variance partitioning reveal sources of uncertainty in species distribution models

    USGS Publications Warehouse

    Watling, James I.; Brandt, Laura A.; Bucklin, David N.; Fujisaki, Ikuko; Mazzotti, Frank J.; Romañach, Stephanie; Speroterra, Carolina

    2015-01-01

    Species distribution models (SDMs) are widely used in basic and applied ecology, making it important to understand sources and magnitudes of uncertainty in SDM performance and predictions. We analyzed SDM performance and partitioned variance among prediction maps for 15 rare vertebrate species in the southeastern USA using all possible combinations of seven potential sources of uncertainty in SDMs: algorithms, climate datasets, model domain, species presences, variable collinearity, CO2 emissions scenarios, and general circulation models. The choice of modeling algorithm was the greatest source of uncertainty in SDM performance and prediction maps, with some additional variation in performance associated with the comprehensiveness of the species presences used for modeling. Other sources of uncertainty that have received attention in the SDM literature such as variable collinearity and model domain contributed little to differences in SDM performance or predictions in this study. Predictions from different algorithms tended to be more variable at northern range margins for species with more northern distributions, which may complicate conservation planning at the leading edge of species' geographic ranges. The clear message emerging from this work is that researchers should use multiple algorithms for modeling rather than relying on predictions from a single algorithm, invest resources in compiling a comprehensive set of species presences, and explicitly evaluate uncertainty in SDM predictions at leading range margins.

  19. Uncertainty Quantification of Evapotranspiration and Infiltration from Modeling and Historic Time Series at the Savannah River F-Area

    NASA Astrophysics Data System (ADS)

    Faybishenko, B.; Flach, G. P.

    2012-12-01

    The objectives of this presentation are: (a) to illustrate the application of Monte Carlo and fuzzy-probabilistic approaches for uncertainty quantification (UQ) in predictions of potential evapotranspiration (PET), actual evapotranspiration (ET), and infiltration (I), using uncertain hydrological or meteorological time series data, and (b) to compare the results of these calculations with those from field measurements at the U.S. Department of Energy Savannah River Site (SRS), near Aiken, South Carolina, USA. The UQ calculations include the evaluation of aleatory (parameter uncertainty) and epistemic (model) uncertainties. The effect of aleatory uncertainty is expressed by assigning the probability distributions of input parameters, using historical monthly averaged data from the meteorological station at the SRS. The combined effect of aleatory and epistemic uncertainties on the UQ of PET, ET, and I is then expressed by aggregating the results of calculations from multiple models using a p-box and fuzzy numbers. The uncertainty in PET is calculated using the Baier-Robertson, Blaney-Criddle, Caprio, Hargreaves-Samani, Hamon, Jensen-Haise, Linacre, Makkink, Priestley-Taylor, Penman, Penman-Monteith, Thornthwaite, and Turc models. Then, ET is calculated from the modified Budyko model, followed by calculations of I from the water balance equation. We show that probabilistic and fuzzy-probabilistic calculations using multiple models generate PET, ET, and I distributions that are well within the range of field measurements. We also show that a selection of a subset of models can be used to constrain the uncertainty quantification of PET, ET, and I.
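
    As an example of propagating aleatory input uncertainty through one of the PET models named above, here is a Monte Carlo sketch using the Hargreaves-Samani formula; the input distributions are illustrative, not the SRS station statistics.

```python
import numpy as np

# Monte Carlo propagation of uncertain temperature inputs through the
# Hargreaves-Samani PET model (illustrative input distributions).
rng = np.random.default_rng(2)
n = 10_000
t_mean = rng.normal(18.0, 1.0, n)    # monthly mean air temperature, deg C
t_range = rng.normal(10.0, 1.5, n)   # mean diurnal temperature range, deg C
ra = 12.0                            # extraterrestrial radiation, mm/day equiv.

# PET (mm/day) = 0.0023 * Ra * (Tmean + 17.8) * sqrt(Tmax - Tmin)
pet = 0.0023 * ra * (t_mean + 17.8) * np.sqrt(np.clip(t_range, 0.0, None))
pet_mean = pet.mean()
pet_lo, pet_hi = np.percentile(pet, [2.5, 97.5])
```

    Repeating this for each of the thirteen PET models and aggregating the resulting distributions (e.g. into a p-box) is the multi-model step the abstract describes for combining aleatory and epistemic uncertainty.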

  20. Effect of Uncertainty on Deterministic Runway Scheduling

    NASA Technical Reports Server (NTRS)

    Gupta, Gautam; Malik, Waqar; Jung, Yoon C.

    2012-01-01

    Active runway scheduling involves scheduling departures for takeoff and arrivals for runway crossing subject to numerous constraints. This paper evaluates the effect of uncertainty on a deterministic runway scheduler. The evaluation is done against a first-come-first-served (FCFS) scheme. In particular, the sequence from the deterministic scheduler is frozen and the times adjusted to satisfy all separation criteria; this approach is tested against FCFS. The comparison is done for both system performance (throughput and system delay) and predictability, and varying levels of congestion are considered. Uncertainty is modeled in two ways: as equal uncertainty in runway availability for all aircraft, and as increasing uncertainty for later aircraft. Results indicate that the deterministic approach consistently performs better than FCFS in both system performance and predictability.

  1. Impacts of climate change and internal climate variability on french rivers streamflows

    NASA Astrophysics Data System (ADS)

    Dayon, Gildas; Boé, Julien; Martin, Eric

    2016-04-01

    The assessment of the impacts of climate change often requires setting up long modeling chains, from the models used to estimate future greenhouse gas concentrations to the impact model. Throughout the modeling chain, sources of uncertainty accumulate, making the results difficult to exploit for the development of adaptation strategies. It is proposed here to assess the impacts of climate change on the hydrological cycle over France and the associated uncertainties. The contributions of uncertainties from the greenhouse gas emission scenario, climate models, and internal variability are addressed in this work. To obtain a large ensemble of climate simulations, the study is based on Global Climate Model (GCM) simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5), including several simulations from the same GCM to properly assess uncertainties from internal climate variability. Simulations for the four Representative Concentration Pathways (RCPs) are downscaled with a statistical method developed in a previous study (Dayon et al. 2015). The hydrological system Isba-Modcou is then driven by the downscaling results on an 8 km grid over France. Isba is a land surface model that calculates the energy and water balance, and Modcou is a hydrogeological model that routes the surface runoff given by Isba. Based on that framework, uncertainties from the greenhouse gas emission scenario, climate models, and internal climate variability are evaluated. Their relative importance is described for the next decades and the end of this century. In a last part, uncertainties due to internal climate variability in streamflows simulated with downscaled GCMs and Isba-Modcou are evaluated against observations and hydrological reconstructions over the whole 20th century. Hydrological reconstructions are based on the downscaling of recent atmospheric reanalyses of the 20th century and observations of temperature and precipitation. 
    We show that the multi-decadal variability of streamflows observed in the 20th century is generally weaker in hydrological simulations driven by historical simulations from climate models. References: Dayon et al. (2015), Transferability in the future climate of a statistical downscaling method for precipitation in France, J. Geophys. Res. Atmos., 120, 1023-1043, doi:10.1002/2014JD022236

  2. Climate data induced uncertainty in model-based estimations of terrestrial primary productivity

    NASA Astrophysics Data System (ADS)

    Wu, Zhendong; Ahlström, Anders; Smith, Benjamin; Ardö, Jonas; Eklundh, Lars; Fensholt, Rasmus; Lehsten, Veiko

    2017-06-01

    Model-based estimations of historical fluxes and pools of the terrestrial biosphere differ substantially. These differences arise not only from differences between models but also from differences in the environmental and climatic data used as input to the models. Here we investigate the role of uncertainties in historical climate data by performing simulations of terrestrial gross primary productivity (GPP) using a process-based dynamic vegetation model (LPJ-GUESS) forced by six different climate datasets. We find that the climate induced uncertainty, defined as the range among historical simulations in GPP when forcing the model with the different climate datasets, can be as high as 11 Pg C yr-1 globally (9% of mean GPP). We also assessed a hypothetical maximum climate data induced uncertainty by combining climate variables from different datasets, which resulted in significantly larger uncertainties of 41 Pg C yr-1 globally or 32% of mean GPP. The uncertainty is partitioned into components associated with the three main climatic drivers: temperature, precipitation, and shortwave radiation. Additionally, we illustrate how the uncertainty due to a given climate driver depends both on the magnitude of the forcing data uncertainty (climate data range) and the apparent sensitivity of the modeled GPP to the driver (apparent model sensitivity). We find that LPJ-GUESS overestimates GPP compared to empirically based GPP data products in all land cover classes except for tropical forests. Tropical forests emerge as a disproportionate source of uncertainty in GPP estimation both in the simulations and empirical data products. The tropical forest uncertainty is most strongly associated with shortwave radiation and precipitation forcing, of which the climate data range contributes more to the overall uncertainty than the apparent model sensitivity to forcing. 
Globally, precipitation dominates the climate induced uncertainty over nearly half of the vegetated land area, which is mainly due to climate data range and less so due to the apparent model sensitivity. Overall, climate data ranges are found to contribute more to the climate induced uncertainty than apparent model sensitivity to forcing. Our study highlights the need to better constrain tropical climate, and demonstrates that uncertainty caused by climatic forcing data must be considered when comparing and evaluating carbon cycle model results and empirical datasets.

  3. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sig Drellack, Lance Prothro

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty, including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty, in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial extent of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. With multiple alternative flow models advanced, the sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations.
The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainties in source releases and transport parameters, including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility, are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.
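
    The Monte Carlo sensitivity-ranking workflow described above can be sketched in miniature: sample the uncertain transport parameters, evaluate a toy response, and rank parameters by a regression-type measure. The toy plume-volume model, the parameter ranges, and the use of simple correlation coefficients (a stand-in for the stepwise regression of the abstract) are all illustrative assumptions.

```python
# Hedged sketch of a Monte Carlo sensitivity ranking: sample uncertain
# transport parameters, run a toy plume-volume response, and rank the
# parameters by standardized (correlation) coefficients.
import random

random.seed(0)
N = 5000

# Hypothetical uncertain inputs (uniform ranges, illustrative only)
porosity  = [random.uniform(0.01, 0.30) for _ in range(N)]
diffusion = [random.uniform(1e-11, 1e-9) for _ in range(N)]
sorption  = [random.uniform(0.0, 5.0) for _ in range(N)]

# Toy response: plume volume shrinks with porosity and sorption
plume = [100.0 / (p * (1.0 + k)) + 1e10 * d
         for p, d, k in zip(porosity, diffusion, sorption)]

def standardized_coeff(x, y):
    """Pearson correlation as a simple sensitivity measure."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)

ranking = sorted(
    [("porosity", abs(standardized_coeff(porosity, plume))),
     ("diffusion", abs(standardized_coeff(diffusion, plume))),
     ("sorption", abs(standardized_coeff(sorption, plume)))],
    key=lambda t: -t[1])
print(ranking)  # porosity dominates in this toy setup
```

    The field-scale analysis additionally uses contingency tables and classification trees, which capture nonlinear and threshold effects that a single correlation coefficient misses.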

  4. Trapped Radiation Model Uncertainties: Model-Data and Model-Model Comparisons

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    2000-01-01

    The standard AP8 and AE8 models for predicting trapped proton and electron environments have been compared with several sets of flight data to evaluate model uncertainties. Model comparisons are made with flux and dose measurements made on various U.S. low-Earth orbit satellites (APEX, CRRES, DMSP, LDEF, NOAA) and Space Shuttle flights, on Russian satellites (Photon-8, Cosmos-1887, Cosmos-2044), and on the Russian Mir Space Station. This report gives the details of the model-data comparisons; summary results in terms of empirical model uncertainty factors that can be applied for spacecraft design applications are given in a companion report. The results of model-model comparisons are also presented from standard AP8 and AE8 model predictions compared with the European Space Agency versions of AP8 and AE8 and with Russian trapped radiation models.
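
    One simple way to condense such model-data comparisons into a single design uncertainty factor is a geometric-mean flux ratio. This is a hedged illustration only; the ratios below are hypothetical and the report's actual factors are derived from its own flight datasets.

```python
# Hedged sketch of deriving an empirical model uncertainty factor from
# measured/predicted flux ratios (illustrative numbers, not the report's data).
import math

# Hypothetical measured-to-predicted proton-flux ratios from several flights
ratios = [1.8, 2.3, 1.5, 2.9, 2.1, 1.7]

# Geometric mean ratio as a single design uncertainty factor
log_mean = sum(math.log(r) for r in ratios) / len(ratios)
factor = math.exp(log_mean)
print(round(factor, 2))
```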

  6. Position uncertainty distribution for articulated arm coordinate measuring machine based on simplified definite integration

    NASA Astrophysics Data System (ADS)

    You, Xu; Zhi-jian, Zong; Qun, Gao

    2018-07-01

    This paper describes a methodology for determining the position uncertainty distribution of an articulated arm coordinate measuring machine (AACMM). First, a model of the structural parameter uncertainties was established by a statistical method. Second, the position uncertainty space volume of the AACMM in a given configuration was expressed using a simplified definite integration method based on the structural parameter uncertainties; it was then used to evaluate the position accuracy of the AACMM in that configuration. Third, the configurations of a given working point were calculated by an inverse solution, and the position uncertainty distribution of that working point was determined; the working-point uncertainty can be evaluated by the weighting method. Lastly, the position uncertainty distribution in the workspace of the AACMM was described by a map. A single-point contrast test of a 6-joint AACMM was carried out to verify the effectiveness of the proposed method. The results show that the method can describe the position uncertainty of the AACMM and can be used to guide both the calibration of the AACMM and the choice of its accuracy area.

  7. Aerosol-type retrieval and uncertainty quantification from OMI data

    NASA Astrophysics Data System (ADS)

    Kauppi, Anu; Kolmonen, Pekka; Laine, Marko; Tamminen, Johanna

    2017-11-01

    We discuss uncertainty quantification for aerosol-type selection in satellite-based atmospheric aerosol retrieval. The retrieval procedure uses precalculated aerosol microphysical models stored in look-up tables (LUTs) and top-of-atmosphere (TOA) spectral reflectance measurements to retrieve the aerosol characteristics. The forward model approximations cause systematic differences between the modelled and observed reflectance. Acknowledging this model discrepancy as a source of uncertainty allows us to produce more realistic uncertainty estimates and assists the selection of the most appropriate LUTs for each individual retrieval. This paper focuses on the aerosol microphysical model selection and characterisation of uncertainty in the retrieved aerosol type and aerosol optical depth (AOD). The concept of model evidence is used as a tool for model comparison. The method is based on a Bayesian inference approach, in which all uncertainties are described as a posterior probability distribution. When there is no single best-matching aerosol microphysical model, we use a statistical technique based on Bayesian model averaging to combine the AOD posterior probability densities of the best-fitting models to obtain an averaged AOD estimate. We also determine the shared evidence of the best-matching models of a certain main aerosol type in order to quantify how plausible it is that this type represents the underlying atmospheric aerosol conditions. The developed method is applied to Ozone Monitoring Instrument (OMI) measurements using a multiwavelength approach for retrieving the aerosol type and an AOD estimate with uncertainty quantification for cloud-free over-land pixels. Several larger pixel-set areas were studied in order to investigate the robustness of the developed method. We evaluated the retrieved AOD by comparison with ground-based measurements at example sites.
We found that the uncertainty of AOD expressed by posterior probability distribution reflects the difficulty in model selection. The posterior probability distribution can provide a comprehensive characterisation of the uncertainty in this kind of problem for aerosol-type selection. As a result, the proposed method can account for the model error and also include the model selection uncertainty in the total uncertainty budget.
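
    The Bayesian model averaging step described above can be sketched as an evidence-weighted mixture of per-model AOD posteriors. All evidences, means, and standard deviations below are illustrative placeholders, not OMI retrieval output.

```python
# Hedged sketch of Bayesian model averaging: AOD posteriors from the
# best-fitting aerosol microphysical models are combined, weighted by
# each model's evidence. All numbers are illustrative.

# (evidence, posterior mean AOD, posterior std) per candidate LUT model
models = [(0.50, 0.32, 0.04),
          (0.30, 0.38, 0.05),
          (0.20, 0.45, 0.06)]

total_evidence = sum(e for e, _, _ in models)
weights = [e / total_evidence for e, _, _ in models]

# Mixture mean and variance of the averaged AOD estimate
bma_mean = sum(w * m for w, (_, m, _) in zip(weights, models))
bma_var = sum(w * (s ** 2 + (m - bma_mean) ** 2)
              for w, (_, m, s) in zip(weights, models))
print(bma_mean, bma_var ** 0.5)
```

    The mixture variance is larger than any single model's variance here because the between-model disagreement term (m - bma_mean)^2 enters the budget, which is exactly how model selection uncertainty shows up in the total uncertainty.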

  8. Estimating Model Prediction Error: Should You Treat Predictions as Fixed or Random?

    NASA Technical Reports Server (NTRS)

    Wallach, Daniel; Thorburn, Peter; Asseng, Senthold; Challinor, Andrew J.; Ewert, Frank; Jones, James W.; Rotter, Reimund; Ruane, Alexander

    2016-01-01

    Crop models are important tools for impact assessment of climate change, as well as for exploring management options under the current climate. It is essential to evaluate the uncertainty associated with predictions of these models. We compare two criteria of prediction error: MSEP fixed, which evaluates the mean squared error of prediction for a model with fixed structure, parameters and inputs, and MSEP uncertain(X), which evaluates the mean squared error averaged over the distributions of model structure, inputs and parameters. Comparison of model outputs with data can be used to estimate the former. The latter has a squared bias term, which can be estimated using hindcasts, and a model variance term, which can be estimated from a simulation experiment. The separate contributions to MSEP uncertain(X) can be estimated using a random effects ANOVA. It is argued that MSEP uncertain(X) is the more informative uncertainty criterion, because it is specific to each prediction situation.
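
    The two criteria can be sketched numerically: MSEP fixed from one model's hindcast residuals, and MSEP uncertain(X) as a squared-bias term (observations against the ensemble mean) plus a model-variance term (spread of the ensemble of model variants). The data below are illustrative, not from any crop-model study.

```python
# Hedged sketch of the two prediction-error criteria. Illustrative numbers only.

obs = [5.0, 6.2, 4.8, 7.1]

# One fixed model's hindcasts -> MSEP_fixed
fixed = [5.4, 6.0, 5.1, 6.5]
msep_fixed = sum((o - p) ** 2 for o, p in zip(obs, fixed)) / len(obs)

# Ensemble of model variants (structure/parameter/input draws), per year
ensemble = [[5.4, 5.0, 5.8], [6.0, 6.6, 6.3], [5.1, 4.6, 4.9], [6.5, 7.3, 6.8]]
means = [sum(e) / len(e) for e in ensemble]

# Squared bias of the ensemble mean, plus average model variance
sq_bias = sum((o - m) ** 2 for o, m in zip(obs, means)) / len(obs)
model_var = sum(sum((x - m) ** 2 for x in e) / len(e)
                for e, m in zip(ensemble, means)) / len(obs)
msep_uncertain = sq_bias + model_var
print(msep_fixed, msep_uncertain)
```

    In a full analysis the variance term would itself be decomposed by a random effects ANOVA into structure, parameter, and input contributions.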

  9. Evaluation of the multi-model CORDEX-Africa hindcast using RCMES

    NASA Astrophysics Data System (ADS)

    Kim, J.; Waliser, D. E.; Lean, P.; Mattmann, C. A.; Goodale, C. E.; Hart, A.; Zimdars, P.; Hewitson, B.; Jones, C.

    2011-12-01

    Recent global climate change studies have concluded with a high confidence level that the observed increasing trend in global-mean surface air temperatures since the mid-20th century is driven by the emission of anthropogenic greenhouse gases (GHGs). The increase in the global-mean temperature due to anthropogenic emissions is nearly monotonic and may alter climatological norms, resulting in a new climate normal. In the presence of anthropogenic climate change, assessing regional impacts of the altered climate state and developing plans for mitigating any adverse impacts are important concerns. Assessing the future climate state and its impact remains a difficult task, largely because of the uncertainties in future emissions and model errors. Uncertainties in climate projections propagate into impact assessment models and result in uncertainties in the impact assessments. In order to facilitate the evaluation of model data, a fundamental step for assessing model errors, the JPL Regional Climate Model Evaluation System (RCMES: Lean et al. 2010; Hart et al. 2011) has been developed through a joint effort of investigators from UCLA and JPL. RCMES is also a regional climate component of the larger worldwide ExArch project. We will present the evaluation of surface temperatures and precipitation from multiple RCMs participating in the African component of the Coordinated Regional Climate Downscaling Experiment (CORDEX), which has organized a suite of regional climate projection experiments incorporating multiple RCMs and GCMs. As a part of the project, CORDEX organized a 20-year regional climate hindcast study in order to quantify and understand the uncertainties originating from model errors. Investigators from JPL, UCLA, and the CORDEX-Africa team collaborate to analyze the RCM hindcast data using RCMES.
The analysis is focused on measuring the closeness of individual regional climate model outputs, as well as their ensembles, to observed data. The model evaluation is quantified in terms of widely used metrics. Details on the conceptual outline and architecture of RCMES are presented in two companion papers, "The Regional climate model Evaluation System (RCMES) based on contemporary satellite and other observations for assessing regional climate model fidelity" and "A Reusable Framework for Regional Climate Model Evaluation", in GC07 and IN30, respectively.

  10. Buy now, saved later? The critical impact of time-to-pandemic uncertainty on pandemic cost-effectiveness analyses.

    PubMed

    Drake, Tom; Chalabi, Zaid; Coker, Richard

    2015-02-01

    Investment in pandemic preparedness is a long-term gamble, with the return on investment coming at an unknown point in the future. Many countries have chosen to stockpile key resources, and the number of pandemic economic evaluations has risen sharply since 2009. We assess the importance of uncertainty in time-to-pandemic (and the associated discounting) in pandemic economic evaluation, a factor frequently neglected in the literature to date. We use a probability tree model and Monte Carlo parameter sampling to consider the cost effectiveness of antiviral stockpiling in Cambodia under parameter uncertainty. Mean elasticity and mutual information (MI) are used to assess the importance of time-to-pandemic compared with other parameters. We also consider the sensitivity to the choice of sampling distribution used to model time-to-pandemic uncertainty. Time-to-pandemic and discount rate are the primary drivers of sensitivity and uncertainty in pandemic cost effectiveness models. Base case cost effectiveness of antiviral stockpiling ranged between US$112 and US$3599 per DALY averted using historical pandemic intervals for time-to-pandemic. The mean elasticities for time-to-pandemic and discount rate were greater than for all other parameters. Similarly, the MI scores for time-to-pandemic and discount rate were greater than for other parameters. Time-to-pandemic and discount rate were key drivers of uncertainty in cost-effectiveness results regardless of the time-to-pandemic sampling distribution choice. Time-to-pandemic assumptions can "substantially" affect cost-effectiveness results and, in our model, are a greater contributor to uncertainty in cost-effectiveness results than any other parameter. We strongly recommend that cost-effectiveness models include probabilistic analysis of time-to-pandemic uncertainty. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2013; all rights reserved.
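
    The mechanism behind the result (discounting a fixed future benefit over an uncertain horizon) can be sketched as follows. The stockpile cost, DALY benefit, discount rate, and the uniform time-to-pandemic range are all illustrative assumptions, not the paper's Cambodian inputs.

```python
# Hedged sketch of probabilistic time-to-pandemic analysis: sample the
# pandemic year, discount the stockpile's health benefit to present value,
# and examine the spread in cost per DALY averted. Illustrative values only.
import random

random.seed(1)
stockpile_cost = 1_000_000.0   # up-front cost, hypothetical
dalys_averted = 5_000.0        # benefit realized when the pandemic occurs
discount_rate = 0.03

def cost_per_daly(years_to_pandemic):
    # Benefit discounted back from the (uncertain) pandemic year
    pv_dalys = dalys_averted / (1 + discount_rate) ** years_to_pandemic
    return stockpile_cost / pv_dalys

# Time-to-pandemic sampled over a historical inter-pandemic interval range
samples = [cost_per_daly(random.uniform(5, 40)) for _ in range(10_000)]
samples.sort()
print(samples[0], samples[-1])  # the spread is driven entirely by timing
```

    Even with every other input held fixed, the cost-effectiveness ratio varies by roughly a factor of three across plausible pandemic timings, which is the paper's central point.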

  11. Uncertainty evaluation of a regional real-time system for rain-induced landslides

    NASA Astrophysics Data System (ADS)

    Kirschbaum, Dalia; Stanley, Thomas; Yatheendradas, Soni

    2015-04-01

    A new prototype regional model and evaluation framework has been developed over Central America and the Caribbean region using satellite-based information including precipitation estimates, modeled soil moisture, topography, soils, as well as regionally available datasets such as road networks and distance to fault zones. The algorithm framework incorporates three static variables: a susceptibility map; a 24-hr rainfall triggering threshold; and an antecedent soil moisture variable threshold, which have been calibrated using historic landslide events. The thresholds are regionally heterogeneous and are based on the percentile distribution of the rainfall or antecedent moisture time series. A simple decision tree algorithm framework integrates all three variables with the rainfall and soil moisture time series and generates a landslide nowcast in real-time based on the previous 24 hours over this region. This system has been evaluated using several available landslide inventories over the Central America and Caribbean region. Spatiotemporal uncertainty and evaluation metrics of the model are presented here based on available landslides reports. This work also presents a probabilistic representation of potential landslide activity over the region which can be used to further refine and improve the real-time landslide hazard assessment system as well as better identify and characterize the uncertainties inherent in this type of regional approach. The landslide algorithm provides a flexible framework to improve hazard estimation and reduce uncertainty at any spatial and temporal scale.
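
    One plausible reading of the decision-tree framework described above (susceptibility map gating a rainfall threshold and an antecedent soil moisture threshold) can be sketched as follows; the threshold logic, output categories, and all numbers are illustrative assumptions, not the operational algorithm.

```python
# Hedged sketch of a decision-tree landslide nowcast: a nowcast is issued
# only where susceptibility is high AND the 24-hr rainfall and/or the
# antecedent soil moisture exceed their percentile-based thresholds.

def landslide_nowcast(susceptible, rain_24h, rain_p95, soil_moisture, sm_p90):
    """Return a nowcast level from static susceptibility plus two
    regionally varying percentile thresholds (illustrative logic)."""
    if not susceptible:
        return "none"
    if rain_24h > rain_p95 and soil_moisture > sm_p90:
        return "high"
    if rain_24h > rain_p95 or soil_moisture > sm_p90:
        return "moderate"
    return "none"

print(landslide_nowcast(True, 85.0, 60.0, 0.42, 0.35))   # high
print(landslide_nowcast(True, 85.0, 60.0, 0.20, 0.35))   # moderate
print(landslide_nowcast(False, 85.0, 60.0, 0.42, 0.35))  # none
```

    Because the thresholds are percentiles of local time series, the same rainfall total can trigger a nowcast in one grid cell and not in another.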

  12. Cumulative uncertainty in measured streamflow and water quality data for small watersheds

    USGS Publications Warehouse

    Harmel, R.D.; Cooper, R.J.; Slade, R.M.; Haney, R.L.; Arnold, J.G.

    2006-01-01

    The scientific community has not established an adequate understanding of the uncertainty inherent in measured water quality data, which is introduced by four procedural categories: streamflow measurement, sample collection, sample preservation/storage, and laboratory analysis. Although previous research has produced valuable information on relative differences in procedures within these categories, little information is available that compares the procedural categories or presents the cumulative uncertainty in resulting water quality data. As a result, quality control emphasis is often misdirected, and data uncertainty is typically either ignored or accounted for with an arbitrary margin of safety. Faced with the need for scientifically defensible estimates of data uncertainty to support water resource management, the objectives of this research were to: (1) compile selected published information on uncertainty related to measured streamflow and water quality data for small watersheds, (2) use a root mean square error propagation method to compare the uncertainty introduced by each procedural category, and (3) use the error propagation method to determine the cumulative probable uncertainty in measured streamflow, sediment, and nutrient data. Best case, typical, and worst case "data quality" scenarios were examined. Averaged across all constituents, the calculated cumulative probable uncertainty (±%) contributed under typical scenarios ranged from 6% to 19% for streamflow measurement, from 4% to 48% for sample collection, from 2% to 16% for sample preservation/storage, and from 5% to 21% for laboratory analysis. Under typical conditions, errors in storm loads ranged from 8% to 104% for dissolved nutrients, from 8% to 110% for total N and P, and from 7% to 53% for TSS. Results indicated that uncertainty can increase substantially under poor measurement conditions and limited quality control effort.
This research provides introductory scientific estimates of uncertainty in measured water quality data. The results and procedures presented should also assist modelers in quantifying the "quality"of calibration and evaluation data sets, determining model accuracy goals, and evaluating model performance.
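
    The root mean square error propagation used above combines the four procedural category uncertainties in quadrature. The sketch below shows the arithmetic; the percentage values are illustrative mid-range figures for a "typical" scenario, not the paper's exact numbers.

```python
# Hedged sketch of root-mean-square error propagation: independent
# procedural uncertainties (in percent) combine in quadrature to give the
# cumulative probable uncertainty. Illustrative numbers only.

def cumulative_uncertainty(*category_uncertainties_pct):
    """Root-sum-of-squares of per-category percentage uncertainties."""
    return sum(u ** 2 for u in category_uncertainties_pct) ** 0.5

# streamflow, sample collection, preservation/storage, laboratory analysis
streamflow, sampling, storage, lab = 10.0, 20.0, 8.0, 12.0
print(round(cumulative_uncertainty(streamflow, sampling, storage, lab), 1))
```

    Note that the quadrature total always exceeds the largest single category but is far below the arithmetic sum, which is why the dominant category (here sample collection) is the best target for quality control effort.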

  13. Assessing and mitigating uncertainties in the Noah-MP land-model simulations over the Tibet Plateau region

    NASA Astrophysics Data System (ADS)

    Zhang, G.; Chen, F.; Gan, Y.

    2017-12-01

    Uncertainties in the Noah with multiparameterization (Noah-MP) land surface model were assessed through physics-ensemble simulations for four sparsely vegetated sites in the Tibetan Plateau region. The simulations were evaluated using observations collected at the four sites during the third Tibetan Plateau Experiment (TIPEX III). The impacts on the physics-ensemble simulations of uncertainties in the precipitation data used as forcing, and in parameterizations of sub-processes such as soil organic matter and the rhizosphere, are identified using two different methods: natural selection and Tukey's test. This study attempts to answer the following questions: 1) what is the relative contribution of precipitation-forcing uncertainty to the overall uncertainty range of Noah-MP simulations at these sites, compared with that at a more moist and densely vegetated site; 2) what are the most sensitive physical parameterizations for these sites; and 3) can the parameterizations that need to be improved be identified? The investigation was conducted by evaluating the simulated seasonal evolution of soil temperature, soil moisture, and surface heat fluxes across a number of Noah-MP ensemble simulations.

  14. WE-D-BRE-07: Variance-Based Sensitivity Analysis to Quantify the Impact of Biological Uncertainties in Particle Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamp, F.; Brueningk, S.C.; Wilkens, J.J.

    Purpose: In particle therapy, treatment planning and evaluation are frequently based on biological models to estimate the relative biological effectiveness (RBE) or the equivalent dose in 2 Gy fractions (EQD2). In the context of the linear-quadratic model, these quantities depend on biological parameters (α, β) for ions as well as for the reference radiation, and on the dose per fraction. The needed biological parameters, as well as their dependency on ion species and ion energy, are typically subject to large relative uncertainties of up to 20–40% or even more. Therefore it is necessary to estimate the resulting uncertainties in, e.g., RBE or EQD2 caused by the uncertainties of the relevant input parameters. Methods: We use a variance-based sensitivity analysis (SA) approach, in which uncertainties in input parameters are modeled by random number distributions. The evaluated function is executed 10^4 to 10^6 times, each run with a different set of input parameters, randomly varied according to their assigned distribution. The sensitivity S is a variance-based ranking (from S = 0, no impact, to S = 1, the only influential part) of the impact of input uncertainties. The SA approach is implemented for carbon ion treatment plans on 3D patient data, providing information about variations (and their origin) in RBE and EQD2. Results: The quantification enables 3D sensitivity maps, showing dependencies of RBE and EQD2 on different input uncertainties. The high number of runs allows displaying the interplay between different input uncertainties. The SA identifies input parameter combinations which result in extreme deviations of the result, and the input parameters for which an uncertainty reduction is the most rewarding. Conclusion: The presented variance-based SA provides advantageous properties in terms of visualization and quantification of (biological) uncertainties and their impact.
The method is very flexible, model independent, and enables a broad assessment of uncertainties. Supported by DFG grant WI 3745/1-1 and DFG cluster of excellence: Munich-Centre for Advanced Photonics.
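
    A first-order variance-based sensitivity index of the kind used above, S = Var(E[Y|X]) / Var(Y), can be estimated from Monte Carlo samples by binning one input at a time. The toy linear-quadratic response and the α/β distributions below are illustrative assumptions, not the study's patient data.

```python
# Hedged sketch of a variance-based sensitivity index estimated by binning:
# S = Var(E[Y|X]) / Var(Y), with a toy LQ-model effect standing in for
# RBE/EQD2. Parameter distributions are illustrative.
import random

random.seed(2)
N = 20000
alpha = [random.gauss(0.1, 0.03) for _ in range(N)]    # Gy^-1, wide (~30%)
beta  = [random.gauss(0.05, 0.002) for _ in range(N)]  # Gy^-2, tight
dose = 2.0
effect = [a * dose + b * dose * dose for a, b in zip(alpha, beta)]

def first_order_index(x, y, bins=50):
    """Bin x, average y within each bin, take variance of the bin means."""
    lo, hi = min(x), max(x)
    sums = [[0.0, 0] for _ in range(bins)]
    for xi, yi in zip(x, y):
        i = min(int((xi - lo) / (hi - lo) * bins), bins - 1)
        sums[i][0] += yi
        sums[i][1] += 1
    my = sum(y) / len(y)
    var_y = sum((v - my) ** 2 for v in y) / len(y)
    var_cond = sum(c * (s / c - my) ** 2 for s, c in sums if c) / len(y)
    return var_cond / var_y

print(first_order_index(alpha, effect))  # alpha dominates the variance
print(first_order_index(beta, effect))
```

    Mapping such indices voxel by voxel is what produces the 3D sensitivity maps mentioned in the abstract.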

  15. Evaluating Organic Aerosol Model Performance: Impact of two Embedded Assumptions

    NASA Astrophysics Data System (ADS)

    Jiang, W.; Giroux, E.; Roth, H.; Yin, D.

    2004-05-01

    Organic aerosols are important due to their abundance in the polluted lower atmosphere and their impact on human health and vegetation. However, modeling organic aerosols is a very challenging task because of the complexity of aerosol composition, structure, and formation processes. Assumptions and their associated uncertainties in both models and measurement data make model performance evaluation a truly demanding job. Although some assumptions are obvious, others are hidden and embedded, and can significantly impact modeling results, possibly even changing conclusions about model performance. This paper focuses on analyzing the impact of two embedded assumptions on evaluation of organic aerosol model performance. One assumption is about the enthalpy of vaporization widely used in various secondary organic aerosol (SOA) algorithms. The other is about the conversion factor used to obtain ambient organic aerosol concentrations from measured organic carbon. These two assumptions reflect uncertainties in the model and in the ambient measurement data, respectively. For illustration purposes, various choices of the assumed values are implemented in the evaluation process for an air quality model based on CMAQ (the Community Multiscale Air Quality Model). Model simulations are conducted for the Lower Fraser Valley covering Southwest British Columbia, Canada, and Northwest Washington, United States, for a historical pollution episode in 1993. To understand the impact of the assumed enthalpy of vaporization on modeling results, its impact on instantaneous organic aerosol yields (IAY) through partitioning coefficients is analysed first. The analysis shows that utilizing different enthalpy of vaporization values causes changes in the shapes of IAY curves and in the response of SOA formation capability of reactive organic gases to temperature variations. 
These changes are then carried into the air quality model and cause substantial changes in the organic aerosol modeling results. In another respect, using different assumed factors to convert measured organic carbon to organic aerosol concentrations causes substantial variations in the processed ambient data themselves, which are normally used as performance targets for model evaluations. The combination of uncertainties in the modeling results and in the moving performance targets causes major uncertainties in the final conclusion about model performance. Without further information, the best a modeler can do is to choose a combination of the assumed values from the sensible parameter ranges available in the literature, based on the best match of the modeling results with the processed measurement data. However, the best match of the modeling results with the processed measurement data does not necessarily guarantee that the model itself is rigorous or that the model performance is robust. Conclusions on the model performance can only be reached with sufficient understanding of the uncertainties and their impact.
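
    The route by which the assumed enthalpy of vaporization enters the partitioning coefficient is a Clausius-Clapeyron temperature scaling. The sketch below shows how two assumed enthalpies change the predicted temperature response; the reference coefficient and the two enthalpy values are illustrative assumptions spanning the range found in the SOA literature, not this paper's inputs.

```python
# Hedged sketch: Clausius-Clapeyron scaling of a SOA partitioning
# coefficient with temperature, under two assumed enthalpies of
# vaporization. Illustrative values only.
import math

R = 8.314  # J/(mol K)

def kp_at(T, T0=298.0, kp0=0.02, dH=42e3):
    """Partitioning coefficient scaled from reference temperature T0."""
    return kp0 * (T / T0) * math.exp(dH / R * (1.0 / T - 1.0 / T0))

# Same 10 K cooling, two assumed enthalpies: the assumption alone changes
# the predicted increase in partitioning (and hence SOA yield) severalfold.
for dH in (42e3, 156e3):
    print(dH, kp_at(288.0, dH=dH) / kp_at(298.0, dH=dH))
```

    This is why the choice of enthalpy reshapes the instantaneous aerosol yield curves and the temperature response discussed in the abstract.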

  16. Assessment of parametric uncertainty for groundwater reactive transport modeling

    USGS Publications Warehouse

    Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun

    2014-01-01

    The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, the predictive performance of the formal generalized likelihood function is superior to that of the least squares regression and Bayesian methods with Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive metropolis (DREAM(zs)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and the Morris- and DREAM(zs)-based global sensitivity analyses yield almost identical rankings of parameter importance.
The uncertainty analysis may help select appropriate likelihood functions, improve model calibration, and reduce predictive uncertainty in other groundwater reactive transport and environmental modeling.
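
    The practical effect of the likelihood choice can be sketched by scoring the same residual set, including one outlier of the kind that makes transport residuals non-Gaussian, under a Gaussian and under a heavier-tailed Student-t likelihood. The t-likelihood is only a simple stand-in for the formal generalized likelihood of Schoups and Vrugt (2010); the residuals and scale are illustrative.

```python
# Hedged sketch of why the likelihood choice matters: the same residuals
# are scored under a Gaussian and a heavy-tailed Student-t log-likelihood.
import math

residuals = [0.1, -0.2, 0.15, -0.1, 2.5]  # last value is an outlier
sigma = 0.2

def gauss_loglik(res):
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - r ** 2 / (2 * sigma ** 2) for r in res)

def t_loglik(res, nu=3.0):
    c = (math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
         - 0.5 * math.log(nu * math.pi * sigma ** 2))
    return sum(c - (nu + 1) / 2 * math.log(1 + (r / sigma) ** 2 / nu)
               for r in res)

print(gauss_loglik(residuals), t_loglik(residuals))
# The Gaussian score is dragged far down by the single outlier; the
# heavy-tailed score is not, so calibration is not distorted by it.
```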

  17. Uncertainties in Decadal Model Evaluation due to the Choice of Different Reanalysis Products

    NASA Astrophysics Data System (ADS)

    Illing, Sebastian; Kadow, Christopher; Kunst, Oliver; Cubasch, Ulrich

    2014-05-01

    In recent years decadal predictions have become very popular in the climate science community. A major task is the evaluation and validation of a decadal prediction system. Therefore hindcast experiments are performed and evaluated against observation-based or reanalysis datasets; that is, various metrics and skill scores, like the anomaly correlation or the mean squared error skill score (MSSS), are calculated to estimate the potential prediction skill of the model system. Our results will mostly feature the Baseline 1 hindcast experiments from the MiKlip decadal prediction system. MiKlip (www.fona-miklip.de) is a project for medium-term climate prediction funded by the Federal Ministry of Education and Research in Germany (BMBF) and aims to create a model system that can provide reliable decadal forecasts of climate and weather. There are various reanalysis and observation-based products covering at least the last forty years which can be used for model evaluation, for instance the 20th Century Reanalysis from NOAA-CIRES, the Climate Forecast System Reanalysis from NCEP, or the Interim Reanalysis from ECMWF. Each of them is based on different climate models and observations. We will show that the choice of reanalysis product has a large impact on the value of various skill metrics. In some cases this may actually change the interpretation of the results, e.g. when one tries to compare two model versions and the anomaly correlation difference changes its sign for two different reanalysis products. We will also show first results of our studies investigating the influence of this source of uncertainty on decadal model evaluation. Furthermore we point out regions which are most affected by this uncertainty and where one has to be cautious when interpreting skill scores. In addition we introduce some strategies to overcome, or at least reduce, this source of uncertainty.
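
    The sensitivity of a skill score to the reference product can be sketched directly: the same hindcast scored against two different "reanalysis" series gives noticeably different anomaly correlations. All series below are short illustrative anomaly sequences, not MiKlip output.

```python
# Hedged sketch: the same hindcast evaluated against two different
# reanalysis products yields different skill. Illustrative anomalies only.

hindcast     = [0.1, 0.3, -0.2, 0.4, 0.0, -0.3]
reanalysis_a = [0.2, 0.2, -0.1, 0.5, -0.1, -0.2]
reanalysis_b = [0.0, 0.4, 0.1, 0.2, -0.3, 0.1]

def anomaly_correlation(x, y):
    """Pearson correlation of two anomaly time series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)

print(anomaly_correlation(hindcast, reanalysis_a))  # looks skillful
print(anomaly_correlation(hindcast, reanalysis_b))  # looks much weaker
```

    When two model versions are compared, a gap of this size in the reference can flip the sign of their skill difference, which is the effect the abstract describes.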

  18. Uncertainty in modeled upper ocean heat content change

    NASA Astrophysics Data System (ADS)

    Tokmakian, Robin; Challenor, Peter

    2014-02-01

    This paper examines the uncertainty in the change in the heat content in the ocean component of a general circulation model. We describe the design and implementation of our statistical methodology. Using an ensemble of model runs and an emulator, we produce an estimate of the full probability distribution function (PDF) for the change in upper ocean heat in an Atmosphere/Ocean General Circulation Model, the Community Climate System Model v. 3, across a multi-dimensional input space. We show how the emulator of the GCM's heat content change, and hence the PDF, can be validated and how implausible outcomes from the emulator can be identified when compared to observational estimates of the metric. In addition, the paper describes how the emulator outcomes and related uncertainty information might inform estimates of the same metric from a multi-model Coupled Model Intercomparison Project phase 3 ensemble. We illustrate how to (1) construct an ensemble based on experiment design methods, (2) construct and evaluate an emulator for a particular metric of a complex model, (3) validate the emulator using observational estimates and explore the input space with respect to implausible outcomes, and (4) contribute to the understanding of uncertainties within a multi-model ensemble. Finally, we estimate the most likely value for heat content change and its uncertainty for the model, with respect to both observations and the uncertainty in the value for the input parameters.

  19. A New Combined Stepwise-Based High-Order Decoupled Direct and Reduced-Form Method To Improve Uncertainty Analysis in PM2.5 Simulations.

    PubMed

    Huang, Zhijiong; Hu, Yongtao; Zheng, Junyu; Yuan, Zibing; Russell, Armistead G; Ou, Jiamin; Zhong, Zhuangmin

    2017-04-04

    The traditional reduced-form model (RFM) based on the high-order decoupled direct method (HDDM) is an efficient uncertainty analysis approach for air quality models, but it has large biases in uncertainty propagation due to the limitation of the HDDM in predicting nonlinear responses to large perturbations of model inputs. To overcome this limitation, a new stepwise-based RFM that combines several sets of local sensitivity coefficients under different conditions is proposed. Evaluations reveal that the new RFM improves the prediction of nonlinear responses. The new method is applied to quantify uncertainties in simulated PM2.5 concentrations in the Pearl River Delta (PRD) region of China as a case study. Results show that the average uncertainty range of hourly PM2.5 concentrations is -28% to 57%, which can cover approximately 70% of the observed PM2.5 concentrations, while the traditional RFM underestimates the upper bound of the uncertainty range by 1-6%. Using a variance-based method, the PM2.5 boundary conditions and primary PM2.5 emissions are found to be the two major uncertainty sources in PM2.5 simulations. The new RFM better quantifies the uncertainty range in model simulations and can be applied to improve applications that rely on uncertainty information.
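
    The idea behind the stepwise RFM can be illustrated with a one-dimensional toy response. Here an invented cubic stands in for the air quality model, and the HDDM-style first- and second-order sensitivity coefficients are its exact derivatives at each base condition; the stepwise RFM expands around the nearest base point instead of a single one:

```python
def true_model(d):
    """Invented nonlinear concentration response to an emission perturbation d."""
    return 10.0 + 4.0 * d - 0.9 * d ** 2 + 0.3 * d ** 3

def s1(d):                      # first-order sensitivity (exact derivative)
    return 4.0 - 1.8 * d + 0.9 * d ** 2

def s2(d):                      # second-order sensitivity
    return -1.8 + 1.8 * d

bases = [0.0, 0.5, 1.0]         # base conditions with local coefficients

def rfm_single(delta):
    """Traditional RFM: one second-order Taylor expansion around delta = 0."""
    return true_model(0.0) + s1(0.0) * delta + 0.5 * s2(0.0) * delta ** 2

def rfm_stepwise(delta):
    """Stepwise RFM: expand around the nearest base condition."""
    b = min(bases, key=lambda x: abs(delta - x))
    step = delta - b
    return true_model(b) + s1(b) * step + 0.5 * s2(b) * step ** 2

delta = 1.0                     # a large perturbation, far from the 0.0 base
err_single = abs(rfm_single(delta) - true_model(delta))
err_step = abs(rfm_stepwise(delta) - true_model(delta))
```

    At a perturbation far from the original base point, the single-expansion RFM misses the higher-order curvature, while the stepwise variant, expanding about the nearest base condition, reproduces the response far more closely.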

  20. Evaluating Variability and Uncertainty of Geological Strength Index at a Specific Site

    NASA Astrophysics Data System (ADS)

    Wang, Yu; Aladejare, Adeyemi Emman

    2016-09-01

    The Geological Strength Index (GSI) is an important parameter for estimating rock mass properties. GSI can be estimated from the quantitative GSI chart, as an alternative to the direct observational method, which requires extensive geological experience with rock. The GSI chart was developed from past observations and engineering experience, with either empiricism or some theoretical simplifications. The chart thereby contains model uncertainty arising from its development. The presence of such model uncertainty affects the GSI estimated from the chart at a specific site; it is, therefore, imperative to quantify and incorporate the model uncertainty during GSI estimation from the GSI chart. A major challenge in quantifying the GSI chart model uncertainty is the lack of the original datasets used to develop the chart, since it was developed from past experience without reference to specific datasets. This paper tackles this problem by developing a Bayesian approach for quantifying the model uncertainty in the GSI chart when using it to estimate GSI at a specific site. The model uncertainty in the GSI chart and the inherent spatial variability in GSI are modeled explicitly in the Bayesian approach. The approach generates equivalent samples of GSI from the integrated knowledge of the GSI chart, prior knowledge and observation data available from site investigation. Equations are derived for the Bayesian approach, and the proposed approach is illustrated using data from a drill-and-blast tunnel project. The proposed approach effectively tackles, in a transparent manner, the problem of quantifying the model uncertainty that arises from using the GSI chart to characterize site-specific GSI.
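
    The flavour of such a Bayesian update can be conveyed with a deliberately simplified conjugate normal sketch (not the equations derived in the paper): chart-based GSI readings are treated as noisy observations whose variance combines an assumed chart model uncertainty with spatial variability, and "equivalent samples" are drawn from the resulting posterior. All numbers are invented:

```python
import random
import statistics

# Hypothetical numbers throughout: prior site knowledge, an assumed chart
# model uncertainty, spatial variability, and five chart-based GSI readings.
prior_mu, prior_var = 50.0, 15.0 ** 2    # prior mean/variance of site GSI
model_var = 8.0 ** 2                     # assumed GSI chart model uncertainty
spatial_var = 5.0 ** 2                   # inherent spatial variability
chart_obs = [46.0, 52.0, 49.0, 55.0, 50.0]

# Conjugate normal-normal update with the two variance components summed
obs_var = model_var + spatial_var
n = len(chart_obs)
ybar = sum(chart_obs) / n
post_var = 1.0 / (1.0 / prior_var + n / obs_var)
post_mu = post_var * (prior_mu / prior_var + n * ybar / obs_var)

# "Equivalent samples" of site-specific GSI drawn from the posterior
random.seed(4)
equiv_samples = [random.gauss(post_mu, post_var ** 0.5) for _ in range(10000)]
```

    The posterior mean sits between the prior and the chart readings, and its variance is smaller than either source alone; the equivalent samples play that integrating role in the full approach.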

  1. Risk Assessment of Groundwater Contamination: A Multilevel Fuzzy Comprehensive Evaluation Approach Based on DRASTIC Model

    PubMed Central

    Zhang, Yan; Zhong, Ming

    2013-01-01

    Groundwater contamination is a serious threat to water supply. Risk assessment of groundwater contamination is an effective way to protect the safety of groundwater resources. Groundwater is a complex and fuzzy system with many uncertainties, impacted by different geological and hydrological factors. In order to deal with the uncertainty in the risk assessment of groundwater contamination, we propose an approach that integrates the analytic hierarchy process with fuzzy comprehensive evaluation. First, the risk factors of groundwater contamination are identified by the source-pathway-receptor-consequence method, and a corresponding index system of risk assessment based on the DRASTIC model is established. Because of the complexity of the transitions between possible pollution risks and the uncertainties of the factors, the analytic hierarchy process is applied to determine the weight of each factor, and fuzzy set theory is adopted to calculate the membership degrees of each factor. Finally, a case study is presented to illustrate and test this methodology. It is concluded that the proposed approach integrates the advantages of both the analytic hierarchy process and fuzzy comprehensive evaluation, providing a more flexible and reliable way to deal with the linguistic and mechanism uncertainty in groundwater contamination without losing important information. PMID:24453883
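
    The weighting-and-aggregation step is compact enough to sketch. Below, AHP weights for three hypothetical factors are obtained by the common geometric-mean approximation of a pairwise comparison matrix and combined with an invented membership matrix over three risk grades:

```python
import math

# Pairwise comparison matrix for three hypothetical DRASTIC-style factors
pairwise = [[1.0, 3.0, 5.0],
            [1/3, 1.0, 2.0],
            [1/5, 1/2, 1.0]]

# AHP weights via the geometric-mean (row) approximation
gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
weights = [g / sum(gm) for g in gm]

# Membership degrees of each factor in three risk grades (low, medium, high)
R = [[0.1, 0.3, 0.6],
     [0.5, 0.4, 0.1],
     [0.2, 0.5, 0.3]]

# Fuzzy comprehensive evaluation with the weighted-average operator: B = W . R
B = [sum(w * R[i][j] for i, w in enumerate(weights)) for j in range(3)]
grade = ["low", "medium", "high"][B.index(max(B))]
```

    The evaluation vector B sums to one here because each factor's memberships do, and the site is assigned the grade with the largest aggregated membership.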

  2. Integrated Arrival and Departure Schedule Optimization Under Uncertainty

    NASA Technical Reports Server (NTRS)

    Xue, Min; Zelinski, Shannon

    2014-01-01

    In terminal airspace, integrating arrivals and departures with shared waypoints offers the potential to improve operational efficiency by allowing direct routes when possible. Incorporating stochastic evaluation as a post-analysis process of deterministic optimization, and imposing a safety buffer in deterministic optimization, are two ways to learn and alleviate the impact of uncertainty and to avoid unexpected outcomes. This work presents a third and direct way to take uncertainty into consideration during the optimization itself. The impact of uncertainty was incorporated into cost evaluations when searching for the optimal solutions. The controller intervention count was computed using a heuristic model and served as a second stochastic cost besides total delay. Costs under uncertainty were evaluated using Monte Carlo simulations. The Pareto fronts that contain a set of solutions were identified, and the trade-off between delays and controller intervention count was shown. Solutions that shared similar delays but had different intervention counts were investigated. The results showed that optimization under uncertainty can identify compromise solutions on Pareto fronts that are better than deterministic optimization with extra safety buffers, helping decision-makers reduce controller intervention while achieving low delays.
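
    The cost-under-uncertainty evaluation and Pareto filtering can be sketched as follows; the jitter/buffer model is a made-up placeholder for the heuristic used in the study, with a larger buffer buying fewer interventions at the price of delay:

```python
import random

random.seed(1)

def expected_costs(schedule, n_mc=2000):
    """Monte Carlo estimate of (mean delay, intervention rate) for a schedule.
    The jitter/buffer model below is a placeholder heuristic."""
    delays, interventions = [], []
    for _ in range(n_mc):
        jitter = random.gauss(0.0, schedule["sigma"])   # arrival-time uncertainty
        delays.append(schedule["delay"] + abs(jitter))
        # an intervention is assumed whenever jitter exceeds the buffer
        interventions.append(1 if abs(jitter) > schedule["buffer"] else 0)
    return sum(delays) / n_mc, sum(interventions) / n_mc

candidates = [   # schedules trading scheduled delay against separation buffer
    {"delay": 4.0, "sigma": 1.0, "buffer": 0.5},
    {"delay": 5.0, "sigma": 1.0, "buffer": 1.0},
    {"delay": 7.0, "sigma": 1.0, "buffer": 2.0},
    {"delay": 8.0, "sigma": 1.0, "buffer": 1.5},
]
costs = [expected_costs(c) for c in candidates]

def pareto_front(points):
    """Keep points not dominated in both objectives (lower is better)."""
    return [p for p in points
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)]

front = pareto_front(costs)
```

    The last candidate adds delay without reducing interventions relative to the third, so it is dominated; the remaining three trace the delay/intervention trade-off that a decision-maker would choose from.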

  3. Computer Model Inversion and Uncertainty Quantification in the Geosciences

    NASA Astrophysics Data System (ADS)

    White, Jeremy T.

    The subject of this dissertation is the use of computer models as data analysis tools in several different geoscience settings, including integrated surface water/groundwater modeling, tephra fallout modeling, geophysical inversion, and hydrothermal groundwater modeling. The dissertation is organized into three chapters, which correspond to three individual publication manuscripts. In the first chapter, a linear framework is developed to identify and estimate the potential predictive consequences of using a simple computer model as a data analysis tool. The framework is applied to a complex integrated surface-water/groundwater numerical model with thousands of parameters. Several types of predictions are evaluated, including particle travel time and surface-water/groundwater exchange volume. The analysis suggests that model simplifications have the potential to corrupt many types of predictions. The implementation of the inversion, including how the objective function is formulated, what minimum of the objective function value is acceptable, and how expert knowledge is enforced on parameters, can greatly influence the manifestation of model simplification. Depending on the prediction, failure to specifically address each of these important issues during inversion is shown to degrade the reliability of some predictions. In some instances, inversion is shown to increase, rather than decrease, the uncertainty of a prediction, which defeats the purpose of using a model as a data analysis tool. In the second chapter, an efficient inversion and uncertainty quantification approach is applied to a computer model of volcanic tephra transport and deposition. The computer model simulates many physical processes related to tephra transport and fallout. The utility of the approach is demonstrated for two eruption events. 
In both cases, the importance of uncertainty quantification is highlighted by exposing the variability in the conditioning provided by the observations used for inversion. The worth of different types of tephra data to reduce parameter uncertainty is evaluated, as is the importance of different observation error models. The analyses reveal the importance of using tephra granulometry data for inversion, which results in reduced uncertainty for most eruption parameters. In the third chapter, geophysical inversion is combined with hydrothermal modeling to evaluate the enthalpy of an undeveloped geothermal resource in a pull-apart basin located in southeastern Armenia. A high-dimensional gravity inversion is used to define the depth to the contact between the lower-density valley fill sediments and the higher-density surrounding host rock. The inverted basin depth distribution was used to define the hydrostratigraphy for the coupled groundwater-flow and heat-transport model that simulates the circulation of hydrothermal fluids in the system. Evaluation of several different geothermal system configurations indicates that the most likely system configuration is a low-enthalpy, liquid-dominated geothermal system.

  4. Uncertainty propagation of p-boxes using sparse polynomial chaos expansions

    NASA Astrophysics Data System (ADS)

    Schöbi, Roland; Sudret, Bruno

    2017-06-01

    In modern engineering, physical processes are modelled and analysed using advanced computer simulations, such as finite element models. Furthermore, concepts of reliability analysis and robust design are becoming popular, hence, making efficient quantification and propagation of uncertainties an important aspect. In this context, a typical workflow includes the characterization of the uncertainty in the input variables. In this paper, input variables are modelled by probability-boxes (p-boxes), accounting for both aleatory and epistemic uncertainty. The propagation of p-boxes leads to p-boxes of the output of the computational model. A two-level meta-modelling approach is proposed using non-intrusive sparse polynomial chaos expansions to surrogate the exact computational model and, hence, to facilitate the uncertainty quantification analysis. The capabilities of the proposed approach are illustrated through applications using a benchmark analytical function and two realistic engineering problem settings. They show that the proposed two-level approach allows for an accurate estimation of the statistics of the response quantity of interest using a small number of evaluations of the exact computational model. This is crucial in cases where the computational costs are dominated by the runs of high-fidelity computational models.
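
    A stripped-down version of the two-level propagation (with a plain double-loop Monte Carlo in place of the sparse polynomial chaos surrogate, and a toy function in place of a finite element model) looks like this:

```python
import bisect
import random

def model(x):                     # toy stand-in for an expensive simulator
    return x ** 2 + 1.0

# Input p-box: X ~ Normal(mu, sigma), with epistemic intervals on mu and sigma
mu_box, sigma_box = (0.0, 0.5), (0.8, 1.2)

random.seed(2)
thresholds = [0.5 * i for i in range(17)]         # output values of interest
lower = [1.0] * len(thresholds)                   # lower CDF bound
upper = [0.0] * len(thresholds)                   # upper CDF bound

# Outer (epistemic) loop over corners of the parameter intervals;
# inner (aleatory) loop is a plain Monte Carlo CDF estimate.
for mu in mu_box:
    for sigma in sigma_box:
        ys = sorted(model(random.gauss(mu, sigma)) for _ in range(5000))
        for k, t in enumerate(thresholds):
            F = bisect.bisect_right(ys, t) / len(ys)   # empirical CDF at t
            lower[k] = min(lower[k], F)
            upper[k] = max(upper[k], F)
```

    The min/max envelope of the empirical CDFs over the epistemic corners is the output p-box: the gap between the bounds reflects the epistemic part of the uncertainty, the slope of each bound the aleatory part. The paper's contribution is doing this efficiently with sparse PCE surrogates instead of brute-force sampling.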

  5. Uncertainty propagation of p-boxes using sparse polynomial chaos expansions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schöbi, Roland, E-mail: schoebi@ibk.baug.ethz.ch; Sudret, Bruno, E-mail: sudret@ibk.baug.ethz.ch

    2017-06-15

    In modern engineering, physical processes are modelled and analysed using advanced computer simulations, such as finite element models. Furthermore, concepts of reliability analysis and robust design are becoming popular, hence, making efficient quantification and propagation of uncertainties an important aspect. In this context, a typical workflow includes the characterization of the uncertainty in the input variables. In this paper, input variables are modelled by probability-boxes (p-boxes), accounting for both aleatory and epistemic uncertainty. The propagation of p-boxes leads to p-boxes of the output of the computational model. A two-level meta-modelling approach is proposed using non-intrusive sparse polynomial chaos expansions to surrogate the exact computational model and, hence, to facilitate the uncertainty quantification analysis. The capabilities of the proposed approach are illustrated through applications using a benchmark analytical function and two realistic engineering problem settings. They show that the proposed two-level approach allows for an accurate estimation of the statistics of the response quantity of interest using a small number of evaluations of the exact computational model. This is crucial in cases where the computational costs are dominated by the runs of high-fidelity computational models.

  6. Estimating the uncertainty of the impact of climate change on alluvial aquifers. Case study in central Italy

    NASA Astrophysics Data System (ADS)

    Romano, Emanuele; Camici, Stefania; Brocca, Luca; Moramarco, Tommaso; Pica, Federico; Preziosi, Elisabetta

    2014-05-01

    There is evidence that the precipitation pattern in Europe is trending towards more humid conditions in the northern region and drier conditions in the southern and central-eastern regions. However, a great deal of uncertainty concerns how changes in precipitation will impact water resources, particularly groundwater, and this uncertainty should be evaluated considering both 1) the future climate scenarios of Global Circulation Models (GCMs) and 2) the modeling chain, including the downscaling technique, the infiltration model and the calibration/validation procedure used to develop the groundwater flow model. With the aim of quantifying the uncertainty of these components, the Valle Umbra porous aquifer (Central Italy) has been considered as a case study. This aquifer, which is exploited for human consumption and irrigation, is mainly fed by effective infiltration from the ground surface and partly by inflow from the carbonate aquifers bordering the valley. A numerical groundwater flow model has been developed with the finite-difference MODFLOW2005 code, and it has been calibrated and validated considering the recharge regime computed through a Thornthwaite-Mather infiltration model under the climate conditions observed in the period 1956-2012. Future scenarios (2010-2070) of temperature and precipitation have been obtained from three different GCMs: ECHAM-5 (Max Planck Institute, Germany), PCM and CCSM3 (both National Center for Atmospheric Research, USA). Each scenario has been downscaled (DSC) to the temperature and precipitation data collected in the baseline period 1960-1990 at the stations located in the study area, using two different statistical techniques (linear rescaling and quantile mapping). 
Then, stochastic rainfall and temperature time series are generated through the Neyman-Scott Rectangular Pulses model (NSRP) for precipitation and the Fractionally Differenced ARIMA model (FARIMA) for temperature. This procedure has allowed us to estimate, through the Thornthwaite-Mather model, the uncertainty related to the future scenarios of recharge to the aquifer. Finally, all the scenarios of recharge have been used as input to the groundwater flow model, and the results have been evaluated in terms of the uncertainty in the computed aquifer heads and total budget. The main results indicate that most of the uncertainty in the impact on the aquifer arises from the first part of the processing chain, i.e. the GCM scenarios and their downscaling.
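
    The two downscaling techniques mentioned, linear rescaling and empirical quantile mapping, can be sketched with invented series in which the GCM is systematically too dry over the baseline period:

```python
# Two simple statistical downscaling corrections applied to a GCM series
# using an observed baseline (all series invented for illustration).

obs_baseline = [2.0, 3.0, 5.0, 8.0, 13.0, 21.0]   # station record, baseline period
gcm_baseline = [1.0, 1.5, 2.5, 4.0, 6.5, 10.5]    # GCM over the same period

# (1) Linear rescaling: multiply by the ratio of baseline means.
ratio = sum(obs_baseline) / sum(gcm_baseline)
def linear_rescale(v):
    return v * ratio

# (2) Empirical quantile mapping: replace a GCM value by the observed
# value at the same empirical quantile of the baseline distributions.
def quantile_map(v):
    g, o = sorted(gcm_baseline), sorted(obs_baseline)
    p = sum(1 for x in g if x <= v) / len(g)       # non-exceedance probability
    return o[min(int(p * (len(o) - 1)), len(o) - 1)]

future_gcm = [2.5, 6.5, 1.0]                       # projected values to correct
corrected_qm = [quantile_map(v) for v in future_gcm]
corrected_lr = [linear_rescale(v) for v in future_gcm]
```

    In this contrived example the two corrections agree because the bias is a constant factor; with a distribution-dependent bias, quantile mapping corrects the whole distribution while linear rescaling only corrects the mean, which is one source of the downscaling uncertainty discussed above.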

  7. Performance Analysis of Transposition Models Simulating Solar Radiation on Inclined Surfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Yu; Sengupta, Manajit

    2016-06-02

    Transposition models have been widely used in the solar energy industry to simulate solar radiation on inclined photovoltaic panels. Following numerous studies comparing the performance of transposition models, this work aims to understand the quantitative uncertainty in state-of-the-art transposition models and the sources leading to the uncertainty. Our results show significant differences between two highly used isotropic transposition models, with one substantially underestimating the diffuse plane-of-array irradiances when diffuse radiation is perfectly isotropic. In the empirical transposition models, the selection of the empirical coefficients and land surface albedo can both result in uncertainty in the output. This study can be used as a guide for the future development of physics-based transposition models and evaluations of system performance.

  8. Application of a baseflow filter for evaluating model structure suitability of the IHACRES CMD

    NASA Astrophysics Data System (ADS)

    Kim, H. S.

    2015-02-01

    The main objective of this study was to assess the predictive uncertainty arising from the structure of a rainfall-runoff model coupling a conceptual module (non-linear module) with a metric transfer function module (linear module). The methodology was primarily based on the comparison between the outputs of the rainfall-runoff model and those from an alternative model approach. A baseflow filter was adopted as the alternative approach, so that deficiencies in the forms of the rainfall-runoff model could be examined while minimising uncertainties arising from the data. The predictive uncertainty from the model structure was investigated for representative groups of catchments having similar hydrological response characteristics in the upper Murrumbidgee Catchment. In assessing model structure suitability, the consistency (or variability) of catchment response over time and space in model performance and parameter values was investigated to detect problems related to the temporal and spatial variability of model accuracy. The predictive error caused by model uncertainty was evaluated through analysis of the variability of the model performance and parameters. A graphical comparison of model residuals, effective rainfall estimates and hydrographs was used to reveal systematic deviations between simulated and observed behaviours and general differences in the timing and magnitude of peak flows. The model's predictability was very sensitive to catchment response characteristics. The linear module performs reasonably well in the wetter catchments but has considerable difficulties when applied to the drier catchments, where the hydrologic response is dominated by quick flow. 
The non-linear module has a potential limitation in its capacity to capture non-linear processes for converting observed rainfall into effective rainfall in both the wetter and drier catchments. The comparative study based on a better quantification of the accuracy and precision of hydrological modelling predictions yields a better understanding for the potential improvement of model deficiencies.
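
    A widely used choice of baseflow filter for this kind of analysis is the one-parameter recursive digital filter of Lyne and Hollick; the sketch below applies it to an invented daily flow series (the 0.5·q initialisation of the quickflow is an assumption, and the study's actual filter may differ):

```python
# One-parameter recursive digital filter (Lyne-Hollick form):
# quickflow f_k = a*f_{k-1} + (1+a)/2 * (q_k - q_{k-1}), baseflow = q - f
def baseflow_filter(q, a=0.925):
    quick = [q[0] * 0.5]                 # assumed initial quickflow
    for k in range(1, len(q)):
        f = a * quick[-1] + (1 + a) / 2 * (q[k] - q[k - 1])
        quick.append(min(max(f, 0.0), q[k]))   # constrain quickflow to [0, q]
    return [qk - fk for qk, fk in zip(q, quick)]

# Invented daily streamflow with one flood peak and a long recession
flow = [5, 5, 30, 80, 45, 25, 15, 10, 8, 7, 6, 6]
base = baseflow_filter(flow)
bfi = sum(base) / sum(flow)              # baseflow index
```

    During the flood peak almost all flow is classified as quickflow, while the recession limbs are progressively assigned to baseflow; the baseflow index bfi summarises the split and gives a data-driven benchmark against which the model's flow separation can be compared.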

  9. The Global Aerosol Synthesis and Science Project (GASSP): Measurements and Modeling to Reduce Uncertainty

    DOE PAGES

    Reddington, C. L.; Carslaw, K. S.; Stier, P.; ...

    2017-09-01

    The largest uncertainty in the historical radiative forcing of climate is caused by changes in aerosol particles due to anthropogenic activity. Sophisticated aerosol microphysics processes have been included in many climate models in an effort to reduce the uncertainty. However, the models are very challenging to evaluate and constrain because they require extensive in situ measurements of the particle size distribution, number concentration, and chemical composition that are not available from global satellite observations. The Global Aerosol Synthesis and Science Project (GASSP) aims to improve the robustness of global aerosol models by developing new methodologies for quantifying model uncertainty, creating an extensive global dataset of aerosol in situ microphysical and chemical measurements, and developing new ways to assess the uncertainty associated with comparing sparse point measurements with low-resolution models. GASSP has assembled over 45,000 hours of measurements from ships and aircraft as well as data from over 350 ground stations. The measurements have been harmonized into a standardized format that is easily used by modelers and nonspecialist users. Available measurements are extensive, but they are biased to polluted regions of the Northern Hemisphere, leaving large pristine regions and many continental areas poorly sampled. The aerosol radiative forcing uncertainty can be reduced using a rigorous model–data synthesis approach. Nevertheless, our research highlights significant remaining challenges because of the difficulty of constraining many interwoven model uncertainties simultaneously. Although the physical realism of global aerosol models still needs to be improved, the uncertainty in aerosol radiative forcing will be reduced most effectively by systematically and rigorously constraining the models using extensive syntheses of measurements.

  10. The Global Aerosol Synthesis and Science Project (GASSP): Measurements and Modeling to Reduce Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reddington, C. L.; Carslaw, K. S.; Stier, P.

    The largest uncertainty in the historical radiative forcing of climate is caused by changes in aerosol particles due to anthropogenic activity. Sophisticated aerosol microphysics processes have been included in many climate models in an effort to reduce the uncertainty. However, the models are very challenging to evaluate and constrain because they require extensive in situ measurements of the particle size distribution, number concentration, and chemical composition that are not available from global satellite observations. The Global Aerosol Synthesis and Science Project (GASSP) aims to improve the robustness of global aerosol models by developing new methodologies for quantifying model uncertainty, creating an extensive global dataset of aerosol in situ microphysical and chemical measurements, and developing new ways to assess the uncertainty associated with comparing sparse point measurements with low-resolution models. GASSP has assembled over 45,000 hours of measurements from ships and aircraft as well as data from over 350 ground stations. The measurements have been harmonized into a standardized format that is easily used by modelers and nonspecialist users. Available measurements are extensive, but they are biased to polluted regions of the Northern Hemisphere, leaving large pristine regions and many continental areas poorly sampled. The aerosol radiative forcing uncertainty can be reduced using a rigorous model–data synthesis approach. Nevertheless, our research highlights significant remaining challenges because of the difficulty of constraining many interwoven model uncertainties simultaneously. Although the physical realism of global aerosol models still needs to be improved, the uncertainty in aerosol radiative forcing will be reduced most effectively by systematically and rigorously constraining the models using extensive syntheses of measurements.

  11. Effects of Uncertainties in Hydrological Modelling. A Case Study of a Mountainous Catchment in Southern Norway

    NASA Astrophysics Data System (ADS)

    Engeland, Kolbjorn; Steinsland, Ingelin

    2016-04-01

    The aim of this study is to investigate how the inclusion of uncertainties in inputs and observed streamflow influences the parameter estimation, streamflow predictions and model evaluation. In particular we wanted to answer the following research questions: • What is the effect of including a random error in the precipitation and temperature inputs? • What is the effect of decreased information about precipitation by excluding the nearest precipitation station? • What is the effect of the uncertainty in streamflow observations? • What is the effect of reduced information about the true streamflow, by using a rating curve estimated with the highest and lowest streamflow measurements excluded? To answer these questions, we designed a set of calibration experiments and evaluation strategies. We used the elevation-distributed HBV model operating on daily time steps, combined with a Bayesian formulation and the MCMC routine DREAM for parameter inference. The uncertainties in inputs were represented by creating ensembles of precipitation and temperature. The precipitation ensembles were created using a meta-Gaussian random field approach. The temperature ensembles were created using 3D Bayesian kriging with random sampling of the temperature lapse rate. The streamflow ensembles were generated by a Bayesian multi-segment rating curve model. Precipitation and temperatures were randomly sampled for every day, whereas the streamflow ensembles were generated from rating curve ensembles, and the same rating curve was always used for the whole time series in a calibration or evaluation run. We chose a catchment with a meteorological station measuring precipitation and temperature, and a rating curve of relatively high quality. This allowed us to investigate and further test the effect of having less information on precipitation and streamflow during model calibration, predictions and evaluation. 
The results showed that including uncertainty in the precipitation and temperature input has a negligible effect on the posterior distribution of parameters and on the Nash-Sutcliffe (NS) efficiency for the predicted flows, while the reliability and the continuous rank probability score (CRPS) improve. Reduced information in the precipitation input resulted in a shift in the water balance parameter Pcorr and in a model producing smoother streamflow predictions, giving poorer NS and CRPS but higher reliability. The effect of calibrating the hydrological model using wrong rating curves is mainly seen as variability in the water balance parameter Pcorr. When evaluating predictions obtained using a wrong rating curve, the evaluation scores vary depending on the true rating curve. Generally, the best evaluation scores were achieved not for the rating curve used for calibration, but for rating curves giving low variance in the streamflow observations. Reduced information in streamflow influenced the water balance parameter Pcorr and increased the spread in evaluation scores, giving both better and worse scores. This case study shows that estimating the water balance is challenging since both precipitation inputs and streamflow observations have a pronounced systematic component in their uncertainties.
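
    The CRPS used in the evaluation can be estimated directly from an ensemble with the standard identity CRPS = mean|x_i − y| − ½·mean|x_i − x_j|; with invented members it behaves as described, rewarding ensembles that are both sharp and reliable:

```python
def crps_ensemble(members, obs):
    """CRPS of an ensemble forecast for a scalar observation:
    mean |member - obs| minus half the mean pairwise member distance."""
    n = len(members)
    term1 = sum(abs(m - obs) for m in members) / n
    term2 = sum(abs(a - b) for a in members for b in members) / (2 * n * n)
    return term1 - term2

obs = 10.0
sharp = [9.8, 10.0, 10.2]   # small spread, centred on the observation
broad = [6.0, 10.0, 14.0]   # same centre, much larger spread
score_sharp = crps_ensemble(sharp, obs)
score_broad = crps_ensemble(broad, obs)
```

    Lower is better: the sharp, well-centred ensemble scores far lower than the broad one, and a perfect deterministic forecast scores exactly zero, which is why smoother but better-calibrated predictions can trade NS against CRPS and reliability as described above.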

  12. Ensembles modeling approach to study Climate Change impacts on Wheat

    NASA Astrophysics Data System (ADS)

    Ahmed, Mukhtar; Stöckle, Claudio O.; Nelson, Roger; Higgins, Stewart

    2017-04-01

    Simulations of crop yield under climate variability are subject to uncertainties, and quantification of such uncertainties is essential for effective use of projected results in adaptation and mitigation strategies. In this study we evaluated the uncertainties related to crop-climate models using five crop growth simulation models (CropSyst, APSIM, DSSAT, STICS and EPIC) and 14 general circulation models (GCMs) for two representative concentration pathways, RCP4.5 and RCP8.5 (radiative forcing of 4.5 and 8.5 W m-2), in the Pacific Northwest (PNW), USA. The aim was to assess how accurately different process-based crop models could estimate winter wheat growth, development and yield. First, all models were calibrated for high-rainfall, medium-rainfall, low-rainfall and irrigated sites in the PNW using 1979-2010 as the baseline period. Response variables were related to farm management and soil properties, and included crop phenology, leaf area index (LAI), biomass and grain yield of winter wheat. All five models were run from 2000 to 2100 using the 14 GCMs and 2 RCPs to evaluate the effect of future climate (rainfall, temperature and CO2) on winter wheat phenology, LAI, biomass, grain yield and harvest index. Simulated time to flowering and maturity was reduced in all models except EPIC, with some level of uncertainty. All models generally predicted an increase in biomass and grain yield under elevated CO2, but this effect was more prominent under rainfed conditions than under irrigation. However, there was uncertainty in the simulation of crop phenology, biomass and grain yield across the 14 GCMs during the three prediction periods (2030, 2050 and 2070). We concluded that to improve accuracy and consistency in simulating wheat growth dynamics and yield under a changing climate, a multimodel ensemble approach should be used.

  13. Uncertainty and sensitivity assessments of an agricultural-hydrological model (RZWQM2) using the GLUE method

    NASA Astrophysics Data System (ADS)

    Sun, Mei; Zhang, Xiaolin; Huo, Zailin; Feng, Shaoyuan; Huang, Guanhua; Mao, Xiaomin

    2016-03-01

    Quantitatively ascertaining and analyzing the effects of model uncertainty on model reliability is a focal point for agricultural-hydrological models because of the many uncertainties in their inputs and processes. In this study, the generalized likelihood uncertainty estimation (GLUE) method with Latin hypercube sampling (LHS) was used to evaluate the uncertainty of the RZWQM-DSSAT (RZWQM2) model output responses and the sensitivity of 25 parameters related to soil properties, nutrient transport and crop genetics. To avoid the one-sided risk of model prediction caused by using a single calibration criterion, a combined likelihood (CL) function integrating information on water, nitrogen, and crop production was introduced in the GLUE analysis for the predictions of the following four model output responses: the total amount of water content (T-SWC) and nitrate nitrogen (T-NIT) within the 1-m soil profile, and the seed yields of waxy maize (Y-Maize) and winter wheat (Y-Wheat). In the process of evaluating RZWQM2, measurements and meteorological data were obtained from a field experiment involving a winter wheat and waxy maize crop rotation system conducted from 2003 to 2004 in southern Beijing. The calibration and validation results indicated that the RZWQM2 model can be used to simulate crop growth and water-nitrogen migration and transformation in a wheat-maize rotation planting system. The uncertainty analysis using the GLUE method showed that T-NIT was sensitive to parameters related to the nitrification coefficient, maize growth characteristics in the seedling period, the wheat vernalization period, and the wheat photoperiod. Parameters for soil saturated hydraulic conductivity, nitrogen nitrification and denitrification, and urea hydrolysis played an important role in the crop yield components. The prediction errors for RZWQM2 outputs with the CL function were relatively lower and more uniform compared with likelihood functions composed of individual calibration criteria. 
This new and successful application of the GLUE method for determining the uncertainty and sensitivity of the RZWQM2 could provide a reference for the optimization of model parameters with different emphases according to research interests.
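
    The GLUE-with-LHS recipe itself is short: Latin hypercube sample the parameter space, score each set with an (informal) likelihood, keep the behavioural sets above a threshold and weight them. The toy linear simulator, synthetic observations, Nash-Sutcliffe likelihood and 0.5 threshold below are all invented stand-ins for RZWQM2 and the CL function:

```python
import random

def simulator(theta, x):            # toy stand-in for the process model
    slope, intercept = theta
    return slope * x + intercept

obs_x = [1.0, 2.0, 3.0, 4.0, 5.0]
obs_y = [2.1, 3.9, 6.2, 8.0, 9.8]   # synthetic observations (slope ~ 2)

def latin_hypercube(n, bounds, rng):
    """One stratified-and-shuffled column per parameter."""
    cols = []
    for lo, hi in bounds:
        strata = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(strata)
        cols.append([lo + s * (hi - lo) for s in strata])
    return list(zip(*cols))

mean_obs = sum(obs_y) / len(obs_y)
def nse(theta):                     # Nash-Sutcliffe efficiency as likelihood
    sse = sum((simulator(theta, x) - y) ** 2 for x, y in zip(obs_x, obs_y))
    return 1.0 - sse / sum((y - mean_obs) ** 2 for y in obs_y)

rng = random.Random(3)
samples = latin_hypercube(500, [(0.0, 4.0), (-2.0, 2.0)], rng)
scored = [(theta, nse(theta)) for theta in samples]

# GLUE: retain "behavioural" sets above a threshold, weight by likelihood
behavioural = [(theta, w) for theta, w in scored if w > 0.5]
wsum = sum(w for _, w in behavioural)
post_slope = sum(theta[0] * w for theta, w in behavioural) / wsum
```

    The weighted behavioural sets define the GLUE uncertainty bounds; choosing the likelihood measure and threshold is exactly where a combined criterion such as the CL function changes which parameter sets survive.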

  14. Methods and Tools for Evaluating Uncertainty in Ecological Models: A Survey

    EPA Science Inventory

    Poster presented at the Ecological Society of America Meeting. Ecologists are familiar with a variety of uncertainty techniques, particularly in the intersection of maximum likelihood parameter estimation and Monte Carlo analysis techniques, as well as a recent increase in Baye...

  15. Formal modeling of a system of chemical reactions under uncertainty.

    PubMed

    Ghosh, Krishnendu; Schlipf, John

    2014-10-01

    We describe a novel formalism representing a system of chemical reactions, with imprecise reaction rates and chemical concentrations, and describe a model reduction method, pruning, based on the chemical properties. We present two algorithms, midpoint approximation and interval approximation, for the construction of efficient model abstractions with uncertainty in the data. We evaluate computational feasibility by posing queries in computation tree logic (CTL) on a prototype of the extracellular signal-regulated kinase (ERK) pathway.
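The contrast between the two approximations can be illustrated with elementary interval arithmetic on a mass-action rate law. The rate law and numeric bounds below are hypothetical and far simpler than the paper's CTL-based formalism.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float
    def __mul__(self, other):
        # interval product: take extremes over all endpoint combinations
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))
    def mid(self):
        return 0.5 * (self.lo + self.hi)

# Uncertain rate constant and concentrations for v = k[A][B] (hypothetical)
k = Interval(0.8, 1.2)
A = Interval(1.9, 2.1)
B = Interval(0.45, 0.55)

v_int = k * A * B                        # interval approximation: guaranteed enclosure
v_mid = k.mid() * A.mid() * B.mid()      # midpoint approximation: single representative rate
print(v_int, v_mid)
```

The interval result encloses every rate consistent with the data, while the midpoint collapses the uncertainty to one cheap representative value.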

  16. Governing Laws of Complex System Predictability under Co-evolving Uncertainty Sources: Theory and Nonlinear Geophysical Applications

    NASA Astrophysics Data System (ADS)

    Perdigão, R. A. P.

    2017-12-01

    Predictability assessments are traditionally made on a case-by-case basis, often by running the particular model of interest with randomly perturbed initial/boundary conditions and parameters, producing computationally expensive ensembles. These approaches provide a lumped statistical view of uncertainty evolution, without eliciting the fundamental processes and interactions at play in the uncertainty dynamics. In order to address these limitations, we introduce a systematic dynamical framework for predictability assessment and forecast, by analytically deriving governing equations of predictability in terms of the fundamental architecture of dynamical systems, independent of any particular problem under consideration. The framework further relates multiple uncertainty sources along with their coevolutionary interplay, enabling a comprehensive and explicit treatment of uncertainty dynamics over time, without requiring the actual model to be run. In doing so, computational resources are freed and a quick and effective a priori systematic evaluation is made of predictability evolution and its challenges, including aspects of the model architecture and intervening variables that may require optimization before initiating any model runs. It further brings out universal dynamic features in the error dynamics elusive to any case-specific treatment, ultimately shedding fundamental light on the challenging issue of predictability. The formulated approach, framed with broad mathematical-physics generality in mind, is then implemented in dynamic models of nonlinear geophysical systems with various degrees of complexity, in order to evaluate their limitations and provide informed assistance on how to optimize their design and improve their predictability in fundamental dynamical terms.

  17. Climate data induced uncertainty in model based estimations of terrestrial primary productivity

    NASA Astrophysics Data System (ADS)

    Wu, Z.; Ahlström, A.; Smith, B.; Ardö, J.; Eklundh, L.; Fensholt, R.; Lehsten, V.

    2016-12-01

    Models used to project global vegetation and the carbon cycle differ in their estimates of historical fluxes and pools. These differences arise not only from differences between models but also from differences in the environmental and climatic data that force the models. Here we investigate the role of uncertainties in historical climate data, encapsulated by a set of six historical climate datasets. We focus on terrestrial gross primary productivity (GPP) and analyze the results from a dynamic process-based vegetation model (LPJ-GUESS) forced by six different climate datasets and two empirical datasets of GPP (derived from flux towers and remote sensing). We find that the climate-induced uncertainty, defined as the difference among historical simulations in GPP when forcing the model with the different climate datasets, can be as high as 33 Pg C yr-1 globally (19% of mean GPP). The uncertainty is partitioned into the three main climatic drivers: temperature, precipitation, and shortwave radiation. Additionally, we illustrate how the uncertainty due to a given climate driver depends both on the magnitude of the forcing data uncertainty (the data range) and on the sensitivity of the modeled GPP to the driver (the ecosystem sensitivity). The analysis is performed globally and stratified into five land cover classes. We find that the dynamic vegetation model overestimates GPP, compared to empirically based GPP data, over most areas except the tropical region. Both the simulations and the empirical estimates agree that the tropical region is a disproportionate source of uncertainty in GPP estimation. This is mainly caused by uncertainties in the shortwave radiation forcing, for which the climate data range contributes slightly more uncertainty than the ecosystem sensitivity to shortwave radiation. We also find that precipitation dominates the climate-induced uncertainty over nearly half of the terrestrial vegetated surface, mainly due to the large ecosystem sensitivity to precipitation. Overall, climate data ranges are found to contribute more to the climate-induced uncertainty than ecosystem sensitivity. Our study highlights the need to better constrain tropical climate and demonstrates that uncertainty caused by climatic forcing data must be considered when comparing and evaluating model results and empirical datasets.
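The decomposition of a driver's uncertainty contribution into forcing-data range times ecosystem sensitivity can be sketched with a one-at-a-time swing calculation. The GPP response function and the six "dataset" values below are invented stand-ins for LPJ-GUESS and the historical climate datasets.

```python
import numpy as np

def gpp(temp, precip, rad):
    # Toy GPP response (hypothetical; stands in for a full vegetation model)
    return 0.5 * np.sqrt(precip) * rad * np.exp(-((temp - 25.0) / 15.0) ** 2)

# Driver values from six hypothetical climate datasets that disagree
drivers = {
    "temperature":   np.array([24.0, 25.5, 23.8, 26.1, 25.0, 24.4]),    # deg C
    "precipitation": np.array([900.0, 1100.0, 950.0, 1050.0, 1000.0, 980.0]),  # mm/yr
    "radiation":     np.array([180.0, 195.0, 170.0, 200.0, 185.0, 190.0]),     # W/m2
}
base = [v.mean() for v in drivers.values()]

def contribution(index, vals):
    # swing one driver over its data range while holding the others at their means;
    # the GPP change is (data range) x (ecosystem sensitivity) in one number
    lo, hi = list(base), list(base)
    lo[index], hi[index] = vals.min(), vals.max()
    return abs(gpp(*hi) - gpp(*lo))

contrib = {name: contribution(i, vals)
           for i, (name, vals) in enumerate(drivers.items())}
for name, c in contrib.items():
    print(f"{name}: {c:.1f}")
```

A small data range can still yield a large contribution if the modeled GPP is very sensitive to that driver, and vice versa, which is the distinction the abstract draws.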

  18. A Practical Approach to Address Uncertainty in Stakeholder Deliberations.

    PubMed

    Gregory, Robin; Keeney, Ralph L

    2017-03-01

    This article addresses the difficulties of incorporating uncertainty about consequence estimates into stakeholder deliberations involving multiple alternatives. Although every prediction of future consequences necessarily involves uncertainty, a large gap exists between common practices for addressing uncertainty in stakeholder deliberations and the procedures of prescriptive decision-aiding models advanced by risk and decision analysts. We review the treatment of uncertainty at four main phases of the deliberative process: with experts asked to describe possible consequences of competing alternatives, with stakeholders who function both as individuals and as members of coalitions, with the stakeholder committee composed of all stakeholders, and with decision makers. We develop and recommend a model that uses certainty equivalents as a theoretically robust and practical approach for helping diverse stakeholders incorporate uncertainties when evaluating multiple-objective alternatives as part of public policy decisions. © 2017 Society for Risk Analysis.
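A certainty equivalent can be computed once a utility function is assumed. The exponential (constant risk tolerance) utility below is a common illustrative choice, not necessarily the form the authors recommend, and the outcomes are hypothetical.

```python
import math

def certainty_equivalent(outcomes, probs, risk_tolerance):
    # Utility u(x) = -exp(-x / R) with risk tolerance R; the certainty
    # equivalent is the sure amount whose utility equals expected utility.
    eu = sum(p * -math.exp(-x / risk_tolerance) for x, p in zip(outcomes, probs))
    return -risk_tolerance * math.log(-eu)

# A risky alternative: 50% chance of 100 units of benefit, 50% chance of 0
ce = certainty_equivalent([100.0, 0.0], [0.5, 0.5], risk_tolerance=50.0)
print(round(ce, 2))  # below the expected value of 50, reflecting risk aversion
```

Replacing each uncertain consequence by its certainty equivalent lets stakeholders compare alternatives on sure values rather than on full distributions.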

  19. Neutron multiplicity counting: Confidence intervals for reconstruction parameters

    DOE PAGES

    Verbeke, Jerome M.

    2016-03-09

    From nuclear materials accountability to homeland security, the need for improved nuclear material detection, assay, and authentication has grown over the past decades. Starting in the 1940s, neutron multiplicity counting techniques have enabled quantitative evaluation of masses and multiplications of fissile materials. In this paper, we propose a new method to compute uncertainties on these parameters using a model-based sequential Bayesian processor, resulting in credible regions in the fissile material mass and multiplication space. These uncertainties will enable us to quantitatively evaluate proposed improvements to the theoretical fission chain model. Additionally, because the processor can calculate uncertainties in real time, it is a useful tool in applications such as portal monitoring: monitoring can stop as soon as a preset confidence of non-threat is reached.
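The flavour of sequential Bayesian processing of count data can be sketched with a conjugate Gamma-Poisson update on a count rate. The paper's processor is model-based on full fission chains; the prior, dwell time, and counts below are hypothetical stand-ins.

```python
import math

# Gamma(alpha, beta) prior on the neutron count rate; Poisson counts per
# dwell interval. The conjugate update keeps the posterior in closed form,
# so uncertainty can be refreshed after every interval in real time.
alpha, beta = 1.0, 1.0             # weakly informative prior (assumed)
dwell = 1.0                        # seconds per measurement interval
counts = [4, 6, 5, 7, 3, 5, 6, 4]  # hypothetical counts per interval

for c in counts:
    alpha += c                     # sequential conjugate update
    beta += dwell

post_mean = alpha / beta           # posterior mean rate (counts/s)
post_sd = math.sqrt(alpha) / beta  # posterior standard deviation
print(f"rate ~ {post_mean:.2f} +/- {post_sd:.2f} counts/s")
```

A monitoring loop would stop as soon as the posterior credible interval lies entirely below (or above) a preset threshold, mirroring the portal-monitoring use case.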

  20. Improved representation of situational awareness within a dismounted small combat unit constructive simulation

    NASA Astrophysics Data System (ADS)

    Lee, K. David; Colony, Mike

    2011-06-01

    Modeling and simulation has been established as a cost-effective means of supporting the development of requirements, exploring doctrinal alternatives, assessing system performance, and performing design trade-off analysis. The Army's constructive simulation for the evaluation of equipment effectiveness in small combat unit operations is currently limited to a representation of situation awareness that excludes the many uncertainties associated with real-world combat environments. The goal of this research is to provide an ability to model situation awareness and decision-process uncertainties in order to improve evaluation of the impact of battlefield equipment on ground soldier and small combat unit decision processes. Our Army Probabilistic Inference and Decision Engine (Army-PRIDE) system provides this required uncertainty modeling through the application of two critical techniques that allow Bayesian network technology to be applied to real-time applications: an Object-Oriented Bayesian Network methodology and an Object-Oriented Inference technique. In this research, we implement decision-process and situation awareness models for a reference scenario using Army-PRIDE and demonstrate its ability to model a variety of uncertainty elements, including confidence of source, information completeness, and information loss. We also demonstrate, through Monte Carlo simulation, that Army-PRIDE improves the realism of the current constructive simulation's decision processes.

  1. Evaluation of thermal cameras in quality systems according to ISO 9000 or EN 45000 standards

    NASA Astrophysics Data System (ADS)

    Chrzanowski, Krzysztof

    2001-03-01

    According to the international standards ISO 9001-9004 and EN 45001-45003, industrial plants and accreditation laboratories that have implemented quality systems according to these standards are required to evaluate the uncertainty of measurements. Manufacturers of thermal cameras do not offer any data that could enable estimation of the measurement uncertainty of these imagers. Difficulties in determining the measurement uncertainty are an important limitation of thermal cameras for applications in industrial plants and in the cooperating accreditation laboratories that have implemented these quality systems. A set of parameters for characterizing commercial thermal cameras, a measuring setup, some results of testing these cameras, a mathematical model of uncertainty, and software that enables quick calculation of the uncertainty of temperature measurements with thermal cameras are presented in this paper.
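A standard way to combine such uncertainty components, consistent with the quality-system requirements the record describes, is a GUM-style root-sum-of-squares of independent standard uncertainties. The component names and values below are illustrative, not taken from any camera datasheet or from the paper's model.

```python
import math

# Combined standard uncertainty of a thermal-camera temperature reading:
# root-sum-of-squares of independent components (values are assumptions).
components = {
    "calibration": 0.5,         # K, from a blackbody calibration certificate
    "emissivity_setting": 0.8,  # K, sensitivity x emissivity uncertainty
    "ambient_reflection": 0.3,  # K, reflected ambient radiation
    "noise_NETD": 0.1,          # K, temporal noise
}

u_c = math.sqrt(sum(u ** 2 for u in components.values()))  # combined standard uncertainty
U = 2 * u_c                                                # expanded, coverage factor k = 2 (~95%)
print(f"u_c = {u_c:.2f} K, U(k=2) = {U:.2f} K")
```

The quadrature sum is dominated by the largest component, which is why characterizing emissivity-related effects tends to matter most in practice.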

  2. Robust control of seismically excited cable stayed bridges with MR dampers

    NASA Astrophysics Data System (ADS)

    YeganehFallah, Arash; Khajeh Ahamd Attari, Nader

    2017-03-01

    In recent decades, active and semi-active structural control have become attractive alternatives for enhancing the performance of civil infrastructure subjected to seismic and wind loads. However, reliable active and semi-active control requires that information about uncertainties be included in the design of the controller. In the real world, parameters of civil structures such as loading locations, stiffness, mass, and damping are time-variant and uncertain; these uncertainties are in many cases modeled as parametric uncertainties. The motivation of this research is to design a robust controller for attenuating the vibrational response of civil infrastructure in the presence of its dynamical uncertainties. Uncertainties in structural dynamics parameters are modeled as affine uncertainties in the state-space model. These uncertainties are decoupled from the system through a Linear Fractional Transformation (LFT) and are treated as unknown but norm-bounded inputs to the system. A robust H∞ controller is designed for the decoupled system to regulate the evaluation outputs, and it is robust to the effects of uncertainties, disturbances, and sensor noise. The cable-stayed bridge benchmark equipped with MR dampers is considered for the numerical simulation. The simulation results show that the proposed robust controller can effectively mitigate the effects of undesired uncertainties on the system's response under seismic loading.

  3. ISMIP6 - initMIP: Greenland ice sheet model initialisation experiments

    NASA Astrophysics Data System (ADS)

    Goelzer, Heiko; Nowicki, Sophie; Payne, Tony; Larour, Eric; Abe Ouchi, Ayako; Gregory, Jonathan; Lipscomb, William; Seroussi, Helene; Shepherd, Andrew; Edwards, Tamsin

    2016-04-01

    Earlier large-scale Greenland ice sheet sea-level projections, e.g. those run during the ice2sea and SeaRISE initiatives, have shown that ice sheet initialisation can have a large effect on the projections and gives rise to important uncertainties. This intercomparison exercise (initMIP) aims to compare, evaluate, and improve the initialisation techniques used in the ice sheet modelling community and to estimate the associated uncertainties. It is the first in a series of ice sheet model intercomparison activities within ISMIP6 (Ice Sheet Model Intercomparison Project for CMIP6). The experiments are conceived for the large-scale Greenland ice sheet and are designed to allow intercomparison between participating models of 1) the initial present-day state of the ice sheet and 2) the response in two schematic forward experiments. The latter experiments serve to evaluate the initialisation in terms of model drift (a forward run without any forcing) and the response to a large perturbation (a prescribed surface mass balance anomaly). We present and discuss first results of the intercomparison and highlight important uncertainties with respect to projections of the Greenland ice sheet's sea-level contribution.

  4. Parameter sensitivity analysis of a 1-D cold region lake model for land-surface schemes

    NASA Astrophysics Data System (ADS)

    Guerrero, José-Luis; Pernica, Patricia; Wheater, Howard; Mackay, Murray; Spence, Chris

    2017-12-01

    Lakes might be sentinels of climate change, but the uncertainty in their main feedback to the atmosphere - heat-exchange fluxes - is often not considered within climate models. Additionally, these fluxes are seldom measured, hindering critical evaluation of model output. Analysis of the Canadian Small Lake Model (CSLM), a one-dimensional integral lake model, was performed to assess its ability to reproduce diurnal and seasonal variations in heat fluxes and the sensitivity of simulated fluxes to changes in model parameters, i.e., turbulent transport parameters and the light extinction coefficient (Kd). A C++ open-source software package, Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), was used to perform sensitivity analysis (SA) and identify the parameters that dominate model behavior. The generalized likelihood uncertainty estimation (GLUE) method was applied to quantify the fluxes' uncertainty, comparing daily-averaged eddy-covariance observations to the output of CSLM. Seven qualitative and two quantitative SA methods were tested, and the posterior likelihoods of the modeled parameters, obtained from the GLUE analysis, were used to determine the dominant parameters and the uncertainty in the modeled fluxes. Despite the ubiquity of the equifinality issue - different parameter-value combinations yielding equivalent results - the answer was unequivocal: Kd, a measure of how much light penetrates the lake, dominates the sensible and latent heat fluxes, and the uncertainty in their estimates is strongly related to the accuracy with which Kd is determined. This is important since accurate and continuous measurements of Kd could reduce modeling uncertainty.

  5. Monte Carlo analysis of uncertainty propagation in a stratospheric model. 1: Development of a concise stratospheric model

    NASA Technical Reports Server (NTRS)

    Rundel, R. D.; Butler, D. M.; Stolarski, R. S.

    1977-01-01

    A concise model has been developed that analyzes uncertainties in stratospheric perturbations, uses a minimum of computer time, and is complete enough to represent the results of more complex models. The steady-state model applies iteration to achieve coupling between interacting species. The species concentrations are determined from diffusion equations with appropriate sources and sinks. Diurnal effects due to chlorine nitrate formation are accounted for by analytic approximation. The model has been used to evaluate steady-state perturbations due to injections of chlorine and NO(X).
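The iterative coupling of interacting species at steady state can be sketched with a two-species toy system. The reactions and the values of S, k1, k2 below are invented for illustration and are far simpler than the stratospheric chemistry in the paper.

```python
# Steady-state coupling of two interacting species by fixed-point iteration.
# Species A is produced at rate S and lost by reaction with B; B is produced
# from that loss and removed linearly:
#   0 = S - k1*A*B        =>  A = S / (k1 * B)
#   0 = k1*A*B - k2*B     =>  B = k1 * A * B / k2   (previous iterate on the rhs)
S, k1, k2 = 1.0, 0.5, 2.0   # illustrative source and rate constants
A, B = 1.0, 1.0             # initial guesses

for _ in range(200):
    A_new = S / (k1 * B)          # update A from the current B
    B_new = k1 * A_new * B / k2   # update B using the fresh A
    A, B = A_new, B_new

print(f"steady state: A = {A}, B = {B}")
```

At convergence both right-hand sides vanish, so the residuals of the two balance equations are zero, which is exactly the coupled steady state the iteration seeks.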

  6. Constructive Epistemic Modeling: A Hierarchical Bayesian Model Averaging Method

    NASA Astrophysics Data System (ADS)

    Tsai, F. T. C.; Elshall, A. S.

    2014-12-01

    Constructive epistemic modeling is the idea that our understanding of a natural system through a scientific model is a mental construct that continually develops through learning about and from the model. Using the hierarchical Bayesian model averaging (HBMA) method [1], this study shows that segregating different uncertain model components through a BMA tree of posterior model probabilities, model prediction, within-model variance, between-model variance and total model variance serves as a learning tool [2]. First, the BMA tree of posterior model probabilities permits the comparative evaluation of the candidate propositions of each uncertain model component. Second, systemic model dissection is imperative for understanding the individual contribution of each uncertain model component to the model prediction and variance. Third, the hierarchical representation of the between-model variance facilitates the prioritization of the contribution of each uncertain model component to the overall model uncertainty. We illustrate these concepts using the groundwater modeling of a siliciclastic aquifer-fault system. The sources of uncertainty considered are from geological architecture, formation dip, boundary conditions and model parameters. The study shows that the HBMA analysis helps in advancing knowledge about the model rather than forcing the model to fit a particular understanding or merely averaging several candidate models. [1] Tsai, F. T.-C., and A. S. Elshall (2013), Hierarchical Bayesian model averaging for hydrostratigraphic modeling: Uncertainty segregation and comparative evaluation. Water Resources Research, 49, 5520-5536, doi:10.1002/wrcr.20428. [2] Elshall, A.S., and F. T.-C. Tsai (2014). Constructive epistemic modeling of groundwater flow with geological architecture and boundary condition uncertainty under Bayesian paradigm, Journal of Hydrology, 517, 105-119, doi: 10.1016/j.jhydrol.2014.05.027.
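The within-model/between-model variance split at one node of a BMA tree follows the law of total variance. The posterior model probabilities and predictive moments below are hypothetical numbers, not from the aquifer-fault case study.

```python
import numpy as np

# BMA variance decomposition: the total predictive variance splits into the
# probability-weighted within-model variance plus the between-model variance.
p = np.array([0.5, 0.3, 0.2])        # posterior model probabilities (assumed)
mean = np.array([10.0, 12.0, 9.0])   # each model's predictive mean (assumed)
var = np.array([1.0, 2.0, 0.5])      # each model's predictive variance (assumed)

bma_mean = np.sum(p * mean)                       # averaged prediction
within = np.sum(p * var)                          # weighted within-model variance
between = np.sum(p * (mean - bma_mean) ** 2)      # spread of the model means
total = within + between                          # total predictive variance
print(bma_mean, within, between, total)
```

In the HBMA tree this decomposition is applied recursively at every level, which is what lets the contribution of each uncertain model component be ranked.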

  7. Probabilistic evaluation of damage potential in earthquake-induced liquefaction in a 3-D soil deposit

    NASA Astrophysics Data System (ADS)

    Halder, A.; Miller, F. J.

    1982-03-01

    A probabilistic model to evaluate the risk of liquefaction at a site, and to limit or eliminate damage during earthquake-induced liquefaction, is proposed. The model is extended to consider three-dimensional nonhomogeneous soil properties. The parameters relevant to the liquefaction phenomenon are identified, including: (1) soil parameters; (2) parameters required to consider laboratory test and sampling effects; and (3) loading parameters. The fundamentals of risk-based design concepts pertinent to liquefaction are reviewed. A detailed statistical evaluation of the soil parameters in the proposed liquefaction model is provided, and the uncertainty associated with the estimation of in situ relative density is evaluated for both direct and indirect methods. It is found that, for liquefaction potential, the uncertainties in the load parameters could be higher than those in the resistance parameters.

  8. Validation and uncertainty analysis of a pre-treatment 2D dose prediction model

    NASA Astrophysics Data System (ADS)

    Baeza, Jose A.; Wolfs, Cecile J. A.; Nijsten, Sebastiaan M. J. J. G.; Verhaegen, Frank

    2018-02-01

    Independent verification of complex treatment delivery with megavolt photon beam radiotherapy (RT) has been effectively used to detect and prevent errors. This work presents the validation and uncertainty analysis of a model that predicts 2D portal dose images (PDIs) without a patient or phantom in the beam. The prediction model is based on an exponential point dose model with separable primary and secondary photon fluence components. The model includes a scatter kernel, off-axis ratio map, transmission values and penumbra kernels for beam-delimiting components. These parameters were derived through a model fitting procedure supplied with point dose and dose profile measurements of radiation fields. The model was validated against a treatment planning system (TPS; Eclipse) and radiochromic film measurements for complex clinical scenarios, including volumetric modulated arc therapy (VMAT). Confidence limits on fitted model parameters were calculated based on simulated measurements. A sensitivity analysis was performed to evaluate the effect of the parameter uncertainties on the model output. For the maximum uncertainty, the maximum deviating measurement sets were propagated through the fitting procedure and the model. The overall uncertainty was assessed using all simulated measurements. The validation of the prediction model against the TPS and the film showed good agreement, with on average 90.8% and 90.5% of pixels passing a (2%, 2 mm) global gamma analysis, respectively, with a low-dose threshold of 10%. The maximum and overall uncertainty of the model are dependent on the type of clinical plan used as input. The results can be used to study the robustness of the model. A model for predicting accurate 2D pre-treatment PDIs in complex RT scenarios can be used clinically and its uncertainties can be taken into account.

  9. Plant Uptake of Organic Pollutants from Soil: A Critical Review of Bioconcentration Estimates Based on Models and Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKone, Thomas E.; Maddalena, Randy L.

    2007-01-01

    The role of terrestrial vegetation in transferring chemicals from soil and air into specific plant tissues (stems, leaves, roots, etc.) is still not well characterized. We provide here a critical review of plant-to-soil bioconcentration ratio (BCR) estimates based on models and experimental data. This review includes the conceptual and theoretical formulations of the bioconcentration ratio, constructing and calibrating empirical and mathematical algorithms to describe this ratio, and the experimental data used to quantify BCRs and calibrate model performance. We first evaluate the theoretical basis for the BCR concept and BCR models and consider how lack of knowledge and data limits the reliability and consistency of BCR estimates. We next consider alternate modeling strategies for BCR. A key focus of this evaluation is the relative contributions to overall uncertainty from model uncertainty versus variability in the experimental data used to develop and test the models. As a case study, we consider a single chemical, hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX), and focus on the variability of bioconcentration measurements obtained from 81 experiments with different plant species, different plant tissues, different experimental conditions, and different methods for reporting concentrations in the soil and plant tissues. We use these observations to evaluate the magnitude of experimental variability in plant bioconcentration and compare this to model uncertainty. Among these 81 measurements, the variation of the plant/soil BCR has a geometric standard deviation (GSD) of 3.5 and a coefficient of variability (CV, the ratio of arithmetic standard deviation to mean) of 1.7. These variations are significant but low relative to model uncertainties, which have an estimated GSD of 10 with a corresponding CV of 14.
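The GSD and CV statistics quoted above are computed as follows; the BCR values here are hypothetical placeholders, not the 81 RDX measurements.

```python
import math

# Hypothetical plant/soil bioconcentration ratios (dimensionless)
bcr = [0.12, 0.45, 0.08, 1.3, 0.3, 0.9, 0.2, 0.6]
n = len(bcr)

# Coefficient of variability: arithmetic standard deviation over the mean
mean = sum(bcr) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in bcr) / (n - 1))
cv = sd / mean

# Geometric standard deviation: exponential of the sd of the log-values,
# the natural spread measure for a roughly lognormal quantity like BCR
logs = [math.log(x) for x in bcr]
mu = sum(logs) / n
sigma = math.sqrt(sum((l - mu) ** 2 for l in logs) / (n - 1))
gsd = math.exp(sigma)

print(f"CV = {cv:.2f}, GSD = {gsd:.2f}")
```

Because BCRs span orders of magnitude, the GSD (a multiplicative spread: typical values fall within mean/GSD to mean*GSD) is often more informative than the additive CV.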

  10. Prediction uncertainty and data worth assessment for groundwater transport times in an agricultural catchment

    NASA Astrophysics Data System (ADS)

    Zell, Wesley O.; Culver, Teresa B.; Sanford, Ward E.

    2018-06-01

    Uncertainties about the age of base-flow discharge can have serious implications for the management of degraded environmental systems where subsurface pathways, and the ongoing release of pollutants that accumulated in the subsurface during past decades, dominate the water quality signal. Numerical groundwater models may be used to estimate groundwater return times and base-flow ages and thus predict the time required for stakeholders to see the results of improved agricultural management practices. However, the uncertainty inherent in the relationship between (i) the observations of atmospherically-derived tracers that are required to calibrate such models and (ii) the predictions of system age that the observations inform has not been investigated. For example, few if any studies have assessed the uncertainty of numerically-simulated system ages or evaluated the uncertainty reductions that may result from the expense of collecting additional subsurface tracer data. In this study we combine numerical flow and transport modeling of atmospherically-derived tracers with prediction uncertainty methods to accomplish four objectives. First, we show the relative importance of head, discharge, and tracer information for characterizing response times in a uniquely data-rich catchment that includes 266 age-tracer measurements (SF6, CFCs, and 3H) in addition to long-term monitoring of water levels and stream discharge. Second, we calculate uncertainty intervals for model-simulated base-flow ages using both linear and non-linear methods, and find that the prediction sensitivity vector used by linear first-order second-moment methods results in much larger uncertainties than non-linear Monte Carlo methods operating on the same parameter uncertainty. 
Third, by combining prediction uncertainty analysis with multiple models of the system, we show that data-worth calculations and monitoring network design are sensitive to variations in the amount of water leaving the system via stream discharge and irrigation withdrawals. Finally, we demonstrate a novel model-averaged computation of potential data worth that can account for these uncertainties in model structure.
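The contrast between linear first-order second-moment (FOSM) propagation and Monte Carlo can be sketched on a prediction that is nonlinear in its uncertain parameter. The Darcy-style travel-time formula and all parameter values below are hypothetical, not the study's calibrated model, so no conclusion about which method gives larger intervals should be read from the numbers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Prediction: advective travel time t = L * n_e / (K * i), nonlinear in K
L, n_e, i = 100.0, 0.3, 0.01   # path length (m), effective porosity, gradient
K_mean, K_sd = 5.0, 1.5        # hydraulic conductivity (m/d), assumed uncertainty

def travel_time(K):
    return L * n_e / (K * i)

# FOSM: variance = (dt/dK)^2 * var(K), with the sensitivity taken at the mean
sens = -L * n_e / (K_mean ** 2 * i)
fosm_sd = abs(sens) * K_sd

# Monte Carlo with the same parameter uncertainty (truncated to stay positive)
K = rng.normal(K_mean, K_sd, 100_000)
K = K[K > 0.5]
mc_sd = travel_time(K).std()

print(f"FOSM sd = {fosm_sd:.0f} d, Monte Carlo sd = {mc_sd:.0f} d")
```

Because FOSM linearizes at a single point, the two standard deviations diverge as the prediction becomes more nonlinear over the plausible parameter range, which is the mechanism behind the interval differences the study reports.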

  11. Modeling Errors in Daily Precipitation Measurements: Additive or Multiplicative?

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Huffman, George J.; Adler, Robert F.; Tang, Ling; Sapiano, Matthew; Maggioni, Viviana; Wu, Huan

    2013-01-01

    The definition and quantification of uncertainty depend on the error model used. For uncertainties in precipitation measurements, two types of error models have been widely adopted: the additive error model and the multiplicative error model. This leads to incompatible specifications of uncertainties and impedes intercomparison and application. In this letter, we assess the suitability of both models for satellite-based daily precipitation measurements in an effort to clarify the uncertainty representation. Three criteria were employed to evaluate the applicability of either model: (1) better separation of the systematic and random errors; (2) applicability to the large range of variability in daily precipitation; and (3) better predictive skill. It is found that the multiplicative error model is a much better choice under all three criteria. It extracted the systematic errors more cleanly, was more consistent with the large variability of precipitation measurements, and produced superior predictions of the error characteristics. The additive error model had several weaknesses, such as nonconstant variance resulting from systematic errors leaking into random errors, and a lack of prediction capability. Therefore, the multiplicative error model is the better choice.
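The variance-constancy criterion can be demonstrated on synthetic data whose true error is multiplicative, in the spirit of the letter's finding. The gamma-distributed "truth", the log-error parameters, and the light/heavy split below are synthetic assumptions, not the letter's satellite data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic skewed "daily precipitation" with a multiplicative error:
# obs = truth * exp(e), e ~ N(0.2, 0.5)  (bias and spread are assumptions)
truth = rng.gamma(shape=0.5, scale=10.0, size=5000) + 0.1  # mm/day, skewed
obs = truth * np.exp(rng.normal(0.2, 0.5, truth.size))

# Additive error model: error = obs - truth; its spread grows with rain rate
add_err = obs - truth
# Multiplicative error model: work in log space; error is homoscedastic there
mult_err = np.log(obs) - np.log(truth)

# Compare error spread for light vs heavy rain
light = truth < np.median(truth)
print("additive sd, light vs heavy:",
      add_err[light].std(), add_err[~light].std())
print("multiplicative sd, light vs heavy:",
      mult_err[light].std(), mult_err[~light].std())
```

The additive error's spread inflates with rain rate (variance leaking from the systematic, rate-dependent part), while the log-space error keeps a nearly constant spread across rain rates, which is the letter's second criterion in miniature.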

  12. An open-source Java-based Toolbox for environmental model evaluation: The MOUSE Software Application

    USDA-ARS?s Scientific Manuscript database

    A consequence of environmental model complexity is that the task of understanding how environmental models work and identifying their sensitivities/uncertainties, etc. becomes progressively more difficult. Comprehensive numerical and visual evaluation tools have been developed such as the Monte Carl...

  13. Uncertainty Modeling for Robustness Analysis of Control Upset Prevention and Recovery Systems

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Khong, Thuan H.; Shin, Jong-Yeob; Kwatny, Harry; Chang, Bor-Chin; Balas, Gary J.

    2005-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems (developed for failure detection, identification, and reconfiguration, as well as upset recovery) need to be evaluated over broad regions of the flight envelope and under extreme flight conditions, and should include various sources of uncertainty. However, formulation of linear fractional transformation (LFT) models for representing system uncertainty can be very difficult for complex parameter-dependent systems. This paper describes a preliminary LFT modeling software tool which uses a matrix-based computational approach that can be directly applied to parametric uncertainty problems involving multivariate matrix polynomial dependencies. Several examples are presented (including an F-16 at an extreme flight condition, a missile model, and a generic example with numerous cross-product terms), and comparisons are given with other LFT modeling tools that are currently available. The LFT modeling method and preliminary software tool presented in this paper are shown to compare favorably with these methods.
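Once an LFT model has been formulated, evaluating it for a particular uncertainty value uses the standard upper-LFT formula. The matrix and partition sizes below are illustrative, not from the paper's tool.

```python
import numpy as np

def upper_lft(M, n_delta, Delta):
    # F_u(M, Delta) = M22 + M21 @ Delta @ inv(I - M11 @ Delta) @ M12,
    # the standard feedback interconnection that pulls uncertainty Delta
    # out of a nominal system M partitioned around an n_delta-sized channel.
    M11 = M[:n_delta, :n_delta]
    M12 = M[:n_delta, n_delta:]
    M21 = M[n_delta:, :n_delta]
    M22 = M[n_delta:, n_delta:]
    I = np.eye(n_delta)
    return M22 + M21 @ Delta @ np.linalg.solve(I - M11 @ Delta, M12)

M = np.array([[0.1, 1.0],
              [2.0, 3.0]])            # hypothetical partitioned system
# Delta = 0 recovers the nominal system M22
assert np.allclose(upper_lft(M, 1, np.zeros((1, 1))), M[1:, 1:])
d = np.array([[0.2]])                 # one scalar uncertainty block
print(upper_lft(M, 1, d))
```

Robustness analysis then asks whether (I - M11 @ Delta) stays invertible, and how the closed-loop map varies, over the whole admissible set of Delta, which is the question the paper's LFT models are built to pose.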

  14. Assessment of BTEX-induced health risk under multiple uncertainties at a petroleum-contaminated site: An integrated fuzzy stochastic approach

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodong; Huang, Guo H.

    2011-12-01

    Groundwater pollution has attracted more and more attention in the past decades. An assessment of groundwater contamination risk is desired to provide a sound basis for supporting risk-based management decisions. Therefore, the objective of this study is to develop an integrated fuzzy stochastic approach to evaluate the risks of BTEX-contaminated groundwater under multiple uncertainties. It consists of an integrated interval fuzzy subsurface modeling system (IIFMS) and an integrated fuzzy second-order stochastic risk assessment (IFSOSRA) model. The IIFMS is developed based on factorial design, interval analysis, and a fuzzy sets approach to predict contaminant concentrations under hybrid uncertainties. Two input parameters (longitudinal dispersivity and porosity) are considered to be uncertain with known fuzzy membership functions, and intrinsic permeability is considered to be an interval number with unknown distribution information. A factorial design is conducted to evaluate the interactive effects of the three uncertain factors on the modeling outputs through the developed IIFMS. The IFSOSRA model can systematically quantify variability and uncertainty, as well as their hybrids, presented as fuzzy, stochastic, and second-order stochastic parameters in health risk assessment. The developed approach has been applied to the management of a real-world petroleum-contaminated site in a western Canada context. The results indicate that multiple uncertainties, under a combination of information with various data-quality levels, can be effectively addressed to provide support in identifying proper remedial efforts. A unique contribution of this research is the development of an integrated fuzzy stochastic approach for handling various forms of uncertainties associated with simulation and risk assessment efforts.

  15. Improvements to Nuclear Data and Its Uncertainties by Theoretical Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danon, Yaron; Nazarewicz, Witold; Talou, Patrick

    2013-02-18

    This project addresses three important gaps in existing evaluated nuclear data libraries that represent a significant hindrance to highly advanced modeling and simulation capabilities for the Advanced Fuel Cycle Initiative (AFCI). This project will: develop advanced theoretical tools to compute prompt fission neutron and gamma-ray characteristics well beyond average spectra and multiplicity, and produce new evaluated files of U and Pu isotopes, along with some minor actinides; perform state-of-the-art fission cross-section modeling and calculations using global and microscopic model input parameters, leading to truly predictive fission cross-section capabilities, with consistent calculations performed for a suite of Pu isotopes; and implement innovative data assimilation tools, which will reflect the nuclear data evaluation process much more accurately and lead to a new generation of uncertainty quantification files. New covariance matrices will be obtained for Pu isotopes and compared to existing ones. The deployment of a fleet of safe and efficient advanced reactors that minimize radiotoxic waste and are proliferation-resistant is a clear and ambitious goal of AFCI. While in the past the design, construction and operation of a reactor were supported through empirical trials, this new phase in nuclear energy production is expected to rely heavily on advanced modeling and simulation capabilities. To be truly successful, a program for advanced simulations of innovative reactors will have to develop advanced multi-physics capabilities, run on massively parallel supercomputers, and incorporate adequate and precise underlying physics; all these areas have to be developed simultaneously to achieve those ambitious goals. Of particular interest are reliable fission cross-section uncertainty estimates (including important correlations) and evaluations of prompt fission neutron and gamma-ray spectra and uncertainties.

  16. Robust network data envelopment analysis approach to evaluate the efficiency of regional electricity power networks under uncertainty.

    PubMed

    Fathollah Bayati, Mohsen; Sadjadi, Seyed Jafar

    2017-01-01

    In this paper, new Network Data Envelopment Analysis (NDEA) models are developed to evaluate the efficiency of regional electricity power networks. The primary objective is to account for perturbation in data by developing new NDEA models based on an adaptation of robust optimization methodology. Furthermore, the efficiency of entire electricity power networks, involving generation, transmission, and distribution stages, is measured. While DEA has been widely used to evaluate the efficiency of individual components of electricity power networks during the past two decades, no previous study evaluates the efficiency of electricity power networks as a whole. The proposed models are applied to evaluate the efficiency of 16 regional electricity power networks in Iran, and the effect of data uncertainty is also investigated. The results are compared with those of traditional network DEA and parametric SFA methods. Validity and verification of the proposed models are also investigated. The preliminary results indicate that the proposed models are more reliable than the traditional network DEA model.
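
    The CCR-DEA building block behind such models can be illustrated in the special single-input, single-output case, where the efficiency linear program collapses to a ratio against the best-performing unit. The data below are hypothetical; the paper's models add network structure across stages and robust counterparts for data perturbation:

```python
# Sketch: CCR-DEA efficiency in the 1-input/1-output special case, where the
# linear program reduces to normalizing each unit's output/input ratio by the
# best ratio in the sample. Toy data for hypothetical regional networks.

def ccr_efficiency(inputs, outputs):
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

power_in = [100.0, 80.0, 120.0]   # e.g. operating cost per network
power_out = [90.0, 80.0, 96.0]    # e.g. delivered energy per network
print(ccr_efficiency(power_in, power_out))
```

    With more inputs and outputs the ratio form no longer suffices and each unit's efficiency requires solving a linear program, which is where robust optimization enters for uncertain data.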

  18. FORMAL UNCERTAINTY ANALYSIS OF A LAGRANGIAN PHOTOCHEMICAL AIR POLLUTION MODEL. (R824792)

    EPA Science Inventory

    This study applied Monte Carlo analysis with Latin hypercube sampling to evaluate the effects of uncertainty in air parcel trajectory paths, emissions, rate constants, deposition affinities, mixing heights, and atmospheric stability on predictions from a vertically...
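
    Latin hypercube sampling of the kind used here draws exactly one value from each equal-probability stratum of every input; a minimal sketch (the sample counts and input names are illustrative, not the study's configuration):

```python
import random

# Sketch: basic Latin hypercube sampling over the unit hypercube. Each column
# gets one draw per equal-probability stratum, then the order is shuffled so
# the strata of different variables are paired at random.

def latin_hypercube(n_samples, n_vars, seed=0):
    rng = random.Random(seed)
    columns = []
    for _ in range(n_vars):
        col = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(col)
        columns.append(col)
    return list(zip(*columns))  # rows are sample points

# e.g. 5 samples over 3 uncertain inputs (say, emission rate, a rate
# constant, mixing height), later rescaled to each input's distribution
points = latin_hypercube(5, 3)
```

    Compared with plain Monte Carlo, this stratification guarantees coverage of each input's full range even with few model runs.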

  19. Uncertainty Evaluation and Appropriate Distribution for the RDHM in the Rockies

    NASA Astrophysics Data System (ADS)

    Kim, J.; Bastidas, L. A.; Clark, E. P.

    2010-12-01

    The problems that hydrologic models have in properly reproducing the processes involved in mountainous areas, and in particular the Rocky Mountains, are widely acknowledged. Herein, we present an application of the National Weather Service RDHM distributed model over the Durango River basin in Colorado. We focus primarily on the assessment of the model prediction uncertainty associated with parameter estimation and on the comparison of model performance using parameters obtained with a priori estimation following the procedure of Koren et al. and those obtained via inverse modeling using a variety of Markov chain Monte Carlo based optimization algorithms. The model evaluation is based on traditional procedures as well as non-traditional ones based on the use of shape-matching functions, which are more appropriate for the evaluation of distributed information (e.g., the Hausdorff distance and the earth mover's distance). The variables used for the model performance evaluation are discharge (with internal nodes), snow cover, and snow water equivalent. An attempt to establish the proper degree of distribution for the Durango basin with the RDHM model is also presented.

  20. Robustness for slope stability modelling under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten

    2015-04-01

    Landslides can have large negative societal and economic impacts, such as loss of life and damage to infrastructure. However, the ability of slope stability assessment to guide management is limited by high levels of uncertainty in model predictions. Many of these uncertainties cannot be easily quantified, such as those linked to climate change and other future socio-economic conditions, restricting the usefulness of traditional decision analysis tools. Deep uncertainty can be managed more effectively by developing robust, but not necessarily optimal, policies that are expected to perform adequately under a wide range of future conditions. Robust strategies are particularly valuable when the consequences of taking a wrong decision are high, as is often the case when managing natural hazard risks such as landslides. In our work, a physically based numerical model of hydrologically induced slope instability (the Combined Hydrology and Stability Model - CHASM) is applied together with robust decision making to evaluate the most important uncertainties (storm events, groundwater conditions, surface cover, slope geometry, material strata and geotechnical properties) affecting slope stability. Specifically, impacts of climate change on long-term slope stability are incorporated, accounting for the deep uncertainty in future climate projections. Our findings highlight the potential of robust decision making to aid decision support for landslide hazard reduction and risk management under conditions of deep uncertainty.

  1. CALiPER Exploratory Study: Accounting for Uncertainty in Lumen Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergman, Rolf; Paget, Maria L.; Richman, Eric E.

    2011-03-31

    With a well-defined and shared understanding of uncertainty in lumen measurements, testing laboratories can better evaluate their processes, contributing to greater consistency and credibility of lighting testing, a key component of the U.S. Department of Energy (DOE) Commercially Available LED Product Evaluation and Reporting (CALiPER) program. Reliable lighting testing is a crucial underlying factor contributing toward the success of many energy-efficient lighting efforts, such as the DOE GATEWAY demonstrations, Lighting Facts Label, ENERGY STAR® energy-efficient lighting programs, and many others. Uncertainty in measurements is inherent to all testing methodologies, including photometric and other lighting-related testing. Uncertainty exists for all equipment, processes, and systems of measurement, individually as well as in combination. A major issue with testing and the resulting accuracy of the tests is the uncertainty of the complete process. Individual equipment uncertainties are typically identified, but their relative value in practice and their combined value with other equipment and processes in the same test are elusive concepts, particularly for complex types of testing such as photometry. The total combined uncertainty of a measurement result is important for repeatable and comparative measurements for light-emitting diode (LED) products in comparison with other technologies as well as competing products. This study provides a detailed, step-by-step method for determining uncertainty in lumen measurements, developed in close coordination with related standards efforts and key industry experts. This report uses the structure proposed in the Guide to the Expression of Uncertainty in Measurement (GUM) for evaluating and expressing uncertainty in measurements. The steps of the procedure are described, and a spreadsheet format adapted for integrating-sphere and goniophotometric uncertainty measurements is provided for entering parameters, ordering the information, calculating intermediate values and, finally, obtaining expanded uncertainties. Using this basis and examining each step of the photometric measurement and calibration methods, mathematical uncertainty models are developed. Determination of estimated values of input variables is discussed. Guidance is provided for the evaluation of the standard uncertainties of each input estimate, covariances associated with input estimates, and the calculation of the measurement result. On this basis, the combined uncertainty of the measurement results and, finally, the expanded uncertainty can be determined.
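
    The GUM first-order combination for uncorrelated inputs, uc²(y) = Σᵢ (∂f/∂xᵢ)² u²(xᵢ), can be sketched numerically. The measurement equation and the numbers below are hypothetical illustrations, not CALiPER's actual spreadsheet model:

```python
import math

# Sketch: GUM-style combined standard uncertainty for uncorrelated inputs,
# with sensitivity coefficients estimated by central finite differences.

def combined_uncertainty(f, x, u, h=1e-6):
    uc2 = 0.0
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        ci = (f(xp) - f(xm)) / (2 * h)   # sensitivity coefficient df/dx_i
        uc2 += (ci * u[i]) ** 2
    return math.sqrt(uc2)

# toy substitution-method model: flux = reference_flux * signal / reference_signal
f = lambda v: v[0] * v[1] / v[2]
u_c = combined_uncertainty(f, x=[1000.0, 0.52, 0.50], u=[5.0, 0.004, 0.004])
expanded = 2 * u_c   # expanded uncertainty with coverage factor k = 2 (~95 %)
```

    Correlated inputs would add covariance cross-terms to uc², which is why the report treats covariances between input estimates explicitly.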

  2. Prioritizing Chemicals and Data Requirements for Screening-Level Exposure and Risk Assessment

    PubMed Central

    Brown, Trevor N.; Wania, Frank; Breivik, Knut; McLachlan, Michael S.

    2012-01-01

    Background: Scientists and regulatory agencies strive to identify chemicals that may cause harmful effects to humans and the environment; however, prioritization is challenging because of the large number of chemicals requiring evaluation and limited data and resources. Objectives: We aimed to prioritize chemicals for exposure and exposure potential and obtain a quantitative perspective on research needs to better address uncertainty in screening assessments. Methods: We used a multimedia mass balance model to prioritize > 12,000 organic chemicals using four far-field human exposure metrics. The propagation of variance (uncertainty) in key chemical information used as model input for calculating exposure metrics was quantified. Results: Modeled human concentrations and intake rates span approximately 17 and 15 orders of magnitude, respectively. Estimates of exposure potential using human concentrations and a unit emission rate span approximately 13 orders of magnitude, and intake fractions span 7 orders of magnitude. The actual chemical emission rate contributes the greatest variance (uncertainty) in exposure estimates. The human biotransformation half-life is the second greatest source of uncertainty in estimated concentrations. In general, biotransformation and biodegradation half-lives are greater sources of uncertainty in modeled exposure and exposure potential than chemical partition coefficients. Conclusions: Mechanistic exposure modeling is suitable for screening and prioritizing large numbers of chemicals. By including uncertainty analysis and uncertainty in chemical information in the exposure estimates, these methods can help identify and address the important sources of uncertainty in human exposure and risk assessment in a systematic manner. PMID:23008278

  3. Info-gap management of public health policy for TB with HIV-prevalence and epidemiological uncertainty

    PubMed Central

    2012-01-01

    Background: Formulation and evaluation of public health policy commonly employs science-based mathematical models. For instance, the epidemiological dynamics of TB is dominated, in general, by flow between actively and latently infected populations; modelling is thus central in planning public health interventions. However, models are highly uncertain because they are based on observations that are geographically and temporally distinct from the population to which they are applied. Aims: We aim to demonstrate the advantages of info-gap theory, a non-probabilistic approach to severe uncertainty when worst cases cannot be reliably identified and probability distributions are unreliable or unavailable. Info-gap is applied here to mathematical modelling of epidemics and analysis of public health decision-making. Methods: Applying info-gap robustness analysis to tuberculosis/HIV (TB/HIV) epidemics, we illustrate the critical role of incorporating uncertainty in formulating recommendations for interventions. Robustness is assessed as the magnitude of uncertainty that can be tolerated by a given intervention. We illustrate the methodology by exploring interventions that alter the rates of diagnosis, cure, relapse and HIV infection. Results: We demonstrate several policy implications. Equivalences among alternative rates of diagnosis and relapse are identified. The impact of initial TB and HIV prevalence on the robustness to uncertainty is quantified. In some configurations, increased aggressiveness of intervention improves the predicted outcome but also reduces the robustness to uncertainty. Similarly, predicted outcomes may be better at larger target times, but may also be more vulnerable to model error. Conclusions: The info-gap framework is useful for managing model uncertainty and is attractive when uncertainties in model parameters are extreme. When a public health model underlies guidelines, info-gap decision theory provides valuable insight into the confidence of achieving agreed-upon goals. PMID:23249291
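
    The robustness function at the heart of info-gap theory, the largest uncertainty horizon at which the worst case still meets a performance goal, can be sketched with a toy outcome model. The epidemic-growth function and numbers are hypothetical, not the paper's TB/HIV dynamics:

```python
# Sketch: info-gap robustness for a fractional-error uncertainty model around
# a nominal parameter. We grow the horizon h until the worst case over the
# interval [nominal*(1-h), nominal*(1+h)] violates the goal (outcome <= goal).

def robustness(outcome, nominal, goal, h_step=0.001, h_max=1.0):
    h = 0.0
    while h + h_step <= h_max:
        worst = max(outcome(nominal * (1 - (h + h_step))),
                    outcome(nominal * (1 + (h + h_step))))
        if worst > goal:
            return h          # last horizon at which the requirement held
        h += h_step
    return h_max

# toy model: projected prevalence grows with a transmission parameter beta
outcome = lambda beta: 100.0 * (1.0 + beta)
h_hat = robustness(outcome, nominal=0.2, goal=125.0)
```

    A more aggressive intervention may lower the nominal outcome yet shrink h_hat, which is exactly the robustness/performance trade-off the paper explores.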

  5. Funnel Libraries for Real-Time Robust Feedback Motion Planning

    DTIC Science & Technology

    2016-07-21

    motion plans for a robot that are guaranteed to succeed despite uncertainty in the environment, parametric model uncertainty, and disturbances...resulting funnel library is then used to sequentially compose motion plans at runtime while ensuring the safety of the robot. A major advantage of...the work presented here is that by explicitly taking into account the effect of uncertainty, the robot can evaluate motion plans based on how vulnerable

  6. Analysis of actuator delay and its effect on uncertainty quantification for real-time hybrid simulation

    NASA Astrophysics Data System (ADS)

    Chen, Cheng; Xu, Weijie; Guo, Tong; Chen, Kai

    2017-10-01

    Uncertainties in structural properties can result in different responses in hybrid simulations. Quantification of the effect of these uncertainties would enable researchers to estimate the variance of structural responses observed in experiments. This poses challenges for real-time hybrid simulation (RTHS) due to the existence of actuator delay. Polynomial chaos expansion (PCE) projects the model outputs onto a basis of orthogonal stochastic polynomials to account for the influence of model uncertainties. In this paper, PCE is utilized to evaluate the effect of actuator delay on the maximum displacement from real-time hybrid simulation of a single-degree-of-freedom (SDOF) structure when accounting for uncertainties in structural properties. The PCE is first applied to RTHS without delay to determine the order of the PCE, the number of sample points, and the method for coefficient calculation. The PCE is then applied to RTHS with actuator delay. The mean, variance and Sobol indices are compared and discussed to evaluate the effects of actuator delay on uncertainty quantification for RTHS. Results show that the mean and the variance of the maximum displacement increase linearly and exponentially with respect to actuator delay, respectively. Sensitivity analysis through Sobol indices also indicates that the influence of each single random variable decreases while the coupling effect increases as actuator delay grows.
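
    A minimal non-intrusive PCE can be sketched for one standard-normal input: project the response onto probabilists' Hermite polynomials and read the mean and variance off the coefficients. The quadratic response below is a toy stand-in, not the RTHS structural model:

```python
import math

# Sketch: order-2 polynomial chaos for Y = g(X), X ~ N(0,1), by spectral
# projection c_k = E[g(X) He_k(X)] / k!, evaluated with 3-point probabilists'
# Gauss-Hermite quadrature (exact for polynomial integrands up to degree 5).

NODES = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
WEIGHTS = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

def he(k, x):  # He_0, He_1, He_2
    return (1.0, x, x * x - 1.0)[k]

def pce_moments(g, order=2):
    coeffs = [sum(w * g(x) * he(k, x) for x, w in zip(NODES, WEIGHTS))
              / math.factorial(k) for k in range(order + 1)]
    mean = coeffs[0]                                  # E[Y] = c_0
    var = sum(c * c * math.factorial(k)               # Var[Y] = sum c_k^2 k!
              for k, c in enumerate(coeffs[1:], 1))
    return mean, var

mean, var = pce_moments(lambda x: x * x + x)   # exact: mean 1, variance 3
```

    Sobol indices follow from the same coefficients by attributing chaos terms to their input variables, which is how the paper separates single-variable and coupling effects.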

  7. Real-Time Optimal Flood Control Decision Making and Risk Propagation Under Multiple Uncertainties

    NASA Astrophysics Data System (ADS)

    Zhu, Feilin; Zhong, Ping-An; Sun, Yimeng; Yeh, William W.-G.

    2017-12-01

    Multiple uncertainties exist in the optimal flood control decision-making process, presenting risks involving flood control decisions. This paper defines the main steps in optimal flood control decision making that constitute the Forecast-Optimization-Decision Making (FODM) chain. We propose a framework for supporting optimal flood control decision making under multiple uncertainties and evaluate risk propagation along the FODM chain from a holistic perspective. To deal with uncertainties, we employ stochastic models at each link of the FODM chain. We generate synthetic ensemble flood forecasts via the martingale model of forecast evolution. We then establish a multiobjective stochastic programming with recourse model for optimal flood control operation. The Pareto front under uncertainty is derived via the constraint method coupled with a two-step process. We propose a novel SMAA-TOPSIS model for stochastic multicriteria decision making. Then we propose the risk assessment model, the risk of decision-making errors and rank uncertainty degree to quantify the risk propagation process along the FODM chain. We conduct numerical experiments to investigate the effects of flood forecast uncertainty on optimal flood control decision making and risk propagation. We apply the proposed methodology to a flood control system in the Daduhe River basin in China. The results indicate that the proposed method can provide valuable risk information in each link of the FODM chain and enable risk-informed decisions with higher reliability.

  8. Uncertainty in exposure to air pollution

    NASA Astrophysics Data System (ADS)

    Pebesma, Edzer; Helle, Kristina; Stasch, Christoph; Rasouli, Soora; Timmermans, Harry; Walker, Sam-Erik; Denby, Bruce

    2013-04-01

    To assess exposure to air pollution for a person or for a group of people, one needs to know where the person or group is as a function of time, and what the air pollution is at these times and locations. In this study we used the Albatross activity-based model to assess the whereabouts of people and the uncertainties therein, and a probabilistic air quality system based on TAPM/EPISODE to assess air quality probabilistically. The outcomes of the two models were combined to assess exposure to air pollution, and the errors in it. We used the area around Rotterdam (Netherlands) as a case study. As the outcomes of both models come as Monte Carlo realizations, it was relatively easy to cancel one of the sources of uncertainty (movement of persons, air pollution) in order to identify their respective contributions, and also to compare evaluations for individuals with averages for a population of persons. As the output is probabilistic, and in addition spatially and temporally varying, the visual analysis of the complete results poses some challenges. This case study was one of the test cases in the UncertWeb project, which has built concepts and tools to realize the uncertainty-enabled model web. Some of the tools and protocols will be shown and evaluated in this presentation. For the uncertainty of exposure, the uncertainty of air quality was more important than the uncertainty of people's locations. This difference was stronger for PM10 than for NO2. The workflow was implemented as generic Web services in UncertWeb that also allow for inputs other than the simulated activity schedules and air quality with other resolutions. However, due to this flexibility, the Web services require standardized formats, and the overlay algorithm is not optimized for the specific use case, resulting in a data and processing overhead. Hence, we implemented the full analysis in parallel in R for this specific case, as the model web solution had difficulties with massive data.

  9. Evaluation of five dry particle deposition parameterizations for incorporation into atmospheric transport models

    NASA Astrophysics Data System (ADS)

    Khan, Tanvir R.; Perlinger, Judith A.

    2017-10-01

    Despite considerable effort to develop mechanistic dry particle deposition parameterizations for atmospheric transport models, current knowledge has been inadequate to propose quantitative measures of the relative performance of available parameterizations. In this study, we evaluated the performance of five dry particle deposition parameterizations developed by Zhang et al. (2001) (Z01), Petroff and Zhang (2010) (PZ10), Kouznetsov and Sofiev (2012) (KS12), Zhang and He (2014) (ZH14), and Zhang and Shao (2014) (ZS14), respectively. The evaluation was performed in three dimensions: model ability to reproduce observed deposition velocities, Vd (accuracy); the influence of imprecision in input parameter values on the modeled Vd (uncertainty); and identification of the most influential parameter(s) (sensitivity). The accuracy of the modeled Vd was evaluated using observations obtained from five land use categories (LUCs): grass, coniferous and deciduous forests, natural water, and ice/snow. To ascertain the uncertainty in modeled Vd, and to quantify the influence of imprecision in key model input parameters, a Monte Carlo uncertainty analysis was performed. The Sobol' sensitivity analysis was conducted to determine the parameter ranking from most to least influential. Comparing the normalized mean bias factors (indicators of accuracy), we find that the ZH14 parameterization is the most accurate for all LUCs except coniferous forest, for which it is the second most accurate. From Monte Carlo simulations, the estimated mean normalized uncertainties in the modeled Vd obtained for seven particle sizes (ranging from 0.005 to 2.5 µm) for the five LUCs are 17, 12, 13, 16, and 27 % for the Z01, PZ10, KS12, ZH14, and ZS14 parameterizations, respectively. From the Sobol' sensitivity results, we suggest that the parameter rankings vary by particle size and LUC for a given parameterization. Overall, for dp = 0.001 to 1.0 µm, friction velocity was one of the three most influential parameters in all parameterizations. For giant particles (dp = 10 µm), relative humidity was the most influential parameter. Because it is the least complex of the five parameterizations and has the greatest accuracy and least uncertainty, we propose that the ZH14 parameterization is currently superior for incorporation into atmospheric transport models.
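
    The accuracy metric used for the ranking can be sketched as follows, using the normalized mean bias factor in its commonly used symmetric form (NMBF = M̄/Ō − 1 when M̄ ≥ Ō, else 1 − Ō/M̄); the deposition-velocity values below are hypothetical, not the study's data:

```python
# Sketch: normalized mean bias factor (NMBF), a symmetric accuracy indicator
# that treats over- and under-prediction even-handedly (toy Vd values).

def nmbf(modeled, observed):
    mbar = sum(modeled) / len(modeled)
    obar = sum(observed) / len(observed)
    return mbar / obar - 1.0 if mbar >= obar else 1.0 - obar / mbar

vd_obs = [0.10, 0.20, 0.30]   # cm/s, hypothetical observed deposition velocities
vd_mod = [0.12, 0.18, 0.36]   # cm/s, hypothetical modeled values
print(round(nmbf(vd_mod, vd_obs), 3))
```

    A positive value indicates net over-prediction by that factor (here 10 %); comparing |NMBF| across parameterizations gives the accuracy ranking.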

  10. Development of Probabilistic Flood Inundation Mapping For Flooding Induced by Dam Failure

    NASA Astrophysics Data System (ADS)

    Tsai, C.; Yeh, J. J. J.

    2017-12-01

    A primary function of flood inundation mapping is to forecast flood hazards and assess potential losses. However, uncertainties limit the reliability of inundation hazard assessments. Major sources of uncertainty should be taken into consideration by an optimal flood management strategy. This study focuses on the 20 km reach downstream of the Shihmen Reservoir in Taiwan. A dam-failure-induced flood provides the upstream boundary conditions for flood routing. The two major sources of uncertainty considered in the hydraulic model and the flood inundation mapping are uncertainty in the dam break model and uncertainty in the roughness coefficient. The perturbance moment method is applied to a dam break model and the hydrosystem model to develop probabilistic flood inundation maps. Various numbers of uncertain variables can be considered in these models, and the variability of outputs can be quantified. Probabilistic flood inundation maps for dam-break-induced floods can thus be developed, with consideration of output variability, using the commonly used HEC-RAS model. Different probabilistic flood inundation maps are discussed and compared. These maps are expected to provide new physical insight in support of evaluating the areas of concern for reservoir-induced flooding.

  11. Evaluation of calibration efficacy under different levels of uncertainty

    DOE PAGES

    Heo, Yeonsook; Graziano, Diane J.; Guzowski, Leah; ...

    2014-06-10

    This study examines how calibration performs under different levels of uncertainty in model input data. It specifically assesses the efficacy of Bayesian calibration to enhance the reliability of EnergyPlus model predictions. A Bayesian approach can be used to update uncertain values of parameters, given measured energy-use data, and to quantify the associated uncertainty. We assess the efficacy of Bayesian calibration under a controlled virtual-reality setup, which enables rigorous validation of the accuracy of calibration results in terms of both calibrated parameter values and model predictions. Case studies demonstrate the performance of Bayesian calibration of base models developed from audit data with differing levels of detail in building design, usage, and operation.
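
    The Bayesian update at the core of such calibration can be sketched on a grid for a single uncertain parameter; the simple linear "energy model" and the infiltration-multiplier parameter below are hypothetical stand-ins for an EnergyPlus run:

```python
import math

# Sketch: grid-based Bayesian calibration of one uncertain parameter given a
# measured monthly energy use, with a Gaussian measurement-error likelihood.

def posterior(grid, prior, simulate, measured, sigma):
    like = [math.exp(-0.5 * ((simulate(p) - measured) / sigma) ** 2)
            for p in grid]
    unnorm = [pr * li for pr, li in zip(prior, like)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

# uncertain infiltration multiplier on a 0.5..1.5 grid, uniform prior
grid = [0.5 + 0.01 * i for i in range(101)]
prior = [1.0 / len(grid)] * len(grid)
simulate = lambda p: 800.0 + 200.0 * p     # kWh/month, toy energy model
post = posterior(grid, prior, simulate, measured=1000.0, sigma=20.0)
p_mean = sum(g * w for g, w in zip(grid, post))
```

    The posterior concentrates around the parameter value that reproduces the measurement, and its spread quantifies the remaining calibration uncertainty; the virtual-reality setup in the study lets that posterior be checked against the known true value.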

  12. STochastic Analysis of Technical Systems (STATS): A model for evaluating combined effects of multiple uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kranz, L.; VanKuiken, J.C.; Gillette, J.L.

    1989-12-01

    The STATS model, now modified to run on microcomputers, uses user-defined component uncertainties to calculate composite uncertainty distributions for systems or technologies. The program can be used to investigate uncertainties for a single technology or to compare two technologies. Although the term "technology" is used throughout the program screens, the program can accommodate very broad problem definitions. For example, electrical demand uncertainties, health risks associated with toxic material exposures, or traffic queuing delay times can be estimated. The terminology adopted in this version of STATS reflects the purpose of the earlier version, which was to aid in comparing advanced electrical generating technologies. A comparison of two clean coal technologies in two power plants is given as a case study illustration. 7 refs., 35 figs., 7 tabs.
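
    The basic operation of composing user-defined component uncertainties into a system-level distribution can be sketched by Monte Carlo; the component names and distributions below are hypothetical, not STATS's actual inputs:

```python
import random

# Sketch: combining component uncertainty distributions into a composite
# system distribution by Monte Carlo sampling (toy cost components).

def composite(components, n=20000, seed=7):
    rng = random.Random(seed)
    return sorted(sum(draw(rng) for draw in components) for _ in range(n))

# component cost uncertainties for a hypothetical technology, $/MWh
components = [
    lambda r: r.uniform(20.0, 30.0),           # fuel
    lambda r: r.triangular(10.0, 20.0, 12.0),  # O&M (mode at 12)
    lambda r: r.gauss(15.0, 2.0),              # capital recovery
]
costs = composite(components)
median = costs[len(costs) // 2]
```

    Running the same composition for a second technology and comparing the two sorted samples gives the kind of head-to-head uncertainty comparison the case study illustrates.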

  13. Geographical scenario uncertainty in generic fate and exposure factors of toxic pollutants for life-cycle impact assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huijbregts, Mark A.J.; Lundi, Sven; McKone, Thomas E.

    In environmental life-cycle assessments (LCA), fate and exposure factors account for the general fate and exposure properties of chemicals under generic environmental conditions by means of 'evaluative' multi-media fate and exposure box models. To assess the effect of using different generic environmental conditions, fate and exposure factors of chemicals emitted under typical conditions of (1) Western Europe, (2) Australia and (3) the United States of America were compared with the multi-media fate and exposure box model USES-LCA. Comparing the results of the three evaluative environments, it was found that the uncertainty in fate and exposure factors for ecosystems and humans due to the choice of an evaluative environment, as represented by the ratio of the 97.5th and 50th percentiles, is between a factor of 2 and 10. In particular, fate and exposure factors of emissions causing effects in fresh water ecosystems and effects on human health have relatively high uncertainty. This uncertainty is mainly caused by continental differences in the average soil erosion rate, the dimensions of the fresh water and agricultural soil compartments, and the fraction of drinking water coming from ground water.

  14. A comprehensive evaluation of input data-induced uncertainty in nonpoint source pollution modeling

    NASA Astrophysics Data System (ADS)

    Chen, L.; Gong, Y.; Shen, Z.

    2015-11-01

    Watershed models have been used extensively for quantifying nonpoint source (NPS) pollution, but few studies have been conducted on the error transitivity from different input data sets to NPS modeling. In this paper, the effects of four types of input data, including rainfall, digital elevation models (DEMs), land use maps, and the amount of fertilizer, on NPS simulation were quantified and compared. A systematic input-induced uncertainty was investigated using a watershed model for phosphorus load prediction. Based on the results, rain gauge density resulted in the largest model uncertainty, followed by DEMs, whereas land use and fertilizer amount exhibited limited impacts. The mean coefficients of variation for errors in single-rain-gauge, multiple-rain-gauge, ASTER GDEM, NFGIS DEM, land use, and fertilizer amount information were 0.390, 0.274, 0.186, 0.073, 0.033 and 0.005, respectively. The use of specific input information, such as key gauges, is also highlighted as a way to achieve the required model accuracy. In this sense, these results provide valuable information to other model-based studies for the control of prediction uncertainty.

  15. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J.

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.

  16. Addressing subjective decision-making inherent in GLUE-based multi-criteria rainfall-runoff model calibration

    NASA Astrophysics Data System (ADS)

    Shafii, Mahyar; Tolson, Bryan; Shawn Matott, L.

    2015-04-01

    GLUE is one of the most commonly used informal methodologies for uncertainty estimation in hydrological modelling. Despite its ease of use, GLUE involves a number of subjective decisions, such as the strategy for identifying the behavioural solutions. This study evaluates the impact of behavioural solution identification strategies in GLUE on the quality of model output uncertainty. Moreover, two new strategies are developed to objectively identify behavioural solutions. The first strategy considers Pareto-based ranking of parameter sets, while the second ranks the parameter sets based on an aggregated criterion. The proposed strategies, as well as the traditional strategies in the literature, are evaluated with respect to reliability (coverage of observations by the envelope of model outcomes) and sharpness (width of the envelope of model outcomes) in different numerical experiments. These experiments include multi-criteria calibration and uncertainty estimation of three rainfall-runoff models with different numbers of parameters. To further demonstrate the importance of the behavioural solution identification strategy, GLUE is also compared with two other informal multi-criteria calibration and uncertainty estimation methods (Pareto optimization and DDS-AU). The results show that the model output uncertainty varies with the behavioural solution identification strategy, and furthermore, a robust GLUE implementation would require considering multiple behavioural solution identification strategies and choosing the one that generates the desired balance between sharpness and reliability. The proposed objective strategies prove to be the best options in most of the case studies investigated in this research. Implementing such an approach for a high-dimensional calibration problem enables GLUE to generate robust results in comparison with Pareto optimization and DDS-AU.
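    The core GLUE loop, together with the reliability and sharpness metrics defined above, can be sketched as follows. The informal likelihood measure, the threshold, and the toy model in the test are placeholders; the choice of behavioural identification strategy is exactly what the paper varies:

```python
def glue_bounds(param_sets, simulate, observed, threshold, alpha=0.05):
    """GLUE-style uncertainty envelope (minimal sketch): keep 'behavioural'
    parameter sets whose informal likelihood exceeds a subjective threshold,
    then take per-time-step percentiles of their simulated outputs."""
    behavioural = []
    for p in param_sets:
        sim = simulate(p)
        sse = sum((s - o) ** 2 for s, o in zip(sim, observed))
        if 1.0 / (1.0 + sse) >= threshold:   # informal likelihood measure
            behavioural.append(sim)
    lower, upper = [], []
    for t in range(len(observed)):
        vals = sorted(b[t] for b in behavioural)
        lower.append(vals[int(alpha / 2 * (len(vals) - 1))])
        upper.append(vals[int((1 - alpha / 2) * (len(vals) - 1))])
    # reliability: fraction of observations covered by the envelope;
    # sharpness: mean envelope width
    covered = sum(lo <= o <= up for lo, up, o in zip(lower, upper, observed))
    reliability = covered / len(observed)
    sharpness = sum(up - lo for lo, up in zip(lower, upper)) / len(observed)
    return reliability, sharpness
```

    Rerunning this with different thresholds or likelihood measures shows directly how the subjective choices trade reliability against sharpness.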

  17. Closed-Loop Evaluation of an Integrated Failure Identification and Fault Tolerant Control System for a Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Belcastro, Christine; Khong, Thuan

    2006-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems developed for failure detection, identification, and reconfiguration, as well as upset recovery, need to be evaluated over broad regions of the flight envelope or under extreme flight conditions, and should include various sources of uncertainty. To apply formal robustness analysis, formulation of linear fractional transformation (LFT) models of complex parameter-dependent systems is required, which represent system uncertainty due to parameter uncertainty and actuator faults. This paper describes a detailed LFT model formulation procedure from the nonlinear model of a transport aircraft by using a preliminary LFT modeling software tool developed at the NASA Langley Research Center, which utilizes a matrix-based computational approach. The closed-loop system is evaluated over the entire flight envelope based on the generated LFT model which can cover nonlinear dynamics. The robustness analysis results of the closed-loop fault tolerant control system of a transport aircraft are presented. A reliable flight envelope (safe flight regime) is also calculated from the robust performance analysis results, over which the closed-loop system can achieve the desired performance of command tracking and failure detection.

  18. Advances in Applications of Hierarchical Bayesian Methods with Hydrological Models

    NASA Astrophysics Data System (ADS)

    Alexander, R. B.; Schwarz, G. E.; Boyer, E. W.

    2017-12-01

    Mechanistic and empirical watershed models are increasingly used to inform water resource decisions. Growing access to historical stream measurements and data from in-situ sensor technologies has increased the need for improved techniques for coupling models with hydrological measurements. Techniques that account for the intrinsic uncertainties of both models and measurements are especially needed. Hierarchical Bayesian methods provide an efficient modeling tool for quantifying model and prediction uncertainties, including those associated with measurements. Hierarchical methods can also be used to explore spatial and temporal variations in model parameters and uncertainties that are informed by hydrological measurements. We used hierarchical Bayesian methods to develop a hybrid (statistical-mechanistic) SPARROW (SPAtially Referenced Regression On Watershed attributes) model of long-term mean annual streamflow across diverse environmental and climatic drainages in 18 U.S. hydrological regions. Our application illustrates the use of a new generation of Bayesian methods that offer more advanced computational efficiencies than the prior generation. Evaluations of the effects of hierarchical (regional) variations in model coefficients and uncertainties on model accuracy indicates improved prediction accuracies (median of 10-50%) but primarily in humid eastern regions, where model uncertainties are one-third of those in arid western regions. Generally moderate regional variability is observed for most hierarchical coefficients. Accounting for measurement and structural uncertainties, using hierarchical state-space techniques, revealed the effects of spatially-heterogeneous, latent hydrological processes in the "localized" drainages between calibration sites; this improved model precision, with only minor changes in regional coefficients. 
Our study can inform advances in the use of hierarchical methods with hydrological models to improve their integration with stream measurements.
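    The effect of hierarchical (regional) coefficients can be illustrated with a simple empirical-Bayes shrinkage of per-region estimates toward the grand mean. The SPARROW application above uses full Bayesian inference, so this is only a conceptual sketch of the partial pooling involved:

```python
import statistics

def partial_pool(regional_means, regional_vars, n_per_region):
    """Empirical-Bayes shrinkage of regional coefficient estimates toward
    the grand mean: regions with noisier estimates are shrunk harder.
    A stand-in for hierarchical Bayesian pooling, not the SPARROW method."""
    grand = statistics.mean(regional_means)
    # between-region variance, after subtracting average sampling noise
    tau2 = max(statistics.variance(regional_means)
               - statistics.mean(v / n for v, n in zip(regional_vars, n_per_region)),
               1e-12)
    pooled = []
    for m, v, n in zip(regional_means, regional_vars, n_per_region):
        w = tau2 / (tau2 + v / n)          # shrinkage weight per region
        pooled.append(w * m + (1 - w) * grand)
    return pooled
```

    With well-sampled regions the weights stay near 1 and regional coefficients are barely moved; sparse regions borrow strength from the grand mean.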

  19. Evaluation of machine learning algorithms for prediction of regions of high Reynolds averaged Navier Stokes uncertainty

    DOE PAGES

    Ling, Julia; Templeton, Jeremy Alan

    2015-08-04

    Reynolds Averaged Navier Stokes (RANS) models are widely used in industry to predict fluid flows, despite their acknowledged deficiencies. Not only do RANS models often produce inaccurate flow predictions, but there are very limited diagnostics available to assess RANS accuracy for a given flow configuration. If experimental or higher-fidelity simulation results are not available for RANS validation, there is no reliable method to evaluate RANS accuracy. This paper explores the potential of utilizing machine learning algorithms to identify regions of high RANS uncertainty. Three different machine learning algorithms were evaluated: support vector machines, Adaboost decision trees, and random forests. The algorithms were trained on a database of canonical flow configurations for which validated direct numerical simulation or large eddy simulation results were available, and were used to classify RANS results on a point-by-point basis as having either high or low uncertainty, based on the breakdown of specific RANS modeling assumptions. Classifiers were developed for three different basic RANS eddy viscosity model assumptions: the isotropy of the eddy viscosity, the linearity of the Boussinesq hypothesis, and the non-negativity of the eddy viscosity. It is shown that these classifiers are able to generalize to flows substantially different from those on which they were trained. Finally, feature selection techniques, model evaluation, and extrapolation detection are discussed in the context of turbulence modeling applications.
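    A minimal stand-in for this classification setup: label each point by whether a RANS assumption (e.g. eddy-viscosity non-negativity, inferred from high-fidelity data) breaks down, then fit a classifier on scalar flow features. The feature values and labels below are synthetic, and a one-split decision stump replaces the SVM/Adaboost/random-forest ensembles the paper actually evaluates:

```python
def stump_train(features, labels):
    """Fit a one-split decision stump: pick the threshold on a scalar flow
    feature that best separates high-uncertainty (1) from low-uncertainty
    (0) points.  A toy substitute for the paper's tree ensembles."""
    best = None
    for t in sorted(set(features)):
        pred = [1 if f >= t else 0 for f in features]
        acc = sum(p == l for p, l in zip(pred, labels)) / len(labels)
        if best is None or acc > best[1]:
            best = (t, acc)
    return best[0]

def stump_predict(threshold, features):
    """Flag points whose feature exceeds the learned threshold."""
    return [1 if f >= threshold else 0 for f in features]
```

    The point of the paper is that such classifiers, trained on flows with trusted DNS/LES labels, still flag plausible high-uncertainty regions in unseen flow configurations.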

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lei, Huan; Yang, Xiu; Zheng, Bin

    Biomolecules exhibit conformational fluctuations near equilibrium states, inducing uncertainty in various biological properties in a dynamic way. We have developed a general method to quantify the uncertainty of target properties induced by conformational fluctuations. Using a generalized polynomial chaos (gPC) expansion, we construct a surrogate model of the target property with respect to varying conformational states. We also propose a method to increase the sparsity of the gPC expansion by defining a set of conformational “active space” random variables. With the increased sparsity, we employ the compressive sensing method to accurately construct the surrogate model. We demonstrate the performance of the surrogate model by evaluating fluctuation-induced uncertainty in solvent-accessible surface area for the bovine trypsin inhibitor protein system and show that the new approach offers more accurate statistical information than standard Monte Carlo approaches. Furthermore, the constructed surrogate model enables us to directly evaluate the target property under various conformational states, yielding a more accurate response surface than standard sparse grid collocation methods. In particular, the new method provides higher accuracy in high-dimensional systems, such as biomolecules, where sparse grid performance is limited by the accuracy of the computed quantity of interest. Finally, our new framework is generalizable and can be used to investigate the uncertainty of a wide variety of target properties in biomolecular systems.
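    A one-dimensional sketch of the gPC surrogate construction, estimating coefficients by Monte Carlo projection onto probabilists' Hermite polynomials. The paper works in high dimensions with compressive sensing, which this deliberately does not attempt:

```python
import math
import random

def hermite(k, x):
    """Probabilists' Hermite polynomials He_0, He_1, He_2."""
    return (1.0, x, x * x - 1.0)[k]

def gpc_coeffs(f, n_samples=100_000, seed=1):
    """Estimate gPC coefficients of f(X), X ~ N(0,1), by Monte Carlo
    projection c_k = E[f(X) He_k(X)] / k!, using the orthogonality of
    the He_k under the Gaussian measure (degree 2 only, for brevity)."""
    rng = random.Random(seed)
    sums = [0.0, 0.0, 0.0]
    for _ in range(n_samples):
        x = rng.gauss(0.0, 1.0)
        fx = f(x)
        for k in range(3):
            sums[k] += fx * hermite(k, x)
    return [s / (n_samples * math.factorial(k)) for k, s in enumerate(sums)]

def gpc_eval(coeffs, x):
    """Evaluate the surrogate at a (scalar) conformational state x."""
    return sum(c * hermite(k, x) for k, c in enumerate(coeffs))
```

    Once the coefficients are in hand, statistics of the target property (mean, variance) follow analytically from the expansion, which is what makes the surrogate cheaper than direct Monte Carlo.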

  2. A Practical Probabilistic Graphical Modeling Tool for Weighing ...

    EPA Pesticide Factsheets

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for ecological risk determinations. Probabilistic approaches can provide both a quantitative weighing of lines of evidence and methods for evaluating risk and uncertainty. The current modeling structure was developed for propagating uncertainties in measured endpoints and their influence on the plausibility of adverse effects. To illustrate the approach, we apply the model framework to the sediment quality triad using example lines of evidence for sediment chemistry measurements, bioassay results, and in situ infauna diversity of benthic communities in a simplified hypothetical case study. We then combine the three lines of evidence, evaluate sensitivity to the input parameters, and show how uncertainties are propagated and how additional information can be incorporated to rapidly update the probability of impacts. The developed network model can be expanded to accommodate additional lines of evidence, variables and states of importance, and different types of uncertainties in the lines of evidence, including spatial and temporal variations as well as measurement errors. We provide a flexible Bayesian network structure for weighing and integrating lines of evidence for ecological risk determinations.
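    The simplest quantitative version of weighing independent lines of evidence is Bayesian updating on the odds scale; the likelihood ratios below are hypothetical stand-ins for the conditional probabilities encoded in the network:

```python
def update_odds(prior_prob, likelihood_ratios):
    """Combine independent lines of evidence on the odds scale.  Each
    likelihood ratio is P(evidence | impact) / P(evidence | no impact);
    a minimal sketch of the probabilistic weighing, not the full
    Bayesian network of the paper."""
    odds = prior_prob / (1.0 - prior_prob)
    for lr in likelihood_ratios:
        odds *= lr                      # each line of evidence multiplies the odds
    return odds / (1.0 + odds)          # back to a probability
```

    Adding a new line of evidence is just another factor in the product, which is why the network can be updated rapidly as information arrives.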

  3. Effect of soil property uncertainties on permafrost thaw projections: a calibration-constrained analysis

    NASA Astrophysics Data System (ADS)

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2016-02-01

    The effects of soil property uncertainties on permafrost thaw projections are studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The null-space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of predictive uncertainty (due to soil property, i.e. parametric, uncertainty) and the inter-annual climate variability due to year-to-year differences in CESM climate forcings. After calibrating to measured borehole temperature data at this well-characterized site, soil property uncertainties remain significant and result in significant predictive uncertainties in projected active layer thickness (ALT) and annual thaw depth-duration, even with a specified future climate. Inter-annual climate variability in projected soil moisture content and Stefan number is small. A volume- and time-integrated Stefan number decreases significantly, indicating a shift in subsurface energy utilization in the future climate (latent heat of phase change becomes more important than heat conduction). Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. 
By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we quantify the relative magnitude of soil property uncertainty to another source of permafrost uncertainty, structural climate model uncertainty. We show that the effect of calibration-constrained uncertainty in soil properties, although significant, is less than that produced by structural climate model uncertainty for this location.

  4. Protein construct storage: Bayesian variable selection and prediction with mixtures.

    PubMed

    Clyde, M A; Parmigiani, G

    1998-07-01

    Determining optimal conditions for protein storage while maintaining a high level of protein activity is an important question in pharmaceutical research. A designed experiment based on a space-filling design was conducted to understand the effects of factors affecting protein storage and to establish optimal storage conditions. Different model-selection strategies to identify important factors may lead to very different answers about optimal conditions. Uncertainty about which factors are important, or model uncertainty, can be a critical issue in decision-making. We use Bayesian variable selection methods for linear models to identify important variables in the protein storage data, while accounting for model uncertainty. We also use the Bayesian framework to build predictions based on a large family of models, rather than an individual model, and to evaluate the probability that certain candidate storage conditions are optimal.
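    Accounting for model uncertainty by averaging over all subsets of candidate factors can be sketched with BIC-based posterior model weights. This is a common shortcut; the paper uses proper Bayesian priors rather than BIC, so treat this as an assumption-laden illustration:

```python
import math
from itertools import combinations

def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def bic_model_weights(X_cols, y):
    """Posterior model probabilities over all predictor subsets via BIC
    weights: fit each linear model by least squares (normal equations),
    score it with BIC, and normalize exp(-BIC/2) across models."""
    n = len(y)
    bics = {}
    for k in range(len(X_cols) + 1):
        for subset in combinations(range(len(X_cols)), k):
            cols = [[1.0] * n] + [X_cols[j] for j in subset]  # intercept + subset
            p = len(cols)
            A = [[sum(a * b for a, b in zip(cols[i], cols[j])) for j in range(p)]
                 for i in range(p)]
            rhs = [sum(c * yi for c, yi in zip(col, y)) for col in cols]
            beta = solve(A, rhs)
            resid = [yi - sum(b * col[i] for b, col in zip(beta, cols))
                     for i, yi in enumerate(y)]
            sse = sum(r * r for r in resid)
            bics[subset] = n * math.log(max(sse / n, 1e-300)) + p * math.log(n)
    m = min(bics.values())
    w = {s: math.exp(-(b - m) / 2) for s, b in bics.items()}
    tot = sum(w.values())
    return {s: wi / tot for s, wi in w.items()}
```

    Summing weights over the models that contain a given factor yields its inclusion probability, and predictions averaged with these weights account for model uncertainty rather than committing to a single selected model.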

  5. A study of undue pain and surfing: using hierarchical criteria to assess website quality.

    PubMed

    Lorence, Daniel; Abraham, Joanna

    2008-09-01

    In studies of web-based consumer health information, scant attention has been paid to the selective development of differential methodologies for website quality evaluation, or to selective grouping and analysis of specific 'domains of uncertainty' in healthcare. Our objective is to introduce a more refined model for website evaluation, and illustrate its application using assessment of websites within an area of ongoing medical uncertainty, back pain. In this exploratory technology assessment, we suggest a model for assessing these 'domains of uncertainty' within healthcare, using qualitative assessment of websites and hierarchical concepts. Using such a hierarchy of quality criteria, we review medical information provided by the most frequently accessed websites related to back pain. Websites are evaluated using standardized criteria, with results rated from the viewpoint of the consumer. Results show that standardization of quality rating across subjective content, and between commercial and niche search results, can provide a consumer-friendly dimension to health information.

  6. Consensus building for interlaboratory studies, key comparisons, and meta-analysis

    NASA Astrophysics Data System (ADS)

    Koepke, Amanda; Lafarge, Thomas; Possolo, Antonio; Toman, Blaza

    2017-06-01

    Interlaboratory studies in measurement science, including key comparisons, and meta-analyses in several fields, including medicine, serve to intercompare measurement results obtained independently, and typically produce a consensus value for the common measurand that blends the values measured by the participants. Since interlaboratory studies and meta-analyses reveal and quantify differences between measured values, regardless of the underlying causes for such differences, they also provide so-called ‘top-down’ evaluations of measurement uncertainty. Measured values are often substantially over-dispersed by comparison with their individual, stated uncertainties, thus suggesting the existence of yet unrecognized sources of uncertainty (dark uncertainty). We contrast two different approaches to take dark uncertainty into account both in the computation of consensus values and in the evaluation of the associated uncertainty, which have traditionally been preferred by different scientific communities. One inflates the stated uncertainties by a multiplicative factor. The other adds laboratory-specific ‘effects’ to the value of the measurand. After distinguishing what we call recipe-based and model-based approaches to data reductions in interlaboratory studies, we state six guiding principles that should inform such reductions. These principles favor model-based approaches that expose and facilitate the critical assessment of validating assumptions, and give preeminence to substantive criteria to determine which measurement results to include, and which to exclude, as opposed to purely statistical considerations, and also how to weigh them. 
Following an overview of maximum likelihood methods, three general purpose procedures for data reduction are described in detail, including explanations of how the consensus value and degrees of equivalence are computed, and the associated uncertainty evaluated: the DerSimonian-Laird procedure; a hierarchical Bayesian procedure; and the Linear Pool. These three procedures have been implemented and made widely accessible in a Web-based application (NIST Consensus Builder). We illustrate principles, statistical models, and data reduction procedures in four examples: (i) the measurement of the Newtonian constant of gravitation; (ii) the measurement of the half-lives of radioactive isotopes of caesium and strontium; (iii) the comparison of two alternative treatments for carotid artery stenosis; and (iv) a key comparison where the measurand was the calibration factor of a radio-frequency power sensor.
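    Of the three procedures, the DerSimonian-Laird consensus value is compact enough to sketch directly: estimate the between-laboratory ("dark") variance tau² from the over-dispersion statistic, then form an inverse-variance weighted mean with the inflated weights:

```python
import math

def dersimonian_laird(means, variances):
    """DerSimonian-Laird random-effects consensus: estimate the
    between-laboratory variance tau^2 from Cochran's Q, then weight each
    result by 1/(variance + tau^2).  Returns (consensus, standard
    uncertainty, tau^2).  A textbook sketch, not the NIST implementation."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * m for wi, m in zip(w, means)) / sw     # fixed-effect mean
    q = sum(wi * (m - fixed) ** 2 for wi, m in zip(w, means))
    k = len(means)
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)                      # moment estimate
    wstar = [1.0 / (v + tau2) for v in variances]
    consensus = sum(wi * m for wi, m in zip(wstar, means)) / sum(wstar)
    u = math.sqrt(1.0 / sum(wstar))
    return consensus, u, tau2
```

    When the stated uncertainties already explain the dispersion, tau² collapses to zero and the procedure reduces to the ordinary inverse-variance mean; over-dispersed studies instead get a broadened consensus uncertainty.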

  7. The Strengths and Limitations of Satellite Data for Evaluating Tropospheric Processes in Chemistry-Climate Models

    NASA Technical Reports Server (NTRS)

    Duncan, Bryan

    2012-01-01

    There is now a wealth of satellite data products available with which to evaluate a model's simulation of tropospheric composition and other model processes. All of these data products have their strengths and limitations that need to be considered for this purpose. For example, uncertainties are introduced into a data product when 1) converting a slant column to a vertical column and 2) estimating the amount of a total column of a trace gas (e.g., ozone, nitrogen dioxide) that resides in the troposphere. Oftentimes, these uncertainties are not well quantified and the satellite data products are not well evaluated against in situ observations. However, these limitations do not preclude us from using these data products to evaluate model processes if we understand the strengths and limitations when developing diagnostics. I will show several examples of how satellite data products are being used to evaluate particular model processes, with a focus on the strengths and limitations of these data products. In addition, I will introduce the goals of a newly formed team on the topic of "satellite data for improved model evaluation and process studies", established in support of the IGAC/SPARC Global Chemistry-Climate Modeling and Evaluation Workshop.

  8. Design and experimental evaluation of robust controllers for a two-wheeled robot

    NASA Astrophysics Data System (ADS)

    Kralev, J.; Slavov, Ts.; Petkov, P.

    2016-11-01

    The paper presents the design and experimental evaluation of two alternative μ-controllers for robust vertical stabilisation of a two-wheeled self-balancing robot. The controllers' design is based on models derived by identification from closed-loop experimental data. In the first design, a signal-based uncertainty representation obtained directly from the identification procedure is used, which leads to a controller of order 29. In the second design, the signal uncertainty is approximated by an input multiplicative uncertainty, which leads to a controller of order 50, subsequently reduced to 30. The performance of the two μ-controllers is compared with the performance of a conventional linear quadratic controller with a 17th-order Kalman filter. A proportional-integral controller of the rotational motion around the vertical axis is implemented as well. The control code is generated using Simulink® controller models and is embedded in a digital signal processor. Results from the simulation of the closed-loop system as well as experimental results obtained during the real-time implementation of the designed controllers are given. The theoretical investigation and experimental results confirm that the closed-loop system achieves robust performance with respect to the uncertainties related to the identified robot model.

  9. A PROBABILISTIC APPROACH FOR ANALYSIS OF UNCERTAINTY IN THE EVALUATION OF WATERSHED MANAGEMENT PRACTICES

    EPA Science Inventory

    A computational framework is presented for analyzing the uncertainty in model estimates of water quality benefits of best management practices (BMPs) in two small (<10 km2) watersheds in Indiana. The analysis specifically recognizes the significance of the difference b...

  10. Resolving dust emission responses to land cover change using an ecological land classification

    USDA-ARS?s Scientific Manuscript database

    Despite efforts to quantify the impacts of land cover change on wind erosion, assessment uncertainty remains large. We address this uncertainty by evaluating the application of ecological site concepts and state-and-transition models (STMs) for detecting and quantitatively describing the impacts of ...

  11. Multi-Scale Validation of a Nanodiamond Drug Delivery System and Multi-Scale Engineering Education

    ERIC Educational Resources Information Center

    Schwalbe, Michelle Kristin

    2010-01-01

    This dissertation has two primary concerns: (i) evaluating the uncertainty and prediction capabilities of a nanodiamond drug delivery model using Bayesian calibration and bias correction, and (ii) determining conceptual difficulties of multi-scale analysis from an engineering education perspective. A Bayesian uncertainty quantification scheme…

  12. Mass discharge estimation from contaminated sites: Multi-model solutions for assessment of conceptual uncertainty

    NASA Astrophysics Data System (ADS)

    Thomsen, N. I.; Troldborg, M.; McKnight, U. S.; Binning, P. J.; Bjerg, P. L.

    2012-04-01

    Mass discharge estimates are increasingly being used in the management of contaminated sites. Such estimates have proven useful for supporting decisions related to the prioritization of contaminated sites in a groundwater catchment. Potential management options can be categorised as follows: (1) leave as is, (2) clean up, or (3) further investigation needed. However, mass discharge estimates are often very uncertain, which may hamper the management decisions. If option 1 is incorrectly chosen, soil and water quality will decrease, threatening or destroying drinking water resources. The risk of choosing option 2 is to spend money on remediating a site that does not pose a problem. Choosing option 3 will often be safest, but may not be the optimal economic solution. Quantification of the uncertainty in mass discharge estimates can therefore greatly improve the foundation for selecting the appropriate management option. The uncertainty of mass discharge estimates depends greatly on the extent of the site characterization. A good approach for uncertainty estimation will be flexible with respect to the investigation level, and account for both parameter and conceptual model uncertainty. We propose a method for quantifying the uncertainty of dynamic mass discharge estimates from contaminant point sources on the local scale. The method considers both parameter and conceptual uncertainty through a multi-model approach, which evaluates multiple conceptual models for the same site. The different conceptual models consider different source characterizations and hydrogeological descriptions. The idea is to include a set of essentially different conceptual models, where each model is believed to be a realistic representation of the given site based on the current level of information. Parameter uncertainty is quantified using Monte Carlo simulations. 
For each conceptual model we calculate a transient mass discharge estimate with uncertainty bounds resulting from the parametric uncertainty. To quantify the conceptual uncertainty from a given site, we combine the outputs from the different conceptual models using Bayesian model averaging. The weight for each model is obtained by integrating available data and expert knowledge using Bayesian belief networks. The multi-model approach is applied to a contaminated site. At the site a DNAPL (dense non aqueous phase liquid) spill consisting of PCE (perchloroethylene) has contaminated a fractured clay till aquitard overlaying a limestone aquifer. The exact shape and nature of the source is unknown and so is the importance of transport in the fractures. The result of the multi-model approach is a visual representation of the uncertainty of the mass discharge estimates for the site which can be used to support the management options.
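    The final combination step, Bayesian model averaging of the per-conceptual-model estimates, reduces to mixture moments once model weights are available. Here the weights are supplied directly; in the paper they come from Bayesian belief networks integrating site data and expert knowledge:

```python
def bma_mass_discharge(model_means, model_vars, model_weights):
    """Bayesian model averaging of per-conceptual-model mass discharge
    estimates: mixture mean and total variance.  The between-model term
    captures conceptual uncertainty on top of parametric uncertainty."""
    tot = sum(model_weights)
    w = [wi / tot for wi in model_weights]        # normalize the weights
    mean = sum(wi * m for wi, m in zip(w, model_means))
    # total variance = within-model + between-model components
    var = sum(wi * (v + (m - mean) ** 2)
              for wi, v, m in zip(w, model_vars, model_means))
    return mean, var
```

    When the conceptual models disagree strongly, the between-model term dominates the total variance, which is exactly the conceptual uncertainty the multi-model approach is designed to expose.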

  13. Causes of variation among rice models in yield response to CO2 examined with Free-Air CO2 Enrichment and growth chamber experiments.

    PubMed

    Hasegawa, Toshihiro; Li, Tao; Yin, Xinyou; Zhu, Yan; Boote, Kenneth; Baker, Jeffrey; Bregaglio, Simone; Buis, Samuel; Confalonieri, Roberto; Fugice, Job; Fumoto, Tamon; Gaydon, Donald; Kumar, Soora Naresh; Lafarge, Tanguy; Marcaida Iii, Manuel; Masutomi, Yuji; Nakagawa, Hiroshi; Oriol, Philippe; Ruget, Françoise; Singh, Upendra; Tang, Liang; Tao, Fulu; Wakatsuki, Hitomi; Wallach, Daniel; Wang, Yulong; Wilson, Lloyd Ted; Yang, Lianxin; Yang, Yubin; Yoshida, Hiroe; Zhang, Zhao; Zhu, Jianguo

    2017-11-01

    The CO2 fertilization effect is a major source of uncertainty in crop models for future yield forecasts, but coordinated efforts to determine the mechanisms of this uncertainty have been lacking. Here, we studied causes of uncertainty among 16 crop models in predicting rice yield in response to elevated [CO2] (E-[CO2]) by comparison to free-air CO2 enrichment (FACE) and chamber experiments. The model ensemble reproduced the experimental results well. However, yield prediction in response to E-[CO2] varied significantly among the rice models. The variation was not random: models that overestimated at one experiment simulated greater yield enhancements at the others. The variation was not associated with model structure or magnitude of photosynthetic response to E-[CO2] but was significantly associated with the predictions of leaf area. This suggests that modelled secondary effects of E-[CO2] on morphological development, primarily leaf area, are the sources of model uncertainty. Rice morphological development is conservative to carbon acquisition. Uncertainty will be reduced by incorporating this conservative nature of the morphological response to E-[CO2] into the models. Nitrogen levels, particularly under limited situations, make the prediction more uncertain. Improving models to account for [CO2] × N interactions is necessary to better evaluate management practices under climate change.

  14. System Identification and Uncertainty Quantification Using Orthogonal Excitations and the Semi-span Super Sonic Transport (S4T) Model

    NASA Technical Reports Server (NTRS)

    Heeg, Jennifer; Wieseman, Carol D.

    2012-01-01

    Orthogonal harmonic multisine excitations were utilized in a wind tunnel test and in simulation of the SemiSpan Supersonic Transport model to assess aeroservoelastic characteristics. Fundamental issues associated with analyzing sinusoidal signals were examined, including spectral leakage, excitation truncation, and uncertainties on frequency response functions and mean-square coherence. Simulation allowed for evaluation of these issues relative to a truth model, while wind tunnel data introduced real-world implementation issues.
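    The excitation and frequency-response estimation steps can be sketched as follows; assigning disjoint harmonic sets to different inputs is what makes the multisines orthogonal. The single-period FRF estimate below deliberately omits the averaging and windowing needed to handle the leakage and coherence issues examined above:

```python
import cmath
import math

def multisine(harmonics, n, phases):
    """One period of a harmonic multisine over n samples.  Giving each
    input channel a disjoint set of harmonics keeps the excitations
    orthogonal (sketch of the design, not the S4T test signals)."""
    return [sum(math.sin(2 * math.pi * k * t / n + phases[k])
                for k in harmonics) for t in range(n)]

def frf_at(u, y, k):
    """Frequency response estimate Y(k)/U(k) from a single period via the
    DFT at harmonic k; no averaging, so leakage/noise are not mitigated."""
    n = len(u)
    U = sum(u[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
    Y = sum(y[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
    return Y / U
```

    In practice the excitation is repeated over many periods and the per-period FRFs are averaged, which is also what allows uncertainty and mean-square coherence to be estimated at each excited harmonic.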

  15. Uncertainty in monitoring E. coli concentrations in streams and stormwater runoff

    NASA Astrophysics Data System (ADS)

    Harmel, R. D.; Hathaway, J. M.; Wagner, K. L.; Wolfe, J. E.; Karthikeyan, R.; Francesconi, W.; McCarthy, D. T.

    2016-03-01

    Microbial contamination of surface waters, a substantial public health concern throughout the world, is typically identified by fecal indicator bacteria such as Escherichia coli. Thus, monitoring E. coli concentrations is critical to evaluate current conditions, determine restoration effectiveness, and inform model development and calibration. An often overlooked component of these monitoring and modeling activities is understanding the inherent random and systematic uncertainty present in measured data. In this research, a review and subsequent analysis were performed to identify, document, and analyze measurement uncertainty of E. coli data collected in stream flow and stormwater runoff as individual discrete samples or throughout a single runoff event. Data on the uncertainty contributed by sample collection, sample preservation/storage, and laboratory analysis in measured E. coli concentrations were compiled and analyzed, and differences in sampling method and data quality scenarios were compared. The analysis showed that: (1) manual integrated sampling produced the lowest random and systematic uncertainty in individual samples, but automated sampling typically produced the lowest uncertainty when sampling throughout runoff events; (2) sample collection procedures often contributed the highest amount of uncertainty, although laboratory analysis introduced substantial random uncertainty and preservation/storage introduced substantial systematic uncertainty under some scenarios; and (3) the uncertainty in measured E. coli concentrations was greater than that of sediment and nutrients, but the difference was not as great as may be assumed. This comprehensive analysis of uncertainty in E. coli concentrations measured in streamflow and runoff should provide valuable insight for the design of E. coli monitoring projects, for quality assurance efforts to reduce uncertainty, for regulatory and policy decision making, and for fate and transport modeling.
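A standard way to combine the per-component uncertainties the abstract describes (sample collection, preservation/storage, laboratory analysis) into a total measurement uncertainty is root-sum-of-squares propagation of independent components. The sketch below uses placeholder percentages, not the study's estimates:

```python
import math

# Illustrative uncertainty components for a single E. coli sample, expressed
# as +/- percent of the measured concentration (placeholder values only).
components = {
    "sample_collection": 25.0,
    "preservation_storage": 15.0,
    "laboratory_analysis": 30.0,
}

# Root-sum-of-squares propagation, as commonly used in measurement-
# uncertainty budgets for independent random error sources.
total = math.sqrt(sum(u ** 2 for u in components.values()))
print(round(total, 1))  # -> 41.8 (percent)
```

Note that this combination rule applies to independent random components; systematic biases, which the abstract treats separately, add directionally rather than in quadrature.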

  16. Parameter uncertainty analysis of a biokinetic model of caesium

    DOE PAGES

    Li, W. B.; Klein, W.; Blanchardon, Eric; ...

    2014-04-17

    Parameter uncertainties for the biokinetic model of caesium (Cs) developed by Leggett et al. were inventoried and evaluated. The methods of parameter uncertainty analysis were used to assess the uncertainties of model predictions under assumed model parameter uncertainties and distributions. Furthermore, the importance of individual model parameters was assessed by means of sensitivity analysis. The calculated uncertainties of model predictions were compared with human data of Cs measured in blood and in the whole body. It was found that propagating the derived uncertainties in model parameter values reproduced the range of bioassay data observed in human subjects at different times after intake. The maximum ranges, expressed as uncertainty factors (UFs) (defined as the square root of the ratio between the 97.5th and 2.5th percentiles), of blood clearance, whole-body retention and urinary excretion of Cs predicted at early times after intake were, respectively: 1.5, 1.0 and 2.5 on the first day; 1.8, 1.1 and 2.4 at Day 10; and 1.8, 2.0 and 1.8 at Day 100. For late times (1000 d) after intake, the UFs increased to 43, 24 and 31, respectively. The model parameters of the transfer rates between kidneys and blood and between muscle and blood, and the rate of transfer from kidneys to urinary bladder content, are the most influential for the blood clearance and the whole-body retention of Cs. For the urinary excretion, the parameters of the transfer rates from urinary bladder content to urine and from kidneys to urinary bladder content have the greatest impact. The implications for the estimated equivalent and effective doses of the large uncertainty factor of 43 in whole-body retention at late times (after Day 500) will be explored in subsequent work in the framework of EURADOS.
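The uncertainty factor defined in the abstract, UF = sqrt(97.5th percentile / 2.5th percentile), is straightforward to compute once parameter uncertainty has been propagated by Monte Carlo. The sketch below uses a toy one-compartment retention model with an assumed lognormal transfer-rate distribution, not the Leggett Cs model:

```python
import numpy as np

rng = np.random.default_rng(42)

# Monte Carlo propagation through a toy retention model R(t) = exp(-k*t),
# with an assumed lognormal distribution on the transfer rate k (1/day).
k = rng.lognormal(mean=np.log(0.01), sigma=0.5, size=10_000)
t = 100.0                       # days after intake
retention = np.exp(-k * t)      # predicted whole-body retention fraction

# Uncertainty factor as defined in the abstract:
# UF = sqrt(97.5th percentile / 2.5th percentile) of the prediction.
p2_5, p97_5 = np.percentile(retention, [2.5, 97.5])
uf = np.sqrt(p97_5 / p2_5)
print(round(float(uf), 2))
```

Because retention decays exponentially in k*t, the same relative spread in k produces a much wider UF at late times than at early times, which is consistent with the growth of the reported UFs from Day 1 to 1000 d.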

  17. Evaluation of Parameter Uncertainty Reduction in Groundwater Flow Modeling Using Multiple Environmental Tracers

    NASA Astrophysics Data System (ADS)

    Arnold, B. W.; Gardner, P.

    2013-12-01

    Calibration of groundwater flow models for the purpose of evaluating flow and aquifer heterogeneity typically uses observations of hydraulic head in wells and appropriate boundary conditions. Environmental tracers have a wide variety of decay rates and input signals in recharge, resulting in a potentially broad source of additional information to constrain flow rates and heterogeneity. A numerical study was conducted to evaluate the reduction in uncertainty during model calibration using observations of various environmental tracers and combinations of tracers. A synthetic data set was constructed by simulating steady groundwater flow and transient tracer transport in a high-resolution, 2-D aquifer with heterogeneous permeability and porosity using the PFLOTRAN software code. Data on pressure and tracer concentration were extracted at well locations and then used as observations for automated calibration of a flow and transport model using the pilot point method and the PEST code. Optimization runs were performed to estimate parameter values of permeability at 30 pilot points in the model domain for cases using 42 observations of: 1) pressure, 2) pressure and CFC11 concentrations, 3) pressure and Ar-39 concentrations, and 4) pressure, CFC11, Ar-39, tritium, and He-3 concentrations. Results show significantly lower uncertainty, as indicated by the 95% linear confidence intervals, in permeability values at the pilot points for cases including observations of environmental tracer concentrations. The average linear uncertainty range for permeability at the pilot points using pressure observations alone is 4.6 orders of magnitude, using pressure and CFC11 concentrations is 1.6 orders of magnitude, using pressure and Ar-39 concentrations is 0.9 order of magnitude, and using pressure, CFC11, Ar-39, tritium, and He-3 concentrations is 1.0 order of magnitude. 
Data on Ar-39 concentrations result in the greatest parameter uncertainty reduction because its half-life of 269 years is similar to the range of transport times (hundreds to thousands of years) in the heterogeneous synthetic aquifer domain. The slightly higher uncertainty range for the case using all of the environmental tracers simultaneously is probably due to structural errors in the model introduced by the pilot point regularization scheme. It is concluded that maximum information and uncertainty reduction for constraining a groundwater flow model is obtained using an environmental tracer whose half-life is well matched to the range of transport times through the groundwater flow system. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
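The core result, that adding tracer observations shrinks the linear confidence intervals on permeability, can be illustrated with a minimal linear-Gaussian analogue: posterior precision is the prior precision plus a term for each independent observation, and the 95% interval width follows directly. The sensitivities and error levels below are illustrative, not values from the PEST/PFLOTRAN study:

```python
import math

def ci_width_orders(prior_sigma, obs):
    """95% linear confidence interval width, in orders of magnitude, on a
    log10-permeability parameter after assimilating observations given as
    (sensitivity, error_sigma) pairs (linear-Gaussian assumption)."""
    precision = 1.0 / prior_sigma ** 2
    precision += sum((h / s) ** 2 for h, s in obs)
    return 2 * 1.96 / math.sqrt(precision)

prior = 2.0  # assumed prior std dev of log10(permeability)
pressure_only = ci_width_orders(prior, [(0.5, 1.0)])
pressure_and_tracer = ci_width_orders(prior, [(0.5, 1.0), (2.0, 1.0)])
print(round(pressure_only, 2), round(pressure_and_tracer, 2))
```

The tracer term enters with a larger sensitivity here, mimicking a tracer (like Ar-39) whose decay timescale matches the transport times and therefore constrains permeability strongly; the interval shrinks accordingly.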

  18. Post-processing of multi-hydrologic model simulations for improved streamflow projections

    NASA Astrophysics Data System (ADS)

    khajehei, sepideh; Ahmadalipour, Ali; Moradkhani, Hamid

    2016-04-01

    Hydrologic model outputs are prone to bias and uncertainty due to knowledge deficiencies in models and data. Uncertainty in hydroclimatic projections arises from the hydrologic model as well as from epistemic and aleatory uncertainties in GCM parameterization and development. This study is conducted to: 1) evaluate a recently developed multivariate post-processing method for historical simulations and 2) assess the effect of post-processing on the uncertainty and reliability of future streamflow projections in both high-flow and low-flow conditions. The first objective is performed for the historical period of 1970-1999. Future streamflow projections are generated for 10 statistically downscaled GCMs from two widely used downscaling methods: Bias Corrected Statistically Downscaled (BCSD) and Multivariate Adaptive Constructed Analogs (MACA), over the period 2010-2099 for two representative concentration pathways, RCP4.5 and RCP8.5. Three semi-distributed hydrologic models were employed and calibrated at 1/16 degree latitude-longitude resolution for over 100 points across the Columbia River Basin (CRB) in the Pacific Northwest, USA. Streamflow outputs are post-processed through a Bayesian framework based on copula functions. The post-processing approach relies on a transfer function developed from the bivariate joint distribution of observations and simulations in the historical period. Results show that applying the post-processing technique leads to considerably higher accuracy in historical simulations and reduces model uncertainty in future streamflow projections.
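The transfer-function idea can be sketched with a Gaussian copula: map observed and simulated flows to standard-normal scores through their empirical margins, estimate the dependence, then correct a new simulated value by conditioning on it and mapping back through the observed margin. This is a deliberately simplified stand-in for the study's Bayesian copula framework, on synthetic data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic calibration period: skewed "observed" flow and a biased simulation
n = 5000
obs = rng.gamma(shape=2.0, scale=50.0, size=n)
sim = 0.6 * obs + rng.normal(0.0, 20.0, size=n) + 30.0

def to_normal_scores(x):
    """Map a sample to standard-normal scores via its empirical ranks."""
    ranks = stats.rankdata(x) / (len(x) + 1)
    return stats.norm.ppf(ranks)

z_obs, z_sim = to_normal_scores(obs), to_normal_scores(sim)
rho = np.corrcoef(z_sim, z_obs)[0, 1]   # Gaussian-copula dependence parameter

# Post-process one new simulated value: find its normal score via the
# monotone empirical mapping, condition the copula on it (conditional median
# of z_obs given z_sim is rho*z_sim), and map back through the observed margin.
z_new = np.interp(np.mean(sim), np.sort(sim), np.sort(z_sim))
corrected = np.quantile(obs, stats.norm.cdf(rho * z_new))
print(round(float(rho), 2), round(float(corrected), 1))
```

Conditioning rather than plain quantile mapping is what lets the copula approach carry simulation skill (through rho) into the corrected value instead of simply replacing the simulated distribution with the observed one.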

  19. Data-free and data-driven spectral perturbations for RANS UQ

    NASA Astrophysics Data System (ADS)

    Edeling, Wouter; Mishra, Aashwin; Iaccarino, Gianluca

    2017-11-01

    Despite recent developments in high-fidelity turbulent flow simulations, RANS modeling is still widely used by industry due to its inherent low cost. Since accuracy is a concern in RANS modeling, model-form UQ is an essential tool for assessing the impacts of this uncertainty on quantities of interest. Applying the spectral decomposition to the modeled Reynolds-stress tensor (RST) allows for the introduction of decoupled perturbations into the baseline intensity (kinetic energy), shape (eigenvalues), and orientation (eigenvectors). This constitutes a natural methodology to evaluate the model-form uncertainty associated with different aspects of RST modeling. In a predictive setting, one frequently encounters an absence of any relevant reference data. To make data-free predictions with quantified uncertainty we employ physical bounds to define, a priori, maximum spectral perturbations. When propagated, these perturbations yield intervals of engineering utility. High-fidelity data open up the possibility of inferring a distribution of uncertainty by means of various data-driven machine-learning techniques. We will demonstrate our framework on a number of flow problems where RANS models are prone to failure. This research was partially supported by the Defense Advanced Research Projects Agency under the Enabling Quantification of Uncertainty in Physical Systems (EQUiPS) project (technical monitor: Dr Fariba Fahroo), and the DOE PSAAP-II program.
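The decomposition described above (intensity, shape, orientation) can be sketched directly: extract the kinetic energy and anisotropy tensor from a Reynolds-stress tensor, perturb the anisotropy eigenvalues toward a realizable limiting state, and reassemble. The tensor values and perturbation size below are made up for illustration:

```python
import numpy as np

# A made-up symmetric Reynolds-stress tensor (not from any actual flow)
R = np.array([[2.0, 0.3, 0.0],
              [0.3, 1.0, 0.1],
              [0.0, 0.1, 0.5]])

k = 0.5 * np.trace(R)                     # intensity: turbulent kinetic energy
a = R / (2.0 * k) - np.eye(3) / 3.0       # normalized anisotropy tensor
eigvals, eigvecs = np.linalg.eigh(a)      # shape (eigenvalues) + orientation

# Perturb the shape toward the one-component limit, whose anisotropy
# eigenvalues are (-1/3, -1/3, 2/3); delta in [0, 1] controls the magnitude.
target = np.array([-1.0 / 3.0, -1.0 / 3.0, 2.0 / 3.0])
delta = 0.2
perturbed_vals = (1.0 - delta) * eigvals + delta * target

# Reassemble: orientation and intensity are untouched, so the perturbation
# is decoupled and the trace (2k) is preserved.
a_pert = eigvecs @ np.diag(perturbed_vals) @ eigvecs.T
R_pert = 2.0 * k * (a_pert + np.eye(3) / 3.0)
print(np.trace(R_pert))   # equals np.trace(R)
```

Because the perturbed eigenvalues interpolate between realizable states, the perturbed tensor stays inside the barycentric (realizability) triangle, which is what makes the a priori bounds of the data-free approach physically defensible.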

  20. Climate model uncertainty in impact assessments for agriculture: A multi-ensemble case study on maize in sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Dale, Amy; Fant, Charles; Strzepek, Kenneth; Lickley, Megan; Solomon, Susan

    2017-03-01

    We present maize production in sub-Saharan Africa as a case study in the exploration of how uncertainties in global climate change, as reflected in projections from a range of climate model ensembles, influence climate impact assessments for agriculture. The crop model AquaCrop-OS (Food and Agriculture Organization of the United Nations) was modified to run on a 2° × 2° grid and coupled to 122 climate model projections from multi-model ensembles for three emission scenarios (Coupled Model Intercomparison Project Phase 3 [CMIP3] SRES A1B and CMIP5 Representative Concentration Pathway [RCP] scenarios 4.5 and 8.5) as well as two "within-model" ensembles (NCAR CCSM3 and ECHAM5/MPI-OM) designed to capture internal variability (i.e., uncertainty due to chaos in the climate system). In spite of high uncertainty, most notably in the high-producing semi-arid zones, we observed robust regional and sub-regional trends across all ensembles. In agreement with previous work, we project widespread yield losses in the Sahel region and Southern Africa, resilience in Central Africa, and sub-regional increases in East Africa and at the southern tip of the continent. Spatial patterns of yield losses corresponded with spatial patterns of aridity increases, which were explicitly evaluated. Internal variability was a major source of uncertainty in both within-model and between-model ensembles and explained the majority of the spatial distribution of uncertainty in yield projections. Projected climate change impacts on maize production in different regions and nations ranged from near-zero or positive (upper quartile estimates) to substantially negative (lower quartile estimates), highlighting a need for risk management strategies that are adaptive and robust to uncertainty.

  1. Evaluation of Trapped Radiation Model Uncertainties for Spacecraft Design

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    2000-01-01

    The standard AP8 and AE8 models for predicting trapped proton and electron environments have been compared with several sets of flight data to evaluate model uncertainties. Model comparisons are made with flux, dose, and activation measurements made on various U.S. low-Earth orbit satellites (APEX, CRRES, DMSP, LDEF, NOAA) and Space Shuttle flights, on Russian satellites (Photon-8, Cosmos-1887, Cosmos-2044), and on the Russian Mir Space Station. This report gives a summary of the model-data comparisons; detailed results are given in a companion report. Results from the model comparisons with flight data show, for example, that the AP8 model underpredicts the trapped proton flux at low altitudes by a factor of about two (independent of proton energy and solar cycle conditions), and that the AE8 model overpredicts the flux in the outer electron belt by an order of magnitude or more.

  3. Evaluation of Neutron Radiography Reactor LEU-Core Start-Up Measurements

    DOE PAGES

    Bess, John D.; Maddock, Thomas L.; Smolinski, Andrew T.; ...

    2014-11-04

    Benchmark models were developed to evaluate the cold-critical start-up measurements performed during the fresh core reload of the Neutron Radiography (NRAD) reactor with Low Enriched Uranium (LEU) fuel. Experiments include criticality, control-rod worth measurements, shutdown margin, and excess reactivity for four core loadings with 56, 60, 62, and 64 fuel elements. The worth of four graphite reflector block assemblies and an empty dry tube used for experiment irradiations were also measured and evaluated for the 60-fuel-element core configuration. Dominant uncertainties in the experimental keff come from uncertainties in the manganese content and impurities in the stainless steel fuel cladding as well as the ²³⁶U and erbium poison content in the fuel matrix. Calculations with MCNP5 and ENDF/B-VII.0 neutron nuclear data are approximately 1.4% (9σ) greater than the benchmark model eigenvalues, which is commonly seen in Monte Carlo simulations of other TRIGA reactors. Simulations of the worth measurements are within the 2σ uncertainty for most of the benchmark experiment worth values. The complete benchmark evaluation details are available in the 2014 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments.

  5. Pragmatic geometric model evaluation

    NASA Astrophysics Data System (ADS)

    Pamer, Robert

    2015-04-01

    Quantification of subsurface model reliability is mathematically and technically demanding, as there are many different sources of uncertainty and some of the factors can only be assessed subjectively. For many practical applications in industry or risk assessment (e.g. geothermal drilling), a quantitative estimation of possible geometric variations in depth units is preferred over relative numbers because of cost calculations for different scenarios. The talk gives an overview of several factors that affect the geometry of structural subsurface models that are based upon typical geological survey organization (GSO) data such as geological maps, borehole data, and conceptually driven construction of subsurface elements (e.g. fault networks). Within the context of the trans-European project "GeoMol", uncertainty analysis has to be pragmatic, not least because data rights, data policies and modelling software differ between the project partners. In a case study, a two-step evaluation methodology for geometric subsurface model uncertainty is being developed. In a first step, several models of the same volume of interest have been calculated by successively omitting more and more input data types (seismic constraints, fault network, outcrop data). The positions of the various horizon surfaces are then compared. The procedure is equivalent to comparing data of various levels of detail and therefore structural complexity. This gives a measure of the structural significance of each data set in space, and as a consequence areas of geometric complexity are identified. These areas are usually very data sensitive; hence geometric variability between individual data points in these areas is higher than in areas of low structural complexity.
Instead of calculating a multitude of different models by varying some input data or parameters, as is done in Monte Carlo simulations, the aim of the second step of the evaluation procedure (which is part of the ongoing work) is to calculate two model variations that can be seen as geometric extremes of all available input data. This does not lead to a probability distribution for the spatial position of geometric elements, but it defines zones of major (or minor, respectively) geometric variation due to data uncertainty. Both model evaluations are then analyzed together to give ranges of possible model outcomes in metric units.

  6. Understanding Climate Uncertainty with an Ocean Focus

    NASA Astrophysics Data System (ADS)

    Tokmakian, R. T.

    2009-12-01

    Uncertainty in climate simulations arises from various aspects of the end-to-end process of modeling the Earth’s climate. First, there is uncertainty from the structure of the climate model components (e.g. ocean/ice/atmosphere). Even the most complex models are deficient, not only in the complexity of the processes they represent, but in which processes are included in a particular model. Next, uncertainties arise from the inherent error in the initial and boundary conditions of a simulation. Initial conditions describe the state of the weather or climate at the beginning of the simulation and typically come from observations. Finally, there is the uncertainty associated with the values of parameters in the model. These parameters may represent physical constants or effects, such as ocean mixing, or non-physical aspects of modeling and computation. The uncertainty in these input parameters propagates through the non-linear model to give uncertainty in the outputs. The models in 2020 will no doubt be better than today’s models, but they will still be imperfect, and development of uncertainty analysis technology is a critical aspect of understanding model realism and prediction capability. Smith [2002] and Cox and Stephenson [2007] discuss the need for methods to quantify the uncertainties within complicated systems so that limitations or weaknesses of the climate model can be understood. In making climate predictions, we need both the most reliable model or simulation available and methods to quantify the reliability of that simulation. If quantitative uncertainty questions about the internal model dynamics are to be answered with complex simulations such as AOGCMs, then the only known path forward is based on model ensembles that characterize behavior with alternative parameter settings [e.g. Rougier, 2007].
The relevance and feasibility of using "Statistical Analysis of Computer Code Output" (SACCO) methods for examining uncertainty in ocean circulation due to parameter specification will be described and early results using the ocean/ice components of the CCSM climate model in a designed experiment framework will be shown. Cox, P. and D. Stephenson, Climate Change: A Changing Climate for Prediction, 2007, Science 317 (5835), 207, DOI: 10.1126/science.1145956. Rougier, J. C., 2007: Probabilistic Inference for Future Climate Using an Ensemble of Climate Model Evaluations, Climatic Change, 81, 247-264. Smith L., 2002, What might we learn from climate forecasts? Proc. Nat’l Academy of Sciences, Vol. 99, suppl. 1, 2487-2492 doi:10.1073/pnas.012580599.

  7. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through the uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e. the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of model development, so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses).
Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping points) in the face of environmental and anthropogenic change (Perz, Muñoz-Carpena, Kiker and Holt, 2013), and, through Monte Carlo mapping of potential management activities over the most important factors or processes, to influence the system towards behavioral (desirable) outcomes (Chu-Agor, Muñoz-Carpena et al., 2012).
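A minimal building block of the GSA/UA workflow described above is variance-based sensitivity analysis. The sketch below estimates first-order Sobol indices with the pick-freeze (Saltelli) estimator on a toy model in which one parameter dominates and one is inert; it stands in for, but does not reproduce, the full screening-plus-variance-decomposition pipeline of the cited work:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    """Toy model: parameter 0 dominates, parameter 2 has no effect."""
    return x[:, 0] + 0.3 * x[:, 1] ** 2 + 0.0 * x[:, 2]

n, d = 100_000, 3
A = rng.uniform(-1, 1, size=(n, d))   # two independent input sample matrices
B = rng.uniform(-1, 1, size=(n, d))
yA, yB = model(A), model(B)
var_y = yA.var()

S = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]               # resample only the i-th input
    yABi = model(ABi)
    # First-order Sobol index, Saltelli (2010) estimator
    S.append(np.mean(yB * (yABi - yA)) / var_y)
print([round(float(s), 2) for s in S])
```

Indices near 1 flag factors whose uncertainty dominates the output variance (candidates for better data or refinement), while indices near 0 flag components whose added complexity buys little relevance, which is exactly the trade-off the trilemma framing asks about.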

  8. Uncertainty quantification and validation of combined hydrological and macroeconomic analyses.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez, Jacquelynne; Parks, Mancel Jordan; Jennings, Barbara Joan

    2010-09-01

    Changes in climate can lead to instabilities in physical and economic systems, particularly in regions with marginal resources. Global climate models indicate increasing global mean temperatures over the decades to come, and uncertainty in local-to-national impacts means that perceived risks will drive planning decisions. Agent-based models provide one of the few ways to evaluate the potential changes in behavior in coupled social-physical systems and to quantify and compare risks. The current generation of climate impact analyses provides estimates of the economic cost of climate change for a limited set of climate scenarios that account for a small subset of the dynamics and uncertainties. To better understand the risk to national security, the next generation of risk assessment models must represent global stresses, population vulnerability to those stresses, and the uncertainty in population responses and outcomes that could have a significant impact on U.S. national security.

  9. Results of the Greenland Ice Sheet Model Initialisation Experiments ISMIP6 - initMIP-Greenland

    NASA Astrophysics Data System (ADS)

    Goelzer, H.; Nowicki, S.; Edwards, T.; Beckley, M.; Abe-Ouchi, A.; Aschwanden, A.; Calov, R.; Gagliardini, O.; Gillet-Chaulet, F.; Golledge, N. R.; Gregory, J. M.; Greve, R.; Humbert, A.; Huybrechts, P.; Larour, E. Y.; Lipscomb, W. H.; Le clec'h, S.; Lee, V.; Kennedy, J. H.; Pattyn, F.; Payne, A. J.; Rodehacke, C. B.; Rückamp, M.; Saito, F.; Schlegel, N.; Seroussi, H. L.; Shepherd, A.; Sun, S.; van de Wal, R.; Ziemen, F. A.

    2016-12-01

    Earlier large-scale Greenland ice sheet sea-level projections, e.g. those run during the ice2sea and SeaRISE initiatives, have shown that ice sheet initialisation can have a large effect on the projections and gives rise to important uncertainties. The goal of this intercomparison exercise (initMIP-Greenland) is to compare, evaluate and improve the initialisation techniques used in the ice sheet modelling community and to estimate the associated uncertainties. It is the first in a series of ice sheet model intercomparison activities within ISMIP6 (Ice Sheet Model Intercomparison Project for CMIP6). Two experiments for the large-scale Greenland ice sheet have been designed to allow intercomparison between participating models of 1) the initial present-day state of the ice sheet and 2) the response in two schematic forward experiments. The forward experiments serve to evaluate the initialisation in terms of model drift (forward run without any forcing) and response to a large perturbation (prescribed surface mass balance anomaly). We present and discuss final results of the intercomparison and highlight important uncertainties with respect to projections of the Greenland ice sheet sea-level contribution.

  10. Exploring dust emission responses to land cover change using an ecological land classification

    NASA Astrophysics Data System (ADS)

    Galloza, Magda S.; Webb, Nicholas P.; Bleiweiss, Max P.; Winters, Craig; Herrick, Jeffrey E.; Ayers, Eldon

    2018-06-01

    Despite efforts to quantify the impacts of land cover change on wind erosion, assessment uncertainty remains large. We address this uncertainty by evaluating the application of ecological site concepts and state-and-transition models (STMs) for detecting and quantitatively describing the impacts of land cover change on wind erosion. We apply a dust emission model over a rangeland study area in the northern Chihuahuan Desert, New Mexico, USA, and evaluate spatiotemporal patterns of modelled horizontal sediment mass flux and dust emission in the context of ecological sites and their vegetation states; representing a diversity of land cover types. Our results demonstrate how the impacts of land cover change on dust emission can be quantified, compared across land cover classes, and interpreted in the context of an ecological model that encapsulates land management intensity and change. Results also reveal the importance of established weaknesses in the dust model soil characterisation and drag partition scheme, which appeared generally insensitive to the impacts of land cover change. New models that address these weaknesses, coupled with ecological site concepts and field measurements across land cover types, could significantly reduce assessment uncertainties and provide opportunities for identifying land management options.

  11. Uncertainty-based Optimization Algorithms in Designing Fractionated Spacecraft

    PubMed Central

    Ning, Xin; Yuan, Jianping; Yue, Xiaokui

    2016-01-01

    A fractionated spacecraft is an innovative application of a distributed space system. To fully understand the impact of various uncertainties on its development, launch and in-orbit operation, we use the stochastic mission-cycle cost to comprehensively evaluate the survivability, flexibility, reliability and economy of the ways of dividing the various modules of the different configurations of fractionated spacecraft. We systematically describe the concept, review the evaluation and optimal design methods developed in recent years, and propose the stochastic mission-cycle cost for comprehensive evaluation. We also establish models of the costs, such as module development, launch and deployment, and of the impacts of their respective uncertainties. Finally, we carry out Monte Carlo simulations of the complete mission-cycle costs of various configurations of the fractionated spacecraft under various uncertainties, and present and compare the probability density distribution and statistical characteristics of the stochastic mission-cycle cost using two strategies: timed module replacement and non-timed module replacement. The simulation results verify the effectiveness of the comprehensive evaluation method and show that it can comprehensively evaluate the adaptability of the fractionated spacecraft under different technical and mission conditions. PMID:26964755
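The Monte Carlo evaluation of a stochastic mission-cycle cost can be sketched as follows: draw uncertain development, launch, and replacement costs for each module, sum them per trial, and summarise the resulting distribution. All distributions, prices, and the failure probability below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative configuration: 4 independent modules, 50,000 Monte Carlo trials
n_trials, n_modules = 50_000, 4

dev = rng.normal(10.0, 1.5, size=(n_trials, n_modules))            # $M each
launch = rng.lognormal(np.log(5.0), 0.3, size=(n_trials, n_modules))
failed = rng.random((n_trials, n_modules)) < 0.1                   # 10% loss
replacement = failed * (0.8 * dev + launch)   # rebuild at 80% cost + relaunch

# Stochastic mission-cycle cost per trial, then distribution statistics
total = (dev + launch + replacement).sum(axis=1)
mean, p95 = total.mean(), np.percentile(total, 95)
print(round(float(mean), 1), round(float(p95), 1))
```

One advantage of summarising the full distribution rather than a point estimate is visible immediately: the 95th-percentile cost sits well above the mean, which is the kind of tail behaviour that distinguishes module-division strategies with similar expected costs.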

  12. Particle image velocimetry correlation signal-to-noise ratio metrics and measurement uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Xue, Zhenyu; Charonko, John J.; Vlachos, Pavlos P.

    2014-11-01

    In particle image velocimetry (PIV) the measurement signal is contained in the recorded intensity of the particle image pattern superimposed on a variety of noise sources. The signal-to-noise ratio (SNR) strength governs the resulting PIV cross correlation and ultimately the accuracy and uncertainty of the resulting PIV measurement. Hence we posit that correlation SNR metrics calculated from the correlation plane can be used to quantify the quality of the correlation and the resulting uncertainty of an individual measurement. In this paper we extend the original work by Charonko and Vlachos and present a framework for evaluating the correlation SNR using a set of different metrics, which in turn are used to develop models for uncertainty estimation. Several corrections have been applied in this work. The SNR metrics and corresponding models presented herein are expanded to be applicable to both standard and filtered correlations by applying a subtraction of the minimum correlation value to remove the effect of the background image noise. In addition, the notion of a ‘valid’ measurement is redefined with respect to the correlation peak width in order to be consistent with uncertainty quantification principles and distinct from an ‘outlier’ measurement. Finally, the type and significance of the error distribution function are investigated. These advancements lead to more robust and reliable uncertainty estimation models compared with the original work by Charonko and Vlachos. The models are tested against both synthetic benchmark data and experimental measurements. In this work, U68.5 uncertainties are estimated at the 68.5% confidence level while U95 uncertainties are estimated at the 95% confidence level. For all cases the resulting calculated coverage factors approximate the expected theoretical confidence intervals, thus demonstrating the applicability of these new models for estimation of uncertainty for individual PIV measurements.
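
    A minimal illustration of one widely used correlation SNR metric, the primary peak ratio (tallest correlation peak over the second tallest, after the minimum subtraction described above). The synthetic correlation plane, peak locations and exclusion-window size are invented for illustration; this is not the paper's implementation:

```python
import numpy as np

def primary_peak_ratio(corr):
    """Primary peak ratio: tallest correlation peak divided by the second
    tallest, computed after subtracting the plane minimum so that the
    background image-noise offset does not inflate both peaks."""
    c = corr - corr.min()
    i, j = np.unravel_index(np.argmax(c), c.shape)
    peak1 = c[i, j]
    mask = np.ones_like(c, dtype=bool)
    mask[max(i - 2, 0):i + 3, max(j - 2, 0):j + 3] = False  # exclude primary peak
    peak2 = c[mask].max()
    return peak1 / max(peak2, 1e-12)

# Synthetic 32x32 "correlation plane": noise floor, one dominant displacement
# peak and one weaker spurious peak (all values invented).
rng = np.random.default_rng(0)
plane = rng.normal(0.1, 0.02, (32, 32))
plane[16, 16] += 1.0
plane[5, 24] += 0.3
ppr = primary_peak_ratio(plane)
print("PPR =", round(ppr, 2))
```

    A high PPR indicates an unambiguous displacement peak; uncertainty models of the kind developed in the paper map such metrics to an uncertainty estimate for each vector.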

  13. Application of probabilistic modelling for the uncertainty evaluation of alignment measurements of large accelerator magnets assemblies

    NASA Astrophysics Data System (ADS)

    Doytchinov, I.; Tonnellier, X.; Shore, P.; Nicquevert, B.; Modena, M.; Mainaud Durand, H.

    2018-05-01

    Micrometric assembly and alignment requirements for future particle accelerators, and especially large assemblies, create the need for accurate uncertainty budgeting of alignment measurements. Measurements and uncertainties have to be accurately stated and traceable to international standards, at the level of tens of µm for metre-long assemblies. Indeed, these hundreds of assemblies will be produced and measured by several suppliers around the world, and will have to be integrated into a single machine. As part of the PACMAN project at CERN, we proposed and studied a practical application of probabilistic modelling of task-specific alignment uncertainty by applying a simulation-by-constraints calibration method. Using this method, we calibrated our measurement model using available data from ISO standardised tests (10360 series) for the metrology equipment. We combined this model with reference measurements and analysis of the measured data to quantify the actual specific uncertainty of each alignment measurement procedure. Our methodology was successfully validated against a calibrated and traceable 3D artefact as part of an international inter-laboratory study. The validated models were used to study the expected alignment uncertainty and important sensitivity factors in measuring the shortest and longest of the Compact Linear Collider study assemblies, 0.54 m and 2.1 m long respectively. In both cases, the laboratory alignment uncertainty was within the targeted uncertainty budget of 12 µm (68% confidence level). It was found that the remaining uncertainty budget for any additional alignment error compensations, such as the thermal drift error due to variation in machine operation heat load conditions, must be within 8.9 µm and 9.8 µm (68% confidence level) respectively.

  14. Balancing the stochastic description of uncertainties as a function of hydrologic model complexity

    NASA Astrophysics Data System (ADS)

    Del Giudice, D.; Reichert, P.; Albert, C.; Kalcic, M.; Logsdon Muenich, R.; Scavia, D.; Bosch, N. S.; Michalak, A. M.

    2016-12-01

    Uncertainty analysis is becoming an important component of forecasting water and pollutant fluxes in urban and rural environments. Properly accounting for errors in the modeling process can help to robustly assess the uncertainties associated with the inputs (e.g. precipitation) and outputs (e.g. runoff) of hydrological models. In recent years we have investigated several Bayesian methods to infer the parameters of a mechanistic hydrological model along with those of the stochastic error component. The latter describes the uncertainties of model outputs and possibly inputs. We have adapted our framework to a variety of applications, ranging from predicting floods in small stormwater systems to nutrient loads in large agricultural watersheds. Given practical constraints, we discuss how in general the number of quantities to infer probabilistically varies inversely with the complexity of the mechanistic model. Most often, when evaluating a hydrological model of intermediate complexity, we can infer the parameters of the model as well as of the output error model. Describing the output errors as a first-order autoregressive process can realistically capture the "downstream" effect of inaccurate inputs and structure. With simpler runoff models we can additionally quantify input uncertainty by using a stochastic rainfall process. For complex hydrologic transport models, instead, we show that keeping model parameters fixed and just estimating time-dependent output uncertainties could be a viable option. The common goal across all these applications is to create time-dependent prediction intervals which are both reliable (cover the nominal amount of validation data) and precise (are as narrow as possible). In conclusion, we recommend careful attention to the choice of both the hydrological model and the probabilistic error description. 
The latter can include output uncertainty only, if the model is computationally expensive, or, with simpler models, it can separately account for different sources of errors, such as those in the inputs and in the structure of the model.
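
    The idea of describing output errors as a first-order autoregressive process, and of checking that the resulting prediction bands are reliable (nominal coverage) and precise (narrow), can be sketched as follows. The toy model output, noise parameters and 1-sigma band are illustrative assumptions, not the authors' setup:

```python
import math
import random

def ar1_prediction_interval(y_model, sigma, phi, z=1.0):
    """Approximate prediction bands for model output whose errors follow a
    first-order autoregressive process e_t = phi*e_{t-1} + eta_t, where
    eta_t ~ N(0, sigma^2). The marginal error sd is sigma/sqrt(1-phi^2)."""
    s = sigma / math.sqrt(1.0 - phi ** 2)
    lower = [y - z * s for y in y_model]
    upper = [y + z * s for y in y_model]
    return lower, upper

def coverage(y_obs, lower, upper):
    """Fraction of observations falling inside the band (reliability)."""
    hits = sum(l <= y <= u for y, l, u in zip(y_obs, lower, upper))
    return hits / len(y_obs)

# Synthetic "truth": a smooth model output plus AR(1) noise.
rng = random.Random(42)
phi, sigma = 0.8, 0.5
y_model = [10 + 2 * math.sin(t / 5) for t in range(500)]
e = 0.0
y_obs = []
for y in y_model:
    e = phi * e + rng.gauss(0, sigma)
    y_obs.append(y + e)

lo, hi = ar1_prediction_interval(y_model, sigma, phi)
cov = coverage(y_obs, lo, hi)
print("empirical coverage of the 1-sigma band:", round(cov, 2))
```

    With a correctly specified error model the 1-sigma band should cover roughly 68% of the validation data, which is exactly the reliability check described above.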

  15. Uncertainties have a meaning: Information entropy as a quality measure for 3-D geological models

    NASA Astrophysics Data System (ADS)

    Wellmann, J. Florian; Regenauer-Lieb, Klaus

    2012-03-01

    Analyzing, visualizing and communicating uncertainties are important issues, as geological models can never be fully determined. To date, there exists no general approach to quantify uncertainties in geological modeling. We propose here to use information entropy as an objective measure to compare and evaluate model and observational results. Information entropy was introduced in the 1950s and defines, at every location in the model, a scalar measure of predictability. We show that this method not only provides a quantitative insight into model uncertainties but, due to the underlying concept of information entropy, can be related to questions of data integration (i.e. how is the model quality interconnected with the used input data) and model evolution (i.e. does new data - or a changed geological hypothesis - optimize the model). In other words, information entropy is a powerful measure to be used for data assimilation and inversion. As a first test of feasibility, we present the application of the new method to the visualization of uncertainties in geological models, here understood as structural representations of the subsurface. Applying the concept of information entropy to a suite of simulated models, we can clearly identify (a) uncertain regions within the model, even for complex geometries; (b) the overall uncertainty of a geological unit, which is, for example, of great relevance in any type of resource estimation; (c) a mean entropy for the whole model, important to track model changes with one overall measure. These results cannot easily be obtained with existing standard methods. The results suggest that information entropy is a powerful method to visualize uncertainties in geological models, and to classify the indefiniteness of single units and the mean entropy of a model quantitatively. 
Due to the relationship of this measure to the missing information, we expect the method to have a great potential in many types of geoscientific data assimilation problems — beyond pure visualization.
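
    The underlying computation is simple: at each model cell, information entropy is H = -Σ p_i ln p_i over the probabilities p_i of the possible geological units at that cell. A minimal sketch (the unit probabilities are invented for illustration):

```python
import math

def cell_entropy(probs):
    """Information entropy H = -sum(p * ln p) at a single model location,
    where probs are the probabilities of each geological unit there."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Three illustrative cells over three possible units:
certain = cell_entropy([1.0, 0.0, 0.0])   # fully determined -> H = 0
mixed = cell_entropy([0.8, 0.15, 0.05])   # some uncertainty
uniform = cell_entropy([1/3, 1/3, 1/3])   # maximal uncertainty -> H = ln 3

# Averaging H over all cells gives the single whole-model measure (c) above.
mean_H = (certain + mixed + uniform) / 3
print(round(certain, 3), round(mixed, 3), round(uniform, 3), round(mean_H, 3))
```

    Entropy is zero where the unit is certain and maximal where all units are equally likely, which is why it serves both as a per-cell visualization and, averaged, as a whole-model quality measure.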

  16. “Wrong, but Useful”: Negotiating Uncertainty in Infectious Disease Modelling

    PubMed Central

    Christley, Robert M.; Mort, Maggie; Wynne, Brian; Wastling, Jonathan M.; Heathwaite, A. Louise; Pickup, Roger; Austin, Zoë; Latham, Sophia M.

    2013-01-01

    For infectious disease dynamical models to inform policy for containment of infectious diseases, the models must be able to predict; however, it is well recognised that such prediction will never be perfect. Nevertheless, the consensus is that although models are uncertain, some may yet inform effective action. This assumes that the quality of a model can be ascertained in order to sufficiently evaluate model uncertainties, and to decide whether or not, or in what ways or under what conditions, the model should be ‘used’. We examined uncertainty in modelling, utilising a range of data: interviews with scientists, policy-makers and advisors, and analysis of policy documents, scientific publications and reports of major inquiries into key livestock epidemics. We show that the discourse of uncertainty in infectious disease models is multi-layered, flexible, contingent, embedded in context, and plays a critical role in negotiating model credibility. We argue that the usability and stability of a model are an outcome of the negotiation that occurs within the networks and discourses surrounding it. This negotiation employs a range of discursive devices that render uncertainty in infectious disease modelling a plastic quality that is amenable to ‘interpretive flexibility’. The utility of models in the face of uncertainty is a function of this flexibility, the negotiation this allows, and the contexts in which model outputs are framed and interpreted in the decision-making process. We contend that rather than being based predominantly on beliefs about quality, the usefulness and authority of a model may at times be primarily based on its functional status within the broad social and political environment in which it acts. PMID:24146851

  17. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  18. A methodology for computing uncertainty bounds of multivariable systems based on sector stability theory concepts

    NASA Technical Reports Server (NTRS)

    Waszak, Martin R.

    1992-01-01

    The application of a sector-based stability theory approach to the formulation of useful uncertainty descriptions for linear, time-invariant, multivariable systems is explored. A review of basic sector properties and the sector-based approach is presented first. The sector-based approach is then applied to several general forms of parameter uncertainty to investigate its advantages and limitations. The results indicate that the sector uncertainty bound can be used effectively to evaluate the impact of parameter uncertainties on the frequency response of the design model. Inherent conservatism is a potential limitation of the sector-based approach, especially for highly dependent uncertain parameters. In addition, the representation of the system dynamics can affect the amount of conservatism reflected in the sector bound. Careful application of the model can help to reduce this conservatism, however, and the solution approach has some degrees of freedom that may be further exploited to reduce the conservatism.

  19. Uncertainty Assessments of 2D and Axisymmetric Hypersonic Shock Wave - Turbulent Boundary Layer Interaction Simulations at Compression Corners

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.; Berry, Scott A.; VanNorman, John W.

    2011-01-01

    This paper is one of a series of five papers in a special session organized by the NASA Fundamental Aeronautics Program that addresses uncertainty assessments for CFD simulations in hypersonic flow. Simulations of a shock emanating from a compression corner and interacting with a fully developed turbulent boundary layer are evaluated herein. Mission relevant conditions at Mach 7 and Mach 14 are defined for a pre-compression ramp of a scramjet powered vehicle. Three compression angles are defined, the smallest to avoid separation losses and the largest to force a separated flow engaging more complicated flow physics. The Baldwin-Lomax and the Cebeci-Smith algebraic models, the one-equation Spalart-Allmaras model with the Catris-Aupoix compressibility modification and two-equation models including Menter SST, Wilcox k-omega 98, and Wilcox k-omega 06 turbulence models are evaluated. Each model is fully defined herein to preclude any ambiguity regarding model implementation. Comparisons are made to existing experimental data and Van Driest theory to provide preliminary assessment of model form uncertainty. A set of coarse-grained uncertainty metrics are defined to capture essential differences among turbulence models. Except for the inability of algebraic models to converge for some separated flows, there is no clearly superior model as judged by these metrics. A preliminary metric for the numerical component of uncertainty in shock-turbulent-boundary-layer interactions at compression corners sufficiently steep to cause separation is defined as 55%. This value is a median of differences with experimental data averaged for peak pressure and heating and for extent of separation captured in new, grid-converged solutions presented here. This value is consistent with existing results in a literature review of hypersonic shock-turbulent-boundary-layer interactions by Roy and Blottner and with more recent computations of MacLean.

  20. Use of Inverse-Modeling Methods to Improve Ground-Water-Model Calibration and Evaluate Model-Prediction Uncertainty, Camp Edwards, Cape Cod, Massachusetts

    USGS Publications Warehouse

    Walter, Donald A.; LeBlanc, Denis R.

    2008-01-01

    Historical weapons testing and disposal activities at Camp Edwards, which is located on the Massachusetts Military Reservation, western Cape Cod, have resulted in the release of contaminants into an underlying sand and gravel aquifer that is the sole source of potable water to surrounding communities. Ground-water models have been used at the site to simulate advective transport in the aquifer in support of field investigations. Reasonable models developed by different groups and calibrated by trial and error often yield different predictions of advective transport, and the predictions lack quantitative measures of uncertainty. A recently (2004) developed regional model of western Cape Cod, modified to include the sensitivity and parameter-estimation capabilities of MODFLOW-2000, was used in this report to evaluate the utility of inverse (statistical) methods to (1) improve model calibration and (2) assess model-prediction uncertainty. Simulated heads and flows were most sensitive to recharge and to the horizontal hydraulic conductivity of the Buzzards Bay and Sandwich Moraines and the Buzzards Bay and northern parts of the Mashpee outwash plains. Conversely, simulated heads and flows were much less sensitive to vertical hydraulic conductivity. Parameter estimation (inverse calibration) improved the match to observed heads and flows; the absolute mean residual for heads improved by 0.32 feet and the absolute mean residual for streamflows improved by about 0.2 cubic feet per second. Advective-transport predictions in Camp Edwards generally were most sensitive to the parameters with the highest precision (lowest coefficients of variation), indicating that the numerical model is adequate for evaluating prediction uncertainties in and around Camp Edwards. 
The incorporation of an advective-transport observation, representing the leading edge of a contaminant plume that had been difficult to match by using trial-and-error calibration, improved the match between an observed and simulated plume path; however, a modified representation of local geology was needed to simultaneously maintain a reasonable calibration to heads and flows and to the plume path. Advective-transport uncertainties were expressed as about 68-, 95-, and 99-percent confidence intervals on three-dimensional simulated particle positions. The confidence intervals can be graphically represented as ellipses around individual particle positions in the X-Y (geographic) plane and in the X-Z or Y-Z (vertical) planes. The merging of individual ellipses allows uncertainties on forward particle tracks to be displayed in map or cross-sectional view as a cone of uncertainty around a simulated particle path; uncertainties on reverse particle-track endpoints - representing simulated recharge locations - can be geographically displayed as areas at the water table around the discrete particle endpoints. This information gives decision-makers insight into the level of confidence they can have in particle-tracking results and can assist them in the efficient use of available field resources.

  1. Re-evaluation of the sorption behaviour of Bromide and Sulfamethazine under field conditions using leaching data and modelling methods

    NASA Astrophysics Data System (ADS)

    Gassmann, Matthias; Olsson, Oliver; Höper, Heinrich; Hamscher, Gerd; Kümmerer, Klaus

    2016-04-01

    The simulation of reactive transport in the aquatic environment is hampered by the ambiguity of environmental fate process conceptualizations for a specific substance in the literature. Concepts are usually identified by experimental studies and inverse modelling under controlled lab conditions in order to reduce environmental uncertainties such as uncertain boundary conditions and input data. However, since environmental conditions affect substance behaviour, a re-evaluation might be necessary under environmental conditions, which might, in turn, be affected by uncertainties. Using a combination of experimental data and simulations of the leaching behaviour of the veterinary antibiotic Sulfamethazine (SMZ; synonym: sulfadimidine) and the hydrological tracer Bromide (Br) in a field lysimeter, we re-evaluated the sorption concepts of both substances under uncertain field conditions. Data from a field lysimeter experiment, in which both substances were applied twice a year with manure and sampled at the bottom of two lysimeters during three subsequent years, were used for model set-up and evaluation. The total amounts of leached SMZ and Br were 22 μg and 129 mg, respectively. A reactive transport model was parameterized to the conditions of the two lysimeters filled with monoliths (depth 2 m, area 1 m²) of a sandy soil with a low pH value, under which Bromide is sorptive. We used different sorption concepts, such as constant and organic-carbon-dependent sorption coefficients and instantaneous and kinetic sorption equilibrium. Combining the sorption concepts resulted in four scenarios per substance with different equations for sorption equilibrium and sorption kinetics. The GLUE (Generalized Likelihood Uncertainty Estimation) method was applied to each scenario using parameter ranges found in experimental and modelling studies. 
The parameter spaces for each scenario were sampled using a Latin Hypercube method, which was refined around local model efficiency maxima. Results of the cumulative SMZ leaching simulations suggest that the best conceptualization is instantaneous sorption to organic carbon, which is consistent with the literature. The best Nash-Sutcliffe efficiency (Neff) was 0.96, and the 5th and 95th percentiles of the uncertainty estimation were 18 and 27 μg. In contrast, both scenarios of kinetic Br sorption had similar results (Neff = 0.99, uncertainty bounds 110-176 mg and 112-176 mg) but were clearly better than the instantaneous sorption scenarios. Therefore, only the concept of sorption kinetics could be identified for Br modelling, whereas both tested sorption equilibrium coefficient concepts performed equally well. The reasons for this specific case of equifinality may be uncertainties of model input data under field conditions or an insensitivity of the sorption equilibrium method due to the relatively low adsorption of Br. Our results show that it may be possible to identify, or at least falsify, specific sorption concepts under uncertain field conditions using a long-term leaching experiment and modelling methods. Cases of environmental fate concept equifinality raise the possibility of future model structure uncertainty analysis using an ensemble of models with different environmental fate concepts.
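
    A schematic of the GLUE procedure used here, with an invented one-parameter stand-in for the reactive transport model (the toy model, the behavioural threshold and the parameter range are illustrative assumptions, not those of the study):

```python
import math
import random
import statistics

def nse(obs, sim):
    """Nash-Sutcliffe efficiency."""
    mean_obs = statistics.mean(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def toy_leaching_model(k, t):
    """Invented one-parameter stand-in for the reactive transport model:
    cumulative leaching rises as a first-order response governed by k."""
    return 100.0 * (1.0 - math.exp(-k * t))

times = list(range(1, 11))
obs = [toy_leaching_model(0.3, t) for t in times]  # synthetic "observations"

# GLUE: stratified (Latin-hypercube-style) sampling of the parameter range,
# retention of "behavioural" parameter sets above an efficiency threshold,
# and percentile bounds computed from the retained predictions.
rng = random.Random(7)
n = 200
samples = [(i + rng.random()) / n for i in range(n)]  # k stratified over (0, 1)
behavioural = [(k, nse(obs, [toy_leaching_model(k, t) for t in times]))
               for k in samples]
behavioural = [(k, e) for k, e in behavioural if e > 0.9]
preds = sorted(toy_leaching_model(k, 10) for k, _ in behavioural)
p5 = preds[int(0.05 * len(preds))]
p95 = preds[int(0.95 * len(preds))]
print("behavioural sets:", len(behavioural))
print("5th-95th percentile bounds at t=10:", round(p5, 1), "-", round(p95, 1))
```

    The 5th and 95th percentiles of the behavioural predictions play the same role as the uncertainty bounds reported in the abstract; concept identification then amounts to comparing the best efficiencies and bounds across the competing sorption scenarios.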

  2. Scheme for the selection of measurement uncertainty models in blood establishments' screening immunoassays.

    PubMed

    Pereira, Paulo; Westgard, James O; Encarnação, Pedro; Seghatchian, Jerard; de Sousa, Gracinda

    2015-02-01

    Blood establishments routinely perform screening immunoassays to assess the safety of blood components. As with any other screening test, the results have an inherent uncertainty. In blood establishments the major concern is the chance of false negatives, due to their possible impact on patients' health. This article briefly reviews GUM and diagnostic accuracy models for screening immunoassays, recommending a scheme to support screening laboratory staff in the selection of a model considering the intended use of the screening results (i.e., post-transfusion safety). The discussion is grounded in "risk-based thinking", with risk considered from blood donor selection through to the screening immunoassays. A combination of GUM and diagnostic accuracy models to evaluate measurement uncertainty in blood establishments is recommended. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Achieving Robustness to Uncertainty for Financial Decision-making

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnum, George M.; Van Buren, Kendra L.; Hemez, Francois M.

    2014-01-10

    This report investigates the concept of robustness analysis to support financial decision-making. Financial models, which forecast future stock returns or market conditions, depend on assumptions that might be unwarranted and variables that might exhibit large fluctuations from their last-known values. The analysis of robustness explores these sources of uncertainty, and recommends model settings such that the forecasts used for decision-making are as insensitive as possible to the uncertainty. A proof-of-concept is presented with the Capital Asset Pricing Model. The robustness of model predictions is assessed using info-gap decision theory. Info-gaps are models of uncertainty that express the “distance,” or gap of information, between what is known and what needs to be known in order to support the decision. The analysis yields a description of worst-case stock returns as a function of increasing gaps in our knowledge. The analyst can then decide on the best course of action by trading off worst-case performance with “risk”, which is how much uncertainty they think needs to be accommodated in the future. The report also discusses the Graphical User Interface, developed using the MATLAB® programming environment, such that the user can control the analysis through an easy-to-navigate interface. Three directions of future work are identified to enhance the present software. First, the code should be re-written using the Python scientific programming software. This change will achieve greater cross-platform compatibility, better portability, allow for a more professional appearance, and render it independent from a commercial license, which MATLAB® requires. Second, a capability should be developed to allow users to quickly implement and analyze their own models. This will facilitate application of the software to the evaluation of proprietary financial models. The third enhancement proposed is to add the ability to evaluate multiple models simultaneously. 
When two models reflect past data with similar accuracy, the more robust of the two is preferable for decision-making because its predictions are, by definition, less sensitive to the uncertainty.

  4. Empirically evaluating decision-analytic models.

    PubMed

    Goldhaber-Fiebert, Jeremy D; Stout, Natasha K; Goldie, Sue J

    2010-08-01

    Model-based cost-effectiveness analyses support decision-making. To augment model credibility, evaluation via comparison to independent, empirical studies is recommended. We developed a structured reporting format for model evaluation and conducted a structured literature review to characterize current model evaluation recommendations and practices. As an illustration, we applied the reporting format to evaluate a microsimulation of human papillomavirus and cervical cancer. The model's outputs and uncertainty ranges were compared with multiple outcomes from a study of long-term progression from high-grade precancer (cervical intraepithelial neoplasia [CIN]) to cancer. Outcomes included 5- to 30-year cumulative cancer risk among women with and without appropriate CIN treatment. Consistency was measured by model ranges overlapping study confidence intervals. The structured reporting format included: matching baseline characteristics and follow-up, reporting model and study uncertainty, and stating metrics of consistency for model and study results. Structured searches yielded 2963 articles, with 67 meeting inclusion criteria, and found variation in how current model evaluations are reported. Evaluation of the cervical cancer microsimulation, reported using the proposed format, showed a modeled cumulative risk of invasive cancer for inadequately treated women of 39.6% (30.9-49.7) at 30 years, compared with the study: 37.5% (28.4-48.3). For appropriately treated women, modeled risks were 1.0% (0.7-1.3) at 30 years, study: 1.5% (0.4-3.3). To support external and projective validity, cost-effectiveness models should be iteratively evaluated as new studies become available, with reporting standardized to facilitate assessment. Such evaluations are particularly relevant for models used to conduct comparative effectiveness analyses.

  5. How much is new information worth? Evaluating the financial benefit of resolving management uncertainty

    USGS Publications Warehouse

    Maxwell, Sean L.; Rhodes, Jonathan R.; Runge, Michael C.; Possingham, Hugh P.; Ng, Chooi Fei; McDonald Madden, Eve

    2015-01-01

    Conservation decision-makers face a trade-off between spending limited funds on direct management action, or gaining new information in an attempt to improve management performance in the future. Value-of-information analysis can help to resolve this trade-off by evaluating how much management performance could improve if new information was gained. Value-of-information analysis has been used extensively in other disciplines, but there are only a few examples where it has informed conservation planning, none of which have used it to evaluate the financial value of gaining new information. We address this gap by applying value-of-information analysis to the management of a declining koala Phascolarctos cinereus population. Decision-makers responsible for managing this population face uncertainty about survival and fecundity rates, and how habitat cover affects mortality threats. The value of gaining new information about these uncertainties was calculated using a deterministic matrix model of the koala population to find the expected population growth rate if koala mortality threats were optimally managed under alternative model hypotheses, which represented the uncertainties faced by koala managers. Gaining new information about survival and fecundity rates and the effect of habitat cover on mortality threats will do little to improve koala management. Across a range of management budgets, no more than 1·7% of the budget should be spent on resolving these uncertainties. The value of information was low because optimal management decisions were not sensitive to the uncertainties we considered. Decisions were instead driven by a substantial difference in the cost efficiency of management actions. The value of information was up to forty times higher when the cost efficiencies of different koala management actions were similar. Synthesis and applications. 
This study evaluates the ecological and financial benefits of gaining new information to inform a conservation problem. We also theoretically demonstrate that the value of reducing uncertainty is highest when it is not clear which management action is the most cost efficient. This study will help expand the use of value-of-information analyses in conservation by providing a cost efficiency metric by which to evaluate research or monitoring.
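
    The core value-of-information calculation can be shown in a few lines: compare the best expected outcome when acting under the prior with the expected outcome when the true hypothesis is learned before acting. The hypotheses, actions and payoff numbers below are invented for illustration, not the koala study's values:

```python
# Expected value of perfect information (EVPI) for a two-action,
# two-hypothesis management problem.
p = [0.5, 0.5]              # prior weight on each model hypothesis
# outcome[action][hypothesis]: e.g. expected population growth rate
outcome = [[1.02, 0.95],    # action A
           [0.97, 1.01]]    # action B

# Without new information: pick the action with the best expected outcome.
expected = [sum(p[h] * outcome[a][h] for h in range(2)) for a in range(2)]
best_now = max(expected)

# With perfect information: learn the true hypothesis first, then act.
best_informed = sum(p[h] * max(outcome[a][h] for a in range(2)) for h in range(2))

evpi = best_informed - best_now
print(round(best_now, 3), round(best_informed, 3), round(evpi, 4))
```

    EVPI is zero whenever one action is best under every hypothesis, which is precisely the study's finding that information has little value when decisions are insensitive to the uncertainties considered.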

  6. Bayesian Assessment of the Uncertainties of Estimates of a Conceptual Rainfall-Runoff Model Parameters

    NASA Astrophysics Data System (ADS)

    Silva, F. E. O. E.; Naghettini, M. D. C.; Fernandes, W.

    2014-12-01

    This paper evaluated the uncertainties associated with the estimation of the parameters of a conceptual rainfall-runoff model, through the use of Bayesian inference techniques by Monte Carlo simulation. The Pará River sub-basin, located in the upper São Francisco river basin in southeastern Brazil, was selected as the study area. In this paper, we used the Rio Grande conceptual hydrologic model (EHR/UFMG, 2001) and the Markov Chain Monte Carlo simulation method named DREAM (VRUGT, 2008a). Two probabilistic models for the residuals were analyzed: (i) the classic Normal likelihood, r ~ N(0, σ²); and (ii) a generalized likelihood (SCHOUPS & VRUGT, 2010), in which it is assumed that the differences between observed and simulated flows are correlated, non-stationary, and distributed as a Skew Exponential Power density. The assumptions made for both models were checked to ensure that the estimation of uncertainties in the parameters was not biased. The results showed that the Bayesian approach was adequate to the proposed objectives, and reinforced the importance of assessing the uncertainties associated with hydrological modeling.
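
    A minimal random-walk Metropolis sketch of Bayesian parameter inference under the classic Normal residual model; the one-parameter 'rainfall-runoff' model and all numbers are invented stand-ins (the study itself used the Rio Grande model and the DREAM sampler):

```python
import math
import random

def log_likelihood(theta, obs, forcing, sigma=1.0):
    """Classic Normal residual model: r ~ N(0, sigma^2), with an invented
    one-parameter model q = theta * rain standing in for the conceptual
    hydrologic model."""
    return sum(-0.5 * ((q - theta * r) / sigma) ** 2 for q, r in zip(obs, forcing))

rng = random.Random(3)
rain = [rng.uniform(0, 10) for _ in range(200)]
obs = [0.6 * r + rng.gauss(0, 1.0) for r in rain]  # synthetic data, true theta = 0.6

# Random-walk Metropolis sampler (a minimal stand-in for DREAM).
theta = 0.1
ll = log_likelihood(theta, obs, rain)
chain = []
for _ in range(5000):
    prop = theta + rng.gauss(0, 0.05)
    ll_prop = log_likelihood(prop, obs, rain)
    if rng.random() < math.exp(min(0.0, ll_prop - ll)):
        theta, ll = prop, ll_prop  # accept the proposal
    chain.append(theta)

post = chain[1000:]  # discard burn-in
mean = sum(post) / len(post)
print("posterior mean of theta:", round(mean, 2))
```

    The spread of the retained chain quantifies the parameter uncertainty; residual-assumption checks of the kind described in the abstract would then be run on the simulated-minus-observed flows at the posterior parameter values.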

  7. Appropriate Domain Size for Groundwater Flow Modeling with a Discrete Fracture Network Model.

    PubMed

    Ji, Sung-Hoon; Koh, Yong-Kwon

    2017-01-01

    When a discrete fracture network (DFN) is constructed from statistical conceptualization, uncertainty in simulating the hydraulic characteristics of a fracture network can arise due to the domain size. In this study, the appropriate domain size, where less significant uncertainty in the stochastic DFN model is expected, was suggested for the Korea Atomic Energy Research Institute Underground Research Tunnel (KURT) site. The stochastic DFN model for the site was established, and the appropriate domain size was determined with the density of the percolating cluster and the percolation probability using the stochastically generated DFNs for various domain sizes. The applicability of the appropriate domain size to our study site was evaluated by comparing the statistical properties of stochastically generated fractures of varying domain sizes and estimating the uncertainty in the equivalent permeability of the generated DFNs. Our results show that the uncertainty of the stochastic DFN model is acceptable when the modeling domain is larger than the determined appropriate domain size, and the appropriate domain size concept is applicable to our study site. © 2016, National Ground Water Association.
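
    The percolation-probability criterion behind the domain-size choice can be illustrated with a lattice analogue (site percolation on a square grid rather than an actual DFN; the open-site probabilities and grid size below are arbitrary): as the domain grows, the spanning probability settles toward 1 above the percolation threshold and toward 0 below it, and the domain is "large enough" once this statistic stabilizes.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(8)

def spans(grid):
    """True if open sites connect the left edge to the right edge (4-neighbour)."""
    n = grid.shape[0]
    seen = np.zeros_like(grid, dtype=bool)
    q = deque((i, 0) for i in range(n) if grid[i, 0])
    for i, _ in q:
        seen[i, 0] = True
    while q:
        i, j = q.popleft()
        if j == n - 1:
            return True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < n and 0 <= b < n and grid[a, b] and not seen[a, b]:
                seen[a, b] = True
                q.append((a, b))
    return False

def percolation_probability(p, size, trials=200):
    """Fraction of random grids (open-site probability p) with a spanning cluster."""
    return np.mean([spans(rng.random((size, size)) < p) for _ in range(trials)])

dense = percolation_probability(0.75, 24)   # above the ~0.593 site threshold
sparse = percolation_probability(0.40, 24)  # below the threshold
```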

  8. Propagation of hydro-meteorological uncertainty in a model cascade framework to inundation prediction

    NASA Astrophysics Data System (ADS)

    Rodríguez-Rincón, J. P.; Pedrozo-Acuña, A.; Breña-Naranjo, J. A.

    2015-07-01

    This investigation aims to study the propagation of meteorological uncertainty within a cascade modelling approach to flood prediction. The methodology comprised a numerical weather prediction (NWP) model, a distributed rainfall-runoff model and a 2-D hydrodynamic model. The uncertainty evaluation was carried out at the meteorological and hydrological levels of the model chain, which enabled the investigation of how errors that originated in the rainfall prediction interact at a catchment level and propagate to an estimated inundation area and depth. For this, a hindcast scenario is utilised, removing non-behavioural ensemble members at each stage based on the fit with observed data. At the hydrodynamic level, an uncertainty assessment was not incorporated; instead, the model was set up following guidelines for the best possible representation of the case study. The selected extreme event corresponds to a flood that took place in the southeast of Mexico during November 2009, for which field data (e.g. rain gauge and discharge records) and satellite imagery were available. Uncertainty in the meteorological model was estimated by means of a multi-physics ensemble technique, which is designed to represent errors from our limited knowledge of the processes generating precipitation. In the hydrological model, a multi-response validation was implemented through the definition of six sets of plausible parameters from past flood events. Precipitation fields from the meteorological model were employed as input to a distributed hydrological model, and the resulting flood hydrographs were used as forcing conditions in the 2-D hydrodynamic model. The evolution of skill within the model cascade shows a complex aggregation of errors between models, suggesting that in valley-filling events hydro-meteorological uncertainty has a larger effect on inundation depths than on estimated flood inundation extents.

  9. Explicitly integrating parameter, input, and structure uncertainties into Bayesian Neural Networks for probabilistic hydrologic forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xuesong; Liang, Faming; Yu, Beibei

    2011-11-09

    Estimating the uncertainty of hydrologic forecasting is valuable to water resources and other relevant decision-making processes. Recently, Bayesian Neural Networks (BNNs) have proven to be powerful tools for quantifying the uncertainty of streamflow forecasting. In this study, we propose a Markov Chain Monte Carlo (MCMC) framework to incorporate the uncertainties associated with input, model structure, and parameters into BNNs. This framework allows the structure of the neural networks to change by removing or adding connections between neurons and enables scaling of input data by using rainfall multipliers. The results show that the new BNNs outperform BNNs that consider only the uncertainties associated with parameters and model structure. Critical evaluation of the posterior distributions of neural network weights, the number of effective connections, rainfall multipliers, and hyper-parameters shows that the assumptions held in our BNNs are not well supported. Further understanding of the characteristics of different uncertainty sources and inclusion of output error in the MCMC framework are expected to enhance the application of neural networks for uncertainty analysis of hydrologic forecasting.

  10. Stream Discharge and Evapotranspiration Responses to Climate Change and Their Associated Uncertainties in a Large Semi-Arid Basin

    NASA Astrophysics Data System (ADS)

    Bassam, S.; Ren, J.

    2017-12-01

    Predicting future water availability in watersheds is very important for proper water resources management, especially in semi-arid regions with scarce water resources. Hydrological models have been considered powerful tools for predicting future hydrological conditions in watershed systems over the past two decades. Streamflow and evapotranspiration are the two important components in watershed water balance estimation, as the former is the most commonly used indicator of the overall water budget and the latter is the second biggest component of the water budget (the biggest outflow from the system). One of the main concerns in watershed-scale hydrological modeling is the uncertainty associated with model prediction, which can arise from errors in model parameters and input meteorological data, or from errors in the model representation of the physics of hydrological processes. Understanding and quantifying these uncertainties is vital for water resources managers to make proper decisions based on model predictions. In this study, we evaluated the impacts of different climate change scenarios on future stream discharge and evapotranspiration, and their associated uncertainties, throughout a large semi-arid basin using a stochastically-calibrated, physically-based, semi-distributed hydrological model. The results of this study could provide valuable insights into applying hydrological models to large-scale watersheds, understanding the associated sensitivity and uncertainties in model parameters, and estimating the corresponding impacts on hydrological process variables of interest under different climate change scenarios.

  11. How well does your model capture the terrestrial ecosystem dynamics of the Arctic-Boreal Region?

    NASA Astrophysics Data System (ADS)

    Stofferahn, E.; Fisher, J. B.; Hayes, D. J.; Huntzinger, D. N.; Schwalm, C.

    2016-12-01

    The Arctic-Boreal Region (ABR) is a major source of uncertainties for terrestrial biosphere model (TBM) simulations. These uncertainties stem from a lack of observational data from the region, which affects the parameterizations of cold-environment processes in the models. Addressing these uncertainties requires a coordinated effort of data collection and integration of the following key indicators of the ABR ecosystem: disturbance, flora / fauna and related ecosystem function, carbon pools and biogeochemistry, permafrost, and hydrology. We are developing a model-data integration framework for NASA's Arctic Boreal Vulnerability Experiment (ABoVE), wherein data collection is driven by matching observations and model outputs to the key ABoVE indicators. The data are used as reference datasets for a benchmarking system which evaluates TBM performance with respect to ABR processes. The benchmarking system utilizes performance metrics to identify intra-model and inter-model strengths and weaknesses, which in turn provides guidance to model development teams for reducing uncertainties in TBM simulations of the ABR. The system is directly connected to the International Land Model Benchmarking (ILAMB) system, as an ABR-focused application.

  12. Probabilistic and deterministic evaluation of uncertainty in a local scale multi-risk analysis

    NASA Astrophysics Data System (ADS)

    Lari, S.; Frattini, P.; Crosta, G. B.

    2009-04-01

    We performed a probabilistic multi-risk analysis (QPRA) at the local scale for a 420 km² area surrounding the town of Brescia (Northern Italy). We calculated the expected annual loss in terms of economic damage and loss of life, for a set of risk scenarios of flood, earthquake and industrial accident with different occurrence probabilities and different intensities. The territorial unit used for the study was the census parcel, of variable area, for which a large amount of data was available. Due to the lack of information for the evaluation of the hazards, for the value of the exposed elements (e.g., residential and industrial areas, population, lifelines, sensitive elements such as schools and hospitals) and for the process-specific vulnerability, and due to a lack of knowledge of the processes (floods, industrial accidents, earthquakes), we assigned an uncertainty to the input variables of the analysis. For some variables a homogeneous uncertainty was assigned over the whole study area, for instance for the number of buildings of various typologies and for the event occurrence probability. In other cases, as for phenomenon intensity (e.g., depth of water during a flood) and probability of impact, the uncertainty was defined in relation to the census parcel area. In fact, by assuming some variables to be homogeneously distributed or averaged over the census parcels, we introduce a larger error for larger parcels. We propagated the uncertainty through the analysis using three different models, describing the reliability of the output (risk) as a function of the uncertainty of the inputs (scenarios and vulnerability functions). We developed a probabilistic approach based on Monte Carlo simulation, and two deterministic models, namely First Order Second Moment (FOSM) and Point Estimate (PE). In general, similar values of expected losses are obtained with the three models. The uncertainty of the final risk value is in all three cases around 30% of the expected value. Each of the models, nevertheless, requires different assumptions and computational efforts, and provides results with different levels of detail.
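
    The agreement between the probabilistic and deterministic propagation methods can be sketched on a toy expected-loss model (invented numbers, independent normal inputs; FOSM here is the first-order Taylor expansion around the input means):

```python
import numpy as np

rng = np.random.default_rng(0)

# Annual expected loss as a product of uncertain inputs:
# loss = p (event probability) * v (vulnerability) * E (exposed value).
mu = np.array([0.01, 0.4, 2.0e6])     # means of p, v, E
sd = np.array([0.002, 0.08, 4.0e5])   # standard deviations (independent)

def loss(x):
    return x[0] * x[1] * x[2]

# Monte Carlo: propagate input uncertainty by sampling.
samples = rng.normal(mu, sd, size=(100_000, 3))
mc = loss(samples.T)
mc_mean, mc_sd = mc.mean(), mc.std()

# FOSM: mean at the input means, variance from the gradient of the loss.
grad = np.array([mu[1] * mu[2], mu[0] * mu[2], mu[0] * mu[1]])
fosm_mean = loss(mu)
fosm_sd = np.sqrt(np.sum((grad * sd) ** 2))
```

Both methods put the relative uncertainty of the loss near 35% here; FOSM gets there with three derivative terms instead of 100,000 samples, mirroring the trade-off between effort and detail noted above.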

  13. Evaluation and hydrological application of satellite-based precipitation datasets in driving hydrological models over the Huifa river basin in Northeast China

    NASA Astrophysics Data System (ADS)

    Zhu, Honglei; Li, Ying; Huang, Yanwei; Li, Yingchen; Hou, Cuicui; Shi, Xiaoliang

    2018-07-01

    Satellite-based precipitation estimates with high spatial and temporal resolution and large areal coverage have in recent years provided hydrologists with a potential alternative data source for hydrological applications, especially for ungauged regions. This study evaluates five satellite-based precipitation datasets, namely Fengyun, TRMM 3B42, TRMM 3B42RT, CMORPH_BLD and CMORPH_RAW, against gauge observations for streamflow simulation with a distributed hydrological model (SWAT) over the Huifa river basin, Northeast China. Comparison of the statistical indices (MA, M5P, STDE, ME, BIAS and CC) and of inter-annual precipitation demonstrates that Fengyun, TRMM 3B42 and CMORPH_BLD show better agreement with gauge precipitation data than CMORPH_RAW and TRMM 3B42RT. When the SWAT model is calibrated and validated individually for each dataset, satisfactory model performance (defined as NS > 0.5) is achieved at the daily scale for the Fengyun-, TRMM 3B42- and gauge-driven models, and very good performance (defined as NS > 0.75) is achieved at the monthly scale for the Fengyun- and gauge-driven models. The CMORPH_BLD-forced simulations also yield higher values of NS and R2 than CMORPH_RAW and TRMM 3B42RT at both daily and monthly steps. From the uncertainty results, variations of P-factor values and frequency distribution curves of NS suggest that simulation uncertainty increases when the Fengyun-, 3B42RT-, CMORPH_BLD- and CMORPH_RAW-driven models are operated with the parameters best fitted to the rain-gauge SWAT model. The results also indicate that the influence of parameter uncertainty on model simulation results may be greater than the effect of input data accuracy. It is noted that uncertainty analysis is necessary to evaluate the hydrological applications of satellite-based precipitation datasets.
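
    The "satisfactory" (NS > 0.5) and "very good" (NS > 0.75) thresholds refer to the Nash-Sutcliffe efficiency, which is straightforward to compute; a minimal sketch on synthetic flows (all values invented):

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations.
    1 = perfect fit; 0 = no better than the observed mean; < 0 = worse."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

rng = np.random.default_rng(2)
obs = 10 + 5 * np.sin(np.linspace(0, 12, 365)) + rng.normal(0, 1, 365)
good = obs + rng.normal(0, 1.5, 365)   # a "satisfactory" simulation
poor = np.full(365, obs.mean())        # mean-flow baseline scores exactly 0
```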

  14. Orbital Debris Shape and Orientation Effects on Ballistic Limits

    NASA Technical Reports Server (NTRS)

    Evans, Steven W.; Williamsen, Joel E.

    2005-01-01

    The SPHC hydrodynamic code was used to evaluate the effects of orbital debris particle shape and orientation on penetration of a typical spacecraft dual-wall shield. Impacts were simulated at near-normal obliquity at 12 km/sec. Debris cloud characteristics and damage potential are compared with those from impacts by spherical projectiles. Results of these simulations indicate the uncertainties in the predicted ballistic limits due to modeling uncertainty and to uncertainty in the impactor orientation.

  15. Assessment and visualization of uncertainty for countrywide soil organic matter map of Hungary using local entropy

    NASA Astrophysics Data System (ADS)

    Szatmári, Gábor; Pásztor, László

    2016-04-01

    Uncertainty is a general term expressing our imperfect knowledge in describing an environmental process, of which we are aware (Bárdossy and Fodor, 2004). Sampling, laboratory measurements, models and so on are all subject to uncertainty. Effective quantification and visualization of uncertainty is indispensable to stakeholders (e.g. policy makers, society). Soil-related features and their spatial models should be a primary target of uncertainty assessment because their inferences are used in further modelling and decision-making processes. The aim of our present study was to assess and effectively visualize the local uncertainty of the countrywide soil organic matter (SOM) spatial distribution model of Hungary using geostatistical tools and concepts. The Hungarian Soil Information and Monitoring System's SOM data (approximately 1,200 observations) and environment-related, spatially exhaustive secondary information (i.e. digital elevation model, climatic maps, MODIS satellite images and geological map) were used to model the countrywide SOM spatial distribution by regression kriging. It would be common to use the calculated estimation (or kriging) variance as a measure of uncertainty; however, the normality and homoscedasticity hypotheses had to be rejected according to our preliminary analysis of the data. Therefore, a normal score transformation and a sequential stochastic simulation approach were introduced to model and assess the local uncertainty. Five hundred equally probable realizations (i.e. stochastic images) were generated. This number of stochastic images is sufficient to provide a model of uncertainty at each location, which is a complete description of uncertainty in geostatistics (Deutsch and Journel, 1998). Furthermore, these models can be applied, for example, to contour the probability of any event; such maps can be regarded as goal-oriented digital soil maps and are of interest for agricultural management and decision making as well.
A standardized measure of the local entropy was used to visualize uncertainty, where entropy values close to 1 correspond to high uncertainty, whilst values close to 0 correspond to low uncertainty. The advantage of using local entropy in this context is that it combines probabilities from multiple members into a single number for each location of the model. In conclusion, it is straightforward to use a sequential stochastic simulation approach for the assessment of uncertainty when normality and homoscedasticity are violated. The visualization of uncertainty using the local entropy is effective and communicative to stakeholders because it represents the uncertainty through a single number on a [0, 1] scale. References: Bárdossy, Gy. & Fodor, J., 2004. Evaluation of Uncertainties and Risks in Geology. Springer-Verlag, Berlin Heidelberg. Deutsch, C.V. & Journel, A.G., 1998. GSLIB: Geostatistical Software Library and User's Guide. Oxford University Press, New York. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).
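
    The standardized local entropy is easy to compute from the stack of realizations; a minimal sketch on synthetic classed realizations (realization count, grid size and class count are invented stand-ins, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for 500 equally probable realizations of a classed
# SOM map (3 classes, 1000 cells).
n_real, n_cells, n_classes = 500, 1000, 3
realizations = rng.integers(0, n_classes, size=(n_real, n_cells))
realizations[:, : n_cells // 2] = 0   # first half of the map: no uncertainty

def local_entropy(real, n_classes):
    """Standardized local Shannon entropy per cell, scaled to [0, 1]."""
    h = np.zeros(real.shape[1])
    for k in range(n_classes):
        p = (real == k).mean(axis=0)   # class-k probability at each cell
        nz = p > 0
        h[nz] -= p[nz] * np.log(p[nz])
    return h / np.log(n_classes)       # 0 = certain, 1 = maximally uncertain

h = local_entropy(realizations, n_classes)
```

Cells where all realizations agree score 0; cells where the class frequencies are near-uniform score close to 1, which is exactly the single-number-per-location property noted above.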

  16. Streamflow hindcasting in European river basins via multi-parametric ensemble of the mesoscale hydrologic model (mHM)

    NASA Astrophysics Data System (ADS)

    Noh, Seong Jin; Rakovec, Oldrich; Kumar, Rohini; Samaniego, Luis

    2016-04-01

    There have been tremendous improvements in distributed hydrologic modeling (DHM), which have made process-based simulation with high spatiotemporal resolution applicable on large spatial scales. Despite increasing information on the heterogeneous properties of a catchment, DHM is still subject to uncertainties inherent in model structure, parameters and input forcing. Sequential data assimilation (DA) may facilitate improved streamflow prediction via DHM by using real-time observations to correct internal model states. In conventional DA methods such as state updating, however, parametric uncertainty is often ignored, mainly due to practical limitations of methodology to specify modeling uncertainty with limited ensemble members. If parametric uncertainty related to routing and runoff components is not incorporated properly, the predictive uncertainty of DHM may be insufficient to capture the dynamics of observations, which may deteriorate predictability. Recently, a multi-scale parameter regionalization (MPR) method was proposed to make hydrologic predictions at different scales using the same set of model parameters without losing much of the model performance. The MPR method incorporated within the mesoscale hydrologic model (mHM, http://www.ufz.de/mhm) can effectively represent and control the uncertainty of high-dimensional parameters in a distributed model using global parameters. In this study, we present a global multi-parametric ensemble approach to incorporate the parametric uncertainty of DHM into DA to improve streamflow predictions. To effectively represent and control the uncertainty of high-dimensional parameters with a limited number of ensemble members, the MPR method is incorporated into DA. Lagged particle filtering is utilized to account for the response times and non-Gaussian characteristics of internal hydrologic processes.
The hindcasting experiments are implemented to evaluate the impacts of the proposed DA method on streamflow predictions in multiple European river basins with different climate and catchment characteristics. Because augmentation of parameters is not required within an assimilation window, the approach remains stable with limited ensemble members and is viable for practical use.

  17. Quantile regression reveals hidden bias and uncertainty in habitat models

    Treesearch

    Brian S. Cade; Barry R. Noon; Curtis H. Flather

    2005-01-01

    We simulated the effects of missing information on statistical distributions of animal response that covaried with measured predictors of habitat to evaluate the utility and performance of quantile regression for providing more useful intervals of uncertainty in habitat relationships. These procedures were evaluated for conditions in which heterogeneity and hidden bias...
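
    The idea can be sketched as follows (not the authors' simulation design: synthetic heteroscedastic data, and a simple subgradient-descent fit on the pinball loss standing in for standard quantile-regression solvers):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic habitat data: the spread of the animal response grows with
# habitat cover x, so a single mean-regression line hides the structure.
n = 2000
x = rng.uniform(0.0, 1.0, n)
y = 2.0 + 3.0 * x + rng.normal(0.0, 0.2 + 1.5 * x, n)

def quantile_fit(x, y, tau, lr=0.05, n_iter=20000):
    """Linear quantile regression via subgradient descent on the pinball loss."""
    b0, b1 = float(y.mean()), 0.0
    for t in range(n_iter):
        r = y - (b0 + b1 * x)
        g = np.where(r > 0, -tau, 1.0 - tau)   # subgradient w.r.t. prediction
        step = lr / (1.0 + 1e-3 * t)           # decaying step size
        b0 -= step * g.mean()
        b1 -= step * (g * x).mean()
    return b0, b1

b0_lo, b1_lo = quantile_fit(x, y, 0.10)   # 10th-percentile line
b0_hi, b1_hi = quantile_fit(x, y, 0.90)   # 90th-percentile line
```

The interval between the two fitted lines widens with x, revealing the heterogeneity that ordinary least squares would average away.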

  18. Uncertainty in nutrient loads from tile drained landscapes: Effect of sampling frequency, calculation algorithm, and compositing strategies

    USDA-ARS?s Scientific Manuscript database

    Accurate estimates of annual nutrient loads are required to evaluate trends in water quality following changes in land use or management and to calibrate and validate water quality models. While much emphasis has been placed on understanding the uncertainty of watershed-scale nutrient load estimates...

  19. Estimating annual bole biomass production using uncertainty analysis

    Treesearch

    Travis J. Woolley; Mark E. Harmon; Kari B. O'Connell

    2007-01-01

    Two common sampling methodologies coupled with a simple statistical model were evaluated to determine the accuracy and precision of annual bole biomass production (BBP) and inter-annual variability estimates using this type of approach. We performed an uncertainty analysis using Monte Carlo methods in conjunction with radial growth core data from trees in three Douglas...

  20. Monte Carlo simulation for uncertainty estimation on structural data in implicit 3-D geological modeling, a guide for disturbance distribution selection and parameterization

    NASA Astrophysics Data System (ADS)

    Pakyuz-Charrier, Evren; Lindsay, Mark; Ogarko, Vitaliy; Giraud, Jeremie; Jessell, Mark

    2018-04-01

    Three-dimensional (3-D) geological structural modeling aims to determine geological information in a 3-D space using structural data (foliations and interfaces) and topological rules as inputs. This is necessary in any project in which the properties of the subsurface matter; such models express our understanding of geometries in depth. For that reason, 3-D geological models have a wide range of practical applications including but not restricted to civil engineering, the oil and gas industry, the mining industry, and water management. These models, however, are fraught with uncertainties originating from the inherent flaws of the modeling engines (working hypotheses, interpolator's parameterization) and the inherent lack of knowledge in areas where there are no observations, combined with input uncertainty (observational, conceptual and technical errors). Because 3-D geological models are often used for impactful decision-making, it is critical that all 3-D geological models provide accurate estimates of uncertainty. This paper focuses on the propagation of structural input data measurement uncertainty in implicit 3-D geological modeling. This aim is achieved using Monte Carlo simulation for uncertainty estimation (MCUE), a stochastic method which samples from predefined disturbance probability distributions that represent the uncertainty of the original input data set. MCUE is used to produce hundreds to thousands of altered unique data sets. The altered data sets are used as inputs to produce a range of plausible 3-D models. The plausible models are then combined into a single probabilistic model as a means to propagate uncertainty from the input data to the final model. In this paper, several improved methods for MCUE are proposed. The methods pertain to distribution selection for input uncertainty, sample analysis and statistical consistency of the sampled distribution.
Pole vector sampling is proposed as a more rigorous alternative to dip vector sampling for planar features, and the use of a Bayesian approach to disturbance distribution parameterization is suggested. The influence of incorrect disturbance distributions is discussed, and propositions are made and evaluated on synthetic and realistic cases to address the identified issues. The distribution of the errors of the observed data (i.e., scedasticity) is shown to affect the quality of prior distributions for MCUE. Results demonstrate that the proposed workflows improve the reliability of uncertainty estimation and diminish the occurrence of artifacts.
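
    Pole-vector sampling with a von Mises-Fisher (vMF) disturbance distribution can be sketched as follows. This is a standard 3-D vMF sampler (inversion of the cosine distribution), not the authors' implementation, and the mean pole and concentration κ are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_vmf(mean_dir, kappa, n):
    """Draw n unit vectors from a von Mises-Fisher distribution on the sphere,
    a common disturbance distribution for perturbing structural pole vectors."""
    mu = np.asarray(mean_dir, float)
    mu /= np.linalg.norm(mu)
    # Sample w, the cosine of the angle to the mean direction (3-D vMF).
    xi = rng.uniform(size=n)
    w = 1.0 + np.log(xi + (1.0 - xi) * np.exp(-2.0 * kappa)) / kappa
    # Uniform angle in the plane orthogonal to mu.
    phi = rng.uniform(0.0, 2.0 * np.pi, n)
    # Orthonormal basis (mu, e1, e2).
    helper = np.array([1.0, 0.0, 0.0]) if abs(mu[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    e1 = np.cross(mu, helper)
    e1 /= np.linalg.norm(e1)
    e2 = np.cross(mu, e1)
    s = np.sqrt(1.0 - w ** 2)
    return (w[:, None] * mu
            + (s * np.cos(phi))[:, None] * e1
            + (s * np.sin(phi))[:, None] * e2)

# Perturb the pole of a horizontal plane (pole pointing up) 1000 times;
# each perturbed pole defines one altered orientation for an MCUE data set.
poles = sample_vmf([0.0, 0.0, 1.0], kappa=100.0, n=1000)
```

Sampling the pole directly keeps every perturbed measurement a valid unit orientation, which is the practical argument for pole-vector over dip-vector sampling.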

  1. Impact of intermediate and high energy nuclear data on the neutronic safety parameters of MYRRHA accelerator driven system

    NASA Astrophysics Data System (ADS)

    Stankovskiy, Alexey; Çelik, Yurdunaz; Eynde, Gert Van den

    2017-09-01

    Perturbation of the external neutron source can cause significant local power changes that translate into undesired safety-related events in an accelerator driven system. Therefore, for the accurate design of the MYRRHA sub-critical core it is important to evaluate the uncertainty of power responses caused by the uncertainties in the nuclear reaction models describing particle transport from the primary proton energy down to the evaluated nuclear data table range. Calculations with a set of models resulted in quite low uncertainty on the local power caused by significant perturbation of the primary neutron yield from proton interactions with lead and bismuth isotopes. The considered accidental event of a prescribed proton beam shape loss causes a drastic increase in local power but practically does not change the total core thermal power, making this effect difficult to detect. At the same time, the results demonstrate a correlation between perturbed local power responses in normal operation and misaligned beam conditions, indicating that generation of covariance data for proton- and neutron-induced neutron multiplicities for lead and bismuth isotopes is needed to obtain reliable uncertainties for local power responses.

  2. Uncertain programming models for portfolio selection with uncertain returns

    NASA Astrophysics Data System (ADS)

    Zhang, Bo; Peng, Jin; Li, Shengguo

    2015-10-01

    In an indeterminate economic environment, experts' knowledge about the returns of securities involves much uncertainty rather than randomness. This paper discusses the portfolio selection problem in an uncertain environment in which security returns cannot be well reflected by historical data but can be evaluated by experts. In the paper, returns of securities are assumed to be given by uncertain variables. According to various decision criteria, the portfolio selection problem in an uncertain environment is formulated as an expected-variance-chance model and a chance-expected-variance model using uncertain programming. Within the framework of uncertainty theory, for the convenience of solving the models, some crisp equivalents are discussed under different conditions. In addition, a hybrid intelligent algorithm is designed to provide a general method for solving the new models in general cases. Finally, two numerical examples are provided to show the performance and applications of the models and algorithm.

  3. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, J.; Polly, B.; Collis, J.

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define 'explicit' input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.

  5. Reactive transport modeling at uranium in situ recovery sites: uncertainties in uranium sorption on iron hydroxides

    USGS Publications Warehouse

    Johnson, Raymond H.; Tutu, Hlanganani; Brown, Adrian; Figueroa, Linda; Wolkersdorfer, Christian

    2013-01-01

    Geochemical changes that can occur down gradient from uranium in situ recovery (ISR) sites are important for various stakeholders to understand when evaluating potential effects on surrounding groundwater quality. If down gradient solid-phase material consists of sandstone with iron hydroxide coatings (no pyrite or organic carbon), sorption of uranium on iron hydroxides can control uranium mobility. Using one-dimensional reactive transport models with PHREEQC, two different geochemical databases, and various geochemical parameters, the uncertainties in uranium sorption on iron hydroxides are evaluated, because these oxidized zones create a greater risk for future uranium transport than fully reduced zones where uranium generally precipitates.
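
    How sorption uncertainty propagates into predicted uranium mobility can be sketched with a simple Kd-based retardation model. This is a deliberate simplification of the surface-complexation approach used with PHREEQC, and every number below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(9)

# Advective transport with linear sorption: R = 1 + (rho_b / theta) * Kd.
rho_b, theta, v = 1.6, 0.30, 0.1   # bulk density (kg/L), porosity, velocity (m/day)

# Uncertain distribution coefficient Kd (L/kg), lognormal with median 1.
kd = rng.lognormal(mean=0.0, sigma=0.7, size=100_000)

R = 1.0 + (rho_b / theta) * kd     # retardation factor per Monte Carlo draw
t_100m = 100.0 * R / v             # days for uranium to travel 100 m

p5, p50, p95 = np.percentile(t_100m, [5, 50, 95])
```

Even this crude sketch gives a 90% interval on travel time spanning several-fold, illustrating why uncertainty in sorption parameters dominates the down-gradient predictions the abstract describes.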

  6. A Bayesian model averaging method for the derivation of reservoir operating rules

    NASA Astrophysics Data System (ADS)

    Zhang, Jingwen; Liu, Pan; Wang, Hao; Lei, Xiaohui; Zhou, Yanlai

    2015-09-01

    Because the intrinsic dynamics among optimal decision making, inflow processes and reservoir characteristics are complex, the functional forms of reservoir operating rules are usually determined subjectively. As a result, the uncertainty involved in selecting the form and/or model of reservoir operating rules must be analyzed and evaluated. In this study, we analyze the uncertainty of reservoir operating rules using the Bayesian model averaging (BMA) model. Three popular operating rules, namely piecewise linear regression, surface fitting and a least-squares support vector machine, are established based on the optimal deterministic reservoir operation. These individual models provide three-member decisions for the BMA combination, enabling the 90% release interval to be estimated by Markov Chain Monte Carlo simulation. A case study of China's Baise reservoir shows that: (1) the optimal deterministic reservoir operation, which is superior to any operating rule, provides the samples from which the rules are derived; (2) the least-squares support vector machine model is more effective than both piecewise linear regression and surface fitting; (3) BMA outperforms any individual operating-rule model based on the optimal trajectories. It is revealed that the proposed model can reduce the uncertainty of operating rules, which is of great potential benefit in evaluating the confidence interval of decisions.
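
    The BMA combination step can be sketched with synthetic data. The three members below are invented stand-ins for the piecewise-linear, surface-fitting and LS-SVM rules, and the weights come from simple Gaussian likelihoods rather than the paper's full MCMC treatment:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic "optimal release" decisions versus reservoir storage.
storage = rng.uniform(0.0, 1.0, 200)
optimal = 0.2 + 0.6 * storage ** 2 + rng.normal(0.0, 0.03, 200)

# Three candidate operating rules of differing quality.
preds = np.stack([
    0.2 + 0.6 * storage ** 2 + rng.normal(0.0, 0.03, 200),  # well-specified rule
    0.1 + 0.7 * storage + rng.normal(0.0, 0.05, 200),       # biased linear rule
    np.full(200, 0.5) + rng.normal(0.0, 0.05, 200),         # poor constant rule
])

# BMA weights from Gaussian log-likelihoods of each member's residuals.
resid = preds - optimal
sigma = resid.std(axis=1, keepdims=True)
loglik = -0.5 * np.sum((resid / sigma) ** 2, axis=1) - 200.0 * np.log(sigma[:, 0])
w = np.exp(loglik - loglik.max())
w /= w.sum()

bma = w @ preds   # weighted-average BMA point prediction

def rmse(p):
    return float(np.sqrt(np.mean((p - optimal) ** 2)))
```

The best member captures nearly all the weight, so the BMA prediction is never worse than the weaker rules, which is the practical appeal of the combination.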

  7. Impact of in-Sewer Degradation of Pharmaceutical and Personal Care Products (PPCPs) Population Markers on a Population Model.

    PubMed

    O'Brien, Jake William; Banks, Andrew Phillip William; Novic, Andrew Joseph; Mueller, Jochen F; Jiang, Guangming; Ort, Christoph; Eaglesham, Geoff; Yuan, Zhiguo; Thai, Phong K

    2017-04-04

A key uncertainty of wastewater-based epidemiology is the size of the population that contributed to a given wastewater sample. We previously developed and validated a Bayesian inference model to estimate population size based on 14 population markers which: (1) are easily measured and (2) have mass loads that correlate with population size. However, the potential uncertainty of the model prediction due to in-sewer degradation of these markers was not evaluated. In this study, we addressed this gap by testing their stability under sewer conditions and assessing whether degradation impacts the model estimates. Five markers, which formed the core of our model, were stable in the sewers while the others were not. Our evaluation showed that the presence of unstable population markers in the model did not decrease the precision of the population estimates, provided that stable markers such as acesulfame remained in the model. However, to achieve the minimum uncertainty in population estimates, we propose that the core markers to be included in population models for other sites should meet two additional criteria: (3) negligible degradation in wastewater, to ensure the stability of the chemicals during sample collection; and (4) less than 10% in-sewer degradation during the mean residence time of the sewer network.

  8. Geologic uncertainty in a regulatory environment: An example from the potential Yucca Mountain nuclear waste repository site

    NASA Astrophysics Data System (ADS)

    Rautman, C. A.; Treadway, A. H.

    1991-11-01

    Regulatory geologists are concerned with predicting the performance of sites proposed for waste disposal or for remediation of existing pollution problems. Geologic modeling of these sites requires large-scale expansion of knowledge obtained from very limited sampling. This expansion induces considerable uncertainty into the geologic models of rock properties that are required for modeling the predicted performance of the site. One method for assessing this uncertainty is through nonparametric geostatistical simulation. Simulation can produce a series of equiprobable models of a rock property of interest. Each model honors measured values at sampled locations, and each can be constructed to emulate both the univariate histogram and the spatial covariance structure of the measured data. Computing a performance model for a number of geologic simulations allows evaluation of the effects of geologic uncertainty. A site may be judged acceptable if the number of failures to meet a particular performance criterion produced by these computations is sufficiently low. A site that produces too many failures may be either unacceptable or simply inadequately described. The simulation approach to addressing geologic uncertainty is being applied to the potential high-level nuclear waste repository site at Yucca Mountain, Nevada, U.S.A. Preliminary geologic models of unsaturated permeability have been created that reproduce observed statistical properties reasonably well. A spread of unsaturated groundwater travel times has been computed that reflects the variability of those geologic models. Regions within the simulated models exhibiting the greatest variability among multiple runs are candidates for obtaining the greatest reduction in uncertainty through additional site characterization.

  9. Evaluation of incremental reactivity and its uncertainty in Southern California.

    PubMed

    Martien, Philip T; Harley, Robert A; Milford, Jana B; Russell, Armistead G

    2003-04-15

    The incremental reactivity (IR) and relative incremental reactivity (RIR) of carbon monoxide and 30 individual volatile organic compounds (VOC) were estimated for the South Coast Air Basin using two photochemical air quality models: a 3-D, grid-based model and a vertically resolved trajectory model. Both models include an extended version of the SAPRC99 chemical mechanism. For the 3-D modeling, the decoupled direct method (DDM-3D) was used to assess reactivities. The trajectory model was applied to estimate uncertainties in reactivities due to uncertainties in chemical rate parameters, deposition parameters, and emission rates using Monte Carlo analysis with Latin hypercube sampling. For most VOC, RIRs were found to be consistent in rankings with those produced by Carter using a box model. However, 3-D simulations show that coastal regions, upwind of most of the emissions, have comparatively low IR but higher RIR than predicted by box models for C4-C5 alkenes and carbonyls that initiate the production of HOx radicals. Biogenic VOC emissions were found to have a lower RIR than predicted by box model estimates, because emissions of these VOC were mostly downwind of the areas of primary ozone production. Uncertainties in RIR of individual VOC were found to be dominated by uncertainties in the rate parameters of their primary oxidation reactions. The coefficient of variation (COV) of most RIR values ranged from 20% to 30%, whereas the COV of absolute incremental reactivity ranged from about 30% to 40%. In general, uncertainty and variability both decreased when relative rather than absolute reactivity metrics were used.
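The Monte Carlo analysis with Latin hypercube sampling used in this record can be sketched as follows; the nominal rate constants and the lognormal uncertainty factor are hypothetical placeholders, not SAPRC99 values:

```python
import numpy as np
from scipy.stats import lognorm

def latin_hypercube(n_samples, n_params, rng):
    """One draw per stratum and parameter: each column is a random
    permutation of the strata, jittered uniformly within each stratum."""
    strata = np.array([rng.permutation(n_samples) for _ in range(n_params)]).T
    return (strata + rng.random((n_samples, n_params))) / n_samples

rng = np.random.default_rng(1)
n = 50
u = latin_hypercube(n, 3, rng)

# Map the stratified uniforms to lognormal rate-parameter multipliers;
# a 1-sigma uncertainty factor of 1.3 is an illustrative choice.
factors = lognorm.ppf(u, s=np.log(1.3))

# Each row scales the nominal rate constants for one trajectory-model run
k_nominal = np.array([2.0e4, 9.6e3, 2.6e4])   # hypothetical rate constants
k_samples = k_nominal * factors
```

Stratification guarantees that even 50 runs cover the tails of each parameter's distribution, which plain random sampling does not, so the resulting spread of reactivities converges with far fewer model evaluations.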

  10. Associating uncertainty with datasets using Linked Data and allowing propagation via provenance chains

    NASA Astrophysics Data System (ADS)

    Car, Nicholas; Cox, Simon; Fitch, Peter

    2015-04-01

With earth-science datasets increasingly being published to enable re-use in projects disassociated from the original data acquisition or generation, there is an urgent need for associated metadata to be connected, in order to guide their application. In particular, provenance traces should support the evaluation of data quality and reliability. However, while standards for describing provenance are emerging (e.g. PROV-O), these do not include the necessary statistical descriptors and confidence assessments. UncertML has a mature conceptual model that may be used to record uncertainty metadata. However, by itself UncertML does not support the representation of uncertainty for multi-part datasets, and provides no direct way of associating the uncertainty information - metadata in relation to a dataset - with dataset objects. We present a method to address both these issues by combining UncertML with PROV-O, and delivering the resulting uncertainty-enriched provenance traces through the Linked Data API. UncertProv extends the PROV-O provenance ontology with an RDF formulation of the UncertML conceptual model elements, adds further elements to support the representation of uncertainty without a conceptual model, and supports the integration of UncertML through links to documents. The Linked Data API provides a systematic way of navigating from dataset objects to their UncertProv metadata and back again, and its 'views' capability enables access to UncertML and non-UncertML uncertainty metadata representations for a dataset. With this approach, it is possible to access and navigate the uncertainty metadata associated with a published dataset using standard semantic web tools, such as SPARQL queries. Where the uncertainty data follow the UncertML model they can be automatically interpreted and may also support automatic uncertainty propagation.
Repositories wishing to enable uncertainty propagation for all datasets must ensure that all elements associated with uncertainty (PROV-O Entity and Activity classes) have UncertML elements recorded. This methodology is intentionally flexible, allowing uncertainty metadata in many forms, not limited to UncertML. While a more formal representation of uncertainty metadata is desirable (using UncertProv elements to implement the UncertML conceptual model), this will not always be possible, and any uncertainty data stored will be better than none. Since the UncertProv ontology contains a superset of UncertML elements to facilitate the representation of non-UncertML uncertainty data, it could easily be extended to include other formal uncertainty conceptual models, thus allowing non-UncertML propagation calculations.

  11. Computational Fluid Dynamics Best Practice Guidelines in the Analysis of Storage Dry Cask

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zigh, A.; Solis, J.

    2008-07-01

Computational fluid dynamics (CFD) methods are used to evaluate the thermal performance of a dry cask under long term storage conditions in accordance with NUREG-1536 [NUREG-1536, 1997]. A three-dimensional CFD model was developed and validated using data for a ventilated storage cask (VSC-17) collected by Idaho National Laboratory (INL). The developed Fluent CFD model was validated to minimize the modeling and application uncertainties. To address modeling uncertainties, the paper focused on turbulence modeling of buoyancy driven air flow. Similarly, in the application uncertainties, the pressure boundary conditions used to model the air inlet and outlet vents were investigated and validated. Different turbulence models were used to reduce the modeling uncertainty in the CFD simulation of the air flow through the annular gap between the overpack and the multi-assembly sealed basket (MSB). Among the chosen turbulence models, the validation showed that the low Reynolds k-{epsilon} and the transitional k-{omega} turbulence models predicted the measured temperatures closely. To assess the impact of pressure boundary conditions used at the air inlet and outlet channels on the application uncertainties, a sensitivity analysis of operating density was undertaken. For convergence purposes, all available commercial CFD codes include the operating density in the pressure gradient term of the momentum equation. The validation showed that the correct operating density corresponds to the density evaluated at the air inlet condition of pressure and temperature. Next, the validated CFD method was used to predict the thermal performance of an existing dry cask storage system. The evaluation uses two distinct models: a three-dimensional and an axisymmetrical representation of the cask. In the 3-D model, porous media was used to model only the volume occupied by the rodded region that is surrounded by the BWR channel box.
In the axisymmetric model, porous media was used to model the entire region that encompasses the fuel assemblies as well as the gaps in between. Consequently, a larger volume is represented by porous media in the second model; hence, a higher frictional flow resistance is introduced in the momentum equations. The conservatism and the safety margins of these models were compared to assess the applicability and the realism of these two models. The three-dimensional model included fewer geometry simplifications and is recommended as it predicted less conservative fuel cladding temperature values, while still assuring the existence of adequate safety margins. (authors)

  12. Uncertainties propagation and global sensitivity analysis of the frequency response function of piezoelectric energy harvesters

    NASA Astrophysics Data System (ADS)

    Ruiz, Rafael O.; Meruane, Viviana

    2017-06-01

The goal of this work is to describe a framework to propagate uncertainties in piezoelectric energy harvesters (PEHs). These uncertainties are related to incomplete knowledge of the model parameters. The framework presented could be employed to conduct prior robust stochastic predictions. The prior analysis assumes a known probability density function for the uncertain variables and propagates the uncertainties to the output voltage. The framework is particularized to evaluate the behavior of the frequency response functions (FRFs) in PEHs, while its implementation is illustrated using different unimorph and bimorph PEHs subjected to different scenarios: free of uncertainties, common uncertainties, and uncertainties arising from imperfect clamping. The common variability associated with the PEH parameters is tabulated and reported. A global sensitivity analysis is conducted to identify the Sobol indices. Results indicate that the elastic modulus, density, and thickness of the piezoelectric layer are the parameters most relevant to the output variability. The importance of including the model parameter uncertainties in the estimation of the FRFs is revealed. In this sense, the present framework constitutes a powerful tool for the robust design and prediction of PEH performance.
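A minimal sketch of first-order Sobol index estimation, using a pick-freeze estimator (one common choice; the record does not specify the estimator) on an invented surrogate that stands in for the actual FRF computation. The functional form and parameter ranges below are illustrative assumptions only:

```python
import numpy as np

def sobol_first_order(f, d, n, rng):
    """Pick-freeze estimator of first-order Sobol indices on [0,1]^d."""
    A = rng.random((n, d))
    B = rng.random((n, d))
    yA = f(A)
    var = yA.var()
    S = np.empty(d)
    for i in range(d):
        BA = B.copy()
        BA[:, i] = A[:, i]          # the two runs share only input i
        yBA = f(BA)
        # Cov(f(A), f(BA_i)) estimates V_i = Var(E[Y | X_i])
        S[i] = (np.mean(yA * yBA) - yA.mean() * yBA.mean()) / var
    return S

def frf_peak_proxy(X):
    # Invented surrogate for an FRF peak quantity: elastic modulus E,
    # density rho, piezo-layer thickness t, with illustrative ranges.
    E   = 60e9 + 10e9 * X[:, 0]             # Pa
    rho = 7500.0 + 500.0 * X[:, 1]          # kg/m^3
    t   = (100.0 + 100.0 * X[:, 2]) * 1e-6  # m
    return E * t**3 / rho                   # stiffness-to-mass proxy

rng = np.random.default_rng(2)
S = sobol_first_order(frf_peak_proxy, 3, 40_000, rng)
# Thickness enters cubed, so its index dominates in this toy model
```

In this toy the thickness index comes out near 0.99, mirroring the record's finding that layer thickness (together with modulus and density) controls most of the output variability.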

  13. Hierarchical Bayesian Model Averaging for Non-Uniqueness and Uncertainty Analysis of Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Fijani, E.; Chitsazan, N.; Nadiri, A.; Tsai, F. T.; Asghari Moghaddam, A.

    2012-12-01

    Artificial Neural Networks (ANNs) have been widely used to estimate concentration of chemicals in groundwater systems. However, estimation uncertainty is rarely discussed in the literature. Uncertainty in ANN output stems from three sources: ANN inputs, ANN parameters (weights and biases), and ANN structures. Uncertainty in ANN inputs may come from input data selection and/or input data error. ANN parameters are naturally uncertain because they are maximum-likelihood estimated. ANN structure is also uncertain because there is no unique ANN model given a specific case. Therefore, multiple plausible AI models are generally resulted for a study. One might ask why good models have to be ignored in favor of the best model in traditional estimation. What is the ANN estimation variance? How do the variances from different ANN models accumulate to the total estimation variance? To answer these questions we propose a Hierarchical Bayesian Model Averaging (HBMA) framework. Instead of choosing one ANN model (the best ANN model) for estimation, HBMA averages outputs of all plausible ANN models. The model weights are based on the evidence of data. Therefore, the HBMA avoids overconfidence on the single best ANN model. In addition, HBMA is able to analyze uncertainty propagation through aggregation of ANN models in a hierarchy framework. This method is applied for estimation of fluoride concentration in the Poldasht plain and the Bazargan plain in Iran. Unusually high fluoride concentration in the Poldasht and Bazargan plains has caused negative effects on the public health. Management of this anomaly requires estimation of fluoride concentration distribution in the area. The results show that the HBMA provides a knowledge-decision-based framework that facilitates analyzing and quantifying ANN estimation uncertainties from different sources. 
In addition HBMA allows comparative evaluation of the realizations for each source of uncertainty by segregating the uncertainty sources in a hierarchical framework. Fluoride concentration estimation using the HBMA method shows better agreement to the observation data in the test step because they are not based on a single model with a non-dominate weights.

  14. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    PubMed

    Fischer, Andreas

    2016-11-01

Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are all based on either a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows one to achieve lower uncertainties or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for the Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity raised to the power of 3/2 and an inverse proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potential regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality with either Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.
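The stated scaling of the fundamental limit can be written compactly. The notation below is assumed (sigma for the minimum achievable velocity uncertainty, P_s for the scattered light power), and the omitted proportionality constant depends on the optical configuration:

```latex
\sigma_{v,\min} \;\propto\; \frac{|v|^{3/2}}{\sqrt{P_{\mathrm{s}}}}
```

The same relation holds for both the Doppler and the time-of-flight limits, which is the precise sense in which the two measurement principles are claimed to be equivalent.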

  15. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways of performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  16. An Approach to Experimental Design for the Computer Analysis of Complex Phenomenon

    NASA Technical Reports Server (NTRS)

    Rutherford, Brian

    2000-01-01

The ability to make credible system assessments, predictions and design decisions related to engineered systems and other complex phenomena is key to a successful program for many large-scale investigations in government and industry. Recently, many of these large-scale analyses have turned to computational simulation to provide much of the required information. Addressing specific goals in the computer analysis of these complex phenomena is often accomplished through the use of performance measures that are based on system response models. The response models are constructed using computer-generated responses together with physical test results where possible. They are often based on probabilistically defined inputs and generally require estimation of a set of response modeling parameters. As a consequence, the performance measures are themselves distributed quantities reflecting these variabilities and uncertainties. Uncertainty in the values of the performance measures leads to uncertainty in predicted performance and can cloud the decisions required of the analysis. A specific goal of this research has been to develop methodology that will reduce this uncertainty in an analysis environment where limited resources and system complexity together restrict the number of simulations that can be performed. An approach has been developed that is based on evaluating the potential information provided by each "intelligently selected" candidate set of computer runs. Each candidate is evaluated by partitioning the performance measure uncertainty into two components: one that could be explained through the additional computational simulation runs and a second that would remain uncertain. The portion explained is estimated using a probabilistic evaluation of likely results for the additional computational analyses based on what is currently known about the system.
The set of runs indicating the largest potential reduction in uncertainty is then selected and the computational simulations are performed. Examples are provided to demonstrate this approach on small-scale problems; they give encouraging results. Directions for further research are indicated.
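The partitioning step rests on the law of total variance, Var(Y) = Var(E[Y|X_c]) + E[Var(Y|X_c)]: the first term is the portion a candidate set of runs could explain. The toy performance measure below is invented, and quantile-binning is a crude stand-in for the paper's probabilistic evaluation:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 100_000

# Toy performance measure Y = g(X1, X2): X1 stands for what a candidate
# set of simulation runs would resolve, X2 for what stays uncertain.
x1 = rng.normal(0.0, 1.0, n)
x2 = rng.normal(0.0, 0.5, n)
y = 2.0 * x1 + x2 + 0.3 * x1 * x2

var_total = y.var()

# Estimate the explainable term Var(E[Y | X1]) by quantile-binning X1
edges = np.quantile(x1, np.linspace(0.0, 1.0, 51))
which = np.clip(np.digitize(x1, edges) - 1, 0, 49)
counts = np.bincount(which, minlength=50)
cond_means = np.bincount(which, weights=y, minlength=50) / counts
var_explained = np.average((cond_means - y.mean()) ** 2, weights=counts)

frac = var_explained / var_total   # expected fractional uncertainty reduction
```

Computing `frac` for each candidate set of runs and selecting the largest value is the selection rule the abstract describes; here most of the variance is attributable to X1, so resolving it would shrink the performance-measure uncertainty substantially.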

  17. Towards national-scale greenhouse gas emissions evaluation with robust uncertainty estimates

    NASA Astrophysics Data System (ADS)

    Rigby, Matthew; Swallow, Ben; Lunt, Mark; Manning, Alistair; Ganesan, Anita; Stavert, Ann; Stanley, Kieran; O'Doherty, Simon

    2016-04-01

    Through the Deriving Emissions related to Climate Change (DECC) network and the Greenhouse gAs Uk and Global Emissions (GAUGE) programme, the UK's greenhouse gases are now monitored by instruments mounted on telecommunications towers and churches, on a ferry that performs regular transects of the North Sea, on-board a research aircraft and from space. When combined with information from high-resolution chemical transport models such as the Met Office Numerical Atmospheric dispersion Modelling Environment (NAME), these measurements are allowing us to evaluate emissions more accurately than has previously been possible. However, it has long been appreciated that current methods for quantifying fluxes using atmospheric data suffer from uncertainties, primarily relating to the chemical transport model, that have been largely ignored to date. Here, we use novel model reduction techniques for quantifying the influence of a set of potential systematic model errors on the outcome of a national-scale inversion. This new technique has been incorporated into a hierarchical Bayesian framework, which can be shown to reduce the influence of subjective choices on the outcome of inverse modelling studies. Using estimates of the UK's methane emissions derived from DECC and GAUGE tall-tower measurements as a case study, we will show that such model systematic errors have the potential to significantly increase the uncertainty on national-scale emissions estimates. Therefore, we conclude that these factors must be incorporated in national emissions evaluation efforts, if they are to be credible.

  18. Invited Article: Concepts and tools for the evaluation of measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Iyer, Hari K.

    2017-01-01

    Measurements involve comparisons of measured values with reference values traceable to measurement standards and are made to support decision-making. While the conventional definition of measurement focuses on quantitative properties (including ordinal properties), we adopt a broader view and entertain the possibility of regarding qualitative properties also as legitimate targets for measurement. A measurement result comprises the following: (i) a value that has been assigned to a property based on information derived from an experiment or computation, possibly also including information derived from other sources, and (ii) a characterization of the margin of doubt that remains about the true value of the property after taking that information into account. Measurement uncertainty is this margin of doubt, and it can be characterized by a probability distribution on the set of possible values of the property of interest. Mathematical or statistical models enable the quantification of measurement uncertainty and underlie the varied collection of methods available for uncertainty evaluation. Some of these methods have been in use for over a century (for example, as introduced by Gauss for the combination of mutually inconsistent observations or for the propagation of "errors"), while others are of fairly recent vintage (for example, Monte Carlo methods including those that involve Markov Chain Monte Carlo sampling). This contribution reviews the concepts, models, methods, and computations that are commonly used for the evaluation of measurement uncertainty, and illustrates their application in realistic examples drawn from multiple areas of science and technology, aiming to serve as a general, widely accessible reference.
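Monte Carlo evaluation of measurement uncertainty, one of the methods this review covers, can be sketched in a few lines. The measurement model R = V/I and the input estimates and standard uncertainties below are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Illustrative measurement model R = V / I, with both inputs
# taken as Gaussian about their estimates
V = rng.normal(5.000, 0.002, n)     # volts
I = rng.normal(0.1000, 0.0002, n)   # amperes
R = V / I                           # ohms

r_hat = R.mean()                          # value assigned to R
u_r = R.std(ddof=1)                       # standard uncertainty
lo, hi = np.percentile(R, [2.5, 97.5])    # 95% coverage interval
```

Propagating whole distributions rather than first-order sensitivities sidesteps the linearization step of classical "propagation of errors" and yields a coverage interval directly from the output sample.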

  19. Evaluation of regional climate simulations for air quality modelling purposes

    NASA Astrophysics Data System (ADS)

    Menut, Laurent; Tripathi, Om P.; Colette, Augustin; Vautard, Robert; Flaounas, Emmanouil; Bessagnet, Bertrand

    2013-05-01

In order to evaluate the future potential benefits of emission regulation on regional air quality, while taking into account the effects of climate change, off-line air quality projection simulations are driven using weather forcing taken from regional climate models. These regional models are themselves driven by simulations carried out using global climate models (GCM) and economic scenarios. Uncertainties and biases in climate models introduce an additional "climate modeling" source of uncertainty that adds to all other types of uncertainty in air quality modeling for policy evaluation. In this article we evaluate the changes in air quality-related weather variables induced by replacing reanalysis-forced regional climate simulations with GCM-forced ones. As an example we use GCM simulations carried out in the framework of the ERA-Interim programme and of the CMIP5 project using the Institut Pierre-Simon Laplace climate model (IPSLcm), driving regional simulations performed in the framework of the EURO-CORDEX programme. In summer, we found compensating deficiencies acting on photochemistry: an overestimation by GCM-driven weather due to a positive bias in short-wave radiation, a negative bias in wind speed, too many stagnant episodes, and a negative temperature bias. In winter, air quality is mostly driven by dispersion, and we could not identify significant differences in either wind or planetary boundary layer height statistics between GCM-driven and reanalysis-driven regional simulations. However, precipitation appears largely overestimated in GCM-driven simulations, which could significantly affect the simulation of aerosol concentrations. The identification of these biases will help in interpreting the results of future air quality simulations using these data. Despite these biases, we conclude that the identified differences should not lead to major difficulties in using GCM-driven regional climate simulations for air quality projections.

  20. Reservoir Performance Under Future Climate For Basins With Different Hydrologic Sensitivities

    NASA Astrophysics Data System (ADS)

    Mateus, M. C.; Tullos, D. D.

    2013-12-01

In addition to long-standing uncertainties related to variable inflows and the market price of power, reservoir operators face a number of new uncertainties related to hydrologic nonstationarity, changing environmental regulations, and rapidly growing water and energy demands. This study investigates the impact, sensitivity, and uncertainty of changing hydrology on hydrosystem performance across different hydrogeologic settings. We evaluate the performance of reservoirs in the Santiam River basin, including a case study in the North Santiam Basin, with high permeability and extensive groundwater storage, and the South Santiam Basin, with low permeability, little groundwater storage and rapid runoff response. The modeling objective is to address the following study questions: (1) for the two hydrologic regimes, how does the flood management, water supply, and environmental performance of current reservoir operations change under future 2.5, 50 and 97.5 percentile streamflow projections; and (2) how much change in inflow is required to initiate a failure to meet downstream minimum or maximum flows in the future. We couple global climate model results with a rainfall-runoff model and a formal Bayesian uncertainty analysis to simulate future inflow hydrographs as inputs to a reservoir operations model. To evaluate reservoir performance under a changing climate, we calculate reservoir refill reliability, changes in flood frequency, and the time and volumetric reliability of meeting minimum spring and summer flow targets. Reservoir performance under future hydrology appears to vary with hydrogeology. We find higher sensitivity to floods for the North Santiam Basin and higher sensitivity to minimum flow targets for the South Santiam Basin. Higher uncertainty is associated with basins that have more complex hydrogeology. Results from the model simulations contribute to understanding of the reliability and vulnerability of reservoirs under a changing climate.
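The time and volumetric reliability metrics mentioned in this record can be sketched as follows; the release series and the flow target are invented numbers, not Santiam basin data:

```python
import numpy as np

# Invented weekly releases against a minimum flow target (m^3/s)
release = np.array([12.0, 9.5, 11.2, 8.0, 10.5, 13.1, 9.9, 10.0])
target = 10.0

# Time reliability: fraction of periods in which the target is met
time_rel = np.mean(release >= target)

# Volumetric reliability: delivered / required volume, with deliveries
# capped at the target so surpluses do not offset deficits
vol_rel = np.minimum(release, target).sum() / (target * release.size)
```

The two metrics answer different questions: time reliability counts failures regardless of size, while volumetric reliability weights each shortfall by its magnitude, so a basin with frequent but shallow deficits scores low on the first and high on the second.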

  1. Probabilistic flood inundation mapping at ungauged streams due to roughness coefficient uncertainty in hydraulic modelling

    NASA Astrophysics Data System (ADS)

    Papaioannou, George; Vasiliades, Lampros; Loukas, Athanasios; Aronica, Giuseppe T.

    2017-04-01

Probabilistic flood inundation mapping is performed and analysed at the ungauged Xerias stream reach, Volos, Greece. The study evaluates the uncertainty introduced by the roughness coefficient values of hydraulic models in flood inundation modelling and mapping. The well-established one-dimensional (1-D) hydraulic model HEC-RAS is selected and linked to Monte Carlo simulations of hydraulic roughness. Terrestrial Laser Scanner data have been used to produce a high-quality DEM that minimises input data uncertainty and improves the accuracy of the stream channel topography required by the hydraulic model. Initial Manning's n roughness coefficient values are based on pebble-count field surveys and empirical formulas. Various theoretical probability distributions are fitted and evaluated on their accuracy in representing the estimated roughness values. Finally, Latin hypercube sampling has been used to generate different sets of Manning roughness values, and flood inundation probability maps have been created with the use of Monte Carlo simulations. Historical flood extent data from an extreme flash flood event are used for validation of the method. The calibration process is based on binary wet-dry reasoning with the Median Absolute Percentage Error evaluation metric. The results show that the proposed procedure supports probabilistic flood hazard mapping at ungauged rivers and provides water resources managers with valuable information for planning and implementing flood risk mitigation strategies.
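The probabilistic mapping loop can be sketched with a trivial stand-in for the HEC-RAS run; the lognormal roughness parameters and the stage-versus-ground model below are invented for illustration only:

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(7)
n_sims = 500

# Stratified (Latin-hypercube-style) draws of Manning's n from a lognormal
# fitted to field estimates; median 0.040 and shape 0.25 are invented.
strata = (rng.permutation(n_sims) + rng.random(n_sims)) / n_sims
n_manning = lognorm.ppf(strata, s=0.25, scale=0.040)

def wet_cells(n_value, n_cells=20):
    """Trivial stand-in for a hydraulic-model run: higher roughness raises
    the stage, inundating more of a cross-section with rising ground."""
    stage = 1.5 * (n_value / 0.040) ** 0.6
    ground = np.linspace(0.5, 2.5, n_cells)
    return stage > ground

wet = np.array([wet_cells(nv) for nv in n_manning])
p_wet = wet.mean(axis=0)   # probabilistic inundation map (per cell)
```

Averaging the binary wet-dry maps over the roughness ensemble turns a single deterministic flood outline into a per-cell inundation probability, which is the quantity the study's hazard maps convey.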

  2. Accounting for observational uncertainties in the evaluation of low latitude turbulent air-sea fluxes simulated in a suite of IPSL model versions

    NASA Astrophysics Data System (ADS)

    Servonnat, Jerome; Braconnot, Pascale; Gainusa-Bogdan, Alina

    2015-04-01

Turbulent momentum and heat (sensible and latent) fluxes at the air-sea interface are key components of the overall energetics of the Earth's climate, and their good representation in climate models is of prime importance. In this work, we use the methodology developed by Braconnot & Frankignoul (1993) to perform a Hotelling T2 test on spatio-temporal fields (annual cycles). This statistic provides a quantitative measure, accounting for an estimate of the observational uncertainty, for the evaluation of low-latitude turbulent air-sea fluxes in a suite of IPSL model versions. The spread within the ensemble of observational turbulent flux data products assembled by Gainusa-Bogdan et al. (submitted) is used as an estimate of the observational uncertainty for the different turbulent fluxes. The methodology relies on the selection of a small number of dominant variability patterns (EOFs) that are common to both the model and the observations. Consequently it focuses on the large-scale variability patterns and avoids the possibly noisy smaller scales. The results show that different versions of the IPSL coupled model share common large-scale biases, but also that skill on sea surface temperature is not necessarily directly related to skill in the representation of the different turbulent fluxes. Despite the large error bars on the observations, the test clearly distinguishes the different merits of the different model versions. The analyses of the common EOF patterns and related time series provide guidance on the major differences with the observations. This work is a first attempt to use such a statistic for evaluating the spatio-temporal variability of the turbulent fluxes while accounting for observational uncertainty, and it represents an efficient tool for the systematic evaluation of simulated air-sea fluxes, considering both the fluxes and the related atmospheric variables. References Braconnot, P., and C.
Frankignoul (1993), Testing Model Simulations of the Thermocline Depth Variability in the Tropical Atlantic from 1982 through 1984, J. Phys. Oceanogr., 23(4), 626-647 Gainusa-Bogdan A., Braconnot P. and Servonnat J. (submitted), Using an ensemble data set of turbulent air-sea fluxes to evaluate the IPSL climate model in tropical regions, Journal of Geophysical Research Atmosphere, 2014JD022985
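    The core of the test described above can be sketched compactly: once the model and observed annual cycles have been projected onto a small number of common EOFs, a two-sample Hotelling T2 statistic compares the mean coefficient vectors against their pooled covariance. The sketch below is illustrative only (it assumes the EOF truncation has already been performed) and is not the authors' code.

```python
import numpy as np

def hotelling_t2(model_coeffs, obs_coeffs):
    """Two-sample Hotelling T^2 statistic on EOF expansion coefficients.

    model_coeffs, obs_coeffs: arrays of shape (n_samples, k) holding the
    projections of each annual cycle onto the k leading common EOFs.
    """
    n1, k = model_coeffs.shape
    n2, _ = obs_coeffs.shape
    d = model_coeffs.mean(axis=0) - obs_coeffs.mean(axis=0)
    # Pooled covariance of the EOF coefficients
    s1 = np.cov(model_coeffs, rowvar=False)
    s2 = np.cov(obs_coeffs, rowvar=False)
    sp = ((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(np.atleast_2d(sp), d)
    return float(t2)
```

    Large values of T2 relative to the appropriate F reference distribution indicate that model and observations differ by more than the observational uncertainty can explain.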

  3. Parameter estimation for groundwater models under uncertain irrigation data

    USGS Publications Warehouse

    Demissie, Yonas; Valocchi, Albert J.; Cai, Ximing; Brozovic, Nicholas; Senay, Gabriel; Gebremichael, Mekonnen

    2015-01-01

    The success of modeling groundwater is strongly influenced by the accuracy of the model parameters that are used to characterize the subsurface system. However, the presence of uncertainty and possibly bias in groundwater model source/sink terms may lead to biased estimates of model parameters and model predictions when the standard regression-based inverse modeling techniques are used. This study first quantifies the levels of bias in groundwater model parameters and predictions due to the presence of errors in irrigation data. Then, a new inverse modeling technique called input uncertainty weighted least-squares (IUWLS) is presented for unbiased estimation of the parameters when pumping and other source/sink data are uncertain. The approach uses the concept of generalized least-squares method with the weight of the objective function depending on the level of pumping uncertainty and iteratively adjusted during the parameter optimization process. We have conducted both analytical and numerical experiments, using irrigation pumping data from the Republican River Basin in Nebraska, to evaluate the performance of ordinary least-squares (OLS) and IUWLS calibration methods under different levels of uncertainty of irrigation data and calibration conditions. The result from the OLS method shows the presence of statistically significant (p < 0.05) bias in estimated parameters and model predictions that persist despite calibrating the models to different calibration data and sample sizes. However, by directly accounting for the irrigation pumping uncertainties during the calibration procedures, the proposed IUWLS is able to minimize the bias effectively without adding significant computational burden to the calibration processes.
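    The idea behind IUWLS can be illustrated with a toy linearized model: the weight given to each residual is reduced according to how strongly the corresponding simulated head responds to the uncertain pumping input, and the weights are re-evaluated as the parameter estimate is refined. The diagonal weighting and function names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def iuwls(X, y, sens_q, sigma_obs, sigma_q, n_iter=5):
    """Input-uncertainty weighted least squares (illustrative sketch).

    X        : (n, p) design matrix of the (linearized) groundwater model
    y        : (n,) observed heads
    sens_q   : function(beta) -> (n,) sensitivity of each simulated head
               to the uncertain pumping input, evaluated at parameters beta
    sigma_obs: measurement error standard deviation
    sigma_q  : pumping-input error standard deviation
    """
    # Start from the ordinary least-squares estimate
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        g = sens_q(beta)
        # Total residual variance: measurement + propagated input error
        w = 1.0 / (sigma_obs**2 + (g * sigma_q) ** 2)
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta
```

    When sigma_q is zero, the scheme collapses to ordinary weighted least squares; as input uncertainty grows, observations most affected by pumping errors are progressively down-weighted.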

  4. Evaluating Productivity Predictions Under Elevated CO2 Conditions: Multi-Model Benchmarking Across FACE Experiments

    NASA Astrophysics Data System (ADS)

    Cowdery, E.; Dietze, M.

    2016-12-01

    As atmospheric carbon dioxide levels continue to increase, it is critical that terrestrial ecosystem models can accurately predict ecological responses to the changing environment. Current predictions of net primary productivity (NPP) in response to elevated atmospheric CO2 concentration are highly variable and contain a considerable amount of uncertainty. The Predictive Ecosystem Analyzer (PEcAn) is an informatics toolbox that wraps around an ecosystem model and can be used to help identify which factors drive uncertainty. We tested a suite of models (LPJ-GUESS, MAESPA, GDAY, CLM5, DALEC, ED2), which represent a range from low to high structural complexity, across a range of Free-Air CO2 Enrichment (FACE) experiments: the Kennedy Space Center Open Top Chamber Experiment, the Rhinelander FACE experiment, the Duke Forest FACE experiment, and the Oak Ridge Experiment on CO2 Enrichment. These tests were implemented in a novel benchmarking workflow that is automated, repeatable, and generalized to incorporate different sites and ecological models. Observational data from the FACE experiments represent a first test of this flexible, extensible approach aimed at providing repeatable tests of model process representation. To identify and evaluate the assumptions causing inter-model differences, we used PEcAn to perform model sensitivity and uncertainty analysis, not only to assess the components of NPP, but also to examine system processes such as nutrient uptake and water use. Combining the observed patterns of uncertainty between multiple models with results of the recent FACE model-data synthesis project (FACE-MDS) can help identify which processes need further study and additional data constraints. These findings can be used to inform future experimental design and in turn can provide an informative starting point for data assimilation.
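    This style of uncertainty analysis can be illustrated with a one-at-a-time variance decomposition: each parameter is varied over its prior while the others are held at their medians, and the resulting output variances are normalized into shares. The function names and the simple normalization below are illustrative assumptions, not PEcAn's actual API.

```python
import numpy as np

def variance_decomposition(model, priors, n=2000, seed=0):
    """One-at-a-time variance decomposition (illustrative of the kind of
    analysis PEcAn performs; names and structure here are assumptions).

    model : function(dict of parameter values) -> scalar output (e.g. NPP)
    priors: dict name -> callable(rng, size) drawing samples from the prior
    Returns the share of output variance attributed to each parameter.
    """
    rng = np.random.default_rng(seed)
    # Hold every parameter at its prior median except the one being varied
    medians = {k: np.median(draw(rng, n)) for k, draw in priors.items()}
    contrib = {}
    for k, draw in priors.items():
        samples = draw(rng, n)
        outs = np.array([model({**medians, k: s}) for s in samples])
        contrib[k] = outs.var()
    total = sum(contrib.values())
    return {k: v / total for k, v in contrib.items()}
```

    For a model that is three times more sensitive to one parameter than another, the first parameter's variance share comes out roughly nine times larger, pointing to where additional data constraints pay off most.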

  5. Stochastic and Perturbed Parameter Representations of Model Uncertainty in Convection Parameterization

    NASA Astrophysics Data System (ADS)

    Christensen, H. M.; Moroz, I.; Palmer, T.

    2015-12-01

    It is now acknowledged that representing model uncertainty in atmospheric simulators is essential for the production of reliable probabilistic ensemble forecasts, and a number of different techniques have been proposed for this purpose. Stochastic convection parameterization schemes use random numbers to represent the difference between a deterministic parameterization scheme and the true atmosphere, accounting for the unresolved sub-grid-scale variability associated with convective clouds. An alternative approach varies the values of poorly constrained physical parameters in the model to represent the uncertainty in these parameters. This study presents new perturbed parameter schemes for use in the European Centre for Medium-Range Weather Forecasts (ECMWF) convection scheme. Two types of scheme are developed and implemented. Both represent the joint uncertainty in four of the parameters in the convection parametrisation scheme, estimated using the Ensemble Prediction and Parameter Estimation System (EPPES). The first is a fixed perturbed parameter scheme, in which the values of uncertain parameters are changed between ensemble members but held constant over the duration of the forecast. The second is a stochastically varying perturbed parameter scheme. The performance of these schemes was compared to that of the ECMWF operational stochastic scheme, Stochastically Perturbed Parametrisation Tendencies (SPPT), and to a model that does not represent uncertainty in convection. The skill of probabilistic forecasts made using the different models was evaluated. While the perturbed parameter schemes improve on the stochastic parametrisation in some regards, the SPPT scheme outperforms the perturbed parameter approaches for forecast variables that are particularly sensitive to convection. Overall, SPPT schemes are the most skilful representations of model uncertainty due to convection parametrisation. Reference: H. M. 
Christensen, I. M. Moroz, and T. N. Palmer, 2015: Stochastic and Perturbed Parameter Representations of Model Uncertainty in Convection Parameterization. J. Atmos. Sci., 72, 2525-2544.
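    The two schemes can be contrasted in a few lines: a fixed scheme draws one perturbation per ensemble member and holds it for the whole forecast, while a stochastic scheme lets the perturbation evolve as an AR(1) process. This is a generic sketch (multiplicative lognormal perturbations are an assumption here), not the ECMWF implementation.

```python
import numpy as np

def sample_parameters(mean, sd, n_members, n_steps, tau=None, seed=0):
    """Illustrative perturbed-parameter generator (not the ECMWF code).

    Fixed scheme (tau=None): each member draws one multiplicative factor
    held constant through the forecast. Stochastic scheme: the factor
    evolves as an AR(1) process with decorrelation time tau (in steps).
    Perturbations are applied in log space to keep parameters positive.
    """
    rng = np.random.default_rng(seed)
    if tau is None:
        # One draw per member, repeated over all forecast steps
        z = rng.normal(0.0, 1.0, (n_members, 1))
        z = np.repeat(z, n_steps, axis=1)
    else:
        phi = np.exp(-1.0 / tau)
        z = np.empty((n_members, n_steps))
        z[:, 0] = rng.normal(0.0, 1.0, n_members)
        for t in range(1, n_steps):
            z[:, t] = (phi * z[:, t - 1]
                       + np.sqrt(1 - phi**2) * rng.normal(0, 1, n_members))
    return mean * np.exp(sd * z)   # shape (n_members, n_steps)
```

    The stationary marginal distribution is the same in both cases; only the temporal behaviour of the perturbation differs, which is exactly the design choice the two schemes explore.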

  6. Accounting for Parameter Uncertainty in Complex Atmospheric Models, With an Application to Greenhouse Gas Emissions Evaluation

    NASA Astrophysics Data System (ADS)

    Swallow, B.; Rigby, M. L.; Rougier, J.; Manning, A.; Thomson, D.; Webster, H. N.; Lunt, M. F.; O'Doherty, S.

    2016-12-01

    In order to understand the underlying processes governing environmental and physical phenomena, a complex mathematical model is usually required. However, there is an inherent uncertainty related to the parameterisation of unresolved processes in these simulators. Here, we focus on the specific problem of accounting for uncertainty in parameter values in an atmospheric chemical transport model. Systematic errors introduced by failing to account for these uncertainties can have a large effect on resulting estimates of unknown quantities of interest. One approach that is increasingly used to address this issue is emulation, in which a large number of forward runs of the simulator are carried out in order to approximate the response of the output to changes in parameters. However, due to the complexity of some models, it is often infeasible to perform the large number of training runs usually required for full statistical emulators of the environmental processes. We therefore present a simplified model reduction method for approximating uncertainties in complex environmental simulators without the need for very large numbers of training runs. We illustrate the method through an application to the Met Office's atmospheric transport model NAME. We show how our parameter estimation framework can be incorporated into a hierarchical Bayesian inversion and demonstrate the impact on estimates of UK methane emissions, using atmospheric mole fraction data. We conclude that accounting for uncertainties in the parameterisation of complex atmospheric models is vital if systematic errors are to be minimized and all relevant uncertainties accounted for. We also note that investigations of this nature can prove extremely useful in highlighting deficiencies in the simulator that might otherwise be missed.
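    The basic idea of emulation can be shown with a minimal stand-in: fit a cheap surrogate to a handful of training runs and query it instead of the simulator. A Gaussian-kernel interpolant is used here as a deliberately simplified sketch; the paper's reduced method differs in detail.

```python
import numpy as np

def fit_rbf_emulator(X, y, length_scale=1.0, jitter=1e-8):
    """Minimal Gaussian-kernel interpolant standing in for a full
    statistical emulator (illustrative sketch only).

    X: (n, d) parameter settings of the training runs; y: (n,) outputs.
    Returns a function mapping new parameter settings to predicted output.
    """
    def kernel(A, B):
        # Squared distances between all pairs of rows in A and B
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length_scale**2)
    K = kernel(X, X) + jitter * np.eye(len(X))
    alpha = np.linalg.solve(K, y)   # interpolation weights
    return lambda Xs: kernel(np.atleast_2d(Xs), X) @ alpha
```

    Once fitted, the emulator can be evaluated millions of times at negligible cost, which is what makes Monte Carlo uncertainty propagation through an expensive simulator tractable.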

  7. Reliability of Current Biokinetic and Dosimetric Models for Radionuclides: A Pilot Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leggett, Richard Wayne; Eckerman, Keith F; Meck, Robert A.

    2008-10-01

    This report describes the results of a pilot study of the reliability of the biokinetic and dosimetric models currently used by the U.S. Nuclear Regulatory Commission (NRC) as predictors of dose per unit internal or external exposure to radionuclides. The study examines the feasibility of critically evaluating the accuracy of these models for a comprehensive set of radionuclides of concern to the NRC. Each critical evaluation would include: identification of discrepancies between the models and current databases; characterization of uncertainties in model predictions of dose per unit intake or unit external exposure; characterization of variability in dose per unit intake or unit external exposure; and evaluation of prospects for development of more accurate models. Uncertainty refers here to the level of knowledge of a central value for a population, and variability refers to quantitative differences between different members of a population. This pilot study provides a critical assessment of models for selected radionuclides representing different levels of knowledge of dose per unit exposure. The main conclusions of this study are as follows: (1) To optimize the use of available NRC resources, the full study should focus on radionuclides most frequently encountered in the workplace or environment. A list of 50 radionuclides is proposed. (2) The reliability of a dose coefficient for inhalation or ingestion of a radionuclide (i.e., an estimate of dose per unit intake) may depend strongly on the specific application. Multiple characterizations of the uncertainty in a dose coefficient for inhalation or ingestion of a radionuclide may be needed for different forms of the radionuclide and different levels of information of that form available to the dose analyst. 
    (3) A meaningful characterization of variability in dose per unit intake of a radionuclide requires detailed information on the biokinetics of the radionuclide and hence is not feasible for many infrequently studied radionuclides. (4) The biokinetics of a radionuclide in the human body typically represents the greatest source of uncertainty or variability in dose per unit intake. (5) Characterization of uncertainty in dose per unit exposure is generally a more straightforward problem for external exposure than for intake of a radionuclide. (6) For many radionuclides the most important outcome of a large-scale critical evaluation of databases and biokinetic models for radionuclides is expected to be the improvement of current models. Many of the current models do not fully or accurately reflect available radiobiological or physiological information, either because the models are outdated or because they were based on selective or uncritical use of data or inadequate model structures. In such cases the models should be replaced with physiologically realistic models that incorporate a wider spectrum of information.

  8. Flexibility evaluation of multiechelon supply chains.

    PubMed

    Almeida, João Flávio de Freitas; Conceição, Samuel Vieira; Pinto, Luiz Ricardo; de Camargo, Ricardo Saraiva; Júnior, Gilberto de Miranda

    2018-01-01

    Multiechelon supply chains are complex logistics systems that require flexibility and coordination at a tactical level to cope with environmental uncertainties in an efficient and effective manner. To cope with these challenges, mathematical programming models are developed to evaluate supply chain flexibility. However, under uncertainty, supply chain models become complex and the scope of flexibility analysis is generally reduced. This paper presents a unified approach that can evaluate the flexibility of a four-echelon supply chain via a robust stochastic programming model. The model simultaneously considers the plans of multiple business divisions such as marketing, logistics, manufacturing, and procurement, whose goals are often conflicting. A numerical example with deterministic parameters is presented to introduce the analysis, and then, the model stochastic parameters are considered to evaluate flexibility. The results of the analysis on supply, manufacturing, and distribution flexibility are presented. Tradeoff analysis of demand variability and service levels is also carried out. The proposed approach facilitates the adoption of different management styles, thus improving supply chain resilience. The model can be extended to contexts pertaining to supply chain disruptions; for example, the model can be used to explore operation strategies when subtle events disrupt supply, manufacturing, or distribution.

  9. Flexibility evaluation of multiechelon supply chains

    PubMed Central

    Conceição, Samuel Vieira; Pinto, Luiz Ricardo; de Camargo, Ricardo Saraiva; Júnior, Gilberto de Miranda

    2018-01-01

    Multiechelon supply chains are complex logistics systems that require flexibility and coordination at a tactical level to cope with environmental uncertainties in an efficient and effective manner. To cope with these challenges, mathematical programming models are developed to evaluate supply chain flexibility. However, under uncertainty, supply chain models become complex and the scope of flexibility analysis is generally reduced. This paper presents a unified approach that can evaluate the flexibility of a four-echelon supply chain via a robust stochastic programming model. The model simultaneously considers the plans of multiple business divisions such as marketing, logistics, manufacturing, and procurement, whose goals are often conflicting. A numerical example with deterministic parameters is presented to introduce the analysis, and then, the model stochastic parameters are considered to evaluate flexibility. The results of the analysis on supply, manufacturing, and distribution flexibility are presented. Tradeoff analysis of demand variability and service levels is also carried out. The proposed approach facilitates the adoption of different management styles, thus improving supply chain resilience. The model can be extended to contexts pertaining to supply chain disruptions; for example, the model can be used to explore operation strategies when subtle events disrupt supply, manufacturing, or distribution. PMID:29584755

  10. Quantification of downscaled precipitation uncertainties via Bayesian inference

    NASA Astrophysics Data System (ADS)

    Nury, A. H.; Sharma, A.; Marshall, L. A.

    2017-12-01

    Prediction of precipitation from global climate model (GCM) outputs remains critical to decision-making in water-stressed regions. In this regard, downscaling of GCM output has been a useful tool for analysing future hydro-climatological states. Several downscaling approaches have been developed for precipitation, including dynamical and statistical downscaling methods. Outputs from dynamical downscaling are frequently not readily transferable across regions owing to significant methodological and computational difficulties. Statistical downscaling approaches provide a flexible and efficient alternative, producing hydro-climatological outputs across multiple temporal and spatial scales in many locations. However, these approaches are subject to significant uncertainty, arising from uncertainty in the downscaling model parameters and from the use of different reanalysis products for inferring appropriate model parameters. Consequently, these uncertainties affect simulation performance at the catchment scale. This study develops a Bayesian framework for modelling downscaled daily precipitation from GCM outputs. It characterizes the uncertainties in downscaling by evaluating reanalysis datasets against observed rainfall data over Australia. A consistent technique for quantifying downscaling uncertainties by means of a Bayesian downscaling framework is proposed. The results suggest that there are differences in downscaled precipitation occurrences and extremes.
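    The Bayesian quantification of parameter uncertainty can be illustrated with a toy downscaling regression fitted by random-walk Metropolis sampling. The linear model, weak priors, and tuning constants below are illustrative assumptions, far simpler than the study's actual framework.

```python
import numpy as np

def metropolis_downscaling(x, y, n_iter=5000, step=0.1, seed=0):
    """Random-walk Metropolis for a toy downscaling regression
    y = a + b * x + Normal(0, sigma). Illustrative of the Bayesian
    framework only. Returns posterior samples of (a, b, log_sigma).
    """
    rng = np.random.default_rng(seed)

    def log_post(theta):
        a, b, log_s = theta
        s = np.exp(log_s)
        resid = y - a - b * x
        # Gaussian likelihood + weak Normal(0, 10^2) priors on a, b, log_s
        return (-len(x) * log_s - 0.5 * (resid ** 2).sum() / s ** 2
                - 0.5 * (theta ** 2).sum() / 100.0)

    theta = np.zeros(3)
    lp = log_post(theta)
    out = np.empty((n_iter, 3))
    for i in range(n_iter):
        prop = theta + step * rng.normal(size=3)
        lp_prop = log_post(prop)
        # Metropolis accept/reject
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        out[i] = theta
    return out
```

    The spread of the post-burn-in samples is the parameter uncertainty that then propagates into downscaled precipitation occurrences and extremes.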

  11. Validating the applicability of the GUM procedure

    NASA Astrophysics Data System (ADS)

    Cox, Maurice G.; Harris, Peter M.

    2014-08-01

    This paper is directed at practitioners seeking a degree of assurance in the quality of the results of an uncertainty evaluation when using the procedure in the Guide to the Expression of Uncertainty in Measurement (GUM) (JCGM 100:2008). Such assurance is required in adhering to general standards such as International Standard ISO/IEC 17025 or other sector-specific standards. We investigate the extent to which such assurance can be given. For many practical cases, a measurement result incorporating an evaluated uncertainty that is correct to one significant decimal digit would be acceptable. Any quantification of the numerical precision of an uncertainty statement is naturally relative to the adequacy of the measurement model and the knowledge used of the quantities in that model. For general univariate and multivariate measurement models, we emphasize the use of a Monte Carlo method, as recommended in GUM Supplements 1 and 2. One use of this method is as a benchmark against which measurement results provided by the GUM procedure can be assessed in any particular instance. We mainly consider measurement models that are linear in the input quantities, or that have been linearized where the linearization process is deemed to be adequate. When the probability distributions for those quantities are independent, we indicate the use of other approaches, such as convolution methods based on the fast Fourier transform and, particularly, Chebyshev polynomials, as benchmarks.
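    The Monte Carlo method of GUM Supplements 1 and 2 can be sketched directly: propagate draws from the input distributions through the measurement model and summarize the output distribution. A minimal sketch, assuming a vectorized measurement model; the helper names are this example's own.

```python
import numpy as np

def mc_uncertainty(f, draws, n=200_000, seed=0):
    """GUM Supplement 1 style Monte Carlo evaluation (sketch).

    f     : measurement model, vectorized over input arrays
    draws : list of callables (rng, n) -> samples for each input quantity
    Returns the estimate, the standard uncertainty, and a 95 %
    probabilistically symmetric coverage interval.
    """
    rng = np.random.default_rng(seed)
    xs = [d(rng, n) for d in draws]   # sample each input quantity
    ys = f(*xs)                       # propagate through the model
    lo, hi = np.percentile(ys, [2.5, 97.5])
    return ys.mean(), ys.std(ddof=1), (lo, hi)
```

    For a linear model with independent Gaussian and rectangular inputs, the Monte Carlo standard uncertainty should agree with the GUM law of propagation of uncertainty, which is exactly the benchmarking use described above.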

  12. Effect of Increased Intensity of Physiotherapy on Patient Outcomes After Stroke: An Economic Literature Review and Cost-Effectiveness Analysis

    PubMed Central

    Chan, B

    2015-01-01

    Background Functional improvements have been seen in stroke patients who have received an increased intensity of physiotherapy. This requires additional costs in the form of increased physiotherapist time. Objectives The objective of this economic analysis is to determine the cost-effectiveness of increasing the intensity of physiotherapy (duration and/or frequency) during inpatient rehabilitation after stroke, from the perspective of the Ontario Ministry of Health and Long-term Care. Data Sources The inputs for our economic evaluation were extracted from articles published in peer-reviewed journals and from reports from government sources or the Canadian Stroke Network. Where published data were not available, we sought expert opinion and used inputs based on the experts' estimates. Review Methods The primary outcome we considered was cost per quality-adjusted life-year (QALY). We also evaluated functional strength training because of its similarities to physiotherapy. We used a 2-state Markov model to evaluate the cost-effectiveness of functional strength training and increased physiotherapy intensity for stroke inpatient rehabilitation. The model had a lifetime timeframe with a 5% annual discount rate. We then used sensitivity analyses to evaluate uncertainty in the model inputs. Results We found that functional strength training and higher-intensity physiotherapy resulted in lower costs and improved outcomes over a lifetime. However, our sensitivity analyses revealed high levels of uncertainty in the model inputs, and therefore in the results. Limitations There is a high level of uncertainty in this analysis due to the uncertainty in model inputs, with some of the major inputs based on expert panel consensus or expert opinion. In addition, the utility outcomes were based on a clinical study conducted in the United Kingdom (i.e., 1 study only, and not in an Ontario or Canadian setting). 
Conclusions Functional strength training and higher-intensity physiotherapy may result in lower costs and improved health outcomes. However, these results should be interpreted with caution. PMID:26366241
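    A two-state (alive/dead) Markov cohort model of the kind described can be sketched as follows; every number in the example comparison is hypothetical and chosen only to show how an incremental cost-effectiveness ratio (ICER) falls out of the model.

```python
def markov_cost_utility(p_die, cost_year, utility, discount=0.05, horizon=40):
    """Two-state (alive/dead) Markov cohort model, illustrative of the
    analysis described above; all input numbers are hypothetical.
    Returns discounted lifetime cost and QALYs for one patient.
    """
    alive = 1.0
    cost = qaly = 0.0
    for year in range(horizon):
        disc = (1 + discount) ** -year   # 5% annual discount rate
        cost += alive * cost_year * disc
        qaly += alive * utility * disc
        alive *= 1 - p_die               # transition: part of cohort dies
    return cost, qaly

# Hypothetical comparison: higher-intensity physiotherapy costs more per
# year but yields higher utility and lower annual mortality.
c_std, q_std = markov_cost_utility(0.08, 1200.0, 0.60)
c_int, q_int = markov_cost_utility(0.07, 1500.0, 0.65)
icer = (c_int - c_std) / (q_int - q_std)   # cost per QALY gained
```

    Sensitivity analysis then amounts to re-running this calculation over plausible ranges of the inputs, which is where the high uncertainty reported above becomes visible.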

  13. Study of synthesis techniques for insensitive aircraft control systems

    NASA Technical Reports Server (NTRS)

    Harvey, C. A.; Pope, R. E.

    1977-01-01

    Insensitive flight control system design criteria were defined in terms of maximizing performance (handling qualities, RMS gust response, transient response, stability margins) over a defined parameter range. Wing load alleviation for the C-5A was chosen as a design problem. The C-5A model was a 79-state, two-control structure with uncertainties assumed to exist in dynamic pressure, structural damping and frequency, and the stability derivative M sub w. Five new techniques (mismatch estimation, uncertainty weighting, finite dimensional inverse, maximum difficulty, dual Lyapunov) were developed. Six existing techniques (additive noise, minimax, multiplant, sensitivity vector augmentation, state dependent noise, residualization) and the mismatch estimation and uncertainty weighting techniques were synthesized and evaluated on the design example. Evaluation and comparison of these eight techniques indicated that the minimax and uncertainty weighting techniques were superior to the other six, and of these two, uncertainty weighting has lower computational requirements. Techniques based on the three remaining new concepts appear promising and are recommended for further research.

  14. A flexible Bayesian assessment for the expected impact of data on prediction confidence for optimal sampling designs

    NASA Astrophysics Data System (ADS)

    Leube, Philipp; Geiges, Andreas; Nowak, Wolfgang

    2010-05-01

    Incorporating hydrogeological data, such as head and tracer data, into stochastic models of subsurface flow and transport helps to reduce prediction uncertainty. Considering the limited financial resources available for a data acquisition campaign, information needs towards the prediction goal should be satisfied in an efficient and task-specific manner. For finding the best among a set of design candidates, an objective function is commonly evaluated that measures the expected impact of data on prediction confidence prior to their collection. An appropriate approach to this task should be stochastically rigorous, handle non-linear dependencies between data, parameters and model predictions, and allow for a wide variety of different data types. Existing methods fail to fulfill all these requirements simultaneously. For this reason, we introduce a new method, denoted CLUE (Cross-bred Likelihood Uncertainty Estimator), that derives the essential distributions and measures of data utility within a generalized, flexible and accurate framework. The method makes use of Bayesian GLUE (Generalized Likelihood Uncertainty Estimator) and extends it to an optimal design method by marginalizing over the yet unknown data values. Operating in a purely Bayesian Monte-Carlo framework, CLUE is a strictly formal information processing scheme free of linearizations. It provides full flexibility with respect to the type of measurements (linear, non-linear, direct, indirect) and accounts for almost arbitrary sources of uncertainty (e.g. heterogeneity, geostatistical assumptions, boundary conditions, model concepts) via stochastic simulation and Bayesian model averaging. This helps to minimize the strength and impact of possible subjective prior assumptions that would be hard to defend prior to data collection. Our study focuses on evaluating two different uncertainty measures: (i) the expected conditional variance and (ii) the expected relative entropy of a given prediction goal. 
    The applicability and advantages are shown in a synthetic example, in which we consider a contaminant source posing a threat to a drinking water well in an aquifer. Furthermore, we assume uncertainty in geostatistical parameters, boundary conditions and hydraulic gradient. The two measures evaluate the sensitivity of (1) general prediction confidence and (2) the exceedance probability of a legal regulatory threshold value to the choice of sampling locations.
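    The preposterior logic behind this design measure can be illustrated in miniature: each Monte Carlo realization in turn plays the role of the truth, the remaining realizations are weighted by a Gaussian likelihood of its simulated data, and the weighted prediction variances are averaged. This is an illustrative sketch of the expected-conditional-variance measure, not the authors' implementation.

```python
import numpy as np

def expected_conditional_variance(pred, sim_data, sigma_eps, seed=0):
    """Preposterior estimate of the expected conditional prediction
    variance, in the spirit of the GLUE/CLUE framework (illustrative).

    pred     : (n,) Monte Carlo predictions of the goal quantity
    sim_data : (n, m) simulated values of the m planned measurements
    sigma_eps: measurement error standard deviation
    """
    rng = np.random.default_rng(seed)
    n = len(pred)
    ecv = 0.0
    for k in range(n):
        # Member k plays the role of the truth; add measurement noise
        y_obs = sim_data[k] + rng.normal(0, sigma_eps, sim_data.shape[1])
        ll = -0.5 * ((sim_data - y_obs) ** 2).sum(axis=1) / sigma_eps**2
        w = np.exp(ll - ll.max())
        w /= w.sum()
        mu = (w * pred).sum()
        ecv += (w * (pred - mu) ** 2).sum()   # conditional variance
    return ecv / n
```

    A candidate design is better the further its expected conditional variance falls below the prior prediction variance; comparing designs is then a matter of re-evaluating this quantity for each set of planned measurement locations.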

  15. Impacts of Process and Prediction Uncertainties on Projected Hanford Waste Glass Amount

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gervasio, Vivianaluxa; Vienna, John D.; Kim, Dong-Sang

    Analyses were performed to evaluate the impacts of using the advanced glass models, constraints (Vienna et al. 2016), and uncertainty descriptions on projected Hanford glass mass. The maximum allowable waste oxide loading (WOL) was estimated for waste compositions while simultaneously satisfying all applicable glass property and composition constraints with sufficient confidence. Different components of prediction and composition/process uncertainties were systematically included in the calculations to evaluate their impacts on glass mass. The analyses estimated the production of 23,360 MT of IHLW glass when no uncertainties were taken into account. Accounting for prediction and composition/process uncertainties resulted in a 5.01 relative percent increase in estimated glass mass (24,531 MT). Roughly equal impacts were found for prediction uncertainties (2.58 RPD) and composition/process uncertainties (2.43 RPD). ILAW mass was predicted to be 282,350 MT without uncertainty and with waste loading "line" rules in place. Accounting for prediction and composition/process uncertainties resulted in only a 0.08 relative percent increase in estimated glass mass (282,562 MT). Without application of line rules, the glass mass decreases by 10.6 relative percent (252,490 MT) for the case with no uncertainties. Addition of prediction uncertainties increases glass mass by 1.32 relative percent, and the addition of composition/process uncertainties increases glass mass by an additional 7.73 relative percent (a 9.06 relative percent increase combined). The glass mass estimate without line rules (275,359 MT) was 2.55 relative percent lower than that with the line rules (282,562 MT), after accounting for all applicable uncertainties.
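    The kind of calculation described, finding the largest waste loading that satisfies a property constraint with sufficient confidence, can be illustrated with a one-constraint bisection; the linear property model and all numbers below are purely hypothetical.

```python
def max_waste_loading(prop_model, pred_std, limit, z=1.645, lo=0.0, hi=1.0):
    """Largest waste loading whose predicted glass property stays under
    its limit with ~95 % one-sided confidence (illustrative sketch; the
    actual Hanford models and constraint sets are far more involved).

    prop_model: function(wol) -> predicted property, increasing in wol
    pred_std  : standard deviation of the model prediction error
    """
    for _ in range(60):                      # bisection on the loading
        mid = 0.5 * (lo + hi)
        if prop_model(mid) + z * pred_std <= limit:
            lo = mid                         # constraint met: loading can grow
        else:
            hi = mid
    return lo

# Hypothetical example: a property growing linearly with loading, limit 1.0.
wol_no_unc = max_waste_loading(lambda w: 2.0 * w, 0.00, 1.0)
wol_unc = max_waste_loading(lambda w: 2.0 * w, 0.05, 1.0)
```

    As in the analyses above, adding prediction uncertainty tightens the effective constraint, lowers the allowable loading, and therefore raises the projected glass mass for a fixed amount of waste.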

  16. Assessment of flood susceptible areas using spatially explicit, probabilistic multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan

    2018-03-01

    GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied by considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation to take into account the uncertainty related to criteria weights, the spatial heterogeneity of preferences, and the risk attitude of the analyst. The approach is applied in a pilot study to Gucheng County, central China, heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. The proposed approach is therefore an improvement over the conventional deterministic method and provides a more rational, objective and unbiased tool for flood susceptibility evaluation.
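    The probabilistic OWA combination can be sketched as a Monte Carlo loop over uncertain criterion weights; the Dirichlet weight sampler in the test and the simple weight-then-order scheme are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def owa(values, order_weights):
    """Ordered weighted average: weights apply to rank-ordered scores."""
    return np.sort(values)[::-1] @ order_weights

def probabilistic_owa(scores, weight_sampler, order_weights, n=1000, seed=0):
    """Monte Carlo GIS-MCDA sketch in the spirit of the approach above.

    scores        : (n_cells, n_criteria) standardized criterion scores
    weight_sampler: rng -> (n_criteria,) random criterion weights (sum 1)
    order_weights : OWA weights encoding the analyst's risk attitude
    Returns mean susceptibility per cell and its Monte Carlo std. dev.
    """
    rng = np.random.default_rng(seed)
    out = np.empty((n, scores.shape[0]))
    for i in range(n):
        w = weight_sampler(rng)          # draw uncertain criterion weights
        weighted = scores * w
        out[i] = [owa(row, order_weights) for row in weighted]
    return out.mean(axis=0), out.std(axis=0)
```

    The per-cell standard deviation is the uncertainty map: cells whose ranking flips under plausible weight variations are exactly where the deterministic method is least trustworthy.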

  17. A kriging metamodel-assisted robust optimization method based on a reverse model

    NASA Astrophysics Data System (ADS)

    Zhou, Hui; Zhou, Qi; Liu, Congwei; Zhou, Taotao

    2018-02-01

    The goal of robust optimization methods is to obtain a solution that is both optimum and relatively insensitive to uncertainty factors. Most existing robust optimization approaches use outer-inner nested optimization structures, which require a large amount of computational effort because the robustness of each candidate solution delivered from the outer level must be evaluated in the inner level. In this article, a kriging metamodel-assisted robust optimization method based on a reverse model (K-RMRO) is first proposed, in which the nested optimization structure is reduced to a single-loop optimization structure to ease the computational burden. Because it ignores the interpolation uncertainty of the kriging metamodel, however, K-RMRO may yield non-robust optima. Hence, an improved kriging-assisted robust optimization method based on a reverse model (IK-RMRO) is presented to take the interpolation uncertainty of the kriging metamodel into consideration. In IK-RMRO, an objective switching criterion is introduced to determine whether the inner-level robust optimization or the kriging metamodel replacement should be used to evaluate the robustness of design alternatives. The proposed criterion is developed according to whether or not the robust status of the individual can be changed by the interpolation uncertainties of the kriging metamodel. Numerical and engineering cases are used to demonstrate the applicability and efficiency of the proposed approach.
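    The interpolation uncertainty that IK-RMRO consults is the kriging variance; a minimal simple-kriging predictor shows where that variance comes from. The Gaussian covariance and fixed hyperparameters below are illustrative assumptions, not the article's model.

```python
import numpy as np

def simple_kriging(X, y, x_new, length_scale=1.0, sill=1.0, nugget=1e-8):
    """Simple-kriging predictor with its interpolation variance
    (illustrative of the metamodel used in K-RMRO/IK-RMRO).

    Returns (mean, variance) at x_new. The variance is the quantity a
    switching criterion can consult to decide whether the metamodel is
    trustworthy enough to judge the robustness of a candidate design.
    """
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return sill * np.exp(-0.5 * d2 / length_scale**2)
    K = k(X, X) + nugget * np.eye(len(X))
    ks = k(X, np.atleast_2d(x_new))[:, 0]
    w = np.linalg.solve(K, ks)          # kriging weights
    return w @ y, sill - w @ ks         # predictor and its variance
```

    Near training points the variance collapses toward zero, so the metamodel can safely replace the inner-loop evaluation; far from them it approaches the sill, which is when falling back to the true robustness evaluation is warranted.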

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sattison, M.B.; Blackman, H.S.; Novack, S.D.

    The Office for Analysis and Evaluation of Operational Data (AEOD) has sought the assistance of the Idaho National Engineering Laboratory (INEL) to make some significant enhancements to the SAPHIRE-based Accident Sequence Precursor (ASP) models recently developed by the INEL. The challenge of this project is to provide the features of a full-scale PRA within the framework of the simplified ASP models. Some of these features include: (1) uncertainty analysis addressing the standard PRA uncertainties and the uncertainties unique to the ASP models and methods, (2) incorporation and proper quantification of individual human actions and the interaction among human actions, (3) enhanced treatment of common cause failures, and (4) extension of the ASP models to more closely mimic full-scale PRAs (inclusion of more initiators, explicitly modeling support system failures, etc.). This paper provides an overview of the methods being used to make the above improvements.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sattison, M.B.; Blackman, H.S.; Novack, S.D.

    The Office for Analysis and Evaluation of Operational Data (AEOD) has sought the assistance of the Idaho National Engineering Laboratory (INEL) to make some significant enhancements to the SAPHIRE-based Accident Sequence Precursor (ASP) models recently developed by the INEL. The challenge of this project is to provide the features of a full-scale PRA within the framework of the simplified ASP models. Some of these features include: (1) uncertainty analysis addressing the standard PRA uncertainties and the uncertainties unique to the ASP models and methodology, (2) incorporation and proper quantification of individual human actions and the interaction among human actions, (3) enhanced treatment of common cause failures, and (4) extension of the ASP models to more closely mimic full-scale PRAs (inclusion of more initiators, explicitly modeling support system failures, etc.). This paper provides an overview of the methods being used to make the above improvements.

  20. Uncertainty quantification and optimal decisions

    PubMed Central

    2017-01-01

    A mathematical model can be analysed to construct policies for action that are close to optimal for the model. If the model is accurate, such policies will be close to optimal when implemented in the real world. In this paper, the different aspects of an ideal workflow are reviewed: modelling, forecasting, evaluating forecasts, data assimilation and constructing control policies for decision-making. The example of the oil industry is used to motivate the discussion, and other examples, such as weather forecasting and precision agriculture, are used to argue that the same mathematical ideas apply in different contexts. Particular emphasis is placed on (i) uncertainty quantification in forecasting and (ii) how decisions are optimized and made robust to uncertainty in models and judgements. This necessitates full use of the relevant data; balancing costs and benefits over the long term may then suggest policies quite different from those appropriate in the short term. PMID:28484343

  1. Evaluation and comparison of different RCMs simulations of the Mediterranean climate: a view on the impact of model resolution and Mediterranean sea coupling.

    NASA Astrophysics Data System (ADS)

    Panthou, Gérémy; Vrac, Mathieu; Drobinski, Philippe; Bastin, Sophie; Somot, Samuel; Li, Laurent

    2015-04-01

    As numerous authors have noted, the Mediterranean is considered a major climate 'hot spot', for at least three reasons. First, the region is regularly affected by extreme hydro-meteorological events (heavy precipitation and flash floods during autumn; droughts and heat waves during spring and summer). Second, the vulnerability of its populations to these extreme events is expected to increase during the 21st century, if only because of the projected population growth in the region. Finally, General Circulation Models project that this regional climate will be highly sensitive to climate change; moreover, global warming is expected to intensify the hydrological cycle and thus to increase the frequency of extreme hydro-meteorological events. In order to propose adaptation strategies, robust estimation of the future evolution of the Mediterranean climate and of the associated extreme hydro-meteorological events (in terms of intensity and frequency) is highly relevant. However, these projections carry large uncertainties, which arise from many components of the simulation chain: (i) the emission scenario; (ii) parameterization errors in the climate models and uncertainty about the initial state of the climate; and (iii) the additional uncertainty introduced by the (dynamical or statistical) downscaling technique and the impact model. Narrowing these uncertainties as far as possible is a major challenge of current climate research, and one way to do so is to reduce the uncertainty associated with each component. In this study, we evaluate the potential improvement brought by (i) RCM simulations coupled with the Mediterranean Sea, in comparison with atmosphere-only (stand-alone) RCM simulations, and (ii) RCM simulations at finer resolution, in comparison with coarser resolution. 
For that, three different RCMs (WRF, ALADIN, LMDZ4) were run, forced by ERA-Interim reanalyses, within the MED-CORDEX experiment. For each RCM, different versions (coupled/stand-alone, high/low resolution) were realized. A large set of scores was developed and applied to evaluate the performance of these RCM simulations for three variables: daily precipitation amount, mean daily air temperature, and dry spell length. Particular attention was given to each RCM's ability to reproduce the seasonal and spatial patterns of extreme statistics. Results show that the differences between coupled and stand-alone RCMs are localized very near the Mediterranean Sea and that model resolution has only a slight impact on the scores obtained. Overall, the main differences between the simulations come from the RCM used. Keywords: Mediterranean climate, extreme hydro-meteorological events, RCM simulations, evaluation of climate simulations

  2. Spreadsheet for designing valid least-squares calibrations: A tutorial.

    PubMed

    Bettencourt da Silva, Ricardo J N

    2016-02-01

    Instrumental methods of analysis are used to define the price of goods, the compliance of products with a regulation, or the outcome of fundamental or applied research. These methods can only play their role properly if the reported information is objective and its quality is fit for the intended use. If measurement results are reported with an adequately small measurement uncertainty, both of these goals are achieved. The evaluation of measurement uncertainty can be performed by the bottom-up approach, which involves a detailed description of the measurement process, or by a pragmatic top-down approach that quantifies the major uncertainty components from global performance data. The bottom-up approach is used less frequently, owing to the need to master the quantification of the individual components responsible for random and systematic effects on measurement results. This work presents a tutorial that can easily be used by non-experts for the accurate evaluation of the measurement uncertainty of instrumental methods of analysis calibrated using least-squares regression. The tutorial covers the definition of the calibration interval, the assessment of instrumental response homoscedasticity, the definition of the calibrator preparation procedure required for applying the least-squares regression model, the assessment of instrumental response linearity, and the evaluation of measurement uncertainty. The developed measurement model is only applicable in calibration ranges where signal precision is constant. An MS-Excel file is made available to allow easy application of the tutorial. This tool can be useful for cases where top-down approaches cannot produce results with adequately low measurement uncertainty. An example of the application of this tool to the determination of nitrate in water by ion chromatography is presented. Copyright © 2015 Elsevier B.V. All rights reserved.
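    The bottom-up evaluation for an unweighted least-squares calibration can be sketched in a few lines. This is a minimal illustration, assuming homoscedastic signals and the standard interpolation-uncertainty formula; the function and variable names are ours, not the tutorial's:

```python
import numpy as np

def calibration_uncertainty(x, y, y0, m=1):
    """Interpolated value and its standard uncertainty from an
    unweighted least-squares calibration line (constant signal precision).

    x, y : calibration standards (concentration, instrument signal)
    y0   : mean signal of the unknown, averaged over m replicates
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = x.size
    b, a = np.polyfit(x, y, 1)                  # slope, intercept
    resid = y - (a + b * x)
    s = np.sqrt(np.sum(resid**2) / (n - 2))     # residual standard deviation
    sxx = np.sum((x - x.mean())**2)
    x0 = (y0 - a) / b                           # interpolated concentration
    u_x0 = (s / abs(b)) * np.sqrt(1/m + 1/n + (y0 - y.mean())**2 / (b**2 * sxx))
    return x0, u_x0
```

    The uncertainty shrinks with more calibrators (larger n), more replicates of the unknown (larger m), and signals closer to the centroid of the calibration data.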

  3. Evaluation of uncertainties in the CRCM-simulated North American climate

    NASA Astrophysics Data System (ADS)

    de Elía, Ramón; Caya, Daniel; Côté, Hélène; Frigon, Anne; Biner, Sébastien; Giguère, Michel; Paquin, Dominique; Harvey, Richard; Plummer, David

    2008-02-01

    This work is a first step in the analysis of uncertainty sources in the RCM-simulated climate over North America. Three main sets of sensitivity studies were carried out: the first estimates the magnitude of internal variability, which is needed to evaluate the significance of changes in the simulated climate induced by any model modification. The second is devoted to the role of CRCM configuration as a source of uncertainty, in particular the sensitivity to nesting technique, domain size, and driving reanalysis. The third study aims to assess the relative importance of the previously estimated sensitivities by performing two additional sensitivity experiments: one in which the reanalysis driving data are replaced by data generated by the second-generation Coupled Global Climate Model (CGCM2), and another in which a different CRCM version is used. Results show that the internal variability, triggered by differences in initial conditions, is much smaller than the sensitivity to any other source. Results also show that the levels of uncertainty arising from freedom of choice in the configuration parameters are comparable among themselves and smaller than those due to the choice of CGCM or CRCM version. These results suggest that uncertainty originating from the latitude in CRCM configuration (freedom of choice among domain sizes, nesting techniques and reanalysis datasets), although important, does not seem to be a major obstacle to climate downscaling. Finally, with the aim of evaluating the combined effect of the different uncertainties, the ensemble spread is estimated for a subset of the analysed simulations. Results show that downscaled surface temperature is in general more uncertain in the northern regions, while precipitation is more uncertain in the central and eastern US.

  4. Advanced Modeling and Uncertainty Quantification for Flight Dynamics; Interim Results and Challenges

    NASA Technical Reports Server (NTRS)

    Hyde, David C.; Shweyk, Kamal M.; Brown, Frank; Shah, Gautam

    2014-01-01

    As part of the NASA Vehicle Systems Safety Technologies (VSST), Assuring Safe and Effective Aircraft Control Under Hazardous Conditions (Technical Challenge #3), an effort is underway within Boeing Research and Technology (BR&T) to address Advanced Modeling and Uncertainty Quantification for Flight Dynamics (VSST1-7). The scope of the effort is to develop and evaluate advanced multidisciplinary flight dynamics modeling techniques, including integrated uncertainties, to facilitate higher fidelity response characterization of current and future aircraft configurations approaching and during loss-of-control conditions. The approach incorporates multiple flight dynamics modeling methods (experimental, computational, and analytical) for aerodynamics, structures, and propulsion, together with techniques for data integration and uncertainty characterization and quantification. This research shall introduce new and updated multidisciplinary modeling and simulation technologies designed to improve the ability to characterize airplane response in off-nominal flight conditions. The research shall also introduce new techniques for uncertainty modeling that will provide a unified database model comprised of multiple sources, as well as an uncertainty bounds database for each data source, such that a full vehicle uncertainty analysis is possible even when approaching or beyond Loss of Control boundaries. Methodologies developed as part of this research shall be instrumental in predicting and mitigating loss of control precursors and events directly linked to causal and contributing factors, such as stall, failures, damage, or icing. 
The tasks will include utilizing the BR&T Water Tunnel to collect static and dynamic data to be compared to the GTM extended WT database, characterizing flight dynamics in off-nominal conditions, developing tools for structural load estimation under dynamic conditions, devising methods for integrating various modeling elements into a real-time simulation capability, generating techniques for uncertainty modeling that draw data from multiple modeling sources, and providing a unified database model that includes nominal plus increments for each flight condition. This paper presents status of testing in the BR&T water tunnel and analysis of the resulting data and efforts to characterize these data using alternative modeling methods. Program challenges and issues are also presented.

  5. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    DOE PAGES

    Pastore, Giovanni; Swiler, L. P.; Hales, Jason D.; ...

    2014-10-12

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for the coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.
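    The core of such a parameter study can be approximated in a few lines: sample each uncertain parameter within its range, run the model, and rank the inputs by the strength of their correlation with the output. This is a rough sketch with invented parameter names and a toy surrogate; BISON/DAKOTA use far more sophisticated sampling schemes and variance-based measures:

```python
import numpy as np

rng = np.random.default_rng(1)

def rank_parameters(model, ranges, n=1000):
    """Sample each parameter uniformly within its uncertainty range,
    evaluate the model, and rank inputs by |correlation| with the output."""
    samples = {p: rng.uniform(lo, hi, n) for p, (lo, hi) in ranges.items()}
    out = model(**samples)
    corr = {p: abs(np.corrcoef(s, out)[0, 1]) for p, s in samples.items()}
    return sorted(corr, key=corr.get, reverse=True)
```

    For a linear toy model the ranking simply recovers the coefficient magnitudes; for a real fuel-performance code each `model(**samples)` call would be an expensive simulation, which is why efficient sampling matters.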

  6. Hydrologic drought prediction under climate change: Uncertainty modeling with Dempster-Shafer and Bayesian approaches

    NASA Astrophysics Data System (ADS)

    Raje, Deepashree; Mujumdar, P. P.

    2010-09-01

    Representation and quantification of uncertainty in climate change impact studies are a difficult task. Several sources of uncertainty arise in studies of hydrologic impacts of climate change, such as those due to choice of general circulation models (GCMs), scenarios and downscaling methods. Recently, much work has focused on uncertainty quantification and modeling in regional climate change impacts. In this paper, an uncertainty modeling framework is evaluated, which uses a generalized uncertainty measure to combine GCM, scenario and downscaling uncertainties. The Dempster-Shafer (D-S) evidence theory is used for representing and combining uncertainty from various sources. A significant advantage of the D-S framework over the traditional probabilistic approach is that it allows for the allocation of a probability mass to sets or intervals, and can hence handle both aleatory or stochastic uncertainty, and epistemic or subjective uncertainty. This paper shows how the D-S theory can be used to represent beliefs in some hypotheses such as hydrologic drought or wet conditions, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The D-S approach has been used in this work for information synthesis using various evidence combination rules having different conflict modeling approaches. A case study is presented for hydrologic drought prediction using downscaled streamflow in the Mahanadi River at Hirakud in Orissa, India. Projections of n most likely monsoon streamflow sequences are obtained from a conditional random field (CRF) downscaling model, using an ensemble of three GCMs for three scenarios, which are converted to monsoon standardized streamflow index (SSFI-4) series. This range is used to specify the basic probability assignment (bpa) for a Dempster-Shafer structure, which represents uncertainty associated with each of the SSFI-4 classifications. 
These uncertainties are then combined across GCMs and scenarios using various evidence combination rules given by the D-S theory. A Bayesian approach is also presented for this case study, which models the uncertainty in projected frequencies of SSFI-4 classifications by deriving a posterior distribution for the frequency of each classification, using an ensemble of GCMs and scenarios. Results from the D-S and Bayesian approaches are compared, and relative merits of each approach are discussed. Both approaches show an increasing probability of extreme, severe and moderate droughts and decreasing probability of normal and wet conditions in Orissa as a result of climate change.
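    Dempster's rule of combination, used above to merge evidence across GCMs and scenarios, has a compact implementation when each basic probability assignment is stored as a dict mapping focal sets to masses. A sketch with an invented two-class dry/wet frame (the paper's SSFI-4 classes and combination-rule variants are not reproduced here):

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability
    assignments, given as dicts {frozenset(classes): mass}."""
    combined, conflict = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc          # mass on empty intersections
    if conflict >= 1.0:
        raise ValueError("sources are in total conflict")
    return {a: mass / (1.0 - conflict) for a, mass in combined.items()}

def belief(m, a):
    """Total mass committed to subsets of hypothesis set a."""
    return sum(v for b, v in m.items() if b <= a)

def plausibility(m, a):
    """Total mass not contradicting hypothesis set a."""
    return sum(v for b, v in m.items() if b & a)
```

    The gap between belief and plausibility for a hypothesis is exactly the quantitative measure of ignorance the abstract refers to: mass assigned to sets (e.g. the whole frame) rather than to single classes.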

  7. Interval Predictor Models with a Formal Characterization of Uncertainty and Reliability

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.

    2014-01-01

    This paper develops techniques for constructing empirical predictor models based on observations. By contrast to standard models, which yield a single predicted output at each value of the model's inputs, Interval Predictor Models (IPMs) yield an interval into which the unobserved output is predicted to fall. The IPMs proposed prescribe the output as an interval-valued function of the model's inputs and render a formal description of both the uncertainty in the model's parameters and the spread in the predicted output. Uncertainty is prescribed as a hyper-rectangular set in the space of the model's parameters. The propagation of this set through the empirical model yields a range of outputs of minimal spread containing all (or, depending on the formulation, most) of the observations. Optimization-based strategies for calculating IPMs and eliminating the effects of outliers are proposed. Outliers are identified by evaluating the extent to which they degrade the tightness of the prediction. This evaluation can be carried out while the IPM is calculated. When the data satisfy mild stochastic assumptions, and the optimization program used for calculating the IPM is convex (or its solution coincides with the solution to an auxiliary convex program), the model's reliability (that is, the probability that a future observation will fall within the predicted range of outputs) can be bounded rigorously by a non-asymptotic formula.
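    The simplest instance of this idea is easy to sketch: fit a center model by least squares, then take the smallest constant band around it that encloses every observation. This is our own rudimentary illustration; the paper's convex formulations, parameter-set propagation, and outlier elimination are omitted:

```python
import numpy as np

def fit_interval_predictor(x, y, degree=1):
    """Least-squares polynomial center line plus the smallest constant
    band that encloses every observation: a rudimentary interval predictor."""
    coeffs = np.polyfit(x, y, degree)
    e = np.max(np.abs(y - np.polyval(coeffs, x)))   # minimal half-width
    lower = lambda xq: np.polyval(coeffs, xq) - e
    upper = lambda xq: np.polyval(coeffs, xq) + e
    return lower, upper, e
```

    By construction every training point lies inside [lower(x), upper(x)]; the half-width e is the "spread" that the optimization-based formulations in the paper minimize over richer model classes.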

  8. Bayesian model evidence as a model evaluation metric

    NASA Astrophysics Data System (ADS)

    Guthke, Anneli; Höge, Marvin; Nowak, Wolfgang

    2017-04-01

    When building environmental systems models, we are typically confronted with the questions of how to choose an appropriate model (i.e., which processes to include or neglect) and how to measure its quality. Various metrics have been proposed that shall guide the modeller towards a most robust and realistic representation of the system under study. Criteria for evaluation often address aspects of accuracy (absence of bias) or of precision (absence of unnecessary variance) and need to be combined in a meaningful way in order to address the inherent bias-variance dilemma. We suggest using Bayesian model evidence (BME) as a model evaluation metric that implicitly performs a tradeoff between bias and variance. BME is typically associated with model weights in the context of Bayesian model averaging (BMA). However, it can also be seen as a model evaluation metric in a single-model context or in model comparison. It combines a measure for goodness of fit with a penalty for unjustifiable complexity. Unjustifiable refers to the fact that the appropriate level of model complexity is limited by the amount of information available for calibration. Derived in a Bayesian context, BME naturally accounts for measurement errors in the calibration data as well as for input and parameter uncertainty. BME is therefore perfectly suitable to assess model quality under uncertainty. We will explain in detail and with schematic illustrations what BME measures, i.e. how complexity is defined in the Bayesian setting and how this complexity is balanced with goodness of fit. We will further discuss how BME compares to other model evaluation metrics that address accuracy and precision such as the predictive logscore or other model selection criteria such as the AIC, BIC or KIC. Although computationally more expensive than other metrics or criteria, BME represents an appealing alternative because it provides a global measure of model quality. 
Even if not applicable to each and every case, we aim at stimulating discussion about how to judge the quality of hydrological models in the presence of uncertainty in general by dissecting the mechanism behind BME.
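    For a model with a Gaussian error assumption, BME is simply the likelihood of the calibration data averaged over the prior, and a brute-force Monte Carlo estimate takes a few lines. This sketch only illustrates the definition; real applications need smarter estimators, and all names here are our own:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_bme(data, simulate, prior_sampler, sigma, n=20000):
    """Brute-force Monte Carlo estimate of log Bayesian model evidence:
    the Gaussian likelihood of the data averaged over prior samples."""
    loglik = np.empty(n)
    for i, theta in enumerate(prior_sampler(n)):
        resid = data - simulate(theta)
        loglik[i] = (-0.5 * np.sum(resid**2) / sigma**2
                     - resid.size * np.log(np.sqrt(2 * np.pi) * sigma))
    m = loglik.max()                      # log-mean-exp for stability
    return m + np.log(np.mean(np.exp(loglik - m)))
```

    The complexity penalty is implicit: a model with superfluous parameters spreads its prior over regions that fit the data poorly, so its averaged likelihood (and hence its evidence) drops even if its best fit is as good.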

  9. Evaluation of stormwater micropollutant source control and end-of-pipe control strategies using an uncertainty-calibrated integrated dynamic simulation model.

    PubMed

    Vezzaro, L; Sharma, A K; Ledin, A; Mikkelsen, P S

    2015-03-15

    The estimation of micropollutant (MP) fluxes in stormwater systems is a fundamental prerequisite when preparing strategies to reduce stormwater MP discharges to natural waters. Dynamic integrated models can be important tools in this step, as they can be used to integrate the limited data provided by monitoring campaigns and to evaluate the performance of different strategies based on model simulation results. This study presents an example where six different control strategies, including both source-control and end-of-pipe treatment, were compared. The comparison focused on fluxes of heavy metals (copper, zinc) and organic compounds (fluoranthene). MP fluxes were estimated by using an integrated dynamic model, in combination with stormwater quality measurements. MP sources were identified by using GIS land usage data, runoff quality was simulated by using a conceptual accumulation/washoff model, and a stormwater retention pond was simulated by using a dynamic treatment model based on MP inherent properties. Uncertainty in the results was estimated with a pseudo-Bayesian method. Despite the great uncertainty in the MP fluxes estimated by the runoff quality model, it was possible to compare the six scenarios in terms of discharged MP fluxes, compliance with water quality criteria, and sediment accumulation. Source-control strategies obtained better results in terms of reduction of MP emissions, but all the simulated strategies failed to fulfil the criteria based on emission limit values. The results presented in this study show how the efficiency of MP pollution control strategies can be quantified by combining advanced modeling tools (integrated stormwater quality model, uncertainty calibration). Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Impact of Exposure Uncertainty on the Association between Perfluorooctanoate and Preeclampsia in the C8 Health Project Population.

    PubMed

    Avanasi, Raghavendhran; Shin, Hyeong-Moo; Vieira, Verónica M; Savitz, David A; Bartell, Scott M

    2016-01-01

    Uncertainty in exposure estimates from models can result in exposure measurement error and can potentially affect the validity of epidemiological studies. We recently used a suite of environmental models and an integrated exposure and pharmacokinetic model to estimate individual perfluorooctanoate (PFOA) serum concentrations and to assess the association with preeclampsia from 1990 through 2006 for the C8 Health Project participants. The aims of the current study are to evaluate the impact of uncertainty in estimated PFOA drinking-water concentrations on estimated serum concentrations and on their reported epidemiological association with preeclampsia. For each individual public water district, we used Monte Carlo simulations to vary the year-by-year PFOA drinking-water concentration, randomly sampling from lognormal distributions for random error in the yearly public water district PFOA concentrations, systematic error specific to each water district, and global systematic error in the release assessment (using the estimated concentrations from the original fate and transport model as medians, and a range of 2-, 5-, and 10-fold uncertainty). Uncertainty in PFOA water concentrations could cause major changes in estimated serum PFOA concentrations among participants. However, there is relatively little impact on the resulting epidemiological association in our simulations. The contribution of exposure uncertainty to the total uncertainty (including regression parameter variance) ranged from 5% to 31%, and bias was negligible. We found that correlated exposure uncertainty can substantially change estimated PFOA serum concentrations, but results in only minor impacts on the epidemiological association between PFOA and preeclampsia. Avanasi R, Shin HM, Vieira VM, Savitz DA, Bartell SM. 2016. Impact of exposure uncertainty on the association between perfluorooctanoate and preeclampsia in the C8 Health Project population. 
Environ Health Perspect 124:126-132; http://dx.doi.org/10.1289/ehp.1409044.
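    The core perturbation step, treating the modeled water concentrations as medians of lognormal errors with both systematic and yearly-random components, can be sketched as follows. The variable names and the convention that the stated fold-uncertainty spans a 95% range are our assumptions, not details taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

def perturb_concentrations(median_conc, fold, n_sims):
    """Draw perturbed yearly concentration series: modeled values are
    treated as medians of lognormal errors whose 95% range spans the
    stated fold-uncertainty (e.g. fold=5 -> up to 5x above or below)."""
    sigma = np.log(fold) / 1.96                       # lognormal sigma from fold range
    n_years = len(median_conc)
    systematic = rng.normal(0, sigma, (n_sims, 1))    # one offset per simulation
    yearly = rng.normal(0, sigma, (n_sims, n_years))  # independent yearly error
    return median_conc * np.exp(systematic + yearly)
```

    The shared `systematic` term is what makes the errors correlated across years within a simulation; dropping it would understate how far an entire exposure history can shift.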

  11. Guidelines 13 and 14—Prediction uncertainty

    USGS Publications Warehouse

    Hill, Mary C.; Tiedeman, Claire

    2005-01-01

    An advantage of using optimization for model development and calibration is that optimization provides methods for evaluating and quantifying prediction uncertainty. Both deterministic and statistical methods can be used. Guideline 13 discusses using regression and post-audits, which we classify as deterministic methods. Guideline 14 discusses inferential statistics and Monte Carlo methods, which we classify as statistical methods.

  12. Evaluation of the BioVapor Model

    EPA Science Inventory

    The BioVapor model addresses transport and biodegradation of petroleum vapors in the subsurface. This presentation describes basic background on the nature and scientific basis of environmental transport models. It then describes a series of parameter uncertainty runs of the Bi...

  13. Conceptual uncertainty in crystalline bedrock: Is simple evaluation the only practical approach?

    USGS Publications Warehouse

    Geier, J.; Voss, C.I.; Dverstorp, B.

    2002-01-01

    A simple evaluation can be used to characterize the capacity of crystalline bedrock to act as a barrier to release radionuclides from a nuclear waste repository. Physically plausible bounds on groundwater flow and an effective transport-resistance parameter are estimated based on fundamental principles and idealized models of pore geometry. Application to an intensively characterized site in Sweden shows that, due to high spatial variability and uncertainty regarding properties of transport paths, the uncertainty associated with the geological barrier is too high to allow meaningful discrimination between good and poor performance. Application of more complex (stochastic-continuum and discrete-fracture-network) models does not yield a significant improvement in the resolution of geological barrier performance. Comparison with seven other less intensively characterized crystalline study sites in Sweden leads to similar results, raising a question as to what extent the geological barrier function can be characterized by state-of-the art site investigation methods prior to repository construction. A simple evaluation provides a simple and robust practical approach for inclusion in performance assessment.

  14. Conceptual uncertainty in crystalline bedrock: Is simple evaluation the only practical approach?

    USGS Publications Warehouse

    Geier, J.; Voss, C.I.; Dverstorp, B.

    2002-01-01

    A simple evaluation can be used to characterise the capacity of crystalline bedrock to act as a barrier to releases of radionuclides from a nuclear waste repository. Physically plausible bounds on groundwater flow and an effective transport-resistance parameter are estimated based on fundamental principles and idealised models of pore geometry. Application to an intensively characterised site in Sweden shows that, due to high spatial variability and uncertainty regarding properties of transport paths, the uncertainty associated with the geological barrier is too high to allow meaningful discrimination between good and poor performance. Application of more complex (stochastic-continuum and discrete-fracture-network) models does not yield a significant improvement in the resolution of geologic-barrier performance. Comparison with seven other less intensively characterised crystalline study sites in Sweden leads to similar results, raising a question as to what extent the geological barrier function can be characterised by state-of-the art site investigation methods prior to repository construction. A simple evaluation provides a simple and robust practical approach for inclusion in performance assessment.

  15. Calibration of Predictor Models Using Multiple Validation Experiments

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2015-01-01

    This paper presents a framework for calibrating computational models using data from several, possibly dissimilar, validation experiments. The offset between model predictions and observations, which might be caused by measurement noise, model-form uncertainty, and numerical error, drives the process by which uncertainty in the model's parameters is characterized. The resulting description of uncertainty along with the computational model constitute a predictor model. Two types of predictor models are studied: Interval Predictor Models (IPMs) and Random Predictor Models (RPMs). IPMs use sets to characterize uncertainty, whereas RPMs use random vectors. The propagation of a set through a model makes the response an interval-valued function of the state, whereas the propagation of a random vector yields a random process. Optimization-based strategies for calculating both types of predictor models are proposed. Whereas the formulations used to calculate IPMs target solutions leading to the interval-valued function of minimal spread containing all observations, those for RPMs seek to maximize the models' ability to reproduce the distribution of observations. Regarding RPMs, we choose a structure for the random vector (i.e., the assignment of probability to points in the parameter space) solely dependent on the prediction error. As such, the probabilistic description of uncertainty is not a subjective assignment of belief, nor is it expected to asymptotically converge to a fixed value; instead, it reflects the model's ability to reproduce the experimental data. This framework enables evaluating the spread and distribution of the predicted response of target applications that depend on the same parameters beyond the validation domain.

  16. Uncertainty Considerations for Ballistic Limit Equations

    NASA Technical Reports Server (NTRS)

    Schonberg, W. P.; Evans, H. J.; Williamsen, J. E.; Boyer, R. L.; Nakayama, G. S.

    2005-01-01

    The overall risk for any spacecraft system is typically determined using a Probabilistic Risk Assessment (PRA). A PRA attempts to determine the overall risk associated with a particular mission by factoring in all known risks (and their corresponding uncertainties, if known) to the spacecraft during its mission. The threat to mission and human life posed by the micro-meteoroid & orbital debris (MMOD) environment is one of these risks. NASA uses the BUMPER II program to provide point estimate predictions of MMOD risk for the Space Shuttle and the International Space Station. However, BUMPER II does not provide uncertainty bounds or confidence intervals for its predictions. With so many uncertainties believed to be present in the models used within BUMPER II, providing uncertainty bounds with BUMPER II results would appear to be essential to properly evaluating its predictions of MMOD risk. The uncertainties in BUMPER II come primarily from three areas: damage prediction/ballistic limit equations, environment models, and failure criteria definitions. In order to quantify the overall uncertainty bounds on MMOD risk predictions, the uncertainties in these three areas must be identified. In this paper, possible approaches through which uncertainty bounds can be developed for the various damage prediction and ballistic limit equations encoded within the shuttle and station versions of BUMPER II are presented and discussed. We begin the paper with a review of the current approaches used by NASA to perform a PRA for the Space Shuttle and the International Space Station, followed by a review of the results of a recent sensitivity analysis performed by NASA using the shuttle version of the BUMPER II code. Following a discussion of the various equations that are encoded in BUMPER II, we propose several possible approaches for establishing uncertainty bounds for the equations within BUMPER II. 
We conclude with an evaluation of these approaches and present a recommendation regarding which of them would be the most appropriate to follow.
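One route to the uncertainty bounds discussed above is Monte Carlo propagation: sample the uncertain output of a ballistic limit equation (here, a critical debris diameter) and push each sample through an environment flux model to a penetration risk. The sketch below is purely illustrative; the power-law flux, the lognormal critical-diameter distribution, and the exposure value are hypothetical stand-ins, not the equations encoded in BUMPER II.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical exposure: surface area * mission duration (m^2 * years).
area_time = 200.0

def flux(d):
    """Toy cumulative debris flux (impacts / m^2 / yr) above diameter d (cm)."""
    return 1e-5 * d ** -2.5

# Uncertain critical diameter from a ballistic limit equation fit,
# modeled here as lognormal (median 0.3 cm, ~20% spread; assumed values).
d_crit = rng.lognormal(np.log(0.3), 0.2, 100_000)

# Expected penetrations and Poisson penetration risk per sample.
N = flux(d_crit) * area_time
risk = 1 - np.exp(-N)

lo, hi = np.percentile(risk, [5, 95])
print(f"penetration risk: 90% interval [{lo:.4f}, {hi:.4f}]")
```

The spread of the resulting risk distribution, rather than a single point estimate, is what an uncertainty-bounded PRA would report.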

  17. An 'Observational Large Ensemble' to compare observed and modeled temperature trend uncertainty due to internal variability.

    NASA Astrophysics Data System (ADS)

    Poppick, A. N.; McKinnon, K. A.; Dunn-Sigouin, E.; Deser, C.

    2017-12-01

    Initial condition climate model ensembles suggest that regional temperature trends can be highly variable on decadal timescales due to internal climate variability. Accounting for trend uncertainty due to internal variability is therefore necessary to contextualize recent observed temperature changes. However, while the variability of trends in a climate model ensemble can be evaluated directly (as the spread across ensemble members), internal variability simulated by a climate model may be inconsistent with observations. Observation-based methods for assessing the role of internal variability in trend uncertainty are therefore required. Here, we use a statistical resampling approach to assess trend uncertainty due to internal variability in historical 50-year (1966-2015) winter near-surface air temperature trends over North America. We compare this estimate of trend uncertainty to simulated trend variability in the NCAR CESM1 Large Ensemble (LENS), finding that uncertainty in wintertime temperature trends over North America due to internal variability is largely overestimated by CESM1, on average by 32%. Our observation-based resampling approach is combined with the forced signal from LENS to produce an 'Observational Large Ensemble' (OLENS). The members of OLENS indicate a range of spatially coherent fields of temperature trends resulting from different sequences of internal variability consistent with observations. The smaller trend variability in OLENS suggests that uncertainty in the historical climate change signal in observations due to internal variability is less than suggested by LENS.
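The observation-based resampling idea can be sketched with a moving-block bootstrap: detrend the observed series, resample the residuals in blocks (to preserve short-term autocorrelation), re-add the fitted trend, and refit. The spread of refitted trends estimates trend uncertainty due to internal variability. The series, trend, and noise parameters below are synthetic stand-ins, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 50-year "observed" temperature series: a linear forced trend
# plus autocorrelated internal variability (AR(1) noise; assumed values).
n_years = 50
forced_trend = 0.02  # deg C / year (hypothetical)
noise = np.zeros(n_years)
for t in range(1, n_years):
    noise[t] = 0.6 * noise[t - 1] + rng.normal(0, 0.5)
obs = forced_trend * np.arange(n_years) + noise

def block_bootstrap_trends(series, block_len=5, n_resamples=1000, rng=rng):
    """Resample residuals in blocks, re-add the fitted trend, refit slopes."""
    years = np.arange(len(series))
    slope, intercept = np.polyfit(years, series, 1)
    residuals = series - (slope * years + intercept)
    n_blocks = int(np.ceil(len(series) / block_len))
    trends = np.empty(n_resamples)
    for i in range(n_resamples):
        starts = rng.integers(0, len(series) - block_len + 1, n_blocks)
        resampled = np.concatenate(
            [residuals[s:s + block_len] for s in starts])[:len(series)]
        synthetic = slope * years + intercept + resampled
        trends[i] = np.polyfit(years, synthetic, 1)[0]
    return trends

trends = block_bootstrap_trends(obs)
print(f"trend spread (1 s.d.): {trends.std():.4f} deg C/yr")
```

Comparing this observation-derived spread to the spread of trends across ensemble members is the kind of check that revealed the CESM1 overestimate.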

  18. Advanced Small Modular Reactor Economics Model Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, Thomas J.

    2014-10-01

    The US Department of Energy Office of Nuclear Energy’s Advanced Small Modular Reactor (SMR) research and development activities focus on four key areas: developing assessment methods for evaluating advanced SMR technologies and characteristics; developing and testing materials, fuels, and fabrication techniques; resolving key regulatory issues identified by the US Nuclear Regulatory Commission and industry; and developing advanced instrumentation and controls and human-machine interfaces. This report focuses on the development of assessment methods to evaluate advanced SMR technologies and characteristics. Specifically, it describes the expansion and application of the economic modeling effort at Oak Ridge National Laboratory. Analysis of the current modeling methods shows that one of the primary concerns for the modeling effort is the handling of uncertainty in cost estimates. Monte Carlo–based methods are commonly used to handle uncertainty, especially when implemented by a stand-alone script within a program such as Python or MATLAB. However, a script-based model requires each potential user to have access to a compiler and an executable capable of handling the script. Making the model accessible to multiple independent analysts is best accomplished by implementing the model in a common computing tool such as Microsoft Excel. Excel is readily available and accessible to most system analysts, but it is not designed for straightforward implementation of a Monte Carlo–based method. Using a Monte Carlo algorithm requires in-spreadsheet scripting and statistical analyses or the use of add-ons such as Crystal Ball. An alternative method uses propagation-of-error calculations in the existing Excel-based system to estimate system cost uncertainty. This method has the advantage of using Microsoft Excel as is, but it requires the use of simplifying assumptions. These assumptions do not necessarily call the analytical results into question. 
In fact, the analysis shows that the propagation-of-error method introduces essentially negligible error, especially when compared to the uncertainty associated with some of the estimates themselves. The results of these uncertainty analyses generally quantify and identify the sources of uncertainty in the overall cost estimation. The obvious generalization, that capital cost uncertainty is the main driver, can be shown to be accurate for the current state of reactor cost analysis. However, the detailed component-by-component analysis helps to demonstrate which components would benefit most from research and development to decrease their uncertainty, as well as which would benefit from research and development to decrease their absolute cost.
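The near-equivalence of propagation of error and Monte Carlo for summed cost components can be checked directly: for a sum of independent components, the variance of the total is the sum of the component variances. The component names and dollar figures below are hypothetical, not the report's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical SMR cost components: (mean, standard deviation), in $M.
components = {
    "capital":      (2000.0, 400.0),
    "O&M":          (600.0,   90.0),
    "fuel":         (300.0,   30.0),
    "decommission": (150.0,   45.0),
}
means = np.array([m for m, s in components.values()])
sds = np.array([s for m, s in components.values()])

# Propagation of error: variances of independent components add.
total_mean = means.sum()
total_sd_prop = np.sqrt((sds ** 2).sum())

# Monte Carlo check: sample each component independently and sum.
samples = rng.normal(means, sds, size=(100_000, len(means))).sum(axis=1)
total_sd_mc = samples.std()

print(f"propagation: {total_sd_prop:.1f}, Monte Carlo: {total_sd_mc:.1f}")
```

The two estimates agree to within sampling noise, which is the report's point: the analytic shortcut is adequate when the cost estimates themselves are far more uncertain.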

  19. SU-E-T-659: Quantitative Evaluation of Patient Setup Accuracy of Stereotactic Radiotherapy with the Frameless 6D-ExacTrac System Using Statistical Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keeling, V; Jin, H; Hossain, S

    2015-06-15

    Purpose: To evaluate patient setup accuracy and quantify individual and cumulative positioning uncertainties associated with different hardware and software components of stereotactic radiotherapy (SRS/SRT) with the frameless 6D-ExacTrac system. Methods: A statistical model was used to evaluate positioning uncertainties of the different components of SRS/SRT treatment with the BrainLAB 6D-ExacTrac system using the positioning shifts of 35 patients having cranial lesions (49 total lesions treated in 1, 3, or 5 fractions). All patients were immobilized with rigid head-and-neck masks, simulated with the BrainLAB localizer, and planned with the iPlan treatment planning system. Infrared imaging (IR) was used initially to set up patients. Then, stereoscopic x-ray images (XC) were acquired and registered to corresponding digitally reconstructed radiographs using bony-anatomy matching to calculate 6D translational and rotational shifts. When the shifts were within tolerance (0.7 mm and 1°), treatment was initiated. Otherwise corrections were applied and additional x-rays were acquired (XV) to verify that patient position was within tolerance. Results: The uncertainties from the mask, localizer, IR frame, x-ray imaging, and MV and kV isocentricity were quantified individually. Mask uncertainty (Translational: Lateral, Longitudinal, Vertical; Rotational: Pitch, Roll, Yaw) was the largest and varied with patients in the range (−1.05−1.50 mm, −5.06−3.57 mm, −5.51−3.49 mm; −1.40−2.40°, −1.24−1.74°, and −2.43−1.90°), obtained from the mean of XC shifts for each patient. Setup uncertainty in IR positioning (0.88, 2.12, 1.40 mm, and 0.64, 0.83, 0.96°) was extracted from the standard deviation of XC. Systematic uncertainties of the localizer (−0.03, −0.01, 0.03 mm, and −0.03, 0.00, −0.01°) and frame (0.18, 0.25, −1.27 mm, and −0.32, 0.18, 0.47°) were extracted from the means of all XV setups and the mean of all XC distributions, respectively. 
Uncertainties in isocentricity were (0.27, 0.24, 0.34 mm) for the MV radiotherapy machine and (0.15, −0.4, 0.21 mm) for the kV imager. Conclusion: A statistical model was developed to evaluate the individual and cumulative systematic and random uncertainties induced by the different hardware and software components of the 6D-ExacTrac system. The immobilization mask was associated with the largest positioning uncertainty.
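The separation into systematic and random components used above follows a standard pattern: the spread of per-patient mean shifts estimates the population systematic uncertainty, and the root-mean-square of per-patient standard deviations estimates the random uncertainty. A minimal sketch on synthetic shift data (all values hypothetical, one axis only):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical per-fraction lateral shifts (mm) for 35 patients, 5
# fractions each, as measured by x-ray verification (XC in the abstract).
shifts = {f"patient_{i}": rng.normal(0.2, 0.9, size=5) for i in range(35)}

patient_means = np.array([s.mean() for s in shifts.values()])

# Population systematic uncertainty: spread of per-patient mean shifts.
systematic = patient_means.std(ddof=1)
# Population random uncertainty: RMS of per-patient standard deviations.
random_sd = np.sqrt(np.mean([s.std(ddof=1) ** 2 for s in shifts.values()]))

print(f"systematic = {systematic:.2f} mm, random = {random_sd:.2f} mm")
```

Repeating this per axis (translational and rotational) yields component tables like those reported in the abstract.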

  20. Evaluation de l'impact du vent et des manoeuvres hydrauliques sur le calcul des apports naturels par bilan hydrique pour un reservoir hydroelectrique

    NASA Astrophysics Data System (ADS)

    Roy, Mathieu

    Natural inflow is a key quantity for a water resource manager. In fact, Hydro-Quebec uses historical natural inflow data to perform a daily prediction of the amount of water that will be received in each of its hydroelectric reservoirs. This prediction allows the establishment of reservoir operating rules in order to optimize hydropower without compromising the safety of hydraulic structures. To obtain an accurate prediction, it follows that the system's input needs to be very well known. However, it can be very difficult to accurately measure the natural supply of a set of regulated reservoirs. Therefore, Hydro-Quebec uses an indirect method of calculation. This method consists of evaluating the reservoir's inflow using the water balance equation. Yet, this equation is not immune to errors and uncertainties. Water level measurement is an important input to the water balance equation. However, several sources of uncertainty, including the effects of wind and hydraulic maneuvers, can affect the readings of limnimetric gages. Fluctuations in water level caused by these effects carry over into the water balance equation. Consequently, the natural inflow signal may become noisy and affected by external errors. The main objective of this report is to evaluate the uncertainty caused by the effects of wind and hydraulic maneuvers on the water balance equation. To this end, hydrodynamic models of the Outardes 4 and Gouin reservoirs were prepared. According to the literature review, wind effects can be studied either by an unsteady state approach or by a steady state approach. Unsteady state simulations of wind effects on reservoirs Gouin and Outardes 4 were performed by hydrodynamic modelling. Consideration of an unsteady state implies that the wind conditions vary throughout the simulation. This feature allows taking into account the temporal effect of wind duration. 
In addition, it allows the consideration of inertial forces such as seiches, which are caused by wind conditions that can vary abruptly. Once the models were calibrated, unsteady state simulations were conducted in a closed system where unsteady observed winds were the only forces included. From the simulated water levels obtained at each gage, the water balance equation was calculated to determine the daily uncertainty of natural inflow in unsteady conditions. At Outardes 4, a maximum uncertainty of 20 m3/s was estimated during the month of October 2010. At the Gouin reservoir, a maximum uncertainty of 340 m3/s was estimated during the month of July 2012. Steady state modelling is another approach to evaluating wind effect uncertainty in the water balance equation. This type of approach assumes that the water level is instantly tilted under the influence of wind. Hence, the temporal effect of wind duration and seiches cannot be taken into account. However, the advantage of steady state modelling is that it is better suited than unsteady state modelling to evaluating wind uncertainty in real time. Two steady state modelling methods were tested to estimate the water level difference between gages as a function of wind characteristics: hydrodynamic modelling and non-parametric regression. It has been found that non-parametric models are more efficient at estimating water level differences between gages. However, the use of the hydrodynamic model demonstrated that, to study wind uncertainty in the water balance equation, it is preferable to assess wind responses individually at each gage instead of using water level differences. Finally, a method for combining water level gage observations has been developed. It reduces the impacts of wind and hydraulic maneuvers on the water balance equation. This method, which is applicable in real time, consists of assigning a variable weight to each limnimetric gage. 
In other words, the weights automatically adjust in order to minimize the steady state modeled wind responses. The estimation of hydraulic maneuvers has also been included in the gage weight adjustment. It has been found that this new combination method allows the correction of a noisy natural inflow signal under wind and hydraulic maneuver effects. However, some fluctuations persist, which reflects the complexity of correcting these effects in a real-time daily water balance equation. (Abstract shortened by UMI.)
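The variable-weight gage combination can be illustrated with a simple variance-minimizing scheme: if steady state modelling supplies each gage's wind response, inverse-variance weights (assuming independent responses) reduce the wind signal in the combined level. The gage count and response magnitudes below are hypothetical, not the thesis's calibrated values.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical modeled wind responses (m) at three limnimetric gages over
# many wind scenarios; each column is one gage.
wind_response = rng.normal(0, [0.05, 0.12, 0.08], size=(1000, 3))

# Inverse-variance weights minimize the variance of the weighted-average
# wind response when the gage responses are independent.
var = wind_response.var(axis=0)
weights = (1 / var) / (1 / var).sum()

combined = wind_response @ weights
print(f"weights: {np.round(weights, 3)}, combined s.d.: {combined.std():.4f}")
```

The combined series has a smaller wind-induced spread than even the best single gage, which is the effect the weighting method exploits.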

  1. Impact of state updating and multi-parametric ensemble for streamflow hindcasting in European river basins

    NASA Astrophysics Data System (ADS)

    Noh, S. J.; Rakovec, O.; Kumar, R.; Samaniego, L. E.

    2015-12-01

    Accurate and reliable streamflow prediction is essential to mitigate the social and economic damage caused by water-related disasters such as floods and droughts. Sequential data assimilation (DA) may facilitate improved streamflow prediction by using real-time observations to correct internal model states. In conventional DA methods such as state updating, parametric uncertainty is often ignored, mainly due to practical limitations of methodology for specifying modeling uncertainty with limited ensemble members. However, if parametric uncertainty related to routing and runoff components is not incorporated properly, the predictive uncertainty of the model ensemble may be insufficient to capture the dynamics of observations, which may degrade predictability. Recently, a multi-scale parameter regionalization (MPR) method was proposed to make hydrologic predictions at different scales using the same set of model parameters without losing much model performance. The MPR method incorporated within the mesoscale hydrologic model (mHM, http://www.ufz.de/mhm) can effectively represent and control the uncertainty of high-dimensional parameters in a distributed model using global parameters. In this study, we evaluate the impacts of streamflow data assimilation over European river basins. In particular, a multi-parametric ensemble approach is tested to account for the effects of parametric uncertainty in DA. Because augmentation of parameters is not required within an assimilation window, the approach can be more stable with limited ensemble members and has potential for operational use. To account for the response times and non-Gaussian characteristics of internal hydrologic processes, lagged particle filtering is utilized. The presentation will focus on the gains and limitations of streamflow data assimilation and the multi-parametric ensemble method over large-scale basins.
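A multi-parametric particle filter can be sketched with a toy bucket model: each particle carries its own state and its own parameter, so parametric uncertainty enters the predictive spread without augmenting parameters inside the assimilation window. This is a minimal sequential-importance-resampling sketch, not mHM or the study's lagged filter; the model, noise levels, and parameter ranges are all assumed.

```python
import numpy as np

rng = np.random.default_rng(4)

def simple_bucket(state, rain, k):
    """Toy runoff model: outflow is k * storage after adding rainfall."""
    state = state + rain
    flow = k * state
    return state - flow, flow

n_particles, n_steps = 500, 30
rain = rng.gamma(2.0, 2.0, n_steps)

# Each particle has its own state AND its own parameter k.
states = rng.uniform(5, 15, n_particles)
ks = rng.uniform(0.1, 0.5, n_particles)

true_state, true_k = 10.0, 0.3
for t in range(n_steps):
    true_state, obs_flow = simple_bucket(true_state, rain[t], true_k)
    obs = obs_flow + rng.normal(0, 0.3)  # noisy streamflow observation
    states, flows = simple_bucket(states, rain[t], ks)
    # Importance weights from a Gaussian observation likelihood.
    w = np.exp(-0.5 * ((flows - obs) / 0.3) ** 2)
    w /= w.sum()
    idx = rng.choice(n_particles, n_particles, p=w)  # resampling step
    states, ks = states[idx], ks[idx]

print(f"posterior k: {ks.mean():.3f} +/- {ks.std():.3f} (truth {true_k})")
```

After assimilating the observations, the particle cloud concentrates around parameter values consistent with the observed flows, illustrating how the multi-parametric ensemble absorbs parametric uncertainty.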

  2. Projecting Range Limits with Coupled Thermal Tolerance - Climate Change Models: An Example Based on Gray Snapper (Lutjanus griseus) along the U.S. East Coast

    PubMed Central

    Hare, Jonathan A.; Wuenschel, Mark J.; Kimball, Matthew E.

    2012-01-01

    We couple a species range limit hypothesis with the output of an ensemble of general circulation models to project the poleward range limit of gray snapper. Using laboratory-derived thermal limits and statistical downscaling from IPCC AR4 general circulation models, we project that gray snapper will shift northwards; the magnitude of this shift is dependent on the magnitude of climate change. We also evaluate the uncertainty in our projection and find that statistical uncertainty associated with the experimentally-derived thermal limits is the largest contributor (∼ 65%) to overall quantified uncertainty. This finding argues for more experimental work aimed at understanding and parameterizing the effects of climate change and variability on marine species. PMID:23284974

  3. Accounting for uncertainty in model-based prevalence estimation: paratuberculosis control in dairy herds.

    PubMed

    Davidson, Ross S; McKendrick, Iain J; Wood, Joanna C; Marion, Glenn; Greig, Alistair; Stevenson, Karen; Sharp, Michael; Hutchings, Michael R

    2012-09-10

    A common approach to the application of epidemiological models is to determine a single (point estimate) parameterisation using the information available in the literature. However, in many cases there is considerable uncertainty about parameter values, reflecting both the incomplete nature of current knowledge and natural variation, for example between farms. Furthermore model outcomes may be highly sensitive to different parameter values. Paratuberculosis is an infection for which many of the key parameter values are poorly understood and highly variable, and for such infections there is a need to develop and apply statistical techniques which make maximal use of available data. A technique based on Latin hypercube sampling combined with a novel reweighting method was developed which enables parameter uncertainty and variability to be incorporated into a model-based framework for estimation of prevalence. The method was evaluated by applying it to a simulation of paratuberculosis in dairy herds which combines a continuous time stochastic algorithm with model features such as within herd variability in disease development and shedding, which have not been previously explored in paratuberculosis models. Generated sample parameter combinations were assigned a weight, determined by quantifying the model's resultant ability to reproduce prevalence data. Once these weights are generated the model can be used to evaluate other scenarios such as control options. To illustrate the utility of this approach these reweighted model outputs were used to compare standard test and cull control strategies both individually and in combination with simple husbandry practices that aim to reduce infection rates. The technique developed has been shown to be applicable to a complex model incorporating realistic control options. 
For models where parameters are not well known or subject to significant variability, the reweighting scheme allowed estimated distributions of parameter values to be combined with additional sources of information, such as that available from prevalence distributions, resulting in outputs which implicitly handle variation and uncertainty. This methodology allows for more robust predictions from modelling approaches by allowing for parameter uncertainty and combining different sources of information, and is thus expected to be useful in application to a large number of disease systems.
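The Latin hypercube sampling plus reweighting scheme described above can be sketched as follows: stratified parameter samples are weighted by how well the model's output reproduces observed prevalence, and the weights then carry that information into any downstream scenario evaluation. The toy SIS prevalence model and all parameter ranges below are hypothetical, far simpler than the stochastic herd model in the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

def latin_hypercube(n, d, rng):
    """One stratified sample per row in each of d dimensions."""
    u = (np.arange(n)[:, None] + rng.random((n, d))) / n
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])
    return u

# Two uncertain parameters scaled to plausible ranges (hypothetical):
# transmission rate beta and recovery rate gamma.
n = 2000
u = latin_hypercube(n, 2, rng)
beta = 0.1 + 0.9 * u[:, 0]
gamma = 0.05 + 0.45 * u[:, 1]

# Toy model output: endemic prevalence of an SIS model, 1 - gamma/beta.
prev = np.clip(1 - gamma / beta, 0, 1)

# Reweight parameter sets by their ability to reproduce an observed
# prevalence of 0.3 (Gaussian likelihood, s.d. 0.05 assumed).
w = np.exp(-0.5 * ((prev - 0.3) / 0.05) ** 2)
w /= w.sum()

post_mean = (w * prev).sum()
print(f"reweighted prevalence estimate: {post_mean:.3f}")
```

The same weights can then be reused to compare control scenarios, exactly as the reweighted outputs are used to compare test-and-cull strategies in the paper.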

  4. Dynamic Modelling under Uncertainty: The Case of Trypanosoma brucei Energy Metabolism

    PubMed Central

    Achcar, Fiona; Kerkhoven, Eduard J.; Bakker, Barbara M.; Barrett, Michael P.; Breitling, Rainer

    2012-01-01

    Kinetic models of metabolism require detailed knowledge of kinetic parameters. However, due to measurement errors or lack of data this knowledge is often uncertain. The model of glycolysis in the parasitic protozoan Trypanosoma brucei is a particularly well analysed example of a quantitative metabolic model, but so far it has been studied with a fixed set of parameters only. Here we evaluate the effect of parameter uncertainty. In order to define probability distributions for each parameter, information about the experimental sources and confidence intervals for all parameters were collected. We created a wiki-based website dedicated to the detailed documentation of this information: the SilicoTryp wiki (http://silicotryp.ibls.gla.ac.uk/wiki/Glycolysis). Using information collected in the wiki, we then assigned probability distributions to all parameters of the model. This allowed us to sample sets of alternative models, accurately representing our degree of uncertainty. Some properties of the model, such as the repartition of the glycolytic flux between the glycerol and pyruvate producing branches, are robust to these uncertainties. However, our analysis also allowed us to identify fragilities of the model leading to the accumulation of 3-phosphoglycerate and/or pyruvate. The analysis of the control coefficients revealed the importance of taking into account the uncertainties about the parameters, as the ranking of the reactions can be greatly affected. This work will now form the basis for a comprehensive Bayesian analysis and extension of the model considering alternative topologies. PMID:22379410
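Sampling parameter sets from assigned probability distributions and checking which model properties remain robust can be sketched with a toy branched pathway (not the T. brucei glycolysis model itself; all kinetic values are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy branched pathway: an input flux splits between two branches with
# Michaelis-Menten rates V1*S/(K1+S) and V2*S/(K2+S). Parameters are
# uncertain, described by lognormal distributions (assumed spreads).
n = 10_000
V1 = rng.lognormal(np.log(10.0), 0.2, n)
K1 = rng.lognormal(np.log(0.5), 0.3, n)
V2 = rng.lognormal(np.log(4.0), 0.2, n)
K2 = rng.lognormal(np.log(2.0), 0.3, n)
S = 1.0  # fixed substrate concentration

r1 = V1 * S / (K1 + S)
r2 = V2 * S / (K2 + S)
fraction_branch1 = r1 / (r1 + r2)

lo, hi = np.percentile(fraction_branch1, [2.5, 97.5])
print(f"branch-1 flux fraction: 95% interval [{lo:.2f}, {hi:.2f}]")
```

If the interval stays narrow despite the parameter spread, the flux repartition is robust to the uncertainties, which is the kind of conclusion the paper draws for the glycerol/pyruvate branch split.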

  5. Evaluating experimental design for soil-plant model selection using a Bootstrap Filter and Bayesian model averaging

    NASA Astrophysics Data System (ADS)

    Wöhling, T.; Schöniger, A.; Geiges, A.; Nowak, W.; Gayler, S.

    2013-12-01

    The objective selection of appropriate models for realistic simulations of coupled soil-plant processes is a challenging task since the processes are complex, not fully understood at larger scales, and highly non-linear. Also, comprehensive data sets are scarce, and measurements are uncertain. In the past decades, a variety of different models have been developed that exhibit a wide range of complexity regarding their approximation of processes in the coupled model compartments. We present a method for evaluating experimental design for maximum confidence in the model selection task. The method considers uncertainty in parameters, measurements and model structures. Advancing the ideas behind Bayesian Model Averaging (BMA), we analyze the changes in posterior model weights and posterior model choice uncertainty when more data are made available. This allows assessing the power of different data types, data densities and data locations in identifying the best model structure from among a suite of plausible models. The models considered in this study are the crop models CERES, SUCROS, GECROS and SPASS, which are coupled to identical routines for simulating soil processes within the modelling framework Expert-N. The four models considerably differ in the degree of detail at which crop growth and root water uptake are represented. Monte-Carlo simulations were conducted for each of these models considering their uncertainty in soil hydraulic properties and selected crop model parameters. Using a Bootstrap Filter (BF), the models were then conditioned on field measurements of soil moisture, matric potential, leaf-area index, and evapotranspiration rates (from eddy-covariance measurements) during a vegetation period of winter wheat at a field site at the Swabian Alb in Southwestern Germany. Following our new method, we derived model weights when using all data or different subsets thereof. 
We discuss to what degree the posterior mean outperforms the prior mean and all individual posterior models, how informative the data types were for reducing the prediction uncertainty of evapotranspiration and deep drainage, and how well the model structure can be identified based on the different data types and subsets. We further analyze the impact of measurement uncertainty and systematic model errors on the effective sample size of the BF and the resulting model weights.
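The core of the model-selection analysis, posterior model weights that sharpen as more data are conditioned on, can be sketched with Bayesian model averaging over toy "crop models". The three curves, the observation noise, and the fact that the data are generated from model B are all assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Three hypothetical models predicting soil moisture over time; the
# synthetic "truth" is model B plus observation noise.
t = np.linspace(0, 1, 40)
predictions = {
    "A": 0.30 - 0.10 * t,
    "B": 0.32 - 0.15 * t,
    "C": 0.28 - 0.05 * t ** 2,
}
obs_sd = 0.02
obs = predictions["B"] + rng.normal(0, obs_sd, t.size)

def bma_weights(n_obs):
    """Posterior model weights from equal priors and Gaussian likelihoods,
    using only the first n_obs measurements."""
    log_lik = {
        name: -0.5 * np.sum(((obs[:n_obs] - p[:n_obs]) / obs_sd) ** 2)
        for name, p in predictions.items()
    }
    m = max(log_lik.values())  # subtract max for numerical stability
    w = {name: np.exp(v - m) for name, v in log_lik.items()}
    total = sum(w.values())
    return {name: v / total for name, v in w.items()}

for n in (5, 20, 40):
    print(n, {k: round(v, 3) for k, v in bma_weights(n).items()})
```

Tracking how the weights evolve as n grows mirrors the paper's question of how much data, and which data types, are needed before the best model structure emerges.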

  6. Selection of Polynomial Chaos Bases via Bayesian Model Uncertainty Methods with Applications to Sparse Approximation of PDEs with Stochastic Inputs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karagiannis, Georgios; Lin, Guang

    2014-02-15

    Generalized polynomial chaos (gPC) expansions allow the representation of the solution of a stochastic system as a series of polynomial terms. The number of gPC terms increases dramatically with the dimension of the random input variables. When the number of gPC terms is larger than that of the available samples, a scenario that often occurs when evaluations of the system are expensive, the evaluation of the gPC expansion can be inaccurate due to over-fitting. We propose a fully Bayesian approach that allows for global recovery of the stochastic solution, in both the spatial and random domains, by coupling Bayesian model uncertainty and regularization regression methods. It allows the evaluation of the PC coefficients on a grid of spatial points via (1) Bayesian model averaging or (2) the median probability model, and their construction as functions on the spatial domain via spline interpolation. The former accounts for model uncertainty and provides Bayes-optimal predictions, while the latter additionally provides a sparse representation of the solution by evaluating the expansion on a subset of dominating gPC bases. Moreover, the method quantifies the importance of the gPC bases through inclusion probabilities. We design an MCMC sampler that evaluates all the unknown quantities without the need for ad-hoc techniques. The proposed method is suitable for, but not restricted to, problems whose stochastic solution is sparse at the stochastic level with respect to the gPC bases while the deterministic solver involved is expensive. We demonstrate the good performance of the proposed method, and make comparisons with others, on elliptic stochastic partial differential equations with 1D, 14D, and 40D random spaces.
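The sparse-recovery idea can be illustrated in 1D: fit a 15-term Legendre gPC expansion to noisy samples of a target that actually depends on only two bases, and identify the dominating bases from the coefficient magnitudes. The ridge regularizer below is a simple stand-in for the paper's Bayesian regularization regression, and all sizes and noise levels are assumed.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(8)

def design(xi, order):
    # Column j holds the Legendre polynomial P_j evaluated at the samples.
    return np.column_stack(
        [legendre.legval(xi, np.eye(order + 1)[j]) for j in range(order + 1)])

# Sparse truth: only bases 1 and 3 carry signal (assumed coefficients).
truth_coeffs = np.zeros(15)
truth_coeffs[1] = 1.0
truth_coeffs[3] = 0.5

xi = rng.uniform(-1, 1, 40)
y = design(xi, 14) @ truth_coeffs + rng.normal(0, 0.01, xi.size)

# Ridge-regularized least squares over all 15 candidate bases.
Phi = design(xi, 14)
lam = 1e-3
coeffs = np.linalg.solve(Phi.T @ Phi + lam * np.eye(15), Phi.T @ y)

dominant = np.argsort(np.abs(coeffs))[::-1][:2]
print(f"dominant gPC bases: {sorted(dominant.tolist())}")
```

In the Bayesian version, the ranking by coefficient magnitude is replaced by inclusion probabilities, but the goal of isolating the dominating bases is the same.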

  7. Assessing uncertain human exposure to ambient air pollution using environmental models in the Web

    NASA Astrophysics Data System (ADS)

    Gerharz, L. E.; Pebesma, E.; Denby, B.

    2012-04-01

    Ambient air quality can have a significant impact on human health by causing respiratory and cardio-vascular diseases. Moreover, the pollutant concentration a person is exposed to can differ considerably between individuals depending on their daily routine and movement patterns. Using a straightforward approach, this exposure can be estimated by integration of individual space-time paths and spatio-temporally resolved ambient air quality data. To allow a realistic exposure assessment, it is furthermore important to consider uncertainties due to input and model errors. In this work, we present a generic, web-based approach for estimating individual exposure by integration of uncertain position and air quality information, implemented as a web service. Following the Model Web initiative, which envisions an infrastructure for deploying, executing and chaining environmental models as services, existing models and data sources, e.g. for air quality, can be used to assess exposure. The service therefore needs to deal with different formats, resolutions and uncertainty representations provided by model or data services. Potential mismatches can be accounted for by transformation of uncertainties and (dis-)aggregation of data under consideration of changes in the uncertainties, using components developed in the UncertWeb project. In UncertWeb, the Model Web vision is extended to an uncertainty-enabled Model Web, where services can process and communicate uncertainties in the data and models. The propagation of uncertainty to the exposure results is quantified using Monte Carlo simulation by combining different realisations of positions and ambient concentrations. Two case studies were used to evaluate the developed exposure assessment service. In the first study, GPS tracks with a positional uncertainty of a few meters, collected in the urban area of Münster, Germany, were used to assess exposure to PM10 (particulate matter smaller than 10 µm). 
Air quality data was provided by an uncertainty-enabled air quality model system which provided realisations of concentrations per hour on a 250 m x 250 m resolved grid over Münster. The second case study uses modelled human trajectories in Rotterdam, The Netherlands. The trajectories were provided as realisations in 15 min resolution per 4 digit postal code from an activity model. Air quality estimates were provided for different pollutants as ensembles by a coupled meteorology and air quality model system on a 1 km x 1 km grid with hourly resolution. Both case studies show the successful application of the service to different resolutions and uncertainty representations.
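The Monte Carlo propagation step, pairing realisations of positions with realisations of concentration fields and summarizing the resulting exposure distribution, can be sketched on a toy grid. The grid size, trajectory model, and gamma-distributed concentrations below are hypothetical, not the Münster or Rotterdam services.

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical setup: 100 realisations of a person's hourly grid-cell
# positions (24 h) and 100 realisations of hourly PM10 fields on a
# 10 x 10 grid, mimicking the uncertainty-enabled services.
n_real, n_hours, nx = 100, 24, 10
conc_fields = rng.gamma(4.0, 5.0, size=(n_real, n_hours, nx, nx))  # µg/m³

base_path = np.stack([rng.integers(3, 7, n_hours), rng.integers(3, 7, n_hours)])
positions = np.clip(
    base_path[None] + rng.integers(-1, 2, size=(n_real, 2, n_hours)), 0, nx - 1)

# Monte Carlo exposure: pair each positional realisation with one
# concentration realisation and average the concentrations encountered.
exposure = np.array([
    conc_fields[r, np.arange(n_hours), positions[r, 0], positions[r, 1]].mean()
    for r in range(n_real)
])

lo, hi = np.percentile(exposure, [2.5, 97.5])
print(f"daily mean PM10 exposure: [{lo:.1f}, {hi:.1f}] µg/m³")
```

The width of the resulting interval quantifies how positional and concentration uncertainty jointly propagate into the exposure estimate.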

  8. Evaluation of the tau-omega model for passive microwave soil moisture retrieval using SMAPEx data sets

    USDA-ARS?s Scientific Manuscript database

    The parameters used for passive soil moisture retrieval algorithms reported in the literature encompass a wide range, leading to a large uncertainty in the applicability of those values. This paper presents an evaluation of the proposed parameterizations of the tau-omega model from 1) SMAP ATBD for ...

  9. Model Evaluation Report for Corrective Action Unit 98: Frenchman Flat, Nevada National Security Site, Nye County, Nevada, Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruskauff, Greg; Marutzky, Sam

    Model evaluation focused solely on the PIN STRIPE and MILK SHAKE underground nuclear tests’ contaminant boundaries (CBs) because they had the largest extent, uncertainty, and potential consequences. The CAMBRIC radionuclide migration experiment also had a relatively large CB, but because it was constrained by transport data (notably Well UE-5n), there was little uncertainty, and radioactive decay reduced concentrations before much migration could occur. Each evaluation target and the associated data-collection activity were assessed in turn to determine whether the new data support, or demonstrate conservatism of, the CB forecasts. The modeling team—in this case, the same team that developed the Frenchman Flat geologic, source term, and groundwater flow and transport models—analyzed the new data and presented the results to a PER committee. Existing site understanding and its representation in numerical groundwater flow and transport models was evaluated in light of the new data and the ability to proceed to the CR stage of long-term monitoring and institutional control.

  10. Fast and optimized methodology to generate road traffic emission inventories and their uncertainties

    NASA Astrophysics Data System (ADS)

    Blond, N.; Ho, B. Q.; Clappier, A.

    2012-04-01

    Road traffic emissions are one of the main sources of air pollution in cities. They are also one of the main sources of uncertainty in the air quality numerical models used to forecast and define abatement strategies. Until now, the available models for generating road traffic emissions have always required substantial effort, money, and time. This inhibits decisions to preserve air quality, especially in developing countries where road traffic emissions are changing very fast. In this research, we developed a new model designed to produce road traffic emission inventories quickly. This model, called EMISENS, combines the well-known top-down and bottom-up approaches and forces them to be coherent. A Monte Carlo methodology is included for computing emission uncertainties and the uncertainty contribution of each input parameter. This paper presents the EMISENS model and a demonstration of its capabilities through an application over the Strasbourg region (Alsace), France. The same input data collected for the Circul'air model (a bottom-up approach), which the Alsatian air quality agency, ASPA, has applied for many years to forecast and study air pollution, are used to evaluate the impact of several simplifications that a user could apply. These experiments make it possible to review older methodologies and evaluate EMISENS results when few input data are available to produce emission inventories, as in developing countries, and assumptions need to be made. We show that the same average fraction of mileage driven with a cold engine can be used for all the cells of the study domain and that one emission factor can replace both cold and hot emission factors.
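The Monte Carlo treatment of emission uncertainty can be sketched as sampling uncertain emission factors and activity data, summing to a total inventory, and attributing output variance to inputs. The vehicle categories, factors, and uncertainty levels below are hypothetical, not EMISENS inputs.

```python
import numpy as np

rng = np.random.default_rng(10)

# Hypothetical inputs per vehicle category: hot emission factor (g/km),
# mileage (10^6 km), and relative 1-sigma uncertainties.
ef_mean = np.array([0.25, 0.60, 1.10])   # passenger, light duty, heavy duty
ef_rel_sd = np.array([0.20, 0.30, 0.25])
km_mean = np.array([500.0, 120.0, 40.0])
km_rel_sd = 0.10

n = 50_000
ef = rng.lognormal(np.log(ef_mean), ef_rel_sd, size=(n, 3))
km = rng.lognormal(np.log(km_mean), km_rel_sd, size=(n, 3))
total = (ef * km).sum(axis=1)  # tonnes, since g/km * 10^6 km = 10^6 g

print(f"total emissions: {np.median(total):.0f} t "
      f"(95% CI {np.percentile(total, 2.5):.0f}-{np.percentile(total, 97.5):.0f} t)")

# Crude first-order attribution of output variance to each input,
# via squared correlation with the total.
for name, x in [("EF passenger", ef[:, 0]), ("km passenger", km[:, 0])]:
    print(name, f"r^2 = {np.corrcoef(x, total)[0, 1] ** 2:.2f}")
```

This per-input attribution is the same kind of "uncertainty rate due to each input parameter" that the EMISENS Monte Carlo module reports.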

  11. On solar geoengineering and climate uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacMartin, Douglas; Kravitz, Benjamin S.; Rasch, Philip J.

    2015-09-03

    Uncertainty in the climate system response has been raised as a concern regarding solar geoengineering. Here we show that model projections of regional climate change outcomes may have greater agreement under solar geoengineering than with CO2 alone. We explore the effects of geoengineering on one source of climate system uncertainty by evaluating the inter-model spread across 12 climate models participating in the Geoengineering Model Intercomparison Project (GeoMIP). The model spread in regional temperature and precipitation changes is reduced with CO2 and a solar reduction, in comparison to the case with increased CO2 alone. That is, the inter-model spread in predictions of climate change and the model spread in the response to solar geoengineering are not additive but rather partially cancel. Furthermore, differences in efficacy explain most of the differences between models in their temperature response to an increase in CO2 that is offset by a solar reduction. These conclusions are important for clarifying geoengineering risks.

  12. Development of Flight-Test Performance Estimation Techniques for Small Unmanned Aerial Systems

    NASA Astrophysics Data System (ADS)

    McCrink, Matthew Henry

    This dissertation provides a flight-testing framework for assessing the performance of fixed-wing, small-scale unmanned aerial systems (sUAS) by leveraging sub-system models of components unique to these vehicles. The development of the sub-system models, and their links to broader impacts on sUAS performance, is the key contribution of this work. The sub-system modeling and analysis focuses on the vehicle's propulsion, navigation and guidance, and airframe components. Quantification of the uncertainty in the vehicle's power available and control states is essential for assessing the validity of both the methods and results obtained from flight-tests. Therefore, detailed propulsion and navigation system analyses are presented to validate the flight testing methodology. Propulsion system analysis required the development of an analytic model of the propeller in order to predict the power available over a range of flight conditions. The model is based on the blade element momentum (BEM) method. Additional corrections are added to the basic model in order to capture the Reynolds-dependent scale effects unique to sUAS. The model was experimentally validated using a ground based testing apparatus. The BEM predictions and experimental analysis allow for a parameterized model relating the electrical power, measurable during flight, to the power available required for vehicle performance analysis. Navigation system details are presented with a specific focus on the sensors used for state estimation, and the resulting uncertainty in vehicle state. Uncertainty quantification is provided by detailed calibration techniques validated using quasi-static and hardware-in-the-loop (HIL) ground based testing. The HIL methods introduced use a soft real-time flight simulator to provide inertial quality data for assessing overall system performance. 
Using this tool, the uncertainty in vehicle state estimation is presented for a range of sensors and vehicle operational environments. The propulsion and navigation system models are used to evaluate flight-testing methods for assessing fixed-wing sUAS performance. A brief airframe analysis is presented to provide a foundation for assessing the efficacy of the flight-test methods. The flight-testing presented in this work is focused on validating the aircraft drag polar, zero-lift drag coefficient, and span efficiency factor. Three methods are detailed and evaluated for estimating these design parameters. Specific focus is placed on the influence of propulsion and navigation system uncertainty on the resulting performance data. Performance estimates are used in conjunction with the propulsion model to estimate the impact of sensor and measurement uncertainty on the endurance and range of a fixed-wing sUAS. Endurance and range results for a simplistic power available model are compared to the Reynolds-dependent model presented in this work. Additional parameter sensitivity analyses related to state estimation uncertainties encountered in flight-testing are presented. Results from these analyses indicate that the sub-system models introduced in this work are of first-order importance, on the order of a 5-10% change in range and endurance, in assessing the performance of a fixed-wing sUAS.
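The kind of propagation this record describes, from measurement uncertainty to endurance uncertainty, can be sketched with a deliberately simple constant-power model. The battery energy and power-draw figures below are hypothetical, not values from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 20_000  # Monte Carlo draws

# Hypothetical sUAS measurements with uncertainty (not dissertation data):
energy = rng.normal(80.0, 2.0, N)    # usable battery energy, Wh
p_elec = rng.normal(120.0, 8.0, N)   # measured electrical power draw, W

# Simplistic constant-power endurance model: minutes aloft per draw.
endurance = energy / p_elec * 60.0

mean = endurance.mean()
lo, hi = np.percentile(endurance, [5, 95])
spread = (hi - lo) / mean            # relative width of the 90% band
print(f"endurance {mean:.1f} min, 90% interval [{lo:.1f}, {hi:.1f}] "
      f"({100 * spread:.0f}% relative spread)")
```

Swapping the constant power draw for a Reynolds-dependent power-available model, as the dissertation does, changes the mapping from measurements to endurance but not the propagation machinery.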

  13. Validation of sea ice models using an uncertainty-based distance metric for multiple model variables: NEW METRIC FOR SEA ICE MODEL VALIDATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urrego-Blanco, Jorge R.; Hunke, Elizabeth C.; Urban, Nathan M.

    Here, we implement a variance-based distance metric (Dn) to objectively assess the skill of sea ice models when multiple output variables or uncertainties in both model predictions and observations need to be considered. The metric compares observation and model data pairs on common spatial and temporal grids, improving upon highly aggregated metrics (e.g., total sea ice extent or volume) by capturing the spatial character of model skill. The Dn metric is a gamma-distributed statistic that is more general than the χ2 statistic commonly used to assess model fit, which requires the assumption that the model is unbiased and can only incorporate observational error in the analysis. The Dn statistic does not assume that the model is unbiased, and allows the incorporation of multiple observational data sets for the same variable, and simultaneously for different variables, along with different types of variances that can characterize uncertainties in both observations and the model. This approach represents a step toward establishing a systematic framework for probabilistic validation of sea ice models. The methodology is also useful for model tuning, using the Dn metric as a cost function and incorporating model parametric uncertainty as part of a scheme to optimize model functionality. We apply this approach to evaluate different configurations of the standalone Los Alamos sea ice model (CICE), encompassing the parametric uncertainty in the model, and to find new sets of model configurations that produce better agreement than previous configurations between model and observational estimates of sea ice concentration and thickness.
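The core idea, a squared model-observation distance normalized by the sum of both variances rather than observational error alone, can be sketched on synthetic data. This is an illustrative reduction, not the paper's exact formulation of Dn, and all fields and variances below are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 1-D "grid" for one variable (illustrative, not CICE output):
# an unknown truth observed with error, and a biased, noisier model.
truth = rng.uniform(0.5, 1.0, 500)            # e.g. sea ice concentration
obs   = truth + rng.normal(0.00, 0.05, 500)
model = truth + rng.normal(0.02, 0.08, 500)   # 0.02 systematic bias
var_obs, var_model = 0.05**2, 0.08**2

bias = (model - obs).mean()                   # estimated model bias

# Variance-normalized squared distance per grid point: unlike a plain
# chi-squared fit, both observational and model variances enter, and the
# model is not assumed unbiased.
d2 = (model - obs - bias)**2 / (var_model + var_obs)
Dn = d2.mean()
print(f"Dn = {Dn:.2f}  (near 1 when mismatch matches the stated variances)")
```

Values well above 1 would indicate model-observation differences larger than the stated variances explain, which is what makes the statistic usable as a calibration cost function.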

  14. Validation of sea ice models using an uncertainty-based distance metric for multiple model variables: NEW METRIC FOR SEA ICE MODEL VALIDATION

    DOE PAGES

    Urrego-Blanco, Jorge R.; Hunke, Elizabeth C.; Urban, Nathan M.; ...

    2017-04-01

    Here, we implement a variance-based distance metric (Dn) to objectively assess the skill of sea ice models when multiple output variables or uncertainties in both model predictions and observations need to be considered. The metric compares observation and model data pairs on common spatial and temporal grids, improving upon highly aggregated metrics (e.g., total sea ice extent or volume) by capturing the spatial character of model skill. The Dn metric is a gamma-distributed statistic that is more general than the χ2 statistic commonly used to assess model fit, which requires the assumption that the model is unbiased and can only incorporate observational error in the analysis. The Dn statistic does not assume that the model is unbiased, and allows the incorporation of multiple observational data sets for the same variable, and simultaneously for different variables, along with different types of variances that can characterize uncertainties in both observations and the model. This approach represents a step toward establishing a systematic framework for probabilistic validation of sea ice models. The methodology is also useful for model tuning, using the Dn metric as a cost function and incorporating model parametric uncertainty as part of a scheme to optimize model functionality. We apply this approach to evaluate different configurations of the standalone Los Alamos sea ice model (CICE), encompassing the parametric uncertainty in the model, and to find new sets of model configurations that produce better agreement than previous configurations between model and observational estimates of sea ice concentration and thickness.

  15. Data and Model Uncertainties associated with Biogeochemical Groundwater Remediation and their impact on Decision Analysis

    NASA Astrophysics Data System (ADS)

    Pandey, S.; Vesselinov, V. V.; O'Malley, D.; Karra, S.; Hansen, S. K.

    2016-12-01

    Models and data are used to characterize the extent of contamination and remediation, both of which are dependent upon the complex interplay of processes ranging from geochemical reactions, microbial metabolism, and pore-scale mixing to heterogeneous flow and external forcings. Characterization is fraught with important uncertainties related to the model itself (e.g. conceptualization, model implementation, parameter values) and the data used for model calibration (e.g. sparsity, measurement errors). This research consists of two primary components: (1) Developing numerical models that incorporate the complex hydrogeology and biogeochemistry that drive groundwater contamination and remediation; (2) Utilizing novel techniques for data/model-based analyses (such as parameter calibration and uncertainty quantification) to aid in decision support for optimal uncertainty reduction related to characterization and remediation of contaminated sites. The reactive transport models are developed using PFLOTRAN and are capable of simulating a wide range of biogeochemical and hydrologic conditions that affect the migration and remediation of groundwater contaminants under diverse field conditions. Data/model-based analyses are achieved using MADS, which utilizes Bayesian methods and Information Gap theory to address the data/model uncertainties discussed above. We also use these tools to evaluate different models, which vary in complexity, in order to weigh and rank models based on model accuracy (in representation of existing observations), model parsimony (everything else being equal, models with a smaller number of parameters are preferred), and model robustness (related to model predictions of unknown future states).
These analyses are carried out on synthetic problems, but are directly related to real-world problems; for example, the modeled processes and data inputs are consistent with the conditions at the Los Alamos National Laboratory contamination sites (RDX and Chromium).
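The accuracy-versus-parsimony trade-off described above can be illustrated with an information criterion. This is a simpler stand-in for the Bayesian and information-gap machinery in MADS, and the candidate-model fit statistics below are hypothetical.

```python
import numpy as np

def aic(rss: float, n: int, k: int) -> float:
    """Akaike information criterion for Gaussian errors:
    rewards fit (small residual sum of squares) and penalizes
    the number of parameters k."""
    return n * np.log(rss / n) + 2 * k

# Hypothetical calibration results for three candidate models of
# increasing complexity: (residual sum of squares, parameter count)
# from fitting the same n observations.
n = 120
candidates = {"simple": (4.8, 3), "intermediate": (3.9, 6), "complex": (3.7, 14)}

scores = {name: aic(rss, n, k) for name, (rss, k) in candidates.items()}
best = min(scores, key=scores.get)
for name, s in scores.items():
    print(f"{name}: AIC = {s:.1f}")
print(f"preferred model: {best}")
```

Here the complex model fits slightly better but pays too large a parameter penalty, so the intermediate model is ranked first, the pattern the abstract's parsimony criterion is after.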

  16. Additional challenges for uncertainty analysis in river engineering

    NASA Astrophysics Data System (ADS)

    Berends, Koen; Warmink, Jord; Hulscher, Suzanne

    2016-04-01

    The management of rivers for improving safety, shipping and environment requires conscious effort on the part of river managers. River engineers design hydraulic works to tackle various challenges, from increasing flow conveyance to ensuring minimal water depths for environmental flow and inland shipping. Last year saw the completion of such large-scale river engineering in the 'Room for the River' programme for the Dutch Rhine River system, in which several dozen human interventions were built to increase flood safety. Engineering works in rivers are not completed in isolation from society. Rather, their benefits - increased safety, landscaping beauty - and their disadvantages - expropriation, hindrance - directly affect inhabitants. Therefore river managers are required to carefully defend their plans. The effect of engineering works on river dynamics is evaluated using hydraulic river models. Two-dimensional numerical models based on the shallow water equations provide the predictions necessary to make decisions on designs and future plans. However, like all environmental models, these predictions are subject to uncertainty. In recent years progress has been made in the identification of the main sources of uncertainty for hydraulic river models. Two of the most important sources are boundary conditions and hydraulic roughness (Warmink et al. 2013). The result of these sources of uncertainty is that identifying a single, deterministic prediction model is a non-trivial task. This is a well-understood problem in other fields as well - most notably hydrology - and is known as equifinality. However, the particular case of human intervention modelling with hydraulic river models compounds the equifinality problem. The model that provides the reference baseline situation is usually identified through calibration and afterwards modified for the engineering intervention.
This results in two distinct models, the evaluation of which yields the effect of the proposed intervention. The implicit assumption underlying such analysis is that both models are commensurable. We hypothesize that they are commensurable only to a certain extent. In an idealised study we have demonstrated that prediction performance loss should be expected with increasingly large engineering works. When accounting for parametric uncertainty of floodplain roughness in model identification, we see uncertainty bounds for predicted effects of interventions increase with increasing intervention scale. Calibration of these types of models therefore seems to have a shelf-life, beyond which calibration no longer improves prediction. A qualification scheme for model use is therefore required that can be linked to model validity. In this study, we characterize model use along three dimensions: extrapolation (using the model with different external drivers), extension (using the model for different output or indicators) and modification (using modified models). Such use of models is also expected to have implications for the applicability of surrogate modelling for efficient uncertainty analysis, which is recommended for future research. Warmink, J. J.; Straatsma, M. W.; Huthoff, F.; Booij, M. J. & Hulscher, S. J. M. H. 2013. Uncertainty of design water levels due to combined bed form and vegetation roughness in the Dutch river Waal. Journal of Flood Risk Management 6, 302-318. DOI: 10.1111/jfr3.12014

  17. BioVapor Model Evaluation (St. Louis, MO)

    EPA Science Inventory

    The BioVapor model addresses transport and biodegradation of petroleum vapors in the subsurface. This presentation describes basic background on the nature and scientific basis of environmental transport models. It then describes a series of parameter uncertainty runs of the Bi...

  18. Opportunities of probabilistic flood loss models

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Lüdtke, Stefan; Vogel, Kristin; Merz, Bruno

    2016-04-01

    Oftentimes, traditional uni-variate damage models, such as depth-damage curves, fail to reproduce the variability of observed flood damage. However, reliable flood damage models are a prerequisite for the practical usefulness of model results. Innovative multi-variate probabilistic modelling approaches are promising for capturing and quantifying the uncertainty involved, and thus for improving the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, with traditional stage-damage functions. For model evaluation we use empirical damage data available from computer-aided telephone interviews compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records: one sub-set is used to derive the models and the remaining records are used to evaluate predictive performance. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out at the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), and the sharpness and reliability of the predictions, the latter represented by the proportion of observations that fall within the 5%- to 95%-quantile predictive interval. The comparison of the uni-variable stage-damage function and the multi-variable model approach emphasises the importance of quantifying predictive uncertainty. With each explanatory variable, the multi-variable model reveals an additional source of uncertainty.
However, the predictive performance in terms of bias (mbe), precision (mae) and reliability (HR) is clearly improved in comparison to the uni-variable stage-damage function. Overall, probabilistic models provide quantitative information about prediction uncertainty, which is crucial for assessing the reliability of model predictions and improves the usefulness of model results.
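The split-sample evaluation described in this record can be sketched with a bootstrap-aggregated (bagged) stage-damage line in place of the paper's decision trees and Bayesian networks. The damage data below are synthetic, not the Elbe/Danube survey records.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic damage records (illustrative only): water depth (m) vs.
# relative building damage, with the heavy scatter typical of loss data.
n = 400
depth  = rng.uniform(0.1, 3.0, n)
damage = np.clip(0.15 * depth + rng.normal(0.0, 0.10, n), 0.0, 1.0)

# Split-sample test: derive the model on one subset, evaluate on the rest.
idx = rng.permutation(n)
train, test = idx[:300], idx[300:]

# Bagging a simple stage-damage line: each bootstrap refit, plus resampled
# residuals, contributes one member of a predictive distribution per point.
preds = []
for _ in range(200):
    b = rng.choice(train, size=train.size, replace=True)
    coef = np.polyfit(depth[b], damage[b], deg=1)
    resid = damage[b] - np.polyval(coef, depth[b])
    preds.append(np.polyval(coef, depth[test]) +
                 rng.choice(resid, size=test.size))
preds = np.asarray(preds)

point = preds.mean(axis=0)
mbe = (point - damage[test]).mean()                         # mean bias
mae = np.abs(point - damage[test]).mean()                   # mean abs. error
lo, hi = np.percentile(preds, [5, 95], axis=0)
hit = ((damage[test] >= lo) & (damage[test] <= hi)).mean()  # reliability
print(f"mbe={mbe:+.3f}  mae={mae:.3f}  hit rate={hit:.2f}")
```

The hit rate is the paper's reliability measure: the fraction of held-out observations falling inside the 5%- to 95%-quantile predictive interval.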

  19. UCODE_2005 and six other computer codes for universal sensitivity analysis, calibration, and uncertainty evaluation constructed using the JUPITER API

    USGS Publications Warehouse

    Poeter, Eileen E.; Hill, Mary C.; Banta, Edward R.; Mehl, Steffen; Christensen, Steen

    2006-01-01

    This report documents the computer codes UCODE_2005 and six post-processors. Together the codes can be used with existing process models to perform sensitivity analysis, data needs assessment, calibration, prediction, and uncertainty analysis. Any process model or set of models can be used; the only requirements are that models have numerical (ASCII or text only) input and output files, that the numbers in these files have sufficient significant digits, that all required models can be run from a single batch file or script, and that simulated values are continuous functions of the parameter values. Process models can include pre-processors and post-processors as well as one or more models related to the processes of interest (physical, chemical, and so on), making UCODE_2005 extremely powerful. An estimated parameter can be a quantity that appears in the input files of the process model(s), or a quantity used in an equation that produces a value that appears in the input files. In the latter situation, the equation is user-defined. UCODE_2005 can compare observations and simulated equivalents. The simulated equivalents can be any simulated value written in the process-model output files or can be calculated from simulated values with user-defined equations. The quantities can be model results, or dependent variables. For example, for ground-water models they can be heads, flows, concentrations, and so on. Prior, or direct, information on estimated parameters also can be considered. Statistics are calculated to quantify the comparison of observations and simulated equivalents, including a weighted least-squares objective function. In addition, data-exchange files are produced that facilitate graphical analysis. UCODE_2005 can be used fruitfully in model calibration through its sensitivity analysis capabilities and its ability to estimate parameter values that result in the best possible fit to the observations. 
Parameters are estimated using nonlinear regression: a weighted least-squares objective function is minimized with respect to the parameter values using a modified Gauss-Newton method or a double-dogleg technique. Sensitivities needed for the method can be read from files produced by process models that can calculate sensitivities, such as MODFLOW-2000, or can be calculated by UCODE_2005 using a more general, but less accurate, forward- or central-difference perturbation technique. Problems resulting from inaccurate sensitivities and solutions related to the perturbation techniques are discussed in the report. Statistics are calculated and printed for use in (1) diagnosing inadequate data and identifying parameters that probably cannot be estimated; (2) evaluating estimated parameter values; and (3) evaluating how well the model represents the simulated processes. Results from UCODE_2005 and codes RESIDUAL_ANALYSIS and RESIDUAL_ANALYSIS_ADV can be used to evaluate how accurately the model represents the processes it simulates. Results from LINEAR_UNCERTAINTY can be used to quantify the uncertainty of model simulated values if the model is sufficiently linear. Results from MODEL_LINEARITY and MODEL_LINEARITY_ADV can be used to evaluate model linearity and, thereby, the accuracy of the LINEAR_UNCERTAINTY results. UCODE_2005 can also be used to calculate nonlinear confidence and predictions intervals, which quantify the uncertainty of model simulated values when the model is not linear. CORFAC_PLUS can be used to produce factors that allow intervals to account for model intrinsic nonlinearity and small-scale variations in system characteristics that are not explicitly accounted for in the model or the observation weighting. The six post-processing programs are independent of UCODE_2005 and can use the results of other programs that produce the required data-exchange files. UCODE_2005 and the other six codes are intended for use on any computer operating system. 
The programs con
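The estimation scheme this report describes, a weighted least-squares objective minimized by modified Gauss-Newton with perturbation-based sensitivities, can be sketched compactly. The exponential "process model" below is a hypothetical stand-in, not a UCODE_2005 example.

```python
import numpy as np

def gauss_newton(model, p0, obs, weights, iters=30, fd=1e-6):
    """Minimize S(p) = sum_i w_i * (obs_i - model(p)_i)^2 by Gauss-Newton,
    with sensitivities from a forward-difference perturbation (the more
    general but less accurate option the report describes)."""
    p = np.asarray(p0, dtype=float)
    W = np.diag(weights)
    sse = lambda q: float(weights @ (obs - model(q))**2)
    for _ in range(iters):
        r = obs - model(p)                     # weighted residual target
        J = np.empty((obs.size, p.size))       # sensitivity (Jacobian) matrix
        for j in range(p.size):
            dp = np.zeros_like(p)
            dp[j] = fd * max(1.0, abs(p[j]))
            J[:, j] = (model(p + dp) - model(p)) / dp[j]
        step = np.linalg.solve(J.T @ W @ J, J.T @ W @ r)  # normal equations
        lam = 1.0                              # step-halving safeguard
        while lam > 1e-8 and sse(p + lam * step) > sse(p):
            lam /= 2.0
        p = p + lam * step
    return p

# Hypothetical process model: exponential decline y = a * exp(-b t).
t = np.linspace(0.0, 5.0, 25)
obs = 2.0 * np.exp(-0.7 * t)                   # noise-free synthetic data
est = gauss_newton(lambda p: p[0] * np.exp(-p[1] * t), [1.0, 1.0],
                   obs, np.ones_like(t))
print(est)  # converges toward a = 2.0, b = 0.7
```

The step-halving guard plays the role of the "modified" in modified Gauss-Newton; a double-dogleg trust region, which UCODE_2005 also offers, is the more robust alternative.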

  20. Uncertainty in projected point precipitation extremes for hydrological impact analysis of climate change

    NASA Astrophysics Data System (ADS)

    Van Uytven, Els; Willems, Patrick

    2017-04-01

    Current trends in hydro-meteorological variables indicate the potential impact of climate change on hydrological extremes. They therefore give increased importance to climate adaptation strategies in water management. The impact of climate change on hydro-meteorological and hydrological extremes is, however, highly uncertain. This is due to uncertainties introduced by the climate models, the internal variability inherent to the climate system, the greenhouse gas scenarios and the statistical downscaling methods. In view of the need to define sustainable climate adaptation strategies, these uncertainties must be assessed. This is commonly done by means of ensemble approaches. Because more and more climate models and statistical downscaling methods become available, there is a need to facilitate climate impact and uncertainty analysis. A Climate Perturbation Tool has been developed for that purpose; it combines a set of statistical downscaling methods including weather typing, weather generator, transfer function and advanced perturbation based approaches. Through an interactive interface, climate impact modelers can apply these statistical downscaling methods in a semi-automatic way to an ensemble of climate model runs. The tool is applicable to any region, but has so far been demonstrated for cases in Belgium, Suriname, Vietnam and Bangladesh. Time series representing future local-scale precipitation, temperature and potential evapotranspiration (PET) conditions were obtained, starting from time series of historical observations. Uncertainties on the future meteorological conditions are represented in two different ways: through an ensemble of time series, and through a reduced set of synthetic scenarios. Both aim to span the full uncertainty range as assessed from the ensemble of climate model runs and downscaling methods.
For Belgium, for instance, use was made of 100-year time series of 10-minute precipitation observations and daily temperature and PET observations at Uccle, together with a large ensemble of 160 global climate model runs (CMIP5). These cover all four representative concentration pathway (RCP) greenhouse gas scenarios. While evaluating the downscaled meteorological series, particular attention was given to the performance of extreme value metrics (e.g., for precipitation, intensity-duration-frequency statistics). Moreover, the total uncertainty was decomposed into the fractional uncertainties for each of the uncertainty sources considered. Research assessing the additional uncertainty due to parameter and structural uncertainties of the hydrological impact model is ongoing.
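The decomposition of total uncertainty into fractional contributions can be sketched with a simple two-factor (ANOVA-style) split across climate models and downscaling methods. The change signals below are synthetic, not values from the Belgian case study.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical matrix of projected precipitation changes (%): one row per
# climate model run, one column per statistical downscaling method, built
# here as purely additive effects (no interaction term).
n_gcm, n_dsm = 8, 4
gcm_effect = rng.normal(0.0, 6.0, (n_gcm, 1))  # spread across climate models
dsm_effect = rng.normal(0.0, 2.0, (1, n_dsm))  # spread across downscaling
change = 10.0 + gcm_effect + dsm_effect

total = change.var()
frac_gcm = change.mean(axis=1).var() / total   # variance of per-model means
frac_dsm = change.mean(axis=0).var() / total   # variance of per-method means
print(f"GCM fraction: {frac_gcm:.2f}, downscaling fraction: {frac_dsm:.2f}")
```

For a purely additive table the two fractions sum to one; in real ensembles an interaction/residual term absorbs the remainder.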

  1. Quantifying uncertainties in the structural response of SSME blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.

    1987-01-01

    To quantify the uncertainties associated with the geometry and material properties of a Space Shuttle Main Engine (SSME) turbopump blade, a computer code known as STAEBL was used. A finite element model of the blade used 80 triangular shell elements with 55 nodes and five degrees of freedom per node. The entire study was conducted by computer simulation; no physical experiments were performed. The structural response was evaluated in terms of three variables: natural frequencies, root (maximum) stress, and blade tip displacement. The results of the study indicate that only the geometric uncertainties have significant effects on the response; uncertainties in material properties have insignificant effects.

  2. Reliability and performance evaluation of systems containing embedded rule-based expert systems

    NASA Technical Reports Server (NTRS)

    Beaton, Robert M.; Adams, Milton B.; Harrison, James V. A.

    1989-01-01

    A method for evaluating the reliability of real-time systems containing embedded rule-based expert systems is proposed and investigated. It is a three-stage technique that addresses the impact of knowledge-base uncertainties on the performance of expert systems. In the first stage, a Markov reliability model of the system is developed which identifies the key performance parameters of the expert system. In the second stage, the evaluation method is used to determine the values of the expert system's key performance parameters. The performance parameters can be evaluated directly by using a probabilistic model of uncertainties in the knowledge-base or by using sensitivity analyses. In the third and final stage, the performance parameters of the expert system are combined with performance parameters for other system components and subsystems to evaluate the reliability and performance of the complete system. The evaluation method is demonstrated in the context of a simple expert system used to supervise the performance of an FDI algorithm associated with an aircraft longitudinal flight-control system.
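The first-stage Markov reliability model can be sketched as a small discrete-time chain. The states and transition probabilities below are invented for illustration; in the method, parameters like the degradation rate would come from the probabilistic knowledge-base model or sensitivity analysis.

```python
import numpy as np

# Three-state Markov reliability model (hypothetical hourly probabilities):
# 0 = nominal, 1 = degraded (e.g. expert-system mis-diagnosis), 2 = failed.
P = np.array([
    [0.995, 0.004, 0.001],   # nominal  -> {nominal, degraded, failed}
    [0.000, 0.980, 0.020],   # degraded -> {degraded, failed}
    [0.000, 0.000, 1.000],   # failed is absorbing
])

p = np.array([1.0, 0.0, 0.0])      # start in the nominal state
for _ in range(100):               # propagate 100 one-hour steps
    p = p @ P
reliability = 1.0 - p[2]           # probability the system has not failed
print(f"100-hour reliability: {reliability:.3f}")
```

Re-running the propagation while perturbing a single entry of P (say, the nominal-to-degraded probability) gives exactly the sensitivity of system reliability to that expert-system performance parameter.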

  3. Exploring Best Practice Skills to Predict Uncertainties in Venture Capital Investment Decision-Making

    NASA Astrophysics Data System (ADS)

    Blum, David Arthur

    Algae biodiesel is the sole sustainable and abundant transportation fuel source that can replace petrol diesel use; however, high competition and economic uncertainties exist, influencing independent venture capital decision making. Technology, market, management, and government action uncertainties influence competition and economic uncertainties in the venture capital industry. The purpose of this qualitative case study was to identify the best practice skills at independent venture capital (IVC) firms to predict uncertainty between early and late funding stages. The basis of the study was real options theory, a framework used to evaluate and understand the economic and competition uncertainties inherent in natural resource investment and energy derived from plant-based oils. Data were collected from interviews of 24 venture capital partners based in the United States who invest in algae and other renewable energy solutions. Data were analyzed by coding and theme development interwoven with the conceptual framework. Eight themes emerged: (a) expected returns model, (b) due diligence, (c) invest in specific sectors, (d) reduced uncertainty-late stage, (e) coopetition, (f) portfolio firm relationships, (g) differentiation strategy, and (h) modeling uncertainty and best practice. The most noteworthy finding was that predicting uncertainty at the early stage was impractical; at the expansion and late funding stages, however, predicting uncertainty was possible. The implications of these findings will affect social change by providing independent venture capitalists with best practice skills to increase successful exits, lessen uncertainty, and encourage increased funding of renewable energy firms, contributing to cleaner and healthier communities throughout the United States.

  4. Two approaches to incorporate clinical data uncertainty into multiple criteria decision analysis for benefit-risk assessment of medicinal products.

    PubMed

    Wen, Shihua; Zhang, Lanju; Yang, Bo

    2014-07-01

    The Problem formulation, Objectives, Alternatives, Consequences, Trade-offs, Uncertainties, Risk attitude, and Linked decisions (PrOACT-URL) framework and multiple criteria decision analysis (MCDA) have been recommended by the European Medicines Agency for structured benefit-risk assessment of medicinal products undergoing regulatory review. The objective of this article was to provide solutions to incorporate the uncertainty from clinical data into the MCDA model when evaluating the overall benefit-risk profiles among different treatment options. Two statistical approaches, the δ-method approach and the Monte-Carlo approach, were proposed to construct the confidence interval of the overall benefit-risk score from the MCDA model as well as other probabilistic measures for comparing the benefit-risk profiles between treatment options. Both approaches can incorporate the correlation structure between clinical parameters (criteria) in the MCDA model and are straightforward to implement. The two proposed approaches were applied to a case study to evaluate the benefit-risk profile of an add-on therapy for rheumatoid arthritis (drug X) relative to placebo. This demonstrated a straightforward way to quantify the impact of the uncertainty from clinical data on the benefit-risk assessment and enabled statistical inference on evaluating the overall benefit-risk profiles among different treatment options. The δ-method approach provides a closed form to quantify the variability of the overall benefit-risk score in the MCDA model, whereas the Monte-Carlo approach is more computationally intensive but can yield its true sampling distribution for statistical inference. The obtained confidence intervals and other probabilistic measures from the two approaches enhance the benefit-risk decision making of medicinal products. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
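For a linear MCDA score the two approaches can be compared in a few lines: the δ-method variance of a weighted sum w·x is w'Sw (exact in the linear case), while Monte Carlo samples the criteria jointly and recomputes the score. The weights, criterion values and covariance below are invented, not the drug X case-study figures.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical MCDA setup: three pre-scaled clinical criteria with point
# estimates x, covariance S (correlated uncertainty), and weights w.
w = np.array([0.5, 0.3, 0.2])
x = np.array([0.70, 0.55, 0.40])
S = np.array([[0.010, 0.002, 0.000],
              [0.002, 0.020, 0.004],
              [0.000, 0.004, 0.015]])

# Delta method: for the linear score w.x the variance is w' S w.
score = w @ x
se = np.sqrt(w @ S @ w)
ci_delta = (score - 1.96 * se, score + 1.96 * se)

# Monte Carlo: sample criteria jointly, recompute the score each draw.
draws = rng.multivariate_normal(x, S, size=50_000) @ w
ci_mc = tuple(np.percentile(draws, [2.5, 97.5]))

print(f"score={score:.3f}  delta CI={ci_delta}  MC CI={ci_mc}")
```

With linear partial value functions the two intervals agree; the Monte Carlo route becomes the more faithful one when the value functions are nonlinear, at higher computational cost, matching the article's trade-off.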

  5. The Importance of Uncertainty and Sensitivity Analysis in Process-based Models of Carbon and Nitrogen Cycling in Terrestrial Ecosystems with Particular Emphasis on Forest Ecosystems — Selected Papers from a Workshop Organized by the International Society for Ecological Modelling (ISEM) at the Third Biennal Meeting of the International Environmental Modelling and Software Society (IEMSS) in Burlington, Vermont, USA, August 9-13, 2006

    USGS Publications Warehouse

    Larocque, Guy R.; Bhatti, Jagtar S.; Liu, Jinxun; Ascough, James C.; Gordon, Andrew M.

    2008-01-01

    Many process-based models of carbon (C) and nitrogen (N) cycles have been developed for terrestrial ecosystems, including forest ecosystems. They address many basic issues of ecosystem structure and functioning, such as the role of internal feedback in ecosystem dynamics. The critical factor in these phenomena is scale, as these processes operate at scales from the minute (e.g. particulate pollution impacts on trees and other organisms) to the global (e.g. climate change). Research efforts remain important to improve the capability of such models to better represent the dynamics of terrestrial ecosystems, including the C, nutrient (e.g. N) and water cycles. Existing models are sufficiently well advanced to help decision makers develop sustainable management policies and planning of terrestrial ecosystems, as they make realistic predictions when used appropriately. However, decision makers must be aware of their limitations by having the opportunity to evaluate the uncertainty associated with process-based models (Smith and Heath, 2001; Allen et al., 2004). The variation in scale of issues currently being addressed by modelling efforts makes the evaluation of uncertainty a daunting task.

  6. GIA Model Statistics for GRACE Hydrology, Cryosphere, and Ocean Science

    NASA Astrophysics Data System (ADS)

    Caron, L.; Ivins, E. R.; Larour, E.; Adhikari, S.; Nilsson, J.; Blewitt, G.

    2018-03-01

    We provide a new analysis of glacial isostatic adjustment (GIA) with the goal of assembling the model uncertainty statistics required for rigorously extracting trends in surface mass from the Gravity Recovery and Climate Experiment (GRACE) mission. Such statistics are essential for deciphering sea level, ocean mass, and hydrological changes because the latter signals can be relatively small (≤2 mm/yr water height equivalent) over very large regions, such as major ocean basins and watersheds. With abundant new >7 year continuous measurements of vertical land motion (VLM) reported by Global Positioning System stations on bedrock and new relative sea level records, our new statistical evaluation of GIA uncertainties incorporates Bayesian methodologies. A unique aspect of the method is that both the ice history and 1-D Earth structure vary through a total of 128,000 forward models. We find that best fit models poorly capture the statistical inferences needed to correctly invert for lower mantle viscosity and that GIA uncertainty exceeds the uncertainty ascribed to trends from 14 years of GRACE data in polar regions.

  7. Characterizing uncertainty and variability in physiologically based pharmacokinetic models: state of the science and needs for research and implementation.

    PubMed

    Barton, Hugh A; Chiu, Weihsueh A; Setzer, R Woodrow; Andersen, Melvin E; Bailer, A John; Bois, Frédéric Y; Dewoskin, Robert S; Hays, Sean; Johanson, Gunnar; Jones, Nancy; Loizou, George; Macphail, Robert C; Portier, Christopher J; Spendiff, Martin; Tan, Yu-Mei

    2007-10-01

    Physiologically based pharmacokinetic (PBPK) models are used in mode-of-action based risk and safety assessments to estimate internal dosimetry in animals and humans. When used in risk assessment, these models can provide a basis for extrapolating between species, doses, and exposure routes or for justifying nondefault values for uncertainty factors. Characterization of uncertainty and variability is increasingly recognized as important for risk assessment; this represents a continuing challenge for both PBPK modelers and users. Current practices show significant progress in specifying deterministic biological models and nondeterministic (often statistical) models, estimating parameters using diverse data sets from multiple sources, using them to make predictions, and characterizing uncertainty and variability of model parameters and predictions. The International Workshop on Uncertainty and Variability in PBPK Models, held 31 Oct-2 Nov 2006, identified the state-of-the-science, needed changes in practice and implementation, and research priorities. For the short term, these include (1) multidisciplinary teams to integrate deterministic and nondeterministic/statistical models; (2) broader use of sensitivity analyses, including for structural and global (rather than local) parameter changes; and (3) enhanced transparency and reproducibility through improved documentation of model structure(s), parameter values, sensitivity and other analyses, and supporting, discrepant, or excluded data. 
Longer-term needs include (1) theoretical and practical methodological improvements for nondeterministic/statistical modeling; (2) better methods for evaluating alternative model structures; (3) peer-reviewed databases of parameters and covariates, and their distributions; (4) expanded coverage of PBPK models across chemicals with different properties; and (5) training and reference materials, such as case studies, bibliographies/glossaries, model repositories, and enhanced software. The multidisciplinary dialogue initiated by this Workshop will foster the collaboration, research, data collection, and training necessary to make characterizing uncertainty and variability a standard practice in PBPK modeling and risk assessment.

  8. Incorporating rainfall uncertainty in a SWAT model: the river Zenne basin (Belgium) case study

    NASA Astrophysics Data System (ADS)

    Tolessa Leta, Olkeba; Nossent, Jiri; van Griensven, Ann; Bauwens, Willy

    2013-04-01

    The European Union Water Framework Directive (EU-WFD) called on its member countries to achieve a good ecological status for all inland and coastal water bodies by 2015. According to recent studies, the river Zenne (Belgium) is far from this objective. Therefore, an interuniversity and multidisciplinary project "Towards a Good Ecological Status in the river Zenne (GESZ)" was launched to evaluate the effects of wastewater management plans on the river. In this project, different models have been developed and integrated using the Open Modelling Interface (OpenMI). The hydrologic, semi-distributed Soil and Water Assessment Tool (SWAT) is hereby used as one of the model components in the integrated modelling chain in order to model the upland catchment processes. The assessment of the uncertainty of SWAT is an essential aspect of the decision making process, in order to design robust management strategies that take the predicted uncertainties into account. Model uncertainty stems from the uncertainties on the model parameters, the input data (e.g., rainfall), the calibration data (e.g., stream flows) and on the model structure itself. The objective of this paper is to assess the first three sources of uncertainty in a SWAT model of the river Zenne basin. For the assessment of rainfall measurement uncertainty, first, we identified independent rainfall periods, based on the daily precipitation and stream flow observations and using the Water Engineering Time Series PROcessing tool (WETSPRO). Secondly, we assigned a rainfall multiplier parameter for each of the independent rainfall periods, which serves as a multiplicative input error corruption. Finally, we treated these multipliers as latent parameters in the model optimization and uncertainty analysis (UA). For parameter uncertainty assessment, due to the high number of parameters of the SWAT model, first, we screened out its most sensitive parameters using the Latin Hypercube One-factor-At-a-Time (LH-OAT) technique. 
Subsequently, we only considered the most sensitive parameters for parameter optimization and UA. To explicitly account for the stream flow uncertainty, we assumed that the stream flow measurement error increases linearly with the stream flow value. To assess the uncertainty and infer posterior distributions of the parameters, we used a Markov chain Monte Carlo (MCMC) sampler, Differential Evolution Adaptive Metropolis (DREAM), which samples from an archive of past states to generate candidate points in each individual chain. It is shown that the marginal posterior distributions of the rainfall multipliers vary widely between individual events, as a consequence of rainfall measurement errors and the spatial variability of the rain. Only a few of the rainfall events are well defined. The marginal posterior distributions of the SWAT model parameter values are well defined and identified by DREAM, within their prior ranges. The posterior distributions of the output uncertainty parameter values also show that the stream flow data are highly uncertain. The approach of using rainfall multipliers to treat rainfall uncertainty in a complex model has an impact on the model parameter marginal posterior distributions and on the model results.
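    The rainfall-multiplier idea can be sketched in miniature: per-event multipliers are treated as latent parameters and sampled jointly with a model parameter. Here a plain random-walk Metropolis sampler and a one-parameter toy model stand in for DREAM and SWAT; all names and numbers are hypothetical, and the heteroscedastic error model mirrors the abstract's assumption that flow error grows linearly with flow:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic example: 3 independent rainfall events with observed (possibly
# biased) rainfall P and observed stream flow Q. The toy hydrologic model is
# Q = k * M_i * P, with model parameter k and latent multiplier M_i per event.
P = np.array([10.0, 20.0, 15.0])
true_mult = np.array([0.8, 1.2, 1.0])
Q_obs = 0.5 * true_mult * P + rng.normal(0.0, 0.2, 3)

def log_post(theta):
    """Log-posterior of [k, M1, M2, M3]; error sd grows linearly with flow."""
    k, mult = theta[0], theta[1:]
    if k <= 0 or np.any(mult <= 0):
        return -np.inf
    q = k * mult * P
    sigma = 0.05 * q + 0.1
    ll = -0.5 * np.sum(((Q_obs - q) / sigma) ** 2 + np.log(sigma ** 2))
    lp = -0.5 * np.sum(((mult - 1.0) / 0.3) ** 2)   # multipliers ~ N(1, 0.3^2)
    return ll + lp

# Random-walk Metropolis over the parameter and latent multipliers jointly.
theta = np.array([1.0, 1.0, 1.0, 1.0])
lp_cur = log_post(theta)
chain = []
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.05, 4)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp_cur:
        theta, lp_cur = prop, lp_prop
    chain.append(theta.copy())
chain = np.array(chain[1000:])      # discard burn-in
post_mean = chain.mean(axis=0)
```

    Wide marginal posteriors for some multipliers, as reported in the abstract, would correspond here to events whose data constrain M_i only weakly.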

  9. An analytical framework to assist decision makers in the use of forest ecosystem model predictions

    USDA-ARS?s Scientific Manuscript database

    The predictions of most terrestrial ecosystem models originate from deterministic simulations. Relatively few uncertainty evaluation exercises in model outputs are performed by either model developers or users. This issue has important consequences for decision makers who rely on models to develop n...

  10. Developing a non-point source P loss indicator in R and its parameter uncertainty assessment using GLUE: a case study in northern China.

    PubMed

    Su, Jingjun; Du, Xinzhong; Li, Xuyong

    2018-05-16

    Uncertainty analysis is an important prerequisite for model application. However, the existing phosphorus (P) loss indexes or indicators have rarely been evaluated in this respect. This study applied the generalized likelihood uncertainty estimation (GLUE) method to assess the uncertainty of parameters and modeling outputs of a non-point source (NPS) P indicator constructed in the R language. The influences of the subjective choices of likelihood formulation and acceptability threshold in GLUE on model outputs were also examined. The results indicated the following. (1) Parameters RegR2, RegSDR2, PlossDPfer, PlossDPman, DPDR, and DPR were highly sensitive to the overall TP simulation and their value ranges could be reduced by GLUE. (2) The Nash efficiency likelihood (L1) appeared better at accentuating high-likelihood simulations than the exponential function (L2). (3) The combined likelihood integrating the criteria of multiple outputs performed better than a single likelihood in model uncertainty assessment, in terms of reducing the uncertainty band widths while preserving the goodness of fit of the model outputs as a whole. (4) A value of 0.55 appeared to be a modest choice of threshold to balance the interests of high modeling efficiency and high bracketing efficiency. Results of this study could provide (1) an option to conduct NPS modeling under a single computing platform, (2) important references for parameter setting in NPS model development in similar regions, (3) useful suggestions for the application of the GLUE method in studies with different emphases according to research interests, and (4) important insights into watershed P management in similar regions.
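    The GLUE procedure described above can be sketched as follows, with a toy linear model standing in for the P-loss indicator. The data and parameter ranges are synthetic, while the Nash-Sutcliffe likelihood and the 0.55 acceptability threshold mirror the abstract:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy model y = a*x + b standing in for the P-loss indicator (synthetic data).
x = np.linspace(0.0, 10.0, 20)
y_obs = 2.0 * x + 1.0 + rng.normal(0.0, 1.0, x.size)

def nash_sutcliffe(sim, obs):
    """Nash-Sutcliffe efficiency, used here as the GLUE likelihood measure."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# GLUE: sample parameter sets from uniform priors and keep the "behavioral"
# ones whose likelihood exceeds the acceptability threshold.
n = 20000
a = rng.uniform(0.0, 4.0, n)
b = rng.uniform(-2.0, 4.0, n)
like = np.array([nash_sutcliffe(ai * x + bi, y_obs) for ai, bi in zip(a, b)])

threshold = 0.55
behavioral = like > threshold
w = like[behavioral] - threshold          # rescaled likelihood weights
w /= w.sum()

# The behavioral ensemble yields reduced parameter ranges and, for each x,
# a prediction band whose width expresses the model uncertainty.
sims = a[behavioral, None] * x + b[behavioral, None]
mean_pred = w @ sims                       # likelihood-weighted prediction
lower, upper = np.percentile(sims, [5, 95], axis=0)
```

    Raising the threshold narrows the band but risks bracketing fewer observations, which is exactly the trade-off behind the study's choice of 0.55.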

  11. Adaptive Management and the Value of Information: Learning Via Intervention in Epidemiology

    PubMed Central

    Shea, Katriona; Tildesley, Michael J.; Runge, Michael C.; Fonnesbeck, Christopher J.; Ferrari, Matthew J.

    2014-01-01

    Optimal intervention for disease outbreaks is often impeded by severe scientific uncertainty. Adaptive management (AM), long-used in natural resource management, is a structured decision-making approach to solving dynamic problems that accounts for the value of resolving uncertainty via real-time evaluation of alternative models. We propose an AM approach to design and evaluate intervention strategies in epidemiology, using real-time surveillance to resolve model uncertainty as management proceeds, with foot-and-mouth disease (FMD) culling and measles vaccination as case studies. We use simulations of alternative intervention strategies under competing models to quantify the effect of model uncertainty on decision making, in terms of the value of information, and quantify the benefit of adaptive versus static intervention strategies. Culling decisions during the 2001 UK FMD outbreak were contentious due to uncertainty about the spatial scale of transmission. The expected benefit of resolving this uncertainty prior to a new outbreak on a UK-like landscape would be £45–£60 million relative to the strategy that minimizes livestock losses averaged over alternate transmission models. AM during the outbreak would be expected to recover up to £20.1 million of this expected benefit. AM would also recommend a more conservative initial approach (culling of infected premises and dangerous contact farms) than would a fixed strategy (which would additionally require culling of contiguous premises). For optimal targeting of measles vaccination, based on an outbreak in Malawi in 2010, AM allows better distribution of resources across the affected region; its utility depends on uncertainty about both the at-risk population and logistical capacity. 
When daily vaccination rates are highly constrained, the optimal initial strategy is to conduct a small, quick campaign; a reduction in expected burden of approximately 10,000 cases could result if campaign targets can be updated on the basis of the true susceptible population. Formal incorporation of a policy to update future management actions in response to information gained in the course of an outbreak can change the optimal initial response and result in significant cost savings. AM provides a framework for using multiple models to facilitate public-health decision making and an objective basis for updating management actions in response to improved scientific understanding. PMID:25333371
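    The value-of-information calculation underlying the FMD example can be sketched with a small decision table. The loss figures below are hypothetical, not those of the study; the structure (competing transmission models, candidate culling strategies) follows the abstract:

```python
import numpy as np

# Two competing transmission models with equal prior weight, and three
# candidate culling strategies; entries are expected losses in million GBP.
losses = np.array([
    [60.0, 45.0, 80.0],   # model A: loss under strategies 1..3
    [50.0, 90.0, 55.0],   # model B
])
prior = np.array([0.5, 0.5])

# Static decision under model uncertainty: pick the strategy minimizing the
# prior-averaged expected loss.
avg_loss = prior @ losses
static_loss = avg_loss.min()

# Perfect information: under each model we would pick that model's best
# strategy; average the resulting losses over the prior.
informed_loss = prior @ losses.min(axis=1)

# Expected value of perfect information: the benefit of resolving model
# uncertainty before acting (AM recovers part of this during the outbreak).
evpi = static_loss - informed_loss
```

    With these numbers the static optimum is strategy 1 (expected loss 55), while resolving model uncertainty first would reduce the expected loss to 47.5, giving an EVPI of 7.5.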

  12. Adaptive management and the value of information: learning via intervention in epidemiology

    USGS Publications Warehouse

    Shea, Katriona; Tildesley, Michael J.; Runge, Michael C.; Fonnesbeck, Christopher J.; Ferrari, Matthew J.

    2014-01-01

    Optimal intervention for disease outbreaks is often impeded by severe scientific uncertainty. Adaptive management (AM), long-used in natural resource management, is a structured decision-making approach to solving dynamic problems that accounts for the value of resolving uncertainty via real-time evaluation of alternative models. We propose an AM approach to design and evaluate intervention strategies in epidemiology, using real-time surveillance to resolve model uncertainty as management proceeds, with foot-and-mouth disease (FMD) culling and measles vaccination as case studies. We use simulations of alternative intervention strategies under competing models to quantify the effect of model uncertainty on decision making, in terms of the value of information, and quantify the benefit of adaptive versus static intervention strategies. Culling decisions during the 2001 UK FMD outbreak were contentious due to uncertainty about the spatial scale of transmission. The expected benefit of resolving this uncertainty prior to a new outbreak on a UK-like landscape would be £45–£60 million relative to the strategy that minimizes livestock losses averaged over alternate transmission models. AM during the outbreak would be expected to recover up to £20.1 million of this expected benefit. AM would also recommend a more conservative initial approach (culling of infected premises and dangerous contact farms) than would a fixed strategy (which would additionally require culling of contiguous premises). For optimal targeting of measles vaccination, based on an outbreak in Malawi in 2010, AM allows better distribution of resources across the affected region; its utility depends on uncertainty about both the at-risk population and logistical capacity. 
When daily vaccination rates are highly constrained, the optimal initial strategy is to conduct a small, quick campaign; a reduction in expected burden of approximately 10,000 cases could result if campaign targets can be updated on the basis of the true susceptible population. Formal incorporation of a policy to update future management actions in response to information gained in the course of an outbreak can change the optimal initial response and result in significant cost savings. AM provides a framework for using multiple models to facilitate public-health decision making and an objective basis for updating management actions in response to improved scientific understanding.

  13. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    NASA Astrophysics Data System (ADS)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models, and measurements, and to propagate those uncertainties through the model, so that one can make a predictive estimate with quantified uncertainties. Two of the aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation, we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined by the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents a prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is a part of nuclear reactor models. 
We employ this simple heat model to illustrate verification techniques for model calibration. For Bayesian model calibration, we employ adaptive Metropolis algorithms to construct densities for input parameters in the heat model and the HIV model. To quantify the uncertainty in the parameters, we employ two MCMC algorithms: Delayed Rejection Adaptive Metropolis (DRAM) [33] and Differential Evolution Adaptive Metropolis (DREAM) [66, 68]. The densities obtained using these methods are compared to those obtained through the direct numerical evaluation of Bayes' formula. We also combine uncertainties in input parameters and measurement errors to construct predictive estimates for a model response. A significant emphasis is on the development and illustration of techniques to verify the accuracy of sampling-based Metropolis algorithms. We verify the accuracy of DRAM and DREAM by comparing chains, densities and correlations obtained using DRAM, DREAM and the direct evaluation of Bayes' formula. We also perform a similar analysis for credible and prediction intervals for responses. Once the parameters are estimated, we employ the energy statistics test [63, 64] to compare the densities obtained by different methods for the HIV model. The energy statistics are used to test the equality of distributions. We also consider parameter selection and verification techniques for models having one or more parameters that are noninfluential in the sense that they minimally impact model outputs. We illustrate these techniques for a dynamic HIV model but note that the parameter selection and verification framework is applicable to a wide range of biological and physical models. 
To accommodate the nonlinear input to output relations, which are typical for such models, we focus on global sensitivity analysis techniques, including those based on partial correlations, Sobol indices based on second-order model representations, and Morris indices, as well as a parameter selection technique based on standard errors. A significant objective is to provide verification strategies to assess the accuracy of those techniques, which we illustrate in the context of the HIV model. Finally, we examine active subspace methods as an alternative to parameter subset selection techniques. The objective of active subspace methods is to determine the subspace of inputs that most strongly affect the model response, and to reduce the dimension of the input space. The major difference between active subspace methods and parameter selection techniques is that parameter selection identifies influential parameters whereas subspace selection identifies a linear combination of parameters that impacts the model responses significantly. We employ active subspace methods discussed in [22] for the HIV model and present a verification that the active subspace successfully reduces the input dimensions.
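    A minimal adaptive Metropolis sketch in the spirit of the calibration described above. The delayed-rejection stage of DRAM and the direct grid evaluation of Bayes' formula used for verification are omitted, and the target is a synthetic two-parameter Gaussian likelihood, not the actual HIV or heat model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic data standing in for the heat-model measurements; we calibrate
# (mu, log_sigma) of a Gaussian observation model.
data = rng.normal(1.5, 0.5, 50)

def log_post(theta):
    """Log-posterior with a flat prior: Gaussian log-likelihood of the data."""
    mu, log_s = theta
    s = np.exp(log_s)
    return -0.5 * np.sum(((data - mu) / s) ** 2) - data.size * log_s

# Adaptive Metropolis: after a warm-up, the proposal covariance is estimated
# from the chain itself, scaled by the classic 2.38^2/d factor.
d, n_iter, warmup = 2, 6000, 500
theta = np.zeros(d)
lp = log_post(theta)
chain = np.empty((n_iter, d))
cov = 0.01 * np.eye(d)
for i in range(n_iter):
    if i > warmup:
        cov = np.cov(chain[:i].T) * 2.38**2 / d + 1e-8 * np.eye(d)
    prop = rng.multivariate_normal(theta, cov)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain[i] = theta
posterior_mean = chain[warmup:].mean(axis=0)
```

    Verification in the dissertation's sense would compare the histogram of this chain against the posterior evaluated directly on a (mu, log_sigma) grid.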

  14. Evaluating data worth for ground-water management under uncertainty

    USGS Publications Warehouse

    Wagner, B.J.

    1999-01-01

    A decision framework is presented for assessing the value of ground-water sampling within the context of ground-water management under uncertainty. The framework couples two optimization models - a chance-constrained ground-water management model and an integer-programming sampling network design model - to identify optimal pumping and sampling strategies. The methodology consists of four steps: (1) The optimal ground-water management strategy for the present level of model uncertainty is determined using the chance-constrained management model; (2) for a specified data collection budget, the monitoring network design model identifies, prior to data collection, the sampling strategy that will minimize model uncertainty; (3) the optimal ground-water management strategy is recalculated on the basis of the projected model uncertainty after sampling; and (4) the worth of the monitoring strategy is assessed by comparing the value of the sample information - i.e., the projected reduction in management costs - with the cost of data collection. Steps 2-4 are repeated for a series of data collection budgets, producing a suite of management/monitoring alternatives, from which the best alternative can be selected. A hypothetical example demonstrates the methodology's ability to identify the ground-water sampling strategy with greatest net economic benefit for ground-water management.
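    The role of the chance constraint, and why reducing model uncertainty has economic value, can be sketched with a one-variable toy problem. All numbers are hypothetical, and the normal-distribution assumption on the drawdown response is an illustration, not the paper's formulation:

```python
# Toy chance-constrained pumping problem: maximize pumping q subject to
# P(a * q <= s_max) >= 0.95, where model uncertainty makes the unit
# drawdown response a ~ N(mean_a, sd_a).
s_max = 5.0                # allowable drawdown (m)
mean_a, sd_a = 0.8, 0.1    # uncertain drawdown per unit pumping rate
z = 1.645                  # standard normal quantile for 95% reliability

# Deterministic equivalent of the chance constraint:
#   (mean_a + z * sd_a) * q <= s_max
q_max_before = s_max / (mean_a + z * sd_a)

# Sampling (steps 2-3 of the framework) shrinks model uncertainty, so the
# same reliability admits more pumping; valuing that gain against the cost
# of data collection is the data-worth comparison in step 4.
sd_a_after = 0.05
q_max_after = s_max / (mean_a + z * sd_a_after)
```

    The smaller the posterior uncertainty sd_a, the closer the allowable pumping gets to the deterministic limit s_max / mean_a.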

  15. Development and comparison of metrics for evaluating climate models and estimation of projection uncertainty

    NASA Astrophysics Data System (ADS)

    Ring, Christoph; Pollinger, Felix; Kaspar-Ott, Irena; Hertig, Elke; Jacobeit, Jucundus; Paeth, Heiko

    2017-04-01

    The COMEPRO project (Comparison of Metrics for Probabilistic Climate Change Projections of Mediterranean Precipitation), funded by the Deutsche Forschungsgemeinschaft (DFG), is dedicated to the development of new evaluation metrics for state-of-the-art climate models. Further, we analyze implications for probabilistic projections of climate change. This study focuses on the results of 4-field matrix metrics. Here, six different approaches are compared. We evaluate 24 models of the Coupled Model Intercomparison Project Phase 3 (CMIP3), 40 of CMIP5 and 18 of the Coordinated Regional Downscaling Experiment (CORDEX). In addition to annual and seasonal precipitation, the mean temperature is analysed. We consider both the 50-year trend and the climatological mean for the second half of the 20th century. For the probabilistic projections of climate change, the A1b and A2 (CMIP3) and RCP4.5 and RCP8.5 (CMIP5, CORDEX) scenarios are used. The eight main study areas are located in the Mediterranean. However, we apply our metrics to globally distributed regions as well. The metrics show high simulation quality of the temperature trend and of both the precipitation and temperature means for most climate models and study areas. In addition, we find high potential for model weighting in order to reduce uncertainty. These results are in line with other accepted evaluation metrics and studies. The comparison of the different 4-field approaches reveals high correlations for most metrics. The results of the metric-weighted probability density functions of climate change are heterogeneous. We find for different regions and seasons both increases and decreases of uncertainty. The analysis of the global study areas is consistent with that of the regional study areas in the Mediterranean.

  16. Linear Mixed Models: GUM and Beyond

    NASA Astrophysics Data System (ADS)

    Arendacká, Barbora; Täubner, Angelika; Eichstädt, Sascha; Bruns, Thomas; Elster, Clemens

    2014-04-01

    In Annex H.5, the Guide to the Evaluation of Uncertainty in Measurement (GUM) [1] recognizes the necessity to analyze certain types of experiments by applying random effects ANOVA models. These belong to the more general family of linear mixed models that we focus on in the current paper. Extending the short introduction provided by the GUM, our aim is to show that the more general linear mixed models cover a wider range of situations occurring in practice and can be beneficial when employed in data analysis of long-term repeated experiments. Namely, we point out their potential as an aid in establishing an uncertainty budget and as a means of gaining more insight into the measurement process. We also comment on computational issues, and to make the explanations less abstract, we illustrate all the concepts with the help of a measurement campaign conducted in order to challenge the uncertainty budget in calibration of accelerometers.
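    The random effects ANOVA setting of GUM Annex H.5 can be sketched with synthetic calibration data: repeated measurements grouped by day, with a between-day (reproducibility) variance component on top of within-day repeatability. The numbers are illustrative, not from the accelerometer campaign:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic long-term campaign: k days, n repeats per day, with a random
# day effect on top of repeatability noise.
k, n = 8, 5
day_effect = rng.normal(0.0, 0.4, k)
y = 10.0 + day_effect[:, None] + rng.normal(0.0, 0.2, (k, n))

# One-way random effects ANOVA, method-of-moments estimates:
day_means = y.mean(axis=1)
grand = y.mean()
msb = n * np.sum((day_means - grand) ** 2) / (k - 1)          # between-day MS
msw = np.sum((y - day_means[:, None]) ** 2) / (k * (n - 1))   # within-day MS

var_repeat = msw                        # repeatability variance
var_day = max((msb - msw) / n, 0.0)     # between-day (reproducibility) variance

# Standard uncertainty of the campaign mean, accounting for both components:
u_mean = np.sqrt(msb / (k * n))
```

    The two variance components feed directly into an uncertainty budget; a linear mixed model generalizes this to unbalanced designs and additional grouping factors.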

  17. Impacts of Process and Prediction Uncertainties on Projected Hanford Waste Glass Amount

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gervasio, V.; Kim, D. S.; Vienna, J. D.

    Analyses were performed to evaluate the impacts of using the advanced glass models, constraints, and uncertainty descriptions on projected Hanford glass mass. The maximum allowable waste oxide loading (WOL) was estimated for waste compositions while simultaneously satisfying all applicable glass property and composition constraints with sufficient confidence. Different components of prediction and composition/process uncertainties were systematically included in the calculations to evaluate their impacts on glass mass. The analyses estimated the production of 23,360 MT of immobilized high-level waste (IHLW) glass when no uncertainties were taken into account. Accounting for prediction and composition/process uncertainties resulted in a 5.01 relative percent increase in estimated glass mass, to 24,531 MT. Roughly equal impacts were found for prediction uncertainties (2.58 RPD) and composition/process uncertainties (2.43 RPD). The immobilized low-activity waste (ILAW) mass was predicted to be 282,350 MT without uncertainty and with waste loading "line" rules in place. Accounting for prediction and composition/process uncertainties resulted in only a 0.08 relative percent increase in estimated glass mass, to 282,562 MT. Without application of the line rules, the glass mass decreases by 10.6 relative percent (252,490 MT) for the case with no uncertainties. Addition of prediction uncertainties increases glass mass by 1.32 relative percent and the addition of composition/process uncertainties increases glass mass by an additional 7.73 relative percent (9.06 relative percent increase combined). The glass mass estimate without line rules (275,359 MT) was 2.55 relative percent lower than that with the line rules (282,562 MT), after accounting for all applicable uncertainties.

  18. Characterizing bias correction uncertainty in wheat yield predictions

    NASA Astrophysics Data System (ADS)

    Ortiz, Andrea Monica; Jones, Julie; Freckleton, Robert; Scaife, Adam

    2017-04-01

    Farming systems are under increased pressure due to current and future climate change, variability and extremes. Research on the impacts of climate change on crop production typically relies on the output of complex Global and Regional Climate Models, which are used as input to crop impact models. Yield predictions from these top-down approaches can have high uncertainty for several reasons, including diverse model construction and parameterization, future emissions scenarios, and inherent or response uncertainty. These uncertainties propagate down each step of the 'cascade of uncertainty' that flows from climate input to impact predictions, leading to yield predictions that may be too complex for their intended use in practical adaptation options. In addition to uncertainty from impact models, uncertainty can also stem from the intermediate steps that are used in impact studies to adjust climate model simulations to become more realistic when compared to observations, or to correct the spatial or temporal resolution of climate simulations, which are often not directly applicable as input into impact models. These important steps of bias correction or calibration also add uncertainty to final yield predictions, given the various approaches that exist to correct climate model simulations. In order to address how much uncertainty the choice of bias correction method can add to yield predictions, we use several evaluation runs from Regional Climate Models from the Coordinated Regional Downscaling Experiment over Europe (EURO-CORDEX) at different resolutions together with different bias correction methods (linear and variance scaling, power transformation, quantile-quantile mapping) as input to a statistical crop model for wheat, a staple European food crop. 
The objective of our work is to compare the resulting simulation-driven hindcasted wheat yields to climate observation-driven wheat yield hindcasts from the UK and Germany in order to determine ranges of yield uncertainty that result from different climate model simulation input and bias correction methods. We simulate wheat yields using a General Linear Model that includes the effects of seasonal maximum temperatures and precipitation, since wheat is sensitive to heat stress during important developmental stages. We use the same statistical model to predict future wheat yields using the recently available bias-corrected simulations of EURO-CORDEX-Adjust. While statistical models are often criticized for their lack of complexity, an advantage is that we are here able to consider only the effect of the choice of climate model, resolution or bias correction method on yield. Initial results using both past and future bias-corrected climate simulations with a process-based model will also be presented. Through these methods, we make recommendations in preparing climate model output for crop models.
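    Of the bias correction methods listed above, empirical quantile-quantile mapping can be sketched as follows. The temperature series are synthetic, not EURO-CORDEX output; the method maps each model value to the observed value at the same quantile of the calibration-period distributions:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic daily temperatures: biased model output vs. observations over a
# common calibration period, plus a future simulation to be corrected.
obs = rng.normal(12.0, 4.0, 3000)
model_hist = rng.normal(14.5, 5.5, 3000)      # too warm, too variable
model_future = rng.normal(16.0, 5.5, 3000)

def quantile_map(x, model_ref, obs_ref):
    """Empirical quantile-quantile mapping of x via calibration-period CDFs."""
    # Quantile of each value of x in the model's calibration distribution...
    quantiles = np.interp(x, np.sort(model_ref),
                          np.linspace(0.0, 1.0, model_ref.size))
    # ...mapped to the observed value at that same quantile.
    return np.quantile(obs_ref, quantiles)

corrected_hist = quantile_map(model_hist, model_hist, obs)
corrected_future = quantile_map(model_future, model_hist, obs)
```

    The choice of transfer function (here the empirical CDF, elsewhere linear scaling or a power transformation) is exactly the methodological degree of freedom whose contribution to yield uncertainty the study quantifies.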

  19. Determining the Uncertainties in Prescribed Burn Emissions Through Comparison of Satellite Estimates to Ground-based Estimates and Air Quality Model Evaluations in Southeastern US

    NASA Astrophysics Data System (ADS)

    Odman, M. T.; Hu, Y.; Russell, A. G.

    2016-12-01

    Prescribed burning is practiced throughout the US, and most widely in the Southeast, for the purpose of maintaining and improving ecosystems and reducing wildfire risk. However, prescribed burn emissions contribute significantly to the trace gas and particulate matter loads in the atmosphere. In places where air quality is already stressed by other anthropogenic emissions, prescribed burns can lead to major health and environmental problems. Air quality modeling efforts are under way to assess the impacts of prescribed burn emissions. Operational forecasts of the impacts are also emerging for use in dynamic management of air quality as well as the burns. Unfortunately, large uncertainties exist in the process of estimating prescribed burn emissions and these uncertainties limit the accuracy of the burn impact predictions. Prescribed burn emissions are estimated by using either ground-based information or satellite observations. When there is sufficient local information about the burn area, the types of fuels, their consumption amounts, and the progression of the fire, ground-based estimates are more accurate. In the absence of such information satellites remain the only reliable source for emission estimation. To determine the level of uncertainty in prescribed burn emissions, we compared estimates derived from a burn permit database and other ground-based information to the estimates by the Biomass Burning Emissions Product derived from a constellation of NOAA and NASA satellites. Using these emissions estimates we conducted simulations with the Community Multiscale Air Quality (CMAQ) model and predicted trace gas and particulate matter concentrations throughout the Southeast for two consecutive burn seasons (2015 and 2016). In this presentation, we will compare model predicted concentrations to measurements at monitoring stations and evaluate if the differences are commensurate with our emission uncertainty estimates. 
We will also investigate if spatial and temporal patterns in the differences reveal the sources of the uncertainty in the prescribed burn emission estimates.

  20. The Use of Bayesian Methods for Uncertainty Analysis and Evaluation of Biological Hypotheses in PBPK Models

    EPA Science Inventory

    Physiologically based pharmacokinetic (PBPK) models are compartmental models that describe the uptake and distribution of drugs and chemicals throughout the body. They can be structured so that model parameters (i.e., physiological and chemical-specific) reflect biological charac...

  1. Reconstructing Exposures from Biomarkers using Exposure-Pharmacokinetic Modeling - A Case Study with Carbaryl

    EPA Science Inventory

    Sources of uncertainty involved in exposure reconstruction for a short half-life chemical, carbaryl, were characterized using the Cumulative and Aggregate Risk Evaluation System (CARES), an exposure model, and a human physiologically based pharmacokinetic (PBPK) model. CARES was...

  2. A RETROSPECTIVE ANALYSIS OF MODEL UNCERTAINTY FOR FORECASTING HYDROLOGIC CHANGE

    EPA Science Inventory

    GIS-based hydrologic modeling offers a convenient means of assessing the impacts associated with land-cover/use change for environmental planning efforts. Alternative future scenarios can be used as input to hydrologic models and compared with existing conditions to evaluate pot...

  3. A Model-Based Prognostics Approach Applied to Pneumatic Valves

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Goebel, Kai

    2011-01-01

Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain knowledge of the system, its components, and how they fail by casting the underlying physical phenomena in a physics-based model derived from first principles. Uncertainty cannot be avoided in prediction; therefore, algorithms are employed that help manage these uncertainties. The particle filtering algorithm has become a popular choice for model-based prognostics due to its wide applicability, ease of implementation, and support for uncertainty management. We develop a general model-based prognostics methodology within a robust probabilistic framework using particle filters. As a case study, we consider a pneumatic valve from the Space Shuttle cryogenic refueling system. We develop a detailed physics-based model of the pneumatic valve, and perform comprehensive simulation experiments to illustrate our prognostics approach and evaluate its effectiveness and robustness. The approach is demonstrated using historical pneumatic valve data from the refueling system.
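The particle-filter prognostics workflow described in this abstract can be sketched with a toy example. Everything below (the linear degradation model, noise levels, and failure threshold) is a hypothetical stand-in, not the paper's valve physics: a bootstrap particle filter tracks a damage state and its uncertain growth rate, then each particle is propagated to the failure threshold to yield a remaining-useful-life (RUL) distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical degradation model: damage d grows at an uncertain rate w.
true_rate, dt, threshold = 0.02, 1.0, 1.0
d_true = 0.0

# Bootstrap particle filter over the state [damage, rate]
N = 1000
particles = np.column_stack([np.zeros(N), rng.normal(0.02, 0.01, N)])
meas_std = 0.05

for t in range(20):
    d_true += true_rate * dt
    z = d_true + rng.normal(0, meas_std)           # noisy damage measurement
    particles[:, 0] += particles[:, 1] * dt        # propagate damage
    particles[:, 1] += rng.normal(0, 0.001, N)     # process noise on rate
    w = np.exp(-0.5 * ((z - particles[:, 0]) / meas_std) ** 2)
    w /= w.sum()                                   # likelihood weights
    particles = particles[rng.choice(N, size=N, p=w)]  # resample

# Prognostics step: propagate each particle to the failure threshold
rul = (threshold - particles[:, 0]) / np.maximum(particles[:, 1], 1e-6)
print(np.percentile(rul, [5, 50, 95]))             # RUL distribution quantiles
```

The percentile spread is the uncertainty-managed prediction the abstract refers to: not a single failure time, but a distribution over remaining life.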

  4. Assessing concentration uncertainty estimates from passive microwave sea ice products

    NASA Astrophysics Data System (ADS)

    Meier, W.; Brucker, L.; Miller, J. A.

    2017-12-01

    Sea ice concentration is an essential climate variable and passive microwave derived estimates of concentration are one of the longest satellite-derived climate records. However, until recently uncertainty estimates were not provided. Numerous validation studies provided insight into general error characteristics, but the studies have found that concentration error varied greatly depending on sea ice conditions. Thus, an uncertainty estimate from each observation is desired, particularly for initialization, assimilation, and validation of models. Here we investigate three sea ice products that include an uncertainty for each concentration estimate: the NASA Team 2 algorithm product, the EUMETSAT Ocean and Sea Ice Satellite Application Facility (OSI-SAF) product, and the NOAA/NSIDC Climate Data Record (CDR) product. Each product estimates uncertainty with a completely different approach. The NASA Team 2 product derives uncertainty internally from the algorithm method itself. The OSI-SAF uses atmospheric reanalysis fields and a radiative transfer model. The CDR uses spatial variability from two algorithms. Each approach has merits and limitations. Here we evaluate the uncertainty estimates by comparing the passive microwave concentration products with fields derived from the NOAA VIIRS sensor. The results show that the relationship between the product uncertainty estimates and the concentration error (relative to VIIRS) is complex. This may be due to the sea ice conditions, the uncertainty methods, as well as the spatial and temporal variability of the passive microwave and VIIRS products.

  5. Pre-fire and post-fire surface fuel and cover measurements collected in the southeastern United States for model evaluation and development - RxCADRE 2008, 2011 and 2012

    Treesearch

    Roger D. Ottmar; Andrew T. Hudak; Susan J. Prichard; Clinton S. Wright; Joseph C. Restaino; Maureen C. Kennedy; Robert E. Vihnanek

    2016-01-01

    A lack of independent, quality-assured data prevents scientists from effectively evaluating predictions and uncertainties in fire models used by land managers. This paper presents a summary of pre-fire and post-fire fuel, fuel moisture and surface cover fraction data that can be used for fire model evaluation and development. The data were collected in the...

  6. Application of Probabilistic Analysis to Aircraft Impact Dynamics

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.; Padula, Sharon L.; Stockwell, Alan E.

    2003-01-01

Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as geometrically accurate models, human occupant models, and advanced material models that include nonlinear stress-strain behaviors, laminated composites, and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantify the uncertainty in the simulated responses. Several criteria are used to determine that a response surface method is the most appropriate probabilistic approach. The work is extended to compare optimization results with and without probabilistic constraints.
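The response surface method selected in this abstract can be illustrated with a small sketch. The "expensive" model, its two inputs, and the exceedance threshold below are all hypothetical: a quadratic surface is fit to a handful of model runs, and the cheap surrogate is then sampled by Monte Carlo to estimate a probability.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical expensive crash response: peak acceleration as a function
# of two normalized uncertain inputs (e.g. impact velocity, stiffness).
def expensive_model(v, s):
    return 10 + 2.0 * v + 1.5 * s + 0.3 * v * s + 0.1 * v**2

# Fit a quadratic response surface to a small 5x5 design of experiments
V, S = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
v, s = V.ravel(), S.ravel()
X = np.column_stack([np.ones_like(v), v, s, v * s, v**2, s**2])
coef, *_ = np.linalg.lstsq(X, expensive_model(v, s), rcond=None)

# Monte Carlo on the cheap surrogate: P(response > 12)
vmc = rng.normal(0, 0.5, 100_000)
smc = rng.normal(0, 0.5, 100_000)
Xmc = np.column_stack([np.ones_like(vmc), vmc, smc, vmc * smc, vmc**2, smc**2])
p_exceed = (Xmc @ coef > 12.0).mean()
print(p_exceed)
```

The 25 surface-building runs replace the hundreds of thousands of direct evaluations the Monte Carlo step would otherwise require, which is the practical appeal of the approach for transient crash codes.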

  7. Global Sensitivity Analysis for Process Identification under Model Uncertainty

    NASA Astrophysics Data System (ADS)

    Ye, M.; Dai, H.; Walker, A. P.; Shi, L.; Yang, J.

    2015-12-01

The environmental system consists of various physical, chemical, and biological processes, and environmental models are built to simulate these processes and their interactions. For model building, improvement, and validation, it is necessary to identify important processes so that limited resources can be used to better characterize them. While global sensitivity analysis has been widely used to identify important processes, the identification is usually based on a deterministic process conceptualization that uses a single model to represent each process. However, environmental systems are complex, and it often happens that a single process may be simulated by multiple alternative models. Ignoring this model uncertainty in process identification may lead to biased identification, in that processes identified as important may not be so in the real world. This study addresses the problem by developing a new method of global sensitivity analysis for process identification. The new method is based on the concepts of Sobol sensitivity analysis and model averaging. Similar to the Sobol sensitivity analysis used to identify important parameters, our new method evaluates the variance change when a process is fixed at its different conceptualizations. The variance considers both parametric and model uncertainty using the method of model averaging. The method is demonstrated using a synthetic groundwater modeling study that considers a recharge process and a parameterization process, each with two alternative models. Important processes of groundwater flow and transport are evaluated using our new method. The method is mathematically general and can be applied to a wide range of environmental problems.
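The variance-based idea can be sketched in miniature. The two processes, their alternative models, and the parameter ranges below are hypothetical toys (not the paper's groundwater system): the conceptualization choice is treated as a random input, and a crude first-order "process sensitivity" is the variance of the conditional mean over that choice, normalized by total variance, in the spirit of a Sobol index under model averaging.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical processes, each with two alternative conceptualizations
def recharge(choice, p):        # alternative recharge models
    return 1.5 * p if choice == 0 else 1.0 + p

def head(choice, r, q):         # alternative parameterization models
    return r * q if choice == 0 else r + 0.5 * q

def sample(n):
    c1 = rng.integers(0, 2, n)              # recharge conceptualization
    c2 = rng.integers(0, 2, n)              # parameterization conceptualization
    p = rng.uniform(0.5, 1.5, n)            # recharge parameter
    q = rng.uniform(0.8, 1.2, n)            # parameterization parameter
    y = np.array([head(c2[i], recharge(c1[i], p[i]), q[i]) for i in range(n)])
    return c1, y

c1, y = sample(20_000)

# Variance of the conditional mean over the recharge conceptualization,
# normalized by total variance (two equally likely levels)
cond_means = [y[c1 == k].mean() for k in (0, 1)]
S_recharge = np.var(cond_means) / np.var(y)
print(round(S_recharge, 3))
```

A value near 0 would say the recharge conceptualization barely matters; a value near 1 would say it dominates the output variance, parametric uncertainty included.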

  8. INNOVATIVE METHODS FOR EMISSION INVENTORY DEVELOPMENT AND EVALUATION: WORKSHOP SYNTHESIS

    EPA Science Inventory

    Emission inventories are key databases for evaluating, managing, and regulating air pollutants. Refinements and innovations in instruments that measure air pollutants, models that calculate emissions, and techniques for data management and uncertainty assessment are critical to ...

  9. JUPITER PROJECT - JOINT UNIVERSAL PARAMETER IDENTIFICATION AND EVALUATION OF RELIABILITY

    EPA Science Inventory

    The JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) project builds on the technology of two widely used codes for sensitivity analysis, data assessment, calibration, and uncertainty analysis of environmental models: PEST and UCODE.

  10. Comparison of beam position calculation methods for application in digital acquisition systems

    NASA Astrophysics Data System (ADS)

    Reiter, A.; Singh, R.

    2018-05-01

    Different approaches to the data analysis of beam position monitors in hadron accelerators are compared adopting the perspective of an analog-to-digital converter in a sampling acquisition system. Special emphasis is given to position uncertainty and robustness against bias and interference that may be encountered in an accelerator environment. In a time-domain analysis of data in the presence of statistical noise, the position calculation based on the difference-over-sum method with algorithms like signal integral or power can be interpreted as a least-squares analysis of a corresponding fit function. This link to the least-squares method is exploited in the evaluation of analysis properties and in the calculation of position uncertainty. In an analytical model and experimental evaluations the positions derived from a straight line fit or equivalently the standard deviation are found to be the most robust and to offer the least variance. The measured position uncertainty is consistent with the model prediction in our experiment, and the results of tune measurements improve significantly.
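The difference-over-sum calculation and its least-squares interpretation can be sketched numerically. The pulse shape, monitor constant, and noise level below are hypothetical, not the paper's accelerator data: two electrode signals share a pulse whose imbalance encodes the beam offset, and the same offset is recovered both from the signal-integral algorithm and from a straight-line fit of one electrode against the other.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical sampled BPM signals: electrodes A and B see the same pulse,
# with an imbalance proportional to the beam offset x (monitor constant k).
k, x_true = 10.0, 1.2                       # mm
t = np.linspace(0.0, 1.0, 200)
pulse = np.exp(-0.5 * ((t - 0.5) / 0.1) ** 2)
A = (1 + x_true / k) * pulse + rng.normal(0, 0.01, t.size)
B = (1 - x_true / k) * pulse + rng.normal(0, 0.01, t.size)

# Difference-over-sum with the signal-integral algorithm
delta = (A.sum() - B.sum()) / (A.sum() + B.sum())
x_sum = k * delta

# Least-squares view: fit B = s * A (straight line through the origin);
# the slope encodes the same imbalance, delta = (1 - s) / (1 + s)
s = (A @ B) / (A @ A)
x_fit = k * (1 - s) / (1 + s)
print(x_sum, x_fit)
```

Both estimators land on the same offset here; the point of the paper is that their noise and bias behavior differ once realistic interference enters, and the straight-line-fit view makes the least-squares machinery for the position uncertainty explicit.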

  11. [Evaluation of uncertainty for determination of tin and its compounds in air of workplace by flame atomic absorption spectrometry].

    PubMed

    Wei, Qiuning; Wei, Yuan; Liu, Fangfang; Ding, Yalei

    2015-10-01

To investigate a method for evaluating the uncertainty of the determination of tin and its compounds in workplace air by flame atomic absorption spectrometry. The national occupational health standards GBZ/T160.28-2004 and JJF1059-1999 were used to build a mathematical model of the determination of tin and its compounds in workplace air and to calculate the components of uncertainty. In the determination of tin and its compounds in workplace air using flame atomic absorption spectrometry, the relative standard uncertainties for the concentration of the standard solution, the atomic absorption spectrophotometer, sample digestion, parallel determination, least-squares fitting of the calibration curve, and sample collection were 0.436%, 0.13%, 1.07%, 1.65%, 3.05%, and 2.89%, respectively. The expanded relative uncertainty was 9.3% (k=2). The concentration of tin in the test sample was 0.132 mg/m³, and the expanded uncertainty of the measurement was 0.012 mg/m³ (k=2). The dominant uncertainties in the determination of tin and its compounds in workplace air come from least-squares fitting of the calibration curve and from sample collection. Quality control should be improved in calibration-curve fitting and sample collection.
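The abstract's uncertainty budget follows the standard root-sum-of-squares combination of relative standard uncertainties; the 9.3% figure matches the k=2 expansion of the combined value. A minimal check of that arithmetic, using only the component values quoted above:

```python
import math

# Relative standard uncertainties from the abstract (%)
components = {
    "standard solution": 0.436,
    "spectrophotometer": 0.13,
    "sample digestion": 1.07,
    "parallel determination": 1.65,
    "calibration-curve fit": 3.05,
    "sample collection": 2.89,
}

u_c = math.sqrt(sum(u**2 for u in components.values()))  # combined, %
U = 2 * u_c                                              # expanded, k=2
print(round(u_c, 2), round(U, 1))
```

The combined relative uncertainty comes out near 4.66%, and doubling it (coverage factor k=2, roughly 95% coverage) reproduces the quoted 9.3%, consistent with 0.012 mg/m³ on a 0.132 mg/m³ result.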

  12. A prospective earthquake forecast experiment in the western Pacific

    NASA Astrophysics Data System (ADS)

    Eberhard, David A. J.; Zechar, J. Douglas; Wiemer, Stefan

    2012-09-01

Since the beginning of 2009, the Collaboratory for the Study of Earthquake Predictability (CSEP) has been conducting an earthquake forecast experiment in the western Pacific. This experiment is an extension of the Kagan-Jackson experiments begun 15 years earlier and is a prototype for future global earthquake predictability experiments. At the beginning of each year, seismicity models make a spatially gridded forecast of the number of Mw ≥ 5.8 earthquakes expected in the next year. For the three participating statistical models, we analyse the first two years of this experiment. We use likelihood-based metrics to evaluate the consistency of the forecasts with the observed target earthquakes and we apply measures based on Student's t-test and the Wilcoxon signed-rank test to compare the forecasts. Overall, a simple smoothed seismicity model (TripleS) performs the best, but there are some exceptions that indicate continued experiments are vital to fully understand the stability of these models, the robustness of model selection and, more generally, earthquake predictability in this region. We also estimate uncertainties in our results that are caused by uncertainties in earthquake location and seismic moment. Our uncertainty estimates are relatively small and suggest that the evaluation metrics are relatively robust. Finally, we consider the implications of our results for a global earthquake forecast experiment.

  13. Optimisation of lateral car dynamics taking into account parameter uncertainties

    NASA Astrophysics Data System (ADS)

    Busch, Jochen; Bestle, Dieter

    2014-02-01

Simulation studies on an active all-wheel-steering car show that disturbances of vehicle parameters have a strong influence on lateral car dynamics. This motivates the need for a design that is robust against such parameter uncertainties. A specific parametrisation is established combining deterministic, velocity-dependent steering control parameters with partly uncertain, velocity-independent vehicle parameters for simultaneous use in a numerical optimisation process. Model-based objectives are formulated and summarised in a multi-objective optimisation problem where especially the lateral steady-state behaviour is improved by an adaption strategy based on measurable uncertainties. The normally distributed uncertainties are generated by optimal Latin hypercube sampling, and a response-surface-based strategy helps to cut down time-consuming model evaluations, which makes it feasible to use a genetic optimisation algorithm. Optimisation results are discussed in different criterion spaces, and the achieved improvements confirm the validity of the proposed procedure.

  14. Characterisation of soft magnetic materials by measurement: Evaluation of uncertainties up to 1.8 T and 9 kHz

    NASA Astrophysics Data System (ADS)

    Elfgen, S.; Franck, D.; Hameyer, K.

    2018-04-01

Magnetic measurements are indispensable for the characterisation of soft magnetic materials used, e.g., in electrical machines. Characteristic values are used for quality control during production and for the parametrisation of material models. Uncertainties and errors in the measurements are reflected directly in the parameters of the material models. This can result in over-dimensioning and inaccuracies in simulations for the design of electrical machines. Therefore, the influencing factors in the characterisation of soft magnetic materials are identified and their resulting uncertainty contributions are studied. The analysis of the resulting uncertainty contributions can serve the operator as an additional selection criterion for different measuring sensors. The investigation is performed for measurements within and outside the currently prescribed standard using a single sheet tester, and the impact on the identification of iron-loss parameters is studied.

  15. Uncertainty in Analyzed Water and Energy Budgets at Continental Scales

    NASA Technical Reports Server (NTRS)

    Bosilovich, Michael G.; Robertson, F. R.; Mocko, D.; Chen, J.

    2011-01-01

Operational analyses and retrospective analyses provide all the physical terms of water and energy budgets, guided by the assimilation of atmospheric observations. However, there is significant reliance on the numerical models, and so uncertainty in the budget terms is always present. Here, we use a recently developed data set consisting of a mix of 10 analyses (both operational and retrospective) to quantify the uncertainty of analyzed water and energy budget terms for GEWEX continental-scale regions, following the evaluation of Dr. John Roads using individual reanalysis data sets.

  16. Essays in energy policy and planning modeling under uncertainty: Value of information, optimistic biases, and simulation of capacity markets

    NASA Astrophysics Data System (ADS)

    Hu, Ming-Che

Optimization and simulation are popular operations research and systems analysis tools for energy policy modeling. This dissertation addresses three important questions concerning the use of these tools for energy market (and electricity market) modeling and planning under uncertainty. (1) What is the value of information and the cost of disregarding different sources of uncertainty for the U.S. energy economy? (2) Could model-based calculations of the performance (social welfare) of competitive and oligopolistic market equilibria be optimistically biased due to uncertainties in objective function coefficients? (3) How do alternative sloped demand curves perform in the PJM capacity market under economic and weather uncertainty? How do curve adjustment and cost dynamics affect the capacity market outcomes? To address the first question, two-stage stochastic optimization is utilized in the U.S. national MARKAL energy model; then the value of information and the cost of ignoring uncertainty are estimated for three uncertainties: carbon cap policy, load growth, and natural gas prices. When an uncertainty is important, explicitly considering those risks when making investments will result in better performance in expectation (a positive expected cost of ignoring uncertainty). Furthermore, eliminating the uncertainty would improve strategies even further, meaning that improved forecasts of future conditions are valuable (i.e., a positive expected value of information). Also, the value of policy coordination shows the difference between a strategy developed under the incorrect assumption of no carbon cap and a strategy correctly anticipating imposition of such a cap. For the second question, game theory models are formulated and the existence of optimistic (positive) biases in market equilibria (both competitive and oligopolistic markets) is proved, in that calculated social welfare and producer profits will, in expectation, exceed the values that will actually be received.
Theoretical analyses prove the general existence of this bias for both competitive and oligopolistic models when production costs and demand curves are uncertain. Also demonstrated is an optimistic bias for the net benefits of introducing a new technology into a market when the cost of the new technology is uncertain. The optimistic biases are quantified for a model of the northwest European electricity market (including Belgium, France, Germany, and the Netherlands). Demand uncertainty results in an optimistic bias of 150,000-220,000 [Euro]/hr of total surplus, and natural gas price uncertainty yields a smaller bias of 8,000-10,000 [Euro]/hr for total surplus. Further, adding a new uncertain technology (biomass) to the set of possible generation methods almost doubles the optimistic bias (14,000-18,000 [Euro]/hr). The third question concerns ex ante evaluation of the Reliability Pricing Model (RPM), the new PJM capacity market launched in June 2007. A Monte Carlo simulation model is developed to simulate the PJM capacity market and predict market performance, producer revenue, and consumer payments. An important input to RPM is a demand curve for capacity; several alternative demand curves are compared, and sensitivity analyses of the conclusions are conducted. One conclusion is that sloped demand curves are more robust because they give higher reliability with lower consumer payments. In addition, the performance of the curves is evaluated for a more sophisticated market design in which the demand curve can be adjusted in response to previous market outcomes and where capital costs may change unexpectedly. The simulation shows that curve adjustment increases system reliability with lower consumer payments. Also, the effect of learning-by-doing, leading to lower plant capital costs, leads to a higher average reserve margin and lower consumer payments. In contrast, a sudden rise in capital costs causes a decrease in reliability and an increase in consumer payments.
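The value-of-information quantities from the first question can be made concrete with a toy two-stage problem. The demand scenarios, probabilities, and costs below are hypothetical, not MARKAL data: capacity is chosen before demand is known, shortfall is bought at a spot premium, and comparing the stochastic, perfect-information, and mean-value solutions yields the expected value of perfect information (EVPI) and the expected cost of ignoring uncertainty.

```python
import numpy as np

# Hypothetical two-stage capacity problem under demand uncertainty
demand = np.array([80.0, 100.0, 120.0])
prob = np.array([0.5, 0.2, 0.3])
build_cost, spot_cost = 1.0, 3.0

def cost(x, d):
    return build_cost * x + spot_cost * np.maximum(d - x, 0.0)

xs = np.arange(0.0, 161.0)

# (a) Here-and-now: minimize expected cost over scenarios
exp_cost = np.array([(prob * cost(x, demand)).sum() for x in xs])
x_here_now = xs[exp_cost.argmin()]
rp = exp_cost.min()

# (b) Wait-and-see: perfect information, optimize per scenario
ws = (prob * np.array([min(cost(x, d) for x in xs) for d in demand])).sum()

# (c) Mean-value solution: plan for expected demand, then face reality
x_ev = float(demand @ prob)
eev = (prob * cost(x_ev, demand)).sum()

print("EVPI:", rp - ws, " cost of ignoring uncertainty:", eev - rp)
```

Here the stochastic solution builds more than mean demand (a hedge), so the mean-value plan costs more in expectation (a positive cost of ignoring uncertainty), and eliminating the uncertainty entirely would be worth even more (a positive EVPI), mirroring the dissertation's findings.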

  17. Hierarchical stochastic modeling of large river ecosystems and fish growth across spatio-temporal scales and climate models: the Missouri River endangered pallid sturgeon example

    USGS Publications Warehouse

    Wildhaber, Mark L.; Wikle, Christopher K.; Moran, Edward H.; Anderson, Christopher J.; Franz, Kristie J.; Dey, Rima

    2017-01-01

We present a hierarchical series of spatially decreasing and temporally increasing models to evaluate the uncertainty in the atmosphere-ocean global climate model (AOGCM) and the regional climate model (RCM) relative to the uncertainty in the somatic growth of the endangered pallid sturgeon (Scaphirhynchus albus). For effects on fish populations of riverine ecosystems, climate output simulated by coarse-resolution AOGCMs and RCMs must be downscaled to basins to river hydrology to population response. One needs to transfer the information from these climate simulations down to the individual scale in a way that minimizes extrapolation and can account for spatio-temporal variability in the intervening stages. The goal is a framework to determine whether, given uncertainties in the climate models and the biological response, meaningful inference can still be made. The non-linear downscaling of climate information to the river scale requires that one realistically account for spatial and temporal variability across scale. Our downscaling procedure includes the use of fixed/calibrated hydrological flow and temperature models coupled with a stochastically parameterized sturgeon bioenergetics model. We show that, although there is a large amount of uncertainty associated with both the climate model output and the fish growth process, one can establish significant differences in fish growth distributions between models, and between future and current climates for a given model.

  18. Aeroelastic Uncertainty Quantification Studies Using the S4T Wind Tunnel Model

    NASA Technical Reports Server (NTRS)

    Nikbay, Melike; Heeg, Jennifer

    2017-01-01

This paper originates from the joint efforts of an aeroelastic study team in the Applied Vehicle Technology Panel of the NATO Science and Technology Organization, Task Group AVT-191, titled "Application of Sensitivity Analysis and Uncertainty Quantification to Military Vehicle Design." We present aeroelastic uncertainty quantification studies using the SemiSpan Supersonic Transport wind tunnel model at the NASA Langley Research Center. The aeroelastic study team decided to treat both structural and aerodynamic input parameters as uncertain and represent them as samples drawn from statistical distributions, propagating them through aeroelastic analysis frameworks. Uncertainty quantification processes require many function evaluations to assess the impact of variations in numerous parameters on the vehicle characteristics, rapidly increasing the computational time requirement relative to that required to assess a system deterministically. The increased computational time is particularly prohibitive if high-fidelity analyses are employed. As a remedy, the Istanbul Technical University team employed an Euler solver in an aeroelastic analysis framework and implemented reduced-order modeling with Polynomial Chaos Expansion and Proper Orthogonal Decomposition to perform the uncertainty propagation. The NASA team chose to reduce the prohibitive computational time by employing linear solution processes. The NASA team also focused on determining input sample distributions.
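The Polynomial Chaos Expansion idea mentioned above can be shown in its simplest one-dimensional form. The response function is a hypothetical stand-in, not the S4T aeroelastic model: for a single standard-normal input, the response is projected onto probabilists' Hermite polynomials via Gauss-Hermite quadrature, and the mean and variance are read directly off the expansion coefficients instead of brute-force sampling.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

# Hypothetical monotone response of one standard-normal input xi
def response(xi):
    return np.exp(0.3 * xi)

order = 6
nodes, weights = hermegauss(order + 1)   # Gauss-HermiteE quadrature rule
weights = weights / weights.sum()        # normalize to E[.] under N(0,1)

# Project onto Hermite polynomials He_k: c_k = <f, He_k> / <He_k, He_k>,
# where <He_k, He_k> = k! under the standard normal weight
coeffs = []
for k in range(order + 1):
    He_k = hermeval(nodes, [0] * k + [1])
    coeffs.append((weights * response(nodes) * He_k).sum() / math.factorial(k))

mean_pce = coeffs[0]
var_pce = sum(math.factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))
print(mean_pce, var_pce)
```

Seven model evaluations recover the lognormal mean and variance to high accuracy here; the computational payoff the abstract alludes to comes from exactly this kind of replacement of many-sample Monte Carlo with a few well-placed solver runs.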

  19. Robustness of risk maps and survey networks to knowledge gaps about a new invasive pest.

    PubMed

    Yemshanov, Denys; Koch, Frank H; Ben-Haim, Yakov; Smith, William D

    2010-02-01

    In pest risk assessment it is frequently necessary to make management decisions regarding emerging threats under severe uncertainty. Although risk maps provide useful decision support for invasive alien species, they rarely address knowledge gaps associated with the underlying risk model or how they may change the risk estimates. Failure to recognize uncertainty leads to risk-ignorant decisions and miscalculation of expected impacts as well as the costs required to minimize these impacts. Here we use the information gap concept to evaluate the robustness of risk maps to uncertainties in key assumptions about an invading organism. We generate risk maps with a spatial model of invasion that simulates potential entries of an invasive pest via international marine shipments, their spread through a landscape, and establishment on a susceptible host. In particular, we focus on the question of how much uncertainty in risk model assumptions can be tolerated before the risk map loses its value. We outline this approach with an example of a forest pest recently detected in North America, Sirex noctilio Fabricius. The results provide a spatial representation of the robustness of predictions of S. noctilio invasion risk to uncertainty and show major geographic hotspots where the consideration of uncertainty in model parameters may change management decisions about a new invasive pest. We then illustrate how the dependency between the extent of uncertainties and the degree of robustness of a risk map can be used to select a surveillance network design that is most robust to knowledge gaps about the pest.
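The info-gap robustness question posed above ("how much uncertainty can be tolerated before the risk map loses its value?") can be sketched with a one-parameter toy. The risk model, nominal spread rate, and acceptable-risk threshold below are hypothetical, not the S. noctilio model: robustness is the largest uncertainty horizon around the nominal rate for which the worst-case risk still satisfies the requirement.

```python
import numpy as np

# Hypothetical invasion-risk model, increasing in the spread rate p
def risk(p):
    return 1 - np.exp(-3.0 * p)

p0, r_crit = 0.2, 0.6     # nominal spread rate, acceptable risk level

def robustness(p0, r_crit, alphas=np.linspace(0, 1, 1001)):
    # Largest horizon alpha such that the worst case within the
    # uncertainty set, p = p0 * (1 + alpha), still meets risk <= r_crit
    ok = risk(p0 * (1 + alphas)) <= r_crit
    return alphas[ok].max() if ok.any() else 0.0

print(robustness(p0, r_crit))
```

A large robustness value means decisions based on the nominal risk map survive sizable knowledge gaps about the pest; a robustness near zero flags exactly the "hotspot" situation the abstract describes, where uncertainty in model parameters may change management decisions.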

  20. Investigating the effect and uncertainties of light absorbing impurities in snow and ice on snow melt and discharge generation using a hydrologic catchment model and satellite data

    NASA Astrophysics Data System (ADS)

    Matt, Felix; Burkhart, John F.

    2017-04-01

    Light absorbing impurities in snow and ice (LAISI) originating from atmospheric deposition enhance snow melt by increasing the absorption of short wave radiation. The consequences are a shortening of the snow cover duration due to increased snow melt and, with respect to hydrologic processes, a temporal shift in the discharge generation. However, the magnitude of these effects as simulated in numerical models have large uncertainties, originating mainly from uncertainties in the wet and dry deposition of light absorbing aerosols, limitations in the model representation of the snowpack, and the lack of observable variables required to estimate model parameters and evaluate the simulated variables connected with the representation of LAISI. This leads to high uncertainties in the additional energy absorbed by the snow due to the presence of LAISI, a key variable in understanding snowpack energy-balance dynamics. In this study, we assess the effect of LAISI on snow melt and discharge generation and the involved uncertainties in a high mountain catchment located in the western Himalayas by using a distributed hydrological catchment model with focus on the representation of the seasonal snow pack. The snow albedo is hereby calculated from a radiative transfer model for snow, taking the increased absorption of short wave radiation by LAISI into account. Meteorological forcing data is generated from an assimilation of observations and high resolution WRF simulations, and LAISI mixing ratios from deposition rates of Black Carbon simulated with the FLEXPART model. To asses the quality of our simulations and the related uncertainties, we compare the simulated additional energy absorbed by the snow due to the presence of LAISI to the MODIS Dust Radiative Forcing in Snow (MODDRFS) algorithm satellite product.

  1. Adjoint-Based Implicit Uncertainty Analysis for Figures of Merit in a Laser Inertial Fusion Engine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seifried, J E; Fratoni, M; Kramer, K J

A primary purpose of computational models is to inform design decisions, and in order to make those decisions reliably, the confidence in the results of such models must be estimated. Monte Carlo neutron transport models are common tools for reactor designers. These types of models contain several sources of uncertainty that propagate onto the model predictions. Two uncertainties worthy of note are (1) experimental and evaluation uncertainties of the nuclear data that inform all neutron transport models and (2) statistical counting precision, which all results of Monte Carlo codes contain. Adjoint-based implicit uncertainty analyses allow for the consideration of any number of uncertain input quantities and their effects upon the confidence of figures of merit with only a handful of forward and adjoint transport calculations. When considering a rich set of uncertain inputs, adjoint-based methods remain hundreds of times more computationally efficient than direct Monte Carlo methods. The LIFE (Laser Inertial Fusion Energy) engine is a concept being developed at Lawrence Livermore National Laboratory. Various options exist for the LIFE blanket, depending on the mission of the design. The depleted uranium hybrid LIFE blanket design strives to close the fission fuel cycle without enrichment or reprocessing, while simultaneously achieving high discharge burnups with reduced proliferation concerns. Neutron transport results that are central to the operation of the design are tritium production for fusion fuel, fission of fissile isotopes for energy multiplication, and production of fissile isotopes for sustained power. In previous work, explicit cross-sectional uncertainty analyses were performed for reaction rates related to the figures of merit for the depleted uranium hybrid LIFE blanket. Counting precision was also quantified for both the figures of merit themselves and the cross-sectional uncertainty estimates to gauge the validity of the analysis.
All cross-sectional uncertainties were small (0.1-0.8%), bounded counting uncertainties, and were precise with regard to counting precision. Adjoint/importance distributions were generated for the same reaction rates. The current work leverages those adjoint distributions to transition from explicit sensitivities, in which the neutron flux is constrained, to implicit sensitivities, in which the neutron flux responds to input perturbations. This treatment vastly expands the set of data that contribute to uncertainties to produce larger, more physically accurate uncertainty estimates.

  2. Evaluating variability and uncertainty separately in microbial quantitative risk assessment using two R packages.

    PubMed

    Pouillot, Régis; Delignette-Muller, Marie Laure

    2010-09-01

    Quantitative risk assessment has emerged as a valuable tool to enhance the scientific basis of regulatory decisions in the food safety domain. This article introduces the use of two new computing resources (R packages) specifically developed to help risk assessors in their projects. The first package, "fitdistrplus", gathers tools for choosing and fitting a parametric univariate distribution to a given dataset. The data may be continuous or discrete. Continuous data may be right-, left- or interval-censored as is frequently obtained with analytical methods, with the possibility of various censoring thresholds within the dataset. Bootstrap procedures then allow the assessor to evaluate and model the uncertainty around the parameters and to transfer this information into a quantitative risk assessment model. The second package, "mc2d", helps to build and study two dimensional (or second-order) Monte-Carlo simulations in which the estimation of variability and uncertainty in the risk estimates is separated. This package easily allows the transfer of separated variability and uncertainty along a chain of conditional mathematical and probabilistic models. The usefulness of these packages is illustrated through a risk assessment of hemolytic and uremic syndrome in children linked to the presence of Escherichia coli O157:H7 in ground beef. These R packages are freely available at the Comprehensive R Archive Network (cran.r-project.org). Copyright 2010 Elsevier B.V. All rights reserved.
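The two-dimensional Monte Carlo idea implemented by "mc2d" can be sketched in a few lines. The distributions and dose-response below are hypothetical toys, not the article's E. coli O157:H7 model, and the sketch is in Python rather than R: one simulation dimension carries parameter uncertainty, the other carries inter-individual variability, and the two are summarized separately.

```python
import numpy as np

rng = np.random.default_rng(3)

# Second-order Monte Carlo: uncertainty dimension x variability dimension
n_unc, n_var = 500, 2000

# Uncertainty dimension: imperfect knowledge of the exposure log-mean mu
mu = rng.normal(-1.0, 0.2, n_unc)

# Variability dimension: inter-individual exposure for each uncertain mu
expo = np.exp(mu[:, None] + 0.5 * rng.normal(0, 1, (n_unc, n_var)))
risk = 1 - np.exp(-0.01 * expo)          # toy dose-response model

# Summarize variability within each uncertainty draw (95th percentile
# individual), then uncertainty across draws (interval over percentiles)
p95 = np.percentile(risk, 95, axis=1)
print(np.percentile(p95, [2.5, 50, 97.5]))
```

The final interval answers a question a one-dimensional simulation cannot: how sure are we about the risk faced by a highly exposed individual, with the "how sure" part driven only by the uncertainty dimension.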

  3. Uncertainty Evaluation of Computational Model Used to Support the Integrated Powerhead Demonstration Project

    NASA Technical Reports Server (NTRS)

    Steele, W. G.; Molder, K. J.; Hudson, S. T.; Vadasy, K. V.; Rieder, P. T.; Giel, T.

    2005-01-01

    NASA and the U.S. Air Force are working on a joint project to develop a new hydrogen-fueled, full-flow, staged combustion rocket engine. The initial testing and modeling work for the Integrated Powerhead Demonstrator (IPD) project is being performed by NASA Marshall and Stennis Space Centers. A key factor in the testing of this engine is the ability to predict and measure the transient fluid flow during engine start and shutdown phases of operation. A model built by NASA Marshall in the ROCket Engine Transient Simulation (ROCETS) program is used to predict transient engine fluid flows. The model is initially calibrated to data from previous tests on the Stennis E1 test stand. The model is then used to predict the next run. Data from this run can then be used to recalibrate the model, providing a tool to guide the test program in incremental steps and reduce the risk to the prototype engine. In this paper, this type of model is defined as a calibrated model. The paper proposes a method to estimate the uncertainty of a model calibrated to a set of experimental test data. The method is similar to that used in the calibration of experimental instrumentation. For the IPD example used in this paper, the model uncertainty is determined for both LOX and LH flow rates using previous data. The model is then successfully used to predict another similar test run within the uncertainty bounds. The paper summarizes the uncertainty methodology when a model is continually recalibrated with new test data. The methodology is general and can be applied to other calibrated models.

  4. NGA West 2 | Pacific Earthquake Engineering Research Center

    Science.gov Websites

    , multi-year research program to improve Next Generation Attenuation models for active tectonic regions earthquake engineering, including modeling of directivity and directionality; verification of NGA-West models epistemic uncertainty; and evaluation of soil amplification factors in NGA models versus NEHRP site factors

  5. Evaluating the impacts of agricultural land management practices: A probabilistic hydrologic modeling approach

    USDA-ARS?s Scientific Manuscript database

    The complexity of the hydrologic system challenges the development of models. One issue faced during the model development stage is the uncertainty involved in model parameterization. Using a single optimized set of parameters (one snapshot) to represent baseline conditions of the system limits the ...

  6. An optimization based sampling approach for multiple metrics uncertainty analysis using generalized likelihood uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng

    2016-09-01

    This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach, with the aim of improving sampling efficiency for multiple metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII based sampling is demonstrated in comparison with Latin hypercube sampling (LHS) by analyzing sampling efficiency, multiple metrics performance, parameter uncertainty and flood forecasting uncertainty, with a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for the Qing River reservoir, China. The results demonstrate the following advantages of the ɛ-NSGAII based sampling approach over LHS: (1) it is more effective and efficient; for example, the simulation time required to generate 1000 behavioral parameter sets is nine times shorter; (2) the Pareto tradeoffs between metrics are demonstrated clearly by the solutions from ɛ-NSGAII based sampling, and their Pareto optimal values are better than those of LHS, indicating better forecasting accuracy of the ɛ-NSGAII parameter sets; (3) the parameter posterior distributions from ɛ-NSGAII based sampling are concentrated in the appropriate ranges rather than uniform, which accords with their physical significance, and parameter uncertainties are reduced significantly; (4) the forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB) and average deviation amplitude (D). Flood forecasting uncertainty is also reduced considerably with ɛ-NSGAII based sampling. This study provides a new sampling approach to improve multiple metrics uncertainty analysis under the framework of GLUE, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.

  7. Markov random fields and graphs for uncertainty management and symbolic data fusion in an urban scene interpretation

    NASA Astrophysics Data System (ADS)

    Moissinac, Henri; Maitre, Henri; Bloch, Isabelle

    1995-11-01

    An image interpretation method is presented for the automatic processing of aerial pictures of an urban landscape. To improve the picture analysis, a priori knowledge extracted from a geographic map is introduced. A coherent graph-based model of the city is built, starting with the road network. A global uncertainty management scheme has been designed to evaluate the confidence we can have in the final results. This model and the uncertainty management scheme reflect the hierarchy of the available data and the interpretation levels. The symbolic relationships linking the different kinds of elements are taken into account while propagating and combining the confidence measures along the interpretation process.

  8. Effect of Temporal and Spatial Rainfall Resolution on HSPF Predictive Performance and Parameter Estimation

    EPA Science Inventory

    Watershed scale rainfall‐runoff models are used for environmental management and regulatory modeling applications, but their effectiveness is limited by predictive uncertainties associated with model input data. This study evaluated the effect of temporal and spatial rainfall re...

  9. Bringing social standards into project evaluation under dynamic uncertainty.

    PubMed

    Knudsen, Odin K; Scandizzo, Pasquale L

    2005-04-01

    Society often sets social standards that define thresholds of damage to society or the environment above which compensation must be paid to the state or other parties. In this article, we analyze the interdependence between the use of social standards and investment evaluation under dynamic uncertainty where a negative externality above a threshold established by society requires an assessment and payment of damages. Under uncertainty, the party considering implementing a project or new technology must assess not only when the project is economically efficient to implement but also when to abandon a project that could potentially exceed the social standard. Using real-option theory and simple models, we demonstrate how such a social standard can be integrated into cost-benefit analysis through the use of a development option and a liability option coupled with a damage function. Uncertainty, in fact, implies that both parties interpret the social standard as a target for safety rather than an inflexible barrier that cannot be overcome. The larger the uncertainty, the greater the tolerance for damages in excess of the social standard from both parties.
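    The liability-option idea has a simple Monte Carlo reading. The sketch below is a hedged illustration, not the article's model: damage is assumed to follow geometric Brownian motion, and the liability is valued as the expected excess of damage over the standard, an option-like payoff. All parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: project damage D_T follows geometric Brownian motion;
# liability is owed only on the excess above the social standard S, so the
# expected liability resembles an option payoff E[max(D_T - S, 0)].
S = 100.0                       # social standard (damage threshold)
D0, mu, sigma, T = 80.0, 0.02, 0.3, 5.0
n_paths = 100_000

Z = rng.standard_normal(n_paths)
D_T = D0 * np.exp((mu - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
liability = np.maximum(D_T - S, 0.0).mean()

# Greater uncertainty (sigma) raises the expected excess over the standard,
# mirroring the article's point that tolerance for exceedances grows with it.
for sig in (0.1, 0.3, 0.5):
    D = D0 * np.exp((mu - 0.5 * sig**2) * T + sig * np.sqrt(T) * Z)
    print(f"sigma={sig}: E[max(D-S,0)] = {np.maximum(D - S, 0).mean():.2f}")
```

    Because the payoff is convex in the damage level, the expected liability grows with volatility even though the expected damage itself is held fixed.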

  10. Forecasting biodiversity in breeding birds using best practices

    PubMed Central

    Taylor, Shawn D.; White, Ethan P.

    2018-01-01

    Biodiversity forecasts are important for conservation, management, and evaluating how well current models characterize natural systems. While the number of forecasts for biodiversity is increasing, there is little information available on how well these forecasts work. Most biodiversity forecasts are not evaluated to determine how well they predict future diversity, fail to account for uncertainty, and do not use time-series data that captures the actual dynamics being studied. We addressed these limitations by using best practices to explore our ability to forecast the species richness of breeding birds in North America. We used hindcasting to evaluate six different modeling approaches for predicting richness. Hindcasts for each method were evaluated annually for a decade at 1,237 sites distributed throughout the continental United States. All models explained more than 50% of the variance in richness, but none of them consistently outperformed a baseline model that predicted constant richness at each site. The best practices implemented in this study directly influenced the forecasts and evaluations. Stacked species distribution models and “naive” forecasts produced poor estimates of uncertainty and accounting for this resulted in these models dropping in the relative performance compared to other models. Accounting for observer effects improved model performance overall, but also changed the rank ordering of models because it did not improve the accuracy of the “naive” model. Considering the forecast horizon revealed that the prediction accuracy decreased across all models as the time horizon of the forecast increased. To facilitate the rapid improvement of biodiversity forecasts, we emphasize the value of specific best practices in making forecasts and evaluating forecasting methods. PMID:29441230
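    The baseline comparison at the heart of the study above is easy to make concrete. This sketch is hypothetical (synthetic richness counts, a fake "model", MAE as the error measure); the study itself evaluated six real modeling approaches against a constant-richness baseline.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical richness series at one site: hindcast evaluation compares a
# model against a baseline that predicts constant richness (training mean).
train = rng.poisson(60, size=20)      # observed richness, training years
test_obs = rng.poisson(60, size=10)   # held-out (hindcast) years

baseline_pred = np.full_like(test_obs, int(round(train.mean())))
model_pred = test_obs + rng.integers(-8, 9, size=10)  # stand-in "model"

def mae(pred, obs):
    """Mean absolute error of a forecast against observations."""
    return np.abs(pred - obs).mean()

# Skill relative to baseline: > 0 means the model beats "constant richness".
skill = 1.0 - mae(model_pred, test_obs) / mae(baseline_pred, test_obs)
print(f"baseline MAE={mae(baseline_pred, test_obs):.2f}, "
      f"model MAE={mae(model_pred, test_obs):.2f}, skill={skill:.2f}")
```

    The study's key negative result is exactly this comparison: a model can explain much of the variance yet still fail to achieve positive skill against the constant baseline.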

  11. Information-Theoretic Benchmarking of Land Surface Models

    NASA Astrophysics Data System (ADS)

    Nearing, Grey; Mocko, David; Kumar, Sujay; Peters-Lidard, Christa; Xia, Youlong

    2016-04-01

    Benchmarking is a type of model evaluation that compares model performance against a baseline metric that is derived, typically, from a different existing model. Statistical benchmarking was used to qualitatively show that land surface models do not fully utilize information in boundary conditions [1] several years before Gong et al [2] discovered the particular type of benchmark that makes it possible to *quantify* the amount of information lost by an incorrect or imperfect model structure. This theoretical development laid the foundation for a formal theory of model benchmarking [3]. Here we extend that theory to separate uncertainty contributions from the three major components of dynamical systems models [4]: model structures, model parameters, and boundary conditions (the last describing the time-dependent details of each prediction scenario). The key to this new development is the use of large-sample [5] data sets that span multiple soil types, climates, and biomes, which allows us to segregate uncertainty due to parameters from the two other sources. The benefit of this approach for uncertainty quantification and segregation is that it does not rely on Bayesian priors (although it is strictly coherent with Bayes' theorem and with probability theory), and therefore the partitioning of uncertainty into different components is *not* dependent on any a priori assumptions. We apply this methodology to assess the information use efficiency of the four land surface models that comprise the North American Land Data Assimilation System (Noah, Mosaic, SAC-SMA, and VIC). Specifically, we looked at the ability of these models to estimate soil moisture and latent heat fluxes. We found that in the case of soil moisture, about 25% of net information loss was from boundary conditions, around 45% was from model parameters, and 30-40% was from the model structures.
In the case of latent heat flux, boundary conditions contributed about 50% of net uncertainty, and model structures contributed about 40%. There was relatively little difference between the different models. 1. G. Abramowitz, R. Leuning, M. Clark, A. Pitman, Evaluating the performance of land surface models. Journal of Climate 21, (2008). 2. W. Gong, H. V. Gupta, D. Yang, K. Sricharan, A. O. Hero, Estimating Epistemic & Aleatory Uncertainties During Hydrologic Modeling: An Information Theoretic Approach. Water Resources Research 49, 2253-2273 (2013). 3. G. S. Nearing, H. V. Gupta, The quantity and quality of information in hydrologic models. Water Resources Research 51, 524-538 (2015). 4. H. V. Gupta, G. S. Nearing, Using models and data to learn: A systems theoretic perspective on the future of hydrological science. Water Resources Research 50(6), 5351-5359 (2014). 5. H. V. Gupta et al., Large-sample hydrology: a need to balance depth with breadth. Hydrology and Earth System Sciences Discussions 10, 9147-9189 (2013).
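    The information-use idea can be illustrated with a toy mutual-information computation. This is a hedged sketch, not the paper's benchmark: the plug-in histogram estimator, the synthetic forcing/observation data, and the "good"/"poor" model definitions are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

def mutual_info(x, y, bins=10):
    """Plug-in estimate of mutual information (nats) from a 2D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return (pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum()

# Hypothetical data: "observations" driven by a boundary condition
# (forcing), plus one good and one poor model prediction of them.
forcing = rng.normal(size=5000)
obs = forcing + 0.3 * rng.normal(size=5000)
good = forcing + 0.1 * rng.normal(size=5000)   # uses the forcing info well
poor = 0.2 * forcing + rng.normal(size=5000)   # loses most of that info

# Benchmark idea: information each model provides about obs, compared with
# the information available in the boundary conditions themselves.
I_avail = mutual_info(obs, forcing)
for name, pred in (("good", good), ("poor", poor)):
    I_used = mutual_info(obs, pred)
    print(f"{name} model: I(obs; pred) = {I_used:.2f} nats "
          f"of {I_avail:.2f} nats available")
```

    The gap between available and used information is the quantity the paper partitions among model structure, parameters, and boundary conditions.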

  12. Evaluation of SimpleTreat 4.0: Simulations of pharmaceutical removal in wastewater treatment plant facilities.

    PubMed

    Lautz, L S; Struijs, J; Nolte, T M; Breure, A M; van der Grinten, E; van de Meent, D; van Zelm, R

    2017-02-01

    In this study, the removal of pharmaceuticals from wastewater as predicted by SimpleTreat 4.0 was evaluated. Field data obtained from literature of 43 pharmaceuticals, measured in 51 different activated sludge WWTPs were used. Based on reported influent concentrations, the effluent concentrations were calculated with SimpleTreat 4.0 and compared to measured effluent concentrations. The model predicts effluent concentrations mostly within a factor of 10, using the specific WWTP parameters as well as SimpleTreat default parameters, while it systematically underestimates concentrations in secondary sludge. This may be caused by unexpected sorption, resulting from variability in WWTP operating conditions, and/or QSAR applicability domain mismatch and background concentrations prior to measurements. Moreover, variability in detection techniques and sampling methods can cause uncertainty in measured concentration levels. To find possible structural improvements, we also evaluated SimpleTreat 4.0 using several specific datasets with different degrees of uncertainty and variability. This evaluation verified that the most influencing parameters for water effluent predictions were biodegradation and the hydraulic retention time. Results showed that model performance is highly dependent on the nature and quality, i.e. degree of uncertainty, of the data. The default values for reactor settings in SimpleTreat result in realistic predictions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. INNOVATIVE METHODS FOR EMISSION-INVENTORY DEVELOPMENT AND EVALUATION: WORKSHOP SUMMARY

    EPA Science Inventory

    Emission inventories are an essential tool for evaluating, managing, and regulating air pollution. Refinements and innovations in instruments that measure air pollutants, models that calculate emissions as well as techniques for data management and uncertainty assessment are nee...

  14. A COMPUTATIONAL FRAMEWORK FOR EVALUATION OF NPS MANAGEMENT SCENARIOS: ROLE OF PARAMETER UNCERTAINTY

    EPA Science Inventory

    Utility of complex distributed-parameter watershed models for evaluation of the effectiveness of non-point source sediment and nutrient abatement scenarios such as Best Management Practices (BMPs) often follows the traditional {calibrate ---> validate ---> predict} procedure. Des...

  15. Parameter uncertainty and variability in evaluative fate and exposure models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertwich, E.G.; McKone, T.E.; Pease, W.S.

    The human toxicity potential, a weighting scheme used to evaluate toxic emissions for life cycle assessment and toxics release inventories, is based on potential dose calculations and toxicity factors. This paper evaluates the variance in potential dose calculations that can be attributed to the uncertainty in chemical-specific input parameters as well as the variability in exposure factors and landscape parameters. A knowledge of the uncertainty allows us to assess the robustness of a decision based on the toxicity potential; a knowledge of the sources of uncertainty allows us to focus resources if the uncertainty is to be reduced. The potential dose of 236 chemicals was assessed. The chemicals were grouped by dominant exposure route, and a Monte Carlo analysis was conducted for one representative chemical in each group. The variance is typically one to two orders of magnitude. For comparison, the point estimates of potential dose for the 236 chemicals span ten orders of magnitude. Most of the variance in the potential dose is due to chemical-specific input parameters, especially half-lives, although exposure factors such as fish intake and the source of drinking water can be important for chemicals whose dominant exposure is through indirect routes. Landscape characteristics are generally of minor importance.
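    A Monte Carlo variance assessment of this kind can be sketched as follows. The dose model and lognormal spreads below are invented stand-ins (the real calculation uses a multimedia fate and exposure model), chosen only to show how log-scale spread is read off in "orders of magnitude".

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical potential-dose model for one chemical: dose scales with
# emission, persistence (half-life) and an exposure factor (intake rate).
# The half-life carries uncertainty; intake varies across the population;
# both are sampled together here.
N = 50_000
half_life = rng.lognormal(np.log(30), 0.8, N)   # days, uncertain
intake = rng.lognormal(np.log(2.0), 0.4, N)     # L/day, variable
emission = 1.0                                   # kg/day, fixed

potential_dose = emission * half_life * intake   # toy multiplicative model

# On a log10 scale, the standard deviation reads directly as
# "orders of magnitude" of spread.
log10_dose = np.log10(potential_dose)
spread = log10_dose.std()
p05, p95 = np.percentile(log10_dose, [5, 95])
print(f"spread ~ {spread:.2f} orders of magnitude; "
      f"90% range spans {p95 - p05:.2f} orders")
```

    In this multiplicative toy, the half-life term contributes 0.8² / (0.8² + 0.4²) = 80% of the log-scale variance, echoing the paper's finding that chemical-specific parameters dominate.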

  16. Uncertainty Evaluation of the New Setup for Measurement of Water-Vapor Permeation Rate by a Dew-Point Sensor

    NASA Astrophysics Data System (ADS)

    Hudoklin, D.; Šetina, J.; Drnovšek, J.

    2012-09-01

    The measurement of the water-vapor permeation rate (WVPR) through materials is very important in many industrial applications such as the development of new fabrics and construction materials, in the semiconductor industry, packaging, vacuum techniques, etc. The demand for this kind of measurement grows considerably and thus many different methods for measuring the WVPR are developed and standardized within numerous national and international standards. However, comparison of existing methods shows a low level of mutual agreement. The objective of this paper is to demonstrate the necessary uncertainty evaluation for WVPR measurements, so as to provide a basis for development of a corresponding reference measurement standard. This paper presents a specially developed measurement setup, which employs a precision dew-point sensor for WVPR measurements on specimens of different shapes. The paper also presents a physical model, which tries to account for both dynamic and quasi-static methods, the common types of WVPR measurements referred to in standards and scientific publications. An uncertainty evaluation carried out according to the ISO/IEC guide to the expression of uncertainty in measurement (GUM) shows the relative expanded (k = 2) uncertainty to be 3.0 % for a WVPR of 6.71 mg·h⁻¹ (corresponding to a permeance of 30.4 mg·m⁻²·day⁻¹·hPa⁻¹).
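    The GUM combination step itself is mechanical and worth seeing once. The input budget below is entirely invented (the paper does not publish these numbers here); the sketch only shows how independent relative standard uncertainties combine in root-sum-square and are expanded with a coverage factor k = 2.

```python
import numpy as np

# GUM-style combination for an illustrative WVPR measurement: the relative
# combined standard uncertainty is the root-sum-square of the relative
# standard uncertainties of independent inputs; the expanded uncertainty
# applies a coverage factor k = 2. Budget values are hypothetical.
budget = {
    "dew-point temperature": 0.010,   # relative standard uncertainties
    "flow rate": 0.008,
    "specimen area": 0.004,
    "pressure": 0.003,
}

u_c = np.sqrt(sum(u**2 for u in budget.values()))   # combined (k = 1)
U = 2.0 * u_c                                       # expanded (k = 2)

wvpr = 6.71  # mg/h, the measured value quoted in the abstract
print(f"relative expanded uncertainty U = {U:.1%} -> "
      f"WVPR = {wvpr:.2f} +/- {wvpr * U:.2f} mg/h")
```

    Correlated inputs would need covariance terms in the sum; the root-sum-square form holds only for independent contributions.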

  17. Assessing model sensitivity and uncertainty across multiple Free-Air CO2 Enrichment experiments.

    NASA Astrophysics Data System (ADS)

    Cowdery, E.; Dietze, M.

    2015-12-01

    As atmospheric levels of carbon dioxide levels continue to increase, it is critical that terrestrial ecosystem models can accurately predict ecological responses to the changing environment. Current predictions of net primary productivity (NPP) in response to elevated atmospheric CO2 concentrations are highly variable and contain a considerable amount of uncertainty. It is necessary that we understand which factors are driving this uncertainty. The Free-Air CO2 Enrichment (FACE) experiments have equipped us with a rich data source that can be used to calibrate and validate these model predictions. To identify and evaluate the assumptions causing inter-model differences we performed model sensitivity and uncertainty analysis across ambient and elevated CO2 treatments using the Data Assimilation Linked Ecosystem Carbon (DALEC) model and the Ecosystem Demography Model (ED2), two process-based models ranging from low to high complexity respectively. These modeled process responses were compared to experimental data from the Kennedy Space Center Open Top Chamber Experiment, the Nevada Desert Free Air CO2 Enrichment Facility, the Rhinelander FACE experiment, the Wyoming Prairie Heating and CO2 Enrichment Experiment, the Duke Forest Face experiment and the Oak Ridge Experiment on CO2 Enrichment. By leveraging data access proxy and data tilling services provided by the BrownDog data curation project alongside analysis modules available in the Predictive Ecosystem Analyzer (PEcAn), we produced automated, repeatable benchmarking workflows that are generalized to incorporate different sites and ecological models. Combining the observed patterns of uncertainty between the two models with results of the recent FACE-model data synthesis project (FACE-MDS) can help identify which processes need further study and additional data constraints. These findings can be used to inform future experimental design and in turn can provide informative starting point for data assimilation.

  18. Quantification of LiDAR measurement uncertainty through propagation of errors due to sensor sub-systems and terrain morphology

    NASA Astrophysics Data System (ADS)

    Goulden, T.; Hopkinson, C.

    2013-12-01

    The quantification of LiDAR sensor measurement uncertainty is important for evaluating the quality of derived DEM products, compiling risk assessments of management decisions based on LiDAR information, and enhancing LiDAR mission planning capabilities. Current quality assurance estimates of LiDAR measurement uncertainty are limited to post-survey empirical assessments or vendor estimates from commercial literature. Empirical evidence can provide valuable information on the performance of the sensor in validated areas; however, it cannot characterize the spatial distribution of measurement uncertainty throughout the extensive coverage of typical LiDAR surveys. Vendor advertised error estimates are often restricted to strict and optimal survey conditions, resulting in idealized values. Numerical modeling of individual pulse uncertainty provides an alternative method for estimating LiDAR measurement uncertainty. LiDAR measurement uncertainty is theoretically assumed to fall into three distinct categories: 1) sensor sub-system errors, 2) terrain influences, and 3) vegetative influences. This research details the procedures for numerical modeling of measurement uncertainty from the sensor sub-system (GPS, IMU, laser scanner, laser ranger) and terrain influences. Results show that errors tend to increase as the laser scan angle, altitude or laser beam incidence angle increases. An experimental survey over a flat, paved runway site, performed with an Optech ALTM 3100 sensor, showed an increase in modeled vertical errors from 5 cm at a nadir scan orientation to 8 cm at the scan edges, for an aircraft altitude of 1200 m and a half scan angle of 15°. In a survey with the same sensor at a highly sloped glacial basin site devoid of vegetation, modeled vertical errors reached over 2 m. Validation of error models within the glacial environment, over three separate flight lines, respectively showed 100%, 85%, and 75% of elevation residuals fell below error predictions.
Future work in LiDAR sensor measurement uncertainty must focus on the development of vegetative error models to create more robust error prediction algorithms. To achieve this objective, comprehensive empirical exploratory analysis is recommended to relate vegetative parameters to observed errors.
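    A first-order version of the error propagation described above can be sketched geometrically. This is a hypothetical simplification, not the study's model: it keeps only a ranging-noise term, a scan-angle-noise term, and a terrain-slope term, and the noise magnitudes are illustrative rather than Optech ALTM 3100 values.

```python
import numpy as np

# First-order geometric error propagation for a single LiDAR return.
def vertical_error(altitude_m, scan_angle_deg, slope_deg,
                   sigma_range=0.03, sigma_angle_deg=0.005):
    theta = np.radians(scan_angle_deg)
    slant_range = altitude_m / np.cos(theta)
    # Ranging noise projects onto the vertical by cos(theta); angular noise
    # produces a horizontal displacement that grows with range, which maps
    # into vertical error via the scan angle and the terrain slope.
    dz_range = sigma_range * np.cos(theta)
    dxy_angle = slant_range * np.radians(sigma_angle_deg)
    dz_angle = dxy_angle * np.sin(theta)
    dz_slope = dxy_angle * np.tan(np.radians(slope_deg))
    return np.sqrt(dz_range**2 + dz_angle**2 + dz_slope**2)

# Error grows from nadir to the swath edge, and sharply on steep terrain.
for angle in (0, 15):
    print(f"flat site, scan angle {angle:>2} deg: "
          f"{vertical_error(1200, angle, 0) * 100:.1f} cm")
print(f"steep (45 deg) terrain at swath edge: "
      f"{vertical_error(1200, 15, 45) * 100:.1f} cm")
```

    Even this toy reproduces the qualitative findings: errors increase with scan angle and altitude, and terrain slope dominates on steep sites.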

  19. Underwater noise modelling for environmental impact assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farcas, Adrian; Thompson, Paul M.; Merchant, Nathan D., E-mail: nathan.merchant@cefas.co.uk

    Assessment of underwater noise is increasingly required by regulators of development projects in marine and freshwater habitats, and noise pollution can be a constraining factor in the consenting process. Noise levels arising from the proposed activity are modelled and the potential impact on species of interest within the affected area is then evaluated. Although there is considerable uncertainty in the relationship between noise levels and impacts on aquatic species, the science underlying noise modelling is well understood. Nevertheless, many environmental impact assessments (EIAs) do not reflect best practice, and stakeholders and decision makers in the EIA process are often unfamiliar with the concepts and terminology that are integral to interpreting noise exposure predictions. In this paper, we review the process of underwater noise modelling and explore the factors affecting predictions of noise exposure. Finally, we illustrate the consequences of errors and uncertainties in noise modelling, and discuss future research needs to reduce uncertainty in noise assessments.

  20. Uncertainty analysis of least-cost modeling for designing wildlife linkages.

    PubMed

    Beier, Paul; Majka, Daniel R; Newell, Shawn L

    2009-12-01

    Least-cost models for focal species are widely used to design wildlife corridors. To evaluate the least-cost modeling approach used to develop 15 linkage designs in southern California, USA, we assessed robustness of the largest and least constrained linkage. Species experts parameterized models for eight species with weights for four habitat factors (land cover, topographic position, elevation, road density) and resistance values for each class within a factor (e.g., each class of land cover). Each model produced a proposed corridor for that species. We examined the extent to which uncertainty in factor weights and class resistance values affected two key conservation-relevant outputs, namely, the location and modeled resistance to movement of each proposed corridor. To do so, we compared the proposed corridor to 13 alternative corridors created with parameter sets that spanned the plausible ranges of biological uncertainty in these parameters. Models for five species were highly robust (mean overlap 88%, little or no increase in resistance). Although the proposed corridors for the other three focal species overlapped as little as 0% (mean 58%) of the alternative corridors, resistance in the proposed corridors for these three species was rarely higher than resistance in the alternative corridors (mean difference was 0.025 on a scale of 1-10; worst difference was 0.39). As long as the model had the correct rank order of resistance values and factor weights, our results suggest that the predicted corridor is robust to uncertainty. The three carnivore focal species, alone or in combination, were not effective umbrellas for the other focal species. The carnivore corridors failed to overlap the predicted corridors of most other focal species and provided relatively high resistance for the other focal species (mean increase of 2.7 resistance units).
Least-cost modelers should conduct uncertainty analysis so that decision-makers can appreciate the potential impact of model uncertainty on conservation decisions. Our approach to uncertainty analysis (which can be called a worst-case scenario approach) is appropriate for complex models in which distribution of the input parameters cannot be specified.

  1. Software for occupational health and safety risk analysis based on a fuzzy model.

    PubMed

    Stefanovic, Miladin; Tadic, Danijela; Djapan, Marko; Macuzic, Ivan

    2012-01-01

    Risk and safety management are very important issues in healthcare systems. These are complex systems with many entities, hazards and uncertainties. In such an environment, it is very hard to introduce a system for evaluating and simulating significant hazards. In this paper, we analyzed different types of hazards in healthcare systems and introduced a new fuzzy model for evaluating and ranking hazards. Finally, we presented a developed software solution, based on the suggested fuzzy model, for evaluating and monitoring risk.

  2. The Inherent Uncertainty of In-Situ Observations and its Implications for Modeling Evapotranspiration

    NASA Astrophysics Data System (ADS)

    Alfieri, J. G.

    2012-12-01

    In-situ observations are essential to a broad range of applications including the development, calibration, and validation of both the numerical and remote sensing-based models. For example, observational data is requisite in order to evaluate the skill of these models both to represent the complex biogeophysical processes regulating evapotranspiration (ET) and to predict the magnitude of the moisture flux. As such, by propagating into these subsequent activities, any uncertainty or errors associated with the observational data have the potential to adversely impact the accuracy and utility of these models. It is, therefore, critical that the factors driving measurement uncertainty are fully understood so that the steps can be taken to account for its effects and mitigate its impact on subsequent analyses. Field measurements of ET can be collected using a variety of techniques including eddy covariance (EC), lysimetry (LY), and scintillometry (SC). Each of these methods is underpinned by a unique set of theoretical considerations and practical constraints; and, as a result, each method is susceptible to differing types of systematic and random error. Since the uncertainty associated with the field measurements is predicated on how well numerous factors - for example, environmental conditions - adhere to those prescribed by the underlying assumptions, the quality of in-situ observations collected via the differing methods can vary significantly both over time and from site-to-site. Using data from both site studies and large field campaigns, such as IHOP_2002 and BEAREX08, the sources of uncertainty in field observations will be discussed. The impact of measurement uncertainty on model validation will also be illustrated.

  3. Accuracy metrics for judging time scale algorithms

    NASA Technical Reports Server (NTRS)

    Douglas, R. J.; Boulanger, J.-S.; Jacques, C.

    1994-01-01

    Time scales have been constructed in different ways to meet the many demands placed upon them for time accuracy, frequency accuracy, long-term stability, and robustness. Usually, no single time scale is optimum for all purposes. In the context of the impending availability of high-accuracy intermittently-operated cesium fountains, we reconsider the question of evaluating the accuracy of time scales which use an algorithm to span interruptions of the primary standard. We consider a broad class of calibration algorithms that can be evaluated and compared quantitatively for their accuracy in the presence of frequency drift and a full noise model (a mixture of white PM, flicker PM, white FM, flicker FM, and random walk FM noise). We present the analytic techniques for computing the standard uncertainty for the full noise model and this class of calibration algorithms. The simplest algorithm is evaluated to find the average-frequency uncertainty arising from the noise of the cesium fountain's local oscillator and from the noise of a hydrogen maser transfer-standard. This algorithm and known noise sources are shown to permit interlaboratory frequency transfer with a standard uncertainty of less than 10^-15 for periods of 30-100 days.

  4. Potential New Lidar Observations for Cloud Studies

    NASA Technical Reports Server (NTRS)

    Winker, Dave; Hu, Yong; Narir, Amin; Cai, Xia

    2015-01-01

    The response of clouds to global warming represents a major uncertainty in estimating climate sensitivity. These uncertainties have been tracked to shallow marine clouds in the tropics and subtropics. CALIOP observations have already been used extensively to evaluate model predictions of shallow cloud fraction and top height (Leahy et al. 2013; Nam et al 2012). Tools are needed to probe the lowest levels of the troposphere. The large footprint of satellite lidars gives large multiple scattering from clouds which presents new possibilities for cloud retrievals to constrain model predictions.

  5. Uncertainty in the Bayesian meta-analysis of normally distributed surrogate endpoints

    PubMed Central

    Thompson, John R; Spata, Enti; Abrams, Keith R

    2015-01-01

    We investigate the effect of the choice of parameterisation of meta-analytic models and related uncertainty on the validation of surrogate endpoints. Different meta-analytical approaches take into account different levels of uncertainty which may impact on the accuracy of the predictions of treatment effect on the target outcome from the treatment effect on a surrogate endpoint obtained from these models. A range of Bayesian as well as frequentist meta-analytical methods are implemented using illustrative examples in relapsing–remitting multiple sclerosis, where the treatment effect on disability worsening is the primary outcome of interest in healthcare evaluation, while the effect on relapse rate is considered as a potential surrogate to the effect on disability progression, and in gastric cancer, where the disease-free survival has been shown to be a good surrogate endpoint to the overall survival. Sensitivity analysis was carried out to assess the impact of distributional assumptions on the predictions. Also, sensitivity to modelling assumptions and performance of the models were investigated by simulation. Although different methods can predict mean true outcome almost equally well, inclusion of uncertainty around all relevant parameters of the model may lead to less certain and hence more conservative predictions. When investigating endpoints as candidate surrogate outcomes, a careful choice of the meta-analytical approach has to be made. Models underestimating the uncertainty of available evidence may lead to overoptimistic predictions which can then have an effect on decisions made based on such predictions. PMID:26271918
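    The abstract's point that propagating uncertainty in all relevant parameters widens prediction intervals can be illustrated with a deliberately simplified sketch; a trial-level bootstrap stands in for the full Bayesian meta-analytic machinery, and the trial-level effect pairs are invented for illustration:

```python
import random
import statistics

random.seed(1)

# Hypothetical trial-level treatment effects (surrogate, final outcome)
trials = [(-0.9, -0.6), (-0.5, -0.4), (-0.3, -0.1), (-0.7, -0.5), (-0.2, -0.2)]

def fit(pairs):
    """Ordinary least-squares intercept and slope across trials."""
    xs, ys = zip(*pairs)
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    slope = sum((x - mx) * (y - my) for x, y in pairs) / sum((x - mx) ** 2 for x in xs)
    return my - slope * mx, slope

# Bootstrap over trials to propagate regression-parameter uncertainty
# into the prediction for a new trial's surrogate effect.
new_surrogate_effect = -0.6
preds = []
for _ in range(2000):
    sample = [random.choice(trials) for _ in trials]
    if len({x for x, _ in sample}) < 2:  # degenerate resample: slope undefined
        continue
    a, b = fit(sample)
    preds.append(a + b * new_surrogate_effect)

preds.sort()
lo, hi = preds[int(0.025 * len(preds))], preds[int(0.975 * len(preds))]
print(f"predicted effect on final outcome: [{lo:.2f}, {hi:.2f}]")
```

    A plug-in prediction from the point-estimate regression alone would be a single number; acknowledging parameter uncertainty turns it into an interval, which is the conservatism the abstract describes.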

  6. Uncertainty in the Bayesian meta-analysis of normally distributed surrogate endpoints.

    PubMed

    Bujkiewicz, Sylwia; Thompson, John R; Spata, Enti; Abrams, Keith R

    2017-10-01

    We investigate the effect of the choice of parameterisation of meta-analytic models and related uncertainty on the validation of surrogate endpoints. Different meta-analytical approaches take into account different levels of uncertainty which may impact on the accuracy of the predictions of treatment effect on the target outcome from the treatment effect on a surrogate endpoint obtained from these models. A range of Bayesian as well as frequentist meta-analytical methods are implemented using illustrative examples in relapsing-remitting multiple sclerosis, where the treatment effect on disability worsening is the primary outcome of interest in healthcare evaluation, while the effect on relapse rate is considered as a potential surrogate to the effect on disability progression, and in gastric cancer, where the disease-free survival has been shown to be a good surrogate endpoint to the overall survival. Sensitivity analysis was carried out to assess the impact of distributional assumptions on the predictions. Also, sensitivity to modelling assumptions and performance of the models were investigated by simulation. Although different methods can predict mean true outcome almost equally well, inclusion of uncertainty around all relevant parameters of the model may lead to less certain and hence more conservative predictions. When investigating endpoints as candidate surrogate outcomes, a careful choice of the meta-analytical approach has to be made. Models underestimating the uncertainty of available evidence may lead to overoptimistic predictions which can then have an effect on decisions made based on such predictions.

  7. Uncertainty of Wheat Water Use: Simulated Patterns and Sensitivity to Temperature and CO2

    NASA Technical Reports Server (NTRS)

    Cammarano, Davide; Roetter, Reimund P.; Asseng, Senthold; Ewert, Frank; Wallach, Daniel; Martre, Pierre; Hatfield, Jerry L.; Jones, James W.; Rosenzweig, Cynthia E.; Ruane, Alex C.; hide

    2016-01-01

    Projected global warming and population growth will reduce future water availability for agriculture. Thus, it is essential to increase the efficiency of water use to ensure crop productivity. Quantifying crop water use (WU; i.e. actual evapotranspiration) is a critical step towards this goal. Here, sixteen wheat simulation models were used to quantify sources of model uncertainty and to estimate the relative changes and variability between models for simulated WU, water use efficiency (WUE, WU per unit of grain dry mass produced), transpiration efficiency (Teff, crop transpiration per unit of grain yield dry mass produced), grain yield, crop transpiration and soil evaporation at increased temperatures and elevated atmospheric carbon dioxide concentrations ([CO2]). The greatest uncertainty in simulating water use, potential evapotranspiration, crop transpiration and soil evaporation was due to differences in how crop transpiration was modelled, and accounted for 50% of the total variability among models. The simulation results for the sensitivity to temperature indicated that crop WU will decline with increasing temperature due to shortened growing seasons. The uncertainties in simulated crop WU, in particular those due to uncertainties in simulating crop transpiration, were greater under increased temperatures and under high temperatures in combination with elevated atmospheric [CO2]. Hence the simulation of crop WU, and in particular of crop transpiration under higher temperatures, needs to be improved and evaluated with field measurements before models can be used to simulate climate change impacts on future crop water demand.

  8. Epistemic uncertainties and natural hazard risk assessment - Part 1: A review of the issues

    NASA Astrophysics Data System (ADS)

    Beven, K. J.; Aspinall, W. P.; Bates, P. D.; Borgomeo, E.; Goda, K.; Hall, J. W.; Page, T.; Phillips, J. C.; Rougier, J. T.; Simpson, M.; Stephenson, D. B.; Smith, P. J.; Wagener, T.; Watson, M.

    2015-12-01

    Uncertainties in natural hazard risk assessment are generally dominated by the sources arising from lack of knowledge or understanding of the processes involved. There is a lack of knowledge about frequencies, process representations, parameters, present and future boundary conditions, consequences and impacts, and the meaning of observations in evaluating simulation models. These are the epistemic uncertainties that can be difficult to constrain, especially in terms of event or scenario probabilities, even as elicited probabilities rationalized on the basis of expert judgements. This paper reviews the issues raised by trying to quantify the effects of epistemic uncertainties. Such scientific uncertainties might have significant influence on decisions that are made for risk management, so it is important to communicate the meaning of an uncertainty estimate and to provide an audit trail of the assumptions on which it is based. Some suggestions for good practice in doing so are made.

  9. Uncertainties in future-proof decision-making: the Dutch Delta Model

    NASA Astrophysics Data System (ADS)

    IJmker, Janneke; Snippen, Edwin; Ruijgh, Erik

    2013-04-01

    In 1953, a number of European countries experienced flooding after a major storm event coming from the northwest. Over 2100 people died in the resulting floods, 1800 of them Dutch. This gave rise to the development of the so-called Delta Works and Zuiderzee Works, which strongly reduced the flood risk in the Netherlands. These measures were a response to a large flooding event. As boundary conditions have changed (increasing population, increasing urban development, etc.), the flood risk should be evaluated continuously, and measures should be taken if necessary. The Delta Programme was designed to prepare for future changes and to limit the flood risk, taking into account economics, nature, landscape, residence and recreation. To support decisions in the Delta Programme, the Delta Model was developed. By using four different input scenarios (extremes in climate and economics) and variations in system setup, the outcomes of the Delta Model represent a range of possible outcomes for the hydrological situation in 2050 and 2100. These results flow into effect models that give insight into the integrated effects on freshwater supply (including navigation, industry and ecology) and flood risk. As the long-term water management policy of the Netherlands for the next decades will be based on these results, they have to be reliable. Therefore, a study was carried out to investigate the impact of uncertainties on the model outcomes. The study focused on "known unknowns": uncertainties in the boundary conditions, in the parameterization and in the model itself. This showed that for different parts of the Netherlands, the total uncertainty is on the order of meters. Nevertheless, (1) the total uncertainty is dominated by uncertainties in boundary conditions; internal model uncertainties are subordinate to these. Furthermore, (2) the model responses develop in a logical way, such that the exact model outcomes may be uncertain, but the outcomes of different model runs are reliable relative to each other. The Delta Model therefore is a reliable instrument for finding the optimal water management policy for the future. As the exact model outcomes show a high degree of uncertainty, the model analysis is based on large numbers of model runs to gain insight into the sensitivity of the model to different setups and boundary conditions. The results allow fast investigation of the (relative) effects of measures. Furthermore, they help to identify bottlenecks in the system. To summarize, the Delta Model is a tool that allows policy makers to base their strategies on quantitative rather than qualitative information. It can be applied to the current and future situation, and feeds the political discussion. The uncertainty of the model has no determinative effect on the analyses that can be done with the Delta Model.

  10. Evaluation of single and multiple Doppler lidar techniques to measure complex flow during the XPIA field campaign

    NASA Astrophysics Data System (ADS)

    Choukulkar, Aditya; Brewer, W. Alan; Sandberg, Scott P.; Weickmann, Ann; Bonin, Timothy A.; Hardesty, R. Michael; Lundquist, Julie K.; Delgado, Ruben; Valerio Iungo, G.; Ashton, Ryan; Debnath, Mithu; Bianco, Laura; Wilczak, James M.; Oncley, Steven; Wolfe, Daniel

    2017-01-01

    Accurate three-dimensional information of wind flow fields can be an important tool in not only visualizing complex flow but also understanding the underlying physical processes and improving flow modeling. However, a thorough analysis of the measurement uncertainties is required to properly interpret results. The XPIA (eXperimental Planetary boundary layer Instrumentation Assessment) field campaign conducted at the Boulder Atmospheric Observatory (BAO) in Erie, CO, from 2 March to 31 May 2015 brought together a large suite of in situ and remote sensing measurement platforms to evaluate complex flow measurement strategies. In this paper, measurement uncertainties for different single and multi-Doppler strategies using simple scan geometries (conical, vertical plane and staring) are investigated. The tradeoffs (such as time-space resolution vs. spatial coverage) among the different measurement techniques are evaluated using co-located measurements made near the BAO tower. Sensitivity of the single-/multi-Doppler measurement uncertainties to the averaging period is investigated using the sonic anemometers installed on the BAO tower as the standard reference. Finally, the radiometer measurements are used to partition the measurement periods as a function of atmospheric stability to determine the effect of stability on measurement uncertainty. It was found that with an increase in spatial coverage and measurement complexity, the uncertainty in the wind measurement also increased. For multi-Doppler techniques, the increase in uncertainty for temporally uncoordinated measurements is possibly due to requiring additional assumptions of stationarity along with horizontal homogeneity and less representative line-of-sight velocity statistics. It was also found that wind speed measurement uncertainty was lower during stable conditions compared to unstable conditions.

  11. Policy implications of uncertainty in modeled life-cycle greenhouse gas emissions of biofuels.

    PubMed

    Mullins, Kimberley A; Griffin, W Michael; Matthews, H Scott

    2011-01-01

    Biofuels have received legislative support recently in California's Low-Carbon Fuel Standard and the Federal Energy Independence and Security Act. Both present new fuel types, but neither provides methodological guidelines for dealing with the inherent uncertainty in evaluating their potential life-cycle greenhouse gas emissions. Emissions reductions are based on point estimates only. This work demonstrates the use of Monte Carlo simulation to estimate life-cycle emissions distributions for ethanol and butanol produced from corn or switchgrass. Life-cycle emissions distributions for each feedstock and fuel pairing modeled span an order of magnitude or more. Using a streamlined life-cycle assessment, corn ethanol emissions range from 50 to 250 g CO2e/MJ, for example, and each feedstock-fuel pathway studied shows some probability of greater emissions than a distribution for gasoline. Potential GHG emissions reductions from displacing fossil fuels with biofuels are difficult to forecast given this high degree of uncertainty in life-cycle emissions, which is driven largely by the importance and uncertainty of indirect land-use change emissions. Incorporating uncertainty in the decision-making process can illuminate the risks of policy failure (e.g., increased emissions), and a calculated risk of failure due to uncertainty can be used to inform more appropriate reduction targets in future biofuel policies.
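    A minimal sketch of the Monte Carlo approach the abstract describes might look as follows; the component distributions and the gasoline baseline are illustrative assumptions, not the study's life-cycle inventory data:

```python
import random

random.seed(42)
N = 50_000
GASOLINE = 94.0  # g CO2e/MJ baseline (illustrative value, not from the paper)

def corn_ethanol_emissions():
    """One Monte Carlo draw of life-cycle emissions (g CO2e/MJ)."""
    # Hypothetical component distributions; the real study builds these
    # from detailed life-cycle inventories.
    farming = random.triangular(20, 45, 30)
    conversion = random.triangular(25, 50, 35)
    land_use_change = random.lognormvariate(3.0, 0.8)  # dominant, skewed source
    credit = -10.0  # co-product credit
    return farming + conversion + land_use_change + credit

samples = [corn_ethanol_emissions() for _ in range(N)]
p_worse = sum(s > GASOLINE for s in samples) / N
print(f"P(ethanol emissions > gasoline baseline) = {p_worse:.2%}")
```

    The output of interest is exactly the kind of "risk of policy failure" the abstract recommends reporting: not a point estimate, but the probability that the biofuel pathway exceeds the fossil baseline.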

  12. Application of Probability Methods to Assess Crash Modeling Uncertainty

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.; Stockwell, Alan E.; Hardy, Robin C.

    2003-01-01

    Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as: geometrically accurate models; human occupant models; and advanced material models to include nonlinear stress-strain behaviors, and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantify the effects of finite element modeling assumptions on the predicted responses. The vertical drop test of a Fokker F28 fuselage section will be the focus of this paper. The results of a probabilistic analysis using finite element simulations will be compared with experimental data.

  13. Application of Probability Methods to Assess Crash Modeling Uncertainty

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.; Stockwell, Alan E.; Hardy, Robin C.

    2007-01-01

    Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as: geometrically accurate models; human occupant models; and advanced material models to include nonlinear stress-strain behaviors, and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantify the effects of finite element modeling assumptions on the predicted responses. The vertical drop test of a Fokker F28 fuselage section will be the focus of this paper. The results of a probabilistic analysis using finite element simulations will be compared with experimental data.

  14. Global sensitivity analysis for identifying important parameters of nitrogen nitrification and denitrification under model uncertainty and scenario uncertainty

    NASA Astrophysics Data System (ADS)

    Chen, Zhuowei; Shi, Liangsheng; Ye, Ming; Zhu, Yan; Yang, Jinzhong

    2018-06-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. By using a new variance-based global sensitivity analysis method, this paper identifies important parameters for the nitrogen reactive transport problem with simultaneous consideration of these three uncertainties. A combination of three scenarios of soil temperature and two scenarios of soil moisture creates a total of six scenarios. Four alternative models describing the effect of soil temperature and moisture content are used to evaluate the reduction functions used for calculating actual reaction rates. The results show that for the nitrogen reactive transport problem, parameter importance varies substantially among different models and scenarios. The denitrification and nitrification processes are sensitive to soil moisture status rather than to the moisture-function parameter. The nitrification process becomes more important at low moisture content and low temperature; however, how the importance of nitrification activity changes with temperature depends strongly on the selected model. Model averaging is suggested to assess the nitrification (or denitrification) contribution while reducing possible model error. Whether or not biochemical heterogeneity is introduced, a fairly consistent parameter importance ranking is obtained in this study: the optimal denitrification rate (Kden) is the most important parameter; the reference temperature (Tr) is more important than the temperature coefficient (Q10); the empirical constant in the moisture response function (m) is the least important. The vertical distribution of soil moisture, but not of temperature, plays the predominant role in controlling nitrogen reactions. This study provides insight into nitrogen reactive transport modeling and demonstrates an effective strategy for selecting important parameters when future temperature and soil moisture carry uncertainties or when modelers are faced with multiple ways of establishing nitrogen models.
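    The variance-based (Sobol-style) sensitivity measure used in studies like this can be sketched with a toy reaction-rate model; the reduction functions, parameter ranges, and the brute-force double-loop estimator below are all simplifying assumptions, not the paper's method:

```python
import random
import statistics

random.seed(0)

def rate(k_den, temp, theta, q10=2.0, t_ref=20.0, m=1.5):
    """Actual rate = optimal rate x temperature x moisture reduction functions."""
    f_t = q10 ** ((temp - t_ref) / 10.0)  # Q10 temperature response
    f_w = theta ** m                      # power-law moisture response
    return k_den * f_t * f_w

def sample_inputs():
    return (random.uniform(0.05, 0.25),   # k_den: optimal denitrification rate (1/d)
            random.uniform(5.0, 30.0),    # soil temperature (deg C)
            random.uniform(0.3, 0.9))     # relative moisture content

def first_order_index(which, n_outer=200, n_inner=200):
    """First-order index: variance of the conditional mean over total variance."""
    cond_means = []
    for _ in range(n_outer):
        fixed = sample_inputs()[which]
        total = 0.0
        for _ in range(n_inner):
            x = list(sample_inputs())
            x[which] = fixed              # freeze the factor of interest
            total += rate(*x)
        cond_means.append(total / n_inner)
    all_y = [rate(*sample_inputs()) for _ in range(n_outer * n_inner)]
    return statistics.variance(cond_means) / statistics.variance(all_y)

for name, i in (("k_den", 0), ("temperature", 1), ("moisture", 2)):
    print(f"S1[{name}] ~ {first_order_index(i):.2f}")
```

    The double-loop estimator is the simplest to read but noisy and biased upward by the inner-loop Monte Carlo error; production sensitivity analyses use pick-freeze schemes (Saltelli sampling) instead.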

  15. A Composite View of Ozone Evolution in the 1995-1996 Northern Winter Polar Vortex Developed from Airborne Lidar and Satellite Observations

    NASA Technical Reports Server (NTRS)

    Douglass, A. R.; Schoeberl, M. R.; Kawa, S. R.; Browell, E. V.

    2000-01-01

    The processes which contribute to the ozone evolution in the high latitude northern lower stratosphere are evaluated using a three dimensional model simulation and ozone observations. The model uses winds and temperatures from the Goddard Earth Observing System Data Assimilation System. The simulation results are compared with ozone observations from three platforms: the differential absorption lidar (DIAL) which was flown on the NASA DC-8 as part of the Vortex Ozone Transport Experiment; the Microwave Limb Sounder (MLS); the Polar Ozone and Aerosol Measurement (POAM II) solar occultation instrument. Time series for the different data sets are consistent with each other, and diverge from model time series during December and January. The model ozone in December and January is shown to be much less sensitive to the model photochemistry than to the model vertical transport, which depends on the model vertical motion as well as the model vertical gradient. We evaluate the dependence of model ozone evolution on the model ozone gradient by comparing simulations with different initial conditions for ozone. The modeled ozone throughout December and January most closely resembles observed ozone when the vertical profiles between 12 and 20 km within the polar vortex closely match December DIAL observations. We make a quantitative estimate of the uncertainty in the vertical advection using diabatic trajectory calculations. The net transport uncertainty is significant, and should be accounted for when comparing observations with model ozone. The observed and modeled ozone time series during December and January are consistent when these transport uncertainties are taken into account.

  16. External Peer Review Team Report Underground Testing Area Subproject for Frenchman Flat, Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sam Marutzky

    2010-09-01

    An external peer review was conducted to review the groundwater models used in the corrective action investigation stage of the Underground Test Area (UGTA) subproject to forecast zones of potential contamination in 1,000 years for the Frenchman Flat area. The goal of the external peer review was to provide technical evaluation of the studies and to assist in assessing the readiness of the UGTA subproject to progress to monitoring activities for further model evaluation. The external peer review team consisted of six independent technical experts with expertise in geology, hydrogeology, groundwater modeling, and radiochemistry. The peer review team was tasked with addressing the following questions: 1. Are the modeling approaches, assumptions, and model results for Frenchman Flat consistent with the use of modeling studies as a decision tool for resolution of environmental and regulatory requirements? 2. Do the modeling results adequately account for uncertainty in models of flow and transport in the Frenchman Flat hydrological setting? a. Are the models of sufficient scale/resolution to adequately predict contaminant transport in the Frenchman Flat setting? b. Have all key processes been included in the model? c. Are the methods used to forecast contaminant boundaries from the transport modeling studies reasonable and appropriate? d. Are the assessments of uncertainty technically sound and consistent with state-of-the-art approaches currently used in the hydrological sciences? 3. Are the datasets and modeling results adequate for a transition to Corrective Action Unit monitoring studies—the next stage in the UGTA strategy for Frenchman Flat? The peer review team is of the opinion that, with some limitations, the modeling approaches, assumptions, and model results are consistent with the use of modeling studies for resolution of environmental and regulatory requirements.
The peer review team further finds that the modeling studies have accounted for uncertainty in models of flow and transport in the Frenchman Flat except for a few deficiencies described in the report. Finally, the peer review team concludes that the UGTA subproject has explored a wide range of variations in assumptions, methods, and data, and should proceed to the next stage with an emphasis on monitoring studies. The corrective action strategy, as described in the Federal Facility Agreement and Consent Order, states that the groundwater flow and transport models for each corrective action unit will consider, at a minimum, the following: • Alternative hydrostratigraphic framework models of the modeling domain. • Uncertainty in the radiological and hydrological source terms. • Alternative models of recharge. • Alternative boundary conditions and groundwater flows. • Multiple permissive sets of calibrated flow models. • Probabilistic simulations of transport using plausible sets of alternative framework and recharge models, and boundary and groundwater flows from calibrated flow models. • Ensembles of forecasts of contaminant boundaries. • Sensitivity and uncertainty analyses of model outputs. The peer review team finds that these minimum requirements have been met. While the groundwater modeling and uncertainty analyses have been quite detailed, the peer review team has identified several modeling-related issues that should be addressed in the next phase of the corrective action activities: • Evaluating and using water-level gradients from the pilot wells at the Area 5 Radioactive Waste Management Site in model calibration. • Re-evaluating the use of geochemical age-dating data to constrain model calibrations. • Developing water budgets for the alluvial and upper volcanic aquifer systems in Frenchman Flat. 
• Considering modeling approaches in which calculated groundwater flow directions near the water table are not predetermined by model boundary conditions and areas of recharge, all of which are very uncertain. • Evaluating local-scale variations in hydraulic conductivity on the calculated contaminant boundaries. • Evaluating the effects of non-steady-state flow conditions on calculated contaminant boundaries, including the effects of long-term declines in water levels, climatic change, and disruption of the groundwater system by potential earthquake faulting along either of the two major controlling fault zones in the flow system (the Cane Spring and Rock Valley faults). • Considering the use of less-complex modeling approaches. • Evaluating the large change in water levels in the vicinity of the Frenchman Flat playa and developing a conceptual model to explain these water-level changes. • Developing a long-term groundwater level monitoring program for Frenchman Flat with regular monitoring of water levels at key monitoring wells. Despite these reservations, the peer review team strongly believes that the UGTA subproject should proceed to the next stage.

  17. Evaluation of Cost Leadership Strategy in Shipping Enterprises with Simulation Model

    NASA Astrophysics Data System (ADS)

    Ferfeli, Maria V.; Vaxevanou, Anthi Z.; Damianos, Sakas P.

    2009-08-01

    The present study attempts to evaluate the cost leadership strategy that prevails in certain shipping enterprises and to create simulation models based on the strategic model STAIR, an alternative method for evaluating strategic applications. The aim is to determine whether the strategy of cost leadership creates competitive advantage [1]; this is achieved via simulation, which captures the interactions between the operations of an enterprise and decision-making strategy under conditions of uncertainty while reducing the risk undertaken.

  18. HYDROLOGIC MODEL UNCERTAINTY ASSOCIATED WITH SIMULATING FUTURE LAND-COVER/USE SCENARIOS: A RETROSPECTIVE ANALYSIS

    EPA Science Inventory

    GIS-based hydrologic modeling offers a convenient means of assessing the impacts associated with land-cover/use change for environmental planning efforts. Alternative future scenarios can be used as input to hydrologic models and compared with existing conditions to evaluate pot...

  19. A Formal Approach to Empirical Dynamic Model Optimization and Validation

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G; Morelli, Eugene A.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    A framework was developed for the optimization and validation of empirical dynamic models subject to an arbitrary set of validation criteria. The validation requirements imposed upon the model, which may involve several sets of input-output data and arbitrary specifications in time and frequency domains, are used to determine if model predictions are within admissible error limits. The parameters of the empirical model are estimated by finding the parameter realization for which the smallest of the margins of requirement compliance is as large as possible. The uncertainty in the value of this estimate is characterized by studying the set of model parameters yielding predictions that comply with all the requirements. Strategies are presented for bounding this set, studying its dependence on admissible prediction error set by the analyst, and evaluating the sensitivity of the model predictions to parameter variations. This information is instrumental in characterizing uncertainty models used for evaluating the dynamic model at operating conditions differing from those used for its identification and validation. A practical example based on the short period dynamics of the F-16 is used for illustration.
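    A stripped-down sketch of the estimation idea, maximize the smallest margin of requirement compliance and then characterize the set of compliant parameters, can be written down for an assumed one-parameter exponential model with invented data:

```python
import math

# Two hypothetical input-output datasets for the model y = a * exp(-t)
datasets = [
    [(0.0, 1.9), (1.0, 0.75), (2.0, 0.28)],
    [(0.0, 2.1), (1.0, 0.70), (2.0, 0.25)],
]
TOL = 0.4  # admissible prediction error set by the analyst

def margin(a, data):
    """Smallest slack to the admissible error limit over one dataset."""
    return min(TOL - abs(a * math.exp(-t) - y) for t, y in data)

# Estimate a by maximizing the worst-case (smallest) compliance margin
candidates = [0.5 + 0.01 * i for i in range(300)]  # grid over a in [0.5, 3.5)
best_a = max(candidates, key=lambda a: min(margin(a, d) for d in datasets))
worst = min(margin(best_a, d) for d in datasets)
print(f"estimate a = {best_a:.2f}, worst-case margin = {worst:.3f}")

# The validated parameter set: all a whose predictions meet every requirement
valid = [a for a in candidates if all(margin(a, d) > 0 for d in datasets)]
print(f"compliant parameter range: [{valid[0]:.2f}, {valid[-1]:.2f}]")
```

    The width of the compliant range plays the role of the uncertainty characterization described in the abstract: it shrinks as the analyst tightens the admissible error, and it directly bounds how far the model can be trusted away from the identification data.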

  20. Decay heat uncertainty for BWR used fuel due to modeling and nuclear data uncertainties

    DOE PAGES

    Ilas, Germina; Liljenfeldt, Henrik

    2017-05-19

    Characterization of the energy released from radionuclide decay in nuclear fuel discharged from reactors is essential for the design, safety, and licensing analyses of used nuclear fuel storage, transportation, and repository systems. There are a limited number of decay heat measurements available for commercial used fuel applications. Because decay heat measurements can be expensive or impractical for covering the multitude of existing fuel designs, operating conditions, and specific application purposes, decay heat estimation relies heavily on computer code prediction. Uncertainty evaluation for calculated decay heat is an important aspect when assessing code prediction and a key factor supporting decision making for used fuel applications. While previous studies have largely focused on uncertainties in code predictions due to nuclear data uncertainties, this study discusses uncertainties in calculated decay heat due to uncertainties in assembly modeling parameters as well as in nuclear data. Capabilities in the SCALE nuclear analysis code system were used to quantify the effect on calculated decay heat of uncertainties in nuclear data and selected manufacturing and operation parameters for a typical boiling water reactor (BWR) fuel assembly. Furthermore, the BWR fuel assembly used as the reference case for this study was selected from a set of assemblies for which high-quality decay heat measurements are available, to assess the significance of the results through comparison with calculated and measured decay heat data.

  1. Decay heat uncertainty for BWR used fuel due to modeling and nuclear data uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilas, Germina; Liljenfeldt, Henrik

    Characterization of the energy released from radionuclide decay in nuclear fuel discharged from reactors is essential for the design, safety, and licensing analyses of used nuclear fuel storage, transportation, and repository systems. There are a limited number of decay heat measurements available for commercial used fuel applications. Because decay heat measurements can be expensive or impractical for covering the multitude of existing fuel designs, operating conditions, and specific application purposes, decay heat estimation relies heavily on computer code prediction. Uncertainty evaluation for calculated decay heat is an important aspect when assessing code prediction and a key factor supporting decision making for used fuel applications. While previous studies have largely focused on uncertainties in code predictions due to nuclear data uncertainties, this study discusses uncertainties in calculated decay heat due to uncertainties in assembly modeling parameters as well as in nuclear data. Capabilities in the SCALE nuclear analysis code system were used to quantify the effect on calculated decay heat of uncertainties in nuclear data and selected manufacturing and operation parameters for a typical boiling water reactor (BWR) fuel assembly. Furthermore, the BWR fuel assembly used as the reference case for this study was selected from a set of assemblies for which high-quality decay heat measurements are available, to assess the significance of the results through comparison with calculated and measured decay heat data.

  2. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    DOE PAGES

    Harp, Dylan R.; Atchley, Adam L.; Painter, Scott L.; ...

    2016-02-11

    Here, the effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year to year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume and time integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation.
By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although significant, is less than that produced by structural climate model uncertainty for this location.« less
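    The Null-Space Monte Carlo idea above, generating parameter sets that perturb only directions the calibration data cannot constrain, can be sketched for a hypothetical linear toy model. The forward model, parameter values, and tolerances below are illustrative, not from the study:

```python
import numpy as np

# Hypothetical linear "forward model": 3 observations from 4 parameters.
def forward(p):
    return np.array([p[0] + p[1], p[1] - p[2], p[0] + p[3]])

p_cal = np.array([1.0, 2.0, 0.5, 0.3])   # calibrated parameter set
obs = forward(p_cal)                      # observations matched at calibration

# Finite-difference Jacobian of observations with respect to parameters.
eps = 1e-6
J = np.column_stack([(forward(p_cal + eps * np.eye(4)[i]) - obs) / eps
                     for i in range(4)])

# Right singular vectors with (near-)zero singular values span the null
# space: parameter directions the observations cannot constrain.
U, s, Vt = np.linalg.svd(J)
rank = int(np.sum(s > 1e-8))
V_null = Vt[rank:]                        # shape (n_null, 4)

# Monte Carlo: project random parameter perturbations onto the null space,
# so every ensemble member still honours the calibration data.
rng = np.random.default_rng(0)
ensemble = [p_cal + V_null.T @ (V_null @ rng.normal(0.0, 0.5, 4))
            for _ in range(100)]
```

    Each ensemble member reproduces the calibration observations while differing in the unconstrained parameter combinations, which is what lets the projection step quantify calibration-constrained uncertainty.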

  3. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    DOE PAGES

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; ...

    2015-06-29

    The effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness (ALT) and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume- and time-integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation.
    As a result, by comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although significant, is less than that produced by structural climate model uncertainty for this location.

  4. Effect of soil property uncertainties on permafrost thaw projections: a calibration-constrained analysis

    NASA Astrophysics Data System (ADS)

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2015-06-01

    The effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness (ALT) and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume- and time-integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation.
    By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although significant, is less than that produced by structural climate model uncertainty for this location.

  5. Highly Enriched Uranium Metal Cylinders Surrounded by Various Reflector Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernard Jones; J. Blair Briggs; Leland Monteirth

    A series of experiments was performed at Los Alamos Scientific Laboratory in 1958 to determine critical masses of cylinders of Oralloy (Oy) reflected by a number of materials. The experiments were all performed on the Comet Universal Critical Assembly Machine, and consisted of discs of highly enriched uranium (93.3 wt.% 235U) reflected by half-inch and one-inch-thick cylindrical shells of various reflector materials. The experiments were performed by members of Group N-2, particularly K. W. Gallup, G. E. Hansen, H. C. Paxton, and R. H. White. This experiment was intended to ascertain critical masses for criticality safety purposes, as well as to compare neutron transport cross sections to those obtained from danger coefficient measurements with the Topsy Oralloy-Tuballoy reflected and Godiva unreflected critical assemblies. The reflector materials examined in this series of experiments are as follows: magnesium, titanium, aluminum, graphite, mild steel, nickel, copper, cobalt, molybdenum, natural uranium, tungsten, beryllium, aluminum oxide, molybdenum carbide, and polythene (polyethylene). Also included are two special configurations of composite beryllium and iron reflectors. Analyses were performed in which the uncertainty associated with six different parameters was evaluated; namely, extrapolation to the uranium critical mass, uranium density, 235U enrichment, reflector density, reflector thickness, and reflector impurities. In addition to the idealizations made by the experimenters (removal of the platen and diaphragm), two simplifications were also made to the benchmark models that resulted in a small bias and additional uncertainty. First, since impurities in core and reflector materials are only estimated, they are not included in the benchmark models. Second, the room, support structure, and other possible surrounding equipment were not included in the model.
    Bias values that result from these two simplifications were determined, and the associated uncertainty in the bias values was included in the overall uncertainty in benchmark keff values. Bias values were very small, ranging from 0.0004 Δk low to 0.0007 Δk low. Overall uncertainties range from ± 0.0018 to ± 0.0030. Major contributors to the overall uncertainty include uncertainty in the extrapolation to the uranium critical mass and the uranium density. Results are summarized in Figure 1 (Experimental, Benchmark-Model, and MCNP/KENO Calculated Results). The 32 configurations described and evaluated under ICSBEP Identifier HEU-MET-FAST-084 are judged to be acceptable for use as criticality safety benchmark experiments and should be valuable integral benchmarks for nuclear data testing of the various reflector materials. Details of the benchmark models, uncertainty analyses, and final results are given in this paper.

  6. Using uncertainty to link and rank evidence from biomedical literature for model curation

    PubMed Central

    Zerva, Chrysoula; Batista-Navarro, Riza; Day, Philip; Ananiadou, Sophia

    2017-01-01

    Motivation: In recent years, there has been great progress in the field of automated curation of biomedical networks and models, aided by text mining methods that provide evidence from literature. Such methods must not only extract snippets of text that relate to model interactions, but also be able to contextualize the evidence and provide additional confidence scores for the interaction in question. Although various approaches calculating confidence scores have focused primarily on the quality of the extracted information, there has been little work on exploring the textual uncertainty conveyed by the author. Despite textual uncertainty being acknowledged in biomedical text mining as an attribute of text mined interactions (events), it is significantly understudied as a means of providing a confidence measure for interactions in pathways or other biomedical models. In this work, we focus on improving identification of textual uncertainty for events and explore how it can be used as an additional measure of confidence for biomedical models. Results: We present a novel method for extracting uncertainty from the literature using a hybrid approach that combines rule induction and machine learning. Variations of this hybrid approach are then discussed, alongside their advantages and disadvantages. We use subjective logic theory to combine multiple uncertainty values extracted from different sources for the same interaction. Our approach achieves F-scores of 0.76 and 0.88 based on the BioNLP-ST and Genia-MK corpora, respectively, making considerable improvements over previously published work. Moreover, we evaluate our proposed system on pathways related to two different areas, namely leukemia and melanoma cancer research. Availability and implementation: The leukemia pathway model used is available in Pathway Studio, while the Ras model is available via PathwayCommons.
    An online demonstration of the uncertainty extraction system is available for research purposes at http://argo.nactem.ac.uk/test. The related code is available at https://github.com/c-zrv/uncertainty_components.git. Details on the above are available in the Supplementary Material. Contact: sophia.ananiadou@manchester.ac.uk. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:29036627
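    The subjective-logic combination step can be illustrated with the standard cumulative fusion operator over (belief, disbelief, uncertainty) opinions. The numeric opinions below are invented, and the operator shown is the textbook cumulative-fusion rule, not necessarily the exact variant used by the authors:

```python
def fuse(o1, o2):
    """Cumulative fusion of two subjective-logic opinions (b, d, u),
    each satisfying b + d + u = 1."""
    b1, d1, u1 = o1
    b2, d2, u2 = o2
    k = u1 + u2 - u1 * u2          # normaliser (assumes u1, u2 not both 0)
    return ((b1 * u2 + b2 * u1) / k,
            (d1 * u2 + d2 * u1) / k,
            (u1 * u2) / k)

# Two evidence snippets for the same interaction: one hedged
# ("may activate") and one more confident ("activates").
combined = fuse((0.5, 0.1, 0.4), (0.8, 0.1, 0.1))
```

    Fusing independent evidence drives the combined uncertainty below that of either source, which is the behaviour one wants when ranking interactions by accumulated support.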

  7. Using uncertainty to link and rank evidence from biomedical literature for model curation.

    PubMed

    Zerva, Chrysoula; Batista-Navarro, Riza; Day, Philip; Ananiadou, Sophia

    2017-12-01

    In recent years, there has been great progress in the field of automated curation of biomedical networks and models, aided by text mining methods that provide evidence from literature. Such methods must not only extract snippets of text that relate to model interactions, but also be able to contextualize the evidence and provide additional confidence scores for the interaction in question. Although various approaches calculating confidence scores have focused primarily on the quality of the extracted information, there has been little work on exploring the textual uncertainty conveyed by the author. Despite textual uncertainty being acknowledged in biomedical text mining as an attribute of text mined interactions (events), it is significantly understudied as a means of providing a confidence measure for interactions in pathways or other biomedical models. In this work, we focus on improving identification of textual uncertainty for events and explore how it can be used as an additional measure of confidence for biomedical models. We present a novel method for extracting uncertainty from the literature using a hybrid approach that combines rule induction and machine learning. Variations of this hybrid approach are then discussed, alongside their advantages and disadvantages. We use subjective logic theory to combine multiple uncertainty values extracted from different sources for the same interaction. Our approach achieves F-scores of 0.76 and 0.88 based on the BioNLP-ST and Genia-MK corpora, respectively, making considerable improvements over previously published work. Moreover, we evaluate our proposed system on pathways related to two different areas, namely leukemia and melanoma cancer research. The leukemia pathway model used is available in Pathway Studio while the Ras model is available via PathwayCommons. Online demonstration of the uncertainty extraction system is available for research purposes at http://argo.nactem.ac.uk/test. 
    The related code is available at https://github.com/c-zrv/uncertainty_components.git. Details on the above are available in the Supplementary Material. Contact: sophia.ananiadou@manchester.ac.uk. Supplementary data are available at Bioinformatics online.

  8. Development of Semi-distributed ecohydrological model in the Rio Grande De Manati River Basin, Puerto Rico

    NASA Astrophysics Data System (ADS)

    Setegn, S. G.; Ortiz, J.; Melendez, J.; Barreto, M.; Torres-Perez, J. L.; Guild, L. S.

    2015-12-01

    There are limited studies in Puerto Rico that show water resources availability and variability with respect to changing climate and land use. The main goal of the NASA-funded project Human Impacts to Coastal Ecosystems in Puerto Rico (HICE-PR): the Río Loco Watershed (southwest coast PR) is to evaluate the impacts of land use/land cover changes on the quality and extent of coastal and marine ecosystems (CMEs) in two priority watersheds in Puerto Rico (Manatí and Guánica). The main objective of this study is to set up a physically based, spatially distributed hydrological model, the Soil and Water Assessment Tool (SWAT), for the analysis of hydrological processes in the Rio Grande de Manati river basin. SWAT is a spatially distributed watershed model developed to predict the impact of land management practices on water, sediment and agricultural chemical yields in large complex watersheds. For efficient use of distributed models for hydrological and scenario analysis, it is important that these models pass through careful calibration and uncertainty analysis. The model was calibrated and validated using the Sequential Uncertainty Fitting (SUFI-2) calibration and uncertainty analysis algorithm. The model evaluation statistics for streamflow prediction show good agreement between measured and simulated flows, verified by coefficients of determination and Nash-Sutcliffe efficiencies greater than 0.5. Keywords: Hydrological Modeling; SWAT; SUFI-2; Rio Grande De Manati; Puerto Rico
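    The Nash-Sutcliffe efficiency (NSE) acceptance criterion quoted above is easy to state concretely. A minimal implementation, with made-up flow values, might look like:

```python
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the model
    predicts no better than the mean of the observations."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Hypothetical daily streamflow (m^3/s); a fit with NSE > 0.5 would meet
# the acceptance criterion quoted in the abstract.
obs = [12.0, 15.0, 30.0, 22.0, 18.0]
sim = [11.0, 16.0, 27.0, 24.0, 17.0]
score = nse(obs, sim)
```

    NSE can be negative, in which case the simulated series is a worse predictor than simply using the observed mean.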

  9. Uncertainty characterization and quantification in air pollution models. Application to the ADMS-Urban model.

    NASA Astrophysics Data System (ADS)

    Debry, E.; Malherbe, L.; Schillinger, C.; Bessagnet, B.; Rouil, L.

    2009-04-01

    Evaluation of human exposure to atmospheric pollution usually requires the knowledge of pollutant concentrations in ambient air. In the framework of the PAISA project, which studies the influence of socio-economic status on relationships between air pollution and short-term health effects, the concentrations of gas and particle pollutants are computed over Strasbourg with the ADMS-Urban model. As for any modeling result, simulated concentrations come with uncertainties which have to be characterized and quantified. There are several sources of uncertainty: those related to input data and parameters, i.e. the fields used to execute the model such as meteorological fields, boundary conditions and emissions; those related to the model formulation, because of incomplete or inaccurate treatment of dynamical and chemical processes; and those inherent to the stochastic behavior of the atmosphere and of human activities [1]. Our aim here is to assess the uncertainties of the simulated concentrations with respect to input data and model parameters. The first step therefore consisted in identifying the input data and model parameters that contribute most to the space and time variability of predicted concentrations. Concentrations of several pollutants were simulated for two months in winter 2004 and two months in summer 2004 over five areas of Strasbourg. The sensitivity analysis shows the dominating influence of boundary conditions and emissions. Among model parameters, the roughness and Monin-Obukhov lengths appear to have non-negligible local effects. Dry deposition is also an important dynamic process. The second step of the characterization and quantification of uncertainties consists in attributing a probability distribution to each input datum and model parameter and in propagating the joint distribution of all data and parameters through the model, so as to associate a probability distribution with the modeled concentrations.
    Several analytical and numerical methods exist to perform an uncertainty analysis. We chose the Monte Carlo method, which has already been applied to atmospheric dispersion models [2, 3, 4]. The main advantage of this method is that it is insensitive to the number of perturbed parameters, but its drawbacks are its computational cost and its slow convergence. To speed up convergence, we used the method of antithetic variables, which takes advantage of the symmetry of the probability laws. The air quality model simulations were carried out by the Association for the Study and Monitoring of Atmospheric Pollution in Alsace (ASPA). The output concentration distributions can then be updated with a Bayesian method. This work is part of an INERIS research project also aiming at assessing the uncertainty of the CHIMERE dispersion model used in the Prev'Air forecasting platform (www.prevair.org) in order to deliver more accurate predictions. (1) Rao, K.S. Uncertainty Analysis in Atmospheric Dispersion Modeling, Pure and Applied Geophysics, 2005, 162, 1893-1917. (2) Beekmann, M. and Derognat, C. Monte Carlo uncertainty analysis of a regional-scale transport chemistry model constrained by measurements from the Atmospheric Pollution Over the PAris Area (ESQUIF) campaign, Journal of Geophysical Research, 2003, 108, 8559-8576. (3) Hanna, S.R., Lu, Z., Frey, H.C., Wheeler, N., Vukovich, J., Arunachalam, S., Fernau, M. and Hansen, D.A. Uncertainties in predicted ozone concentrations due to input uncertainties for the UAM-V photochemical grid model applied to the July 1995 OTAG domain, Atmospheric Environment, 2001, 35, 891-903. (4) Romanowicz, R., Higson, H. and Teasdale, I. Bayesian uncertainty estimation methodology applied to air pollution modelling, Environmetrics, 2000, 11, 351-371.
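    The antithetic-variables device mentioned above pairs each perturbation with its mirror image, reducing estimator variance when the response is monotone in the input. A self-contained sketch with a stand-in scalar model (not ADMS-Urban):

```python
import numpy as np

rng = np.random.default_rng(0)

def response(z):
    # Stand-in for a dispersion-model output as a function of one
    # standardised uncertain input (e.g. a perturbed emission factor).
    return np.exp(0.3 * z)

n_pairs = 5000
z = rng.standard_normal(n_pairs)

# Each Gaussian draw z is paired with -z; the symmetry of the input
# distribution makes the pair average an unbiased, lower-variance sample.
pair_means = 0.5 * (response(z) + response(-z))
estimate = pair_means.mean()

# Plain Monte Carlo with the same number of model evaluations, for contrast.
plain = response(rng.standard_normal(n_pairs))
```

    For this lognormal response the true mean is exp(0.3²/2) ≈ 1.046, and the pair averages scatter far less than the raw samples, which is exactly the convergence speed-up the abstract describes.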

  10. Numerical modelling of glacial lake outburst floods using physically based dam-breach models

    NASA Astrophysics Data System (ADS)

    Westoby, M. J.; Brasington, J.; Glasser, N. F.; Hambrey, M. J.; Reynolds, J. M.; Hassan, M. A. A. M.; Lowe, A.

    2015-03-01

    The instability of moraine-dammed proglacial lakes creates the potential for catastrophic glacial lake outburst floods (GLOFs) in high-mountain regions. In this research, we use a unique combination of numerical dam-breach and two-dimensional hydrodynamic modelling, employed within a generalised likelihood uncertainty estimation (GLUE) framework, to quantify predictive uncertainty in model outputs associated with a reconstruction of the Dig Tsho failure in Nepal. Monte Carlo analysis was used to sample the model parameter space, and morphological descriptors of the moraine breach were used to evaluate model performance. Multiple breach scenarios were produced by differing parameter ensembles associated with a range of breach initiation mechanisms, including overtopping waves and mechanical failure of the dam face. The material roughness coefficient was found to exert a dominant influence over model performance. The downstream routing of scenario-specific breach hydrographs revealed significant differences in the timing and extent of inundation. A GLUE-based methodology for constructing probabilistic maps of inundation extent, flow depth, and hazard is presented and provides a useful tool for communicating uncertainty in GLOF hazard assessment.
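    A GLUE analysis of the kind described reduces to three steps: Monte Carlo sampling of the parameter space, an informal likelihood scoring each run against the observed breach, and likelihood-weighted predictive bounds over the "behavioural" runs. A sketch with a hypothetical one-parameter breach model (the model, target, and threshold are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

def breach_model(roughness):
    # Hypothetical one-parameter stand-in for the dam-breach model:
    # peak discharge (m^3/s) falls as material roughness rises.
    return 1000.0 / roughness

observed_peak = 25000.0                      # invented calibration target

# 1. Monte Carlo sampling of the parameter space.
roughness = rng.uniform(0.02, 0.08, 5000)
sims = breach_model(roughness)

# 2. Informal likelihood; runs scoring below a threshold are rejected
#    as non-behavioural.
lik = 1.0 / (1.0 + ((sims - observed_peak) / observed_peak) ** 2)
keep = lik > 0.9

# 3. Likelihood-weighted 5-95% predictive bounds from behavioural runs.
w = lik[keep] / lik[keep].sum()
order = np.argsort(sims[keep])
sorted_sims, cdf = sims[keep][order], np.cumsum(w[order])
lower = sorted_sims[np.searchsorted(cdf, 0.05)]
upper = sorted_sims[np.searchsorted(cdf, 0.95)]
```

    The same weighted-quantile machinery, applied cell by cell to routed flood depths, is what yields the probabilistic inundation and hazard maps described in the abstract.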

  11. Mathematics applied to the climate system: outstanding challenges and recent progress

    PubMed Central

    Williams, Paul D.; Cullen, Michael J. P.; Davey, Michael K.; Huthnance, John M.

    2013-01-01

    The societal need for reliable climate predictions and a proper assessment of their uncertainties is pressing. Uncertainties arise not only from initial conditions and forcing scenarios, but also from model formulation. Here, we identify and document three broad classes of problems, each representing what we regard to be an outstanding challenge in the area of mathematics applied to the climate system. First, there is the problem of the development and evaluation of simple physically based models of the global climate. Second, there is the problem of the development and evaluation of the components of complex models such as general circulation models. Third, there is the problem of the development and evaluation of appropriate statistical frameworks. We discuss these problems in turn, emphasizing the recent progress made by the papers presented in this Theme Issue. Many pressing challenges in climate science require closer collaboration between climate scientists, mathematicians and statisticians. We hope the papers contained in this Theme Issue will act as inspiration for such collaborations and for setting future research directions. PMID:23588054

  12. Influence of air quality model resolution on uncertainty associated with health impacts

    NASA Astrophysics Data System (ADS)

    Thompson, T. M.; Selin, N. E.

    2012-06-01

    We use regional air quality modeling to evaluate the impact of model resolution on uncertainty associated with the human health benefits resulting from proposed air quality regulations. Using a regional photochemical model (CAMx), we ran a modeling episode with meteorological inputs representing conditions as they occurred during August through September 2006, and two emissions inventories (a 2006 base case and a 2018 proposed control scenario, both for Houston, Texas) at 36, 12, 4 and 2 km resolution. The base case model performance was evaluated for each resolution against daily maximum 8-h averaged ozone measured at monitoring stations. Results from each resolution were more similar to each other than they were to measured values. Population-weighted ozone concentrations were calculated for each resolution and applied to concentration response functions (with 95% confidence intervals) to estimate the health impacts of modeled ozone reduction from the base case to the control scenario. We found that estimated avoided mortalities were not significantly different between 2, 4 and 12 km resolution runs, but 36 km resolution may over-predict some potential health impacts. Given the cost/benefit analysis requirements of the Clean Air Act, the uncertainty associated with human health impacts and therefore the results reported in this study, we conclude that health impacts calculated from population weighted ozone concentrations obtained using regional photochemical models at 36 km resolution fall within the range of values obtained using fine (12 km or finer) resolution modeling. However, in some cases, 36 km resolution may not be fine enough to statistically replicate the results achieved using 2 and 4 km resolution. On average, when modeling at 36 km resolution, 7 deaths per ozone month were avoided because of ozone reductions resulting from the proposed emissions reductions (95% confidence interval was 2-9). 
    When modeling at finer resolution (2, 4, or 12 km), on average 5 deaths were avoided due to the same reductions (95% confidence interval was 2-7). Initial results for this specific region show that modeling at a resolution finer than 12 km is unlikely to improve uncertainty in benefits analysis. We suggest that 12 km resolution may be appropriate for uncertainty analyses in areas with similar chemistry, but that resolution requirements should be assessed on a case-by-case basis and revised as confidence intervals for concentration-response functions are updated.
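    The population-weighting and concentration-response calculation described above can be sketched as follows. The grid values, β coefficient, and baseline rate are placeholders, not the values used in the study:

```python
import numpy as np

# Hypothetical grid cells: ozone reduction (ppb, base minus control
# scenario) and exposed population per cell.
delta_o3 = np.array([4.0, 6.0, 2.0, 8.0])
pop      = np.array([2e5, 5e5, 1e5, 3e5])

# Population-weighted ozone change, as applied before the CRF step.
pw_delta = np.sum(delta_o3 * pop) / pop.sum()

# Log-linear concentration-response function; beta and the baseline
# mortality rate are illustrative, not from any specific epi study.
beta = 0.0005          # per ppb
baseline_rate = 5e-5   # deaths per person per ozone season

avoided = pop.sum() * baseline_rate * (1.0 - np.exp(-beta * pw_delta))
```

    Coarser model grids change delta_o3 per cell (and how population aligns with it), which is how resolution propagates into the avoided-mortality estimates compared in the abstract.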

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savy, J.

    New design and evaluation guidelines for Department of Energy facilities subjected to natural phenomena hazards are being finalized. Although still in draft form at this time, the document describing those guidelines should be considered an update of previously available guidelines. The recommendations in the guidelines document mentioned above, simply referred to as "the guidelines" hereafter, are based on the best information available at the time of its development. In particular, the seismic hazard model for the Princeton site was based on a study performed in 1981 for Lawrence Livermore National Laboratory (LLNL), which relied heavily on the results of the NRC's Systematic Evaluation Program and was based on a methodology and data sets developed in 1977 and 1978. Considerable advances have been made in the last ten years in the domain of seismic hazard modeling. Thus, it is recommended to update the estimate of the seismic hazard at the DOE sites whenever possible. The major differences between previous estimates and the ones proposed in this study for the PPPL are in the modeling of the strong ground motion at the site, and the treatment of the total uncertainty in the estimates to include knowledge uncertainty, random uncertainty, and expert opinion diversity as well. 28 refs.

  14. Towards Interpreting the Signal of CO2 Emissions from Megacities by Applying a Lagrangian Receptor-oriented Model to OCO-2 XCO2 data

    NASA Astrophysics Data System (ADS)

    Wu, D.; Lin, J. C.; Oda, T.; Ye, X.; Lauvaux, T.; Yang, E. G.; Kort, E. A.

    2017-12-01

    Urban regions are large emitters of CO2 whose emission inventories are still associated with large uncertainties. Therefore, a strong need exists to better quantify emissions from megacities using a top-down approach. Satellites such as the Orbiting Carbon Observatory 2 (OCO-2) provide a platform for monitoring spatiotemporal column CO2 concentrations (XCO2). In this study, we present a Lagrangian receptor-oriented model framework and evaluate "model-retrieved" XCO2 by comparing against OCO-2-retrieved XCO2 for three megacities/regions (Riyadh, Cairo and the Pearl River Delta). OCO-2 soundings indicate pronounced XCO2 enhancements (dXCO2) when crossing Riyadh, which are successfully captured by our model with a slight latitude shift. From this model framework, we can identify and compare the relative contributions to dXCO2 from anthropogenic emissions versus biospheric fluxes. In addition, to impose constraints on emissions for Riyadh through inversion methods, three uncertainty sources are addressed in this study: 1) transport errors, 2) receptor and model setups in atmospheric models, and 3) urban emission uncertainties. For 1), we calculate transport errors by adding a wind error component to randomize particle distributions. For 2), a set of sensitivity tests using the bootstrap method is performed to describe proper ways to set up receptors in Lagrangian models. For 3), both emission uncertainties from the Fossil Fuel Data Assimilation System (FFDAS) and the spread among three emission inventories are used to approximate an overall fractional uncertainty in the modeled anthropogenic signal (dXCO2.anthro). Lastly, we investigate the definition of background (clean) XCO2 for megacities from retrieved XCO2 by means of statistical tools and our model framework.

  15. Comparison of Homogeneous and Heterogeneous CFD Fuel Models for Phase I of the IAEA CRP on HTR Uncertainties Benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhard Strydom; Su-Jong Yoon

    2014-04-01

    Computational Fluid Dynamics (CFD) evaluation of homogeneous and heterogeneous fuel models was performed as part of the Phase I calculations of the International Atomic Energy Agency (IAEA) Coordinated Research Project (CRP) on High Temperature Reactor (HTR) Uncertainties in Modeling (UAM). This study was focused on the nominal localized stand-alone fuel thermal response, as defined in Ex. I-3 and I-4 of the HTR UAM. The aim of the stand-alone thermal unit-cell simulation is to isolate the effect of material and boundary input uncertainties on a very simplified problem, before propagation of these uncertainties is performed in subsequent coupled neutronics/thermal fluids phases of the benchmark. In many of the previous studies for high temperature gas cooled reactors, the volume-averaged homogeneous mixture model of a single fuel compact has been applied. In the homogeneous model, the Tristructural Isotropic (TRISO) fuel particles in the fuel compact are not modeled directly, and an effective thermal conductivity is employed for the thermo-physical properties of the fuel compact. In the heterogeneous model, by contrast, the uranium carbide (UCO), inner and outer pyrolytic carbon (IPyC/OPyC) and silicon carbide (SiC) layers of the TRISO fuel particles are explicitly modeled, and the fuel compact is modeled as a heterogeneous mixture of TRISO fuel kernels embedded in H-451 matrix graphite. In this study, steady-state and transient CFD simulations were performed with both homogeneous and heterogeneous models to compare their thermal characteristics. The nominal values of the input parameters are used for this CFD analysis. In a future study, the effects of input uncertainties in the material properties and boundary parameters will be investigated and reported.

  16. MODFLOW 2000 Head Uncertainty, a First-Order Second Moment Method

    USGS Publications Warehouse

    Glasgow, H.S.; Fortney, M.D.; Lee, J.; Graettinger, A.J.; Reeves, H.W.

    2003-01-01

    A computationally efficient method to estimate the variance and covariance in piezometric head results computed through MODFLOW 2000 using a first-order second moment (FOSM) approach is presented. This methodology employs a first-order Taylor series expansion to combine model sensitivity with uncertainty in geologic data. MODFLOW 2000 is used to calculate both the ground water head and the sensitivity of head to changes in input data. From a limited number of samples, geologic data are extrapolated and their associated uncertainties are computed through a conditional probability calculation. Combining the spatially related sensitivity and input uncertainty produces the variance-covariance matrix, the diagonal of which is used to yield the standard deviation in MODFLOW 2000 head. The variance in piezometric head can be used for calibrating the model, estimating confidence intervals, directing exploration, and evaluating the reliability of a design. A case study illustrates the approach, where aquifer transmissivity is the spatially related uncertain geologic input data. The FOSM methodology is shown to be applicable for calculating output uncertainty for (1) spatially related input and output data, and (2) multiple input parameters (transmissivity and recharge).
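    The FOSM propagation itself is one line of linear algebra once the sensitivities are in hand: the head covariance is the Jacobian-weighted input covariance, C_h = J C_T Jᵀ. A toy numeric example (Jacobian and covariance invented, not from the case study):

```python
import numpy as np

# Hypothetical linearized head model: sensitivities dh/dT of 3 head
# locations to 2 transmissivity zones, evaluated at calibrated values
# (in MODFLOW 2000 these come from the sensitivity process).
J = np.array([[0.8, 0.1],
              [0.3, 0.6],
              [0.1, 0.9]])

# Input covariance of log-transmissivity, e.g. from conditional
# geostatistical analysis of the sampled data.
C_T = np.array([[0.25, 0.05],
                [0.05, 0.16]])

# FOSM: first-order propagation of input covariance to head covariance.
C_h = J @ C_T @ J.T
head_std = np.sqrt(np.diag(C_h))   # standard deviation of head at each node
```

    The diagonal of C_h is what yields the head standard deviations used for confidence intervals and reliability screening; the off-diagonal terms carry the spatial correlation between head errors.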

  17. Generation Expansion Planning With Large Amounts of Wind Power via Decision-Dependent Stochastic Programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhan, Yiduo; Zheng, Qipeng P.; Wang, Jianhui

    Power generation expansion planning needs to deal with future uncertainties carefully, given that the invested generation assets will be in operation for a long time. Many stochastic programming models have been proposed to tackle this challenge. However, most previous works assume predetermined future uncertainties (i.e., fixed random outcomes with given probabilities). In several recent studies of generation assets' planning (e.g., thermal versus renewable), new findings show that the investment decisions could affect the future uncertainties as well. To this end, this paper proposes a multistage decision-dependent stochastic optimization model for long-term large-scale generation expansion planning, where large amounts of wind power are involved. In the decision-dependent model, the future uncertainties are not only affecting but also affected by the current decisions. In particular, the probability distribution function is determined by not only input parameters but also decision variables. To deal with the nonlinear constraints in our model, a quasi-exact solution approach is then introduced to reformulate the multistage stochastic investment model to a mixed-integer linear programming model. The wind penetration, investment decisions, and the optimality of the decision-dependent model are evaluated in a series of multistage case studies. The results show that the proposed decision-dependent model provides effective optimization solutions for long-term generation expansion planning.

  18. Assessment of uncertainty in discrete fracture network modeling using probabilistic distribution method.

    PubMed

    Wei, Yaqiang; Dong, Yanhui; Yeh, Tian-Chyi J; Li, Xiao; Wang, Liheng; Zha, Yuanyuan

    2017-11-01

    There have been widespread concerns about solute transport problems in fractured media, e.g., the disposal of high-level radioactive waste in geological fractured rocks. Numerical simulation of particle tracking is increasingly employed to address these issues. Traditional predictions of radioactive waste transport using discrete fracture network (DFN) models often consider a single realization of the fracture distribution based on fracture statistics, which significantly underestimates the uncertainty in the risk evaluation of a radioactive waste repository. To adequately assess this uncertainty in DFN modeling of a potential site for the disposal of high-level radioactive waste, this paper utilized the probabilistic distribution method (PDM). The method was applied to evaluate the risk of nuclear waste disposal in Beishan, China, and the impact of the number of realizations on the simulation results was analyzed. In particular, the differences between the modeling results of one realization and multiple realizations were demonstrated, and probabilistic distributions of 20 realizations at different times were obtained. The results showed that the PDM can describe the range of contaminant particle transport: after 5×10⁶ days, the high-probability contaminated areas, covering 25,400 m², were concentrated near the release point rather than in more distant regions.
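    The multiple-realization idea behind the PDM can be sketched as an ensemble simulation: rather than reporting the outcome of one stochastically generated network, the same transport simulation is repeated over many equally likely realizations and the distribution of outcomes is reported. The one-dimensional random-hop "transport model" below is only a stand-in for a real DFN particle tracker, with invented hop lengths.

```python
import random

def travel_distance(n_steps, rng):
    """Total particle displacement after n_steps random fracture hops
    (hop lengths are assumed illustrative values, in meters)."""
    return sum(rng.choice([5.0, 10.0, 20.0]) for _ in range(n_steps))

rng = random.Random(42)
n_realizations = 20  # matching the 20 realizations used in the study

distances = sorted(travel_distance(100, rng) for _ in range(n_realizations))

# A single realization yields one number; the ensemble yields a range
# from which probabilistic statements can be made.
print("single realization:", distances[0])
print("ensemble range:", distances[0], "-", distances[-1])
```

    With a real DFN tracker in place of `travel_distance`, binning the ensemble of particle positions on a grid yields the probability maps of contaminated area that the study reports.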

  19. Trapped between two tails: trading off scientific uncertainties via climate targets

    NASA Astrophysics Data System (ADS)

    Lemoine, Derek; McJeon, Haewon C.

    2013-09-01

    Climate change policies must trade off uncertainties about future warming, about the social and ecological impacts of warming, and about the cost of reducing greenhouse gas emissions. We show that laxer carbon targets produce broader distributions for climate damages, skewed towards severe outcomes. However, if potential low-carbon technologies fill overlapping niches, then more stringent carbon targets produce broader distributions for the cost of reducing emissions, skewed towards high-cost outcomes. We use the technology-rich GCAM integrated assessment model to assess the robustness of 450 and 500 ppm carbon targets to each uncertain factor. The 500 ppm target provides net benefits across a broad range of futures. The 450 ppm target provides net benefits only when impacts are greater than conventionally assumed, when multiple technological breakthroughs lower the cost of abatement, or when evaluated with a low discount rate. Policy evaluations are more sensitive to uncertainty about abatement technology and impacts than to uncertainty about warming.
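    The shape of this trade-off can be illustrated with a toy Monte Carlo (not GCAM): under the lax target the damage distribution is wide and right-skewed, while under the stringent target the abatement-cost distribution is. All distribution parameters are invented for illustration.

```python
import random

rng = random.Random(0)

def lognormal_draws(mu, sigma, n):
    """Right-skewed cost or damage samples (illustrative lognormals)."""
    return [rng.lognormvariate(mu, sigma) for _ in range(n)]

n = 10_000
# Lax target: cheap abatement, heavy-tailed damages.
lax_total = [c + d for c, d in zip(lognormal_draws(1.0, 0.2, n),
                                   lognormal_draws(3.0, 0.8, n))]
# Stringent target: heavy-tailed abatement cost, modest damages.
strict_total = [c + d for c, d in zip(lognormal_draws(2.5, 0.8, n),
                                      lognormal_draws(1.5, 0.2, n))]

mean_lax = sum(lax_total) / n
mean_strict = sum(strict_total) / n
print("mean total cost, lax target:", mean_lax)
print("mean total cost, stringent target:", mean_strict)
```

    Each target is thus "trapped" by one heavy tail, and which tail dominates the comparison depends on the assumed impacts, technology costs, and discount rate, as the abstract describes.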

  20. Trapped Between Two Tails: Trading Off Scientific Uncertainties via Climate Targets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lemoine, Derek M.; McJeon, Haewon C.

    2013-08-20

    Climate change policies must trade off uncertainties about future warming, about the social and ecological impacts of warming, and about the cost of reducing greenhouse gas emissions. We show that laxer carbon targets produce broader distributions for climate damages, skewed towards severe outcomes. However, if potential low-carbon technologies fill overlapping niches, then more stringent carbon targets produce broader distributions for the cost of reducing emissions, skewed towards high-cost outcomes. We use the technology-rich GCAM integrated assessment model to assess the robustness of 450 ppm and 500 ppm carbon targets to each uncertain factor. The 500 ppm target provides net benefits across a broad range of futures. The 450 ppm target provides net benefits only when impacts are greater than conventionally assumed, when multiple technological breakthroughs lower the cost of abatement, or when evaluated with a low discount rate. Policy evaluations are more sensitive to uncertainty about abatement technology and impacts than to uncertainty about warming.
