Sample records for current model uncertainties

  1. Combining Satellite Ocean Color and Hydrodynamic Model Uncertainties in Bio-Optical Forecasts

    DTIC Science & Technology

    2014-04-03

    observed chlorophyll distribution for that day (MODIS image for October 17, 2011), without regard to sign, i.e., |Figs. 11(c)-11(a)|. Black pixels indicate...time using the current field from the model. Uncertainties in both the satellite chlorophyll values and the currents from the circulation model impact...ensemble techniques to partition the chlorophyll uncertainties into components due to atmospheric correction and bio-optical inversion. By combining

  2. Exploring uncertainty and model predictive performance concepts via a modular snowmelt-runoff modeling framework

    Treesearch

    Tyler Jon Smith; Lucy Amanda Marshall

    2010-01-01

    Model selection is an extremely important aspect of many hydrologic modeling studies because of the complexity, variability, and uncertainty that surrounds the current understanding of watershed-scale systems. However, development and implementation of a complete precipitation-runoff modeling framework, from model selection to calibration and uncertainty analysis, are...

  3. Quantifying and Reducing Curve-Fitting Uncertainty in Isc

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-06-14

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
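
The local straight-line fit and its regression-based uncertainty described in this abstract can be illustrated with a short numerical sketch. This is plain ordinary least squares on synthetic I-V points (device values invented), not the paper's objective Bayesian method or its evidence-based window selection:

```python
import numpy as np

def isc_linear_fit(v, i):
    """Straight-line fit i = a + b*v near short circuit.

    Returns the intercept (the Isc estimate), the slope, and their
    standard uncertainties from the regression parameter covariance."""
    X = np.column_stack([np.ones_like(v), v])
    beta = np.linalg.lstsq(X, i, rcond=None)[0]
    n, p = X.shape
    resid = i - X @ beta
    s2 = resid @ resid / (n - p)              # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)         # parameter covariance
    return beta[0], beta[1], np.sqrt(np.diag(cov))

# Synthetic I-V points near V = 0 (hypothetical 9 A module, gentle slope)
rng = np.random.default_rng(0)
v = np.linspace(0.0, 0.05, 20)                        # volts
i_meas = 9.0 - 0.5 * v + rng.normal(0, 1e-3, v.size)  # amps + noise
isc, slope, (u_isc, u_slope) = isc_linear_fit(v, i_meas)
```

Widening the data window shrinks `u_isc`, but, as the abstract cautions, this regression uncertainty says nothing about model discrepancy once the curve is no longer locally straight.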

  4. Quantifying and Reducing Curve-Fitting Uncertainty in Isc: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.

  5. Experimental and modeling uncertainties in the validation of lower hybrid current drive

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poli, F. M.; Bonoli, P. T.; Chilenski, M.

    Our work discusses sources of uncertainty in the validation of lower hybrid wave current drive simulations against experiments, by evolving self-consistently the magnetic equilibrium and the heating and current drive profiles, calculated with a combined toroidal ray tracing code and 3D Fokker–Planck solver. The simulations indicate a complex interplay of elements, where uncertainties in the input plasma parameters, in the models and in the transport solver combine and compensate each other, at times. It is concluded that ray-tracing calculations should include a realistic representation of the density and temperature in the region between the confined plasma and the wall, which is especially important in regimes where the LH waves are weakly damped and undergo multiple reflections from the plasma boundary. Uncertainties introduced in the processing of diagnostic data as well as uncertainties introduced by model approximations are assessed. We show that, by comparing the evolution of the plasma parameters in self-consistent simulations with available data, inconsistencies can be identified and limitations in the models or in the experimental data assessed.

  6. Experimental and modeling uncertainties in the validation of lower hybrid current drive

    DOE PAGES

    Poli, F. M.; Bonoli, P. T.; Chilenski, M.; ...

    2016-07-28

    Our work discusses sources of uncertainty in the validation of lower hybrid wave current drive simulations against experiments, by evolving self-consistently the magnetic equilibrium and the heating and current drive profiles, calculated with a combined toroidal ray tracing code and 3D Fokker–Planck solver. The simulations indicate a complex interplay of elements, where uncertainties in the input plasma parameters, in the models and in the transport solver combine and compensate each other, at times. It is concluded that ray-tracing calculations should include a realistic representation of the density and temperature in the region between the confined plasma and the wall, which is especially important in regimes where the LH waves are weakly damped and undergo multiple reflections from the plasma boundary. Uncertainties introduced in the processing of diagnostic data as well as uncertainties introduced by model approximations are assessed. We show that, by comparing the evolution of the plasma parameters in self-consistent simulations with available data, inconsistencies can be identified and limitations in the models or in the experimental data assessed.

  7. Glass Property Models and Constraints for Estimating the Glass to be Produced at Hanford by Implementing Current Advanced Glass Formulation Efforts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vienna, John D.; Kim, Dong-Sang; Skorski, Daniel C.

    2013-07-01

    Recent glass formulation and melter testing data have suggested that significant increases in waste loading in HLW and LAW glasses are possible over current system planning estimates. The data (although limited in some cases) were evaluated to determine a set of constraints and models that could be used to estimate the maximum loading of specific waste compositions in glass. It is recommended that these models and constraints be used to estimate the likely HLW and LAW glass volumes that would result if the current glass formulation studies are successfully completed. It is recognized that some of the models are preliminary in nature and will change in the coming years. In addition, the models do not currently address the prediction uncertainties that would be needed before they could be used in plant operations. The models and constraints are only meant to give an indication of rough glass volumes and are not intended to be used in plant operation or waste form qualification activities. A current research program is in place to develop the data, models, and uncertainty descriptions for that purpose. A fundamental tenet underlying the research reported in this document is to try to be less conservative than previous studies when developing constraints for estimating the glass to be produced by implementing current advanced glass formulation efforts. The less conservative approach documented herein should allow for the estimate of glass masses that may be realized if the current efforts in advanced glass formulations are completed over the coming years and are as successful as early indications suggest they may be. Because of this approach there is an unquantifiable uncertainty in the ultimate glass volume projections due to model prediction uncertainties that has to be considered along with other system uncertainties such as waste compositions and amounts to be immobilized, split factors between LAW and HLW, etc.

  8. CFCI3 (CFC-11): UV Absorption Spectrum Temperature Dependence Measurements and the Impact on Atmospheric Lifetime and Uncertainty

    NASA Technical Reports Server (NTRS)

    Mcgillen, Max R.; Fleming, Eric L.; Jackman, Charles H.; Burkholder, James B.

    2014-01-01

    CFCl3 (CFC-11) is both an atmospheric ozone-depleting and potent greenhouse gas that is removed primarily via stratospheric UV photolysis. Uncertainty in the temperature dependence of its UV absorption spectrum is a significant contributing factor to the overall uncertainty in its global lifetime and, thus, model calculations of stratospheric ozone recovery and climate change. In this work, the CFC-11 UV absorption spectrum was measured over a range of wavelength (184.95 - 230 nm) and temperature (216 - 296 K). We report a spectrum temperature dependence that is less than currently recommended for use in atmospheric models. The impact on its atmospheric lifetime was quantified using a 2-D model and the spectrum parameterization developed in this work. The obtained global annually averaged lifetime was 58.1 ± 0.7 years (2σ uncertainty due solely to the spectrum uncertainty). The lifetime is slightly reduced and the uncertainty significantly reduced from that obtained using current spectrum recommendations.

  9. Structured Uncertainty Bound Determination From Data for Control and Performance Validation

    NASA Technical Reports Server (NTRS)

    Lim, Kyong B.

    2003-01-01

    This report attempts to document the broad scope of issues that must be satisfactorily resolved before one can expect to methodically obtain, with a reasonable confidence, a near-optimal robust closed loop performance in physical applications. These include elements of signal processing, noise identification, system identification, model validation, and uncertainty modeling. Based on a recently developed methodology involving a parameterization of all model validating uncertainty sets for a given linear fractional transformation (LFT) structure and noise allowance, a new software package, the Uncertainty Bound Identification (UBID) toolbox, which conveniently executes model validation tests and determines uncertainty bounds from data, has been designed and is currently available. This toolbox also serves to benchmark the current state-of-the-art in uncertainty bound determination and in turn facilitates benchmarking of robust control technology. To help clarify the methodology and use of the new software, two tutorial examples are provided. The first involves the uncertainty characterization of flexible structure dynamics, and the second example involves a closed loop performance validation of a ducted fan based on an uncertainty bound from data. These examples, along with other simulation and experimental results, also help describe the many factors and assumptions that determine the degree of success in applying robust control theory to practical problems.

  10. Optimal regeneration planning for old-growth forest: addressing scientific uncertainty in endangered species recovery through adaptive management

    USGS Publications Warehouse

    Moore, C.T.; Conroy, M.J.

    2006-01-01

    Stochastic and structural uncertainties about forest dynamics present challenges in the management of ephemeral habitat conditions for endangered forest species. Maintaining critical foraging and breeding habitat for the endangered red-cockaded woodpecker (Picoides borealis) requires an uninterrupted supply of old-growth forest. We constructed and optimized a dynamic forest growth model for the Piedmont National Wildlife Refuge (Georgia, USA) with the objective of perpetuating a maximum stream of old-growth forest habitat. Our model accommodates stochastic disturbances and hardwood succession rates, and uncertainty about model structure. We produced a regeneration policy that was indexed by current forest state and by current weight of evidence among alternative model forms. We used adaptive stochastic dynamic programming, which anticipates that model probabilities, as well as forest states, may change through time, with consequent evolution of the optimal decision for any given forest state. In light of considerable uncertainty about forest dynamics, we analyzed a set of competing models incorporating extreme, but plausible, parameter values. Under any of these models, forest silviculture practices currently recommended for the creation of woodpecker habitat are suboptimal. We endorse fully adaptive approaches to the management of endangered species habitats in which predictive modeling, monitoring, and assessment are tightly linked.
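
The optimization machinery named in this abstract, stochastic dynamic programming over forest states, can be sketched with a deliberately tiny Markov decision process. The states, transition probabilities, and rewards below are invented for illustration; the refuge model tracks far richer state variables and, being adaptive, also carries evolving probability weights across competing model forms:

```python
import numpy as np

# Toy forest MDP. States: 0 = regenerating, 1 = mid-age, 2 = old growth.
# Actions: 0 = let the stand grow, 1 = regenerate (reset to state 0).
# A stochastic disturbance destroys old growth with probability 0.1.
P = [np.array([[0.2, 0.8, 0.0],          # action 0: grow
               [0.0, 0.3, 0.7],
               [0.1, 0.0, 0.9]]),
     np.array([[1.0, 0.0, 0.0],          # action 1: regenerate
               [1.0, 0.0, 0.0],
               [1.0, 0.0, 0.0]])]
reward = np.array([0.0, 0.0, 1.0])       # credit only for old-growth habitat

def value_iteration(P, reward, gamma=0.95, tol=1e-8):
    """Solve the MDP for the discounted stream of old-growth reward."""
    V = np.zeros(len(reward))
    while True:
        Q = np.array([reward + gamma * P[a] @ V for a in range(len(P))])
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=0)
        V = V_new

V, policy = value_iteration(P, reward)   # policy[s] = best action in state s
```

The adaptive element in the study goes further: model weights are updated as monitoring data arrive, so the optimal regeneration decision for a given forest state can change through time.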

  11. Linked Sensitivity Analysis, Calibration, and Uncertainty Analysis Using a System Dynamics Model for Stroke Comparative Effectiveness Research.

    PubMed

    Tian, Yuan; Hassmiller Lich, Kristen; Osgood, Nathaniel D; Eom, Kirsten; Matchar, David B

    2016-11-01

    As health services researchers and decision makers tackle more difficult problems using simulation models, the number of parameters and the corresponding degree of uncertainty have increased. This often results in reduced confidence in such complex models to guide decision making. To demonstrate a systematic approach of linked sensitivity analysis, calibration, and uncertainty analysis to improve confidence in complex models. Four techniques were integrated and applied to a System Dynamics stroke model of US veterans, which was developed to inform systemwide intervention and research planning: Morris method (sensitivity analysis), multistart Powell hill-climbing algorithm and generalized likelihood uncertainty estimation (calibration), and Monte Carlo simulation (uncertainty analysis). Of 60 uncertain parameters, sensitivity analysis identified 29 needing calibration, 7 that did not need calibration but significantly influenced key stroke outcomes, and 24 not influential to calibration or stroke outcomes that were fixed at their best guess values. One thousand alternative well-calibrated baselines were obtained to reflect calibration uncertainty and brought into uncertainty analysis. The initial stroke incidence rate among veterans was identified as the most influential uncertain parameter, for which further data should be collected. That said, accounting for current uncertainty, the analysis of 15 distinct prevention and treatment interventions provided a robust conclusion that hypertension control for all veterans would yield the largest gain in quality-adjusted life years. For complex health care models, a mixed approach was applied to examine the uncertainty surrounding key stroke outcomes and the robustness of conclusions. We demonstrate that this rigorous approach can be practical and advocate for such analysis to promote understanding of the limits of certainty in applying models to current decisions and to guide future data collection. 
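
The Morris screening step this abstract applies can be sketched with a crude one-at-a-time variant on a toy stand-in for the stroke model (parameter names and coefficients are hypothetical; the study used the full trajectory-based Morris design over 60 parameters):

```python
import numpy as np

def morris_mu_star(model, lo, hi, r=50, delta=0.1, seed=1):
    """Mean absolute elementary effect per parameter: a simple, random
    one-at-a-time variant of Morris screening."""
    rng = np.random.default_rng(seed)
    k = len(lo)
    ee = np.zeros((r, k))
    for t in range(r):
        x = rng.uniform(lo, hi)
        y0 = model(x)
        for j in range(k):
            xp = x.copy()
            xp[j] = x[j] + delta * (hi[j] - lo[j])   # perturb one input
            ee[t, j] = abs(model(xp) - y0) / delta
    return ee.mean(axis=0)

# Hypothetical stand-in for the stroke model: output driven strongly by
# incidence, weakly by hypertension control, negligibly by the third input.
def stroke_outcome(p):
    incidence, htn_control, other = p
    return 100.0 * incidence - 2.0 * htn_control + 0.01 * other

mu_star = morris_mu_star(stroke_outcome, np.zeros(3), np.ones(3))
ranking = np.argsort(mu_star)[::-1]      # most to least influential
```

In the study, a ranking like this is what separated the 29 parameters worth calibrating from the 24 that could be fixed at best-guess values.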

  12. Host Model Uncertainty in Aerosol Radiative Effects: the AeroCom Prescribed Experiment and Beyond

    NASA Astrophysics Data System (ADS)

    Stier, Philip; Schutgens, Nick; Bian, Huisheng; Boucher, Olivier; Chin, Mian; Ghan, Steven; Huneeus, Nicolas; Kinne, Stefan; Lin, Guangxing; Myhre, Gunnar; Penner, Joyce; Randles, Cynthia; Samset, Bjorn; Schulz, Michael; Yu, Hongbin; Zhou, Cheng; Bellouin, Nicolas; Ma, Xiaoyan; Yu, Fangqun; Takemura, Toshihiko

    2013-04-01

    Anthropogenic and natural aerosol radiative effects are recognized to affect global and regional climate. Multi-model "diversity" in estimates of the aerosol radiative effect is often perceived as a measure of the uncertainty in modelling aerosol itself. However, current aerosol models vary considerably in model components relevant for the calculation of aerosol radiative forcings and feedbacks and the associated "host-model uncertainties" are generally convoluted with the actual uncertainty in aerosol modelling. In the AeroCom Prescribed intercomparison study we systematically isolate and quantify host model uncertainties on aerosol forcing experiments through prescription of identical aerosol radiative properties in eleven participating models. Host model errors in aerosol radiative forcing are largest in regions of uncertain host model components, such as stratocumulus cloud decks or areas with poorly constrained surface albedos, such as sea ice. Our results demonstrate that host model uncertainties are an important component of aerosol forcing uncertainty that require further attention. However, uncertainties in aerosol radiative effects also include short-term and long-term feedback processes that will be systematically explored in future intercomparison studies. Here we will present an overview of the proposals for discussion and results from early scoping studies.

  13. Uncertainty quantification of Antarctic contribution to sea-level rise using the fast Elementary Thermomechanical Ice Sheet (f.ETISh) model

    NASA Astrophysics Data System (ADS)

    Bulthuis, Kevin; Arnst, Maarten; Pattyn, Frank; Favier, Lionel

    2017-04-01

    Uncertainties in sea-level rise projections are mostly due to uncertainties in Antarctic ice-sheet predictions (IPCC AR5 report, 2013), because key parameters related to the current state of the Antarctic ice sheet (e.g. sub-ice-shelf melting) and future climate forcing are poorly constrained. Here, we propose to improve the predictions of Antarctic ice-sheet behaviour using new uncertainty quantification methods. As opposed to ensemble modelling (Bindschadler et al., 2013) which provides a rather limited view on input and output dispersion, new stochastic methods (Le Maître and Knio, 2010) can provide deeper insight into the impact of uncertainties on complex system behaviour. Such stochastic methods usually begin with deducing a probabilistic description of input parameter uncertainties from the available data. Then, the impact of these input parameter uncertainties on output quantities is assessed by estimating the probability distribution of the outputs by means of uncertainty propagation methods such as Monte Carlo methods or stochastic expansion methods. The use of such uncertainty propagation methods in glaciology may be computationally costly because of the high computational complexity of ice-sheet models. This challenge emphasises the importance of developing reliable and computationally efficient ice-sheet models such as the f.ETISh ice-sheet model (Pattyn, 2015), a new fast thermomechanical coupled ice sheet/ice shelf model capable of handling complex and critical processes such as the marine ice-sheet instability mechanism. Here, we apply these methods to investigate the role of uncertainties in sub-ice-shelf melting, calving rates and climate projections in assessing Antarctic contribution to sea-level rise for the next centuries using the f.ETISh model. 
We detail the methods and show results that provide nominal values and uncertainty bounds for future sea-level rise as a reflection of the impact of the input parameter uncertainties under consideration, as well as a ranking of the input parameter uncertainties in the order of the significance of their contribution to uncertainty in future sea-level rise. In addition, we discuss how limitations posed by the available information (poorly constrained data) pose challenges that motivate our current research.
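
The propagation step described above, pushing input-parameter uncertainty through the model to a distribution of sea-level rise, can be sketched with brute-force Monte Carlo on a toy response surface. The priors and the surrogate are invented stand-ins, not f.ETISh quantities:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical input uncertainties (illustrative priors, not f.ETISh values)
melt = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # sub-shelf melt factor
calving = rng.normal(1.0, 0.2, size=n)              # calving-rate factor

# Toy response surface standing in for the (expensive) ice-sheet model
slr = 0.3 * melt + 0.1 * calving + 0.05 * melt * calving  # metres

lo, med, hi = np.percentile(slr, [5, 50, 95])       # uncertainty bounds

# Crude ranking of input influence via correlation with the output
rho_melt = np.corrcoef(melt, slr)[0, 1]
rho_calving = np.corrcoef(calving, slr)[0, 1]
```

The propagation is the computationally expensive part when each sample is a full ice-sheet run, which is why the abstract stresses fast models like f.ETISh and stochastic expansion methods as alternatives to plain Monte Carlo.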

  14. DIFMOD2: A NEXT GENERATION DIFFUSE LAYER MODEL

    EPA Science Inventory

    Jenne (1998) suggested that the majority of uncertainty in our current ability to model the environmental partitioning behavior of ionic species on natural surfaces resulted from uncertainties in our understanding of surface acidity behavior. Traditional 2-pK Grahame-Gouy-Chapma...

  15. Irreducible Uncertainty in Terrestrial Carbon Projections

    NASA Astrophysics Data System (ADS)

    Lovenduski, N. S.; Bonan, G. B.

    2016-12-01

    We quantify and isolate the sources of uncertainty in projections of carbon accumulation by the ocean and terrestrial biosphere over 2006-2100 using output from Earth System Models participating in the 5th Coupled Model Intercomparison Project. We consider three independent sources of uncertainty in our analysis of variance: (1) internal variability, driven by random, internal variations in the climate system, (2) emission scenario, driven by uncertainty in future radiative forcing, and (3) model structure, wherein different models produce different projections given the same emission scenario. Whereas uncertainty in projections of ocean carbon accumulation by 2100 is 100 Pg C and driven primarily by emission scenario, uncertainty in projections of terrestrial carbon accumulation by 2100 is 50% larger than that of the ocean, and driven primarily by model structure. This structural uncertainty is correlated with emission scenario: the variance associated with model structure is an order of magnitude larger under a business-as-usual scenario (RCP8.5) than a mitigation scenario (RCP2.6). In an effort to reduce this structural uncertainty, we apply various model weighting schemes to our analysis of variance in terrestrial carbon accumulation projections. The largest reductions in uncertainty are achieved when giving all the weight to a single model; here the uncertainty is of a similar magnitude to the ocean projections. Such an analysis suggests that this structural uncertainty is irreducible given current terrestrial model development efforts.
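
The three-way partition of projection uncertainty used in this analysis can be sketched on synthetic projections in which the structural (between-model) spread is deliberately the dominant term; all numbers are invented, and interactions are ignored:

```python
import numpy as np

rng = np.random.default_rng(7)
n_models, n_scen, n_runs = 5, 2, 10

# Synthetic terrestrial carbon projections (Pg C), built so that structural
# spread dominates, mimicking the CMIP5 result described above.
model_offset = np.array([-60., -30., 0., 30., 60.])[:, None, None]
scen_offset = np.array([0., 20.])[None, :, None]      # scenario effect
internal = rng.normal(0, 5, (n_models, n_scen, n_runs))
proj = 100.0 + model_offset + scen_offset + internal

# Simple variance partition (three-way, no interaction terms)
var_internal = proj.var(axis=2).mean()                # internal variability
var_model = proj.mean(axis=2).mean(axis=1).var()      # between models
var_scen = proj.mean(axis=2).mean(axis=0).var()       # between scenarios
```

Model weighting schemes of the kind tested in the abstract amount to replacing the unweighted between-model variance with a weighted one.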

  16. Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate

    NASA Technical Reports Server (NTRS)

    Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.

    2013-01-01

    There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With pressed budgets, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis only from computational data. The method uses current CFD uncertainty techniques coupled with the Student-T distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied from a specified tolerance or bias error and the differences in the results are used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations and conclusions drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data is unavailable.
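
The core computation, an expanded uncertainty from a small set of perturbed-input CFD runs using a Student-t coverage factor, can be sketched as follows (the heat-transfer values are invented, not the paper's results):

```python
import numpy as np

# Hypothetical heat-transfer coefficients (W/m^2/K) from CFD runs in which
# each input was perturbed within its stated tolerance; values are invented.
h = np.array([48.2, 49.1, 47.8, 50.3, 48.9, 49.6, 48.4])

n = h.size
h_mean = h.mean()
s = h.std(ddof=1)                 # sample standard deviation
t_95 = 2.447                      # Student-t factor, 95 %, n - 1 = 6 dof
u95 = t_95 * s / np.sqrt(n)       # expanded uncertainty of the mean
```

The t-factor rather than 1.96 is the key point with so few runs: it inflates the interval to reflect how poorly seven samples pin down the true scatter.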

  17. Niches, models, and climate change: Assessing the assumptions and uncertainties

    PubMed Central

    Wiens, John A.; Stralberg, Diana; Jongsomjit, Dennis; Howell, Christine A.; Snyder, Mark A.

    2009-01-01

    As the rate and magnitude of climate change accelerate, understanding the consequences becomes increasingly important. Species distribution models (SDMs) based on current ecological niche constraints are used to project future species distributions. These models contain assumptions that add to the uncertainty in model projections stemming from the structure of the models, the algorithms used to translate niche associations into distributional probabilities, the quality and quantity of data, and mismatches between the scales of modeling and data. We illustrate the application of SDMs using two climate models and two distributional algorithms, together with information on distributional shifts in vegetation types, to project fine-scale future distributions of 60 California landbird species. Most species are projected to decrease in distribution by 2070. Changes in total species richness vary over the state, with large losses of species in some “hotspots” of vulnerability. Differences in distributional shifts among species will change species co-occurrences, creating spatial variation in similarities between current and future assemblages. We use these analyses to consider how assumptions can be addressed and uncertainties reduced. SDMs can provide a useful way to incorporate future conditions into conservation and management practices and decisions, but the uncertainties of model projections must be balanced with the risks of taking the wrong actions or the costs of inaction. Doing this will require that the sources and magnitudes of uncertainty are documented, and that conservationists and resource managers be willing to act despite the uncertainties. The alternative, of ignoring the future, is not an option. PMID:19822750

  18. Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.

    2001-11-09

    Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.

  19. Uncertainty quantification of fast sodium current steady-state inactivation for multi-scale models of cardiac electrophysiology.

    PubMed

    Pathmanathan, Pras; Shotwell, Matthew S; Gavaghan, David J; Cordeiro, Jonathan M; Gray, Richard A

    2015-01-01

    Perhaps the most mature area of multi-scale systems biology is the modelling of the heart. Current models are grounded in over fifty years of research in the development of biophysically detailed models of the electrophysiology (EP) of cardiac cells, but one aspect which is inadequately addressed is the incorporation of uncertainty and physiological variability. Uncertainty quantification (UQ) is the identification and characterisation of the uncertainty in model parameters derived from experimental data, and the computation of the resultant uncertainty in model outputs. It is a necessary tool for establishing the credibility of computational models, and will likely be expected of EP models for future safety-critical clinical applications. The focus of this paper is formal UQ of one major sub-component of cardiac EP models, the steady-state inactivation of the fast sodium current, INa. To better capture average behaviour and quantify variability across cells, we have applied for the first time an 'individual-based' statistical methodology to assess voltage clamp data. Advantages of this approach over a more traditional 'population-averaged' approach are highlighted. The method was used to characterise variability amongst cells isolated from canine epi and endocardium, and this variability was then 'propagated forward' through a canine model to determine the resultant uncertainty in model predictions at different scales, such as of upstroke velocity and spiral wave dynamics. Statistically significant differences between epi and endocardial cells (greater half-inactivation and less steep slope of steady state inactivation curve for endo) were observed, and the forward propagation revealed a lack of robustness of the model to underlying variability, but also surprising robustness to variability at the tissue scale.
    Overall, the methodology can be used to: (i) better analyse voltage clamp data; (ii) characterise underlying population variability; (iii) investigate consequences of variability; and (iv) improve the ability to validate a model. To our knowledge this article is the first to quantify population variability in membrane dynamics in this manner, and the first to perform formal UQ for a component of a cardiac model. The approach is likely to find much wider applicability across systems biology as current application domains reach greater levels of maturity.
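
The forward-propagation step can be sketched by sampling Boltzmann-curve parameters across a synthetic cell population; the means and standard deviations below are illustrative placeholders, not the canine estimates from the paper:

```python
import numpy as np

def h_inf(v, v_half, k):
    """Boltzmann steady-state inactivation curve for the fast sodium current."""
    return 1.0 / (1.0 + np.exp((v - v_half) / k))

rng = np.random.default_rng(3)
n_cells = 1000

# Hypothetical cell-to-cell variability (values invented for illustration)
v_half = rng.normal(-80.0, 4.0, n_cells)    # half-inactivation voltage (mV)
k = rng.normal(6.0, 0.8, n_cells)           # slope factor (mV)

# Propagate variability forward: channel availability at a holding potential
avail = h_inf(-85.0, v_half, k)
lo, hi = np.percentile(avail, [2.5, 97.5])  # 95 % variability band
```

In the paper the propagation continues through a full cell and tissue model, so a band like this in availability becomes a band in upstroke velocity and spiral-wave behaviour.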

  20. Data assimilation and bathymetric inversion in a two-dimensional horizontal surf zone model

    NASA Astrophysics Data System (ADS)

    Wilson, G. W.; Özkan-Haller, H. T.; Holman, R. A.

    2010-12-01

    A methodology is described for assimilating observations in a steady state two-dimensional horizontal (2-DH) model of nearshore hydrodynamics (waves and currents), using an ensemble-based statistical estimator. In this application, we treat bathymetry as a model parameter, which is subject to a specified prior uncertainty. The statistical estimator uses state augmentation to produce posterior (inverse, updated) estimates of bathymetry, wave height, and currents, as well as their posterior uncertainties. A case study is presented, using data from a 2-D array of in situ sensors on a natural beach (Duck, NC). The prior bathymetry is obtained by interpolation from recent bathymetric surveys; however, the resulting prior circulation is not in agreement with measurements. After assimilating data (significant wave height and alongshore current), the accuracy of modeled fields is improved, and this is quantified by comparing with observations (both assimilated and unassimilated). Hence, for the present data, 2-DH bathymetric uncertainty is an important source of error in the model and can be quantified and corrected using data assimilation. Here the bathymetric uncertainty is ascribed to inadequate temporal sampling; bathymetric surveys were conducted on a daily basis, but bathymetric change occurred on hourly timescales during storms, such that hydrodynamic model skill was significantly degraded. Further tests are performed to analyze the model sensitivities used in the assimilation and to determine the influence of different observation types and sampling schemes.
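
The state-augmentation idea, treating bathymetry as part of the estimated state and updating it from hydrodynamic observations, can be sketched with a one-parameter ensemble update. The forward model and all numbers are toy stand-ins for the wave/current model and the Duck observations:

```python
import numpy as np

rng = np.random.default_rng(11)
n_ens = 500

# Prior ensemble for an uncertain "bathymetry" parameter b (invented prior)
b_prior = rng.normal(2.0, 0.3, n_ens)       # depth scale (m)

def forward(b):
    """Toy stand-in for the wave/current model: current speed vs. depth."""
    return 1.5 / b                          # alongshore current (m/s)

obs, obs_err = 0.70, 0.05                   # observed current and its error

# State augmentation: carry [b, u] in the ensemble, update b from the u-obs
u_prior = forward(b_prior)
C = np.cov(b_prior, u_prior)                # 2x2 sample covariance
gain = C[0, 1] / (C[1, 1] + obs_err**2)     # Kalman gain for b
perturbed_obs = obs + rng.normal(0, obs_err, n_ens)
b_post = b_prior + gain * (perturbed_obs - u_prior)
```

The posterior ensemble both shifts toward the depth implied by the observed current and tightens, which is the bathymetric-inversion behaviour the abstract describes, with the ensemble spreads playing the role of the prior and posterior uncertainties.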

  1. Mind the Gap: Exploring the Underground of the NASA Space Cancer Risk Model

    NASA Technical Reports Server (NTRS)

    Chappell, L. J.; Elgart, S. R.; Milder, C. M.; Shavers, M. R.; Semones, E. J.; Huff, J. L.

    2017-01-01

    The risk of exposure-induced death (REID) quantifies the lifetime risk of death from radiation-induced cancer in an exposed astronaut. The NASA Space Cancer Risk (NSCR) 2012 model incorporates elements from physics, biology, epidemiology, and statistics to generate the REID distribution. The current model quantifies the space radiation environment, radiation quality, and dose-rate effects to estimate a NASA-weighted dose. This weighted dose is mapped to the excess risk of radiation-induced cancer mortality from acute exposures to gamma rays and then transferred to an astronaut population. Finally, the REID is determined by integrating this risk over the individual's lifetime. The calculated upper 95% confidence limit of the REID is used to restrict an astronaut's permissible mission duration (PMD) for a proposed mission. As a statistical quantity characterized by broad, subjective uncertainties, REID estimates for space missions result in wide distributions. Currently, the upper 95% confidence level is over 350% larger than the mean REID value, which can severely limit an astronaut's PMD. The model incorporates inputs from multiple scientific disciplines in the risk estimation process. Physics and particle transport models calculate how radiation moves through space, penetrates spacecraft, and makes its way to the human beings onboard. Epidemiological studies of exposures from atomic bombings, medical treatments, and power plants are used to quantify health risks from acute and chronic low linear energy transfer (LET) ionizing radiation. Biological studies in cellular and animal models using radiation at various LETs and energies inform quality metrics for ions present in space radiation. Statistical methodologies unite these elements, controlling for mathematical and scientific uncertainty and variability. Despite current progress, these research platforms contain knowledge gaps contributing to the large uncertainties still present in the model.
The NASA Space Radiation Program Element (SRPE) defines the knowledge gaps that impact our understanding of the cancer risks. These gaps are outlined in NASA's Human Research Roadmap [4], which identifies the research questions and actions recommended for reducing the uncertainty in the current NSCR model and for formulation of future models. The greatest contributors to uncertainty in the current model include radiation quality, dose rate effects, and the transfer of exposure-based risk from other populations to an astronaut population. Future formulations of the risk model may benefit from including other potential sources of uncertainty such as space dosimetry, errors in human epidemiology data, and the impact of microgravity and other spaceflight stressors. Here, we discuss the current capabilities of the NSCR-2012 model and several immediate research needs, highlighting areas expected to have an operational impact on the current model schema. The following subway-style route map outlines the NSCR-2012 model (Green Line), emphasizing the research gaps in the Human Research Roadmap for risk of radiation-induced carcinogenesis (Stops on Dashed Lines). The map diagrams how these research gaps feed specific portions of the model.
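As a rough illustration of how a heavy-tailed risk distribution drives the gap between the mean REID and its upper 95% confidence limit, the sketch below draws Monte Carlo samples from a lognormal stand-in; the distribution form and its parameters are hypothetical choices for illustration only, not the NSCR model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-in for the REID distribution produced by the full NSCR
# uncertainty propagation; lognormal form and parameters are illustrative only.
reid_samples = rng.lognormal(mean=np.log(0.02), sigma=1.3, size=100_000)

mean_reid = reid_samples.mean()
upper_95 = np.quantile(reid_samples, 0.95)  # upper 95% confidence limit

print(f"mean REID         : {mean_reid:.4f}")
print(f"upper 95% CL      : {upper_95:.4f}")
print(f"ratio 95% CL/mean : {upper_95 / mean_reid:.1f}x")
```

Because the tail of the distribution, not its mean, sets the permissible mission duration, narrowing the input uncertainties shrinks this ratio directly.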

  2. An Integrated Uncertainty Analysis and Ensemble-based Data Assimilation Framework for Operational Snow Predictions

    NASA Astrophysics Data System (ADS)

    He, M.; Hogue, T. S.; Franz, K.; Margulis, S. A.; Vrugt, J. A.

    2009-12-01

    The National Weather Service (NWS), the agency responsible for short- and long-term streamflow predictions across the nation, primarily applies the SNOW17 model for operational forecasting of snow accumulation and melt. The SNOW17-forecasted snowmelt serves as an input to a rainfall-runoff model for streamflow forecasts in snow-dominated areas. The accuracy of streamflow predictions in these areas largely relies on the accuracy of the simulated snowmelt. However, no direct snowmelt measurements are available to validate the SNOW17 predictions. Instead, indirect measurements such as snow water equivalent (SWE) measurements or discharge are typically used to calibrate SNOW17 parameters. In addition, the forecast practice is inherently deterministic, lacking tools to systematically address forecasting uncertainties (e.g., uncertainties in parameters, forcing, SWE and discharge observations, etc.). The current research presents an Integrated Uncertainty analysis and Ensemble-based data Assimilation (IUEA) framework to improve predictions of snowmelt and discharge while simultaneously providing meaningful estimates of the associated uncertainty. The IUEA approach uses the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) to simultaneously estimate uncertainties in model parameters, forcing, and observations. The robustness and usefulness of the IUEA-SNOW17 framework are evaluated for snow-dominated watersheds in the northern Sierra Mountains, using the coupled IUEA-SNOW17 and an operational soil moisture accounting model (SAC-SMA). Preliminary results are promising and indicate successful performance of the coupled IUEA-SNOW17 framework. Implementation of the SNOW17 with the IUEA is straightforward and requires no major modification to the SNOW17 model structure. The IUEA-SNOW17 framework is intended to be modular and transferable and should assist the NWS in advancing the current forecasting system and reinforcing current operational forecasting skill.

  3. Verifying and Validating Simulation Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemez, Francois M.

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of-knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate and analyze variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  4. Statistical error model for a solar electric propulsion thrust subsystem

    NASA Technical Reports Server (NTRS)

    Bantell, M. H.

    1973-01-01

    The solar electric propulsion thrust subsystem statistical error model was developed as a tool for investigating the effects of thrust subsystem parameter uncertainties on navigation accuracy. The model is currently being used to evaluate the impact of electric engine parameter uncertainties on navigation system performance for a baseline mission to Encke's Comet in the 1980s. The data given represent the next generation in statistical error modeling for low-thrust applications. Principal improvements include the representation of thrust uncertainties and random process modeling in terms of random parametric variations in the thrust vector process for a multi-engine configuration.

  5. Uncertainties in modeling low-energy neutrino-induced reactions on iron-group nuclei

    NASA Astrophysics Data System (ADS)

    Paar, N.; Suzuki, T.; Honma, M.; Marketin, T.; Vretenar, D.

    2011-10-01

    Charged-current neutrino-nucleus cross sections for 54,56Fe and 58,60Ni are calculated and compared using frameworks based on relativistic and Skyrme energy-density functionals and on the shell model. The current theoretical uncertainties in modeling neutrino-nucleus cross sections are assessed in relation to the predicted Gamow-Teller transition strength and available data, to multipole decomposition of the cross sections, and to cross sections averaged over the Michel flux and Fermi-Dirac distribution. By employing different microscopic approaches and models, the decay-at-rest (DAR) neutrino-56Fe cross section and its theoretical uncertainty are estimated to be ⟨σ⟩th = (258 ± 57) × 10⁻⁴² cm², in very good agreement with the experimental value ⟨σ⟩exp = (256 ± 108 ± 43) × 10⁻⁴² cm².
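The experimental value above quotes separate statistical and systematic uncertainties. Combining them in quadrature (a common convention, though the abstract does not state the procedure used) shows how comfortable the theory-experiment agreement is. Units of 10⁻⁴² cm² are factored out:

```python
import math

# DAR neutrino-56Fe cross sections, in units of 1e-42 cm^2 (from the abstract).
theory, sig_theory = 258.0, 57.0
experiment, sig_stat, sig_syst = 256.0, 108.0, 43.0

# Combine experimental statistical and systematic uncertainties in quadrature.
sig_exp = math.hypot(sig_stat, sig_syst)

# Simple consistency check: difference in units of the combined uncertainty.
z = abs(theory - experiment) / math.hypot(sig_theory, sig_exp)
print(f"combined experimental uncertainty: {sig_exp:.1f}")
print(f"theory-experiment difference     : {z:.3f} sigma")
```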

  6. Assessing the impact of radiative parameter uncertainty on plant growth simulation

    NASA Astrophysics Data System (ADS)

    Viskari, T.; Serbin, S.; Dietze, M.; Shiklomanov, A. N.

    2015-12-01

    Current Earth system models do not adequately project either the magnitude or the sign of carbon fluxes and storage associated with the terrestrial carbon cycle resulting in significant uncertainties in their potential feedbacks on the future climate system. A primary reason for the current uncertainty in these models is the lack of observational constraints of key biomes at relevant spatial and temporal scales. There is an increasingly large and highly resolved amount of remotely sensed observations that can provide the critical model inputs. However, effectively incorporating these data requires the use of radiative transfer models and their associated assumptions. How these parameter assumptions and uncertainties affect model projections for, e.g., leaf physiology, soil temperature or growth has not been examined in depth. In this presentation we discuss the use of high spectral resolution observations at the near surface to landscape scales to inform ecosystem process modeling efforts, particularly the uncertainties related to properties describing the radiation regime within vegetation canopies and the impact on C cycle projections. We illustrate that leaf and wood radiative properties and their associated uncertainties have an important impact on projected forest carbon uptake and storage. We further show the need for a strong data constraint on these properties and discuss sources of this remotely sensed information and methods for data assimilation into models. We present our approach as an efficient means for understanding and correcting implicit assumptions and model structural deficiencies in radiation transfer in vegetation canopies. Ultimately, a better understanding of the radiation balance of ecosystems will improve regional and global scale C and energy balance projections.

  7. Incorporating uncertainty into mercury-offset decisions with a probabilistic network for National Pollutant Discharge Elimination System permit holders: an interim report

    USGS Publications Warehouse

    Wood, Alexander

    2004-01-01

    This interim report describes an alternative approach for evaluating the efficacy of using mercury (Hg) offsets to improve water quality. Hg-offset programs may allow dischargers facing higher pollution-control costs to meet their regulatory obligations by making more cost-effective pollutant-reduction decisions. Efficient Hg management requires methods to translate the relevant science and economics into a regulatory decision framework. This report documents the work in progress by the U.S. Geological Survey's Western Geographic Science Center in collaboration with Stanford University toward developing this decision framework to help managers, regulators, and other stakeholders decide whether offsets can cost-effectively meet the Hg total maximum daily load (TMDL) requirements in the Sacramento River watershed. Two key approaches being considered are: (1) a probabilistic approach that explicitly incorporates scientific uncertainty, cost information, and value judgments; and (2) a quantitative approach that captures uncertainty in testing the feasibility of Hg offsets. Current fate and transport-process models commonly attempt to predict chemical transformations and transport pathways deterministically. However, the physical, chemical, and biologic processes controlling the fate and transport of Hg in aquatic environments are complex and poorly understood. Deterministic models of Hg environmental behavior contain large uncertainties, reflecting this lack of understanding. The uncertainty in these underlying physical processes may produce similarly large uncertainties in the decisionmaking process. However, decisions about control strategies are still being made despite the large uncertainties in current Hg loadings, the relations between total Hg (HgT) loading and methylmercury (MeHg) formation, and the relations between control efforts and Hg content in fish.
The research presented here focuses on an alternative analytical approach to the current use of safety factors and deterministic methods for Hg TMDL decision support, one that is fully compatible with an adaptive management approach. This alternative approach uses empirical data and informed judgment to provide a scientific and technical basis for helping National Pollutant Discharge Elimination System (NPDES) permit holders make management decisions. An Hg-offset system would be an option if a wastewater-treatment plant could not achieve NPDES permit requirements for HgT reduction. We develop a probabilistic decision-analytical model consisting of three submodels for HgT loading, MeHg, and cost mitigation within a Bayesian network that integrates information of varying rigor and detail into a simple model of a complex system. Hg processes are identified and quantified by using a combination of historical data, statistical models, and expert judgment. Such an integrated approach to uncertainty analysis allows easy updating of prediction and inference when observations of model variables are made. We demonstrate our approach with data from the Cache Creek watershed (a subbasin of the Sacramento River watershed). The empirical models used to generate the needed probability distributions are based on the same empirical models currently being used by the Central Valley Regional Water Quality Control Board's Cache Creek Hg TMDL working group. The significant difference is that input uncertainty and error are explicitly included in the model and propagated throughout its algorithms. This work demonstrates how to integrate uncertainty into the complex and highly uncertain Hg TMDL decisionmaking process. The various sources of uncertainty are propagated as decision risk, which allows decisionmakers to simultaneously consider uncertainties in remediation/implementation costs while attempting to meet environmental/ecologic targets. This research is ongoing.
As more data are collected, the HgT and cost-mitigation submodels are updated and the uncer

  8. Predicting future uncertainty constraints on global warming projections

    DOE PAGES

    Shiogama, H.; Stone, D.; Emori, S.; ...

    2016-01-11

    Projections of global mean temperature changes (ΔT) in the future are associated with intrinsic uncertainties. Much climate policy discourse has been guided by "current knowledge" of the ΔTs uncertainty, ignoring the likely future reductions of the uncertainty, because a mechanism for predicting these reductions is lacking. By using simulations of Global Climate Models from the Coupled Model Intercomparison Project Phase 5 ensemble as pseudo past and future observations, we estimate how fast and in what way the uncertainties of ΔT can decline when the current observation network of surface air temperature is maintained. At least in the world of pseudo observations under the Representative Concentration Pathways (RCPs), we can drastically reduce more than 50% of the ΔTs uncertainty in the 2040s by 2029, and more than 60% of the ΔTs uncertainty in the 2090s by 2049. Under the highest forcing scenario of RCPs, we can predict the true timing of passing the 2°C (3°C) warming threshold 20 (30) years in advance with errors less than 10 years. These results demonstrate potential for sequential decision-making strategies to take advantage of future progress in understanding of anthropogenic climate change.

  9. Predicting future uncertainty constraints on global warming projections

    PubMed Central

    Shiogama, H.; Stone, D.; Emori, S.; Takahashi, K.; Mori, S.; Maeda, A.; Ishizaki, Y.; Allen, M. R.

    2016-01-01

    Projections of global mean temperature changes (ΔT) in the future are associated with intrinsic uncertainties. Much climate policy discourse has been guided by “current knowledge” of the ΔTs uncertainty, ignoring the likely future reductions of the uncertainty, because a mechanism for predicting these reductions is lacking. By using simulations of Global Climate Models from the Coupled Model Intercomparison Project Phase 5 ensemble as pseudo past and future observations, we estimate how fast and in what way the uncertainties of ΔT can decline when the current observation network of surface air temperature is maintained. At least in the world of pseudo observations under the Representative Concentration Pathways (RCPs), we can drastically reduce more than 50% of the ΔTs uncertainty in the 2040s by 2029, and more than 60% of the ΔTs uncertainty in the 2090s by 2049. Under the highest forcing scenario of RCPs, we can predict the true timing of passing the 2 °C (3 °C) warming threshold 20 (30) years in advance with errors less than 10 years. These results demonstrate potential for sequential decision-making strategies to take advantage of future progress in understanding of anthropogenic climate change. PMID:26750491

  10. Predicting future uncertainty constraints on global warming projections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiogama, H.; Stone, D.; Emori, S.

    Projections of global mean temperature changes (ΔT) in the future are associated with intrinsic uncertainties. Much climate policy discourse has been guided by "current knowledge" of the ΔTs uncertainty, ignoring the likely future reductions of the uncertainty, because a mechanism for predicting these reductions is lacking. By using simulations of Global Climate Models from the Coupled Model Intercomparison Project Phase 5 ensemble as pseudo past and future observations, we estimate how fast and in what way the uncertainties of ΔT can decline when the current observation network of surface air temperature is maintained. At least in the world of pseudo observations under the Representative Concentration Pathways (RCPs), we can drastically reduce more than 50% of the ΔTs uncertainty in the 2040s by 2029, and more than 60% of the ΔTs uncertainty in the 2090s by 2049. Under the highest forcing scenario of RCPs, we can predict the true timing of passing the 2°C (3°C) warming threshold 20 (30) years in advance with errors less than 10 years. These results demonstrate potential for sequential decision-making strategies to take advantage of future progress in understanding of anthropogenic climate change.

  11. Uncertainty estimates of purity measurements based on current information: toward a "live validation" of purity methods.

    PubMed

    Apostol, Izydor; Kelner, Drew; Jiang, Xinzhao Grace; Huang, Gang; Wypych, Jette; Zhang, Xin; Gastwirt, Jessica; Chen, Kenneth; Fodor, Szilan; Hapuarachchi, Suminda; Meriage, Dave; Ye, Frank; Poppe, Leszek; Szpankowski, Wojciech

    2012-12-01

    Our aim was to predict precision and other performance characteristics of chromatographic purity methods, which represent the most widely used form of analysis in the biopharmaceutical industry. We have conducted a comprehensive survey of purity methods, and show that all performance characteristics fall within narrow measurement ranges. This observation was used to develop a model called Uncertainty Based on Current Information (UBCI), which expresses these performance characteristics as a function of the signal and noise levels, hardware specifications, and software settings. We applied the UBCI model to assess the uncertainty of purity measurements, and compared the results to those from conventional qualification. We demonstrated that the UBCI model is suitable to dynamically assess method performance characteristics, based on information extracted from individual chromatograms. The model provides an opportunity for streamlining qualification and validation studies by implementing a "live validation" of test results utilizing UBCI as a concurrent assessment of measurement uncertainty. Therefore, UBCI can potentially mitigate the challenges associated with laborious conventional method validation and facilitate the introduction of more advanced analytical technologies during the method lifecycle.

  12. An integrated uncertainty analysis and data assimilation approach for improved streamflow predictions

    NASA Astrophysics Data System (ADS)

    Hogue, T. S.; He, M.; Franz, K. J.; Margulis, S. A.; Vrugt, J. A.

    2010-12-01

    The current study presents an integrated uncertainty analysis and data assimilation approach to improve streamflow predictions while simultaneously providing meaningful estimates of the associated uncertainty. Study models include the National Weather Service (NWS) operational snow model (SNOW17) and rainfall-runoff model (SAC-SMA). The proposed approach uses the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) to simultaneously estimate uncertainties in model parameters, forcing, and observations. An ensemble Kalman filter (EnKF) is configured with the DREAM-identified uncertainty structure and applied to assimilating snow water equivalent data into the SNOW17 model for improved snowmelt simulations. Snowmelt estimates then serve as an input to the SAC-SMA model to provide streamflow predictions at the basin outlet. The robustness and usefulness of the approach are evaluated for a snow-dominated watershed in the northern Sierra Mountains. This presentation describes the implementation of DREAM and EnKF into the coupled SNOW17 and SAC-SMA models and summarizes study results and findings.
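The EnKF analysis step at the core of this approach can be sketched for a scalar state such as SWE. This is the generic stochastic-EnKF textbook update with an identity observation operator, not NWS operational code, and all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, obs, obs_var):
    """Scalar stochastic EnKF analysis step (observation operator H = 1)."""
    forecast_var = ensemble.var(ddof=1)
    gain = forecast_var / (forecast_var + obs_var)  # Kalman gain
    # Perturb the observation independently for each ensemble member.
    perturbed_obs = obs + rng.normal(0.0, np.sqrt(obs_var), ensemble.size)
    return ensemble + gain * (perturbed_obs - ensemble)

# Illustrative 50-member SWE forecast (mm) and a single SWE observation.
forecast = rng.normal(120.0, 15.0, size=50)
analysis = enkf_update(forecast, obs=100.0, obs_var=25.0)

print(f"forecast mean {forecast.mean():.1f} mm -> analysis mean {analysis.mean():.1f} mm")
print(f"forecast var  {forecast.var(ddof=1):.1f} -> analysis var  {analysis.var(ddof=1):.1f}")
```

In the full framework, the DREAM-estimated parameter, forcing, and observation uncertainties would set the ensemble spread and the observation error variance rather than the fixed values used here.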

  13. Satellite Re-entry Modeling and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Horsley, M.

    2012-09-01

    LEO trajectory modeling is a fundamental aerospace capability and has applications in many areas of aerospace, such as maneuver planning, sensor scheduling, re-entry prediction, collision avoidance, risk analysis, and formation flying. Somewhat surprisingly, modeling the trajectory of an object in low Earth orbit is still a challenging task. This is primarily due to the large uncertainty in the upper atmospheric density, about 15-20% (1-sigma) for most thermosphere models. Other contributions come from our inability to precisely model future solar and geomagnetic activities, the potentially unknown shape, material construction and attitude history of the satellite, and intermittent, noisy tracking data. Current methods to predict a satellite's re-entry trajectory typically involve making a single prediction, with the uncertainty dealt with in an ad-hoc manner, usually based on past experience. However, due to the extreme speed of a LEO satellite, even small uncertainties in the re-entry time translate into a very large uncertainty in the location of the re-entry event. Currently, most methods simply update the re-entry estimate on a regular basis. This results in a wide range of estimates that are literally spread over the entire globe. With no understanding of the underlying distribution of potential impact points, the sequence of impact points predicted by the current methodology is largely useless until just a few hours before re-entry. This paper will discuss the development of a set of High Performance Computing (HPC)-based capabilities to support near real-time quantification of the uncertainty inherent in uncontrolled satellite re-entries. An appropriate management of the uncertainties is essential for a rigorous treatment of the re-entry/LEO trajectory problem. The development of HPC-based tools for re-entry analysis is important as it will allow a rigorous and robust approach to risk assessment by decision makers in an operational setting.
Uncertainty quantification results from the recent uncontrolled re-entry of the Phobos-Grunt satellite will be presented and discussed. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
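The claim that timing errors spread predicted impact points across the globe follows from simple along-track arithmetic (the orbital speed below is a rounded approximation, not a value taken from the paper):

```python
# A satellite in LEO moves at roughly 7.8 km/s along track (approximate value),
# so a small re-entry time uncertainty maps into a large ground-track spread.
ORBITAL_SPEED_KM_S = 7.8

for minutes in (1, 10, 60):
    along_track_km = ORBITAL_SPEED_KM_S * minutes * 60
    print(f"+/- {minutes:2d} min re-entry window -> +/- {along_track_km:,.0f} km along track")
```

A one-hour window already exceeds half of Earth's circumference, which is why single-point impact estimates without quantified uncertainty are of little use until shortly before re-entry.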

  14. Development of a Prototype Model-Form Uncertainty Knowledge Base

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.

    2016-01-01

    Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that among the choices to be made during a design process within an analysis, there are different forms of the analysis process, which each give different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds are explained.

  15. Uncertainties in Predicting Rice Yield by Current Crop Models Under a Wide Range of Climatic Conditions

    NASA Technical Reports Server (NTRS)

    Li, Tao; Hasegawa, Toshihiro; Yin, Xinyou; Zhu, Yan; Boote, Kenneth; Adam, Myriam; Bregaglio, Simone; Buis, Samuel; Confalonieri, Roberto; Fumoto, Tamon; et al.

    2014-01-01

    Predicting rice (Oryza sativa) productivity under future climates is important for global food security. Ecophysiological crop models in combination with climate model outputs are commonly used in yield prediction, but uncertainties associated with crop models remain largely unquantified. We evaluated 13 rice models against multi-year experimental yield data at four sites with diverse climatic conditions in Asia and examined whether different modeling approaches to major physiological processes contribute to the uncertainty in predictions of field-measured yields and to the uncertainty in sensitivity to changes in temperature and CO2 concentration [CO2]. We also examined whether use of an ensemble of crop models can reduce the uncertainties. Individual models did not consistently reproduce both experimental and regional yields well, and uncertainty was larger at the warmest and coolest sites. The variation in yield projections was larger among crop models than variation resulting from 16 global climate model-based scenarios. However, the mean of predictions of all crop models reproduced experimental data, with an uncertainty of less than 10 percent of measured yields. Using an ensemble of eight models calibrated only for phenology or five models calibrated in detail resulted in uncertainty equivalent to that of the measured yield in well-controlled agronomic field experiments. Sensitivity analysis indicates the necessity to improve the accuracy in predicting both biomass and harvest index in response to increasing [CO2] and temperature.

  16. Quantifying Uncertainties in N2O Emission Due to N Fertilizer Application in Cultivated Areas

    PubMed Central

    Philibert, Aurore; Loyce, Chantal; Makowski, David

    2012-01-01

    Nitrous oxide (N2O) is a greenhouse gas with a global warming potential approximately 298 times greater than that of CO2. In 2006, the Intergovernmental Panel on Climate Change (IPCC) estimated N2O emission due to synthetic and organic nitrogen (N) fertilization at 1% of applied N. We investigated the uncertainty on this estimated value, by fitting 13 different models to a published dataset including 985 N2O measurements. These models were characterized by (i) the presence or absence of the explanatory variable “applied N”, (ii) the function relating N2O emission to applied N (exponential or linear function), (iii) fixed or random background (i.e. in the absence of N application) N2O emission and (iv) fixed or random applied N effect. We calculated ranges of uncertainty on N2O emissions from a subset of these models, and compared them with the uncertainty ranges currently used in the IPCC-Tier 1 method. The exponential models outperformed the linear models, and models including one or two random effects outperformed those including fixed effects only. The use of an exponential function rather than a linear function has an important practical consequence: the emission factor is not constant and increases as a function of applied N. Emission factors estimated using the exponential function were lower than 1% when the amount of N applied was below 160 kg N ha−1. Our uncertainty analysis shows that the uncertainty range currently used by the IPCC-Tier 1 method could be reduced. PMID:23226430
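The practical consequence of the exponential model form, an emission factor that grows with applied N, can be reproduced with a toy calculation. The coefficients below are hypothetical, chosen only to illustrate the shape of the two model forms, not the study's fitted values:

```python
import math

# Hypothetical exponential-model coefficients: E(N) = exp(A + B*N), in
# arbitrary emission units; not the fitted values from the study.
A, B = 0.0, 0.005
EF_CONSTANT = 0.01  # a constant 1% emission factor, as in IPCC Tier 1

def emission_factor_exp(n_applied):
    """Fraction of applied N emitted as N2O under the exponential model."""
    return (math.exp(A + B * n_applied) - math.exp(A)) / n_applied

for n in (50, 160, 250):
    print(f"N = {n:3d} kg/ha: exponential EF = {emission_factor_exp(n):.2%}, "
          f"constant EF = {EF_CONSTANT:.2%}")
```

Under the exponential fit the emission factor is no longer a single number: with these illustrative coefficients it stays below 1% at moderate application rates and rises as applied N increases, which is the qualitative behavior the abstract reports.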

  17. 'spup' - an R package for uncertainty propagation in spatial environmental modelling

    NASA Astrophysics Data System (ADS)

    Sawicka, Kasia; Heuvelink, Gerard

    2016-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable, including to case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally.
Selected static and interactive visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy-to-apply tool that can be used in multi-disciplinary research and model-based decision support.
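'spup' itself is an R package, but the workflow it automates (describe each uncertain input by a probability distribution, draw a Latin hypercube sample, run the sample through the model, summarize the output distribution) can be sketched in Python. The quadratic "model" below is a made-up placeholder:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(1)
n = 1000

def model(x):
    """Hypothetical placeholder for an environmental model."""
    return 3.0 * x + 0.5 * x**2

# One-dimensional Latin hypercube sample: one uniform draw per stratum,
# then mapped through the inverse CDF of the input's uncertainty model.
u = (rng.permutation(n) + rng.random(n)) / n
u = np.clip(u, 1e-12, 1 - 1e-12)  # guard the inverse-CDF endpoints
dist = NormalDist(mu=10.0, sigma=2.0)  # uncertainty model of the input
x = np.array([dist.inv_cdf(float(ui)) for ui in u])

y = model(x)
lo, med, hi = np.quantile(y, [0.05, 0.50, 0.95])
print(f"output 5% / 50% / 95%: {lo:.1f} / {med:.1f} / {hi:.1f}")
```

The stratification ensures every part of the input distribution is sampled, which is why Latin hypercube sampling typically needs fewer Monte Carlo runs than simple random sampling for the same accuracy.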

  18. 'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling

    NASA Astrophysics Data System (ADS)

    Sawicka, Kasia; Heuvelink, Gerard

    2017-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable and able to deal with case studies involving spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally.
Selected visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy tool to apply and can be used in multi-disciplinary research and model-based decision support.
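The package's core workflow — describe input uncertainty, draw a stratified Monte Carlo sample, push it through the model — can be sketched in a few lines. The sketch below is in Python rather than R, with an invented toy runoff model and uniform input distributions; none of the names come from spup itself:

```python
import numpy as np

def latin_hypercube(n, k, rng):
    """One stratified draw per equal-probability slice, per variable,
    with independent shuffles to break cross-variable correlation."""
    u = (rng.random((n, k)) + np.arange(n)[:, None]) / n
    for j in range(k):
        rng.shuffle(u[:, j])
    return u

def propagate(model, lows, highs, n=5000, seed=0):
    """Monte Carlo propagation of uniform input uncertainty through `model`."""
    rng = np.random.default_rng(seed)
    u = latin_hypercube(n, len(lows), rng)
    x = np.asarray(lows) + u * (np.asarray(highs) - np.asarray(lows))
    y = np.array([model(row) for row in x])
    return y.mean(), y.std()

# Invented toy model: 'runoff' from rainfall p[0] and a soil parameter p[1].
runoff = lambda p: max(0.0, p[0] - 0.2 * p[1]) ** 1.5
m, s = propagate(runoff, lows=[50.0, 10.0], highs=[80.0, 30.0])
```

The output mean and standard deviation summarize the propagated uncertainty; in spup the same realizations would also feed the visualization functions.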

  19. Quantification of uncertainty in flood risk assessment for flood protection planning: a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel

    2017-04-01

Flood risk estimates are subject to significant uncertainties, e.g. due to limited records of historic flood events, uncertainty in flood modeling, the uncertain impact of climate change, or uncertainty in the exposure and loss estimates. In traditional design of flood protection systems, these uncertainties are typically accounted for only implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for planning flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, incomplete ensembles, and local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties may be reduced in the future as new evidence - such as new climate models, observed extreme events, and socio-economic data - becomes available. Finally, we examine how this new evidence influences the risk assessment and the effectiveness of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria, which includes the city of Rosenheim and suffered significant losses during the 2013 flood event.
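The statistical (limited-record-length) component listed under 1) can be illustrated with a minimal sketch: bootstrap a short annual-maximum record, refit a Gumbel distribution each time, and watch the spread of the 100-year flood estimate. This is not the AdaptRisk methodology — the record, the method-of-moments fit, and all numbers are invented for illustration:

```python
import numpy as np

def gumbel_q(mu, beta, T):
    """Gumbel quantile for return period T (years)."""
    return mu - beta * np.log(-np.log(1.0 - 1.0 / T))

def fit_gumbel_moments(x):
    """Method-of-moments Gumbel fit (simple stand-in for a Bayesian fit)."""
    beta = np.sqrt(6.0) * x.std(ddof=1) / np.pi
    mu = x.mean() - 0.5772 * beta
    return mu, beta

def q100_uncertainty(record, n_boot=2000, seed=0):
    """Bootstrap the short record to expose the record-length
    (statistical) uncertainty in the 100-year flood estimate."""
    rng = np.random.default_rng(seed)
    qs = []
    for _ in range(n_boot):
        resample = rng.choice(record, size=len(record), replace=True)
        mu, beta = fit_gumbel_moments(resample)
        qs.append(gumbel_q(mu, beta, 100))
    return np.percentile(qs, [5, 50, 95])

# Hypothetical 30-year annual-maximum discharge record (m^3/s).
record = np.random.default_rng(1).gumbel(loc=300, scale=80, size=30)
lo, med, hi = q100_uncertainty(record)
```

The 5-95% spread narrows as the record lengthens, which is exactly the "uncertainty reduced by new observations" effect the abstract describes.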

  20. A Hierarchical Multi-Model Approach for Uncertainty Segregation, Prioritization and Comparative Evaluation of Competing Modeling Propositions

    NASA Astrophysics Data System (ADS)

    Tsai, F. T.; Elshall, A. S.; Hanor, J. S.

    2012-12-01

Subsurface modeling is challenging because of the many possible competing propositions for each uncertain model component. How can we judge that we are selecting the correct proposition for an uncertain model component out of numerous competing propositions? How can we bridge the gap between synthetic mental constructs such as mathematical expressions on one hand, and empirical observation such as field data on the other, when uncertainty exists on both sides? In this study, we introduce hierarchical Bayesian model averaging (HBMA) as a multi-model (multi-proposition) framework to represent our current state of knowledge and decisions in hydrogeological structure modeling. The HBMA framework allows for segregating and prioritizing different sources of uncertainty, and for comparative evaluation of competing propositions for each source of uncertainty. We applied HBMA to a study of the hydrostratigraphy and uncertainty propagation of the Southern Hills aquifer system in the Baton Rouge area, Louisiana. We used geophysical data for hydrogeological structure construction through the indicator hydrostratigraphy method, and lithologic data from drillers' logs for model structure calibration. However, due to uncertainty in model data, structure, and parameters, multiple possible hydrostratigraphic models were produced and calibrated. The study considered four sources of uncertainty. To evaluate mathematical structure uncertainty, the study considered three different variogram models and two geological stationarity assumptions. With respect to geological structure uncertainty, the study considered two geological structures for the Denham Springs-Scotlandville fault. With respect to data uncertainty, the study considered two calibration data sets. These four sources of uncertainty with their corresponding competing modeling propositions resulted in 24 calibrated models.
The results showed that by segregating different sources of uncertainty, HBMA analysis provided insights on uncertainty priorities and propagation. In addition, it assisted in evaluating the relative importance of competing modeling propositions for each uncertain model component. By being able to dissect the uncertain model components and provide weighted representation of the competing propositions for each uncertain model component based on the background knowledge, the HBMA functions as an epistemic framework for advancing knowledge about the system under study.
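The weighting-and-aggregation step at the heart of this kind of averaging can be sketched with BIC-based model probabilities rolled up one level of a hypothetical hierarchy — here two geological structures crossed with three variogram models, with invented BIC values; the actual HBMA study uses its own weighting scheme:

```python
import numpy as np

def bic_weights(bics):
    """Posterior model probabilities from BIC differences."""
    d = np.asarray(bics) - min(bics)
    w = np.exp(-0.5 * d)
    return w / w.sum()

# Hypothetical hierarchy: 2 geological structures x 3 variogram models,
# with an invented BIC for each of the 6 calibrated models.
bics = {('A', 'v1'): 102.0, ('A', 'v2'): 100.0, ('A', 'v3'): 105.0,
        ('B', 'v1'): 101.0, ('B', 'v2'): 103.0, ('B', 'v3'): 108.0}
w = dict(zip(bics, bic_weights(list(bics.values()))))

# Marginal weight of each structure = sum of the weights in its branch,
# i.e. the segregated contribution of the 'geological structure' source.
p_struct = {s: sum(w[k] for k in w if k[0] == s) for s in ('A', 'B')}
```

Comparing `p_struct['A']` against `p_struct['B']` is the comparative evaluation of one uncertain component with the others averaged out.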

  1. Sources of Uncertainty in Predicting Land Surface Fluxes Using Diverse Data and Models

    NASA Technical Reports Server (NTRS)

    Dungan, Jennifer L.; Wang, Weile; Michaelis, Andrew; Votava, Petr; Nemani, Ramakrishma

    2010-01-01

    In the domain of predicting land surface fluxes, models are used to bring data from large observation networks and satellite remote sensing together to make predictions about present and future states of the Earth. Characterizing the uncertainty about such predictions is a complex process and one that is not yet fully understood. Uncertainty exists about initialization, measurement and interpolation of input variables; model parameters; model structure; and mixed spatial and temporal supports. Multiple models or structures often exist to describe the same processes. Uncertainty about structure is currently addressed by running an ensemble of different models and examining the distribution of model outputs. To illustrate structural uncertainty, a multi-model ensemble experiment we have been conducting using the Terrestrial Observation and Prediction System (TOPS) will be discussed. TOPS uses public versions of process-based ecosystem models that use satellite-derived inputs along with surface climate data and land surface characterization to produce predictions of ecosystem fluxes including gross and net primary production and net ecosystem exchange. Using the TOPS framework, we have explored the uncertainty arising from the application of models with different assumptions, structures, parameters, and variable definitions. With a small number of models, this only begins to capture the range of possible spatial fields of ecosystem fluxes. Few attempts have been made to systematically address the components of uncertainty in such a framework. We discuss the characterization of uncertainty for this approach including both quantifiable and poorly known aspects.

  2. From climate-change spaghetti to climate-change distributions for 21st Century California

    USGS Publications Warehouse

    Dettinger, M.D.

    2005-01-01

    The uncertainties associated with climate-change projections for California are unlikely to disappear any time soon, and yet important long-term decisions will be needed to accommodate those potential changes. Projection uncertainties have typically been addressed by analysis of a few scenarios, chosen based on availability or to capture the extreme cases among available projections. However, by focusing on more common projections rather than the most extreme projections (using a new resampling method), new insights into current projections emerge: (1) uncertainties associated with future greenhouse-gas emissions are comparable with the differences among climate models, so that neither source of uncertainties should be neglected or underrepresented; (2) twenty-first century temperature projections spread more, overall, than do precipitation scenarios; (3) projections of extremely wet futures for California are true outliers among current projections; and (4) current projections that are warmest tend, overall, to yield a moderately drier California, while the cooler projections yield a somewhat wetter future. The resampling approach applied in this paper also provides a natural opportunity to objectively incorporate measures of model skill and the likelihoods of various emission scenarios into future assessments.

  3. Application of data fusion modeling (DFM) to site characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Porter, D.W.; Gibbs, B.P.; Jones, W.F.

    1996-01-01

Subsurface characterization is faced with substantial uncertainties because the earth is very heterogeneous and typical data sets are fragmented and disparate. DFM removes many of the data limitations of current methods to quantify and reduce uncertainty for a variety of data types and models. DFM is a methodology to compute hydrogeological state estimates and their uncertainties from three sources of information: measured data, physical laws, and statistical models for spatial heterogeneities. The benefits of DFM are savings in time and cost through the following: the ability to update models in real time to help guide site assessment, improved quantification of uncertainty for risk assessment, and improved remedial design by quantifying the uncertainty in safety margins. A Bayesian inverse modeling approach is implemented with a Gauss-Newton method where spatial heterogeneities are viewed as Markov random fields. Information from data, physical laws, and Markov models is combined in a Square Root Information Smoother (SRIS). Estimates and uncertainties can be computed for heterogeneous hydraulic conductivity fields in multiple geological layers from the usually sparse hydraulic conductivity data and the often more plentiful head data. An application of DFM to the Old Burial Ground at the DOE Savannah River Site will be presented. DFM estimates and quantifies uncertainty in hydrogeological parameters using variably saturated flow numerical modeling to constrain the estimation. The uncertainties are then propagated through the transport modeling to quantify the uncertainty in tritium breakthrough curves at compliance points.
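The Bayesian inverse step the abstract describes — a Gauss-Newton iteration balancing a data misfit against a prior — can be sketched on a toy one-parameter problem. The forward model, observations, and covariances below are all invented; the method, not the numbers, is the point:

```python
import numpy as np

def gauss_newton_map(forward, jac, d_obs, m0, C_d_inv, C_m_inv, iters=20):
    """MAP estimate for a Bayesian inverse problem: minimize
    (d - g(m))' Cd^-1 (d - g(m)) + (m - m0)' Cm^-1 (m - m0)."""
    m = m0.copy()
    for _ in range(iters):
        r = d_obs - forward(m)
        J = jac(m)
        H = J.T @ C_d_inv @ J + C_m_inv     # Gauss-Newton Hessian
        g = J.T @ C_d_inv @ r - C_m_inv @ (m - m0)
        m = m + np.linalg.solve(H, g)
    # Posterior covariance from the final Hessian (Laplace approximation).
    return m, np.linalg.inv(H)

# Toy forward model: two 'head' observations decaying with log-conductivity m.
forward = lambda m: np.array([np.exp(-m[0]), 2.0 * np.exp(-m[0])])
jac = lambda m: np.array([[-np.exp(-m[0])], [-2.0 * np.exp(-m[0])]])
d_obs = np.array([0.35, 0.72])
m_map, C_post = gauss_newton_map(forward, jac, d_obs,
                                 m0=np.array([0.5]),
                                 C_d_inv=np.eye(2) / 0.01**2,
                                 C_m_inv=np.eye(1) / 1.0**2)
```

With tight data noise the estimate is pulled to the data-fitting value; loosening `C_d_inv` lets the prior `m0` dominate, which is the trade-off DFM quantifies.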

  5. Evaluation of habitat suitability index models by global sensitivity and uncertainty analyses: a case study for submerged aquatic vegetation

    USGS Publications Warehouse

    Zajac, Zuzanna; Stith, Bradley M.; Bowling, Andrea C.; Langtimm, Catherine A.; Swain, Eric D.

    2015-01-01

Habitat suitability index (HSI) models are commonly used to predict habitat quality and species distributions and are used to develop biological surveys, assess reserve and management priorities, and anticipate possible change under different management or climate change scenarios. Important management decisions may be based on model results, often without a clear understanding of the level of uncertainty associated with model outputs. We present an integrated methodology to assess the propagation of uncertainty from both the inputs and the structure of the HSI models to model outputs (uncertainty analysis: UA), and the relative importance of uncertain model inputs and their interactions for the model output uncertainty (global sensitivity analysis: GSA). We illustrate the GSA/UA framework using simulated hydrology input data from a hydrodynamic model representing sea level changes and HSI models for two species of submerged aquatic vegetation (SAV) in southwest Everglades National Park: Vallisneria americana (tape grass) and Halodule wrightii (shoal grass). We found considerable spatial variation in uncertainty for both species, but distributions of HSI scores still allowed discrimination of sites with good versus poor conditions. Ranking of input parameter sensitivities also varied spatially for both species, with high habitat quality sites showing sensitivity to different parameters than low-quality sites. HSI models may be especially useful when species distribution data are unavailable, providing a means of exploiting widely available environmental datasets to model past, current, and future habitat conditions. The GSA/UA approach provides a general method for better understanding HSI model dynamics, the spatial and temporal variation in uncertainties, and the parameters that contribute most to model uncertainty.
Including an uncertainty and sensitivity analysis in modeling efforts as part of the decision-making framework will result in better-informed, more robust decisions.
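The GSA side of such a framework can be illustrated with a plain Monte Carlo estimate of first-order Sobol indices (a pick-freeze/Jansen estimator), applied to a stand-in linear "HSI" model with three U(0,1) inputs; the study's actual models and inputs are of course different:

```python
import numpy as np

def first_order_sobol(model, n_vars, n=4096, seed=0):
    """Monte Carlo (pick-freeze) estimate of first-order Sobol indices
    for a model with independent U(0,1) inputs."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, n_vars))
    B = rng.random((n, n_vars))
    yA = np.apply_along_axis(model, 1, A)
    yB = np.apply_along_axis(model, 1, B)
    var = np.concatenate([yA, yB]).var()
    S = []
    for i in range(n_vars):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                  # replace only column i
        yABi = np.apply_along_axis(model, 1, ABi)
        # Jansen estimator: B and AB_i share only input i.
        S.append(1.0 - 0.5 * np.mean((yB - yABi) ** 2) / var)
    return S

# Invented stand-in for an HSI model: input 1 dominates, input 2 is inert.
hsi = lambda x: x[0] + 2.0 * x[1] + 0.0 * x[2]
S = first_order_sobol(hsi, 3)
```

For this linear model the analytic indices are 0.2, 0.8, and 0, so the estimator's ranking of "which input drives output uncertainty" can be checked directly.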

  6. Assessing uncertainties in land cover projections.

    PubMed

    Alexander, Peter; Prestele, Reinhard; Verburg, Peter H; Arneth, Almut; Baranzelli, Claudia; Batista E Silva, Filipe; Brown, Calum; Butler, Adam; Calvin, Katherine; Dendoncker, Nicolas; Doelman, Jonathan C; Dunford, Robert; Engström, Kerstin; Eitelberg, David; Fujimori, Shinichiro; Harrison, Paula A; Hasegawa, Tomoko; Havlik, Petr; Holzhauer, Sascha; Humpenöder, Florian; Jacobs-Crisioni, Chris; Jain, Atul K; Krisztin, Tamás; Kyle, Page; Lavalle, Carlo; Lenton, Tim; Liu, Jiayi; Meiyappan, Prasanth; Popp, Alexander; Powell, Tom; Sands, Ronald D; Schaldach, Rüdiger; Stehfest, Elke; Steinbuks, Jevgenijs; Tabeau, Andrzej; van Meijl, Hans; Wise, Marshall A; Rounsevell, Mark D A

    2017-02-01

Understanding uncertainties in land cover projections is critical to investigating land-based climate mitigation policies, assessing the potential of climate adaptation strategies and quantifying the impacts of land cover change on the climate system. Here, we identify and quantify uncertainties in global and European land cover projections over a diverse range of model types and scenarios, extending the analysis beyond the agro-economic models included in previous comparisons. The results from 75 simulations over 18 models are analysed and show a large range in land cover area projections, with the highest variability occurring in future cropland areas. We demonstrate systematic differences in land cover areas associated with the characteristics of the modelling approach, which are at least as great as the differences attributed to the scenario variations. The results lead us to conclude that a higher degree of uncertainty exists in land use projections than is currently included in climate or earth system projections. To account for land use uncertainty, it is recommended to use a diverse set of models and approaches when assessing the potential impacts of land cover change on future climate. Additionally, further work is needed to better understand the assumptions driving land use model results and reveal the causes of uncertainty in more depth, to help reduce model uncertainty and improve the projections of land cover. © 2016 John Wiley & Sons Ltd.

  7. The Impact of Model and Rainfall Forcing Errors on Characterizing Soil Moisture Uncertainty in Land Surface Modeling

    NASA Technical Reports Server (NTRS)

    Maggioni, V.; Anagnostou, E. N.; Reichle, R. H.

    2013-01-01

    The contribution of rainfall forcing errors relative to model (structural and parameter) uncertainty in the prediction of soil moisture is investigated by integrating the NASA Catchment Land Surface Model (CLSM), forced with hydro-meteorological data, in the Oklahoma region. Rainfall-forcing uncertainty is introduced using a stochastic error model that generates ensemble rainfall fields from satellite rainfall products. The ensemble satellite rain fields are propagated through CLSM to produce soil moisture ensembles. Errors in CLSM are modeled with two different approaches: either by perturbing model parameters (representing model parameter uncertainty) or by adding randomly generated noise (representing model structure and parameter uncertainty) to the model prognostic variables. Our findings highlight that the method currently used in the NASA GEOS-5 Land Data Assimilation System to perturb CLSM variables poorly describes the uncertainty in the predicted soil moisture, even when combined with rainfall model perturbations. On the other hand, by adding model parameter perturbations to rainfall forcing perturbations, a better characterization of uncertainty in soil moisture simulations is observed. Specifically, an analysis of the rank histograms shows that the most consistent ensemble of soil moisture is obtained by combining rainfall and model parameter perturbations. When rainfall forcing and model prognostic perturbations are added, the rank histogram shows a U-shape at the domain average scale, which corresponds to a lack of variability in the forecast ensemble. The more accurate estimation of the soil moisture prediction uncertainty obtained by combining rainfall and parameter perturbations is encouraging for the application of this approach in ensemble data assimilation systems.
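The rank-histogram diagnostic used in this analysis is easy to reproduce on synthetic data: count where each observation falls within its ensemble; the histogram is roughly flat for a consistent ensemble but U-shaped for an under-dispersive one, which is the shape the abstract reports. The distributions below are invented:

```python
import numpy as np

def rank_histogram(ensemble, obs):
    """ensemble: (n_cases, n_members); obs: (n_cases,).
    Returns counts of the observation's rank (0..n_members) per case."""
    ranks = (ensemble < obs[:, None]).sum(axis=1)
    return np.bincount(ranks, minlength=ensemble.shape[1] + 1)

rng = np.random.default_rng(0)
n_cases, n_members = 5000, 9
truth = rng.normal(0.0, 1.0, n_cases)
# Consistent ensemble: members drawn from the truth's distribution.
consistent = rng.normal(0.0, 1.0, (n_cases, n_members))
# Under-dispersive ensemble: too little spread -> U-shaped histogram.
narrow = rng.normal(0.0, 0.4, (n_cases, n_members))
h_flat = rank_histogram(consistent, truth)
h_u = rank_histogram(narrow, truth)
```

A U-shape means observations keep landing outside the ensemble envelope, i.e. the perturbations under-represent the true forecast uncertainty.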

  8. Quantifying the intra-annual uncertainties in climate change assessment over 10 sub-basins across the Pacific Northwest US

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, Ali; Moradkhani, Hamid; Rana, Arun

    2017-04-01

Uncertainty is an inevitable feature of climate change impact assessments. Understanding and quantifying different sources of uncertainty is of high importance and can help modeling agencies improve current models and scenarios. In this study, we have assessed the future changes in three climate variables (i.e. precipitation, maximum temperature, and minimum temperature) over 10 sub-basins across the Pacific Northwest US. To conduct the study, 10 statistically downscaled CMIP5 GCMs from two downscaling methods (i.e. BCSD and MACA) were utilized at 1/16 degree spatial resolution for the historical period of 1970-2000 and the future period of 2010-2099. For the future projections, the two scenarios RCP4.5 and RCP8.5 were used. Furthermore, Bayesian Model Averaging (BMA) was employed to develop a probabilistic future projection for each climate variable. Results indicate the superiority of the BMA simulations over individual models. Increasing temperature and precipitation are projected at the annual timescale; however, the changes are not uniform across seasons. Model uncertainty proves to be the major source of uncertainty, while downscaling uncertainty contributes significantly to the total uncertainty, especially in summer.

  9. Multiscale Modeling of Carbon Fiber Reinforced Polymer (CFRP) for Integrated Computational Materials Engineering Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Jiaying; Liang, Biao; Zhang, Weizhao

In this work, a multiscale modeling framework for CFRP is introduced to study the hierarchical structure of CFRP. Four distinct scales are defined: nanoscale, microscale, mesoscale, and macroscale. Information at lower scales can be passed to higher scales, which is beneficial for studying the effect of constituents on the mechanical properties of macroscale parts. This bottom-up modeling approach enables a better understanding of CFRP from the finest details. The current study focuses on the microscale and mesoscale. A representative volume element (RVE) is used at the microscale and mesoscale to model material properties. At the microscale, a unidirectional CFRP (UD) RVE is used to study the properties of UD. The UD RVE can be modeled with different volume fractions to account for the non-uniform fiber distribution in a CFRP part. Such consideration is important in modeling uncertainties at the microscale level. Currently, we identify volume fraction as the only uncertain parameter in the UD RVE. To measure the effective material properties of the UD RVE, periodic boundary conditions (PBC) are applied to ensure convergence of the obtained properties. The properties of UD are used directly in mesoscale woven RVE modeling, where each yarn is assumed to have the same properties as UD. Within the woven RVE, there are many potential uncertain parameters to consider for a physical model of CFRP. Currently, we consider fiber misalignment within a yarn and the angle between warp and weft yarns. PBC are applied to the woven RVE to calculate its effective material properties. The effects of these uncertainties are investigated quantitatively by a Gaussian process. Preliminary results of the UD and woven studies are analyzed for the efficacy of the RVE modeling. This work is considered the foundation for future multiscale modeling framework development for the ICME project.
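The last step — representing the effect of an uncertain parameter with a Gaussian process — can be sketched with a basic GP regression over invented volume-fraction vs. effective-modulus results; the kernel choice, length scale, and all data points are illustrative assumptions, not RVE outputs:

```python
import numpy as np

def rbf(X1, X2, ell=0.05, var=1.0):
    """Squared-exponential kernel on 1-D inputs."""
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / ell**2)

def gp_predict(x_train, y_train, x_test, noise=1e-6, ell=0.05):
    """Standard GP regression: posterior mean and pointwise std. dev."""
    K = rbf(x_train, x_train, ell) + noise * np.eye(len(x_train))
    Ks = rbf(x_test, x_train, ell)
    Kss = rbf(x_test, x_test, ell)
    alpha = np.linalg.solve(K, y_train)
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return Ks @ alpha, np.sqrt(np.clip(np.diag(cov), 0.0, None))

# Hypothetical RVE results: effective modulus (GPa) vs fiber volume fraction.
vf = np.array([0.45, 0.50, 0.55, 0.60, 0.65])
E = np.array([38.0, 42.5, 47.3, 52.4, 57.8])
vf_q = np.array([0.52, 0.58])                 # query volume fractions
mean, sd = gp_predict(vf, E - E.mean(), vf_q)
mean = mean + E.mean()                        # undo the centering
```

The GP mean interpolates the sparse RVE runs and `sd` flags where more runs would most reduce the surrogate's uncertainty.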

  10. Quantifying allometric model uncertainty for plot-level live tree biomass stocks with a data-driven, hierarchical framework

    Treesearch

    Brian J. Clough; Matthew B. Russell; Grant M. Domke; Christopher W. Woodall

    2016-01-01

    Accurate uncertainty assessments of plot-level live tree biomass stocks are an important precursor to estimating uncertainty in annual national greenhouse gas inventories (NGHGIs) developed from forest inventory data. However, current approaches employed within the United States’ NGHGI do not specifically incorporate methods to address error in tree-scale biomass...

  11. When 1+1 can be >2: Uncertainties compound when simulating climate, fisheries and marine ecosystems

    NASA Astrophysics Data System (ADS)

    Evans, Karen; Brown, Jaclyn N.; Sen Gupta, Alex; Nicol, Simon J.; Hoyle, Simon; Matear, Richard; Arrizabalaga, Haritz

    2015-03-01

    Multi-disciplinary approaches that combine oceanographic, biogeochemical, ecosystem, fisheries population and socio-economic models are vital tools for modelling whole ecosystems. Interpreting the outputs from such complex models requires an appreciation of the many different types of modelling frameworks being used and their associated limitations and uncertainties. Both users and developers of particular model components will often have little involvement or understanding of other components within such modelling frameworks. Failure to recognise limitations and uncertainties associated with components and how these uncertainties might propagate throughout modelling frameworks can potentially result in poor advice for resource management. Unfortunately, many of the current integrative frameworks do not propagate the uncertainties of their constituent parts. In this review, we outline the major components of a generic whole of ecosystem modelling framework incorporating the external pressures of climate and fishing. We discuss the limitations and uncertainties associated with each component of such a modelling system, along with key research gaps. Major uncertainties in modelling frameworks are broadly categorised into those associated with (i) deficient knowledge in the interactions of climate and ocean dynamics with marine organisms and ecosystems; (ii) lack of observations to assess and advance modelling efforts and (iii) an inability to predict with confidence natural ecosystem variability and longer term changes as a result of external drivers (e.g. greenhouse gases, fishing effort) and the consequences for marine ecosystems. As a result of these uncertainties and intrinsic differences in the structure and parameterisation of models, users are faced with considerable challenges associated with making appropriate choices on which models to use. 
We suggest research directions required to address these uncertainties, and caution against overconfident predictions. Understanding the full impact of uncertainty makes it clear that full comprehension and robust certainty about the systems themselves are not feasible. A key research direction is the development of management systems that are robust to this unavoidable uncertainty.

  12. Hierarchical multi-scale approach to validation and uncertainty quantification of hyper-spectral image modeling

    NASA Astrophysics Data System (ADS)

    Engel, Dave W.; Reichardt, Thomas A.; Kulp, Thomas J.; Graff, David L.; Thompson, Sandra E.

    2016-05-01

    Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.

  13. Current models of the intensely ionizing particle environment in space

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.

    1988-01-01

    The Cosmic Ray Effects on MicroElectronics (CREME) model that is currently in use to estimate single event effect rates in spacecraft is described. The CREME model provides a description of the radiation environment in interplanetary space near the orbit of the earth that contains no major deficiencies. The accuracy of the galactic cosmic ray model is limited by the uncertainties in solar modulation. The model for solar energetic particles could be improved by making use of all the data that has been collected on solar energetic particle events. There remain major uncertainties about the environment within the earth's magnetosphere, because of the uncertainties over the charge states of the heavy ions in the anomalous component and solar flares, and because of trapped heavy ions. The present CREME model is valid only at 1 AU, but it could be extended to other parts of the heliosphere. There is considerable data on the radiation environment from 0.2 to 35 AU in the ecliptic plane. This data could be used to extend the CREME model.

  14. Leader-Follower Formation Control of UUVs with Model Uncertainties, Current Disturbances, and Unstable Communication

    PubMed Central

    Yan, Zheping; Xu, Da; Chen, Tao; Zhang, Wei; Liu, Yibo

    2018-01-01

Unmanned underwater vehicles (UUVs) have recently seen rapid development as mobile sensor networks for investigation, survey, and exploration of the underwater environment. The goal of this paper is to develop a practical and efficient formation control method to improve the work efficiency of multi-UUV sensor networks. Distributed leader-follower formation controllers are designed based on state feedback and a consensus algorithm. Considering that each vehicle is subject to model uncertainties and current disturbances, a second-order integral UUV model with a nonlinear function is established using the state feedback linearization method under current disturbances. For unstable communication among UUVs, communication failure and acoustic link noise interference are considered. Two-layer random switching communication topologies are proposed to solve the problem of communication failure. For acoustic link noise interference, accurate representation of valid communication information and noise stripping when designing controllers are necessary. Effective communication topology weights are designed to represent the validity of communication information corrupted by noise. Utilizing state feedback and noise stripping, sufficient conditions for designing formation controllers are proposed to ensure the UUV formation achieves consensus under model uncertainties, current disturbances, and unstable communication. The stability of the formation controllers is proven by the Lyapunov-Razumikhin theorem, and their validity is verified by simulation results. PMID:29473919
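The steady-state effect of a constant current disturbance on a state-feedback formation law can be seen in a minimal one-dimensional leader-follower simulation (a toy double integrator, not the paper's UUV model or controller): the position error settles at current/kp, which is why the disturbance must be estimated or compensated rather than ignored:

```python
def simulate(kp=4.0, kd=4.0, current=0.3, offset=-5.0, T=30.0, dt=0.01):
    """Leader at constant velocity; follower tracks leader + offset with
    PD state feedback while a constant 'current' enters its dynamics."""
    xl, vl = 0.0, 1.0        # leader position and velocity
    xf, vf = -8.0, 0.0       # follower initial state
    for _ in range(int(T / dt)):
        e = xf - (xl + offset)           # formation position error
        edot = vf - vl
        u = -kp * e - kd * edot          # state-feedback formation law
        vf += (u + current) * dt         # disturbance enters unmodelled
        xf += vf * dt
        xl += vl * dt
    return e

e_final = simulate()
# Residual error ~= current / kp: the constant disturbance leaves a
# steady-state offset that pure proportional-derivative feedback cannot remove.
```

Raising `kp` shrinks the offset but never removes it; disturbance estimation (or integral action) is needed for zero steady-state formation error.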

  15. Space Radiation Cancer Risk Projections and Uncertainties - 2010

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Kim, Myung-Hee Y.; Chappell, Lori J.

    2011-01-01

    Uncertainties in estimating health risks from galactic cosmic rays greatly limit space mission lengths and potential risk mitigation evaluations. NASA limits astronaut exposures to a 3% risk of exposure-induced death and protects against uncertainties using an assessment of 95% confidence intervals in the projection model. Revisions to this model for lifetime cancer risks from space radiation and new estimates of model uncertainties are described here. We review models of space environments and transport code predictions of organ exposures, and characterize uncertainties in these descriptions. We summarize recent analysis of low linear energy transfer radio-epidemiology data, including revision to Japanese A-bomb survivor dosimetry, longer follow-up of exposed cohorts, and reassessments of dose and dose-rate reduction effectiveness factors. We compare these projections and uncertainties with earlier estimates. Current understanding of radiation quality effects and recent data on factors of relative biological effectiveness and particle track structure are reviewed. Recent radiobiology experiment results provide new information on solid cancer and leukemia risks from heavy ions. We also consider deviations from the paradigm of linearity at low doses of heavy ions motivated by non-targeted effects models. New findings and knowledge are used to revise the NASA risk projection model for space radiation cancer risks.

  16. Predicting long-range transport: a systematic evaluation of two multimedia transport models.

    PubMed

    Bennett, D H; Scheringer, M; McKone, T E; Hungerbühler, K

    2001-03-15

    The United Nations Environment Program has recently developed criteria to identify and restrict chemicals with a potential for persistence and long-range transport (persistent organic pollutants or POPs). There are many stakeholders involved, and the issues are not only scientific but also include social, economic, and political factors. This work focuses on one aspect of the POPs debate, the criteria for determining the potential for long-range transport (LRT). Our goal is to determine if current models are reliable enough to support decisions that classify a chemical based on the LRT potential. We examine the robustness of two multimedia fate models for determining the relative ranking and absolute spatial range of various chemicals in the environment. We also consider the effect of parameter uncertainties and the model uncertainty associated with the selection of an algorithm for gas-particle partitioning on the model results. Given the same chemical properties, both models give virtually the same ranking. However, when chemical parameter uncertainties and model uncertainties such as particle partitioning are considered, the spatial range distributions obtained for the individual chemicals overlap, preventing a distinct rank order. The absolute values obtained for the predicted spatial range or travel distance differ significantly between the two models for the uncertainties evaluated. We find that to evaluate a chemical when large and unresolved uncertainties exist, it is more informative to use two or more models and include multiple types of uncertainty. Model differences and uncertainties must be explicitly confronted to determine how the limitations of scientific knowledge impact predictions in the decision-making process.

  17. Robustness of Feedback Systems with Several Modelling Errors

    DTIC Science & Technology

    1990-06-01

    ...feedback systems with several sources of modelling uncertainty. We assume that each source of uncertainty is modelled as a stable unstructured...

  18. Uncertainty and sensitivity analysis for photovoltaic system modeling.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Clifford W.; Pohl, Andrew Phillip; Jordan, Dirk

    2013-12-01

    We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprising a single module using either crystalline silicon or CdTe cells, and located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct, and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice among these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of upwards of 5% of daily energy, which translates directly to a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to uncertainty arising from each model. We found the residuals arising from the POA irradiance and the effective irradiance models to be the dominant contributors to residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.
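    The propagation scheme described above can be sketched as a bootstrap over empirical residuals. The model chain, coefficients, and residual spreads below are invented stand-ins for illustration, not the models from the report:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical empirical residuals for three model stages; in the study
# these distributions come from comparing each model to measurements.
poa_resid = rng.normal(0.0, 8.0, size=500)    # plane-of-array irradiance, W/m^2
eff_resid = rng.normal(0.0, 5.0, size=500)    # effective irradiance, W/m^2
temp_resid = rng.normal(0.0, 1.5, size=500)   # cell temperature, deg C

def pv_chain(ghi, r_poa, r_eff, r_temp):
    """Toy model chain: each stage adds a residual drawn from its
    empirical distribution, then feeds the next stage."""
    poa = 1.1 * ghi + r_poa
    eff = 0.95 * poa + r_eff
    t_cell = 25.0 + 0.03 * eff + r_temp
    return eff * 0.2 * (1.0 - 0.004 * (t_cell - 25.0))  # DC power, W

power = np.array([
    pv_chain(800.0, rng.choice(poa_resid), rng.choice(eff_resid),
             rng.choice(temp_resid))
    for _ in range(2000)
])
print(power.mean(), power.std())  # empirical distribution of system output
```

    Resampling measured residuals rather than assuming Gaussian errors preserves whatever skew or heavy tails the real residual distributions carry.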

  19. Uncertainty in BMP evaluation and optimization for watershed management

    NASA Astrophysics Data System (ADS)

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    Use of computer simulation models has increased substantially to make watershed management decisions and to develop strategies for water quality improvements. These models are often used to evaluate potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models in optimizing selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One of the limitations of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. measured precipitation, streamflow, sediment, nutrient and pesticide losses, and land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management. We have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We have used a watershed model (Soil and Water Assessment Tool or SWAT) to simulate the hydrology and water quality in a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent various BMPs in the watershed needed to improve water quality. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT-simulated crop yields. Considerable uncertainties in the net cost and the water quality improvements resulted from uncertainties in land use, climate change, and model parameter values.

  20. Improved representation of situational awareness within a dismounted small combat unit constructive simulation

    NASA Astrophysics Data System (ADS)

    Lee, K. David; Colony, Mike

    2011-06-01

    Modeling and simulation has been established as a cost-effective means of supporting the development of requirements, exploring doctrinal alternatives, assessing system performance, and performing design trade-off analysis. The Army's constructive simulation for the evaluation of equipment effectiveness in small combat unit operations is currently limited to representation of situation awareness without inclusion of the many uncertainties associated with real-world combat environments. The goal of this research is to provide an ability to model situation awareness and decision process uncertainties in order to improve evaluation of the impact of battlefield equipment on ground soldier and small combat unit decision processes. Our Army Probabilistic Inference and Decision Engine (Army-PRIDE) system provides this required uncertainty modeling through the application of two critical techniques that allow Bayesian network technology to be applied to real-time applications: an Object-Oriented Bayesian Network methodology and an Object-Oriented Inference technique. In this research, we implement decision process and situation awareness models for a reference scenario using Army-PRIDE and demonstrate its ability to model a variety of uncertainty elements, including: confidence of source, information completeness, and information loss. We also demonstrate that Army-PRIDE improves the realism of the current constructive simulation's decision processes through Monte Carlo simulation.

  1. Surface Nitrification: A Major Uncertainty in Marine N2O Emissions

    NASA Technical Reports Server (NTRS)

    Zamora, Lauren M.; Oschlies, Andreas

    2014-01-01

    The ocean is responsible for up to a third of total global nitrous oxide (N2O) emissions, but uncertainties in emission rates of this potent greenhouse gas are high (approaching 100%). Here we use a marine biogeochemical model to assess six major uncertainties in estimates of N2O production, thereby providing guidance in how future studies may most effectively reduce uncertainties in current and future marine N2O emissions. Potential surface N2O production from nitrification causes the largest uncertainty in N2O emissions (estimated up to approximately 1.6 Tg N yr^-1, or 48% of modeled values), followed by the unknown oxygen concentration at which N2O production switches to N2O consumption (0.8 Tg N yr^-1, or 24% of modeled values). Other uncertainties are minor, cumulatively changing regional emissions by less than 15%. If production of N2O by surface nitrification could be ruled out in future studies, uncertainties in marine N2O emissions would be halved.

  2. A multi-model assessment of terrestrial biosphere model data needs

    NASA Astrophysics Data System (ADS)

    Gardella, A.; Cowdery, E.; De Kauwe, M. G.; Desai, A. R.; Duveneck, M.; Fer, I.; Fisher, R.; Knox, R. G.; Kooper, R.; LeBauer, D.; McCabe, T.; Minunno, F.; Raiho, A.; Serbin, S.; Shiklomanov, A. N.; Thomas, A.; Walker, A.; Dietze, M.

    2017-12-01

    Terrestrial biosphere models provide us with the means to simulate the impacts of climate change and their uncertainties. Going beyond direct observation and experimentation, models synthesize our current understanding of ecosystem processes and can give us insight on data needed to constrain model parameters. In previous work, we leveraged the Predictive Ecosystem Analyzer (PEcAn) to assess the contribution of different parameters to the uncertainty of Ecosystem Demography model v2 (ED) outputs across various North American biomes (Dietze et al., JGR-G, 2014). While this analysis identified key research priorities, the extent to which these priorities were model- and/or biome-specific was unclear. Furthermore, because the analysis only studied one model, we were unable to comment on the effect of variability in model structure on overall predictive uncertainty. Here, we expand this analysis to all biomes globally and a wide sample of models that vary in complexity: BioCro, CABLE, CLM, DALEC, ED2, FATES, G'DAY, JULES, LANDIS, LINKAGES, LPJ-GUESS, MAESPA, PRELES, SDGVM, SIPNET, and TEM. Prior to performing uncertainty analyses, model parameter uncertainties were assessed by assimilating all available trait data from the combination of the BETYdb and TRY trait databases, using an updated multivariate version of PEcAn's Hierarchical Bayesian meta-analysis. Next, sensitivity analyses were performed for all models across a range of sites globally to assess sensitivities for a range of different outputs (GPP, ET, SH, Ra, NPP, Rh, NEE, LAI) at multiple time scales from the sub-annual to the decadal. Finally, parameter uncertainties and model sensitivities were combined to evaluate the fractional contribution of each parameter to the predictive uncertainty for a specific variable at a specific site and timescale.
Facilitated by PEcAn's automated workflows, this analysis represents the broadest assessment of the sensitivities and uncertainties in terrestrial models to date, and provides a comprehensive roadmap for constraining model uncertainties through model development and data collection.
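    The final step described above, combining parameter uncertainties with model sensitivities into fractional contributions, can be illustrated with a first-order variance decomposition. The toy model, parameter values, and variances here are hypothetical, not PEcAn outputs:

```python
import numpy as np

def model(p):
    # hypothetical toy model output (e.g., NPP) driven by three traits
    return 2.0 * p[0] + 0.5 * p[1] ** 2 + 0.1 * p[2]

p0 = np.array([1.0, 2.0, 3.0])       # parameter medians
var_p = np.array([0.04, 0.25, 1.0])  # parameter variances from a meta-analysis

# local sensitivities by central finite differences
eps = 1e-4
sens = np.array([
    (model(p0 + eps * e) - model(p0 - eps * e)) / (2 * eps)
    for e in np.eye(3)
])

# first-order contribution of each parameter to predictive variance:
# Var_i ~ (d f / d p_i)^2 * Var(p_i)
contrib = sens ** 2 * var_p
frac = contrib / contrib.sum()
print(frac)  # fractional contribution of each parameter
```

    Each parameter's share scales with both its sensitivity and its uncertainty, which is why a moderately uncertain but highly sensitive trait can still dominate predictive uncertainty.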

  3. Hierarchical Multi-Scale Approach To Validation and Uncertainty Quantification of Hyper-Spectral Image Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel, David W.; Reichardt, Thomas A.; Kulp, Thomas J.

    Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.

  4. Reliability of Current Biokinetic and Dosimetric Models for Radionuclides: A Pilot Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leggett, Richard Wayne; Eckerman, Keith F; Meck, Robert A.

    2008-10-01

    This report describes the results of a pilot study of the reliability of the biokinetic and dosimetric models currently used by the U.S. Nuclear Regulatory Commission (NRC) as predictors of dose per unit internal or external exposure to radionuclides. The study examines the feasibility of critically evaluating the accuracy of these models for a comprehensive set of radionuclides of concern to the NRC. Each critical evaluation would include: identification of discrepancies between the models and current databases; characterization of uncertainties in model predictions of dose per unit intake or unit external exposure; characterization of variability in dose per unit intake or unit external exposure; and evaluation of prospects for development of more accurate models. Uncertainty refers here to the level of knowledge of a central value for a population, and variability refers to quantitative differences between different members of a population. This pilot study provides a critical assessment of models for selected radionuclides representing different levels of knowledge of dose per unit exposure. The main conclusions of this study are as follows: (1) To optimize the use of available NRC resources, the full study should focus on radionuclides most frequently encountered in the workplace or environment. A list of 50 radionuclides is proposed. (2) The reliability of a dose coefficient for inhalation or ingestion of a radionuclide (i.e., an estimate of dose per unit intake) may depend strongly on the specific application. Multiple characterizations of the uncertainty in a dose coefficient for inhalation or ingestion of a radionuclide may be needed for different forms of the radionuclide and different levels of information about that form available to the dose analyst. (3) A meaningful characterization of variability in dose per unit intake of a radionuclide requires detailed information on the biokinetics of the radionuclide and hence is not feasible for many infrequently studied radionuclides. (4) The biokinetics of a radionuclide in the human body typically represents the greatest source of uncertainty or variability in dose per unit intake. (5) Characterization of uncertainty in dose per unit exposure is generally a more straightforward problem for external exposure than for intake of a radionuclide. (6) For many radionuclides the most important outcome of a large-scale critical evaluation of databases and biokinetic models for radionuclides is expected to be the improvement of current models. Many of the current models do not fully or accurately reflect available radiobiological or physiological information, either because the models are outdated or because they were based on selective or uncritical use of data or inadequate model structures. In such cases the models should be replaced with physiologically realistic models that incorporate a wider spectrum of information.

  5. Scientific Uncertainties in Climate Change Detection and Attribution Studies

    NASA Astrophysics Data System (ADS)

    Santer, B. D.

    2017-12-01

    It has been claimed that the treatment and discussion of key uncertainties in climate science is "confined to hushed sidebar conversations at scientific conferences". This claim is demonstrably incorrect. Climate change detection and attribution studies routinely consider key uncertainties in observational climate data, as well as uncertainties in model-based estimates of natural variability and the "fingerprints" in response to different external forcings. The goal is to determine whether such uncertainties preclude robust identification of a human-caused climate change fingerprint. It is also routine to investigate the impact of applying different fingerprint identification strategies, and to assess how detection and attribution results are impacted by differences in the ability of current models to capture important aspects of present-day climate. The exploration of the uncertainties mentioned above will be illustrated using examples from detection and attribution studies with atmospheric temperature and moisture.

  6. The Source Inversion Validation (SIV) Initiative: A Collaborative Study on Uncertainty Quantification in Earthquake Source Inversions

    NASA Astrophysics Data System (ADS)

    Mai, P. M.; Schorlemmer, D.; Page, M.

    2012-04-01

    Earthquake source inversions image the spatio-temporal rupture evolution on one or more fault planes using seismic and/or geodetic data. Such studies are critically important for earthquake seismology in general, and for advancing seismic hazard analysis in particular, as they reveal earthquake source complexity and help (i) to investigate earthquake mechanics; (ii) to develop spontaneous dynamic rupture models; (iii) to build models for generating rupture realizations for ground-motion simulations. In applications (i)-(iii), the underlying finite-fault source models are regarded as "data" (input information), but their uncertainties are essentially unknown. After all, source models are obtained from solving an inherently ill-posed inverse problem to which many a priori assumptions and uncertain observations are applied. The Source Inversion Validation (SIV) project is a collaborative effort to better understand the variability between rupture models for a single earthquake (as manifested in the finite-source rupture model database) and to develop robust uncertainty quantification for earthquake source inversions. The SIV project highlights the need to develop a long-standing and rigorous testing platform to examine the current state-of-the-art in earthquake source inversion, and to develop and test novel source inversion approaches. We will review the current status of the SIV project, and report the findings and conclusions of the recent workshops. We will briefly discuss several source-inversion methods, how they treat uncertainties in data, and assess the posterior model uncertainty. Case studies include initial forward-modeling tests on Green's function calculations, and inversion results for synthetic data from a spontaneous dynamic crack-like strike-slip earthquake on a steeply dipping fault, embedded in a layered crustal velocity-density structure.

  7. Probabilistic inversion of expert assessments to inform projections about Antarctic ice sheet responses.

    PubMed

    Fuller, Robert William; Wong, Tony E; Keller, Klaus

    2017-01-01

    The response of the Antarctic ice sheet (AIS) to changing global temperatures is a key component of sea-level projections. Current projections of the AIS contribution to sea-level changes are deeply uncertain. This deep uncertainty stems, in part, from (i) the inability of current models to fully resolve key processes and scales, (ii) the relatively sparse available data, and (iii) divergent expert assessments. One promising approach to characterizing the deep uncertainty stemming from divergent expert assessments is to combine expert assessments, observations, and simple models by coupling probabilistic inversion and Bayesian inversion. Here, we present a proof-of-concept study that uses probabilistic inversion to fuse a simple AIS model and diverse expert assessments. We demonstrate the ability of probabilistic inversion to infer joint prior probability distributions of model parameters that are consistent with expert assessments. We then confront these inferred expert priors with instrumental and paleoclimatic observational data in a Bayesian inversion. These additional constraints yield tighter hindcasts and projections. We use this approach to quantify how the deep uncertainty surrounding expert assessments affects the joint probability distributions of model parameters and future projections.
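    A minimal sketch of the probabilistic-inversion step described above: prior parameter samples are reweighted so that the pushed-forward model outputs match an expert-assessed distribution. The one-parameter stand-in model and the expert target used here are invented for illustration, not the study's AIS model:

```python
import numpy as np

rng = np.random.default_rng(2)

# candidate parameter samples from a broad prior
theta = rng.uniform(0.0, 2.0, size=50_000)

def ais_model(theta):
    # hypothetical one-parameter stand-in for an AIS sea-level contribution (m)
    return theta ** 2

y = ais_model(theta)

def expert_pdf(v):
    # assumed expert assessment: output ~ Normal(1.0, 0.3), unnormalized
    return np.exp(-0.5 * ((v - 1.0) / 0.3) ** 2)

# estimate the current pushed-forward density of the outputs with a histogram
hist, edges = np.histogram(y, bins=100, density=True)
idx = np.clip(np.digitize(y, edges) - 1, 0, len(hist) - 1)

# importance weights reshape the prior so outputs follow the expert density
weights = expert_pdf(y) / np.maximum(hist[idx], 1e-12)
weights /= weights.sum()

resampled = rng.choice(y, size=20_000, p=weights)
print(resampled.mean())  # close to the expert's central value of 1.0
```

    The reweighted parameter samples then play the role of the inferred expert prior, which the study subsequently confronts with observational data in a Bayesian inversion.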

  8. Drought Persistence Errors in Global Climate Models

    NASA Astrophysics Data System (ADS)

    Moon, H.; Gudmundsson, L.; Seneviratne, S. I.

    2018-04-01

    The persistence of drought events largely determines the severity of socioeconomic and ecological impacts, but the capability of current global climate models (GCMs) to simulate such events is subject to large uncertainties. In this study, the representation of drought persistence in GCMs is assessed by comparing state-of-the-art GCM simulations to observation-based data sets. To do so, we consider dry-to-dry transition probabilities at monthly and annual scales as estimates of drought persistence, where a dry status is defined as a negative precipitation anomaly. Though there is a substantial spread in the drought persistence bias, most of the simulations show systematic underestimation of drought persistence at the global scale. Subsequently, we analyzed the degree to which (i) inaccurate observations, (ii) differences among models, (iii) internal climate variability, and (iv) uncertainty of the employed statistical methods contribute to the spread in drought persistence errors, using an analysis of variance approach. The results show that at the monthly scale, model uncertainty and observational uncertainty dominate, while the contribution from internal variability is small in most cases. At the annual scale, the spread of the drought persistence error is dominated by the statistical estimation error of drought persistence, indicating that the partitioning of the error is impaired by the limited number of considered time steps. These findings reveal systematic errors in the representation of drought persistence in current GCMs and suggest directions for further model improvement.
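    The persistence estimate described above reduces to a conditional frequency. A minimal sketch, with "dry" taken as a negative anomaly about the series mean:

```python
import numpy as np

def dry_to_dry_probability(precip):
    """Estimate drought persistence as P(dry_t | dry_{t-1}), where a month
    is 'dry' if its precipitation anomaly (relative to the mean) is negative."""
    anomaly = np.asarray(precip, dtype=float) - np.mean(precip)
    dry = anomaly < 0
    prev, curr = dry[:-1], dry[1:]
    if prev.sum() == 0:
        return float("nan")  # no dry months to condition on
    return np.logical_and(prev, curr).sum() / prev.sum()

# two of the three transitions out of a dry month land in another dry month
p = dry_to_dry_probability([1, 1, 1, 9, 9, 9])
print(p)  # 2/3
```

    The same estimator applied to annual means, with far fewer time steps, carries a much larger sampling error, which is the statistical estimation issue the abstract identifies at the annual scale.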

  9. Assessment of uncertainties in radiation-induced cancer risk predictions at clinically relevant doses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, J.; Moteabbed, M.; Paganetti, H., E-mail: hpaganetti@mgh.harvard.edu

    2015-01-15

    Purpose: Theoretical dose–response models offer the possibility to assess second cancer induction risks after external beam therapy. The parameters used in these models are determined with limited data from epidemiological studies. Risk estimations are thus associated with considerable uncertainties. This study aims to illustrate these uncertainties when predicting the risk for organ-specific second cancers in the primary radiation field, using selected treatment plans for brain cancer patients as examples. Methods: A widely used risk model was considered in this study. The uncertainties of the model parameters were estimated with reported data of second cancer incidences for various organs. Standard error propagation was then applied to assess the uncertainty in the risk model. Next, second cancer risks of five pediatric patients treated for cancer in the head and neck regions were calculated. For each case, treatment plans for proton and photon therapy were designed to estimate the uncertainties (a) in the lifetime attributable risk (LAR) for a given treatment modality and (b) when comparing risks of two different treatment modalities. Results: Uncertainties in excess of 100% of the risk were found for almost all organs considered. When applied to treatment plans, the calculated LAR values have uncertainties of the same magnitude. A comparison between cancer risks of different treatment modalities, however, does allow statistically significant conclusions. In the studied cases, the patient-averaged LAR ratio of proton and photon treatments was 0.35, 0.56, and 0.59 for brain carcinoma, brain sarcoma, and bone sarcoma, respectively. Their corresponding uncertainties were estimated to be potentially below 5%, depending on uncertainties in dosimetry. Conclusions: The uncertainty in the dose–response curve in cancer risk models makes it currently impractical to predict the risk for an individual external beam treatment. On the other hand, the ratio of absolute risks between two modalities is less sensitive to the uncertainties in the risk model and can provide statistically significant estimates.
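    The closing observation, that risk ratios between modalities are far less sensitive to risk-model uncertainty than absolute risks, can be sketched with a shared multiplicative risk factor. The linear risk model, organ doses, and uncertainty level below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear risk model LAR = k * dose, where the slope k is
# highly uncertain but identical for both treatment modalities.
k = rng.lognormal(mean=0.0, sigma=0.7, size=100_000)  # ~80% relative spread
dose_proton, dose_photon = 3.5, 10.0                  # illustrative organ doses

lar_proton = k * dose_proton
lar_photon = k * dose_photon

ratio = lar_proton / lar_photon  # the shared slope k cancels in the ratio

print(np.std(lar_proton) / np.mean(lar_proton))  # large relative uncertainty
print(np.std(ratio))                             # essentially zero
```

    Real risk models share only part of their uncertainty between modalities, so the cancellation is partial rather than exact, but this mechanism is what drives the small uncertainties on the reported LAR ratios.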

  10. Analysis of air quality management with emphasis on transportation sources

    NASA Technical Reports Server (NTRS)

    English, T. D.; Divita, E.; Lees, L.

    1980-01-01

    The current environment and practices of air quality management were examined for three regions: Denver, Phoenix, and the South Coast Air Basin of California. These regions were chosen because the majority of their air pollution emissions are related to mobile sources. The impact of auto exhaust on the air quality management process is characterized and assessed. An examination of the uncertainties in air pollutant measurements, emission inventories, meteorological parameters, atmospheric chemistry, and air quality simulation models is performed. The implications of these uncertainties for current air quality management practices are discussed. A set of corrective actions is recommended to reduce these uncertainties.

  11. Modeling approaches for characterizing and evaluating environmental exposure to engineered nanomaterials in support of risk-based decision making.

    PubMed

    Hendren, Christine Ogilvie; Lowry, Michael; Grieger, Khara D; Money, Eric S; Johnston, John M; Wiesner, Mark R; Beaulieu, Stephen M

    2013-02-05

    As the use of engineered nanomaterials becomes more prevalent, the likelihood of unintended exposure to these materials also increases. Given the current scarcity of experimental data regarding fate, transport, and bioavailability, determining potential environmental exposure to these materials requires an in-depth analysis of modeling techniques that can be used in both the near- and long-term. Here, we provide a critical review of traditional and emerging exposure modeling approaches to highlight the challenges that scientists and decision-makers face when developing environmental exposure and risk assessments for nanomaterials. We find that accounting for nanospecific properties, overcoming data gaps, realizing model limitations, and handling uncertainty are key to developing informative and reliable environmental exposure and risk assessments for engineered nanomaterials. We find methods suited to recognizing and addressing significant uncertainty to be most appropriate for near-term environmental exposure modeling, given the current state of information and the current insufficiency of established deterministic models to address environmental exposure to engineered nanomaterials.

  12. Diagnostic uncertainty, guilt, mood, and disability in back pain.

    PubMed

    Serbic, Danijela; Pincus, Tamar; Fife-Schaw, Chris; Dawson, Helen

    2016-01-01

    In the majority of patients a definitive cause for low back pain (LBP) cannot be established, and many patients report feeling uncertain about their diagnosis, accompanied by guilt. The relationship between diagnostic uncertainty, guilt, mood, and disability is currently unknown. This study tested 3 theoretical models to explore possible pathways between these factors. In Model 1, diagnostic uncertainty was hypothesized to correlate with pain-related guilt, which in turn would positively correlate with depression, anxiety and disability. Two alternative models were tested: (a) a path from depression and anxiety to guilt, from guilt to diagnostic uncertainty, and finally to disability; (b) a model in which depression and anxiety, and independently, diagnostic uncertainty, were associated with guilt, which in turn was associated with disability. Structural equation modeling was employed on data from 413 participants with chronic LBP. All 3 models showed a reasonable-to-good fit with the data, with the 2 alternative models providing marginally better fit indices. Guilt, and especially social guilt, was associated with disability in all 3 models. Diagnostic uncertainty was associated with guilt, but only moderately. Low mood was also associated with guilt. Two newly defined factors, pain-related guilt and diagnostic uncertainty, appear to be linked to disability and mood in people with LBP. The causal path of these links cannot be established in this cross-sectional study. However, pain-related guilt especially appears to be important, and future research should examine whether interventions directly targeting guilt improve outcomes.

  13. Uncertainty Modeling for Robustness Analysis of Control Upset Prevention and Recovery Systems

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Khong, Thuan H.; Shin, Jong-Yeob; Kwatny, Harry; Chang, Bor-Chin; Balas, Gary J.

    2005-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems (developed for failure detection, identification, and reconfiguration, as well as upset recovery) need to be evaluated over broad regions of the flight envelope and under extreme flight conditions, and should include various sources of uncertainty. However, formulation of linear fractional transformation (LFT) models for representing system uncertainty can be very difficult for complex parameter-dependent systems. This paper describes a preliminary LFT modeling software tool which uses a matrix-based computational approach that can be directly applied to parametric uncertainty problems involving multivariate matrix polynomial dependencies. Several examples are presented (including an F-16 at an extreme flight condition, a missile model, and a generic example with numerous cross-product terms), and comparisons are given with other LFT modeling tools that are currently available. The LFT modeling method and preliminary software tool presented in this paper are shown to compare favorably with these methods.
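    For reference, the upper LFT used in such robustness analyses takes the standard textbook form (this is the generic definition, not the paper's specific parameterization). With the nominal interconnection partitioned as a block matrix, the uncertain closed loop is

```latex
M = \begin{pmatrix} M_{11} & M_{12} \\ M_{21} & M_{22} \end{pmatrix},
\qquad
F_u(M, \Delta) = M_{22} + M_{21}\,\Delta\,(I - M_{11}\,\Delta)^{-1} M_{12},
```

    which is well posed whenever (I - M_11 Δ) is invertible; robustness analysis then characterizes the set of uncertainty blocks Δ for which the loop remains stable.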

  14. Bridging groundwater models and decision support with a Bayesian network

    USGS Publications Warehouse

    Fienen, Michael N.; Masterson, John P.; Plant, Nathaniel G.; Gutierrez, Benjamin T.; Thieler, E. Robert

    2013-01-01

    Resource managers need to make decisions to plan for future environmental conditions, particularly sea level rise, in the face of substantial uncertainty. Many interacting processes factor in to the decisions they face. Advances in process models and the quantification of uncertainty have made models a valuable tool for this purpose. Long simulation runtimes and, often, numerical instability make linking process models impractical in many cases. A method for emulating the important connections between model input and forecasts, while propagating uncertainty, has the potential to provide a bridge between complicated numerical process models and the efficiency and stability needed for decision making. We explore this using a Bayesian network (BN) to emulate a groundwater flow model. We expand on previous approaches to validating a BN by calculating forecasting skill using cross validation of a groundwater model of Assateague Island in Virginia and Maryland, USA. This BN emulation was shown to capture the important groundwater-flow characteristics and uncertainty of the groundwater system because of its connection to island morphology and sea level. Forecast power metrics associated with the validation of multiple alternative BN designs guided the selection of an optimal level of BN complexity. Assateague Island is an ideal test case for exploring a forecasting tool based on current conditions because the unique hydrogeomorphological variability of the island includes a range of settings indicative of past, current, and future conditions. The resulting BN is a valuable tool for exploring the response of groundwater conditions to sea level rise in decision support.

  15. Equifinality and process-based modelling

    NASA Astrophysics Data System (ADS)

    Khatami, S.; Peel, M. C.; Peterson, T. J.; Western, A. W.

    2017-12-01

    Equifinality is understood as one of the fundamental difficulties in the study of open complex systems, including catchment hydrology. A review of the hydrologic literature reveals that the term equifinality has been widely used, but in many cases inconsistently and without coherent recognition of the various facets of equifinality, which can lead to ambiguity and even methodological fallacies. Therefore, in this study we first characterise the term equifinality within the context of hydrological modelling by reviewing the genesis of the concept of equifinality and then presenting a theoretical framework. During the past decades, equifinality has mainly been studied as a subset of aleatory (arising due to randomness) uncertainty and for the assessment of model parameter uncertainty. Although the connection between parameter uncertainty and equifinality is undeniable, we argue there is more to equifinality than just aleatory parameter uncertainty. That is, the importance of equifinality and epistemic uncertainty (arising due to lack of knowledge) and their implications is overlooked in our current practice of model evaluation. Equifinality and epistemic uncertainty in studying, modelling, and evaluating hydrologic processes are treated as if they can be simply discussed in (or often reduced to) probabilistic terms (as for aleatory uncertainty). The deficiencies of this approach to conceptual rainfall-runoff modelling are demonstrated for selected Australian catchments by examination of parameter and internal flux distributions and interactions within SIMHYD. On this basis, we present a new approach that expands the equifinality concept beyond model parameters to inform epistemic uncertainty. The new approach potentially facilitates the identification and development of more physically plausible models and model evaluation schemes, particularly within the multiple working hypotheses framework, and is generalisable to other fields of environmental modelling as well.
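    The core notion, distinct parameter sets reproducing the same objective value while internal fluxes differ, can be demonstrated with a toy bucket model (invented here; SIMHYD itself is far more elaborate):

```python
def bucket_runoff(precip, capacity, frac):
    """Toy two-parameter model: the store spills above `capacity`,
    and a fraction `frac` of each spill becomes runoff."""
    store, runoff = 0.0, []
    for p in precip:
        store += p
        excess = max(0.0, store - capacity)
        store -= excess
        runoff.append(frac * excess)
    return runoff

precip = [10.0] * 5

# Two distinct parameter sets that are equifinal with respect to total runoff
q_a = bucket_runoff(precip, capacity=0.0, frac=0.5)
q_b = bucket_runoff(precip, capacity=10.0, frac=0.625)

print(sum(q_a), sum(q_b))  # 25.0 25.0 -- identical objective value
print(q_a[0], q_b[0])      # 5.0 0.0  -- but different internal behaviour
```

    Calibrating only to the aggregate cannot distinguish the two hypotheses; examining internal flux distributions, as the abstract advocates, can.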

  16. IMPROVING PARTICULATE MATTER SOURCE APPORTIONMENT FOR HEALTH STUDIES: A TRAINED RECEPTOR MODELING APPROACH WITH SENSITIVITY, UNCERTAINTY AND SPATIAL ANALYSES

    EPA Science Inventory

    An approach for conducting PM source apportionment will be developed, tested, and applied that directly addresses limitations in current SA methods, in particular variability, biases, and intensive resource requirements. Uncertainties in SA results and sensitivities to SA inpu...

  17. Toward best practice framing of uncertainty in scientific publications: A review of Water Resources Research abstracts

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Helgeson, Casey; Elsawah, Sondoss; Jakeman, Anthony J.; Kummu, Matti

    2017-08-01

    Uncertainty is recognized as a key issue in water resources research, among other sciences. Discussions of uncertainty typically focus on tools and techniques applied within an analysis, e.g., uncertainty quantification and model validation. But uncertainty is also addressed outside the analysis, in writing scientific publications. The language that authors use conveys their perspective of the role of uncertainty when interpreting a claim—what we call here "framing" the uncertainty. This article promotes awareness of uncertainty framing in four ways. (1) It proposes a typology of eighteen uncertainty frames, addressing five questions about uncertainty. (2) It describes the context in which uncertainty framing occurs. This is an interdisciplinary topic, involving philosophy of science, science studies, linguistics, rhetoric, and argumentation. (3) We analyze the use of uncertainty frames in a sample of 177 abstracts from the Water Resources Research journal in 2015. This helped develop and tentatively verify the typology, and provides a snapshot of current practice. (4) We make provocative recommendations to achieve a more influential, dynamic science. Current practice in uncertainty framing might be described as carefully considered incremental science. In addition to uncertainty quantification and degree of belief (present in ~5% of abstracts), uncertainty is addressed by a combination of limiting scope, deferring to further work (~25%) and indicating evidence is sufficient (~40%)—or uncertainty is completely ignored (~8%). There is a need for public debate within our discipline to decide in what context different uncertainty frames are appropriate. Uncertainty framing cannot remain a hidden practice evaluated only by lone reviewers.

  18. The effect of alternative seismotectonic models on PSHA results - a sensitivity study for two sites in Israel

    NASA Astrophysics Data System (ADS)

    Avital, Matan; Kamai, Ronnie; Davis, Michael; Dor, Ory

    2018-02-01

    We present a full probabilistic seismic hazard analysis (PSHA) sensitivity analysis for two sites in southern Israel - one in the near field of a major fault system and one farther away. The PSHA analysis is conducted for alternative source representations, using alternative model parameters for the main seismic sources, such as slip rate and Mmax, among others. The analysis also considers the effect of the ground motion prediction equation (GMPE) on the hazard results. In this way, the two types of epistemic uncertainty - modelling uncertainty and parametric uncertainty - are treated and addressed. We quantify the uncertainty propagation by testing its influence on the final calculated hazard, such that the controlling knowledge gaps are identified and can be treated in future studies. We find that current practice in Israel, as represented by the current version of the building code, grossly underestimates the hazard, by approximately 40 % in short return periods (e.g. 10 % in 50 years) and by as much as 150 % in long return periods (e.g. 10^-5). The analysis shows that this underestimation is most probably due to a combination of factors, including source definitions as well as the GMPE used for analysis.
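
    The logic-tree treatment of epistemic uncertainty that underlies such a PSHA sensitivity study can be sketched in a few lines: alternative source/GMPE models form weighted branches, and the weighted combination of their hazard curves gives the mean hazard. All rates, decay constants, and weights below are invented for illustration and are not taken from the study.

```python
import numpy as np

# Logic-tree sketch: alternative seismotectonic/GMPE models are branches
# with weights; the weighted hazard curves give the mean hazard.
pga = np.linspace(0.1, 1.0, 10)  # peak ground acceleration, g

def hazard_curve(activity_rate, decay):
    # Toy annual exceedance rate, decaying exponentially with PGA
    return activity_rate * np.exp(-decay * pga)

branches = [
    (0.5, hazard_curve(0.02, 5.0)),  # source/GMPE combination A
    (0.3, hazard_curve(0.05, 6.0)),  # combination B
    (0.2, hazard_curve(0.01, 4.0)),  # combination C
]
weights = np.array([w for w, _ in branches])
curves = np.array([c for _, c in branches])

mean_hazard = weights @ curves  # weighted mean over epistemic branches
print(f"mean annual exceedance rate at 0.3 g: {mean_hazard[2]:.2e}")
```

    Varying the branch weights or the parameters of each branch and observing the change in `mean_hazard` is precisely a sensitivity analysis over the epistemic alternatives.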

  19. A Risk-Based Approach for Aerothermal/TPS Analysis and Testing

    NASA Technical Reports Server (NTRS)

    Wright, Michael J.; Grinstead, Jay H.; Bose, Deepak

    2007-01-01

    The current status of aerothermal and thermal protection system modeling for civilian entry missions is reviewed. For most such missions, the accuracy of our simulations is limited not by the tools and processes currently employed, but rather by reducible deficiencies in the underlying physical models. Improving the accuracy of and reducing the uncertainties in these models will enable a greater understanding of the system level impacts of a particular thermal protection system and of the system operation and risk over the operational life of the system. A strategic plan will be laid out by which key modeling deficiencies can be identified via mission-specific gap analysis. Once these gaps have been identified, the driving component uncertainties are determined via sensitivity analyses. A Monte-Carlo based methodology is presented for physics-based probabilistic uncertainty analysis of aerothermodynamics and thermal protection system material response modeling. These data are then used to advocate for and plan focused testing aimed at reducing key uncertainties. The results of these tests are used to validate or modify existing physical models. Concurrently, a testing methodology is outlined for thermal protection materials. The proposed approach is based on using the results of uncertainty/sensitivity analyses discussed above to tailor ground testing so as to best identify and quantify system performance and risk drivers. A key component of this testing is understanding the relationship between the test and flight environments. No existing ground test facility can simultaneously replicate all aspects of the flight environment, and therefore good models for traceability to flight are critical to ensure a low risk, high reliability thermal protection system design. Finally, the role of flight testing in the overall thermal protection system development strategy is discussed.

  20. Intergovernmental Panel on Climate Change (IPCC), Working Group 1, 1994: Modelling Results Relating Future Atmospheric CO2 Concentrations to Industrial Emissions (DB1009)

    DOE Data Explorer

    Enting, I. G.; Wigley, M. L.; Heimann, M.

    1995-01-01

    This database contains the results of various projections of the relation between future CO2 concentrations and future industrial emissions. These projections were contributed by groups from a number of countries as part of the scientific assessment for the report, "Radiative Forcing of Climate Change" (1994), issued by Working Group 1 of the Intergovernmental Panel on Climate Change. There were three types of calculations: (1) forward projections, calculating the atmospheric CO2 concentrations resulting from specified emissions scenarios; (2) inverse calculations, determining the emission rates that would be required to achieve stabilization of CO2 concentrations via specified pathways; (3) impulse response function calculations, required for determining Global Warming Potentials. The projections were extrapolations of global carbon cycle models from pre-industrial times (starting at 1765) to 2100 or 2200 A.D. There were two aspects to the exercise: (1) an assessment of the uncertainty due to uncertainties regarding the current carbon budget, and (2) an assessment of the uncertainties arising from differences between models. To separate these effects, a set of standard conditions was used to explore inter-model differences and then a series of sensitivity studies was used to explore the consequences of current uncertainties in the carbon cycle.

  1. PIV Uncertainty Methodologies for CFD Code Validation at the MIR Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sabharwall, Piyush; Skifton, Richard; Stoots, Carl

    2013-12-01

    Currently, computational fluid dynamics (CFD) is widely used in the nuclear thermal hydraulics field for design and safety analyses. To validate CFD codes, high quality multidimensional flow field data are essential. The Matched Index of Refraction (MIR) Flow Facility at Idaho National Laboratory has a unique capability to contribute to the development of validated CFD codes through the use of Particle Image Velocimetry (PIV). The significance of the MIR facility is that it permits non-intrusive velocity measurement techniques, such as PIV, through complex models without requiring probes and other instrumentation that disturb the flow. At the heart of any PIV calculation is the cross-correlation, which is used to estimate the displacement of particles in some small part of the image over the time span between two images. This image displacement is indicated by the location of the largest peak. In the MIR facility, uncertainty quantification is a challenging task due to the use of optical measurement techniques. Currently, this study is developing a reliable method to analyze the uncertainty and sensitivity of the measured data, together with a computer code to automate that analysis. The main objective of this study is to develop a well-established uncertainty quantification method for the MIR Flow Facility, which involves many complicated uncertainty factors. In this study, the uncertainty sources are resolved in depth by categorizing them into uncertainties from the MIR flow loop and from the PIV system (including particle motion, image distortion, and data processing). Then, each uncertainty source is mathematically modeled or adequately defined. Finally, this study will provide a method and procedure to quantify the experimental uncertainty in the MIR Flow Facility with sample test results.
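
    The cross-correlation at the heart of a PIV calculation can be illustrated with a minimal NumPy sketch: correlate two interrogation windows via the FFT and read the particle displacement off the location of the correlation peak. The synthetic particle field and window size below are invented for the example; a production PIV code would add sub-pixel peak fitting and outlier validation.

```python
import numpy as np

def piv_displacement(window_a, window_b):
    """Estimate the particle displacement between two interrogation windows
    via FFT-based circular cross-correlation; the peak location gives the shift."""
    a = window_a - window_a.mean()
    b = window_b - window_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrapped (circular) peak indices to signed displacements
    return tuple(int(p) if p <= s // 2 else int(p - s)
                 for p, s in zip(peak, corr.shape))

# Synthetic test: shift a random particle field by (3, -2) pixels
rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, shift=(3, -2), axis=(0, 1))
print(piv_displacement(img, shifted))  # → (3, -2)
```

    Uncertainty enters through how sharply the correlation peak stands above the noise floor, which is one of the PIV uncertainty sources the study categorizes.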

  2. Prediction and assimilation of surf-zone processes using a Bayesian network: Part I: Forward models

    USGS Publications Warehouse

    Plant, Nathaniel G.; Holland, K. Todd

    2011-01-01

    Prediction of coastal processes, including waves, currents, and sediment transport, can be obtained from a variety of detailed geophysical-process models with many simulations showing significant skill. This capability supports a wide range of research and applied efforts that can benefit from accurate numerical predictions. However, the predictions are only as accurate as the data used to drive the models and, given the large temporal and spatial variability of the surf zone, inaccuracies in data are unavoidable such that useful predictions require corresponding estimates of uncertainty. We demonstrate how a Bayesian-network model can be used to provide accurate predictions of wave-height evolution in the surf zone given very sparse and/or inaccurate boundary-condition data. The approach is based on a formal treatment of a data-assimilation problem that takes advantage of significant reduction of the dimensionality of the model system. We demonstrate that predictions of a detailed geophysical model of the wave evolution are reproduced accurately using a Bayesian approach. In this surf-zone application, forward prediction skill was 83%, and uncertainties in the model inputs were accurately transferred to uncertainty in output variables. We also demonstrate that if modeling uncertainties were not conveyed to the Bayesian network (i.e., perfect data or model were assumed), then overly optimistic prediction uncertainties were computed. More consistent predictions and uncertainties were obtained by including model-parameter errors as a source of input uncertainty. Improved predictions (skill of 90%) were achieved because the Bayesian network simultaneously estimated optimal parameters while predicting wave heights.
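
    A toy version of the Bayesian-network idea, reduced to a single offshore-to-surf-zone link with coarse wave-height bins, shows how uncertainty in a sparse boundary condition propagates to an output distribution. The states and probability tables below are invented for illustration and are unrelated to the network in the paper.

```python
import numpy as np

# Hypothetical two-node network: offshore wave height -> surf-zone wave height.
# States are coarse bins (m); all numbers are illustrative.
states = np.array([0.5, 1.0, 1.5, 2.0])

# Prior over the (sparse, uncertain) offshore boundary condition
prior = np.array([0.1, 0.4, 0.4, 0.1])

# Conditional probability table P(surf | offshore): rows sum to one
cpt = np.array([
    [0.8, 0.2, 0.0, 0.0],
    [0.3, 0.6, 0.1, 0.0],
    [0.0, 0.3, 0.6, 0.1],
    [0.0, 0.1, 0.4, 0.5],
])

# Forward prediction: marginal distribution over surf-zone wave height
predictive = prior @ cpt
mean = states @ predictive
std = np.sqrt(states**2 @ predictive - mean**2)
print(f"predicted surf wave height: {mean:.2f} +/- {std:.2f} m")
```

    The output is a full distribution rather than a single number, which is what allows input uncertainty to be "accurately transferred" to uncertainty in the prediction.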

  3. Prospects for Precision Neutrino Cross Section Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, Deborah A.

    2016-01-28

    The need for precision cross section measurements is more urgent now than ever before, given the central role neutrino oscillation measurements play in the field of particle physics. The definition of precision is something worth considering, however. In order to build the best model for an oscillation experiment, cross section measurements should span a broad range of energies, neutrino interaction channels, and target nuclei. Precision might better be defined not in the final uncertainty associated with any one measurement but rather with the breadth of measurements that are available to constrain models. Current experience shows that models are better constrained by 10 measurements across different processes and energies with 10% uncertainties than by one measurement of one process on one nucleus with a 1% uncertainty. This article describes the current status of and future prospects for the field of precision cross section measurements considering the metric of how many processes, energies, and nuclei have been studied.

  4. Uncertainty in predictions of oil spill trajectories in a coastal zone

    NASA Astrophysics Data System (ADS)

    Sebastião, P.; Guedes Soares, C.

    2006-12-01

    A method is introduced to determine the uncertainties in the predictions of oil spill trajectories using a classic oil spill model. The method considers the output of the oil spill model as a function of random variables, which are the input parameters, and calculates the standard deviation of the output results, which provides a measure of the uncertainty of the model as a result of the uncertainties of the input parameters. In addition to a single trajectory that is calculated by the oil spill model using the mean values of the parameters, a band of trajectories can be defined when various simulations are done taking into account the uncertainties of the input parameters. This band of trajectories defines envelopes of the trajectories that are likely to be followed by the spill given the uncertainties of the input. The method was applied to an oil spill that occurred in 1989 near Sines on the southwestern coast of Portugal. The model represented well the distinction between a wind-driven part that remained offshore and a tide-driven part that went ashore. For both parts, the method defined two trajectory envelopes, one calculated exclusively with the wind fields, and the other using wind and tidal currents. In both cases a reasonable approximation to the observed results was obtained. The envelope of likely trajectories that is obtained with the uncertainty modelling proved to give a better interpretation of the trajectories that were simulated by the oil spill model.
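
    The method described above amounts to Monte Carlo propagation of input uncertainty: sample the uncertain inputs around their mean values, run the trajectory model for each sample, and report the spread of outcomes as the envelope. The toy drift model and all parameter values below are hypothetical stand-ins for the full oil spill model.

```python
import numpy as np

rng = np.random.default_rng(42)

def drift_endpoint(wind_speed, wind_factor, current):
    """Toy surface-drift model: 24 h displacement (km) from wind drag
    plus a tidal/residual current. Stands in for the trajectory model."""
    hours = 24.0
    return (wind_factor * wind_speed + current) * hours * 3.6  # m/s -> km

# Treat the uncertain inputs as random variables around their means
n = 10_000
wind = rng.normal(8.0, 1.0, n)        # wind speed, m/s
factor = rng.normal(0.03, 0.005, n)   # wind drift factor (~3% rule)
current = rng.normal(0.10, 0.05, n)   # residual current, m/s

endpoints = drift_endpoint(wind, factor, current)
mean, std = endpoints.mean(), endpoints.std()
lo, hi = np.percentile(endpoints, [2.5, 97.5])
print(f"drift: {mean:.1f} km +/- {std:.1f} km (95% envelope {lo:.1f}-{hi:.1f})")
```

    The standard deviation of the endpoints plays the role of the model uncertainty measure, and the percentile band corresponds to the trajectory envelope.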

  5. Hotspots of uncertainty in land-use and land-cover change projections: a global-scale model comparison.

    PubMed

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D A; Arneth, Almut; Calvin, Katherine; Doelman, Jonathan; Eitelberg, David A; Engström, Kerstin; Fujimori, Shinichiro; Hasegawa, Tomoko; Havlik, Petr; Humpenöder, Florian; Jain, Atul K; Krisztin, Tamás; Kyle, Page; Meiyappan, Prasanth; Popp, Alexander; Sands, Ronald D; Schaldach, Rüdiger; Schüngel, Jan; Stehfest, Elke; Tabeau, Andrzej; Van Meijl, Hans; Van Vliet, Jasper; Verburg, Peter H

    2016-12-01

    Model-based global projections of future land-use and land-cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socioeconomic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios, we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g., boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process and improving the allocation mechanisms of LULC change models remain important challenges. Current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches, and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity. © 2016 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
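
    The attribution of uncertainty components via analysis of variance can be sketched for a small synthetic ensemble: with simulations arranged on a model-by-scenario grid, the between-model and between-scenario sums of squares partition the total variance. The ensemble below is synthetic and far smaller than the 43-simulation set used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ensemble: projected cropland change (%) for one region,
# from 3 models x 4 scenarios; all numbers are synthetic.
models = np.array([0.0, 2.0, -1.5])          # model-structure effects
scenarios = np.array([-1.0, 0.0, 1.0, 3.0])  # scenario-storyline effects
y = models[:, None] + scenarios[None, :] + rng.normal(0, 0.5, (3, 4))

grand = y.mean()
ss_total = ((y - grand) ** 2).sum()
ss_model = 4 * ((y.mean(axis=1) - grand) ** 2).sum()     # between models
ss_scenario = 3 * ((y.mean(axis=0) - grand) ** 2).sum()  # between scenarios
ss_resid = ss_total - ss_model - ss_scenario             # residual term

for name, ss in [("model", ss_model), ("scenario", ss_scenario),
                 ("residual", ss_resid)]:
    print(f"{name:9s} {100 * ss / ss_total:5.1f}% of variance")
```

    The fractions printed correspond to the uncertainty components attributed to model structure, scenario storyline, and the residual term; input-data effects would enter as an additional factor in the full analysis.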

  6. ERRORS IN APPLYING LOW IONIC-STRENGTH ACTIVITY COEFFICIENT ALGORITHMS TO HIGHER IONIC-STRENGTH AQUATIC MEDIA

    EPA Science Inventory

    The toxicological and regulatory communities are currently exploring the use of the free-ion-activity (FIA) model both alone and in conjunction with the biotic ligand model (BLM) as a means of reducing uncertainties in current methods for assessing metals bioavailability from aqu...

  7. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
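
    Global sensitivity analysis of this kind is commonly done with variance-based Sobol indices. A minimal sketch follows, using a cheap stand-in for the flow simulation and the Saltelli pick-freeze estimator for first-order indices; the toy model and sample sizes are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(7)

def model(x):
    # Toy model standing in for an expensive flow simulation:
    # strongly sensitive to x1, weakly to x2, not at all to x3.
    return 4.0 * x[:, 0] + 1.0 * x[:, 1] ** 2 + 0.0 * x[:, 2]

n, d = 100_000, 3
A = rng.uniform(-1, 1, (n, d))
B = rng.uniform(-1, 1, (n, d))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

# First-order Sobol indices via the Saltelli pick-freeze estimator
S = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]  # vary only input x_i between the two samples
    fABi = model(ABi)
    S.append(np.mean(fB * (fABi - fA)) / var)
print([f"{s:.2f}" for s in S])
```

    Inputs with near-zero indices (x3 here) can be frozen at nominal values, which is how sensitivity analysis reduces the stochastic dimension before the more expensive uncertainty propagation.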

  8. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  9. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    NASA Astrophysics Data System (ADS)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.

    2018-03-01

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  10. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE PAGES

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; ...

    2018-02-09

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  11. Bayesian analysis of input uncertainty in hydrological modeling: 2. Application

    NASA Astrophysics Data System (ADS)

    Kavetski, Dmitri; Kuczera, George; Franks, Stewart W.

    2006-03-01

    The Bayesian total error analysis (BATEA) methodology directly addresses both input and output errors in hydrological modeling, requiring the modeler to make explicit, rather than implicit, assumptions about the likely extent of data uncertainty. This study considers a BATEA assessment of two North American catchments: (1) French Broad River and (2) Potomac basins. It assesses the performance of the conceptual Variable Infiltration Capacity (VIC) model with and without accounting for input (precipitation) uncertainty. The results show the considerable effects of precipitation errors on the predicted hydrographs (especially the prediction limits) and on the calibrated parameters. In addition, the performance of BATEA in the presence of severe model errors is analyzed. While BATEA allows a very direct treatment of input uncertainty and yields some limited insight into model errors, it requires the specification of valid error models, which are currently poorly understood and require further work. Moreover, it leads to computationally challenging, high-dimensional problems. For some types of models, including the VIC implemented using robust numerical methods, the computational cost of BATEA can be reduced using Newton-type methods.

  12. Open issues in hadronic interactions for air showers

    NASA Astrophysics Data System (ADS)

    Pierog, Tanguy

    2017-06-01

    In detailed air shower simulations, the uncertainty in the prediction of shower observables for different primary particles and energies is currently dominated by differences between hadronic interaction models. With the results of the first run of the LHC, the difference between post-LHC model predictions has been reduced to the same level as experimental uncertainties of cosmic ray experiments. At the same time new types of air shower observables, like the muon production depth, have been measured, adding new constraints on hadronic models. Currently no model is able to consistently reproduce all mass composition measurements possible within the Pierre Auger Observatory for instance. Comparing the different models, and with LHC and cosmic ray data, we will show that the remaining open issues in hadronic interactions in air shower development are now in the pion-air interactions and in nuclear effects.

  13. Hotspots of uncertainty in land-use and land-cover change projections: A global-scale model comparison

    DOE PAGES

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D. A.; ...

    2016-05-02

    Model-based global projections of future land use and land cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socio-economic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g. boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process as well as improving the allocation mechanisms of LULC change models remain important challenges. Furthermore, current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity.

  14. Hotspots of uncertainty in land-use and land-cover change projections: A global-scale model comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D. A.

    Model-based global projections of future land use and land cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socio-economic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g. boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process as well as improving the allocation mechanisms of LULC change models remain important challenges. Furthermore, current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity.

  15. Uncertainties in Galactic Chemical Evolution Models

    DOE PAGES

    Cote, Benoit; Ritter, Christian; Oshea, Brian W.; ...

    2016-06-15

Here we use a simple one-zone galactic chemical evolution model to quantify the uncertainties generated by the input parameters in numerical predictions for a galaxy with properties similar to those of the Milky Way. We compiled several studies from the literature to gather the current constraints for our simulations regarding the typical value and uncertainty of the following seven basic parameters: the lower and upper mass limits of the stellar initial mass function (IMF), the slope of the high-mass end of the stellar IMF, the slope of the delay-time distribution function of Type Ia supernovae (SNe Ia), the number of SNe Ia per M⊙ formed, the total stellar mass formed, and the final mass of gas. We derived probability distribution functions to express the range of likely values for every parameter, which were then included in a Monte Carlo code to run several hundred simulations with randomly selected input parameters. This approach enables us to analyze the predicted chemical evolution of 16 elements in a statistical manner by identifying the most probable solutions along with their 68% and 95% confidence levels. Our results show that the overall uncertainties are shaped by several input parameters that individually contribute at different metallicities, and thus at different galactic ages. The level of uncertainty then depends on the metallicity and is different from one element to another. Among the seven input parameters considered in this work, the slope of the IMF and the number of SNe Ia are currently the two main sources of uncertainty. The thicknesses of the uncertainty bands bounded by the 68% and 95% confidence levels are generally within 0.3 and 0.6 dex, respectively. When looking at the evolution of individual elements as a function of galactic age instead of metallicity, those same thicknesses range from 0.1 to 0.6 dex for the 68% confidence levels and from 0.3 to 1.0 dex for the 95% confidence levels.
The uncertainty in our chemical evolution model does not include uncertainties relating to stellar yields, star formation and merger histories, and modeling assumptions.
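    The Monte Carlo procedure described above can be sketched in a few lines: draw each input parameter from its probability distribution, run the model once per draw, and summarize the ensemble with 68% and 95% confidence bands. The toy abundance function and the parameter values below are illustrative stand-ins, not the study's actual model or constraints:

```python
import numpy as np

rng = np.random.default_rng(42)
n_runs = 500

# Illustrative input PDFs; the values are placeholders, not the study's constraints.
imf_slope = rng.normal(-2.35, 0.2, n_runs)      # slope of the high-mass IMF
n_ia_per_msun = rng.normal(2e-3, 5e-4, n_runs)  # SNe Ia per solar mass formed

ages = np.linspace(0.1, 13.0, 50)  # galactic age [Gyr]

def toy_abundance(age, slope, n_ia):
    """Stand-in for a one-zone chemical evolution model: [Fe/H] versus age."""
    return -1.5 + 0.8 * np.log10(age) + 0.1 * (slope + 2.35) + 50.0 * (n_ia - 2e-3)

# One predicted track per Monte Carlo draw.
tracks = np.array([toy_abundance(ages, s, n)
                   for s, n in zip(imf_slope, n_ia_per_msun)])

# 68% and 95% confidence bands from the ensemble of tracks.
p2, p16, p50, p84, p97 = np.percentile(tracks, [2.5, 16, 50, 84, 97.5], axis=0)
band68 = p84 - p16
band95 = p97 - p2
print(f"median 68% band width: {np.median(band68):.2f} dex")
```

    The band thicknesses quoted in the abstract (0.3 and 0.6 dex) are exactly this kind of percentile spread, computed age by age over the ensemble.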

  16. Uncertainty analysis of an irrigation scheduling model for water management in crop production

    USDA-ARS?s Scientific Manuscript database

    Irrigation scheduling tools are critical to allow producers to manage water resources for crop production in an accurate and timely manner. To be useful, these tools need to be accurate, complete, and relatively reliable. The current work presents the uncertainty analysis and its results for the Mis...

  17. Predictions of space radiation fatality risk for exploration missions

    NASA Astrophysics Data System (ADS)

    Cucinotta, Francis A.; To, Khiet; Cacao, Eliedonna

    2017-05-01

In this paper we describe revisions to the NASA Space Cancer Risk (NSCR) model focusing on updates to probability distribution functions (PDF) representing the uncertainties in the radiation quality factor (QF) model parameters and the dose and dose-rate reduction effectiveness factor (DDREF). We integrate recent heavy ion data on liver, colorectal, intestinal, lung, and Harderian gland tumors with other data from fission neutron experiments into the model analysis. In an earlier work we introduced distinct QFs for leukemia and solid cancer risk predictions, and here we consider liver cancer risks separately because of the higher RBEs reported in mouse experiments compared to other tumor types, and distinct risk factors for liver cancer for astronauts compared to the U.S. population. The revised model is used to make predictions of fatal cancer and circulatory disease risks for 1-year deep space and International Space Station (ISS) missions, and a 940-day Mars mission. We analyzed the contribution of the various model parameter uncertainties to the overall uncertainty, which shows that the uncertainties in relative biological effectiveness (RBE) factors at high LET, due to statistical uncertainties and differences across tissue types and mouse strains, are the dominant contribution. NASA's exposure limits are approached or exceeded for each mission scenario considered. Two main conclusions are made: 1) reducing the current estimate of about a 3-fold uncertainty to a 2-fold or lower uncertainty will require much more expansive animal carcinogenesis studies in order to reduce statistical uncertainties and understand tissue, sex and genetic variations; 2) alternative model assumptions such as non-targeted effects, increased tumor lethality and decreased latency at high LET, and non-cancer mortality risks from circulatory diseases could significantly increase risk estimates to several times higher than the NASA limits.

  18. Uncertainty Quantification of CFD Data Generated for a Model Scramjet Isolator Flowfield

    NASA Technical Reports Server (NTRS)

    Baurle, R. A.; Axdahl, E. L.

    2017-01-01

    Computational fluid dynamics is now considered to be an indispensable tool for the design and development of scramjet engine components. Unfortunately, the quantification of uncertainties is rarely addressed with anything other than sensitivity studies, so the degree of confidence associated with the numerical results remains exclusively with the subject matter expert that generated them. This practice must be replaced with a formal uncertainty quantification process for computational fluid dynamics to play an expanded role in the system design, development, and flight certification process. Given the limitations of current hypersonic ground test facilities, this expanded role is believed to be a requirement by some in the hypersonics community if scramjet engines are to be given serious consideration as a viable propulsion system. The present effort describes a simple, relatively low cost, nonintrusive approach to uncertainty quantification that includes the basic ingredients required to handle both aleatoric (random) and epistemic (lack of knowledge) sources of uncertainty. The nonintrusive nature of the approach allows the computational fluid dynamicist to perform the uncertainty quantification with the flow solver treated as a "black box". Moreover, a large fraction of the process can be automated, allowing the uncertainty assessment to be readily adapted into the engineering design and development workflow. In the present work, the approach is applied to a model scramjet isolator problem where the desire is to validate turbulence closure models in the presence of uncertainty. In this context, the relevant uncertainty sources are determined and accounted for to allow the analyst to delineate turbulence model-form errors from other sources of uncertainty associated with the simulation of the facility flow.
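    The nonintrusive, black-box character of such an approach can be illustrated with a minimal sketch in which an aleatory input is sampled from an assumed PDF while an epistemic input, for which no distribution is claimed, is swept over an interval. The solver stand-in, the parameter names, and all numerical values here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def cfd_black_box(inflow_mach, turb_constant):
    """Stand-in for the flow solver; the real code is only ever called
    through an interface like this, never modified internally."""
    return 2.0 * inflow_mach + 0.5 * np.sin(3.0 * turb_constant)

# Aleatory input: random by nature, sampled from an assumed PDF.
mach_samples = rng.normal(2.5, 0.05, 200)

# Epistemic input: lack of knowledge, so no PDF is claimed; instead the
# plausible interval is swept deterministically.
turb_values = np.linspace(0.3, 0.5, 9)

# An outer loop over epistemic values with inner sampling over aleatory
# ones yields a family of output distributions (a "probability box").
results = np.array([[cfd_black_box(m, t) for m in mach_samples]
                    for t in turb_values])
means = results.mean(axis=1)
print(f"epistemic interval on the mean output: [{means.min():.3f}, {means.max():.3f}]")
```

    Keeping the two loops separate is what preserves the aleatory/epistemic distinction: the epistemic sweep produces an interval of statistics rather than a single blended distribution.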

  19. Photovoltaic System Modeling. Uncertainty and Sensitivity Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Clifford W.; Martin, Curtis E.

    2015-08-01

We report an uncertainty and sensitivity analysis for modeling AC energy from photovoltaic systems. Output from a PV system is predicted by a sequence of models. We quantify uncertainty in the output of each model using empirical distributions of each model's residuals. We propagate uncertainty through the sequence of models by sampling these distributions to obtain an empirical distribution of a PV system's output. We consider models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance; (3) predict cell temperature; (4) estimate DC voltage, current and power; (5) reduce DC power for losses due to inefficient maximum power point tracking or mismatch among modules; and (6) convert DC to AC power. Our analysis considers a notional PV system comprising an array of FirstSolar FS-387 modules and a 250 kW AC inverter; we use measured irradiance and weather at Albuquerque, NM. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. We found the uncertainty in the models for POA irradiance and effective irradiance to be the dominant contributors to uncertainty in predicted daily energy. Our analysis indicates that efforts to reduce the uncertainty in PV system output predictions may yield the greatest improvements by focusing on the POA and effective irradiance models.
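    The propagation strategy described above, resampling each model's empirical residual distribution and feeding the perturbed output to the next model in the chain, can be sketched as follows. The two-stage chain, the residual distributions, and the toy transposition and power models are illustrative stand-ins, not the report's actual models:

```python
import numpy as np

rng = np.random.default_rng(1)
n_draws = 1000

# Hypothetical empirical residuals (model minus measurement) for two stages
# of the chain; in practice these come from validation data for each model.
poa_residuals = rng.normal(0.0, 10.0, 250)   # plane-of-array irradiance [W/m^2]
power_residuals = rng.normal(0.0, 2.0, 250)  # AC power [kW]

def poa_model(ghi):
    return 0.9 * ghi   # toy transposition model

def power_model(poa):
    return 0.2 * poa   # toy irradiance-to-AC-power model

ghi = 800.0  # one measured input [W/m^2]

# Propagate: at each stage, add a residual resampled from that model's
# empirical distribution, then feed the perturbed value onward.
poa = poa_model(ghi) + rng.choice(poa_residuals, n_draws)
ac = power_model(poa) + rng.choice(power_residuals, n_draws)

rel_unc = ac.std() / ac.mean()
print(f"relative uncertainty in AC power: {100 * rel_unc:.1f}%")
```

    Because residuals are resampled rather than fitted to a parametric form, the propagated distribution inherits whatever skew or heavy tails the validation data contain.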

  20. Uncertainty quantification and validation of combined hydrological and macroeconomic analyses.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez, Jacquelynne; Parks, Mancel Jordan; Jennings, Barbara Joan

    2010-09-01

Changes in climate can lead to instabilities in physical and economic systems, particularly in regions with marginal resources. Global climate models indicate increasing global mean temperatures over the decades to come and uncertainty in the local to national impacts means perceived risks will drive planning decisions. Agent-based models provide one of the few ways to evaluate the potential changes in behavior in coupled social-physical systems and to quantify and compare risks. The current generation of climate impact analyses provides estimates of the economic cost of climate change for a limited set of climate scenarios that account for a small subset of the dynamics and uncertainties. To better understand the risk to national security, the next generation of risk assessment models must represent global stresses, population vulnerability to those stresses, and the uncertainty in population responses and outcomes that could have a significant impact on U.S. national security.

  1. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    DOE PAGES

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...

    2014-12-31

Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.

  2. Uncertainty Quantification in Climate Modeling and Projection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qian, Yun; Jackson, Charles; Giorgi, Filippo

The projection of future climate is one of the most complex problems undertaken by the scientific community. Although scientists have been striving to better understand the physical basis of the climate system and to improve climate models, the overall uncertainty in projections of future climate has not been significantly reduced (e.g., from the IPCC AR4 to AR5). With the rapid increase of complexity in Earth system models, reducing uncertainties in climate projections becomes extremely challenging. Since uncertainties always exist in climate models, interpreting the strengths and limitations of future climate projections is key to evaluating risks, and climate change information for use in Vulnerability, Impact, and Adaptation (VIA) studies should be provided with both well-characterized and well-quantified uncertainty. The workshop aimed at providing participants, many of them from developing countries, information on strategies to quantify the uncertainty in climate model projections and assess the reliability of climate change information for decision-making. The program included a mixture of lectures on fundamental concepts in Bayesian inference and sampling, applications, and hands-on computer laboratory exercises employing software packages for Bayesian inference, Markov Chain Monte Carlo methods, and global sensitivity analyses. The lectures covered a range of scientific issues underlying the evaluation of uncertainties in climate projections, such as the effects of uncertain initial and boundary conditions, uncertain physics, and limitations of observational records. Progress in quantitatively estimating uncertainties in hydrologic, land surface, and atmospheric models at both regional and global scales was also reviewed. The application of Uncertainty Quantification (UQ) concepts to coupled climate system models is still in its infancy.
The Coupled Model Intercomparison Project (CMIP) multi-model ensemble currently represents the primary data for assessing reliability and uncertainties of climate change information. An alternative approach is to generate similar ensembles by perturbing parameters within a single-model framework. One of the workshop’s objectives was to give participants a deeper understanding of these approaches within a Bayesian statistical framework. However, significant challenges remain to be resolved before UQ can be applied in a convincing way to climate models and their projections.

  3. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  4. Host model uncertainties in aerosol radiative forcing estimates: results from the AeroCom prescribed intercomparison study

    NASA Astrophysics Data System (ADS)

    Stier, P.; Schutgens, N. A. J.; Bian, H.; Boucher, O.; Chin, M.; Ghan, S.; Huneeus, N.; Kinne, S.; Lin, G.; Myhre, G.; Penner, J. E.; Randles, C.; Samset, B.; Schulz, M.; Yu, H.; Zhou, C.

    2012-09-01

Simulated multi-model "diversity" in aerosol direct radiative forcing estimates is often perceived as a measure of aerosol uncertainty. However, current models used for aerosol radiative forcing calculations vary considerably in model components relevant for forcing calculations, and the associated "host-model uncertainties" are generally convoluted with the actual aerosol uncertainty. In this AeroCom Prescribed intercomparison study we systematically isolate and quantify host model uncertainties in aerosol forcing experiments through prescription of identical aerosol radiative properties in nine participating models. Even with prescribed aerosol radiative properties, simulated clear-sky and all-sky aerosol radiative forcings show significant diversity. For a purely scattering case with globally constant optical depth of 0.2, the global-mean all-sky top-of-atmosphere radiative forcing is -4.51 W m-2 and the inter-model standard deviation is 0.70 W m-2, corresponding to a relative standard deviation of 15%. For a case with partially absorbing aerosol with an aerosol optical depth of 0.2 and single scattering albedo of 0.8, the forcing changes to 1.26 W m-2, and the standard deviation increases to 1.21 W m-2, corresponding to a significant relative standard deviation of 96%. However, the top-of-atmosphere forcing variability owing to absorption is low, with relative standard deviations of 9% (clear-sky) and 12% (all-sky). Scaling the forcing standard deviation for a purely scattering case to match the sulfate radiative forcing in the AeroCom Direct Effect experiment demonstrates that host model uncertainties could explain about half of the overall sulfate forcing diversity of 0.13 W m-2 in the AeroCom Direct Radiative Effect experiment. Host model errors in aerosol radiative forcing are largest in regions of uncertain host model components, such as stratocumulus cloud decks or areas with poorly constrained surface albedos, such as sea ice.
Our results demonstrate that host model uncertainties are an important component of aerosol forcing uncertainty that require further attention.
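    As a small worked example, the inter-model diversity statistics quoted above (ensemble mean forcing, inter-model standard deviation, and relative standard deviation) are computed like this; the nine forcing values are made up for illustration, not taken from the study:

```python
import numpy as np

# Hypothetical global-mean forcings from nine models [W/m^2] for one
# prescribed-aerosol case (values illustrative only).
forcings = np.array([-4.9, -4.2, -5.1, -3.8, -4.6, -4.4, -5.3, -4.0, -4.3])

mean = forcings.mean()
std = forcings.std(ddof=1)   # inter-model (sample) standard deviation
rel_std = abs(std / mean)    # relative standard deviation
print(f"mean {mean:.2f} W/m^2, std {std:.2f} W/m^2, relative {100 * rel_std:.0f}%")
```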

  5. Probabilistic inversion of expert assessments to inform projections about Antarctic ice sheet responses

    PubMed Central

    Wong, Tony E.; Keller, Klaus

    2017-01-01

    The response of the Antarctic ice sheet (AIS) to changing global temperatures is a key component of sea-level projections. Current projections of the AIS contribution to sea-level changes are deeply uncertain. This deep uncertainty stems, in part, from (i) the inability of current models to fully resolve key processes and scales, (ii) the relatively sparse available data, and (iii) divergent expert assessments. One promising approach to characterizing the deep uncertainty stemming from divergent expert assessments is to combine expert assessments, observations, and simple models by coupling probabilistic inversion and Bayesian inversion. Here, we present a proof-of-concept study that uses probabilistic inversion to fuse a simple AIS model and diverse expert assessments. We demonstrate the ability of probabilistic inversion to infer joint prior probability distributions of model parameters that are consistent with expert assessments. We then confront these inferred expert priors with instrumental and paleoclimatic observational data in a Bayesian inversion. These additional constraints yield tighter hindcasts and projections. We use this approach to quantify how the deep uncertainty surrounding expert assessments affects the joint probability distributions of model parameters and future projections. PMID:29287095
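    In its simplest one-constraint form, probabilistic inversion amounts to reweighting prior parameter samples so that the model's pushed-forward distribution honors an expert-assessed probability. The toy AIS response function, the parameter range, and the expert constraint below are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000

# Broad prior over a hypothetical AIS model parameter (e.g., a melt-rate
# sensitivity); the range is illustrative.
theta = rng.uniform(0.0, 2.0, n)

def ais_model(theta):
    """Stand-in for the simple AIS model: projected sea-level contribution [m]."""
    return 0.3 * theta ** 2

slr = ais_model(theta)

# Expert assessment to honor: P(contribution > 0.5 m) = 0.3. Reweight the
# prior samples so the pushed-forward distribution matches that probability
# (the one-constraint special case of probabilistic inversion).
target_p = 0.3
above = slr > 0.5
w = np.where(above, target_p / above.mean(), (1 - target_p) / (~above).mean())
w /= w.sum()

# The reweighted (inferred) prior now satisfies the expert constraint.
p_above = w[above].sum()
print(f"P(contribution > 0.5 m) under inferred prior: {p_above:.3f}")
```

    With several experts or several assessed quantiles, the same idea is solved iteratively (e.g., by iterative proportional fitting), and the inferred weighted prior is then confronted with observations in a Bayesian update, as the abstract describes.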

  6. Uncertainty in monitoring E. coli concentrations in streams and stormwater runoff

    NASA Astrophysics Data System (ADS)

    Harmel, R. D.; Hathaway, J. M.; Wagner, K. L.; Wolfe, J. E.; Karthikeyan, R.; Francesconi, W.; McCarthy, D. T.

    2016-03-01

Microbial contamination of surface waters, a substantial public health concern throughout the world, is typically identified by fecal indicator bacteria such as Escherichia coli. Thus, monitoring E. coli concentrations is critical to evaluate current conditions, determine restoration effectiveness, and inform model development and calibration. An often overlooked component of these monitoring and modeling activities is understanding the inherent random and systematic uncertainty present in measured data. In this research, a review and subsequent analysis were performed to identify, document, and analyze measurement uncertainty of E. coli data collected in stream flow and stormwater runoff as individual discrete samples or throughout a single runoff event. Data on the uncertainty contributed by sample collection, sample preservation/storage, and laboratory analysis in measured E. coli concentrations were compiled and analyzed, and differences in sampling method and data quality scenarios were compared. The analysis showed that: (1) manual integrated sampling produced the lowest random and systematic uncertainty in individual samples, but automated sampling typically produced the lowest uncertainty when sampling throughout runoff events; (2) sample collection procedures often contributed the highest amount of uncertainty, although laboratory analysis introduced substantial random uncertainty and preservation/storage introduced substantial systematic uncertainty under some scenarios; and (3) the uncertainty in measured E. coli concentrations was greater than that of sediment and nutrients, but the difference was not as great as may be assumed. This comprehensive analysis of uncertainty in E. coli concentrations measured in streamflow and runoff should provide valuable insight for designing E. coli monitoring projects, reducing uncertainty in quality assurance efforts, regulatory and policy decision making, and fate and transport modeling.

  7. Ensemble urban flood simulation in comparison with laboratory-scale experiments: Impact of interaction models for manhole, sewer pipe, and surface flow

    NASA Astrophysics Data System (ADS)

    Noh, Seong Jin; Lee, Seungsoo; An, Hyunuk; Kawaike, Kenji; Nakagawa, Hajime

    2016-11-01

    An urban flood is an integrated phenomenon that is affected by various uncertainty sources such as input forcing, model parameters, complex geometry, and exchanges of flow among different domains in surfaces and subsurfaces. Despite considerable advances in urban flood modeling techniques, limited knowledge is currently available with regard to the impact of dynamic interaction among different flow domains on urban floods. In this paper, an ensemble method for urban flood modeling is presented to consider the parameter uncertainty of interaction models among a manhole, a sewer pipe, and surface flow. Laboratory-scale experiments on urban flood and inundation are performed under various flow conditions to investigate the parameter uncertainty of interaction models. The results show that ensemble simulation using interaction models based on weir and orifice formulas reproduces experimental data with high accuracy and detects the identifiability of model parameters. Among interaction-related parameters, the parameters of the sewer-manhole interaction show lower uncertainty than those of the sewer-surface interaction. Experimental data obtained under unsteady-state conditions are more informative than those obtained under steady-state conditions to assess the parameter uncertainty of interaction models. Although the optimal parameters vary according to the flow conditions, the difference is marginal. Simulation results also confirm the capability of the interaction models and the potential of the ensemble-based approaches to facilitate urban flood simulation.
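    The weir- and orifice-type interaction formulas mentioned above follow standard hydraulic forms, with the discharge coefficients as the uncertain parameters an ensemble would perturb. A minimal sketch, using textbook default coefficients and made-up geometry rather than the study's calibrated values:

```python
import math

G = 9.81  # gravitational acceleration [m/s^2]

def weir_interaction(h_surface, c_w=0.6, perimeter=1.2):
    """Weir-type exchange at a manhole: Q = c_w * L * sqrt(2g) * h^(3/2).
    Applies when the surface water depth h is shallow relative to the opening."""
    return c_w * perimeter * math.sqrt(2 * G) * h_surface ** 1.5

def orifice_interaction(head_diff, c_o=0.62, area=0.25):
    """Orifice-type exchange: Q = c_o * A * sqrt(2g * dH).
    Applies when the manhole opening is fully submerged."""
    return c_o * area * math.sqrt(2 * G * head_diff)

# The discharge coefficients c_w and c_o are the parameters an ensemble
# simulation would sample; the defaults above are common textbook values.
print(f"weir Q at h=0.05 m: {weir_interaction(0.05):.4f} m^3/s")
print(f"orifice Q at dH=0.3 m: {orifice_interaction(0.3):.4f} m^3/s")
```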

  8. Comparison of Model and Observations of Middle Atmospheric HOx Response to Solar 27-day Cycles: Quantifying Model Uncertainties due to Photochemistry

    NASA Astrophysics Data System (ADS)

    Wang, S.; Li, K. F.; Shia, R. L.; Yung, Y. L.; Sander, S. P.

    2016-12-01

HO2 and OH, together known as odd hydrogen (HOx), play an important role in middle atmospheric chemistry, in particular in O3 destruction through catalytic HOx reaction cycles. Due to their photochemical production and short chemical lifetimes, HOx species respond rapidly to solar UV irradiance changes during solar cycles, resulting in variability in the corresponding O3 chemistry. Observational evidence of both OH and HO2 variability over solar cycles has been reported. However, puzzling discrepancies remain. In particular, the large discrepancy between model and observations of the solar 11-year cycle signal in OH, and the significantly different model results obtained when adopting different solar spectral irradiance (SSI) datasets [Wang et al., 2013], suggest that both uncertainties in SSI variability and uncertainties in our current understanding of HOx-O3 chemistry could contribute to the discrepancy. Since the short-term SSI variability (e.g. changes during solar 27-day cycles) has little uncertainty, investigating 27-day solar cycle signals in HOx allows us to simplify the complex problem and to focus on the uncertainties in chemistry alone. We use the Caltech-JPL photochemical model to simulate observed HOx variability during 27-day cycles. The comparison between Aura Microwave Limb Sounder (MLS) observations and our model results (using standard chemistry and "adjusted chemistry", respectively) will be discussed. A better understanding of uncertainties in chemistry will eventually help us separate the contribution of chemistry from contributions of SSI uncertainties to the complex discrepancy between model and observations of OH responses to solar 11-year cycles.

  9. Mass discharge estimation from contaminated sites: Multi-model solutions for assessment of conceptual uncertainty

    NASA Astrophysics Data System (ADS)

    Thomsen, N. I.; Troldborg, M.; McKnight, U. S.; Binning, P. J.; Bjerg, P. L.

    2012-04-01

Mass discharge estimates are increasingly being used in the management of contaminated sites. Such estimates have proven useful for supporting decisions related to the prioritization of contaminated sites in a groundwater catchment. Potential management options can be categorised as follows: (1) leave as is, (2) clean up, or (3) further investigation needed. However, mass discharge estimates are often very uncertain, which may hamper the management decisions. If option 1 is incorrectly chosen, soil and water quality will decrease, threatening or destroying drinking water resources. The risk of choosing option 2 is to spend money on remediating a site that does not pose a problem. Choosing option 3 will often be safest, but may not be the optimal economic solution. Quantification of the uncertainty in mass discharge estimates can therefore greatly improve the foundation for selecting the appropriate management option. The uncertainty of mass discharge estimates depends greatly on the extent of the site characterization. A good approach for uncertainty estimation will be flexible with respect to the investigation level, and account for both parameter and conceptual model uncertainty. We propose a method for quantifying the uncertainty of dynamic mass discharge estimates from contaminant point sources on the local scale. The method considers both parameter and conceptual uncertainty through a multi-model approach. The multi-model approach evaluates multiple conceptual models for the same site. The different conceptual models consider different source characterizations and hydrogeological descriptions. The idea is to include a set of essentially different conceptual models, where each model is believed to be a realistic representation of the given site based on the current level of information. Parameter uncertainty is quantified using Monte Carlo simulations.
For each conceptual model we calculate a transient mass discharge estimate with uncertainty bounds resulting from the parametric uncertainty. To quantify the conceptual uncertainty for a given site, we combine the outputs from the different conceptual models using Bayesian model averaging. The weight for each model is obtained by integrating available data and expert knowledge using Bayesian belief networks. The multi-model approach is applied to a contaminated site. At the site, a DNAPL (dense non-aqueous phase liquid) spill consisting of PCE (perchloroethylene) has contaminated a fractured clay till aquitard overlying a limestone aquifer. The exact shape and nature of the source are unknown, and so is the importance of transport in the fractures. The result of the multi-model approach is a visual representation of the uncertainty of the mass discharge estimates for the site, which can be used to support the management options.
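    The Bayesian model averaging step described above, pooling per-conceptual-model Monte Carlo ensembles with mixture weights, can be sketched as follows; the three conceptual models, their output distributions, and the weights are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)
n_draws = 2000

# Per-conceptual-model mass discharge ensembles [kg/yr] from Monte Carlo
# over each model's parameters (lognormal stand-ins, values illustrative).
model_outputs = {
    "matrix_only": rng.lognormal(np.log(5.0), 0.4, n_draws),
    "fracture_flow": rng.lognormal(np.log(20.0), 0.6, n_draws),
    "mixed": rng.lognormal(np.log(10.0), 0.5, n_draws),
}

# Model weights, e.g. from a Bayesian belief network over site evidence;
# here they are simply assumed and must sum to one.
weights = {"matrix_only": 0.2, "fracture_flow": 0.5, "mixed": 0.3}
assert abs(sum(weights.values()) - 1.0) < 1e-12

# Bayesian model averaging: draw from each model's ensemble with
# probability equal to its weight (a mixture of the per-model distributions).
names = list(model_outputs)
picks = rng.choice(names, n_draws, p=[weights[n] for n in names])
bma = np.array([rng.choice(model_outputs[n]) for n in picks])

print(f"BMA median discharge: {np.median(bma):.1f} kg/yr "
      f"(68% band: {np.percentile(bma, 16):.1f}-{np.percentile(bma, 84):.1f})")
```

    The pooled ensemble carries both sources of uncertainty at once: the spread within each model (parametric) and the spread between models (conceptual).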

  10. Characterizing Uncertainty and Variability in PBPK Models ...

    EPA Pesticide Factsheets

    Mode-of-action based risk and safety assessments can rely upon tissue dosimetry estimates in animals and humans obtained from physiologically-based pharmacokinetic (PBPK) modeling. However, risk assessment also increasingly requires characterization of uncertainty and variability; such characterization for PBPK model predictions represents a continuing challenge to both modelers and users. Current practices show significant progress in specifying deterministic biological models and the non-deterministic (often statistical) models, estimating their parameters using diverse data sets from multiple sources, and using them to make predictions and characterize uncertainty and variability. The International Workshop on Uncertainty and Variability in PBPK Models, held Oct 31-Nov 2, 2006, sought to identify the state-of-the-science in this area and recommend priorities for research and changes in practice and implementation. For the short term, these include: (1) multidisciplinary teams to integrate deterministic and non-deterministic/statistical models; (2) broader use of sensitivity analyses, including for structural and global (rather than local) parameter changes; and (3) enhanced transparency and reproducibility through more complete documentation of the model structure(s) and parameter values, the results of sensitivity and other analyses, and supporting, discrepant, or excluded data. Longer-term needs include: (1) theoretic and practical methodological impro

  11. The Value of Decision Analytical Modeling in Surgical Research: An Example of Laparoscopic Versus Open Distal Pancreatectomy.

    PubMed

    Tax, Casper; Govaert, Paulien H M; Stommel, Martijn W J; Besselink, Marc G H; Gooszen, Hein G; Rovers, Maroeska M

    2017-11-02

To illustrate how decision modeling may identify relevant uncertainty and can preclude or identify areas of future research in surgery. To optimize use of research resources, a tool is needed that assists in identifying relevant uncertainties and the added value of reducing these uncertainties. The clinical pathway for laparoscopic distal pancreatectomy (LDP) versus open distal pancreatectomy (ODP) for nonmalignant lesions was modeled in a decision tree. Cost-effectiveness based on complications, hospital stay, costs, quality of life, and survival was analyzed. The effect of existing uncertainty on the cost-effectiveness was addressed, as well as the expected value of eliminating uncertainties. Based on 29 nonrandomized studies (3,701 patients), the model shows that LDP is more cost-effective compared with ODP. Scenarios in which LDP does not outperform ODP for cost-effectiveness seem unrealistic, e.g., a 30-day mortality rate 1.79 times higher after LDP than after ODP, conversion in 62.2% of cases, surgical repair of incisional hernias in 21% after LDP, or an average hospital stay 2.3 days longer after LDP than after ODP. Taking all uncertainty into account, LDP remained more cost-effective. Minimizing these uncertainties did not change the outcome. The results show how decision analytical modeling can help to identify relevant uncertainty and guide decisions for future research in surgery. Based on the currently available evidence, a randomized clinical trial on complications, hospital stay, costs, quality of life, and survival is highly unlikely to change the conclusion that LDP is more cost-effective than ODP.

  12. Importance of anthropogenic climate impact, sampling error and urban development in sewer system design.

    PubMed

    Egger, C; Maurer, M

    2015-04-15

    Urban drainage design relying on observed precipitation series neglects the uncertainties associated with current and indeed future climate variability. Urban drainage design is further affected by the large stochastic variability of precipitation extremes and sampling errors arising from the short observation periods of extreme precipitation. Stochastic downscaling addresses anthropogenic climate impact by allowing relevant precipitation characteristics to be derived from local observations and an ensemble of climate models. This multi-climate model approach seeks to reflect the uncertainties in the data due to structural errors of the climate models. An ensemble of outcomes from stochastic downscaling allows for addressing the sampling uncertainty. These uncertainties are clearly reflected in the precipitation-runoff predictions of three urban drainage systems. They were mostly due to the sampling uncertainty. The contribution of climate model uncertainty was found to be of minor importance. Under the applied greenhouse gas emission scenario (A1B) and within the period 2036-2065, the potential for urban flooding in our Swiss case study is slightly reduced on average compared to the reference period 1981-2010. Scenario planning was applied to consider urban development associated with future socio-economic factors affecting urban drainage. The impact of scenario uncertainty was to a large extent found to be case-specific, thus emphasizing the need for scenario planning in every individual case. The results represent a valuable basis for discussions of new drainage design standards aiming specifically to include considerations of uncertainty. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Particle Dark Matter constraints: the effect of Galactic uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benito, Maria; Bernal, Nicolás; Iocco, Fabio

    2017-02-01

Collider, space, and Earth based experiments are now able to probe several extensions of the Standard Model of particle physics which provide viable dark matter candidates. Direct and indirect dark matter searches rely on inputs of astrophysical nature, such as the local dark matter density or the shape of the dark matter density profile in the target object. The determination of these quantities is highly affected by astrophysical uncertainties. The latter, especially those for our own Galaxy, are ill-known, and often not fully accounted for when analyzing the phenomenology of particle physics models. In this paper we present a systematic, quantitative estimate of how astrophysical uncertainties on Galactic quantities (such as the local galactocentric distance, circular velocity, or the morphology of the stellar disk and bulge) propagate to the determination of the phenomenology of particle physics models, thus eventually affecting the determination of new physics parameters. We present results in the context of two specific extensions of the Standard Model (the Singlet Scalar and the Inert Doublet) that we adopt as case studies for their simplicity in illustrating the magnitude and impact of such uncertainties on the parameter space of the particle physics model itself. Our findings point toward very relevant effects of current Galactic uncertainties on the determination of particle physics parameters, and urge a systematic estimate of such uncertainties in more complex scenarios, in order to achieve constraints on the determination of new physics that realistically include all known uncertainties.

  14. Model output: fact or artefact?

    NASA Astrophysics Data System (ADS)

    Melsen, Lieke

    2015-04-01

    As a third-year PhD student, I relatively recently entered the wonderful world of scientific hydrology. A science with many pillars that directly impact society, for example through the prediction of hydrological extremes (both floods and droughts), climate change, applications in agriculture, nature conservation, drinking water supply, etcetera. Despite its demonstrable societal relevance, hydrology is often seen as a science between two stools. As Klemeš (1986) stated: "By their academic background, hydrologists are foresters, geographers, electrical engineers, geologists, system analysts, physicists, mathematicians, botanists, and most often civil engineers." Sometimes it seems that the engineering genes are still present in current hydrological science, and this results in pragmatic rather than scientific approaches to some of the current problems and challenges we have in hydrology. Here, I refer to the uncertainty in hydrological modelling that is so often neglected. For over thirty years, uncertainty in hydrological models has been extensively discussed and studied. Yet it is not difficult to find peer-reviewed articles in which it is implicitly assumed that model simulations represent the truth rather than a conceptualization of reality. For instance in trend studies, where data are extrapolated 100 years ahead. Of course one can use different forcing datasets to estimate the uncertainty in the input data, but how do we ensure that the output is not a model artefact caused by the model structure? Or take impact studies, e.g. of a dam affecting river flow. Measurements are often available only for the period after dam construction, so models are used to simulate river flow before dam construction, and the two are compared in order to quantify the effect of the dam. But on what basis can we tell that the model tells us the truth? 
Model validation is common nowadays, but validation alone (comparing observations with model output) is not sufficient to assume that a model reflects reality. This is due to non-uniqueness, or so-called equifinality: different model structures can lead to the same output (Oreskes et al., 1994; Beven, 2006). But it is also because validation alone does not tell us whether we are 'right for the wrong reasons' (Kirchner, 2006; Oreskes et al., 1994). We can never know how right or wrong our models are, because we do not fully understand reality. But we can estimate the uncertainty from the model and the input data itself. Many techniques have been developed that help in estimating model uncertainty: e.g. model structural uncertainty, studied in the FUSE framework (Clark et al., 2008); parameter uncertainty, with GLUE (Beven and Binley, 1992) and DREAM (Vrugt et al., 2008); and input data uncertainty, using BATEA (Kavetski et al., 2006). These are just some examples that pop up in a first search. But somehow, these techniques are only used and applied in studies that focus on model uncertainty itself, and hardly ever appear in studies whose research question lies outside the uncertainty field. We know that models don't tell us the truth, but we have the tendency to claim they do, based on validation only. A model is always a simplification of reality, which by definition leads to uncertainty when model output and observations of reality are compared. The least we could do is estimate the uncertainty of the model and the data itself. My question therefore is: as scientists, can we accept that we believe things of which we know they might not be true? And secondly: how to deal with this? How should model uncertainty change the way we communicate scientific results? References Beven, K., and A. Binley, The future of distributed models: Model calibration and uncertainty prediction, HP 6 (1992). Beven, K., A manifesto for the equifinality thesis, JoH 320 (2006). 
Clark, M.P., A.G. Slater, D.E. Rupp, R.A. Woods, J.A. Vrugt, H.V. Gupta, T. Wagener and L.E. Hay, Framework for Understanding Structural Errors (FUSE): A modular framework to diagnose differences between hydrological models, WRR 44 (2008). Kavetski, D., G. Kuczera and S.W. Franks, Bayesian analysis of input uncertainty in hydrological modeling: 1. Theory, WRR 42 (2006). Kirchner, J.W., Getting the right answers for the right reasons: Linking measurements, analyses, and models to advance the science of hydrology, WRR 42 (2006). Klemeš, V., Dilettantism in Hydrology: Transition or Destiny?, WRR 22-9 (1986). Oreskes, N., K. Shrader-Frechette, and K. Belitz, Verification, Validation and Confirmation of Numerical Models in Earth Sciences, SCIENCE 263 (1994). Vrugt, J.A., C.J.F. ter Braak, M.P. Clark, J.M. Hyman, and B.A. Robinson, Treatment of input uncertainty in hydrologic modeling: Doing hydrology backward with Markov chain Monte Carlo simulation, WRR 44 (2008).
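
The GLUE procedure cited above (Beven and Binley, 1992) can be sketched in a few lines: sample parameter sets, score each against observations with an informal likelihood such as Nash-Sutcliffe efficiency, and retain the "behavioral" sets above a threshold. The toy linear-reservoir model, the synthetic observations, and the 0.8 threshold are assumptions made purely for illustration:

```python
import random

random.seed(1)

def reservoir(k, rain):
    """Toy linear-reservoir model: storage s fills with rain, outflow q = s / k."""
    s, q = 0.0, []
    for p in rain:
        s += p
        out = s / k
        s -= out
        q.append(out)
    return q

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / var

rain = [random.expovariate(0.5) for _ in range(50)]
obs = [q + random.gauss(0, 0.05) for q in reservoir(3.0, rain)]  # synthetic "truth", k = 3

# GLUE: Monte Carlo sampling of k, keep behavioral parameter sets (NSE > 0.8)
behavioral = []
for _ in range(5000):
    k = random.uniform(1.0, 10.0)
    score = nse(obs, reservoir(k, rain))
    if score > 0.8:
        behavioral.append((k, score))

ks = sorted(k for k, _ in behavioral)
print(f"{len(behavioral)} behavioral sets; k in [{ks[0]:.2f}, {ks[-1]:.2f}] (truth: 3.0)")
```

The spread of the behavioral parameter sets (not a single calibrated value) is what would be carried into the prediction, which is precisely the practice the essay argues is missing outside uncertainty-focused studies.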

  15. A Two-Step Approach to Uncertainty Quantification of Core Simulators

    DOE PAGES

    Yankov, Artem; Collins, Benjamin; Klein, Markus; ...

    2012-01-01

    Of the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to which method is currently more viable for computing uncertainties in burnup and transient calculations.
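
The contrast drawn above, perturbation-theory-based propagation versus stochastic sampling, can be illustrated on a toy one-group expression for the multiplication factor, where the two approaches agree for a nearly linear response. The "sandwich rule" here stands in for the generalized-perturbation-theory step; all cross sections and uncertainties are invented:

```python
import random, math, statistics

random.seed(2)

# Toy response: k_inf = nu * sigma_f / sigma_a (illustrative one-group model)
NU = 2.43
SIG_F, SIG_F_STD = 0.05, 0.002   # fission cross section and its 1-sigma uncertainty
SIG_A, SIG_A_STD = 0.06, 0.0015  # absorption cross section and its 1-sigma uncertainty

def k_inf(sf, sa):
    return NU * sf / sa

# --- Method 1: first-order "sandwich rule" (perturbation-theory style):
# variance = sum of (sensitivity * input std)^2 for uncorrelated inputs
dk_dsf = NU / SIG_A
dk_dsa = -NU * SIG_F / SIG_A ** 2
var_k = (dk_dsf * SIG_F_STD) ** 2 + (dk_dsa * SIG_A_STD) ** 2
sandwich_std = math.sqrt(var_k)

# --- Method 2: stochastic sampling (XSUSA-style): perturb inputs, rerun the model
samples = [k_inf(random.gauss(SIG_F, SIG_F_STD), random.gauss(SIG_A, SIG_A_STD))
           for _ in range(20000)]
sampling_std = statistics.stdev(samples)

print(f"sandwich: {sandwich_std:.4f}, sampling: {sampling_std:.4f}")
```

For strongly nonlinear responses (e.g., through burnup) the two estimates can diverge, which is why the benchmark compares them on a realistic core rather than a formula.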

  16. A roadmap for improving the representation of photosynthesis in Earth System Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rogers, Alistair; Medlyn, Belinda E.; Dukes, Jeffrey S.

    Accurate representation of photosynthesis in terrestrial biosphere models (TBMs) is essential for robust projections of global change. However, current representations vary markedly between TBMs, contributing uncertainty to projections of global carbon fluxes.

  17. A roadmap for improving the representation of photosynthesis in Earth System Models

    DOE PAGES

    Rogers, Alistair; Medlyn, Belinda E.; Dukes, Jeffrey S.; ...

    2016-11-28

    Accurate representation of photosynthesis in terrestrial biosphere models (TBMs) is essential for robust projections of global change. However, current representations vary markedly between TBMs, contributing uncertainty to projections of global carbon fluxes.

  18. Evaluation of wind induced currents modeling along the Southern Caspian Sea

    NASA Astrophysics Data System (ADS)

    Bohluly, Asghar; Esfahani, Fariba Sadat; Montazeri Namin, Masoud; Chegini, Fatemeh

    2018-02-01

    To improve our understanding of Caspian Sea hydrodynamics, its circulation is simulated with special focus on the wind-driven currents of the southern basin. The hydrodynamic models are forced with a newly developed fine-resolution wind field to increase the accuracy of current modeling. A 2D shallow water equation model and a 3D baroclinic model are applied separately to examine the performance of each model for specific applications in the Caspian Sea. The model results are validated against recent field measurements, including AWAC current measurements and temperature observations in the southern continental shelf region. Results show that the 2D model predicts the depth-averaged current speed well in storm conditions in the narrow coastal strip of the southern coasts. This finding suggests that 2D modeling is a more affordable method for extreme current-speed analysis on the continental shelf. On the other hand, the 3D model performs better in reproducing the monthly mean circulation and is therefore preferable for the surface circulation of the Caspian Sea. Monthly sea surface circulation fields of the southern basin reveal a dipole cyclonic-anticyclonic pattern, a dominant eastward current along the southern coasts that intensifies from May to November, and a dominant southward current along the eastern coasts in all months except February, when the flow is northward. Monthly mean wind fields exhibit two main patterns: a north-south pattern occurring in the warm months and a collision of two wind fronts, especially in the cold months, which occurs over a narrow region of the southern continental shelf. Because of these wind-field complexities, this collision is a major source of uncertainty in predicting the wind-driven currents; however, it is significantly alleviated by applying a fine-resolution wind field.

  19. Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models

    NASA Astrophysics Data System (ADS)

    Pallant, Amy; Lee, Hee-Sun

    2015-04-01

    Modeling and argumentation are two important scientific practices students need to develop throughout their school years. In this paper, we investigated how middle and high school students (N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation tasks with three increasingly complex dynamic climate models. Each task consisted of four parts: a multiple-choice claim, an open-ended explanation, a five-point Likert-scale uncertainty rating, and an open-ended uncertainty rationale. We coded 1,294 scientific arguments in terms of the claim's consistency with current scientific consensus, whether the explanation was model based or knowledge based, and the source of uncertainty (personal vs. scientific). We used chi-square and ANOVA tests to identify significant patterns. Results indicate that (1) a majority of students incorporated models as evidence to support their claims, (2) most students used model output results shown on graphs to confirm their claim rather than to explain simulated molecular processes, (3) students' dependence on model results and their uncertainty rating diminished as the dynamic climate models became more and more complex, (4) some students' misconceptions interfered with observing and interpreting model results or simulated processes, and (5) students' uncertainty rationales more frequently reflected their assessment of personal knowledge or abilities related to the tasks than a critical examination of the scientific evidence resulting from models. These findings have implications for teaching and research related to the integration of scientific argumentation and modeling practices to address complex Earth systems.

  20. Estimating the spatial distribution of wintering little brown bat populations in the eastern United States

    USGS Publications Warehouse

    Russell, Robin E.; Tinsley, Karl; Erickson, Richard A.; Thogmartin, Wayne E.; Szymanski, Jennifer A.

    2014-01-01

    Depicting the spatial distribution of wildlife species is an important first step in developing management and conservation programs for particular species. Accurate representation of a species' distribution is important for predicting the effects of climate change, land-use change, management activities, disease, and other landscape-level processes on wildlife populations. We developed models to estimate the spatial distribution of little brown bat (Myotis lucifugus) wintering populations in the United States east of the 100th meridian, based on known hibernacula locations. From these data, we developed several scenarios of wintering population counts per county that incorporated uncertainty in the spatial distribution of the hibernacula as well as uncertainty in the size of the current little brown bat population. We then assessed the variability in our results arising from these uncertainties. Despite considerable uncertainty in the known locations of overwintering little brown bats in the eastern United States, we believe that models that accurately depict the effects of this uncertainty are useful for making management decisions, as they provide a coherent organization of the best available information.

  1. Effects of Boron and Graphite Uncertainty in Fuel for TREAT Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vaughn, Kyle; Mausolff, Zander; Gonzalez, Esteban

    Advanced modeling techniques and current computational capacity make full-core TREAT simulations possible, the goal of such simulations being to understand the pre-test core and minimize the number of required calibrations. However, simulating TREAT with a high degree of precision requires that the reactor materials and geometry also be modeled with equal precision. This paper examines how uncertainty in the reported values of boron and graphite affects simulations of TREAT.

  2. Nuclear Data Uncertainties for Typical LWR Fuel Assemblies and a Simple Reactor Core

    NASA Astrophysics Data System (ADS)

    Rochman, D.; Leray, O.; Hursin, M.; Ferroukhi, H.; Vasiliev, A.; Aures, A.; Bostelmann, F.; Zwermann, W.; Cabellos, O.; Diez, C. J.; Dyrda, J.; Garcia-Herranz, N.; Castro, E.; van der Marck, S.; Sjöstrand, H.; Hernandez, A.; Fleming, M.; Sublet, J.-Ch.; Fiorito, L.

    2017-01-01

    The impact of the covariances in current nuclear data libraries, such as ENDF/B-VII.1, JEFF-3.2, JENDL-4.0, SCALE and TENDL, on relevant current reactors is presented in this work. The uncertainties due to nuclear data are calculated for existing PWR and BWR fuel assemblies (with burn-up up to 40 GWd/tHM, followed by 10 years of cooling time) and for a simplified PWR full-core model (without burn-up) for quantities such as k∞, macroscopic cross sections, pin power and isotope inventory. The method of propagation of uncertainties is based on random sampling of nuclear data, either from covariance files or directly from basic parameters. Additionally, possible biases on calculated quantities, such as those from the self-shielding treatment, are investigated. Different calculation schemes are used, based on CASMO, SCALE, DRAGON, MCNP or FISPACT-II, thus simulating real-life assignments for technical-support organizations. The outcome of such a study is a comparison of uncertainties with two consequences. First, although this study is not expected to yield identical results across the calculation schemes involved, it provides insight into what can happen when calculating uncertainties and gives some perspective on the range of validity of these uncertainties. Second, it paints a picture of the current state of knowledge, using existing nuclear data library covariances and current methods.
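
The "random sampling of nuclear data from covariance files" step amounts to drawing correlated perturbations consistent with a given covariance matrix, e.g., through its Cholesky factor. The 3x3 relative covariance and the nominal cross sections below are illustrative assumptions, not values from any of the libraries named above:

```python
import random, math

random.seed(3)

def cholesky(a):
    """Lower-triangular Cholesky factor of a symmetric positive-definite matrix."""
    n = len(a)
    l = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(l[i][k] * l[j][k] for k in range(j))
            if i == j:
                l[i][j] = math.sqrt(a[i][i] - s)
            else:
                l[i][j] = (a[i][j] - s) / l[j][j]
    return l

# Illustrative relative covariance matrix for three cross sections
# (diagonal: relative variances; off-diagonal: covariances between reactions)
cov = [[4.0e-4, 1.0e-4, 0.0],
       [1.0e-4, 2.5e-4, 5.0e-5],
       [0.0,    5.0e-5, 9.0e-4]]
L = cholesky(cov)
nominal = [0.05, 0.06, 0.30]  # nominal cross sections (illustrative)

def sample_xs():
    """One 'random nuclear-data file': nominals with correlated relative perturbations."""
    z = [random.gauss(0, 1) for _ in nominal]
    pert = [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(len(nominal))]
    return [x * (1.0 + p) for x, p in zip(nominal, pert)]

files = [sample_xs() for _ in range(5000)]
# verify the sampled relative standard deviation matches the covariance diagonal
rel0 = [(f[0] - nominal[0]) / nominal[0] for f in files]
mean0 = sum(rel0) / len(rel0)
std0 = math.sqrt(sum((r - mean0) ** 2 for r in rel0) / (len(rel0) - 1))
print(f"sampled rel. std of xs[0]: {std0:.4f} (target {math.sqrt(cov[0][0]):.4f})")
```

Each sampled "file" would then be fed through the lattice or depletion code, and the spread of the outputs gives the nuclear-data uncertainty on k∞, pin power, or inventory.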

  3. A Bayesian Hierarchical Modeling Approach to Predicting Flow in Ungauged Basins

    NASA Astrophysics Data System (ADS)

    Gronewold, A.; Alameddine, I.; Anderson, R. M.

    2009-12-01

    Recent innovative approaches to identifying and applying regression-based relationships between land use patterns (such as increasing impervious surface area and decreasing vegetative cover) and rainfall-runoff model parameters represent novel and promising improvements to predicting flow from ungauged basins. In particular, these approaches allow for predicting flows under uncertain and potentially variable future conditions due to rapid land cover changes, variable climate conditions, and other factors. Despite the broad range of literature on estimating rainfall-runoff model parameters, however, the absence of a robust set of modeling tools for identifying and quantifying uncertainties in (and correlation between) rainfall-runoff model parameters represents a significant gap in current hydrological modeling research. Here, we build upon a series of recent publications promoting novel Bayesian and probabilistic modeling strategies for quantifying rainfall-runoff model parameter estimation uncertainty. Our approach applies alternative measures of rainfall-runoff model parameter joint likelihood (including Nash-Sutcliffe efficiency, among others) to simulate samples from the joint parameter posterior probability density function. We then use these correlated samples as response variables in a Bayesian hierarchical model with land use coverage data as predictor variables in order to develop a robust land use-based tool for forecasting flow in ungauged basins while accounting for, and explicitly acknowledging, parameter estimation uncertainty. We apply this modeling strategy to low-relief coastal watersheds of Eastern North Carolina, an area representative of coastal resource waters throughout the world because of its sensitive embayments and because of the abundant (but currently threatened) natural resources it hosts. 
Consequently, this area is the subject of several ongoing studies and large-scale planning initiatives, including those conducted through the United States Environmental Protection Agency (USEPA) total maximum daily load (TMDL) program, as well as those addressing coastal population dynamics and sea level rise. Our approach has several advantages, including the propagation of parameter uncertainty through a nonparametric probability distribution which avoids common pitfalls of fitting parameters and model error structure to a predetermined parametric distribution function. In addition, by explicitly acknowledging correlation between model parameters (and reflecting those correlations in our predictive model) our model yields relatively efficient prediction intervals (unlike those in the current literature which are often unnecessarily large, and may lead to overly-conservative management actions). Finally, our model helps improve understanding of the rainfall-runoff process by identifying model parameters (and associated catchment attributes) which are most sensitive to current and future land use change patterns. Disclaimer: Although this work was reviewed by EPA and approved for publication, it may not necessarily reflect official Agency policy.
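
The core idea above, carrying correlated parameter-posterior samples into a land-use regression rather than point estimates, can be sketched on a hypothetical dataset. The basins, the parameter values, and the impervious fractions are invented; a full hierarchical Bayesian fit is replaced here by simple resampling for brevity:

```python
import random

random.seed(4)

# Hypothetical calibration results: for each gauged basin, posterior samples of a
# runoff-model parameter (e.g., a recession constant) plus the basin's impervious
# surface fraction. All numbers are illustrative.
basins = [
    {"impervious": 0.05, "samples": [random.gauss(0.80, 0.05) for _ in range(500)]},
    {"impervious": 0.20, "samples": [random.gauss(0.65, 0.06) for _ in range(500)]},
    {"impervious": 0.40, "samples": [random.gauss(0.45, 0.07) for _ in range(500)]},
    {"impervious": 0.60, "samples": [random.gauss(0.30, 0.08) for _ in range(500)]},
]

def ols(xs, ys):
    """Ordinary least squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Propagate parameter-estimation uncertainty: refit the land-use regression many
# times, each time drawing one posterior sample per basin instead of a point estimate.
slopes = []
for _ in range(2000):
    xs = [b["impervious"] for b in basins]
    ys = [random.choice(b["samples"]) for b in basins]
    slopes.append(ols(xs, ys)[0])

slopes.sort()
lo, hi = slopes[int(0.05 * len(slopes))], slopes[int(0.95 * len(slopes))]
print(f"slope of parameter vs. impervious fraction: 90% interval [{lo:.2f}, {hi:.2f}]")
```

The resulting interval on the regression slope is what allows prediction in an ungauged basin to explicitly acknowledge parameter estimation uncertainty.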

  4. Characterizing uncertainty when evaluating risk management metrics: risk assessment modeling of Listeria monocytogenes contamination in ready-to-eat deli meats.

    PubMed

    Gallagher, Daniel; Ebel, Eric D; Gallagher, Owen; Labarre, David; Williams, Michael S; Golden, Neal J; Pouillot, Régis; Dearfield, Kerry L; Kause, Janell

    2013-04-01

    This report illustrates how uncertainty about food safety metrics may influence the selection of a performance objective (PO). To accomplish this goal, we developed a model of Listeria monocytogenes in ready-to-eat (RTE) deli meats. This application used a second-order Monte Carlo model that simulates L. monocytogenes concentrations through a series of steps: the food-processing establishment, transport, retail, the consumer's home, and consumption. The model accounted for growth-inhibitor use and retail cross-contamination, and applied an FAO/WHO dose-response model to evaluate the probability of illness. An appropriate level of protection (ALOP) risk metric was selected as the average risk of illness per serving across all servings consumed per annum, and the model was used to solve for the corresponding performance objective (PO) risk metric as the maximum allowable L. monocytogenes concentration (cfu/g) at the processing establishment, where regulatory monitoring would occur. Given uncertainty about model inputs, an uncertainty distribution of the PO was estimated. Additionally, we considered how RTE deli meats contaminated at levels above the PO would be handled by the industry using three alternative approaches. Points on the PO distribution represent the probability that, if the industry complies with a particular PO, the resulting risk per serving is less than or equal to the target ALOP. For example, assuming (1) a target ALOP of -6.41 log10 risk of illness per serving, (2) industry concentrations above the PO that are re-distributed throughout the remaining concentration distribution and (3) no dose-response uncertainty, establishment POs of -4.98 and -4.39 log10 cfu/g would be required for 90% and 75% confidence that the target ALOP is met, respectively. 
The PO concentrations from this example scenario are more stringent than the current typical monitoring level of an absence in 25 g (i.e., -1.40 log10 cfu/g) or a stricter criteria of absence in 125 g (i.e., -2.1 log10 cfu/g). This example, and others, demonstrates that a PO for L. monocytogenes would be far below any current monitoring capabilities. Furthermore, this work highlights the demands placed on risk managers and risk assessors when applying uncertain risk models to the current risk metric framework. Copyright © 2013 Elsevier B.V. All rights reserved.
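
The solve-for-the-PO step described above can be sketched with a deliberately simplified risk model: assume (purely for illustration, this is not the report's model) a log-linear link between establishment concentration and per-serving risk with an uncertain intercept and slope, and solve for the concentration that meets the target ALOP in each uncertainty draw:

```python
import random

random.seed(5)

TARGET_ALOP = -6.41  # log10 risk of illness per serving (value from the abstract)

# Illustrative log-linear link between the concentration at the establishment
# (log10 cfu/g) and the resulting per-serving risk after growth and consumption.
# The intercept, slope, and their uncertainties are assumptions of this sketch.
pos = []
for _ in range(10000):
    a = random.gauss(-2.0, 0.5)   # log10 risk at 0 log10 cfu/g (uncertain)
    b = random.gauss(1.0, 0.1)    # risk increase per log10 cfu/g (uncertain)
    if b <= 0.2:
        continue
    # solve a + b * C = TARGET_ALOP for the concentration C meeting the ALOP
    pos.append((TARGET_ALOP - a) / b)

pos.sort()
po_median = pos[len(pos) // 2]
po_90 = pos[int(0.10 * len(pos))]  # stricter PO giving 90% confidence the ALOP is met
print(f"median PO: {po_median:.2f} log10 cfu/g; 90%-confidence PO: {po_90:.2f}")
```

The gap between the median and the 90%-confidence PO mirrors the report's point: demanding higher confidence under input uncertainty pushes the PO to stricter (more negative) concentrations.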

  5. Comparison of two optimization algorithms for fuzzy finite element model updating for damage detection in a wind turbine blade

    NASA Astrophysics Data System (ADS)

    Turnbull, Heather; Omenzetter, Piotr

    2018-03-01

    Difficulties associated with current health monitoring and inspection practices, combined with the harsh, often remote, operational environments of wind turbines, highlight the need for a non-destructive evaluation system capable of remotely monitoring the current structural state of turbine blades. This research adopted a physics-based structural health monitoring methodology through calibration of a finite element model using inverse techniques. A 2.36 m blade from a 5 kW turbine was used as the experimental specimen, with operational modal analysis techniques used to identify the modal properties of the system. By modelling the experimental responses as fuzzy numbers using the sub-level technique, uncertainty in the response parameters was propagated back through the model and into the updating parameters. Initially, experimental responses of the blade were obtained, and a numerical model of the blade was created and updated. Deterministic updating was carried out through formulation and minimisation of a deterministic objective function using both the firefly algorithm and the virus optimisation algorithm. Uncertainty in the experimental responses was modelled using triangular membership functions, allowing membership functions of the updating parameters (Young's modulus and shear modulus) to be obtained. The firefly and virus optimisation algorithms were again utilised, this time in the solution of fuzzy objective functions, enabling the uncertainty associated with the updating parameters to be quantified. Varying damage location and severity were simulated experimentally through the addition of small masses to the structure intended to cause a structural alteration. A damaged model, representing four nonstructural masses of variable magnitude at predefined points, was created and updated to provide a deterministic damage prediction and, via fuzzy updating, information on the uncertainty of the parameters.
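
The sub-level (alpha-cut) technique above can be shown on a deliberately simple one-degree-of-freedom example where the inverse model is monotone, so interval endpoints suffice; the real blade problem needs an optimizer such as the firefly algorithm because the frequency-to-parameter map is not monotone in general. The mass and the triangular fuzzy frequency below are invented:

```python
import math

M = 12.0  # kg, assumed known lumped mass of the toy model

def freq(k):
    """Natural frequency (Hz) of a 1-DOF system with stiffness k (N/m)."""
    return math.sqrt(k / M) / (2.0 * math.pi)

def k_from_freq(f):
    """Inverse model: stiffness that reproduces a given frequency."""
    return M * (2.0 * math.pi * f) ** 2

# Measured frequency as a triangular fuzzy number (left, peak, right), in Hz
F_TRI = (9.5, 10.0, 10.6)

# Sub-level (alpha-cut) technique: at each membership level alpha, the fuzzy
# frequency becomes an interval; propagate its endpoints through the inverse
# model to get the interval of the updating parameter k at that level.
for alpha in (0.0, 0.5, 1.0):
    f_lo = F_TRI[0] + alpha * (F_TRI[1] - F_TRI[0])
    f_hi = F_TRI[2] - alpha * (F_TRI[2] - F_TRI[1])
    k_lo, k_hi = k_from_freq(f_lo), k_from_freq(f_hi)
    print(f"alpha={alpha:.1f}: f in [{f_lo:.2f}, {f_hi:.2f}] Hz -> k in [{k_lo:.0f}, {k_hi:.0f}] N/m")
```

Stacking the parameter intervals over all alpha levels reconstructs the membership function of the updating parameter, which is what the fuzzy objective-function optimisation delivers for Young's and shear moduli in the study.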

  6. Advanced Small Modular Reactor Economics Model Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, Thomas J.

    2014-10-01

    The US Department of Energy Office of Nuclear Energy’s Advanced Small Modular Reactor (SMR) research and development activities focus on four key areas: developing assessment methods for evaluating advanced SMR technologies and characteristics; developing and testing materials, fuels and fabrication techniques; resolving key regulatory issues identified by the US Nuclear Regulatory Commission and industry; and developing advanced instrumentation, controls and human-machine interfaces. This report focuses on the development of assessment methods to evaluate advanced SMR technologies and characteristics. Specifically, it describes the expansion and application of the economic modeling effort at Oak Ridge National Laboratory. Analysis of the current modeling methods shows that one of the primary concerns for the modeling effort is the handling of uncertainty in cost estimates. Monte Carlo–based methods are commonly used to handle uncertainty, especially when implemented as a stand-alone script in a program such as Python or MATLAB. However, a script-based model requires each potential user to have access to an interpreter capable of running the script. Making the model accessible to multiple independent analysts is best accomplished by implementing it in a common computing tool such as Microsoft Excel. Excel is readily available and accessible to most system analysts, but it is not designed for straightforward implementation of a Monte Carlo–based method. Using a Monte Carlo algorithm requires in-spreadsheet scripting and statistical analyses or the use of add-ons such as Crystal Ball. An alternative method uses propagation-of-error calculations in the existing Excel-based system to estimate system cost uncertainty. This method has the advantage of using Microsoft Excel as is, but it requires simplifying assumptions. These assumptions do not necessarily call the analytical results into question. 
In fact, the analysis shows that the propagation-of-error method introduces essentially negligible error, especially when compared to the uncertainty associated with some of the estimates themselves. The results of these uncertainty analyses quantify and identify the sources of uncertainty in the overall cost estimation. The obvious generalization, that capital cost uncertainty is the main driver, can be shown to be accurate for the current state of reactor cost analysis. However, the detailed component-by-component analysis helps demonstrate which components would benefit most from research and development to decrease the uncertainty, as well as which would benefit from research and development to decrease the absolute cost.
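
The near-agreement between propagation of error and Monte Carlo claimed above is easy to check on a toy cost breakdown: for a sum of independent components, the variances add, so the two methods coincide up to sampling noise. Component names and numbers are illustrative, not ORNL estimates:

```python
import random, math, statistics

random.seed(6)

# Illustrative SMR cost components: (mean, 1-sigma) in $M; assumed independent
components = {
    "capital":         (3000.0, 600.0),
    "O&M":             ( 900.0, 150.0),
    "fuel":            ( 400.0,  60.0),
    "decommissioning": ( 200.0,  50.0),
}

# --- Propagation of error: for a sum of independent terms, variances add
prop_std = math.sqrt(sum(sd ** 2 for _, sd in components.values()))

# --- Monte Carlo: sample each component and total them
totals = [sum(random.gauss(mu, sd) for mu, sd in components.values())
          for _ in range(20000)]
mc_std = statistics.stdev(totals)

print(f"propagation-of-error std: {prop_std:.0f} $M, Monte Carlo std: {mc_std:.0f} $M")
```

The example also illustrates the report's "capital cost dominates" observation: the 600 $M capital term contributes the overwhelming share of the total variance.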

  7. Potential of European 14CO2 observation network to estimate the fossil fuel CO2 emissions via atmospheric inversions

    NASA Astrophysics Data System (ADS)

    Wang, Yilong; Broquet, Grégoire; Ciais, Philippe; Chevallier, Frédéric; Vogel, Felix; Wu, Lin; Yin, Yi; Wang, Rong; Tao, Shu

    2018-03-01

    Combining measurements of atmospheric CO2 and its radiocarbon (14CO2) fraction and transport modeling in atmospheric inversions offers a way to derive improved estimates of CO2 emitted from fossil fuel (FFCO2). In this study, we solve for the monthly FFCO2 emission budgets at regional scale (i.e., the size of a medium-sized country in Europe) and investigate the performance of different observation networks and sampling strategies across Europe. The inversion system is built on the LMDZv4 global transport model at 3.75° × 2.5° resolution. We conduct Observing System Simulation Experiments (OSSEs) and use two types of diagnostics to assess the potential of the observation and inverse modeling frameworks. The first one relies on the theoretical computation of the uncertainty in the estimate of emissions from the inversion, known as posterior uncertainty, and on the uncertainty reduction compared to the uncertainty in the inventories of these emissions, which are used as a prior knowledge by the inversion (called prior uncertainty). The second one is based on comparisons of prior and posterior estimates of the emission to synthetic true emissions when these true emissions are used beforehand to generate the synthetic fossil fuel CO2 mixing ratio measurements that are assimilated in the inversion. With 17 stations currently measuring 14CO2 across Europe using 2-week integrated sampling, the uncertainty reduction for monthly FFCO2 emissions in a country where the network is rather dense like Germany, is larger than 30 %. With the 43 14CO2 measurement stations planned in Europe, the uncertainty reduction for monthly FFCO2 emissions is increased for the UK, France, Italy, eastern Europe and the Balkans, depending on the configuration of prior uncertainty. 
Further increasing the number of stations or the sampling frequency improves the uncertainty reduction (up to 40 to 70 %) in high emitting regions, but the performance of the inversion remains limited over low-emitting regions, even assuming a dense observation network covering the whole of Europe. This study also shows that both the theoretical uncertainty reduction (and resulting posterior uncertainty) from the inversion and the posterior estimate of emissions itself, for a given prior and true estimate of the emissions, are highly sensitive to the choice between two configurations of the prior uncertainty derived from the general estimate by inventory compilers or computations on existing inventories. In particular, when the configuration of the prior uncertainty statistics in the inversion system does not match the difference between these prior and true estimates, the posterior estimate of emissions deviates significantly from the truth. This highlights the difficulty of filtering the targeted signal in the model-data misfit for this specific inversion framework, the need to strongly rely on the prior uncertainty characterization for this and, consequently, the need for improved estimates of the uncertainties in current emission inventories for real applications with actual data. We apply the posterior uncertainty in annual emissions to the problem of detecting a trend of FFCO2, showing that increasing the monitoring period (e.g., more than 20 years) is more efficient than reducing uncertainty in annual emissions by adding stations. 
The coarse spatial resolution of the atmospheric transport model used in this OSSE (typical of models used for global inversions of natural CO2 fluxes) leads to large representation errors (related to the inability of the transport model to capture the spatial variability of the actual fluxes and mixing ratios at subgrid scales), which is a key limitation of our OSSE setup to improve the accuracy of the monitoring of FFCO2 emissions in European regions. Using a high-resolution transport model should improve the potential to retrieve FFCO2 emissions, and this needs to be investigated.
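
For a single emission budget observed by independent measurements, the posterior-uncertainty and uncertainty-reduction diagnostics used above reduce, in a linear Gaussian inversion, to a closed form: posterior precision is prior precision plus the summed observation information. The numbers below (prior uncertainty, sensitivity, observation-space error) are invented to show the qualitative behaviour, not taken from the study:

```python
import math

def uncertainty_reduction(sigma_prior, h, sigma_obs, n_obs):
    """
    Posterior uncertainty of one emission budget in a linear Gaussian inversion
    with n_obs independent observations, each with sensitivity h to the budget
    and observation-space error sigma_obs (transport + representation + measurement).
    Returns (posterior sigma, uncertainty reduction = 1 - sigma_post / sigma_prior).
    """
    precision = 1.0 / sigma_prior ** 2 + n_obs * (h / sigma_obs) ** 2
    sigma_post = math.sqrt(1.0 / precision)
    return sigma_post, 1.0 - sigma_post / sigma_prior

# Illustrative numbers: 40% prior uncertainty on a monthly budget, and samples
# whose individual information content is modest relative to the obs error
sigma_prior = 0.40
for n in (17, 43, 100):
    sp, ur = uncertainty_reduction(sigma_prior, h=1.0, sigma_obs=2.0, n_obs=n)
    print(f"{n:3d} samples: posterior {sp:.3f}, uncertainty reduction {100 * ur:.0f}%")
```

The diminishing returns as n grows (the square-root dependence of the posterior sigma) is one reason the study finds that lengthening the monitoring period can beat adding stations for trend detection.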

  8. Theoretical uncertainties in the calculation of supersymmetric dark matter observables

    NASA Astrophysics Data System (ADS)

    Bergeron, Paul; Sandick, Pearl; Sinha, Kuver

    2018-05-01

    We estimate the current theoretical uncertainty in supersymmetric dark matter predictions by comparing several state-of-the-art calculations within the minimal supersymmetric standard model (MSSM). We consider standard neutralino dark matter scenarios — coannihilation, well-tempering, pseudoscalar resonance — and benchmark models both in the pMSSM framework and in frameworks with Grand Unified Theory (GUT)-scale unification of supersymmetric mass parameters. The pipelines we consider are constructed from the publicly available software packages SOFTSUSY, SPheno, FeynHiggs, SusyHD, micrOMEGAs, and DarkSUSY. We find that the theoretical uncertainty in the relic density as calculated by different pipelines, in general, far exceeds the statistical errors reported by the Planck collaboration. In GUT models, in particular, the relative discrepancies in the results reported by different pipelines can be as much as a few orders of magnitude. We find that these discrepancies are especially pronounced for cases where the dark matter physics relies critically on calculations related to electroweak symmetry breaking, which we investigate in detail, and for coannihilation models, where there is heightened sensitivity to the sparticle spectrum. The dark matter annihilation cross section today and the scattering cross section with nuclei also suffer appreciable theoretical uncertainties, which, as experiments reach the relevant sensitivities, could lead to uncertainty in conclusions regarding the viability or exclusion of particular models.

  9. How uncertain are climate model projections of water availability indicators across the Middle East?

    PubMed

    Hemming, Debbie; Buontempo, Carlo; Burke, Eleanor; Collins, Mat; Kaye, Neil

    2010-11-28

    The projection of robust regional climate changes over the next 50 years presents a considerable challenge for the current generation of climate models. Water cycle changes are particularly difficult to model in this area because major uncertainties exist in the representation of processes such as large-scale and convective rainfall and their feedback with surface conditions. We present climate model projections and uncertainties in water availability indicators (precipitation, run-off and drought index) for the 1961-1990 and 2021-2050 periods. Ensembles from two global climate models (GCMs) and one regional climate model (RCM) are used to examine different elements of uncertainty. Although all three ensembles capture the general distribution of observed annual precipitation across the Middle East, the RCM is consistently wetter than observations, especially over the mountainous areas. All future projections show decreasing precipitation (ensemble median between -5 and -25%) in coastal Turkey and parts of Lebanon, Syria and Israel and consistent run-off and drought index changes. The Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) GCM ensemble exhibits drying across the north of the region, whereas the Met Office Hadley Centre Quantifying Uncertainties in Model Projections-Atmospheric (QUMP-A) GCM and RCM ensembles show slight drying in the north and significant wetting in the south. RCM projections also show greater sensitivity (both wetter and drier) and a wider uncertainty range than QUMP-A. The nature of these uncertainties suggests that both large-scale circulation patterns, which influence region-wide drying/wetting patterns, and regional-scale processes, which affect localized water availability, are important sources of uncertainty in these projections.
To reduce large uncertainties in water availability projections, it is suggested that efforts would be well placed to focus on the understanding and modelling of both large-scale processes and their teleconnections with Middle East climate and localized processes involved in orographic precipitation.
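The ensemble statistics quoted above (medians and percentile ranges of percent change) reduce to a short computation once member-level projections are in hand. The sketch below uses a hypothetical 17-member ensemble at a single grid cell; the member count, baseline precipitation, and drying factor are invented for illustration, not QUMP-A output.

```python
import numpy as np

# Hypothetical 17-member ensemble at one grid cell; the baseline mean,
# spread, and drying factor are invented, not QUMP-A output.
rng = np.random.default_rng(3)
base = rng.normal(600.0, 40.0, size=17)           # 1961-1990 annual precip (mm)
future = base * rng.normal(0.85, 0.08, size=17)   # 2021-2050, ~15% median drying

change_pct = 100.0 * (future - base) / base
median = np.median(change_pct)
lo, hi = np.percentile(change_pct, [10, 90])
print(round(median, 1), round(lo, 1), round(hi, 1))
```

The median and the 10-90% spread computed this way are the kind of "ensemble median" and "uncertainty range" figures reported in the abstract.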

  10. Predictions of space radiation fatality risk for exploration missions.

    PubMed

    Cucinotta, Francis A; To, Khiet; Cacao, Eliedonna

    2017-05-01

    In this paper we describe revisions to the NASA Space Cancer Risk (NSCR) model focusing on updates to probability distribution functions (PDF) representing the uncertainties in the radiation quality factor (QF) model parameters and the dose and dose-rate reduction effectiveness factor (DDREF). We integrate recent heavy ion data on liver, colorectal, intestinal, lung, and Harderian gland tumors with other data from fission neutron experiments into the model analysis. In an earlier work we introduced distinct QFs for leukemia and solid cancer risk predictions, and here we consider liver cancer risks separately because of the higher RBEs reported in mouse experiments compared to other tumor types, and distinct risk factors for liver cancer for astronauts compared to the U.S. population. The revised model is used to make predictions of fatal cancer and circulatory disease risks for 1-year deep space and International Space Station (ISS) missions, and a 940-day Mars mission. We analyzed the contribution of the various model parameter uncertainties to the overall uncertainty, which shows that the uncertainties in relative biological effectiveness (RBE) factors at high LET due to statistical uncertainties and differences across tissue types and mouse strains are the dominant uncertainty. NASA's exposure limits are approached or exceeded for each mission scenario considered. Two main conclusions are made: 1) Reducing the current estimate of about a 3-fold uncertainty to a 2-fold or lower uncertainty will require much more expansive animal carcinogenesis studies in order to reduce statistical uncertainties and understand tissue, sex and genetic variations. 2) Alternative model assumptions such as non-targeted effects, increased tumor lethality and decreased latency at high LET, and non-cancer mortality risks from circulatory diseases could significantly increase risk estimates to several times higher than the NASA limits. Copyright © 2017 The Committee on Space Research (COSPAR).
Published by Elsevier Ltd. All rights reserved.
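The "x-fold" uncertainty language above has a simple Monte Carlo reading: propagate the parameter PDFs through the risk model and take the ratio of an upper percentile to the median. The sketch below does this for a multiplicative toy model; the lognormal spreads are placeholders, not the NSCR model's actual PDFs for the quality factor or DDREF.

```python
import numpy as np

# Illustrative Monte Carlo propagation of multiplicative risk-model
# uncertainties. The lognormal spreads below are placeholders, not the
# NSCR model's actual PDFs for the quality factor (QF) and DDREF.
rng = np.random.default_rng(7)
n = 100_000

r0 = 1.0                                       # nominal low-LET risk (arbitrary units)
qf = rng.lognormal(mean=np.log(10.0), sigma=0.40, size=n)
ddref = rng.lognormal(mean=np.log(2.0), sigma=0.25, size=n)
risk = r0 * qf / ddref

fold = np.percentile(risk, 97.5) / np.median(risk)   # upper "x-fold" uncertainty
print(round(fold, 2))
```

For these illustrative spreads the upper fold uncertainty comes out around 2.5; the abstract's "about a 3-fold uncertainty" is the analogous quantity for the full NSCR parameter set.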

  11. Impact of Exposure Uncertainty on the Association between Perfluorooctanoate and Preeclampsia in the C8 Health Project Population.

    PubMed

    Avanasi, Raghavendhran; Shin, Hyeong-Moo; Vieira, Verónica M; Savitz, David A; Bartell, Scott M

    2016-01-01

    Uncertainty in exposure estimates from models can result in exposure measurement error and can potentially affect the validity of epidemiological studies. We recently used a suite of environmental models and an integrated exposure and pharmacokinetic model to estimate individual perfluorooctanoate (PFOA) serum concentrations and assess the association with preeclampsia from 1990 through 2006 for the C8 Health Project participants. The aims of the current study are to evaluate the impact of uncertainty in estimated PFOA drinking-water concentrations on estimated serum concentrations and their reported epidemiological association with preeclampsia. For each individual public water district, we used Monte Carlo simulations to vary the year-by-year PFOA drinking-water concentration by randomly sampling from lognormal distributions for random error in the yearly public water district PFOA concentrations, systematic error specific to each water district, and global systematic error in the release assessment (using the estimated concentrations from the original fate and transport model as medians and a range of 2-, 5-, and 10-fold uncertainty). Uncertainty in PFOA water concentrations could cause major changes in estimated serum PFOA concentrations among participants. However, there is relatively little impact on the resulting epidemiological association in our simulations. The contribution of exposure uncertainty to the total uncertainty (including regression parameter variance) ranged from 5% to 31%, and bias was negligible. We found that correlated exposure uncertainty can substantially change estimated PFOA serum concentrations, but results in only minor impacts on the epidemiological association between PFOA and preeclampsia. Avanasi R, Shin HM, Vieira VM, Savitz DA, Bartell SM. 2016. Impact of exposure uncertainty on the association between perfluorooctanoate and preeclampsia in the C8 Health Project population.
Environ Health Perspect 124:126-132; http://dx.doi.org/10.1289/ehp.1409044.
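The sampling scheme described above (fate-and-transport estimates as lognormal medians, with 2-, 5-, or 10-fold uncertainty factors) can be sketched in a few lines. The median concentration, the units, and the reading of "k-fold" as the ratio of the 97.5th percentile to the median are assumptions for illustration only.

```python
import numpy as np

# Hypothetical illustration: the 0.5 ug/L median and the fold factor are
# invented; only the sampling scheme follows the study's description.
rng = np.random.default_rng(42)

def sample_water_conc(median, fold, n_draws):
    """Draw water concentrations from a lognormal whose median equals the
    fate-and-transport estimate; 'fold' is read as the ratio of the 97.5th
    percentile to the median, so sigma = ln(fold) / 1.96."""
    sigma = np.log(fold) / 1.96
    return median * rng.lognormal(mean=0.0, sigma=sigma, size=n_draws)

draws = sample_water_conc(median=0.5, fold=5.0, n_draws=100_000)
print(np.median(draws))          # close to the 0.5 median
print(np.mean(draws > 2.5))      # ~2.5% of draws exceed the 5-fold bound
```

Each Monte Carlo iteration of the study would draw one such concentration history per water district and re-run the exposure and regression models on it.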

  12. Uncertainty analysis of vegetation distribution in the northern high latitudes during the 21st century with a dynamic vegetation model.

    PubMed

    Jiang, Yueyang; Zhuang, Qianlai; Schaphoff, Sibyll; Sitch, Stephen; Sokolov, Andrei; Kicklighter, David; Melillo, Jerry

    2012-03-01

    This study aims to assess how high-latitude vegetation may respond under various climate scenarios during the 21st century, with a focus on analyzing model-parameter-induced uncertainty and how it compares to the uncertainty induced by the various climate inputs. The analysis was based on a set of 10,000 Monte Carlo ensemble Lund-Potsdam-Jena (LPJ) simulations for the northern high latitudes (45°N and polewards) for the period 1900-2100. The LPJ Dynamic Global Vegetation Model (LPJ-DGVM) was run under contemporary and future climates from four Special Report on Emissions Scenarios (SRES), A1FI, A2, B1, and B2, based on the Hadley Centre General Circulation Model (GCM), and six climate scenarios, X901M, X902L, X903H, X904M, X905L, and X906H, from the Integrated Global System Model (IGSM) at the Massachusetts Institute of Technology (MIT). In the current dynamic vegetation model, some parameters are more important than others in determining the vegetation distribution. Parameters that control plant carbon uptake and light-use efficiency have the predominant influence on the vegetation distribution of both woody and herbaceous plant functional types. The relative importance of different parameters varies temporally and spatially and is influenced by climate inputs. In addition to climate, these parameters play an important role in determining the vegetation distribution in the region. The parameter-based uncertainties contribute most to the total uncertainty. The current warming conditions lead to a complexity of vegetation responses in the region. Temperate trees will be more sensitive to climate variability, compared with boreal forest trees and C3 perennial grasses. This sensitivity would result in a general northward greenness migration due to anomalous warming in the northern high latitudes.
Temporally, boreal needleleaved evergreen plants are projected to decline considerably, and a large portion of C3 perennial grass is projected to disappear by the end of the 21st century. In contrast, the area of temperate trees would increase, especially under the most extreme A1FI scenario. As the warming continues, the northward greenness expansion in the Arctic region could continue.

  13. Constraints on CDM cosmology from galaxy power spectrum, CMB and SNIa evolution

    NASA Astrophysics Data System (ADS)

    Ferramacho, L. D.; Blanchard, A.; Zolnierowski, Y.

    2009-05-01

    Aims: We examine the constraints that can be obtained on standard cold dark matter models from the most commonly used data sets: CMB anisotropies, type Ia supernovae and the SDSS luminous red galaxies. We also examine how these constraints are widened when the equation of state parameter w and the curvature parameter Ωk are left as free parameters. Finally, we investigate the impact on these constraints of a possible form of evolution in SNIa intrinsic luminosity. Methods: We obtained our results from MCMC analysis using the full likelihood of each data set. Results: For the ΛCDM model, our “vanilla” model, cosmological parameters are tightly constrained and consistent with current estimates from various methods. When the dark energy parameter w is free we find that the constraints remain mostly unchanged, i.e. changes are smaller than the 1 sigma uncertainties. Similarly, relaxing the assumption of a flat universe leads to nearly identical constraints on the dark energy density parameter of the universe Ω_Λ, the baryon density of the universe Ω_b, the optical depth τ, and the index of the power spectrum of primordial fluctuations n_S, with most one sigma uncertainties better than 5%. More significant changes appear on other parameters: while preferred values are almost unchanged, uncertainties for the physical dark matter density Ω_ch^2, the Hubble constant H0 and σ8 are typically twice as large. The constraint on the age of the Universe, which is very accurate for the vanilla model, is the most degraded. We found that different methodological approaches to large-scale structure estimates lead to appreciable differences in preferred values and uncertainty widths. We found that possible evolution in SNIa intrinsic luminosity does not alter these constraints by much, except for w, for which the uncertainty is twice as large. At the same time, this possible evolution is severely constrained.
Conclusions: We conclude that systematic uncertainties for some estimated quantities are similar or larger than statistical ones.

  14. Host model uncertainties in aerosol radiative forcing estimates: results from the AeroCom Prescribed intercomparison study

    NASA Astrophysics Data System (ADS)

    Stier, P.; Schutgens, N. A. J.; Bellouin, N.; Bian, H.; Boucher, O.; Chin, M.; Ghan, S.; Huneeus, N.; Kinne, S.; Lin, G.; Ma, X.; Myhre, G.; Penner, J. E.; Randles, C. A.; Samset, B.; Schulz, M.; Takemura, T.; Yu, F.; Yu, H.; Zhou, C.

    2013-03-01

    Simulated multi-model "diversity" in aerosol direct radiative forcing estimates is often perceived as a measure of aerosol uncertainty. However, current models used for aerosol radiative forcing calculations vary considerably in model components relevant for forcing calculations, and the associated "host-model uncertainties" are generally convoluted with the actual aerosol uncertainty. In this AeroCom Prescribed intercomparison study we systematically isolate and quantify host model uncertainties in aerosol forcing experiments through the prescription of identical aerosol radiative properties in twelve participating models. Even with prescribed aerosol radiative properties, simulated clear-sky and all-sky aerosol radiative forcings show significant diversity. For a purely scattering case with globally constant optical depth of 0.2, the global-mean all-sky top-of-atmosphere radiative forcing is -4.47 Wm-2 and the inter-model standard deviation is 0.55 Wm-2, corresponding to a relative standard deviation of 12%. For a case with partially absorbing aerosol with an aerosol optical depth of 0.2 and single scattering albedo of 0.8, the forcing changes to 1.04 Wm-2, and the standard deviation increases to 1.01 Wm-2, corresponding to a significant relative standard deviation of 97%. However, the top-of-atmosphere forcing variability owing to absorption (subtracting the scattering case from the case with scattering and absorption) is low, with absolute (relative) standard deviations of 0.45 Wm-2 (8%) clear-sky and 0.62 Wm-2 (11%) all-sky. Scaling the forcing standard deviation for a purely scattering case to match the sulfate radiative forcing in the AeroCom Direct Effect experiment demonstrates that host model uncertainties could explain about 36% of the overall sulfate forcing diversity of 0.11 Wm-2 in the AeroCom Direct Radiative Effect experiment.
Host model errors in aerosol radiative forcing are largest in regions of uncertain host model components, such as stratocumulus cloud decks or areas with poorly constrained surface albedos, such as sea ice. Our results demonstrate that host model uncertainties are an important component of aerosol forcing uncertainty that require further attention.

  15. Advanced Modeling and Uncertainty Quantification for Flight Dynamics; Interim Results and Challenges

    NASA Technical Reports Server (NTRS)

    Hyde, David C.; Shweyk, Kamal M.; Brown, Frank; Shah, Gautam

    2014-01-01

    As part of the NASA Vehicle Systems Safety Technologies (VSST), Assuring Safe and Effective Aircraft Control Under Hazardous Conditions (Technical Challenge #3), an effort is underway within Boeing Research and Technology (BR&T) to address Advanced Modeling and Uncertainty Quantification for Flight Dynamics (VSST1-7). The scope of the effort is to develop and evaluate advanced multidisciplinary flight dynamics modeling techniques, including integrated uncertainties, to facilitate higher-fidelity response characterization of current and future aircraft configurations approaching and during loss-of-control conditions. The approach incorporates multiple flight dynamics modeling methods for aerodynamics, structures, and propulsion, including experimental, computational, and analytical methods. Also included are techniques for data integration and uncertainty characterization and quantification. This research shall introduce new and updated multidisciplinary modeling and simulation technologies designed to improve the ability to characterize airplane response in off-nominal flight conditions. The research shall also introduce new techniques for uncertainty modeling that will provide a unified database model composed of multiple sources, as well as an uncertainty bounds database for each data source, such that a full vehicle uncertainty analysis is possible even when approaching or beyond loss-of-control boundaries. Methodologies developed as part of this research shall be instrumental in predicting and mitigating loss-of-control precursors and events directly linked to causal and contributing factors, such as stall, failures, damage, or icing.
The tasks will include utilizing the BR&T Water Tunnel to collect static and dynamic data to be compared to the GTM extended WT database, characterizing flight dynamics in off-nominal conditions, developing tools for structural load estimation under dynamic conditions, devising methods for integrating various modeling elements into a real-time simulation capability, generating techniques for uncertainty modeling that draw data from multiple modeling sources, and providing a unified database model that includes nominal plus increments for each flight condition. This paper presents status of testing in the BR&T water tunnel and analysis of the resulting data and efforts to characterize these data using alternative modeling methods. Program challenges and issues are also presented.

  16. Linear Mixed Models: GUM and Beyond

    NASA Astrophysics Data System (ADS)

    Arendacká, Barbora; Täubner, Angelika; Eichstädt, Sascha; Bruns, Thomas; Elster, Clemens

    2014-04-01

    In Annex H.5, the Guide to the Expression of Uncertainty in Measurement (GUM) [1] recognizes the necessity to analyze certain types of experiments by applying random effects ANOVA models. These belong to the more general family of linear mixed models that we focus on in the current paper. Extending the short introduction provided by the GUM, our aim is to show that the more general linear mixed models cover a wider range of situations occurring in practice and can be beneficial when employed in the data analysis of long-term repeated experiments. Namely, we point out their potential as an aid in establishing an uncertainty budget and as a means of gaining more insight into the measurement process. We also comment on computational issues, and to make the explanations less abstract, we illustrate all the concepts with the help of a measurement campaign conducted in order to challenge the uncertainty budget in the calibration of accelerometers.
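As a concrete illustration of the random effects ANOVA model discussed in GUM Annex H.5, the sketch below estimates within-day and between-day variance components from a simulated repeated calibration; the campaign dimensions and true variances are invented, not those of the accelerometer study.

```python
import numpy as np

def random_effects_anova(y):
    """Classical one-way random-effects ANOVA estimates (cf. GUM Annex H.5).
    y: shape (groups, repeats), e.g. days x repeated indications.
    Returns (within-group variance, between-group variance component)."""
    k, n = y.shape
    group_means = y.mean(axis=1)
    msw = ((y - group_means[:, None]) ** 2).sum() / (k * (n - 1))
    msb = n * ((group_means - y.mean()) ** 2).sum() / (k - 1)
    return msw, max((msb - msw) / n, 0.0)   # truncate negative estimates at 0

# Invented campaign: 20 days, 10 repeats per day,
# true between-day sd = 2.0 and repeatability sd = 1.0.
rng = np.random.default_rng(0)
y = 100.0 + 2.0 * rng.standard_normal((20, 1)) + rng.standard_normal((20, 10))
s2_within, s2_between = random_effects_anova(y)
print(s2_within, s2_between)   # estimates of the true values 1.0 and 4.0
```

Both variance components then feed the uncertainty budget: the between-group component captures day-to-day effects that pure repeatability statistics would miss.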

  17. How to reduce the uncertainties in predictions of local coastal sea level as decision support: the contribution of GGOS

    NASA Astrophysics Data System (ADS)

    Plag, H.-P.

    2009-04-01

    Local Sea Level (LSL) rise is one of the major anticipated impacts of future global warming. In many low-lying and often subsiding coastal areas, an increase of local sea-surface height is likely to increase the hazards of storm surges and hurricanes and to lead to major inundation. Single major disasters due to storm surges and hurricanes hitting densely populated urban areas are estimated to inflict losses in excess of 100 billion. Decision makers face a trade-off between imposing the very high costs of coastal protection, mitigation and adaptation upon today's national economies and leaving the costs of potential major disasters to future generations. Risk and vulnerability assessments in support of informed decisions require as input predictions of the range of future LSL rise with reliable estimates of uncertainties. Secular changes in LSL are the result of a mix of location-dependent factors including ocean temperature and salinity changes, ocean and atmospheric circulation changes, mass exchange of the ocean with terrestrial water storage and the cryosphere, and vertical land motion. Current aleatory uncertainties in observations relevant to past and current LSL changes, combined with epistemic uncertainties in some of the forcing functions for LSL changes, produce a large range of plausible future LSL trajectories. This large range hampers the development of reasonable mitigation and adaptation strategies in the coastal zone. A detailed analysis of the uncertainties helps to answer the question of which observations could help to reduce the uncertainties, and how. The analysis shows that the Global Geodetic Observing System (GGOS) provides valuable observations and products towards this goal. Observations of the large ice sheets can improve the constraints on the current mass balance of the cryosphere and support cryosphere model validation.
Vertical land motion close to melting ice sheets is highly relevant in the validation of models for the elastic response of the Earth to glacial deloading. Combining satellite gravity missions with ground-based observations of gravity and vertical land motion in areas with significant mass changes (in the cryosphere, land water storage, and the ocean) could help to improve models of the global water and energy cycle, which ultimately improves the understanding of current LSL changes. For LSL projections, local vertical land motion given in a reference frame tied to the center of mass is an important input, which currently contributes significantly to the error budget of LSL predictions. Improvements of the terrestrial reference frame would reduce this error contribution.

  18. Model-Averaged ℓ1 Regularization using Markov Chain Monte Carlo Model Composition

    PubMed Central

    Fraley, Chris; Percival, Daniel

    2014-01-01

    Bayesian Model Averaging (BMA) is an effective technique for addressing model uncertainty in variable selection problems. However, current BMA approaches have computational difficulty dealing with data in which there are many more measurements (variables) than samples. This paper presents a method for combining ℓ1 regularization and Markov chain Monte Carlo model composition techniques for BMA. By treating the ℓ1 regularization path as a model space, we propose a method to resolve the model uncertainty issues arising in model averaging from solution path point selection. We show that this method is computationally and empirically effective for regression and classification in high-dimensional datasets. We apply our technique in simulations, as well as to some applications that arise in genomics. PMID:25642001
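A minimal sketch of BMA in this spirit, assuming a toy regression with five candidate predictors: model weights are approximated with BIC and the model space is enumerated exhaustively, a stand-in for the paper's ℓ1-path/MC3 search that is feasible only for small numbers of variables.

```python
import numpy as np
from itertools import combinations

# Toy illustration: exhaustive BMA with BIC-approximated model weights.
# y truly depends on only the first two of five candidate predictors.
rng = np.random.default_rng(1)
n, p = 200, 5
X = rng.standard_normal((n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.standard_normal(n)

def bic(cols):
    """BIC of an ordinary least-squares fit using the given columns."""
    Xs = X[:, cols]
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = ((y - Xs @ beta) ** 2).sum()
    return n * np.log(rss / n) + len(cols) * np.log(n)

models = [list(c) for r in range(1, p + 1) for c in combinations(range(p), r)]
scores = np.array([bic(m) for m in models])
w = np.exp(-(scores - scores.min()) / 2.0)
w /= w.sum()                       # posterior model weights (BIC approximation)

# Posterior inclusion probability of each predictor under model averaging.
incl = np.array([sum(wi for wi, m in zip(w, models) if j in m) for j in range(p)])
print(incl.round(3))               # high for predictors 0 and 1, low otherwise
```

The paper's contribution is precisely to avoid this exhaustive enumeration: the ℓ1 regularization path supplies a small, well-chosen set of candidate models for the averaging step when p is large.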

  19. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    NASA Astrophysics Data System (ADS)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but also address basic questions like what uncertainty is and how uncertainty can be quantified or treated in a reliable and reproducible way.

  20. [Risk, uncertainty and ignorance in medicine].

    PubMed

    Rørtveit, G; Strand, R

    2001-04-30

    Exploration of healthy patients' risk factors for disease has become a major medical activity. The rationale behind primary prevention through exploration and therapeutic risk reduction is not separated from the theoretical assumption that every form of uncertainty can be expressed as risk. Distinguishing "risk" (as quantitative probabilities in a known sample space), "strict uncertainty" (when the sample space is known, but probabilities of events cannot be quantified) and "ignorance" (when the sample space is not fully known), a typical clinical situation (primary risk of coronary disease) is analysed. It is shown how strict uncertainty and sometimes ignorance can be present, in which case the orthodox decision theoretical rationale for treatment breaks down. For use in such cases, a different ideal model of rationality is proposed, focusing on the patient's considered reasons. This model has profound implications for the current understanding of medical professionalism as well as for the design of clinical guidelines.

  1. Characterisation of soft magnetic materials by measurement: Evaluation of uncertainties up to 1.8 T and 9 kHz

    NASA Astrophysics Data System (ADS)

    Elfgen, S.; Franck, D.; Hameyer, K.

    2018-04-01

    Magnetic measurements are indispensable for the characterization of soft magnetic materials used e.g. in electrical machines. Characteristic values are used for quality control during production and for the parametrization of material models. Uncertainties and errors in the measurements are reflected directly in the parameters of the material models. This can result in over-dimensioning and inaccuracies in simulations for the design of electrical machines. Therefore, the existing influencing factors in the characterization of soft magnetic materials are named and their resulting uncertainty contributions studied. The analysis of the resulting uncertainty contributions can serve the operator as an additional selection criterion for different measuring sensors. The investigation is performed for measurements within and outside the currently prescribed standard using a single sheet tester, and its impact on the identification of iron loss parameters is studied.

  2. Model Uncertainty and Bayesian Model Averaged Benchmark Dose Estimation for Continuous Data

    EPA Science Inventory

    The benchmark dose (BMD) approach has gained acceptance as a valuable risk assessment tool, but risk assessors still face significant challenges associated with selecting an appropriate BMD/BMDL estimate from the results of a set of acceptable dose-response models. Current approa...

  3. An adaptive modeling and simulation environment for combined-cycle data reconciliation and degradation estimation

    NASA Astrophysics Data System (ADS)

    Lin, Tsungpo

    Performance engineers face a major challenge in modeling and simulation for after-market power systems due to system degradation and measurement errors. Currently, the majority of the power generation industry utilizes deterministic data matching to calibrate the model and cascade system degradation, which causes significant calibration uncertainty and heightens the risk of providing performance guarantees. In this research work, a maximum-likelihood based simultaneous data reconciliation and model calibration (SDRMC) is used for power system modeling and simulation. By replacing the current deterministic data matching with SDRMC one can reduce the calibration uncertainty and mitigate the error propagation to the performance simulation. A modeling and simulation environment for a complex power system with certain degradation has been developed. In this environment multiple data sets are imported when carrying out simultaneous data reconciliation and model calibration. Calibration uncertainties are estimated through error analyses and propagated to the performance simulation by using the principle of error propagation. System degradation is then quantified by performance comparison between the calibrated model and its expected new & clean status. To mitigate smearing effects caused by gross errors, gross error detection (GED) is carried out in two stages. The first stage is a screening stage, in which serious gross errors are eliminated in advance. The GED techniques used in the screening stage are based on multivariate data analysis (MDA), including multivariate data visualization and principal component analysis (PCA). Subtle gross errors are treated at the second stage, in which serial bias compensation or a robust M-estimator is engaged. To achieve a better efficiency in the combined scheme of least-squares based data reconciliation and the GED technique based on hypothesis testing, the Levenberg-Marquardt (LM) algorithm is utilized as the optimizer.
To reduce the computation time and stabilize the problem solving for a complex power system such as a combined cycle power plant, meta-modeling using response surface equations (RSE) and system/process decomposition are incorporated into the simultaneous scheme of SDRMC. The goal of this research work is to reduce the calibration uncertainties and, thus, the risks of providing performance guarantees arising from uncertainties in performance simulation.
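The data reconciliation step at the core of SDRMC can be illustrated with the classical linear, closed-form case: adjust the measurements as little as possible, weighted by their uncertainties, while exactly satisfying a mass balance. The splitter example and its numbers below are hypothetical.

```python
import numpy as np

def reconcile(m, sigma, A):
    """Linear weighted least-squares data reconciliation: find adjusted
    values x minimizing sum(((x - m) / sigma)**2) subject to A @ x = 0,
    via the closed-form Lagrangian solution."""
    V = np.diag(sigma ** 2)
    return m - V @ A.T @ np.linalg.solve(A @ V @ A.T, A @ m)

# Hypothetical splitter: stream 1 in, streams 2 and 3 out (x1 = x2 + x3).
m = np.array([100.0, 60.0, 45.0])      # raw measurements violate the balance by 5
sigma = np.array([1.0, 2.0, 2.0])      # measurement standard uncertainties
A = np.array([[1.0, -1.0, -1.0]])      # balance constraint A @ x = 0
x = reconcile(m, sigma, A)
print(x, A @ x)                        # reconciled flows; balance residual ~ 0
```

Note how the less certain outlet measurements absorb most of the adjustment. The nonlinear, maximum-likelihood version in the thesis replaces this closed form with an iterative Levenberg-Marquardt solve and adds gross error detection.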

  4. The future of the North American carbon cycle - projections and associated climate change

    NASA Astrophysics Data System (ADS)

    Huntzinger, D. N.; Chatterjee, A.; Cooley, S. R.; Dunne, J. P.; Hoffman, F. M.; Luo, Y.; Moore, D. J.; Ohrel, S. B.; Poulter, B.; Ricciuto, D. M.; Tzortziou, M.; Walker, A. P.; Mayes, M. A.

    2016-12-01

    Approximately half of the anthropogenic emissions from the burning of fossil fuels are taken up annually by carbon sinks on the land and in the oceans. However, there are key uncertainties in how carbon uptake by terrestrial, ocean, and freshwater systems will respond to, and interact with, climate into the future. Here, we outline the current state of understanding of the future carbon budget of these major reservoirs within North America and the globe. We examine the drivers of future carbon cycle changes, including carbon-climate feedbacks, atmospheric composition, nutrient availability, and human activity and management decisions. Progress has been made at identifying vulnerabilities in carbon pools, including high-latitude permafrost, peatlands, freshwater and coastal wetlands, and ecosystems subject to disturbance events such as insects, fire and drought. However, many of these processes and pools are not well represented in current models, and model intercomparison studies have shown a range of carbon cycle responses to factors such as climate and CO2 fertilization. Furthermore, as model complexity increases, understanding the drivers of model spread becomes increasingly difficult. As a result, uncertainties in future carbon cycle projections are large. It is also uncertain how management decisions and policies will impact future carbon stocks and flows. In order to guide policy, a better understanding of the risk and magnitude of North American carbon cycle changes is needed. This requires that future carbon cycle projections be conditioned on current observations and be reported with sufficient confidence and fully specified uncertainties.

  5. Gaussian Process Model for Antarctic Surface Mass Balance and Ice Core Site Selection

    NASA Astrophysics Data System (ADS)

    White, P. A.; Reese, S.; Christensen, W. F.; Rupper, S.

    2017-12-01

    Surface mass balance (SMB) is an important factor in the estimation of sea level change, and data are collected to estimate models for the prediction of SMB on the Antarctic ice sheet. Using Favier et al.'s (2013) quality-controlled aggregate data set of SMB field measurements, a fully Bayesian spatial model is posed to estimate Antarctic SMB and propose new field measurement locations. Utilizing Nearest-Neighbor Gaussian process (NNGP) models, SMB is estimated over the Antarctic ice sheet. An Antarctic SMB map is rendered using this model and is compared with previous estimates. A prediction uncertainty map is created to identify regions of high SMB uncertainty. The model estimates net SMB to be 2173 Gton yr-1 with 95% credible interval (2021, 2331) Gton yr-1. On average, these results suggest lower Antarctic SMB and higher uncertainty than previously reported [Vaughan et al. (1999); Van de Berg et al. (2006); Arthern, Winebrenner and Vaughan (2006); Bromwich et al. (2004); Lenaerts et al. (2012)], even though this model utilizes significantly more observations than previous models. Using the Gaussian process' uncertainty and model parameters, we propose 15 new measurement locations for field study utilizing a maximin space-filling, error-minimizing design; these potential measurements are identified to minimize future estimation uncertainty. Using currently accepted Antarctic mass balance estimates and our SMB estimate, we estimate net mass loss [Shepherd et al. (2012); Jacob et al. (2012)]. Furthermore, we discuss modeling details for both space-time data and combining field measurement data with output from mathematical models using the NNGP framework.
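A greedy variant of the maximin site-selection idea can be sketched in a few lines: each new site maximizes its minimum distance to all existing and previously chosen sites. The candidate grid and the single existing station below are invented, and this sketch scores candidates by distance only, without the error-minimizing weighting used in the paper's design.

```python
import numpy as np

rng = np.random.default_rng(0)

def greedy_maximin(candidates, existing, k):
    """Greedily pick k new sites, each maximizing its minimum distance to
    every already-chosen site (existing stations plus earlier picks)."""
    chosen = list(existing)
    picks = []
    for _ in range(k):
        dists = np.linalg.norm(
            candidates[:, None, :] - np.asarray(chosen)[None, :, :], axis=2)
        best = int(np.argmax(dists.min(axis=1)))
        picks.append(best)
        chosen.append(candidates[best])
    return picks

cands = rng.uniform(0.0, 100.0, size=(500, 2))   # candidate locations (km)
stations = [np.array([50.0, 50.0])]              # one existing station
idx = greedy_maximin(cands, stations, k=5)
print(idx)
```

A Gaussian-process-aware version would additionally weight each candidate by its predictive variance, so that picks concentrate in regions that are both unsampled and uncertain.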

  6. Uncertainties in Emissions Inputs for Near-Road Assessments

    EPA Science Inventory

    Emissions, travel demand, and dispersion models are all needed to obtain temporally and spatially resolved pollutant concentrations. Current methodology combines these three models in a bottom-up approach based on hourly traffic and emissions estimates, and hourly dispersion conc...

  7. The development of an hourly gridded rainfall product for hydrological applications in England and Wales

    NASA Astrophysics Data System (ADS)

    Liguori, Sara; O'Loughlin, Fiachra; Souvignet, Maxime; Coxon, Gemma; Freer, Jim; Woods, Ross

    2014-05-01

    This research presents a newly developed observed sub-daily gridded precipitation product for England and Wales. Importantly, our analysis specifically allows a quantification of rainfall errors from grid to the catchment scale, useful for hydrological model simulation and the evaluation of prediction uncertainties. Our methodology involves the disaggregation of the current one kilometre daily gridded precipitation records available for the United Kingdom [1]. The hourly product is created using information from: 1) 2000 tipping-bucket rain gauges; and 2) the United Kingdom Met Office weather radar network. These two independent datasets provide rainfall estimates at temporal resolutions much smaller than the current daily gridded rainfall product; thus allowing the disaggregation of the daily rainfall records to an hourly timestep. Our analysis is conducted for the period 2004 to 2008, limited by the current availability of the datasets. We analyse the uncertainty components affecting the accuracy of this product. Specifically we explore how these uncertainties vary spatially, temporally and with climatic regimes. Preliminary results indicate scope for improvement of hydrological model performance by the utilisation of this new hourly gridded rainfall product. Such a product will improve our ability to diagnose and identify structural errors in hydrological modelling by including the quantification of input errors. References [1] Keller V, Young AR, Morris D, Davies H (2006) Continuous Estimation of River Flows: Estimation of Precipitation Inputs. Technical Report, Environment Agency.
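    The core disaggregation step can be sketched as follows (an illustrative simplification, with invented numbers): each daily grid-cell total is split across 24 hours in proportion to a reference sub-daily pattern, e.g. from radar or a nearby tipping-bucket gauge, so that rainfall mass is conserved.

    ```python
    def disaggregate_daily(daily_total, hourly_reference):
        """Split a daily rainfall total into 24 hourly values using the
        temporal pattern of a reference series (e.g. radar or a gauge)."""
        ref_sum = sum(hourly_reference)
        if ref_sum == 0:            # dry reference day: spread uniformly
            return [daily_total / 24.0] * 24
        return [daily_total * h / ref_sum for h in hourly_reference]

    # hypothetical radar pattern (mm): a 4-hour morning event
    radar = [0.0] * 6 + [0.5, 1.2, 2.0, 1.0] + [0.0] * 14
    hourly = disaggregate_daily(10.0, radar)
    print(sum(hourly))   # the hourly values sum back to the daily total (up to rounding)
    ```

    Mass conservation is the key property: whatever the reference pattern, the disaggregated hours always sum to the gridded daily value.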

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cote, Benoit; Ritter, Christian; O'Shea, Brian W.

    Here we use a simple one-zone galactic chemical evolution model to quantify the uncertainties generated by the input parameters in numerical predictions for a galaxy with properties similar to those of the Milky Way. We compiled several studies from the literature to gather the current constraints for our simulations regarding the typical value and uncertainty of the following seven basic parameters: the lower and upper mass limits of the stellar initial mass function (IMF), the slope of the high-mass end of the stellar IMF, the slope of the delay-time distribution function of Type Ia supernovae (SNe Ia), the number of SNe Ia per M⊙ formed, the total stellar mass formed, and the final mass of gas. We derived a probability distribution function to express the range of likely values for every parameter, which were then included in a Monte Carlo code to run several hundred simulations with randomly selected input parameters. This approach enables us to analyze the predicted chemical evolution of 16 elements in a statistical manner by identifying the most probable solutions along with their 68% and 95% confidence levels. Our results show that the overall uncertainties are shaped by several input parameters that individually contribute at different metallicities, and thus at different galactic ages. The level of uncertainty then depends on the metallicity and is different from one element to another. Among the seven input parameters considered in this work, the slope of the IMF and the number of SNe Ia are currently the two main sources of uncertainty. The thicknesses of the uncertainty bands bounded by the 68% and 95% confidence levels are generally within 0.3 and 0.6 dex, respectively. When looking at the evolution of individual elements as a function of galactic age instead of metallicity, those same thicknesses range from 0.1 to 0.6 dex for the 68% confidence levels and from 0.3 to 1.0 dex for the 95% confidence levels. 
The uncertainty in our chemical evolution model does not include uncertainties relating to stellar yields, star formation and merger histories, and modeling assumptions.
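    The Monte Carlo procedure described above, drawing input parameters from their distributions and summarizing the spread of predictions with 68% and 95% bands, can be sketched as follows. The toy model and the parameter distributions are placeholders, not the paper's actual chemical evolution code or constraints:

    ```python
    import random

    def toy_model(imf_slope, n_snia):
        # stand-in for the chemical-evolution prediction (e.g. an abundance ratio)
        return -0.1 * imf_slope + 0.02 * n_snia

    random.seed(42)
    runs = []
    for _ in range(5000):
        imf_slope = random.gauss(2.35, 0.2)   # Salpeter-like slope, assumed spread
        n_snia = random.gauss(1.5, 0.5)       # SNe Ia per 1000 Msun, hypothetical
        runs.append(toy_model(imf_slope, n_snia))

    runs.sort()
    def pct(p):   # percentile by rank in the sorted sample
        return runs[min(len(runs) - 1, int(p / 100 * len(runs)))]

    print("median:", round(pct(50), 3))
    print("68% band:", round(pct(16), 3), "to", round(pct(84), 3))
    print("95% band:", round(pct(2.5), 3), "to", round(pct(97.5), 3))
    ```

    The 68% and 95% bands are simply the 16th/84th and 2.5th/97.5th percentiles of the Monte Carlo sample, which is how confidence envelopes like those in the abstract are typically extracted.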

  9. Characterizing uncertainty and variability in physiologically based pharmacokinetic models: state of the science and needs for research and implementation.

    PubMed

    Barton, Hugh A; Chiu, Weihsueh A; Setzer, R Woodrow; Andersen, Melvin E; Bailer, A John; Bois, Frédéric Y; Dewoskin, Robert S; Hays, Sean; Johanson, Gunnar; Jones, Nancy; Loizou, George; Macphail, Robert C; Portier, Christopher J; Spendiff, Martin; Tan, Yu-Mei

    2007-10-01

    Physiologically based pharmacokinetic (PBPK) models are used in mode-of-action based risk and safety assessments to estimate internal dosimetry in animals and humans. When used in risk assessment, these models can provide a basis for extrapolating between species, doses, and exposure routes or for justifying nondefault values for uncertainty factors. Characterization of uncertainty and variability is increasingly recognized as important for risk assessment; this represents a continuing challenge for both PBPK modelers and users. Current practices show significant progress in specifying deterministic biological models and nondeterministic (often statistical) models, estimating parameters using diverse data sets from multiple sources, using them to make predictions, and characterizing uncertainty and variability of model parameters and predictions. The International Workshop on Uncertainty and Variability in PBPK Models, held 31 Oct-2 Nov 2006, identified the state-of-the-science, needed changes in practice and implementation, and research priorities. For the short term, these include (1) multidisciplinary teams to integrate deterministic and nondeterministic/statistical models; (2) broader use of sensitivity analyses, including for structural and global (rather than local) parameter changes; and (3) enhanced transparency and reproducibility through improved documentation of model structure(s), parameter values, sensitivity and other analyses, and supporting, discrepant, or excluded data. 
Longer-term needs include (1) theoretical and practical methodological improvements for nondeterministic/statistical modeling; (2) better methods for evaluating alternative model structures; (3) peer-reviewed databases of parameters and covariates, and their distributions; (4) expanded coverage of PBPK models across chemicals with different properties; and (5) training and reference materials, such as case studies, bibliographies/glossaries, model repositories, and enhanced software. The multidisciplinary dialogue initiated by this Workshop will foster the collaboration, research, data collection, and training necessary to make characterizing uncertainty and variability a standard practice in PBPK modeling and risk assessment.

  10. Uncertainty in projected point precipitation extremes for hydrological impact analysis of climate change

    NASA Astrophysics Data System (ADS)

    Van Uytven, Els; Willems, Patrick

    2017-04-01

    Current trends in hydro-meteorological variables indicate the potential impact of climate change on hydrological extremes, and therefore give climate adaptation strategies increased importance in water management. The impact of climate change on hydro-meteorological and hydrological extremes is, however, highly uncertain. This is due to uncertainties introduced by the climate models, the internal variability inherent to the climate system, the greenhouse gas scenarios and the statistical downscaling methods. In view of the need to define sustainable climate adaptation strategies, these uncertainties must be assessed, which is commonly done by means of ensemble approaches. Because more and more climate models and statistical downscaling methods become available, there is a need to facilitate the climate impact and uncertainty analysis. A Climate Perturbation Tool has been developed for that purpose, which combines a set of statistical downscaling methods including weather typing, weather generator, transfer function and advanced perturbation based approaches. By use of an interactive interface, climate impact modelers can apply these statistical downscaling methods in a semi-automatic way to an ensemble of climate model runs. The tool is applicable to any region, but has so far been demonstrated for cases in Belgium, Suriname, Vietnam and Bangladesh. Time series representing future local-scale precipitation, temperature and potential evapotranspiration (PET) conditions were obtained, starting from time series of historical observations. Uncertainties on the future meteorological conditions are represented in two different ways: through an ensemble of time series, and through a reduced set of synthetic scenarios. Both aim to span the full uncertainty range as assessed from the ensemble of climate model runs and downscaling methods. 
For Belgium, for instance, use was made of 100-year time series of 10-minute precipitation observations and daily temperature and PET observations at Uccle and a large ensemble of 160 global climate model runs (CMIP5). They cover all four representative concentration pathway based greenhouse gas scenarios. While evaluating the downscaled meteorological series, particular attention was given to the performance of extreme value metrics (e.g. for precipitation, by means of intensity-duration-frequency statistics). Moreover, the total uncertainty was decomposed into the fractional uncertainties for each of the uncertainty sources considered. Research assessing the additional uncertainty due to parameter and structural uncertainties of the hydrological impact model is ongoing.
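    The decomposition of total uncertainty into fractional contributions per source can be sketched as below, under the simplifying assumption that the sources are independent so their variances add; the variance values are invented for illustration:

    ```python
    def fractional_variances(source_variances):
        """Fraction of total variance contributed by each independent source."""
        total = sum(source_variances.values())
        return {name: v / total for name, v in source_variances.items()}

    # hypothetical per-source variances of a projected rainfall statistic
    sources = {"climate model": 0.40, "downscaling": 0.25,
               "scenario": 0.20, "internal variability": 0.15}
    fracs = fractional_variances(sources)
    for name, f in fracs.items():
        print(f"{name}: {f:.0%}")
    ```

    In practice the per-source variances would come from an analysis-of-variance over the ensemble (varying one source at a time), but the final step of reporting fractional uncertainties is exactly this normalization.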

  11. Bayesian inversion of the global present-day GIA signal uncertainty from RSL data

    NASA Astrophysics Data System (ADS)

    Caron, Lambert; Ivins, Erik R.; Adhikari, Surendra; Larour, Eric

    2017-04-01

    Various geophysical signals measured in the study of present-day climate change (such as changes in the Earth's gravitational potential, ocean altimetry or GPS data) include a secular Glacial Isostatic Adjustment (GIA) contribution that has to be corrected for. Yet one of the major challenges in GIA modelling is to accurately determine the uncertainty of the predicted present-day GIA signal. This is especially true at the global scale, where coupling between ice history and mantle rheology greatly contributes to the non-uniqueness of the solutions. Here we propose to use more than 11000 paleo sea level records to constrain a set of GIA Bayesian inversions and thoroughly explore their parameter space. We include two linearly relaxing models to represent the mantle rheology and couple them with a scalable ice history model in order to better assess the non-uniqueness of the solutions. From the resulting estimates of the probability density function, we then extract global-scale maps of the expected present-day vertical land motion and geoid signal due to GIA, together with their associated uncertainties.

  12. Ecosystem modeling of coastal acidification and hypoxia and structural uncertainties in the representation of sediment-water exchanges

    EPA Science Inventory

    Numerical ecosystem models of coastal acidification (CA) and hypoxia have been developed to synthesize current scientific understanding and provide predictions for nutrient management and policy. However, there is not a scientific consensus about the structure of these models an...

  13. INVERSE MODELING TO ESTIMATE NH3 EMISSION SEASONALLY AND THE SENSITIVITY TO UNCERTAINTY REPRESENTATIONS

    EPA Science Inventory

    Inverse modeling has been used extensively on the global scale to produce top-down estimates of emissions for chemicals such as CO and CH4. Regional scale air quality studies could also benefit from inverse modeling as a tool to evaluate current emission inventories; however, ...

  14. Hierarchical stochastic modeling of large river ecosystems and fish growth across spatio-temporal scales and climate models: the Missouri River endangered pallid sturgeon example

    USGS Publications Warehouse

    Wildhaber, Mark L.; Wikle, Christopher K.; Moran, Edward H.; Anderson, Christopher J.; Franz, Kristie J.; Dey, Rima

    2017-01-01

    We present a hierarchical series of spatially decreasing and temporally increasing models to evaluate the uncertainty in the atmosphere-ocean global climate model (AOGCM) and the regional climate model (RCM) relative to the uncertainty in the somatic growth of the endangered pallid sturgeon (Scaphirhynchus albus). For effects on fish populations of riverine ecosystems, climate output simulated by coarse-resolution AOGCMs and RCMs must be downscaled to basins to river hydrology to population response. One needs to transfer the information from these climate simulations down to the individual scale in a way that minimizes extrapolation and can account for spatio-temporal variability in the intervening stages. The goal is a framework to determine whether, given uncertainties in the climate models and the biological response, meaningful inference can still be made. The non-linear downscaling of climate information to the river scale requires that one realistically account for spatial and temporal variability across scale. Our downscaling procedure includes the use of fixed/calibrated hydrological flow and temperature models coupled with a stochastically parameterized sturgeon bioenergetics model. We show that, although there is a large amount of uncertainty associated with both the climate model output and the fish growth process, one can establish significant differences in fish growth distributions between models, and between future and current climates for a given model.

  15. On the influence of simulated SST warming on rainfall projections in the Indo-Pacific domain: an AGCM study

    NASA Astrophysics Data System (ADS)

    Zhang, Huqiang; Zhao, Y.; Moise, A.; Ye, H.; Colman, R.; Roff, G.; Zhao, M.

    2018-02-01

    Significant uncertainty exists in regional climate change projections, particularly for rainfall and other hydro-climate variables. In this study, we conduct a series of Atmospheric General Circulation Model (AGCM) experiments with different future sea surface temperature (SST) warming simulated by a range of coupled climate models. They allow us to assess the extent to which uncertainty from current coupled climate model rainfall projections can be attributed to their simulated SST warming. Nine CMIP5 model-simulated global SST warming anomalies have been superimposed onto the current SSTs simulated by the Australian climate model ACCESS1.3. The ACCESS1.3 SST-forced experiments closely reproduce the rainfall means and interannual variations seen in its own fully coupled experiments. Although different global SST warming intensities explain well the inter-model differences in global mean precipitation changes, at regional scales the SST influence varies significantly. SST warming explains about 20-25% of the pattern of precipitation change over the oceans in the Indo-Pacific domain in four or five of the nine models, but there are also a couple of models in which different SST warming explains little of their precipitation pattern changes. The influence is weaker again for rainfall changes over land. Roughly similar levels of contribution can be attributed to different atmospheric responses to SST warming in these models. The weak SST influence in our study could be due to the experimental setup applied: superimposing different SST warming anomalies onto the same SSTs simulated for current climate by ACCESS1.3 rather than directly using model-simulated past and future SSTs. 
Similar modelling and analysis from other modelling groups with more carefully designed experiments are needed to tease out uncertainties caused by different SST warming patterns, different SST mean biases and different model physical/dynamical responses to the same underlying SST forcing.

  16. Building Quantitative Hydrologic Storylines from Process-based Models for Managing Water Resources in the U.S. Under Climate-changed Futures

    NASA Astrophysics Data System (ADS)

    Arnold, J.; Gutmann, E. D.; Clark, M. P.; Nijssen, B.; Vano, J. A.; Addor, N.; Wood, A.; Newman, A. J.; Mizukami, N.; Brekke, L. D.; Rasmussen, R.; Mendoza, P. A.

    2016-12-01

    Climate change narratives for water-resource applications must represent the change signals contextualized by hydroclimatic process variability and uncertainty at multiple scales. Building narratives of plausible change includes assessing uncertainties across GCM structure, internal climate variability, climate downscaling methods, and hydrologic models. Work with this linked modeling chain has dealt mostly with GCM sampling directed separately to either model fidelity (does the model correctly reproduce the physical processes in the world?) or sensitivity (of different model responses to CO2 forcings) or diversity (of model type, structure, and complexity). This leaves unaddressed any interactions among those measures and with other components in the modeling chain used to identify water-resource vulnerabilities to specific climate threats. However, time-sensitive, real-world vulnerability studies typically cannot accommodate a full uncertainty ensemble across the whole modeling chain, so a gap has opened between current scientific knowledge and most routine applications for climate-changed hydrology. To close that gap, the US Army Corps of Engineers, the Bureau of Reclamation, and the National Center for Atmospheric Research are working on techniques to subsample uncertainties objectively across modeling chain components and to integrate results into quantitative hydrologic storylines of climate-changed futures. Importantly, these quantitative storylines are not drawn from a small sample of models or components. Rather, they stem from the more comprehensive characterization of the full uncertainty space for each component. Equally important from the perspective of water-resource practitioners, these quantitative hydrologic storylines are anchored in actual design and operations decisions potentially affected by climate change. 
This talk will describe part of our work characterizing variability and uncertainty across modeling chain components and their interactions using newly developed observational data, models and model outputs, and post-processing tools for making the resulting quantitative storylines most useful in practical hydrology applications.

  17. Predicting uncertainty in future marine ice sheet volume using Bayesian statistical methods

    NASA Astrophysics Data System (ADS)

    Davis, A. D.

    2015-12-01

    The marine ice sheet instability can trigger rapid retreat of marine ice streams. Recent observations suggest that marine ice systems in West Antarctica have begun retreating. However, unknown ice dynamics, computationally intensive mathematical models, and uncertain parameters in these models make predicting retreat rate and ice volume difficult. In this work, we fuse current observational data with ice stream/shelf models to develop probabilistic predictions of future grounded ice sheet volume. Given observational data (e.g., thickness, surface elevation, and velocity) and a forward model that relates uncertain parameters (e.g., basal friction and basal topography) to these observations, we use a Bayesian framework to define a posterior distribution over the parameters. A stochastic predictive model then propagates uncertainties in these parameters to uncertainty in a particular quantity of interest (QoI)---here, the volume of grounded ice at a specified future time. While the Bayesian approach can in principle characterize the posterior predictive distribution of the QoI, the computational cost of both the forward and predictive models makes this effort prohibitively expensive. To tackle this challenge, we introduce a new Markov chain Monte Carlo method that constructs convergent approximations of the QoI target density in an online fashion, yielding accurate characterizations of future ice sheet volume at significantly reduced computational cost. Our second goal is to attribute uncertainty in these Bayesian predictions to uncertainties in particular parameters. Doing so can help target data collection, for the purpose of constraining the parameters that contribute most strongly to uncertainty in the future volume of grounded ice. For instance, smaller uncertainties in parameters to which the QoI is highly sensitive may account for more variability in the prediction than larger uncertainties in parameters to which the QoI is less sensitive. 
We use global sensitivity analysis to help answer this question, and make the computation of sensitivity indices computationally tractable using a combination of polynomial chaos and Monte Carlo techniques.
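    A crude variance-based (Sobol-style) first-order sensitivity estimate of the kind referred to above can be sketched with plain Monte Carlo and binned conditional means, without polynomial chaos; the stand-in model and parameter ranges below are hypothetical:

    ```python
    import random
    import statistics

    def model(friction, topo):
        # stand-in for the ice-sheet QoI (grounded volume); friction dominates
        return 3.0 * friction + 0.5 * topo

    random.seed(1)
    N = 20000
    X1 = [random.uniform(0, 1) for _ in range(N)]   # basal friction (scaled)
    X2 = [random.uniform(0, 1) for _ in range(N)]   # basal topography (scaled)
    Y = [model(a, b) for a, b in zip(X1, X2)]
    var_y = statistics.pvariance(Y)

    def first_order(X, Y, bins=20):
        """First-order index ~ Var(E[Y|X]) / Var(Y), with the conditional
        mean estimated by binning X."""
        buckets = [[] for _ in range(bins)]
        for x, y in zip(X, Y):
            buckets[min(bins - 1, int(x * bins))].append(y)
        cond_means = [statistics.fmean(b) for b in buckets if b]
        return statistics.pvariance(cond_means) / var_y

    print("S_friction ~", round(first_order(X1, Y), 2))
    print("S_topo     ~", round(first_order(X2, Y), 2))
    ```

    Because the toy QoI responds six times more strongly to friction than to topography, the friction index dominates, which is the kind of ranking used to target data collection.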

  18. The visualization of spatial uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Srivastava, R.M.

    1994-12-31

    Geostatistical conditional simulation is gaining acceptance as a numerical modeling tool in the petroleum industry. Unfortunately, many of the new users of conditional simulation work with only one outcome or "realization" and ignore the many other outcomes that could be produced by their conditional simulation tools. 3-D visualization tools allow them to create very realistic images of this single outcome as reality. There are many methods currently available for presenting the uncertainty information from a family of possible outcomes; most of these, however, use static displays and many present uncertainty in a format that is not intuitive. This paper explores the visualization of uncertainty through dynamic displays that exploit the intuitive link between uncertainty and change by presenting the user with a constantly evolving model. The key technical challenge to such a dynamic presentation is the ability to create numerical models that honor the available well data and geophysical information and yet are incrementally different so that successive frames can be viewed rapidly as an animated cartoon. An example of volumetric uncertainty from a Gulf Coast reservoir will be used to demonstrate that such animation is possible and to show that such dynamic displays can be an effective tool in risk analysis for the petroleum industry.

  19. Modelling irrigated maize with a combination of coupled-model simulation and uncertainty analysis, in the northwest of China

    NASA Astrophysics Data System (ADS)

    Li, Y.; Kinzelbach, W.; Zhou, J.; Cheng, G. D.; Li, X.

    2012-05-01

    The hydrologic model HYDRUS-1D and the crop growth model WOFOST are coupled to efficiently manage water resources in agriculture and improve the prediction of crop production. The results of the coupled model are validated against experimental studies of irrigated maize in the middle reaches of northwest China's Heihe River, a semi-arid to arid region. Good agreement is achieved between the simulated evapotranspiration, soil moisture and crop production and their respective field measurements made under current maize irrigation and fertilization. Based on the calibrated model, the scenario analysis reveals that the optimal amount of irrigation is 500-600 mm in this region. However, for regions without detailed observations, the results of the numerical simulation can be unreliable for irrigation decision making owing to the shortage of calibrated model boundary conditions and parameters. We therefore develop a method combining model ensemble simulations with uncertainty/sensitivity analysis to estimate the probability distribution of crop production. In our studies, the uncertainty analysis is used to reveal the risk of a loss of crop production as irrigation decreases. The global sensitivity analysis is used to test the coupled model and further quantitatively analyse the impact of the uncertainty in coupled model parameters and environmental scenarios on crop production. This method can be used for estimation in regions with no or reduced data availability.

  20. Characterizing bias correction uncertainty in wheat yield predictions

    NASA Astrophysics Data System (ADS)

    Ortiz, Andrea Monica; Jones, Julie; Freckleton, Robert; Scaife, Adam

    2017-04-01

    Farming systems are under increased pressure due to current and future climate change, variability and extremes. Research on the impacts of climate change on crop production typically relies on the output of complex Global and Regional Climate Models, which are used as input to crop impact models. Yield predictions from these top-down approaches can have high uncertainty for several reasons, including diverse model construction and parameterization, future emissions scenarios, and inherent or response uncertainty. These uncertainties propagate down each step of the 'cascade of uncertainty' that flows from climate input to impact predictions, leading to yield predictions that may be too complex for their intended use in practical adaptation options. In addition to uncertainty from impact models, uncertainty can also stem from the intermediate steps that are used in impact studies to adjust climate model simulations to become more realistic when compared to observations, or to correct the spatial or temporal resolution of climate simulations, which are often not directly applicable as input into impact models. These important steps of bias correction or calibration also add uncertainty to final yield predictions, given the various approaches that exist to correct climate model simulations. In order to address how much uncertainty the choice of bias correction method can add to yield predictions, we use several evaluation runs from Regional Climate Models from the Coordinated Regional Downscaling Experiment over Europe (EURO-CORDEX) at different resolutions together with different bias correction methods (linear and variance scaling, power transformation, quantile-quantile mapping) as input to a statistical crop model for wheat, a staple European food crop. 
The objective of our work is to compare the resulting simulation-driven hindcasted wheat yields to climate observation-driven wheat yield hindcasts from the UK and Germany in order to determine ranges of yield uncertainty that result from different climate model simulation input and bias correction methods. We simulate wheat yields using a General Linear Model that includes the effects of seasonal maximum temperatures and precipitation, since wheat is sensitive to heat stress during important developmental stages. We use the same statistical model to predict future wheat yields using the recently available bias-corrected simulations of EURO-CORDEX-Adjust. While statistical models are often criticized for their lack of complexity, an advantage is that we are here able to consider only the effect of the choice of climate model, resolution or bias correction method on yield. Initial results using both past and future bias-corrected climate simulations with a process-based model will also be presented. Through these methods, we make recommendations in preparing climate model output for crop models.
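    One of the bias correction methods listed above, empirical quantile-quantile mapping, can be sketched as follows. This is an illustrative toy, not the EURO-CORDEX implementation: the climatologies are synthetic, and the model is assumed to run about 2 degrees too warm.

    ```python
    import bisect

    def quantile_map(value, model_sorted, obs_sorted):
        """Empirical quantile-quantile mapping: locate the value's quantile in
        the model climatology and return the same quantile of the observations."""
        q = bisect.bisect_left(model_sorted, value) / max(1, len(model_sorted) - 1)
        idx = min(len(obs_sorted) - 1, int(q * (len(obs_sorted) - 1)))
        return obs_sorted[idx]

    model_clim = sorted(22.0 + 0.1 * i for i in range(100))   # biased ~2 C warm
    obs_clim = sorted(20.0 + 0.1 * i for i in range(100))
    print(quantile_map(25.0, model_clim, obs_clim))           # pulled down toward ~23
    ```

    A simulated 25.0 sits at roughly the 30th percentile of the model climatology, so it is mapped to the 30th percentile of the observations, about 23.0, removing the systematic warm bias while preserving the value's rank.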

  1. Towards uncertainty estimation for operational forecast products - a multi-model-ensemble approach for the North Sea and the Baltic Sea

    NASA Astrophysics Data System (ADS)

    Golbeck, Inga; Li, Xin; Janssen, Frank

    2014-05-01

    Several independent operational ocean models provide forecasts of the ocean state (e.g. sea level, temperature, salinity and ice cover) in the North Sea and the Baltic Sea on a daily basis. These forecasts are the primary source of information for a variety of information and emergency response systems used e.g. to issue sea level warnings or carry out oil drift forecasts. The forecasts are of course highly valuable as such, but often suffer from a lack of information on their uncertainty. With the aim of augmenting the existing operational ocean forecasts in the North Sea and the Baltic Sea by a measure of uncertainty, a multi-model-ensemble (MME) system for sea surface temperature (SST), sea surface salinity (SSS) and water transports has been set up in the framework of the MyOcean-2 project. Members of MyOcean-2, the NOOS² and HIROMB/BOOS³ communities provide 48h-forecasts serving as inputs. Different variables are processed separately due to their different physical characteristics. Based on the daily MME products of SST and SSS collected so far, a statistical method, Empirical Orthogonal Function (EOF) analysis, is applied to assess their spatial and temporal variability. For sea surface currents, progressive vector diagrams at specific points are consulted to estimate the performance of the circulation models, especially in hydrodynamically important areas, e.g. the inflow/outflow of the Baltic Sea, the Norwegian trench and the English Channel. For further versions of the MME system, it is planned to extend the MME to other variables, e.g. sea level, ocean currents or ice cover, based on the needs of the model providers and their customers. It is also planned to include in-situ data to augment the uncertainty information and for validation purposes. Additionally, weighting methods will be implemented into the MME system to develop more complex uncertainty measures. The methodology used to create the MME will be outlined and different ensemble products will be presented. 
In addition, some preliminary results based on the statistical analysis of the uncertainty measures provide first estimates of the regional and temporal performance of the ocean models for each parameter.
    ²Northwest European Shelf Operational Oceanography System
    ³High-resolution Operational Model of the Baltic / Baltic Operational Oceanographic System
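    At its simplest, the MME uncertainty measure at a single grid point can be sketched as the mean and standard deviation across the member forecasts; the model names and SST values below are invented:

    ```python
    import statistics

    # hypothetical SST forecasts (deg C) at one grid point from five MME members
    forecasts = {"model_A": 9.8, "model_B": 10.4, "model_C": 10.1,
                 "model_D": 9.5, "model_E": 10.7}

    values = list(forecasts.values())
    mean = statistics.fmean(values)
    spread = statistics.stdev(values)   # ensemble spread as an uncertainty proxy
    print(f"MME mean {mean:.2f} C, spread {spread:.2f} C")
    ```

    Repeating this at every grid point yields the ensemble-mean and spread fields that accompany the operational forecast; weighting the members (as planned in the abstract) would replace the plain mean with a weighted one.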

  2. A New Multivariate Approach in Generating Ensemble Meteorological Forcings for Hydrological Forecasting

    NASA Astrophysics Data System (ADS)

    Khajehei, Sepideh; Moradkhani, Hamid

    2015-04-01

    Hydrologic ensemble forecasts are subject to various sources of uncertainty, including meteorological forcing, initial conditions, model structure, and model parameters. Producing reliable and skillful precipitation ensemble forecasts is one approach to reducing the total uncertainty in hydrological applications. Currently, Numerical Weather Prediction (NWP) models provide ensemble forecasts for various temporal ranges. It is well established that raw products from NWP models are biased in mean and spread. There is therefore a need for methods able to generate reliable ensemble forecasts for hydrological applications. One common technique is to apply statistical procedures to generate an ensemble forecast from NWP-generated single-value forecasts. The procedure is based on the bivariate probability distribution between the observation and the single-value precipitation forecast. However, one of the assumptions of the current method is fitting a Gaussian distribution to the marginal distributions of the observed and modeled climate variable. Here, we describe and evaluate a Bayesian approach based on copula functions to develop an ensemble precipitation forecast from the conditional distribution of single-value precipitation forecasts. Copula functions express the multivariate joint distribution of univariate marginal distributions and are presented as an alternative procedure for capturing the uncertainties related to meteorological forcing. Copulas are capable of modeling the joint distribution of two variables with any level of correlation and dependency. This study is conducted over a sub-basin in the Columbia River Basin in the USA, using monthly precipitation forecasts from the Climate Forecast System (CFS) at 0.5° x 0.5° spatial resolution to reproduce the observations. 
The verification is conducted on a different period and the superiority of the procedure is compared with Ensemble Pre-Processor approach currently used by National Weather Service River Forecast Centers in USA.
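    The conditional step at the heart of such a copula approach can be sketched in a few lines. In a Gaussian copula, once observation and single-value forecast are each transformed to standard-normal scores, the ensemble is drawn from the conditional normal z_obs | z_fcst. The lognormal observation marginal, the correlation of 0.7, and all numeric values below are illustrative assumptions, not values from the study.

```python
import math
import random
from statistics import NormalDist

N = NormalDist()  # standard normal, used for the copula transforms

def conditional_ensemble(z_forecast, rho, n=50, seed=42):
    """Gaussian-copula conditional draw:
    z_obs | z_fcst ~ Normal(rho * z_fcst, sqrt(1 - rho^2))."""
    rng = random.Random(seed)
    mu = rho * z_forecast
    sd = math.sqrt(1.0 - rho ** 2)
    return [rng.gauss(mu, sd) for _ in range(n)]

# Single-value forecast sitting at the 80th percentile of its marginal,
# copula correlation 0.7 (both made-up numbers).
z_f = N.inv_cdf(0.80)
ens_z = conditional_ensemble(z_f, rho=0.7)

# Back-transform through an assumed lognormal observation marginal
# (log-scale mean 1.0, log-scale sigma 0.5) to get precipitation members.
ens_precip = [math.exp(1.0 + 0.5 * z) for z in ens_z]
```

    A non-Gaussian marginal (here lognormal) is exactly what the copula construction permits and the classic bivariate-normal procedure does not.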

  3. Uncertainty in spatially explicit animal dispersal models

    USGS Publications Warehouse

    Mooij, Wolf M.; DeAngelis, Donald L.

    2003-01-01

    Uncertainty in estimates of survival of dispersing animals is a vexing difficulty in conservation biology. The current notion is that this uncertainty decreases the usefulness of spatially explicit population models in particular. We examined this problem by comparing dispersal models of three levels of complexity: (1) an event-based binomial model that considers only the occurrence of mortality or arrival, (2) a temporally explicit exponential model that employs mortality and arrival rates, and (3) a spatially explicit grid-walk model that simulates the movement of animals through an artificial landscape. Each model was fitted to the same set of field data. A first objective of the paper is to illustrate how the maximum-likelihood method can be used in all three cases to estimate the means and confidence limits for the relevant model parameters, given a particular set of data on dispersal survival. Using this framework we show that the structure of the uncertainty for all three models is strikingly similar. In fact, the results of our unified approach imply that spatially explicit dispersal models, which take advantage of information on landscape details, suffer less from uncertainty than do simpler models. Moreover, we show that the proposed strategy of model development safeguards one from error propagation in these more complex models. Finally, our approach shows that all models related to animal dispersal, ranging from simple to complex, can be related in a hierarchical fashion, so that the various approaches to modeling such dispersal can be viewed from a unified perspective.
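    For the simplest of the three models, the maximum-likelihood machinery reduces to a closed form. A minimal sketch of the event-based binomial case, with made-up tracking numbers and a Wald-type confidence interval standing in for the profile-likelihood limits the paper would use:

```python
import math

def binomial_mle_ci(k, n, z=1.96):
    """MLE and Wald confidence limits for the arrival probability in an
    event-based binomial dispersal model: k arrivals out of n dispersers."""
    p = k / n
    se = math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# Hypothetical data: 18 of 40 radio-tracked dispersers arrived.
p_hat, lo, hi = binomial_mle_ci(18, 40)
```

    The exponential and grid-walk models replace the closed form with numerical likelihood maximization, but the confidence-limit logic is the same.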

  4. Assessing model sensitivity and uncertainty across multiple Free-Air CO2 Enrichment experiments.

    NASA Astrophysics Data System (ADS)

    Cowdery, E.; Dietze, M.

    2015-12-01

    As atmospheric carbon dioxide levels continue to increase, it is critical that terrestrial ecosystem models can accurately predict ecological responses to the changing environment. Current predictions of net primary productivity (NPP) in response to elevated atmospheric CO2 concentrations are highly variable and contain a considerable amount of uncertainty. It is necessary that we understand which factors are driving this uncertainty. The Free-Air CO2 Enrichment (FACE) experiments have equipped us with a rich data source that can be used to calibrate and validate these model predictions. To identify and evaluate the assumptions causing inter-model differences we performed model sensitivity and uncertainty analysis across ambient and elevated CO2 treatments using the Data Assimilation Linked Ecosystem Carbon (DALEC) model and the Ecosystem Demography Model (ED2), two process-based models ranging from low to high complexity respectively. These modeled process responses were compared to experimental data from the Kennedy Space Center Open Top Chamber Experiment, the Nevada Desert Free Air CO2 Enrichment Facility, the Rhinelander FACE experiment, the Wyoming Prairie Heating and CO2 Enrichment Experiment, the Duke Forest FACE experiment and the Oak Ridge Experiment on CO2 Enrichment. By leveraging data access proxy and data tilling services provided by the BrownDog data curation project alongside analysis modules available in the Predictive Ecosystem Analyzer (PEcAn), we produced automated, repeatable benchmarking workflows that are generalized to incorporate different sites and ecological models. Combining the observed patterns of uncertainty between the two models with results of the recent FACE-model data synthesis project (FACE-MDS) can help identify which processes need further study and additional data constraints. These findings can be used to inform future experimental design and in turn can provide an informative starting point for data assimilation.

  5. Design of experiments enhanced statistical process control for wind tunnel check standard testing

    NASA Astrophysics Data System (ADS)

    Phillips, Ben D.

    The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. By utilizing design of experiments methodology in conjunction with current SPC practices, one can more efficiently and robustly characterize uncertainties and develop enhanced process-improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic such that this research can be applicable to any wind tunnel check standard testing program.
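    The tracking step can be illustrated with an individuals (I-MR) control chart over a series of regression-coefficient estimates. The coefficient values below are invented for illustration, and the 1.128 divisor is the standard d2 constant for moving ranges of subgroup size two.

```python
from statistics import mean

def individuals_chart(values):
    """Individuals control-chart limits for a sequence of regression
    coefficients tracked over repeated check-standard tests."""
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    sigma = mean(moving_ranges) / 1.128  # d2 constant for subgroups of 2
    cl = mean(values)
    return cl - 3 * sigma, cl, cl + 3 * sigma

# Hypothetical force-coefficient slope estimates from six check-standard runs.
coeffs = [0.1021, 0.1018, 0.1025, 0.1019, 0.1023, 0.1020]
lcl, cl, ucl = individuals_chart(coeffs)
```

    A coefficient estimate falling outside (lcl, ucl) would flag a shift in the measurement process rather than ordinary run-to-run variation.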

  6. Adjoint-Based Implicit Uncertainty Analysis for Figures of Merit in a Laser Inertial Fusion Engine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seifried, J E; Fratoni, M; Kramer, K J

    A primary purpose of computational models is to inform design decisions and, in order to make those decisions reliably, the confidence in the results of such models must be estimated. Monte Carlo neutron transport models are common tools for reactor designers. These types of models contain several sources of uncertainty that propagate onto the model predictions. Two uncertainties worthy of note are (1) experimental and evaluation uncertainties of nuclear data that inform all neutron transport models and (2) statistical counting precision, which all results of Monte Carlo codes contain. Adjoint-based implicit uncertainty analyses allow for the consideration of any number of uncertain input quantities and their effects upon the confidence of figures of merit with only a handful of forward and adjoint transport calculations. When considering a rich set of uncertain inputs, adjoint-based methods remain hundreds of times more computationally efficient than direct Monte Carlo methods. The LIFE (Laser Inertial Fusion Energy) engine is a concept being developed at Lawrence Livermore National Laboratory. Various options exist for the LIFE blanket, depending on the mission of the design. The depleted uranium hybrid LIFE blanket design strives to close the fission fuel cycle without enrichment or reprocessing, while simultaneously achieving high discharge burnups with reduced proliferation concerns. Neutron transport results that are central to the operation of the design are tritium production for fusion fuel, fission of fissile isotopes for energy multiplication, and production of fissile isotopes for sustained power. In previous work, explicit cross-sectional uncertainty analyses were performed for reaction rates related to the figures of merit for the depleted uranium hybrid LIFE blanket. Counting precision was also quantified for both the figures of merit themselves and the cross-sectional uncertainty estimates to gauge the validity of the analysis.
All cross-sectional uncertainties were small (0.1-0.8%), bounded counting uncertainties, and were precise with regard to counting precision. Adjoint/importance distributions were generated for the same reaction rates. The current work leverages those adjoint distributions to transition from explicit sensitivities, in which the neutron flux is constrained, to implicit sensitivities, in which the neutron flux responds to input perturbations. This treatment vastly expands the set of data that contribute to uncertainties to produce larger, more physically accurate uncertainty estimates.
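    Whether the sensitivities are explicit or adjoint-derived implicit ones, they combine with a nuclear-data covariance matrix through the usual first-order sandwich rule, var(R) = S C S^T. A minimal sketch with invented two-group numbers (not LIFE blanket data):

```python
def sandwich_variance(S, C):
    """First-order uncertainty propagation var(R) = S C S^T, with S a
    vector of relative sensitivities and C a relative covariance matrix."""
    n = len(S)
    CS = [sum(C[i][j] * S[j] for j in range(n)) for i in range(n)]
    return sum(S[i] * CS[i] for i in range(n))

# Invented example: two cross sections with 1% and 0.5% relative standard
# deviations, correlation 0.3; response sensitivities 0.8 and -0.2.
sd = [0.01, 0.005]
corr = 0.3
C = [[sd[0] ** 2, corr * sd[0] * sd[1]],
     [corr * sd[0] * sd[1], sd[1] ** 2]]
S = [0.8, -0.2]
rel_sd = sandwich_variance(S, C) ** 0.5  # relative std. dev. of the response
```

    Moving from explicit to implicit sensitivities changes the entries of S (the flux is allowed to respond), not the propagation formula itself.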

  7. Uncertainty in geocenter estimates in the context of ITRF2014

    NASA Astrophysics Data System (ADS)

    Riddell, Anna R.; King, Matt A.; Watson, Christopher S.; Sun, Yu; Riva, Riccardo E. M.; Rietbroek, Roelof

    2017-05-01

    Uncertainty in the geocenter position and its subsequent motion affects positioning estimates on the surface of the Earth and downstream products such as site velocities, particularly the vertical component. The current version of the International Terrestrial Reference Frame, ITRF2014, derives its origin as the long-term averaged center of mass as sensed by satellite laser ranging (SLR), and by definition, it adopts only linear motion of the origin with uncertainty determined using a white noise process. We compare weekly SLR translations relative to the ITRF2014 origin, with network translations estimated from station displacements from surface mass transport models. We find that the proportion of variance explained in SLR translations by the model-derived translations is on average less than 10%. Time-correlated noise and nonlinear rates, particularly evident in the Y and Z components of the SLR translations with respect to the ITRF2014 origin, are not fully replicated by the model-derived translations. This suggests that translation-related uncertainties are underestimated when a white noise model is adopted and that substantial systematic errors remain in the data defining the ITRF origin. When using a white noise model, we find uncertainties in the rate of SLR X, Y, and Z translations of ±0.03, ±0.03, and ±0.06 mm/yr, respectively, increasing to ±0.13, ±0.17, and ±0.33 mm/yr (1 sigma) when a power-law plus white noise model is adopted.
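    The white-noise rate uncertainty quoted above is what ordinary least squares delivers when residuals are assumed independent; a sketch on synthetic weekly translations (illustrative drift and noise levels, not the actual SLR series):

```python
import math
import random
from statistics import mean

def trend_and_sigma(t, y):
    """OLS rate and its 1-sigma uncertainty under the white-noise
    assumption; time-correlated (power-law) noise would inflate this."""
    tb, yb = mean(t), mean(y)
    sxx = sum((ti - tb) ** 2 for ti in t)
    slope = sum((ti - tb) * (yi - yb) for ti, yi in zip(t, y)) / sxx
    resid = [(yi - yb) - slope * (ti - tb) for ti, yi in zip(t, y)]
    s2 = sum(r * r for r in resid) / (len(t) - 2)
    return slope, math.sqrt(s2 / sxx)

# Synthetic weekly Z-translations (mm) over ~10 years: 0.5 mm/yr drift
# plus 2 mm white noise (made-up values for illustration).
rng = random.Random(0)
t = [i / 52.0 for i in range(520)]
y = [0.5 * ti + rng.gauss(0.0, 2.0) for ti in t]
rate, sigma = trend_and_sigma(t, y)
```

    Refitting the same series with a power-law noise model (not shown) typically multiplies sigma several-fold, which is exactly the pattern the paper reports.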

  8. Uncertainty in Operational Atmospheric Analyses and Re-Analyses

    NASA Astrophysics Data System (ADS)

    Langland, R.; Maue, R. N.

    2016-12-01

    This talk will describe uncertainty in atmospheric analyses of wind and temperature produced by operational forecast models and in re-analysis products. Because the "true" atmospheric state cannot be precisely quantified, there is necessarily error in every atmospheric analysis, and this error can be estimated by computing differences (variance and bias) between analysis products produced at various centers (e.g., ECMWF, NCEP, U.S. Navy, etc.) that use independent data assimilation procedures, somewhat different sets of atmospheric observations and forecast models with different resolutions, dynamical equations, and physical parameterizations. These estimates of analysis uncertainty provide a useful proxy to actual analysis error. For this study, we use a unique multi-year and multi-model data archive developed at NRL-Monterey. It will be shown that current uncertainty in atmospheric analyses is closely correlated with the geographic distribution of assimilated in-situ atmospheric observations, especially those provided by high-accuracy radiosonde and commercial aircraft observations. The lowest atmospheric analysis uncertainty is found over North America, Europe and Eastern Asia, which have the largest numbers of radiosonde and commercial aircraft observations. Analysis uncertainty is substantially larger (by factors of two to three times) in most of the Southern hemisphere, the North Pacific ocean, and under-developed nations of Africa and South America where there are few radiosonde or commercial aircraft data. It appears that in regions where atmospheric analyses depend primarily on satellite radiance observations, analysis uncertainty of both temperature and wind remains relatively high compared to values found over North America and Europe.

  9. Calculating when elective abdominal aortic aneurysm repair improves survival for individual patients: development of the Aneurysm Repair Decision Aid and economic evaluation.

    PubMed

    Grant, Stuart W; Sperrin, Matthew; Carlson, Eric; Chinai, Natasha; Ntais, Dionysios; Hamilton, Matthew; Dunn, Graham; Buchan, Iain; Davies, Linda; McCollum, Charles N

    2015-04-01

    Abdominal aortic aneurysm (AAA) repair aims to prevent premature death from AAA rupture. Elective repair is currently recommended when AAA diameter reaches 5.5 cm (men) and 5.0 cm (women). Applying population-based indications may not be appropriate for individual patient decisions, as the optimal indication is likely to differ between patients based on age and comorbidities. To develop an Aneurysm Repair Decision Aid (ARDA) to indicate when elective AAA repair optimises survival for individual patients and to assess the cost-effectiveness and associated uncertainty of elective repair at the aneurysm diameter recommended by the ARDA compared with current practice. The UK Vascular Governance North West and National Vascular Database provided individual patient data to develop predictive models for perioperative mortality and survival. Data from published literature were used to model AAA growth and risk of rupture. The cost-effectiveness analysis used data from published literature and from local and national databases. A combination of systematic review methods and clinical registries were used to provide data to populate models and inform the structure of the ARDA. Discrete event simulation (DES) was used to model the patient journey from diagnosis to death and synthesised data were used to estimate patient outcomes and costs for elective repair at alternative aneurysm diameters. Eight patient clinical scenarios (vignettes) were used as exemplars. The DES structure was validated by clinical and statistical experts. The economic evaluation estimated costs, quality-adjusted life-years (QALYs) and incremental cost-effectiveness ratios (ICERs) from the NHS, social care provider and patient perspective over a lifetime horizon. Cost-effectiveness acceptability analyses and probabilistic sensitivity analyses explored uncertainty in the data and the value for money of ARDA-based decisions. 
The ARDA outcome measures include perioperative mortality risk, annual risk of rupture, 1-, 5- and 10-year survival, postoperative long-term survival, median life expectancy and predicted time to current threshold for aneurysm repair. The primary economic measure was the ICER using the QALY as the measure of health benefit. The analysis demonstrated it is feasible to build and run a complex clinical decision aid using DES. The model results support current guidelines for most vignettes but suggest that earlier repair may be effective in younger, fitter patients and ongoing surveillance may be effective in elderly patients with comorbidities. The model adds information to support decisions for patients with aneurysms outside current indications. The economic evaluation suggests that using the ARDA compared with current guidelines could be cost-effective but there is a high level of uncertainty. Lack of high-quality long-term data to populate all sections of the model meant that there is high uncertainty about the long-term clinical and economic consequences of repair. Modelling assumptions were necessary and the developed survival models require external validation. The ARDA provides detailed information on the potential consequences of AAA repair or a decision not to repair that may be helpful to vascular surgeons and their patients in reaching informed decisions. Further research is required to reduce uncertainty about key data, including reintervention following AAA repair, and assess the acceptability and feasibility of the ARDA for use in routine clinical practice. The National Institute for Health Research Health Technology Assessment programme.
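    The primary economic measure reduces to a one-line calculation once the DES has produced lifetime costs and QALYs for the two strategies; the figures below are invented, not taken from the vignettes:

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained
    for one strategy (e.g. ARDA-guided repair) over a comparator."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical lifetime costs (GBP) and QALYs for one patient vignette.
value = icer(cost_new=14500.0, qaly_new=7.9,
             cost_old=13200.0, qaly_old=7.6)
```

    An ICER below the decision-maker's willingness-to-pay threshold (commonly 20,000 GBP per QALY in NHS appraisals) would favor the new strategy; the probabilistic sensitivity analysis then asks how often that holds across the uncertain inputs.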

  10. Woody plants and the prediction of climate-change impacts on bird diversity.

    PubMed

    Kissling, W D; Field, R; Korntheuer, H; Heyder, U; Böhning-Gaese, K

    2010-07-12

    Current methods of assessing climate-induced shifts of species distributions rarely account for species interactions and usually ignore potential differences in response times of interacting taxa to climate change. Here, we used species-richness data from 1005 breeding bird and 1417 woody plant species in Kenya and employed model-averaged coefficients from regression models and median climatic forecasts assembled across 15 climate-change scenarios to predict bird species richness under climate change. Forecasts assuming an instantaneous response of woody plants and birds to climate change suggested increases in future bird species richness across most of Kenya whereas forecasts assuming strongly lagged woody plant responses to climate change indicated a reversed trend, i.e. reduced bird species richness. Uncertainties in predictions of future bird species richness were geographically structured, mainly owing to uncertainties in projected precipitation changes. We conclude that assessments of future species responses to climate change are very sensitive to current uncertainties in regional climate-change projections, and to the inclusion or not of time-lagged interacting taxa. We expect even stronger effects for more specialized plant-animal associations. Given the slow response time of woody plant distributions to climate change, current estimates of future biodiversity of many animal taxa may be both biased and too optimistic.

  11. Numerical uncertainty in computational engineering and physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemez, Francois M

    2009-01-01

    Obtaining a solution that approximates ordinary or partial differential equations on a computational mesh or grid does not necessarily mean that the solution is accurate or even 'correct'. Unfortunately assessing the quality of discrete solutions by questioning the role played by spatial and temporal discretizations generally comes as a distant third to test-analysis comparison and model calibration. This publication is contributed to raise awareness of the fact that discrete solutions introduce numerical uncertainty. This uncertainty may, in some cases, overwhelm in complexity and magnitude other sources of uncertainty that include experimental variability, parametric uncertainty and modeling assumptions. The concepts of consistency, convergence and truncation error are overviewed to explain the articulation between the exact solution of continuous equations, the solution of modified equations and discrete solutions computed by a code. The current state-of-the-practice of code and solution verification activities is discussed. An example in the discipline of hydrodynamics illustrates the significant effect that meshing can have on the quality of code predictions. A simple method is proposed to derive bounds of solution uncertainty in cases where the exact solution of the continuous equations, or its modified equations, is unknown. It is argued that numerical uncertainty originating from mesh discretization should always be quantified and accounted for in the overall uncertainty 'budget' that supports decision-making for applications in computational physics and engineering.
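    The consistency/convergence discussion lends itself to a short worked example: with solutions on three systematically refined meshes, the observed order of convergence and a Richardson-extrapolated estimate bound the numerical uncertainty. The values below are fabricated for illustration:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Observed order of convergence from three solutions on grids
    refined by a constant ratio r (Richardson-style verification)."""
    return math.log(abs(f_coarse - f_medium) /
                    abs(f_medium - f_fine)) / math.log(r)

def richardson_estimate(f_medium, f_fine, p, r=2.0):
    """Extrapolated (mesh-free) solution estimate; its distance from the
    fine-grid value is a first-order bound on the numerical uncertainty."""
    return f_fine + (f_fine - f_medium) / (r ** p - 1.0)

# Hypothetical peak-pressure values on three successively refined meshes.
p_obs = observed_order(0.970, 0.985, 0.9925)
f_extrap = richardson_estimate(0.985, 0.9925, p_obs)
```

    Here the differences halve under refinement, so p_obs is 1 and the extrapolated value is 1.0; the 0.0075 gap between fine grid and extrapolation is the mesh-discretization term to carry into the uncertainty budget.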

  12. Quantifying the Uncertainty in Discharge Data Using Hydraulic Knowledge and Uncertain Gaugings

    NASA Astrophysics Data System (ADS)

    Renard, B.; Le Coz, J.; Bonnifait, L.; Branger, F.; Le Boursicaud, R.; Horner, I.; Mansanarez, V.; Lang, M.

    2014-12-01

    River discharge is a crucial variable for Hydrology: as the output variable of most hydrologic models, it is used for sensitivity analyses, model structure identification, parameter estimation, data assimilation, prediction, etc. A major difficulty stems from the fact that river discharge is not measured continuously. Instead, discharge time series used by hydrologists are usually based on simple stage-discharge relations (rating curves) calibrated using a set of direct stage-discharge measurements (gaugings). In this presentation, we present a Bayesian approach to build such hydrometric rating curves, to estimate the associated uncertainty and to propagate this uncertainty to discharge time series. The three main steps of this approach are described: (1) Hydraulic analysis: identification of the hydraulic controls that govern the stage-discharge relation, identification of the rating curve equation and specification of prior distributions for the rating curve parameters; (2) Rating curve estimation: Bayesian inference of the rating curve parameters, accounting for the individual uncertainties of available gaugings, which often differ according to the discharge measurement procedure and the flow conditions; (3) Uncertainty propagation: quantification of the uncertainty in discharge time series, accounting for both the rating curve uncertainties and the uncertainty of recorded stage values. In addition, we also discuss current research activities, including the treatment of non-univocal stage-discharge relationships (e.g. due to hydraulic hysteresis, vegetation growth, sudden change of the geometry of the section, etc.).
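    Step (2) can be illustrated in miniature. A single hydraulic control gives a rating curve of the form Q = a(h - b)^c; with the cease-to-flow stage b fixed from the hydraulic analysis, a and c follow from a log-linear fit. This least-squares sketch (with invented gaugings) ignores the gauging uncertainties and prior distributions that the full Bayesian inference carries:

```python
import math
from statistics import mean

def fit_rating_curve(h, q, b):
    """Log-linear least-squares fit of Q = a*(h - b)^c with the
    cease-to-flow stage b fixed (simplified stand-in for the Bayesian
    estimation, which also infers b and weights uncertain gaugings)."""
    x = [math.log(hi - b) for hi in h]
    y = [math.log(qi) for qi in q]
    xb, yb = mean(x), mean(y)
    c = sum((xi - xb) * (yi - yb) for xi, yi in zip(x, y)) / \
        sum((xi - xb) ** 2 for xi in x)
    a = math.exp(yb - c * xb)
    return a, c

# Hypothetical gaugings: stage (m) and discharge (m^3/s), with b = 0.2 m.
stage = [0.5, 0.8, 1.2, 1.8, 2.5]
disch = [1.6, 4.7, 10.5, 23.0, 44.0]
a, c = fit_rating_curve(stage, disch, b=0.2)
```

    The Bayesian treatment replaces this point fit with posterior distributions over (a, b, c), which is what allows the uncertainty to be propagated to the discharge time series.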

  13. Quantification of LiDAR measurement uncertainty through propagation of errors due to sensor sub-systems and terrain morphology

    NASA Astrophysics Data System (ADS)

    Goulden, T.; Hopkinson, C.

    2013-12-01

    The quantification of LiDAR sensor measurement uncertainty is important for evaluating the quality of derived DEM products, compiling risk assessment of management decisions based from LiDAR information, and enhancing LiDAR mission planning capabilities. Current quality assurance estimates of LiDAR measurement uncertainty are limited to post-survey empirical assessments or vendor estimates from commercial literature. Empirical evidence can provide valuable information for the performance of the sensor in validated areas; however, it cannot characterize the spatial distribution of measurement uncertainty throughout the extensive coverage of typical LiDAR surveys. Vendor advertised error estimates are often restricted to strict and optimal survey conditions, resulting in idealized values. Numerical modeling of individual pulse uncertainty provides an alternative method for estimating LiDAR measurement uncertainty. LiDAR measurement uncertainty is theoretically assumed to fall into three distinct categories, 1) sensor sub-system errors, 2) terrain influences, and 3) vegetative influences. This research details the procedures for numerical modeling of measurement uncertainty from the sensor sub-system (GPS, IMU, laser scanner, laser ranger) and terrain influences. Results show that errors tend to increase as the laser scan angle, altitude or laser beam incidence angle increase. An experimental survey over a flat and paved runway site, performed with an Optech ALTM 3100 sensor, showed an increase in modeled vertical errors of 5 cm, at a nadir scan orientation, to 8 cm at scan edges; for an aircraft altitude of 1200 m and half scan angle of 15°. In a survey with the same sensor, at a highly sloped glacial basin site absent of vegetation, modeled vertical errors reached over 2 m. Validation of error models within the glacial environment, over three separate flight lines, respectively showed 100%, 85%, and 75% of elevation residuals fell below error predictions. 
Future work in LiDAR sensor measurement uncertainty must focus on the development of vegetative error models to create more robust error prediction algorithms. To achieve this objective, comprehensive empirical exploratory analysis is recommended to relate vegetative parameters to observed errors.
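    The reported growth of vertical error with scan angle and altitude can be reproduced qualitatively with a first-order, flat-terrain error budget; the sensor sigmas below are illustrative assumptions, not Optech ALTM 3100 specifications:

```python
import math

def vertical_sigma(altitude, scan_deg, sigma_range=0.02,
                   sigma_angle=math.radians(0.005), sigma_gps=0.04):
    """First-order 1-sigma vertical error for a single pulse over flat
    terrain: z = R*cos(theta) with slant range R = altitude/cos(theta).
    Range, angular and GPS sigmas are illustrative values (in m, rad, m)."""
    th = math.radians(scan_deg)
    R = altitude / math.cos(th)
    var = ((math.cos(th) * sigma_range) ** 2      # ranger contribution
           + (R * math.sin(th) * sigma_angle) ** 2  # scanner/IMU angle
           + sigma_gps ** 2)                        # GPS vertical
    return math.sqrt(var)

nadir = vertical_sigma(altitude=1200.0, scan_deg=0.0)
edge = vertical_sigma(altitude=1200.0, scan_deg=15.0)
```

    With these assumed sigmas the nadir error is a few centimetres and grows toward the swath edge, matching the 5 cm to 8 cm pattern reported for the runway survey; sloped terrain adds a further term proportional to the horizontal error times the tangent of the slope.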

  14. Optimal CO2 mitigation under damage risk valuation

    NASA Astrophysics Data System (ADS)

    Crost, Benjamin; Traeger, Christian P.

    2014-07-01

    The current generation has to set mitigation policy under uncertainty about the economic consequences of climate change. This uncertainty governs both the level of damages for a given level of warming, and the steepness of the increase in damage per warming degree. Our model of climate and the economy is a stochastic version of a model employed in assessing the US Social Cost of Carbon (DICE). We compute the optimal carbon taxes and CO2 abatement levels that maximize welfare from economic consumption over time under different risk states. In accordance with recent developments in finance, we separate preferences about time and risk to improve the model's calibration of welfare to observed market interest rates. We show that introducing the modern asset pricing framework doubles optimal abatement and carbon taxation. Uncertainty over the level of damages at a given temperature increase can result in a slight increase of optimal emissions as compared to using expected damages. In contrast, uncertainty governing the steepness of the damage increase in temperature results in a substantially higher level of optimal mitigation.

  15. A Review and Analysis of Remote Sensing Capability for Air Quality Measurements as a Potential Decision Support Tool Conducted by the NASA DEVELOP Program

    NASA Technical Reports Server (NTRS)

    Ross, A.; Richards, A.; Keith, K.; Frew, C.; Boseck, J.; Sutton, S.; Watts, C.; Rickman, D.

    2007-01-01

    This project focused on a comprehensive utilization of air quality model products as decision support tools (DST) needed for public health applications. A review of past and future air quality measurement methods and their uncertainty, along with the relationship of air quality to national and global public health, is vital. This project described current and future NASA satellite remote sensing and ground sensing capabilities and the potential for using these sensors to enhance the prediction, prevention, and control of public health effects that result from poor air quality. The qualitative uncertainty of current satellite remotely sensed air quality, the ground-based remotely sensed air quality, the air quality/public health model, and the decision making process is evaluated in this study. Current peer-reviewed literature suggests that remotely sensed air quality parameters correlate well with ground-based sensor data. Complementary satellite and ground-based sensing data are needed to enhance the models/tools used by policy makers for the protection of national and global public health communities.

  16. Capstone Teaching Models: Combining Simulation, Analytical Intuitive Learning Processes, History and Effectiveness

    ERIC Educational Resources Information Center

    Reid, Maurice; Brown, Steve; Tabibzadeh, Kambiz

    2012-01-01

    For the past decade teaching models have been changing, reflecting the dynamics, complexities, and uncertainties of today's organizations. The traditional and the more current active models of learning have disadvantages. Simulation provides a platform to combine the best aspects of both types of teaching practices. This research explores the…

  17. Generation Expansion Planning With Large Amounts of Wind Power via Decision-Dependent Stochastic Programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhan, Yiduo; Zheng, Qipeng P.; Wang, Jianhui

    Power generation expansion planning needs to deal with future uncertainties carefully, given that the invested generation assets will be in operation for a long time. Many stochastic programming models have been proposed to tackle this challenge. However, most previous works assume predetermined future uncertainties (i.e., fixed random outcomes with given probabilities). In several recent studies of generation assets' planning (e.g., thermal versus renewable), new findings show that the investment decisions could affect the future uncertainties as well. To this end, this paper proposes a multistage decision-dependent stochastic optimization model for long-term large-scale generation expansion planning, where large amounts of wind power are involved. In the decision-dependent model, the future uncertainties are not only affecting but also affected by the current decisions. In particular, the probability distribution function is determined by not only input parameters but also decision variables. To deal with the nonlinear constraints in our model, a quasi-exact solution approach is then introduced to reformulate the multistage stochastic investment model to a mixed-integer linear programming model. The wind penetration, investment decisions, and the optimality of the decision-dependent model are evaluated in a series of multistage case studies. The results show that the proposed decision-dependent model provides effective optimization solutions for long-term generation expansion planning.

  18. On the formulation of a minimal uncertainty model for robust control with structured uncertainty

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert

    1991-01-01

    In the design and analysis of robust control systems for uncertain plants, representing the system transfer matrix in the form of what has come to be termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents a transfer function matrix M(s) of the nominal closed loop system, and the delta represents an uncertainty matrix acting on M(s). The nominal closed loop system M(s) results from closing the feedback control system, K(s), around a nominal plant interconnection structure P(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or unstructured uncertainty from unmodeled dynamics and other neglected phenomena. In general, delta is a block diagonal matrix, but for real parameter variations delta is a diagonal matrix of real elements. Conceptually, the M-delta structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the currently available literature addresses computational methods for obtaining this structure, and none of this literature addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty, where the term minimal refers to the dimension of the delta matrix. Since having a minimally dimensioned delta matrix would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta would be useful. Hence, a method of obtaining the interconnection system P(s) is required. A generalized procedure for obtaining a minimal P-delta structure for systems with real parameter variations is presented. Using this model, the minimal M-delta model can then be easily obtained by closing the feedback loop.
The procedure involves representing the system in a cascade-form state-space realization, determining the minimal uncertainty matrix, delta, and constructing the state-space representation of P(s). Three examples are presented to illustrate the procedure.
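    Closing the feedback loop to obtain M from P and K is a lower linear fractional transformation, M = P11 + P12 K (I - P22 K)^{-1} P21. A scalar (SISO, fixed-frequency) sketch with made-up interconnection values:

```python
def close_loop(p11, p12, p21, p22, k):
    """Lower LFT: M = P11 + P12*K*(1 - P22*K)^-1 * P21 (scalar case).
    The uncertainty block delta then acts on this closed-loop M."""
    return p11 + p12 * k * p21 / (1.0 - p22 * k)

# Made-up interconnection values evaluated at one frequency.
m = close_loop(p11=0.5, p12=1.0, p21=1.0, p22=-0.25, k=2.0)
```

    In the multivariable case the same formula holds with matrix inverses, and the dimension of the delta block that wraps around M is exactly what the minimality procedure seeks to reduce.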

  19. Measurements of downwelling far-infrared radiance during the RHUBC-II campaign at Cerro Toco, Chile and comparisons with line-by-line radiative transfer calculations

    NASA Astrophysics Data System (ADS)

    Mast, Jeffrey C.; Mlynczak, Martin G.; Cageao, Richard P.; Kratz, David P.; Latvakoski, Harri; Johnson, David G.; Turner, David D.; Mlawer, Eli J.

    2017-09-01

    Downwelling radiances at the Earth's surface measured by the Far-Infrared Spectroscopy of the Troposphere (FIRST) instrument in an environment with integrated precipitable water (IPW) as low as 0.03 cm are compared with calculated spectra in the far-infrared and mid-infrared. FIRST (a Fourier transform spectrometer) was deployed from August through October 2009 at 5.38 km MSL on Cerro Toco, a mountain in the Atacama Desert of Chile. There FIRST took part in the Radiative Heating in Unexplored Bands Campaign Part 2 (RHUBC-II), the goal of which is the assessment of water vapor spectroscopy. Radiosonde water vapor and temperature vertical profiles are input into the Atmospheric and Environmental Research (AER) Line-by-Line Radiative Transfer Model (LBLRTM) to compute modeled radiances. The LBLRTM minus FIRST residual spectrum is calculated to assess agreement. Uncertainties (1-σ) in both the measured and modeled radiances are also determined. Measured and modeled radiances nearly all agree to within combined (total) uncertainties. Residual features that exceed the combined uncertainty can be brought within it by increasing water vapor and model continuum absorption, although this may not be necessary given that the uncertainties are quoted at the 1-σ (68% confidence) level. Furthermore, the uncertainty in the measurement-model residual is very large and no additional information on the adequacy of current water vapor spectral line or continuum absorption parameters may be derived. Similar future experiments in similarly cold and dry environments will require absolute accuracy of 0.1% of a 273 K blackbody in radiance and water vapor accuracy of ∼3% in the profile layers contributing to downwelling radiance at the surface.

  20. The next GUM and its proposals: a comparison study

    NASA Astrophysics Data System (ADS)

    Damasceno, J. C.; Couto, P. R. G.

    2018-03-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) is currently under revision. New proposals for its implementation were circulated in the form of a draft document. Two of the main changes are explored in this work using a Brinell hardness model example. Changes in the evaluation of uncertainty for repeated indications and in the construction of coverage intervals are compared with the classic GUM and with the Monte Carlo simulation method.

  1. Quantifying the uncertainty in discharge data using hydraulic knowledge and uncertain gaugings: a Bayesian method named BaRatin

    NASA Astrophysics Data System (ADS)

    Le Coz, Jérôme; Renard, Benjamin; Bonnifait, Laurent; Branger, Flora; Le Boursicaud, Raphaël; Horner, Ivan; Mansanarez, Valentin; Lang, Michel; Vigneau, Sylvain

    2015-04-01

    River discharge is a crucial variable for hydrology: as the output variable of most hydrologic models, it is used for sensitivity analyses, model structure identification, parameter estimation, data assimilation, prediction, etc. A major difficulty stems from the fact that river discharge is not measured continuously. Instead, discharge time series used by hydrologists are usually based on simple stage-discharge relations (rating curves) calibrated using a set of direct stage-discharge measurements (gaugings). In this presentation, we describe a Bayesian approach (cf. Le Coz et al., 2014) to build such hydrometric rating curves, to estimate the associated uncertainty and to propagate this uncertainty to discharge time series. The three main steps of this approach are described: (1) Hydraulic analysis: identification of the hydraulic controls that govern the stage-discharge relation, identification of the rating curve equation and specification of prior distributions for the rating curve parameters; (2) Rating curve estimation: Bayesian inference of the rating curve parameters, accounting for the individual uncertainties of available gaugings, which often differ according to the discharge measurement procedure and the flow conditions; (3) Uncertainty propagation: quantification of the uncertainty in discharge time series, accounting for both the rating curve uncertainties and the uncertainty of recorded stage values. The rating curve uncertainties combine the parametric uncertainties and the remnant uncertainties that reflect the limited accuracy of the mathematical model used to simulate the physical stage-discharge relation. In addition, we also discuss current research activities, including the treatment of non-univocal stage-discharge relationships (e.g. due to hydraulic hysteresis, vegetation growth, sudden changes of the geometry of the section, etc.).
An operational version of the BaRatin software and its graphical interface are made available free of charge on request to the authors. J. Le Coz, B. Renard, L. Bonnifait, F. Branger, R. Le Boursicaud (2014). Combining hydraulic knowledge and uncertain gaugings in the estimation of hydrometric rating curves: a Bayesian approach, Journal of Hydrology, 509, 573-587.
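    The three-step workflow described above can be sketched in miniature. The following is an illustrative toy, not the BaRatin software: it assumes a single-control power-law rating curve Q = a(h - b)^c, synthetic gaugings with roughly 5% individual uncertainty, and a plain random-walk Metropolis sampler for the posterior of (a, b, c):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-control power-law rating curve: Q = a * (h - b)^c
def rating(h, a, b, c):
    return a * np.clip(h - b, 1e-9, None) ** c

# Synthetic gaugings with individual (heteroscedastic) uncertainties
true = dict(a=20.0, b=0.5, c=1.8)
h_obs = np.linspace(0.8, 3.0, 12)
sigma = 0.05 * rating(h_obs, **true)              # ~5% gauging uncertainty
q_obs = rating(h_obs, **true) + rng.normal(0, sigma)

def log_post(theta):
    a, b, c = theta
    # Weak priors expressed here only as physical bounds on the parameters
    if a <= 0 or c <= 0 or b >= h_obs.min():
        return -np.inf
    resid = q_obs - rating(h_obs, a, b, c)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Plain random-walk Metropolis sampler for the posterior of (a, b, c)
theta = np.array([10.0, 0.3, 1.5])
lp = log_post(theta)
samples = []
for i in range(20000):
    prop = theta + rng.normal(0, [0.5, 0.02, 0.05])
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if i > 5000:                                  # discard burn-in
        samples.append(theta.copy())
samples = np.array(samples)

# Parametric discharge uncertainty at a stage of 2.0 m
q_post = rating(2.0, samples[:, 0], samples[:, 1], samples[:, 2])
print(np.percentile(q_post, [5, 50, 95]))
```

    Propagating the posterior samples through the rating equation at each recorded stage value would then yield the discharge uncertainty bands of step (3); the full method also includes a remnant-error term not shown here.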

  2. Uncertainty analysis of vegetation distribution in the northern high latitudes during the 21st century with a dynamic vegetation model

    PubMed Central

    Jiang, Yueyang; Zhuang, Qianlai; Schaphoff, Sibyll; Sitch, Stephen; Sokolov, Andrei; Kicklighter, David; Melillo, Jerry

    2012-01-01

    This study aims to assess how high-latitude vegetation may respond under various climate scenarios during the 21st century, with a focus on analyzing the uncertainty induced by model parameters and how it compares to the uncertainty induced by various climates. The analysis was based on a set of 10,000 Monte Carlo ensemble Lund-Potsdam-Jena (LPJ) simulations for the northern high latitudes (45°N and poleward) for the period 1900–2100. The LPJ Dynamic Global Vegetation Model (LPJ-DGVM) was run under contemporary and future climates from four Special Report Emission Scenarios (SRES), A1FI, A2, B1, and B2, based on the Hadley Centre General Circulation Model (GCM), and six climate scenarios, X901M, X902L, X903H, X904M, X905L, and X906H from the Integrated Global System Model (IGSM) at the Massachusetts Institute of Technology (MIT). In the current dynamic vegetation model, some parameters are more important than others in determining the vegetation distribution. Parameters that control plant carbon uptake and light-use efficiency have the predominant influence on the vegetation distribution of both woody and herbaceous plant functional types. The relative importance of different parameters varies temporally and spatially and is influenced by climate inputs. In addition to climate, these parameters play an important role in determining the vegetation distribution in the region. The parameter-based uncertainties contribute most to the total uncertainty. The current warming conditions lead to a complexity of vegetation responses in the region. Temperate trees will be more sensitive to climate variability, compared with boreal forest trees and C3 perennial grasses. This sensitivity would result in a unanimous northward greenness migration due to anomalous warming in the northern high latitudes.
Temporally, boreal needleleaved evergreen plants are projected to decline considerably, and a large portion of C3 perennial grass is projected to disappear by the end of the 21st century. In contrast, the area of temperate trees would increase, especially under the most extreme A1FI scenario. As the warming continues, the northward greenness expansion in the Arctic region could continue. PMID:22822437
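    A Monte Carlo parameter ensemble of the kind used above can be illustrated with a toy stand-in for the vegetation model. The function, parameter names and ranges below are invented for illustration and are not LPJ's actual parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for a vegetation model: woody fraction as a logistic
# function of growing-season temperature, with two uncertain parameters.
def woody_fraction(temp, t_opt, slope):
    return 1.0 / (1.0 + np.exp(-slope * (temp - t_opt)))

n = 10000
t_opt = rng.normal(8.0, 1.5, n)             # parameter uncertainty
slope = rng.normal(0.6, 0.15, n)
temps = rng.choice([7.0, 8.5, 10.0], n)     # "climate scenario" uncertainty

frac = woody_fraction(temps, t_opt, slope)

# Crude variance partition: spread across parameters at fixed climate,
# versus spread across climates at the mean parameter values
within = np.mean([woody_fraction(t, t_opt, slope).var()
                  for t in (7.0, 8.5, 10.0)])
between = woody_fraction(np.array([7.0, 8.5, 10.0]), 8.0, 0.6).var()
print(within, between)
```

    With these invented numbers the parameter-induced spread dominates the climate-induced spread, echoing the abstract's finding that parameter-based uncertainties contribute most to the total.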

  3. The uncertainty of crop yield projections is reduced by improved temperature response functions.

    PubMed

    Wang, Enli; Martre, Pierre; Zhao, Zhigan; Ewert, Frank; Maiorano, Andrea; Rötter, Reimund P; Kimball, Bruce A; Ottman, Michael J; Wall, Gerard W; White, Jeffrey W; Reynolds, Matthew P; Alderman, Phillip D; Aggarwal, Pramod K; Anothai, Jakarat; Basso, Bruno; Biernath, Christian; Cammarano, Davide; Challinor, Andrew J; De Sanctis, Giacomo; Doltra, Jordi; Fereres, Elias; Garcia-Vila, Margarita; Gayler, Sebastian; Hoogenboom, Gerrit; Hunt, Leslie A; Izaurralde, Roberto C; Jabloun, Mohamed; Jones, Curtis D; Kersebaum, Kurt C; Koehler, Ann-Kristin; Liu, Leilei; Müller, Christoph; Naresh Kumar, Soora; Nendel, Claas; O'Leary, Garry; Olesen, Jørgen E; Palosuo, Taru; Priesack, Eckart; Eyshi Rezaei, Ehsan; Ripoche, Dominique; Ruane, Alex C; Semenov, Mikhail A; Shcherbak, Iurii; Stöckle, Claudio; Stratonovitch, Pierre; Streck, Thilo; Supit, Iwan; Tao, Fulu; Thorburn, Peter; Waha, Katharina; Wallach, Daniel; Wang, Zhimin; Wolf, Joost; Zhu, Yan; Asseng, Senthold

    2017-07-17

    Increasing the accuracy of crop productivity estimates is a key element in planning adaptation strategies to ensure global food security under climate change. Process-based crop models are effective means to project climate impact on crop yield, but have large uncertainty in yield simulations. Here, we show that variations in the mathematical functions currently used to simulate temperature responses of physiological processes in 29 wheat models account for >50% of uncertainty in simulated grain yields for mean growing season temperatures from 14 °C to 33 °C. We derived a set of new temperature response functions that when substituted in four wheat models reduced the error in grain yield simulations across seven global sites with different temperature regimes by 19% to 50% (42% average). We anticipate the improved temperature responses to be a key step to improve modelling of crops under rising temperature and climate change, leading to higher skill of crop yield projections.
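    Temperature response functions of the general kind discussed above are often written as a beta-type curve between cardinal temperatures; the Wang-Engel form is one common choice. The cardinal temperatures below are hypothetical placeholders, not the fitted values from this study:

```python
import numpy as np

# Illustrative beta-type (Wang-Engel form) temperature response, scaled to
# peak at 1 at t_opt and reach 0 at t_min and t_max (hypothetical cardinals).
def temp_response(t, t_min=0.0, t_opt=27.0, t_max=40.0):
    t = np.asarray(t, dtype=float)
    alpha = np.log(2.0) / np.log((t_max - t_min) / (t_opt - t_min))
    x = np.clip((t - t_min) / (t_opt - t_min), 0.0, None)
    f = 2.0 * x**alpha - x**(2.0 * alpha)
    f = np.where((t <= t_min) | (t >= t_max), 0.0, f)
    return np.clip(f, 0.0, 1.0)

print(temp_response([10.0, 27.0, 35.0]))  # rises to 1 at t_opt, falls beyond
```

    By construction f(t_opt) = 1 and f(t_max) = 0, since alpha is chosen so that ((t_max - t_min)/(t_opt - t_min))^alpha = 2.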

  4. The Uncertainty of Crop Yield Projections Is Reduced by Improved Temperature Response Functions

    NASA Technical Reports Server (NTRS)

    Wang, Enli; Martre, Pierre; Zhao, Zhigan; Ewert, Frank; Maiorano, Andrea; Rotter, Reimund P.; Kimball, Bruce A.; Ottman, Michael J.; White, Jeffrey W.; Reynolds, Matthew P.; hide

    2017-01-01

    Increasing the accuracy of crop productivity estimates is a key element in planning adaptation strategies to ensure global food security under climate change. Process-based crop models are effective means to project climate impact on crop yield, but have large uncertainty in yield simulations. Here, we show that variations in the mathematical functions currently used to simulate temperature responses of physiological processes in 29 wheat models account for more than 50% of uncertainty in simulated grain yields for mean growing season temperatures from 14 °C to 33 °C. We derived a set of new temperature response functions that when substituted in four wheat models reduced the error in grain yield simulations across seven global sites with different temperature regimes by 19% to 50% (42% average). We anticipate the improved temperature responses to be a key step to improve modelling of crops under rising temperature and climate change, leading to higher skill of crop yield projections.

  5. Cirrus Susceptibility to Changes in Ice Nuclei: Physical Processes, Model Uncertainties, and Measurement Needs

    NASA Technical Reports Server (NTRS)

    Jensen, Eric

    2017-01-01

    In this talk, I will begin by discussing the physical processes that govern the competition between heterogeneous and homogeneous ice nucleation in upper tropospheric cirrus clouds. Next, I will review the current knowledge of low-temperature ice nucleation from laboratory experiments and field measurements. I will then discuss the uncertainties and deficiencies in representations of cirrus processes in global models used to estimate the climate impacts of changes in cirrus clouds. Lastly, I will review the critical field measurements needed to advance our understanding of cirrus and their susceptibility to changes in aerosol properties.

  6. The contribution of natural variability to GCM bias: Can we effectively bias-correct climate projections?

    NASA Astrophysics Data System (ADS)

    McAfee, S. A.; DeLaFrance, A.

    2017-12-01

    Investigating the impacts of climate change often entails using projections from inherently imperfect general circulation models (GCMs) to drive models that simulate biophysical or societal systems in great detail. Error or bias in the GCM output is often assessed in relation to observations, and the projections are adjusted so that the output from impacts models can be compared to historical or observed conditions. Uncertainty in the projections is typically accommodated by running more than one future climate trajectory to account for differing emissions scenarios, model simulations, and natural variability. The current methods for dealing with error and uncertainty treat them as separate problems. In places where observed and/or simulated natural variability is large, however, it may not be possible to identify a consistent degree of bias in mean climate, blurring the lines between model error and projection uncertainty. Here we demonstrate substantial instability in mean monthly temperature bias across a suite of GCMs used in CMIP5. This instability is greatest in the highest latitudes during the cool season, where shifts from average temperatures below to above freezing could have profound impacts. In models with the greatest degree of bias instability, the timing of regional shifts from below to above average normal temperatures in a single climate projection can vary by about three decades, depending solely on the degree of bias assessed. This suggests that current bias correction methods based on comparison to 20- or 30-year normals may be inappropriate, particularly in the polar regions.
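    The bias instability described above can be reproduced with a toy experiment: give a synthetic "GCM" a fixed true bias plus strong low-frequency internal variability, then estimate the bias from different 30-year normals windows. All numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic annual-mean temperatures: "obs" and a "GCM" with a fixed true
# bias of +1.0 degC plus large low-frequency internal variability.
def ar1(n, phi=0.9, sd=1.5):
    """AR(1) series with marginal standard deviation sd."""
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = phi * x[i - 1] + rng.normal(0, sd * np.sqrt(1 - phi**2))
    return x

years = 150
obs = ar1(years)
gcm = 1.0 + ar1(years)        # independent realisation of internal variability

# Bias assessed over different, overlapping 30-year "normals" windows
biases = [float(np.mean(gcm[s:s + 30] - obs[s:s + 30]))
          for s in range(0, years - 30, 10)]
print(np.round(biases, 2))    # estimates scatter around the true +1.0
```

    Because the 30-year means of the two independent AR(1) components are themselves noisy, the estimated bias depends strongly on which window is chosen, which is the instability the abstract describes.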

  7. A new UK fission yield evaluation UKFY3.7

    NASA Astrophysics Data System (ADS)

    Mills, Robert William

    2017-09-01

    The JEFF neutron induced and spontaneous fission product yield evaluation is currently unchanged from JEFF-3.1.1, also known by its UK designation UKFY3.6A. It is based upon experimental data combined with empirically fitted mass, charge and isomeric state models which are then adjusted within the experimental and model uncertainties to conform to the physical constraints of the fission process. A new evaluation has been prepared for JEFF, called UKFY3.7, that incorporates new experimental data and replaces the current empirical models (multi-Gaussian fits of mass distribution and Wahl Zp model for charge distribution combined with parameter extrapolation) with predictions from GEF. The GEF model has the advantage that one set of parameters allows the prediction of many different fissioning nuclides at different excitation energies, unlike previous models where each fissioning nuclide at a specific excitation energy had to be fitted individually to the relevant experimental data. The new UKFY3.7 evaluation, submitted for testing as part of JEFF-3.3, is described alongside initial results of testing. In addition, initial ideas for future developments allowing inclusion of new measurement types and changing from any neutron spectrum type to true neutron energy dependence are discussed. Also, a method is proposed to propagate uncertainties of fission product yields based upon the experimental data that underlies the fission yield evaluation. The covariance terms are determined from the evaluated cumulative and independent yields, combined with the experimental uncertainties on the cumulative yield measurements.

  8. Uncertainties in future-proof decision-making: the Dutch Delta Model

    NASA Astrophysics Data System (ADS)

    IJmker, Janneke; Snippen, Edwin; Ruijgh, Erik

    2013-04-01

    In 1953, a number of European countries experienced flooding after a major storm event coming from the northwest. Over 2100 people died in the resulting floods, 1800 of them being Dutch. This gave rise to the development of the so-called Delta Works and Zuiderzee Works that strongly reduced the flood risk in the Netherlands. These measures were a response to a large flooding event. As boundary conditions have changed (increasing population, increasing urban development, etc.), the flood risk should be evaluated continuously, and measures should be taken if necessary. The Delta Programme was designed to be prepared for future changes and to limit the flood risk, taking into account economics, nature, landscape, residence and recreation. To support decisions in the Delta Programme, the Delta Model was developed. By using four different input scenarios (extremes in climate and economics) and variations in system setup, the outcomes of the Delta Model represent a range of possible outcomes for the hydrological situation in 2050 and 2100. These results flow into effect models that give insight into the integrated effects on freshwater supply (including navigation, industry and ecology) and flood risk. As the long-term water management policy of the Netherlands for the next decades will be based on these results, they have to be reliable. Therefore, a study was carried out to investigate the impact of uncertainties on the model outcomes. The study focused on "known unknowns": uncertainties in the boundary conditions, in the parameterization and in the model itself. This showed that for different parts of the Netherlands, the total uncertainty is on the order of meters! Nevertheless, (1) the total uncertainty is dominated by uncertainties in boundary conditions. Internal model uncertainties are subordinate to that.
    Furthermore, (2) the model responses develop in a logical way, such that the exact model outcomes might be uncertain, but the outcomes of different model runs are reliable relative to each other. The Delta Model therefore is a reliable instrument for finding the optimal water management policy for the future. As the exact model outcomes show a high degree of uncertainty, the model analysis will be based on a large number of model runs to gain insight into the sensitivity of the model to different setups and boundary conditions. The results allow fast investigation of (relative) effects of measures. Furthermore, it helps to identify bottlenecks in the system. To summarize, the Delta Model is a tool for policy makers to base their policy strategies on quantitative rather than qualitative information. It can be applied to the current and future situation, and feeds the political discussion. The uncertainty of the model has no determinative effect on the analyses that can be done with the Delta Model.

  9. Improving Air Quality (and Weather) Predictions using Advanced Data Assimilation Techniques Applied to Coupled Models during KORUS-AQ

    NASA Astrophysics Data System (ADS)

    Carmichael, G. R.; Saide, P. E.; Gao, M.; Streets, D. G.; Kim, J.; Woo, J. H.

    2017-12-01

    Ambient aerosols are important air pollutants with direct impacts on human health and on the Earth's weather and climate systems through their interactions with radiation and clouds. Their role is dependent on their distributions of size, number, phase and composition, which vary significantly in space and time. There remain large uncertainties in simulated aerosol distributions due to uncertainties in emission estimates and in chemical and physical processes associated with their formation and removal. These uncertainties lead to large uncertainties in weather and air quality predictions and in estimates of health and climate change impacts. Despite these uncertainties and challenges, regional-scale coupled chemistry-meteorological models such as WRF-Chem have significant capabilities in predicting aerosol distributions and explaining aerosol-weather interactions. We explore the hypothesis that new advances in on-line, coupled atmospheric chemistry/meteorological models, and new emission inversion and data assimilation techniques applicable to such coupled models, can be applied in innovative ways using current and evolving observation systems to improve predictions of aerosol distributions at regional scales. We investigate the impacts of assimilating AOD from geostationary satellite (GOCI) and surface PM2.5 measurements on predictions of AOD and PM in Korea during KORUS-AQ through a series of experiments. The results suggest assimilating datasets from multiple platforms can improve the predictions of aerosol temporal and spatial distributions.
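    As a hedged illustration of the assimilation idea only (not the actual WRF-Chem machinery or the GOCI retrieval used in the study), a pointwise optimal-interpolation-style update blends a background AOD field with observations in proportion to their error variances:

```python
import numpy as np

# Minimal optimal-interpolation-style update blending a model AOD field
# with sparse observations (toy numbers; no spatial error correlations).
def oi_update(background, obs, obs_idx, sigma_b, sigma_o):
    """Update each observed grid point with a variance-weighted blend."""
    analysis = background.copy()
    k = sigma_b**2 / (sigma_b**2 + sigma_o**2)   # scalar Kalman-style gain
    for i, y in zip(obs_idx, obs):
        analysis[i] = background[i] + k * (y - background[i])
    return analysis

bg = np.array([0.30, 0.45, 0.50, 0.40])          # background AOD field
an = oi_update(bg, obs=[0.60, 0.20], obs_idx=[1, 3], sigma_b=0.1, sigma_o=0.1)
print(an)   # with equal error variances, observed points move halfway
```

    Real systems additionally spread each innovation to neighbouring grid points through background error covariances; that spatial structure is omitted here for brevity.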

  10. Reliability ensemble averaging of 21st century projections of terrestrial net primary productivity reduces global and regional uncertainties

    NASA Astrophysics Data System (ADS)

    Exbrayat, Jean-François; Bloom, A. Anthony; Falloon, Pete; Ito, Akihiko; Smallman, T. Luke; Williams, Mathew

    2018-02-01

    Multi-model averaging techniques provide opportunities to extract additional information from large ensembles of simulations. In particular, present-day model skill can be used to evaluate their potential performance in future climate simulations. Multi-model averaging methods have been used extensively in climate and hydrological sciences, but they have not been used to constrain projected plant productivity responses to climate change, which is a major uncertainty in Earth system modelling. Here, we use three global observationally orientated estimates of current net primary productivity (NPP) to perform a reliability ensemble averaging (REA) method using 30 global simulations of the 21st century change in NPP based on the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP) business-as-usual emissions scenario. We find that the three REA methods support an increase in global NPP by the end of the 21st century (2095-2099) compared to 2001-2005, which is 2-3 % stronger than the ensemble ISIMIP mean value of 24.2 Pg C y-1. Using REA also leads to a 45-68 % reduction in the global uncertainty of the 21st century NPP projection, which strengthens confidence in the resilience of the CO2 fertilization effect to climate change. This reduction in uncertainty is especially clear for boreal ecosystems, although it may be an artefact due to the lack of representation of nutrient limitations on NPP in most models. Conversely, the large uncertainty that remains on the sign of the response of NPP in semi-arid regions points to the need for better observations and model development in these regions.
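    The weighting step of reliability ensemble averaging can be sketched as follows. The observation value, tolerance, and per-model numbers are invented for illustration, and the published REA method includes additional criteria (e.g. convergence) not shown here:

```python
import numpy as np

# Sketch of REA: weight each model's projected change by how well it
# reproduces a present-day observation (all numbers are illustrative).
obs_npp = 54.0                                       # "observed" NPP, Pg C / yr
model_npp = np.array([50.0, 55.0, 60.0, 45.0])       # simulated present-day NPP
model_change = np.array([10.0, 14.0, 20.0, 4.0])     # projected 21st-century change

eps = 5.0                                            # tolerance (natural variability)
skill = np.minimum(1.0, eps / np.abs(model_npp - obs_npp))  # reliability factor
w = skill / skill.sum()                              # normalised REA weights

rea_change = np.sum(w * model_change)
plain_mean = model_change.mean()
print(rea_change, plain_mean)
```

    Models whose present-day NPP falls within the tolerance keep full weight; poorer models are down-weighted, shifting the ensemble estimate away from the plain mean.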

  11. Operational hydrological forecasting in Bavaria. Part II: Ensemble forecasting

    NASA Astrophysics Data System (ADS)

    Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.

    2009-04-01

    In part I of this study, the operational flood forecasting system in Bavaria and an approach to identify and quantify forecast uncertainty was introduced. The approach is split into the calculation of an empirical 'overall error' from archived forecasts and the calculation of an empirical 'model error' based on hydrometeorological forecast tests, where rainfall observations were used instead of forecasts. The 'model error' can, especially in upstream catchments where forecast uncertainty is strongly dependent on the current predictability of the atmosphere, be superimposed on the spread of a hydrometeorological ensemble forecast. In Bavaria, two meteorological ensemble prediction systems are currently tested for operational use: the 16-member COSMO-LEPS forecast and a poor man's ensemble composed of DWD GME, DWD Cosmo-EU, NCEP GFS, Aladin-Austria, MeteoSwiss Cosmo-7. The determination of the overall forecast uncertainty is dependent on the catchment characteristics: 1. Upstream catchments with high influence of the weather forecast: a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing. b) Corresponding to the characteristics of the meteorological ensemble forecast, each resulting forecast hydrograph can be regarded as equally likely. c) The 'model error' distribution, with parameters dependent on hydrological case and lead time, is added to each forecast timestep of each ensemble member. d) For each forecast timestep, the overall (i.e. over all 'model error' distributions of each ensemble member) error distribution is calculated. e) From this distribution, the uncertainty range on a desired level (here: the 10% and 90% percentile) is extracted and drawn as a forecast envelope. f) As the mean or median of an ensemble forecast does not necessarily exhibit meteorologically sound temporal evolution, a single hydrological forecast termed 'lead forecast' is chosen and shown in addition to the uncertainty bounds.
    This can be either an intermediate forecast between the extremes of the ensemble spread or a manually selected forecast based on a meteorologist's advice. 2. Downstream catchments with low influence of the weather forecast: In downstream catchments with strong human impact on discharge (e.g. by reservoir operation) and large influence of upstream gauge observation quality on forecast quality, the 'overall error' may in most cases be larger than the combination of the 'model error' and an ensemble spread. Therefore, the overall forecast uncertainty bounds are calculated differently: a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing. Here, additionally the corresponding inflow hydrograph from all upstream catchments must be used. b) As for an upstream catchment, the uncertainty range is determined by combination of 'model error' and the ensemble member forecasts. c) In addition, the 'overall error' is superimposed on the 'lead forecast'. For reasons of consistency, the lead forecast must be based on the same meteorological forecast in the downstream and all upstream catchments. d) From the resulting two uncertainty ranges (one from the ensemble forecast and 'model error', one from the 'lead forecast' and 'overall error'), the envelope is taken as the most prudent uncertainty range. In sum, the uncertainty associated with each forecast run is calculated and communicated to the public in the form of 10% and 90% percentiles. As in part I of this study, the methodology, as well as the usefulness (or uselessness) of the resulting uncertainty ranges, will be presented and discussed using typical examples.
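    Steps (c)-(e) for the upstream case can be sketched numerically: superimpose a lead-time-dependent 'model error' distribution on each ensemble member and extract the 10%/90% percentiles at each timestep. The synthetic hydrographs and error growth below are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(3)

n_members, n_steps, n_err = 16, 48, 500

# Synthetic hydrological ensemble: 16 members (as in COSMO-LEPS), 48 steps
members = 100 + np.cumsum(rng.normal(0.5, 2.0, (n_members, n_steps)), axis=1)

envelope = np.empty((2, n_steps))
for t in range(n_steps):
    sigma_t = 2.0 + 0.3 * t          # "model error" grows with lead time
    # Superimpose the model-error distribution on every member (step c),
    # pool all draws (step d), and take the 10%/90% percentiles (step e)
    draws = (members[:, t][:, None]
             + rng.normal(0.0, sigma_t, (n_members, n_err))).ravel()
    envelope[:, t] = np.percentile(draws, [10, 90])

print(envelope[:, [0, -1]])          # the envelope widens with lead time
```

    The resulting bounds combine ensemble spread with the empirical model error, so they widen with lead time even when the raw ensemble spread alone would not.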

  12. Observing terrestrial ecosystems and the carbon cycle from space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schimel, David; Pavlick, Ryan; Fisher, Joshua B.

    2015-02-06

    Modeled terrestrial ecosystem and carbon cycle feedbacks contribute substantial uncertainty to projections of future climate. The limitations of current observing networks contribute to this uncertainty. Here we present a current climatology of global model predictions and observations for photosynthesis, biomass, plant diversity and plant functional diversity. Carbon cycle tipping points occur in terrestrial regions where fluxes or stocks are largest, and where biological variability is highest, the tropics and Arctic/Boreal zones. Global observations are predominately in the mid-latitudes and are sparse in high and low latitude ecosystems. Observing and forecasting ecosystem change requires sustained observations of sufficient density in time and space in critical regions. Using data and theory available now, we can develop a strategy to detect and forecast terrestrial carbon cycle-climate interactions, by combining in situ and remote techniques.

  13. Constraints on the dark matter neutralinos from the radio emissions of galaxy clusters

    NASA Astrophysics Data System (ADS)

    Kiew, Ching-Yee; Hwang, Chorng-Yuan; Zainal Abibin, Zamri

    2017-05-01

    By assuming the dark matter to be composed of neutralinos, we used upper limits on the diffuse radio emission in a sample of galaxy clusters to constrain the properties of neutralinos. We showed the upper-limit constraints on the <σv>-mχ space for neutralino annihilation through the b\\bar{b} and μ+μ- channels. The best constraints are from the galaxy clusters A2199 and A1367. We showed the uncertainty due to the density profile and cluster magnetic field. The largest uncertainty comes from the uncertainty in the dark matter spatial distribution. We also investigated the constraints on the minimal Supergravity (mSUGRA) and minimal supersymmetric standard model (MSSM) parameter space by scanning the parameters using the darksusy package. By using the current radio observations, we managed to exclude 40 combinations of mSUGRA parameters. On the other hand, 573 combinations of MSSM parameters can be excluded by current observations.

  14. Incorporating uncertainty in watershed management decision-making: A mercury TMDL case study

    USGS Publications Warehouse

    Labiosa, W.; Leckie, J.; Shachter, R.; Freyberg, D.; Rytuba, J.

    2005-01-01

    Water quality impairment due to high mercury fish tissue concentrations and high mercury aqueous concentrations is a widespread problem in several sub-watersheds that are major sources of mercury to the San Francisco Bay. Several mercury Total Maximum Daily Load regulations are currently being developed to address this problem. Decisions about control strategies are being made despite very large uncertainties about current mercury loading behavior, relationships between total mercury loading and methyl mercury formation, and relationships between potential controls and mercury fish tissue levels. To deal with the issues of very large uncertainties, data limitations, knowledge gaps, and very limited State agency resources, this work proposes a decision analytical alternative for mercury TMDL decision support. The proposed probabilistic decision model is Bayesian in nature and is fully compatible with a "learning while doing" adaptive management approach. Strategy evaluation, sensitivity analysis, and information collection prioritization are examples of analyses that can be performed using this approach.

  15. Theoretical uncertainties on the radius of low- and very-low-mass stars

    NASA Astrophysics Data System (ADS)

    Tognelli, E.; Prada Moroni, P. G.; Degl'Innocenti, S.

    2018-05-01

    We performed an analysis of the main theoretical uncertainties that affect the radius of low- and very-low-mass stars predicted by current stellar models. We focused on stars in the mass range 0.1-1 M⊙, on both the zero-age main sequence (ZAMS) and on 1, 2, and 5 Gyr isochrones. First, we quantified the impact on the radius of the uncertainty of several quantities, namely the equation of state, radiative opacity, atmospheric models, convection efficiency, and initial chemical composition. Then, we computed the cumulative radius error stripe obtained by adding the radius variation due to all the analysed quantities. As a general trend, the radius uncertainty increases with the stellar mass. For ZAMS structures the cumulative error stripe of very-low-mass stars is about ±2 and ±3 per cent, while at larger masses it increases up to ±4 and ±5 per cent. The radius uncertainty gets larger and age dependent if isochrones are considered, reaching for M ˜ 1 M⊙ about +12(-15) per cent at an age of 5 Gyr. We also investigated the radius uncertainty at a fixed luminosity. In this case, the cumulative error stripe is the same for both ZAMS and isochrone models and it ranges from about ±4 to +7 and +9(-5) per cent. We also showed that the sole uncertainty on the chemical composition plays an important role in determining the radius error stripe, producing a radius variation that ranges between about ±1 and ±2 per cent on ZAMS models with fixed mass and about ±3 and ±5 per cent at a fixed luminosity.
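    The construction of the cumulative error stripe, i.e. adding up the radius variations from the individual input quantities, can be sketched with invented component values; a quadrature sum is shown alongside for comparison, although the paper's cumulative stripe uses linear addition:

```python
import numpy as np

# Component radius variations (percent) for a hypothetical model star;
# the labels follow the quantities analysed in the paper, the numbers do not.
components = {"EOS": 0.5, "opacity": 0.8, "atmosphere": 1.0,
              "convection": 1.2, "composition": 1.5}

linear = sum(components.values())              # cumulative stripe (linear sum)
quadrature = float(np.sqrt(sum(v**2 for v in components.values())))

print(linear, quadrature)   # the quadrature estimate is necessarily smaller
```

    Linear addition treats the components as fully correlated and so gives a conservative (worst-case) stripe; the quadrature sum would apply only if the error sources were independent.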

  16. Crash Certification by Analysis - Are We There Yet?

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.; Fasanella, Edwin L.; Lyle, Karen H.

    2006-01-01

    This paper addresses the issue of crash certification by analysis. This broad topic encompasses many ancillary issues including model validation procedures, uncertainty in test data and analysis models, probabilistic techniques for test-analysis correlation, verification of the mathematical formulation, and establishment of appropriate qualification requirements. This paper will focus on certification requirements for crashworthiness of military helicopters; capabilities of the current analysis codes used for crash modeling and simulation, including some examples of simulations from the literature to illustrate the current approach to model validation; and future directions needed to achieve "crash certification by analysis."

  17. Using dry and wet year hydroclimatic extremes to guide future hydrologic projections

    NASA Astrophysics Data System (ADS)

    Oni, Stephen; Futter, Martyn; Ledesma, Jose; Teutschbein, Claudia; Buttle, Jim; Laudon, Hjalmar

    2016-07-01

    There is a growing number of studies on climate change impacts on forest hydrology, but limited attempts have been made to use current hydroclimatic variability to constrain projections of future climatic conditions. Here we used historical wet and dry years as a proxy for expected future extreme conditions in a boreal catchment. We showed that runoff could be underestimated by at least 35 % when dry year parameterizations were used for wet year conditions. Uncertainty analysis showed that behavioural parameter sets from wet and dry years separated mainly on precipitation-related parameters and to a lesser extent on parameters related to landscape processes, while uncertainties inherent in climate models (as opposed to differences in calibration or performance metrics) appeared to drive the overall uncertainty in runoff projections under dry and wet hydroclimatic conditions. Hydrologic model calibration for climate impact studies could be based on years that closely approximate anticipated conditions to better constrain uncertainty in projecting extreme conditions in boreal and temperate regions.

  18. A New Approach in Generating Meteorological Forecasts for Ensemble Streamflow Forecasting using Multivariate Functions

    NASA Astrophysics Data System (ADS)

    Khajehei, S.; Madadgar, S.; Moradkhani, H.

    2014-12-01

    The reliability and accuracy of hydrological predictions are subject to various sources of uncertainty, including meteorological forcing, initial conditions, model parameters, and model structure. To reduce the total uncertainty in hydrological applications, one approach is to reduce the uncertainty in meteorological forcing using statistical methods based on conditional probability density functions (pdfs). However, current methods require assuming a Gaussian distribution for the marginal distributions of the observed and modeled meteorology. Here we propose a Bayesian approach based on copula functions to develop the conditional distribution of the precipitation forecast needed to drive a hydrologic model for a sub-basin in the Columbia River Basin. Copula functions are introduced as an alternative approach for capturing the uncertainties related to meteorological forcing. Copulas are multivariate joint distributions of univariate marginals, capable of modeling the joint behavior of variables with any level of correlation and dependency. The method is applied to the CPC monthly forecast at 0.25x0.25 degree resolution to reproduce the PRISM dataset over 1970-2000. Results are compared with the Ensemble Pre-Processor approach, a common procedure used by National Weather Service River Forecast Centers, in reproducing observed climatology during a ten-year verification period (2000-2010).
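    The copula-based conditioning described above can be sketched with a Gaussian copula, the simplest member of the family: transform both margins to normal scores, estimate the dependence parameter, condition in normal-score space, and map back through the empirical marginal. Everything below (the synthetic forecast/observation data, the gamma marginals, the 60 mm query value) is illustrative rather than taken from the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic paired data: modeled (forecast) and observed monthly precipitation.
# In the study these would be CPC forecasts and PRISM observations.
fcst = rng.gamma(shape=2.0, scale=30.0, size=500)
obs = 0.8 * fcst + rng.gamma(shape=2.0, scale=10.0, size=500)

def to_normal_scores(x):
    """Probability integral transform via empirical ranks, then standard-normal quantiles."""
    ranks = stats.rankdata(x) / (len(x) + 1.0)
    return stats.norm.ppf(ranks)

z_f, z_o = to_normal_scores(fcst), to_normal_scores(obs)
rho = np.corrcoef(z_f, z_o)[0, 1]          # Gaussian-copula dependence parameter

def conditional_obs_quantiles(new_fcst, probs):
    """Quantiles of observed precipitation given a new forecast value,
    obtained in normal-score space and mapped back through the empirical marginal."""
    z_new = stats.norm.ppf(stats.percentileofscore(fcst, new_fcst) / 100.0)
    mean = rho * z_new                      # conditional mean in normal-score space
    sd = np.sqrt(1.0 - rho ** 2)            # conditional spread
    z_q = stats.norm.ppf(probs, loc=mean, scale=sd)
    return np.quantile(obs, stats.norm.cdf(z_q))  # back-transform via empirical CDF

q = conditional_obs_quantiles(60.0, np.array([0.1, 0.5, 0.9]))
print(q)  # 10th/50th/90th percentile of observed precip given a 60 mm forecast
```

A non-Gaussian copula (e.g. Gumbel or Clayton) would follow the same pattern but replace the normal-score conditioning step.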

  19. Which climate change path are we following? Bad news from Scots pine

    PubMed Central

    D’Andrea, Ettore; Rezaie, Negar; Cammarano, Mario; Matteucci, Giorgio

    2017-01-01

    Current expectations on future climate derive from coordinated experiments, which combine many climate models to sample the entire uncertainty related to emission scenarios, initial conditions, and modelling process. Quantifying this uncertainty is important for making decisions that are robust under a wide range of possible future conditions. Nevertheless, if uncertainty is too large, it can prevent the planning of specific and effective measures. For this reason, reducing the spectrum of possible scenarios to one or a few models that actually represent the climate pathway influencing natural ecosystems would substantially increase our planning capacity. Here we adopt a multidisciplinary approach based on the comparison of observed and expected spatial patterns of response to climate change in order to identify which specific models, among those included in CMIP5, capture the real climate variation driving the response of natural ecosystems. We used dendrochronological analyses to determine the geographic pattern of recent growth trends for three European tree species. At the same time, we modelled the climatic niche for the same species and forecasted the suitability variation expected across Europe under each different GCM. Finally, we estimated how well each GCM explains the real response of ecosystems by comparing the expected variation with the observed growth trends. In doing so, we identified four climatic models that are coherent with the observed trends. These models are close to the upper limit of the range of climatic variations expected by the ensemble of CMIP5 models, suggesting that current predictions of climate change impacts on ecosystems could be underestimated. PMID:29252985

  20. Which climate change path are we following? Bad news from Scots pine.

    PubMed

    Bombi, Pierluigi; D'Andrea, Ettore; Rezaie, Negar; Cammarano, Mario; Matteucci, Giorgio

    2017-01-01

    Current expectations on future climate derive from coordinated experiments, which combine many climate models to sample the entire uncertainty related to emission scenarios, initial conditions, and modelling process. Quantifying this uncertainty is important for making decisions that are robust under a wide range of possible future conditions. Nevertheless, if uncertainty is too large, it can prevent the planning of specific and effective measures. For this reason, reducing the spectrum of possible scenarios to one or a few models that actually represent the climate pathway influencing natural ecosystems would substantially increase our planning capacity. Here we adopt a multidisciplinary approach based on the comparison of observed and expected spatial patterns of response to climate change in order to identify which specific models, among those included in CMIP5, capture the real climate variation driving the response of natural ecosystems. We used dendrochronological analyses to determine the geographic pattern of recent growth trends for three European tree species. At the same time, we modelled the climatic niche for the same species and forecasted the suitability variation expected across Europe under each different GCM. Finally, we estimated how well each GCM explains the real response of ecosystems by comparing the expected variation with the observed growth trends. In doing so, we identified four climatic models that are coherent with the observed trends. These models are close to the upper limit of the range of climatic variations expected by the ensemble of CMIP5 models, suggesting that current predictions of climate change impacts on ecosystems could be underestimated.

  1. Projecting malaria hazard from climate change in eastern Africa using large ensembles to estimate uncertainty.

    PubMed

    Leedale, Joseph; Tompkins, Adrian M; Caminade, Cyril; Jones, Anne E; Nikulin, Grigory; Morse, Andrew P

    2016-03-31

    The effect of climate change on the spatiotemporal dynamics of malaria transmission is studied using an unprecedented ensemble of climate projections, employing three diverse bias correction and downscaling techniques, in order to partially account for uncertainty in climate-driven malaria projections. These large climate ensembles drive two dynamical and spatially explicit epidemiological malaria models to provide future hazard projections for the focus region of eastern Africa. While the two malaria models produce very distinct transmission patterns for the recent climate, their response to future climate change is similar in terms of sign and spatial distribution, with malaria transmission moving to higher altitudes in the East African Community (EAC) region, while transmission decreases in lowland, marginal transmission zones such as South Sudan. The climate model ensemble generally projects warmer and wetter conditions over EAC. The simulated malaria response appears to be driven by temperature rather than precipitation effects. This reduces the uncertainty due to the climate models, as precipitation trends in tropical regions are very diverse, projecting both drier and wetter conditions with the current state-of-the-art climate model ensemble. The magnitude of the projected changes differed considerably between the two dynamical malaria models, with one much more sensitive to climate change, highlighting that uncertainty in the malaria projections is also associated with the disease modelling approach.

  2. Multi-Hypothesis Modelling Capabilities for Robust Data-Model Integration

    NASA Astrophysics Data System (ADS)

    Walker, A. P.; De Kauwe, M. G.; Lu, D.; Medlyn, B.; Norby, R. J.; Ricciuto, D. M.; Rogers, A.; Serbin, S.; Weston, D. J.; Ye, M.; Zaehle, S.

    2017-12-01

    Large uncertainty is often inherent in model predictions due to imperfect knowledge of how to describe the mechanistic processes (hypotheses) that a model is intended to represent. Yet this model hypothesis uncertainty (MHU) is often overlooked or informally evaluated, as methods to quantify and evaluate MHU are limited. MHU increases as models become more complex, because each additional process added to a model brings inherent MHU as well as parametric uncertainty. With the current trend of adding more processes to Earth System Models (ESMs), we are adding uncertainty that can be quantified for parameters but not for MHU. Model inter-comparison projects do allow for some consideration of hypothesis uncertainty, but in an ad hoc and non-independent fashion. This has stymied efforts to evaluate ecosystem models against data and interpret the results mechanistically: because such models combine sub-models of many systems and processes, each of which may be conceptualised and represented mathematically in various ways, it is not simple to determine exactly why a model produces the results it does, or which model assumptions are key. We present a novel modelling framework—the multi-assumption architecture and testbed (MAAT)—that automates the combination, generation, and execution of a model ensemble built with different representations of process. We will present the argument that multi-hypothesis modelling needs to be considered in conjunction with other capabilities (e.g. the Predictive Ecosystem Analyser, PEcAn) and statistical methods (e.g. sensitivity analysis, data assimilation) to aid efforts in robust data-model integration and to enhance our predictive understanding of biological systems.
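    The core idea of an automated multi-hypothesis ensemble can be sketched as a Cartesian product over alternative process representations. This is a minimal illustration of the concept only, not MAAT's actual API; the process forms, names, and parameter values below are invented.

```python
import numpy as np
from itertools import product

# Alternative representations (hypotheses) for two processes in a toy
# ecosystem model: the light response of photosynthesis and the temperature
# response of respiration. Forms and constants are illustrative only.
light_hypotheses = {
    "rectangular_hyperbola": lambda par: 20.0 * par / (par + 200.0),
    "exponential": lambda par: 20.0 * (1.0 - np.exp(-par / 200.0)),
}
resp_hypotheses = {
    "q10": lambda t: 2.0 * 2.0 ** ((t - 25.0) / 10.0),
    "arrhenius": lambda t: 2.0 * np.exp(5000.0 * (1.0 / 298.15 - 1.0 / (t + 273.15))),
}

# The ensemble is the Cartesian product of process representations,
# mirroring the automated combination of hypotheses described above.
par, temp = 800.0, 20.0  # photosynthetically active radiation, air temperature
for (ln, lf), (rn, rf) in product(light_hypotheses.items(), resp_hypotheses.items()):
    nep = lf(par) - rf(temp)  # net ecosystem production under this combination
    print(f"{ln} + {rn}: NEP = {nep:.2f}")
```

The spread of outputs across the four ensemble members is a direct, if crude, measure of hypothesis uncertainty for this toy system.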

  3. Correlating electroluminescence characterization and physics-based models of InGaN/GaN LEDs: Pitfalls and open issues

    NASA Astrophysics Data System (ADS)

    Calciati, Marco; Goano, Michele; Bertazzi, Francesco; Vallone, Marco; Zhou, Xiangyu; Ghione, Giovanni; Meneghini, Matteo; Meneghesso, Gaudenzio; Zanoni, Enrico; Bellotti, Enrico; Verzellesi, Giovanni; Zhu, Dandan; Humphreys, Colin

    2014-06-01

    Electroluminescence (EL) characterization of InGaN/GaN light-emitting diodes (LEDs), coupled with numerical device models of different sophistication, is routinely adopted not only to establish correlations between device efficiency and structural features, but also to make inferences about the loss mechanisms responsible for LED efficiency droop at high driving currents. The limits of this investigative approach are discussed here in a case study based on a comprehensive set of current- and temperature-dependent EL data from blue LEDs with low and high densities of threading dislocations (TDs). First, the effects limiting the applicability of simpler (closed-form and/or one-dimensional) classes of models are addressed, like lateral current crowding, vertical carrier distribution nonuniformity, and interband transition broadening. Then, the major sources of uncertainty affecting state-of-the-art numerical device simulation are reviewed and discussed, including (i) the approximations in the transport description through the multi-quantum-well active region, (ii) the alternative valence band parametrizations proposed to calculate the spontaneous emission rate, (iii) the difficulties in defining the Auger coefficients due to inadequacies in the microscopic quantum well description and the possible presence of extra, non-Auger high-current-density recombination mechanisms and/or Auger-induced leakage. In the case of the present LED structures, the application of three-dimensional numerical-simulation-based analysis to the EL data leads to an explanation of efficiency droop in terms of TD-related and Auger-like nonradiative losses, with a C coefficient in the 10⁻³⁰ cm⁶/s range at room temperature, close to the larger theoretical calculations reported so far. However, a study of the combined effects of structural and model uncertainties suggests that the C values thus determined could be overestimated by about an order of magnitude. This preliminary attempt at uncertainty quantification confirms, beyond the present case, the need for an improved description of carrier transport and microscopic radiative and nonradiative recombination mechanisms in device-level LED numerical models.

  4. Visual Detection Under Uncertainty Operates Via an Early Static, Not Late Dynamic, Non-Linearity

    PubMed Central

    Neri, Peter

    2010-01-01

    Signals in the environment are rarely specified exactly: our visual system may know what to look for (e.g., a specific face), but not its exact configuration (e.g., where in the room, or in what orientation). Uncertainty, and the ability to deal with it, is a fundamental aspect of visual processing. The MAX model is the current gold standard for describing how human vision handles uncertainty: of all possible configurations for the signal, the observer chooses the one corresponding to the template associated with the largest response. We propose an alternative model in which the MAX operation, which is a dynamic non-linearity (depends on multiple inputs from several stimulus locations) and happens after the input stimulus has been matched to the possible templates, is replaced by an early static non-linearity (depends only on one input corresponding to one stimulus location) which is applied before template matching. By exploiting an integrated set of analytical and experimental tools, we show that this model is able to account for a number of empirical observations otherwise unaccounted for by the MAX model, and is more robust with respect to the realistic limitations imposed by the available neural hardware. We then discuss how these results, currently restricted to a simple visual detection task, may extend to a wider range of problems in sensory processing. PMID:21212835
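    The two decision rules can be contrasted in a small simulation of detection under spatial uncertainty. This is a toy two-interval forced-choice setup, not the paper's integrated analytical and experimental toolset; the eight-location layout, the signal strength, and the cubic transducer used for the early static non-linearity are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

n_locations, n_trials = 8, 2000
signal = 1.5  # signal strength at the single (unknown) location

def percent_correct(decision_rule):
    """Two-interval forced choice: pick the interval with the larger decision variable."""
    correct = 0
    for _ in range(n_trials):
        loc = rng.integers(n_locations)
        noise_only = rng.normal(0.0, 1.0, n_locations)   # template responses, no signal
        with_signal = rng.normal(0.0, 1.0, n_locations)
        with_signal[loc] += signal                       # signal at one random location
        if decision_rule(with_signal) > decision_rule(noise_only):
            correct += 1
    return correct / n_trials

# MAX observer: a late dynamic non-linearity that pools the template
# responses by taking the largest one.
def max_rule(responses):
    return responses.max()

# Alternative observer: an early static (pointwise) non-linearity, here an
# accelerating odd power function, applied before pooling by summation.
def static_rule(responses):
    return np.sum(np.sign(responses) * np.abs(responses) ** 3)

print("MAX observer   :", percent_correct(max_rule))
print("static observer:", percent_correct(static_rule))
```

Both observers perform well above chance here; distinguishing them empirically requires the kind of classification-image and double-pass analyses the abstract alludes to, not raw percent correct alone.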

  5. Chemical element transport in stellar evolution models

    PubMed Central

    Cassisi, Santi

    2017-01-01

    Stellar evolution computations provide the foundation of several methods applied to study the evolutionary properties of stars and stellar populations, both Galactic and extragalactic. The accuracy of the results obtained with these techniques is linked to the accuracy of the stellar models, and in this context the correct treatment of the transport of chemical elements is crucial. Unfortunately, in many respects calculations of the evolution of the chemical abundance profiles in stars are still affected by sometimes sizable uncertainties. Here, we review the various mechanisms of element transport included in the current generation of stellar evolution calculations, how they are implemented, the free parameters and uncertainties involved, the impact on the models and the observational constraints. PMID:28878972

  6. Chemical element transport in stellar evolution models.

    PubMed

    Salaris, Maurizio; Cassisi, Santi

    2017-08-01

    Stellar evolution computations provide the foundation of several methods applied to study the evolutionary properties of stars and stellar populations, both Galactic and extragalactic. The accuracy of the results obtained with these techniques is linked to the accuracy of the stellar models, and in this context the correct treatment of the transport of chemical elements is crucial. Unfortunately, in many respects calculations of the evolution of the chemical abundance profiles in stars are still affected by sometimes sizable uncertainties. Here, we review the various mechanisms of element transport included in the current generation of stellar evolution calculations, how they are implemented, the free parameters and uncertainties involved, the impact on the models and the observational constraints.

  7. Evaluation of five dry particle deposition parameterizations for incorporation into atmospheric transport models

    NASA Astrophysics Data System (ADS)

    Khan, Tanvir R.; Perlinger, Judith A.

    2017-10-01

    Despite considerable effort to develop mechanistic dry particle deposition parameterizations for atmospheric transport models, current knowledge has been inadequate to propose quantitative measures of the relative performance of available parameterizations. In this study, we evaluated the performance of five dry particle deposition parameterizations developed by Zhang et al. (2001) (Z01), Petroff and Zhang (2010) (PZ10), Kouznetsov and Sofiev (2012) (KS12), Zhang and He (2014) (ZH14), and Zhang and Shao (2014) (ZS14). The evaluation was performed in three dimensions: model ability to reproduce observed deposition velocities, Vd (accuracy); the influence of imprecision in input parameter values on the modeled Vd (uncertainty); and identification of the most influential parameter(s) (sensitivity). The accuracy of the modeled Vd was evaluated using observations obtained from five land use categories (LUCs): grass, coniferous and deciduous forests, natural water, and ice/snow. To ascertain the uncertainty in modeled Vd, and quantify the influence of imprecision in key model input parameters, a Monte Carlo uncertainty analysis was performed. The Sobol' sensitivity analysis was conducted with the objective of determining the parameter ranking from the most to the least influential. Comparing the normalized mean bias factors (indicators of accuracy), we find that the ZH14 parameterization is the most accurate for all LUCs except for coniferous forest, for which it is second most accurate. From Monte Carlo simulations, the estimated mean normalized uncertainties in the modeled Vd obtained for seven particle sizes (ranging from 0.005 to 2.5 µm) for the five LUCs are 17, 12, 13, 16, and 27 % for the Z01, PZ10, KS12, ZH14, and ZS14 parameterizations, respectively. From the Sobol' sensitivity results, we suggest that the parameter rankings vary by particle size and LUC for a given parameterization.
Overall, for dp = 0.001 to 1.0 µm, friction velocity was one of the three most influential parameters in all parameterizations. For giant particles (dp = 10 µm), relative humidity was the most influential parameter. Because it is the least complex of the five parameterizations, and it has the greatest accuracy and least uncertainty, we propose that the ZH14 parameterization is currently superior for incorporation into atmospheric transport models.
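    Two of the evaluation dimensions above can be illustrated with a toy resistance-based deposition-velocity model: a normalized mean bias factor for accuracy (the symmetric form of Yu et al., 2006, which we assume matches the metric used here) and Monte Carlo propagation of input imprecision for uncertainty. The resistance formulas and all numbers are invented for illustration and do not correspond to any of the five parameterizations.

```python
import numpy as np

rng = np.random.default_rng(2)

def nmbf(modeled, observed):
    """Normalized mean bias factor: symmetric measure of over-/underestimation."""
    m, o = np.mean(modeled), np.mean(observed)
    return m / o - 1.0 if m >= o else 1.0 - o / m

# Toy deposition-velocity model: Vd = 1 / (Ra + Rb), with resistances that
# depend on friction velocity u* (a stand-in for the real parameterizations).
def vd_model(ustar, roughness=0.05):
    ra = np.log(10.0 / roughness) / (0.4 * ustar)   # aerodynamic resistance
    rb = 2.0 / ustar                                # quasi-laminar resistance
    return 1.0 / (ra + rb)

obs_vd = np.array([0.021, 0.024, 0.026, 0.023])     # "observed" Vd (m/s), invented
mod_vd = vd_model(np.array([0.30, 0.35, 0.42, 0.33]))
print("NMBF:", nmbf(mod_vd, obs_vd))

# Monte Carlo: propagate imprecision in u* (here +/-20%, illustrative) into
# Vd and report a normalized uncertainty, analogous to the paper's metric.
ustar_samples = rng.normal(0.35, 0.07, 10_000)
ustar_samples = ustar_samples[ustar_samples > 0.05]  # discard unphysical draws
vd_samples = vd_model(ustar_samples)
norm_unc = np.std(vd_samples) / np.mean(vd_samples)
print(f"normalized uncertainty in Vd: {norm_unc:.0%}")
```

A Sobol' analysis would extend the Monte Carlo step by sampling all inputs jointly and decomposing the output variance by parameter, yielding the rankings discussed above.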

  8. Modelling the Air–Surface Exchange of Ammonia from the Field to Global Scale

    EPA Science Inventory

    The Working Group addressed the current understanding and uncertainties in the processes controlling ammonia (NH3) bi-directional exchange, and in the application of numerical models to describe these processes. As a starting point for the discussion, the Working Group drew on th...

  9. Uncertainty Quantification in Geomagnetic Field Modeling

    NASA Astrophysics Data System (ADS)

    Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.

    2017-12-01

    Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis, to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high resolution crustal field model and a time-varying, real-time external field model, like in NOAA's High Definition Geomagnetic Model (HDGM). The obtained uncertainties are used by the directional drilling industry to mitigate health, safety and environment risks.

  10. Recent global methane trends: an investigation using hierarchical Bayesian methods

    NASA Astrophysics Data System (ADS)

    Rigby, M. L.; Stavert, A.; Ganesan, A.; Lunt, M. F.

    2014-12-01

    Following a decade with little growth, methane concentrations began to increase across the globe in 2007, and have continued to rise ever since. The reasons for this renewed growth are currently the subject of much debate. Here, we discuss the recent observed trends, and highlight some of the strengths and weaknesses in current "inverse" methods for quantifying fluxes using observations. In particular, we focus on the outstanding problems of accurately quantifying uncertainties in inverse frameworks. We examine to what extent the recent methane changes can be explained by the current generation of flux models and inventories. We examine the major modes of variability in wetland models along with the Global Fire Emissions Database (GFED) and the Emissions Database for Global Atmospheric Research (EDGAR). Using the Model for Ozone and Related Tracers (MOZART), we determine whether the spatial and temporal atmospheric trends predicted using these emissions can be brought into consistency with in situ atmospheric observations. We use a novel hierarchical Bayesian methodology in which scaling factors applied to the principal components of the flux fields are estimated simultaneously with the uncertainties associated with the a priori fluxes and with model representations of the observations. Using this method, we examine the predictive power of methane flux models for explaining recent fluctuations.
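    The estimation of scaling factors on the principal components of the flux fields can be sketched, under strong simplifications, as a Bayesian linear regression in which the scalings and the observation-error variance are inferred jointly. The full hierarchical MCMC of the study is replaced here by a closed-form normal-inverse-gamma posterior with a zero-mean prior, and all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "flux fields": a prior flux dataset decomposed into principal components.
n_obs, n_modes = 120, 3
prior_fields = rng.normal(size=(n_obs, 10))
U, s, Vt = np.linalg.svd(prior_fields, full_matrices=False)
X = U[:, :n_modes] * s[:n_modes]        # leading principal-component time series

true_scalings = np.array([1.4, 0.7, 1.1])
y = X @ true_scalings + rng.normal(0.0, 0.5, n_obs)   # pseudo-observations

# Conjugate Bayesian linear regression (normal-inverse-gamma prior): the
# scaling factors and the noise variance are inferred together, a simplified
# stand-in for estimating a priori flux and observation uncertainties jointly.
prior_prec = 1e-2 * np.eye(n_modes)     # weak zero-mean prior on the scalings
a0, b0 = 1.0, 1.0                       # inverse-gamma prior on noise variance

post_prec = prior_prec + X.T @ X
post_cov = np.linalg.inv(post_prec)
post_mean = post_cov @ (X.T @ y)
a_n = a0 + n_obs / 2.0
b_n = b0 + 0.5 * (y @ y - post_mean @ post_prec @ post_mean)

print("posterior mean scalings:", post_mean)
print("posterior mean noise variance:", b_n / (a_n - 1.0))
```

In the real application X would hold principal components of wetland-model and inventory (GFED, EDGAR) flux fields mapped through a transport model such as MOZART, and the hierarchy would extend to model-representation errors.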

  11. Relevance of the hadronic interaction model in the interpretation of multiple muon data as detected with the MACRO experiment

    NASA Astrophysics Data System (ADS)

    Ambrosio, M.; Antolini, R.; Aramo, C.; Auriemma, G.; Baldini, A.; Barbarino, G. C.; Barish, B. C.; Battistoni, G.; Bellotti, R.; Bemporad, C.; Bernardini, P.; Bilokon, H.; Bisi, V.; Bloise, C.; Bower, C.; Bussino, S.; Cafagna, F.; Calicchio, M.; Campana, D.; Carboni, M.; Castellano, M.; Cecchini, S.; Cei, F.; Chiarella, V.; Coutu, S.; de Benedictis, L.; de Cataldo, G.; Dekhissi, H.; de Marzo, C.; de Mitri, I.; de Vincenzi, M.; di Credico, A.; Erriquez, O.; Favuzzi, C.; Forti, C.; Fusco, P.; Giacomelli, G.; Giannini, G.; Giglietto, N.; Grassi, M.; Gray, L.; Grillo, A.; Guarino, F.; Guarnaccia, P.; Gustavino, C.; Habig, A.; Hanson, K.; Hawthorne, A.; Heinz, R.; Iarocci, E.; Katsavounidis, E.; Kearns, E.; Kyriazopoulou, S.; Lamanna, E.; Lane, C.; Levin, D. S.; Lipari, P.; Longley, N. P.; Longo, M. J.; Maaroufi, F.; Mancarella, G.; Mandrioli, G.; Manzoor, S.; Margiotta Neri, A.; Marini, A.; Martello, D.; Marzari-Chiesa, A.; Mazziotta, M. N.; Mazzotta, C.; Michael, D. G.; Mikheyev, S.; Miller, L.; Monacelli, P.; Montaruli, T.; Monteno, M.; Mufson, S.; Musser, J.; Nicoló, D.; Nolty, R.; Okada, C.; Orth, C.; Osteria, G.; Palamara, O.; Patera, V.; Patrizii, L.; Pazzi, R.; Peck, C. W.; Petrera, S.; Pistilli, P.; Popa, V.; Rainó, A.; Rastelli, A.; Reynoldson, J.; Ronga, F.; Rubizzo, U.; Sanzgiri, A.; Satriano, C.; Satta, L.; Scapparone, E.; Scholberg, K.; Sciubba, A.; Serra-Lugaresi, P.; Severi, M.; Sioli, M.; Sitta, M.; Spinelli, P.; Spinetti, M.; Spurio, M.; Steinberg, R.; Stone, J. L.; Sulak, L. R.; Surdo, A.; Tarlé, G.; Togo, V.; Walter, C. W.; Webb, R.

    1999-03-01

    With the aim of discussing the effect of the possible sources of systematic uncertainties in simulation models, the analysis of multiple muon events from the MACRO experiment at Gran Sasso is reviewed. In particular, the predictions from different currently available hadronic interaction models are compared.

  12. ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.

    2011-04-20

    While considerable advance has been made to account for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have been generally ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
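    The multiple-imputation idea can be sketched as follows: draw plausible effective-area curves from a low-dimensional perturbation model (standing in for the principal-component summary of sampled calibration files), refit under each, and combine within-fit and between-fit variances with Rubin's rules. The spectral "fit" below is a deliberately simple ratio estimator on synthetic data, not a real X-ray fitting package, and the perturbation mode and amplitudes are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

energies = np.linspace(0.3, 7.0, 50)                  # keV grid (illustrative)
nominal_arf = 500.0 * np.exp(-0.5 * ((energies - 1.5) / 1.2) ** 2)  # cm^2

# One smooth perturbation mode with a few-percent amplitude stands in for the
# principal-component summary of sampled calibration (effective-area) files.
mode = np.sin(np.pi * energies / 7.0)

def draw_arf():
    """Draw one plausible effective-area curve."""
    return nominal_arf * (1.0 + 0.03 * rng.normal() * mode)

# A deliberately simple spectral "fit": estimate a flat source normalization
# from Poisson counts, given an assumed effective area.
true_norm = 2.0
counts = rng.poisson(true_norm * nominal_arf)

def fit_norm(arf):
    return counts.sum() / arf.sum()                   # ratio estimator

def fit_var(arf):
    return counts.sum() / arf.sum() ** 2              # Poisson variance of the estimator

# Multiple imputation: repeat the fit under M plausible calibrations and
# combine within-fit and between-fit variance with Rubin's rules.
M = 50
arfs = [draw_arf() for _ in range(M)]
fits = np.array([fit_norm(a) for a in arfs])
within = np.mean([fit_var(a) for a in arfs])
between = fits.var(ddof=1)
total_var = within + (1.0 + 1.0 / M) * between

print(f"normalization = {fits.mean():.3f} +/- {np.sqrt(total_var):.3f}")
```

Ignoring the calibration draw (setting M = 1 with the nominal curve) would report only the statistical `within` term, which is exactly the underestimated error bar the abstract warns about.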

  13. Method for measuring target rotation angle by theodolites

    NASA Astrophysics Data System (ADS)

    Sun, Zelin; Wang, Zhao; Zhai, Huanchun; Yang, Xiaoxu

    2013-05-01

    To overcome the disadvantages of current theodolite-based measurement methods in environments involving shock, long working hours, and similar conditions, this paper proposes a new method for 3D coordinate measurement based on an immovable measuring coordinate system. According to the measuring principle, the mathematical model is established and the measurement uncertainty is analysed. The measurement uncertainty of the new method is a function of the theodolite observation angles and their uncertainty, and can be reduced by optimizing the theodolites’ placement. Compared to other methods, this method allows the theodolite positions to be changed during the measuring process, and mutual collimation between the theodolites is not required. The experimental results show that the measurement model and the optimal placement principle are correct, and the measurement error is less than 0.01° after optimizing the theodolites’ placement.
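    The basic geometry of theodolite intersection, reduced to the 2D horizontal case for clarity, and the Monte Carlo style of uncertainty propagation can be sketched as follows; the station layout and the 5-arcsecond angle uncertainty are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two theodolite stations (2D horizontal case for clarity) and a target.
s1, s2 = np.array([0.0, 0.0]), np.array([10.0, 0.0])
target = np.array([4.0, 8.0])

def bearing(station, point):
    """Horizontal angle from a station to a point."""
    d = point - station
    return np.arctan2(d[1], d[0])

def intersect(a1, a2):
    """Target position recovered from the two measured bearings (ray intersection)."""
    d1 = np.array([np.cos(a1), np.sin(a1)])
    d2 = np.array([np.cos(a2), np.sin(a2)])
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, s2 - s1)   # ray parameters along d1 and d2
    return s1 + t[0] * d1

a1, a2 = bearing(s1, target), bearing(s2, target)
assert np.allclose(intersect(a1, a2), target)   # exact angles recover the target

# Monte Carlo propagation of angle-measurement uncertainty (here 5 arcsec,
# an illustrative figure) into the recovered coordinates.
sigma = np.deg2rad(5.0 / 3600.0)
pts = np.array([intersect(a1 + rng.normal(0, sigma),
                          a2 + rng.normal(0, sigma)) for _ in range(5000)])
print("coordinate std (m):", pts.std(axis=0))
```

The dependence of the resulting coordinate spread on the intersection angle between the two rays is what makes placement optimization, as discussed in the abstract, effective.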

  14. Nuclear-effects model embedded stochastically in simulation (NEMESIS) summary report. Technical paper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Youngren, M.A.

    1989-11-01

    An analytic probability model of tactical nuclear warfare in the theater is presented in this paper. The model addresses major problems associated with representing nuclear warfare in the theater. Current theater representations of a potential nuclear battlefield are developed in context of low-resolution, theater-level models or scenarios. These models or scenarios provide insufficient resolution in time and space for modeling a nuclear exchange. The model presented in this paper handles the spatial uncertainty in potentially targeted unit locations by proposing two-dimensional multivariate probability models for the actual and perceived locations of units subordinate to the major (division-level) units represented in theater scenarios. The temporal uncertainty in the activities of interest represented in our theater-level Force Evaluation Model (FORCEM) is handled through probability models of the acquisition and movement of potential nuclear target units.

  15. The North American Regional Climate Change Assessment Program (NARCCAP): Status and results

    NASA Astrophysics Data System (ADS)

    Arritt, R.

    2009-04-01

    NARCCAP is an international program that is generating projections of climate change for the U.S., Canada, and northern Mexico at decision-relevant regional scales. NARCCAP uses multiple limited-area regional climate models (RCMs) nested within multiple atmosphere-ocean general circulation models (AOGCMs). The use of multiple regional and global models allows us to investigate the uncertainty in model responses to future emissions (here, the A2 SRES scenario). The project also includes global time-slice experiments at the same discretization (50 km) using the GFDL atmospheric model (AM2.1) and the NCAR atmospheric model (CAM3). Phase I of the experiment uses the regional models nested within reanalysis in order to establish uncertainty attributable to the RCMs themselves. Phase II of the project then nests the RCMs within results from the current and future runs of the AOGCMs to explore the cascade of uncertainty from the global to the regional models. Phase I has been completed and the results to be shown include findings that spectral nudging is beneficial in some regions but not in others. Phase II is nearing completion and some preliminary results will be shown.

  16. Mapping flood hazards under uncertainty through probabilistic flood inundation maps

    NASA Astrophysics Data System (ADS)

    Stephens, T.; Bledsoe, B. P.; Miller, A. J.; Lee, G.

    2017-12-01

    Changing precipitation, rapid urbanization, and population growth interact to create unprecedented challenges for flood mitigation and management. Standard methods for estimating risk from flood inundation maps generally involve simulations of floodplain hydraulics for an established regulatory discharge of specified frequency. Hydraulic model results are then geospatially mapped and depicted as a discrete boundary of flood extents and a binary representation of the probability of inundation (in or out) that is assumed constant over a project's lifetime. Consequently, existing methods utilized to define flood hazards and assess risk management are hindered by deterministic approaches that assume stationarity in a nonstationary world, failing to account for spatio-temporal variability of climate and land use as they translate to hydraulic models. This presentation outlines novel techniques for portraying flood hazards and the results of multiple flood inundation maps spanning hydroclimatic regions. Flood inundation maps generated through modeling of floodplain hydraulics are probabilistic reflecting uncertainty quantified through Monte-Carlo analyses of model inputs and parameters under current and future scenarios. The likelihood of inundation and range of variability in flood extents resulting from Monte-Carlo simulations are then compared with deterministic evaluations of flood hazards from current regulatory flood hazard maps. By facilitating alternative approaches of portraying flood hazards, the novel techniques described in this presentation can contribute to a shifting paradigm in flood management that acknowledges the inherent uncertainty in model estimates and the nonstationary behavior of land use and climate.
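    The probabilistic alternative to a single regulatory discharge can be sketched on a toy one-dimensional transect: Monte Carlo sampling of discharge and roughness through Manning's equation yields a per-cell inundation probability rather than a binary in/out boundary. All geometry, distributions, and parameter values below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy 1-D floodplain transect: ground elevation (m) across the valley.
x = np.linspace(0, 200, 41)
ground = 0.0004 * (x - 100.0) ** 2          # parabolic valley, 4 m at the edges

def water_level(discharge, manning_n, slope=0.001, width=50.0):
    """Normal-depth water surface from Manning's equation (wide-channel approximation)."""
    depth = (manning_n * discharge / (width * np.sqrt(slope))) ** (3.0 / 5.0)
    return depth  # water-surface elevation above the valley bottom (datum 0)

# Monte Carlo over uncertain discharge and roughness, instead of a single
# regulatory discharge: each draw yields one flood extent.
Q = rng.lognormal(mean=np.log(300.0), sigma=0.3, size=5000)   # m^3/s
n_man = rng.uniform(0.03, 0.06, size=5000)                    # Manning's n
levels = water_level(Q, n_man)

# Per-cell probability of inundation across the ensemble.
prob_inundated = (levels[:, None] > ground[None, :]).mean(axis=0)
print("P(inundation) at valley centre:", prob_inundated[20])
print("cells with P > 0.5:", int((prob_inundated > 0.5).sum()))
```

Mapping `prob_inundated` instead of a single contour is the core of the probabilistic flood inundation maps described above; a full application would replace the toy rating relation with a 2-D hydraulic model and sample climate and land-use scenarios as well.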

  17. A web-application for visualizing uncertainty in numerical ensemble models

    NASA Astrophysics Data System (ADS)

    Alberti, Koko; Hiemstra, Paul; de Jong, Kor; Karssenberg, Derek

    2013-04-01

    Numerical ensemble models are used in the analysis and forecasting of a wide range of environmental processes. Common use cases include assessing the consequences of nuclear accidents, pollution releases into the ocean or atmosphere, forest fires, volcanic eruptions, or identifying areas at risk from such hazards. In addition to the increased use of scenario analyses and model forecasts, the availability of supplementary data describing errors and model uncertainties is increasingly commonplace. Unfortunately, most current visualization routines are not capable of properly representing uncertain information. As a result, uncertainty information is not provided at all, not readily accessible, or it is not communicated effectively to model users such as domain experts, decision makers, policy makers, or even novice users. In an attempt to address these issues a lightweight and interactive web-application has been developed. It makes clear and concise uncertainty visualizations available in a web-based mapping and visualization environment, incorporating aggregation (upscaling) techniques to adjust uncertainty information to the zoom level. The application has been built on a web mapping stack of open source software, and can quantify and visualize uncertainties in numerical ensemble models in such a way that both expert and novice users can investigate uncertainties present in a simple ensemble dataset. As a test case, a dataset was used which forecasts the spread of an airborne tracer across Western Europe. Extrinsic uncertainty representations are used in which dynamic circular glyphs are overlaid on model attribute maps to convey various uncertainty concepts. It supports basic uncertainty metrics such as standard deviation, standard error, width of the 95% confidence interval, and interquartile range, as well as more experimental ones aimed at novice users.
Ranges of attribute values can be specified, and the circular glyphs dynamically change size to represent the probability of the attribute value falling within the specified interval. For more advanced users, graphs of the cumulative distribution function, histograms, and time-series plume charts are available. To avoid cognitive overload and crowding of glyphs on the map pane, the support of the data used for generating the glyphs is linked dynamically to the zoom level. Zooming in and out respectively decreases and increases the underlying support size of the data used for generating the glyphs, thereby giving the user access to uncertainty information of the original data upscaled to the resolution of the visualization. This feature also ensures that the glyphs are neatly spaced in a regular grid regardless of the zoom level. Finally, the web application has been presented to groups of test users of varying degrees of expertise in order to evaluate the usability of the interface and the effectiveness of uncertainty visualizations based on circular glyphs.
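A minimal sketch of the probability-to-glyph mapping described above, assuming a simple area-proportional scaling; the function name, radius cap, and ensemble values are all hypothetical, not the application's actual code:

```python
import random

def glyph_radius(ensemble, lo, hi, r_max=20.0):
    """Empirical probability that a cell's ensemble values fall in [lo, hi],
    mapped to a circular-glyph radius (glyph area proportional to p)."""
    p = sum(lo <= v <= hi for v in ensemble) / len(ensemble)
    # Scale glyph *area* with probability so visual weight is linear in p.
    return r_max * p ** 0.5, p

# 100-member ensemble of tracer concentrations at one hypothetical grid cell
rng = random.Random(0)
members = [rng.gauss(5.0, 1.0) for _ in range(100)]
radius, p = glyph_radius(members, 4.0, 6.0)  # roughly 68% of a N(5, 1)
```

Zoom-dependent upscaling would then recompute `p` from ensemble values aggregated over larger support blocks as the user zooms out.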

  18. On the role of model-based monitoring for adaptive planning under uncertainty

    NASA Astrophysics Data System (ADS)

    Raso, Luciano; Kwakkel, Jan; Timmermans, Jos; Haasnoot, Marjolijn

    2016-04-01

    Adaptive plans, designed to anticipate and respond to an unfolding uncertain future, have found a fertile application domain in the planning of deltas that are exposed to rapid socioeconomic development and climate change. Adaptive planning, under the moniker of adaptive delta management, is used in the Dutch Delta Program for developing a nation-wide plan to prepare for uncertain climate change and socio-economic developments. Scientifically, adaptive delta management relies heavily on Dynamic Adaptive Policy Pathways. Currently, in the Netherlands the focus is shifting towards implementing the adaptive delta plan. This shift is especially relevant because the efficacy of adaptive plans hinges on monitoring on-going developments and ensuring that actions are indeed taken if and when necessary. In the design of an effective monitoring system for an adaptive plan, three challenges have to be confronted: • Shadow of the past: The development of adaptive plans and the design of their monitoring system relies heavily on current knowledge of the system, and current beliefs about plausible future developments. A static monitoring system is therefore exposed to the exact same uncertainties one tries to address through adaptive planning. • Inhibition of learning: Recent applications of adaptive planning tend to overlook the importance of learning and new information, and fail to account for this explicitly in the design of adaptive plans. • Challenge of surprise: Adaptive policies are designed in light of the current foreseen uncertainties. However, developments that are not considered during the design phase as being plausible could still substantially affect the performance of adaptive policies. The shadow of the past, the inhibition of learning, and the challenge of surprise taken together suggest that there is a need for redesigning the concepts of monitoring and evaluation to support the implementation of adaptive plans. 
Innovations from control theory, triggered by the challenge of uncertainty in operational control, may offer solutions from which monitoring for adaptive planning can benefit. Specifically: (i) in control, observations are incorporated into the model through data assimilation, which updates the present state, boundary conditions, and parameters based on new observations, diminishing the shadow of the past; (ii) adaptive control is a way to modify the characteristics of the internal model, incorporating new knowledge of the system and countervailing the inhibition of learning; and (iii) in closed-loop control, a continuous system update equips the controller with "inherent robustness", i.e. the capacity to adapt to new conditions even when these were not initially considered. We aim to explore how inherent robustness addresses the challenge of surprise. Innovations in model-based control might help to improve the models used to support adaptive delta management and adapt them to new information (reducing uncertainty). Moreover, this would offer a starting point for using these models not only in the design of adaptive plans, but also as part of the monitoring. The proposed research requires multidisciplinary cooperation between control theory, the policy sciences, and integrated assessment modeling.
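In its simplest scalar form, the data-assimilation idea in (i) is a Kalman-style update that blends a model prior with a new observation; this sketch is illustrative only and not taken from the abstract:

```python
def assimilate(prior_mean, prior_var, obs, obs_var):
    """One scalar Kalman-style update: weight the model prior and the new
    observation by their inverse variances."""
    gain = prior_var / (prior_var + obs_var)  # Kalman gain in [0, 1]
    post_mean = prior_mean + gain * (obs - prior_mean)
    post_var = (1.0 - gain) * prior_var       # posterior uncertainty shrinks
    return post_mean, post_var

# equally confident prior and observation -> posterior lands halfway
mean, var = assimilate(prior_mean=2.0, prior_var=1.0, obs=3.0, obs_var=1.0)
```

Repeating such updates as observations arrive keeps the model state from drifting on outdated knowledge, which is the sense in which assimilation diminishes the shadow of the past.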

  19. Quantifying volcanic hazard at Campi Flegrei caldera (Italy) with uncertainty assessment: 2. Pyroclastic density current invasion maps

    NASA Astrophysics Data System (ADS)

    Neri, Augusto; Bevilacqua, Andrea; Esposti Ongaro, Tomaso; Isaia, Roberto; Aspinall, Willy P.; Bisson, Marina; Flandoli, Franco; Baxter, Peter J.; Bertagnini, Antonella; Iannuzzi, Enrico; Orsucci, Simone; Pistolesi, Marco; Rosi, Mauro; Vitale, Stefano

    2015-04-01

    Campi Flegrei (CF) is an example of an active caldera containing densely populated settlements at very high risk of pyroclastic density currents (PDCs). We present here an innovative method for assessing background spatial PDC hazard in a caldera setting with probabilistic invasion maps conditional on the occurrence of an explosive event. The method encompasses the probabilistic assessment of potential vent opening positions, derived in the companion paper, combined with inferences about the spatial density distribution of PDC invasion areas from a simplified flow model, informed by reconstruction of deposits from eruptions in the last 15 ka. The flow model describes the PDC kinematics and accounts for main effects of topography on flow propagation. Structured expert elicitation is used to incorporate certain sources of epistemic uncertainty, and a Monte Carlo approach is adopted to produce a set of probabilistic hazard maps for the whole CF area. Our findings show that, in case of eruption, almost the entire caldera is exposed to invasion with a mean probability of at least 5%, with peaks greater than 50% in some central areas. Some areas outside the caldera are also exposed to this danger, with mean probabilities of invasion of the order of 5-10%. Our analysis suggests that these probability estimates have location-specific uncertainties which can be substantial. The results prove to be robust with respect to alternative elicitation models and allow the influence on hazard mapping of different sources of uncertainty, and of theoretical and numerical assumptions, to be quantified.
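The Monte Carlo logic described above — draw a vent-opening location from its probability distribution, ask a (here trivially simplified) flow model which cells it invades, and accumulate per-cell frequencies — can be illustrated on a toy grid; the vent names, weights, and footprints are invented:

```python
import random

def invasion_probability(vent_pdf, invaded_given_vent, n_sims=10000, seed=1):
    """Monte Carlo estimate of per-cell PDC invasion probability,
    conditional on an eruption occurring."""
    rng = random.Random(seed)
    vents, weights = zip(*vent_pdf.items())
    hits = {}
    for _ in range(n_sims):
        vent = rng.choices(vents, weights=weights)[0]  # sample vent opening
        for cell in invaded_given_vent[vent]:          # simplified flow model
            hits[cell] = hits.get(cell, 0) + 1
    return {cell: n / n_sims for cell, n in hits.items()}

# Toy caldera: two possible vents, each invading a fixed set of cells.
vent_pdf = {"west": 0.3, "east": 0.7}
footprints = {"west": {"A", "B"}, "east": {"B", "C"}}
p = invasion_probability(vent_pdf, footprints)
# cell "B" lies in both footprints, so its conditional probability is 1.0
```

A real assessment would replace the fixed footprints with runs of the kinematic flow model over topography, and wrap this loop in the elicited epistemic uncertainty on vent weights.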

  20. Vadose zone transport field study: Detailed test plan for simulated leak tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    AL Ward; GW Gee

    2000-06-23

The US Department of Energy (DOE) Groundwater/Vadose Zone Integration Project Science and Technology initiative was created in FY 1999 to reduce the uncertainty associated with vadose zone transport processes beneath waste sites at DOE's Hanford Site near Richland, Washington. This information is needed not only to evaluate the risks from transport, but also to support the adoption of measures for minimizing impacts to the groundwater and surrounding environment. The principal uncertainties in vadose zone transport are the current distribution of source contaminants and the natural heterogeneity of the soil in which the contaminants reside. Oversimplified conceptual models resulting from these uncertainties, together with limited use of hydrologic characterization and monitoring technologies, have hampered the understanding of contaminant migration through Hanford's vadose zone. Essential prerequisites for reducing vadose zone transport uncertainty include the development of accurate conceptual models and the development or adoption of monitoring techniques capable of delineating the current distributions of source contaminants and characterizing natural site heterogeneity. The Vadose Zone Transport Field Study (VZTFS) was conceived as part of the initiative to address the major uncertainties confronting vadose zone fate and transport predictions at the Hanford Site and to overcome the limitations of previous characterization attempts. Pacific Northwest National Laboratory (PNNL) is managing the VZTFS for DOE. The VZTFS will conduct field investigations that will improve the understanding of field-scale transport and lead to the development or identification of efficient and cost-effective characterization methods. Ideally, these methods will capture the extent of contaminant plumes using existing infrastructure (i.e., more than 1,300 steel-cased boreholes).
The objectives of the VZTFS are to conduct controlled transport experiments at well-instrumented field sites at Hanford to: identify mechanisms controlling transport processes in soils typical of the hydrogeologic conditions of Hanford's waste disposal sites; reduce uncertainty in conceptual models; develop a detailed and accurate database of hydraulic and transport parameters for validation of three-dimensional numerical models; identify and evaluate advanced, cost-effective characterization methods with the potential to assess changing conditions in the vadose zone, particularly as surrogates of currently undetectable high-risk contaminants. This plan provides details for conducting field tests during FY 2000 to accomplish these objectives. Details of additional testing during FY 2001 and FY 2002 will be developed as part of the work planning process implemented by the Integration Project.

  1. The timing and probability of treatment switch under cost uncertainty: an application to patients with gastrointestinal stromal tumor.

    PubMed

    de Mello-Sampayo, Felipa

    2014-03-01

Cost fluctuations render the outcome of any treatment switch uncertain, so that decision makers might have to wait for more information before optimally switching treatments, especially when the incremental cost per quality-adjusted life year (QALY) gained cannot be fully recovered later on. To analyze the timing of treatment switch under cost uncertainty. A dynamic stochastic model for the optimal timing of a treatment switch is developed and applied to a problem in medical decision making, i.e. to patients with unresectable gastrointestinal stromal tumour (GIST). The theoretical model suggests that cost uncertainty reduces expected net benefit. In addition, cost volatility discourages switching treatments. The stochastic model also illustrates that as technologies become less cost competitive, cost uncertainty becomes more dominant. With limited substitutability, higher quality of technologies will increase the demand for those technologies regardless of the cost uncertainty. The results of the empirical application suggest that the first-line treatment may be the better choice when considering lifetime welfare. Under uncertainty and irreversibility, low-risk patients must begin the second-line treatment as soon as possible, which is precisely when the second-line treatment is least valuable. As the costs of reversing current treatment impacts fall, it becomes more feasible to provide the option-preserving treatment to these low-risk individuals later on. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
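A toy Monte Carlo sketch of the underlying real-options intuition: with an irreversible switch, waiting to observe a volatile cost has option value, and that value grows with volatility. The willingness-to-pay threshold, QALY gain, cost level, and volatilities below are hypothetical, not the paper's calibration:

```python
import math
import random

def value_of_waiting(qaly_gain, cost_now, sigma, wtp=50000.0, n=20000, seed=2):
    """Expected net benefit of switching now versus waiting one period,
    observing the fluctuating cost, and switching only if still worthwhile.
    Because the waiting strategy can decline bad cost draws, its value
    grows with cost volatility sigma."""
    rng = random.Random(seed)
    switch_now = wtp * qaly_gain - cost_now
    wait = 0.0
    for _ in range(n):
        # mean-preserving lognormal cost shock
        future_cost = cost_now * math.exp(rng.gauss(-0.5 * sigma ** 2, sigma))
        wait += max(wtp * qaly_gain - future_cost, 0.0)  # option to decline
    return switch_now, wait / n

low_vol = value_of_waiting(qaly_gain=0.5, cost_now=24000.0, sigma=0.1)
high_vol = value_of_waiting(qaly_gain=0.5, cost_now=24000.0, sigma=0.8)
# higher cost volatility raises the value of waiting relative to switching now
```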

  2. Spatial resolution and measurement uncertainty of strains in bone and bone-cement interface using digital volume correlation.

    PubMed

    Zhu, Ming-Liang; Zhang, Qing-Hang; Lupton, Colin; Tong, Jie

    2016-04-01

The measurement uncertainty of strains has been assessed in a bone analogue (sawbone), bovine trabecular bone and bone-cement interface specimens under zero load using the Digital Volume Correlation (DVC) method. The effects of sub-volume size, sample constraint and preload on the measured strain uncertainty have been examined. There is generally a trade-off between the measurement uncertainty and the spatial resolution. Suitable sub-volume sizes have been selected based on a compromise between the measurement uncertainty and the spatial resolution of the cases considered. A ratio of sub-volume size to a microstructure characteristic (Tb.Sp) was introduced to reflect a suitable spatial resolution, and the associated measurement uncertainty was assessed. Specifically, ratios between 1.6 and 4 appear to give rise to standard deviations in the measured strains between 166 and 620 με in all the cases considered, which would seem to suffice for strain analysis in pre- as well as post-yield loading regimes. A microscale finite element (μFE) model was built from the CT images of the sawbone, and the results from the μFE model and a continuum FE model were compared with those from the DVC. The strain results were found to differ significantly between the two methods at the tissue level, consistent in trend with the results found in human bones, indicating mainly a limitation of the current DVC method in mapping strains at this level. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data.

    PubMed

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-12

Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.

  4. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data

    NASA Astrophysics Data System (ADS)

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-01

Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.

  5. Future forest aboveground carbon dynamics in the central United States: the importance of forest demographic processes

    Treesearch

    Wenchi Jin; Hong S. He; Frank R. Thompson; Wen J. Wang; Jacob S. Fraser; Stephen R. Shifley; Brice B. Hanberry; William D. Dijak

    2017-01-01

The Central Hardwood Forest (CHF) in the United States is currently a major carbon sink, but there are uncertainties about how long the current carbon sink will persist and whether the CHF will eventually become a carbon source. We used a multi-model ensemble to investigate aboveground carbon density of the CHF from 2010 to 2300 under current climate. Simulations were done using...

  6. Organic food consumption in Taiwan: Motives, involvement, and purchase intention under the moderating role of uncertainty.

    PubMed

    Teng, Chih-Ching; Lu, Chi-Heng

    2016-10-01

Despite the progressive development of the organic food sector in Taiwan, little is known about how consumers' consumption motives influence organic food decisions through various degrees of involvement, and whether consumers with various degrees of uncertainty vary in their intention to buy organic foods. The current study aims to examine the effect of consumption motives on behavioral intention related to organic food consumption under the mediating role of involvement as well as the moderating role of uncertainty. Research data were collected from organic food consumers in Taiwan via a questionnaire survey, eventually obtaining 457 valid questionnaires for analysis. This study tested the overall model fit and hypotheses through structural equation modeling (SEM). The results show that consumer involvement significantly mediates the effects of health consciousness and ecological motives on organic food purchase intention, but not the effect of food safety concern. Moreover, the moderating effect of uncertainty is statistically significant, indicating that the relationship between involvement and purchase intention becomes weaker for consumers with a higher degree of uncertainty. Several implications and suggestions are also discussed for organic food providers and marketers. Copyright © 2016. Published by Elsevier Ltd.

  7. High-Precision Half-Life Measurement for the Superallowed β+ Emitter 22Mg

    NASA Astrophysics Data System (ADS)

    Dunlop, Michelle

    2017-09-01

    High precision measurements of the Ft values for superallowed Fermi beta transitions between 0+ isobaric analogue states allow for stringent tests of the electroweak interaction. These transitions provide an experimental probe of the Conserved-Vector-Current hypothesis, the most precise determination of the up-down element of the Cabibbo-Kobayashi-Maskawa matrix, and set stringent limits on the existence of scalar currents in the weak interaction. To calculate the Ft values several theoretical corrections must be applied to the experimental data, some of which have large model dependent variations. Precise experimental determinations of the ft values can be used to help constrain the different models. The uncertainty in the 22Mg superallowed Ft value is dominated by the uncertainty in the experimental ft value. The adopted half-life of 22Mg is determined from two measurements which disagree with one another, resulting in the inflation of the weighted-average half-life uncertainty by a factor of 2. The 22Mg half-life was measured with a precision of 0.02% via direct β counting at TRIUMF's ISAC facility, leading to an improvement in the world-average half-life by more than a factor of 3.
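The inflation of a weighted-average uncertainty for discrepant inputs, mentioned above for the adopted 22Mg half-life, follows the standard scale-factor recipe sketched below; the two half-life values and their errors are hypothetical stand-ins for the disagreeing measurements:

```python
import math

def weighted_average_with_scale(values, errors):
    """Error-weighted mean; if the inputs are mutually inconsistent
    (chi^2/dof > 1), inflate the uncertainty by sqrt(chi^2/dof) -- the same
    kind of inflation (a factor ~2) noted in the abstract."""
    w = [1.0 / e ** 2 for e in errors]
    mean = sum(wi * v for wi, v in zip(w, values)) / sum(w)
    err = math.sqrt(1.0 / sum(w))
    dof = len(values) - 1
    chi2 = sum(wi * (v - mean) ** 2 for wi, v in zip(w, values))
    scale = math.sqrt(chi2 / dof) if dof > 0 and chi2 > dof else 1.0
    return mean, err * scale

# hypothetical discrepant pair of half-life values (seconds), 1-sigma errors
mean, err = weighted_average_with_scale([3.8755, 3.8775], [0.0005, 0.0005])
```

For this pair, chi^2/dof = 8, so the naive uncertainty of about 0.00035 s is inflated by sqrt(8) to 0.001 s.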

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel, David W.; Jarman, Kenneth D.; Xu, Zhijie

This report describes our initial research to quantify uncertainties in the identification and characterization of possible attack states in a network. As a result, we should be able to estimate the current state in which the network is operating, based on a wide variety of network data, and attach a defensible measure of confidence to these state estimates. The output of this research will be new uncertainty quantification (UQ) methods to help develop a process for model development and apply UQ to characterize attacks/adversaries, create an understanding of the degree to which methods scale to "big" data, and offer methods for addressing model approaches with regard to validation and accuracy.

  9. A Practical Probabilistic Graphical Modeling Tool for Weighing ...

    EPA Pesticide Factsheets

Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for ecological risk determinations. Probabilistic approaches can provide both a quantitative weighing of lines of evidence and methods for evaluating risk and uncertainty. The current modeling structure was developed for propagating uncertainties in measured endpoints and their influence on the plausibility of adverse effects. To illustrate the approach, we apply the model framework to the sediment quality triad using example lines of evidence for sediment chemistry measurements, bioassay results, and in situ infauna diversity of benthic communities in a simplified hypothetical case study. We then combine the three lines of evidence, evaluate sensitivity to the input parameters, and show how uncertainties are propagated and how additional information can be incorporated to rapidly update the probability of impacts. The developed network model can be expanded to accommodate additional lines of evidence, variables and states of importance, and different types of uncertainties in the lines of evidence, including spatial and temporal variability as well as measurement errors. We provide a flexible Bayesian network structure for weighing and integrating lines of evidence for ecological risk determinations.
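For independent lines of evidence, a probabilistic weighing can be sketched in odds form, multiplying one likelihood ratio per line; the prior and likelihood ratios below are invented, and the paper's Bayesian network is richer than this two-state sketch:

```python
def posterior_impact(prior, likelihood_ratios):
    """Combine independent lines of evidence in odds form: each line
    contributes a likelihood ratio P(evidence | impact) / P(evidence | no
    impact); posterior odds = prior odds times their product."""
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Hypothetical sediment-quality-triad lines of evidence:
# chemistry exceedance (LR=4), bioassay toxicity (LR=3), altered infauna (LR=2)
p = posterior_impact(prior=0.2, likelihood_ratios=[4.0, 3.0, 2.0])
```

Because the update is multiplicative, a new line of evidence can be folded in later by multiplying its likelihood ratio into the current odds, which is the "rapid updating" property noted above.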

  10. IAEA Coordinated Research Project on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strydom, Gerhard; Bostelmann, F.

The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, and modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained.) SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on derivative-based methods such as stochastic sampling methods or on generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed to quantify the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies.
In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on the HTGR Uncertainty Analysis in Modelling (UAM) be implemented. This CRP is a continuation of the previous IAEA and Organization for Economic Co-operation and Development (OECD)/Nuclear Energy Agency (NEA) international activities on Verification and Validation (V&V) of available analytical capabilities for HTGR simulation for design and safety evaluations. Within the framework of these activities different numerical and experimental benchmark problems were performed and insight was gained about specific physics phenomena and the adequacy of analysis methods.

  11. Uncertainty in mixing models: a blessing in disguise?

    NASA Astrophysics Data System (ADS)

    Delsman, J. R.; Oude Essink, G. H. P.

    2012-04-01

    Despite the abundance of tracer-based studies in catchment hydrology over the past decades, relatively few studies have addressed the uncertainty associated with these studies in much detail. This uncertainty stems from analytical error, spatial and temporal variance in end-member composition, and from not incorporating all relevant processes in the necessarily simplistic mixing models. Instead of applying standard EMMA methodology, we used end-member mixing model analysis within a Monte Carlo framework to quantify the uncertainty surrounding our analysis. Borrowing from the well-known GLUE methodology, we discarded mixing models that could not satisfactorily explain sample concentrations and analyzed the posterior parameter set. This use of environmental tracers aided in disentangling hydrological pathways in a Dutch polder catchment. This 10 km2 agricultural catchment is situated in the coastal region of the Netherlands. Brackish groundwater seepage, originating from Holocene marine transgressions, adversely affects water quality in this catchment. Current water management practice is aimed at improving water quality by flushing the catchment with fresh water from the river Rhine. Climate change is projected to decrease future fresh water availability, signifying the need for a more sustainable water management practice and a better understanding of the functioning of the catchment. The end-member mixing analysis increased our understanding of the hydrology of the studied catchment. The use of a GLUE-like framework for applying the end-member mixing analysis not only quantified the uncertainty associated with the analysis, the analysis of the posterior parameter set also identified the existence of catchment processes otherwise overlooked.
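A GLUE-like end-member mixing analysis can be sketched for a two-end-member, one-tracer case: perturb the uncertain end-member concentrations, solve each mixing model, and retain only the behavioral ones. The concentrations and uncertainties below are hypothetical, not the study's data:

```python
import random

def emma_glue(c_sample, em1, em2, sd, n=20000, seed=3):
    """GLUE-style two-end-member mixing: perturb uncertain end-member
    concentrations, solve for the mixing fraction of end-member 1, and
    keep only 'behavioral' models with a fraction in [0, 1]."""
    rng = random.Random(seed)
    behavioral = []
    for _ in range(n):
        c1 = rng.gauss(em1, sd)  # e.g. brackish seepage end-member
        c2 = rng.gauss(em2, sd)  # e.g. fresh flushing-water end-member
        if abs(c1 - c2) < 1e-9:
            continue
        f = (c_sample - c2) / (c1 - c2)
        if 0.0 <= f <= 1.0:
            behavioral.append(f)
    ranked = sorted(behavioral)
    mean = sum(ranked) / len(ranked)
    lo, hi = ranked[int(0.05 * len(ranked))], ranked[int(0.95 * len(ranked))]
    return mean, (lo, hi)

# hypothetical chloride (mg/L): sample 300, end-members 1000 and 30
mean_f, interval = emma_glue(300.0, 1000.0, 30.0, sd=100.0)
```

The posterior spread of `behavioral` is the uncertainty estimate, and systematic clustering of rejected models can point at processes the mixing model omits.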

  12. Drought Persistence in Models and Observations

    NASA Astrophysics Data System (ADS)

    Moon, Heewon; Gudmundsson, Lukas; Seneviratne, Sonia

    2017-04-01

Many regions of the world have experienced drought events that persisted several years and caused substantial economic and ecological impacts in the 20th century. However, it remains unclear whether there are significant trends in the frequency or severity of these prolonged drought events. In particular, an important issue is linked to systematic biases in the representation of persistent drought events in climate models, which impedes analysis related to the detection and attribution of drought trends. This study assesses drought persistence errors in global climate model (GCM) simulations from the 5th phase of the Coupled Model Intercomparison Project (CMIP5) over the period 1901-2010. The model simulations are compared with five gridded observational data products. The analysis focuses on two aspects: the identification of systematic biases in the models and the partitioning of the spread of the drought-persistence-error into four possible sources of uncertainty: model uncertainty, observation uncertainty, internal climate variability and the estimation error of drought persistence. We use monthly and yearly dry-to-dry transition probabilities as estimates of drought persistence, with drought conditions defined as negative precipitation anomalies. For both time scales we find that most model simulations consistently underestimated drought persistence except in a few regions such as India and Eastern South America. Partitioning the spread of the drought-persistence-error shows that at the monthly time scale model uncertainty and observation uncertainty are dominant, while the contribution from internal variability plays only a minor role in most cases. At the yearly scale, the spread of the drought-persistence-error is dominated by the estimation error, indicating that the partitioning is not statistically significant, due to the limited number of considered time steps.
These findings reveal systematic errors in the representation of drought persistence in current climate models and highlight the main contributors to the uncertainty of the drought-persistence-error. Future analyses will focus on investigating the temporal propagation of drought persistence to better understand the causes of the identified errors in the representation of drought persistence in state-of-the-art climate models.
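The dry-to-dry transition probability used above as the persistence estimate can be computed directly from an anomaly series; the toy monthly series below is invented:

```python
def dry_to_dry_probability(precip_anomalies):
    """Estimate drought persistence as the probability that a dry step
    (negative precipitation anomaly) is followed by another dry step."""
    dry = [a < 0 for a in precip_anomalies]
    pairs = list(zip(dry, dry[1:]))                       # consecutive steps
    dry_then_dry = sum(1 for a, b in pairs if a and b)
    dry_total = sum(1 for a, _ in pairs if a)
    return dry_then_dry / dry_total if dry_total else float("nan")

# toy monthly anomaly series: a persistent dry spell, then mostly wet months
anoms = [-1.2, -0.8, -0.5, 0.4, -0.3, 0.9, 1.1, -0.2, -0.6]
p = dry_to_dry_probability(anoms)  # 3 of the 5 dry months are followed by dry
```

Comparing this statistic between model output and gridded observations, per grid cell and time scale, yields the drought-persistence-error partitioned in the study.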

  13. REVIEW OF MECHANISTIC UNDERSTANDING AND MODELING AND UNCERTAINTY ANALYSIS METHODS FOR PREDICTING CEMENTITIOUS BARRIER PERFORMANCE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langton, C.; Kosson, D.

    2009-11-30

Cementitious barriers for nuclear applications are one of the primary controls for preventing or limiting radionuclide release into the environment. At the present time, performance and risk assessments do not fully incorporate the effectiveness of engineered barriers because the processes that influence performance are coupled and complicated. Better understanding the behavior of cementitious barriers is necessary to evaluate and improve the design of materials and structures used for radioactive waste containment, life extension of current nuclear facilities, and design of future nuclear facilities, including those needed for nuclear fuel storage and processing, nuclear power production and waste management. The focus of the Cementitious Barriers Partnership (CBP) literature review is to document the current level of knowledge with respect to: (1) mechanisms and processes that directly influence the performance of cementitious materials; (2) methodologies for modeling the performance of these mechanisms and processes; and (3) approaches to addressing and quantifying uncertainties associated with performance predictions. This will serve as an important reference document for the professional community responsible for the design and performance assessment of cementitious materials in nuclear applications. This review also provides a multi-disciplinary foundation for the identification, research, development and demonstration of improvements in conceptual understanding, measurements and performance modeling that would lead to significant reductions in the uncertainties and improved confidence in estimating the long-term performance of cementitious materials in nuclear applications. This report identifies: (1) technology gaps that may be filled by the CBP project and (2) information and computational methods that are currently being applied in related fields but have not yet been incorporated into performance assessments of cementitious barriers.
The various chapters contain both a description of each mechanism and a discussion of the current approaches to modeling the phenomena.

  14. Projected status of the Pacific walrus (Odobenus rosmarus divergens) in the twenty-first century

    USGS Publications Warehouse

    Jay, Chadwick V.; Marcot, Bruce G.; Douglas, David C.

    2011-01-01

    Extensive and rapid losses of sea ice in the Arctic have raised conservation concerns for the Pacific walrus (Odobenus rosmarus divergens), a large pinniped inhabiting arctic and subarctic continental shelf waters of the Chukchi and Bering seas. We developed a Bayesian network model to integrate potential effects of changing environmental conditions and anthropogenic stressors on the future status of the Pacific walrus population at four periods through the twenty-first century. The model framework allowed for inclusion of various sources and levels of knowledge, and representation of structural and parameter uncertainties. Walrus outcome probabilities through the century reflected a clear trend of worsening conditions for the subspecies. From the current observation period to the end of the century, the greatest change in walrus outcome probabilities was a progressive decrease in the outcome state of robust and a concomitant increase in the outcome state of vulnerable. The probabilities of rare and extirpated states each progressively increased but remained <10% through the end of the century. The summed probabilities of vulnerable, rare, and extirpated (P(v,r,e)) increased from a current level of 10% in 2004 to 22% by 2050 and 40% by 2095. The degree of uncertainty in walrus outcomes increased monotonically over future periods. In the model, sea ice habitat (particularly for summer/fall) and harvest levels had the greatest influence on future population outcomes. Other potential stressors had much smaller influences on walrus outcomes, mostly because of uncertainty in their future states and our current poor understanding of their mechanistic influence on walrus abundance.

  15. Constraining the inferred paleohydrologic evolution of a deep unsaturated zone in the Amargosa Desert

    USGS Publications Warehouse

    Walvoord, Michelle Ann; Stonestrom, David A.; Andraski, Brian J.; Striegl, Robert G.

    2004-01-01

    Natural flow regimes in deep unsaturated zones of arid interfluvial environments are rarely in hydraulic equilibrium with near-surface boundary conditions imposed by present-day plant–soil–atmosphere dynamics. Nevertheless, assessments of water resources and contaminant transport require realistic estimates of gas, water, and solute fluxes under past, present, and projected conditions. Multimillennial transients that are captured in current hydraulic, chemical, and isotopic profiles can be interpreted to constrain alternative scenarios of paleohydrologic evolution following climatic and vegetational shifts from pluvial to arid conditions. However, interpreting profile data with numerical models presents formidable challenges in that boundary conditions must be prescribed throughout the entire Holocene, when we have at most a few decades of actual records. Models of profile development at the Amargosa Desert Research Site include substantial uncertainties from imperfectly known initial and boundary conditions when simulating flow and solute transport over millennial timescales. We show how multiple types of profile data, including matric potentials, porewater Cl− concentrations, and stable isotope ratios (δD, δ18O), can be used in multiphase heat, flow, and transport models to expose and reduce uncertainty in paleohydrologic reconstructions. Results indicate that a dramatic shift in the near-surface water balance occurred approximately 16000 yr ago, but that transitions in precipitation, temperature, and vegetation were not necessarily synchronous. The timing of the hydraulic transition imparts the largest uncertainty to model-predicted contemporary fluxes. In contrast, the uncertainties associated with initial (late Pleistocene) conditions and boundary conditions during the Holocene impart only small uncertainties to model-predicted contemporaneous fluxes.

  16. Estimating Model Prediction Error: Should You Treat Predictions as Fixed or Random?

    NASA Technical Reports Server (NTRS)

    Wallach, Daniel; Thorburn, Peter; Asseng, Senthold; Challinor, Andrew J.; Ewert, Frank; Jones, James W.; Rotter, Reimund; Ruane, Alexander

    2016-01-01

    Crop models are important tools for impact assessment of climate change, as well as for exploring management options under current climate. It is essential to evaluate the uncertainty associated with predictions of these models. We compare two criteria of prediction error: MSEP_fixed, which evaluates mean squared error of prediction for a model with fixed structure, parameters, and inputs; and MSEP_uncertain(X), which evaluates mean squared error averaged over the distributions of model structure, inputs, and parameters. Comparison of model outputs with data can be used to estimate the former. The latter has a squared bias term, which can be estimated using hindcasts, and a model variance term, which can be estimated from a simulation experiment. The separate contributions to MSEP_uncertain(X) can be estimated using a random-effects ANOVA. It is argued that MSEP_uncertain(X) is the more informative uncertainty criterion, because it is specific to each prediction situation.
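    The bias/variance decomposition the abstract describes can be sketched numerically. The following is a minimal illustration with synthetic data (the toy "ensemble of model variants" and all numbers are invented for the example, not taken from the paper), showing that the mean squared error averaged over an ensemble splits exactly into a squared-bias term plus a model-variance term:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic truth and noisy observations at n prediction situations
n = 50
truth = rng.normal(10.0, 2.0, size=n)
obs = truth + rng.normal(0.0, 0.5, size=n)

# An ensemble of m model variants (draws over structure/parameters/inputs),
# each with its own systematic offset plus residual error
m = 20
offsets = rng.normal(0.8, 0.6, size=m)
preds = truth[None, :] + offsets[:, None] + rng.normal(0.0, 0.3, size=(m, n))

# MSEP_fixed: mean squared prediction error of one fixed model variant
msep_fixed = np.mean((preds[0] - obs) ** 2)

# MSEP_uncertain(X): MSE averaged over the distribution of model variants,
# which decomposes into a squared-bias term plus a model-variance term
msep_uncertain = np.mean((preds - obs[None, :]) ** 2)
sq_bias = np.mean((preds.mean(axis=0) - obs) ** 2)
model_var = np.mean(preds.var(axis=0))

# The decomposition holds exactly
assert np.isclose(msep_uncertain, sq_bias + model_var)
print(f"MSEP_fixed={msep_fixed:.3f}  squared bias={sq_bias:.3f}  "
      f"model variance={model_var:.3f}  MSEP_uncertain={msep_uncertain:.3f}")
```

    In the paper's framing, the squared-bias term would come from hindcasts and the model-variance term from a designed simulation experiment; here both are computed directly from the synthetic ensemble.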

  17. Source Data Impacts on Epistemic Uncertainty for Launch Vehicle Fault Tree Models

    NASA Technical Reports Server (NTRS)

    Al Hassan, Mohammad; Novack, Steven; Ring, Robert

    2016-01-01

    Launch vehicle systems are designed and developed using both heritage and new hardware. Design modifications to the heritage hardware to fit new functional system requirements can impact the applicability of heritage reliability data. Risk estimates for newly designed systems must be developed from generic data sources such as commercially available reliability databases using reliability prediction methodologies, such as those addressed in MIL-HDBK-217F. Failure estimates must be converted from the generic environment to the specific operating environment of the system in which they are used. In addition, some qualification of applicability for the data source to the current system should be made. Characterizing data applicability under these circumstances is crucial to developing model estimations that support confident decisions on design changes and trade studies. This paper will demonstrate a data-source applicability classification method for suggesting epistemic component uncertainty to a target vehicle based on the source and operating environment of the originating data. The source applicability is determined using heuristic guidelines while translation of operating environments is accomplished by applying statistical methods to MIL-HDBK-217F tables. The paper will provide one example for assigning environmental factors uncertainty when translating between operating environments for the microelectronic part-type components. The heuristic guidelines will be followed by uncertainty-importance routines to assess the need for more applicable data to reduce model uncertainty.

  18. From catchment scale hydrologic processes to numerical models and robust predictions of climate change impacts at regional scales

    NASA Astrophysics Data System (ADS)

    Wagener, T.

    2017-12-01

    Current societal problems and questions demand that we increasingly build hydrologic models for regional or even continental scale assessment of global change impacts. Such models offer new opportunities for scientific advancement, for example by enabling comparative hydrology or connectivity studies, and for improved support of water management decisions, since we might better understand regional impacts on water resources from large scale phenomena such as droughts. On the other hand, we are faced with epistemic uncertainties when we move up in scale. The term epistemic uncertainty describes those uncertainties that are not well determined by historical observations. This lack of determination can be because the future is not like the past (e.g. due to climate change), because the historical data is unreliable (e.g. because it is imperfectly recorded from proxies or missing), or because it is scarce (either because measurements are not available at the right scale or there is no observation network available at all). In this talk I will explore: (1) how we might build a bridge between what we have learned about catchment scale processes and hydrologic model development and evaluation at larger scales; (2) how we can understand the impact of epistemic uncertainty in large scale hydrologic models; and (3) how we might utilize large scale hydrologic predictions to understand climate change impacts, e.g. on infectious disease risk.

  19. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    PubMed

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
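    The propagation step described above, evaluating a posterior through an emulator rather than the expensive model itself, can be sketched in miniature. The code below is an illustrative stand-in, not the authors' pipeline: a one-parameter toy "expensive model", a hand-rolled squared-exponential Gaussian-process emulator, and propagation of an assumed Gaussian parameter posterior through the emulator. All functions, hyperparameters, and distributions are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Expensive" model evaluated only at a few design points (a toy stand-in
# for, e.g., a density-functional calculation of an observable)
def expensive_model(theta):
    return np.sin(theta) + 0.5 * theta

X = np.linspace(-2.0, 2.0, 8)
y = expensive_model(X)

# Squared-exponential GP emulator fitted to the design points
ELL, SIG2, JITTER = 1.0, 1.0, 1e-6
def kern(a, b):
    return SIG2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ELL**2)

Kinv = np.linalg.inv(kern(X, X) + JITTER * np.eye(X.size))

def emulate(xs):
    # GP posterior mean and variance at new points xs
    Ks = kern(xs, X)
    mean = Ks @ Kinv @ y
    var = SIG2 - np.einsum("ij,jk,ik->i", Ks, Kinv, Ks)
    return mean, np.maximum(var, 0.0)

# Propagate an assumed parameter posterior through the cheap emulator
theta_post = rng.normal(0.8, 0.3, size=20_000)
mean, var = emulate(theta_post)
pred = rng.normal(mean, np.sqrt(var))  # emulator + parameter uncertainty
print(f"predicted observable: {pred.mean():.3f} +/- {pred.std():.3f}")
```

    The predictive spread here mixes parameter uncertainty with the (small) emulator uncertainty, which is the same bookkeeping the Bayesian framework in the paper performs at far larger scale.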

  20. Measuring the free neutron lifetime to <= 0.3s via the beam method

    NASA Astrophysics Data System (ADS)

    Fomin, Nadia

    2017-09-01

    Neutron beta decay is an archetype for all semi-leptonic charged-current weak processes. While of interest as a fundamental particle property, a precise value for the neutron lifetime is also required for consistency tests of the Standard Model as well as to calculate the primordial 4He abundance in Big Bang Nucleosynthesis models. An effort has begun to develop an in-beam measurement of the neutron lifetime with a projected <= 0.3s uncertainty. This effort is part of a phased campaign of neutron lifetime measurements based at the NIST Center for Neutron Research, using the Sussex-ILL-NIST technique. Recent advances in neutron fluence measurement techniques as well as new large area silicon detector technology address the two largest sources of uncertainty of in-beam measurements, paving the way for a new measurement. The experimental design and projected uncertainties for the 0.3s measurement will be discussed.

  1. Observational constraints indicate risk of drying in the Amazon basin.

    PubMed

    Shiogama, Hideo; Emori, Seita; Hanasaki, Naota; Abe, Manabu; Masutomi, Yuji; Takahashi, Kiyoshi; Nozawa, Toru

    2011-03-29

    Climate warming due to human activities will be accompanied by hydrological cycle changes. Economies, societies and ecosystems in South America are vulnerable to such water resource changes. Hence, water resource impact assessments for South America, and corresponding adaptation and mitigation policies, have attracted increased attention. However, substantial uncertainties remain in the current water resource assessments that are based on multiple coupled Atmosphere-Ocean General Circulation Models. These assessments range from significant wetting to catastrophic drying. By applying a statistical method, we characterized the uncertainty and identified global-scale metrics for measuring the reliability of water resource assessments in South America. Here, we show that, although the ensemble mean assessment suggested wetting across most of South America, the observational constraints indicate a higher probability of drying in the Amazon basin. Thus, over-reliance on the consensus of models can lead to inappropriate decision making.

  2. Determining an empirical estimate of the tracking inconsistency component for true astrometric uncertainties

    NASA Astrophysics Data System (ADS)

    Ramanjooloo, Yudish; Tholen, David J.; Fohring, Dora; Claytor, Zach; Hung, Denise

    2017-10-01

    The asteroid community is moving towards the implementation of a new astrometric reporting format. This new format will finally include complementary astrometric uncertainties in the reported observations. The availability of uncertainties will allow ephemeris predictions and orbit solutions to be constrained with greater reliability, thereby improving the efficiency of the community's follow-up and recovery efforts. Our current uncertainty model involves our uncertainties in centroiding on the trailed stars and asteroid and the uncertainty due to the astrometric solution. The accuracy of our astrometric measurements is reliant on how well we can minimise the offset between the spatial and temporal centroids of the stars and the asteroid. This offset is currently unmodelled and can be caused by variations in the cloud transparency, the seeing and tracking inconsistencies. The magnitude zero point of the image, which is affected by fluctuating weather conditions and the catalog bias in the photometric magnitudes, can serve as an indicator of the presence and thickness of clouds. Through comparison of the astrometric uncertainties to the orbit solution residuals, it was apparent that a component of the error analysis remained unaccounted for, as a result of cloud coverage and thickness, telescope tracking inconsistencies and variable seeing. This work will attempt to quantify the tracking inconsistency component. We have acquired a rich dataset with the University of Hawaii 2.24 metre telescope (UH-88 inch) that is well positioned to construct an empirical estimate of the tracking inconsistency component. This work is funded by NASA grant NXX13AI64G.

  3. Uncertainty squared: Choosing among multiple input probability distributions and interpreting multiple output probability distributions in Monte Carlo climate risk models

    NASA Astrophysics Data System (ADS)

    Baer, P.; Mastrandrea, M.

    2006-12-01

    Simple probabilistic models which attempt to estimate likely transient temperature change from specified CO2 emissions scenarios must make assumptions about at least six uncertain aspects of the causal chain between emissions and temperature: current radiative forcing (including but not limited to aerosols), current land use emissions, carbon sinks, future non-CO2 forcing, ocean heat uptake, and climate sensitivity. Of these, multiple PDFs (probability density functions) have been published for the climate sensitivity, a couple for current forcing and ocean heat uptake, one for future non-CO2 forcing, and none for current land use emissions or carbon cycle uncertainty (which are interdependent). Different assumptions about these parameters, as well as different model structures, will lead to different estimates of likely temperature increase from the same emissions pathway. Thus policymakers will be faced with a range of temperature probability distributions for the same emissions scenarios, each described by a central tendency and spread. Because our conventional understanding of uncertainty and probability requires that a probabilistically defined variable of interest have only a single mean (or median, or modal) value and a well-defined spread, this "multidimensional" uncertainty defies straightforward utilization in policymaking. We suggest that there are no simple solutions to the questions raised. Crucially, we must dispel the notion that there is a "true" probability: probabilities of this type are necessarily subjective, and reasonable people may disagree. Indeed, we suggest that what is at stake is precisely the question, what is it reasonable to believe, and to act as if we believe? As a preliminary suggestion, we demonstrate how the output of a simple probabilistic climate model might be evaluated regarding the reasonableness of the outputs it calculates with different input PDFs.
We suggest further that where there is insufficient evidence to clearly favor one range of probabilistic projections over another, the choice of results on which to base policy must necessarily involve ethical considerations, as they have inevitable consequences for the distribution of risk. In particular, the choice to use a more "optimistic" PDF for climate sensitivity (or other components of the causal chain) leads to the allowance of higher emissions consistent with any specified goal for risk reduction, and thus leads to higher climate impacts, in exchange for lower mitigation costs.
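    The core effect the authors highlight, that alternative input PDFs for climate sensitivity yield materially different output temperature distributions, can be illustrated with a toy Monte Carlo sketch. Both sensitivity distributions and the linear forcing-response relation below are assumptions invented for the example, not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
F2X = 3.7  # W/m^2 forcing for doubled CO2 (standard value)
DF = 4.5   # hypothetical future forcing, W/m^2 (illustrative)

# Two alternative climate-sensitivity PDFs (illustrative shapes only):
# a narrow "optimistic" lognormal and a heavier-tailed lognormal
s_optimistic = rng.lognormal(mean=np.log(2.5), sigma=0.15, size=n)
s_heavy_tail = rng.lognormal(mean=np.log(3.0), sigma=0.40, size=n)

def warming(sensitivity):
    # Toy equilibrium response: dT = S * dF / F2X
    return sensitivity * DF / F2X

for label, s in [("optimistic", s_optimistic), ("heavy-tailed", s_heavy_tail)]:
    q05, q50, q95 = np.percentile(warming(s), [5, 50, 95])
    print(f"{label:12s} median={q50:.2f}C  90% range=({q05:.2f}, {q95:.2f})C")
```

    The two runs share the same emissions assumption, so any difference in the output quantiles (and hence in the emissions allowable under a fixed temperature target) comes entirely from the choice of input PDF, which is exactly the policy-relevant choice the abstract argues is ultimately ethical rather than purely statistical.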

  4. From biota to chemistry and climate: towards a comprehensive description of trace gas exchange between the biosphere and atmosphere

    NASA Astrophysics Data System (ADS)

    Arneth, A.; Sitch, S.; Bondeau, A.; Butterbach-Bahl, K.; Foster, P.; Gedney, N.; de Noblet-Ducoudré, N.; Prentice, I. C.; Sanderson, M.; Thonicke, K.; Wania, R.; Zaehle, S.

    2010-01-01

    Exchange of non-CO2 trace gases between the land surface and the atmosphere plays an important role in atmospheric chemistry and climate. Recent studies have highlighted its importance for interpretation of glacial-interglacial ice-core records, the simulation of the pre-industrial and present atmosphere, and the potential for large climate-chemistry and climate-aerosol feedbacks in the coming century. However, spatial and temporal variations in trace gas emissions and the magnitude of future feedbacks are a major source of uncertainty in atmospheric chemistry, air quality and climate science. To reduce such uncertainties, Dynamic Global Vegetation Models (DGVMs) are currently being expanded to mechanistically represent processes relevant to non-CO2 trace gas exchange between land biota and the atmosphere. In this paper we present a review of important non-CO2 trace gas emissions, the state-of-the-art in DGVM modelling of processes regulating these emissions, identify key uncertainties for global scale model applications, and discuss a methodology for model integration and evaluation.

  5. From biota to chemistry and climate: towards a comprehensive description of trace gas exchange between the biosphere and atmosphere

    NASA Astrophysics Data System (ADS)

    Arneth, A.; Sitch, S.; Bondeau, A.; Butterbach-Bahl, K.; Foster, P.; Gedney, N.; de Noblet-Ducoudré, N.; Prentice, I. C.; Sanderson, M.; Thonicke, K.; Wania, R.; Zaehle, S.

    2009-07-01

    Exchange of non-CO2 trace gases between the land surface and the atmosphere plays an important role in atmospheric chemistry and climate. Recent studies have highlighted its importance for interpretation of glacial-interglacial ice-core records, the simulation of the pre-industrial and present atmosphere, and the potential for large climate-chemistry and climate-aerosol feedbacks in the coming century. However, spatial and temporal variations in trace gas emissions and the magnitude of future feedbacks are a major source of uncertainty in atmospheric chemistry, air quality and climate science. To reduce such uncertainties, Dynamic Global Vegetation Models (DGVMs) are currently being expanded to mechanistically represent processes relevant to non-CO2 trace gas exchange between land biota and the atmosphere. In this paper we present a review of important non-CO2 trace gas emissions, the state-of-the-art in DGVM modelling of processes regulating these emissions, identify key uncertainties for global scale model applications, and discuss a methodology for model integration and evaluation.

  6. Accounting for Age Uncertainty in Growth Modeling, the Case Study of Yellowfin Tuna (Thunnus albacares) of the Indian Ocean

    PubMed Central

    Dortel, Emmanuelle; Massiot-Granier, Félix; Rivot, Etienne; Million, Julien; Hallier, Jean-Pierre; Morize, Eric; Munaron, Jean-Marie; Bousquet, Nicolas; Chassot, Emmanuel

    2013-01-01

    Age estimates, typically determined by counting periodic growth increments in calcified structures of vertebrates, are the basis of population dynamics models used for managing exploited or threatened species. In fisheries research, the use of otolith growth rings as an indicator of fish age has increased considerably in recent decades. However, otolith readings include various sources of uncertainty. Current ageing methods, which convert an average count of rings into age, only provide periodic age estimates in which the range of uncertainty is fully ignored. In this study, we describe a hierarchical model for estimating individual ages from repeated otolith readings. The model was developed within a Bayesian framework to explicitly represent the sources of uncertainty associated with age estimation, to allow for individual variations and to include knowledge on parameters from expertise. The performance of the proposed model was examined through simulations, and then it was coupled to a two-stanza somatic growth model to evaluate the impact of the age estimation method on the age composition of commercial fisheries catches. We illustrate our approach using the sagittal otoliths of yellowfin tuna of the Indian Ocean collected through large-scale mark-recapture experiments. The simulation performance suggested that the ageing error model was able to estimate the ageing biases and provide accurate age estimates, regardless of the age of the fish. Coupled with the growth model, this approach appeared suitable for modeling the growth of Indian Ocean yellowfin and is consistent with findings of previous studies. The simulations showed that the choice of the ageing method can strongly affect growth estimates, with subsequent implications for age-structured data used as inputs for population models. Finally, our modeling approach proved particularly useful for propagating uncertainty in age estimates into the process of growth estimation, and it can be applied to any study relying on age estimation. PMID:23637773

  7. Global validation of a process-based model on vegetation gross primary production using eddy covariance observations.

    PubMed

    Liu, Dan; Cai, Wenwen; Xia, Jiangzhou; Dong, Wenjie; Zhou, Guangsheng; Chen, Yang; Zhang, Haicheng; Yuan, Wenping

    2014-01-01

    Gross Primary Production (GPP) is the largest flux in the global carbon cycle. However, large uncertainties in current global estimations persist. In this study, we examined the performance of a process-based model (Integrated BIosphere Simulator, IBIS) at 62 eddy covariance sites around the world. Our results indicated that the IBIS model explained 60% of the observed variation in daily GPP at all validation sites. Comparison with a satellite-based vegetation model (Eddy Covariance-Light Use Efficiency, EC-LUE) revealed that the IBIS simulations yielded comparable GPP results as the EC-LUE model. Global mean GPP estimated by the IBIS model was 107.50±1.37 Pg C year(-1) (mean value ± standard deviation) across the vegetated area for the period 2000-2006, consistent with the results of the EC-LUE model (109.39±1.48 Pg C year(-1)). To evaluate the uncertainty introduced by the parameter Vcmax, which represents the maximum photosynthetic capacity, we inverted Vcmax using Markov chain Monte Carlo (MCMC) procedures. Using the inverted Vcmax values, the simulated global GPP increased by 16.5 Pg C year(-1), indicating that the IBIS model is sensitive to Vcmax and that large uncertainty exists in model parameterization.
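    The MCMC parameter inversion mentioned above can be sketched with a toy example. The saturating light-response function, the flat prior, and the synthetic observations below are assumptions invented for illustration (not the IBIS photosynthesis scheme; `vmax` merely plays the role of Vcmax), and the sampler is a basic random-walk Metropolis algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy light-response GPP model: GPP = vmax * PAR / (PAR + K)
K = 400.0
def gpp_model(vmax, par):
    return vmax * par / (par + K)

# Synthetic "eddy covariance" observations generated with a known vmax
par = rng.uniform(100, 1500, size=200)
TRUE_VMAX = 60.0
obs = gpp_model(TRUE_VMAX, par) + rng.normal(0.0, 1.0, size=200)

def log_post(vmax):
    if not (10.0 < vmax < 150.0):    # flat prior on a plausible range
        return -np.inf
    resid = obs - gpp_model(vmax, par)
    return -0.5 * np.sum(resid ** 2)  # Gaussian likelihood, sigma = 1

# Random-walk Metropolis sampler
chain = np.empty(5000)
v, lp = 40.0, log_post(40.0)
for i in range(chain.size):
    prop = v + rng.normal(0.0, 2.0)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # accept/reject step
        v, lp = prop, lp_prop
    chain[i] = v

posterior = chain[1000:]  # discard burn-in
print(f"posterior vmax: {posterior.mean():.2f} +/- {posterior.std():.2f}")
```

    The posterior mean recovers the known parameter, and the posterior spread is the parameter uncertainty that would then be propagated into a GPP estimate, which is the sensitivity the abstract reports for Vcmax.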

  8. Streamflow loss quantification for groundwater flow modeling using a wading-rod-mounted acoustic Doppler current profiler in a headwater stream

    NASA Astrophysics Data System (ADS)

    Pflügl, Christian; Hoehn, Philipp; Hofmann, Thilo

    2017-04-01

    Irrespective of the availability of various field measurement and modeling approaches, the quantification of interactions between surface water and groundwater systems remains associated with high uncertainty. Such uncertainties in stream-aquifer interaction can lead to significant misinterpretation of the local water budget and water quality. Because stream discharge typically varies considerably over time, it is desirable to reduce the duration of streamflow measurements while also reducing their uncertainty. Streamflow measurements, according to the velocity-area method, were performed along reaches of a losing-disconnected, subalpine headwater stream using a 2-dimensional, wading-rod-mounted acoustic Doppler current profiler (ADCP). The method was chosen, with stream morphology not allowing for boat-mounted setups, to reduce uncertainty compared to conventional, single-point streamflow measurements of similar measurement duration. Reach-averaged stream loss rates were subsequently quantified between 12 cross sections. They enabled the delineation of strongly infiltrating stream reaches and their differentiation from insignificantly infiltrating reaches. Furthermore, a total of 10 near-stream observation wells were constructed and/or equipped with pressure and temperature loggers. The time series of near-stream groundwater temperature data were cross-correlated with stream temperature time series to yield supportive qualitative information on the delineation of infiltrating reaches. Subsequently, as a reference parameterization, the hydraulic conductivity and specific yield of a numerical, steady-state model of groundwater flow in the unconfined glaciofluvial aquifer adjacent to the stream were inversely determined, incorporating the inferred stream loss rates. The same inversion procedure was then run with synthetic sets of infiltration rates resembling increasing levels of uncertainty associated with single-point streamflow measurements of comparable duration. The volume-weighted mean of the respective parameter distribution within 200 m of the stream periphery deviated increasingly from the reference parameterization as the deviation of the infiltration rates increased.

  9. Comparison of model estimated and measured direct-normal solar irradiance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halthore, R.N.; Schwartz, S.E.; Michalsky, J.J.

    1997-12-01

    Direct-normal solar irradiance (DNSI), the energy in the solar spectrum incident in unit time at the Earth's surface on a unit area perpendicular to the direction to the Sun, depends only on atmospheric extinction of solar energy without regard to the details of the extinction, whether absorption or scattering. Here we report a set of closure experiments performed in north central Oklahoma in April 1996 under cloud-free conditions, wherein measured atmospheric composition and aerosol optical thickness are input to a radiative transfer model, MODTRAN 3, to estimate DNSI, which is then compared with measured values obtained with normal incidence pyrheliometers and absolute cavity radiometers. Uncertainty in aerosol optical thickness (AOT) dominates the uncertainty in DNSI calculation. AOT measured by an independently calibrated Sun photometer and a rotating shadow-band radiometer agree to within the uncertainties of each measurement. For 36 independent comparisons the agreement between measured and model-estimated values of DNSI falls within the combined uncertainties in the measurement (0.3-0.7%) and model calculation (1.8%), albeit with a slight average model underestimate (-0.18 ± 0.94)%; for a DNSI of 839 W m(-2) this corresponds to -1.5 ± 7.9 W m(-2). The agreement is nearly independent of air mass and water-vapor path abundance. These results thus establish the accuracy of the current knowledge of the solar spectrum, its integrated power, and the atmospheric extinction as a function of wavelength as represented in MODTRAN 3. An important consequence is that atmospheric absorption of short-wave energy is accurately parametrized in the model to within the above uncertainties. © 1997 American Geophysical Union

  10. Use of raster-based data layers to model spatial variation of seismotectonic data in probabilistic seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Zolfaghari, Mohammad R.

    2009-07-01

    Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications. Examples of such applications are risk mitigation, disaster management, post-disaster recovery planning, and catastrophe loss estimation and risk management. Due to incomplete knowledge of the factors controlling seismic hazards, there are always uncertainties associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, most of the data and assumptions used in seismic hazard studies remain highly uncertain and contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides a practical means of better capturing spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used in order to model geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors into seismic hazard modelling. Examples of such components are seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology provides improvements in several aspects of the standard analytical tools currently being used for assessment and mapping of regional seismic hazard. It makes full use of recent advances in computer software and hardware and is well structured for implementation using conventional GIS tools.

  11. Robust guaranteed-cost adaptive quantum phase estimation

    NASA Astrophysics Data System (ADS)

    Roy, Shibdas; Berry, Dominic W.; Petersen, Ian R.; Huntington, Elanor H.

    2017-05-01

    Quantum parameter estimation plays a key role in many fields like quantum computation, communication, and metrology. Optimal estimation allows one to achieve the most precise parameter estimates, but requires accurate knowledge of the model. Any inevitable uncertainty in the model parameters may heavily degrade the quality of the estimate. It is therefore desirable to make the estimation process robust to such uncertainties. Robust estimation was previously studied for a varying phase, where the goal was to estimate the phase at some time in the past, using the measurement results from both before and after that time within a fixed time interval up to the current time. Here, we consider a robust guaranteed-cost filter yielding robust estimates of a varying phase in real time, where the current phase is estimated using only past measurements. Our filter minimizes the largest (worst-case) variance in the allowable range of the uncertain model parameter(s) and this determines its guaranteed cost. It outperforms in the worst case the optimal Kalman filter designed for the model with no uncertainty, which corresponds to the center of the possible range of the uncertain parameter(s). Moreover, unlike the Kalman filter, our filter in the worst case always performs better than the best achievable variance for heterodyne measurements, which we consider as the tolerable threshold for our system. Furthermore, we consider effective quantum efficiency and effective noise power, and show that our filter provides the best results by these measures in the worst case.
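    The trade-off described above has a simple scalar analogue. The sketch below is not the authors' quantum phase model: it uses a classical random-walk signal with uncertain process-noise variance (all variances are invented for the example) and compares the worst-case steady-state error variance of a fixed gain tuned to the interval midpoint (the "nominal Kalman" choice) against a minimax guaranteed-cost gain that minimizes the worst case over the interval:

```python
import numpy as np

# Random-walk model: x[k+1] = x[k] + w,  y[k] = x[k] + v
# process-noise variance q is uncertain in [Q_LO, Q_HI]; measurement var R known
Q_LO, Q_HI, R = 0.01, 0.25, 0.1

def steady_var(gain, q):
    # Steady-state error variance of the fixed-gain filter
    #   xhat[k] = xhat[k-1] + gain * (y[k] - xhat[k-1])
    # from P = (1-L)^2 (P + q) + L^2 R, solved for P
    a = (1.0 - gain) ** 2
    return (a * q + gain**2 * R) / (1.0 - a)

gains = np.linspace(0.01, 0.99, 981)
q_grid = np.linspace(Q_LO, Q_HI, 25)

# Gain designed for the center of the uncertainty interval (nominal Kalman)
q_mid = 0.5 * (Q_LO + Q_HI)
kalman_gain = gains[np.argmin([steady_var(L, q_mid) for L in gains])]

# Guaranteed-cost (minimax) gain: minimize the worst case over q
worst = np.array([[steady_var(L, q) for q in q_grid] for L in gains]).max(axis=1)
robust_gain = gains[np.argmin(worst)]

wc_kalman = max(steady_var(kalman_gain, q) for q in q_grid)
wc_robust = max(steady_var(robust_gain, q) for q in q_grid)
print(f"nominal gain {kalman_gain:.3f}: worst-case var {wc_kalman:.4f}")
print(f"robust  gain {robust_gain:.3f}: worst-case var {wc_robust:.4f}")
```

    The minimax gain's worst-case variance is its guaranteed cost: it is never exceeded anywhere in the uncertainty interval, whereas the midpoint-tuned gain does worse at the interval edge, mirroring the comparison the abstract draws with the no-uncertainty Kalman design.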

  12. Solving Water Crisis through Understanding of Hydrology and Human Systems: a Possible Target

    NASA Astrophysics Data System (ADS)

    Montanari, A.

    2014-12-01

    While the majority of the Earth's surface remains in pristine condition, virtually all hydrological systems relevant to humans are human-impacted, with the sole exception of small headwater catchments. The limited transferability of water in space and time means that water withdrawals from natural resources take place where and when water is needed. Hydrological systems are therefore impacted where and when humans are, directly perturbing all water bodies relevant to society. Current population dynamics and the current status of water systems are such that this impact will not be sustainable in the near future, leading to a water emergency extending to all intensively populated regions of the world, with serious implications for migration, political stability and social security. Mitigation actions are therefore urgently needed, and their planning must be based on improved interpretations of this impact. Until recently, hydrologists concentrated their research mainly on catchments where human perturbation is limited, in order to improve our understanding of pristine hydrology. There were good motivations for this focus: given the considerable uncertainty affecting hydrological modeling, and the even greater uncertainty involved in societal modeling, hydrologists made an effort to separate hydrological and human dynamics. Today, the urgency of mitigating the global water crisis through improved water resources management calls for a research effort to bridge water and social sciences. The key research question is how to build operational models that fully account for the interactions and feedbacks between water resources systems and society. Given that uncertainty estimation is necessary for the operational application of model results, one of the crucial issues is how to quantify uncertainty by means of suitable assumptions. 
This talk provides an introduction to the problem and a personal perspective on how to move forward in setting up improved operational models to assist societal planning in mitigating the global water crisis.

  13. Review of revised Klamath River Total Maximum Daily Load models from Link River Dam to Keno Dam, Oregon

    USGS Publications Warehouse

    Rounds, Stewart A.; Sullivan, Annett B.

    2013-01-01

    Flow and water-quality models are being used to support the development of Total Maximum Daily Load (TMDL) plans for the Klamath River downstream of Upper Klamath Lake (UKL) in south-central Oregon. For riverine reaches, the RMA-2 and RMA-11 models were used, whereas the CE-QUAL-W2 model was used to simulate pooled reaches. The U.S. Geological Survey (USGS) was asked to review the most upstream of these models, from Link River Dam at the outlet of UKL downstream through the first pooled reach of the Klamath River from Lake Ewauna to Keno Dam. Previous versions of these models were reviewed in 2009 by USGS. Since that time, important revisions were made to correct several problems and address other issues. This review documents an assessment of the revised models, with emphasis on the model revisions and any remaining issues. The primary focus of this review is the 19.7-mile Lake Ewauna to Keno Dam reach of the Klamath River that was simulated with the CE-QUAL-W2 model. Water spends far more time in the Lake Ewauna to Keno Dam reach than in the 1-mile Link River reach that connects UKL to the Klamath River, and most of the critical reactions affecting water quality upstream of Keno Dam occur in that pooled reach. This model review includes assessments of years 2000 and 2002 current conditions scenarios, which were used to calibrate the model, as well as a natural conditions scenario that was used as the reference condition for the TMDL and was based on the 2000 flow conditions. The natural conditions scenario included the removal of Keno Dam, restoration of the Keno reef (a shallow spot that was removed when the dam was built), removal of all point-source inputs, and derivation of upstream boundary water-quality inputs from a previously developed UKL TMDL model. This review examined the details of the models, including model algorithms, parameter values, and boundary conditions; the review did not assess the draft Klamath River TMDL or the TMDL allocations. 
Attention to the details of a model is one of the best ways to identify potential problems, correct them if possible, and begin to assess the magnitude of potential model errors and uncertainty. Model users need to determine the level of acceptable uncertainty associated with their objectives, identify all sources of potential uncertainty (model uncertainty, data uncertainty, etc.), and assess their approach and results accordingly. In the draft Klamath River TMDL, the Oregon Department of Environmental Quality identified the upstream boundary conditions as the largest source of uncertainty for both the current and natural conditions scenarios, not the model algorithms or choice of model parameters. We agree that the upstream boundary conditions are one of the largest, if not the largest, source of model uncertainty; therefore, the derivation of upstream boundary conditions may be more important to the TMDL than some other model-related issues identified in this review. The revised models contain a number of changes, some of which were done to solve small problems and are largely inconsequential to model results, but others of which are important and affect model predictions of instream concentrations. A consistent version of the model is now applied to all scenarios, and an error in the source code was corrected that had inadvertently discarded 20 percent of the incoming solar radiation in the original model. The baseline light-extinction coefficient for water was decreased and set to a consistent and defensible value across all models of reservoir reaches. Inconsistencies among the values of certain parameters in the original models, such as the ammonia nitrification rate and the decomposition rates of organic matter, have been eliminated, although the reasoning behind the final selections was not documented. 
The dependence of the rate of sediment oxygen demand (SOD) on temperature was modified such that the SOD rate was substantially decreased at temperatures less than 20°C, causing the model to predict higher dissolved oxygen (DO) concentrations in spring, autumn, and winter. Although that change to the temperature dependence function was done to make the function more similar to the model’s default, this change was not accompanied by any documentation of recalibration or sensitivity exercises. The maximum SOD rate for the 2002 current conditions scenario was decreased from 3.0 grams per square meter per day (g/m2/d) in the original model to 2.0 g/m2/d in the revised model, a considerable adjustment that appears to have been needed to offset effects of a change to another variable (O2LIM) that would have resulted in a substantial increase in the effective SOD rate for 2002. A 50-percent decrease in the SOD rate over a 2-year period, however, is not likely to be mirrored by field measurements, so this change may be compensating for some process that is not represented correctly in the DO budget for the current conditions scenarios. Several important changes were made to the natural conditions scenario. First, the elevation of the Keno reef was corrected; the elevation specified in the original model was 1 foot too high, which affected the volume of the pooled reach and the travel time through it. The most important changes to this scenario were to the upstream boundary inputs of organic matter and algae, which affect incoming fluxes of nitrogen and phosphorus. Algal biomass inputs were increased by approximately 60 percent during summer because of a change in the way those inputs were derived from results of the UKL TMDL model. Non-algal organic matter inputs were decreased, particularly in summer to correct a problem attributed to double-counting of phosphorus in the original inputs. 
The distribution of non-algal organic matter was changed from 20 percent dissolved in the original model to 90 percent dissolved in the revised model in response to review comments and published data. The overall sum of algal biomass and non-living organic matter was decreased, which resulted in lower inputs of total phosphorus and nitrogen. Total phosphorus inputs were less than 0.03 mg/L, and although the inputs were derived from selected results of the UKL TMDL model, these concentrations seem too low to be representative of a historically eutrophic system surrounded by extensive wetlands, peat soils, and a groundwater system high in phosphorus. The draft TMDL states that the upstream boundary conditions are the greatest source of uncertainty, greater than any uncertainty associated with the models. Efforts to improve existing models of algal growth and nutrient cycling in UKL, therefore, would provide a substantial benefit to downstream modeling efforts on the Klamath River. Although many improvements were made in revising the Klamath River TMDL models, some issues and uncertainties remain. Several errors in the model source code remain, but do not affect model results for this application as long as certain options and rates are not changed; future users of these models should be aware of these issues. Although the distribution of dissolved and particulate organic matter was modified for the natural conditions scenario, that distribution was not changed for the current conditions scenarios. Recent data on that distribution and the likely rates of organic matter decomposition could be used to improve these models in the future. Nitrate predictions at Keno (Highway 66) still are too high for the current conditions scenarios; future efforts should re-evaluate the model’s denitrification rates and the release rate of ammonia from anoxic sediments. 
Possibly the most important of the remaining issues are tied to the two-state (healthy/unhealthy) hypothesis for the algae population that was coded into the model. Some of the rates and conversion functions could be refined to make them more acceptable; currently, the published literature does not support the concept of moderately low dissolved-oxygen concentrations as a stressor of algae in the ranges used by the model. More research is needed before these algorithms can be truly tested. The algorithms currently appear to help the model fit the patterns in the available data, and that is useful and perhaps sufficient for some purposes, but those algorithms are not truly predictive or reliable for certain purposes until they can be tested through well-designed experiments and research. In summary, the TMDL models used to simulate Link and Klamath Rivers from Link River Dam to Keno Dam were revised to fix several problems and address various issues. The resulting models are an improvement over those that were reviewed by USGS in 2009, and represent a useful advance in the simulation of a complex system that is difficult to model. However, several issues remain that cause increased uncertainty in the model results. Depending on the objectives of the modeling, now or in the future, these remaining issues could be more or less important. For the Klamath River TMDL, the upstream boundary conditions may be a larger source of uncertainty than the concerns with model algorithms and model parameters identified in this review. Efforts to re-evaluate the available models of algal growth and nutrient cycling in UKL would be highly beneficial to downstream modeling efforts in the Klamath River.
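The temperature dependence of SOD discussed above is commonly expressed in water-quality modeling with a simple theta multiplier. CE-QUAL-W2 uses its own rate-multiplier function, so the sketch below is only the conventional generic form, not the Klamath models' actual dependence; the theta value is a textbook choice, while the 3.0 and 2.0 g/m2/d maxima come from the review text:

```python
def sod_rate(sod_max, temp_c, theta=1.065):
    """Generic theta-model temperature adjustment for sediment oxygen
    demand (g O2 / m^2 / d): rate at temp_c relative to the 20 C reference."""
    return sod_max * theta ** (temp_c - 20.0)

# Effect of the revision at a cool spring/autumn temperature of 10 C:
original = sod_rate(3.0, 10.0)   # ~1.60 g/m2/d
revised = sod_rate(2.0, 10.0)    # ~1.07 g/m2/d
```

The point of the illustration is that lowering the maximum rate and steepening the low-temperature falloff compound each other, which is why the review flags the 3.0-to-2.0 change as large enough to need recalibration evidence.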

  14. The Intolerance of Uncertainty Inventory: Validity and Comparison of Scoring Methods to Assess Individuals Screening Positive for Anxiety and Depression.

    PubMed

    Lauriola, Marco; Mosca, Oriana; Trentini, Cristina; Foschi, Renato; Tambelli, Renata; Carleton, R Nicholas

    2018-01-01

    Intolerance of Uncertainty is a fundamental transdiagnostic personality construct, hierarchically organized with a core general factor underlying diverse clinical manifestations. The current study evaluated the construct validity of the Intolerance of Uncertainty Inventory, a two-part scale separately assessing a unitary Intolerance of Uncertainty disposition to consider uncertainties unacceptable and threatening (Part A) and the consequences of that disposition: experiential avoidance, chronic doubt, overestimation of threat, worrying, control of uncertain situations, and seeking reassurance (Part B). Community members (N = 1046; mean age = 36.69 ± 12.31 years; 61% females) completed the Intolerance of Uncertainty Inventory along with the Beck Depression Inventory-II and the State-Trait Anxiety Inventory. Part A demonstrated a robust unidimensional structure and excellent convergent validity with Part B. A bifactor model was the best-fitting model for Part B. Based on these results, we compared hierarchical factor scores with summated rating scores in clinical proxy groups reporting anxiety and depression symptoms. Summated rating scores were associated with both depression and anxiety and increased proportionally with the co-occurrence of depressive and anxious symptoms. By contrast, hierarchical scores were useful for detecting which facets best separated the depression and anxiety groups. In sum, Part A was a reliable and valid transdiagnostic measure of Intolerance of Uncertainty. Part B was arguably more useful for assessing clinical manifestations of Intolerance of Uncertainty in specific disorders, provided that hierarchical scores are used. Overall, our study suggests that clinical assessments may need to shift toward hierarchical factor scores.

  15. The Intolerance of Uncertainty Inventory: Validity and Comparison of Scoring Methods to Assess Individuals Screening Positive for Anxiety and Depression

    PubMed Central

    Lauriola, Marco; Mosca, Oriana; Trentini, Cristina; Foschi, Renato; Tambelli, Renata; Carleton, R. Nicholas

    2018-01-01

    Intolerance of Uncertainty is a fundamental transdiagnostic personality construct, hierarchically organized with a core general factor underlying diverse clinical manifestations. The current study evaluated the construct validity of the Intolerance of Uncertainty Inventory, a two-part scale separately assessing a unitary Intolerance of Uncertainty disposition to consider uncertainties unacceptable and threatening (Part A) and the consequences of that disposition: experiential avoidance, chronic doubt, overestimation of threat, worrying, control of uncertain situations, and seeking reassurance (Part B). Community members (N = 1046; mean age = 36.69 ± 12.31 years; 61% females) completed the Intolerance of Uncertainty Inventory along with the Beck Depression Inventory-II and the State-Trait Anxiety Inventory. Part A demonstrated a robust unidimensional structure and excellent convergent validity with Part B. A bifactor model was the best-fitting model for Part B. Based on these results, we compared hierarchical factor scores with summated rating scores in clinical proxy groups reporting anxiety and depression symptoms. Summated rating scores were associated with both depression and anxiety and increased proportionally with the co-occurrence of depressive and anxious symptoms. By contrast, hierarchical scores were useful for detecting which facets best separated the depression and anxiety groups. In sum, Part A was a reliable and valid transdiagnostic measure of Intolerance of Uncertainty. Part B was arguably more useful for assessing clinical manifestations of Intolerance of Uncertainty in specific disorders, provided that hierarchical scores are used. Overall, our study suggests that clinical assessments may need to shift toward hierarchical factor scores. PMID:29632505

  16. Validation of a model with climatic and flow scenario analysis: case of Lake Burrumbeet in southeastern Australia.

    PubMed

    Yihdego, Yohannes; Webb, John

    2016-05-01

    Forecast evaluation is an important topic that addresses the development of reliable hydrological probabilistic forecasts, mainly through the use of climate uncertainties. Validation is often neglected in hydrology, even though model parameters are uncertain and the model structure itself may be incorrectly chosen. A calibrated and verified dynamic hydrologic water-balance spreadsheet model has been used to assess the effect of climate variability on Lake Burrumbeet, southeastern Australia. The model has been verified against lake level, lake volume, lake surface area, surface outflow and lake salinity. The current study aims to increase confidence in the model's lake-level predictions through historical validation for 2008-2013 under different climatic scenarios. The observed climatic conditions for 2008-2013 match a hybrid of scenarios, since that interval includes both dry and wet climatic conditions. In addition to uncertainty in the hydrologic stresses, uncertainty in the calibrated model is among the major drawbacks involved in making scenario simulations. The uncertainty in the calibrated model was therefore tested using sensitivity analysis, which showed that errors in the model can largely be attributed to erroneous estimates of evaporation and rainfall and, to a lesser extent, surface inflow. The study demonstrates that several climatic scenarios should be analysed, combining extreme climate, streamflow and climate change rather than assuming a single climatic sequence, to improve prediction of climate variability in the future. 
Performing such scenario analysis is a valid exercise for understanding the uncertainty in the model structure and hydrology in a meaningful way, without discarding scenarios that seem less probable but may ultimately prove crucial for decision making; it will also increase confidence in model predictions for management of the water resources.

  17. Development of robust building energy demand-side control strategy under uncertainty

    NASA Astrophysics Data System (ADS)

    Kim, Sean Hay

    The potential of carbon emission regulations applied to individual buildings will encourage building owners to purchase utility-provided green power or to employ onsite renewable energy generation. Because both options rely on intermittent renewable energy sources, demand-side control is a fundamental precondition for maximizing the effectiveness of renewable energy use. Such control reduces peak demand and/or variability in energy demand, and this smoothing of the demand profile ultimately improves the efficiency with which an erratic supply of renewable energy can be used. The combined operation of active thermal energy storage and passive building thermal mass has shown substantial improvement in demand-side control performance compared to current state-of-the-art demand-side control measures. Specifically, "model-based" optimal control of this operation has the potential to significantly increase performance and bring economic advantages. However, due to uncertainty in field operating conditions, its control effectiveness can be diminished or seriously degraded, resulting in poor performance. This dissertation pursues improvements to current demand-side controls under uncertainty by proposing a robust supervisory demand-side control strategy designed to be immune to uncertainty and to perform consistently under uncertain conditions. The proposed robust demand-side controls are unique and superior in the following respects: (a) they are developed from fundamental studies of uncertainty and a systematic approach to uncertainty analysis; (b) they reduce the variability of performance under varied conditions, and thus avoid the worst-case scenario; and (c) they react to critical "discrepancies" caused by the unpredictable uncertainty that scenario uncertainty typically imposes, thus increasing control efficiency. 
This is achieved by means of (i) multi-source composition of weather forecasts, including both historical archives and online sources, and (ii) adaptive multiple-model-based control (MMC) to mitigate the detrimental impact of varying scenario uncertainties. The proposed robust demand-side control strategy demonstrates superior performance in varied and unfamiliar conditions compared to existing control strategies, including deterministic optimal controls. This result reemphasizes the importance of demand-side control for buildings in the global carbon economy. It also demonstrates the risk-management capability of the proposed robust demand-side controls in highly uncertain situations, ultimately attaining the maximum benefit from both theoretical and practical perspectives.

  18. The use of coupled atmospheric and hydrological models for water-resources management in headwater basins

    USGS Publications Warehouse

    Leavesley, G.; Hay, L.

    1998-01-01

    Coupled atmospheric and hydrological models provide an opportunity for the improved management of water resources in headwater basins. Issues currently limiting full implementation of coupled-model methodologies include (a) the degree of uncertainty in the accuracy of precipitation and other meteorological variables simulated by atmospheric models, and (b) the problem of discordant scales between atmospheric and hydrological models. Alternative methodologies being developed to address these issues are reviewed.

  19. Probability and Confidence Trade-space (PACT) Evaluation: Accounting for Uncertainty in Sparing Assessments

    NASA Technical Reports Server (NTRS)

    Anderson, Leif; Box, Neil; Carter, Katrina; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael

    2012-01-01

    There are two general shortcomings to the current annual sparing assessment: (1) vehicle functions are currently assessed against confidence targets, which can be misleading, either overly conservative or optimistic; and (2) the current confidence levels are arbitrarily determined and do not account for epistemic uncertainty (lack of knowledge) in the ORU failure rate. Two major categories of uncertainty impact sparing assessment: (a) aleatory uncertainty, the natural variability in the distribution of actual failures around a mean time between failures (MTBF); and (b) epistemic uncertainty, the lack of knowledge about the true value of an Orbital Replacement Unit's (ORU) MTBF. We propose an approach to revise confidence targets and account for both categories of uncertainty, which we call Probability and Confidence Trade-space (PACT) evaluation.
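The distinction between the two uncertainty layers can be sketched with a toy two-stage simulation. All numbers below are hypothetical and not from the ISS assessment: epistemic uncertainty is represented as a belief distribution over the ORU's true MTBF, and aleatory uncertainty as the Poisson scatter of failures given a true MTBF:

```python
import numpy as np

rng = np.random.default_rng(0)

mission_hours = 8760.0    # one year on orbit (illustrative)
spares = 3                # ORUs on hand (hypothetical)

# Epistemic layer: the true MTBF is only believed to lie around a
# 10,000 h point estimate; a lognormal belief stands in for that.
mtbf_samples = rng.lognormal(mean=np.log(10_000), sigma=0.4, size=20_000)

# Aleatory layer: given a true MTBF, failure counts are Poisson.
lam = mission_hours / mtbf_samples        # expected failures per draw
failures = rng.poisson(lam)

# Probability the spares suffice, marginalised over BOTH layers:
p_sufficient = np.mean(failures <= spares)

# A point-estimate-only assessment ignores the epistemic layer:
p_point = np.mean(rng.poisson(mission_hours / 10_000, size=20_000) <= spares)
```

The point-estimate assessment comes out more optimistic than the full two-layer one, which is exactly the kind of gap a PACT-style trade-space is meant to expose.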

  20. Defending Against Advanced Persistent Threats Using Game-Theory.

    PubMed

    Rass, Stefan; König, Sandra; Schauer, Stefan

    2017-01-01

    Advanced persistent threats (APT) combine a variety of different attack forms ranging from social engineering to technical exploits. The diversity and usual stealthiness of APT turns them into a central problem of contemporary practical system security, since information on attacks, the current system status or the attacker's incentives is often vague, uncertain and in many cases even unavailable. Game theory is a natural approach to model the conflict between the attacker and the defender, and this work investigates a generalized class of matrix games as a risk mitigation tool for an advanced persistent threat (APT) defense. Unlike standard game and decision theory, our model is tailored to capture and handle the full uncertainty that is immanent to APTs, such as disagreement among qualitative expert risk assessments, unknown adversarial incentives and uncertainty about the current system state (in terms of how deeply the attacker may have penetrated into the system's protective shells already). Practically, game-theoretic APT models can be derived straightforwardly from topological vulnerability analysis, together with risk assessments as they are done in common risk management standards like the ISO 31000 family. Theoretically, these models come with different properties than classical game theoretic models, whose technical solution presented in this work may be of independent interest.
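As a minimal sketch of the zero-sum matrix-game view (the payoff matrix below is invented, not derived from any real vulnerability analysis), a defender's mixed security strategy can be approximated by fictitious play, a classical iterative scheme whose empirical frequencies converge to minimax strategies in zero-sum games:

```python
import numpy as np

# Payoff to the defender (row player); rows = defenses, columns = attack
# vectors. Toy antisymmetric payoffs, not from any risk assessment.
A = np.array([[ 0.0, -1.0,  0.5],
              [ 1.0,  0.0, -0.5],
              [-0.5,  0.5,  0.0]])

# Best guaranteed payoff from any single FIXED defense (pure maximin):
pure_maximin = A.min(axis=1).max()

def fictitious_play(A, iters=100_000):
    """Brown's fictitious play: each side best-responds to the other's
    empirical mixture; the empirical mixtures approach minimax play."""
    m, n = A.shape
    row_counts = np.zeros(m)
    col_counts = np.zeros(n)
    row_counts[0] += 1
    col_counts[0] += 1
    for _ in range(iters):
        row_counts[np.argmax(A @ col_counts)] += 1   # defender best reply
        col_counts[np.argmin(row_counts @ A)] += 1   # attacker best reply
    x = row_counts / row_counts.sum()
    y = col_counts / col_counts.sum()
    return x, y

x, y = fictitious_play(A)
mixed_guarantee = (x @ A).min()   # payoff the mixed defense guarantees
```

Randomizing over defenses guarantees strictly more than any fixed defense here, which is the basic argument for game-theoretic (mixed) defense policies against an adaptive APT attacker.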

  1. Bounding uncertainty in volumetric geometric models for terrestrial lidar observations of ecosystems.

    PubMed

    Paynter, Ian; Genest, Daniel; Peri, Francesco; Schaaf, Crystal

    2018-04-06

    Volumetric models with known biases are shown to provide bounds for the uncertainty in estimations of volume for ecologically interesting objects, observed with a terrestrial laser scanner (TLS) instrument. Bounding cuboids, three-dimensional convex hull polygons, voxels, the Outer Hull Model and Square Based Columns (SBCs) are considered for their ability to estimate the volume of temperate and tropical trees, as well as geomorphological features such as bluffs and saltmarsh creeks. For temperate trees, supplementary geometric models are evaluated for their ability to bound the uncertainty in cylinder-based reconstructions, finding that coarser volumetric methods do not currently constrain volume meaningfully, but may be helpful with further refinement, or in hybridized models. Three-dimensional convex hull polygons consistently overestimate object volume, and SBCs consistently underestimate volume. Voxel estimations vary in their bias, due to the point density of the TLS data, and occlusion, particularly in trees. The response of the models to parametrization is analysed, observing unexpected trends in the SBC estimates for the drumlin dataset. Establishing that this result is due to the resolution of the TLS observations being insufficient to support the resolution of the geometric model, it is suggested that geometric models with predictable outcomes can also highlight data quality issues when they produce illogical results.
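The behaviour of these bounding models can be reproduced on synthetic data. In the sketch below a uniformly sampled sphere stands in for a TLS point cloud, and the cell size is chosen arbitrarily; it compares a bounding cuboid, a voxel estimate and an SBC-style estimate:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "object": points uniformly inside a unit sphere
# (true volume = 4*pi/3 ~ 4.19).
pts = rng.uniform(-1, 1, size=(200_000, 3))
pts = pts[(pts ** 2).sum(axis=1) <= 1.0]

cell = 0.1  # voxel / column edge length

# Bounding cuboid: always an overestimate of the enclosed object.
v_cuboid = float(np.prod(pts.max(axis=0) - pts.min(axis=0)))

# Voxel estimate: occupied cells x cell volume (bias depends on
# point density and occlusion, as the abstract notes).
idx = np.floor(pts / cell).astype(int)
v_voxel = len(np.unique(idx, axis=0)) * cell ** 3

# Square Based Columns (SBC): per occupied xy-cell, column height
# (z_max - z_min) x cell area; sampled chords never exceed the true
# vertical extent, so this tends to underestimate.
v_sbc = 0.0
xy = idx[:, :2]
for key in np.unique(xy, axis=0):
    z = pts[(xy == key).all(axis=1), 2]
    v_sbc += (z.max() - z.min()) * cell ** 2
```

On this synthetic solid the cuboid brackets everything from above while the voxel and SBC estimates straddle the true volume, mirroring the biases the paper exploits to bound uncertainty.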

  2. Bounding uncertainty in volumetric geometric models for terrestrial lidar observations of ecosystems

    PubMed Central

    Genest, Daniel; Peri, Francesco; Schaaf, Crystal

    2018-01-01

    Volumetric models with known biases are shown to provide bounds for the uncertainty in estimations of volume for ecologically interesting objects, observed with a terrestrial laser scanner (TLS) instrument. Bounding cuboids, three-dimensional convex hull polygons, voxels, the Outer Hull Model and Square Based Columns (SBCs) are considered for their ability to estimate the volume of temperate and tropical trees, as well as geomorphological features such as bluffs and saltmarsh creeks. For temperate trees, supplementary geometric models are evaluated for their ability to bound the uncertainty in cylinder-based reconstructions, finding that coarser volumetric methods do not currently constrain volume meaningfully, but may be helpful with further refinement, or in hybridized models. Three-dimensional convex hull polygons consistently overestimate object volume, and SBCs consistently underestimate volume. Voxel estimations vary in their bias, due to the point density of the TLS data, and occlusion, particularly in trees. The response of the models to parametrization is analysed, observing unexpected trends in the SBC estimates for the drumlin dataset. Establishing that this result is due to the resolution of the TLS observations being insufficient to support the resolution of the geometric model, it is suggested that geometric models with predictable outcomes can also highlight data quality issues when they produce illogical results. PMID:29503722

  3. A CRITICAL ASSESSMENT OF ELEMENTAL MERCURY AIR/WATER EXCHANGE PARTNERS

    EPA Science Inventory

    Although evasion of elemental mercury from aquatic systems can significantly deplete net mercury accumulation resulting from atmospheric deposition, the current ability to model elemental mercury air/water exchange is limited by uncertainties in our understanding of all gaseous a...

  4. Uncertainties in Integrated Climate Change Impact Assessments by Sub-setting GCMs Based on Annual as well as Crop Growing Period under Rice Based Farming System of Indo-Gangetic Plains of India

    NASA Astrophysics Data System (ADS)

    Pillai, S. N.; Singh, H.; Panwar, A. S.; Meena, M. S.; Singh, S. V.; Singh, B.; Paudel, G. P.; Baigorria, G. A.; Ruane, A. C.; McDermid, S.; Boote, K. J.; Porter, C.; Valdivia, R. O.

    2016-12-01

    Integrated assessment of the impact of climate change on agricultural productivity is a challenge for the scientific community because of uncertainties in the input data, particularly the climate, soil, crop calibration and socio-economic datasets. The uncertainty due to the selection of GCMs, however, is the major source, owing to the complex underlying processes involved in the initial and boundary conditions used to resolve air-sea interactions. Under the Agricultural Model Intercomparison and Improvement Project (AgMIP), the Indo-Gangetic Plains Regional Research Team investigated the uncertainties caused by the selection of GCMs through sub-setting based on the annual period as well as the crop-growth period of rice-wheat systems in the AgMIP Integrated Assessment methodology. The AgMIP Phase II protocols were used to link climate, crop and economic models for two study sites, Meerut and Karnal, to analyse the sensitivity of current production systems to climate change. Climate change projections were made using 29 CMIP5 GCMs under RCP4.5 and RCP8.5 for the mid-century period (2040-2069). Two crop models (APSIM and DSSAT) were used, and the TOA-MD economic model was used for the integrated assessment. Based on Representative Agricultural Pathways (RAPs), parameters that cannot be obtained through modeling were derived from the literature and from interactions with stakeholders, and incorporated into the TOA-MD model.

  5. Space Radiation Cancer Risks

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2007-01-01

    Space radiation presents major challenges to astronauts on the International Space Station and for future missions to the Earth's moon or Mars. Methods used to project risks on Earth need to be modified because of the large uncertainties in projecting cancer risks from space radiation, which in turn affect safety factors. We describe NASA's unique approach to radiation safety, which applies uncertainty-based criteria within the occupational health program for astronauts: the two terrestrial criteria of a point estimate of the maximum acceptable level of risk and application of the principle of As Low As Reasonably Achievable (ALARA) are supplemented by a third requirement that protects against risk projection uncertainties using the upper 95% confidence level (CL) in the radiation cancer projection model. NASA's acceptable level of risk for the ISS and the new lunar program has been set at a point estimate of a 3-percent risk of exposure-induced death (REID). Tissue-averaged organ dose equivalents are combined with age-at-exposure and gender-dependent risk coefficients to project the cumulative occupational radiation risks incurred by astronauts. In practice the 95% CL criterion is stronger than ALARA, but not an absolute cut-off as is applied to a point projection of a 3% REID. We describe the most recent astronaut dose limits, present a historical review of astronaut organ dose estimates from the Mercury program through the current ISS program, and give projections for future lunar and Mars missions. NASA's 95% CL criterion is linked to a vibrant ground-based radiobiology program investigating the radiobiology of high-energy protons and heavy ions. The near-term goal of this research is new knowledge leading to the reduction of uncertainties in projection models. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. 
The current model for projecting space radiation cancer risk relies on three assumptions: linearity, additivity, and scaling, along with the use of population averages. We describe uncertainty estimates for this model, and new experimental data that shed light on the accuracy of the underlying assumptions. These methods make it possible to express risk management objectives in terms of quantitative metrics, i.e., the number of days in space without exceeding a given risk level within well-defined confidence limits. The resulting methodology is applied to several human space exploration mission scenarios, including a lunar station, a deep space outpost, and a Mars mission. Factors that dominate risk projection uncertainties and the application of this approach to assessing candidate mitigation approaches are described.

  6. Efficient Data-Worth Analysis Using a Multilevel Monte Carlo Method Applied in Oil Reservoir Simulations

    NASA Astrophysics Data System (ADS)

    Lu, D.; Ricciuto, D. M.; Evans, K. J.

    2017-12-01

Data-worth analysis plays an essential role in improving understanding of the subsurface system, in developing and refining subsurface models, and in supporting rational water resources management. However, data-worth analysis is computationally expensive, as it requires quantifying parameter uncertainty, prediction uncertainty, and both current and potential data uncertainties. Assessing these uncertainties in large-scale stochastic subsurface simulations using standard Monte Carlo (MC) sampling or advanced surrogate modeling is extremely computationally intensive, and sometimes infeasible. In this work, we propose an efficient Bayesian data-worth analysis using a multilevel Monte Carlo (MLMC) method. Compared to standard MC, which requires a very large number of high-fidelity model executions to achieve a prescribed accuracy in estimating expectations, MLMC can substantially reduce the computational cost through the use of multifidelity approximations. As data-worth analysis involves a great many expectation estimations, the resulting cost savings can be substantial. While the proposed MLMC-based data-worth analysis is broadly applicable, we apply it here to a highly heterogeneous oil reservoir simulation to select the candidate data set that gives the largest uncertainty reduction in predicting mass flow rates at four production wells. The choices made by the MLMC estimation are validated by the actual measurements of the potential data and are consistent with the estimates obtained from standard MC. Compared to standard MC, however, MLMC greatly reduces the computational cost of the uncertainty reduction estimation, with savings of up to 600 days of computing time on a single processor.
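The telescoping structure behind MLMC can be sketched as follows. This is a minimal illustration on a toy problem, not the paper's reservoir workflow: a one-dimensional quadrature stands in for the expensive simulator, and all function names and sample counts are invented here. The expectation of the finest-level output is written as the level-0 expectation plus cheap-to-estimate corrections between successive levels.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(theta, level):
    # Hypothetical model hierarchy: higher `level` means a finer, costlier
    # discretization. A midpoint-rule quadrature of exp(theta * t) on [0, 1]
    # stands in for the reservoir simulator, which is not specified here.
    n = 2 ** (level + 2)
    t = (np.arange(n) + 0.5) / n
    return np.exp(theta * t).mean()  # midpoint approximation of the integral

def mlmc_expectation(max_level, samples_per_level):
    # Telescoping MLMC estimator:
    #   E[P_L] = E[P_0] + sum_{l=1..L} E[P_l - P_{l-1}]
    # The correction terms have small variance, so few fine-level samples suffice.
    estimate = 0.0
    for level, n_samples in enumerate(samples_per_level[: max_level + 1]):
        thetas = rng.normal(0.0, 1.0, n_samples)  # uncertain input parameter
        fine = np.array([model(th, level) for th in thetas])
        if level == 0:
            estimate += fine.mean()
        else:
            coarse = np.array([model(th, level - 1) for th in thetas])
            estimate += (fine - coarse).mean()
    return estimate

# Many cheap coarse samples, few expensive fine samples
est = mlmc_expectation(max_level=3, samples_per_level=[2000, 500, 125, 30])
```

The sample counts decay with level, which is where the cost savings over single-level MC come from; a production implementation would choose them from estimated per-level variances and costs.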

  7. Shining Light into Cosmic Dark Ages

    NASA Astrophysics Data System (ADS)

    Fialkov, Anastasia

    2018-06-01

    Exploration of the early Universe is ongoing. One of the most interesting probes of the epoch is the redshifted 21-cm line of neutral hydrogen. Modeling of this signal is difficult due to large uncertainties in both astrophysical and cosmological parameters that describe the high redshift Universe. In my talk I will discuss current theoretical understanding and the status of modeling.

  8. Effects of species biological traits and environmental heterogeneity on simulated tree species distribution shifts under climate change

    Treesearch

    Wen J. Wang; Hong S. He; Frank R. Thompson; Martin A. Spetich; Jacob S. Fraser

    2018-01-01

Demographic processes (fecundity, dispersal, colonization, growth, and mortality) and their interactions with environmental changes are not well represented in current climate-distribution models (e.g., niche and biophysical process models) and constitute a large uncertainty in projections of future tree species distribution shifts. We investigate how species biological...

  9. Argo Development Program.

    DTIC Science & Technology

    1986-06-01

nonlinear form and account for uncertainties in model parameters, structural simplifications of the model, and disturbances. This technique summarizes...SHARPS system. The two curves are nearly identical, except that the...take into account the coupling between axes without becoming unwieldy. The low...are mainly caused by errors and control errors and accounts for the bandwidth limitations and the simulated current. observed offsets. The overshoot

  10. Climate change adaptation and Integrated Water Resource Management in the water sector

    NASA Astrophysics Data System (ADS)

    Ludwig, Fulco; van Slobbe, Erik; Cofino, Wim

    2014-10-01

Integrated Water Resources Management (IWRM) was introduced in the 1980s to better optimise water use between different water-demanding sectors. Since its introduction, however, water systems have become more complicated owing to changes in the global water cycle as a result of climate change. The realization that climate change will have a significant impact on water availability and flood risks has driven research and policy making on adaptation. This paper discusses the main similarities and differences between climate change adaptation and IWRM. The main difference between the two is IWRM's focus on current and historic issues compared to the (long-term) future focus of adaptation. One of the main problems in implementing climate change adaptation is the large uncertainty in future projections. Two completely different approaches to adaptation have been developed in response to these large uncertainties. The first is a top-down approach based on large-scale biophysical impact analyses, which focusses on quantifying and minimizing uncertainty by using a large range of scenarios and different climate and impact models; its main problem is the propagation of uncertainties within the modelling chain. The opposite is the bottom-up approach, which essentially sets uncertainty aside and focusses on reducing vulnerabilities, often at local scale, by developing resilient water systems. Neither approach, however, integrates easily into water management: the bottom-up approach focuses too much on socio-economic vulnerability and too little on developing (technical) solutions, while the top-down approach often results in an "explosion" of uncertainty and therefore complicates decision making. A more promising direction for adaptation would be a risk-based approach. Future research should further develop and test an approach that starts with developing adaptation strategies based on current and future risks. 
These strategies should then be evaluated using a range of future scenarios in order to develop robust adaptation measures and strategies.

  11. Value of information: interim analysis of a randomized, controlled trial of goal-directed hemodynamic treatment for aged patients.

    PubMed

    Bartha, Erzsebet; Davidson, Thomas; Brodtkorb, Thor-Henrik; Carlsson, Per; Kalman, Sigridur

    2013-07-09

    A randomized, controlled trial, intended to include 460 patients, is currently studying peroperative goal-directed hemodynamic treatment (GDHT) of aged hip-fracture patients. Interim efficacy analysis performed on the first 100 patients was statistically uncertain; thus, the trial is continuing in accordance with the trial protocol. This raised the present investigation's main question: Is it reasonable to continue to fund the trial to decrease uncertainty? To answer this question, a previously developed probabilistic cost-effectiveness model was used. That model depicts (1) a choice between routine fluid treatment and GDHT, given uncertainty of current evidence and (2) the monetary value of further data collection to decrease uncertainty. This monetary value, that is, the expected value of perfect information (EVPI), could be used to compare future research costs. Thus, the primary aim of the present investigation was to analyze EVPI of an ongoing trial with interim efficacy observed. A previously developed probabilistic decision analytic cost-effectiveness model was employed to compare the routine fluid treatment to GDHT. Results from the interim analysis, published trials, the meta-analysis, and the registry data were used as model inputs. EVPI was predicted using (1) combined uncertainty of model inputs; (2) threshold value of society's willingness to pay for one, quality-adjusted life-year; and (3) estimated number of future patients exposed to choice between GDHT and routine fluid treatment during the expected lifetime of GDHT. If a decision to use GDHT were based on cost-effectiveness, then the decision would have a substantial degree of uncertainty. Assuming a 5-year lifetime of GDHT in clinical practice, the number of patients who would be subject to future decisions was 30,400. EVPI per patient would be €204 at a €20,000 threshold value of society's willingness to pay for one quality-adjusted life-year. 
Given a future population of 30,400 individuals, total EVPI would be €6.19 million. If future trial costs are below the EVPI, further data collection is potentially cost-effective. When applying a cost-effectiveness model, statements such as 'further research is needed' are replaced with 'further research is cost-effective' and 'further funding of a trial is justified'. ClinicalTrials.gov NCT01141894.
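The population-level figure follows directly from the per-patient EVPI and the number of future patients. A minimal arithmetic check using the values stated above (the small difference from the reported €6.19 million presumably reflects rounding of the per-patient figure):

```python
# Figures stated in the abstract
future_population = 30_400  # patients facing the GDHT decision over an assumed 5-year lifetime
evpi_per_patient = 204.0    # EUR per patient, at a EUR 20,000/QALY willingness-to-pay threshold

total_evpi = evpi_per_patient * future_population
print(f"Population EVPI: EUR {total_evpi / 1e6:.2f} million")  # EUR 6.20 million
```

If the expected cost of completing the trial falls below this value, continued funding is potentially cost-effective under the model's assumptions.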

  12. Variance decomposition shows the importance of human-climate feedbacks in the Earth system

    NASA Astrophysics Data System (ADS)

    Calvin, K. V.; Bond-Lamberty, B. P.; Jones, A. D.; Shi, X.; Di Vittorio, A. V.; Thornton, P. E.

    2017-12-01

The human and Earth systems are intricately linked: climate influences agricultural production, renewable energy potential, and water availability, for example, while anthropogenic emissions from industry and land use change alter temperature and precipitation. Such feedbacks have the potential to significantly alter future climate change. Current climate change projections contain significant uncertainties, however, and because Earth System Models do not generally include dynamic human (demography, economy, energy, water, land use) components, little is known about how climate feedbacks contribute to that uncertainty. Here we use variance decomposition of a novel coupled human-Earth system model to show that the influence of human-climate feedbacks can be as large as 17% of the total variance in the near term for global mean temperature rise, and 11% in the long term for cropland area. The near-term contribution of energy and land-use feedbacks to uncertainty in global mean temperature rise is as large as that from model internal variability, a factor typically considered in modeling studies. Conversely, the contribution of climate feedbacks to cropland extent, while non-negligible, is less than that from socioeconomics, policy, or model. Previous assessments have largely excluded these feedbacks, with the climate community focusing on uncertainty due to internal variability, scenario, and model and the integrated assessment community focusing on uncertainty due to socioeconomics, technology, policy, and model. Our results set the stage for a new generation of models and hypothesis testing to determine when and how bidirectional feedbacks between human and Earth systems should be considered in future assessments of climate change.

  13. Improving the representation of photosynthesis in Earth system models

    NASA Astrophysics Data System (ADS)

    Rogers, A.; Medlyn, B. E.; Dukes, J.; Bonan, G. B.; von Caemmerer, S.; Dietze, M.; Kattge, J.; Leakey, A. D.; Mercado, L. M.; Niinemets, U.; Prentice, I. C. C.; Serbin, S.; Sitch, S.; Way, D. A.; Zaehle, S.

    2015-12-01

    Continued use of fossil fuel drives an accelerating increase in atmospheric CO2 concentration ([CO2]) and is the principal cause of global climate change. Many of the observed and projected impacts of rising [CO2] portend increasing environmental and economic risk, yet the uncertainty surrounding the projection of our future climate by Earth System Models (ESMs) is unacceptably high. Improving confidence in our estimation of future [CO2] is essential if we seek to project global change with greater confidence. There are critical uncertainties over the long term response of terrestrial CO2 uptake to global change, more specifically, over the size of the terrestrial carbon sink and over its sensitivity to rising [CO2] and temperature. Reducing the uncertainty associated with model representation of the largest CO2 flux on the planet is therefore an essential part of improving confidence in projections of global change. Here we have examined model representation of photosynthesis in seven process models including several global models that underlie the representation of photosynthesis in the land surface model component of ESMs that were part of the recent Fifth Assessment Report from the IPCC. Our approach was to focus on how physiological responses are represented by these models, and to better understand how structural and parametric differences drive variation in model responses to light, CO2, nutrients, temperature, vapor pressure deficit and soil moisture. We challenged each model to produce leaf and canopy responses to these factors to help us identify areas in which current process knowledge and emerging data sets could be used to improve model skill, and also identify knowledge gaps in current understanding that directly impact model outputs. We hope this work will provide a roadmap for the scientific activity that is necessary to advance process representation, parameterization and scaling of photosynthesis in the next generation of Earth System Models.

  14. The Inferential Structure of Actionable Science in Climatological and Hydrological Co-Productions

    NASA Astrophysics Data System (ADS)

    Brumble, K. C.

    2016-12-01

    Across the geophysical sciences, and in hydrology in particular, there is a growing emphasis on and desire to produce "actionable science" and "user-inspired" science. Fueled by the need to make research approachable, intelligible, and useful for decision-makers, policy-makers, and across disciplinary boundaries, actionable science endeavors seek to replace the traditional downward flow of information model for knowledge in the sciences. Instead the focus is on more dynamical knowledge flow between the local and contingent and the vast and complex. New methodologies which allow for the co-production of knowledge between modelers, model users, and decision-makers will be surveyed for the structure of knowledge flow present, and for innovations in communicating and handling uncertainties across traditional disciplinary boundaries. Current and possible future methods for handling sources of uncertainty and cascades of uncertainty will be addressed. Examples will be drawn from recent projects involving the interactions between climate modeling groups, hydrological modelers, and decision makers at the local and regional level in water security to try and identify key methodologies for the co-production of actionable knowledge exportable to other applications in the boundary between systems impacted by climate change.

  15. Climate Twins - a tool to explore future climate impacts by assessing real world conditions: Exploration principles, underlying data, similarity conditions and uncertainty ranges

    NASA Astrophysics Data System (ADS)

    Loibl, Wolfgang; Peters-Anders, Jan; Züger, Johann

    2010-05-01

To achieve public awareness and a thorough understanding of expected climate changes and their future implications, ways have to be found to communicate model outputs to the public in a scientifically sound and easily understandable way. The newly developed Climate Twins tool tries to fulfil these requirements via an intuitively usable web application, which compares spatial patterns of current climate with future climate patterns derived from regional climate model results. To get a picture of the implications of future climate in an area of interest, users may click on a location within an interactive map with underlying future climate information. A second map depicts the matching Climate Twin areas according to current climate conditions. In this way scientific output can be communicated to the public, allowing climate change to be experienced through comparison with well-known real-world conditions. Identifying climatic coincidence seems a simple exercise, but the accuracy and applicability of the similarity identification depend very much on the selection of climate indicators, similarity conditions and uncertainty ranges. Too many indicators representing various climate characteristics, combined with too-narrow uncertainty ranges, will identify little or no area as having similar climate, while too few indicators and too-wide uncertainty ranges will flag overly large regions as climatically similar, which may not be correct. Similarity cannot be explored simply by comparing mean values or by calculating correlation coefficients. As climate change triggers an alteration of various indicators, such as maxima, minima, variation magnitude and the frequency of extreme events, the identification of appropriate similarity conditions is a crucial question to be solved. 
For Climate Twins identification, it is necessary to find the right balance of indicators, similarity conditions and uncertainty ranges; otherwise the results will be too vague to support a useful Climate Twins region search. The Climate Twins tool currently compares future climate conditions of a source area in the Greater Alpine Region with current climate conditions across Europe and the neighbouring southern and south-eastern areas as target regions. A next version will integrate web-crawling features for searching information about climate-related local adaptations observed today in the target region, which may turn out to be appropriate solutions for the source region under future climate conditions. The contribution will present the tool's current functionality and will discuss which indicator sets, similarity conditions and uncertainty ranges work best to deliver scientifically sound climate comparisons and distinct mapping results.
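The range-based matching described above can be sketched as follows. Indicator names, climate values and tolerance widths are all invented for illustration and are not the tool's actual configuration; the point is how the choice of indicators and band widths governs which cells qualify as twins.

```python
import numpy as np

def climate_twins(source_future, targets_current, tolerances):
    # Flag target cells whose *current* climate falls within a tolerance band
    # of the source cell's *projected future* climate, on every indicator.
    # Narrower bands or more indicators -> fewer (or no) twins; wider bands
    # or fewer indicators -> overly large twin regions.
    diffs = np.abs(targets_current - source_future)  # shape (n_cells, n_indicators)
    return np.all(diffs <= tolerances, axis=1)

# Hypothetical indicators: summer mean T (degC), winter mean T (degC), annual precip (mm)
source_future = np.array([24.5, 2.0, 800.0])      # an Alpine cell's projected 2050 climate
targets_current = np.array([
    [24.1, 1.5, 830.0],   # close on all three indicators
    [24.3, 6.0, 400.0],   # temperature matches, winter T and precipitation do not
])
tolerances = np.array([1.0, 1.5, 100.0])          # widening these grows the twin set

matches = climate_twins(source_future, targets_current, tolerances)
print(matches)  # [ True False]
```

Real implementations would also compare distribution shape (extremes, variability) rather than means alone, as the abstract argues.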

  16. Benchmarking observational uncertainties for hydrology (Invited)

    NASA Astrophysics Data System (ADS)

    McMillan, H. K.; Krueger, T.; Freer, J. E.; Westerberg, I.

    2013-12-01

There is a pressing need for authoritative and concise information on the expected error distributions and magnitudes in hydrological data, in order to understand their information content. Many studies have discussed how to incorporate uncertainty information into model calibration and implementation, and have shown how model results can be biased if uncertainty is not appropriately characterised. However, it is not always possible (for example, due to financial or time constraints) to make detailed studies of uncertainty for every research study. Instead, we propose that the hydrological community could benefit greatly from sharing information on likely uncertainty characteristics and the main factors that control the resulting magnitude. In this presentation, we review current knowledge of uncertainty for a number of key hydrological variables: rainfall, flow and water quality (suspended solids, nitrogen, phosphorus). We collated information on the specifics of the data measurement (data type, temporal and spatial resolution), the error characteristics measured (e.g. standard error, confidence bounds) and the error magnitude. Our results were primarily split by data type: rainfall uncertainty was controlled most strongly by spatial scale, flow uncertainty was controlled by flow state (low, high) and gauging method, and water quality presented a more complex picture with many component errors. For all variables, it was easy to find examples where the relative error magnitude exceeded 40%. We discuss some recent developments in hydrology which increase the need for guidance on typical error magnitudes, in particular comparative/regionalisation and multi-objective analyses. Increased sharing of data, comparisons between multiple catchments, and storage in national/international databases can mean that data users are far removed from data collection, yet require good uncertainty information to reduce bias in comparisons or catchment regionalisation studies. 
Recently it has become more common for hydrologists to use multiple data types and sources within a single study. This may be driven by complex water management questions which integrate water quantity, quality and ecology; or by recognition of the value of auxiliary data to understand hydrological processes. We discuss briefly the impact of data uncertainty on the increasingly popular use of diagnostic signatures for hydrological process understanding and model development.

  17. Correlating electroluminescence characterization and physics-based models of InGaN/GaN LEDs: Pitfalls and open issues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calciati, Marco; Vallone, Marco; Zhou, Xiangyu

    2014-06-15

Electroluminescence (EL) characterization of InGaN/GaN light-emitting diodes (LEDs), coupled with numerical device models of different sophistication, is routinely adopted not only to establish correlations between device efficiency and structural features, but also to make inferences about the loss mechanisms responsible for LED efficiency droop at high driving currents. The limits of this investigative approach are discussed here in a case study based on a comprehensive set of current- and temperature-dependent EL data from blue LEDs with low and high densities of threading dislocations (TDs). First, the effects limiting the applicability of simpler (closed-form and/or one-dimensional) classes of models are addressed, like lateral current crowding, vertical carrier distribution nonuniformity, and interband transition broadening. Then, the major sources of uncertainty affecting state-of-the-art numerical device simulation are reviewed and discussed, including (i) the approximations in the transport description through the multi-quantum-well active region, (ii) the alternative valence band parametrizations proposed to calculate the spontaneous emission rate, and (iii) the difficulties in defining the Auger coefficients due to inadequacies in the microscopic quantum well description and the possible presence of extra, non-Auger high-current-density recombination mechanisms and/or Auger-induced leakage. In the case of the present LED structures, the application of three-dimensional numerical-simulation-based analysis to the EL data leads to an explanation of efficiency droop in terms of TD-related and Auger-like nonradiative losses, with a C coefficient in the 10⁻³⁰ cm⁶/s range at room temperature, close to the larger theoretical calculations reported so far. However, a study of the combined effects of structural and model uncertainties suggests that the C values thus determined could be overestimated by about an order of magnitude. 
This preliminary attempt at uncertainty quantification confirms, beyond the present case, the need for an improved description of carrier transport and microscopic radiative and nonradiative recombination mechanisms in device-level LED numerical models.

  18. Measurement of Neutrino and Antineutrino Total Charged-Current Cross Sections on Carbon with MINERvA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Lu

This thesis presents a measurement of the charged-current inclusive cross sections for muon neutrino and antineutrino interactions on carbon, and of the antineutrino-to-neutrino cross section ratio, r, in the energy range 2-22 GeV, with data collected in the MINERvA experiment. The dataset corresponds to an exposure of 3.2 × 10²⁰ protons on target (POT) for neutrinos and 1.0 × 10²⁰ POT for antineutrinos. Measurement of neutrino and antineutrino charged-current inclusive cross sections provides essential constraints for future long-baseline neutrino oscillation experiments in the few-GeV energy range. Our measured antineutrino cross section has an uncertainty in the range 6.1%-10.5% and is the most precise measurement below 6 GeV to date. The measured r has an uncertainty of 5.0%-7.5%; this is the first measurement below 6 GeV since Gargamelle in the 1970s. The cross sections are measured as a function of neutrino energy by dividing the efficiency-corrected charged-current sample by the extracted fluxes. Fluxes are obtained using the low-ν method, which uses the low-hadronic-energy subsample of the charged-current inclusive sample to extract the flux. The measured cross sections show good agreement with the predictions of neutrino interaction models above 7 GeV, and are about 10% below the model below 7 GeV. The measured r agrees with the GENIE model [1] over the whole energy region. The measured cross sections and r are compared with world data.

  19. An Adaptation Dilemma Caused by Impacts-Modeling Uncertainty

    NASA Astrophysics Data System (ADS)

    Frieler, K.; Müller, C.; Elliott, J. W.; Heinke, J.; Arneth, A.; Bierkens, M. F.; Ciais, P.; Clark, D. H.; Deryng, D.; Doll, P. M.; Falloon, P.; Fekete, B. M.; Folberth, C.; Friend, A. D.; Gosling, S. N.; Haddeland, I.; Khabarov, N.; Lomas, M. R.; Masaki, Y.; Nishina, K.; Neumann, K.; Oki, T.; Pavlick, R.; Ruane, A. C.; Schmid, E.; Schmitz, C.; Stacke, T.; Stehfest, E.; Tang, Q.; Wisser, D.

    2013-12-01

Ensuring future well-being for a growing population under either strong climate change or an aggressive mitigation strategy requires a subtle balance of potentially conflicting response measures. In the case of competing goals, uncertainty in impact estimates plays a central role when high confidence in achieving a primary objective (such as food security) directly implies an increased probability of uncertainty-induced failure with regard to a competing target (such as climate protection). We use cross-sectorally consistent multi-impact model simulations from the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP, www.isi-mip.org) to illustrate this uncertainty dilemma: RCP projections from 7 global crop, 11 hydrological, and 7 biomes models are combined to analyze irrigation and land use changes as possible responses to climate change and increasing crop demand due to population growth and economic development. We show that, while a no-regrets option with regard to climate protection, additional irrigation alone is not expected to balance the demand increase by 2050. In contrast, a strong expansion of cultivated land closes the projected production-demand gap in some crop models, but it comes at the expense of a loss of natural carbon sinks of order 50%. Given the large uncertainty of state-of-the-art crop model projections, even these strong land use changes would not put us 'on the safe side' with respect to food supply. In a world where increasing carbon emissions continue to shrink the overall solution space, we demonstrate that current impacts-modeling uncertainty is a luxury we cannot afford. ISI-MIP is intended to provide cross-sectorally consistent impact projections for model intercomparison and improvement as well as cross-sectoral integration. The results presented here were generated within the first Fast-Track phase of the project, covering global impact projections; the second phase will also include regional projections. 
It is the aim of the project to build up a CMIP-like open archive of climate impact projections, allowing for the necessary sharpening of our picture of a 1, 2, 3, or 4 degrees warmer world.

  20. Development and Testing of a Coupled Ocean-atmosphere Mesoscale Ensemble Prediction System

    DTIC Science & Technology

    2011-06-28

wind, temperature, and moisture variables, while the oceanographic ET is derived from ocean current, temperature, and salinity variables. Estimates of...uncertainty in the model. Rigorously accurate ensemble methods for describing the distribution of future states given past information include particle

  1. Prototype Biology-Based Radiation Risk Module Project

    NASA Technical Reports Server (NTRS)

    Terrier, Douglas; Clayton, Ronald G.; Patel, Zarana; Hu, Shaowen; Huff, Janice

    2015-01-01

    Biological effects of space radiation and risk mitigation are strategic knowledge gaps for the Evolvable Mars Campaign. The current epidemiology-based NASA Space Cancer Risk (NSCR) model contains large uncertainties (HAT #6.5a) due to lack of information on the radiobiology of galactic cosmic rays (GCR) and lack of human data. The use of experimental models that most accurately replicate the response of human tissues is critical for precision in risk projections. Our proposed study will compare DNA damage, histological, and cell kinetic parameters after irradiation in normal 2D human cells versus 3D tissue models, and it will use a multi-scale computational model (CHASTE) to investigate various biological processes that may contribute to carcinogenesis, including radiation-induced cellular signaling pathways. This cross-disciplinary work, with biological validation of an evolvable mathematical computational model, will help reduce uncertainties within NSCR and aid risk mitigation for radiation-induced carcinogenesis.

  2. A computationally inexpensive model for estimating dimensional measurement uncertainty due to x-ray computed tomography instrument misalignments

    NASA Astrophysics Data System (ADS)

    Ametova, Evelina; Ferrucci, Massimiliano; Chilingaryan, Suren; Dewulf, Wim

    2018-06-01

    The recent emergence of advanced manufacturing techniques such as additive manufacturing and an increased demand on the integrity of components have motivated research on the application of x-ray computed tomography (CT) for dimensional quality control. While CT has shown significant empirical potential for this purpose, there is a need for metrological research to accelerate the acceptance of CT as a measuring instrument. The accuracy in CT-based measurements is vulnerable to the instrument geometrical configuration during data acquisition, namely the relative position and orientation of x-ray source, rotation stage, and detector. Consistency between the actual instrument geometry and the corresponding parameters used in the reconstruction algorithm is critical. Currently available procedures provide users with only estimates of geometrical parameters. Quantification and propagation of uncertainty in the measured geometrical parameters must be considered to provide a complete uncertainty analysis and to establish confidence intervals for CT dimensional measurements. In this paper, we propose a computationally inexpensive model to approximate the influence of errors in CT geometrical parameters on dimensional measurement results. We use surface points extracted from a computer-aided design (CAD) model to model discrepancies in the radiographic image coordinates assigned to the projected edges between an aligned system and a system with misalignments. The efficacy of the proposed method was confirmed on simulated and experimental data in the presence of various geometrical uncertainty contributors.
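The core idea, comparing projected image coordinates of CAD surface points under an aligned and a perturbed geometry, can be sketched with a heavily simplified cone-beam projection. Only a single in-plane detector offset is modelled here; the actual instrument model covers more degrees of freedom (detector tilts and rotations, source and stage positions), and all numbers are illustrative.

```python
import numpy as np

def project(points, sdd, sod, det_offset=(0.0, 0.0)):
    # Simplified cone-beam projection: point source at the origin, optical axis
    # along y, detector plane at source-detector distance `sdd`, object centred
    # at source-object distance `sod`. Only an in-plane detector offset is
    # modelled; a full CT geometry has many more misalignment parameters.
    x, y, z = points.T
    mag = sdd / (sod + y)              # per-point magnification
    u = x * mag + det_offset[0]        # detector column coordinate
    v = z * mag + det_offset[1]        # detector row coordinate
    return np.column_stack([u, v])

# Surface points (e.g. sampled from a CAD model), in object coordinates
pts = np.array([[1.0, 0.0, 2.0], [-3.0, 5.0, 1.0]])
aligned = project(pts, sdd=1000.0, sod=500.0)
misaligned = project(pts, sdd=1000.0, sod=500.0, det_offset=(0.1, 0.0))
discrepancy = np.abs(misaligned - aligned)  # image-coordinate error per point
```

Propagating distributions of such geometrical errors through the projections, rather than single perturbations, is what yields the measurement uncertainty estimates the paper targets.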

  3. Status of the Λ(1405)

    NASA Astrophysics Data System (ADS)

    Mai, Maxim

    2018-07-01

    I give an overview of the current status of the lowest s-wave baryon resonance in the strangeness (S=-1) channel, the Λ(1405). Recent results from Lattice QCD calculations and new high-precision data from photoproduction experiments are highlighted in this talk. On the theoretical side various directions have been explored over the last two decades on the basis of coupled-channel chiral unitary models. New photoproduction data can be used to reduce statistical uncertainty of the predictions of such models. As for the systematic uncertainties, a recent comparative analysis of modern approaches exhibits many similarities but also large ambiguities in some of the predicted properties of the antikaon-nucleon scattering amplitudes. Some possible ways to reduce such a model dependence are discussed at the end of this manuscript.

  4. Nuclear Effects in Quasi-Elastic and Delta Resonance Production at Low Momentum Transfer

    NASA Astrophysics Data System (ADS)

    Demgen, John Gibney

    We analyze data collected by the MINERvA experiment by examining the distribution of charged-hadron energy for interactions with low momentum transfer. This distribution reveals major discrepancies between the detector data and the standard MINERvA interaction model, which uses only a simple global Fermi gas. Adding further model elements, the random phase approximation (RPA), meson exchange currents (MEC), and a reduction of delta-resonance production, reduces these discrepancies. Special attention is paid to systematic uncertainties on delta-resonance production, which do not account for the remaining discrepancies even when combined with resolution and bias systematics. Eye-scanning of events in this region also shows a discrepancy, but we were insensitive to two-proton events, the predicted signature of the MEC process.

  5. Reservoir Performance Under Future Climate For Basins With Different Hydrologic Sensitivities

    NASA Astrophysics Data System (ADS)

    Mateus, M. C.; Tullos, D. D.

    2013-12-01

    In addition to long-standing uncertainties related to variable inflows and the market price of power, reservoir operators face a number of new uncertainties related to hydrologic nonstationarity, changing environmental regulations, and rapidly growing water and energy demands. This study investigates the impact, sensitivity, and uncertainty of changing hydrology on hydrosystem performance across different hydrogeologic settings. We evaluate the performance of reservoirs in the Santiam River basin, including case studies of the North Santiam Basin, with high permeability and extensive groundwater storage, and the South Santiam Basin, with low permeability, little groundwater storage, and rapid runoff response. The modeling objective is to address the following study questions: (1) for the two hydrologic regimes, how does the flood management, water supply, and environmental performance of current reservoir operations change under future 2.5, 50 and 97.5 percentile streamflow projections; and (2) how much change in inflow is required to initiate a failure to meet downstream minimum or maximum flows in the future. We couple global climate model results with a rainfall-runoff model and a formal Bayesian uncertainty analysis to simulate future inflow hydrographs as inputs to a reservoir operations model. To evaluate reservoir performance under a changing climate, we calculate reservoir refill reliability, changes in flood frequency, and the time-based and volumetric reliability of meeting minimum spring and summer flow targets. Reservoir performance under future hydrology appears to vary with hydrogeology. We find higher sensitivity to floods for the North Santiam Basin and higher sensitivity to minimum flow targets for the South Santiam Basin. Higher uncertainty is associated with basins having more complex hydrogeology. Results from the model simulations contribute to understanding the reliability and vulnerability of reservoirs to a changing climate.
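    The reliability metrics mentioned above can be computed directly from a simulated flow series. A minimal sketch with made-up numbers and a single minimum-flow target (not the study's actual operations model):

    ```python
    import numpy as np

    def reliability(flows, target):
        """Time-based and volumetric reliability of meeting a minimum
        flow target, computed from a simulated daily flow series."""
        flows = np.asarray(flows, dtype=float)
        time_rel = np.mean(flows >= target)            # fraction of days target met
        supplied = np.minimum(flows, target).sum()     # volume usable toward demand
        vol_rel = supplied / (target * len(flows))     # fraction of demand met
        return time_rel, vol_rel

    # hypothetical daily flows (m^3/s) against a 10 m^3/s minimum target
    flows = [12.0, 9.5, 8.0, 15.0, 10.0, 7.5, 11.0, 10.0]
    t_rel, v_rel = reliability(flows, target=10.0)
    print(t_rel, v_rel)   # 0.625, 0.9375
    ```

    Running future inflow ensembles through such metrics is one way to compare reservoir performance across streamflow projections.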

  6. Regional crop yield forecasting: a probabilistic approach

    NASA Astrophysics Data System (ADS)

    de Wit, A.; van Diepen, K.; Boogaard, H.

    2009-04-01

    Information on the outlook on yield and production of crops over large regions is essential for government services dealing with import and export of food crops, for agencies with a role in food relief, for international organizations with a mandate in monitoring the world food production and trade, and for commodity traders. Process-based mechanistic crop models are an important tool for providing such information, because they can integrate the effect of crop management, weather and soil on crop growth. When properly integrated in a yield forecasting system, the aggregated model output can be used to predict crop yield and production at regional, national and continental scales. Nevertheless, given the scales at which these models operate, the results are subject to large uncertainties due to poorly known weather conditions and crop management. Current yield forecasting systems are generally deterministic in nature and provide no information about the uncertainty bounds on their output. To improve on this situation we present an ensemble-based approach where uncertainty bounds can be derived from the dispersion of results in the ensemble. The probabilistic information provided by this ensemble-based system can be used to quantify uncertainties (risk) on regional crop yield forecasts and can therefore be an important support to quantitative risk analysis in a decision making process.
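    The ensemble-based uncertainty bounds described above reduce, in the simplest case, to percentiles of the ensemble spread. A toy sketch with a hypothetical yield ensemble (the real system aggregates spatial crop-model output, which is not reproduced here):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # hypothetical ensemble: 50 model runs of regional yield (t/ha), produced
    # e.g. by perturbing weather inputs and crop-management assumptions
    ensemble = rng.normal(loc=7.2, scale=0.6, size=50)

    median = np.percentile(ensemble, 50)
    lo, hi = np.percentile(ensemble, [5, 95])   # 90% uncertainty bounds
    print(f"forecast {median:.2f} t/ha (90% range {lo:.2f}-{hi:.2f})")
    ```

    Reporting the forecast as a distribution rather than a point value is what makes the risk information usable in a decision-making process.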

  7. Advanced Booster Liquid Engine Combustion Stability

    NASA Technical Reports Server (NTRS)

    Tucker, Kevin; Gentz, Steve; Nettles, Mindy

    2015-01-01

    Combustion instability is a phenomenon in liquid rocket engines caused by complex coupling between the time-varying combustion processes and the fluid dynamics in the combustor. Consequences of the large pressure oscillations associated with combustion instability often cause significant hardware damage and can be catastrophic. The current combustion stability assessment tools are limited by the level of empiricism in many inputs and embedded models. This limited predictive capability creates significant uncertainty in stability assessments. This large uncertainty then increases hardware development costs due to heavy reliance on expensive and time-consuming testing.

  8. Evaluating Productivity Predictions Under Elevated CO2 Conditions: Multi-Model Benchmarking Across FACE Experiments

    NASA Astrophysics Data System (ADS)

    Cowdery, E.; Dietze, M.

    2016-12-01

    As atmospheric carbon dioxide levels continue to increase, it is critical that terrestrial ecosystem models can accurately predict ecological responses to the changing environment. Current predictions of net primary productivity (NPP) in response to elevated atmospheric CO2 concentration are highly variable and contain a considerable amount of uncertainty. The Predictive Ecosystem Analyzer (PEcAn) is an informatics toolbox that wraps around an ecosystem model and can be used to help identify which factors drive uncertainty. We tested a suite of models (LPJ-GUESS, MAESPA, GDAY, CLM5, DALEC, ED2), which represent a range from low to high structural complexity, across a range of Free-Air CO2 Enrichment (FACE) experiments: the Kennedy Space Center Open Top Chamber Experiment, the Rhinelander FACE experiment, the Duke Forest FACE experiment and the Oak Ridge Experiment on CO2 Enrichment. These tests were implemented in a novel benchmarking workflow that is automated, repeatable, and generalized to incorporate different sites and ecological models. Observational data from the FACE experiments represent a first test of this flexible, extensible approach aimed at providing repeatable tests of model process representation. To identify and evaluate the assumptions causing inter-model differences, we used PEcAn to perform model sensitivity and uncertainty analysis, not only to assess the components of NPP, but also to examine system processes such as nutrient uptake and water use. Combining the observed patterns of uncertainty between multiple models with results of the recent FACE model-data synthesis project (FACE-MDS) can help identify which processes need further study and additional data constraints. These findings can be used to inform future experimental design and in turn can provide an informative starting point for data assimilation.
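    As a rough illustration of the kind of parameter-level uncertainty partitioning described above, the sketch below varies each parameter over an assumed prior while holding the others fixed and compares the resulting output variances. The model, parameter names, and priors are all hypothetical stand-ins, not PEcAn's actual workflow (which emulates the full ecosystem models):

    ```python
    import numpy as np

    def npp_model(params):
        """Stand-in ecosystem model: NPP as a nonlinear function of two
        hypothetical parameters (leaf nitrogen, stomatal slope)."""
        leaf_n, g1 = params
        return 10.0 * np.tanh(0.5 * leaf_n) + 0.8 * g1

    priors = {"leaf_n": (2.0, 0.4), "g1": (4.0, 1.0)}  # assumed (mean, sd)
    rng = np.random.default_rng(0)

    # one-at-a-time variance decomposition: sample each parameter from its
    # prior while holding the others at their prior means
    contrib = {}
    for i, (name, (mu, sd)) in enumerate(priors.items()):
        base = [p[0] for p in priors.values()]
        outs = []
        for _ in range(2000):
            x = base.copy()
            x[i] = rng.normal(mu, sd)
            outs.append(npp_model(x))
        contrib[name] = np.var(outs)

    total = sum(contrib.values())
    shares = {k: v / total for k, v in contrib.items()}
    print(shares)
    ```

    Ranking parameters by their share of predictive variance is what points model developers toward the data constraints with the highest payoff.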

  9. Probabilistic risk assessment for CO2 storage in geological formations: robust design and support for decision making under uncertainty

    NASA Astrophysics Data System (ADS)

    Oladyshkin, Sergey; Class, Holger; Helmig, Rainer; Nowak, Wolfgang

    2010-05-01

    CO2 storage in geological formations is currently being discussed intensively as a technology for mitigating CO2 emissions. However, any large-scale application requires a thorough analysis of the potential risks. Current numerical simulation models are too expensive for probabilistic risk analysis and for stochastic approaches based on brute-force repeated simulation. Even single deterministic simulations may require parallel high-performance computing. The multiphase flow processes involved are too non-linear for quasi-linear error propagation and other simplified stochastic tools. As an alternative approach, we propose a massive stochastic model reduction based on the probabilistic collocation method. The model response is projected onto an orthogonal basis of higher-order polynomials to approximate its dependence on uncertain parameters (porosity, permeability etc.) and design parameters (injection rate, depth etc.). This allows for a non-linear propagation of model uncertainty affecting the predicted risk, ensures fast computation and provides a powerful tool for combining design variables and uncertain variables into one approach based on an integrative response surface. Thus, the design task of finding optimal injection regimes explicitly includes uncertainty, which leads to robust designs of the non-linear system that minimize failure probability and provide valuable support for risk-informed management decisions. We validate our proposed stochastic approach by Monte Carlo simulation using a common 3D benchmark problem (Class et al., Computational Geosciences 13, 2009). A reasonable compromise between computational effort and precision was already reached with second-order polynomials. In our case study, the proposed approach yields a significant computational speedup by a factor of 100 compared to Monte Carlo simulation.
We demonstrate that, due to the non-linearity of the flow and transport processes during CO2 injection, including uncertainty in the analysis leads to a systematic and significant shift of predicted leakage rates towards higher values compared with deterministic simulations, affecting both risk estimates and the design of injection scenarios. This implies that neglecting uncertainty is a strong simplification when modeling CO2 injection, and its consequences can be stronger than those of neglecting several physical phenomena (e.g. phase transition, convective mixing, capillary forces etc.). The authors would like to thank the German Research Foundation (DFG) for financial support of the project within the Cluster of Excellence in Simulation Technology (EXC 310/1) at the University of Stuttgart. Keywords: polynomial chaos; CO2 storage; multiphase flow; porous media; risk assessment; uncertainty; integrative response surfaces
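    The core idea of the stochastic model reduction, replacing the expensive simulator with a low-order polynomial response surface fitted at a small number of sample points, can be sketched as follows. This is a simplified regression-based variant with a made-up two-parameter model, not the authors' collocation scheme:

    ```python
    import numpy as np

    def expensive_model(x):
        """Stand-in for a costly multiphase-flow simulation: leakage rate as a
        nonlinear function of (standardized log-permeability, injection rate)."""
        k, q = x
        return np.exp(0.8 * k) * (1.0 + 0.3 * q + 0.05 * q * k)

    rng = np.random.default_rng(0)
    train = rng.normal(size=(30, 2))              # a few "expensive" runs
    y = np.array([expensive_model(x) for x in train])

    # second-order polynomial basis: 1, k, q, k^2, k*q, q^2
    def basis(X):
        k, q = X[:, 0], X[:, 1]
        return np.column_stack([np.ones_like(k), k, q, k**2, k * q, q**2])

    coef, *_ = np.linalg.lstsq(basis(train), y, rcond=None)

    # the surrogate is cheap, so brute-force Monte Carlo becomes feasible
    samples = rng.normal(size=(100_000, 2))
    pred = basis(samples) @ coef
    p95 = np.percentile(pred, 95)                 # e.g. a high-leakage quantile
    print(p95)
    ```

    The factor-of-100 speedup reported above comes from exactly this substitution: the response surface is evaluated instead of the simulator inside the Monte Carlo loop.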

  10. Uncertainty principles for inverse source problems for electromagnetic and elastic waves

    NASA Astrophysics Data System (ADS)

    Griesmaier, Roland; Sylvester, John

    2018-06-01

    In isotropic homogeneous media, far fields of time-harmonic electromagnetic waves radiated by compactly supported volume currents, and elastic waves radiated by compactly supported body force densities can be modelled in very similar fashions. Both are projected restricted Fourier transforms of vector-valued source terms. In this work we generalize two types of uncertainty principles recently developed for far fields of scalar-valued time-harmonic waves in Griesmaier and Sylvester (2017 SIAM J. Appl. Math. 77 154–80) to this vector-valued setting. These uncertainty principles yield stability criteria and algorithms for splitting far fields radiated by collections of well-separated sources into the far fields radiated by individual source components, and for the restoration of missing data segments. We discuss proper regularization strategies for these inverse problems, provide stability estimates based on the new uncertainty principles, and comment on reconstruction schemes. A numerical example illustrates our theoretical findings.

  11. Aerosol Direct Radiative Effects Over the Northwest Atlantic, Northwest Pacific, and North Indian Oceans: Estimates Based on In-situ Chemical and Optical Measurements and Chemical Transport Modeling

    NASA Astrophysics Data System (ADS)

    Bates, T. S.; Anderson, T. L.; Baynard, T.; Bond, T.; Boucher, O.; Carmichael, G.; Clarke, A.; Erlick, C.; Guo, H.; Horowitz, L.; Howell, S.; Kulkarni, S.; Maring, H.; McComiskey, A.; Middlebrook, A.; Noone, K.; O'Dowd, C. D.; Ogren, J. A.; Penner, J.; Quinn, P. K.; Ravishankara, A. R.; Savoie, D. L.; Schwartz, S. E.; Shinozuka, Y.; Tang, Y.; Weber, R. J.; Wu, Y.

    2005-12-01

    The largest uncertainty in the radiative forcing of climate change over the industrial era is that due to aerosols, a substantial fraction of which is the uncertainty associated with scattering and absorption of shortwave (solar) radiation by anthropogenic aerosols in cloud-free conditions. Quantifying and reducing the uncertainty in aerosol influences on climate is critical to understanding climate change over the industrial period and to improving predictions of future climate change for assumed emission scenarios. Measurements of aerosol properties during major field campaigns in several regions of the globe during the past decade are contributing to an enhanced understanding of atmospheric aerosols and their effects on light scattering and climate. The present study, which focuses on three regions downwind of major urban/population centers (North Indian Ocean during INDOEX, the Northwest Pacific Ocean during ACE-Asia, and the Northwest Atlantic Ocean during ICARTT), incorporates understanding gained from field observations of aerosol distributions and properties into calculations of perturbations in radiative fluxes due to these aerosols. This study evaluates the current state of observations and of two chemical transport models (STEM and MOZART). Measurements of burdens, extinction optical depth, and direct radiative effect of aerosols (change in radiative flux due to total aerosols) are used as measurement-model check points to assess uncertainties. In-situ measured and remotely sensed aerosol properties for each region (mixing state, mass scattering efficiency, single scattering albedo, and angular scattering properties and their dependences on relative humidity) are used as input parameters to two radiative transfer models (GFDL and University of Michigan) to constrain estimates of aerosol radiative effects, with uncertainties in each step propagated through the analysis. 
Such comparisons with observations and resultant reductions in uncertainties are essential for improving and developing confidence in climate model calculations incorporating aerosol forcing.

  12. Professional or administrative value patterns? Clinical pathways in medical problem-solving processes.

    PubMed

    Holmberg, Leif

    2007-11-01

    A health-care organization simultaneously belongs to two different institutional value patterns: a professional and an administrative value pattern. At the administrative level, medical problem-solving processes are generally perceived as the efficient application of familiar chains of activities to well-defined problems; and a low task uncertainty is therefore assumed at the work-floor level. This assumption is further reinforced through clinical pathways and other administrative guidelines. However, studies have shown that in clinical practice such administrative guidelines are often considered inadequate and difficult to implement mainly because physicians generally perceive task uncertainty to be high and that the guidelines do not cover the scope of encountered deviations. The current administrative level guidelines impose uniform structural features that meet the requirement for low task uncertainty. Within these structural constraints, physicians must organize medical problem-solving processes to meet any task uncertainty that may be encountered. Medical problem-solving processes with low task uncertainty need to be organized independently of processes with high task uncertainty. Each process must be evaluated according to different performance standards and needs to have autonomous administrative guideline models. Although clinical pathways seem appropriate when there is low task uncertainty, other kinds of guidelines are required when the task uncertainty is high.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, John R.; Brooks, Dusty Marie

    In pressurized water reactors, the prevention, detection, and repair of cracks within dissimilar metal welds is essential to ensure proper plant functionality and safety. Weld residual stresses, which are difficult to model and cannot be directly measured, contribute to the formation and growth of cracks due to primary water stress corrosion cracking. Additionally, the uncertainty in weld residual stress measurements and modeling predictions is not well understood, further complicating the prediction of crack evolution. The purpose of this document is to develop methodology to quantify the uncertainty associated with weld residual stress that can be applied to modeling predictions and experimental measurements. Ultimately, the results can be used to assess the current state of uncertainty and to build confidence in both modeling and experimental procedures. The methodology consists of statistically modeling the variation in the weld residual stress profiles using functional data analysis techniques. Uncertainty is quantified using statistical bounds (e.g. confidence and tolerance bounds) constructed with a semi-parametric bootstrap procedure. Such bounds describe the range in which quantities of interest, such as means, are expected to lie as evidenced by the data. The methodology is extended to provide direct comparisons between experimental measurements and modeling predictions by constructing statistical confidence bounds for the average difference between the two quantities. The statistical bounds on the average difference can be used to assess the level of agreement between measurements and predictions. The methodology is applied to experimental measurements of residual stress obtained using two strain relief measurement methods and predictions from seven finite element models developed by different organizations during a round robin study.
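    A minimal sketch of the resampling idea: bootstrap whole residual-stress profiles (so the along-depth correlation within each curve is preserved) to obtain pointwise statistical bounds on the mean profile. The profiles and noise level are synthetic, and the semi-parametric details of the actual procedure are omitted:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    depth = np.linspace(0.0, 1.0, 25)            # normalized through-wall depth

    # hypothetical measurements: 7 realizations of a common shape plus noise
    shape = 300.0 * np.cos(np.pi * depth)        # MPa, tension -> compression
    profiles = shape + rng.normal(scale=40.0, size=(7, depth.size))

    # bootstrap the mean profile: resample whole curves with replacement,
    # treating each profile as a single functional observation
    boot_means = np.array([
        profiles[rng.integers(0, 7, size=7)].mean(axis=0)
        for _ in range(2000)
    ])
    lower, upper = np.percentile(boot_means, [2.5, 97.5], axis=0)
    ```

    Analogous bands built for the measurement-minus-prediction difference are what let the study say whether models and experiments agree within the data's evidence.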

  14. Status of LDEF radiation modeling

    NASA Technical Reports Server (NTRS)

    Watts, John W.; Armstrong, T. W.; Colborn, B. L.

    1995-01-01

    The current status of model predictions and comparisons with LDEF radiation dosimetry measurements is summarized, with emphasis on major results obtained in evaluating the uncertainties of present radiation environment models. The consistency of results and conclusions obtained from model comparisons with different sets of LDEF radiation data (dose, activation, fluence, LET spectra) is discussed. Examples where LDEF radiation data and modeling results can be utilized to provide improved radiation assessments for planned LEO missions (e.g., Space Station) are given.

  15. The IAEA coordinated research programme on HTGR uncertainty analysis: Phase I status and Ex. I-1 prismatic reference results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bostelmann, Friederike; Strydom, Gerhard; Reitsma, Frederik

    The quantification of uncertainties in the design and safety analysis of reactors is today not only broadly accepted but has in many cases become the preferred replacement for traditional conservative safety and licensing analysis. The use of a more fundamental methodology is also consistent with the reliable high-fidelity physics models and robust, efficient, and accurate codes available today. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied, in contrast to the historical approach, where sensitivity analyses were performed and uncertainties then determined by a simplified statistical combination of a few important input parameters. New methodologies are currently under development in the OECD/NEA Light Water Reactor (LWR) Uncertainty Analysis in Best-Estimate Modelling (UAM) benchmark activity. High Temperature Gas-cooled Reactor (HTGR) designs require specific treatment of the double heterogeneous fuel design and large graphite quantities at high temperatures. The IAEA therefore launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modelling (UAM) in 2013 to study uncertainty propagation specifically in the HTGR analysis chain. Two benchmark problems are defined, with the prismatic design represented by the General Atomics (GA) MHTGR-350 and a 250 MW modular pebble bed design similar to the Chinese HTR-PM. Work has started on the first phase and the current CRP status is reported in the paper. A comparison of the Serpent and SCALE/KENO-VI reference Monte Carlo results for Ex. I-1 of the MHTGR-350 design is also included. It was observed that the SCALE/KENO-VI Continuous Energy (CE) k∞ values were 395 pcm (Ex. I-1a) to 803 pcm (Ex. I-1b) higher than the respective Serpent lattice calculations, and that within the set of SCALE results, the KENO-VI 238-group Multi-Group (MG) k∞ values were up to 800 pcm lower than the KENO-VI CE values. 
The use of the latest ENDF/B-VII.1 cross section library in Serpent led to ~180 pcm lower k∞ values compared with the older ENDF/B-VII.0 dataset, caused by the modified graphite neutron capture cross section. Furthermore, the fourth beta release of SCALE 6.2 likewise produced lower CE k∞ values than SCALE 6.1, and the improved performance of the new 252-group library available in SCALE 6.2 is especially noteworthy. A SCALE/TSUNAMI uncertainty analysis of the Hot Full Power variant of Ex. I-1a furthermore concluded that the 238U(n,γ) (capture) and 235U cross-section covariance matrices contributed the most to the total k∞ uncertainty of 0.58%.
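    For readers unfamiliar with the units, the code-to-code differences above are quoted in pcm (1 pcm = 10⁻⁵). A small sketch of the two common conventions, a direct Δk difference and a reactivity difference; the k∞ values are hypothetical, chosen only to reproduce a 395 pcm Δk:

    ```python
    def dk_pcm(k_a, k_b):
        """Difference in multiplication factor expressed in pcm (1 pcm = 1e-5)."""
        return (k_a - k_b) * 1e5

    def drho_pcm(k_a, k_b):
        """Reactivity difference in pcm, with rho = (k - 1) / k."""
        return ((k_a - 1.0) / k_a - (k_b - 1.0) / k_b) * 1e5

    # hypothetical k-infinity values from two codes for the same lattice
    print(dk_pcm(1.06640, 1.06245))    # ≈ 395 pcm
    print(drho_pcm(1.06640, 1.06245))  # smaller, because rho divides by k
    ```

    Which convention a paper uses should be checked before comparing quoted biases across studies.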

  16. The IAEA coordinated research programme on HTGR uncertainty analysis: Phase I status and Ex. I-1 prismatic reference results

    DOE PAGES

    Bostelmann, Friederike; Strydom, Gerhard; Reitsma, Frederik; ...

    2016-01-11

    The quantification of uncertainties in the design and safety analysis of reactors is today not only broadly accepted but has in many cases become the preferred replacement for traditional conservative safety and licensing analysis. The use of a more fundamental methodology is also consistent with the reliable high-fidelity physics models and robust, efficient, and accurate codes available today. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied, in contrast to the historical approach, where sensitivity analyses were performed and uncertainties then determined by a simplified statistical combination of a few important input parameters. New methodologies are currently under development in the OECD/NEA Light Water Reactor (LWR) Uncertainty Analysis in Best-Estimate Modelling (UAM) benchmark activity. High Temperature Gas-cooled Reactor (HTGR) designs require specific treatment of the double heterogeneous fuel design and large graphite quantities at high temperatures. The IAEA therefore launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modelling (UAM) in 2013 to study uncertainty propagation specifically in the HTGR analysis chain. Two benchmark problems are defined, with the prismatic design represented by the General Atomics (GA) MHTGR-350 and a 250 MW modular pebble bed design similar to the Chinese HTR-PM. Work has started on the first phase and the current CRP status is reported in the paper. A comparison of the Serpent and SCALE/KENO-VI reference Monte Carlo results for Ex. I-1 of the MHTGR-350 design is also included. It was observed that the SCALE/KENO-VI Continuous Energy (CE) k∞ values were 395 pcm (Ex. I-1a) to 803 pcm (Ex. I-1b) higher than the respective Serpent lattice calculations, and that within the set of SCALE results, the KENO-VI 238-group Multi-Group (MG) k∞ values were up to 800 pcm lower than the KENO-VI CE values. 
The use of the latest ENDF/B-VII.1 cross section library in Serpent led to ~180 pcm lower k∞ values compared with the older ENDF/B-VII.0 dataset, caused by the modified graphite neutron capture cross section. Furthermore, the fourth beta release of SCALE 6.2 likewise produced lower CE k∞ values than SCALE 6.1, and the improved performance of the new 252-group library available in SCALE 6.2 is especially noteworthy. A SCALE/TSUNAMI uncertainty analysis of the Hot Full Power variant of Ex. I-1a furthermore concluded that the 238U(n,γ) (capture) and 235U cross-section covariance matrices contributed the most to the total k∞ uncertainty of 0.58%.

  17. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    DOE PAGES

    McDonnell, J. D.; Schunck, N.; Higdon, D.; ...

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. In addition, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
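    The emulator-based propagation step can be sketched in a few lines: train a cheap surrogate on a handful of "expensive" model runs, then push posterior parameter samples through it. The one-parameter model, kernel length scale, and posterior are all toy assumptions, not the Skyrme-functional setup of the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def expensive_dft(theta):
        """Stand-in for a costly DFT calculation: an observable residual
        (e.g. MeV) as a smooth function of one effective model parameter."""
        return np.sin(2.0 * theta) + 0.3 * theta

    # train a minimal RBF Gaussian-process emulator on a few model runs
    X = np.linspace(-2.0, 2.0, 12)
    y = expensive_dft(X)

    def rbf(a, b, ell=0.7):
        return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

    K = rbf(X, X) + 1e-8 * np.eye(X.size)   # jitter for numerical stability
    alpha = np.linalg.solve(K, y)

    def emulate(x):
        return rbf(np.atleast_1d(x), X) @ alpha

    # propagate parameter uncertainty: posterior samples -> predictions
    posterior_theta = rng.normal(loc=0.4, scale=0.15, size=5000)
    pred = emulate(posterior_theta)
    mean, sd = pred.mean(), pred.std()
    ```

    The emulator mean here omits the GP predictive variance; the full framework would also carry the emulator's own uncertainty through the propagation.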

  18. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonnell, J. D.; Schunck, N.; Higdon, D.

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  19. Technical note: Bayesian calibration of dynamic ruminant nutrition models.

    PubMed

    Reed, K F; Arhonditsis, G B; France, J; Kebreab, E

    2016-08-01

    Mechanistic models of ruminant digestion and metabolism have advanced our understanding of the processes underlying ruminant animal physiology. Deterministic modeling practices ignore the inherent variation within and among individual animals and thus have no way to assess how sources of error influence model outputs. We introduce Bayesian calibration of mathematical models to address the need for robust mechanistic modeling tools that can accommodate error analysis by remaining within the bounds of data-based parameter estimation. For the purpose of prediction, the Bayesian approach generates a posterior predictive distribution that represents the current estimate of the value of the response variable, taking into account both the uncertainty about the parameters and model residual variability. Predictions are expressed as probability distributions, thereby conveying significantly more information than point estimates in regard to uncertainty. Our study illustrates some of the technical advantages of Bayesian calibration and discusses the future perspectives in the context of animal nutrition modeling. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
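    A minimal sketch of the calibration idea: a random-walk Metropolis sampler for a toy saturating-response model, followed by a posterior predictive distribution that combines parameter uncertainty with residual variability. The model form, priors, data, and tuning are illustrative assumptions, not the paper's ruminant nutrition models:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # toy "digestion" model: response = a * intake / (b + intake), plus noise
    intake = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
    true_a, true_b, sigma = 10.0, 3.0, 0.4
    obs = true_a * intake / (true_b + intake) + rng.normal(0, sigma, intake.size)

    def log_post(theta):
        a, b = theta
        if a <= 0 or b <= 0:
            return -np.inf                   # flat priors on positive values, assumed
        mu = a * intake / (b + intake)
        return -0.5 * np.sum((obs - mu)**2) / sigma**2

    # random-walk Metropolis sampler
    theta = np.array([5.0, 1.0])
    lp = log_post(theta)
    chain = []
    for _ in range(20000):
        prop = theta + rng.normal(0, 0.15, 2)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta)
    chain = np.array(chain)[5000:]           # discard burn-in

    # posterior predictive at a new intake level: parameter uncertainty plus
    # residual variability, reported as a distribution rather than a point
    new_intake = 6.0
    pp = (chain[:, 0] * new_intake / (chain[:, 1] + new_intake)
          + rng.normal(0, sigma, chain.shape[0]))
    lo, hi = np.percentile(pp, [2.5, 97.5])
    ```

    The interval (lo, hi) is exactly the kind of probabilistic output the abstract argues conveys more information than a deterministic point prediction.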

  20. Bottom friction. A practical approach to modelling coastal oceanography

    NASA Astrophysics Data System (ADS)

    Bolanos, Rodolfo; Jensen, Palle; Kofoed-Hansen, Henrik; Tornsfeldt Sørensen, Jacob

    2017-04-01

    Coastal processes imply the interaction of the atmosphere, the sea, the coastline and the bottom. The spatial gradients in this area are normally large, induced by orographic and bathymetric features. Although nowadays it is possible to obtain high-resolution bathymetry, the details of the seabed, e.g. sediment type, presence of biological material and living organisms are not available. Additionally, these properties as well as bathymetry can also be highly dynamic. These bottom characteristics are very important to describe the boundary layer of currents and waves and control to a large degree the dissipation of flows. The bottom friction is thus typically a calibration parameter in numerical modelling of coastal processes. In this work, we assess this process and put it into context of other physical processes uncertainties influencing wind-waves and currents in the coastal areas. A case study in the North Sea is used, particularly the west coast of Denmark, where water depth of less than 30 m cover a wide fringe along the coast, where several offshore wind farm developments are being carried out. We use the hydrodynamic model MIKE 21 HD and the spectral wave model MIKE 21 SW to simulate atmosphere and tidal induced flows and the wind wave generation and propagation. Both models represent state of the art and have been developed for flexible meshes, ideal for coastal oceanography as they can better represent coastlines and allow a variable spatial resolution within the domain. Sensitivity tests to bottom friction formulations are carried out into context of other processes (e.g. model forcing uncertainties, wind and wave interactions, wind drag coefficient). Additionally, a map of varying bottom properties is generated based on a literature survey to explore the impact of the spatial variability. Assessment of different approaches is made in order to establish a best practice regarding bottom friction and coastal oceanographic modelling. 
The contribution of bottom friction is also assessed during storm conditions, when its impact is expected to be most evident because waves are affected by bottom processes over larger areas, making bottom dissipation more efficient. We use available wave and current measurements in the North Sea (e.g. the Ekofisk and Fino platforms and other coastal stations on the west coast of Denmark) to quantify the importance of the processes influencing waves and currents in the coastal zone, placing bottom friction in the context of other process uncertainties.

  1. A climate robust integrated modelling framework for regional impact assessment of climate change

    NASA Astrophysics Data System (ADS)

    Janssen, Gijs; Bakker, Alexander; van Ek, Remco; Groot, Annemarie; Kroes, Joop; Kuiper, Marijn; Schipper, Peter; van Walsum, Paul; Wamelink, Wieger; Mol, Janet

    2013-04-01

    Decision making towards climate proofing the water management of regional catchments can benefit greatly from the availability of a climate robust integrated modelling framework, capable of a consistent assessment of climate change impacts on the various interests present in the catchments. In the Netherlands, much effort has been devoted to developing state-of-the-art regional dynamic groundwater models with a very high spatial resolution (25x25 m2). Still, these models are not completely satisfactory to decision makers because the modelling concepts do not take into account feedbacks between meteorology, vegetation/crop growth, and hydrology. This introduces uncertainties in forecasting the effects of climate change on groundwater, surface water, agricultural yields, and development of groundwater dependent terrestrial ecosystems. These uncertainties add to the uncertainties in the climate change predictions themselves. In order to create an integrated, climate robust modelling framework, we coupled existing model codes on hydrology, agriculture and nature that are currently in use at the different research institutes in the Netherlands. The modelling framework consists of the model codes MODFLOW (groundwater flow), MetaSWAP (vadose zone), WOFOST (crop growth), SMART2-SUMO2 (soil-vegetation) and NTM3 (nature valuation). MODFLOW, MetaSWAP and WOFOST are coupled online (i.e. they exchange information on a time-step basis). Thus, changes in meteorology and CO2 concentrations affect crop growth, and feedbacks between crop growth, vadose zone water movement and groundwater recharge are accounted for. The model chain WOFOST-MetaSWAP-MODFLOW generates hydrological input for the ecological prediction model combination SMART2-SUMO2-NTM3. The modelling framework was used to support the regional water management decision making process in the 267 km2 Baakse Beek-Veengoot catchment in the east of the Netherlands. 
Computations were performed for regionalized 30-year climate change scenarios developed by KNMI for precipitation and reference evapotranspiration according to Penman-Monteith. Special focus in the project was on the role of uncertainty. How valid is the information that is generated by this modelling framework? What are the most important uncertainties in the input data, how do they affect the results of the model chain, and how can the uncertainties of the data, results, and model concepts be quantified and communicated? Besides these technical issues, an important part of the study was devoted to the perception of stakeholders. Stakeholder analysis and additional working sessions yielded insight into how the models, their results and the uncertainties are perceived, how the modelling framework and results connect to the stakeholders' information demands, and what kind of additional information is needed for adequate decision support.

  2. Climate change and European forests: what do we know, what are the uncertainties, and what are the implications for forest management?

    PubMed

    Lindner, Marcus; Fitzgerald, Joanne B; Zimmermann, Niklaus E; Reyer, Christopher; Delzon, Sylvain; van der Maaten, Ernst; Schelhaas, Mart-Jan; Lasch, Petra; Eggers, Jeannette; van der Maaten-Theunissen, Marieke; Suckow, Felicitas; Psomas, Achilleas; Poulter, Benjamin; Hanewinkel, Marc

    2014-12-15

    The knowledge about potential climate change impacts on forests is continuously expanding, and some changes in growth, drought-induced mortality and species distribution have been observed. However, despite a significant body of research, a knowledge and communication gap exists between scientists and non-scientists as to how climate change impact scenarios can be interpreted and what they imply for European forests. It is still challenging to advise forest decision makers on how best to plan for climate change, as many uncertainties and unknowns remain and it is difficult to communicate these to practitioners and other decision makers while retaining emphasis on the importance of planning for adaptation. In this paper, recent developments in climate change observations and projections, observed and projected impacts on European forests and the associated uncertainties are reviewed and synthesised with a view to understanding the implications for forest management. Current impact assessments with simulation models contain several simplifications, which explain the discrepancy between results of many simulation studies and the rapidly increasing body of evidence about already observed changes in forest productivity and species distribution. In simulation models uncertainties tend to cascade onto one another: from estimating what future societies will be like and general circulation models (GCMs) at the global level, down to forest models and forest management at the local level. Individual climate change impact studies should not be uncritically used for decision-making without reflection on possible shortcomings in system understanding, model accuracy and other assumptions made. It is important for decision makers in forest management to realise that they have to take long-lasting management decisions while uncertainty about climate change impacts is still large. 
We discuss how to communicate about uncertainty - which is imperative for decision making - without diluting the overall message. Considering the range of possible trends and uncertainties in adaptive forest management requires expert knowledge and enhanced efforts for providing science-based decision support. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Estimates of CO2 fluxes over the city of Cape Town, South Africa, through Bayesian inverse modelling

    NASA Astrophysics Data System (ADS)

    Nickless, Alecia; Rayner, Peter J.; Engelbrecht, Francois; Brunke, Ernst-Günther; Erni, Birgit; Scholes, Robert J.

    2018-04-01

    We present a city-scale inversion over Cape Town, South Africa. Measurement sites for atmospheric CO2 concentrations were installed at Robben Island and Hangklip lighthouses, located downwind and upwind of the metropolis. Prior estimates of the fossil fuel fluxes were obtained from a bespoke inventory analysis where emissions were spatially and temporally disaggregated and uncertainty estimates determined by means of error propagation techniques. Net ecosystem exchange (NEE) fluxes from biogenic processes were obtained from the land atmosphere exchange model CABLE (Community Atmosphere Biosphere Land Exchange). Uncertainty estimates were based on the estimates of net primary productivity. CABLE was dynamically coupled to the regional climate model CCAM (Conformal Cubic Atmospheric Model), which provided the climate inputs required to drive the Lagrangian particle dispersion model. The Bayesian inversion framework included a control vector where fossil fuel and NEE fluxes were solved for separately. Due to the large prior uncertainty prescribed to the NEE fluxes, the current inversion framework was unable to adequately distinguish between the fossil fuel and NEE fluxes, but the inversion was able to obtain improved estimates of the total fluxes within pixels and across the domain. The median of the uncertainty reductions of the total weekly flux estimates for the inversion domain of Cape Town was 28 %, but reached as high as 50 %. At the pixel level, uncertainty reductions of the total weekly flux reached up to 98 %, but these large uncertainty reductions were for NEE-dominated pixels. Improved corrections to the fossil fuel fluxes would be possible if the uncertainty around the prior NEE fluxes could be reduced. In order for this inversion framework to be operationalised for monitoring, reporting, and verification (MRV) of emissions from Cape Town, the NEE component of the CO2 budget needs to be better understood. 
Additional Δ14C and δ13C isotope measurements would be a beneficial component of an atmospheric monitoring programme aimed at MRV of CO2 for any city with significant biogenic influence, allowing improved separation of the NEE and fossil fuel contributions to the observed CO2 concentration.
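
    The uncertainty-reduction metric quoted above follows directly from a linear-Gaussian Bayesian inversion. The sketch below is a toy illustration, not the study's setup: the transport operator, flux values, and covariances are invented for the example. It shows how the posterior covariance yields the per-component reduction 1 - σ_post/σ_prior that the abstract reports.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy setup: 2 surface fluxes (fossil fuel, NEE) observed through a
    # transport operator H at 3 receptor sites (all values hypothetical).
    H = rng.uniform(0.1, 1.0, size=(3, 2))   # Jacobian of concentrations w.r.t. fluxes
    x_true = np.array([10.0, -3.0])          # "true" fluxes
    B = np.diag([4.0, 25.0])                 # prior covariance (NEE far more uncertain)
    R = 0.5 * np.eye(3)                      # observation-error covariance
    y = H @ x_true + rng.multivariate_normal(np.zeros(3), R)

    # Analytical Gaussian posterior: P = (H^T R^-1 H + B^-1)^-1
    Rinv = np.linalg.inv(R)
    P = np.linalg.inv(H.T @ Rinv @ H + np.linalg.inv(B))
    x_prior = np.zeros(2)
    x_post = x_prior + P @ H.T @ Rinv @ (y - H @ x_prior)

    # Uncertainty reduction per flux component, as in the abstract's metric
    reduction = 1.0 - np.sqrt(np.diag(P)) / np.sqrt(np.diag(B))
    ```

    With a better-constrained NEE prior (smaller second entry in B), the same algebra yields larger reductions for the fossil fuel component, which is the point the abstract makes about improving the prior NEE fluxes.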

  4. A model-averaging method for assessing groundwater conceptual model uncertainty.

    PubMed

    Ye, Ming; Pohlmann, Karl F; Chapman, Jenny B; Pohll, Greg M; Reeves, Donald M

    2010-01-01

    This study evaluates alternative groundwater models with different recharge and geologic components at the northern Yucca Flat area of the Death Valley Regional Flow System (DVRFS), USA. Recharge over the DVRFS has been estimated using five methods, and five geological interpretations are available at the northern Yucca Flat area. Combining the recharge and geological components together with additional modeling components that represent other hydrogeological conditions yields a total of 25 groundwater flow models. As all the models are plausible given available data and information, evaluating model uncertainty becomes inevitable. On the other hand, hydraulic parameters (e.g., hydraulic conductivity) are uncertain in each model, giving rise to parametric uncertainty. Propagation of the uncertainty in the models and model parameters through groundwater modeling causes predictive uncertainty in model predictions (e.g., hydraulic head and flow). Parametric uncertainty within each model is assessed using Monte Carlo simulation, and model uncertainty is evaluated using the model averaging method. Two model-averaging techniques (on the basis of information criteria and GLUE) are discussed. This study shows that contribution of model uncertainty to predictive uncertainty is significantly larger than that of parametric uncertainty. For the recharge and geological components, uncertainty in the geological interpretations has more significant effect on model predictions than uncertainty in the recharge estimates. In addition, weighted residuals vary more for the different geological models than for different recharge models. Most of the calibrated observations are not important for discriminating between the alternative models, because their weighted residuals vary only slightly from one model to another.
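
    The decomposition of predictive uncertainty into parametric (within-model) and conceptual (between-model) parts can be written down compactly. The following sketch uses hypothetical numbers for four alternative models (the study used 25) and Akaike-type information-criterion weights, one of the two weighting schemes the abstract mentions:

    ```python
    import numpy as np

    # Hypothetical predictions of hydraulic head from 4 alternative models,
    # each with its own within-model (parametric) variance from Monte Carlo.
    means = np.array([102.0, 98.5, 105.3, 100.1])  # per-model mean prediction
    var_param = np.array([1.2, 0.8, 1.5, 1.0])     # within-model variance
    aic = np.array([210.3, 208.1, 215.6, 209.0])   # information criterion per model

    # Akaike weights: w_k proportional to exp(-delta_k / 2)
    delta = aic - aic.min()
    w = np.exp(-0.5 * delta)
    w /= w.sum()

    # Model-averaged prediction; total variance splits into a weighted
    # parametric part plus a between-model part.
    mean_avg = np.sum(w * means)
    var_between = np.sum(w * (means - mean_avg) ** 2)
    var_total = np.sum(w * var_param) + var_between
    ```

    When var_between exceeds the weighted parametric variance, conceptual (model) uncertainty dominates the prediction, which is the pattern the study reports.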

  5. Estimating the risk of Amazonian forest dieback.

    PubMed

    Rammig, Anja; Jupp, Tim; Thonicke, Kirsten; Tietjen, Britta; Heinke, Jens; Ostberg, Sebastian; Lucht, Wolfgang; Cramer, Wolfgang; Cox, Peter

    2010-08-01

    Climate change will very likely affect most forests in Amazonia during the course of the 21st century, but the direction and intensity of the change are uncertain, in part because of differences in rainfall projections. In order to constrain this uncertainty, we estimate the probability of biomass change in Amazonia on the basis of rainfall projections that are weighted by climate model performance for current conditions. We estimate the risk of forest dieback by using weighted rainfall projections from 24 general circulation models (GCMs) to create probability density functions (PDFs) for future forest biomass changes simulated by a dynamic vegetation model (LPJmL). Our probabilistic assessment of biomass change suggests a likely shift towards increasing biomass compared with nonweighted results. Biomass estimates range between a gain of 6.2 and a loss of 2.7 kg carbon m-2 for the Amazon region, depending on the strength of CO2 fertilization. The uncertainty associated with the long-term effect of CO2 is much larger than that associated with precipitation change. This underlines the importance of reducing uncertainties in the direct effects of CO2 on tropical ecosystems.

  6. Drivers and uncertainties of forecasted range shifts for warm-water fishes under climate and land cover change

    USGS Publications Warehouse

    Bouska, Kristen; Whitledge, Gregory W.; Lant, Christopher; Schoof, Justin

    2018-01-01

    Land cover is an important determinant of aquatic habitat and is projected to shift with climate changes, yet climate-driven land cover changes are rarely factored into climate assessments. To quantify impacts and uncertainty of coupled climate and land cover change on warm-water fish species’ distributions, we used an ensemble model approach to project distributions of 14 species. For each species, current range projections were compared to 27 scenario-based projections and aggregated to visualize uncertainty. Multiple regression and model selection techniques were used to identify drivers of range change. Novel, or no-analogue, climates were assessed to evaluate transferability of models. Changes in total probability of occurrence ranged widely across species, from a 63% increase to a 65% decrease. Distributional gains and losses were largely driven by temperature and flow variables and underscore the importance of habitat heterogeneity and connectivity to facilitate adaptation to changing conditions. Finally, novel climate conditions were driven by mean annual maximum temperature, which stresses the importance of understanding the role of temperature on fish physiology and the role of temperature-mitigating management practices.

  7. The Impact of Uncertain Physical Parameters on HVAC Demand Response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Yannan; Elizondo, Marcelo A.; Lu, Shuai

    HVAC units are currently one of the major resources providing demand response (DR) in residential buildings. Models of HVAC with DR function can improve understanding of its impact on power system operations and facilitate the deployment of DR technologies. This paper investigates the importance of various physical parameters and their distributions to the HVAC response to DR signals, which is a key step in constructing HVAC models for a population of units with insufficient data. These parameters include floor area, insulation efficiency, the amount of solid mass in the house, and the efficiency of the HVAC units. These parameters are usually assumed to follow Gaussian or uniform distributions. We study the effect of uncertainty in the chosen parameter distributions on the aggregate HVAC response to DR signals, during the transient phase and in steady state. We use a quasi-Monte Carlo sampling method with linear regression and Prony analysis to evaluate the sensitivity of DR output to the uncertainty in the distribution parameters. A significance ranking of the uncertainty sources is given for future guidance in the modeling of HVAC demand response.
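
    The abstract's workflow, quasi-Monte Carlo sampling of uncertain physical parameters followed by regression to rank their influence, can be sketched with a toy steady-state HVAC load model. The parameter names, ranges, and response function here are invented placeholders, not the paper's model, and the Prony analysis of transients is omitted:

    ```python
    import numpy as np

    def halton(n, dims):
        """Simple Halton quasi-random sequence in [0, 1)^dims."""
        primes = [2, 3, 5, 7, 11, 13][:dims]
        out = np.empty((n, dims))
        for j, b in enumerate(primes):
            for i in range(n):
                f, r, k = 1.0, 0.0, i + 1
                while k > 0:
                    f /= b
                    r += f * (k % b)
                    k //= b
                out[i, j] = r
        return out

    # Hypothetical parameter ranges: floor area, insulation (UA), thermal
    # mass, unit efficiency (COP). Toy response = steady-state power draw.
    n = 512
    u = halton(n, 4)
    area = 100 + 150 * u[:, 0]     # m^2
    ua = 0.2 + 0.6 * u[:, 1]       # W/m^2/K
    mass = 1e4 + 4e4 * u[:, 2]     # J/K (affects transients only, not this response)
    cop = 2.5 + 1.5 * u[:, 3]

    power = area * ua * 10.0 / cop # toy model: 10 K indoor-outdoor difference

    # Standardized regression coefficients rank parameter importance
    X = np.column_stack([area, ua, mass, cop])
    Xs = (X - X.mean(0)) / X.std(0)
    ys = (power - power.mean()) / power.std()
    beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    ranking = np.argsort(-np.abs(beta))   # most to least influential
    ```

    Because thermal mass does not enter the toy steady-state response, the regression correctly ranks it last; in the paper's transient analysis it would matter.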

  8. Assessing uncertainty in the turbulent upper-ocean mixed layer using an unstructured finite-element solver

    NASA Astrophysics Data System (ADS)

    Pacheco, Luz; Smith, Katherine; Hamlington, Peter; Niemeyer, Kyle

    2017-11-01

    Vertical transport flux in the ocean upper mixed layer has recently been attributed to submesoscale currents, which occur at horizontal scales on the order of kilometers. These phenomena, which include fronts and mixed-layer instabilities, have been of particular interest due to the effect of turbulent mixing on nutrient transport, facilitating phytoplankton blooms. We study these phenomena using a non-hydrostatic large-eddy simulation of submesoscale ocean currents, developed with the extensible, open-source finite-element platform FEniCS. Our model solves the standard Boussinesq Euler equations in variational form using the finite element method. FEniCS enables parallel computing on modern systems for efficient computation and supports unstructured grids, so irregular topography can be considered in the future. The solver will be verified against the well-established NCAR-LES model and validated against observational data. For the verification with NCAR-LES, the velocity, pressure, and buoyancy fields are compared in a surface-wind-driven, open-ocean case. We use this model to study the impacts of uncertainties in model parameters, such as near-surface buoyancy flux and secondary circulation, and discuss implications.

  9. Measurement of the Antineutrino Double-Differential Charged-Current Quasi-Elastic Scattering Cross Section at MINERvA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patrick, Cheryl

    Next-generation neutrino oscillation experiments, such as DUNE and Hyper-Kamiokande, hope to measure charge-parity (CP) violation in the lepton sector. In order to do this, they must dramatically reduce their current levels of uncertainty, particularly those due to neutrino-nucleus interaction models. As CP violation is a measure of the difference between the oscillation properties of neutrinos and antineutrinos, data about how the less-studied antineutrinos interact is especially valuable. We present the MINERvA experiment's first double-differential scattering cross sections for antineutrinos on scintillator, in the few-GeV range relevant to experiments such as DUNE and NOvA. We also present total antineutrino-scintillator quasi-elastic cross sections as a function of energy, which we compare to measurements from previous experiments. As well as being useful to help reduce oscillation experiments' uncertainty, our data can also be used to study the prevalence of various correlation and final-state interaction effects within the nucleus. We compare to models produced by different event generators, and are able to draw first conclusions about the predictions of these models.

  10. A Bayesian Framework of Uncertainties Integration in 3D Geological Model

    NASA Astrophysics Data System (ADS)

    Liang, D.; Liu, X.

    2017-12-01

    3D geological models can describe complicated geological phenomena in an intuitive way, but their application may be limited by uncertain factors. Great progress has been made over the years, yet many studies decompose the uncertainties of a geological model and analyze each source separately, ignoring the comprehensive impact of multi-source uncertainties. To evaluate the synthetical uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework for uncertainty integration. With this framework, we integrate data errors, spatial randomness, and cognitive information into a posterior distribution to evaluate the synthetical uncertainty of a geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty is a form of simulation of that propagation. Bayesian inference accomplishes the uncertainty updating within the modeling process. The maximum entropy principle is effective for estimating the prior probability distribution, ensuring that the prior is subject to the constraints supplied by the given information with minimum prejudice. In the end, we obtain a posterior distribution that evaluates the synthetical uncertainty of the geological model and represents the combined impact of all uncertain factors on its spatial structure. The framework provides a solution for evaluating the combined impact of multi-source uncertainties on a geological model and a way to study the mechanism of uncertainty propagation in geological modeling.
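
    The gradual integration of uncertainty sources described above has a simple conjugate-Gaussian analogue: each source (data error, spatial randomness, cognitive information) updates the posterior in turn, and the variance after all updates is the combined, "synthetical" uncertainty. All numbers below are hypothetical, not from the study:

    ```python
    def gaussian_update(mu_prior, var_prior, obs, var_obs):
        """Conjugate normal-normal update: fold in one information source."""
        precision = 1.0 / var_prior + 1.0 / var_obs
        var_post = 1.0 / precision
        mu_post = var_post * (mu_prior / var_prior + obs / var_obs)
        return mu_post, var_post

    # Hypothetical elevation of a stratigraphic horizon at one location (m).
    mu, var = 0.0, 100.0              # diffuse prior (minimally informative)
    sources = [(-12.0, 4.0),          # borehole pick (small data error)
               (-9.0, 25.0),          # interpolated surface (spatial randomness)
               (-11.0, 16.0)]         # expert interpretation (cognitive information)
    for obs, var_obs in sources:
        mu, var = gaussian_update(mu, var, obs, var_obs)
    ```

    The posterior variance shrinks with each integrated source, so the final `var` is smaller than any single source's variance, mirroring the framework's accumulation of evidence.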

  11. Variances in the projections, resulting from CLIMEX, Boosted Regression Trees and Random Forests techniques

    NASA Astrophysics Data System (ADS)

    Shabani, Farzin; Kumar, Lalit; Solhjouy-fard, Samaneh

    2017-08-01

    The aim of this study was to compare and evaluate the capabilities of correlative and mechanistic modeling processes, applied to the projection of future distributions of date palm in novel environments, and to establish a method of minimizing uncertainty in the projections of differing techniques. The study area comprises the Middle Eastern countries. We compared the mechanistic model CLIMEX (CL) with the correlative models MaxEnt (MX), Boosted Regression Trees (BRT), and Random Forests (RF) to project current and future distributions of date palm (Phoenix dactylifera L.). The CSIRO-Mk3.0 (CS) global climate model (GCM), under the A2 emissions scenario, was selected for making projections. Both indigenous and alien distribution data of the species were utilized in the modeling process. The common areas predicted by MX, BRT, RF, and CL from the CS GCM were extracted and compared to ascertain the projection uncertainty levels of each individual technique. The common areas identified by all four modeling techniques were used to produce a map indicating suitable and unsuitable areas for date palm cultivation in Middle Eastern countries, for the present and the year 2100. The four modeling approaches predict fairly different distributions. Projections from CL were more conservative than those from MX. BRT and RF were the most conservative methods in terms of projections for the current time. The combination of the final CL and MX projections for the present and 2100 provides higher certainty concerning those areas that will become highly suitable for future date palm cultivation. According to the four models, cold, hot, and wet stress, with differences on a regional basis, appear to be the major restrictions on future date palm distribution. The results demonstrate variances in the projections resulting from the different techniques. 
The assessment and interpretation of model projections require caution, especially for correlative models such as MX, BRT, and RF. Intersections between different techniques may decrease uncertainty in future distribution projections. However, much of the remaining uncertainty arises because future GHG emission scenarios cannot be known with precision. Suggestions for improving projection methodology and processing are included.
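
    The "common areas identified by all four modeling techniques" amount to an intersection of binary suitability maps. A minimal sketch with synthetic rasters (random thresholds stand in for the real CL, MX, BRT, and RF outputs):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    shape = (50, 80)  # toy raster over the study region

    # Hypothetical binary suitability maps from the four techniques
    suit = {name: rng.random(shape) > t
            for name, t in [("CLIMEX", 0.5), ("MaxEnt", 0.45),
                            ("BRT", 0.6), ("RF", 0.6)]}

    # Consensus map: suitable only where all four techniques agree
    agreement = np.logical_and.reduce(list(suit.values()))

    # Fraction of cells each model marks suitable vs. the consensus
    frac_each = {k: v.mean() for k, v in suit.items()}
    frac_consensus = agreement.mean()
    ```

    By construction the consensus area is a subset of every individual projection, which is why intersecting techniques trades coverage for higher certainty, as the abstract argues.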

  12. Improving the driver-automation interaction: an approach using automation uncertainty.

    PubMed

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false system understanding of infallibility may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.

  13. Impact of uncertain head tissue conductivity in the optimization of transcranial direct current stimulation for an auditory target

    NASA Astrophysics Data System (ADS)

    Schmidt, Christian; Wagner, Sven; Burger, Martin; van Rienen, Ursula; Wolters, Carsten H.

    2015-08-01

    Objective. Transcranial direct current stimulation (tDCS) is a non-invasive brain stimulation technique to modify neural excitability. Using multi-array tDCS, we investigate the influence of inter-individually varying head tissue conductivity profiles on optimal electrode configurations for an auditory cortex stimulation. Approach. In order to quantify the uncertainty of the optimal electrode configurations, multi-variate generalized polynomial chaos expansions of the model solutions are used based on uncertain conductivity profiles of the compartments skin, skull, gray matter, and white matter. Stochastic measures, probability density functions, and sensitivity of the quantities of interest are investigated for each electrode and the current density at the target with the resulting stimulation protocols visualized on the head surface. Main results. We demonstrate that the optimized stimulation protocols are only comprised of a few active electrodes, with tolerable deviations in the stimulation amplitude of the anode. However, large deviations in the order of the uncertainty in the conductivity profiles could be noted in the stimulation protocol of the compensating cathodes. Regarding these main stimulation electrodes, the stimulation protocol was most sensitive to uncertainty in skull conductivity. Finally, the probability that the current density amplitude in the auditory cortex target region is supra-threshold was below 50%. Significance. The results suggest that an uncertain conductivity profile in computational models of tDCS can have a substantial influence on the prediction of optimal stimulation protocols for stimulation of the auditory cortex. The investigations carried out in this study present a possibility to predict the probability of providing a therapeutic effect with an optimized electrode system for future auditory clinical and experimental procedures of tDCS applications.
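
    Generalized polynomial chaos as used in the study can be illustrated in one uncertain dimension: expand a toy current-density response in probabilists' Hermite polynomials of a standard normal germ, then read the mean and variance directly off the coefficients. The conductivity values and the response function below are invented for illustration; the study's models are far richer (four uncertain tissue conductivities and full volume-conductor head models):

    ```python
    import numpy as np
    from numpy.polynomial.hermite_e import hermevander
    from math import factorial

    rng = np.random.default_rng(2)

    # Uncertain skull conductivity (toy values, S/m)
    mu_sigma, sd_sigma = 0.010, 0.002

    def current_density(sigma):
        """Toy surrogate for target current density vs. skull conductivity."""
        return 0.1 * sigma / (sigma + 0.005)

    # Non-intrusive PCE: regress model outputs on Hermite polynomials
    # of the standard normal germ xi.
    order = 4
    xi = rng.standard_normal(2000)
    y = current_density(mu_sigma + sd_sigma * xi)
    Psi = hermevander(xi, order)          # He_0 .. He_4 evaluated at samples
    coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)

    # Moments follow from orthogonality: E[He_m He_n] = n! * delta_mn
    pce_mean = coef[0]
    pce_var = sum(factorial(n) * coef[n] ** 2 for n in range(1, order + 1))
    ```

    The coefficient-based moments agree closely with plain Monte Carlo statistics of the same samples, but the expansion also yields sensitivity information for free, which is the appeal of the method for the tDCS study.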

  14. Lessons Learned from 2 Decades of Modelling Forest Dead Organic Matter and Soil Carbon at the National Scale

    NASA Astrophysics Data System (ADS)

    Shaw, C.; Kurz, W. A.; Metsaranta, J.; Bona, K. A.; Hararuk, O.; Smyth, C.

    2017-12-01

    The Carbon Budget Model of the Canadian Forest Sector (CBM-CFS3) is a forest carbon budget model that operates on individual stands. It is applied from regional to national scales in Canada for national and international reporting of GHG emissions and removals and in support of analyses of forest sector mitigation options and other scientific and policy questions. This presentation reviews the history and continuous improvement of dead organic matter (DOM) and soil carbon representations, from early model versions, in which DOM pools included only litter, downed deadwood and soil, to the current version, in which these pools are estimated separately (and new pools have been added) to better compare model estimates against field measurements. Uncertainty analyses consistently point at soil C pools as large sources of uncertainty. With the new ground plot measurements from the National Forest Inventory, and with a newly compiled forest soil carbon database, we have recently completed a model data assimilation exercise that helped reduce parameter uncertainties. Lessons learned from the continuous improvement process will be summarised, and we will discuss how model modifications have led to improved representation of DOM and soil carbon dynamics. We conclude by suggesting future research priorities that can advance DOM and soil carbon modelling in Canadian forest ecosystems.

  15. Model Sensitivity Studies of the Decrease in Atmospheric Carbon Tetrachloride

    NASA Technical Reports Server (NTRS)

    Chipperfield, Martyn P.; Liang, Qing; Rigby, Matt; Hossaini, Ryan; Montzka, Stephen A.; Dhomse, Sandip; Feng, Wuhu; Prinn, Ronald G.; Weiss, Ray F.; Harth, Christina M.; et al.

    2016-01-01

    Carbon tetrachloride (CCl4) is an ozone-depleting substance, which is controlled by the Montreal Protocol and for which the atmospheric abundance is decreasing. However, the current observed rate of this decrease is known to be slower than expected based on reported CCl4 emissions and its estimated overall atmospheric lifetime. Here we use a three-dimensional (3-D) chemical transport model to investigate the impact on its predicted decay of uncertainties in the rates at which CCl4 is removed from the atmosphere by photolysis, by ocean uptake and by degradation in soils. The largest sink is atmospheric photolysis (74% of total), but a reported 10% uncertainty in its combined photolysis cross section and quantum yield has only a modest impact on the modelled rate of CCl4 decay. This is partly due to the limiting effect of the rate of transport of CCl4 from the main tropospheric reservoir to the stratosphere, where photolytic loss occurs. The model suggests large interannual variability in the magnitude of this stratospheric photolysis sink caused by variations in transport. The impact of uncertainty in the minor soil sink (9% of total) is also relatively small. In contrast, the model shows that uncertainty in ocean loss (17% of total) has the largest impact on modelled CCl4 decay due to its sizeable contribution to CCl4 loss and large lifetime uncertainty range (147 to 241 years). With an assumed CCl4 emission rate of 39 Gg year-1, the reference simulation with the best estimate of loss processes still underestimates the observed CCl4 (overestimates the decay) over the past 2 decades but to a smaller extent than previous studies. Changes to the rate of CCl4 loss processes, in line with known uncertainties, could bring the model into agreement with in situ surface and remote-sensing measurements, as could an increase in emissions to around 47 Gg year-1. 
Further progress in constraining the CCl4 budget is partly limited by systematic biases between observational datasets. For example, surface observations from the National Oceanic and Atmospheric Administration (NOAA) network are larger than from the Advanced Global Atmospheric Gases Experiment (AGAGE) network but have shown a steeper decreasing trend over the past 2 decades. These differences imply a difference in emissions which is significant relative to uncertainties in the magnitudes of the CCl4 sinks.
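
    The sink fractions and lifetimes quoted above imply a simple one-box budget: partial lifetimes satisfy 1/tau_total = sum(1/tau_k), so each sink's partial lifetime is the total lifetime divided by its share of the loss. The sketch below assumes a total lifetime of 32 years (a commonly quoted figure, not stated in the abstract) and integrates the burden under the two emission cases discussed; the implied ocean partial lifetime lands inside the abstract's 147-241-year range:

    ```python
    # One-box budget for atmospheric CCl4 (illustrative numbers only).
    tau_total = 32.0                                # years; assumed, not from the abstract
    shares = {"photolysis": 0.74, "ocean": 0.17, "soil": 0.09}
    partial = {k: tau_total / s for k, s in shares.items()}  # sum(1/partial) = 1/tau_total

    def decay_rate(burden, emission, tau):
        """dB/dt (Gg/yr) for a one-box model: emissions in, first-order loss out."""
        return emission - burden / tau

    B0, dt, years = 1500.0, 0.1, 20                 # initial burden (Gg, hypothetical)
    results = {}
    for emission in (39.0, 47.0):                   # Gg/yr, the two cases in the abstract
        b = B0
        for _ in range(round(years / dt)):
            b += dt * decay_rate(b, emission, tau_total)
        results[emission] = b
    ```

    Higher emissions flatten the simulated decay, which is the sense in which raising emissions toward 47 Gg/yr can reconcile the modelled and observed trends.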

  16. Model sensitivity studies of the decrease in atmospheric carbon tetrachloride

    NASA Astrophysics Data System (ADS)

    Chipperfield, Martyn P.; Liang, Qing; Rigby, Matthew; Hossaini, Ryan; Montzka, Stephen A.; Dhomse, Sandip; Feng, Wuhu; Prinn, Ronald G.; Weiss, Ray F.; Harth, Christina M.; Salameh, Peter K.; Mühle, Jens; O'Doherty, Simon; Young, Dickon; Simmonds, Peter G.; Krummel, Paul B.; Fraser, Paul J.; Steele, L. Paul; Happell, James D.; Rhew, Robert C.; Butler, James; Yvon-Lewis, Shari A.; Hall, Bradley; Nance, David; Moore, Fred; Miller, Ben R.; Elkins, James W.; Harrison, Jeremy J.; Boone, Chris D.; Atlas, Elliot L.; Mahieu, Emmanuel

    2016-12-01

    Carbon tetrachloride (CCl4) is an ozone-depleting substance, which is controlled by the Montreal Protocol and for which the atmospheric abundance is decreasing. However, the current observed rate of this decrease is known to be slower than expected based on reported CCl4 emissions and its estimated overall atmospheric lifetime. Here we use a three-dimensional (3-D) chemical transport model to investigate the impact on its predicted decay of uncertainties in the rates at which CCl4 is removed from the atmosphere by photolysis, by ocean uptake and by degradation in soils. The largest sink is atmospheric photolysis (74 % of total), but a reported 10 % uncertainty in its combined photolysis cross section and quantum yield has only a modest impact on the modelled rate of CCl4 decay. This is partly due to the limiting effect of the rate of transport of CCl4 from the main tropospheric reservoir to the stratosphere, where photolytic loss occurs. The model suggests large interannual variability in the magnitude of this stratospheric photolysis sink caused by variations in transport. The impact of uncertainty in the minor soil sink (9 % of total) is also relatively small. In contrast, the model shows that uncertainty in ocean loss (17 % of total) has the largest impact on modelled CCl4 decay due to its sizeable contribution to CCl4 loss and large lifetime uncertainty range (147 to 241 years). With an assumed CCl4 emission rate of 39 Gg year⁻¹, the reference simulation with the best estimate of loss processes still underestimates the observed CCl4 (overestimates the decay) over the past 2 decades but to a smaller extent than previous studies. Changes to the rate of CCl4 loss processes, in line with known uncertainties, could bring the model into agreement with in situ surface and remote-sensing measurements, as could an increase in emissions to around 47 Gg year⁻¹.
Further progress in constraining the CCl4 budget is partly limited by systematic biases between observational datasets. For example, surface observations from the National Oceanic and Atmospheric Administration (NOAA) network are larger than from the Advanced Global Atmospheric Gases Experiment (AGAGE) network but have shown a steeper decreasing trend over the past 2 decades. These differences imply a difference in emissions which is significant relative to uncertainties in the magnitudes of the CCl4 sinks.
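The sink fractions quoted above follow simple budget arithmetic: partial lifetimes combine reciprocally, and each sink's share of the total loss is the ratio of the total lifetime to its partial lifetime. A minimal sketch, with illustrative partial-lifetime values chosen only to roughly reproduce the quoted fractions (not the study's actual numbers):

```python
# Sketch of the budget arithmetic implied above: partial lifetimes combine
# reciprocally, and sink fractions are tau_total / tau_sink. The lifetime
# values below are illustrative placeholders, not the study's numbers.

def total_lifetime(partial_lifetimes):
    """Combine partial lifetimes (years): 1/tau_total = sum(1/tau_i)."""
    return 1.0 / sum(1.0 / tau for tau in partial_lifetimes.values())

# Hypothetical partial lifetimes chosen so the fractions roughly match
# the abstract (photolysis ~74 %, ocean ~17 %, soil ~9 %):
taus = {"photolysis": 44.0, "ocean": 183.0, "soil": 375.0}

tau_tot = total_lifetime(taus)
fractions = {sink: tau_tot / tau for sink, tau in taus.items()}
```

This is why the wide ocean-lifetime range (147 to 241 years) matters: even a minor sink with a large relative uncertainty shifts the combined lifetime and hence the modelled decay rate.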

  17. Quantifying model-structure- and parameter-driven uncertainties in spring wheat phenology prediction with Bayesian analysis

    DOE PAGES

    Alderman, Phillip D.; Stanfill, Bryan

    2016-10-06

    Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. This study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.
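A random walk Metropolis update of the kind named in the study can be sketched in a few lines. The standard-normal log-posterior below is a toy stand-in for a real phenology-model posterior, and the step size, seed, and chain length are arbitrary illustrative choices:

```python
import math
import random

# Minimal random-walk Metropolis sketch; the Gaussian log-posterior is a
# toy stand-in for a phenology-model posterior.

def metropolis(log_post, x0, step, n, seed=0):
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n):
        prop = x + rng.gauss(0.0, step)             # symmetric proposal
        lp_prop = log_post(prop)
        # Accept with probability min(1, posterior ratio):
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            x, lp = prop, lp_prop
        samples.append(x)                           # keep current state either way
    return samples

# Stand-in posterior: standard normal, log density -x^2/2 (up to a constant).
chain = metropolis(lambda t: -0.5 * t * t, 0.0, 1.0, 20000)
mean = sum(chain) / len(chain)
var = sum((c - mean) ** 2 for c in chain) / len(chain)
```

The full chain (not just a point estimate) is what gives access to the posterior distributions the authors emphasize.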

  18. Uncertainty analysis of the Operational Simplified Surface Energy Balance (SSEBop) model at multiple flux tower sites

    USGS Publications Warehouse

    Chen, Mingshi; Senay, Gabriel B.; Singh, Ramesh K.; Verdin, James P.

    2016-01-01

    Evapotranspiration (ET) is an important component of the water cycle – ET from the land surface returns approximately 60% of the global precipitation back to the atmosphere. ET also plays an important role in energy transport among the biosphere, atmosphere, and hydrosphere. Current regional to global and daily to annual ET estimation relies mainly on surface energy balance (SEB) ET models or statistical and empirical methods driven by remote sensing data and various climatological databases. These models have uncertainties due to inevitable input errors, poorly defined parameters, and inadequate model structures. The eddy covariance measurements on water, energy, and carbon fluxes at the AmeriFlux tower sites provide an opportunity to assess the ET modeling uncertainties. In this study, we focused on uncertainty analysis of the Operational Simplified Surface Energy Balance (SSEBop) model for ET estimation at multiple AmeriFlux tower sites with diverse land cover characteristics and climatic conditions. The 8-day composite 1-km MODerate resolution Imaging Spectroradiometer (MODIS) land surface temperature (LST) was used as input land surface temperature for the SSEBop algorithms. The other input data were taken from the AmeriFlux database. Results of statistical analysis indicated that the SSEBop model performed well in estimating ET with an R2 of 0.86 between estimated ET and eddy covariance measurements at 42 AmeriFlux tower sites during 2001–2007. It was encouraging to see that the best performance was observed for croplands, where R2 was 0.92 with a root mean square error of 13 mm/month. The uncertainties or random errors from input variables and parameters of the SSEBop model led to monthly ET estimates with relative errors less than 20% across multiple flux tower sites distributed across different biomes. 
This uncertainty of the SSEBop model lies within the error range of other SEB models, suggesting that the systematic error or bias of the SSEBop model is within the normal range. This finding implies that the simplified parameterization of the SSEBop model did not significantly affect the accuracy of the ET estimate while increasing the ease of model setup for operational applications. The sensitivity analysis indicated that the SSEBop model is most sensitive to the input variables land surface temperature (LST) and reference ET (ETo) and to the parameters differential temperature (dT) and maximum ET scalar (Kmax), particularly during the non-growing season and in dry areas. In summary, the uncertainty assessment verifies that the SSEBop model is a reliable and robust method for large-area ET estimation. The SSEBop model estimates can be further improved by reducing errors in two input variables (ETo and LST) and two key parameters (Kmax and dT).
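The skill scores reported for SSEBop (R², RMSE) are straightforward to reproduce for any paired set of estimated and observed ET values. The monthly numbers below are invented for illustration, not AmeriFlux data:

```python
import math

# Evaluation-statistic sketch: R^2 as squared Pearson correlation, plus RMSE.
# The monthly ET values are illustrative, not AmeriFlux measurements.

def r_squared(obs, est):
    n = len(obs)
    mo, me = sum(obs) / n, sum(est) / n
    cov = sum((o - mo) * (e - me) for o, e in zip(obs, est))
    var_o = sum((o - mo) ** 2 for o in obs)
    var_e = sum((e - me) ** 2 for e in est)
    return cov * cov / (var_o * var_e)

def rmse(obs, est):
    return math.sqrt(sum((o - e) ** 2 for o, e in zip(obs, est)) / len(obs))

obs = [40.0, 55.0, 90.0, 120.0, 80.0, 30.0]   # "observed" ET, mm/month
est = [45.0, 50.0, 95.0, 110.0, 85.0, 35.0]   # "modelled" ET, mm/month
r2 = r_squared(obs, est)
err = rmse(obs, est)
```

Reporting both statistics matters: a high R² shows the model tracks seasonal variation, while RMSE (in mm/month) exposes the absolute error that feeds into water-budget applications.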

  19. Exploiting the MODIS albedos with the Two-stream Inversion Package (JRC-TIP): 2. Fractions of transmitted and absorbed fluxes in the vegetation and soil layers

    NASA Astrophysics Data System (ADS)

    Pinty, B.; Clerici, M.; Andredakis, I.; Kaminski, T.; Taberner, M.; Verstraete, M. M.; Gobron, N.; Plummer, S.; Widlowski, J.-L.

    2011-05-01

    The two-stream model parameters and associated uncertainties retrieved by inversion against MODIS broadband visible and near-infrared white sky surface albedos were discussed in a companion paper. The present paper concentrates on the partitioning of the solar radiation fluxes delivered by the Joint Research Centre Two-stream Inversion Package (JRC-TIP). The estimation of the various flux fractions related to the vegetation and the background layers separately capitalizes on the probability density functions of the model parameters discussed in the companion paper. The propagation of uncertainties from the observations to the model parameters is achieved via the Hessian of the cost function and yields a covariance matrix of posterior parameter uncertainties. This matrix is propagated to the radiation fluxes via the model's Jacobian matrix of first derivatives. Results exhibit a rather good spatiotemporal consistency given that the prior values on the model parameters are not specified as a function of land cover type and/or vegetation phenological states. A specific investigation based on a scenario imposing stringent conditions of leaf absorbing and scattering properties highlights the impact of such constraints that are, as a matter of fact, currently adopted in vegetation index approaches. Special attention is also given to snow-covered and snow-contaminated areas since these regions encompass significant reflectance changes that strongly affect land surface processes. A definite asset of the JRC-TIP lies in its capability to control and ultimately relax a number of assumptions that are often implicit in traditional approaches. These features greatly help us understand the discrepancies between the different data sets of land surface properties and fluxes that are currently available. Through a series of selected examples, the inverse procedure implemented in the JRC-TIP is shown to be robust, reliable, and compliant with large-scale processing requirements. 
Furthermore, this package ensures the physical consistency between the set of observations, the two-stream model parameters, and radiation fluxes. It also documents the retrieval of associated uncertainties.
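The propagation chain described above (cost-function Hessian → parameter covariance → Jacobian → flux covariance) reduces to the standard linear rule Σ_flux = J Σ_p Jᵀ. A pure-Python sketch with made-up 2×2 matrices, not JRC-TIP values:

```python
# Linear uncertainty propagation sketch: a posterior parameter covariance
# (the inverse Hessian of the cost function) mapped to flux space via the
# model Jacobian, Sigma_flux = J * Sigma_p * J^T. All numbers are made up.

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(a):
    return [list(row) for row in zip(*a)]

sigma_p = [[0.04, 0.01],    # hypothetical posterior parameter covariance
           [0.01, 0.09]]
jac = [[0.5, -0.2],         # hypothetical d(flux_i)/d(param_j)
       [0.1, 0.7]]

sigma_flux = matmul(matmul(jac, sigma_p), transpose(jac))
```

The result is symmetric with positive diagonal entries (flux variances); the off-diagonal terms carry the correlations between fluxes induced by shared parameter uncertainty.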

  20. Uncertainty Considerations for Ballistic Limit Equations

    NASA Technical Reports Server (NTRS)

    Schonberg, W. P.; Evans, H. J.; Williamsen, J. E.; Boyer, R. L.; Nakayama, G. S.

    2005-01-01

    The overall risk for any spacecraft system is typically determined using a Probabilistic Risk Assessment (PRA). A PRA attempts to determine the overall risk associated with a particular mission by factoring in all known risks (and their corresponding uncertainties, if known) to the spacecraft during its mission. The threat to mission and human life posed by the micro-meteoroid & orbital debris (MMOD) environment is one of these risks. NASA uses the BUMPER II program to provide point estimate predictions of MMOD risk for the Space Shuttle and the International Space Station. However, BUMPER II does not provide uncertainty bounds or confidence intervals for its predictions. With so many uncertainties believed to be present in the models used within BUMPER II, providing uncertainty bounds with BUMPER II results would appear to be essential to properly evaluating its predictions of MMOD risk. The uncertainties in BUMPER II come primarily from three areas: damage prediction/ballistic limit equations, environment models, and failure criteria definitions. In order to quantify the overall uncertainty bounds on MMOD risk predictions, the uncertainties in these three areas must be identified. In this paper, possible approaches through which uncertainty bounds can be developed for the various damage prediction and ballistic limit equations encoded within the shuttle and station versions of BUMPER II are presented and discussed. We begin the paper with a review of the current approaches used by NASA to perform a PRA for the Space Shuttle and the International Space Station, followed by a review of the results of a recent sensitivity analysis performed by NASA using the shuttle version of the BUMPER II code. Following a discussion of the various equations that are encoded in BUMPER II, we propose several possible approaches for establishing uncertainty bounds for the equations within BUMPER II. 
We conclude with an evaluation of these approaches and present a recommendation regarding which of them would be the most appropriate to follow.
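One generic way to attach bounds to a ballistic limit equation is Monte Carlo sampling of its coefficients. The toy power-law form and the coefficient distributions below are hypothetical illustrations of that idea, not equations or values from BUMPER II:

```python
import random

# Monte Carlo uncertainty-bound sketch for a ballistic limit equation.
# The power-law form d_c = k * v**e and both coefficient distributions
# are hypothetical, not BUMPER II equations.

def critical_diameter(k, v, e):
    return k * v ** e   # toy ballistic limit: critical particle diameter

rng = random.Random(1)
v = 7.0   # impact speed, km/s (illustrative)
samples = sorted(
    critical_diameter(rng.gauss(0.1, 0.01), v, rng.gauss(-0.5, 0.05))
    for _ in range(10000)
)
lo, hi = samples[250], samples[-251]   # empirical ~95 % bounds
nominal = critical_diameter(0.1, v, -0.5)
```

The spread between `lo` and `hi`, rather than the single `nominal` value, is what a PRA would need in order to report confidence intervals on MMOD risk.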

  1. Closing the Seasonal Ocean Surface Temperature Balance in the Eastern Tropical Oceans from Remote Sensing and Model Reanalyses

    NASA Technical Reports Server (NTRS)

    Roberts, J. Brent; Clayson, Carol A.

    2012-01-01

    The Eastern tropical ocean basins are regions of significant atmosphere-ocean interaction and are important to variability across subseasonal to decadal time scales. The numerous physical processes at play in these areas strain the abilities of coupled general circulation models to accurately reproduce observed upper ocean variability. Furthermore, limitations in the observing system of important terms in the surface temperature balance (e.g., turbulent and radiative heat fluxes, advection) introduce uncertainty into the analyses of processes controlling sea surface temperature variability. This study presents recent efforts to close the surface temperature balance through estimation of the terms in the mixed layer temperature budget using state-of-the-art remotely sensed and model-reanalysis derived products. A set of twelve net heat flux estimates constructed using combinations of radiative and turbulent heat flux products - including GEWEX-SRB, ISCCP-SRF, OAFlux, SeaFlux, among several others - are used with estimates of oceanic advection, entrainment, and mixed layer depth variability to investigate the seasonal variability of ocean surface temperatures. Particular emphasis is placed on how well the upper ocean temperature balance is, or is not, closed on these scales using the current generation of observational and model reanalysis products. That is, the magnitudes and spatial variability of residual imbalances are addressed. These residuals are placed into context within the current uncertainties of the surface net heat fluxes and the role of the mixed layer depth variability in scaling the impact of those uncertainties, particularly in the shallow mixed layers of the Eastern tropical ocean basins.
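The scaling role of mixed layer depth noted in the closing sentence is easy to quantify: the temperature tendency produced by a given net heat flux is inversely proportional to the layer depth. A back-of-envelope sketch with illustrative values (not numbers from the flux products named above):

```python
# Mixed layer heating sketch: tendency = Q_net / (rho * cp * h), so a given
# flux uncertainty matters most where the mixed layer is shallow.
# All values are illustrative placeholders.

RHO = 1025.0    # seawater density, kg/m^3
CP = 3990.0     # seawater specific heat, J/(kg K)

def heating_rate(q_net, mld):
    """Mixed layer temperature tendency (K/s) from net surface heat flux."""
    return q_net / (RHO * CP * mld)

q_net = 80.0                                # W/m^2
tend_shallow = heating_rate(q_net, 15.0)    # 15 m mixed layer
tend_deep = heating_rate(q_net, 60.0)       # 60 m mixed layer
per_month = tend_shallow * 30 * 86400       # K per 30-day month, shallow case
```

A fourfold difference in mixed layer depth translates a given flux uncertainty into a fourfold difference in the temperature-budget residual, which is why shallow Eastern-basin mixed layers amplify heat flux errors.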

  2. Model structures amplify uncertainty in predicted soil carbon responses to climate change.

    PubMed

    Shi, Zheng; Crowell, Sean; Luo, Yiqi; Moore, Berrien

    2018-06-04

    Large model uncertainty in projected future soil carbon (C) dynamics has been well documented. However, our understanding of the sources of this uncertainty is limited. Here we quantify the uncertainties arising from model parameters, structures and their interactions, and how those uncertainties propagate through different models to projections of future soil carbon stocks. Both the vertically resolved model and the microbial explicit model project much greater uncertainty in the soil C response to climate change than the conventional soil C model, with both positive and negative C-climate feedbacks possible, whereas the conventional model consistently predicts a positive soil C-climate feedback. Our findings suggest that diverse model structures are necessary to increase confidence in soil C projection. However, the larger uncertainty in the complex models also suggests that we need to strike a balance between model complexity and the need to include diverse model structures in order to forecast soil C dynamics with high confidence and low uncertainty.

  3. Technical Evaluation of the NASA Model for Cancer Risk to Astronauts Due to Space Radiation

    NASA Technical Reports Server (NTRS)

    2012-01-01

    At the request of NASA, the National Research Council's (NRC's) Committee for Evaluation of Space Radiation Cancer Risk Model reviewed a number of changes that NASA proposes to make to its model for estimating the risk of radiation-induced cancer in astronauts. The NASA model in current use was last updated in 2005, and the proposed model would incorporate recent research directed at improving the quantification and understanding of the health risks posed by the space radiation environment. NASA's proposed model is defined by the 2011 NASA report Space Radiation Cancer Risk Projections and Uncertainties--2010. The committee's evaluation is based primarily on this source, which is referred to hereafter as the 2011 NASA report, with mention of specific sections or tables. The overall process for estimating cancer risks due to low linear energy transfer (LET) radiation exposure has been fully described in reports by a number of organizations. The approaches described in the reports from all of these expert groups are quite similar. NASA's proposed space radiation cancer risk assessment model calculates, as its main output, age- and gender-specific risk of exposure-induced death (REID) for use in the estimation of mission- and astronaut-specific cancer risk. The model also calculates the associated uncertainties in REID. The general approach for estimating risk and uncertainty in the proposed model is broadly similar to that used for the current (2005) NASA model and is based on recommendations by the National Council on Radiation Protection and Measurements. 
However, NASA's proposed model has significant changes with respect to the following: the integration of new findings and methods into its components by taking into account newer epidemiological data and analyses, new radiobiological data indicating that quality factors differ for leukemia and solid cancers, an improved method for specifying quality factors in terms of radiation track structure concepts as opposed to the previous approach based on linear energy transfer, the development of a new solar particle event (SPE) model, and the updates to galactic cosmic ray (GCR) and shielding transport models. The newer epidemiological information includes updates to the cancer incidence rates from the life span study (LSS) of the Japanese atomic bomb survivors, transferred to the U.S. population and converted to cancer mortality rates from U.S. population statistics. In addition, the proposed model provides an alternative analysis applicable to lifetime never-smokers (NSs). Details of the uncertainty analysis in the model have also been updated and revised. NASA's proposed model and associated uncertainties are complex in their formulation and as such require a very clear and precise set of descriptions. The committee found the 2011 NASA report challenging to review largely because of the lack of clarity in the model descriptions and derivation of the various parameters used. The committee requested some clarifications from NASA throughout its review and was able to resolve many, but not all, of the ambiguities in the written description.

  4. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harper, F.T.; Young, M.L.; Miller, L.A.

    Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

  5. Uncertainty in Population Estimates for Endangered Animals and Improving the Recovery Process

    PubMed Central

    Haines, Aaron M.; Zak, Matthew; Hammond, Katie; Scott, J. Michael; Goble, Dale D.; Rachlow, Janet L.

    2013-01-01

    Simple Summary: The objective of our study was to evaluate the mention of uncertainty (i.e., variance) associated with population size estimates within U.S. recovery plans for endangered animals. To do this we reviewed all finalized recovery plans for listed terrestrial vertebrate species. We found that more recent recovery plans reported more estimates of population size and uncertainty. Also, bird and mammal recovery plans reported more estimates of population size and uncertainty. We recommend that updated recovery plans combine uncertainty of population size estimates with a minimum detectable difference to aid in successful recovery.

    Abstract: United States recovery plans contain biological information for a species listed under the Endangered Species Act and specify recovery criteria to provide a basis for species recovery. The objective of our study was to evaluate whether recovery plans provide uncertainty (e.g., variance) with estimates of population size. We reviewed all finalized recovery plans for listed terrestrial vertebrate species to record the following data: (1) if a current population size was given, (2) if a measure of uncertainty or variance was associated with current estimates of population size and (3) if population size was stipulated for recovery. We found that 59% of completed recovery plans specified a current population size, 14.5% specified a variance for the current population size estimate and 43% specified population size as a recovery criterion. More recent recovery plans reported more estimates of current population size, uncertainty and population size as a recovery criterion. Also, bird and mammal recovery plans reported more estimates of population size and uncertainty compared to reptiles and amphibians. 
We suggest calculating minimum detectable differences to improve confidence when delisting endangered animals, and we identify incentives for individuals to become involved in recovery planning to improve access to quantitative data. PMID:26479531
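The minimum detectable difference (MDD) the authors recommend can be sketched with a standard two-sample normal approximation; the significance level, power, and survey numbers below are illustrative assumptions, not values from the recovery plans:

```python
import math

# Minimum detectable difference (MDD) sketch under a two-sample normal
# approximation; alpha = 0.05 (two-sided) and power = 0.80 give the z
# values used as defaults. The survey numbers are illustrative.

def mdd(sd, n, z_alpha=1.96, z_beta=0.84):
    """Smallest mean difference detectable between two surveys of size n."""
    return (z_alpha + z_beta) * sd * math.sqrt(2.0 / n)

detectable = mdd(sd=50.0, n=10)    # e.g. SD of 50 animals over 10 survey plots
```

Quadrupling the sampling effort halves the MDD, which is one concrete way reported variance in population estimates can be tied to delisting decisions.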

  6. Accounting for uncertainty in model-based prevalence estimation: paratuberculosis control in dairy herds.

    PubMed

    Davidson, Ross S; McKendrick, Iain J; Wood, Joanna C; Marion, Glenn; Greig, Alistair; Stevenson, Karen; Sharp, Michael; Hutchings, Michael R

    2012-09-10

    A common approach to the application of epidemiological models is to determine a single (point estimate) parameterisation using the information available in the literature. However, in many cases there is considerable uncertainty about parameter values, reflecting both the incomplete nature of current knowledge and natural variation, for example between farms. Furthermore model outcomes may be highly sensitive to different parameter values. Paratuberculosis is an infection for which many of the key parameter values are poorly understood and highly variable, and for such infections there is a need to develop and apply statistical techniques which make maximal use of available data. A technique based on Latin hypercube sampling combined with a novel reweighting method was developed which enables parameter uncertainty and variability to be incorporated into a model-based framework for estimation of prevalence. The method was evaluated by applying it to a simulation of paratuberculosis in dairy herds which combines a continuous time stochastic algorithm with model features such as within herd variability in disease development and shedding, which have not been previously explored in paratuberculosis models. Generated sample parameter combinations were assigned a weight, determined by quantifying the model's resultant ability to reproduce prevalence data. Once these weights are generated the model can be used to evaluate other scenarios such as control options. To illustrate the utility of this approach these reweighted model outputs were used to compare standard test and cull control strategies both individually and in combination with simple husbandry practices that aim to reduce infection rates. The technique developed has been shown to be applicable to a complex model incorporating realistic control options. 
For models where parameters are not well known or subject to significant variability, the reweighting scheme allowed estimated distributions of parameter values to be combined with additional sources of information, such as that available from prevalence distributions, resulting in outputs which implicitly handle variation and uncertainty. This methodology allows for more robust predictions from modelling approaches by allowing for parameter uncertainty and combining different sources of information, and is thus expected to be useful in application to a large number of disease systems.
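The core of the method, stratified (Latin hypercube) parameter sampling followed by reweighting against observed prevalence, can be sketched as follows. The one-parameter "model" and the Gaussian weighting kernel are toy stand-ins for the herd simulation and its fit-to-data score:

```python
import math
import random

# Latin hypercube sampling plus reweighting sketch. The toy "model"
# (output == parameter) and Gaussian kernel stand in for the herd
# simulation and its score against prevalence data.

def latin_hypercube(n, rng):
    """n stratified samples on [0, 1): one per equal-width stratum, shuffled."""
    strata = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(strata)
    return strata

rng = random.Random(42)
params = latin_hypercube(200, rng)
outputs = list(params)              # toy model: simulated prevalence == parameter

target = 0.3                        # "observed" prevalence to score against
weights = [math.exp(-0.5 * ((o - target) / 0.05) ** 2) for o in outputs]
total = sum(weights)
weights = [w / total for w in weights]

# Reweighted estimate of prevalence under parameter uncertainty:
estimate = sum(w * o for w, o in zip(weights, outputs))
```

Once the weights are in hand, the same weighted ensemble can be reused to score control scenarios, which is the step the authors exploit for evaluating test-and-cull strategies.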

  7. Next-Generation Tools For Next-Generation Surveys

    NASA Astrophysics Data System (ADS)

    Murray, S. G.

    2017-04-01

    The next generation of large-scale galaxy surveys, across the electromagnetic spectrum, loom on the horizon as explosively game-changing datasets, in terms of our understanding of cosmology and structure formation. We are on the brink of a torrent of data that is set to both confirm and constrain current theories to an unprecedented level, and potentially overturn many of our conceptions. One of the great challenges of this forthcoming deluge is to extract maximal scientific content from the vast array of raw data. This challenge requires not only well-understood and robust physical models, but a commensurate network of software implementations with which to efficiently apply them. The halo model, a semi-analytic treatment of cosmological spatial statistics down to nonlinear scales, provides an excellent mathematical framework for exploring the nature of dark matter. This thesis presents a next-generation toolkit based on the halo model formalism, intended to fulfil the requirements of next-generation surveys. Our toolkit comprises three tools: (i) hmf, a comprehensive and flexible calculator for halo mass functions (HMFs) within extended Press-Schechter theory, (ii) the MRP distribution for extremely efficient analytic characterisation of HMFs, and (iii) halomod, an extension of hmf which provides support for the full range of halo model components. In addition to the development and technical presentation of these tools, we apply each to the task of physical modelling. With hmf, we determine the precision of our knowledge of the HMF, due to uncertainty in our knowledge of the cosmological parameters, over the past decade of cosmic microwave background (CMB) experiments. We place rule-of-thumb uncertainties on the predicted HMF for the Planck cosmology, and find that current limits on the precision are driven by modeling uncertainties rather than those from cosmological parameters. 
With the MRP, we create and test a method for robustly fitting the HMF to observed masses with arbitrary measurement uncertainties on a per-object basis. We find that our method reduces estimation uncertainty on parameters by over 50%, and correctly accounts for Eddington bias even in extremely poorly measured data. Additionally, we use the analytical properties of the MRP to obtain asymptotically correct forms for the stellar-mass halo-mass relation, in the subhalo abundance matching scheme. Finally, with halomod, we explore the viability of the halo model as a test of warm dark matter (WDM) via galaxy clustering. Examining three distinct scale regimes, we find that the clustering of galaxies at the smallest resolvable scales may provide a valuable independent probe in the coming era.

  8. Model parameter uncertainty analysis for an annual field-scale P loss model

    NASA Astrophysics Data System (ADS)

    Bolster, Carl H.; Vadas, Peter A.; Boykin, Debbie

    2016-08-01

    Phosphorus (P) fate and transport models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. Because all models are simplifications of complex systems, there will exist an inherent amount of uncertainty associated with their predictions. It is therefore important that efforts be directed at identifying, quantifying, and communicating the different sources of model uncertainties. In this study, we conducted an uncertainty analysis with the Annual P Loss Estimator (APLE) model. Our analysis included calculating parameter uncertainties and confidence and prediction intervals for five internal regression equations in APLE. We also estimated uncertainties of the model input variables based on values reported in the literature. We then predicted P loss for a suite of fields under different management and climatic conditions while accounting for uncertainties in the model parameters and inputs and compared the relative contributions of these two sources of uncertainty to the overall uncertainty associated with predictions of P loss. Both the overall magnitude of the prediction uncertainties and the relative contributions of the two sources of uncertainty varied depending on management practices and field characteristics. This was due to differences in the number of model input variables and the uncertainties in the regression equations associated with each P loss pathway. Inspection of the uncertainties in the five regression equations brought attention to a previously unrecognized limitation with the equation used to partition surface-applied fertilizer P between leaching and runoff losses. As a result, an alternate equation was identified that provided similar predictions with much less uncertainty. Our results demonstrate how a thorough uncertainty and model residual analysis can be used to identify limitations with a model. 
Such insight can then be used to guide future data collection and model development and evaluation efforts.
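The parameter-uncertainty bookkeeping described, standard errors with confidence and prediction intervals for regression equations, can be sketched for a simple linear regression. The data and the 95 % t critical value (2.306 for n - 2 = 8 degrees of freedom) are illustrative; these are not APLE's internal equations:

```python
import math

# Simple-linear-regression sketch: slope/intercept estimates plus a 95 %
# prediction interval at a new x. Data and t value are illustrative.

def fit_with_intervals(x, y, x_new, t_crit=2.306):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    a = my - b * mx
    s2 = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y)) / (n - 2)
    # Prediction-interval half-width (includes the residual variance term):
    half = t_crit * math.sqrt(s2 * (1.0 + 1.0 / n + (x_new - mx) ** 2 / sxx))
    return a, b, a + b * x_new, half

x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8, 12.2, 13.9, 16.1, 18.0, 20.2]
a, b, y_hat, half = fit_with_intervals(x, y, x_new=5.5)
```

Prediction intervals widen away from the mean of the calibration data, which is one reason prediction uncertainty varies with field characteristics: fields far from the conditions used to fit an equation inherit wider bounds.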

  9. Greenhouse gas network design using backward Lagrangian particle dispersion modelling - Part 1: Methodology and Australian test case

    NASA Astrophysics Data System (ADS)

    Ziehn, T.; Nickless, A.; Rayner, P. J.; Law, R. M.; Roff, G.; Fraser, P.

    2014-09-01

    This paper describes the generation of optimal atmospheric measurement networks for determining carbon dioxide fluxes over Australia using inverse methods. A Lagrangian particle dispersion model is used in reverse mode together with a Bayesian inverse modelling framework to calculate the relationship between weekly surface fluxes, comprising contributions from the biosphere and fossil fuel combustion, and hourly concentration observations for the Australian continent. Meteorological driving fields are provided by the regional version of the Australian Community Climate and Earth System Simulator (ACCESS) at 12 km resolution at an hourly timescale. Prior uncertainties are derived on a weekly timescale for biosphere fluxes and fossil fuel emissions from high-resolution model runs using the Community Atmosphere Biosphere Land Exchange (CABLE) model and the Fossil Fuel Data Assimilation System (FFDAS) respectively. The influence from outside the modelled domain is investigated, but proves to be negligible for the network design. Existing ground-based measurement stations in Australia are assessed in terms of their ability to constrain local flux estimates from the land. We find that the six stations that are currently operational are already able to reduce the uncertainties on surface flux estimates by about 30%. A candidate list of 59 stations is generated based on logistic constraints and an incremental optimisation scheme is used to extend the network of existing stations. In order to achieve an uncertainty reduction of about 50%, we need to double the number of measurement stations in Australia. Assuming equal data uncertainties for all sites, new stations would be mainly located in the northern and eastern part of the continent.
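The uncertainty-reduction metric used in the network design follows from the Bayesian update of a prior flux variance by station observations. A scalar sketch is given below; real designs use full covariance matrices, and the sensitivity and data-error values here are invented for illustration:

```python
# Scalar sketch of the Bayesian uncertainty-reduction metric used in
# network design: each station adds h^2 / r to the flux precision.
# The sensitivity h and data variance r are invented illustrations.

def posterior_var(prior_var, sensitivities, data_var):
    """1D Bayesian update: 1/p_post = 1/p_prior + sum(h^2 / r)."""
    precision = 1.0 / prior_var + sum(h * h / data_var for h in sensitivities)
    return 1.0 / precision

prior = 1.0
one_station = posterior_var(prior, [0.8], data_var=0.5)
two_stations = posterior_var(prior, [0.8, 0.8], data_var=0.5)

# Fractional reduction in flux standard deviation:
reduction_1 = 1.0 - (one_station / prior) ** 0.5
reduction_2 = 1.0 - (two_stations / prior) ** 0.5
```

The diminishing returns visible here (the second identical station reduces uncertainty by less than the first) are why incremental optimisation over candidate sites, rather than simply adding stations anywhere, is needed to reach a target such as 50 % reduction.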

  10. Modeling tidal hydrodynamics of San Diego Bay, California

    USGS Publications Warehouse

    Wang, P.-F.; Cheng, R.T.; Richter, K.; Gross, E.S.; Sutton, D.; Gartner, J.W.

    1998-01-01

In 1983, current data were collected by the National Oceanic and Atmospheric Administration using mechanical current meters. During 1992 through 1996, acoustic Doppler current profilers as well as mechanical current meters and tide gauges were used. These measurements not only document tides and tidal currents in San Diego Bay, but also provide independent data sets for model calibration and verification. A high-resolution (100-m grid), depth-averaged, numerical hydrodynamic model has been implemented for San Diego Bay to describe essential tidal hydrodynamic processes in the bay. The model is calibrated using the 1983 data set and verified using the more recent 1992-1996 data. Discrepancies between model predictions and field data in both model calibration and verification are on the order of the uncertainties in the field data. The calibrated and verified numerical model has been used to quantify residence time and dilution and flushing of contaminant effluent into San Diego Bay. Furthermore, the numerical model has become an important research tool in ongoing hydrodynamic and water quality studies and in guiding future field data collection programs.

  11. Calculating salt loads to Great Salt Lake and the associated uncertainties for water year 2013; updating a 48 year old standard

    USGS Publications Warehouse

    Shope, Christopher L.; Angeroth, Cory E.

    2015-01-01

Effective management of surface waters requires a robust understanding of spatiotemporal constituent loadings from upstream sources and the uncertainty associated with these estimates. We compared the total dissolved solids (TDS) loading into the Great Salt Lake (GSL) for water year 2013 with estimates of previously sampled periods in the early 1960s. We also provide updated results on GSL loading, quantitatively bounded by sampling uncertainties, which are useful for current and future management efforts. Our statistical loading results were more accurate than those from simple regression models. Our results indicate that TDS loading to the GSL in water year 2013 was 14.6 million metric tons with uncertainty ranging from 2.8 to 46.3 million metric tons, which varies greatly from previous regression estimates for water year 1964 of 2.7 million metric tons. Results also indicate that locations with increased sampling frequency are correlated with decreasing confidence intervals. Because time is incorporated into the LOADEST (USGS load-estimation software) models, discrepancies are largely expected to be a function of temporally lagged salt storage delivery to the GSL associated with terrestrial and in-stream processes. By incorporating temporally variable estimates and statistically derived uncertainty of these estimates, we have provided quantifiable variability in the annual estimates of dissolved solids loading into the GSL. Further, our results support the need for increased monitoring of dissolved solids loading into saline lakes like the GSL by demonstrating the uncertainty associated with different levels of sampling frequency.
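The time-dependent, LOADEST-style rating-curve approach behind such load estimates can be illustrated with a minimal sketch. This is a simplified stand-in (two of LOADEST's seven terms dropped, fully synthetic data, no uncertainty bounds); none of the numbers come from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)                  # decimal time through the water year
q = np.exp(rng.normal(3.0, 0.5, t.size))        # synthetic daily discharge
true_load = 0.8 * q**1.2 * (1 + 0.3 * np.sin(2 * np.pi * t))
samples = true_load * np.exp(rng.normal(0, 0.1, t.size))  # noisy sampled loads

# Simplified LOADEST-style rating curve: ln L = b0 + b1*lnQ + b2*t + seasonal terms
X = np.column_stack([np.ones_like(t), np.log(q), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
y = np.log(samples)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Retransformation bias correction: exp of a fitted ln-load underestimates the mean
annual_load = np.sum(np.exp(X @ beta) * np.exp(resid.var() / 2))
print(f"annual load estimate: {annual_load:.0f} (true: {true_load.sum():.0f})")
```

Because the regression is fit in log space, summing back-transformed daily predictions without the bias-correction factor would systematically understate the annual load.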

  12. A Framework for Modeling Emerging Diseases to Inform Management

    PubMed Central

    Katz, Rachel A.; Richgels, Katherine L.D.; Walsh, Daniel P.; Grant, Evan H.C.

    2017-01-01

    The rapid emergence and reemergence of zoonotic diseases requires the ability to rapidly evaluate and implement optimal management decisions. Actions to control or mitigate the effects of emerging pathogens are commonly delayed because of uncertainty in the estimates and the predicted outcomes of the control tactics. The development of models that describe the best-known information regarding the disease system at the early stages of disease emergence is an essential step for optimal decision-making. Models can predict the potential effects of the pathogen, provide guidance for assessing the likelihood of success of different proposed management actions, quantify the uncertainty surrounding the choice of the optimal decision, and highlight critical areas for immediate research. We demonstrate how to develop models that can be used as a part of a decision-making framework to determine the likelihood of success of different management actions given current knowledge. PMID:27983501

  13. A Framework for Modeling Emerging Diseases to Inform Management.

    PubMed

    Russell, Robin E; Katz, Rachel A; Richgels, Katherine L D; Walsh, Daniel P; Grant, Evan H C

    2017-01-01

    The rapid emergence and reemergence of zoonotic diseases requires the ability to rapidly evaluate and implement optimal management decisions. Actions to control or mitigate the effects of emerging pathogens are commonly delayed because of uncertainty in the estimates and the predicted outcomes of the control tactics. The development of models that describe the best-known information regarding the disease system at the early stages of disease emergence is an essential step for optimal decision-making. Models can predict the potential effects of the pathogen, provide guidance for assessing the likelihood of success of different proposed management actions, quantify the uncertainty surrounding the choice of the optimal decision, and highlight critical areas for immediate research. We demonstrate how to develop models that can be used as a part of a decision-making framework to determine the likelihood of success of different management actions given current knowledge.

  14. A framework for modeling emerging diseases to inform management

    USGS Publications Warehouse

    Russell, Robin E.; Katz, Rachel A.; Richgels, Katherine L. D.; Walsh, Daniel P.; Grant, Evan H. Campbell

    2017-01-01

    The rapid emergence and reemergence of zoonotic diseases requires the ability to rapidly evaluate and implement optimal management decisions. Actions to control or mitigate the effects of emerging pathogens are commonly delayed because of uncertainty in the estimates and the predicted outcomes of the control tactics. The development of models that describe the best-known information regarding the disease system at the early stages of disease emergence is an essential step for optimal decision-making. Models can predict the potential effects of the pathogen, provide guidance for assessing the likelihood of success of different proposed management actions, quantify the uncertainty surrounding the choice of the optimal decision, and highlight critical areas for immediate research. We demonstrate how to develop models that can be used as a part of a decision-making framework to determine the likelihood of success of different management actions given current knowledge.

  15. Realising the Uncertainty Enabled Model Web

    NASA Astrophysics Data System (ADS)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. 
We describe two utility services to address conversion between uncertainty types, and between the spatial / temporal support of service inputs / outputs. Finally we describe the tools being generated within the UncertWeb project, considering three main aspects: i) Elicitation of uncertainties on model inputs. We are developing tools to enable domain experts to provide judgements about input uncertainties from UncertWeb model components (e.g. parameters in meteorological models) which allow panels of experts to engage in the process and reach a consensus view on the current knowledge / beliefs about that parameter or variable. We are developing systems for continuous and categorical variables as well as stationary spatial fields. ii) Visualisation of the resulting uncertain outputs from the end of the workflow, but also at intermediate steps. At this point we have prototype implementations driven by the requirements from the use cases that motivate UncertWeb. iii) Sensitivity and uncertainty analysis on model outputs. Here we show the design of the overall system we are developing, including the deployment of an emulator framework to allow computationally efficient approaches. We conclude with a summary of the open issues and remaining challenges we are facing in UncertWeb, and provide a brief overview of how we plan to tackle these.

  16. Testing charged current quasi-elastic and multinucleon interaction models in the NEUT neutrino interaction generator with published datasets from the MiniBooNE and MINERνA experiments

    NASA Astrophysics Data System (ADS)

    Wilkinson, C.; Terri, R.; Andreopoulos, C.; Bercellie, A.; Bronner, C.; Cartwright, S.; de Perio, P.; Dobson, J.; Duffy, K.; Furmanski, A. P.; Haegel, L.; Hayato, Y.; Kaboth, A.; Mahn, K.; McFarland, K. S.; Nowak, J.; Redij, A.; Rodrigues, P.; Sánchez, F.; Schwehr, J. D.; Sinclair, P.; Sobczyk, J. T.; Stamoulis, P.; Stowell, P.; Tacik, R.; Thompson, L.; Tobayama, S.; Wascko, M. O.; Żmuda, J.

    2016-04-01

There has been a great deal of theoretical work on sophisticated charged current quasi-elastic (CCQE) neutrino interaction models in recent years, prompted by a number of experimental results that measured unexpectedly large CCQE cross sections on nuclear targets. As the dominant interaction mode at T2K energies, and the signal process in oscillation analyses, it is important for the T2K experiment to include realistic CCQE cross section uncertainties in its analyses. To this end, T2K's Neutrino Interaction Working Group has implemented a number of recent models in NEUT, T2K's primary neutrino interaction event generator. In this paper, we give an overview of the models implemented and present fits to published νμ and ν̄μ CCQE cross section measurements from the MiniBooNE and MINERνA experiments. The results of the fits are used to select a default cross section model for future T2K analyses and to constrain the cross section uncertainties of the model. We find strong tension between datasets for all models investigated. Among the evaluated models, the combination of a modified relativistic Fermi gas with multinucleon CCQE-like interactions gives the most consistent description of the available data.

  17. The Importance of Uncertainty and Sensitivity Analysis in Process-based Models of Carbon and Nitrogen Cycling in Terrestrial Ecosystems with Particular Emphasis on Forest Ecosystems — Selected Papers from a Workshop Organized by the International Society for Ecological Modelling (ISEM) at the Third Biennal Meeting of the International Environmental Modelling and Software Society (IEMSS) in Burlington, Vermont, USA, August 9-13, 2006

    USGS Publications Warehouse

    Larocque, Guy R.; Bhatti, Jagtar S.; Liu, Jinxun; Ascough, James C.; Gordon, Andrew M.

    2008-01-01

Many process-based models of carbon (C) and nitrogen (N) cycles have been developed for terrestrial ecosystems, including forest ecosystems. They address many basic issues of ecosystem structure and functioning, such as the role of internal feedback in ecosystem dynamics. The critical factor in these phenomena is scale, as these processes operate at scales from the minute (e.g. particulate pollution impacts on trees and other organisms) to the global (e.g. climate change). Research efforts remain important to improve the capability of such models to better represent the dynamics of terrestrial ecosystems, including the C, nutrient (e.g. N) and water cycles. Existing models are sufficiently well advanced to help decision makers develop sustainable management policies and planning of terrestrial ecosystems, as they make realistic predictions when used appropriately. However, decision makers must be aware of their limitations by having the opportunity to evaluate the uncertainty associated with process-based models (Smith and Heath, 2001; Allen et al., 2004). The variation in scale of issues currently being addressed by modelling efforts makes the evaluation of uncertainty a daunting task.

  18. Theory of choice in bandit, information sampling and foraging tasks.

    PubMed

    Averbeck, Bruno B

    2015-03-01

Decision making has been studied with a wide array of tasks. Here we examine the theoretical structure of bandit, information sampling and foraging tasks. These tasks move beyond tasks where the choice in the current trial does not affect future expected rewards. We have modeled these tasks using Markov decision processes (MDPs). MDPs provide a general framework for modeling tasks in which decisions affect the information on which future choices will be made. Under the assumption that agents are maximizing expected rewards, MDPs provide normative solutions. We find that all three classes of tasks pose choices among actions which trade off immediate and future expected rewards. The tasks drive these trade-offs in unique ways, however. For bandit and information sampling tasks, increasing uncertainty or the time horizon shifts value to actions that pay off in the future. Correspondingly, decreasing uncertainty increases the relative value of actions that pay off immediately. For foraging tasks the time horizon plays the dominant role, as choices do not affect future uncertainty in these tasks.
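The immediate-versus-future trade-off can be made concrete with a toy finite-horizon MDP (my construction, not the paper's): a "sample" action forgoes immediate reward but moves the agent to an informed state with higher payoffs, and value iteration shows it is only worth choosing when the horizon is long enough.

```python
import numpy as np

# Toy 2-state MDP. State 0 = "uninformed", state 1 = "informed".
# Action 0 (exploit): reward 1 in state 0, reward 2 in state 1; state unchanged.
# Action 1 (sample):  reward 0, but moves state 0 -> state 1 (information gain).
R = np.array([[1.0, 0.0],
              [2.0, 2.0]])        # R[state, action]
T = np.array([[0, 1],
              [1, 1]])            # deterministic next state T[state, action]

def q_values_state0(horizon):
    """Finite-horizon value iteration; Q-values of the first choice in state 0."""
    V = np.zeros(2)
    for _ in range(horizon - 1):
        V = (R + V[T]).max(axis=1)   # Bellman backup: immediate + future value
    return R[0] + V[T[0]]

short, long_ = q_values_state0(1), q_values_state0(10)
print(short, long_)  # exploiting wins at horizon 1; sampling wins at horizon 10
```

Shrinking the horizon plays the same role as decreasing uncertainty in the abstract: it shifts value back toward the action that pays off immediately.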

  19. Non-Parametric Collision Probability for Low-Velocity Encounters

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell

    2007-01-01

    An implicit, but not necessarily obvious, assumption in all of the current techniques for assessing satellite collision probability is that the relative position uncertainty is perfectly correlated in time. If there is any mis-modeling of the dynamics in the propagation of the relative position error covariance matrix, time-wise de-correlation of the uncertainty will increase the probability of collision over a given time interval. The paper gives some examples that illustrate this point. This paper argues that, for the present, Monte Carlo analysis is the best available tool for handling low-velocity encounters, and suggests some techniques for addressing the issues just described. One proposal is for the use of a non-parametric technique that is widely used in actuarial and medical studies. The other suggestion is that accurate process noise models be used in the Monte Carlo trials to which the non-parametric estimate is applied. A further contribution of this paper is a description of how the time-wise decorrelation of uncertainty increases the probability of collision.
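The time-wise decorrelation effect can be demonstrated with a minimal 1-D Monte Carlo toy (my own construction, not the paper's encounter geometry): adding per-step process noise to the propagated relative position lets trajectories wander into the hard-body region at some point in the interval, raising the cumulative collision probability relative to a perfectly correlated error.

```python
import numpy as np

rng = np.random.default_rng(2)
n_trials, n_steps = 20000, 50
r_hbr = 1.0                      # combined hard-body radius (toy, 1-D miss distance)
sigma0, q = 5.0, 0.3             # initial position sigma and per-step process noise

def collision_prob(process_noise_sigma):
    x0 = rng.normal(8.0, sigma0, n_trials)        # initial relative position samples
    x = np.repeat(x0[:, None], n_steps, axis=1)
    if process_noise_sigma > 0:
        # time-wise de-correlation: accumulate independent dynamics errors
        steps = rng.normal(0, process_noise_sigma, (n_trials, n_steps))
        x += np.cumsum(steps, axis=1)
    # collision if the trajectory ever comes within the hard-body radius
    return np.mean(np.abs(x).min(axis=1) < r_hbr)

p_corr = collision_prob(0.0)     # perfectly correlated uncertainty
p_noise = collision_prob(q)      # with an (assumed) process noise model
print(p_corr, p_noise)
```

With zero process noise every trial is frozen at its initial draw, reproducing the perfectly correlated assumption; the random-walk variant is the kind of accurate-process-noise Monte Carlo the paper advocates.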

  20. Impact of a Ground Network of Miniaturized Laser Heterodyne Radiometers (mini-LHRs) on Global Carbon Flux Estimates

    NASA Astrophysics Data System (ADS)

    DiGregorio, A.; Wilson, E. L.; Palmer, P. I.; Mao, J.; Feng, L.

    2017-12-01

We present the simulated impact of a small (50 instrument) ground network of NASA Goddard Space Flight Center's miniaturized laser heterodyne radiometer (mini-LHR), a small, low-cost (~$50k), portable, and high precision CH4 and CO2 measuring instrument. Partnered with AERONET as a non-intrusive accessory, the mini-LHR is able to leverage the 500+ instrument AERONET network for rapid network deployment and testing, and simultaneously retrieve co-located aerosol data, an important input for satellite measurements. This observing systems simulation experiment (OSSE) uses the 3-D GEOS-Chem chemistry transport model and 50 strategically selected sites to model flux estimate uncertainty reduction of both TCCON and mini-LHR instruments. We found that 50 mini-LHR sites are capable of reducing global flux uncertainty by up to 70%, with local improvements in the Southern Hemisphere reaching 90%. Our studies show that addition of the mini-LHR to current ground networks will play a major role in reduction of global carbon flux uncertainty.

  1. Calibration and Validation of Landsat Tree Cover in the Taiga-Tundra Ecotone

    NASA Technical Reports Server (NTRS)

    Montesano, Paul Mannix; Neigh, Christopher S. R.; Sexton, Joseph; Feng, Min; Channan, Saurabh; Ranson, Kenneth J.; Townshend, John R.

    2016-01-01

    Monitoring current forest characteristics in the taiga-tundra ecotone (TTE) at multiple scales is critical for understanding its vulnerability to structural changes. A 30 m spatial resolution Landsat-based tree canopy cover map has been calibrated and validated in the TTE with reference tree cover data from airborne LiDAR and high resolution spaceborne images across the full range of boreal forest tree cover. This domain-specific calibration model used estimates of forest height to determine reference forest cover that best matched Landsat estimates. The model removed the systematic under-estimation of tree canopy cover greater than 80% and indicated that Landsat estimates of tree canopy cover more closely matched canopies at least 2 m in height rather than 5 m. The validation improved estimates of uncertainty in tree canopy cover in discontinuous TTE forests for three temporal epochs (2000, 2005, and 2010) by reducing systematic errors, leading to increases in tree canopy cover uncertainty. Average pixel-level uncertainties in tree canopy cover were 29.0%, 27.1% and 31.1% for the 2000, 2005 and 2010 epochs, respectively. Maps from these calibrated data improve the uncertainty associated with Landsat tree canopy cover estimates in the discontinuous forests of the circumpolar TTE.

  2. Efficient Characterization of Parametric Uncertainty of Complex (Bio)chemical Networks.

    PubMed

    Schillings, Claudia; Sunnåker, Mikael; Stelling, Jörg; Schwab, Christoph

    2015-08-01

Parametric uncertainty is a particularly challenging and relevant aspect of systems analysis in domains such as systems biology where, both for inference and for assessing prediction uncertainties, it is essential to characterize the system behavior globally in the parameter space. However, current methods based on local approximations or on Monte-Carlo sampling cope only insufficiently with high-dimensional parameter spaces associated with complex network models. Here, we propose an alternative deterministic methodology that relies on sparse polynomial approximations. We propose a deterministic computational interpolation scheme which identifies most significant expansion coefficients adaptively. We present its performance in kinetic model equations from computational systems biology with several hundred parameters and state variables, leading to numerical approximations of the parametric solution on the entire parameter space. The scheme is based on adaptive Smolyak interpolation of the parametric solution at judiciously and adaptively chosen points in parameter space. Like Monte-Carlo sampling, it is "non-intrusive" and well-suited for massively parallel implementation, but affords higher convergence rates. This opens up new avenues for large-scale dynamic network analysis by enabling scaling for many applications, including parameter estimation, uncertainty quantification, and systems design.

  3. Efficient Characterization of Parametric Uncertainty of Complex (Bio)chemical Networks

    PubMed Central

    Schillings, Claudia; Sunnåker, Mikael; Stelling, Jörg; Schwab, Christoph

    2015-01-01

Parametric uncertainty is a particularly challenging and relevant aspect of systems analysis in domains such as systems biology where, both for inference and for assessing prediction uncertainties, it is essential to characterize the system behavior globally in the parameter space. However, current methods based on local approximations or on Monte-Carlo sampling cope only insufficiently with high-dimensional parameter spaces associated with complex network models. Here, we propose an alternative deterministic methodology that relies on sparse polynomial approximations. We propose a deterministic computational interpolation scheme which identifies most significant expansion coefficients adaptively. We present its performance in kinetic model equations from computational systems biology with several hundred parameters and state variables, leading to numerical approximations of the parametric solution on the entire parameter space. The scheme is based on adaptive Smolyak interpolation of the parametric solution at judiciously and adaptively chosen points in parameter space. Like Monte-Carlo sampling, it is “non-intrusive” and well-suited for massively parallel implementation, but affords higher convergence rates. This opens up new avenues for large-scale dynamic network analysis by enabling scaling for many applications, including parameter estimation, uncertainty quantification, and systems design. PMID:26317784
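The two key claims of these records, that the model is only evaluated at chosen nodes ("non-intrusive") and that polynomial surrogates converge faster than sampling, can be illustrated in one dimension with Chebyshev interpolation. This is a drastically simplified stand-in for the paper's adaptive Smolyak scheme, with every value invented for illustration:

```python
import numpy as np

# Parametric "model solution": y(p) = state at t=1 of dy/dt = -p*y, y(0) = 1.
solution = lambda p: np.exp(-p)

def surrogate_max_error(n_nodes, a=0.0, b=2.0):
    # Non-intrusive: the model is only *evaluated* at the chosen nodes;
    # a polynomial surrogate is then interpolated through those samples.
    k = np.arange(n_nodes)
    nodes = (a + b) / 2 + (b - a) / 2 * np.cos((2 * k + 1) * np.pi / (2 * n_nodes))
    coeffs = np.polyfit(nodes, solution(nodes), n_nodes - 1)
    p_grid = np.linspace(a, b, 201)
    return np.max(np.abs(np.polyval(coeffs, p_grid) - solution(p_grid)))

errors = [surrogate_max_error(n) for n in (3, 6, 12)]
print(errors)  # spectral decay: each refinement gains several digits of accuracy
```

For a smooth parametric solution the surrogate error decays exponentially in the number of model evaluations, whereas Monte-Carlo error decays only like the inverse square root of the sample count; the Smolyak construction extends this idea to many parameters with sparse node sets.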

  4. Bayesian nonparametric adaptive control using Gaussian processes.

    PubMed

    Chowdhary, Girish; Kingravi, Hassan A; How, Jonathan P; Vela, Patricio A

    2015-03-01

Most current model reference adaptive control (MRAC) methods rely on parametric adaptive elements, in which the number of parameters of the adaptive element is fixed a priori, often through expert judgment. An example of such an adaptive element is the radial basis function network (RBFN), with RBF centers preallocated based on the expected operating domain. If the system operates outside of the expected operating domain, this adaptive element can become ineffective in capturing and canceling the uncertainty, thus rendering the adaptive controller only semiglobal in nature. This paper investigates a Gaussian process-based Bayesian MRAC architecture (GP-MRAC), which leverages the power and flexibility of GP Bayesian nonparametric models of uncertainty. The GP-MRAC does not require the centers to be preallocated, can inherently handle measurement noise, and enables MRAC to handle a broader set of uncertainties, including those that are defined as distributions over functions. We use stochastic stability arguments to show that GP-MRAC guarantees good closed-loop performance with no prior domain knowledge of the uncertainty. Online implementable GP inference methods are compared in numerical simulations against RBFN-MRAC with preallocated centers and are shown to provide better tracking and improved long-term learning.
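The core ingredient of GP-MRAC, a Gaussian process model of the uncertainty whose predictive variance grows away from the observed data (no preallocated centers required), can be sketched as follows. The squared-exponential kernel, length-scale, noise level, and the toy uncertainty function are all assumptions of this illustration, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

def rbf(a, b, ell=0.5):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

# Training data: a toy model uncertainty observed with noise inside [-1, 1]
X = np.linspace(-1.0, 1.0, 15)
y = np.sin(3 * X) + rng.normal(0, 0.05, X.size)

def gp_posterior(x_star, noise=0.05):
    K = rbf(X, X) + noise**2 * np.eye(X.size)
    k_s = rbf(x_star, X)
    mean = k_s @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(k_s * np.linalg.solve(K, k_s.T).T, axis=1)
    return mean, var

xs = np.array([0.0, 3.0])          # inside vs. far outside the training domain
mean, var = gp_posterior(xs)
print(mean, var)   # variance near 0 inside the data, near the prior far outside
```

The second query point shows why this matters for adaptive control: outside the operating domain seen so far, the GP honestly reports prior-level uncertainty instead of extrapolating confidently the way a fixed-center RBFN would.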

  5. Can we improve top-down GHG inverse methods through informed prior and better representations of atmospheric transport? Insights from the Atmospheric Carbon and Transport (ACT) - America Aircraft Mission

    NASA Astrophysics Data System (ADS)

    Feng, S.; Lauvaux, T.; Keller, K.; Davis, K. J.

    2016-12-01

Current estimates of biogenic carbon fluxes over North America based on top-down atmospheric inversions are subject to considerable uncertainty. This uncertainty stems in large part from the uncertain prior flux estimates with the associated error covariances and approximations in the atmospheric transport models that link observed carbon dioxide mixing ratios with surface fluxes. Specifically, approximations in the representation of vertical mixing associated with atmospheric turbulence or convective transport and largely under-determined prior fluxes and their error structures significantly hamper our capacity to reliably estimate regional carbon fluxes. The Atmospheric Carbon and Transport - America (ACT-America) mission aims at reducing the uncertainties in inverse fluxes at the regional scale by deploying airborne and ground-based platforms to characterize atmospheric GHG mixing ratios and the concurrent atmospheric dynamics. Two aircraft measure the 3-dimensional distribution of greenhouse gases at synoptic scales, focusing on the atmospheric boundary layer and the free troposphere during both fair and stormy weather conditions. Here we analyze two main questions: (i) What level of information can we expect from the currently planned observations? (ii) How might ACT-America reduce the hindcast and predictive uncertainty of carbon estimates over North America?

  6. An Assessment of Current Fan Noise Prediction Capability

    NASA Technical Reports Server (NTRS)

    Envia, Edmane; Woodward, Richard P.; Elliott, David M.; Fite, E. Brian; Hughes, Christopher E.; Podboy, Gary G.; Sutliff, Daniel L.

    2008-01-01

In this paper, the results of an extensive assessment exercise carried out to establish the current state of the art for predicting fan noise at NASA are presented. Representative codes in the empirical, analytical, and computational categories were exercised and assessed against a set of benchmark acoustic data obtained from wind tunnel tests of three model scale fans. The chosen codes were ANOPP, representing an empirical capability, RSI, representing an analytical capability, and LINFLUX, representing a computational aeroacoustics capability. The selected benchmark fans cover a wide range of fan pressure ratios and fan tip speeds, and are representative of modern turbofan engine designs. The assessment results indicate that the ANOPP code can predict fan noise spectrum to within 4 dB of the measurement uncertainty band on a third-octave basis for the low and moderate tip speed fans except at extreme aft emission angles. The RSI code can predict fan broadband noise spectrum to within 1.5 dB of the experimental uncertainty band provided the rotor-only contribution is taken into account. The LINFLUX code can predict interaction tone power levels to within experimental uncertainties at low and moderate fan tip speeds, but could deviate by as much as 6.5 dB outside the experimental uncertainty band at the highest tip speeds in some cases.

  7. A 20-Year High-Resolution Wave Resource Assessment of Japan with Wave-Current Interactions

    NASA Astrophysics Data System (ADS)

    Webb, A.; Waseda, T.; Kiyomatsu, K.

    2016-02-01

    Energy harvested from surface ocean waves and tidal currents has the potential to be a significant source of green energy, particularly for countries with extensive coastlines such as Japan. As part of a larger marine renewable energy project*, The University of Tokyo (in cooperation with JAMSTEC) has conducted a state-of-the-art wave resource assessment (with uncertainty estimates) to assist with wave generator site identification and construction in Japan. This assessment will be publicly available and is based on a large-scale NOAA WAVEWATCH III (version 4.18) simulation using NCEP and JAMSTEC forcings. It includes several key components to improve model skill: a 20-year simulation to reduce aleatory uncertainty, a four-nested-layer approach to resolve a 1 km shoreline, and finite-depth and current effects included in all wave power density calculations. This latter component is particularly important for regions near strong currents such as the Kuroshio. Here, we will analyze the different wave power density equations, discuss the model setup, and present results from the 20-year assessment (with a focus on the role of wave-current interactions). Time permitting, a comparison will also be made with simulations using JMA MSM 5 km winds. *New Energy and Industrial Technology Development Organization (NEDO): "Research on the Framework and Infrastructure of Marine Renewable Energy; an Energy Potential Assessment"

  8. Trends and uncertainties in budburst projections of Norway spruce in Northern Europe.

    PubMed

    Olsson, Cecilia; Olin, Stefan; Lindström, Johan; Jönsson, Anna Maria

    2017-12-01

Budburst is regulated by temperature conditions, and a warming climate is associated with earlier budburst. A range of phenology models has been developed to assess climate change effects, and they tend to produce different results. This is mainly caused by different model representations of tree physiology processes, selection of observational data for model parameterization, and selection of climate model data to generate future projections. In this study, we applied (i) Bayesian inference to estimate model parameter values to address uncertainties associated with selection of observational data, (ii) selection of climate model data representative of a larger dataset, and (iii) ensemble modeling over multiple initial conditions, model classes, model parameterizations, and boundary conditions to generate future projections and uncertainty estimates. The ensemble projection indicated that the budburst of Norway spruce in northern Europe will on average take place 10.2 ± 3.7 days earlier in 2051-2080 than in 1971-2000, given climate conditions corresponding to RCP 8.5. Three provenances were assessed separately (one early and two late), and the projections indicated that the ranking among the provenances will be preserved in a warmer climate. Structurally complex models were more likely to fail to predict budburst for some combinations of site and year than simple models. However, they contributed to the overall picture of current understanding of climate impacts on tree phenology by capturing additional aspects of temperature response, for example, chilling. Model parameterizations based on single sites were more likely to result in model failure than parameterizations based on multiple sites, highlighting that the model parameterization is sensitive to initial conditions and may not perform well under other climate conditions, whether the change is due to a shift in space or over time. 
By addressing a range of uncertainties, this study showed that ensemble modeling provides a more robust impact assessment than would a single phenology model run.
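The ensemble-of-parameterizations idea can be sketched with a simple thermal-time (growing degree-day) budburst model. The idealised temperature curve, the parameter distributions standing in for posterior draws from a Bayesian calibration, and the +3°C warming shift are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
days = np.arange(1, 181)

def temperature(shift=0.0):
    # Idealised spring temperature curve (deg C); `shift` mimics climate warming
    return -5 + 15 * np.sin(np.pi * (days - 30) / 300) + shift

def budburst_day(temp, t_base, gdd_req):
    """Thermal-time model: budburst when forcing above t_base reaches gdd_req."""
    forcing = np.cumsum(np.maximum(temp - t_base, 0.0))
    return days[np.argmax(forcing >= gdd_req)] if forcing[-1] >= gdd_req else None

# Ensemble over parameterizations, standing in for Bayesian posterior draws
params = zip(rng.normal(0.0, 0.5, 100), rng.normal(150.0, 15.0, 100))
pairs = [(budburst_day(temperature(), tb, g), budburst_day(temperature(3.0), tb, g))
         for tb, g in params]
advance = np.array([a - b for a, b in pairs if a and b])
print(advance.mean(), advance.std())   # mean advance under +3 C, with ensemble spread
```

The ensemble mean gives the projected advance of budburst while the spread across parameter draws gives the uncertainty estimate, the same decomposition reported as 10.2 ± 3.7 days in the abstract.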

  9. Modeling of dust deposition in central Asia

    USDA-ARS?s Scientific Manuscript database

    The deposition of dust particles has a significant influence on the global bio-geochemical cycle. Currently, the lack of spatiotemporal data creates great uncertainty in estimating the global dust budget. To improve our understanding of the fate, transport and cycling of airborne dust, there is a ne...

  10. New Directions: Understanding Interactions of Air Quality and Climate Change at Regional Scales

    EPA Science Inventory

    The estimates of the short-lived climate forcers’ (SLCFs) impacts and mitigation effects on the radiation balance have large uncertainty because the current global model set-ups and simulations contain simplified parameterizations and do not completely cover the full range of air...

  11. Uncertainty assessment of urban pluvial flood risk in a context of climate change adaptation decision making

    NASA Astrophysics Data System (ADS)

    Arnbjerg-Nielsen, Karsten; Zhou, Qianqian

    2014-05-01

    There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Along with improved flood modelling technologies, this has enabled the development of economic assessments of climate change adaptation to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of current risk, drivers of change of risk over time, and measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from basic assumptions in the economic analysis and the hydrological model, but also from projections of future societies, local climate change impacts, and suitable adaptation options. This presents a challenge to decision makers when trying to identify robust measures. We present an integrated uncertainty analysis, which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that, by means of Monte Carlo simulations of flood risk assessments, incorporates climate change impacts as a key driver of risk changes over time. The overall uncertainty is then attributed to six bulk processes: climate change impact, urban rainfall-runoff processes, stage-depth functions, unit cost of repair, cost of adaptation measures, and discount rate. We apply the approach to an urban hydrological catchment in Odense, Denmark, and find that the uncertainty of the climate change impact appears to have the least influence on the net present value of the studied adaptation measures. This does not imply that the climate change impact is not important, but that its uncertainties are not dominant when deciding on action or inaction. We then consider the uncertainty related to choosing between adaptation options given that a decision to act has been taken. 
In this case, the major part of the uncertainty on the estimated net present values is identical for all adaptation options and will therefore not affect a comparison between adaptation measures. This makes the choice among the options easier. Furthermore, the explicit attribution of uncertainty also enables a reduction of the overall uncertainty by identifying the processes which contribute the most. This knowledge can then be used to further reduce the uncertainty related to decision making, as a substantial part of the remaining uncertainty is epistemic.
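    A minimal sketch of such an uncertainty cascade, assuming toy distributions and a toy damage model (all names, ranges, and costs below are invented): each Monte Carlo realization draws the six bulk factors and computes a net present value, and a crude one-at-a-time pass attributes variance to each factor.

```python
import random
import statistics

random.seed(1)

# Six bulk uncertainty sources; all ranges are hypothetical.
NOMINAL = dict(climate=1.2, runoff=1.0, damage=1.0, repair=1.0, capex=1.0, rate=0.03)
RANGES = dict(climate=(1.0, 1.4), runoff=(0.8, 1.2), damage=(0.7, 1.3),
              repair=(0.8, 1.2), capex=(0.9, 1.3), rate=(0.01, 0.05))

def npv(p, years=50):
    # Avoided annual flood damage minus upfront adaptation cost (toy model).
    annual = 1e6 * p["climate"] * p["runoff"] * p["damage"] * p["repair"]
    pv = sum(annual / (1 + p["rate"]) ** t for t in range(1, years + 1))
    return pv - 2e7 * p["capex"]

def variance_with_only(name, n=2000):
    # Vary a single factor, keep the others nominal (crude attribution).
    vals = []
    for _ in range(n):
        p = dict(NOMINAL)
        p[name] = random.uniform(*RANGES[name])
        vals.append(npv(p))
    return statistics.pvariance(vals)

full = [npv({k: random.uniform(*RANGES[k]) for k in NOMINAL}) for _ in range(2000)]
total = statistics.pvariance(full)
shares = {name: variance_with_only(name) / total for name in NOMINAL}
for name, share in shares.items():
    print(f"{name:8s} ~{100 * share:5.1f}% of NPV variance")
```

    Note that one-at-a-time shares need not sum to one when the model is non-additive; a real study would use proper variance-based indices, but the structure of sampling, valuation, and attribution is the same.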

  12. Evaluation of Uncertainty and Sensitivity in Environmental Modeling at a Radioactive Waste Management Site

    NASA Astrophysics Data System (ADS)

    Stockton, T. B.; Black, P. K.; Catlett, K. M.; Tauxe, J. D.

    2002-05-01

    Environmental modeling is an essential component in the evaluation of regulatory compliance of radioactive waste management sites (RWMSs) at the Nevada Test Site in southern Nevada, USA. For those sites that are currently operating, further goals are to support integrated decision analysis for the development of acceptance criteria for future wastes, as well as site maintenance, closure, and monitoring. At these RWMSs, the principal pathways for release of contamination to the environment are upward towards the ground surface rather than downwards towards the deep water table. Biotic processes, such as burrow excavation and plant uptake and turnover, dominate this upward transport. A combined multi-pathway contaminant transport and risk assessment model was constructed using the GoldSim modeling platform. This platform facilitates probabilistic analysis of environmental systems, and is especially well suited for assessments involving radionuclide decay chains. The model employs probabilistic definitions of key parameters governing contaminant transport, with the goals of quantifying cumulative uncertainty in the estimation of performance measures and providing information necessary to perform sensitivity analyses. This modeling differs from previous radiological performance assessments (PAs) in that the modeling parameters are intended to be representative of the current knowledge, and the uncertainty in that knowledge, of parameter values rather than reflective of a conservative assessment approach. While a conservative PA may be sufficient to demonstrate regulatory compliance, a parametrically honest PA can also be used for more general site decision-making. In particular, a parametrically honest probabilistic modeling approach allows both uncertainty and sensitivity analyses to be explicitly coupled to the decision framework using a single set of model realizations. 
For example, sensitivity analysis provides a guide for analyzing the value of collecting more information by quantifying the relative importance of each input parameter in predicting the model response. However, in these complex, high dimensional eco-system models, represented by the RWMS model, the dynamics of the systems can act in a non-linear manner. Quantitatively assessing the importance of input variables becomes more difficult as the dimensionality, the non-linearities, and the non-monotonicities of the model increase. Methods from data mining such as Multivariate Adaptive Regression Splines (MARS) and the Fourier Amplitude Sensitivity Test (FAST) provide tools that can be used in global sensitivity analysis in these high dimensional, non-linear situations. The enhanced interpretability of model output provided by the quantitative measures estimated by these global sensitivity analysis tools will be demonstrated using the RWMS model.
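    FAST, one of the global methods named above, can be demonstrated on a toy response function (the model, frequency set, and harmonic count below are illustrative, not taken from the RWMS study). Each input is driven along a space-filling search curve at its own frequency, and the first-order index of each input is read off from the Fourier power at the harmonics of its frequency.

```python
import math

# Frequencies assumed free of interferences up to harmonic M (illustrative set).
OMEGA = [11, 35, 79]
M = 4          # harmonics kept per factor
N = 1001       # points along the search curve

def model(x):
    # Toy nonlinear response standing in for the environmental model.
    a, b, c = x
    return a + 2.0 * b ** 2 + math.sin(2.0 * math.pi * c)

s_vals = [math.pi * (2 * k + 1 - N) / N for k in range(N)]
y = []
for s in s_vals:
    # Search curve: each input sweeps (0, 1) at its own frequency.
    x = [0.5 + math.asin(math.sin(w * s)) / math.pi for w in OMEGA]
    y.append(model(x))

def power(freq):
    # Squared Fourier amplitude of the response at an integer frequency.
    a = sum(y[k] * math.cos(freq * s_vals[k]) for k in range(N)) / N
    b = sum(y[k] * math.sin(freq * s_vals[k]) for k in range(N)) / N
    return a * a + b * b

total = sum(power(j) for j in range(1, max(OMEGA) * M + 1))
indices = []
for i, w in enumerate(OMEGA):
    s1 = sum(power(p * w) for p in range(1, M + 1)) / total
    indices.append(s1)
    print(f"x{i + 1}: first-order FAST index ~ {s1:.2f}")
```

    For this additive toy model the indices approximately recover the analytic variance shares (the sine term dominates, the linear term matters least); MARS-based screening would instead fit an adaptive regression surface to the same Monte Carlo sample.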

  13. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    NASA Astrophysics Data System (ADS)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making with uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of the potential future risk to human receptors from the disposal of low-level radioactive waste (LLW). Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and by propagating these through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model that supports their decision making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. It does not represent any particular site and is meant as a generic example. 
A practitioner could, however, start with this model as a GoldSim template and, by adding site specific features and parameter values (distributions), use this model as a starting point for a real model to be used in real decision making.

  14. Different approaches to overcome uncertainties of production systems

    NASA Astrophysics Data System (ADS)

    Azizi, Amir; Sorooshian, Shahryar

    2015-05-01

    This study presents a comprehensive review of the understanding of uncertainty and of the approaches currently proposed to handle uncertainties in production systems. The paper classifies the proposed approaches into 11 groups, based on a study of 114 scholarly papers from various international journals. It adds the latest findings to the current body of knowledge on production uncertainties and thereby serves researchers and practitioners as an easy reference in this area. The review also provides a solid starting point for further studies on how to deal with the uncertainties of production systems.

  15. Reusable launch vehicle model uncertainties impact analysis

    NASA Astrophysics Data System (ADS)

    Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng

    2018-03-01

    A reusable launch vehicle (RLV) has the typical characteristics of a complex aerodynamic shape coupled with the propulsion system, and its flight environment is highly complicated and intensely changeable. Its model therefore has large uncertainty, which makes the nominal system quite different from the real system. Studying the influence of these uncertainties on the stability of the control system is thus of great significance for controller design. In order to improve the performance of the RLV, this paper proposes an approach for analyzing the influence of model uncertainties. For a typical RLV, the coupled dynamic and kinematic models are built. The different factors that introduce uncertainty into the model are then analyzed and summarized, and the model uncertainties are expressed with an additive uncertainty model. The maximum singular value of the uncertainty matrix is chosen as the boundary model, and the norm of the uncertainty matrix is used to quantify how strongly the uncertainty factors influence the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary to take the model uncertainties into consideration before designing the controller for this kind of aircraft (such as an RLV).

  16. Empirical estimates to reduce modeling uncertainties of soil organic carbon in permafrost regions: a review of recent progress and remaining challenges

    USGS Publications Warehouse

    Mishra, U.; Jastrow, J.D.; Matamala, R.; Hugelius, G.; Koven, C.D.; Harden, Jennifer W.; Ping, S.L.; Michaelson, G.J.; Fan, Z.; Miller, R.M.; McGuire, A.D.; Tarnocai, C.; Kuhry, P.; Riley, W.J.; Schaefer, K.; Schuur, E.A.G.; Jorgenson, M.T.; Hinzman, L.D.

    2013-01-01

    The vast amount of organic carbon (OC) stored in soils of the northern circumpolar permafrost region is a potentially vulnerable component of the global carbon cycle. However, estimates of the quantity, decomposability, and combustibility of OC contained in permafrost-region soils remain highly uncertain, thereby limiting our ability to predict the release of greenhouse gases due to permafrost thawing. Substantial differences exist between empirical and modeling estimates of the quantity and distribution of permafrost-region soil OC, which contribute to large uncertainties in predictions of carbon–climate feedbacks under future warming. Here, we identify research challenges that constrain current assessments of the distribution and potential decomposability of soil OC stocks in the northern permafrost region and suggest priorities for future empirical and modeling studies to address these challenges.

  17. Defending Against Advanced Persistent Threats Using Game-Theory

    PubMed Central

    König, Sandra; Schauer, Stefan

    2017-01-01

    Advanced persistent threats (APT) combine a variety of different attack forms ranging from social engineering to technical exploits. The diversity and usual stealthiness of APT turns them into a central problem of contemporary practical system security, since information on attacks, the current system status or the attacker’s incentives is often vague, uncertain and in many cases even unavailable. Game theory is a natural approach to model the conflict between the attacker and the defender, and this work investigates a generalized class of matrix games as a risk mitigation tool for an advanced persistent threat (APT) defense. Unlike standard game and decision theory, our model is tailored to capture and handle the full uncertainty that is immanent to APTs, such as disagreement among qualitative expert risk assessments, unknown adversarial incentives and uncertainty about the current system state (in terms of how deeply the attacker may have penetrated into the system’s protective shells already). Practically, game-theoretic APT models can be derived straightforwardly from topological vulnerability analysis, together with risk assessments as they are done in common risk management standards like the ISO 31000 family. Theoretically, these models come with different properties than classical game theoretic models, whose technical solution presented in this work may be of independent interest. PMID:28045922

  18. Active control of ECCD-induced tearing mode stabilization in coupled NIMROD/GENRAY HPC simulations

    NASA Astrophysics Data System (ADS)

    Jenkins, Thomas; Kruger, Scott; Held, Eric

    2013-10-01

    Actively controlled ECCD applied in or near magnetic islands formed by NTMs has been successfully shown to control/suppress these modes, despite uncertainties in island O-point locations (where induced current is most stabilizing) relative to the RF deposition region. Integrated numerical models of the mode stabilization process can resolve these uncertainties and augment experimental efforts to determine optimal ITER NTM stabilization strategies. The advanced SWIM model incorporates RF effects in the equations/closures of extended MHD as 3D (not toroidal or bounce-averaged) quasilinear diffusion coefficients. Equilibration of driven current within the island geometry is modeled using the same extended MHD dynamics governing the physics of island formation, yielding a more accurate/self-consistent picture of island response to RF drive. Additionally, a numerical active feedback control system gathers data from synthetic diagnostics to dynamically trigger & spatially align the RF fields. Computations which model the RF deposition using ray tracing, assemble the 3D QL operator from ray & profile data, calculate the resultant xMHD forces, and dynamically realign the RF to more efficiently stabilize modes are presented; the efficacy of various control strategies is also discussed. Supported by the SciDAC Center for Extended MHD Modeling (CEMM); see also https://cswim.org.

  19. On combination of strict Bayesian principles with model reduction technique or how stochastic model calibration can become feasible for large-scale applications

    NASA Astrophysics Data System (ADS)

    Oladyshkin, S.; Schroeder, P.; Class, H.; Nowak, W.

    2013-12-01

    Predicting underground carbon dioxide (CO2) storage represents a challenging problem in a complex dynamic system. Due to lacking information about reservoir parameters, quantification of uncertainties may become the dominant question in risk assessment. Calibration on past observed data from a pilot-scale test injection can improve the predictive power of the involved geological, flow, and transport models. The current work performs history matching to pressure time series from a pilot storage site operated in Europe, maintained during an injection period. Simulation of compressible two-phase flow and transport (CO2/brine) in the considered site is computationally very demanding, requiring about 12 days of CPU time for an individual model run. For that reason, brute-force approaches for calibration are not feasible. In the current work, we explore an advanced framework for history matching based on the arbitrary polynomial chaos expansion (aPC) and strict Bayesian principles. The aPC [1] offers a drastic but accurate stochastic model reduction. Unlike many previous chaos expansions, it can handle arbitrary probability distribution shapes of uncertain parameters, and can therefore directly handle the statistical information appearing during the matching procedure. In our study we keep the spatial heterogeneity suggested by geophysical methods, but consider uncertainty in the magnitude of permeability through zone-wise permeability multipliers. We capture the dependence of model output on these multipliers with the expansion-based reduced model. Next, we combined the aPC with Bootstrap filtering (a brute-force but fully accurate Bayesian updating mechanism) in order to perform the matching. In comparison to (Ensemble) Kalman Filters, our method accounts for higher-order statistical moments and for the non-linearity of both the forward model and the inversion, and thus allows a rigorous quantification of calibrated model uncertainty. 
The usually high computational costs of accurate filtering become very feasible within our suggested aPC-based calibration framework. However, the power of aPC-based Bayesian updating strongly depends on the accuracy of prior information. In the current study, the prior assumptions on the model parameters were not satisfactory and strongly underestimated the reservoir pressure. Thus, the aPC-based response surface used in Bootstrap filtering is fitted to a distant and poorly chosen region within the parameter space. Thanks to the iterative procedure suggested in [2], we overcome this drawback at small computational cost. The iteration successively improves the accuracy of the expansion around the current estimate of the posterior distribution. The final result is a calibrated model of the site that can be used for further studies, with an excellent match to the data. References [1] Oladyshkin S. and Nowak W. Data-driven uncertainty quantification using the arbitrary polynomial chaos expansion. Reliability Engineering and System Safety, 106:179-190, 2012. [2] Oladyshkin S., Class H., Nowak W. Bayesian updating via Bootstrap filtering combined with data-driven polynomial chaos expansions: methodology and application to history matching for carbon dioxide storage in geological formations. Computational Geosciences, 17 (4), 671-687, 2013.
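    The essence of the framework, a cheap surrogate standing in for the expensive simulator plus Bootstrap (importance-resampling) filtering, can be sketched with a one-parameter toy problem. The "reservoir model", collocation nodes, observation, and noise level below are all invented; a real application would use a multi-dimensional aPC basis rather than this simple Lagrange quadratic.

```python
import math
import random
import statistics

random.seed(7)

def expensive_model(m):
    # Stand-in for the 12-day reservoir simulation: pressure response
    # as a function of a zone permeability multiplier m (hypothetical).
    return 50.0 / m + 5.0 * m

# Three collocation runs -> quadratic surrogate in Lagrange form.
nodes = [0.5, 1.0, 2.0]
vals = [expensive_model(m) for m in nodes]

def surrogate(m):
    total = 0.0
    for i, (xi, yi) in enumerate(zip(nodes, vals)):
        w = 1.0
        for j, xj in enumerate(nodes):
            if j != i:
                w *= (m - xj) / (xi - xj)
        total += yi * w
    return total

# Bootstrap filter: weight prior samples by the data likelihood, resample.
observed, noise_sd = 60.0, 2.0
prior = [random.lognormvariate(0.0, 0.5) for _ in range(5000)]
weights = [math.exp(-0.5 * ((surrogate(m) - observed) / noise_sd) ** 2)
           for m in prior]
posterior = random.choices(prior, weights=weights, k=5000)
print(f"posterior multiplier median: {statistics.median(posterior):.2f} "
      f"(prior median {statistics.median(prior):.2f})")
```

    The iterative refinement described in [2] would now re-center the collocation nodes on the posterior and repeat, so the surrogate is accurate where the likelihood concentrates.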

  20. Uncertainty in temperature response of current consumption-based emissions estimates

    NASA Astrophysics Data System (ADS)

    Karstensen, J.; Peters, G. P.; Andrew, R. M.

    2014-09-01

    Several studies have connected emissions of greenhouse gases to economic and trade data to quantify the causal chain from consumption to emissions and climate change. These studies usually combine data and models originating from different sources, making it difficult to estimate uncertainties in the end results. We estimate uncertainties in economic data, multi-pollutant emission statistics and metric parameters, and use Monte Carlo analysis to quantify contributions to uncertainty and to determine how uncertainty propagates to estimates of global temperature change from regional and sectoral territorial- and consumption-based emissions for the year 2007. We find that the uncertainties are sensitive to the emission allocations, mix of pollutants included, the metric and its time horizon, and the level of aggregation of the results. Uncertainties in the final results are largely dominated by the climate sensitivity and the parameters associated with the warming effects of CO2. The economic data have a relatively small impact on uncertainty at the global and national level, while much higher uncertainties are found at the sectoral level. Our results suggest that consumption-based national emissions are not significantly more uncertain than the corresponding production based emissions, since the largest uncertainties are due to metric and emissions which affect both perspectives equally. The two perspectives exhibit different sectoral uncertainties, due to changes of pollutant compositions. We find global sectoral consumption uncertainties in the range of ±9-±27% using the global temperature potential with a 50 year time horizon, with metric uncertainties dominating. National level uncertainties are similar in both perspectives due to the dominance of CO2 over other pollutants. The consumption emissions of the top 10 emitting regions have a broad uncertainty range of ±9-±25%, with metric and emissions uncertainties contributing similarly. 
The absolute global temperature potential with a 50 year time horizon has much higher uncertainties, with considerable uncertainty overlap for regions and sectors, indicating that the ranking of countries is uncertain.

  1. Improved Methodology for Developing Cost Uncertainty Models for Naval Vessels

    DTIC Science & Technology

    2009-04-22

    Deegan, 2007). Risk cannot be assessed with a point estimate, as it represents a single value that serves as a best guess for the parameter to be...or stakeholders (Deegan & Fields, 2007). This paper analyzes the current NAVSEA 05C Cruiser (CG(X)) probabilistic cost model including data...provided by Mr. Chris Deegan and his CG(X) analysts. The CG(X) model encompasses all factors considered for cost of the entire program, including

  2. CASL L1 Milestone report : CASL.P4.01, sensitivity and uncertainty analysis for CIPS with VIPRE-W and BOA.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sung, Yixing; Adams, Brian M.; Secker, Jeffrey R.

    2011-12-01

    The CASL Level 1 Milestone CASL.P4.01, successfully completed in December 2011, aimed to 'conduct, using methodologies integrated into VERA, a detailed sensitivity analysis and uncertainty quantification of a crud-relevant problem with baseline VERA capabilities (ANC/VIPRE-W/BOA).' The VUQ focus area led this effort, in partnership with AMA, and with support from VRI. DAKOTA was coupled to existing VIPRE-W thermal-hydraulics and BOA crud/boron deposit simulations representing a pressurized water reactor (PWR) that previously experienced crud-induced power shift (CIPS). This work supports understanding of CIPS by exploring the sensitivity and uncertainty in BOA outputs with respect to uncertain operating and model parameters. This report summarizes work coupling the software tools, characterizing uncertainties, and analyzing the results of iterative sensitivity and uncertainty studies. These studies focused on sensitivity and uncertainty of CIPS indicators calculated by the current version of the BOA code used in the industry. Challenges with this kind of analysis are identified to inform follow-on research goals and VERA development targeting crud-related challenge problems.

  3. Maximum warming occurs about one decade after a carbon dioxide emission

    NASA Astrophysics Data System (ADS)

    Ricke, Katharine L.; Caldeira, Ken

    2014-12-01

    It is known that carbon dioxide emissions cause the Earth to warm, but no previous study has focused on examining how long it takes to reach maximum warming following a particular CO2 emission. Using conjoined results of carbon-cycle and physical-climate model intercomparison projects (Taylor et al 2012, Joos et al 2013), we find the median time between an emission and maximum warming is 10.1 years, with a 90% probability range of 6.6-30.7 years. We evaluate uncertainties in timing and amount of warming, partitioning them into three contributing factors: carbon cycle, climate sensitivity and ocean thermal inertia. If uncertainty in any one factor is reduced to zero without reducing uncertainty in the other factors, the majority of overall uncertainty remains. Thus, narrowing uncertainty in century-scale warming depends on narrowing uncertainty in all contributing factors. Our results indicate that the benefit of climate damage avoided by an avoided CO2 emission will be manifested within the lifetimes of the people who acted to avoid that emission. While such avoidance could be expected to benefit future generations, there is potential for emissions avoidance to provide substantial benefit to current generations.

  4. Evaluating uncertainty in environmental life-cycle assessment. A case study comparing two insulation options for a Dutch one-family dwelling.

    PubMed

    Huijbregts, Mark A J; Gilijamse, Wim; Ragas, Ad M J; Reijnders, Lucas

    2003-06-01

    The evaluation of uncertainty is relatively new in environmental life-cycle assessment (LCA). It provides useful information to assess the reliability of LCA-based decisions and to guide future research toward reducing uncertainty. Most uncertainty studies in LCA quantify only one type of uncertainty, i.e., uncertainty due to input data (parameter uncertainty). However, LCA outcomes can also be uncertain due to normative choices (scenario uncertainty) and the mathematical models involved (model uncertainty). The present paper outlines a new methodology that quantifies parameter, scenario, and model uncertainty simultaneously in environmental life-cycle assessment. The procedure is illustrated in a case study that compares two insulation options for a Dutch one-family dwelling. Parameter uncertainty was quantified by means of Monte Carlo simulation. Scenario and model uncertainty were quantified by resampling different decision scenarios and model formulations, respectively. Although scenario and model uncertainty were not quantified comprehensively, the results indicate that both types of uncertainty influence the case study outcomes. This stresses the importance of quantifying parameter, scenario, and model uncertainty simultaneously. The two insulation options studied were found to have significantly different impact scores for global warming, stratospheric ozone depletion, and eutrophication. The thickest insulation option has the lowest impact on global warming and eutrophication, and the highest impact on stratospheric ozone depletion.

  5. Space Radiation Cancer Risks and Uncertainties for Mars Missions

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Badhwar, G. D.; Saganti, P. B.; Dicello, J. F.

    2001-01-01

    Projecting cancer risks from exposure to space radiation is highly uncertain because of the absence of data for humans and because of the limited radiobiology data available for estimating late effects from the high-energy and charge (HZE) ions present in the galactic cosmic rays (GCR). Cancer risk projections involve many biological and physical factors, each of which has a differential range of uncertainty due to the lack of data and knowledge. We discuss an uncertainty assessment within the linear-additivity model using the approach of Monte Carlo sampling from subjective error distributions that represent the lack of knowledge in each factor to quantify the overall uncertainty in risk projections. Calculations are performed using the space radiation environment and transport codes for several Mars mission scenarios. This approach leads to estimates of the uncertainties in cancer risk projections of 400-600% for a Mars mission. The uncertainties in the quality factors are dominant. Using safety standards developed for low-Earth orbit, long-term space missions (>90 days) outside the Earth's magnetic field are currently unacceptable if the confidence levels in risk projections are considered. Because GCR exposures involve multiple particle or delta-ray tracks per cellular array, our results suggest that the shape of the dose response at low dose rates may be an additional uncertainty for estimating space radiation risks.
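    The Monte Carlo scheme described, sampling each factor from a subjective error distribution and folding the results together, reduces to a few lines. The factor names, geometric standard deviations, and baseline risk below are invented for illustration; they are not the values used in the study.

```python
import math
import random

random.seed(11)

# Hypothetical subjective error factors (geometric standard deviations) for
# each component of the risk-projection chain; the quality factor is given
# the widest distribution, mirroring its dominance in the abstract.
FACTORS = {"dosimetry": 1.2, "transport": 1.3, "quality_factor": 2.0, "dose_rate": 1.4}
BASELINE_RISK = 0.03      # illustrative point estimate of fatal cancer risk

def one_realization():
    # Multiply the point estimate by one draw from each error distribution.
    r = BASELINE_RISK
    for gsd in FACTORS.values():
        r *= random.lognormvariate(0.0, math.log(gsd))
    return r

risks = sorted(one_realization() for _ in range(20000))
lo, hi = risks[int(0.025 * len(risks))], risks[int(0.975 * len(risks))]
print(f"95% interval: {lo:.4f} - {hi:.4f} (~{hi / lo:.0f}-fold spread)")
```

    The fold-spread of the resulting interval is the kind of several-hundred-percent uncertainty figure quoted above, and the widest single factor (here the quality factor) dominates it.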

  6. Robust Control Design for Uncertain Nonlinear Dynamic Systems

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Crespo, Luis G.; Andrews, Lindsey; Giesy, Daniel P.

    2012-01-01

    Robustness to parametric uncertainty is fundamental to successful control system design and as such it has been at the core of many design methods developed over the decades. Despite its prominence, most of the work on robust control design has focused on linear models and uncertainties that are non-probabilistic in nature. Recently, researchers have acknowledged this disparity and have been developing theory to address a broader class of uncertainties. This paper presents an experimental application of robust control design for a hybrid class of probabilistic and non-probabilistic parametric uncertainties. The experimental apparatus is based upon the classic inverted pendulum on a cart. The physical uncertainty is realized by a known additional lumped mass at an unknown location on the pendulum. This unknown location has the effect of substantially altering the nominal frequency and controllability of the nonlinear system, and in the limit has the capability to make the system neutrally stable and uncontrollable. Another uncertainty to be considered is a direct current motor parameter. The control design objective is to design a controller that satisfies stability, tracking error, control power, and transient behavior requirements for the largest range of parametric uncertainties. This paper presents an overview of the theory behind the robust control design methodology and the experimental results.

  7. An improved non-Markovian degradation model with long-term dependency and item-to-item uncertainty

    NASA Astrophysics Data System (ADS)

    Xi, Xiaopeng; Chen, Maoyin; Zhang, Hanwen; Zhou, Donghua

    2018-05-01

    In the literature, degradation is widely simplified into a memoryless Markovian process for the purpose of predicting the remaining useful life (RUL). However, there actually exists long-term dependency in the degradation processes of some industrial systems, including electromechanical equipment, oil tankers, and large blast furnaces. This implies that the new degradation state depends not only on the current state, but also on the historical states. Such dynamic systems cannot be accurately described by traditional Markovian models. Here we present an improved non-Markovian degradation model with both long-term dependency and item-to-item uncertainty. As a typical non-stationary process with dependent increments, fractional Brownian motion (FBM) is utilized to simulate the fractal diffusion of practical degradations. The uncertainty among multiple items can be represented by a random variable of the drift. Based on this model, the unknown parameters are estimated through the maximum likelihood (ML) algorithm, while a closed-form solution to the RUL distribution is further derived using a weak convergence theorem. The practicability of the proposed model is fully verified by two real-world examples. The results demonstrate that the proposed method can effectively reduce the prediction error.
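    A minimal sketch of such a model: fractional Gaussian noise is simulated by a Cholesky factorization of its covariance, and each item draws its own random drift. The Hurst exponent, drift distribution, and failure threshold are invented; a real implementation would add the ML parameter estimation and the closed-form RUL distribution.

```python
import math
import random

random.seed(3)

H = 0.7            # Hurst exponent > 0.5 -> long-term dependency (assumed)
N = 60             # number of time steps
SIGMA = 0.4        # diffusion scale (assumed)
THRESHOLD = 10.0   # failure threshold (assumed)

def gamma(k):
    # Autocovariance of fractional Gaussian noise increments.
    return 0.5 * (abs(k + 1) ** (2 * H) - 2 * abs(k) ** (2 * H) + abs(k - 1) ** (2 * H))

# Cholesky factor of the N x N fGn covariance matrix (plain Python, O(N^3)).
L = [[0.0] * N for _ in range(N)]
for i in range(N):
    for j in range(i + 1):
        s = sum(L[i][k] * L[j][k] for k in range(j))
        if i == j:
            L[i][j] = math.sqrt(gamma(0) - s)
        else:
            L[i][j] = (gamma(i - j) - s) / L[j][j]

def degradation_path():
    # Correlated increments plus an item-specific random drift.
    z = [random.gauss(0.0, 1.0) for _ in range(N)]
    fgn = [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(N)]
    drift = random.gauss(0.2, 0.05)   # item-to-item uncertainty in the drift
    x, path = 0.0, []
    for t in range(N):
        x += drift + SIGMA * fgn[t]
        path.append(x)
    return path

lifetimes = []
for _ in range(200):
    hit = next((t for t, x in enumerate(degradation_path()) if x >= THRESHOLD), None)
    if hit is not None:
        lifetimes.append(hit)
lifetimes.sort()
print(f"{len(lifetimes)} of 200 items failed; "
      f"median life ~ {lifetimes[len(lifetimes) // 2]} steps")
```

    The empirical spread of crossing times plays the role of the RUL distribution; the paper derives that distribution in closed form instead of simulating it.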

  8. Uncertainties in cylindrical anode current inferences on pulsed power drivers

    NASA Astrophysics Data System (ADS)

    Porwitzky, Andrew; Brown, Justin

    2018-06-01

    For over a decade, velocimetry based techniques have been used to infer the electrical current delivered to dynamic materials properties experiments on pulsed power drivers such as the Z Machine. Though originally developed for planar load geometries, in recent years, inferring the current delivered to cylindrical coaxial loads has become a valuable diagnostic tool for numerous platforms. Presented is a summary of uncertainties that can propagate through the current inference technique when applied to expanding cylindrical anodes. An equation representing quantitative uncertainty is developed which shows the unfold method to be accurate to a few percent above 10 MA of load current.

  9. Global Sensitivity Analysis for Identifying Important Parameters of Nitrogen Nitrification and Denitrification under Model and Scenario Uncertainties

    NASA Astrophysics Data System (ADS)

    Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.

    2017-12-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis only considers parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is entangled with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.
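
    The averaging idea can be illustrated with a toy version: compute first-order Sobol indices per (model, scenario) pair and weight-average them. This is a crude sketch of the concept only; the paper's method performs the variance decomposition jointly within a model- and scenario-averaging framework, and the toy models, weights, and parameters below are invented for illustration.

```python
import numpy as np

def first_order_sobol(f, d, n, rng):
    """First-order Sobol indices via the pick-freeze (Saltelli 2010) estimator."""
    A, B = rng.random((n, d)), rng.random((n, d))
    yA, yB = f(A), f(B)
    var = np.var(np.concatenate([yA, yB]))
    S = np.empty(d)
    for i in range(d):
        C = A.copy()
        C[:, i] = B[:, i]                 # replace only input x_i
        S[i] = np.mean(yB * (f(C) - yA)) / var
    return S

# Toy stand-ins for alternative reduction-function models whose response to the
# two "rate" parameters depends on a scenario multiplier a (hypothetical setup)
models = [lambda X, a: a * X[:, 0] + X[:, 1]**2,
          lambda X, a: a * np.sin(X[:, 0]) + 0.5 * X[:, 1]]
model_weights = [0.5, 0.5]
scenario_weights = {0.5: 0.4, 2.0: 0.6}   # hypothetical scenario probabilities

rng = np.random.default_rng(1)
avg_S = np.zeros(2)
for wm, m in zip(model_weights, models):
    for a, ws in scenario_weights.items():
        avg_S += wm * ws * first_order_sobol(lambda X: m(X, a), 2, 20000, rng)
print("model- and scenario-averaged first-order indices:", avg_S)
```

Ranking parameters by `avg_S` rather than by the indices of any single (model, scenario) pair is what protects against the misleading selections the abstract warns about.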

  10. Uncertainty Modeling for Structural Control Analysis and Synthesis

    NASA Technical Reports Server (NTRS)

    Campbell, Mark E.; Crawley, Edward F.

    1996-01-01

    This work studies the development of an accurate model of uncertainties for the control of structures that undergo a change in operational environment, based solely on modeling and experimentation in the original environment. The application used throughout this work is the development of an on-orbit uncertainty model based on ground modeling and experimentation. A ground-based uncertainty model consisting of mean errors and bounds on critical structural parameters is developed. The uncertainty model is created using multiple data sets to observe all relevant uncertainties in the system. The Discrete Extended Kalman Filter is used as an identification/parameter estimation method for each data set, in addition to providing a covariance matrix which aids in the development of the uncertainty model. Once ground-based modal uncertainties have been developed, they are localized to specific degrees of freedom in the form of mass and stiffness uncertainties. Two techniques are presented: a matrix method, which develops the mass and stiffness uncertainties in a mathematical manner; and a sensitivity method, which assumes a form for the mass and stiffness uncertainties in macroelements and scaling factors. This form allows the derivation of mass and stiffness uncertainties in a more physical manner. The mass and stiffness uncertainties of the ground-based system are then mapped onto the on-orbit system, and projected to create an analogous on-orbit uncertainty model in the form of mean errors and bounds on critical parameters. The Middeck Active Control Experiment is introduced as experimental verification for the localization and projection methods developed. In addition, closed-loop results from on-orbit operations of the experiment verify the use of the uncertainty model for control analysis and synthesis in space.
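
    The role of the extended Kalman filter as a joint identification and covariance source can be sketched on a deliberately simple scalar system (a stand-in, not the paper's structural model): augment the state with the unknown parameter, and the filter returns both an estimate and a variance of the kind that would seed a mean-error-and-bounds uncertainty model.

```python
import numpy as np

def ekf_parameter_estimate(y, q_x=0.09, q_theta=1e-5, r=0.01):
    """Joint state/parameter EKF for x_{k+1} = theta*x_k + w_k, y_k = x_k + v_k.
    The augmented state is z = [x, theta]; theta is modeled as a near-constant
    random walk. Returns the final theta estimate and its variance (the
    covariance output that feeds uncertainty-bound construction)."""
    z = np.array([y[0], 0.5])                # crude initial guesses
    P = np.diag([1.0, 1.0])
    Q = np.diag([q_x, q_theta])
    H = np.array([[1.0, 0.0]])
    for yk in y[1:]:
        # predict: Jacobian of [theta*x, theta] w.r.t. [x, theta]
        F = np.array([[z[1], z[0]], [0.0, 1.0]])
        z = np.array([z[1] * z[0], z[1]])
        P = F @ P @ F.T + Q
        # update with the scalar measurement
        S = (H @ P @ H.T)[0, 0] + r
        K = (P @ H.T) / S                    # Kalman gain, shape (2, 1)
        z = z + K[:, 0] * (yk - z[0])
        P = (np.eye(2) - K @ H) @ P
    return z[1], P[1, 1]

# Synthetic AR(1) "modal" data with theta = 0.9 (a stand-in for a structural
# parameter such as a stiffness-dependent coefficient)
rng = np.random.default_rng(0)
theta_true, x, ys = 0.9, 1.0, []
for _ in range(800):
    x = theta_true * x + rng.normal(0.0, 0.3)
    ys.append(x + rng.normal(0.0, 0.1))
theta_hat, var_theta = ekf_parameter_estimate(np.array(ys))
print(f"theta ~ {theta_hat:.3f} +/- {var_theta**0.5:.3f} (true 0.9)")
```

Running the same filter over multiple data sets and collecting the spread of `theta_hat` values, alongside each run's `var_theta`, mirrors the multi-data-set construction of mean errors and bounds described above.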

  11. Predicting wildfire occurrence distribution with spatial point process models and its uncertainty assessment: a case study in the Lake Tahoe Basin, USA

    Treesearch

    Jian Yang; Peter J. Weisberg; Thomas E. Dilts; E. Louise Loudermilk; Robert M. Scheller; Alison Stanton; Carl Skinner

    2015-01-01

    Strategic fire and fuel management planning benefits from detailed understanding of how wildfire occurrences are distributed spatially under current climate, and from predictive models of future wildfire occurrence given climate change scenarios. In this study, we fitted historical wildfire occurrence data from 1986 to 2009 to a suite of spatial point process (SPP)...

  12. Rapid shelf-wide cooling response of a stratified coastal ocean to hurricanes.

    PubMed

    Seroka, Greg; Miles, Travis; Xu, Yi; Kohut, Josh; Schofield, Oscar; Glenn, Scott

    2017-06-01

    Large uncertainty in the predicted intensity of tropical cyclones (TCs) persists compared to the steadily improving skill in the predicted TC tracks. This intensity uncertainty has its most significant implications in the coastal zone, where TC impacts to populated shorelines are greatest. Recent studies have demonstrated that rapid ahead-of-eye-center cooling of a stratified coastal ocean can have a significant impact on hurricane intensity forecasts. Using observation-validated, high-resolution ocean modeling, the stratified coastal ocean cooling processes observed in two U.S. Mid-Atlantic hurricanes were investigated: Hurricane Irene (2011)-with an inshore Mid-Atlantic Bight (MAB) track during the late summer stratified coastal ocean season-and Tropical Storm Barry (2007)-with an offshore track during early summer. For both storms, the critical ahead-of-eye-center depth-averaged force balance across the entire MAB shelf included an onshore wind stress balanced by an offshore pressure gradient. This resulted in onshore surface currents opposing offshore bottom currents that enhanced surface to bottom current shear and turbulent mixing across the thermocline, resulting in the rapid cooling of the surface layer ahead-of-eye-center. Because the same baroclinic and mixing processes occurred for two storms on opposite ends of the track and seasonal stratification envelope, the response appears robust. It will be critical to forecast these processes and their implications for a wide range of future storms using realistic 3-D coupled atmosphere-ocean models to lower the uncertainty in predictions of TC intensities and impacts and enable coastal populations to better respond to increasing rapid intensification threats in an era of rising sea levels.

  13. Damage severity assessment in wind turbine blade laboratory model through fuzzy finite element model updating

    NASA Astrophysics Data System (ADS)

    Turnbull, Heather; Omenzetter, Piotr

    2017-04-01

    The recent shift towards development of clean, sustainable energy sources has provided a new challenge in terms of structural safety and reliability: with aging, manufacturing defects, harsh environmental and operational conditions, and extreme events such as lightning strikes, wind turbines can become damaged, resulting in production losses and environmental degradation. To monitor the current structural state of the turbine, structural health monitoring (SHM) techniques would be beneficial. Physics-based SHM, in the form of calibration of a finite element model (FEM) by inverse techniques, is adopted in this research. Fuzzy finite element model updating (FFEMU) techniques for damage severity assessment of a small-scale wind turbine blade are discussed and implemented. The main advantage is the ability of FFEMU to account in a simple way for uncertainty within the model updating problem. Uncertainty quantification techniques, such as fuzzy sets, enable a convenient mathematical representation of the various uncertainties. Experimental frequencies obtained from modal analysis on a small-scale wind turbine blade were described by fuzzy numbers to model measurement uncertainty. During this investigation, damage severity estimation was investigated through the addition of small masses of varying magnitude to the trailing edge of the structure. This structural modification, intended to be in lieu of damage, enabled non-destructive experimental simulation of structural change. A numerical model was constructed with multiple variable additional masses simulated on the blade's trailing edge and used as updating parameters. Objective functions for updating were constructed and minimized using both the particle swarm optimization and firefly algorithms. FFEMU was able to obtain a prediction of the baseline material properties of the blade whilst also successfully predicting, with sufficient accuracy, a larger magnitude of structural alteration and its location.
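
    The fuzzy-number treatment of measured frequencies can be illustrated with alpha-cuts on a one-degree-of-freedom surrogate (hypothetical numbers, and a grid search in place of the swarm/firefly optimizers): each alpha-level of the measured frequency maps to an interval of feasible stiffness, so the updated parameter is itself fuzzy.

```python
import numpy as np

def tri_alpha_cut(a, b, c, alpha):
    """Alpha-cut interval of a triangular fuzzy number (a, b, c)."""
    return a + alpha * (b - a), c - alpha * (c - b)

def natural_freq(k, m=1.0):
    """Single-DOF surrogate of the blade model: f = sqrt(k/m) / (2*pi)."""
    return np.sqrt(k / m) / (2 * np.pi)

def update_stiffness(f_lo, f_hi, k_grid):
    """Crude interval updating: keep stiffness values whose predicted
    frequency lies in the measured interval (monotone model => interval)."""
    ok = (natural_freq(k_grid) >= f_lo) & (natural_freq(k_grid) <= f_hi)
    return k_grid[ok].min(), k_grid[ok].max()

# Measured frequency as a triangular fuzzy number (Hz); values are hypothetical
a, b, c = 1.55, 1.60, 1.66
k_grid = np.linspace(50.0, 150.0, 20001)
for alpha in (0.0, 0.5, 0.9):
    f_lo, f_hi = tri_alpha_cut(a, b, c, alpha)
    k_lo, k_hi = update_stiffness(f_lo, f_hi, k_grid)
    print(f"alpha={alpha:.1f}: k in [{k_lo:.1f}, {k_hi:.1f}] N/m")
```

The nested intervals over alpha reconstruct the membership function of the updated stiffness, which is the fuzzy analogue of a parameter estimate with bounds.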

  14. A robust approach to chance constrained optimal power flow with renewable generation

    DOE PAGES

    Lubin, Miles; Dvorkin, Yury; Backhaus, Scott N.

    2016-09-01

    Optimal Power Flow (OPF) dispatches controllable generation at minimum cost subject to operational constraints on generation and transmission assets. The uncertainty and variability of intermittent renewable generation are challenging current deterministic OPF approaches. Recent formulations of OPF use chance constraints to limit the risk from renewable generation uncertainty; however, these new approaches typically assume that the probability distributions which characterize the uncertainty and variability are known exactly. We formulate a robust chance constrained (RCC) OPF that accounts for uncertainty in the parameters of these probability distributions by allowing them to lie within an uncertainty set. The RCC OPF is solved using a cutting-plane algorithm that scales to large power systems. We demonstrate the RCC OPF on a modified model of the Bonneville Power Administration network, which includes 2209 buses and 176 controllable generators. Finally, deterministic, chance constrained (CC), and RCC OPF formulations are compared using several metrics, including cost of generation, area control error, ramping of controllable generators, and occurrence of transmission line overloads, as well as the respective computational performance.
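
    Under a Gaussian assumption, a single chance constraint has a familiar deterministic equivalent, and the robust variant simply enforces it for the worst case over an uncertainty set of distribution parameters. The sketch below (with invented numbers, unrelated to the BPA test case) shows how ambiguity in the mean and standard deviation can flip a constraint from feasible to infeasible:

```python
from statistics import NormalDist

def cc_slack(limit, mu, sigma, eps):
    """Deterministic equivalent of the Gaussian chance constraint
    P(flow <= limit) >= 1 - eps:  require mu + z_{1-eps} * sigma <= limit.
    Returns the slack (>= 0 means feasible)."""
    z = NormalDist().inv_cdf(1 - eps)
    return limit - (mu + z * sigma)

def robust_cc_slack(limit, mu_set, sigma_set, eps):
    """Robust version: the constraint must hold for every (mu, sigma) in the
    uncertainty set, i.e., for the worst-case combination."""
    return min(cc_slack(limit, mu, sigma, eps)
               for mu in mu_set for sigma in sigma_set)

# Line rated at 100 MW; forecast wind-driven flow ~N(80, 8^2), but the
# distribution parameters themselves are uncertain (hypothetical numbers)
print(cc_slack(100, 80, 8, 0.05))                      # nominal: feasible
print(robust_cc_slack(100, [78, 82], [8, 11], 0.05))   # robust: infeasible
```

A cutting-plane scheme like the paper's repeatedly finds the violated worst-case parameter pair and adds the corresponding deterministic constraint until no violation remains.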

  15. Plans for a measurement of the neutron lifetime to better than 0.3s using a Penning trap and absolute measurement of neutron fluence

    NASA Astrophysics Data System (ADS)

    Mulholland, Jonathan; NBL3 Collaboration

    2014-09-01

    The decay of the free neutron is the prototypical charged-current semi-leptonic weak process. A precise value for the neutron lifetime is required for consistency tests of the Standard Model and is needed to predict the primordial 4He abundance from the theory of Big Bang Nucleosynthesis. Plans are being made for an in-beam measurement of the neutron lifetime with an anticipated uncertainty of 0.3 s or better. This effort is part of a phased campaign of neutron lifetime measurements based at the NIST Center for Neutron Research, using the Sussex-ILL-NIST technique. Advances in neutron fluence measurement, used to provide the best existing in-beam determination of the neutron lifetime, as well as new silicon detector technology, now in use at LANSCE, address the two largest contributors to the uncertainty of in-beam measurements: the statistical uncertainty associated with proton counting and the systematic uncertainty in the neutron fluence measurement. The experimental design and projected uncertainties for the 0.3 s measurement will be discussed.

  16. An efficient Bayesian data-worth analysis using a multilevel Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Lu, Dan; Ricciuto, Daniel; Evans, Katherine

    2018-03-01

    Improving the understanding of subsurface systems and thus reducing prediction uncertainty requires collection of data. As the collection of subsurface data is costly, it is important that the data collection scheme is cost-effective. Design of a cost-effective data collection scheme, i.e., data-worth analysis, requires quantifying model parameter, prediction, and both current and potential data uncertainties. Assessment of these uncertainties in large-scale stochastic subsurface hydrological model simulations using standard Monte Carlo (MC) sampling or surrogate modeling is extremely computationally intensive, sometimes even infeasible. In this work, we propose an efficient Bayesian data-worth analysis using a multilevel Monte Carlo (MLMC) method. Compared to the standard MC that requires a significantly large number of high-fidelity model executions to achieve a prescribed accuracy in estimating expectations, the MLMC can substantially reduce computational costs using multifidelity approximations. Since the Bayesian data-worth analysis involves a great deal of expectation estimation, the cost saving of the MLMC in the assessment can be substantial. While the proposed MLMC-based data-worth analysis is broadly applicable, we use it for a highly heterogeneous two-phase subsurface flow simulation to select an optimal candidate data set that gives the largest uncertainty reduction in predicting mass flow rates at four production wells. The choices made by the MLMC estimation are validated by the actual measurements of the potential data, and are consistent with the standard MC estimation. Compared to the standard MC, however, the MLMC greatly reduces the computational cost.
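
    The core of MLMC is the telescoping sum E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}], with the coarse and fine estimators at each level driven by the same random inputs so the corrections have small variance and need few samples. A self-contained sketch on a toy SDE (not the paper's two-phase flow model; all coefficients are illustrative):

```python
import numpy as np

def euler_pair(l, n, T=1.0, a=0.05, b=0.2, x0=1.0, rng=None):
    """Coupled coarse/fine Euler paths of dX = a X dt + b X dW.
    Returns fine-level values and, for l > 0, coarse-level values built
    from the SAME Brownian increments (the key MLMC coupling)."""
    M = 2                                   # refinement factor between levels
    nf = M**l
    hf = T / nf
    dW = rng.normal(0.0, np.sqrt(hf), (n, nf))
    xf = np.full(n, x0)
    for i in range(nf):
        xf = xf + a * xf * hf + b * xf * dW[:, i]
    if l == 0:
        return xf, None
    nc = nf // M
    hc = T / nc
    dWc = dW.reshape(n, nc, M).sum(axis=2)  # aggregate increments for coarse path
    xc = np.full(n, x0)
    for i in range(nc):
        xc = xc + a * xc * hc + b * xc * dWc[:, i]
    return xf, xc

def mlmc_mean(L=5, n0=40000, rng=None):
    """Telescoping MLMC estimate of E[X_T], with sample counts decaying
    geometrically at the finer (more expensive) levels."""
    est = 0.0
    for l in range(L + 1):
        n = max(n0 // 2**l, 500)
        xf, xc = euler_pair(l, n, rng=rng)
        est += np.mean(xf if xc is None else xf - xc)
    return est

rng = np.random.default_rng(42)
est = mlmc_mean(rng=rng)
print(f"MLMC estimate: {est:.4f} (exact E[X_1] = e^0.05 = {np.e**0.05:.4f})")
```

Most of the samples land on the cheap coarse level, which is exactly where the cost savings quoted in the abstract come from.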

  17. Multiobjective hedging rules for flood water conservation

    NASA Astrophysics Data System (ADS)

    Ding, Wei; Zhang, Chi; Cai, Ximing; Li, Yu; Zhou, Huicheng

    2017-03-01

    Flood water conservation can benefit water users, especially in areas under water stress, but can also pose additional flood risk. The potential of flood water conservation is affected by many factors, especially decision makers' preference for water conservation and reservoir inflow forecast uncertainty. This paper discusses the individual and joint effects of these two factors on the trade-off between flood control and water conservation, using a multiobjective, two-stage reservoir optimal operation model. It is shown that hedging between current water conservation and future flood control exists only when forecast uncertainty or decision makers' preference is within a certain range, beyond which hedging is trivial and the multiobjective optimization problem is reduced to a single-objective problem with either flood control or water conservation. Different types of hedging rules are identified with different levels of flood water conservation preference, forecast uncertainties, acceptable flood risk, and reservoir storage capacity. Critical values of decision preference (represented by a weight) and inflow forecast uncertainty (represented by standard deviation) are identified. These inform reservoir managers of a feasible range of their preference for water conservation and thresholds of forecast uncertainty, specifying possible water conservation within the thresholds. The analysis also provides inputs for setting up an optimization model by providing the range of objective weights and the choice of hedging rule types. A case study is conducted to illustrate the concepts and analyses.
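
    The hedging trade-off can be made concrete with a toy two-stage model (all coefficients hypothetical, much simpler than the paper's formulation): the optimal storage rises with the conservation weight w and falls as forecast uncertainty grows, and for extreme weights the optimum sits at a bound, i.e., hedging becomes trivial, as the abstract describes.

```python
import numpy as np

def expected_objective(s, w, mu_q, sd_q, flood_thresh=120.0, n=20000, rng=None):
    """Two-stage sketch: commit to storing s now (conservation benefit ~ s);
    a forecast-uncertain inflow q then arrives, and volume above the flood
    threshold incurs quadratic damage. Weight w mixes the two objectives."""
    q = rng.normal(mu_q, sd_q, n)
    overflow = np.maximum(s + q - flood_thresh, 0.0)
    return w * s - (1 - w) * 0.05 * np.mean(overflow**2)

def best_storage(w, sd_q, mu_q=60.0, cap=100.0, rng=None):
    """Grid search for the first-stage storage decision."""
    grid = np.linspace(0.0, cap, 201)
    vals = [expected_objective(s, w, mu_q, sd_q, rng=rng) for s in grid]
    return grid[int(np.argmax(vals))]

rng = np.random.default_rng(7)
for w in (0.2, 0.5, 0.9):
    for sd in (5.0, 25.0):
        print(f"w={w}, forecast sd={sd}: store {best_storage(w, sd, rng=rng):.1f}")
```

With w = 0.9 the optimum pins to the storage cap regardless of forecast skill: the single-objective regime where hedging vanishes.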

  18. Statistical Approaches to Interpretation of Local, Regional, and National Highway-Runoff and Urban-Stormwater Data

    USGS Publications Warehouse

    Tasker, Gary D.; Granato, Gregory E.

    2000-01-01

    Decision makers need viable methods for the interpretation of local, regional, and national highway-runoff and urban-stormwater data, including flows, concentrations, and loads of chemical constituents and sediment, potential effects on receiving waters, and the potential effectiveness of various best management practices (BMPs). Valid (useful for intended purposes), current, and technically defensible stormwater-runoff models are needed to interpret data collected in field studies, to support existing highway and urban-runoff planning processes, to meet National Pollutant Discharge Elimination System (NPDES) requirements, and to provide methods for computation of Total Maximum Daily Loads (TMDLs) systematically and economically. Historically, conceptual, simulation, empirical, and statistical models of varying levels of detail, complexity, and uncertainty have been used to meet various data-quality objectives in the decision-making processes necessary for the planning, design, construction, and maintenance of highways and for other land-use applications. Water-quality simulation models attempt a detailed representation of the physical processes and mechanisms at a given site. Empirical and statistical regional water-quality assessment models provide a more general picture of water quality or changes in water quality over a region. All these modeling techniques share one common aspect: their predictive ability is poor without suitable site-specific data for calibration. To properly apply the correct model, one must understand the classification of variables, the unique characteristics of water-resources data, and the concept of population structure and analysis. Classifying the variables used to analyze data may determine which statistical methods are appropriate for data analysis.
An understanding of the characteristics of water-resources data is necessary to evaluate the applicability of different statistical methods, to interpret the results of these techniques, and to use tools and techniques that account for the unique nature of water-resources data sets. Populations of data on stormwater-runoff quantity and quality are often best modeled as logarithmic transformations. Therefore, these factors need to be considered to form valid, current, and technically defensible stormwater-runoff models. Regression analysis is an accepted method for interpretation of water-resources data and for prediction of current or future conditions at sites that fit the input data model. Regression analysis is designed to provide an estimate of the average response of a system as it relates to variation in one or more known variables. To produce valid models, however, regression analysis should include visual analysis of scatterplots, an examination of the regression equation, evaluation of the method design assumptions, and regression diagnostics. A number of statistical techniques are described in the text and in the appendixes to provide the information necessary to interpret data by use of appropriate methods. Uncertainty is an important part of any decision-making process. To deal with uncertainty problems, the analyst needs to know the severity of the statistical uncertainty of the methods used to predict water quality. Statistical models need to be based on information that is meaningful, representative, complete, precise, accurate, and comparable to be deemed valid, up to date, and technically supportable. To assess uncertainty in the analytical tools, the modeling methods, and the underlying data set, all of these components need to be documented and communicated in an accessible format within project publications.
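
    The log-transform point above has a practical wrinkle worth showing: a regression fitted in log space and back-transformed to original units is biased low unless a retransformation correction is applied. A minimal sketch with synthetic data, using Duan's smearing estimator as one standard choice (not necessarily the report's):

```python
import numpy as np

def log_ols(x, y):
    """OLS of log10(y) on log10(x), with Duan's smearing factor for
    retransformation bias when predicting y in original units."""
    lx, ly = np.log10(x), np.log10(y)
    X = np.column_stack([np.ones_like(lx), lx])
    beta, *_ = np.linalg.lstsq(X, ly, rcond=None)
    resid = ly - X @ beta
    smear = np.mean(10.0**resid)          # Duan (1983) smearing estimator
    return beta, resid, smear

def predict_load(x_new, beta, smear):
    """Back-transformed prediction in original units, bias-corrected."""
    return smear * 10.0**(beta[0] + beta[1] * np.log10(x_new))

# Synthetic "runoff load vs. drainage area" data with lognormal errors
# (hypothetical values, for illustration only)
rng = np.random.default_rng(3)
area = 10**rng.uniform(0, 3, 200)                      # 1 to 1000 ha
load = 2.0 * area**0.8 * 10**rng.normal(0, 0.2, 200)
beta, resid, smear = log_ols(area, load)
print(f"slope = {beta[1]:.2f} (true 0.8), smearing factor = {smear:.3f}")
print(f"bias-corrected load at 100 ha: {predict_load(100.0, beta, smear):.1f}")
```

Residual plots against the fitted values would complete the diagnostic checklist the text calls for.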

  19. Reconstruction of droughts in India using multiple land-surface models (1951-2015)

    NASA Astrophysics Data System (ADS)

    Mishra, Vimal; Shah, Reepal; Azhar, Syed; Shah, Harsh; Modi, Parth; Kumar, Rohini

    2018-04-01

    India has witnessed some of the most severe historical droughts in the current decade, and severity, frequency, and areal extent of droughts have been increasing. As a large part of the population of India is dependent on agriculture, soil moisture drought affecting agricultural activities (crop yields) has significant impacts on socio-economic conditions. Due to limited observations, soil moisture is generally simulated using land-surface hydrological models (LSMs); however, these LSM outputs have uncertainty due to many factors, including errors in forcing data and model parameterization. Here we reconstruct agricultural drought events over India during the period of 1951-2015 based on simulated soil moisture from three LSMs, the Variable Infiltration Capacity (VIC), the Noah, and the Community Land Model (CLM). Based on simulations from the three LSMs, we find that major drought events occurred in 1987, 2002, and 2015 during the monsoon season (June through September). During the Rabi season (November through February), major soil moisture droughts occurred in 1966, 1973, 2001, and 2003. Soil moisture droughts estimated from the three LSMs are comparable in terms of their spatial coverage; however, differences are found in drought severity. Moreover, we find a higher uncertainty in simulated drought characteristics over a large part of India during the major crop-growing season (Rabi season, November to February: NDJF) compared to those of the monsoon season (June to September: JJAS). Furthermore, uncertainty in drought estimates is higher for severe and localized droughts. Higher uncertainty in the soil moisture droughts is largely due to the difference in model parameterizations (especially soil depth), resulting in different persistence of soil moisture simulated by the three LSMs. 
Our study highlights the importance of accounting for the LSMs' uncertainty and consideration of the multi-model ensemble system for the real-time monitoring and prediction of drought over India.

  20. Nuclear data uncertainty propagation by the XSUSA method in the HELIOS2 lattice code

    NASA Astrophysics Data System (ADS)

    Wemple, Charles; Zwermann, Winfried

    2017-09-01

    Uncertainty quantification has been extensively applied to nuclear criticality analyses for many years and has recently begun to be applied to depletion calculations. However, regulatory bodies worldwide are trending toward requiring such analyses for reactor fuel cycle calculations, which also requires uncertainty propagation for isotopics and nuclear reaction rates. XSUSA is a proven methodology for cross section uncertainty propagation based on random sampling of the nuclear data according to covariance data in multi-group representation; HELIOS2 is a lattice code widely used for commercial and research reactor fuel cycle calculations. This work describes a technique to automatically propagate the nuclear data uncertainties via the XSUSA approach through fuel lattice calculations in HELIOS2. Application of the XSUSA methodology in HELIOS2 presented some unusual challenges because of the highly-processed multi-group cross section data used in commercial lattice codes. Currently, uncertainties based on the SCALE 6.1 covariance data file are being used, but the implementation can be adapted to other covariance data in multi-group structure. Pin-cell and assembly depletion calculations, based on models described in the UAM-LWR Phase I and II benchmarks, are performed and uncertainties in multiplication factor, reaction rates, isotope concentrations, and delayed-neutron data are calculated. With this extension, it will be possible for HELIOS2 users to propagate nuclear data uncertainties directly from the microscopic cross sections to subsequent core simulations.
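
    The sampling step itself is simple to sketch: draw correlated multigroup cross-section sets from covariance data and push each set through the transport calculation. Below, a hypothetical two-group one-point k-infinity formula stands in for the HELIOS2 lattice solve, and the covariance numbers are invented for illustration:

```python
import numpy as np

def sample_cross_sections(mean_xs, rel_cov, n, rng):
    """Random multigroup cross-section sets drawn from a multivariate normal
    defined by relative covariance data (the XSUSA-style sampling step)."""
    cov = np.outer(mean_xs, mean_xs) * rel_cov   # relative -> absolute covariance
    return rng.multivariate_normal(mean_xs, cov, size=n)

# Toy 2-group data: x = [nu_sig_f1, nu_sig_f2, sig_a1, sig_a2] (hypothetical)
mean_xs = np.array([0.005, 0.10, 0.010, 0.12])
rel_cov = np.diag([0.02**2, 0.01**2, 0.03**2, 0.015**2])  # uncorrelated, for brevity
rng = np.random.default_rng(11)
xs = sample_cross_sections(mean_xs, rel_cov, 1000, rng)

# Propagate each sample through a stand-in one-point k-infinity model
phi = np.array([0.3, 0.7])                       # fixed flux weights
k = (xs[:, :2] @ phi) / (xs[:, 2:] @ phi)
print(f"k-inf = {k.mean():.4f} +/- {k.std():.4f} (1 sigma from nuclear data)")
```

In the real workflow each sampled library drives a full lattice depletion run, so the same spread appears in isotopics and reaction rates, not just in the multiplication factor.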

  1. An online mineral dust model within the global/regional NMMB: current progress and plans

    NASA Astrophysics Data System (ADS)

    Perez, C.; Haustein, K.; Janjic, Z.; Jorba, O.; Baldasano, J. M.; Black, T.; Nickovic, S.

    2008-12-01

    While mineral dust distribution and effects are important on global scales, they strongly depend on dust emissions that occur on small spatial and temporal scales. Indeed, the accuracy of the surface wind speed used in dust models is crucial. Due to the high-order power dependency on wind friction velocity and the threshold behaviour of dust emissions, small errors in surface wind speed lead to large dust emission errors. Most global dust models use prescribed wind fields provided by major meteorological centres (e.g., NCEP and ECMWF), and their spatial resolution is currently about 1° x 1°. Such wind speeds tend to be strongly underestimated over arid and semi-arid areas and do not account for mesoscale systems responsible for a significant fraction of dust emissions regionally and globally. Other significant uncertainties in dust emissions resulting from such approaches are related to the misrepresentation of high subgrid-scale spatial heterogeneity in soil and vegetation boundary conditions, mainly in semi-arid areas. In order to significantly reduce these uncertainties, the Barcelona Supercomputing Center is currently implementing a mineral dust model coupled on-line with the new global/regional NMMB atmospheric model using the ESMF framework under development in NOAA/NCEP/EMC. The NMMB is an evolution of the operational WRF-NMM extending from meso to global scales, including a non-hydrostatic option and improved tracer advection. This model is planned to become the next-generation NCEP mesoscale model for operational weather forecasting in North America. The current implementation is based on the well-established regional dust model and forecast system Eta/DREAM (http://www.bsc.es/projects/earthscience/DREAM/). First successful global simulations show the potential of such an approach and compare well with DREAM regionally.
Ongoing developments include improvements in dust size distribution representation, sedimentation, dry deposition, wet scavenging and dust-radiation feedback, as well as the efficient implementation of the model on High Performance Supercomputers for global simulations and forecasts at high resolution.
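
    The wind-threshold sensitivity described above is easy to quantify with a White (1979)-style saltation flux, which is zero below a threshold friction velocity and grows roughly as the cube of u* above it; near the threshold, a 10% wind error produces a several-fold flux error (constants are illustrative, units arbitrary):

```python
import numpy as np

def saltation_flux(u_star, u_star_t=0.25, C=1.0):
    """White (1979)-style horizontal saltation flux:
    F ~ C * u*^3 * (1 + ut/u*) * (1 - ut^2/u*^2), zero below threshold ut."""
    u = np.asarray(u_star, dtype=float)
    f = C * u**3 * (1 + u_star_t / u) * (1 - (u_star_t / u)**2)
    return np.where(u > u_star_t, f, 0.0)

# Compare the flux response to a +10% friction-velocity error at winds
# just above threshold, moderately above, and well above it
u = np.array([0.26, 0.30, 0.50])
f = saltation_flux(u)
f_err = saltation_flux(1.1 * u)
print("relative flux change for +10% wind:", (f_err - f) / f)
```

Just above threshold the 10% wind error roughly quadruples the emitted flux, while far above threshold the error stays near the cubic-law ~35%; this asymmetry is why resolving mesoscale wind peaks matters so much for dust budgets.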

  2. Responses to clinical uncertainty in Australian general practice trainees: a cross-sectional analysis.

    PubMed

    Cooke, Georga; Tapley, Amanda; Holliday, Elizabeth; Morgan, Simon; Henderson, Kim; Ball, Jean; van Driel, Mieke; Spike, Neil; Kerr, Rohan; Magin, Parker

    2017-12-01

    Tolerance for ambiguity is essential for optimal learning and professional competence. General practice trainees must be, or must learn to be, adept at managing clinical uncertainty. However, few studies have examined associations of intolerance of uncertainty in this group. The aim of this study was to establish levels of tolerance of uncertainty in Australian general practice trainees and associations of uncertainty with demographic, educational and training practice factors. A cross-sectional analysis was performed on the Registrar Clinical Encounters in Training (ReCEnT) project, an ongoing multi-site cohort study. Scores on three of the four independent subscales of the Physicians' Reaction to Uncertainty (PRU) instrument were analysed as outcome variables in linear regression models with trainee and practice factors as independent variables. A total of 594 trainees contributed data on a total of 1209 occasions. Trainees in earlier training terms had higher scores for 'Anxiety due to uncertainty', 'Concern about bad outcomes' and 'Reluctance to disclose diagnosis/treatment uncertainty to patients'. Beyond this, findings suggest two distinct sets of associations regarding reaction to uncertainty. Firstly, affective aspects of uncertainty (the 'Anxiety' and 'Concern' subscales) were associated with female gender, less experience in hospital prior to commencing general practice training, and graduation overseas. Secondly, a maladaptive response to uncertainty (the 'Reluctance to disclose' subscale) was associated with urban practice, health qualifications prior to studying medicine, practice in an area of higher socio-economic status, and being Australian-trained. This study has established levels of three measures of trainees' responses to uncertainty and associations with these responses. The current findings suggest differing 'phenotypes' of trainees with high 'affective' responses to uncertainty and those reluctant to disclose uncertainty to patients. 
More research is needed to examine the relationship between clinical uncertainty and clinical outcomes, temporal changes in tolerance for uncertainty, and strategies that might assist physicians in developing adaptive responses to clinical uncertainty. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  3. Future socio-economic impacts and vulnerabilities

    Treesearch

    Balgis Osman-Elasha; Neil Adger; Maria Brockhaus; Carol J. Pierce Colfer; Brent Sohngen; Tallaat Dafalla; Linda A. Joyce; Nkem Johnson; Carmenza Robledo

    2009-01-01

    The projected impacts of climate change are significant, and despite the uncertainties associated with current climate and ecosystem model projections, the associated changes in the provision of forest ecosystem services are expected to be substantial in many parts of the world. These impacts will present significant social and economic challenges for affected...

  4. Framework for Uncertainty Assessment - Hanford Site-Wide Groundwater Flow and Transport Modeling

    NASA Astrophysics Data System (ADS)

    Bergeron, M. P.; Cole, C. R.; Murray, C. J.; Thorne, P. D.; Wurstner, S. K.

    2002-05-01

    Pacific Northwest National Laboratory is in the process of developing and implementing an uncertainty estimation methodology for use in future site assessments that addresses parameter uncertainty as well as uncertainties related to the groundwater conceptual model. The long-term goals of the effort are the development and implementation of an uncertainty estimation methodology for use in future assessments and analyses being made with the Hanford site-wide groundwater model. The basic approach in the framework developed for uncertainty assessment consists of: 1) Alternate conceptual model (ACM) identification, to identify and document the major features and assumptions of each conceptual model. The process must also include a periodic review of the existing and proposed new conceptual models as data or understanding become available. 2) ACM development, in which each identified conceptual model is developed through inverse modeling with historical site data. 3) ACM evaluation, to identify which of the conceptual models are plausible and should be included in any subsequent uncertainty assessments. 4) ACM uncertainty assessments, which will only be carried out for those ACMs determined to be plausible through comparison with historical observations and model structure identification measures. The parameter uncertainty assessment process generally involves: a) Model Complexity Optimization - to identify the important or relevant parameters for the uncertainty analysis; b) Characterization of Parameter Uncertainty - to develop the pdfs for the important uncertain parameters, including identification of any correlations among parameters; c) Propagation of Uncertainty - to propagate parameter uncertainties (e.g., by first-order second-moment methods if applicable, or by a Monte Carlo approach) through the model to determine the uncertainty in the model predictions of interest.
5) Estimation of combined ACM and scenario uncertainty by a double sum, with each component of the inner sum (an individual CCDF) representing parameter uncertainty associated with a particular scenario and ACM, and the outer sum enumerating the various plausible ACM and scenario combinations, in order to represent the combined estimate of uncertainty (a family of CCDFs). A final important part of the framework includes identification, enumeration, and documentation of all the assumptions, which include those made during conceptual model development, required by the mathematical model, required by the numerical model, made during the spatial and temporal discretization process, needed to assign the statistical model and associated parameters that describe the uncertainty in the relevant input parameters, and finally those required by the propagation method. Pacific Northwest National Laboratory is operated for the U.S. Department of Energy under Contract DE-AC06-76RL01830.
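
    The double-sum step can be sketched in a few lines: propagate parameter samples through each plausible ACM to obtain one CCDF per combination, then form the weighted outer sum. The models, weights, and parameter distributions below are placeholders, not Hanford values:

```python
import numpy as np

def ccdf(samples, grid):
    """Empirical complementary CDF P(Y > y) evaluated on a common grid."""
    return (samples[:, None] > grid[None, :]).mean(axis=0)

# Hypothetical stand-ins for two alternative conceptual models (ACMs): each
# maps uncertain parameters to a prediction (e.g., a peak concentration)
acms = [lambda k, n: k / n,
        lambda k, n: 0.8 * k / n**0.5]
acm_weights = [0.6, 0.4]        # plausibility weights for the outer sum

rng = np.random.default_rng(5)
grid = np.linspace(0.0, 10.0, 200)
family, combined = [], np.zeros_like(grid)
for w, model in zip(acm_weights, acms):
    k = rng.lognormal(0.5, 0.4, 5000)    # parameter pdfs (step b of the text)
    n = rng.uniform(0.2, 0.4, 5000)
    c = ccdf(model(k, n), grid)          # inner sum: parameter uncertainty
    family.append(c)
    combined += w * c                    # outer sum over plausible ACMs

i5 = np.searchsorted(grid, 5.0)
print("P(prediction > 5) per ACM:", [f"{c[i5]:.3f}" for c in family],
      "combined:", f"{combined[i5]:.3f}")
```

Plotting `family` together with `combined` gives exactly the "family of CCDFs" picture the framework describes, with the spread across curves displaying conceptual-model uncertainty.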

  5. Current and future pluvial flood hazard analysis for the city of Antwerp

    NASA Astrophysics Data System (ADS)

    Willems, Patrick; Tabari, Hossein; De Niel, Jan; Van Uytven, Els; Lambrechts, Griet; Wellens, Geert

    2016-04-01

    For the city of Antwerp in Belgium, higher rainfall extremes were observed than in the surrounding areas. The differences were found to be statistically significant for some areas and may result from the urban heat island effect in combination with higher aerosol concentrations. A network of 19 rain gauges with varying record lengths (the longest since the 1960s) and 10 years of continuous radar data were combined to map the spatial variability of rainfall extremes over the city, together with its uncertainty, for durations from 15 minutes to 1 day. The improved spatial rainfall information was used as input to the city's sewer system model to analyze the frequency of urban pluvial floods. Comparison with historical flood observations from various sources (fire brigade and media) confirmed that the improved spatial rainfall information also improved the simulated magnitude and frequency of sewer floods. Next to these improved urban flood impact results for recent and current climatological conditions, the new insights into the local rainfall microclimate also helped enhance future projections of rainfall extremes and pluvial floods in the city. This was done by improved statistical downscaling of all available CMIP5 global climate model runs (160 runs) for the 4 RCP scenarios, as well as the available EURO-CORDEX regional climate model runs. Two types of statistical downscaling methods were applied for this purpose (a weather-typing-based method and a quantile perturbation approach), making use of the microclimate results and their dependency on specific weather types. 
Changes in extreme rainfall intensities were analyzed and mapped as a function of the RCP scenario, together with the uncertainty, decomposed into the contributions of the climate models, the climate model initialization or limited length of the 30-year time series (natural climate variability), and the statistical downscaling (albeit limited to the two types of methods). These were finally transferred into future pluvial flash flood hazard maps for the city, together with their uncertainties, and serve as a basis for spatial planning and adaptation.
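A quantile perturbation step of the kind mentioned above can be sketched as follows: change factors are computed per quantile between control and future climate-model series and applied to observations at the matching quantiles. All series here are synthetic, and this is a simplified rendering of the approach, not the authors' exact implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic series (illustration only): observed rainfall and
# climate-model rainfall for control and future periods.
obs = rng.gamma(2.0, 10.0, size=1000)
model_ctl = rng.gamma(2.0, 10.0, size=1000)
model_fut = rng.gamma(2.0, 13.0, size=1000)   # wetter future

def quantile_perturbation(obs, ctl, fut, n_q=100):
    """Perturb observations by quantile-dependent model change factors."""
    probs = (np.arange(n_q) + 0.5) / n_q
    factors = np.quantile(fut, probs) / np.quantile(ctl, probs)
    # Locate each observation's quantile class, then apply its factor
    ranks = np.searchsorted(np.quantile(obs, probs), obs).clip(0, n_q - 1)
    return obs * factors[ranks]

perturbed = quantile_perturbation(obs, model_ctl, model_fut)
```

Because the factors vary with quantile, changes in the extremes can differ from changes in the mean, which is the point of using this method for extreme rainfall projections.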

  6. A Conceptual Methodology for Assessing Acquisition Requirements Robustness against Technology Uncertainties

    NASA Astrophysics Data System (ADS)

    Chou, Shuo-Ju

    2011-12-01

    In recent years the United States has shifted from a threat-based acquisition policy that developed systems for countering specific threats to a capabilities-based strategy that emphasizes the acquisition of systems that provide critical national defense capabilities. This shift in policy, in theory, allows for the creation of an "optimal force" that is robust against current and future threats regardless of the tactics and scenario involved. In broad terms, robustness can be defined as the insensitivity of an outcome to "noise" or non-controlled variables. Within this context, the outcome is the successful achievement of defense strategies and the noise variables are tactics and scenarios that will be associated with current and future enemies. Unfortunately, a lack of system capability, budget, and schedule robustness against technology performance and development uncertainties has led to major setbacks in recent acquisition programs. This lack of robustness stems from the fact that immature technologies have uncertainties in their expected performance, development cost, and schedule that cause variations in system effectiveness and in program development budget and schedule requirements. However, the Technology Readiness Assessment (TRA) process currently used by acquisition program managers and decision-makers to measure technology uncertainty at critical program decision junctions does not adequately capture the impact of technology performance and development uncertainty on program capability and development metrics. The Technology Readiness Level metric employed by the TRA to describe the uncertainties of program technology elements provides only a qualitative and nondescript estimate of those uncertainties. In order to assess program robustness, specifically requirements robustness, against technology performance and development uncertainties, a new process is needed. 
This process should provide acquisition program managers and decision-makers with the ability to assess or measure the robustness of program requirements against such uncertainties. A literature review of techniques for forecasting technology performance and development uncertainties, and their subsequent impacts on capability, budget, and schedule requirements, led to the conclusion that coupling a probabilistic technique such as Monte Carlo simulation with quantitative, parametric models of technology performance impact and of technology development time and cost would allow the probabilities of meeting specific requirement constraints to be established. These probability-of-requirements-success metrics can then be used as a quantitative and probabilistic measure of program requirements robustness against technology uncertainties. Combined with a multi-objective genetic algorithm optimization process and a computer-based Decision Support System, critical information regarding requirements robustness against technology uncertainties can be captured and quantified for acquisition decision-makers. This results in a more informed and justifiable selection of program technologies during initial program definition, as well as in the formulation of program development and risk management strategies. To meet the stated research objective, the ENhanced TEchnology Robustness Prediction and RISk Evaluation (ENTERPRISE) methodology was formulated to provide a structured and transparent process for integrating these enabling techniques into a probabilistic and quantitative assessment of acquisition program requirements robustness against technology performance and development uncertainties. 
To demonstrate the capabilities of the ENTERPRISE method and test the research hypotheses, a demonstration application of the method was performed on a notional program for acquiring a carrier-based Suppression of Enemy Air Defenses (SEAD) capability using Unmanned Combat Aircraft Systems (UCAS) and their enabling technologies. The results of this implementation provided valuable insights regarding the benefits and inner workings of the methodology, as well as limitations that should be addressed in the future to narrow the gap between the current and desired states.
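The core of the proposed analysis, Monte Carlo sampling of technology uncertainties to estimate probabilities of meeting requirement constraints, can be sketched as follows. All distributions and constraint values are hypothetical placeholders, not ENTERPRISE's actual parametric models:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Hypothetical immature-technology uncertainties (illustration only):
dev_cost = rng.lognormal(mean=np.log(50.0), sigma=0.3, size=n)   # $M
dev_time = rng.normal(36.0, 6.0, size=n)                          # months
performance = rng.normal(1.0, 0.1, size=n)                        # relative

# Hypothetical program requirement constraints
budget, schedule, perf_req = 60.0, 42.0, 0.9

# Probability-of-requirements-success metrics = requirements robustness
p_budget = (dev_cost <= budget).mean()
p_schedule = (dev_time <= schedule).mean()
p_perf = (performance >= perf_req).mean()
p_all = ((dev_cost <= budget) & (dev_time <= schedule)
         & (performance >= perf_req)).mean()
```

In the methodology described above, metrics like `p_all` would feed the multi-objective genetic algorithm as objective values when comparing candidate technology portfolios.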

  7. Mapping of uncertainty relations between continuous and discrete time

    NASA Astrophysics Data System (ADS)

    Chiuchiú, Davide; Pigolotti, Simone

    2018-03-01

    Lower bounds on fluctuations of thermodynamic currents depend on the nature of time, discrete or continuous. To understand the physical reason, we compare current fluctuations in discrete-time Markov chains and continuous-time master equations. We prove that current fluctuations in the master equations are always more likely, due to random timings of transitions. This comparison leads to a mapping of the moments of a current between discrete and continuous time. We exploit this mapping to obtain uncertainty bounds. Our results reduce the quests for uncertainty bounds in discrete and continuous time to a single problem.
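The variance ordering can be checked directly for a simple biased walk (an illustrative special case, not the paper's general mapping): in discrete time each step is +1 with probability p and -1 with probability q, while in continuous time the forward and backward jumps arrive as independent Poisson processes with rates p and q, so the random jump timings add exactly t(p-q)^2 of variance:

```python
# Variance of the integrated current J after time t for a biased walk
# with forward/backward probabilities (or rates) p, q per unit time.
p, q, t = 0.5, 0.25, 100.0

# Discrete time: t i.i.d. steps of +1 (prob p), -1 (prob q), 0 otherwise.
mean_step = p - q
var_step = (p + q) - mean_step**2
var_discrete = t * var_step

# Continuous time: J = N_plus - N_minus with independent Poisson counts
# N_plus ~ Poisson(p*t), N_minus ~ Poisson(q*t), so Var(J) = (p + q)*t.
var_continuous = (p + q) * t

# Same mean current, strictly larger fluctuations in continuous time:
excess = var_continuous - var_discrete   # equals t * (p - q)**2
```

This matches the abstract's claim that current fluctuations in the master-equation (continuous-time) picture are always at least as large, due to random timings of transitions.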

  8. Mapping of uncertainty relations between continuous and discrete time.

    PubMed

    Chiuchiù, Davide; Pigolotti, Simone

    2018-03-01

    Lower bounds on fluctuations of thermodynamic currents depend on the nature of time, discrete or continuous. To understand the physical reason, we compare current fluctuations in discrete-time Markov chains and continuous-time master equations. We prove that current fluctuations in the master equations are always more likely, due to random timings of transitions. This comparison leads to a mapping of the moments of a current between discrete and continuous time. We exploit this mapping to obtain uncertainty bounds. Our results reduce the quests for uncertainty bounds in discrete and continuous time to a single problem.

  9. Reducing uncertainty in dust monitoring to detect aeolian sediment transport responses to land cover change

    NASA Astrophysics Data System (ADS)

    Webb, N.; Chappell, A.; Van Zee, J.; Toledo, D.; Duniway, M.; Billings, B.; Tedela, N.

    2017-12-01

    Anthropogenic land use and land cover change (LULCC) influence global rates of wind erosion and dust emission, yet our understanding of the magnitude of the responses remains poor. Field measurements and monitoring provide essential data to resolve aeolian sediment transport patterns and assess the impacts of human land use and management intensity. Data collected in the field are also required for dust model calibration and testing, as models have become the primary tool for assessing LULCC-dust cycle interactions. However, there is considerable uncertainty in estimates of dust emission due to the spatial variability of sediment transport. Field sampling designs are currently rudimentary and considerable opportunities are available to reduce the uncertainty. Establishing the minimum detectable change is critical for measuring spatial and temporal patterns of sediment transport, detecting potential impacts of LULCC and land management, and for quantifying the uncertainty of dust model estimates. Here, we evaluate the effectiveness of common sampling designs (e.g., simple random sampling, systematic sampling) used to measure and monitor aeolian sediment transport rates. Using data from the US National Wind Erosion Research Network across diverse rangeland and cropland cover types, we demonstrate how only large changes in sediment mass flux (of the order 200% to 800%) can be detected when small sample sizes are used, crude sampling designs are implemented, or when the spatial variation is large. We then show how statistical rigour and the straightforward application of a sampling design can reduce the uncertainty and detect change in sediment transport over time and between land use and land cover types.
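The dependence of the minimum detectable change on sample size and spatial variability can be sketched with a standard two-sample power calculation under a normal approximation; the coefficient of variation and sampler counts below are illustrative, not the Network's values:

```python
from math import sqrt
from statistics import NormalDist

def minimum_detectable_change(cv, n, alpha=0.05, power=0.8):
    """Smallest relative change in mean sediment mass flux detectable by a
    two-sample comparison, given the coefficient of variation cv among
    samplers and n samplers per campaign (normal approximation)."""
    z = NormalDist().inv_cdf
    return (z(1 - alpha / 2) + z(power)) * cv * sqrt(2.0 / n)

# High spatial variability (CV = 1.5) with few samplers -> only very large
# changes are detectable, consistent with the 200-800% range cited above.
mdc_small = minimum_detectable_change(cv=1.5, n=3)    # ~3.4, i.e. ~340%
mdc_large = minimum_detectable_change(cv=1.5, n=30)   # ~1.1, i.e. ~110%
```

The 1/sqrt(n) scaling is why statistical rigour in the sampling design, rather than simply more instruments, is the efficient route to detecting LULCC effects.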

  10. Value of Information Analysis Applied to the Economic Evaluation of Interventions Aimed at Reducing Juvenile Delinquency: An Illustration.

    PubMed

    Eeren, Hester V; Schawo, Saskia J; Scholte, Ron H J; Busschbach, Jan J V; Hakkaart, Leona

    2015-01-01

    To investigate whether a value of information analysis, commonly applied in health care evaluations, is feasible and meaningful in the field of crime prevention. Interventions aimed at reducing juvenile delinquency are increasingly being evaluated according to their cost-effectiveness. Results of cost-effectiveness models are subject to uncertainty in their cost and effect estimates. Further research can reduce that parameter uncertainty. The value of such further research can be estimated using a value of information analysis, as illustrated in the current study. We built upon an earlier published cost-effectiveness model that demonstrated the comparison of two interventions aimed at reducing juvenile delinquency. Outcomes were presented as costs per criminal activity free year. At a societal willingness-to-pay of €71,700 per criminal activity free year, further research to eliminate parameter uncertainty was valued at €176 million. Therefore, in this illustrative analysis, the value of information analysis determined that society should be willing to spend a maximum of €176 million in reducing decision uncertainty in the cost-effectiveness of the two interventions. Moreover, the results suggest that reducing uncertainty in some specific model parameters might be more valuable than in others. Using a value of information framework to assess the value of conducting further research in the field of crime prevention proved to be feasible. The results were meaningful and can be interpreted according to health care evaluation studies. This analysis can be helpful in justifying additional research funds to further inform the reimbursement decision in regard to interventions for juvenile delinquents.
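The value-of-information calculation can be sketched as follows: EVPI is the expected gain from resolving all parameter uncertainty before choosing, i.e. E[max of net benefits] minus max of expected net benefits. The cost and effect distributions below are illustrative placeholders, not the published model's inputs; only the €71,700 willingness-to-pay threshold is taken from the abstract:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
wtp = 71_700  # societal willingness-to-pay per criminal-activity-free year

# Hypothetical uncertain costs (EUR) and effects (criminal-activity-free
# years) for two interventions (illustrative distributions only).
cost = np.stack([rng.normal(40_000, 8_000, n), rng.normal(55_000, 10_000, n)])
effect = np.stack([rng.normal(0.60, 0.15, n), rng.normal(0.85, 0.20, n)])

nb = wtp * effect - cost          # net monetary benefit per intervention

# EVPI per decision: E[max over options] - max over options of E[NB]
evpi = nb.max(axis=0).mean() - nb.mean(axis=1).max()
```

Scaling `evpi` by the size of the affected population gives the population EVPI, the quantity reported as €176 million in the study.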

  11. Water Resource Planning Under Future Climate and Socioeconomic Uncertainty in the Cauvery River Basin in Karnataka, India

    PubMed Central

    Bhave, Ajay Gajanan; Conway, Declan; Dessai, Suraje; Stainforth, David A.

    2018-01-01

    Decision‐Making Under Uncertainty (DMUU) approaches have been less utilized in developing countries than developed countries for water resources contexts. High climate vulnerability and rapid socioeconomic change often characterize developing country contexts, making DMUU approaches relevant. We develop an iterative multi‐method DMUU approach, including scenario generation, coproduction with stakeholders and water resources modeling. We apply this approach to explore the robustness of adaptation options and pathways against future climate and socioeconomic uncertainties in the Cauvery River Basin in Karnataka, India. A water resources model is calibrated and validated satisfactorily using observed streamflow. Plausible future changes in Indian Summer Monsoon (ISM) precipitation and water demand are used to drive simulations of water resources from 2021 to 2055. Two stakeholder‐identified decision‐critical metrics are examined: a basin‐wide metric comprising legal instream flow requirements for the downstream state of Tamil Nadu, and a local metric comprising water supply reliability to Bangalore city. In model simulations, the ability to satisfy these performance metrics without adaptation is reduced under almost all scenarios. Implementing adaptation options can partially offset the negative impacts of change. Sequencing of options according to stakeholder priorities into Adaptation Pathways affects metric satisfaction. Early focus on agricultural demand management improves the robustness of pathways but trade‐offs emerge between intrabasin and basin‐wide water availability. We demonstrate that the fine balance between water availability and demand is vulnerable to future changes and uncertainty. Despite current and long‐term planning challenges, stakeholders in developing countries may engage meaningfully in coproduction approaches for adaptation decision‐making under deep uncertainty. PMID:29706676

  12. Water Resource Planning Under Future Climate and Socioeconomic Uncertainty in the Cauvery River Basin in Karnataka, India.

    PubMed

    Bhave, Ajay Gajanan; Conway, Declan; Dessai, Suraje; Stainforth, David A

    2018-02-01

    Decision-Making Under Uncertainty (DMUU) approaches have been less utilized in developing countries than developed countries for water resources contexts. High climate vulnerability and rapid socioeconomic change often characterize developing country contexts, making DMUU approaches relevant. We develop an iterative multi-method DMUU approach, including scenario generation, coproduction with stakeholders and water resources modeling. We apply this approach to explore the robustness of adaptation options and pathways against future climate and socioeconomic uncertainties in the Cauvery River Basin in Karnataka, India. A water resources model is calibrated and validated satisfactorily using observed streamflow. Plausible future changes in Indian Summer Monsoon (ISM) precipitation and water demand are used to drive simulations of water resources from 2021 to 2055. Two stakeholder-identified decision-critical metrics are examined: a basin-wide metric comprising legal instream flow requirements for the downstream state of Tamil Nadu, and a local metric comprising water supply reliability to Bangalore city. In model simulations, the ability to satisfy these performance metrics without adaptation is reduced under almost all scenarios. Implementing adaptation options can partially offset the negative impacts of change. Sequencing of options according to stakeholder priorities into Adaptation Pathways affects metric satisfaction. Early focus on agricultural demand management improves the robustness of pathways but trade-offs emerge between intrabasin and basin-wide water availability. We demonstrate that the fine balance between water availability and demand is vulnerable to future changes and uncertainty. Despite current and long-term planning challenges, stakeholders in developing countries may engage meaningfully in coproduction approaches for adaptation decision-making under deep uncertainty.

  13. Water Resource Planning Under Future Climate and Socioeconomic Uncertainty in the Cauvery River Basin in Karnataka, India

    NASA Astrophysics Data System (ADS)

    Bhave, Ajay Gajanan; Conway, Declan; Dessai, Suraje; Stainforth, David A.

    2018-02-01

    Decision-Making Under Uncertainty (DMUU) approaches have been less utilized in developing countries than developed countries for water resources contexts. High climate vulnerability and rapid socioeconomic change often characterize developing country contexts, making DMUU approaches relevant. We develop an iterative multi-method DMUU approach, including scenario generation, coproduction with stakeholders and water resources modeling. We apply this approach to explore the robustness of adaptation options and pathways against future climate and socioeconomic uncertainties in the Cauvery River Basin in Karnataka, India. A water resources model is calibrated and validated satisfactorily using observed streamflow. Plausible future changes in Indian Summer Monsoon (ISM) precipitation and water demand are used to drive simulations of water resources from 2021 to 2055. Two stakeholder-identified decision-critical metrics are examined: a basin-wide metric comprising legal instream flow requirements for the downstream state of Tamil Nadu, and a local metric comprising water supply reliability to Bangalore city. In model simulations, the ability to satisfy these performance metrics without adaptation is reduced under almost all scenarios. Implementing adaptation options can partially offset the negative impacts of change. Sequencing of options according to stakeholder priorities into Adaptation Pathways affects metric satisfaction. Early focus on agricultural demand management improves the robustness of pathways but trade-offs emerge between intrabasin and basin-wide water availability. We demonstrate that the fine balance between water availability and demand is vulnerable to future changes and uncertainty. Despite current and long-term planning challenges, stakeholders in developing countries may engage meaningfully in coproduction approaches for adaptation decision-making under deep uncertainty.

  14. Quantifying model uncertainty in seasonal Arctic sea-ice forecasts

    NASA Astrophysics Data System (ADS)

    Blanchard-Wrigglesworth, Edward; Barthélemy, Antoine; Chevallier, Matthieu; Cullather, Richard; Fučkar, Neven; Massonnet, François; Posey, Pamela; Wang, Wanqiu; Zhang, Jinlun; Ardilouze, Constantin; Bitz, Cecilia; Vernieres, Guillaume; Wallcraft, Alan; Wang, Muyin

    2017-04-01

    Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or post-processing techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.
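The partition of forecast variance into model uncertainty and irreducible (initial-condition) error growth can be sketched with the law of total variance on a toy multi-model ensemble; all numbers are synthetic, not SIO forecasts:

```python
import numpy as np

# Synthetic post-processed forecast ensemble (illustration only):
# rows = dynamical models, columns = perturbed initial-condition members.
forecasts = np.array([
    [4.6, 4.9, 4.4, 5.1],   # model A, September extent in 10^6 km^2
    [4.7, 4.3, 5.0, 4.8],   # model B
    [4.5, 5.2, 4.6, 4.9],   # model C
])

# Model uncertainty: spread of the ensemble-mean forecasts across models.
model_var = forecasts.mean(axis=1).var()

# Irreducible error growth: average within-model (initial-condition) spread.
internal_var = forecasts.var(axis=1).mean()

# With equal member counts these two components sum to the total variance.
total_var = forecasts.var()
```

In this synthetic example the within-model spread dominates, mirroring the abstract's finding that after post-processing the irreducible component becomes the main contributor.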

  15. Probabilistic assessment of the impact of coal seam gas development on groundwater: Surat Basin, Australia

    NASA Astrophysics Data System (ADS)

    Cui, Tao; Moore, Catherine; Raiber, Matthias

    2018-05-01

    Modelling cumulative impacts of basin-scale coal seam gas (CSG) extraction is challenging due to the long time frames and spatial extent over which impacts occur combined with the need to consider local-scale processes. The computational burden of such models limits the ability to undertake calibration and sensitivity and uncertainty analyses. A framework is presented that integrates recently developed methods and tools to address the computational burdens of an assessment of drawdown impacts associated with rapid CSG development in the Surat Basin, Australia. The null space Monte Carlo method combined with singular value decomposition (SVD)-assisted regularisation was used to analyse the uncertainty of simulated drawdown impacts. The study also describes how the computational burden of assessing local-scale impacts was mitigated by adopting a novel combination of a nested modelling framework which incorporated a model emulator of drawdown in dual-phase flow conditions, and a methodology for representing local faulting. This combination provides a mechanism to support more reliable estimates of regional CSG-related drawdown predictions. The study indicates that uncertainties associated with boundary conditions are reduced significantly when expressing differences between scenarios. The results are analysed and distilled to enable the easy identification of areas where the simulated maximum drawdown impacts could exceed trigger points associated with legislative 'make good' requirements; trigger points require that either an adjustment in the development scheme or other measures are implemented to remediate the impact. This report contributes to the currently small body of work that describes modelling and uncertainty analyses of CSG extraction impacts on groundwater.
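The null space Monte Carlo idea, generating parameter realizations that leave the calibrated fit unchanged (in a linearized sense), can be sketched for a generic under-determined linear model; the Jacobian and dimensions here are synthetic, not the Surat Basin model:

```python
import numpy as np

rng = np.random.default_rng(5)

# Linearized model: observations = J @ parameters (illustration only).
# 5 observations, 12 parameters -> under-determined, 7-dim null space.
J = rng.normal(size=(5, 12))
p_cal = rng.normal(size=12)          # calibrated parameter set
obs = J @ p_cal                      # data the calibration reproduces

# SVD: the rows of Vt beyond the rank of J span its null space.
U, s, Vt = np.linalg.svd(J)
null_basis = Vt[len(s):].T           # shape (12, 7)

# NSMC-style realization: add a random combination of null-space vectors;
# the (linearized) fit to the calibration data is preserved exactly.
realization = p_cal + null_basis @ rng.normal(size=null_basis.shape[1])
```

Running the full model for many such realizations then samples predictive uncertainty without repeating the expensive calibration for each realization, which is the computational saving the framework relies on.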

  16. Greenhouse gas network design using backward Lagrangian particle dispersion modelling - Part 1: Methodology and Australian test case

    NASA Astrophysics Data System (ADS)

    Ziehn, T.; Nickless, A.; Rayner, P. J.; Law, R. M.; Roff, G.; Fraser, P.

    2014-03-01

    This paper describes the generation of optimal atmospheric measurement networks for determining carbon dioxide fluxes over Australia using inverse methods. A Lagrangian particle dispersion model is used in reverse mode together with a Bayesian inverse modelling framework to calculate the relationship between weekly surface fluxes and hourly concentration observations for the Australian continent. Meteorological driving fields are provided by the regional version of the Australian Community Climate and Earth System Simulator (ACCESS) at 12 km resolution at an hourly time scale. Prior uncertainties are derived on a weekly time scale for biosphere fluxes and fossil fuel emissions from high resolution BIOS2 model runs and from the Fossil Fuel Data Assimilation System (FFDAS), respectively. The influence from outside the modelled domain is investigated, but proves to be negligible for the network design. Existing ground based measurement stations in Australia are assessed in terms of their ability to constrain local flux estimates from the land. We find that the six stations that are currently operational are already able to reduce the uncertainties on surface flux estimates by about 30%. A candidate list of 59 stations is generated based on logistic constraints and an incremental optimization scheme is used to extend the network of existing stations. In order to achieve an uncertainty reduction of about 50% we need to double the number of measurement stations in Australia. Assuming equal data uncertainties for all sites, new stations would be mainly located in the northern and eastern part of the continent.
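The incremental network optimization can be sketched in a linear-Gaussian setting: each candidate station contributes rows to a sensitivity matrix, the posterior flux covariance follows from Bayes' rule, and stations are added greedily to maximize the uncertainty reduction. All matrices and dimensions below are synthetic stand-ins for the transport-model sensitivities:

```python
import numpy as np

rng = np.random.default_rng(0)
n_flux, n_candidate = 8, 6

# Sensitivity of each candidate station's observations to the weekly
# fluxes, as would come from backward transport runs (synthetic here).
H_all = rng.normal(size=(n_candidate, n_flux))
P = np.eye(n_flux)                 # prior flux covariance
r = 0.5                            # observation error variance

def posterior_trace(rows):
    """Total posterior flux variance for a network using the given stations."""
    H = H_all[rows]
    P_post = np.linalg.inv(np.linalg.inv(P) + H.T @ H / r)
    return np.trace(P_post)

# Incremental (greedy) optimization: repeatedly add the station that most
# reduces the average posterior flux uncertainty.
network = []
while len(network) < 3:
    best = min(set(range(n_candidate)) - set(network),
               key=lambda i: posterior_trace(network + [i]))
    network.append(best)

reduction = 1.0 - posterior_trace(network) / np.trace(P)
```

The uncertainty-reduction percentages quoted in the abstract (about 30% for the existing six stations, about 50% for a doubled network) are exactly this kind of ratio, computed with the real transport sensitivities and prior flux uncertainties.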

  17. A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.

    2012-08-01

    Integrating economic and groundwater models for groundwater management can help improve understanding of the trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and of decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used, including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
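Of the methods listed, block bootstrap time-series sampling is easy to sketch: contiguous blocks are resampled so that short-range temporal dependence survives in each replicate. The series and block length below are illustrative, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(11)

def block_bootstrap(series, block_len, rng):
    """Moving-block bootstrap: resample contiguous blocks so that
    short-range temporal dependence is preserved in each replicate."""
    series = np.asarray(series)
    n = len(series)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    blocks = [series[s:s + block_len] for s in starts]
    return np.concatenate(blocks)[:n]

# E.g. 30 years of annual recharge, resampled in 5-year blocks
recharge = rng.normal(100.0, 20.0, size=30)
replicates = [block_bootstrap(recharge, 5, rng) for _ in range(500)]
```

Each replicate can then drive the coupled economic-groundwater model, giving an ensemble of outcomes that reflects the sampling uncertainty in the hydrologic record.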

  18. Our Sun IV: The Standard Model and Helioseismology: Consequences of Uncertainties in Input Physics and in Observed Solar Parameters

    NASA Technical Reports Server (NTRS)

    Boothroyd, Arnold I.; Sackmann, I.-Juliana

    2001-01-01

    Helioseismic frequency observations provide an extremely accurate window into the solar interior; frequencies from the Michelson Doppler Imager (MDI) on the Solar and Heliospheric Observatory (SOHO) spacecraft enable the adiabatic sound speed and adiabatic index to be inferred with an accuracy of a few parts in 10(exp 4) and the density with an accuracy of a few parts in 10(exp 3). This has become a serious challenge to theoretical models of the Sun. Therefore, we have undertaken a self-consistent, systematic study of the sources of uncertainties in the standard solar models. We found that the largest effect on the interior structure arises from the observational uncertainties in the photospheric abundances of the elements, which affect the sound speed profile at the level of 3 parts in 10(exp 3). The estimated 4% uncertainty in the OPAL opacities could lead to effects of 1 part in 10(exp 3); the approximately 5% uncertainty in the basic pp nuclear reaction rate would have a similar effect, as would uncertainties of approximately 15% in the diffusion constants for the gravitational settling of helium. The approximately 50% uncertainties in the diffusion constants for the heavier elements would have nearly as large an effect. Different observational methods for determining the solar radius yield results differing by as much as 7 parts in 10(exp 4); we found that this leads to uncertainties of a few parts in 10(exp 3) in the sound speed in the solar convective envelope, but has negligible effect on the interior. Our reference standard solar model yielded a convective envelope position of 0.7135 solar radius, in excellent agreement with the observed value of 0.713 +/- 0.001 solar radius, and was significantly affected only by Z/X, the pp rate, and the uncertainties in the helium diffusion constants. 
Our reference model also yielded an envelope helium abundance of 0.2424, in good agreement with the approximate range of 0.24 to 0.25 inferred from helioseismic observations; only extreme Z/X values yielded an envelope helium abundance outside this range. We found that other current uncertainties, namely, in the solar age and luminosity, in nuclear rates other than the pp reaction, in the low-temperature molecular opacities, and in the low-density equation of state, have no significant effect on the quantities that can be inferred from helioseismic observations. The predicted pre-main-sequence lithium depletion is uncertain by a factor of 2. The predicted neutrino capture rate is uncertain by approximately 30% for the Cl-37 experiment and by approximately 3% for the Ga-71 experiments, while the B-8 neutrino flux is uncertain by approximately 30%.

  19. Biophysical and Economic Uncertainty in the Analysis of Poverty Impacts of Climate Change

    NASA Astrophysics Data System (ADS)

    Hertel, T. W.; Lobell, D. B.; Verma, M.

    2011-12-01

    This paper seeks to understand the main sources of uncertainty in assessing the impacts of climate change on agricultural output, international trade, and poverty. We incorporate biophysical uncertainty by sampling from a distribution of global climate model predictions for temperature and precipitation for 2050. The implications of these realizations for crop yields around the globe are estimated using the recently published statistical crop yield functions provided by Lobell, Schlenker and Costa-Roberts (2011). By comparing these yields to those predicted under current climate, we obtain the likely change in crop yields owing to climate change. The economic uncertainty in our analysis relates to the response of the global economic system to these biophysical shocks. We use a modified version of the GTAP model to elicit the impact of the biophysical shocks on global patterns of production, consumption, trade and poverty. Uncertainty in these responses is reflected in the econometrically estimated parameters governing the responsiveness of international trade, consumption, production (and hence the intensive margin of supply response), and factor supplies (which govern the extensive margin of supply response). We sample from the distributions of these parameters as specified by Hertel et al. (2007) and Keeney and Hertel (2009). We find that, even though it is difficult to predict where in the world agricultural crops will be favorably affected by climate change, the responses of economic variables, including output and exports can be far more robust (Table 1). This is due to the fact that supply and demand decisions depend on relative prices, and relative prices depend on productivity changes relative to other crops in a given region, or relative to similar crops in other parts of the world. We also find that uncertainty in poverty impacts of climate change appears to be almost entirely driven by biophysical uncertainty.

  20. On the uncertainty of phenological responses to climate change and its implication for terrestrial biosphere models

    NASA Astrophysics Data System (ADS)

    Migliavacca, M.; Sonnentag, O.; Keenan, T. F.; Cescatti, A.; O'Keefe, J.; Richardson, A. D.

    2012-01-01

    Phenology, the timing of recurring life cycle events, controls numerous land surface feedbacks to the climate system through the regulation of exchanges of carbon, water and energy between the biosphere and atmosphere. Land surface models, however, are known to have systematic errors in the simulation of spring phenology, which could propagate to uncertainty in modeled responses to future climate change. Here, we analyzed the Harvard Forest phenology record to investigate and characterize the sources of uncertainty in phenological forecasts and the subsequent impacts on model forecasts of carbon and water cycling in the future. Using a model-data fusion approach, we combined information from 20 yr of phenological observations of 11 North American woody species with 12 phenological models of different complexity to predict leaf bud-burst. The evaluation of the different phenological models indicated support for spring warming models with photoperiod limitations and, though to a lesser extent, for chilling models based on the alternating model structure. We assessed three different sources of uncertainty in phenological forecasts: parameter uncertainty, model uncertainty, and driver uncertainty. The latter was characterized by running the models to 2099 using two different IPCC climate scenarios (A1fi vs. B1, i.e. high vs. low CO2 emissions). Parameter uncertainty was the smallest (average 95% CI: 2.4 day century-1 for scenario B1 and 4.5 day century-1 for A1fi), whereas driver uncertainty was the largest (up to 8.4 day century-1 in the simulated trends). The uncertainty related to model structure is also large, and the predicted bud-burst trends as well as the shape of the smoothed projections varied somewhat among models (±7.7 day century-1 for A1fi, ±3.6 day century-1 for B1). The forecast sensitivity of bud-burst to temperature (i.e. 
days bud-burst advanced per degree of warming) varied between 2.2 day °C-1 and 5.2 day °C-1 depending on model structure. We quantified the impact of uncertainties in bud-burst forecasts on simulated carbon and water fluxes using a process-based terrestrial biosphere model. Uncertainty in phenology model structure led to uncertainty in the description of the seasonality of processes, which accumulated to uncertainty in annual model estimates of gross primary productivity (GPP) and evapotranspiration (ET) of 9.6% and 2.9% respectively. A sensitivity analysis shows that a variation of ±10 days in bud-burst dates led to a variation of ±5.0% for annual GPP and about ±2.0% for ET. For phenology models, differences among future climate scenarios represent the largest source of uncertainty, followed by uncertainties related to model structure, and finally, uncertainties related to model parameterization. The uncertainties we have quantified will affect the description of the seasonality of processes and in particular the simulation of carbon uptake by forest ecosystems, with a larger impact of uncertainties related to phenology model structure, followed by uncertainties related to phenological model parameterization.
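    The spring warming (thermal time) models favored above accumulate forcing above a base temperature until a critical sum is reached. A minimal sketch follows; the base temperature and critical forcing sum are illustrative placeholders, not the values fitted to the Harvard Forest record:

    ```python
    def budburst_day(daily_tmean, t_base=5.0, f_crit=150.0, start_day=1):
        """Predict bud-burst day-of-year with a simple spring-warming model:
        accumulate degree-days above t_base from start_day until the forcing
        sum reaches f_crit. Returns None if the requirement is never met."""
        forcing = 0.0
        for doy, t in enumerate(daily_tmean, start=1):
            if doy < start_day:
                continue
            forcing += max(0.0, t - t_base)
            if forcing >= f_crit:
                return doy
        return None
    ```

    Running the same model under a warmer temperature series yields an earlier predicted bud-burst date, which is exactly the sensitivity (days advanced per degree of warming) quantified in the study.
    
    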

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muller, L; Soldner, A; Kirk, M

    Purpose: The beam range uncertainty presents a special challenge for proton therapy. Novel technologies currently under development offer strategies to reduce the range uncertainty [1,2]. This work quantifies the potential advantages that could be realized by such a reduction for dosimetrically challenging chordomas at the base of skull. Therapeutic improvement was assessed by evaluating tumor control probabilities (TCP) and normal tissue complication probabilities (NTCP). Methods: Treatment plans were made for a modulated-scanned proton delivery technique using the Eclipse treatment planning system. The prescription dose was 7920 cGy to the CTV. Three different range uncertainty scenarios were considered: 5 mm (3.5% of the beam range + 1 mm, representing current clinical practice, “Curr”), 2 mm (1.3%), and 1 mm (0.7%). For each of 4 patients, 3 different PTVs were defined via uniform expansion of the CTV by the value of the range uncertainty. Tumor control probability (TCP) and normal tissue complication probabilities (NTCPs) for organs-at-risk (OARs) were calculated using the Lyman-Kutcher-Burman [3] formalism and published model parameters (Terahara et al. [4]; QUANTEC; Burman et al.). Our plan optimization strategy was to achieve PTV coverage close to prescription while maintaining OAR NTCP values at or better than the Curr plan. Results: The average TCP values for the 5, 2, and 1 mm range uncertainty scenarios were 51%, 55% and 65%, respectively. The improvement in TCP for patients was between 4 and 30%, depending primarily on the proximity of the GTV to OARs. The average NTCPs for the brainstem and cord were about 4% and 1%, respectively, for all target margins. Conclusion: For base of skull chordomas, reduced target margins can substantially increase the TCP without increasing the NTCP. This work demonstrates the potential significance of a reduction in the range uncertainty for proton beams.
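    The Lyman-Kutcher-Burman formalism used above combines a generalized equivalent uniform dose (gEUD) with a probit dose-response. A minimal sketch; the parameter values passed in practice (TD50, m, n) would be the published organ-specific fits, not the illustrative numbers used here:

    ```python
    import math

    def geud(doses, volumes, n):
        """Generalized equivalent uniform dose for a dose-volume histogram;
        volumes are fractional (summing to 1), n is the volume-effect parameter."""
        return sum(v * d ** (1.0 / n) for d, v in zip(doses, volumes)) ** n

    def ntcp_lkb(doses, volumes, td50, m, n):
        """LKB NTCP: normal CDF (probit) of t = (gEUD - TD50) / (m * TD50)."""
        t = (geud(doses, volumes, n) - td50) / (m * td50)
        return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
    ```

    A uniform dose exactly equal to TD50 gives NTCP = 50% by construction, which is a convenient sanity check on any implementation.
    
    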

  2. Exploiting Surface Albedos Products to Bridge the Gap Between Remote Sensing Information and Climate Models

    NASA Astrophysics Data System (ADS)

    Pinty, Bernard; Andredakis, Ioannis; Clerici, Marco; Kaminski, Thomas; Taberner, Malcolm; Plummer, Stephen

    2011-01-01

    We present results from the application of an inversion method conducted using MODIS derived broadband visible and near-infrared surface albedo products. This contribution is an extension of earlier efforts to optimally retrieve land surface fluxes and associated two-stream model parameters based on the Joint Research Centre Two-stream Inversion Package (JRC-TIP). The discussion focuses on products (based on the mean and one-sigma values of the Probability Distribution Functions (PDFs)) obtained during the summer and winter and highlights specific issues related to snowy conditions. This paper discusses the retrieved model parameters, including the effective Leaf Area Index (LAI), the background brightness and the scattering efficiency of the vegetation elements. The spatial and seasonal changes exhibited by these parameters agree with common knowledge and underscore the richness of the high quality surface albedo data sets. At the same time, the opportunity to generate global maps of new products, such as the background albedo, underscores the advantages of using state of the art algorithmic approaches capable of fully exploiting accurate satellite remote sensing datasets. The detailed analyses of the retrieval uncertainties highlight the central role and contribution of the LAI, the main process parameter for interpreting radiation transfer observations over vegetated surfaces. The posterior covariance matrix of the uncertainties is further exploited to quantify the knowledge gain from the ingestion of MODIS surface albedo products. The estimation of the radiation fluxes that are absorbed, transmitted and scattered by the vegetation layer and its background is achieved on the basis of the retrieved PDFs of the model parameters. The propagation of uncertainties from the observations to the model parameters is achieved via the Hessian of the cost function and yields a covariance matrix of posterior parameter uncertainties. 
This matrix is propagated to the radiation fluxes via the model’s Jacobian matrix of first derivatives. A definite asset of the JRC-TIP lies in its capability to control and ultimately relax a number of assumptions that are often implicit in traditional approaches. These features greatly help understand the discrepancies between the different data sets of land surface properties and fluxes that are currently available. Through a series of selected examples, the inverse procedure implemented in the JRC-TIP is shown to be robust, reliable and compliant with large scale processing requirements. Furthermore, this package ensures the physical consistency between the set of observations, the two-stream model parameters and radiation fluxes. It also documents the retrieval of associated uncertainties. The knowledge gained from the availability of remote sensing surface albedo products can be expressed in quantitative terms using a simple metric. This metric helps identify the geographical locations and periods of the year where the remote sensing products fail in reducing the uncertainty on the process model parameters as can be specified from current knowledge.
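    The propagation step described here, pushing the posterior parameter covariance through the model's Jacobian of first derivatives, is the standard first-order (linearized) rule cov_f = J cov_p Jᵀ. A dependency-free sketch:

    ```python
    def propagate_covariance(J, cov_p):
        """First-order propagation of a parameter covariance matrix cov_p
        to output (flux) space via the model Jacobian J: returns J cov_p J^T.
        Matrices are lists of rows; no external libraries required."""
        def matmul(A, B):
            return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
                     for j in range(len(B[0]))] for i in range(len(A))]
        Jt = [list(row) for row in zip(*J)]  # transpose of J
        return matmul(matmul(J, cov_p), Jt)
    ```

    With a diagonal Jacobian and independent parameters, each output variance is simply the parameter variance scaled by the squared sensitivity, which makes the linearization easy to verify by hand.
    
    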

  3. Determination of Uncertainties for the New SSME Model

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Hawk, Clark W.

    1996-01-01

    This report discusses the uncertainty analysis performed in support of a new test analysis and performance prediction model for the Space Shuttle Main Engine. The new model uses uncertainty estimates for the experimental data and for the analytical model to obtain the most plausible operating condition for the engine system. This report discusses the development of the data sets and uncertainty estimates used in the new model, presents the application of uncertainty analysis to analytical models, including the uncertainty analysis for the conservation of mass and energy balance relations, and introduces a new methodology for assessing the uncertainty associated with linear regressions.
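    Uncertainty for a straight-line regression of the kind mentioned above follows from the residual variance of the fit. A minimal ordinary-least-squares sketch (generic textbook formulas, not the report's specific methodology):

    ```python
    import math

    def linfit_with_uncertainty(x, y):
        """Ordinary least-squares fit y = a + b*x, returning
        (a, b, se_a, se_b), where the standard errors come from the
        residual variance s^2 = SSE / (n - 2)."""
        n = len(x)
        xbar = sum(x) / n
        ybar = sum(y) / n
        sxx = sum((xi - xbar) ** 2 for xi in x)
        sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
        b = sxy / sxx
        a = ybar - b * xbar
        sse = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
        s2 = sse / (n - 2)                      # residual variance
        se_b = math.sqrt(s2 / sxx)              # slope standard error
        se_a = math.sqrt(s2 * (1.0 / n + xbar ** 2 / sxx))  # intercept SE
        return a, b, se_a, se_b
    ```
    
    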

  4. Matching experimental and three dimensional numerical models for structural vibration problems with uncertainties

    NASA Astrophysics Data System (ADS)

    Langer, P.; Sepahvand, K.; Guist, C.; Bär, J.; Peplow, A.; Marburg, S.

    2018-03-01

    A simulation model that examines the dynamic behavior of real structures needs to address the impact of uncertainty in both geometry and material parameters. This article investigates three-dimensional finite element models for structural dynamics problems with respect to both model and parameter uncertainties. The parameter uncertainties are determined via laboratory measurements on several beam-like samples. The parameters are then treated as random variables in the finite element model to explore the effects of uncertainty on the quality of the model outputs, i.e. natural frequencies. The accuracy of the model's output predictions is compared with the experimental results. To this end, non-contact experimental modal analysis is conducted to identify the natural frequencies of the samples. The results show good agreement with the experimental data. Furthermore, it is demonstrated that geometrical uncertainties influence the natural frequencies more than material parameters do, even though the measured material uncertainties are about two times larger than the geometrical ones. This gives valuable insight for improving the finite element model over the various parameter ranges required in a modeling process involving uncertainty.
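    The competing influence of geometric versus material scatter on natural frequencies can be illustrated by Monte Carlo propagation through the Euler-Bernoulli formula for a rectangular cantilever. The dimensions and scatter levels below are hypothetical, not the measured beam samples of the article:

    ```python
    import math
    import random

    def cantilever_f1(E, rho, L, b, h):
        """First bending natural frequency (Hz) of a rectangular
        cantilever beam from Euler-Bernoulli theory (lambda_1 = 1.875104)."""
        I = b * h ** 3 / 12.0   # second moment of area
        A = b * h               # cross-sectional area
        return (1.875104 ** 2 / (2.0 * math.pi)) * math.sqrt(E * I / (rho * A * L ** 4))

    def mc_frequency(n, rng=random):
        """Propagate assumed scatter in modulus (2%) and thickness (1%)
        to f1 by Monte Carlo; returns (mean, std) of the sampled frequencies."""
        fs = []
        for _ in range(n):
            E = rng.gauss(210e9, 210e9 * 0.02)   # steel-like modulus
            h = rng.gauss(0.01, 0.01 * 0.01)     # 10 mm nominal thickness
            fs.append(cantilever_f1(E, 7850.0, 0.3, 0.02, h))
        mean = sum(fs) / n
        std = (sum((f - mean) ** 2 for f in fs) / (n - 1)) ** 0.5
        return mean, std
    ```

    Because f1 scales linearly with thickness but only with the square root of the modulus, a given percentage of geometric scatter moves the frequency about twice as much as the same percentage of material scatter, consistent with the article's finding.
    
    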

  5. Evaluating Predictive Uncertainty of Hyporheic Exchange Modelling

    NASA Astrophysics Data System (ADS)

    Chow, R.; Bennett, J.; Dugge, J.; Wöhling, T.; Nowak, W.

    2017-12-01

    Hyporheic exchange is the interaction of water between rivers and groundwater, and is difficult to predict. One of the largest contributions to predictive uncertainty for hyporheic fluxes has been attributed to the representation of heterogeneous subsurface properties. This research aims to evaluate which aspect of the subsurface representation - the spatial distribution of hydrofacies or the model for local-scale (within-facies) heterogeneity - most influences the predictive uncertainty. We also seek to identify the data types that best reduce this uncertainty. For this investigation, we conduct a modelling study of the Steinlach River meander in Southwest Germany, an experimental site established in 2010 to monitor hyporheic exchange at the meander scale. We use HydroGeoSphere, a fully integrated surface water-groundwater model, to model hyporheic exchange and to assess the predictive uncertainty of hyporheic exchange transit times (HETT). A highly parameterized complex model is built and treated as ‘virtual reality’, which is in turn modelled with simpler subsurface parameterization schemes (Figure). We then conduct Monte-Carlo simulations with these models to estimate the predictive uncertainty. Results indicate that: uncertainty in HETT is relatively small for early times and increases with transit time; uncertainty from local-scale heterogeneity is negligible compared to uncertainty in the hydrofacies distribution; introducing more data to a poor model structure may reduce predictive variance, but does not reduce predictive bias; and hydraulic head observations alone cannot constrain the uncertainty of HETT, whereas an estimate of hyporheic exchange flux proves more effective at reducing this uncertainty. Figure: Approach for evaluating predictive model uncertainty. A conceptual model is first developed from the field investigations. 
A complex model (‘virtual reality’) is then developed based on that conceptual model. This complex model then serves as the basis to compare simpler model structures. Through this approach, predictive uncertainty can be quantified relative to a known reference solution.
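    Evaluating a simpler model against a known "virtual reality" reference amounts to separating ensemble bias (which more data cannot fix under a poor structure) from ensemble variance (which data can shrink). A minimal sketch of that decomposition:

    ```python
    def predictive_error_stats(ensemble, reference):
        """Summarize a Monte-Carlo ensemble of predictions against a known
        reference value: returns (bias, variance), where bias is the
        ensemble-mean error and variance is the sample variance."""
        n = len(ensemble)
        mean = sum(ensemble) / n
        bias = mean - reference
        var = sum((e - mean) ** 2 for e in ensemble) / (n - 1)
        return bias, var
    ```

    A tighter ensemble (small variance) centered away from the reference (large bias) is exactly the failure mode the study warns about when more data is fed to a structurally poor model.
    
    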

  6. Observed and Modeled HOCl Profiles in the Midlatitude Stratosphere: Implication for Ozone Loss

    NASA Technical Reports Server (NTRS)

    Kovalenko, L. J.; Jucks, K. W.; Salawitch, R. J.; Toon, G. C.; Blavier, J. F.; Johnson, D. G.; Kleinbohl, A.; Livesey, N. J .; Margitan, J. J.; Pickett, H. M.; hide

    2007-01-01

    Vertical profiles of stratospheric HOCl calculated with a diurnal steady-state photochemical model that uses currently recommended reaction rates and photolysis cross sections underestimate observed profiles of HOCl obtained by two balloon-borne instruments, FIRS-2 (a far-infrared emission spectrometer) and MkIV (a mid-infrared, solar absorption spectrometer). Considerable uncertainty (a factor of two) persists in laboratory measurements of the rate constant (k(sub 1)) for the reaction ClO + HO2 yields HOCl + O2. Agreement between modeled and measured HOCl can be attained using a value of k(sub 1) from Stimpfle et al. (1979) that is about a factor-of-two faster than the currently recommended rate constant. Comparison of modeled and measured HOCl suggests that models using the currently recommended value for k(sub 1) may underestimate the role of the HOCl catalytic cycle for ozone depletion, important in the midlatitude lower stratosphere.
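    The sensitivity of modeled HOCl to k1 follows from a daytime photochemical steady state in which production by ClO + HO2 balances loss by photolysis, so [HOCl] scales linearly with k1. A sketch with illustrative magnitudes (the concentrations and photolysis rate below are placeholders, not the balloon measurements):

    ```python
    def hocl_steady_state(k1, clo, ho2, j_hocl):
        """Daytime photochemical steady state for HOCl: production by
        ClO + HO2 -> HOCl + O2 (rate constant k1) balanced by photolysis
        at frequency j_hocl. Concentrations in molecules cm^-3."""
        return k1 * clo * ho2 / j_hocl
    ```

    Doubling k1 doubles the steady-state HOCl, which is why a factor-of-two uncertainty in the laboratory rate constant translates directly into the model-measurement discrepancy discussed above.
    
    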

  7. Variability And Uncertainty Analysis Of Contaminant Transport Model Using Fuzzy Latin Hypercube Sampling Technique

    NASA Astrophysics Data System (ADS)

    Kumar, V.; Nayagum, D.; Thornton, S.; Banwart, S.; Schuhmacher, M.; Lerner, D.

    2006-12-01

    Characterization of uncertainty associated with groundwater quality models is often of critical importance, for example where environmental models are employed in risk assessment. Insufficient data, inherent variability and estimation errors of environmental model parameters introduce uncertainty into model predictions. However, uncertainty analysis using conventional methods such as standard Monte Carlo sampling (MCS) may not be efficient, or even suitable, for complex, computationally demanding models involving variability and uncertainty of different natures. General MCS, or variants such as Latin Hypercube Sampling (LHS), treats variability and uncertainty as a single random entity, and the generated samples are treated as crisp values, in effect treating vagueness as randomness. Also, when the models are used as purely predictive tools, uncertainty and variability lead to the need to assess the plausible range of model outputs. An improved, systematic variability and uncertainty analysis can provide insight into the level of confidence in model estimates, and can aid in assessing how various possible model estimates should be weighed. The present study introduces Fuzzy Latin Hypercube Sampling (FLHS), a hybrid approach that incorporates cognitive and noncognitive uncertainties. Noncognitive uncertainty, such as physical randomness or statistical uncertainty due to limited information, can be described by its own probability density function (PDF), whereas cognitive uncertainty, such as estimation error, can be described by a membership function for its fuzziness and by confidence intervals via α-cuts. An important property of this theory is its ability to merge the inexact generated data of the LHS approach to increase the quality of information. The FLHS technique ensures that the entire range of each variable is sampled with proper incorporation of uncertainty and variability. 
A fuzzified statistical summary of the model results will produce indices of sensitivity and uncertainty that relate the effects of heterogeneity and uncertainty of input variables to model predictions. The feasibility of the method is validated to assess uncertainty propagation of parameter values for estimation of the contamination level of a drinking water supply well due to transport of dissolved phenolics from a contaminated site in the UK.
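    Standard Latin Hypercube Sampling, the non-fuzzy building block of FLHS, divides each variable's range into equal-probability strata and draws exactly one sample per stratum. A minimal sketch on the unit hypercube:

    ```python
    import random

    def latin_hypercube(n_samples, n_vars, rng=random):
        """Latin Hypercube Sample on [0, 1)^n_vars: each variable's range
        is split into n_samples equal strata, every stratum is sampled
        exactly once, and the strata are independently permuted per variable."""
        samples = [[0.0] * n_vars for _ in range(n_samples)]
        for j in range(n_vars):
            strata = list(range(n_samples))
            rng.shuffle(strata)
            for i in range(n_samples):
                # one uniform draw inside the assigned stratum
                samples[i][j] = (strata[i] + rng.random()) / n_samples
        return samples
    ```

    Mapping each coordinate through a variable's inverse CDF turns these uniform draws into stratified samples of that distribution, which is how the "entire range of each variable" guarantee is achieved.
    
    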

  8. A new retrieval algorithm for tropospheric temperature, humidity and pressure profiling based on GNSS radio occultation data

    NASA Astrophysics Data System (ADS)

    Kirchengast, Gottfried; Li, Ying; Scherllin-Pirscher, Barbara; Schwärz, Marc; Schwarz, Jakob; Nielsen, Johannes K.

    2017-04-01

    The GNSS radio occultation (RO) technique is an important remote sensing technique for obtaining thermodynamic profiles of temperature, humidity, and pressure in the Earth's troposphere. However, due to refraction effects of both dry ambient air and water vapor in the troposphere, retrieval of accurate thermodynamic profiles at these lower altitudes is challenging and requires suitable background information in addition to the RO refractivity information. Here we introduce a new moist air retrieval algorithm aiming to improve the quality and robustness of retrieving temperature, humidity and pressure profiles in moist air tropospheric conditions. The new algorithm consists of four steps: (1) use of prescribed specific humidity and its uncertainty to retrieve temperature and its associated uncertainty; (2) use of prescribed temperature and its uncertainty to retrieve specific humidity and its associated uncertainty; (3) use of the previous results to estimate final temperature and specific humidity profiles through optimal estimation; (4) determination of air pressure and density profiles from the results obtained before. The new algorithm does not require elaborated matrix inversions which are otherwise widely used in 1D-Var retrieval algorithms, and it allows a transparent uncertainty propagation, whereby the uncertainties of prescribed variables are dynamically estimated accounting for their spatial and temporal variations. Estimated random uncertainties are calculated by constructing error covariance matrices from co-located ECMWF short-range forecast and corresponding analysis profiles. Systematic uncertainties are estimated by empirical modeling. The influence of regarding or disregarding vertical error correlations is quantified. The new scheme is implemented with static input uncertainty profiles in WEGC's current OPSv5.6 processing system and with full scope in WEGC's next-generation system, the Reference Occultation Processing System (rOPS). 
Results from both WEGC systems, current OPSv5.6 and next-generation rOPS, are shown and discussed, based on both insights from individual profiles and statistical ensembles, and compared to moist air retrieval results from the UCAR Boulder and ROM-SAF Copenhagen centers. The results show that the new algorithmic scheme improves the temperature, humidity and pressure retrieval performance, in particular also the robustness including for integrated uncertainty estimation for large-scale applications, over the previous algorithms. The new rOPS-implemented algorithm will therefore be used in the first large-scale reprocessing towards a tropospheric climate data record 2001-2016 by the rOPS, including its integrated uncertainty propagation.
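    The optimal estimation in step (3) reduces, in the scalar uncorrelated case, to inverse-variance weighting of the two prior retrievals, with the combined uncertainty always smaller than either input. A minimal sketch (the full algorithm works on profiles with error covariance matrices, not scalars):

    ```python
    def optimal_estimate(x_a, sigma_a, x_b, sigma_b):
        """Combine two independent estimates by inverse-variance weighting
        (the scalar form of optimal estimation); returns the combined
        value and its one-sigma uncertainty."""
        wa = 1.0 / sigma_a ** 2
        wb = 1.0 / sigma_b ** 2
        x = (wa * x_a + wb * x_b) / (wa + wb)
        sigma = (wa + wb) ** -0.5
        return x, sigma
    ```
    
    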

  9. Development and Utilization of Regional Oceanic Modeling System (ROMS). Delicacy, Imprecision, and Uncertainty of Oceanic Simulations: An Investigation with the Regional Oceanic Modeling System (ROMS). Mixing in the Ocean Surface Layer Using the Regional Oceanic Modeling System (ROMS).

    DTIC Science & Technology

    2011-09-30

    community use for ROMS is biogeochemistry: chemical cycles, water quality, blooms, micro-nutrients, larval dispersal, biome transitions, and coupling to...J.C. McWilliams, X. Capet, and J. Kurian, 2010: Heat balance and eddies in the Peru-Chile Current System. Climate Dynamics, 37, in press. doi10.1007

  10. Modeling uncertainty: quicksand for water temperature modeling

    USGS Publications Warehouse

    Bartholow, John M.

    2003-01-01

    Uncertainty has been a hot topic relative to science generally, and modeling specifically. Modeling uncertainty comes in various forms: measured data, limited model domain, model parameter estimation, model structure, sensitivity to inputs, modelers themselves, and users of the results. This paper addresses important components of uncertainty in modeling water temperatures and discusses several areas that need attention as the modeling community grapples with how to incorporate uncertainty into modeling without getting stuck in the quicksand that prevents constructive contributions to policy making. The material, and in particular the references, are meant to supplement the presentation given at this conference.

  11. DATA ASSIMILATION APPROACH FOR FORECAST OF SOLAR ACTIVITY CYCLES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kitiashvili, Irina N., E-mail: irina.n.kitiashvili@nasa.gov

    Numerous attempts to predict future solar cycles are mostly based on empirical relations derived from observations of previous cycles, and they yield a wide range of predicted strengths and durations of the cycles. Results obtained with current dynamo models also deviate strongly from each other, thus raising questions about criteria to quantify the reliability of such predictions. The primary difficulties in modeling future solar activity are shortcomings of both the dynamo models and observations that do not allow us to determine the current and past states of the global solar magnetic structure and its dynamics. Data assimilation is a relatively new approach to develop physics-based predictions and estimate their uncertainties in situations where the physical properties of a system are not well-known. This paper presents an application of the ensemble Kalman filter method for modeling and prediction of solar cycles through use of a low-order nonlinear dynamo model that includes the essential physics and can describe general properties of the sunspot cycles. Despite the simplicity of this model, the data assimilation approach provides reasonable estimates for the strengths of future solar cycles. In particular, the prediction of Cycle 24 calculated and published in 2008 is so far holding up quite well. In this paper, I will present my first attempt to predict Cycle 25 using the data assimilation approach, and discuss the uncertainties of that prediction.
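    The ensemble Kalman filter analysis step can be sketched in scalar form with perturbed observations: each member is nudged toward the observation with a gain estimated from the ensemble spread. This is a generic EnKF illustration, not the paper's dynamo model:

    ```python
    import random

    def enkf_update(ensemble, obs, obs_var, rng=random):
        """Scalar ensemble Kalman filter analysis step with perturbed
        observations: gain = P / (P + R), where P is the ensemble variance
        and R the observation-error variance."""
        n = len(ensemble)
        mean = sum(ensemble) / n
        var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
        gain = var / (var + obs_var)
        return [x + gain * (obs + rng.gauss(0.0, obs_var ** 0.5) - x)
                for x in ensemble]
    ```

    Cycling this update with a forecast step of the dynamo model is what lets the filter estimate both the state and its uncertainty from sunspot observations.
    
    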

  12. Production of NOx by Lightning and its Effects on Atmospheric Chemistry

    NASA Technical Reports Server (NTRS)

    Pickering, Kenneth E.

    2009-01-01

    Production of NOx by lightning remains the NOx source with the greatest uncertainty. Current estimates of the global source strength range over a factor of four (from 2 to 8 TgN/year). Ongoing efforts to reduce this uncertainty through field programs, cloud-resolved modeling, global modeling, and satellite data analysis will be described in this seminar. Representation of the lightning source in global or regional chemical transport models requires three types of information: the distribution of lightning flashes as a function of time and space, the production of NOx per flash, and the effective vertical distribution of the lightning-injected NOx. Methods of specifying these items in a model will be discussed. For example, the current method of specifying flash rates in NASA's Global Modeling Initiative (GMI) chemical transport model will be discussed, as well as work underway in developing algorithms for use in the regional models CMAQ and WRF-Chem. A number of methods have been employed to estimate either the production per lightning flash or the production per unit flash length. Such estimates derived from cloud-resolved chemistry simulations and from satellite NO2 retrievals will be presented, as well as the methodologies employed. Cloud-resolved model output has also been used in developing vertical profiles of lightning NOx for use in global models. Effects of lightning NOx on O3 and HOx distributions will be illustrated regionally and globally.
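    One widely used way to specify flash rates in chemical transport models (not necessarily the GMI scheme discussed here) is the Price and Rind (1992) cloud-top-height parameterization; a sketch, with coefficients quoted from that paper:

    ```python
    def flash_rate(cloud_top_height_km, marine=False):
        """Price & Rind (1992) flash-rate parameterization (flashes per
        minute per storm) from convective cloud-top height in km:
        continental F = 3.44e-5 * H^4.9, marine F = 6.4e-4 * H^1.73."""
        if marine:
            return 6.4e-4 * cloud_top_height_km ** 1.73
        return 3.44e-5 * cloud_top_height_km ** 4.9
    ```

    The steep H^4.9 dependence means small errors in modeled convective depth translate into large errors in the lightning NOx source, one reason this source remains so uncertain.
    
    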

  13. Variability in modeled cloud feedback tied to differences in the climatological spatial pattern of clouds

    NASA Astrophysics Data System (ADS)

    Siler, Nicholas; Po-Chedley, Stephen; Bretherton, Christopher S.

    2018-02-01

    Despite the increasing sophistication of climate models, the amount of surface warming expected from a doubling of atmospheric CO2 (equilibrium climate sensitivity) remains stubbornly uncertain, in part because of differences in how models simulate the change in global albedo due to clouds (the shortwave cloud feedback). Here, model differences in the shortwave cloud feedback are found to be closely related to the spatial pattern of the cloud contribution to albedo (α) in simulations of the current climate: high-feedback models exhibit lower (higher) α in regions of warm (cool) sea-surface temperatures, and therefore predict a larger reduction in global-mean α as temperatures rise and warm regions expand. The spatial pattern of α is found to be strongly predictive (r = 0.84) of a model's global cloud feedback, with satellite observations indicating a most-likely value of 0.58 ± 0.31 W m-2 K-1 (90% confidence). This estimate is higher than the model-average cloud feedback of 0.43 W m-2 K-1, with half the range of uncertainty. The observational constraint on climate sensitivity is weaker but still significant, suggesting a likely value of 3.68 ± 1.30 K (90% confidence), which also favors the upper range of model estimates. These results suggest that uncertainty in model estimates of the global cloud feedback may be substantially reduced by ensuring a realistic distribution of clouds between regions of warm and cool SSTs in simulations of the current climate.
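    An emergent-constraint calculation of this kind regresses the uncertain quantity (the feedback) on the observable predictor (the α pattern metric) across models, then evaluates the fit at the observed predictor value. A minimal sketch; the inputs would be per-model values from an ensemble, not the hypothetical numbers used for testing:

    ```python
    def emergent_constraint(x_models, y_models, x_obs):
        """Across-model ordinary least-squares regression of an uncertain
        quantity y on an observable predictor x, evaluated at the observed
        value x_obs to give the constrained central estimate of y."""
        n = len(x_models)
        xbar = sum(x_models) / n
        ybar = sum(y_models) / n
        sxx = sum((x - xbar) ** 2 for x in x_models)
        slope = sum((x - xbar) * (y - ybar)
                    for x, y in zip(x_models, y_models)) / sxx
        return ybar + slope * (x_obs - xbar)
    ```

    The strength of the constraint depends on the correlation across models (here r = 0.84), which also sets how much the prediction interval narrows relative to the raw model spread.
    
    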

  14. RenderView: physics-based multi- and hyperspectral rendering using measured background panoramics

    NASA Astrophysics Data System (ADS)

    Talcott, Denise M.; Brown, Wade W.; Thomas, David J.

    2003-09-01

    As part of the survivability engineering process it is necessary to accurately model and visualize the vehicle signatures in multi- or hyperspectral bands of interest. The signature at a given wavelength is a function of the surface optical properties, reflection of the background and, in the thermal region, the emission of thermal radiation. Currently, it is difficult to obtain and utilize background models that are of sufficient fidelity when compared with the vehicle models. In addition, the background models create an additional layer of uncertainty in estimating the vehicles signature. Therefore, to meet exacting rendering requirements we have developed RenderView, which incorporates the full bidirectional reflectance distribution function (BRDF). Instead of using a modeled background we have incorporated a measured calibrated background panoramic image to provide the high fidelity background interaction. Uncertainty in the background signature is reduced to the error in the measurement which is considerably smaller than the uncertainty inherent in a modeled background. RenderView utilizes a number of different descriptions of the BRDF, including the Sandford-Robertson. In addition, it provides complete conservation of energy with off axis sampling. A description of RenderView will be presented along with a methodology developed for collecting background panoramics. Examples of the RenderView output and the background panoramics will be presented along with our approach to handling the solar irradiance problem.

  15. Modelling multi-hazard hurricane damages on an urbanized coast with a Bayesian Network approach

    USGS Publications Warehouse

    van Verseveld, H.C.W.; Van Dongeren, A. R.; Plant, Nathaniel G.; Jäger, W.S.; den Heijer, C.

    2015-01-01

    Hurricane flood impacts to residential buildings in coastal zones are caused by a number of hazards, such as inundation, overflow currents, erosion, and wave attack. However, traditional hurricane damage models typically make use of stage-damage functions, where the stage is related to flooding depth only. Moreover, these models are deterministic and do not consider the large amount of uncertainty associated with both the processes themselves and with the predictions. This uncertainty becomes increasingly important when multiple hazards (flooding, wave attack, erosion, etc.) are considered simultaneously. This paper focusses on establishing relationships between observed damage and multiple hazard indicators in order to make better probabilistic predictions. The concept consists of (1) determining Local Hazard Indicators (LHIs) from a hindcasted storm with use of a nearshore morphodynamic model, XBeach, and (2) coupling these LHIs and building characteristics to the observed damages. We chose a Bayesian Network approach in order to make this coupling and used the LHIs ‘Inundation depth’, ‘Flow velocity’, ‘Wave attack’, and ‘Scour depth’ to represent flooding-, current-, wave-, and erosion-related hazards. The coupled hazard model was tested against four thousand damage observations from a case site at the Rockaway Peninsula, NY, that was impacted by Hurricane Sandy in late October 2012. The model was able to accurately distinguish ‘Minor damage’ from all other outcomes 95% of the time and could distinguish areas that were affected by the storm, but not severely damaged, 68% of the time. For the most heavily damaged buildings (‘Major Damage’ and ‘Destroyed’), projections of the expected damage underestimated the observed damage. 
The model demonstrated that including multiple hazards doubled the prediction skill, with Log-Likelihood Ratio test (a measure of improved accuracy and reduction in uncertainty) scores between 0.02 and 0.17 when only one hazard is considered and a score of 0.37 when multiple hazards are considered simultaneously. The LHIs with the most predictive skill were ‘Inundation depth’ and ‘Wave attack’. The Bayesian Network approach has several advantages over the market-standard stage-damage functions: the predictive capacity of multiple indicators can be combined; probabilistic predictions can be obtained, which include uncertainty; and quantitative as well as descriptive information can be used simultaneously.
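    A much-simplified stand-in for the Bayesian Network coupling is a discrete Bayesian update that assumes the hazard indicators are conditionally independent given the damage state (a naive-Bayes simplification; the actual network encodes dependencies between indicators). All state names and probabilities here are hypothetical:

    ```python
    def posterior_damage(prior, likelihoods, evidence):
        """Bayesian update of a discrete damage-state distribution given
        binary hazard-indicator evidence. likelihoods[ind][state] is
        P(indicator observed | damage state); indicators are assumed
        conditionally independent given the damage state."""
        post = dict(prior)
        for indicator, observed in evidence.items():
            for state in post:
                p = likelihoods[indicator][state]
                post[state] *= p if observed else (1.0 - p)
        z = sum(post.values())
        return {s: v / z for s, v in post.items()}
    ```

    With a prior over damage states and one likelihood table per LHI, observing several severe indicators at once multiplies into a posterior strongly favoring heavy damage, which is how combining multiple hazards adds predictive skill over any single one.
    
    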

  16. Constellation Program Lessons Learned in the Quantification and Use of Aerodynamic Uncertainty

    NASA Technical Reports Server (NTRS)

    Walker, Eric L.; Hemsch, Michael J.; Pinier, Jeremy T.; Bibb, Karen L.; Chan, David T.; Hanke, Jeremy L.

    2011-01-01

    The NASA Constellation Program has worked for the past five years to develop a replacement for the current Space Transportation System. Of the elements that form the Constellation Program, only two require databases that define aerodynamic environments and their respective uncertainty: the Ares launch vehicles and the Orion crew and launch abort vehicles. Teams were established within the Ares and Orion projects to provide representative aerodynamic models including both baseline values and quantified uncertainties. A technical team was also formed within the Constellation Program to facilitate integration among the project elements. This paper is a summary of the collective experience of the three teams working with the quantification and use of uncertainty in aerodynamic environments: the Ares and Orion project teams as well as the Constellation integration team. Not all of the lessons learned discussed in this paper could be applied during the course of the program, but they are included in the hope of benefiting future projects.

  17. Extending flood forecasting lead time in large basin by coupling bias-corrected WRF QPF with distributed hydrological model

    NASA Astrophysics Data System (ADS)

    LI, J.; Chen, Y.; Wang, H. Y.

    2016-12-01

    In large-basin flood forecasting, the forecasting lead time is very important. Advances in numerical weather forecasting in the past decades provide new input to extend flood forecasting lead time in large rivers. The current challenge is that the uncertainty of QPF from these NWP models is still high, so controlling the uncertainty of QPF is an emerging technical requirement. The Weather Research and Forecasting (WRF) model is one such NWP model, and how to control the QPF uncertainty of WRF is a research topic for many in the meteorological community. In this study, the QPF products in the Liujiang river basin, a large basin with a drainage area of 56,000 km2, were first compared with ground precipitation observations from a rain gauge network; the results show that the uncertainty of the WRF QPF is relatively high. A post-processing algorithm that correlates the QPF with the observed precipitation is therefore proposed to remove the systematic bias in QPF. With this algorithm, the post-processed WRF QPF is close to the ground-observed, area-averaged precipitation. The precipitation is then coupled with the Liuxihe model, a physically based distributed hydrological model that is widely used in small-watershed flash flood forecasting. The Liuxihe model readily accepts gridded precipitation from NWP, can optimize its parameters against observed hydrological data even when only a few observations are available, has very high model resolution to improve model performance, and runs with a parallel algorithm on high-performance supercomputers when applied to large rivers. Two flood events in the Liujiang River were collected: one was used to optimize the model parameters and the other to validate the model. The results show that the river flow simulation is improved substantially and could be used in real-time flood forecasting trials that extend the flood forecasting lead time.
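    The systematic-bias removal described above can be sketched, in its simplest multiplicative form, by rescaling forecasts with the training-period ratio of observed to forecast totals; the study's actual post-processing algorithm may be more elaborate:

    ```python
    def bias_correct(forecasts, observations, new_forecast):
        """Remove the mean multiplicative bias of QPF against gauge data:
        scale a new forecast by mean(obs) / mean(forecast) computed over a
        matched training set of area-averaged precipitation totals."""
        factor = sum(observations) / sum(forecasts)
        return factor * new_forecast
    ```

    This corrects the systematic (mean) component of the QPF error only; random, event-to-event errors remain and still propagate into the hydrological forecast.
    
    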

  18. Revisiting the generation and interpretation of climate models experiments for adaptation decision-making (Invited)

    NASA Astrophysics Data System (ADS)

    Ranger, N.; Millner, A.; Niehoerster, F.

    2010-12-01

    Traditionally, climate change risk assessments have followed a roughly four-stage linear ‘chain’, moving from socioeconomic projections to climate projections, to primary impacts, and finally to economic and social impact assessment. Adaptation decisions are then made on the basis of these outputs. The escalation of uncertainty through this chain is well known, resulting in an ‘explosion’ of uncertainties in the final risk and adaptation assessment. The space of plausible future risk scenarios grows ever wider with the application of new techniques that aim to explore uncertainty more deeply, such as those used in the recent ‘probabilistic’ UK Climate Projections 2009 and in stochastic integrated assessment models such as PAGE2002. This explosion of uncertainty can make decision-making problematic, particularly given that the uncertainty information communicated cannot be treated as strictly probabilistic and therefore does not fit easily with standard decision-making-under-uncertainty approaches. Additional problems arise because the uncertainty estimated for different components of the ‘chain’ is rarely directly comparable or combinable. Here, we explore the challenges and limitations of using current projections for adaptation decision-making. We report the findings of a recent report completed for the UK Adaptation Sub-Committee on approaches to deal with these challenges and make robust adaptation decisions today. To illustrate these approaches, we take a number of illustrative case studies, including adaptation to hurricane risk on the US Gulf Coast. This case is particularly interesting because it involves urgent adaptation of long-lived infrastructure but requires interpreting highly uncertain climate change science and modelling, i.e. projections of Atlantic basin hurricane activity. One approach we outline is reversing the linear chain of assessments to put the economics and decision-making first.
Such an approach forces one to focus on the information of greatest value for the specific decision. We suggest that such an approach will help to accommodate the uncertainties in the chain and facilitate robust decision-making. Initial findings of these case studies will be presented with the aim of raising open questions and promoting discussion of the methodology. Finally, we reflect on the implications for the design of climate model experiments.

  19. Assessing Uncertainty in Risk Assessment Models (BOSC CSS meeting)

    EPA Science Inventory

    In vitro assays are increasingly being used in risk assessments. Uncertainty in these assays leads to uncertainty in the models used for risk assessments. This poster assesses uncertainty in the ER and AR models.

  20. Multi-model seasonal forecast of Arctic sea-ice: forecast uncertainty at pan-Arctic and regional scales

    NASA Astrophysics Data System (ADS)

    Blanchard-Wrigglesworth, E.; Barthélemy, A.; Chevallier, M.; Cullather, R.; Fučkar, N.; Massonnet, F.; Posey, P.; Wang, W.; Zhang, J.; Ardilouze, C.; Bitz, C. M.; Vernieres, G.; Wallcraft, A.; Wang, M.

    2017-08-01

    Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or forecast post-processing (bias correction) techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.
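    The split of forecast uncertainty into a model-uncertainty part and an irreducible (initial-condition error growth) part can be illustrated with the law of total variance over a multi-model ensemble. This is a schematic decomposition, not the paper's exact diagnostics:

```python
import numpy as np

def partition_uncertainty(forecasts):
    """forecasts: (n_models, n_members) array of, e.g., September sea-ice
    extent forecasts. The across-model variance of the ensemble means is
    the 'model uncertainty' term; the mean within-model variance reflects
    spread from initial-condition error growth."""
    model_means = forecasts.mean(axis=1)
    model_uncertainty = model_means.var()       # across models
    irreducible = forecasts.var(axis=1).mean()  # within models, averaged
    return model_uncertainty, irreducible
```

    With equal member counts per model, the two terms sum exactly to the variance of the pooled ensemble.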

  1. Empirically evaluating decision-analytic models.

    PubMed

    Goldhaber-Fiebert, Jeremy D; Stout, Natasha K; Goldie, Sue J

    2010-08-01

    Model-based cost-effectiveness analyses support decision-making. To augment model credibility, evaluation via comparison to independent, empirical studies is recommended. We developed a structured reporting format for model evaluation and conducted a structured literature review to characterize current model evaluation recommendations and practices. As an illustration, we applied the reporting format to evaluate a microsimulation of human papillomavirus and cervical cancer. The model's outputs and uncertainty ranges were compared with multiple outcomes from a study of long-term progression from high-grade precancer (cervical intraepithelial neoplasia [CIN]) to cancer. Outcomes included 5 to 30-year cumulative cancer risk among women with and without appropriate CIN treatment. Consistency was measured by model ranges overlapping study confidence intervals. The structured reporting format included: matching baseline characteristics and follow-up, reporting model and study uncertainty, and stating metrics of consistency for model and study results. Structured searches yielded 2963 articles with 67 meeting inclusion criteria and found variation in how current model evaluations are reported. Evaluation of the cervical cancer microsimulation, reported using the proposed format, showed a modeled cumulative risk of invasive cancer for inadequately treated women of 39.6% (30.9-49.7) at 30 years, compared with the study: 37.5% (28.4-48.3). For appropriately treated women, modeled risks were 1.0% (0.7-1.3) at 30 years, study: 1.5% (0.4-3.3). To support external and projective validity, cost-effectiveness models should be iteratively evaluated as new studies become available, with reporting standardized to facilitate assessment. Such evaluations are particularly relevant for models used to conduct comparative effectiveness analyses.
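    The consistency metric used in the structured evaluation, whether the model's uncertainty range overlaps the study's confidence interval, reduces to a one-line interval check:

```python
def ranges_overlap(model_range, study_ci):
    """True when the model's uncertainty range overlaps the study's
    confidence interval; both are (low, high) tuples."""
    (m_lo, m_hi), (s_lo, s_hi) = model_range, study_ci
    return m_lo <= s_hi and s_lo <= m_hi
```

    For the cervical cancer example above, the 30-year modeled range (30.9, 49.7) overlaps the study interval (28.4, 48.3), so the result is judged consistent.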

  2. Uncertainty of climate change impact on groundwater reserves - Application to a chalk aquifer

    NASA Astrophysics Data System (ADS)

    Goderniaux, Pascal; Brouyère, Serge; Wildemeersch, Samuel; Therrien, René; Dassargues, Alain

    2015-09-01

    Recent studies have evaluated the impact of climate change on groundwater resources for different geographical and climatic contexts. However, most studies have either not estimated the uncertainty around projected impacts or have limited the analysis to the uncertainty related to climate models. In this study, the uncertainties around impact projections from several sources (climate models, natural variability of the weather, hydrological model calibration) are calculated and compared for the Geer catchment (465 km2) in Belgium. We use a surface-subsurface integrated model implemented using the finite element code HydroGeoSphere, coupled with climate change scenarios (2010-2085) and the UCODE_2005 inverse model, to assess the uncertainty related to the calibration of the hydrological model. This integrated model provides a more realistic representation of the water exchanges between surface and subsurface domains and further constrains the calibration by using both surface and subsurface observations. Sensitivity and uncertainty analyses were performed on predictions. The linear uncertainty analysis is approximate for this nonlinear system, but it provides some measure of uncertainty for computationally demanding models. Results show that, for the Geer catchment, the most important uncertainty is related to calibration of the hydrological model. The total uncertainty associated with the prediction of groundwater levels remains large. By the end of the century, however, the uncertainty becomes smaller than the predicted decline in groundwater levels.

  3. Uncertainty in Population Estimates for Endangered Animals and Improving the Recovery Process.

    PubMed

    Haines, Aaron M; Zak, Matthew; Hammond, Katie; Scott, J Michael; Goble, Dale D; Rachlow, Janet L

    2013-08-13

    United States recovery plans contain biological information for a species listed under the Endangered Species Act and specify recovery criteria to provide a basis for species recovery. The objective of our study was to evaluate whether recovery plans provide uncertainty (e.g., variance) with estimates of population size. We reviewed all finalized recovery plans for listed terrestrial vertebrate species and recorded the following data: (1) whether a current population size was given, (2) whether a measure of uncertainty or variance was associated with the current population size estimate, and (3) whether a population size was stipulated for recovery. We found that 59% of completed recovery plans specified a current population size, 14.5% specified a variance for the current population size estimate, and 43% specified population size as a recovery criterion. More recent recovery plans reported estimates of current population size, uncertainty, and population size as a recovery criterion more often. Also, bird and mammal recovery plans reported estimates of population size and uncertainty more often than those for reptiles and amphibians. We suggest calculating minimum detectable differences to improve confidence when delisting endangered animals, and we identify incentives for individuals to get involved in recovery planning to improve access to quantitative data.
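    The minimum detectable difference the authors advocate can be computed with a standard normal-approximation formula. This is a textbook sketch; the recovery-planning context would supply the survey standard deviation and sample size:

```python
from math import sqrt
from statistics import NormalDist

def minimum_detectable_difference(sd, n, alpha=0.05, power=0.8):
    """Smallest difference between two population estimates detectable at
    significance level alpha with the given power, assuming each estimate
    is a mean of n observations with standard deviation sd."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_b = NormalDist().inv_cdf(power)
    return (z_a + z_b) * sd * sqrt(2.0 / n)
```

    A delisting criterion smaller than this value could not be distinguished from sampling noise at the stated confidence and power.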

  4. Uncertainty in temperature response of current consumption-based emissions estimates

    NASA Astrophysics Data System (ADS)

    Karstensen, J.; Peters, G. P.; Andrew, R. M.

    2015-05-01

    Several studies have connected emissions of greenhouse gases to economic and trade data to quantify the causal chain from consumption to emissions and climate change. These studies usually combine data and models originating from different sources, making it difficult to estimate uncertainties along the entire causal chain. We estimate uncertainties in economic data, multi-pollutant emission statistics, and metric parameters, and use Monte Carlo analysis to quantify contributions to uncertainty and to determine how uncertainty propagates to estimates of global temperature change from regional and sectoral territorial- and consumption-based emissions for the year 2007. We find that the uncertainties are sensitive to the emission allocations, mix of pollutants included, the metric and its time horizon, and the level of aggregation of the results. Uncertainties in the final results are largely dominated by the climate sensitivity and the parameters associated with the warming effects of CO2. Based on our assumptions, which exclude correlations in the economic data, the uncertainty in the economic data appears to have a relatively small impact on uncertainty at the national level in comparison to emissions and metric uncertainty. Much higher uncertainties are found at the sectoral level. Our results suggest that consumption-based national emissions are not significantly more uncertain than the corresponding production-based emissions since the largest uncertainties are due to metric and emissions which affect both perspectives equally. The two perspectives exhibit different sectoral uncertainties, due to changes of pollutant compositions. We find global sectoral consumption uncertainties in the range of ±10 to ±27 % using the Global Temperature Potential with a 50-year time horizon, with metric uncertainties dominating. National-level uncertainties are similar in both perspectives due to the dominance of CO2 over other pollutants. 
The consumption emissions of the top 10 emitting regions have a broad uncertainty range of ±9 to ±25 %, with metric and emission uncertainties contributing similarly. The absolute global temperature potential (AGTP) with a 50-year time horizon has much higher uncertainties, with considerable uncertainty overlap for regions and sectors, indicating that the ranking of countries is uncertain.
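    A minimal Monte Carlo propagation of the kind described, sampling emission and metric uncertainties and reading off percentile ranges of the implied temperature response, might look like this. Distribution choices and parameter names are illustrative assumptions, not the study's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_temperature_uncertainty(emis_mean, emis_rel_sd, gtp_mean, gtp_rel_sd,
                               n=100_000):
    """Sample emission and metric (GTP-like) values, multiply to get a
    temperature response, and report the mean and a 95 % percentile range."""
    e = rng.normal(emis_mean, emis_rel_sd * emis_mean, n)
    g = rng.normal(gtp_mean, gtp_rel_sd * gtp_mean, n)
    dT = e * g
    lo, hi = np.percentile(dT, [2.5, 97.5])
    return dT.mean(), lo, hi
```

    Correlated inputs (e.g., shared climate-sensitivity parameters across regions) would need joint sampling rather than the independent draws shown here.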

  5. Robust tracking control of a magnetically suspended rigid body

    NASA Technical Reports Server (NTRS)

    Lim, Kyong B.; Cox, David E.

    1994-01-01

    This study is an application of H-infinity and μ-synthesis for designing robust tracking controllers for the Large Angle Magnetic Suspension Test Facility. The modeling, design, analysis, simulation, and testing of a control law that guarantees tracking performance under external disturbances and model uncertainties are investigated. The types of uncertainty considered and the tracking performance metric used are discussed. This study demonstrates the tradeoff between tracking performance at low frequencies and robustness at high frequencies. Two sets of controllers were designed and tested. The first set emphasized performance over robustness, while the second set traded off performance for robustness. Comparisons of simulation and test results are also included. Current simulation and experimental results indicate that reasonably good robust tracking performance can be attained for this system using a multivariable robust control approach.

  6. Progress toward a new beam measurement of the neutron lifetime

    NASA Astrophysics Data System (ADS)

    Hoogerheide, Shannon Fogwell; BL2 Collaboration

    2017-01-01

    Neutron beta decay is the simplest example of nuclear beta decay. A precise value of the neutron lifetime is important for consistency tests of the Standard Model and Big Bang Nucleosynthesis models. The beam neutron lifetime method requires the absolute counting of the decay protons in a neutron beam of precisely known flux. Recent work has resulted in improvements in both the neutron and proton detection systems that should permit a significant reduction in systematic uncertainties. A new measurement of the neutron lifetime using the beam method is underway at the National Institute of Standards and Technology Center for Neutron Research. The projected uncertainty of this new measurement is 1 s. An overview of the measurement, its current status, and the technical improvements will be discussed.

  7. Monte Carlo Calculation of Thermal Neutron Inelastic Scattering Cross Section Uncertainties by Sampling Perturbed Phonon Spectra

    NASA Astrophysics Data System (ADS)

    Holmes, Jesse Curtis

    Nuclear data libraries provide fundamental reaction information required by nuclear system simulation codes. The inclusion of data covariances in these libraries allows the user to assess uncertainties in system response parameters as a function of uncertainties in the nuclear data. Formats and procedures are currently established for representing covariances for various types of reaction data in ENDF libraries. This covariance data is typically generated utilizing experimental measurements and empirical models, consistent with the method of parent data production. However, ENDF File 7 thermal neutron scattering library data is, by convention, produced theoretically through fundamental scattering physics model calculations. Currently, there is no published covariance data for ENDF File 7 thermal libraries. Furthermore, no accepted methodology exists for quantifying or representing uncertainty information associated with this thermal library data. The quality of thermal neutron inelastic scattering cross section data can be of high importance in reactor analysis and criticality safety applications. These cross sections depend on the material's structure and dynamics. The double-differential scattering law, S(alpha, beta), tabulated in ENDF File 7 libraries contains this information. For crystalline solids, S(alpha, beta) is primarily a function of the material's phonon density of states (DOS). Published ENDF File 7 libraries are commonly produced by calculation and processing codes, such as the LEAPR module of NJOY, which utilize the phonon DOS as the fundamental input for inelastic scattering calculations to directly output an S(alpha, beta) matrix. To determine covariances for the S(alpha, beta) data generated by this process, information about uncertainties in the DOS is required. The phonon DOS may be viewed as a probability density function of atomic vibrational energy states that exist in a material. 
Probable variation in the shape of this spectrum may be established that depends on uncertainties in the physics models and methodology employed to produce the DOS. Through Monte Carlo sampling of perturbations from the reference phonon spectrum, an S(alpha, beta) covariance matrix may be generated. In this work, density functional theory and lattice dynamics in the harmonic approximation are used to calculate the phonon DOS for hexagonal crystalline graphite. This form of graphite is used as an example material for the purpose of demonstrating procedures for analyzing, calculating and processing thermal neutron inelastic scattering uncertainty information. Several sources of uncertainty in thermal neutron inelastic scattering calculations are examined, including sources which cannot be directly characterized through a description of the phonon DOS uncertainty, and their impacts are evaluated. Covariances for hexagonal crystalline graphite S(alpha, beta) data are quantified by coupling the standard methodology of LEAPR with a Monte Carlo sampling process. The mechanics of efficiently representing and processing this covariance information is also examined. Finally, with appropriate sensitivity information, it is shown that an S(alpha, beta) covariance matrix can be propagated to generate covariance data for integrated cross sections, secondary energy distributions, and coupled energy-angle distributions. This approach enables a complete description of thermal neutron inelastic scattering cross section uncertainties which may be employed to improve the simulation of nuclear systems.
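    The final step described, turning Monte Carlo realizations of S(alpha, beta) into a relative covariance matrix, is standard sample statistics. This is a sketch of that step only; the actual coupling with LEAPR is more involved:

```python
import numpy as np

def covariance_from_samples(samples):
    """samples: (n_samples, n_points) array of MC realizations of a
    flattened S(alpha, beta) table. Returns the relative covariance
    matrix (covariance divided by the outer product of the means)."""
    samples = np.asarray(samples, dtype=float)
    mean = samples.mean(axis=0)
    dev = samples - mean
    cov = dev.T @ dev / (samples.shape[0] - 1)  # unbiased sample covariance
    return cov / np.outer(mean, mean)
```

    Relative (rather than absolute) covariances are convenient when the underlying S(alpha, beta) values span many orders of magnitude.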

  8. Uncertainty analysis of an inflow forecasting model: extension of the UNEEC machine learning-based method

    NASA Astrophysics Data System (ADS)

    Pianosi, Francesca; Lal Shrestha, Durga; Solomatine, Dimitri

    2010-05-01

    This research presents an extension of the UNEEC (Uncertainty Estimation based on Local Errors and Clustering; Shrestha and Solomatine, 2006, 2008; Solomatine and Shrestha, 2009) method toward the explicit inclusion of parameter uncertainty. The UNEEC method assumes that there is an optimal model and that its residuals can be used to assess the uncertainty of the model prediction. It is assumed that all sources of uncertainty, including input, parameter, and model structure uncertainty, are manifested in the model residuals. In this research these assumptions are relaxed, and the UNEEC method is extended to consider parameter uncertainty explicitly (abbreviated UNEEC-P). In UNEEC-P, we first use Monte Carlo (MC) sampling in parameter space to generate N model realizations (each a time series), estimate the prediction quantiles from the empirical distribution functions of the model residuals over all realizations, and only then apply the standard UNEEC method, which encapsulates the uncertainty of a hydrologic model (expressed by quantiles of the error distribution) in a machine learning model (e.g., an ANN). UNEEC-P is applied first to a linear regression model of synthetic data, and then to a real case study of forecasting inflow to Lake Lugano in northern Italy, where the inflow forecasting model is a stochastic heteroscedastic model (Pianosi and Soncini-Sessa, 2009). Preliminary results show that UNEEC-P produces wider uncertainty bounds, consistent with the fact that the method also accounts for the parameter uncertainty of the optimal model. In the future, the UNEEC method will be further extended to consider input and structural uncertainty, which will provide more realistic estimates of model prediction uncertainty.
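    The Monte Carlo stage of UNEEC-P, pooling residual realizations over parameter draws and taking empirical quantiles, can be sketched as follows. `uneec_p_bounds` is a hypothetical name, and the subsequent machine-learning encapsulation step is omitted:

```python
import numpy as np

def uneec_p_bounds(model, parameter_samples, inputs, observed, q=(0.05, 0.95)):
    """Pool residual realizations over Monte Carlo parameter draws and
    return empirical quantiles of the residuals at each prediction point."""
    residuals = np.stack([observed - model(p, inputs) for p in parameter_samples])
    return np.quantile(residuals, q, axis=0)
```

    The returned quantile bands would then serve as training targets for the machine-learning model that generalizes them to new inputs.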

  9. A tool for efficient, model-independent management optimization under uncertainty

    USGS Publications Warehouse

    White, Jeremy; Fienen, Michael N.; Barlow, Paul M.; Welter, Dave E.

    2018-01-01

    To fill a need for risk-based environmental management optimization, we have developed PESTPP-OPT, a model-independent tool for resource management optimization under uncertainty. PESTPP-OPT solves a sequential linear programming (SLP) problem and also implements (optional) efficient, “on-the-fly” (without user intervention) first-order, second-moment (FOSM) uncertainty techniques to estimate model-derived constraint uncertainty. Combined with a user-specified risk value, the constraint uncertainty estimates are used to form chance-constraints for the SLP solution process, so that any optimal solution includes contributions from model input and observation uncertainty. In this way, a “single answer” that includes uncertainty is yielded from the modeling analysis. PESTPP-OPT uses the familiar PEST/PEST++ model interface protocols, which makes it widely applicable to many modeling analyses. The use of PESTPP-OPT is demonstrated with a synthetic, integrated surface-water/groundwater model. The function and implications of chance constraints for this synthetic model are discussed.
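    The idea behind a FOSM chance constraint, tightening a model-derived constraint limit by enough standard deviations that it holds with a chosen probability, can be shown in a few lines. This is a schematic of the concept, not PESTPP-OPT's implementation:

```python
from statistics import NormalDist

def chance_constraint_bound(nominal_limit, constraint_sd, risk=0.05):
    """Tighten an upper constraint limit by z_(1-risk) standard deviations
    of the model-derived constraint uncertainty, so the original limit is
    respected with probability 1 - risk under a normal approximation."""
    z = NormalDist().inv_cdf(1 - risk)
    return nominal_limit - z * constraint_sd
```

    At risk = 0.5 the bound equals the nominal limit (no tightening); lower risk values shift the bound further from it, trading optimality for reliability.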

  10. Equilibrium approach towards water resource management and pollution control in coal chemical industrial park.

    PubMed

    Xu, Jiuping; Hou, Shuhua; Xie, Heping; Lv, Chengwei; Yao, Liming

    2018-08-01

    In this study, an integrated water and waste load allocation model is proposed to help decision makers better understand the trade-offs between economic growth, resource utilization, and environmental protection in coal chemical industries, which characteristically have high water consumption and pollution. In the decision framework, decision makers in the same park, each with different goals and preferences, work together to seek a collective benefit. Similar to a Stackelberg-Nash game, the proposed approach illuminates the decision-making interrelationships and coordinates conflicts between the park authority and the individual coal chemical company stockholders. To respond to climate change and other uncertainties, a risk assessment tool, Conditional Value-at-Risk (CVaR), is integrated into the modeling process, with uncertain parameters and coefficients represented using probability and fuzzy set theory. A case study of the Yuheng coal chemical park is then presented to demonstrate the practicality and efficiency of the optimization model. To reasonably explore the potential consequences of different water and waste load allocation strategies, a number of scenarios considering environmental uncertainty and decision makers' attitudes are examined, exploring the trade-offs between economic development, environmental protection, and decision makers' objectives. The results help decision and policy makers adjust current strategies to changing conditions. Based on the scenario analyses and discussion, some propositions and operational policies are given, and adaptation strategies are presented to support the efficient, balanced, and sustainable development of coal chemical industrial parks.
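    The CVaR risk measure embedded in the allocation model has a simple empirical estimator: the mean loss in the worst (1 - alpha) tail. This is the standard formulation, independent of the paper's specific model:

```python
import numpy as np

def cvar(losses, alpha=0.95):
    """Conditional Value-at-Risk: the expected loss in the worst
    (1 - alpha) fraction of outcomes."""
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, alpha)     # Value-at-Risk threshold
    return losses[losses >= var].mean()  # mean of the tail beyond VaR
```

    Unlike plain Value-at-Risk, CVaR is a coherent risk measure and stays convex inside an optimization, which is why it is popular in allocation models of this kind.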

  11. Climate change on the Colorado River: a method to search for robust management strategies

    NASA Astrophysics Data System (ADS)

    Keefe, R.; Fischbach, J. R.

    2010-12-01

    The Colorado River is a principal source of water for the seven Basin States, providing approximately 16.5 maf per year to users in the southwestern United States and Mexico. Though the dynamics of the river ensure Upper Basin users a reliable supply of water, the three Lower Basin states (California, Nevada, and Arizona) are in danger of delivery interruptions as Upper Basin demand increases and climate change threatens to reduce future streamflows. In light of the recent drought and uncertain effects of climate change on Colorado River flows, we evaluate the performance of a suite of policies modeled after the shortage sharing agreement adopted in December 2007 by the Department of the Interior. We build on the current literature by using a simplified model of the Lower Colorado River to consider future streamflow scenarios given climate change uncertainty. We also generate different scenarios of parametric consumptive use growth in the Upper Basin and evaluate alternate management strategies in light of these uncertainties. Uncertainty associated with climate change is represented with a multi-model ensemble from the literature, using a nearest neighbor perturbation to increase the size of the ensemble. We use Robust Decision Making to compare near-term or long-term management strategies across an ensemble of plausible future scenarios with the goal of identifying one or more approaches that are robust to alternate assumptions about the future. This method entails using search algorithms to quantitatively identify vulnerabilities that may threaten a given strategy (including the current operating policy) and characterize key tradeoffs between strategies under different scenarios.

  12. Model sensitivity studies of the decrease in atmospheric carbon tetrachloride

    DOE PAGES

    Chipperfield, Martyn P.; Liang, Qing; Rigby, Matthew; ...

    2016-12-20

    Carbon tetrachloride (CCl4) is an ozone-depleting substance, which is controlled by the Montreal Protocol and for which the atmospheric abundance is decreasing. However, the current observed rate of this decrease is known to be slower than expected based on reported CCl4 emissions and its estimated overall atmospheric lifetime. Here we use a three-dimensional (3-D) chemical transport model to investigate the impact on its predicted decay of uncertainties in the rates at which CCl4 is removed from the atmosphere by photolysis, by ocean uptake and by degradation in soils. The largest sink is atmospheric photolysis (74 % of total), but a reported 10 % uncertainty in its combined photolysis cross section and quantum yield has only a modest impact on the modelled rate of CCl4 decay. This is partly due to the limiting effect of the rate of transport of CCl4 from the main tropospheric reservoir to the stratosphere, where photolytic loss occurs. The model suggests large interannual variability in the magnitude of this stratospheric photolysis sink caused by variations in transport. The impact of uncertainty in the minor soil sink (9 % of total) is also relatively small. In contrast, the model shows that uncertainty in ocean loss (17 % of total) has the largest impact on modelled CCl4 decay due to its sizeable contribution to CCl4 loss and large lifetime uncertainty range (147 to 241 years). Furthermore, with an assumed CCl4 emission rate of 39 Gg year-1, the reference simulation with the best estimate of loss processes still underestimates the observed CCl4 (overestimates the decay) over the past 2 decades, but to a smaller extent than previous studies. Changes to the rate of CCl4 loss processes, in line with known uncertainties, could bring the model into agreement with in situ surface and remote-sensing measurements, as could an increase in emissions to around 47 Gg year-1.
Further progress in constraining the CCl4 budget is partly limited by systematic biases between observational datasets. For example, surface observations from the National Oceanic and Atmospheric Administration (NOAA) network are larger than those from the Advanced Global Atmospheric Gases Experiment (AGAGE) network but have shown a steeper decreasing trend over the past 2 decades. The observed differences imply a difference in emissions which is significant relative to uncertainties in the magnitudes of the CCl4 sinks.
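    The sink percentages quoted above follow from combining partial atmospheric lifetimes in parallel, which is easy to sketch. The lifetimes used in the test are placeholders, not the paper's values:

```python
def total_lifetime(partial_lifetimes):
    """Partial sink lifetimes (years) combine in parallel:
    1/tau_total = sum_i 1/tau_i."""
    return 1.0 / sum(1.0 / t for t in partial_lifetimes)

def sink_fractions(partial_lifetimes):
    """Fraction of total loss through each sink: (1/tau_i) / (1/tau_total)."""
    tau = total_lifetime(partial_lifetimes)
    return [tau / t for t in partial_lifetimes]
```

    This parallel-combination rule is why the wide ocean-lifetime range (147 to 241 years) translates directly into uncertainty in both the total lifetime and the ocean sink's share of the loss.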

  14. Estimating the ecology of extinct species with paleoecological data assimilation

    NASA Astrophysics Data System (ADS)

    Raiho, A.; McLachlan, J. S.; Dietze, M.

    2017-12-01

    In order to understand long-term, unobservable ecosystem processes, ecologists must use both paleoecological data and ecosystem models. Models parameterize species' competitive interactions using modern data, but modern ecological or physiological observations are not available for extinct species, making it difficult for models to represent their ecology. For instance, American chestnut (Castanea dentata), which played a large role in the forests of the northeastern US, was decimated by disease to virtual extinction. Since chestnut's demise, defining its ecology has been controversial. Models typically assume that chestnut's ecology was very similar to that of oak, and they parameterize chestnut like oak species. These assumptions are drawn from paleoecological data, but such data are often reported without uncertainty and so have never been directly incorporated into ecosystem models. We developed a Bayesian statistical model to estimate fractional composition from paleoecological data with uncertainty. We then assimilated this data product into an ecosystem model of long-term forest succession, using a generalized ensemble adjustment filter to determine which species demographic parameters lead to changes in species composition over the last 2,000 years at Harvard Forest. We found that chestnut was strongly negatively correlated with white pine (Pinus strobus) and red oak (Quercus rubra) in the process covariance matrix, suggesting a strong competitive interaction that is not currently captured by models of forest succession. These findings support using a data assimilation framework to interpret paleoecological data products ecologically and to learn about the ecology of extinct species.
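    The ensemble-adjustment idea behind the filter mentioned above can be illustrated with a one-variable ensemble Kalman-type update; the generalized filter used in the study differs in detail:

```python
import numpy as np

def ensemble_kalman_update(ensemble, obs, obs_var):
    """Scalar ensemble Kalman-type update: each member is nudged toward
    the observation by the gain prior_var / (prior_var + obs_var)."""
    x = np.asarray(ensemble, dtype=float)
    prior_var = x.var(ddof=1)
    gain = prior_var / (prior_var + obs_var)
    return x + gain * (obs - x)
```

    After the update, the ensemble mean moves toward the observation and the ensemble spread shrinks, reflecting the information gained from the (uncertain) data product.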

  15. On the uncertainty of phenological responses to climate change, and implications for a terrestrial biosphere model

    NASA Astrophysics Data System (ADS)

    Migliavacca, M.; Sonnentag, O.; Keenan, T. F.; Cescatti, A.; O'Keefe, J.; Richardson, A. D.

    2012-06-01

    Phenology, the timing of recurring life cycle events, controls numerous land surface feedbacks to the climate system through the regulation of exchanges of carbon, water and energy between the biosphere and atmosphere. Terrestrial biosphere models, however, are known to have systematic errors in the simulation of spring phenology, which could propagate into uncertainty in modeled responses to future climate change. Here, we used the Harvard Forest phenology record to investigate and characterize sources of uncertainty in predicting phenology, and the subsequent impacts on model forecasts of carbon and water cycling. Using a model-data fusion approach, we combined information from 20 yr of phenological observations of 11 North American woody species with 12 leaf bud-burst models that varied in complexity. Akaike's Information Criterion indicated support for spring warming models with photoperiod limitations and, to a lesser extent, models that included chilling requirements. We assessed three different sources of uncertainty in phenological forecasts: parameter uncertainty, model uncertainty, and driver uncertainty. The latter was characterized by running the models to 2099 under two IPCC climate scenarios (A1fi vs. B1, i.e., high vs. low CO2 emissions). Parameter uncertainty was the smallest (average 95% confidence interval, CI: 2.4 days century-1 for scenario B1 and 4.5 days century-1 for A1fi), whereas driver uncertainty was the largest (up to 8.4 days century-1 in the simulated trends). The uncertainty related to model structure was also large: the predicted bud-burst trends, as well as the shape of the smoothed projections, varied among models (±7.7 days century-1 for A1fi, ±3.6 days century-1 for B1). The forecast sensitivity of bud-burst to temperature (i.e. days bud-burst advanced per degree of warming) varied between 2.2 days °C-1 and 5.2 days °C-1 depending on model structure.
We quantified the impact of uncertainties in bud-burst forecasts on simulated photosynthetic CO2 uptake and evapotranspiration (ET) using a process-based terrestrial biosphere model. Uncertainty in phenology model structure led to uncertainty in the description of forest seasonality, which accumulated to uncertainty in annual model estimates of gross primary productivity (GPP) and ET of 9.6% and 2.9%, respectively. A sensitivity analysis shows that a variation of ±10 days in bud-burst dates led to a variation of ±5.0% for annual GPP and about ±2.0% for ET. For phenology models, differences among future climate scenarios (i.e. driver) represent the largest source of uncertainty, followed by uncertainties related to model structure, and finally, related to model parameterization. The uncertainties we have quantified will affect the description of the seasonality of ecosystem processes and in particular the simulation of carbon uptake by forest ecosystems, with a larger impact of uncertainties related to phenology model structure, followed by uncertainties related to phenological model parameterization.
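    The spring-warming class of bud-burst models referred to above can be sketched as growing degree-day accumulation: bud-burst occurs once forcing above a base temperature reaches a critical sum. The temperatures, thresholds, and warming offset below are invented for illustration and are not Harvard Forest values or the paper's calibrated parameters:

```python
import math

def spring_warming_budburst(temps, t_base=5.0, f_crit=100.0):
    """Predict bud-burst as the first day on which accumulated growing
    degree-days above t_base reach the critical forcing f_crit."""
    forcing = 0.0
    for day, t in enumerate(temps, start=1):
        forcing += max(0.0, t - t_base)
        if forcing >= f_crit:
            return day
    return None  # forcing requirement never met

# Sinusoidal toy climate: warms from 0 degC toward 20 degC over half a year.
temps = [10.0 * (1 - math.cos(math.pi * d / 180)) for d in range(1, 181)]
day_baseline = spring_warming_budburst(temps)

# A uniform +2 degC warming advances the predicted bud-burst date.
day_warmed = spring_warming_budburst([t + 2.0 for t in temps])
```

    The temperature sensitivity discussed in the abstract corresponds to the difference between these two predicted dates divided by the warming applied.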

  16. The Slow Controls System of the New Muon g-2 Experiment at Fermilab

    NASA Astrophysics Data System (ADS)

    Eads, Michael; New Muon g-2 Collaboration

    2015-04-01

    The goal of the new muon g-2 experiment (E-989), currently under construction at Fermi National Accelerator Laboratory, is to measure the anomalous magnetic moment of the muon with unprecedented precision. The uncertainty goal of the experiment, 0.14 ppm, represents a four-fold improvement over the current best measurement of this value and has the potential to increase the current three-standard-deviation disagreement with the predicted standard model value to five standard deviations. Measuring the operating conditions of the experiment will be essential to achieving these uncertainty goals. This talk will describe the design and current status of E-989's slow controls system. This system, based on the MIDAS Slow Control Bus, will be used to measure and record currents, voltages, temperatures, humidities, pressures, flows, and other data that are collected asynchronously with the injection of the muon beam. The system consists of a variety of sensors and front-end electronics that interface to back-end data acquisition, data storage, and data monitoring systems. Parts of the system are already operational, and the full system will be completed before beam commissioning begins in 2017.

  17. Integrating landslide and liquefaction hazard and loss estimates with existing USGS real-time earthquake information products

    USGS Publications Warehouse

    Allstadt, Kate E.; Thompson, Eric M.; Hearne, Mike; Nowicki Jessee, M. Anna; Zhu, J.; Wald, David J.; Tanyas, Hakan

    2017-01-01

    The U.S. Geological Survey (USGS) has made significant progress toward the rapid estimation of shaking and shaking-related losses through their Did You Feel It? (DYFI), ShakeMap, ShakeCast, and PAGER products. However, quantitative estimates of the extent and severity of secondary hazards (e.g., landsliding, liquefaction) are not currently included in scenarios and real-time post-earthquake products, despite their significant contributions to hazard and losses for many events worldwide. We are currently running, in testing mode, parallel global statistical models for landslides and liquefaction developed with our collaborators, but much work remains to operationalize these systems. We are expanding our efforts in this area not only by improving the existing statistical models, but also by (1) exploring more sophisticated, physics-based models where feasible; (2) incorporating uncertainties; and (3) identifying and undertaking research and product development to provide useful landslide and liquefaction estimates and their uncertainties. Although our existing models use standard predictor variables that are accessible globally or regionally, including peak ground motions, topographic slope, and distance to water bodies, we continue to explore readily available proxies for rock and soil strength as well as other susceptibility terms. This work is based on the foundation of an expanding, openly available case-history database we are compiling, along with historical ShakeMaps for each event. The expected outcome of our efforts is a robust set of real-time secondary hazard products that meet the needs of a wide variety of earthquake information users. We describe the available datasets and models, developments currently underway, and anticipated products.
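    Global statistical models of this kind typically combine the named predictors (peak ground motion, topographic slope, distance to water) in a logistic regression. A minimal sketch with placeholder coefficients, which are not the USGS fits:

```python
import math

def ground_failure_probability(pga_g, slope_deg, dist_water_km, coeffs=None):
    """Logistic-regression-style probability of ground failure.
    The coefficients are illustrative placeholders, not a published model."""
    if coeffs is None:
        coeffs = {"intercept": -4.0, "log_pga": 2.0,
                  "slope": -0.1, "dist_water": -0.3}
    z = (coeffs["intercept"]
         + coeffs["log_pga"] * math.log(max(pga_g, 1e-6))
         + coeffs["slope"] * slope_deg
         + coeffs["dist_water"] * dist_water_km)
    return 1.0 / (1.0 + math.exp(-z))

# Stronger shaking at the same site yields a higher failure probability.
p_strong = ground_failure_probability(pga_g=0.5, slope_deg=1.0, dist_water_km=0.2)
p_weak = ground_failure_probability(pga_g=0.05, slope_deg=1.0, dist_water_km=0.2)
```

    Incorporating uncertainty, as the abstract proposes, would amount to propagating coefficient and predictor uncertainty through this function rather than reporting the single probability.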

  18. Uncertainty analysis of hydrological modeling in a tropical area using different algorithms

    NASA Astrophysics Data System (ADS)

    Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh

    2018-01-01

    Hydrological modeling outputs are subject to uncertainty arising from different sources of error (e.g., errors in input data, model structure, and model parameters), making the quantification of uncertainty in hydrological modeling imperative for improving the reliability of modeling results. Uncertainty analysis must overcome difficulties in the calibration of hydrological models, which increase further in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantifying different sources of uncertainty (especially parameter uncertainty) and evaluating their performance. In this study, the Soil and Water Assessment Tool (SWAT) eco-hydrological model was implemented for a watershed in central Vietnam. The sensitivity of parameters was analyzed, and the model was calibrated. The uncertainty analysis for the hydrological model was conducted with four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), the Parameter Solution method (ParaSol) and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using the P-factor and R-factor, coefficient of determination (R2), Nash-Sutcliffe efficiency (NSE) and percent bias (PBIAS). The results showed the high performance of SUFI and PSO, with P-factor > 0.83, R-factor < 0.56, R2 > 0.91, NSE > 0.89, and 0.18
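    Of the four algorithms, GLUE is the simplest to sketch: sample parameter sets by Monte Carlo, keep the "behavioural" sets whose likelihood measure (here NSE) exceeds a threshold, and derive prediction bounds from that ensemble. A one-parameter toy model stands in for SWAT; all numbers are invented:

```python
import random

random.seed(0)

def toy_model(param, forcing):
    """Stand-in rainfall-runoff model: runoff = param * rainfall."""
    return [param * r for r in forcing]

def nse(sim, obs):
    """Nash-Sutcliffe efficiency of a simulated vs. observed series."""
    mean_obs = sum(obs) / len(obs)
    err = sum((s - o) ** 2 for s, o in zip(sim, obs))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - err / var

rain = [random.uniform(0.0, 10.0) for _ in range(50)]
observed = toy_model(0.6, rain)          # synthetic "truth"

# GLUE: Monte Carlo sampling; keep behavioural sets above the threshold.
candidates = [random.uniform(0.0, 1.0) for _ in range(2000)]
behavioural = [p for p in candidates
               if nse(toy_model(p, rain), observed) > 0.5]

# Prediction bounds at the first time step from the behavioural ensemble.
lo = min(p * rain[0] for p in behavioural)
hi = max(p * rain[0] for p in behavioural)
```

    The P-factor and R-factor reported in the abstract are then, respectively, the fraction of observations bracketed by such bounds and the average width of the bounds relative to the observation variability.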

  19. Expert system for analyzing eddy current measurements

    DOEpatents

    Levy, Arthur J.; Oppenlander, Jane E.; Brudnoy, David M.; Englund, James M.; Loomis, Kent C.

    1994-01-01

    A method and apparatus (called DODGER) analyzes eddy current data for heat exchanger tubes or any other metallic object. DODGER uses an expert system to analyze eddy current data by reasoning with uncertainty and pattern recognition. The expert system permits DODGER to analyze eddy current data intelligently, and obviate operator uncertainty by analyzing the data in a uniform and consistent manner.

  20. Assessing the relative importance of parameter and forcing uncertainty and their interactions in conceptual hydrological model simulations

    NASA Astrophysics Data System (ADS)

    Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.

    2016-11-01

    Predictions of river flow dynamics provide vital information for many aspects of water management, including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make, including model and criteria selection, can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
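    The analysis-of-variance decomposition used here can be illustrated on a toy forcing × parameter grid: simulate every combination, then split the total sum of squares into forcing, parameter, and interaction terms. The multiplicative flow model and all values below are synthetic, not the Irish-catchment simulations:

```python
# Two-way ANOVA-style decomposition of simulated-flow variance.
forcings = [0.8, 1.0, 1.2]   # e.g. rainfall ensemble multipliers
params = [0.4, 0.5, 0.6]     # e.g. runoff coefficients

# Simulated flow for every forcing x parameter combination.
flow = {(f, p): 100.0 * f * p for f in forcings for p in params}

grand = sum(flow.values()) / len(flow)
n_f, n_p = len(forcings), len(params)

row_means = {f: sum(flow[(f, p)] for p in params) / n_p for f in forcings}
col_means = {p: sum(flow[(f, p)] for f in forcings) / n_f for p in params}

# Sums of squares: main effects, total, and the interaction remainder.
ss_forcing = n_p * sum((m - grand) ** 2 for m in row_means.values())
ss_param = n_f * sum((m - grand) ** 2 for m in col_means.values())
ss_total = sum((v - grand) ** 2 for v in flow.values())
ss_interaction = ss_total - ss_forcing - ss_param
```

    Dividing each sum of squares by `ss_total` gives the fractional contribution of forcing, parameters, and their interaction to the overall simulation variance.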

  1. Advanced Methods for Determining Prediction Uncertainty in Model-Based Prognostics with Application to Planetary Rovers

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Sankararaman, Shankar

    2013-01-01

    Prognostics is centered on predicting the time of and time until adverse events in components, subsystems, and systems. It typically involves both a state estimation phase, in which the current health state of a system is identified, and a prediction phase, in which the state is projected forward in time. Since prognostics is mainly a prediction problem, prognostic approaches cannot avoid uncertainty, which arises due to several sources. Prognostics algorithms must both characterize this uncertainty and incorporate it into the predictions so that informed decisions can be made about the system. In this paper, we describe three methods to solve these problems, including Monte Carlo-, unscented transform-, and first-order reliability-based methods. Using a planetary rover as a case study, we demonstrate and compare the different methods in simulation for battery end-of-discharge prediction.
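    The Monte Carlo approach among the three can be sketched for battery end-of-discharge: sample the uncertain state (remaining capacity) and the uncertain future load, propagate each sample through a discharge model, and summarize the resulting prediction distribution. All numbers and the constant-current model are invented for illustration:

```python
import random
import statistics

random.seed(1)

def end_of_discharge(capacity_ah, current_a):
    """Hours until the battery is empty under a constant current draw."""
    return capacity_ah / current_a

# Uncertain state (remaining capacity) and uncertain future load.
n_mc = 5000
capacity = [random.gauss(10.0, 0.5) for _ in range(n_mc)]        # Ah
load = [max(random.gauss(2.0, 0.2), 0.1) for _ in range(n_mc)]   # A

# Propagate every joint sample and summarize the prediction distribution.
eod = sorted(end_of_discharge(c, i) for c, i in zip(capacity, load))
mean_eod = statistics.mean(eod)
p5, p95 = eod[int(0.05 * n_mc)], eod[int(0.95 * n_mc)]
```

    The unscented-transform and first-order-reliability methods discussed in the paper approximate the same prediction distribution with far fewer model evaluations.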

  2. The Social Construction of Uncertainty in Healthcare Delivery

    NASA Astrophysics Data System (ADS)

    Begun, James W.; Kaissi, Amer A.

    We explore the following question: How would healthcare delivery be different if uncertainty were widely recognized, accurately diagnosed, and appropriately managed? Unlike most studies of uncertainty, we examine uncertainty at more than one level of analysis, considering uncertainty that arises at the patient-clinician interaction level and at the organizational level of healthcare delivery. We consider the effects of history, as the forces and systems that currently shape and manage uncertainty have emerged over a long time period. The purpose of this broad and speculative "thought exercise" is to generate greater sensemaking of the current state of healthcare delivery, particularly in the realm of organizational and public policy, and to generate new research questions about healthcare delivery. The discussion is largely based on experience in the United States, which may limit its generalizability.

  3. `spup' - An R Package for Analysis of Spatial Uncertainty Propagation and Application to Trace Gas Emission Simulations

    NASA Astrophysics Data System (ADS)

    Sawicka, K.; Breuer, L.; Houska, T.; Santabarbara Ruiz, I.; Heuvelink, G. B. M.

    2016-12-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation analysis and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable, including for case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open-source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo techniques, as well as several uncertainty visualization functions. Here we demonstrate that the 'spup' package is an effective and easy-to-use tool that can be applied even in very complex case studies, and that it can be used in multi-disciplinary research and model-based decision support. As an example, we use the ecological LandscapeDNDC model to analyse the propagation of uncertainties associated with spatial variability of the model driving forces, such as rainfall, nitrogen deposition and fertilizer inputs. The uncertainty propagation is analysed for the prediction of emissions of N2O and CO2 for a German low-mountainous, agriculturally developed catchment. The study tests the effect of spatial correlations on spatially aggregated model outputs, and could serve as advice for developing best management practices and model improvement strategies.
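    'spup' is an R package, but the Monte Carlo propagation idea it implements is language-agnostic: specify an uncertainty model for each input, draw joint realisations, run the model on each, and summarize the output spread. A Python sketch with an invented linear stand-in model (not LandscapeDNDC) and assumed input distributions:

```python
import random
import statistics

random.seed(7)

def emission_model(rainfall_mm, n_deposition):
    """Invented stand-in for an environmental model (not LandscapeDNDC)."""
    return 0.02 * rainfall_mm + 0.5 * n_deposition

# Uncertainty models for the inputs: normal, with assumed means and sds.
n_mc = 10000
rain = [random.gauss(800.0, 100.0) for _ in range(n_mc)]
ndep = [random.gauss(20.0, 5.0) for _ in range(n_mc)]

# Propagation: run the model on every joint input realisation.
outputs = [emission_model(r, n) for r, n in zip(rain, ndep)]
out_mean = statistics.mean(outputs)
out_sd = statistics.stdev(outputs)
```

    For spatial inputs, the same loop draws correlated random fields rather than independent scalars; that spatial correlation is precisely what the study shows matters for spatially aggregated outputs.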

  4. Development of a coupled hydrological - hydrodynamic model for probabilistic catchment flood inundation modelling

    NASA Astrophysics Data System (ADS)

    Quinn, Niall; Freer, Jim; Coxon, Gemma; Dunne, Toby; Neal, Jeff; Bates, Paul; Sampson, Chris; Smith, Andy; Parkin, Geoff

    2017-04-01

    Computationally efficient flood inundation modelling systems capable of representing important hydrological and hydrodynamic flood generating processes over relatively large regions are vital for those interested in flood preparation, response, and real time forecasting. However, such systems are currently not readily available. This can be particularly important where flood predictions from intense rainfall are considered as the processes leading to flooding often involve localised, non-linear spatially connected hillslope-catchment responses. Therefore, this research introduces a novel hydrological-hydraulic modelling framework for the provision of probabilistic flood inundation predictions across catchment to regional scales that explicitly account for spatial variability in rainfall-runoff and routing processes. Approaches have been developed to automate the provision of required input datasets and estimate essential catchment characteristics from freely available, national datasets. This is an essential component of the framework as when making predictions over multiple catchments or at relatively large scales, and where data is often scarce, obtaining local information and manually incorporating it into the model quickly becomes infeasible. An extreme flooding event in the town of Morpeth, NE England, in 2008 was used as a first case study evaluation of the modelling framework introduced. The results demonstrated a high degree of prediction accuracy when comparing modelled and reconstructed event characteristics for the event, while the efficiency of the modelling approach used enabled the generation of relatively large ensembles of realisations from which uncertainty within the prediction may be represented. 
This research supports previous literature highlighting the importance of probabilistic forecasting, particularly during extreme events, which can often be poorly characterised or even missed by deterministic predictions due to the inherent uncertainty in any model application. Future research will aim to further evaluate the robustness of the approaches introduced by applying the modelling framework to a variety of historical flood events across UK catchments. Furthermore, the flexibility and efficiency of the framework are ideally suited to examining the propagation of errors through the model, which will help build a better understanding of the dominant sources of uncertainty currently affecting flood inundation predictions.

  5. Quantifying radar-rainfall uncertainties in urban drainage flow modelling

    NASA Astrophysics Data System (ADS)

    Rico-Ramirez, M. A.; Liguori, S.; Schellart, A. N. A.

    2015-09-01

    This work presents the results of the implementation of a probabilistic system to model the uncertainty associated with radar rainfall (RR) estimates and the way this uncertainty propagates through the sewer system of an urban area located in the north of England. The spatial and temporal correlations of the RR errors, as well as the error covariance matrix, were computed to build an RR error model able to generate RR ensembles that reproduce the uncertainty associated with the measured rainfall. The results showed that the RR ensembles provide important information about the uncertainty in the rainfall measurement that can be propagated through the urban sewer system; in particular, the measured flow peaks and flow volumes are often bounded within the uncertainty bands produced by the RR ensembles. In 55% of the simulated events, the uncertainties in RR measurements can explain the uncertainties observed in the simulated flow volumes. However, there are also some events where the RR uncertainty cannot explain the whole uncertainty observed in the simulated flow volumes, indicating that additional sources of uncertainty must be considered, such as the uncertainty in the urban drainage model structure, in the calibrated parameters of the urban drainage model, and in the measured sewer flows.
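    Generating ensembles that honour a prescribed error covariance is typically done by factorizing the covariance matrix and colouring independent Gaussian noise. A two-pixel sketch with an assumed covariance (not the paper's fitted error model), with the Cholesky factor written out by hand:

```python
import math
import random

random.seed(3)

# Assumed error covariance between two nearby radar pixels (mm^2):
# variances 4.0 each, covariance 3.2 (correlation 0.8).
var1, var2, cov12 = 4.0, 4.0, 3.2

# Cholesky factor of the 2x2 covariance matrix.
l11 = math.sqrt(var1)
l21 = cov12 / l11
l22 = math.sqrt(var2 - l21 ** 2)

def correlated_error_pair():
    """One joint draw of the two pixels' rainfall errors."""
    z1, z2 = random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)
    return l11 * z1, l21 * z1 + l22 * z2

ensemble = [correlated_error_pair() for _ in range(20000)]

# The empirical correlation recovers the imposed spatial correlation.
n = len(ensemble)
m1 = sum(a for a, _ in ensemble) / n
m2 = sum(b for _, b in ensemble) / n
cov = sum((a - m1) * (b - m2) for a, b in ensemble) / n
s1 = math.sqrt(sum((a - m1) ** 2 for a, _ in ensemble) / n)
s2 = math.sqrt(sum((b - m2) ** 2 for _, b in ensemble) / n)
corr = cov / (s1 * s2)
```

    In practice the covariance matrix spans every radar pixel and several lag times, but the colouring step is the same; each ensemble member is then added to the RR field and routed through the sewer model.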

  6. A stochastic optimization model under modeling uncertainty and parameter certainty for groundwater remediation design--part I. Model development.

    PubMed

    He, L; Huang, G H; Lu, H W

    2010-04-15

    Solving groundwater remediation optimization problems based on proxy simulators can yield optimal solutions that differ from the "true" ones of the problem. This study presents a new stochastic optimization model under modeling uncertainty and parameter certainty (SOMUM), and the associated solution method, for simultaneously addressing modeling uncertainty associated with simulator residuals and optimizing groundwater remediation processes. This is a new attempt, different from previous modeling efforts, which focused on addressing uncertainty in physical parameters (e.g., soil porosity); this work instead deals with uncertainty in the mathematical simulator (arising from model residuals). Compared to existing modeling approaches (where only parameter uncertainty is considered), the model has the advantages of providing mean-variance analysis of contaminant concentrations, mitigating the effects of modeling uncertainties on optimal remediation strategies, offering system designers a confidence level for optimal remediation strategies, and reducing computational cost in optimization processes. 2009 Elsevier B.V. All rights reserved.

  7. Software risk estimation and management techniques at JPL

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Lum, K.

    2002-01-01

    In this talk we discuss how uncertainty has been incorporated into the JPL software model via probabilistic estimates, how risk is addressed, and how cost risk is currently being explored via a variety of approaches, from traditional risk lists, to detailed WBS-based risk estimates, to the Defect Detection and Prevention (DDP) tool.

  8. Development and Utilization of Regional Oceanic Modeling System (ROMS) & Delicacy, Imprecision, and Uncertainty of Oceanic Simulations: An Investigation with ROMS

    DTIC Science & Technology

    2010-09-30

    and Ecosystems: An important community use for ROMS is biogeochemistry: chemical cycles, water quality, blooms, micro-nutrients, larval dispersal... Chile current system. J. Climate, submitted. Colas, F., X. Capet, and J. McWilliams, 2010b: Mesoscale eddy buoyancy flux and eddy-induced

  9. Delicacy, Imprecision, and Uncertainty of Oceanic Simulations: An Investigation with the Regional Oceanic Modeling System (ROMS)

    DTIC Science & Technology

    2013-09-30

    Geochemistry and Ecosystems: An important community use for ROMS is biogeochemistry: chemical cycles, water quality, blooms, micro-nutrients, larval...Sci., submitted. Colas, F., J.C. McWilliams, X. Capet, and J. Kurian, 2012: Heat balance and eddies in the Peru-Chile Current System. Climate

  10. Decisions Under Uncertainty III: Rationality Issues, Sex Stereotypes, and Sex Role Appropriateness.

    ERIC Educational Resources Information Center

    Bonoma, Thomas V.

    The explanatory cornerstone of most currently viable social theories is a strict cost-gain assumption. The clearest formal explication of this view is contained in subjective expected utility models (SEU), in which individuals are assumed to scale their subjective likelihood estimates of decisional consequences and the personalistic worth or…

  11. Hydrological modeling as an evaluation tool of EURO-CORDEX climate projections and bias correction methods

    NASA Astrophysics Data System (ADS)

    Hakala, Kirsti; Addor, Nans; Seibert, Jan

    2017-04-01

    Streamflow stemming from Switzerland's mountainous landscape will be influenced by climate change, which will pose significant challenges to the water management and policy sector. In climate change impact research, the determination of future streamflow is impeded by different sources of uncertainty, which propagate through the model chain. In this research, we explicitly considered the following sources of uncertainty: (1) climate models, (2) downscaling of the climate projections to the catchment scale, (3) bias correction method and (4) parameterization of the hydrological model. We utilize climate projections at the 0.11 degree (approx. 12.5 km) resolution from the EURO-CORDEX project, which are the most recent climate projections for the European domain. EURO-CORDEX is comprised of regional climate model (RCM) simulations, which have been downscaled from global climate models (GCMs) from the CMIP5 archive, using both dynamical and statistical techniques. Uncertainties are explored by applying a modeling chain involving 14 GCM-RCMs to ten Swiss catchments. We utilize the rainfall-runoff model HBV Light, which has been widely used in operational hydrological forecasting. The Lindström measure, a combination of model efficiency and volume error, was used as the objective function to calibrate HBV Light. The ten best parameter sets are then obtained by calibration using the genetic algorithm and Powell optimization (GAP) method. The GAP optimization method is based on the evolution of parameter sets, which works by selecting and recombining high-performing parameter sets with each other. Once HBV Light is calibrated, we perform a quantitative comparison of the influence of biases inherited from the climate model simulations with the biases stemming from the hydrological model.
The evaluation is conducted over two time periods: (i) 1980-2009, to characterize the simulation realism under the current climate, and (ii) 2070-2099, to identify the magnitude of the projected change of streamflow under the climate scenarios RCP4.5 and RCP8.5. We utilize two techniques for correcting biases in the climate model output: quantile mapping and a new method, frequency bias correction (FBC). The FBC method matches the frequencies between observed and GCM-RCM data; in this way, it can be used to correct across all time scales, which is a known limitation of quantile mapping. A novel approach for the evaluation of the climate simulations and bias correction methods was then applied. Streamflow can be thought of as the "great integrator" of uncertainties: the ability, or the lack thereof, to correctly simulate streamflow is a way to assess the realism of the bias-corrected climate simulations. Long-term monthly means as well as high- and low-flow metrics are used to evaluate the realism of the simulations under the current climate and to gauge the impacts of climate change on streamflow. Preliminary results show that under the present climate, calibration of the hydrological model contributes a much smaller band of uncertainty to the modeling chain than the bias correction of the GCM-RCMs. Therefore, for future time periods, we expect the bias correction of climate model data to have a greater influence on projected changes in streamflow than the calibration of the hydrological model.
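    Of the two bias correction techniques, empirical quantile mapping is straightforward to sketch: replace each simulated value with the observed value at the same empirical quantile. The toy series below (a constant +1 bias) are invented, not EURO-CORDEX output:

```python
import bisect

# "Observed" and biased "model" series, sorted to form empirical quantiles.
obs = sorted(2.0 + 0.1 * i for i in range(100))
sim = sorted(3.0 + 0.1 * i for i in range(100))   # constant +1 bias

def quantile_map(x, sim_sorted, obs_sorted):
    """Map x to the observed value at the same empirical quantile."""
    rank = bisect.bisect_left(sim_sorted, x)
    rank = min(rank, len(obs_sorted) - 1)
    return obs_sorted[rank]

corrected = [quantile_map(x, sim, obs) for x in sim]
bias_before = sum(sim) / len(sim) - sum(obs) / len(obs)
bias_after = sum(corrected) / len(corrected) - sum(obs) / len(obs)
```

    Because the mapping is built per quantile (and, in practice, per calendar period), it corrects the whole distribution at one time scale; the frequency-based FBC method described above is motivated by quantile mapping's difficulty in correcting all time scales at once.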

  12. "But it might be a heart attack": intolerance of uncertainty and panic disorder symptoms.

    PubMed

    Carleton, R Nicholas; Duranceau, Sophie; Freeston, Mark H; Boelen, Paul A; McCabe, Randi E; Antony, Martin M

    2014-06-01

    Panic disorder models describe interactions between feared anxiety-related physical sensations (i.e., anxiety sensitivity; AS) and catastrophic interpretations therein. Intolerance of uncertainty (IU) has been implicated as necessary for catastrophic interpretations in community samples. The current study examined relationships between IU, AS, and panic disorder symptoms in a clinical sample. Participants had a principal diagnosis of panic disorder, with or without agoraphobia (n=132; 66% women). IU was expected to account for significant variance in panic symptoms controlling for AS. AS was expected to mediate the relationship between IU and panic symptoms, whereas IU was expected to moderate the relationship between AS and panic symptoms. Hierarchical linear regressions indicated that IU accounted for significant unique variance in panic symptoms relative to AS, with comparable part correlations. Mediation and moderation models were also tested and suggested direct and indirect effects of IU on panic symptoms through AS; however, an interaction effect was not supported. The current cross-sectional evidence supports a role for IU in panic symptoms, independent of AS. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Full uncertainty quantification of N2O and NO emissions using the biogeochemical model LandscapeDNDC on site and regional scale

    NASA Astrophysics Data System (ADS)

    Haas, Edwin; Santabarbara, Ignacio; Kiese, Ralf; Butterbach-Bahl, Klaus

    2017-04-01

    Numerical simulation models are increasingly used to estimate greenhouse gas emissions at site to regional/national scales and are outlined as the most advanced methodology (Tier 3) in the framework of UNFCCC reporting. Process-based models incorporate the major processes of the carbon and nitrogen cycles of terrestrial ecosystems and are thus thought to be widely applicable under various conditions and at various spatial scales. Process-based modelling requires high-spatial-resolution input data on soil properties, climate drivers and management information. The acceptance of model-based inventory calculations depends on the assessment of the inventory's uncertainty (model-, input data- and parameter-induced uncertainties). In this study we fully quantify the uncertainty in modelling soil N2O and NO emissions from arable, grassland and forest soils using the biogeochemical model LandscapeDNDC. We address model-induced uncertainty (MU) by contrasting two different soil biogeochemistry modules within LandscapeDNDC. The parameter-induced uncertainty (PU) was assessed by using joint parameter distributions for key parameters describing microbial C and N turnover processes, as obtained by different Bayesian calibration studies for each model configuration. Input-data-induced uncertainty (DU) was addressed by Bayesian calibration of soil properties, climate drivers and agricultural management practice data. For each of MU, DU and PU we performed several hundred simulations to contribute to the individual uncertainty assessments. For the overall uncertainty quantification we assessed the model prediction probability across the sampled sets of input datasets and parameter distributions. Statistical analysis of the simulation results has been used to quantify the overall full uncertainty of the modelling approach. With this study we can contrast the variation in model results with the different sources of uncertainty for each ecosystem.
Further, we have been able to perform a full uncertainty analysis for modelling N2O and NO emissions from arable, grassland and forest soils, which is necessary for the interpretability of modelling results. We have applied the methodology to a regional inventory to assess the overall modelling uncertainty of a regional N2O and NO emissions inventory for the state of Saxony, Germany.

  14. Uncertainty in tsunami sediment transport modeling

    USGS Publications Warehouse

    Jaffe, Bruce E.; Goto, Kazuhisa; Sugawara, Daisuke; Gelfenbaum, Guy R.; La Selle, SeanPaul M.

    2016-01-01

    Erosion and deposition from tsunamis record information about tsunami hydrodynamics and size that can be interpreted to improve tsunami hazard assessment. We explore sources and methods for quantifying uncertainty in tsunami sediment transport modeling. Uncertainty varies with tsunami, study site, available input data, sediment grain size, and model. Although uncertainty has the potential to be large, published case studies indicate that both forward and inverse tsunami sediment transport models perform well enough to be useful for deciphering tsunami characteristics, including size, from deposits. New techniques for quantifying uncertainty, such as Ensemble Kalman Filtering inversion, and more rigorous reporting of uncertainties will advance the science of tsunami sediment transport modeling. Uncertainty may be decreased with additional laboratory studies that increase our understanding of the semi-empirical parameters and physics of tsunami sediment transport, standardized benchmark tests to assess model performance, and development of hybrid modeling approaches to exploit the strengths of forward and inverse models.
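    The Ensemble Kalman Filtering inversion mentioned above can be sketched on a scalar toy problem: infer a flow speed from an observed deposit thickness through a forward model, updating a prior ensemble with a gain estimated from ensemble covariances. The quadratic forward model, the prior, and the observation error below are all invented for illustration:

```python
import random

random.seed(5)

def forward(u):
    """Illustrative forward model: deposit thickness from flow speed."""
    return 0.05 * u ** 2

u_true = 4.0
obs = forward(u_true)   # noise-free synthetic observation (0.8 m)
obs_sd = 0.05           # assumed observation error

# Prior ensemble of flow speeds and their predicted thicknesses.
ens = [random.gauss(3.0, 1.0) for _ in range(500)]
pred = [forward(u) for u in ens]

n = len(ens)
mu_u = sum(ens) / n
mu_p = sum(pred) / n
cov_up = sum((u - mu_u) * (p - mu_p) for u, p in zip(ens, pred)) / (n - 1)
var_p = sum((p - mu_p) ** 2 for p in pred) / (n - 1)

# Kalman gain and stochastic (perturbed-observation) ensemble update.
gain = cov_up / (var_p + obs_sd ** 2)
updated = [u + gain * (obs + random.gauss(0.0, obs_sd) - p)
           for u, p in zip(ens, pred)]

prior_err = abs(mu_u - u_true)
post_err = abs(sum(updated) / n - u_true)
```

    The updated ensemble both shifts toward the true flow speed and retains a spread, which is the quantified uncertainty the abstract calls for in inverse sediment transport modeling.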

  15. Microbial models with data-driven parameters predict stronger soil carbon responses to climate change.

    PubMed

    Hararuk, Oleksandra; Smith, Matthew J; Luo, Yiqi

    2015-06-01

    Long-term carbon (C) cycle feedbacks to climate depend on the future dynamics of soil organic carbon (SOC). Current models show low predictive accuracy at simulating contemporary SOC pools, which can be improved through parameter estimation. However, major uncertainty remains in global soil responses to climate change, particularly uncertainty in how the activity of soil microbial communities will respond. To date, the role of microbes in SOC dynamics has been implicitly described by decay rate constants in most conventional global carbon cycle models. Explicitly including microbial biomass dynamics in C cycle model formulations has shown potential to improve model predictive performance when assessed against global SOC databases. This study aimed to constrain the parameters of two soil microbial models with data, evaluate the improvement in performance of those calibrated models in predicting contemporary carbon stocks, and compare the SOC responses to climate change, and their uncertainties, between microbial and conventional models. Microbial models with calibrated parameters explained 51% of variability in the observed total SOC, whereas a calibrated conventional model explained 41%. The microbial models, when forced with climate and soil carbon input predictions from the 5th Coupled Model Intercomparison Project (CMIP5), produced stronger soil C responses to 95 years of climate change than any of the 11 CMIP5 models. The calibrated microbial models predicted between 8% (2-pool model) and 11% (4-pool model) soil C losses, compared with CMIP5 model projections, which ranged from a 7% loss to a 22.6% gain. Lastly, we observed unrealistic oscillatory SOC dynamics in the 2-pool microbial model. The 4-pool model also produced oscillations, but they were less prominent and could be avoided, depending on the parameter values. © 2014 John Wiley & Sons Ltd.
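    The oscillatory dynamics noted above can be illustrated with a minimal two-pool microbial model (one SOC pool, one microbial biomass pool, Michaelis-Menten uptake). This is only a sketch with illustrative parameter values, not the calibrated models of the study:

```python
import numpy as np

def two_pool_microbial(c0=100.0, b0=2.0, years=200, dt=0.01,
                       inputs=3.0, v_max=8.0, k_m=250.0,
                       cue=0.4, k_b=0.2):
    """Forward-Euler integration of a minimal two-pool microbial SOC model.

    dC/dt = I - Vmax*B*C/(Km + C) + kb*B        (SOC pool, with microbial necromass return)
    dB/dt = CUE*Vmax*B*C/(Km + C) - kb*B        (microbial biomass; (1-CUE) is respired)

    All parameter values here are illustrative, not the study's calibrated ones.
    """
    n = int(years / dt)
    c = np.empty(n)
    b = np.empty(n)
    c[0], b[0] = c0, b0
    for t in range(1, n):
        uptake = v_max * b[t - 1] * c[t - 1] / (k_m + c[t - 1])
        c[t] = c[t - 1] + dt * (inputs - uptake + k_b * b[t - 1])
        b[t] = b[t - 1] + dt * (cue * uptake - k_b * b[t - 1])
    return c, b

c, b = two_pool_microbial()
```

    Plotting `b` against time shows the biomass overshooting its equilibrium (here I*CUE/(kb*(1-CUE)) = 10) and relaxing through damped oscillations, the kind of behaviour the abstract flags as potentially unrealistic.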

  16. Uncertainty Aware Structural Topology Optimization Via a Stochastic Reduced Order Model Approach

    NASA Technical Reports Server (NTRS)

    Aguilo, Miguel A.; Warner, James E.

    2017-01-01

    This work presents a stochastic reduced order modeling strategy for the quantification and propagation of uncertainties in topology optimization. Uncertainty aware optimization problems can be computationally complex due to the substantial number of model evaluations that are necessary to accurately quantify and propagate uncertainties. This computational complexity is greatly magnified if a high-fidelity, physics-based numerical model is used for the topology optimization calculations. Stochastic reduced order model (SROM) methods are applied here to effectively 1) alleviate the prohibitive computational cost associated with an uncertainty aware topology optimization problem; and 2) quantify and propagate the inherent uncertainties due to design imperfections. A generic SROM framework that transforms the uncertainty aware, stochastic topology optimization problem into a deterministic optimization problem that relies only on independent calls to a deterministic numerical model is presented. This approach facilitates the use of existing optimization and modeling tools to accurately solve the uncertainty aware topology optimization problems in a fraction of the computational demand required by Monte Carlo methods. Finally, an example in structural topology optimization is presented to demonstrate the effectiveness of the proposed uncertainty aware structural topology optimization approach.
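    The core SROM idea, replacing a random input with a small set of weighted samples whose weights are optimized to match the input's statistics so that uncertainty propagation reduces to a handful of independent deterministic model calls, can be sketched as follows. The input distribution and the stand-in "model" are hypothetical, not those of the paper:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
# Hypothetical uncertain input (e.g. a geometric imperfection parameter)
samples = rng.lognormal(mean=0.0, sigma=0.25, size=5000)

m = 10                                                # SROM size: a few representative points
x = np.quantile(samples, (np.arange(m) + 0.5) / m)    # fixed sample locations
targets = [np.mean(samples**q) for q in (1, 2, 3, 4)]  # moments to reproduce

def moment_error(w):
    # Squared error between SROM moments and the target moments
    return sum((w @ x**q - t)**2 for q, t in zip((1, 2, 3, 4), targets))

res = minimize(moment_error, np.full(m, 1.0 / m), method="SLSQP",
               bounds=[(0.0, 1.0)] * m,
               constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0})
w = res.x

def model(p):                 # stand-in deterministic solver (placeholder for FEM etc.)
    return p**2 + 3.0 * p

mean_srom = w @ model(x)          # 10 deterministic calls
mean_mc = model(samples).mean()   # 5000 calls, for comparison
```

    The output statistic from 10 weighted deterministic runs closely tracks the brute-force Monte Carlo estimate, which is the computational saving the abstract describes.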

  17. When Models and Observations Collide: Journeying towards an Integrated Snow Depth Product

    NASA Astrophysics Data System (ADS)

    Webster, M.; Petty, A.; Boisvert, L.; Markus, T.; Kurtz, N. T.; Kwok, R.; Perovich, D. K.

    2017-12-01

    Knowledge of snow depth is essential for assessing changes in sea ice mass balance due to snow's insulating and reflective properties. In remote sensing applications, the accuracy of sea ice thickness retrievals from altimetry crucially depends on snow depth. Despite the need for snow depth data, we currently lack continuous observations that capture the basin-scale snow depth distribution and its seasonal evolution. Recent in situ and remote sensing observations are sparse in space and time, and contain uncertainties, caveats, and/or biases that often require careful interpretation. Likewise, using model output for remote sensing applications is limited due to uncertainties in atmospheric forcing and different treatments of snow processes. Here, we summarize our efforts in bringing observational and model data together to develop an approach for an integrated snow depth product. We start with a snow budget model and incrementally incorporate snow processes to determine the effects on snow depth and to assess model sensitivity. We discuss lessons learned in model-observation integration and ideas for potential improvements to the treatment of snow in models.

  18. Results and Error Estimates from GRACE Forward Modeling over Greenland, Canada, and Alaska

    NASA Astrophysics Data System (ADS)

    Bonin, J. A.; Chambers, D. P.

    2012-12-01

    Forward modeling using a weighted least squares technique allows GRACE information to be projected onto a pre-determined collection of local basins. This decreases the impact of spatial leakage, allowing estimates of mass change to be better localized. The technique is especially valuable where models of current-day mass change are poor, such as over Greenland and Antarctica. However, the accuracy of the forward model technique has not been determined, nor is it known how the distribution of the local basins affects the results. We use a "truth" model composed of hydrology and ice-melt slopes as an example case, to estimate the uncertainties of this forward modeling method and expose those design parameters which may result in an incorrect high-resolution mass distribution. We then apply these optimal parameters in a forward model estimate created from RL05 GRACE data. We compare the resulting mass slopes with the expected systematic errors from the simulation, as well as GIA and basic trend-fitting uncertainties. We also consider whether specific regions (such as Ellesmere Island and Baffin Island) can be estimated reliably using our optimal basin layout.
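    The projection of gridded observations onto a pre-determined collection of local basins via weighted least squares can be sketched with synthetic data. The basin layout, noise level and mass-change rates below are all illustrative, not GRACE values:

```python
import numpy as np

rng = np.random.default_rng(42)
n_grid, n_basins = 400, 4

# Basin "patterns": each grid cell belongs to one of a few local basins
membership = rng.integers(0, n_basins, size=n_grid)
A = np.zeros((n_grid, n_basins))
A[np.arange(n_grid), membership] = 1.0

truth = np.array([2.0, -1.5, 0.5, 3.0])       # synthetic mass-change rate per basin
noise_sd = 0.8 * np.ones(n_grid)              # per-cell observation noise
y = A @ truth + rng.normal(0.0, noise_sd)     # noisy gridded "observation"

# Weighted least squares: weight each cell by its inverse observation variance
W = np.diag(1.0 / noise_sd**2)
est = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
```

    Because each basin aggregates many grid cells, the per-basin estimates are far less noisy than the individual cells, which is what makes the localized forward-model estimates useful.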

  19. Multi-model approach to assess the impact of climate change on runoff

    NASA Astrophysics Data System (ADS)

    Dams, J.; Nossent, J.; Senbeta, T. B.; Willems, P.; Batelaan, O.

    2015-10-01

    The assessment of climate change impacts on hydrology is subject to uncertainties related to the climate change scenarios, stochastic uncertainties of the hydrological model and structural uncertainties of the hydrological model. This paper focuses on the contribution of structural uncertainty of hydrological models to the overall uncertainty of the climate change impact assessment. To quantify the structural uncertainty of hydrological models, four physically based hydrological models (SWAT, PRMS and a semi- and fully distributed version of the WetSpa model) are set up for a catchment in Belgium. Each model is calibrated using four different objective functions. Three climate change scenarios with a high, mean and low hydrological impact are statistically perturbed from a large ensemble of climate change scenarios and are used to force the hydrological models. This methodology allows assessing and comparing the uncertainty introduced by the climate change scenarios with the uncertainty introduced by the hydrological model structure. Results show that the hydrological model structure introduces a large uncertainty on both the average monthly discharge and the extreme peak and low flow predictions under the climate change scenarios. For the low impact climate change scenario, the uncertainty range of the mean monthly runoff is comparable to the range of these runoff values in the reference period. However, for the mean and high impact scenarios, this range is significantly larger. The uncertainty introduced by the climate change scenarios is larger than the uncertainty due to the hydrological model structure for the low and mean hydrological impact scenarios, but the reverse is true for the high impact climate change scenario. The mean and high impact scenarios project increasing peak discharges, while the low impact scenario projects increasing peak discharges only for peak events with return periods larger than 1.6 years. 
For all scenarios, all models suggest a decrease of the lowest flows, except for the SWAT model under the mean hydrological impact climate change scenario. The results of this study indicate that, in addition to the uncertainty introduced by the climate change scenarios, hydrological model structural uncertainty should also be taken into account when assessing climate change impacts on hydrology. To make including model structural uncertainty in hydrological impact studies more straightforward and transparent, there is a need for hydrological modelling tools that allow flexible model structures, and for methods to validate model structures in their ability to assess impacts under unobserved future climatic conditions.

  20. Development and comparison of Bayesian modularization method in uncertainty assessment of hydrological models

    NASA Astrophysics Data System (ADS)

    Li, L.; Xu, C.-Y.; Engeland, K.

    2012-04-01

    With respect to model calibration, parameter estimation and the analysis of uncertainty sources, different approaches have been used in hydrological modelling. The Bayesian method is one of the most widely used methods for uncertainty assessment of hydrological models; it incorporates different sources of information into a single analysis through Bayes' theorem. However, none of these applications treats the uncertainty in the extreme flows of hydrological model simulations well. This study proposes a Bayesian modularization approach for uncertainty assessment of conceptual hydrological models that considers the extreme flows. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and traditional Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used in combination with the traditional Bayesian approach: the AR(1) plus Normal, time-period-independent model (Model 1); the AR(1) plus Normal, time-period-dependent model (Model 2); and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimates over the entire flow range and in terms of application and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models via Bayesian methods. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
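    A minimal random-walk Metropolis-Hastings sampler of the kind used in such Bayesian calibrations can be sketched as follows. The one-parameter toy model and Gaussian likelihood are hypothetical stand-ins, not WASMOD or the AR(1) likelihoods above:

```python
import numpy as np

def metropolis_hastings(log_post, theta0, n_iter=20000, step=0.2, seed=1):
    """Generic random-walk Metropolis-Hastings for a scalar parameter."""
    rng = np.random.default_rng(seed)
    chain = np.empty(n_iter)
    theta, lp = theta0, log_post(theta0)
    for i in range(n_iter):
        prop = theta + step * rng.normal()
        lp_prop = log_post(prop)
        # Accept with probability min(1, posterior ratio)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

# Toy "model": a single scale parameter k fit to observed flows (synthetic data)
obs = np.array([1.2, 0.9, 1.4, 1.1, 1.0])

def log_post(k):
    # Flat prior on k > 0; i.i.d. Gaussian errors with sigma = 0.2
    if k <= 0:
        return -np.inf
    return -0.5 * np.sum((obs - k)**2) / 0.2**2

chain = metropolis_hastings(log_post, theta0=1.0)
posterior_mean = chain[5000:].mean()   # discard burn-in
```

    The retained part of the chain approximates the posterior; its spread gives the parameter-induced uncertainty band that such studies propagate into simulated discharge.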

  1. Quantifying the uncertainties in life cycle greenhouse gas emissions for UK wheat ethanol

    NASA Astrophysics Data System (ADS)

    Yan, Xiaoyu; Boies, Adam M.

    2013-03-01

    Biofuels are increasingly promoted worldwide as a means for reducing greenhouse gas (GHG) emissions from transport. However, current regulatory frameworks and most academic life cycle analyses adopt a deterministic approach in determining the GHG intensities of biofuels and thus ignore the inherent risk associated with biofuel production. This study aims to develop a transparent stochastic method for evaluating UK biofuels that determines both the magnitude and uncertainty of GHG intensity on the basis of current industry practices. Using wheat ethanol as a case study, we show that the GHG intensity could span a range of 40-110 gCO2e MJ-1 when land use change (LUC) emissions and various sources of uncertainty are taken into account, as compared with a regulatory default value of 44 gCO2e MJ-1. This suggests that the current deterministic regulatory framework underestimates wheat ethanol GHG intensity and thus may not be effective in evaluating transport fuels. Uncertainties in determining the GHG intensity of UK wheat ethanol include limitations of available data at a localized scale, and significant scientific uncertainty of parameters such as soil N2O and LUC emissions. Biofuel polices should be robust enough to incorporate the currently irreducible uncertainties and flexible enough to be readily revised when better science is available.
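    A transparent stochastic evaluation of this kind can be sketched as a simple Monte Carlo propagation of uncertain life-cycle components into a GHG-intensity distribution. The component distributions and parameter values below are purely illustrative, not the study's inventory:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical life-cycle components of GHG intensity (gCO2e per MJ ethanol)
farming = rng.normal(25.0, 5.0, n)           # fertilizer, diesel, soil N2O
processing = rng.normal(15.0, 3.0, n)        # ethanol plant energy use
luc = rng.lognormal(np.log(10.0), 0.6, n)    # land use change: skewed, highly uncertain

intensity = farming + processing + luc
p5, p50, p95 = np.percentile(intensity, [5, 50, 95])
```

    Reporting the 5th-95th percentile range rather than a single deterministic value is what exposes the risk that a regulatory default figure understates the true intensity.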

  2. A Python Interface for the Dakota Iterative Systems Analysis Toolkit

    NASA Astrophysics Data System (ADS)

    Piper, M.; Hutton, E.; Syvitski, J. P.

    2016-12-01

    Uncertainty quantification is required to improve the accuracy, reliability, and accountability of Earth science models. Dakota is a software toolkit, developed at Sandia National Laboratories, that provides an interface between models and a library of analysis methods, including support for sensitivity analysis, uncertainty quantification, optimization, and calibration techniques. Dakota is a powerful tool, but its learning curve is steep: the user must not only understand the structure and syntax of the Dakota input file, but also develop intermediate code, called an analysis driver, that allows Dakota to run a model. The CSDMS Dakota interface (CDI) is a Python package that wraps and extends Dakota's user interface. It simplifies the process of configuring and running a Dakota experiment. A user can program to the CDI, allowing a Dakota experiment to be scripted. The CDI creates Dakota input files and provides a generic analysis driver. Any model written in Python that exposes a Basic Model Interface (BMI), as well as any model componentized in the CSDMS modeling framework, automatically works with the CDI. The CDI has a plugin architecture, so models written in other languages, or those that don't expose a BMI, can be accessed by the CDI by programmatically extending a template; an example is provided in the CDI distribution. Currently, six analysis methods from the much larger Dakota library have been implemented as examples. To demonstrate the CDI, we performed an uncertainty quantification experiment with the HydroTrend hydrological water balance and transport model. In the experiment, we evaluated the response of long-term suspended sediment load at the river mouth (Qs) to uncertainty in two input parameters, annual mean temperature (T) and precipitation (P), over a series of 100-year runs, using the polynomial chaos method.
Through Dakota, we calculated moments, local and global (Sobol') sensitivity indices, and probability density and cumulative distribution functions for the response.

  3. Atmospheric Carbon Dioxide and the Global Carbon Cycle: The Key Uncertainties

    DOE R&D Accomplishments Database

    Peng, T. H.; Post, W. M.; DeAngelis, D. L.; Dale, V. H.; Farrell, M. P.

    1987-12-01

    The biogeochemical cycling of carbon between its sources and sinks determines the rate of increase in atmospheric CO2 concentrations. The observed increase in atmospheric CO2 content is less than the estimated release from fossil fuel consumption and deforestation. This discrepancy can be explained by interactions between the atmosphere and other global carbon reservoirs, such as the oceans and the terrestrial biosphere, including soils. Undoubtedly, the oceans have been the most important sinks for CO2 produced by man. But the physical, chemical, and biological processes of oceans are complex, and therefore credible estimates of CO2 uptake can probably only come from mathematical models. Unfortunately, one- and two-dimensional ocean models do not allow for enough CO2 uptake to accurately account for known releases; thus, they produce higher concentrations of atmospheric CO2 than was historically the case. More complex three-dimensional models, currently being developed, may make better use of existing tracer data than do one- and two-dimensional models and will also incorporate climate feedback effects to provide a more realistic view of ocean dynamics and CO2 fluxes. The inability of current models to accurately estimate oceanic uptake of CO2 creates one of the key uncertainties in predictions of atmospheric CO2 increases and climate responses over the next 100 to 200 years.

  4. Propagation of model and forcing uncertainty into hydrological drought characteristics in a multi-model century-long experiment in continental river basins

    NASA Astrophysics Data System (ADS)

    Samaniego, L. E.; Kumar, R.; Schaefer, D.; Huang, S.; Yang, T.; Mishra, V.; Eisner, S.; Vetter, T.; Pechlivanidis, I.; Liersch, S.; Flörke, M.; Krysanova, V.

    2015-12-01

    Droughts are creeping hydro-meteorological events that bring societies and natural systems to their limits and induce considerable socio-economic losses. It is currently hypothesized that climate change will exacerbate current trends, leading to more severe and extended droughts as well as larger-than-normal recovery periods. Current assessments, however, lack a consistent framework to deal with compatible initial conditions for the impact models and a set of standardized historical and future forcings. The ISI-MIP project provides a unique opportunity to understand the propagation of model and forcing uncertainty into century-long time series of drought characteristics using an ensemble of model predictions across a broad range of climate scenarios and regions. In the present study, we analyze this issue using the hydrologic simulations carried out with HYPE, mHM, SWIM, VIC, and WaterGAP3 in seven large continental river basins: Amazon, Blue Nile, Ganges, Niger, Mississippi, Rhine, and Yellow. All models were calibrated against observed streamflow during the period 1971-2001 using the same forcings, based on the WATCH data sets. These constrained models were then forced with bias-corrected outputs of five CMIP5 GCMs under four RCP scenarios (i.e., 2.6, 4.5, 6.0, and 8.5 W/m2) for the period 1971-2099. A non-parametric kernel density approach is used to estimate the temporal evolution of a monthly runoff index based on simulated streamflow. Hydrologic simulations corresponding to each GCM during the historic period 1981-2010 serve as reference for the estimation of the basin-specific monthly probability distribution functions. GCM-specific reference pdfs are then used to recast the future hydrologic model outputs from the different RCP scenarios. Based on these results, drought severity and duration are investigated during three periods: 1) 2006-2035, 2) 2036-2065 and 3) 2070-2099.
Two main hypotheses are investigated: 1) that the predictive uncertainty of drought indices among different hydrologic models is negligible compared to the uncertainty originating from different GCMs, and 2) that the projected drift of drought characteristics is independent of the hydrologic model and is driven only by the GCM variability. The temporal relationship between drought severity and duration is also analyzed.
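    The kernel-density step can be sketched for a single basin and month: estimate the pdf of reference-period runoff non-parametrically, then map a simulated value through the resulting CDF onto a standard normal to obtain a standardized runoff index. The reference data here are synthetic, and the study's exact index definition may differ:

```python
import numpy as np
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(3)
# Synthetic 30-year reference record of monthly runoff (mm), skewed like real runoff
reference = rng.gamma(shape=2.0, scale=50.0, size=360)

kde = gaussian_kde(reference)   # non-parametric pdf of the reference period

def runoff_index(q):
    """Standardized runoff index: KDE-based CDF mapped to a standard normal."""
    p = kde.integrate_box_1d(-np.inf, q)
    return norm.ppf(np.clip(p, 1e-6, 1 - 1e-6))

sri_wet = runoff_index(np.quantile(reference, 0.9))   # unusually wet month
sri_dry = runoff_index(np.quantile(reference, 0.1))   # unusually dry month
```

    Recasting future simulated runoff through the GCM-specific reference CDF, as the abstract describes, makes drought severity directly comparable across models and scenarios.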

  5. Model-independent determination of the strong phase difference between D 0 and {\\overline{D}}^0\\to {π}+{π}-{π}+{π}- amplitudes

    NASA Astrophysics Data System (ADS)

    Harnew, Samuel; Naik, Paras; Prouve, Claire; Rademacker, Jonas; Asner, David

    2018-01-01

    For the first time, the strong phase difference between D 0 and {\\overline{D}}^0\\to {π}+{π}-{π}+{π}- amplitudes is determined in bins of the decay phase space. The measurement uses 818 pb-1 of e + e - collision data that is taken at the ψ(3770) resonance and collected by the CLEO-c experiment. The measurement is important for the determination of the CP -violating phase γ in B ± → DK ± (and similar) decays, where the D meson (which represents a superposition of D 0 and {\\overline{D}}^0 ) subsequently decays to π + π - π + π -. To obtain optimal sensitivity to γ, the phase space of the D → π + π - π + π - decay is divided into bins based on a recent amplitude model of the decay. Although an amplitude model is used to define the bins, the measurements obtained are model-independent. The CP -even fraction of the D → π + π - π + π - decay is determined to be F + 4 π = 0.769 ± 0.021 ± 0.010, where the uncertainties are statistical and systematic, respectively. Using simulated B ± → DK ±, D → π + π - π + π - decays, it is estimated that by the end of the current LHC run, the LHCb experiment could determine γ from this decay mode with an uncertainty of (±10 ± 7)°, where the first uncertainty is statistical based on estimated LHCb event yields, and the second is due to the uncertainties on the parameters determined in this paper.

  6. Uncertainty in the Modeling of Tsunami Sediment Transport

    NASA Astrophysics Data System (ADS)

    Jaffe, B. E.; Sugawara, D.; Goto, K.; Gelfenbaum, G. R.; La Selle, S.

    2016-12-01

    Erosion and deposition from tsunamis record information about tsunami hydrodynamics and size that can be interpreted to improve tsunami hazard assessment. A recent study (Jaffe et al., 2016) explores sources and methods for quantifying uncertainty in tsunami sediment transport modeling. Uncertainty varies with tsunami properties, study site characteristics, available input data, sediment grain size, and the model used. Although uncertainty has the potential to be large, case studies for both forward and inverse models have shown that sediment transport modeling provides useful information on tsunami inundation and hydrodynamics that can be used to improve tsunami hazard assessment. New techniques for quantifying uncertainty, such as Ensemble Kalman Filtering inversion, and more rigorous reporting of uncertainties will advance the science of tsunami sediment transport modeling. Uncertainty may be decreased with additional laboratory studies that increase our understanding of the semi-empirical parameters and physics of tsunami sediment transport, standardized benchmark tests to assess model performance, and the development of hybrid modeling approaches to exploit the strengths of forward and inverse models. As uncertainty in tsunami sediment transport modeling is reduced, and with increased ability to quantify uncertainty, the geologic record of tsunamis will become more valuable in the assessment of tsunami hazard. Jaffe, B., Goto, K., Sugawara, D., Gelfenbaum, G., and La Selle, S., "Uncertainty in Tsunami Sediment Transport Modeling", Journal of Disaster Research Vol. 11 No. 4, pp. 647-661, 2016, doi: 10.20965/jdr.2016.p0647 https://www.fujipress.jp/jdr/dr/dsstr001100040647/

  7. The North American Multi-Model Ensemble (NMME): Phase-1 Seasonal to Interannual Prediction, Phase-2 Toward Developing Intra-Seasonal Prediction

    NASA Technical Reports Server (NTRS)

    Kirtman, Ben P.; Min, Dughong; Infanti, Johnna M.; Kinter, James L., III; Paolino, Daniel A.; Zhang, Qin; vandenDool, Huug; Saha, Suranjana; Mendez, Malaquias Pena; Becker, Emily

    2013-01-01

    The recent US National Academies report "Assessment of Intraseasonal to Interannual Climate Prediction and Predictability" was unequivocal in recommending the need for the development of a North American Multi-Model Ensemble (NMME) operational predictive capability. Indeed, this effort is required to meet the specific tailored regional prediction and decision support needs of a large community of climate information users. The multi-model ensemble approach has proven extremely effective at quantifying prediction uncertainty due to uncertainty in model formulation, and has been shown to produce better prediction quality (on average) than any single model ensemble. This multi-model approach is the basis for several international collaborative prediction research efforts and an operational European system, and there are numerous examples of how this multi-model ensemble approach yields superior forecasts compared to any single model. Based on two NOAA Climate Test Bed (CTB) NMME workshops (February 18 and April 8, 2011), a collaborative and coordinated implementation strategy for an NMME prediction system has been developed and is currently delivering real-time seasonal-to-interannual predictions on the NOAA Climate Prediction Center (CPC) operational schedule. The hindcast and real-time prediction data are readily available (e.g., http://iridl.ldeo.columbia.edu/SOURCES/.Models/.NMME/) and in graphical format from CPC (http://origin.cpc.ncep.noaa.gov/products/people/wd51yf/NMME/index.html). Moreover, the NMME forecasts are already being used as guidance for operational forecasters. This paper describes the new NMME effort, presents an overview of the multi-model forecast quality, and discusses the complementary skill associated with individual models.

  8. Role of large scale energy systems models in R and D planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamontagne, J.

    1980-11-01

    Long-term energy policy deals with the problem of finite supplies of convenient energy sources becoming more costly as they are depleted. The development of alternative technologies to provide new sources of energy and extend the lives of current ones is an attractive option available to government. Thus, one aspect of long-term energy policy involves investment in R and D. The importance of the problems addressed by R and D to the future of society (especially with regard to energy) dictates adoption of a cogent approach to resource allocation and to the designation of priorities for R and D. It is hoped that energy systems models, when properly used, can provide useful inputs to this process. The influence of model results on energy policy makers who are not knowledgeable about flaws or uncertainties in the models, errors in assumptions in model inputs which can result in faulty forecasts, the overall usefulness of energy system models, and model limitations are discussed. It is suggested that the large scale energy systems models currently used for assessing a broad spectrum of policy issues need to be replaced with reasonably simple models capable of dealing with uncertainty in a straightforward manner, and their methodologies and the meaning of their results should be transparent, especially to those removed from the modeling process. Energy models should be clearly related to specific issues. Methodologies should be clearly related to specific decisions, and should allow adjustments to be easily made for alternative assumptions and for additional knowledge gained during the evolution of the energy system. (LCL)

  9. Evaluating fishery rehabilitation under uncertainty: A bioeconomic analysis of quota management for the Green Bay yellow perch fishery

    USGS Publications Warehouse

    Johnson, B.L.; Milliman, S.R.; Bishop, R.C.; Kitchell, J.F.

    1992-01-01

    The fishery for yellow perch Perca flavescens in Green Bay, Lake Michigan, is currently operating under a rehabilitation plan based on a commercial harvest quota. We developed a bioeconomic computer model that included links between population density and growth, recruitment, and fishing effort for this fishery. Random variability was included in the stock-recruitment relation and in a simulated population assessment. We used the model in an adaptive management framework to evaluate the effects of the rehabilitation plan on both commercial and sport fisheries and to search for ways to improve the plan. Results indicate that the current quota policy is a member of a set of policies that would meet most management goals and increase total value of the fishery. Sensitivity analyses indicate that this conclusion is robust over a wide range of biological conditions. We predict that commercial fishers will lose money relative to the baseline condition, but they may receive other benefits from the elimination of the common-property nature of the fishery. The prospect exists for managing variability in harvest and stock size and for maximizing economic returns in the fishery, but more information is required, primarily on sportfishing effort dynamics and angler preferences. Stock-recruitment relations, density dependence of growth, and dynamics of sportfishing effort are the primary sources of uncertainty limiting the precision of our predictions. The current quota policy is about as good as other policies at reducing this uncertainty and appears, overall, to be one of the best choices for this fishery. The analytical techniques used in this study were primarily simple, heuristic approaches that could be easily transferred to other studies.

  10. Rapid shelf‐wide cooling response of a stratified coastal ocean to hurricanes

    PubMed Central

    Miles, Travis; Xu, Yi; Kohut, Josh; Schofield, Oscar; Glenn, Scott

    2017-01-01

    Large uncertainty in the predicted intensity of tropical cyclones (TCs) persists compared to the steadily improving skill in the predicted TC tracks. This intensity uncertainty has its most significant implications in the coastal zone, where TC impacts to populated shorelines are greatest. Recent studies have demonstrated that rapid ahead‐of‐eye‐center cooling of a stratified coastal ocean can have a significant impact on hurricane intensity forecasts. Using observation‐validated, high‐resolution ocean modeling, the stratified coastal ocean cooling processes observed in two U.S. Mid‐Atlantic hurricanes were investigated: Hurricane Irene (2011)—with an inshore Mid‐Atlantic Bight (MAB) track during the late summer stratified coastal ocean season—and Tropical Storm Barry (2007)—with an offshore track during early summer. For both storms, the critical ahead‐of‐eye‐center depth‐averaged force balance across the entire MAB shelf included an onshore wind stress balanced by an offshore pressure gradient. This resulted in onshore surface currents opposing offshore bottom currents that enhanced surface to bottom current shear and turbulent mixing across the thermocline, resulting in the rapid cooling of the surface layer ahead‐of‐eye‐center. Because the same baroclinic and mixing processes occurred for two storms on opposite ends of the track and seasonal stratification envelope, the response appears robust. It will be critical to forecast these processes and their implications for a wide range of future storms using realistic 3‐D coupled atmosphere‐ocean models to lower the uncertainty in predictions of TC intensities and impacts and enable coastal populations to better respond to increasing rapid intensification threats in an era of rising sea levels. PMID:28944132

  11. Multifidelity, Multidisciplinary Design Under Uncertainty with Non-Intrusive Polynomial Chaos

    NASA Technical Reports Server (NTRS)

    West, Thomas K., IV; Gumbert, Clyde

    2017-01-01

    The primary objective of this work is to develop an approach for multifidelity uncertainty quantification and to lay the framework for future design under uncertainty efforts. In this study, multifidelity is used to describe both the fidelity of the modeling of the physical systems, as well as the difference in the uncertainty in each of the models. For computational efficiency, a multifidelity surrogate modeling approach based on non-intrusive polynomial chaos using the point-collocation technique is developed for the treatment of both multifidelity modeling and multifidelity uncertainty modeling. Two stochastic model problems are used to demonstrate the developed methodologies: a transonic airfoil model and a multidisciplinary aircraft analysis model. The results of both showed that the multifidelity modeling approach was able to predict the output uncertainty given by the high-fidelity model at a significant reduction in computational cost.
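    The point-collocation technique can be sketched for a single standard-normal input: evaluate the model at an oversampled set of random points, fit polynomial chaos (probabilists' Hermite) coefficients by least squares, and read statistics directly off the coefficients. The quadratic stand-in model below is hypothetical:

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(7)

def model(xi):                       # stand-in for an expensive solver
    return (1.0 + 0.3 * xi)**2       # uncertain input xi ~ N(0, 1)

degree = 3
n_pts = 2 * (degree + 1)             # 2x oversampling, as in point collocation
xi = rng.normal(size=n_pts)

def hermite_e(n, x):
    # Probabilists' Hermite polynomial He_n, orthogonal under the standard normal
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0
    return np.polynomial.hermite_e.hermeval(x, coeffs)

Psi = np.column_stack([hermite_e(k, xi) for k in range(degree + 1)])
c, *_ = np.linalg.lstsq(Psi, model(xi), rcond=None)

mean = c[0]                                                       # E[He_0] = 1, others vanish
var = sum(c[k]**2 * factorial(k) for k in range(1, degree + 1))   # E[He_k^2] = k!
```

    Because the statistics come from the coefficients of a cheap surrogate rather than from many direct model evaluations, the same machinery can combine low- and high-fidelity models at modest cost.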

  12. Investment in hydrogen tri-generation for wastewater treatment plants under uncertainties

    NASA Astrophysics Data System (ADS)

    Gharieh, Kaveh; Jafari, Mohsen A.; Guo, Qizhong

    2015-11-01

In this article, we present a compound real option model for investment in hydrogen tri-generation and onsite hydrogen dispensing systems for a wastewater treatment plant under price and market uncertainties. The ultimate objective is to determine the optimal timing and investment thresholds for exercising the initial and subsequent options such that total savings are maximized. The initial option includes investment in a 1.4 MW Molten Carbonate Fuel Cell (MCFC) fed by a mixture of waste biogas from anaerobic digestion and natural gas, along with auxiliary equipment. Hydrogen produced in the MCFC via internal reforming is recovered from the exhaust gas stream using Pressure Swing Adsorption (PSA) purification technology. The expansion option then includes investment in hydrogen compression, storage and dispensing (CSD) systems, which creates additional revenue by selling hydrogen onsite at retail price. This work extends the current state of investment modeling within the context of hydrogen tri-generation by considering: (i) a modular investment plan for hydrogen tri-generation and dispensing systems, (ii) multiple sources of uncertainty along with more realistic probability distributions, and (iii) optimal operation of the hydrogen tri-generation system, which results in more realistic savings estimates.

  13. Towards national-scale greenhouse gas emissions evaluation with robust uncertainty estimates

    NASA Astrophysics Data System (ADS)

    Rigby, Matthew; Swallow, Ben; Lunt, Mark; Manning, Alistair; Ganesan, Anita; Stavert, Ann; Stanley, Kieran; O'Doherty, Simon

    2016-04-01

    Through the Deriving Emissions related to Climate Change (DECC) network and the Greenhouse gAs Uk and Global Emissions (GAUGE) programme, the UK's greenhouse gases are now monitored by instruments mounted on telecommunications towers and churches, on a ferry that performs regular transects of the North Sea, on-board a research aircraft and from space. When combined with information from high-resolution chemical transport models such as the Met Office Numerical Atmospheric dispersion Modelling Environment (NAME), these measurements are allowing us to evaluate emissions more accurately than has previously been possible. However, it has long been appreciated that current methods for quantifying fluxes using atmospheric data suffer from uncertainties, primarily relating to the chemical transport model, that have been largely ignored to date. Here, we use novel model reduction techniques for quantifying the influence of a set of potential systematic model errors on the outcome of a national-scale inversion. This new technique has been incorporated into a hierarchical Bayesian framework, which can be shown to reduce the influence of subjective choices on the outcome of inverse modelling studies. Using estimates of the UK's methane emissions derived from DECC and GAUGE tall-tower measurements as a case study, we will show that such model systematic errors have the potential to significantly increase the uncertainty on national-scale emissions estimates. Therefore, we conclude that these factors must be incorporated in national emissions evaluation efforts, if they are to be credible.

  14. Setting priorities for research on pollution reduction functions of agricultural buffers.

    PubMed

    Dosskey, Michael G

    2002-11-01

The success of buffer installation initiatives and programs to reduce nonpoint source pollution of streams on agricultural lands will depend on the ability of local planners to locate and design buffers for specific circumstances with substantial and predictable results. Current predictive capabilities are inadequate, and major sources of uncertainty remain. An assessment of these uncertainties cautions that there is greater risk of overestimating buffer impact than underestimating it. Priorities for future research are proposed that will lead more quickly to major advances in predictive capabilities. Highest priority is given to work on the surface runoff filtration function, which is almost universally important to the amount of pollution reduction expected from buffer installation and for which there remain major sources of uncertainty for predicting level of impact. Foremost uncertainties surround the extent and consequences of runoff flow concentration and pollutant accumulation. Other buffer functions, including filtration of groundwater nitrate and stabilization of channel erosion sources of sediments, may be important in some regions. However, uncertainty surrounds our ability to identify and quantify the extent of site conditions where buffer installation can substantially reduce stream pollution in these ways. Deficiencies in predictive models reflect gaps in experimental information as well as technology to account for spatial heterogeneity of pollutant sources, pathways, and buffer capabilities across watersheds. Since completion of a comprehensive watershed-scale buffer model is probably far off, immediate needs call for simpler techniques to gauge the probable impacts of buffer installation at local scales.

  15. Probabilistic accounting of uncertainty in forecasts of species distributions under climate change

    USGS Publications Warehouse

    Wenger, Seth J.; Som, Nicholas A.; Dauwalter, Daniel C.; Isaak, Daniel J.; Neville, Helen M.; Luce, Charles H.; Dunham, Jason B.; Young, Michael K.; Fausch, Kurt D.; Rieman, Bruce E.

    2013-01-01

    Forecasts of species distributions under future climates are inherently uncertain, but there have been few attempts to describe this uncertainty comprehensively in a probabilistic manner. We developed a Monte Carlo approach that accounts for uncertainty within generalized linear regression models (parameter uncertainty and residual error), uncertainty among competing models (model uncertainty), and uncertainty in future climate conditions (climate uncertainty) to produce site-specific frequency distributions of occurrence probabilities across a species’ range. We illustrated the method by forecasting suitable habitat for bull trout (Salvelinus confluentus) in the Interior Columbia River Basin, USA, under recent and projected 2040s and 2080s climate conditions. The 95% interval of total suitable habitat under recent conditions was estimated at 30.1–42.5 thousand km; this was predicted to decline to 0.5–7.9 thousand km by the 2080s. Projections for the 2080s showed that the great majority of stream segments would be unsuitable with high certainty, regardless of the climate data set or bull trout model employed. The largest contributor to uncertainty in total suitable habitat was climate uncertainty, followed by parameter uncertainty and model uncertainty. Our approach makes it possible to calculate a full distribution of possible outcomes for a species, and permits ready graphical display of uncertainty for individual locations and of total habitat.
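The nested Monte Carlo idea (a climate draw, then a model draw, then a parameter draw, per realization) can be sketched for a single site as below. The two logistic models, their weights, and all coefficients are hypothetical stand-ins, not the paper's bull trout models.

```python
import numpy as np

rng = np.random.default_rng(42)
n_draws = 5000

# Hypothetical setup: two competing logistic occurrence models with
# information-criterion weights; every number here is illustrative.
model_weights = np.array([0.7, 0.3])
beta_mean = np.array([[4.0, -0.5], [3.2, -0.4]])  # intercept, temperature slope
beta_sd = np.array([[0.5, 0.05], [0.6, 0.06]])    # parameter standard errors

temp = rng.normal(16.0, 1.5, n_draws)             # climate uncertainty (2080s temp, C)
m = rng.choice(2, size=n_draws, p=model_weights)  # model uncertainty
beta = rng.normal(beta_mean[m], beta_sd[m])       # parameter uncertainty

p_occ = 1.0 / (1.0 + np.exp(-(beta[:, 0] + beta[:, 1] * temp)))
lo, hi = np.percentile(p_occ, [2.5, 97.5])        # 95% interval at this site
```

Repeating this per stream segment and summing suitable habitat across segments within each draw yields the kind of range-wide frequency distribution the abstract describes.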

  16. Can fuzzy logic bring complex problems into focus? Modeling imprecise factors in environmental policy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKone, Thomas E.; Deshpande, Ashok W.

    2004-06-14

In modeling complex environmental problems, we often fail to make precise statements about inputs and outcomes. In such cases the fuzzy logic method native to the human mind provides a useful way to get at these problems. Fuzzy logic represents a significant change in both the approach to and outcome of environmental evaluations. Risk assessment is currently based on the implicit premise that probability theory provides the necessary and sufficient tools for dealing with uncertainty and variability. The key advantage of fuzzy methods is the way they reflect the human mind in its remarkable ability to store and process information which is consistently imprecise, uncertain, and resistant to classification. Our case study illustrates the ability of fuzzy logic to integrate statistical measurements with imprecise health goals. But we submit that fuzzy logic and probability theory are complementary and not competitive. In the world of soft computing, fuzzy logic has been widely used and has often been the "smart" behind smart machines. But it will require more effort and case studies to establish its niche in risk assessment or other types of impact assessment. Although we often hear complaints about "bright lines," could we adapt to a system that relaxes these lines to fuzzy gradations? Would decision makers and the public accept expressions of water or air quality goals in linguistic terms with computed degrees of certainty? Resistance is likely. In many regions, such as the US and European Union, both decision makers and members of the public are likely more comfortable with our current system, in which government agencies avoid confronting uncertainties by setting guidelines that are crisp and often fail to communicate uncertainty. But some day perhaps a more comprehensive approach that couples exposure surveys, toxicological data, and epidemiological studies with fuzzy modeling will go a long way toward resolving some of the conflict, divisiveness, and controversy in the current regulatory paradigm.

  17. Quasielastic charged-current neutrino scattering in the scaling model with relativistic effective mass

    NASA Astrophysics Data System (ADS)

    Ruiz Simo, I.; Martinez-Consentino, V. L.; Amaro, J. E.; Ruiz Arriola, E.

    2018-06-01

We use a recent scaling analysis of the quasielastic electron scattering data from 12C to predict the quasielastic charge-changing neutrino scattering cross sections within an uncertainty band. We use a scaling function extracted from a selection of the (e,e') cross section data, and an effective nucleon mass inspired by the relativistic mean-field model of nuclear matter. The corresponding superscaling analysis with relativistic effective mass (SuSAM*) describes a large amount of the electron data lying inside a phenomenological quasielastic band. The effective mass incorporates the enhancement of the transverse current produced by the relativistic mean field. The scaling function incorporates nuclear effects beyond the impulse approximation, in particular meson-exchange currents and short-range correlations producing tails in the scaling function. Besides its simplicity, this model describes the neutrino data about as well as other, more sophisticated nuclear models.

  18. Chronic beryllium disease and cancer risk estimates with uncertainty for beryllium released to the air from the Rocky Flats Plant.

    PubMed Central

    McGavran, P D; Rood, A S; Till, J E

    1999-01-01

Beryllium was released into the air from routine operations and three accidental fires at the Rocky Flats Plant (RFP) in Colorado from 1958 to 1989. We evaluated environmental monitoring data and developed estimates of airborne concentrations and their uncertainties and calculated lifetime cancer risks and risks of chronic beryllium disease to hypothetical receptors. This article discusses exposure-response relationships for lung cancer and chronic beryllium disease. We assigned a distribution to cancer slope factor values based on the relative risk estimates from an occupational epidemiologic study used by the U.S. Environmental Protection Agency (EPA) to determine the slope factors. We used the Regional Atmospheric Transport Code for Hanford Emission Tracking (RATCHET) atmospheric transport model for exposure calculations because it is particularly well suited for long-term annual-average dispersion estimates and it incorporates spatially varying meteorologic and environmental parameters. We accounted for model prediction uncertainty by using several multiplicative stochastic correction factors that accounted for uncertainty in the dispersion estimate, the meteorology, deposition, and plume depletion. We used Monte Carlo techniques to propagate model prediction uncertainty through to the final risk calculations. We developed nine exposure scenarios of hypothetical but typical residents of the RFP area to consider the lifestyle, time spent outdoors, location, age, and sex of people who may have been exposed. We determined geometric mean incremental lifetime cancer incidence risk estimates for beryllium inhalation for each scenario. The risk estimates were < 10(-6). Predicted air concentrations were well below the current reference concentration derived by the EPA for beryllium sensitization. PMID:10464074
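The multiplicative correction-factor scheme is straightforward to sketch: each Monte Carlo realization multiplies the deterministic dispersion estimate by independently sampled lognormal factors, and the geometric mean of the resulting risks is reported. All distributions and values below are illustrative, not the study's calibrated inputs.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Hypothetical inputs: a nominal dispersion-model air concentration and a
# lognormal cancer slope factor distribution (all values are illustrative)
conc_nominal = 2.0e-4                                        # ug/m3
slope = rng.lognormal(mean=np.log(2.4), sigma=0.5, size=n)   # risk per (ug/m3)

# Multiplicative correction factors for the dispersion estimate, meteorology,
# deposition and plume depletion, each lognormal with an assumed geometric SD
gsd = np.array([1.8, 1.5, 1.6, 1.4])
corrections = np.prod(rng.lognormal(0.0, np.log(gsd), size=(n, 4)), axis=1)

risk = slope * conc_nominal * corrections        # per-realization lifetime risk
gm_risk = np.exp(np.log(risk).mean())            # geometric mean risk
```

A convenient property of the all-multiplicative form is that the geometric mean of the product equals the product of the geometric means, so the correction factors widen the distribution without shifting its geometric center.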

  19. Extending the SBUV MOD Ozone Profile data record with OMPS Nadir Profiler Data: Updated Trends and Uncertainties

    NASA Astrophysics Data System (ADS)

    Frith, S. M.; Stolarski, R. S.; McPeters, R. D.; Kramarova, N. A.

    2017-12-01

The Ozone Monitoring and Profile Suite (OMPS) on the Suomi NPP satellite comprises three instruments measuring profile and total column ozone. The Nadir Profiler sensor measures broadly-resolved vertical ozone profiles retrieved from backscattered UV radiances, and continues a nearly unbroken record of measurements from the Solar Backscatter Ultraviolet (SBUV and SBUV/2) series of instruments dating back to late 1978. The SBUV Merged Ozone Dataset (MOD) combines data from the SBUV instrument series into a single coherent data record. The last instrument in the series, operating on the NOAA 19 satellite, is expected to encounter higher measurement uncertainties as the N19 orbit drifts closer to the terminator, necessitating a move to the next generation OMPS instruments. Here we incorporate OMPS NP v2.3 data from 2012-2017 into the MOD record and evaluate the effects of the new data on the overall record, particularly the sensitivity of long-term trend estimates derived from MOD. We will evaluate the uncertainty associated with merging multiple records. We use a Monte Carlo modeling approach to estimate the potential for uncertainties in the calibration and drift of individual instruments to mimic long-term variations in the merged data set. Intra-instrument comparisons during overlap periods are used to quantify the uncertainty of each instrument in the Monte Carlo simulations. Current error estimates using this approach are likely conservative because we model a Gaussian distribution of potential offsets and drifts when the actual distributions are more complicated. In this work we will investigate the effects of the additional data set, but also pursue approaches to define the Monte Carlo model more precisely to better characterize the potential error.
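A minimal version of such a Monte Carlo merging experiment draws a random calibration offset and drift for each instrument segment, stitches them into one record, and looks at the spread of trends fitted to the stitched error series. The segment boundaries and error magnitudes below are invented for illustration, not the per-instrument values derived from overlap comparisons.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(39, dtype=float)      # years since 1978, annual anomalies
breaks = [0, 8, 17, 26, 39]         # hypothetical instrument switch points

trends = []
for _ in range(2000):
    err = np.zeros_like(t)
    for a, b in zip(breaks[:-1], breaks[1:]):
        offset = rng.normal(0.0, 1.0)    # calibration offset (%) per instrument
        drift = rng.normal(0.0, 0.2)     # drift (%/yr) per instrument
        err[a:b] = offset + drift * (t[a:b] - t[a])
    trends.append(np.polyfit(t, err, 1)[0])

trend_sd = float(np.std(trends))    # 1-sigma spurious trend (%/yr) from merging
```

The spread of the fitted slopes is the component of trend uncertainty attributable purely to instrument-to-instrument calibration and drift, i.e., how easily merging artifacts can mimic a real long-term change.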

  20. Global-mean BC lifetime as an indicator of model skill? Constraining the vertical aerosol distribution using aircraft observations

    NASA Astrophysics Data System (ADS)

    Lund, M. T.; Samset, B. H.; Skeie, R. B.; Berntsen, T.

    2017-12-01

Several recent studies have used observations from the HIPPO flight campaigns to constrain the modeled vertical distribution of black carbon (BC) over the Pacific. Results indicate a relatively linear relationship between global-mean atmospheric BC residence time, or lifetime, and bias in current models. A lifetime of less than 5 days is necessary for models to reasonably reproduce these observations. This is shorter than what many global models predict, which will in turn affect their estimates of BC climate impacts. Here we use the chemistry-transport model OsloCTM to examine whether this relationship between global BC lifetime and model skill also holds for a broader set of flight campaigns from 2009-2013 covering both remote marine and continental regions at a range of latitudes. We perform four sets of simulations with varying scavenging efficiency to obtain a spread in the modeled global BC lifetime and calculate the model error and bias for each campaign and region. Vertical BC profiles are constructed using an online flight simulator, as well as by averaging and interpolating monthly mean model output, allowing us to quantify sampling errors arising when measurements are compared with model output at different spatial and temporal resolutions. Using the OsloCTM coupled with a microphysical aerosol parameterization, we investigate the sensitivity of modeled BC vertical distribution to uncertainties in the aerosol aging and scavenging processes in more detail. From this, we can quantify how model uncertainties in the BC life cycle propagate into uncertainties in its climate impacts. For most campaigns and regions, a short global-mean BC lifetime corresponds with the lowest model error and bias. On an aggregated level, sampling errors appear to be small, but larger differences are seen in individual regions.
However, we also find that model-measurement discrepancies in BC vertical profiles cannot be uniquely attributed to uncertainties in a single process or parameter, at least in this model. Model development therefore needs to focus on improvements to individual processes, supported by a broad range of observational and experimental data, rather than tuning individual, effective parameters such as global BC lifetime.

  1. Prediction Accuracy of Error Rates for MPTB Space Experiment

    NASA Technical Reports Server (NTRS)

    Buchner, S. P.; Campbell, A. B.; Davis, D.; McMorrow, D.; Petersen, E. L.; Stassinopoulos, E. G.; Ritter, J. C.

    1998-01-01

This paper addresses the accuracy of radiation-induced upset-rate predictions in space using the results of ground-based measurements together with standard environmental and device models. The study is focused on two part types - 16 Mb NEC DRAMs (UPD4216) and 1 Kb SRAMs (AMD93L422) - both of which are currently in space on board the Microelectronics and Photonics Test Bed (MPTB). To date, ground-based measurements of proton-induced single event upset (SEU) cross sections as a function of energy have been obtained and combined with models of the proton environment to predict proton-induced error rates in space. The role played by uncertainties in the environmental models will be determined by comparing the modeled radiation environment with the actual environment measured aboard MPTB. Heavy-ion induced upsets have also been obtained from MPTB and will be compared with the "predicted" error rate following ground testing that will be done in the near future. These results should help identify sources of uncertainty in predictions of SEU rates in space.

  2. Identifying influences on model uncertainty: an application using a forest carbon budget model

    Treesearch

    James E. Smith; Linda S. Heath

    2001-01-01

    Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...

  3. A Bayesian Network Based Global Sensitivity Analysis Method for Identifying Dominant Processes in a Multi-physics Model

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2016-12-01

Sensitivity analysis has been an important tool in groundwater modeling to identify the influential parameters. Among various sensitivity analysis methods, the variance-based global sensitivity analysis has gained popularity for its model independence characteristic and capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers uncertainty contribution of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source is capable of containing multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility of using different grouping strategies for uncertainty components. The variance-based sensitivity analysis thus is improved to be able to investigate the importance of an extended range of uncertainty sources: scenario, model, and other different combinations of uncertainty components which can represent certain key model system processes (e.g., groundwater recharge process, flow reactive transport process). For test and demonstration purposes, the developed methodology was applied to a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources which were formed by different combinations of uncertainty components.
The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.
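A grouped variance-based (Sobol-type) sensitivity index can be estimated with a pick-freeze scheme, where the columns belonging to a group of uncertainty components are held fixed between two independent input matrices. The three-input toy model below is a stand-in for the groundwater reactive transport model; the grouping of inputs into a "process" is the key idea.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 40_000

def model(x):
    # toy stand-in: recharge (x0), conductivity (x1), reaction rate (x2)
    return x[:, 0] + 2.0 * x[:, 1] + x[:, 0] * x[:, 2]

def first_order_index(group):
    """Pick-freeze estimate of the (closed) first-order Sobol index of a group."""
    A = rng.standard_normal((n, 3))
    B = rng.standard_normal((n, 3))
    AB = B.copy()
    AB[:, group] = A[:, group]              # freeze the group's inputs
    yA, yAB = model(A), model(AB)
    return float(np.mean(yA * (yAB - yA.mean())) / np.var(yA))

S_recharge = first_order_index([0])         # single uncertainty component
S_flow_react = first_order_index([1, 2])    # grouped "process" index
```

For this toy model the total variance is 6, so the exact indices are 1/6 for the recharge input and 2/3 for the grouped flow/reaction inputs; grouping lets one attribute variance to a whole process rather than to individual parameters.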

  4. Partitioning the Uncertainty in Estimates of Mean Basal Area Obtained from 10-year Diameter Growth Model Predictions

    Treesearch

    Ronald E. McRoberts

    2005-01-01

    Uncertainty in model-based predictions of individual tree diameter growth is attributed to three sources: measurement error for predictor variables, residual variability around model predictions, and uncertainty in model parameter estimates. Monte Carlo simulations are used to propagate the uncertainty from the three sources through a set of diameter growth models to...

  5. Aeroservoelastic Uncertainty Model Identification from Flight Data

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.

    2001-01-01

Uncertainty modeling is a critical element in the estimation of robust stability margins for stability boundary prediction and robust flight control system development. To date, aeroservoelastic data analysis has given seriously insufficient attention to uncertainty modeling. Uncertainty can be estimated from flight data using both parametric and nonparametric identification techniques. The model validation problem addressed in this paper is to identify aeroservoelastic models with associated uncertainty structures from a limited amount of controlled excitation inputs over an extensive flight envelope. The challenge to this problem is to update analytical models from flight data estimates while also deriving non-conservative uncertainty descriptions consistent with the flight data. Multisine control surface command inputs and control system feedbacks are used as signals in a wavelet-based modal parameter estimation procedure for model updates. Transfer function estimates are incorporated in a robust minimax estimation scheme to get input-output parameters and error bounds consistent with the data and model structure. Uncertainty estimates derived from the data in this manner provide an appropriate and relevant representation for model development and robust stability analysis. This model-plus-uncertainty identification procedure is applied to aeroservoelastic flight data from the NASA Dryden Flight Research Center F-18 Systems Research Aircraft.

  6. Tissue resistivity estimation in the presence of positional and geometrical uncertainties.

    PubMed

    Baysal, U; Eyüboğlu, B M

    2000-08-01

    Geometrical uncertainties (organ boundary variation and electrode position uncertainties) are the biggest sources of error in estimating electrical resistivity of tissues from body surface measurements. In this study, in order to decrease estimation errors, the statistically constrained minimum mean squared error estimation algorithm (MiMSEE) is constrained with a priori knowledge of the geometrical uncertainties in addition to the constraints based on geometry, resistivity range, linearization and instrumentation errors. The MiMSEE calculates an optimum inverse matrix, which maps the surface measurements to the unknown resistivity distribution. The required data are obtained from four-electrode impedance measurements, similar to injected-current electrical impedance tomography (EIT). In this study, the surface measurements are simulated by using a numerical thorax model. The data are perturbed with additive instrumentation noise. Simulated surface measurements are then used to estimate the tissue resistivities by using the proposed algorithm. The results are compared with the results of conventional least squares error estimator (LSEE). Depending on the region, the MiMSEE yields an estimation error between 0.42% and 31.3% compared with 7.12% to 2010% for the LSEE. It is shown that the MiMSEE is quite robust even in the case of geometrical uncertainties.
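For a linearized forward problem, a statistically constrained MMSE inverse of this kind reduces to the classic linear-Gaussian estimator x_hat = Cx Aᵀ(A Cx Aᵀ + Cn)⁻¹ y. The sketch below uses random matrices in place of the numerical thorax model; dimensions, covariances, and noise level are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
n_meas, n_res = 32, 8

A = rng.standard_normal((n_meas, n_res))    # linearized sensitivity matrix
Cx = np.diag(rng.uniform(0.5, 2.0, n_res))  # a priori resistivity covariance
Cn = 0.5 * np.eye(n_meas)                   # instrumentation-noise covariance

x_true = rng.multivariate_normal(np.zeros(n_res), Cx)
y = A @ x_true + rng.multivariate_normal(np.zeros(n_meas), Cn)

# MiMSEE-style optimal linear inverse: exploits the prior and noise statistics
B = Cx @ A.T @ np.linalg.inv(A @ Cx @ A.T + Cn)
x_mmse = B @ y

# conventional least-squares inverse ignores the statistical constraints
x_ls = np.linalg.lstsq(A, y, rcond=None)[0]
```

Averaged over many noise realizations, the MMSE inverse has lower expected squared error than the unconstrained least-squares fit, which is the kind of gap the abstract's 0.42-31.3% versus 7.12-2010% comparison quantifies.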

  7. Oceanic ensemble forecasting in the Gulf of Mexico: An application to the case of the Deepwater Horizon oil spill

    NASA Astrophysics Data System (ADS)

    Khade, Vikram; Kurian, Jaison; Chang, Ping; Szunyogh, Istvan; Thyng, Kristen; Montuoro, Raffaele

    2017-05-01

    This paper demonstrates the potential of ocean ensemble forecasting in the Gulf of Mexico (GoM). The Bred Vector (BV) technique with a one-week rescaling frequency is implemented on a 9 km resolution version of the Regional Ocean Modelling System (ROMS). Numerical experiments are carried out by using the HYCOM analysis products to define the initial conditions and the lateral boundary conditions. The growth rates of the forecast uncertainty are estimated to be about 10% of initial amplitude per week. By carrying out ensemble forecast experiments with and without perturbed surface forcing, it is demonstrated that in the coastal regions accounting for uncertainties in the atmospheric forcing is more important than accounting for uncertainties in the ocean initial conditions. In the Loop Current region, the initial condition uncertainties are the dominant source of the forecast uncertainty. The root-mean-square error of the Lagrangian track forecasts at the 15-day forecast lead time can be reduced by about 10-50 km using the ensemble mean Eulerian forecast of the oceanic flow for the computation of the tracks, instead of the single-initial-condition Eulerian forecast.
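The breeding cycle itself is simple: evolve a control run and a perturbed run, difference them, rescale the difference back to a fixed amplitude, and repeat. A sketch on the Lorenz-63 system, standing in for the ocean model (the time step, cycle length, and perturbation size are arbitrary choices, not the paper's one-week/9 km configuration):

```python
import numpy as np

def lorenz_step(x, dt=0.005, s=10.0, r=28.0, b=8.0 / 3.0):
    # one forward-Euler step of the Lorenz-63 system (a toy chaotic flow)
    dx = np.array([s * (x[1] - x[0]),
                   x[0] * (r - x[2]) - x[1],
                   x[0] * x[1] - b * x[2]])
    return x + dt * dx

rng = np.random.default_rng(2)
x = np.array([1.0, 1.0, 20.0])
for _ in range(2000):                      # spin the control up onto the attractor
    x = lorenz_step(x)

size0 = 0.1                                # fixed rescaling amplitude
xp = x + size0 * rng.standard_normal(3)    # perturbed member
growth = []
for _ in range(20):                        # breeding cycles
    for _ in range(100):                   # free evolution between rescalings
        x, xp = lorenz_step(x), lorenz_step(xp)
    bv = xp - x                            # bred vector
    g = np.linalg.norm(bv) / size0         # growth factor over the cycle
    growth.append(g)
    xp = x + bv / g                        # rescale back to amplitude size0
```

After a few cycles the bred vector aligns with the fastest-growing directions of the flow, so the growth factors settle above one on a chaotic attractor; the same rescale-and-evolve loop is what generates the initial-condition perturbations of a BV ensemble.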

  8. Solid laboratory calibration of a nonimaging spectroradiometer.

    PubMed

    Schaepman, M E; Dangel, S

    2000-07-20

    Field-based nonimaging spectroradiometers are often used in vicarious calibration experiments for airborne or spaceborne imaging spectrometers. The calibration uncertainties associated with these ground measurements contribute substantially to the overall modeling error in radiance- or reflectance-based vicarious calibration experiments. Because of limitations in the radiometric stability of compact field spectroradiometers, vicarious calibration experiments are based primarily on reflectance measurements rather than on radiance measurements. To characterize the overall uncertainty of radiance-based approaches and assess the sources of uncertainty, we carried out a full laboratory calibration. This laboratory calibration of a nonimaging spectroradiometer is based on a measurement plan targeted at achieving a

  9. Extreme geomagnetic storms: Probabilistic forecasts and their uncertainties

    USGS Publications Warehouse

    Riley, Pete; Love, Jeffrey J.

    2017-01-01

    Extreme space weather events are low-frequency, high-risk phenomena. Estimating their rates of occurrence, as well as their associated uncertainties, is difficult. In this study, we derive statistical estimates and uncertainties for the occurrence rate of an extreme geomagnetic storm on the scale of the Carrington event (or worse) occurring within the next decade. We model the distribution of events as either a power law or lognormal distribution and use (1) the Kolmogorov-Smirnov statistic to estimate goodness of fit, (2) bootstrapping to quantify the uncertainty in the estimates, and (3) likelihood ratio tests to assess whether one distribution is preferred over another. Our best estimate for the probability of another extreme geomagnetic event comparable to the Carrington event occurring within the next 10 years is 10.3% (95% confidence interval (CI) [0.9, 18.7]) for a power law distribution but only 3.0% (95% CI [0.6, 9.0]) for a lognormal distribution. However, our results depend crucially on (1) how we define an extreme event, (2) the statistical model used to describe how the events are distributed in intensity, (3) the techniques used to infer the model parameters, and (4) the data and duration used for the analysis. We test a major assumption that the data represent time stationary processes and discuss the implications. If the current trends persist, suggesting that we are entering a period of lower activity, our forecasts may represent upper limits rather than best estimates.
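The bootstrap part of such a pipeline can be sketched as follows for the power-law case, using a Hill-type maximum-likelihood fit for the exponent and a Poisson occurrence model for the decadal probability. The synthetic storm magnitudes, the 100 nT threshold, and the "Carrington-scale" cutoff are illustrative assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic |Dst|-like storm magnitudes above a 100 nT threshold, drawn from
# a power law purely for illustration (exponent, counts, and years made up)
xmin, years, n_events = 100.0, 50.0, 300
data = xmin * (1.0 - rng.random(n_events)) ** (-1.0 / 3.0)   # alpha = 4

def prob_extreme(sample, x_extreme=850.0, horizon=10.0):
    # Hill MLE for the power-law exponent, then Poisson occurrence probability
    alpha = 1.0 + len(sample) / np.sum(np.log(sample / xmin))
    tail = (x_extreme / xmin) ** (1.0 - alpha)   # P(X >= x_extreme | X >= xmin)
    rate = len(sample) / years * tail            # extreme events per year
    return 1.0 - np.exp(-rate * horizon)

p_hat = prob_extreme(data)                       # point estimate for the decade
boot = [prob_extreme(rng.choice(data, size=len(data))) for _ in range(1000)]
lo, hi = np.percentile(boot, [2.5, 97.5])        # bootstrap 95% CI
```

Resampling the catalog with replacement and refitting propagates the sampling uncertainty of the exponent straight into the decadal probability, which is why the reported intervals are so much wider than the point estimates.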

  10. Constraining global dry deposition of ozone: observations and modeling

    NASA Astrophysics Data System (ADS)

    Silva, S. J.; Heald, C. L.

    2016-12-01

    Ozone loss through dry deposition to vegetation is a critically important process for both air quality and ecosystem health. Current estimates are that nearly 25% of all surface ozone is destroyed through dry deposition, and billions of dollars are lost annually due to losses of ecosystem services and agricultural yield associated with ozone damage. However, there are still substantial uncertainties regarding the spatial distribution and magnitude of the global depositional flux. As land cover change continues throughout this century, dry deposition of ozone will change in ways that are yet still poorly understood. Nearly every major atmospheric chemistry model uses a variation of the "resistor in series" parameterization for the calculation of dry deposition. By far the most commonly implemented parameterization is of the form presented in Wesely (1989), and is dependent on many variables, including land-type lookup tables, solar radiation, leaf area index, temperature, and more. The uncertainties contained within the various parts of this parameterization have to date not been fully explored. A lack of understanding of these uncertainties, coupled with a dearth of routine measurements of ozone deposition, ultimately challenges our ability to understand the impacts of land cover change on surface ozone. In this work, we use a suite of globally-distributed observations from the past two decades and the GEOS-Chem chemical transport model to constrain global dry deposition, improve our understanding of these uncertainties, and contextualize the impact of land cover change on ozone concentrations.
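The resistor-in-series idea reduces to vd = 1/(Ra + Rb + Rc), so uncertainty in any one resistance maps directly onto the deposition velocity. The resistance values and the assumed lognormal spread on the surface resistance below are illustrative stand-ins, not Wesely (1989) table values.

```python
import numpy as np

def deposition_velocity(ra, rb, rc):
    """Resistance-in-series dry deposition velocity, vd = 1/(Ra+Rb+Rc), in m/s."""
    return 1.0 / (ra + rb + rc)

# Illustrative daytime values (s/m); real schemes derive the aerodynamic,
# quasi-laminar, and surface resistances from meteorology, leaf area index,
# and land-type lookup tables
ra, rb, rc = 30.0, 20.0, 100.0
vd = deposition_velocity(ra, rb, rc)                 # 1/150 m/s, ~0.67 cm/s

# Propagate an assumed lognormal uncertainty on the surface resistance Rc
rng = np.random.default_rng(9)
rc_samples = rc * rng.lognormal(0.0, 0.3, 10_000)
vd_samples = deposition_velocity(ra, rb, rc_samples)
vd_16, vd_84 = np.percentile(vd_samples, [16, 84])   # 1-sigma-like spread
```

Because the resistances add in series, the largest resistance dominates both the flux and its uncertainty, which is why the surface (canopy) resistance term usually receives the most scrutiny.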

  11. The Contribution of Soils to North America's Current and Future Climate

    NASA Astrophysics Data System (ADS)

    Mayes, M. A.; Reed, S.; Thornton, P. E.; Lajtha, K.; Bailey, V. L.; Shrestha, G.; Jastrow, J. D.; Torn, M. S.

    2015-12-01

    This presentation will cover key aspects of the terrestrial soil carbon cycle in North America and the US for the upcoming State of the Carbon Cycle Report (SOCCRII). SOCCRII seeks to summarize how natural processes and human interactions affect the global carbon cycle, how socio-economic trends affect greenhouse gas concentrations in the atmosphere, and how ecosystems are influenced by and respond to greenhouse gas emissions, management decisions, and concomitant climate effects. Here, we will summarize the contemporary understanding of carbon stocks, fluxes, and drivers in the soil ecosystem compartment. We will highlight recent advances in modeling the magnitude of soil carbon stocks and fluxes, as well as the importance of remaining uncertainties in predicting soil carbon cycling and its relationship with climate. Attention will be given to the role of uncertainties in predicting future fluxes from soils, and how those uncertainties vary by region and ecosystem. We will also address how climate feedbacks and management decisions can enhance or minimize future climatic effects based on current understanding and observations, and will highlight select research needs to improve our understanding of the balance of carbon in soils in North America.

  12. Modelling of plasma-based dry reforming: how do uncertainties in the input data affect the calculation results?

    NASA Astrophysics Data System (ADS)

    Wang, Weizong; Berthelot, Antonin; Zhang, Quanzhi; Bogaerts, Annemie

    2018-05-01

    One of the main issues in plasma chemistry modeling is that the cross sections and rate coefficients are subject to uncertainties, which yield uncertainties in the modeling results and hence hinder the models' predictive capability. In this paper, we reveal the impact of these uncertainties on the model predictions of plasma-based dry reforming in a dielectric barrier discharge. For this purpose, we performed a detailed uncertainty analysis and sensitivity study. We used 2000 different combinations of rate coefficients, sampled from log-normal distributions representing their uncertainties, to quantify the uncertainties in the model output. The uncertainties in the electron density and electron temperature are around 11% and 8% at the maximum of the power deposition for a 70% confidence level. Still, this can have a major effect on the electron impact rates and hence on the calculated conversions of CO2 and CH4, as well as on the selectivities of CO and H2. For the CO2 and CH4 conversion, we obtain uncertainties of 24% and 33%, respectively. For the CO and H2 selectivity, the corresponding uncertainties are 28% and 14%, respectively. We also identify which reactions contribute most to the uncertainty in the model predictions. To improve the accuracy and reliability of plasma chemistry models, we recommend using only verified rate coefficients, and we point out the need for dedicated verification experiments.
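    The sampling step described above can be sketched as follows. The nominal rate coefficients and the factor-of-2 uncertainty are hypothetical; in the actual study each of the 2000 sets is propagated through the full plasma chemistry model:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_rate_coefficients(k_nominal, uncertainty_factor, n=2000):
    """Draw n sets of rate coefficients from log-normal distributions.

    A multiplicative uncertainty factor f means roughly 68% of the draws
    fall within [k/f, k*f]; the underlying normal has sigma = ln(f).
    """
    k_nominal = np.asarray(k_nominal, dtype=float)
    sigma = np.log(uncertainty_factor)
    # One independent multiplier per reaction per sample
    multipliers = rng.lognormal(mean=0.0, sigma=sigma,
                                size=(n, k_nominal.size))
    return k_nominal * multipliers

# Two hypothetical reactions, each with a factor-of-2 uncertainty:
samples = sample_rate_coefficients([1e-11, 3e-10], uncertainty_factor=2.0)
# A 70% confidence interval on any model output is then the 15th-85th
# percentile range across the 2000 model runs.
```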

  13. Comparison of statistical and theoretical habitat models for conservation planning: the benefit of ensemble prediction

    USGS Publications Warehouse

    Jones-Farrand, D. Todd; Fearer, Todd M.; Thogmartin, Wayne E.; Thompson, Frank R.; Nelson, Mark D.; Tirpak, John M.

    2011-01-01

    Selection of a modeling approach is an important step in the conservation planning process, but little guidance is available. We compared two statistical and three theoretical habitat modeling approaches representing those currently being used for avian conservation planning at landscape and regional scales: hierarchical spatial count (HSC), classification and regression tree (CRT), habitat suitability index (HSI), forest structure database (FS), and habitat association database (HA). We focused our comparison on models for five priority forest-breeding species in the Central Hardwoods Bird Conservation Region: Acadian Flycatcher, Cerulean Warbler, Prairie Warbler, Red-headed Woodpecker, and Worm-eating Warbler. Because we lacked complete knowledge of the distribution and abundance of each species, which would have illuminated differences between approaches and provided strong grounds for recommending one over another, we compared the models in two ways: rank correlations among model outputs and comparison of spatial correspondence. In general, rank correlations were significantly positive among models for each species, indicating general agreement among the models. Worm-eating Warblers had the highest pairwise correlations, all of which were significant (P < 0.05). Red-headed Woodpeckers had the lowest agreement among models, suggesting greater uncertainty in the relative conservation value of areas within the region. We assessed model uncertainty by mapping the spatial congruence in priorities (i.e., top ranks) resulting from each model for each species and calculating the coefficient of variation across model ranks for each location. This allowed identification of areas that are more likely to be good targets of conservation effort for a species, areas that are least likely, and areas in between where uncertainty is higher and thus conservation action incorporates more risk.
Based on our results, models developed independently for the same purpose (conservation planning for a particular species in a particular geography) yield different answers and thus different conservation strategies. We assert that using only one habitat model (even if validated) as the foundation of a conservation plan is risky. Using multiple models (i.e., ensemble prediction) can reduce uncertainty and increase efficacy of conservation action when models corroborate one another and increase understanding of the system when they do not.
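    The two comparison steps above, rank correlation among model outputs and a per-location coefficient of variation across model ranks, can be sketched as follows. The priority scores and the five locations are invented for illustration, and ties are ignored for simplicity:

```python
import numpy as np

def ranks(x):
    """1-based ranks of the values in x (assumes no ties, as in this toy data)."""
    order = np.argsort(x)
    r = np.empty_like(order)
    r[order] = np.arange(1, len(x) + 1)
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    return np.corrcoef(ranks(x), ranks(y))[0, 1]

# Hypothetical priority scores from two habitat models at five locations
hsc = np.array([0.9, 0.4, 0.7, 0.1, 0.5])
hsi = np.array([0.8, 0.5, 0.6, 0.2, 0.4])
rho = spearman(hsc, hsi)  # close to 1 when the models rank areas alike

# Ensemble uncertainty: coefficient of variation of ranks per location.
# A high CV flags locations where the models disagree about priority.
all_ranks = np.vstack([ranks(hsc), ranks(hsi)]).astype(float)
cv = all_ranks.std(axis=0) / all_ranks.mean(axis=0)
```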

  14. Estimation and impact assessment of input and parameter uncertainty in predicting groundwater flow with a fully distributed model

    NASA Astrophysics Data System (ADS)

    Touhidul Mustafa, Syed Md.; Nossent, Jiri; Ghysels, Gert; Huysmans, Marijke

    2017-04-01

    Transient numerical groundwater flow models have been used to understand and forecast groundwater flow systems under anthropogenic and climatic effects, but the reliability of the predictions is strongly influenced by different sources of uncertainty. Hence, researchers in hydrological sciences are developing and applying methods for uncertainty quantification. Nevertheless, spatially distributed flow models pose significant challenges for parameter and spatially distributed input estimation and uncertainty quantification. In this study, we present a general and flexible approach for input and parameter estimation and uncertainty analysis of groundwater models. The proposed approach combines a fully distributed groundwater flow model (MODFLOW) with the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm. To avoid over-parameterization, the uncertainty of the spatially distributed model input is represented by multipliers. The posterior distributions of these multipliers and the regular model parameters were estimated using DREAM. The proposed methodology has been applied to an overexploited aquifer in Bangladesh where groundwater pumping and recharge data are highly uncertain. The results confirm that input uncertainty has a considerable effect on the model predictions and parameter distributions. Additionally, our approach provides a new way to optimize the spatially distributed recharge and pumping data along with the parameter values under uncertain input conditions. We conclude that considering model input uncertainty along with parameter uncertainty is important for obtaining realistic model predictions and a correct estimation of the uncertainty bounds.
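    The multiplier idea can be illustrated with a deliberately reduced one-parameter version: a single recharge multiplier scales a known model response, and its posterior is sampled with a plain Metropolis random walk. All numbers are hypothetical, the one-line "model" stands in for MODFLOW, and DREAM itself uses multiple adaptive chains rather than this simple sampler:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "groundwater model": simulated head scales linearly with a
# recharge multiplier applied to the nominal recharge field.
base_response = 10.0   # head (m) for multiplier = 1 (hypothetical)
observed_head = 12.0   # observed head (hypothetical)
obs_sigma = 0.5        # observation error (m)

def log_posterior(m):
    """Gaussian likelihood; flat prior restricted to positive multipliers."""
    if m <= 0:
        return -np.inf
    return -0.5 * ((observed_head - m * base_response) / obs_sigma) ** 2

# Plain Metropolis random walk (a stand-in for DREAM's adaptive sampler)
chain = [1.0]
for _ in range(5000):
    proposal = chain[-1] + rng.normal(0.0, 0.1)
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(chain[-1]):
        chain.append(proposal)
    else:
        chain.append(chain[-1])

posterior = np.array(chain[1000:])  # discard burn-in
# The posterior multiplier concentrates near observed/base = 1.2,
# i.e. the data suggest the nominal recharge is ~20% too low.
```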

  15. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments, necessary to correctly understand the strength and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how to best use uncertainty analysis to generate a more realistic understanding of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters, and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
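    The contrast between a conservative screening tier and a refined probabilistic tier can be sketched as follows. All intake, concentration, and body-weight numbers are invented for illustration and do not come from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

body_weight_kg = 60.0

# Tier 1 screening: conservative point estimate combining a high-percentile
# intake with the maximum observed concentration (hypothetical values).
p95_intake_g_per_day = 250.0
max_concentration_mg_per_g = 0.02
screening_exposure = (p95_intake_g_per_day * max_concentration_mg_per_g
                      / body_weight_kg)  # mg/kg bw/day

# Refined tier: Monte Carlo over full intake and concentration distributions
intake = rng.lognormal(np.log(120.0), 0.4, size=100_000)   # g/day
conc = rng.uniform(0.001, 0.02, size=100_000)              # mg/g
exposure = intake * conc / body_weight_kg
p95_refined = np.percentile(exposure, 95)
# The probabilistic P95 falls well below the screening estimate, because
# the screening tier stacks worst-case assumptions multiplicatively.
```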

  16. Fatigue damage prognosis of internal delamination in composite plates under cyclic compression loadings using affine arithmetic as uncertainty propagation tool

    NASA Astrophysics Data System (ADS)

    Gbaguidi, Audrey J.-M.

    Structural health monitoring (SHM) has become indispensable for reducing maintenance costs and increasing the in-service capacity of a structure. The increased use of lightweight composite materials in aircraft structures has drastically increased the effects of fatigue-induced damage on their critical structural components, and thus the necessity to predict the remaining life of those components. Damage prognosis, one of the least investigated fields in SHM, uses the current damage state of the system to forecast its future performance by estimating the expected loading environments. A successful damage prediction model requires the integration of technologies in areas such as measurements, materials science, mechanics of materials, and probability theory, but most importantly the quantification of uncertainty in all these areas. In this study, Affine Arithmetic is used as a method for incorporating the uncertainties due to the material properties into the fatigue life prognosis of composite plates subjected to cyclic compressive loadings. When loadings are compressive in nature, the composite plates undergo repeated buckling-unloading of the delaminated layer, which induces mixed-mode I and II states of stress at the tip of the delamination in the plates. The Kardomateas model-based prediction law is used to predict the growth of the delamination, while the effects of the uncertainties in the mode I and mode II coefficients are integrated into the fatigue life prediction model using Affine Arithmetic. The mode I and mode II interlaminar fracture toughness and fatigue behavior of the composite plates are first experimentally characterized to obtain the material coefficients and fracture toughness, respectively. Next, these coefficients are used in the Kardomateas law to predict the delamination lengths in the composite plates, while Affine Arithmetic handles their uncertainties.
Finally, the fatigue behavior of the composite plates under compressive-buckling loadings is experimentally characterized, and the measured delamination lengths are compared with the predicted values to assess the performance of Affine Arithmetic as an uncertainty propagation tool.
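    A minimal sketch of the affine-arithmetic idea (addition and subtraction only; multiplication requires an extra approximation term). The class and the toughness interval are illustrative, not the dissertation's implementation:

```python
class Affine:
    """Affine form x0 + sum(x_i * eps_i), with each noise symbol eps_i in [-1, 1].

    Unlike plain interval arithmetic, an affine form remembers which noise
    symbol each term came from, so correlated quantities partially cancel
    instead of inflating the bounds.
    """
    _next_id = [0]  # shared counter for fresh noise symbols

    def __init__(self, center, terms=None):
        self.center = center
        self.terms = dict(terms or {})

    @classmethod
    def from_interval(cls, lo, hi):
        i = cls._next_id[0]
        cls._next_id[0] += 1
        return cls((lo + hi) / 2.0, {i: (hi - lo) / 2.0})

    def __add__(self, other):
        terms = dict(self.terms)
        for i, c in other.terms.items():
            terms[i] = terms.get(i, 0.0) + c
        return Affine(self.center + other.center, terms)

    def __sub__(self, other):
        neg = Affine(-other.center, {i: -c for i, c in other.terms.items()})
        return self + neg

    def interval(self):
        radius = sum(abs(c) for c in self.terms.values())
        return (self.center - radius, self.center + radius)

# Hypothetical fracture-toughness coefficient known to within +/-1:
x = Affine.from_interval(9.0, 11.0)
print((x + x).interval())  # width doubles: (18.0, 22.0)
print((x - x).interval())  # collapses to (0.0, 0.0); intervals would give (-2, 2)
```

    The cancellation in `x - x` is the property that makes affine arithmetic attractive for propagating correlated material-property uncertainties through a growth law.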

  17. A probabilistic approach to emissions from transportation sector in the coming decades

    NASA Astrophysics Data System (ADS)

    Yan, F.; Winijkul, E.; Bond, T. C.; Streets, D. G.

    2010-12-01

    Future emission estimates are necessary for understanding climate change, designing national and international strategies for air quality control, and evaluating mitigation policies. Emission inventories are uncertain and future projections even more so. Most current emission projection models are deterministic; in other words, there is only a single answer for each scenario. As a result, uncertainties have not been included in the estimation of climate forcing or other environmental effects, but it is important to quantify the uncertainty inherent in emission projections. We explore uncertainties of emission projections from the transportation sector in the coming decades through sensitivity analysis and Monte Carlo simulations. These projections are based on a technology-driven model, the Speciated Pollutants Emission Wizard (SPEW)-Trend, which responds to socioeconomic conditions in different economic and mitigation scenarios. The model contains detail about technology stock, including consumption growth rates, retirement rates, timing of emission standards, deterioration rates, and transition rates from normal vehicles to vehicles with extremely high emission factors (termed “superemitters”). However, understanding of these parameters, as well as their relationships with socioeconomic conditions, is uncertain. We project emissions from the transportation sector under four different IPCC scenarios (A1B, A2, B1, and B2). Due to the later implementation of advanced emission standards, Africa has the highest annual growth rate (1.2-3.1%) from 2010 to 2050. Superemitters begin producing more than 50% of global emissions around the year 2020. We estimate uncertainties from the relationships between technological change and socioeconomic conditions and examine their impact on future emissions. Sensitivities to parameters governing retirement rates are highest, causing changes in global emissions from -26% to +55% on average from 2010 to 2050.
We also perform Monte Carlo simulations to examine how these uncertainties affect total emissions when every uncertain input parameter is simultaneously replaced by a probability distribution of values; the resulting 95% confidence interval of the global emission annual growth rate is -1.9% to +0.2% per year.
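    The dominance of the retirement-rate parameter can be illustrated with a deliberately simplified fleet-turnover model. The growth rate, emission factors, and the assumed uncertainty on the retirement rate are invented numbers, not SPEW-Trend values:

```python
import numpy as np

rng = np.random.default_rng(3)

def emissions_2050(retirement_rate, years=40):
    """Toy fleet model: total stock grows 3%/yr; the surviving fraction of
    old (high-emitting) vehicles shrinks with the retirement rate, and
    retired vehicles are replaced by cleaner ones."""
    stock = 100.0 * 1.03 ** years                 # arbitrary units
    old_fraction = (1.0 - retirement_rate) ** years
    ef_old, ef_new = 1.0, 0.2                     # illustrative emission factors
    return stock * (old_fraction * ef_old + (1.0 - old_fraction) * ef_new)

# Monte Carlo over an uncertain retirement rate, e.g. 5% +/- 2% per year:
rates = np.clip(rng.normal(0.05, 0.02, size=10_000), 0.01, 0.15)
results = np.array([emissions_2050(r) for r in rates])
lo, hi = np.percentile(results, [2.5, 97.5])  # 95% interval on 2050 emissions
```

    Even this toy version shows why retirement rates dominate the sensitivity: the rate is compounded over four decades, so a small annual change moves the surviving high-emitting fraction substantially.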

  18. An Approach to Experimental Design for the Computer Analysis of Complex Phenomenon

    NASA Technical Reports Server (NTRS)

    Rutherford, Brian

    2000-01-01

    The ability to make credible system assessments, predictions, and design decisions related to engineered systems and other complex phenomena is key to a successful program for many large-scale investigations in government and industry. Recently, many of these large-scale analyses have turned to computational simulation to provide much of the required information. Addressing specific goals in the computer analysis of these complex phenomena is often accomplished through the use of performance measures that are based on system response models. The response models are constructed using computer-generated responses together with physical test results where possible. They are often based on probabilistically defined inputs and generally require estimation of a set of response modeling parameters. As a consequence, the performance measures are themselves distributed quantities reflecting these variabilities and uncertainties. Uncertainty in the values of the performance measures leads to uncertainties in predicted performance and can cloud the decisions required of the analysis. A specific goal of this research has been to develop methodology that will reduce this uncertainty in an analysis environment where limited resources and system complexity together restrict the number of simulations that can be performed. An approach has been developed that is based on evaluating the potential information provided by each "intelligently selected" candidate set of computer runs. Each candidate is evaluated by partitioning the performance measure uncertainty into two components: one component that could be explained through the additional computational simulation runs, and a second that would remain uncertain. The portion explained is estimated using a probabilistic evaluation of likely results for the additional computational analyses based on what is currently known about the system.
The set of runs indicating the largest potential reduction in uncertainty is then selected and the computational simulations are performed. Examples are provided to demonstrate this approach on small-scale problems, with encouraging results. Directions for further research are indicated.
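    The partitioning step, scoring each candidate set of runs by how much performance-measure variance it would explain, can be sketched with a toy two-input response model. The response function, inputs, and candidate names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy surrogate: the performance measure depends on two uncertain inputs.
# Each candidate set of runs would "pin down" one input; we pick the
# candidate whose expected explained variance is largest.
def performance(x1, x2):
    return 3.0 * x1 + 1.0 * x2  # hypothetical response model

x1 = rng.normal(0.0, 1.0, 50_000)
x2 = rng.normal(0.0, 1.0, 50_000)
total_var = performance(x1, x2).var()

# Variance that would remain if a candidate resolved x1 (only x2 left
# uncertain), and vice versa:
remaining = {"resolve_x1": performance(x1.mean(), x2).var(),
             "resolve_x2": performance(x1, x2.mean()).var()}
explained = {k: total_var - v for k, v in remaining.items()}
best = max(explained, key=explained.get)  # the run set to perform next
```

    Here resolving x1 explains roughly nine times as much variance as resolving x2 (coefficients 3 vs. 1), so the selection rule picks the x1 runs, which mirrors the evaluation-then-select loop described in the abstract.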

  19. Impact of input data uncertainty on environmental exposure assessment models: A case study for electromagnetic field modelling from mobile phone base stations.

    PubMed

    Beekhuizen, Johan; Heuvelink, Gerard B M; Huss, Anke; Bürgi, Alfred; Kromhout, Hans; Vermeulen, Roel

    2014-11-01

    With the increased availability of spatial data and computing power, spatial prediction approaches have become a standard tool for exposure assessment in environmental epidemiology. However, such models are largely dependent on accurate input data. Uncertainties in the input data can therefore have a large effect on model predictions, but are rarely quantified. Using Monte Carlo simulation, we assessed the effect of input uncertainty on the prediction of radio-frequency electromagnetic fields (RF-EMF) from mobile phone base stations at 252 receptor sites in Amsterdam, The Netherlands. The impact on ranking and classification was determined by computing the Spearman correlations and weighted Cohen's kappas (based on tertiles of the RF-EMF exposure distribution) between modelled values and RF-EMF measurements performed at the receptor sites. The uncertainty in modelled RF-EMF levels was large, with a median coefficient of variation of 1.5. Uncertainty in receptor site height, building damping, and building height contributed most to model output uncertainty. For exposure ranking and classification, the heights of buildings and receptor sites were the most important sources of uncertainty, followed by building damping and antenna and site location. Uncertainty in antenna power, tilt, height, and direction had a smaller impact on model performance. We quantified the effect of input data uncertainty on the prediction accuracy of an RF-EMF environmental exposure model, thereby identifying the most important sources of uncertainty and estimating the total uncertainty stemming from potential errors in the input data. This approach can be used to optimize the model and better interpret model output. Copyright © 2014 Elsevier Inc. All rights reserved.
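    The tertile classification and linearly weighted Cohen's kappa used above can be sketched as follows. The modelled and measured values are invented, and ties at the tertile cut-points are ignored for simplicity:

```python
import numpy as np

def tertile_class(x):
    """Assign class 0/1/2 by the tertiles of the values themselves."""
    t1, t2 = np.percentile(x, [100 / 3, 200 / 3])
    return np.digitize(x, [t1, t2])

def weighted_kappa(a, b, k=3):
    """Linearly weighted Cohen's kappa for k ordered classes."""
    obs = np.zeros((k, k))
    for i, j in zip(a, b):
        obs[i, j] += 1
    obs /= obs.sum()
    # Linear agreement weights: 1 on the diagonal, 0 at maximum disagreement
    w = 1 - np.abs(np.subtract.outer(np.arange(k), np.arange(k))) / (k - 1)
    expected = np.outer(obs.sum(axis=1), obs.sum(axis=0))
    return (np.sum(w * obs) - np.sum(w * expected)) / (1 - np.sum(w * expected))

# Hypothetical modelled vs measured exposures at nine receptor sites
modelled = np.array([0.1, 0.5, 0.9, 0.3, 0.7, 0.2, 0.8, 0.6, 0.4])
measured = np.array([0.2, 0.7, 0.8, 0.3, 0.9, 0.1, 0.4, 0.5, 0.6])
kappa = weighted_kappa(tertile_class(modelled), tertile_class(measured))
```

    In the paper, repeating this comparison over Monte Carlo perturbations of each input shows which input's uncertainty degrades the tertile classification most.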

  20. Future directions for LDEF ionizing radiation modeling and assessments

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    1993-01-01

    A calculational program utilizing data from radiation dosimetry measurements aboard the Long Duration Exposure Facility (LDEF) satellite to reduce the uncertainties in current models defining the ionizing radiation environment is in progress. Most of the effort to date has been on using LDEF radiation dose measurements to evaluate models defining the geomagnetically trapped radiation, which has provided results applicable to radiation design assessments being performed for Space Station Freedom. Plans for future data comparisons, model evaluations, and assessments using additional LDEF data sets (LET spectra, induced radioactivity, and particle spectra) are discussed.
