Sample records for considerable uncertainty due

  1. Averaging business cycles vs. myopia: Do we need a long term vision when developing IRP?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonald, C.; Gupta, P.C.

    1995-05-01

    Utility demand forecasting is inherently imprecise due to the number of uncertainties resulting from business cycles, policy making, technology breakthroughs, national and international political upheavals, and the limitations of the forecasting tools. This implies that revisions based primarily on recent experience could lead to unstable forecasts. Moreover, new planning tools are required that provide an explicit consideration of uncertainty and lead to flexible and robust planning decisions.
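
    As an illustration of the abstract's point, a minimal sketch (with invented demand data) compares how a myopic short-window forecast and a long-window forecast that averages over business cycles differ in revision stability:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy demand series: trend plus a "business cycle" and noise (illustrative only).
    years = np.arange(40)
    demand = 100 + 1.5 * years + 8 * np.sin(2 * np.pi * years / 8) + rng.normal(0, 3, 40)

    def rolling_forecast(series, window):
        """Forecast next year's demand as last value plus mean growth over `window` years."""
        forecasts = []
        for t in range(window, len(series)):
            growth = np.diff(series[t - window:t]).mean()
            forecasts.append(series[t - 1] + growth)
        return np.array(forecasts)

    short = rolling_forecast(demand, 3)   # myopic: reacts to recent experience
    long_ = rolling_forecast(demand, 16)  # averages over two full cycles

    # Instability of the forecast = volatility of year-to-year revisions.
    print("revision std, 3-yr window :", np.diff(short).std())
    print("revision std, 16-yr window:", np.diff(long_).std())
    ```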

  2. Uncertainty in BMP evaluation and optimization for watershed management

    NASA Astrophysics Data System (ADS)

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    Use of computer simulation models has increased substantially to make watershed management decisions and to develop strategies for water quality improvements. These models are often used to evaluate potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models in optimizing selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One of the limitations of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. measured precipitation, streamflow, sediment, nutrient and pesticide losses, land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management. We have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We used a watershed model (the Soil and Water Assessment Tool, or SWAT) to simulate the hydrology and water quality of a mixed land use watershed located in the Midwest USA, and to represent the various BMPs needed to improve water quality. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT-simulated crop yields. Considerable uncertainties in the net cost and the water quality improvements resulted from uncertainties in land use, climate change, and model parameter values.
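
    A minimal sketch of the kind of Monte Carlo propagation the abstract describes, using a toy pollutant-load model in place of SWAT (all numbers and distributions are illustrative assumptions):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000  # Monte Carlo realizations

    # Hypothetical stand-in for the watershed response: annual P load (kg) from
    # runoff (mm) x concentration (mg/L) over a fixed area. SWAT is a full
    # simulator; this toy model only illustrates the propagation step.
    runoff_mm = rng.normal(300, 45, n)               # uncertain (climate, parameters)
    conc_mg_l = rng.lognormal(np.log(0.25), 0.3, n)  # uncertain P concentration
    area_ha = 5_000
    baseline_kg = runoff_mm * 10 * area_ha * conc_mg_l / 1e3  # mm*ha -> m3; g -> kg

    # BMP removal efficiency is itself uncertain (assumed literature range 30-60%).
    bmp_eff = rng.uniform(0.30, 0.60, n)
    reduction_kg = baseline_kg * bmp_eff

    print(f"P load reduction, median: {np.median(reduction_kg):,.0f} kg/yr")
    lo, hi = np.percentile(reduction_kg, [5, 95])
    print(f"P load reduction, 90% interval: {lo:,.0f}-{hi:,.0f} kg/yr")
    ```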

  3. A bi-objective model for robust yard allocation scheduling for outbound containers

    NASA Astrophysics Data System (ADS)

    Liu, Changchun; Zhang, Canrong; Zheng, Li

    2017-01-01

    This article examines the yard allocation problem for outbound containers under uncertainty, mainly in the arrival and operation times of calling vessels. Based on a time-buffer insertion method, a bi-objective model is constructed to minimize the total operational cost and to maximize robustness against the uncertainty. Due to the NP-hardness of the constructed model, a two-stage heuristic is developed to solve the problem. In the first stage, initial solutions are obtained by a greedy algorithm that looks n steps ahead, with the uncertainty factors set to their respective expected values; in the second stage, starting from the first-stage solutions and with consideration of uncertainty factors, a neighbourhood search heuristic is employed to generate robust solutions that better withstand fluctuations in the uncertainty factors. Finally, extensive numerical experiments are conducted to test the performance of the proposed method.
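
    A compact sketch of the two-stage idea on hypothetical data (the paper's actual model, objectives, and neighbourhood moves are richer; distances, buffers, and the weighted-sum score below are invented for illustration):

    ```python
    import random
    random.seed(1)

    # Vessels are assigned to yard blocks; cost depends on block distance,
    # robustness on the time buffer between consecutive vessels in a block.
    vessels = [{"eta": random.uniform(0, 100), "work": random.uniform(4, 10)} for _ in range(12)]
    blocks = list(range(4))
    dist = {b: 1.0 + 0.5 * b for b in blocks}  # travel cost per vessel in block b

    def evaluate(assign):
        cost = sum(dist[b] for b in assign)
        # Robustness proxy: smallest slack between consecutive jobs sharing a block.
        slack = float("inf")
        for b in blocks:
            jobs = sorted((vessels[i]["eta"], vessels[i]["work"]) for i, a in enumerate(assign) if a == b)
            for (e1, w1), (e2, _) in zip(jobs, jobs[1:]):
                slack = min(slack, e2 - (e1 + w1))
        return cost, slack

    # Stage 1: greedy on expected values -- cheapest block keeping a non-negative buffer.
    assign = []
    for i in range(len(vessels)):
        best = min(blocks, key=lambda b: (evaluate(assign + [b])[1] < 0, dist[b]))
        assign.append(best)

    # Stage 2: neighbourhood search on a weighted sum of the two objectives.
    def score(a, w=0.3):
        cost, slack = evaluate(a)
        return cost - w * slack  # lower is better; w trades cost vs. robustness

    for _ in range(500):
        i, b = random.randrange(len(assign)), random.choice(blocks)
        cand = assign[:i] + [b] + assign[i + 1:]
        if score(cand) < score(assign):
            assign = cand

    print("final assignment:", assign, "cost/slack:", evaluate(assign))
    ```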

  4. Detectability limit and uncertainty considerations for laser induced fluorescence spectroscopy in flames

    NASA Technical Reports Server (NTRS)

    Daily, J. W.

    1978-01-01

    Laser induced fluorescence spectroscopy of flames is discussed, and derived uncertainty relations are used to calculate detectability limits due to statistical errors. Interferences due to Rayleigh scattering from molecules as well as Mie scattering and incandescence from particles have been examined for their effect on detectability limits. Fluorescence trapping is studied, and some methods for reducing the effect are considered. Fluorescence trapping places an upper limit on the number density of the fluorescing species that can be measured without signal loss.
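
    For the counting-statistics part, a small sketch of a Poisson detectability limit: the signal count N needed so that N/sqrt(N+B) reaches a target signal-to-noise ratio against a background B (numbers illustrative):

    ```python
    import numpy as np

    def min_signal_counts(background, snr_target=3.0):
        """Smallest Poisson signal count N with N / sqrt(N + B) >= snr_target.

        Solves the quadratic N^2 - snr^2 * N - snr^2 * B = 0 for N.
        """
        s2 = snr_target ** 2
        return 0.5 * (s2 + np.sqrt(s2 ** 2 + 4 * s2 * background))

    # Example: detectability against scattering backgrounds of different
    # strengths (counts per measurement interval; illustrative numbers).
    for B in (0, 100, 10_000):
        print(f"background {B:>6d} counts -> need >= {min_signal_counts(B):8.1f} signal counts")
    ```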

  5. The impact of land use on estimates of pesticide leaching potential: Assessments and uncertainties

    NASA Astrophysics Data System (ADS)

    Loague, Keith

    1991-11-01

    This paper illustrates the magnitude of uncertainty which can exist for pesticide leaching assessments, due to data uncertainties, both between soil orders and within a single soil order. The current work differs from previous efforts because the impact of uncertainty in recharge estimates is considered. The examples are for diuron leaching in the Pearl Harbor Basin. The results clearly indicate that land use has a significant impact on both estimates of pesticide leaching potential and the uncertainties associated with those estimates. It appears that the regulation of agricultural chemicals in the future should include consideration of changing land use.

  6. Uncertainties in selected river water quality data

    NASA Astrophysics Data System (ADS)

    Rode, M.; Suhr, U.

    2007-02-01

    Monitoring of surface waters is primarily done to detect the status and trends in water quality and to identify whether observed trends arise from natural or anthropogenic causes. The empirical quality of river water quality data is rarely certain, and knowledge of their uncertainties is essential to assess the reliability of water quality models and their predictions. The objective of this paper is to assess the uncertainties in selected river water quality data, i.e. suspended sediment, nitrogen fractions, phosphorus fractions, heavy metals and biological compounds. The methodology used to structure the uncertainty is based on the empirical quality of data and the sources of uncertainty in data (van Loon et al., 2005). A literature review was carried out, including additional experimental data of the Elbe river. All data of compounds associated with suspended particulate matter have considerably higher sampling uncertainties than soluble concentrations. This is due to high variability within the cross section of a given river. This variability is positively correlated with total suspended particulate matter concentrations. Sampling location also has a considerable effect on the representativeness of a water sample. These sampling uncertainties are highly site specific. The estimation of uncertainty in sampling can only be achieved by taking at least a proportion of samples in duplicate. Compared to sampling uncertainties, measurement and analytical uncertainties are much lower. Instrument quality can be considered well suited for field and laboratory situations for all considered constituents. Analytical errors can contribute considerably to the overall uncertainty of river water quality data. Temporal autocorrelation of river water quality data is present, but literature on the general behaviour of water quality compounds is rare. For meso scale river catchments (500-3000 km²), reasonable yearly dissolved load calculations can be achieved using biweekly sample frequencies. For suspended sediments, none of the methods investigated produced very reliable load estimates when weekly concentration data were used. Uncertainties associated with load estimates based on infrequent samples will decrease with increasing size of rivers.
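
    A minimal sketch of the biweekly load-estimation question on synthetic data, using a flow-weighted ratio estimator (one common load method; the paper compares several) with a bootstrap uncertainty band:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic year: daily flow (m3/s) and a "true" concentration (mg/L) that is
    # positively correlated with flow, as the abstract notes for particulates.
    q = rng.lognormal(np.log(50), 0.5, 365)
    c = 0.05 * q ** 0.8 * rng.lognormal(0, 0.2, 365)
    true_load = (c * q * 86.4).sum()  # kg/yr  (mg/L * m3/s * 86.4 = kg/day)

    # Biweekly sampling + flow-weighted ratio estimator.
    idx = np.arange(0, 365, 14)
    ratio = (c[idx] * q[idx]).sum() / q[idx].sum()   # flow-weighted mean concentration
    est_load = ratio * q.sum() * 86.4

    # Bootstrap the sampled days to put an uncertainty band on the estimate.
    boot = []
    for _ in range(2000):
        k = rng.choice(idx, size=idx.size, replace=True)
        boot.append((c[k] * q[k]).sum() / q[k].sum() * q.sum() * 86.4)
    lo, hi = np.percentile(boot, [2.5, 97.5])

    print(f"true {true_load:,.0f} kg, biweekly estimate {est_load:,.0f} kg "
          f"(95% CI {lo:,.0f}-{hi:,.0f})")
    ```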

  7. Uncertainties in selected surface water quality data

    NASA Astrophysics Data System (ADS)

    Rode, M.; Suhr, U.

    2006-09-01

    Monitoring of surface waters is primarily done to detect the status and trends in water quality and to identify whether observed trends arise from natural or anthropogenic causes. The empirical quality of surface water quality data is rarely certain, and knowledge of their uncertainties is essential to assess the reliability of water quality models and their predictions. The objective of this paper is to assess the uncertainties in selected surface water quality data, i.e. suspended sediment, nitrogen fractions, phosphorus fractions, heavy metals and biological compounds. The methodology used to structure the uncertainty is based on the empirical quality of data and the sources of uncertainty in data (van Loon et al., 2006). A literature review was carried out, including additional experimental data of the Elbe river. All data of compounds associated with suspended particulate matter have considerably higher sampling uncertainties than soluble concentrations. This is due to high variability within the cross section of a given river. This variability is positively correlated with total suspended particulate matter concentrations. Sampling location also has a considerable effect on the representativeness of a water sample. These sampling uncertainties are highly site specific. The estimation of uncertainty in sampling can only be achieved by taking at least a proportion of samples in duplicate. Compared to sampling uncertainties, measurement and analytical uncertainties are much lower. Instrument quality can be considered well suited for field and laboratory situations for all considered constituents. Analytical errors can contribute considerably to the overall uncertainty of surface water quality data. Temporal autocorrelation of surface water quality data is present, but literature on the general behaviour of water quality compounds is rare. For meso scale river catchments, reasonable yearly dissolved load calculations can be achieved using biweekly sample frequencies. For suspended sediments, none of the methods investigated produced very reliable load estimates when weekly concentration data were used. Uncertainties associated with load estimates based on infrequent samples will decrease with increasing size of rivers.

  8. Picosecond timing resolution detection of γ-photons utilizing microchannel-plate detectors: experimental tests of quantum nonlocality and photon localization

    NASA Astrophysics Data System (ADS)

    Irby, Victor D.

    2004-09-01

    The concept and subsequent experimental verification of the proportionality between pulse amplitude and detector transit time for microchannel-plate detectors is presented. This discovery has led to considerable improvement in the overall timing resolution for detection of high-energy γ-photons. Utilizing a ²²Na positron source, a full width half maximum (FWHM) timing resolution of 138 ps has been achieved. This FWHM includes detector transit-time spread for both chevron-stack-type detectors, timing spread due to uncertainties in annihilation location, all electronic uncertainty and any remaining quantum mechanical uncertainty. The first measurement of the minimum quantum uncertainty in the time interval between detection of the two annihilation photons is reported. The experimental results give strong evidence against instantaneous spatial localization of γ-photons due to measurement-induced nonlocal quantum wavefunction collapse. The experimental results are also the first that imply momentum is conserved only after the quantum uncertainty in time has elapsed (Yukawa H 1935 Proc. Phys. Math. Soc. Japan 17 48).
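
    Assuming independent, roughly Gaussian timing contributions, components combine in quadrature; the budget below is invented for illustration (only the 138 ps total is quoted in the abstract):

    ```python
    import math

    # Quadrature combination of independent Gaussian-like timing contributions.
    # Component values are purely illustrative, not the paper's actual budget.
    components_ps = {
        "detector transit-time spread (both detectors)": 100.0,
        "annihilation-location spread": 60.0,
        "electronics": 70.0,
    }
    total = math.sqrt(sum(v ** 2 for v in components_ps.values()))
    print(f"combined FWHM ~ {total:.0f} ps")
    ```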

  9. Bulk electric system reliability evaluation incorporating wind power and demand side management

    NASA Astrophysics Data System (ADS)

    Huang, Dange

    Electric power systems are experiencing dramatic changes with respect to structure, operation and regulation and are facing increasing pressure due to environmental and societal constraints. Bulk electric system reliability is an important consideration in power system planning, design and operation, particularly in the new competitive environment. A wide range of methods have been developed to perform bulk electric system reliability evaluation. Theoretically, sequential Monte Carlo simulation can include all aspects and contingencies in a power system and can be used to produce an informative set of reliability indices. It has become a practical and viable technique for large system reliability assessment due to the development of computing power, and it is used in the studies described in this thesis. The well-being approach used in this research provides the opportunity to integrate an accepted deterministic criterion into a probabilistic framework. This research work includes the investigation of important factors that impact bulk electric system adequacy evaluation and security constrained adequacy assessment using the well-being analysis framework. Load forecast uncertainty is an important consideration in an electrical power system. This research includes load forecast uncertainty considerations in bulk electric system reliability assessment, and the effects on system, load point and well-being indices and on reliability index probability distributions are examined. There has been increasing worldwide interest in the utilization of wind power as a renewable energy source over the last two decades due to enhanced public awareness of the environment. Increasing penetration of wind power has significant impacts on power system reliability, and security analyses become more uncertain due to the unpredictable nature of wind power. The effects of wind power additions in generating and bulk electric system reliability assessment considering site wind speed correlations and the interactive effects of wind power and load forecast uncertainty on system reliability are examined. The concept of the security cost associated with operating in the marginal state in the well-being framework is incorporated in the economic analyses associated with system expansion planning including wind power and load forecast uncertainty. Overall reliability cost/worth analyses including security cost concepts are applied to select an optimal wind power injection strategy in a bulk electric system. The effects of the various demand side management measures on system reliability are illustrated using the system, load point, and well-being indices, and the reliability index probability distributions. The reliability effects of demand side management procedures in a bulk electric system including wind power and load forecast uncertainty considerations are also investigated. The system reliability effects due to specific demand side management programs are quantified and examined in terms of their reliability benefits.
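
    A minimal sequential Monte Carlo sketch for generation adequacy (a toy system with identical two-state units and constant load, not the thesis test systems; all parameters assumed):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    n_units, cap = 12, 100.0          # units and MW per unit
    mttf, mttr = 2000.0, 100.0        # mean time to failure / repair, hours
    load = 900.0                      # constant load, MW (real studies use a load curve)

    def simulate_years(n_years=200, hours=8760):
        lole = 0.0
        for _ in range(n_years):
            down_hours = np.zeros(hours)
            for _ in range(n_units):
                t, up = 0.0, True
                state = np.ones(hours)
                while t < hours:
                    dur = rng.exponential(mttf if up else mttr)
                    if not up:
                        state[int(t):int(min(t + dur, hours))] = 0
                    t += dur
                    up = not up
                down_hours += 1 - state
            available = (n_units - down_hours) * cap
            lole += (available < load).sum()
        return lole / n_years  # loss-of-load expectation, h/yr

    print(f"LOLE ~ {simulate_years():.1f} h/yr")
    ```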

  10. Uncertainties in stormwater runoff data collection from a small urban catchment, Southeast China.

    PubMed

    Huang, Jinliang; Tu, Zhenshun; Du, Pengfei; Lin, Jie; Li, Qingsheng

    2010-01-01

    Monitoring data are often used to identify stormwater runoff characteristics and in stormwater runoff modelling without consideration of their inherent uncertainties. Integrating discrete sample analysis with error propagation analysis, this study attempted to quantify the uncertainties of discrete chemical oxygen demand (COD), total suspended solids (TSS) concentration, stormwater flowrate, stormwater event volumes, COD event mean concentration (EMC), and COD event loads in terms of flow measurement, sample collection, storage and laboratory analysis. The results showed that the uncertainties due to sample collection, storage and laboratory analysis of COD from stormwater runoff are 13.99%, 19.48% and 12.28%, respectively. Meanwhile, flow measurement uncertainty was 12.82%, and the sample collection uncertainty of TSS from stormwater runoff was 31.63%. Based on the law of propagation of uncertainties, the uncertainties regarding event flow volume, COD EMC and COD event loads were quantified as 7.03%, 10.26% and 18.47%, respectively.
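
    A sketch of the quoted law of propagation of uncertainties for a product such as event load = volume × EMC: relative uncertainties of independent factors add in quadrature. (The paper's own propagation includes further terms, so this does not reproduce its 18.47% figure.)

    ```python
    import math

    def combined_relative_uncertainty(*rel_uncertainties_percent):
        """Propagation law for a product/quotient of independent quantities:
        relative uncertainties add in quadrature."""
        return math.sqrt(sum(u ** 2 for u in rel_uncertainties_percent))

    # Combining the abstract's event-volume and EMC components, for illustration.
    u_volume, u_emc = 7.03, 10.26
    print(f"u(event load) ~ {combined_relative_uncertainty(u_volume, u_emc):.2f} %")
    ```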

  11. A drag-free Lo-Lo satellite system for improved gravity field measurements

    NASA Technical Reports Server (NTRS)

    Fischell, R. E.; Pisacane, V. L.

    1978-01-01

    At very low altitudes, the effect of atmospheric drag results in drastically reduced orbit lifetimes and considerable uncertainty in satellite motions. The concept suggested herein employs a DISturbance COmpensation System (DISCOS) on each of a pair of satellites at very low altitudes to provide refined measurements of the earth's gravitational field. The DISCOS maintains the satellites in orbit and essentially eliminates motion uncertainties, due mostly to drag and, to a lesser extent, to solar radiation pressure. By a closed-loop measurement of the relative range-rate between the two low satellites, one can determine the earth's gravitational field with considerably greater accuracy than could be obtained by tracking a single satellite.

  12. Uncertainty in temperature response of current consumption-based emissions estimates

    NASA Astrophysics Data System (ADS)

    Karstensen, J.; Peters, G. P.; Andrew, R. M.

    2014-09-01

    Several studies have connected emissions of greenhouse gases to economic and trade data to quantify the causal chain from consumption to emissions and climate change. These studies usually combine data and models originating from different sources, making it difficult to estimate uncertainties in the end results. We estimate uncertainties in economic data, multi-pollutant emission statistics and metric parameters, and use Monte Carlo analysis to quantify contributions to uncertainty and to determine how uncertainty propagates to estimates of global temperature change from regional and sectoral territorial- and consumption-based emissions for the year 2007. We find that the uncertainties are sensitive to the emission allocations, mix of pollutants included, the metric and its time horizon, and the level of aggregation of the results. Uncertainties in the final results are largely dominated by the climate sensitivity and the parameters associated with the warming effects of CO2. The economic data have a relatively small impact on uncertainty at the global and national level, while much higher uncertainties are found at the sectoral level. Our results suggest that consumption-based national emissions are not significantly more uncertain than the corresponding production-based emissions, since the largest uncertainties stem from the metric and the emissions, which affect both perspectives equally. The two perspectives exhibit different sectoral uncertainties, due to changes of pollutant compositions. We find global sectoral consumption uncertainties in the range of ±9 to ±27% using the global temperature potential (GTP) with a 50-year time horizon, with metric uncertainties dominating. National level uncertainties are similar in both perspectives due to the dominance of CO2 over other pollutants. The consumption emissions of the top 10 emitting regions have a broad uncertainty range of ±9 to ±25%, with metric and emissions uncertainties contributing similarly. The absolute global temperature potential (AGTP) with a 50-year time horizon has much higher uncertainties, with considerable uncertainty overlap for regions and sectors, indicating that the ranking of countries is uncertain.
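
    A minimal sketch of the Monte Carlo propagation step with an invented pollutant basket and metric values (the study uses full multi-regional emission and GTP datasets; every number below is illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    n = 50_000

    # Toy propagation: temperature effect T = sum_i E_i * GTP_i.
    pollutants = {
        #        emission (Tg), emis. rel. sd, GTP50 (K/Tg, fictional), metric rel. sd
        "CO2": (8000.0, 0.08, 1.0e-6, 0.25),
        "CH4": (300.0, 0.15, 1.4e-5, 0.40),
        "SO2": (100.0, 0.20, -4.0e-5, 0.60),
    }

    T = np.zeros(n)
    for e, e_sd, g, g_sd in pollutants.values():
        E = rng.normal(e, abs(e) * e_sd, n)   # emission uncertainty
        G = rng.normal(g, abs(g) * g_sd, n)   # metric uncertainty
        T += E * G

    mid = np.median(T)
    lo, hi = np.percentile(T, [16, 84])  # ~1-sigma band
    print(f"dT = {mid:.4f} K  (+{(hi - mid) / mid * 100:.0f}% / {(lo - mid) / mid * 100:.0f}%)")
    ```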

  13. 6th International Conference on Case Histories in Geotechnical Engineering, August 2008: conference report.

    DOT National Transportation Integrated Search

    2009-01-01

    Due to uncertainty in the nature of soils, a systematic study of the performance of geotechnical structures and its match with predictions is extremely important. Therefore, considerable research effort is being devoted to geotechnical engineering th...

  14. Damage assessment of composite plate structures with material and measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Chandrashekhar, M.; Ganguli, Ranjan

    2016-06-01

    Composite materials are very useful in structural engineering, particularly in weight-sensitive applications. Two different test models of the same structure made from composite materials can display very different dynamic behavior due to large uncertainties associated with composite material properties. Also, composite structures can suffer from pre-existing imperfections like delaminations, voids or cracks introduced during fabrication. In this paper, we show that modeling and material uncertainties in composite structures can cause considerable problems in damage assessment. A recently developed C0 shear deformable locking-free refined composite plate element is employed in the numerical simulations to alleviate modeling uncertainty. A qualitative estimate of the impact of modeling uncertainty on the damage detection problem is made. A robust Fuzzy Logic System (FLS) with a sliding window defuzzifier is used for delamination damage detection in composite plate type structures. The FLS is designed using variations in modal frequencies due to randomness in material properties. Probabilistic analysis is performed using Monte Carlo Simulation (MCS) on a composite plate finite element model. It is demonstrated that the FLS shows excellent robustness in delamination detection at very high levels of randomness in input data.

  15. Robust design optimization method for centrifugal impellers under surface roughness uncertainties due to blade fouling

    NASA Astrophysics Data System (ADS)

    Ju, Yaping; Zhang, Chuhua

    2016-03-01

    Blade fouling has been proved to be a great threat to compressor performance in the operating stage. Current research on fouling-induced performance degradation of centrifugal compressors is based mainly on simplified roughness models that do not take into account realistic factors such as the spatial non-uniformity and randomness of the fouling-induced surface roughness. Moreover, little attention has been paid to the robust design optimization of centrifugal compressor impellers with consideration of blade fouling. In this paper, a multi-objective robust design optimization method is developed for centrifugal impellers under surface roughness uncertainties due to blade fouling. A three-dimensional surface roughness map is proposed to describe the non-uniformity and randomness of realistic fouling accumulations on blades. To lower the computational cost of robust design optimization, the support vector regression (SVR) metamodel is combined with the Monte Carlo simulation (MCS) method to conduct the uncertainty analysis of fouled impeller performance. The results show that the critical fouled region associated with impeller performance degradation lies at the leading edge of the blade tip. The SVR metamodel has proved to be an efficient and accurate means of detecting impeller performance variations caused by roughness uncertainties. After design optimization, the robust optimal design is found to be more efficient and less sensitive to fouling uncertainties while maintaining good impeller performance in the clean condition. This research proposes a systematic design optimization method for centrifugal compressors with consideration of blade fouling, providing practical guidance for the design of advanced centrifugal compressors.
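
    A minimal sketch of the surrogate-plus-sampling workflow with a made-up stand-in for the CFD response (the paper trains the SVR on actual impeller simulations; the function, design points, and fouling distribution below are assumptions):

    ```python
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(5)

    # Step 1: "training" designs -- roughness levels at three blade regions vs. an
    # efficiency drop. A made-up smooth function stands in for expensive CFD runs.
    def cfd_efficiency_drop(x):  # x columns: [leading edge, mid-chord, trailing edge]
        return 0.8 * x[:, 0] ** 1.5 + 0.2 * x[:, 1] + 0.1 * x[:, 2]

    X_train = rng.uniform(0, 1, (60, 3))
    y_train = cfd_efficiency_drop(X_train)

    # Step 2: fit the SVR metamodel on the few expensive samples.
    surrogate = SVR(kernel="rbf", C=10.0, epsilon=0.001).fit(X_train, y_train)

    # Step 3: Monte Carlo through the cheap surrogate (10^5 fouling scenarios).
    X_mc = rng.beta(2, 5, (100_000, 3))   # skewed fouling distribution (assumed)
    drops = surrogate.predict(X_mc)
    print(f"efficiency drop: mean {drops.mean():.3f}, 95th pct {np.percentile(drops, 95):.3f}")
    ```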

  16. Proteomic analysis of a model fish species exposed to individual pesticides and a binary mixture

    EPA Science Inventory

    Aquatic organisms are often exposed to multiple pesticides simultaneously. Due to the relatively poor characterization of mixture constituent interactions and the potential for highly complex exposure scenarios, there is considerable uncertainty in understanding the toxicity of m...

  17. Uncertainty in temperature response of current consumption-based emissions estimates

    NASA Astrophysics Data System (ADS)

    Karstensen, J.; Peters, G. P.; Andrew, R. M.

    2015-05-01

    Several studies have connected emissions of greenhouse gases to economic and trade data to quantify the causal chain from consumption to emissions and climate change. These studies usually combine data and models originating from different sources, making it difficult to estimate uncertainties along the entire causal chain. We estimate uncertainties in economic data, multi-pollutant emission statistics, and metric parameters, and use Monte Carlo analysis to quantify contributions to uncertainty and to determine how uncertainty propagates to estimates of global temperature change from regional and sectoral territorial- and consumption-based emissions for the year 2007. We find that the uncertainties are sensitive to the emission allocations, mix of pollutants included, the metric and its time horizon, and the level of aggregation of the results. Uncertainties in the final results are largely dominated by the climate sensitivity and the parameters associated with the warming effects of CO2. Based on our assumptions, which exclude correlations in the economic data, the uncertainty in the economic data appears to have a relatively small impact on uncertainty at the national level in comparison to emissions and metric uncertainty. Much higher uncertainties are found at the sectoral level. Our results suggest that consumption-based national emissions are not significantly more uncertain than the corresponding production-based emissions, since the largest uncertainties stem from the metric and the emissions, which affect both perspectives equally. The two perspectives exhibit different sectoral uncertainties, due to changes of pollutant compositions. We find global sectoral consumption uncertainties in the range of ±10 to ±27% using the Global Temperature Potential with a 50-year time horizon, with metric uncertainties dominating. National-level uncertainties are similar in both perspectives due to the dominance of CO2 over other pollutants. The consumption emissions of the top 10 emitting regions have a broad uncertainty range of ±9 to ±25%, with metric and emission uncertainties contributing similarly. The absolute global temperature potential (AGTP) with a 50-year time horizon has much higher uncertainties, with considerable uncertainty overlap for regions and sectors, indicating that the ranking of countries is uncertain.

  18. Characterization of a neutron sensitive MCP/Timepix detector for quantitative image analysis at a pulsed neutron source

    NASA Astrophysics Data System (ADS)

    Watanabe, Kenichi; Minniti, Triestino; Kockelmann, Winfried; Dalgliesh, Robert; Burca, Genoveva; Tremsin, Anton S.

    2017-07-01

    The uncertainties and the stability of a neutron-sensitive MCP/Timepix detector operating in event-timing mode for quantitative image analysis at a pulsed neutron source were investigated. The dominant contribution to the uncertainty arises from the counting statistics. The contribution of the overlap correction to the uncertainty was concluded to be negligible, based on error propagation considerations, even when the pixel occupation probability exceeds 50%. We additionally took the multiple-counting effect into account when treating the counting statistics. Furthermore, the detection efficiency of this detector system changes under relatively high neutron fluxes due to ageing effects in the microchannel plates. Since this efficiency change is position-dependent, it induces a memory image. The memory effect can be significantly reduced with correction procedures using rate equations describing the permanent gain degradation and the scrubbing effect on the inner surfaces of the MCP pores.

  19. Cancer Risk Assessment for Space Radiation

    NASA Technical Reports Server (NTRS)

    Richmond, Robert C.; Cruz, Angela; Bors, Karen; Curreri, Peter A. (Technical Monitor)

    2001-01-01

    Predicting the occurrence of human cancer following exposure to any agent causing genetic damage is a difficult task. This is because the uncertainty of uniform exposure to the damaging agent, and the uncertainty of uniform processing of that damage within a complex set of biological variables, degrade the confidence of predicting the delayed expression of cancer as a relatively rare event within any given clinically normal individual. The radiation health research priorities for enabling long-duration human exploration of space were established in the 1996 NRC Report entitled 'Radiation Hazards to Crews of Interplanetary Missions: Biological Issues and Research Strategies'. This report emphasized that a 15-fold uncertainty in predicting radiation-induced cancer incidence must be reduced before NASA can commit humans to extended interplanetary missions. That report concluded that the great majority of this uncertainty is biologically based, while a minority is physically based due to uncertainties in radiation dosimetry and radiation transport codes. Since that report, the biologically based uncertainty has remained large, and the relatively small uncertainty associated with radiation dosimetry has increased due to the considerations raised by concepts of microdosimetry. In a practical sense, however, the additional uncertainties introduced by microdosimetry are encouraging since they are in a direction of lowered effective dose absorbed through infrequent interactions of any given cell with the high energy particle component of space radiation. Additional information is contained in the original extended abstract.

  20. Multi-objective calibration and uncertainty analysis of hydrologic models; A comparative study between formal and informal methods

    NASA Astrophysics Data System (ADS)

    Shafii, M.; Tolson, B.; Matott, L. S.

    2012-04-01

    Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in higher levels of complexity being built into hydrologic models, which eventually makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementation of multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically-based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on the use of formal likelihood functions based on statistical assumptions; moreover, Bayesian inference structured on MCMC samplers requires a very large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study aims at exploring a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality, which quantifies the parameter uncertainty using the Pareto solutions; (ii) DDS-AU, which uses the weighted sum of objective functions to derive the prediction limits; and (iii) GLUE, which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden and predictive capacity, which are evaluated based on multiple comparative measures. The measures for comparison are calculated both for calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model, called HYMOD.
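
    A minimal GLUE sketch on a toy one-parameter linear-reservoir model (HYMOD and the paper's likelihood choices are more involved): sample parameters, retain behavioral sets above an informal likelihood threshold, and derive prediction limits from the behavioral ensemble:

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    # Toy rainfall-runoff model: a single linear reservoir, q[t] = k * storage.
    rain = rng.gamma(0.6, 4.0, 200)

    def simulate(k, s0=10.0):
        s, q = s0, np.empty(rain.size)
        for t, p in enumerate(rain):
            s += p
            q[t] = k * s
            s -= q[t]
        return q

    q_obs = simulate(0.3) * rng.lognormal(0, 0.1, rain.size)  # synthetic "observations"

    def nse(sim, obs):
        return 1 - ((sim - obs) ** 2).sum() / ((obs - obs.mean()) ** 2).sum()

    # GLUE: uniform sampling, an informal likelihood (here NSE), a behavioral
    # threshold, and prediction limits from the behavioral ensemble.
    ks = rng.uniform(0.05, 0.9, 2000)
    sims = np.array([simulate(k) for k in ks])
    scores = np.array([nse(s, q_obs) for s in sims])
    behavioral = sims[scores > 0.6]

    lower, upper = np.percentile(behavioral, [5, 95], axis=0)
    coverage = ((q_obs >= lower) & (q_obs <= upper)).mean()
    print(f"{len(behavioral)} behavioral sets; 90% band covers {coverage:.0%} of observations")
    ```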

  21. Importance of anthropogenic climate impact, sampling error and urban development in sewer system design.

    PubMed

    Egger, C; Maurer, M

    2015-04-15

    Urban drainage design relying on observed precipitation series neglects the uncertainties associated with current and indeed future climate variability. Urban drainage design is further affected by the large stochastic variability of precipitation extremes and sampling errors arising from the short observation periods of extreme precipitation. Stochastic downscaling addresses anthropogenic climate impact by allowing relevant precipitation characteristics to be derived from local observations and an ensemble of climate models. This multi-climate model approach seeks to reflect the uncertainties in the data due to structural errors of the climate models. An ensemble of outcomes from stochastic downscaling allows for addressing the sampling uncertainty. These uncertainties are clearly reflected in the precipitation-runoff predictions of three urban drainage systems. They were mostly due to the sampling uncertainty. The contribution of climate model uncertainty was found to be of minor importance. Under the applied greenhouse gas emission scenario (A1B) and within the period 2036-2065, the potential for urban flooding in our Swiss case study is slightly reduced on average compared to the reference period 1981-2010. Scenario planning was applied to consider urban development associated with future socio-economic factors affecting urban drainage. The impact of scenario uncertainty was to a large extent found to be case-specific, thus emphasizing the need for scenario planning in every individual case. The results represent a valuable basis for discussions of new drainage design standards aiming specifically to include considerations of uncertainty. Copyright © 2015 Elsevier Ltd. All rights reserved.

  22. Post-processing of multi-hydrologic model simulations for improved streamflow projections

    NASA Astrophysics Data System (ADS)

    Khajehei, Sepideh; Ahmadalipour, Ali; Moradkhani, Hamid

    2016-04-01

    Hydrologic model outputs are prone to bias and uncertainty due to knowledge deficiencies in models and data. Uncertainty in hydroclimatic projections arises from uncertainty in the hydrologic model as well as from the epistemic or aleatory uncertainties in GCM parameterization and development. This study is conducted to: 1) evaluate the recently developed multi-variate post-processing method for historical simulations and 2) assess the effect of post-processing on uncertainty and reliability of future streamflow projections in both high-flow and low-flow conditions. The first objective is performed for the historical period of 1970-1999. Future streamflow projections are generated for 10 statistically downscaled GCMs from two widely used downscaling methods: Bias Corrected Statistically Downscaled (BCSD) and Multivariate Adaptive Constructed Analogs (MACA), over the period of 2010-2099 for two representative concentration pathways, RCP4.5 and RCP8.5. Three semi-distributed hydrologic models were employed and calibrated at 1/16 degree latitude-longitude resolution for over 100 points across the Columbia River Basin (CRB) in the Pacific Northwest, USA. Streamflow outputs are post-processed through a Bayesian framework based on copula functions. The post-processing approach relies on a transfer function developed from the bivariate joint distribution of observations and simulations in the historical period. Results show that application of the post-processing technique leads to considerably higher accuracy in historical simulations and also reduces model uncertainty in future streamflow projections.

  23. Sources of uncertainty as a basis to fill the information gap in a response to flood

    NASA Astrophysics Data System (ADS)

    Kekez, Toni; Knezic, Snjezana

    2016-04-01

    Taking into account uncertainties in flood risk management remains a challenge due to difficulties in choosing adequate structural and/or non-structural risk management options. Despite such measures, wrong decisions are often made when floods occur. Parameter and structural uncertainties, which include model and observation errors as well as lack of knowledge about system characteristics, are the main considerations. Real time flood risk assessment methods are predominantly based on measured water level values and on the vulnerability and other relevant characteristics of the flood-affected area. The goal of this research is to identify sources of uncertainty and to minimize the information gap between the point where the water level is measured and the affected area, taking into consideration the main uncertainties that can affect the risk value at the observed point or section of the river. Sources of uncertainty are identified and determined using a system analysis approach, and relevant uncertainties are included in the risk assessment model. With such a methodological approach it is possible to increase the time available for response through more effective risk assessment that includes an uncertainty propagation model. The response phase could be better planned with adequate early warning systems, resulting in more time and lower costs to help affected areas and save human lives. Reliable and precise information is necessary to raise the emergency operability level in order to enhance the safety of citizens and reduce possible damage. The results of the EPISECC (EU-funded FP7) project are used to validate potential benefits of this research in order to improve flood risk management and response methods. EPISECC aims at developing the concept of a common European Information Space for disaster response which, among other disasters, considers floods.

  24. Report from the Integrated Modeling Panel at the Workshop on the Science of Ignition on NIF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marinak, M; Lamb, D

    2012-07-03

    This section deals with multiphysics radiation hydrodynamics codes used to design and simulate targets in the ignition campaign. These topics encompass all the physical processes they model, and include consideration of any approximations necessary due to finite computer resources. The section focuses on what developments would have the highest impact on reducing uncertainties in modeling most relevant to experimental observations. It considers how the ICF codes should be employed in the ignition campaign. This includes a consideration of how the experiments can be best structured to test the physical models the codes employ.

  25. Resilient guaranteed cost control of a power system.

    PubMed

    Soliman, Hisham M; Soliman, Mostafa H; Hassan, Mohammad F

    2014-05-01

    With the development of power system interconnection, low-frequency oscillation is becoming more and more prominent, which may cause system separation and loss of energy to consumers. This paper presents an innovative robust control for power systems in which the operating conditions change continuously due to load changes. However, practical implementation of robust control can be fragile due to controller inaccuracies (tolerance of resistors used with operational amplifiers). A new design of resilient (non-fragile) robust control is given that takes into consideration both model and controller uncertainties by an iterative solution of a set of linear matrix inequalities (LMI). Both uncertainties are cast into a norm-bounded structure. A sufficient condition is derived to achieve the desired settling time for damping power system oscillations in the face of plant and controller uncertainties. Furthermore, an improved controller design, the resilient guaranteed cost controller, is derived to achieve oscillation damping in a guaranteed cost manner. The effectiveness of the algorithm is shown for a single machine infinite bus system, and it is then extended to a multi-area power system.

  26. Accounting for Uncertainty and Time Lags in Equivalency Calculations for Offsetting in Aquatic Resources Management Programs

    NASA Astrophysics Data System (ADS)

    Bradford, Michael J.

    2017-10-01

    Biodiversity offset programs attempt to minimize unavoidable environmental impacts of anthropogenic activities by requiring offsetting measures in sufficient quantity to counterbalance losses due to the activity. Multipliers, or offsetting ratios, have been used to increase the amount of offsets to account for uncertainty but those ratios have generally been derived from theoretical or ad-hoc considerations. I analyzed uncertainty in the offsetting process in the context of offsetting for impacts to freshwater fisheries productivity. For aquatic habitats I demonstrate that an empirical risk-based approach for evaluating prediction uncertainty is feasible, and if data are available appropriate adjustments to offset requirements can be estimated. For two data-rich examples I estimate multipliers in the range of 1.5:1 - 2.5:1 are sufficient to account for the uncertainty in the prediction of gains and losses. For aquatic habitats adjustments for time delays in the delivery of offset benefits can also be calculated and are likely smaller than those for prediction uncertainty. However, the success of a biodiversity offsetting program will also depend on the management of the other components of risk not addressed by these adjustments.
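
    One common convention for a time-lag adjustment discounts delayed benefits at a constant annual rate; the sketch below is illustrative and is not the paper's empirical derivation:

    ```python
    def lag_multiplier(delay_years, discount_rate=0.03):
        """Extra offset required when benefits start `delay_years` after the loss,
        assuming losses and benefits are discounted at a constant annual rate."""
        return (1 + discount_rate) ** delay_years

    for d in (0, 5, 10, 20):
        print(f"{d:>2} yr delay -> multiplier {lag_multiplier(d):.2f}")
    ```

    With a 3% rate, even a 10-year delay yields a multiplier of about 1.34, consistent with the abstract's observation that time-lag adjustments are likely smaller than those for prediction uncertainty (1.5:1 - 2.5:1).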

  27. Accounting for Uncertainty and Time Lags in Equivalency Calculations for Offsetting in Aquatic Resources Management Programs.

    PubMed

    Bradford, Michael J

    2017-10-01

    Biodiversity offset programs attempt to minimize unavoidable environmental impacts of anthropogenic activities by requiring offsetting measures in sufficient quantity to counterbalance losses due to the activity. Multipliers, or offsetting ratios, have been used to increase the amount of offsets to account for uncertainty but those ratios have generally been derived from theoretical or ad-hoc considerations. I analyzed uncertainty in the offsetting process in the context of offsetting for impacts to freshwater fisheries productivity. For aquatic habitats I demonstrate that an empirical risk-based approach for evaluating prediction uncertainty is feasible, and if data are available appropriate adjustments to offset requirements can be estimated. For two data-rich examples I estimate multipliers in the range of 1.5:1 - 2.5:1 are sufficient to account for the uncertainty in the prediction of gains and losses. For aquatic habitats adjustments for time delays in the delivery of offset benefits can also be calculated and are likely smaller than those for prediction uncertainty. However, the success of a biodiversity offsetting program will also depend on the management of the other components of risk not addressed by these adjustments.

  28. Reducing uncertainty in dust monitoring to detect aeolian sediment transport responses to land cover change

    NASA Astrophysics Data System (ADS)

    Webb, N.; Chappell, A.; Van Zee, J.; Toledo, D.; Duniway, M.; Billings, B.; Tedela, N.

    2017-12-01

    Anthropogenic land use and land cover change (LULCC) influence global rates of wind erosion and dust emission, yet our understanding of the magnitude of the responses remains poor. Field measurements and monitoring provide essential data to resolve aeolian sediment transport patterns and assess the impacts of human land use and management intensity. Data collected in the field are also required for dust model calibration and testing, as models have become the primary tool for assessing LULCC-dust cycle interactions. However, there is considerable uncertainty in estimates of dust emission due to the spatial variability of sediment transport. Field sampling designs are currently rudimentary and considerable opportunities are available to reduce the uncertainty. Establishing the minimum detectable change is critical for measuring spatial and temporal patterns of sediment transport, detecting potential impacts of LULCC and land management, and for quantifying the uncertainty of dust model estimates. Here, we evaluate the effectiveness of common sampling designs (e.g., simple random sampling, systematic sampling) used to measure and monitor aeolian sediment transport rates. Using data from the US National Wind Erosion Research Network across diverse rangeland and cropland cover types, we demonstrate how only large changes in sediment mass flux (of the order 200% to 800%) can be detected when small sample sizes are used, crude sampling designs are implemented, or when the spatial variation is large. We then show how statistical rigour and the straightforward application of a sampling design can reduce the uncertainty and detect change in sediment transport over time and between land use and land cover types.
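
    A sketch of a minimum-detectable-change calculation under a normal approximation (an assumed textbook formula; the paper's analysis is based on network field data):

    ```python
    from scipy.stats import norm

    def minimum_detectable_change(cv, n, alpha=0.05, power=0.8):
        """Smallest relative change in mean sediment flux detectable in a
        two-sample comparison, given coefficient of variation `cv` and `n`
        samplers per treatment (normal approximation)."""
        z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
        return z * cv * (2.0 / n) ** 0.5

    for n in (3, 10, 30):
        for cv in (0.5, 1.5):
            print(f"n={n:>2}, CV={cv:.1f} -> MDC ~ {minimum_detectable_change(cv, n):.0%}")
    ```

    With small n and high spatial variability (CV around 1.5), only changes of several hundred percent are detectable, matching the 200-800% range quoted in the abstract.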

  29. The detection of climate change due to the enhanced greenhouse effect

    NASA Technical Reports Server (NTRS)

    Schiffer, Robert A.; Unninayar, Sushel

    1991-01-01

    The greenhouse effect is accepted as an undisputed fact from both theoretical and observational considerations. In Earth's atmosphere, the primary greenhouse gas is water vapor. The specific concern today is that increasing concentrations of anthropogenically introduced greenhouse gases will, sooner or later, irreversibly alter the climate of Earth. Detecting climate change has been complicated by uncertainties in historical observations and measurements. Thus, the primary concern for the GEDEX project is how can climate change and enhanced greenhouse effects be unambiguously detected and quantified. Specifically examined are the areas of: Earth surface temperature; the free atmosphere (850 millibars and above); space-based measurements; measurement uncertainties; and modeling the observed temperature record.

  30. Deriving persistence indicators from regulatory water-sediment studies – opportunities and limitations in OECD 308 data.

    PubMed

    Honti, Mark; Fenner, Kathrin

    2015-05-19

    The OECD guideline 308 describes a laboratory test method to assess aerobic and anaerobic transformation of organic chemicals in aquatic sediment systems and is an integral part of tiered testing strategies in different legislative frameworks for the environmental risk assessment of chemicals. The results from experiments carried out according to OECD 308 are generally used to derive persistence indicators for hazard assessment or half-lives for exposure assessment. We used Bayesian parameter estimation and system representations of various complexities to systematically assess opportunities and limitations for estimating these indicators from existing data generated according to OECD 308 for 23 pesticides and pharmaceuticals. We found that there is a disparity between the uncertainty and the conceptual robustness of persistence indicators. Disappearance half-lives are directly extractable with limited uncertainty, but they lump degradation and phase transfer information and are not robust against changes in system geometry. Transformation half-lives are less system-specific but require inverse modeling to extract, resulting in considerable uncertainty. Available data were thus insufficient to derive indicators that had both acceptable robustness and uncertainty, which further supports previously voiced concerns about the usability and efficiency of these costly experiments. Despite the limitations of existing data, we suggest the time until 50% of the parent compound has been transformed in the entire system (DegT50,system) could still be a useful indicator of persistence in the upper, partially aerobic sediment layer in the context of PBT assessment. This should, however, be accompanied by a mandatory reporting or full standardization of the geometry of the experimental system. We recommend transformation half-lives determined by inverse modeling to be used as input parameters into fate models for exposure assessment, if due consideration is given to their uncertainty.
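
    A minimal single-first-order (SFO) fit for a DegT50,system on synthetic total-system residue data; the paper's inverse modeling uses multi-compartment representations, so this shows only the simplest case:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Synthetic whole-system parent residues (% applied) at sampling times (days).
    t = np.array([0, 1, 3, 7, 14, 30, 60, 100], float)
    residue = np.array([100, 93, 84, 67, 48, 22, 7, 2], float)

    def sfo(t, c0, k):
        """Single first-order decline of total-system parent residues."""
        return c0 * np.exp(-k * t)

    (c0, k), cov = curve_fit(sfo, t, residue, p0=(100, 0.05))
    dt50 = np.log(2) / k
    k_sd = np.sqrt(cov[1, 1])
    print(f"DegT50,system = {dt50:.1f} d  (k = {k:.3f} +/- {k_sd:.3f} 1/d)")
    ```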

  31. The impact of lake and reservoir parameterization on global streamflow simulation.

    PubMed

    Zajac, Zuzanna; Revilla-Romero, Beatriz; Salamon, Peter; Burek, Peter; Hirpa, Feyera A; Beck, Hylke

    2017-05-01

    Lakes and reservoirs affect the timing and magnitude of streamflow, and are therefore essential hydrological model components, especially in the context of global flood forecasting. However, the parameterization of lake and reservoir routines on a global scale is subject to considerable uncertainty due to lack of information on lake hydrographic characteristics and reservoir operating rules. In this study we estimated the effect of lakes and reservoirs on global daily streamflow simulations of a spatially-distributed LISFLOOD hydrological model. We applied state-of-the-art global sensitivity and uncertainty analyses for selected catchments to examine the effect of uncertain lake and reservoir parameterization on model performance. Streamflow observations from 390 catchments around the globe and multiple performance measures were used to assess model performance. Results indicate a considerable geographical variability in the lake and reservoir effects on the streamflow simulation. Nash-Sutcliffe Efficiency (NSE) and Kling-Gupta Efficiency (KGE) metrics improved for 65% and 38% of catchments respectively, with median skill score values of 0.16 and 0.2 while scores deteriorated for 28% and 52% of the catchments, with median values -0.09 and -0.16, respectively. The effect of reservoirs on extreme high flows was substantial and widespread in the global domain, while the effect of lakes was spatially limited to a few catchments. As indicated by global sensitivity analysis, parameter uncertainty substantially affected uncertainty of model performance. Reservoir parameters often contributed to this uncertainty, although the effect varied widely among catchments. The effect of reservoir parameters on model performance diminished with distance downstream of reservoirs in favor of other parameters, notably groundwater-related parameters and channel Manning's roughness coefficient. This study underscores the importance of accounting for lakes and, especially, reservoirs and using appropriate parameterization in large-scale hydrological simulations.
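
    For reference, the two quoted performance metrics and a common skill-score convention (the paper may define its skill scores differently; the streamflow arrays below are invented):

    ```python
    import numpy as np

    def nse(sim, obs):
        """Nash-Sutcliffe Efficiency."""
        return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def kge(sim, obs):
        """Kling-Gupta Efficiency (2009 formulation)."""
        r = np.corrcoef(sim, obs)[0, 1]
        alpha = sim.std() / obs.std()   # variability ratio
        beta = sim.mean() / obs.mean()  # bias ratio
        return 1 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

    def skill(metric_new, metric_ref, perfect=1.0):
        """Skill of a new run relative to a reference run."""
        return (metric_new - metric_ref) / (perfect - metric_ref)

    obs = np.array([3.0, 4.2, 10.5, 8.1, 5.0, 3.5, 2.9, 6.4])
    base = np.array([2.5, 3.0, 14.0, 9.5, 6.5, 4.0, 2.0, 5.0])       # no lakes/reservoirs
    with_res = np.array([2.8, 3.9, 11.5, 8.6, 5.6, 3.7, 2.6, 6.0])   # with parameterization
    print(f"NSE skill score: {skill(nse(with_res, obs), nse(base, obs)):.2f}")
    print(f"KGE skill score: {skill(kge(with_res, obs), kge(base, obs)):.2f}")
    ```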

  32. Plasticity models of material variability based on uncertainty quantification techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Reese E.; Rizzi, Francesco; Boyce, Brad

    The advent of fabrication techniques like additive manufacturing has focused attention on the considerable variability of material response due to defects and other micro-structural aspects. This variability motivates the development of an enhanced design methodology that incorporates inherent material variability to provide robust predictions of performance. In this work, we develop plasticity models capable of representing the distribution of mechanical responses observed in experiments using traditional plasticity models of the mean response and recently developed uncertainty quantification (UQ) techniques. Lastly, we demonstrate that the new method provides predictive realizations that are superior to more traditional ones, and show how these UQ techniques can be used in model selection and in assessing the quality of calibrated physical parameters.

  33. Generalized uncertainty principle: implications for black hole complementarity

    NASA Astrophysics Data System (ADS)

    Chen, Pisin; Ong, Yen Chin; Yeom, Dong-han

    2014-12-01

    At the heart of the black hole information loss paradox and the firewall controversy lies the conflict between quantum mechanics and general relativity. Much has been said about quantum corrections to general relativity, but much less in the opposite direction. It is therefore crucial to examine possible corrections to quantum mechanics due to gravity. Indeed, the Heisenberg Uncertainty Principle is one profound feature of quantum mechanics, which nevertheless may receive corrections when gravitational effects become important. Such a generalized uncertainty principle (GUP) has been motivated not only from quite general considerations of quantum mechanics and gravity, but also from string-theoretic arguments. We examine the role of the GUP in the context of black hole complementarity. We find that while complementarity can be violated by large N rescaling if one assumes only Heisenberg's Uncertainty Principle, the application of the GUP may save complementarity, but only if a certain N-dependence is also assumed. This raises two important questions beyond the scope of this work, i.e., whether the GUP really has the proposed form of N-dependence, and whether black hole complementarity is indeed correct.
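
    For reference, a commonly used one-parameter form of the GUP and the minimal length it implies (conventions and the placement of the dimensionless parameter β vary between papers; this is one standard choice):

    ```latex
    \Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left[1+\beta\,\frac{\ell_p^{2}\,(\Delta p)^{2}}{\hbar^{2}}\right],
    \qquad
    \Delta x_{\min} \;=\; \sqrt{\beta}\,\ell_p ,
    ```

    where ℓ_p is the Planck length; the bound is minimized at Δp of order ħ/(√β ℓ_p), and β → 0 recovers the ordinary Heisenberg relation.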

  34. The Certainty of Uncertainty: Potential Sources of Bias and Imprecision in Disease Ecology Studies.

    PubMed

    Lachish, Shelly; Murray, Kris A

    2018-01-01

    Wildlife diseases have important implications for wildlife and human health, the preservation of biodiversity and the resilience of ecosystems. However, understanding disease dynamics and the impacts of pathogens in wild populations is challenging because these complex systems can rarely, if ever, be observed without error. Uncertainty in disease ecology studies is commonly defined in terms of either heterogeneity in detectability (due to variation in the probability of encountering, capturing, or detecting individuals in their natural habitat) or uncertainty in disease state assignment (due to misclassification errors or incomplete information). In reality, however, uncertainty in disease ecology studies extends beyond these components of observation error and can arise from multiple varied processes, each of which can lead to bias and a lack of precision in parameter estimates. Here, we present an inventory of the sources of potential uncertainty in studies that attempt to quantify disease-relevant parameters from wild populations (e.g., prevalence, incidence, transmission rates, force of infection, risk of infection, persistence times, and disease-induced impacts). We show that uncertainty can arise via processes pertaining to aspects of the disease system, the study design, the methods used to study the system, and the state of knowledge of the system, and that uncertainties generated via one process can propagate through to others because of interactions between the numerous biological, methodological and environmental factors at play. We show that many of these sources of uncertainty may not be immediately apparent to researchers (for example, unidentified crypticity among vectors, hosts or pathogens, a mismatch between the temporal scale of sampling and disease dynamics, demographic or social misclassification), and thus have received comparatively little consideration in the literature to date. Finally, we discuss the type of bias or imprecision introduced by these varied sources of uncertainty and briefly present appropriate sampling and analytical methods to account for, or minimise, their influence on estimates of disease-relevant parameters. This review should assist researchers and practitioners to navigate the pitfalls of uncertainty in wildlife disease ecology studies.
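
    As one concrete example of correcting for imperfect disease-state assignment, the Rogan-Gladen estimator recovers true prevalence from apparent prevalence given test sensitivity and specificity (illustrative values below):

    ```python
    def true_prevalence(apparent, sensitivity, specificity):
        """Rogan-Gladen correction of apparent prevalence for an imperfect test."""
        return (apparent + specificity - 1) / (sensitivity + specificity - 1)

    # Example: 30% of sampled animals test positive with an assay of Se=0.85, Sp=0.95.
    print(f"corrected prevalence: {true_prevalence(0.30, 0.85, 0.95):.3f}")
    ```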

  35. Human Health Risk Assessment of Pharmaceuticals in Water: Issues and Challenges Ahead

    PubMed Central

    Kumar, Arun; Chang, Biao; Xagoraraki, Irene

    2010-01-01

    This study identified existing issues related to quantitative pharmaceutical risk assessment (QPhRA, hereafter) for pharmaceuticals in water and proposed possible solutions by analyzing methodologies and findings of different published QPhRA studies. Retrospective site-specific QPhRA studies from different parts of the world (U.S.A., United Kingdom, Europe, India, etc.) were reviewed in a structured manner to understand the different assumptions, the outcomes obtained, and the issues identified, addressed, or raised by the different QPhRA studies. To date, most of the published studies have concluded that there is no appreciable risk to human health during environmental exposures of pharmaceuticals; however, attention is still required to the following identified issues: (1) use of measured versus predicted pharmaceutical concentrations, (2) identification of pharmaceuticals-of-concern and compounds needing special considerations, (3) use of source water versus finished drinking water-related exposure scenarios, (4) selection of representative exposure routes, (5) valuation of uncertainty factors, and (6) risk assessment for mixtures of chemicals. To close the existing data and methodology gaps, this study proposed possible ways to address and/or incorporate these considerations within the QPhRA framework; however, more research work is still required to address issues such as the incorporation of short-term to long-term extrapolation and mixture effects in the QPhRA framework. Specifically, this study proposed the development of a new “mixture effects-related uncertainty factor” for mixtures of chemicals (i.e., mixUFcomposite), similar to the uncertainty factor of a single chemical, within the QPhRA framework. In addition to all five traditionally used uncertainty factors, this uncertainty factor is also proposed to include concentration effects due to the presence of different concentration levels of pharmaceuticals in a mixture. However, further work is required to determine values of all six uncertainty factors and to incorporate them during estimation of point-of-departure values within the QPhRA framework. PMID:21139869

  16. Legal considerations in infectious diseases and dentistry.

    PubMed

    Burris, S

    1996-04-01

    Dentists, similar to other professionals subject to legal regulation, often have an overly simple view of the legal system. Communicable diseases present questions on the cutting edge of the law, and, as the previous discussion makes perhaps painfully clear, there is considerable uncertainty on many important legal points. Legal uncertainty is often a reflection of social or scientific uncertainty. Clear answers emerge less from the words of lawyers and judges than from the actions of professionals themselves, who ultimately set the standard of care. In any area of legal uncertainty, the dentist is best advised to adhere to the best scientific information available and to meet the ethical standards of the profession.

  17. Reconstruction of droughts in India using multiple land-surface models (1951-2015)

    NASA Astrophysics Data System (ADS)

    Mishra, Vimal; Shah, Reepal; Azhar, Syed; Shah, Harsh; Modi, Parth; Kumar, Rohini

    2018-04-01

    India has witnessed some of the most severe historical droughts in the current decade, and severity, frequency, and areal extent of droughts have been increasing. As a large part of the population of India is dependent on agriculture, soil moisture drought affecting agricultural activities (crop yields) has significant impacts on socio-economic conditions. Due to limited observations, soil moisture is generally simulated using land-surface hydrological models (LSMs); however, these LSM outputs have uncertainty due to many factors, including errors in forcing data and model parameterization. Here we reconstruct agricultural drought events over India during the period of 1951-2015 based on simulated soil moisture from three LSMs, the Variable Infiltration Capacity (VIC), the Noah, and the Community Land Model (CLM). Based on simulations from the three LSMs, we find that major drought events occurred in 1987, 2002, and 2015 during the monsoon season (June through September). During the Rabi season (November through February), major soil moisture droughts occurred in 1966, 1973, 2001, and 2003. Soil moisture droughts estimated from the three LSMs are comparable in terms of their spatial coverage; however, differences are found in drought severity. Moreover, we find a higher uncertainty in simulated drought characteristics over a large part of India during the major crop-growing season (Rabi season, November to February: NDJF) compared to those of the monsoon season (June to September: JJAS). Furthermore, uncertainty in drought estimates is higher for severe and localized droughts. Higher uncertainty in the soil moisture droughts is largely due to the difference in model parameterizations (especially soil depth), resulting in different persistence of soil moisture simulated by the three LSMs. Our study highlights the importance of accounting for the LSMs' uncertainty and consideration of the multi-model ensemble system for the real-time monitoring and prediction of drought over India.

  18. Tests and comparisons of gravity models.

    NASA Technical Reports Server (NTRS)

    Marsh, J. G.; Douglas, B. C.

    1971-01-01

    Optical observations of the GEOS satellites were used to obtain orbital solutions with different sets of geopotential coefficients. The solutions were compared before and after modification to high order terms (necessary because of resonance) and were then analyzed by comparing subsequent observations with predicted trajectories. The most important source of error in orbit determination and prediction for the GEOS satellites is the effect of resonance found in most published sets of geopotential coefficients. Modifications to the sets yield greatly improved orbits in most cases. The results of these comparisons suggest that with the best optical tracking systems and gravity models, satellite position error due to gravity model uncertainty can reach 50-100 m during a heavily observed 5-6 day orbital arc. If resonant coefficients are estimated, the uncertainty is reduced considerably.

  19. The Influence of Boundary Layer Parameters on Interior Noise

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.; Rocha, Joana

    2012-01-01

    Predictions of the wall pressure in the turbulent boundary layer of an aerospace vehicle can differ substantially from measurement due to phenomena that are not well understood. Characterizing these phenomena will require additional testing at considerable cost. Before expending scarce resources, it is desirable to quantify the effect of the uncertainty in wall pressure predictions and measurements on structural response and acoustic radiation. A sensitivity analysis is performed on four parameters of the Corcos cross spectrum model: power spectrum, streamwise and cross stream coherence lengths, and Mach number. It is found that at lower frequencies, where high power levels and long coherence lengths exist, the radiated sound power prediction carries up to 7 dB of uncertainty due to power spectrum levels, with the streamwise and cross stream coherence lengths contributing equally to the total.
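
    The Corcos model referenced above expresses the wall-pressure cross spectrum as a point power spectrum multiplied by exponential coherence decay in the streamwise and cross stream separations, with a convective phase term. A minimal sketch with illustrative decay constants and convection velocity (not the paper's values):

        import numpy as np

        def corcos_cross_spectrum(phi_pp, omega, xi, eta, Uc, alpha_x=0.1, alpha_y=0.7):
            """Corcos model: cross spectrum at separations xi (streamwise), eta [m]."""
            decay = np.exp(-alpha_x * np.abs(omega * xi / Uc)
                           - alpha_y * np.abs(omega * eta / Uc))
            phase = np.exp(1j * omega * xi / Uc)   # streamwise convection
            return phi_pp * decay * phase

        omega = 2.0 * np.pi * 500.0     # 500 Hz
        Uc = 0.7 * 230.0                # convection velocity ~ 0.7 x flow speed [m/s] (assumed)
        gamma = corcos_cross_spectrum(1.0, omega, xi=0.05, eta=0.02, Uc=Uc)
        print(f"normalised cross-spectrum magnitude: {abs(gamma):.3f}")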

  20. Cloud Ice: A Climate Model Challenge With Signs and Expectations of Progress

    NASA Astrophysics Data System (ADS)

    Li, F.; Waliser, D.; Bacmeister, J.; Chern, J.; Del Genio, T.; Jiang, J.; Kharitondov, M.; Liou, K.; Meng, H.; Minnis, P.; Rossow, B.; Stephens, G.; Sun-Mack, S.; Tao, W.; Vane, D.; Woods, C.; Tompkins, A.; Wu, D.

    2007-12-01

    Global climate models (GCMs), including those assessed in the IPCC AR4, exhibit considerable disagreement in the amount of cloud ice - both in terms of the annual global mean and its spatial variability. Global measurements of cloud ice have been difficult due to the challenges involved in remotely sensing ice water content (IWC) and its vertical profile - including complications associated with multi-level clouds, mixed phases and multiple hydrometeor types, the uncertainty in classifying ice particle size and shape for remote retrievals, and the relatively small time and space scales associated with deep convection. Together, these measurement difficulties make it a challenge to characterize and understand the mechanisms of ice cloud formation and dissipation. Fortunately, recently established observational resources can be expected to lead to a considerable reduction in the observational uncertainties of cloud ice, and in turn improve the fidelity of model representations. Specifically, these include the Microwave Limb Sounder (MLS) on the Earth Observing System (EOS) Aura satellite, and the CloudSat and Calipso satellite missions, all of which fly in formation in what is referred to as the A-Train. Based on radar and limb-sounding techniques, these new satellite measurements provide a considerable leap forward in terms of the information gathered regarding upper-tropospheric cloud IWC as well as other macrophysical and microphysical properties. In this presentation, we describe the current state of GCM representations of cloud ice and their associated uncertainties, the nature of the new observational resources for constraining cloud ice values in GCMs, the challenges in making model-data comparisons with these data resources, and prospects for near-term improvements in model representations.

  1. Uncertainty Quantification and Regional Sensitivity Analysis of Snow-related Parameters in the Canadian LAnd Surface Scheme (CLASS)

    NASA Astrophysics Data System (ADS)

    Badawy, B.; Fletcher, C. G.

    2017-12-01

    The parameterization of snow processes in land surface models is an important source of uncertainty in climate simulations. Quantifying the importance of snow-related parameters, and their uncertainties, may therefore lead to better understanding and quantification of uncertainty within integrated earth system models. However, quantifying the uncertainty arising from parameterized snow processes is challenging due to the high-dimensional parameter space, poor observational constraints, and parameter interaction. In this study, we investigate the sensitivity of the land simulation to uncertainty in snow microphysical parameters in the Canadian LAnd Surface Scheme (CLASS) using an uncertainty quantification (UQ) approach. A set of training cases (n=400) from CLASS is used to sample each parameter across its full range of empirical uncertainty, as determined from available observations and expert elicitation. A statistical learning model using support vector regression (SVR) is then constructed from the training data (CLASS output variables) to efficiently emulate the dynamical CLASS simulations over a much larger (n=220) set of cases. This approach is used to constrain the plausible range for each parameter using a skill score, and to identify the parameters with the largest influence on the land simulation in CLASS at global and regional scales, using a random forest (RF) permutation importance algorithm. Preliminary sensitivity tests indicate that the snow albedo refreshment threshold and the limiting snow depth, below which bare patches begin to appear, have the highest impact on snow output variables. The results also show a considerable narrowing of the plausible parameter ranges, and hence of their uncertainty, which can lead to a significant reduction of the model uncertainty. The implementation and results of this study will be presented and discussed in detail.
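
    The emulation step can be sketched as follows: fit a support vector regression to a modest ensemble of model runs, then rank parameter influence by permutation importance. Here scikit-learn's model-agnostic permutation routine stands in for the random forest algorithm used in the study, and the training data are synthetic rather than CLASS output.

        import numpy as np
        from sklearn.svm import SVR
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.inspection import permutation_importance

        rng = np.random.default_rng(0)
        n_train, n_params = 400, 6
        X = rng.uniform(0.0, 1.0, size=(n_train, n_params))    # scaled parameters
        # Synthetic "model output": two dominant parameters plus noise.
        y = 2.0 * X[:, 0] + 1.5 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(n_train)

        emulator = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
        emulator.fit(X, y)

        imp = permutation_importance(emulator, X, y, n_repeats=20, random_state=0)
        for i in np.argsort(imp.importances_mean)[::-1]:
            print(f"param {i}: importance {imp.importances_mean[i]:.3f}")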

  2. Impacts of uncertainties in European gridded precipitation observations on regional climate analysis

    PubMed Central

    Gobiet, Andreas

    2016-01-01

    Gridded precipitation data sets are frequently used to evaluate climate models or to remove model output biases. Although precipitation data are error prone due to the high spatio‐temporal variability of precipitation and due to considerable measurement errors, relatively few attempts have been made to account for observational uncertainty in model evaluation or in bias correction studies. In this study, we compare three types of European daily data sets featuring two Pan‐European data sets and a set that combines eight very high‐resolution station‐based regional data sets. Furthermore, we investigate seven widely used, larger scale global data sets. Our results demonstrate that the differences between these data sets have the same magnitude as precipitation errors found in regional climate models. Therefore, including observational uncertainties is essential for climate studies, climate model evaluation, and statistical post‐processing. Following our results, we suggest the following guidelines for regional precipitation assessments. (1) Include multiple observational data sets from different sources (e.g. station, satellite, reanalysis based) to estimate observational uncertainties. (2) Use data sets with high station densities to minimize the effect of precipitation undersampling (may induce about 60% error in data sparse regions). The information content of a gridded data set is mainly related to its underlying station density and not to its grid spacing. (3) Consider undercatch errors of up to 80% in high latitudes and mountainous regions. (4) Analyses of small‐scale features and extremes are especially uncertain in gridded data sets. For higher confidence, use climate‐mean and larger scale statistics. In conclusion, neglecting observational uncertainties potentially misguides climate model development and can severely affect the results of climate change impact assessments. PMID:28111497

  3. Impacts of uncertainties in European gridded precipitation observations on regional climate analysis.

    PubMed

    Prein, Andreas F; Gobiet, Andreas

    2017-01-01

    Gridded precipitation data sets are frequently used to evaluate climate models or to remove model output biases. Although precipitation data are error prone due to the high spatio-temporal variability of precipitation and due to considerable measurement errors, relatively few attempts have been made to account for observational uncertainty in model evaluation or in bias correction studies. In this study, we compare three types of European daily data sets featuring two Pan-European data sets and a set that combines eight very high-resolution station-based regional data sets. Furthermore, we investigate seven widely used, larger scale global data sets. Our results demonstrate that the differences between these data sets have the same magnitude as precipitation errors found in regional climate models. Therefore, including observational uncertainties is essential for climate studies, climate model evaluation, and statistical post-processing. Following our results, we suggest the following guidelines for regional precipitation assessments. (1) Include multiple observational data sets from different sources (e.g. station, satellite, reanalysis based) to estimate observational uncertainties. (2) Use data sets with high station densities to minimize the effect of precipitation undersampling (may induce about 60% error in data sparse regions). The information content of a gridded data set is mainly related to its underlying station density and not to its grid spacing. (3) Consider undercatch errors of up to 80% in high latitudes and mountainous regions. (4) Analyses of small-scale features and extremes are especially uncertain in gridded data sets. For higher confidence, use climate-mean and larger scale statistics. In conclusion, neglecting observational uncertainties potentially misguides climate model development and can severely affect the results of climate change impact assessments.

  4. Uncertainty and Clinical Psychology: Therapists' Responses.

    ERIC Educational Resources Information Center

    Bienenfeld, Sheila

    Three sources of professional uncertainty have been described: uncertainty about the practitioner's mastery of knowledge; uncertainty due to gaps in the knowledge base itself; and uncertainty about the source of the uncertainty, i.e., the practitioner does not know whether his uncertainty is due to gaps in the knowledge base or to personal…

  5. Assessing uncertain human exposure to ambient air pollution using environmental models in the Web

    NASA Astrophysics Data System (ADS)

    Gerharz, L. E.; Pebesma, E.; Denby, B.

    2012-04-01

    Ambient air quality can have a significant impact on human health by causing respiratory and cardio-vascular diseases. The pollutant concentration a person is exposed to can differ considerably between individuals, depending on their daily routines and movement patterns. In a straightforward approach, this exposure can be estimated by integrating individual space-time paths with spatio-temporally resolved ambient air quality data. To allow a realistic exposure assessment, it is furthermore important to consider uncertainties due to input and model errors. In this work, we present a generic, web-based approach for estimating individual exposure by integration of uncertain position and air quality information, implemented as a web service. Following the Model Web initiative, which envisions an infrastructure for deploying, executing and chaining environmental models as services, existing models and data sources, e.g. for air quality, can be used to assess exposure. The service therefore needs to deal with the different formats, resolutions and uncertainty representations provided by model or data services. Potential mismatches can be accounted for by transformation of uncertainties and (dis-)aggregation of data, with consideration of the resulting changes in the uncertainties, using components developed in the UncertWeb project. In UncertWeb, the Model Web vision is extended to an uncertainty-enabled Model Web, where services can process and communicate uncertainties in the data and models. The propagation of uncertainty to the exposure results is quantified using Monte Carlo simulation, by combining different realisations of positions and ambient concentrations. Two case studies were used to evaluate the developed exposure assessment service. In the first study, GPS tracks with a positional uncertainty of a few meters, collected in the urban area of Münster, Germany, were used to assess exposure to PM10 (particulate matter smaller than 10 µm). Air quality data were provided by an uncertainty-enabled air quality model system, which provided realisations of hourly concentrations on a 250 m x 250 m grid over Münster. The second case study uses modelled human trajectories in Rotterdam, The Netherlands. The trajectories were provided as realisations at 15 min resolution per 4-digit postal code from an activity model. Air quality estimates were provided for different pollutants as ensembles by a coupled meteorology and air quality model system on a 1 km x 1 km grid with hourly resolution. Both case studies show the successful application of the service to different resolutions and uncertainty representations.
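
    A minimal sketch of the Monte Carlo propagation described above: pair realisations of an uncertain track with realisations of the concentration field and average along the path. The grid, track, and error magnitudes below are invented for illustration; the real service consumes realisations from external model services.

        import numpy as np

        rng = np.random.default_rng(1)
        n_real, n_steps = 1000, 24               # realisations, hourly positions

        # Hypothetical "true" track [m], to be perturbed by ~5 m GPS noise.
        track = np.cumsum(rng.normal(0, 50, size=(n_steps, 2)), axis=0)
        hours = np.arange(n_steps)

        def concentration(xy, hour, noise):
            """Toy PM10 field [ug/m3]: smooth gradient + diurnal cycle + realisation noise."""
            base = 20.0 + 0.01 * xy[:, 0] + 2.0 * np.sin(2 * np.pi * hour / 24.0)
            return np.maximum(base + noise, 0.0)

        exposures = np.empty(n_real)
        for r in range(n_real):
            positions = track + rng.normal(0, 5, size=track.shape)   # position realisation
            noise = rng.normal(0, 3, size=n_steps)                   # concentration realisation
            exposures[r] = concentration(positions, hours, noise).mean()

        print(f"time-averaged exposure: {exposures.mean():.1f} ug/m3 "
              f"(95% interval {np.percentile(exposures, 2.5):.1f}-"
              f"{np.percentile(exposures, 97.5):.1f})")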

  6. Sources of uncertainty in annual forest inventory estimates

    Treesearch

    Ronald E. McRoberts

    2000-01-01

    Although design and estimation aspects of annual forest inventories have begun to receive considerable attention within the forestry and natural resources communities, little attention has been devoted to identifying the sources of uncertainty inherent in these systems or to assessing the impact of those uncertainties on the total uncertainties of inventory estimates....

  7. Identifying influences on model uncertainty: an application using a forest carbon budget model

    Treesearch

    James E. Smith; Linda S. Heath

    2001-01-01

    Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...

  8. Radiometer uncertainty equation research of 2D planar scanning PMMW imaging system

    NASA Astrophysics Data System (ADS)

    Hu, Taiyang; Xu, Jianzhong; Xiao, Zelong

    2009-07-01

    With advances in millimeter-wave technology, passive millimeter-wave (PMMW) imaging has received considerable attention and has established itself in a wide range of military and civil applications, such as remote sensing, blind landing, precision guidance and security inspection. The high transparency of clothing at millimeter wavelengths and the spatial resolution required to generate adequate images combine to make imaging at millimeter wavelengths a natural approach to screening people for concealed contraband. At the same time, the passive operation mode does not present a safety hazard to the person under inspection. Based on a description of the design and engineering implementation of a W-band two-dimensional (2D) planar scanning imaging system, a series of scanning methods utilized in PMMW imaging are compared and analyzed, followed by a discussion of the operational principle of the 2D planar scanning mode in particular. Furthermore, it is found that the traditional radiometer uncertainty equation, which was derived for a moving platform, does not hold under this 2D planar scanning mode, because there is no inherent relation between the scanning rates in the horizontal and vertical directions. Consequently, an improved radiometer uncertainty equation is derived in this paper by taking the total time spent on scanning and imaging into consideration, with the purpose of solving the problem mentioned above. In addition, the factors that affect the quality of radiometric images are further investigated under the improved radiometer uncertainty equation, and some original results are presented and analyzed to demonstrate the significance and validity of the new methodology.
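
    The classical total-power radiometer sensitivity relation underlying this discussion is ΔT = Tsys / sqrt(B·τ); under 2D planar scanning, the per-pixel integration time τ follows from the total time spent scanning the frame, in the spirit the abstract suggests. A sketch with assumed system values (not this system's actual specification):

        import math

        T_sys = 800.0            # system noise temperature [K] (assumed)
        bandwidth = 2.0e9        # predetection bandwidth [Hz] (assumed)

        n_pixels = 100 * 100     # 2D planar scan raster
        total_scan_time = 60.0   # total time spent scanning the frame [s]
        tau = total_scan_time / n_pixels       # dwell time per pixel

        delta_T = T_sys / math.sqrt(bandwidth * tau)
        print(f"tau per pixel = {tau * 1e3:.1f} ms, "
              f"temperature resolution = {delta_T:.2f} K")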

  9. Cancer Risk Assessment for Space Radiation

    NASA Technical Reports Server (NTRS)

    Richmond, Robert C.; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    Predicting the occurrence of human cancer following exposure to any agent causing genetic damage is a difficult task. This is because the uncertainty of uniform exposure to the damaging agent, and the uncertainty of uniform processing of that damage within a complex set of biological variables, degrade the confidence of predicting the delayed expression of cancer as a relatively rare event within any given clinically normal individual. The radiation health research priorities for enabling long-duration human exploration of space were established in the 1996 NRC Report entitled "Radiation Hazards to Crews of Interplanetary Missions: Biological Issues and Research Strategies". This report emphasized that a 15-fold uncertainty in predicting radiation-induced cancer incidence must be reduced before NASA can commit humans to extended interplanetary missions. That report concluded that the great majority of this uncertainty is biologically based, while a minority is physically based due to uncertainties in radiation dosimetry and radiation transport codes. Since that report, the biologically based uncertainty has remained large, and the relatively small uncertainty associated with radiation dosimetry has increased due to the considerations raised by concepts of microdosimetry. In a practical sense, however, the additional uncertainties introduced by microdosimetry are encouraging since they are in a direction of lowered effective dose absorbed through infrequent interactions of any given cell with the high energy particle component of space radiation. The biological uncertainty in predicting cancer risk for space radiation derives from two primary facts. 1) One animal tumor study has been reported that includes a relevant spectrum of particle radiation energies, and that is the Harderian gland model in mice. Fact #1: Extension of cancer risk from animal models, and especially from a single study in an animal model, to humans is inherently uncertain. 2) One human database is predominantly used for assessing cancer risk caused by space radiation, and that is the Japanese atomic bomb survivors. Fact #2: The atomic-bomb-survivor database, itself a remarkable achievement, contains uncertainties. These include the actual exposure to each individual, the radiation quality of that exposure, and the fact that the exposure was to acute doses of predominantly low-LET radiation, not to chronic exposures of high-LET radiation expected on long-duration interplanetary manned missions.

  10. Choice of generic antihypertensive drugs for the primary prevention of cardiovascular disease--a cost-effectiveness analysis.

    PubMed

    Wisløff, Torbjørn; Selmer, Randi M; Halvorsen, Sigrun; Fretheim, Atle; Norheim, Ole F; Kristiansen, Ivar Sønbø

    2012-04-04

    Hypertension is one of the leading causes of cardiovascular disease (CVD). A range of antihypertensive drugs exists, and their prices vary widely, mainly due to patent rights. The objective of this study was to explore the cost-effectiveness of different generic antihypertensive drugs as first, second and third choices for primary prevention of cardiovascular disease. We used the Norwegian Cardiovascular Disease model (NorCaD) to simulate the cardiovascular life of patients, from asymptomatic hypertension until they were all dead or 100 years old. The risks of CVD events and the costs were based on recent Norwegian sources. In single-drug treatment, all antihypertensives are cost-effective compared to no drug treatment. In the base-case analysis, the first, second and third choices of antihypertensive were a calcium channel blocker, a thiazide and an angiotensin-converting enzyme inhibitor. However, the sensitivity and scenario analyses indicated considerable uncertainty, in that angiotensin receptor blockers, as well as angiotensin-converting enzyme inhibitors, beta blockers and thiazides, could be the most cost-effective antihypertensive drugs. Generic antihypertensives are cost-effective in a wide range of risk groups. There is considerable uncertainty, however, regarding which drug is the most cost-effective.
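
    The core comparison in such an analysis can be sketched as incremental cost-effectiveness ratios under probabilistic (Monte Carlo) sensitivity analysis. All costs, QALYs, spreads and the willingness-to-pay threshold below are invented placeholders, not NorCaD outputs.

        import numpy as np

        rng = np.random.default_rng(8)
        n_mc = 10_000
        wtp = 20_000   # willingness-to-pay threshold [eur/QALY] (assumed)

        # (mean lifetime cost [eur], mean QALYs) per strategy, all invented:
        strategies = {
            "no treatment": (10_000, 9.0),
            "thiazide":     (11_000, 9.3),
            "CCB":          (11_500, 9.4),
        }

        base_cost, base_qaly = strategies["no treatment"]
        for name, (cost, qaly) in strategies.items():
            if name == "no treatment":
                continue
            # Probabilistic sensitivity analysis: draw incremental costs/effects.
            d_cost = rng.normal(cost - base_cost, 500.0, n_mc)
            d_qaly = rng.normal(qaly - base_qaly, 0.10, n_mc)
            icer = d_cost.mean() / d_qaly.mean()
            prob_ce = np.mean(wtp * d_qaly - d_cost > 0.0)   # net monetary benefit > 0
            print(f"{name}: ICER ~ {icer:,.0f} eur/QALY, "
                  f"P(cost-effective) ~ {prob_ce:.2f}")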

  11. SU-E-J-159: Intra-Patient Deformable Image Registration Uncertainties Quantified Using the Distance Discordance Metric

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saleh, Z; Thor, M; Apte, A

    2014-06-01

    Purpose: The quantitative evaluation of deformable image registration (DIR) is currently challenging due to the lack of a ground truth. In this study we test a new method proposed for quantifying multiple-image based DIR-related uncertainties, for DIR of pelvic images. Methods: 19 patients who previously had radiotherapy for prostate cancer were analyzed, each with 6 CT scans. Manually delineated structures for the rectum and bladder, which served as ground truth structures, were delineated on the planning CT and each subsequent scan. For each patient, voxel-by-voxel DIR-related uncertainties were evaluated, following B-spline based DIR, by applying a previously developed metric, the distance discordance metric (DDM; Saleh et al., PMB (2014) 59:733). The DDM map was superimposed on the first acquired CT scan and DDM statistics were assessed, also relative to two metrics estimating the agreement between the propagated and the manually delineated structures. Results: The highest DDM values, which correspond to the greatest spatial uncertainties, were observed near the body surface and in the bowel due to the presence of gas. The mean rectal and bladder DDM values ranged from 1.1–11.1 mm and 1.5–12.7 mm, respectively. There was a strong correlation in the DDMs between the rectum and bladder (Pearson R = 0.68 for the max DDM). For both structures, DDM was correlated with the ratio between the DIR-propagated and manually delineated volumes (R = 0.74 for the max rectal DDM). The maximum rectal DDM was negatively correlated with the Dice Similarity Coefficient between the propagated and the manually delineated volumes (R = −0.52). Conclusion: The multiple-image based DDM map quantified considerable DIR variability across different structures and among patients. Besides using the DDM for quantifying DIR-related uncertainties, it could potentially be used to adjust for uncertainties in DIR-based accumulated dose distributions.
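
    One of the agreement metrics used above, the Dice Similarity Coefficient between a DIR-propagated structure and its manual contour, is easy to state in code. The toy masks below are illustrative; the DDM itself additionally requires multiple registrations per voxel.

        import numpy as np

        def dice(mask_a, mask_b):
            """Dice similarity coefficient of two boolean masks."""
            inter = np.logical_and(mask_a, mask_b).sum()
            return 2.0 * inter / (mask_a.sum() + mask_b.sum())

        manual = np.zeros((64, 64), dtype=bool)
        manual[20:44, 20:44] = True                       # manual contour
        propagated = np.roll(manual, shift=3, axis=0)     # DIR result shifted 3 voxels

        print(f"DSC = {dice(manual, propagated):.3f}")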

  12. Uncertainty in Estimates of Net Seasonal Snow Accumulation on Glaciers from In Situ Measurements

    NASA Astrophysics Data System (ADS)

    Pulwicki, A.; Flowers, G. E.; Radic, V.

    2017-12-01

    Accurately estimating the net seasonal snow accumulation (or "winter balance") on glaciers is central to assessing glacier health and predicting glacier runoff. However, measuring and modeling snow distribution is inherently difficult in mountainous terrain, resulting in high uncertainties in estimates of winter balance. Our work focuses on uncertainty attribution within the process of converting direct measurements of snow depth and density to estimates of winter balance. We collected more than 9000 direct measurements of snow depth across three glaciers in the St. Elias Mountains, Yukon, Canada in May 2016. Linear regression (LR) and simple kriging (SK), combined with cross correlation and Bayesian model averaging, are used to interpolate estimates of snow water equivalent (SWE) from snow depth and density measurements. Snow distribution patterns are found to differ considerably between glaciers, highlighting strong inter- and intra-basin variability. Elevation is found to be the dominant control on the spatial distribution of SWE, but the relationship varies considerably between glaciers. A simple parameterization of wind redistribution is also a small but statistically significant predictor of SWE. The SWE estimated for one study glacier has a short range parameter (90 m), and both LR and SK estimate a winter balance of 0.6 m w.e. but are poor predictors of SWE at measurement locations. The other two glaciers have longer SWE range parameters (~450 m) and, due to differences in extrapolation, SK estimates are more than 0.1 m w.e. (up to 40%) lower than LR estimates. By using a Monte Carlo method to quantify the effects of various sources of uncertainty, we find that the interpolation of estimated values of SWE is a larger source of uncertainty than the assignment of snow density or the representation of the SWE value within a terrain model grid cell. For our study glaciers, the total winter balance uncertainty ranges from 0.03 (8%) to 0.15 (54%) m w.e., depending primarily on the interpolation method. Despite the challenges associated with accurately and precisely estimating winter balance, our results are consistent with the previously reported regional accumulation gradient.
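
    A minimal sketch of one step in this workflow: regress snow depth on elevation, then propagate an assumed snow-density uncertainty to the glacier-wide winter balance by Monte Carlo. All values are invented, and the study also used kriging as an alternative interpolator.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical depth soundings [m] against elevation [m a.s.l.]
        elev = rng.uniform(2000, 2800, 200)
        depth = 1.0 + 0.0015 * (elev - 2000) + rng.normal(0, 0.3, 200)

        # Snow density treated as uncertain: mean 350 kg/m3, sd 30 kg/m3 (assumed).
        n_mc = 5000
        density = rng.normal(350.0, 30.0, n_mc)

        # Fit depth ~ elevation and predict over a toy glacier hypsometry.
        slope, intercept = np.polyfit(elev, depth, 1)
        glacier_elev = np.linspace(2000, 2800, 50)
        mean_depth = (slope * glacier_elev + intercept).mean()

        winter_balance = mean_depth * density / 1000.0   # m water equivalent
        print(f"winter balance = {winter_balance.mean():.2f} m w.e. "
              f"(+/- {winter_balance.std():.2f})")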

  13. Solar neutrinos and the MSW effect for three-neutrino mixing

    NASA Technical Reports Server (NTRS)

    Shi, X.; Schramm, David N.

    1991-01-01

    Researchers considered three-neutrino Mikheyev-Smirnov-Wolfenstein (MSW) mixing, assuming m3 >> m2 > m1 as expected from theoretical considerations if neutrinos have mass. They calculated the corresponding mixing parameter space allowed by the Cl-37 and Kamiokande 2 experiments. They also calculated the expected depletion for the Ga-71 experiment. They explored a range of theoretical uncertainty due to possible astrophysical effects by varying the B-8 neutrino flux and redoing the MSW mixing calculation.

  14. Uncertainties in Climatological Seawater Density Calculations

    NASA Astrophysics Data System (ADS)

    Dai, Hao; Zhang, Xining

    2018-03-01

    In most applications, with seawater conductivity, temperature, and pressure data measured in situ by various observation instruments, e.g., Conductivity-Temperature-Depth (CTD) instruments, the density, which has strong ties to ocean dynamics, is computed according to equations of state for seawater. This paper, based on the density computational formulae in the Thermodynamic Equation of Seawater 2010 (TEOS-10), follows the Guide to the expression of Uncertainty in Measurement (GUM) and assesses the main sources of uncertainty. By virtue of climatological decades-average temperature/Practical Salinity/pressure data sets for the global ocean provided by the National Oceanic and Atmospheric Administration (NOAA), correlation coefficients between uncertainty sources are determined and the combined standard uncertainties u_c(ρ) in seawater density calculations are evaluated. For grid points in the world ocean at 0.25° resolution, the standard deviations of u_c(ρ) in vertical profiles are of the order of 10^-4 kg m^-3. The u_c(ρ) means in vertical profiles of the Baltic Sea are about 0.028 kg m^-3 due to the larger scatter of the Absolute Salinity anomaly. The distribution of the u_c(ρ) means in vertical profiles of the world ocean except for the Baltic Sea, which covers the range (0.004, 0.01) kg m^-3, is related to the correlation coefficient r(SA, p) between Absolute Salinity SA and pressure p. The results in the paper are based on the measuring uncertainties of high accuracy CTD sensors. Larger uncertainties in density calculations may arise if connected with lower sensor specifications. This work may provide valuable uncertainty information required for reliability considerations of ocean circulation and global climate models.
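
    The GUM law of propagation of uncertainty used above combines sensitivity coefficients, input standard uncertainties, and correlation terms. The sketch below uses invented sensitivities and uncertainties (not TEOS-10 values) with a single correlation term r(SA, p):

        import math

        # Sensitivity coefficients d(rho)/dx (assumed, [kg m^-3 per unit]):
        c_SA, c_t, c_p = 0.75, -0.2, 0.0045
        # Standard uncertainties of Absolute Salinity [g/kg], temperature [K],
        # pressure [dbar] (assumed):
        u_SA, u_t, u_p = 0.004, 0.002, 1.0
        # Correlation between Absolute Salinity and pressure, as in the paper:
        r_SA_p = 0.3

        u_c_squared = ((c_SA * u_SA) ** 2 + (c_t * u_t) ** 2 + (c_p * u_p) ** 2
                       + 2.0 * c_SA * c_p * r_SA_p * u_SA * u_p)
        print(f"u_c(rho) = {math.sqrt(u_c_squared):.4f} kg m^-3")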

  15. MOMENTS OF UNCERTAINTY: ETHICAL CONSIDERATIONS AND EMERGING CONTAMINANTS

    PubMed Central

    Cordner, Alissa; Brown, Phil

    2013-01-01

    Science on emerging environmental health threats involves numerous ethical concerns related to scientific uncertainty about conducting, interpreting, communicating, and acting upon research findings, but the connections between ethical decision making and scientific uncertainty are under-studied in sociology. Under conditions of scientific uncertainty, researcher conduct is not fully prescribed by formal ethical codes of conduct, increasing the importance of ethical reflection by researchers, conflicts over research conduct, and reliance on informal ethical standards. This paper draws on in-depth interviews with scientists, regulators, activists, industry representatives, and fire safety experts to explore ethical considerations of moments of uncertainty using a case study of flame retardants, chemicals widely used in consumer products with potential negative health and environmental impacts. We focus on the uncertainty that arises in measuring people’s exposure to these chemicals through testing of their personal environments or bodies. We identify four sources of ethical concerns relevant to scientific uncertainty: 1) choosing research questions or methods, 2) interpreting scientific results, 3) communicating results to multiple publics, and 4) applying results for policy-making. This research offers lessons about professional conduct under conditions of uncertainty, ethical research practice, democratization of scientific knowledge, and science’s impact on policy. PMID:24249964

  16. Quantifying uncertainty and computational complexity for pore-scale simulations

    NASA Astrophysics Data System (ADS)

    Chen, C.; Yuan, Z.; Wang, P.; Yang, X.; Zhenyan, L.

    2016-12-01

    Pore-scale simulation is an essential tool for understanding the complex physical processes in many environmental problems, from multi-phase flow in the subsurface to fuel cells. In practice, however, factors such as sample heterogeneity, data sparsity and, in general, our insufficient knowledge of the underlying processes render many simulation parameters, and hence the prediction results, uncertain. Meanwhile, most pore-scale simulations (in particular, direct numerical simulations) incur high computational cost due to finely-resolved spatio-temporal scales, which further limits data collection. To address those challenges, we propose a novel framework based on generalized polynomial chaos (gPC) and build a surrogate model representing the essential features of the underlying system. Specifically, we apply the framework to analyze the uncertainties of the system behavior based on a series of pore-scale numerical experiments, such as flow and reactive transport in 2D heterogeneous porous media and 3D packed beds. Compared with recent pore-scale uncertainty quantification studies using Monte Carlo techniques, our new framework requires fewer realizations and hence considerably reduces the overall computational cost, while maintaining the desired accuracy.
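
    In one dimension, the gPC idea reduces to expanding the model response in orthogonal polynomials of a standard random input and reading moments off the cheap surrogate. A hedged sketch with a toy stand-in model and a non-intrusive least-squares fit (the study's actual formulation may differ):

        import numpy as np
        from numpy.polynomial import hermite_e as He

        def expensive_model(xi):
            """Toy stand-in for a pore-scale simulation response."""
            return np.exp(0.4 * xi) + 0.1 * xi ** 2

        # Build the surrogate from a small number of training runs.
        xi_train = np.linspace(-3, 3, 9)
        coeffs = He.hermefit(xi_train, expensive_model(xi_train), deg=5)

        # Moments via the surrogate, evaluated on cheap Monte Carlo samples.
        rng = np.random.default_rng(0)
        xi_mc = rng.standard_normal(100_000)
        y_surrogate = He.hermeval(xi_mc, coeffs)
        print(f"mean ~ {y_surrogate.mean():.4f}, std ~ {y_surrogate.std():.4f}")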

  17. Quantifying the uncertainties of a bottom-up emission inventory of anthropogenic atmospheric pollutants in China

    NASA Astrophysics Data System (ADS)

    Zhao, Y.; Nielsen, C. P.; Lei, Y.; McElroy, M. B.; Hao, J.

    2010-11-01

    The uncertainties of a national, bottom-up inventory of Chinese emissions of anthropogenic SO2, NOx, and particulate matter (PM) of different size classes and carbonaceous species are comprehensively quantified, for the first time, using Monte Carlo simulation. The inventory is structured by seven dominant sectors: coal-fired electric power, cement, iron and steel, other industry (boiler combustion), other industry (non-combustion processes), transportation, and residential. For each parameter related to emission factors or activity-level calculations, the uncertainties, represented as probability distributions, are either statistically fitted using results of domestic field tests or, when these are lacking, estimated based on foreign or other domestic data. The uncertainties (i.e., 95% confidence intervals around the central estimates) of Chinese emissions of SO2, NOx, total PM, PM10, PM2.5, black carbon (BC), and organic carbon (OC) in 2005 are estimated to be -14%~12%, -10%~36%, -10%~36%, -12%~42%, -16%~52%, -23%~130%, and -37%~117%, respectively. Variations at activity levels (e.g., energy consumption or industrial production) are not the main source of emission uncertainties. Due to narrow classification of source types, large sample sizes, and relatively high data quality, the coal-fired power sector is estimated to have the smallest emission uncertainties for all species except BC and OC. Due to poorer source classifications and a wider range of estimated emission factors, considerable uncertainties of NOx and PM emissions from cement production and boiler combustion in other industries are found. The probability distributions of emission factors for biomass burning, the largest source of BC and OC, are fitted based on very limited domestic field measurements, and special caution should thus be taken interpreting these emission uncertainties. Although Monte Carlo simulation yields narrowed estimates of uncertainties compared to previous bottom-up emission studies, the results are not always consistent with those derived from satellite observations. The results thus represent an incremental research advance; while the analysis provides current estimates of uncertainty to researchers investigating Chinese and global atmospheric transport and chemistry, it also identifies specific needs in data collection and analysis to improve on them. Strengthened quantification of emissions of the included species and other, closely associated ones - notably CO2, generated largely by the same processes and thus subject to many of the same parameter uncertainties - is essential not only for science but for the design of policies to redress critical atmospheric environmental hazards at local, regional, and global scales.
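
    The Monte Carlo framing described above can be sketched as activity times emission factor per sector, with each parameter drawn from a fitted distribution and the 95% confidence interval read from the resulting ensemble. Sectors and numbers below are invented placeholders, not inventory values.

        import numpy as np

        rng = np.random.default_rng(2010)
        n_mc = 50_000

        sectors = {
            # name: (activity mean, activity cv, emission-factor mean, EF cv)
            "power":       (1000.0, 0.05, 2.0, 0.10),
            "cement":      ( 300.0, 0.10, 1.5, 0.30),
            "residential": ( 200.0, 0.15, 3.0, 0.40),
        }

        total = np.zeros(n_mc)
        for act_mu, act_cv, ef_mu, ef_cv in sectors.values():
            activity = rng.normal(act_mu, act_cv * act_mu, n_mc)
            # Skewed emission factors are commonly fitted with lognormals:
            sigma = np.sqrt(np.log(1.0 + ef_cv ** 2))
            ef = rng.lognormal(np.log(ef_mu) - 0.5 * sigma ** 2, sigma, n_mc)
            total += activity * ef

        central = np.median(total)
        lo, hi = np.percentile(total, [2.5, 97.5])
        print(f"emissions: {central:.0f} "
              f"({(lo / central - 1) * 100:+.0f}%~{(hi / central - 1) * 100:+.0f}%)")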

  18. Quantifying the uncertainties of a bottom-up emission inventory of anthropogenic atmospheric pollutants in China

    NASA Astrophysics Data System (ADS)

    Zhao, Y.; Nielsen, C. P.; Lei, Y.; McElroy, M. B.; Hao, J.

    2011-03-01

    The uncertainties of a national, bottom-up inventory of Chinese emissions of anthropogenic SO2, NOx, and particulate matter (PM) of different size classes and carbonaceous species are comprehensively quantified, for the first time, using Monte Carlo simulation. The inventory is structured by seven dominant sectors: coal-fired electric power, cement, iron and steel, other industry (boiler combustion), other industry (non-combustion processes), transportation, and residential. For each parameter related to emission factors or activity-level calculations, the uncertainties, represented as probability distributions, are either statistically fitted using results of domestic field tests or, when these are lacking, estimated based on foreign or other domestic data. The uncertainties (i.e., 95% confidence intervals around the central estimates) of Chinese emissions of SO2, NOx, total PM, PM10, PM2.5, black carbon (BC), and organic carbon (OC) in 2005 are estimated to be -14%~13%, -13%~37%, -11%~38%, -14%~45%, -17%~54%, -25%~136%, and -40%~121%, respectively. Variations at activity levels (e.g., energy consumption or industrial production) are not the main source of emission uncertainties. Due to narrow classification of source types, large sample sizes, and relatively high data quality, the coal-fired power sector is estimated to have the smallest emission uncertainties for all species except BC and OC. Due to poorer source classifications and a wider range of estimated emission factors, considerable uncertainties of NOx and PM emissions from cement production and boiler combustion in other industries are found. The probability distributions of emission factors for biomass burning, the largest source of BC and OC, are fitted based on very limited domestic field measurements, and special caution should thus be taken interpreting these emission uncertainties. Although Monte Carlo simulation yields narrowed estimates of uncertainties compared to previous bottom-up emission studies, the results are not always consistent with those derived from satellite observations. The results thus represent an incremental research advance; while the analysis provides current estimates of uncertainty to researchers investigating Chinese and global atmospheric transport and chemistry, it also identifies specific needs in data collection and analysis to improve on them. Strengthened quantification of emissions of the included species and other, closely associated ones - notably CO2, generated largely by the same processes and thus subject to many of the same parameter uncertainties - is essential not only for science but for the design of policies to redress critical atmospheric environmental hazards at local, regional, and global scales.

  19. Quantifying the sources of uncertainty in an ensemble of hydrological climate-impact projections

    NASA Astrophysics Data System (ADS)

    Aryal, Anil; Shrestha, Sangam; Babel, Mukand S.

    2018-01-01

    The objective of this paper is to quantify the various sources of uncertainty in the assessment of climate change impacts on hydrology in the Tamakoshi River Basin, located in the north-eastern part of Nepal. Multiple climate and hydrological models were used to simulate future climate conditions and discharge in the basin. The simulated results for future climate and river discharge were analysed to quantify the sources of uncertainty using two-way and three-way ANOVA. The results showed that temperature and precipitation in the study area are projected to change in the near- (2010-2039), mid- (2040-2069) and far-future (2070-2099) periods. Maximum temperature is likely to rise by 1.75 °C under Representative Concentration Pathway (RCP) 4.5 and by 3.52 °C under RCP 8.5. Similarly, the minimum temperature is expected to rise by 2.10 °C under RCP 4.5 and by 3.73 °C under RCP 8.5 by the end of the twenty-first century. Precipitation in the study area is expected to change by -2.15% under the RCP 4.5 and -2.44% under the RCP 8.5 scenarios. The future discharge in the study area was projected using two hydrological models, viz. the Soil and Water Assessment Tool (SWAT) and the Hydrologic Engineering Center's Hydrologic Modelling System (HEC-HMS). The SWAT-projected discharge is expected to change by a small amount, whereas the HEC-HMS model projected considerably lower discharge in the future compared to the baseline period. The results also show that future climate variables and river hydrology contain uncertainty due to the choice of climate models, RCP scenarios, bias correction methods and hydrological models. During wet days, more uncertainty is observed due to the use of different climate models, whereas during dry days, the use of different hydrological models has a greater effect on uncertainty. Inter-comparison of the impacts of different climate models reveals that the REMO climate model shows higher uncertainty in the prediction of precipitation and, consequently, in the prediction of future discharge and maximum probable flood.
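
    The two-way ANOVA decomposition used above attributes the total variance of an ensemble of change signals to each factor and their interaction. A minimal sketch over an invented GCM x RCP matrix of discharge changes:

        import numpy as np

        rng = np.random.default_rng(5)
        n_gcm, n_rcp = 4, 2
        # Hypothetical % change in discharge for each GCM/RCP combination:
        signal = (rng.normal(0, 8, size=(n_gcm, 1))         # GCM effect
                  + rng.normal(0, 3, size=(1, n_rcp))       # RCP effect
                  + rng.normal(0, 2, size=(n_gcm, n_rcp)))  # interaction + noise

        grand = signal.mean()
        ss_gcm = n_rcp * ((signal.mean(axis=1) - grand) ** 2).sum()
        ss_rcp = n_gcm * ((signal.mean(axis=0) - grand) ** 2).sum()
        ss_tot = ((signal - grand) ** 2).sum()
        ss_int = ss_tot - ss_gcm - ss_rcp   # residual: interaction (no replication)

        for name, ss in [("GCM", ss_gcm), ("RCP", ss_rcp), ("interaction", ss_int)]:
            print(f"{name:12s} {100 * ss / ss_tot:5.1f}% of variance")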

  20. Section summary: Uncertainty and design considerations

    Treesearch

    Stephen Hagen

    2013-01-01

    Well planned sampling designs and robust approaches to estimating uncertainty are critical components of forest monitoring. The importance of uncertainty estimation increases as deforestation and degradation issues become more closely tied to financing incentives for reducing greenhouse gas emissions in the forest sector. Investors like to know risk and risk is tightly...

  1. Uncertainty quantification of Antarctic contribution to sea-level rise using the fast Elementary Thermomechanical Ice Sheet (f.ETISh) model

    NASA Astrophysics Data System (ADS)

    Bulthuis, Kevin; Arnst, Maarten; Pattyn, Frank; Favier, Lionel

    2017-04-01

    Uncertainties in sea-level rise projections are mostly due to uncertainties in Antarctic ice-sheet predictions (IPCC AR5 report, 2013), because key parameters related to the current state of the Antarctic ice sheet (e.g. sub-ice-shelf melting) and future climate forcing are poorly constrained. Here, we propose to improve the predictions of Antarctic ice-sheet behaviour using new uncertainty quantification methods. As opposed to ensemble modelling (Bindschadler et al., 2013) which provides a rather limited view on input and output dispersion, new stochastic methods (Le Maître and Knio, 2010) can provide deeper insight into the impact of uncertainties on complex system behaviour. Such stochastic methods usually begin with deducing a probabilistic description of input parameter uncertainties from the available data. Then, the impact of these input parameter uncertainties on output quantities is assessed by estimating the probability distribution of the outputs by means of uncertainty propagation methods such as Monte Carlo methods or stochastic expansion methods. The use of such uncertainty propagation methods in glaciology may be computationally costly because of the high computational complexity of ice-sheet models. This challenge emphasises the importance of developing reliable and computationally efficient ice-sheet models such as the f.ETISh ice-sheet model (Pattyn, 2015), a new fast thermomechanical coupled ice sheet/ice shelf model capable of handling complex and critical processes such as the marine ice-sheet instability mechanism. Here, we apply these methods to investigate the role of uncertainties in sub-ice-shelf melting, calving rates and climate projections in assessing Antarctic contribution to sea-level rise for the next centuries using the f.ETISh model. We detail the methods and show results that provide nominal values and uncertainty bounds for future sea-level rise as a reflection of the impact of the input parameter uncertainties under consideration, as well as a ranking of the input parameter uncertainties in the order of the significance of their contribution to uncertainty in future sea-level rise. In addition, we discuss how limitations posed by the available information (poorly constrained data) pose challenges that motivate our current research.

  2. MO-E-BRE-01: Determination, Minimization and Communication of Uncertainties in Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Dyk, J; Palta, J; Bortfeld, T

    2014-06-15

    Medical Physicists have a general understanding of uncertainties in the radiation treatment process, both with respect to dosimetry and geometry. However, there is a desire to be more quantitative about uncertainty estimation. A recent International Atomic Energy Agency (IAEA) report (about to be published) recommends that we should be as “accurate as reasonably achievable, technical and biological factors being taken into account”. Thus, a single recommendation as a goal for accuracy in radiation therapy is an oversimplification. That report also suggests that individual clinics should determine their own level of uncertainties for their specific treatment protocols. The question is “how do we implement this in clinical practice”? AAPM Monograph 35 (2011 AAPM Summer School) addressed many specific aspects of uncertainties in each of the steps of a course of radiation treatment. The intent of this symposium is: (1) to review uncertainty considerations in the entire radiation treatment process including uncertainty determination for each step and uncertainty propagation for the total process, (2) to consider aspects of robust optimization which optimizes treatment plans while protecting them against uncertainties, and (3) to describe various methods of displaying uncertainties and communicating uncertainties to the relevant professionals. While the theoretical and research aspects will also be described, the emphasis will be on the practical considerations for the medical physicist in clinical practice. Learning Objectives: To review uncertainty determination in the overall radiation treatment process. To consider uncertainty modeling and uncertainty propagation. To highlight the basic ideas and clinical potential of robust optimization procedures to generate optimal treatment plans that are not severely affected by uncertainties. To describe methods of uncertainty communication and display.

  3. Statistical and Spatial Analysis of Bathymetric Data for the St. Clair River, 1971-2007

    USGS Publications Warehouse

    Bennion, David

    2009-01-01

    To address questions concerning ongoing geomorphic processes in the St. Clair River, selected bathymetric datasets spanning 36 years were analyzed. Comparisons of recent high-resolution datasets covering the upper river indicate a highly variable, active environment. Although statistical and spatial comparisons of the datasets show that some changes to channel size and shape took place during the study period, uncertainty associated with the various survey methods and interpolation processes limits the statistical certainty of the results. The methods used to spatially compare the datasets are sensitive to small variations in position and depth that are within the range of uncertainty associated with the datasets. Characteristics of the data, such as the density of measured points and the range of values surveyed, can also influence the results of spatial comparison. With due consideration of these limitations, apparently active and ongoing areas of elevation change in the river are mapped and discussed.

  4. Modeling uncertainty in computerized guidelines using fuzzy logic.

    PubMed Central

    Jaulent, M. C.; Joyaux, C.; Colombet, I.; Gillois, P.; Degoulet, P.; Chatellier, G.

    2001-01-01

    Computerized Clinical Practice Guidelines (CPGs) improve quality of care by assisting physicians in their decision making. A number of problems emerge when patients with similar characteristics are given contradictory recommendations. In this article, we propose to use fuzzy logic to model the uncertainty due to the use of thresholds in CPGs. A fuzzy classification procedure has been developed that provides, for each message of the CPG, a strength of recommendation rating the appropriateness of the recommendation for the patient under consideration. This work is done in the context of a CPG for the diagnosis and management of hypertension, published in 1997 by the French agency ANAES. A population of 82 patients with mild to moderate hypertension was selected, and the results of the classification system were compared to those given by a classical decision tree. Observed agreement is 86.6%, and the variability of recommendations for patients with similar characteristics is reduced. PMID:11825196
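
    The fuzzy treatment of guideline thresholds can be sketched as a membership ramp replacing a crisp cut-off, so that patients near the threshold receive graded strengths of recommendation. The ramp bounds below are illustrative, not the ANAES guideline's values.

        def fuzzy_membership(x, low, high):
            """Linear ramp: 0 below `low`, 1 above `high`, linear in between."""
            if x <= low:
                return 0.0
            if x >= high:
                return 1.0
            return (x - low) / (high - low)

        # Graded recommendation around a nominal 140 mmHg systolic threshold:
        for sbp in (128, 138, 141, 152):
            strength = fuzzy_membership(sbp, low=135.0, high=145.0)
            print(f"systolic {sbp} mmHg -> recommendation strength {strength:.2f}")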

  5. Global land cover mapping: a review and uncertainty analysis

    USGS Publications Warehouse

    Congalton, Russell G.; Gu, Jianyu; Yadav, Kamini; Thenkabail, Prasad S.; Ozdogan, Mutlu

    2014-01-01

    Given the advances in remotely sensed imagery and associated technologies, several global land cover maps have been produced in recent times, including IGBP DISCover, UMD Land Cover, Global Land Cover 2000 and GlobCover 2009. However, the utility of these maps for specific applications has often been hampered by considerable uncertainty and inconsistency. A thorough review of these global land cover projects, including an evaluation of the sources of error and uncertainty, is prudent and enlightening. Therefore, this paper describes our work in which we compared, summarized and conducted an uncertainty analysis of the four global land cover mapping projects using an error budget approach. The results showed that the classification scheme and the validation methodology had the highest error contribution and implementation priority. A comparison of the classification schemes showed that there are many inconsistencies between the definitions of the map classes. This is especially true for the mixed type classes, for which thresholds vary for the attributes/discriminators used in the classification process. Examination of these four global mapping projects provided several important lessons for future global mapping projects, including the need for clear and uniform definitions of the classification scheme and an efficient, practical, and valid design of the accuracy assessment.

  6. Robust Adaptation? Assessing the sensitivity of safety margins in flood defences to uncertainty in future simulations - a case study from Ireland.

    NASA Astrophysics Data System (ADS)

    Murphy, Conor; Bastola, Satish; Sweeney, John

    2013-04-01

    Climate change impact and adaptation assessments have traditionally adopted a 'top-down' scenario based approach, where information from different Global Climate Models (GCMs) and emission scenarios is employed to develop impacts-led adaptation strategies. Due to the tradeoffs in computational cost and the need to include a wide range of GCMs for fuller characterization of uncertainties, scenarios are better used for sensitivity testing and adaptation options appraisal. One common approach to adaptation that has been defined as robust is the use of safety margins. In this work, the sensitivity of the safety margins adopted by the agency responsible for flood risk management in Ireland to uncertainty in future projections is examined. The sensitivity of fluvial flood risk to climate change is assessed for four Irish catchments using a large number of GCMs (17) forced with three emissions scenarios (SRES A1B, A2, B1) as input to four hydrological models. Uncertainty both within and between hydrological models is assessed using the GLUE framework. Regionalisation is achieved using a change factor method to infer changes in the parameters of a weather generator from monthly GCM output, while flood frequency analysis is conducted using the method of probability weighted moments to fit the Generalised Extreme Value distribution to ~20,000 annual maxima series. The sensitivity of design margins to the uncertainty space considered is visualised using risk response surfaces. The hydrological sensitivity is measured as the percentage change in flood peak for specified recurrence intervals. Results indicate that there is considerable residual risk associated with allowances of +20% when uncertainties are accounted for, and that the risk of exceedance of design allowances is greatest for more extreme, low-frequency events, with considerable implications for critical infrastructure (e.g., culverts, bridges, flood defences) whose designs are normally associated with such return periods. Sensitivity results show that the impact of climate change is not as great for flood peaks with higher return periods. The average width of the uncertainty range and the size of the range for each catchment reveal that the uncertainties in low frequency events are greater than in high frequency events. In addition, the uncertainty interval, estimated as the average width of the uncertainty range of flow for the five return periods, grows wider with decreasing runoff coefficient and wetness index of each catchment, both of which tend to increase the nonlinearity of the rainfall response. A key management question that emerges is the acceptability of residual risk where high exposure of vulnerable populations and/or critical infrastructure coincides with high costs of additional capacity in safety margins.
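
    The flood-frequency step described above can be sketched by fitting a GEV distribution to an annual maxima series and comparing return levels against a +20% design allowance. Note that scipy fits by maximum likelihood rather than the probability weighted moments used in the study, and the series below is synthetic.

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(11)
        # Synthetic 60-year annual maxima series [m3/s]:
        annual_maxima = genextreme.rvs(c=-0.1, loc=100.0, scale=25.0,
                                       size=60, random_state=rng)

        shape, loc, scale = genextreme.fit(annual_maxima)
        for T in (10, 50, 100):
            q = genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
            print(f"{T:4d}-yr flood: {q:7.1f} m3/s  "
                  f"(design + 20% margin: {1.2 * q:7.1f})")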

  7. An overview of methods to identify and manage uncertainty for modelling problems in the water-environment-agriculture cross-sector

    DOE PAGES

    Jakeman, Anthony J.; Jakeman, John Davis

    2018-03-14

    Uncertainty pervades the representation of systems in the water–environment–agriculture cross-sector. Successful methods to address uncertainties have largely focused on standard mathematical formulations of biophysical processes in a single sector, such as partial or ordinary differential equations. More attention to integrated models of such systems is warranted. Model components representing the different sectors of an integrated model can have less standard, and different, formulations to one another, as well as different levels of epistemic knowledge and data informativeness. Thus, uncertainty is not only pervasive but also crosses boundaries and propagates between system components. Uncertainty assessment (UA) cries out for more eclectic treatment in these circumstances, some of it being more qualitative and empirical. Here in this paper, we discuss the various sources of uncertainty in such a cross-sectoral setting and ways to assess and manage them. We have outlined a fast-growing set of methodologies, particularly in the computational mathematics literature on uncertainty quantification (UQ), that seem highly pertinent for uncertainty assessment. There appears to be considerable scope for advancing UA by integrating relevant UQ techniques into cross-sectoral problem applications. Of course this will entail considerable collaboration between domain specialists who often take first ownership of the problem and computational methods experts.

  9. Intercomparison of different uncertainty sources in hydrological climate change projections for an alpine catchment (upper Clutha River, New Zealand)

    NASA Astrophysics Data System (ADS)

    Jobst, Andreas M.; Kingston, Daniel G.; Cullen, Nicolas J.; Schmid, Josef

    2018-06-01

    As climate change is projected to alter both temperature and precipitation, snow-controlled mid-latitude catchments are expected to experience substantial shifts in their seasonal regime, which will have direct implications for water management. In order to provide authoritative projections of climate change impacts, the uncertainty inherent to all components of the modelling chain needs to be accounted for. This study assesses the uncertainty in potential impacts of climate change on the hydro-climate of a headwater sub-catchment of New Zealand's largest catchment (the Clutha River) using a fully distributed hydrological model (WaSiM) and a unique ensemble encompassing different uncertainty sources: general circulation model (GCM), emission scenario, bias correction and snow model. The inclusion of snow models is particularly important, given that (1) they are a rarely considered aspect of uncertainty in hydrological modelling studies, and (2) snow has a considerable influence on seasonal patterns of river flow in alpine catchments such as the Clutha. Projected changes in river flow for the 2050s and 2090s encompass substantial increases in streamflow from May to October, and a decline between December and March. The dominant drivers are changes in the seasonal distribution of precipitation (for the 2090s, +29 to +84 % in winter) and substantial decreases in seasonal snow storage due to the temperature increase. A quantitative comparison identified GCM structure as the dominant contributor to uncertainty in the seasonal streamflow signal (44-57 %), followed by emission scenario (16-49 %), bias correction (4-22 %) and snow model (3-10 %). While these findings suggest that the role of the snow model is comparatively small, its contribution to the overall uncertainty was still found to be noticeable for winter and summer.
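
    The relative contribution of each link in such a modelling chain can be estimated with a main-effects variance decomposition. The sketch below is a minimal, assumption-laden stand-in for the quantitative comparison described above (synthetic numbers, and a simple ANOVA-style main-effects split rather than the authors' exact attribution method):

```python
import numpy as np

rng = np.random.default_rng(0)
# change signals on axes (GCM, emission scenario, bias correction, snow model)
signal = rng.normal(size=(17, 3, 2, 2))  # synthetic seasonal streamflow changes

def main_effect_fraction(arr, axis):
    """Share of total variance explained by one factor's main effect."""
    other = tuple(i for i in range(arr.ndim) if i != axis)
    return np.var(arr.mean(axis=other)) / np.var(arr)

for name, ax in [("GCM", 0), ("scenario", 1), ("bias correction", 2), ("snow model", 3)]:
    print(f"{name:15s} {100 * main_effect_fraction(signal, ax):5.1f} %")
```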

  10. Joint Knowledge Generation Between Climate Science and Infrastructure Engineering

    NASA Astrophysics Data System (ADS)

    Stoner, A. M. K.; Hayhoe, K.; Jacobs, J. M.

    2015-12-01

    Over the past decade the engineering community has become increasingly aware of the need to incorporate climate projections into the planning and design of sensitive infrastructure. However, this is a task that is easier said than done. This presentation will discuss some of the successes and hurdles experienced over the past year, from a climate scientist's perspective, working with engineers in infrastructure research and applied engineering through the Infrastructure & Climate Network (ICNet). Engineers rely on strict building codes and ordinances, and can be the subject of lawsuits if those codes are not followed. Matters are further complicated by the uncertainty inherent to climate projections, which includes short-term natural variability as well as the influence of scientific uncertainty and even human behavior on the rate and magnitude of change. Climate scientists typically address uncertainty by creating projections based on multiple models following different future scenarios. This uncertainty is difficult to incorporate into engineering projects, however, because one cannot build two different bridges, one allowing for a lower amount of change and another for a higher amount. More often than not there is a considerable difference between the costs of building two such bridges, which means that available funds often become the deciding factor. Discussions of climate science are often well received by engineers who work in infrastructure research; going a step further and implementing it in applied engineering projects, however, can be challenging. This presentation will discuss some of the challenges and opportunities inherent to collaborations between climate scientists and transportation engineers, drawing from a range of studies including truck weight restrictions on roads during the spring thaw, and bridge deck performance under environmental forcings.

  11. Hydrological model uncertainty due to spatial evapotranspiration estimation methods

    NASA Astrophysics Data System (ADS)

    Yu, Xuan; Lamačová, Anna; Duffy, Christopher; Krám, Pavel; Hruška, Jakub

    2016-05-01

    Evapotranspiration (ET) continues to be a difficult process to estimate in seasonal and long-term water balances in catchment models. Approaches to estimating ET typically use vegetation parameters (e.g., leaf area index [LAI], interception capacity) obtained from field observation, remote sensing data, national or global land cover products, and/or simulated by ecosystem models. In this study we attempt to quantify the uncertainty that spatial evapotranspiration estimation introduces into hydrological simulations when the age of the forest is not precisely known. The Penn State Integrated Hydrologic Model (PIHM) was implemented for the Lysina headwater catchment, located at 50°03′N, 12°40′E in the western part of the Czech Republic. The spatial forest patterns were digitized from forest age maps made available by the Czech Forest Administration. Two ET methods were implemented in the catchment model: the Biome-BGC forest growth sub-model (1-way coupled to PIHM) and the fixed-seasonal LAI method. From these two approaches, simulation scenarios were developed by combining the estimated spatial forest age maps with the two ET estimation methods to drive PIHM. A set of spatial hydrologic regime and streamflow regime indices were calculated from the modeling results for each method. Intercomparison of the hydrological responses to the spatial vegetation patterns suggested considerable variation in soil moisture and recharge and a small uncertainty in the groundwater table elevation and streamflow. The hydrologic modeling with ET estimated by Biome-BGC generated less uncertainty, owing to its plant physiology-based method. The implication of this research is that the overall hydrologic variability induced by uncertain management practices was reduced by implementing vegetation models in the catchment models.

  12. Understanding uncertainty in precipitation changes in a balanced perturbed-physics ensemble under multiple climate forcings

    NASA Astrophysics Data System (ADS)

    Millar, R.; Ingram, W.; Allen, M. R.; Lowe, J.

    2013-12-01

    Temperature and precipitation patterns are the climate variables with the greatest impacts on both natural and human systems. Due to the small spatial scales and the many interactions involved in the global hydrological cycle, representations of precipitation changes in general circulation models (GCMs) are subject to considerable uncertainty. Quantifying and understanding the causes of uncertainty (and identifying robust features of predictions) in both global and local precipitation change is an essential challenge for climate science. We have used the huge distributed computing capacity of the climateprediction.net citizen science project to examine parametric uncertainty in an ensemble of 20,000 perturbed-physics versions of the HadCM3 general circulation model. The ensemble has been selected to have a control climate in top-of-atmosphere energy balance [Yamazaki et al. 2013, J.G.R.]. We force this ensemble with several idealised climate-forcing scenarios, including carbon dioxide step and transient profiles, solar radiation management geoengineering experiments with stratospheric aerosols, and short-lived climate forcing agents. We will present the results from several of these forcing scenarios under GCM parametric uncertainty. We examine the global mean precipitation energy budget to understand whether a simple non-linear global precipitation model [Good et al. 2012, Clim. Dyn.] is more robust than a simple linear tropospheric energy balance model as an explanation of precipitation changes in transient climate projections under GCM parametric uncertainty. We will also present work investigating robust conclusions about precipitation changes in a balanced ensemble of idealised solar radiation management scenarios [Kravitz et al. 2011, Atmos. Sci. Let.].

  13. Analysis of flood hazard under consideration of dike breaches

    NASA Astrophysics Data System (ADS)

    Vorogushyn, S.; Apel, H.; Lindenschmidt, K.-E.; Merz, B.

    2009-04-01

    The study focuses on the development and application of a new modelling system which allows a comprehensive flood hazard assessment along diked river reaches under consideration of dike failures. The proposed Inundation Hazard Assessment Model (IHAM) is a hybrid probabilistic-deterministic model. It comprises three models interactively coupled at runtime: (1) a 1D unsteady hydrodynamic model of river channel and floodplain flow between the dikes, (2) a probabilistic dike breach model which determines possible dike breach locations, breach widths and breach outflow discharges, and (3) a 2D raster-based diffusion wave storage cell model of the hinterland areas behind the dikes. Due to the unsteady nature of the coupled 1D and 2D models, the dependence between the hydraulic load at various locations along the reach is explicitly considered. The probabilistic dike breach model describes dike failures due to three failure mechanisms: overtopping, piping and slope instability caused by seepage flow through the dike core (micro-instability). Dike failures for each mechanism are simulated based on fragility functions. The probability of breach is conditioned on the uncertainty in geometrical and geotechnical dike parameters. The 2D storage cell model, driven by the breach outflow boundary conditions, computes an extended spectrum of flood intensity indicators such as water depth, flow velocity, impulse, inundation duration and rate of water rise. IHAM is embedded in a Monte Carlo simulation in order to account for the natural variability of the flood generation processes, reflected in the form of the input hydrographs, and for the randomness of dike failures, given by breach locations, times and widths. Scenario calculations with synthetic input hydrographs for the main river and tributary were carried out for floods with return periods of T = 100, 200, 500 and 1000 a. Based on the modelling results, probabilistic dike hazard maps could be generated that indicate the failure probability of each discretised dike section for every scenario magnitude. Besides the binary inundation patterns that indicate the probability of raster cells being inundated, IHAM generates probabilistic flood hazard maps. These maps display spatial patterns of the considered flood intensity indicators and their associated return periods. The probabilistic nature of IHAM allows the generation of percentile flood hazard maps that indicate the median and uncertainty bounds of the flood intensity indicators. The uncertainty results from the natural variability of the flow hydrographs and the randomness of the dike breach processes. The same uncertainty sources determine the uncertainty in the flow hydrographs along the study reach. The simulations showed that the stochasticity of dike breaching has an increasing impact on hydrograph uncertainty in the downstream direction: whereas in the upstream part of the reach the hydrograph uncertainty is mainly determined by the variability of the flood wave form, dike failures strongly shape the uncertainty boundaries in the downstream part. Finally, scenarios of polder deployment for the extreme floods with T = 200, 500 and 1000 a were simulated with IHAM. The results indicate a rather weak reduction of the mean and median flow hydrographs in the river channel. However, the capping of the flow peaks resulted in a considerable reduction of the overtopping failures downstream of the polder, with a simultaneous slight increase in the piping and slope micro-instability frequencies, explained by a more durable average impoundment. The developed IHAM simulation system represents a new scientific tool for studying fluvial inundation dynamics under extreme conditions, incorporating the effects of technical flood protection measures. With its major outputs in the form of novel probabilistic inundation and dike hazard maps, the IHAM system has high practical value for decision support in flood management.
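
    A stripped-down version of the probabilistic breach component can convey how fragility functions enter the Monte Carlo loop. The sketch below assumes a single scalar load and a logistic fragility curve, which is far simpler than IHAM's three coupled failure mechanisms and hydraulics:

```python
import numpy as np

rng = np.random.default_rng(42)

def fragility(load, crest=5.0, spread=0.3):
    """P(dike failure | water level) -- simple overtopping-type curve."""
    return 1.0 / (1.0 + np.exp(-(load - crest) / spread))

n_mc = 10_000
peak_level = rng.normal(4.8, 0.4, n_mc)              # variable flood-wave peaks [m]
breach = rng.random(n_mc) < fragility(peak_level)    # Bernoulli draw per realization
print(f"estimated breach probability: {breach.mean():.3f}")
```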

  14. And yet it moves! Involving transient flow conditions is the logical next step for WHPA analysis

    NASA Astrophysics Data System (ADS)

    Rodriguez-Pretelin, A.; Nowak, W.

    2017-12-01

    As the first line of defense among different safety measures, Wellhead Protection Areas (WHPAs) have been broadly used to protect drinking water wells against sources of pollution. In most cases, their implementation relies on simplifications, such as assuming homogeneous or zonated aquifer conditions or considering steady-state flow scenarios. Obviously, both assumptions inevitably introduce errors. However, while the uncertainty due to aquifer heterogeneity has been extensively studied in the literature, the impact of transient flow conditions has received very little attention. For instance, the WHPA maps in the offices of water supply companies are fixed maps derived from steady-state models, although the actual catchments are transient. To mitigate high computational costs, we approximate transiency by means of a dynamic superposition of steady-state flow solutions. We then analyze four transient drivers that often vary on the seasonal scale: (I) regional groundwater flow direction, (II) strength of the regional hydraulic gradient, (III) natural recharge to the groundwater and (IV) pumping rate. The integration of transiency into WHPA analysis leads to time-frequency maps, which express, for each location, the temporal frequency of catchment membership. Furthermore, we account for the uncertainty due to incomplete knowledge of geological and transiency conditions, addressed through Monte Carlo simulations. The main contribution of this study is to show the need to enhance groundwater well protection by considering transient flow during WHPA analysis. To support and complement our statement, we 1) demonstrate that each transient driver imprints an individual spatial pattern on the required WHPA, ranking their influence through a global sensitivity analysis; 2) compare the influence of transient conditions with that of geological uncertainty in terms of areal WHPA demand; 3) show that considering geological uncertainty alone is insufficient in the presence of transient conditions; and 4) propose a practical decision rule for selecting a proper protection reliability level in the presence of both transiency and geological uncertainty.
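
    The time-frequency maps described above reduce, in essence, to averaging boolean catchment-membership grids over time. A minimal sketch under strong assumptions (synthetic membership snapshots instead of superposed steady-state flow solutions):

```python
import numpy as np

rng = np.random.default_rng(9)
yy, xx = np.mgrid[0:40, 0:40]
capture_prob = np.exp(-((xx - 20) ** 2 + (yy - 30) ** 2) / 120.0)  # near-well cells
# member[t, i, j]: does cell (i, j) belong to the well catchment at time t?
member = rng.random((52, 40, 40)) < capture_prob   # synthetic weekly snapshots

time_frequency = member.mean(axis=0)  # fraction of time inside the catchment
whpa_90 = time_frequency >= 0.9       # 90%-reliability protection zone
print(f"cells protected at the 90% reliability level: {int(whpa_90.sum())}")
```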

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, Jennifer; Clifton, Andrew; Bonin, Timothy

    As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote-sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote-sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote-sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards for quantifying remote sensing device uncertainty for power performance testing consider uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device and are generally fixed, leading to climatic uncertainty values that apply to the entire measurement campaign. However, real-world experience and a consideration of the fundamentals of the measurement process have shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we describe the development of a new dynamic lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. The framework is applied to lidar data from a field measurement site to assess the ability of the framework to predict errors in lidar-measured wind speed. The results show how uncertainty varies over time and can be used to help select data with different levels of uncertainty for different applications, for example, low uncertainty data for power performance testing versus all data for plant performance monitoring.

  16. Fear of knowledge: Clinical hypotheses in diagnostic and prognostic reasoning.

    PubMed

    Chiffi, Daniele; Zanotti, Renzo

    2017-10-01

    Patients are interested in receiving accurate diagnostic and prognostic information. Models and reasoning about diagnoses have been extensively investigated from a foundational perspective; however, for all its importance, prognosis has yet to receive a comparable degree of philosophical and methodological attention, and this may be due to the difficulties inherent in accurate prognostication. In the light of these considerations, we discuss a considerable body of critical thinking on the topic of prognostication and its strict relations with diagnostic reasoning, pointing out the distinction between nosographic and pathophysiological types of diagnosis and prognosis, and underlining the importance of the explication and explanation processes. We then distinguish between various forms of hypothetical reasoning applied to reach diagnostic and prognostic judgments, comparing them with specific forms of abductive reasoning. The main thesis is that creative abduction regarding clinical hypotheses is very unlikely to occur in the diagnostic process, whereas this seems often to be the case for prognostic judgments. The reasons behind this distinction lie in the different types of uncertainty involved in diagnostic and prognostic judgments. © 2016 John Wiley & Sons, Ltd.

  17. Hotspots of uncertainty in land-use and land-cover change projections: A global-scale model comparison

    DOE PAGES

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D. A.; ...

    2016-05-02

    Model-based global projections of future land use and land cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socio-economic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g. boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process as well as improving the allocation mechanisms of LULC change models remain important challenges. Furthermore, current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity.

  18. Hotspots of uncertainty in land-use and land-cover change projections: a global-scale model comparison.

    PubMed

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D A; Arneth, Almut; Calvin, Katherine; Doelman, Jonathan; Eitelberg, David A; Engström, Kerstin; Fujimori, Shinichiro; Hasegawa, Tomoko; Havlik, Petr; Humpenöder, Florian; Jain, Atul K; Krisztin, Tamás; Kyle, Page; Meiyappan, Prasanth; Popp, Alexander; Sands, Ronald D; Schaldach, Rüdiger; Schüngel, Jan; Stehfest, Elke; Tabeau, Andrzej; Van Meijl, Hans; Van Vliet, Jasper; Verburg, Peter H

    2016-12-01

    Model-based global projections of future land-use and land-cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socioeconomic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios, we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g., boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process and improving the allocation mechanisms of LULC change models remain important challenges. Current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches, and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity. © 2016 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.

  20. Zuwanderung und Integration als strategischer Ansatzpunkt städtischer Regenerierung - Das Beispiel der westfälischen Stadt Altena [Immigration and integration as a strategic approach to urban regeneration - the example of the Westphalian town of Altena]

    NASA Astrophysics Data System (ADS)

    Mantel, Anna; Engel, Susen; Nuissl, Henning

    2018-03-01

    The small town of Altena is among the fastest-shrinking cities in western Germany and has recently attracted national and international attention due to its "welcoming culture" for refugees. This can be understood within the context of the town's strategic urban development policies aiming to counter the demographic change. This article argues that a regeneration strategy directed towards immigration and integration can offer a chance for shrinking cities but is simultaneously faced with considerable challenges and uncertainties, which could be dealt with through an "integrative approach" to urban development.

  2. Magnetic Ordering in Sr3YCo4O10+x

    DOE PAGES

    Kishida, Takayoshi; Kapetanakis, Myron D.; Yan, Jiaqiang; ...

    2016-01-28

    Transition-metal oxides often exhibit complex magnetic behavior due to the strong interplay between atomic-structural, electronic and magnetic degrees of freedom. Cobaltates, especially, exhibit complex behavior because of cobalt's ability to adopt various valence and spin state configurations. The case of the oxygen-deficient perovskite Sr3YCo4O10+x (SYCO) has gained considerable attention because of persisting uncertainties about its structure and the origin of the observed room-temperature ferromagnetism. Here we report a combined investigation of SYCO using aberration-corrected scanning transmission electron microscopy and density functional theory calculations.

  3. Uncertainty in the Future of Seasonal Snowpack over North America.

    NASA Astrophysics Data System (ADS)

    McCrary, R. R.; Mearns, L.

    2017-12-01

    The uncertainty in future changes in seasonal snowpack (snow water equivalent, SWE) and snow cover extent (SCE) for North America is explored using the North American Regional Climate Change Assessment Program (NARCCAP) suite of regional climate models (RCMs) and their driving CMIP3 general circulation models (GCMs). The higher resolution of the NARCCAP RCMs is found to add significant value to the details of future projections of SWE in topographically complex regions such as the Pacific Northwest and the Rocky Mountains. The NARCCAP models also add detailed information regarding changes in the southernmost extent of snow cover. Eleven of the 12 NARCCAP ensemble members contributed SWE output, which we use to explore the uncertainty in future snowpack at higher resolution. In this study, we quantify the uncertainty in future projections by looking at the spread of the interquartile range of the different models. By mid-century the RCMs consistently predict that winter SWE amounts will decrease over most of North America. The only exception to this is in Northern Canada, where increased moisture supply leads to increases in SWE in all but one of the RCMs. While the models generally agree on the sign of the change in SWE, there is considerable spread in the magnitude (absolute and percent) of the change. The RCMs also agree that the number of days with measurable snow on the ground is projected to decrease, with snow accumulation occurring later in the fall/winter and melting starting earlier in the spring/summer. As with SWE amount, spread across the models is large for changes in the timing of the snow season and can vary by over a month between models. While most of the NARCCAP models project a total loss of measurable snow along the southernmost edge of their historical range, there is considerable uncertainty about where this will occur within the ensemble due to the bias in snow cover extent in the historical simulations. We explore methods to increase our confidence about the regions that will lose any seasonal snow.

  4. Uncertainty and sensitivity assessment of flood risk assessments

    NASA Astrophysics Data System (ADS)

    de Moel, H.; Aerts, J. C.

    2009-12-01

    Floods are one of the most frequent and costly natural disasters. In order to protect human lives and valuable assets from the effects of floods, many defensive structures have been built. Despite these efforts, economic losses due to catastrophic flood events have risen substantially during the past couple of decades because of continuing economic development in flood-prone areas. On top of that, climate change is expected to affect the magnitude and frequency of flood events. Because these ongoing trends are expected to continue, a transition can be observed in various countries from a protective flood management approach to a more risk-based flood management approach. In a risk-based approach, flood risk assessments play an important role in supporting decision making. Most flood risk assessments assess flood risks in monetary terms (damage estimated for specific situations or expected annual damage) in order to feed cost-benefit analyses of management measures. Such flood risk assessments contain, however, considerable uncertainties. This results from uncertainties in the many different input parameters propagating through the risk assessment and accumulating in the final estimate. Whilst common in some other disciplines, such as integrated assessment modelling, full uncertainty and sensitivity analyses of flood risk assessments are not so common. Various studies have addressed uncertainties regarding flood risk assessments, but have mainly focussed on the hydrological conditions. However, uncertainties in other components of the risk assessment, like the relation between water depth and monetary damage, can be substantial as well. This research therefore tries to assess the uncertainties of all components of monetary flood risk assessments, using a Monte Carlo based approach. Furthermore, the total uncertainty will also be attributed to the different input parameters using a variance-based sensitivity analysis. Assessing and visualizing the uncertainties of the final risk estimate will help decision makers to make better informed decisions, and attributing this uncertainty to the input parameters helps to identify which parameters are most important when it comes to uncertainty in the final estimate and should therefore deserve additional attention in further research.
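
    A compact illustration of the Monte Carlo idea: propagate an uncertain flood depth, damage-curve slope and maximum damage through a damage model, then attribute output variance using squared correlation coefficients as a crude stand-in for a full variance-based sensitivity analysis. All distributions below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000
depth = rng.lognormal(mean=0.3, sigma=0.4, size=n)        # flood depth [m]
max_damage = rng.normal(500.0, 75.0, size=n)              # EUR/m2, uncertain
alpha = rng.uniform(0.2, 0.6, size=n)                     # damage-curve slope
damage = max_damage * np.clip(alpha * depth, 0.0, 1.0)    # damage estimate [EUR/m2]

for name, x in [("depth", depth), ("max_damage", max_damage), ("alpha", alpha)]:
    r = np.corrcoef(x, damage)[0, 1]
    print(f"{name:11s} ~{100 * r**2:4.1f} % of output variance")
```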

  5. Tolerance and UQ4SIM: Nimble Uncertainty Documentation and Analysis Software

    NASA Technical Reports Server (NTRS)

    Kleb, Bil

    2008-01-01

    Ultimately, scientific numerical models need quantified output uncertainties so that modeling can evolve to better match reality. Documenting model input uncertainties and variabilities is a necessary first step toward that goal. Without known input parameter uncertainties, model sensitivities are all one can determine, and without code verification, output uncertainties are simply not reliable. The basic premise of uncertainty markup is to craft a tolerance and tagging mini-language that offers a natural, unobtrusive presentation and does not depend on parsing each type of input file format. Each file is marked up with tolerances and, optionally, associated tags that serve to label the parameters and their uncertainties. The evolution of such a language, often called a Domain Specific Language or DSL, is given in [1], but in final form it parallels tolerances specified on an engineering drawing, e.g., 1 +/- 0.5, 5 +/- 10%, 2 +/- 1o, where % signifies percent and o signifies order of magnitude. Tags, necessary for error propagation, can be added by placing a quotation-mark-delimited tag after the tolerance, e.g., 0.7 +/- 20% 'T_effective'. In addition, tolerances might have different underlying distributions, e.g., Uniform, Normal, or Triangular, or the tolerances may merely be intervals due to lack of knowledge (uncertainty). Finally, to address pragmatic considerations such as older models that require specific number-field formats, C-style format specifiers can be appended to the tolerance like so, 1.35 +/- 10U_3.2f. As an example of use, consider figure 1, where a chemical reaction input file has been marked up to include tolerances and tags per table 1. Not only does the technique provide a natural method of specifying tolerances, but it also serves as in situ documentation of model uncertainties. This tolerance language comes with a utility to strip the tolerances (and tags), to provide a path to the nominal model parameter file. And, as shown in [1], having the ability to quickly mark and identify model parameter uncertainties facilitates error propagation, which in turn yields output uncertainties.
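
    The markup examples above suggest a small grammar that is straightforward to parse. The sketch below implements a regex for the subset shown (nominal +/- tolerance, optional %, optional quoted tag); the actual UQ4SIM language also covers distributions, order-of-magnitude tolerances and format specifiers, which this hypothetical parse_tolerance ignores:

```python
import re

TOL = re.compile(
    r"(?P<nominal>[-+]?\d*\.?\d+)\s*\+/-\s*"   # nominal value and +/- separator
    r"(?P<tol>\d*\.?\d+)(?P<pct>%)?"           # tolerance, optionally relative
    r"(?:\s*'(?P<tag>[^']*)')?"                # optional quoted tag
)

def parse_tolerance(text):
    m = TOL.search(text)
    if m is None:
        return None
    nominal = float(m["nominal"])
    tol = float(m["tol"])
    half_width = nominal * tol / 100.0 if m["pct"] else tol
    return {"nominal": nominal, "half_width": abs(half_width), "tag": m["tag"]}

print(parse_tolerance("1 +/- 0.5"))                  # absolute tolerance
print(parse_tolerance("0.7 +/- 20% 'T_effective'"))  # relative tolerance + tag
```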

  6. Dark Energy Survey Year 1 results: cross-correlation redshifts - methods and systematics characterization

    NASA Astrophysics Data System (ADS)

    Gatti, M.; Vielzeuf, P.; Davis, C.; Cawthon, R.; Rau, M. M.; DeRose, J.; De Vicente, J.; Alarcon, A.; Rozo, E.; Gaztanaga, E.; Hoyle, B.; Miquel, R.; Bernstein, G. M.; Bonnett, C.; Carnero Rosell, A.; Castander, F. J.; Chang, C.; da Costa, L. N.; Gruen, D.; Gschwend, J.; Hartley, W. G.; Lin, H.; MacCrann, N.; Maia, M. A. G.; Ogando, R. L. C.; Roodman, A.; Sevilla-Noarbe, I.; Troxel, M. A.; Wechsler, R. H.; Asorey, J.; Davis, T. M.; Glazebrook, K.; Hinton, S. R.; Lewis, G.; Lidman, C.; Macaulay, E.; Möller, A.; O'Neill, C. R.; Sommer, N. E.; Uddin, S. A.; Yuan, F.; Zhang, B.; Abbott, T. M. C.; Allam, S.; Annis, J.; Bechtol, K.; Brooks, D.; Burke, D. L.; Carollo, D.; Carrasco Kind, M.; Carretero, J.; Cunha, C. E.; D'Andrea, C. B.; DePoy, D. L.; Desai, S.; Eifler, T. F.; Evrard, A. E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Goldstein, D. A.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; Hoormann, J. K.; Jain, B.; James, D. J.; Jarvis, M.; Jeltema, T.; Johnson, M. W. G.; Johnson, M. D.; Krause, E.; Kuehn, K.; Kuhlmann, S.; Kuropatkin, N.; Li, T. S.; Lima, M.; Marshall, J. L.; Melchior, P.; Menanteau, F.; Nichol, R. C.; Nord, B.; Plazas, A. A.; Reil, K.; Rykoff, E. S.; Sako, M.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sheldon, E.; Smith, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Tucker, B. E.; Tucker, D. L.; Vikram, V.; Walker, A. R.; Weller, J.; Wester, W.; Wolf, R. C.

    2018-06-01

    We use numerical simulations to characterize the performance of a clustering-based method to calibrate photometric redshift biases. In particular, we cross-correlate the weak lensing source galaxies from the Dark Energy Survey Year 1 sample with redMaGiC galaxies (luminous red galaxies with secure photometric redshifts) to estimate the redshift distribution of the former sample. The recovered redshift distributions are used to calibrate the photometric redshift bias of standard photo-z methods applied to the same source galaxy sample. We apply the method to two photo-z codes run on our simulated data: Bayesian Photometric Redshift and Directional Neighbourhood Fitting. We characterize the systematic uncertainties of our calibration procedure, and find that these systematic uncertainties dominate our error budget. The dominant systematics are due to our assumption of unevolving bias and clustering across each redshift bin, and to differences between the shapes of the redshift distributions derived by clustering versus photo-zs. The systematic uncertainty in the mean redshift bias of the source galaxy sample is Δz ≲ 0.02, though the precise value depends on the redshift bin under consideration. We discuss possible ways to mitigate the impact of our dominant systematics in future analyses.

  7. Characterizing model uncertainties in the life cycle of lignocellulose-based ethanol fuels.

    PubMed

    Spatari, Sabrina; MacLean, Heather L

    2010-11-15

    Renewable and low carbon fuel standards being developed at federal and state levels require an estimation of the life cycle carbon intensity (LCCI) of candidate fuels that can substitute for gasoline, such as second generation bioethanol. Estimating the LCCI of such fuels with a high degree of confidence requires the use of probabilistic methods to account for known sources of uncertainty. We construct life cycle models for the bioconversion of agricultural residue (corn stover) and energy crops (switchgrass) and explicitly examine uncertainty using Monte Carlo simulation. Using statistical methods to identify significant model variables from public data sets and Aspen Plus chemical process models, we estimate stochastic life cycle greenhouse gas (GHG) emissions for the two feedstocks combined with two promising fuel conversion technologies. The approach can be generalized to other biofuel systems. Our results show potentially high and uncertain GHG emissions for switchgrass-ethanol due to uncertain CO₂ flux from land use change and N₂O flux from N fertilizer. However, corn stover-ethanol, with its low-in-magnitude, tight-in-spread LCCI distribution, shows considerable promise for reducing life cycle GHG emissions relative to gasoline and corn-ethanol. Coproducts are important for reducing the LCCI of all ethanol fuels we examine.
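
    The stochastic LCCI calculation amounts to summing uncertain life cycle stages. The sketch below uses invented distributions (the study's actual inputs come from public data sets and Aspen Plus process models) to show how a skewed N2O term and a wide land-use-change term can dominate the spread:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
# hypothetical component distributions, gCO2e per MJ ethanol
farming    = rng.normal(15.0, 3.0, n)
n2o_flux   = rng.lognormal(np.log(8.0), 0.5, n)    # fertilizer N2O, skewed
luc_co2    = rng.triangular(-5.0, 10.0, 40.0, n)   # land-use change CO2, wide
conversion = rng.normal(12.0, 2.0, n)
coproduct  = rng.normal(-6.0, 1.5, n)              # coproduct credit

lcci = farming + n2o_flux + luc_co2 + conversion + coproduct
lo, med, hi = np.percentile(lcci, [5, 50, 95])
print(f"LCCI gCO2e/MJ: median {med:.1f}, 90% interval [{lo:.1f}, {hi:.1f}]")
```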

  8. Challenges and regulatory considerations in the acoustic measurement of high-frequency (>20 MHz) ultrasound.

    PubMed

    Nagle, Samuel M; Sundar, Guru; Schafer, Mark E; Harris, Gerald R; Vaezy, Shahram; Gessert, James M; Howard, Samuel M; Moore, Mary K; Eaton, Richard M

    2013-11-01

    This article examines the challenges associated with making acoustic output measurements at high ultrasound frequencies (>20 MHz) in the context of regulatory considerations contained in the US Food and Drug Administration industry guidance document for diagnostic ultrasound devices. Error sources in the acoustic measurement, including hydrophone calibration and spatial averaging, nonlinear distortion, and mechanical alignment, are evaluated, and the limitations of currently available acoustic measurement instruments are discussed. An uncertainty analysis of acoustic intensity and power measurements is presented, and an example uncertainty calculation is carried out for a hypothetical 30-MHz high-frequency ultrasound system. This analysis concludes that the estimated measurement uncertainty of the acoustic intensity is +73%/-86%, and the uncertainty in the mechanical index is +37%/-43%. These values exceed the corresponding levels of 30% and 15% in the Food and Drug Administration guidance document, which are more representative of the measurement uncertainty associated with characterizing lower-frequency ultrasound systems. Recommendations made for minimizing the measurement uncertainty include implementing a mechanical positioning system that has sufficient repeatability and precision, reconstructing the time-pressure waveform via deconvolution using the hydrophone frequency response, and correcting for hydrophone spatial averaging.

  9. Towards a Global Aerosol Climatology: Preliminary Trends in Tropospheric Aerosol Amounts and Corresponding Impact on Radiative Forcing between 1950 and 1990

    NASA Technical Reports Server (NTRS)

    Tegen, Ina; Koch, Dorothy; Lacis, Andrew A.; Sato, Makiko

    1999-01-01

    A global aerosol climatology is needed in the study of decadal temperature change due to natural and anthropogenic forcing of global climate change. A preliminary aerosol climatology has been developed from global transport models for a mixture of sulfate and carbonaceous aerosols from fossil fuel burning, including also contributions from other major aerosol types such as soil dust and sea salt. The aerosol distributions change over the period 1950 to 1990 due to changes in emissions of SO2 and carbon particles from fossil fuel burning. The optical thickness of fossil fuel derived aerosols increased by nearly a factor of 3 during this period, with a particularly strong increase in eastern Asia over the whole time period. In countries where environmental laws came into effect from the early 1980s (e.g. the US and western Europe), emissions and consequently aerosol optical thicknesses did not increase considerably after 1980, resulting in a shift in the global distribution pattern over this period. In addition to the optical thickness, aerosol single scattering albedos may have changed during this period due to different trends in absorbing black carbon and reflecting sulfate aerosols. However, due to the uncertainties in the emission trends, this change cannot be determined with any confidence. Radiative forcing of this aerosol distribution is calculated for several scenarios, resulting in a wide range of uncertainties for top-of-atmosphere (TOA) forcings. Uncertainties in the contribution of the strongly absorbing black carbon aerosol lead to a range in TOA forcings of ca. -0.5 to +0.1 W m^-2, while the change in aerosol distributions between 1950 and 1990 leads to a change of -0.1 to -0.3 W m^-2 for fossil fuel derived aerosol with a "moderate" contribution of black carbon aerosol.

  10. Quantifying the impact of the longitudinal dispersion coefficient parameter uncertainty on the physical transport processes in rivers

    NASA Astrophysics Data System (ADS)

    Camacho Suarez, V. V.; Shucksmith, J.; Schellart, A.

    2016-12-01

    Analytical and numerical models can be used to represent the advection-dispersion processes governing the transport of pollutants in rivers (Fan et al., 2015; Van Genuchten et al., 2013). Simplifications, assumptions and parameter estimations in these models result in various uncertainties within the modelling process and in the estimated pollutant concentrations. In this study, we explore both: 1) the structural uncertainty due to the one-dimensional simplification of the Advection Dispersion Equation (ADE), and 2) the parameter uncertainty due to the semi-empirical estimation of the longitudinal dispersion coefficient. The relative significance of these uncertainties has not previously been examined. By analysing both the relative structural uncertainty of analytical solutions of the ADE and the parameter uncertainty due to the longitudinal dispersion coefficient via a Monte Carlo analysis, an evaluation of the dominant uncertainties over a range of spatial scales is presented for a case study in the river Chillan, Chile.
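
    For the parameter-uncertainty part, the analytical solution of the 1-D ADE can simply be re-evaluated over Monte Carlo draws of the longitudinal dispersion coefficient. A minimal sketch with illustrative values (an instantaneous point-source solution and a lognormal prior on D; not the Chillan data):

```python
import numpy as np

def ade_point_source(x, t, D, u=0.6, M=10.0, A=5.0):
    """1-D ADE concentration [kg/m3] downstream of an instantaneous spill."""
    return M / (A * np.sqrt(4 * np.pi * D * t)) * np.exp(-(x - u * t) ** 2 / (4 * D * t))

rng = np.random.default_rng(11)
D = rng.lognormal(np.log(2.0), 0.5, 5_000)     # uncertain dispersion coeff. [m2/s]
c_peak = ade_point_source(x=2_000.0, t=2_000.0 / 0.6, D=D)
lo, med, hi = np.percentile(c_peak, [5, 50, 95])
print(f"peak conc. 2 km downstream: {med:.2e} (90% band {lo:.2e} to {hi:.2e})")
```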

  11. Projecting malaria hazard from climate change in eastern Africa using large ensembles to estimate uncertainty.

    PubMed

    Leedale, Joseph; Tompkins, Adrian M; Caminade, Cyril; Jones, Anne E; Nikulin, Grigory; Morse, Andrew P

    2016-03-31

    The effect of climate change on the spatiotemporal dynamics of malaria transmission is studied using an unprecedented ensemble of climate projections, employing three diverse bias correction and downscaling techniques, in order to partially account for uncertainty in climate-driven malaria projections. These large climate ensembles drive two dynamical and spatially explicit epidemiological malaria models to provide future hazard projections for the focus region of eastern Africa. While the two malaria models produce very distinct transmission patterns for the recent climate, their response to future climate change is similar in terms of sign and spatial distribution, with malaria transmission moving to higher altitudes in the East African Community (EAC) region, while transmission reduces in lowland, marginal transmission zones such as South Sudan. The climate model ensemble generally projects warmer and wetter conditions over EAC. The simulated malaria response appears to be driven by temperature rather than precipitation effects. This reduces the uncertainty due to the climate models, as precipitation trends in tropical regions are very diverse, projecting both drier and wetter conditions with the current state-of-the-art climate model ensemble. The magnitude of the projected changes differed considerably between the two dynamical malaria models, with one much more sensitive to climate change, highlighting that uncertainty in the malaria projections is also associated with the disease modelling approach.

  12. Remote Sensing of Tropical Ecosystems: Atmospheric Correction and Cloud Masking Matter

    NASA Technical Reports Server (NTRS)

    Hilker, Thomas; Lyapustin, Alexei I.; Tucker, Compton J.; Sellers, Piers J.; Hall, Forrest G.; Wang, Yujie

    2012-01-01

    Tropical rainforests are significant contributors to the global cycles of energy, water and carbon. As a result, monitoring of the vegetation status over regions such as Amazonia has been a long-standing interest of Earth scientists trying to determine the effect of climate change and anthropogenic disturbance on the tropical ecosystems and their feedback on the Earth's climate. Satellite-based remote sensing is the only practical approach for observing the vegetation dynamics of regions like the Amazon over useful spatial and temporal scales, but recent years have seen much controversy over satellite-derived vegetation states in Amazônia, with studies predicting opposite feedbacks depending on data processing technique and interpretation. Recent results suggest that some of this uncertainty could stem from a lack of quality in atmospheric correction and cloud screening. In this paper, we assess these uncertainties by comparing the current standard surface reflectance products (MYD09, MYD09GA) and derived composites (MYD09A1, MCD43A4 and MYD13A2 - Vegetation Index) from the Moderate Resolution Imaging Spectroradiometer (MODIS) onboard the Aqua satellite to results obtained from the Multi-Angle Implementation of Atmospheric Correction (MAIAC) algorithm. MAIAC uses a new cloud screening technique, and novel aerosol retrieval and atmospheric correction procedures which are based on time-series and spatial analyses. Our results show considerable improvements of MAIAC-processed surface reflectance compared to MYD09/MYD13, with noise levels reduced by a factor of up to 10. Uncertainties in the current MODIS surface reflectance product were mainly due to residual cloud and aerosol contamination, which affected the Normalized Difference Vegetation Index (NDVI): during the wet season, with cloud cover ranging between 90 percent and 99 percent, conventionally processed NDVI was significantly depressed due to undetected clouds. A smaller reduction in NDVI due to increased aerosol levels was observed during the dry season, with an inverse dependence of NDVI on aerosol optical thickness (AOT). NDVI observations processed with MAIAC showed highly reproducible and stable inter-annual patterns with little or no dependence on cloud cover, and no significant dependence on AOT (p < 0.05). In addition to a better detection of cloudy pixels, MAIAC obtained about 20-80 percent more cloud-free pixels, depending on season, a considerable amount for land analysis given the very high cloud cover (75-99 percent) observed at any given time in the area. We conclude that a new generation of atmospheric correction algorithms, such as MAIAC, can help to dramatically improve vegetation estimates over tropical rain forest, ultimately leading to reduced uncertainties in satellite-derived vegetation products globally.

  13. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    NASA Technical Reports Server (NTRS)

    Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, this calculation approach produces an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from the exercise of this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from or upgraded to a serious risk designation on the basis of consideration of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.
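
    The essence of the construct: treat the covariance itself as uncertain, recompute Pc for each covariance draw, and report the resulting distribution rather than a single number. The sketch below is a deliberately simplified 2-D stand-in (isotropic covariance, lognormal scale factor, invented numbers), not the CARA algorithm:

```python
import numpy as np

rng = np.random.default_rng(2024)

def pc_2d(miss, sigma, hbr, n=200_000):
    """P(relative position falls inside the hard-body circle), by sampling."""
    xy = rng.normal(0.0, sigma, size=(n, 2)) + np.array([miss, 0.0])
    return np.mean(np.hypot(xy[:, 0], xy[:, 1]) < hbr)

# Outer loop: uncertainty in the covariance itself -> a PDF of Pc values
scales = rng.lognormal(mean=0.0, sigma=0.3, size=200)
pc_samples = np.array([pc_2d(miss=300.0, sigma=120.0 * s, hbr=20.0) for s in scales])
lo, med, hi = np.percentile(pc_samples, [5, 50, 95])
print(f"Pc median {med:.2e}, 90% interval [{lo:.2e}, {hi:.2e}]")
```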

  14. Medical retirement from sport after concussions

    PubMed Central

    Davis-Hayes, Cecilia; Baker, David R.; Bottiglieri, Thomas S.; Levine, William N.; Desai, Natasha; Gossett, James D.

    2018-01-01

    Purpose of review In patients with a considerable history of sports-related concussion, the decision of when to discontinue participation in sports due to medical concerns including neurologic disorders has potentially life-altering consequences, especially for young athletes, and merits a comprehensive evaluation involving nuanced discussion. Few resources exist to aid the sports medicine provider. Recent findings In this narrative review, we describe 10 prototypical vignettes based upon the authors' collective experience in concussion management and propose an algorithm to help clinicians navigate retirement discussions. Issues for consideration include absolute and relative contraindications to return to sport, ranging from clinical or radiographic evidence of lasting neurologic injury to prolonged concussion recovery periods or reduced injury threshold to patient-centered factors including personal identity through sport, financial motivations, and navigating uncertainty in the context of long-term risks. Summary The authors propose a novel treatment algorithm based on real patient cases to guide medical retirement decisions after concussion in sport. PMID:29517059

  15. A bottom-up approach in estimating the measurement uncertainty and other important considerations for quantitative analyses in drug testing for horses.

    PubMed

    Leung, Gary N W; Ho, Emmie N M; Kwok, W Him; Leung, David K K; Tang, Francis P W; Wan, Terence S M; Wong, April S Y; Wong, Colton H F; Wong, Jenny K Y; Yu, Nola H

    2007-09-07

    Quantitative determination, particularly for threshold substances in biological samples, is much more demanding than qualitative identification. A proper assessment of any quantitative determination is the measurement uncertainty (MU) associated with the determined value. The International Standard ISO/IEC 17025, "General requirements for the competence of testing and calibration laboratories", has more prescriptive requirements on the MU than the document it superseded, ISO/IEC Guide 25. Under both the 1999 and 2005 versions of the new standard, an estimation of the MU is mandatory for all quantitative determinations. To comply with the new requirement, a protocol was established in the authors' laboratory in 2001. The protocol has since evolved based on our practical experience, and a refined version was adopted in 2004. This paper describes our approach to establishing the MU, as well as some other important considerations, for the quantification of threshold substances in biological samples as applied in the area of doping control for horses. The testing of threshold substances can be viewed as a compliance test (or testing to a specified limit). As such, it should only be necessary to establish the MU at the threshold level. The steps in the "Bottom-Up" approach we adopted are similar to those described in the EURACHEM/CITAC guide, "Quantifying Uncertainty in Analytical Measurement". They involve first specifying the measurand, including the relationship between the measurand and the input quantities upon which it depends. This is followed by identifying all applicable uncertainty contributions using a "cause and effect" diagram. The magnitude of each uncertainty component is then calculated and converted to a standard uncertainty. A recovery study is also conducted to determine whether the method bias is significant and whether a recovery (or correction) factor needs to be applied. All standard uncertainties with values greater than 30% of the largest one are then used to derive the combined standard uncertainty. Finally, an expanded uncertainty is calculated at the 99% one-tailed confidence level by multiplying the standard uncertainty by an appropriate coverage factor (k). A sample is considered positive if the determined concentration of the threshold substance exceeds its threshold by more than the expanded uncertainty. In addition, other important considerations, which can have a significant impact on quantitative analyses, are presented.
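
    Numerically, the final steps of such a bottom-up budget are short. A minimal sketch with hypothetical components, assuming uncorrelated relative standard uncertainties combined in quadrature, the >30%-of-largest screening rule described above, and a one-tailed 99% coverage factor:

```python
import math

threshold = 2.0          # hypothetical threshold concentration, ug/mL
measured = 2.35          # determined concentration, ug/mL
k = 2.33                 # coverage factor, ~99% one-tailed

# hypothetical relative standard uncertainties (calibration, recovery, precision)
components = {"calibration": 0.030, "recovery": 0.045, "precision": 0.050}
# keep only components greater than 30% of the largest, per the protocol above
largest = max(components.values())
kept = [u for u in components.values() if u > 0.3 * largest]

u_combined = measured * math.sqrt(sum(u * u for u in kept))  # quadrature sum
U_expanded = k * u_combined
print(f"U = {U_expanded:.3f}; positive: {measured - U_expanded > threshold}")
```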

  16. Using cost-benefit concepts in design floods improves communication of uncertainty

    NASA Astrophysics Data System (ADS)

    Ganora, Daniele; Botto, Anna; Laio, Francesco; Claps, Pierluigi

    2017-04-01

    Flood frequency analysis, i.e. the study of the relationship between the magnitude and the rarity of high flows in a river, is the usual procedure adopted to assess flood hazard, preliminary to the planning and design of flood protection measures. It rests on fitting a probability distribution to the peak discharge values recorded at gauging stations, so the final estimates over a region are affected by uncertainty, due to limited sample availability and to the possible alternatives in terms of the probabilistic model and the parameter estimation methods used. In the last decade, the scientific community has dealt with this issue by developing a number of methods to quantify such uncertainty components. Usually, uncertainty is visually represented through confidence bands, which are easy to understand but have not yet been demonstrated to be useful for design purposes: they tend to disorient decision makers, as the design flood is no longer uniquely defined, leaving the decision process undetermined. These considerations motivated the development of the uncertainty-compliant design flood estimator (UNCODE) procedure (Botto et al., 2014), which allows one to select meaningful flood design values accounting for the associated uncertainty by considering additional constraints based on cost-benefit criteria. This method suggests an explicit multiplication factor that corrects the traditional (uncertainty-free) design flood estimates to incorporate the effects of uncertainty at the same safety level. Even though the UNCODE method was developed for design purposes, it is a powerful and robust tool for clarifying the effects of uncertainty in statistical estimation. As the procedure produces increased design flood estimates, this outcome demonstrates how uncertainty leads to more expensive flood protection measures, or to insufficiency of current defenses. Moreover, the UNCODE approach can be used to assess the "value" of data, as the costs of flood prevention can decrease when uncertainty is reduced through longer observed flood records. As the multiplication factor is dimensionless, the examples of application provided show how this approach allows simple comparisons of the effects of uncertainty in different catchments, helping to build ranking procedures for planning purposes. REFERENCES Botto, A., Ganora, D., Laio, F., and Claps, P.: Uncertainty compliant design flood estimation, Water Resources Research, 50, doi:10.1002/2013WR014981, 2014.
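
    The UNCODE multiplication factor itself is derived in Botto et al. (2014); the sketch below is not that derivation, only a rough illustration of the underlying idea. It fits a Gumbel distribution to a synthetic record of annual peaks, bootstraps the 100-year design quantile, and reports the ratio between an uncertainty-aware value (here, an arbitrary high percentile of the bootstrap distribution) and the traditional point estimate:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    peaks = stats.gumbel_r.rvs(loc=500, scale=150, size=40, random_state=rng)  # synthetic record

    T = 100                        # return period (years)
    p = 1 - 1 / T

    def design_flood(sample):
        loc, scale = stats.gumbel_r.fit(sample)       # maximum-likelihood fit
        return stats.gumbel_r.ppf(p, loc, scale)      # T-year quantile

    q_point = design_flood(peaks)

    # Bootstrap the sampling uncertainty of the design quantile.
    boot = np.array([design_flood(rng.choice(peaks, size=peaks.size, replace=True))
                     for _ in range(1000)])
    q_safe = np.percentile(boot, 90)  # uncertainty-aware value; the 90% level is illustrative

    print(f"point estimate: {q_point:.0f}, uncertainty-aware: {q_safe:.0f}, "
          f"multiplication factor: {q_safe / q_point:.2f}")
    ```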

  17. Propagation of stage measurement uncertainties to streamflow time series

    NASA Astrophysics Data System (ADS)

    Horner, Ivan; Le Coz, Jérôme; Renard, Benjamin; Branger, Flora; McMillan, Hilary

    2016-04-01

    Streamflow uncertainties due to stage measurement errors are generally overlooked in the promising probabilistic approaches that have emerged in the last decade. We introduce an original error model for propagating stage uncertainties through a stage-discharge rating curve within a Bayesian probabilistic framework. The method takes into account both rating curve uncertainty (parametric and structural errors) and stage uncertainty (systematic and non-systematic errors). Practical ways to estimate the different types of stage errors are also presented: (1) non-systematic errors due to instrument resolution and precision and to non-stationary waves, and (2) systematic errors due to gauge calibration against the staff gauge. The method is illustrated at a site where the rating-curve-derived streamflow can be compared with an accurate streamflow reference. The agreement between the two time series is satisfying overall. Moreover, the quantification of uncertainty is also satisfying, since the streamflow reference is compatible with the streamflow uncertainty intervals derived from the rating curve and the stage uncertainties. Illustrations from other sites are also presented. Results contrast sharply depending on site features. In some cases, streamflow uncertainty is mainly due to stage measurement errors. The results also show the importance of discriminating between systematic and non-systematic stage errors, especially for long-term flow averages. Perspectives for improving and validating the streamflow uncertainty estimates are finally discussed.
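
    A simplified Monte Carlo sketch of the systematic/non-systematic distinction the authors emphasize (the rating curve, error magnitudes, and stage series are all assumed, not taken from the paper): systematic stage errors are drawn once per replicate, non-systematic errors independently at each time step, and both are propagated through a power-law rating curve.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed rating curve Q = a * (h - b)**c and a synthetic daily stage series (m).
    a, b, c = 40.0, 0.20, 1.6
    stage = 1.0 + 0.3 * np.sin(np.linspace(0, 6 * np.pi, 365))

    n_mc = 5000
    sigma_nonsys = 0.01   # per-time-step errors: resolution, precision, waves (m)
    sigma_sys = 0.02      # per-series offset: staff-gauge calibration (m)

    # Systematic error: ONE draw per replicate, shared by all time steps.
    sys_err = rng.normal(0, sigma_sys, size=(n_mc, 1))
    nonsys_err = rng.normal(0, sigma_nonsys, size=(n_mc, stage.size))

    h = stage + sys_err + nonsys_err
    q = a * np.clip(h - b, 0, None) ** c        # propagate through the curve

    print("95% interval, daily Q at t=0:", np.percentile(q[:, 0], [2.5, 97.5]).round(1))
    print("95% interval, annual-mean Q:", np.percentile(q.mean(axis=1), [2.5, 97.5]).round(1))
    # Non-systematic errors largely average out over the year; the systematic
    # offset does not, which is why it dominates long-term flow averages.
    ```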

  18. Planning: Management of predictability and uncertainty and keeping abreast of developments

    NASA Technical Reports Server (NTRS)

    Bastien-Thiry, Christophe; Verfaillie, Gerard

    1993-01-01

    The purpose of this study is to propose a method for setting up and controlling a space mission plan such as that of the HERMES spaceplane. The interest of this subject, beyond its complexity, lies in the need to manage imprecision and uncertainty during a mission, as well as changes between missions. Under these conditions, the set-up and control of a flight plan require special attention, and this has led us to define a number of required qualities: mastery of complexity in order to resolve conflicts between activities (configuration, resource, and time management); consideration of various criteria such as risk minimization or the attainment of mission objectives; robustness and flexibility to allow for hazards and deviations from the norm during operation without having to draw up new plans; aptness for replanning by making changes to the plan without having to set up the whole plan again; and a memorization and explanation facility in order to manage developments between missions.

  19. Lunar Navigation Architecture Design Considerations

    NASA Technical Reports Server (NTRS)

    D'Souza, Christopher; Getchius, Joel; Holt, Greg; Moreau, Michael

    2009-01-01

    The NASA Constellation Program is aiming to establish a long-term presence on the lunar surface. The Constellation elements (Orion, Altair, Earth Departure Stage, and Ares launch vehicles) will require a lunar navigation architecture for navigation state updates during lunar-class missions. Orion in particular has baselined earth-based ground direct tracking as the primary source for much of its absolute navigation needs. However, due to the uncertainty in the lunar navigation architecture, the Orion program has had to make certain assumptions about the capabilities of such architectures in order to adequately scale the vehicle design trade space. The following paper outlines lunar navigation requirements, the Orion program assumptions, and the impacts of these assumptions on the lunar navigation architecture design. The selection of potential sites was based upon geometric baselines, logistical feasibility, redundancy, and abort support capability. Simulated navigation covariances mapped to entry-interface flight-path-angle uncertainties were used to evaluate knowledge errors. A minimum ground station architecture was identified consisting of Goldstone, Madrid, Canberra, Santiago, Hartebeeshoek, Dongora, Hawaii, Guam, and Ascension Island (or the geometric equivalent).

  20. Crossing Science-Policy-Societal Boundaries to Reduce Scientific and Institutional Uncertainty in Small-Scale Fisheries.

    PubMed

    Sutton, Abigail M; Rudd, Murray A

    2016-10-01

    The governance of small-scale fisheries (SSF) is challenging due to the uncertainty, complexity, and interconnectedness of social, political, ecological, and economic processes. Conventional SSF management has focused on a centralized and top-down approach. A major criticism of conventional management is the over-reliance on 'expert science' to guide decision-making and poor consideration of fishers' contextually rich knowledge, which is thought to exacerbate the already low governance potential of SSF. Integrating scientific knowledge with fishers' knowledge is increasingly popular and is often assumed to help reduce levels of biophysical and institutional uncertainty. Many projects aimed at encouraging knowledge integration have, however, been unsuccessful. Our objective in this research was to assess factors that influence knowledge integration and the uptake of integrated knowledge into policy-making. We report results from 54 semi-structured interviews with SSF researchers and practitioners from around the globe. Our analysis is framed in terms of scientific credibility, societal legitimacy, and policy saliency, and we discuss cases that have been partially or fully successful in reducing uncertainty via push- and pull-oriented boundary-crossing initiatives. Our findings suggest that two important factors affect the science-policy-societal boundary: a lack of consensus among stakeholders about what constitutes credible knowledge, and institutional uncertainty resulting from shifting policies and leadership change. A lack of training for scientific leaders and an apparent 'shelf-life' for community organizations highlight the importance of ongoing institutional support for knowledge integration projects. Institutional support may be enhanced through investments such as capacity building and specialized platforms for knowledge integration.

  2. Uncertainty Analysis of Air Radiation for Lunar Return Shock Layers

    NASA Technical Reports Server (NTRS)

    Kleb, Bil; Johnston, Christopher O.

    2008-01-01

    By leveraging a new uncertainty markup technique, two risk analysis methods are used to compute the uncertainty of lunar-return shock layer radiation predicted by the High temperature Aerothermodynamic Radiation Algorithm (HARA). The effects of epistemic uncertainty, or uncertainty due to a lack of knowledge, are considered for the following modeling parameters: atomic line oscillator strengths, atomic line Stark broadening widths, atomic photoionization cross sections, negative ion photodetachment cross sections, molecular band oscillator strengths, and electron impact excitation rates. First, a simplified shock layer problem consisting of two constant-property equilibrium layers is considered. The results of this simplified problem show that the atomic nitrogen oscillator strengths and Stark broadening widths in both the vacuum ultraviolet and infrared spectral regions, along with the negative ion continuum, are the dominant uncertainty contributors. Next, three variable-property stagnation-line shock layer cases are analyzed: a typical lunar return case and two Fire II cases. For the near-equilibrium lunar return and Fire II 1643-second cases, the resulting uncertainties are very similar to the simplified case. Conversely, the relatively nonequilibrium 1636-second case shows significantly larger influence from electron impact excitation rates of both atoms and molecules. For all cases, the total uncertainty in radiative heat flux to the wall due to epistemic uncertainty in modeling parameters is 30%, as opposed to the erroneously small levels (plus or minus 6%) found when treating model parameter uncertainties as aleatory (due to chance) instead of epistemic (due to lack of knowledge).

  3. Propagation of nuclear data uncertainties for fusion power measurements

    NASA Astrophysics Data System (ADS)

    Sjöstrand, Henrik; Conroy, Sean; Helgesson, Petter; Hernandez, Solis Augusto; Koning, Arjan; Pomp, Stephan; Rochman, Dimitri

    2017-09-01

    Neutron measurements using neutron activation systems are an essential part of the diagnostic system at large fusion machines such as JET and ITER. Nuclear data is used to infer the neutron yield. Consequently, high-quality nuclear data is essential for the proper determination of the neutron yield and fusion power. However, uncertainties due to nuclear data are not fully taken into account in uncertainty analysis for neutron yield calibrations using activation foils. This paper investigates the neutron yield uncertainty due to nuclear data using the so-called Total Monte Carlo Method. The work is performed using a detailed MCNP model of the JET fusion machine; the uncertainties due to the cross-sections and angular distributions in JET structural materials, as well as the activation cross-sections in the activation foils, are analysed. It is found that a significant contribution to the neutron yield uncertainty can come from uncertainties in the nuclear data.

  4. A review of uncertainty research in impact assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, Wanda, E-mail: wanda.leung@usask.ca; Noble, Bram, E-mail: b.noble@usask.ca; Gunn, Jill, E-mail: jill.gunn@usask.ca

    2015-01-15

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then applying these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA. - Highlights: • We identified three main themes of uncertainty research in 134 papers from the scholarly literature. • The majority of research has focused on better methods for managing uncertainty in predictions. • Uncertainty disclosure is demanded of practitioners, but there is little guidance on how to do so. • There is limited theoretical explanation as to why uncertainty is avoided or not disclosed. • Conceptual, practical and theoretical guidance are required for IA uncertainty consideration.

  5. An alternative expression to the Sackur-Tetrode entropy formula for an ideal gas

    NASA Astrophysics Data System (ADS)

    Nagata, Shoichi

    2018-03-01

    An expression for the entropy of a monoatomic classical ideal gas is known as the Sackur-Tetrode equation. This pioneering investigation of about 100 years ago incorporates quantum considerations. The purpose of this paper is to provide an alternative expression for the entropy in terms of the Heisenberg uncertainty relation. The analysis is made on the basis of fluctuation theory, for a canonical system in thermal equilibrium at temperature T. The new formula indicates manifestly that the entropy of the macroscopic world can be recognized as a measure of uncertainty in the microscopic quantum world. The entropy in the Sackur-Tetrode equation can thus be re-interpreted from a different viewpoint. The emphasis is on the connection between the entropy and the uncertainty relation in quantum terms.
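
    For reference, a standard form of the Sackur-Tetrode equation for N particles of mass m in volume V at temperature T (the paper's uncertainty-relation expression is an alternative to this and is not reproduced here):

    ```latex
    S = N k_B \left[ \ln\!\left( \frac{V}{N} \left( \frac{2\pi m k_B T}{h^2} \right)^{3/2} \right) + \frac{5}{2} \right]
    ```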

  6. Uncertainty as knowledge

    PubMed Central

    Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D.

    2015-01-01

    This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108

  7. Ramadan fasting and dental treatment considerations: a review.

    PubMed

    Shaeesta, Khaleelahmed Bhavikatti; Prabhuji, M Lv; Shruthi, J R

    2015-01-01

    During the sacred month of Ramadan, Muslims abstain from the consumption of food from dawn until dusk. Extended fasting hours produce changes in the body's metabolism during this period. Many of those who fast also refrain from undergoing dental treatment due to a fear of breaking the fast. Even among health professionals, a certain amount of uncertainty prevails about the implications of treating a patient who is fasting. To help clinicians carry out safe and effective treatment without hampering a patient's religious beliefs, the present article focuses on the effect of Ramadan fasting on the body's metabolism and the ramifications for treatment aspects, including medications and dental procedures.

  8. Coastal Impact Underestimated From Rapid Sea Level Rise

    NASA Astrophysics Data System (ADS)

    Anderson, John; Milliken, Kristy; Wallace, Davin; Rodriguez, Antonio; Simms, Alexander

    2010-06-01

    A primary effect of global warming is accelerated sea level rise, which will eventually drown low-lying coastal areas, including some of the world's most populated cities. Predictions from the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) suggest that sea level may rise by as much as 0.6 meter by 2100 [Solomon et al., 2007]. However, uncertainty remains about how projected melting of the Greenland and Antarctic ice sheets will contribute to sea level rise. Further, considerable variability is introduced to these calculations due to coastal subsidence, especially along the northern Gulf of Mexico (see http://tidesandcurrents.noaa.gov/sltrends/sltrends.shtml).

  9. Design optimization under uncertainty and speed variability for a piezoelectric energy harvester powering a tire pressure monitoring sensor

    NASA Astrophysics Data System (ADS)

    Toghi Eshghi, Amin; Lee, Soobum; Kazem Sadoughi, Mohammad; Hu, Chao; Kim, Young-Cheol; Seo, Jong-Ho

    2017-10-01

    Energy harvesting (EH) technologies to power small-sized electronic devices are attracting great attention. Wasted energy in a vehicle's rotating tire has great potential to enable self-powered tire pressure monitoring sensors (TPMS). Piezoelectric energy harvesters can be used to collect vibrational energy and power such systems. Due to the harsh accelerations present in a rotating tire, a design tradeoff needs to be studied to prolong the harvester's fatigue life as well as to ensure sufficient power generation. However, a design produced by traditional deterministic design optimization (DDO) does not show reliable performance because it fails to consider various uncertainty factors (e.g., manufacturing tolerances, material properties, and loading conditions). In this study, we address a new EH design formulation that considers the uncertainty in car speed, dimensional tolerances, and material properties, and solve this design problem using reliability-based design optimization (RBDO). The RBDO problem is formulated to maximize compactness and minimize weight of a TPMS harvester while satisfying power and durability requirements. A transient analysis was performed to measure the time-varying response of the EH, such as power generation, dynamic strain, and stress. A conservative design formulation is proposed to account for the expected power at varied speeds and the stress at higher speeds. When compared to the DDO, the RBDO results show that the reliability of the EH is increased significantly by sacrificing the objective function. Finally, experimental tests were conducted to demonstrate the merits of the RBDO design over DDO.

  10. Detectability and Interpretational Uncertainties: Considerations in Gauging the Impacts of Land Disturbance on Streamflow

    EPA Science Inventory

    Hydrologic impacts of land disturbance and management can be confounded by rainfall variability. As a consequence, attempts to gauge and quantify these effects through streamflow monitoring are typically subject to uncertainties. This paper addresses the quantification and deline...

  11. Radiation Effects and Protection for Moon and Mars Missions

    NASA Technical Reports Server (NTRS)

    Parnell, Thomas A.; Watts, John W., Jr.; Armstrong, Tony W.

    1998-01-01

    Manned and robotic missions to the Earth's moon and Mars are exposed to a continuous flux of Galactic Cosmic Rays (GCR) and occasional, but intense, fluxes of Solar Energetic Particles (SEP). These natural radiations impose hazards on manned exploration and also place constraints on the design of robotic missions. The hazards to interplanetary flight crews and their uncertainties have been studied recently by a National Research Council committee (Space Studies Board 1996). Considering the present uncertainty estimates, thick spacecraft shielding would be needed for manned missions, some of which could be accomplished with onboard equipment and expendables. For manned and robotic missions, the effects of radiation on electronics, sensors, and controls require special consideration in spacecraft design. This paper describes the GCR and SEP particle fluxes, secondary particles behind shielding, uncertainties in radiobiological effects and their impact on manned spacecraft design, as well as the major effects on spacecraft equipment. The principal calculational tools and considerations to mitigate the radiation effects are discussed, and work in progress to reduce uncertainties is included.

  12. Parameter Uncertainties for a 10-Meter Ground-Based Optical Reception Station

    NASA Technical Reports Server (NTRS)

    Shaik, K.

    1990-01-01

    Performance uncertainties for a 10-m optical reception station may arise from the nature of the communications channel or from a specific technology choice. Both types of uncertainties are described in this article to develop an understanding of the limitations imposed by them and to provide a rational basis for making technical decisions. The performance at night will be considerably higher than for daytime reception.

  13. In-Vessel Tritium Retention and Removal in ITER-FEAT

    NASA Astrophysics Data System (ADS)

    Federici, G.; Brooks, J. N.; Iseli, M.; Wu, C. H.

    Erosion of the divertor and first-wall plasma-facing components, tritium uptake in the re-deposited films, and direct implantation in the armour material surfaces surrounding the plasma represent crucial physical issues that affect the design of future fusion devices. In this paper we present the derivation, and discuss the results, of current predictions of tritium inventory in ITER-FEAT due to co-deposition and implantation, and their attendant uncertainties. The armour materials currently proposed for ITER-FEAT are beryllium on the first wall, carbon-fibre composites on the divertor plate near the separatrix strike points, to withstand the high thermal loads expected during off-normal events (e.g., disruptions), and tungsten elsewhere in the divertor. Tritium co-deposition with chemically eroded carbon in the divertor, and possibly with some beryllium eroded from the first wall, is expected to represent the dominant mechanism of in-vessel tritium retention in ITER-FEAT. This demands efficient in-situ methods of mitigation and retrieval to avoid frequent outages due to reaching precautionary operating limits set by safety considerations (e.g., ~350 g of in-vessel co-deposited tritium) and for fuel economy reasons. Priority areas where further R&D work is required to narrow the remaining uncertainties are also briefly discussed.

  14. Applying ILAMB to data from several generations of the Community Land Model to assess the relative contribution of model improvements and forcing uncertainty to model-data agreement

    NASA Astrophysics Data System (ADS)

    Lawrence, D. M.; Fisher, R.; Koven, C.; Oleson, K. W.; Swenson, S. C.; Hoffman, F. M.; Randerson, J. T.; Collier, N.; Mu, M.

    2017-12-01

    The International Land Model Benchmarking (ILAMB) project is a model-data intercomparison and integration project designed to assess and help improve land models. The current package includes assessment of more than 25 land variables across more than 60 global, regional, and site-level (e.g., FLUXNET) datasets. ILAMB employs a broad range of metrics including RMSE, mean error, spatial distributions, interannual variability, and functional relationships. Here, we apply ILAMB for the purpose of assessment of several generations of the Community Land Model (CLM4, CLM4.5, and CLM5). Encouragingly, CLM5, which is the result of model development over the last several years by more than 50 researchers from 15 different institutions, shows broad improvements across many ILAMB metrics including LAI, GPP, vegetation carbon stocks, and the historical net ecosystem carbon balance among others. We will also show that considerable uncertainty arises from the historical climate forcing data used (GSWP3v1 and CRUNCEPv7). ILAMB score variations due to forcing data can be as large for many variables as that due to model structural differences. Strengths and weaknesses and persistent biases across model generations will also be presented.

  15. Monte Carlo analysis of uncertainty propagation in a stratospheric model. 2: Uncertainties due to reaction rates

    NASA Technical Reports Server (NTRS)

    Stolarski, R. S.; Butler, D. M.; Rundel, R. D.

    1977-01-01

    A concise stratospheric model was used in a Monte Carlo analysis of the propagation of reaction rate uncertainties through the calculation of an ozone perturbation due to the addition of chlorine. Two thousand Monte Carlo cases were run with 55 reaction rates being varied. Excellent convergence was obtained in the output distributions because the model is sensitive to the uncertainties in only about 10 reactions. For a 1 ppbv chlorine perturbation added to a 1.5 ppbv chlorine background, the resulting 1 sigma uncertainty on the ozone perturbation is a factor of 1.69 on the high side and 1.80 on the low side. The corresponding 2 sigma factors are 2.86 and 3.23. Results are also given for the uncertainties, due to reaction rates, in the ambient concentrations of stratospheric species.
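
    A toy version of this Monte Carlo procedure, assuming lognormal rate uncertainties (the usual convention when uncertainties are quoted as multiplicative 1-sigma factors) and a smooth stand-in function in place of the stratospheric chemistry model:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Nominal rates and multiplicative 1-sigma uncertainty factors for a few
    # hypothetical reactions (values invented for illustration).
    rate_nominal = np.array([1.0e-11, 3.0e-12, 8.0e-14, 2.0e-13])
    sigma_factor = np.array([1.3, 1.5, 2.0, 1.4])

    def ozone_perturbation(k):
        # Stand-in for the model: any smooth function of the rate constants.
        r = k / rate_nominal
        return 0.05 * r[0] * r[2] ** 0.5 / r[1] ** 0.3

    n_mc = 2000
    draws = rate_nominal * sigma_factor ** rng.normal(size=(n_mc, 4))  # lognormal
    results = np.array([ozone_perturbation(k) for k in draws])

    # Report asymmetric 1-sigma factors on the output, as in the paper.
    med = np.median(results)
    hi = np.percentile(results, 84.1) / med
    lo = med / np.percentile(results, 15.9)
    print(f"1-sigma factors on the perturbation: x{hi:.2f} high, x{lo:.2f} low")
    ```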

  16. Uncertainty based modeling of rainfall-runoff: Combined differential evolution adaptive Metropolis (DREAM) and K-means clustering

    NASA Astrophysics Data System (ADS)

    Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara

    2015-09-01

    Simulation of the rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large-scale climate signals have been proven effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling that considers different sources of uncertainty. This scheme includes three main steps: rainfall forecasting, rainfall-runoff simulation, and future runoff prediction. In the first step, data-driven models are developed and used to forecast rainfall using large-scale climate signals as rainfall predictors. Because different sources of uncertainty strongly affect the output of hydrologic models, in the second step the uncertainty associated with input data, model parameters, and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed to account for model conceptual (structural) uncertainty in real-time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by the alternative rainfall-runoff models are combined through a weighting method based on K-means clustering. Model parameter and input uncertainty are investigated using an adaptive Markov chain Monte Carlo method. Finally, the calibrated rainfall-runoff models are driven with the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is employed in a case study of the Bronx River watershed, New York City. Results of the uncertainty analysis reveal that simultaneous estimation of model parameters and input uncertainty significantly changes the probability distribution of the model parameters. It is also observed that by combining the outputs of the hydrological models using the proposed clustering scheme, the accuracy of runoff simulation in the watershed is improved by up to 50% in comparison with simulations by the individual models. Results indicate that the developed methodology not only provides reliable tools for rainfall and runoff modeling, but also adequate lead time for incorporating mitigation measures in dealing with potentially extreme runoff events and flood hazard. Results of this study can be used to identify the main factors affecting flood hazard analysis.
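
    The paper's weighting scheme is tailored to its three models; as a generic sketch of the idea of cluster-based combination (scikit-learn's KMeans is an assumption here, not necessarily the authors' tooling), one can cluster time steps by flow regime and weight each model within a cluster by its inverse error variance there:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(2)

    obs = 10 + 5 * rng.random(200)                        # synthetic observed flows
    sims = np.stack([obs + rng.normal(0, s, obs.size)     # three imperfect models
                     for s in (1.0, 2.0, 3.0)], axis=1)   # shape (time, n_models)

    # Cluster time steps by flow magnitude (e.g., low/medium/high regimes).
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit(obs.reshape(-1, 1)).labels_

    # Within each cluster, weight models inversely to their error variance there.
    combined = np.empty_like(obs)
    for cl in range(3):
        idx = labels == cl
        inv_var = 1 / ((sims[idx] - obs[idx, None]) ** 2).mean(axis=0)
        combined[idx] = sims[idx] @ (inv_var / inv_var.sum())

    rmse = lambda x: np.sqrt(((x - obs) ** 2).mean())
    print("individual:", [round(rmse(sims[:, m]), 2) for m in range(3)],
          "combined:", round(rmse(combined), 2))
    ```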

  17. Development of risk management strategies for state DOTs to effectively deal with volatile prices of transportation construction materials.

    DOT National Transportation Integrated Search

    2014-06-01

    Volatility in price of critical materials used in transportation projects, such as asphalt cement, leads to : considerable uncertainty about project cost. This uncertainty may lead to price speculation and inflated : bid prices submitted by highway c...

  18. Dynamic rating curve assessment in hydrometric stations and calculation of the associated uncertainties : Quality and monitoring indicators

    NASA Astrophysics Data System (ADS)

    Morlot, Thomas; Perret, Christian; Favre, Anne-Catherine

    2013-04-01

    Whether for safety, energy production, or regulation, water resources management is one of the main concerns of EDF (the French hydropower company). To meet these needs, EDF-DTG has operated since the 1950s a hydrometric network that includes more than 350 hydrometric stations. The data collected allow real-time monitoring of rivers (hydro-meteorological forecasts at points of interest), as well as hydrological studies and the sizing of structures. Ensuring the quality of streamflow data is a priority. A rating curve is an indirect method of estimating the discharge in rivers based on water level measurements. The discharge obtained from a rating curve is not entirely accurate, due to the constant changes of the river bed morphology, the precision of the gaugings (direct, punctual discharge measurements), and the quality of the tracing. As time goes on, the uncertainty of the discharge estimated from a rating curve "ages" and increases; the final level of uncertainty therefore remains particularly difficult to assess. Moreover, the current EDF capacity to produce a rating curve is not suited to the frequency of change of the stage-discharge relationship. The present method does not take into consideration the variation of flow conditions and the modifications of the river bed which occur due to natural processes such as erosion, sedimentation, and seasonal vegetation growth. In order to obtain the most accurate streamflow data and to improve their reliability, this study develops an original "dynamic" method to compute rating curves based on the historical gaugings of a hydrometric station. A curve is computed for each new gauging and a model of uncertainty is adjusted for each of them. The model of uncertainty takes into account the inaccuracies in the measurement of the water height, the quality of the tracing, the uncertainty of the gaugings, and the aging of the confidence intervals calculated with a variographic analysis. These rating curves make it possible to provide streamflow values that account for the variability of flow conditions, together with a model of the uncertainties resulting from the aging of the rating curves. By taking into account the variability of flow conditions and the life of the hydrometric station, this dynamic method can answer important questions in the field of hydrometry such as "How many gaugings a year must be made to produce streamflow data with an average uncertainty of X%?" and "When, and in which range of water flow, should those gaugings be made?". KEY WORDS: Uncertainty, Rating curve, Hydrometric station, Gauging, Variogram, Stream Flow
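
    As a schematic of the "aging" idea only (the numbers are invented, and a simple power-law growth of the standard error stands in for the variographic analysis), one can ask the paper's own question about gauging frequency directly:

    ```python
    import numpy as np

    sigma0 = 5.0     # assumed % uncertainty immediately after a gauging
    growth = 1.5     # assumed aging rate, % per sqrt(day)

    def discharge_sigma(days_since_gauging):
        return sigma0 + growth * np.sqrt(days_since_gauging)

    # How many gaugings per year keep the average uncertainty below a target?
    target = 10.0    # % average uncertainty
    for n_gaugings in range(1, 25):
        t = np.linspace(0, 365 / n_gaugings, 200)   # time since last gauging
        if discharge_sigma(t).mean() <= target:
            print(f"{n_gaugings} gaugings/year -> mean sigma "
                  f"{discharge_sigma(t).mean():.1f}% (meets the {target}% target)")
            break
    ```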

  19. Development of Probabilistic Flood Inundation Mapping For Flooding Induced by Dam Failure

    NASA Astrophysics Data System (ADS)

    Tsai, C.; Yeh, J. J. J.

    2017-12-01

    A primary function of flood inundation mapping is to forecast flood hazards and assess potential losses. However, uncertainties limit the reliability of inundation hazard assessments. Major sources of uncertainty should be taken into consideration by an optimal flood management strategy. This study focuses on the 20 km reach downstream of the Shihmen Reservoir in Taiwan. A dam-failure-induced flood provides the upstream boundary conditions for the flood routing. The two major sources of uncertainty considered in the hydraulic model and the flood inundation mapping are uncertainty in the dam break model and uncertainty in the roughness coefficient. The perturbance moment method is applied to a dam break model and the hydro-system model to develop probabilistic flood inundation mapping. Various numbers of uncertain variables can be considered in these models, and the variability of the outputs can be quantified. Probabilistic flood inundation mapping for dam-break-induced floods can thus be developed, with consideration of output variability, using the widely used HEC-RAS model. Different probabilistic flood inundation mappings are discussed and compared, and are expected to provide new physical insights in support of the evaluation of reservoir flooded areas of concern.

  20. An 'Observational Large Ensemble' to compare observed and modeled temperature trend uncertainty due to internal variability.

    NASA Astrophysics Data System (ADS)

    Poppick, A. N.; McKinnon, K. A.; Dunn-Sigouin, E.; Deser, C.

    2017-12-01

    Initial-condition climate model ensembles suggest that regional temperature trends can be highly variable on decadal timescales due to characteristics of internal climate variability. Accounting for trend uncertainty due to internal variability is therefore necessary to contextualize recent observed temperature changes. However, while the variability of trends in a climate model ensemble can be evaluated directly (as the spread across ensemble members), internal variability simulated by a climate model may be inconsistent with observations. Observation-based methods for assessing the role of internal variability in trend uncertainty are therefore required. Here, we use a statistical resampling approach to assess trend uncertainty due to internal variability in historical 50-year (1966-2015) winter near-surface air temperature trends over North America. We compare this estimate of trend uncertainty to the simulated trend variability in the NCAR CESM1 Large Ensemble (LENS), finding that the uncertainty in wintertime temperature trends over North America due to internal variability is largely overestimated by CESM1, on average by 32%. Our observation-based resampling approach is combined with the forced signal from LENS to produce an 'Observational Large Ensemble' (OLENS). The members of OLENS indicate a range of spatially coherent fields of temperature trends resulting from different sequences of internal variability consistent with observations. The smaller trend variability in OLENS suggests that uncertainty in the historical climate change signal in observations due to internal variability is less than LENS suggests.
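
    The construction of OLENS is described in detail by the authors elsewhere; the sketch below is only a bare-bones analogue of the resampling idea (an assumed linear forced signal, synthetic data, and a simple block bootstrap of the residual variability, refitting the trend on each pseudo-member):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    years = np.arange(1966, 2016)

    # Synthetic 'observations': an assumed forced warming plus internal variability.
    forced = 0.02 * (years - years[0])
    obs = forced + rng.normal(0, 0.5, years.size)
    residual = obs - forced                  # estimate of internal variability

    def block_bootstrap(x, block=5):
        """Resample contiguous blocks to retain short-term autocorrelation."""
        starts = rng.integers(0, x.size - block, size=x.size // block + 1)
        return np.concatenate([x[s:s + block] for s in starts])[:x.size]

    # Ensemble of plausible 50-year trends consistent with the observed variability.
    trends = [np.polyfit(years, forced + block_bootstrap(residual), 1)[0]
              for _ in range(1000)]

    print(f"trend spread (std) due to internal variability: "
          f"{np.std(trends) * 10:.3f} deg per decade")
    ```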

  1. The evolving definition of essential tremor: What are we dealing with?

    PubMed

    Louis, Elan D

    2018-01-01

    Although essential tremor (ET) is commonly encountered in clinical practice, historically there has been considerable disagreement as to how best to define it, and now, with a growing sense of its clinical complexity, how best to encapsulate it. Here, I draw attention to five issues of current uncertainty. A PubMed search conducted on June 19, 2017 crossed "essential tremor" with 9 second search terms (e.g., definition, diagnosis). There are several major issues of clinical and diagnostic uncertainty, and underlying each is a larger question about the nature of the underlying pathophysiology of ET. Does age of onset of ET matter? How much dystonia is acceptable in ET? How much in the way of "cerebellar signs" is acceptable? Are non-motor features due to the underlying disease or merely secondary to the clinical features? Is ET a single disease entity or something else? We are learning more about ET and, as a by-product of these efforts, are struggling with its definition. Further understanding of the nature of the underlying disease pathogenesis, as well as the role the cerebellum and cerebellar relays play in this process, will likely provide important clues to enable us to bring order to areas of uncertainty.

  2. Uncertainty of sensory signal explains variation of color constancy.

    PubMed

    Witzel, Christoph; van Alphen, Carlijn; Godau, Christoph; O'Regan, J Kevin

    2016-12-01

    Color constancy is the ability to recognize the color of an object (or, more generally, of a surface) under different illuminations. Without color constancy, surface color as a perceptual attribute would not be meaningful in the visual environment, where illumination changes all the time. Nevertheless, it is not obvious how color constancy is possible in the light of metamer mismatching. Surfaces that produce exactly the same sensory color signal under one illumination (metamerism) may produce utterly different sensory signals under another illumination (metamer mismatching). Here we show that this phenomenon explains to a large extent the variation of color constancy across different colors. For this purpose, color constancy was measured for different colors in an asymmetric matching task with photorealistic images. Color constancy performance was strongly correlated with the size of metamer mismatch volumes, which describe the uncertainty of the sensory signal due to metamer mismatching for a given color. The higher the uncertainty of the sensory signal, the lower the observers' color constancy. At the same time, sensory singularities, color categories, and cone ratios did not affect color constancy. The present findings not only provide considerable insight into the determinants of color constancy; they also show that metamer mismatch volumes must be taken into account when investigating color as a perceptual property of objects and surfaces.

  3. Uncertainty considerations in calibration and validation of hydrologic and water quality models

    USDA-ARS?s Scientific Manuscript database

    Hydrologic and water quality models (HWQMs) are increasingly used to support decisions on the state of various environmental issues and policy directions on present and future scenarios, at scales varying from watershed to continental levels. Uncertainty associated with such models may impact the ca...

  4. Patterns of zone management uncertainty in cotton using tarnished plant bug distributions, NDVI, soil EC, yield and thermal imagery

    USDA-ARS?s Scientific Manuscript database

    Management zones for various crops have been delineated using NDVI (Normalized Difference Vegetation Index), apparent bulk soil electrical conductivity (ECa - Veris), and yield data; however, estimations of uncertainty for these data layers are equally important considerations. The objective of this...

  5. Evaluating uncertainties in modelling the snow hydrology of the Fraser River Basin, British Columbia, Canada

    NASA Astrophysics Data System (ADS)

    Islam, Siraj Ul; Déry, Stephen J.

    2017-03-01

    This study evaluates predictive uncertainties in the snow hydrology of the Fraser River Basin (FRB) of British Columbia (BC), Canada, using the Variable Infiltration Capacity (VIC) model forced with several high-resolution gridded climate datasets. These datasets include the Canadian Precipitation Analysis and the thin-plate smoothing splines (ANUSPLIN), North American Regional Reanalysis (NARR), University of Washington (UW) and Pacific Climate Impacts Consortium (PCIC) gridded products. Uncertainties are evaluated at different stages of the VIC implementation, starting with the driving datasets, optimization of model parameters, and model calibration during cool and warm phases of the Pacific Decadal Oscillation (PDO). The inter-comparison of the forcing datasets (precipitation and air temperature) and their VIC simulations (snow water equivalent - SWE - and runoff) reveals widespread differences over the FRB, especially in mountainous regions. The ANUSPLIN precipitation shows a considerable dry bias in the Rocky Mountains, whereas the NARR winter air temperature is 2 °C warmer than the other datasets over most of the FRB. In the VIC simulations, the elevation-dependent changes in the maximum SWE (maxSWE) are more prominent at higher elevations of the Rocky Mountains, where the PCIC-VIC simulation accumulates too much SWE and ANUSPLIN-VIC yields an underestimation. Additionally, at each elevation range, the day of maxSWE varies from 10 to 20 days between the VIC simulations. The snow melting season begins early in the NARR-VIC simulation, whereas the PCIC-VIC simulation delays the melting, indicating seasonal uncertainty in SWE simulations. When compared with the observed runoff for the Fraser River main stem at Hope, BC, the ANUSPLIN-VIC simulation shows considerable underestimation of runoff throughout the water year owing to reduced precipitation in the ANUSPLIN forcing dataset. The NARR-VIC simulation yields more winter and spring runoff and earlier decline of flows in summer due to a nearly 15-day earlier onset of the FRB springtime snowmelt. Analysis of the parametric uncertainty in the VIC calibration process shows that the choice of the initial parameter range plays a crucial role in defining the model hydrological response for the FRB. Furthermore, the VIC calibration process is biased toward cool and warm phases of the PDO and the choice of proper calibration and validation time periods is important for the experimental setup. Overall the VIC hydrological response is prominently influenced by the uncertainties involved in the forcing datasets rather than those in its parameter optimization and experimental setups.

  6. Covariance propagation in spectral indices

    DOE PAGES

    Griffin, P. J.

    2015-01-09

    The dosimetry community has a history of using spectral indices to support neutron spectrum characterization and cross section validation efforts. An important aspect of this type of analysis is the proper consideration of the contribution of the spectrum uncertainty to the total uncertainty in calculated spectral indices (SIs). This study identifies deficiencies in the traditional treatment of the SI uncertainty, provides simple bounds on the spectral component of the SI uncertainty estimates, verifies that these estimates are reflected in actual applications, details a methodology that rigorously captures the spectral contribution to the uncertainty in the SI, and provides quantified examples that demonstrate the importance of properly treating the spectral contribution to the uncertainty in the SI.
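
    As a compact illustration of why the spectral term matters (this is not Griffin's formulation; the index, cross sections, and covariance are all invented), take a spectral index defined as a ratio of two spectrum-averaged cross sections and propagate a spectrum covariance matrix through it with the first-order sandwich rule:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 20                                     # energy groups

    phi = rng.random(n); phi /= phi.sum()      # group spectrum (normalized)
    sig1, sig2 = rng.random(n), rng.random(n)  # monitor cross sections (arbitrary)

    r1, r2 = sig1 @ phi, sig2 @ phi
    si = r1 / r2                               # spectral index

    # Assumed spectrum covariance: 5% uncorrelated per group plus a 3% fully
    # correlated normalization component.
    C = np.diag((0.05 * phi) ** 2) + np.outer(0.03 * phi, 0.03 * phi)

    # Sandwich rule: var(SI) = g^T C g, with g the gradient of SI w.r.t. phi.
    g = sig1 / r2 - r1 * sig2 / r2 ** 2
    var_si = g @ C @ g
    # Note: g @ phi == 0, so the fully correlated normalization component
    # cancels exactly in a ratio-type index; only the uncorrelated part remains.
    print(f"SI = {si:.3f} +/- {np.sqrt(var_si):.4f} (spectrum contribution only)")
    ```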

  7. Cost Implications of Uncertainty in CO2 Storage Resource Estimates: A Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Steven T., E-mail: sanderson@usgs.gov

    Carbon capture from stationary sources and geologic storage of carbon dioxide (CO2) is an important option to include in strategies to mitigate greenhouse gas emissions. However, the potential costs of commercial-scale CO2 storage are not well constrained, stemming from the inherent uncertainty in storage resource estimates coupled with a lack of detailed estimates of the infrastructure needed to access those resources. Storage resource estimates are highly dependent on storage efficiency values or storage coefficients, which are calculated based on ranges of uncertain geological and physical reservoir parameters. If dynamic factors (such as variability in storage efficiencies, pressure interference, and acceptable injection rates over time), reservoir pressure limitations, boundaries on migration of CO2, consideration of closed or semi-closed saline reservoir systems, and other possible constraints on the technically accessible CO2 storage resource (TASR) are accounted for, it is likely that only a fraction of the TASR could be available without incurring significant additional costs. Although storage resource estimates typically assume that any issues with pressure buildup due to CO2 injection will be mitigated by reservoir pressure management, estimates of the costs of CO2 storage generally do not include the costs of active pressure management. Production of saline waters (brines) could be essential to increasing the dynamic storage capacity of most reservoirs, but including the costs of this critical method of reservoir pressure management could increase current estimates of the costs of CO2 storage by two times, or more. Even without considering the implications for reservoir pressure management, geologic uncertainty can significantly impact CO2 storage capacities and costs, and contribute to uncertainty in carbon capture and storage (CCS) systems. Given the current state of available information and the scarcity of (data from) long-term commercial-scale CO2 storage projects, decision makers may experience considerable difficulty in ascertaining the realistic potential, the likely costs, and the most beneficial pattern of deployment of CCS as an option to reduce CO2 concentrations in the atmosphere.

  9. Data related uncertainty in near-surface vulnerability assessments for agrochemicals in the San Joaquin Valley.

    PubMed

    Loague, Keith; Blanke, James S; Mills, Melissa B; Diaz-Diaz, Ricardo; Corwin, Dennis L

    2012-01-01

    Precious groundwater resources across the United States have been contaminated due to decades-long nonpoint-source applications of agricultural chemicals. Assessing the impact of past, ongoing, and future chemical applications for large-scale agricultural operations is timely for designing best-management practices to prevent subsurface pollution. Presented here are the results from a series of regional-scale vulnerability assessments for the San Joaquin Valley (SJV). Two relatively simple indices, the retardation and attenuation factors, are used to estimate near-surface vulnerabilities based on the chemical properties of 32 pesticides and the variability of both soil characteristics and recharge rates across the SJV. The uncertainties inherent to these assessments, derived from the uncertainties within the chemical and soil databases, are estimated using first-order analyses. The results are used to screen and rank the chemicals based on mobility and leaching potential, without and with consideration of data-related uncertainties. Chemicals of historic high visibility in the SJV (e.g., atrazine, DBCP [dibromochloropropane], ethylene dibromide, and simazine) are ranked in the top half of those considered. Vulnerability maps generated for atrazine and DBCP, featured for their legacy status in the study area, clearly illustrate variations within and across the assessments. For example, the leaching potential is greater for DBCP than for atrazine, the leaching potential for DBCP is greater for the spatially variable recharge values than for the average recharge rate, and the leaching potentials for both DBCP and atrazine are greater for the annual recharge estimates than for the monthly recharge estimates. The data-related uncertainties identified in this study can be significant, targeting opportunities for improving future vulnerability assessments.
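
    The two indices have standard first-order forms; the sketch below uses those textbook forms with invented soil and chemical parameter values (they are not the paper's data):

    ```python
    import numpy as np

    # Invented soil properties for a single location.
    rho_b = 1.4    # bulk density (g/cm^3)
    f_oc = 0.01    # organic carbon fraction (-)
    theta = 0.25   # volumetric water content at field capacity (-)
    d = 100.0      # travel distance to the compliance depth (cm)
    q = 50.0       # net annual recharge (cm/yr)

    # Chemical properties: sorption coefficient Koc (cm^3/g), half-life (days).
    chemicals = {"atrazine": (100.0, 60.0), "DBCP": (70.0, 180.0)}  # illustrative

    for name, (koc, t_half) in chemicals.items():
        rf = 1 + rho_b * f_oc * koc / theta                  # retardation factor
        travel_yr = d * rf * theta / q                       # retarded travel time
        af = np.exp(-0.693 * travel_yr * 365 / t_half)       # attenuation factor
        print(f"{name}: RF = {rf:.1f}, AF = {af:.2e} (larger AF = more leachable)")
    ```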

  10. Uncertainty quantification in volumetric Particle Image Velocimetry

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Sayantan; Charonko, John; Vlachos, Pavlos

    2016-11-01

    Particle Image Velocimetry (PIV) uncertainty quantification is challenging due to coupled sources of elemental uncertainty and complex data reduction procedures in the measurement chain. Recent developments in this field have led to uncertainty estimation methods for planar PIV. However, no framework exists for three-dimensional volumetric PIV. In volumetric PIV the measurement uncertainty is a function of the reconstructed three-dimensional particle location, which in turn is very sensitive to the accuracy of the calibration mapping function. Furthermore, the iterative correction to the camera mapping function using triangulated particle locations in space (volumetric self-calibration) has its own associated uncertainty due to image noise and ghost particle reconstructions. Here we first quantify the uncertainty in the triangulated particle position, which is a function of particle detection and mapping function uncertainty. The location uncertainty is then combined with the three-dimensional cross-correlation uncertainty, which is estimated as an extension of the 2D PIV uncertainty framework. Finally, the overall measurement uncertainty is quantified using an uncertainty propagation equation. The framework is tested with both simulated and experimental cases. For the simulated cases, the variation of the estimated uncertainty with the elemental volumetric PIV error sources is also evaluated. The results show reasonable prediction of standard uncertainty with good coverage.

  11. A general model for attitude determination error analysis

    NASA Technical Reports Server (NTRS)

    Markley, F. Landis; Seidewitz, ED; Nicholson, Mark

    1988-01-01

    An overview is given of a comprehensive approach to filter and dynamics modeling for attitude determination error analysis. The models presented include both batch least-squares and sequential attitude estimation processes for both spin-stabilized and three-axis-stabilized spacecraft. The discussion includes a brief description of a dynamics model of strapdown gyros, but it does not cover other sensor models. Model parameters can be chosen to be solve-for parameters, which are assumed to be estimated as part of the determination process, or consider parameters, which are assumed to have errors but not to be estimated. The only restriction on this choice is that the time evolution of the consider parameters must not depend on any of the solve-for parameters. The result of an error analysis is an indication of the contributions of the various error sources to the uncertainties in the determination of the spacecraft solve-for parameters. The model presented gives the uncertainty due to errors in the a priori estimates of the solve-for parameters, the uncertainty due to measurement noise, the uncertainty due to dynamic noise (also known as process noise or plant noise), the uncertainty due to the consider parameters, and the overall uncertainty due to all these sources of error.

  12. Evaluation on uncertainty sources in projecting hydrological changes over the Xijiang River basin in South China

    NASA Astrophysics Data System (ADS)

    Yuan, Fei; Zhao, Chongxu; Jiang, Yong; Ren, Liliang; Shan, Hongcui; Zhang, Limin; Zhu, Yonghua; Chen, Tao; Jiang, Shanhu; Yang, Xiaoli; Shen, Hongren

    2017-11-01

    Projections of hydrological changes are associated with large uncertainties from different sources, which should be quantified for an effective implementation of water management policies adaptive to future climate change. In this study, a modeling chain framework to project future hydrological changes and the associated uncertainties in the Xijiang River basin, South China, was established. The framework consists of three emission scenarios (ESs), four climate models (CMs), four statistical downscaling (SD) methods, four hydrological modeling (HM) schemes, and four probability distributions (PDs) for extreme flow frequency analyses. A direct variance method was adopted to analyze the manner by which uncertainty sources such as ES, CM, SD, and HM affect the estimates of future evapotranspiration (ET) and streamflow, and to quantify the uncertainties of PDs in future flood and drought risk assessment. Results show that ES is one of the least important uncertainty sources in most situations. CM, in general, is the dominant uncertainty source for the projections of monthly ET and monthly streamflow during most of the annual cycle, daily streamflow below the 99.6% quantile level, and extreme low flow. SD is the predominant uncertainty source in the projections of extreme high flow, and contributes a considerable percentage of the uncertainty in monthly streamflow projections for July-September. The effects of SD in other cases are negligible. HM is a non-negligible uncertainty source that has the potential to produce much larger uncertainties for the projections of low flow and of ET in warm and wet seasons than for the projections of high flow. PD contributes a larger percentage of uncertainty in extreme flood projections than it does in extreme low flow estimates. Despite the large uncertainties in hydrological projections, this work found that future extreme low flow would undergo a considerable reduction, and that a noticeable increase in drought risk in the Xijiang River basin would be expected. Thus, the necessity of employing effective water-saving techniques and adaptive water resources management strategies for drought disaster mitigation should be addressed.
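
    The snippet below illustrates one simple way to attribute variance across the four stages of such a modelling chain, using main-effect variance shares over a hypothetical factorial array of projections; it is a sketch of the general idea, not the authors' direct variance method.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Hypothetical projections indexed by (ES, CM, SD, HM): a 3x4x4x4 factorial array
    proj = rng.normal(loc=100.0, scale=10.0, size=(3, 4, 4, 4))

    total_var = proj.var()
    for axis, name in enumerate(["ES", "CM", "SD", "HM"]):
        # Variance of the means taken over all other factors = main effect of this factor
        other_axes = tuple(a for a in range(proj.ndim) if a != axis)
        main_effect = proj.mean(axis=other_axes).var()
        print(f"{name}: {100 * main_effect / total_var:.1f}% of total variance")
    ```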

  13. Consumer-phase Salmonella enterica serovar enteritidis risk assessment for egg-containing food products.

    PubMed

    Mokhtari, Amirhossein; Moore, Christina M; Yang, Hong; Jaykus, Lee-Ann; Morales, Roberta; Cates, Sheryl C; Cowen, Peter

    2006-06-01

    We describe a one-dimensional probabilistic model of the role of domestic food handling behaviors on salmonellosis risk associated with the consumption of eggs and egg-containing foods. Six categories of egg-containing foods were defined based on the amount of egg contained in the food, whether eggs are pooled, and the degree of cooking practiced by consumers. We used bootstrap simulation to quantify uncertainty in risk estimates due to sampling error, and sensitivity analysis to identify key sources of variability and uncertainty in the model. Because of typical model characteristics such as nonlinearity, interaction between inputs, thresholds, and saturation points, Sobol's method, a novel sensitivity analysis approach, was used to identify key sources of variability. Based on the mean probability of illness, examples of foods from the food categories ranked from most to least risk of illness were: (1) home-made salad dressings/ice cream; (2) fried eggs/boiled eggs; (3) omelettes; and (4) baked foods/breads. For food categories that may include uncooked eggs (e.g., home-made salad dressings/ice cream), consumer handling conditions such as storage time and temperature after food preparation were the key sources of variability. In contrast, for food categories associated with undercooked eggs (e.g., fried/soft-boiled eggs), the initial level of Salmonella contamination and the log10 reduction due to cooking were the key sources of variability. Important sources of uncertainty varied with both the risk percentile and the food category under consideration. This work adds to previous risk assessments focused on egg production and storage practices, and provides a science-based approach to inform consumer risk communications regarding safe egg handling practices.
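
    For readers unfamiliar with Sobol's method, the hedged sketch below shows how first-order (S1) and total-order (ST) indices can be computed with the SALib package for a toy dose-response chain; the input names, bounds, and risk function are hypothetical stand-ins for the model described above.

    ```python
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    # Hypothetical inputs: initial log10 Salmonella level, storage time (h),
    # storage temperature (C), and log10 reduction due to cooking
    problem = {
        "num_vars": 4,
        "names": ["log10_N0", "storage_h", "storage_C", "log10_cook_kill"],
        "bounds": [[0, 3], [0, 24], [4, 30], [0, 7]],
    }

    X = saltelli.sample(problem, 1024)

    def risk(x):
        # Toy chain: growth during warm storage minus the cooking reduction,
        # passed through an exponential dose-response
        log_dose = x[:, 0] + 0.05 * x[:, 1] * np.maximum(x[:, 2] - 7, 0) / 10 - x[:, 3]
        return 1 - np.exp(-(10 ** log_dose) / 1e4)

    Si = sobol.analyze(problem, risk(X))
    for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
        print(f"{name}: S1={s1:.2f}, ST={st:.2f}")
    ```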

  14. Estimating the Health Effects of Greenhouse Gas Mitigation Strategies: Addressing Parametric, Model, and Valuation Challenges

    PubMed Central

    Hess, Jeremy J.; Ebi, Kristie L.; Markandya, Anil; Balbus, John M.; Wilkinson, Paul; Haines, Andy; Chalabi, Zaid

    2014-01-01

    Background: Policy decisions regarding climate change mitigation are increasingly incorporating the beneficial and adverse health impacts of greenhouse gas emission reduction strategies. Studies of such co-benefits and co-harms involve modeling approaches requiring a range of analytic decisions that affect the model output. Objective: Our objective was to assess analytic decisions regarding model framework, structure, choice of parameters, and handling of uncertainty when modeling health co-benefits, and to make recommendations for improvements that could increase policy uptake. Methods: We describe the assumptions and analytic decisions underlying models of mitigation co-benefits, examining their effects on modeling outputs, and consider tools for quantifying uncertainty. Discussion: There is considerable variation in approaches to valuation metrics, discounting methods, uncertainty characterization and propagation, and assessment of low-probability/high-impact events. There is also variable inclusion of adverse impacts of mitigation policies, and limited extension of modeling domains to include implementation considerations. Going forward, co-benefits modeling efforts should be carried out in collaboration with policy makers; these efforts should include the full range of positive and negative impacts and critical uncertainties, as well as a range of discount rates, and should explicitly characterize uncertainty. We make recommendations to improve the rigor and consistency of modeling of health co-benefits. Conclusion: Modeling health co-benefits requires systematic consideration of the suitability of model assumptions, of what should be included and excluded from the model framework, and how uncertainty should be treated. Increased attention to these and other analytic decisions has the potential to increase the policy relevance and application of co-benefits modeling studies, potentially helping policy makers to maximize mitigation potential while simultaneously improving health. Citation: Remais JV, Hess JJ, Ebi KL, Markandya A, Balbus JM, Wilkinson P, Haines A, Chalabi Z. 2014. Estimating the health effects of greenhouse gas mitigation strategies: addressing parametric, model, and valuation challenges. Environ Health Perspect 122:447–455; http://dx.doi.org/10.1289/ehp.1306744 PMID:24583270

  15. Copernicus measurement of the Jovian Lyman-alpha emission and its aeronomical significance

    NASA Technical Reports Server (NTRS)

    Atreya, S. K.; Kerr, R. B.; Upson, W. L., II; Festou, M. C.; Donahue, T. M.; Barker, E. S.; Cochran, W. D.; Bertaux, J. L.

    1982-01-01

    It is pointed out that the intensity of the Lyman-alpha emission is a good indicator of the principal aeronomical processes on the major planets. The high-resolution ultraviolet spectrometer aboard the Orbiting Astronomical Observatory Copernicus was used in April and May 1980 to detect the Jovian Lyman-alpha emission by spectroscopically discriminating it from other Doppler-shifted Lyman-alpha emissions, such as those of the geocorona and the interplanetary medium. Taking into consideration the reported emission data, it appears that an unusually large energy input due to particle precipitation in the auroral region must have been responsible for the large observed Lyman-alpha intensity during the Voyager encounter. At most other times, the observed Jovian Lyman-alpha intensity can be explained, within the range of statistical uncertainty, by a model that takes into consideration the solar EUV flux, the solar Lyman-alpha flux, the high exospheric temperature, and the eddy diffusion coefficient, without energy input from auroral sources.

  16. Addressing the paradox of the team innovation process: A review and practical considerations.

    PubMed

    Thayer, Amanda L; Petruzzelli, Alexandra; McClurg, Caitlin E

    2018-01-01

    Facilitating team innovation is paramount to promoting progress in the science, technology, engineering, and math fields, as well as advancing national health, safety, prosperity, and welfare. However, innovation teams face a unique set of challenges due to the novelty and uncertainty that is core to the definition of innovation, as well as the paradoxical nature of idea generation and idea implementation processes. These and other challenges must be overcome for innovation teams to realize their full potential for producing change. The purpose of this review is, thus, to provide insight into the unique context that these teams function within and provide an integrative, evidence-based, and practically useful, organizing heuristic that focuses on the most important considerations for facilitating team innovation. Finally, we provide practical guidance for psychologists, organizations, practitioners, scientists, educators, policymakers, and others who employ teams to produce novel, innovative solutions to today's problems. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  17. Uncertainties in Past and Future Global Water Availability

    NASA Astrophysics Data System (ADS)

    Sheffield, J.; Kam, J.

    2014-12-01

    Understanding how water availability changes on inter-annual to decadal time scales and how it may change in the future under climate change are a key part of understanding future stresses on water and food security. Historic evaluations of water availability on regional to global scales are generally based on large-scale model simulations with their associated uncertainties, in particular for long-term changes. Uncertainties are due to model errors and missing processes, parameter uncertainty, and errors in meteorological forcing data. Recent multi-model inter-comparisons and impact studies have highlighted large differences for past reconstructions, due to different simplifying assumptions in the models or the inclusion of physical processes such as CO2 fertilization. Modeling of direct anthropogenic factors such as water and land management also carry large uncertainties in their physical representation and from lack of socio-economic data. Furthermore, there is little understanding of the impact of uncertainties in the meteorological forcings that underpin these historic simulations. Similarly, future changes in water availability are highly uncertain due to climate model diversity, natural variability and scenario uncertainty, each of which dominates at different time scales. In particular, natural climate variability is expected to dominate any externally forced signal over the next several decades. We present results from multi-land surface model simulations of the historic global availability of water in the context of natural variability (droughts) and long-term changes (drying). The simulations take into account the impact of uncertainties in the meteorological forcings and the incorporation of water management in the form of reservoirs and irrigation. The results indicate that model uncertainty is important for short-term drought events, and forcing uncertainty is particularly important for long-term changes, especially uncertainty in precipitation due to reduced gauge density in recent years. We also discuss uncertainties in future projections from these models as driven by bias-corrected and downscaled CMIP5 climate projections, in the context of the balance between climate model robustness and climate model diversity.

  18. Uncertainty Assessment of Synthetic Design Hydrographs for Gauged and Ungauged Catchments

    NASA Astrophysics Data System (ADS)

    Brunner, Manuela I.; Sikorska, Anna E.; Furrer, Reinhard; Favre, Anne-Catherine

    2018-03-01

    Design hydrographs described by peak discharge, hydrograph volume, and hydrograph shape are essential for engineering tasks involving storage. Such design hydrographs are inherently uncertain as are classical flood estimates focusing on peak discharge only. Various sources of uncertainty contribute to the total uncertainty of synthetic design hydrographs for gauged and ungauged catchments. These comprise model uncertainties, sampling uncertainty, and uncertainty due to the choice of a regionalization method. A quantification of the uncertainties associated with flood estimates is essential for reliable decision making and allows for the identification of important uncertainty sources. We therefore propose an uncertainty assessment framework for the quantification of the uncertainty associated with synthetic design hydrographs. The framework is based on bootstrap simulations and consists of three levels of complexity. On the first level, we assess the uncertainty due to individual uncertainty sources. On the second level, we quantify the total uncertainty of design hydrographs for gauged catchments and the total uncertainty of regionalizing them to ungauged catchments but independently from the construction uncertainty. On the third level, we assess the coupled uncertainty of synthetic design hydrographs in ungauged catchments, jointly considering construction and regionalization uncertainty. We find that the most important sources of uncertainty in design hydrograph construction are the record length and the choice of the flood sampling strategy. The total uncertainty of design hydrographs in ungauged catchments depends on the catchment properties and is not negligible in our case.
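
    As a minimal sketch of the bootstrap idea on the first level, the snippet below resamples a hypothetical annual peak-discharge record to quantify the sampling uncertainty of a 100-year design peak, with a moment-fitted Gumbel distribution standing in for the full design-hydrograph construction.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical annual peak-discharge record (m^3/s); short records are a
    # dominant uncertainty source, as the study finds
    peaks = rng.gumbel(loc=250.0, scale=80.0, size=35)

    def design_peak(sample, T=100):
        """Fit a Gumbel distribution by moments and return the T-year quantile."""
        scale = np.sqrt(6) * sample.std(ddof=1) / np.pi
        loc = sample.mean() - 0.5772 * scale
        return loc - scale * np.log(-np.log(1 - 1 / T))

    estimates = np.array([design_peak(rng.choice(peaks, size=peaks.size, replace=True))
                          for _ in range(2000)])
    lo, hi = np.percentile(estimates, [2.5, 97.5])
    print(f"Q100 = {design_peak(peaks):.0f} m^3/s, "
          f"95% bootstrap interval [{lo:.0f}, {hi:.0f}]")
    ```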

  19. Projecting future air pollution-related mortality under a changing climate: progress, uncertainties and research needs.

    PubMed

    Madaniyazi, Lina; Guo, Yuming; Yu, Weiwei; Tong, Shilu

    2015-02-01

    Climate change may affect mortality associated with air pollutants, especially fine particulate matter (PM2.5) and ozone (O3). Projection studies of this kind involve complicated modelling approaches with uncertainties. We conducted a systematic review of research and methods for projecting future PM2.5-/O3-related mortality to identify the uncertainties and optimal approaches for handling them. A literature search was conducted in October 2013 using the electronic databases PubMed, Scopus, ScienceDirect, ProQuest, and Web of Science. The search was limited to peer-reviewed journal articles published in English from January 1980 to September 2013. Fifteen studies fulfilled the inclusion criteria. Most studies reported that an increase in climate change-induced PM2.5 and O3 may result in an increase in mortality. However, little research has been conducted in developing countries with high emissions and dense populations. Additionally, health effects induced by PM2.5 may dominate those caused by O3, yet projection studies of PM2.5-related mortality are fewer than those of O3-related mortality. There is considerable variation in the approaches of scenario-based projection studies, which makes it difficult to compare results. Multiple scenarios, models, and downscaling methods have been used to reduce uncertainties. However, few studies have discussed what the main source of uncertainty is and which uncertainty could be most effectively reduced. Projecting air pollution-related mortality requires a systematic consideration of assumptions and uncertainties, which will significantly aid policymakers in efforts to manage the potential impacts of PM2.5 and O3 on mortality in the context of climate change. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.

  20. Diffusion, Dispersion, and Uncertainty in Anisotropic Fractal Porous Media

    NASA Astrophysics Data System (ADS)

    Monnig, N. D.; Benson, D. A.

    2007-12-01

    Motivated by field measurements of aquifer hydraulic conductivity (K), recent techniques were developed to construct anisotropic fractal random fields, in which the scaling, or self-similarity parameter, varies with direction and is defined by a matrix. Ensemble numerical results are analyzed for solute transport through these 2-D "operator-scaling" fractional Brownian motion (fBm) ln(K) fields. Contrary to some analytic stochastic theories for monofractal K fields, the plume growth rates never exceed Mercado's (1967) purely stratified aquifer growth rate of plume apparent dispersivity proportional to mean distance. Apparent super-stratified growth must be the result of other demonstrable factors, such as initial plume size. The addition of large local dispersion and diffusion does not significantly change the effective longitudinal dispersivity of the plumes. In the presence of significant local dispersion or diffusion, the concentration coefficient of variation CV = σ_c/⟨c⟩ remains large at the leading edge of the plumes. This indicates that even with considerable mixing due to dispersion or diffusion, there is still substantial uncertainty in the leading edge of a plume moving in fractal porous media.

  1. The devil that we know: lead (Pb) replacement policies under conditions of scientific uncertainty

    NASA Technical Reports Server (NTRS)

    Ogunseitan, Dele; Schoenung, Julie; Saphores, Jean-Daniel; Shapiro, Andrew; Bhuie, Amrit; Kang, Hai-Yong; Nixon, Hilary; Stein, Antionette

    2003-01-01

    Engineering and economic considerations are typical driving forces behind the selection of specific chemicals used in the manufacture of consumer products. Only recently has post-consumer environmental impact become part of the major considerations during the initial phases of product design. Therefore, reactive, rather than proactive, strategies have dominated the consideration of environmental and health issues in product design.

  2. Uncertainty evaluation in normalization of isotope delta measurement results against international reference materials.

    PubMed

    Meija, Juris; Chartrand, Michelle M G

    2018-01-01

    Isotope delta measurements are normalized against international reference standards. Although multi-point normalization is becoming standard practice, existing uncertainty evaluation practices are either undocumented or incomplete. For multi-point normalization, we present errors-in-variables regression models for explicit accounting of the measurement uncertainty of the international standards along with the uncertainty attributed to their assigned values. This manuscript presents a framework to account for the uncertainty that arises due to a small number of replicate measurements, and discusses multi-laboratory data reduction while accounting for the inevitable correlations between laboratories due to the use of identical reference materials for calibration. Both frequentist and Bayesian methods of uncertainty analysis are discussed.
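
    A hedged sketch of the errors-in-variables idea is shown below using scipy's orthogonal distance regression, with hypothetical measured and assigned delta values for three reference standards; the paper's regression models additionally handle replicate-count effects and inter-laboratory correlations, which are omitted here.

    ```python
    import numpy as np
    from scipy import odr

    # Hypothetical multi-point normalization: measured deltas (x) of three
    # international standards vs. their assigned values (y), both with uncertainty
    x = np.array([-29.85, 0.45, 3.92])   # measured delta (per mil)
    sx = np.array([0.05, 0.05, 0.04])    # measurement standard uncertainties
    y = np.array([-30.03, 0.43, 3.81])   # assigned values of the standards
    sy = np.array([0.04, 0.04, 0.03])    # uncertainties of the assigned values

    linear = odr.Model(lambda beta, x: beta[0] * x + beta[1])
    data = odr.RealData(x, y, sx=sx, sy=sy)
    fit = odr.ODR(data, linear, beta0=[1.0, 0.0]).run()

    print("slope, intercept:", fit.beta)
    print("standard uncertainties:", fit.sd_beta)
    ```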

  3. Considerations for interpreting probabilistic estimates of uncertainty of forest carbon

    Treesearch

    James E. Smith; Linda S. Heath

    2000-01-01

    Quantitative estimates of carbon inventories are needed as part of nationwide attempts to reduce net release of greenhouse gases and the associated climate forcing. Naturally, an appreciable amount of uncertainty is inherent in such large-scale assessments, especially since both science and policy issues are still evolving. Decision makers need an idea of the...

  4. Consider the Alternative: The Effects of Causal Knowledge on Representing and Using Alternative Hypotheses in Judgments under Uncertainty

    ERIC Educational Resources Information Center

    Hayes, Brett K.; Hawkins, Guy E.; Newell, Ben R.

    2016-01-01

    Four experiments examined the locus of impact of causal knowledge on consideration of alternative hypotheses in judgments under uncertainty. Two possible loci were examined: overcoming neglect of the alternative when developing a representation of a judgment problem, and improving utilization of statistics associated with the alternative…

  5. Introducing Decision Making under Uncertainty and Strategic Considerations in Engineering Design

    ERIC Educational Resources Information Center

    Kosmopoulou, Georgia; Jog, Chintamani; Freeman, Margaret; Papavassiliou, Dimitrios V.

    2010-01-01

    Chemical Engineering graduates will face challenges at the workplace that even their peers who graduated a few years ago were not expected to face. One such major challenge is the management and operation of companies and plants under conditions of uncertainty and the need to make decisions in competitive situations. Modern developments in…

  6. Uncertainties in 63Ni and 55Fe determinations using liquid scintillation counting methods.

    PubMed

    Herranz, M; Idoeta, R; Abelairas, A; Legarda, F

    2012-09-01

    The implementation of (63)Ni and (55)Fe determination methods in an environmental laboratory implies their validation. In this process, the uncertainties related to these methods should be analysed. In this work, the expression of the uncertainty of results obtained using separation methods followed by liquid scintillation counting is presented. This analysis includes consideration of the uncertainties arising from the different alternatives these methods allow, as well as those specific to the individual laboratory and the competency of its operators in applying the standard ORISE (Oak Ridge Institute for Science and Education) methods. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, R.; Hong, Seungkyu K.; Kwon, Hyoung-Ahn

    We used a 3-D regional atmospheric chemistry transport model (WRF-Chem) to examine the processes that determine O3 in East Asia; in particular, we focused on O3 dry deposition, which remains uncertain owing to insufficient observational and numerical studies in East Asia. Here, we compare two widely used dry deposition parameterization schemes, Wesely and M3DRY, which are used in the WRF-Chem and CMAQ models, respectively. The O3 dry deposition velocities simulated using the two schemes under identical meteorological conditions show considerable differences (a factor of 2) due to discrepancies in the surface resistance parameterizations. The monthly mean O3 concentration differed by up to 10 ppbv. The simulated and observed dry deposition velocities were compared, showing that the Wesely scheme is consistent with the observations and successfully reproduces the observed diurnal variation. We conducted several sensitivity simulations by changing the land use data, the surface resistance of water, and the model's spatial resolution to examine the factors that affect O3 concentrations in East Asia. The model was considerably sensitive to these input parameters, which indicates a high uncertainty for such O3 dry deposition simulations. Observations are necessary to constrain the dry deposition parameterization and input data to improve East Asia air quality models.
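
    Both schemes share the resistance-in-series form v_d = 1/(r_a + r_b + r_c); the sketch below shows how differences in the surface resistance r_c propagate almost directly into the deposition velocity. The resistance values are illustrative and are not taken from either scheme.

    ```python
    def deposition_velocity(r_a, r_b, r_c):
        """Resistance-in-series form common to both schemes:
        v_d = 1 / (r_a + r_b + r_c), resistances in s/m, v_d in m/s."""
        return 1.0 / (r_a + r_b + r_c)

    # Illustrative daytime vs. nighttime values over vegetation (s/m)
    day = deposition_velocity(r_a=30.0, r_b=20.0, r_c=100.0)
    night = deposition_velocity(r_a=100.0, r_b=40.0, r_c=400.0)
    print(f"v_d day = {day * 100:.2f} cm/s, night = {night * 100:.2f} cm/s")
    # A factor-of-2 change in r_c alone shifts v_d substantially, consistent
    # with the scheme-to-scheme spread described above.
    ```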

  8. Adaptive grid based multi-objective Cauchy differential evolution for stochastic dynamic economic emission dispatch with wind power uncertainty

    PubMed Central

    Lei, Xiaohui; Wang, Chao; Yue, Dong; Xie, Xiangpeng

    2017-01-01

    Since wind power is integrated into the thermal power operation system, dynamic economic emission dispatch (DEED) has become a new challenge due to the uncertain characteristics of wind power. This paper proposes an adaptive grid based multi-objective Cauchy differential evolution (AGB-MOCDE) for solving stochastic DEED with wind power uncertainty. To properly deal with wind power uncertainty, scenarios are generated to simulate possible situations by dividing the uncertainty domain into different intervals; the probability of each interval can be calculated using the cumulative distribution function, and a stochastic DEED model can be formulated under the different scenarios. To enhance optimization efficiency, a Cauchy mutation operation is utilized to improve differential evolution by adjusting the population diversity during the evolution process, and an adaptive grid is constructed to retain the diversity distribution of the Pareto front. In view of the large number of generated scenarios, a reduction mechanism based on covariance relationships is carried out to decrease the number of scenarios, which greatly reduces the computational complexity. Moreover, a constraint-handling technique is utilized to maintain the system load balance while considering transmission losses among thermal units and wind farms; all constraint limits can be satisfied within the permitted accuracy. After the proposed method is simulated on three test systems, the obtained results reveal that, in comparison with other alternatives, the proposed AGB-MOCDE can optimize the DEED problem while handling all constraint limits, and the optimal scheme of stochastic DEED can reduce the conservatism of interval optimization, providing a more valuable optimal scheme for real-world applications. PMID:28961262
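
    The scenario-generation step described above can be sketched as follows, assuming a Weibull wind-speed distribution (a common choice, but hypothetical here): the uncertainty domain is divided into intervals and each interval's probability is obtained from the cumulative distribution function.

    ```python
    import numpy as np
    from scipy.stats import weibull_min

    # Hypothetical Weibull wind-speed model: shape k = 2, scale = 8 m/s
    k, scale = 2.0, 8.0
    edges = np.linspace(0.0, 25.0, 11)        # divide the domain into 10 intervals

    cdf = weibull_min.cdf(edges, k, scale=scale)
    probs = np.diff(cdf)                      # probability mass of each interval
    probs /= probs.sum()                      # renormalize over the truncated domain
    centers = 0.5 * (edges[:-1] + edges[1:])  # representative speed per scenario

    for v, p in zip(centers, probs):
        print(f"scenario v = {v:4.1f} m/s, probability = {p:.3f}")
    ```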

  9. Multi-Hypothesis Modelling Capabilities for Robust Data-Model Integration

    NASA Astrophysics Data System (ADS)

    Walker, A. P.; De Kauwe, M. G.; Lu, D.; Medlyn, B.; Norby, R. J.; Ricciuto, D. M.; Rogers, A.; Serbin, S.; Weston, D. J.; Ye, M.; Zaehle, S.

    2017-12-01

    Large uncertainty is often inherent in model predictions due to imperfect knowledge of how to describe the mechanistic processes (hypotheses) that a model is intended to represent. Yet this model hypothesis uncertainty (MHU) is often overlooked or informally evaluated, as methods to quantify and evaluate MHU are limited. MHU increases as models become more complex, because each additional process added to a model comes with inherent MHU as well as parametric uncertainty. With the current trend of adding more processes to Earth System Models (ESMs), we are adding uncertainty that can be quantified for parameters but not for MHU. Model inter-comparison projects do allow for some consideration of hypothesis uncertainty, but in an ad hoc and non-independent fashion. This has stymied efforts to evaluate ecosystem models against data and interpret the results mechanistically, because it is not simple to determine exactly why a model produces the results it does and to identify which model assumptions are key: models combine many sub-systems and processes, each of which may be conceptualised and represented mathematically in various ways. We present a novel modelling framework—the multi-assumption architecture and testbed (MAAT)—that automates the combination, generation, and execution of a model ensemble built with different representations of process. We will present the argument that multi-hypothesis modelling needs to be considered in conjunction with other capabilities (e.g. the Predictive Ecosystem Analyser, PEcAn) and statistical methods (e.g. sensitivity analysis, data assimilation) to aid efforts in robust data-model integration and enhance our predictive understanding of biological systems.
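
    A minimal sketch of the combinatorial core of such a framework is shown below: alternative representations of each process are enumerated and combined into an ensemble of model variants. The process names are hypothetical placeholders, not MAAT's actual process list.

    ```python
    import itertools

    # Hypothetical alternative representations of three processes
    stomatal_models = ["ball_berry", "medlyn"]
    photosynthesis = ["farquhar", "collatz"]
    soil_water_stress = ["linear", "exponential"]

    ensemble = list(itertools.product(stomatal_models, photosynthesis,
                                      soil_water_stress))
    print(f"{len(ensemble)} model variants")
    for variant in ensemble:
        # In a real framework, each tuple would select the functions that
        # compose one member of the model ensemble
        print(" + ".join(variant))
    ```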

  10. Uncertainties in the Forecasted Performance of Sediment Diversions Associated with Differences Between "Optimized" Diversion Design Criteria and the Natural Crevasse-Splay Sub-Delta Life-Cycle

    NASA Astrophysics Data System (ADS)

    Brown, G.

    2017-12-01

    Sediment diversions have been proposed as a crucial component of the restoration of coastal Louisiana. They are generally characterized as a means of creating land by mimicking natural crevasse-splay sub-delta processes. However, the criteria often promoted to optimize the performance of these diversions (i.e. large, sand-rich diversions into existing, degraded wetlands) are at odds with the natural processes that govern the development of crevasse-splay sub-deltas (typically sand-lean or sand-neutral diversions into open water). This is due in large part to the fact that these optimization criteria have been developed without consideration of the natural constraints associated with fundamental hydraulics: specifically, the conservation of mechanical energy. Although the implementation of the aforementioned optimization criteria has the potential to greatly increase the land-building capacity of a given diversion, the concomitant widespread inundation of the existing wetlands (an unavoidable consequence of diverting into a shallow, vegetated embayment) and the resultant stresses on existing wetland vegetation have the potential to dramatically accelerate the loss of these wetlands. Hence, there are inherent uncertainties in the forecasted performance of sediment diversions designed according to the criteria mentioned above. This talk details the reasons for these uncertainties, using analytic and numerical model results together with evidence from field observations and experiments. The likelihood that, in the foreseeable future, these uncertainties can be reduced, or even rationally bounded, is discussed.

  11. Bayesian characterization of uncertainty in species interaction strengths.

    PubMed

    Wolf, Christopher; Novak, Mark; Gitelman, Alix I

    2017-06-01

    Considerable effort has been devoted to the estimation of species interaction strengths. This effort has focused primarily on statistical significance testing and obtaining point estimates of parameters that contribute to interaction strength magnitudes, leaving the characterization of uncertainty associated with those estimates unconsidered. We consider a means of characterizing the uncertainty of a generalist predator's interaction strengths by formulating an observational method for estimating a predator's prey-specific per capita attack rates as a Bayesian statistical model. This formulation permits the explicit incorporation of multiple sources of uncertainty. A key insight is the informative nature of several so-called non-informative priors that have been used in modeling the sparse data typical of predator feeding surveys. We introduce to ecology a new neutral prior and provide evidence for its superior performance. We use a case study to consider the attack rates in a New Zealand intertidal whelk predator, and we illustrate not only that Bayesian point estimates can be made to correspond with those obtained by frequentist approaches, but also that estimation uncertainty as described by 95% intervals is more useful and biologically realistic using the Bayesian method. In particular, unlike in bootstrap confidence intervals, the lower bounds of the Bayesian posterior intervals for attack rates do not include zero when a predator-prey interaction is in fact observed. We conclude that the Bayesian framework provides a straightforward, probabilistic characterization of interaction strength uncertainty, enabling future considerations of both the deterministic and stochastic drivers of interaction strength and their impact on food webs.
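
    As a hedged illustration of the general approach, the sketch below uses a conjugate gamma-Poisson model for a per-survey feeding rate estimated from sparse count data; the prior shown is illustrative and is not the neutral prior introduced in the paper, and the full attack-rate model includes additional terms (e.g., handling times and prey abundances).

    ```python
    import numpy as np
    from scipy.stats import gamma

    # Hypothetical feeding-survey data: whelks observed feeding on one prey
    # species in each of n surveys (sparse counts are typical of such surveys)
    counts = np.array([0, 1, 0, 0, 2, 0, 1, 0, 0, 0])

    # Gamma(a0, rate b0) prior on the per-survey feeding rate (illustrative)
    a0, b0 = 0.5, 1.0
    a_post = a0 + counts.sum()     # conjugate update for Poisson counts
    b_post = b0 + counts.size

    posterior = gamma(a_post, scale=1.0 / b_post)
    lo, hi = posterior.ppf([0.025, 0.975])
    print(f"posterior mean = {posterior.mean():.3f}, "
          f"95% interval = [{lo:.3f}, {hi:.3f}]")
    # Unlike a bootstrap interval built from sparse counts, the lower bound
    # stays above zero once at least one feeding event has been observed.
    ```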

  12. Communicating mega-projects in the face of uncertainties: Israeli mass media treatment of the Dead Sea Water Canal.

    PubMed

    Fischhendler, Itay; Cohen-Blankshtain, Galit; Shuali, Yoav; Boykoff, Max

    2015-10-01

    Given the potential for uncertainties to influence mega-projects, this study examines how mega-projects are deliberated in the public arena. The paper traces the strategies used to promote the Dead Sea Water Canal. Findings show that the Dead Sea mega-project was encumbered by ample uncertainties. Treatment of uncertainties in early coverage was dominated by economics and raised primarily by politicians, while more contemporary media discourses have been dominated by ecological uncertainties voiced by environmental non-governmental organizations. This change in uncertainty type is explained by the changing nature of the project and by shifts in societal values over time. The study also reveals that 'uncertainty reduction' and, to a lesser degree, 'project cancellation' are still the strategies most often used to address uncertainties. Statistical analysis indicates that although uncertainties and strategies are significantly correlated, there may be other intervening variables that affect this correlation. This research therefore also contributes to wider and ongoing considerations of uncertainty in the public arena through various media representational practices. © The Author(s) 2013.

  13. Uncertainty in structural interpretation: Lessons to be learnt

    NASA Astrophysics Data System (ADS)

    Bond, Clare E.

    2015-05-01

    Uncertainty in the interpretation of geological data is an inherent element of geology. Datasets from different sources (remotely sensed seismic imagery, field data, and borehole data) are often combined and interpreted to create a geological model of the sub-surface. The data have limited resolution and spatial distribution, which results in uncertainty in the interpretation of the data and in the subsequent geological model(s) created. Methods have been investigated to determine the extent of interpretational uncertainty in a dataset, to capture and express that uncertainty, and to consider uncertainties in terms of risk. Here I review the work that has taken place and discuss best practice in accounting for uncertainties in structural interpretation workflows. Barriers to best practice are considered, including the use of software packages for interpretation. Experimental evidence suggests that minimising interpretation error through the use of geological reasoning and rules can help decrease interpretation uncertainty, through identification of inadmissible interpretations and by highlighting areas of uncertainty. Understanding expert thought processes and reasoning, including the use of visuospatial skills, during interpretation may aid in the identification of uncertainties and in the education of new geoscientists.

  14. Attributing uncertainty in streamflow simulations due to variable inputs via the Quantile Flow Deviation metric

    NASA Astrophysics Data System (ADS)

    Shoaib, Syed Abu; Marshall, Lucy; Sharma, Ashish

    2018-06-01

    Every model used to characterise a real-world process is affected by uncertainty. Selecting a suitable model is a vital aspect of engineering planning and design. Observation and input errors make the prediction of modelled responses more uncertain. Using a recently developed attribution metric, this study develops a method for analysing variability in model inputs together with model structure variability, to quantify their relative contributions in typical hydrological modelling applications. The Quantile Flow Deviation (QFD) metric is used to assess these alternate sources of uncertainty. The Australian Water Availability Project (AWAP) precipitation data for four different Australian catchments are used to analyse the impact of spatial rainfall variability on simulated streamflow variability via the QFD. The QFD metric attributes the variability in flow ensembles to uncertainty associated with the selection of a model structure and input time series. For the case study catchments, the relative contribution of input uncertainty due to rainfall is higher than that due to potential evapotranspiration, and overall input uncertainty is significant compared to model structure and parameter uncertainty. Overall, this study investigates the propagation of input uncertainty in a daily streamflow modelling scenario and demonstrates how input errors manifest across different streamflow magnitudes.
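
    The sketch below illustrates the general idea of attributing ensemble variability across flow quantiles, using the spread of quantile curves across an ensemble generated by varying one source (e.g., input rainfall replicates); this is a generic illustration, not the published QFD formula.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Hypothetical flow ensemble: rows = ensemble members (one uncertainty
    # source varied), columns = daily flows over ten years
    ensemble = np.exp(rng.normal(loc=2.0, scale=0.3, size=(50, 3650)))

    quantiles = np.linspace(0.05, 0.99, 20)
    member_q = np.quantile(ensemble, quantiles, axis=1)  # (20, 50) quantile curves

    # Spread of each flow quantile across the ensemble, as a simple deviation measure
    spread = member_q.max(axis=1) - member_q.min(axis=1)
    for q, s in zip(quantiles, spread):
        print(f"quantile {q:.2f}: deviation across ensemble = {s:.2f}")
    ```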

  15. Dark Energy Survey Year 1 Results: Cross-Correlation Redshifts - Methods and Systematics Characterization

    DOE PAGES

    Gatti, M.

    2018-02-22

    We use numerical simulations to characterize the performance of a clustering-based method to calibrate photometric redshift biases. In particular, we cross-correlate the weak lensing (WL) source galaxies from the Dark Energy Survey Year 1 (DES Y1) sample with redMaGiC galaxies (luminous red galaxies with secure photometric redshifts) to estimate the redshift distribution of the former sample. The recovered redshift distributions are used to calibrate the photometric redshift bias of standard photo-z methods applied to the same source galaxy sample. We also apply the method to three photo-z codes run on our simulated data: Bayesian Photometric Redshift (BPZ), Directional Neighborhood Fitting (DNF), and Random Forest-based photo-z (RF). We characterize the systematic uncertainties of our calibration procedure, and find that these systematic uncertainties dominate our error budget. The dominant systematics are due to our assumption of unevolving bias and clustering across each redshift bin, and to differences between the shapes of the redshift distributions derived by clustering vs. photo-z's. The systematic uncertainty in the mean redshift bias of the source galaxy sample is Δz ≲ 0.02, though the precise value depends on the redshift bin under consideration. Here, we discuss possible ways to mitigate the impact of our dominant systematics in future analyses.

  17. CO2 Fluxes Associated with Soil Organic C Stock Changes in the Mid-Continent Region of the U.S.

    NASA Astrophysics Data System (ADS)

    Ogle, S. M.; Paustian, K.; Easter, M.; Killian, K.; Williams, S.

    2005-12-01

    Regional CO2 sources and sinks need to be quantified in the terrestrial biosphere for basic understanding and policy development. Our objective was to quantify CO2 fluxes for the Mid-Continent Region of the US, including Iowa and neighboring areas in adjacent states, using a "bottom-up" simulation modeling approach. Soils represent an important potential sink for this largely agricultural region because of the limited potential for CO2 uptake and storage in woody biomass. SOC stocks were estimated to have increased during the 1990s at a rate equivalent to 3.81 Tg CO2 yr-1, but with considerable sub-regional variation due to differences in land use and management patterns. Sinks were driven by conservation tillage adoption, enrollment in the Conservation Reserve Program, and conversion of annual crops to continuous hay or pasture. The dominant source of CO2 from soils in the Mid-Continent Region was attributed to drainage and cultivation of organic soils. Uncertainties in the regional estimates were determined using a Monte Carlo analysis and an empirically based uncertainty estimator; the largest uncertainties were associated with estimating the fluxes from drained organic soils. A major research challenge is to verify the accuracy of these rates using "top-down" atmospheric budgets that are independent of the bottom-up inventory.

  18. Non-uniform dose distributions in cranial radiation therapy

    NASA Astrophysics Data System (ADS)

    Bender, Edward T.

    Radiation treatments are often delivered to patients with brain metastases. For patients who receive radiation to the entire brain, there is a risk of long-term neuro-cognitive side effects, which may be due to damage to the hippocampus. In clinical MRI and CT scans it can be difficult to identify the hippocampus, but once identified it can be partially spared from radiation dose. Using deformable image registration, we demonstrate a semi-automatic technique for obtaining an estimated location of this structure in a clinical MRI or CT scan. Deformable image registration is also a useful tool in other areas such as adaptive radiotherapy, where the radiation oncology team monitors patients during the course of treatment and adjusts the radiation treatments if necessary when the patient anatomy changes. Deformable image registration is used in this setting, but with a considerable level of uncertainty. This work represents one of many possible approaches to investigating the nature of these uncertainties using consistency metrics. We will show that metrics such as the inverse consistency error correlate with actual registration uncertainties. Specifically relating to brain metastases, this work investigates where in the brain metastases are likely to form and how the primary cancer site is related. We will show that the cerebellum is at high risk for metastases and that non-uniform dose distributions may be advantageous when delivering prophylactic cranial irradiation for patients with small cell lung cancer in complete remission.
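
    The inverse consistency error mentioned above can be sketched as follows: compose the forward and backward deformation fields and measure how far each voxel lands from where it started. The implementation below is a minimal 2-D version with bilinear interpolation, assuming displacement fields in pixel units.

    ```python
    import numpy as np
    from scipy.ndimage import map_coordinates

    def inverse_consistency_error(fwd, bwd):
        """Mean ICE for 2-D displacement fields fwd (A->B) and bwd (B->A), each of
        shape (2, H, W): compose bwd(fwd(x)) and measure the distance back to x."""
        H, W = fwd.shape[1:]
        ys, xs = np.mgrid[0:H, 0:W].astype(float)
        # Positions after the forward map
        y1, x1 = ys + fwd[0], xs + fwd[1]
        # Sample the backward field at the forward-mapped positions
        by = map_coordinates(bwd[0], [y1, x1], order=1, mode="nearest")
        bx = map_coordinates(bwd[1], [y1, x1], order=1, mode="nearest")
        # Residual displacement after the round trip
        return np.mean(np.hypot(y1 + by - ys, x1 + bx - xs))

    # Sanity check: a constant shift and its exact inverse give ICE ~ 0
    fwd = np.stack([np.full((64, 64), 2.0), np.zeros((64, 64))])
    bwd = -fwd
    print(inverse_consistency_error(fwd, bwd))
    ```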

  19. A polynomial chaos ensemble hydrologic prediction system for efficient parameter inference and robust uncertainty assessment

    NASA Astrophysics Data System (ADS)

    Wang, S.; Huang, G. H.; Baetz, B. W.; Huang, W.

    2015-11-01

    This paper presents a polynomial chaos ensemble hydrologic prediction system (PCEHPS) for efficient and robust uncertainty assessment of model parameters and predictions, in which possibilistic reasoning is infused into probabilistic parameter inference with simultaneous consideration of randomness and fuzziness. The PCEHPS is developed through a two-stage factorial polynomial chaos expansion (PCE) framework, which consists of an ensemble of PCEs to approximate the behavior of the hydrologic model, significantly speeding up the exhaustive sampling of the parameter space. Multiple hypothesis testing is then conducted to construct an ensemble of reduced-dimensionality PCEs with only the most influential terms, which is meaningful for achieving uncertainty reduction and further accelerating parameter inference. The PCEHPS is applied to the Xiangxi River watershed in China to demonstrate its validity and applicability. A detailed comparison between the HYMOD hydrologic model, the ensemble of PCEs, and the ensemble of reduced PCEs is performed in terms of accuracy and efficiency. Results reveal temporal and spatial variations in parameter sensitivities due to the dynamic behavior of hydrologic systems, and effects (magnitude and direction) of parametric interactions that depend on the hydrological metric considered. The case study demonstrates that the PCEHPS is capable not only of capturing both expert knowledge and probabilistic information in the calibration process, but also of running more than 10 times faster than the hydrologic model without compromising predictive accuracy.
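
    As a hedged sketch of the PCE surrogate idea, the snippet below fits probabilists' Hermite polynomials to a toy one-parameter model by least squares and then evaluates the cheap surrogate exhaustively; the PCEHPS itself uses a factorial ensemble of multi-parameter expansions, which is not reproduced here.

    ```python
    import numpy as np
    from numpy.polynomial import hermite_e as He

    rng = np.random.default_rng(3)

    def model(xi):
        """Toy stand-in for an expensive hydrologic model, one standard-normal input."""
        return np.exp(0.4 * xi) + 0.1 * xi**2

    # Fit a degree-5 PCE in probabilists' Hermite polynomials by least squares
    deg = 5
    xi_train = rng.standard_normal(200)
    V = He.hermevander(xi_train, deg)    # design matrix of He_k(xi_train)
    coeffs, *_ = np.linalg.lstsq(V, model(xi_train), rcond=None)

    # The surrogate is cheap enough for exhaustive sampling of the parameter space
    xi_test = rng.standard_normal(100_000)
    surrogate = He.hermevander(xi_test, deg) @ coeffs

    # Since E[He_k(xi)] = 0 for k >= 1 under the standard normal weight,
    # coeffs[0] approximates the model mean directly
    print(f"surrogate mean = {surrogate.mean():.4f}, PCE mean term = {coeffs[0]:.4f}")
    ```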

  20. Evolution of design considerations in complex craniofacial reconstruction using patient-specific implants.

    PubMed

    Peel, Sean; Bhatia, Satyajeet; Eggbeer, Dominic; Morris, Daniel S; Hayhurst, Caroline

    2017-06-01

    Previously published evidence has established major clinical benefits from using computer-aided design, computer-aided manufacturing, and additive manufacturing to produce patient-specific devices. These include cutting guides, drilling guides, positioning guides, and implants. However, custom devices produced using these methods are still not in routine use, particularly by the UK National Health Service. Oft-cited reasons for this slow uptake include the following: a higher up-front cost than conventionally fabricated devices, material-choice uncertainty, and a lack of long-term follow-up due to their relatively recent introduction. This article identifies a further gap in current knowledge - that of design rules, or key specification considerations for complex computer-aided design/computer-aided manufacturing/additive manufacturing devices. This research begins to address the gap by combining a detailed review of the literature with first-hand experience of interdisciplinary collaboration on five craniofacial patient case studies. In each patient case, bony lesions in the orbito-temporal region were segmented, excised, and reconstructed in the virtual environment. Three cases translated these digital plans into theatre via polymer surgical guides. Four cases utilised additive manufacturing to fabricate titanium implants. One implant was machined from polyether ether ketone. From the literature, articles with relevant abstracts were analysed to extract design considerations. In all, 19 frequently recurring design considerations were extracted from previous publications. Nine new design considerations were extracted from the case studies - on the basis of subjective clinical evaluation. These were synthesised to produce a design considerations framework to assist clinicians with prescribing and design engineers with modelling. Promising avenues for further research are proposed.

  1. Tsunami hazard assessments with consideration of uncertain earthquake characteristics

    NASA Astrophysics Data System (ADS)

    Sepulveda, I.; Liu, P. L. F.; Grigoriu, M. D.; Pritchard, M. E.

    2017-12-01

    The uncertainty quantification of tsunami assessments due to uncertain earthquake characteristics faces important challenges. First, the generated earthquake samples must be consistent with the properties observed in past events. Second, it must adopt an uncertainty propagation method that determines tsunami uncertainties at a feasible computational cost. In this study we propose a new methodology, which improves on existing tsunami uncertainty assessment methods. The methodology considers two uncertain earthquake characteristics: the slip distribution and the location. First, the methodology considers the generation of consistent earthquake slip samples by means of a Karhunen-Loeve (K-L) expansion and a translation process (Grigoriu, 2012), applicable to any non-rectangular rupture area and marginal probability distribution. The K-L expansion was recently applied by LeVeque et al. (2016). We have extended the methodology by analyzing accuracy criteria in terms of the tsunami initial conditions. Furthermore, and unlike this reference, we preserve the original probability properties of the slip distribution by avoiding post-sampling treatments such as earthquake slip scaling. Our approach is analyzed and justified in the framework of the present study. Second, the methodology uses a Stochastic Reduced Order Model (SROM) (Grigoriu, 2009) instead of a classic Monte Carlo simulation, which reduces the computational cost of the uncertainty propagation. The methodology is applied to a real case: we study tsunamis generated at the site of the 2014 Chilean earthquake, generating earthquake samples with expected magnitude Mw 8. We first demonstrate that the stochastic approach of our study generates earthquake samples consistent with the target probability laws. We also show that the results obtained from SROM are more accurate than those of classic Monte Carlo simulations. We finally validate the methodology by comparing the simulated tsunamis with the tsunami records for the 2014 Chilean earthquake. Results show that leading-wave measurements fall within the tsunami sample space. At later times, however, there are mismatches between the measured data and the simulated results, suggesting that other sources of uncertainty are as relevant as the uncertainty in the studied earthquake characteristics.
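
    A minimal sketch of the K-L sampling step is shown below for a hypothetical 1-D along-strike slip covariance; eigenmodes of the covariance matrix are truncated at 95% of the variance and combined with standard-normal coefficients. The translation step that imposes non-Gaussian marginals, and the 2-D non-rectangular rupture geometry, are omitted for brevity.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical along-strike grid (km) and exponential slip covariance
    x = np.linspace(0, 200, 101)
    corr_len, sigma, mean_slip = 40.0, 1.5, 3.0  # km, m, m
    C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

    # Karhunen-Loeve expansion: keep modes carrying 95% of the variance
    lam, phi = np.linalg.eigh(C)
    lam, phi = lam[::-1], phi[:, ::-1]           # sort descending
    k = np.searchsorted(np.cumsum(lam) / lam.sum(), 0.95) + 1

    # Draw slip samples: mean + sum_i sqrt(lam_i) * phi_i * xi_i
    xi = rng.standard_normal((k, 500))
    slips = mean_slip + phi[:, :k] @ (np.sqrt(lam[:k])[:, None] * xi)
    print(f"{k} K-L modes retained; sample std at centre = {slips[50].std():.2f} m")
    ```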

  2. Evaluating uncertainty in environmental life-cycle assessment. A case study comparing two insulation options for a Dutch one-family dwelling.

    PubMed

    Huijbregts, Mark A J; Gilijamse, Wim; Ragas, Ad M J; Reijnders, Lucas

    2003-06-01

    The evaluation of uncertainty is relatively new in environmental life-cycle assessment (LCA). It provides useful information to assess the reliability of LCA-based decisions and to guide future research toward reducing uncertainty. Most uncertainty studies in LCA quantify only one type of uncertainty, i.e., uncertainty due to input data (parameter uncertainty). However, LCA outcomes can also be uncertain due to normative choices (scenario uncertainty) and the mathematical models involved (model uncertainty). The present paper outlines a new methodology that quantifies parameter, scenario, and model uncertainty simultaneously in environmental life-cycle assessment. The procedure is illustrated in a case study that compares two insulation options for a Dutch one-family dwelling. Parameter uncertainty was quantified by means of Monte Carlo simulation. Scenario and model uncertainty were quantified by resampling different decision scenarios and model formulations, respectively. Although scenario and model uncertainty were not quantified comprehensively, the results indicate that both types of uncertainty influence the case study outcomes. This stresses the importance of quantifying parameter, scenario, and model uncertainty simultaneously. The two insulation options studied were found to have significantly different impact scores for global warming, stratospheric ozone depletion, and eutrophication. The thickest insulation option has the lowest impact on global warming and eutrophication, and the highest impact on stratospheric ozone depletion.
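
    The simultaneous treatment of the three uncertainty types can be sketched as a nested simulation: normative choices and model formulations are resampled in outer loops, with Monte Carlo parameter sampling inside. Everything in the snippet below (scenario names, model variants, the impact function) is a hypothetical stand-in for the case study's actual LCA structure.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    scenarios = ["allocation_by_mass", "allocation_by_economic_value"]  # normative choices
    models = ["linear_characterization", "nonlinear_characterization"]  # model forms

    def impact_score(scenario, model, params):
        """Toy impact calculation; real characterization factors would differ."""
        base = params["energy_use"] * params["emission_factor"]
        base *= 1.15 if scenario == "allocation_by_economic_value" else 1.0
        return base**1.05 if model == "nonlinear_characterization" else base

    results = []
    for scenario in scenarios:            # resample normative choices (scenario)
        for model in models:              # resample model formulations (model)
            for _ in range(1000):         # Monte Carlo over parameters
                params = {"energy_use": rng.lognormal(np.log(120), 0.1),
                          "emission_factor": rng.lognormal(np.log(0.5), 0.2)}
                results.append(impact_score(scenario, model, params))

    results = np.array(results)
    print(f"median = {np.median(results):.1f}, "
          f"95% range = {np.percentile(results, [2.5, 97.5])}")
    ```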

  3. Multimode squeezing, biphotons and uncertainty relations in polarization quantum optics

    NASA Technical Reports Server (NTRS)

    Karassiov, V. P.

    1994-01-01

    The concept of squeezing and uncertainty relations are discussed for multimode quantum light with the consideration of polarization. Using the polarization gauge SU(2) invariance of free electromagnetic fields, we separate the polarization and biphoton degrees of freedom from other ones, and consider uncertainty relations characterizing polarization and biphoton observables. As a consequence, we obtain a new classification of states of unpolarized (and partially polarized) light within quantum optics. We also discuss briefly some interrelations of our analysis with experiments connected with solving some fundamental problems of physics.

  4. GPS (Global Positioning System) Error Budgets, Accuracy and Applications Considerations for Test and Training Ranges.

    DTIC Science & Technology

    1982-12-01

    [OCR residue from report figures/tables. The recoverable captions describe the relationship of PDOP and HDOP with a priori altitude uncertainty in 3-dimensional navigation, and of HDOP with a priori altitude uncertainty in 2-dimensional navigation, for various satellite azimuth/elevation (AZEL) configurations.]

  5. Uncertainty Assessment: What Good Does it Do? (Invited)

    NASA Astrophysics Data System (ADS)

    Oreskes, N.; Lewandowsky, S.

    2013-12-01

    The scientific community has devoted considerable time and energy to understanding, quantifying and articulating the uncertainties related to anthropogenic climate change. However, informed decision-making and good public policy arguably rely far more on a central core of understanding of matters that are scientifically well established than on detailed understanding and articulation of all relevant uncertainties. Advocates of vaccination, for example, stress its overall efficacy in preventing morbidity and mortality--not the uncertainties over how long the protective effects last. Advocates for colonoscopy for cancer screening stress its capacity to detect polyps before they become cancerous, with relatively little attention paid to the fact that many, if not most, polyps would not become cancerous even if left unremoved. So why has the climate science community spent so much time focused on uncertainty? One reason, of course, is that articulation of uncertainty is a normal and appropriate part of scientific work. However, we argue that there is another reason that involves the pressure that the scientific community has experienced from individuals and groups promoting doubt about anthropogenic climate change. Specifically, doubt-mongering groups focus public attention on scientific uncertainty as a means to undermine scientific claims, equating uncertainty with untruth. Scientists inadvertently validate these arguments by agreeing that much of the science is uncertain, thus seemingly implying that our knowledge is insecure. The problem goes further, as the scientific community attempts to articulate more clearly, and reduce, those uncertainties, thus seemingly further agreeing that the knowledge base is insufficient to warrant public and governmental action. We refer to this effect as 'seepage,' as the effects of doubt-mongering seep into the scientific community and the scientific agenda, despite the fact that addressing these concerns does little to alter the public debate or advance public policy. We argue that attempts to address public doubts by improving uncertainty assessment are bound to fail, insofar as the motives for doubt-mongering are independent of scientific uncertainty, and therefore remain unaffected even as those uncertainties are diminished. We illustrate this claim by consideration of the evolution of the debate over the past ten years over the relationship between hurricanes and anthropogenic climate change. We suggest that scientists should pursue uncertainty assessment if such assessment improves scientific understanding, but not as a means to reduce public doubts or advance public policy in relation to anthropogenic climate change.

  6. Reusable launch vehicle model uncertainties impact analysis

    NASA Astrophysics Data System (ADS)

    Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng

    2018-03-01

    A reusable launch vehicle (RLV) has the typical characteristics of a complex aerodynamic shape coupled with the propulsion system, and its flight environment is highly complicated and intensely changeable. Its model therefore has large uncertainty, which makes the nominal system quite different from the real system. Consequently, studying the influence of these uncertainties on the stability of the control system is of great significance for controller design. In order to improve the performance of an RLV, this paper proposes an approach for analyzing the influence of the model uncertainties. For a typical RLV, the coupled dynamic and kinematic models are built. The different factors that introduce uncertainties during model building are then analyzed and summarized. After that, the model uncertainties are expressed according to an additive uncertainty model, choosing the uncertainty matrix's maximum singular value as the boundary model and using the uncertainty matrix's norm to show how much influence the uncertainty factors have on the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary and important to take the model uncertainties into consideration before designing the controller for this kind of aircraft (such as an RLV).
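
    The boundary-model step described above can be sketched as follows: sample the additive uncertainty matrix over a frequency grid and take its maximum singular value at each frequency. By the small-gain argument for additive uncertainty, a controller K with sensitivity S must then keep sigma_max(K(jw)S(jw)) below the reciprocal of this bound. The uncertainty samples below are random placeholders, not RLV data.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Hypothetical additive uncertainty Delta(jw) sampled on a frequency grid,
    # with G_true = G_nominal + Delta
    frequencies = np.logspace(-2, 2, 200)
    bound = np.empty_like(frequencies)
    for i, w in enumerate(frequencies):
        Delta = (rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))) / (1.0 + w)
        bound[i] = np.linalg.svd(Delta, compute_uv=False)[0]  # max singular value

    # Robust stability (small-gain, additive case) requires, at every frequency,
    # sigma_max(K(jw) S(jw)) * bound(w) < 1
    print("worst-case uncertainty bound:", bound.max())
    ```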

  7. Modelling uncertainties and possible future trends of precipitation and temperature for 10 sub-basins in Columbia River Basin (CRB)

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, A.; Rana, A.; Qin, Y.; Moradkhani, H.

    2014-12-01

    Trends and changes in future climatic parameters, such as precipitation and temperature, have been a central part of climate change studies. In the present work, we have analyzed the seasonal and yearly trends and prediction uncertainties in all 10 sub-basins of the Columbia River Basin (CRB) for the future time period of 2010-2099. The work is carried out using 2 different sets of statistically downscaled Global Climate Model (GCM) projection datasets, i.e. Bias Correction and Statistical Downscaling (BCSD) generated at Portland State University and the Multivariate Adaptive Constructed Analogs (MACA) generated at the University of Idaho. The analysis is done with 10 GCM downscaled products each from the CMIP5 daily dataset, totaling 40 different downscaled products for robust analysis. Summer, winter and yearly trend analysis is performed for all 10 sub-basins using linear regression (significance tested by Student's t test) and the Mann-Kendall test (0.05 significance level), for precipitation (P), maximum temperature (Tmax) and minimum temperature (Tmin). Thereafter, all the parameters are modelled for uncertainty, across all models, in all 10 sub-basins and across the CRB for future scenario periods. The results indicate varying degrees of trends for all the sub-basins, mostly pointing towards a significant increase in all three climatic parameters, for all seasonal and yearly considerations. The uncertainty analysis revealed very high variation in all the parameters across the models and sub-basins under consideration. Basin-wide uncertainty analysis is performed to corroborate results from the smaller, sub-basin scale. Similar trends and uncertainties are reported on the larger scale as well. Interestingly, both trends and uncertainties are higher during the winter period than during summer, contributing to a large part of the yearly change.
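
    A minimal sketch of the two trend tests named above, applied to a hypothetical annual precipitation series; the Mann-Kendall test is approximated here by the closely related Kendall tau test against time available in SciPy:

      import numpy as np
      from scipy import stats

      years = np.arange(2010, 2100)
      # Hypothetical annual precipitation series with an imposed upward trend (mm).
      rng = np.random.default_rng(42)
      precip = 800 + 0.8 * (years - years[0]) + rng.normal(0, 40, years.size)

      # Linear regression trend; the t-test p-value on the slope comes with linregress.
      lr = stats.linregress(years, precip)
      print(f"slope = {lr.slope:.2f} mm/yr, p = {lr.pvalue:.3f}")

      # Kendall tau against time as a stand-in for the Mann-Kendall trend test.
      tau, p_mk = stats.kendalltau(years, precip)
      print(f"tau = {tau:.2f}, p = {p_mk:.3f} (significant at 0.05: {p_mk < 0.05})")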

  8. Uncertainties of flood frequency estimation approaches based on continuous simulation using data resampling

    NASA Astrophysics Data System (ADS)

    Arnaud, Patrick; Cantet, Philippe; Odry, Jean

    2017-11-01

    Flood frequency analyses (FFAs) are needed for flood risk management. Many methods exist, ranging from classical purely statistical approaches to more complex approaches based on process simulation. The results of these methods are associated with uncertainties that are sometimes difficult to estimate due to the complexity of the approaches or the number of parameters, especially for process simulation. This is the case for the simulation-based FFA approach called SHYREG presented in this paper, in which a rainfall generator is coupled with a simple rainfall-runoff model, and the uncertainties due to the estimation of the seven parameters needed to compute flood frequencies are assessed. The six parameters of the rainfall generator are mean values, so their theoretical distribution is known and can be used to estimate the generator uncertainties. In contrast, the theoretical distribution of the single hydrological model parameter is unknown; consequently, a bootstrap method is applied to estimate the calibration uncertainties. The propagation of uncertainty from the rainfall generator to the hydrological model is also taken into account. This method is applied to 1112 basins throughout France. Uncertainties coming from the SHYREG method and from purely statistical approaches are compared, and the results are discussed according to the length of the recorded observations, basin size and basin location. Uncertainties of the SHYREG method decrease as the basin size increases or as the length of the recorded flow increases. Moreover, the results show that the confidence intervals of the SHYREG method are relatively small despite the complexity of the method and the number of parameters (seven). This is due to the stability of the parameters and takes into account the dependence of uncertainties due to the rainfall model and the hydrological calibration. Indeed, the uncertainties on the flow quantiles are of the same order of magnitude as those associated with the use of a statistical law with two parameters (here the generalised extreme value Type I distribution) and clearly lower than those associated with the use of a three-parameter law (here the generalised extreme value Type II distribution). For extreme flood quantiles, the uncertainties are mostly due to the rainfall generator because of the progressive saturation of the hydrological model.
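
    A minimal sketch of the bootstrap step described above for the single hydrological parameter, assuming a calibrate() function that fits the parameter to a sample of observed flows (both the synthetic record and calibrate() are placeholders):

      import numpy as np

      rng = np.random.default_rng(0)
      observed_flows = rng.gamma(shape=2.0, scale=50.0, size=200)  # placeholder record

      def calibrate(flows):
          # Placeholder calibration: stands in for fitting the single
          # rainfall-runoff model parameter to the observed flows.
          return flows.mean()

      # Bootstrap: recalibrate on resampled records to estimate calibration uncertainty.
      estimates = np.array([
          calibrate(rng.choice(observed_flows, size=observed_flows.size, replace=True))
          for _ in range(1000)
      ])
      lo, hi = np.percentile(estimates, [5, 95])
      print(f"parameter 90% CI: [{lo:.1f}, {hi:.1f}]")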

  9. Comment on ‘Relativistic theory of the falling retroreflector gravimeter’

    NASA Astrophysics Data System (ADS)

    Křen, Petr; Pálinkáš, Vojtech

    2018-04-01

    In the paper by Ashby (2018 Metrologia 55 1-10) the correction due to the time delay of light propagated through the prism retroreflector of absolute gravimeters is discussed. Accordingly, a correction of about -6.8 µGal should be applied for a typical gravimeter such as the most precise FG5(X) gravimeter, which declares a standard uncertainty at the level of 2 µGal. In consequence, the present gravimetric results related to the Kibble balance or the global absolute gravity reference system should be significantly changed. However, such a change needs a deeper scientific consensus. In our comment, we would like to show that the proposed correction should not be applied, since the author's consideration is incorrect.

  10. Optimization of multimagnetometer systems on a spacecraft

    NASA Technical Reports Server (NTRS)

    Neubauer, F. M.

    1975-01-01

    The problem of optimizing the position of magnetometers along a boom of given length to yield a minimized total error is investigated. The discussion is limited to at most four magnetometers, which seems to be a practical limit due to weight, power, and financial considerations. The outlined error analysis is applied to some illustrative cases. The optimal magnetometer locations, for which the total error is minimum, are computed for given boom length, instrument errors, and very conservative magnetic field models characteristic for spacecraft with only a restricted or ineffective magnetic cleanliness program. It is shown that the error contribution by the magnetometer inaccuracy is increased as the number of magnetometers is increased, whereas the spacecraft field uncertainty is diminished by an appreciably larger amount.

  11. Study of different filtering techniques applied to spectra from airborne gamma spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilhelm, Emilien; Gutierrez, Sebastien; Reboli, Anne

    2015-07-01

    One of the features of spectra obtained by airborne gamma spectrometry is low counting statistics due to the short acquisition time (1 s) and the large source-detector distance (40 m). This leads to considerable uncertainty in radionuclide identification and in the determination of their respective activities from the windows method recommended by the IAEA, especially for low-level radioactivity. The present work compares the results obtained with filters in terms of errors of the filtered spectra, both with the window method and over the whole gamma energy range. The results are used to determine which filtering technique is the most suitable in combination with some method for total stripping of the spectrum. (authors)
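
    A minimal sketch of one common filtering technique for low-count spectra, using a Savitzky-Golay filter from SciPy on a simulated noisy gamma spectrum (the spectrum itself is a synthetic placeholder, not airborne data):

      import numpy as np
      from scipy.signal import savgol_filter

      rng = np.random.default_rng(1)
      channels = np.arange(1024)
      # Placeholder spectrum: exponential continuum plus one photopeak, Poisson counts.
      expected = 50 * np.exp(-channels / 300) + 40 * np.exp(-((channels - 662) / 5) ** 2)
      spectrum = rng.poisson(expected)

      # Savitzky-Golay smoothing: local polynomial fit, which preserves peak shape
      # better than a plain moving average at comparable noise reduction.
      smoothed = savgol_filter(spectrum, window_length=15, polyorder=3)
      residual_rms = np.sqrt(np.mean((smoothed - expected) ** 2))
      print(f"RMS error of filtered spectrum vs truth: {residual_rms:.2f} counts")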

  12. Uncertainty Aware Structural Topology Optimization Via a Stochastic Reduced Order Model Approach

    NASA Technical Reports Server (NTRS)

    Aguilo, Miguel A.; Warner, James E.

    2017-01-01

    This work presents a stochastic reduced order modeling strategy for the quantification and propagation of uncertainties in topology optimization. Uncertainty aware optimization problems can be computationally complex due to the substantial number of model evaluations that are necessary to accurately quantify and propagate uncertainties. This computational complexity is greatly magnified if a high-fidelity, physics-based numerical model is used for the topology optimization calculations. Stochastic reduced order model (SROM) methods are applied here to effectively 1) alleviate the prohibitive computational cost associated with an uncertainty aware topology optimization problem; and 2) quantify and propagate the inherent uncertainties due to design imperfections. A generic SROM framework that transforms the uncertainty aware, stochastic topology optimization problem into a deterministic optimization problem that relies only on independent calls to a deterministic numerical model is presented. This approach facilitates the use of existing optimization and modeling tools to accurately solve the uncertainty aware topology optimization problems in a fraction of the computational demand required by Monte Carlo methods. Finally, an example in structural topology optimization is presented to demonstrate the effectiveness of the proposed uncertainty aware structural topology optimization approach.
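
    A minimal sketch of the SROM idea described above: the random input is replaced by a few samples with optimized probability weights, so the stochastic objective becomes a weighted sum of independent deterministic model calls (the three-sample SROM and the model are placeholders):

      import numpy as np

      # Placeholder SROM for an uncertain load: a few samples with probability
      # weights chosen so the reduced model matches the input's low moments.
      srom_samples = np.array([0.9, 1.0, 1.15])   # load multipliers
      srom_weights = np.array([0.25, 0.5, 0.25])  # sum to one

      def compliance(design, load_factor):
          # Placeholder deterministic model call (stands in for a FE analysis).
          return load_factor / max(design, 1e-9)

      def expected_compliance(design):
          # Stochastic objective -> deterministic weighted sum over SROM samples.
          return sum(w * compliance(design, s)
                     for s, w in zip(srom_samples, srom_weights))

      print(f"E[compliance] at design=0.5: {expected_compliance(0.5):.3f}")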

  13. Integrating climate change considerations into forest management tools and training

    Treesearch

    Linda M. Nagel; Christopher W. Swanston; Maria K. Janowiak

    2010-01-01

    Silviculturists are currently facing the challenge of developing management strategies that meet broad ecological and social considerations in spite of a high degree of uncertainty in future climatic conditions. Forest managers need state-of-the-art knowledge about climate change and potential impacts to facilitate development of silvicultural objectives and...

  14. Uncertainties in s -process nucleosynthesis in low mass stars determined from Monte Carlo variations

    NASA Astrophysics Data System (ADS)

    Cescutti, G.; Hirschi, R.; Nishimura, N.; den Hartogh, J. W.; Rauscher, T.; Murphy, A. St J.; Cristallo, S.

    2018-05-01

    The main s-process taking place in low mass stars produces about half of the elements heavier than iron. It is therefore very important to determine the importance and impact of nuclear physics uncertainties on this process. We have performed extensive nuclear reaction network calculations using individual and temperature-dependent uncertainties for reactions involving elements heavier than iron, within a Monte Carlo framework. Using this technique, we determined the uncertainty in the main s-process abundance predictions due to nuclear uncertainties linked to weak interactions and neutron captures on elements heavier than iron. We also identified the key nuclear reactions dominating these uncertainties. We found that β-decay rate uncertainties affect only a few nuclides near s-process branchings, whereas most of the uncertainty in the final abundances is caused by uncertainties in neutron capture rates, either directly producing or destroying the nuclide of interest. Combined total nuclear uncertainties due to reactions on heavy elements are in general small (less than 50%). Three key reactions, nevertheless, stand out because they significantly affect the uncertainties of a large number of nuclides. These are 56Fe(n,γ), 64Ni(n,γ), and 138Ba(n,γ). We discuss the prospect of reducing uncertainties in the key reactions identified in this study with future experiments.
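
    A minimal sketch of the Monte Carlo rate-variation idea on a toy two-step capture chain (a stand-in for the full network): each rate is multiplied by a log-normally distributed factor and the spread of the final abundance is collected.

      import numpy as np

      rng = np.random.default_rng(7)
      n_mc = 500

      # Toy chain A -(rate1)-> B -(rate2)-> C with nominal rates (arbitrary units).
      rate1_nom, rate2_nom, t_end, dt = 1.0, 0.5, 5.0, 0.01

      def final_b(rate1, rate2):
          # Explicit Euler integration of the two-species chain.
          a, b = 1.0, 0.0
          for _ in range(int(t_end / dt)):
              a, b = a - rate1 * a * dt, b + (rate1 * a - rate2 * b) * dt
          return b

      # Log-normal rate variation factors, roughly 30% (1 sigma) on each rate.
      factors = rng.lognormal(mean=0.0, sigma=0.3, size=(n_mc, 2))
      samples = np.array([final_b(rate1_nom * f1, rate2_nom * f2)
                          for f1, f2 in factors])
      print(f"final B abundance: median {np.median(samples):.3f}, 90% interval "
            f"[{np.percentile(samples, 5):.3f}, {np.percentile(samples, 95):.3f}]")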

  15. Assessing the potential effects and cost-effectiveness of programmatic herpes zoster vaccination of elderly in the Netherlands

    PubMed Central

    2010-01-01

    Background Herpes zoster (HZ) is a painful disease affecting a considerable part of the elderly. Programmatic HZ vaccination of elderly people may considerably reduce HZ morbidity and its related costs, but the extent of these effects is unknown. In this article, the potential effects and cost-effectiveness of programmatic HZ vaccination of elderly in the Netherlands have been assessed according to a framework that was developed to support evidence-based decision making regarding inclusion of new vaccines in the Dutch National Immunization Program. Methods An analytical framework was used combining a checklist, which structured relevant data on the vaccine, pathogen and disease, and a cost-effectiveness analysis. The cost-effectiveness analysis was performed from a societal perspective, using a Markov cohort model. Simultaneous vaccination with influenza was assumed. Results Due to the combination of waning immunity after vaccination and a reduced efficacy of vaccination at high ages, the most optimal cost-effectiveness ratio (€21,716 per QALY) for HZ vaccination in the Netherlands was found for 70-year-olds. This estimated ratio is just above the socially accepted threshold in the Netherlands of €20,000 per QALY. If additional reduction of postherpetic neuralgia was included, the cost-effectiveness ratio improved (~€10,000 per QALY) but uncertainty for this scenario is high. Conclusions Vaccination against HZ at the age of 70 years seems marginally cost-effective in the Netherlands. Due to limited vaccine efficacy a considerable part of the disease burden caused by HZ will remain, even with optimal acceptance of programmatic vaccination. PMID:20707884
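
    A minimal sketch of a Markov cohort cost-effectiveness comparison of the kind described above, with made-up transition probabilities, costs, and utilities (all values are placeholders, not the published model's inputs):

      import numpy as np

      # States: 0 healthy, 1 herpes zoster episode, 2 post-episode, 3 dead.
      # Placeholder annual transition matrices (rows sum to one).
      P_novacc = np.array([[0.975, 0.010, 0.000, 0.015],
                           [0.000, 0.000, 0.985, 0.015],
                           [0.985, 0.000, 0.000, 0.015],
                           [0.000, 0.000, 0.000, 1.000]])
      P_vacc = P_novacc.copy()
      P_vacc[0] = [0.981, 0.004, 0.000, 0.015]  # vaccine reduces HZ incidence

      cost = np.array([0.0, 1500.0, 0.0, 0.0])      # cost per state-year (EUR)
      utility = np.array([0.85, 0.60, 0.85, 0.0])   # QALY weight per state-year
      vacc_price, discount, years = 300.0, 0.04, 20

      def run(P, upfront):
          state = np.array([1.0, 0.0, 0.0, 0.0])  # cohort starts healthy
          total_cost, total_qaly = upfront, 0.0
          for t in range(years):
              d = 1.0 / (1.0 + discount) ** t
              total_cost += d * state @ cost
              total_qaly += d * state @ utility
              state = state @ P
          return total_cost, total_qaly

      c0, q0 = run(P_novacc, 0.0)
      c1, q1 = run(P_vacc, vacc_price)
      print(f"ICER: {(c1 - c0) / (q1 - q0):.0f} EUR per QALY gained")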

  16. Assessing the potential effects and cost-effectiveness of programmatic herpes zoster vaccination of elderly in the Netherlands.

    PubMed

    van Lier, Alies; van Hoek, Albert Jan; Opstelten, Wim; Boot, Hein J; de Melker, Hester E

    2010-08-13

    Herpes zoster (HZ) is a painful disease affecting a considerable part of the elderly. Programmatic HZ vaccination of elderly people may considerably reduce HZ morbidity and its related costs, but the extent of these effects is unknown. In this article, the potential effects and cost-effectiveness of programmatic HZ vaccination of elderly in the Netherlands have been assessed according to a framework that was developed to support evidence-based decision making regarding inclusion of new vaccines in the Dutch National Immunization Program. An analytical framework was used combining a checklist, which structured relevant data on the vaccine, pathogen and disease, and a cost-effectiveness analysis. The cost-effectiveness analysis was performed from a societal perspective, using a Markov cohort model. Simultaneous vaccination with influenza was assumed. Due to the combination of waning immunity after vaccination and a reduced efficacy of vaccination at high ages, the most optimal cost-effectiveness ratio (21,716 euro per QALY) for HZ vaccination in the Netherlands was found for 70-year-olds. This estimated ratio is just above the socially accepted threshold in the Netherlands of 20,000 euro per QALY. If additional reduction of postherpetic neuralgia was included, the cost-effectiveness ratio improved (approximately 10,000 euro per QALY) but uncertainty for this scenario is high. Vaccination against HZ at the age of 70 years seems marginally cost-effective in the Netherlands. Due to limited vaccine efficacy a considerable part of the disease burden caused by HZ will remain, even with optimal acceptance of programmatic vaccination.

  17. Decay heat uncertainty for BWR used fuel due to modeling and nuclear data uncertainties

    DOE PAGES

    Ilas, Germina; Liljenfeldt, Henrik

    2017-05-19

    Characterization of the energy released from radionuclide decay in nuclear fuel discharged from reactors is essential for the design, safety, and licensing analyses of used nuclear fuel storage, transportation, and repository systems. There are a limited number of decay heat measurements available for commercial used fuel applications. Because decay heat measurements can be expensive or impractical for covering the multitude of existing fuel designs, operating conditions, and specific application purposes, decay heat estimation relies heavily on computer code prediction. Uncertainty evaluation for calculated decay heat is an important aspect when assessing code prediction and a key factor supporting decision making for used fuel applications. While previous studies have largely focused on uncertainties in code predictions due to nuclear data uncertainties, this study discusses uncertainties in calculated decay heat due to uncertainties in assembly modeling parameters as well as in nuclear data. Capabilities in the SCALE nuclear analysis code system were used to quantify the effect on calculated decay heat of uncertainties in nuclear data and selected manufacturing and operation parameters for a typical boiling water reactor (BWR) fuel assembly. Furthermore, the BWR fuel assembly used as the reference case for this study was selected from a set of assemblies for which high-quality decay heat measurements are available, to assess the significance of the results through comparison with calculated and measured decay heat data.
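
    A minimal sketch of the decay heat summation that depletion/decay codes perform, for a toy three-nuclide inventory (the inventories, half-lives, and recoverable energies are illustrative placeholders):

      import numpy as np

      # Placeholder inventory: (atoms, half-life [s], recoverable energy per decay [MeV]).
      nuclides = {
          "Cs-137": (1.0e24, 30.08 * 3.156e7, 0.187 + 0.662),  # beta + Ba-137m gamma
          "Sr-90":  (8.0e23, 28.79 * 3.156e7, 0.196),
          "Co-60":  (5.0e21, 5.27 * 3.156e7, 2.824),
      }
      MEV_TO_J = 1.602e-13

      def decay_heat(t_seconds):
          # P(t) = sum_i lambda_i * N_i(0) * exp(-lambda_i * t) * Q_i
          power = 0.0
          for atoms, t_half, q_mev in nuclides.values():
              lam = np.log(2.0) / t_half
              power += lam * atoms * np.exp(-lam * t_seconds) * q_mev * MEV_TO_J
          return power

      for years in (1, 10, 50):
          print(f"decay heat after {years:>2} y: {decay_heat(years * 3.156e7):.1f} W")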

  18. Decay heat uncertainty for BWR used fuel due to modeling and nuclear data uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilas, Germina; Liljenfeldt, Henrik

    Characterization of the energy released from radionuclide decay in nuclear fuel discharged from reactors is essential for the design, safety, and licensing analyses of used nuclear fuel storage, transportation, and repository systems. There are a limited number of decay heat measurements available for commercial used fuel applications. Because decay heat measurements can be expensive or impractical for covering the multitude of existing fuel designs, operating conditions, and specific application purposes, decay heat estimation relies heavily on computer code prediction. Uncertainty evaluation for calculated decay heat is an important aspect when assessing code prediction and a key factor supporting decision making for used fuel applications. While previous studies have largely focused on uncertainties in code predictions due to nuclear data uncertainties, this study discusses uncertainties in calculated decay heat due to uncertainties in assembly modeling parameters as well as in nuclear data. Capabilities in the SCALE nuclear analysis code system were used to quantify the effect on calculated decay heat of uncertainties in nuclear data and selected manufacturing and operation parameters for a typical boiling water reactor (BWR) fuel assembly. Furthermore, the BWR fuel assembly used as the reference case for this study was selected from a set of assemblies for which high-quality decay heat measurements are available, to assess the significance of the results through comparison with calculated and measured decay heat data.

  19. Impact of Pitot tube calibration on the uncertainty of water flow rate measurement

    NASA Astrophysics Data System (ADS)

    de Oliveira Buscarini, Icaro; Costa Barsaglini, Andre; Saiz Jabardo, Paulo Jose; Massami Taira, Nilson; Nader, Gilder

    2015-10-01

    Water utility companies often use Cole-type Pitot tubes to map velocity profiles and thus measure flow rate. Frequent monitoring and measurement of flow rate is an important step in identifying leaks and other types of losses. In Brazil losses as high as 42% are common, and in some places even higher values are found. When using Cole-type Pitot tubes to measure the flow rate, the uncertainty of the calibration coefficient (Cd) is a major component of the overall flow rate measurement uncertainty. A common practice is to employ the usual value Cd = 0.869, in use since Cole proposed his Pitot tube in 1896. Analysis of 414 calibrations of Cole-type Pitot tubes shows that Cd varies considerably, and values as high as 0.020 for the expanded uncertainty are common. Combined with other uncertainty sources, the overall velocity measurement uncertainty is 0.02, increasing flow rate measurement uncertainty by 1.5%, which, for the Sao Paulo metropolitan area (Brazil), corresponds to 3.5 × 10^7 m^3/year.
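
    A minimal sketch of how the Cd uncertainty propagates into a point-velocity estimate for a Cole-type Pitot tube, v = Cd * sqrt(2*dp/rho), using first-order (GUM-style) propagation with illustrative input uncertainties:

      import math

      cd, u_cd = 0.869, 0.020 / 2   # coefficient; expanded (k=2) -> standard uncertainty
      dp, u_dp = 2000.0, 20.0       # differential pressure [Pa] and its uncertainty
      rho = 998.0                   # water density [kg/m^3]

      v = cd * math.sqrt(2.0 * dp / rho)

      # First-order propagation for v = cd * (2*dp/rho)^0.5:
      # (u_v/v)^2 = (u_cd/cd)^2 + (0.5 * u_dp/dp)^2
      rel_u_v = math.sqrt((u_cd / cd) ** 2 + (0.5 * u_dp / dp) ** 2)
      print(f"v = {v:.3f} m/s, relative standard uncertainty = {100 * rel_u_v:.2f}%")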

  20. Sea Surface Temperatures in the Indo-Pacific Warm Pool During the Early Pliocene Warm Period

    NASA Astrophysics Data System (ADS)

    Dekens, P. S.; Ravelo, A. C.; Griffith, E. M.

    2010-12-01

    The Indo-Pacific warm pool (IPWP) plays an important role in both regional and global climate, but the response of this region to anthropogenic climate change is not well understood. While the early Pliocene is not a perfect analogue for anthropogenic climate change, it is the most recent time in Earth history when global temperatures were warmer than they are today for a sustained period of time. SST in the eastern equatorial Pacific was 2-4 °C warmer in the early Pliocene compared to today. A Mg/Ca SST record at ODP site 806 in the western equatorial Pacific indicates that SSTs were stable through the last 5 Ma (Wara et al., 2005). We generated a G. sacculifer Mg/Ca record in the Indian Ocean (ODP site 758) for the last 5 Ma, which also shows that IPWP SST has remained relatively stable through the last 5 Ma and was not warmer in the early Pliocene compared to today. A recent paper suggests that the Mg/Ca of seawater may have varied through the last 5 Ma and significantly affected Mg/Ca SST estimates (Medina-Elizalde et al., 2008). However, there is considerable uncertainty in the estimates of seawater Mg/Ca variations through time. We will present a detailed examination of these uncertainties to examine the possible range of seawater Mg/Ca through the last 5 Ma. Due to the lack of culturing work on foraminifera at different Mg/Ca ratios in the growth water, there is also uncertainty in how changes in seawater Mg/Ca affect the temperature signal in the proxy. We will explore how uncertainties in the record of seawater Mg/Ca variations through time, and their effect on the Mg/Ca SST proxy, potentially influence the interpretation of the Mg/Ca SST records at ODP sites 806 and 758 in the IPWP, and ODP site 847 in the eastern equatorial Pacific. We will also explore how adjustment of the Mg/Ca SST estimates (due to reconstructed Mg/Ca seawater variations) affects the δ18O of water when adjusted Mg/Ca SST estimates are paired with δ18O measurements of the same samples.
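
    A minimal sketch of the exponential Mg/Ca paleotemperature equation with a simple power-law correction for past seawater Mg/Ca; the calibration constants and the correction exponent below are widely quoted illustrative values, not this study's calibration:

      import math

      A, B = 0.09, 0.38     # illustrative calibration constants: Mg/Ca = B*exp(A*T)
      H = 0.41              # assumed power-law sensitivity to seawater Mg/Ca

      def sst(mgca_foram, mgca_sw_past, mgca_sw_modern=5.2):
          # Correct the measured ratio for a different past seawater Mg/Ca,
          # then invert the exponential calibration: T = ln(Mg/Ca / B) / A.
          corrected = mgca_foram * (mgca_sw_modern / mgca_sw_past) ** H
          return math.log(corrected / B) / A

      mgca = 4.0  # placeholder G. sacculifer measurement (mmol/mol)
      for sw in (5.2, 4.5, 4.0):  # modern vs. two hypothetical Pliocene seawater ratios
          print(f"seawater Mg/Ca = {sw:.1f} mol/mol -> SST = {sst(mgca, sw):.1f} C")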

  1. Uncertainty in the delayed neutron fraction in fuel assembly depletion calculations

    NASA Astrophysics Data System (ADS)

    Aures, Alexander; Bostelmann, Friederike; Kodeli, Ivan A.; Velkov, Kiril; Zwermann, Winfried

    2017-09-01

    This study presents uncertainty and sensitivity analyses of the delayed neutron fraction of light water reactor and sodium-cooled fast reactor fuel assemblies. For these analyses, the sampling-based XSUSA methodology is used to propagate cross section uncertainties in neutron transport and depletion calculations. Cross section data is varied according to the SCALE 6.1 covariance library. Since this library includes nu-bar uncertainties only for the total values, it has been supplemented by delayed nu-bar uncertainties from the covariance data of the JENDL-4.0 nuclear data library. The neutron transport and depletion calculations are performed with the TRITON/NEWT sequence of the SCALE 6.1 package. The evolution of the delayed neutron fraction uncertainty over burn-up is analysed without and with the consideration of delayed nu-bar uncertainties. Moreover, the main contributors to the result uncertainty are determined. In all cases, the delayed nu-bar uncertainties increase the delayed neutron fraction uncertainty. Depending on the fuel composition, the delayed nu-bar values of uranium and plutonium in fact give the main contributions to the delayed neutron fraction uncertainty for the LWR fuel assemblies. For the SFR case, the uncertainty of the scattering cross section of U-238 is the main contributor.

  2. Stress Corrosion of Ceramic Materials

    DTIC Science & Technology

    1981-10-01

    stresses are liable to fail after an indeterminate period of time, leading to a considerable uncertainty in the safe design stress. One of the objectives...of modern ceramics technology is to reduce the uncertainty associated with structural design, and hence, to improve our capabilities of designing...processes that occur during stress corrosion cracking. Recent advances in the area of structural design with ceramic materials have led to several

  3. Coupled semivariogram uncertainty of hydrogeological and geophysical data on capture zone uncertainty analysis

    USGS Publications Warehouse

    Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.

    2008-01-01

    This study investigates capture zone uncertainty that relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of structural parameters. The probability distributions of structural parameters were further updated through the Bayesian approach with the Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and pseudo-cross-semivariogram. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in Southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. © 2008 ASCE.

  4. Quantifying model-structure- and parameter-driven uncertainties in spring wheat phenology prediction with Bayesian analysis

    DOE PAGES

    Alderman, Phillip D.; Stanfill, Bryan

    2016-10-06

    Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. This study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.
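
    A minimal sketch of a random walk Metropolis sampler of the kind described above, fitting a single thermal-time-like parameter for days-to-heading against hypothetical trial data (the model, data, and prior are all placeholders):

      import numpy as np

      rng = np.random.default_rng(3)

      # Placeholder observations: days to heading from multi-location trials.
      obs = np.array([62.0, 58.0, 65.0, 61.0, 59.0])

      def predict(theta):
          # Placeholder phenology model: one parameter sets the predicted duration.
          return np.full(obs.shape, theta)

      def log_post(theta, sigma=3.0):
          if not 30.0 < theta < 100.0:      # flat prior on a plausible range
              return -np.inf
          resid = obs - predict(theta)
          return -0.5 * np.sum((resid / sigma) ** 2)

      theta, chain = 50.0, []
      for _ in range(20000):
          prop = theta + rng.normal(0.0, 1.0)   # random walk proposal
          if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
              theta = prop                      # accept
          chain.append(theta)

      post = np.array(chain[5000:])             # discard burn-in
      print(f"posterior mean {post.mean():.1f} d, 95% CI "
            f"[{np.percentile(post, 2.5):.1f}, {np.percentile(post, 97.5):.1f}]")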

  5. A Comprehensive Analysis of Uncertainties Affecting the Stellar Mass-Halo Mass Relation for 0 < z < 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Behroozi, Peter S.; Conroy, Charlie; Wechsler, Risa H.

    2010-06-07

    We conduct a comprehensive analysis of the relationship between central galaxies and their host dark matter halos, as characterized by the stellar mass - halo mass (SM-HM) relation, with rigorous consideration of uncertainties. Our analysis focuses on results from the abundance matching technique, which assumes that every dark matter halo or subhalo above a specific mass threshold hosts one galaxy. We provide a robust estimate of the SM-HM relation for 0 < z < 1 and discuss the quantitative effects of uncertainties in observed galaxy stellar mass functions (GSMFs) (including stellar mass estimates and counting uncertainties), halo mass functions (including cosmology and uncertainties from substructure), and the abundance matching technique used to link galaxies to halos (including scatter in this connection). Our analysis results in a robust estimate of the SM-HM relation and its evolution from z=0 to z=4. The shape and evolution are well constrained for z < 1. The largest uncertainties at these redshifts are due to stellar mass estimates (0.25 dex uncertainty in normalization); however, failure to account for scatter in stellar masses at fixed halo mass can lead to errors of similar magnitude in the SM-HM relation for central galaxies in massive halos. We also investigate the SM-HM relation to z = 4, although the shape of the relation at higher redshifts remains fairly unconstrained when uncertainties are taken into account. We find that the integrated star formation at a given halo mass peaks at 10-20% of available baryons for all redshifts from 0 to 4. This peak occurs at a halo mass of 7 × 10^11 M⊙ at z = 0, and this mass increases by a factor of 5 to z = 4. At lower and higher masses, star formation is substantially less efficient, with stellar mass scaling as M* ~ Mh^2.3 at low masses and M* ~ Mh^0.29 at high masses. The typical stellar mass for halos with mass less than 10^12 M⊙ has increased by 0.3-0.45 dex since z ~ 1. These results will provide a powerful tool to inform galaxy evolution models.

  6. Uncertainty and Sensitivity of Direct Economic Flood Damages: the FloodRisk Free and Open-Source Software

    NASA Astrophysics Data System (ADS)

    Albano, R.; Sole, A.; Mancusi, L.; Cantisani, A.; Perrone, A.

    2017-12-01

    The considerable increase of flood damages in the past decades has shifted attention in Europe from protection against floods to managing flood risks. In this context, the assessment of expected damages represents crucial information within the overall flood risk management process. The present paper proposes an open source software, called FloodRisk, that is able to operatively support stakeholders in decision making processes with a what-if approach by carrying out a rapid assessment of the flood consequences, in terms of direct economic damage and loss of human lives. The evaluation of the damage scenarios, through the use of the GIS software proposed here, is essential for cost-benefit or multi-criteria analysis of risk mitigation alternatives. However, considering that the quantitative assessment of flood damage scenarios is characterized by intrinsic uncertainty, a scheme has been developed to identify and quantify the role of the input parameters in the total uncertainty of flood loss model application in urban areas with mild terrain and complex topography. Through the concept of parallel models, the contribution of different modules and input parameters to the total uncertainty is quantified. The results of the present case study have exhibited a high epistemic uncertainty in the damage estimation module and, in particular, in the type and form of the damage functions used, which have been adapted and transferred from different geographic and socio-economic contexts because there are no depth-damage functions specifically developed for Italy. Considering that uncertainty and sensitivity depend considerably on local characteristics, the epistemic uncertainty associated with the risk estimate is reduced by introducing additional information into the risk analysis. In light of the obtained results, the need to produce and disseminate (open) data for developing micro-scale vulnerability curves is evident, as is the urgent need to push forward research into methods and models for the assimilation of uncertainties in decision-making processes.
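
    A minimal sketch of the direct-damage step such a tool performs: exposed asset values combined with a depth-damage curve (the curve and asset values here are placeholders, not FloodRisk's own functions):

      import numpy as np

      # Placeholder depth-damage curve: fraction of asset value lost vs. water depth (m).
      depth_pts = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
      damage_frac = np.array([0.0, 0.15, 0.35, 0.60, 0.85])

      def direct_damage(depth_m, asset_value_eur):
          frac = np.interp(depth_m, depth_pts, damage_frac)
          return frac * asset_value_eur

      # Hypothetical flooded buildings: (water depth [m], exposed value [EUR]).
      buildings = [(0.3, 200_000), (1.2, 150_000), (2.5, 300_000)]
      total = sum(direct_damage(d, v) for d, v in buildings)
      print(f"scenario direct economic damage: {total:,.0f} EUR")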

  7. Challenges of Sustaining the International Space Station Through 2020 and Beyond: Reassessing Confidence Targets for System Availability

    NASA Technical Reports Server (NTRS)

    Lutomski, Michael G.; Carter-Journet, Katrina; Anderson, Leif; Box, Neil; Harrington, Sean; Jackson, David; DiFilippo, Denise

    2012-01-01

    The International Space Station (ISS) was originally designed to operate until 2015, with a plan for deorbiting the ISS in 2016. Currently, the international partnership has agreed to extend operations until 2020, and discussions are underway to extend the life even further, to 2028. Each partner is responsible for the sustaining engineering, sparing, and maintenance of their own segments. National Aeronautics and Space Administration's (NASA's) challenge is to purchase the needed number of spares to maintain the functional availability of the ISS systems necessary for the United States On-Orbit Segment's contribution. This presentation introduces an analytical approach to assessing uncertainty in ISS hardware necessary to extend the life of the vehicle. Some key areas for consideration are: establishing what confidence targets are required to ensure science can be continuously carried out on the ISS, defining what confidence targets are reasonable to ensure vehicle survivability, considering what is required to determine if the confidence targets are too high, and whether a sufficient number of spares are purchased. The results of the analysis will provide a methodological basis for reassessing vehicle subsystem confidence targets. This analysis compares the probability of existing spares exceeding the total expected unit demand of the Orbital Replacement Unit (ORU) in functional hierarchies approximating the vehicle subsystems. In cases where the functional hierarchies' availability does not meet subsystem confidence targets, the analysis will further identify which ORUs may require additional spares to extend the life of the ISS. The resulting probability is dependent upon hardware reliability estimates. However, the ISS hardware fleet carries considerable epistemic uncertainty, which must be factored into the development and execution of sparing risk postures. In addition, it is also recognized that uncertainty in the assessment is due to disconnects between modeled functions and actual subsystem operations. Perhaps most importantly, it is acknowledged that conservative confidence targets per subsystem are currently accepted. This presentation will also discuss how subsystem confidence targets may be relaxed based on calculating the level of uncertainty for each corresponding ORU-function. The presentation will conclude with the various strengths and limitations of implementing the analytical approach in sustaining the ISS through end of life, 2020 and beyond.
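
    A minimal sketch of the spares-sufficiency probability underlying such an assessment: with failures modeled as a Poisson process, the confidence that s spares cover the demand over the extension period is the Poisson CDF at s (the failure rate, duration, and spare counts below are placeholders):

      from math import exp, factorial

      def spares_confidence(failure_rate_per_year, years, spares):
          # P(demand <= spares) for Poisson demand with mean lambda * T.
          lam = failure_rate_per_year * years
          return sum(exp(-lam) * lam ** k / factorial(k) for k in range(spares + 1))

      # Placeholder ORU: 0.3 expected failures/year, sustain through 8 more years.
      for s in range(0, 7):
          conf = spares_confidence(0.3, 8.0, s)
          print(f"{s} spares -> confidence {conf:.3f}")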

  8. Measurements of fusion neutron yields by neutron activation technique: Uncertainty due to the uncertainty on activation cross-sections

    NASA Astrophysics Data System (ADS)

    Stankunas, Gediminas; Batistoni, Paola; Sjöstrand, Henrik; Conroy, Sean; JET Contributors

    2015-07-01

    The neutron activation technique is routinely used in fusion experiments to measure the neutron yields. This paper investigates the uncertainty on these measurements due to the uncertainties on dosimetry and activation reactions. For this purpose, activation cross-sections were taken from the International Reactor Dosimetry and Fusion File (IRDFF-v1.05) in 640-group ENDF-6 format for several reactions of interest for both 2.5 and 14 MeV neutrons. Activation coefficients (reaction rates) have been calculated using the neutron flux spectra at the JET vacuum vessel, both for DD and DT plasmas, calculated by MCNP in the required 640-energy-group format. The related uncertainties for the JET neutron spectra are evaluated as well using the covariance data available in the library. These uncertainties are in general small, but not negligible when high accuracy is required in the determination of the fusion neutron yields.
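
    A minimal sketch of the group-wise activation-rate folding and its cross-section-driven uncertainty, R = sum_g sigma_g * phi_g with var(R) = phi' Sigma phi, on a toy three-group problem (all numbers are placeholders, not IRDFF or JET data):

      import numpy as np

      # Toy 3-group data: flux [n/cm^2/s] and activation cross sections [barn].
      phi = np.array([1.0e8, 5.0e7, 2.0e7])
      sigma = np.array([0.10, 0.50, 1.80]) * 1e-24      # barn -> cm^2

      # Placeholder relative covariance of the cross sections (5-10%, correlated).
      rel_cov = np.array([[0.0025, 0.0010, 0.0000],
                          [0.0010, 0.0049, 0.0015],
                          [0.0000, 0.0015, 0.0100]])
      cov = rel_cov * np.outer(sigma, sigma)            # absolute covariance

      rate = phi @ sigma                                # reactions per nucleus per second
      var = phi @ cov @ phi                             # first-order propagation
      print(f"reaction rate = {rate:.3e} 1/s, rel. uncertainty = "
            f"{100 * np.sqrt(var) / rate:.1f}%")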

  9. Linking 1D coastal ocean modelling to environmental management: an ensemble approach

    NASA Astrophysics Data System (ADS)

    Mussap, Giulia; Zavatarelli, Marco; Pinardi, Nadia

    2017-12-01

    The use of a one-dimensional interdisciplinary numerical model of the coastal ocean as a tool contributing to the formulation of ecosystem-based management (EBM) is explored. The focus is on the definition of an experimental design based on ensemble simulations, integrating variability linked to scenarios (characterised by changes in the system forcing) and to the concurrent variation of selected, and poorly constrained, model parameters. The modelling system used was previously designed specifically for use in "data-rich" areas, so that horizontal dynamics can be resolved by a diagnostic approach and external inputs can be parameterised by properly calibrated nudging schemes. Ensembles determined by changes in the simulated environmental (physical and biogeochemical) dynamics, under joint forcing and parameterisation variations, highlight the uncertainties associated with the application of specific scenarios that are relevant to EBM, providing an assessment of the reliability of the predicted changes. The work has been carried out by implementing the coupled modelling system BFM-POM1D in an area of the Gulf of Trieste (northern Adriatic Sea), considered homogeneous from the point of view of hydrological properties, and forcing it with changing climatic (warming) and anthropogenic (reduction of the land-based nutrient input) pressures. Model parameters affected by considerable uncertainties (due to the lack of relevant observations) were varied jointly with the scenarios of change. The resulting large set of ensemble simulations provided a general estimation of the model uncertainties related to the joint variation of pressures and model parameters. The information on model result variability is aimed at conveying, efficiently and comprehensibly, the uncertainties and reliability of the model results to non-technical EBM planners and stakeholders, so that the model-based information contributes effectively to EBM.

  10. Accounting for complementarity to maximize monitoring power for species management.

    PubMed

    Tulloch, Ayesha I T; Chadès, Iadine; Possingham, Hugh P

    2013-10-01

    To choose among conservation actions that may benefit many species, managers need to monitor the consequences of those actions. Decisions about which species to monitor from a suite of different species being managed are hindered by natural variability in populations and uncertainty in several factors: the ability of the monitoring to detect a change, the likelihood of the management action being successful for a species, and how representative species are of one another. However, the literature provides little guidance about how to account for these uncertainties when deciding which species to monitor to determine whether the management actions are delivering outcomes. We devised an approach that applies decision science and selects the best complementary suite of species to monitor to meet specific conservation objectives. We created an index for indicator selection that accounts for the likelihood of successfully detecting a real trend due to a management action and whether that signal provides information about other species. We illustrated the benefit of our approach by analyzing a monitoring program for invasive predator management aimed at recovering 14 native Australian mammals of conservation concern. Our method selected the species that provided more monitoring power at lower cost relative to the current strategy and traditional approaches that consider only a subset of the important considerations. Our benefit function accounted for natural variability in species growth rates, uncertainty in the responses of species to the prescribed action, and how well species represent others. Monitoring programs that ignore uncertainty, likelihood of detecting change, and complementarity between species will be more costly and less efficient and may waste funding that could otherwise be used for management. © 2013 Society for Conservation Biology.

  11. Consistency of Estimated Global Water Cycle Variations Over the Satellite Era

    NASA Technical Reports Server (NTRS)

    Robertson, F. R.; Bosilovich, M. G.; Roberts, J. B.; Reichle, R. H.; Adler, R.; Ricciardulli, L.; Berg, W.; Huffman, G. J.

    2013-01-01

    Motivated by the question of whether recent indications of decadal climate variability and a possible "climate shift" may have affected the global water balance, we examine evaporation minus precipitation (E-P) variability integrated over the global oceans and global land from three points of view: remotely sensed retrievals/objective analyses over the oceans, reanalysis vertically-integrated moisture convergence (MFC) over land, and land surface models forced with observations-based precipitation, radiation and near-surface meteorology. Because monthly variations in area-averaged atmospheric moisture storage are small and the global integral of moisture convergence must approach zero, area-integrated E-P over ocean should essentially equal precipitation minus evapotranspiration (P-ET) over land (after adjusting for ocean and land areas). Our analysis reveals considerable uncertainty in the decadal variations of ocean evaporation when integrated to global scales. This is due to differences among datasets in 10 m wind speed and near-surface atmospheric specific humidity (2 m qa) used in bulk aerodynamic retrievals. Precipitation variations, all relying substantially on passive microwave retrievals over ocean, still have uncertainties in decadal variability, but not to the degree present with ocean evaporation estimates. Reanalysis MFC and P-ET over land from several observationally forced diagnostic and land surface models agree best on interannual variations. However, upward MFC (i.e. P-ET) reanalysis trends are likely related in part to observing system changes affecting atmospheric assimilation models. While some evidence for a low-frequency E-P maximum near 2000 is found, consistent with a recent apparent pause in sea-surface temperature (SST) rise, uncertainties in the datasets used here remain significant. Prospects for further reducing uncertainties are discussed. The results are interpreted in the context of recent climate variability (Pacific Decadal Oscillation, Atlantic Meridional Overturning), and efforts to distinguish these modes from longer-term trends.

  12. Determination of the carbon budget of a pasture: effect of system boundaries and flux uncertainties

    NASA Astrophysics Data System (ADS)

    Felber, R.; Bretscher, D.; Münger, A.; Neftel, A.; Ammann, C.

    2015-12-01

    Carbon (C) sequestration in the soil is considered as a potential important mechanism to mitigate greenhouse gas (GHG) emissions of the agricultural sector. It can be quantified by the net ecosystem carbon budget (NECB) describing the change of soil C as the sum of all relevant import and export fluxes. NECB was investigated here in detail for an intensively grazed dairy pasture in Switzerland. Two budget approaches with different system boundaries were applied: NECBtot for system boundaries including the grazing cows and NECBpast for system boundaries excluding the cows. CO2 and CH4 exchange induced by soil/vegetation processes as well as direct emissions by the animals were derived from eddy covariance measurements. Other C fluxes were either measured (milk yield, concentrate feeding) or derived based on animal performance data (intake, excreta). For the investigated year, both approaches resulted in a small non-significant C loss: NECBtot -13 ± 61 g C m-2 yr-1 and NECBpast -17 ± 81 g C m-2 yr-1. The considerable uncertainties, depending on the approach, were mainly due to errors in the CO2 exchange or in the animal related fluxes. The associated GHG budget revealed CH4 emissions from the cows to be the major contributor, but with much lower uncertainty compared to NECB. Although only one year of data limit the representativeness of the carbon budget results, they demonstrated the important contribution of the non-CO2 fluxes depending on the chosen system boundaries and the effect of their propagated uncertainty in an exemplary way. The simultaneous application and comparison of both NECB approaches provides a useful consistency check for the carbon budget determination and can help to identify and eliminate systematic errors.
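
    A minimal sketch of the budget arithmetic: NECB as a signed sum of import/export fluxes with independent uncertainties combined in quadrature (the flux values are placeholders of a plausible order of magnitude, not the paper's numbers):

      import math

      # Placeholder annual fluxes in g C m-2 yr-1: (value, standard uncertainty).
      # Imports positive, exports negative; NECB is the signed sum.
      fluxes = {
          "net CO2 exchange (NEE)":  (-120.0, 55.0),
          "CH4-C emission":          (-5.0, 1.0),
          "concentrate feed import": (35.0, 5.0),
          "milk C export":           (-60.0, 8.0),
          "excreta C returned":      (130.0, 40.0),
      }

      necb = sum(v for v, _ in fluxes.values())
      u_necb = math.sqrt(sum(u ** 2 for _, u in fluxes.values()))  # quadrature
      print(f"NECB = {necb:+.0f} +/- {u_necb:.0f} g C m-2 yr-1")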

  13. Determination of the carbon budget of a pasture: effect of system boundaries and flux uncertainties

    NASA Astrophysics Data System (ADS)

    Felber, Raphael; Bretscher, Daniel; Münger, Andreas; Neftel, Albrecht; Ammann, Christof

    2016-05-01

    Carbon (C) sequestration in the soil is considered as a potential important mechanism to mitigate greenhouse gas (GHG) emissions of the agricultural sector. It can be quantified by the net ecosystem carbon budget (NECB) describing the change of soil C as the sum of all relevant import and export fluxes. NECB was investigated here in detail for an intensively grazed dairy pasture in Switzerland. Two budget approaches with different system boundaries were applied: NECBtot for system boundaries including the grazing cows and NECBpast for system boundaries excluding the cows. CO2 and CH4 exchange induced by soil/vegetation processes as well as direct emissions by the animals were derived from eddy covariance measurements. Other C fluxes were either measured (milk yield, concentrate feeding) or derived based on animal performance data (intake, excreta). For the investigated year, both approaches resulted in a small near-neutral C budget: NECBtot -27 ± 62 and NECBpast 23 ± 76 g C m-2 yr-1. The considerable uncertainties, depending on the approach, were mainly due to errors in the CO2 exchange or in the animal-related fluxes. The comparison of the NECB results with the annual exchange of other GHG revealed CH4 emissions from the cows to be the major contributor in terms of CO2 equivalents, but with much lower uncertainty compared to NECB. Although only 1 year of data limit the representativeness of the carbon budget results, they demonstrate the important contribution of the non-CO2 fluxes depending on the chosen system boundaries and the effect of their propagated uncertainty in an exemplary way. The simultaneous application and comparison of both NECB approaches provides a useful consistency check for the carbon budget determination and can help to identify and eliminate systematic errors.

  14. Nuclear Physical Uncertainties in Modeling X-Ray Bursts

    NASA Astrophysics Data System (ADS)

    Regis, Eric; Amthor, A. Matthew

    2017-09-01

    Type I x-ray bursts occur when a neutron star accretes material from the surface of another star in a compact binary star system. For certain accretion rates and material compositions, much of the nuclear material is burned in short, explosive bursts. Using a one-dimensional stellar model, Kepler, and a comprehensive nuclear reaction rate library, ReacLib, we have simulated chains of type I x-ray bursts. Unfortunately, there are large remaining uncertainties in the nuclear reaction rates involved, since many of the isotopes reacting are unstable and have not yet been studied experimentally. Some individual reactions, when varied within their estimated uncertainty, alter the light curves dramatically. This limits our ability to understand the structure of the neutron star. Previous studies have looked at the effects of individual reaction rate uncertainties. We have applied a Monte Carlo method, simultaneously varying a set of reaction rates, in order to probe the expected uncertainty in x-ray burst behaviour due to the total uncertainty in all nuclear reaction rates. Furthermore, we aim to discover any nonlinear effects due to the coupling between different reaction rates. Early results show clear nonlinear effects. This research was made possible by NSF-DUE Grant 1317446, BU Scholars Program.

  15. The Fermi Galactic Center GeV Excess and Implications for Dark Matter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ackermann, M.; Buehler, R.; Ajello, M.

    2017-05-01

    The region around the Galactic Center (GC) is now well established to be brighter at energies of a few GeV than what is expected from conventional models of diffuse gamma-ray emission and catalogs of known gamma-ray sources. We study the GeV excess using 6.5 yr of data from the Fermi Large Area Telescope. We characterize the uncertainty of the GC excess spectrum and morphology due to uncertainties in cosmic-ray source distributions and propagation, uncertainties in the distribution of interstellar gas in the Milky Way, and uncertainties due to a potential contribution from the Fermi bubbles. We also evaluate uncertainties in the excess properties due to resolved point sources of gamma rays. The GC is of particular interest, as it would be expected to have the brightest signal from annihilation of weakly interacting massive dark matter (DM) particles. However, control regions along the Galactic plane, where a DM signal is not expected, show excesses of similar amplitude relative to the local background. Based on the magnitude of the systematic uncertainties, we conservatively report upper limits for the annihilation cross-section as a function of particle mass and annihilation channel.

  16. ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.

    2011-04-20

    While considerable advances have been made to account for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have been generally ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
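
    A minimal sketch of the principal-component summary step described above, applied to a synthetic ensemble of plausible effective-area curves (the curves and their systematic modes are placeholders, not Chandra calibration products):

      import numpy as np

      rng = np.random.default_rng(11)
      energies = np.linspace(0.3, 8.0, 200)            # keV grid

      # Synthetic ensemble of plausible effective-area curves: a nominal shape
      # perturbed by two smooth systematic modes plus small noise.
      nominal = 600 * np.exp(-((energies - 1.5) / 2.5) ** 2)
      mode1 = 0.03 * nominal * np.sin(energies)        # placeholder systematics
      mode2 = 0.02 * nominal * (energies / 8.0)
      samples = (nominal
                 + rng.normal(size=(500, 1)) * mode1
                 + rng.normal(size=(500, 1)) * mode2
                 + rng.normal(0, 1.0, (500, energies.size)))

      # PCA via SVD of the mean-centered samples; a few components summarize
      # the calibration uncertainty ensemble.
      centered = samples - samples.mean(axis=0)
      _, s, _ = np.linalg.svd(centered, full_matrices=False)
      explained = (s ** 2) / np.sum(s ** 2)
      print("variance explained by first 3 components:", np.round(explained[:3], 3))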

  17. Probabilistic short-term volcanic hazard in phases of unrest: A case study for tephra fallout

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Macedonio, Giovanni; Marzocchi, Warner

    2014-12-01

    During volcanic crises, volcanologists estimate the impact of possible imminent eruptions usually through deterministic modeling of the effects of one or a few preestablished scenarios. Although such an approach may bring important information to the decision makers, the sole use of deterministic scenarios does not allow scientists to properly take into consideration all uncertainties, and it cannot be used to assess the risk quantitatively, because the latter unavoidably requires a probabilistic approach. We present a model based on the concept of the Bayesian event tree (hereinafter named BET_VH_ST, standing for Bayesian event tree for short-term volcanic hazard), for short-term near-real-time probabilistic volcanic hazard analysis formulated for any potential hazardous phenomenon accompanying an eruption. The specific goal of BET_VH_ST is to produce a quantitative assessment of the probability of exceedance of any potential level of intensity for a given volcanic hazard due to eruptions within restricted time windows (hours to days) in any area surrounding the volcano, accounting for all natural and epistemic uncertainties. BET_VH_ST properly assesses the conditional probability at each level of the event tree, accounting for any relevant information derived from the monitoring system, theoretical models, and the past history of the volcano, and propagating any relevant epistemic uncertainty underlying these assessments. As an application example of the model, we apply BET_VH_ST to assess short-term volcanic hazard related to tephra loading during the Major Emergency Simulation Exercise, a major exercise at Mount Vesuvius that took place from 19 to 23 October 2006, consisting of a blind simulation of Vesuvius reactivation, from the early warning phase up to the final eruption, including the evacuation of a sample of about 2000 people from the area at risk. The results show that BET_VH_ST is able to produce short-term forecasts of the impact of tephra fall during a rapidly evolving crisis, accurately accounting for and propagating all uncertainties and enabling rational decision making under uncertainty.
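
    A minimal sketch of the event-tree arithmetic at the core of such a tool: the probability of exceeding a hazard intensity is the product of conditional probabilities along a branch, here with made-up node values and a simple beta-distributed epistemic spread:

      import numpy as np

      rng = np.random.default_rng(5)
      n_draws = 10000

      # Conditional node probabilities (placeholders), each with epistemic
      # uncertainty represented by a beta distribution: unrest is magmatic ->
      # eruption -> size class reached -> tephra load exceeds threshold at site.
      nodes = [
          rng.beta(8, 2, n_draws),    # P(unrest is magmatic | unrest)
          rng.beta(3, 7, n_draws),    # P(eruption | magmatic unrest)
          rng.beta(2, 8, n_draws),    # P(size class | eruption)
          rng.beta(4, 6, n_draws),    # P(load > threshold at site | size class)
      ]

      p_exceed = np.prod(nodes, axis=0)    # product along the event-tree branch
      print(f"P(exceedance): mean {p_exceed.mean():.4f}, "
            f"10-90% [{np.percentile(p_exceed, 10):.4f}, "
            f"{np.percentile(p_exceed, 90):.4f}]")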

  18. Multiple-Strain Approach and Probabilistic Modeling of Consumer Habits in Quantitative Microbial Risk Assessment: A Quantitative Assessment of Exposure to Staphylococcal Enterotoxin A in Raw Milk.

    PubMed

    Crotta, Matteo; Rizzi, Rita; Varisco, Giorgio; Daminelli, Paolo; Cunico, Elena Cosciani; Luini, Mario; Graber, Hans Ulrich; Paterlini, Franco; Guitian, Javier

    2016-03-01

    Quantitative microbial risk assessment (QMRA) models are extensively applied to inform management of a broad range of food safety risks. Inevitably, QMRA modeling involves an element of simplification of the biological process of interest. Two features that are frequently simplified or disregarded are the pathogenicity of multiple strains of a single pathogen and consumer behavior at the household level. In this study, we developed a QMRA model with a multiple-strain approach and a consumer phase module (CPM) based on uncertainty distributions fitted from field data. We modeled exposure to staphylococcal enterotoxin A in raw milk in Lombardy; a specific enterotoxin production module was thus included. The model is adaptable and could be used to assess the risk related to other pathogens in raw milk as well as other staphylococcal enterotoxins. The multiple-strain approach, implemented as a multinomial process, allowed the inclusion of variability and uncertainty with regard to pathogenicity at the bacterial level. Data from 301 questionnaires submitted to raw milk consumers were used to obtain uncertainty distributions for the CPM. The distributions were modeled to be easily updatable with further data or evidence. The sources of uncertainty due to the multiple-strain approach and the CPM were identified, and their impact on the output was assessed by comparing specific scenarios to the baseline. When the distributions reflecting the uncertainty in consumer behavior were fixed to the 95th percentile, the risk of exposure increased up to 160 times. This reflects the importance of taking into consideration the diversity of consumers' habits at the household level and the impact that the lack of knowledge about variables in the CPM can have on the final QMRA estimates. The multiple-strain approach lends itself to use in other food matrices besides raw milk and allows the model to better capture the complexity of the real world and to be capable of geographical specificity.

  19. A potato model intercomparison across varying climates and productivity levels.

    PubMed

    Fleisher, David H; Condori, Bruno; Quiroz, Roberto; Alva, Ashok; Asseng, Senthold; Barreda, Carolina; Bindi, Marco; Boote, Kenneth J; Ferrise, Roberto; Franke, Angelinus C; Govindakrishnan, Panamanna M; Harahagazwe, Dieudonne; Hoogenboom, Gerrit; Naresh Kumar, Soora; Merante, Paolo; Nendel, Claas; Olesen, Jorgen E; Parker, Phillip S; Raes, Dirk; Raymundo, Rubi; Ruane, Alex C; Stockle, Claudio; Supit, Iwan; Vanuytrecht, Eline; Wolf, Joost; Woli, Prem

    2017-03-01

    A potato crop multimodel assessment was conducted to quantify variation among models and evaluate responses to climate change. Nine modeling groups simulated agronomic and climatic responses at low-input (Chinoli, Bolivia, and Gisozi, Burundi) and high-input (Jyndevad, Denmark, and Washington, United States) management sites. Two calibration stages were explored: partial (P1), in which experimental dry matter data were not provided, and full (P2). The median model ensemble response outperformed any single model in terms of replicating observed yield across all locations. Uncertainty in simulated yield decreased from 38% to 20% between P1 and P2. Model uncertainty increased with interannual variability, and predictions for all agronomic variables were significantly different from one model to another (P < 0.001). Uncertainty averaged 15% higher for low- vs. high-input sites, with larger differences observed for evapotranspiration (ET), nitrogen uptake, and water use efficiency as compared to dry matter. A minimum of five partial, or three full, calibrated models was required for an ensemble approach to keep variability below that of common field variation. Model variation was not influenced by change in carbon dioxide (C), but increased as much as 41% and 23% for yield and ET, respectively, as temperature (T) or rainfall (W) moved away from historical levels. Increases in T accounted for the highest amount of uncertainty, suggesting that methods and parameters for T sensitivity represent a considerable unknown among models. Using median model ensemble values, yield increased on average 6% per 100-ppm C, declined 4.6% per °C, and declined 2% for every 10% decrease in rainfall (for nonirrigated sites). Differences in predictions due to model representation of light utilization were significant (P < 0.01). These are the first reported results quantifying uncertainty for tuber/root crops and suggest modeling assessments of climate change impact on potato may be improved using an ensemble approach. © 2016 John Wiley & Sons Ltd.
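
    A minimal sketch of the median-ensemble comparison described above, with invented yields for nine models at four sites; it computes the ensemble median, its RMSE against observations, and the among-model coefficient of variation used here as an uncertainty measure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented simulated tuber yields (t/ha): rows = 9 models, cols = 4 sites
# (two low-input, two high-input), with larger spread at low-input sites.
observed = np.array([9.0, 11.5, 47.0, 58.0])
yields = rng.normal(loc=[8, 12, 45, 60], scale=[3, 4, 9, 12], size=(9, 4))

ensemble = np.median(yields, axis=0)          # median-ensemble prediction

rmse = lambda pred: np.sqrt(np.mean((pred - observed) ** 2))
single = sorted(rmse(m) for m in yields)
print("ensemble RMSE:", round(rmse(ensemble), 2))
print("single-model RMSE range:", round(single[0], 2), "-", round(single[-1], 2))

# Among-model uncertainty per site, as a coefficient of variation (%).
cv = 100 * yields.std(axis=0, ddof=1) / yields.mean(axis=0)
print("per-site model CV (%):", cv.round(1))
```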

  20. SU-E-T-503: IMRT Optimization Using Monte Carlo Dose Engine: The Effect of Statistical Uncertainty.

    PubMed

    Tian, Z; Jia, X; Graves, Y; Uribe-Sanchez, A; Jiang, S

    2012-06-01

    With the development of ultra-fast GPU-based Monte Carlo (MC) dose engines, it becomes clinically realistic to compute the dose-deposition coefficients (DDC) for IMRT optimization using MC simulation. However, it is still time-consuming to compute the DDC with small statistical uncertainty. This work studies the effects of the statistical error in the DDC matrix on IMRT optimization. The MC-computed DDC matrices are simulated here by adding statistical uncertainties at a desired level to the ones generated with a finite-size pencil beam algorithm. A statistical uncertainty model for MC dose calculation is employed. We adopt a penalty-based quadratic optimization model and gradient descent method to optimize the fluence map and then recalculate the corresponding actual dose distribution using the noise-free DDC matrix. The impacts of DDC noise are assessed in terms of the deviation of the resulting dose distributions. We have also used a stochastic perturbation theory to theoretically estimate the statistical errors of dose distributions on a simplified optimization model. A head-and-neck case is used to investigate the perturbation to an IMRT plan due to MC's statistical uncertainty. The relative errors of the final dose distributions of the optimized IMRT are found to be much smaller than those in the DDC matrix, which is consistent with our theoretical estimation. When the number of histories is decreased from 10^8 to 10^6, the dose-volume histograms are still very similar to the error-free DVHs, while the error in the DDC is about 3.8%. The results illustrate that the statistical errors in the DDC matrix have a relatively small effect on IMRT optimization in the dose domain. This indicates that we can use a relatively small number of histories to obtain the DDC matrix with MC simulation within a reasonable amount of time, without considerably compromising the accuracy of the optimized treatment plan. This work is supported by Varian Medical Systems through a Master Research Agreement. © 2012 American Association of Physicists in Medicine.
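
    The following sketch mimics the study design under stated assumptions: a noise-free DDC matrix stands in for the pencil-beam computation, relative Gaussian noise emulates MC statistical uncertainty, and a penalty-based quadratic objective is minimized by projected gradient descent; the matrix dimensions and the 3.8% noise level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
n_vox, n_blt = 200, 40

# Noise-free DDC matrix (stand-in for a pencil-beam computation) and a
# noisy copy mimicking MC statistical uncertainty at a given relative level.
D = rng.random((n_vox, n_blt))
rel_err = 0.038                      # ~3.8%, as quoted for ~1e6 histories
D_noisy = D * (1.0 + rel_err * rng.standard_normal(D.shape))

d_presc = np.ones(n_vox)             # prescribed dose, arbitrary units

def optimize(Dmat, iters=3000, lr=2e-4):
    """Penalty-based quadratic objective ||D x - d||^2 with x >= 0,
    minimized by projected gradient descent over the fluence map x."""
    x = np.zeros(n_blt)
    for _ in range(iters):
        grad = 2.0 * Dmat.T @ (Dmat @ x - d_presc)
        x = np.maximum(x - lr * grad, 0.0)   # project onto x >= 0
    return x

# Optimize on the noisy matrix, then recompute the "actual" dose with the
# noise-free matrix, mirroring the paper's evaluation step.
dose = D @ optimize(D_noisy)
dose_ref = D @ optimize(D)
rel_dev = np.abs(dose - dose_ref).max() / dose_ref.max()
print(f"max relative dose deviation: {rel_dev:.4f}  (DDC noise: {rel_err:.3f})")
```

    Averaging over many beamlets damps the per-element DDC noise, which is the qualitative effect the paper reports.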

  1. Incorporating anthropogenic influences into fire probability models: Effects of development and climate change on fire activity in California

    NASA Astrophysics Data System (ADS)

    Mann, M.; Moritz, M.; Batllori, E.; Waller, E.; Krawchuk, M.; Berck, P.

    2014-12-01

    The costly interactions between humans and natural fire regimes throughout California demonstrate the need to understand the uncertainties surrounding wildfire, especially in the face of a changing climate and expanding human communities. Although a number of statistical and process-based wildfire models exist for California, there is enormous uncertainty about the location and number of future fires. Models estimate an increase in fire occurrence of between nine and fifty-three percent by the end of the century. Our goal is to assess the role of uncertainty in climate and anthropogenic influences on the state's fire regime from 2000 to 2050. We develop an empirical model that integrates novel information about the distribution and characteristics of future plant communities without assuming a particular distribution, and improve on previous efforts by integrating dynamic estimates of population density at each forecast time step. Historically, we find that anthropogenic influences account for up to fifty percent of the total fire count, and that further housing development will incite or suppress additional fires according to its intensity. We also find that the total area burned is likely to increase, but at a slower than historical rate. Previous findings of substantially increased numbers of fires may be tied to the assumption of static fuel loadings, and the use of proxy variables not relevant to plant community distributions. We also find considerable agreement between GFDL and PCM model A2 runs, with decreasing fire counts expected only in areas of coastal influence below San Francisco and above Los Angeles. Due to potential shifts in rainfall patterns, substantial uncertainty remains for the semiarid deserts of the inland south. The broad shifts of wildfire between California's climatic regions forecast in this study point to dramatic shifts in the pressures plant and human communities will face by midcentury. The information provided by this study reduces the level of uncertainty surrounding the influence that natural and anthropogenic systems have on wildfire.
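
    A hedged sketch of the kind of empirical fire-probability model described: a logistic form in climatic, fuel, and population-density covariates, with a quadratic population term so that development first incites and then suppresses fires; all covariates and coefficients below are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
n_cells = 5000

# Hypothetical gridded covariates: climatic water deficit, a fuel/vegetation
# index, and (log) population density updated at each forecast step.
cwd = rng.normal(0, 1, n_cells)
fuel = rng.normal(0, 1, n_cells)
log_pop = rng.normal(0, 1, n_cells)

# Assumed coefficients of a fitted fire-probability model; the quadratic
# population term captures ignition increase then suppression at high density.
beta = dict(b0=-3.0, cwd=0.6, fuel=0.5, pop=0.9, pop2=-0.35)

def p_fire(cwd, fuel, log_pop):
    eta = (beta["b0"] + beta["cwd"] * cwd + beta["fuel"] * fuel
           + beta["pop"] * log_pop + beta["pop2"] * log_pop ** 2)
    return 1.0 / (1.0 + np.exp(-eta))

p_hist = p_fire(cwd, fuel, log_pop)
# One forecast step: a warmer/drier climate and densifying settlement.
p_2050 = p_fire(cwd + 0.5, fuel, log_pop + 0.3)
print(f"expected fire count, historical: {p_hist.sum():.0f}")
print(f"expected fire count, 2050:       {p_2050.sum():.0f}")
```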

  2. Streamflow loss quantification for groundwater flow modeling using a wading-rod-mounted acoustic Doppler current profiler in a headwater stream

    NASA Astrophysics Data System (ADS)

    Pflügl, Christian; Hoehn, Philipp; Hofmann, Thilo

    2017-04-01

    Irrespective of the availability of various field measurement and modeling approaches, the quantification of interactions between surface water and groundwater systems remains associated with high uncertainty. Such uncertainties in stream-aquifer interaction can lead to significant misinterpretation of the local water budget and water quality. Because stream discharge rates typically vary considerably in time, it is desirable to reduce both the duration and the uncertainty of streamflow measurements. Streamflow measurements according to the velocity-area method were performed along reaches of a losing-disconnected, subalpine headwater stream using a 2-dimensional, wading-rod-mounted acoustic Doppler current profiler (ADCP). The method was chosen, with stream morphology not allowing for boat-mounted setups, to reduce uncertainty compared to conventional, single-point streamflow measurements of similar measurement duration. Reach-averaged stream loss rates were subsequently quantified between 12 cross sections. They enabled the delineation of strongly infiltrating stream reaches and their differentiation from insignificantly infiltrating reaches. Furthermore, a total of 10 near-stream observation wells were constructed and/or equipped with pressure and temperature loggers. The time series of near-stream groundwater temperature data were cross-correlated with stream temperature time series to yield supportive qualitative information on the delineation of infiltrating reaches. Subsequently, as a reference parameterization, the hydraulic conductivity and specific yield of a numerical, steady-state model of groundwater flow, in the unconfined glaciofluvial aquifer adjacent to the stream, were inversely determined incorporating the inferred stream loss rates. Applying synthetic sets of infiltration rates, resembling increasing levels of uncertainty associated with single-point streamflow measurements of comparable duration, the same inversion procedure was run. The volume-weighted mean of the respective parameter distribution within 200 m of the stream periphery deviated increasingly from the reference parameterization at increasing deviation of infiltration rates.
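
    A minimal sketch of the velocity-area computation and the reach-loss differencing; the verticals, velocities, and reach length are hypothetical, and real ADCP processing involves many more subsections plus edge estimates.

```python
import numpy as np

def discharge(depths, velocities, widths):
    """Velocity-area method: Q = sum over verticals of v_i * d_i * w_i,
    with v_i the depth-averaged velocity in each subsection (m/s)."""
    return float(np.sum(np.asarray(velocities)
                        * np.asarray(depths)
                        * np.asarray(widths)))

# Hypothetical ADCP verticals at two cross sections of a losing reach.
q_up = discharge(depths=[0.30, 0.45, 0.50, 0.35],
                 velocities=[0.25, 0.40, 0.42, 0.22],
                 widths=[0.5, 0.5, 0.5, 0.5])
q_down = discharge(depths=[0.28, 0.42, 0.46, 0.30],
                   velocities=[0.22, 0.36, 0.38, 0.20],
                   widths=[0.5, 0.5, 0.5, 0.5])

# Reach-averaged infiltration (stream loss) per unit length, reach = 250 m.
loss_rate = (q_up - q_down) / 250.0
print(f"Q_up = {q_up:.4f} m3/s, Q_down = {q_down:.4f} m3/s")
print(f"loss = {loss_rate * 1000:.4f} L/s per m of stream")
```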

  3. SU-E-T-622: Planning Technique for Passively-Scattered Involved-Node Proton Therapy of Mediastinal Lymphoma with Consideration of Cardiac Motion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flampouri, S; Li, Z; Hoppe, B

    2015-06-15

    Purpose: To develop a treatment planning method for passively-scattered involved-node proton therapy of mediastinal lymphoma robust to breathing and cardiac motions. Methods: Beam-specific planning treatment volumes (bsPTV) are calculated for each proton field to incorporate pertinent uncertainties. Geometric margins are added laterally to each beam while margins for range uncertainty due to setup errors, breathing, and calibration curve uncertainties are added along each beam. The calculation of breathing motion and deformation effects on proton range includes all 4DCT phases. The anisotropic water equivalent margins are translated to distances on average 4DCT. Treatment plans are designed so each beam adequately covers the corresponding bsPTV. For targets close to the heart, cardiac motion effects on dosemaps are estimated by using a library of anonymous ECG-gated cardiac CTs (cCT). The cCT, originally contrast-enhanced, are partially overridden to allow meaningful proton dose calculations. Targets similar to the treatment targets are drawn on one or more cCT sets matching the anatomy of the patient. Plans based on the average cCT are calculated on individual phases, then deformed to the average and accumulated. When clinically significant dose discrepancies occur between planned and accumulated doses, the patient plan is modified to reduce the cardiac motion effects. Results: We found that bsPTVs as planning targets create dose distributions similar to the conventional proton planning distributions, while they are a valuable tool for visualization of the uncertainties. For large targets with variability in motion and depth, integral dose was reduced because of the anisotropic margins. In most cases, heart motion has a clinically insignificant effect on target coverage. Conclusion: A treatment planning method was developed and used for proton therapy of mediastinal lymphoma. The technique incorporates bsPTVs compensating for all common sources of uncertainties and estimation of the effects of cardiac motion not commonly performed.

  4. Diabetes in Combat: Effect of Military Deployment on Diabetes Mellitus in Air Force Personnel

    DTIC Science & Technology

    2017-04-01

    ...Diabetes Mellitus (DM) from participating in military deployments due to the uncertainty of healthcare availability in an austere environment. For military providers, assessing a member...

  5. A window into the future of the Earth, hidden in the jungles of Costa Rica's volcanoes

    NASA Astrophysics Data System (ADS)

    Fisher, J. B.; Schwandner, F. M.; Asner, G. P.; Schimel, D.; Norby, R. J.; Keller, M.; Pavlick, R.; Braverman, A. J.; Pieri, D. C.; Diaz, J. A.; Gutierrez, M.; Duarte, E. A.; Lewicki, J. L.; Manning, C. E.; Deering, C. D.; Seibt, U.; Miller, G. R.; Drewry, D.; Chambers, J.

    2017-12-01

    The CO2 fertilization response of the terrestrial biosphere contributes among the largest sensitivities and uncertainties across projections of the Earth's future. The source of that uncertainty can be pinpointed to the largest fluxes in the biosphere: the tropics. Free Air CO2 Enrichment (FACE) experiments have contributed immensely to our understanding of short-term CO2 fertilization but, outside of a small pilot study in development, have been absent in the tropics. This is largely due to the numerous hurdles of conducting such experiments in challenging environments, the need to expand their extent considerably to encompass the enormous diversity of species-level responses, and the need for multi-decadal-scale responses. As such, we have remained at a critical impasse in our ability to advance understanding of the response of the tropical biosphere to increasing CO2. Recent discoveries have found a cluster of volcanoes degassing CO2 into tropical ecosystems in Costa Rica at concentrations similar to future Earth atmosphere levels. The degassing has been occurring persistently for 10s to 100s of years over 10s to 100s of square kilometers, at different levels depending on the volcano. Fortuitously, this provides a natural "experiment" across a range of conditions needed to assess a widespread and long-lived tropical ecosystem response to elevated CO2: tree species will have had time to shift in composition, traits, structure, and function. Nonetheless, due both to the challenges of assessing these changes on the ground and to heterogeneity causing problems for coarse-scale satellite remote sensing observations, this "window" into the future of the Earth has remained veiled. Here, we describe an airborne-based plan designed to uncover this gem hidden in the jungles of Costa Rica's volcanoes.

  6. Application of an Integrated HPC Reliability Prediction Framework to HMMWV Suspension System

    DTIC Science & Technology

    2010-09-13

    model number M966 (TOW Missile Carrier, Basic Armor without weapons), since they were available. Tires used for all simulations were the bias-type...vehicle fleet, including consideration of all kinds of uncertainty, especially including model uncertainty. The end result will be a tool to use...building an adequate vehicle reliability prediction framework for military vehicles is the accurate modeling of the integration of various types of

  7. Challenges of Sustaining the International Space Station through 2020 and Beyond: Including Epistemic Uncertainty in Reassessing Confidence Targets

    NASA Technical Reports Server (NTRS)

    Anderson, Leif; Carter-Journet, Katrina; Box, Neil; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael

    2012-01-01

    This paper introduces an analytical approach, Probability and Confidence Trade-space (PACT), which can be used to assess uncertainty in International Space Station (ISS) hardware sparing necessary to extend the life of the vehicle. There are several key areas under consideration in this research. We investigate what sparing confidence targets may be reasonable to ensure vehicle survivability and for completion of science on the ISS. The results of the analysis will provide a methodological basis for reassessing vehicle subsystem confidence targets. An ongoing annual analysis currently compares the probability of existing spares exceeding the total expected unit demand of the Orbital Replacement Unit (ORU) in functional hierarchies approximating the vehicle subsystems. In cases where a functional hierarchy's availability does not meet subsystem confidence targets, the current sparing analysis further identifies which ORUs may require additional spares to extend the life of the ISS. The resulting probability is dependent upon hardware reliability estimates. However, the ISS hardware fleet carries considerable epistemic uncertainty (uncertainty in the knowledge of the true hardware failure rate), which does not currently factor into the annual sparing analysis. The existing confidence targets may be conservative. This paper will also discuss how confidence targets may be relaxed based on the inclusion of epistemic uncertainty for each ORU. The paper will conclude with strengths and limitations for implementing the analytical approach in sustaining the ISS through end of life, 2020 and beyond.
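
    The sketch below illustrates the generic probability-versus-confidence idea, not the PACT implementation: epistemic uncertainty in an ORU failure rate is represented by a lognormal distribution and aleatory failures by a Poisson process, giving the probability that a given number of spares covers demand; the rate, spread, and horizon are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

def prob_spares_sufficient(n_spares, rate_mean, rate_rel_sigma, hours,
                           n_mc=200_000):
    """P(demand <= spares) for one ORU over the extension period.

    Epistemic uncertainty in the true failure rate is modeled as lognormal
    around the point estimate; failure counts are Poisson given the rate.
    """
    sigma = np.sqrt(np.log(1.0 + rate_rel_sigma ** 2))
    mu = np.log(rate_mean) - 0.5 * sigma ** 2
    rates = rng.lognormal(mu, sigma, n_mc)      # epistemic draw
    demand = rng.poisson(rates * hours)         # aleatory draw
    return (demand <= n_spares).mean()

# Hypothetical ORU: 1 failure per 50,000 h, 8 years of life extension.
for spares in range(4):
    p = prob_spares_sufficient(spares, 1 / 50_000, 0.5, hours=8 * 8760)
    print(f"{spares} spares -> confidence {p:.3f}")
```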

  8. Constraining past seawater δ18O and temperature records developed from foraminiferal geochemistry

    NASA Astrophysics Data System (ADS)

    Quinn, T. M.; Thirumalai, K.; Marino, G.

    2016-12-01

    Paired measurements of magnesium-to-calcium ratios (Mg/Ca) and the stable oxygen isotopic composition (δ18O) in foraminifera have significantly advanced our knowledge of the climate system by providing information on past temperature and seawater δ18O (δ18Osw, a proxy for salinity and ice volume). However, multiple sources of uncertainty exist in transferring these downcore geochemical data into quantitative paleoclimate reconstructions. Here, we develop a computational toolkit entitled Paleo-Seawater Uncertainty Solver (PSU Solver) that performs bootstrap Monte Carlo simulations to constrain these various sources of uncertainty. PSU Solver calculates temperature and δ18Osw, and their respective confidence intervals using an iterative approach with user-defined errors, calibrations, and sea-level curves. Our probabilistic approach yields reduced uncertainty constraints compared to theoretical considerations and commonly used propagation exercises. We demonstrate the applicability of PSU Solver for published records covering three timescales: the late Holocene, the last deglaciation, and the last glacial period. We show that the influence of salinity on Mg/Ca can considerably alter the structure and amplitude of change in the resulting reconstruction and can impact the interpretation of paleoceanographic time series. We also highlight the sensitivity of the records to various inputs of sea-level curves, transfer functions, and uncertainty constraints. PSU Solver offers an expeditious yet rigorous approach to test the robustness of past climate variability inferred from paired Mg/Ca-δ18O measurements.
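
    A simplified Monte Carlo sketch in the spirit of PSU Solver (not the toolkit itself): measurement and calibration uncertainties are sampled, temperature is inverted from an exponential Mg/Ca calibration, and δ18Osw follows from a Bemis-style paleotemperature equation; all numerical values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(11)
n_mc = 100_000

# Hypothetical paired downcore measurements with analytical uncertainty.
mgca = rng.normal(3.2, 0.1, n_mc)       # mmol/mol
d18oc = rng.normal(-1.5, 0.08, n_mc)    # per mil VPDB

# Exponential Mg/Ca calibration, Mg/Ca = B * exp(A * T), with uncertain
# coefficients (values illustrative only).
A = rng.normal(0.09, 0.005, n_mc)
B = rng.normal(0.38, 0.02, n_mc)
T = np.log(mgca / B) / A

# Paleotemperature equation rearranged for seawater d18O
# (a Bemis-style low-light form, again illustrative).
d18osw = d18oc + (T - 16.5) / 4.8

for name, x in [("T (degC)", T), ("d18Osw (per mil)", d18osw)]:
    lo, med, hi = np.percentile(x, [2.5, 50, 97.5])
    print(f"{name}: {med:.2f} [{lo:.2f}, {hi:.2f}] (95% interval)")
```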

  9. Reliability Quantification of the Flexure: A Critical Stirling Convertor Component

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Zampino, Edward J.

    2004-01-01

    Uncertainties in manufacturing, fabrication processes, material behavior, loads, and boundary conditions result in variation of the stresses and strains induced in the flexures and of their fatigue life. Past experience and test data at the material coupon level revealed a significant amount of scatter in fatigue life. Owing to these facts, designing the flexure using conventional approaches, based on safety factors or on traditional reliability considerations for similar equipment, does not provide a direct measure of reliability. Additionally, it may not be feasible to run actual long-term fatigue tests due to cost and time constraints; therefore, it is difficult to ascertain the material fatigue strength limit. The objective of the paper is to present a methodology and quantified results of numerical simulation for the reliability of flexures used in the Stirling convertor for their structural performance. The proposed approach is based on application of the finite element analysis method in combination with the random fatigue limit model, which includes uncertainties in material fatigue life. Additionally, the sensitivity of fatigue life reliability to the design variables is quantified, and its use in developing guidelines to improve the design, manufacturing, quality control, and inspection processes is described.

  10. Variability and uncertainty in life cycle assessment models for greenhouse gas emissions from Canadian oil sands production.

    PubMed

    Brandt, Adam R

    2012-01-17

    Because of interest in greenhouse gas (GHG) emissions from transportation fuels production, a number of recent life cycle assessment (LCA) studies have calculated GHG emissions from oil sands extraction, upgrading, and refining pathways. The results from these studies vary considerably. This paper reviews factors affecting energy consumption and GHG emissions from oil sands extraction. It then uses publicly available data to analyze the assumptions made in the LCA models to better understand the causes of variability in emissions estimates. It is found that the variation in oil sands GHG estimates is due to a variety of causes. In approximate order of importance, these are scope of modeling and choice of projects analyzed (e.g., specific projects vs industry averages); differences in assumed energy intensities of extraction and upgrading; differences in the fuel mix assumptions; treatment of secondary noncombustion emissions sources, such as venting, flaring, and fugitive emissions; and treatment of ecological emissions sources, such as land-use change-associated emissions. The GHGenius model is recommended as the LCA model that is most congruent with reported industry average data. GHGenius also has the most comprehensive system boundaries. Last, remaining uncertainties and future research needs are discussed.

  11. Estimating the Triple-Point Isotope Effect and the Corresponding Uncertainties for Cryogenic Fixed Points

    NASA Astrophysics Data System (ADS)

    Tew, W. L.

    2008-02-01

    The sensitivities of melting temperatures to isotopic variations in monatomic and diatomic atmospheric gases are estimated using both theoretical and semi-empirical methods. The current state of knowledge of the vapor-pressure isotope effects (VPIE) and triple-point isotope effects (TPIE) is briefly summarized for the noble gases (except He), and for selected diatomic molecules including oxygen. An approximate expression is derived to estimate the relative shift in the melting temperature with isotopic substitution. In general, the magnitude of the effects diminishes with increasing molecular mass and increasing temperature. Knowledge of the VPIE, molar volumes, and heat of fusion is sufficient to estimate the temperature shift or isotopic sensitivity coefficient via the derived expression. The usefulness of this approach is demonstrated in the estimation of isotopic sensitivities and uncertainties for triple points of xenon and molecular oxygen for which few documented estimates were previously available. The calculated sensitivities from this study are considerably higher than previous estimates for Xe, and lower than other estimates in the case of oxygen. In both these cases, the predicted sensitivities are small and the resulting variations in triple point temperatures due to mass fractionation effects are less than 20 μK.

  12. Developing a probability-based model of aquifer vulnerability in an agricultural region

    NASA Astrophysics Data System (ADS)

    Chen, Shih-Kai; Jang, Cheng-Shin; Peng, Yi-Huei

    2013-04-01

    Hydrogeological settings of aquifers strongly influence regional groundwater movement and pollution processes. Establishing a map of aquifer vulnerability is critical for planning a scheme of groundwater quality protection. This study developed a novel probability-based DRASTIC model of aquifer vulnerability in the Choushui River alluvial fan, Taiwan, using indicator kriging to determine various risk categories of contamination potentials based on estimated vulnerability indexes. Categories and ratings of six parameters in the probability-based DRASTIC model were probabilistically characterized according to two parameter classification methods: selecting the maximum estimation probability and calculating an expected value. Moreover, the probability-based estimation and assessment provided excellent insight into propagating the uncertainty of parameters due to limited observation data. To examine the capacity of the developed probability-based DRASTIC model to predict pollution, the medium, high, and very high risk categories of contamination potentials were compared with observed nitrate-N exceeding 0.5 mg/L, indicating anthropogenic groundwater pollution. The analyzed results reveal that the developed probability-based DRASTIC model is capable of predicting high nitrate-N groundwater pollution and characterizing the parameter uncertainty via the probability estimation processes.
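
    A small sketch of the two parameter classification rules named above, applied to hypothetical indicator-kriging output (per-cell probabilities of each rating category):

```python
import numpy as np

# Hypothetical indicator-kriging output at three grid cells: probability of
# each DRASTIC rating category for one parameter (rows sum to 1).
ratings = np.array([1, 3, 5, 8, 10])            # category ratings
probs = np.array([[0.10, 0.20, 0.40, 0.20, 0.10],
                  [0.00, 0.05, 0.15, 0.50, 0.30],
                  [0.60, 0.30, 0.10, 0.00, 0.00]])

# Rule 1: take the rating with the maximum estimation probability.
max_prob_rating = ratings[probs.argmax(axis=1)]
# Rule 2: take the expected (probability-weighted) rating.
expected_rating = probs @ ratings

print("max-probability rating:", max_prob_rating)
print("expected rating:       ", expected_rating.round(2))
```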

  13. Multidecadal Scale Detection Time for Potentially Increasing Atlantic Storm Surges in a Warming Climate

    NASA Astrophysics Data System (ADS)

    Lee, Benjamin Seiyon; Haran, Murali; Keller, Klaus

    2017-10-01

    Storm surges are key drivers of coastal flooding, which generate considerable risks. Strategies to manage these risks can hinge on the ability to (i) project the return periods of extreme storm surges and (ii) detect potential changes in their statistical properties. There are several lines of evidence linking rising global average temperatures and increasingly frequent extreme storm surges. This conclusion is, however, subject to considerable structural uncertainty. This leads to two main questions: What are projections under various plausible statistical models? How long would it take to distinguish among these plausible statistical models? We address these questions by analyzing observed and simulated storm surge data. We find that (1) there is a positive correlation between global mean temperature rise and increasing frequencies of extreme storm surges; (2) there is considerable uncertainty underlying the strength of this relationship; and (3) if the frequency of storm surges is increasing, this increase can be detected within a multidecadal timescale (≈20 years from now).
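
    To illustrate how a multidecadal detection time can arise, the sketch below simulates extreme-surge counts from a Poisson process with an assumed log-linear rate increase and estimates the power of a simple conditional two-half test; the rates, growth, and test choice are illustrative, not those of the paper.

```python
import numpy as np
from scipy.stats import binomtest

rng = np.random.default_rng(5)

def detection_power(years, base_rate=1.0, growth=0.03, n_sim=2000, alpha=0.05):
    """Power to detect an increasing Poisson rate of extreme surges: split
    the record in half; under H0 the second-half count is Binomial(total, 0.5).
    Use an even number of years so the halves are equal."""
    t = np.arange(years)
    lam = base_rate * np.exp(growth * t)   # assumed log-linear rate increase
    rejections = 0
    for _ in range(n_sim):
        counts = rng.poisson(lam)
        n1 = counts[: years // 2].sum()
        n2 = counts[years // 2:].sum()
        if n1 + n2 == 0:
            continue
        p = binomtest(n2, n1 + n2, 0.5, alternative="greater").pvalue
        rejections += p < alpha
    return rejections / n_sim

for years in (10, 20, 30, 40):
    print(f"{years:2d} yr record -> power {detection_power(years):.2f}")
```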

  14. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    NASA Astrophysics Data System (ADS)

    O'Neill, B. C.

    2015-12-01

    Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g. economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However, many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies differences in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed. Options under consideration for improving the approach include the use of perturbed physics ensembles of CESM, employing results from multiple climate models, and combining the results from single impact models with statistical representations of uncertainty across multiple models. A key consideration is the relationship between the question being addressed and the uncertainty approach.

  15. Integrated carbon budget models for the Everglades terrestrial-coastal-oceanic gradient: Current status and needs for inter-site comparisons

    USGS Publications Warehouse

    Troxler, Tiffany G.; Gaiser, Evelyn; Barr, Jordan; Fuentes, Jose D.; Jaffe, Rudolf; Childers, Daniel L.; Collado-Vides, Ligia; Rivera-Monroy, Victor H.; Castañeda-Moya, Edward; Anderson, William; Chambers, Randy; Chen, Meilian; Coronado-Molina, Carlos; Davis, Stephen E.; Engel, Victor C.; Fitz, Carl; Fourqurean, James; Frankovich, Tom; Kominoski, John; Madden, Chris; Malone, Sparkle L.; Oberbauer, Steve F.; Olivas, Paulo; Richards, Jennifer; Saunders, Colin; Schedlbauer, Jessica; Scinto, Leonard J.; Sklar, Fred; Smith, Thomas J.; Smoak, Joseph M.; Starr, Gregory; Twilley, Robert; Whelan, Kevin

    2013-01-01

    Recent studies suggest that coastal ecosystems can bury significantly more C than tropical forests, indicating that continued coastal development and exposure to sea level rise and storms will have global biogeochemical consequences. The Florida Coastal Everglades Long Term Ecological Research (FCE LTER) site provides an excellent subtropical system for examining carbon (C) balance because of its exposure to historical changes in freshwater distribution and sea level rise and its history of significant long-term carbon-cycling studies. FCE LTER scientists used net ecosystem C balance and net ecosystem exchange data to estimate C budgets for riverine mangrove, freshwater marsh, and seagrass meadows, providing insights into the magnitude of C accumulation and lateral aquatic C transport. Rates of net C production in the riverine mangrove forest exceeded those reported for many tropical systems, including terrestrial forests, but there are considerable uncertainties around those estimates due to the high potential for gain and loss of C through aquatic fluxes. C production was approximately balanced between gain and loss in Everglades marshes; however, the contribution of periphyton increases uncertainty in these estimates. Moreover, while the approaches used for these initial estimates were informative, a resolved approach for addressing areas of uncertainty is critically needed for coastal wetland ecosystems. Once resolved, these C balance estimates, in conjunction with an understanding of drivers and key ecosystem feedbacks, can inform cross-system studies of ecosystem response to long-term changes in climate, hydrologic management, and other land use along coastlines.

  16. Uncertainty analysis of vegetation distribution in the northern high latitudes during the 21st century with a dynamic vegetation model.

    PubMed

    Jiang, Yueyang; Zhuang, Qianlai; Schaphoff, Sibyll; Sitch, Stephen; Sokolov, Andrei; Kicklighter, David; Melillo, Jerry

    2012-03-01

    This study aims to assess how high-latitude vegetation may respond under various climate scenarios during the 21st century, with a focus on analyzing the uncertainty induced by model parameters and how this uncertainty compares to that induced by various climates. The analysis was based on a set of 10,000 Monte Carlo ensemble Lund-Potsdam-Jena (LPJ) simulations for the northern high latitudes (45°N and polewards) for the period 1900-2100. The LPJ Dynamic Global Vegetation Model (LPJ-DGVM) was run under contemporary and future climates from four Special Report Emission Scenarios (SRES), A1FI, A2, B1, and B2, based on the Hadley Centre General Circulation Model (GCM), and six climate scenarios, X901M, X902L, X903H, X904M, X905L, and X906H from the Integrated Global System Model (IGSM) at the Massachusetts Institute of Technology (MIT). In the current dynamic vegetation model, some parameters are more important than others in determining the vegetation distribution. Parameters that control plant carbon uptake and light-use efficiency have the predominant influence on the vegetation distribution of both woody and herbaceous plant functional types. The relative importance of different parameters varies temporally and spatially and is influenced by climate inputs. In addition to climate, these parameters play an important role in determining the vegetation distribution in the region. The parameter-based uncertainties contribute most to the total uncertainty. The current warming conditions lead to a complexity of vegetation responses in the region. Temperate trees will be more sensitive to climate variability, compared with boreal forest trees and C3 perennial grasses. This sensitivity would result in a unanimous northward greenness migration due to anomalous warming in the northern high latitudes. Temporally, boreal needleleaved evergreen plants are projected to decline considerably, and a large portion of C3 perennial grass is projected to disappear by the end of the 21st century. In contrast, the area of temperate trees would increase, especially under the most extreme A1FI scenario. As the warming continues, the northward greenness expansion in the Arctic region could continue.
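
    A toy sketch of the parameter-versus-climate uncertainty partition: a logistic response surface stands in for LPJ, two influential parameters are Monte Carlo sampled, and the spread within scenarios is compared with the spread between scenarios; the surrogate, parameter values, and warming levels are invented.

```python
import numpy as np

rng = np.random.default_rng(21)
n_ens = 10_000

# Stand-in for the vegetation model: fractional cover of a plant functional
# type as a logistic response in two influential parameters (carbon uptake
# and light-use efficiency) and a climate warming anomaly.
def surrogate(alpha_c, alpha_l, dT):
    return 1.0 / (1.0 + np.exp(-(0.8 * alpha_c + 1.2 * alpha_l
                                 + 0.3 * dT - 1.0)))

alpha_c = rng.normal(0.5, 0.15, n_ens)   # carbon-uptake parameter
alpha_l = rng.normal(0.5, 0.15, n_ens)   # light-use efficiency parameter

scenarios = {"B1": 1.8, "A2": 3.4, "A1FI": 4.5}  # assumed warming (K)
cover = {s: surrogate(alpha_c, alpha_l, dT) for s, dT in scenarios.items()}

# Partition: spread across parameters (within scenario) vs across scenarios.
within = np.mean([c.std() for c in cover.values()])
between = np.std([c.mean() for c in cover.values()])
print(f"parameter-induced spread: {within:.3f}")
print(f"climate-induced spread:   {between:.3f}")
```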

  17. New Strategies in Radiation Therapy: Exploiting the Full Potential of Protons

    PubMed Central

    Mohan, Radhe; Mahajan, Anita; Minsky, Bruce D.

    2013-01-01

    Protons provide significant dosimetric advantages compared with photons due to their unique depth-dose distribution characteristics. However, they are more sensitive to the effects of intra- and inter-treatment fraction anatomic variations and uncertainties in treatment setup. Furthermore, in the current practice of proton therapy, the biological effectiveness of protons relative to photons is assumed to have a generic fixed value of 1.1. However, this is a simplification, and it is likely higher in different portions of the proton beam. Current clinical practice and trials have not fully exploited the unique physical and biological properties of protons. Intensity-modulated proton therapy, with its ability to manipulate energies (in addition to intensities), provides an entirely new dimension, which, with ongoing research, has considerable potential to increase the therapeutic ratio. PMID:24077353

  18. Accuracy assessment for a multi-parameter optical calliper in on line automotive applications

    NASA Astrophysics Data System (ADS)

    D'Emilia, G.; Di Gasbarro, D.; Gaspari, A.; Natale, E.

    2017-08-01

    In this work, a methodological approach based on the evaluation of measurement uncertainty is applied to an experimental test case from the automotive sector. The uncertainty model for different measurement procedures of a high-accuracy optical gauge is discussed in order to identify the best measuring performance of the system for on-line applications, as measurement requirements become more stringent. In particular, with reference to the industrial production and control strategies of high-performing turbochargers, two uncertainty models to be used with the optical calliper are proposed, discussed, and compared. The models are based on an integrated approach between measurement methods and production best practices, to emphasize their mutual coherence. The paper shows the possible advantages of measurement uncertainty modelling in keeping control of uncertainty propagation across all the indirect measurements used for statistical production control, on which further improvements can be based.

  19. Dynamics of entanglement and uncertainty relation in coupled harmonic oscillator system: exact results

    NASA Astrophysics Data System (ADS)

    Park, DaeKil

    2018-06-01

    The dynamics of entanglement and of the uncertainty relation are explored by solving the time-dependent Schrödinger equation for a coupled harmonic oscillator system analytically when the angular frequencies and coupling constant are arbitrarily time dependent. We derive the spectral and Schmidt decompositions for the vacuum solution. Using the decompositions, we derive analytical expressions for the von Neumann and Rényi entropies. Making use of the Wigner distribution function defined in phase space, we derive the time dependence of the position-momentum uncertainty relations. To show the dynamics of entanglement and the uncertainty relation graphically, we introduce two toy models and one realistic quenched model. While the dynamics can be conjectured by simple considerations in the toy models, the dynamics in the realistic quenched model is somewhat different from that in the toy models. In particular, the dynamics of entanglement exhibits a pattern similar to the dynamics of the uncertainty parameter in the realistic quenched model.

  20. Multiple effects and uncertainties of emission control policies in China: Implications for public health, soil acidification, and global temperature.

    PubMed

    Zhao, Yu; McElroy, Michael B; Xing, Jia; Duan, Lei; Nielsen, Chris P; Lei, Yu; Hao, Jiming

    2011-11-15

    Policies to control emissions of criteria pollutants in China may have conflicting impacts on public health, soil acidification, and climate. Two scenarios for 2020, a base case without anticipated control measures and a more realistic case including such controls, are evaluated to quantify the effects of the policies on emissions and resulting environmental outcomes. Large benefits to public health can be expected from the controls, attributed mainly to reduced emissions of primary PM and gaseous PM precursors, and thus lower ambient concentrations of PM2.5. Approximately 4% of all-cause mortality in the country can be avoided (95% confidence interval: 1-7%), particularly in eastern and north-central China, regions with large population densities and high levels of PM2.5. Surface ozone levels, however, are estimated to increase in parts of those regions, despite NOX reductions. This implies VOC-limited conditions. Even with significant reduction of SO2 and NOX emissions, the controls will not significantly mitigate risks of soil acidification, judged by the exceedance levels of critical load (CL). This is due to the decrease in primary PM emissions, with the consequent reduction in deposition of alkaline base cations. Compared to 2005, even larger CL exceedances are found for both scenarios in 2020, implying that PM control may negate any recovery from soil acidification due to SO2 reductions. Noting large uncertainties, current policies to control emissions of criteria pollutants in China will not reduce climate warming, since controlling SO2 emissions also reduces reflective secondary aerosols. Black carbon emission is an important source of uncertainty concerning the effects of Chinese control policies on global temperature change. Given these conflicts, greater consideration should be paid to reconciling varied environmental objectives, and emission control strategies should target not only criteria pollutants but also species such as VOCs and CO2. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. ICESat laser altimetry over small mountain glaciers

    NASA Astrophysics Data System (ADS)

    Treichler, Désirée; Kääb, Andreas

    2016-09-01

    Using sparsely glaciated southern Norway as a case study, we assess the potential and limitations of ICESat laser altimetry for analysing regional glacier elevation change in rough mountain terrain. Differences between ICESat GLAS elevations and reference elevation data are plotted over time to derive a glacier surface elevation trend for the ICESat acquisition period 2003-2008. We find spatially varying biases between ICESat and three tested digital elevation models (DEMs): the Norwegian national DEM, the SRTM DEM, and a high-resolution lidar DEM. For regional glacier elevation change, the spatial inconsistency of reference DEMs - a result of spatio-temporal merging - has the potential to significantly affect or dilute trends. Elevation uncertainties of all three tested DEMs exceed the ICESat elevation uncertainty by an order of magnitude and thus limit the accuracy of the method, rather than the ICESat uncertainty itself. The ICESat sample matches the glacier size distribution of the study area well and includes small ice patches not commonly monitored in situ. The sample is large enough for spatial and thematic subsetting. Vertical offsets to ICESat elevations vary for different glaciers in southern Norway due to spatially inconsistent reference DEM age. We introduce a per-glacier correction that removes these spatially varying offsets, and considerably increases trend significance. Only after application of this correction do individual campaigns fit observed in situ glacier mass balance. Our correction also has the potential to improve glacier trend significance for other causes of spatially varying vertical offsets, for instance due to radar penetration into ice and snow for the SRTM DEM or as a consequence of the mosaicking and merging that is common for national or global DEMs. After correction of reference elevation bias, we find that ICESat provides a robust and realistic estimate of a moderately negative glacier mass balance of around -0.36 ± 0.07 m ice per year. This regional estimate agrees well with the heterogeneous but overall negative in situ glacier mass balance observed in the area.
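
    A minimal sketch of the per-glacier offset correction and its effect on trend significance, using synthetic footprints on two glaciers whose reference DEM dates differ; the offsets, noise level, and thinning rate are invented.

```python
import numpy as np

rng = np.random.default_rng(9)

def trend_with_se(t, dh):
    """Linear trend (m/yr) and its standard error via least squares."""
    coef, cov = np.polyfit(t, dh, 1, cov=True)
    return coef[0], np.sqrt(cov[0, 0])

# Synthetic ICESat-minus-DEM differences on two glaciers whose reference DEM
# dates differ, giving offsets of +4 m and -6 m on top of a shared
# -0.4 m/yr thinning signal plus 2 m measurement scatter.
t = rng.uniform(2003, 2008, 400)
gid = rng.integers(0, 2, 400)
dh = np.where(gid == 0, 4.0, -6.0) - 0.4 * (t - 2003) + rng.normal(0, 2.0, 400)

# Per-glacier correction: subtract each glacier's median offset before
# fitting the regional elevation-change trend.
dh_corr = dh.copy()
for g in np.unique(gid):
    dh_corr[gid == g] -= np.median(dh_corr[gid == g])

for label, y in [("raw", dh), ("corrected", dh_corr)]:
    slope, se = trend_with_se(t, y)
    print(f"{label:9s} trend: {slope:+.2f} +/- {se:.2f} m/yr")
```

    The correction leaves the common thinning signal intact but removes the between-glacier offset variance, which is what tightens the trend's standard error.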

  2. Planning for robust reserve networks using uncertainty analysis

    USGS Publications Warehouse

    Moilanen, A.; Runge, M.C.; Elith, Jane; Tyre, A.; Carmel, Y.; Fegraus, E.; Wintle, B.A.; Burgman, M.; Ben-Haim, Y.

    2006-01-01

    Planning land-use for biodiversity conservation frequently involves computer-assisted reserve selection algorithms. Typically such algorithms operate on matrices of species presence-absence in sites, or on species-specific distributions of model-predicted probabilities of occurrence in grid cells. There are practically always errors in input data: erroneous species presence-absence data, structural and parametric uncertainty in predictive habitat models, and lack of correspondence between temporal presence and long-run persistence. Despite these uncertainties, typical reserve selection methods proceed as if there is no uncertainty in the data or models. Having two conservation options of apparently equal biological value, one would prefer the option whose value is relatively insensitive to errors in planning inputs. In this work we show how uncertainty analysis for reserve planning can be implemented within a framework of information-gap decision theory, generating reserve designs that are robust to uncertainty. Consideration of uncertainty involves modifications to the typical objective functions used in reserve selection. Search for robust-optimal reserve structures can still be implemented via typical reserve selection optimization techniques, including stepwise heuristics, integer-programming and stochastic global search.
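
    A compact sketch of info-gap robustness applied to reserve choice: with site-specific error bounds around nominal occupancy probabilities, a reserve's robustness is the largest uncertainty horizon at which its worst-case value still meets a performance floor; the probabilities, error bounds, and floor are invented.

```python
import numpy as np

# Info-gap robustness for reserve selection.  Site occupancy estimates p
# carry site-specific error bounds e (e.g. survey quality); at uncertainty
# horizon h the worst case at site i is p_i - h * e_i.
def robustness(p, e, selected, floor, h_grid=np.linspace(0, 2, 2001)):
    """Largest h at which the worst-case expected number of protected
    occurrences still meets the floor (capped at the grid maximum)."""
    best_h = 0.0
    for h in h_grid:
        worst = np.clip(p - h * e, 0.0, 1.0)[selected].sum()
        if worst >= floor:
            best_h = h
        else:
            break
    return best_h

p = np.array([0.90, 0.85, 0.60, 0.55, 0.50])   # hypothetical occupancies
e = np.array([0.40, 0.40, 0.05, 0.05, 0.05])   # poorly vs well surveyed

reserve_a = np.array([0, 1])      # higher nominal value, poorly surveyed
reserve_b = np.array([2, 3, 4])   # lower nominal value, well surveyed

for name, sel in [("A", reserve_a), ("B", reserve_b)]:
    print(f"reserve {name}: nominal {p[sel].sum():.2f}, "
          f"robustness {robustness(p, e, sel, floor=1.5):.2f}")
```

    In this toy example the nominally poorer reserve tolerates far more input error before failing the floor: exactly the trade-off that info-gap analysis is designed to expose.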

  3. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data.

    PubMed

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-12

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.

  4. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data

    NASA Astrophysics Data System (ADS)

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-01

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.

  5. The effect of soil heterogeneity on ATES performance

    NASA Astrophysics Data System (ADS)

    Sommer, W.; Rijnaarts, H.; Grotenhuis, T.; van Gaans, P.

    2012-04-01

    Due to an increasing demand for sustainable energy, application of Aquifer Thermal Energy Storage (ATES) is growing rapidly. Large-scale application of ATES is limited by the space that is available in the subsurface. Especially in urban areas, suboptimal performance is expected due to thermal interference between individual wells of a single system, or interference with other ATES systems or groundwater abstractions. To avoid thermal interference there are guidelines on well spacing. However, these guidelines, and also design calculations, are based on the assumption of a homogeneous subsurface, while studies report a standard deviation in log-permeability of 1 to 2 for unconsolidated aquifers (Gelhar, 1993). Such heterogeneity may create preferential pathways, reducing ATES performance due to increased advective heat loss or interference between ATES wells. The role of hydraulic heterogeneity of the subsurface in ATES performance has received little attention in the literature. Previous research shows that even small amounts of heterogeneity can result in considerable uncertainty in the distribution of thermal energy in the subsurface and an increased radius of influence (Ferguson, 2007). This is supported by subsurface temperature measurements around ATES wells, which suggest heterogeneity gives rise to preferential pathways and short-circuiting between ATES wells (Bridger and Allen, 2010). Using 3-dimensional stochastic heat transport modeling, we quantified the influence of heterogeneity on the performance of a doublet well energy storage system. The following key parameters are varied to study their influence on thermal recovery and thermal balance: 1) regional flow velocity, 2) distance between wells and 3) characteristics of the heterogeneity. Results show that heterogeneity at the scale of a doublet ATES system introduces an uncertainty of up to 18% in expected thermal recovery. The uncertainty increases with decreasing distance between ATES wells. The uncertainty in the thermal balance ratio related to heterogeneity is limited (smaller than 3%). If thermal interference should be avoided, wells in heterogeneous aquifers should be placed further apart than in homogeneous aquifers, leading to a larger volume claim in the subsurface. By relating the number of ATES systems in an area to their expected performance, these results can be used to optimize regional application of ATES. Bridger, D. W. and D. M. Allen (2010). "Heat transport simulations in a heterogeneous aquifer used for aquifer thermal energy storage (ATES)." Canadian Geotechnical Journal 47(1): 96-115. Ferguson, G. (2007). "Heterogeneity and thermal modeling of ground water." Ground Water 45(4): 485-490. Gelhar, L. W. (1993). Stochastic Subsurface Hydrology, Prentice Hall.
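
    A small sketch of the thermal-recovery metric that such studies quantify, assuming a warm well that injects at 16 °C into a 10 °C aquifer and whose extraction temperature decays as ambient groundwater mixes in; all volumes and temperatures are invented.

```python
import numpy as np

def thermal_recovery(t_inj, t_ext, t_ambient, v_inj, v_ext_step):
    """Energy recovered during extraction divided by energy injected,
    both measured relative to ambient temperature; volumes in m3."""
    e_in = v_inj * (t_inj - t_ambient)
    e_out = np.sum(v_ext_step * (np.asarray(t_ext) - t_ambient))
    return e_out / e_in

# Hypothetical warm well: inject 100,000 m3 at 16 degC into a 10 degC
# aquifer; extraction temperature declines step by step as heterogeneity-
# driven preferential flow mixes in ambient groundwater.
t_ext = np.linspace(15.2, 11.5, 10)     # per 10,000 m3 extracted
r = thermal_recovery(16.0, t_ext, 10.0, 100_000, 10_000)
print(f"thermal recovery = {r:.2f}")
```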

  6. Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California

    NASA Astrophysics Data System (ADS)

    Mahdyiar, M.; Guin, J.

    2005-12-01

    Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. It is the objective of this study to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and sites' spatial correlation on typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuations and regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. From the hazard side, there are uncertainties in the magnitudes, rates and mechanisms of the seismic sources and local site conditions and ground motion site amplifications. From the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. From an analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and integrating thousands of loss distribution curves with different degrees of correlation. In this paper we propose to address some of these issues by conducting loss analyses of a typical small portfolio in southern California, taking into consideration various source and ground motion uncertainties. The approach is designed to integrate loss distribution functions with different degrees of correlation for portfolio analysis. The analysis is based on USGS 2002 regional seismicity model.

  7. Approximating uncertainty of annual runoff and reservoir yield using stochastic replicates of global climate model data

    NASA Astrophysics Data System (ADS)

    Peel, M. C.; Srikanthan, R.; McMahon, T. A.; Karoly, D. J.

    2015-04-01

    Two key sources of uncertainty in projections of future runoff for climate change impact assessments are uncertainty between global climate models (GCMs) and within a GCM. Within-GCM uncertainty is the variability in GCM output that occurs when running a scenario multiple times but each run has slightly different, but equally plausible, initial conditions. The limited number of runs available for each GCM and scenario combination within the Coupled Model Intercomparison Project phase 3 (CMIP3) and phase 5 (CMIP5) data sets, limits the assessment of within-GCM uncertainty. In this second of two companion papers, the primary aim is to present a proof-of-concept approximation of within-GCM uncertainty for monthly precipitation and temperature projections and to assess the impact of within-GCM uncertainty on modelled runoff for climate change impact assessments. A secondary aim is to assess the impact of between-GCM uncertainty on modelled runoff. Here we approximate within-GCM uncertainty by developing non-stationary stochastic replicates of GCM monthly precipitation and temperature data. These replicates are input to an off-line hydrologic model to assess the impact of within-GCM uncertainty on projected annual runoff and reservoir yield. We adopt stochastic replicates of available GCM runs to approximate within-GCM uncertainty because large ensembles, hundreds of runs, for a given GCM and scenario are unavailable, other than the Climateprediction.net data set for the Hadley Centre GCM. To date within-GCM uncertainty has received little attention in the hydrologic climate change impact literature and this analysis provides an approximation of the uncertainty in projected runoff, and reservoir yield, due to within- and between-GCM uncertainty of precipitation and temperature projections. In the companion paper, McMahon et al. (2015) sought to reduce between-GCM uncertainty by removing poorly performing GCMs, resulting in a selection of five better performing GCMs from CMIP3 for use in this paper. Here we present within- and between-GCM uncertainty results in mean annual precipitation (MAP), mean annual temperature (MAT), mean annual runoff (MAR), the standard deviation of annual precipitation (SDP), standard deviation of runoff (SDR) and reservoir yield for five CMIP3 GCMs at 17 worldwide catchments. Based on 100 stochastic replicates of each GCM run at each catchment, within-GCM uncertainty was assessed in relative form as the standard deviation expressed as a percentage of the mean of the 100 replicate values of each variable. The average relative within-GCM uncertainties from the 17 catchments and 5 GCMs for 2015-2044 (A1B) were MAP 4.2%, SDP 14.2%, MAT 0.7%, MAR 10.1% and SDR 17.6%. The Gould-Dincer Gamma (G-DG) procedure was applied to each annual runoff time series for hypothetical reservoir capacities of 1 × MAR and 3 × MAR and the average uncertainties in reservoir yield due to within-GCM uncertainty from the 17 catchments and 5 GCMs were 25.1% (1 × MAR) and 11.9% (3 × MAR). Our approximation of within-GCM uncertainty is expected to be an underestimate due to not replicating the GCM trend. However, our results indicate that within-GCM uncertainty is important when interpreting climate change impact assessments. Approximately 95% of values of MAP, SDP, MAT, MAR, SDR and reservoir yield from 1 × MAR or 3 × MAR capacity reservoirs are expected to fall within twice their respective relative uncertainty (standard deviation/mean). 
Within-GCM uncertainty has significant implications for interpreting climate change impact assessments that report future changes within our range of uncertainty for a given variable - these projected changes may be due solely to within-GCM uncertainty. Since within-GCM variability is amplified from precipitation to runoff and then to reservoir yield, climate change impact assessments that do not take into account within-GCM uncertainty risk providing water resources management decision makers with a sense of certainty that is unjustified.
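
    A toy sketch of the replicate logic under stated assumptions: stationary stochastic replicates of one GCM precipitation run (the paper's replicates are non-stationary), a threshold-quadratic rainfall-runoff relation in place of the hydrologic model, and relative uncertainty computed as in the paper (standard deviation as a percentage of the mean across 100 replicates).

```python
import numpy as np

rng = np.random.default_rng(15)

# One hypothetical GCM run: 30 years of annual precipitation (mm).
gcm_precip = rng.normal(800, 120, 30)

# 100 stochastic replicates preserving the run's mean and variance
# (a stationary stand-in for the paper's non-stationary stochastic model).
reps = rng.normal(gcm_precip.mean(), gcm_precip.std(ddof=1), size=(100, 30))

def runoff(p, p0=500.0):
    """Toy nonlinear rainfall-runoff relation: no runoff below a threshold,
    then a quadratic rise -- enough to amplify relative variability."""
    excess = np.maximum(p - p0, 0.0)
    return 0.002 * excess ** 2

map_rep = reps.mean(axis=1)            # mean annual precipitation per replicate
mar_rep = runoff(reps).mean(axis=1)    # mean annual runoff per replicate

for name, x in [("MAP", map_rep), ("MAR", mar_rep)]:
    rel = 100 * x.std(ddof=1) / x.mean()
    print(f"{name}: relative within-GCM uncertainty = {rel:.1f}%")
```

    The nonlinearity of the rainfall-runoff transformation is what amplifies within-GCM variability from precipitation to runoff, and a reservoir storage-yield analysis amplifies it further.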

  8. Structural and parametric uncertainty quantification in cloud microphysics parameterization schemes

    NASA Astrophysics Data System (ADS)

    van Lier-Walqui, M.; Morrison, H.; Kumjian, M. R.; Prat, O. P.; Martinkus, C.

    2017-12-01

    Atmospheric model parameterization schemes employ approximations to represent the effects of unresolved processes. These approximations are a source of error in forecasts, caused in part by considerable uncertainty about the optimal value of parameters within each scheme -- parametric uncertainty. Furthermore, there is uncertainty regarding the best choice of the overarching structure of the parameterization scheme -- structural uncertainty. Parameter estimation can constrain the first, but may struggle with the second because structural choices are typically discrete. We address this problem in the context of cloud microphysics parameterization schemes by creating a flexible framework wherein structural and parametric uncertainties can be simultaneously constrained. Our scheme makes no assumptions about drop size distribution shape or the functional form of parametrized process rate terms. Instead, these uncertainties are constrained by observations using a Markov chain Monte Carlo sampler within a Bayesian inference framework. Our scheme, the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS), has the flexibility to predict various sets of prognostic drop size distribution moments as well as varying complexity of process rate formulations. We compare idealized probabilistic forecasts from versions of BOSS with varying levels of structural complexity. This work has applications in ensemble forecasts with model physics uncertainty, data assimilation, and cloud microphysics process studies.
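
    As a rough sketch of the kind of Bayesian constraint described above, the following toy random-walk Metropolis sampler jointly estimates the parameters of an assumed power-law process rate from synthetic observations. The rate form, error scales, and tuning constants are all invented for illustration; BOSS itself uses more flexible process-rate structures and prognostic moment sets.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic "observations" of a process rate assumed to follow a
    # power law in a DSD moment, rate = a * M**b (a stand-in for the
    # flexible process-rate forms in BOSS, not the actual scheme).
    M = np.linspace(0.5, 5.0, 40)
    true_a, true_b = 2.0, 1.5
    obs = true_a * M**true_b + rng.normal(0.0, 0.5, M.size)

    def log_posterior(theta):
        a, b = theta
        if a <= 0.0:  # flat prior with a positivity constraint on a
            return -np.inf
        resid = obs - a * M**b
        return -0.5 * np.sum((resid / 0.5) ** 2)

    # Random-walk Metropolis: samples the joint parametric uncertainty
    # of (a, b) given the observations.
    theta = np.array([1.0, 1.0])
    lp = log_posterior(theta)
    samples = []
    for _ in range(20000):
        prop = theta + rng.normal(0.0, 0.05, 2)
        lp_prop = log_posterior(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta.copy())
    samples = np.array(samples[5000:])  # discard burn-in
    print("posterior mean a, b:", samples.mean(axis=0))
    ```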

  9. Host Model Uncertainty in Aerosol Radiative Effects: the AeroCom Prescribed Experiment and Beyond

    NASA Astrophysics Data System (ADS)

    Stier, Philip; Schutgens, Nick; Bian, Huisheng; Boucher, Olivier; Chin, Mian; Ghan, Steven; Huneeus, Nicolas; Kinne, Stefan; Lin, Guangxing; Myhre, Gunnar; Penner, Joyce; Randles, Cynthia; Samset, Bjorn; Schulz, Michael; Yu, Hongbin; Zhou, Cheng; Bellouin, Nicolas; Ma, Xiaoyan; Yu, Fangqun; Takemura, Toshihiko

    2013-04-01

    Anthropogenic and natural aerosol radiative effects are recognized to affect global and regional climate. Multi-model "diversity" in estimates of the aerosol radiative effect is often perceived as a measure of the uncertainty in modelling aerosol itself. However, current aerosol models vary considerably in model components relevant for the calculation of aerosol radiative forcings and feedbacks, and the associated "host-model uncertainties" are generally convoluted with the actual uncertainty in aerosol modelling. In the AeroCom Prescribed intercomparison study we systematically isolate and quantify host model uncertainties in aerosol forcing experiments through prescription of identical aerosol radiative properties in eleven participating models. Host model errors in aerosol radiative forcing are largest in regions of uncertain host model components, such as stratocumulus cloud decks, or areas with poorly constrained surface albedos, such as sea ice. Our results demonstrate that host model uncertainty is an important component of aerosol forcing uncertainty that requires further attention. However, uncertainties in aerosol radiative effects also include short-term and long-term feedback processes that will be systematically explored in future intercomparison studies. Here we present an overview of the proposals for discussion and results from early scoping studies.

  10. Methods for Estimating the Uncertainty in Emergy Table-Form Models

    EPA Science Inventory

    Emergy studies have suffered criticism due to the lack of uncertainty analysis and this shortcoming may have directly hindered the wider application and acceptance of this methodology. Recently, to fill this gap, the sources of uncertainty in emergy analysis were described and an...

  11. A COMPREHENSIVE ANALYSIS OF UNCERTAINTIES AFFECTING THE STELLAR MASS-HALO MASS RELATION FOR 0 < z < 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Behroozi, Peter S.; Wechsler, Risa H.; Conroy, Charlie

    2010-07-01

    We conduct a comprehensive analysis of the relationship between central galaxies and their host dark matter halos, as characterized by the stellar mass-halo mass (SM-HM) relation, with rigorous consideration of uncertainties. Our analysis focuses on results from the abundance matching technique, which assumes that every dark matter halo or subhalo above a specific mass threshold hosts one galaxy. We provide a robust estimate of the SM-HM relation for 0 < z < 1 and discuss the quantitative effects of uncertainties in observed galaxy stellar mass functions (including stellar mass estimates and counting uncertainties), halo mass functions (including cosmology and uncertainties from substructure), and the abundance matching technique used to link galaxies to halos (including scatter in this connection). Our analysis results in a robust estimate of the SM-HM relation and its evolution from z = 0 to z = 4. The shape and the evolution are well constrained for z < 1. The largest uncertainties at these redshifts are due to stellar mass estimates (0.25 dex uncertainty in normalization); however, failure to account for scatter in stellar masses at fixed halo mass can lead to errors of similar magnitude in the SM-HM relation for central galaxies in massive halos. We also investigate the SM-HM relation to z = 4, although the shape of the relation at higher redshifts remains fairly unconstrained when uncertainties are taken into account. We find that the integrated star formation at a given halo mass peaks at 10%-20% of available baryons for all redshifts from 0 to 4. This peak occurs at a halo mass of 7 × 10^11 M_sun at z = 0 and this mass increases by a factor of 5 to z = 4. At lower and higher masses, star formation is substantially less efficient, with stellar mass scaling as M_* ~ M_h^2.3 at low masses and M_* ~ M_h^0.29 at high masses. The typical stellar mass for halos with mass less than 10^12 M_sun has increased by 0.3-0.45 dex since z ≈ 1. These results will provide a powerful tool to inform galaxy evolution models.
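
    The zero-scatter core of the abundance matching technique is simple rank matching at equal cumulative number density. A minimal sketch with invented toy mass distributions (not the mass functions or scatter treatment used in the paper):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical catalogs: halo masses from an N-body-like mass
    # function and stellar masses from an observed-like stellar mass
    # function. Both are toy distributions for illustration only.
    halo_mass = 10 ** rng.uniform(11.0, 15.0, 100000)     # M_sun
    stellar_mass = 10 ** rng.normal(10.5, 0.6, 100000)    # M_sun

    # Zero-scatter abundance matching: the i-th most massive halo hosts
    # the i-th most massive galaxy, preserving cumulative number density.
    halo_sorted = np.sort(halo_mass)[::-1]
    stellar_sorted = np.sort(stellar_mass)[::-1]

    # The resulting SM-HM relation, shown at a few ranks.
    for i in [10, 1000, 50000]:
        print(f"M_h = {halo_sorted[i]:.2e}  ->  M_* = {stellar_sorted[i]:.2e}")
    ```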

  12. Known unknowns: building an ethics of uncertainty into genomic medicine.

    PubMed

    Newson, Ainsley J; Leonard, Samantha J; Hall, Alison; Gaff, Clara L

    2016-09-01

    Genomic testing has reached the point where, technically at least, it can be cheaper to undertake panel-, exome- or whole genome testing than it is to sequence a single gene. An attribute of these approaches is that information gleaned will often have uncertain significance. In addition to the challenges this presents for pre-test counseling and informed consent, a further consideration emerges over how - ethically - we should conceive of and respond to this uncertainty. To date, the ethical aspects of uncertainty in genomics have remained under-explored. In this paper, we draft a conceptual and ethical response to the question of how to conceive of and respond to uncertainty in genomic medicine. After introducing the problem, we articulate a concept of 'genomic uncertainty'. Drawing on this, together with exemplar clinical cases and related empirical literature, we then critique the presumption that uncertainty is always problematic and something to be avoided, or eradicated. We conclude by outlining an 'ethics of genomic uncertainty'; describing how we might handle uncertainty in genomic medicine. This involves fostering resilience, welfare, autonomy and solidarity. Uncertainty will be an inherent aspect of clinical practice in genomics for some time to come. Genomic testing should not be offered with the explicit aim to reduce uncertainty. Rather, uncertainty should be appraised, adapted to and communicated about as part of the process of offering and providing genomic information.

  13. Water Table Uncertainties due to Uncertainties in Structure and Properties of an Unconfined Aquifer.

    PubMed

    Hauser, Juerg; Wellmann, Florian; Trefry, Mike

    2018-03-01

    We consider two sources of geology-related uncertainty in making predictions of the steady-state water table elevation for an unconfined aquifer: the uncertainty in the depth to the base of the aquifer and the uncertainty in the hydraulic conductivity distribution within the aquifer. Stochastic approaches to hydrological modeling commonly use geostatistical techniques to account for hydraulic conductivity uncertainty within the aquifer. In the absence of well data allowing derivation of a relationship between geophysical and hydrological parameters, the use of geophysical data is often limited to constraining the structural boundaries. If we recover the base of an unconfined aquifer from an analysis of geophysical data, then the associated uncertainties are a consequence of the geophysical inversion process. In this study, we illustrate this by quantifying water table uncertainties for the unconfined aquifer formed by the paleochannel network around the Kintyre Uranium deposit in Western Australia. The focus of the Bayesian parametric bootstrap approach employed for the inversion of the available airborne electromagnetic data is the recovery of the base of the paleochannel network and the associated uncertainties. This then allows us to quantify the associated influences on the water table in a conceptualized groundwater usage scenario and compare the resulting uncertainties with uncertainties due to an uncertain hydraulic conductivity distribution within the aquifer. Our modeling shows that neither uncertainties in the depth to the base of the aquifer nor hydraulic conductivity uncertainties alone can capture the patterns of uncertainty in the water table that emerge when the two are combined. © 2017, National Ground Water Association.

  14. Multi-model inference for incorporating trophic and climate uncertainty into stock assessments

    NASA Astrophysics Data System (ADS)

    Ianelli, James; Holsman, Kirstin K.; Punt, André E.; Aydin, Kerim

    2016-12-01

    Ecosystem-based fisheries management (EBFM) approaches allow a broader and more extensive consideration of objectives than is typically possible with conventional single-species approaches. Ecosystem linkages may include trophic interactions and climate change effects on productivity for the relevant species within the system. Presently, models are evolving to include a comprehensive set of fishery and ecosystem information to address these broader management considerations. The increased scope of EBFM approaches is accompanied by a greater number of plausible models to describe the systems. This can lead to harvest recommendations and biological reference points that differ considerably among models. Model selection for projections (and specific catch recommendations) often occurs through a process that tends to adopt familiar, often simpler, models without considering those that incorporate more complex ecosystem information. Multi-model inference provides a framework that resolves this dilemma by providing a means of including information from alternative, often divergent, models to inform biological reference points and possible catch consequences. We apply an example of this approach to data for three species of groundfish in the Bering Sea (walleye pollock, Pacific cod, and arrowtooth flounder) using three models: 1) an age-structured "conventional" single-species model, 2) an age-structured single-species model with temperature-specific weight at age, and 3) a temperature-specific multi-species stock assessment model. The latter two approaches also include consideration of alternative future climate scenarios, adding another dimension to evaluate model projection uncertainty. We show how Bayesian model-averaging methods can be used to incorporate such trophic and climate information to broaden single-species stock assessments by using an EBFM approach that may better characterize uncertainty.
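
    Model averaging of reference points can be sketched with information-criterion weights, one common way to form averaging weights (the paper itself uses Bayesian model averaging). All model outputs and scores below are invented for illustration:

    ```python
    import numpy as np

    # Hypothetical outputs from three stock-assessment models: a
    # biological reference point (e.g., B_MSY) and a model-fit score.
    # Values are invented, not the Bering Sea results.
    b_msy = np.array([120.0, 105.0, 90.0])      # thousand tonnes
    aic = np.array([412.3, 410.1, 415.8])       # lower is better

    # Akaike weights as a stand-in for posterior model probabilities.
    delta = aic - aic.min()
    weights = np.exp(-0.5 * delta)
    weights /= weights.sum()

    b_msy_avg = np.sum(weights * b_msy)
    # Between-model spread contributes to the averaged uncertainty.
    var = np.sum(weights * (b_msy - b_msy_avg) ** 2)
    print(f"weights: {weights.round(3)}")
    print(f"model-averaged B_MSY: {b_msy_avg:.1f} +/- {np.sqrt(var):.1f}")
    ```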

  15. The future of hydropower planning modeling

    NASA Astrophysics Data System (ADS)

    Haas, J.; Zuñiga, D.; Nowak, W.; Olivares, M. A.; Castelletti, A.; Thilmant, A.

    2017-12-01

    Planning the investment and operation of hydropower plants with optimization tools dates back to the 1970s. The focus used to be solely on the provision of energy. However, advances in computational capacity and solving algorithms, dynamic markets, expansion of renewable sources, and a better understanding of hydropower environmental impacts have recently led to the development of novel planning approaches. In this work, we provide a review, systematization, and trend analysis of these approaches. Further, through interviews with experts, we outline the future of hydropower planning modeling and identify the gaps that remain. We classified the models we found along environmental, economic, multipurpose and technical criteria. Environmental interactions include hydropeaking mitigation, water quality protection and limiting greenhouse gas emissions from reservoirs. Economic and regulatory criteria consider uncertainties of fossil fuel prices and relicensing of water rights and power purchase agreements. Multipurpose considerations account for irrigation, tourism, flood protection and drinking water. Recently included technical details account for sedimentation in reservoirs and variable efficiencies of turbines. Additional operational considerations relate to hydrological aspects such as dynamic reservoir inflows, water losses, and climate change. Although many of the above criteria have been addressed in detail on a project-to-project basis, models remain overly simplistic for planning large power fleets. Future hydropower planning tools are expected to improve the representation of the water-energy nexus, including environmental and multipurpose criteria. Further, they will concentrate on identifying new sources of operational flexibility (e.g. through installing additional turbines and pumps) for integrating renewable energy. The operational detail will increase, potentially emphasizing variable efficiencies, storage capacity losses due to sedimentation, and the timing of inflows (which are becoming more variable under climate change). Finally, the relicensing of existing operations and the planning of new installations are subject to deep uncertainties that need to be captured.

  16. Methodology for quantifying uncertainty in coal assessments with an application to a Texas lignite deposit

    USGS Publications Warehouse

    Olea, R.A.; Luppens, J.A.; Tewalt, S.J.

    2011-01-01

    A common practice for characterizing uncertainty in coal resource assessments has been the itemization of tonnage at the mining unit level and the classification of such units according to distance to drilling holes. Distance criteria, such as those used in U.S. Geological Survey Circular 891, are still widely used for public disclosure. A major deficiency of distance methods is that they do not provide a quantitative measure of uncertainty. Additionally, relying on distance between data points alone does not take into consideration other factors known to have an influence on uncertainty, such as spatial correlation, type of probability distribution followed by the data, geological discontinuities, and boundary of the deposit. Several geostatistical methods have been combined to formulate a quantitative characterization for appraising uncertainty. Drill hole datasets ranging from widespread exploration drilling to detailed development drilling from a lignite deposit in Texas were used to illustrate the modeling. The results show that distance to the nearest drill hole is almost completely unrelated to uncertainty, which confirms the inadequacy of characterizing uncertainty based solely on a simple classification of resources by distance classes. The more complex statistical methods used in this study quantify uncertainty and show good agreement between confidence intervals in the uncertainty predictions and data from additional drilling. © 2010.

  17. Assessing the relative importance of parameter and forcing uncertainty and their interactions in conceptual hydrological model simulations

    NASA Astrophysics Data System (ADS)

    Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.

    2016-11-01

    Predictions of river flow dynamics provide vital information for many aspects of water management, including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make, including model and criteria selection, can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
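
    The variance decomposition described above can be sketched with a two-way ANOVA over a grid of simulations, one rainfall ensemble member crossed with one parameter set per cell. The synthetic effect sizes below are invented; with a single simulation per cell, the residual term plays the role of the interaction:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical grid of simulated mean flows: n_f rainfall ensemble
    # members (forcing) x n_p behavioural parameter sets.
    n_f, n_p = 20, 30
    forcing_effect = rng.normal(0.0, 2.0, (n_f, 1))
    param_effect = rng.normal(0.0, 1.0, (1, n_p))
    interaction = rng.normal(0.0, 0.5, (n_f, n_p))
    flow = 10.0 + forcing_effect + param_effect + interaction

    grand = flow.mean()
    # Two-way ANOVA sums of squares (one simulation per cell).
    ss_forcing = n_p * np.sum((flow.mean(axis=1) - grand) ** 2)
    ss_param = n_f * np.sum((flow.mean(axis=0) - grand) ** 2)
    ss_total = np.sum((flow - grand) ** 2)
    ss_inter = ss_total - ss_forcing - ss_param

    for name, ss in [("forcing", ss_forcing), ("parameters", ss_param),
                     ("interaction", ss_inter)]:
        print(f"{name:12s} {100.0 * ss / ss_total:5.1f}% of total variance")
    ```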

  18. Climate Projections and Uncertainty Communication.

    PubMed

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. Copyright © 2015 Cognitive Science Society, Inc.

  19. An interval-based possibilistic programming method for waste management with cost minimization and environmental-impact abatement under uncertainty.

    PubMed

    Li, Y P; Huang, G H

    2010-09-15

    Considerable public concern has been raised in the past decades because the large amount of pollutant emissions from municipal solid waste (MSW) disposal processes poses risks to the surrounding environment and human health. Moreover, in MSW management, various uncertainties exist in the related costs, impact factors and objectives, which can affect the optimization processes and the decision schemes generated. In this study, an interval-based possibilistic programming (IBPP) method is developed for planning MSW management with minimized system cost and environmental impact under uncertainty. The developed method can deal with uncertainties expressed as interval values and fuzzy sets in the left- and right-hand sides of constraints and the objective function. An interactive algorithm is provided for solving the IBPP problem, which does not lead to more complicated intermediate submodels and has a relatively low computational requirement. The developed model is applied to a case study of planning a MSW management system, where the mixed integer linear programming (MILP) technique is introduced into the IBPP framework to facilitate dynamic analysis for decisions of timing, sizing and siting in terms of capacity expansion for waste-management facilities. Three cases based on different waste-management policies are examined. The results obtained indicate that inclusion of environmental impacts in the optimization model can change the traditional waste-allocation pattern based merely on the economic-oriented planning approach. The results can help identify desired alternatives for managing MSW, with the advantage of providing compromised schemes under an integrated consideration of economic efficiency and environmental impact under uncertainty. Copyright 2010 Elsevier B.V. All rights reserved.

  20. The Development and Assessment of Adaptation Pathways for Urban Pluvial Flooding

    NASA Astrophysics Data System (ADS)

    Babovic, F.; Mijic, A.; Madani, K.

    2017-12-01

    Around the globe, urban areas are growing in both size and importance. However, due to the prevalence of impermeable surfaces within the urban fabric of cities, these areas have a high risk of pluvial flooding. Due to the convergence of population growth and climate change, the risk of pluvial flooding is growing. When designing solutions and adaptations to pluvial flood risk, urban planners and engineers encounter a great deal of uncertainty stemming from model uncertainty, uncertainty within the data utilised, and uncertainty related to future climate and land use conditions. The interaction of these uncertainties leads to conditions of deep uncertainty. However, infrastructure systems must be designed and built in the face of this deep uncertainty. An Adaptation Tipping Points (ATP) methodology was used to develop a strategy to adapt an urban drainage system in the North East of London under conditions of deep uncertainty. The ATP approach was used to assess the current drainage system and potential drainage system adaptations. These adaptations were assessed against potential changes in rainfall depth and peakedness, defined as the ratio of mean to peak rainfall. The solutions encompassed both traditional and blue-green solutions that the Local Authority is known to be considering. This resulted in a set of Adaptation Pathways. However, these pathways do not convey any information regarding the relative merits and demerits of the potential adaptation options presented. To address this, a cost-benefit metric was developed to reflect the solutions' costs and benefits under uncertainty. The resulting metric combines elements of the Benefits of SuDS Tool (BeST) with real options analysis in order to reflect the potential value of ecosystem services delivered by blue-green solutions under uncertainty. Lastly, it is discussed how a local body can use the adaptation pathways, their relative costs and benefits, and a system of local data collection to help guide better decision making with respect to urban flood adaptation.

  1. Comparative quantification of health risks: Conceptual framework and methodological issues

    PubMed Central

    Murray, Christopher JL; Ezzati, Majid; Lopez, Alan D; Rodgers, Anthony; Vander Hoorn, Stephen

    2003-01-01

    Reliable and comparable analysis of risks to health is key for preventing disease and injury. Causal attribution of morbidity and mortality to risk factors has traditionally been conducted in the context of methodological traditions of individual risk factors, often in a limited number of settings, restricting comparability. In this paper, we discuss the conceptual and methodological issues for quantifying the population health effects of individual or groups of risk factors in various levels of causality using knowledge from different scientific disciplines. The issues include: comparing the burden of disease due to the observed exposure distribution in a population with the burden from a hypothetical distribution or series of distributions, rather than a single reference level such as non-exposed; considering the multiple stages in the causal network of interactions among risk factor(s) and disease outcome to allow making inferences about some combinations of risk factors for which epidemiological studies have not been conducted, including the joint effects of multiple risk factors; calculating the health loss due to risk factor(s) as a time-indexed "stream" of disease burden due to a time-indexed "stream" of exposure, including consideration of discounting; and the sources of uncertainty. PMID:12780936
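
    The comparison of burden under an observed versus a counterfactual exposure distribution is commonly summarized as a potential impact fraction (PIF). A minimal sketch with invented exposure categories, relative risks, and distributions:

    ```python
    import numpy as np

    # Potential impact fraction (PIF): the proportional change in disease
    # burden when moving from the observed exposure distribution P to a
    # counterfactual distribution P', given relative risks RR per
    # exposure category. All numbers are invented for illustration.
    rr = np.array([1.0, 1.4, 2.1, 3.0])          # relative risk by category
    p_observed = np.array([0.4, 0.3, 0.2, 0.1])
    p_counterfactual = np.array([0.7, 0.2, 0.1, 0.0])

    burden_obs = np.sum(p_observed * rr)
    burden_cf = np.sum(p_counterfactual * rr)
    pif = (burden_obs - burden_cf) / burden_obs
    print(f"PIF = {pif:.3f}")  # fraction of burden removed by the shift
    ```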

  2. Climate data induced uncertainty in model-based estimations of terrestrial primary productivity

    NASA Astrophysics Data System (ADS)

    Wu, Zhendong; Ahlström, Anders; Smith, Benjamin; Ardö, Jonas; Eklundh, Lars; Fensholt, Rasmus; Lehsten, Veiko

    2017-06-01

    Model-based estimations of historical fluxes and pools of the terrestrial biosphere differ substantially. These differences arise not only from differences between models but also from differences in the environmental and climatic data used as input to the models. Here we investigate the role of uncertainties in historical climate data by performing simulations of terrestrial gross primary productivity (GPP) using a process-based dynamic vegetation model (LPJ-GUESS) forced by six different climate datasets. We find that the climate-induced uncertainty, defined as the range among historical simulations in GPP when forcing the model with the different climate datasets, can be as high as 11 Pg C yr-1 globally (9% of mean GPP). We also assessed a hypothetical maximum climate data induced uncertainty by combining climate variables from different datasets, which resulted in significantly larger uncertainties of 41 Pg C yr-1 globally, or 32% of mean GPP. The uncertainty is partitioned into components associated with the three main climatic drivers: temperature, precipitation, and shortwave radiation. Additionally, we illustrate how the uncertainty due to a given climate driver depends both on the magnitude of the forcing data uncertainty (climate data range) and the apparent sensitivity of the modeled GPP to the driver (apparent model sensitivity). We find that LPJ-GUESS overestimates GPP compared to an empirically based GPP data product in all land cover classes except tropical forests. Tropical forests emerge as a disproportionate source of uncertainty in GPP estimation in both the simulations and the empirical data products. The tropical forest uncertainty is most strongly associated with shortwave radiation and precipitation forcing, of which the climate data range contributes more to the overall uncertainty than the apparent model sensitivity. Globally, precipitation dominates the climate-induced uncertainty over nearly half of the vegetated land area, mainly due to the climate data range and less so due to the apparent model sensitivity. Overall, climate data ranges are found to contribute more to the climate-induced uncertainty than apparent model sensitivity to forcing. Our study highlights the need to better constrain tropical climate, and demonstrates that uncertainty caused by climatic forcing data must be considered when comparing and evaluating carbon cycle model results and empirical datasets.

  3. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through the uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of the model development, so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses). Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping points) in the face of environmental and anthropogenic change (Perz, Muñoz-Carpena, Kiker and Holt, 2013), and, through Monte Carlo mapping of potential management activities over the most important factors or processes, to influence the system towards behavioral (desirable) outcomes (Chu-Agor, Muñoz-Carpena et al., 2012).
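
    Variance-based GSA of the kind referenced here rests on Sobol indices, S_i = V(E[Y|X_i])/V(Y). A minimal Monte Carlo sketch using a Saltelli-style pick-freeze estimator on a toy test function (the Ishigami function is a standard GSA benchmark, not one of the cited models):

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def model(x):
        # Ishigami-like response: a standard benchmark with interacting
        # factors, standing in for a complex environmental model.
        return (np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2
                + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0]))

    k, n = 3, 100000
    A = rng.uniform(-np.pi, np.pi, (n, k))
    B = rng.uniform(-np.pi, np.pi, (n, k))
    yA, yB = model(A), model(B)
    var_y = np.var(np.concatenate([yA, yB]))

    # Pick-freeze estimate of the first-order Sobol index for each factor:
    # S_i = V(E[Y | X_i]) / V(Y).
    for i in range(k):
        ABi = A.copy()
        ABi[:, i] = B[:, i]  # freeze factor i at B's values
        s_i = np.mean(yB * (model(ABi) - yA)) / var_y
        print(f"S_{i + 1} = {s_i:.3f}")
    ```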

  4. Global, Regional, and National Burden of Rheumatic Heart Disease, 1990-2015.

    PubMed

    Watkins, David A; Johnson, Catherine O; Colquhoun, Samantha M; Karthikeyan, Ganesan; Beaton, Andrea; Bukhman, Gene; Forouzanfar, Mohammed H; Longenecker, Christopher T; Mayosi, Bongani M; Mensah, George A; Nascimento, Bruno R; Ribeiro, Antonio L P; Sable, Craig A; Steer, Andrew C; Naghavi, Mohsen; Mokdad, Ali H; Murray, Christopher J L; Vos, Theo; Carapetis, Jonathan R; Roth, Gregory A

    2017-08-24

    Rheumatic heart disease remains an important preventable cause of cardiovascular death and disability, particularly in low-income and middle-income countries. We estimated global, regional, and national trends in the prevalence of and mortality due to rheumatic heart disease as part of the 2015 Global Burden of Disease study. We systematically reviewed data on fatal and nonfatal rheumatic heart disease for the period from 1990 through 2015. Two Global Burden of Disease analytic tools, the Cause of Death Ensemble model and DisMod-MR 2.1, were used to produce estimates of mortality and prevalence, including estimates of uncertainty. We estimated that there were 319,400 (95% uncertainty interval, 297,300 to 337,300) deaths due to rheumatic heart disease in 2015. Global age-standardized mortality due to rheumatic heart disease decreased by 47.8% (95% uncertainty interval, 44.7 to 50.9) from 1990 to 2015, but large differences were observed across regions. In 2015, the highest age-standardized mortality due to and prevalence of rheumatic heart disease were observed in Oceania, South Asia, and central sub-Saharan Africa. We estimated that in 2015 there were 33.4 million (95% uncertainty interval, 29.7 million to 43.1 million) cases of rheumatic heart disease and 10.5 million (95% uncertainty interval, 9.6 million to 11.5 million) disability-adjusted life-years due to rheumatic heart disease globally. We estimated the global disease prevalence of and mortality due to rheumatic heart disease over a 25-year period. The health-related burden of rheumatic heart disease has declined worldwide, but high rates of disease persist in some of the poorest regions in the world. (Funded by the Bill and Melinda Gates Foundation and the Medtronic Foundation.).

  5. The Fermi Galactic Center GeV excess and implications for dark matter

    DOE PAGES

    Ackermann, M.; Ajello, M.; Albert, A.; ...

    2017-05-04

    Here, the region around the Galactic Center (GC) is now well established to be brighter at energies of a few GeV than what is expected from conventional models of diffuse gamma-ray emission and catalogs of known gamma-ray sources. We study the GeV excess using 6.5 yr of data from the Fermi Large Area Telescope. We characterize the uncertainty of the GC excess spectrum and morphology due to uncertainties in cosmic-ray source distributions and propagation, uncertainties in the distribution of interstellar gas in the Milky Way, and uncertainties due to a potential contribution from the Fermi bubbles. We also evaluate uncertainties in the excess properties due to resolved point sources of gamma rays. The GC is of particular interest, as it would be expected to have the brightest signal from annihilation of weakly interacting massive dark matter (DM) particles. However, control regions along the Galactic plane, where a DM signal is not expected, show excesses of similar amplitude relative to the local background. Furthermore, based on the magnitude of the systematic uncertainties, we conservatively report upper limits for the annihilation cross-section as a function of particle mass and annihilation channel.

  6. Worldwide data sets constrain the water vapor uptake coefficient in cloud formation

    PubMed Central

    Raatikainen, Tomi; Nenes, Athanasios; Seinfeld, John H.; Morales, Ricardo; Moore, Richard H.; Lathem, Terry L.; Lance, Sara; Padró, Luz T.; Lin, Jack J.; Cerully, Kate M.; Bougiatioti, Aikaterini; Cozic, Julie; Ruehl, Christopher R.; Chuang, Patrick Y.; Anderson, Bruce E.; Flagan, Richard C.; Jonsson, Haflidi; Mihalopoulos, Nikos; Smith, James N.

    2013-01-01

    Cloud droplet formation depends on the condensation of water vapor on ambient aerosols, the rate of which is strongly affected by the kinetics of water uptake as expressed by the condensation (or mass accommodation) coefficient, αc. Estimates of αc for droplet growth from activation of ambient particles vary considerably and represent a critical source of uncertainty in estimates of global cloud droplet distributions and the aerosol indirect forcing of climate. We present an analysis of 10 globally relevant data sets of cloud condensation nuclei to constrain the value of αc for ambient aerosol. We find that rapid activation kinetics (αc > 0.1) is uniformly prevalent. This finding resolves a long-standing issue in cloud physics, as the uncertainty in water vapor accommodation on droplets is considerably less than previously thought. PMID:23431189

  8. How to Avoid Errors in Error Propagation: Prediction Intervals and Confidence Intervals in Forest Biomass

    NASA Astrophysics Data System (ADS)

    Lilly, P.; Yanai, R. D.; Buckley, H. L.; Case, B. S.; Woollons, R. C.; Holdaway, R. J.; Johnson, J.

    2016-12-01

    Calculations of forest biomass and elemental content require many measurements and models, each contributing uncertainty to the final estimates. While sampling error based on replicate plots is commonly reported, error due to uncertainty in the regression used to estimate biomass from tree diameter is usually not quantified. Some published estimates of uncertainty due to the regression models have used the uncertainty in the prediction of individuals, ignoring uncertainty in the mean, while others have propagated uncertainty in the mean while ignoring individual variation. Using the simple case of the calcium concentration of sugar maple leaves, we compare the variation among individuals (the standard deviation) to the uncertainty in the mean (the standard error) and illustrate the declining importance of the uncertainty in the mean for the prediction of individual concentrations as the number of individuals increases. For allometric models, the analogous statistics are the prediction interval (reflecting the residual variation in the model fit) and the confidence interval (describing the uncertainty in the best-fit model). The effect of propagating these two sources of error is illustrated using the mass of sugar maple foliage. The uncertainty in individual tree predictions was large for plots with few trees; for plots with 30 trees or more, the uncertainty in individuals was less important than the uncertainty in the mean. Authors of previously published analyses have reanalyzed their data to show the magnitude of these two sources of uncertainty at scales ranging from experimental plots to entire countries. The most correct analysis will take both sources of uncertainty into account, but for practical purposes, country-level reports of uncertainty in carbon stocks, as required by the IPCC, can ignore the uncertainty in individuals. Ignoring the uncertainty in the mean will lead to exaggerated estimates of confidence in estimates of forest biomass and carbon and nutrient contents.
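
    The distinction drawn here between the confidence interval (uncertainty in the mean response) and the prediction interval (mean uncertainty plus residual scatter of individuals) can be sketched for simple linear regression. The data below are synthetic stand-ins for an allometric fit:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)

    # Hypothetical allometric-style data: log(mass) vs log(diameter).
    x = rng.uniform(1.0, 4.0, 50)
    y = 0.8 + 2.3 * x + rng.normal(0.0, 0.3, 50)

    n = x.size
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (intercept + slope * x)
    s = np.sqrt(np.sum(resid ** 2) / (n - 2))   # residual standard error
    t = stats.t.ppf(0.975, n - 2)
    x0 = 2.5                                     # a new diameter to predict

    # Standard error of the fitted mean at x0, and of a new individual.
    se_mean = s * np.sqrt(1.0 / n + (x0 - x.mean()) ** 2
                          / np.sum((x - x.mean()) ** 2))
    se_pred = np.sqrt(s ** 2 + se_mean ** 2)

    print(f"95% CI for the mean:      +/- {t * se_mean:.3f}")  # fit only
    print(f"95% PI for an individual: +/- {t * se_pred:.3f}")  # fit + scatter
    ```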

  9. CASMO5/TSUNAMI-3D spent nuclear fuel reactivity uncertainty analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrer, R.; Rhodes, J.; Smith, K.

    2012-07-01

    The CASMO5 lattice physics code is used in conjunction with the TSUNAMI-3D sequence in ORNL's SCALE 6 code system to estimate the uncertainties in hot-to-cold reactivity changes due to cross-section uncertainty for PWR assemblies at various burnup points. The goal of the analysis is to establish the multiplication factor uncertainty similarity between various fuel assemblies at different conditions in a quantifiable manner and to obtain a bound on the hot-to-cold reactivity uncertainty over the various assembly types and burnup attributed to fundamental cross-section data uncertainty. (authors)

  10. Effects of aerodynamic heating and TPS thermal performance uncertainties on the Shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Goodrich, W. D.; Derry, S. M.; Maraia, R. J.

    1980-01-01

    A procedure for estimating uncertainties in the aerodynamic-heating and thermal protection system (TPS) thermal-performance methodologies developed for the Shuttle Orbiter is presented. This procedure is used in predicting uncertainty bands around expected or nominal TPS thermal responses for the Orbiter during entry. Individual flowfield and TPS parameters that make major contributions to these uncertainty bands are identified and, by statistical considerations, combined in a manner suitable for making engineering estimates of the TPS thermal confidence intervals and temperature margins relative to design limits. Thus, for a fixed TPS design, entry trajectories for future Orbiter missions can be shaped subject to both the thermal-margin and confidence-interval requirements. This procedure is illustrated by assessing the thermal margins offered by selected areas of the existing Orbiter TPS design for an entry trajectory typifying early flight test missions.

  11. What is the Uncertainty in MODIS Aerosol Optical Depth in the Vicinity of Clouds?

    NASA Technical Reports Server (NTRS)

    Patadia, Falguni; Levy, Rob; Mattoo, Shana

    2017-01-01

    The MODIS dark-target (DT) algorithm retrieves aerosol optical depth (AOD) using a Look Up Table (LUT) approach. Global comparison of AOD (Collection 6) with ground-based sun photometers gives an Estimated Error (EE) of +/-(0.04 + 10%) over ocean. However, the EE does not represent per-retrieval uncertainty. For retrievals that are biased high compared to AERONET, we aim here to closely examine the contribution of biases due to the presence of clouds and the per-pixel retrieval uncertainty. We have characterized AOD uncertainty at 550 nm due to the standard deviation of reflectance in the 10 km retrieval region, uncertainty related to gas (H2O, O3) absorption, surface albedo, and aerosol models. The uncertainty in retrieved AOD appears to lie within the estimated over-ocean error envelope of +/-(0.03 + 10%). Regions between broken clouds tend to have higher uncertainty. Compared to C6 AOD, a retrieval omitting observations in the vicinity of clouds (< or = 1 km) is biased by about +/- 0.05. For homogeneous aerosol distributions, clear-sky retrievals show near-zero bias. A close look at per-pixel reflectance histograms suggests retrieval is possible using median reflectance values.

  12. Systematic and statistical uncertainties in simulated r-process abundances due to uncertain nuclear masses

    DOE PAGES

    Surman, Rebecca; Mumpower, Matthew; McLaughlin, Gail

    2017-02-27

    Unknown nuclear masses are a major source of nuclear physics uncertainty for r-process nucleosynthesis calculations. Here we examine the systematic and statistical uncertainties that arise in r-process abundance predictions due to uncertainties in the masses of nuclear species on the neutron-rich side of stability. There is a long history of examining systematic uncertainties by the application of a variety of different mass models to r-process calculations. Here we expand upon such efforts by examining six DFT mass models, where we capture the full impact of each mass model by updating the other nuclear properties — including neutron capture rates, β-decay lifetimes, and β-delayed neutron emission probabilities — that depend on the masses. Unlike systematic effects, statistical uncertainties in the r-process pattern have just begun to be explored. Here we apply a global Monte Carlo approach, starting from the latest FRDM masses and considering random mass variations within the FRDM rms error. We find in each approach that uncertain nuclear masses produce dramatic uncertainties in calculated r-process yields, which can be reduced in upcoming experimental campaigns.
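
    The statistical (Monte Carlo) part of the approach can be caricatured as follows: perturb masses within an assumed rms model error and examine the spread of a derived quantity. The "abundance" function below is a toy placeholder with an assumed exponential sensitivity, not an r-process network calculation:

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    sigma_rms = 0.5  # MeV, of the order of a mass-model rms error

    def toy_abundance(mass_shifts):
        # Toy stand-in: assume the derived abundance responds
        # exponentially to mass (separation-energy) shifts; the factor
        # 0.8/MeV is invented for illustration.
        return np.exp(0.8 * mass_shifts.mean())

    # Monte Carlo over random mass variations for 30 hypothetical nuclei.
    runs = np.array([
        toy_abundance(rng.normal(0.0, sigma_rms, 30)) for _ in range(5000)
    ])
    print(f"relative abundance spread (SD/mean): {runs.std() / runs.mean():.2f}")
    ```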

  14. Life-cycle assessment of municipal solid waste management alternatives with consideration of uncertainty: SIWMS development and application.

    PubMed

    Hanandeh, Ali El; El-Zein, Abbas

    2010-05-01

    This paper describes the development and application of the Stochastic Integrated Waste Management Simulator (SIWMS) model. SIWMS provides a detailed view of the environmental impacts and associated costs of municipal solid waste (MSW) management alternatives under conditions of uncertainty. The model follows a life-cycle inventory approach extended with compensatory systems to provide more equitable bases for comparing different alternatives. Economic performance is measured by the net present value. The model is verified against four publicly available models under deterministic conditions and then used to study the impact of uncertainty on Sydney's MSW management 'best practices'. Uncertainty has a significant effect on all impact categories. The greatest effect is observed in the global warming category where a reversal of impact direction is predicted. The reliability of the system is most sensitive to uncertainties in the waste processing and disposal. The results highlight the importance of incorporating uncertainty at all stages to better understand the behaviour of the MSW system. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  15. Internal Variability-Generated Uncertainty in East Asian Climate Projections Estimated with 40 CCSM3 Ensembles.

    PubMed

    Yao, Shuai-Lei; Luo, Jing-Jia; Huang, Gang

    2016-01-01

    Regional climate projections are challenging because of large uncertainty, particularly that stemming from the unpredictable, internal variability of the climate system. Here, we examine the internal variability-induced uncertainty in precipitation and surface air temperature (SAT) trends during 2005-2055 over East Asia based on 40-member ensemble projections of the Community Climate System Model Version 3 (CCSM3). The model ensembles are generated from a suite of different atmospheric initial conditions using the same SRES A1B greenhouse gas scenario. We find that projected precipitation trends are subject to considerably larger internal uncertainty, and hence have lower confidence, than the projected SAT trends in both boreal winter and summer. Projected SAT trends in winter have relatively higher uncertainty than those in summer. In addition, the lower-level atmospheric circulation has larger uncertainty than that at mid-levels. Based on k-means cluster analysis, we demonstrate that a substantial portion of internally induced precipitation and SAT trends arises from internal large-scale atmospheric circulation variability. These results highlight the importance of internal climate variability in affecting regional climate projections on multi-decadal timescales.

  16. Integrated Arrival and Departure Schedule Optimization Under Uncertainty

    NASA Technical Reports Server (NTRS)

    Xue, Min; Zelinski, Shannon

    2014-01-01

    In terminal airspace, integrating arrivals and departures with shared waypoints provides the potential of improving operational efficiency by allowing direct routes when possible. Incorporating stochastic evaluation as a post-analysis process of deterministic optimization, and imposing a safety buffer in deterministic optimization, are two ways to learn and alleviate the impact of uncertainty and to avoid unexpected outcomes. This work presents a third and direct way to take uncertainty into consideration during the optimization. The impact of uncertainty was incorporated into cost evaluations when searching for the optimal solutions. The controller intervention count was computed using a heuristic model and served as another stochastic cost besides total delay. Costs under uncertainty were evaluated using Monte Carlo simulations. The Pareto fronts that contain a set of solutions were identified, and the trade-off between delays and controller intervention count was shown. Solutions that shared similar delays but had different intervention counts were investigated. The results showed that optimization under uncertainty could identify compromise solutions on Pareto fronts, which is better than deterministic optimization with extra safety buffers. It helps decision-makers reduce controller intervention while achieving low delays.
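
    Identifying the Pareto front over Monte Carlo-evaluated costs reduces to non-dominated filtering in the (delay, intervention count) plane. A minimal sketch with randomly generated candidate schedules standing in for the optimizer's population:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Hypothetical schedule candidates scored by Monte Carlo simulation:
    # (expected total delay [min], expected controller intervention count).
    costs = np.column_stack([rng.uniform(50, 200, 300),
                             rng.uniform(0, 30, 300)])

    def pareto_front(points):
        # Keep a point unless some other point is <= in both objectives
        # and strictly < in at least one (both objectives minimized).
        keep = np.ones(len(points), dtype=bool)
        for i, p in enumerate(points):
            dominated = (np.all(points <= p, axis=1)
                         & np.any(points < p, axis=1))
            if dominated.any():
                keep[i] = False
        return points[keep]

    front = pareto_front(costs)
    print(f"{len(front)} non-dominated schedules out of {len(costs)}")
    ```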

  17. Global Sensitivity Analysis for Identifying Important Parameters of Nitrogen Nitrification and Denitrification under Model and Scenario Uncertainties

    NASA Astrophysics Data System (ADS)

    Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.

    2017-12-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis only considers parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is tangled with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.

  18. Uncertainty Analysis of Thermal Comfort Parameters

    NASA Astrophysics Data System (ADS)

    Ribeiro, A. Silva; Alves e Sousa, J.; Cox, Maurice G.; Forbes, Alistair B.; Matias, L. Cordeiro; Martins, L. Lages

    2015-08-01

    International Standard ISO 7730:2005 defines thermal comfort as that condition of mind that expresses the degree of satisfaction with the thermal environment. Although this definition is inevitably subjective, the Standard gives formulae for two thermal comfort indices, predicted mean vote (PMV) and predicted percentage dissatisfied (PPD). The PMV formula is based on principles of heat balance and experimental data collected in a controlled climate chamber under steady-state conditions. The PPD formula depends only on PMV. Although these formulae are widely recognized and adopted, little has been done to establish the measurement uncertainties associated with their use, bearing in mind that the formulae depend on measured values and on tabulated values given to limited numerical accuracy. Knowledge of these uncertainties is invaluable when values provided by the formulae are used in making decisions in various health and civil engineering situations. This paper examines these formulae, giving a general mechanism for evaluating the uncertainties associated with values of the quantities on which the formulae depend. Further, consideration is given to the propagation of these uncertainties through the formulae to provide uncertainties associated with the values obtained for the indices. Current international guidance on uncertainty evaluation is utilized.
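
    The PPD formula of ISO 7730 is PPD = 100 - 95·exp(-0.03353·PMV^4 - 0.2179·PMV^2), and one route to the propagation discussed here is Monte Carlo sampling of an uncertain PMV, as recommended in GUM Supplement 1. The PMV value and standard uncertainty below are illustrative, not taken from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    def ppd(pmv):
        # ISO 7730 relation between PMV and PPD (percent dissatisfied).
        return 100.0 - 95.0 * np.exp(-(0.03353 * pmv**4 + 0.2179 * pmv**2))

    # Suppose measurements and tabulated inputs yield PMV = 0.5 with a
    # standard uncertainty of 0.15 (invented values). Propagate by MC.
    pmv_samples = rng.normal(0.5, 0.15, 100000)
    ppd_samples = ppd(pmv_samples)
    print(f"PPD = {ppd_samples.mean():.1f}% "
          f"with standard uncertainty {ppd_samples.std():.1f}%")
    ```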

  19. Harnessing the uncertainty monster: Putting quantitative constraints on the intergenerational social discount rate

    NASA Astrophysics Data System (ADS)

    Lewandowsky, Stephan; Freeman, Mark C.; Mann, Michael E.

    2017-09-01

    There is broad consensus among economists that unmitigated climate change will ultimately have adverse global economic consequences, that the costs of inaction will likely outweigh the cost of taking action, and that social planners should therefore put a price on carbon. However, there is considerable debate and uncertainty about the appropriate value of the social discount rate, that is, the extent to which future damages should be discounted relative to mitigation costs incurred now. We briefly review the ethical issues surrounding the social discount rate and then report a simulation experiment that constrains the value of the discount rate by considering four sources of uncertainty and ambiguity: scientific uncertainty about the extent of future warming, social uncertainty about future population and future economic development, political uncertainty about future mitigation trajectories, and ethical ambiguity about how much the welfare of future generations should be valued today. We compute a certainty-equivalent declining discount rate that accommodates all of those sources of uncertainty and ambiguity. The forward (instantaneous) discount rate converges to a value near 0% by century's end, and the spot (horizon) discount rate drops below 2% by 2100, falling below previous estimates by 2070.
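
    The certainty-equivalent rate arises from averaging discount factors rather than rates: R(t) = -ln E[exp(-rt)]/t, which declines toward the lowest plausible rate at long horizons. A sketch with an invented discrete rate distribution (not the paper's calibrated scenario set):

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    # Persistent-rate uncertainty: each scenario keeps one rate forever.
    # The scenario values and probabilities are invented for illustration.
    r_scenarios = rng.choice([0.01, 0.02, 0.04, 0.07], size=10000,
                             p=[0.2, 0.3, 0.3, 0.2])
    years = np.arange(1, 201)

    # Expected discount factor at each horizon, then the equivalent
    # spot (horizon-average) rate R(t) = -ln E[exp(-r t)] / t.
    E_df = np.exp(-np.outer(years, r_scenarios)).mean(axis=1)
    spot = -np.log(E_df) / years
    print(f"spot rate at t=1:   {spot[0]:.3%}")
    print(f"spot rate at t=200: {spot[-1]:.3%}")  # declines toward 1%
    ```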

  20. The two-dimensional Monte Carlo: a new methodologic paradigm for dose reconstruction for epidemiological studies.

    PubMed

    Simon, Steven L; Hoffman, F Owen; Hofer, Eduard

    2015-01-01

    Retrospective dose estimation, particularly dose reconstruction that supports epidemiological investigations of health risk, relies on various strategies that include models of physical processes and exposure conditions with detail ranging from simple to complex. Quantification of dose uncertainty is an essential component of assessments for health risk studies since, as is well understood, it is impossible to retrospectively determine the true dose for each person. To address uncertainty in dose estimation, numerical simulation tools have become commonplace, and there is now an increased understanding of what is required of models used to estimate cohort doses (in the absence of direct measurement) to evaluate dose response. It now appears that for dose-response algorithms to derive the best, unbiased estimate of health risk, we need to understand the type, magnitude and interrelationships of the uncertainties of model assumptions, parameters and input data used in the associated dose estimation models. Heretofore, uncertainty analysis of dose estimates did not always properly distinguish between categories of errors, e.g., uncertainty that is specific to each subject (i.e., unshared error) and uncertainty of doses arising from a lack of understanding and knowledge about parameter values that are shared to varying degrees by subsets of the cohort. While mathematical propagation of errors by Monte Carlo simulation methods has been used for years to estimate the uncertainty of an individual subject's dose, it was almost always conducted without consideration of dependencies between subjects. In retrospect, these types of simple analyses are not suitable for studies with complex dose models, particularly when important input data are missing or otherwise not available. The dose estimation strategy presented here is a simulation method that corrects the previous deficiencies of analytical or simple Monte Carlo error propagation methods and is termed, due to its capability to maintain separation between shared and unshared errors, the two-dimensional Monte Carlo (2DMC) procedure. Simply put, the 2DMC method simulates alternative, possibly true, sets (or vectors) of doses for an entire cohort rather than a single set that emerges when each individual's dose is estimated independently from other subjects. Moreover, estimated doses within each simulated vector maintain proper inter-relationships, such that the estimated doses for members of a cohort subgroup that share common lifestyle attributes and sources of uncertainty are properly correlated. The 2DMC procedure simulates inter-individual variability of possibly true doses within each dose vector and captures the influence of uncertainty in the values of dosimetric parameters across multiple realizations of possibly true vectors of cohort doses. The primary characteristic and strength of the 2DMC approach is the proper separation between uncertainties shared by members of the entire cohort or members of defined cohort subsets, and uncertainties that are individual-specific and therefore unshared.
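
    A minimal sketch of the 2DMC structure: an outer loop draws the shared uncertainties once per realization, and an inner step draws subject-specific (unshared) variability, yielding many internally consistent vectors of possibly true cohort doses. The lognormal error models and all numbers below are assumptions for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(10)

    n_cohort, n_vectors = 500, 200

    # Outer (second) dimension: uncertainty SHARED by the whole cohort,
    # e.g. a systematic bias in an environmental transfer factor. One
    # draw per realization, applied to every subject.
    shared_bias = rng.lognormal(mean=0.0, sigma=0.3, size=n_vectors)

    # Inner (first) dimension: UNSHARED, subject-specific variability,
    # redrawn independently for each cohort member in each realization.
    dose_vectors = np.empty((n_vectors, n_cohort))
    for v in range(n_vectors):
        individual = rng.lognormal(mean=np.log(10.0), sigma=0.5,
                                   size=n_cohort)
        dose_vectors[v] = shared_bias[v] * individual  # one dose vector

    # Spread across rows reflects shared uncertainty; spread within a
    # row reflects inter-individual variability.
    print(dose_vectors.mean(axis=1)[:5])  # cohort mean dose, 5 realizations
    ```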

  1. Uncertainty for calculating transport on Titan: A probabilistic description of bimolecular diffusion parameters

    NASA Astrophysics Data System (ADS)

    Plessis, S.; McDougall, D.; Mandt, K.; Greathouse, T.; Luspay-Kuti, A.

    2015-11-01

    Bimolecular diffusion coefficients are important parameters used by atmospheric models to calculate altitude profiles of minor constituents in an atmosphere. Unfortunately, laboratory measurements of these coefficients were never conducted at temperature conditions relevant to the atmosphere of Titan. Here we conduct a detailed uncertainty analysis of the bimolecular diffusion coefficient parameters as applied to Titan's upper atmosphere to provide a better understanding of the impact of uncertainty for this parameter on models. Because temperature and pressure conditions are much lower than the laboratory conditions in which bimolecular diffusion parameters were measured, we apply a Bayesian framework, which is problem-agnostic, to determine parameter estimates and associated uncertainties. We solve the Bayesian calibration problem using the open-source QUESO library, which also performs a propagation of uncertainties in the calibrated parameters to temperature and pressure conditions observed in Titan's upper atmosphere. Our results show that, after propagating uncertainty through the Massman model, the uncertainty in molecular diffusion is highly correlated with temperature, and we observe no noticeable correlation with pressure. We propagate the calibrated molecular diffusion estimate and associated uncertainty to obtain an estimate with uncertainty due to bimolecular diffusion for the methane molar fraction as a function of altitude. Results show that the uncertainty in methane abundance due to molecular diffusion is in general small compared to eddy diffusion and the chemical kinetics description. However, methane abundance is most sensitive to uncertainty in molecular diffusion above 1200 km, where the errors are nontrivial and could have important implications for scientific research based on diffusion models in this altitude range.
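
    QUESO itself is a C++ library; purely as an illustration of the calibrate-then-propagate workflow described above, the sketch below calibrates a toy power-law diffusion model against synthetic warm-temperature data with random-walk Metropolis, then extrapolates the parameter uncertainty to a Titan-like temperature. The model form, data, priors, and proposal scales are all assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy stand-in for a binary diffusion law: D(T) = A * T**s (power-law form).
    # Synthetic "laboratory" data at warm temperatures; Titan-relevant T is much
    # colder, so calibrated-parameter uncertainty must be propagated by extrapolation.
    def model(T, A, s):
        return A * T**s

    T_lab = np.array([250.0, 300.0, 350.0])
    y_obs = model(T_lab, 1.0e-5, 1.75) * rng.lognormal(0.0, 0.05, size=3)

    def log_post(theta):
        A, s = theta
        if A <= 0:                      # flat prior on A > 0 and on s
            return -np.inf
        resid = np.log(y_obs) - np.log(model(T_lab, A, s))
        return -0.5 * np.sum(resid**2) / 0.05**2

    # Random-walk Metropolis over (A, s)
    theta, samples = np.array([1.0e-5, 1.7]), []
    lp = log_post(theta)
    for _ in range(20_000):
        prop = theta + rng.normal(0.0, [2e-7, 0.02])
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta.copy())
    samples = np.array(samples[5000:])  # discard burn-in

    # Propagate calibrated-parameter uncertainty to a Titan-like temperature (~150 K)
    D_titan = model(150.0, samples[:, 0], samples[:, 1])
    print("D(150 K) 95% interval:", np.percentile(D_titan, [2.5, 97.5]))
    ```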

  2. Breastfeeding considerations of opioid dependent mothers and infants.

    PubMed

    Hilton, Tara C

    2012-01-01

    The American Academy of Pediatrics (AAP) had a long-standing recommendation against breastfeeding if the maternal methadone dose was above 20 mg/day. In 2001, the AAP lifted the dose restriction of maternal methadone, allowing methadone-maintained mothers to breastfeed. The allowance of breastfeeding among mothers taking methadone has been met with opposition due to the uncertainty that exists related to methadone exposure of the suckling infant. Methadone-maintained mothers are at higher risk for abuse, concomitant psychiatric disorders, limited access to healthcare, and financial hardship. Breastfeeding rates among methadone-maintained women tend to be low compared to the national average. This manuscript will discuss the implications for healthcare practitioners caring for methadone-maintained mothers and infants and associated risks and benefits of breastfeeding. This population of mothers and infants stands to obtain particular benefits from the various well-known advantages of breastfeeding.

  3. The impact of debris on marine life.

    PubMed

    Gall, S C; Thompson, R C

    2015-03-15

    Marine debris is listed among the major perceived threats to biodiversity, and is cause for particular concern due to its abundance, durability and persistence in the marine environment. An extensive literature search reviewed the current state of knowledge on the effects of marine debris on marine organisms. 340 original publications reported encounters between organisms and marine debris, involving 693 species. Plastic debris accounted for 92% of encounters between debris and individuals. Numerous direct and indirect consequences were recorded, with the potential for sublethal effects of ingestion an area of considerable uncertainty and concern. Comparison to the IUCN Red List highlighted that at least 17% of species affected by entanglement and ingestion were listed as threatened or near threatened. Hence, where marine debris combines with other anthropogenic stressors, it may affect populations, trophic interactions and assemblages. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. The health regionalization process from the perspective of the transaction cost theory.

    PubMed

    Sancho, Leyla Gomes; Geremia, Daniela Savi; Dain, Sulamis; Geremia, Fabiano; Leão, Cláudio José Silva

    2017-04-01

    This study analyzes the incidence of transaction costs in the regionalization process of health policies in the Brazilian federal system. In this work, regionalized health actions contracted and agreed between federal agencies have assumed a transactional nature. A conceptual theoretical essay of reflective nature was prepared with the purpose of questioning and proposing new approaches to improve the health regionalization process. The main considerations suggest that institutional management tools proposed by the standards and regulations of the Unified Health System have a low potential to reduce transaction costs, especially due to difficulties in reconciling common goals among the entities, an environment marked by uncertainty, information asymmetries and incompleteness, bounded rationality, and conflicts of interest. However, regionalization can reduce the incidence of social and/or operational costs, through improved access to health and the construction of more efficient governance models.

  5. Space based observations: A state of the art solution for spatial monitoring tropical forested watershed productivity at regional scale in developing countries

    NASA Astrophysics Data System (ADS)

    Mahmud, M. R.

    2014-02-01

    This paper presents a simplified and operational approach for mapping the water yield of tropical watersheds using space-based multi-sensor remote sensing data. Two critical hydrological variables, namely rainfall and evapotranspiration, are estimated from satellite measurements and used to drive the well-known Thornthwaite and Mather water balance model. The satellite rainfall and ET estimates were able to represent the actual values on the ground with reasonable accuracy under the conditions considered. The satellite-derived water yield showed good agreement and correlation with actual streamflow. A high bias may result from: (i) the behaviour of satellite rainfall estimates during heavy storms, and (ii) large uncertainties and standard deviations in the MODIS temperature data product. The output of this study improves regional-scale hydrological assessment in Peninsular Malaysia.
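
    A minimal sketch of the bookkeeping behind a Thornthwaite-and-Mather-style monthly water balance as used here: satellite-estimated P and ET pass through a soil-moisture store whose overflow becomes yield. The classical model treats soil drying with an exponential depletion function; this simplified version and all numbers are illustrative.

    ```python
    # Monthly water-balance bookkeeping sketch (illustrative values throughout).
    def water_yield(P, ET, capacity=150.0, storage=100.0):
        """P, ET: monthly series (mm); capacity: soil water holding capacity (mm)."""
        yields = []
        for p, et in zip(P, ET):
            storage += p - et                  # recharge or depletion of the store
            surplus = max(storage - capacity, 0.0)
            storage = min(max(storage, 0.0), capacity)
            yields.append(surplus)             # surplus becomes streamflow/yield
        return yields

    P  = [220, 180, 90, 60, 150, 260]          # e.g., satellite rainfall estimates (mm)
    ET = [110, 120, 130, 125, 115, 105]        # e.g., satellite-based ET estimates (mm)
    print(water_yield(P, ET))
    ```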

  6. Delineating baseflow contribution areas for streams - A model and methods comparison

    NASA Astrophysics Data System (ADS)

    Chow, Reynold; Frind, Michael E.; Frind, Emil O.; Jones, Jon P.; Sousa, Marcelo R.; Rudolph, David L.; Molson, John W.; Nowak, Wolfgang

    2016-12-01

    This study addresses the delineation of areas that contribute baseflow to a stream reach, also known as stream capture zones. Such areas can be delineated using standard well capture zone delineation methods, with three important differences: (1) natural gradients are smaller than those produced by supply wells and are therefore subject to greater numerical errors, (2) stream discharge varies seasonally, and (3) stream discharge varies spatially. This study focuses on model-related uncertainties due to model characteristics, discretization schemes, delineation methods, and particle tracking algorithms. The methodology is applied to the Alder Creek watershed in southwestern Ontario. Four different model codes are compared: HydroGeoSphere, WATFLOW, MODFLOW, and FEFLOW. In addition, two delineation methods are compared: reverse particle tracking and reverse transport, where the latter considers local-scale parameter uncertainty by using a macrodispersion term to produce a capture probability plume. The results from this study indicate that different models can calibrate acceptably well to the same data and produce very similar distributions of hydraulic head, but can produce different capture zones. The stream capture zone is found to be highly sensitive to the particle tracking algorithm. It was also found that particle tracking by itself, if applied to complex systems such as the Alder Creek watershed, would require considerable subjective judgement in the delineation of stream capture zones. Reverse transport is an alternative and more reliable approach that provides probability intervals for the baseflow contribution areas, taking uncertainty into account. The two approaches can be used together to enhance the confidence in the final outcome.
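
    Reverse particle tracking can be sketched in a few lines: particles released along the stream are integrated backwards through a steady velocity field toward their recharge origin. The analytic velocity field and forward-Euler integrator below are illustrative only; the sensitivity findings above suggest that in practice a higher-order integrator and model-derived velocities matter considerably.

    ```python
    import numpy as np

    def velocity(x, y):
        """Toy steady velocity field (m/day); real fields come from the flow model."""
        u = -0.5 - 0.002 * x               # regional flow toward the stream at x = 0
        v = 0.05 * np.sin(x / 100.0)       # mild transverse component
        return np.array([u, v])

    def track_backwards(x0, y0, dt=1.0, n_steps=365):
        """Forward-Euler reverse tracking; higher-order schemes (e.g., RK4)
        reduce the algorithm sensitivity noted above."""
        pos = np.array([x0, y0], dtype=float)
        for _ in range(n_steps):
            pos -= dt * velocity(*pos)     # step against the flow (reverse time)
        return pos

    # Release points along a stream reach at x = 0
    for y0 in np.linspace(-10.0, 10.0, 5):
        print(f"release y={y0:+5.1f} m -> backtracked origin {track_backwards(0.0, y0)}")
    ```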

  7. Impacts of using an ensemble Kalman filter on air quality simulations along the California-Mexico border region during Cal-Mex 2010 field campaign.

    PubMed

    Bei, Naifang; Li, Guohui; Meng, Zhiyong; Weng, Yonghui; Zavala, Miguel; Molina, L T

    2014-11-15

    The purpose of this study is to investigate the impact of using an ensemble Kalman filter (EnKF) on air quality simulations in the California-Mexico border region on two days (May 30 and June 04, 2010) during Cal-Mex 2010. The uncertainties in ozone (O3) and aerosol simulations in the border area due to the meteorological initial uncertainties were examined through ensemble simulations. The ensemble spread of surface O3 averaged over the coastal region was less than 10 ppb. The spreads in the nitrate and ammonium aerosols were substantial on both days, mostly caused by the large uncertainties in the surface temperature and humidity simulations. In general, the forecast initialized with the EnKF analysis (EnKF) improved the simulation of meteorological fields to some degree in the border region compared to the reference forecast initialized with NCEP analysis data (FCST) and the simulation with observation nudging (FDDA), which in turn led to more reasonable air quality simulations. The simulated surface O3 distributions by EnKF were consistently better than FCST and FDDA on both days. EnKF usually produced more reasonable simulations of nitrate and ammonium aerosols compared to the observations, but still had difficulty improving the simulations of organic and sulfate aerosols. However, discrepancies between the EnKF simulations and the measurements were still considerably large, particularly for sulfate and organic aerosols, indicating that there is still ample room for improvement in the present data assimilation and/or modeling systems. Copyright © 2014 Elsevier B.V. All rights reserved.
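
    For readers unfamiliar with the analysis step that distinguishes an EnKF from nudging, a minimal stochastic (perturbed-observation) EnKF update is sketched below; the dimensions, observation operator, and error statistics are illustrative and unrelated to the actual Cal-Mex modeling system.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_state, n_obs, n_ens = 50, 10, 40

    # Observe every 5th state variable
    H = np.zeros((n_obs, n_state))
    H[np.arange(n_obs), np.arange(0, n_state, 5)] = 1.0

    X = rng.normal(0.0, 1.0, size=(n_state, n_ens))   # forecast ensemble
    y = rng.normal(0.5, 1.0, size=n_obs)              # observations
    R = 0.25 * np.eye(n_obs)                          # observation error covariance

    A = X - X.mean(axis=1, keepdims=True)             # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                         # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)      # Kalman gain

    # Perturbed-observation update: each member sees the obs plus noise drawn from R
    Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, size=n_ens).T
    Xa = X + K @ (Y - H @ X)                          # analysis ensemble

    print("spread before:", A.std().round(3),
          "after:", (Xa - Xa.mean(axis=1, keepdims=True)).std().round(3))
    ```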

  8. Characterisation of a reference site for quantifying uncertainties related to soil sampling.

    PubMed

    Barbizzi, Sabrina; de Zorzi, Paolo; Belli, Maria; Pati, Alessandra; Sansone, Umberto; Stellato, Luisa; Barbina, Maria; Deluisa, Andrea; Menegon, Sandro; Coletti, Valter

    2004-01-01

    The paper reports a methodology adopted to face problems related to quality assurance in soil sampling. The SOILSAMP project, funded by the Environmental Protection Agency of Italy (APAT), is aimed at (i) establishing protocols for soil sampling in different environments; (ii) assessing uncertainties associated with different soil sampling methods in order to select the "fit-for-purpose" method; (iii) qualifying, in terms of trace element spatial variability, a reference site for national and international inter-comparison exercises. Preliminary results and considerations are illustrated.

  9. Analysing uncertainties of supply and demand in the future use of hydrogen as an energy vector

    NASA Astrophysics Data System (ADS)

    Lenel, U. R.; Davies, D. G. S.; Moore, M. A.

    An analytical technique (Analysis with Uncertain Quantities), developed at Fulmer, is used to examine the sensitivity of the outcome to uncertainties in input quantities in order to highlight which input quantities critically affect the potential role of hydrogen. The work presented here includes an outline of the model and the analysis technique, along with basic considerations of the input quantities to the model (demand, supply and constraints). Some examples are given of probabilistic estimates of input quantities.

  10. The effects of geometric uncertainties on computational modelling of knee biomechanics

    NASA Astrophysics Data System (ADS)

    Meng, Qingen; Fisher, John; Wilcox, Ruth

    2017-08-01

    The geometry of the articular components of the knee is an important factor in predicting joint mechanics in computational models. There are a number of uncertainties in the definition of the geometry of the cartilage and meniscus, and evaluating the effects of these uncertainties is fundamental to understanding the level of reliability of the models. In this study, the sensitivity of knee mechanics to geometric uncertainties was investigated by comparing polynomial-based and image-based knee models and by varying the size of the meniscus. The results suggested that the geometric uncertainties in cartilage and meniscus resulting from the resolution of MRI and the accuracy of segmentation had considerable effects on the predicted knee mechanics. Moreover, even though the mathematical geometric descriptors can be very close to the image-based articular surfaces, the detailed contact pressure distribution produced by the mathematical geometric descriptors was not the same as that of the image-based model. However, the trends predicted by the models based on mathematical geometric descriptors were similar to those of the image-based models.

  11. Planning Under Continuous Time and Resource Uncertainty: A Challenge for AI

    NASA Technical Reports Server (NTRS)

    Bresina, John; Dearden, Richard; Meuleau, Nicolas; Smith, David; Washington, Rich; Clancy, Daniel (Technical Monitor)

    2002-01-01

    There has been considerable work in AI on decision-theoretic planning and planning under uncertainty. Unfortunately, all of this work suffers from one or more of the following limitations: 1) it relies on very simple models of actions and time, 2) it assumes that uncertainty is manifested in discrete action outcomes, and 3) it is only practical for very small problems. For many real world problems, these assumptions fail to hold. A case in point is planning the activities for a Mars rover. For this domain none of the above assumptions are valid: 1) actions can be concurrent and have differing durations, 2) there is uncertainty concerning action durations and consumption of continuous resources like power, and 3) typical daily plans involve on the order of a hundred actions. We describe the rover problem, discuss previous work on planning under uncertainty, and present a detailed, but very small, example illustrating some of the difficulties of finding good plans.

  12. Beam quality corrections for parallel-plate ion chambers in electron reference dosimetry

    NASA Astrophysics Data System (ADS)

    Zink, K.; Wulff, J.

    2012-04-01

    Current dosimetry protocols (AAPM, IAEA, IPEM, DIN) recommend parallel-plate ionization chambers for dose measurements in clinical electron beams. This study presents detailed Monte Carlo simulations of beam quality correction factors for four different types of parallel-plate chambers: NACP-02, Markus, Advanced Markus and Roos. These chambers differ in constructive details which should have notable impact on the resulting perturbation corrections, hence on the beam quality corrections. The results reveal deviations to the recommended beam quality corrections given in the IAEA TRS-398 protocol in the range of 0%-2% depending on energy and chamber type. For well-guarded chambers, these deviations could be traced back to a non-unity and energy-dependent wall perturbation correction. In the case of the guardless Markus chamber, a nearly energy-independent beam quality correction is resulting as the effects of wall and cavity perturbation compensate each other. For this chamber, the deviations to the recommended values are the largest and may exceed 2%. From calculations of type-B uncertainties including effects due to uncertainties of the underlying cross-sectional data as well as uncertainties due to the chamber material composition and chamber geometry, the overall uncertainty of calculated beam quality correction factors was estimated to be <0.7%. Due to different chamber positioning recommendations given in the national and international dosimetry protocols, an additional uncertainty in the range of 0.2%-0.6% is present. According to the IAEA TRS-398 protocol, the uncertainty in clinical electron dosimetry using parallel-plate ion chambers is 1.7%. This study may help to reduce this uncertainty significantly.

  13. Comparison of Model and Observations of Middle Atmospheric HOx Response to Solar 27-day Cycles: Quantifying Model Uncertainties due to Photochemistry

    NASA Astrophysics Data System (ADS)

    Wang, S.; Li, K. F.; Shia, R. L.; Yung, Y. L.; Sander, S. P.

    2016-12-01

    HO2 and OH (known collectively as odd hydrogen, HOx) play an important role in middle atmospheric chemistry, in particular in O3 destruction through catalytic HOx reaction cycles. Due to their photochemical production and short chemical lifetimes, HOx species respond rapidly to solar UV irradiance changes during solar cycles, resulting in variability in the corresponding O3 chemistry. Observational evidence for both OH and HO2 variability due to solar cycles has been reported. However, puzzling discrepancies remain. In particular, the large discrepancy between model and observations of the solar 11-year cycle signal in OH and the significantly different model results when adopting different solar spectral irradiance (SSI) [Wang et al., 2013] suggest that both uncertainties in SSI variability and uncertainties in our current understanding of HOx-O3 chemistry could contribute to the discrepancy. Since the short-term SSI variability (e.g. changes during solar 27-day cycles) has little uncertainty, investigating 27-day solar cycle signals in HOx allows us to simplify the complex problem and to focus on the uncertainties in chemistry alone. We use the Caltech-JPL photochemical model to simulate observed HOx variability during 27-day cycles. The comparison between Aura Microwave Limb Sounder (MLS) observations and our model results (using standard chemistry and "adjusted chemistry", respectively) will be discussed. A better understanding of uncertainties in chemistry will eventually help us separate the contribution of chemistry from contributions of SSI uncertainties to the complex discrepancy between model and observations of OH responses to solar 11-year cycles.

  14. Hydro-geomorphology of the middle Elwha River, Washington, following dam removal

    NASA Astrophysics Data System (ADS)

    Morgan, J. A.; Nelson, P. A.; Brogan, D. J.

    2017-12-01

    Dam removal is an increasingly common river restoration practice, which can produce dramatic increases in sediment supply to downstream reaches. There remains, however, considerable uncertainty in how mesoscale morphological units (e.g., riffles and pools) respond to the flow and sediment supply changes associated with dam removal. The recent removal of Glines Canyon Dam on the Elwha River in Washington State provides a natural setting to explore how increased sediment supply due to dam removal may affect downstream reaches. Here, we present observations and surveys documenting how a 1 km reach, located approximately 5 km downstream of the former dam site, has evolved following dam removal. Annual topographic/bathymetric surveys were conducted in 2014-2016 using RTK-GNSS methods, and these surveys were coupled with airborne lidar to create continuous surface maps of the valley bottom. Differencing the elevation models reveals channel widening and migration due to lateral bank retreat and bar aggradation. Analysis of aerial imagery dating back to 1939 suggests that rates of both widening and meander migration have increased following dam removal. We also used results from depth-averaged hydrodynamic modeling with a fuzzy c-means clustering approach to delineate riffle and pool units; this analysis suggests that both riffles and pools stayed relatively consistent from 2014-2015, while both areas decreased from 2015 to 2016. In the absence of any considerable change to the hydrologic regime, these higher rates of change are attributed to the increased sediment supply. Our results, which indicate increased dynamism due directly to the amplified sediment supply, have the potential to further inform river managers and restoration specialists who oversee projects related to changing sediment regimes.

  15. Effect of uncertainties on probabilistic-based design capacity of hydrosystems

    NASA Astrophysics Data System (ADS)

    Tung, Yeou-Koung

    2018-02-01

    Hydrosystems engineering designs involve analysis of hydrometric data (e.g., rainfall, floods) and use of hydrologic/hydraulic models, all of which contribute various degrees of uncertainty to the design process. Uncertainties in hydrosystem designs can be generally categorized into aleatory and epistemic types. The former arises from the natural randomness of hydrologic processes whereas the latter are due to knowledge deficiency in model formulation and model parameter specification. This study shows that the presence of epistemic uncertainties induces uncertainty in determining the design capacity. Hence, the designer needs to quantify the uncertainty features of design capacity to determine the capacity with a stipulated performance reliability under the design condition. Using detention basin design as an example, the study illustrates a methodological framework by considering aleatory uncertainty from rainfall and epistemic uncertainties from the runoff coefficient, curve number, and sampling error in design rainfall magnitude. The effects of including different items of uncertainty and performance reliability on the design detention capacity are examined. A numerical example shows that the mean value of the design capacity of the detention basin increases with the design return period and this relation is found to be practically the same regardless of the uncertainty types considered. The standard deviation associated with the design capacity, when subject to epistemic uncertainty, increases with both design frequency and items of epistemic uncertainty involved. It is found that the epistemic uncertainty due to sampling error in rainfall quantiles should not be ignored. Even with a sample size of 80 (relatively large for a hydrologic application) the inclusion of sampling error in rainfall quantiles resulted in a standard deviation about 2.5 times higher than that considering only the uncertainty of the runoff coefficient and curve number. Furthermore, the presence of epistemic uncertainties in the design would result in under-estimation of the annual failure probability of the hydrosystem and has a discounting effect on the anticipated design return period.
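
    To illustrate how epistemic uncertainty turns the design capacity itself into a random variable, here is a minimal Monte Carlo sketch in the spirit of the framework; the rational-method-style volume model and all distributions are hypothetical, not the paper's.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Epistemic draws: runoff coefficient C and the sampling error of the design
    # rainfall quantile; catchment area and all distributions are toy values.
    area_ha = 50.0                                   # catchment area (ha)
    depth_hat, depth_se = 120.0, 15.0                # design-storm depth (mm) and its
                                                     # sampling error (epistemic)
    n = 10_000
    C = rng.normal(0.55, 0.05, n).clip(0.1, 0.95)    # epistemic: runoff coefficient
    depth = rng.normal(depth_hat, depth_se, n)       # epistemic: rainfall quantile error

    V = 10.0 * C * depth * area_ha                   # required volume (m^3); 10 m^3 per ha-mm

    # A stipulated performance reliability maps to a percentile of this distribution
    print(f"mean design capacity:        {V.mean():9,.0f} m^3")
    print(f"capacity at 90% reliability: {np.percentile(V, 90):9,.0f} m^3")
    ```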

  16. Behavior and design of large structural concrete bridge pier overhangs.

    DOT National Transportation Integrated Search

    1997-02-01

    In designing large cantilever bent caps for use on recent projects under current AASHTO design specifications, designers were faced with considerable uncertainties. Questions arose when designers attempted to satisfy both serviceability and stren...

  17. Parametric uncertainties in global model simulations of black carbon column mass concentration

    NASA Astrophysics Data System (ADS)

    Pearce, Hana; Lee, Lindsay; Reddington, Carly; Carslaw, Ken; Mann, Graham

    2016-04-01

    Previous studies have deduced that the annual mean direct radiative forcing from black carbon (BC) aerosol may regionally be up to 5 W m-2 larger than expected due to underestimation of global atmospheric BC absorption in models. We have identified the magnitude and important sources of parametric uncertainty in simulations of BC column mass concentration from a global aerosol microphysics model (GLOMAP-Mode). A variance-based uncertainty analysis of 28 parameters has been performed, based on statistical emulators trained on model output from GLOMAP-Mode. This is the largest number of uncertain model parameters to be considered in a BC uncertainty analysis to date and covers primary aerosol emissions, microphysical processes and structural parameters related to the aerosol size distribution. We will present several recommendations for further research to improve the fidelity of simulated BC. In brief, we find that the standard deviation around the simulated mean annual BC column mass concentration varies globally between 2.5 x 10-9 g cm-2 in remote marine regions and 1.25 x 10-6 g cm-2 near emission sources due to parameter uncertainty. Between 60 and 90% of the variance over source regions is due to uncertainty associated with primary BC emission fluxes, including biomass burning, fossil fuel and biofuel emissions. While the contributions to BC column uncertainty from microphysical processes, for example those related to dry and wet deposition, are increased over remote regions, we find that emissions still make an important contribution in these areas. It is likely, however, that the importance of structural model error, i.e. differences between models, is greater than that of parametric uncertainty. We have extended our analysis to emulate vertical BC profiles at several locations in the mid-Pacific Ocean and identify the parameters contributing to uncertainty in the vertical distribution of black carbon at these locations. We will present preliminary comparisons of emulated BC vertical profiles from the AeroCom multi-model ensemble and HIAPER Pole-to-Pole (HIPPO) observations.

  18. Simplified methods for real-time prediction of storm surge uncertainty: The city of Venice case study

    NASA Astrophysics Data System (ADS)

    Mel, Riccardo; Viero, Daniele Pietro; Carniello, Luca; Defina, Andrea; D'Alpaos, Luigi

    2014-09-01

    Providing reliable and accurate storm surge forecasts is important for a wide range of problems related to coastal environments. In order to adequately support decision-making processes, it has also become increasingly important to be able to estimate the uncertainty associated with the storm surge forecast. The procedure commonly adopted to do this uses the results of a hydrodynamic model forced by a set of different meteorological forecasts; however, this approach requires a considerable, if not prohibitive, computational cost for real-time applications. In the present paper we present two simplified methods for estimating the uncertainty affecting storm surge prediction with moderate computational effort. In the first approach we use a computationally fast, statistical tidal model instead of a hydrodynamic numerical model to estimate storm surge uncertainty. The second approach is based on the observation that the uncertainty in the sea level forecast mainly stems from the uncertainty affecting the meteorological fields; this has led to the idea of estimating forecast uncertainty via a linear combination of suitable meteorological variances, directly extracted from the meteorological fields. The proposed methods were applied to estimate the uncertainty in the storm surge forecast in the Venice Lagoon. The results clearly show that the uncertainty estimated through a linear combination of suitable meteorological variances nicely matches the one obtained using the deterministic approach and overcomes some intrinsic limitations in the use of a statistical tidal model.
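
    The second method reduces, in essence, to a weighted sum of meteorological variances. A minimal sketch with hypothetical weights and variances follows; the actual coefficients would be fitted against hindcast data.

    ```python
    import numpy as np

    a = np.array([0.6, 0.3, 0.1])              # fitted weights (hypothetical)
    met_var = np.array([0.040, 0.020, 0.010])  # ensemble variances of selected met
                                               # predictors (units absorbed by weights)

    surge_var = a @ met_var                    # linear combination of variances
    print(f"estimated surge forecast std: {np.sqrt(surge_var):.3f} m")
    ```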

  19. A prototype operational earthquake loss model for California based on UCERF3-ETAS – A first look at valuation

    USGS Publications Warehouse

    Field, Edward; Porter, Keith; Milner, Kevin

    2017-01-01

    We present a prototype operational loss model based on UCERF3-ETAS, which is the third Uniform California Earthquake Rupture Forecast with an Epidemic Type Aftershock Sequence (ETAS) component. As such, UCERF3-ETAS represents the first earthquake forecast to relax fault segmentation assumptions and to include multi-fault ruptures, elastic rebound, and spatiotemporal clustering, all of which seem important for generating realistic and useful aftershock statistics. UCERF3-ETAS is nevertheless an approximation of the system, so usefulness will vary and potential value needs to be ascertained in the context of each application. We examine this question with respect to statewide loss estimates, exemplifying how risk can be elevated by orders of magnitude due to triggered events following various scenario earthquakes. Two important considerations are the probability gains, relative to loss likelihoods in the absence of main shocks, and the rapid decay of gains with time. Significant uncertainties and model limitations remain, so we hope this paper will inspire similar analyses with respect to other risk metrics to help ascertain whether operationalization of UCERF3-ETAS would be worth the considerable resources required.

  20. Water pollution risk associated with natural gas extraction from the Marcellus Shale.

    PubMed

    Rozell, Daniel J; Reaven, Sheldon J

    2012-08-01

    In recent years, shale gas formations have become economically viable through the use of horizontal drilling and hydraulic fracturing. These techniques carry potential environmental risk due to their high water use and substantial risk for water pollution. Using probability bounds analysis, we assessed the likelihood of water contamination from natural gas extraction in the Marcellus Shale. Probability bounds analysis is well suited when data are sparse and parameters highly uncertain. The study model identified five pathways of water contamination: transportation spills, well casing leaks, leaks through fractured rock, drilling site discharge, and wastewater disposal. Probability boxes were generated for each pathway. The potential contamination risk and epistemic uncertainty associated with hydraulic fracturing wastewater disposal was several orders of magnitude larger than the other pathways. Even in a best-case scenario, it was very likely that an individual well would release at least 200 m³ of contaminated fluids. Because the total number of wells in the Marcellus Shale region could range into the tens of thousands, this substantial potential risk suggested that additional steps be taken to reduce the potential for contaminated fluid leaks. To reduce the considerable epistemic uncertainty, more data should be collected on the ability of industrial and municipal wastewater treatment facilities to remove contaminants from used hydraulic fracturing fluid. © 2012 Society for Risk Analysis.
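
    Probability bounds analysis propagates interval-valued distribution parameters as envelopes of CDFs (p-boxes) rather than single curves. A minimal sketch follows, with hypothetical parameter intervals for a leak-volume distribution rather than the study's actual inputs.

    ```python
    import numpy as np
    from scipy import stats

    # When a leak-volume distribution is known only up to interval bounds on its
    # parameters, the CDF is bounded by an envelope. Intervals are hypothetical.
    x = np.linspace(0.0, 2000.0, 500)                # leak volume (m^3)
    mu_lo, mu_hi = np.log(200.0), np.log(600.0)      # bounds on the lognormal median
    sigma = 1.0

    cdfs = [stats.lognorm.cdf(x, s=sigma, scale=np.exp(mu)) for mu in (mu_lo, mu_hi)]
    lower_cdf = np.minimum(*cdfs)                    # envelope of the extreme CDFs
    upper_cdf = np.maximum(*cdfs)

    # Bounds on P(leak volume <= 200 m^3)
    i = np.searchsorted(x, 200.0)
    print(f"P(V <= 200 m^3) in [{lower_cdf[i]:.2f}, {upper_cdf[i]:.2f}]")
    ```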

  1. Frequency-response mismatch effects in Johnson noise thermometry

    NASA Astrophysics Data System (ADS)

    White, D. R.; Qu, J.-F.

    2018-02-01

    Johnson noise thermometry is of considerable interest at present due to the planned redefinition of the kelvin in 2019, and several determinations of the Boltzmann constant have recently been published in support of the redefinition. To determine the Boltzmann constant by noise thermometry, the thermal noise from a sensing resistor at the triple point of water is compared to a pseudo-random noise with a calculable power spectral density traceable to quantum electrical standards. In all the measurements to date, the two dominant sources of measurement uncertainty are strongly influenced by a single factor: the frequency-response mismatch between the sets of leads connecting the thermometer to the two noise sources. In the most recent determination at the National Institute of Metrology, China, substantial changes were made to the connecting leads to reduce the mismatch effects. The aims of this paper are, firstly, to describe and explain the rationale for the changes, and secondly, to better understand the effects of the least-squares fits and the bias-variance compromise in the analysis of measurements affected by the mismatch effects. While significant improvements can be made to the connecting leads to lessen the effects of the frequency-response mismatch, the efforts are unlikely to be rewarded by a significant increase in bandwidth or a significant reduction in uncertainty.

  2. Measurement-based climatology of aerosol direct radiative effect, its sensitivities, and uncertainties from a background southeast US site

    NASA Astrophysics Data System (ADS)

    Sherman, James P.; McComiskey, Allison

    2018-03-01

    Aerosol optical properties measured at Appalachian State University's co-located NASA AERONET and NOAA ESRL aerosol network monitoring sites over a nearly four-year period (June 2012-Feb 2016) are used, along with satellite-based surface reflectance measurements, to study the seasonal variability of diurnally averaged clear sky aerosol direct radiative effect (DRE) and radiative efficiency (RE) at the top-of-atmosphere (TOA) and at the surface. Aerosol chemistry and loading at the Appalachian State site are likely representative of the background southeast US (SE US), home to high summertime aerosol loading and one of only a few regions not to have warmed during the 20th century. This study is the first multi-year ground truth DRE study in the SE US, using aerosol network data products that are often used to validate satellite-based aerosol retrievals. The study is also the first in the SE US to quantify DRE uncertainties and sensitivities to aerosol optical properties and surface reflectance, including their seasonal dependence. Median DRE for the study period is -2.9 W m-2 at the TOA and -6.1 W m-2 at the surface. Monthly median and monthly mean DRE at the TOA (surface) are -1 to -2 W m-2 (-2 to -3 W m-2) during winter months and -5 to -6 W m-2 (-10 W m-2) during summer months. The DRE cycles follow the annual cycle of aerosol optical depth (AOD), which is 9 to 10 times larger in summer than in winter. Aerosol RE is anti-correlated with DRE, with winter values 1.5 to 2 times more negative than summer values. Due to the large seasonal dependence of aerosol DRE and RE, we quantify the sensitivity of DRE to aerosol optical properties and surface reflectance, using a calendar day representative of each season (21 December for winter, 21 March for spring, 21 June for summer, and 21 September for fall). We use these sensitivities along with measurement uncertainties of aerosol optical properties and surface reflectance to calculate DRE uncertainties. We also estimate uncertainty in calculated diurnally-averaged DRE due to diurnal aerosol variability. Aerosol DRE at both the TOA and surface is most sensitive to changes in AOD, followed by single-scattering albedo (ω0). One exception is under the high summertime aerosol loading conditions (AOD ≥ 0.15 at 550 nm), when sensitivity of TOA DRE to ω0 is comparable to that of AOD. Aerosol DRE is less sensitive to changes in scattering asymmetry parameter (g) and surface reflectance (R). While DRE sensitivity to AOD varies by only ˜ 25 to 30 % with season, DRE sensitivity to ω0, g, and R largely follows the annual AOD cycle at APP, varying by factors of 8 to 15 with season. Since the measurement uncertainties of AOD, ω0, g, and R are comparable at Appalachian State, their relative contributions to DRE uncertainty are largely influenced by their (seasonally dependent) DRE sensitivity values, which suggests that the seasonal dependence of DRE uncertainty must be accounted for. Clear sky aerosol DRE uncertainty at the TOA (surface) due to measurement uncertainties ranges from 0.45 W m-2 (0.75 W m-2) for December to 1.1 W m-2 (1.6 W m-2) for June. Expressed as a fraction of DRE computed using monthly median aerosol optical properties and surface reflectance, the DRE uncertainties at TOA (surface) are 20 to 24 % (15 to 22 %) for March, June, and September and 49 % (50 %) for December. The relatively low DRE uncertainties are largely due to the low uncertainty in AOD measured by AERONET.
Use of satellite-based AOD measurements by MODIS in the DRE calculations increases DRE uncertainties by a factor of 2 to 5, and DRE uncertainties are then dominated by AOD uncertainty for all seasons. Diurnal variability in AOD (and to a lesser extent g) contributes to uncertainties in DRE calculated using daily-averaged aerosol optical properties that are slightly larger (by ˜ 20 to 30 %) than DRE uncertainties due to measurement uncertainties during summer and fall, with comparable uncertainties during winter and spring.

  3. Local setup errors in image-guided radiotherapy for head and neck cancer patients immobilized with a custom-made device.

    PubMed

    Giske, Kristina; Stoiber, Eva M; Schwarz, Michael; Stoll, Armin; Muenter, Marc W; Timke, Carmen; Roeder, Falk; Debus, Juergen; Huber, Peter E; Thieke, Christian; Bendl, Rolf

    2011-06-01

    To evaluate the local positioning uncertainties during fractionated radiotherapy of head-and-neck cancer patients immobilized using a custom-made fixation device and discuss the effect of possible patient correction strategies for these uncertainties. A total of 45 head-and-neck patients underwent regular control computed tomography scanning using an in-room computed tomography scanner. The local and global positioning variations of all patients were evaluated by applying a rigid registration algorithm. One bounding box around the complete target volume and nine local registration boxes containing relevant anatomic structures were introduced. The resulting uncertainties for a stereotactic setup and the deformations referenced to one anatomic local registration box were determined. Local deformations of the patients immobilized using our custom-made device were compared with previously published results. Several patient positioning correction strategies were simulated, and the residual local uncertainties were calculated. The patient anatomy in the stereotactic setup showed local systematic positioning deviations of 1-4 mm. The deformations referenced to a particular anatomic local registration box were similar to the reported deformations assessed from patients immobilized with commercially available Aquaplast masks. A global correction, including the rotational error compensation, decreased the remaining local translational errors. Depending on the chosen patient positioning strategy, the remaining local uncertainties varied considerably. Local deformations in head-and-neck patients occur even if an elaborate, custom-made patient fixation method is used. A rotational error correction decreased the required margins considerably. None of the considered correction strategies achieved perfect alignment. Therefore, weighting of anatomic subregions to obtain the optimal correction vector should be investigated in the future. Copyright © 2011 Elsevier Inc. All rights reserved.

  4. Inter-comparison of interpolated background nitrogen dioxide concentrations across Greater Manchester, UK

    NASA Astrophysics Data System (ADS)

    Lindley, S. J.; Walsh, T.

    There are many modelling methods dedicated to the estimation of spatial patterns in pollutant concentrations, each with their distinctive advantages and disadvantages. The derivation of a surface of air quality values from monitoring data alone requires the conversion of point-based data from a limited number of monitoring stations to a continuous surface using interpolation. Since interpolation techniques involve the estimation of data at un-sampled points based on calculated relationships between data measured at a number of known sample points, they are subject to some uncertainty, both in terms of the values estimated and their spatial distribution. These uncertainties, which are incorporated into many empirical and semi-empirical mapping methodologies, could be recognised in any further usage of the data and also in the assessment of the extent of an exceedence of an air quality standard and the degree of exposure this may represent. There is a wide range of available interpolation techniques, and the differences in their characteristics result in variations in the output surfaces estimated from the same set of input points. The work presented in this paper provides an examination of uncertainties through the application of a number of interpolation techniques available in standard GIS packages to a case study nitrogen dioxide data set for the Greater Manchester conurbation in northern England. The implications of the use of different techniques are discussed through application to hourly concentrations during an air quality episode and annual average concentrations in 2001. The estimated spatial patterns of concentration maxima differ considerably between techniques, reflecting the combined effects of chemical processes, topography and meteorology. In the case of air quality episodes, the considerable spatial variability of concentrations results in large uncertainties in the surfaces produced, but these uncertainties vary widely from area to area. In view of the uncertainties associated with classical techniques, research is ongoing to develop alternative methods which should in time help improve the suite of tools available to air quality managers.
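
    A minimal sketch of such an inter-comparison: the same sparse station data interpolated with several standard techniques yields noticeably different surfaces, and the method-to-method spread is one simple indicator of interpolation uncertainty. The station locations and NO2 values below are synthetic.

    ```python
    import numpy as np
    from scipy.interpolate import griddata

    rng = np.random.default_rng(3)

    # Synthetic monitoring network: 15 stations over a 50 x 50 km domain
    stations = rng.uniform(0.0, 50.0, size=(15, 2))
    no2 = 30 + 10 * np.sin(stations[:, 0] / 8.0) + rng.normal(0, 3, 15)  # ug/m3

    gx, gy = np.mgrid[0:50:200j, 0:50:200j]
    surfaces = {m: griddata(stations, no2, (gx, gy), method=m)
                for m in ("nearest", "linear", "cubic")}

    # Method-to-method spread as a simple indicator of interpolation uncertainty
    # (linear/cubic are NaN outside the station convex hull; fill before comparing)
    stack = np.array([np.where(np.isnan(s), no2.mean(), s) for s in surfaces.values()])
    print("max disagreement between methods (ug/m3):", np.ptp(stack, axis=0).max().round(1))
    ```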

  5. Characterization of the uncertainty of divergence time estimation under relaxed molecular clock models using multiple loci.

    PubMed

    Zhu, Tianqi; Dos Reis, Mario; Yang, Ziheng

    2015-03-01

    Genetic sequence data provide information about the distances between species or branch lengths in a phylogeny, but not about the absolute divergence times or the evolutionary rates directly. Bayesian methods for dating species divergences estimate times and rates by assigning priors on them. In particular, the prior on times (node ages on the phylogeny) incorporates information in the fossil record to calibrate the molecular tree. Because times and rates are confounded, our posterior time estimates will not approach point values even if an infinite amount of sequence data are used in the analysis. In a previous study we developed a finite-sites theory to characterize the uncertainty in Bayesian divergence time estimation in analysis of large but finite sequence data sets under a strict molecular clock. As most modern clock dating analyses use more than one locus and are conducted under relaxed clock models, here we extend the theory to the case of relaxed clock analysis of data from multiple loci (site partitions). Uncertainty in posterior time estimates is partitioned into three sources: sampling errors in the estimates of branch lengths in the tree for each locus due to limited sequence length, variation of substitution rates among lineages and among loci, and uncertainty in fossil calibrations. Using a simple but analogous estimation problem involving the multivariate normal distribution, we predict that as the number of loci (L) goes to infinity, the variance in posterior time estimates decreases and approaches the infinite-data limit at the rate of 1/L, and the limit is independent of the number of sites in the sequence alignment. We then confirmed the predictions by using computer simulation on phylogenies of two or three species, and by analyzing a real genomic data set for six primate species. Our results suggest that with the fossil calibrations fixed, analyzing multiple loci or site partitions is the most effective way for improving the precision of posterior time estimation. However, even if a huge amount of sequence data is analyzed, considerable uncertainty will persist in time estimates. © The Author(s) 2014. Published by Oxford University Press on behalf of the Society of Systematic Biologists.
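
    In our notation (not necessarily the authors'), the limiting behavior described above can be summarized as follows.

    ```latex
    % Sketch of the limiting behaviour (notation ours): with L loci, the posterior
    % variance of a node age t given data D decomposes approximately as
    \[
      \operatorname{Var}(t \mid D) \;\approx\; V_{\infty} + \frac{c}{L},
    \]
    % where V_infinity is the irreducible variance set by the fossil calibrations
    % and among-lineage rate variation, and the c/L term shrinks as loci (site
    % partitions) are added, independently of the number of sites per locus.
    ```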

  6. Constraining Swiss Methane Emissions from Atmospheric Observations: Sensitivities and Temporal Development

    NASA Astrophysics Data System (ADS)

    Henne, Stephan; Leuenberger, Markus; Steinbacher, Martin; Eugster, Werner; Meinhardt, Frank; Bergamaschi, Peter; Emmenegger, Lukas; Brunner, Dominik

    2017-04-01

    Similar to other Western European countries, agricultural sources dominate the methane (CH4) emission budget in Switzerland. 'Bottom-up' estimates of these emissions are still subject to relatively large uncertainties due to considerable variability and uncertainties in observed emission factors for the underlying processes (e.g., enteric fermentation, manure management). Here, we present a regional-scale (˜300 x 200 km2) atmospheric inversion study of CH4 emissions in Switzerland making use of the recently established CarboCount-CH network of four stations on the Swiss Plateau as well as the neighbouring mountain-top sites Jungfraujoch and Schauinsland (Germany). Continuous observations from all CarboCount-CH sites are available since 2013. We use a high-resolution (7 x 7 km2) Lagrangian particle dispersion model (FLEXPART-COSMO) in connection with two different inversion systems (Bayesian and extended Kalman filter) to estimate spatially and temporally resolved CH4 emissions for the Swiss domain in the period 2013 to 2016. An extensive set of sensitivity inversions is used to assess the overall uncertainty of our inverse approach. In general we find good agreement of the total Swiss CH4 emissions between our 'top-down' estimate and the national 'bottom-up' reporting. In addition, a robust emission seasonality, with reduced wintertime values, can be seen in all years. No significant trend or year-to-year variability was observed for the analysed four-year period, again in agreement with a very small downward trend in the national 'bottom-up' reporting. Special attention is given to the influence of boundary conditions as taken from different global-scale model simulations (TM5, FLEXPART) and remote observations. We find that uncertainties in the boundary conditions can induce large offsets in the national total emissions. However, spatial emission patterns are less sensitive to the choice of boundary condition. Furthermore, and in order to demonstrate the validity of our approach, a series of inversion runs using synthetic observations, generated from 'true' emissions, in combination with various sources of uncertainty is presented.
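
    The Bayesian inversion underlying such 'top-down' estimates can be illustrated with a minimal analytical sketch: prior ('bottom-up') fluxes x_b are updated with observations y through transport sensitivities H. Everything below (dimensions, covariances, synthetic data) is illustrative, not the actual CarboCount-CH system.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_obs, n_flux = 200, 30

    H = np.abs(rng.normal(0, 1, size=(n_obs, n_flux)))   # footprint sensitivities
    x_true = rng.uniform(0.5, 2.0, n_flux)               # "true" emissions (toy)
    y = H @ x_true + rng.normal(0, 0.5, n_obs)           # synthetic observations

    x_b = np.ones(n_flux)                                # bottom-up prior
    B = 0.5**2 * np.eye(n_flux)                          # prior error covariance
    R = 0.5**2 * np.eye(n_obs)                           # model-data mismatch covariance

    G = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)         # gain matrix
    x_a = x_b + G @ (y - H @ x_b)                        # posterior (top-down) fluxes
    print("prior total:", x_b.sum(),
          " posterior total:", x_a.sum().round(2),
          " true total:", x_true.sum().round(2))
    ```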

  7. How important are the descriptions of vegetation in distributed hydrologic models?

    NASA Astrophysics Data System (ADS)

    Cuntz, Matthias; Thober, Stephan; Zink, Matthias; Rakovec, Oldrich; Samaniego, Luis

    2016-04-01

    The land surface transforms incoming, absorbed radiation into other energy forms and radiation with longer wavelengths. The land surface emits long-wave radiation, stores energy in the soil, the biomass and the air in the boundary layer, and exchanges sensible and latent heat with the atmosphere. The latter, latent heat, consists of evaporation from the soil and canopy and transpiration by plants. In this picture, plants enhance the absorption of incoming radiation and decrease the resistance for evaporation of deeper soil water. Transpiration by plants is therefore either energy-limited by low incoming radiation or water-limited by small soil moisture. In the extreme cases, all available energy will be used for evapotranspiration in cold regions and all available water will be used for evapotranspiration in arid regions. Very simple formulations of latent heat, which include plant processes only very indirectly, work well in hydrologic models for these limiting cases. These simple formulations also seem to work surprisingly well in temperate regions. Hydrologic models have, however, considerable problems in semi-arid regions, where the vegetation influence on latent heat should be largest. But the models have to deal with many more problems in these regions. For example, data scarcity in the Mediterranean leads to very large model uncertainty due to the forcing data. Water supply is also often highly regulated in semi-arid regions. Variability in river discharge can hence be largely driven by anthropogenic influence rather than natural meteorological variations in these regions. Here we will show for Europe the areas and times when the descriptions of plant processes are important for hydrologic models. We will compare differences in model uncertainties that come from 1. different formulations of evapotranspiration, 2. different descriptions of soil-plant interactions, and 3. uncertainty in the model's input data. It can be seen that model uncertainty stemming from uncertain input data is similar to or larger in magnitude than the uncertainty coming from the descriptions of the vegetation in the models. Acquisition of better input data should thus go hand in hand with more sophisticated descriptions of the land surface.

  8. Estimation and impact assessment of input and parameter uncertainty in predicting groundwater flow with a fully distributed model

    NASA Astrophysics Data System (ADS)

    Touhidul Mustafa, Syed Md.; Nossent, Jiri; Ghysels, Gert; Huysmans, Marijke

    2017-04-01

    Transient numerical groundwater flow models have been used to understand and forecast groundwater flow systems under anthropogenic and climatic effects, but the reliability of the predictions is strongly influenced by different sources of uncertainty. Hence, researchers in hydrological sciences are developing and applying methods for uncertainty quantification. Nevertheless, spatially distributed flow models pose significant challenges for parameter and spatially distributed input estimation and uncertainty quantification. In this study, we present a general and flexible approach for input and parameter estimation and uncertainty analysis of groundwater models. The proposed approach combines a fully distributed groundwater flow model (MODFLOW) with the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm. To avoid over-parameterization, the uncertainty of the spatially distributed model input has been represented by multipliers. The posterior distributions of these multipliers and the regular model parameters were estimated using DREAM. The proposed methodology has been applied in an overexploited aquifer in Bangladesh where groundwater pumping and recharge data are highly uncertain. The results confirm that input uncertainty does have a considerable effect on the model predictions and parameter distributions. Additionally, our approach also provides a new way to optimize the spatially distributed recharge and pumping data along with the parameter values under uncertain input conditions. It can be concluded from our approach that considering model input uncertainty along with parameter uncertainty is important for obtaining realistic model predictions and a correct estimation of the uncertainty bounds.
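
    A minimal sketch of the multiplier idea: spatially distributed recharge and pumping fields are scaled by single uncertain multipliers that are inferred jointly with a model parameter. The toy linear "groundwater model" and the plain random-walk Metropolis sampler below stand in for MODFLOW and DREAM (which uses multiple interacting chains); all values are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    recharge = rng.uniform(0.8, 1.2, size=25)   # prior spatial recharge field (toy)
    pumping = rng.uniform(0.4, 0.6, size=25)    # prior spatial pumping field (toy)

    def heads(m_r, m_p, b):                     # stand-in for the MODFLOW run
        return m_r * recharge - m_p * pumping + b

    obs = heads(1.1, 0.9, 0.5) + rng.normal(0, 0.02, 25)  # synthetic head observations

    def log_post(th):
        m_r, m_p, b = th
        if not (0.5 < m_r < 2.0 and 0.5 < m_p < 2.0):     # uniform priors on multipliers
            return -np.inf
        return -0.5 * np.sum((obs - heads(m_r, m_p, b))**2) / 0.02**2

    th, chain = np.array([1.0, 1.0, 0.1]), []
    lp = log_post(th)
    for _ in range(30_000):                     # plain random-walk Metropolis
        prop = th + rng.normal(0.0, 0.01, 3)
        lpp = log_post(prop)
        if np.log(rng.uniform()) < lpp - lp:
            th, lp = prop, lpp
        chain.append(th.copy())
    chain = np.array(chain[10_000:])            # discard burn-in
    print("posterior means (m_r, m_p, b):", chain.mean(axis=0).round(3))
    ```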

  9. Entropic uncertainty for spin-1/2 XXX chains in the presence of inhomogeneous magnetic fields and its steering via weak measurement reversals

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Ming, Fei; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2017-09-01

    The uncertainty principle sets a lower bound on the measuring precision for a pair of non-commuting observables, and hence is of considerable importance to quantum precision measurement in the field of quantum information theory. In this letter, we consider the entropic uncertainty relation (EUR) in the context of quantum memory in a two-qubit isotropic Heisenberg spin chain. Specifically, we explore the dynamics of the EUR in a practical scenario, where two associated nodes of a one-dimensional XXX spin chain, under an inhomogeneous magnetic field, are linked through thermal entanglement. We show that the temperature and magnetic field effects can lead to inflation of the measuring uncertainty, stemming from the reduction of systematic quantum correlation. Notably, we reveal that, firstly, the uncertainty is not fully dependent on the observed quantum correlation of the system; secondly, the dynamical behaviors of the measuring uncertainty are relatively distinct for ferromagnetic and antiferromagnetic chains. Meanwhile, we deduce that the measuring uncertainty is strongly correlated with the mixedness of the system, implying that smaller mixedness tends to reduce the uncertainty. Furthermore, we propose an effective strategy to control the uncertainty of interest by means of quantum weak measurement reversal. Therefore, our work may shed light on the dynamics of the measuring uncertainty in the Heisenberg spin chain, and thus be important to quantum precision measurement in various solid-state systems.
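
    The memory-assisted entropic uncertainty relation underlying such analyses (due to Berta et al.) can be stated compactly; in the notation below, Q and R denote the two incompatible observables measured on qubit A, and B is the quantum memory.

    ```latex
    % Memory-assisted entropic uncertainty relation (Berta et al.):
    \[
      S(Q \mid B) + S(R \mid B) \;\ge\; \log_2 \frac{1}{c} + S(A \mid B),
      \qquad
      c = \max_{i,j} \bigl|\langle \psi_i \vert \phi_j \rangle\bigr|^2 ,
    \]
    % where |psi_i>, |phi_j> are eigenvectors of Q and R, and S(.|.) is the
    % conditional von Neumann entropy. Entanglement between A and B makes
    % S(A|B) negative and thus tightens (lowers) the uncertainty bound.
    ```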

  10. Uncertainty in eddy covariance measurements and its application to physiological models

    Treesearch

    D.Y. Hollinger; A.D. Richardson; A.D. Richardson

    2005-01-01

    Flux data are noisy, and this uncertainty is largely due to random measurement error. Knowledge of uncertainty is essential for the statistical evaluation of modeled and measured fluxes, for comparison of parameters derived by fitting models to measured fluxes and in formal data-assimilation efforts. We used the difference between simultaneous measurements from two...

  11. Response of hydrology to climate change in the southern Appalachian mountains using Bayesian inference

    Treesearch

    Wei Wu; James S. Clark; James M. Vose

    2012-01-01

    Predicting long-term consequences of climate change on hydrologic processes has been limited by the need to accommodate uncertainties in the hydrological measurements used for calibration, and to account for uncertainties in the models that would ingest those calibrations, as well as uncertainties in the climate predictions that serve as the basis for hydrological predictions. We implemented...

  12. An Efficient Deterministic Approach to Model-based Prediction Uncertainty Estimation

    DTIC Science & Technology

    2012-09-01

    Prognostics deals with the prediction of the end of life (EOL) of a system. EOL is a random variable, due...future evolution of the system, accumulating additional uncertainty into the predicted EOL. Prediction algorithms that do not account for these sources of...uncertainty are misrepresenting the EOL and can lead to poor decisions based on their results. In this paper, we explore the impact of uncertainty in

  13. Decision analysis of shoreline protection under climate change uncertainty

    NASA Astrophysics Data System (ADS)

    Chao, Philip T.; Hobbs, Benjamin F.

    1997-04-01

    If global warming occurs, it could significantly affect water resource distribution and availability. Yet it is unclear whether the prospect of such change is relevant to water resources management decisions being made today. We model a shoreline protection decision problem with a stochastic dynamic program (SDP) to determine whether consideration of the possibility of climate change would alter the decision. Three questions are addressed with the SDP: (1) How important is climate change compared to other uncertainties?, (2) What is the economic loss if climate change uncertainty is ignored?, and (3) How does belief in climate change affect the timing of the decision? In the case study, sensitivity analysis shows that uncertainty in real discount rates has a stronger effect upon the decision than belief in climate change. Nevertheless, a strong belief in climate change makes the shoreline protection project less attractive and often alters the decision to build it.

  14. Cantilever spring constant calibration using laser Doppler vibrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohler, Benjamin

    2007-06-15

    Uncertainty in cantilever spring constants is a critical issue in atomic force microscopy (AFM) force measurements. Though numerous methods exist for calibrating cantilever spring constants, the accuracy of these methods can be limited by both the physical models themselves as well as uncertainties in their experimental implementation. Here we report the results from two of the most common calibration methods, the thermal tune method and the Sader method. These were implemented on a standard AFM system as well as using laser Doppler vibrometry (LDV). Using LDV eliminates some uncertainties associated with optical lever detection on an AFM. It also offers considerably higher signal-to-noise deflection measurements. We find that AFM and LDV result in similar uncertainty in the calibrated spring constants, about 5%, using either the thermal tune or Sader methods, provided that certain limitations of the methods and instrumentation are observed.

  15. Optimization and resilience in natural resources management

    USGS Publications Warehouse

    Williams, Byron K.; Johnson, Fred A.

    2015-01-01

    We consider the putative tradeoff between optimization and resilience in the management of natural resources, using a framework that incorporates different sources of uncertainty that are common in natural resources management. We address one-time decisions, and then expand the decision context to the more complex problem of iterative decision making. For both cases we focus on two key sources of uncertainty: partial observability of system state and uncertainty as to system dynamics. Optimal management strategies will vary considerably depending on the timeframe being considered and the amount and quality of information that is available to characterize system features and project the consequences of potential decisions. But in all cases an optimal decision making framework, if properly identified and focused, can be useful in recognizing sound decisions. We argue that under the conditions of deep uncertainty that characterize many resource systems, an optimal decision process that focuses on robustness does not automatically induce a loss of resilience.

  16. PROCESS DESIGN FOR ENVIRONMENT: A MULTI-OBJECTIVE FRAMEWORK UNDER UNCERTAINTY

    EPA Science Inventory

    Designing chemical processes for the environment requires consideration of several indexes of environmental impact, including ozone depletion and global warming potentials, human and aquatic toxicity, photochemical oxidation, and acid rain potentials. Current methodologies like t...

  17. The EEOC's New Equal Pay Act Guidelines.

    ERIC Educational Resources Information Center

    Greenlaw, Paul S.; Kohl, John P.

    1982-01-01

    Analyzes the new guidelines for enforcement of the Equal Pay Act and their implications for personnel management. Argues that there are key problem areas in the new regulations arising from considerable ambiguity and uncertainty about their interpretation. (SK)

  18. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    PubMed

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
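
    As a concrete illustration of the embarrassingly parallel pattern the tutorial describes, the sketch below distributes independent simulation replications across cores. It is a minimal Python analogue of the article's MATLAB and R examples, not the article's own code; `simulate_once` is a hypothetical stand-in for any expensive stochastic risk model.

```python
# Minimal sketch of an embarrassingly parallel simulation study:
# replications are independent, so they can simply be mapped across cores.
import multiprocessing as mp
import numpy as np

def simulate_once(seed):
    # Hypothetical placeholder for an expensive risk-model run.
    rng = np.random.default_rng(seed)
    x = rng.normal(loc=1.0, scale=0.3, size=1000)
    return x.mean()

if __name__ == "__main__":
    seeds = range(10_000)                    # independent replications
    with mp.Pool() as pool:                  # one worker per core by default
        results = pool.map(simulate_once, seeds)
    print(np.percentile(results, [2.5, 50, 97.5]))  # uncertainty summary
```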

  19. Human subjects concerns in ground based ECLSS testing - Managing uncertainty in closely recycled systems

    NASA Technical Reports Server (NTRS)

    Crump, William J.; Janik, Daniel S.; Thomas, L. Dale

    1990-01-01

    U.S. space missions have to this point used water either made on board or carried from Earth and discarded after use. For Space Station Freedom, long-duration life support will include air and water recycling using a series of physical-chemical subsystems. The Environmental Control and Life Support System (ECLSS) designed for this application must be tested extensively at all stages of hardware maturity. Human test subjects are required to conduct some of these tests, and the risks associated with the use of development hardware must be addressed. Federal guidelines for protection of human subjects require careful consideration of risks and potential benefits by an Institutional Review Board (IRB) before and during testing. This paper reviews the ethical principles guiding this consideration, details the problems and uncertainties inherent in current hardware testing, and presents an incremental approach to risk assessment for ECLSS testing.

  20. Probabilistic Physics-Based Risk Tools Used to Analyze the International Space Station Electrical Power System Output

    NASA Technical Reports Server (NTRS)

    Patel, Bhogila M.; Hoge, Peter A.; Nagpal, Vinod K.; Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2004-01-01

    This paper describes the methods employed to apply probabilistic modeling techniques to the International Space Station (ISS) power system. These techniques were used to quantify the probabilistic variation in the power output, also called the response variable, due to variations (uncertainties) associated with knowledge of the influencing factors, called the random variables. These uncertainties can be due to unknown environmental conditions, variation in the performance of electrical power system components, or sensor tolerances. Uncertainties in these variables cause corresponding variations in the power output, but the magnitude of that effect varies with the ISS operating conditions, e.g. whether or not the solar panels are actively tracking the sun. Therefore, it is important to quantify the influence of these uncertainties on the power output for optimizing the power available for experiments.

  1. How multiple causes combine: independence constraints on causal inference.

    PubMed

    Liljeholm, Mimi

    2015-01-01

    According to the causal power view, two core constraints, namely that causes occur independently (i.e., no confounding) and influence their effects independently, serve as boundary conditions for causal induction. This study investigated how violations of these constraints modulate uncertainty about the existence and strength of a causal relationship. Participants were presented with pairs of candidate causes that were either confounded or not, and that either interacted or exerted their influences independently. Consistent with the causal power view, uncertainty about the existence and strength of causal relationships was greater when causes were confounded or interacted than when they were unconfounded and acted independently. An elemental Bayesian causal model captured differences in uncertainty due to confounding but not those due to an interaction. Implications of distinct sources of uncertainty for the selection of contingency information and causal generalization are discussed.

  2. Methodologies for evaluating performance and assessing uncertainty of atmospheric dispersion models

    NASA Astrophysics Data System (ADS)

    Chang, Joseph C.

    This thesis describes methodologies to evaluate the performance and to assess the uncertainty of atmospheric dispersion models, tools that predict the fate of gases and aerosols upon their release into the atmosphere. Because of the large economic and public-health impacts often associated with the use of dispersion model results, these models should be properly evaluated, and their uncertainty should be properly accounted for and understood. The CALPUFF, HPAC, and VLSTRACK dispersion modeling systems were applied to the Dipole Pride (DP26) field data (~20 km in scale) in order to demonstrate the evaluation and uncertainty assessment methodologies. Dispersion model performance was found to be strongly dependent on the wind models used to generate gridded wind fields from observed station data. This is because, despite the fact that the test site was a flat area, the observed surface wind fields still showed considerable spatial variability, partly because of the surrounding mountains. The two components (variability and uncertainty) were found to be comparable for the DP26 field data, with variability more important than uncertainty closer to the source, and less important farther away from the source. Therefore, reducing data errors for input meteorology may not necessarily increase model accuracy, due to random turbulence. DP26 was a research-grade field experiment, where the source, meteorological, and concentration data were all well measured. Another typical application of dispersion modeling is a forensic study where the data are usually quite scarce. An example would be the modeling of the alleged releases of chemical warfare agents during the 1991 Persian Gulf War, where the source data had to rely on intelligence reports, and where Iraq had stopped reporting weather data to the World Meteorological Organization since the 1981 Iran-Iraq war. Therefore the meteorological fields inside Iraq had to be estimated by models such as prognostic mesoscale meteorological models, based on observational data from areas outside of Iraq, and using the global fields simulated by global meteorological models as the initial and boundary conditions for the mesoscale models. It was found that while comparing model predictions to observations in areas outside of Iraq, the predicted surface wind directions had errors between 30 and 90 deg, but the inter-model differences (or uncertainties) in the predicted surface wind directions inside Iraq, where there were no onsite data, were fairly constant at about 70 deg. (Abstract shortened by UMI.)
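
    Evaluations of this kind typically reduce paired predictions and observations to a few standard statistics, such as fractional bias, normalized mean square error, and the fraction of predictions within a factor of two of observations. The sketch below computes these commonly used metrics on made-up data; it illustrates the general style of evaluation, not the thesis's exact procedure.

```python
# Sketch of common dispersion-model performance measures: fractional bias
# (FB), normalized mean square error (NMSE), and fraction within a factor
# of two (FAC2). Data values are illustrative only.
import numpy as np

def evaluate(obs, pred):
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    fb = 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())
    nmse = np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean())
    fac2 = np.mean((pred >= 0.5 * obs) & (pred <= 2.0 * obs))
    return {"FB": fb, "NMSE": nmse, "FAC2": fac2}

print(evaluate([1.0, 2.0, 4.0, 8.0], [1.2, 1.5, 5.0, 3.0]))
```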

  3. Uncertainty analysis of vegetation distribution in the northern high latitudes during the 21st century with a dynamic vegetation model

    PubMed Central

    Jiang, Yueyang; Zhuang, Qianlai; Schaphoff, Sibyll; Sitch, Stephen; Sokolov, Andrei; Kicklighter, David; Melillo, Jerry

    2012-01-01

    This study aims to assess how high-latitude vegetation may respond under various climate scenarios during the 21st century, with a focus on analyzing the uncertainty induced by model parameters and how this uncertainty compares to that induced by various climates. The analysis was based on a set of 10,000 Monte Carlo ensemble Lund-Potsdam-Jena (LPJ) simulations for the northern high latitudes (45°N and polewards) for the period 1900–2100. The LPJ Dynamic Global Vegetation Model (LPJ-DGVM) was run under contemporary and future climates from four Special Report Emission Scenarios (SRES), A1FI, A2, B1, and B2, based on the Hadley Centre General Circulation Model (GCM), and six climate scenarios, X901M, X902L, X903H, X904M, X905L, and X906H, from the Integrated Global System Model (IGSM) at the Massachusetts Institute of Technology (MIT). In the current dynamic vegetation model, some parameters are more important than others in determining the vegetation distribution. Parameters that control plant carbon uptake and light-use efficiency have the predominant influence on the vegetation distribution of both woody and herbaceous plant functional types. The relative importance of different parameters varies temporally and spatially and is influenced by climate inputs. In addition to climate, these parameters play an important role in determining the vegetation distribution in the region. The parameter-based uncertainties contribute most to the total uncertainty. The current warming conditions lead to a complexity of vegetation responses in the region. Temperate trees will be more sensitive to climate variability, compared with boreal forest trees and C3 perennial grasses. This sensitivity would result in a unanimous northward greenness migration due to anomalous warming in the northern high latitudes. Temporally, boreal needleleaved evergreen plants are projected to decline considerably, and a large portion of C3 perennial grass is projected to disappear by the end of the 21st century. In contrast, the area of temperate trees would increase, especially under the most extreme A1FI scenario. As the warming continues, the northward greenness expansion in the Arctic region could continue. PMID:22822437

  4. Uncertainties of predictions from parton distributions II: theoretical errors

    NASA Astrophysics Data System (ADS)

    Martin, A. D.; Roberts, R. G.; Stirling, W. J.; Thorne, R. S.

    2004-06-01

    We study the uncertainties in parton distributions, determined in global fits to deep inelastic and related hard scattering data, due to so-called theoretical errors. Amongst these, we include potential errors due to the change of perturbative order (NLO to NNLO), ln(1/x) and ln(1-x) effects, absorptive corrections and higher-twist contributions. We investigate these uncertainties both by including explicit corrections to our standard global analysis and by examining the sensitivity to changes of the x, Q², W² cuts on the data that are fitted. In this way we expose those kinematic regions where the conventional DGLAP description is inadequate. As a consequence we obtain a set of NLO, and of NNLO, conservative partons where the data are fully consistent with DGLAP evolution, but over a restricted kinematic domain. We also examine the potential effects of such issues as the choice of input parametrisation, heavy target corrections, assumptions about the strange quark sea and isospin violation. Hence we are able to compare the theoretical errors with those uncertainties due to errors on the experimental measurements, which we studied previously. We use W and Higgs boson production at the Tevatron and the LHC as explicit examples of the uncertainties arising from parton distributions. For many observables the theoretical error is dominant, but for the cross section for W production at the Tevatron both the theoretical and experimental uncertainties are small, and hence the NNLO prediction may serve as a valuable luminosity monitor.

  5. Optimal control problems of epidemic systems with parameter uncertainties: application to a malaria two-age-classes transmission model with asymptomatic carriers.

    PubMed

    Mwanga, Gasper G; Haario, Heikki; Capasso, Vincenzo

    2015-03-01

    The main scope of this paper is to study the optimal control practices of malaria, by discussing the implementation of a catalog of optimal control strategies in the presence of parameter uncertainties, which are typical of infectious disease data. In this study we focus on a deterministic mathematical model for the transmission of malaria, including in particular asymptomatic carriers and two age classes in the human population. A partial qualitative analysis of the relevant ODE system has been carried out, leading to a realistic threshold parameter. For the deterministic model under consideration, four possible control strategies have been analyzed: the use of long-lasting treated mosquito nets, indoor residual spraying, and screening and treatment of symptomatic and asymptomatic individuals. The numerical results show that, using optimal control, the disease can be brought to a stable disease-free equilibrium when all four controls are used. The Incremental Cost-Effectiveness Ratio (ICER) for all possible combinations of the disease-control measures is determined. The numerical simulations of the optimal control in the presence of parameter uncertainty demonstrate the robustness of the optimal control: the main conclusions of the optimal control remain unchanged, even if inevitable variability remains in the control profiles. The results provide a promising framework for designing cost-effective strategies for disease control with multiple interventions, even under considerable uncertainty of model parameters. Copyright © 2014 Elsevier Inc. All rights reserved.
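
    The ICER computation itself is simple: the cost difference between two strategies divided by their effectiveness difference. A minimal sketch with hypothetical strategy values (not the paper's figures):

```python
# Incremental Cost-Effectiveness Ratio (ICER) for ranking intervention
# combinations; all numbers below are illustrative only.
def icer(cost_new, effect_new, cost_base, effect_base):
    # Cost difference per additional unit of health effect
    # (e.g., per infection averted).
    return (cost_new - cost_base) / (effect_new - effect_base)

# Hypothetical strategies: (total cost, infections averted)
baseline = (0.0, 0.0)
nets_only = (1.2e6, 3.0e4)
all_four = (4.0e6, 9.0e4)

print(icer(*nets_only, *baseline))  # cost per case averted vs. doing nothing
print(icer(*all_four, *nets_only))  # incremental cost of adding controls
```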

  6. Forest management under climatic and social uncertainty: trade-offs between reducing climate change impacts and fostering adaptive capacity.

    PubMed

    Seidl, Rupert; Lexer, Manfred J

    2013-01-15

    The unabated continuation of anthropogenic greenhouse gas emissions and the lack of an international consensus on a stringent climate change mitigation policy underscore the importance of adaptation for coping with the all but inevitable changes in the climate system. Adaptation measures in forestry have particularly long lead times. A timely implementation is thus crucial for reducing the considerable climate vulnerability of forest ecosystems. However, since future environmental conditions as well as future societal demands on forests are inherently uncertain, a core requirement for adaptation is robustness to a wide variety of possible futures. Here we explicitly address the roles of climatic and social uncertainty in forest management, and tackle the question of robustness of adaptation measures in the context of multi-objective sustainable forest management (SFM). We used the Austrian Federal Forests (AFF) as a case study, and employed a comprehensive vulnerability assessment framework based on ecosystem modeling, multi-criteria decision analysis, and practitioner participation. We explicitly considered climate uncertainty by means of three climate change scenarios, and accounted for uncertainty in future social demands by means of three societal preference scenarios regarding SFM indicators. We found that the effects of climatic and social uncertainty on the projected performance of management were in the same order of magnitude, underlining the notion that climate change adaptation requires an integrated social-ecological perspective. Furthermore, our analysis of adaptation measures revealed considerable trade-offs between reducing adverse impacts of climate change and facilitating adaptive capacity. This finding implies that prioritization between these two general aims of adaptation is necessary in management planning, which we suggest can draw on uncertainty analysis: Where the variation induced by social-ecological uncertainty renders measures aiming to reduce climate change impacts statistically insignificant (i.e., for approximately one third of the investigated management units of the AFF case study), fostering adaptive capacity is suggested as the preferred pathway for adaptation. We conclude that climate change adaptation needs to balance between anticipating expected future conditions and building the capacity to address unknowns and surprises. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Using uncertainty and sensitivity analyses in socioecological agent-based models to improve their analytical performance and policy relevance.

    PubMed

    Ligmann-Zielinska, Arika; Kramer, Daniel B; Spence Cheruvelil, Kendra; Soranno, Patricia A

    2014-01-01

    Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system.
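
    The output variance decomposition described here can be illustrated with a first-order Sobol' estimator in the pick-and-freeze (Saltelli-style) form; the toy model below stands in for the ABM, and is an assumption for illustration only.

```python
# Sketch of first-order Sobol' sensitivity indices via a pick-and-freeze
# estimator; the analytic toy model replaces an expensive ABM run.
import numpy as np

def model(x):                        # toy stand-in: y = x1 + 2*x2 + x2*x3
    return x[:, 0] + 2.0 * x[:, 1] + x[:, 1] * x[:, 2]

rng = np.random.default_rng(1)
n, d = 100_000, 3
A = rng.uniform(-1, 1, (n, d))       # quasi-random sampling would refine this
B = rng.uniform(-1, 1, (n, d))
fA, fB = model(A), model(B)
var_y = np.var(fA)

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]              # vary only input i between the matrices
    Si = np.mean(fB * (model(ABi) - fA)) / var_y
    print(f"S_{i+1} = {Si:.3f}")     # expect roughly 0.19, 0.75, 0.00
```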

  8. Using Uncertainty and Sensitivity Analyses in Socioecological Agent-Based Models to Improve Their Analytical Performance and Policy Relevance

    PubMed Central

    Ligmann-Zielinska, Arika; Kramer, Daniel B.; Spence Cheruvelil, Kendra; Soranno, Patricia A.

    2014-01-01

    Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system. PMID:25340764

  9. Evaluation of soil water stable isotope analysis by H2O(liquid)-H2O(vapor) equilibration method

    NASA Astrophysics Data System (ADS)

    Gralher, Benjamin; Stumpp, Christine

    2014-05-01

    Environmental tracers like stable isotopes of water (δ18O, δ2H) have proven to be valuable tools to study water flow and transport processes in soils. Recently, a new technique for soil water isotope analysis has been developed that employs a vapor phase in isothermal equilibrium with the liquid phase of interest. This has increased the potential application of water stable isotopes in unsaturated zone studies, as it supersedes the laborious extraction of soil water. However, uncertainties of analysis and influencing factors need to be considered. Therefore, the objective of this study was to evaluate different methodologies of analysing stable isotopes in soil water in order to reduce measurement uncertainty. The methodologies included different preparation procedures of soil cores for equilibration of vapor and soil water as well as raw data correction. Two different inflatable sample containers (freezer bags, bags containing a metal layer) and equilibration atmospheres (N2, dry air) were tested. The results showed that uncertainties for δ18O were higher than for δ2H, which could not be attributed to any specific detail of the processing routine. In particular, soil samples with high contents of organic matter showed an apparent isotope enrichment indicative of fractionation due to evaporation. However, comparison of water samples obtained from suction cups with the local meteoric water line indicated negligible fractionation processes in the investigated soils. Therefore, a method was developed to correct the raw data, reducing the uncertainties of the analysis. We conclude that the evaluated method is advantageous over traditional methods regarding simplicity, resource requirements, and sample throughput, but careful consideration needs to be given to sample handling and data processing. Thus, stable isotopes of water remain a good tool to determine water flow and transport processes in the unsaturated zone.

  10. Accounting for downscaling and model uncertainty in fine-resolution seasonal climate projections over the Columbia River Basin

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, Ali; Moradkhani, Hamid; Rana, Arun

    2018-01-01

    Climate change is expected to have severe impacts on natural systems as well as various socio-economic aspects of human life. This has urged scientific communities to improve the understanding of future climate and reduce the uncertainties associated with projections. In the present study, ten statistically downscaled CMIP5 GCMs at 1/16th deg. spatial resolution from two different downscaling procedures are utilized over the Columbia River Basin (CRB) to assess the changes in climate variables and characterize the associated uncertainties. Three climate variables, i.e. precipitation, maximum temperature, and minimum temperature, are studied for the historical period of 1970-2000 as well as future period of 2010-2099, simulated with representative concentration pathways of RCP4.5 and RCP8.5. Bayesian Model Averaging (BMA) is employed to reduce the model uncertainty and develop a probabilistic projection for each variable in each scenario. Historical comparison of long-term attributes of GCMs and observation suggests a more accurate representation for BMA than individual models. Furthermore, BMA projections are used to investigate future seasonal to annual changes of climate variables. Projections indicate significant increase in annual precipitation and temperature, with varied degree of change across different sub-basins of CRB. We then characterized uncertainty of future projections for each season over CRB. Results reveal that model uncertainty is the main source of uncertainty, among others. However, downscaling uncertainty considerably contributes to the total uncertainty of future projections, especially in summer. On the contrary, downscaling uncertainty appears to be higher than scenario uncertainty for precipitation.
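
    A minimal sketch of the weighting step behind Bayesian Model Averaging follows, assuming a Gaussian error model with a fixed, assumed variance; production BMA implementations (e.g., EM-fitted mixture variances) are richer, so this shows only the core idea. All model names and numbers are hypothetical.

```python
# Minimal sketch of Bayesian Model Averaging weights: each model's weight
# is proportional to its likelihood against observations under an assumed
# Gaussian error model with known sigma (a simplifying assumption here).
import numpy as np

obs = np.array([10.2, 11.0, 9.8, 10.5])           # observed climatology
models = {                                        # hypothetical GCM output
    "gcm1": np.array([10.0, 10.8, 10.1, 10.4]),
    "gcm2": np.array([12.0, 12.5, 11.9, 12.2]),
}
sigma = 0.5                                       # assumed error scale

loglik = {k: -0.5 * np.sum(((obs - v) / sigma) ** 2) for k, v in models.items()}
m = max(loglik.values())
unnorm = {k: np.exp(v - m) for k, v in loglik.items()}   # stabilized exponentials
z = sum(unnorm.values())
weights = {k: v / z for k, v in unnorm.items()}
print(weights)                                    # BMA weights sum to 1

# Probabilistic projection: weighted mixture of the model projections
proj = sum(weights[k] * models[k] for k in models)
```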

  11. Classifying the Sizes of Explosive Eruptions using Tephra Deposits: The Advantages of a Numerical Inversion Approach

    NASA Astrophysics Data System (ADS)

    Connor, C.; Connor, L.; White, J.

    2015-12-01

    Explosive volcanic eruptions are often classified by deposit mass and eruption column height. How well are these eruption parameters determined in older deposits, and how well can we reduce uncertainty using robust numerical and statistical methods? We describe an efficient and effective inversion and uncertainty quantification approach for estimating eruption parameters given a dataset of tephra deposit thickness and granulometry. The inversion and uncertainty quantification is implemented using the open-source PEST++ code. Inversion with PEST++ can be used with a variety of forward models and here is applied using Tephra2, a code that simulates advective and dispersive tephra transport and deposition. The Levenberg-Marquardt algorithm is combined with formal Tikhonov and subspace regularization to invert eruption parameters; a linear equation for conditional uncertainty propagation is used to estimate posterior parameter uncertainty. Both the inversion and uncertainty analysis support simultaneous analysis of the full eruption and wind-field parameterization. The combined inversion/uncertainty-quantification approach is applied to the 1992 eruption of Cerro Negro (Nicaragua), the 2011 Kirishima-Shinmoedake (Japan), and the 1913 Colima (Mexico) eruptions. These examples show that although eruption mass uncertainty is reduced by inversion against tephra isomass data, considerable uncertainty remains for many eruption and wind-field parameters, such as eruption column height. Supplementing the inversion dataset with tephra granulometry data is shown to further reduce the uncertainty of most eruption and wind-field parameters. We think the use of such robust models provides a better understanding of uncertainty in eruption parameters, and hence eruption classification, than is possible with more qualitative methods that are widely used.
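
    The inversion pattern, a forward model fitted to deposit observations by Levenberg-Marquardt least squares, can be sketched compactly. Below, a toy exponential-thinning forward model stands in for Tephra2, and all data values are invented for illustration; the real workflow uses PEST++ rather than SciPy.

```python
# Sketch of tephra-parameter inversion with Levenberg-Marquardt least
# squares; the exponential-thinning forward model is a toy stand-in.
import numpy as np
from scipy.optimize import least_squares

def forward(params, r):
    t0, L = params                    # proximal thickness, decay length
    return t0 * np.exp(-r / L)

r_obs = np.array([1.0, 2.0, 5.0, 10.0, 20.0])    # km from vent (hypothetical)
t_obs = np.array([80.0, 55.0, 22.0, 6.0, 0.7])   # cm, "measured" thickness

res = least_squares(lambda p: forward(p, r_obs) - t_obs,
                    x0=[50.0, 5.0], method="lm")
print(res.x)                          # recovered (t0, L)
```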

  12. Effects of relational uncertainty in heightening national identification and reactive approach motivation of Japanese.

    PubMed

    Terashima, Yuto; Takai, Jiro

    2017-03-23

    This study investigated whether relational uncertainty poses an uncertainty threat that causes compensatory behaviours among Japanese. We hypothesised that Japanese, as collectivists, would perceive relational uncertainty as an uncertainty threat. In two experiments, we manipulated relational uncertainty and confirmed that participants exhibited compensatory reactions to reduce the aversive feelings arising from it. In Study 1, we conducted a direct comparison between relational uncertainty, independent self-uncertainty, and control conditions. The results revealed that participants who were instructed to imagine events pertaining to relational uncertainty showed heightened national identification, as compensation, compared with participants in the control condition, whereas independent self-uncertainty did not provoke such effects. In Study 2, we again manipulated relational uncertainty; however, we also manipulated participants' individualism-collectivism cultural orientation through priming, and the analyses yielded a significant interaction effect between these variables. Relational uncertainty evoked reactive approach motivation, a cause of compensatory behaviours, among participants primed with collectivism, but not with individualism. It was concluded that the effect of uncertainty on compensatory behaviour is influenced by cultural priming, and that relational uncertainty is important to Japanese. © 2017 International Union of Psychological Science.

  13. Durability reliability analysis for corroding concrete structures under uncertainty

    NASA Astrophysics Data System (ADS)

    Zhang, Hao

    2018-02-01

    This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.
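
    A common way to implement the purely probabilistic method while keeping the two uncertainty types distinguishable is a two-loop (nested) Monte Carlo: the outer loop samples the epistemic parameters, the inner loop samples the aleatory variability. A minimal sketch with hypothetical corrosion variables and a toy limit state:

```python
# Two-loop Monte Carlo separating epistemic and aleatory uncertainty.
# The variables and limit-state function are illustrative, not the
# paper's chloride-ingress model.
import numpy as np

rng = np.random.default_rng(0)
p_fail = []
for _ in range(200):                        # epistemic loop
    mu_D = rng.normal(30.0, 5.0)            # uncertain mean diffusivity
    crit = rng.normal(60.0, 3.0)            # uncertain corrosion threshold
    D = rng.lognormal(np.log(mu_D), 0.25, size=5000)   # aleatory loop
    p_fail.append(np.mean(2.0 * D > crit))  # toy limit-state check
# Spread across the outer loop reflects the epistemic contribution:
print(np.percentile(p_fail, [5, 50, 95]))
```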

  14. Induced seismicity risk assessment for the 2006 Basel, Switzerland, Enhanced Geothermal System (EGS) project: Role of parameter uncertainty

    NASA Astrophysics Data System (ADS)

    Mignan, Arnaud; Landtwing, Delano; Mena, Banu; Wiemer, Stefan

    2013-04-01

    A project to exploit the geothermal potential of the crystalline rocks below the city of Basel, Switzerland, was abandoned in recent years due to unacceptable risk associated with increased seismic activity during and following hydraulic stimulation. The largest induced earthquake (Mw = 3.2, 8 December 2006) was widely felt by the local population and provoked slight non-structural damage to buildings. Here we present a probabilistic risk assessment analysis for the 2006 Basel EGS project, including uncertainty linked to the following parameters: induced seismicity forecast model, maximum magnitude, intensity prediction equation, site amplification or not, vulnerability index and cost function. Uncertainty is implemented using a logic tree composed of a total of 324 branches. Exposure is defined from the Basel area building stock of Baisch et al. (2009) (SERIANEX study). We first generate deterministic loss curves, defined as the insured value loss (IVL) as a function of earthquake magnitude. We calibrate the vulnerability curves for low EMS-98 intensities (using the input parameters fixed in the SERIANEX study) such that we match the real loss value, which has been estimated at 3 million CHF (lower than the paid value) for the Mw = 3.2 event. Coupling the deterministic loss curves with seismic hazard curves using the short-term earthquake risk (STEER) method, we obtain site-specific probabilistic loss curves (PLC, i.e., probability of exceeding a given IVL) for the 79 settlements considered. We then integrate over the different PLCs to calculate the most probable IVL. Based on the proposed logic tree, we find considerable variations in the most probable IVL, with lower values for the 6-day injection period than for the first 6 days of the post-injection period. This difference is due to a b-value significantly lower in the second period than in the first one, yielding a higher likelihood of larger earthquakes in the post-injection phase. Based on tornado diagrams, we show that the variability in the most probable IVL is mostly due to the choice of the vulnerability index, followed by the choice of whether or not to include site amplification. The choice of the cost function comes in third place. Based on these results, we finally provide guidelines for decision-making. To the best of our knowledge, this study is the first one to consider uncertainties at the hazard and risk level in a systematic way in the scope of induced seismicity regimes. The proposed method is transferable to other EGS projects as well as to earthquake sequences triggered by wastewater disposal, carbon capture and sequestration.
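
    The logic-tree mechanics can be sketched in a few lines: branch weights along each path multiply, and branch outcomes are combined into a weighted loss estimate. The branch labels, weights, factors, and loss values below are purely illustrative, not the Basel inputs.

```python
# Sketch of logic-tree aggregation over modelling choices; each branch
# is one combination of epistemic alternatives with a multiplied weight.
from itertools import product

vuln = {"low": (0.8, 2.0e6), "high": (0.2, 6.0e6)}   # (weight, base IVL)
site = {"no_amp": (0.5, 1.0), "amp": (0.5, 1.8)}     # amplification factor
cost = {"lin": (0.6, 1.0), "convex": (0.4, 1.3)}     # cost-function factor

expected = 0.0
for (wv, ivl), (ws, fs), (wc, fc) in product(vuln.values(),
                                             site.values(),
                                             cost.values()):
    w = wv * ws * wc                  # branch weight
    expected += w * ivl * fs * fc     # weighted branch loss
print(f"logic-tree mean IVL: {expected:.3e} CHF")
```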

  15. Variability And Uncertainty Analysis Of Contaminant Transport Model Using Fuzzy Latin Hypercube Sampling Technique

    NASA Astrophysics Data System (ADS)

    Kumar, V.; Nayagum, D.; Thornton, S.; Banwart, S.; Schuhmacher, M.; Lerner, D.

    2006-12-01

    Characterization of uncertainty associated with groundwater quality models is often of critical importance, for example where environmental models are employed in risk assessment. Insufficient data, inherent variability, and estimation errors of environmental model parameters introduce uncertainty into model predictions. However, uncertainty analysis using conventional methods such as standard Monte Carlo sampling (MCS) may not be efficient, or even suitable, for complex, computationally demanding models involving parametric variability and uncertainty of differing natures. General MCS, or variants of MCS such as Latin Hypercube Sampling (LHS), treat variability and uncertainty as a single random entity, and the generated samples are treated as crisp, with vagueness assumed to be randomness. Also, when models are used as purely predictive tools, uncertainty and variability lead to the need to assess the plausible range of model outputs. An improved, systematic variability and uncertainty analysis can provide insight into the level of confidence in model estimates and can aid in assessing how various possible model estimates should be weighed. The present study introduces Fuzzy Latin Hypercube Sampling (FLHS), a hybrid approach incorporating cognitive and noncognitive uncertainties. Noncognitive uncertainty, such as physical randomness or statistical uncertainty due to limited information, can be described by its own probability density function (PDF), whereas cognitive uncertainty, such as estimation error, can be described by a membership function for its fuzziness and by confidence intervals via α-cuts. An important property of this theory is its ability to merge the inexact generated data of the LHS approach to increase the quality of information. The FLHS technique ensures that the entire range of each variable is sampled, with proper incorporation of uncertainty and variability. A fuzzified statistical summary of the model results produces indices of sensitivity and uncertainty that relate the effects of heterogeneity and uncertainty of input variables to model predictions. The feasibility of the method is validated by assessing the uncertainty propagation of parameter values in estimating the contamination level of a drinking water supply well due to transport of dissolved phenolics from a contaminated site in the UK.
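
    For reference, the conventional LHS building block that FLHS fuzzifies can be sketched as follows; the parameter transforms at the end are hypothetical examples, not the study's actual inputs.

```python
# Minimal Latin Hypercube Sampling sketch: stratify each input's range
# into n equal-probability bins, take one draw per bin, then shuffle each
# column so the strata pair up at random across dimensions.
import numpy as np
from scipy.stats import norm, lognorm

def lhs(n_samples, n_dims, rng):
    u = (rng.random((n_samples, n_dims))
         + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_dims):
        rng.shuffle(u[:, j])          # decouple strata across dimensions
    return u                          # uniform [0,1); map via inverse CDFs

rng = np.random.default_rng(42)
samples = lhs(100, 2, rng)
# Example transforms (hypothetical parameters):
k = norm.ppf(samples[:, 0], loc=1e-5, scale=2e-6)   # e.g., conductivity
s = lognorm.ppf(samples[:, 1], s=0.5, scale=1.0)    # e.g., sorption term
```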

  16. Fuzzy Energy and Reserve Co-optimization With High Penetration of Renewable Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Cong; Botterud, Audun; Zhou, Zhi

    In this study, we propose a fuzzy-based energy and reserve co-optimization model with consideration of high penetration of renewable energy. Under the assumption of a fixed uncertainty set of renewables, a two-stage robust model is proposed for clearing energy and reserves in the first stage and checking the feasibility and robustness of re-dispatches in the second stage. Fuzzy sets and their membership functions are introduced into the optimization model to represent the satisfaction degree of the variable uncertainty sets. The lower bound of the uncertainty set is expressed as fuzzy membership functions. The solutions are obtained by transforming the fuzzy mathematical programming formulation into traditional mixed-integer linear programming problems.

  17. Uncertainty Considerations for Ballistic Limit Equations

    NASA Technical Reports Server (NTRS)

    Schonberg, W. P.; Evans, H. J.; Williamsen, J. E.; Boyer, R. L.; Nakayama, G. S.

    2005-01-01

    The overall risk for any spacecraft system is typically determined using a Probabilistic Risk Assessment (PRA). A PRA determines the overall risk associated with a particular mission by factoring in all known risks to the spacecraft during its mission. The threat to mission and human life posed by the micro-meteoroid and orbital debris (MMOD) environment is one of the risks. NASA uses the BUMPER II program to provide point estimate predictions of MMOD risk for the Space Shuttle and the ISS. However, BUMPER II does not provide uncertainty bounds or confidence intervals for its predictions. In this paper, we present possible approaches through which uncertainty bounds can be developed for the various damage prediction and ballistic limit equations encoded within the Shuttle and Station versions of BUMPER II.

  18. Fuzzy Energy and Reserve Co-optimization With High Penetration of Renewable Energy

    DOE PAGES

    Liu, Cong; Botterud, Audun; Zhou, Zhi; ...

    2016-10-21

    In this study, we propose a fuzzy-based energy and reserve co-optimization model with consideration of high penetration of renewable energy. Under the assumption of a fixed uncertainty set of renewables, a two-stage robust model is proposed for clearing energy and reserves in the first stage and checking the feasibility and robustness of re-dispatches in the second stage. Fuzzy sets and their membership functions are introduced into the optimization model to represent the satisfaction degree of the variable uncertainty sets. The lower bound of the uncertainty set is expressed as fuzzy membership functions. The solutions are obtained by transforming the fuzzy mathematical programming formulation into traditional mixed-integer linear programming problems.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kastenberg, W.E.; Apostolakis, G.; Dhir, V.K.

    Severe accident management can be defined as the use of existing and/or alternative resources, systems and actors to prevent or mitigate a core-melt accident. For each accident sequence and each combination of severe accident management strategies, there may be several options available to the operator, and each involves phenomenological and operational considerations regarding uncertainty. Operational uncertainties include operator, system and instrumentation behavior during an accident. A framework based on decision trees and influence diagrams has been developed which incorporates such criteria as feasibility, effectiveness, and adverse effects for evaluating potential severe accident management strategies. The framework is also capable of propagating both data and model uncertainty. It is applied to several potential strategies including PWR cavity flooding, BWR drywell flooding, PWR depressurization, and PWR feed and bleed.

  20. Uncertainty in recharge estimation: impact on groundwater vulnerability assessments for the Pearl Harbor Basin, O'ahu, Hawai'i, U.S.A.

    NASA Astrophysics Data System (ADS)

    Giambelluca, Thomas W.; Loague, Keith; Green, Richard E.; Nullet, Michael A.

    1996-06-01

    In this paper, uncertainty in recharge estimates is investigated relative to its impact on assessments of groundwater contamination vulnerability using a relatively simple pesticide mobility index, attenuation factor (AF). We employ a combination of first-order uncertainty analysis (FOUA) and sensitivity analysis to investigate recharge uncertainties for agricultural land on the island of O'ahu, Hawai'i, that is currently, or has been in the past, under sugarcane or pineapple cultivation. Uncertainty in recharge due to recharge component uncertainties is 49% of the mean for sugarcane and 58% of the mean for pineapple. The components contributing the largest amounts of uncertainty to the recharge estimate are irrigation in the case of sugarcane and precipitation in the case of pineapple. For a suite of pesticides formerly or currently used in the region, the contribution to AF uncertainty of recharge uncertainty was compared with the contributions of other AF components: retardation factor (RF), a measure of the effects of sorption; soil-water content at field capacity (ΘFC); and pesticide half-life (t1/2). Depending upon the pesticide, the contribution of recharge to uncertainty ranks second or third among the four AF components tested. The natural temporal variability of recharge is another source of uncertainty in AF, because the index is calculated using the time-averaged recharge rate. Relative to the mean, recharge variability is 10%, 44%, and 176% for the annual, monthly, and daily time scales, respectively, under sugarcane, and 31%, 112%, and 344%, respectively, under pineapple. In general, uncertainty in AF associated with temporal variability in recharge at all time scales exceeds AF. For chemicals such as atrazine or diuron under sugarcane, and atrazine or bromacil under pineapple, the range of AF uncertainty due to temporal variability in recharge encompasses significantly higher levels of leaching potential at some locations than that indicated by the AF estimate.
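
    The AF index referenced above is commonly written (following Rao et al.) as AF = exp(-0.693 d RF ΘFC / (q t1/2)), so the recharge rate q enters the index directly. A minimal sketch with illustrative values, not the O'ahu inputs:

```python
# Sketch of the attenuation factor (AF) index in a common formulation
# (after Rao et al.): the fraction of applied pesticide expected to
# leach past depth d. All numbers below are illustrative.
import math

def attenuation_factor(d, RF, theta_fc, q, t_half):
    # d: depth to groundwater [m]; RF: retardation factor [-]
    # theta_fc: soil-water content at field capacity [-]
    # q: net recharge rate [m/day]; t_half: pesticide half-life [days]
    travel_time = d * RF * theta_fc / q         # days to reach depth d
    return math.exp(-0.693 * travel_time / t_half)

print(attenuation_factor(d=15.0, RF=4.0, theta_fc=0.30,
                         q=0.002, t_half=180.0))
```

    Because q divides the travel time, proportional uncertainty in recharge maps directly into large swings in AF, which is why recharge uncertainty ranks so highly among the components tested.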

  1. Gap Size Uncertainty Quantification in Advanced Gas Reactor TRISO Fuel Irradiation Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, Binh T.; Einerson, Jeffrey J.; Hawkes, Grant L.

    The Advanced Gas Reactor (AGR)-3/4 experiment is the combination of the third and fourth tests conducted within the tristructural isotropic fuel development and qualification research program. The AGR-3/4 test consists of twelve independent capsules containing a fuel stack in the center surrounded by three graphite cylinders and shrouded by a stainless steel shell. This capsule design enables temperature control of both the fuel and the graphite rings by varying the neon/helium gas mixture flowing through the four resulting gaps. Knowledge of fuel and graphite temperatures is crucial for establishing the functional relationship between fission product release and irradiation thermal conditions.more » These temperatures are predicted for each capsule using the commercial finite-element heat transfer code ABAQUS. Uncertainty quantification reveals that the gap size uncertainties are among the dominant factors contributing to predicted temperature uncertainty due to high input sensitivity and uncertainty. Gap size uncertainty originates from the fact that all gap sizes vary with time due to dimensional changes of the fuel compacts and three graphite rings caused by extended exposure to high temperatures and fast neutron irradiation. Gap sizes are estimated using as-fabricated dimensional measurements at the start of irradiation and post irradiation examination dimensional measurements at the end of irradiation. Uncertainties in these measurements provide a basis for quantifying gap size uncertainty. However, lack of gap size measurements during irradiation and lack of knowledge about the dimension change rates lead to gap size modeling assumptions, which could increase gap size uncertainty. In addition, the dimensional measurements are performed at room temperature, and must be corrected to account for thermal expansion of the materials at high irradiation temperatures. Uncertainty in the thermal expansion coefficients for the graphite materials used in the AGR-3/4 capsules also increases gap size uncertainty. This study focuses on analysis of modeling assumptions and uncertainty sources to evaluate their impacts on the gap size uncertainty.« less

  2. Uncertainty Analysis of NASA Glenn's 8- by 6-Foot Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Stephens, Julia E.; Hubbard, Erin P.; Walter, Joel A.; McElroy, Tyler

    2016-01-01

    An analysis was performed to determine the measurement uncertainty of the Mach number of the 8- by 6-Foot Supersonic Wind Tunnel at the NASA Glenn Research Center. This paper details the analysis process used, including methods for handling limited data and complicated data correlations. Due to the complexity of the equations used, a Monte Carlo method was utilized for this uncertainty analysis. A summary of the findings is presented, addressing what the uncertainties are, how they impact various research tests in the facility, and methods of reducing the uncertainties in the future.
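
    As an illustration of the general approach (not the facility's actual data-reduction equations), Mach number can be derived from the isentropic total-to-static pressure ratio and its uncertainty propagated by Monte Carlo; pressures and noise levels below are assumed for the sketch.

```python
# Monte Carlo propagation of sensor uncertainty into Mach number via the
# isentropic relation M = sqrt( 2/(g-1) * ((p0/ps)**((g-1)/g) - 1) ).
import numpy as np

g = 1.4                                   # ratio of specific heats, air
rng = np.random.default_rng(7)
n = 100_000
p0 = rng.normal(101_325.0, 150.0, n)      # total pressure [Pa] + sensor noise
ps = rng.normal(25_000.0, 100.0, n)       # static pressure [Pa] + sensor noise

M = np.sqrt(2.0 / (g - 1.0) * ((p0 / ps) ** ((g - 1.0) / g) - 1.0))
print(M.mean(), M.std())                  # mean Mach and 1-sigma uncertainty
```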

  3. Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    NASA Technical Reports Server (NTRS)

    Groves, Curtis; Ilie, Marcel; Schallhorn, Paul

    2014-01-01

    Spacecraft components may be damaged due to airflow produced by Environmental Control Systems (ECS). There are uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field around a spacecraft from the ECS System. This paper describes an approach to estimate the uncertainty in using CFD to predict the airflow speeds around an encapsulated spacecraft.

  4. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  5. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  6. The epistemic and aleatory uncertainties of the ETAS-type models: an application to the Central Italy seismicity.

    PubMed

    Lombardi, A M

    2017-09-18

    Stochastic models provide quantitative evaluations about the occurrence of earthquakes. A basic component of this type of model is the treatment of uncertainties in defining the main features of an intrinsically random process. Even if, at a very basic level, any attempt to distinguish between types of uncertainty is questionable, a usual way to deal with this topic is to separate epistemic uncertainty, due to lack of knowledge, from aleatory variability, due to randomness. In the present study this problem is addressed in the narrow context of short-term modeling of earthquakes and, specifically, of ETAS modeling. By means of an application of a specific version of the ETAS model to the seismicity of Central Italy, recently struck by a sequence with a main event of Mw 6.5, the aleatory and epistemic (parametric) uncertainties are separated and quantified. The main result of the paper is that the parametric uncertainty of the ETAS-type model adopted here is much lower than the aleatory variability in the process. This result points out two main aspects: an analyst has good chances of setting up an ETAS-type model, but may describe and forecast earthquake occurrences, even retrospectively, with only limited precision and accuracy.
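
    For context, the aleatory ingredient here is the conditional intensity of the ETAS point process itself. One common temporal form (a standard formulation, not necessarily the exact variant used in the paper) is shown below; the parameter set θ = (μ, K, α, c, p) carries the epistemic (parametric) uncertainty estimated from the catalog.

```latex
% Standard temporal ETAS conditional intensity: a background rate plus
% Omori-type aftershock triggering from every past event t_i of magnitude M_i.
\lambda(t \mid \mathcal{H}_t) \;=\; \mu \;+\; \sum_{t_i < t} K \, e^{\alpha (M_i - M_0)} \, (t - t_i + c)^{-p}
```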

  7. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    NASA Astrophysics Data System (ADS)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.

    2018-03-01

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  8. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE PAGES

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; ...

    2018-02-09

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  9. Integrating info-gap decision theory with robust population management: a case study using the Mountain Plover.

    PubMed

    van der Burg, Max Post; Tyre, Andrew J

    2011-01-01

    Wildlife managers often make decisions under considerable uncertainty. In the most extreme case, a complete lack of data leads to uncertainty that is unquantifiable. Information-gap decision theory deals with assessing management decisions under extreme uncertainty, but it is not widely used in wildlife management. So too, robust population management methods were developed to deal with uncertainties in multiple-model parameters. However, the two methods have not, as yet, been used in tandem to assess population management decisions. We provide a novel combination of the robust population management approach for matrix models with the information-gap decision theory framework for making conservation decisions under extreme uncertainty. We applied our model to the problem of nest survival management in an endangered bird species, the Mountain Plover (Charadrius montanus). Our results showed that matrix sensitivities suggest that nest management is unlikely to have a strong effect on population growth rate, confirming previous analyses. However, given the amount of uncertainty about adult and juvenile survival, our analysis suggested that maximizing nest marking effort was a more robust decision to maintain a stable population. Focusing on the twin concepts of opportunity and robustness in an information-gap model provides a useful method of assessing conservation decisions under extreme uncertainty.
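
    To make the matrix-model side concrete, the sketch below computes the asymptotic growth rate as the dominant eigenvalue of a two-stage projection matrix and then sweeps a widening uncertainty horizon around juvenile survival, in the spirit of an info-gap robustness curve. All vital rates are hypothetical, not the Mountain Plover estimates.

```python
# Sketch of a two-stage (juvenile, adult) projection matrix: asymptotic
# growth rate = dominant eigenvalue; the loop mimics an info-gap sweep
# over worst-case juvenile survival at expanding horizons h.
import numpy as np

def growth_rate(nest_surv, juv_surv, adult_surv, clutch=3.0):
    f = 0.5 * clutch * nest_surv           # female fledglings per female
    A = np.array([[0.0,       f],
                  [juv_surv,  adult_surv]])
    return np.max(np.abs(np.linalg.eigvals(A)))

print(growth_rate(nest_surv=0.5, juv_surv=0.35, adult_surv=0.65))

for h in (0.0, 0.1, 0.2, 0.3):             # uncertainty horizon
    worst = growth_rate(0.5, 0.35 * (1 - h), 0.65)
    print(f"h={h:.1f}: worst-case lambda = {worst:.3f}")
```

    The pattern mirrors the abstract's point: when the eigenvalue is insensitive to nest survival but highly sensitive to poorly known survival rates, the robust choice can differ from the nominally optimal one.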

  10. PCDD/F EMISSIONS FROM UNCONTROLLED, DOMESTIC WASTE BURNING

    EPA Science Inventory

    Considerable uncertainty exists in the inventory of polychlorinated dibenzodioxin and dibenzofuran (PCDD/F) emissions from uncontrolled combustion sources such as backyard burning of domestic waste. The contribution from these sources to the worldwide PCDD/F balance may be signific...

  11. Sampling for Chemical Analysis.

    ERIC Educational Resources Information Center

    Kratochvil, Byron; And Others

    1984-01-01

    This review, designed to make analysts aware of uncertainties introduced into analytical measurements during sampling, is organized under these headings: general considerations; theory; standards; and applications related to mineralogy, soils, sediments, metallurgy, atmosphere, water, biology, agriculture and food, medical and clinical areas, oil…

  12. POWER TO DETECT REGIONAL TRENDS IN HABITAT CHARACTERISTICS

    EPA Science Inventory

    The condition of stream habitat draws considerable attention concerning the protection and recovery of salmonid populations in the West. Habitat degradation continues and substantial sums of money are spent on habitat restoration. However, aided by uncertainty concerning the ad...

  13. POWER TO DETECT REGIONAL TRENDS IN PHYSICAL HABITAT

    EPA Science Inventory

    The condition of stream habitat draws considerable attention concerning the protection and recovery of salmonid populations in the West. Habitat degradation continues and substantial sums of money are spent on habitat restoration. However, aided by uncertainty concerning the ad...

  14. Sources of Uncertainty and the Interpretation of Short-Term Fluctuations

    NASA Astrophysics Data System (ADS)

    Lewandowsky, S.; Risbey, J.; Cowtan, K.; Rahmstorf, S.

    2016-12-01

    The alleged significant slowdown in global warming during the first decade of the 21st century, and the appearance of a discrepancy between models and observations, has attracted considerable research attention. We trace the history of this research and show how its conclusions were shaped by several sources of uncertainty and ambiguity about models and observations. We show that as those sources of uncertainty were gradually eliminated by further research, insufficient evidence remained to infer any discrepancy between models and observations or a significant slowing of warming. Specifically, we show that early research had to contend with uncertainties about coverage biases in the global temperature record and biases in the sea surface temperature observations which turned out to have exaggerated the extent of slowing. In addition, uncertainties in the observed forcings were found to have exaggerated the mismatch between models and observations. Further sources of uncertainty that were ultimately eliminated involved the use of incommensurate sea surface temperature data between models and observations and a tacit interpretation of model projections as predictions or forecasts. After all those sources of uncertainty were eliminated, the most recent research finds little evidence for an unusual slowdown or a discrepancy between models and observations. We discuss whether these different kinds of uncertainty could have been anticipated or managed differently, and how one can apply those lessons to future short-term fluctuations in warming.

  15. Fuel cycle cost uncertainty from nuclear fuel cycle comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, J.; McNelis, D.; Yim, M.S.

    2013-07-01

    This paper examined the uncertainty in fuel cycle cost (FCC) calculation by considering both model and parameter uncertainty. Four different fuel cycle options were compared in the analysis, including the once-through cycle (OT), the DUPIC cycle, the MOX cycle and a closed fuel cycle with fast reactors (FR). The model uncertainty was addressed by using three different FCC modeling approaches with and without the time value of money consideration. The relative ratios of FCC in comparison to OT did not change much by using different modeling approaches. This observation was consistent with the results of the sensitivity study for the discount rate. Two different sets of data with uncertainty ranges of unit costs were used to address the parameter uncertainty of the FCC calculation. The sensitivity study showed that the dominating contributor to the total variance of FCC is the uranium price. In general, the FCC of OT was found to be the lowest, followed by FR, MOX, and DUPIC. But depending on the uranium price, the FR cycle was found to have a lower FCC than OT. The reprocessing cost was also found to have a major impact on FCC.
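
    The model uncertainty examined here hinges on whether the time value of money is included. A minimal sketch of that comparison, with invented cash flows rather than the paper's unit-cost data:

      # Illustrative only: cash flows are hypothetical, not the paper's data.
      # Compares a fuel cycle cost with and without discounting.
      cash_flows = {  # year relative to fuel loading -> cost (hypothetical units)
          -2: 800.0,   # uranium purchase and conversion
          -1: 1200.0,  # enrichment and fabrication
           5: 400.0,   # interim storage
          10: 600.0,   # disposal
      }

      def fcc(flows, rate):
          # Discount (or inflate) every cost to the reference year t = 0.
          return sum(c / (1.0 + rate) ** t for t, c in flows.items())

      print("undiscounted:", fcc(cash_flows, 0.0))
      print("r = 5%      :", round(fcc(cash_flows, 0.05), 1))

    Because front-end costs occur before the reference year and back-end costs long after, discounting shifts the weight of the total toward the front end, which is why the choice of discount rate matters for the cycle comparison.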

  16. Comparison of Two Methods for Estimating the Sampling-Related Uncertainty of Satellite Rainfall Averages Based on a Large Radar Data Set

    NASA Technical Reports Server (NTRS)

    Lau, William K. M. (Technical Monitor); Bell, Thomas L.; Steiner, Matthias; Zhang, Yu; Wood, Eric F.

    2002-01-01

    The uncertainty of rainfall estimated from averages of discrete samples collected by a satellite is assessed using a multi-year radar data set covering a large portion of the United States. The sampling-related uncertainty of rainfall estimates is evaluated for all combinations of 100 km, 200 km, and 500 km space domains, 1 day, 5 day, and 30 day rainfall accumulations, and regular sampling time intervals of 1 h, 3 h, 6 h, 8 h, and 12 h. These extensive analyses are combined to characterize the sampling uncertainty as a function of space and time domain, sampling frequency, and rainfall characteristics by means of a simple scaling law. Moreover, it is shown that both parametric and non-parametric statistical techniques of estimating the sampling uncertainty produce comparable results. Sampling uncertainty estimates, however, do depend on the choice of technique for obtaining them. They can also vary considerably from case to case, reflecting the great variability of natural rainfall, and should therefore be expressed in probabilistic terms. Rainfall calibration errors are shown to affect comparison of results obtained by studies based on data from different climate regions and/or observation platforms.
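
    The core procedure, estimating sampling error by revisiting a continuous record at satellite-like intervals, can be illustrated with a synthetic rain series; the statistics of the toy series are invented and only the mechanics carry over.

      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic hourly "radar" rain-rate series over 30 days: intermittent
      # and highly skewed, loosely mimicking natural rainfall statistics.
      hours = 30 * 24
      rain = rng.gamma(0.2, 2.0, hours) * (rng.random(hours) < 0.15)

      truth = rain.mean()
      for dt in (1, 3, 6, 8, 12):  # sampling interval in hours
          # One estimate per possible satellite visit phase, then the spread.
          estimates = [rain[p::dt].mean() for p in range(dt)]
          rel_rms = np.std(estimates) / truth
          print(f"dt = {dt:2d} h: relative sampling uncertainty ~ {rel_rms:.1%}")

    As expected, continuous coverage (dt = 1 h) shows no sampling error, and the relative uncertainty grows with the sampling interval.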

  17. Cloud Condensation Nuclei Prediction Error from Application of Kohler Theory: Importance for the Aerosol Indirect Effect

    NASA Technical Reports Server (NTRS)

    Sotiropoulou, Rafaella-Eleni P.; Nenes, Athanasios; Adams, Peter J.; Seinfeld, John H.

    2007-01-01

    In situ observations of aerosol and cloud condensation nuclei (CCN) and the GISS GCM Model II' with an online aerosol simulation and explicit aerosol-cloud interactions are used to quantify the uncertainty in radiative forcing and autoconversion rate from application of Kohler theory. Simulations suggest that application of Kohler theory introduces a 10-20% uncertainty in global average indirect forcing and 2-11% uncertainty in autoconversion. Regionally, the uncertainty in indirect forcing ranges between 10 and 20%, and between 5 and 50% for autoconversion. These results are insensitive to the range of updraft velocity and water vapor uptake coefficient considered. This study suggests that Kohler theory (as implemented in climate models) is not a significant source of uncertainty for aerosol indirect forcing but can be substantial for assessments of aerosol effects on the hydrological cycle in climatically sensitive regions of the globe. This implies that improvements in the representation of GCM subgrid processes and aerosol size distribution will mostly benefit indirect forcing assessments. Predictions of autoconversion, by nature, will be subject to considerable uncertainty; its reduction may require explicit representation of size-resolved aerosol composition and mixing state.

  18. Aeroservoelastic Model Validation and Test Data Analysis of the F/A-18 Active Aeroelastic Wing

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.; Prazenica, Richard J.

    2003-01-01

    Model validation and flight test data analysis require careful consideration of the effects of uncertainty, noise, and nonlinearity. Uncertainty prevails in the data analysis techniques and results in a composite model uncertainty from unmodeled dynamics, assumptions and mechanics of the estimation procedures, noise, and nonlinearity. A fundamental requirement for reliable and robust model development is an attempt to account for each of these sources of error, in particular, for model validation, robust stability prediction, and flight control system development. This paper is concerned with data processing procedures for uncertainty reduction in model validation for stability estimation and nonlinear identification. F/A-18 Active Aeroelastic Wing (AAW) aircraft data is used to demonstrate signal representation effects on uncertain model development, stability estimation, and nonlinear identification. Data is decomposed using adaptive orthonormal best-basis and wavelet-basis signal decompositions for signal denoising into linear and nonlinear identification algorithms. Nonlinear identification from a wavelet-based Volterra kernel procedure is used to extract nonlinear dynamics from aeroelastic responses, and to assist model development and uncertainty reduction for model validation and stability prediction by removing a class of nonlinearity from the uncertainty.

  19. Emulation: A fast stochastic Bayesian method to eliminate model space

    NASA Astrophysics Data System (ADS)

    Roberts, Alan; Hobbs, Richard; Goldstein, Michael

    2010-05-01

    Joint inversion of large 3D datasets has been the goal of geophysicists ever since the datasets first started to be produced. There are two broad approaches to this kind of problem: traditional deterministic inversion schemes, and more recently developed Bayesian search methods such as MCMC (Markov Chain Monte Carlo). However, both kinds of scheme have proved prohibitively expensive, in both computing power and time, due to the normally very large model space which needs to be searched using forward model simulators that take considerable time to run. At the heart of strategies aimed at accomplishing this kind of inversion is the question of how to reliably and practicably reduce the size of the model space in which the inversion is to be carried out. Here we present a practical Bayesian method, known as emulation, which can address this issue. Emulation is a Bayesian technique used with considerable success in a number of technical fields, such as in astronomy, where the evolution of the universe has been modelled using this technique, and in the petroleum industry, where history matching of hydrocarbon reservoirs is carried out. The method of emulation involves building a fast-to-compute uncertainty-calibrated approximation to a forward model simulator. We do this by modelling the output data from a number of forward simulator runs by a computationally cheap function, and then fitting the coefficients defining this function to the model parameters. By calibrating the error of the emulator output with respect to the full simulator output, we can use the emulator to screen out large areas of model space which contain only implausible models. For example, starting with what may be considered a geologically reasonable prior model space of 10000 models, using the emulator we can quickly show that only models which lie within 10% of that model space actually produce output data which is plausibly similar in character to an observed dataset. We can thus much more tightly constrain the input model space for a deterministic inversion or MCMC method. By using this technique jointly on several datasets (specifically seismic, gravity, and magnetotelluric (MT) data describing the same region), we can include in our modelling uncertainties in the data measurements and the relationships between the various physical parameters involved, as well as the model representation uncertainty, and at the same time further reduce the range of plausible models to several percent of the original model space. Being stochastic in nature, the output posterior parameter distributions also allow our understanding of, and beliefs about, a geological region to be objectively updated, with full assessment of uncertainties, and so the emulator is also an inversion-type tool in its own right, with the advantage (as with any Bayesian method) that our uncertainties from all sources (both data and model) can be fully evaluated.
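
    As a minimal sketch of the emulation workflow described above (with a one-dimensional toy simulator standing in for a seismic/gravity/MT forward model), one can fit a cheap surrogate to a few simulator runs, calibrate its error, and discard regions of model space whose implausibility exceeds a conventional cutoff of 3:

      import numpy as np

      rng = np.random.default_rng(2)

      def simulator(x):
          # Stand-in for an expensive forward model.
          return np.exp(-x) * np.sin(3 * x)

      # 1. Train a cheap polynomial emulator on a handful of runs.
      x_train = np.linspace(0, 2, 8)
      coeffs = np.polyfit(x_train, simulator(x_train), 4)
      emulate = lambda x: np.polyval(coeffs, x)

      # 2. Calibrate the emulator error against held-out runs.
      x_check = rng.uniform(0, 2, 50)
      sigma_em = np.std(emulate(x_check) - simulator(x_check))

      # 3. Screen model space: keep only inputs whose emulated output lies
      #    within 3 combined standard deviations of the observation.
      y_obs, sigma_obs = simulator(0.7), 0.02
      candidates = rng.uniform(0, 2, 100000)
      impl = np.abs(emulate(candidates) - y_obs) / np.hypot(sigma_em, sigma_obs)
      kept = candidates[impl < 3.0]
      print(f"kept {kept.size / candidates.size:.1%} of the prior model space")

    The surviving fraction of candidates is the "plausible" model space that a subsequent deterministic inversion or MCMC run would then explore.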

  20. Uncertainty and equipoise: at interplay between epistemology, decision making and ethics.

    PubMed

    Djulbegovic, Benjamin

    2011-10-01

    In recent years, various authors have proposed that the concept of equipoise be abandoned because it conflates the practice of clinical care with clinical research. At the same time, the equipoise opponents acknowledge the necessity of clinical research if there are unresolved uncertainties about the effects of proposed healthcare interventions. As equipoise represents just one measure of uncertainty, proposals to abandon equipoise while maintaining a requirement for addressing uncertainties are contradictory and ultimately not valid. As acknowledgment and articulation of uncertainties represent key scientific and moral requirements for human experimentation, the concept of equipoise remains the most useful framework to link the theory of human experimentation with the theory of rational choice. In this article, I show how uncertainty (equipoise) is at the intersection between epistemology, decision making and ethics of clinical research. In particular, I show how our formulation of responses to uncertainties of hoped-for benefits and unknown harms of testing is a function of the way humans cognitively process information. This approach is based on the view that considerations of ethics and rationality cannot be separated. I analyze the response to uncertainties as it relates to the dual-processing theory, which postulates that a rational approach to (clinical research) decision making depends on both the analytical, deliberative processes embodied in the scientific method (system II) and good human intuition (system I). Ultimately, our choices can only become wiser if we understand a close and intertwined relationship between irreducible uncertainty, inevitable errors and unavoidable injustice.

  1. Assessment of Uncertainties for the NIST 1016 mm Guarded-Hot-Plate Apparatus: Extended Analysis for Low-Density Fibrous-Glass Thermal Insulation.

    PubMed

    Zarr, Robert R

    2010-01-01

    An assessment of uncertainties for the National Institute of Standards and Technology (NIST) 1016 mm Guarded-Hot-Plate apparatus is presented. The uncertainties are reported in a format consistent with current NIST policy on the expression of measurement uncertainty. The report describes a procedure for determination of component uncertainties for thermal conductivity and thermal resistance for the apparatus under operation in either the double-sided or single-sided mode of operation. An extensive example for computation of uncertainties for the single-sided mode of operation is provided for a low-density fibrous-glass blanket thermal insulation. For this material, the relative expanded uncertainty for thermal resistance increases from 1 % for a thickness of 25.4 mm to 3 % for a thickness of 228.6 mm. Although these uncertainties have been developed for a particular insulation material, the procedure and, to a lesser extent, the results are applicable to other insulation materials measured at a mean temperature close to 297 K (23.9 °C, 75 °F). The analysis identifies dominant components of uncertainty and, thus, potential areas for future improvement in the measurement process. For the NIST 1016 mm Guarded-Hot-Plate apparatus, considerable improvement, especially at higher values of thermal resistance, may be realized by developing better control strategies for guarding that include better measurement techniques for the guard gap thermopile voltage and the temperature sensors.
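
    The reporting format used here combines component standard uncertainties in quadrature and applies a coverage factor of k = 2. A schematic of that arithmetic, with invented component values rather than the NIST budget:

      import numpy as np

      # Quadrature combination of relative component standard uncertainties
      # into a relative expanded uncertainty with coverage factor k = 2.
      # Component values are illustrative, not the NIST budget.
      rel_components = [0.2e-2, 0.3e-2, 0.25e-2]  # e.g. power, delta-T, thickness
      u_rel = np.sqrt(sum(u ** 2 for u in rel_components))
      print(f"U_rel (k = 2) = {2 * u_rel:.2%}")

    Because the components add in quadrature, the largest term dominates, which is why identifying the dominant component points directly at where measurement improvement pays off.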

  2. Assessment of Uncertainties for the NIST 1016 mm Guarded-Hot-Plate Apparatus: Extended Analysis for Low-Density Fibrous-Glass Thermal Insulation

    PubMed Central

    Zarr, Robert R.

    2010-01-01

    An assessment of uncertainties for the National Institute of Standards and Technology (NIST) 1016 mm Guarded-Hot-Plate apparatus is presented. The uncertainties are reported in a format consistent with current NIST policy on the expression of measurement uncertainty. The report describes a procedure for determination of component uncertainties for thermal conductivity and thermal resistance for the apparatus under operation in either the double-sided or single-sided mode of operation. An extensive example for computation of uncertainties for the single-sided mode of operation is provided for a low-density fibrous-glass blanket thermal insulation. For this material, the relative expanded uncertainty for thermal resistance increases from 1 % for a thickness of 25.4 mm to 3 % for a thickness of 228.6 mm. Although these uncertainties have been developed for a particular insulation material, the procedure and, to a lesser extent, the results are applicable to other insulation materials measured at a mean temperature close to 297 K (23.9 °C, 75 °F). The analysis identifies dominant components of uncertainty and, thus, potential areas for future improvement in the measurement process. For the NIST 1016 mm Guarded-Hot-Plate apparatus, considerable improvement, especially at higher values of thermal resistance, may be realized by developing better control strategies for guarding that include better measurement techniques for the guard gap thermopile voltage and the temperature sensors. PMID:27134779

  3. Uncertainty and Equipoise: At Interplay Between Epistemology, Decision-Making and Ethics

    PubMed Central

    Djulbegovic, Benjamin

    2011-01-01

    In recent years, various authors have proposed that the concept of equipoise be abandoned since it conflates the practice of clinical care with clinical research. At the same time, the equipoise opponents acknowledge the necessity of clinical research if there are unresolved uncertainties about the effects of proposed healthcare interventions. Since equipoise represents just one measure of uncertainty, proposals to abandon equipoise while maintaining a requirement for addressing uncertainties are contradictory and ultimately not valid. As acknowledgment and articulation of uncertainties represent key scientific and moral requirements for human experimentation, the concept of equipoise remains the most useful framework to link the theory of human experimentation with the theory of rational choice. In this paper, I show how uncertainty (equipoise) is at the intersection between epistemology, decision-making and ethics of clinical research. In particular, I show how our formulation of responses to uncertainties of hoped-for benefits and unknown harms of testing is a function of the way humans cognitively process information. This approach is based on the view that considerations of ethics and rationality cannot be separated. I analyze the response to uncertainties as it relates to the dual-processing theory, which postulates that a rational approach to (clinical research) decision-making depends on both the analytical, deliberative processes embodied in the scientific method (system II) and “good” human intuition (system I). Ultimately, our choices can only become wiser if we understand a close and intertwined relationship between irreducible uncertainty, inevitable errors, and unavoidable injustice. PMID:21817885

  4. Evaluation strategies and uncertainty calculation of isotope amount ratios measured by MC ICP-MS on the example of Sr.

    PubMed

    Horsky, Monika; Irrgeher, Johanna; Prohaska, Thomas

    2016-01-01

    This paper critically reviews the state of the art of isotope amount ratio measurements by solution-based multi-collector inductively coupled plasma mass spectrometry (MC ICP-MS) and presents guidelines for corresponding data reduction strategies and uncertainty assessments based on the example of n((87)Sr)/n((86)Sr) isotope ratios. This ratio shows variation attributable to natural radiogenic processes and mass-dependent fractionation. The applied calibration strategies can display these differences. In addition, a proper statement of uncertainty of measurement, including all relevant influence quantities, is a metrological prerequisite. A detailed instructive procedure for the calculation of combined uncertainties is presented for Sr isotope amount ratios using three different strategies of correction for instrumental isotopic fractionation (IIF): traditional internal correction, standard-sample bracketing, and a combination of both, using Zr as internal standard. Uncertainties are quantified by means of a Kragten spreadsheet approach, including the consideration of correlations between individual input parameters to the model equation. The resulting uncertainties are compared with uncertainties obtained from the partial derivatives approach and Monte Carlo propagation of distributions. We obtain relative expanded uncertainties (U_rel; k = 2) of n((87)Sr)/n((86)Sr) of < 0.03 %, when normalization values are not propagated. A comprehensive propagation, including certified values and the internal normalization ratio in nature, increases relative expanded uncertainties by about a factor of two, and the correction for IIF becomes the major contributor.
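
    A bare-bones Kragten calculation can be set up in a few lines: perturb each input of the model equation by its standard uncertainty, record the change in the measurand, and combine the changes in quadrature. The model equation and numbers below are placeholders, not the paper's Sr data, and this simple form ignores the correlations that the paper additionally accounts for.

      import numpy as np

      def kragten(f, x, u):
          # Kragten's spreadsheet method: perturb each input by its standard
          # uncertainty, recompute the measurand, combine in quadrature.
          y0 = f(x)
          contrib = []
          for i in range(len(x)):
              xp = x.copy()
              xp[i] += u[i]
              contrib.append(f(xp) - y0)
          return y0, np.sqrt(sum(c * c for c in contrib)), contrib

      # Placeholder model equation: a raw ratio times a fractionation
      # correction built from two measured quantities (values invented).
      f = lambda p: p[0] * (p[1] / p[2]) ** 0.5
      x = np.array([0.71030, 8.3752, 8.3790])
      u = np.array([0.00008, 0.0004, 0.0006])
      y, uc, _ = kragten(f, x, u)
      print(f"y = {y:.5f}, expanded uncertainty U(k=2) = {2 * uc:.5f}")

    The per-input contributions returned alongside the combined value are what make the spreadsheet form useful: they show at a glance which input dominates the budget.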

  5. Robust root clustering for linear uncertain systems using generalized Lyapunov theory

    NASA Technical Reports Server (NTRS)

    Yedavalli, R. K.

    1993-01-01

    Consideration is given to the problem of matrix root clustering in subregions of a complex plane for linear state space models with real parameter uncertainty. The nominal matrix root clustering theory of Gutman & Jury (1981) using the generalized Liapunov equation is extended to the perturbed matrix case, and bounds are derived on the perturbation to maintain root clustering inside a given region. The theory makes it possible to obtain an explicit relationship between the parameters of the root clustering region and the uncertainty range of the parameter space.

  6. Improved entrance optic for global irradiance measurements with a Brewer spectrophotometer.

    PubMed

    Gröbner, Julian

    2003-06-20

    A new entrance optic for a Brewer spectrophotometer has been designed and tested both in the laboratory and during solar measurements. The integrated cosine response deviates by 2.4% from the ideal, with an uncertainty of +/- 1%. The systematic uncertainties of global solar irradiance measurements with this new entrance optic are considerably reduced compared with measurements with the traditional design. Simultaneous solar irradiance measurements between the Brewer spectrophotometer and a spectroradiometer equipped with a state-of-the-art shaped diffuser agreed to within +/- 2% during a five-day measurement period.

  7. Using discrete choice experiments within a cost-benefit analysis framework: some considerations.

    PubMed

    McIntosh, Emma

    2006-01-01

    A great advantage of the stated preference discrete choice experiment (SPDCE) approach to economic evaluation methodology is its immense flexibility within applied cost-benefit analyses (CBAs). However, while the use of SPDCEs in healthcare has increased markedly in recent years, there has been a distinct lack of equivalent CBAs in healthcare using such SPDCE-derived valuations. This article outlines specific issues and some practical suggestions for consideration relevant to the development of CBAs using SPDCE-derived benefits. The article shows that SPDCE-derived CBA can adopt recent developments in cost-effectiveness methodology including the cost-effectiveness plane, appropriate consideration of uncertainty, the net-benefit framework and probabilistic sensitivity analysis methods, while maintaining the theoretical advantage of the SPDCE approach. The concept of a cost-benefit plane is no different in principle to the cost-effectiveness plane and can be a useful tool for reporting and presenting the results of CBAs. However, there are many challenging issues to address for the advancement of CBA methodology using SPDCEs within healthcare. Particular areas for development include the importance of accounting for uncertainty in SPDCE-derived willingness-to-pay values, the methodology of SPDCEs in clinical trial settings and economic models, measurement issues pertinent to using SPDCEs specifically in healthcare, and the importance of issues such as consideration of the dynamic nature of healthcare and the resulting impact this has on the validity of attribute definitions and context.

  8. Projecting biodiversity and wood production in future forest landscapes: 15 key modeling considerations.

    PubMed

    Felton, Adam; Ranius, Thomas; Roberge, Jean-Michel; Öhman, Karin; Lämås, Tomas; Hynynen, Jari; Juutinen, Artti; Mönkkönen, Mikko; Nilsson, Urban; Lundmark, Tomas; Nordin, Annika

    2017-07-15

    A variety of modeling approaches can be used to project the future development of forest systems, and help to assess the implications of different management alternatives for biodiversity and ecosystem services. This diversity of approaches does, however, present both an opportunity and an obstacle for those trying to decide which modeling technique to apply, and for those interpreting the management implications of model output. Furthermore, the breadth of issues relevant to addressing key questions related to forest ecology, conservation biology, silviculture, and economics requires insights stemming from a number of distinct scientific disciplines. As forest planners, conservation ecologists, ecological economists and silviculturalists, experienced with modeling trade-offs and synergies between biodiversity and wood biomass production, we identified fifteen key considerations relevant to assessing the pros and cons of alternative modeling approaches. Specifically, we identified key considerations linked to study question formulation, modeling forest dynamics, forest processes, study landscapes, spatial and temporal aspects, and the key response metrics (biodiversity and wood biomass production), as well as dealing with trade-offs and uncertainties. We also provide illustrative examples from the modeling literature stemming from the key considerations assessed. We use our findings to reiterate the need for explicitly addressing and conveying the limitations and uncertainties of any modeling approach taken, and the need for interdisciplinary research efforts when addressing the conservation of biodiversity and sustainable use of environmental resources. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Uncertainty Categorization, Modeling, and Management for Regional Water Supply Planning

    NASA Astrophysics Data System (ADS)

    Fletcher, S.; Strzepek, K. M.; AlSaati, A.; Alhassan, A.

    2016-12-01

    Many water planners face increased pressure on water supply systems from growing demands, variability in supply and a changing climate. Short-term variation in water availability and demand; long-term uncertainty in climate, groundwater storage, and sectoral competition for water; and varying stakeholder perspectives on the impacts of water shortages make it difficult to assess the necessity of expensive infrastructure investments. We categorize these uncertainties on two dimensions: whether they are the result of stochastic variation or epistemic uncertainty, and whether the uncertainties can be described probabilistically or are deep uncertainties whose likelihood is unknown. We develop a decision framework that combines simulation for probabilistic uncertainty, sensitivity analysis for deep uncertainty and Bayesian decision analysis for uncertainties that are reduced over time with additional information. We apply this framework to two contrasting case studies - drought preparedness in Melbourne, Australia and fossil groundwater depletion in Riyadh, Saudi Arabia - to assess the impacts of different types of uncertainty on infrastructure decisions. Melbourne's water supply system relies on surface water, which is impacted by natural variation in rainfall, and a market-based system for managing water rights. Our results show that small, flexible investment increases can mitigate shortage risk considerably at reduced cost. Riyadh, by contrast, relies primarily on desalination for municipal use and fossil groundwater for agriculture, and a centralized planner makes allocation decisions. Poor regional groundwater measurement makes it difficult to know when groundwater pumping will become uneconomical, resulting in epistemic uncertainty. However, collecting more data can reduce the uncertainty, suggesting the need for different uncertainty modeling and management strategies in Riyadh than in Melbourne. We will categorize the two systems and propose appropriate methods for decision making under uncertainty from the state of the art. We will compare the efficiency of alternative approaches to the two case studies. Finally, we will present a hybrid decision analytic tool to address the synthesis of uncertainties.

  10. Rate coefficients for the reaction of formaldehyde with HO2 radicals from fluorescence spectroscopy of HOCH2OO radicals

    NASA Astrophysics Data System (ADS)

    Bunkan, Arne; Amédro, Damien; Crowley, John

    2017-04-01

    The reaction of formaldehyde with HO2 radicals constitutes a minor but significant sink of formaldehyde in the troposphere, as well as a possible interference in other formaldehyde photooxidation experiments: HCHO + HO2 ⇌ HOCH2OO (1). Due to the difficulty of simultaneously monitoring the reactant and product concentrations while preventing interfering secondary chemistry, there is considerable uncertainty in the literature values for the reaction rate coefficients. We have used two-photon excited-fragment spectroscopy (TPEFS), originally developed for monitoring HNO3 formation in kinetic experiments, to monitor the formation of the HOCH2OO radical. Dispersed and single-wavelength fluorescence emission following the 193 nm photolysis of HOCH2OO has been recorded and analysed. Characterisation of the method is presented along with rate coefficients for the reaction of HCHO with HO2 radicals at tropospheric temperatures.

  11. The Cost of Uncertain Life Span*

    PubMed Central

    Edwards, Ryan D.

    2012-01-01

    A considerable amount of uncertainty surrounds the length of human life. The standard deviation in adult life span is about 15 years in the U.S., and theory and evidence suggest it is costly. I calibrate a utility-theoretic model of preferences over length of life and show that one fewer year in standard deviation is worth about half a mean life year. Differences in the standard deviation exacerbate cross-sectional differences in life expectancy between the U.S. and other industrialized countries, between rich and poor countries, and among poor countries. Accounting for the cost of life-span variance also appears to amplify recently discovered patterns of convergence in world average human well-being. This is partly for methodological reasons and partly because unconditional variance in human length of life, primarily the component due to infant mortality, has exhibited even more convergence than life expectancy. PMID:22368324

  12. Designing and verifying a disassembly line approach to cope with the upsurge of end-of-life vehicles in China.

    PubMed

    Zhang, Chunliang; Chen, Ming

    2018-06-01

    An upsurge of end-of-life vehicles (ELVs) is emerging in China, portending a potentially monumental environmental crisis. The disassembly line approach is expected to be an effective solution to these increasing volumes. Due to the complexity of the vehicle product and the uncertainties of disassembly processes, the complete disassembly line system must be considered in detail. We have designed and constructed a novel disassembly line using a flexible transition technique, with the objective of complete disassembly. Prior to productivity testing, comparative Arena-based simulations of four scenarios were performed and the best scenario was selected. The results show that guaranteeing the cycle time is the key to meeting the productivity target of 30,000 vehicles per year. To achieve it, some constructive measures, such as forcible-entry tools, are given. Copyright © 2018 Elsevier Ltd. All rights reserved.
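
    The cycle-time requirement implied by the productivity target can be checked with back-of-envelope arithmetic; the working-time figures below are assumptions, not values from the paper.

      # Back-of-envelope check of the cycle-time requirement (working-time
      # assumptions are ours: 250 working days, one 8 h shift per day).
      target = 30000                    # vehicles per year
      working_seconds = 250 * 8 * 3600  # available line time per year

      cycle_time = working_seconds / target
      print(f"required cycle time <= {cycle_time:.0f} s per vehicle")  # 240 s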

  13. Using physiologically based pharmacokinetic modeling to address nonlinear kinetics and changes in rodent physiology and metabolism due to aging and adaptation in deriving reference values for propylene glycol methyl ether and propylene glycol methyl ether acetate.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirman, C R.; Sweeney, Lisa M.; Corley, Rick A.

    2005-04-01

    Reference values, including an oral reference dose (RfD) and an inhalation reference concentration (RfC), were derived for propylene glycol methyl ether (PGME), and an oral RfD was derived for its acetate (PGMEA). These values were based upon transient sedation observed in F344 rats and B6C3F1 mice during a two-year inhalation study. The dose-response relationship for sedation was characterized using internal dose measures as predicted by a physiologically based pharmacokinetic (PBPK) model for PGME and its acetate. PBPK modeling was used to account for changes in rodent physiology and metabolism due to aging and adaptation, based on data collected during weeks 1, 2, 26, 52, and 78 of a chronic inhalation study. The peak concentration of PGME in richly perfused tissues was selected as the most appropriate internal dose measure based upon a consideration of the mode of action for sedation and similarities in tissue partitioning between brain and other richly perfused tissues. Internal doses (peak tissue concentrations of PGME) were designated as either no-observed-adverse-effect levels (NOAELs) or lowest-observed-adverse-effect levels (LOAELs) based upon the presence or absence of sedation at each time-point, species, and sex in the two-year study. Distributions of the NOAEL and LOAEL values expressed in terms of internal dose were characterized using an arithmetic mean and standard deviation, with the mean internal NOAEL serving as the basis for the reference values, which was then divided by appropriate uncertainty factors. Where data permitted, chemical-specific adjustment factors were derived to replace default uncertainty factor values of ten. Nonlinear kinetics were predicted by the model in all species at PGME concentrations exceeding 100 ppm, which complicates interspecies and low-dose extrapolations. To address this complication, reference values were derived using two approaches which differ with respect to the order in which these extrapolations were performed: (1) uncertainty factor application followed by interspecies extrapolation (PBPK modeling); and (2) interspecies extrapolation followed by uncertainty factor application. The resulting reference values for these two approaches are substantially different, with values from the former approach being 7-fold higher than those from the latter approach. Such a striking difference between the two approaches reveals an underlying issue that has received little attention in the literature regarding the application of uncertainty factors and interspecies extrapolations to compounds where saturable kinetics occur in the range of the NOAEL. Until such discussions have taken place, reference values based on the latter approach are recommended for risk assessments involving human exposures to PGME and PGMEA.
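
    The final step of such a derivation is simple arithmetic: the point of departure is divided by the product of uncertainty factors (or chemical-specific adjustment factors where data permit). All numbers below are invented; the paper's actual calculation operates on PBPK-predicted internal doses.

      # Generic reference-value arithmetic (all numbers hypothetical).
      noael = 100.0          # point of departure, e.g. mg/kg-day
      uf_interspecies = 3.0  # reduced from the default 10 where
                             # chemical-specific data support it
      uf_intraspecies = 10.0
      uf_database = 1.0

      reference_value = noael / (uf_interspecies * uf_intraspecies * uf_database)
      print(f"reference value = {reference_value:.2f} mg/kg-day")  # 3.33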

  14. Comparing the effects of different land management strategies across several land types on California's landscape carbon and associated greenhouse gas budgets

    NASA Astrophysics Data System (ADS)

    Di Vittorio, A. V.; Simmonds, M.; Nico, P. S.

    2017-12-01

    Land-based carbon sequestration and GreenHouse Gas (GHG) reduction strategies are often implemented in small patches and evaluated independently from each other, which poses several challenges to determining their potential benefits at the regional scales at which carbon/GHG targets are defined. These challenges include inconsistent methods, uncertain scalability to larger areas, and lack of constraints such as land ownership and competition among multiple strategies. To address such challenges we have developed an integrated carbon and GHG budget model of California's entire landscape, delineated by geographic region, land type, and ownership. This empirical model has annual time steps and includes net ecosystem carbon exchange, wildfire, multiple forest management practices including wood and bioenergy production, cropland and rangeland soil management, various land type restoration activities, and land cover change. While the absolute estimates vary considerably due to uncertainties in initial carbon densities and ecosystem carbon exchange rates, the estimated effects of particular management activities with respect to baseline are robust across these uncertainties. Uncertainty in land use/cover change data is also critical, as different rates of shrubland to grassland conversion can switch the system from a carbon source to a sink. The results indicate that reducing urban area expansion has substantial and consistent benefits, while the effects of direct land management practices vary and depend largely on the available management area. Increasing forest fuel reduction extent over the baseline contributes to annual GHG costs during increased management, and annual benefits after increased management ceases. Cumulatively, it could take decades to recover the cost of 14 years of increased fuel reduction. However, forest carbon losses can be completely offset within 20 years through increases in urban forest fraction and marsh restoration. Additionally, highly uncertain black carbon estimates dominate the overall GHG budget due to wildfire, forest management, and bioenergy production. Overall, this tool is well suited for exploring suites of management options and extents throughout California in order to quantify potential regional carbon sequestration and GHG emission benefits.

  15. Uncertainty Quantification of CFD Data Generated for a Model Scramjet Isolator Flowfield

    NASA Technical Reports Server (NTRS)

    Baurle, R. A.; Axdahl, E. L.

    2017-01-01

    Computational fluid dynamics is now considered to be an indispensable tool for the design and development of scramjet engine components. Unfortunately, the quantification of uncertainties is rarely addressed with anything other than sensitivity studies, so the degree of confidence associated with the numerical results remains exclusively with the subject matter expert that generated them. This practice must be replaced with a formal uncertainty quantification process for computational fluid dynamics to play an expanded role in the system design, development, and flight certification process. Given the limitations of current hypersonic ground test facilities, this expanded role is believed to be a requirement by some in the hypersonics community if scramjet engines are to be given serious consideration as a viable propulsion system. The present effort describes a simple, relatively low cost, nonintrusive approach to uncertainty quantification that includes the basic ingredients required to handle both aleatoric (random) and epistemic (lack of knowledge) sources of uncertainty. The nonintrusive nature of the approach allows the computational fluid dynamicist to perform the uncertainty quantification with the flow solver treated as a "black box". Moreover, a large fraction of the process can be automated, allowing the uncertainty assessment to be readily adapted into the engineering design and development workflow. In the present work, the approach is applied to a model scramjet isolator problem where the desire is to validate turbulence closure models in the presence of uncertainty. In this context, the relevant uncertainty sources are determined and accounted for to allow the analyst to delineate turbulence model-form errors from other sources of uncertainty associated with the simulation of the facility flow.
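
    The nonintrusive approach treats the solver as a black box, so a sketch needs nothing more than a wrapper that samples aleatoric inputs from probability distributions and sweeps epistemic inputs over intervals. The stub model and all parameter values below are invented.

      import numpy as np

      rng = np.random.default_rng(3)

      def flow_solver(mach_in, wall_temp, turb_const):
          # Stub standing in for the "black box" CFD code.
          return 2.1 * mach_in - 0.001 * wall_temp + 0.4 * turb_const

      # Aleatoric inputs: sampled from assumed distributions.
      n = 200
      mach = rng.normal(2.5, 0.05, n)
      twall = rng.normal(300.0, 10.0, n)

      # Epistemic input (e.g. a turbulence model constant): swept over an
      # interval rather than sampled, since only its bounds are known.
      for c in (0.07, 0.09, 0.11):
          q = flow_solver(mach, twall, c)
          print(f"C = {c}: mean = {q.mean():.3f}, std = {q.std():.3f}")

    Reporting a statistic per epistemic setting, rather than pooling everything into one distribution, is what keeps the two kinds of uncertainty segregated in the result.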

  16. Does internal climate variability overwhelm climate change signals in streamflow? The upper Po and Rhone basin case studies.

    PubMed

    Fatichi, S; Rimkus, S; Burlando, P; Bordoy, R

    2014-09-15

    Projections of climate change effects in streamflow are increasingly required to plan water management strategies. These projections are however largely uncertain due to the spread among climate model realizations, internal climate variability, and difficulties in transferring climate model results at the spatial and temporal scales required by catchment hydrology. A combination of a stochastic downscaling methodology and distributed hydrological modeling was used in the ACQWA project to provide projections of future streamflow (up to year 2050) for the upper Po and Rhone basins, respectively located in northern Italy and south-western Switzerland. Results suggest that internal (stochastic) climate variability is a fundamental source of uncertainty, typically comparable or larger than the projected climate change signal. Therefore, climate change effects in streamflow mean, frequency, and seasonality can be masked by natural climatic fluctuations in large parts of the analyzed regions. An exception to the overwhelming role of stochastic variability is represented by high elevation catchments fed by glaciers where streamflow is expected to be considerably reduced due to glacier retreat, with consequences appreciable in the main downstream rivers in August and September. Simulations also identify regions (west upper Rhone and Toce, Ticino river basins) where a strong precipitation increase in the February to April period projects streamflow beyond the range of natural climate variability during the melting season. This study emphasizes the importance of including internal climate variability in climate change analyses, especially when compared to the limited uncertainty that would be accounted for by few deterministic projections. The presented results could be useful in guiding more specific impact studies, although design or management decisions should be better based on reliability and vulnerability criteria as suggested by recent literature. Copyright © 2013 Elsevier B.V. All rights reserved.

  17. The Orbits and Masses of Pluto's Satellites

    NASA Astrophysics Data System (ADS)

    Brozovic, Marina; Jacobson, R. A.

    2013-05-01

    We report on the numerically integrated orbital fits of Pluto's satellites, Charon, Nix, Hydra, and S/2011 (134340) 1, to an extensive set of astrometric, mutual event, and stellar occultation observations over the time interval April 1965 to July 2011. The observations of Charon relative to Pluto have been corrected for the Pluto center-of-figure to center-of-light (COF) offset due to Pluto's albedo variations. The most recently discovered satellite, S/2012 (134340) 1, is fit with a precessing ellipse because its observation set is insufficient to constrain a numerically integrated orbit. The Pluto system mass is well determined with the current data. However, Charon's mass still carries considerable uncertainty, because the primary source of information for the Charon mass is a small quantity of absolute position measurements that are sensitive to the independent motions of Pluto and Charon about the system barycenter. We used a bounded least-squares algorithm to try to constrain the masses of Nix, Hydra, and S/2011 (134340) 1, but the current dataset appears to be too sparse for mass determination. The long-term dynamical interaction among the satellites does yield a weak determination of Hydra's mass. We investigated the effect of more astrometry of S/2012 (134340) 1 on the mass determination of the other satellites and found no improvement with the additional data. We have delivered ephemerides based on our integrated orbits to the New Horizons project along with their expected uncertainties at the time of the spacecraft encounter with the Pluto system. Acknowledgments: The research described in this paper was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.

  18. Optimization of vibratory energy harvesters with stochastic parametric uncertainty: a new perspective

    NASA Astrophysics Data System (ADS)

    Haji Hosseinloo, Ashkan; Turitsyn, Konstantin

    2016-04-01

    Vibration energy harvesting has been shown to be a promising power source for many small-scale applications, mainly because of the considerable reduction in the energy consumption of small electronics and the scalability issues of conventional batteries. However, energy harvesters may not be as robust as conventional batteries, and their performance could drastically deteriorate in the presence of uncertainty in their parameters. Hence, the study of uncertainty propagation and optimization under uncertainty is essential for the proper and robust performance of harvesters in practice. While all studies have focused on expectation optimization, we propose a new and more practical optimization perspective: optimization for the worst-case (minimum) power. We formulate the problem in a generic fashion and, as a simple example, apply it to a linear piezoelectric energy harvester. We study the effect of parametric uncertainty in its natural frequency, load resistance, and electromechanical coupling coefficient on its worst-case power and then optimize for it under different confidence levels. The results show a significant improvement in the worst-case power of a harvester designed in this way compared to that of a naively (deterministically) optimized harvester.
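
    The proposed perspective replaces expectation maximization with a max-min problem: choose the design that maximizes the worst-case power over the parameter uncertainty set. A grid-search sketch with a placeholder power model (not the paper's harvester equations):

      import numpy as np

      def power(r_load, omega_n, coupling):
          # Placeholder harvester power model: a resonance term times an
          # impedance-matching term (purely illustrative).
          omega = 100.0  # excitation frequency, rad/s (assumed fixed)
          resonance = 1.0 / (1.0 + ((omega - omega_n) / 5.0) ** 2)
          matching = coupling * r_load / (1.0 + r_load) ** 2
          return resonance * matching

      # Uncertainty box around the nominal parameters (e.g. a confidence set).
      omegas = np.linspace(95.0, 105.0, 21)   # natural frequency
      ks = np.linspace(0.8, 1.2, 9)           # coupling coefficient

      best_r, best_worst = None, -np.inf
      for r in np.linspace(0.1, 5.0, 50):     # candidate load resistances
          worst = min(power(r, w, k) for w in omegas for k in ks)
          if worst > best_worst:
              best_r, best_worst = r, worst
      print(f"max-min design: R = {best_r:.2f}, worst-case P = {best_worst:.4f}")

    Widening the uncertainty box (a lower confidence level on the parameters) shrinks the guaranteed worst-case power, which is the trade-off the paper explores.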

  19. Estimating the spatial distribution of wintering little brown bat populations in the eastern United States

    USGS Publications Warehouse

    Russell, Robin E.; Tinsley, Karl; Erickson, Richard A.; Thogmartin, Wayne E.; Jennifer A. Szymanski,

    2014-01-01

    Depicting the spatial distribution of wildlife species is an important first step in developing management and conservation programs for particular species. Accurate representation of a species distribution is important for predicting the effects of climate change, land-use change, management activities, disease, and other landscape-level processes on wildlife populations. We developed models to estimate the spatial distribution of little brown bat (Myotis lucifugus) wintering populations in the United States east of the 100th meridian, based on known hibernacula locations. From these data, we developed several scenarios of wintering population counts per county that incorporated uncertainty in the spatial distribution of the hibernacula as well as uncertainty in the size of the current little brown bat population. We assessed the variability in our results arising from the effects of this uncertainty. Despite considerable uncertainty in the known locations of overwintering little brown bats in the eastern United States, we believe that models accurately depicting the effects of the uncertainty are useful for making management decisions, as these models are a coherent organization of the best available information.

  20. The effects of geometric uncertainties on computational modelling of knee biomechanics

    PubMed Central

    Fisher, John; Wilcox, Ruth

    2017-01-01

    The geometry of the articular components of the knee is an important factor in predicting joint mechanics in computational models. There are a number of uncertainties in the definition of the geometry of cartilage and meniscus, and evaluating the effects of these uncertainties is fundamental to understanding the level of reliability of the models. In this study, the sensitivity of knee mechanics to geometric uncertainties was investigated by comparing polynomial-based and image-based knee models and varying the size of the meniscus. The results suggested that the geometric uncertainties in cartilage and meniscus resulting from the resolution of MRI and the accuracy of segmentation had considerable effects on the predicted knee mechanics. Moreover, even if the mathematical geometric descriptors are very close to the image-based articular surfaces, the detailed contact pressure distribution produced by the mathematical geometric descriptors was not the same as that of the image-based model. However, the trends predicted by the models based on mathematical geometric descriptors were similar to those of the image-based models. PMID:28879008

  1. Efficient Location Uncertainty Treatment for Probabilistic Modelling of Portfolio Loss from Earthquake Events

    NASA Astrophysics Data System (ADS)

    Scheingraber, Christoph; Käser, Martin; Allmann, Alexander

    2017-04-01

    Probabilistic seismic risk analysis (PSRA) is a well-established method for modelling loss from earthquake events. In the insurance industry, it is widely employed for probabilistic modelling of loss to a distributed portfolio. In this context, precise exposure locations are often unknown, which results in considerable loss uncertainty. The treatment of exposure uncertainty has already been identified as an area where PSRA would benefit from increased research attention. However, so far, epistemic location uncertainty has not been the focus of much research. We propose a new framework for efficient treatment of location uncertainty. To demonstrate the usefulness of this novel method, a large number of synthetic portfolios resembling real-world portfolios are systematically analyzed. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or proportion of risk items with unknown coordinates on loss variability. Several sampling criteria to increase the computational efficiency of the framework are proposed and put into the wider context of well-established Monte-Carlo variance reduction techniques. The performance of each of the proposed criteria is analyzed.
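
    The basic treatment can be sketched by resampling the unknown exposure coordinates within their administrative zone on every simulation run, so that location uncertainty shows up directly in the loss distribution; the portfolio, hazard, and vulnerability below are all toy stand-ins.

      import numpy as np

      rng = np.random.default_rng(4)

      # Hypothetical portfolio: some risk items have known coordinates, the
      # rest are only known to lie somewhere within a 100 x 100 km zone.
      n_known, n_unknown, n_sims = 60, 40, 5000
      known_xy = rng.uniform(0, 100, (n_known, 2))
      values = rng.lognormal(1.0, 1.0, n_known + n_unknown)

      def ground_motion(xy, epicentre=(50.0, 50.0)):
          # Toy attenuation: shaking decays with distance from the epicentre.
          d = np.hypot(xy[:, 0] - epicentre[0], xy[:, 1] - epicentre[1])
          return np.exp(-d / 30.0)

      losses = np.empty(n_sims)
      for s in range(n_sims):
          # Resample the unknown coordinates on every run.
          unknown_xy = rng.uniform(0, 100, (n_unknown, 2))
          xy = np.vstack([known_xy, unknown_xy])
          losses[s] = np.sum(values * ground_motion(xy))

      print(f"mean loss {losses.mean():.1f}, coefficient of variation from "
            f"location uncertainty {losses.std() / losses.mean():.2%}")

    Variance reduction techniques of the kind the record mentions aim to reach a stable loss distribution with far fewer such resampling runs.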

  2. Model uncertainty of various settlement estimation methods in shallow tunnels excavation; case study: Qom subway tunnel

    NASA Astrophysics Data System (ADS)

    Khademian, Amir; Abdollahipour, Hamed; Bagherpour, Raheb; Faramarzi, Lohrasb

    2017-10-01

    In addition to the numerous planning and executive challenges, underground excavation in urban areas is always accompanied by certain destructive effects, especially on the ground surface; ground settlement is the most important of these effects, and different empirical, analytical and numerical methods exist for estimating it. Since geotechnical models are associated with considerable model uncertainty, this study characterized the model uncertainty of settlement estimation models through a systematic comparison between model predictions and past performance data derived from instrumentation. To do so, the amount of surface settlement induced by excavation of the Qom subway tunnel was estimated via empirical (Peck), analytical (Loganathan and Poulos) and numerical (FDM) methods; the resulting maximum settlement values were 1.86, 2.02 and 1.52 cm, respectively. The comparison of these predicted amounts with the actual data from instrumentation was employed to specify the uncertainty of each model. The numerical model outcomes, with a relative error of 3.8%, best matched the reality, and the analytical method, with a relative error of 27.8%, yielded the highest level of model uncertainty.
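
    The model-uncertainty comparison reduces to relative errors against the instrumented settlement. The quoted 3.8% and 27.8% figures are consistent with a measured value of roughly 1.58 cm; that back-calculated value is used below and is an inference, not a number stated in the record.

      # Relative error of each settlement prediction against the measured
      # value (measured value back-calculated from the quoted errors).
      measured = 1.58  # cm
      predictions = {"empirical (Peck)": 1.86,
                     "analytical (Loganathan & Poulos)": 2.02,
                     "numerical (FDM)": 1.52}
      for name, p in predictions.items():
          print(f"{name:35s} relative error {abs(p - measured) / measured:5.1%}")

    Run as written, this also fills in the figure the record omits: the empirical method lands at roughly 17.7%, between the numerical and analytical results.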

  3. Effect of soil property uncertainties on permafrost thaw projections: a calibration-constrained analysis

    NASA Astrophysics Data System (ADS)

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2016-02-01

    The effects of soil property uncertainties on permafrost thaw projections are studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The null-space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of predictive uncertainty (due to soil property (parametric) uncertainty) and the inter-annual climate variability due to year to year differences in CESM climate forcings. After calibrating to measured borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant predictive uncertainties in projected active layer thickness and annual thaw depth-duration even with a specified future climate. Inter-annual climate variability in projected soil moisture content and Stefan number are small. A volume- and time-integrated Stefan number decreases significantly, indicating a shift in subsurface energy utilization in the future climate (latent heat of phase change becomes more important than heat conduction). Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we quantify the relative magnitude of soil property uncertainty to another source of permafrost uncertainty, structural climate model uncertainty. We show that the effect of calibration-constrained uncertainty in soil properties, although significant, is less than that produced by structural climate model uncertainty for this location.

  4. Uncertainty analysis of thermocouple measurements used in normal and abnormal thermal environment experiments at Sandia's Radiant Heat Facility and Lurance Canyon Burn Site.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakos, James Thomas

    2004-04-01

    It would not be possible to confidently qualify weapon systems performance or validate computer codes without knowing the uncertainty of the experimental data used. This report provides uncertainty estimates associated with thermocouple data for temperature measurements from two of Sandia's large-scale thermal facilities. These two facilities (the Radiant Heat Facility (RHF) and the Lurance Canyon Burn Site (LCBS)) routinely gather data from normal and abnormal thermal environment experiments. They are managed by Fire Science & Technology Department 09132. Uncertainty analyses were performed for several thermocouple (TC) data acquisition systems (DASs) used at the RHF and LCBS. These analyses apply to Type K, chromel-alumel thermocouples of various types: fiberglass sheathed TC wire, mineral-insulated, metal-sheathed (MIMS) TC assemblies, and are easily extended to other TC materials (e.g., copper-constantan). Several DASs were analyzed: (1) A Hewlett-Packard (HP) 3852A system, and (2) several National Instrument (NI) systems. The uncertainty analyses were performed on the entire system from the TC to the DAS output file. Uncertainty sources include TC mounting errors, ANSI standard calibration uncertainty for Type K TC wire, potential errors due to temperature gradients inside connectors, extension wire uncertainty, DAS hardware uncertainties including noise, common mode rejection ratio, digital voltmeter accuracy, mV to temperature conversion, analog to digital conversion, and other possible sources. Typical results for 'normal' environments (e.g., maximum of 300-400 K) showed the total uncertainty to be about ±1% of the reading in absolute temperature. In high temperature or high heat flux ('abnormal') thermal environments, total uncertainties range up to ±2-3% of the reading (maximum of 1300 K). The higher uncertainties in abnormal thermal environments are caused by increased errors due to the effects of imperfect TC attachment to the test item. 'Best practices' are provided in Section 9 to help the user to obtain the best measurements possible.
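
    The combination step in such an analysis is a root-sum-square over the component uncertainties. A schematic with invented component values (not Sandia's actual budget) reproduces the order of magnitude quoted for normal environments:

      import numpy as np

      # Hypothetical Type K component standard uncertainties at a 400 K
      # reading, in kelvin (values illustrative, not Sandia's budget).
      components = {
          "wire calibration (ANSI limits)": 2.2,
          "TC mounting / attachment":       1.5,
          "extension wire":                 0.5,
          "connector temperature gradient": 0.3,
          "DVM accuracy + A/D conversion":  0.4,
          "mV-to-temperature conversion":   0.2,
      }

      u_c = np.sqrt(sum(u ** 2 for u in components.values()))
      print(f"combined: +/-{u_c:.1f} K ({u_c / 400:.1%} of a 400 K reading)")

    With these placeholder values the combined result is about ±2.8 K, i.e. roughly ±0.7% of the reading, in the same range as the ±1% the record reports for normal environments.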

  5. Orographic precipitation at global and regional scales: Observational uncertainty and evaluation of 25-km global model simulations

    NASA Astrophysics Data System (ADS)

    Schiemann, Reinhard; Roberts, Charles J.; Bush, Stephanie; Demory, Marie-Estelle; Strachan, Jane; Vidale, Pier Luigi; Mizielinski, Matthew S.; Roberts, Malcolm J.

    2015-04-01

    Precipitation over land exhibits a high degree of variability due to the complex interaction of the precipitation-generating atmospheric processes with coastlines, the heterogeneous land surface, and orography. Global general circulation models (GCMs) have traditionally had very limited ability to capture this variability on the mesoscale (here ~50-500 km) due to their low resolution. This has changed with recent investments in resolution, and ensembles of multidecadal climate simulations of atmospheric GCMs (AGCMs) with ~25 km grid spacing are becoming increasingly available. Here, we evaluate the mesoscale precipitation distribution in one such set of simulations obtained in the UPSCALE (UK on PrACE - weather-resolving Simulations of Climate for globAL Environmental risk) modelling campaign with the HadGEM3-GA3 AGCM. Increased model resolution also poses new challenges to the observational datasets used to evaluate models. Global gridded data products such as those provided by the Global Precipitation Climatology Project (GPCP) are invaluable for assessing large-scale features of the precipitation distribution but may not sufficiently resolve mesoscale structures. In the absence of independent estimates, the intercomparison of different observational datasets may be the only way to gain some insight into the uncertainties associated with these observations. Here, we focus on mid-latitude continental regions where observations based on higher-density gauge networks are available in addition to the global data sets: Europe/the Alps, South and East Asia, and the continental US. The ability of GCMs to represent mesoscale variability is of interest in its own right, as climate information on this scale is required by impact studies. An additional motivation for the research proposed here arises from continuing efforts to quantify the components of the global radiation budget and water cycle. Recent estimates based on radiation measurements suggest that the global mean precipitation/evaporation may be up to 10 W m-2 (about 0.35 mm day-1) larger than the estimate obtained from GPCP. While the main part of this discrepancy is thought to be due to the underestimation of remotely-sensed ocean precipitation, there is also considerable uncertainty about 'unobserved' precipitation over land, in particular in the form of snow in regions of high latitude/altitude. We aim to contribute to this discussion, at least at a qualitative level, by considering case studies of how area-averaged mountain precipitation is represented in different observational datasets and by HadGEM3-GA3 at different resolutions. Our results show that the AGCM simulates considerably more orographic precipitation at higher resolution. We find this at the global scale both for the winter and summer hemispheres, as well as in several case studies in mid-latitude regions. Gridded observations based on gauge measurements generally capture the mesoscale spatial variability of precipitation, but differ strongly from one another in the magnitude of area-averaged precipitation, so that they are of very limited use for evaluating this aspect of the modelled climate. We are currently conducting a sensitivity experiment (coarse-grained orography in high-resolution HadGEM3) to further investigate the resolution sensitivity seen in the model.

  6. Simulating the Stability of Colloidal Amorphous Iron Oxide in Natural Water

    EPA Science Inventory

    Considerable uncertainty exists as to whether existing thermodynamic equilibrium solid/water partitioning paradigms can be used to assess the mobility of insoluble manufactured nanomaterials in the aquatic environment. In this work, the traditional Derjaguin–Landau–Verwey–Overbee...

  7. Uncertainty issues in forest monitoring: All you wanted to know about uncertainties and never dared to ask

    Treesearch

    Michael Köhl; Charles Scott; Daniel Plugge

    2013-01-01

    Uncertainties are a composite of errors arising from observations and the appropriateness of models. An error budget approach can be used to identify and accumulate the sources of errors to estimate change in emissions between two points in time. Various forest monitoring approaches can be used to estimate the changes in emissions due to deforestation and forest...

  8. An Uncertainty Data Set for Passive Microwave Satellite Observations of Warm Cloud Liquid Water Path

    NASA Astrophysics Data System (ADS)

    Greenwald, Thomas J.; Bennartz, Ralf; Lebsock, Matthew; Teixeira, João.

    2018-04-01

    The first extended comprehensive data set of the retrieval uncertainties in passive microwave observations of cloud liquid water path (CLWP) for warm oceanic clouds has been created for practical use in climate applications. Four major sources of systematic errors were considered over the 9-year record of the Advanced Microwave Scanning Radiometer-EOS (AMSR-E): clear-sky bias, cloud-rain partition (CRP) bias, cloud-fraction-dependent bias, and cloud temperature bias. Errors were estimated using a unique merged AMSR-E/Moderate resolution Imaging Spectroradiometer Level 2 data set as well as observations from the Cloud-Aerosol Lidar with Orthogonal Polarization and the CloudSat Cloud Profiling Radar. To quantify the CRP bias more accurately, a new parameterization was developed to improve the inference of CLWP in warm rain. The cloud-fraction-dependent bias was found to be a combination of the CRP bias, an in-cloud bias, and an adjacent precipitation bias. Globally, the mean net bias was 0.012 kg/m2, dominated by the CRP and in-cloud biases, but with considerable regional and seasonal variation. Good qualitative agreement between a bias-corrected AMSR-E CLWP climatology and ship observations in the Northeast Pacific suggests that the bias estimates are reasonable. However, a possible underestimation of the net bias in certain conditions may be due in part to the crude method used in classifying precipitation, underscoring the need for an independent method of detecting rain in warm clouds. This study demonstrates the importance of combining visible-infrared imager data and passive microwave CLWP observations for estimating uncertainties and improving the accuracy of these observations.
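
    The net bias described above is additive over the four component biases; a minimal sketch of that kind of correction, where the component values are made-up placeholders rather than the per-scene estimates the study derives from merged AMSR-E/MODIS/CALIOP/CloudSat data:

```python
# Illustrative additive bias correction for a CLWP retrieval. All component
# bias magnitudes below are hypothetical placeholders for demonstration only.
def corrected_clwp(clwp_retrieved, clear_sky_bias, crp_bias,
                   cloud_fraction_bias, cloud_temp_bias):
    """Subtract the summed component biases from the retrieved value."""
    net_bias = clear_sky_bias + crp_bias + cloud_fraction_bias + cloud_temp_bias
    return clwp_retrieved - net_bias, net_bias

clwp, bias = corrected_clwp(0.085, 0.003, 0.007, 0.001, 0.001)
print(f"corrected CLWP = {clwp:.3f} kg/m^2 (net bias {bias:.3f} kg/m^2)")
```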

  9. An Uncertainty Data Set for Passive Microwave Satellite Observations of Warm Cloud Liquid Water Path

    PubMed Central

    Bennartz, Ralf; Lebsock, Matthew; Teixeira, João

    2018-01-01

    Abstract The first extended comprehensive data set of the retrieval uncertainties in passive microwave observations of cloud liquid water path (CLWP) for warm oceanic clouds has been created for practical use in climate applications. Four major sources of systematic errors were considered over the 9‐year record of the Advanced Microwave Scanning Radiometer‐EOS (AMSR‐E): clear‐sky bias, cloud‐rain partition (CRP) bias, cloud‐fraction‐dependent bias, and cloud temperature bias. Errors were estimated using a unique merged AMSR‐E/Moderate resolution Imaging Spectroradiometer Level 2 data set as well as observations from the Cloud‐Aerosol Lidar with Orthogonal Polarization and the CloudSat Cloud Profiling Radar. To quantify the CRP bias more accurately, a new parameterization was developed to improve the inference of CLWP in warm rain. The cloud‐fraction‐dependent bias was found to be a combination of the CRP bias, an in‐cloud bias, and an adjacent precipitation bias. Globally, the mean net bias was 0.012 kg/m2, dominated by the CRP and in‐cloud biases, but with considerable regional and seasonal variation. Good qualitative agreement between a bias‐corrected AMSR‐E CLWP climatology and ship observations in the Northeast Pacific suggests that the bias estimates are reasonable. However, a possible underestimation of the net bias in certain conditions may be due in part to the crude method used in classifying precipitation, underscoring the need for an independent method of detecting rain in warm clouds. This study demonstrates the importance of combining visible‐infrared imager data and passive microwave CLWP observations for estimating uncertainties and improving the accuracy of these observations. PMID:29938146

  10. FURTHER STUDIES ON UNCERTAINTY, CONFOUNDING, AND VALIDATION OF THE DOSES IN THE TECHA RIVER DOSIMETRY SYSTEM: Concluding Progress Report on the Second Phase of Project 1.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Degteva, M. O.; Anspaugh, L. R.; Napier, Bruce A.

    2009-10-23

    This is the concluding Progress Report for Project 1.1 of the U.S./Russia Joint Coordinating Committee on Radiation Effects Research (JCCRER). An overwhelming majority of our work this period has been to complete our primary obligation of providing a new version of the Techa River Dosimetry System (TRDS), which we call TRDS-2009D; the D denotes deterministic. This system provides estimates of individual doses to members of the Extended Techa River Cohort (ETRC) and post-natal doses to members of the Techa River Offspring Cohort (TROC). The latter doses were calculated with use of the TRDS-2009D. The doses for the members of the ETRC have been made available to the American and Russian epidemiologists in September for their studies in deriving radiogenic risk factors. Doses for members of the TROC are being provided to European and Russian epidemiologists, as partial input for studies of risk in this population. Two of our original goals for the completion of this nine-year phase of Project 1.1 were not completed. These are completion of TRDS-2009MC, which was to be a Monte Carlo version of TRDS-2009 that could be used for more explicit analysis of the impact of uncertainty in doses on uncertainty in radiogenic risk factors. The second incomplete goal was to be the provision of household specific external doses (rather than village average). This task was far along, but had to be delayed due to the lead investigator’s work on consideration of a revised source term.

  11. NOy and O3 in the Asian Monsoon Anticyclone: Uncertainties associated with the Convection and Lightning in a Global Model

    NASA Astrophysics Data System (ADS)

    Pozzer, A.; Ojha, N.; Tost, H.; Joeckel, P.; Fischer, H.; Ziereis, H.; Zahn, A.; Tomsche, L.; Lelieveld, J.

    2017-12-01

    The impacts of the Asian monsoon on tropospheric chemistry are difficult to simulate in numerical models due to the lack of accurate emission inventories over the Asian region and the strong influence of parameterized processes such as convection and lightning. Furthermore, the scarcity of observational data over the region during the monsoon period drastically reduces the capability to evaluate numerical models. Here, we combine simulations using the global EMAC (ECHAM5/MESSy2 Atmospheric Chemistry) model with the observational dataset from the OMO campaign (July-August 2015) to study the tropospheric composition in the Asian monsoon anticyclone. The simulations capture the C-shape of the CO vertical profiles typically observed during the summer monsoon. The observed spatio-temporal variations in O3, CO, and NOy are reproduced by EMAC, with a better correlation in the upper troposphere (UT). However, the model overestimates NOy and O3 mixing ratios in the anticyclone by 25% and 35%, respectively. A series of numerical experiments identified the strong lightning emissions in the model as the source of this overestimation, with the anthropogenic NOx sources (in Asia) and global soil emissions having a lower impact in the UT. A reduction of the lightning NOx emissions by 50% leads to better agreement between the model and the OMO observations of NOy and O3. The uncertainties in the lightning emissions are found to considerably influence the OH distribution in the UT over India and downwind. The study reveals existing uncertainties in the estimation of the monsoon's impact on tropospheric composition, and highlights the need to constrain numerical simulations with state-of-the-art observations for deriving the budgets of trace species of climatic relevance.

  12. Delineating baseflow contribution areas for streams - A model and methods comparison.

    PubMed

    Chow, Reynold; Frind, Michael E; Frind, Emil O; Jones, Jon P; Sousa, Marcelo R; Rudolph, David L; Molson, John W; Nowak, Wolfgang

    2016-12-01

    This study addresses the delineation of areas that contribute baseflow to a stream reach, also known as stream capture zones. Such areas can be delineated using standard well capture zone delineation methods, with three important differences: (1) natural gradients are smaller compared to those produced by supply wells and are therefore subject to greater numerical errors, (2) stream discharge varies seasonally, and (3) stream discharge varies spatially. This study focuses on model-related uncertainties due to model characteristics, discretization schemes, delineation methods, and particle tracking algorithms. The methodology is applied to the Alder Creek watershed in southwestern Ontario. Four different model codes are compared: HydroGeoSphere, WATFLOW, MODFLOW, and FEFLOW. In addition, two delineation methods are compared: reverse particle tracking and reverse transport, where the latter considers local-scale parameter uncertainty by using a macrodispersion term to produce a capture probability plume. The results from this study indicate that different models can calibrate acceptably well to the same data and produce very similar distributions of hydraulic head, but can produce different capture zones. The stream capture zone is found to be highly sensitive to the particle tracking algorithm. It was also found that particle tracking by itself, if applied to complex systems such as the Alder Creek watershed, would require considerable subjective judgement in the delineation of stream capture zones. Reverse transport is an alternative and more reliable approach that provides probability intervals for the baseflow contribution areas, taking uncertainty into account. The two approaches can be used together to enhance the confidence in the final outcome. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
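
    A toy sketch of the first delineation method compared above, reverse particle tracking: particles seeded on the stream reach are integrated backwards through a velocity field. The analytic field below is an assumption standing in for heads from a calibrated model such as MODFLOW or HydroGeoSphere:

```python
import numpy as np

# Reverse particle tracking sketch for stream capture-zone delineation.
# The velocity field is a hypothetical stand-in, not a calibrated model.

def velocity(xy):
    x, y = xy
    # Assumed regional flow: slow drift in x, convergence toward a stream
    # along y = 0, with mild spatial variation (units: m/s).
    return np.array([1e-6, -5e-7 * (1.0 + 0.1 * np.sin(x / 500.0))])

def track_backwards(start, dt=86400.0, n_steps=3650):
    """Explicit Euler steps with the velocity negated (10 years, daily)."""
    path = [np.asarray(start, dtype=float)]
    for _ in range(n_steps):
        path.append(path[-1] - dt * velocity(path[-1]))
    return np.array(path)

seeds = [(x, 0.0) for x in np.linspace(0.0, 2000.0, 5)]  # points on the reach
paths = [track_backwards(s) for s in seeds]
print(paths[0][-1])  # upgradient end point of one backward path
```

    Seeding many such points along the reach and outlining the backward paths traces the capture zone; the paper's reverse-transport alternative replaces each path with a capture-probability plume.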

  13. Quantifying uncertainties in wind energy assessment

    NASA Astrophysics Data System (ADS)

    Patlakas, Platon; Galanis, George; Kallos, George

    2015-04-01

    The constant rise of wind energy production and its subsequent penetration of global energy markets during the last decades has resulted in the selection of new sites that present various types of problems. Such problems arise due to the variability and the uncertainty of wind speed. Studying the lower and upper tails of the wind speed distribution can support the quantification of these uncertainties. Such approaches, focused on extreme wind conditions or on periods below the energy production threshold, are necessary for better management of operations. Towards this direction, different methodologies are presented for the credible evaluation of potential non-frequent/extreme values of these environmental conditions. The approaches used take into consideration the structural design of the wind turbines according to their lifespan, the turbine failures, the time needed for repairs, as well as the energy production distribution. In this work, a multi-parametric approach for studying extreme wind speed values is discussed, based on tools of Extreme Value Theory. In particular, the study is focused on extreme wind speed return periods and the persistence of no energy production, based on a 10-year dataset from a weather modeling hindcast. More specifically, two methods (Annual Maxima and Peaks Over Threshold) were used for the estimation of extreme wind speeds and their recurrence intervals. Additionally, two different methodologies (intensity given duration and duration given intensity, both based on the Annual Maxima method) were applied to calculate the duration of extreme events, combined with their intensity as well as the event frequency. The obtained results show that the proposed approaches converge, at least on the main findings, for each case. It is also notable that, despite the moderate wind speed climate of the area, several consecutive days of no energy production are observed.
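
    A minimal Annual Maxima sketch in Python using SciPy: fit a generalized extreme value (GEV) distribution to yearly wind-speed maxima and invert it for a return level. The synthetic data and estimator details are assumptions, not the study's exact procedure:

```python
import numpy as np
from scipy import stats

# Annual Maxima method sketch: GEV fit + return-level inversion.
# Synthetic maxima stand in for a multi-year hindcast (units: m/s).
rng = np.random.default_rng(1)
annual_max = 18.0 + 3.0 * rng.gumbel(size=30)

shape, loc, scale = stats.genextreme.fit(annual_max)

def return_level(T_years):
    """Wind speed exceeded on average once every T years."""
    return stats.genextreme.ppf(1.0 - 1.0 / T_years, shape, loc=loc, scale=scale)

print(f"50-year return level ~ {return_level(50.0):.1f} m/s")
```

    The Peaks Over Threshold counterpart would fit a generalized Pareto distribution to exceedances above a high threshold instead of to block maxima.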

  14. Evaluation of linear regression techniques for atmospheric applications: the importance of appropriate weighting

    NASA Astrophysics Data System (ADS)

    Wu, Cheng; Zhen Yu, Jian

    2018-03-01

    Linear regression techniques are widely used in atmospheric science, but they are often improperly applied due to lack of consideration or inappropriate handling of measurement uncertainty. In this work, numerical experiments are performed to evaluate the performance of five linear regression techniques, significantly extending previous work by Chu and Saylor. The five techniques are ordinary least squares (OLS), Deming regression (DR), orthogonal distance regression (ODR), weighted ODR (WODR), and York regression (YR). We first introduce a new data generation scheme that employs the Mersenne twister (MT) pseudorandom number generator. The numerical simulations are also improved by (a) refining the parameterization of nonlinear measurement uncertainties, (b) inclusion of a linear measurement uncertainty, and (c) inclusion of WODR for comparison. Results show that DR, WODR and YR produce an accurate slope, but the intercept by WODR and YR is overestimated, and the degree of bias is more pronounced with a low-R² XY dataset. The importance of a proper weighting parameter λ in DR is investigated by sensitivity tests, and it is found that an improper λ in DR can lead to a bias in both the slope and intercept estimation. Because the λ calculation depends on the actual form of the measurement error, it is essential to determine the exact form of measurement error in the XY data during the measurement stage. If the a priori error in one of the variables is unknown, or the measurement error described cannot be trusted, DR, WODR and YR can provide the least-biased estimates of slope and intercept among all tested regression techniques. For these reasons, DR, WODR and YR are recommended for atmospheric studies when both X and Y data have measurement errors. An Igor Pro-based program (Scatter Plot) was developed to facilitate the implementation of error-in-variables regressions.
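
    A compact sketch of Deming regression, one of the five techniques evaluated, with the weighting parameter λ taken as the ratio of Y-error variance to X-error variance; the synthetic data are assumptions used to show recovery of the true slope when both variables carry error:

```python
import numpy as np

# Deming regression: errors in both X and Y.
# lam = (variance of Y errors) / (variance of X errors); a wrong lam biases
# both slope and intercept, as the paper's sensitivity tests emphasize.
def deming(x, y, lam=1.0):
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    sxx = np.mean((x - mx) ** 2)
    syy = np.mean((y - my) ** 2)
    sxy = np.mean((x - mx) * (y - my))
    slope = (syy - lam * sxx
             + np.sqrt((syy - lam * sxx) ** 2 + 4.0 * lam * sxy ** 2)) / (2.0 * sxy)
    return slope, my - slope * mx

rng = np.random.default_rng(0)
truth = rng.uniform(0.0, 10.0, 200)
x = truth + rng.normal(0.0, 0.5, 200)               # X measured with error
y = 2.0 * truth + 1.0 + rng.normal(0.0, 0.5, 200)   # true slope 2, intercept 1
print(deming(x, y, lam=1.0))  # ~ (2.0, 1.0); OLS would attenuate the slope
```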

  15. Human pursuance of equality hinges on mental processes of projecting oneself into the perspectives of others and into future situations.

    PubMed

    Takesue, Hirofumi; Miyauchi, Carlos Makoto; Sakaiya, Shiro; Fan, Hongwei; Matsuda, Tetsuya; Kato, Junko

    2017-07-19

    In the pursuance of equality, behavioural scientists disagree about distinct motivators, that is, consideration of others and prospective calculation for oneself. However, accumulating data suggest that these motivators may share a common process in the brain whereby perspectives and events that did not arise in the immediate environment are conceived. To examine this, we devised a game imitating a real decision-making situation regarding redistribution among income classes in a welfare state. The neural correlates of redistributive decisions were examined under contrasting conditions, with and without uncertainty, which affects support for equality in society. The dorsal anterior cingulate cortex (dACC) and the caudate nucleus were activated by equality decisions with uncertainty but by selfless decisions without uncertainty. Activation was also correlated with subjective values. Activation in both the dACC and the caudate nucleus was associated with the attitude to prefer accordance with others, whereas activation in the caudate nucleus reflected that the expected reward involved the prospective calculation of relative income. The neural correlates suggest that consideration of others and prospective calculation for oneself may underlie the support for equality. Projecting oneself into the perspective of others and into prospective future situations may underpin the pursuance of equality.

  16. Decomposing Trends in Inequality in Earnings into Forecastable and Uncertain Components

    PubMed Central

    Cunha, Flavio; Heckman, James

    2015-01-01

    A substantial empirical literature documents the rise in wage inequality in the American economy. It is silent on whether the increase in inequality is due to components of earnings that are predictable by agents or whether it is due to greater uncertainty facing them. These two sources of variability have different consequences for both aggregate and individual welfare. Using data on two cohorts of American males we find that a large component of the rise in inequality for less skilled workers is due to uncertainty. For skilled workers, the rise is less pronounced. PMID:27087741

  17. Techniques for analyses of trends in GRUAN data

    NASA Astrophysics Data System (ADS)

    Bodeker, G. E.; Kremser, S.

    2015-04-01

    The Global Climate Observing System (GCOS) Reference Upper Air Network (GRUAN) provides reference quality RS92 radiosonde measurements of temperature, pressure and humidity. A key attribute of reference quality measurements, and hence GRUAN data, is that each datum has a well characterized and traceable estimate of the measurement uncertainty. The long-term homogeneity of the measurement records, and their well characterized uncertainties, make these data suitable for reliably detecting changes in global and regional climate on decadal time scales. Considerable effort is invested in GRUAN operations to (i) describe and analyse all sources of measurement uncertainty to the extent possible, (ii) quantify and synthesize the contribution of each source of uncertainty to the total measurement uncertainty, and (iii) verify that the evaluated net uncertainty is within the required target uncertainty. However, if the climate science community is not sufficiently well informed on how to capitalize on this added value, the significant investment in estimating meaningful measurement uncertainties is largely wasted. This paper presents and discusses the techniques that will need to be employed to reliably quantify long-term trends in GRUAN data records. A pedagogical approach is taken whereby numerical recipes for key parts of the trend analysis process are explored. The paper discusses the construction of linear least squares regression models for trend analysis, boot-strapping approaches to determine uncertainties in trends, dealing with the combined effects of autocorrelation in the data and measurement uncertainties in calculating the uncertainty on trends, best practice for determining seasonality in trends, how to deal with co-linear basis functions, and interpreting derived trends. Synthetic data sets are used to demonstrate these concepts which are then applied to a first analysis of temperature trends in RS92 radiosonde upper air soundings at the GRUAN site at Lindenberg, Germany (52.21° N, 14.12° E).
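
    One way to realize the bootstrapping-with-autocorrelation idea described above, hedged as a sketch rather than GRUAN's exact recipe, is a moving-block bootstrap of regression residuals so that serial correlation survives resampling; block length, AR(1) noise, and trend magnitude below are assumptions:

```python
import numpy as np

# Trend estimate with a moving-block bootstrap of residuals, so that
# autocorrelation is preserved and the trend uncertainty is not understated.
rng = np.random.default_rng(42)
n = 240                                   # 20 years of monthly anomalies
t = np.arange(n) / 12.0                   # time in years
noise = np.zeros(n)
for i in range(1, n):                     # AR(1) noise, phi = 0.6
    noise[i] = 0.6 * noise[i - 1] + rng.normal(0.0, 0.3)
y = 0.02 * t + noise                      # 0.02 K/yr synthetic trend

fit = np.polyfit(t, y, 1)                 # [slope, intercept]
resid = y - np.polyval(fit, t)

block = 24                                # 2-year blocks (assumed length)
starts = np.arange(n - block + 1)
boots = []
for _ in range(2000):
    pieces = [resid[s:s + block] for s in rng.choice(starts, n // block)]
    y_boot = np.polyval(fit, t) + np.concatenate(pieces)
    boots.append(np.polyfit(t, y_boot, 1)[0])

print(f"trend = {fit[0]:.3f} +/- {np.std(boots):.3f} K/yr (1 sigma)")
```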

  18. Techniques for analyses of trends in GRUAN data

    NASA Astrophysics Data System (ADS)

    Bodeker, G. E.; Kremser, S.

    2014-12-01

    The Global Climate Observing System (GCOS) Reference Upper Air Network (GRUAN) provides reference quality RS92 radiosonde measurements of temperature, pressure and humidity. A key attribute of reference quality measurements, and hence GRUAN data, is that each datum has a well characterised and traceable estimate of the measurement uncertainty. The long-term homogeneity of the measurement records, and their well characterised uncertainties, make these data suitable for reliably detecting changes in global and regional climate on decadal time scales. Considerable effort is invested in GRUAN operations to (i) describe and analyse all sources of measurement uncertainty to the extent possible, (ii) quantify and synthesize the contribution of each source of uncertainty to the total measurement uncertainty, and (iii) verify that the evaluated net uncertainty is within the required target uncertainty. However, if the climate science community is not sufficiently well informed on how to capitalize on this added value, the significant investment in estimating meaningful measurement uncertainties is largely wasted. This paper presents and discusses the techniques that will need to be employed to reliably quantify long-term trends in GRUAN data records. A pedagogical approach is taken whereby numerical recipes for key parts of the trend analysis process are explored. The paper discusses the construction of linear least squares regression models for trend analysis, boot-strapping approaches to determine uncertainties in trends, dealing with the combined effects of autocorrelation in the data and measurement uncertainties in calculating the uncertainty on trends, best practice for determining seasonality in trends, how to deal with co-linear basis functions, and interpreting derived trends. Synthetic data sets are used to demonstrate these concepts which are then applied to a first analysis of temperature trends in RS92 radiosonde upper air soundings at the GRUAN site at Lindenberg, Germany (52.21° N, 14.12° E).

  19. Asteroid approach covariance analysis for the Clementine mission

    NASA Technical Reports Server (NTRS)

    Ionasescu, Rodica; Sonnabend, David

    1993-01-01

    The Clementine mission is designed to test Strategic Defense Initiative Organization (SDIO) technology, the Brilliant Pebbles and Brilliant Eyes sensors, by mapping the moon's surface and flying by the asteroid Geographos. The capabilities of two of the instruments available on board the spacecraft, the lidar (laser radar) and the UV/Visible camera, are used in the covariance analysis to obtain the spacecraft delivery uncertainties at the asteroid. These uncertainties are due primarily to asteroid ephemeris uncertainties. On-board optical navigation reduces the uncertainty in the knowledge of the spacecraft position in the direction perpendicular to the incoming asymptote to a one-sigma value of under 1 km, at the closest approach distance of 100 km. The uncertainty in the knowledge of the encounter time is about 0.1 seconds for a flyby velocity of 10.85 km/s. The magnitude of these uncertainties is due largely to Center Finding Errors (CFE). These systematic errors represent the accuracy expected in locating the center of the asteroid in the optical navigation images, in the absence of a topographic model for the asteroid. The direction of the incoming asymptote cannot be estimated accurately until minutes before the asteroid flyby, and correcting for it would require autonomous navigation. Orbit determination errors dominate over maneuver execution errors, and the final delivery accuracy attained is basically the orbit determination uncertainty before the final maneuver.

  20. Caught between intending and doing: older people ideating on a self-chosen death

    PubMed Central

    van Wijngaarden, Els; Leget, Carlo; Goossensen, Anne

    2016-01-01

    Objectives The aim of this paper is to provide insight into what it means to live with the intention to end life at a self-chosen moment from an insider perspective. Setting Participants lived independently or semi-dependently throughout the Netherlands. Participants 25 Dutch older citizens (mean age of 82 years) participated. They were ideating on a self-chosen death because they considered their lives to be no longer worth living. Inclusion criteria were that they: (1) considered their lives to be ‘completed’; (2) suffered from the prospect of living on; (3) currently wished to die; (4) were 70 years of age or older; (5) were not terminally ill; (6) considered themselves to be mentally competent; (7) considered their death wish reasonable. Design In this qualitative study, in-depth interviews were carried out in the participants’ everyday home environment (median duration 1.56 h). Verbatim transcripts were analysed based on the principles of phenomenological thematic analysis. Results The liminality or ‘in-betweenness’ of intending and actually performing self-directed death (or not) is characterised as a constant feeling of being torn, explicated by the following pairs of themes: (1) detachment and attachment; (2) rational and non-rational considerations; (3) taking control and lingering uncertainty; (4) resisting interference and longing for support; (5) legitimacy and illegitimacy. Conclusions Our findings show that the in-between period emerges as a considerable, existential challenge with both rational and non-rational concerns and thoughts, rather than a calculative, coherent sum of rational considerations. Our study highlights the need to take due consideration of all ambiguities and ambivalences present after a putatively rational decision has been made in order to develop careful policy and support for this particular group of older people. PMID:26781505

  1. Physical Uncertainty Bounds (PUB)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vaughan, Diane Elizabeth; Preston, Dean L.

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  2. Impact of Nuclear Data Uncertainties on Advanced Fuel Cycles and their Irradiated Fuel - a Comparison between Libraries

    NASA Astrophysics Data System (ADS)

    Díez, C. J.; Cabellos, O.; Martínez, J. S.

    2014-04-01

    The uncertainties on the isotopic composition throughout the burnup due to nuclear data uncertainties are analysed. The different sources of uncertainty (decay data, fission yields, and cross sections) are propagated individually, and their effects assessed. Two applications are studied: EFIT (an ADS-like reactor) and ESFR (Sodium Fast Reactor). The impacts of the uncertainties on cross sections provided by the EAF-2010, SCALE6.1 and COMMARA-2.0 libraries are compared. These Uncertainty Quantification (UQ) studies have been carried out with a Monte Carlo sampling approach implemented in the depletion/activation code ACAB. This implementation has been improved to overcome depletion/activation problems with variations of the neutron spectrum.
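
    A minimal sketch of the Monte Carlo sampling idea in the spirit of ACAB: perturb one nuclear-data input (here a single decay constant with an assumed 5% relative uncertainty) and propagate each sample through the decay solution. The isotope, half-life, and uncertainty are hypothetical:

```python
import numpy as np

# Monte Carlo propagation of a nuclear-data uncertainty through decay.
# All nuclear-data values below are hypothetical, for illustration only.
rng = np.random.default_rng(7)
half_life = 8.02e4                          # s (~0.93 d), assumed
lam0 = np.log(2.0) / half_life              # decay constant, 1/s
rel_u = 0.05                                # assumed 5% relative uncertainty
t = 30 * 86400.0                            # 30 days of decay

samples = rng.normal(lam0, rel_u * lam0, 10_000)
n_final = np.exp(-samples * t)              # N(t)/N(0) for each sample

# Note how a modest 5% input uncertainty is strongly amplified after many
# half-lives: the spread of N(t)/N(0) is comparable to its mean.
print(f"N(t)/N0 = {n_final.mean():.3e} +/- {n_final.std():.1e}")
```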

  3. Riparian livestock exclosure research in the western United States: a critique and some recommendations.

    PubMed

    Sarr, Daniel A

    2002-10-01

    Over the last three decades, livestock exclosure research has emerged as a preferred method to evaluate the ecology of riparian ecosystems and their susceptibility to livestock impacts. This research has addressed the effects of livestock exclusion on many characteristics of riparian ecosystems, including vegetation, aquatic and terrestrial animals, and geomorphology. This paper reviews, critiques, and provides recommendations for the improvement of riparian livestock exclosure research. Exclosure-based research has left considerable scientific uncertainty due to popularization of relatively few studies, weak study designs, a poor understanding of the scales and mechanisms of ecosystem recovery, and selective, agenda-laden literature reviews advocating for or against public lands livestock grazing. Exclosures are often too small (<50 ha) and improperly placed to accurately measure the responses of aquatic organisms or geomorphic processes to livestock removal. Depending upon the site conditions when and where livestock exclosures are established, postexclusion dynamics may vary considerably. Systems can recover quickly and predictably with livestock removal (the "rubber band" model), fail to recover due to changes in system structure or function (the "Humpty Dumpty" model), or recover slowly and remain more sensitive to livestock impacts than they were before grazing was initiated (the "broken leg" model). Several initial ideas for strengthening the scientific basis for livestock exclosure research are presented: (1) incorporation of meta-analyses and critical reviews; (2) use of restoration ecology as a unifying conceptual framework; (3) development of long-term research programs; (4) improved exclosure placement/design; and (5) a stronger commitment to collection of pretreatment data.

  4. Continuum topology optimization considering uncertainties in load locations based on the cloud model

    NASA Astrophysics Data System (ADS)

    Liu, Jie; Wen, Guilin

    2018-06-01

    Few researchers have paid attention to designing structures in consideration of uncertainties in the loading locations, which may significantly influence the structural performance. In this work, cloud models are employed to depict the uncertainties in the loading locations. A robust algorithm is developed in the context of minimizing the expectation of the structural compliance, while conforming to a material volume constraint. To guarantee optimal solutions, a large number of cloud drops are used, which in turn leads to low efficiency. An innovative strategy is then implemented to greatly improve the computational efficiency. A modified soft-kill bi-directional evolutionary structural optimization method using derived sensitivity numbers is used to generate the robust novel configurations. Several numerical examples are presented to demonstrate the effectiveness and efficiency of the proposed algorithm.

  5. Production of biofuels and biochemicals: in need of an ORACLE.

    PubMed

    Miskovic, Ljubisa; Hatzimanikatis, Vassily

    2010-08-01

    The engineering of cells for the production of fuels and chemicals involves simultaneous optimization of multiple objectives, such as specific productivity, extended substrate range and improved tolerance - all under a great degree of uncertainty. The achievement of these objectives under physiological and process constraints will be impossible without the use of mathematical modeling. However, the limited information and the uncertainty in the available information require new methods for modeling and simulation that will characterize the uncertainty and will quantify, in a statistical sense, the expectations of success of alternative metabolic engineering strategies. We discuss these considerations toward developing a framework for the Optimization and Risk Analysis of Complex Living Entities (ORACLE) - a computational method that integrates available information into a mathematical structure to calculate control coefficients. Copyright 2010 Elsevier Ltd. All rights reserved.

  6. Traceable Dynamic Calibration of Force Transducers by Primary Means

    PubMed Central

    Vlajic, Nicholas; Chijioke, Ako

    2018-01-01

    We describe an apparatus for traceable, dynamic calibration of force transducers using harmonic excitation, and report calibration measurements of force transducers using this apparatus. In this system, the force applied to the transducer is produced by the acceleration of an attached mass, and is determined according to Newton’s second law, F = ma. The acceleration is measured by primary means, using laser interferometry. The capabilities of this system are demonstrated by performing dynamic calibrations of two shear-web-type force transducers up to a frequency of 2 kHz, with an expanded uncertainty below 1.2 %. We give an accounting of all significant sources of uncertainty, including a detailed consideration of the effects of dynamic tilting (rocking), which is a leading source of uncertainty in such harmonic force calibration systems. PMID:29887643
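
    The primary realization F = ma under harmonic excitation can be sketched as follows; the mass, frequency, displacement amplitude, and transducer output are illustrative numbers, not the paper's measurements:

```python
import math

# Primary dynamic force realization under harmonic excitation:
# acceleration from an interferometric displacement amplitude,
# a = (2*pi*f)^2 * X, then F = m * a. All values are hypothetical.
m = 0.250        # attached mass, kg (assumed)
f = 1000.0       # excitation frequency, Hz (assumed)
X = 2.0e-6       # measured displacement amplitude, m (assumed)

a = (2.0 * math.pi * f) ** 2 * X      # ~79 m/s^2
F = m * a                             # applied force amplitude, ~19.7 N
V = 0.396                             # transducer output amplitude, mV (made up)

print(f"F = {F:.2f} N, dynamic sensitivity = {V / F:.4f} mV/N")
```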

  7. Forecasting eruption size: what we know, what we don't know

    NASA Astrophysics Data System (ADS)

    Papale, Paolo

    2017-04-01

    Any eruption forecast includes an evaluation of the expected size of the forthcoming eruption, usually expressed as the probability associated to given size classes. Such evaluation is mostly based on the previous volcanic history at the specific volcano, or it is referred to a broader class of volcanoes constituting "analogues" of the one under specific consideration. In any case, use of knowledge from past eruptions implies considering the completeness of the reference catalogue, and most importantly, the existence of systematic biases in the catalogue, that may affect probability estimates and translate into biased volcanic hazard forecasts. An analysis of existing catalogues, with major reference to the catalogue from the Smithsonian Global Volcanism Program, suggests that systematic biases largely dominate at global, regional and local scale: volcanic histories reconstructed at individual volcanoes, often used as a reference for volcanic hazard forecasts, are the result of systematic loss of information with time and poor sample representativeness. That situation strictly requires the use of techniques to complete existing catalogues, as well as careful consideration of the uncertainties deriving from inadequate knowledge and model-dependent data elaboration. A reconstructed global eruption size distribution, obtained by merging information from different existing catalogues, shows a mode in the VEI 1-2 range, <0.1% incidence of eruptions with VEI 7 or larger, and substantial uncertainties associated with individual VEI frequencies. Even larger uncertainties are expected to derive from application to individual volcanoes or classes of analogue volcanoes, suggesting large to very large uncertainties associated to volcanic hazard forecasts virtually at any individual volcano worldwide.

  8. Impacts of tides on tsunami propagation due to potential Nankai Trough earthquakes in the Seto Inland Sea, Japan

    NASA Astrophysics Data System (ADS)

    Lee, Han Soo; Shimoyama, Tomohisa; Popinet, Stéphane

    2015-10-01

    The impacts of tides on extreme tsunami propagation due to potential Nankai Trough earthquakes in the Seto Inland Sea (SIS), Japan, are investigated through numerical experiments. Tsunami experiments are conducted based on five scenarios that consider tides at four different phases: flood, high, ebb, and low tides. The probes that were selected arbitrarily in the Bungo and Kii Channels show less significant effects of tides on tsunami heights and the arrival times of the first waves than those that experience large tidal ranges in inner basins and bays of the SIS. For instance, the maximum tsunami height and the arrival time at Toyomaesi differ by more than 0.5 m and nearly 1 h, respectively, depending on the tidal phase. The uncertainties, defined in terms of calculated maximum tsunami heights due to tides, illustrate that the calculated maximum tsunami heights in the inner SIS with standing tides have much larger uncertainties than those of the two channels with propagating tides. Particularly in Harima Nada, the uncertainties due to the impacts of tides are greater than 50% of the tsunami heights without tidal interaction. The results recommend simulating tsunamis together with tides in shallow-water environments to reduce the uncertainties involved in tsunami modeling and prediction for tsunami hazard preparedness.

  9. Communicating uncertainties in earth sciences in view of user needs

    NASA Astrophysics Data System (ADS)

    de Vries, Wim; Kros, Hans; Heuvelink, Gerard

    2014-05-01

    Uncertainties are inevitable in all results obtained in the earth sciences, regardless of whether these are based on field observations, experimental research or predictive modelling. When informing decision and policy makers or stakeholders, it is important that these uncertainties are also communicated. In communicating results, it is important to apply a "Progressive Disclosure of Information (PDI)" from non-technical information through more specialised information, according to the user needs. Generalized information is generally directed towards non-scientific audiences and intended for policy advice. Decision makers have to be aware of the implications of the uncertainty associated with results, so that they can account for it in their decisions. Detailed information on the uncertainties is generally intended for scientific audiences to give insight into underlying approaches and results. When communicating uncertainties, it is important to distinguish between scientific results that allow presentation in terms of probabilistic measures of uncertainty and more intrinsic uncertainties and errors that cannot be expressed in mathematical terms. Examples of earth science research that allow probabilistic measures of uncertainty, involving sophisticated statistical methods, are uncertainties in spatial and/or temporal variations in results of: • Observations, such as soil properties measured at sampling locations. In this case, the interpolation uncertainty, caused by a lack of data collected in space, can be quantified by e.g. kriging standard deviation maps or animations of conditional simulations. • Experimental measurements, comparing impacts of treatments at different sites and/or under different conditions. In this case, an indication of the average and range in measured responses to treatments can be obtained from a meta-analysis, summarizing experimental findings between replicates and across studies, sites, ecosystems, etc. • Model predictions due to uncertain model parameters (parametric variability). These uncertainties can be quantified by uncertainty propagation methods such as Monte Carlo simulation methods. Examples of intrinsic uncertainties that generally cannot be expressed in mathematical terms are errors or biases in: • Results of experiments and observations due to inadequate sampling and errors in analyzing data in the laboratory and even in data reporting. • Results of (laboratory) experiments that are limited to a specific domain or performed under circumstances that differ from field circumstances. • Model structure, due to lack of knowledge of the underlying processes. Structural uncertainty, which may cause model inadequacy/bias, is inherent in model approaches since models are approximations of reality. Intrinsic uncertainties often occur in an emerging field where ongoing new findings, either experiments or field observations of new model findings, challenge earlier work. In this context, climate scientists working within the IPCC have adopted a lexicon to communicate confidence in their findings, ranging from "very high", "high", "medium", "low" and "very low" confidence. In fact, there are also statistical methods to gain insight into uncertainties in model predictions due to model assumptions (i.e. model structural error). Examples are comparing model results with independent observations or a systematic intercomparison of predictions from multiple models. 
In the latter case, Bayesian model averaging techniques can be used, in which each model considered gets an assigned prior probability of being the 'true' model. This approach works well with statistical (regression) models, but extension to physically-based models is cumbersome. An alternative is the use of state-space models in which structural errors are represented as (additive) noise terms. In this presentation, we focus on approaches that are relevant at the science-policy interface, including multiple scientific disciplines and policy makers with different subject areas. Approaches to communicate uncertainties in results of observations or model predictions are discussed, distinguishing results that include probabilistic measures of uncertainty and more intrinsic uncertainties. Examples concentrate on uncertainties in nitrogen (N) related environmental issues, including: • Spatio-temporal trends in atmospheric N deposition, in view of the policy question whether there is a declining or increasing trend. • Carbon response to N inputs to terrestrial ecosystems, based on meta-analysis of N addition experiments and other approaches, in view of the policy relevance of N emission control. • Calculated spatial variations in the emissions of nitrous-oxide and ammonia, in view of the need for emission policies at different spatial scales. • Calculated N emissions and losses by model intercomparisons, in view of the policy need to apply no-regret decisions with respect to the control of those emissions.
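
    A minimal sketch of the Bayesian model averaging idea mentioned above, assuming equal prior model probabilities and Gaussian likelihoods (toy numbers throughout):

```python
import numpy as np

# Bayesian model averaging sketch: weight each model's prediction by its
# posterior probability given one observation. Equal priors and Gaussian
# likelihoods are simplifying assumptions; the numbers are illustrative.
obs, sigma = 4.2, 0.5                        # observation and its uncertainty
preds = np.array([3.8, 4.5, 5.1])            # three models' predictions

log_like = -0.5 * ((preds - obs) / sigma) ** 2
w = np.exp(log_like - log_like.max())
w /= w.sum()                                 # posterior model weights

print("weights:", np.round(w, 3), "| BMA mean:", round(float(w @ preds), 2))
```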

  10. Uncertainties in biological responses that influence hazard and risk approaches to the regulation of endocrine active substances

    EPA Science Inventory

    Endocrine Disrupting Substances (EDSs) may have certain biological effects including delayed effects, multigenerational effects, and non-monotonic dose response relationships (NMDRs) that require careful consideration when determining environmental hazards. The case studies evalu...

  11. Computational Toxicology in Cancer Risk Assessment

    EPA Science Inventory

    Risk assessment over the last half century has, for many individual cases, served us well, but it has proceeded at an extremely slow pace and has left us with considerable uncertainty. There are certainly thousands of compounds and thousands of exposure scenarios that remain unteste...

  12. SETAC: Uncertainties in biological responses that influence hazard or risk approaches to the regulation of endocrine active substances

    EPA Science Inventory

    Endocrine Disrupting Substances (EDSs) may have certain biological effects including delayed effects, multigenerational effects, and non-monotonic dose response relationships (NMDRs) that require careful consideration when determining environmental hazards. The case studies evalu...

  13. Uncertainty Analysis of Instrument Calibration and Application

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimation of both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Chanyoung; Kim, Nam H.

    Structural elements, such as stiffened panels and lap joints, are basic components of aircraft structures. For aircraft structural design, designers select predesigned elements satisfying the design load requirement based on their load-carrying capabilities. Therefore, estimation of the safety envelope of structural elements for load tolerances would be a good investment for design purposes. In this article, a method of estimating the safety envelope is presented using probabilistic classification, which can estimate a specific level of failure probability under both aleatory and epistemic uncertainties. An important contribution of this article is that the calculation uncertainty is reflected in building a safety envelope using a Gaussian process, and the effect of element test data on reducing the calculation uncertainty is incorporated by updating the Gaussian process model with the element test data. It is shown that even one element test can significantly reduce the calculation uncertainty due to lacking knowledge of actual physics, so that conservativeness in a safety envelope is significantly reduced. The proposed approach was demonstrated with a cantilever beam example, which represents a structural element. The example shows that calculation uncertainty provides about 93% conservativeness against the uncertainty due to a few element tests. As a result, it is shown that even a single element test can increase the load tolerance modeled with the safety envelope by 20%.

  15. Analogy as a strategy for supporting complex problem solving under uncertainty.

    PubMed

    Chan, Joel; Paletz, Susannah B F; Schunn, Christian D

    2012-11-01

    Complex problem solving in naturalistic environments is fraught with uncertainty, which has significant impacts on problem-solving behavior. Thus, theories of human problem solving should include accounts of the cognitive strategies people bring to bear to deal with uncertainty during problem solving. In this article, we present evidence that analogy is one such strategy. Using statistical analyses of the temporal dynamics between analogy and expressed uncertainty in the naturalistic problem-solving conversations among scientists on the Mars Rover Mission, we show that spikes in expressed uncertainty reliably predict analogy use (Study 1) and that expressed uncertainty reduces to baseline levels following analogy use (Study 2). In addition, in Study 3, we show with qualitative analyses that this relationship between uncertainty and analogy is not due to miscommunication-related uncertainty but, rather, is primarily concentrated on substantive problem-solving issues. Finally, we discuss a hypothesis about how analogy might serve as an uncertainty reduction strategy in naturalistic complex problem solving.

  16. Physicians’ Anxiety Due to Uncertainty and Use of Race in Medical Decision-Making

    PubMed Central

    Cunningham, Brooke A.; Bonham, Vence L.; Sellers, Sherrill L.; Yeh, Hsin-Chieh; Cooper, Lisa A.

    2014-01-01

    Background The explicit use of race in medical decision-making is contested. Researchers have hypothesized that physicians use race in care when they are uncertain. Objectives To investigate whether physician anxiety due to uncertainty is associated with a higher propensity to use race in medical decision-making. Research Design A national cross-sectional survey of general internists. Subjects A national sample of 1738 clinically active general internists drawn from the SK&A physician database. Measures Anxiety Due to Uncertainty (ADU) is a 5-item measure of emotional reactions to clinical uncertainty. The Bonham and Sellers Racial Attributes in Clinical Evaluation (RACE) scale includes 7 items that measure self-reported use of race in medical decision-making. We used bivariate regression to test for associations between physician characteristics, ADU, and RACE. Multivariate linear regression was performed to test for associations between ADU and RACE while adjusting for potential confounders. Results The mean score on ADU was 19.9 (SD=5.6). The mean score on RACE was 13.5 (SD=5.6). After adjusting for physician demographics, physicians with higher levels of ADU scored higher on RACE (β=0.08 increase in RACE, p=0.04, for each 1-point increase in ADU), as did physicians who understand “race” to mean a biological or genetic ancestral, rather than sociocultural, group. Physicians who graduated from a US medical school, completed fellowship, and had more white patients scored lower on RACE. Conclusions This study demonstrates positive associations between physicians’ anxiety due to uncertainty, meanings attributed to race, and self-reported use of race in medical decision-making. Future research should examine the potential impact of these associations on patient outcomes and healthcare disparities. PMID:25025871

  17. How accurate are lexile text measures?

    PubMed

    Stenner, A Jackson; Burdick, Hal; Sanford, Eleanor E; Burdick, Donald S

    2006-01-01

    The Lexile Framework for Reading models comprehension as the difference between a reader measure and a text measure. Uncertainty in comprehension rates results from unreliability in reader measures and inaccuracy in text readability measures. Whole-text processing eliminates sampling error in text measures. However, Lexile text measures are imperfect due to misspecification of the Lexile theory. The standard deviation component associated with theory misspecification is estimated at 64L for a standard-length passage (approximately 125 words). A consequence is that standard errors for longer texts (2,500 to 150,000 words) are measured on the Lexile scale with uncertainties in the single digits. Uncertainties in expected comprehension rates are largely due to imprecision in reader ability and not inaccuracies in text readabilities.
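
    A back-of-envelope check of the quoted scaling, under the simplifying assumption that the 64L per-passage misspecification errors are independent, so the text-level standard error falls as 64/sqrt(n) for n standard-length slices:

```python
import math

# Crude check of how a 64L per-slice error shrinks with text length,
# assuming independent errors across ~125-word slices (a simplification).
# On this assumption the error reaches single digits for book-length texts.
for words in (2_500, 25_000, 150_000):
    n_slices = words / 125
    se = 64.0 / math.sqrt(n_slices)
    print(f"{words:>7} words -> SE ~ {se:.1f}L")
```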

  18. The Scientific Basis of Uncertainty Factors Used in Setting Occupational Exposure Limits.

    PubMed

    Dankovic, D A; Naumann, B D; Maier, A; Dourson, M L; Levy, L S

    2015-01-01

    The uncertainty factor concept is integrated into health risk assessments for all aspects of public health practice, including by most organizations that derive occupational exposure limits. The use of uncertainty factors is predicated on the assumption that a sufficient reduction in exposure from those at the boundary for the onset of adverse effects will yield a safe exposure level for at least the great majority of the exposed population, including vulnerable subgroups. There are differences in the application of the uncertainty factor approach among groups that conduct occupational assessments; however, there are common areas of uncertainty which are considered by all or nearly all occupational exposure limit-setting organizations. Five key uncertainties that are often examined include interspecies variability in response when extrapolating from animal studies to humans, response variability in humans, uncertainty in estimating a no-effect level from a dose where effects were observed, extrapolation from shorter-duration studies to a full lifetime exposure, and other insufficiencies in the overall health effects database indicating that the most sensitive adverse effect may not have been evaluated. In addition, a modifying factor is used by some organizations to account for other remaining uncertainties, typically related to exposure scenarios or to the interplay among the five areas noted above. Consideration of uncertainties in occupational exposure limit derivation is a systematic process whereby the factors applied are not arbitrary, although they are mathematically imprecise. As the scientific basis for uncertainty factor application has improved, default uncertainty factors are now used only in the absence of chemical-specific data, and the trend is to replace them with chemical-specific adjustment factors whenever possible. The increased application of scientific data in the development of uncertainty factors for individual chemicals also has the benefit of increasing the transparency of occupational exposure limit derivation. Improved characterization of the scientific basis for uncertainty factors has led to increasing rigor and transparency in their application as part of the overall occupational exposure limit derivation process.
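
    The composite factor logic described is multiplicative; a hedged sketch with conventional default values (not any specific organization's numbers for a real chemical):

```python
# Illustrative composite uncertainty factor for an occupational exposure
# limit. Factor values are conventional defaults chosen for illustration;
# the point of departure is a hypothetical NOAEL, not real chemical data.
factors = {
    "interspecies (animal -> human)": 3,
    "human response variability": 10,
    "LOAEL -> NOAEL extrapolation": 3,
    "subchronic -> chronic duration": 3,
    "database insufficiency": 1,
}

composite = 1
for f in factors.values():
    composite *= f                      # factors combine multiplicatively

point_of_departure = 15.0               # mg/m^3, hypothetical NOAEL
oel = point_of_departure / composite
print(f"composite UF = {composite}; OEL ~ {oel:.2f} mg/m^3")
```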

  19. Evaluation of the ²³⁹Pu prompt fission neutron spectrum induced by neutrons of 500 keV and associated covariances

    DOE PAGES

    Neudecker, D.; Talou, P.; Kawano, T.; ...

    2015-08-01

    We present evaluations of the prompt fission neutron spectrum (PFNS) of ²³⁹Pu induced by 500 keV neutrons, and associated covariances. In a previous evaluation by Talou et al. (2010), surprisingly low evaluated uncertainties were obtained, partly due to simplifying assumptions in the quantification of uncertainties from experiment and model. Therefore, special emphasis is placed here on a thorough uncertainty quantification of experimental data and of the Los Alamos model predicted values entering the evaluation. In addition, the Los Alamos model was extended and an evaluation technique was employed that takes into account the qualitative differences between normalized model predicted values and experimental shape data. These improvements lead to changes in the evaluated PFNS and overall larger evaluated uncertainties than in the previous work. However, these evaluated uncertainties are still smaller than those obtained in a statistical analysis using experimental information only, due to strong model correlations. Hence, suggestions to estimate model defect uncertainties are presented, which lead to more reasonable evaluated uncertainties. The calculated k_eff of selected criticality benchmarks obtained with these new evaluations agree with each other within their uncertainties despite the different approaches to estimate model defect uncertainties. The k_eff one-standard-deviation intervals overlap with some of those obtained using ENDF/B-VII.1, albeit their mean values are further away from unity. Spectral indexes for the Jezebel critical assembly calculated with the newly evaluated PFNS agree with the experimental data for selected (n,γ) and (n,f) reactions, and show improvements for high-energy threshold (n,2n) reactions compared to ENDF/B-VII.1.
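
    The tension the abstract describes between experimental covariance, model covariance, and model-defect terms can be pictured with a generic generalized-least-squares update; this is a schematic of the evaluation step under our own simplifications, not the authors' code, and the grids and covariances are placeholders:

        import numpy as np

        def gls_update(x_model, c_model, x_exp, c_exp):
            # Combine a model prior (mean, covariance) with experimental
            # data on the same energy grid. Adding a model-defect term to
            # c_model weakens the model constraint, which inflates the
            # evaluated (posterior) uncertainty.
            gain = c_model @ np.linalg.inv(c_model + c_exp)
            x_post = x_model + gain @ (x_exp - x_model)
            c_post = c_model - gain @ c_model
            return x_post, c_post

    With strongly correlated model covariance, the posterior uncertainty can drop well below the purely experimental one, which is the effect the model-defect suggestions are meant to temper.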

  20. Evaluation of the ²³⁹Pu prompt fission neutron spectrum induced by neutrons of 500 keV and associated covariances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neudecker, D.; Talou, P.; Kawano, T.

    2015-08-01

    We present evaluations of the prompt fission neutron spectrum (PFNS) of ²³⁹Pu induced by 500 keV neutrons, and associated covariances. In a previous evaluation by Talou et al. (2010), surprisingly low evaluated uncertainties were obtained, partly due to simplifying assumptions in the quantification of uncertainties from experiment and model. Therefore, special emphasis is placed here on a thorough uncertainty quantification of experimental data and of the Los Alamos model predicted values entering the evaluation. In addition, the Los Alamos model was extended and an evaluation technique was employed that takes into account the qualitative differences between normalized model predicted values and experimental shape data. These improvements lead to changes in the evaluated PFNS and overall larger evaluated uncertainties than in the previous work. However, these evaluated uncertainties are still smaller than those obtained in a statistical analysis using experimental information only, due to strong model correlations. Hence, suggestions to estimate model defect uncertainties are presented, which lead to more reasonable evaluated uncertainties. The calculated k_eff of selected criticality benchmarks obtained with these new evaluations agree with each other within their uncertainties despite the different approaches to estimate model defect uncertainties. The k_eff one-standard-deviation intervals overlap with some of those obtained using ENDF/B-VII.1, albeit their mean values are further away from unity. Spectral indexes for the Jezebel critical assembly calculated with the newly evaluated PFNS agree with the experimental data for selected (n,γ) and (n,f) reactions, and show improvements for high-energy threshold (n,2n) reactions compared to ENDF/B-VII.1. © 2015 Elsevier B.V. All rights reserved.

  1. The role of the uncertainty of measurement of serum creatinine concentrations in the diagnosis of acute kidney injury.

    PubMed

    Kin Tekce, Buket; Tekce, Hikmet; Aktas, Gulali; Uyeturk, Ugur

    2016-01-01

    Uncertainty of measurement is the numeric expression of the errors associated with all measurements taken in clinical laboratories. Serum creatinine concentration is the most common diagnostic marker for acute kidney injury. The goal of this study was to determine the effect of the uncertainty of measurement of serum creatinine concentrations on the diagnosis of acute kidney injury. We calculated the uncertainty of measurement of serum creatinine according to the Nordtest Guide. Retrospectively, we identified 289 patients who were evaluated for acute kidney injury. Of the total patient pool, 233 were diagnosed with acute kidney injury using the AKIN classification scheme and then were compared using statistical analysis. We determined nine probabilities of the uncertainty of measurement of serum creatinine concentrations. There was a statistically significant difference in the number of patients diagnosed with acute kidney injury when uncertainty of measurement was taken into consideration (first probability compared to the fifth p = 0.023 and first probability compared to the ninth p = 0.012). We found that the uncertainty of measurement for serum creatinine concentrations was an important factor for correctly diagnosing acute kidney injury. In addition, based on the AKIN classification scheme, minimizing the total allowable error levels for serum creatinine concentrations is necessary for the accurate diagnosis of acute kidney injury by clinicians.
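
    For orientation, the Nordtest recipe mentioned above combines within-laboratory reproducibility with the uncertainty of the bias and expands the result with a coverage factor; the numbers below are hypothetical, not the study's values:

        import math

        def expanded_uncertainty(u_reproducibility, u_bias, k=2.0):
            # Nordtest-style combination: u_c = sqrt(u_Rw^2 + u_bias^2),
            # expanded with coverage factor k (about 95% coverage for k = 2).
            return k * math.sqrt(u_reproducibility**2 + u_bias**2)

        # Hypothetical serum creatinine uncertainties, mg/dL
        U = expanded_uncertainty(0.03, 0.02)
        print(round(U, 3))  # 0.072 mg/dL

    A small creatinine rise near an AKIN threshold that lies within the expanded-uncertainty interval of the two results is exactly the situation where staging can flip, which is the effect the study quantifies.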

  2. Emotion and Decision-Making Under Uncertainty: Physiological arousal predicts increased gambling during ambiguity but not risk

    PubMed Central

    FeldmanHall, Oriel; Glimcher, Paul; Baker, Augustus L; Phelps, Elizabeth A

    2016-01-01

    Uncertainty, which is ubiquitous in decision-making, can be fractionated into known probabilities (risk) and unknown probabilities (ambiguity). Although research illustrates that individuals more often avoid decisions associated with ambiguity compared to risk, it remains unclear why ambiguity is perceived as more aversive. Here we examine the role of arousal in shaping the representation of value and subsequent choice under risky and ambiguous decisions. To investigate the relationship between arousal and decisions of uncertainty, we measure skin conductance response—a quantifiable measure reflecting sympathetic nervous system arousal—during choices to gamble under risk and ambiguity. To quantify the discrete influences of risk and ambiguity sensitivity and the subjective value of each option under consideration, we model fluctuating uncertainty, as well as the amount of money that can be gained by taking the gamble. Results reveal that while arousal tracks the subjective value of a lottery regardless of uncertainty type, arousal differentially contributes to the computation of value—i.e. choice—depending on whether the uncertainty is risky or ambiguous: enhanced arousal adaptively decreases risk-taking only when the lottery is highly risky but increases risk-taking when the probability of winning is ambiguous (even after controlling for subjective value). Together, this suggests that the role of arousal during decisions of uncertainty is modulatory and highly dependent on the context in which the decision is framed. PMID:27690508

  3. Retrospective estimation of the electric and magnetic field exposure conditions in in vitro experimental reports reveal considerable potential for uncertainty.

    PubMed

    Portelli, Lucas A; Falldorf, Karsten; Thuróczy, György; Cuppen, Jan

    2018-04-01

    Experiments on cell cultures exposed to extremely low frequency (ELF, 3-300 Hz) magnetic fields are often subject to multiple sources of uncertainty associated with specific electric and magnetic field exposure conditions. Here we systematically quantify these uncertainties based on exposure conditions described in a group of bioelectromagnetic experimental reports for a representative sampling of the existing literature. The resulting uncertainties, stemming from insufficient, ambiguous, or erroneous description, design, implementation, or validation of the experimental methods and systems, were often substantial enough to potentially make any successful reproduction of the original experimental conditions difficult or impossible. Without making any assumption about the true biological relevance of ELF electric and magnetic fields, these findings suggest another contributing factor which may add to the overall variability and irreproducibility traditionally associated with experimental results of in vitro exposures to low-level ELF magnetic fields. Bioelectromagnetics. 39:231-243, 2018. © 2017 Wiley Periodicals, Inc.

  4. Trajectory Dispersed Vehicle Process for Space Launch System

    NASA Technical Reports Server (NTRS)

    Statham, Tamara; Thompson, Seth

    2017-01-01

    The Space Launch System (SLS) vehicle is part of NASA's deep space exploration plans that include manned missions to Mars. Manufacturing uncertainties in design parameters are key considerations throughout SLS development as they have significant effects on focus parameters such as liftoff thrust-to-weight ratio, vehicle payload, maximum dynamic pressure, and compression loads. This presentation discusses how the SLS program captures these uncertainties by utilizing a 3-degree-of-freedom (DOF) process called Trajectory Dispersed (TD) analysis. This analysis biases nominal trajectories to identify extremes in the design parameters for various potential SLS configurations and missions. The process utilizes a Design of Experiments (DOE) and response surface methodologies (RSM) to statistically sample uncertainties, and develops resulting vehicles using a Maximum Likelihood Estimate (MLE) process for targeting uncertainty biases. These vehicles represent various missions and configurations which are used as key inputs into a variety of analyses in the SLS design process, including 6-DOF dispersions, separation clearances, and engine-out failure studies.
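
    The DOE/RSM step can be pictured as sampling the uncertain design parameters, running the 3-DOF simulation at each sample, and fitting a cheap response surface to the outputs. The toy trajectory function and parameter names below are stand-ins for illustration only, not SLS models:

        import numpy as np

        rng = np.random.default_rng(0)

        def run_3dof(mass_bias, thrust_bias):
            # Stand-in for the trajectory code: returns a focus parameter
            # (say, maximum dynamic pressure) for given parameter biases.
            return (450.0 - 0.8 * mass_bias + 1.2 * thrust_bias
                    + 0.05 * mass_bias * thrust_bias)

        # DOE: sample normalized manufacturing uncertainties in [-1, 1]
        X = rng.uniform(-1.0, 1.0, size=(64, 2))
        y = np.array([run_3dof(*row) for row in X])

        # RSM: quadratic-with-interaction surface fitted by least squares
        design = np.column_stack([np.ones(len(X)), X, X**2,
                                  (X[:, 0] * X[:, 1])[:, None]])
        coef, *_ = np.linalg.lstsq(design, y, rcond=None)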

  5. Entropy of hydrological systems under small samples: Uncertainty and variability

    NASA Astrophysics Data System (ADS)

    Liu, Dengfeng; Wang, Dong; Wang, Yuankun; Wu, Jichun; Singh, Vijay P.; Zeng, Xiankui; Wang, Lachun; Chen, Yuanfang; Chen, Xi; Zhang, Liyuan; Gu, Shenghua

    2016-01-01

    Entropy theory has been increasingly applied in hydrology in both descriptive and inferential ways. However, little attention has been given to the small-sample condition widespread in hydrological practice, where hydrological measurements are limited or even nonexistent. Accordingly, entropy estimated under this condition may incur considerable bias. In this study, the small-sample condition is considered and two innovative entropy estimators, the Chao-Shen (CS) estimator and the James-Stein-type shrinkage (JSS) estimator, are introduced. Simulation tests conducted with distributions common in hydrology identify the JSS estimator as the best performing. Then, multi-scale moving entropy-based hydrological analyses (MM-EHA) are applied to indicate the changing patterns of uncertainty of streamflow data collected from the Yangtze River and the Yellow River, China. For further investigation into the intrinsic property of entropy applied in hydrological uncertainty analyses, correlations of entropy and other statistics at different time-scales are also calculated, which show connections between the concepts of uncertainty and variability.
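
    The shrinkage idea behind the JSS estimator is compact enough to sketch: shrink the maximum-likelihood cell frequencies toward a uniform target, with a data-driven intensity, then plug into the Shannon formula. This follows the Hausser-Strimmer form of the estimator and should be read as a sketch, not the paper's exact implementation:

        import numpy as np

        def jss_entropy(counts):
            counts = np.asarray(counts, dtype=float)
            n, k = counts.sum(), len(counts)
            p_ml = counts / n                      # maximum-likelihood frequencies
            target = np.full(k, 1.0 / k)           # uniform shrinkage target
            lam = (1.0 - np.sum(p_ml**2)) / (
                (n - 1.0) * np.sum((target - p_ml)**2))
            lam = min(1.0, max(0.0, lam))          # clip intensity to [0, 1]
            p = lam * target + (1.0 - lam) * p_ml  # shrunken frequencies
            p = p[p > 0.0]
            return -np.sum(p * np.log(p))          # plug-in entropy, nats

        print(jss_entropy([8, 2, 1, 0, 0]))  # small-sample histogram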

  6. Validation of the analytical methods in the LWR code BOXER for gadolinium-loaded fuel pins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paratte, J.M.; Arkuszewski, J.J.; Kamboj, B.K.

    1990-01-01

    Due to the very high absorption occurring in gadolinium-loaded fuel pins, calculations of lattices with such pins present are a demanding test of the analysis methods in light water reactor (LWR) cell and assembly codes. Considerable effort has, therefore, been devoted to the validation of code methods for gadolinia fuel. The goal of the work reported in this paper is to check the analysis methods in the LWR cell/assembly code BOXER and its associated cross-section processing code ETOBOX, by comparison of BOXER results with those from a very accurate Monte Carlo calculation for a gadolinium benchmark problem. Initial results of such a comparison have been previously reported. However, the Monte Carlo calculations, done with the MCNP code, were performed at Los Alamos National Laboratory using ENDF/B-V data, while the BOXER calculations were performed at the Paul Scherrer Institute (PSI) using JEF-1 nuclear data. This difference in the basic nuclear data used for the two calculations, caused by the restricted nature of these evaluated data files, led to associated uncertainties in a comparison of the results for methods validation. In the joint investigations at the Georgia Institute of Technology and PSI, such uncertainty in this comparison was eliminated by using ENDF/B-V data for BOXER calculations at Georgia Tech.

  7. Measurement Techniques for Respiratory Tract Deposition of Airborne Nanoparticles: A Critical Review

    PubMed Central

    Möller, Winfried; Pagels, Joakim H.; Kreyling, Wolfgang G.; Swietlicki, Erik; Schmid, Otmar

    2014-01-01

    Determination of the respiratory tract deposition of airborne particles is critical for risk assessment of air pollution, inhaled drug delivery, and understanding of respiratory disease. With the advent of nanotechnology, there has been an increasing interest in the measurement of pulmonary deposition of nanoparticles because of their unique properties in inhalation toxicology and medicine. Over the last century, around 50 studies have presented experimental data on lung deposition of nanoparticles (typical diameter ≤ 100 nm, but here ≤ 300 nm). These data show a considerable variability, partly due to differences in the applied methodologies. In this study, we review the experimental techniques for measuring respiratory tract deposition of nano-sized particles, analyze critical experimental design aspects causing measurement uncertainties, and suggest methodologies for future studies. It is shown that, although particle detection techniques have developed with time, the overall methodology in respiratory tract deposition experiments has not seen similar progress. Available experience from previous research has often not been incorporated, and some methodological design aspects that were overlooked in 30-70% of all studies may have biased the experimental data. This has contributed to a significant uncertainty in the absolute value of the lung deposition fraction of nanoparticles. We estimate the impact of the design aspects on obtained data, discuss solutions to minimize errors, and highlight gaps in the available experimental set of data. PMID:24151837

  8. Alignment error of mirror modules of advanced telescope for high-energy astrophysics due to wavefront aberrations

    NASA Astrophysics Data System (ADS)

    Zocchi, Fabio E.

    2017-10-01

    One of the approaches being tested for the integration of the mirror modules of the advanced telescope for high-energy astrophysics x-ray mission of the European Space Agency consists of aligning each module on an optical bench operated at an ultraviolet wavelength. The mirror module is illuminated by a plane wave and, in order to overcome diffraction effects, the centroid of the image produced by the module is used as a reference to assess the accuracy of the optical alignment of the mirror module itself. Among other sources of uncertainty, the wave-front error of the plane wave also introduces an error in the position of the centroid, thus affecting the quality of the mirror module alignment. The power spectral density of the position of the point spread function centroid is here derived from the power spectral density of the wave-front error of the plane wave in the framework of the scalar theory of Fourier diffraction. This allows a specification for the quality of the collimator used to generate the plane wave to be derived from the error-budget contribution allocated to the uncertainty of the centroid position. The theory applies generally whenever Fourier diffraction is a valid approximation, in which case the obtained result is identical to that derived from geometrical optics considerations.

  9. Field variability and vulnerability index to identify precision agriculture opportunity

    USDA-ARS?s Scientific Manuscript database

    Innovations in precision agriculture (PA) have created opportunities to achieve a greater understanding of within-field variability. However, PA adoption has been hindered due to uncertainty about field-specific performance and return on investment. Uncertainty could be better addressed by analyzing...

  10. Implementation Considerations, Not Topological Differences, Are the Main Determinants of Noise Suppression Properties in Feedback and Incoherent Feedforward Circuits

    PubMed Central

    Buzi, Gentian; Khammash, Mustafa

    2016-01-01

    Biological systems use a variety of mechanisms to deal with the uncertain nature of their external and internal environments. Two of the most common motifs employed for this purpose are the incoherent feedforward (IFF) and feedback (FB) topologies. Many theoretical and experimental studies suggest that these circuits play very different roles in providing robustness to uncertainty in the cellular environment. Here, we use a control theoretic approach to analyze two common FB and IFF architectures that make use of an intermediary species to achieve regulation. We show the equivalence of both circuit topologies in suppressing static cell-to-cell variations. While both circuits can suppress variations due to input noise, they are ineffective in suppressing inherent chemical reaction stochasticity. Indeed, these circuits realize comparable improvements limited to a modest 25% variance reduction in best case scenarios. Such limitations are attributed to the use of intermediary species in regulation, and as such, they persist even for circuit architectures that combine both IFF and FB features. Intriguingly, while the FB circuits are better suited to dealing with dynamic input variability, the most significant difference between the two topologies lies not in the structural features of the circuits, but in their practical implementation considerations. PMID:27257684

  11. Implementation Considerations, Not Topological Differences, Are the Main Determinants of Noise Suppression Properties in Feedback and Incoherent Feedforward Circuits.

    PubMed

    Buzi, Gentian; Khammash, Mustafa

    2016-06-01

    Biological systems use a variety of mechanisms to deal with the uncertain nature of their external and internal environments. Two of the most common motifs employed for this purpose are the incoherent feedforward (IFF) and feedback (FB) topologies. Many theoretical and experimental studies suggest that these circuits play very different roles in providing robustness to uncertainty in the cellular environment. Here, we use a control theoretic approach to analyze two common FB and IFF architectures that make use of an intermediary species to achieve regulation. We show the equivalence of both circuit topologies in suppressing static cell-to-cell variations. While both circuits can suppress variations due to input noise, they are ineffective in suppressing inherent chemical reaction stochasticity. Indeed, these circuits realize comparable improvements limited to a modest 25% variance reduction in best case scenarios. Such limitations are attributed to the use of intermediary species in regulation, and as such, they persist even for circuit architectures that combine both IFF and FB features. Intriguingly, while the FB circuits are better suited to dealing with dynamic input variability, the most significant difference between the two topologies lies not in the structural features of the circuits, but in their practical implementation considerations.

  12. Using analogues to quantify geological uncertainty in stochastic reserve modelling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wells, B.; Brown, I.

    1995-08-01

    The petroleum industry seeks to minimize exploration risk by employing the best possible expertise, methods and tools. Is it possible to quantify the success of this process of risk reduction? Due to inherent uncertainty in predicting geological reality and due to changing environments for hydrocarbon exploration, it is not enough simply to record the proportion of successful wells drilled; in various parts of the world it has been noted that pseudo-random drilling would apparently have been as successful as the actual drilling programme. How, then, should we judge the success of risk reduction? For many years the E&P industry has routinely used Monte Carlo modelling to generate a probability distribution for prospect reserves. One aspect of Monte Carlo modelling which has received insufficient attention, but which is essential for quantifying risk reduction, is the consistency and repeatability with which predictions can be made. Reducing the subjective element inherent in the specification of geological uncertainty allows better quantification of uncertainty in the prediction of reserves, in both exploration and appraisal. Building on work reported at the AAPG annual conventions in 1994 and 1995, the present paper incorporates analogue information with uncertainty modelling. Analogues provide a major step forward in the quantification of risk, but their significance is potentially greater still. The two principal contributors to uncertainty in field and prospect analysis are the hydrocarbon life-cycle and the geometry of the trap. These are usually treated separately. Combining them into a single model is a major contribution to the reduction of risk. This work is based in part on a joint project with Oryx Energy UK Ltd., and thanks are due in particular to Richard Benmore and Mike Cooper.
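
    The Monte Carlo step referred to above is, in outline, a volumetric product of sampled geological factors; the distributions below are invented for illustration, whereas the paper's point is precisely that analogue data should constrain them:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000

        # Illustrative prospect inputs (not from any real field)
        grv = rng.lognormal(np.log(4e8), 0.5, n)  # gross rock volume, m3
        ntg = rng.uniform(0.4, 0.8, n)            # net-to-gross ratio
        phi = rng.normal(0.22, 0.03, n)           # porosity
        shc = rng.uniform(0.55, 0.75, n)          # hydrocarbon saturation
        fvf = 1.2                                 # formation volume factor

        in_place = grv * ntg * phi * shc / fvf    # volumetric estimate, m3
        p90, p50, p10 = np.percentile(in_place, [10, 50, 90])
        print(p90, p50, p10)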

  13. Investigating the effect and uncertainties of light absorbing impurities in snow and ice on snow melt and discharge generation using a hydrologic catchment model and satellite data

    NASA Astrophysics Data System (ADS)

    Matt, Felix; Burkhart, John F.

    2017-04-01

    Light absorbing impurities in snow and ice (LAISI) originating from atmospheric deposition enhance snow melt by increasing the absorption of short wave radiation. The consequences are a shortening of the snow cover duration due to increased snow melt and, with respect to hydrologic processes, a temporal shift in the discharge generation. However, the magnitude of these effects as simulated in numerical models has large uncertainties, originating mainly from uncertainties in the wet and dry deposition of light absorbing aerosols, limitations in the model representation of the snowpack, and the lack of observable variables required to estimate model parameters and evaluate the simulated variables connected with the representation of LAISI. This leads to high uncertainties in the additional energy absorbed by the snow due to the presence of LAISI, a key variable in understanding snowpack energy-balance dynamics. In this study, we assess the effect of LAISI on snow melt and discharge generation, and the uncertainties involved, in a high mountain catchment located in the western Himalayas by using a distributed hydrological catchment model with a focus on the representation of the seasonal snow pack. The snow albedo is hereby calculated from a radiative transfer model for snow, taking into account the increased absorption of short wave radiation by LAISI. Meteorological forcing data are generated from an assimilation of observations and high resolution WRF simulations, and LAISI mixing ratios from deposition rates of Black Carbon simulated with the FLEXPART model. To assess the quality of our simulations and the related uncertainties, we compare the simulated additional energy absorbed by the snow due to the presence of LAISI to the MODIS Dust Radiative Forcing in Snow (MODDRFS) satellite product.

  14. Predictive uncertainty analysis of plume distribution for geological carbon sequestration using sparse-grid Bayesian method

    NASA Astrophysics Data System (ADS)

    Shi, X.; Zhang, G.

    2013-12-01

    Because of the extensive computational burden, parametric uncertainty analyses are rarely conducted for geological carbon sequestration (GCS) process-based multi-phase models. The difficulty of predictive uncertainty analysis for the CO2 plume migration in realistic GCS models is due not only to the spatial distribution of the caprock and reservoir (i.e. heterogeneous model parameters), but also to the fact that the GCS optimization estimation problem has multiple local minima arising from the complex nonlinear multi-phase (gas and aqueous), multi-component (water, CO2, salt) transport equations. The geological model built by Doughty and Pruess (2004) for the Frio pilot site (Texas) was selected and assumed to represent the 'true' system, which was composed of seven different facies (geological units) distributed among 10 layers. We chose to calibrate the permeabilities of these facies. Pressure and gas saturation values from this true model were then extracted and used as observations for subsequent model calibration. Random noise was added to the observations to approximate realistic field conditions. Each simulation of the model lasts about 2 hours. In this study, we develop a new approach that improves the computational efficiency of Bayesian inference by constructing a surrogate system based on an adaptive sparse-grid stochastic collocation method. This surrogate response-surface global optimization algorithm is first used to calibrate the model parameters; the prediction uncertainty of the CO2 plume position is then quantified as it propagates from the parametric uncertainty in the numerical experiments, and is also compared to the actual plume from the 'true' model. Results show that the approach is computationally efficient for multi-modal optimization and prediction uncertainty quantification for computationally expensive simulation models. Both our inverse methodology and findings can be broadly applicable to GCS in heterogeneous storage formations.

  15. Buoyancy contribution to uncertainty of mass, conventional mass and force

    NASA Astrophysics Data System (ADS)

    Malengo, Andrea; Bich, Walter

    2016-04-01

    The conventional mass is a useful concept introduced to reduce the impact of the buoyancy correction in everyday mass measurements, thus avoiding in most cases its accurate determination, which is necessary in measurements of ‘true’ mass. Although usage of conventional mass is universal and standardized, the concept is considered a sort of second-choice tool, to be avoided in high-accuracy applications. In this paper we show that this is a false belief, by elucidating the role played by covariances between volume and mass and between volume and conventional mass at the various stages of the dissemination chain and in the relationship between the uncertainties of mass and conventional mass. We arrive at somewhat counter-intuitive results: the volume of the transfer standard plays a comparatively minor role in the uncertainty budget of the standard under calibration. In addition, conventional mass is preferable to mass in normal, in-air operation, as its uncertainty is smaller than that of mass if covariance terms are properly taken into account, and the overstatement of uncertainty that (typically) results from neglecting them is less severe than that which (always) occurs with mass. The same considerations hold for force. In this respect, we show that the associated uncertainty is the same using mass or conventional mass, and, again, that the latter is preferable if covariance terms are neglected.
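
    For reference, the convention at issue defines the conventional mass of an artefact as the mass of a hypothetical 8000 kg/m³ standard that balances it in air of density 1.2 kg/m³. A short numeric sketch of the conversion (our illustration, not the paper's derivation):

        RHO_AIR, RHO_REF = 1.2, 8000.0  # kg/m3, conventional reference values

        def conventional_mass(mass_kg, density_kg_m3):
            # Conversion between (true) mass and conventional mass.
            return (mass_kg * (1.0 - RHO_AIR / density_kg_m3)
                    / (1.0 - RHO_AIR / RHO_REF))

        # A 1 kg aluminium-like standard (2700 kg/m3): the two values
        # differ by roughly 0.3 g.
        print(conventional_mass(1.0, 2700.0))  # ~0.99971 kg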

  16. An evaluation of the treatment of risk and uncertainties in the IPCC reports on climate change.

    PubMed

    Aven, Terje; Renn, Ortwin

    2015-04-01

    Few global threats rival global climate change in scale and potential consequence. The principal international authority assessing climate risk is the Intergovernmental Panel on Climate Change (IPCC). Through repeated assessments the IPCC has devoted considerable effort and interdisciplinary competence to articulating a common characterization of climate risk and uncertainties. We have reviewed the assessment and its foundation for the Fifth Assessment Reports published in 2013 and 2014, in particular the guidance note for lead authors of the fifth IPCC assessment report on consistent treatment of uncertainties. Our analysis shows that the work carried out by the IPCC falls short of providing a theoretically and conceptually convincing foundation on the treatment of risk and uncertainties. The main reasons for our assessment are: (i) the concept of risk is given too narrow a definition (a function of consequences and probability/likelihood); and (ii) the reports lack precision in delineating their concepts and methods. The goal of this article is to contribute to improving the handling of uncertainty and risk in future IPCC studies, thereby obtaining a more theoretically substantiated characterization as well as enhanced scientific quality for risk analysis in this area. Several suggestions for how to improve the risk and uncertainty treatment are provided. © 2014 Society for Risk Analysis.

  17. Artificial neural network modelling of uncertainty in gamma-ray spectrometry

    NASA Astrophysics Data System (ADS)

    Dragović, S.; Onjia, A.; Stanković, S.; Aničin, I.; Bačić, G.

    2005-03-01

    An artificial neural network (ANN) model for the prediction of measuring uncertainties in gamma-ray spectrometry was developed and optimized. A three-layer feed-forward ANN with back-propagation learning algorithm was used to model uncertainties of measurement of activity levels of eight radionuclides (²²⁶Ra, ²³⁸U, ²³⁵U, ⁴⁰K, ²³²Th, ¹³⁴Cs, ¹³⁷Cs and ⁷Be) in soil samples as a function of measurement time. It was shown that the neural network provides useful data even from small experimental databases. The performance of the optimized neural network was found to be very good, with correlation coefficients (R²) between measured and predicted uncertainties ranging from 0.9050 to 0.9915. The correlation coefficients did not significantly deteriorate when the network was tested on samples with greatly different uranium-to-thorium (²³⁸U/²³²Th) ratios. The differences between measured and predicted uncertainties were not influenced by the absolute values of uncertainties of measured radionuclide activities. Once the ANN is trained, it could be employed in analyzing soil samples regardless of the ²³⁸U/²³²Th ratio. It was concluded that a considerable saving in time could be obtained using the trained neural network model for predicting the measurement times needed to attain the desired statistical accuracy.
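
    The flavour of such a model (a small feed-forward net mapping measurement time to a predicted uncertainty) can be sketched with an off-the-shelf regressor; the 1/sqrt(t) training curve below mimics counting statistics and is synthetic, not the paper's soil-sample data:

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(2)
        t = np.linspace(600.0, 60000.0, 40).reshape(-1, 1)  # counting time, s
        u = 100.0 / np.sqrt(t).ravel() + rng.normal(0.0, 0.05, 40)  # rel. unc., %

        net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=20000,
                           random_state=0)
        net.fit(np.log(t), u)  # log of time stabilizes the fit
        print(net.predict(np.log([[30000.0]])))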

  18. Uncertainties in biological responses that influence hazard or risk approaches to the regulation of endocrine active substances

    EPA Science Inventory

    Endocrine Disrupting Chemicals (EDCs) may have delayed or transgenerational effects and display non-monotonic dose response relationships (NMDRs) that require careful consideration when determining environmental hazards. The case studies evaluated for the SETAC Pellston Workshop&...

  19. Does grazing management matter for soil carbon sequestration in shortgrass steppe?

    USDA-ARS?s Scientific Manuscript database

    Considerable uncertainty remains regarding the potential of grazing management on semiarid rangelands to sequester soil carbon. Short-term (less than 1 decade) studies have determined that grazing management potentially influences fluxes of carbon, but such studies are strongly influenced by prevail...

  20. A study on the impact of parameter uncertainty on the emission-based ranking of transportation projects.

    DOT National Transportation Integrated Search

    2014-01-01

    With the growing concern with air quality levels and, hence, the livability of urban regions in the nation, it has become increasingly common to incorporate vehicular emission considerations in the ranking of transportation projects. Network assignme...

  1. Risk management consideration in the bioeconomy

    Treesearch

    Camilla Abbati de Assis; Ronalds Gonzalez; Stephen Kelley; Hasan Jameel; Ted Bilek; Jesse Daystar; Robert Handfield; Jay Golden; Jeff Prestemon; Damien Singh

    2017-01-01

    In investing in a new venture, companies aim to increase their competitiveness and generate value in scenarios where volatile markets, geopolitical instabilities, and disruptive technologies create uncertainty and risk. The biobased industry poses additional challenges as it competes in a mature, highly efficient market, dominated by...

  2. Credibilistic multi-period portfolio optimization based on scenario tree

    NASA Astrophysics Data System (ADS)

    Mohebbi, Negin; Najafi, Amir Abbas

    2018-02-01

    In this paper, we consider a multi-period fuzzy portfolio optimization model that accounts for transaction costs and the possibility of risk-free investment. We formulate a bi-objective mean-VaR portfolio selection model based on the integration of fuzzy credibility theory and a scenario tree in order to deal with market uncertainty. The scenario tree is also a proper method for modeling multi-period portfolio problems given the length and continuity of their horizon. We take return and risk as well as cardinality, threshold, class, and liquidity constraints into consideration for further compliance of the model with reality. Then, an interactive dynamic programming method, which is based on a two-phase fuzzy interactive approach, is employed to solve the proposed model. In order to verify the proposed model, we present an empirical application in the NYSE under different circumstances. The results show that the consideration of data uncertainty and other real-world assumptions leads to more practical and efficient solutions.

  3. Quantifying and Qualifying USGS ShakeMap Uncertainty

    USGS Publications Warehouse

    Wald, David J.; Lin, Kuo-Wan; Quitoriano, Vincent

    2008-01-01

    We describe algorithms for quantifying and qualifying uncertainties associated with USGS ShakeMap ground motions. The uncertainty values computed consist of latitude/longitude grid-based multiplicative factors that scale the standard deviation associated with the ground motion prediction equation (GMPE) used within the ShakeMap algorithm for estimating ground motions. The resulting grid-based 'uncertainty map' is essential for evaluation of losses derived using ShakeMaps as the hazard input. For ShakeMap, ground motion uncertainty at any point is dominated by two main factors: (i) the influence of any proximal ground motion observations, and (ii) the uncertainty of estimating ground motions from the GMPE, most notably, elevated uncertainty due to initial, unconstrained source rupture geometry. The uncertainty is highest for larger magnitude earthquakes when source finiteness is not yet constrained and, hence, the distance to rupture is also uncertain. In addition to a spatially dependent, quantitative assessment, many users may prefer a simple, qualitative grading for the entire ShakeMap. We developed a grading scale that allows one to quickly gauge the appropriate level of confidence when using rapidly produced ShakeMaps as part of the post-earthquake decision-making process or for qualitative assessments of archived or historical earthquake ShakeMaps. We describe an uncertainty letter grading ('A' through 'F', for high to poor quality, respectively) based on the uncertainty map. A middle-range ('C') grade corresponds to a ShakeMap for a moderate-magnitude earthquake suitably represented with a point-source location. Lower grades 'D' and 'F' are assigned for larger events (M>6) where finite-source dimensions are not yet constrained. The addition of ground motion observations (or observed macroseismic intensities) reduces uncertainties over data-constrained portions of the map. Higher grades ('A' and 'B') correspond to ShakeMaps with constrained fault dimensions and numerous stations, depending on the density of station/data coverage. Due to these dependencies, the letter grade can change with subsequent ShakeMap revisions if more data are added or when finite-faulting dimensions are added. We emphasize that the greatest uncertainties are associated with unconstrained source dimensions for large earthquakes where the distance term in the GMPE is most uncertain; this uncertainty thus scales with magnitude (and consequently rupture dimension). Since this distance uncertainty produces potentially large uncertainties in ShakeMap ground-motion estimates, this factor dominates over compensating constraints for all but the most dense station distributions.
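
    The qualitative grading can be thought of as thresholding the map-mean uncertainty factor; the cutoffs in this sketch are placeholders chosen to show the shape of the logic, not the USGS values:

        def shakemap_grade(mean_uncertainty_factor):
            # Smaller factors (constrained source, dense stations) earn
            # better grades; unconstrained large events grade worse.
            for grade, cutoff in (("A", 0.7), ("B", 0.9),
                                  ("C", 1.1), ("D", 1.4)):
                if mean_uncertainty_factor <= cutoff:
                    return grade
            return "F"

        print(shakemap_grade(1.0))  # "C": moderate event, point source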

  4. Calculating Measurement Uncertainty of the “Conventional Value of the Result of Weighing in Air”

    DOE PAGES

    Flicker, Celia J.; Tran, Hy D.

    2016-04-02

    The conventional value of the result of weighing in air is frequently used in commercial calibrations of balances. The guidance in OIML D-028 for reporting uncertainty of the conventional value is too terse. When calibrating mass standards at low measurement uncertainties, it is necessary to perform a buoyancy correction before reporting the result. When the conventional result is calculated after calibrating true mass, the uncertainty due to calculating the conventional result is correlated with the buoyancy correction. We show through Monte Carlo simulations that the measurement uncertainty of the conventional result is less than the measurement uncertainty when reporting true mass. The Monte Carlo simulation tool is available in the online version of this article.
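
    The correlation effect can be reproduced with a toy Monte Carlo: the uncertain artefact density enters both the buoyancy correction and the conventional-mass conversion, and cancels in the conventional result. This is our simplified setup, not the article's simulation tool:

        import numpy as np

        rng = np.random.default_rng(3)
        n = 200_000
        rho_air, rho_ref = 1.2, 8000.0  # kg/m3

        # Hypothetical calibration: the in-air weighing result w is known
        # well; the artefact density rho is comparatively uncertain.
        w = rng.normal(0.99999990, 2e-8, n)  # kg, apparent (in-air) result
        rho = rng.normal(7950.0, 30.0, n)    # kg/m3

        m_true = w / (1.0 - rho_air / rho)   # buoyancy-corrected true mass
        m_conv = (m_true * (1.0 - rho_air / rho)
                  / (1.0 - rho_air / rho_ref))

        # rho cancels algebraically in m_conv, so its spread tracks w alone,
        # while m_true inherits the density uncertainty:
        print(m_true.std(), m_conv.std())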

  5. Uncertainty Analysis in 3D Equilibrium Reconstruction

    DOE PAGES

    Cianciosa, Mark R.; Hanson, James D.; Maurer, David A.

    2018-02-21

    Reconstruction is an inverse process where a parameter space is searched to locate a set of parameters with the highest probability of describing experimental observations. Due to systematic errors and uncertainty in experimental measurements, this optimal set of parameters will contain some associated uncertainty. This uncertainty in the optimal parameters leads to uncertainty in models derived using those parameters. V3FIT is a three-dimensional (3D) equilibrium reconstruction code that propagates uncertainty from the input signals, to the reconstructed parameters, and to the final model. In this paper, we describe the methods used to propagate uncertainty in V3FIT. Using the results of whole shot 3D equilibrium reconstruction of the Compact Toroidal Hybrid, this propagated uncertainty is validated against the random variation in the resulting parameters. Two different model parameterizations demonstrate how the uncertainty propagation can indicate the quality of a reconstruction. As a proxy for random sampling, the whole shot reconstruction results in a time interval that will be used to validate the propagated uncertainty from a single time slice.
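
    First-order (linearized) propagation of the reconstructed-parameter covariance to a derived quantity is the standard machinery for this kind of analysis; a generic sketch, not the V3FIT source:

        import numpy as np

        def propagate_covariance(jacobian, param_cov):
            # C_model = J C_p J^T: map parameter covariance through the
            # local sensitivities (Jacobian) of the derived quantity.
            j = np.atleast_2d(jacobian)
            return j @ param_cov @ j.T

        # Two correlated reconstructed parameters; one derived quantity
        # with (made-up) sensitivities 0.5 and -1.3:
        c_p = np.array([[0.04, 0.01],
                        [0.01, 0.09]])
        print(propagate_covariance([0.5, -1.3], c_p))  # [[0.1491]]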

  6. Uncertainty Analysis in 3D Equilibrium Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cianciosa, Mark R.; Hanson, James D.; Maurer, David A.

    Reconstruction is an inverse process where a parameter space is searched to locate a set of parameters with the highest probability of describing experimental observations. Due to systematic errors and uncertainty in experimental measurements, this optimal set of parameters will contain some associated uncertainty. This uncertainty in the optimal parameters leads to uncertainty in models derived using those parameters. V3FIT is a three-dimensional (3D) equilibrium reconstruction code that propagates uncertainty from the input signals, to the reconstructed parameters, and to the final model. In this paper, we describe the methods used to propagate uncertainty in V3FIT. Using the results of whole shot 3D equilibrium reconstruction of the Compact Toroidal Hybrid, this propagated uncertainty is validated against the random variation in the resulting parameters. Two different model parameterizations demonstrate how the uncertainty propagation can indicate the quality of a reconstruction. As a proxy for random sampling, the whole shot reconstruction results in a time interval that will be used to validate the propagated uncertainty from a single time slice.

  7. Uncertainty of Wheat Water Use: Simulated Patterns and Sensitivity to Temperature and CO2

    NASA Technical Reports Server (NTRS)

    Cammarano, Davide; Roetter, Reimund P.; Asseng, Senthold; Ewert, Frank; Wallach, Daniel; Martre, Pierre; Hatfield, Jerry L.; Jones, James W.; Rosenzweig, Cynthia E.; Ruane, Alex C.

    2016-01-01

    Projected global warming and population growth will reduce future water availability for agriculture. Thus, it is essential to increase the efficiency of water use to ensure crop productivity. Quantifying crop water use (WU; i.e. actual evapotranspiration) is a critical step towards this goal. Here, sixteen wheat simulation models were used to quantify sources of model uncertainty and to estimate the relative changes and variability between models for simulated WU, water use efficiency (WUE, WU per unit of grain dry mass produced), transpiration efficiency (Teff, transpiration per unit of grain yield dry mass produced), grain yield, crop transpiration and soil evaporation at increased temperatures and elevated atmospheric carbon dioxide concentrations ([CO2]). The greatest uncertainty in simulating water use, potential evapotranspiration, crop transpiration and soil evaporation was due to differences in how crop transpiration was modelled, and accounted for 50% of the total variability among models. The simulation results for the sensitivity to temperature indicated that crop WU will decline with increasing temperature due to reduced growing seasons. The uncertainties in simulated crop WU, in particular those due to uncertainties in simulating crop transpiration, were greater under increased temperatures and under high temperatures combined with elevated atmospheric [CO2]. Hence the simulation of crop WU, and in particular of crop transpiration under higher temperature, needs to be improved and evaluated with field measurements before models can be used to simulate climate change impacts on future crop water demand.

  8. Probabilistic flood warning using grand ensemble weather forecasts

    NASA Astrophysics Data System (ADS)

    He, Y.; Wetterhall, F.; Cloke, H.; Pappenberger, F.; Wilson, M.; Freer, J.; McGregor, G.

    2009-04-01

    As the severity of floods increases, possibly due to climate and land-use change, there is an urgent need for more effective and reliable warning systems. The incorporation of numerical weather predictions (NWP) into a flood warning system can increase forecast lead times from a few hours to a few days. A single NWP forecast from a single forecast centre, however, is insufficient as it involves considerable non-predictable uncertainties and can lead to a high number of false or missed warnings. An ensemble of weather forecasts from one Ensemble Prediction System (EPS), when applied to catchment hydrology, can provide improved early flood warning as some of the uncertainties can be quantified. EPS forecasts from a single weather centre only account for part of the uncertainties, originating from initial conditions and stochastic physics. Other sources of uncertainty, including numerical implementations and/or data assimilation, can only be assessed if a grand ensemble of EPSs from different weather centres is used. When various models that produce EPS from different weather centres are aggregated, the probabilistic nature of the ensemble precipitation forecasts can be better retained and accounted for. The availability of twelve global EPSs through the 'THORPEX Interactive Grand Global Ensemble' (TIGGE) offers a new opportunity for the design of an improved probabilistic flood forecasting framework. This work presents a case study using the TIGGE database for flood warning on a meso-scale catchment. The upper reach of the River Severn catchment located in the Midlands Region of England is selected due to its abundant data and its relatively small size (4062 km²) compared to the resolution of the NWPs. This choice was deliberate, as we hypothesize that the uncertainty in the forcing of smaller catchments cannot be represented by a single EPS with a very limited number of ensemble members, but only through the variance given by a large number of ensembles and ensemble systems. A coupled atmospheric-hydrologic-hydraulic cascade system driven by the TIGGE ensemble forecasts is set up to study the potential benefits of using the TIGGE database in early flood warning. The physically based and fully distributed LISFLOOD suite of models is selected to simulate discharge and flood inundation consecutively. The results show the TIGGE database is a promising tool for producing forecasts of discharge and flood inundation comparable with the observed discharge and with simulated inundation driven by the observed discharge. The spread of discharge forecasts varies from centre to centre, but it is generally large, implying a significant level of uncertainty. Precipitation input uncertainties dominate and propagate through the cascade chain. The current NWPs fall short of representing the spatial variability of precipitation on a comparatively small catchment. This perhaps indicates the need to improve NWP resolution and/or disaggregation techniques to narrow the spatial gap between meteorology and hydrology. It is not necessarily true that early flood warning becomes more reliable when more ensemble forecasts are employed. It is difficult to identify the best forecast centre(s), but in general the chance of detecting floods is increased by using the TIGGE database. Only one flood event was studied because most of the TIGGE data became available after October 2007. It is necessary to test the TIGGE ensemble forecasts on other flood events in other catchments with different hydrological and climatic regimes before general conclusions can be made about its robustness and applicability.

  9. Combining Satellite Ocean Color and Hydrodynamic Model Uncertainties in Bio-Optical Forecasts

    DTIC Science & Technology

    2014-04-03

    observed chlorophyll distribution for that day (MODIS image for October 17, 2011), without regard to sign, i.e., |Figs. 11(c)-11(a)|. Black pixels indicate ... time using the current field from the model. Uncertainties in both the satellite chlorophyll values and the currents from the circulation model impact ... ensemble techniques to partition the chlorophyll uncertainties into components due to atmospheric correction and bio-optical inversion. By combining

  10. Cultural diversity teaching and issues of uncertainty: the findings of a qualitative study

    PubMed Central

    Dogra, Nisha; Giordano, James; France, Nicholas

    2007-01-01

    Background: There is considerable ambiguity in the subjective dimensions that comprise much of the relational dynamic of the clinical encounter. Comfort with this ambiguity, and recognition of the potential uncertainty of particular domains of medicine (e.g., cultural factors of illness expression, value bias in diagnoses, etc.) is an important facet of medical education. This paper begins by defining ambiguity and uncertainty as relevant to clinical practice. Studies have shown differing patterns of students' tolerance for ambiguity and uncertainty that appear to reflect extant attitudinal predispositions toward technology, objectivity, culture, value- and theory-ladenness, and the need for self-examination. This paper reports on those findings specifically related to the theme of uncertainty as relevant to teaching about cultural diversity. Its focus is to identify how and where the theme of certainty arose in the teaching and learning of cultural diversity, what the attitudes toward this theme and topic were, and how these attitudes and responses reflect and inform this area of medical pedagogy. Methods: A semi-structured interview was undertaken with 61 stakeholders (including policymakers, diversity teachers, students and users). The data were analysed and themes identified. Results: There were diverse views about what the term cultural diversity means and what should constitute the cultural diversity curriculum. There was a perceived need to provide certainty in teaching cultural diversity, with diversity teachers feeling under considerable pressure to provide information. Students' discomfort with uncertainty was felt to drive cultural diversity teaching towards factual emphasis rather than reflection or taking a patient-centred approach. Conclusion: Students and faculty may feel that cultural diversity teaching is more about how to avoid professional, medico-legal pitfalls than about improving the patient experience or the patient-physician relationship. There may be pressure to imbue cultural diversity issues with levels of objectivity and certainty representative of other aspects of the medical curriculum (e.g., biochemistry). This may reflect a particular selection bias for students with a technocentric orientation. Inadvertently, medical education may enhance this bias through training effects, and accommodate disregard for subjectivity, over-reliance upon technology, and thereby foster incorrect assumptions of objective certainty. We opine that it is important to teach students that technology cannot guarantee certainty, and that dealing with subjectivity, diversity, ambiguity and uncertainty is inseparable from the personal dimension of medicine as moral enterprise. Uncertainty is inherent in cultural diversity, so this part of the curriculum provides an opportunity to address the issue as it relates to patient care. PMID:17462089

  11. Cultural diversity teaching and issues of uncertainty: the findings of a qualitative study.

    PubMed

    Dogra, Nisha; Giordano, James; France, Nicholas

    2007-04-26

    There is considerable ambiguity in the subjective dimensions that comprise much of the relational dynamic of the clinical encounter. Comfort with this ambiguity, and recognition of the potential uncertainty of particular domains of medicine (e.g., cultural factors of illness expression, value bias in diagnoses, etc.) is an important facet of medical education. This paper begins by defining ambiguity and uncertainty as relevant to clinical practice. Studies have shown differing patterns of students' tolerance for ambiguity and uncertainty that appear to reflect extant attitudinal predispositions toward technology, objectivity, culture, value- and theory-ladenness, and the need for self-examination. This paper reports on those findings specifically related to the theme of uncertainty as relevant to teaching about cultural diversity. Its focus is to identify how and where the theme of certainty arose in the teaching and learning of cultural diversity, what the attitudes toward this theme and topic were, and how these attitudes and responses reflect and inform this area of medical pedagogy. A semi-structured interview was undertaken with 61 stakeholders (including policymakers, diversity teachers, students and users). The data were analysed and themes identified. There were diverse views about what the term cultural diversity means and what should constitute the cultural diversity curriculum. There was a perceived need to provide certainty in teaching cultural diversity, with diversity teachers feeling under considerable pressure to provide information. Students' discomfort with uncertainty was felt to drive cultural diversity teaching towards factual emphasis rather than reflection or taking a patient-centred approach. Students and faculty may feel that cultural diversity teaching is more about how to avoid professional, medico-legal pitfalls than about improving the patient experience or the patient-physician relationship. There may be pressure to imbue cultural diversity issues with levels of objectivity and certainty representative of other aspects of the medical curriculum (e.g., biochemistry). This may reflect a particular selection bias for students with a technocentric orientation. Inadvertently, medical education may enhance this bias through training effects, and accommodate disregard for subjectivity, over-reliance upon technology, and thereby foster incorrect assumptions of objective certainty. We opine that it is important to teach students that technology cannot guarantee certainty, and that dealing with subjectivity, diversity, ambiguity and uncertainty is inseparable from the personal dimension of medicine as moral enterprise. Uncertainty is inherent in cultural diversity, so this part of the curriculum provides an opportunity to address the issue as it relates to patient care.

  12. Role of turbulence fluctuations on uncertainties of acoustic Doppler current profiler discharge measurements

    USGS Publications Warehouse

    Tarrab, Leticia; Garcia, Carlos M.; Cantero, Mariano I.; Oberg, Kevin

    2012-01-01

    This work presents a systematic analysis quantifying the role of the presence of turbulence fluctuations on uncertainties (random errors) of acoustic Doppler current profiler (ADCP) discharge measurements from moving platforms. Data sets of three-dimensional flow velocities with high temporal and spatial resolution were generated from direct numerical simulation (DNS) of turbulent open channel flow. Dimensionless functions relating parameters quantifying the uncertainty in discharge measurements due to flow turbulence (relative variance and relative maximum random error) to sampling configuration were developed from the DNS simulations and then validated with field-scale discharge measurements. The validated functions were used to evaluate the role of the presence of flow turbulence fluctuations on uncertainties in ADCP discharge measurements. The results of this work indicate that random errors due to the flow turbulence are significant when: (a) a low number of transects is used for a discharge measurement, and (b) measurements are made in shallow rivers using high boat velocity (short time for the boat to cross a flow turbulence structure).
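
    The practical upshot, under the simplifying assumption that successive transects are independent samples of the turbulence-driven scatter, is that the relative random error of the averaged discharge falls with the square root of the transect count; the single-transect coefficient of variation below is invented for illustration:

        import math

        def discharge_relative_se(single_transect_cv, n_transects):
            # Independent-sample approximation: the SE of the mean
            # discharge scales as 1/sqrt(N) in the transect count N.
            return single_transect_cv / math.sqrt(n_transects)

        for n in (1, 2, 4, 8):
            print(n, round(discharge_relative_se(0.05, n), 3))
        # 1 -> 0.05, 2 -> 0.035, 4 -> 0.025, 8 -> 0.018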

  13. Hydrological considerations in providing data for water agreements

    NASA Astrophysics Data System (ADS)

    Shamir, U.

    2011-12-01

    Conflicts over water are as old as human history. Still, analyses of past and present water conflicts, cooperation and agreements clearly indicate a preponderance of cooperation over conflict. How can hydrologists contribute to maximizing the probability that this will be the outcome when the interests of adjacent political entities over water move towards conflict? Hydrology is among the most important data bases for crafting a water agreement across a political boundary (others include political, social, and economic data) and is often the most elusive and controversial. We deal here with cases of water scarcity, although flood protection issues are no less interesting and challenging. For hydrologists, some of the important points in this regard are: - Agreed and "stable" hydrological data base: hydrologists know that data bases are always a "moving target" that keeps changing with new and better information, improved understanding of the hydrological components and the use of models, as well as due to the influence of changing internal and external drivers (land use and land cover, modified precipitation fields, climate change). On the other hand, it is not possible to manage an agreement that requires continuous change of the hydrological information. To do so would cause endless discussions between the parties, causing the agreement to become unstable. The tendency is therefore to "freeze" the hydrological information in the agreement and introduce a mechanism for periodic update. - Variability and uncertainty: while the basic hydrology is to be kept "stable", the agreement must recognize variability and uncertainty. Various mechanisms can be used for this, depending on the specific circumstances of the case, including the range of variability, the degree of uncertainty, the consequences of systematic excursions from nominal values, and the effects of random variability. - Water quality is an important parameter that determines usability for various purposes, and requires treatment when source quality does not match consumer requirements. - Complexity/difficulty and associated cost of extraction/production to turn the "potential" source water into "usable" water. - Look jointly for new sources and benefits (expand the "cake"): agreements should look beyond the issues and water sources that are under imminent discussion due to competition and disagreement, to see whether the "cake" can be expanded, in terms of the water itself and of the benefits that can accrue from a creative water agreement. - Conversion of "potential" water into "usable" water: water in a source requires transformation in time, space and quality, and incurs a cost. - Introduction of expanded, previously unused resources which become available due to advanced extraction/production capabilities and additional treatment processes, and/or by changing water use patterns and land use practices. - Negotiate over and jointly manage the benefits and losses due to water (wherever and whenever possible) rather than merely over the physical parameters of water themselves: volume, flow, and concentration.

  14. When, not if: The inescapability of an uncertain future

    NASA Astrophysics Data System (ADS)

    Lewandowsky, S.; Ballard, T.

    2014-12-01

    Uncertainty is an inherent feature of most scientific endeavours, and many political decisions must be made in the presence of scientific uncertainty. In the case of climate change, there is evidence that greater scientific uncertainty increases the risk associated with the impact of climate change. Scientific uncertainty thus provides an impetus for cutting emissions rather than delaying action. In contrast to those normative considerations, uncertainty is frequently cited in political and public discourse as a reason to delay mitigation. We examine ways in which this gap between public and scientific understanding of uncertainty can be bridged. In particular, we sought ways to communicate uncertainty in a way that better calibrates people's risk perceptions with the projected impact of climate change. We report two behavioural experiments in which uncertainty about the future was expressed either as outcome uncertainty or temporal uncertainty. The conventional presentation of uncertainty involves uncertainty about an outcome at a given time—for example, the range of possible sea level rise (say 50 cm +/- 20 cm) by a certain date. An alternative presentation of the same situation presents a certain outcome ("sea levels will rise by 50 cm") but places the uncertainty into the time of arrival ("this may occur as early as 2040 or as late as 2080"). We presented participants with a series of statements and graphs indicating projected increases in temperature, sea levels, ocean acidification, and a decrease in arctic sea ice. In the uncertain magnitude condition, the statements and graphs reported the upper and lower confidence bounds of the projected magnitude and the mean projected time of arrival. In the uncertain time of arrival condition, they reported the upper and lower confidence bounds of the projected time of arrival and the mean projected magnitude. The results show that when uncertainty was presented as uncertain time of arrival rather than an uncertain outcome, people expressed greater concern about the projected outcomes. In a further experiment involving repeated "games" with a simulated economy, we similarly showed that people allocate more resources to mitigation if there is uncertainty about the timing of an adverse event rather than about the magnitude of its impact.
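    The re-framing can be illustrated with a small worked example, assuming (hypothetically) a linear sea-level trend whose rate is the uncertain quantity:

        # A sketch (hypothetical numbers) of re-expressing outcome uncertainty as
        # time-of-arrival uncertainty under an assumed linear sea-level trend.

        target_rise_cm = 50.0          # fixed outcome: "sea levels will rise by 50 cm"
        start_year = 2020

        # Outcome framing: uncertain rate gives 30-70 cm by 2060 (50 +/- 20 cm).
        rate_low_cm_per_yr = 30.0 / (2060 - start_year)   # slowest plausible rate
        rate_high_cm_per_yr = 70.0 / (2060 - start_year)  # fastest plausible rate

        # Temporal framing: when does the *certain* 50 cm rise arrive?
        earliest = start_year + target_rise_cm / rate_high_cm_per_yr
        latest = start_year + target_rise_cm / rate_low_cm_per_yr

        print(f"50 cm rise arrives between {earliest:.0f} and {latest:.0f}")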

  15. Managing uncertainty in flood protection planning with climate projections

    NASA Astrophysics Data System (ADS)

    Dittes, Beatrice; Špačková, Olga; Schoppa, Lukas; Straub, Daniel

    2018-04-01

    Technical flood protection is a necessary part of integrated strategies to protect riverine settlements from extreme floods. Many technical flood protection measures, such as dikes and protection walls, are costly to adapt after their initial construction. This poses a challenge to decision makers as there is large uncertainty in how the required protection level will change during the measure lifetime, which is typically many decades long. Flood protection requirements should account for multiple future uncertain factors: socioeconomic, e.g., whether the population and with it the damage potential grows or falls; technological, e.g., possible advancements in flood protection; and climatic, e.g., whether extreme discharge will become more frequent or not. This paper focuses on climatic uncertainty. Specifically, we devise a methodology to account for uncertainty associated with the use of discharge projections, ultimately leading to planning implications. For planning purposes, we categorize uncertainties as either visible, if they can be quantified from available catchment data, or hidden, if they cannot be quantified from catchment data and must be estimated, e.g., from the literature. It is vital to consider the hidden uncertainty, since in practical applications only a limited amount of information (e.g., a finite projection ensemble) is available. We use a Bayesian approach to quantify the visible uncertainties and combine them with an estimate of the hidden uncertainties to learn a joint probability distribution of the parameters of extreme discharge. The methodology is integrated into an optimization framework and applied to a pre-alpine case study to give a quantitative, cost-optimal recommendation on the required amount of flood protection. The results show that hidden uncertainty ought to be considered in planning, but the larger the uncertainty already present, the smaller the impact of adding more. The recommended planning is robust to moderate changes in uncertainty as well as in trend. In contrast, planning without consideration of bias and dependencies in and between uncertainty components leads to strongly suboptimal planning recommendations.
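    A minimal sketch of the combination step, assuming the hidden uncertainty is supplied as an extra, independent variance term estimated from the literature (all numbers hypothetical):

        import numpy as np

        # Hypothetical 100-year-discharge estimates from a finite projection ensemble.
        ensemble_q100 = np.array([950., 1010., 880., 1120., 990., 1040., 915.])

        visible_var = ensemble_q100.var(ddof=1)      # quantifiable from catchment data
        hidden_sd_frac = 0.15                        # literature-based estimate (assumed)
        hidden_var = (hidden_sd_frac * ensemble_q100.mean())**2

        total_sd = np.sqrt(visible_var + hidden_var)  # assumes independent components
        print(f"Q100 = {ensemble_q100.mean():.0f} +/- {total_sd:.0f} m^3/s")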

  16. Assessment of uncertainties in radiation-induced cancer risk predictions at clinically relevant doses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, J.; Moteabbed, M.; Paganetti, H., E-mail: hpaganetti@mgh.harvard.edu

    2015-01-15

    Purpose: Theoretical dose–response models offer the possibility to assess second cancer induction risks after external beam therapy. The parameters used in these models are determined with limited data from epidemiological studies. Risk estimations are thus associated with considerable uncertainties. This study aims at illustrating the uncertainties in predicting the risk for organ-specific second cancers in the primary radiation field, using selected treatment plans for brain cancer patients as examples. Methods: A widely used risk model was considered in this study. The uncertainties of the model parameters were estimated with reported data of second cancer incidences for various organs. Standard error propagation was then applied to assess the uncertainty in the risk model. Next, second cancer risks of five pediatric patients treated for cancer in the head and neck regions were calculated. For each case, treatment plans for proton and photon therapy were designed to estimate the uncertainties (a) in the lifetime attributable risk (LAR) for a given treatment modality and (b) when comparing risks of two different treatment modalities. Results: Uncertainties in excess of 100% of the risk were found for almost all organs considered. When applied to treatment plans, the calculated LAR values have uncertainties of the same magnitude. A comparison between cancer risks of different treatment modalities, however, does allow statistically significant conclusions. In the studied cases, the patient averaged LAR ratio of proton and photon treatments was 0.35, 0.56, and 0.59 for brain carcinoma, brain sarcoma, and bone sarcoma, respectively. Their corresponding uncertainties were estimated to be potentially below 5%, depending on uncertainties in dosimetry. Conclusions: The uncertainty in the dose–response curve in cancer risk models makes it currently impractical to predict the risk for an individual external beam treatment. On the other hand, the ratio of absolute risks between two modalities is less sensitive to the uncertainties in the risk model and can provide statistically significant estimates.
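    Why the ratio is so much better constrained than the absolute risks can be sketched with a toy Monte Carlo in which one highly uncertain risk-model slope is shared by both modalities (all values hypothetical):

        import numpy as np

        rng = np.random.default_rng(0)

        # Shared, highly uncertain risk-model slope (~100% relative uncertainty).
        slope = rng.lognormal(mean=0.0, sigma=0.8, size=100_000)

        organ_dose_proton = 2.0   # hypothetical mean organ doses (Gy)
        organ_dose_photon = 5.0

        lar_proton = slope * organ_dose_proton   # the same slope enters both risks
        lar_photon = slope * organ_dose_photon

        ratio = lar_proton / lar_photon          # the slope cancels exactly here

        print("rel. sd of absolute risk :", lar_proton.std() / lar_proton.mean())
        print("sd of risk ratio         :", ratio.std())  # ~0: shared error cancels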

  17. The Behavior of IAPWS-95 from 250 to 300 K and Pressures up to 400 MPa: Evaluation Based on Recently Derived Property Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagner, Wolfgang, E-mail: Wagner@thermo.rub.de; Thol, Monika

    2015-12-15

    Over the past several years, considerable scientific and technical interest has been focused on accurate thermodynamic properties of fluid water covering part of the subcooled (metastable) region and the stable liquid from the melting line up to about 300 K and pressures up to several hundred MPa. Between 2000 and 2010, experimental density data were published whose accuracy was not completely clear. The scientific standard equation of state for fluid water, the IAPWS-95 formulation, was developed on the basis of experimental data for thermodynamic properties that were available by 1995. In this work, it is examined how IAPWS-95 behaves with respect to the experimental data published after 1995. This investigation is carried out for temperatures from 250 to 300 K and pressures up to 400 MPa. The starting point is the assessment of the current data situation. This was mainly performed on the basis of data for the density, expansivity, compressibility, and isobaric heat capacity, which were derived in 2015 from very accurate speed-of-sound data. Apart from experimental data and these derived data, property values calculated from the recently published equation of state for this region of Holten et al. (2014) were also used. As a result, the unclear data situation could be clarified, and uncertainty values could be estimated for the investigated properties. In the region described above, detailed comparisons show that IAPWS-95 is able to represent the latest experimental data for the density, expansivity, compressibility, speed of sound, and isobaric heat capacity to within the uncertainties given in the release on IAPWS-95. Since the release does not contain uncertainty estimates for expansivities and compressibilities, the statement relates to the error propagation of the given uncertainty in density. Due to the lack of experimental data for the isobaric heat capacity for pressures above 100 MPa, no uncertainty estimates are given in the release for this pressure range. Results of the investigation of IAPWS-95 concerning its behavior with regard to the isobaric heat capacity in the high-pressure low-temperature region are also presented. Comparisons with very accurate speed-of-sound data published in 2012 showed that the uncertainty estimates of IAPWS-95 in speed of sound could be decreased for temperatures from 283 to 473 K and pressures up to 400 MPa.

  18. Multiscale Informatics for Low-Temperature Propane Oxidation: Further Complexities in Studies of Complex Reactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burke, Michael P.; Goldsmith, C. Franklin; Klippenstein, Stephen J.

    2015-07-16

    We have developed a multi-scale approach (Burke, M. P.; Klippenstein, S. J.; Harding, L. B. Proc. Combust. Inst. 2013, 34, 547–555.) to kinetic model formulation that directly incorporates elementary kinetic theories as a means to provide reliable, physics-based extrapolation to unexplored conditions. Here, we extend and generalize the multi-scale modeling strategy to treat systems of considerable complexity – involving multi-well reactions, potentially missing reactions, non-statistical product branching ratios, and non-Boltzmann (i.e. non-thermal) reactant distributions. The methodology is demonstrated here for a subsystem of low-temperature propane oxidation, as a representative system for low-temperature fuel oxidation. A multi-scale model is assembled and informed by a wide variety of targets that include ab initio calculations of molecular properties, rate constant measurements of isolated reactions, and complex systems measurements. Active model parameters are chosen to accommodate both “parametric” and “structural” uncertainties. Theoretical parameters (e.g. barrier heights) are included as active model parameters to account for parametric uncertainties in the theoretical treatment; experimental parameters (e.g. initial temperatures) are included to account for parametric uncertainties in the physical models of the experiments. RMG software is used to assess potential structural uncertainties due to missing reactions. Additionally, branching ratios among product channels are included as active model parameters to account for structural uncertainties related to difficulties in modeling sequences of multiple chemically activated steps. The approach is demonstrated here for interpreting time-resolved measurements of OH, HO2, n-propyl, i-propyl, propene, oxetane, and methyloxirane from photolysis-initiated low-temperature oxidation of propane at pressures from 4 to 60 Torr and temperatures from 300 to 700 K. In particular, the multi-scale informed model provides a consistent quantitative explanation of both ab initio calculations and time-resolved species measurements. The present results show that interpretations of OH measurements are significantly more complicated than previously thought – in addition to barrier heights for key transition states considered previously, OH profiles also depend on additional theoretical parameters for R + O2 reactions, secondary reactions, QOOH + O2 reactions, and treatment of non-Boltzmann reaction sequences. Extraction of physically rigorous information from those measurements may require more sophisticated treatment of all of those model aspects, as well as additional experimental data under more conditions, to discriminate among possible interpretations and ensure model reliability. Keywords: Optimization, Uncertainty quantification, Chemical mechanism, Low-Temperature Oxidation, Non-Boltzmann

  19. Accounting for uncertainty in health economic decision models by using model averaging.

    PubMed

    Jackson, Christopher H; Thompson, Simon G; Sharples, Linda D

    2009-04-01

    Health economic decision models are subject to considerable uncertainty, much of which arises from choices between several plausible model structures, e.g. choices of covariates in a regression model. Such structural uncertainty is rarely accounted for formally in decision models but can be addressed by model averaging. We discuss the most common methods of averaging models and the principles underlying them. We apply them to a comparison of two surgical techniques for repairing abdominal aortic aneurysms. In model averaging, competing models are usually either weighted by using an asymptotically consistent model assessment criterion, such as the Bayesian information criterion, or a measure of predictive ability, such as Akaike's information criterion. We argue that the predictive approach is more suitable when modelling the complex underlying processes of interest in health economics, such as individual disease progression and response to treatment.
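    A minimal sketch of the weighting step, using Akaike weights over hypothetical AIC values (the same recipe applies with BIC):

        import numpy as np

        # Hypothetical AIC values for competing decision-model structures.
        aic = np.array([1012.4, 1009.8, 1015.1])

        delta = aic - aic.min()
        weights = np.exp(-0.5 * delta)
        weights /= weights.sum()                 # Akaike weights

        # Model-averaged prediction of, e.g., incremental cost (hypothetical values).
        model_predictions = np.array([2450.0, 2610.0, 2380.0])
        averaged = np.dot(weights, model_predictions)
        print("weights:", np.round(weights, 3), "averaged prediction:", round(averaged, 1))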

  20. Shock Layer Radiation Modeling and Uncertainty for Mars Entry

    NASA Technical Reports Server (NTRS)

    Johnston, Christopher O.; Brandis, Aaron M.; Sutton, Kenneth

    2012-01-01

    A model for simulating nonequilibrium radiation from Mars entry shock layers is presented. A new chemical kinetic rate model is developed that provides good agreement with recent EAST and X2 shock tube radiation measurements. This model includes a CO dissociation rate that is a factor of 13 larger than the rate used widely in previous models. Uncertainties in the proposed rates are assessed along with uncertainties in translational-vibrational relaxation modeling parameters. The stagnation point radiative flux uncertainty due to these flowfield modeling parameter uncertainties is computed to vary from 50 to 200% for a range of free-stream conditions, with densities ranging from 5e-5 to 5e-4 kg/m3 and velocities ranging from 6.3 to 7.7 km/s. These conditions cover the range of anticipated peak radiative heating conditions for proposed hypersonic inflatable aerodynamic decelerators (HIADs). Modeling parameters for the radiative spectrum are compiled along with a non-Boltzmann rate model for the dominant radiating molecules, CO, CN, and C2. A method for treating non-local absorption in the non-Boltzmann model is developed, which is shown to result in up to a 50% increase in the radiative flux through absorption by the CO 4th Positive band. The sensitivity of the radiative flux to the radiation modeling parameters is presented and the uncertainty for each parameter is assessed. The stagnation point radiative flux uncertainty due to these radiation modeling parameter uncertainties is computed to vary from 18 to 167% for the considered range of free-stream conditions. The total radiative flux uncertainty is computed as the root sum square of the flowfield and radiation parametric uncertainties, which results in total uncertainties ranging from 50 to 260%. The main contributors to these significant uncertainties are the CO dissociation rate and the CO heavy-particle excitation rates. Applying the baseline flowfield and radiation models developed in this work, the radiative heating for the Mars Pathfinder probe is predicted to be nearly 20 W/cm2. In contrast to previous studies, this value is shown to be significant relative to the convective heating.
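    The root-sum-square combination quoted above can be reproduced with the upper-end values as a quick check:

        import numpy as np

        # Root-sum-square combination of independent parametric uncertainties (%),
        # using the upper-end values quoted above as an illustration.
        flowfield_pct = 200.0
        radiation_pct = 167.0

        total_pct = np.sqrt(flowfield_pct**2 + radiation_pct**2)
        print(f"total radiative-flux uncertainty ~ {total_pct:.0f}%")  # ~260%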

  1. Climate data induced uncertainty in model based estimations of terrestrial primary productivity

    NASA Astrophysics Data System (ADS)

    Wu, Z.; Ahlström, A.; Smith, B.; Ardö, J.; Eklundh, L.; Fensholt, R.; Lehsten, V.

    2016-12-01

    Models used to project global vegetation and the carbon cycle differ in their estimates of historical fluxes and pools. These differences arise not only from differences between models but also from differences in the environmental and climatic data that force the models. Here we investigate the role of uncertainties in historical climate data, encapsulated by a set of six historical climate datasets. We focus on terrestrial gross primary productivity (GPP) and analyze the results from a dynamic process-based vegetation model (LPJ-GUESS) forced by six different climate datasets and two empirical datasets of GPP (derived from flux towers and remote sensing). We find that the climate-induced uncertainty, defined as the difference among historical simulations in GPP when forcing the model with the different climate datasets, can be as high as 33 Pg C yr-1 globally (19% of mean GPP). The uncertainty is partitioned into the three main climatic drivers: temperature, precipitation, and shortwave radiation. Additionally, we illustrate how the uncertainty due to a given climate driver depends on both the magnitude of the forcing data uncertainty (the data range) and the sensitivity of the modeled GPP to the driver (the ecosystem sensitivity). The analysis is performed globally and stratified into five land cover classes. We find that the dynamic vegetation model overestimates GPP, compared to empirically based GPP data, over most areas except the tropical region. Both the simulations and empirical estimates agree that the tropical region is a disproportionate source of uncertainty in GPP estimation. This is mainly caused by uncertainties in shortwave radiation forcing, of which the climate data range contributes slightly more uncertainty than the ecosystem sensitivity to shortwave radiation. We also find that precipitation dominates the climate-induced uncertainty over nearly half of the terrestrial vegetated surface, mainly due to large ecosystem sensitivity to precipitation. Overall, climate data ranges are found to contribute more to the climate-induced uncertainty than ecosystem sensitivity. Our study highlights the need to better constrain tropical climate and demonstrates that uncertainty caused by climatic forcing data must be considered when comparing and evaluating model results and empirical datasets.

  2. Spatially Distributed Assimilation of Remotely Sensed Leaf Area Index and Potential Evapotranspiration for Hydrologic Modeling in Wetland Landscapes

    EPA Science Inventory

    Evapotranspiration (ET), a highly dynamic flux in wetland landscapes, regulates the accuracy of surface/sub-surface runoff simulation in a hydrologic model. However, considerable uncertainty in simulating ET-related processes remains, including our limited ability to incorporate ...

  3. BEST MANAGEMENT PRACTICES FOR THE CONTROL OF NUTRIENTS FROM URBAN NONPOINT SOURCES

    EPA Science Inventory

    While the costs and benefits associated with the point source control of nutrients are relatively well defined, considerable uncertainties remain in the efficiency and long-term costs associated with the best management practices (BMPs) used to reduce loads from nonpoint and dif...

  4. Consideration of the FQPA Safety Factor and Other Uncertainty Factors in Cumulative Risk Assessment of Chemicals Sharing a Common Mechanism of Toxicity

    EPA Pesticide Factsheets

    This guidance document provides OPP's current thinking on application of the provision in FFDCA about an additional safety factor for the protection of infants and children in the context of cumulative risk assessments.

  5. The role of integrative, whole organism testing in monitoring applications: Back to the future

    EPA Science Inventory

    The biological effects of chemicals released to surface waters continue to be an area of uncertainty in risk assessment and risk management. Based on conventional risk assessment considerations, adequate exposure and effects information are required to reach a scientifically soun...

  6. Bayesian models for comparative analysis integrating phylogenetic uncertainty.

    PubMed

    de Villemereuil, Pierre; Wells, Jessie A; Edwards, Robert D; Blomberg, Simon P

    2012-06-28

    Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible general purpose tool for phylogenetic comparative analyses, particularly for modelling in the face of phylogenetic uncertainty and accounting for measurement error or individual variation in explanatory variables. Code for all models is provided in the BUGS model description language.

  7. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    PubMed Central

    2012-01-01

    Background Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible general purpose tool for phylogenetic comparative analyses, particularly for modelling in the face of phylogenetic uncertainty and accounting for measurement error or individual variation in explanatory variables. Code for all models is provided in the BUGS model description language. PMID:22741602

  8. Impact of 4D image quality on the accuracy of target definition.

    PubMed

    Nielsen, Tine Bjørn; Hansen, Christian Rønn; Westberg, Jonas; Hansen, Olfred; Brink, Carsten

    2016-03-01

    Delineation accuracy of target shape and position depends on the image quality. This study investigates whether the image quality on standard 4D systems has an influence comparable to the overall delineation uncertainty. A moving lung target was imaged using a dynamic thorax phantom on three different 4D computed tomography (CT) systems and a 4D cone beam CT (CBCT) system using pre-defined clinical scanning protocols. Peak-to-peak motion and target volume were registered using rigid registration and automatic delineation, respectively. A spatial distribution of the imaging uncertainty was calculated as the distance deviation between the imaged target and the true target shape. The measured motions were smaller than the actual motions. There were volume differences of the imaged target between respiration phases. Imaging uncertainties of >0.4 cm were measured in the motion direction, which showed that there was a large distortion of the imaged target shape. Imaging uncertainties of standard 4D systems are of similar size as typical GTV-CTV expansions (0.5-1 cm) and contribute considerably to the target definition uncertainty. Optimising and validating 4D systems is recommended in order to obtain an optimal imaged target shape.

  9. Gully erosion in the Caatinga biome, Brazil: measurement and stochastic modelling

    NASA Astrophysics Data System (ADS)

    Lima Alencar, Pedro Henrique; de Araújo, José Carlos; Nonato Távora Costa, Raimundo

    2017-04-01

    In contrast with inter-rill erosion, which takes a long time to modify the terrain form, gully erosion can rapidly and severely change the landscape. In the Brazilian semiarid region, a one-million km2 area that coincides with the Caatinga biome, inter-rill erosion prevails due to the silty shallow soils. However, gully erosion does occur in the Caatinga, with increasing severity over time. This source of sediment impacts the existing dense network of small dams, generating significant deleterious effects, such as reduced water availability in a drought-prone region. This study focuses on the Madalena basin (124 km2, state of Ceará, Brazil), a land-reform settlement with 20 inhabitants per km2, whose main economic activities are agriculture (especially Zea mays), livestock and fishing. In the catchment area, where there are 12 dams (with storage capacity ranging from 6×10^4 to 2×10^7 m3), gully erosion has become an issue due to its increasing occurrence. Eight gully-erosion sites have been identified in the basin, but most of them have not yet reached great dimensions (depth and/or width), nor interacted with groundwater, and are therefore classified as ephemeral gullies. We selected the three most relevant sites and measured the topography of the eroded channels, as well as the neighboring terrain relief, using accurate total stations and an unmanned aerial vehicle. The data were processed with the help of software such as DataGeosis (Office 7.5) and Surfer (11.0), providing information on gully erosion in terms of (μ ± σ): projection area (317±165 m2), eroded mass (61±36 Mg) and volume (42±25 m3), length (38±6 m), maximum depth (0.58±0.13 m) and maximum width (6.00±2.35 m). The measured data are then compared with those provided by the Foster and Lane model (1986). The model generated results with considerable scatter. This is possibly due to uncertainties in the field parameters, which are neglected in the deterministic approach of most physically-based models. We propose that the gully-erosion model approach consider the uncertainties of its main parameters/variables (e.g., soil density, soil grain-size distribution and peak discharge) and generate a histogram of responses, rather than a single deterministic value. The principle of maximum entropy should be used to derive the probability density functions of the uncertainty content of parameters and variables.
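    A minimal sketch of the proposed stochastic treatment, with a placeholder response standing in for the Foster and Lane model and invented parameter distributions (maximum-entropy choices for the stated information: normal for mean/sd, uniform for bounded ranges):

        import numpy as np

        rng = np.random.default_rng(42)
        n = 10_000

        # Hypothetical uncertain inputs.
        soil_density = rng.normal(1.45, 0.10, n)        # Mg/m^3
        peak_discharge = rng.uniform(0.8, 2.5, n)       # m^3/s

        # Placeholder erosion response standing in for the Foster and Lane model.
        eroded_volume = 12.0 * peak_discharge**0.9 / soil_density

        # Histogram of responses instead of a single deterministic value.
        counts, edges = np.histogram(eroded_volume, bins=20)
        peak = counts.argmax()
        print(f"modal response bin: {edges[peak]:.1f}-{edges[peak + 1]:.1f} m^3")
        print(f"eroded volume: {eroded_volume.mean():.1f} +/- {eroded_volume.std():.1f} m^3")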

  10. Quantitative body DW-MRI biomarkers uncertainty estimation using unscented wild-bootstrap.

    PubMed

    Freiman, M; Voss, S D; Mulkern, R V; Perez-Rossello, J M; Warfield, S K

    2011-01-01

    We present a new method for the uncertainty estimation of diffusion parameters for quantitative body DW-MRI assessment. Diffusion parameters uncertainty estimation from DW-MRI is necessary for clinical applications that use these parameters to assess pathology. However, uncertainty estimation using traditional techniques requires repeated acquisitions, which is undesirable in routine clinical use. Model-based bootstrap techniques, for example, assume an underlying linear model for residuals rescaling and cannot be utilized directly for body diffusion parameters uncertainty estimation due to the non-linearity of the body diffusion model. To offset this limitation, our method uses the Unscented transform to compute the residuals rescaling parameters from the non-linear body diffusion model, and then applies the wild-bootstrap method to infer the body diffusion parameters uncertainty. Validation through phantom and human subject experiments shows that our method correctly identifies the regions with higher uncertainty in body DW-MRI model parameters, with a relative error of -36% in the uncertainty values.

  11. Remaining Useful Life Estimation in Prognosis: An Uncertainty Propagation Problem

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Goebel, Kai

    2013-01-01

    The estimation of remaining useful life is significant in the context of prognostics and health monitoring, and the prediction of remaining useful life is essential for online operations and decision-making. However, it is challenging to accurately predict the remaining useful life in practical aerospace applications due to the presence of various uncertainties that affect prognostic calculations, and in turn, render the remaining useful life prediction uncertain. It is challenging to identify and characterize the various sources of uncertainty in prognosis, understand how each of these sources of uncertainty affect the uncertainty in the remaining useful life prediction, and thereby compute the overall uncertainty in the remaining useful life prediction. In order to achieve these goals, this paper proposes that the task of estimating the remaining useful life must be approached as an uncertainty propagation problem. In this context, uncertainty propagation methods which are available in the literature are reviewed, and their applicability to prognostics and health monitoring are discussed.
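    A minimal sketch of remaining useful life (RUL) estimation as uncertainty propagation, assuming a hypothetical linear degradation model and plain Monte Carlo sampling:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 50_000

        # Hypothetical linear degradation: health h(t) = h0 - rate * t, failure at h = 0.
        h0 = rng.normal(1.0, 0.05, n)                # uncertain current health state
        rate = rng.lognormal(np.log(0.01), 0.3, n)   # uncertain degradation rate per hour

        rul_hours = h0 / rate                        # propagate input uncertainty to RUL

        lo, hi = np.percentile(rul_hours, [5, 95])
        print(f"median RUL = {np.median(rul_hours):.0f} h, 90% interval = [{lo:.0f}, {hi:.0f}] h")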

  12. Instrumentation-related uncertainty of reflectance and transmittance measurements with a two-channel spectrophotometer.

    PubMed

    Peest, Christian; Schinke, Carsten; Brendel, Rolf; Schmidt, Jan; Bothe, Karsten

    2017-01-01

    Spectrophotometers are operated in numerous fields of science and industry for a variety of applications. In order to provide confidence for the measured data, analyzing the associated uncertainty is valuable. However, the uncertainty of the measurement results is often unknown or reduced to sample-related contributions. In this paper, we describe our approach for the systematic determination of the measurement uncertainty of the commercially available two-channel spectrophotometer Agilent Cary 5000 in accordance with the Guide to the expression of uncertainty in measurements. We focus on the instrumentation-related uncertainty contributions rather than the specific application and thus outline a general procedure which can be adapted for other instruments. Moreover, we discover a systematic signal deviation due to the inertia of the measurement amplifier and develop and apply a correction procedure. Thereby we increase the usable dynamic range of the instrument by more than one order of magnitude. We present methods for the quantification of the uncertainty contributions and combine them into an uncertainty budget for the device.

  13. Assessment of adaptation measures to high-mountain risks in Switzerland under climate uncertainties

    NASA Astrophysics Data System (ADS)

    Muccione, Veruska; Lontzek, Thomas; Huggel, Christian; Ott, Philipp; Salzmann, Nadine

    2015-04-01

    The economic evaluation of different adaptation options is important to support policy-makers that need to set priorities in the decision-making process. However, the decision-making process faces considerable uncertainties regarding current and projected climate impacts. First, physical climate and related impact systems are highly complex and not fully understood. Second, the further we look into the future, the more important the emission pathways become, with effects on the frequency and severity of climate impacts. Decisions on adaptation measures taken today and in the future must adequately consider the uncertainties originating from the different sources. Decisions are not taken in a vacuum but always in the context of specific social, economic, institutional and political conditions. Decision finding processes strongly depend on the socio-political system and usually have evolved over some time. Finding and taking decisions in the respective socio-political and economic context multiplies the uncertainty challenge. Our presumption is that a sound assessment of the different adaptation options in Switzerland under uncertainty necessitates formulating and solving a dynamic, stochastic optimization problem. Economic optimization models in the field of climate change are not new. Typically, such models are applied for global-scale studies but rarely for local-scale problems. In this analysis, we considered the case of the Guttannen-Grimsel Valley, situated in the Swiss Bernese Alps. The alpine community has been affected by high-magnitude, high-frequency debris flows that started in 2009 and were historically unprecedented. They were related to thaw of permafrost in the rock slopes of Ritzlihorn and repeated rock fall events that accumulated at the debris fan, formed a sediment source for debris flows and were transported downvalley. An important transit road, a trans-European gas pipeline and settlements were severely affected and partly destroyed. Several adaptation measures were discussed by the responsible authorities, but decision making is particularly challenging under multiple uncertainties. For this area, we developed a stochastic optimization model for concrete, real-case adaptation options and measures and use dynamic programming to explore the optimal adaptation decisions in the face of uncertain impacts of climate change on debris flows and flooding. Even though simplifications had to be made, the results produced were concrete and tangible, indicating that, under our assumptions and modeling, excavation is a preferable adaptation option in comparison to building a dam or relocation. This is not necessarily intuitive and adds an additional perspective to what has so far been sketched and evaluated by cantonal and communal authorities for Guttannen. Moreover, the building of an alternative cantonal road appears to be more expensive than the costs incurred due to road closure.
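    A heavily simplified, expected-cost version of such a comparison (not the stochastic dynamic programme used in the study; all costs, probabilities and effectiveness values are invented) illustrates why a cheap but reasonably effective measure such as excavation can come out ahead:

        # Hypothetical one-off costs (million CHF) of the candidate measures.
        options = {"excavation": 4.0, "dam": 9.0, "relocation": 14.0}

        p_event = 0.25                 # assumed annual probability of a damaging debris flow
        damage_if_unprotected = 6.0    # assumed expected loss per event (million CHF)
        effectiveness = {"excavation": 0.8, "dam": 0.9, "relocation": 1.0}
        horizon_years = 30

        def expected_total_cost(option):
            # Up-front cost plus residual expected damage over the planning horizon.
            residual = (1.0 - effectiveness[option]) * p_event * damage_if_unprotected
            return options[option] + horizon_years * residual

        best = min(options, key=expected_total_cost)
        for o in options:
            print(f"{o:10s} expected cost = {expected_total_cost(o):.1f}")
        print("preferred option:", best)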

  14. Uncertainties in Integrated Climate Change Impact Assessments by Sub-setting GCMs Based on Annual as well as Crop Growing Period under Rice Based Farming System of Indo-Gangetic Plains of India

    NASA Astrophysics Data System (ADS)

    Pillai, S. N.; Singh, H.; Panwar, A. S.; Meena, M. S.; Singh, S. V.; Singh, B.; Paudel, G. P.; Baigorria, G. A.; Ruane, A. C.; McDermid, S.; Boote, K. J.; Porter, C.; Valdivia, R. O.

    2016-12-01

    Integrated assessment of climate change impact on agricultural productivity is a challenge to the scientific community due to uncertainties in input data, particularly the climate, soil, crop calibration and socio-economic datasets. However, the uncertainty due to the selection of GCMs is the major source, owing to the complex underlying processes involved in the initial and boundary conditions used in solving the air-sea interactions. Under the Agricultural Model Intercomparison and Improvement Project (AgMIP), the Indo-Gangetic Plains Regional Research Team investigated the uncertainties caused by the selection of GCMs through sub-setting based on the annual as well as the crop-growth period of rice-wheat systems in the AgMIP Integrated Assessment methodology. The AgMIP Phase II protocols were used to study the linking of climate, crop and economic models for two study sites, Meerut and Karnal, to analyse the sensitivity of current production systems to climate change. Climate change projections were made using 29 CMIP5 GCMs under RCP4.5 and RCP8.5 for the mid-century period (2040-2069). Two crop models (APSIM & DSSAT) were used. The TOA-MD economic model was used for integrated assessment. Based on RAPs (Representative Agricultural Pathways), some parameters that cannot be obtained through modeling were derived from the literature and from interactions with stakeholders and incorporated into the TOA-MD model for integrated assessment.

  15. QUANTIFYING UNCERTAINTY DUE TO RANDOM ERRORS FOR MOMENT ANALYSES OF BREAKTHROUGH CURVES

    EPA Science Inventory

    The uncertainty in moments calculated from breakthrough curves (BTCs) is investigated as a function of random measurement errors in the data used to define the BTCs. The method presented assumes moments are calculated by numerical integration using the trapezoidal rule, and is t...

  16. Evidence Theory Based Uncertainty Quantification in Radiological Risk due to Accidental Release of Radioactivity from a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Ingale, S. V.; Datta, D.

    2010-10-01

    The consequence of an accidental release of radioactivity from a nuclear power plant is assessed in terms of the exposure or dose to members of the public. Assessment of risk is routed through this dose computation. Dose computation basically depends on the underlying dose assessment model and the exposure pathways. One of the exposure pathways is the ingestion of contaminated food. The aim of the present paper is to compute the uncertainty associated with the risk to members of the public due to the ingestion of contaminated food. The governing parameters of the ingestion dose assessment model being imprecise, we apply evidence theory to compute bounds on the risk. The uncertainty is addressed by the belief and plausibility fuzzy measures.
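    A minimal Dempster-Shafer sketch of the belief/plausibility bound computation, with hypothetical focal intervals for the ingestion dose and their mass assignments:

        # Focal elements: (interval of annual dose in mSv, basic probability mass).
        focal = [((0.0, 0.5), 0.5), ((0.3, 1.0), 0.3), ((0.8, 2.0), 0.2)]

        def bel_pl(threshold):
            """Belief/plausibility that dose <= threshold, i.e. A = [0, threshold]."""
            bel = sum(m for (lo, hi), m in focal if hi <= threshold)  # interval inside A
            pl = sum(m for (lo, hi), m in focal if lo <= threshold)   # interval meets A
            return bel, pl

        bel, pl = bel_pl(1.0)
        print(f"Bel(dose <= 1 mSv) = {bel:.2f}, Pl(dose <= 1 mSv) = {pl:.2f}")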

  17. Catastrophic shifts in vegetation-soil systems may unfold rapidly or slowly independent of the rate of change in the system driver

    NASA Astrophysics Data System (ADS)

    Karssenberg, Derek; Bierkens, Marc

    2014-05-01

    Complex systems may switch between contrasting stable states under gradual change of a driver. Such critical transitions often result in considerable long-term damage because strong hysteresis impedes reversion, and the transition becomes catastrophic. Critical transitions largely reduce our capability of forecasting future system states because it is hard to predict the timing of their occurrence [2]. Moreover, for many systems it is unknown how rapidly the critical transition unfolds when the tipping point has been reached. The rate of change during collapse, however, is important information because it determines the time available to take action to reverse a shift [1]. In this study we explore the rate of change during the degradation of a vegetation-soil system on a hillslope from a state with considerable vegetation cover and large soil depths, to a state with sparse vegetation and bare rock or negligible soil depths. Using a distributed, stochastic model coupling hydrology, vegetation, weathering and water erosion, we derive two differential equations describing the vegetation and the soil system, and their interaction. Two stable states - vegetated and bare - are identified by means of analytical investigation, and it is shown that the change between these two states is a critical transition as indicated by hysteresis. Surprisingly, when the tipping point is reached under a very slow increase of grazing pressure, the transition between the vegetated and the bare state can either unfold rapidly, over a few years, or gradually, occurring over decades up to millennia. These differences in the rate of change during the transient state are explained by differences in bedrock weathering rates. This finding emphasizes the considerable uncertainty associated with forecasting catastrophic shifts in ecosystems, which is due to both difficulties in forecasting the timing of the tipping point and the rate of change when the transition unfolds. References [1] Hughes, T. P., Linares, C., Dakos, V., van de Leemput, I. a, & van Nes, E. H. (2013). Living dangerously on borrowed time during slow, unrecognized regime shifts. Trends in ecology & evolution, 28(3), 149-55. [2] Karssenberg, D., & Bierkens, M. F. P. (2012). Early-warning signals (potentially) reduce uncertainty in forecasted timing of critical shifts. Ecosphere, 3(2).

  18. Orbital Debris Shape and Orientation Effects on Ballistic Limits

    NASA Technical Reports Server (NTRS)

    Evans, Steven W.; Williamsen, Joel E.

    2005-01-01

    The SPHC hydrodynamic code was used to evaluate the effects of orbital debris particle shape and orientation on penetration of a typical spacecraft dual-wall shield. Impacts were simulated at near-normal obliquity at 12 km/sec. Debris cloud characteristics and damage potential are compared with those from impacts by spherical projectiles. Results of these simulations indicate the uncertainties in the predicted ballistic limits due to modeling uncertainty and to uncertainty in the impactor orientation.

  19. Uncertainty of the peak flow reconstruction of the 1907 flood in the Ebro River in Xerta (NE Iberian Peninsula)

    NASA Astrophysics Data System (ADS)

    Ruiz-Bellet, Josep Lluís; Castelltort, Xavier; Balasch, J. Carles; Tuset, Jordi

    2017-02-01

    There is no clear, unified and accepted method to estimate the uncertainty of hydraulic modelling results. In historical floods reconstruction, due to the lower precision of input data, the magnitude of this uncertainty could reach a high value. With the objectives of giving an estimate of the peak flow error of a typical historical flood reconstruction with the model HEC-RAS and of providing a quick, simple uncertainty assessment that an end user could easily apply, the uncertainty of the reconstructed peak flow of a major flood in the Ebro River (NE Iberian Peninsula) was calculated with a set of local sensitivity analyses on six input variables. The peak flow total error was estimated at ±31% and water height was found to be the most influential variable on peak flow, followed by Manning's n. However, the latter, due to its large uncertainty, was the greatest contributor to peak flow total error. Besides, the HEC-RAS resulting peak flow was compared to the ones obtained with the 2D model Iber and with Manning's equation; all three methods gave similar peak flows. Manning's equation gave almost the same result as HEC-RAS. The main conclusion is that, to ensure the lowest peak flow error, the reliability and precision of the flood mark should be thoroughly assessed.
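    A minimal sketch of such a Manning's-equation check, with hypothetical cross-section values and a local sensitivity to Manning's n:

        import numpy as np

        def manning_q(n, area_m2, hydraulic_radius_m, slope):
            """Manning's equation: Q = (1/n) * A * R^(2/3) * S^(1/2)."""
            return (1.0 / n) * area_m2 * hydraulic_radius_m**(2.0 / 3.0) * np.sqrt(slope)

        # Hypothetical cross-section reconstructed from a flood mark.
        q = manning_q(n=0.035, area_m2=2200.0, hydraulic_radius_m=6.5, slope=0.0008)

        # Local sensitivity to Manning's n: +/-20% uncertainty in n.
        q_lo = manning_q(0.035 * 1.2, 2200.0, 6.5, 0.0008)
        q_hi = manning_q(0.035 * 0.8, 2200.0, 6.5, 0.0008)
        print(f"Q = {q:.0f} m^3/s, range from n uncertainty: [{q_lo:.0f}, {q_hi:.0f}]")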

  20. Safety envelope for load tolerance of structural element design based on multi-stage testing

    DOE PAGES

    Park, Chanyoung; Kim, Nam H.

    2016-09-06

    Structural elements, such as stiffened panels and lap joints, are basic components of aircraft structures. For aircraft structural design, designers select predesigned elements satisfying the design load requirement based on their load-carrying capabilities. Therefore, estimating the safety envelope of structural elements for load tolerance would be a good investment for design purposes. In this article, a method for estimating the safety envelope is presented using probabilistic classification, which can estimate a specific level of failure probability under both aleatory and epistemic uncertainties. An important contribution of this article is that the calculation uncertainty is reflected in building a safety envelope using a Gaussian process, and the effect of element test data on reducing the calculation uncertainty is incorporated by updating the Gaussian process model with the element test data. It is shown that even one element test can significantly reduce the calculation uncertainty due to lack of knowledge of the actual physics, so that conservativeness in a safety envelope is significantly reduced. The proposed approach was demonstrated with a cantilever beam example, which represents a structural element. The example shows that calculation uncertainty provides about 93% conservativeness against the uncertainty due to a few element tests. As a result, it is shown that even a single element test can increase the load tolerance modeled with the safety envelope by 20%.

  1. Quantifying and Reducing Curve-Fitting Uncertainty in Isc

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-06-14

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
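    A minimal sketch of the underlying straight-line regression near Isc (ordinary least squares rather than the paper's objective Bayesian fit; the I-V points are hypothetical), including the intercept uncertainty from the fit covariance:

        import numpy as np

        # Hypothetical I-V points near short circuit (V in volts, I in amps).
        v = np.array([0.00, 0.02, 0.04, 0.06, 0.08, 0.10])
        i = np.array([9.125, 9.121, 9.118, 9.112, 9.109, 9.103])

        # Ordinary least-squares straight-line fit I(V) = a + b*V; Isc = I(0) = a.
        X = np.column_stack([np.ones_like(v), v])
        coef, res, *_ = np.linalg.lstsq(X, i, rcond=None)
        isc = coef[0]

        # Standard uncertainty of the intercept from the fit covariance.
        dof = len(v) - 2
        sigma2 = res[0] / dof
        cov = sigma2 * np.linalg.inv(X.T @ X)
        print(f"Isc = {isc:.4f} A +/- {np.sqrt(cov[0, 0]):.4f} A")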

  2. Matching experimental and three dimensional numerical models for structural vibration problems with uncertainties

    NASA Astrophysics Data System (ADS)

    Langer, P.; Sepahvand, K.; Guist, C.; Bär, J.; Peplow, A.; Marburg, S.

    2018-03-01

    A simulation model that examines the dynamic behavior of real structures needs to address the impact of uncertainty in both geometry and material parameters. This article investigates three-dimensional finite element models for structural dynamics problems with respect to both model and parameter uncertainties. The parameter uncertainties are determined via laboratory measurements on several beam-like samples. The parameters are then treated as random variables in the finite element model in order to explore the effects of uncertainty on the quality of the model outputs, i.e. the natural frequencies. The accuracy of the output predictions from the model is compared with the experimental results. To this end, non-contact experimental modal analysis is conducted to identify the natural frequencies of the samples. The results show good agreement with the experimental data. Furthermore, it is demonstrated that geometrical uncertainties have more influence on the natural frequencies than material parameters, even though the material uncertainties are about two times larger than the geometrical ones. This gives valuable insights for improving the finite element model, given the various parameter ranges required in a modeling process involving uncertainty.

  3. Quantifying and Reducing Curve-Fitting Uncertainty in Isc: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.

  4. Potential uncertainty reduction in model-averaged benchmark dose estimates informed by an additional dose study.

    PubMed

    Shao, Kan; Small, Mitchell J

    2011-10-01

    A methodology is presented for assessing the information value of an additional dosage experiment in existing bioassay studies. The analysis demonstrates the potential reduction in the uncertainty of toxicity metrics derived from expanded studies, providing insights for future studies. Bayesian methods are used to fit alternative dose-response models using Markov chain Monte Carlo (MCMC) simulation for parameter estimation and Bayesian model averaging (BMA) is used to compare and combine the alternative models. BMA predictions for benchmark dose (BMD) are developed, with uncertainty in these predictions used to derive the lower bound BMDL. The MCMC and BMA results provide a basis for a subsequent Monte Carlo analysis that backcasts the dosage where an additional test group would have been most beneficial in reducing the uncertainty in the BMD prediction, along with the magnitude of the expected uncertainty reduction. Uncertainty reductions are measured in terms of reduced interval widths of predicted BMD values and increases in BMDL values that occur as a result of this reduced uncertainty. The methodology is illustrated using two existing data sets for TCDD carcinogenicity, fitted with two alternative dose-response models (logistic and quantal-linear). The example shows that an additional dose at a relatively high value would have been most effective for reducing the uncertainty in BMA BMD estimates, with predicted reductions in the widths of uncertainty intervals of approximately 30%, and expected increases in BMDL values of 5-10%. The results demonstrate that dose selection for studies that subsequently inform dose-response models can benefit from consideration of how these models will be fit, combined, and interpreted. © 2011 Society for Risk Analysis.

  5. Evaluation of habitat suitability index models by global sensitivity and uncertainty analyses: a case study for submerged aquatic vegetation

    USGS Publications Warehouse

    Zajac, Zuzanna; Stith, Bradley M.; Bowling, Andrea C.; Langtimm, Catherine A.; Swain, Eric D.

    2015-01-01

    Habitat suitability index (HSI) models are commonly used to predict habitat quality and species distributions and are used to develop biological surveys, assess reserve and management priorities, and anticipate possible change under different management or climate change scenarios. Important management decisions may be based on model results, often without a clear understanding of the level of uncertainty associated with model outputs. We present an integrated methodology to assess the propagation of uncertainty from both inputs and structure of the HSI models on model outputs (uncertainty analysis: UA) and relative importance of uncertain model inputs and their interactions on the model output uncertainty (global sensitivity analysis: GSA). We illustrate the GSA/UA framework using simulated hydrology input data from a hydrodynamic model representing sea level changes and HSI models for two species of submerged aquatic vegetation (SAV) in southwest Everglades National Park: Vallisneria americana (tape grass) and Halodule wrightii (shoal grass). We found considerable spatial variation in uncertainty for both species, but distributions of HSI scores still allowed discrimination of sites with good versus poor conditions. Ranking of input parameter sensitivities also varied spatially for both species, with high habitat quality sites showing higher sensitivity to different parameters than low-quality sites. HSI models may be especially useful when species distribution data are unavailable, providing means of exploiting widely available environmental datasets to model past, current, and future habitat conditions. The GSA/UA approach provides a general method for better understanding HSI model dynamics, the spatial and temporal variation in uncertainties, and the parameters that contribute most to model uncertainty. Including an uncertainty and sensitivity analysis in modeling efforts as part of the decision-making framework will result in better-informed, more robust decisions.
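    A toy variance-based (Sobol-style) sensitivity sketch of the GSA idea; the HSI function, inputs and ranges below are invented stand-ins for the SAV models:

        import numpy as np

        rng = np.random.default_rng(7)
        n_outer, n_inner = 200, 200

        def hsi(depth_m, salinity_psu):
            """Toy habitat suitability index standing in for the SAV models."""
            s_depth = np.exp(-((depth_m - 1.0) ** 2) / 0.5)
            s_sal = np.clip(1.0 - salinity_psu / 40.0, 0.0, 1.0)
            return s_depth * s_sal

        def sample(n):
            return rng.uniform(0.2, 3.0, n), rng.uniform(0.0, 35.0, n)

        # Total output variance from a plain Monte Carlo run.
        d, s = sample(20_000)
        var_total = hsi(d, s).var()

        # First-order index of depth: variance of the mean HSI conditional on depth.
        cond_means = []
        for _ in range(n_outer):
            d_fixed = rng.uniform(0.2, 3.0)
            _, s_in = sample(n_inner)
            cond_means.append(hsi(d_fixed, s_in).mean())
        print("S1(depth) ~", np.var(cond_means) / var_total)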

  6. Equifinality and process-based modelling

    NASA Astrophysics Data System (ADS)

    Khatami, S.; Peel, M. C.; Peterson, T. J.; Western, A. W.

    2017-12-01

    Equifinality is understood as one of the fundamental difficulties in the study of open complex systems, including catchment hydrology. A review of the hydrologic literature reveals that the term equifinality has been widely used, but in many cases inconsistently and without coherent recognition of the various facets of equifinality, which can lead to ambiguity but also methodological fallacies. Therefore, in this study we first characterise the term equifinality within the context of hydrological modelling by reviewing the genesis of the concept of equifinality and then presenting a theoretical framework. During past decades, equifinality has mainly been studied as a subset of aleatory (arising due to randomness) uncertainty and for the assessment of model parameter uncertainty. Although the connection between parameter uncertainty and equifinality is undeniable, we argue there is more to equifinality than just aleatory parameter uncertainty. That is, the importance of equifinality and epistemic uncertainty (arising due to lack of knowledge) and their implications are overlooked in our current practice of model evaluation. Equifinality and epistemic uncertainty in studying, modelling, and evaluating hydrologic processes are treated as if they can be simply discussed in (or often reduced to) probabilistic terms (as for aleatory uncertainty). The deficiencies of this approach to conceptual rainfall-runoff modelling are demonstrated for selected Australian catchments by examination of parameter and internal flux distributions and interactions within SIMHYD. On this basis, we present a new approach that expands the equifinality concept beyond model parameters to inform epistemic uncertainty. The new approach potentially facilitates the identification and development of more physically plausible models and model evaluation schemes, particularly within the multiple working hypotheses framework, and is generalisable to other fields of environmental modelling as well.

  7. Uncertainty in predicting soil hydraulic properties at the hillslope scale with indirect methods

    NASA Astrophysics Data System (ADS)

    Chirico, G. B.; Medina, H.; Romano, N.

    2007-02-01

    Several hydrological applications require the characterisation of soil hydraulic properties at large spatial scales. Pedotransfer functions (PTFs) are being developed as simplified methods to estimate soil hydraulic properties as an alternative to direct measurements, which are unfeasible in most practical circumstances. The objective of this study is to quantify the uncertainty in PTF spatial predictions at the hillslope scale, as related to sampling density, due to: (i) errors in the estimated soil physico-chemical properties and (ii) PTF model error. The analysis is carried out on a 2-km-long experimental hillslope in southern Italy. The method adopted is based on a stochastic generation of patterns of soil variables using sequential Gaussian simulation, conditioned on the observed sample data. The following PTFs are applied: Vereecken's PTF [Vereecken, H., Diels, J., van Orshoven, J., Feyen, J., Bouma, J., 1992. Functional evaluation of pedotransfer functions for the estimation of soil hydraulic properties. Soil Sci. Soc. Am. J. 56, 1371-1378] and the HYPRES PTF [Wösten, J.H.M., Lilly, A., Nemes, A., Le Bas, C., 1999. Development and use of a database of hydraulic properties of European soils. Geoderma 90, 169-185]. Both PTFs reliably estimate the soil water retention characteristic even for a relatively coarse sampling resolution, with prediction uncertainties comparable to the uncertainties in direct laboratory or field measurements. The uncertainty in soil water retention prediction due to the model error is as significant as, or more significant than, the uncertainty associated with the estimated input, even for a relatively coarse sampling resolution. Prediction uncertainties are much more important when PTFs are applied to estimate the saturated hydraulic conductivity. In this case model error dominates the overall prediction uncertainty, rendering the effect of the input error negligible.
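
    A minimal sketch of the two uncertainty sources, assuming a made-up linear PTF rather than the Vereecken or HYPRES coefficients: measurement error in the inputs is propagated by Monte Carlo and compared against an assumed PTF residual (model) error.

    ```python
    # Compare input-error and model-error contributions to the variance
    # of a PTF prediction; all coefficients and error magnitudes invented.
    import numpy as np

    rng = np.random.default_rng(2)

    def ptf(clay, bulk_density):
        # Hypothetical linear PTF for water content at field capacity.
        return 0.10 + 0.004 * clay - 0.03 * bulk_density

    n = 50_000
    clay = rng.normal(25.0, 2.0, n)          # % clay, with input error
    bd = rng.normal(1.45, 0.05, n)           # g/cm3, with input error
    model_err = rng.normal(0.0, 0.02, n)     # assumed PTF residual sd

    theta_input_only = ptf(clay, bd)               # input error alone
    theta_total = theta_input_only + model_err     # input + model error
    print(f"input-error sd : {theta_input_only.std():.4f}")
    print(f"model-error sd : {model_err.std():.4f}")
    print(f"total sd       : {theta_total.std():.4f}  "
          "(variances add when the errors are independent)")
    ```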

  8. Distress Due to Prognostic Uncertainty in Palliative Care: Frequency, Distribution, and Outcomes among Hospitalized Patients with Advanced Cancer.

    PubMed

    Gramling, Robert; Stanek, Susan; Han, Paul K J; Duberstein, Paul; Quill, Tim E; Temel, Jennifer S; Alexander, Stewart C; Anderson, Wendy G; Ladwig, Susan; Norton, Sally A

    2018-03-01

    Prognostic uncertainty is common in advanced cancer and frequently addressed during palliative care consultation, yet we know little about its impact on quality of life (QOL). We describe the prevalence and distribution of distress due to prognostic uncertainty among hospitalized patients with advanced cancer before palliative care consultation, and we evaluate the association between this type of distress and overall QOL before and after palliative care consultation. We conducted an observational cohort study of hospitalized patients with advanced cancer who received a palliative care consultation at two geographically distant academic medical centers. At the time of enrollment, before palliative care consultation, we asked participants: "Over the past two days, how much have you been bothered by uncertainty about what to expect from the course of your illness?" (Not at all/Slightly/Moderately/Quite a Bit/Extremely). We defined responses of "Quite a bit" and "Extremely" to be indicative of substantial distress. Two hundred thirty-six participants completed the baseline assessment. Seventy-seven percent reported being at least moderately bothered by prognostic uncertainty and half reported substantial distress. Compared with others, those who were distressed by prognostic uncertainty (118/236) reported poorer overall QOL before palliative care consultation (mean QOL 3.8 out of 10 vs. 5.3 out of 10, p < 0.001) and greater improvement in QOL following consultation (adjusted difference in mean QOL change = 1.1; 95% confidence interval = 0.2, 2.0). Prognostic uncertainty is a prevalent source of distress among hospitalized patients with advanced cancer at the time of initial palliative care consultation. Distress from prognostic uncertainty is associated with lower levels of preconsultation QOL and with greater pre-post consultation improvement in QOL.

  9. Application of identified sensitive physical parameters in reducing the uncertainty of numerical simulation

    NASA Astrophysics Data System (ADS)

    Sun, Guodong; Mu, Mu

    2016-04-01

    An important source of uncertainty in numerical simulations resides in the parameters describing physical processes in numerical models. There are many such parameters in atmospheric and oceanic models, and it would cost a great deal to reduce the uncertainties in all of them. Identifying a subset of relatively sensitive and important parameters and reducing their errors would therefore be a far more efficient way to reduce the uncertainties involved in simulations. In this context, we present a new approach based on the conditional nonlinear optimal perturbation related to parameter (CNOP-P) method. The approach provides a framework for ascertaining the subset of relatively sensitive and important physical parameters. The Lund-Potsdam-Jena (LPJ) dynamical global vegetation model was utilized to test the validity of the new approach. The results imply that nonlinear interactions among parameters play a key role in the uncertainty of numerical simulations in arid and semi-arid regions of China compared to those in northern, northeastern and southern China. The uncertainties in the numerical simulations were reduced considerably by reducing the errors of this subset of parameters. The results demonstrate that our approach not only offers a new route to identifying relatively sensitive and important physical parameters but also makes it viable to apply "target observations" to reduce the uncertainties in model parameters.
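
    The core of the CNOP-P idea can be sketched as a bounded optimization: among all parameter perturbations within a prescribed magnitude, find the one that maximizes the departure of the simulation from the reference run. The toy model, bounds, and cost function below are illustrative assumptions, not the LPJ setup.

    ```python
    # CNOP-P-style worst-case parameter perturbation on a toy model.
    import numpy as np
    from scipy.optimize import minimize

    def simulate(params, n=100):
        # Toy nonlinear response to two parameters.
        a, b = params
        t = np.linspace(0.0, 1.0, n)
        return a * np.exp(b * t) / (1.0 + a * t)

    p_ref = np.array([0.8, 1.2])
    y_ref = simulate(p_ref)
    delta = np.array([0.2, 0.3])         # allowed perturbation magnitudes

    def neg_cost(dp):
        # Negative departure, so that minimizing maximizes the departure.
        return -np.sum((simulate(p_ref + dp) - y_ref) ** 2)

    # Start away from zero, where the gradient of the cost vanishes.
    res = minimize(neg_cost, x0=0.5 * delta,
                   bounds=list(zip(-delta, delta)), method="L-BFGS-B")
    print("worst-case perturbation (CNOP-P):", res.x.round(3))
    print("cost at optimum:", -res.fun)
    ```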

  10. Common but unappreciated sources of error in one, two, and multiple-color pyrometry

    NASA Technical Reports Server (NTRS)

    Spjut, R. Erik

    1988-01-01

    The most common sources of error in optical pyrometry are examined. They can be classified as noise and uncertainty errors, stray radiation errors, or speed-of-response errors. Through judicious choice of detectors and optical wavelengths the effect of noise errors can be minimized, but one should strive to determine as many of the system properties as possible. Careful design of the optical-collection system can minimize stray radiation errors. Careful consideration must also be given to the slowest elements in a pyrometer when measuring rapid phenomena.
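
    For the two-color (ratio) case, the standard Wien-approximation formula makes the noise amplification easy to demonstrate; the wavelengths, emissivity ratio, and 1% noise level below are illustrative assumptions.

    ```python
    # Noise propagation in two-color (ratio) pyrometry under the Wien
    # approximation L = eps * c1 * lam^-5 * exp(-c2 / (lam * T)).
    import numpy as np

    C2 = 1.4388e-2                           # second radiation constant, m*K

    def ratio_temperature(i1, i2, lam1, lam2, eps_ratio=1.0):
        # Solve L1/L2 for T (ratio-pyrometry estimate).
        num = C2 * (1.0 / lam2 - 1.0 / lam1)
        den = np.log(i1 / i2) - np.log(eps_ratio) - 5.0 * np.log(lam2 / lam1)
        return num / den

    lam1, lam2 = 0.65e-6, 0.90e-6            # m, assumed wavelengths
    T_true = 2000.0
    i1 = np.exp(-C2 / (lam1 * T_true)) * lam1 ** -5
    i2 = np.exp(-C2 / (lam2 * T_true)) * lam2 ** -5

    rng = np.random.default_rng(3)
    n1 = i1 * (1.0 + rng.normal(0, 0.01, 10_000))   # 1% intensity noise
    n2 = i2 * (1.0 + rng.normal(0, 0.01, 10_000))
    T = ratio_temperature(n1, n2, lam1, lam2)
    print(f"recovered T = {T.mean():.0f} K +/- {T.std():.0f} K "
          f"from 1% noise (true {T_true:.0f} K)")
    ```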

  11. Assessing the number of fire fatalities in a defined population.

    PubMed

    Jonsson, Anders; Bergqvist, Anders; Andersson, Ragnar

    2015-12-01

    Fire-related fatalities and injuries have become a growing governmental concern in Sweden, and a national vision-zero strategy has been adopted stating that nobody should be killed or seriously injured by fire. There is considerable uncertainty, however, regarding the numbers of both deaths and injuries due to fires. Different national sources present different numbers, even for deaths, which obstructs reliable surveillance of the problem over time. We assume the situation is similar in other countries. This study seeks to assess the true number of fire-related deaths in Sweden by combining sources, and to verify the coverage of each individual source. By doing so, we also wish to demonstrate the possibilities of improved surveillance practices. Data from three national sources were collected and matched: a special database on fatal fires held by the Swedish Civil Contingencies Agency (nationally responsible for fire prevention), a database on forensic medical examinations held by the National Board of Forensic Medicine, and the cause-of-death register held by the Swedish National Board of Health and Welfare. The results disclose considerable underreporting in the individual sources. The national database on fatal fires, serving as the principal source for policy making on fire prevention matters, underestimates the true situation by 20%. Its coverage of residential fires appears to be better than that of other fires. Systematic safety work and informed policy-making presuppose access to correct and reliable numbers. By combining several different sources, as suggested in this study, the national database on fatal fires is now considerably improved and includes regular matching with complementary sources.
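
    The matching logic can be sketched with set operations, assuming hypothetical shared case identifiers (real matching would link on personal identity data): the union of the sources gives the combined estimate, and each source's coverage follows.

    ```python
    # Toy source-matching sketch: each register is a set of (hypothetical)
    # case identifiers; the union is the combined estimate of the count.
    fatal_fires_db = {1, 2, 3, 5, 8, 9, 11, 13}          # rescue-service DB
    forensic_db = {1, 2, 4, 5, 6, 8, 10, 11, 12, 13}     # forensic medicine
    cause_of_death = {2, 3, 4, 5, 7, 8, 10, 11, 13}      # death register

    union = fatal_fires_db | forensic_db | cause_of_death
    for name, src in [("fatal fires DB", fatal_fires_db),
                      ("forensic DB", forensic_db),
                      ("cause-of-death register", cause_of_death)]:
        print(f"{name}: {len(src)}/{len(union)} "
              f"= {len(src) / len(union):.0%} coverage")
    ```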

  12. 78 FR 25204 - Segregation of Lands-Renewable Energy

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-30

    ....L13400000] RIN 1004-AE19 Segregation of Lands--Renewable Energy AGENCY: Bureau of Land Management, Interior... pending solar or wind renewable energy generation project, or for public lands identified by the BLM under... consideration of renewable energy ROWs. As explained below, the BLM seeks to avoid the delays and uncertainty...

  13. Estimating US federal wildland fire managers' preferences toward competing strategic suppression objectives

    Treesearch

    David E. Calkin; Tyron Venn; Matthew Wibbenmeyer; Matthew P. Thompson

    2012-01-01

    Wildfire management involves significant complexity and uncertainty, requiring simultaneous consideration of multiple, non-commensurate objectives. This paper investigates the tradeoffs fire managers are willing to make among these objectives using a choice experiment methodology that provides three key advancements relative to previous stated-preference studies...

  14. Multiple microbial activity-based measures reflect effects of cover cropping and tillage on soils

    USDA-ARS?s Scientific Manuscript database

    Agricultural producers, conservation professionals, and policy makers are eager to learn of soil analytical techniques and data that document improvement in soil health by agricultural practices such as no-till and incorporation of cover crops. However, there is considerable uncertainty within the r...

  15. 77 FR 29257 - Registration of Copyright: Definition of Claimant

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-17

    ... considerable legal uncertainty while offering no clear benefits to the registration system. Removing it will... individuals or entities that have obtained the contractual right to claim legal title to copyright in an... author, the contractual right to claim legal title to the copyright in an application for copyright...

  16. Predicting the unpredictable: potential climate change impacts on vegetation in the Pacific Northwest

    Treesearch

    Marie Oliver; David W. Peterson; Becky Kerns

    2016-01-01

    Earth's climate is changing, as evidenced by warming temperatures, increased temperature variability, fluctuating precipitation patterns, and climate-related environmental disturbances. And with considerable uncertainty about the future, Forest Service land managers are now considering climate change adaptation in their planning efforts. They want practical...

  17. Integrability and Chaos: The Classical Uncertainty

    ERIC Educational Resources Information Center

    Masoliver, Jaume; Ros, Ana

    2011-01-01

    In recent years there has been a considerable increase in the publishing of textbooks and monographs covering what was formerly known as random or irregular deterministic motion, now referred to as deterministic chaos. There is still substantial interest in a matter that is included in many graduate and even undergraduate courses on classical…

  18. Effect of spatial image support in detecting long-term vegetation change from satellite time-series

    USDA-ARS?s Scientific Manuscript database

    Context Arid rangelands have been severely degraded over the past century. Multi-temporal remote sensing techniques are ideally suited to detect significant changes in ecosystem state; however, considerable uncertainty exists regarding the effects of changing image resolution on their ability to de...

  19. Risk-Aversion: Understanding Teachers' Resistance to Technology Integration

    ERIC Educational Resources Information Center

    Howard, Sarah K.

    2013-01-01

    Teachers who do not integrate technology are often labelled as "resistant" to change. Yet, considerable uncertainties remain about appropriate uses and actual value of technology in teaching and learning, which can make integration and change seem risky. The purpose of this article is to explore the nature of teachers' analytical and…

  20. Hydrologic influences on stream temperatures for Little Creek and Scotts Creek, Santa Cruz County, California

    Treesearch

    Justin M. Louen; Christopher G. Surfleet

    2017-01-01

    Stream temperature impacts have resulted in increased restrictions on land management, such as timber harvest and riparian restoration, creating considerable uncertainty for future planning and management of redwood (Sequoia sempervirens (D.Don) Endl.) forestlands. Challenges remain in the assessment of downstream cumulative stream...

  1. Learned Helplessness and Dyslexia: A Carts and Horses Issue?

    ERIC Educational Resources Information Center

    Kerr, H.

    2001-01-01

    Surveys attitudes towards and beliefs about dyslexia among Adult Basic Education (ABE) teachers and providers. Finds doubt, uncertainty and confusion about dyslexia and considerable misgiving. Discusses attribution theory and learned helplessness in the context of ABE. Argues that a diagnosis of dyslexia may be a maladaptive attribution and so…

  2. Consideration of vertical uncertainty in elevation-based sea-level rise assessments: Mobile Bay, Alabama case study

    USGS Publications Warehouse

    Gesch, Dean B.

    2013-01-01

    The accuracy with which coastal topography has been mapped directly affects the reliability and usefulness of elevation-based sea-level rise vulnerability assessments. Recent research has shown that the qualities of the elevation data must be well understood to properly model potential impacts. The cumulative vertical uncertainty has contributions from elevation data error, water level data uncertainties, and vertical datum and transformation uncertainties. The concepts of minimum sea-level rise increment and minimum planning timeline, important parameters for an elevation-based sea-level rise assessment, are used in recognition of the inherent vertical uncertainty of the underlying data. These concepts were applied to conduct a sea-level rise vulnerability assessment of the Mobile Bay, Alabama, region based on high-quality lidar-derived elevation data. The results that detail the area and associated resources (land cover, population, and infrastructure) vulnerable to a 1.18-m sea-level rise by the year 2100 are reported as a range of values (at the 95% confidence level) to account for the vertical uncertainty in the base data. Examination of the tabulated statistics about land cover, population, and infrastructure in the minimum and maximum vulnerable areas shows that these resources are not uniformly distributed throughout the overall vulnerable zone. The methods demonstrated in the Mobile Bay analysis provide an example of how to consider and properly account for vertical uncertainty in elevation-based sea-level rise vulnerability assessments, and the advantages of doing so.
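
    A minimal sketch of the underlying arithmetic, assuming the usual practice of combining independent vertical error components in quadrature (root sum of squares) and expanding to 95% confidence; the component magnitudes are invented, not those of the Mobile Bay data.

    ```python
    # Combine independent vertical error sources in quadrature and derive
    # a minimum usable sea-level rise increment at 95% confidence.
    import math

    components_m = {
        "lidar elevation RMSE": 0.10,
        "water-level datum": 0.05,
        "vertical datum transformation": 0.07,
    }
    cumulative = math.sqrt(sum(v ** 2 for v in components_m.values()))
    min_increment_95 = 1.96 * cumulative     # two-sided 95% confidence
    print(f"cumulative vertical uncertainty: {cumulative:.3f} m")
    print(f"minimum usable SLR increment (95%): {min_increment_95:.2f} m")
    ```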

  3. Emotion and decision-making under uncertainty: Physiological arousal predicts increased gambling during ambiguity but not risk.

    PubMed

    FeldmanHall, Oriel; Glimcher, Paul; Baker, Augustus L; Phelps, Elizabeth A

    2016-10-01

    Uncertainty, which is ubiquitous in decision-making, can be fractionated into known probabilities (risk) and unknown probabilities (ambiguity). Although research has illustrated that individuals more often avoid decisions associated with ambiguity compared to risk, it remains unclear why ambiguity is perceived as more aversive. Here we examine the role of arousal in shaping the representation of value and subsequent choice under risky and ambiguous decisions. To investigate the relationship between arousal and decisions of uncertainty, we measure skin conductance response (a quantifiable measure reflecting sympathetic nervous system arousal) during choices to gamble under risk and ambiguity. To quantify the discrete influences of risk and ambiguity sensitivity and the subjective value of each option under consideration, we model the fluctuating uncertainty as well as the amount of money that can be gained by taking the gamble. Results reveal that although arousal tracks the subjective value of a lottery regardless of uncertainty type, arousal differentially contributes to the computation of value (that is, choice) depending on whether the uncertainty is risky or ambiguous: enhanced arousal adaptively decreases risk-taking only when the lottery is highly risky but increases risk-taking when the probability of winning is ambiguous (even after controlling for subjective value). Together, this suggests that the role of arousal during decisions of uncertainty is modulatory and highly dependent on the context in which the decision is framed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
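
    One common subjective-value parameterization for lotteries under risk and ambiguity in this literature (after Levy et al., 2010) is sketched below; it is not necessarily the exact model fitted in this study, and the parameter values are arbitrary.

    ```python
    # Subjective value of a lottery under risk and ambiguity, one common
    # parameterization (not necessarily this study's fitted model):
    # alpha < 1 means risk aversion; beta > 0 means ambiguity aversion.
    def subjective_value(amount, p_win, ambiguity, alpha=0.8, beta=0.6):
        # Ambiguity shrinks the effective winning probability by beta*A/2.
        p_eff = p_win - beta * ambiguity / 2.0
        return max(p_eff, 0.0) * amount ** alpha

    print(subjective_value(20.0, 0.5, 0.0))   # purely risky 50/50 lottery
    print(subjective_value(20.0, 0.5, 0.8))   # mostly ambiguous lottery
    ```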

  4. When 1+1 can be >2: Uncertainties compound when simulating climate, fisheries and marine ecosystems

    NASA Astrophysics Data System (ADS)

    Evans, Karen; Brown, Jaclyn N.; Sen Gupta, Alex; Nicol, Simon J.; Hoyle, Simon; Matear, Richard; Arrizabalaga, Haritz

    2015-03-01

    Multi-disciplinary approaches that combine oceanographic, biogeochemical, ecosystem, fisheries population and socio-economic models are vital tools for modelling whole ecosystems. Interpreting the outputs from such complex models requires an appreciation of the many different types of modelling frameworks being used and their associated limitations and uncertainties. Both users and developers of particular model components will often have little involvement or understanding of other components within such modelling frameworks. Failure to recognise limitations and uncertainties associated with components and how these uncertainties might propagate throughout modelling frameworks can potentially result in poor advice for resource management. Unfortunately, many of the current integrative frameworks do not propagate the uncertainties of their constituent parts. In this review, we outline the major components of a generic whole of ecosystem modelling framework incorporating the external pressures of climate and fishing. We discuss the limitations and uncertainties associated with each component of such a modelling system, along with key research gaps. Major uncertainties in modelling frameworks are broadly categorised into those associated with (i) deficient knowledge in the interactions of climate and ocean dynamics with marine organisms and ecosystems; (ii) lack of observations to assess and advance modelling efforts and (iii) an inability to predict with confidence natural ecosystem variability and longer term changes as a result of external drivers (e.g. greenhouse gases, fishing effort) and the consequences for marine ecosystems. As a result of these uncertainties and intrinsic differences in the structure and parameterisation of models, users are faced with considerable challenges associated with making appropriate choices on which models to use. We suggest research directions required to address these uncertainties, and caution against overconfident predictions. Understanding the full impact of uncertainty makes it clear that full comprehension and robust certainty about the systems themselves are not feasible. A key research direction is the development of management systems that are robust to this unavoidable uncertainty.

  5. Orientation Uncertainty of Structures Measured in Cored Boreholes: Methodology and Case Study of Swedish Crystalline Rock

    NASA Astrophysics Data System (ADS)

    Stigsson, Martin

    2016-11-01

    Many engineering applications in fractured crystalline rock rely on the measured orientations of structures, such as rock contacts and fractures, and of lineated objects, such as foliation and rock stress, mapped in boreholes. Although these measurements are afflicted with uncertainties, very few attempts to quantify their magnitudes and their effects on the inferred orientations have been reported. Relying only on the manufacturers' specification of tool imprecision may considerably underestimate the actual uncertainty space. The present work identifies nine sources of uncertainty, develops models for inferring their magnitudes, and points out possible implications for inferred orientation models and thereby effects on downstream models. The uncertainty analysis in this work builds on a unique data set from site investigations performed by the Swedish Nuclear Fuel and Waste Management Co. (SKB). During these investigations, more than 70 boreholes with a maximum depth of 1 km were drilled in crystalline rock, with a cumulative length of more than 34 km and almost 200,000 single fracture intercepts. The work presented here therefore focuses on fracture orientations; however, the techniques for inferring the magnitude of orientation uncertainty may be applied to all types of structures and lineated objects in boreholes. The uncertainties are not solely detrimental but can be valuable, provided that the reason for their presence is properly understood and their magnitudes correctly inferred. The main findings of this work are as follows: (1) knowledge of the orientation uncertainty is crucial in order to infer a correct orientation model and the parameters coupled to the fracture sets; (2) it is important to perform multiple measurements to be able to infer the actual uncertainty instead of relying on the theoretical uncertainty provided by the manufacturers; (3) it is important to use the most appropriate tool for the prevailing circumstances; and (4) the single most important measure to decrease the uncertainty space is to avoid drilling steeper than about -80°.
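
    The flavor of such an analysis can be sketched by Monte Carlo: perturb the measured dip direction and dip by assumed angular error budgets and summarize the dispersion of the resulting fracture poles. The orientation, error magnitudes, and pole convention below are illustrative assumptions.

    ```python
    # Monte Carlo sketch of how angular measurement errors blur a
    # fracture orientation, summarized as a dispersion cone of poles.
    import numpy as np

    rng = np.random.default_rng(4)

    def pole(dip_dir_deg, dip_deg):
        # Upward unit pole of a plane given dip direction and dip
        # (one common convention; conventions vary between tools).
        dd, d = np.radians(dip_dir_deg), np.radians(dip_deg)
        return np.stack([-np.sin(d) * np.sin(dd),
                         -np.sin(d) * np.cos(dd),
                         np.cos(d)], axis=-1)

    n = 10_000
    dip_dir = 120.0 + rng.normal(0, 4.0, n)  # deg, assumed error budget
    dip = 65.0 + rng.normal(0, 2.5, n)       # deg, assumed error budget
    poles = pole(dip_dir, dip)

    mean_pole = poles.mean(axis=0)
    mean_pole /= np.linalg.norm(mean_pole)
    angles = np.degrees(np.arccos(np.clip(poles @ mean_pole, -1.0, 1.0)))
    print(f"95% of simulated poles lie within "
          f"{np.percentile(angles, 95):.1f} deg of the mean orientation")
    ```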

  6. Uncertainties in the land-use flux resulting from land-use change reconstructions and gross land transitions

    NASA Astrophysics Data System (ADS)

    Bayer, Anita D.; Lindeskog, Mats; Pugh, Thomas A. M.; Anthoni, Peter M.; Fuchs, Richard; Arneth, Almut

    2017-02-01

    Land-use and land-cover (LUC) changes are a key uncertainty when attributing changes in measured atmospheric CO2 concentration to its sinks and sources, and they must also be much better understood to determine the possibilities for land-based climate change mitigation, especially in the light of human demand on other land-based resources. On the spatial scale typically used in terrestrial ecosystem models (0.5 or 1°), changes in LUC over time periods of a few years or more can include bidirectional changes on the sub-grid level, such as the parallel expansion and abandonment of agricultural land (e.g. in shifting cultivation) or cropland-grassland conversion (and vice versa). These complex changes between classes within a grid cell have often been neglected in previous studies, with only net changes of land between natural vegetation cover, cropland and pastures accounted for, mainly because of a lack of reliable high-resolution historical information on gross land transitions, in combination with technical limitations within the models themselves. In the present study we applied a state-of-the-art dynamic global vegetation model with a detailed representation of croplands and carbon-nitrogen dynamics to quantify the uncertainty in terrestrial ecosystem carbon stocks and fluxes arising from the choice between net and gross representations of LUC. We used three frequently applied global, one recent global and one recent European LUC datasets, two of which resolve gross land transitions, either in Europe or in certain tropical regions. When considering only net changes, land-use-transition uncertainties (expressed as 1 standard deviation around decadal means of four models) in global carbon emissions from LUC (ELUC) are ±0.19, ±0.66 and ±0.47 Pg C a-1 in the 1980s, 1990s and 2000s, respectively, or between 14 and 39 % of mean ELUC. Carbon stocks at the end of the 20th century vary by ±11 Pg C for vegetation and ±37 Pg C for soil C due to the choice of LUC reconstruction, i.e. around 3 % of the respective C pools. Accounting for sub-grid (gross) land conversions significantly increased the effect of LUC on global and European carbon stocks and fluxes, most noticeably enhancing global cumulative ELUC by 33 Pg C (1750-2014) and entailing a significant reduction in carbon stored in vegetation, although the effect on soil C stocks was limited. Simulations demonstrated that assessments of historical carbon stocks and fluxes are highly uncertain due to the choice of LUC reconstruction and that the consideration of different contrasting LUC reconstructions is needed to account for this uncertainty. The analysis of gross, in addition to net, land-use changes showed that the full complexity of gross land-use changes is required in order to accurately predict the magnitude of LUC change emissions. This introduces technical challenges to process-based models and relies on extensive information regarding historical land-use transitions.
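
    The difference between net and gross accounting is easy to see in a one-cell sketch: parallel expansion and abandonment cancel under net accounting, but both convert land, and hence release carbon, under gross accounting. All areas and per-area fluxes below are invented.

    ```python
    # Net vs gross land-transition accounting in a single grid cell.
    expansion = 12.0     # km2 natural vegetation -> cropland
    abandonment = 10.0   # km2 cropland -> natural vegetation

    net_converted = abs(expansion - abandonment)      # net view: 2 km2
    gross_converted = expansion + abandonment         # gross view: 22 km2

    emission_per_km2 = 0.8   # illustrative C loss on clearing
    recovery_per_km2 = 0.3   # slower C gain on abandonment

    net_flux = net_converted * emission_per_km2
    gross_flux = (expansion * emission_per_km2
                  - abandonment * recovery_per_km2)
    print(f"net accounting flux:   {net_flux:.1f}")
    print(f"gross accounting flux: {gross_flux:.1f}  (larger emissions)")
    ```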

  7. Late Quaternary sedimentological and climate changes at Lake Bosumtwi Ghana: new constraints from laminae analysis and radiocarbon age modeling

    USGS Publications Warehouse

    Shanahan, Timothy M.; Beck, J. Warren; Overpeck, Jonathan T.; McKay, Nicholas P.; Pigati, Jeffrey S.; Peck, John A.; Scholz, Christopher A.; Heil, Clifford W.; King, John W.

    2012-01-01

    The Lake Bosumtwi sediment record represents one of the longest and highest-resolution terrestrial records of paleoclimate change available from sub-Saharan Africa. Here we report a new sediment age model framework for the last ~45 cal kyr of sedimentation using a combination of high-resolution radiocarbon dating, Bayesian age-depth modeling and lamination counting. Our results highlight the practical limits of these methods for reducing age model uncertainties and suggest that even with very high sampling densities, radiocarbon uncertainties of at least a few hundred years are unavoidable. Age model uncertainties are smallest during the Holocene (205 yr) and the glacial (360 yr) but are large at the base of the record (1660 yr), due to a combination of decreasing sample density, larger calibration uncertainties and increases in radiocarbon age scatter. For portions of the chronology older than ~35 cal kyr, additional considerations, such as the use of a low-blank graphitization system and more rigorous sample pretreatment, were necessary to generate a reliable age-depth model because of the incorporation of small amounts of younger carbon. A comparison of radiocarbon age model results and lamination counts over the interval ~15-30 cal kyr shows agreement to within ~10% overall and similar changes in sedimentation rate, supporting the annual nature of the sediment laminations in the early part of the record. Changes in sedimentation rates reconstructed from the age-depth model indicate that intervals of enhanced sediment delivery occurred at 16-19, 24 and 29-31 cal kyr, broadly synchronous with reconstructed drought episodes elsewhere in northern West Africa and, potentially, with changes in Atlantic meridional heat transport during North Atlantic Heinrich events. These data suggest that millennial-scale drought events in the West African monsoon region were latitudinally extensive, reaching within several hundred kilometers of the Guinea coast. This is inconsistent with a simple southward shift in the mean position of the monsoon rainbelt, and requires changes in moisture convergence as a result of either a reduction in the moisture content of the tropical rainbelt, decreased convection, or both.
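
    A minimal sketch of the Monte Carlo part of such an age-depth model: sample each dated horizon's calibrated age from its uncertainty, reject draws that violate stratigraphic order, and interpolate to get an age envelope at any depth. The depths, ages, and errors are illustrative, not the Bosumtwi data.

    ```python
    # Monte Carlo age-depth sketch with a monotonicity constraint.
    import numpy as np

    rng = np.random.default_rng(5)
    depth_cm = np.array([50.0, 150.0, 300.0, 500.0])
    age_cal_yr = np.array([2_000.0, 8_000.0, 20_000.0, 40_000.0])
    age_sd = np.array([100.0, 150.0, 300.0, 1_200.0])

    draws = []
    while len(draws) < 2_000:
        ages = rng.normal(age_cal_yr, age_sd)
        if np.all(np.diff(ages) > 0):    # keep stratigraphic order
            draws.append(ages)
    draws = np.array(draws)

    target = 220.0                       # cm, query depth
    ages_at_target = [np.interp(target, depth_cm, a) for a in draws]
    lo, hi = np.percentile(ages_at_target, [2.5, 97.5])
    print(f"age at {target:.0f} cm: {np.median(ages_at_target):.0f} "
          f"cal yr BP (95% range {lo:.0f}-{hi:.0f})")
    ```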

  8. A collision risk model to predict avian fatalities at wind facilities: an example using golden eagles, Aquila chrysaetos

    USGS Publications Warehouse

    New, Leslie; Bjerre, Emily; Millsap, Brian A.; Otto, Mark C.; Runge, Michael C.

    2015-01-01

    Wind power is a major candidate in the search for clean, renewable energy. Beyond the technical and economic challenges of wind energy development are environmental issues that may restrict its growth. Avian fatalities due to collisions with rotating turbine blades are a leading concern, and there is considerable uncertainty surrounding avian collision risk at wind facilities. This uncertainty is not reflected in many models currently used to predict the avian fatalities that would result from proposed wind developments. We introduce a method to predict fatalities at wind facilities, based on pre-construction monitoring. Our method can directly incorporate uncertainty into the estimates of avian fatalities and can be updated if information on the true number of fatalities becomes available from post-construction carcass monitoring. Our model considers only three parameters: hazardous footprint, bird exposure to turbines and collision probability. By using a Bayesian analytical framework we account for uncertainties in these values, which are then reflected in our predictions and can be reduced through subsequent data collection. The simplicity of our approach makes it accessible to ecologists concerned with the impact of wind development, as well as to managers, policy makers and industry interested in its implementation in real-world decision contexts. We demonstrate the utility of our method by predicting golden eagle (Aquila chrysaetos) fatalities at a wind installation in the United States. Using pre-construction data, we predicted 7.48 eagle fatalities per year (95% CI: 1.1, 19.81). The U.S. Fish and Wildlife Service uses the 80th quantile (11.0 eagle fatalities per year) in its permitting process to ensure there is only a 20% chance a wind facility exceeds the authorized fatalities. Once data were available from two years of post-construction monitoring, we updated the fatality estimate to 4.8 eagle fatalities per year (95% CI: 1.76, 9.4; 80th quantile, 6.3). In this case, the increased precision in the fatality prediction lowered the level of authorized take, and thus lowered the required amount of compensatory mitigation.
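
    A stylized sketch of the three-parameter prediction logic (not the authors' exact priors or likelihood): annual fatalities are the product of exposure, hazardous footprint, and collision probability, and the permitting quantile is read off the resulting predictive distribution.

    ```python
    # Stylized predictive distribution of annual fatalities from three
    # uncertain components; all priors and scales below are invented.
    import numpy as np

    rng = np.random.default_rng(6)
    n = 100_000
    exposure = rng.lognormal(np.log(6_000.0), 0.5, n)  # relative exposure,
                                                       # invented units
    footprint = 0.05                             # hazardous fraction, assumed
    collision_p = rng.beta(2.0, 80.0, n)         # prior collision probability

    fatalities = exposure * footprint * collision_p
    print(f"mean prediction: {fatalities.mean():.1f} eagles/yr, "
          f"80th percentile: {np.percentile(fatalities, 80):.1f}")
    ```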

  9. Monte Carlo analysis of uncertainty propagation in a stratospheric model. 1: Development of a concise stratospheric model

    NASA Technical Reports Server (NTRS)

    Rundel, R. D.; Butler, D. M.; Stolarski, R. S.

    1977-01-01

    A concise model has been developed to analyze uncertainties in stratospheric perturbations; it uses a minimum of computer time yet is complete enough to represent the results of more complex models. The steady-state model applies iteration to achieve coupling between interacting species. The species are determined from diffusion equations with appropriate sources and sinks. Diurnal effects due to chlorine nitrate formation are accounted for by analytic approximation. The model has been used to evaluate steady-state perturbations due to injections of chlorine and NOx.
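
    The iteration-for-coupling idea can be sketched as a fixed-point solve: each species' steady state is recomputed from the other's current value until the coupled concentrations stop changing. The reaction terms below are illustrative, not the stratospheric chemistry.

    ```python
    # Fixed-point iteration for two mutually dependent steady states.
    def steady_state(tol=1e-10, max_iter=200):
        x, y = 1.0, 1.0                      # initial guesses
        for _ in range(max_iter):
            x_new = 2.0 / (1.0 + 0.5 * y)    # x's balance given y
            y_new = 1.0 / (1.0 + 0.2 * x)    # y's balance given x
            if abs(x_new - x) + abs(y_new - y) < tol:
                return x_new, y_new
            x, y = x_new, y_new
        raise RuntimeError("did not converge")

    print(steady_state())
    ```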

  10. Experimental Research Examining How People Can Cope with Uncertainty Through Soft Haptic Sensations.

    PubMed

    van Horen, Femke; Mussweiler, Thomas

    2015-09-16

    Human beings are constantly surrounded by uncertainty and change. The question arises how people cope with such uncertainty. To date, most research has focused on the cognitive strategies people adopt to deal with uncertainty. However, especially when uncertainty is due to unpredictable societal events (e.g., economic crises, political revolutions, terrorism threats) whose impact on one's future life one is unable to judge, cognitive strategies (like seeking additional information) are likely to fail to combat uncertainty. Instead, the current paper discusses a method demonstrating that people might deal with uncertainty experientially through soft haptic sensations. More specifically, because touching something soft creates a feeling of comfort and security, people prefer objects with softer as compared to harder properties when feeling uncertain. Seeking softness is a highly efficient and effective tool for dealing with uncertainty, as our hands are available at all times. This protocol describes a set of methods demonstrating 1) how environmental (un)certainty can be situationally activated with an experiential priming procedure, 2) that the quality of the softness experience (what type of softness and how it is experienced) matters, and 3) how uncertainty can be reduced using different methods.

  11. Long-term stormwater quantity and quality analysis using continuous measurements in a French urban catchment.

    PubMed

    Sun, Siao; Barraud, Sylvie; Castebrunet, Hélène; Aubin, Jean-Baptiste; Marmonier, Pierre

    2015-11-15

    The assessment of urban stormwater quantity and quality is important for evaluating and controlling the impact of stormwater on natural waters and the environment. This study addresses the long-term evolution of stormwater quantity and quality in a French urban catchment using continuous measurements from 2004 to 2011. Storm-event-based data series (716 rainfall events and 521 runoff events) are obtained from the measured continuous time series. The Mann-Kendall test is applied to these event-based data series for trend detection. No trend is found in rainfall, while an increasing trend in runoff is detected. As a result, an increasing trend is present in the runoff coefficient, likely due to growing imperviousness of the catchment caused by urbanization. The event mean concentration of total suspended solids (TSS) in stormwater does not present a trend, whereas the event load of TSS has an increasing tendency, which is attributed to the increasing event runoff volume. Uncertainty analysis suggests that the major uncertainty in trend detection results lies in the availability of data: a lack of events due to missing data leads to dramatically increased uncertainty in trend detection results. In contrast, measurement uncertainty in the time series data plays a negligible role. The intra-event distribution of TSS is studied based on both M(V) curves and pollutant concentrations of absolute runoff volumes. The trend detection test reveals no significant change in intra-event distributions of TSS in the studied catchment. Copyright © 2015 Elsevier Ltd. All rights reserved.
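
    For reference, a minimal Mann-Kendall trend test (without the tie correction) looks as follows; the synthetic series stands in for the event-based data.

    ```python
    # Minimal Mann-Kendall trend test: S statistic, normal approximation
    # with continuity correction, and a two-sided p-value (n > ~10).
    import numpy as np
    from math import erf, sqrt

    def mann_kendall(x):
        x = np.asarray(x, dtype=float)
        n = len(x)
        s = sum(np.sign(x[j] - x[i])
                for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        z = (s - np.sign(s)) / sqrt(var_s) if s != 0 else 0.0
        p = 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))
        return s, z, p

    rng = np.random.default_rng(7)
    series = np.arange(50) * 0.05 + rng.normal(0, 1, 50)  # weak upward trend
    print("S, Z, p =", mann_kendall(series))
    ```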

  12. Determination of the reference air kerma rate for 192Ir brachytherapy sources and the related uncertainty.

    PubMed

    van Dijk, Eduard; Kolkman-Deurloo, Inger-Karine K; Damen, Patricia M G

    2004-10-01

    Different methods exist to determine the air kerma calibration factor of an ionization chamber for the spectrum of a 192Ir high-dose-rate (HDR) or pulsed-dose-rate (PDR) source. An analysis of two methods to obtain such a calibration factor was performed: (i) the method recommended by [Goetsch et al., Med. Phys. 18, 462-467 (1991)] and (ii) the method employed by the Dutch national standards institute NMi [Petersen et al., Report S-EI-94.01 (NMi, Delft, The Netherlands, 1994)]. This analysis showed a systematic difference on the order of 1% in the determination of the strength of 192Ir HDR and PDR sources, depending on the method used for determining the air kerma calibration factor. The definitive significance of the difference between these methods can only be addressed after an accurate analysis of the associated uncertainties. For an NE 2561 (or equivalent) ionization chamber and an in-air jig, a typical uncertainty budget of 0.94% was found with the NMi method. The largest contribution to the type-B uncertainty is the uncertainty in the air kerma calibration factor for isotope i, N_K(i), as determined by the primary or secondary standards laboratories. This uncertainty is dominated by the uncertainties in the physical constants for the average mass-energy absorption coefficient ratio and the stopping power ratios. This means that it is not foreseeable that the standards laboratories can decrease the uncertainty in the air kerma calibration factors for ionization chambers in the short term. When the results of determinations of the 192Ir reference air kerma rate in different institutes are compared, the uncertainties in the physical constants are the same; to compare the applied techniques, the ratio of the results can be judged by leaving out the uncertainties due to these physical constants. In that case an uncertainty budget of 0.40% (coverage factor = 2) should be taken into account. Due to the differences in approach between the method used by NMi and the method recommended by Goetsch et al., an extra type-B uncertainty of 0.9% (k=1) has to be taken into account when the method of Goetsch et al. is applied. Compared to the uncertainty of 1% (k=2) found for the air calibration of 192Ir, the difference of 0.9% is significant.
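
    The arithmetic behind an uncertainty budget such as the 0.94% figure can be sketched as follows, with invented component values: independent relative standard uncertainties combine in quadrature, and a coverage factor k expands the result.

    ```python
    # Combine type-A and type-B standard uncertainties in quadrature and
    # expand with a coverage factor; component values are invented.
    import math

    type_b_components = [0.55, 0.40, 0.35, 0.30]  # % (k=1), e.g. N_K, geometry
    type_a = 0.25                                  # % (k=1), repeatability

    combined = math.sqrt(sum(c ** 2 for c in type_b_components) + type_a ** 2)
    print(f"combined standard uncertainty: {combined:.2f}% (k=1)")
    print(f"expanded uncertainty:          {2 * combined:.2f}% (k=2)")
    ```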

  13. Uncertainty in predictions of forest carbon dynamics: separating driver error from model error.

    PubMed

    Spadavecchia, L; Williams, M; Law, B E

    2011-07-01

    We present an analysis of the relative magnitude and contribution of parameter and driver uncertainty to the confidence intervals on estimates of net carbon fluxes. Model parameters may be difficult or impractical to measure, while driver fields are rarely complete, with data gaps due to sensor failure and sparse observational networks. Parameters are generally derived through some optimization method, while driver fields may be interpolated from available data sources. For this study, we used data from a young ponderosa pine stand at Metolius, Central Oregon, and a simple daily model of coupled carbon and water fluxes (DALEC). An ensemble of acceptable parameterizations was generated using an ensemble Kalman filter and eddy covariance measurements of net C exchange. Geostatistical simulations generated an ensemble of meteorological driving variables for the site, consistent with the spatiotemporal autocorrelations inherent in the observational data from 13 local weather stations. Simulated meteorological data were propagated through the model to derive the uncertainty on the CO2 flux resulting from driver uncertainty typical of spatially extensive modeling studies. Furthermore, the model uncertainty was partitioned between temperature and precipitation. With at least one meteorological station within 25 km of the study site, driver uncertainty was relatively small (~10% of the total net flux), while parameterization uncertainty was larger, ~50% of the total net flux. The largest source of driver uncertainty was due to temperature (8% of the total flux). The combined effect of parameter and driver uncertainty was 57% of the total net flux. However, when the nearest meteorological station was >100 km from the study site, uncertainty in net ecosystem exchange (NEE) predictions introduced by meteorological drivers increased by 88%. Precipitation estimates were a larger source of bias in NEE estimates than were temperature estimates, although the biases partly compensated for each other. The time scales on which precipitation errors occurred in the simulations were shorter than the temporal scales over which drought developed in the model, so drought events were reasonably simulated. The approach outlined here provides a means to assess the uncertainty and bias introduced by meteorological drivers in regional-scale ecological forecasting.
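
    The parameter-ensemble side of such a study can be sketched with a scalar ensemble Kalman filter update; the one-parameter toy model and numbers below are illustrative, not DALEC or the Metolius data.

    ```python
    # Minimal ensemble Kalman filter update for one model parameter.
    import numpy as np

    rng = np.random.default_rng(8)

    def model(theta):
        # Toy "flux" prediction from one parameter.
        return 2.0 * theta + 1.0

    n_ens = 500
    theta = rng.normal(0.5, 0.3, n_ens)      # prior parameter ensemble
    obs, obs_sd = 2.4, 0.1                   # observed flux and its error

    y = model(theta)
    perturbed_obs = obs + rng.normal(0, obs_sd, n_ens)
    cov_ty = np.cov(theta, y)[0, 1]          # parameter-prediction covariance
    var_y = y.var(ddof=1) + obs_sd ** 2
    gain = cov_ty / var_y                    # Kalman gain (scalar case)
    theta_post = theta + gain * (perturbed_obs - y)
    print(f"posterior theta: {theta_post.mean():.3f} "
          f"+/- {theta_post.std():.3f}")     # pulls toward the true 0.7
    ```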

  14. Stress From Uncertainty and Resilience Among Depressed and Burned Out Residents: A Cross-Sectional Study.

    PubMed

    Simpkin, Arabella L; Khan, Alisa; West, Daniel C; Garcia, Briana M; Sectish, Theodore C; Spector, Nancy D; Landrigan, Christopher P

    2018-03-07

    Depression and burnout are highly prevalent among residents, but little is known about modifiable personality variables, such as resilience and stress from uncertainty, that may predispose to these conditions. Residents are routinely faced with uncertainty when making medical decisions. We sought to determine how stress from uncertainty is related to resilience among pediatric residents and whether these attributes are associated with depression and burnout. We surveyed 86 residents in pediatric residency programs at 4 urban freestanding children's hospitals in North America in 2015. Stress from uncertainty was measured with the Physicians' Reaction to Uncertainty Scale, resilience with the 14-item Resilience Scale, depression with the Harvard National Depression Screening Scale, and burnout with single-item measures of emotional exhaustion and depersonalization from the Maslach Burnout Inventory. Fifty of the 86 residents (58.1%) responded to the survey. Higher levels of stress from uncertainty correlated with lower resilience (r = -0.60; P < .001). Five residents (10%) met depression criteria and 15 residents (31%) met burnout criteria. Depressed residents had higher mean levels of stress due to uncertainty (51.6 ± 9.1 vs 38.7 ± 6.7; P < .001) and lower mean levels of resilience (56.6 ± 10.7 vs 85.4 ± 8.0; P < .001) compared with residents who were not depressed. Burned-out residents also had higher mean levels of stress due to uncertainty (44.0 ± 8.5 vs 38.3 ± 7.1; P = .02) and lower mean levels of resilience (76.7 ± 14.8 vs 85.0 ± 9.77; P = .02) compared with residents who were not burned out. High levels of stress from uncertainty and low levels of resilience were strongly correlated with depression and burnout. Efforts to enhance tolerance of uncertainty and resilience among residents may provide opportunities to mitigate resident depression and burnout. Copyright © 2018 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  15. An application of a hydraulic model simulator in flood risk assessment under changing climatic conditions

    NASA Astrophysics Data System (ADS)

    Doroszkiewicz, J. M.; Romanowicz, R. J.

    2016-12-01

    The standard procedure for assessing climate change impacts on future hydrological extremes consists of a chain of consecutive actions, starting from the choice of a GCM driven by an assumed CO2 scenario, through downscaling of the climatic forcing to the catchment scale, estimation of hydrological extreme indices using hydrological modelling tools, and subsequent derivation of flood risk maps with the help of a hydraulic model. Among the many possible sources of uncertainty, the main ones are those related to future climate scenarios, climate models, downscaling techniques and hydrological and hydraulic models. Unfortunately, we cannot directly assess the impact of these different sources of uncertainty on future flood risk due to the lack of observations of future climate realizations. The aim of this study is an assessment of the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the processes involved, an assessment of the total uncertainty of maps of inundation probability can be very computer-time consuming. As a way forward, we present an application of a hydraulic model simulator based on a nonlinear transfer function model for chosen locations along the river reach. The transfer function model parameters are estimated from simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of a simulator substantially reduces the computing requirements related to the derivation of flood risk maps under future climatic conditions. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections from the EURO-CORDEX project. The study describes a cascade of uncertainty related to the different stages of the process of deriving flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow routing model, the propagation of that uncertainty through the hydraulic model, and finally the uncertainty related to the derivation of flood risk maps.
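
    The emulator idea can be sketched with a first-order transfer-function (ARX) fit: run the expensive hydraulic model once to generate training pairs, fit the recursion by least squares, and reuse the cheap emulator for new forcing. The stand-in "hydraulic model" below is an assumption for illustration.

    ```python
    # Fit a first-order ARX emulator y_t = a*y_{t-1} + b*u_t to
    # input/output pairs from an (expensive) simulator stand-in.
    import numpy as np

    rng = np.random.default_rng(9)
    u = np.clip(rng.normal(1.0, 0.6, 500), 0.0, None)   # upstream discharge

    def hydraulic_model(u):
        # Expensive simulator stand-in: smoothed, lagged response.
        h = np.zeros_like(u)
        for t in range(1, len(u)):
            h[t] = 0.7 * h[t - 1] + 0.3 * u[t]
        return h

    h = hydraulic_model(u)

    # Least-squares fit of the emulator coefficients a and b.
    X = np.column_stack([h[:-1], u[1:]])
    a, b = np.linalg.lstsq(X, h[1:], rcond=None)[0]
    print(f"fitted a={a:.3f}, b={b:.3f}")    # recovers ~0.7 and ~0.3

    h_emulated = np.zeros_like(u)
    for t in range(1, len(u)):
        h_emulated[t] = a * h_emulated[t - 1] + b * u[t]
    print("max emulation error:", np.abs(h_emulated - h).max())
    ```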

  16. Special Issue on Uncertainty Quantification in Multiscale System Design and Simulation

    DOE PAGES

    Wang, Yan; Swiler, Laura

    2017-09-07

    The importance of uncertainty has been recognized in various modeling, simulation, and analysis applications, where inherent assumptions and simplifications affect the accuracy of model predictions for physical phenomena. As model predictions are now heavily relied upon for simulation-based system design, which includes new materials, vehicles, mechanical and civil structures, and even new drugs, wrong model predictions could potentially cause catastrophic consequences. Therefore, uncertainty and associated risks due to model errors should be quantified to support robust systems engineering.

  17. A design methodology for nonlinear systems containing parameter uncertainty

    NASA Technical Reports Server (NTRS)

    Young, G. E.; Auslander, D. M.

    1983-01-01

    The present design methodology for nonlinear systems containing parameter uncertainty incorporates a generalized sensitivity analysis that employs parameter-space sampling and statistical inference. For the case of a system with j adjustable and k nonadjustable parameters, this methodology (which includes an adaptive random search strategy) is used to determine the combination of j adjustable parameter values that maximizes the probability that the performance indices simultaneously satisfy the design criteria, in spite of the uncertainty due to the k nonadjustable parameters.
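
    A simplified sketch of that loop, with invented performance function and limits: an outer random search over the adjustable parameter, and an inner Monte Carlo over the nonadjustable parameter, selecting the design that maximizes the estimated probability of satisfying the criterion.

    ```python
    # Outer random search over an adjustable parameter, inner Monte Carlo
    # over a nonadjustable (uncertain) parameter; everything is invented.
    import numpy as np

    rng = np.random.default_rng(10)

    def performance(k, m):
        # Toy performance index of a nonlinear system.
        return (k - 2.0) ** 2 + 0.5 * m * k

    def p_success(k, n_mc=2_000, limit=2.0):
        m = rng.normal(1.0, 0.3, n_mc)   # uncertainty in nonadjustable m
        return np.mean(performance(k, m) < limit)

    best_k, best_p = None, -1.0
    for _ in range(200):                 # random search (here non-adaptive)
        k = rng.uniform(0.0, 4.0)
        p = p_success(k)
        if p > best_p:
            best_k, best_p = k, p
    print(f"best k = {best_k:.2f} with P(criterion met) = {best_p:.2%}")
    ```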
