Sample records for uncertainty assessment main

  1. Experimental uncertainty survey and assessment. [Space Shuttle Main Engine testing]

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.

    1992-01-01

    An uncertainty analysis and assessment of the specific impulse determination during Space Shuttle Main Engine testing is reported. It is concluded that in planning and designing tests and in interpreting the results of tests, the bias and precision components of experimental uncertainty should be considered separately. Recommendations for future research efforts are presented.

  2. Uncertainties in land use data

    NASA Astrophysics Data System (ADS)

    Castilla, G.; Hay, G. J.

    2006-11-01

    This paper deals with the description and assessment of uncertainties in gridded land use data derived from Remote Sensing observations, in the context of hydrological studies. Land use is a categorical regionalised variable returning the main socio-economic role each location has, where the role is inferred from the pattern of occupation of land. There are two main uncertainties surrounding land use data, positional and categorical. This paper focuses on the second one, as the first one has in general less serious implications and is easier to tackle. The conventional method used to assess categorical uncertainty, the confusion matrix, is criticised in depth, the main critique being its inability to inform on a basic requirement to propagate uncertainty through distributed hydrological models, namely the spatial distribution of errors. Some existing alternative methods are reported, and finally the need for metadata is stressed as a more reliable means to assess the quality, and hence the uncertainty, of these data.

  3. How does uncertainty shape patient experience in advanced illness? A secondary analysis of qualitative data.

    PubMed

    Etkind, Simon Noah; Bristowe, Katherine; Bailey, Katharine; Selman, Lucy Ellen; Murtagh, Fliss Em

    2017-02-01

    Uncertainty is common in advanced illness but is infrequently studied in this context. If poorly addressed, uncertainty can lead to adverse patient outcomes. We aimed to understand patient experiences of uncertainty in advanced illness and develop a typology of patients' responses and preferences to inform practice. Secondary analysis of qualitative interview transcripts. Studies were assessed for inclusion and interviews were sampled using maximum-variation sampling. Analysis used a thematic approach with 10% of coding cross-checked to enhance reliability. Qualitative interviews from six studies including patients with heart failure, chronic obstructive pulmonary disease, renal disease, cancer and liver failure. A total of 30 transcripts were analysed. Median age was 75 (range, 43-95); 12 patients were women. The impact of uncertainty was frequently discussed: the main related themes were engagement with illness, information needs, patient priorities and the period of time that patients mainly focused their attention on (temporal focus). A typology of patient responses to uncertainty was developed from these themes. Uncertainty influences patient experience in advanced illness through affecting patients' information needs, preferences and future priorities for care. Our typology aids understanding of how patients with advanced illness respond to uncertainty. Assessment of these three factors may be a useful starting point to guide clinical assessment and shared decision making.

  4. Statistical uncertainty of eddy flux-based estimates of gross ecosystem carbon exchange at Howland Forest, Maine

    Treesearch

    S.C. Hagen; B.H. Braswell; E. Linder; S. Frolking; A.D. Richardson; D.Y. Hollinger

    2006-01-01

    We present an uncertainty analysis of gross ecosystem carbon exchange (GEE) estimates derived from 7 years of continuous eddy covariance measurements of forest atmosphere CO2 fluxes at Howland Forest, Maine, USA. These data, which have high temporal resolution, can be used to validate process modeling analyses, remote sensing assessments, and field surveys. However,...

  5. RECOVERY ACT - Methods for Decision under Technological Change Uncertainty and Risk Assessment for Integrated Assessment of Climate Change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webster, Mort David

    2015-03-10

    This report presents the final outcomes and products of the project as performed at the Massachusetts Institute of Technology. The research project consists of three main components: methodology development for decision-making under uncertainty, improving the resolution of the electricity sector to improve integrated assessment, and application of these methods to integrated assessment. Results in each area are described in the report.

  6. Uncertainty analysis in geospatial merit matrix–based hydropower resource assessment

    DOE PAGES

    Pasha, M. Fayzul K.; Yeasmin, Dilruba; Saetern, Sen; ...

    2016-03-30

    Hydraulic head and mean annual streamflow, two main input parameters in hydropower resource assessment, are not measured at every point along the stream. Translation and interpolation are used to derive these parameters, resulting in uncertainties. This study estimates the uncertainties and their effects on model output parameters: the total potential power and the number of potential locations (stream-reach). These parameters are quantified through Monte Carlo simulation (MCS) linked with a geospatial merit matrix-based hydropower resource assessment (GMM-HRA) model. The methodology is applied to flat, mild, and steep terrains. Results show that the uncertainty associated with the hydraulic head is within 20% for mild and steep terrains, and the uncertainty associated with streamflow is around 16% for all three terrains. Output uncertainty increases as input uncertainty increases. However, output uncertainty is around 10% to 20% of the input uncertainty, demonstrating the robustness of the GMM-HRA model. Output parameters are more sensitive to hydraulic head in steep terrain than in flat and mild terrains. Furthermore, output parameters are more sensitive to mean annual streamflow in flat terrain.
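
    To make the propagation step concrete, here is a minimal sketch of the kind of Monte Carlo uncertainty propagation the abstract describes, not the GMM-HRA model itself: the reported input uncertainties (roughly 20% on hydraulic head, 16% on streamflow) are treated as 1-sigma relative errors of normal distributions, the nominal head and flow values are hypothetical, and the output is the simple hydropower relation P = rho * g * Q * H.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000

    # Nominal inputs for one candidate stream-reach (hypothetical values).
    head_m = 12.0          # hydraulic head [m]
    flow_m3s = 8.5         # mean annual streamflow [m^3/s]
    rho_g = 1000.0 * 9.81  # water density * gravity

    # Input uncertainties from the abstract (~20% head, ~16% flow),
    # treated here as 1-sigma relative errors of normal distributions.
    head = head_m * (1 + 0.20 * rng.standard_normal(N))
    flow = flow_m3s * (1 + 0.16 * rng.standard_normal(N))

    # Potential power [W], ignoring efficiency losses.
    power = rho_g * flow * head

    mean = power.mean()
    rel_u = power.std() / mean
    print(f"mean potential: {mean/1e6:.2f} MW, relative uncertainty: {rel_u:.1%}")
    ```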

  7. Development of an Uncertainty Model for the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Walter, Joel A.; Lawrence, William R.; Elder, David W.; Treece, Michael D.

    2010-01-01

    This paper introduces an uncertainty model being developed for the National Transonic Facility (NTF). The model uses a Monte Carlo technique to propagate standard uncertainties of measured values through the NTF data reduction equations to calculate the combined uncertainties of the key aerodynamic force and moment coefficients and freestream properties. The uncertainty propagation approach to assessing data variability is compared with ongoing data quality assessment activities at the NTF, notably check standard testing using statistical process control (SPC) techniques. It is shown that the two approaches are complementary and both are necessary tools for data quality assessment and improvement activities. The SPC approach is the final arbiter of variability in a facility. Its result encompasses variation due to people, processes, test equipment, and test article. The uncertainty propagation approach is limited mainly to the data reduction process. However, it is useful because it helps to assess the causes of variability seen in the data and consequently provides a basis for improvement. For example, it is shown that Mach number random uncertainty is dominated by static pressure variation over most of the dynamic pressure range tested. However, the random uncertainty in the drag coefficient is generally dominated by axial and normal force uncertainty with much less contribution from freestream conditions.
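
    As an illustration of propagating measured-value uncertainties through a data reduction equation, the sketch below pushes assumed transducer uncertainties through the standard isentropic Mach number relation by Monte Carlo; the pressures and their standard uncertainties are hypothetical, and this is not the NTF's actual data reduction code.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 200_000

    # Hypothetical freestream condition: total and static pressure [psf].
    p_total, p_static = 2500.0, 1750.0
    u_pt, u_ps = 0.5, 0.5  # assumed standard uncertainties of the transducers

    pt = rng.normal(p_total, u_pt, N)
    ps = rng.normal(p_static, u_ps, N)

    # Isentropic data-reduction equation for Mach number (gamma = 1.4):
    # M = sqrt((2/(gamma-1)) * ((p0/p)**((gamma-1)/gamma) - 1))
    mach = np.sqrt(5.0 * ((pt / ps) ** (2.0 / 7.0) - 1.0))

    print(f"M = {mach.mean():.4f} +/- {mach.std():.4f} (1-sigma)")
    ```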

  8. Hotspots of uncertainty in land-use and land-cover change projections: A global-scale model comparison

    DOE PAGES

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D. A.; ...

    2016-05-02

    Model-based global projections of future land use and land cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socio-economic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g. boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process as well as improving the allocation mechanisms of LULC change models remain important challenges. Furthermore, current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity.
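
    A much-simplified sketch of the attribution idea, assigning ensemble variance to model structure, scenario storyline and a residual from group means, is shown below on synthetic numbers; the actual study uses a regression analysis and analysis of variance over 43 simulations, and all values here are fabricated for illustration.

    ```python
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(1)

    # Hypothetical ensemble: projected LULC change for one grid cell
    # (% of cell area) from combinations of models and scenario storylines.
    models = [f"model_{i}" for i in range(11)]
    scenarios = [f"scenario_{j}" for j in range(4)]
    rows = [(m, s, rng.normal(5 + 2 * i - j, 1.0))
            for i, m in enumerate(models) for j, s in enumerate(scenarios)]
    df = pd.DataFrame(rows, columns=["model", "scenario", "change"])

    total = df["change"].var(ddof=0)
    # Variance explained by each factor = variance of its group means
    # (a main-effects decomposition for this balanced toy design).
    v_model = df.groupby("model")["change"].mean().var(ddof=0)
    v_scen = df.groupby("scenario")["change"].mean().var(ddof=0)
    resid = total - v_model - v_scen

    for name, v in [("model structure", v_model), ("scenario", v_scen),
                    ("residual", resid)]:
        print(f"{name:>15}: {v / total:.1%} of total variance")
    ```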

  9. Hotspots of uncertainty in land-use and land-cover change projections: a global-scale model comparison.

    PubMed

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D A; Arneth, Almut; Calvin, Katherine; Doelman, Jonathan; Eitelberg, David A; Engström, Kerstin; Fujimori, Shinichiro; Hasegawa, Tomoko; Havlik, Petr; Humpenöder, Florian; Jain, Atul K; Krisztin, Tamás; Kyle, Page; Meiyappan, Prasanth; Popp, Alexander; Sands, Ronald D; Schaldach, Rüdiger; Schüngel, Jan; Stehfest, Elke; Tabeau, Andrzej; Van Meijl, Hans; Van Vliet, Jasper; Verburg, Peter H

    2016-12-01

    Model-based global projections of future land-use and land-cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socioeconomic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios, we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g., boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process and improving the allocation mechanisms of LULC change models remain important challenges. Current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches, and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity. © 2016 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.

  10. Hotspots of uncertainty in land-use and land-cover change projections: A global-scale model comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D. A.

    Model-based global projections of future land use and land cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socio-economic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g. boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process as well as improving the allocation mechanisms of LULC change models remain important challenges. Furthermore, current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity.

  11. Understanding extreme sea levels for broad-scale coastal impact and adaptation analysis

    NASA Astrophysics Data System (ADS)

    Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Dangendorf, S.; Hinkel, J.; Slangen, A. B. A.

    2017-07-01

    One of the main consequences of mean sea level rise (SLR) on human settlements is an increase in flood risk due to an increase in the intensity and frequency of extreme sea levels (ESL). While substantial research efforts are directed towards quantifying projections and uncertainties of future global and regional SLR, corresponding uncertainties in contemporary ESL have not been assessed and projections are limited. Here we quantify, for the first time at global scale, the uncertainties in present-day ESL estimates, which have by default been ignored in broad-scale sea-level rise impact assessments to date. ESL uncertainties exceed those from global SLR projections and, assuming that we meet the Paris agreement goals, the projected SLR itself by the end of the century in many regions. Both uncertainties in SLR projections and ESL estimates need to be understood and combined to fully assess potential impacts and adaptation needs.

  12. Uncertainty and sensitivity assessment of flood risk assessments

    NASA Astrophysics Data System (ADS)

    de Moel, H.; Aerts, J. C.

    2009-12-01

    Floods are one of the most frequent and costly natural disasters. In order to protect human lives and valuable assets from the effects of floods, many defensive structures have been built. Despite these efforts, economic losses due to catastrophic flood events have, however, risen substantially during the past couple of decades because of continuing economic developments in flood prone areas. On top of that, climate change is expected to affect the magnitude and frequency of flood events. Because these ongoing trends are expected to continue, a transition can be observed in various countries from a protective flood management approach to a more risk based flood management approach. In a risk based approach, flood risk assessments play an important role in supporting decision making. Most flood risk assessments assess flood risks in monetary terms (damage estimated for specific situations or expected annual damage) in order to feed cost-benefit analyses of management measures. Such flood risk assessments contain, however, considerable uncertainties. This results from uncertainties in the many different input parameters propagating through the risk assessment and accumulating in the final estimate. Whilst common in some other disciplines, as with integrated assessment models, full uncertainty and sensitivity analyses of flood risk assessments are not so common. Various studies have addressed uncertainties regarding flood risk assessments, but have mainly focussed on the hydrological conditions. However, uncertainties in other components of the risk assessment, like the relation between water depth and monetary damage, can be substantial as well. This research therefore tries to assess the uncertainties of all components of monetary flood risk assessments, using a Monte Carlo based approach. Furthermore, the total uncertainty is also attributed to the different input parameters using a variance based sensitivity analysis. Assessing and visualizing the uncertainties of the final risk estimate will help decision makers to make better informed decisions, and attributing this uncertainty to the input parameters helps to identify which parameters are most important when it comes to uncertainty in the final estimate and should therefore deserve additional attention in further research.
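
    The sketch below illustrates a variance-based sensitivity analysis of a toy monetary flood damage model using a basic Sobol pick-freeze estimator; the damage function, input distributions and parameter names are hypothetical stand-ins, not the model used in this study.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    N = 100_000

    def damage(depth, value, vuln):
        """Toy flood-damage model: depth-damage curve saturating at 4 m."""
        frac = np.clip(depth / 4.0, 0.0, 1.0) * vuln
        return frac * value

    def sample(n):
        depth = rng.gamma(2.0, 0.8, n)          # flood depth [m]
        value = rng.normal(200_000, 30_000, n)  # exposed asset value [EUR]
        vuln = rng.uniform(0.6, 1.0, n)         # maximum damage fraction
        return np.column_stack([depth, value, vuln])

    A, B = sample(N), sample(N)
    yA = damage(*A.T)
    varY = yA.var()

    for i, name in enumerate(["depth", "value", "vulnerability"]):
        C = B.copy()
        C[:, i] = A[:, i]   # "freeze" factor i at A's values
        yC = damage(*C.T)
        # First-order Sobol index: Var(E[Y|Xi]) / Var(Y), pick-freeze estimate.
        S1 = (np.mean(yA * yC) - yA.mean() * yC.mean()) / varY
        print(f"S1({name}) = {S1:.2f}")
    ```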

  13. An evaluation of the treatment of risk and uncertainties in the IPCC reports on climate change.

    PubMed

    Aven, Terje; Renn, Ortwin

    2015-04-01

    Few global threats rival global climate change in scale and potential consequence. The principal international authority assessing climate risk is the Intergovernmental Panel on Climate Change (IPCC). Through repeated assessments the IPCC has devoted considerable effort and interdisciplinary competence to articulating a common characterization of climate risk and uncertainties. We have reviewed the assessment and its foundation for the Fifth Assessment Reports published in 2013 and 2014, in particular the guidance note for lead authors of the fifth IPCC assessment report on consistent treatment of uncertainties. Our analysis shows that the work carried out by the IPCC falls short of providing a theoretically and conceptually convincing foundation for the treatment of risk and uncertainties. The main reasons for our assessment are: (i) the concept of risk is given too narrow a definition (a function of consequences and probability/likelihood); and (ii) the reports lack precision in delineating their concepts and methods. The goal of this article is to contribute to improving the handling of uncertainty and risk in future IPCC studies, thereby obtaining a more theoretically substantiated characterization as well as enhanced scientific quality for risk analysis in this area. Several suggestions for how to improve the risk and uncertainty treatment are provided. © 2014 Society for Risk Analysis.

  14. Parameter and input data uncertainty estimation for the assessment of water resources in two sub-basins of the Limpopo River Basin

    NASA Astrophysics Data System (ADS)

    Oosthuizen, Nadia; Hughes, Denis A.; Kapangaziwiri, Evison; Mwenge Kahinda, Jean-Marc; Mvandaba, Vuyelwa

    2018-05-01

    The demand for water resources is rapidly growing, placing more strain on access to water and its management. In order to appropriately manage water resources, there is a need to accurately quantify available water resources. Unfortunately, the data required for such assessment are frequently far from sufficient in terms of availability and quality, especially in southern Africa. In this study, the uncertainty related to the estimation of water resources of two sub-basins of the Limpopo River Basin - the Mogalakwena in South Africa and the Shashe shared between Botswana and Zimbabwe - is assessed. Input data (and model parameters) are significant sources of uncertainty that should be quantified. In southern Africa water use data are among the most unreliable sources of model input data because available databases generally consist of only licensed information and actual use is generally unknown. The study assesses how these uncertainties impact the estimation of surface water resources of the sub-basins. Data on farm reservoirs and irrigated areas from various sources were collected and used to run the model. Many farm dams and large irrigation areas are located in the upper parts of the Mogalakwena sub-basin. Results indicate that water use uncertainty is small. Nevertheless, the medium to low flows are clearly impacted. The simulated mean monthly flows at the outlet of the Mogalakwena sub-basin were between 22.62 and 24.68 Mm3 per month when incorporating only the uncertainty related to the main physical runoff generating parameters. The range of total predictive uncertainty of the model increased to between 22.15 and 24.99 Mm3 when water use data such as small farm dams, large reservoirs and irrigation were included. For the Shashe sub-basin, incorporating only the uncertainty related to the main runoff parameters resulted in mean monthly flows between 11.66 and 14.54 Mm3. The range of predictive uncertainty changed to between 11.66 and 17.72 Mm3 after the uncertainty in water use information was added.

  15. Determination of Uncertainties for the New SSME Model

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Hawk, Clark W.

    1996-01-01

    This report discusses the uncertainty analysis performed in support of a new test analysis and performance prediction model for the Space Shuttle Main Engine. The new model utilizes uncertainty estimates for experimental data and for the analytical model to obtain the most plausible operating condition for the engine system. This report discusses the development of the data sets and uncertainty estimates to be used in the development of the new model. It also presents the application of uncertainty analysis to analytical models and to the conservation of mass and energy balance relations. A new methodology for the assessment of the uncertainty associated with linear regressions is presented.
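
    The report's new regression-uncertainty methodology is not reproduced here, but for orientation, the classical textbook assessment of uncertainty for a linear regression (a 95% prediction interval for a new observation) looks like this on synthetic calibration data:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical calibration data: sensor reading vs. reference value.
    x = np.linspace(0.0, 10.0, 20)
    y = 2.0 + 0.5 * x + rng.normal(0.0, 0.2, x.size)

    n = x.size
    b, a = np.polyfit(x, y, 1)            # slope, intercept
    resid = y - (a + b * x)
    s = np.sqrt(resid @ resid / (n - 2))  # residual standard error
    sxx = np.sum((x - x.mean()) ** 2)

    # 95% prediction interval at x0 (t ~ 2.101 for 18 degrees of freedom).
    x0, t = 5.0, 2.101
    half = t * s * np.sqrt(1 + 1 / n + (x0 - x.mean()) ** 2 / sxx)
    print(f"y(x0) = {a + b * x0:.3f} +/- {half:.3f}")
    ```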

  16. Flood risk assessment and robust management under deep uncertainty: Application to Dhaka City

    NASA Astrophysics Data System (ADS)

    Mojtahed, Vahid; Gain, Animesh Kumar; Giupponi, Carlo

    2014-05-01

    Socio-economic changes as well as climatic changes have been the main drivers of uncertainty in environmental risk assessment, in particular flood risk. The level of future uncertainty that researchers face when dealing with problems in a future perspective with a focus on climate change is known as deep uncertainty (also known as Knightian uncertainty), since nobody has already experienced and undergone those changes before and our knowledge is limited to the extent that we have no notion of probabilities; therefore, consolidated risk management approaches have limited potential. Deep uncertainty refers to circumstances where analysts and experts do not know, or parties to decision making cannot agree on: (i) the appropriate models describing the interactions among system variables, (ii) the probability distributions to represent uncertainty about key parameters in the models, and (iii) how to value the desirability of alternative outcomes. The need thus emerges to assist policy-makers by providing them not with a single and optimal solution to the problem at hand, such as crisp estimates for the costs of damages of the natural hazards considered, but instead with ranges of possible future costs, based on the outcomes of ensembles of assessment models and sets of plausible scenarios. Accordingly, we need to substitute optimality as a decision criterion with robustness. Under conditions of deep uncertainty, decision-makers do not have statistical and mathematical bases to identify optimal solutions; instead they should prefer to implement "robust" decisions that perform relatively well over all conceivable outcomes out of all future unknown scenarios. Under deep uncertainty, analysts cannot employ probability theory or other statistics that usually can be derived from observed historical data, and therefore we turn to non-statistical measures such as scenario analysis. We construct several plausible scenarios, with each scenario being a full description of what may happen in the future, based on a meaningful synthesis of parameter values with control of their correlations to maintain internal consistency. This paper aims at incorporating a set of data mining and sampling tools to assess the uncertainty of model outputs under future climatic and socio-economic changes for Dhaka city and providing a decision support system for robust flood management and mitigation policies. After constructing an uncertainty matrix to identify the main sources of uncertainty for Dhaka City, we identify several hazard and vulnerability maps based on future climatic and socio-economic scenarios. The vulnerability of each flood management alternative under different sets of scenarios is determined, and finally the robustness of each plausible solution considered is defined based on the above assessment.
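
    As a minimal illustration of robustness as a decision criterion, the sketch below applies a minimax-regret rule to a hypothetical cost matrix of flood management alternatives under plausible scenarios; the alternatives, scenarios and cost figures are invented, and the paper's actual decision support system is more elaborate.

    ```python
    import numpy as np

    # Hypothetical expected flood costs (million USD) for four management
    # alternatives (rows) under five plausible future scenarios (columns).
    costs = np.array([
        [120,  90, 150, 200, 110],   # do nothing
        [100,  95, 120, 140, 105],   # embankment upgrade
        [105, 100, 115, 125, 110],   # early warning + zoning
        [130, 120, 118, 122, 125],   # large-scale retreat
    ])
    alternatives = ["do nothing", "embankment", "warning+zoning", "retreat"]

    # Regret = cost minus the best achievable cost in that scenario.
    regret = costs - costs.min(axis=0)

    # A robust (minimax-regret) choice minimizes the worst-case regret
    # across all scenarios, instead of optimizing for any single scenario.
    worst_regret = regret.max(axis=1)
    best = np.argmin(worst_regret)
    print(f"robust choice: {alternatives[best]} "
          f"(worst-case regret {worst_regret[best]})")
    ```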

  17. Sources of uncertainty as a basis to fill the information gap in a response to flood

    NASA Astrophysics Data System (ADS)

    Kekez, Toni; Knezic, Snjezana

    2016-04-01

    Taking into account uncertainties in flood risk management remains a challenge due to difficulties in choosing adequate structural and/or non-structural risk management options. Despite such measures, wrong decisions are often made when a flood occurs. Parameter and structural uncertainties, which include model and observation errors as well as lack of knowledge about system characteristics, are the main considerations. Real time flood risk assessment methods are predominantly based on measured water level values and the vulnerability, as well as other relevant characteristics, of the flood affected area. The goal of this research is to identify sources of uncertainty and to minimize the information gap between the point where the water level is measured and the affected area, taking into consideration the main uncertainties that can affect the risk value at the observed point or section of the river. Sources of uncertainty are identified and determined using a system analysis approach, and the relevant uncertainties are included in the risk assessment model. With such a methodological approach it is possible to increase response time through more effective risk assessment that includes an uncertainty propagation model. The response phase could be better planned with adequate early warning systems, resulting in more time and lower costs to help affected areas and save human lives. Reliable and precise information is necessary to raise the emergency operability level in order to enhance the safety of citizens and reduce possible damage. The results of the EPISECC (EU funded FP7) project are used to validate the potential benefits of this research in order to improve flood risk management and response methods. EPISECC aims at developing a concept of a common European Information Space for disaster response which, among other disasters, considers floods.

  18. Sensitivity and Uncertainty Analysis of the GFR MOX Fuel Subassembly

    NASA Astrophysics Data System (ADS)

    Lüley, J.; Vrban, B.; Čerba, Š.; Haščík, J.; Nečas, V.; Pelloni, S.

    2014-04-01

    We performed sensitivity and uncertainty analysis as well as benchmark similarity assessment of the MOX fuel subassembly designed for the Gas-Cooled Fast Reactor (GFR) as a representative material of the core. Material composition was defined for each assembly ring separately, allowing us to decompose the sensitivities not only for isotopes and reactions but also for spatial regions. This approach was confirmed by direct perturbation calculations for chosen materials and isotopes. Similarity assessment identified only ten partly comparable benchmark experiments that can be utilized in the field of GFR development. Based on the determined uncertainties, we also identified the main contributors to the calculation bias.

  19. Quantifying model uncertainty in seasonal Arctic sea-ice forecasts

    NASA Astrophysics Data System (ADS)

    Blanchard-Wrigglesworth, Edward; Barthélemy, Antoine; Chevallier, Matthieu; Cullather, Richard; Fučkar, Neven; Massonnet, François; Posey, Pamela; Wang, Wanqiu; Zhang, Jinlun; Ardilouze, Constantin; Bitz, Cecilia; Vernieres, Guillaume; Wallcraft, Alan; Wang, Muyin

    2017-04-01

    Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or post-processing techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.
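
    A minimal sketch of the forecast post-processing step discussed here, simple mean-bias correction of an ensemble against a hindcast period, is given below on synthetic sea-ice extent numbers; actual SIO post-processing choices vary by group.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Hypothetical September sea-ice extent (million km^2): model hindcasts
    # and observations for past years, plus this year's raw ensemble forecast.
    hindcast = rng.normal(5.8, 0.3, 10)                    # past forecasts
    observed = hindcast - 0.9 + rng.normal(0.0, 0.1, 10)   # model runs ~0.9 high
    raw_forecast = rng.normal(5.7, 0.25, 30)               # this year's ensemble

    # Mean-bias correction: subtract the mean hindcast-minus-observation error.
    bias = (hindcast - observed).mean()
    corrected = raw_forecast - bias

    print(f"estimated bias: {bias:+.2f} million km^2")
    print(f"raw mean {raw_forecast.mean():.2f} -> corrected {corrected.mean():.2f}")
    ```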

  20. Assessing the Risks to Human Health in Heterogeneous Aquifers under Uncertainty

    NASA Astrophysics Data System (ADS)

    de Barros, Felipe

    2015-04-01

    Reliable quantification of human health risk from toxic chemicals present in groundwater is a challenging task. The main difficulty relies on the fact that many of the components that constitute human health risk assessment are uncertain and requires interdisciplinary knowledge. Understanding the impact from each of these components in risk estimation can provide guidance for decision makers to manage contaminated sites and best allocate resources towards minimal prediction uncertainty. This presentation will focus on the impact of aquifer heterogeneity in human health risk. Spatial heterogeneity of the hydrogeological properties can lead to the formation of preferential flow channels which control the plume spreading rates and travel time statistics, both which are critical in assessing the risk level. By making use of an integrated hydrogeological-health stochastic framework, the significance of characteristic length scales (e.g. characterizing flow, transport and sampling devices) in both controlling the uncertainty of health risk and determining data needs is highlighted. Through a series of examples, we show how fundamental knowledge on the main physical mechanisms affecting solute pathways are necessary to understand the human health response to varying drivers.

  1. Approaches to Evaluating Probability of Collision Uncertainty

    NASA Technical Reports Server (NTRS)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally useful display and interpretation of these data for a particular conjunction is given.
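
    The sketch below illustrates the resampling idea on a toy two-dimensional conjunction: Pc is computed by Monte Carlo integration, then the uncertain inputs (a covariance scale factor and the hard-body radius) are resampled to produce a distribution of Pc values instead of a point estimate. All input numbers and distributions are hypothetical, not the operational method.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    # Nominal conjunction inputs (hypothetical): encounter-plane miss vector
    # [m], combined covariance sigmas [m], combined hard-body radius [m].
    miss = np.array([120.0, 40.0])
    sig_nom = np.array([80.0, 60.0])
    hbr_nom = 10.0

    def pc(sig, hbr, n=200_000):
        """2-D probability of collision by Monte Carlo integration."""
        pts = miss + sig * rng.standard_normal((n, 2))
        return np.mean(pts[:, 0] ** 2 + pts[:, 1] ** 2 < hbr ** 2)

    # Resample the uncertain inputs to turn the point-estimate Pc into a
    # distribution of plausible Pc values.
    pcs = []
    for _ in range(200):
        scale = rng.lognormal(mean=0.0, sigma=0.3)  # covariance realism factor
        hbr = rng.uniform(0.8, 1.2) * hbr_nom       # object-size uncertainty
        pcs.append(pc(scale * sig_nom, hbr, n=50_000))

    pcs = np.array(pcs)
    print(f"point estimate Pc ~ {pc(sig_nom, hbr_nom):.2e}")
    print(f"resampled Pc: median {np.median(pcs):.2e}, "
          f"95th pct {np.percentile(pcs, 95):.2e}")
    ```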

  2. Risk analysis under uncertainty, the precautionary principle, and the new EU chemicals strategy.

    PubMed

    Rogers, Michael D

    2003-06-01

    Three categories of uncertainty in relation to risk assessment are defined: uncertainty in effect, uncertainty in cause, and uncertainty in the relationship between a hypothesised cause and effect. The Precautionary Principle (PP) relates to the third type of uncertainty. Three broad descriptions of the PP are set out: uncertainty justifies action, uncertainty requires action, and uncertainty requires a reversal of the burden of proof for risk assessments. The application of the PP is controversial but what matters in practice is the precautionary action (PA) that follows. The criteria by which the PAs should be judged are detailed. This framework for risk assessment and management under uncertainty is then applied to the envisaged European system for the regulation of chemicals. A new EU regulatory system has been proposed which shifts the burden of proof concerning risk assessments from the regulator to the producer, and embodies the PP in all three of its main regulatory stages. The proposals are critically discussed in relation to three chemicals, namely, atrazine (an endocrine disrupter), cadmium (toxic and possibly carcinogenic), and hydrogen fluoride (a toxic, high-production-volume chemical). Reversing the burden of proof will speed up the regulatory process but the examples demonstrate that applying the PP appropriately, and balancing the countervailing risks and the socio-economic benefits, will continue to be a difficult task for the regulator. The paper concludes with a discussion of the role of precaution in the management of change and of the importance of trust in the effective regulation of uncertain risks.

  3. A review of uncertainty research in impact assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, Wanda, E-mail: wanda.leung@usask.ca; Noble, Bram, E-mail: b.noble@usask.ca; Gunn, Jill, E-mail: jill.gunn@usask.ca

    2015-01-15

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then applying these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA. - Highlights: • We identified three main themes of uncertainty research in 134 papers from the scholarly literature. • The majority of research has focused on better methods for managing uncertainty in predictions. • Uncertainty disclosure is demanded of practitioners, but there is little guidance on how to do so. • There is limited theoretical explanation as to why uncertainty is avoided or not disclosed. • Conceptual, practical and theoretical guidance are required for IA uncertainty consideration.

  4. Development and comparison in uncertainty assessment based Bayesian modularization method in hydrological modeling

    NASA Astrophysics Data System (ADS)

    Li, Lu; Xu, Chong-Yu; Engeland, Kolbjørn

    2013-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, various regression and probabilistic approaches are used in hydrological modeling. A family of Bayesian methods, which incorporates different sources of information into a single analysis through Bayes' theorem, is widely used for uncertainty assessment. However, none of these approaches can adequately treat the impact of high flows in hydrological modeling. This study proposes a Bayesian modularization uncertainty assessment approach in which the highest streamflow observations are treated as suspect information that should not influence the inference of the main bulk of the model parameters. This study includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization method and standard Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions were used in combination with the standard Bayesian method: the AR(1) plus Normal model independent of time (Model 1), the AR(1) plus Normal model dependent on time (Model 2) and the AR(1) plus Multi-normal model (Model 3). The results reveal that the Bayesian modularization method provides the most accurate streamflow estimates as measured by the Nash-Sutcliffe efficiency and the best uncertainty estimates for low, medium and entire flows compared to standard Bayesian methods. The study thus provides a new approach for reducing the impact of high flows on the discharge uncertainty assessment of hydrological models via the Bayesian method.
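
    For readers unfamiliar with the MH machinery referenced here, below is a minimal random-walk Metropolis-Hastings sketch calibrating one parameter of a toy linear-reservoir model under a Gaussian likelihood; WASMOD, the AR(1) likelihoods and the modularization scheme itself are not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic "observed" streamflow from a toy linear-reservoir model.
    rain = rng.gamma(2.0, 2.0, 100)

    def simulate(k):
        q, s = np.empty_like(rain), 10.0
        for t, p in enumerate(rain):
            s += p          # add rainfall to storage
            q[t] = k * s    # outflow proportional to storage
            s -= q[t]
        return q

    q_obs = simulate(0.3) + rng.normal(0.0, 0.5, 100)

    def log_post(k, sigma=0.5):
        if not 0.0 < k < 1.0:
            return -np.inf                       # uniform prior on (0, 1)
        r = q_obs - simulate(k)
        return -0.5 * np.sum((r / sigma) ** 2)   # Gaussian log-likelihood

    # Random-walk Metropolis-Hastings.
    k, lp = 0.5, log_post(0.5)
    chain = []
    for _ in range(5000):
        k_new = k + 0.02 * rng.standard_normal()
        lp_new = log_post(k_new)
        if np.log(rng.uniform()) < lp_new - lp:  # accept/reject step
            k, lp = k_new, lp_new
        chain.append(k)

    post = np.array(chain[1000:])                # discard burn-in
    print(f"k: posterior mean {post.mean():.3f}, 95% CI "
          f"[{np.percentile(post, 2.5):.3f}, {np.percentile(post, 97.5):.3f}]")
    ```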

  5. Error and Uncertainty in the Accuracy Assessment of Land Cover Maps

    NASA Astrophysics Data System (ADS)

    Sarmento, Pedro Alexandre Reis

    Traditionally, the accuracy assessment of land cover maps is performed through the comparison of these maps with a reference database, which is intended to represent the "real" land cover, with this comparison reported through thematic accuracy measures derived from confusion matrices. However, these reference databases are also a representation of reality, containing errors due to the human uncertainty in assigning the land cover class that best characterizes a certain area, which biases the thematic accuracy measures reported to the end users of these maps. The main goal of this dissertation is to develop a methodology that allows the integration of the human uncertainty present in reference databases into the accuracy assessment of land cover maps, and to analyse the impacts that uncertainty may have on the thematic accuracy measures reported to the end users of land cover maps. The utility of the inclusion of human uncertainty in the accuracy assessment of land cover maps is investigated. Specifically, we studied the utility of fuzzy sets theory, more precisely of fuzzy arithmetic, for a better understanding of the human uncertainty associated with the elaboration of reference databases, and its impacts on the thematic accuracy measures that are derived from confusion matrices. For this purpose, linguistic values transformed into fuzzy intervals that address the uncertainty in the elaboration of reference databases were used to compute fuzzy confusion matrices. The proposed methodology is illustrated using a case study in which the accuracy assessment of a land cover map for Continental Portugal derived from Medium Resolution Imaging Spectrometer (MERIS) data is made. The obtained results demonstrate that the inclusion of human uncertainty in reference databases provides much more information about the quality of land cover maps, when compared with the traditional approach to the accuracy assessment of land cover maps.
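
    A heavily simplified, interval-arithmetic version of the idea, turning crisp reference labels into per-class confidence intervals and propagating them into the overall accuracy, is sketched below; the counts and trust intervals are invented, and the dissertation's full fuzzy-arithmetic treatment is richer than this.

    ```python
    import numpy as np

    # Conventional confusion matrix (rows: map classes, cols: reference
    # classes) for three land cover classes (hypothetical counts).
    cm = np.array([
        [50,  4,  1],
        [ 6, 40,  9],
        [ 2,  8, 30],
    ])
    overall = np.trace(cm) / cm.sum()
    print(f"crisp overall accuracy: {overall:.2f}")

    # Interval-valued variant: each reference label is trusted only to a
    # degree expressed by the interpreter, e.g. "sure" = [1.0, 1.0],
    # "quite sure" = [0.7, 1.0]. One interval per class is assumed here.
    trust = np.array([[0.9, 1.0], [0.7, 1.0], [0.6, 0.9]])  # per-class [lo, hi]

    diag = np.diag(cm).astype(float)
    correct_lo = diag @ trust[:, 0]   # interval arithmetic on the diagonal
    correct_hi = diag @ trust[:, 1]
    print(f"interval overall accuracy: "
          f"[{correct_lo / cm.sum():.2f}, {correct_hi / cm.sum():.2f}]")
    ```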

  6. Multi-model seasonal forecast of Arctic sea-ice: forecast uncertainty at pan-Arctic and regional scales

    NASA Astrophysics Data System (ADS)

    Blanchard-Wrigglesworth, E.; Barthélemy, A.; Chevallier, M.; Cullather, R.; Fučkar, N.; Massonnet, F.; Posey, P.; Wang, W.; Zhang, J.; Ardilouze, C.; Bitz, C. M.; Vernieres, G.; Wallcraft, A.; Wang, M.

    2017-08-01

    Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or forecast post-processing (bias correction) techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.

  7. Statistical uncertainty of extreme wind storms over Europe derived from a probabilistic clustering technique

    NASA Astrophysics Data System (ADS)

    Walz, Michael; Leckebusch, Gregor C.

    2016-04-01

    Extratropical wind storms pose one of the most dangerous and loss intensive natural hazards for Europe. However, with only 50 years of high quality observational data, it is difficult to assess the statistical uncertainty of these sparse events based on observations alone. Over the last decade seasonal ensemble forecasts have become indispensable in quantifying the uncertainty of weather prediction on seasonal timescales. In this study seasonal forecasts are used in a climatological context: by making use of the up to 51 ensemble members, a broad and physically consistent statistical base can be created. This base can then be used to assess the statistical uncertainty of extreme wind storm occurrence more accurately. In order to determine the statistical uncertainty of storms with different paths of progression, a probabilistic clustering approach using regression mixture models is used to objectively assign storm tracks (either based on core pressure or on extreme wind speeds) to different clusters. The advantage of this technique is that the entire lifetime of a storm is considered in the clustering algorithm. Quadratic curves are found to describe the storm tracks most accurately. Three main clusters (diagonal, horizontal or vertical progression of the storm track) can be identified, each of which has its own particular features. Basic storm features like average velocity and duration are calculated and compared for each cluster. The main benefit of this clustering technique, however, is to evaluate whether the clusters show different degrees of uncertainty, e.g. more (less) spread for tracks approaching Europe horizontally (diagonally). This statistical uncertainty is compared for different seasonal forecast products.
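
    The sketch below mimics the track-clustering step on synthetic tracks, describing each storm's whole lifetime by its quadratic-fit coefficients and then clustering those; note it substitutes plain k-means (scikit-learn) for the probabilistic regression mixture model actually used, and all tracks are fabricated.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(4)

    # Hypothetical storm tracks: (lon, lat) points over a storm's lifetime.
    def make_track(a, b, c, n=20):
        lon = np.linspace(-60, 20, n)
        lat = a + b * lon + c * lon ** 2 + rng.normal(0, 0.5, n)
        return lon, lat

    tracks = ([make_track(50, 0.05, 0.000) for _ in range(30)] +   # zonal
              [make_track(45, 0.30, 0.002) for _ in range(30)] +   # diagonal
              [make_track(40, 0.60, 0.004) for _ in range(30)])    # steep

    # Describe each whole track by its quadratic-fit coefficients, so the
    # entire lifetime enters the clustering, then cluster those vectors.
    coeffs = np.array([np.polyfit(lon, lat, 2) for lon, lat in tracks])
    z = (coeffs - coeffs.mean(axis=0)) / coeffs.std(axis=0)  # standardize
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(z)

    for k in range(3):
        print(f"cluster {k}: {np.sum(labels == k)} tracks")
    ```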

  8. The evaluation of uncertainty in low-level LSC measurements of water samples.

    PubMed

    Rusconi, R; Forte, M; Caresana, M; Bellinzona, S; Cazzaniga, M T; Sgorbati, G

    2006-01-01

    The uncertainty in measurements of gross alpha and beta activities in water samples by liquid scintillation counting with alpha/beta discrimination has been evaluated considering the problems typical of low-level measurements of environmental samples. The use of a pulse shape analysis device to discriminate alpha and beta events introduces a correlation between some of the input quantities, and it has to be considered. Main contributors to total uncertainty have been assessed by specifically designed experimental tests. Results have been fully examined and discussed.
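
    The correlated-inputs point generalizes the usual GUM combined-uncertainty formula with a covariance term; below is a sketch for a gross alpha activity concentration with an assumed correlation between the gross and background count rates. All numerical values, including the correlation coefficient, are hypothetical.

    ```python
    import numpy as np

    # Gross alpha activity concentration: A = (n - b) / (eff * vol).
    # Hypothetical values: count rates [cps], efficiency, sample volume [L].
    n, b = 0.050, 0.010          # gross and background alpha count rates
    eff, vol = 0.90, 0.50
    u_n, u_b, u_eff, u_vol = 0.004, 0.002, 0.03, 0.005
    r_nb = 0.3   # assumed correlation between gross and background rates,
                 # e.g. induced by a shared pulse-shape discrimination setting

    A = (n - b) / (eff * vol)

    # Sensitivity coefficients (partial derivatives of A).
    c_n = 1.0 / (eff * vol)
    c_b = -1.0 / (eff * vol)
    c_eff = -(n - b) / (eff ** 2 * vol)
    c_vol = -(n - b) / (eff * vol ** 2)

    # GUM combined variance, including the covariance term for n and b.
    u2 = ((c_n * u_n) ** 2 + (c_b * u_b) ** 2 + (c_eff * u_eff) ** 2
          + (c_vol * u_vol) ** 2 + 2 * c_n * c_b * r_nb * u_n * u_b)
    print(f"A = {A:.3f} Bq/L, u(A) = {np.sqrt(u2):.3f} Bq/L")
    ```

    Because the sensitivity coefficients of the gross and background rates have opposite signs, a positive correlation between them reduces the combined uncertainty of their difference.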

  9. Incorporating uncertainty in predictive species distribution modelling.

    PubMed

    Beale, Colin M; Lennon, Jack J

    2012-01-19

    Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and source of which is often unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been performed, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where development of new statistical tools will improve predictions from distribution models, notably the development of hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, quantifying model goodness-of-fit and for assessing the significance of model covariates.

  10. Improving the Method of Roof Fall Susceptibility Assessment based on Fuzzy Approach

    NASA Astrophysics Data System (ADS)

    Ghasemi, Ebrahim; Ataei, Mohammad; Shahriar, Kourosh

    2017-03-01

    Retreat mining is always accompanied by a great number of accidents, and most of them are due to roof fall. Therefore, the development of methodologies to evaluate the roof fall susceptibility (RFS) seems essential. Ghasemi et al. (2012) proposed a systematic methodology to assess the roof fall risk during retreat mining based on the classic risk assessment approach. The main defect of this method is that it ignores the subjective uncertainties arising from the linguistic input values of some factors, low resolution, fixed weighting, sharp class boundaries, etc. To remedy this defect and improve the method, in this paper a novel methodology is presented to assess the RFS using a fuzzy approach. The application of the fuzzy approach provides an effective tool to handle the subjective uncertainties. Furthermore, the fuzzy analytical hierarchy process (AHP) is used to structure and prioritize the various risk factors and sub-factors during the development of this method. This methodology is applied to identify the susceptibility of roof fall occurrence in the main panel of Tabas Central Mine (TCM), Iran. The results indicate that this methodology is effective and efficient in assessing RFS.

  11. A Review On Accuracy and Uncertainty of Spatial Data and Analyses with special reference to Urban and Hydrological Modelling

    NASA Astrophysics Data System (ADS)

    Devendran, A. A.; Lakshmanan, G.

    2014-11-01

    Data quality for GIS processing and analysis is becoming an increasing concern due to the accelerated application of GIS technology in problem solving and decision making roles. Uncertainty in the geographic representation of the real world arises because these representations are incomplete. Identifying the sources of these uncertainties and the ways in which they operate in GIS based representations is crucial in any spatial data representation and geospatial analysis applied to any field of application. This paper reviews articles on the various components of spatial data quality and the uncertainties inherent in them, with special focus on two fields of application: urban simulation and hydrological modelling. Urban growth is a complicated process involving the spatio-temporal changes of all socio-economic and physical components at different scales. The Cellular Automata (CA) model is one of the simulation models; it randomly selects potential cells for urbanisation, and its transition rules evaluate the properties of each cell and its neighbours. Uncertainty arising from CA modelling is assessed mainly using sensitivity analysis, including the Monte Carlo simulation method. Likewise, the importance of hydrological uncertainty analysis has been emphasized in recent years, and there is an urgent need to incorporate uncertainty estimation into water resources assessment procedures. The Soil and Water Assessment Tool (SWAT) is a continuous time watershed model used to evaluate various impacts of land use management and climate on hydrology and water quality. Hydrological model uncertainties using the SWAT model are addressed primarily with the Generalized Likelihood Uncertainty Estimation (GLUE) method.
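
    Since GLUE is named as the primary method here, a compact sketch of its logic on a toy rainfall-runoff relation follows: sample parameters from broad priors, keep "behavioural" sets above a likelihood threshold (Nash-Sutcliffe efficiency here), and form likelihood-weighted prediction bounds. The model, threshold and data are all invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    # Synthetic observations from a toy rainfall-runoff relation q = a * p**b.
    p = rng.gamma(2.0, 3.0, 60)
    q_obs = 0.7 * p ** 1.2 + rng.normal(0.0, 0.8, 60)

    def nse(sim, obs):
        """Nash-Sutcliffe efficiency, used here as the GLUE likelihood measure."""
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    # GLUE: sample parameter sets from broad priors and keep the
    # "behavioural" ones whose likelihood exceeds a threshold.
    a = rng.uniform(0.1, 2.0, 20_000)
    b = rng.uniform(0.5, 2.0, 20_000)
    scores = np.array([nse(ai * p ** bi, q_obs) for ai, bi in zip(a, b)])

    keep = scores > 0.7
    w = scores[keep] - 0.7                # rescaled likelihood weights
    w /= w.sum()

    # Likelihood-weighted 5-95% bounds for the prediction at p = 10 mm.
    sims = a[keep] * 10.0 ** b[keep]
    order = np.argsort(sims)
    lo, hi = np.interp([0.05, 0.95], np.cumsum(w[order]), sims[order])
    print(f"{keep.sum()} behavioural sets; q(p=10) in [{lo:.1f}, {hi:.1f}]")
    ```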

  12. An application of a hydraulic model simulator in flood risk assessment under changing climatic conditions

    NASA Astrophysics Data System (ADS)

    Doroszkiewicz, J. M.; Romanowicz, R. J.

    2016-12-01

    The standard procedure of climate change impact assessment on future hydrological extremes consists of a chain of consecutive actions, starting from the choice of a GCM driven by an assumed CO2 scenario, through downscaling of climatic forcing to a catchment scale, estimation of hydrological extreme indices using hydrological modelling tools and subsequent derivation of flood risk maps with the help of a hydraulic model. Among the many possible sources of uncertainty, the main ones are the uncertainties related to future climate scenarios, climate models, downscaling techniques and hydrological and hydraulic models. Unfortunately, we cannot directly assess the impact of these different sources of uncertainty on future flood risk due to the lack of observations of future climate realizations. The aim of this study is an assessment of the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the processes involved, an assessment of the total uncertainty of maps of inundation probability can be very computationally demanding. As a way forward we present an application of a hydraulic model simulator based on a nonlinear transfer function model for the chosen locations along the river reach. The transfer function model parameters are estimated based on the simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of a simulator substantially reduces the computational requirements related to the derivation of flood risk maps under future climatic conditions. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections obtained from the EURO-CORDEX project. The study describes a cascade of uncertainty related to the different stages of the process of deriving flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow routing model, the propagation of that uncertainty through the hydraulic model, and finally the uncertainty related to the derivation of flood risk maps.

  13. Quantitative health impact assessment: taking stock and moving forward.

    PubMed

    Fehr, Rainer; Hurley, Fintan; Mekel, Odile Cecile; Mackenbach, Johan P

    2012-12-01

    Over the past years, application of health impact assessment has increased substantially, and there has been a strong growth of tools that allow quantification of health impacts for a range of health relevant policies. We review these developments, and conclude that further tool development is no longer a main priority, although several aspects need to be further developed, such as methods to assess impacts on health inequalities and to assess uncertainties. The main new challenges are, first, to conduct a comparative evaluation of different tools, and, second, to ensure the maintenance and continued availability of the toolkits including their data contents.

  14. On different types of uncertainties in the context of the precautionary principle.

    PubMed

    Aven, Terje

    2011-10-01

    Few policies for risk management have created more controversy than the precautionary principle. A main problem is the extreme number of different definitions and interpretations. Almost all definitions of the precautionary principle identify "scientific uncertainties" as the trigger or criterion for its invocation; however, the meaning of this concept is not clear. For applying the precautionary principle it is not sufficient that the threats or hazards are uncertain. A stronger requirement is needed. This article provides an in-depth analysis of this issue. We question how the scientific uncertainties are linked to the interpretation of the probability concept, expected values, the results from probabilistic risk assessments, the common distinction between aleatory uncertainties and epistemic uncertainties, and the problem of establishing an accurate prediction model (cause-effect relationship). A new classification structure is suggested to define what scientific uncertainties mean. © 2011 Society for Risk Analysis.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webster, Mort D.

    This report presents the final outcomes and products of the project as performed both at the Massachusetts Institute of Technology and subsequently at Pennsylvania State University. The research project can be divided into three main components: methodology development for decision-making under uncertainty, improving the resolution of the electricity sector to improve integrated assessment, and application of these methods to integrated assessment.

  16. Uncertainty Model for Total Solar Irradiance Estimation on Australian Rooftops

    NASA Astrophysics Data System (ADS)

    Al-Saadi, Hassan; Zivanovic, Rastko; Al-Sarawi, Said

    2017-11-01

    The installation of solar panels on Australian rooftops has been on the rise over the last few years, especially in urban areas. This motivates academic researchers, distribution network operators and engineers to accurately address the level of uncertainty resulting from grid-connected solar panels. The main source of uncertainty is the intermittent nature of radiation; therefore, this paper presents a new model to estimate the total radiation incident on a tilted solar panel. The model is driven by the clearness index, whose probability distribution is factorized, with special attention paid to Australian conditions through the use of a best-fit correlation for the diffuse fraction. The validity of the model is assessed with four goodness-of-fit techniques. In addition, the quasi Monte Carlo and sparse grid methods are used as sampling and uncertainty-computation tools, respectively. High-resolution solar irradiance data for the city of Adelaide were used for this assessment, with the outcome indicating satisfactory agreement between the actual data variation and the model.
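
    As an illustration of the sampling step, the sketch below uses scipy's quasi Monte Carlo (Sobol) sampler to propagate an assumed clearness-index distribution to plane-of-array irradiance; the Beta parameters, diffuse-fraction correlation and transposition factors are placeholders, not the fitted Australian correlations of the paper:

    ```python
    import numpy as np
    from scipy.stats import qmc, beta

    # Quasi Monte Carlo (Sobol) sampling of the clearness index kt, here
    # assumed Beta-distributed (illustrative parameters, not fitted data).
    sampler = qmc.Sobol(d=1, scramble=True, seed=0)
    u = sampler.random_base2(m=12).ravel()            # 4096 low-discrepancy points
    kt = beta.ppf(u, a=4.0, b=2.0)                    # clearness index samples

    G0 = 1000.0                                       # horizontal extraterrestrial irradiance, W/m2 (assumed)
    gh = kt * G0                                      # global horizontal irradiance
    df = np.clip(1.0 - 1.1 * (kt - 0.25), 0.15, 1.0)  # toy diffuse-fraction correlation (placeholder)
    tilt = np.radians(30.0)

    # Simple isotropic-sky transposition to a tilted plane (illustrative
    # beam factor of 1.1, not a fitted geometric factor).
    g_tilt = gh * (1 - df) * 1.1 + gh * df * (1 + np.cos(tilt)) / 2
    print(f"mean = {g_tilt.mean():.0f} W/m2, 95% interval = "
          f"({np.percentile(g_tilt, 2.5):.0f}, {np.percentile(g_tilt, 97.5):.0f})")
    ```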

  17. Area 2: Inexpensive Monitoring and Uncertainty Assessment of CO2 Plume Migration using Injection Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Srinivasan, Sanjay

    2014-09-30

    In-depth understanding of the long-term fate of CO₂ in the subsurface requires study and analysis of the reservoir formation, the overlying caprock formation, and adjacent faults. Because there is significant uncertainty in predicting the location and extent of geologic heterogeneity that can impact the future migration of CO₂ in the subsurface, there is a need to develop algorithms that can reliably quantify this uncertainty in plume migration. This project is focused on the development of a model selection algorithm that refines an initial suite of subsurface models representing the prior uncertainty to create a posterior set of subsurface models that reflect injection performance consistent with that observed. Such posterior models can be used to represent uncertainty in the future migration of the CO₂ plume. Because only injection data are required, the method provides a very inexpensive way to map the migration of the plume and the associated uncertainty in migration paths. The model selection method developed as part of this project consists of assessing the connectivity/dynamic characteristics of a large prior ensemble of models, grouping the models on the basis of their expected dynamic response, selecting the subgroup of models whose dynamic response is closest to the observed dynamic data, and finally quantifying the uncertainty in plume migration using the selected subset of models. The main accomplishment of the project is the development of a software module within the SGEMS earth modeling software package that implements the model selection methodology. This software module was subsequently applied to analyze CO₂ plume migration in two field projects – the In Salah CO₂ Injection project in Algeria and CO₂ injection into the Utsira formation in Norway. These applications of the software revealed that the proxies developed in this project for quickly assessing the dynamic characteristics of the reservoir were highly efficient and yielded accurate grouping of reservoir models. The plume migration paths probabilistically assessed by the method were confirmed by field observations and auxiliary data. The report also documents the application of the software to answer practical questions such as the optimum location of monitoring wells to reliably assess the migration of the CO₂ plume, the effect of CO₂-rock interactions on plume migration and the ability to detect the plume under those conditions, and the effect of a slow, unresolved leak on predictions of plume migration.
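
    A minimal sketch of the grouping-and-selection idea (not the SGEMS implementation), using synthetic proxy responses and k-means clustering:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)

    # Proxy dynamic responses (e.g., simulated injection pressure at 10 times)
    # for a prior ensemble of 500 subsurface models -- synthetic stand-ins here.
    prior_responses = rng.normal(size=(500, 10)).cumsum(axis=1)
    observed = prior_responses[42] + rng.normal(scale=0.1, size=10)  # pseudo-observation

    # Group the models by the similarity of their dynamic response...
    km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(prior_responses)

    # ...then keep the subgroup whose centroid is closest to the observed data.
    best = np.argmin(np.linalg.norm(km.cluster_centers_ - observed, axis=1))
    posterior_idx = np.where(km.labels_ == best)[0]
    print(f"{posterior_idx.size} of 500 prior models retained for plume forecasting")
    ```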

  18. Communication and perception of uncertainty via graphics in disciplinary and interdisciplinary climate change research

    NASA Astrophysics Data System (ADS)

    Lackner, Bettina C.; Kirchengast, Gottfried

    2015-04-01

    Besides written and spoken language, graphical displays play an important role in communicating scientific findings or explaining scientific methods, both within one discipline and between various disciplines. Uncertainties and probabilities are generally difficult to communicate, especially via graphics. Graphics including uncertainty sometimes need detailed written or oral descriptions to be understood. "Good" graphics should ease scientific communication, especially amongst different disciplines. One key objective of the Doctoral Programme "Climate Change: Uncertainties, Thresholds and Coping Strategies" (http://dk-climate-change.uni-graz.at/en/), located at the University of Graz, is to reach a better understanding of climate change uncertainties by bridging research in multiple disciplines, including physical climate sciences, geosciences, systems and sustainability sciences, environmental economics, and climate ethics. This calls for efforts toward the formulation of a "common language", not only in words but also in graphics. The focus of this work is on two topics: (1) What different kinds of uncertainties (e.g., data uncertainty, model uncertainty) are included in the graphics of the recent IPCC reports of all three working groups (WGs), and in what ways do uncertainties get illustrated? (2) How are these graphically displayed uncertainties perceived by researchers from a discipline similar to that of the graphics' authors, and by researchers from different disciplines? To answer the first question, the IPCC graphics including uncertainties are grouped and analyzed with respect to the different kinds of uncertainties, in order to identify the most commonly used types of displays. The graphics will also be analyzed with respect to their WG origin, as we assume that graphics from researchers rooted in, e.g., physical climate sciences and geosciences (mainly IPCC WG 1) differ from those of researchers rooted in, e.g., economics or system sciences (mainly WG 3). In a subsequent analysis, some basic types of graphics displaying uncertainty are selected to serve as input for the construction of "makeshift graphics" (displaying only the main features but including no detailed title or caption). These makeshift graphics are then used to assess how the displayed features are perceived and understood by researchers of various disciplines. In this initial study, the analysis will be based on the results of a workshop including the wide diversity of researchers within the FWF-DK Climate Change. We will present first results of this work.

  19. Geographical scenario uncertainty in generic fate and exposure factors of toxic pollutants for life-cycle impact assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huijbregts, Mark A.J.; Lundi, Sven; McKone, Thomas E.

    In environmental life-cycle assessments (LCA), fate and exposure factors account for the general fate and exposure properties of chemicals under generic environmental conditions by means of 'evaluative' multi-media fate and exposure box models. To assess the effect of using different generic environmental conditions, fate and exposure factors of chemicals emitted under typical conditions of (1) Western Europe, (2) Australia and (3) the United States of America were compared with the multi-media fate and exposure box model USES-LCA. Comparing the results of the three evaluative environments, it was found that the uncertainty in fate and exposure factors for ecosystems and humans due to the choice of an evaluative environment, as represented by the ratio of the 97.5th and 50th percentiles, is between a factor of 2 and 10. In particular, fate and exposure factors of emissions causing effects in freshwater ecosystems and effects on human health have relatively high uncertainty. This uncertainty is mainly caused by the continental differences in the average soil erosion rate, the dimensions of the fresh water and agricultural soil compartments, and the fraction of drinking water coming from ground water.

  20. Analyzing the uncertainty of ensemble-based gridded observations in land surface simulations and drought assessment

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, Ali; Moradkhani, Hamid

    2017-12-01

    Hydrologic modeling is one of the primary tools utilized for drought monitoring and drought early warning systems. Several sources of uncertainty in hydrologic modeling have been addressed in the literature. However, few studies have assessed the uncertainty of gridded observation datasets from a drought monitoring perspective. This study provides a hydrologic-modeling-oriented analysis of gridded observation data uncertainties over the Pacific Northwest (PNW) and their implications for drought assessment. We utilized a recently developed 100-member ensemble-based observed forcing dataset to simulate hydrologic fluxes at 1/8° spatial resolution using the Variable Infiltration Capacity (VIC) model, and compared the results with a deterministic observation. Meteorological and hydrological droughts are studied at multiple timescales over the basin, and seasonal long-term trends and variations of drought extent are investigated for each case. Results reveal large uncertainty of the observed datasets at the monthly timescale, with systematic differences for temperature records, mainly due to different lapse rates. This uncertainty results in large disparities in drought characteristics. In general, an increasing trend is found for winter drought extent across the PNW. Furthermore, a decrease of ∼3% per decade is detected for snow water equivalent (SWE) over the PNW, with the region being more susceptible to SWE variations of the northern Rockies than of the western Cascades. The agricultural areas of southern Idaho demonstrate a decreasing trend in natural soil moisture as a result of precipitation decline, which implies a growing reliance on anthropogenic water storage and irrigation systems.

  1. Methodology for qualitative uncertainty assessment of climate impact indicators

    NASA Astrophysics Data System (ADS)

    Otto, Juliane; Keup-Thiel, Elke; Rechid, Diana; Hänsler, Andreas; Pfeifer, Susanne; Roth, Ellinor; Jacob, Daniela

    2016-04-01

    The FP7 project "Climate Information Portal for Copernicus" (CLIPC) is developing an integrated platform of climate data services to provide a single point of access for authoritative scientific information on climate change and climate change impacts. In this project, the Climate Service Center Germany (GERICS) has been in charge of developing a methodology for assessing the uncertainties related to climate impact indicators. Existing climate data portals mainly treat uncertainties in two ways: either they provide generic guidance, and/or they express with statistical measures the quantifiable fraction of the uncertainty. However, none of the climate data portals gives users qualitative guidance on how confident they can be in the validity of the displayed data. The need for such guidance was identified in CLIPC user consultations. We therefore aim to provide an uncertainty assessment that gives users climate-impact-indicator-specific guidance on the degree to which they can trust the outcome. We will present an approach that provides information on the importance of the different sources of uncertainty associated with a specific climate impact indicator and on how these sources affect the overall 'degree of confidence' in this indicator. To meet users' requirements for effective communication of uncertainties, their feedback was incorporated during the development of the methodology. Assessing and visualising the quantitative component of uncertainty is part of the qualitative guidance. As a visual analysis method, we apply Climate Signal Maps (Pfeifer et al. 2015), which highlight only those areas with robust climate change signals. Here, robustness is defined as a combination of model agreement and the significance of the individual model projections. Reference: Pfeifer, S., Bülow, K., Gobiet, A., Hänsler, A., Mudelsee, M., Otto, J., Rechid, D., Teichmann, C. and Jacob, D.: Robustness of Ensemble Climate Projections Analyzed with Climate Signal Maps: Seasonal and Extreme Precipitation for Germany, Atmosphere (Basel), 6(5), 677-698, doi:10.3390/atmos6050677, 2015.
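
    The robustness criterion behind the Climate Signal Maps can be sketched in a few lines of Python; the agreement threshold, significance test and synthetic ensemble below are illustrative assumptions, not the exact choices of Pfeifer et al. (2015):

    ```python
    import numpy as np
    from scipy.stats import ttest_1samp

    rng = np.random.default_rng(0)

    # Synthetic change signals: 12 ensemble members on a 20x20 grid, e.g.
    # projected change in seasonal precipitation (%) per grid cell.
    changes = rng.normal(loc=5.0, scale=8.0, size=(12, 20, 20))

    # Robustness in the spirit of Climate Signal Maps: require both model
    # agreement on the sign of change and significance of the signal.
    agree = np.mean(np.sign(changes) == np.sign(np.median(changes, axis=0)), axis=0)
    pval = ttest_1samp(changes, popmean=0.0, axis=0).pvalue
    robust = (agree >= 0.8) & (pval < 0.05)
    print(f"robust signal on {robust.mean():.0%} of grid cells")
    ```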

  2. To be or not to be: How do we speak about uncertainty in public?

    NASA Astrophysics Data System (ADS)

    Todesco, Micol; Lolli, Barbara; Sheldrake, Tom; Odbert, Henry

    2016-04-01

    One of the challenges related to hazard communication concerns the public perception and understanding of scientific uncertainties, and of their implications in terms of hazard assessment and mitigation. Often science is perceived as an effective dispenser of resolving answers to the main issues posed by the complexities of life and nature. In this perspective, uncertainty is seen as a pernicious lack of knowledge that hinders our ability to face complex problems. From a scientific perspective, however, the definition of uncertainty is the only valuable tool we have to handle the errors affecting our data and propagating through the increasingly complex models we develop to describe reality. Through uncertainty, scientists acknowledge the great variability that characterises natural systems and account for it in their assessment of possible scenarios. From this point of view, uncertainty is not ignorance; rather, it provides a great deal of the information needed to inform decision making. To find effective ways to bridge the gap between these different meanings of uncertainty, we asked high-school students for assistance. With their help, we gathered definitions of the term 'uncertainty' by interviewing different categories of people, including schoolmates and professors, neighbours, families and friends. These definitions will be compared with those provided by scientists, to find differences and similarities. To understand the role of uncertainty in judgment, a hands-on experiment is performed in which students estimate the exact time of explosion of party poppers subjected to a variable degree of pull. At the end of the project, the students will express their own understanding of uncertainty in a video, which will be made available for sharing. Materials collected during all the activities will contribute to our understanding of how uncertainty is portrayed and can be better expressed to improve our hazard communication.

  3. Communicating uncertainties in assessments of future sea level rise

    NASA Astrophysics Data System (ADS)

    Wikman-Svahn, P.

    2013-12-01

    How uncertainty should be managed and communicated in policy-relevant scientific assessments is directly connected to the role of science and the responsibility of scientists. These fundamentally philosophical issues influence how scientific assessments are made and how scientific findings are communicated to policymakers. It is therefore of high importance to discuss the implicit assumptions and value judgments that are made in policy-relevant scientific assessments. The present paper examines these issues for the case of scientific assessments of future sea level rise. The magnitude of future sea level rise is very uncertain, mainly due to poor scientific understanding of all physical mechanisms affecting the great ice sheets of Greenland and Antarctica, which together hold enough land-based ice to raise sea levels more than 60 meters if completely melted. There has been much confusion among policymakers on how different assessments of future sea levels should be interpreted. Much of this confusion is probably due to how uncertainties are characterized and communicated in these assessments. The present paper draws on the recent philosophical debate on the so-called "value-free ideal of science" - the view that science should not be based on social and ethical values. Issues related to how uncertainty is handled in scientific assessments are central to this debate. This literature has focused largely on how uncertainty in data, parameters or models implies that choices have to be made, which can have social consequences. However, less emphasis has been placed on how uncertainty is characterized when communicating the findings of a study, which is the focus of the present paper. The paper argues that there is a tension between, on the one hand, the value-free ideal of science and, on the other hand, usefulness for practical applications in society. This means that even if the value-free ideal could be upheld in theory, by carefully constructing and hedging statements characterizing scientific uncertainty, it will in most cases not be very useful for society. Instead, it is argued that scientific assessments that are used to inform societal decision-making should try to anticipate applications and aim to construct statements that characterize knowledge and uncertainty in a way that is more useful for those anticipated applications, even if this means that the value-free ideal cannot be upheld. This means that scientific assessments should ideally be intertwined with societal applications, and that co-produced knowledge engaging both scientists and end-users is likely to provide better and more useful assessments. The argument is illustrated using real examples from scientific assessments of future sea level rise, with special emphasis on the approaches used by the Intergovernmental Panel on Climate Change in the fourth assessment report from 2007, and the fifth assessment report due in September 2013. Finally, it is argued that recent developments in "bottom-up" and "robust" decision-making frameworks provide a way forward to remove many of the pitfalls and problems of communicating uncertainties in policy-relevant scientific assessments.

  4. Short-term ensemble radar rainfall forecasts for hydrological applications

    NASA Astrophysics Data System (ADS)

    Codo de Oliveira, M.; Rico-Ramirez, M. A.

    2016-12-01

    Flooding is a very common natural disaster around the world, putting local populations and economies at risk. Forecasting floods several hours ahead and issuing warnings are of major importance to permit a proper response in emergency situations. However, it is important to know the uncertainties related to rainfall forecasting in order to produce more reliable forecasts. Nowcasting models (short-term rainfall forecasts) are able to produce high spatial and temporal resolution predictions that are useful in hydrological applications. Nonetheless, they are subject to uncertainties mainly due to the nowcasting model used, errors in radar rainfall estimation, the temporal development of the velocity field, and the fact that precipitation processes such as growth and decay are not taken into account. In this study an ensemble generation scheme using rain gauge data as a reference to estimate radar errors is used to produce forecasts with up to 3 h lead time. The ensembles attempt to assess in a realistic way the residual uncertainties that remain even after correction algorithms are applied to the radar data. The ensembles produced are compared to those of a stochastic ensemble generator. Furthermore, the rainfall forecast output was used as input to a hydrodynamic sewer network model and to a hydrological model for catchments of different sizes in northern England. A comparative analysis was carried out to assess how the radar uncertainties propagate through these models. The first named author is grateful to CAPES - Ciencia sem Fronteiras for funding this PhD research.
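
    A toy version of such an ensemble generator, assuming that gauge-radar comparisons yield multiplicative (log-space) error statistics; a realistic scheme would impose space-time correlation on the noise rather than the white noise used below:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Deterministic radar rainfall field (mm/h) on a small grid -- synthetic here.
    radar = np.maximum(rng.gamma(shape=2.0, scale=1.5, size=(50, 50)) - 1.0, 0.0)

    # Gauge-vs-radar comparison is assumed to yield multiplicative error
    # statistics in log space (mean bias mu and spread sigma -- placeholders).
    mu, sigma = 0.05, 0.35

    # Generate 20 equally likely rainfall members by perturbing the radar
    # field with the fitted error model.
    members = radar[None, :, :] * np.exp(rng.normal(mu, sigma, size=(20, 50, 50)))
    print("ensemble mean bias factor:", np.exp(mu + sigma**2 / 2).round(2))
    ```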

  5. Using global sensitivity analysis of demographic models for ecological impact assessment.

    PubMed

    Aiello-Lammens, Matthew E; Akçakaya, H Resit

    2017-02-01

    Population viability analysis (PVA) is widely used to assess population-level impacts of environmental changes on species. When combined with sensitivity analysis, PVA yields insights into the effects of parameter and model structure uncertainty. This helps researchers prioritize efforts for further data collection so that model improvements are efficient and helps managers prioritize conservation and management actions. Usually, sensitivity is analyzed by varying one input parameter at a time and observing the influence that variation has over model outcomes. This approach does not account for interactions among parameters. Global sensitivity analysis (GSA) overcomes this limitation by varying several model inputs simultaneously. Then, regression techniques allow measuring the importance of input-parameter uncertainties. In many conservation applications, the goal of demographic modeling is to assess how different scenarios of impact or management cause changes in a population. This is challenging because the uncertainty of input-parameter values can be confounded with the effect of impacts and management actions. We developed a GSA method that separates model outcome uncertainty resulting from parameter uncertainty from that resulting from projected ecological impacts or simulated management actions, effectively separating the 2 main questions that sensitivity analysis asks. We applied this method to assess the effects of predicted sea-level rise on Snowy Plover (Charadrius nivosus). A relatively small number of replicate models (approximately 100) resulted in consistent measures of variable importance when not trying to separate the effects of ecological impacts from parameter uncertainty. However, many more replicate models (approximately 500) were required to separate these effects. These differences are important to consider when using demographic models to estimate ecological impacts of management actions. © 2016 Society for Conservation Biology.
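
    A compact illustration of GSA with Latin hypercube sampling and standardized regression coefficients on a toy population model (the model, parameter ranges and sample size are invented for the example):

    ```python
    import numpy as np
    from scipy.stats import qmc

    # Latin hypercube sample of three demographic parameters (toy ranges):
    # survival s, fecundity f, and a sea-level-rise habitat loss rate d.
    lhs = qmc.LatinHypercube(d=3, seed=3).random(n=500)
    lo, hi = np.array([0.4, 0.5, 0.00]), np.array([0.9, 2.5, 0.05])
    X = qmc.scale(lhs, lo, hi)

    def final_population(s, f, d, n0=100.0, years=20):
        n = n0
        for _ in range(years):
            n = n * s * (1.0 - d) + n * f * 0.1   # toy scalar population model
        return n

    y = np.array([final_population(*row) for row in X])

    # Standardized regression coefficients measure each input's importance
    # while all inputs vary simultaneously (a simple GSA summary).
    Xs = (X - X.mean(0)) / X.std(0)
    ys = (y - y.mean()) / y.std()
    src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    for name, c in zip(["survival", "fecundity", "habitat loss"], src):
        print(f"{name:>12}: SRC = {c:+.2f}")
    ```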

  6. [Status Quo, Uncertainties and Trends Analysis of Environmental Risk Assessment for PFASs].

    PubMed

    Hao, Xue-wen; Li, Li; Wang, Jie; Cao, Yan; Liu, Jian-guo

    2015-08-01

    This study systematically reviews the definitions and evolving terminology, the categories, and the applications of perfluoroalkyl and polyfluoroalkyl substances (PFASs) in the international academic literature, focusing on the environmental risk and exposure assessment of PFASs, in order to comprehensively analyze the current status, uncertainties and trends of PFASs' environmental risk assessment. Overall, the risk assessment of PFASs faces a complicated situation involving complex substance pedigrees, various types, complex derivative relations, confidential business information and risk uncertainties. Although the environmental risk of long-chain PFASs has been widely recognized, the short-chain PFASs and short-chain fluorotelomers used as their alternatives still present many research gaps and uncertainties regarding environmental hazards, environmental fate and exposure risk. The scope of risk control of PFASs in the international community is still open to discussion. Due to trade secrets and market competition, the chemical structures and risk information of PFASs' alternatives generally lack openness and transparency, and the environmental risk of most fluorinated and non-fluorinated alternatives is not clear. In general, international research on PFASs risk assessment has gradually shifted from long-chain perfluoroalkyl acids (PFAAs), represented by perfluorooctane sulfonic acid (PFOS) and perfluorooctanoic acid (PFOA), to short-chain PFAAs, and has then extended to other PFASs. The main problems to be solved urgently and researched continuously are: environmental hazard assessment indexes, such as bioaccumulation and environmental migration, and methods for their optimization; the environmental release and multimedia environmental fate of short-chain PFASs; the environmental fate of neutral PFASs and their transformation into, and contribution as precursors of, short-chain PFASs; and the risk identification and assessment of fluorinated and non-fluorinated alternatives of PFASs.

  7. Sustainability of fisheries through marine reserves: a robust modeling analysis.

    PubMed

    Doyen, L; Béné, C

    2003-09-01

    Among the many factors that contribute to the overexploitation of marine fisheries, the role played by uncertainty is important. This uncertainty includes both the scientific uncertainties related to resource dynamics or assessments and the uncontrollability of catches. Some recent works advocate the use of marine reserves as a central element of future stock management. In the present paper, we study the influence of protected areas on fisheries sustainability through a simple dynamic model integrating non-stochastic harvesting uncertainty and a constraint of a safe minimum biomass level. Using the mathematical concept of the invariance kernel in a robust, worst-case context, we examine through a formal modeling analysis how marine reserves might guarantee viable fisheries. We also show how the sustainability requirement does not necessarily conflict with the optimization of catches. Numerical simulations are provided to illustrate the main findings.
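
    A worst-case viability check in the spirit of the invariance-kernel analysis can be sketched as follows; the logistic dynamics, catch bounds and thresholds are illustrative assumptions, not the paper's model:

    ```python
    import numpy as np

    def viable(reserve_frac, b0=0.5, b_safe=0.2, r=0.8, horizon=200,
               target_catch=0.15, catch_err=0.5):
        """Worst-case check that biomass (scaled to carrying capacity 1)
        never falls below a safe minimum level, when realized catches can
        overshoot the target by up to catch_err (non-stochastic bound)."""
        b = b0
        for _ in range(horizon):
            worst_catch = target_catch * (1 + catch_err) * (1 - reserve_frac)
            b = b + r * b * (1 - b) - min(worst_catch, b)
            if b < b_safe:
                return False
        return True

    # Smallest protected fraction that robustly guarantees viability.
    for p in np.linspace(0.0, 0.6, 13):
        if viable(p):
            print(f"viability guaranteed from reserve fraction = {p:.2f}")
            break
    ```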

  8. Quantifying acoustic doppler current profiler discharge uncertainty: A Monte Carlo based tool for moving-boat measurements

    USGS Publications Warehouse

    Mueller, David S.

    2017-01-01

    This paper presents a method using Monte Carlo simulations for assessing uncertainty of moving-boat acoustic Doppler current profiler (ADCP) discharge measurements using a software tool known as QUant, which was developed for this purpose. Analysis was performed on 10 data sets from four Water Survey of Canada gauging stations in order to evaluate the relative contribution of a range of error sources to the total estimated uncertainty. The factors that differed among data sets included the fraction of unmeasured discharge relative to the total discharge, flow nonuniformity, and operator decisions about instrument programming and measurement cross section. As anticipated, it was found that the estimated uncertainty is dominated by uncertainty of the discharge in the unmeasured areas, highlighting the importance of appropriate selection of the site, the instrument, and the user inputs required to estimate the unmeasured discharge. The main contributor to uncertainty was invalid data, but spatial inhomogeneity in water velocity and bottom-track velocity also contributed, as did variation in the edge velocity, uncertainty in the edge distances, edge coefficients, and the top and bottom extrapolation methods. To a lesser extent, spatial inhomogeneity in the bottom depth also contributed to the total uncertainty, as did uncertainty in the ADCP draft at shallow sites. The estimated uncertainties from QUant can be used to assess the adequacy of standard operating procedures. They also provide quantitative feedback to the ADCP operators about the quality of their measurements, indicating which parameters are contributing most to uncertainty, and perhaps even highlighting ways in which uncertainty can be reduced. Additionally, QUant can be used to account for self-dependent error sources such as heading errors, which are a function of heading. The results demonstrate the importance of a Monte Carlo method tool such as QUant for quantifying random and bias errors when evaluating the uncertainty of moving-boat ADCP measurements.
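
    The core Monte Carlo idea can be illustrated in a few lines of Python; the discharge components and error magnitudes below are placeholders rather than QUant's actual error models:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    N = 20_000

    # Measured portion and estimated unmeasured portions of discharge (m3/s);
    # values and error models are illustrative, not those used in QUant.
    q_measured = rng.normal(100.0, 2.0, N)            # mid-section ADCP data
    q_top      = rng.normal(12.0, 12.0 * 0.10, N)     # top extrapolation, 10%
    q_bottom   = rng.normal(10.0, 10.0 * 0.12, N)     # bottom extrapolation, 12%
    q_edges    = rng.normal(6.0,  6.0 * 0.20, N)      # edge estimates, 20%

    q_total = q_measured + q_top + q_bottom + q_edges
    u95 = 1.96 * q_total.std() / q_total.mean() * 100
    print(f"total discharge = {q_total.mean():.1f} m3/s, "
          f"expanded uncertainty (95%) = +/-{u95:.1f}%")

    # Relative contribution of each source to the total variance:
    for name, q in [("measured", q_measured), ("top", q_top),
                    ("bottom", q_bottom), ("edges", q_edges)]:
        print(f"{name:>9}: {q.var() / q_total.var():.0%} of variance")
    ```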

  9. An ensemble of eddy-permitting global ocean reanalyses from the MyOcean project

    NASA Astrophysics Data System (ADS)

    Masina, Simona; Storto, Andrea; Ferry, Nicolas; Valdivieso, Maria; Haines, Keith; Balmaseda, Magdalena; Zuo, Hao; Drevillon, Marie; Parent, Laurent

    2017-08-01

    A set of four eddy-permitting global ocean reanalyses produced in the framework of the MyOcean project has been compared over the altimetry period 1993-2011. The main differences among the reanalyses used here come from the data assimilation scheme implemented to control the ocean state by inserting reprocessed observations of sea surface temperature (SST), in situ temperature and salinity profiles, sea level anomaly and sea-ice concentration. A first objective of this work is to assess the interannual variability and trends for a series of parameters usually considered in the community as essential ocean variables: SST, sea surface salinity, temperature and salinity averaged over meaningful layers of the water column, sea level, transports across pre-defined sections, and sea ice parameters. The eddy-permitting nature of the global reanalyses also allows estimation of the eddy kinetic energy. The results show that in general there is good consistency between the different reanalyses. An intercomparison against experiments without data assimilation was done during the MyOcean project, and we conclude that data assimilation is crucial for correctly simulating some quantities such as regional trends of sea level as well as the eddy kinetic energy. A second objective is to show that the ensemble mean of reanalyses can be evaluated as one single system regarding its reliability in reproducing the climate signals, where both variability and uncertainties are assessed through the ensemble spread and signal-to-noise ratio. The main advantage of having access to several reanalyses differing in the way data assimilation is performed is that it becomes possible to assess part of the total uncertainty. Given the fact that we use very similar ocean models and atmospheric forcing, we can conclude that the spread of the ensemble of reanalyses is mainly representative of our ability to gauge uncertainty in the assimilation methods. This uncertainty varies considerably from one ocean parameter to another, especially in global indices. However, despite several caveats in the design of the multi-system ensemble, the main conclusion from this study is that the eddy-permitting multi-system ensemble approach has become mature, and our results provide a first step towards a systematic comparison of eddy-permitting global ocean reanalyses aimed at providing robust conclusions on the recent evolution of the oceanic state.
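
    The ensemble-mean, spread and signal-to-noise diagnostics can be computed directly once the reanalyses are collocated; a minimal sketch with synthetic anomaly series standing in for the four products:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Yearly global-mean anomalies of some ocean variable from four
    # reanalyses, 1993-2011 (synthetic stand-ins for the real products).
    years = np.arange(1993, 2012)
    truth = 0.02 * (years - 1993)                      # underlying trend
    ens = truth + rng.normal(0.0, 0.03, size=(4, years.size))

    ens_mean = ens.mean(axis=0)                        # multi-system best estimate
    spread = ens.std(axis=0, ddof=1)                   # uncertainty proxy
    snr = np.abs(ens_mean) / np.where(spread > 0, spread, np.nan)

    # A signal is deemed robust where it exceeds the ensemble spread.
    print("robust years:", years[snr > 1.0])
    ```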

  10. Calibration of hydrometers

    NASA Astrophysics Data System (ADS)

    Lorefice, Salvatore; Malengo, Andrea

    2006-10-01

    After a brief description of the different methods employed in the periodic calibration of hydrometers, used in most cases to measure the density of liquids in the range between 500 kg m-3 and 2000 kg m-3, particular emphasis is given to the multipoint procedure based on hydrostatic weighing, also known as Cuckow's method. The features of the calibration apparatus and the procedure used at the INRiM (formerly IMGC-CNR) density laboratory have been considered to assess all relevant contributions involved in the calibration of different kinds of hydrometers. The uncertainty is strongly dependent on the kind of hydrometer; in particular, the results highlight the importance of the density of the reference buoyant liquid, the calibration temperature and the skill of the operator in reading the scale in the overall assessment of the uncertainty. It is also interesting to realize that for high-resolution hydrometers (division of 0.1 kg m-3), the density of the reference liquid is the main contributor to the total uncertainty, but its share falls to about 50% for hydrometers with a division of 0.5 kg m-3 and becomes almost negligible for hydrometers with a division of 1 kg m-3, for which the reading uncertainty is the predominant part of the total uncertainty. At present the best INRiM result is obtained with commercially available hydrometers having a scale division of 0.1 kg m-3, for which the relative uncertainty is about 12 × 10-6.
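
    The way individual contributions combine into the total calibration uncertainty, and the share each takes of the variance, can be shown with a toy uncertainty budget (the numbers are placeholders, not INRiM values):

    ```python
    import numpy as np

    # Illustrative standard-uncertainty contributions (kg/m3) for one
    # hydrometer calibration point; values are placeholders, not INRiM data.
    budget = {
        "reference liquid density": 0.008,
        "calibration temperature":  0.004,
        "scale reading (operator)": 0.006,
        "weighing / apparatus":     0.003,
    }

    u_total = np.sqrt(sum(u**2 for u in budget.values()))  # root sum of squares
    for k, u in budget.items():
        print(f"{k:>26}: u = {u:.3f} kg/m3 ({(u / u_total)**2:.0%} of variance)")
    print(f"combined standard uncertainty = {u_total:.3f} kg/m3")
    ```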

  11. Uncertainty of the peak flow reconstruction of the 1907 flood in the Ebro River in Xerta (NE Iberian Peninsula)

    NASA Astrophysics Data System (ADS)

    Ruiz-Bellet, Josep Lluís; Castelltort, Xavier; Balasch, J. Carles; Tuset, Jordi

    2017-02-01

    There is no clear, unified and accepted method to estimate the uncertainty of hydraulic modelling results. In historical flood reconstructions, due to the lower precision of the input data, the magnitude of this uncertainty can reach a high value. With the objectives of estimating the peak flow error of a typical historical flood reconstruction with the model HEC-RAS and of providing a quick, simple uncertainty assessment that an end user could easily apply, the uncertainty of the reconstructed peak flow of a major flood in the Ebro River (NE Iberian Peninsula) was calculated with a set of local sensitivity analyses on six input variables. The peak flow total error was estimated at ±31%, and water height was found to be the variable with the greatest influence on peak flow, followed by Manning's n. The latter, however, due to its large uncertainty, was the greatest contributor to the peak flow total error. Besides, the HEC-RAS peak flow was compared to the ones obtained with the 2D model Iber and with Manning's equation; all three methods gave similar peak flows, and Manning's equation gave almost the same result as HEC-RAS. The main conclusion is that, to ensure the lowest peak flow error, the reliability and precision of the flood mark should be thoroughly assessed.
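
    A one-at-a-time (local) sensitivity analysis of a Manning-type peak flow estimate can be sketched as follows; the nominal values and relative uncertainties are illustrative, not those of the Xerta reconstruction:

    ```python
    import numpy as np

    def manning_q(n, a, r, s):
        """Manning's equation: Q = (1/n) * A * R^(2/3) * S^(1/2)."""
        return (1.0 / n) * a * r ** (2.0 / 3.0) * np.sqrt(s)

    # Nominal values and relative uncertainties (illustrative inputs).
    nominal = dict(n=0.035, a=2500.0, r=6.0, s=0.0008)
    rel_unc = dict(n=0.25, a=0.05, r=0.05, s=0.10)

    q0 = manning_q(**nominal)
    contrib = {}
    for var, ru in rel_unc.items():
        pert = dict(nominal)
        pert[var] *= 1.0 + ru                       # one-at-a-time perturbation
        contrib[var] = (manning_q(**pert) - q0) / q0

    total = np.sqrt(sum(c**2 for c in contrib.values()))  # first-order combination
    for var, c in contrib.items():
        print(f"d{var}: {c:+.1%} effect on peak flow")
    print(f"total relative peak-flow error = +/-{total:.0%}")
    ```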

  12. Uncertainty of a hydrological climate change impact assessment - Is it really all about climate uncertainty?

    NASA Astrophysics Data System (ADS)

    Honti, Mark; Reichert, Peter; Scheidegger, Andreas; Stamm, Christian

    2013-04-01

    Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with another boost after the publication of the IPCC AR4 report. Over hundreds of impact studies a quasi-standard methodology has emerged, which is mainly shaped by the growing public demand for predicting how water resources management or flood protection should change in the near future. The "standard" workflow considers future climate under a specific IPCC emission scenario simulated by global circulation models (GCMs), possibly downscaled by a regional climate model (RCM) and/or a stochastic weather generator. The output from the climate models is typically corrected for bias before feeding it into a calibrated hydrological model, which is run on the past and future meteorological data to analyse the impacts of climate change on the hydrological indicators of interest. The impact predictions are as uncertain as any forecast that tries to describe the behaviour of an extremely complex system decades into the future. Future climate predictions are uncertain due to scenario uncertainty and GCM model uncertainty that is obvious at finer than continental resolution. As in any hierarchical model system, uncertainty propagates through the descendant components. Downscaling increases uncertainty with the deficiencies of RCMs and/or weather generators. Bias correction adds a strong deterministic shift to the input data. Finally, the predictive uncertainty of the hydrological model ends the cascade that leads to the total uncertainty of the hydrological impact assessment. There is an emerging consensus among many studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. Only a few studies have found that the predictive uncertainty of hydrological models can be in the same range as, or even larger than, the climatic uncertainty. We carried out a climate change impact assessment and estimated the relative importance of the uncertainty sources. The study was performed on two small catchments in the Swiss Plateau with a lumped conceptual rainfall-runoff model. In the climatic part we applied the standard ensemble approach to quantify uncertainty, but in hydrology we used a formal Bayesian uncertainty assessment method with two different likelihood functions. One was a time-series error model that was able to deal with the complicated statistical properties of hydrological model residuals. The second was a likelihood function for the flow quantiles directly. Due to the better data coverage and smaller hydrological complexity in one of our test catchments, the hydrological model performed better there, and we could observe that the relative importance of different uncertainty sources varied between sites, boundary conditions and flow indicators. The uncertainty of future climate was important, but not dominant. The deficiencies of the hydrological model were on the same scale, especially for the sites and flow components where model performance for the past observations was further from optimal (Nash-Sutcliffe index = 0.5 - 0.7). The overall uncertainty of predictions was well beyond the expected change signal even for the best performing site and flow indicator.

  13. Uncertainty and Risk Assessment in the Design Process for Wind

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damiani, Rick R.

    This report summarizes the concepts and opinions that emerged from an initial study on the subject of uncertainty in wind design, which included expert elicitation during a workshop held at the National Wind Technology Center at the National Renewable Energy Laboratory on July 12-13, 2016. In this paper, five major categories of uncertainties are identified. The first category is associated with direct impacts on turbine loads (i.e., the inflow including extreme events, aero-hydro-servo-elastic response, soil-structure interaction, and load extrapolation). The second category encompasses material behavior and strength. Site suitability and due-diligence aspects pertain to the third category. Calibration of partial safety factors and optimal reliability levels make up the fourth. The fifth is the category associated with uncertainties in computational modeling. The main sections of this paper follow this organization.

  14. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  15. Metrological analysis of a virtual flowmeter-based transducer for cryogenic helium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arpaia, P., E-mail: pasquale.arpaia@unina.it; Technology Department, European Organization for Nuclear Research; Girone, M., E-mail: mario.girone@cern.ch

    2015-12-15

    The metrological performance of a virtual flowmeter-based transducer for monitoring helium under cryogenic conditions is assessed. To this aim, an uncertainty model of the transducer is presented, based mainly on a valve model exploiting a finite-element approach and on a virtual flowmeter model based on the Sereg-Schlumberger method. The models are validated experimentally on a case study for helium monitoring in cryogenic systems at the European Organization for Nuclear Research (CERN). The impact of uncertainty sources on the transducer's metrological performance is assessed by a sensitivity analysis based on statistical experiment design and analysis of variance. In this way, the uncertainty sources most influencing the metrological performance of the transducer are singled out over the input range as a whole, at varying operating and setting conditions. This analysis turns out to be important for CERN cryogenics operation because the metrological design of the transducer is validated, and its components and working conditions with critical specifications for future improvements are identified.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vrugt, Jasper A; Robinson, Bruce A; Ter Braak, Cajo J F

    In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. In particular, there is strong disagreement whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov Chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented using the recently developed differential evolution adaptive metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effect of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understand and predict the flow of water through catchments.
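
    For contrast with the formal approach, a minimal GLUE-style sketch on a toy model: sample parameters from a broad prior, score simulations with an informal likelihood (here Nash-Sutcliffe efficiency), retain the behavioural sets and derive weighted predictive bounds. The model, threshold and prior are invented for the example:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def model(k, x):
        """Toy 'hydrological' model: exponential recession with rate k."""
        return 10.0 * np.exp(-k * x)

    x = np.linspace(0, 10, 50)
    obs = model(0.3, x) + rng.normal(0, 0.3, x.size)   # synthetic observations

    # GLUE: sample from a broad prior and score with an informal likelihood.
    k_samples = rng.uniform(0.05, 1.0, 5000)
    sims = model(k_samples[:, None], x[None, :])
    nse = 1 - ((sims - obs) ** 2).sum(1) / ((obs - obs.mean()) ** 2).sum()
    behavioural = nse > 0.7
    w = nse[behavioural] / nse[behavioural].sum()

    # Weighted predictive bounds at the final time step.
    order = np.argsort(sims[behavioural, -1])
    cdf = np.cumsum(w[order])
    lo = sims[behavioural][order][np.searchsorted(cdf, 0.05), -1]
    hi = sims[behavioural][order][np.searchsorted(cdf, 0.95), -1]
    print(f"{behavioural.sum()} behavioural sets; 90% bounds at t=10: "
          f"[{lo:.2f}, {hi:.2f}]")
    ```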

  17. Uncertainty Quantification of Medium-Term Heat Storage From Short-Term Geophysical Experiments Using Bayesian Evidential Learning

    NASA Astrophysics Data System (ADS)

    Hermans, Thomas; Nguyen, Frédéric; Klepikova, Maria; Dassargues, Alain; Caers, Jef

    2018-04-01

    In theory, aquifer thermal energy storage (ATES) systems can recover in winter the heat stored in the aquifer during summer to increase the energy efficiency of the system. In practice, the energy efficiency is often lower than expected from simulations due to spatial heterogeneity of hydraulic properties or non-favorable hydrogeological conditions. A proper design of ATES systems should therefore consider the uncertainty of the prediction related to those parameters. We use a novel framework called Bayesian Evidential Learning (BEL) to estimate the heat storage capacity of an alluvial aquifer using a heat tracing experiment. BEL is based on two main stages: pre- and post-field data acquisition. Before data acquisition, Monte Carlo simulations and global sensitivity analysis are used to assess the information content of the data to reduce the uncertainty of the prediction. After data acquisition, prior falsification and machine learning based on the same Monte Carlo simulations are used to directly assess the uncertainty in key prediction variables from the observations. The result is a full quantification of the posterior distribution of the prediction conditioned on the observed data, without any explicit full model inversion. We demonstrate the methodology in field conditions and validate the framework using independent measurements.
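
    The prediction-focused shortcut at the heart of BEL can be caricatured as learning a data-to-prediction regression over the prior Monte Carlo ensemble; the linear toy setup below stands in for the dimension reduction and nonlinear learning used in practice:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Prior Monte Carlo: 1000 subsurface realizations, each yielding
    # short-term tracer data (5 features) and the medium-term prediction
    # (heat storage capacity) -- synthetic, linked through hidden parameters.
    hidden = rng.normal(size=(1000, 3))
    data = hidden @ rng.normal(size=(3, 5)) + 0.1 * rng.normal(size=(1000, 5))
    pred = hidden @ rng.normal(size=3) + 0.1 * rng.normal(size=1000)

    # BEL-style shortcut: learn the data->prediction relation on the prior
    # ensemble instead of inverting the full model.
    A = np.c_[data, np.ones(len(data))]
    coef, *_ = np.linalg.lstsq(A, pred, rcond=None)
    resid_sd = (pred - A @ coef).std()

    # Condition on the field observation (one realization plays "truth").
    d_obs = data[123]
    post_mean = np.r_[d_obs, 1.0] @ coef
    print(f"posterior prediction = {post_mean:.2f} +/- {2 * resid_sd:.2f} "
          f"(truth: {pred[123]:.2f})")
    ```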

  18. Spatial resolution and measurement uncertainty of strains in bone and bone-cement interface using digital volume correlation.

    PubMed

    Zhu, Ming-Liang; Zhang, Qing-Hang; Lupton, Colin; Tong, Jie

    2016-04-01

    The measurement uncertainty of strains has been assessed in a bone analogue (sawbone), bovine trabecular bone and bone-cement interface specimens under zero load using the Digital Volume Correlation (DVC) method. The effects of sub-volume size, sample constraint and preload on the measured strain uncertainty have been examined. There is generally a trade-off between the measurement uncertainty and the spatial resolution. Suitable sub-volume sizes have been selected based on a compromise between the measurement uncertainty and the spatial resolution for the cases considered. A ratio of sub-volume size to a microstructure characteristic (Tb.Sp) was introduced to reflect a suitable spatial resolution, and the associated measurement uncertainty was assessed. Specifically, ratios between 1.6 and 4 appear to give rise to standard deviations in the measured strains between 166 and 620 με in all the cases considered, which would seem to suffice for strain analysis in pre- as well as post-yield loading regimes. A microscale finite element (μFE) model was built from the CT images of the sawbone, and the results from the μFE model and a continuum FE model were compared with those from the DVC. The strain results were found to differ significantly between the two methods at the tissue level, consistent in trend with the results found in human bones, indicating mainly a limitation of the current DVC method in mapping strains at this level. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Climate data induced uncertainty in model-based estimations of terrestrial primary productivity

    NASA Astrophysics Data System (ADS)

    Wu, Zhendong; Ahlström, Anders; Smith, Benjamin; Ardö, Jonas; Eklundh, Lars; Fensholt, Rasmus; Lehsten, Veiko

    2017-06-01

    Model-based estimations of historical fluxes and pools of the terrestrial biosphere differ substantially. These differences arise not only from differences between models but also from differences in the environmental and climatic data used as input to the models. Here we investigate the role of uncertainties in historical climate data by performing simulations of terrestrial gross primary productivity (GPP) using a process-based dynamic vegetation model (LPJ-GUESS) forced by six different climate datasets. We find that the climate-induced uncertainty, defined as the range among historical simulations in GPP when forcing the model with the different climate datasets, can be as high as 11 Pg C yr-1 globally (9% of mean GPP). We also assessed a hypothetical maximum climate data induced uncertainty by combining climate variables from different datasets, which resulted in significantly larger uncertainties of 41 Pg C yr-1 globally, or 32% of mean GPP. The uncertainty is partitioned into components associated with the three main climatic drivers: temperature, precipitation, and shortwave radiation. Additionally, we illustrate how the uncertainty due to a given climate driver depends both on the magnitude of the forcing data uncertainty (climate data range) and on the apparent sensitivity of the modeled GPP to the driver (apparent model sensitivity). We find that LPJ-GUESS overestimates GPP compared to an empirically based GPP data product in all land cover classes except tropical forests. Tropical forests emerge as a disproportionate source of uncertainty in GPP estimation, both in the simulations and in the empirical data products. The tropical forest uncertainty is most strongly associated with shortwave radiation and precipitation forcing, for which the climate data range contributes more to the overall uncertainty than the apparent model sensitivity to forcing. Globally, precipitation dominates the climate-induced uncertainty over nearly half of the vegetated land area, mainly due to the climate data range and less so due to the apparent model sensitivity. Overall, climate data ranges are found to contribute more to the climate-induced uncertainty than the apparent model sensitivity to forcing. Our study highlights the need to better constrain tropical climate, and demonstrates that uncertainty caused by climatic forcing data must be considered when comparing and evaluating carbon cycle model results and empirical datasets.

  20. Uncertainty in the delayed neutron fraction in fuel assembly depletion calculations

    NASA Astrophysics Data System (ADS)

    Aures, Alexander; Bostelmann, Friederike; Kodeli, Ivan A.; Velkov, Kiril; Zwermann, Winfried

    2017-09-01

    This study presents uncertainty and sensitivity analyses of the delayed neutron fraction of light water reactor and sodium-cooled fast reactor fuel assemblies. For these analyses, the sampling-based XSUSA methodology is used to propagate cross section uncertainties in neutron transport and depletion calculations. Cross section data is varied according to the SCALE 6.1 covariance library. Since this library includes nu-bar uncertainties only for the total values, it has been supplemented by delayed nu-bar uncertainties from the covariance data of the JENDL-4.0 nuclear data library. The neutron transport and depletion calculations are performed with the TRITON/NEWT sequence of the SCALE 6.1 package. The evolution of the delayed neutron fraction uncertainty over burn-up is analysed without and with the consideration of delayed nu-bar uncertainties. Moreover, the main contributors to the result uncertainty are determined. In all cases, the delayed nu-bar uncertainties increase the delayed neutron fraction uncertainty. Depending on the fuel composition, the delayed nu-bar values of uranium and plutonium in fact give the main contributions to the delayed neutron fraction uncertainty for the LWR fuel assemblies. For the SFR case, the uncertainty of the scattering cross section of U-238 is the main contributor.
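
    The sampling-based propagation underlying XSUSA-type analyses can be sketched generically: draw joint perturbations of the nuclear data from their covariance, rerun the model, and summarize the output spread. The values, covariance and response function below are toy stand-ins, not SCALE or JENDL-4.0 data:

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Nominal "nuclear data" vector and its covariance matrix -- toy
    # stand-ins for, e.g., delayed nu-bar values of three nuclides.
    nominal = np.array([0.0167, 0.0440, 0.0065])
    rel_cov = np.array([[0.0025, 0.0005, 0.0002],
                        [0.0005, 0.0036, 0.0003],
                        [0.0002, 0.0003, 0.0049]])
    cov = rel_cov * np.outer(nominal, nominal)

    def beta_eff(nu_d, weights=np.array([0.6, 0.1, 0.3])):
        """Toy response: importance-weighted delayed neutron fraction."""
        return weights @ nu_d

    # Sampling-based propagation: perturb all inputs jointly according to
    # the covariance, rerun the (here trivial) model, summarize the spread.
    samples = rng.multivariate_normal(nominal, cov, size=1000)
    out = np.array([beta_eff(s) for s in samples])
    print(f"beta_eff = {out.mean():.5f} +/- {out.std():.5f} "
          f"({out.std() / out.mean():.1%} relative uncertainty)")
    ```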

  1. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty contribute to the total uncertainty. In particular, in segmental hair analysis, pre-analytical variations associated with sampling and segmentation may be significant factors in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS), with a focus on the assessment of both the analytical and the pre-analytical sampling variation. The validated method was specific, accurate (80-120%), and precise (CV ≤ 20%) across a wide linear concentration range from 0.025 to 25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from genuine duplicate measurements of two bundles of hair collected from each subject, after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was 3- to 7-fold larger than the analytical variation (7-13%) and hence the dominant component of the total variation (29-70%). The present study demonstrates the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95% uncertainty interval (±2CVT). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
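
    Because independent variance components add, the pre-analytical component follows by subtraction in quadrature from the duplicate-bundle variation; a worked example with values chosen within the ranges reported in the abstract:

    ```python
    import numpy as np

    # Analytical CV and duplicate-bundle CV (total) for one analyte --
    # illustrative numbers in the ranges reported above.
    cv_analytical = 0.10        # 10%, from method validation
    cv_duplicates = 0.45        # 45%, from paired hair bundles

    # Variances add, so the pre-analytical component is obtained by
    # subtraction, and the total by recombination (equal to cv_duplicates):
    cv_pre = np.sqrt(cv_duplicates**2 - cv_analytical**2)
    cv_total = np.sqrt(cv_pre**2 + cv_analytical**2)
    print(f"pre-analytical CV = {cv_pre:.1%}")
    print(f"95% uncertainty interval = +/-{2 * cv_total:.0%} of the result")
    ```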

  2. Assessing Uncertainty in Risk Assessment Models (BOSC CSS meeting)

    EPA Science Inventory

    In vitro assays are increasingly being used in risk assessments. Uncertainty in assays leads to uncertainty in the models used for risk assessments. This poster assesses uncertainty in the ER and AR models.

  3. Where do uncertainties reside within environmental risk assessments? Expert opinion on uncertainty distributions for pesticide risks to surface water organisms.

    PubMed

    Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T

    2016-12-01

    A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations, in which subject-matter experts for pesticide risks to surface water organisms validate and assess three dimensions of uncertainty: its level (the severity of uncertainty, ranging from determinism to ignorance); its nature (whether the uncertainty is epistemic or aleatory); and its location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment and effects assessment. The comprehensive description of uncertainty presented here will enable risk analysts to prioritise the required phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during environmental risk assessments. In turn, it is expected to inform investment in uncertainty reduction or targeted risk management action. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  4. Evaluation of Global LAI/FPAR Products from VIIRS and MODIS: Spatiotemporal Consistency and Uncertainty

    NASA Astrophysics Data System (ADS)

    Xu, B.; Park, T.; Yan, K.; Chen, C.; Jing, L.; Qinhuo, L.; Song, W.; Knyazikhin, Y.; Myneni, R.

    2017-12-01

    The operational EOS MODIS LAI/FPAR algorithm has been successfully transitioned to Suomi-NPP VIIRS by optimizing a small set of configurable parameters in Look-Up-Tables (LUTs). Our preliminary evaluation results show a reasonable agreement between VIIRS and MODIS LAI/FPAR retrievals. However, we still need more comprehensive investigations to assure the continuity of multi-sensor based global LAI/FPAR time series, as the preliminary evaluation was spatiotemporally limited. Here, we used multi-year (2012-2016) global LAI/FPAR products generated from VIIRS Version 1 and MODIS Collection 6 to evaluate their spatiotemporal consistency at global and site scales. We also quantified the uncertainty of the products by defining and measuring theoretical and physical uncertainty terms. For both the consistency and the uncertainty evaluation, we accounted for varying biome types and temporal resolutions (i.e., 8-day, seasonal and annual steps). A newly developed approach (Grading and Upscaling Ground Measurements, GUGM), which generates accurate validation datasets, was implemented to help validate both products. Our results clearly indicate that the LAI/FPAR retrievals from VIIRS and MODIS are quite consistent at different spatial (i.e., global and site) and temporal (i.e., 8-day, seasonal and annual) scales. It is also worth noting that the retrieval rates from the radiative-transfer-based main algorithm are comparable. However, we also saw a relatively larger LAI/FPAR discrepancy over highly dense tropical forests and a slightly lower retrieval rate of the main algorithm for VIIRS over high-latitude regions. For the uncertainty assessment, the theoretical uncertainty of VIIRS LAI (FPAR) is less than 0.2 (0.06) for non-forest and 0.9 (0.08) for forest, values nearly identical to those of MODIS. The physical uncertainties of the VIIRS and MODIS LAI (FPAR) products, assessed by comparison with ground measurements, are estimated at 0.60 (0.10) and 0.55 (0.11), respectively. All of the results presented here support confidence in the consistency between VIIRS and MODIS LAI/FPAR retrievals, and in the feasibility of generating long-term multi-sensor LAI/FPAR time series.

  5. Shock Layer Radiation Modeling and Uncertainty for Mars Entry

    NASA Technical Reports Server (NTRS)

    Johnston, Christopher O.; Brandis, Aaron M.; Sutton, Kenneth

    2012-01-01

A model for simulating nonequilibrium radiation from Mars entry shock layers is presented. A new chemical kinetic rate model is developed that provides good agreement with recent EAST and X2 shock tube radiation measurements. This model includes a CO dissociation rate that is a factor of 13 larger than the rate used widely in previous models. Uncertainties in the proposed rates are assessed along with uncertainties in translational-vibrational relaxation modeling parameters. The stagnation point radiative flux uncertainty due to these flowfield modeling parameter uncertainties is computed to vary from 50 to 200% for a range of free-stream conditions, with densities ranging from 5e-5 to 5e-4 kg/m3 and velocities ranging from 6.3 to 7.7 km/s. These conditions cover the range of anticipated peak radiative heating conditions for proposed hypersonic inflatable aerodynamic decelerators (HIADs). Modeling parameters for the radiative spectrum are compiled along with a non-Boltzmann rate model for the dominant radiating molecules: CO, CN, and C2. A method for treating non-local absorption in the non-Boltzmann model is developed, which is shown to result in up to a 50% increase in the radiative flux through absorption by the CO 4th Positive band. The sensitivity of the radiative flux to the radiation modeling parameters is presented and the uncertainty for each parameter is assessed. The stagnation point radiative flux uncertainty due to these radiation modeling parameter uncertainties is computed to vary from 18 to 167% for the considered range of free-stream conditions. The total radiative flux uncertainty is computed as the root sum square of the flowfield and radiation parametric uncertainties, which results in total uncertainties ranging from 50 to 260%. The main contributors to these significant uncertainties are the CO dissociation rate and the CO heavy-particle excitation rates. Applying the baseline flowfield and radiation models developed in this work, the radiative heating for the Mars Pathfinder probe is predicted to be nearly 20 W/cm2. In contrast to previous studies, this value is shown to be significant relative to the convective heating.
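    The root-sum-square combination described above can be written out explicitly; as a minimal consistency check, inserting the quoted upper bounds of the two parametric uncertainties approximately reproduces the quoted total (the symbols below are generic, not the paper's notation):

```latex
\sigma_{\mathrm{total}} = \sqrt{\sigma_{\mathrm{flow}}^{2} + \sigma_{\mathrm{rad}}^{2}},
\qquad
\sqrt{200^{2} + 167^{2}}\,\% \approx 260\,\%
```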

  6. Reliability-Based Performance Assessment of Damaged Ships

    DTIC Science & Technology

    2006-10-01

    [Garbled OCR of a figure: "Main Deck Plan View at 9.906 metre"; the remaining compartment labels are unrecoverable.] ... Vol. 15, pp. 74-93. Kaplan, M., Benatar, M., Bentson, J., Achtarides, T.A. (1984). "Analysis and assessment of major uncertainties associated with ...

  7. Multimodel hydrological ensemble forecasts for the Baskatong catchment in Canada using the TIGGE database.

    NASA Astrophysics Data System (ADS)

    Tito Arandia Martinez, Fabian

    2014-05-01

    Adequate uncertainty assessment is an important issue in hydrological modelling. For hydropower producers, a key requirement is to obtain ensemble forecasts that truly grasp the uncertainty linked to upcoming streamflows. If properly assessed, this uncertainty can lead to optimal reservoir management and energy production (e.g. [1]). The meteorological inputs to the hydrological model account for an important part of the total uncertainty in streamflow forecasting. Since the creation of the THORPEX initiative and the TIGGE database, meteorological ensemble forecasts from nine agencies throughout the world have been made available. This allows for hydrological ensemble forecasts based on multiple meteorological ensemble forecasts. Consequently, both the uncertainty linked to the architecture of the meteorological model and the uncertainty linked to the initial condition of the atmosphere can be accounted for. The main objective of this work is to show that a weighted combination of meteorological ensemble forecasts based on different atmospheric models can lead to improved hydrological ensemble forecasts, for horizons from one to ten days. This experiment is performed for the Baskatong watershed, a head subcatchment of the Gatineau watershed in the province of Quebec, Canada. The Baskatong watershed is of great importance for hydropower production, as it comprises the main reservoir for the Gatineau watershed, on which there are six hydropower plants managed by Hydro-Québec. Since the 1970s, they have used pseudo-ensemble forecasts based on deterministic meteorological forecasts to which variability derived from past forecasting errors is added. We use a combination of meteorological ensemble forecasts (precipitation and temperature) from different models as the main inputs for the hydrological model HSAMI ([2]). The meteorological ensembles from eight of the nine agencies available through TIGGE are weighted according to their individual performance and combined to form a grand ensemble, as sketched below. Results show that the hydrological forecasts derived from the grand ensemble perform better than the pseudo-ensemble forecasts currently used operationally at Hydro-Québec. References: [1] M. Verbunt, A. Walser, J. Gurtz et al., "Probabilistic flood forecasting with a limited-area ensemble prediction system: Selected case studies," Journal of Hydrometeorology, vol. 8, no. 4, pp. 897-909, Aug. 2007. [2] N. Evora, Valorisation des prévisions météorologiques d'ensemble, Institut de recherche d'Hydro-Québec, 2005. [3] V. Fortin, Le modèle météo-apport HSAMI: historique, théorie et application, Institut de recherche d'Hydro-Québec, 2000.
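    The performance-weighted pooling step can be illustrated compactly. The snippet below is a minimal sketch, not the study's operational code; the inverse-error weighting, the member count, and the data layout are all assumptions made here for illustration:

```python
import numpy as np

def grand_ensemble(forecasts, hist_error, n_members=50, seed=0):
    """Pool per-agency ensemble forecasts into a weighted grand ensemble.

    forecasts  : dict agency -> array (members, horizon) of streamflow members
    hist_error : dict agency -> historical verification error (lower is better),
                 e.g. a mean CRPS over a hindcast period
    """
    rng = np.random.default_rng(seed)
    agencies = list(forecasts)
    # Weight each agency by inverse historical error, normalised to sum to one.
    w = np.array([1.0 / hist_error[a] for a in agencies])
    w /= w.sum()
    # Draw grand-ensemble members from agencies in proportion to their weight.
    picks = rng.choice(len(agencies), size=n_members, p=w)
    members = [forecasts[agencies[i]][rng.integers(len(forecasts[agencies[i]]))]
               for i in picks]
    return np.stack(members)
```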

  8. Quantifying and Qualifying USGS ShakeMap Uncertainty

    USGS Publications Warehouse

    Wald, David J.; Lin, Kuo-Wan; Quitoriano, Vincent

    2008-01-01

    We describe algorithms for quantifying and qualifying uncertainties associated with USGS ShakeMap ground motions. The uncertainty values computed consist of latitude/longitude grid-based multiplicative factors that scale the standard deviation associated with the ground motion prediction equation (GMPE) used within the ShakeMap algorithm for estimating ground motions. The resulting grid-based 'uncertainty map' is essential for evaluation of losses derived using ShakeMaps as the hazard input. For ShakeMap, ground motion uncertainty at any point is dominated by two main factors: (i) the influence of any proximal ground motion observations, and (ii) the uncertainty of estimating ground motions from the GMPE, most notably, elevated uncertainty due to initial, unconstrained source rupture geometry. The uncertainty is highest for larger magnitude earthquakes when source finiteness is not yet constrained and, hence, the distance to rupture is also uncertain. In addition to a spatially dependent, quantitative assessment, many users may prefer a simple, qualitative grading for the entire ShakeMap. We developed a grading scale that allows one to quickly gauge the appropriate level of confidence when using rapidly produced ShakeMaps as part of the post-earthquake decision-making process or for qualitative assessments of archived or historical earthquake ShakeMaps. We describe an uncertainty letter grading ('A' through 'F', for high to poor quality, respectively) based on the uncertainty map. A middle-range ('C') grade corresponds to a ShakeMap for a moderate-magnitude earthquake suitably represented with a point-source location. Lower grades 'D' and 'F' are assigned for larger events (M>6) where finite-source dimensions are not yet constrained. The addition of ground motion observations (or observed macroseismic intensities) reduces uncertainties over data-constrained portions of the map. Higher grades ('A' and 'B') correspond to ShakeMaps with constrained fault dimensions and numerous stations, depending on the density of station/data coverage. Due to these dependencies, the letter grade can change with subsequent ShakeMap revisions if more data are added or when finite-fault dimensions become constrained. We emphasize that the greatest uncertainties are associated with unconstrained source dimensions for large earthquakes where the distance term in the GMPE is most uncertain; this uncertainty thus scales with magnitude (and consequently rupture dimension). Since this distance uncertainty produces potentially large uncertainties in ShakeMap ground-motion estimates, this factor dominates over compensating constraints for all but the most dense station distributions.
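    The mapping from a grid of multiplicative uncertainty factors to a single letter grade can be sketched as follows. This is a minimal illustration of the idea; the threshold values and the use of a simple grid mean are assumptions, not the USGS algorithm:

```python
import numpy as np

def shakemap_grade(factor_grid, thresholds=(0.8, 0.95, 1.05, 1.25, 1.5)):
    """Assign a qualitative letter grade from a grid of multiplicative
    uncertainty factors (1.0 = unmodified GMPE sigma; nearby observations
    pull the factor below 1, unconstrained finite faulting pushes it above 1).
    Thresholds here are illustrative placeholders.
    """
    mean_factor = float(np.mean(factor_grid))
    for grade, upper in zip("ABCDE", thresholds):
        if mean_factor <= upper:
            return grade, mean_factor
    return "F", mean_factor
```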

  9. Prioritizing Risks and Uncertainties from Intentional Release of Selected Category A Pathogens

    PubMed Central

    Hong, Tao; Gurian, Patrick L.; Huang, Yin; Haas, Charles N.

    2012-01-01

    This paper synthesizes available information on five Category A pathogens (Bacillus anthracis, Yersinia pestis, Francisella tularensis, Variola major and Lassa) to develop quantitative guidelines for how environmental pathogen concentrations may be related to human health risk in an indoor environment. An integrated model of environmental transport and human health exposure to biological pathogens is constructed which 1) includes the effects of environmental attenuation, 2) considers fomite contact exposure as well as inhalational exposure, and 3) includes an uncertainty analysis to identify key input uncertainties, which may inform future research directions. The findings provide a framework for developing the many different environmental standards that are needed for making risk-informed response decisions, such as when prophylactic antibiotics should be distributed, and whether or not a contaminated area should be cleaned up. The approach is based on the assumption of uniform mixing in environmental compartments and is thus applicable to areas sufficiently removed in time and space from the initial release that mixing has produced relatively uniform concentrations. Results indicate that when pathogens are released into the air, risk from inhalation is the main component of the overall risk, while risk from ingestion (dermal contact for B. anthracis) is the main component of the overall risk when pathogens are present on surfaces. Concentrations sampled from untracked floor, walls and the filter of heating ventilation and air conditioning (HVAC) system are proposed as indicators of previous exposure risk, while samples taken from touched surfaces are proposed as indicators of future risk if the building is reoccupied. A Monte Carlo uncertainty analysis is conducted and input-output correlations used to identify important parameter uncertainties. An approach is proposed for integrating these quantitative assessments of parameter uncertainty with broader, qualitative considerations to identify future research priorities. PMID:22412915
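    The Monte Carlo input-output correlation step can be sketched with a deliberately simplified exposure model. Every distribution and parameter below is a placeholder invented for illustration, not a value fitted in the paper; the point is the mechanics of ranking input importance by rank correlation:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical uncertain inputs: environmental decay rate, fomite transfer
# efficiency per touch, and an exponential dose-response parameter k.
decay = rng.lognormal(np.log(0.05), 0.5, n)      # 1/h
transfer = rng.beta(2, 8, n)                      # fraction per touch
k = rng.lognormal(np.log(1e-6), 1.0, n)           # per organism

surface_conc = 1e4                                # organisms/m2, held fixed
dose = surface_conc * transfer * np.exp(-decay * 24)   # after 24 h of decay
risk = 1.0 - np.exp(-k * dose)                    # exponential dose-response

# Rank parameter importance by input-output (Spearman) correlation.
for name, x in [("decay", decay), ("transfer", transfer), ("k", k)]:
    rho = stats.spearmanr(x, risk)[0]
    print(f"{name:9s} rho = {rho:+.2f}")
```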

  10. Estimation of Vickers hardness uncertainty for a heterogeneous welded joint (S235JR+AR and X2CrNiMo17-12-2)

    NASA Astrophysics Data System (ADS)

    Dijmărescu, M. C.; Dijmărescu, M. R.

    2017-08-01

    For tests that include measurements, the uncertainty of measurement is an essential element: it is important to know the limits within which the obtained results may be assumed to lie, and the influence that the elements of the measurement system have on those results. The research presented in this paper focuses on the estimation of the Vickers hardness uncertainty of measurement for the heterogeneous welded joint between S235JR+AR and X2CrNiMo17-12-2 materials, in order to establish the relevance of the results and to assess the quality of this joint. The paper is structured in three main parts. In the first part, the initial data necessary for the experiment are presented in terms of the characterisation of the welded joint and the technological means. The second part presents the development of the physical experiment and its results, and in the third part the uncertainty of the measurements is calculated and the results are discussed.
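    A GUM-style budget for Vickers hardness typically propagates the uncertainties of the test force F and the indentation diagonal d through the defining equation; the expression below is an illustrative combined-uncertainty form under that standard model, not necessarily the exact budget used in the paper (the last term is the repeatability of replicate indentations):

```latex
HV = 0.1891\,\frac{F}{d^{2}},
\qquad
\frac{u_{c}(HV)}{HV} = \sqrt{\left(\frac{u(F)}{F}\right)^{2}
  + \left(2\,\frac{u(d)}{d}\right)^{2}
  + \left(\frac{s_{\overline{HV}}}{\overline{HV}}\right)^{2}}
```

    The factor of 2 on the diagonal term reflects the squared dependence of hardness on d.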

  11. Model Selection for Monitoring CO2 Plume during Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-12-31

    The model selection method developed as part of this project includes four main steps: (1) assessing the connectivity/dynamic characteristics of a large prior ensemble of models, (2) model clustering using multidimensional scaling coupled with k-means clustering, (3) model selection using Bayes' rule in the reduced model space, and (4) model expansion using iterative resampling of the posterior models. The fourth step expresses one of the advantages of the method: it provides a built-in means of quantifying the uncertainty in predictions made with the selected models. In our application to plume monitoring, by expanding the posterior space of models, the final ensemble of representations of the geological model can be used to assess the uncertainty in predicting the future displacement of the CO2 plume. The software implementation of this approach is attached here.
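    Steps (1)-(2) can be sketched with standard tooling. The snippet below is a minimal illustration under the assumption that a pairwise dissimilarity matrix between model responses has already been computed; it is not the project's attached software:

```python
import numpy as np
from sklearn.manifold import MDS
from sklearn.cluster import KMeans

def select_representative_models(distance_matrix, n_clusters=5, seed=0):
    """Embed a prior ensemble of reservoir models (pairwise dissimilarity of
    their dynamic responses) with multidimensional scaling, cluster the
    embedding with k-means, and keep the model nearest to each centroid."""
    coords = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=seed).fit_transform(distance_matrix)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(coords)
    representatives = []
    for c in range(n_clusters):
        members = np.where(km.labels_ == c)[0]
        d = np.linalg.norm(coords[members] - km.cluster_centers_[c], axis=1)
        representatives.append(int(members[np.argmin(d)]))
    return representatives
```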

  12. Development of Semi-distributed ecohydrological model in the Rio Grande De Manati River Basin, Puerto Rico

    NASA Astrophysics Data System (ADS)

    Setegn, S. G.; Ortiz, J.; Melendez, J.; Barreto, M.; Torres-Perez, J. L.; Guild, L. S.

    2015-12-01

    There are limited studies in Puerto Rico that show the availability and variability of water resources with respect to changing climate and land use. The main goal of the NASA-funded HICE-PR project (Human Impacts to Coastal Ecosystems in Puerto Rico: the Río Loco Watershed, southwest coast PR) is to evaluate the impacts of land use/land cover changes on the quality and extent of coastal and marine ecosystems (CMEs) in two priority watersheds in Puerto Rico (Manatí and Guánica). The main objective of this study is to set up a physically based, spatially distributed hydrological model, the Soil and Water Assessment Tool (SWAT), for the analysis of hydrological processes in the Rio Grande de Manati river basin. SWAT is a spatially distributed watershed model developed to predict the impact of land management practices on water, sediment and agricultural chemical yields in large complex watersheds. For efficient use of distributed models for hydrological and scenario analysis, it is important that these models pass through careful calibration and uncertainty analysis. The model was calibrated and validated using the Sequential Uncertainty Fitting (SUFI-2) calibration and uncertainty analysis algorithm. The model evaluation statistics for streamflow prediction show good agreement between the measured and simulated flows, verified by coefficients of determination and Nash-Sutcliffe efficiencies greater than 0.5. Keywords: Hydrological Modeling; SWAT; SUFI-2; Rio Grande De Manati; Puerto Rico
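    The evaluation statistic named above is straightforward to compute. Here is a minimal sketch of the Nash-Sutcliffe efficiency; the 0.5 threshold quoted in the abstract is the common rule of thumb for a satisfactory fit:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model is
    no better than the mean of the observations."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / \
                 np.sum((observed - observed.mean()) ** 2)
```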

  13. Accounting for downscaling and model uncertainty in fine-resolution seasonal climate projections over the Columbia River Basin

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, Ali; Moradkhani, Hamid; Rana, Arun

    2018-01-01

    Climate change is expected to have severe impacts on natural systems as well as various socio-economic aspects of human life. This has urged scientific communities to improve the understanding of future climate and reduce the uncertainties associated with projections. In the present study, ten statistically downscaled CMIP5 GCMs at 1/16th degree spatial resolution, from two different downscaling procedures, are utilized over the Columbia River Basin (CRB) to assess the changes in climate variables and characterize the associated uncertainties. Three climate variables, i.e. precipitation, maximum temperature, and minimum temperature, are studied for the historical period of 1970-2000 as well as the future period of 2010-2099, simulated with representative concentration pathways RCP4.5 and RCP8.5. Bayesian Model Averaging (BMA) is employed to reduce the model uncertainty and develop a probabilistic projection for each variable in each scenario. Historical comparison of long-term attributes of GCMs and observations suggests a more accurate representation by BMA than by individual models. Furthermore, BMA projections are used to investigate future seasonal to annual changes of climate variables. Projections indicate significant increases in annual precipitation and temperature, with varying degrees of change across the different sub-basins of the CRB. We then characterized the uncertainty of future projections for each season over the CRB. Results reveal that model uncertainty is the largest of the sources of uncertainty considered. However, downscaling uncertainty considerably contributes to the total uncertainty of future projections, especially in summer. On the contrary, downscaling uncertainty appears to be higher than scenario uncertainty for precipitation.
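    The BMA combination step can be sketched in a few lines. This is a minimal illustration assuming the model weights have already been fitted against observations (the fitting itself, e.g. via expectation-maximization, is omitted); only the between-model part of the BMA variance is shown:

```python
import numpy as np

def bma_mean_and_variance(projections, weights):
    """Combine an ensemble of downscaled GCM projections with BMA weights.

    projections : array (models, time), e.g. annual precipitation series
    weights     : array (models,) of fitted BMA weights
    Returns the BMA mean and the between-model component of the BMA
    variance (the full BMA variance adds a within-model term).
    """
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    mean = weights @ projections
    between_var = weights @ (projections - mean) ** 2
    return mean, between_var
```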

  14. Impact of uncertainties in inorganic chemical rate constants on tropospheric composition and ozone radiative forcing

    NASA Astrophysics Data System (ADS)

    Newsome, Ben; Evans, Mat

    2017-12-01

    Chemical rate constants determine the composition of the atmosphere and how this composition has changed over time. They are central to our understanding of climate change and air quality degradation. Atmospheric chemistry models, whether online or offline, box, regional or global, use these rate constants. Expert panels evaluate laboratory measurements, making recommendations for the rate constants that should be used. This results in very similar or identical rate constants being used by all models. The inherent uncertainties in these recommendations are, in general, therefore ignored. We explore the impact of these uncertainties on the composition of the troposphere using the GEOS-Chem chemistry transport model. Based on the Jet Propulsion Laboratory (JPL) and International Union of Pure and Applied Chemistry (IUPAC) evaluations, we assess the influence of 50 mainly inorganic rate constants and 10 photolysis rates on tropospheric composition. We assess the impact on four standard metrics: annual mean tropospheric ozone burden, surface ozone and tropospheric OH concentrations, and tropospheric methane lifetime. Uncertainties in the rate constants for NO2 + OH + M → HNO3 + M and O3 + NO → NO2 + O2 are the two largest sources of uncertainty in these metrics. The absolute magnitude of the change in the metrics is similar if rate constants are increased or decreased by their σ values. We investigate two methods of assessing these uncertainties, addition in quadrature and a Monte Carlo approach, and conclude that they give similar outcomes. Combining the uncertainties across the 60 reactions gives overall uncertainties on the annual mean tropospheric ozone burden, surface ozone and tropospheric OH concentrations, and tropospheric methane lifetime of 10, 11, 16 and 16%, respectively. These are larger than the spread between models in recent model intercomparisons. Remote regions such as the tropics, poles and upper troposphere are the most uncertain. This chemical uncertainty is sufficiently large to suggest that rate constant uncertainty should be considered alongside other processes when model results disagree with measurement. Calculations for the pre-industrial simulation allow a tropospheric ozone radiative forcing to be calculated of 0.412 ± 0.062 W m-2. This uncertainty (13%) is comparable to the inter-model spread in ozone radiative forcing found in previous model-model intercomparison studies where the rate constants used in the models are all identical or very similar. Thus, the uncertainty of tropospheric ozone radiative forcing should be expanded to include this additional source of uncertainty. These rate constant uncertainties are significant and suggest that refinement of supposedly well-known chemical rate constants should be considered alongside other improvements to enhance our understanding of atmospheric processes.
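    The agreement between the two combination methods is expected whenever the perturbations are independent and propagate roughly linearly. A toy check with made-up sensitivity values (not the paper's 60 reactions) demonstrates why:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fractional 1-sigma sensitivities of a metric (e.g. ozone burden) to each
# rate constant; these numbers are invented for illustration.
sigma_i = np.array([0.06, 0.05, 0.03, 0.02, 0.02])

# Method 1: addition in quadrature.
quad = np.sqrt(np.sum(sigma_i ** 2))

# Method 2: Monte Carlo, perturbing all rates at once with independent
# normal factors and reading off the spread of the summed (linear) response.
samples = rng.normal(0.0, sigma_i, size=(100_000, sigma_i.size)).sum(axis=1)
mc = samples.std()

print(f"quadrature: {quad:.3f}   monte carlo: {mc:.3f}")  # agree closely
```

    For strongly nonlinear responses the two would diverge, which is presumably why both were checked.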

  15. Uncertainty on shallow landslide hazard assessment: from field data to hazard mapping

    NASA Astrophysics Data System (ADS)

    Trefolini, Emanuele; Tolo, Silvia; Patelli, Eduardo; Broggi, Matteo; Disperati, Leonardo; Le Tuan, Hai

    2015-04-01

    Shallow landsliding involving Hillslope Deposits (HD), the surficial soil that covers the bedrock, is an important process of erosion, transport and deposition of sediment along hillslopes. Although shallow landslides generally mobilize relatively small volumes of material, they represent the most hazardous factor in mountain regions due to their high velocity and the common absence of warning signs. Moreover, increasing urbanization and likely climate change make shallow landslides a source of widespread risk, and the interest of the scientific community in this process has therefore grown over the last three decades. One of the main aims of research projects on this topic is to perform robust shallow landslide hazard assessment for wide areas (regional assessment), in order to support sustainable spatial planning. Currently, three main methodologies may be implemented to assess regional shallow landslide hazard: expert evaluation, probabilistic (or data mining) methods, and methods based on physical models. The aim of this work is to evaluate the uncertainty of shallow landslide hazard assessment based on physical models, taking into account spatial variables such as geotechnical and hydrogeologic parameters as well as hillslope morphometry. To achieve this goal, a wide dataset of geotechnical properties (shear strength, permeability, depth and unit weight) of HD was gathered by integrating field survey, in situ and laboratory tests. This spatial database was collected from a study area of about 350 km2 including different bedrock lithotypes and geomorphological features. The uncertainty associated with each step of the hazard assessment process (e.g. field data collection, regionalization of site-specific information and numerical modelling of hillslope stability) was carefully characterized. The most appropriate probability density function (PDF) was chosen for each numerical variable, and we assessed the uncertainty propagation onto HD strength parameters obtained by empirical relations with geotechnical index properties. Site-specific information was regionalized at map scale by (hard and fuzzy) clustering analysis taking into account spatial variables such as geology, geomorphology and hillslope morphometric variables (longitudinal and transverse curvature, flow accumulation and slope), the latter derived from a DEM with 10 m cell size. In order to map shallow landslide hazard, Monte Carlo simulation was performed for some common physically based models available in the literature (e.g. SINMAP, SHALSTAB, TRIGRS). Furthermore, a new approach based on the use of Bayesian Networks was proposed and validated. Different models, such as Intervals, Convex Models and Fuzzy Sets, were adopted for the modelling of input parameters. Finally, an accuracy assessment was carried out on the resulting maps, and the propagation of the uncertainty of input parameters into the final shallow landslide hazard estimate was assessed. The outcomes of the analysis are compared and discussed in terms of the discrepancy among map pixel values and the related estimated error. The novelty of the proposed method lies in the estimation of the confidence of shallow landslide hazard mapping at the regional level. This allows i) regions where the hazard assessment is robust to be discriminated from areas where more data are necessary to increase the confidence level, and ii) the reliability of the procedure used for hazard assessment to be evaluated.
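    The Monte Carlo step can be illustrated with the infinite-slope stability model that underlies tools such as SINMAP and SHALSTAB. The PDFs and parameter values below are placeholders chosen for illustration, not the distributions fitted from the study's field database:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Illustrative PDFs for the hillslope-deposit properties.
phi = np.radians(rng.normal(32, 3, n))          # friction angle (deg -> rad)
c = rng.lognormal(np.log(3.0), 0.5, n)          # cohesion, kPa
depth = rng.uniform(0.5, 2.0, n)                # soil depth, m
m = rng.uniform(0.0, 1.0, n)                    # saturation ratio
gamma, gw = 18.0, 9.81                          # soil / water unit weight, kN/m3
beta = np.radians(35.0)                         # slope angle

# Infinite-slope factor of safety for each Monte Carlo realisation.
fs = (c + (gamma - m * gw) * depth * np.cos(beta) ** 2 * np.tan(phi)) \
     / (gamma * depth * np.sin(beta) * np.cos(beta))

print(f"P(FS < 1) = {np.mean(fs < 1):.2f}")     # per-cell failure probability
```

    Repeating this per map cell, with regionalized parameter PDFs, yields the probabilistic hazard map whose confidence the study then assesses.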

  16. Uncertainty characterization of HOAPS 3.3 latent heat-flux-related parameters

    NASA Astrophysics Data System (ADS)

    Liman, Julian; Schröder, Marc; Fennig, Karsten; Andersson, Axel; Hollmann, Rainer

    2018-03-01

    Latent heat flux (LHF) is one of the main contributors to the global energy budget. As the density of in situ LHF measurements over the global oceans is generally poor, the potential of remotely sensed LHF for meteorological applications is enormous. However, to date none of the available satellite products have included estimates of systematic, random, and sampling uncertainties, all of which are essential for assessing their quality. Here, the challenge is taken on by matching LHF-related pixel-level data of the Hamburg Ocean Atmosphere Parameters and Fluxes from Satellite (HOAPS) climatology (version 3.3) to in situ measurements originating from a high-quality data archive of buoys and selected ships. Assuming the ground reference to be bias-free, this allows for deriving instantaneous systematic uncertainties as a function of four atmospheric predictor variables. The approach is regionally independent and therefore overcomes the issue of sparse in situ data densities over large oceanic areas. Likewise, random uncertainties are derived, which include not only a retrieval component but also contributions from in situ measurement noise and the collocation procedure. A recently published random uncertainty decomposition approach is applied to isolate the random retrieval uncertainty of all LHF-related HOAPS parameters. It makes use of two combinations of independent data triplets of both satellite and in situ data, which are analysed in terms of their pairwise variances of differences. Instantaneous uncertainties are finally aggregated, allowing for uncertainty characterizations on monthly to multi-annual timescales. Results show that systematic LHF uncertainties range between 15 and 50 W m-2 with a global mean of 25 W m-2. Local maxima are mainly found over the subtropical ocean basins as well as along the western boundary currents. Investigations indicate that contributions from qa (U) to the overall LHF uncertainty are on the order of 60 % (25 %). From an instantaneous point of view, random retrieval uncertainties are specifically large over the subtropics with a global average of 37 W m-2. In a climatological sense, their magnitudes become negligible, as do respective sampling uncertainties. Regional and seasonal analyses suggest that largest total LHF uncertainties are seen over the Gulf Stream and the Indian monsoon region during boreal winter. In light of the uncertainty measures, the observed continuous global mean LHF increase up to 2009 needs to be treated with caution. The demonstrated approach can easily be transferred to other satellite retrievals, which increases the significance of the present work.
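    One standard way to isolate the random retrieval error from a triplet of collocated estimates with mutually independent errors is the covariance (triple collocation) formulation, which is consistent with the pairwise variances-of-differences idea described above; the exact decomposition used in the paper may differ. A minimal sketch:

```python
import numpy as np

def triple_collocation_error(x, y, z):
    """Random error standard deviation of dataset x, given three collocated
    estimates of the same quantity (e.g. HOAPS LHF, buoy LHF and a third
    product) whose errors are assumed uncorrelated with each other."""
    x, y, z = map(np.asarray, (x, y, z))
    # Error variance of x = Cov(x - y, x - z) under the independence assumption.
    err_var = np.mean((x - y) * (x - z)) - np.mean(x - y) * np.mean(x - z)
    return np.sqrt(max(err_var, 0.0))
```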

  17. Uncertainty in urban flood damage assessment due to urban drainage modelling and depth-damage curve estimation.

    PubMed

    Freni, G; La Loggia, G; Notaro, V

    2010-01-01

    Due to the increased occurrence of flooding events in urban areas, many procedures for flood damage quantification have been defined in recent decades. The lack of large databases in most cases is overcome by combining the output of urban drainage models and damage curves linking flooding to expected damage. The application of advanced hydraulic models as diagnostic, design and decision-making support tools has become a standard practice in hydraulic research and application. Flooding damage functions are usually evaluated by a priori estimation of potential damage (based on the value of exposed goods) or by interpolating real damage data (recorded during historical flooding events). Hydraulic models have undergone continuous advancements, pushed forward by increasing computer capacity. The details of the flooding propagation process on the surface and the details of the interconnections between underground and surface drainage systems have been studied extensively in recent years, resulting in progressively more reliable models. The same level of advancement has not been reached with regard to damage curves, for which improvements are highly connected to data availability; this remains the main bottleneck in expected flooding damage estimation. Such functions are usually affected by significant uncertainty intrinsically related to the collected data and to the simplified structure of the adopted functional relationships. The present paper aimed to evaluate this uncertainty by comparing the intrinsic uncertainty connected to the construction of the damage-depth function to the hydraulic model uncertainty. In this way, the paper sought to evaluate the role of hydraulic model detail level in the wider context of flood damage estimation. This paper demonstrated that the use of detailed hydraulic models might not be justified because of the higher computational cost and the significant uncertainty in damage estimation curves. This uncertainty occurs mainly because a large part of the total uncertainty is dependent on the depth-damage curves. Improving the estimation of these curves may provide better results in terms of uncertainty reduction than the adoption of detailed hydraulic models.

  18. Uncertainty in Agricultural Impact Assessment

    NASA Technical Reports Server (NTRS)

    Wallach, Daniel; Mearns, Linda O.; Rivington, Michael; Antle, John M.; Ruane, Alexander C.

    2014-01-01

    This chapter considers issues concerning uncertainty associated with modeling and its use within agricultural impact assessments. Information about uncertainty is important for those who develop assessment methods, since that information indicates the need for, and the possibility of, improvement of the methods and databases. Such information also allows one to compare alternative methods. Information about the sources of uncertainties is an aid in prioritizing further work on the impact assessment method. Uncertainty information is also necessary for those who apply assessment methods, e.g., for projecting climate change impacts on agricultural production and for stakeholders who want to use the results as part of a decision-making process (e.g., for adaptation planning). For them, uncertainty information indicates the degree of confidence they can place in the simulated results. Quantification of uncertainty also provides stakeholders with an important guideline for making decisions that are robust across the known uncertainties. Thus, uncertainty information is important for any decision based on impact assessment. Ultimately, we are interested in knowledge about uncertainty so that information can be used to achieve positive outcomes from agricultural modeling and impact assessment.

  19. Uncertainty analysis for low-level radioactive waste disposal performance assessment at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, D.W.; Yambert, M.W.; Kocher, D.C.

    1994-12-31

    A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter III. An analysis of the uncertainty incorporated into the assessment was performed which addressed the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and human exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.

  20. The epistemic and aleatory uncertainties of the ETAS-type models: an application to the Central Italy seismicity.

    PubMed

    Lombardi, A M

    2017-09-18

    Stochastic models provide quantitative evaluations of the occurrence of earthquakes. A basic component of this type of model is the treatment of uncertainties in defining the main features of an intrinsically random process. Even if, at a very basic level, any attempt to distinguish between types of uncertainty is questionable, a usual way to deal with this topic is to separate epistemic uncertainty, due to lack of knowledge, from aleatory variability, due to randomness. In the present study this problem is addressed in the narrow context of short-term modelling of earthquakes and, specifically, of ETAS modelling. By means of an application of a specific version of the ETAS model to the seismicity of Central Italy, recently struck by a sequence with a main event of Mw 6.5, the aleatory and epistemic (parametric) uncertainties are separated and quantified. The main result of the paper is that the parametric uncertainty of the ETAS-type model adopted here is much lower than the aleatory variability in the process. This result points out two main aspects: an analyst has good chances of setting up an ETAS-type model, but may still only be able to describe and forecast earthquake occurrences with limited precision and accuracy.
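    For reference, the temporal conditional intensity of a standard ETAS model has the form below; the paper applies a specific variant, so the exact parameterisation may differ. Here μ is the background rate and each past event of magnitude M_i contributes an Omori-type aftershock term with productivity K, magnitude scaling α, and Omori parameters c and p:

```latex
\lambda(t \mid \mathcal{H}_{t}) \;=\; \mu \;+\; \sum_{t_{i} < t}
\frac{K\, e^{\alpha \,(M_{i} - M_{0})}}{\left(t - t_{i} + c\right)^{p}}
```

    Parametric (epistemic) uncertainty concerns the estimates of (μ, K, α, c, p); aleatory variability is the randomness of the point process itself even with those parameters fixed.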

  1. Error of the modelled peak flow of the hydraulically reconstructed 1907 flood of the Ebro River in Xerta (NE Iberian Peninsula)

    NASA Astrophysics Data System (ADS)

    Lluís Ruiz-Bellet, Josep; Castelltort, Xavier; Carles Balasch, J.; Tuset, Jordi

    2016-04-01

    The estimation of uncertainty in hydraulic modelling results has been analysed in depth, but no clear methodological procedure for determining it has been formulated for historical hydrology. The main objective of this study was to calculate the uncertainty of the resulting peak flow of a typical historical flood reconstruction. The secondary objective was to identify the input variables that influenced the result the most and their contribution to the total peak flow error. The uncertainty of the 21-23 October 1907 flood of the Ebro River (NE Iberian Peninsula) in the town of Xerta (83,000 km2) was calculated with a series of local sensitivity analyses of the main variables affecting the resulting peak flow. In addition, in order to see to what degree the result depended on the chosen model, the HEC-RAS peak flow was compared to the ones obtained with the 2D model Iber and with Manning's equation. The peak flow of the 1907 flood of the Ebro River in Xerta, reconstructed with HEC-RAS, was 11500 m3·s-1 and its total error was ±31%. The most influential input variable on the HEC-RAS peak flow was water height; however, the one that contributed the most to the peak flow error was Manning's n, because its uncertainty was far greater than that of water height. The main conclusion is that, to ensure the lowest peak flow error, the reliability and precision of the flood mark should be thoroughly assessed. The peak flow was 12000 m3·s-1 when calculated with the 2D model Iber and 11500 m3·s-1 when calculated with Manning's equation.
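    The dominance of Manning's n over water height can be illustrated with a direct Monte Carlo propagation through Manning's equation. The channel geometry, roughness range, and height uncertainty below are invented for illustration (chosen only so the midpoint discharge lands near the reconstructed order of magnitude), not the study's calibrated values:

```python
import numpy as np

def manning_peak_flow(n, area, radius, slope):
    """Manning's equation: Q = A * R^(2/3) * S^(1/2) / n."""
    return area * radius ** (2.0 / 3.0) * np.sqrt(slope) / n

rng = np.random.default_rng(2)
N = 100_000

# Wide roughness uncertainty vs a small (+/- 2 %) water-height uncertainty,
# the latter scaling both flow area and hydraulic radius.
n = rng.uniform(0.030, 0.045, N)
h_scale = rng.normal(1.0, 0.02, N)
area, radius, slope = 5000.0 * h_scale, 9.0 * h_scale, 4e-4

q = manning_peak_flow(n, area, radius, slope)
print(f"mean Q = {q.mean():.0f} m3/s, relative spread = {q.std()/q.mean():.0%}")
```

    With these assumptions the spread in Q is driven almost entirely by n, mirroring the paper's conclusion.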

  2. Iterative performance assessments as a regulatory tool for evaluating repository safety: How experiences from SKI Project-90 were used in formulating the new performance assessment project SITE-94

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersson, J.

    1993-12-31

    The Swedish Nuclear Power Inspectorate, SKI, regulatory research program has to prepare for the process of licensing a repository for spent nuclear fuel by building up the necessary knowledge and review capacity. SKI's main strategy for meeting this demand is to develop an independent performance assessment capability. SKI's first performance assessment project, Project-90, was completed in 1991 and is now followed by a new project, SITE-94. SITE-94 is based on conclusions reached within Project-90. An independent review of Project-90, carried out by a NEA team of experts, has also contributed to the formation of the project. Another important reason for the project is that the implementing organization in Sweden, SKB, has proposed to submit an application to start detailed investigation of a repository candidate site around 1997. SITE-94 is a performance assessment of a hypothetical repository at a real site. The main objective of the project is to determine how site-specific data should be assimilated into the performance assessment process, and to evaluate how uncertainties inherent in site characterization will influence performance assessment results. This will be addressed by exploring multiple interpretations, conceptual models, and parameters consistent with the site data. The site evaluation will strive for consistency between the geological, hydrological, rock mechanical, and geochemical descriptions. Other important elements of SITE-94 are the development of a practical and defensible methodology for defining, constructing and analyzing scenarios, the development of approaches for the treatment of uncertainties, the evaluation of canister integrity, and the development and application of an appropriate quality assurance plan for performance assessments.

  3. Global assessment of water policy vulnerability under uncertainty in water scarcity projections

    NASA Astrophysics Data System (ADS)

    Greve, Peter; Kahil, Taher; Satoh, Yusuke; Burek, Peter; Fischer, Günther; Tramberend, Sylvia; Byers, Edward; Flörke, Martina; Eisner, Stephanie; Hanasaki, Naota; Langan, Simon; Wada, Yoshihide

    2017-04-01

    Water scarcity is a critical environmental issue worldwide, which has been driven by the significant increase in water extractions during the last century. In the coming decades, climate change is projected to further exacerbate water scarcity conditions in many regions around the world. At present, one important question for policy debate is the identification of water policy interventions that could address the mounting water scarcity problems. Main interventions include investing in water storage infrastructures, water transfer canals, efficient irrigation systems, and desalination plants, among many others. These interventions involve long-term planning, long-lived investments and some irreversibility in choices that can shape the development of countries for decades. Making decisions on these water infrastructures requires anticipating the long-term environmental conditions, needs and constraints under which they will function. This brings large uncertainty into the decision-making process, for instance from demographic or economic projections. But today, climate change is adding another layer of uncertainty that makes decisions even more complex. In this study, we assess in a probabilistic approach the uncertainty in global water scarcity projections following different socioeconomic pathways (SSPs) and climate scenarios (RCPs) within the first half of the 21st century. By utilizing an ensemble of 45 future water scarcity projections based on (i) three state-of-the-art global hydrological models (PCR-GLOBWB, H08, and WaterGAP), (ii) five climate models, and (iii) three water scenarios, we have assessed changes in water scarcity and the associated uncertainty distribution worldwide. The water scenarios used here were developed by IIASA's Water Futures and Solutions (WFaS) Initiative. The main objective of this study is to improve the contribution of hydro-climatic information to effective policymaking by identifying spatial and temporal policy vulnerabilities under large uncertainty about future socio-economic and climatic changes, and to guide policymakers in charting a more sustainable pathway and avoiding maladaptive development pathways. The results show that water scarcity is increasing in up to 83% of all land area under a high-emission scenario (RCP6.0-SSP3). Importantly, the range of uncertainty in projected water scarcity is increasing, in some regions by several orders of magnitude (e.g. sub-Saharan Africa, eastern Europe, Central Asia). This is further illustrated by focusing on a set of large river basins that will be subject both to substantial changes in basin-wide water scarcity and to strong increases in the overall range of uncertainty (e.g. the Niger, Indus, Yangtze). These conditions pose a significant challenge for water management options in those vulnerable basins, complicating decisions on needed investments in water supply infrastructure and other system improvements, and leading to the degradation of valuable resources such as non-renewable groundwater resources and water-dependent ecosystems. The results of this study call for careful and deliberative design of water policy interventions under a wide range of socio-economic and climate conditions.

  4. Risk assessment for construction projects of transport infrastructure objects

    NASA Astrophysics Data System (ADS)

    Titarenko, Boris

    2017-10-01

    The paper analyzes and compares different methods of risk assessment for construction projects of transport infrastructure objects. The management of such projects demands the application of special probabilistic methods due to the large level of uncertainty in their implementation. Risk management in these projects requires the use of probabilistic and statistical methods. The aim of the work is to develop a methodology for using traditional methods in combination with robust methods that allow reliable risk assessments to be obtained. The robust approach is based on the principle of maximum likelihood and, in assessing risk, allows the researcher to obtain reliable results in situations of great uncertainty. The application of robust procedures allows a quantitative assessment of the main risk indicators of projects to be carried out when solving the tasks of managing innovation-investment projects. The calculation of the damage from the occurrence of a risky event can be done by any competent specialist, whereas an assessment of the probability of occurrence of a risky event requires the involvement of special probabilistic methods based on the proposed robust approaches. Practice shows the effectiveness and reliability of the results. The methodology developed in the article can be used to create information technologies and to apply them in automated control systems for complex projects.

  5. Using Ecosystem Experiments to Improve Vegetation Models

    DOE PAGES

    Medlyn, Belinda; Zaehle, S; DeKauwe, Martin G.; ...

    2015-05-21

    Ecosystem responses to rising CO2 concentrations are a major source of uncertainty in climate change projections. Data from ecosystem-scale Free-Air CO2 Enrichment (FACE) experiments provide a unique opportunity to reduce this uncertainty. The recent FACE Model–Data Synthesis project aimed to use the information gathered in two forest FACE experiments to assess and improve land ecosystem models. A new 'assumption-centred' model intercomparison approach was used, in which participating models were evaluated against experimental data based on the ways in which they represent key ecological processes. By identifying and evaluating the main assumptions that caused differences among models, the assumption-centred approach produced a clear roadmap for reducing model uncertainty. We explain this approach and summarize the resulting research agenda. We encourage the application of this approach in other model intercomparison projects to fundamentally improve predictive understanding of the Earth system.

  6. Real-Time Optimal Flood Control Decision Making and Risk Propagation Under Multiple Uncertainties

    NASA Astrophysics Data System (ADS)

    Zhu, Feilin; Zhong, Ping-An; Sun, Yimeng; Yeh, William W.-G.

    2017-12-01

    Multiple uncertainties exist in the optimal flood control decision-making process, introducing risk into flood control decisions. This paper defines the main steps in optimal flood control decision making that constitute the Forecast-Optimization-Decision Making (FODM) chain. We propose a framework for supporting optimal flood control decision making under multiple uncertainties and evaluate risk propagation along the FODM chain from a holistic perspective. To deal with uncertainties, we employ stochastic models at each link of the FODM chain. We generate synthetic ensemble flood forecasts via the martingale model of forecast evolution. We then establish a multiobjective stochastic programming with recourse model for optimal flood control operation. The Pareto front under uncertainty is derived via the constraint method coupled with a two-step process. We propose a novel SMAA-TOPSIS model for stochastic multicriteria decision making. We then propose a risk assessment model, together with the risk of decision-making errors and a rank uncertainty degree, to quantify the risk propagation process along the FODM chain. We conduct numerical experiments to investigate the effects of flood forecast uncertainty on optimal flood control decision making and risk propagation. We apply the proposed methodology to a flood control system in the Daduhe River basin in China. The results indicate that the proposed method can provide valuable risk information at each link of the FODM chain and enable risk-informed decisions with higher reliability.

  7. Performance of a fast digital integrator in on-field magnetic measurements for particle accelerators

    NASA Astrophysics Data System (ADS)

    Arpaia, P.; Bottura, L.; Fiscarelli, L.; Walckiers, L.

    2012-02-01

    The fast digital integrator has been conceived to meet the most demanding magnet test requirements, with a resolution of 10 ppm, a signal-to-noise ratio of 105 dB at 20 kHz, a time resolution of 50 ns, an offset of 10 ppm, and on-line processing. In this paper, the on-field performance of the fast digital integrator is assessed by a specific measurement campaign at the European Organization for Nuclear Research (CERN). First, the architecture and the metrological specifications of the instrument are reported. Then, the recent on-field achievements of (i) ±10 ppm of uncertainty in the measurement of the main field for superconducting magnet characterization, (ii) ±0.02% of field uncertainty in the quality assessment of small-aperture permanent magnets, and (iii) ±0.15% of drift in an excitation current measurement of 600 s under cryogenic conditions are presented and discussed.

  8. Uncertainties in discharge projections in consequence of climate change

    NASA Astrophysics Data System (ADS)

    Liebert, J.; Düthmann, D.; Berg, P.; Feldmann, H.; Ihringer, J.; Kunstmann, H.; Merz, B.; Ott, I.; Schädler, G.; Wagner, S.

    2012-04-01

    The fourth assessment report of the IPCC summarizes possible effects of global climate change. For Europe, an increasing variability of temperature and precipitation is expected. While the temperature increase is projected almost uniformly across Europe, for precipitation the models indicate partly heterogeneous tendencies. In order to maintain current safety standards in the infrastructure of our various water management systems, possible future flood discharges are very often a central question. In the planning and operation of water infrastructure systems, uncertainty considerations play an important role. In times of climate change, analyses of measured historical gauge data (normally 30-80 years) are not sufficient, because even significant trends are only valid in the analyzed time period and extrapolations are exceedingly difficult. Therefore, combined climate and hydrological modelling for scenario-based projections has become more and more popular. Given that adaptation measures in water infrastructure are in general very time-consuming and cost-intensive, qualified questions about the variability and uncertainty of model-based results are important as well. The CEDIM project "Flood hazards in a changing climate" focuses on both: future changes in flood discharge, and assessing the uncertainties involved in such model-based future predictions. In detail, the study is based on an ensemble of hydrological model (HM) simulations in three representative small to medium sized German river catchments (Ammer, Mulde and Ruhr). The meteorological input is based on two high-resolution (7 km) regional climate models (RCMs) driven by two global climate models (GCMs) for the near future (2021-2050), following the A1B emission scenario (SRES). Two of the catchments (Ruhr and Mulde) have sub-mountainous character and one (Ammer) has alpine character. Besides analyzing the future changes in discharge in the catchments, describing and potentially quantifying the variability of the results, arising from the different driving data, regionalization methods, spatial resolutions and model types, is one main goal of the study and the focus of the poster. The general result is a large variability in the discharge projections. In the annual regime, the identified variabilities are attributable to different causes within the model chain (GCM-RCM-HM). In winter, the global climate models (GCMs) contribute the main uncertainties in the future projection. In summer, the main variability stems from the meteorological downscaling to the regional scale (RCMs) in combination with the hydrological modelling (HM). Nevertheless, with an appropriate ensemble statistic, mean future tendencies are detectable despite the large variability. The Ruhr catchment shows tendencies towards higher future flood discharges, while no significant changes are expected in the Ammer and Mulde catchments.

  9. Modelling the exposure to chemicals for risk assessment: a comprehensive library of multimedia and PBPK models for integration, prediction, uncertainty and sensitivity analysis - the MERLIN-Expo tool.

    PubMed

    Ciffroy, P; Alfonso, B; Altenpohl, A; Banjac, Z; Bierkens, J; Brochot, C; Critto, A; De Wilde, T; Fait, G; Fierens, T; Garratt, J; Giubilato, E; Grange, E; Johansson, E; Radomyski, A; Reschwann, K; Suciu, N; Tanaka, T; Tediosi, A; Van Holderbeke, M; Verdonck, F

    2016-10-15

    MERLIN-Expo is a library of models that was developed in the frame of the FP7 EU project 4FUN in order to provide an integrated assessment tool for state-of-the-art exposure assessment for environment, biota and humans, allowing the detection of scientific uncertainties at each step of the exposure process. This paper describes the main features of the MERLIN-Expo tool. The main challenges in exposure modelling that MERLIN-Expo has tackled are: (i) the integration of multimedia (MM) models simulating the fate of chemicals in environmental media, and of physiologically based pharmacokinetic (PBPK) models simulating the fate of chemicals in the human body; MERLIN-Expo thus allows the determination of internal effective chemical concentrations; (ii) the incorporation of a set of functionalities for uncertainty/sensitivity analysis, from screening to variance-based approaches; the availability of such tools for uncertainty and sensitivity analysis is aimed at facilitating the incorporation of these issues in future decision making; (iii) the integration of human and wildlife biota targets with common fate modelling in the environment. MERLIN-Expo is composed of a library of fate models dedicated to non-biological receptor media (surface waters, soils, outdoor air), biological media of concern for humans (several cultivated crops, mammals, milk, fish), as well as wildlife biota (primary producers in rivers, invertebrates, fish) and humans. These models can be linked together to create flexible scenarios relevant for both human and wildlife biota exposure. Standardized documentation for each model and training material were prepared to support accurate use of the tool by end-users. One of the objectives of the 4FUN project was also to increase confidence in the applicability of the MERLIN-Expo tool through targeted realistic case studies. In particular, we aimed at demonstrating the feasibility of building complex realistic exposure scenarios and the accuracy of the modelling predictions through a comparison with actual measurements. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Quantification of allyl hexanoate in pineapple beverages and yogurts as a case study to characterise a source of uncertainty in dietary exposure assessment to flavouring substances.

    PubMed

    Raffo, A; D'Aloise, A; Magrì, A D; Leclercq, C

    2012-01-01

    One source of uncertainty in the estimation of dietary exposure to flavouring substances is the uncertainty in the occurrence and concentration levels of these substances naturally present or added to foodstuffs. The aim of this study was to assess the variability of concentration levels of allyl hexanoate, considered as a case study, in two main food categories to which it is often added: pineapple juice-based beverages and yogurts containing pineapple. Thirty-four beverages and 29 yogurts, with pineapple fruit or juice and added flavourings declared as ingredients on the package, were purchased from the local market (in Rome) and analysed. Analytical methods based on the stir bar sorptive extraction (SBSE) technique for the isolation of the target analyte, and on GC-MS analysis for final determination, were developed for the two food categories. In beverages, allyl hexanoate concentrations ranged from less than 0.01 to 16.71 mg l-1, whereas in yogurts they ranged from 0.02 to 89.41 mg kg-1. Average concentrations in beverages and yogurts with pineapple as the main fruit ingredient (1.91 mg l-1 for beverages, 9.61 mg kg-1 for yogurts) were in fair agreement with average use level data reported from industry surveys for the relevant food categories (4.5 and 6.0 mg kg-1, respectively). Within the group of yogurts a single product was found to contain a level of allyl hexanoate more than 10-fold higher than the average reported use level. The screening techniques developed by the European Food Safety Authority (EFSA) using use level data provided by industry gave estimates of exposure that were of the same order of magnitude as the estimates obtained for regular consumers who would be loyal to the pineapple yogurt and beverage products containing the highest observed concentration of the substance of interest. In this specific case the uncertainty in the results obtained with the use of standard screening techniques for exposure assessment based on industry reported use levels is low.

  11. Uncertainties associated with quantifying climate change impacts on human health: a case study for diarrhea.

    PubMed

    Kolstad, Erik W; Johansson, Kjell Arne

    2011-03-01

    Climate change is expected to have large impacts on health at low latitudes, where droughts and malnutrition, diarrhea, and malaria are projected to increase. The main objective of this study was to demonstrate a method to assess a range of plausible health impacts of climate change while handling uncertainties in an unambiguous manner. We illustrate this method by quantifying the impacts of projected regional warming on diarrhea in this century. We combined a range of linear regression coefficients to compute projections of future climate change-induced increases in diarrhea, using the results from five empirical studies and a 19-member climate model ensemble for which future greenhouse gas emissions were prescribed. Six geographical regions were analyzed. The model ensemble projected temperature increases of up to 4°C over land in the tropics and subtropics by the end of this century. The associated mean projected increases in the relative risk of diarrhea in the six study regions were 8-11% (with SDs of 3-5%) by 2010-2039 and 22-29% (SDs of 9-12%) by 2070-2099. Even our most conservative estimates indicate substantial impacts from climate change on the incidence of diarrhea. Nevertheless, our main conclusion is that large uncertainties are associated with future projections of diarrhea and climate change. We believe that these uncertainties can be attributed primarily to the sparsity of empirical climate-health data. Our results therefore highlight the need for empirical data in the cross-section between climate and human health.
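    The combination of regression coefficients with ensemble warming projections can be sketched as a simple Monte Carlo product. The numbers below are hypothetical placeholders standing in for the spread across the five empirical studies and the 19-member ensemble, chosen only so the output lands in the same ballpark as the quoted end-of-century figures:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000

# Hypothetical spreads: % increase in diarrhea relative risk per deg C
# (across empirical studies) and projected regional warming (across the
# climate model ensemble).
beta = rng.normal(5.0, 1.5, n)       # % RR increase per deg C
delta_t = rng.normal(4.0, 1.0, n)    # deg C of warming by 2070-2099

rr_increase = beta * delta_t         # % increase in relative risk
print(f"RR increase: {rr_increase.mean():.0f}% (SD {rr_increase.std():.0f}%)")
```

    The SD of the product captures how both the climate-health coefficient and the warming projection contribute to the total uncertainty.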

  12. Uncertainty Assessment of Synthetic Design Hydrographs for Gauged and Ungauged Catchments

    NASA Astrophysics Data System (ADS)

    Brunner, Manuela I.; Sikorska, Anna E.; Furrer, Reinhard; Favre, Anne-Catherine

    2018-03-01

    Design hydrographs described by peak discharge, hydrograph volume, and hydrograph shape are essential for engineering tasks involving storage. Such design hydrographs are inherently uncertain as are classical flood estimates focusing on peak discharge only. Various sources of uncertainty contribute to the total uncertainty of synthetic design hydrographs for gauged and ungauged catchments. These comprise model uncertainties, sampling uncertainty, and uncertainty due to the choice of a regionalization method. A quantification of the uncertainties associated with flood estimates is essential for reliable decision making and allows for the identification of important uncertainty sources. We therefore propose an uncertainty assessment framework for the quantification of the uncertainty associated with synthetic design hydrographs. The framework is based on bootstrap simulations and consists of three levels of complexity. On the first level, we assess the uncertainty due to individual uncertainty sources. On the second level, we quantify the total uncertainty of design hydrographs for gauged catchments and the total uncertainty of regionalizing them to ungauged catchments but independently from the construction uncertainty. On the third level, we assess the coupled uncertainty of synthetic design hydrographs in ungauged catchments, jointly considering construction and regionalization uncertainty. We find that the most important sources of uncertainty in design hydrograph construction are the record length and the choice of the flood sampling strategy. The total uncertainty of design hydrographs in ungauged catchments depends on the catchment properties and is not negligible in our case.
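    The bootstrap machinery on the first level of the framework can be sketched for the sampling uncertainty of the peak discharge alone. The Gumbel distribution and method-of-moments fit below are simplifying assumptions made for brevity; the study's construction of full design hydrographs (volume and shape included) involves more components:

```python
import numpy as np

def bootstrap_design_peak(annual_maxima, return_period=100, n_boot=2000, seed=0):
    """Sampling uncertainty of a T-year design peak: resample annual maxima
    with replacement, refit a Gumbel distribution by moments, and collect
    the T-year quantile across bootstrap replicates."""
    rng = np.random.default_rng(seed)
    x = np.asarray(annual_maxima, dtype=float)
    p = 1.0 - 1.0 / return_period
    peaks = []
    for _ in range(n_boot):
        s = rng.choice(x, size=x.size, replace=True)
        beta = np.sqrt(6.0) * s.std(ddof=1) / np.pi   # Gumbel scale
        mu = s.mean() - 0.5772 * beta                 # Gumbel location
        peaks.append(mu - beta * np.log(-np.log(p)))  # T-year quantile
    return np.percentile(peaks, [5, 50, 95])
```

    Shorter records widen the resulting interval directly, consistent with record length being a dominant uncertainty source.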

  13. Where do uncertainties reside within environmental risk assessments? Testing UnISERA, a guide for uncertainty assessment.

    PubMed

    Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T

    2017-06-01

    A means for identifying and prioritising the treatment of uncertainty (UnISERA) in environmental risk assessments (ERAs) is tested, using three risk domains where ERA is an established requirement and one in which ERA practice is emerging. UnISERA's development draws on 19 expert elicitations across genetically modified higher plants, particulate matter, and agricultural pesticide release, and is stress tested here for engineered nanomaterials (ENM). We are concerned with the severity of uncertainty, its nature, and its location across four accepted stages of ERAs. Using an established uncertainty scale, the risk characterisation stage of ERA harbours the highest severity level of uncertainty, associated with estimating, aggregating and evaluating expressions of risk. Combined epistemic and aleatory uncertainty is the dominant nature of uncertainty. The dominant location of uncertainty is associated with data in problem formulation, exposure assessment and effects assessment. Testing UnISERA produced agreements of 55%, 90%, and 80% for the severity level, nature and location dimensions of uncertainty between the combined case studies and the ENM stress test. UnISERA enables environmental risk analysts to prioritise risk assessment phases, groups of tasks, or individual ERA tasks, and it can direct them towards established methods for uncertainty treatment. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Quantifying volcanic hazard at Campi Flegrei caldera (Italy) with uncertainty assessment: 2. Pyroclastic density current invasion maps

    NASA Astrophysics Data System (ADS)

    Neri, Augusto; Bevilacqua, Andrea; Esposti Ongaro, Tomaso; Isaia, Roberto; Aspinall, Willy P.; Bisson, Marina; Flandoli, Franco; Baxter, Peter J.; Bertagnini, Antonella; Iannuzzi, Enrico; Orsucci, Simone; Pistolesi, Marco; Rosi, Mauro; Vitale, Stefano

    2015-04-01

    Campi Flegrei (CF) is an example of an active caldera containing densely populated settlements at very high risk of pyroclastic density currents (PDCs). We present here an innovative method for assessing background spatial PDC hazard in a caldera setting with probabilistic invasion maps conditional on the occurrence of an explosive event. The method encompasses the probabilistic assessment of potential vent opening positions, derived in the companion paper, combined with inferences about the spatial density distribution of PDC invasion areas from a simplified flow model, informed by reconstruction of deposits from eruptions in the last 15 ka. The flow model describes the PDC kinematics and accounts for the main effects of topography on flow propagation. Structured expert elicitation is used to incorporate certain sources of epistemic uncertainty, and a Monte Carlo approach is adopted to produce a set of probabilistic hazard maps for the whole CF area. Our findings show that, in the case of an eruption, almost the entire caldera is exposed to invasion with a mean probability of at least 5%, with peaks greater than 50% in some central areas. Some areas outside the caldera are also exposed to this danger, with mean probabilities of invasion of the order of 5-10%. Our analysis suggests that these probability estimates have location-specific uncertainties which can be substantial. The results prove to be robust with respect to alternative elicitation models and allow the influence on hazard mapping of different sources of uncertainty, and of theoretical and numerical assumptions, to be quantified.

  15. Dealing with uncertainties in environmental burden of disease assessment

    PubMed Central

    2009-01-01

    Disability Adjusted Life Years (DALYs) combine the number of people affected by disease or mortality in a population and the duration and severity of their condition into one number. The environmental burden of disease is the number of DALYs that can be attributed to environmental factors. Environmental burden of disease estimates enable policy makers to evaluate, compare and prioritize dissimilar environmental health problems or interventions. These estimates often involve various uncertainties and assumptions that are not always made explicit. Besides statistical uncertainty in input data and parameters – which is commonly addressed – a variety of other types of uncertainties may substantially influence the results of the assessment. We have reviewed how different types of uncertainties affect environmental burden of disease assessments, and we give suggestions as to how researchers could address these uncertainties. We propose the use of an uncertainty typology to identify and characterize uncertainties. Finally, we argue that uncertainties need to be identified, assessed, reported and interpreted in order for assessment results to adequately support decision making. PMID:19400963
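
    The basic DALY arithmetic referred to above (years of life lost plus years lived with disability, with an attributable fraction for the environmental share) can be illustrated in a few lines of Python; every figure below is invented for illustration only.

      # Illustrative DALY calculation with made-up numbers (not from the paper).
      deaths = 200               # deaths attributable to the disease
      life_years_lost = 12       # average years of life lost per death
      cases = 5000               # non-fatal cases
      duration = 0.5             # average duration of illness (years)
      disability_weight = 0.2    # severity (0 = full health, 1 = death)

      yll = deaths * life_years_lost            # years of life lost
      yld = cases * duration * disability_weight  # years lived with disability
      dalys = yll + yld

      # Environmental burden: the share attributable to an environmental
      # factor, here via a hypothetical population attributable fraction.
      paf = 0.3
      environmental_burden = paf * dalys
      print(f"DALYs: {dalys:.0f}, of which environmental: {environmental_burden:.0f}")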

  16. Vegetation resurvey is robust to plot location uncertainty

    PubMed Central

    Kopecký, Martin; Macek, Martin

    2017-01-01

    Aim Resurveys of historical vegetation plots are increasingly used for the assessment of decadal changes in plant species diversity and composition. However, historical plots are usually relocated only approximately. This potentially inflates temporal changes and undermines results. Location Temperate deciduous forests in Central Europe. Methods To explore if robust conclusions can be drawn from resurvey studies despite location uncertainty, we compared temporal changes in species richness, frequency, composition and compositional heterogeneity between exactly and approximately relocated plots. We hypothesized that compositional changes should be lower and changes in species richness should be less variable on exactly relocated plots, because pseudo-turnover inflates temporal changes on approximately relocated plots. Results Temporal changes in species richness were not more variable and temporal changes in species composition and compositional heterogeneity were not higher on approximately relocated plots. Moreover, the frequency of individual species changed similarly on both plot types. Main conclusions The resurvey of historical vegetation plots is robust to uncertainty in original plot location and, when done properly, provides reliable evidence of decadal changes in plant communities. This provides important background for other resurvey studies and opens up the possibility for large-scale assessments of plant community change. PMID:28503083

  17. Evaluation of the information content of long-term wastewater characteristics data in relation to activated sludge model parameters.

    PubMed

    Alikhani, Jamal; Takacs, Imre; Al-Omari, Ahmed; Murthy, Sudhir; Massoudieh, Arash

    2017-03-01

    A parameter estimation framework was used to evaluate the ability of observed data from a full-scale nitrification-denitrification bioreactor to reduce the uncertainty associated with the bio-kinetic and stoichiometric parameters of an activated sludge model (ASM). Samples collected over a period of 150 days from the effluent as well as from the reactor tanks were used. A hybrid genetic algorithm and Bayesian inference were used to perform deterministic and probabilistic parameter estimation, respectively. The main goal was to assess the ability of the data to obtain reliable parameter estimates for a modified version of the ASM. The modified ASM model includes methylotrophic processes, which play the main role in methanol-fed denitrification. Sensitivity analysis was also used to explain the ability of the data to provide information about each of the parameters. The results showed that the uncertainty in the estimates of the most sensitive parameters (including growth rate, decay rate, and yield coefficients) decreased with respect to the prior information.
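
    A toy sketch of the Bayesian estimation step, assuming a drastically simplified stand-in for the ASM (first-order substrate removal with a single rate parameter) and a random-walk Metropolis sampler; it is not the authors' framework, and all values are synthetic.

      import numpy as np

      rng = np.random.default_rng(7)

      # Toy stand-in for an ASM output: first-order substrate removal.
      def model(k, t):
          return 100.0 * np.exp(-k * t)

      t_obs = np.linspace(0.5, 10, 20)
      y_obs = model(0.35, t_obs) + rng.normal(0, 2.0, t_obs.size)  # synthetic data

      def log_posterior(k, sigma=2.0):
          if not 0.01 < k < 2.0:          # uniform prior bounds
              return -np.inf
          resid = y_obs - model(k, t_obs)
          return -0.5 * np.sum((resid / sigma) ** 2)

      # Random-walk Metropolis sampler for the rate parameter k.
      samples, k = [], 0.5
      lp = log_posterior(k)
      for _ in range(20000):
          k_new = k + rng.normal(0, 0.02)
          lp_new = log_posterior(k_new)
          if np.log(rng.uniform()) < lp_new - lp:
              k, lp = k_new, lp_new
          samples.append(k)

      post = np.array(samples[5000:])   # discard burn-in
      print(f"k = {post.mean():.3f} +/- {post.std(ddof=1):.3f}")

    The width of the posterior relative to the prior bounds is the kind of quantity the study uses to judge how much the data constrain each parameter.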

  18. Whispering Gallery Mode Thermometry

    PubMed Central

    Corbellini, Simone; Ramella, Chiara; Yu, Lili; Pirola, Marco; Fernicola, Vito

    2016-01-01

    This paper presents a state-of-the-art whispering gallery mode (WGM) thermometer system, which could replace the platinum resistance thermometers currently used in many industrial applications, overcoming some of their well-known limitations and offering the potential for lower measurement uncertainty. The temperature-sensing element is a sapphire-crystal-based whispering gallery mode resonator with the main resonant modes between 10 GHz and 20 GHz. In particular, it was found that the WGM around 13.6 GHz maximizes measurement performance, affording sub-millikelvin resolution and temperature stability of better than 1 mK at 0 °C. The thermometer system was made portable and low-cost by developing an ad hoc interrogation system (hardware and software) able to achieve an accuracy on the order of a few parts in 10^9 in the determination of resonance frequencies. Herein we report the experimental assessment of the measurement stability, repeatability and resolution, and the calibration of the thermometer in the temperature range from −74 °C to 85 °C. The combined standard uncertainty for a single temperature calibration point is found to be within 5 mK (i.e., comparable with the state of the art for industrial thermometry), and is mainly due to the employed calibration setup. The uncertainty contribution of the WGM thermometer alone is within a millikelvin. PMID:27801868

  19. Assessing the environmental impacts of aircraft noise and emissions

    NASA Astrophysics Data System (ADS)

    Mahashabde, Anuja; Wolfe, Philip; Ashok, Akshay; Dorbian, Christopher; He, Qinxian; Fan, Alice; Lukachko, Stephen; Mozdzanowska, Aleksandra; Wollersheim, Christoph; Barrett, Steven R. H.; Locke, Maryalice; Waitz, Ian A.

    2011-01-01

    With the projected growth in demand for commercial aviation, many anticipate increased environmental impacts associated with noise, air quality, and climate change. Therefore, decision-makers and stakeholders are seeking policies, technologies, and operational procedures that balance environmental and economic interests. The main objective of this paper is to address shortcomings in current decision-making practices for aviation environmental policies. We review knowledge of the noise, air quality, and climate impacts of aviation, and demonstrate how including environmental impact assessment and quantifying uncertainties can enable a more comprehensive evaluation of aviation environmental policies. A comparison is presented between the cost-effectiveness analysis currently used for aviation environmental policy decision-making and an illustrative cost-benefit analysis. We focus on assessing a subset of the engine NOx emissions certification stringency options considered at the eighth meeting of the International Civil Aviation Organization’s Committee on Aviation Environmental Protection. The FAA Aviation environmental Portfolio Management Tool (APMT) is employed to conduct the policy assessments. We show that different conclusions may be drawn about the same policy options depending on whether benefits and interdependencies are estimated in terms of health and welfare impacts versus changes in NOx emissions inventories as is the typical practice. We also show that these conclusions are sensitive to a variety of modeling uncertainties. While our more comprehensive analysis makes the best policy option less clear, it represents a more accurate characterization of the scientific and economic uncertainties underlying impacts and the policy choices.

  20. Impact of uncertainty on modeling and testing

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Brown, Kendall K.

    1995-01-01

    A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) will greatly aid decisions concerning hardware performance and future development efforts. This report will describe the determination of the uncertainties in the modeling and testing of the Space Shuttle Main Engine test program at the Technology Test Bed facility at Marshall Space Flight Center. Section 2 will present a summary of the uncertainty analysis methodology used and discuss the specific applications to the TTB SSME test program. Section 3 will discuss the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relative to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentation, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.
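
    A minimal sketch of the separate treatment of bias (systematic) and precision (random) components and their propagation to a derived result through sensitivity coefficients, in the spirit of standard test uncertainty analysis; the measured values, uncertainties and the result function r = x1/x2 are all hypothetical.

      import math

      # Combine bias and precision uncertainty for a measured quantity.
      def total_uncertainty(bias, std_dev, n, t=2.0):
          precision = t * std_dev / math.sqrt(n)   # precision limit of the mean
          return math.sqrt(bias**2 + precision**2)

      # Propagate to a result r = f(x1, x2) via sensitivity coefficients
      # theta_i = dr/dx_i (here for r = x1 / x2).
      x1, x2 = 50.0, 2.0
      u1 = total_uncertainty(bias=0.5, std_dev=0.8, n=30)
      u2 = total_uncertainty(bias=0.02, std_dev=0.05, n=30)
      theta1, theta2 = 1.0 / x2, -x1 / x2**2
      u_r = math.sqrt((theta1 * u1)**2 + (theta2 * u2)**2)
      print(f"r = {x1/x2:.2f} +/- {u_r:.2f}")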

  1. Assessing and reporting uncertainties in dietary exposure analysis - Part II: Application of the uncertainty template to a practical example of exposure assessment.

    PubMed

    Tennant, David; Bánáti, Diána; Kennedy, Marc; König, Jürgen; O'Mahony, Cian; Kettler, Susanne

    2017-11-01

    A previous publication described methods for assessing and reporting uncertainty in dietary exposure assessments. This follow-up publication uses a case study to develop proposals for representing and communicating uncertainty to risk managers. The food ingredient aspartame is used as the case study in a simple deterministic model (the EFSA FAIM template) and with more sophisticated probabilistic exposure assessment software (FACET). Parameter and model uncertainties are identified for each modelling approach and tabulated. The relative importance of each source of uncertainty is then evaluated using a semi-quantitative scale and the results expressed using two different forms of graphical summary. The value of this approach in expressing uncertainties in a manner that is relevant to the exposure assessment and useful to risk managers is then discussed. It was observed that the majority of uncertainties are often associated with data sources rather than the model itself. However, differences in modelling methods can have the greatest impact on uncertainties overall, particularly when the underlying data are the same. It was concluded that improved methods for communicating uncertainties to risk managers are the research area where future effort would best be placed. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. Quantifying uncertainty in health impact assessment: a case-study example on indoor housing ventilation.

    PubMed

    Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M

    2014-01-01

    Quantitative health impact assessment (HIA) is increasingly being used to assess the health impacts attributable to an environmental policy or intervention. As a consequence, there is a need to assess uncertainties in the assessments because of the uncertainty in the HIA models. In this paper, a framework is developed to quantify the uncertainty in the health impacts of environmental interventions and is applied to evaluate the impacts of poor housing ventilation. The paper describes the development of the framework through three steps: (i) selecting the relevant exposure metric and quantifying the evidence of potential health effects of the exposure; (ii) estimating the size of the population affected by the exposure and selecting the associated outcome measure; (iii) quantifying the health impact and its uncertainty. The framework introduces a novel application for the propagation of uncertainty in HIA, based on fuzzy set theory. Fuzzy sets are used to propagate parametric uncertainty in a non-probabilistic space and are applied to calculate the uncertainty in the morbidity burdens associated with three indoor ventilation exposure scenarios: poor, fair and adequate. The case-study example demonstrates how the framework can be used in practice, to quantify the uncertainty in health impact assessment where there is insufficient information to carry out a probabilistic uncertainty analysis. © 2013.
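
    A minimal sketch of fuzzy (alpha-cut) propagation of parametric uncertainty, with hypothetical triangular fuzzy numbers for exposure prevalence and risk; the paper's actual HIA model and numbers are not reproduced here.

      import numpy as np

      # Triangular fuzzy number as (low, mode, high); an alpha-cut returns the
      # interval containing all values with membership >= alpha.
      def alpha_cut(tri, alpha):
          lo, mode, hi = tri
          return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

      population = 1_000_000
      prevalence = (0.10, 0.15, 0.25)   # fuzzy fraction exposed to poor ventilation
      risk = (0.005, 0.010, 0.020)      # fuzzy annual morbidity risk when exposed

      for alpha in (0.0, 0.5, 1.0):
          p_lo, p_hi = alpha_cut(prevalence, alpha)
          r_lo, r_hi = alpha_cut(risk, alpha)
          # The burden is increasing in both inputs, so interval endpoints multiply.
          b_lo = population * p_lo * r_lo
          b_hi = population * p_hi * r_hi
          print(f"alpha={alpha:.1f}: {b_lo:,.0f} to {b_hi:,.0f} cases/year")

    The nested intervals at increasing alpha levels play the role that confidence intervals would in a probabilistic analysis, without requiring full probability distributions for the inputs.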

  3. Socioeconomic Implications of Achieving 2.0 °C and 1.5 °C Climate Targets under Scientific Uncertainties

    NASA Astrophysics Data System (ADS)

    Su, X.; Takahashi, K.; Fujimori, S.; Hasegawa, T.; Tanaka, K.; Shiogama, H.; Emori, S.; LIU, J.; Hanasaki, N.; Hijioka, Y.; Masui, T.

    2017-12-01

    Large uncertainty exists in the temperature projections, including contributions from the carbon cycle, the climate system and aerosols. For integrated assessment models (IAMs) such as DICE, FUND and PAGE, however, the scientific uncertainties are mainly represented by the distribution of (equilibrium) climate sensitivity. This study aims at evaluating the emission pathways that limit temperature increase to below 2.0 °C or 1.5 °C after 2100 while considering scientific uncertainties, and at exploring how socioeconomic indicators are affected by such uncertainties. We use a stochastic version of SCM4OPT, with uncertainty measured by considering alternative ranges of key parameters. Three climate cases, namely i) the base case of SSP2, ii) limiting temperature increase to below 2.0 °C after 2100 and iii) limiting temperature increase to below 1.5 °C after 2100, and three types of probabilities - i) >66% probability or likely, ii) >50% probability or more likely than not and iii) the mean of the probability distribution - are considered in the study. The results show that, i) for the 2.0 °C case, the likely CO2 reduction rate in 2100 ranges from 75.5% to 102.4%, with a mean value of 88.1%, and from 93.0% to 113.1% (mean 102.5%) for the 1.5 °C case; ii) a likely range of forcing is found for the 2.0 °C case (2.7-3.9 W m^-2) due to scientific uncertainty, and 1.9-3.1 W m^-2 for the 1.5 °C case; iii) the carbon prices within the 50% confidence interval may differ by a factor of 3 for both the 2.0 °C and the 1.5 °C cases; iv) the abatement costs within the 50% confidence interval may differ by a factor of 4 for both the 2.0 °C and the 1.5 °C cases. Nine C4MIP carbon cycle models and nineteen CMIP3 AOGCMs are used to account for the scientific uncertainties, following MAGICC 6.0. These uncertainties result in a likely radiative forcing range of 6.1-7.5 W m^-2 and a likely temperature increase of 3.1-4.5 °C in 2100 for the base case of SSP2. If we evaluate the 2.0 °C target by limiting the temperature increase, a likely difference of up to 20.7 GtCO2-eq of greenhouse gases (GHGs) in 2100 will occur in the assessment, or 14.4 GtCO2-eq of GHGs for the 1.5 °C case. The scientific uncertainties have significant impacts on evaluating the costs of climate change, and an appropriate representation of such uncertainties is important in socioeconomic assessment.

  4. Integrated probabilistic risk assessment for nanoparticles: the case of nanosilica in food.

    PubMed

    Jacobs, Rianne; van der Voet, Hilko; Ter Braak, Cajo J F

    Insight into risks of nanotechnology and the use of nanoparticles is an essential condition for the social acceptance and safe use of nanotechnology. One of the problems with which the risk assessment of nanoparticles is faced is the lack of data, resulting in uncertainty in the risk assessment. We attempt to quantify some of this uncertainty by expanding a previous deterministic study on nanosilica (5-200 nm) in food into a fully integrated probabilistic risk assessment. We use the integrated probabilistic risk assessment method, in which statistical distributions and bootstrap methods are used to quantify uncertainty and variability in the risk assessment. Due to the large amount of uncertainty present, this probabilistic method, which separates variability from uncertainty, contributed to a more understandable risk assessment. We found that quantifying the uncertainties did not increase the perceived risk relative to the outcome of the deterministic study. We pinpointed particular aspects of the hazard characterization that contributed most to the total uncertainty in the risk assessment, suggesting that further research would benefit most from obtaining more reliable data on those aspects.
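
    A minimal sketch of the two-dimensional Monte Carlo idea behind such an integrated probabilistic assessment: an outer loop for (epistemic) uncertainty in distribution parameters and an inner loop for (population) variability. The distributions and values below are hypothetical, not the paper's.

      import numpy as np

      rng = np.random.default_rng(3)

      # Outer loop: uncertainty (bootstrap-style draws of distribution parameters).
      # Inner loop: variability (individual intake across the population).
      n_unc, n_var = 200, 5000
      p95_exposure = np.empty(n_unc)

      for i in range(n_unc):
          # Uncertain parameters of the intake distribution (hypothetical).
          mu = rng.normal(0.0, 0.1)       # log-mean of intake (log mg/kg bw/day)
          sigma = rng.uniform(0.4, 0.6)   # log-sd
          intake = rng.lognormal(mu, sigma, n_var)   # variability across persons
          p95_exposure[i] = np.percentile(intake, 95)

      lo, hi = np.percentile(p95_exposure, [2.5, 97.5])
      print(f"P95 intake: median {np.median(p95_exposure):.2f}, "
            f"95% uncertainty interval {lo:.2f}-{hi:.2f} mg/kg bw/day")

    Keeping the two loops separate is what allows a statement like "the 95th-percentile consumer's exposure is known only to within this uncertainty interval", rather than blending variability and uncertainty into one distribution.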

  5. Uncertainties in s-process nucleosynthesis in low mass stars determined from Monte Carlo variations

    NASA Astrophysics Data System (ADS)

    Cescutti, G.; Hirschi, R.; Nishimura, N.; den Hartogh, J. W.; Rauscher, T.; Murphy, A. St J.; Cristallo, S.

    2018-05-01

    The main s-process taking place in low mass stars produces about half of the elements heavier than iron. It is therefore very important to determine the impact of nuclear physics uncertainties on this process. We have performed extensive nuclear reaction network calculations using individual and temperature-dependent uncertainties for reactions involving elements heavier than iron, within a Monte Carlo framework. Using this technique, we determined the uncertainty in the main s-process abundance predictions due to nuclear uncertainties linked to weak interactions and neutron captures on elements heavier than iron. We also identified the key nuclear reactions dominating these uncertainties. We found that β-decay rate uncertainties affect only a few nuclides near s-process branchings, whereas most of the uncertainty in the final abundances is caused by uncertainties in neutron capture rates, either directly producing or destroying the nuclide of interest. Combined total nuclear uncertainties due to reactions on heavy elements are in general small (less than 50%). Three key reactions, nevertheless, stand out because they significantly affect the uncertainties of a large number of nuclides. These are 56Fe(n,γ), 64Ni(n,γ), and 138Ba(n,γ). We discuss the prospect of reducing uncertainties in the key reactions identified in this study with future experiments.
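
    A toy sketch of the Monte Carlo rate-variation idea, reduced to a single branching between neutron capture and β decay; the median rates and uncertainty factors are invented, and a real calculation would propagate the sampled rates through a full reaction network.

      import numpy as np

      rng = np.random.default_rng(11)

      # Toy branching: a nuclide is destroyed by neutron capture (rate ng)
      # or beta decay (rate beta); the branching ratio sets the abundance
      # flowing along each path. Rates carry lognormal uncertainty factors.
      lam_ng, f_ng = 2.0e-9, 1.5       # median rate and uncertainty factor
      lam_beta, f_beta = 1.0e-9, 1.3

      n = 10000
      ng = lam_ng * f_ng ** rng.normal(size=n)
      beta = lam_beta * f_beta ** rng.normal(size=n)
      branching = ng / (ng + beta)      # fraction following the capture path

      med = np.median(branching)
      lo, hi = np.percentile(branching, [16, 84])
      print(f"capture branching: {med:.3f} (+{hi - med:.3f}/-{med - lo:.3f})")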

  6. Assessing Uncertainty in Expert Judgments About Natural Resources

    Treesearch

    David A. Cleaves

    1994-01-01

    Judgments are necessary in natural resources management, but uncertainty about these judgments should be assessed. When all judgments are rejected in the absence of hard data, valuable professional experience and knowledge are not utilized fully. The objective of assessing uncertainty is to get the best representation of knowledge and its bounds. Uncertainty...

  7. The Key Role of Eyewitnesses in Rapid Impact Assessment of Global Earthquake

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.; Etivant, C.; Frobert, L.; Godey, S.

    2014-12-01

    Uncertainties in rapid impact assessments of global earthquakes are intrinsically large because they rely on three main elements (ground motion prediction models, building stock inventory and related vulnerability) whose values and/or spatial variations are poorly constrained. Furthermore, variations of hypocentral location and magnitude within their respective uncertainty domains can lead to significantly different shaking levels for centers of population and change the scope of the disaster. We present the strategy and methods implemented at the Euro-Med Seismological Centre (EMSC) to rapidly collect in-situ observations on earthquake effects from eyewitnesses for reducing uncertainties of rapid earthquake impact assessment. It comprises crowdsourced information (online questionnaires, pictures) as well as information derived from real-time analysis of web traffic (the flashsourcing technique), and more recently the deployment of QCN (Quake Catcher Network) low-cost sensors. We underline the importance of merging the results of different methods to improve the performance and reliability of collected data. We try to better understand and respond to public demands and expectations after earthquakes through improved information services and diversification of information tools (social networks, smartphone apps, browser add-ons…), which, in turn, drive more eyewitnesses to our services and improve data collection. We will notably present our LastQuake Twitter feed (Quakebot) and smartphone applications (iOS and Android), which only report earthquakes that matter to the public and authorities, i.e. felt and damaging earthquakes identified thanks to citizen-generated information.

  8. Spillway sizing of large dams in Austria

    NASA Astrophysics Data System (ADS)

    Reszler, Ch.; Gutknecht, D.; Blöschl, G.

    2003-04-01

    This paper discusses the basic philosophy of defining and calculating design floods for large dams in Austria, both for the construction of new dams and for the re-assessment of the safety of existing dams. Currently the consensus is to choose flood peak values corresponding to a probability of exceedance of 2 × 10^-4 for a given year. A two-step procedure is proposed to estimate the design flood discharges - a rapid assessment and a detailed assessment. In the rapid assessment the design discharge is chosen as a constant multiple of flood values read from a map of regionalised floods. The safety factor or multiplier takes care of the uncertainties of the local estimation and the regionalisation procedure. If the current design level of a spillway exceeds the value so estimated, no further calculations are needed. Otherwise (and for new dams) a detailed assessment is required. The idea of the detailed assessment is to draw upon all existing sources of information to constrain the uncertainties. The three main sources are local flood frequency analysis, where flood data are available; regional flood estimation from hydrologically similar catchments; and rainfall-runoff modelling using design storms as inputs. The three values obtained by these methods are then assessed and weighted in terms of their reliability to facilitate selection of the design flood. The uncertainty assessment of the various methods is based on confidence intervals, estimates of regional heterogeneity, data availability and sensitivity analyses of the rainfall-runoff model. As the definition of the design floods discussed above is based on probability concepts, it is also important to examine the excess risk, i.e. the possibility of the occurrence of a flood exceeding the design levels. The excess risk is evaluated based on a so-called Safety Check Flood (SCF), similar to existing practice in other countries in Europe. The SCF is a vehicle to analyse the damage potential of an event of this magnitude and to provide guidance for protective measures for dealing with very extreme floods. The SCF is used to check the vulnerability of the system with regard to structural stability, morphological effects, etc., and to develop alarm plans and disaster mitigation procedures. The basis for estimating the SCF is the uncertainty assessment of the design flood values estimated by the three methods, including unlikely combinations of the controlling factors and attendant uncertainties. Finally we discuss the impact on the downstream valley of floods exceeding the design values and of smaller floods, and illustrate the basic concepts by examples from the recent flood in August 2002.
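
    A small numerical illustration of the two-step procedure: the target exceedance probability of 2 × 10^-4 corresponds to a 5000-year return period, evaluated here with a hypothetical Gumbel fit, alongside a rapid-assessment estimate using an assumed safety multiplier. All values are invented.

      from scipy import stats

      # The target annual exceedance probability of 2e-4 corresponds to a
      # 5000-year return period.
      p_exc = 2e-4
      T = 1.0 / p_exc   # 5000 years

      # Hypothetical Gumbel fit to local annual flood peaks (m^3/s).
      loc, scale = 300.0, 90.0
      design_peak = stats.gumbel_r.ppf(1.0 - p_exc, loc=loc, scale=scale)
      print(f"T = {T:.0f} a, detailed-assessment design peak ~ {design_peak:.0f} m^3/s")

      # Rapid assessment: apply a safety multiplier to a regionalised flood value.
      regional_flood = 850.0   # from a map of regionalised floods (hypothetical)
      multiplier = 1.3         # covers local estimation/regionalisation uncertainty
      print(f"rapid-assessment design discharge: {multiplier * regional_flood:.0f} m^3/s")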

  9. Joint Peru/United States report on Peru/United States cooperative energy assessment. Volume 1. Executive summary, main report and appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1979-08-01

    In 1978, the US and Peru conducted a comprehensive assessment of Peru's energy resources, needs, and uses and developed several alternative energy strategies that utilize the available resources to meet their energy requirements. This Volume I reports the findings of the assessment and contains the executive summary, the main report, and five appendices of information that support the integrated energy supply and demand analysis. The following chapters are included: The Energy Situation in Peru (economic context and background, energy resources and production, energy consumption patterns); Reference Supply and Demand Projection (approach, procedures, and assumptions; economic projections; energy demand and supply projections; supply/demand integration; uncertainties); and The Development of Strategies and Options (the analysis of options; strategies; increased use of renewables, hydropower, coal; increased energy efficiency; and financial analysis of strategies).

  10. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments in order to correctly understand the strengths and limits of the results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how to best use uncertainty analysis to generate a more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies that there are general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  11. On the Timing of Glacial Terminations in the Equatorial Pacific

    NASA Astrophysics Data System (ADS)

    Khider, D.; Ahn, S.; Lisiecki, L. E.; Lawrence, C.; Kienast, M.

    2015-12-01

    Understanding the mechanisms through which the climate system responds to orbital insolation changes requires establishing the timing of events imprinted on the geological record. In this study, we investigate the relative timing of the glacial terminations across the equatorial Pacific in order to identify a possible mechanism through which the tropics may have influenced a global climate response. The relative termination timing between the eastern and western equatorial Pacific was assessed from 15 published SST records based on Globigerinoides ruber Mg/Ca or alkenone thermometry. The novelty of our study lies in accounting for the various sources of uncertainty inherent to paleoclimate reconstruction and timing analysis. Specifically, we use a Monte-Carlo process allowing sampling of possible realizations of the time series that are functions of the uncertainty of the benthic δ18O alignment to a global benthic curve, of the SST uncertainty, and of the uncertainty in the change point, which we use as a definition for the termination timing. We find that the uncertainty in the relative timing estimates is on the order of several thousand years, and stems from age model uncertainty (60%) and uncertainty in the change point detection (40%). Random sources of uncertainty are the main contributor, and, therefore, averaging over large datasets and/or using higher-resolution records should yield more precise and accurate estimates of the relative lead-lag. However, at this time, the number of records is not sufficient to identify any significant differences in the timing of the last three glacial terminations in SST records from the Eastern and Western Tropical Pacific.

  12. Evaluating uncertainty in environmental life-cycle assessment. A case study comparing two insulation options for a Dutch one-family dwelling.

    PubMed

    Huijbregts, Mark A J; Gilijamse, Wim; Ragas, Ad M J; Reijnders, Lucas

    2003-06-01

    The evaluation of uncertainty is relatively new in environmental life-cycle assessment (LCA). It provides useful information to assess the reliability of LCA-based decisions and to guide future research toward reducing uncertainty. Most uncertainty studies in LCA quantify only one type of uncertainty, i.e., uncertainty due to input data (parameter uncertainty). However, LCA outcomes can also be uncertain due to normative choices (scenario uncertainty) and the mathematical models involved (model uncertainty). The present paper outlines a new methodology that quantifies parameter, scenario, and model uncertainty simultaneously in environmental life-cycle assessment. The procedure is illustrated in a case study that compares two insulation options for a Dutch one-family dwelling. Parameter uncertainty was quantified by means of Monte Carlo simulation. Scenario and model uncertainty were quantified by resampling different decision scenarios and model formulations, respectively. Although scenario and model uncertainty were not quantified comprehensively, the results indicate that both types of uncertainty influence the case study outcomes. This stresses the importance of quantifying parameter, scenario, and model uncertainty simultaneously. The two insulation options studied were found to have significantly different impact scores for global warming, stratospheric ozone depletion, and eutrophication. The thickest insulation option has the lowest impact on global warming and eutrophication, and the highest impact on stratospheric ozone depletion.
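
    A minimal sketch of the parameter-uncertainty (Monte Carlo) part of such an analysis, for a toy impact score built from lognormally distributed emissions and fixed characterisation factors; the numbers are invented, and scenario and model uncertainty would require resampling decision choices and model formulations on top of this.

      import numpy as np

      rng = np.random.default_rng(5)

      # Toy LCA impact score: impact = sum over flows of
      # (emission * characterisation factor), with lognormal uncertainty
      # on the emissions (hypothetical geometric standard deviations).
      emissions_median = np.array([12.0, 3.5, 0.8])   # kg of CO2, CH4, N2O
      gsd = np.array([1.2, 1.5, 2.0])                 # geometric standard deviations
      char_factors = np.array([1.0, 25.0, 298.0])     # GWP-style factors (kg CO2-eq/kg)

      n = 10000
      z = rng.normal(size=(n, 3))
      emissions = emissions_median * gsd ** z
      impact = emissions @ char_factors

      lo, hi = np.percentile(impact, [2.5, 97.5])
      print(f"GWP score: median {np.median(impact):.0f} kg CO2-eq "
            f"(95% interval {lo:.0f}-{hi:.0f})")

    Comparing two options then amounts to checking whether the difference of their sampled impact scores excludes zero, which is how "significantly different impact scores" can be judged.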

  13. Multi-objective calibration and uncertainty analysis of hydrologic models; A comparative study between formal and informal methods

    NASA Astrophysics Data System (ADS)

    Shafii, M.; Tolson, B.; Matott, L. S.

    2012-04-01

    Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in higher levels of complexity being built into hydrologic models, which eventually makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested the implementation of multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on the use of formal likelihood functions based on statistical assumptions, and moreover, the Bayesian inference structured on MCMC samplers requires a considerably large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study aims at exploring a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality, which quantifies the parameter uncertainty using the Pareto solutions, (ii) DDS-AU, which uses the weighted sum of objective functions to derive the prediction limits, and (iii) GLUE, which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden and predictive capacity, which are evaluated based on multiple comparative measures. The measures for comparison are calculated both for calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model, called HYMOD.
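
    A compact sketch of the GLUE variant named above, using a trivially simple stand-in for a rainfall-runoff model: parameters are sampled, scored with an informal likelihood (Nash-Sutcliffe efficiency), thresholded into behavioural sets and used to derive likelihood-weighted prediction limits. Everything here is illustrative, not the study's setup.

      import numpy as np

      rng = np.random.default_rng(9)

      def toy_model(p, forcing):
          # Stand-in for a rainfall-runoff model: runoff = a * rain + b.
          a, b = p
          return a * forcing + b

      rain = rng.gamma(2.0, 5.0, 100)
      q_obs = toy_model((0.6, 1.5), rain) + rng.normal(0, 2.0, 100)

      # GLUE: sample parameters, score, keep behavioural sets.
      n = 20000
      params = np.column_stack([rng.uniform(0, 1, n), rng.uniform(0, 5, n)])
      sims = np.array([toy_model(p, rain) for p in params])
      nse = 1 - np.sum((sims - q_obs) ** 2, axis=1) / np.sum((q_obs - q_obs.mean()) ** 2)

      behavioural = nse > 0.7
      weights = nse[behavioural] / nse[behavioural].sum()
      print(f"{behavioural.sum()} behavioural parameter sets")

      # 90% prediction limits at the first time step, weighted by likelihood.
      order = np.argsort(sims[behavioural, 0])
      cdf = np.cumsum(weights[order])
      lo = sims[behavioural, 0][order][np.searchsorted(cdf, 0.05)]
      hi = sims[behavioural, 0][order][np.searchsorted(cdf, 0.95)]
      print(f"q(t=0): {lo:.1f} to {hi:.1f}")

    The behavioural threshold and the informal likelihood are subjective choices, which is precisely the trade-off against formal Bayesian inference that the study evaluates.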

  14. Cost assessment of natural hazards in Europe - state-of-the-art, knowledge gaps and recommendations

    NASA Astrophysics Data System (ADS)

    Meyer, V.; Becker, N.; Markantonis, V.; Schwarze, R.; van den Bergh, J. C. J. M.; Bouwer, L. M.; Bubeck, P.; Ciavola, P.; Thieken, A. H.; Genovese, E.; Green, C.; Hallegatte, S.; Kreibich, H.; Lequeux, Q.; Viavattenne, C.; Logar, I.; Papyrakis, E.; Pfurtscheller, C.; Poussin, J.; Przyluski, V.

    2012-04-01

    Effective and efficient reduction of natural hazard risks requires a thorough understanding of the costs of natural hazards in order to develop sustainable risk management strategies. The current methods that assess the costs of different natural hazards employ a diversity of terminologies and approaches for different hazards and impacted sectors. This makes it difficult to arrive at robust, comprehensive and comparable cost figures. The CONHAZ (Costs of Natural Hazards) project aimed to compile and synthesise current knowledge on cost assessment methods in order to strengthen the role of cost assessments in the development of integrated natural hazard management and adaptation planning. In order to achieve this, CONHAZ has adopted a comprehensive approach, considering natural hazards ranging from droughts, floods and coastal hazards to Alpine hazards, as well as different impacted sectors and cost types. Its specific objectives have been 1) to compile the state-of-the-art methods for cost assessment; 2) to analyse and assess these methods in terms of technical aspects, as well as terminology, data quality and availability, and research gaps; and 3) to synthesise the resulting knowledge into recommendations and to identify further research needs. This presentation summarises the main results of CONHAZ. CONHAZ differentiates between direct tangible damages, losses due to business interruption, indirect damages, intangible effects, and costs of risk mitigation. It is shown that the main focus of cost assessment methods and their application in practice is on direct costs, while existing methods for assessing intangible and indirect effects are rarely applied, and methods for assessing indirect effects often cannot be used at the scale of interest (e.g. the regional scale). Furthermore, methods often focus on single sectors and/or hazards, and only very few are able to reflect several sectors or multiple hazards. Process understanding and its use in cost assessment is poor, leading to highly uncertain results. However, sensitivity and uncertainty analyses as well as validations are rarely undertaken. One important recommendation is that cost assessment can be made more comprehensive by including indirect and intangible effects. Furthermore, the importance of identifying sources of uncertainty, of reducing them effectively and of documenting those that remain is highlighted. One source of uncertainty concerns data sources. A framework for supporting data collection at the European level, ensuring minimum data quality standards, would facilitate the development and consistency of European and national databases. Furthermore, an improvement of methods is needed with regard to a better understanding and modelling of the damaging processes. In particular, there is a need for a better understanding of the economic response to external shocks and to improve models for indirect cost assessment based on this. Models to estimate direct economic damage also need to be based on more knowledge about the complex processes leading to damages. Moreover, the dynamics of risk due to climate and socio-economic change have to be better considered in the models in order to reveal uncertainties about future developments in the costs of natural hazards. Finally, there is a need for appropriate and transparent tools and guidance to support decision makers in the integration of uncertain cost assessment figures into decision making.

  15. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    NASA Astrophysics Data System (ADS)

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D. A.; Brogaard, Sara; van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-11-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS (Lund-Potsdam-Jena General Ecosystem Simulator) using socio-economic data from the SSPs and climate data from the RCPs (representative concentration pathways). The simulated range of global cropland is 893-2380 Mha in 2100 (± 1 standard deviation), with the main uncertainties arising from differences in the socio-economic conditions prescribed by the SSP scenarios and the assumptions that underpin the translation of qualitative SSP storylines into quantitative model input parameters. Uncertainties in the assumptions for population growth, technological change and cropland degradation were found to be the most important for global cropland, while uncertainty in food consumption had less influence on the results. The uncertainties arising from climate variability and the differences between climate change scenarios do not strongly affect the range of global cropland futures. Some overlap occurred across all of the conditional probabilistic futures, except for those based on SSP3. We conclude that completely different socio-economic and climate change futures, although sharing low to medium population development, can result in very similar cropland areas on the aggregated global scale.

  16. SeaWiFS technical report series. Volume 16: The second SeaWiFS Intercalibration Round-Robin Experiment, SIRREX-2, June 1993

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Mueller, James L.; Mclean, James T.; Johnson, B. Carol; Westphal, Todd L.; Cooper, John W.

    1994-01-01

    The results of the second Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Intercalibration Round-Robin Experiment (SIRREX-2), which was held at the Center for Hydro-Optics and Remote Sensing (CHORS) at San Diego State University on 14-25 June 1993, are presented. SeaWiFS is an ocean color radiometer that is scheduled for launch in 1994. The SIRREXs are part of the SeaWiFS Calibration and Validation Program that includes the GSFC, CHORS, NIST, and several other laboratories. GSFC maintains the radiometric scales (spectral radiance and irradiance) for the SeaWiFS program using spectral irradiance standard lamps, which are calibrated by NIST. The purpose of each SIRREX is to assure that the radiometric scales realized by the laboratories that participate in the SeaWiFS Calibration and Validation Program are correct; that is, the uncertainties of the radiometric scales are such that measurements of normalized water-leaving radiance using oceanographic radiometers have uncertainties of 5%. SIRREX-1 demonstrated, from the internal consistency of the results, that the program goals would not be met without improvements to the instrumentation. The results of SIRREX-2 demonstrate that spectral irradiance scales realized using the GSFC standard irradiance lamp (F269) are consistent with the program goals, as the uncertainty of these measurements is assessed to be about 1%. However, this is not true for the spectral radiance scales, where again the internal consistency of the results is used to assess the uncertainty. This is attributed to inadequate performance and characterization of the instrumentation. For example, spatial nonuniformities, spectral features, and sensitivity to illumination configuration were observed in some of the integrating sphere sources. The results of SIRREX-2 clearly indicate the direction for future work, with the main emphasis on instrument characterization and the assessment of measurement uncertainties so that the results may be stated in a more definitive manner.

  17. Stationarity is undead: Uncertainty dominates the distribution of extremes

    NASA Astrophysics Data System (ADS)

    Serinaldi, Francesco; Kilsby, Chris G.

    2015-03-01

    The increasing effort to develop and apply nonstationary models in hydrologic frequency analyses under changing environmental conditions can be frustrated when the additional uncertainty related to the model complexity is accounted for along with the sampling uncertainty. In order to show the practical implications and possible problems of using nonstationary models and provide critical guidelines, in this study we review the main tools developed in this field (such as nonstationary distribution functions, return periods, and risk of failure) highlighting advantages and disadvantages. The discussion is supported by three case studies that revise three illustrative examples reported in the scientific and technical literature referring to the Little Sugar Creek (at Charlotte, North Carolina), Red River of the North (North Dakota/Minnesota), and the Assunpink Creek (at Trenton, New Jersey). The uncertainty of the results is assessed by complementing point estimates with confidence intervals (CIs) and emphasizing critical aspects such as the subjectivity affecting the choice of the models' structure. Our results show that (1) nonstationary frequency analyses should not only be based on at-site time series but require additional information and detailed exploratory data analyses (EDA); (2) as nonstationary models imply that the time-varying model structure holds true for the entire future design life period, an appropriate modeling strategy requires that EDA identifies a well-defined deterministic mechanism leading the examined process; (3) when the model structure cannot be inferred in a deductive manner and nonstationary models are fitted by inductive inference, model structure introduces an additional source of uncertainty so that the resulting nonstationary models can provide no practical enhancement of the credibility and accuracy of the predicted extreme quantiles, whereas possible model misspecification can easily lead to physically inconsistent results; (4) when the model structure is uncertain, stationary models and a suitable assessment of the uncertainty accounting for possible temporal persistence should be retained as more theoretically coherent and reliable options for practical applications in real-world design and management problems; (5) a clear understanding of the actual probabilistic meaning of stationary and nonstationary return periods and risk of failure is required for a correct risk assessment and communication.
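
    To make the stationary-versus-nonstationary distinction concrete, here is a small sketch fitting a Gumbel distribution with a time-varying location parameter by maximum likelihood; with μ(t) = μ0 + μ1·t, the "100-year return level" is no longer a single number but depends on the year for which it is evaluated. The data and values are synthetic, not from the case studies.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(13)

      # Synthetic annual maxima with a weak linear trend in the location.
      t = np.arange(60)
      x = rng.gumbel(loc=100 + 0.3 * t, scale=20)

      def neg_loglik(theta):
          mu0, mu1, log_scale = theta
          scale = np.exp(log_scale)
          z = (x - (mu0 + mu1 * t)) / scale
          return np.sum(np.log(scale) + z + np.exp(-z))

      res = minimize(neg_loglik, x0=[x.mean(), 0.0, np.log(x.std())],
                     method="Nelder-Mead")
      mu0, mu1, log_scale = res.x
      scale = np.exp(log_scale)

      # The 100-year "return level" now depends on the year it is evaluated for.
      p = 1 - 1 / 100
      for year in (0, 59, 100):
          level = mu0 + mu1 * year - scale * np.log(-np.log(p))
          print(f"year {year}: 100-yr level {level:.0f}")

    The extra parameter mu1 must itself be estimated from the data, which is exactly the additional source of uncertainty the paper warns can swamp any gain from the more flexible model.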

  18. A measurement-driven approach to assess power line telecommunication (PLT) network quality of service (QoS) performance parameters

    NASA Astrophysics Data System (ADS)

    Betta, G.; Capriglione, D.; Ferrigno, L.; Laracca, M.

    2009-10-01

    Power line telecommunication (PLT) technology offers cheap and fast ways of providing in-home broadband services and local area networking. Its main advantage is the possibility of using the pre-existing electrical grid as a communication channel. Nevertheless, technical challenges arise from the difficulty of operating on a hostile medium, not designed for communication purposes, characterized by complex channel modeling and a time-varying response. These aspects pose practical problems for designers and testers in the assessment of network quality of service performance parameters such as throughput, latency, jitter, and reliability. The measurement of these parameters has not yet been standardized, so no reference test set-ups and measurement methodologies exist (i.e. the type of isolation from the ac mains, the observation time and the number of experiments, the measurement uncertainty and so on). Consequently, experiments executed by adopting different methods may lead to incompatible measurement results, also making reliable comparisons of different PLT modems impossible. In practice, the development of standard procedures is a very difficult task because the scenarios in which PLT modems can work are very wide, and an exhaustive approach (in which all the parameters influencing PLT performance are considered) would be very complex and time consuming, making modem characterization very expensive. In this paper, the authors propose a methodological approach to develop an efficient measurement procedure able to reliably assess the performance of PLT modems (in terms of network quality of service parameters) with a minimum number of experiments. It is based on both creating a reconfigurable grid to which real disturbing loads are connected and implementing an original design-of-experiments technique based on the effects of the uncertainty of the measurement results. Methods are also provided to analyze the measurement results and to estimate the measurement uncertainty.

  19. Estimating the uncertainty in thermochemical calculations for oxygen-hydrogen combustors

    NASA Astrophysics Data System (ADS)

    Sims, Joseph David

    The thermochemistry program CEA2 was combined with the statistical thermodynamics program PAC99 in a Monte Carlo simulation to determine the uncertainty in several CEA2 output variables due to uncertainty in thermodynamic reference values for the reactant and combustion species. In all, six typical performance parameters were examined, along with the required intermediate calculations (five gas properties and eight stoichiometric coefficients), for three hydrogen-oxygen combustors: a main combustor, an oxidizer preburner and a fuel preburner. The three combustors were analyzed in two different modes: design mode, where, for the first time, the uncertainty in thermodynamic reference values (taken from the literature) was considered (inputs to CEA2 were specified and so had no uncertainty); and data reduction mode, where inputs to CEA2 did have uncertainty. The inputs to CEA2 were contrived experimental measurements that were intended to represent the typical combustor testing facility. In design mode, uncertainties in the performance parameters were on the order of 0.1% for the main combustor, on the order of 0.05% for the oxidizer preburner and on the order of 0.01% for the fuel preburner. Thermodynamic reference values for H2O were the dominant sources of uncertainty, as was the assigned enthalpy for liquid oxygen. In data reduction mode, uncertainties in performance parameters increased significantly as a result of the uncertainties in experimental measurements compared to uncertainties in thermodynamic reference values. Main combustor and fuel preburner theoretical performance values had uncertainties of about 0.5%, while the oxidizer preburner had nearly 2%. Associated experimentally-determined performance values for all three combustors were 3% to 4%. The dominant sources of uncertainty in this mode were the propellant flowrates. These results only apply to hydrogen-oxygen combustors and should not be generalized to every propellant combination. Species for a hydrogen-oxygen system are relatively simple, thereby resulting in low thermodynamic reference value uncertainties. Hydrocarbon combustors, solid rocket motors and hybrid rocket motors have combustion gases containing complex molecules that will likely have thermodynamic reference values with large uncertainties. Thus, every chemical system should be analyzed in a similar manner as that shown in this work.
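
    A toy Monte Carlo propagation in the same spirit, reducing the thermochemistry to an ideal exhaust velocity v_e = sqrt(2·Δh) so that a single uncertain reference value (the specific enthalpy release) maps to an output uncertainty; the numbers are illustrative, not CEA2 results.

      import numpy as np

      rng = np.random.default_rng(17)

      # Perturb a thermodynamic reference value and observe the spread in a
      # derived performance parameter (hypothetical H2/O2-like magnitudes).
      n = 50000
      dh_mean = 13.4e6          # J/kg, assumed specific enthalpy release
      dh_sigma = 0.02e6         # J/kg, assumed reference-value uncertainty

      dh = rng.normal(dh_mean, dh_sigma, n)
      v_e = np.sqrt(2.0 * dh)   # ideal exhaust velocity

      rel_unc = v_e.std(ddof=1) / v_e.mean()
      print(f"v_e = {v_e.mean():.0f} m/s, relative uncertainty {100 * rel_unc:.2f}%")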

  20. Future Climate Change in the Baltic Sea Area

    NASA Astrophysics Data System (ADS)

    Bøssing Christensen, Ole; Kjellström, Erik; Zorita, Eduardo; Sonnenborg, Torben; Meier, Markus; Grinsted, Aslak

    2015-04-01

    Regional climate models have been used extensively since the first assessment of climate change in the Baltic Sea region published in 2008, not least for studies of Europe (including the Baltic Sea catchment area). Therefore, conclusions regarding climate model results have a better foundation than was the case for the first BACC report of 2008. This presentation will report model results regarding future climate. What is the state of understanding about future human-driven climate change? We will cover regional models, statistical downscaling, hydrological modelling, ocean modelling and sea-level change as projected for the Baltic Sea region. Collections of regional model simulations, for example from the ENSEMBLES project, financed through the European 6th Framework Programme, and from the World Climate Research Programme Coordinated Regional Climate Downscaling Experiment, have made it possible to obtain an increasingly robust estimation of model uncertainty. While the first Baltic Sea assessment mainly used four simulations from the European 5th Framework Programme PRUDENCE project, an ensemble of 13 transient regional simulations with twice the horizontal resolution reaching the end of the 21st century has been available from the ENSEMBLES project; therefore it has been possible to obtain more quantitative assessments of model uncertainty. The literature about future climate change in the Baltic Sea region is largely built upon the ENSEMBLES project. Within statistical downscaling, too, a considerable number of papers have been published, now encompassing the application of non-linear statistical models, projected changes in extremes and the correction of climate model biases. The uncertainty of hydrological change has received increasing attention since the previous Baltic Sea assessment. Several studies on the propagation of uncertainties originating in GCMs, RCMs, and emission scenarios are presented. The number of studies on uncertainties related to downscaling and impact models is relatively small, but more are emerging. A large number of coupled climate-environmental scenario simulations for the Baltic Sea have been performed within the BONUS+ projects (ECOSUPPORT, INFLOW, AMBER and Baltic-C (2009-2011)), using various combinations of output from GCMs, RCMs, hydrological models and scenarios for load and emission of nutrients as forcing for Baltic Sea models. Such a large ensemble of scenario simulations for the Baltic Sea has never before been produced and enables for the first time an estimation of uncertainties.

  1. Operationalising uncertainty in data and models for integrated water resources management.

    PubMed

    Blind, M W; Refsgaard, J C

    2007-01-01

    Key sources of uncertainty of importance for water resources management are (1) uncertainty in data; (2) uncertainty related to hydrological models (parameter values, model technique, model structure); and (3) uncertainty related to the context and the framing of the decision-making process. The European funded project 'Harmonised techniques and representative river basin data for assessment and use of uncertainty information in integrated water management (HarmoniRiB)' has resulted in a range of tools and methods to assess such uncertainties, focusing on items (1) and (2). The project also engaged in a number of discussions surrounding uncertainty and risk assessment in support of decision-making in water management. Based on the project's results and experiences, and on the subsequent discussions, a number of conclusions can be drawn on the future needs for successful adoption of uncertainty analysis in decision support. These conclusions range from additional scientific research on specific uncertainties and dedicated guidelines for operational use to capacity building at all levels. The purpose of this paper is to elaborate on these conclusions and to anchor them in the broad objective of making uncertainty and risk assessment an essential and natural part of future decision-making processes.

  2. Uncertainty Quantification in Geomagnetic Field Modeling

    NASA Astrophysics Data System (ADS)

    Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.

    2017-12-01

    Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high resolution crustal field model and a time-varying, real-time external field model, as in NOAA's High Definition Geomagnetic Model (HDGM). The obtained uncertainties are used by the directional drilling industry to mitigate health, safety and environment risks.

  3. Uncertainties Associated with Quantifying Climate Change Impacts on Human Health: A Case Study for Diarrhea

    PubMed Central

    Kolstad, Erik W.; Johansson, Kjell Arne

    2011-01-01

    Background: Climate change is expected to have large impacts on health at low latitudes, where droughts and malnutrition, diarrhea, and malaria are projected to increase. Objectives: The main objective of this study was to indicate a method to assess a range of plausible health impacts of climate change while handling uncertainties in an unambiguous manner. We illustrate this method by quantifying the impacts of projected regional warming on diarrhea in this century. Methods: We combined a range of linear regression coefficients to compute projections of future climate change-induced increases in diarrhea, using the results from five empirical studies and a 19-member climate model ensemble for which future greenhouse gas emissions were prescribed. Six geographical regions were analyzed. Results: The model ensemble projected temperature increases of up to 4°C over land in the tropics and subtropics by the end of this century. The associated mean projected increases of relative risk of diarrhea in the six study regions were 8–11% (with SDs of 3–5%) by 2010–2039 and 22–29% (SDs of 9–12%) by 2070–2099. Conclusions: Even our most conservative estimates indicate substantial impacts from climate change on the incidence of diarrhea. Nevertheless, our main conclusion is that large uncertainties are associated with future projections of diarrhea and climate change. We believe that these uncertainties can be attributed primarily to the sparsity of empirical climate–health data. Our results therefore highlight the need for empirical data in the cross-section between climate and human health. PMID:20929684

  4. Addressing uncertainty in vulnerability assessments [Chapter 5]

    Treesearch

    Linda Joyce; Molly Cross; Evan Girvatz

    2011-01-01

    This chapter addresses issues and approaches for dealing with uncertainty specifically within the context of conducting climate change vulnerability assessments (i.e., uncertainties related to identifying and modeling the sensitivities, levels of exposure, and adaptive capacity of the assessment targets).

  5. Approximating uncertainty of annual runoff and reservoir yield using stochastic replicates of global climate model data

    NASA Astrophysics Data System (ADS)

    Peel, M. C.; Srikanthan, R.; McMahon, T. A.; Karoly, D. J.

    2015-04-01

    Two key sources of uncertainty in projections of future runoff for climate change impact assessments are uncertainty between global climate models (GCMs) and uncertainty within a GCM. Within-GCM uncertainty is the variability in GCM output that occurs when running a scenario multiple times, with each run having slightly different, but equally plausible, initial conditions. The small number of runs available for each GCM and scenario combination within the Coupled Model Intercomparison Project phase 3 (CMIP3) and phase 5 (CMIP5) data sets limits the assessment of within-GCM uncertainty. In this second of two companion papers, the primary aim is to present a proof-of-concept approximation of within-GCM uncertainty for monthly precipitation and temperature projections and to assess the impact of within-GCM uncertainty on modelled runoff for climate change impact assessments. A secondary aim is to assess the impact of between-GCM uncertainty on modelled runoff. Here we approximate within-GCM uncertainty by developing non-stationary stochastic replicates of GCM monthly precipitation and temperature data. These replicates are input to an off-line hydrologic model to assess the impact of within-GCM uncertainty on projected annual runoff and reservoir yield. We adopt stochastic replicates of available GCM runs to approximate within-GCM uncertainty because large ensembles, of hundreds of runs, for a given GCM and scenario are unavailable, other than the Climateprediction.net data set for the Hadley Centre GCM. To date, within-GCM uncertainty has received little attention in the hydrologic climate change impact literature, and this analysis provides an approximation of the uncertainty in projected runoff and reservoir yield due to within- and between-GCM uncertainty of precipitation and temperature projections. In the companion paper, McMahon et al. (2015) sought to reduce between-GCM uncertainty by removing poorly performing GCMs, resulting in a selection of five better-performing GCMs from CMIP3 for use in this paper. Here we present within- and between-GCM uncertainty results for mean annual precipitation (MAP), mean annual temperature (MAT), mean annual runoff (MAR), the standard deviation of annual precipitation (SDP), the standard deviation of runoff (SDR) and reservoir yield for five CMIP3 GCMs at 17 worldwide catchments. Based on 100 stochastic replicates of each GCM run at each catchment, within-GCM uncertainty was assessed in relative form as the standard deviation expressed as a percentage of the mean of the 100 replicate values of each variable. The average relative within-GCM uncertainties from the 17 catchments and 5 GCMs for 2015-2044 (A1B) were MAP 4.2%, SDP 14.2%, MAT 0.7%, MAR 10.1% and SDR 17.6%. The Gould-Dincer Gamma (G-DG) procedure was applied to each annual runoff time series for hypothetical reservoir capacities of 1 × MAR and 3 × MAR, and the average uncertainties in reservoir yield due to within-GCM uncertainty from the 17 catchments and 5 GCMs were 25.1% (1 × MAR) and 11.9% (3 × MAR). Our approximation of within-GCM uncertainty is expected to be an underestimate because the GCM trend is not replicated. However, our results indicate that within-GCM uncertainty is important when interpreting climate change impact assessments. Approximately 95% of values of MAP, SDP, MAT, MAR, SDR and reservoir yield from 1 × MAR or 3 × MAR capacity reservoirs are expected to fall within twice their respective relative uncertainty (standard deviation/mean). Within-GCM uncertainty has significant implications for interpreting climate change impact assessments that report future changes within our range of uncertainty for a given variable: these projected changes may be due solely to within-GCM uncertainty. Since within-GCM variability is amplified from precipitation to runoff and then to reservoir yield, climate change impact assessments that do not take within-GCM uncertainty into account risk providing water resources management decision makers with an unjustified sense of certainty.
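
    A minimal sketch of the relative within-GCM uncertainty measure described above, assuming a set of replicate values of one variable at one catchment; the replicate values here are synthetic stand-ins, not CMIP3 output fed through a hydrologic model.

      import numpy as np

      rng = np.random.default_rng(42)

      # Stand-in for 100 stochastic replicates of one GCM run at one catchment:
      # each replicate yields one value of, e.g., mean annual runoff (MAR, mm).
      replicates = rng.normal(loc=300.0, scale=30.0, size=100)  # hypothetical

      # Within-GCM uncertainty in relative form, as in the paper: the sample
      # standard deviation expressed as a percentage of the replicate mean.
      rel_uncertainty = 100.0 * replicates.std(ddof=1) / replicates.mean()
      print(f"relative within-GCM uncertainty: {rel_uncertainty:.1f}%")

      # About 95% of replicate values fall within twice this relative
      # uncertainty of the mean, matching the interpretation in the abstract.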

  6. Objective assessment of the contribution of the RECOPESCA network to the monitoring of 3D coastal ocean variables in the Bay of Biscay and the English Channel

    NASA Astrophysics Data System (ADS)

    Lamouroux, Julien; Charria, Guillaume; De Mey, Pierre; Raynaud, Stéphane; Heyraud, Catherine; Craneguy, Philippe; Dumas, Franck; Le Hénaff, Matthieu

    2016-04-01

    In the Bay of Biscay and the English Channel, in situ observations represent a key element to monitor and understand the wide range of processes in the coastal ocean and their direct impacts on human activities. An efficient way to measure the hydrological content of the water column over the main part of the continental shelf is to use ships of opportunity, as the surface to cover is wide and can be far from the coast. In the French observation strategy, the RECOPESCA programme, as a component of the High frequency Observation network for the environment in coastal SEAs (HOSEA), aims to collect environmental observations from sensors attached to fishing nets. In the present study, we assess this network using the Array Modes (ArM) method (a stochastic implementation of Le Hénaff et al., Ocean Dyn 59: 3-20, doi:10.1007/s10236-008-0144-7, 2009). That model ensemble-based method is used here to compare model and observation errors and to quantitatively evaluate the performance of the observation network at detecting prior (model) uncertainties, based on hypotheses about error sources. A reference network, based on fishing vessel observations in 2008, is assessed using this method. Considering the various seasons, we show the efficiency of the network at detecting the main model uncertainties. Moreover, three scenarios, based on the reference network, a denser network in 2010 and a fictitious network aggregated from a multi-year collection of profiles, are also analysed. Our sensitivity study shows the importance of profile positions, as opposed to the sheer number of profiles, in ensuring the ability of the network to describe the main error modes. More generally, we demonstrate the capacity of this method, at low computational cost, to assess and design new in situ observation networks.

  7. How Certain are We of the Uncertainties in Recent Ozone Profile Trend Assessments of Merged Limb/Occultation Records? Challenges and Possible Ways Forward

    NASA Technical Reports Server (NTRS)

    Hubert, Daan; Lambert, Jean-Christopher; Verhoelst, Tijl; Granville, Jose; Keppens, Arno; Baray, Jean-Luc; Cortesi, Ugo; Degenstein, D. A.; Froidevaux, Lucien; Godin-Beekmann, Sophie; et al.

    2015-01-01

    Most recent assessments of long-term changes in the vertical distribution of ozone (by e.g. WMO and SI2N) rely on data sets that integrate observations by multiple instruments. Several merged satellite ozone profile records have been developed over the past few years; each considers a particular set of instruments and adopts a particular merging strategy. Their intercomparison by Tummon et al. revealed that the current merging schemes are not sufficiently refined to correct for all major differences between the limb/occultation records. This shortcoming introduces uncertainties that need to be known to obtain a sound interpretation of the different satellite-based trend studies. In practice however, producing realistic uncertainty estimates is an intricate task which depends on a sufficiently detailed understanding of the characteristics of each contributing data record and on the subsequent interplay and propagation of these through the merging scheme. Our presentation discusses these challenges in the context of limb/occultation ozone profile records, but they are equally relevant for other instruments and atmospheric measurements. We start by showing how the NDACC and GAW-affiliated ground-based networks of ozonesonde and lidar instruments allowed us to characterize fourteen limb/occultation ozone profile records, together providing a global view over the last three decades. Our prime focus will be on techniques to estimate long-term drift since our results suggest this is the main driver of the major trend differences between the merged data sets. The single-instrument drift estimates are then used for a tentative estimate of the systematic uncertainty in the profile trends from merged data records. We conclude by reflecting on possible further steps needed to improve the merging algorithms and to obtain a better characterization of the uncertainties involved.

  8. Impacts of uncertainties in European gridded precipitation observations on regional climate analysis

    PubMed Central

    Gobiet, Andreas

    2016-01-01

    Gridded precipitation data sets are frequently used to evaluate climate models or to remove model output biases. Although precipitation data are error prone due to the high spatio-temporal variability of precipitation and due to considerable measurement errors, relatively few attempts have been made to account for observational uncertainty in model evaluation or in bias correction studies. In this study, we compare three types of European daily data sets featuring two Pan-European data sets and a set that combines eight very high-resolution station-based regional data sets. Furthermore, we investigate seven widely used, larger scale global data sets. Our results demonstrate that the differences between these data sets have the same magnitude as precipitation errors found in regional climate models. Therefore, including observational uncertainties is essential for climate studies, climate model evaluation, and statistical post-processing. Following our results, we suggest the following guidelines for regional precipitation assessments. (1) Include multiple observational data sets from different sources (e.g. station, satellite, reanalysis based) to estimate observational uncertainties. (2) Use data sets with high station densities to minimize the effect of precipitation undersampling (may induce about 60% error in data sparse regions). The information content of a gridded data set is mainly related to its underlying station density and not to its grid spacing. (3) Consider undercatch errors of up to 80% in high latitudes and mountainous regions. (4) Analyses of small-scale features and extremes are especially uncertain in gridded data sets. For higher confidence, use climate-mean and larger scale statistics. In conclusion, neglecting observational uncertainties potentially misguides climate model development and can severely affect the results of climate change impact assessments. PMID:28111497

  9. Impacts of uncertainties in European gridded precipitation observations on regional climate analysis.

    PubMed

    Prein, Andreas F; Gobiet, Andreas

    2017-01-01

    Gridded precipitation data sets are frequently used to evaluate climate models or to remove model output biases. Although precipitation data are error prone due to the high spatio-temporal variability of precipitation and due to considerable measurement errors, relatively few attempts have been made to account for observational uncertainty in model evaluation or in bias correction studies. In this study, we compare three types of European daily data sets featuring two Pan-European data sets and a set that combines eight very high-resolution station-based regional data sets. Furthermore, we investigate seven widely used, larger scale global data sets. Our results demonstrate that the differences between these data sets have the same magnitude as precipitation errors found in regional climate models. Therefore, including observational uncertainties is essential for climate studies, climate model evaluation, and statistical post-processing. Following our results, we suggest the following guidelines for regional precipitation assessments. (1) Include multiple observational data sets from different sources (e.g. station, satellite, reanalysis based) to estimate observational uncertainties. (2) Use data sets with high station densities to minimize the effect of precipitation undersampling (may induce about 60% error in data sparse regions). The information content of a gridded data set is mainly related to its underlying station density and not to its grid spacing. (3) Consider undercatch errors of up to 80% in high latitudes and mountainous regions. (4) Analyses of small-scale features and extremes are especially uncertain in gridded data sets. For higher confidence, use climate-mean and larger scale statistics. In conclusion, neglecting observational uncertainties potentially misguides climate model development and can severely affect the results of climate change impact assessments.

  10. Global Sampling for Integrating Physics-Specific Subsystems and Quantifying Uncertainties of CO2 Geological Sequestration

    DOE PAGES

    Sun, Y.; Tong, C.; Trainor-Guitten, W. J.; ...

    2012-12-20

    The risk of CO2 leakage from a deep storage reservoir into a shallow aquifer through a fault is assessed and studied using physics-specific computer models. The hypothetical CO2 geological sequestration system is composed of three subsystems: a deep storage reservoir, a fault in caprock, and a shallow aquifer, which are modeled respectively by considering sub-domain-specific physics. Supercritical CO2 is injected into the reservoir subsystem with uncertain permeabilities of reservoir, caprock, and aquifer, uncertain fault location, and injection rate (as a decision variable). The simulated pressure and CO2/brine saturation are connected to the fault-leakage model as a boundary condition. CO2 and brine fluxes from the fault-leakage model at the fault outlet are then imposed in the aquifer model as a source term. Moreover, uncertainties are propagated from the deep reservoir model, to the fault-leakage model, and eventually to the geochemical model in the shallow aquifer, thus contributing to risk profiles. To quantify the uncertainties and assess leakage-relevant risk, we propose a global sampling-based method to allocate sub-dimensions of uncertain parameters to sub-models. The risk profiles are defined and related to CO2 plume development for pH value and total dissolved solids (TDS) below the EPA's Maximum Contaminant Levels (MCL) for drinking water quality. A global sensitivity analysis is conducted to select the most sensitive parameters to the risk profiles. The resulting uncertainty of pH- and TDS-defined aquifer volume, which is impacted by CO2 and brine leakage, mainly results from the uncertainty of fault permeability. Subsequently, high-resolution, reduced-order models of risk profiles are developed as functions of all the decision variables and uncertain parameters in all three subsystems.

  11. Modelling ecological and human exposure to POPs in Venice lagoon - Part II: Quantitative uncertainty and sensitivity analysis in coupled exposure models.

    PubMed

    Radomyski, Artur; Giubilato, Elisa; Ciffroy, Philippe; Critto, Andrea; Brochot, Céline; Marcomini, Antonio

    2016-11-01

    The study is focused on applying uncertainty and sensitivity analysis to support the application and evaluation of large exposure models where a significant number of parameters and complex exposure scenarios might be involved. The recently developed MERLIN-Expo exposure modelling tool was applied to probabilistically assess the ecological and human exposure to PCB 126 and 2,3,7,8-TCDD in the Venice lagoon (Italy). The 'Phytoplankton', 'Aquatic Invertebrate', 'Fish', 'Human intake' and PBPK models available in the MERLIN-Expo library were integrated to create a specific food web to dynamically simulate bioaccumulation in various aquatic species and in the human body over individual lifetimes from 1932 until 1998. MERLIN-Expo is a high tier exposure modelling tool allowing propagation of uncertainty onto the model predictions through Monte Carlo simulation. Uncertainty in model output can be further apportioned between parameters by applying built-in sensitivity analysis tools. In this study, uncertainty has been extensively addressed in the distribution functions describing the data input, and the effect on model results was examined by applying sensitivity analysis techniques (the screening Morris method, regression analysis, and the variance-based method EFAST). In the exposure scenario developed for the Lagoon of Venice, the concentrations of 2,3,7,8-TCDD and PCB 126 in human blood turned out to be mainly influenced by a combination of parameters (half-lives of the chemicals, body weight variability, lipid fraction, food assimilation efficiency), physiological processes (uptake/elimination rates), environmental exposure concentrations (sediment, water, food) and eating behaviours (amount of food eaten). In conclusion, this case study demonstrated the feasibility of successfully employing MERLIN-Expo in integrated, high tier exposure assessment.
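
    The sketch below illustrates one of the named techniques, regression-based global sensitivity analysis, on a toy stand-in for the exposure model. The input names, distributions and the output formula are hypothetical assumptions, not the MERLIN-Expo food web or PBPK models.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 5000

      # Hypothetical uncertain inputs (names and distributions illustrative):
      half_life = rng.lognormal(mean=2.0, sigma=0.3, size=n)   # chemical half-life
      body_weight = rng.normal(70.0, 10.0, size=n)             # kg
      intake = rng.lognormal(mean=0.5, sigma=0.4, size=n)      # food intake

      # Toy stand-in for the model output (blood concentration).
      y = half_life * intake / body_weight

      # Regression-based sensitivity: standardized regression coefficients
      # (SRCs) apportion output variance among inputs for near-linear models.
      X = np.column_stack([half_life, body_weight, intake])
      Xs = (X - X.mean(axis=0)) / X.std(axis=0)
      ys = (y - y.mean()) / y.std()
      src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
      for name, b in zip(["half_life", "body_weight", "intake"], src):
          print(f"SRC({name}) = {b:+.2f}")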

  12. Factors controlling volume errors through 2D gully erosion assessment: guidelines for optimal survey design

    NASA Astrophysics Data System (ADS)

    Castillo, Carlos; Pérez, Rafael

    2017-04-01

    The assessment of gully erosion volumes is essential for the quantification of soil losses derived from this relevant degradation process. Traditionally, 2D and 3D approaches have been applied for this purpose (Casalí et al., 2006). Although innovative 3D approaches have recently been proposed for gully volume quantification, a renewed interest can be found in the literature regarding the useful information that cross-section analysis still provides in gully erosion research. Moreover, methods based on 2D approaches can be the most cost-effective option in many situations, such as preliminary studies with low accuracy requirements or surveys under time or budget constraints. The main aim of this work is to examine the key factors controlling volume error variability in 2D gully assessment by means of a stochastic experiment involving a Monte Carlo analysis over synthetic gully profiles, in order to 1) contribute to a better understanding of the drivers and magnitude of the uncertainty of 2D gully erosion surveys and 2) provide guidelines for optimal survey designs. Owing to the stochastic properties of error generation in 2D volume assessment, a statistical approach was followed to generate a large and significant set of gully reach configurations and to evaluate quantitatively the influence of the main factors controlling the uncertainty of the volume assessment. For this purpose, a simulation algorithm in Matlab® code was written, involving the following stages: generation of synthetic gully area profiles with different degrees of complexity (characterized by the cross-section variability); simulation of field measurements characterised by a survey intensity and the precision of the measurement method; and quantification of the volume error uncertainty as a function of the key factors. In this communication we will present the relationships between volume error and the studied factors, and propose guidelines for 2D field surveys based on the minimal survey densities required to achieve a certain accuracy, given the cross-sectional variability of a gully and the measurement method applied. References: Casalí, J., Loizu, J., Campo, M.A., De Santisteban, L.M., Álvarez-Mozos, J., 2006. Accuracy of methods for field assessment of rill and ephemeral gully erosion. Catena 67, 128-138. doi:10.1016/j.catena.2006.03.005
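
    A minimal sketch of the kind of Monte Carlo experiment described, assuming a synthetic profile of cross-sectional areas and a survey characterised by a spacing and a measurement noise level; all magnitudes are illustrative, and the original work was implemented in Matlab rather than Python.

      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic "true" gully: cross-sectional area (m^2) every 1 m along a
      # reach, with a random-walk component controlling profile complexity.
      reach_len = 200
      true_area = 2.0 + 0.04 * rng.standard_normal(reach_len).cumsum()
      true_area = np.clip(true_area, 0.2, None)
      true_volume = true_area.sum()  # 1 m spacing -> volume = sum of areas

      def survey_error(spacing, sigma_meas, n_sim=2000):
          """Std of relative volume error (%) when measuring every `spacing` m
          with Gaussian noise sigma_meas on each cross-section."""
          errs = np.empty(n_sim)
          for i in range(n_sim):
              start = rng.integers(spacing)            # random survey origin
              idx = np.arange(start, reach_len, spacing)
              meas = true_area[idx] + rng.normal(0, sigma_meas, idx.size)
              vol = meas.mean() * reach_len            # extrapolate mean area
              errs[i] = 100 * (vol - true_volume) / true_volume
          return errs.std()

      for spacing in (5, 10, 20, 40):
          print(spacing, "m ->", round(survey_error(spacing, 0.1), 2), "% error sd")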

  13. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data.

    PubMed

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-12

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods for determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the STSIS realizations, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and over multiple days, and multi-location uncertainties over one day and over multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.
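
    A minimal post-processing sketch of the indicator idea underlying this kind of assessment: given an ensemble of equally probable realizations (here synthetic random fields standing in for STSIS output), single-location exceedance probabilities are simply the ensemble means of threshold indicators. The grid size, threshold and distribution are assumptions.

      import numpy as np

      rng = np.random.default_rng(7)

      # Stand-in for STSIS output: an ensemble of equally probable realizations
      # of daily PM2.5 (ug/m^3) on a small grid (values hypothetical).
      n_real, ny, nx = 200, 20, 20
      realizations = rng.lognormal(mean=4.0, sigma=0.4, size=(n_real, ny, nx))

      # Single-location uncertainty: probability that a threshold is exceeded,
      # estimated as the ensemble mean of the indicator at each grid node.
      threshold = 75.0
      exceed_prob = (realizations > threshold).mean(axis=0)  # ny x nx map
      print("max exceedance probability on the grid:", exceed_prob.max().round(2))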

  14. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data

    NASA Astrophysics Data System (ADS)

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-01

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods for determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the STSIS realizations, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and over multiple days, and multi-location uncertainties over one day and over multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.

  15. Model Developments for Development of Improved Emissions Scenarios: Developing Purchasing-Power Parity Models, Analyzing Uncertainty, and Developing Data Sets for Gridded Integrated Assessment Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Zili; Nordhaus, William

    2009-03-19

    Over the duration of this project, we completed the main tasks set out in the initial proposal. These tasks included: setting up the basic platform in the GAMS language for the new RICE 2007 model; testing various model structures for RICE 2007; incorporating the PPP data set into the new RICE model; and developing a gridded data set for IA modeling.

  16. Probabilistic structural analysis of aerospace components using NESSUS

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.

    1988-01-01

    Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (numerical evaluation of stochastic structures under stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis are conducted assuming different failure models.

  17. Relating Data and Models to Characterize Parameter and Prediction Uncertainty

    EPA Science Inventory

    Applying PBPK models in risk analysis requires that we realistically assess the uncertainty of relevant model predictions in as quantitative a way as possible. The reality of human variability may add a confusing feature to the overall uncertainty assessment, as uncertainty and v...

  18. Capturing the complexity of uncertainty language to maximise its use.

    NASA Astrophysics Data System (ADS)

    Juanchich, Marie; Sirota, Miroslav

    2016-04-01

    Uncertainty is often communicated verbally, using uncertainty phrases such as 'there is a small risk of earthquake', 'flooding is possible' or 'it is very likely the sea level will rise'. Prior research has examined only a limited number of properties of uncertainty phrases: mainly the probability conveyed (e.g., 'a small chance' conveys a small probability, whereas 'it is likely' conveys a high probability). We propose a new analytical framework that captures more of the complexity of uncertainty phrases by studying their semantic, pragmatic and syntactic properties. Further, we argue that the complexity of uncertainty phrases is functional and can be leveraged to best describe uncertain outcomes and achieve the goals of speakers. We will present findings from a corpus study and an experiment in which we assessed the following properties of uncertainty phrases: probability conveyed, subjectivity, valence, nature of the subject, grammatical category of the uncertainty quantifier, and whether the quantifier elicits a positive or a negative framing. Natural language processing techniques applied to corpus data showed that people use a very large variety of uncertainty phrases representing different configurations of these properties (e.g., phrases that convey different levels of subjectivity, phrases with different grammatical constructions). In addition, the corpus analysis uncovered that the uncertainty phrases commonly studied in psychology are not those most commonly used in real life. In the experiment we manipulated the amount of evidence indicating that a fact was true and whether the participant was required to prove that the fact was true or that it was false. Participants produced a phrase to communicate the likelihood that the fact was true (e.g., 'it is not sure…', 'I am convinced that…'). The analyses of the uncertainty phrases produced showed that participants leveraged the properties of uncertainty phrases to reflect the strength of evidence, but also to achieve their personal goals. For example, participants aiming to prove that the fact was true chose words that conveyed a more positive polarity and a higher probability than participants aiming to prove that the fact was false. We discuss the utility of the framework for harnessing the properties of uncertainty phrases in geosciences.

  19. A systematic uncertainty analysis for liner impedance eduction technology

    NASA Astrophysics Data System (ADS)

    Zhou, Lin; Bodén, Hans

    2015-11-01

    The so-called impedance eduction technology is widely used for obtaining the acoustic properties of liners used in aircraft engines. The measurement uncertainties of this technology are still not well understood, though such understanding is essential for data quality assessment and model validation. A systematic framework based on multivariate analysis is presented in this paper to provide 95 percent confidence interval uncertainty estimates in the process of impedance eduction. The analysis uses a single-mode, straightforward method based on transmission coefficients involving the classic Ingard-Myers boundary condition. The multivariate technique makes it possible to obtain an uncertainty analysis for the possibly correlated real and imaginary parts of the complex quantities. The results show that the errors in impedance results at low frequencies depend mainly on the variability of the transmission coefficients, while the accuracy of the mean Mach number is the most important source of error at high frequencies. The effects of the Mach numbers used in the wave dispersion equation and in the Ingard-Myers boundary condition have been separated for comparison of the outcome of impedance eduction. A local Mach number based on friction velocity is suggested as a way to reduce the inconsistencies found when estimating impedance using upstream and downstream acoustic excitation.

  20. Planck 2015 results. IV. Low Frequency Instrument beams and window functions

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartolo, N.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Catalano, A.; Chamballu, A.; Christensen, P. R.; Colombi, S.; Colombo, L. P. L.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Dickinson, C.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Fergusson, J.; Finelli, F.; Forni, O.; Frailis, M.; Franceschi, E.; Frejsel, A.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Hansen, F. K.; Hanson, D.; Harrison, D. L.; Henrot-Versillé, S.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T. S.; Knoche, J.; Kunz, M.; Kurki-Suonio, H.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Leahy, J. P.; Leonardi, R.; Lesgourgues, J.; Levrier, F.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Lindholm, V.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; Mazzotta, P.; McGehee, P.; Meinhold, P. R.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Novikov, D.; Novikov, I.; Paci, F.; Pagano, L.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Perdereau, O.; Perotto, L.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Pierpaoli, E.; Pietrobon, D.; Pointecouteau, E.; Polenta, G.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renzi, A.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Savelainen, M.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Spencer, L. D.; Stolyarov, V.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vassallo, T.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Watson, R.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zonca, A.

    2016-09-01

    This paper presents the characterization of the in-flight beams, the beam window functions, and the associated uncertainties for the Planck Low Frequency Instrument (LFI). The structure of the paper is similar to that presented in the 2013 Planck release; the main differences concern the beam normalization and the delivery of the window functions to be used for polarization analysis. The in-flight assessment of the LFI main beams relies on measurements performed during observations of Jupiter. By stacking data from seven Jupiter transits, the main beam profiles are measured down to -25 dB at 30 and 44 GHz, and down to -30 dB at 70 GHz. It has been confirmed that the agreement between the simulated beams and the measured beams is better than 1% at each LFI frequency band (within the 20 dB contour from the peak, the rms values are 0.1% at 30 and 70 GHz; 0.2% at 44 GHz). Simulated polarized beams are used for the computation of the effective beam window functions. The error budget for the window functions is estimated from both main beam and sidelobe contributions, and accounts for the radiometer band shapes. The total uncertainties in the effective beam window functions are 0.7% and 1% at 30 and 44 GHz, respectively (at ℓ ≈ 600); and 0.5% at 70 GHz (at ℓ ≈ 1000).

  1. Planck 2015 results: IV. Low Frequency Instrument beams and window functions

    DOE PAGES

    Ade, P. A. R.; Aghanim, N.; Ashdown, M.; ...

    2016-09-20

    This article presents the characterization of the in-flight beams, the beam window functions, and the associated uncertainties for the Planck Low Frequency Instrument (LFI). The structure of the paper is similar to that presented in the 2013 Planck release; the main differences concern the beam normalization and the delivery of the window functions to be used for polarization analysis. The in-flight assessment of the LFI main beams relies on measurements performed during observations of Jupiter. By stacking data from seven Jupiter transits, the main beam profiles are measured down to -25 dB at 30 and 44 GHz, and down to -30 dB at 70 GHz. It has been confirmed that the agreement between the simulated beams and the measured beams is better than 1% at each LFI frequency band (within the 20 dB contour from the peak, the rms values are 0.1% at 30 and 70 GHz; 0.2% at 44 GHz). Simulated polarized beams are used for the computation of the effective beam window functions. The error budget for the window functions is estimated from both main beam and sidelobe contributions, and accounts for the radiometer band shapes. The total uncertainties in the effective beam window functions are 0.7% and 1% at 30 and 44 GHz, respectively (at ℓ ≈ 600); and 0.5% at 70 GHz (at ℓ ≈ 1000).

  2. Woody plants and the prediction of climate-change impacts on bird diversity.

    PubMed

    Kissling, W D; Field, R; Korntheuer, H; Heyder, U; Böhning-Gaese, K

    2010-07-12

    Current methods of assessing climate-induced shifts of species distributions rarely account for species interactions and usually ignore potential differences in response times of interacting taxa to climate change. Here, we used species-richness data from 1005 breeding bird and 1417 woody plant species in Kenya and employed model-averaged coefficients from regression models and median climatic forecasts assembled across 15 climate-change scenarios to predict bird species richness under climate change. Forecasts assuming an instantaneous response of woody plants and birds to climate change suggested increases in future bird species richness across most of Kenya whereas forecasts assuming strongly lagged woody plant responses to climate change indicated a reversed trend, i.e. reduced bird species richness. Uncertainties in predictions of future bird species richness were geographically structured, mainly owing to uncertainties in projected precipitation changes. We conclude that assessments of future species responses to climate change are very sensitive to current uncertainties in regional climate-change projections, and to the inclusion or not of time-lagged interacting taxa. We expect even stronger effects for more specialized plant-animal associations. Given the slow response time of woody plant distributions to climate change, current estimates of future biodiversity of many animal taxa may be both biased and too optimistic.

  3. Uncertainty of fast biological radiation dose assessment for emergency response scenarios.

    PubMed

    Ainsbury, Elizabeth A; Higueras, Manuel; Puig, Pedro; Einbeck, Jochen; Samaga, Daniel; Barquinero, Joan Francesc; Barrios, Lleonard; Brzozowska, Beata; Fattibene, Paola; Gregoire, Eric; Jaworska, Alicja; Lloyd, David; Oestreicher, Ursula; Romm, Horst; Rothkamm, Kai; Roy, Laurence; Sommer, Sylwester; Terzoudi, Georgia; Thierens, Hubert; Trompier, Francois; Vral, Anne; Woda, Clemens

    2017-01-01

    Reliable dose estimation is an important factor in appropriate dosimetric triage categorization of exposed individuals to support radiation emergency response. Following work done under the EU FP7 MULTIBIODOSE and RENEB projects, formal methods for defining uncertainties on biological dose estimates are compared using simulated and real data from recent exercises. The results demonstrate that a Bayesian method of uncertainty assessment is the most appropriate, even in the absence of detailed prior information. The relative accuracy and relevance of techniques for calculating uncertainty and combining assay results to produce single dose and uncertainty estimates are further discussed. Finally, it is demonstrated that, whatever uncertainty estimation method is employed, ignoring the uncertainty on fast dose assessments can have an important impact on rapid biodosimetric categorization.
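
    A minimal sketch of a Bayesian dose estimate of the kind the paper favours, assuming a Poisson likelihood for dicentric chromosome counts and a linear-quadratic calibration curve with illustrative coefficients; this is not the MULTIBIODOSE/RENEB implementation.

      import numpy as np
      from scipy.stats import poisson

      # Observed dicentric count in `cells` scored cells (hypothetical values).
      dics, cells = 25, 500

      # Assumed linear-quadratic calibration curve Y(D) = c + a*D + b*D^2
      # (dicentrics per cell); coefficients are illustrative placeholders.
      c, a, b = 0.001, 0.03, 0.06

      # Grid-based Bayesian dose estimate with a flat prior on dose (Gy).
      dose = np.linspace(0.0, 6.0, 601)
      yield_per_cell = c + a * dose + b * dose**2
      log_like = poisson.logpmf(dics, mu=cells * yield_per_cell)
      post = np.exp(log_like - log_like.max())
      post /= post.sum()

      mean_dose = (dose * post).sum()
      cdf = post.cumsum()
      lo, hi = dose[np.searchsorted(cdf, 0.025)], dose[np.searchsorted(cdf, 0.975)]
      print(f"dose ~ {mean_dose:.2f} Gy, 95% credible interval [{lo:.2f}, {hi:.2f}]")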

  4. Framework for Uncertainty Assessment - Hanford Site-Wide Groundwater Flow and Transport Modeling

    NASA Astrophysics Data System (ADS)

    Bergeron, M. P.; Cole, C. R.; Murray, C. J.; Thorne, P. D.; Wurstner, S. K.

    2002-05-01

    Pacific Northwest National Laboratory is in the process of developing and implementing an uncertainty estimation methodology for use in future site assessments that addresses parameter uncertainty as well as uncertainties related to the groundwater conceptual model. The long-term goals of the effort are the development and implementation of an uncertainty estimation methodology for use in future assessments and analyses being made with the Hanford site-wide groundwater model. The basic approach in the framework developed for uncertainty assessment consists of: 1) alternate conceptual model (ACM) identification, to identify and document the major features and assumptions of each conceptual model; the process must also include a periodic review of existing and proposed new conceptual models as data or understanding become available; 2) ACM development, in which each identified conceptual model is developed through inverse modeling with historical site data; 3) ACM evaluation, to identify which of the conceptual models are plausible and should be included in any subsequent uncertainty assessments; 4) ACM uncertainty assessments, carried out only for those ACMs determined to be plausible through comparison with historical observations and model structure identification measures; the parameter uncertainty assessment process generally involves: a) model complexity optimization, to identify the important or relevant parameters for the uncertainty analysis; b) characterization of parameter uncertainty, to develop the pdfs for the important uncertain parameters, including identification of any correlations among parameters; and c) propagation of uncertainty, to propagate parameter uncertainties (e.g., by first-order second-moment methods if applicable, or by a Monte Carlo approach) through the model to determine the uncertainty in the model predictions of interest; and 5) estimation of combined ACM and scenario uncertainty by a double sum, with each component of the inner sum (an individual CCDF) representing the parameter uncertainty associated with a particular scenario and ACM, and the outer sum enumerating the various plausible ACM and scenario combinations, so as to represent the combined estimate of uncertainty (a family of CCDFs). A final important part of the framework is the identification, enumeration, and documentation of all the assumptions: those made during conceptual model development, those required by the mathematical model, those required by the numerical model, those made during the spatial and temporal discretization process, those needed to assign the statistical model and associated parameters that describe the uncertainty in the relevant input parameters, and finally those required by the propagation method. Pacific Northwest National Laboratory is operated for the U.S. Department of Energy under Contract DE-AC06-76RL01830.
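
    A minimal sketch of the 'propagation of uncertainty' step for a single ACM and scenario, assuming two hypothetical uncertain parameters and a toy prediction model in place of the site-wide groundwater model; the CCDF summary follows the framework's description.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 10_000

      # Characterize parameter uncertainty with pdfs (values hypothetical):
      # hydraulic conductivity (lognormal) and recharge (normal).
      K = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # m/d
      R = rng.normal(loc=50.0, scale=10.0, size=n)     # mm/yr

      # Propagate through a stand-in prediction model (the real case would be
      # the site-wide groundwater flow and transport model).
      conc = 100.0 * R / (R + 40.0 * K)                # illustrative output

      # (levels, ccdf) are the coordinates of one CCDF, summarizing parameter
      # uncertainty for one ACM/scenario pair; repeating over plausible ACMs
      # and scenarios yields the family of CCDFs described above.
      levels = np.sort(conc)
      ccdf = 1.0 - np.arange(1, n + 1) / n
      print("P(conc > 60) ~", (conc > 60.0).mean().round(3))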

  5. Probabilistic volcanic hazard assessments of Pyroclastic Density Currents: ongoing practices and future perspectives

    NASA Astrophysics Data System (ADS)

    Tierz, Pablo; Sandri, Laura; Ramona Stefanescu, Elena; Patra, Abani; Marzocchi, Warner; Costa, Antonio; Sulpizio, Roberto

    2014-05-01

    Explosive volcanoes and, especially, Pyroclastic Density Currents (PDCs) pose an enormous threat to populations living in the surroundings of volcanic areas. Difficulties in the modeling of PDCs are related to (i) the very complex and stochastic physical processes intrinsic to their occurrence, and (ii) a lack of knowledge about how these processes actually form and evolve. This means that there are deep uncertainties (namely, of aleatory nature due to point (i) above, and of epistemic nature due to point (ii) above) associated with the study and forecasting of PDCs. Consequently, the assessment of their hazard is better described by probabilistic approaches than by deterministic ones. What is actually done to assess probabilistic hazard from PDCs is to couple deterministic simulators with statistical techniques that can, eventually, supply probabilities and inform about the uncertainties involved. In this work, some examples of both PDC numerical simulators (Energy Cone and TITAN2D) and uncertainty quantification techniques (Monte Carlo sampling -MC-, Polynomial Chaos Quadrature -PCQ- and Bayesian Linear Emulation -BLE-) are presented, and their advantages, limitations and future potential are underlined. The key point in choosing a specific method lies in the balance between its computational cost, the physical reliability of the simulator and the target of the hazard analysis (type of PDCs considered, time scale selected for the analysis, particular guidelines received from decision-making agencies, etc.). Although current numerical and statistical techniques have brought important advances in probabilistic volcanic hazard assessment for PDCs, some of them may be further applicable to more sophisticated simulators. In addition, forthcoming improvements could focus on three main multidisciplinary directions: 1) validating the simulators frequently used (through comparison with PDC deposits and with other simulators); 2) decreasing simulator runtimes (whether by increasing knowledge of the physical processes or through more efficient programming and parallelization); and 3) improving uncertainty quantification techniques.
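
    A minimal sketch in the spirit of an energy-cone Monte Carlo, assuming flat topography so that runout is L = H / tan(phi); the parameter ranges and the target distance are illustrative assumptions, not calibrated values for any volcano.

      import numpy as np

      rng = np.random.default_rng(11)
      n = 50_000

      # Energy-cone-style Monte Carlo (simplified flat topography): a PDC with
      # collapse height H runs out a distance L = H / tan(phi), where tan(phi)
      # is the H/L mobility (friction) parameter. Ranges are illustrative only.
      H = rng.uniform(500.0, 3000.0, size=n)      # collapse height, m
      tan_phi = rng.uniform(0.1, 0.4, size=n)     # H/L mobility ratio

      runout = H / tan_phi                        # m, measured from the vent

      # Probabilistic hazard at a town a fixed distance from the vent:
      # probability that a randomly sampled PDC reaches it.
      town_distance = 8_000.0
      print("P(runout >= town) ~", (runout >= town_distance).mean().round(3))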

  6. Climate change risk analysis framework (CCRAF) a probabilistic tool for analyzing climate change uncertainties

    NASA Astrophysics Data System (ADS)

    Legget, J.; Pepper, W.; Sankovski, A.; Smith, J.; Tol, R.; Wigley, T.

    2003-04-01

    Potential risks of human-induced climate change are subject to a three-fold uncertainty associated with: the extent of future anthropogenic and natural GHG emissions; global and regional climatic responses to emissions; and the impacts of climatic changes on economies and the biosphere. Long-term analyses are also subject to uncertainty regarding how humans will respond to actual or perceived changes, through adaptation or mitigation efforts. Explicitly addressing these uncertainties is a high priority in the scientific and policy communities. Probabilistic modeling is gaining momentum as a technique to quantify uncertainties explicitly and to use decision analysis techniques that take advantage of improved risk information. The Climate Change Risk Assessment Framework (CCRAF) presented here is a new integrative tool that combines the probabilistic approaches developed in population, energy and economic sciences with empirical data and probabilistic results of climate and impact models. The main CCRAF objective is to assess global climate change as a risk management challenge and to provide insights regarding robust policies that address the risks, by mitigating greenhouse gas emissions and by adapting to climate change consequences. The CCRAF endogenously simulates, to 2100 or beyond, annual region-specific changes in population; GDP; primary (by fuel) and final energy (by type) use; a wide set of associated GHG emissions; GHG concentrations; global temperature change and sea level rise; economic, health, and biospheric impacts; costs of mitigation and adaptation measures; and residual costs or benefits of climate change. Atmospheric and climate components of CCRAF are formulated based on the latest version of Wigley's and Raper's MAGICC model, and impacts are simulated based on a modified version of Tol's FUND model. The CCRAF is based on a series of log-linear equations with deterministic and random components and is implemented using a Monte Carlo method with up to 5000 variants per set of fixed input parameters. The shapes and coefficients of the CCRAF equations are derived from regression analyses of historic data and expert assessments. There are two types of random components in CCRAF: one reflects year-to-year fluctuations around the expected value of a given variable (e.g., the standard error of annual GDP growth), and the other is fixed within each CCRAF variant and represents essential constants of the "world" represented by that variant (e.g., the value of climate sensitivity). Both types of random components are drawn from pre-defined probability distribution functions developed from historic data or expert assessments. Preliminary CCRAF results emphasize the relative importance of uncertainties associated with the conversion of GHG and particulate emissions into radiative forcing and with quantifying climate change effects at the regional level. A separate analysis involves "adaptive decision-making", which optimizes the expected future policy effects given the estimated probabilistic uncertainties. As the uncertainty for some variables evolves over the time steps, the decisions also adapt. This modeling approach is feasible only with explicit modeling of uncertainties.
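
    A minimal sketch of the two kinds of random components described: variant-level constants drawn once per simulated "world" and year-to-year fluctuations redrawn every year within each variant. All distributions and the outcome relationship are illustrative placeholders, not CCRAF equations.

      import numpy as np

      rng = np.random.default_rng(5)
      n_variants, n_years = 1000, 100

      # Variant-level random components: constants within each simulated
      # "world", e.g. climate sensitivity (distribution is a placeholder).
      climate_sens = rng.lognormal(mean=np.log(3.0), sigma=0.3, size=n_variants)

      # Year-to-year random components: fluctuations around expected growth,
      # redrawn every year within every variant (again illustrative).
      gdp_growth = 0.02 + 0.01 * rng.standard_normal((n_variants, n_years))

      # Toy outcome per variant: end-of-horizon warming scales with cumulative
      # growth (a crude emissions proxy) times the variant's sensitivity.
      cum_growth = gdp_growth.sum(axis=1)
      warming = climate_sens * cum_growth / 2.0   # stand-in relationship

      print("median warming:", np.median(warming).round(2),
            "| 5-95% range:", np.percentile(warming, [5, 95]).round(2))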

  7. Parameter sensitivity analysis of a 1-D cold region lake model for land-surface schemes

    NASA Astrophysics Data System (ADS)

    Guerrero, José-Luis; Pernica, Patricia; Wheater, Howard; Mackay, Murray; Spence, Chris

    2017-12-01

    Lakes might be sentinels of climate change, but the uncertainty in their main feedback to the atmosphere - heat-exchange fluxes - is often not considered within climate models. Additionally, these fluxes are seldom measured, hindering critical evaluation of model output. Analysis of the Canadian Small Lake Model (CSLM), a one-dimensional integral lake model, was performed to assess its ability to reproduce diurnal and seasonal variations in heat fluxes and the sensitivity of simulated fluxes to changes in model parameters, i.e., turbulent transport parameters and the light extinction coefficient (Kd). A C++ open-source software package, Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), was used to perform sensitivity analysis (SA) and identify the parameters that dominate model behavior. The generalized likelihood uncertainty estimation (GLUE) was applied to quantify the fluxes' uncertainty, comparing daily-averaged eddy-covariance observations to the output of CSLM. Seven qualitative and two quantitative SA methods were tested, and the posterior likelihoods of the modeled parameters, obtained from the GLUE analysis, were used to determine the dominant parameters and the uncertainty in the modeled fluxes. Despite the ubiquity of the equifinality issue - different parameter-value combinations yielding equivalent results - the answer to the question was unequivocal: Kd, a measure of how much light penetrates the lake, dominates sensible and latent heat fluxes, and the uncertainty in their estimates is strongly related to the accuracy with which Kd is determined. This is important since accurate and continuous measurements of Kd could reduce modeling uncertainty.
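
    A minimal GLUE sketch under stated assumptions: synthetic "observed" fluxes, a one-parameter toy model in place of the CSLM, Nash-Sutcliffe efficiency as the informal likelihood and a behavioural threshold of 0.5; none of these choices are taken from the paper itself.

      import numpy as np

      rng = np.random.default_rng(9)

      # Synthetic daily-averaged "observed" flux (stand-in for eddy-covariance
      # data) and a toy model with one parameter Kd (light extinction).
      t = np.arange(120)
      obs = 80 + 40 * np.sin(2 * np.pi * t / 120) + rng.normal(0, 5, t.size)

      def model(kd):
          # Illustrative response: stronger extinction damps the seasonal cycle.
          return 80 + 40 * np.exp(-kd) * np.sin(2 * np.pi * t / 120)

      # GLUE: sample parameters, keep "behavioural" runs above a likelihood
      # threshold, and weight their predictions by the informal likelihood.
      kd_samples = rng.uniform(0.05, 2.0, 5000)
      nse = np.array([1 - np.sum((model(k) - obs)**2) / np.sum((obs - obs.mean())**2)
                      for k in kd_samples])
      behavioural = nse > 0.5
      weights = nse[behavioural] / nse[behavioural].sum()
      preds = np.array([model(k) for k in kd_samples[behavioural]])
      mean_pred = weights @ preds   # likelihood-weighted flux estimate
      print(f"{behavioural.sum()} behavioural runs; first flux value "
            f"{mean_pred[0]:.1f} W/m^2")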

  8. Uncertainties in Climatological Seawater Density Calculations

    NASA Astrophysics Data System (ADS)

    Dai, Hao; Zhang, Xining

    2018-03-01

    In most applications, with seawater conductivity, temperature, and pressure data measured in situ by various observation instruments, e.g., Conductivity-Temperature-Depth (CTD) instruments, the density, which has strong ties to ocean dynamics, is computed according to equations of state for seawater. This paper, based on the density computational formulae in the Thermodynamic Equation of Seawater 2010 (TEOS-10), follows the Guide to the Expression of Uncertainty in Measurement (GUM) and assesses the main sources of uncertainty. Using climatological decadal-average temperature/Practical Salinity/pressure data sets for the global ocean provided by the National Oceanic and Atmospheric Administration (NOAA), correlation coefficients between uncertainty sources are determined and the combined standard uncertainties uc(ρ) of the seawater density calculations are evaluated. For grid points in the world ocean with 0.25° resolution, the standard deviations of uc(ρ) in vertical profiles are of the order of 10^-4 kg m^-3. The uc(ρ) means in vertical profiles of the Baltic Sea are about 0.028 kg m^-3, due to the larger scatter of the Absolute Salinity anomaly. The distribution of the uc(ρ) means in vertical profiles of the world ocean outside the Baltic Sea, which covers the range (0.004, 0.01) kg m^-3, is related to the correlation coefficient r(SA, p) between Absolute Salinity SA and pressure p. The results in this paper are based on the measurement uncertainties of high-accuracy CTD sensors; larger uncertainties in density calculations may arise with lower sensor specifications. This work may provide valuable uncertainty information required for reliability considerations in ocean circulation and global climate models.
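
    For reference, a GUM-style combination for density computed from Absolute Salinity, temperature and pressure might be written as below; the explicit correlation term is shown only for the salinity-pressure pair highlighted above, and other correlated pairs would contribute analogous terms.

      \[
      u_c^2(\rho) =
      \left(\frac{\partial \rho}{\partial S_A}\right)^{\!2} u^2(S_A)
      + \left(\frac{\partial \rho}{\partial t}\right)^{\!2} u^2(t)
      + \left(\frac{\partial \rho}{\partial p}\right)^{\!2} u^2(p)
      + 2\,\frac{\partial \rho}{\partial S_A}\frac{\partial \rho}{\partial p}\,
        r(S_A, p)\, u(S_A)\, u(p)
      \]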

  9. Estimation of Uncertainties in Stage-Discharge Curve for an Experimental Himalayan Watershed

    NASA Astrophysics Data System (ADS)

    Kumar, V.; Sen, S.

    2016-12-01

    Various water resource projects developed on rivers originating in the Himalayan region, the "Water Tower of Asia", play an important role in downstream development. Flow measurements at the desired river site are critical for river engineers and hydrologists in water resources planning and management, flood forecasting, reservoir operation and flood inundation studies. However, accurate discharge assessment of these mountainous rivers is costly, tedious and frequently dangerous to operators during flood events. Currently, in India, discharge estimation is linked to the stage-discharge relationship known as the rating curve. This relationship is affected by a high degree of uncertainty. Estimating the uncertainty of the rating curve remains a relevant challenge because it is not easy to parameterize. The main sources of rating-curve uncertainty are errors due to incorrect discharge measurement, variation in hydraulic conditions, and depth measurement. In this study, our objectives are to obtain the rating curve parameters that best fit the limited record of observations and to estimate the uncertainties at different depths obtained from the rating curve. The rating curve parameters of the standard power law are estimated for three different streams of the Aglar watershed, located in the Lesser Himalayas, by a maximum-likelihood estimator. Quantification of the uncertainties in the developed rating curves is obtained from the estimated variances and covariances of the rating curve parameters. Results showed that the uncertainties varied with catchment behaviour, with errors between 0.006 and 1.831 m3/s. Discharge uncertainty in the Aglar watershed streams depends significantly on the extent of extrapolation outside the range of observed water levels. The extrapolation analysis confirmed that extrapolating more than 15% beyond maximum observed discharges, or more than 5% below minimum discharges, is not recommended for these mountainous gauging sites.
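
    A minimal sketch of fitting the standard power-law rating curve and propagating parameter uncertainty to discharge, assuming hypothetical gaugings and Gaussian errors (so least squares coincides with maximum likelihood); the delta-method step uses the parameter covariance returned by the fit.

      import numpy as np
      from scipy.optimize import curve_fit

      # Hypothetical stage-discharge gaugings (h in m, Q in m^3/s).
      h = np.array([0.30, 0.45, 0.60, 0.80, 1.00, 1.20, 1.45])
      Q = np.array([0.08, 0.25, 0.55, 1.20, 2.10, 3.40, 5.60])

      def rating(h, a, h0, b):
          """Standard power-law rating curve Q = a * (h - h0)^b."""
          return a * (h - h0) ** b

      # Least-squares fit (maximum likelihood under Gaussian errors); pcov
      # holds the parameter variances/covariances used for uncertainty.
      popt, pcov = curve_fit(rating, h, Q, p0=[2.0, 0.1, 2.0],
                             bounds=([0.1, -0.5, 0.5], [50.0, 0.25, 4.0]))

      # First-order (delta-method) discharge uncertainty at a given stage.
      h_new, eps = 1.30, 1e-6
      grad = np.array([(rating(h_new, *(popt + eps * np.eye(3)[i])) -
                        rating(h_new, *popt)) / eps for i in range(3)])
      sd_Q = np.sqrt(grad @ pcov @ grad)
      print(f"Q({h_new} m) = {rating(h_new, *popt):.2f} +/- {sd_Q:.2f} m^3/s")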

  10. Equifinality and process-based modelling

    NASA Astrophysics Data System (ADS)

    Khatami, S.; Peel, M. C.; Peterson, T. J.; Western, A. W.

    2017-12-01

    Equifinality is understood as one of the fundamental difficulties in the study of open complex systems, including catchment hydrology. A review of the hydrologic literature reveals that the term equifinality has been widely used, but in many cases inconsistently and without coherent recognition of its various facets, which can lead to ambiguity and to methodological fallacies. Therefore, in this study we first characterise the term equifinality within the context of hydrological modelling by reviewing the genesis of the concept and then presenting a theoretical framework. During past decades, equifinality has mainly been studied as a subset of aleatory uncertainty (arising from randomness) and for the assessment of model parameter uncertainty. Although the connection between parameter uncertainty and equifinality is undeniable, we argue there is more to equifinality than aleatory parameter uncertainty alone. That is, the importance of equifinality and of epistemic uncertainty (arising from lack of knowledge), and their implications, are overlooked in our current practice of model evaluation. Equifinality and epistemic uncertainty in studying, modelling, and evaluating hydrologic processes are treated as if they could simply be discussed in (or often reduced to) probabilistic terms, as is done for aleatory uncertainty. The deficiencies of this approach to conceptual rainfall-runoff modelling are demonstrated for selected Australian catchments by examination of parameter and internal flux distributions and interactions within SIMHYD. On this basis, we present a new approach that expands the equifinality concept beyond model parameters to inform epistemic uncertainty. The new approach potentially facilitates the identification and development of more physically plausible models and model evaluation schemes, particularly within the multiple working hypotheses framework, and is generalisable to other fields of environmental modelling.

  11. Probability and Confidence Trade-space (PACT) Evaluation: Accounting for Uncertainty in Sparing Assessments

    NASA Technical Reports Server (NTRS)

    Anderson, Leif; Box, Neil; Carter, Katrina; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael

    2012-01-01

    There are two general shortcomings to the current annual sparing assessment: 1. The vehicle functions are currently assessed against confidence targets, which can be misleading, being either overly conservative or overly optimistic. 2. The current confidence levels are arbitrarily determined and do not account for epistemic uncertainty (lack of knowledge) in the ORU failure rate. There are two major categories of uncertainty that impact a sparing assessment: (a) Aleatory uncertainty: natural variability in the distribution of actual failures around a Mean Time Between Failures (MTBF); (b) Epistemic uncertainty: lack of knowledge about the true value of an Orbital Replacement Unit's (ORU) MTBF. We propose an approach that revises the confidence targets and accounts for both categories of uncertainty, which we call Probability and Confidence Trade-space (PACT) evaluation.
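
    The separation of the two uncertainty categories can be illustrated with a small Monte Carlo sketch: an outer epistemic layer samples candidate MTBF values, and an inner aleatory layer uses the Poisson distribution of failures over the mission given each MTBF. This is not the PACT tool itself; the distributions and numbers below are hypothetical.

    ```python
    # Illustrative sketch only: epistemic uncertainty on MTBF versus aleatory
    # variability in failure counts, for a fixed number of spares.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    mission_hours = 5 * 8760          # hypothetical 5-year horizon
    n_spares = 3

    # Epistemic layer: uncertain MTBF, e.g. lognormal around an estimated 20,000 h
    mtbf_samples = rng.lognormal(mean=np.log(20_000), sigma=0.4, size=10_000)

    # Aleatory layer: given an MTBF, failures ~ Poisson(mission_hours / MTBF);
    # the probability that spares suffice is the Poisson CDF at n_spares.
    p_sufficient = stats.poisson.cdf(n_spares, mission_hours / mtbf_samples)

    # The distribution of p_sufficient expresses confidence (epistemic) about
    # a probability (aleatory) -- the trade-space the PACT name refers to.
    print("median probability:", np.median(p_sufficient))
    print("5th percentile (conservative):", np.percentile(p_sufficient, 5))
    ```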

  12. Hydrological Responses of Chaobai River Basin under 1.5° and 2.0° Global Warming Using Multi-GCMs and Multi-RCPs

    NASA Astrophysics Data System (ADS)

    Hao, Y.; Ma, J.

    2017-12-01

    The global warming levels of 1.5° and 2.0° proposed in the Paris Agreement have become iconic thresholds for climate change impact research and discussion. In order to provide a useful reference for effective water resource management and planning for the capital city of China, this study aims to assess the potential impact of 1.5° and 2.0° global warming on river discharge in the Chaobai River Basin (CRB), the main water supply source of Beijing. The semi-distributed hydrological model SWAT was driven by climate projections from five General Circulation Models (GCMs) under three Representative Concentration Pathways (RCP4.5, RCP6.0 and RCP8.5) to simulate future discharge in the CRB under 1.5° and 2.0° global warming respectively. On this basis, the climate change impact on annual and monthly discharge, seasonal discharge distribution and extreme monthly discharge in the CRB was assessed, and the uncertainty associated with GCMs and RCPs was analyzed quantitatively. The results indicate that the average annual discharge will increase slightly and become more concentrated in midsummer and early autumn under 1.5° global warming. When the global average temperature rises by 2°, the annual discharge in the CRB shows an evident positive tendency, increasing by approximately 30%, and the extreme monthly runoff will increase significantly. However, the proportion of discharge in summer, the peak water usage period, will decline. It is evident that the increment of 0.5° will lead to more flood events and pose a great challenge to water resource management. There is uncertainty in the projections of temperature, precipitation and discharge; the uncertainty of the discharge projection is far greater than that of the two meteorological elements. Compared with RCPs, GCMs proved to be the main factor responsible for the impact uncertainty in the CRB under the two global warming horizons, and the uncertainty becomes larger as the warming magnitude increases. In short, the additional 0.5° will be crucial for flood control and water security; it is therefore better to pursue efforts to limit the temperature increase to 1.5 °C above pre-industrial levels.
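
    A simple way to quantify how much of the projection spread is attributable to GCMs versus RCPs, as described qualitatively above, is a two-way variance decomposition over the GCM x RCP matrix of simulated discharge changes. The sketch below uses an invented matrix, not the study's results.

    ```python
    # Hedged sketch: ANOVA-style attribution of ensemble spread to GCM and RCP.
    import numpy as np

    # rows: 5 GCMs, columns: 3 RCPs -- percent change in annual discharge (invented)
    dq = np.array([
        [12.0, 18.0, 25.0],
        [ 5.0,  9.0, 14.0],
        [20.0, 26.0, 35.0],
        [ 8.0, 11.0, 17.0],
        [15.0, 22.0, 30.0],
    ])

    grand = dq.mean()
    gcm_var = np.mean((dq.mean(axis=1) - grand) ** 2)   # spread across GCMs
    rcp_var = np.mean((dq.mean(axis=0) - grand) ** 2)   # spread across RCPs
    resid_var = dq.var() - gcm_var - rcp_var            # interaction/residual

    total = gcm_var + rcp_var + resid_var
    print(f"GCM share: {gcm_var/total:.0%}, RCP share: {rcp_var/total:.0%}, "
          f"interaction: {resid_var/total:.0%}")
    ```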

  13. Using Life Cycle Assessment methodology to assess UHT milk production in Portugal.

    PubMed

    González-García, Sara; Castanheira, Erica G; Dias, Ana Cláudia; Arroja, Luis

    2013-01-01

    Milk and dairy products constitute an important ingredient in the human diet. Ultra-high temperature (UHT) milk is the main dairy product consumed in Portugal, and its production entails large inputs of resources which result in negative environmental effects such as nutrient enrichment of ecosystems and climate change. In this study, Life Cycle Assessment (LCA) methodology was applied to the environmental assessment of packaged UHT milk produced in Portugal, including plain (whole, semi-skimmed and skimmed) and cocoa milk, from a cradle-to-gate perspective, in order to identify the environmental hot spots. Results showed that the production of raw milk in the dairy farm is the main hot spot in almost all the categories under assessment, mainly due to emissions from enteric fermentation and manure management and to fertiliser production and application. Furthermore, on-site emissions from the dairy factory are notable, together with packaging production and energy requirements. The values reported in this study are in the range of other related papers. However, differences were also identified for several reasons, such as allocation approach, data sources, characterisation factors, farm management regimes and assumptions considered. Therefore, these aspects should be carefully addressed, and sensitivity to the assumptions and uncertainty of the results should be evaluated. Copyright © 2012 Elsevier B.V. All rights reserved.

  14. The importance of hydrological uncertainty assessment methods in climate change impact studies

    NASA Astrophysics Data System (ADS)

    Honti, M.; Scheidegger, A.; Stamm, C.

    2014-08-01

    Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with a recent boost after the publication of the IPCC AR4 report. From hundreds of impact studies a quasi-standard methodology has emerged, to a large extent shaped by the growing public demand for predicting how water resources management or flood protection should change in the coming decades. The "standard" workflow relies on a model cascade from global circulation model (GCM) predictions for selected IPCC scenarios to future catchment hydrology. Uncertainty is present at each level and propagates through the model cascade. There is an emerging consensus among many studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. Our hypothesis was that the relative importance of climatic and hydrologic uncertainty is (among other factors) heavily influenced by the uncertainty assessment method. To test this we carried out a climate change impact assessment and estimated the relative importance of the uncertainty sources. The study was performed on two small catchments in the Swiss Plateau with a lumped conceptual rainfall-runoff model. In the climatic part we applied the standard ensemble approach to quantify uncertainty, but in hydrology we used formal Bayesian uncertainty assessment with two different likelihood functions. One was a time series error model able to deal with the complicated statistical properties of hydrological model residuals. The second was an approximate likelihood function for the flow quantiles. The results showed that the expected climatic impact on flow quantiles was small compared to prediction uncertainty. The choice of uncertainty assessment method actually determined which sources of uncertainty could be identified at all. This demonstrates that one can arrive at rather different conclusions about the causes behind predictive uncertainty for the same hydrological model and calibration data when considering different objective functions for calibration.
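
    For concreteness, a minimal sketch of the first kind of likelihood function mentioned above: a Gaussian AR(1) error model for hydrological residuals, of the form used in formal Bayesian uncertainty assessment. The function is generic; obs, sim, rho and sigma are placeholders to be supplied by the calibration.

    ```python
    # Minimal sketch of a formal likelihood with AR(1) autocorrelated residuals.
    import numpy as np

    def ar1_log_likelihood(obs, sim, rho, sigma):
        """Gaussian AR(1) log-likelihood of residuals e_t = obs_t - sim_t (|rho| < 1)."""
        e = np.asarray(obs) - np.asarray(sim)
        # innovations: eta_t = e_t - rho * e_{t-1}, for t >= 1
        eta = e[1:] - rho * e[:-1]
        n = len(eta)
        ll = -0.5 * n * np.log(2 * np.pi * sigma**2) - np.sum(eta**2) / (2 * sigma**2)
        # stationary term for the first residual
        var0 = sigma**2 / (1 - rho**2)
        ll += -0.5 * np.log(2 * np.pi * var0) - e[0]**2 / (2 * var0)
        return ll
    ```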

  15. Uncertainty Analysis of OC5-DeepCwind Floating Semisubmersible Offshore Wind Test Campaign

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Amy N

    This paper examines how to assess the uncertainty levels for test measurements of the Offshore Code Comparison, Continued, with Correlation (OC5)-DeepCwind floating offshore wind system, examined within the OC5 project. The goal of the OC5 project was to validate the accuracy of ultimate and fatigue load estimates from a numerical model of the floating semisubmersible using data measured during scaled tank testing of the system under wind and wave loading. The examination of uncertainty was done after the test, and it was found that the limited amount of data available did not allow for an acceptable uncertainty assessment. Therefore, this paper instead qualitatively examines the sources of uncertainty associated with this test to start a discussion of how to assess uncertainty for these types of experiments and to summarize what should be done during future testing to acquire the information needed for a proper uncertainty assessment. Foremost, future validation campaigns should initiate numerical modeling before testing to guide the test campaign, which should include a rigorous assessment of uncertainty, and perform validation during testing to ensure that the tests address all of the validation needs.

  16. Uncertainty Analysis of OC5-DeepCwind Floating Semisubmersible Offshore Wind Test Campaign: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Amy N

    This paper examines how to assess the uncertainty levels for test measurements of the Offshore Code Comparison, Continued, with Correlation (OC5)-DeepCwind floating offshore wind system, examined within the OC5 project. The goal of the OC5 project was to validate the accuracy of ultimate and fatigue load estimates from a numerical model of the floating semisubmersible using data measured during scaled tank testing of the system under wind and wave loading. The examination of uncertainty was done after the test, and it was found that the limited amount of data available did not allow for an acceptable uncertainty assessment. Therefore, this paper instead qualitatively examines the sources of uncertainty associated with this test to start a discussion of how to assess uncertainty for these types of experiments and to summarize what should be done during future testing to acquire the information needed for a proper uncertainty assessment. Foremost, future validation campaigns should initiate numerical modeling before testing to guide the test campaign, which should include a rigorous assessment of uncertainty, and perform validation during testing to ensure that the tests address all of the validation needs.

  17. Integrating expert opinion with modelling for quantitative multi-hazard risk assessment in the Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.

    2016-11-01

    Extreme rainfall events are the main triggers of hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of the lack of data related to past events and causal factors, and because of the interactions between different types of hazards. This lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and floods, in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and numbers of people, physical vulnerability assessment, generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability, resulting in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared to the historical damage reports.

  18. An uncertainty analysis of air pollution externalities from road transport in Belgium in 2010.

    PubMed

    Int Panis, L; De Nocker, L; Cornelis, E; Torfs, R

    2004-12-01

    Although stricter standards for vehicles will reduce emissions to air significantly by 2010, a number of problems will remain, especially those related to particulate concentrations in cities, ground-level ozone, and CO2. To evaluate the impacts of new policy measures, tools need to be available that assess the potential benefits of these measures in terms of the vehicle fleet, fuel choice, modal choice, kilometers driven, emissions, and the impacts on public health and related external costs. The ExternE accounting framework offers the most up-to-date and comprehensive methodology to assess marginal external costs of energy-related pollutants. It combines emission models and air dispersion models at local and regional scales with dose-response functions and valuation rules. Vito has extended this accounting framework with data and models related to the future composition of the vehicle fleet and transportation demand in order to evaluate the impact of new policy proposals on air quality and aggregated (total) external costs by 2010. Special attention was given to uncertainty analysis. The uncertainties of more than 100 different parameters were combined in Monte Carlo simulations to assess the range of possible outcomes and the main drivers of these results. Although the impacts from emission standards and total fleet mileage look dominant at first, a number of other factors were found to be important as well, including the number of diesel vehicles, inspection and maintenance (high-emitter cars), use of air conditioning, and heavy-duty transit traffic.
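
    The Monte Carlo logic can be sketched in a few lines: draw each uncertain parameter from its distribution, combine them into an external cost, and read off percentiles of the resulting distribution. The three-parameter example below is a hypothetical illustration, not the 100-plus-parameter ExternE setup.

    ```python
    # Hedged Monte Carlo sketch of external-cost uncertainty propagation.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    fleet_km = rng.normal(9.0e10, 0.5e10, n)                 # vehicle-km per year
    emission_factor = rng.lognormal(np.log(0.03), 0.3, n)    # g PM per vehicle-km
    damage_cost = rng.lognormal(np.log(60.0), 0.5, n)        # EUR per kg PM

    external_cost = fleet_km * emission_factor / 1000.0 * damage_cost  # EUR/yr

    print("median:", np.median(external_cost))
    print("90% interval:", np.percentile(external_cost, [5, 95]))
    ```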

  19. Uncertainties in the production of p nuclides in thermonuclear supernovae determined by Monte Carlo variations

    NASA Astrophysics Data System (ADS)

    Nishimura, N.; Rauscher, T.; Hirschi, R.; Murphy, A. St J.; Cescutti, G.; Travaglio, C.

    2018-03-01

    Thermonuclear supernovae originating from the explosion of a white dwarf accreting mass from a companion star have been suggested as a site for the production of p nuclides. Such nuclei are produced during the explosion, in layers enriched with seed nuclei coming from prior strong s processing. These seeds are transformed into proton-richer isotopes mainly by photodisintegration reactions. Several thousand trajectories from a 2D explosion model were used in a Monte Carlo approach. Temperature-dependent uncertainties were assigned individually to thousands of rates varied simultaneously in post-processing in an extended nuclear reaction network. The uncertainties in the final nuclear abundances originating from uncertainties in the astrophysical reaction rates were determined. In addition to the 35 classical p nuclides, abundance uncertainties were also determined for the radioactive nuclides 92Nb, 97,98Tc, and 146Sm, and for the abundance ratios Y(92Mo)/Y(94Mo), Y(92Nb)/Y(92Mo), Y(97Tc)/Y(98Ru), Y(98Tc)/Y(98Ru), and Y(146Sm)/Y(144Sm), important for Galactic Chemical Evolution studies. Uncertainties found were generally lower than a factor of 2, although most nucleosynthesis flows mainly involve predicted rates with larger uncertainties. The main contribution to the total uncertainties comes from a group of trajectories with high peak density originating from the interior of the exploding white dwarf. The distinction between low-density and high-density trajectories allows more general conclusions to be drawn, also applicable to other simulations of white dwarf explosions.

  20. Reducing, Maintaining, or Escalating Uncertainty? The Development and Validation of Four Uncertainty Preference Scales Related to Cancer Information Seeking and Avoidance.

    PubMed

    Carcioppolo, Nick; Yang, Fan; Yang, Qinghua

    2016-09-01

    Uncertainty is a central characteristic of many aspects of cancer prevention, screening, diagnosis, and treatment. Brashers's (2001) uncertainty management theory details the multifaceted nature of uncertainty and describes situations in which uncertainty can both positively and negatively affect health outcomes. The current study extends theory on uncertainty management by developing four scale measures of uncertainty preferences in the context of cancer. Two national surveys were conducted to validate the scales and assess convergent and concurrent validity. Results support the factor structure of each measure and provide general support across multiple validity assessments. These scales can advance research on uncertainty and cancer communication by providing researchers with measures that address multiple aspects of uncertainty management.

  1. Development and comparison of Bayesian modularization method in uncertainty assessment of hydrological models

    NASA Astrophysics Data System (ADS)

    Li, L.; Xu, C.-Y.; Engeland, K.

    2012-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, different approaches have been used in hydrological modelling. The Bayesian method is one of the most widely used for uncertainty assessment of hydrological models, as it incorporates different sources of information into a single analysis through Bayes' theorem. However, none of these applications treats the uncertainty in extreme flows of hydrological simulations well. This study proposes a Bayesian modularization approach for uncertainty assessment of conceptual hydrological models that considers the extreme flows. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and traditional Bayesian approaches using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used with the traditional Bayesian approach: the AR(1) plus Normal, time-period-independent model (Model 1); the AR(1) plus Normal, time-period-dependent model (Model 2); and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimates of the entire flow range and in terms of applicability and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models via Bayesian inference. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
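
    A bare-bones sketch of the Metropolis-Hastings sampler underlying such analyses is given below; log_posterior would combine one of the likelihood functions above with the priors, and the step size is a tuning placeholder. It is demonstrated on a toy Gaussian target, not on WASMOD.

    ```python
    # Minimal random-walk Metropolis-Hastings sketch (illustrative only).
    import numpy as np

    def metropolis_hastings(log_posterior, theta0, step, n_iter, seed=0):
        rng = np.random.default_rng(seed)
        theta = np.asarray(theta0, dtype=float)
        lp = log_posterior(theta)
        chain = np.empty((n_iter, theta.size))
        for i in range(n_iter):
            proposal = theta + rng.normal(0.0, step, size=theta.size)
            lp_prop = log_posterior(proposal)
            if np.log(rng.uniform()) < lp_prop - lp:   # accept with prob min(1, ratio)
                theta, lp = proposal, lp_prop
            chain[i] = theta
        return chain

    # Toy posterior (standard normal), for demonstration only:
    chain = metropolis_hastings(lambda t: -0.5 * np.sum(t**2), [0.0, 0.0],
                                step=0.5, n_iter=5000)
    print(chain.mean(axis=0), chain.std(axis=0))
    ```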

  2. Locus of control, quality of life, anxiety, and depression among Malaysian breast cancer patients: The mediating role of uncertainty.

    PubMed

    Pahlevan Sharif, Saeed

    2017-04-01

    The main objective of this study was to investigate the mediating role of uncertainty in the relationships between locus of control and quality of life, anxiety, and depression. A descriptive and correlational survey was conducted in a private hospital in Kuala Lumpur, Malaysia. A convenience sample of 118 Malaysian breast cancer patients voluntarily participated in the study and responded to a set of questionnaires including a socio-demographic questionnaire, the short form of the Locus of Control Scale, the Functional Assessment of Cancer Therapy-Breast (FACT-B), the Hospital Anxiety and Depression Scale (HADS), and the Short-Form Mishel Uncertainty in Illness Scale (SF-MUIS). The results revealed that breast cancer patients with higher internal locus of control and lower external locus of control experience a higher quality of life, lower anxiety, and lower depression. Also, uncertainty mediated the relationship between locus of control and both quality of life and depression (quasi-significant). The findings indicate the need for early, targeted psychological interventions seeking to gradually shift cancer patients' locus of control from external to internal in order to improve their quality of life and reduce their depression and anxiety. Moreover, by providing relevant information to cancer patients, especially externally oriented patients, health care providers can reduce patients' uncertainty, which in turn would improve their quality of life. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Influence of model reduction on uncertainty of flood inundation predictions

    NASA Astrophysics Data System (ADS)

    Romanowicz, R. J.; Kiczko, A.; Osuch, M.

    2012-04-01

    Derivation of flood risk maps requires an estimation of the maximum inundation extent for a flood with an assumed probability of exceedance, e.g. a 100- or 500-year flood. The results of numerical simulations of flood wave propagation are used to overcome the lack of relevant observations. In practice, deterministic 1-D models are used for flow routing, giving a simplified image of the flood wave propagation process. The solution of a 1-D model depends on the simplifications to the model structure, the initial and boundary conditions, and the estimates of model parameters, which are usually identified by solving the inverse problem based on the available noisy observations. Therefore, there is a large uncertainty involved in the derivation of flood risk maps. In this study we examine the influence of model structure simplifications on estimates of flood extent for an urban river reach. As the study area we chose the Warsaw reach of the River Vistula, where nine bridges and several dikes are located. The aim of the study is to examine the influence of water structures on the derived model roughness parameters, with all the bridges and dikes taken into account, with a reduced number, and without any water infrastructure. The results indicate that the roughness parameter values of a 1-D HEC-RAS model can be adjusted to compensate for the reduction in model structure; however, the price we pay is reduced model robustness. Apart from the relatively simple question of reducing model structure, we also try to answer more fundamental questions regarding the relative importance of input, model structure simplification, parametric and rating curve uncertainty to the uncertainty of flood extent estimates. We apply pseudo-Bayesian methods of uncertainty estimation and Global Sensitivity Analysis as the main methodological tools. The results indicate that these uncertainties have a substantial influence on flood risk assessment, and we present a simplified methodology allowing their influence to be assessed. This work was supported by the National Science Centre of Poland (grant 2011/01/B/ST10/06866).

  4. On the Concept and Definition of Terrorism Risk.

    PubMed

    Aven, Terje; Guikema, Seth

    2015-12-01

    In this article, we provide some reflections on how to define and understand the concept of terrorism risk in a professional risk assessment context. As a basis for this discussion we introduce a set of criteria that we believe should apply to any conceptualization of terrorism risk. These criteria are based both on criteria used in other areas of risk analysis and on our experience with terrorism risk analysis; that is, they reflect our perspective. We show that several of the suggested perspectives and definitions have weaknesses in relation to these criteria. A main problem identified is the idea that terrorism risk can be conceptualized as a function of probability and consequence rather than of the interactions between adaptive individuals and organizations. We argue that perspectives based solely on probability and consequence should be used cautiously or not at all, because they fail to reflect the essential features of the concept of terrorism risk: the threats and attacks, their consequences, and the uncertainties, all in the context of adaptation by the adversaries. These three elements should in our view constitute the main pillars of the terrorism risk concept. From this concept we can develop methods for assessing the risk by identifying a set of threats, attacks, and consequence measures associated with the possible outcome scenarios, together with a description of the uncertainties and interactions between the adversaries. © 2015 Society for Risk Analysis.

  5. Development of probabilistic internal dosimetry computer code

    NASA Astrophysics Data System (ADS)

    Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki

    2017-02-01

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which propagates into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was designed. Based on this system, we developed a probabilistic internal-dose-assessment code in MATLAB to estimate dose distributions from measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose, in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations. In cases of severe internal exposure, the causation probability of a deterministic health effect can be derived from the dose distribution, and a high statistical value (e.g., the 95th percentile of the distribution) can be used to determine the appropriate intervention. The distribution-based sensitivity analysis can also be used to quantify the contribution of each factor to the dose uncertainty, which is essential information for reducing and optimizing the uncertainty in the internal dose assessment. Therefore, the present study can contribute to retrospective dose assessment for accidental internal exposure scenarios, as well as to internal dose monitoring optimization and uncertainty reduction.
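
    The Monte Carlo propagation step can be sketched as follows (in Python, as a stand-in for the MATLAB implementation described): uncertainty in the measured activity, in the biokinetic retention fraction, and in the dose coefficient is sampled and combined into a distribution of committed dose. All distributions and constants are hypothetical placeholders, not values from the developed code.

    ```python
    # Hedged sketch of probabilistic intake/dose estimation by Monte Carlo.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 50_000

    measured = rng.normal(120.0, 15.0, n)                # whole-body activity (Bq), counting error
    m_t = rng.lognormal(np.log(0.05), 0.3, n)            # retained fraction at measurement time
    dose_coeff = rng.lognormal(np.log(2.0e-8), 0.4, n)   # committed dose coefficient (Sv/Bq)

    intake = measured / m_t                              # back-calculated intake (Bq)
    dose_sv = intake * dose_coeff                        # committed effective dose (Sv)

    for p in (2.5, 5, 50, 95, 97.5):
        print(f"{p:>5}th percentile: {np.percentile(dose_sv, p):.2e} Sv")
    ```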

  6. Methods for Assessing Uncertainties in Climate Change, Impacts and Responses (Invited)

    NASA Astrophysics Data System (ADS)

    Manning, M. R.; Swart, R.

    2009-12-01

    Assessing the scientific uncertainties or confidence levels for the many different aspects of climate change is particularly important because of the seriousness of potential impacts and the magnitude of economic and political responses that are needed to mitigate climate change effectively. This has made the treatment of uncertainty and confidence a key feature in the assessments carried out by the Intergovernmental Panel on Climate Change (IPCC). Because climate change is very much a cross-disciplinary area of science, adequately dealing with uncertainties requires recognition of their wide range and of different perspectives on assessing and communicating those uncertainties. The structural differences that exist across disciplines are often embedded deeply in the corresponding literature that is used as the basis for an IPCC assessment. The assessment of climate change science by the IPCC has from its outset tried to report the levels of confidence and uncertainty in the degree of understanding in both the underlying multi-disciplinary science and in projections for future climate. The growing recognition of the seriousness of this led to the formation of a detailed approach for consistent treatment of uncertainties in the IPCC's Third Assessment Report (TAR) [Moss and Schneider, 2000]. However, in completing the TAR there remained some systematic differences between the disciplines, raising concerns about the level of consistency. So further consideration of a systematic approach to uncertainties was undertaken for the Fourth Assessment Report (AR4). The basis for the approach used in the AR4 was developed at an expert meeting of scientists representing many different disciplines. This led to the introduction of a broader way of addressing uncertainties in the AR4 [Manning et al., 2004], which was further refined by lengthy discussions among many IPCC Lead Authors over a year, resulting in a short summary of a standard approach to be followed for that assessment [IPCC, 2005]. This paper extends a review of the treatment of uncertainty in the IPCC assessments by Swart et al. [2009]. It is shown that progress towards consistency has been made, but that there also appears to be a need for continued use of several complementary approaches in order to cover the wide range of circumstances across the different disciplines involved in climate change. While this reflects the situation in the science community, it also raises the level of complexity for policymakers and other users of the assessments, who would prefer one common consensus approach. References: IPCC (2005), Guidance Notes for Lead Authors of the IPCC Fourth Assessment Report on Addressing Uncertainties, IPCC, Geneva. Manning, M., et al. (2004), IPCC Workshop on Describing Scientific Uncertainties in Climate Change to Support Analysis of Risk and of Options, IPCC. Moss, R., and S. Schneider (2000), Uncertainties, in Guidance Papers on the Cross Cutting Issues of the Third Assessment Report of the IPCC, edited by R. Pachauri et al., IPCC, Geneva. Swart, R., et al. (2009), Agreeing to disagree: uncertainty management in assessing climate change, impacts and responses by the IPCC, Climatic Change, 92(1-2), 1-29.

  7. Climate data induced uncertainty in model based estimations of terrestrial primary productivity

    NASA Astrophysics Data System (ADS)

    Wu, Z.; Ahlström, A.; Smith, B.; Ardö, J.; Eklundh, L.; Fensholt, R.; Lehsten, V.

    2016-12-01

    Models used to project global vegetation and the carbon cycle differ in their estimates of historical fluxes and pools. These differences arise not only from differences between models but also from differences in the environmental and climatic data that force the models. Here we investigate the role of uncertainties in historical climate data, encapsulated by a set of six historical climate datasets. We focus on terrestrial gross primary productivity (GPP) and analyze the results from a dynamic process-based vegetation model (LPJ-GUESS) forced by six different climate datasets, together with two empirical datasets of GPP (derived from flux towers and remote sensing). We find that the climate-induced uncertainty, defined as the difference among historical simulations in GPP when forcing the model with the different climate datasets, can be as high as 33 Pg C yr-1 globally (19% of mean GPP). The uncertainty is partitioned into the three main climatic drivers: temperature, precipitation, and shortwave radiation. Additionally, we illustrate how the uncertainty due to a given climate driver depends both on the magnitude of the forcing data uncertainty (the data range) and on the sensitivity of the modeled GPP to the driver (the ecosystem sensitivity). The analysis is performed globally and stratified into five land cover classes. We find that the dynamic vegetation model overestimates GPP compared to empirically based GPP data over most areas, except for the tropical region. Both the simulations and the empirical estimates agree that the tropical region is a disproportionate source of uncertainty in GPP estimation. This is mainly caused by uncertainties in shortwave radiation forcing, to which the climate data range contributes slightly more uncertainty than the ecosystem sensitivity to shortwave radiation. We also find that precipitation dominates the climate-induced uncertainty over nearly half of the terrestrial vegetated surface, mainly due to the large ecosystem sensitivity to precipitation. Overall, climate data ranges are found to contribute more to the climate-induced uncertainty than ecosystem sensitivity. Our study highlights the need to better constrain tropical climate and demonstrates that uncertainty caused by climatic forcing data must be considered when comparing and evaluating model results and empirical datasets.
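
    The headline measure of climate-induced uncertainty, i.e. the spread in simulated GPP across the forcing datasets, amounts to a simple range computation, sketched below with invented numbers for the six datasets.

    ```python
    # Hedged sketch: spread across simulations from different climate forcings.
    import numpy as np

    # GPP from the same model driven by six climate datasets (Pg C yr-1), invented:
    gpp = np.array([168.0, 175.5, 182.0, 159.0, 171.0, 192.0])
    spread = gpp.max() - gpp.min()
    print(f"climate-induced uncertainty: {spread:.1f} Pg C yr-1 "
          f"({spread / gpp.mean():.0%} of mean GPP)")
    ```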

  8. The Importance of Model Structure in the Cost-Effectiveness Analysis of Primary Care Interventions for the Management of Hypertension.

    PubMed

    Peñaloza-Ramos, Maria Cristina; Jowett, Sue; Sutton, Andrew John; McManus, Richard J; Barton, Pelham

    2018-03-01

    Management of hypertension can lead to significant reductions in blood pressure, thereby reducing the risk of cardiovascular disease. Modeling the course of cardiovascular disease is not without complications, and uncertainty surrounding the structure of a model will almost always arise once a model structure is chosen. To provide a practical illustration of the impact on cost-effectiveness results of changing or adapting model structures, we revisit a previously published cost-utility analysis of a primary care intervention for the management of hypertension, Targets and Self-Management for the Control of Blood Pressure in Stroke and at Risk Groups (TASMIN-SR). The case study assessed the structural uncertainty arising from the choice of model structure and from the exclusion of secondary events. Four alternative model structures were implemented. Long-term cost-effectiveness was estimated and the results were compared with those from the TASMIN-SR model. The main cost-effectiveness results obtained in the TASMIN-SR study did not change with the implementation of alternative model structures. The choice of model type was limited to a cohort Markov model, and because of the lack of epidemiological data, only model 4 captured structural uncertainty arising from the exclusion of secondary events in the case study model. The results of this study indicate that the main conclusions drawn from the TASMIN-SR model of cost-effectiveness were robust to changes in model structure and the inclusion of secondary events. Even though one of the models produced results that differed from those of TASMIN-SR, the fact that the main conclusions were identical suggests that a more parsimonious model may have sufficed. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  9. Estimation of chemical emissions from down-the-drain consumer products using consumer survey data at a country and wastewater treatment plant level.

    PubMed

    Douziech, Mélanie; van Zelm, Rosalie; Oldenkamp, Rik; Franco, Antonio; Hendriks, A Jan; King, Henry; Huijbregts, Mark A J

    2018-02-01

    Deriving reliable estimates of chemical emissions to the environment is a key challenge for impact and risk assessment methods, and typically the associated uncertainty is not characterised. We have developed an approach to spatially quantify annual chemical emission loads to the aquatic environment, together with their associated uncertainty, using consumer survey data and publicly accessible, non-confidential data sources. The approach is applicable for chemicals widely used across a product sector. Product usage data from consumer survey studies in France, the Netherlands, South Korea and the USA were combined with information on typical product formulations, wastewater removal rates, and the spatial distribution of populations and wastewater treatment plants (WWTPs) in the four countries. Results are presented for three chemicals common to three types of personal care products (shampoo, conditioner, and bodywash) at WWTP and national levels. Uncertainty in WWTP-specific emission estimates was characterised with a 95% confidence interval and ranged up to a factor of 4.8 around the mean, mainly due to uncertainty associated with removal efficiency. Estimates of whole-country product usage were comparable to total market estimates derived from sectorial market sales data, with differences ranging from a factor of 0.8 (for the Netherlands) to 5 (for the USA). The proposed approach is suitable where measured data on chemical emissions are missing and is applicable for use in risk assessments and chemical footprinting methods when applied to specific product categories. Copyright © 2017 Elsevier Ltd. All rights reserved.
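
    The core of the emission estimate is multiplicative: population served, per-capita product use, chemical inclusion level in the formulation, and the fraction passing the WWTP. A hedged Monte Carlo sketch with invented parameter values follows.

    ```python
    # Hedged sketch of a WWTP-level emission-load estimate with uncertainty.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000

    population = 250_000                                  # people served by one WWTP
    use_g_day = rng.lognormal(np.log(6.0), 0.4, n)        # product use, g/person/day
    inclusion = rng.uniform(0.005, 0.02, n)               # chemical fraction in product
    removal = rng.beta(20, 2, n)                          # WWTP removal efficiency

    load_kg_yr = population * use_g_day * inclusion * (1 - removal) * 365 / 1000.0

    print("mean load:", load_kg_yr.mean(), "kg/yr")
    print("95% CI:", np.percentile(load_kg_yr, [2.5, 97.5]))
    ```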

  10. Measuring Research Data Uncertainty in the 2010 NRC Assessment of Geography Graduate Education

    ERIC Educational Resources Information Center

    Shortridge, Ashton; Goldsberry, Kirk; Weessies, Kathleen

    2011-01-01

    This article characterizes and measures errors in the 2010 National Research Council (NRC) assessment of research-doctorate programs in geography. This article provides a conceptual model for data-based sources of uncertainty and reports on a quantitative assessment of NRC research data uncertainty for a particular geography doctoral program.…

  11. Uncertainty quantification in flood risk assessment

    NASA Astrophysics Data System (ADS)

    Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto

    2017-04-01

    Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties, and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with assessing the existing risk from extreme events, additional uncertainty arises because of temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take into account all these sources of uncertainty. They should be based on an understanding of how flood extremes are generated and how they change over time. They should also account for the dynamics of risk perception of decision makers and of the population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach of integrated flood risk management aiming at enhancing resilience rather than searching for optimal defense strategies.

  12. Assessment of the national schistosomiasis control program in a typical region along the Yangtze River, China.

    PubMed

    Hu, Yi; Li, Si; Xia, Congcong; Chen, Yue; Lynn, Henry; Zhang, Tiejun; Xiong, Chenglong; Chen, Gengxin; He, Zonggui; Zhang, Zhijie

    2017-01-01

    Schistosomiasis remains a major public health problem in eastern China, particularly along the Yangtze River Basin. The latest national schistosomiasis control program (NSCP) was implemented in 2005 with the main goal of reducing the rate of infection to less than 5% by 2008 and 1% by 2015. To assess the progress, we applied a Bayesian spatio-temporal model to describe dynamics of schistosomiasis in Guichi, Anhui Province, China, using annual parasitological and environmental data collected within 41 sample villages for the period 2005-2011. Predictive maps of schistosomiasis showed that the disease prevalence remains constant and low. Results of uncertainty analysis, in the form of probability contour maps (PCMs), indicated that the first goal of "infection rate less than 5% by 2008" was fully achieved in the study area. More longitudinal data for schistosomiasis are needed for the assessment of the second goal of "infection rate less than 1% by 2015". Compared with the traditional way of mapping uncertainty (e.g., variance or mean-square error), our PCMs provide more realistic information for schistosomiasis control. Copyright © 2016 Australian Society for Parasitology. Published by Elsevier Ltd. All rights reserved.

  13. Plurality of Type A evaluations of uncertainty

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Pintar, Adam L.

    2017-10-01

    The evaluations of measurement uncertainty involving the application of statistical methods to measurement data (Type A evaluations as specified in the Guide to the Expression of Uncertainty in Measurement, GUM) comprise the following three main steps: (i) developing a statistical model that captures the pattern of dispersion or variability in the experimental data, and that relates the data either to the measurand directly or to some intermediate quantity (input quantity) that the measurand depends on; (ii) selecting a procedure for data reduction that is consistent with this model and that is fit for the purpose that the results are intended to serve; (iii) producing estimates of the model parameters, or predictions based on the fitted model, and evaluations of uncertainty that qualify either those estimates or these predictions, and that are suitable for use in subsequent uncertainty propagation exercises. We illustrate these steps in uncertainty evaluations related to the measurement of the mass fraction of vanadium in a bituminous coal reference material, including the assessment of the homogeneity of the material, and to the calibration and measurement of the amount-of-substance fraction of a hydrochlorofluorocarbon in air, and of the age of a meteorite. Our goal is to expose the plurality of choices that can reasonably be made when taking each of the three steps outlined above, and to show that different choices typically lead to different estimates of the quantities of interest, and to different evaluations of the associated uncertainty. In all the examples, the several alternatives considered represent choices that comparably competent statisticians might make, but who differ in the assumptions that they are prepared to rely on, and in their selection of approach to statistical inference. They represent also alternative treatments that the same statistician might give to the same data when the results are intended for different purposes.
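
    Step (iii) in its simplest form can be made concrete with a few lines of code: under the most common model choice (independent observations with a common mean), the Type A evaluation is the average and the standard deviation of the mean. The numbers are illustrative, not from the vanadium, hydrochlorofluorocarbon, or meteorite examples.

    ```python
    # Minimal sketch of a Type A evaluation under the simplest statistical model.
    import numpy as np

    x = np.array([10.12, 10.15, 10.11, 10.18, 10.13, 10.16])  # repeated measurements

    n = len(x)
    estimate = x.mean()
    u = x.std(ddof=1) / np.sqrt(n)      # Type A standard uncertainty of the mean

    print(f"estimate = {estimate:.4f}, u = {u:.4f} (nu = {n - 1} degrees of freedom)")
    # A different, equally defensible model choice (e.g. a random-effects model
    # across measurement days) would generally yield a different u -- the
    # plurality of evaluations the authors emphasize.
    ```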

  14. Biophysical and Economic Uncertainty in the Analysis of Poverty Impacts of Climate Change

    NASA Astrophysics Data System (ADS)

    Hertel, T. W.; Lobell, D. B.; Verma, M.

    2011-12-01

    This paper seeks to understand the main sources of uncertainty in assessing the impacts of climate change on agricultural output, international trade, and poverty. We incorporate biophysical uncertainty by sampling from a distribution of global climate model predictions for temperature and precipitation for 2050. The implications of these realizations for crop yields around the globe are estimated using the recently published statistical crop yield functions provided by Lobell, Schlenker and Costa-Roberts (2011). By comparing these yields to those predicted under the current climate, we obtain the likely change in crop yields owing to climate change. The economic uncertainty in our analysis relates to the response of the global economic system to these biophysical shocks. We use a modified version of the GTAP model to elicit the impact of the biophysical shocks on global patterns of production, consumption, trade and poverty. Uncertainty in these responses is reflected in the econometrically estimated parameters governing the responsiveness of international trade, consumption, production (and hence the intensive margin of supply response), and factor supplies (which govern the extensive margin of supply response). We sample from the distributions of these parameters as specified by Hertel et al. (2007) and Keeney and Hertel (2009). We find that, even though it is difficult to predict where in the world agricultural crops will be favorably affected by climate change, the responses of economic variables, including output and exports, can be far more robust (Table 1). This is due to the fact that supply and demand decisions depend on relative prices, and relative prices depend on productivity changes relative to other crops in a given region, or relative to similar crops in other parts of the world. We also find that uncertainty in the poverty impacts of climate change appears to be almost entirely driven by biophysical uncertainty.

  15. On-orbit servicing system assessment and optimization methods based on lifecycle simulation under mixed aleatory and epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; van Tooren, Michel

    2013-06-01

    To assess the on-orbit servicing (OOS) paradigm and optimize its utilities by taking advantage of its inherent flexibility and responsiveness, the OOS system assessment and optimization methods based on lifecycle simulation under uncertainties are studied. The uncertainty sources considered in this paper include both the aleatory (random launch/OOS operation failure and on-orbit component failure) and the epistemic (the unknown trend of the end-used market price) types. Firstly, the lifecycle simulation under uncertainties is discussed. The chronological flowchart is presented. The cost and benefit models are established, and the uncertainties thereof are modeled. The dynamic programming method to make optimal decision in face of the uncertain events is introduced. Secondly, the method to analyze the propagation effects of the uncertainties on the OOS utilities is studied. With combined probability and evidence theory, a Monte Carlo lifecycle Simulation based Unified Uncertainty Analysis (MCS-UUA) approach is proposed, based on which the OOS utility assessment tool under mixed uncertainties is developed. Thirdly, to further optimize the OOS system under mixed uncertainties, the reliability-based optimization (RBO) method is studied. To alleviate the computational burden of the traditional RBO method which involves nested optimum search and uncertainty analysis, the framework of Sequential Optimization and Mixed Uncertainty Analysis (SOMUA) is employed to integrate MCS-UUA, and the RBO algorithm SOMUA-MCS is developed. Fourthly, a case study on the OOS system for a hypothetical GEO commercial communication satellite is investigated with the proposed assessment tool. Furthermore, the OOS system is optimized with SOMUA-MCS. Lastly, some conclusions are given and future research prospects are highlighted.

  16. Hazards in volcanic arcs

    NASA Astrophysics Data System (ADS)

    Sparks, S. R.

    2008-12-01

    Volcanic eruptions in arcs are complex natural phenomena, involving the movement of magma to the Earth's surface and interactions with the surrounding crust during ascent and with the surface environment during eruption, resulting in secondary hazards. Magma changes its properties profoundly during ascent and eruption, and many of the underlying processes of heat and mass transfer and physical property changes that govern volcanic flows and magmatic interactions with the environment are highly non-linear. Major direct hazards include tephra fall, pyroclastic flows from explosions and dome collapse, volcanic blasts, lahars, debris avalanches and tsunamis. There are also health hazards related to emissions of gases and very fine volcanic ash. These hazards and progress in their assessment are illustrated mainly from the ongoing eruption of the Soufriere Hills volcano, Montserrat. There are both epistemic and aleatory uncertainties in the assessment of volcanic hazards, which can be large, making precise prediction a formidable objective. Indeed, in certain respects volcanic systems and hazardous phenomena may be intrinsically unpredictable. As with other natural phenomena, predictions and hazard assessments inevitably have to be expressed in probabilistic terms that take account of these uncertainties. Despite these limitations, significant progress is being made in the ability to anticipate volcanic activity in volcanic arcs and, in favourable circumstances, to make robust hazard assessments and predictions. Improvements in monitoring ground deformation, gas emissions and seismicity are being combined with more advanced models of volcanic flows and their interactions with the environment. In addition, more structured and systematic methods for assessing hazards and risk are emerging that allow impartial advice to be given to authorities during volcanic crises. There remain significant issues of how scientific advice and associated uncertainties are communicated to provide effective mitigation during volcanic crises.

  17. Engineering geologic and geotechnical analysis of paleoseismic shaking using liquefaction effects: Field examples

    USGS Publications Warehouse

    Green, R.A.; Obermeier, S.F.; Olson, S.M.

    2005-01-01

    The greatest impediments to the widespread acceptance of back-calculated ground motion characteristics from paleoliquefaction studies typically stem from three uncertainties: (1) the significance of changes in the geotechnical properties of post-liquefied sediments (e.g., "aging" and density changes), (2) the selection of appropriate geotechnical soil indices from individual paleoliquefaction sites, and (3) the methodology for integration of back-calculated results of strength of shaking from individual paleoliquefaction sites into a regional assessment of paleoseismic strength of shaking. Presented herein are two case studies that illustrate the methods outlined by Olson et al. [Engineering Geology, this issue] for addressing these uncertainties. The first case study is for a site near Memphis, Tennessee, wherein cone penetration test data from side-by-side locations, one of liquefaction and the other of no liquefaction, are used to readily discern that the influence of post-liquefaction "aging" and density changes on the measured in situ soil indices is minimal. In the second case study, 12 sites that are at scattered locations in the Wabash Valley and that exhibit paleoliquefaction features are analyzed. The features are first provisionally attributed to the Vincennes Earthquake, which occurred around 6100 years BP, and are used to illustrate our proposed approach for selecting representative soil indices of the liquefied sediments. These indices are used in back-calculating the strength of shaking at the individual sites, the results from which are then incorporated into a regional assessment of the moment magnitude, M, of the Vincennes Earthquake. The regional assessment validated the provisional assumption that the paleoliquefaction features at the scattered sites were, in the main, induced by the Vincennes Earthquake, which was determined to have M ≈ 7.5. The uncertainties and assumptions used in the assessment are discussed in detail. © 2004 Elsevier B.V. All rights reserved.

  18. Radiation protection issues in galactic cosmic ray risk assessment

    NASA Technical Reports Server (NTRS)

    Sinclair, W. K.

    1994-01-01

    Radiation protection involves the limitation of exposure to below threshold doses for direct (or deterministic) effects and a knowledge of the risk of stochastic effects after low doses. The principal stochastic risk associated with low-dose-rate galactic cosmic rays is the increased risk of cancer. Estimates of this risk depend on two factors: (a) estimates of cancer risk for low-LET radiation and (b) values of the appropriate radiation weighting factors, WR, for the high-LET radiations of galactic cosmic rays. Both factors are subject to considerable uncertainty. The low-LET cancer risk derived from the late effects of the atomic bombs is vulnerable to a number of uncertainties, especially those from projection in time and from extrapolation from high to low dose rate. Nevertheless, recent low-dose studies of workers and others tend to confirm these estimates. WR relies on biological effects studied mainly in non-human systems. Additional laboratory studies could reduce the uncertainties in WR and thus produce a more confident estimate of the overall risk of galactic cosmic rays.

  19. Radiation protection issues in galactic cosmic ray risk assessment.

    PubMed

    Sinclair, W K

    1994-01-01

    Radiation protection involves the limitation of exposure to below threshold doses for direct (or deterministic) effects and a knowledge of the risk of stochastic effects after low doses. The principal stochastic risk associated with low-dose-rate galactic cosmic rays is the increased risk of cancer. Estimates of this risk depend on two factors: (a) estimates of cancer risk for low-LET radiation and (b) values of the appropriate radiation weighting factors, WR, for the high-LET radiations of galactic cosmic rays. Both factors are subject to considerable uncertainty. The low-LET cancer risk derived from the late effects of the atomic bombs is vulnerable to a number of uncertainties, especially those from projection in time and from extrapolation from high to low dose rate. Nevertheless, recent low-dose studies of workers and others tend to confirm these estimates. WR relies on biological effects studied mainly in non-human systems. Additional laboratory studies could reduce the uncertainties in WR and thus produce a more confident estimate of the overall risk of galactic cosmic rays.

  20. Incorporating climate-system and carbon-cycle uncertainties in integrated assessments of climate change. (Invited)

    NASA Astrophysics Data System (ADS)

    Rogelj, J.; McCollum, D. L.; Reisinger, A.; Knutti, R.; Riahi, K.; Meinshausen, M.

    2013-12-01

    The field of integrated assessment draws from a large body of knowledge across a range of disciplines to gain robust insights about possible interactions, trade-offs, and synergies. Integrated assessment of climate change, for example, uses knowledge from the fields of energy system science, economics, geophysics, demography, climate change impacts, and many others. Each of these fields comes with its associated caveats and uncertainties, which should be taken into account when assessing any results. The geophysical system and its associated uncertainties are often represented by models of reduced complexity in integrated assessment modelling frameworks. Such models include simple representations of the carbon cycle and climate system, and are often based on the global energy balance equation. A prominent example of such a model is the 'Model for the Assessment of Greenhouse Gas Induced Climate Change', MAGICC. Here we show how a model like MAGICC can be used to represent geophysical uncertainties. Its strengths, weaknesses, and limitations are discussed and illustrated by means of an analysis which attempts to integrate socio-economic and geophysical uncertainties. Uncertainties in the geophysical response of the Earth system to greenhouse gases remain key for estimating the cost of greenhouse gas emission mitigation scenarios. We look at uncertainties in four dimensions: geophysical, technological, social and political. Our results indicate that while geophysical uncertainties are an important factor influencing projections of mitigation costs, political choices that delay mitigation by one or two decades have a much more pronounced effect.
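
    A minimal sketch of the reduced-complexity representation referred to above is a one-box global energy balance, C dT/dt = F(t) - lambda*T, stepped forward in time; MAGICC itself is far more elaborate, and the parameter values below are illustrative only.

    ```python
    # Hedged sketch: one-box energy-balance climate model with Euler stepping.
    import numpy as np

    C = 8.0        # effective heat capacity (W yr m-2 K-1), illustrative
    lam = 1.2      # climate feedback parameter (W m-2 K-1), illustrative
    dt = 1.0       # time step (years)

    years = np.arange(2000, 2101)
    forcing = np.linspace(2.0, 5.0, len(years))   # prescribed forcing ramp (W m-2)

    T = np.zeros(len(years))                      # warming relative to pre-industrial
    for i in range(1, len(years)):
        dTdt = (forcing[i - 1] - lam * T[i - 1]) / C
        T[i] = T[i - 1] + dTdt * dt

    print(f"warming in 2100: {T[-1]:.2f} K (equilibrium for F=5: {5.0/lam:.2f} K)")
    # Sampling lam (and C) from distributions is one way such reduced models
    # represent geophysical uncertainty in integrated assessment.
    ```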

  1. A new Method for the Estimation of Initial Condition Uncertainty Structures in Mesoscale Models

    NASA Astrophysics Data System (ADS)

    Keller, J. D.; Bach, L.; Hense, A.

    2012-12-01

    The estimation of fast growing error modes of a system is a key interest of ensemble data assimilation when assessing uncertainty in initial conditions. Over the last two decades three methods (and variations of these methods) have evolved for global numerical weather prediction models: ensemble Kalman filter, singular vectors and breeding of growing modes (or now ensemble transform). While the former incorporates a priori model error information and observation error estimates to determine ensemble initial conditions, the latter two techniques directly address the error structures associated with Lyapunov vectors. However, in global models these structures are mainly associated with transient global wave patterns. When assessing initial condition uncertainty in mesoscale limited area models, several problems regarding the aforementioned techniques arise: (a) additional sources of uncertainty on the smaller scales contribute to the error and (b) error structures from the global scale may quickly move through the model domain (depending on the size of the domain). To address the latter problem, perturbation structures from global models are often included in the mesoscale predictions as perturbed boundary conditions. However, the initial perturbations (when used) are often generated with a variant of an ensemble Kalman filter which does not necessarily focus on the large scale error patterns. In the framework of the European regional reanalysis project of the Hans-Ertel-Center for Weather Research we use a mesoscale model with an implemented nudging data assimilation scheme which does not support ensemble data assimilation at all. In preparation for an ensemble-based regional reanalysis and for the estimation of three-dimensional atmospheric covariance structures, we implemented a new method for the assessment of fast growing error modes for mesoscale limited area models. The so-called self-breeding is a development based on the breeding of growing modes technique. Initial perturbations are integrated forward for a short time period and then rescaled and added to the initial state again. Iterating this rapid breeding cycle provides estimates for the initial uncertainty structure (or local Lyapunov vectors) given a specific norm. To prevent all ensemble perturbations from converging towards the leading local Lyapunov vector, we apply an ensemble transform variant to orthogonalize the perturbations in the sub-space spanned by the ensemble. By choosing different kinds of norms to measure perturbation growth, this technique allows for estimating uncertainty patterns targeted at specific sources of errors (e.g. convection, turbulence). With case study experiments we show applications of the self-breeding method for different sources of uncertainty and different horizontal scales.
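
    To make the self-breeding cycle concrete, the following minimal Python sketch iterates it on a Lorenz-96 toy model standing in for the mesoscale model: perturbations are integrated forward, rescaled to their initial amplitude, re-added to the initial state, and orthogonalized in the ensemble subspace (a QR factorization standing in for the ensemble transform). All model choices and parameter values are illustrative assumptions, not part of the reanalysis system.

        import numpy as np

        def lorenz96_step(x, dt=0.01, forcing=8.0):
            # One Euler step of the Lorenz-96 toy model (stand-in for the mesoscale model):
            # dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F
            dxdt = (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing
            return x + dt * dxdt

        def integrate(x, n_steps):
            for _ in range(n_steps):
                x = lorenz96_step(x)
            return x

        def self_breeding(x0, n_ens=5, n_cycles=50, steps_per_cycle=20, amplitude=1e-3, seed=None):
            # Iterate short breeding cycles: evolve perturbations, rescale, orthogonalize.
            rng = np.random.default_rng(seed)
            n = x0.size
            perts = amplitude * rng.standard_normal((n_ens, n))     # random initial perturbations
            for _ in range(n_cycles):
                ctrl = integrate(x0, steps_per_cycle)               # control (unperturbed) forecast
                evolved = np.array([integrate(x0 + p, steps_per_cycle) - ctrl for p in perts])
                # rescale each evolved perturbation back to the initial amplitude (Euclidean norm)
                evolved *= amplitude * np.sqrt(n) / np.linalg.norm(evolved, axis=1, keepdims=True)
                # orthogonalize in the ensemble subspace so members do not all collapse
                # onto the leading local Lyapunov vector (QR in place of an ensemble transform)
                q, _ = np.linalg.qr(evolved.T)
                perts = amplitude * np.sqrt(n) * q.T[:n_ens]
            return perts    # estimates of the fast-growing initial uncertainty structures

        x0 = 8.0 + 0.1 * np.random.default_rng(0).standard_normal(40)
        modes = self_breeding(x0, seed=1)
        print(modes.shape)  # (5, 40)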

  2. Assessing the Performance of Sensor Fusion Methods: Application to Magnetic-Inertial-Based Human Body Tracking

    PubMed Central

    Ligorio, Gabriele; Bergamini, Elena; Pasciuto, Ilaria; Vannozzi, Giuseppe; Cappozzo, Aurelio; Sabatini, Angelo Maria

    2016-01-01

    Information from complementary and redundant sensors is often combined within sensor fusion algorithms to obtain a single accurate observation of the system at hand. However, measurements from each sensor are characterized by uncertainties. When multiple data are fused, it is often unclear how all these uncertainties interact and influence the overall performance of the sensor fusion algorithm. To address this issue, a benchmarking procedure is presented, where simulated and real data are combined in different scenarios in order to quantify how each sensor’s uncertainties influence the accuracy of the final result. The proposed procedure was applied to the estimation of the pelvis orientation using a waist-worn magnetic-inertial measurement unit. Ground-truth data were obtained from a stereophotogrammetric system and used to obtain simulated data. Two Kalman-based sensor fusion algorithms were submitted to the proposed benchmarking procedure. For the considered application, gyroscope uncertainties proved to be the main error source in orientation estimation accuracy for both tested algorithms. Moreover, although different performances were obtained using simulated data, these differences became negligible when real data were considered. The outcome of this evaluation may be useful both to improve the design of new sensor fusion methods and to drive the algorithm tuning process. PMID:26821027

  3. Assessing the Performance of Sensor Fusion Methods: Application to Magnetic-Inertial-Based Human Body Tracking.

    PubMed

    Ligorio, Gabriele; Bergamini, Elena; Pasciuto, Ilaria; Vannozzi, Giuseppe; Cappozzo, Aurelio; Sabatini, Angelo Maria

    2016-01-26

    Information from complementary and redundant sensors is often combined within sensor fusion algorithms to obtain a single accurate observation of the system at hand. However, measurements from each sensor are characterized by uncertainties. When multiple data are fused, it is often unclear how all these uncertainties interact and influence the overall performance of the sensor fusion algorithm. To address this issue, a benchmarking procedure is presented, where simulated and real data are combined in different scenarios in order to quantify how each sensor's uncertainties influence the accuracy of the final result. The proposed procedure was applied to the estimation of the pelvis orientation using a waist-worn magnetic-inertial measurement unit. Ground-truth data were obtained from a stereophotogrammetric system and used to obtain simulated data. Two Kalman-based sensor fusion algorithms were submitted to the proposed benchmarking procedure. For the considered application, gyroscope uncertainties proved to be the main error source in orientation estimation accuracy for both tested algorithms. Moreover, although different performances were obtained using simulated data, these differences became negligible when real data were considered. The outcome of this evaluation may be useful both to improve the design of new sensor fusion methods and to drive the algorithm tuning process.
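
    The benchmarking idea, injecting one sensor's uncertainty at a time into otherwise ideal simulated data and measuring its effect on the fused estimate, can be sketched in a few lines. The sketch below deliberately uses a simple 1-D complementary filter rather than the Kalman-based algorithms of the paper, and all noise levels are invented:

        import numpy as np

        def fuse(gyro_rate, incl_angle, dt=0.01, k=0.02):
            # Tiny 1-D complementary filter: integrate the gyro, correct with inclination.
            est = np.zeros_like(incl_angle)
            for t in range(1, len(est)):
                pred = est[t - 1] + gyro_rate[t] * dt
                est[t] = (1 - k) * pred + k * incl_angle[t]
            return est

        rng = np.random.default_rng(1)
        dt, n = 0.01, 5000
        truth = 0.5 * np.sin(2 * np.pi * 0.3 * np.arange(n) * dt)   # ground-truth angle (rad)
        rate = np.gradient(truth, dt)                               # ideal gyro signal

        # benchmark: add noise to one sensor at a time, keep the other one ideal
        for src, (gyro_sd, incl_sd) in {"gyroscope": (0.05, 0.0), "inclination": (0.0, 0.05)}.items():
            gyro = rate + gyro_sd * rng.standard_normal(n)
            incl = truth + incl_sd * rng.standard_normal(n)
            rmse = np.sqrt(np.mean((fuse(gyro, incl, dt) - truth) ** 2))
            print(f"noise on {src}: RMSE = {rmse:.4f} rad")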

  4. Probabilistic evaluation of uncertainties and risks in aerospace components

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-01-01

    This paper summarizes a methodology developed at NASA Lewis Research Center which computationally simulates the structural, material, and load uncertainties associated with Space Shuttle Main Engine (SSME) components. The methodology was applied to evaluate the scatter in static, buckling, dynamic, fatigue, and damage behavior of the SSME turbopump blade. Also calculated are the probability densities of typical critical blade responses, such as effective stress, natural frequency, damage initiation, most probable damage path, etc. Risk assessments were performed for different failure modes, and the effect of material degradation on the fatigue and damage behaviors of a blade was calculated using a multi-factor interaction equation. Failure probabilities for different fatigue cycles were computed and the uncertainties associated with damage initiation and damage propagation due to different load cycles were quantified. Evaluations on the effects of mistuned blades on a rotor were made; uncertainties in the excitation frequency were found to significantly amplify the blade responses of a mistuned rotor. The effects of the number of blades on a rotor were studied. The autocorrelation function of displacements and the probability density function of the first passage time for deterministic and random barriers for structures subjected to random processes also were computed. A brief discussion was included on the future direction of probabilistic structural analysis.

  5. Probabilistic sensitivity analysis for NICE technology assessment: not an optional extra.

    PubMed

    Claxton, Karl; Sculpher, Mark; McCabe, Chris; Briggs, Andrew; Akehurst, Ron; Buxton, Martin; Brazier, John; O'Hagan, Tony

    2005-04-01

    Recently the National Institute for Clinical Excellence (NICE) updated its methods guidance for technology assessment. One aspect of the new guidance is to require the use of probabilistic sensitivity analysis with all cost-effectiveness models submitted to the Institute. The purpose of this paper is to place the NICE guidance on dealing with uncertainty into a broader context of the requirements for decision making; to explain the general approach that was taken in its development; and to address each of the issues which have been raised in the debate about the role of probabilistic sensitivity analysis in general. The most appropriate starting point for developing guidance is to establish what is required for decision making. On the basis of these requirements, the methods and framework of analysis which can best meet these needs can then be identified. It will be argued that the guidance on dealing with uncertainty and, in particular, the requirement for probabilistic sensitivity analysis, is justified by the requirements of the type of decisions that NICE is asked to make. Given this foundation, the main issues and criticisms raised during and after the consultation process are reviewed. Finally, some of the methodological challenges posed by the need fully to characterise decision uncertainty and to inform the research agenda will be identified and discussed. Copyright (c) 2005 John Wiley & Sons, Ltd.

  6. An Applied Framework for Incorporating Multiple Sources of Uncertainty in Fisheries Stock Assessments.

    PubMed

    Scott, Finlay; Jardim, Ernesto; Millar, Colin P; Cerviño, Santiago

    2016-01-01

    Estimating fish stock status is very challenging given the many sources and high levels of uncertainty surrounding the biological processes (e.g. natural variability in the demographic rates), model selection (e.g. choosing growth or stock assessment models) and parameter estimation. Incorporating multiple sources of uncertainty in a stock assessment allows advice to better account for the risks associated with proposed management options, promoting decisions that are more robust to such uncertainty. However, a typical assessment only reports the model fit and variance of estimated parameters, thereby underreporting the overall uncertainty. Additionally, although multiple candidate models may be considered, only one is selected as the 'best' result, effectively rejecting the plausible assumptions behind the other models. We present an applied framework to integrate multiple sources of uncertainty in the stock assessment process. The first step is the generation and conditioning of a suite of stock assessment models that contain different assumptions about the stock and the fishery. The second step is the estimation of parameters, including fitting of the stock assessment models. The final step integrates across all of the results to reconcile the multi-model outcome. The framework is flexible enough to be tailored to particular stocks and fisheries and can draw on information from multiple sources to implement a broad variety of assumptions, making it applicable to stocks with varying levels of data availability. The Iberian hake stock in International Council for the Exploration of the Sea (ICES) Divisions VIIIc and IXa is used to demonstrate the framework, starting from length-based stock and index data. Process and model uncertainty are considered through growth, natural mortality, fishing mortality, survey catchability and the stock-recruitment relationship. Estimation uncertainty is included as part of the fitting process. Simple model averaging is used to integrate across the results and produce a single assessment that considers the multiple sources of uncertainty.
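
    The final integration step can be as simple as pooling draws from the conditioned model suite with equal weights, which is one reading of the 'simple model averaging' mentioned above. A minimal Python sketch with invented model names and biomass numbers:

        import numpy as np

        rng = np.random.default_rng(42)

        # Hypothetical: each assessment model yields a distribution of current spawning
        # biomass (e.g. from bootstrap or MCMC); all numbers are illustrative only.
        model_runs = {
            "M_low_natmort":  rng.lognormal(mean=np.log(30_000), sigma=0.15, size=2000),
            "M_high_natmort": rng.lognormal(mean=np.log(36_000), sigma=0.20, size=2000),
            "M_ricker_sr":    rng.lognormal(mean=np.log(33_000), sigma=0.18, size=2000),
        }

        # Simple (equal-weight) model averaging: pool the draws so that process,
        # model and estimation uncertainty all enter the final assessment.
        pooled = np.concatenate(list(model_runs.values()))
        lo, med, hi = np.percentile(pooled, [2.5, 50, 97.5])
        print(f"biomass median {med:,.0f} t, 95% interval [{lo:,.0f}, {hi:,.0f}] t")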

  7. Radar Altimetry for Hydrological Modeling and Monitoring in the Zambezi River Basin

    NASA Astrophysics Data System (ADS)

    Michailovsky, C. I.; Berry, P. A.; Smith, R. G.; Bauer-Gottwein, P.

    2011-12-01

    Hydrological model forecasts are subject to large uncertainties stemming from uncertain input data, model structure, parameterization and lack of sufficient calibration/validation data. For real-time or near-real-time applications, data assimilation techniques such as the Ensemble Kalman Filter (EnKF) can be used to reduce forecast uncertainty by updating model states as new data become available. The use of remote sensing data is attractive for such applications as it provides wide geographical coverage and continuous time-series without the typically long delays that exist in obtaining in-situ data. River discharge is one of the main hydrological variables of interest, and while it cannot currently be directly measured remotely, water levels in rivers can be obtained from satellite-based radar altimetry and converted to discharge through rating curves. This study aims to give a realistic assessment of the improvements that can be derived from the use of satellite radar altimetry measurements from the Envisat mission for discharge monitoring and modeling on the basin scale for the Zambezi River. The altimetry data used is the Radar AlTimetry (RAT) product developed at the Earth and Planetary Remote Sensing Laboratory at De Montfort University. The first step in analyzing the data is the determination of potential altimetry targets, the locations where the Envisat ground track and the river network cross, in order to select data points corresponding to surface water. The quality of the water level time-series is then analyzed for all targets and the exploitable targets identified. Rating curves are derived from in-situ or remotely-sensed data depending on data availability at the various locations, and discharge time-series are established. A Monte Carlo analysis is carried out to assess the uncertainties on the computed discharge. It was found that having a single cross-section and associated discharge measurement at one point in time significantly reduces discharge uncertainty. To assess improvements in model predictions, a model of the Zambezi River basin based on remote sensing data is set up with the Soil and Water Assessment Tool and calibrated with available in-situ data. The discharge data from altimetry is then used in an EnKF framework to update discharge in the model as it runs. The method showed improvements in prediction uncertainties for short lead times.
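
    The level-to-discharge conversion and its Monte Carlo uncertainty analysis can be sketched as follows: a power-law rating curve with uncertain parameters is sampled many times and applied to the altimetric water levels. The curve form and every number below are illustrative assumptions, not values from the study:

        import numpy as np

        rng = np.random.default_rng(7)

        h = np.array([3.2, 4.1, 5.0, 4.6])        # altimetric water levels (m), invented
        n_mc = 10_000

        # Power-law rating curve Q = a * (h - h0)^b with uncertain parameters,
        # e.g. fitted to a single surveyed cross-section and one gauging.
        a  = rng.normal(25.0, 5.0, n_mc)          # scale (m^3/s per m^b)
        b  = rng.normal(1.6, 0.15, n_mc)          # exponent
        h0 = rng.normal(1.0, 0.2, n_mc)           # cease-to-flow level (m)

        # clip avoids negative bases under the fractional exponent
        q = a[:, None] * np.clip(h[None, :] - h0[:, None], 0.0, None) ** b[:, None]

        for i, level in enumerate(h):
            lo, med, hi = np.percentile(q[:, i], [5, 50, 95])
            print(f"h = {level:.1f} m: Q ~ {med:.0f} m^3/s (90% interval {lo:.0f}-{hi:.0f})")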

  8. Projected shifts in copepod surface communities in the Mediterranean Sea under several climate change scenarios

    NASA Astrophysics Data System (ADS)

    Benedetti, F.; Guilhaumon, F.; Adloff, F.; Irisson, J. O.; Ayata, S. D.

    2016-02-01

    Although future increases in water temperature and future changes in regional circulation are expected to have great impacts on the pelagic food-web, estimates focusing on community-level shifts are still lacking for the planktonic compartment. By combining statistical niche models (or species distribution models) with projections from a regional circulation model, the impact of climate change on copepod epipelagic communities is assessed for the Mediterranean Sea. Habitat suitability maps are generated for 106 of the most abundant copepod species to analyze emerging patterns of diversity at the community level. Using variance analysis, we also quantified the uncertainties associated with our modeling strategy (niche model choice, CO2 emission scenario, boundary forcings of the circulation model). Comparing present and future projections, changes in species richness (alpha diversity) and in community composition (beta diversity, decomposed into turnover and nestedness components) are calculated. Average projections show that copepod communities will mainly experience turnover processes, with little change in species richness. Species gains are mainly located in the Gulf of Lions, the Northern Adriatic and the Northern Aegean seas. However, projections are highly variable, especially in the Eastern Mediterranean basin. We show that such variability is mainly driven by the choice of the niche model, although interactions with the CO2 emission scenario or the boundary forcing of the circulation model can be locally important. Finally, the possible impact of the estimated community changes on zooplanktonic functional and phylogenetic diversity is also assessed. We encourage the extension of this type of study to other components of the pelagic food-web, and argue that niche models' outputs should always be given along with a measure of uncertainty, and explained in light of a strong theoretical background.
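
    A variance analysis of this kind can be approximated by comparing main-effect variances across the factors of the projection ensemble. The toy Python sketch below uses an invented 4 x 2 x 2 factorial of projected richness changes (niche model x scenario x forcing); interactions and residual variance are ignored, so the shares need not sum to 100%:

        import numpy as np

        rng = np.random.default_rng(9)
        # array indexed [niche_model, scenario, forcing] of projected richness change (%);
        # invented data with a deliberately strong niche-model effect
        proj = rng.normal(0, 5, size=(4, 2, 2))
        proj += np.array([6, -4, 1, -3])[:, None, None]

        total_var = proj.var()
        for axis, name in enumerate(["niche model", "CO2 scenario", "boundary forcing"]):
            other = tuple(i for i in range(3) if i != axis)
            main_effect_var = proj.mean(axis=other).var()   # variance of factor-level means
            print(f"{name:16s}: {100 * main_effect_var / total_var:.0f}% of total variance")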

  9. Assessment of Uncertainty-Infused Scientific Argumentation

    ERIC Educational Resources Information Center

    Lee, Hee-Sun; Liu, Ou Lydia; Pallant, Amy; Roohr, Katrina Crotts; Pryputniewicz, Sarah; Buck, Zoë E.

    2014-01-01

    Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students' scientific argumentation. In this study, we initially defined a scientific argumentation construct in four structural elements consisting of claim, justification, uncertainty qualifier, and uncertainty…

  10. Bayesian stock assessment of Pacific herring in Prince William Sound, Alaska.

    PubMed

    Muradian, Melissa L; Branch, Trevor A; Moffitt, Steven D; Hulson, Peter-John F

    2017-01-01

    The Pacific herring (Clupea pallasii) population in Prince William Sound, Alaska crashed in 1993 and has yet to recover, affecting food web dynamics in the Sound and impacting Alaskan communities. To help researchers design and implement the most effective monitoring, management, and recovery programs, a Bayesian assessment of Prince William Sound herring was developed by reformulating the current model used by the Alaska Department of Fish and Game. The Bayesian model estimated pre-fishery spawning biomass of herring age-3 and older in 2013 to be a median of 19,410 mt (95% credibility interval 12,150-31,740 mt), with a 54% probability that biomass in 2013 was below the management limit used to regulate fisheries in Prince William Sound. The main advantages of the Bayesian model are that it can more objectively weight different datasets and provide estimates of uncertainty for model parameters and outputs, unlike the weighted sum-of-squares used in the original model. In addition, the revised model could be used to manage herring stocks with a decision rule that considers both stock status and the uncertainty in stock status.

  11. Bayesian stock assessment of Pacific herring in Prince William Sound, Alaska

    PubMed Central

    Moffitt, Steven D.; Hulson, Peter-John F.

    2017-01-01

    The Pacific herring (Clupea pallasii) population in Prince William Sound, Alaska crashed in 1993 and has yet to recover, affecting food web dynamics in the Sound and impacting Alaskan communities. To help researchers design and implement the most effective monitoring, management, and recovery programs, a Bayesian assessment of Prince William Sound herring was developed by reformulating the current model used by the Alaska Department of Fish and Game. The Bayesian model estimated pre-fishery spawning biomass of herring age-3 and older in 2013 to be a median of 19,410 mt (95% credibility interval 12,150–31,740 mt), with a 54% probability that biomass in 2013 was below the management limit used to regulate fisheries in Prince William Sound. The main advantages of the Bayesian model are that it can more objectively weight different datasets and provide estimates of uncertainty for model parameters and outputs, unlike the weighted sum-of-squares used in the original model. In addition, the revised model could be used to manage herring stocks with a decision rule that considers both stock status and the uncertainty in stock status. PMID:28222151

  12. Design and implementation of a risk assessment module in a spatial decision support system

    NASA Astrophysics Data System (ADS)

    Zhang, Kaixi; van Westen, Cees; Bakker, Wim

    2014-05-01

    The spatial decision support system named 'Changes SDSS' is currently under development. The goal of this system is to analyze changing hydro-meteorological hazards and the effect of risk reduction alternatives to support decision makers in choosing the best alternatives. The risk assessment module within the system is designed to assess the current risk, analyze the risk after implementation of risk reduction alternatives, and analyze the risk in different future years under scenarios such as climate change, land use change and population growth. The objective of this work is to present the detailed design and implementation plan of the risk assessment module. The main challenges faced consist of how to shift the risk assessment from traditional desktop software to an open source web-based platform, the availability of input data and the inclusion of uncertainties in the risk analysis. The risk assessment module is developed using the Ext JS library to implement the user interface on the client side, using Python for scripting, as well as PostGIS spatial functions for complex computations on the server side. The comprehensive consideration of the underlying uncertainties in input data can lead to a better quantification of risk assessment and a more reliable Changes SDSS, since the outputs of the risk assessment module are the basis for the decision-making module within the system. The implementation of this module will contribute to the development of open source web-based modules for multi-hazard risk assessment in the future. This work is part of the "CHANGES SDSS" project, funded by the European Community's 7th Framework Program.

  13. Uncertainties in Hydrologic and Climate Change Impact Analysis in Major Watersheds in British Columbia, Canada

    NASA Astrophysics Data System (ADS)

    Bennett, K. E.; Schnorbus, M.; Werner, A. T.; Music, B.; Caya, D.; Rodenhuis, D. R.

    2009-12-01

    Uncertainties in the projections of future hydrologic change can be assessed using a suite of tools, thereby allowing researchers to focus on improvement to identifiable sources of uncertainty. A Pareto set of optimal hydrologic parameterizations was run for three BC watersheds (Fraser, Peace and Columbia) for a range of downscaled Global Climate Model (GCM) emission scenarios to illustrate the uncertainty in hydrologic response to climate change. Results show varying responses of hydrologic regimes across geographic landscapes. Uncertainties in streamflow and water balance (runoff, evapo-transpiration, snow water equivalent, soil moisture) were analysed by forcing the Variable Infiltration Capacity (VIC) hydrologic model, run under twenty-five optimal parameter solution sets, with six Bias-Corrected Statistically Downscaled (BCSD) GCM emission scenario projections for the 2050s and the 2080s. Projected changes by the 2050s include increased winter flows, increases and decreases in freshet magnitude depending on the scenario, and decreases in summer flows persisting until September. Winter runoff had the greatest range between GCM emission scenarios, while the hydrologic parameters within individual GCM emission scenarios produced a winter runoff range an order of magnitude smaller. Evapo-transpiration, snow water equivalent and soil moisture exhibited a spread of ~10% or less. Streamflow changes by the 2080s lie outside the natural range of historic variability over the winter and spring. Results indicate that the changes projected between GCM emission scenarios are greater than the differences between the hydrologic model parameterizations. An alternate tool, the Canadian Regional Climate Model (CRCM), has been set up for these watersheds and various runs have been analysed to determine the range and variability present and to examine these results in comparison to the hydrologic model projections. The CRCM range and variability is an improvement over the Canadian GCM and thus requires less bias correction. However, without downscaling the CRCM results are still coarser than what is required to drive macroscale hydrologic models such as VIC. Applying these tools has illustrated the importance of focusing on improved downscaling efforts, including downscaling CRCM results rather than CGCM data. Tools for decision-making in the face of uncertainty are emerging as a priority for the climate change impacts community, and there is a need to focus on incorporating uncertainty information along with the projection of impacts. Assessing uncertainty across a range of regimes and geographic regions can help identify the main sources of uncertainty and allow researchers to focus on improving those sources using more robust methodological approaches and tools.

  14. Robust stability of fractional order polynomials with complicated uncertainty structure

    PubMed Central

    Şenol, Bilal; Pekař, Libor

    2017-01-01

    The main aim of this article is to present a graphical approach to robust stability analysis for families of fractional order (quasi-)polynomials with complicated uncertainty structure. More specifically, the work emphasizes the multilinear, polynomial and general structures of uncertainty and, moreover, the retarded quasi-polynomials with parametric uncertainty are studied. Since the families with these complex uncertainty structures suffer from the lack of analytical tools, their robust stability is investigated by numerical calculation and depiction of the value sets and subsequent application of the zero exclusion condition. PMID:28662173
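
    Numerically, the value-set/zero-exclusion test amounts to sampling the uncertain parameters, evaluating the (quasi-)polynomial along the imaginary axis, and checking that the value set never reaches the origin. The Python sketch below invents a fractional-order polynomial and interval bounds purely for illustration; it is a coarse grid-based check, not the paper's graphical construction:

        import numpy as np

        def value_set(omega, n_samples=2000, seed=0):
            # Sample the value set of P(s; q) = s^1.8 + q1*s^0.9 + q0 at s = j*omega,
            # with q1 in [1.0, 2.0] and q0 in [0.5, 1.5] (illustrative bounds).
            rng = np.random.default_rng(seed)
            q1 = rng.uniform(1.0, 2.0, n_samples)
            q0 = rng.uniform(0.5, 1.5, n_samples)
            s = 1j * omega
            return s ** 1.8 + q1 * s ** 0.9 + q0

        # Zero exclusion: if one member of the family is stable, the whole family is
        # robustly stable iff 0 never enters the value set as omega sweeps the grid.
        omegas = np.linspace(0.01, 10.0, 400)
        min_dist = min(np.abs(value_set(w)).min() for w in omegas)
        # the 0.05 margin only guards against the finite grid/sampling resolution
        verdict = "zero excluded" if min_dist > 0.05 else "possible stability loss"
        print(f"min |P(j*omega, q)| over grid: {min_dist:.3f} -> {verdict}")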

  15. Flood resilience and uncertainty in flood risk assessment

    NASA Astrophysics Data System (ADS)

    Beven, K.; Leedal, D.; Neal, J.; Bates, P.; Hunter, N.; Lamb, R.; Keef, C.

    2012-04-01

    Flood risk assessments do not normally take account of the uncertainty in assessing flood risk. There is no requirement in the EU Floods Directive to do so. But given the generally short series (and potential non-stationarity) of flood discharges, the extrapolation to smaller exceedance probabilities may be highly uncertain. This means that flood risk mapping may also be highly uncertain, with additional uncertainties introduced by the representation of flood plain and channel geometry, conveyance and infrastructure. This suggests that decisions about flood plain management should be based on exceedance probability of risk rather than the deterministic hazard maps that are common in most EU countries. Some examples are given from 2 case studies in the UK where a framework for good practice in assessing uncertainty in flood risk mapping has been produced as part of the Flood Risk Management Research Consortium and Catchment Change Network projects. This framework provides a structure for the communication and audit of assumptions about uncertainties.

  16. Trends and uncertainties in budburst projections of Norway spruce in Northern Europe.

    PubMed

    Olsson, Cecilia; Olin, Stefan; Lindström, Johan; Jönsson, Anna Maria

    2017-12-01

    Budburst is regulated by temperature conditions, and a warming climate is associated with earlier budburst. A range of phenology models has been developed to assess climate change effects, and they tend to produce different results. This is mainly caused by different model representations of tree physiology processes, selection of observational data for model parameterization, and selection of climate model data to generate future projections. In this study, we applied (i) Bayesian inference to estimate model parameter values to address uncertainties associated with the selection of observational data, (ii) selection of climate model data representative of a larger dataset, and (iii) ensemble modeling over multiple initial conditions, model classes, model parameterizations, and boundary conditions to generate future projections and uncertainty estimates. The ensemble projection indicated that the budburst of Norway spruce in northern Europe will on average take place 10.2 ± 3.7 days earlier in 2051-2080 than in 1971-2000, given climate conditions corresponding to RCP 8.5. Three provenances were assessed separately (one early and two late), and the projections indicated that the relationship among provenances will persist in a warmer climate. Structurally complex models were more likely than simple models to fail to predict budburst for some combinations of site and year. However, they contributed to the overall picture of current understanding of climate impacts on tree phenology by capturing additional aspects of temperature response, for example, chilling. Model parameterizations based on single sites were more likely to result in model failure than parameterizations based on multiple sites, highlighting that the model parameterization is sensitive to initial conditions and may not perform well under other climate conditions, whether the change is due to a shift in space or over time. By addressing a range of uncertainties, this study showed that ensemble modeling provides a more robust impact assessment than would a single phenology model run.

  17. Assessing student understanding of measurement and uncertainty

    NASA Astrophysics Data System (ADS)

    Abbott, David Scot

    A test to assess student understanding of measurement and uncertainty has been developed and administered to more than 500 students at two large research universities. The aim is two-fold: (1) to assess what students learn in the first semester of introductory physics labs and (2) to uncover patterns in student reasoning and practice. The forty-minute, eleven-item test focuses on direct measurement and student attitudes toward multiple measurements. After one revision cycle using think-aloud interviews, the test was administered to three groups of students: students enrolled in traditional laboratory sections of first semester physics at North Carolina State University (NCSU), students in an experimental (SCALE-UP) section of first semester physics at NCSU, and students in first semester physics at the University of North Carolina at Chapel Hill. The results were analyzed using a mixture of qualitative and quantitative methods. In the traditional NCSU labs, where students receive no instruction in uncertainty and measurement, students show no improvement in any of the areas examined by the test. In SCALE-UP and at UNC, students show statistically significant gains in most areas of the test. Gains on specific test items in SCALE-UP and at UNC correspond to areas of instructional emphasis. Test items were grouped into four main aspects of performance: "point/set" reasoning, meaning of spread, ruler reading and "stacking." Student performance on the pretest was examined to identify links between these aspects. Items within each aspect are correlated with one another, sometimes quite strongly, but items from different aspects rarely show statistically significant correlation. Taken together, these results suggest that student difficulties may not be linked to a single underlying cause. The study shows that current instruction techniques improve student understanding, but that many students exit the introductory physics lab course without an appreciation or coherent understanding of the concept of measurement uncertainty.

  18. Uncertainty in the Modeling of Tsunami Sediment Transport

    NASA Astrophysics Data System (ADS)

    Jaffe, B. E.; Sugawara, D.; Goto, K.; Gelfenbaum, G. R.; La Selle, S.

    2016-12-01

    Erosion and deposition from tsunamis record information about tsunami hydrodynamics and size that can be interpreted to improve tsunami hazard assessment. A recent study (Jaffe et al., 2016) explores sources and methods for quantifying uncertainty in tsunami sediment transport modeling. Uncertainty varies with tsunami properties, study site characteristics, available input data, sediment grain size, and the model used. Although uncertainty has the potential to be large, case studies for both forward and inverse models have shown that sediment transport modeling provides useful information on tsunami inundation and hydrodynamics that can be used to improve tsunami hazard assessment. New techniques for quantifying uncertainty, such as Ensemble Kalman Filtering inversion, and more rigorous reporting of uncertainties will advance the science of tsunami sediment transport modeling. Uncertainty may be decreased with additional laboratory studies that increase our understanding of the semi-empirical parameters and physics of tsunami sediment transport, standardized benchmark tests to assess model performance, and the development of hybrid modeling approaches to exploit the strengths of forward and inverse models. As uncertainty in tsunami sediment transport modeling is reduced, and with increased ability to quantify uncertainty, the geologic record of tsunamis will become more valuable in the assessment of tsunami hazard. Jaffe, B., Goto, K., Sugawara, D., Gelfenbaum, G., and La Selle, S., "Uncertainty in Tsunami Sediment Transport Modeling", Journal of Disaster Research Vol. 11 No. 4, pp. 647-661, 2016, doi: 10.20965/jdr.2016.p0647 https://www.fujipress.jp/jdr/dr/dsstr001100040647/

  19. Risk assessment of vector-borne diseases for public health governance.

    PubMed

    Sedda, L; Morley, D W; Braks, M A H; De Simone, L; Benz, D; Rogers, D J

    2014-12-01

    In the context of public health, risk governance (or risk analysis) is a framework for the assessment and subsequent management and/or control of the danger posed by an identified disease threat. Generic frameworks in which to carry out risk assessment have been developed by various agencies. These include monitoring, data collection, statistical analysis and dissemination. Due to the inherent complexity of disease systems, however, the generic approach must be modified for individual, disease-specific risk assessment frameworks. The analysis was based on a review of the current risk assessments of vector-borne diseases adopted by the main public health organisations (OIE, WHO, ECDC, FAO, CDC, etc.), drawing on literature, legislation and statistical assessment of the risk analysis frameworks. This review outlines the need for the development of a general public health risk assessment method for vector-borne diseases, in order to guarantee that sufficient information is gathered to apply robust models of risk assessment. Stochastic (especially spatial) methods, often in Bayesian frameworks, are now gaining prominence in standard risk assessment procedures because of their ability to accurately assess model uncertainties. Risk assessment needs to be addressed quantitatively wherever possible, and submitted with its quality assessment in order to enable successful public health measures to be adopted. In terms of current practice, often a series of different models and analyses are applied to the same problem, with results and outcomes that are difficult to compare because of the unknown model and data uncertainties. Therefore, the risk assessment areas in need of further research are identified in this article. Copyright © 2014 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  20. Uncertainty assessment in geodetic network adjustment by combining GUM and Monte-Carlo-simulations

    NASA Astrophysics Data System (ADS)

    Niemeier, Wolfgang; Tengen, Dieter

    2017-06-01

    In this article, first ideas are presented to extend the classical concept of geodetic network adjustment by introducing a new method for uncertainty assessment as a two-step analysis. In the first step the raw data and possible influencing factors are analyzed using uncertainty modeling according to GUM (Guidelines to the Expression of Uncertainty in Measurements). This approach is well established in metrology, but rarely adopted within geodesy. The second step consists of Monte-Carlo-Simulations (MC-simulations) for the complete processing chain from raw input data and pre-processing to adjustment computations and quality assessment. To perform these simulations, possible realizations of raw data and the influencing factors are generated, using probability distributions for all variables and the established concept of pseudo-random number generators. The final result is a point cloud that represents the uncertainty of the estimated coordinates; a confidence region can be assigned to these point clouds as well. This concept may replace the common approach of variance propagation and the quality assessment of adjustment parameters by means of their covariance matrix. It allows a new way for uncertainty assessment in accordance with the GUM concept for uncertainty modelling and propagation. As a practical example, the local tie network at the Metsähovi Fundamental Station, Finland, is used, where classical geodetic observations are combined with GNSS data.
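
    The two-step idea can be condensed into a few lines: assign GUM-style combined standard uncertainties to the raw observations, then run the full adjustment once per Monte Carlo realization and collect the resulting coordinate point cloud. The sketch below uses an invented 2-D trilateration network and a plain Gauss-Newton adjustment; all numbers are assumptions:

        import numpy as np

        rng = np.random.default_rng(3)

        stations = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 80.0]])  # known points (m)
        p_true = np.array([40.0, 30.0])                               # unknown point (m)
        sigma_d = 0.005                       # GUM combined standard uncertainty (m)

        def adjust(distances, p0=np.array([50.0, 50.0]), iters=10):
            # Gauss-Newton least-squares adjustment of one 2-D point from distances.
            p = p0.copy()
            for _ in range(iters):
                diff = p - stations                    # (3, 2)
                d = np.linalg.norm(diff, axis=1)       # modelled distances
                A = diff / d[:, None]                  # design matrix (Jacobian)
                dx, *_ = np.linalg.lstsq(A, distances - d, rcond=None)
                p = p + dx
            return p

        d_true = np.linalg.norm(p_true - stations, axis=1)
        # MC step: one complete adjustment per simulated realization of the raw data
        cloud = np.array([adjust(d_true + sigma_d * rng.standard_normal(3))
                          for _ in range(5000)])

        print("mean (m):", cloud.mean(axis=0))
        print("empirical covariance (mm^2):", np.cov(cloud.T) * 1e6)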

  1. Accounting for uncertainty in the quantification of the environmental impacts of Canadian pig farming systems.

    PubMed

    Mackenzie, S G; Leinonen, I; Ferguson, N; Kyriazakis, I

    2015-06-01

    The objective of the study was to develop a life cycle assessment (LCA) for pig farming systems that would account for uncertainty and variability in input data and allow systematic environmental impact comparisons between production systems. The environmental impacts of commercial pig production for 2 regions in Canada (Eastern and Western) were compared using a cradle-to-farm gate LCA. These systems had important contrasting characteristics such as typical feed ingredients used, herd performance, and expected emission factors from manure management. The study used detailed production data supplied by the industry and incorporated uncertainty/variation in all major aspects of the system including life cycle inventory data for feed ingredients, animal performance, energy inputs, and emission factors. The impacts were defined using 5 metrics (global warming potential, acidification potential, eutrophication potential (EP), abiotic resource use, and nonrenewable energy use) and were expressed per kilogram carcass weight at farm gate. Eutrophication potential was further separated into marine EP (MEP) and freshwater EP (FEP). Uncertainties in the model inputs were separated into 2 types: uncertainty in the data used to describe the system (α uncertainties) and uncertainty in impact calculations or background data that affect all systems equally (β uncertainties). The impacts of pig production in the 2 regions were systematically compared based on the differences in the systems (α uncertainties). The method of ascribing uncertainty influenced the outcomes. In eastern systems, EP, MEP, and FEP were lower (P < 0.05) when assuming that all uncertainty in the emission factors for leaching from manure application was β. This was mainly due to increased EP resulting from field emissions for typical ingredients in western diets. When uncertainty in these emission factors was assumed to be α, only FEP was lower in eastern systems (P < 0.05). The environmental impacts for the other impact categories were not significantly different between the 2 systems, despite their aforementioned differences. In conclusion, a probabilistic approach was used to develop an LCA that systematically dealt with uncertainty in the data when comparing multiple environmental impact measures in pig farming systems for the first time. The method was used to identify differences between Canadian pig production systems but can also be applied for comparisons between other agricultural systems that include inherent variation.
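
    The α/β distinction matters computationally: in a Monte Carlo comparison, β uncertainties are sampled once per iteration and applied to both systems (so they cancel in the comparison), while α uncertainties are sampled independently for each system. A toy Python sketch with invented numbers:

        import numpy as np

        rng = np.random.default_rng(11)
        n = 20_000

        # alpha: uncertainty specific to each system (sampled independently)
        east = rng.normal(3.00, 0.20, n)    # e.g. EP, invented units per kg carcass weight
        west = rng.normal(3.25, 0.20, n)

        # beta: uncertainty in a shared emission factor, sampled ONCE per iteration
        # and applied to both systems, so it shifts both but cancels in the comparison
        beta = rng.normal(1.00, 0.15, n)

        print(f"P(East EP < West EP), shared beta: {np.mean(east * beta < west * beta):.2f}")

        # if the same factor were (wrongly) treated as alpha, the comparison blurs:
        beta_e, beta_w = rng.normal(1.0, 0.15, n), rng.normal(1.0, 0.15, n)
        print(f"same factor treated as alpha:       {np.mean(east * beta_e < west * beta_w):.2f}")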

  2. Stochastic evaluation of annual micropollutant loads and their uncertainties in separate storm sewers.

    PubMed

    Hannouche, Ali; Chebbo, Ghassan; Joannis, Claude; Gasperi, Johnny; Gromaire, Marie-Christine; Moilleron, Régis; Barraud, Sylvie; Ruban, Véronique

    2017-12-01

    This article describes a stochastic method to calculate annual pollutant loads, and its application over several years at the outlet of three catchments drained by separate storm sewers. A stochastic methodology using Monte Carlo simulations is proposed for assessing annual pollutant loads, as well as the associated uncertainties, from a few event sampling campaigns and/or continuous turbidity measurements (representative of the total suspended solids concentration (TSS)). Indeed, in the latter case, the proposed method takes into account the correlation between pollutants and TSS. The developed method was applied to data acquired within the French research project "INOGEV" (innovations for a sustainable management of urban water) at the outlet of three urban catchments drained by separate storm sewers. Ten or so event sampling campaigns for a large range of pollutants (46 pollutants and 2 conventional water quality parameters: TSS and total organic carbon (TOC)) are combined with hundreds of rainfall events for which at least one of three continuously monitored parameters (rainfall intensity, flow rate, and turbidity) is available. Results obtained for the three catchments show that the annual pollutant loads can be estimated with uncertainties ranging from 10 to 60%, and the added value of turbidity monitoring for lowering the uncertainty is demonstrated. A low inter-annual and inter-site variability of pollutant loads, for many of the studied pollutants, is observed with respect to the estimated uncertainties, and can be explained mainly by annual precipitation.
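
    The core of such a Monte Carlo load calculation can be sketched as follows: event-mean concentrations are resampled from a distribution fitted to the few campaigns, multiplied by the monitored event volumes, and summed to an annual load; repeating this yields the load distribution and its uncertainty. The sketch ignores the pollutant-TSS correlation used in the full method, and every number is invented:

        import numpy as np

        rng = np.random.default_rng(5)

        # monitored runoff volumes for all rainfall events of one year (m^3), invented
        volumes = rng.lognormal(mean=np.log(800), sigma=0.9, size=120)

        # event-mean concentrations from ~10 sampling campaigns (mg/L), invented
        campaigns = np.array([45, 60, 38, 85, 52, 70, 41, 66, 58, 49], dtype=float)
        mu, sd = np.log(campaigns).mean(), np.log(campaigns).std(ddof=1)

        n_mc = 10_000
        # each iteration: draw a concentration for every event, sum the event loads
        conc = rng.lognormal(mu, sd, size=(n_mc, volumes.size))      # mg/L
        annual_load_kg = (conc * volumes).sum(axis=1) / 1e3          # mg/L * m^3 = g; /1e3 -> kg

        lo, med, hi = np.percentile(annual_load_kg, [2.5, 50, 97.5])
        print(f"annual load: {med:.0f} kg (95% interval {lo:.0f}-{hi:.0f} kg)")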

  3. Model uncertainty and multimodel inference in reliability estimation within a longitudinal framework.

    PubMed

    Alonso, Ariel; Laenen, Annouschka

    2013-05-01

    Laenen, Alonso, and Molenberghs (2007) and Laenen, Alonso, Molenberghs, and Vangeneugden (2009) proposed a method to assess the reliability of rating scales in a longitudinal context. The methodology is based on hierarchical linear models, and reliability coefficients are derived from the corresponding covariance matrices. However, finding a good parsimonious model to describe complex longitudinal data is a challenging task. Frequently, several models fit the data equally well, raising the problem of model selection uncertainty. When model uncertainty is high, one may resort to model averaging, where inferences are based not on one but on an entire set of models. We explored the use of different model building strategies, including model averaging, in reliability estimation. We found that the approach introduced by Laenen et al. (2007, 2009) combined with some of these strategies may yield meaningful results in the presence of high model selection uncertainty and when all models are misspecified, in so far as some of them manage to capture the most salient features of the data. Nonetheless, when all models omit prominent regularities in the data, misleading results may be obtained. The main ideas are further illustrated in a case study in which the reliability of the Hamilton Anxiety Rating Scale is estimated. Importantly, the ambit of model selection uncertainty and model averaging transcends the specific setting studied in the paper and may be of interest in other areas of psychometrics. © 2012 The British Psychological Society.
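
    One common way to implement such model averaging (not necessarily the weighting used in the paper) is to weight each candidate model by its Akaike weight. A minimal Python sketch with invented AIC values and reliability estimates:

        import numpy as np

        # Hypothetical: AIC and reliability estimate from each candidate covariance model
        aic = np.array([412.3, 413.1, 415.8, 419.0])
        reliability = np.array([0.82, 0.79, 0.85, 0.73])

        delta = aic - aic.min()
        weights = np.exp(-0.5 * delta)
        weights /= weights.sum()                  # Akaike weights, sum to 1

        r_avg = np.sum(weights * reliability)     # model-averaged reliability
        print("weights:", np.round(weights, 3))
        print(f"model-averaged reliability: {r_avg:.3f}")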

  4. Multi-model approach to assess the impact of climate change on runoff

    NASA Astrophysics Data System (ADS)

    Dams, J.; Nossent, J.; Senbeta, T. B.; Willems, P.; Batelaan, O.

    2015-10-01

    The assessment of climate change impacts on hydrology is subject to uncertainties related to the climate change scenarios, stochastic uncertainties of the hydrological model and structural uncertainties of the hydrological model. This paper focuses on the contribution of structural uncertainty of hydrological models to the overall uncertainty of the climate change impact assessment. To quantify the structural uncertainty of hydrological models, four physically based hydrological models (SWAT, PRMS and a semi- and fully distributed version of the WetSpa model) are set up for a catchment in Belgium. Each model is calibrated using four different objective functions. Three climate change scenarios with a high, mean and low hydrological impact are statistically perturbed from a large ensemble of climate change scenarios and are used to force the hydrological models. This methodology allows assessing and comparing the uncertainty introduced by the climate change scenarios with the uncertainty introduced by the hydrological model structure. Results show that the hydrological model structure introduces a large uncertainty on both the average monthly discharge and the extreme peak and low flow predictions under the climate change scenarios. For the low impact climate change scenario, the uncertainty range of the mean monthly runoff is comparable to the range of these runoff values in the reference period. However, for the mean and high impact scenarios, this range is significantly larger. The uncertainty introduced by the climate change scenarios is larger than the uncertainty due to the hydrological model structure for the low and mean hydrological impact scenarios, but the reverse is true for the high impact climate change scenario. The mean and high impact scenarios project increasing peak discharges, while the low impact scenario projects increasing peak discharges only for peak events with return periods larger than 1.6 years. All models suggest for all scenarios a decrease of the lowest flows, except for the SWAT model with the mean hydrological impact climate change scenario. The results of this study indicate that besides the uncertainty introduced by the climate change scenarios also the hydrological model structure uncertainty should be taken into account in the assessment of climate change impacts on hydrology. To make it more straightforward and transparent to include model structural uncertainty in hydrological impact studies, there is a need for hydrological modelling tools that allow flexible structures and methods to validate model structures in their ability to assess impacts under unobserved future climatic conditions.

  5. Application of a baseflow filter for evaluating model structure suitability of the IHACRES CMD

    NASA Astrophysics Data System (ADS)

    Kim, H. S.

    2015-02-01

    The main objective of this study was to assess the predictive uncertainty arising from the structure of a rainfall-runoff model coupling a conceptual module (non-linear module) with a metric transfer function module (linear module). The methodology was primarily based on the comparison between the outputs of the rainfall-runoff model and those of an alternative model approach, used to minimise uncertainties arising from the data and the model structure. A baseflow filter was adopted to better expose deficiencies in the forms of the rainfall-runoff model while avoiding the uncertainties related to the data and the model structure. The predictive uncertainty from the model structure was investigated for representative groups of catchments having similar hydrological response characteristics in the upper Murrumbidgee Catchment. In the assessment of model structure suitability, the consistency (or variability) of catchment response over time and space in model performance and parameter values was investigated to detect problems related to the temporal and spatial variability of the model accuracy. The predictive error caused by model uncertainty was evaluated through analysis of the variability of the model performance and parameters. A graphical comparison of model residuals, effective rainfall estimates and hydrographs was used to determine systematic deviations between simulated and observed behaviours and general differences in the timing and magnitude of peak flows. The model's predictability was very sensitive to catchment response characteristics. The linear module performs reasonably well in the wetter catchments but has considerable difficulties when applied to the drier catchments, where the hydrologic response is dominated by quick flow. The non-linear module has a potential limitation in its capacity to capture non-linear processes for converting observed rainfall into effective rainfall in both the wetter and drier catchments. The comparative study, based on a better quantification of the accuracy and precision of hydrological modelling predictions, yields a better understanding of how model deficiencies might be improved.
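
    The abstract does not name the specific baseflow filter; one widely used choice is the one-parameter Lyne-Hollick recursive digital filter, sketched below in Python on an invented streamflow series:

        import numpy as np

        def lyne_hollick(q, alpha=0.925, passes=3):
            # One-parameter Lyne-Hollick recursive digital baseflow filter.
            # Quickflow: f_t = alpha*f_{t-1} + (1+alpha)/2 * (q_t - q_{t-1}),
            # baseflow constrained to 0 <= b <= q; alternating passes smooth the result.
            b = q.astype(float).copy()
            for p in range(passes):
                x = b if p % 2 == 0 else b[::-1]    # alternate forward/backward passes
                f = np.zeros_like(x)
                for t in range(1, len(x)):
                    f[t] = alpha * f[t - 1] + 0.5 * (1 + alpha) * (x[t] - x[t - 1])
                bf = np.clip(x - np.clip(f, 0, None), 0, x)
                b = bf if p % 2 == 0 else bf[::-1]
            return b

        # invented daily streamflow: base level plus exponentially recessing storm pulses
        rng = np.random.default_rng(2)
        q = 5 + np.convolve(rng.exponential(2.0, 400), np.exp(-np.arange(30) / 7), "same")
        baseflow = lyne_hollick(q)
        print(f"baseflow index: {baseflow.sum() / q.sum():.2f}")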

  6. Warning and prevention based on estimates with large uncertainties: the case of low-frequency and large-impact events like tsunamis

    NASA Astrophysics Data System (ADS)

    Tinti, Stefano; Armigliato, Alberto; Pagnoni, Gianluca; Zaniboni, Filippo

    2013-04-01

    Geoscientists often deal with hazardous processes like earthquakes, volcanic eruptions, tsunamis and hurricanes, and their research is aimed not only at a better understanding of the physical processes, but also at providing assessments of the spatial and temporal evolution of a given individual event (i.e. short-term prediction) and of the expected evolution of a group of events (i.e. statistical estimates referring to a given return period and a given geographical area). One of the main issues of any scientific method is how to cope with measurement errors, a topic which, in the case of forecasts of ongoing or future events, translates into how to deal with forecast uncertainties. In general, the more data are available and processed to make a prediction, the more accurate the prediction is expected to be if the scientific approach is sound, and the smaller the associated uncertainties are. However, there are several important cases where assessment has to be made with insufficient data or insufficient time for processing, which leads to large uncertainties. Two examples can be taken from tsunami science, since tsunamis are rare events that may have destructive power and very large impact. One example is the case of warning for a tsunami generated by a near-coast earthquake, an issue at the focus of the European-funded project NearToWarn. Warning has to be launched before the tsunami hits the coast, that is, within a few minutes of its generation. This may imply that data collected in such a short time are not yet enough for an accurate evaluation, also because the implemented monitoring system (if any) could be inadequate (for instance, implementing a dense instrumental network could be judged too expensive for rare events). The second case is long-term prevention of tsunami strikes. Tsunami infrequency may imply that the historical record for a given piece of coast is too short to capture a statistically sufficient number of large tsunamis, which entails that tsunami hazard has to be estimated by means of speculated worst-case scenarios, whose consequences are evaluated accordingly and are usually associated with large uncertainty bands. In the case of large uncertainties, the main issues for geoscientists are how to communicate the information (predictions and uncertainties) to stakeholders and citizens and how to build and implement, together with them, responsive procedures that are adequate. Usually there is a trade-off between the cost of the countermeasure (warning and prevention) and its efficacy (i.e. its capability of minimizing the damage). The level of the acceptable trade-off is an issue pertaining to decision makers and to the threatened local communities. This paper, which represents a contribution from the European project TRIDEC on the management of emergency crises, discusses the role of geoscientists in providing predictions and the related uncertainties. It is stressed that academic education trains geoscientists mainly to better their understanding of processes and the quantification of uncertainties, but often leaves them unprepared to communicate their results in a way appropriate for society. Filling this gap is crucial for improving the way geoscience and society handle natural hazards and devise proper defense means.

  7. MODEL UNCERTAINTY ANALYSIS, FIELD DATA COLLECTION AND ANALYSIS OF CONTAMINATED VAPOR INTRUSION INTO BUILDINGS

    EPA Science Inventory

    To address uncertainty associated with the evaluation of vapor intrusion problems we are working on a three-part strategy that includes: evaluation of uncertainty in model-based assessments; collection of field data; and assessment of sites using EPA and state protocols.

  8. The Uncertainties on the GIS Based Land Suitability Assessment for Urban and Rural Planning

    NASA Astrophysics Data System (ADS)

    Liu, H.; Zhan, Q.; Zhan, M.

    2017-09-01

    The majority of the research on the uncertainties of spatial data and spatial analysis focuses on some specific data feature or analysis tool. Few studies have addressed the uncertainties of the whole process of an application such as planning, leaving uncertainty research detached from practical applications. This paper discusses the uncertainties of geographical information systems (GIS) based land suitability assessment in planning on the basis of a literature review. The uncertainties considered range from index system establishment to the classification of the final result. Methods to reduce the uncertainties arising from the discretization of continuous raster data and from index weight determination are summarized. The paper analyzes the merits and demerits of the "Natural Breaks" method, which is broadly used by planners. It also explores other factors which impact the accuracy of the final classification, such as the selection of class numbers and intervals and the autocorrelation of the spatial data. In the conclusion, the paper indicates that machine learning methods should be adapted to integrate the complexity of land suitability assessment. The work contributes to the application of spatial data and spatial analysis uncertainty research to land suitability assessment, and promotes the scientific level of subsequent planning and decision-making.

  9. Phylogeny, extinction and conservation: embracing uncertainties in a time of urgency

    PubMed Central

    Forest, Félix; Crandall, Keith A.; Chase, Mark W.; Faith, Daniel P.

    2015-01-01

    Evolutionary studies have played a fundamental role in our understanding of life, but until recently, they had only a relatively modest involvement in addressing conservation issues. The main goal of the present discussion meeting issue is to offer a platform to present the available methods allowing the integration of phylogenetic and extinction risk data in conservation planning. Here, we identify the main knowledge gaps in biodiversity science, which include incomplete sampling, reconstruction biases in phylogenetic analyses, partly known species distribution ranges, and the difficulty in producing conservation assessments for all known species, not to mention that much of the effective biological diversity remains to be discovered. Given the impact that human activities have on biodiversity and the urgency with which we need to address these issues, imperfect assumptions need to be sanctioned and surrogates used in the race to salvage as much as possible of our natural and evolutionary heritage. We discuss some aspects of the uncertainties found in biodiversity science, such as the ideal surrogates for biodiversity, the gaps in our knowledge and the numerous available phylogenetic diversity-based methods. We also introduce a series of case studies that demonstrate how evolutionary biology can effectively contribute to biodiversity conservation science. PMID:25561663

  10. Info-Gap robustness pathway method for transitioning of urban drainage systems under deep uncertainties.

    PubMed

    Zischg, Jonatan; Goncalves, Mariana L R; Bacchin, Taneha Kuzniecow; Leonhardt, Günther; Viklander, Maria; van Timmeren, Arjan; Rauch, Wolfgang; Sitzenfrei, Robert

    2017-09-01

    In the urban water cycle, there are different ways of handling stormwater runoff. Traditional systems mainly rely on underground piped infrastructure, sometimes called 'gray' infrastructure. Newer, so-called 'green/blue' approaches aim to treat and convey the runoff at the surface. Such concepts are mainly based on ground infiltration and temporal storage. In this work a methodology to create and compare different planning alternatives for stormwater handling on their pathways to a desired system state is presented. Investigations are made to assess the system performance and robustness when facing the deeply uncertain spatial and temporal developments in the future urban fabric, including impacts caused by climate change, urbanization and other disruptive events, like shifts in the network layout and interactions of 'gray' and 'green/blue' structures. With the Info-Gap robustness pathway method, three planning alternatives are evaluated to identify critical performance levels at different stages over time. This novel methodology is applied to a real case study problem in which a city relocation process takes place during the upcoming decades. In this case study it is shown that hybrid systems including green infrastructure are more robust to future uncertainties than a traditional network design.
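
    Info-Gap robustness can be read as the largest horizon of uncertainty under which an alternative's worst-case performance still meets the requirement. The toy Python sketch below uses an invented fractional-error uncertainty model and invented capacities, so the ranking it prints illustrates the calculation, not the study's result:

        import numpy as np

        def worst_case_margin(capacity, nominal_load, alpha):
            # Worst case over the info-gap set: the load may exceed its nominal
            # estimate by up to a fraction alpha (fractional-error info-gap model).
            return capacity - nominal_load * (1 + alpha)

        def robustness(capacity, nominal_load, required_margin=0.0,
                       alphas=np.linspace(0, 2, 2001)):
            # Largest alpha whose worst case still satisfies the requirement.
            ok = worst_case_margin(capacity, nominal_load, alphas) >= required_margin
            return alphas[ok].max() if ok.any() else 0.0

        # three hypothetical planning alternatives (drainage capacity in mm/h)
        for name, cap in {"gray": 30.0, "hybrid": 38.0, "green/blue": 36.0}.items():
            print(f"{name:11s}: robustness alpha-hat = {robustness(cap, nominal_load=25.0):.2f}")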

  11. Ecological Equivalence Assessment Methods: What Trade-Offs between Operationality, Scientific Basis and Comprehensiveness?

    PubMed

    Bezombes, Lucie; Gaucherand, Stéphanie; Kerbiriou, Christian; Reinert, Marie-Eve; Spiegelberger, Thomas

    2017-08-01

    In many countries, biodiversity compensation is required to counterbalance negative impacts of development projects on biodiversity by carrying out ecological measures, called offset when the goal is to reach "no net loss" of biodiversity. One main issue is to ensure that offset gains are equivalent to impact-related losses. Ecological equivalence is assessed with ecological equivalence assessment methods taking into account a range of key considerations that we summarized as ecological, spatial, temporal, and uncertainty. When equivalence assessment methods take into account all considerations, we call them "comprehensive". Equivalence assessment methods should also aim to be science-based and operational, which is challenging. Many equivalence assessment methods have been developed worldwide but none is fully satisfying. In the present study, we examine 13 equivalence assessment methods in order to identify (i) their general structure and (ii) the synergies and trade-offs between equivalence assessment methods characteristics related to operationality, scientific-basis and comprehensiveness (called "challenges" in his paper). We evaluate each equivalence assessment methods on the basis of 12 criteria describing the level of achievement of each challenge. We observe that all equivalence assessment methods share a general structure, with possible improvements in the choice of target biodiversity, the indicators used, the integration of landscape context and the multipliers reflecting time lags and uncertainties. We show that no equivalence assessment methods combines all challenges perfectly. There are trade-offs between and within the challenges: operationality tends to be favored while scientific basis are integrated heterogeneously in equivalence assessment methods development. One way of improving the challenges combination would be the use of offset dedicated data-bases providing scientific feedbacks on previous offset measures.

  12. Full uncertainty quantification of N2O and NO emissions using the biogeochemical model LandscapeDNDC on site and regional scale

    NASA Astrophysics Data System (ADS)

    Haas, Edwin; Santabarbara, Ignacio; Kiese, Ralf; Butterbach-Bahl, Klaus

    2017-04-01

    Numerical simulation models are increasingly used to estimate greenhouse gas emissions at site to regional / national scale and are outlined as the most advanced methodology (Tier 3) in the framework of UNFCCC reporting. Process-based models incorporate the major processes of the carbon and nitrogen cycle of terrestrial ecosystems and are thus thought to be widely applicable at various conditions and spatial scales. Process based modelling requires high spatial resolution input data on soil properties, climate drivers and management information. The acceptance of model based inventory calculations depends on the assessment of the inventory's uncertainty (model, input data and parameter induced uncertainties). In this study we fully quantify the uncertainty in modelling soil N2O and NO emissions from arable, grassland and forest soils using the biogeochemical model LandscapeDNDC. We address model induced uncertainty (MU) by contrasting two different soil biogeochemistry modules within LandscapeDNDC. The parameter induced uncertainty (PU) was assessed by using joint parameter distributions for key parameters describing microbial C and N turnover processes as obtained by different Bayesian calibration studies for each model configuration. Input data induced uncertainty (DU) was addressed by Bayesian calibration of soil properties, climate drivers and agricultural management practices data. For the MU, DU and PU we performed several hundred simulations each to contribute to the individual uncertainty assessment. For the overall uncertainty quantification we assessed the model prediction probability, followed by sampled sets of input datasets and parameter distributions. Statistical analysis of the simulation results have been used to quantify the overall full uncertainty of the modelling approach. With this study we can contrast the variation in model results to the different sources of uncertainties for each ecosystem. Further we have been able to perform a fully uncertainty analysis for modelling N2O and NO emissions from arable, grassland and forest soils necessary for the comprehensibility of modelling results. We have applied the methodology to a regional inventory to assess the overall modelling uncertainty for a regional N2O and NO emissions inventory for the state of Saxony, Germany.

  13. Uncertainty assessment of urban pluvial flood risk in a context of climate change adaptation decision making

    NASA Astrophysics Data System (ADS)

    Arnbjerg-Nielsen, Karsten; Zhou, Qianqian

    2014-05-01

    There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Along with improved flood modelling technologies this has enabled development of economic assessment of climate change adaptation to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of current risk, drivers of change of risk over time, and measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from basic assumptions in the economic analysis and the hydrological model, but also from the projection of future societies to local climate change impacts and suitable adaptation options. This presents a challenge to decision makers when trying to identify robust measures. We present an integrated uncertainty analysis, which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that by means of Monte Carlo simulations of flood risk assessments incorporates climate change impacts as a key driver of risk changes over time. The overall uncertainty is then attributed to six bulk processes: climate change impact, urban rainfall-runoff processes, stage-depth functions, unit cost of repair, cost of adaptation measures, and discount rate. We apply the approach on an urban hydrological catchment in Odense, Denmark, and find that the uncertainty on the climate change impact appears to have the least influence on the net present value of the studied adaptation measures-. This does not imply that the climate change impact is not important, but that the uncertainties are not dominating when deciding on action or in-action. We then consider the uncertainty related to choosing between adaptation options given that a decision of action has been taken. In this case the major part of the uncertainty on the estimated net present values is identical for all adaptation options and will therefore not affect a comparison between adaptation measures. This makes the chose among the options easier. Furthermore, the explicit attribution of uncertainty also enables a reduction of the overall uncertainty by identifying the processes which contributes the most. This knowledge can then be used to further reduce the uncertainty related to decision making, as a substantial part of the remaining uncertainty is epistemic.

  14. Water Footprint and Water Consumption for the Main Crops and Biofuels Produced in Brazil

    NASA Astrophysics Data System (ADS)

    Sun, Y.; Tong, C.; Mansoor, K.; Carroll, S.

    2011-12-01

    The risk of CO2 leakage into shallow aquifers through various pathways such as faults and abandoned wells is a concern of CO2 geological sequestration. If a leak is detected in an aquifer system, a contingency plan is required to manage the CO2 storage and to protect the groundwater source. Among many remediation and mitigation strategies, the simplest is to stop CO2 leakage at a wellbore. Therefore, it is necessary to address whether and when the CO2 leaks should be sealed, and how much risk can be mitigated. In the presence of various uncertainties, including geological-structure uncertainty and parametric uncertainty, the risk of CO2 leakage into an aquifer needs to be assessed with probabilistic distributions of uncertain parameters. In this study, we developed an integrated model to simulate multiphase flow of CO2 and brine in a deep storage reservoir, through a leaky well at an uncertain location, and subsequently multicomponent reactive transport in a shallow aquifer. Each sub-model covers its domain-specific physics. Uncertainties of geological structure and parameters are considered together with decision variables (CO2 injection rate and mitigation time) for risk assessment of leakage-impacted aquifer volume. High-resolution and less-expensive reduced-order models (ROMs) of risk profiles are approximated as polynomial functions of decision variables and all uncertain parameters. These reduced-order models are then used in the place of computationally-expensive numerical models for future decision-making on if and when the leaky well is sealed. The tradeoff between CO2 storage capacity in the reservoir and the leakage-induced risk in the aquifer is evaluated. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344.

  15. Damage severity assessment in wind turbine blade laboratory model through fuzzy finite element model updating

    NASA Astrophysics Data System (ADS)

    Turnbull, Heather; Omenzetter, Piotr

    2017-04-01

    The recent shift towards development of clean, sustainable energy sources has provided a new challenge in terms of structural safety and reliability: with aging, manufacturing defects, harsh environmental and operational conditions, and extreme events such as lightning strikes wind turbines can become damaged resulting in production losses and environmental degradation. To monitor the current structural state of the turbine, structural health monitoring (SHM) techniques would be beneficial. Physics based SHM in the form of calibration of a finite element model (FEMs) by inverse techniques is adopted in this research. Fuzzy finite element model updating (FFEMU) techniques for damage severity assessment of a small-scale wind turbine blade are discussed and implemented. The main advantage is the ability of FFEMU to account in a simple way for uncertainty within the problem of model updating. Uncertainty quantification techniques, such as fuzzy sets, enable a convenient mathematical representation of the various uncertainties. Experimental frequencies obtained from modal analysis on a small-scale wind turbine blade were described by fuzzy numbers to model measurement uncertainty. During this investigation, damage severity estimation was investigated through addition of small masses of varying magnitude to the trailing edge of the structure. This structural modification, intended to be in lieu of damage, enabled non-destructive experimental simulation of structural change. A numerical model was constructed with multiple variable additional masses simulated upon the blades trailing edge and used as updating parameters. Objective functions for updating were constructed and minimized using both particle swarm optimization algorithm and firefly algorithm. FFEMU was able to obtain a prediction of baseline material properties of the blade whilst also successfully predicting, with sufficient accuracy, a larger magnitude of structural alteration and its location.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Y.; Tong, C.; Trainor-Guitten, W. J.

    The risk of CO 2 leakage from a deep storage reservoir into a shallow aquifer through a fault is assessed and studied using physics-specific computer models. The hypothetical CO 2 geological sequestration system is composed of three subsystems: a deep storage reservoir, a fault in caprock, and a shallow aquifer, which are modeled respectively by considering sub-domain-specific physics. Supercritical CO 2 is injected into the reservoir subsystem with uncertain permeabilities of reservoir, caprock, and aquifer, uncertain fault location, and injection rate (as a decision variable). The simulated pressure and CO 2/brine saturation are connected to the fault-leakage model as amore » boundary condition. CO 2 and brine fluxes from the fault-leakage model at the fault outlet are then imposed in the aquifer model as a source term. Moreover, uncertainties are propagated from the deep reservoir model, to the fault-leakage model, and eventually to the geochemical model in the shallow aquifer, thus contributing to risk profiles. To quantify the uncertainties and assess leakage-relevant risk, we propose a global sampling-based method to allocate sub-dimensions of uncertain parameters to sub-models. The risk profiles are defined and related to CO 2 plume development for pH value and total dissolved solids (TDS) below the EPA's Maximum Contaminant Levels (MCL) for drinking water quality. A global sensitivity analysis is conducted to select the most sensitive parameters to the risk profiles. The resulting uncertainty of pH- and TDS-defined aquifer volume, which is impacted by CO 2 and brine leakage, mainly results from the uncertainty of fault permeability. Subsequently, high-resolution, reduced-order models of risk profiles are developed as functions of all the decision variables and uncertain parameters in all three subsystems.« less

  17. Sources of uncertainty in hydrological climate impact assessment: a cross-scale study

    NASA Astrophysics Data System (ADS)

    Hattermann, F. F.; Vetter, T.; Breuer, L.; Su, Buda; Daggupati, P.; Donnelly, C.; Fekete, B.; Flörke, F.; Gosling, S. N.; Hoffmann, P.; Liersch, S.; Masaki, Y.; Motovilov, Y.; Müller, C.; Samaniego, L.; Stacke, T.; Wada, Y.; Yang, T.; Krysnaova, V.

    2018-01-01

    Climate change impacts on water availability and hydrological extremes are major concerns as regards the Sustainable Development Goals. Impacts on hydrology are normally investigated as part of a modelling chain, in which climate projections from multiple climate models are used as inputs to multiple impact models, under different greenhouse gas emissions scenarios, which result in different amounts of global temperature rise. While the goal is generally to investigate the relevance of changes in climate for the water cycle, water resources or hydrological extremes, it is often the case that variations in other components of the model chain obscure the effect of climate scenario variation. This is particularly important when assessing the impacts of relatively lower magnitudes of global warming, such as those associated with the aspirational goals of the Paris Agreement. In our study, we use ANOVA (analyses of variance) to allocate and quantify the main sources of uncertainty in the hydrological impact modelling chain. In turn we determine the statistical significance of different sources of uncertainty. We achieve this by using a set of five climate models and up to 13 hydrological models, for nine large scale river basins across the globe, under four emissions scenarios. The impact variable we consider in our analysis is daily river discharge. We analyze overall water availability and flow regime, including seasonality, high flows and low flows. Scaling effects are investigated by separately looking at discharge generated by global and regional hydrological models respectively. Finally, we compare our results with other recently published studies. We find that small differences in global temperature rise associated with some emissions scenarios have mostly significant impacts on river discharge—however, climate model related uncertainty is so large that it obscures the sensitivity of the hydrological system.

  18. Facing uncertainty in ecosystem services-based resource management.

    PubMed

    Grêt-Regamey, Adrienne; Brunner, Sibyl H; Altwegg, Jürg; Bebi, Peter

    2013-09-01

    The concept of ecosystem services is increasingly used as a support for natural resource management decisions. While the science for assessing ecosystem services is improving, appropriate methods to address uncertainties in a quantitative manner are missing. Ignoring parameter uncertainties, modeling uncertainties and uncertainties related to human-environment interactions can modify decisions and lead to overlooking important management possibilities. In this contribution, we present a new approach for mapping the uncertainties in the assessment of multiple ecosystem services. The spatially explicit risk approach links Bayesian networks to a Geographic Information System for forecasting the value of a bundle of ecosystem services and quantifies the uncertainties related to the outcomes in a spatially explicit manner. We demonstrate that mapping uncertainties in ecosystem services assessments provides key information for decision-makers seeking critical areas in the delivery of ecosystem services in a case study in the Swiss Alps. The results suggest that not only the total value of the bundle of ecosystem services is highly dependent on uncertainties, but the spatial pattern of the ecosystem services values changes substantially when considering uncertainties. This is particularly important for the long-term management of mountain forest ecosystems, which have long rotation stands and are highly sensitive to pressing climate and socio-economic changes. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Variability And Uncertainty Analysis Of Contaminant Transport Model Using Fuzzy Latin Hypercube Sampling Technique

    NASA Astrophysics Data System (ADS)

    Kumar, V.; Nayagum, D.; Thornton, S.; Banwart, S.; Schuhmacher2, M.; Lerner, D.

    2006-12-01

    Characterization of uncertainty associated with groundwater quality models is often of critical importance, as for example in cases where environmental models are employed in risk assessment. Insufficient data, inherent variability and estimation errors of environmental model parameters introduce uncertainty into model predictions. However, uncertainty analysis using conventional methods such as standard Monte Carlo sampling (MCS) may not be efficient, or even suitable, for complex, computationally demanding models and involving different nature of parametric variability and uncertainty. General MCS or variant of MCS such as Latin Hypercube Sampling (LHS) assumes variability and uncertainty as a single random entity and the generated samples are treated as crisp assuming vagueness as randomness. Also when the models are used as purely predictive tools, uncertainty and variability lead to the need for assessment of the plausible range of model outputs. An improved systematic variability and uncertainty analysis can provide insight into the level of confidence in model estimates, and can aid in assessing how various possible model estimates should be weighed. The present study aims to introduce, Fuzzy Latin Hypercube Sampling (FLHS), a hybrid approach of incorporating cognitive and noncognitive uncertainties. The noncognitive uncertainty such as physical randomness, statistical uncertainty due to limited information, etc can be described by its own probability density function (PDF); whereas the cognitive uncertainty such estimation error etc can be described by the membership function for its fuzziness and confidence interval by ?-cuts. An important property of this theory is its ability to merge inexact generated data of LHS approach to increase the quality of information. The FLHS technique ensures that the entire range of each variable is sampled with proper incorporation of uncertainty and variability. A fuzzified statistical summary of the model results will produce indices of sensitivity and uncertainty that relate the effects of heterogeneity and uncertainty of input variables to model predictions. The feasibility of the method is validated to assess uncertainty propagation of parameter values for estimation of the contamination level of a drinking water supply well due to transport of dissolved phenolics from a contaminated site in the UK.

  20. International survey for good practices in forecasting uncertainty assessment and communication

    NASA Astrophysics Data System (ADS)

    Berthet, Lionel; Piotte, Olivier

    2014-05-01

    Achieving technically sound flood forecasts is a crucial objective for forecasters but remains of poor use if the users do not understand properly their significance and do not use it properly in decision making. One usual way to precise the forecasts limitations is to communicate some information about their uncertainty. Uncertainty assessment and communication to stakeholders are thus important issues for operational flood forecasting services (FFS) but remain open fields for research. French FFS wants to publish graphical streamflow and level forecasts along with uncertainty assessment in near future on its website (available to the greater public). In order to choose the technical options best adapted to its operational context, it carried out a survey among more than 15 fellow institutions. Most of these are providing forecasts and warnings to civil protection officers while some were mostly working for hydroelectricity suppliers. A questionnaire has been prepared in order to standardize the analysis of the practices of the surveyed institutions. The survey was conducted by gathering information from technical reports or from the scientific literature, as well as 'interviews' driven by phone, email discussions or meetings. The questionnaire helped in the exploration of practices in uncertainty assessment, evaluation and communication. Attention was paid to the particular context within which every insitution works, in the analysis drawn from raw results. Results show that most services interviewed assess their forecasts uncertainty. However, practices can differ significantly from a country to another. Popular techniques are ensemble approaches. They allow to take into account several uncertainty sources. Statistical past forecasts analysis (such as the quantile regressions) are also commonly used. Contrary to what was expected, only few services emphasize the role of the forecaster (subjective assessment). Similar contrasts can be observed in uncertainty communication practices. Some countries are quite advanced in uncertainty communication to the general public whereas most of them restrain this communication to pre-defined stakeholders who have previously been sensitized or trained. Differents forms of communication were met during the survey, from written comments to complex graphics. No form could claim a clear leadership. This survey revealed useful to identify some difficulties in the design of the next French forecast uncertainty assessment and communication schemes.

  1. Uncertainty and Variability in Physiologically-Based ...

    EPA Pesticide Factsheets

    EPA announced the availability of the final report, Uncertainty and Variability in Physiologically-Based Pharmacokinetic (PBPK) Models: Key Issues and Case Studies. This report summarizes some of the recent progress in characterizing uncertainty and variability in physiologically-based pharmacokinetic models and their predictions for use in risk assessment. This report summarizes some of the recent progress in characterizing uncertainty and variability in physiologically-based pharmacokinetic models and their predictions for use in risk assessment.

  2. Uncertainty Assessment of Hypersonic Aerothermodynamics Prediction Capability

    NASA Technical Reports Server (NTRS)

    Bose, Deepak; Brown, James L.; Prabhu, Dinesh K.; Gnoffo, Peter; Johnston, Christopher O.; Hollis, Brian

    2011-01-01

    The present paper provides the background of a focused effort to assess uncertainties in predictions of heat flux and pressure in hypersonic flight (airbreathing or atmospheric entry) using state-of-the-art aerothermodynamics codes. The assessment is performed for four mission relevant problems: (1) shock turbulent boundary layer interaction on a compression corner, (2) shock turbulent boundary layer interaction due a impinging shock, (3) high-mass Mars entry and aerocapture, and (4) high speed return to Earth. A validation based uncertainty assessment approach with reliance on subject matter expertise is used. A code verification exercise with code-to-code comparisons and comparisons against well established correlations is also included in this effort. A thorough review of the literature in search of validation experiments is performed, which identified a scarcity of ground based validation experiments at hypersonic conditions. In particular, a shortage of useable experimental data at flight like enthalpies and Reynolds numbers is found. The uncertainty was quantified using metrics that measured discrepancy between model predictions and experimental data. The discrepancy data is statistically analyzed and investigated for physics based trends in order to define a meaningful quantified uncertainty. The detailed uncertainty assessment of each mission relevant problem is found in the four companion papers.

  3. Western oil shale development: a technology assessment. Volume 1. Main report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1981-11-01

    The general goal of this study is to present the prospects of shale oil within the context of (1) environmental constraints, (2) available natural and economic resources, and (3) the characteristics of existing and emerging technology. The objectives are: to review shale oil technologies objectively as a means of supplying domestically produced fuels within environmental, social, economic, and legal/institutional constraints; using available data, analyses, and experienced judgment, to examine the major points of uncertainty regarding potential impacts of oil shale development; to resolve issues where data and analyses are compelling or where conclusions can be reached on judgmental grounds; tomore » specify issues which cannot be resolved on the bases of the data, analyses, and experienced judgment currently available; and when appropriate and feasible, to suggest ways for the removal of existing uncertainties that stand in the way of resolving outstanding issues.« less

  4. Uncertainty after treatment for prostate cancer: definition, assessment, and management.

    PubMed

    Yu Ko, Wellam F; Degner, Lesley F

    2008-10-01

    Prostate cancer is the second most common type of cancer in men living in the United States and the most common type of malignancy in Canadian men, accounting for 186,320 new cases in the United States and 24,700 in Canada in 2008. Uncertainty, a component of all illness experiences, influences how men perceive the processes of treatment and adaptation. The Reconceptualized Uncertainty in Illness Theory explains the chronic nature of uncertainty in cancer survivorship by describing a shift from an emergent acute phase of uncertainty in survivors to a new level of uncertainty that is no longer acute and becomes a part of daily life. Proper assessment of certainty and uncertainty may allow nurses to maximize the effectiveness of patient-provider communication, cognitive reframing, and problem-solving interventions to reduce uncertainty after cancer treatment.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zozor, Steeve; Portesi, Mariela; Sanchez-Moreno, Pablo

    The position-momentum uncertainty-like inequality based on moments of arbitrary order for d-dimensional quantum systems, which is a generalization of the celebrated Heisenberg formulation of the uncertainty principle, is improved here by use of the Renyi-entropy-based uncertainty relation. The accuracy of the resulting lower bound is physico-computationally analyzed for the two main prototypes in d-dimensional physics: the hydrogenic and oscillator-like systems.

  6. Disentangling the uncertainty of hydrologic drought characteristics in a multi-model century-long experiment in continental river basins

    NASA Astrophysics Data System (ADS)

    Samaniego, Luis; Kumar, Rohini; Pechlivanidis, Illias; Breuer, Lutz; Wortmann, Michel; Vetter, Tobias; Flörke, Martina; Chamorro, Alejandro; Schäfer, David; Shah, Harsh; Zeng, Xiaofan

    2016-04-01

    The quantification of the predictive uncertainty in hydrologic models and their attribution to its main sources is of particular interest in climate change studies. In recent years, a number of studies have been aimed at assessing the ability of hydrologic models (HMs) to reproduce extreme hydrologic events. Disentangling the overall uncertainty of streamflow -including its derived low-flow characteristics- into individual contributions, stemming from forcings and model structure, has also been studied. Based on recent literature, it can be stated that there is a controversy with respect to which source is the largest (e.g., Teng, et al. 2012, Bosshard et al. 2013, Prudhomme et al. 2014). Very little has also been done to estimate the relative impact of the parametric uncertainty of the HMs with respect to overall uncertainty of low-flow characteristics. The ISI-MIP2 project provides a unique opportunity to understand the propagation of forcing and model structure uncertainties into century-long time series of drought characteristics. This project defines a consistent framework to deal with compatible initial conditions for the HMs and a set of standardized historical and future forcings. Moreover, the ensemble of hydrologic model predictions varies across a broad range of climate scenarios and regions. To achieve this goal, we use six preconditioned hydrologic models (HYPE or HBV, mHM, SWIM, VIC, and WaterGAP3) set up in seven large continental river basins: Amazon, Blue Nile, Ganges, Niger, Mississippi, Rhine, Yellow. These models are forced with bias-corrected outputs of five CMIP5 general circulation models (GCM) under four extreme representative concentration pathway (RCP) scenarios (i.e. 2.6, 4.5, 6.0, and 8.5 Wm-2) for the period 1971-2099. Simulated streamflow is transformed into a monthly runoff index (RI) to analyze the attribution of the GCM and HM uncertainty into drought magnitude and duration over time. Uncertainty contributions are investigated during periods: 1) 2006-2035, 2) 2036-2065 and 3) 2070-2099. Results presented in Samaniego et al. 2015 (submitted) indicate that GCM uncertainty mostly dominates over HM uncertainty for predictions of runoff drought characteristics, irrespective of the selected RCP and region. For the mHM model, in particular, GCM uncertainty always dominates over parametric uncertainty. In general, the overall uncertainty increases with time. The larger the radiative forcing of the RCP, the larger the uncertainty in drought characteristics, however, the propagation of the GCM uncertainty onto a drought characteristic depends largely upon the hydro-climatic regime. While our study emphasizes the need for multi-model ensembles for the assessment of future drought projections, the agreement between GCM forcings is still weak to draw conclusive recommendations. References: L. Samaniego, R. Kumar, I. G. Pechlivanidis, L. Breuer, M. Wortmann, T. Vetter, M. Flörke, A. Chamorro, D. Schäfer, H. Shah, X. Zeng: Propagation of forcing and model uncertainty into hydrological drought characteristics in a multi-model century-long experiment in continental river basins. Submitted to Climatic Change on Dec 2015. Bosshard, et al. 2013. doi:10.1029/2011WR011533. Prudhomme et al. 2014, doi:10.1073/pnas.1222473110. Teng, et al. 2012, doi:10.1175/JHM-D-11-058.1.

  7. Data quality control in eco-environmental monitoring

    NASA Astrophysics Data System (ADS)

    Lu, Chunyan; Wang, Jing

    2007-11-01

    With the development of science and technology, a number of environmental issues, such as sustainable development, climate change, environmental pollution, and land degradation become serious. Greater attention has been attached to environmental protection. The government gradually launched some eco--environmental construction projects. In 1999, China begin to carry out the project of Grain-for-Green in the west, to improve the eco-environment, and it make some good effect, but there are some questions that still can not be answered. How about the new grass or forest? Where are they? How can we do in the future? To answer these questions, the government began to monitor the eco-environment, based on remote sensing technology. Geography information can be attained timely, but the issue of uncertainty has become increasingly recognized, and this uncertainty affects the reliability of applications using the data. This article analyzed the process of eco-environment monitoring, the uncertainty of geography information, and discussed the methods of data quality control. The Spot5 span data and multi-spectral data in 2003(2002) were used, combined with land use survey data at the scale of 1:10,000, topography data at the scale of 1:10,000, and the local Grain-for-Green project map. Also the social and economic data were collected. Eco-environmental monitoring is a process which consists of several steps, such as image geometric correction, image matching, information extraction, and so on. Based on visual and automated method, land information turned to grass and forest from cultivated land was obtained by comparing the information form remote sensing data with the land survey data, and local Grain-for-Green project data, combined with field survey. According to the process, the uncertainty in the process was analyzed. Positional uncertainty, attribute uncertainty, and thematic uncertainty was obvious. Positional uncertainty mainly derived from image geometric correction, such as data resource, the number and spatial distribution of the control points are important resource of uncertainty. Attribution uncertainty mainly derived from the process of information extraction. Land classification system, artificial error was the main factor induced uncertainty. Concept defined was not clear, and it reduced thematic uncertainty. According to the resource of uncertainty, data quality control methods were put forward to improve the data quality. At first, it is more important to choose appropriate remote sensing data and other basic data. Secondly, the accuracy of digital orthophoto map should be controlled. Thirdly, it is necessary to check the result data according to relative data quality criterion to guarantee GIS data quality.

  8. Accounting for Global Climate Model Projection Uncertainty in Modern Statistical Downscaling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johannesson, G

    2010-03-17

    Future climate change has emerged as a national and a global security threat. To carry out the needed adaptation and mitigation steps, a quantification of the expected level of climate change is needed, both at the global and the regional scale; in the end, the impact of climate change is felt at the local/regional level. An important part of such climate change assessment is uncertainty quantification. Decision and policy makers are not only interested in 'best guesses' of expected climate change, but rather probabilistic quantification (e.g., Rougier, 2007). For example, consider the following question: What is the probability that themore » average summer temperature will increase by at least 4 C in region R if global CO{sub 2} emission increases by P% from current levels by time T? It is a simple question, but one that remains very difficult to answer. It is answering these kind of questions that is the focus of this effort. The uncertainty associated with future climate change can be attributed to three major factors: (1) Uncertainty about future emission of green house gasses (GHG). (2) Given a future GHG emission scenario, what is its impact on the global climate? (3) Given a particular evolution of the global climate, what does it mean for a particular location/region? In what follows, we assume a particular GHG emission scenario has been selected. Given the GHG emission scenario, the current batch of the state-of-the-art global climate models (GCMs) is used to simulate future climate under this scenario, yielding an ensemble of future climate projections (which reflect, to some degree our uncertainty of being able to simulate future climate give a particular GHG scenario). Due to the coarse-resolution nature of the GCM projections, they need to be spatially downscaled for regional impact assessments. To downscale a given GCM projection, two methods have emerged: dynamical downscaling and statistical (empirical) downscaling (SDS). Dynamic downscaling involves configuring and running a regional climate model (RCM) nested within a given GCM projection (i.e., the GCM provides bounder conditions for the RCM). On the other hand, statistical downscaling aims at establishing a statistical relationship between observed local/regional climate variables of interest and synoptic (GCM-scale) climate predictors. The resulting empirical relationship is then applied to future GCM projections. A comparison of the pros and cons of dynamical versus statistical downscaling is outside the scope of this effort, but has been extensively studied and the reader is referred to Wilby et al. (1998); Murphy (1999); Wood et al. (2004); Benestad et al. (2007); Fowler et al. (2007), and references within those. The scope of this effort is to study methodology, a statistical framework, to propagate and account for GCM uncertainty in regional statistical downscaling assessment. In particular, we will explore how to leverage an ensemble of GCM projections to quantify the impact of the GCM uncertainty in such an assessment. There are three main component to this effort: (1) gather the necessary climate-related data for a regional SDS study, including multiple GCM projections, (2) carry out SDS, and (3) assess the uncertainty. The first step is carried out using tools written in the Python programming language, while analysis tools were developed in the statistical programming language R; see Figure 1.« less

  9. Comparison of methods for measuring atmospheric deposition of arsenic, cadmium, nickel and lead.

    PubMed

    Aas, Wenche; Alleman, Laurent Y; Bieber, Elke; Gladtke, Dieter; Houdret, Jean-Luc; Karlsson, Vuokko; Monies, Christian

    2009-06-01

    A comprehensive field intercomparison at four different types of European sites (two rural, one urban and one industrial) comparing three different collectors (wet only, bulk and Bergerhoff samplers) was conducted in the framework of the European Committee for Standardization (CEN) to create an European standard for the deposition of the four elements As, Cd, Ni and Pb. The purpose was to determine whether the proposed methods lead to results within the uncertainty required by the EU's daughter directive (70%). The main conclusion is that a different sampling strategy is needed for rural and industrial sites. Thus, the conclusions on uncertainties and sample approach are presented separately for the different approaches. The wet only and bulk collector ("bulk bottle method") are comparable at wet rural sites where the total deposition arises mainly from precipitation, the expanded uncertainty when comparing these two types of sampler are below 45% for As, Cd and Pb, 67% for Ni. At industrial sites and possibly very dry rural and urban sites it is necessary to use Bergerhoff samplers or a "bulk bottle+funnel method". It is not possible to address the total deposition estimation with these methods, but they will give the lowest estimate of the total deposition. The expanded uncertainties when comparing the Bergerhoff and the bulk bottle+funnel methods are below 50% for As and Cd, and 63% for Pb. The uncertainty for Ni was not addressed since the bulk bottle+funnel method did not include a full digestion procedure which is necessary for sites with high loads of undissolved metals. The lowest estimate can however be calculated by comparing parallel Bergerhoff samplers where the expanded uncertainty for Ni was 24%. The reproducibility is comparable to the between sampler/method uncertainties. Sampling and sample preparation were proved to be the main factors in the uncertainty budget of deposition measurements.

  10. A systematic uncertainty analysis of an evaluative fate and exposure model.

    PubMed

    Hertwich, E G; McKone, T E; Pease, W S

    2000-08-01

    Multimedia fate and exposure models are widely used to regulate the release of toxic chemicals, to set cleanup standards for contaminated sites, and to evaluate emissions in life-cycle assessment. CalTOX, one of these models, is used to calculate the potential dose, an outcome that is combined with the toxicity of the chemical to determine the Human Toxicity Potential (HTP), used to aggregate and compare emissions. The comprehensive assessment of the uncertainty in the potential dose calculation in this article serves to provide the information necessary to evaluate the reliability of decisions based on the HTP A framework for uncertainty analysis in multimedia risk assessment is proposed and evaluated with four types of uncertainty. Parameter uncertainty is assessed through Monte Carlo analysis. The variability in landscape parameters is assessed through a comparison of potential dose calculations for different regions in the United States. Decision rule uncertainty is explored through a comparison of the HTP values under open and closed system boundaries. Model uncertainty is evaluated through two case studies, one using alternative formulations for calculating the plant concentration and the other testing the steady state assumption for wet deposition. This investigation shows that steady state conditions for the removal of chemicals from the atmosphere are not appropriate and result in an underestimate of the potential dose for 25% of the 336 chemicals evaluated.

  11. Investigating the effect and uncertainties of light absorbing impurities in snow and ice on snow melt and discharge generation using a hydrologic catchment model and satellite data

    NASA Astrophysics Data System (ADS)

    Matt, Felix; Burkhart, John F.

    2017-04-01

    Light absorbing impurities in snow and ice (LAISI) originating from atmospheric deposition enhance snow melt by increasing the absorption of short wave radiation. The consequences are a shortening of the snow cover duration due to increased snow melt and, with respect to hydrologic processes, a temporal shift in the discharge generation. However, the magnitude of these effects as simulated in numerical models have large uncertainties, originating mainly from uncertainties in the wet and dry deposition of light absorbing aerosols, limitations in the model representation of the snowpack, and the lack of observable variables required to estimate model parameters and evaluate the simulated variables connected with the representation of LAISI. This leads to high uncertainties in the additional energy absorbed by the snow due to the presence of LAISI, a key variable in understanding snowpack energy-balance dynamics. In this study, we assess the effect of LAISI on snow melt and discharge generation and the involved uncertainties in a high mountain catchment located in the western Himalayas by using a distributed hydrological catchment model with focus on the representation of the seasonal snow pack. The snow albedo is hereby calculated from a radiative transfer model for snow, taking the increased absorption of short wave radiation by LAISI into account. Meteorological forcing data is generated from an assimilation of observations and high resolution WRF simulations, and LAISI mixing ratios from deposition rates of Black Carbon simulated with the FLEXPART model. To asses the quality of our simulations and the related uncertainties, we compare the simulated additional energy absorbed by the snow due to the presence of LAISI to the MODIS Dust Radiative Forcing in Snow (MODDRFS) algorithm satellite product.

  12. Evaluation of uncertainties in the CRCM-simulated North American climate

    NASA Astrophysics Data System (ADS)

    de Elía, Ramón; Caya, Daniel; Côté, Hélène; Frigon, Anne; Biner, Sébastien; Giguère, Michel; Paquin, Dominique; Harvey, Richard; Plummer, David

    2008-02-01

    This work is a first step in the analysis of uncertainty sources in the RCM-simulated climate over North America. Three main sets of sensitivity studies were carried out: the first estimates the magnitude of internal variability, which is needed to evaluate the significance of changes in the simulated climate induced by any model modification. The second is devoted to the role of CRCM configuration as a source of uncertainty, in particular the sensitivity to nesting technique, domain size, and driving reanalysis. The third study aims to assess the relative importance of the previously estimated sensitivities by performing two additional sensitivity experiments: one, in which the reanalysis driving data is replaced by data generated by the second generation Coupled Global Climate Model (CGCM2), and another, in which a different CRCM version is used. Results show that the internal variability, triggered by differences in initial conditions, is much smaller than the sensitivity to any other source. Results also show that levels of uncertainty originating from liberty of choices in the definition of configuration parameters are comparable among themselves and are smaller than those due to the choice of CGCM or CRCM version used. These results suggest that uncertainty originated by the CRCM configuration latitude (freedom of choice among domain sizes, nesting techniques and reanalysis dataset), although important, does not seem to be a major obstacle to climate downscaling. Finally, with the aim of evaluating the combined effect of the different uncertainties, the ensemble spread is estimated for a subset of the analysed simulations. Results show that downscaled surface temperature is in general more uncertain in the northern regions, while precipitation is more uncertain in the central and eastern US.

  13. Examination of the uncertainty in contaminant fate and transport modeling: a case study in the Venice Lagoon.

    PubMed

    Sommerfreund, J; Arhonditsis, G B; Diamond, M L; Frignani, M; Capodaglio, G; Gerino, M; Bellucci, L; Giuliani, S; Mugnai, C

    2010-03-01

    A Monte Carlo analysis is used to quantify environmental parametric uncertainty in a multi-segment, multi-chemical model of the Venice Lagoon. Scientific knowledge, expert judgment and observational data are used to formulate prior probability distributions that characterize the uncertainty pertaining to 43 environmental system parameters. The propagation of this uncertainty through the model is then assessed by a comparative analysis of the moments (central tendency, dispersion) of the model output distributions. We also apply principal component analysis in combination with correlation analysis to identify the most influential parameters, thereby gaining mechanistic insights into the ecosystem functioning. We found that modeled concentrations of Cu, Pb, OCDD/F and PCB-180 varied by up to an order of magnitude, exhibiting both contaminant- and site-specific variability. These distributions generally overlapped with the measured concentration ranges. We also found that the uncertainty of the contaminant concentrations in the Venice Lagoon was characterized by two modes of spatial variability, mainly driven by the local hydrodynamic regime, which separate the northern and central parts of the lagoon and the more isolated southern basin. While spatial contaminant gradients in the lagoon were primarily shaped by hydrology, our analysis also shows that the interplay amongst the in-place historical pollution in the central lagoon, the local suspended sediment concentrations and the sediment burial rates exerts significant control on the variability of the contaminant concentrations. We conclude that the probabilistic analysis presented herein is valuable for quantifying uncertainty and probing its cause in over-parameterized models, while some of our results can be used to dictate where additional data collection efforts should focus on and the directions that future model refinement should follow. (c) 2009 Elsevier Inc. All rights reserved.

  14. The effect of rainfall measurement uncertainties on rainfall-runoff processes modelling.

    PubMed

    Stransky, D; Bares, V; Fatka, P

    2007-01-01

    Rainfall data are a crucial input for various tasks concerning the wet weather period. Nevertheless, their measurement is affected by random and systematic errors that cause an underestimation of the rainfall volume. Therefore, the general objective of the presented work was to assess the credibility of measured rainfall data and to evaluate the effect of measurement errors on urban drainage modelling tasks. Within the project, the methodology of the tipping bucket rain gauge (TBR) was defined and assessed in terms of uncertainty analysis. A set of 18 TBRs was calibrated and the results were compared to the previous calibration. This enables us to evaluate the ageing of TBRs. A propagation of calibration and other systematic errors through the rainfall-runoff model was performed on experimental catchment. It was found that the TBR calibration is important mainly for tasks connected with the assessment of peak values and high flow durations. The omission of calibration leads to up to 30% underestimation and the effect of other systematic errors can add a further 15%. The TBR calibration should be done every two years in order to catch up the ageing of TBR mechanics. Further, the authors recommend to adjust the dynamic test duration proportionally to generated rainfall intensity.

  15. Crown fuel spatial variability and predictability of fire spread

    Treesearch

    Russell A. Parsons; Jeremy Sauer; Rodman R. Linn

    2010-01-01

    Fire behavior predictions, as well as measures of uncertainty in those predictions, are essential in operational and strategic fire management decisions. While it is becoming common practice to assess uncertainty in fire behavior predictions arising from variability in weather inputs, uncertainty arising from the fire models themselves is difficult to assess. This is...

  16. Managing geological uncertainty in CO2-EOR reservoir assessments

    NASA Astrophysics Data System (ADS)

    Welkenhuysen, Kris; Piessens, Kris

    2014-05-01

    Recently the European Parliament has agreed that an atlas for the storage potential of CO2 is of high importance to have a successful commercial introduction of CCS (CO2 capture and geological storage) technology in Europe. CO2-enhanced oil recovery (CO2-EOR) is often proposed as a promising business case for CCS, and likely has a high potential in the North Sea region. Traditional economic assessments for CO2-EOR largely neglect the geological reality of reservoir uncertainties because these are difficult to introduce realistically in such calculations. There is indeed a gap between the outcome of a reservoir simulation and the input values for e.g. cost-benefit evaluations, especially where it concerns uncertainty. The approach outlined here is to turn the procedure around, and to start from which geological data is typically (or minimally) requested for an economic assessment. Thereafter it is evaluated how this data can realistically be provided by geologists and reservoir engineers. For the storage of CO2 these parameters are total and yearly CO2 injection capacity, and containment or potential on leakage. Specifically for the EOR operation, two additional parameters can be defined: the EOR ratio, or the ratio of recovered oil over injected CO2, and the CO2 recycling ratio of CO2 that is reproduced after breakthrough at the production well. A critical but typically estimated parameter for CO2-EOR projects is the EOR ratio, taken in this brief outline as an example. The EOR ratio depends mainly on local geology (e.g. injection per well), field design (e.g. number of wells), and time. Costs related to engineering can be estimated fairly good, given some uncertainty range. The problem is usually to reliably estimate the geological parameters that define the EOR ratio. Reliable data is only available from (onshore) CO2-EOR projects in the US. Published studies for the North Sea generally refer to these data in a simplified form, without uncertainty ranges, and are therefore not suited for cost-benefit analysis. They likely result in too optimistic results because onshore configurations are cheaper and different. We propose to translate the detailed US data to the North Sea, retaining their uncertainty ranges. In a first step, a general cost correction can be applied to account for costs specific to the EU and the offshore setting. In a second step site-specific data, including laboratory tests and reservoir modelling, are used to further adapt the EOR ratio values taking into account all available geological reservoir-specific knowledge. And lastly, an evaluation of the field configuration will have an influence on both the cost and local geology dimension, because e.g. horizontal drilling is needed (cost) to improve injectivity (geology). As such, a dataset of the EOR field is obtained which contains all aspects and their uncertainty ranges. With these, a geologically realistic basis is obtained for further cost-benefit analysis of a specific field, where the uncertainties are accounted for using a stochastic evaluation. Such ad-hoc evaluation of geological parameters will provide a better assessment of the CO2-EOR potential of the North Sea oil fields.

  17. Upcoming new international measurement standards in the field of building acoustics

    NASA Astrophysics Data System (ADS)

    Goydke, Hans

    2002-11-01

    The extensively completed revision of most of the ISO measurement standards in building acoustics mainly initiated by the European Commissions demand for harmonized standards emphasized the insight that the main goal to avoid trade barriers between the countries can only be reached when the standards sufficiently and comprehensively cover the field when they are related to the actual state of the art and when they are sufficiently related to practice. In modern architecture one can observe the rapid change in the use of building materials, for instance regarding the use of glass. Lightweight constructions as well as heavyweight building elements with additional linings are increasingly in common use and unquestionably there are consequences to be considered regarding the ascertainment of sound insulation properties. Besides others, International Standardization is unsatisfactory regarding the assessment of noise in buildings from waste water installations, in the low frequency area and in general regarding the expression of uncertainty of measurements. Intensity measurements in building acoustics, rainfall noise assessment, estimation of sound insulation, impulse response measurement methods, assessment of sound scattering are examples of upcoming standards.

  18. Uncertainty Analysis for Peer Assessment: Oral Presentation Skills for Final Year Project

    ERIC Educational Resources Information Center

    Kim, Ho Sung

    2014-01-01

    Peer assessment plays an important role in engineering education for an active involvement in the assessment process, developing autonomy, enhancing reflection, and understanding of how to achieve the learning outcomes. Peer assessment uncertainty for oral presentation skills as part of the FYP assessment is studied. Validity and reliability for…

  19. Quantification for Complex Assessment: Uncertainty Estimation in Final Year Project Thesis Assessment

    ERIC Educational Resources Information Center

    Kim, Ho Sung

    2013-01-01

    A quantitative method for estimating an expected uncertainty (reliability and validity) in assessment results arising from the relativity between four variables, viz examiner's expertise, examinee's expertise achieved, assessment task difficulty and examinee's performance, was developed for the complex assessment applicable to final…

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardenas, Ibsen C., E-mail: c.cardenas@utwente.nl; Halman, Johannes I.M., E-mail: J.I.M.Halman@utwente.nl

    Uncertainty is virtually unavoidable in environmental impact assessments (EIAs). From the literature related to treating and managing uncertainty, we have identified specific techniques for coping with uncertainty in EIAs. Here, we have focused on basic steps in the decision-making process that take place within an EIA setting. More specifically, we have identified uncertainties involved in each decision-making step and discussed the extent to which these can be treated and managed in the context of an activity or project that may have environmental impacts. To further demonstrate the relevance of the techniques identified, we have examined the extent to which the EIA guidelines currently used in Colombia consider and provide guidance on managing the uncertainty involved in these assessments. Some points that should be considered in order to provide greater robustness in impact assessments in Colombia have been identified. These include the management of stakeholder values, the systematic generation of project options and their associated impacts as well as the associated management actions, and the evaluation of uncertainties and assumptions. We believe that the relevant and specific techniques reported here can be a reference for future evaluations of other EIA guidelines in different countries. - Highlights: • Uncertainty is unavoidable in environmental impact assessments (EIAs). • We identify specific techniques for treating and managing uncertainty in these assessments. • Points for improvement that would provide greater robustness in EIAs in Colombia are identified. • The paper provides a substantiated reference for future examinations of EIA guidelines in other countries.

  1. Transfer Standard Uncertainty Can Cause Inconclusive Inter-Laboratory Comparisons

    PubMed Central

    Wright, John; Toman, Blaza; Mickan, Bodo; Wübbeler, Gerd; Bodnar, Olha; Elster, Clemens

    2016-01-01

    Inter-laboratory comparisons use the best available transfer standards to check the participants’ uncertainty analyses, identify underestimated uncertainty claims or unknown measurement biases, and improve the global measurement system. For some measurands, instability of the transfer standard can lead to an inconclusive comparison result. If the transfer standard uncertainty is large relative to a participating laboratory’s uncertainty, the commonly used standardized degree of equivalence ≤ 1 criterion does not always correctly assess whether a participant is working within their uncertainty claims. We show comparison results that demonstrate this issue and propose several criteria for assessing a comparison result as passing, failing, or inconclusive. We investigate the behavior of the standardized degree of equivalence and alternative comparison measures for a range of values of the transfer standard uncertainty relative to the individual laboratory uncertainty values. The proposed alternative criteria successfully discerned between passing, failing, and inconclusive comparison results for the cases we examined. PMID:28090123
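
    For reference, the standardized degree of equivalence discussed above is conventionally computed (notation here is generic, not taken from the paper) as the laboratory's deviation from the reference value divided by the expanded uncertainty of that deviation,

        E_n = \frac{x_{\mathrm{lab}} - x_{\mathrm{ref}}}{\sqrt{U^2(x_{\mathrm{lab}}) + U^2(x_{\mathrm{ref}})}},
        \qquad |E_n| \le 1 \;\text{taken as passing},

    so a transfer standard whose instability inflates U(x_ref) can keep |E_n| below 1 even for a laboratory with a genuine bias, which is precisely the inconclusiveness the paper addresses.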

  2. Assessing and reducing hydrogeologic model uncertainty

    USDA-ARS?s Scientific Manuscript database

    NRC is sponsoring research that couples model abstraction techniques with model uncertainty assessment methods. Insights and information from this program will be useful in decision making by NRC staff, licensees and stakeholders in their assessment of subsurface radionuclide transport. All analytic...

  3. Climate change adaptation and Integrated Water Resource Management in the water sector

    NASA Astrophysics Data System (ADS)

    Ludwig, Fulco; van Slobbe, Erik; Cofino, Wim

    2014-10-01

    Integrated Water Resources Management (IWRM) was introduced in the 1980s to better optimise water use between different water-demanding sectors. However, since it was introduced, water systems have become more complicated due to changes in the global water cycle as a result of climate change. The realization that climate change will have a significant impact on water availability and flood risks has driven research and policy making on adaptation. This paper discusses the main similarities and differences between climate change adaptation and IWRM. The main difference between the two is the focus of IWRM on current and historic issues, compared to the (long-term) future focus of adaptation. One of the main problems in implementing climate change adaptation is the large uncertainty in future projections. Two completely different approaches to adaptation have been developed in response to these large uncertainties. A top-down approach, based on large-scale biophysical impact analyses, focuses on quantifying and minimizing uncertainty by using a large range of scenarios and different climate and impact models. The main problem with this approach is the propagation of uncertainties within the modelling chain. The opposite is the bottom-up approach, which basically ignores uncertainty. It focuses on reducing vulnerabilities, often at local scale, by developing resilient water systems. Both approaches, however, are ill-suited to integration into water management. The bottom-up approach focuses too much on socio-economic vulnerability and too little on developing (technical) solutions. The top-down approach often results in an "explosion" of uncertainty and therefore complicates decision making. A more promising direction for adaptation would be a risk-based approach. Future research should further develop and test an approach that starts with developing adaptation strategies based on current and future risks. These strategies should then be evaluated using a range of future scenarios in order to develop robust adaptation measures and strategies.

  4. Standards for the validation of remotely sensed albedo products

    NASA Astrophysics Data System (ADS)

    Adams, Jennifer

    2015-04-01

    Land surface albedo, defined as the fraction of incoming shortwave radiation reflected by a surface, is an important component of the Earth's energy balance and is one of many Essential Climate Variables (ECVs) that can be retrieved from space through remote sensing. To quantify the accuracy of these products, they must be validated with respect to in-situ measurements of albedo made with an albedometer. Whilst accepted standards exist for the calibration of albedometers, standards for in-situ measurement schemes, and for their use in validation procedures, have yet to be developed. It is essential that we can assess the quality of remotely sensed albedo data and identify traceable sources of uncertainty in the process of providing these data. As a result of the current lack of accepted standards for in-situ albedo retrieval and validation procedures, we are not yet able to identify and quantify traceable sources of uncertainty. Establishing standard protocols for in-situ retrievals for the validation of global albedo products would allow inter-product use and comparison, in addition to product standardization. Accordingly, this study aims to assess the quality of in-situ albedo retrieval schemes and identify sources of uncertainty, specifically in vegetated environments. A 3D Monte Carlo ray tracing model will be used to simulate albedometer instruments in complex 3D vegetation canopies. To determine sources of uncertainty, factors that influence albedo measurement uncertainty were identified and will subsequently be examined: 1. time of day (solar zenith angle); 2. ecosystem type; 3. placement of the albedometer within the ecosystem; 4. height of the albedometer above the canopy; 5. clustering within the ecosystem. A variety of 3D vegetation canopies have been generated to cover the main ecosystems found globally, different seasons, and different plant distributions. Canopies generated include birch-stand and pine-stand forests for summer and winter, savanna, shrubland, cropland and citrus orchard. All canopies were simulated over a 100 x 100 m area to best represent in-situ measurement conditions. Preliminary tests have been conducted, firstly identifying the spectral range required to estimate broadband albedo (BBA) and secondly determining the hyperspectral intervals required to calculate BBA from spectral albedo. Final results are expected to identify, for the aforementioned factors, at a specified confidence level and within 3% accuracy, when the uncertainty of an in-situ measurement falls within these criteria and when it falls outside them. As the uncertainty of in-situ measurements should be assessed on an individual basis accounting for relevant factors, this study aims to document, for a specific scenario, traceable uncertainty sources in in-situ albedo retrieval.

  5. Capturing total chronological and spatial uncertainties in palaeo-ice sheet reconstructions: the DATED example

    NASA Astrophysics Data System (ADS)

    Hughes, Anna; Gyllencreutz, Richard; Mangerud, Jan; Svendsen, John Inge

    2017-04-01

    Glacial geologists generate empirical reconstructions of former ice-sheet dynamics by combining evidence from the preserved record of glacial landforms (e.g. end moraines, lineations) and sediments with chronological evidence (numerical dates derived predominantly from radiocarbon, exposure and luminescence techniques). However, the geomorphological and sedimentological footprints and the chronological data are all incomplete records in both space and time, and all have multiple types of uncertainty associated with them. To understand ice sheets' response to climate we need numerical models of ice-sheet dynamics based on physical principles. To test and/or constrain such models, empirical reconstructions of past ice sheets that capture and acknowledge all uncertainties are required. In 2005 we started a project (Database of the Eurasian Deglaciation, DATED) to produce an empirical reconstruction of the evolution of the last Eurasian ice sheets (including the British-Irish, Scandinavian and Svalbard-Barents-Kara Seas ice sheets) that is fully documented, specified in time, and includes uncertainty estimates. Over 5000 dates relevant to constraining ice build-up and retreat were assessed for reliability and used together with published ice-sheet margin positions based on glacial geomorphology to reconstruct time-slice maps of the ice sheets' extent. The DATED maps show synchronous ice margins with maximum-minimum uncertainty bounds for every 1000 years between 25-10 kyr ago. In the first version of results (DATED-1; Hughes et al. 2016) all uncertainties (both quantitative and qualitative, e.g. precision and accuracy of numerical dates, correlation of moraines, stratigraphic interpretations) were combined based on our best glaciological-geological assessment and expressed in terms of distance as a 'fuzzy' margin. Large uncertainties (>100 km) exist, predominantly across marine sectors and other locations where there are spatial gaps in the dating record (e.g. the timing of coalescence and separation of the Scandinavian and Svalbard-Barents-Kara ice sheets), but also in well-studied areas due to conflicting yet apparently equally robust data. In the four years since the DATED-1 census (1 January 2013), the volume of new information (from both dates and mapped glacial geomorphology) has grown significantly (approximately 1000 new dates). Here, we present work towards the updated version of results, DATED-2, which attempts to further reduce and explicitly report all uncertainties inherent in ice-sheet reconstructions. Hughes, A.L.C., Gyllencreutz, R., Lohne, Ø.S., Mangerud, J., Svendsen, J.I. 2016: The last Eurasian ice sheets - a chronological database and time-slice reconstruction, DATED-1. Boreas 45, 1-45. doi:10.1111/bor.12142

  6. Advanced Small Modular Reactor Economics Model Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, Thomas J.

    2014-10-01

    The US Department of Energy Office of Nuclear Energy's Advanced Small Modular Reactor (SMR) research and development activities focus on four key areas: developing assessment methods for evaluating advanced SMR technologies and characteristics; developing and testing materials, fuels and fabrication techniques; resolving key regulatory issues identified by the US Nuclear Regulatory Commission and industry; and developing advanced instrumentation and controls and human-machine interfaces. This report focuses on the development of assessment methods to evaluate advanced SMR technologies and characteristics. Specifically, this report describes the expansion and application of the economic modeling effort at Oak Ridge National Laboratory. Analysis of the current modeling methods shows that one of the primary concerns for the modeling effort is the handling of uncertainty in cost estimates. Monte Carlo-based methods are commonly used to handle uncertainty, especially when implemented by a stand-alone script within a program such as Python or MATLAB. However, a script-based model requires each potential user to have access to a compiler and an executable capable of handling the script. Making the model accessible to multiple independent analysts is best accomplished by implementing the model in a common computing tool such as Microsoft Excel. Excel is readily available and accessible to most system analysts, but it is not designed for straightforward implementation of a Monte Carlo-based method. Using a Monte Carlo algorithm requires in-spreadsheet scripting and statistical analyses or the use of add-ons such as Crystal Ball. An alternative method uses propagation-of-error calculations in the existing Excel-based system to estimate system cost uncertainty. This method has the advantage of using Microsoft Excel as is, but it requires the use of simplifying assumptions. These assumptions do not necessarily call the analytical results into question. In fact, the analysis shows that the propagation-of-error method introduces essentially negligible error, especially when compared to the uncertainty associated with some of the estimates themselves. The results of these uncertainty analyses generally quantify and identify the sources of uncertainty in the overall cost estimation. The obvious generalization, that capital cost uncertainty is the main driver, can be shown to be accurate for the current state of reactor cost analysis. However, the detailed analysis on a component-by-component basis helps to demonstrate which components would benefit most from research and development to decrease the uncertainty, as well as which components would benefit from research and development to decrease the absolute cost.
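
    The trade-off described above can be made concrete with a toy additive cost model (component figures are invented, not from the ORNL model): for a near-linear model, first-order propagation of error reproduces the Monte Carlo answer almost exactly.

        # Illustrative comparison of first-order error propagation with a
        # Monte Carlo estimate for a simple additive cost model.
        import numpy as np

        rng = np.random.default_rng(0)
        means = np.array([3000.0, 800.0, 450.0])   # capital, O&M, fuel (USD/kW), assumed
        sigmas = np.array([600.0, 120.0, 60.0])    # 1-sigma uncertainties, assumed

        # Propagation of error: for a sum of independent terms, variances add.
        total_mean = means.sum()
        total_sigma = np.sqrt((sigmas ** 2).sum())

        # Monte Carlo: sample each component and sum.
        totals = rng.normal(means, sigmas, size=(200_000, 3)).sum(axis=1)

        print(f"propagation: {total_mean:.0f} +/- {total_sigma:.0f}")
        print(f"Monte Carlo: {totals.mean():.0f} +/- {totals.std():.0f}")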

  7. Validation of the Transient Structural Response of a Threaded Assembly: Phase I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott W.; Hemez, Francois M.; Robertson, Amy N.

    2004-04-01

    This report explores the application of model validation techniques in structural dynamics. The problem of interest is the propagation of an explosive-driven mechanical shock through a complex threaded joint. The study serves the purpose of assessing whether validating a large-size computational model is feasible, which unit experiments are required, and where the main sources of uncertainty reside. The results documented here are preliminary, and the analyses are exploratory in nature. The results obtained to date reveal several deficiencies of the analysis, to be rectified in future work.

  8. Validating the applicability of the GUM procedure

    NASA Astrophysics Data System (ADS)

    Cox, Maurice G.; Harris, Peter M.

    2014-08-01

    This paper is directed at practitioners seeking a degree of assurance in the quality of the results of an uncertainty evaluation when using the procedure in the Guide to the Expression of Uncertainty in Measurement (GUM) (JCGM 100:2008). Such assurance is required in adhering to general standards such as International Standard ISO/IEC 17025 or other sector-specific standards. We investigate the extent to which such assurance can be given. For many practical cases, a measurement result incorporating an evaluated uncertainty that is correct to one significant decimal digit would be acceptable. Any quantification of the numerical precision of an uncertainty statement is naturally relative to the adequacy of the measurement model and the knowledge used of the quantities in that model. For general univariate and multivariate measurement models, we emphasize the use of a Monte Carlo method, as recommended in GUM Supplements 1 and 2. One use of this method is as a benchmark against which measurement results provided by the GUM can be assessed in any particular instance. We mainly consider measurement models that are linear in the input quantities, or that have been linearized where the linearization is deemed adequate. When the probability distributions for those quantities are independent, we indicate the use of other approaches, such as convolution methods based on the fast Fourier transform and, particularly, Chebyshev polynomials, as benchmarks.
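
    A minimal sketch of the benchmarking idea, assuming a toy nonlinear model Y = X1 * X2 with independent Gaussian inputs (the model and numbers are ours, not the paper's): the GUM's linearized law of propagation of uncertainty is compared against a GUM Supplement 1-style Monte Carlo evaluation.

        # GUM linearization vs. Monte Carlo (GUM-S1 style) for Y = X1 * X2.
        import numpy as np

        rng = np.random.default_rng(1)
        x1, u1 = 10.0, 0.5     # best estimate and standard uncertainty, assumed
        x2, u2 = 2.0, 0.3

        # Law of propagation of uncertainty (first order):
        # u_y^2 = (dY/dX1 * u1)^2 + (dY/dX2 * u2)^2 = (x2*u1)^2 + (x1*u2)^2
        u_y_lin = np.sqrt((x2 * u1) ** 2 + (x1 * u2) ** 2)

        # Monte Carlo: propagate full distributions through the model.
        y = rng.normal(x1, u1, 1_000_000) * rng.normal(x2, u2, 1_000_000)
        lo, hi = np.percentile(y, [2.5, 97.5])

        print(f"linearized:  y = {x1 * x2:.2f} +/- {u_y_lin:.2f}")
        print(f"Monte Carlo: mean {y.mean():.2f}, std {y.std():.2f}, "
              f"95% interval [{lo:.2f}, {hi:.2f}]")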

  9. Reconstructing solar magnetic fields from historical observations. II. Testing the surface flux transport model

    NASA Astrophysics Data System (ADS)

    Virtanen, I. O. I.; Virtanen, I. I.; Pevtsov, A. A.; Yeates, A.; Mursula, K.

    2017-07-01

    Aims: We aim to use the surface flux transport model to simulate the long-term evolution of the photospheric magnetic field from historical observations. In this work we study the accuracy of the model and its sensitivity to uncertainties in its main parameters and the input data. Methods: We tested the model by running simulations with different values of meridional circulation and supergranular diffusion parameters, and studied how the flux distribution inside active regions and the initial magnetic field affected the simulation. We compared the results to assess how sensitive the simulation is to uncertainties in meridional circulation speed, supergranular diffusion, and input data. We also compared the simulated magnetic field with observations. Results: We find that there is generally good agreement between simulations and observations. Although the model is not capable of replicating fine details of the magnetic field, the long-term evolution of the polar field is very similar in simulations and observations. Simulations typically yield a smoother evolution of polar fields than observations, which often include artificial variations due to observational limitations. We also find that the simulated field is fairly insensitive to uncertainties in model parameters or the input data. Due to the decay term included in the model the effects of the uncertainties are somewhat minor or temporary, lasting typically one solar cycle.

  10. Assessing the relative importance of parameter and forcing uncertainty and their interactions in conceptual hydrological model simulations

    NASA Astrophysics Data System (ADS)

    Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.

    2016-11-01

    Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
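
    A hedged sketch of the variance decomposition idea (the layout is a generic two-way ANOVA without replication; the study's actual design and data are not reproduced): simulations indexed by forcing ensemble member and parameter set are split into two main effects and an interaction term.

        # Two-way ANOVA-style decomposition of simulated streamflow variance
        # into forcing, parameter, and interaction contributions. q[i, j]
        # stands for a flow statistic from forcing member i and parameter
        # set j; here it is faked with random numbers.
        import numpy as np

        rng = np.random.default_rng(2)
        n_forcing, n_params = 50, 40
        q = rng.normal(size=(n_forcing, n_params))   # placeholder simulations

        grand = q.mean()
        f_eff = q.mean(axis=1) - grand               # forcing main effect
        p_eff = q.mean(axis=0) - grand               # parameter main effect
        inter = q - grand - f_eff[:, None] - p_eff[None, :]

        ss_total = ((q - grand) ** 2).sum()
        print("forcing    :", n_params * (f_eff ** 2).sum() / ss_total)
        print("parameters :", n_forcing * (p_eff ** 2).sum() / ss_total)
        print("interaction:", (inter ** 2).sum() / ss_total)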

  11. Development of risk-based air quality management strategies under impacts of climate change.

    PubMed

    Liao, Kuo-Jen; Amar, Praveen; Tagaris, Efthimios; Russell, Armistead G

    2012-05-01

    Climate change is forecast to adversely affect air quality through perturbations in meteorological conditions, photochemical reactions, and precursor emissions. To protect the environment and human health from air pollution, there is increasing recognition of the need to develop effective air quality management strategies under the impacts of climate change. This paper presents a framework for developing risk-based air quality management strategies that can help policy makers improve their decision-making processes in response to current climate change and to future climate change some 30-50 years from now. Development of air quality management strategies under the impacts of climate change is fundamentally a risk assessment and risk management process involving four steps: (1) assessment of the impacts of climate change and associated uncertainties; (2) determination of air quality targets; (3) selection of potential air quality management options; and (4) identification of preferred air quality management strategies that minimize control costs, maximize benefits, or limit the adverse effects of climate change on air quality given the scarcity of resources. The main challenge relates to the level of uncertainty associated with climate change forecasts and advancements in future control measures, since these will significantly affect the risk assessment results and the development of effective air quality management plans. The concept presented in this paper can help decision makers make appropriate responses to climate change, since it provides an integrated approach for climate risk assessment and management when developing air quality management strategies. Development of climate-responsive air quality management strategies is fundamentally a risk assessment and risk management process. The risk assessment process includes quantification of climate change impacts on air quality and the associated uncertainties. Risk management for air quality under the impacts of climate change includes determination of air quality targets, selection of potential management options, and identification of effective air quality management strategies through decision-making models. The risk-based decision-making framework can also be applied to develop climate-responsive management strategies for other environmental dimensions and to assess the costs and benefits of future environmental management policies.

  12. Evaluation and Intercomparison of MODIS and GEOV1 Global Leaf Area Index Products over Four Sites in North China

    PubMed Central

    Li, Zhenwang; Tang, Huan; Zhang, Baohui; Yang, Guixia; Xin, Xiaoping

    2015-01-01

    This study investigated the performance of the Moderate Resolution Imaging Spectroradiometer (MODIS) and GEOLAND2 Version 1 (GEOV1) Leaf Area Index (LAI) products using ground measurements and LAI reference maps over four sites in North China for 2011-2013. The Terra + Aqua MODIS and Terra MODIS LAI retrieved by the main algorithm, and the GEOV1 LAI within the valid range, were evaluated and intercompared against LAI reference maps to assess their uncertainty and seasonal variability. The results showed that GEOV1 LAI is the product most similar to the LAI reference maps (R2 = 0.78 and RMSE = 0.59). The MODIS products performed well for biomes with low LAI values, but considerable uncertainty arose when the LAI was larger than 3. Terra + Aqua MODIS (R2 = 0.72 and RMSE = 0.68) was slightly more accurate than Terra MODIS (R2 = 0.57 and RMSE = 0.90), producing slightly more successful observations. Both MODIS and GEOV1 products effectively followed the seasonal trajectory of the reference maps, and GEOV1 exhibited a smoother seasonal trajectory than MODIS. MODIS anomalies mainly occurred during summer, likely because of surface reflectance uncertainty, shorter temporal resolutions and inconsistency between simulated and MODIS surface reflectances. This study suggests that further improvements of the MODIS LAI products should focus on finer algorithm inputs and improved seasonal variation modeling of MODIS observations. Future field work considering finer biome maps and better generation of LAI reference maps is still needed. PMID:25781509

  13. A new framework for quantifying uncertainties in modelling studies for future climates - how more certain are CMIP5 precipitation and temperature simulations compared to CMIP3?

    NASA Astrophysics Data System (ADS)

    Sharma, A.; Woldemeskel, F. M.; Sivakumar, B.; Mehrotra, R.

    2014-12-01

    We outline a new framework for assessing uncertainties in model simulations, be they hydro-ecological simulations for known scenarios or climate simulations for assumed scenarios representing the future. This framework is illustrated here using GCM projections of future climates for hydrologically relevant variables (precipitation and temperature), with the uncertainty segregated into three dominant components: model uncertainty, scenario uncertainty (representing greenhouse gas emission scenarios), and ensemble uncertainty (representing uncertain initial conditions and states). A novel uncertainty metric, the Square Root Error Variance (SREV), is used to quantify the uncertainties involved. The SREV requires: (1) interpolating raw and corrected GCM outputs to a common grid; (2) converting these to percentiles; (3) estimating the SREV for model, scenario, initial-condition and total uncertainty at each percentile; and (4) transforming the SREV to a time series. The outcome is a spatially varying series of SREVs associated with each model that can be used to assess how uncertain the system is at each simulated point or time. This framework, while illustrated in a climate change context, is applicable to the assessment of uncertainties in any modelling framework. The proposed method is applied to monthly precipitation and temperature from 6 CMIP3 and 13 CMIP5 GCMs across the world. For CMIP3 the B1, A1B and A2 scenarios are considered, whereas for CMIP5 the RCP2.6, RCP4.5 and RCP8.5 scenarios, representing low, medium and high emissions, are used. For both CMIP3 and CMIP5, model structure is the largest source of uncertainty, which reduces significantly after correcting for biases. Scenario uncertainty increases in future, especially for temperature, due to the divergence of the three emission scenarios analysed. While CMIP5 precipitation simulations exhibit a small reduction in total uncertainty over CMIP3, there is almost no reduction observed for temperature projections. Estimation of uncertainty in both space and time sheds light on the spatial and temporal patterns of uncertainties in GCM outputs, providing an effective platform for risk-based assessments of any alternate plans or decisions that may be formulated using GCM simulations.
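
    A hedged sketch of steps (1)-(4), assuming (our reading, not the paper's formula) that the SREV at each point is the square root of the error variance across the relevant ensemble dimension after a percentile transform; the array of "GCM outputs" is random placeholder data.

        # Toy SREV-style calculation: percentile-transform each model's series,
        # then take the square root of the across-model variance at each time
        # step as a model-uncertainty series. Data are fake placeholders.
        import numpy as np
        from scipy.stats import rankdata

        rng = np.random.default_rng(3)
        n_models, n_time = 13, 120
        x = rng.normal(size=(n_models, n_time))   # monthly anomalies, fake

        # Step 2: convert each model's series to percentiles.
        pct = np.vstack([rankdata(row) / n_time for row in x])

        # Steps 3-4: SREV-like model-uncertainty series (assumed definition).
        srev_model = np.sqrt(pct.var(axis=0, ddof=1))
        print("mean model-uncertainty SREV:", srev_model.mean())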

  14. Using FOSM-Based Data Worth Analyses to Design Geophysical Surveys to Reduce Uncertainty in a Regional Groundwater Model Update

    NASA Astrophysics Data System (ADS)

    Smith, B. D.; White, J.; Kress, W. H.; Clark, B. R.; Barlow, J.

    2016-12-01

    Hydrogeophysical surveys have become an integral part of understanding the hydrogeological frameworks used in groundwater models. Regional models cover large areas where water well data are, at best, scattered and irregular. Since budgets are finite, priorities must be assigned to select optimal areas for geophysical surveys. For airborne electromagnetic (AEM) geophysical surveys, optimization of mapping depth and line spacing needs to take into account the objectives of the groundwater models. The approach discussed here uses a first-order, second-moment (FOSM) uncertainty analysis, which assumes an approximately linear relation between model parameters and observations. This assumption allows FOSM analyses to estimate the value of increased parameter knowledge in reducing forecast uncertainty. FOSM is used to facilitate optimization of yet-to-be-completed geophysical surveying to reduce model forecast uncertainty. The main objective of the geophysical surveying is assumed to be estimating values and spatial variation in hydrologic parameters (i.e. hydraulic conductivity) as well as mapping lower-permeability layers that influence the spatial distribution of recharge flux. The proposed data worth analysis was applied to the Mississippi Embayment Regional Aquifer Study (MERAS), which is being updated. The objective of MERAS is to assess the groundwater availability (status and trends) of the Mississippi embayment aquifer system. The study area covers portions of eight states including Alabama, Arkansas, Illinois, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. The active model grid covers approximately 70,000 square miles, and incorporates some 6,000 miles of major rivers and over 100,000 water wells. In the FOSM analysis, a dense network of pilot points was used to capture uncertainty in hydraulic conductivity and recharge. To simulate the effect of AEM flight lines, the prior uncertainty for hydraulic conductivity and recharge pilot points along potential flight lines was reduced. The FOSM forecast uncertainty estimates were then recalculated and compared to the base forecast uncertainty estimates. The resulting reduction in forecast uncertainty is a measure of the effect of the AEM survey on the model. Iterating through this process results in an optimized flight line layout.
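
    A hedged sketch of the FOSM arithmetic (the matrices below are random placeholders, not MERAS sensitivities): under the linearity assumption, notional observations along a flight line update the parameter covariance via a Schur complement, and forecast variance follows by linear propagation.

        # FOSM data-worth sketch: prior vs. posterior forecast variance.
        import numpy as np

        rng = np.random.default_rng(4)
        n_par, n_obs = 30, 12
        C_p = np.diag(rng.uniform(0.5, 2.0, n_par))  # prior parameter covariance
        X = rng.normal(size=(n_obs, n_par))          # observation sensitivities (Jacobian)
        C_eps = 0.1 * np.eye(n_obs)                  # observation noise covariance
        y = rng.normal(size=n_par)                   # forecast sensitivity vector

        # Posterior covariance: C_p' = C_p - C_p X^T (X C_p X^T + C_eps)^-1 X C_p
        S = X @ C_p @ X.T + C_eps
        C_post = C_p - C_p @ X.T @ np.linalg.solve(S, X @ C_p)

        print("prior forecast variance:    ", y @ C_p @ y)
        print("posterior forecast variance:", y @ C_post @ y)

    The data worth of a candidate flight line is then the drop in forecast variance it buys, which is what allows candidate lines to be ranked before any data are flown.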

  15. Propagating uncertainty from hydrology into human health risk assessment

    NASA Astrophysics Data System (ADS)

    Siirila, E. R.; Maxwell, R. M.

    2013-12-01

    Hydro-geologic modeling and uncertainty assessment of flow and transport parameters can be incorporated into human health risk (both cancer and non-cancer) assessment to better understand the associated uncertainties. This interdisciplinary approach is needed now more than ever, as societal problems concerning water quality are increasingly interdisciplinary as well. For example, uncertainty can originate from environmental conditions such as a lack of information or measurement error, or can manifest as variability, such as differences in physiological and exposure parameters between individuals. To complicate the matter, traditional risk assessment methodologies are independent of time, virtually neglecting any temporal dependence. Here we present not only how uncertainty and variability can be incorporated into a risk assessment, but also how time-dependent risk assessment (TDRA) allows for the calculation of risk as a function of time. The development of TDRA and the inclusion of quantitative risk analysis in this research provide a means to inform decision makers faced with water quality issues and challenges. The stochastic nature of this work also provides a means to address the question of uncertainty in management decisions, a component that is frequently difficult to quantify. To illustrate this new formulation and to investigate hydraulic mechanisms for sensitivity, an example of varying environmental concentration signals resulting from rate dependencies in geochemical reactions is used. Cancer risk is computed and compared using environmental concentration ensembles modeled with sorption as 1) a linear equilibrium assumption and 2) first-order kinetics. Results show that the upscaling of these small-scale processes controls the distribution, magnitude, and associated uncertainty of cancer risk.
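
    A minimal sketch of how a concentration ensemble becomes a risk distribution, using the standard chronic-daily-intake bookkeeping (all exposure numbers and the slope factor are illustrative placeholders, not values from the study):

        # Ensemble of exposure concentrations -> distribution of cancer risk.
        import numpy as np

        rng = np.random.default_rng(5)
        conc = rng.lognormal(mean=-2.0, sigma=0.8, size=50_000)  # mg/L, fake ensemble

        ir, bw = 2.0, 70.0            # L/day intake, kg body weight, assumed
        ef = (350 * 30) / (365 * 70)  # exposure frequency/duration over lifetime
        cdi = conc * ir * ef / bw     # chronic daily intake, mg/(kg*day)

        sf = 0.1                      # slope factor, (mg/(kg*day))^-1, illustrative
        risk = 1.0 - np.exp(-sf * cdi)   # one-hit model; ~ sf*cdi at low dose
        print("P5/P50/P95 risk:", np.percentile(risk, [5, 50, 95]))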

  16. Aiding alternatives assessment with an uncertainty-tolerant hazard scoring method.

    PubMed

    Faludi, Jeremy; Hoang, Tina; Gorman, Patrick; Mulvihill, Martin

    2016-11-01

    This research developed a single-score system to simplify and clarify decision-making in chemical alternatives assessment, accounting for uncertainty. Today, assessing alternatives to hazardous constituent chemicals is a difficult task: rather than comparing alternatives by a single definitive score, many independent toxicological variables must be considered at once, and data gaps are rampant. Thus, most hazard assessments are only comprehensible to toxicologists, but business leaders and politicians need simple scores to make decisions. In addition, they must balance hazard against other considerations, such as product functionality, and they must be aware of the high degrees of uncertainty in chemical hazard data. This research proposes a transparent, reproducible method to translate eighteen hazard endpoints into a simple numeric score with quantified uncertainty, alongside a similar product functionality score, to aid decisions between alternative products. The scoring method uses Clean Production Action's GreenScreen as a guide, but with a different method of score aggregation. It provides finer differentiation between scores than GreenScreen's four-point scale, and it displays uncertainty quantitatively in the final score. Displaying uncertainty also illustrates which alternatives are early in product development versus well-defined commercial products. This paper tested the proposed assessment method through a case study in the building industry, assessing alternatives to spray polyurethane foam insulation containing methylene diphenyl diisocyanate (MDI). The new hazard scoring method successfully identified trade-offs between different alternatives, showing finer resolution than GreenScreen Benchmarking. Sensitivity analysis showed that different weighting schemes in hazard scores had almost no effect on alternatives ranking, compared to uncertainty from data gaps. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Propagation of neutron-reaction uncertainties through multi-physics models of novel LWR's

    NASA Astrophysics Data System (ADS)

    Hernandez-Solis, Augusto; Sjöstrand, Henrik; Helgesson, Petter

    2017-09-01

    The novel design of the renewable boiling water reactor (RBWR) allows a breeding ratio greater than unity and thus aims to provide a self-sustained fuel cycle. The neutron reactions that underlie the different microscopic cross-sections and angular distributions are uncertain, so when they are employed in the determination of the spatial distribution of the neutron flux in a nuclear reactor, a methodology is needed to account for the associated uncertainties. In this work, the Total Monte Carlo (TMC) method is used to propagate the covariances of the different neutron reactions (as well as angular distributions) that are part of the TENDL-2014 nuclear data (ND) library. The main objective is to propagate them through coupled neutronic and thermal-hydraulic models in order to assess the uncertainty of important multi-physics safety parameters, such as the peak cladding temperature along the axial direction of an RBWR fuel assembly. A further aim is to quantify the impact that the ND covariances of important nuclides such as U-235, U-238, Pu-239 and the thermal scattering of hydrogen in H2O have on the deterministic safety analysis of novel nuclear reactor designs.
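
    A hedged sketch of the TMC logic (the model call is a stand-in; no reactor physics is computed): each run uses a different random nuclear-data realization, and the spread of the outputs estimates the ND-driven uncertainty.

        # Total Monte Carlo skeleton: one coupled run per random ND file.
        import numpy as np

        rng = np.random.default_rng(6)

        def run_coupled_model(nd_realization: int) -> float:
            # Placeholder for one neutronics/thermal-hydraulics run returning,
            # e.g., peak cladding temperature (K) for one random TENDL file.
            return 620.0 + rng.normal(0.0, 8.0)   # fake physics, assumed spread

        pct = np.array([run_coupled_model(i) for i in range(300)])
        # The observed spread mixes ND uncertainty with statistical (code) noise;
        # TMC separates them by subtracting the statistical variance in quadrature.
        print(f"peak cladding temperature: {pct.mean():.1f} "
              f"+/- {pct.std(ddof=1):.1f} K (total spread)")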

  18. Conflict or Caveats? Effects of Media Portrayals of Scientific Uncertainty on Audience Perceptions of New Technologies.

    PubMed

    Binder, Andrew R; Hillback, Elliott D; Brossard, Dominique

    2016-04-01

    Research indicates that uncertainty in science news stories affects public assessment of risk and uncertainty. However, the form in which uncertainty is presented may also affect people's risk and uncertainty assessments. For example, a news story that features an expert discussing both what is known and what is unknown about a topic may convey a different form of scientific uncertainty than a story that features two experts who hold conflicting opinions about the status of scientific knowledge of the topic, even when both stories contain the same information about knowledge and its boundaries. This study focuses on audience uncertainty and risk perceptions regarding the emerging science of nanotechnology by manipulating whether uncertainty in a news story about potential risks is attributed to expert sources in the form of caveats (individual uncertainty) or conflicting viewpoints (collective uncertainty). Results suggest that the type of uncertainty portrayed does not impact audience feelings of uncertainty or risk perceptions directly. Rather, the presentation of the story influences risk perceptions only among those who are highly deferent to scientific authority. Implications for risk communication theory and practice are discussed. © 2015 Society for Risk Analysis.

  19. Influences of system uncertainties on the numerical transfer path analysis of engine systems

    NASA Astrophysics Data System (ADS)

    Acri, A.; Nijman, E.; Acri, A.; Offner, G.

    2017-10-01

    Practical mechanical systems operate with some degree of uncertainty. In numerical models, uncertainties can result from poorly known or variable parameters, from geometrical approximations, from discretization or numerical errors, from uncertain inputs, or from rapidly changing forcing that is best described in a stochastic framework. Recently, random matrix theory was introduced to take parameter uncertainties into account in numerical modeling problems. In this paper in particular, Wishart random matrix theory is applied to a multi-body dynamic system to generate random variations of the properties of system components. Multi-body dynamics is a powerful numerical tool widely used in the design of new engines. In this paper the influence of model parameter variability on the results obtained from multi-body simulations of engine dynamics is investigated. The aim is to define a methodology to properly assess and rank system sources when dealing with uncertainties. Particular attention is paid to the influence of these uncertainties on the analysis and assessment of the different engine vibration sources. The effects of different levels of uncertainty are illustrated using a representative numerical powertrain model. A numerical transfer path analysis, based on system dynamic substructuring, is used to derive and assess the internal engine vibration sources. The results obtained from this analysis are used to derive correlations between parameter uncertainties and the statistical distribution of results. The derived statistical information can be used to advance the knowledge of multi-body analysis and the assessment of system sources when uncertainties in model parameters are considered.
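
    A hedged sketch of the Wishart randomization step (the nominal matrix is an arbitrary positive-definite placeholder, not an engine model): scaling the Wishart distribution by its degrees of freedom keeps the mean at the nominal matrix, while the degrees of freedom control the scatter.

        # Wishart-perturbed system matrices around a nominal positive-definite K.
        import numpy as np
        from scipy.stats import wishart

        rng = np.random.default_rng(7)
        A = rng.normal(size=(4, 4))
        K_nom = A @ A.T + 4.0 * np.eye(4)   # nominal matrix, placeholder

        df = 50                              # larger df -> less dispersion
        # With scale = K_nom / df the Wishart mean equals K_nom.
        K_samples = wishart(df=df, scale=K_nom / df).rvs(size=1000, random_state=rng)

        rel = np.linalg.norm(K_samples - K_nom, axis=(1, 2)) / np.linalg.norm(K_nom)
        print(f"mean relative scatter: {rel.mean():.3f}")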

  20. The Approaching Obsolescence of 137Cs as a Means of Dating Wetland Soils and Sediments in North America

    NASA Astrophysics Data System (ADS)

    Drexler, J. Z.; Fuller, C.

    2017-12-01

    Cesium-137 (137Cs) is an anthropogenic radionuclide whose maximum fallout occurred in 1963/4 at the height of above-ground nuclear weapons testing. The presence of this fallout peak in core profiles has been used widely to estimate vertical accretion and carbon accumulation rates in wetlands. 137Cs dating has long been applied with little attention to the uncertainty of peak position or to measurement error. Initially this caused few problems, as activities were high and peaks were generally clear; recently, however, the clarity of peaks has deteriorated, raising questions of method efficacy. We quantified uncertainty in 137Cs dating in 52 wetland sediment/peat cores collected from 2005-2015 in Maine, California, Virginia, North Carolina, South Carolina, and Washington and compared the position of each peak to the date obtained with 210Pb. We found that the two dating methods matched within 5 years for only 20% of cores with a distinct 137Cs peak. We attribute this to a decline in 137Cs efficacy for three main reasons: (1) mobility of 137Cs resulting from diffusion independent of sediments, downwashing, and/or physical/biotic perturbation; (2) ongoing decay of the original 137Cs in situ (half-life = 30.17 years), which manifests in lower signal-to-noise ratios; and (3) 137Cs inputs from watershed/tidal sources, which have confounded the 137Cs pattern in sediments. Such reduced efficacy is of concern because carbon accumulation rates determined with 137Cs are used to inform national-scale carbon assessments and to determine the carbon storage potential of wetlands restored as offsets for the carbon market. We conclude that 137Cs dating alone has sufficient uncertainty that it should be disallowed for carbon accounting, and that any use of 137Cs should be accompanied by an uncertainty analysis of peak position. Our results suggest that the common practice of using 137Cs to corroborate 210Pb dating will soon be obsolete in much of North America.
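
    The decay argument in point (2) is simple arithmetic: with a 30.17-year half-life, the fraction of the 1963/4 peak activity remaining after t years is 0.5^(t/30.17), as the short check below illustrates.

        # Fraction of the 1963/4 fallout peak activity remaining in situ.
        half_life = 30.17   # years, 137Cs

        for year in (1990, 2005, 2017):
            t = year - 1963.5
            frac = 0.5 ** (t / half_life)
            print(f"{year}: {frac:.0%} of the original peak activity remains")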

  1. Drainage Basins as Large-Scale Field Laboratories of Change: Hydro-biogeochemical- economic Model Study Support for Water Pollution and Eutrophication Management Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Destouni, G.

    2008-12-01

    Excess nutrient and pollutant releases from various point and diffuse sources at and below the land surface, associated with land use, industry and households, pose serious eutrophication and pollution risks to inland and coastal water ecosystems worldwide. These risks must be assessed, for instance according to the EU Water Framework Directive (WFD). The WFD demands economically efficient, basin-scale water management for achieving and maintaining good physico-chemical and ecological status in all the inland and coastal waters of EU member states. This paper synthesizes a series of hydro-biogeochemical and linked economic efficiency studies of basin-scale waterborne nutrient and pollutant flows, the development of these flows over recent decades up to their current levels, the main monitoring and modelling uncertainties associated with their quantification, and the effectiveness and economic efficiency of different possible strategies for abating them in order to meet WFD requirements and other environmental goals at local, national and international levels under climate and other regional change. The studies cover different Swedish and Baltic Sea drainage basins. Main findings include the quantification of near-coastal monitoring gaps and of long-term nutrient and pollutant memory in the subsurface (soil-groundwater-sediment) water systems of drainage basins. The former may significantly mask nutrient and pollutant loads to the sea, while the latter may continue to sustain large loads to inland and coastal waters long after source mitigation. A methodology is presented for finding a rational trade-off between the two resource-demanding options: to reduce, or to accept and explicitly account for, the uncertainties implied by these monitoring gaps and long-term nutrient-pollution memories and time lags, and other knowledge, data and model uncertainties that limit the effectiveness and efficiency of water pollution and eutrophication management.

  2. Long-term stormwater quantity and quality analysis using continuous measurements in a French urban catchment.

    PubMed

    Sun, Siao; Barraud, Sylvie; Castebrunet, Hélène; Aubin, Jean-Baptiste; Marmonier, Pierre

    2015-11-15

    The assessment of urban stormwater quantity and quality is important for evaluating and controlling the impact of stormwater on natural waters and the environment. This study mainly addresses the long-term evolution of stormwater quantity and quality in a French urban catchment using continuous measurements from 2004 to 2011. Storm-event-based data series (716 rainfall events and 521 runoff events) are obtained from the measured continuous time series. The Mann-Kendall test is applied to these event-based data series for trend detection. No trend is found in rainfall, while an increasing trend in runoff is detected. As a result, an increasing trend is present in the runoff coefficient, likely due to growing imperviousness of the catchment caused by urbanization. The event mean concentration of total suspended solids (TSS) in stormwater does not present a trend, whereas the event load of TSS has an increasing tendency, which is attributed to the increasing event runoff volume. Uncertainty analysis suggests that the major uncertainty in the trend detection results lies in the available data: a lack of events due to missing data leads to dramatically increased uncertainty in the trend detection results, whereas measurement uncertainty in the time series data plays a negligible role. The intra-event distribution of TSS is studied based on both M(V) curves and pollutant concentrations of absolute runoff volumes. The trend detection test reveals no significant change in the intra-event distribution of TSS in the studied catchment. Copyright © 2015 Elsevier Ltd. All rights reserved.
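
    A minimal sketch of the Mann-Kendall test used above (no tie correction; the series is fake), counting concordant minus discordant pairs and applying the usual normal approximation:

        # Minimal Mann-Kendall trend test with continuity correction.
        import numpy as np
        from scipy.stats import norm

        def mann_kendall(x):
            x = np.asarray(x, dtype=float)
            n = len(x)
            s = 0.0
            for i in range(n - 1):
                s += np.sign(x[i + 1:] - x[i]).sum()   # concordant - discordant
            var_s = n * (n - 1) * (2 * n + 5) / 18.0   # ignores ties
            z = (s - np.sign(s)) / np.sqrt(var_s)      # continuity correction
            p = 2.0 * (1.0 - norm.cdf(abs(z)))
            return s, z, p

        rng = np.random.default_rng(8)
        runoff = np.linspace(0.0, 1.0, 80) + rng.normal(0.0, 0.5, 80)  # weak trend
        s, z, p = mann_kendall(runoff)
        print(f"S = {s:.0f}, Z = {z:.2f}, two-sided p = {p:.3f}")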

  3. Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urrego-Blanco, Jorge Rolando; Urban, Nathan Mark; Hunke, Elizabeth Clare

    Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. Lastly, it is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.
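
    A hedged sketch of the global variance-based analysis described here, run on a toy three-parameter emulator with the SALib package (parameter names and ranges are invented; CICE itself is not run):

        # Sobol' sensitivity indices via Saltelli sampling on a toy emulator.
        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        problem = {
            "num_vars": 3,
            "names": ["snow_conductivity", "snow_grain_size", "pond_drainage"],
            "bounds": [[0.1, 0.5], [50.0, 500.0], [0.0, 1.0]],  # illustrative
        }

        X = saltelli.sample(problem, 1024)   # Sobol'-sequence-based design

        def emulator(x):
            # Toy stand-in for the fast sea ice emulator.
            return 10.0 * x[:, 0] + 0.01 * x[:, 1] + 5.0 * x[:, 0] * x[:, 2]

        Si = sobol.analyze(problem, emulator(X))
        for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
            print(f"{name:18s} S1 = {s1:.2f}  ST = {st:.2f}")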

  4. Multi-parametric variational data assimilation for hydrological forecasting

    NASA Astrophysics Data System (ADS)

    Alvarado-Montero, R.; Schwanenberg, D.; Krahe, P.; Helmke, P.; Klein, B.

    2017-12-01

    Ensemble forecasting is increasingly applied in flow forecasting systems to provide users with a better understanding of forecast uncertainty and consequently to take better-informed decisions. A common practice in probabilistic streamflow forecasting is to force deterministic hydrological model with an ensemble of numerical weather predictions. This approach aims at the representation of meteorological uncertainty but neglects uncertainty of the hydrological model as well as its initial conditions. Complementary approaches use probabilistic data assimilation techniques to receive a variety of initial states or represent model uncertainty by model pools instead of single deterministic models. This paper introduces a novel approach that extends a variational data assimilation based on Moving Horizon Estimation to enable the assimilation of observations into multi-parametric model pools. It results in a probabilistic estimate of initial model states that takes into account the parametric model uncertainty in the data assimilation. The assimilation technique is applied to the uppermost area of River Main in Germany. We use different parametric pools, each of them with five parameter sets, to assimilate streamflow data, as well as remotely sensed data from the H-SAF project. We assess the impact of the assimilation in the lead time performance of perfect forecasts (i.e. observed data as forcing variables) as well as deterministic and probabilistic forecasts from ECMWF. The multi-parametric assimilation shows an improvement of up to 23% for CRPS performance and approximately 20% in Brier Skill Scores with respect to the deterministic approach. It also improves the skill of the forecast in terms of rank histogram and produces a narrower ensemble spread.

  5. Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model

    DOE PAGES

    Urrego-Blanco, Jorge Rolando; Urban, Nathan Mark; Hunke, Elizabeth Clare; ...

    2016-04-01

    Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. Lastly, it is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.

  6. Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model

    NASA Astrophysics Data System (ADS)

    Urrego-Blanco, Jorge R.; Urban, Nathan M.; Hunke, Elizabeth C.; Turner, Adrian K.; Jeffery, Nicole

    2016-04-01

    Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. It is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.

  7. Quantifying uncertainty in discharge measurements: A new approach

    USGS Publications Warehouse

    Kiang, J.E.; Cohn, T.A.; Mason, R.R.

    2009-01-01

    The accuracy of discharge measurements using velocity meters and the velocity-area method is typically assessed based on empirical studies that may not correspond to conditions encountered in practice. In this paper, a statistical approach for assessing uncertainty based on interpolated variance estimation (IVE) is introduced. The IVE method quantifies all sources of random uncertainty in the measured data. This paper presents results employing data from sites where substantial over-sampling allowed for the comparison of IVE-estimated uncertainty and observed variability among repeated measurements. These results suggest that the IVE approach can provide approximate estimates of measurement uncertainty. The use of IVE to estimate the uncertainty of a discharge measurement would provide the hydrographer an immediate determination of uncertainty and help determine whether there is a need for additional sampling in problematic river cross sections. © 2009 ASCE.

  8. Consideration of vertical uncertainty in elevation-based sea-level rise assessments: Mobile Bay, Alabama case study

    USGS Publications Warehouse

    Gesch, Dean B.

    2013-01-01

    The accuracy with which coastal topography has been mapped directly affects the reliability and usefulness of elevation-based sea-level rise vulnerability assessments. Recent research has shown that the qualities of the elevation data must be well understood to properly model potential impacts. The cumulative vertical uncertainty has contributions from elevation data error, water level data uncertainties, and vertical datum and transformation uncertainties. The concepts of minimum sea-level rise increment and minimum planning timeline, important parameters for an elevation-based sea-level rise assessment, are used in recognition of the inherent vertical uncertainty of the underlying data. These concepts were applied to conduct a sea-level rise vulnerability assessment of the Mobile Bay, Alabama, region based on high-quality lidar-derived elevation data. The results that detail the area and associated resources (land cover, population, and infrastructure) vulnerable to a 1.18-m sea-level rise by the year 2100 are reported as a range of values (at the 95% confidence level) to account for the vertical uncertainty in the base data. Examination of the tabulated statistics about land cover, population, and infrastructure in the minimum and maximum vulnerable areas shows that these resources are not uniformly distributed throughout the overall vulnerable zone. The methods demonstrated in the Mobile Bay analysis provide an example of how to consider and properly account for vertical uncertainty in elevation-based sea-level rise vulnerability assessments, and the advantages of doing so.
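
    The cumulative vertical uncertainty referred to above is conventionally combined as a root sum of squares of the independent components, with the 95% (1.96-sigma) value bounding the smallest sea-level rise increment that can credibly be mapped; the component values below are illustrative, not the study's:

        \sigma_{\mathrm{cum}} = \sqrt{\sigma_{\mathrm{elev}}^2 + \sigma_{\mathrm{datum}}^2 + \sigma_{\mathrm{water}}^2}
          = \sqrt{0.10^2 + 0.05^2 + 0.06^2}\ \mathrm{m} \approx 0.13\ \mathrm{m},
        \qquad 1.96\,\sigma_{\mathrm{cum}} \approx 0.25\ \mathrm{m}.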

  9. The Viareggio LPG railway accident: event reconstruction and modeling.

    PubMed

    Brambilla, Sara; Manca, Davide

    2010-10-15

    This manuscript describes in detail the LPG accident that occurred in Viareggio in June 2009, and its modeling. The accident investigation highlighted the uncertainty and complexity of assessing and modeling what happened in the congested environment close to the Viareggio railway station. Nonetheless, the analysis made it possible to comprehend the sequence of events, the way they influenced each other, and the different possible paths/evolutions. The paper describes suitable models for the quantitative assessment of the consequences of the most probable accidental dynamics and its outcomes. The main finding is that about 80 s after the beginning of the release, the dense-gas cloud reached the surrounding houses, which were subsequently destroyed by internal explosions. This fact has two main implications. First, it shows that the adopted modeling framework can give a correct picture of what happened in Viareggio. Second, it confirms the need to develop effective mitigation measures because, in accidents of this kind, there is no time to apply protective emergency plans/actions. © 2010 Elsevier B.V. All rights reserved.

  10. A TIERED APPROACH TO PERFORMING UNCERTAINTY ANALYSIS IN CONDUCTING EXPOSURE ANALYSIS FOR CHEMICALS

    EPA Science Inventory

    The WHO/IPCS draft Guidance Document on Characterizing and Communicating Uncertainty in Exposure Assessment provides guidance on recommended strategies for conducting uncertainty analysis as part of human exposure analysis. Specifically, a tiered approach to uncertainty analysis ...

  11. Monitoring and modeling as a continuing learning process: the use of hydrological models in a general probabilistic framework.

    NASA Astrophysics Data System (ADS)

    Baroni, G.; Gräff, T.; Reinstorf, F.; Oswald, S. E.

    2012-04-01

    Nowadays, uncertainty and sensitivity analysis are considered basic tools for the assessment of hydrological models and the evaluation of the most important sources of uncertainty. In this context, several methods have been developed and applied in different hydrological conditions over the last decades. However, in most cases, studies have mainly investigated the influence of parameter uncertainty on the simulated outputs, and few approaches have also considered other sources of uncertainty, i.e. input data and model structure. Moreover, several constraints arise when spatially distributed parameters are involved. To overcome these limitations, a general probabilistic framework based on Monte Carlo simulations and the Sobol method has been proposed. In this study, the general probabilistic framework was applied at field scale using a 1D physically based hydrological model (SWAP). Furthermore, the framework was extended to catchment scale in combination with a spatially distributed hydrological model (SHETRAN). The models are applied at two different experimental sites in Germany: a relatively flat cropped field close to Potsdam (Brandenburg) and a small mountainous catchment with agricultural land use (Schaefertal, Harz Mountains). In both cases, input and parameters are considered the major sources of uncertainty. Evaluation of the models was based on soil moisture measured at plot scale at different depths and, for the catchment site, also on daily discharge values. The study shows how the framework can take into account the various sources of uncertainty, i.e. input data, parameters (either in scalar or spatially distributed form) and model structures. The framework can be used iteratively in order to optimize further monitoring activities aimed at improving the performance of the model. In these particular applications, the results show that the sources of uncertainty are specific to each process considered. The influence of the input data, as well as the presence of compensating errors, becomes clear from the different processes simulated.
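
    For reference, the Sobol method named above apportions the output variance among the inputs of a model Y = f(X_1, ..., X_k); the standard first-order and total-effect indices for input X_i are

        S_i = \frac{\operatorname{Var}\left[\mathbb{E}(Y \mid X_i)\right]}{\operatorname{Var}(Y)},
        \qquad
        S_{T_i} = 1 - \frac{\operatorname{Var}\left[\mathbb{E}(Y \mid X_{\sim i})\right]}{\operatorname{Var}(Y)}

    where X_{\sim i} denotes all inputs except X_i; the difference S_{T_i} - S_i measures interactions involving X_i.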

  12. The impact of land use on estimates of pesticide leaching potential: Assessments and uncertainties

    NASA Astrophysics Data System (ADS)

    Loague, Keith

    1991-11-01

    This paper illustrates the magnitude of uncertainty that can exist in pesticide leaching assessments, due to data uncertainties, both between soil orders and within a single soil order. The current work differs from previous efforts because the impact of uncertainty in recharge estimates is considered. The examples are for diuron leaching in the Pearl Harbor Basin. The results clearly indicate that land use has a significant impact on both estimates of pesticide leaching potential and the uncertainties associated with those estimates. It appears that the future regulation of agricultural chemicals should include consideration of changing land use.

  13. Dynamic rating curve assessment in hydrometric stations and calculation of the associated uncertainties : Quality and monitoring indicators

    NASA Astrophysics Data System (ADS)

    Morlot, Thomas; Perret, Christian; Favre, Anne-Catherine

    2013-04-01

    Whether for safety reasons, energy production or regulation, water resources management is one of the main concerns of EDF (the French hydropower company). To meet these needs, EDF-DTG has operated a hydrometric network of more than 350 hydrometric stations since the 1950s. The data collected allow real-time monitoring of rivers (hydrometeorological forecasts at points of interest), as well as hydrological studies and the sizing of structures. Ensuring the quality of stream flow data is a priority. A rating curve is an indirect method of estimating river discharge from water level measurements. The discharge obtained from a rating curve is not entirely accurate, owing to constant changes in river bed morphology, to the precision of the gaugings (direct, point discharge measurements) and to the quality of the curve fitting. As time goes on, the uncertainty of the discharge estimated from a rating curve "ages" and increases; the final level of uncertainty therefore remains particularly difficult to assess. Moreover, EDF's current capacity to produce a rating curve is not suited to the frequency of change of the stage-discharge relationship. The present method does not take into consideration the variation of flow conditions and the modifications of the river bed which occur due to natural processes such as erosion, sedimentation and seasonal vegetation growth. In order to obtain the most accurate stream flow data and to improve their reliability, this study develops an original "dynamic" method to compute rating curves based on the historical gaugings of a hydrometric station. A curve is computed for each new gauging and a model of uncertainty is adjusted for each of them. The model of uncertainty takes into account the inaccuracies in the measurement of the water height, the quality of the curve fitting, the uncertainty of the gaugings and the aging of the confidence intervals calculated with a variographic analysis. These rating curves make it possible to provide stream flow values that account for the variability of flow conditions, together with a model of the uncertainties resulting from the aging of the rating curves. By taking into account the variability of the flow conditions and the life of the hydrometric station, this original dynamic method can answer important questions in the field of hydrometry, such as "How many gaugings per year have to be made so as to produce stream flow data with an average uncertainty of X%?" and "When, and in which range of water flow, do those gaugings have to be carried out?". KEY WORDS: Uncertainty, Rating curve, Hydrometric station, Gauging, Variogram, Stream flow
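
    As a schematic of the stage-discharge relation underlying such rating curves, a common functional form is the power law Q = a (h - h0)^b fitted to gaugings; a minimal sketch (the gauging values and fitting choices are illustrative, and this is not EDF's dynamic, variogram-based method):

        import numpy as np
        from scipy.optimize import curve_fit

        def rating(h, a, h0, b):
            # power-law stage-discharge relation: Q = a * (h - h0)**b
            return a * np.clip(h - h0, 1e-6, None) ** b

        # illustrative gaugings: stage (m) and measured discharge (m^3/s)
        h_obs = np.array([0.6, 0.9, 1.3, 1.8, 2.4, 3.1])
        q_obs = np.array([2.1, 6.0, 14.5, 31.0, 58.0, 98.0])

        popt, pcov = curve_fit(rating, h_obs, q_obs, p0=[10.0, 0.3, 1.8])
        perr = np.sqrt(np.diag(pcov))  # 1-sigma parameter uncertainties

        print("a=%.2f  h0=%.2f  b=%.2f" % tuple(popt))
        print("approx. 95% CI half-widths:", 1.96 * perr)

    Refitting such a curve after every new gauging, and letting the confidence interval widen with the time since the last gauging, is the intuition behind the dynamic method described above.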

  14. Experimental Approach for the Uncertainty Assessment of 3D Complex Geometry Dimensional Measurements Using Computed Tomography at the mm and Sub-mm Scales

    PubMed Central

    Jiménez, Roberto; Torralba, Marta; Yagüe-Fabra, José A.; Ontiveros, Sinué; Tosello, Guido

    2017-01-01

    The dimensional verification of miniaturized components with 3D complex geometries is particularly challenging. Computed Tomography (CT) can represent a suitable alternative solution to micro metrology tools based on optical and tactile techniques. However, the establishment of CT systems’ traceability when measuring 3D complex geometries is still an open issue. In this work, an alternative method for the measurement uncertainty assessment of 3D complex geometries by using CT is presented. The method is based on the micro-CT system Maximum Permissible Error (MPE) estimation, determined experimentally by using several calibrated reference artefacts. The main advantage of the presented method is that a previous calibration of the component by a more accurate Coordinate Measuring System (CMS) is not needed. In fact, such a CMS would still hold all the typical limitations of optical and tactile techniques, particularly when measuring miniaturized components with complex 3D geometries, including their inability to measure inner parts. To validate the presented method, the most accepted standard currently available for CT sensors, the Verein Deutscher Ingenieure/Verband Deutscher Elektrotechniker (VDI/VDE) guideline 2630-2.1, is applied. Considering the high number of influence factors in CT and their impact on the measuring result, two different techniques for surface extraction are also considered to obtain a realistic determination of the influence of data processing on uncertainty. The uncertainty assessment of a workpiece used for micro mechanical material testing is first used to confirm the method, due to its feasible calibration by an optical CMS. Secondly, the measurement of a miniaturized dental file with 3D complex geometry is carried out. The estimated uncertainties are eventually compared with the component’s calibration and the micro manufacturing tolerances to demonstrate the suitability of the presented CT calibration procedure. The 2U/T ratios resulting from the validation workpiece are, respectively, 0.27 (VDI) and 0.35 (MPE), while assuring tolerances in the range of ± 20–30 µm. For the dental file, the E_N < 1 value analysis is favorable in the majority of the cases (70.4%) and 2U/T is equal to 0.31 for sub-mm measurands (L < 1 mm and tolerance intervals of ± 40–80 µm). PMID:28509869
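
    For reference, the E_N statistic used in this validation is the standard normalized error from interlaboratory comparisons (a hedged reading; the guideline's exact notation may differ): agreement is accepted when |E_N| < 1, with

        E_N = \frac{x_{\mathrm{CT}} - x_{\mathrm{ref}}}{\sqrt{U_{\mathrm{CT}}^{2} + U_{\mathrm{ref}}^{2}}}

    where x_CT and x_ref are the CT and reference measurements, and U_CT and U_ref their expanded (k = 2) uncertainties.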

  15. Experimental Approach for the Uncertainty Assessment of 3D Complex Geometry Dimensional Measurements Using Computed Tomography at the mm and Sub-mm Scales.

    PubMed

    Jiménez, Roberto; Torralba, Marta; Yagüe-Fabra, José A; Ontiveros, Sinué; Tosello, Guido

    2017-05-16

    The dimensional verification of miniaturized components with 3D complex geometries is particularly challenging. Computed Tomography (CT) can represent a suitable alternative solution to micro metrology tools based on optical and tactile techniques. However, the establishment of CT systems' traceability when measuring 3D complex geometries is still an open issue. In this work, an alternative method for the measurement uncertainty assessment of 3D complex geometries by using CT is presented. The method is based on the micro-CT system Maximum Permissible Error (MPE) estimation, determined experimentally by using several calibrated reference artefacts. The main advantage of the presented method is that a previous calibration of the component by a more accurate Coordinate Measuring System (CMS) is not needed. In fact, such a CMS would still hold all the typical limitations of optical and tactile techniques, particularly when measuring miniaturized components with complex 3D geometries, including their inability to measure inner parts. To validate the presented method, the most accepted standard currently available for CT sensors, the Verein Deutscher Ingenieure/Verband Deutscher Elektrotechniker (VDI/VDE) guideline 2630-2.1, is applied. Considering the high number of influence factors in CT and their impact on the measuring result, two different techniques for surface extraction are also considered to obtain a realistic determination of the influence of data processing on uncertainty. The uncertainty assessment of a workpiece used for micro mechanical material testing is first used to confirm the method, due to its feasible calibration by an optical CMS. Secondly, the measurement of a miniaturized dental file with 3D complex geometry is carried out. The estimated uncertainties are eventually compared with the component's calibration and the micro manufacturing tolerances to demonstrate the suitability of the presented CT calibration procedure. The 2U/T ratios resulting from the validation workpiece are, respectively, 0.27 (VDI) and 0.35 (MPE), while assuring tolerances in the range of ± 20-30 µm. For the dental file, the E_N < 1 value analysis is favorable in the majority of the cases (70.4%) and 2U/T is equal to 0.31 for sub-mm measurands (L < 1 mm and tolerance intervals of ± 40-80 µm).

  16. Assessment of the uncertainty in future projection for summer climate extremes over the East Asia

    NASA Astrophysics Data System (ADS)

    Park, Changyong; Min, Seung-Ki; Cha, Dong-Hyun

    2017-04-01

    Future projections of climate extremes at regional and local scales are essential information for better adapting to climate change. However, such projections carry large uncertainties, arising from internal and external processes, which reduce projection confidence. Using CMIP5 (Coupled Model Intercomparison Project Phase 5) multi-model simulations, we assess uncertainties in future projections of East Asian temperature and precipitation extremes, focusing on summer. Projected summer mean and extreme values of East Asian temperature and precipitation increase with time, and the uncertainty cascades show widening scenario differences and inter-model ranges as time progresses. A positive mean-extreme relation is found in the projections for both temperature and precipitation. Assessing the uncertainty factors for these projections shows that the dominant factors for temperature and precipitation change over time. For the uncertainty of mean and extreme temperature, the contributions of internal variability and model uncertainty decline after the mid-21st century, while the role of scenario uncertainty grows rapidly. For the uncertainty of mean precipitation projections, internal variability is more important than scenario uncertainty. Unlike mean precipitation, extreme precipitation shows that scenario uncertainty is expected to become the dominant factor in the 2090s. Model uncertainty remains an important factor for both mean and extreme precipitation until the late 21st century. The spatial patterns of the uncertainty factors for mean and extreme projections generally follow the temporal changes in the fraction of total variance attributable to each factor across many grid cells of East Asia. ACKNOWLEDGEMENTS The research was supported by the Korea Meteorological Administration Research and Development program under grant KMIPA 2015-2083 and by the National Research Foundation of Korea Grant funded by the Ministry of Science, ICT and Future Planning of Korea (NRF-2016M3C4A7952637); the authors acknowledge their support and assistance in the completion of the study.
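
    One widely used way to express such a partitioning (a sketch in the spirit of the cascade analysis described above, not necessarily this study's exact decomposition) is as fractions of the total projection variance at lead time t:

        F_{\mathrm{scen}}(t) = \frac{S(t)}{T(t)}, \quad
        F_{\mathrm{model}}(t) = \frac{M(t)}{T(t)}, \quad
        F_{\mathrm{int}}(t) = \frac{V}{T(t)}, \qquad
        T(t) = S(t) + M(t) + V

    where S, M and V are the scenario, model and internal-variability variances; the dominant uncertainty factor at a given time is the one with the largest fraction.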

  17. Relationship between Physicians' Uncertainty about Clinical Assessments and Patient-Centered Recommendations for Colorectal Cancer Screening in the Elderly.

    PubMed

    Dalton, Alexandra F; Golin, Carol E; Esserman, Denise; Pignone, Michael P; Pathman, Donald E; Lewis, Carmen L

    2015-05-01

    The goal of this study was to examine associations between physicians' clinical assessments, their certainty in these assessments, and the likelihood of a patient-centered recommendation about colorectal cancer (CRC) screening in the elderly. Two hundred seventy-six primary care physicians in the United States read 3 vignettes about an 80-year-old female patient and answered questions about her life expectancy, their confidence in their life expectancy estimate, the balance of benefits/downsides of CRC screening, their certainty in their benefit/downside assessment, and the best course of action regarding CRC screening. We used logistic regression to determine the relationship between these variables and patient-centered recommendations about CRC screening. In bivariate analyses, physicians had higher odds of making a patient-centered recommendation about CRC screening when their clinical assessments did not lead to a clear screening recommendation or when they experienced uncertainty in their clinical assessments. However, in a multivariate regression model, only benefit/downside assessment and best course of action remained statistically significant predictors of a patient-centered recommendation. Our findings demonstrate that when the results of clinical assessments do not lead to obvious screening decisions or when physicians feel uncertain about their clinical assessments, they are more likely to make patient-centered recommendations. Existing uncertainty frameworks do not adequately describe the uncertainty associated with patient-centered recommendations found in this study. Adapting or modifying these frameworks to better reflect the constructs associated with uncertainty and the interactions between uncertainty and the complexity inherent in clinical decisions will facilitate a more complete understanding of how and when physicians choose to include patients in clinical decisions. © The Author(s) 2015.

  18. Bayesian analysis for erosion modelling of sediments in combined sewer systems.

    PubMed

    Kanso, A; Chebbo, G; Tassin, B

    2005-01-01

    Previous research has confirmed that the sediments at the bed of combined sewer systems are the main source of particulate and organic pollution during rain events contributing to combined sewer overflows. However, existing urban stormwater models utilize inappropriate sediment transport formulas initially developed for alluvial hydrodynamics. Recently, a model has been formulated and thoroughly assessed on the basis of laboratory experiments to simulate the erosion of sediments in sewer pipes, taking into account the increase in strength with depth in the weak layer of deposits. In order to evaluate this model objectively, this paper presents a Bayesian analysis of the model using field data collected in sewer pipes in Paris under known hydraulic conditions. The test was performed using an MCMC sampling method for calibration and uncertainty assessment. Results demonstrate the capacity of the model to reproduce erosion as a direct response to the increase in bed shear stress. This is due to the model's description of the erosional strength in the deposits and to the shape of the measured bed shear stress. However, large uncertainties in some of the model parameters suggest that the model may be over-parameterised and requires a large amount of informative data for its calibration.
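
    A minimal sketch of the MCMC calibration idea (random-walk Metropolis with a Gaussian likelihood; the erosion model, parameter names, and data below are placeholders, not the model assessed in the paper):

        import numpy as np

        rng = np.random.default_rng(0)

        def erosion_model(theta, shear):
            # placeholder erosion law: eroded mass grows once bed shear
            # stress exceeds a critical value theta[1]
            return theta[0] * np.maximum(shear - theta[1], 0.0)

        def log_post(theta, shear, obs, sigma=0.5):
            if theta[0] <= 0 or theta[1] < 0:         # flat priors with bounds
                return -np.inf
            resid = obs - erosion_model(theta, shear)
            return -0.5 * np.sum((resid / sigma) ** 2)

        shear = np.linspace(0.5, 3.0, 20)             # synthetic shear stresses
        obs = erosion_model([2.0, 1.0], shear) + rng.normal(0, 0.5, 20)

        theta = np.array([1.0, 0.5])
        lp = log_post(theta, shear, obs)
        chain = []
        for _ in range(5000):
            prop = theta + rng.normal(0, 0.1, 2)      # random-walk proposal
            lp_prop = log_post(prop, shear, obs)
            if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis acceptance
                theta, lp = prop, lp_prop
            chain.append(theta.copy())

        post = np.array(chain[1000:])                 # discard burn-in
        print(post.mean(axis=0), post.std(axis=0))    # posterior mean / spread

    Wide posterior spreads relative to the prior bounds are the kind of signal used to diagnose possible over-parameterisation.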

  19. Dynamic Event Tree advancements and control logic improvements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alfonsi, Andrea; Rabiti, Cristian; Mandelli, Diego

    The RAVEN code has been under development at the Idaho National Laboratory since 2012. Its main goal is to create a multi-purpose platform for deploying all the capabilities needed for probabilistic risk assessment, uncertainty quantification, data mining analysis and optimization studies. RAVEN is currently equipped with three different sampling categories: Forward samplers (Monte Carlo, Latin Hypercube, Stratified, Grid, Factorial, etc.), Adaptive samplers (Limit Surface search, Adaptive Polynomial Chaos, etc.) and Dynamic Event Tree (DET) samplers (Deterministic and Adaptive Dynamic Event Trees). The main subject of this document is to report the activities that have been done in order to: start the migration of the RAVEN/RELAP-7 control logic system into MOOSE, and develop advanced dynamic sampling capabilities based on the Dynamic Event Tree approach. In order to provide all MOOSE-based applications with a control logic capability, an initial migration activity was initiated this fiscal year, moving the control logic system designed for RELAP-7 by the RAVEN team into the MOOSE framework. A brief explanation of this work is reported in this document. The second and most important subject of this report is the development of a Dynamic Event Tree (DET) sampler named “Hybrid Dynamic Event Tree” (HDET) and its adaptive variant, the “Adaptive Hybrid Dynamic Event Tree” (AHDET). As other authors have already reported, among the different types of uncertainties it is possible to discern two principal classes: aleatory and epistemic uncertainties. The classical Dynamic Event Tree treats the first class (aleatory uncertainties); the dependence of the probabilistic risk assessment and analysis on the epistemic uncertainties is treated by an initial Monte Carlo sampling (MCDET). From each Monte Carlo sample, a DET analysis is run (in total, N trees). The Monte Carlo method employs a pre-sampling of the input space characterized by epistemic uncertainties; the consequent Dynamic Event Tree performs the exploration of the aleatory space. In the RAVEN code, a more general approach has been developed that does not limit the exploration of the epistemic space to a Monte Carlo method but uses all the forward sampling strategies RAVEN currently employs. The user can combine Latin Hypercube, Grid, Stratified and Monte Carlo sampling in order to explore the epistemic space without any limitation. From this pre-sampling, the Dynamic Event Tree sampler starts its exploration of the aleatory space. As reported by the authors, the Dynamic Event Tree is a good fit for developing a goal-oriented sampling strategy, in which the DET is used to drive a Limit Surface search. The methodology developed by the authors last year performs a Limit Surface search in the aleatory space only. This report documents how this approach has been extended in order to consider the epistemic space interacting with the Hybrid Dynamic Event Tree methodology.

  20. Characterizing spatial uncertainty when integrating social data in conservation planning.

    PubMed

    Lechner, A M; Raymond, C M; Adams, V M; Polyakov, M; Gordon, A; Rhodes, J R; Mills, M; Stein, A; Ives, C D; Lefroy, E C

    2014-12-01

    Recent conservation planning studies have presented approaches for integrating spatially referenced social (SRS) data with a view to improving the feasibility of conservation action. We reviewed the growing conservation literature on SRS data, focusing on elicited or stated preferences derived through social survey methods such as choice experiments and public participation geographic information systems. Elicited SRS data includes the spatial distribution of willingness to sell, willingness to pay, willingness to act, and assessments of social and cultural values. We developed a typology for assessing elicited SRS data uncertainty which describes how social survey uncertainty propagates when projected spatially and the importance of accounting for spatial uncertainty such as scale effects and data quality. These uncertainties will propagate when elicited SRS data is integrated with biophysical data for conservation planning and may have important consequences for assessing the feasibility of conservation actions. To explore this issue further, we conducted a systematic review of the elicited SRS data literature. We found that social survey uncertainty was commonly tested for, but that these uncertainties were ignored when projected spatially. Based on these results we developed a framework which will help researchers and practitioners estimate social survey uncertainty and use these quantitative estimates to systematically address uncertainty within an analysis. This is important when using SRS data in conservation applications because decisions need to be made irrespective of data quality and well characterized uncertainty can be incorporated into decision theoretic approaches. © 2014 Society for Conservation Biology.

  1. Assessing species vulnerability to climate change

    NASA Astrophysics Data System (ADS)

    Pacifici, Michela; Foden, Wendy B.; Visconti, Piero; Watson, James E. M.; Butchart, Stuart H. M.; Kovacs, Kit M.; Scheffers, Brett R.; Hole, David G.; Martin, Tara G.; Akçakaya, H. Resit; Corlett, Richard T.; Huntley, Brian; Bickford, David; Carr, Jamie A.; Hoffmann, Ary A.; Midgley, Guy F.; Pearce-Kelly, Paul; Pearson, Richard G.; Williams, Stephen E.; Willis, Stephen G.; Young, Bruce; Rondinini, Carlo

    2015-03-01

    The effects of climate change on biodiversity are increasingly well documented, and many methods have been developed to assess species' vulnerability to climatic changes, both ongoing and projected in the coming decades. To minimize global biodiversity losses, conservationists need to identify those species that are likely to be most vulnerable to the impacts of climate change. In this Review, we summarize different currencies used for assessing species' climate change vulnerability. We describe three main approaches used to derive these currencies (correlative, mechanistic and trait-based), and their associated data requirements, spatial and temporal scales of application and modelling methods. We identify strengths and weaknesses of the approaches and highlight the sources of uncertainty inherent in each method that limit projection reliability. Finally, we provide guidance for conservation practitioners in selecting the most appropriate approach(es) for their planning needs and highlight priority areas for further assessments.

  2. Predicting Ice Sheet and Climate Evolution at Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heimbach, Patrick

    2016-02-06

    A main research objective of PISCEES is the development of formal methods for quantifying uncertainties in ice sheet modeling. Uncertainties in simulating and projecting mass loss from the polar ice sheets arise primarily from initial conditions, surface and basal boundary conditions, and model parameters. In general terms, two main chains of uncertainty propagation may be identified: 1. inverse propagation of observation and/or prior uncertainties onto posterior control variable uncertainties; 2. forward propagation of prior or posterior control variable uncertainties onto those of target output quantities of interest (e.g., climate indices or ice sheet mass loss). A related goal is the development of computationally efficient methods for producing initial conditions for an ice sheet that are close to available present-day observations and essentially free of artificial model drift, which is required in order to be useful for model projections (the “initialization problem”). To be of maximum value, such optimal initial states should be accompanied by useful uncertainty estimates that account for the different sources of uncertainty, as well as the degree to which the optimum state is constrained by available observations. The PISCEES proposal outlined two approaches for quantifying uncertainties. The first targets the full exploration of the uncertainty in model projections with sampling-based methods and a workflow managed by DAKOTA (the main delivery vehicle for software developed under QUEST). This is feasible for low-dimensional problems, e.g., those with a handful of global parameters to be inferred. This approach can benefit from derivative/adjoint information, but such information is not necessary, which is why it is often referred to as “non-intrusive”. The second approach makes heavy use of derivative information from model adjoints to address uncertainty quantification in high dimensions (e.g., basal boundary conditions in ice sheet models). The use of local gradient or Hessian information (i.e., second derivatives of the cost function) requires additional code development and implementation, and is thus often referred to as an “intrusive” approach. Within PISCEES, MIT has been tasked to develop methods for derivative-based UQ, the “intrusive” approach discussed above. These methods rely on the availability of first-derivative (adjoint) and second-derivative (Hessian) code, developed through intrusive methods such as algorithmic differentiation (AD). While representing a significant burden in terms of code development, derivative-based UQ is able to cope with very high-dimensional uncertainty spaces. That is, unlike sampling methods (all variations of Monte Carlo), its computational burden is independent of the dimension of the uncertainty space. This is a significant advantage for spatially distributed uncertainty fields, such as three-dimensional initial conditions, three-dimensional parameter fields, or two-dimensional surface and basal boundary conditions. Importantly, uncertainty fields for ice sheet models generally fall into this category.

  3. Combining exposure and effect modeling into an integrated probabilistic environmental risk assessment for nanoparticles.

    PubMed

    Jacobs, Rianne; Meesters, Johannes A J; Ter Braak, Cajo J F; van de Meent, Dik; van der Voet, Hilko

    2016-12-01

    There is a growing need for good environmental risk assessment of engineered nanoparticles (ENPs). Environmental risk assessment of ENPs has been hampered by lack of data and knowledge about ENPs, their environmental fate, and their toxicity. This leads to uncertainty in the risk assessment. To deal with uncertainty in the risk assessment effectively, probabilistic methods are advantageous. In the present study, the authors developed a method to model both the variability and the uncertainty in environmental risk assessment of ENPs. This method is based on the concentration ratio, the ratio of the exposure concentration to the critical effect concentration, with both concentrations considered to be random. In this method, variability and uncertainty are modeled separately so as to allow the user to see which part of the total variation in the concentration ratio is attributable to uncertainty and which part is attributable to variability. The authors illustrate the use of the method with a simplified aquatic risk assessment of nano-titanium dioxide. The authors' method allows a more transparent risk assessment and can also direct further environmental and toxicological research to the areas in which it is most needed. Environ Toxicol Chem 2016;35:2958-2967. © 2016 The Authors. Environmental Toxicology and Chemistry published by Wiley Periodicals, Inc. on behalf of SETAC.
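
    A sketch of keeping variability and uncertainty separate via nested (two-dimensional) Monte Carlo, with entirely illustrative distributions standing in for the exposure and critical effect concentrations (the authors' actual method and parameter values are not reproduced here):

        import numpy as np

        rng = np.random.default_rng(1)
        n_unc, n_var = 200, 1000  # outer loop: uncertainty; inner: variability

        cr95 = []
        for _ in range(n_unc):
            # outer draw: imperfectly known distribution parameters (uncertainty)
            mu_exp = rng.normal(-1.0, 0.3)   # log10 mean exposure (illustrative)
            mu_eff = rng.normal(1.0, 0.2)    # log10 mean critical effect conc.
            # inner draws: variability across environments / organisms
            exposure = 10.0 ** rng.normal(mu_exp, 0.5, n_var)
            effect = 10.0 ** rng.normal(mu_eff, 0.4, n_var)
            cr = exposure / effect           # concentration ratio; risk if > 1
            cr95.append(np.quantile(cr, 0.95))

        # spread across the outer loop expresses uncertainty about the
        # 95th percentile of the variable concentration ratio
        print(np.quantile(cr95, [0.05, 0.5, 0.95]))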

  4. An overview of methods to identify and manage uncertainty for modelling problems in the water-environment-agriculture cross-sector

    DOE PAGES

    Jakeman, Anthony J.; Jakeman, John Davis

    2018-03-14

    Uncertainty pervades the representation of systems in the water–environment–agriculture cross-sector. Successful methods to address uncertainties have largely focused on standard mathematical formulations of biophysical processes in a single sector, such as partial or ordinary differential equations. More attention to integrated models of such systems is warranted. Model components representing the different sectors of an integrated model can have less standard, and different, formulations to one another, as well as different levels of epistemic knowledge and data informativeness. Thus, uncertainty is not only pervasive but also crosses boundaries and propagates between system components. Uncertainty assessment (UA) cries out for more eclectic treatment in these circumstances, some of it being more qualitative and empirical. In this paper, we discuss the various sources of uncertainty in such a cross-sectoral setting and ways to assess and manage them. We outline a fast-growing set of methodologies, particularly in the computational mathematics literature on uncertainty quantification (UQ), that seem highly pertinent for uncertainty assessment. There appears to be considerable scope for advancing UA by integrating relevant UQ techniques into cross-sectoral problem applications. Of course, this will entail considerable collaboration between the domain specialists who often take first ownership of the problem and computational methods experts.

  5. An overview of methods to identify and manage uncertainty for modelling problems in the water-environment-agriculture cross-sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jakeman, Anthony J.; Jakeman, John Davis

    Uncertainty pervades the representation of systems in the water–environment–agriculture cross-sector. Successful methods to address uncertainties have largely focused on standard mathematical formulations of biophysical processes in a single sector, such as partial or ordinary differential equations. More attention to integrated models of such systems is warranted. Model components representing the different sectors of an integrated model can have less standard, and different, formulations to one another, as well as different levels of epistemic knowledge and data informativeness. Thus, uncertainty is not only pervasive but also crosses boundaries and propagates between system components. Uncertainty assessment (UA) cries out for more eclectic treatment in these circumstances, some of it being more qualitative and empirical. In this paper, we discuss the various sources of uncertainty in such a cross-sectoral setting and ways to assess and manage them. We outline a fast-growing set of methodologies, particularly in the computational mathematics literature on uncertainty quantification (UQ), that seem highly pertinent for uncertainty assessment. There appears to be considerable scope for advancing UA by integrating relevant UQ techniques into cross-sectoral problem applications. Of course, this will entail considerable collaboration between the domain specialists who often take first ownership of the problem and computational methods experts.

  6. Quantification of water resources uncertainties in the Luvuvhu sub-basin of the Limpopo river basin

    NASA Astrophysics Data System (ADS)

    Oosthuizen, N.; Hughes, D.; Kapangaziwiri, E.; Mwenge Kahinda, J.; Mvandaba, V.

    2018-06-01

    In the absence of historical observed data, models are generally used to describe the different hydrological processes and to generate the data and information that inform management and policy decision making. Ideally, any hydrological model should be based on a sound conceptual understanding of the processes in the basin and be backed by quantitative information for the parameterization of the model. However, these data are often inadequate in many sub-basins, necessitating the incorporation of the uncertainty related to the estimation process. This paper reports on the impact of uncertainty in the parameterization of the Pitman monthly model and in water use data on estimates of the water resources of the Luvuvhu, a sub-basin of the Limpopo river basin. The study reviews existing information sources associated with the quantification of water balance components and gives an update of the water resources of the sub-basin. The flows generated by the model at the outlet of the basin were between 44.03 Mm³ and 45.48 Mm³ per month when incorporating +20% uncertainty in the main physical runoff-generating parameters. The total predictive uncertainty of the model increased when water use data, such as small farm dams, large reservoirs and irrigation, were included. The dam capacity data were assigned an average uncertainty of 62%, mainly as a result of the large differences between the available information in the national water resources database and that digitised from satellite imagery. Water used by irrigated crops was estimated with an average uncertainty of about 50%. The mean simulated monthly flows were between 38.57 Mm³ and 54.83 Mm³ after the water use uncertainty was added. However, it is expected that this uncertainty could be reduced by using higher-resolution remote sensing imagery.

  7. Uncertainty in tsunami sediment transport modeling

    USGS Publications Warehouse

    Jaffe, Bruce E.; Goto, Kazuhisa; Sugawara, Daisuke; Gelfenbaum, Guy R.; La Selle, SeanPaul M.

    2016-01-01

    Erosion and deposition from tsunamis record information about tsunami hydrodynamics and size that can be interpreted to improve tsunami hazard assessment. We explore sources and methods for quantifying uncertainty in tsunami sediment transport modeling. Uncertainty varies with tsunami, study site, available input data, sediment grain size, and model. Although uncertainty has the potential to be large, published case studies indicate that both forward and inverse tsunami sediment transport models perform well enough to be useful for deciphering tsunami characteristics, including size, from deposits. New techniques for quantifying uncertainty, such as Ensemble Kalman Filtering inversion, and more rigorous reporting of uncertainties will advance the science of tsunami sediment transport modeling. Uncertainty may be decreased with additional laboratory studies that increase our understanding of the semi-empirical parameters and physics of tsunami sediment transport, standardized benchmark tests to assess model performance, and development of hybrid modeling approaches to exploit the strengths of forward and inverse models.

  8. Effects of in situ stress measurement uncertainties on assessment of predicted seismic activity and risk associated with a hypothetical industrial-scale geologic CO2 sequestration operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeanne, Pierre; Rutqvist, Jonny; Wainwright, Haruko M.

    Carbon capture and storage (CCS) in geologic formations has been recognized as a promising option for reducing carbon dioxide (CO2) emissions from large stationary sources. However, the pressure buildup inside the storage formation can potentially induce slip along preexisting faults, which could lead to felt seismic ground motion and also provide pathways for brine/CO2 leakage into shallow drinking water aquifers. To assess the geomechanical stability of faults, it is of crucial importance to know the in situ state of stress. In situ stress measurements can provide some information on the stresses acting on faults, but with considerable uncertainties. In this paper, we investigate how such uncertainties, as defined by the variation of stress measurements obtained within the study area, could influence the assessment of the geomechanical stability of faults and the characteristics of potential injection-induced seismic events. Our modeling study is based on a hypothetical industrial-scale carbon sequestration project assumed to be located in the Southern San Joaquin Basin in California, USA. We assess the stability of the major (25 km long) fault that bounds the sequestration site and is subjected to significant reservoir pressure changes as a result of 50 years of CO2 injection. We also present a series of geomechanical simulations in which the resolved stresses on the fault were varied over ranges of values corresponding to various stress measurements performed around the study area. The simulation results are analyzed by a statistical approach. Our main results are that the variations in resolved stresses as defined by the range of stress measurements had a negligible effect on the prediction of the seismic risk (maximum magnitude), but an important effect on the timing, the seismicity rate (number of seismic events) and the location of seismic activity.

  9. Effects of in situ stress measurement uncertainties on assessment of predicted seismic activity and risk associated with a hypothetical industrial-scale geologic CO2 sequestration operation

    DOE PAGES

    Jeanne, Pierre; Rutqvist, Jonny; Wainwright, Haruko M.; ...

    2016-10-05

    Carbon capture and storage (CCS) in geologic formations has been recognized as a promising option for reducing carbon dioxide (CO2) emissions from large stationary sources. However, the pressure buildup inside the storage formation can potentially induce slip along preexisting faults, which could lead to felt seismic ground motion and also provide pathways for brine/CO2 leakage into shallow drinking water aquifers. To assess the geomechanical stability of faults, it is of crucial importance to know the in situ state of stress. In situ stress measurements can provide some information on the stresses acting on faults, but with considerable uncertainties. In this paper, we investigate how such uncertainties, as defined by the variation of stress measurements obtained within the study area, could influence the assessment of the geomechanical stability of faults and the characteristics of potential injection-induced seismic events. Our modeling study is based on a hypothetical industrial-scale carbon sequestration project assumed to be located in the Southern San Joaquin Basin in California, USA. We assess the stability of the major (25 km long) fault that bounds the sequestration site and is subjected to significant reservoir pressure changes as a result of 50 years of CO2 injection. We also present a series of geomechanical simulations in which the resolved stresses on the fault were varied over ranges of values corresponding to various stress measurements performed around the study area. The simulation results are analyzed by a statistical approach. Our main results are that the variations in resolved stresses as defined by the range of stress measurements had a negligible effect on the prediction of the seismic risk (maximum magnitude), but an important effect on the timing, the seismicity rate (number of seismic events) and the location of seismic activity.

  10. Overview of the Long-term Ozone Trends and Uncertainties in the Stratosphere (LOTUS) SPARC Activity

    NASA Astrophysics Data System (ADS)

    Petropavlovskikh, I. V.; Hubert, D.; Godin-Beekman, S.; Damadeo, R. P.; Sofieva, V.; Hassler, B.

    2017-12-01

    WMO/UNEP Assessments on the state of the ozone layer (aka Ozone Assessments) require an accurate evaluation of both total ozone and ozone profile long-term trends. These trend results are of utmost importance in order to evaluate the success of the Montreal Protocol with regard to the recovery of the ozone layer, and the effect of climate change on this recovery, in the main regions of the stratosphere (polar, mid-latitudes, tropics). A previous activity sponsored by SPARC, IO3C, IGACO-O3 and NDACC (SI2N) successfully provided estimates of the decreasing ozone profile trend over 1979-1997 and the recovery trend over 1998-2012 from a variety of long-term records; however, its results differed from those published in the WMO 2014 Ozone Assessment report. For the WMO/UNEP 2018 Ozone Assessment, a clear understanding of ozone trends and their significance as a function of altitude and latitude is still needed, nearly 20 years after the peak of ozone-depleting substances in the stratosphere. In recent years, new merged satellite data sets and long-awaited homogenized ozonesonde data series have been produced. There is thus a strong interest in the scientific community in understanding the limitations in determining the significance of ozone recovery. In order to address the issues left pending after the end of SI2N, a comprehensive evaluation of all available long-term data sets, together with their relative drifts, was performed through the SPARC LOTUS (Long-term Ozone Trends and Uncertainties in the Stratosphere) activity. Subjects of investigation included the consistency of results from various statistical trend regression models, sensitivity to the selection of predictors, evaluation of sampling-related uncertainties, and the impact of measurement error propagation on ozone trend calculation. This presentation will provide an overview of the LOTUS project goals, highlight the results and discuss future goals.

  11. Flood Frequency Analysis using different flood descriptors - the Warsaw reach of the river Vistula case study

    NASA Astrophysics Data System (ADS)

    Karamuz, Emilia; Kochanek, Krzysztof; Romanowicz, Renata

    2014-05-01

    Flood frequency analysis (FFA) is customarily performed using annual maximum flows. However, there are a number of other flood descriptors that could be used, among them water levels, peaks over threshold, flood-wave duration, flood volume, etc. In this study we compare different approaches to FFA for their suitability for flood risk assessment. The main goal is to obtain the FFA curve with the smallest possible uncertainty limits, in particular for the distribution tail. The extrapolation of FFA curves is crucial for future flood risk assessment in a changing climate. We compare the FFA curves, together with their uncertainty limits, obtained using flows, water levels, flood inundation area and volumes for the Warsaw reach of the river Vistula. Moreover, we derive the FFA curves obtained using simulated flows. The results are used to derive the error distribution for the maximum simulated and observed values under different modelling techniques and to assess its influence on flood risk predictions for ungauged catchments. MIKE11, HEC-RAS and a transfer function model are applied in average and extreme conditions to model flow propagation in the Warsaw Vistula reach. The additional questions we want to answer are: what is the range of application of different modelling tools under various flow conditions, and how can the uncertainty of flood risk assessment be decreased? This work was partly supported by the projects "Stochastic flood forecasting system (The River Vistula reach from Zawichost to Warsaw)" and "Modern statistical models for analysis of flood frequency and features of flood waves", carried out by the Institute of Geophysics, Polish Academy of Sciences on the order of the National Science Centre (contracts Nos. 2011/01/B/ST10/06866 and 2012/05/B/ST10/00482, respectively). The water level and flow data were provided by the Institute of Meteorology and Water Management (IMGW), Poland.
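
    As a minimal illustration of the annual-maximum approach to FFA (not the study's models or data), fitting a GEV distribution to annual maxima and reading off a return level:

        import numpy as np
        from scipy.stats import genextreme

        # illustrative annual maximum flows (m^3/s); real studies use gauge records
        amax = genextreme.rvs(c=-0.1, loc=1500, scale=400, size=60, random_state=42)

        c, loc, scale = genextreme.fit(amax)
        q100 = genextreme.ppf(1.0 - 1.0 / 100.0, c, loc, scale)  # 100-year flood
        print("100-year return level: %.0f m^3/s" % q100)

    Bootstrap resampling of the fitted record is one simple way to attach the uncertainty limits discussed above, and those limits widen quickly in the distribution tail.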

  12. Assessment of SFR Wire Wrap Simulation Uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delchini, Marc-Olivier G.; Popov, Emilian L.; Pointer, William David

    Predictive modeling and simulation of nuclear reactor performance and fuel are challenging due to the large number of coupled physical phenomena that must be addressed. Models that will be used for design or operational decisions must be analyzed for uncertainty to ascertain impacts to safety or performance. Rigorous, structured uncertainty analyses are performed by characterizing the model’s input uncertainties and then propagating the uncertainties through the model to estimate output uncertainty. This project is part of the ongoing effort to assess modeling uncertainty in Nek5000 simulations of flow configurations relevant to the advanced reactor applications of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. Three geometries are under investigation in these preliminary assessments: a 3-D pipe, a 3-D 7-pin bundle, and a single pin from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) facility. Initial efforts have focused on gaining an understanding of Nek5000 modeling options and integrating Nek5000 with Dakota. These tasks are being accomplished by demonstrating the use of Dakota to assess parametric uncertainties in a simple pipe flow problem. This problem is used to optimize performance of the uncertainty quantification strategy and to estimate computational requirements for assessments of complex geometries. A sensitivity analysis to three turbulence models was conducted for turbulent flow in the single wire-wrapped pin (THORS) geometry. Section 2 briefly describes the software tools used in this study and provides appropriate references. Section 3 presents the coupling interface between Dakota and a computational fluid dynamics (CFD) code (Nek5000 or STAR-CCM+), with details on the workflow, the scripts used for setting up the run, and the scripts used for post-processing the output files. In Section 4, the meshing methods used to generate the THORS and 7-pin bundle meshes are explained. Sections 5, 6 and 7 present numerical results for the 3-D pipe, the single-pin THORS mesh, and the 7-pin bundle mesh, respectively.

  13. A multi-model assessment of terrestrial biosphere model data needs

    NASA Astrophysics Data System (ADS)

    Gardella, A.; Cowdery, E.; De Kauwe, M. G.; Desai, A. R.; Duveneck, M.; Fer, I.; Fisher, R.; Knox, R. G.; Kooper, R.; LeBauer, D.; McCabe, T.; Minunno, F.; Raiho, A.; Serbin, S.; Shiklomanov, A. N.; Thomas, A.; Walker, A.; Dietze, M.

    2017-12-01

    Terrestrial biosphere models provide us with the means to simulate the impacts of climate change and their uncertainties. Going beyond direct observation and experimentation, models synthesize our current understanding of ecosystem processes and can give us insight on data needed to constrain model parameters. In previous work, we leveraged the Predictive Ecosystem Analyzer (PEcAn) to assess the contribution of different parameters to the uncertainty of the Ecosystem Demography model v2 (ED) model outputs across various North American biomes (Dietze et al., JGR-G, 2014). While this analysis identified key research priorities, the extent to which these priorities were model- and/or biome-specific was unclear. Furthermore, because the analysis only studied one model, we were unable to comment on the effect of variability in model structure to overall predictive uncertainty. Here, we expand this analysis to all biomes globally and a wide sample of models that vary in complexity: BioCro, CABLE, CLM, DALEC, ED2, FATES, G'DAY, JULES, LANDIS, LINKAGES, LPJ-GUESS, MAESPA, PRELES, SDGVM, SIPNET, and TEM. Prior to performing uncertainty analyses, model parameter uncertainties were assessed by assimilating all available trait data from the combination of the BETYdb and TRY trait databases, using an updated multivariate version of PEcAn's Hierarchical Bayesian meta-analysis. Next, sensitivity analyses were performed for all models across a range of sites globally to assess sensitivities for a range of different outputs (GPP, ET, SH, Ra, NPP, Rh, NEE, LAI) at multiple time scales from the sub-annual to the decadal. Finally, parameter uncertainties and model sensitivities were combined to evaluate the fractional contribution of each parameter to the predictive uncertainty for a specific variable at a specific site and timescale. Facilitated by PEcAn's automated workflows, this analysis represents the broadest assessment of the sensitivities and uncertainties in terrestrial models to date, and provides a comprehensive roadmap for constraining model uncertainties through model development and data collection.

  14. Assessment of Experimental Uncertainty for a Floating Wind Semisubmersible under Hydrodynamic Loading: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Amy N; Wendt, Fabian F; Jonkman, Jason

    The objective of this paper is to assess the sources of experimental uncertainty in an offshore wind validation campaign focused on better understanding the nonlinear hydrodynamic response behavior of a floating semisubmersible. The test specimen and conditions were simplified compared to other floating wind test campaigns to reduce potential sources of uncertainty and better focus on the hydrodynamic load attributes. Repeat tests were used to understand the repeatability of the test conditions and to assess the level of random uncertainty in the measurements. Attention was also given to understanding bias in all components of the test. The end goal of this work is to set uncertainty bounds on the response metrics of interest, which will be used in future work to evaluate the success of modeling tools in accurately calculating hydrodynamic loads and the associated motion responses of the system.

  15. Quantitative body DW-MRI biomarkers uncertainty estimation using unscented wild-bootstrap.

    PubMed

    Freiman, M; Voss, S D; Mulkern, R V; Perez-Rossello, J M; Warfield, S K

    2011-01-01

    We present a new method for the uncertainty estimation of diffusion parameters for quantitative body DW-MRI assessment. Estimation of diffusion parameter uncertainty from DW-MRI is necessary for clinical applications that use these parameters to assess pathology. However, uncertainty estimation using traditional techniques requires repeated acquisitions, which is undesirable in routine clinical use. Model-based bootstrap techniques, for example, assume an underlying linear model for residuals rescaling and cannot be utilized directly for body diffusion parameter uncertainty estimation due to the non-linearity of the body diffusion model. To offset this limitation, our method uses the unscented transform to compute the residuals rescaling parameters from the non-linear body diffusion model, and then applies the wild-bootstrap method to infer the uncertainty of the body diffusion parameters. Validation through phantom and human subject experiments shows that our method correctly identifies the regions with higher uncertainty in body DW-MRI model parameters, with a relative error of -36% in the uncertainty values.
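
    A schematic of the wild-bootstrap step for a nonlinear diffusion signal model; a mono-exponential ADC model stands in for the body diffusion model here, and the unscented-transform rescaling of residuals described in the paper is omitted:

        import numpy as np
        from scipy.optimize import curve_fit

        def signal(b, s0, adc):
            # mono-exponential diffusion model: S(b) = S0 * exp(-b * ADC)
            return s0 * np.exp(-b * adc)

        rng = np.random.default_rng(3)
        b = np.array([0.0, 200.0, 500.0, 800.0])      # b-values (s/mm^2)
        y = signal(b, 1.0, 1.2e-3) + rng.normal(0, 0.02, b.size)

        p_hat, _ = curve_fit(signal, b, y, p0=[1.0, 1.0e-3])
        resid = y - signal(b, *p_hat)

        adcs = []
        for _ in range(1000):
            v = rng.choice([-1.0, 1.0], size=b.size)  # Rademacher weights
            y_star = signal(b, *p_hat) + resid * v    # wild-bootstrap resample
            p_star, _ = curve_fit(signal, b, y_star, p0=p_hat)
            adcs.append(p_star[1])

        print("ADC uncertainty (std): %.2e mm^2/s" % np.std(adcs))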

  16. Bayesian GGE biplot models applied to maize multi-environments trials.

    PubMed

    de Oliveira, L A; da Silva, C P; Nuvunga, J J; da Silva, A Q; Balestre, M

    2016-06-17

    The additive main effects and multiplicative interaction (AMMI) and the genotype main effects and genotype x environment interaction (GGE) models stand out among the linear-bilinear models used in genotype x environment interaction studies. Despite the advantages of using them to describe genotype x environment (AMMI) or genotype and genotype x environment (GGE) interactions, these methods have known limitations that are inherent to fixed effects models, including difficulty in treating variance heterogeneity and missing data. Traditional biplots include no measure of uncertainty regarding the principal components. The present study aimed to apply the Bayesian approach to GGE biplot models and assess the implications for selecting stable and adapted genotypes. Our results demonstrated that the Bayesian approach applied to GGE models with non-informative priors was consistent with the traditional GGE biplot analysis, although the credible region incorporated into the biplot enabled distinguishing, based on probability, the performance of genotypes and their relationships with the environments in the biplot. Those regions also enabled the identification of groups of genotypes and environments with similar effects in terms of adaptability and stability. The relative position of genotypes and environments in biplots is highly affected by the experimental accuracy. Thus, incorporation of uncertainty in biplots is a key tool for breeders to make decisions regarding stability selection and adaptability and the definition of mega-environments.

  17. Assessment and visualization of uncertainty for countrywide soil organic matter map of Hungary using local entropy

    NASA Astrophysics Data System (ADS)

    Szatmári, Gábor; Pásztor, László

    2016-04-01

    Uncertainty is a general term expressing our imperfect, but acknowledged, knowledge in describing an environmental process (Bárdossy and Fodor, 2004). Sampling, laboratory measurements, models and so on are all subject to uncertainty. Effective quantification and visualization of uncertainty is indispensable to stakeholders (e.g. policy makers, society). Soil-related features and their spatial models should be specifically targeted for uncertainty assessment, because inferences from them are further used in modelling and decision-making processes. The aim of our present study was to assess and effectively visualize the local uncertainty of the countrywide soil organic matter (SOM) spatial distribution model of Hungary using geostatistical tools and concepts. The Hungarian Soil Information and Monitoring System's SOM data (approximately 1,200 observations) and environment-related, spatially exhaustive secondary information (i.e. digital elevation model, climatic maps, MODIS satellite images and geological map) were used to model the countrywide SOM spatial distribution by regression kriging. It would be common to use the calculated estimation (or kriging) variance as a measure of uncertainty; however, the normality and homoscedasticity hypotheses had to be rejected according to our preliminary analysis of the data. Therefore, a normal score transformation and a sequential stochastic simulation approach were introduced to model and assess the local uncertainty. Five hundred equally probable realizations (i.e. stochastic images) were generated. This number of stochastic images is large enough to provide a model of uncertainty at each location, which is a complete description of uncertainty in geostatistics (Deutsch and Journel, 1998). Furthermore, these models can be applied, for example, to contour the probability of given events; such products can be regarded as goal-oriented digital soil maps and are of interest for agricultural management and decision making as well. A standardized measure of the local entropy was used to visualize uncertainty, where entropy values close to 1 correspond to high uncertainty, whilst values close to 0 correspond to low uncertainty. The advantage of using the local entropy in this context is that it combines probabilities from multiple ensemble members into a single number for each location of the model. In conclusion, it is straightforward to use a sequential stochastic simulation approach for the assessment of uncertainty when normality and homoscedasticity are violated. The visualization of uncertainty using the local entropy is effective and communicative to stakeholders because it represents the uncertainty through a single number on a [0, 1] scale. References: Bárdossy, Gy. & Fodor, J., 2004. Evaluation of Uncertainties and Risks in Geology. Springer-Verlag, Berlin Heidelberg. Deutsch, C.V. & Journel, A.G., 1998. GSLIB: Geostatistical Software Library and User's Guide. Oxford University Press, New York. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).
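
    A sketch of the normalized local-entropy measure described above, computed per grid cell from a stack of simulated realizations (the class binning and array shapes are illustrative assumptions):

        import numpy as np

        def local_entropy(realizations, n_classes=10):
            """Normalized local entropy in [0, 1] from stacked simulations;
            realizations has shape (n_sim, ny, nx)."""
            n_sim = realizations.shape[0]
            # common class edges from the global distribution of simulated values
            edges = np.quantile(realizations, np.linspace(0.0, 1.0, n_classes + 1))
            classes = np.digitize(realizations, edges[1:-1])  # 0 .. n_classes-1
            h = np.zeros(realizations.shape[1:])
            for k in range(n_classes):
                p = (classes == k).sum(axis=0) / n_sim        # local class probability
                with np.errstate(divide="ignore", invalid="ignore"):
                    h += np.where(p > 0, -p * np.log(p), 0.0)
            return h / np.log(n_classes)                      # 1 = high uncertainty

        sims = np.random.default_rng(4).normal(size=(500, 50, 50))
        print(local_entropy(sims).mean())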

  18. Ignoring correlation in uncertainty and sensitivity analysis in life cycle assessment: what is the risk?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groen, E.A., E-mail: Evelyne.Groen@gmail.com; Heijungs, R.; Leiden University, Einsteinweg 2, Leiden 2333 CC

    Life cycle assessment (LCA) is an established tool to quantify the environmental impact of a product. A good assessment of uncertainty is important for making well-informed decisions in comparative LCA, as well as for correctly prioritising data collection efforts. Under- or overestimation of output uncertainty (e.g. output variance) will lead to incorrect decisions in such matters. The presence of correlations between input parameters during uncertainty propagation can increase or decrease the output variance. However, most LCA studies that include uncertainty analysis ignore correlations between input parameters during uncertainty propagation, which may lead to incorrect conclusions. Two approaches to include correlations between input parameters during uncertainty propagation and global sensitivity analysis were studied: an analytical approach and a sampling approach. The use of both approaches is illustrated for an artificial case study of electricity production. Results demonstrate that both approaches yield approximately the same output variance and sensitivity indices for this specific case study. Furthermore, we demonstrate that the analytical approach can be used to quantify the risk of ignoring correlations between input parameters during uncertainty propagation in LCA. We demonstrate that: (1) we can predict whether including correlations among input parameters in uncertainty propagation will increase or decrease output variance; (2) we can quantify the risk of ignoring correlations on the output variance and the global sensitivity indices. Moreover, this procedure requires only little data. - Highlights: • Ignoring correlation leads to under- or overestimation of the output variance. • We demonstrated that the risk of ignoring correlation can be quantified. • The procedure proposed is generally applicable in life cycle assessment. • In some cases, ignoring correlation has a minimal effect on decision-making tools.
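
    The two propagation approaches compared above can be mimicked on a toy inventory model: first-order analytical propagation, Var(y) ≈ gᵀΣg with gradient g and full input covariance Σ, versus Monte Carlo sampling of correlated inputs. The model and numbers are invented; the last line shows what "ignoring correlation" (dropping the off-diagonal terms of Σ) does to the variance.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy LCA-style output (hypothetical): impact = x0 * x1 + x2.
    def impact(x):
        return x[..., 0] * x[..., 1] + x[..., 2]

    mu = np.array([2.0, 3.0, 1.0])
    sd = np.array([0.2, 0.3, 0.1])
    corr = np.eye(3)
    corr[0, 1] = corr[1, 0] = 0.7          # correlated inputs x0, x1
    cov = np.outer(sd, sd) * corr

    # Analytical (first-order Taylor): Var(y) ~ g^T Sigma g at the mean.
    g = np.array([mu[1], mu[0], 1.0])       # gradient of impact at mu
    var_analytical = g @ cov @ g

    # Sampling: draw correlated inputs and propagate through the model.
    samples = rng.multivariate_normal(mu, cov, size=200_000)
    var_sampling = impact(samples).var()

    # Ignoring correlation = keeping only the diagonal of the covariance.
    var_uncorrelated = g @ np.diag(sd**2) @ g

    print(f"variance, analytical with correlation: {var_analytical:.3f}")
    print(f"variance, sampling with correlation:   {var_sampling:.3f}")
    print(f"variance, correlation ignored:         {var_uncorrelated:.3f}")
    ```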

  19. Quantifying the intra-annual uncertainties in climate change assessment over 10 sub-basins across the Pacific Northwest US

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, Ali; Moradkhani, Hamid; Rana, Arun

    2017-04-01

    Uncertainty is an inevitable feature of climate change impact assessments. Understanding and quantifying different sources of uncertainty is of high importance and can help modeling agencies improve the current models and scenarios. In this study, we have assessed the future changes in three climate variables (i.e. precipitation, maximum temperature, and minimum temperature) over 10 sub-basins across the Pacific Northwest US. To conduct the study, 10 statistically downscaled CMIP5 GCMs from two downscaling methods (i.e. BCSD and MACA) were utilized at 1/16 degree spatial resolution for the historical period of 1970-2000 and the future period of 2010-2099. For the future projections, the two scenarios RCP4.5 and RCP8.5 were used. Furthermore, Bayesian Model Averaging (BMA) was employed to develop a probabilistic future projection for each climate variable. Results indicate the superiority of BMA simulations compared to individual models. Increasing temperature and precipitation are projected at the annual timescale; however, the changes are not uniform among seasons. Model uncertainty proves to be the major source of uncertainty, while downscaling uncertainty contributes significantly to the total uncertainty, especially in summer.
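
    A hypothetical sketch of the BMA combination step: weight each model by its likelihood against historical observations, then form a weighted projection. This is a crude stand-in for the EM-fitted BMA weights such studies typically use; all data are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Invented: 30 years of observations and hindcasts from 10 GCMs,
    # each with its own bias.
    obs = rng.normal(10.0, 1.0, size=30)
    hindcasts = obs + rng.normal(0.0, 1.0, size=(10, 30)) \
                    + rng.normal(0.0, 0.8, size=(10, 1))

    # Simple BMA weights: Gaussian likelihood of each model given obs,
    # with a common error scale (a stand-in for EM-estimated weights).
    sigma = 1.0
    loglik = -0.5 * ((hindcasts - obs) ** 2).sum(axis=1) / sigma**2
    w = np.exp(loglik - loglik.max())
    w /= w.sum()

    # Invented future projections, one per model; combine with the weights.
    future = rng.normal(12.0, 1.2, size=10)
    bma_mean = w @ future
    between = np.sqrt(w @ (future - bma_mean) ** 2)   # between-model spread

    print("weights:", np.round(w, 3))
    print(f"BMA projection: {bma_mean:.2f} +/- {between:.2f} (between-model)")
    ```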

  20. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    NASA Astrophysics Data System (ADS)

    O'Neill, B. C.

    2015-12-01

    Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g. economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However, many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies differences in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial-condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed. Options under consideration for improving the approach include the use of perturbed-physics ensembles of CESM, employing results from multiple climate models, and combining the results from single impact models with statistical representations of uncertainty across multiple models. A key consideration is the relationship between the question being addressed and the uncertainty approach.

  1. Contribution of crop model structure, parameters and climate projections to uncertainty in climate change impact assessments.

    PubMed

    Tao, Fulu; Rötter, Reimund P; Palosuo, Taru; Gregorio Hernández Díaz-Ambrona, Carlos; Mínguez, M Inés; Semenov, Mikhail A; Kersebaum, Kurt Christian; Nendel, Claas; Specka, Xenia; Hoffmann, Holger; Ewert, Frank; Dambreville, Anaelle; Martre, Pierre; Rodríguez, Lucía; Ruiz-Ramos, Margarita; Gaiser, Thomas; Höhn, Jukka G; Salo, Tapio; Ferrise, Roberto; Bindi, Marco; Cammarano, Davide; Schulman, Alan H

    2018-03-01

    Climate change impact assessments are plagued with uncertainties from many sources, such as climate projections or the inadequacies in structure and parameters of the impact model. Previous studies tried to account for the uncertainty from only one or two of these sources. Here, we developed a triple-ensemble probabilistic assessment using seven crop models, multiple sets of model parameters and eight contrasting climate projections together to comprehensively account for uncertainties from these three important sources. We demonstrated the approach in assessing climate change impact on barley growth and yield at Jokioinen, Finland, in the Boreal climatic zone and Lleida, Spain, in the Mediterranean climatic zone, for the 2050s. We further quantified and compared the contributions of crop model structure, crop model parameters and climate projections to the total variance of the ensemble output using Analysis of Variance (ANOVA). Based on the triple-ensemble probabilistic assessment, the median simulated yield change was -4% and +16%, and the probability of decreasing yield was 63% and 31% in the 2050s at Jokioinen and Lleida, respectively, relative to 1981-2010. The contribution of crop model structure to the total variance of ensemble output was larger than that from downscaled climate projections and model parameters. The relative contributions of crop model parameters and downscaled climate projections to the total variance of ensemble output varied greatly among the seven crop models and between the two sites. The contribution of downscaled climate projections was on average larger than that of crop model parameters. This information on the uncertainty from different sources can be quite useful for model users to decide where to put the most effort when preparing or choosing models or parameters for impact analyses. We concluded that the triple-ensemble probabilistic approach that accounts for the uncertainties from multiple important sources provides more comprehensive information for quantifying uncertainties in climate change impact assessments compared to conventional approaches that are deterministic or account for the uncertainties from only one or two sources. © 2017 John Wiley & Sons Ltd.
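
    The ANOVA partition used above is easy to sketch: index simulated yields by crop model, parameter set and climate projection, then express each factor's main-effect sum of squares as a share of the total. The factorial layout and effect sizes below are invented; interactions are left in the residual.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Invented ensemble: 7 crop models x 5 parameter sets x 8 projections.
    M, P, C = 7, 5, 8
    y = (rng.normal(0, 1.0, (M, 1, 1))     # model-structure effect
         + rng.normal(0, 0.5, (1, P, 1))   # parameter effect
         + rng.normal(0, 0.7, (1, 1, C))   # climate-projection effect
         + rng.normal(0, 0.3, (M, P, C)))  # interactions / residual

    grand = y.mean()
    ss_total = ((y - grand) ** 2).sum()

    def main_effect_share(axis_keep):
        """Main-effect sum of squares for one factor, as a fraction."""
        other_axes = tuple(a for a in range(3) if a != axis_keep)
        level_means = y.mean(axis=other_axes)
        n_per_level = y.size // y.shape[axis_keep]
        return n_per_level * ((level_means - grand) ** 2).sum() / ss_total

    for axis, name in enumerate(["crop model structure",
                                 "model parameters",
                                 "climate projections"]):
        print(f"{name:22s}: {100 * main_effect_share(axis):5.1f}% of total variance")
    ```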

  2. V&V framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hills, Richard G.; Maniaci, David Charles; Naughton, Jonathan W.

    2015-09-01

    A Verification and Validation (V&V) framework is presented for the development and execution of coordinated modeling and experimental programs to assess the predictive capability of computational models of complex systems through focused, well-structured, and formal processes. The elements of the framework are based on established V&V methodology developed by various organizations including the Department of Energy, the National Aeronautics and Space Administration, the American Institute of Aeronautics and Astronautics, and the American Society of Mechanical Engineers. Four main topics are addressed: 1) program planning based on expert elicitation of the modeling physics requirements, 2) experimental design for model assessment, 3) uncertainty quantification for experimental observations and computational model simulations, and 4) assessment of the model predictive capability. The audience for this document includes program planners, modelers, experimentalists, V&V specialists, and customers of the modeling results.

  3. Comprehensive vulnerability assessment method for nodes considering anti-interference ability and influence

    NASA Astrophysics Data System (ADS)

    LUO, Jianchun; WANG, Yunyu; YANG, Jun; RAN, Hong; PENG, Xiaodong; HUANG, Ming; FENG, Hao; LIU, Meijun

    2018-03-01

    The vulnerability assessment of power grids is of great significance in current research. Power systems face many kinds of uncertainty factors, and the disturbances they cause have become one of the main factors restricting the safe operation of the power grid. To address this problem, considering the anti-interference ability of the system when it is disturbed and the effect on the system when a node is out of operation, a set of indices reflecting the anti-interference ability and the influence of nodes is established. On this basis, a new comprehensive vulnerability assessment method for nodes is put forward, using super-efficiency data envelopment analysis for scientific integration of the indices. Finally, simulation results for the IEEE 30-bus system indicate that the proposed model is rational and valid.

  4. Accounting for methodological, structural, and parameter uncertainty in decision-analytic models: a practical guide.

    PubMed

    Bilcke, Joke; Beutels, Philippe; Brisson, Marc; Jit, Mark

    2011-01-01

    Accounting for uncertainty is now a standard part of decision-analytic modeling and is recommended by many health technology agencies and published guidelines. However, the scope of such analyses is often limited, even though techniques have been developed for presenting the effects of methodological, structural, and parameter uncertainty on model results. To help bring these techniques into mainstream use, the authors present a step-by-step guide that offers an integrated approach to account for different kinds of uncertainty in the same model, along with a checklist for assessing the way in which uncertainty has been incorporated. The guide also addresses special situations such as when a source of uncertainty is difficult to parameterize, resources are limited for an ideal exploration of uncertainty, or evidence to inform the model is not available or not reliable. Methods for identifying the sources of uncertainty that influence results most are also described. Besides guiding analysts, the guide and checklist may be useful to decision makers who need to assess how well uncertainty has been accounted for in a decision-analytic model before using the results to make a decision.
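
    A minimal sketch of the kind of integrated analysis the guide advocates, under invented numbers: parameter uncertainty is sampled from distributions (a probabilistic sensitivity analysis), while structural uncertainty is handled by rerunning the same analysis under two alternative model structures.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    N = 10_000

    # Parameter uncertainty (invented distributions).
    effectiveness = rng.beta(80, 20, N)       # intervention effectiveness
    cost_per_dose = rng.gamma(25, 2.0, N)     # unit cost

    def net_benefit(waning):
        """Invented decision model: monetised benefit of averted cases
        minus programme cost, under a structural waning assumption."""
        cases_averted = 1000 * effectiveness * (1 - waning)
        return cases_averted * 1000 - cost_per_dose * 10_000

    # Structural uncertainty: two alternative assumptions about waning.
    for label, waning in [("no waning ", 0.0), ("20% waning", 0.2)]:
        nb = net_benefit(waning)
        lo, hi = np.percentile(nb, [2.5, 97.5])
        print(f"{label}: mean {nb.mean():>9,.0f}, 95% CI [{lo:,.0f}, {hi:,.0f}], "
              f"P(NB > 0) = {(nb > 0).mean():.2f}")
    ```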

  5. Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.

    PubMed

    Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James

    2009-04-01

    The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
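
    The sampling-based end of that spectrum can be sketched with the standard ingestion-risk equation, Risk = C · IR · EF · ED · SF / (BW · AT), propagated by Monte Carlo simulation. All distributions and the slope factor below are invented placeholders, not values for any actual DBP.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    N = 100_000

    # Invented input distributions for a lifetime excess cancer risk estimate.
    C  = rng.lognormal(np.log(0.05), 0.5, N)   # DBP concentration, mg/L
    IR = rng.normal(2.0, 0.3, N).clip(0.5)     # water ingestion rate, L/day
    EF, ED = 350, 30                           # exposure frequency/duration
    BW = rng.normal(70, 10, N).clip(40)        # body weight, kg
    AT = 70 * 365                              # averaging time, days
    SF = 0.01                                  # slope factor, (mg/kg-day)^-1

    risk = C * IR * EF * ED * SF / (BW * AT)

    p50, p95 = np.percentile(risk, [50, 95])
    print(f"median risk {p50:.1e}, 95th percentile {p95:.1e}, "
          f"P(risk > 1e-6) = {(risk > 1e-6).mean():.2f}")
    ```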

  6. Neural-Network-Based Robust Optimal Tracking Control for MIMO Discrete-Time Systems With Unknown Uncertainty Using Adaptive Critic Design.

    PubMed

    Liu, Lei; Wang, Zhanshan; Zhang, Huaguang

    2018-04-01

    This paper is concerned with the robust optimal tracking control strategy for a class of nonlinear multi-input multi-output discrete-time systems with unknown uncertainty via an adaptive critic design (ACD) scheme. The main purpose is to establish an adaptive actor-critic control method such that the cost incurred in dealing with the uncertainty is minimized and the closed-loop system is stable. Based on the neural network approximator, an action network is applied to generate the optimal control signal and a critic network is used to approximate the cost function, respectively. In contrast to previous methods, the main features of this paper are: 1) the ACD scheme is integrated into the controllers to cope with the uncertainty and 2) a novel cost function, which is not in quadratic form, is proposed so that the total cost in the design procedure is reduced. It is proved that the optimal control signals and the tracking errors are uniformly ultimately bounded even when the uncertainty exists. Finally, a numerical simulation is developed to show the effectiveness of the present approach.

  7. Using high-resolution soil moisture modelling to assess the uncertainty of microwave remotely sensed soil moisture products at the correct spatial and temporal support

    NASA Astrophysics Data System (ADS)

    Wanders, N.; Karssenberg, D.; Bierkens, M. F. P.; Van Dam, J. C.; De Jong, S. M.

    2012-04-01

    Soil moisture is a key variable in the hydrological cycle and important in hydrological modelling. When assimilating soil moisture into flood forecasting models, the improvement of forecasting skills depends on the ability to accurately estimate the spatial and temporal patterns of soil moisture content throughout the river basin. Space-borne remote sensing may provide this information with a high temporal and spatial resolution and with a global coverage. Currently three microwave soil moisture products are available: AMSR-E, ASCAT and SMOS. The quality of these satellite-based products is often assessed by comparing them with in-situ observations of soil moisture. This comparison is however hampered by the difference in spatial and temporal support (i.e., resolution, scale), because the spatial resolution of microwave satellites is rather low compared to in-situ field measurements. Thus, the aim of this study is to derive a method to assess the uncertainty of microwave satellite soil moisture products at the correct spatial support. To overcome the difference in support size between in-situ soil moisture observations and remotely sensed soil moisture, we used a stochastic, distributed unsaturated zone model (SWAP, van Dam (2000)) that was upscaled to the support of the different satellite products. A detailed assessment of the SWAP model uncertainty is included to ensure that the uncertainty in satellite soil moisture is not overestimated due to an underestimation of the model uncertainty. We simulated unsaturated water flow up to a depth of 1.5 m with a vertical resolution of 1 to 10 cm and on a horizontal grid of 1 km2 for the period Jan 2010 - Jun 2011. The SWAP model was first calibrated and validated on in-situ data of the REMEDHUS soil moisture network (Spain). Next, to evaluate the satellite products, the model was run for areas in the proximity of 79 meteorological stations in Spain, where model results were aggregated to the correct support of the satellite product by averaging model results from the 1 km2 grid within the remote sensing footprint. Overall 440 (AMSR-E, SMOS) to 680 (ASCAT) time series were compared to the aggregated SWAP model results, providing valuable information on the uncertainty of satellite soil moisture at the proper support. Our results show that temporal dynamics are best captured by ASCAT, resulting in an average correlation of 0.72 with the model, while AMSR-E (0.41) and SMOS (0.42) are less capable of representing these dynamics. Standard deviations found for ASCAT and SMOS are low, 0.049 and 0.051 m3 m-3 respectively, while AMSR-E has a higher value of 0.062 m3 m-3. All standard deviations are higher than the average model uncertainty of 0.017 m3 m-3. All satellite products show a negative bias compared to the model results, with the largest value for SMOS. Satellite uncertainty is not found to be significantly related to topography, but is found to increase in densely vegetated areas. In general AMSR-E has most difficulties capturing soil moisture dynamics in Spain, while SMOS and mainly ASCAT have a fair to good performance. However, all products contain valuable information about the near-surface soil moisture over Spain. Van Dam, J.C., 2000, Field scale water flow and solute transport. SWAP model concepts, parameter estimation and case studies. Ph.D. thesis, Wageningen University

  8. Quantifying model-structure- and parameter-driven uncertainties in spring wheat phenology prediction with Bayesian analysis

    DOE PAGES

    Alderman, Phillip D.; Stanfill, Bryan

    2016-10-06

    Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. This study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.
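
    The random walk Metropolis sampler named above is compact enough to sketch. The one-parameter "phenology model" and the observations below are invented stand-ins for the nine real models; the structure of the update is the point.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Invented observations: days to heading at five trial sites.
    obs = np.array([58.0, 61.0, 55.0, 60.0, 63.0])
    mean_temp = np.array([10.0, 9.5, 10.5, 9.8, 9.3])   # site mean temps

    def predict(theta):
        """Toy model: days to heading = thermal-time requirement theta
        divided by mean daily temperature (invented stand-in)."""
        return theta / mean_temp

    def log_post(theta, sigma=2.0):
        if not 400 < theta < 800:            # flat prior on a plausible range
            return -np.inf
        resid = obs - predict(theta)
        return -0.5 * np.sum(resid**2) / sigma**2

    n_iter, step = 20_000, 10.0
    chain = np.empty(n_iter)
    theta, lp = 600.0, log_post(600.0)
    accepted = 0
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal()   # random walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:       # Metropolis acceptance
            theta, lp = prop, lp_prop
            accepted += 1
        chain[i] = theta

    post = chain[5000:]                               # discard burn-in
    print(f"acceptance {accepted / n_iter:.2f}, posterior mean {post.mean():.0f}, "
          f"95% CI [{np.percentile(post, 2.5):.0f}, {np.percentile(post, 97.5):.0f}]")
    ```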

  9. Assessing climate adaptation options and uncertainties for cereal systems in West Africa

    NASA Astrophysics Data System (ADS)

    Guan, K.; Sultan, B.; Biasutti, M.; Lobell, D. B.

    2015-12-01

    The already fragile agricultural production system in West Africa faces further challenges in meeting food security in the coming decades, primarily due to a fast-increasing population and the risks of climate change. Successful adaptations of agriculture should not only be beneficial in the current climate but should also reduce negative (or enhance positive) impacts of climate change. Assessment of various possible adaptation options and their uncertainties provides key information for prioritizing adaptation investments. Here, based on several robust aspects of climate projections in this region (i.e. temperature increases and rainfall pattern shifts), we use two well-validated crop models (i.e. APSIM and SARRA-H) and an ensemble of downscaled climate forcings to assess five possible and realistic adaptation options (late sowing, intensification, thermal time increase, water harvesting and increased resilience to heat stress) in West Africa for the staple crop production of sorghum. We adopt a new assessment framework to account for both the impacts of adaptation options in the current climate and their ability to reduce impacts of future climate change, and we also consider changes in both mean yield and its variability. Our results reveal that most proposed "adaptation options" are no more beneficial in the future than in the current climate, i.e. they do not really reduce climate change impacts. Increased temperature resilience during the grain number formation period is the main adaptation that emerges. We also find that changing from traditional to modern cultivars, and later sowing in the West Sahel, appear to be robust adaptations.

  10. The Intolerance of Uncertainty Inventory: Validity and Comparison of Scoring Methods to Assess Individuals Screening Positive for Anxiety and Depression.

    PubMed

    Lauriola, Marco; Mosca, Oriana; Trentini, Cristina; Foschi, Renato; Tambelli, Renata; Carleton, R Nicholas

    2018-01-01

    Intolerance of Uncertainty is a fundamental transdiagnostic personality construct hierarchically organized with a core general factor underlying diverse clinical manifestations. The current study evaluated the construct validity of the Intolerance of Uncertainty Inventory, a two-part scale separately assessing a unitary Intolerance of Uncertainty disposition to consider uncertainties to be unacceptable and threatening (Part A) and the consequences of such a disposition, regarding experiential avoidance, chronic doubt, overestimation of threat, worrying, control of uncertain situations, and seeking reassurance (Part B). Community members (N = 1046; Mean age = 36.69 ± 12.31 years; 61% females) completed the Intolerance of Uncertainty Inventory with the Beck Depression Inventory-II and the State-Trait Anxiety Inventory. Part A demonstrated a robust unidimensional structure and excellent convergent validity with Part B. A bifactor model was the best-fitting model for Part B. Based on these results, we compared the hierarchical factor scores with summated rating scores in clinical proxy groups reporting anxiety and depression symptoms. Summated rating scores were associated with both depression and anxiety and increased proportionally with the co-occurrence of depressive and anxious symptoms. By contrast, hierarchical scores were useful for detecting which facets best separated the depression and anxiety groups. In sum, Part A was a reliable and valid transdiagnostic measure of Intolerance of Uncertainty. Part B was arguably more useful for assessing clinical manifestations of Intolerance of Uncertainty for specific disorders, provided that hierarchical scores are used. Overall, our study suggests that clinical assessments might need to shift toward hierarchical factor scores.

  12. Advanced probabilistic methods for quantifying the effects of various uncertainties in structural response

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.

    1988-01-01

    The effects of actual variations, also called uncertainties, in geometry and material properties on the structural response of a space shuttle main engine turbopump blade are evaluated. A normal distribution was assumed to represent the uncertainties statistically. Uncertainties were assumed to be totally random, partially correlated, or fully correlated. The magnitudes of these uncertainties were represented in terms of mean and variance. Blade responses, recorded in terms of displacements, natural frequencies, and maximum stress, were evaluated and plotted in the form of probabilistic distributions under combined uncertainties. These distributions provide an estimate of the range of magnitudes of the response and the probability of occurrence of a given response. Most importantly, these distributions provide the information needed to estimate quantitatively the risk in a structural design.

  13. Improved Event Location Uncertainty Estimates

    DTIC Science & Technology

    2006-09-21

    For validation purposes, we use GT0-2 event clusters. These include the Nevada, Lop Nor, Semipalatinsk, and Novaya Zemlya test sites, as well as the Azgir...uncertainties. Furthermore, the tails of real seismic data distributions are heavier than Gaussian. The main objectives of this project are to develop, test

  14. Health risk assessment for nanoparticles: A case for using expert judgment

    NASA Astrophysics Data System (ADS)

    Kandlikar, Milind; Ramachandran, Gurumurthy; Maynard, Andrew; Murdock, Barbara; Toscano, William A.

    2007-01-01

    Uncertainties in conventional quantitative risk assessment typically relate to values of parameters in risk models. For many environmental contaminants, there is a lack of sufficient information about multiple components of the risk assessment framework. In such cases, the use of default assumptions and extrapolations to fill in the data gaps is a common practice. Nanoparticle risks, however, pose a new form of risk assessment challenge. Besides a lack of data, there is deep scientific uncertainty regarding every aspect of the risk assessment framework: (a) particle characteristics that may affect toxicity; (b) their fate and transport through the environment; (c) the routes of exposure and the metrics by which exposure ought to be measured; (d) the mechanisms of translocation to different parts of the body; and (e) the mechanisms of toxicity and disease. In each of these areas, there are multiple and competing models and hypotheses. These are not merely parametric uncertainties but uncertainties about the choice of the causal mechanisms themselves and the proper model variables to be used, i.e., structural uncertainties. While these uncertainties exist for PM2.5 as well, risk assessment for PM2.5 has avoided dealing with these issues because of a plethora of epidemiological studies. However, such studies don't exist for the case of nanoparticles. Even if such studies are done in the future, they will be very specific to a particular type of engineered nanoparticle and not generalizable to other nanoparticles. Therefore, risk assessment for nanoparticles will have to deal with the various uncertainties that were avoided in the case of PM2.5. Consequently, uncertainties in estimating risks due to nanoparticle exposures may be characterized as 'extreme'. This paper proposes a methodology by which risk analysts can cope with such extreme uncertainty. One way to make these problems analytically tractable is to use expert judgment approaches to study the degree of consensus and/or disagreement between experts on different parts of the exposure-response paradigm. This can be done by eliciting judgments from a wide range of experts on different parts of the risk causal chain. We also use examples to illustrate how studying expert consensus/disagreement helps in research prioritization and budget allocation exercises. The expert elicitation can be repeated over the course of several years, over which time the state of scientific knowledge will also improve and uncertainties may possibly be reduced. Results from the expert elicitation exercise can be used by risk managers or managers of funding agencies as a tool for research prioritization.

  15. Impact of input data uncertainty on environmental exposure assessment models: A case study for electromagnetic field modelling from mobile phone base stations.

    PubMed

    Beekhuizen, Johan; Heuvelink, Gerard B M; Huss, Anke; Bürgi, Alfred; Kromhout, Hans; Vermeulen, Roel

    2014-11-01

    With the increased availability of spatial data and computing power, spatial prediction approaches have become a standard tool for exposure assessment in environmental epidemiology. However, such models are largely dependent on accurate input data. Uncertainties in the input data can therefore have a large effect on model predictions, but are rarely quantified. With Monte Carlo simulation we assessed the effect of input uncertainty on the prediction of radio-frequency electromagnetic fields (RF-EMF) from mobile phone base stations at 252 receptor sites in Amsterdam, The Netherlands. The impact on ranking and classification was determined by computing the Spearman correlations and weighted Cohen's Kappas (based on tertiles of the RF-EMF exposure distribution) between modelled values and RF-EMF measurements performed at the receptor sites. The uncertainty in modelled RF-EMF levels was large with a median coefficient of variation of 1.5. Uncertainty in receptor site height, building damping and building height contributed most to model output uncertainty. For exposure ranking and classification, the heights of buildings and receptor sites were the most important sources of uncertainty, followed by building damping, antenna- and site location. Uncertainty in antenna power, tilt, height and direction had a smaller impact on model performance. We quantified the effect of input data uncertainty on the prediction accuracy of an RF-EMF environmental exposure model, thereby identifying the most important sources of uncertainty and estimating the total uncertainty stemming from potential errors in the input data. This approach can be used to optimize the model and better interpret model output. Copyright © 2014 Elsevier Inc. All rights reserved.
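
    The study's Monte Carlo design can be caricatured as follows: perturb each input with its assumed error distribution, rerun the propagation model, and score every run against fixed measurements with a Spearman correlation. The free-space path-loss formula below is a crude invented stand-in for the authors' geospatial model.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(7)

    # Invented receptor sites: distance to antenna (m), building damping (dB).
    n_sites = 252
    dist = rng.uniform(50, 500, n_sites)
    damping = rng.uniform(0, 20, n_sites)

    def field_dbm(distance, damp, power_dbm=43.0):
        """Crude free-space path-loss stand-in for the real model."""
        return power_dbm - 20 * np.log10(distance) - damp

    measured = field_dbm(dist, damping) + rng.normal(0, 2, n_sites)

    # Monte Carlo over input uncertainty in position and damping.
    rhos = []
    for _ in range(500):
        dist_mc = dist * rng.normal(1.0, 0.10, n_sites)   # 10% position error
        damp_mc = damping + rng.normal(0, 5, n_sites)     # damping uncertainty
        rho, _ = spearmanr(field_dbm(dist_mc, damp_mc), measured)
        rhos.append(rho)

    print(f"Spearman rho across runs: median {np.median(rhos):.2f}, "
          f"5th-95th percentile [{np.percentile(rhos, 5):.2f}, "
          f"{np.percentile(rhos, 95):.2f}]")
    ```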

  16. Sensitivity test and ensemble hazard assessment for tephra fallout at Campi Flegrei, Italy

    NASA Astrophysics Data System (ADS)

    Selva, J.; Costa, A.; De Natale, G.; Di Vito, M. A.; Isaia, R.; Macedonio, G.

    2018-02-01

    We present the results of a statistical study on tephra dispersal in the case of a reactivation of the Campi Flegrei volcano. To represent the spectrum of possible eruptive sizes, four classes of eruptions were considered. Excluding the lava emission, three classes are explosive (Small, Medium, and Large) and can produce a significant quantity of volcanic ash. Hazard assessments were made through simulations of atmospheric dispersion of ash and lapilli, considering the full variability of winds and eruptive vents. The results are presented in the form of conditional hazard curves given the occurrence of specific eruptive sizes, representative members of each size class, and then combined to quantify the conditional hazard given an eruption of any size. The main focus of this analysis was to constrain the epistemic uncertainty (i.e. that associated with the level of scientific knowledge of the phenomena), in order to provide unbiased hazard estimations. The epistemic uncertainty in the estimation of hazard curves was quantified, making use of scientifically acceptable alternatives to be aggregated in the final results. The choice of such alternative models was made after a comprehensive sensitivity analysis which considered different weather databases, alternative modelling of submarine eruptive vents and tephra total grain-size distributions (TGSD) with different relative mass fractions of fine ash, and the effect of ash aggregation. The results showed that the dominant uncertainty is related to the combined effect of the uncertainty regarding the fraction of fine particles with respect to the total mass and of how ash aggregation is modelled. The latter is particularly relevant in the case of magma-water interactions during explosive eruptive phases, when a large fraction of fine ash can form accretionary lapilli that may contribute significantly to increasing the tephra load in proximal areas. The variability induced by the use of different meteorological databases and by the selected approach to modelling offshore eruptions was relatively insignificant. The uncertainty arising from the alternative implementations, which would have been neglected in standard (Bayesian) quantifications, was finally quantified by ensemble modelling and represented by hazard and probability maps produced at different confidence levels.

  17. Uncertainty indication in soil function maps - transparent and easy-to-use information to support sustainable use of soil resources

    NASA Astrophysics Data System (ADS)

    Greiner, Lucie; Nussbaum, Madlene; Papritz, Andreas; Zimmermann, Stephan; Gubler, Andreas; Grêt-Regamey, Adrienne; Keller, Armin

    2018-05-01

    Spatial information on soil function fulfillment (SFF) is increasingly being used to inform decision-making in spatial planning programs to support sustainable use of soil resources. Soil function maps visualize soils' abilities to fulfill their functions, e.g., regulating water and nutrient flows, providing habitats, and supporting biomass production, based on soil properties. Such information must be reliable for informed and transparent decision-making in spatial planning programs. In this study, we add to the transparency of soil function maps by (1) indicating uncertainties arising from the prediction of soil properties generated by digital soil mapping (DSM) that are used for soil function assessment (SFA) and (2) showing the response of different SFA methods to the propagation of uncertainties through the assessment. For a study area of 170 km2 in the Swiss Plateau, we map 10 static soil sub-functions for agricultural soils at a spatial resolution of 20 × 20 m together with their uncertainties. Mapping the 10 soil sub-functions using simple ordinal assessment scales reveals pronounced spatial patterns with a high variability of SFF scores across the region, linked to the inherent properties of the soils, terrain attributes and climate conditions. Uncertainties in soil properties propagated through SFA methods generally lead to substantial uncertainty in the mapped soil sub-functions. We propose two types of uncertainty maps that can be readily understood by stakeholders. Cumulative distribution functions of SFF scores indicate that SFA methods respond differently to the propagated uncertainty of soil properties. Even where methods are comparable in level of complexity and assessment scale, their comparability in view of uncertainty propagation might differ. We conclude that comparable uncertainty indications in soil function maps are relevant to enable informed and transparent decisions on the sustainable use of soil resources.

  18. Why Quantify Uncertainty in Ecosystem Studies: Obligation versus Discovery Tool?

    NASA Astrophysics Data System (ADS)

    Harmon, M. E.

    2016-12-01

    There are multiple motivations for quantifying uncertainty in ecosystem studies. One is as an obligation; the other is as a tool useful in moving ecosystem science toward discovery. While reporting uncertainty should become a routine expectation, a more convincing motivation involves discovery. By clarifying what is known and to what degree it is known, uncertainty analyses can point the way toward improvements in measurements, sampling designs, and models. While some of these improvements (e.g., better sampling designs) may lead to incremental gains, those involving models (particularly model selection) may require large gains in knowledge. To be fully harnessed as a discovery tool, attitudes toward uncertainty may have to change: rather than viewing uncertainty as a negative assessment of what was done, it should be viewed as a positive, helpful assessment of what remains to be done.

  19. Evaluation of an exposure assessment used in epidemiological studies of diesel exhaust and lung cancer in underground mines

    PubMed Central

    Crump, Kenny; Van Landingham, Cynthia

    2012-01-01

    NIOSH/NCI (the National Institute for Occupational Safety and Health and the National Cancer Institute) developed exposure estimates for respirable elemental carbon (REC) as a surrogate for exposure to diesel exhaust (DE) for different jobs in eight underground mines by year, beginning in the 1940s-1960s when diesel equipment was first introduced into these mines. These estimates played a key role in subsequent epidemiological analyses of the potential relationship between exposure to DE and lung cancer conducted in these mines. We report here on a reanalysis of some of the data from this exposure assessment. Because samples of REC were limited primarily to 1998-2001, NIOSH/NCI used carbon monoxide (CO) as a surrogate for REC. In addition, because CO samples were limited, particularly in the earlier years, they used the ratio of diesel horsepower (HP) to the mine air exhaust rate as a surrogate for CO. There are considerable uncertainties connected with each of these surrogate-based steps. The estimates of HP appear to involve considerable uncertainty, although we had no data upon which to evaluate the magnitude of this uncertainty. A sizable percentage (45%) of the CO samples used in the HP to CO model was below the detection limit, which required NIOSH/NCI to assign CO values to these samples. In their preferred REC estimates, NIOSH/NCI assumed a linear relation between CO and REC, although they provided no credible support for that assumption. Their assumption of a stable relationship between HP and CO also is questionable, and our reanalysis found a statistically significant relationship in only one-half of the mines. We re-estimated yearly REC exposures mainly using NIOSH/NCI methods but with some important differences: (i) rather than simply assuming a linear relationship, we used data from the mines to estimate the CO-REC relationship; (ii) we used a different method for assigning values to nondetect CO measurements; and (iii) we took account of statistical uncertainty to estimate bounds for REC exposures. This exercise yielded significantly different exposure estimates than those of NIOSH/NCI. However, this analysis did not incorporate the full range of uncertainty in REC exposures because of additional uncertainties in the assumptions underlying the modeling and in the underlying data (e.g. HP and mine exhaust rates). Estimating historical exposures in a cohort is generally a very difficult undertaking. However, this should not prevent one from recognizing the uncertainty in the resulting estimates in any use made of them. PMID:22594934

  20. Uncertainty assessment and implications for data acquisition in support of integrated hydrologic models

    NASA Astrophysics Data System (ADS)

    Brunner, Philip; Doherty, J.; Simmons, Craig T.

    2012-07-01

    The data set used for calibration of regional numerical models which simulate groundwater flow and vadose zone processes is often dominated by head observations. It is to be expected, therefore, that parameters describing vadose zone processes are poorly constrained. A number of studies on small spatial scales have explored how additional data types used in calibration constrain vadose zone parameters or reduce predictive uncertainty. However, available studies focused on subsets of observation types and did not jointly account for different measurement accuracies or different hydrologic conditions. In this study, parameter identifiability and predictive uncertainty are quantified in simulation of a 1-D vadose zone soil system driven by infiltration, evaporation and transpiration. The worth of different types of observation data (employed individually, in combination, and with different measurement accuracies) is evaluated by using a linear methodology and a nonlinear Pareto-based methodology under different hydrological conditions. Our main conclusions are: (1) Linear analysis provides valuable information on comparative parameter and predictive uncertainty reduction accrued through acquisition of different data types; its use can be supplemented by nonlinear methods. (2) Measurements of water table elevation can support future water table predictions, even if such measurements inform the individual parameters of vadose zone models to only a small degree. (3) The benefits of including ET and soil moisture observations in the calibration data set are heavily dependent on depth to groundwater. (4) Measurements of groundwater levels, vadose zone ET, or soil moisture poorly constrain regional groundwater system forcing functions.
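
    The linear methodology referred to is first-order, Jacobian-based uncertainty analysis. With prior parameter covariance C, observation sensitivities J, observation error covariance R, and prediction sensitivities y, conditioning on an observation reduces the predictive variance from yᵀCy to yᵀC'y, where C' = C - CJᵀ(JCJᵀ + R)⁻¹JC; the worth of a candidate observation is the reduction it buys. All matrices below are invented.

    ```python
    import numpy as np

    # Invented 3-parameter problem (e.g. conductivity plus two vadose
    # zone parameters), with prior parameter covariance C.
    C = np.diag([1.0, 0.5, 0.5])

    # Jacobian rows: sensitivities of two candidate observation types.
    J_head = np.array([[1.0, 0.1, 0.05]])   # head obs: informs parameter 1
    J_et   = np.array([[0.2, 0.8, 0.6]])    # ET obs: informs vadose params
    R = np.array([[0.1]])                   # observation error variance

    # Sensitivity of the prediction (future water table) to the parameters.
    y = np.array([0.9, 0.4, 0.2])

    def predictive_variance(J):
        gain = C @ J.T @ np.linalg.inv(J @ C @ J.T + R)
        C_post = C - gain @ J @ C           # conditioned parameter covariance
        return y @ C_post @ y

    prior = y @ C @ y
    for name, J in [("head", J_head), ("ET  ", J_et)]:
        post = predictive_variance(J)
        print(f"one {name} obs: predictive variance {prior:.3f} -> {post:.3f} "
              f"({100 * (1 - post / prior):.0f}% reduction)")
    ```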

  1. Uncertainty estimations for moment tensor inversions: the issue of the 2012 May 20 Emilia earthquake

    NASA Astrophysics Data System (ADS)

    Scognamiglio, Laura; Magnoni, Federica; Tinti, Elisa; Casarotti, Emanuele

    2016-08-01

    The seismic moment tensor is one of the most important source parameters, defining the earthquake size and the style of the activated fault. Geoscientists ordinarily use moment tensor catalogues; however, few attempts have been made to assess the possible impact of moment magnitude uncertainties upon their analyses. The 2012 May 20 Emilia main shock is a representative event, since it is reported in the literature with moment magnitude (Mw) values spanning 5.63 to 6.12. A variability of ~0.5 units in magnitude leads to a controversial knowledge of the real size of the event and reveals how poorly constrained such solutions can be. In this work, we investigate the stability of the moment tensor solution for this earthquake, studying the effect of five different 1-D velocity models and of the number and distribution of the stations used in the inversion procedure. We also introduce a 3-D velocity model to account for structural heterogeneity. We finally estimate the uncertainties associated with the computed focal planes and the obtained Mw. We conclude that our reliable source solutions provide a moment magnitude that ranges from 5.87 (1-D model) to 5.96 (3-D model), reducing the variability in the literature to ~0.1. We stress that the estimation of seismic moment from moment tensor solutions, as well as the estimation of the other kinematic source parameters, requires disclosed assumptions and explicit processing workflows. Finally, and probably more importantly, when a moment tensor solution is used for secondary analyses it has to be combined with the same main boundary conditions (e.g. the wave-velocity propagation model) to avoid conflicting results.

  2. Justify Your Answer: The Role of Written Think Aloud in Script Concordance Testing.

    PubMed

    Power, Alyssa; Lemay, Jean-Francois; Cooke, Suzette

    2017-01-01

    Construct: Clinical reasoning assessment is a growing area of interest in the medical education literature. Script concordance testing (SCT) evaluates clinical reasoning in conditions of uncertainty and has emerged as an innovative tool in the domain of clinical reasoning assessment. SCT quantifies the degree of concordance between a learner and an experienced clinician and attempts to capture the breadth of responses of expert clinicians, acknowledging the significant yet acceptable variation in practice under situations of uncertainty. SCT has been shown to be a valid and reliable clinical reasoning assessment tool. However, as SCT provides only quantitative information, it may not provide a complete assessment of clinical reasoning. Think aloud (TA) is a qualitative research tool used in clinical reasoning assessment in which learners verbalize their thought process around an assigned task. This study explores the use of TA, in the form of written reflection, in SCT to assess resident clinical reasoning, hypothesizing that the information obtained from the written TA would enrich the quantitative data obtained through SCT. Ninety-one pediatric postgraduate trainees and 21 pediatricians from 4 Canadian training centers completed an online test consisting of 24 SCT cases immediately followed by retrospective written TA. Six of 24 cases were selected to gather TA data. These cases were chosen to allow all phases of clinical decision making (diagnosis, investigation, and treatment) to be represented in the TA data. Inductive thematic analysis was employed when systematically reviewing TA responses. Three main benefits of adding written TA to SCT were identified: (a) uncovering instances of incorrect clinical reasoning despite a correct SCT response, (b) revealing sound clinical reasoning in the context of a suboptimal SCT response, and (c) detecting question misinterpretation. Written TA can optimize SCT by demonstrating when correct examinee responses are based on guessing or uncertainty rather than robust clinical rationale. TA can also enhance SCT by allowing examinees to provide justification for responses that otherwise would have been considered incorrect and by identifying questions that are frequently misinterpreted to avoid including them in future examinations. TA also has significant value in differentiating between acceptable variations in expert clinician responses and deviance associated with faulty rationale or question misinterpretation; this could improve SCT reliability. A written TA protocol appears to be a valuable tool to assess trainees' clinical reasoning and can strengthen the quantitative assessment provided by SCT.
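
    The aggregate scoring that underlies SCT is simple to state in code: a response earns credit equal to the number of panel members who chose it divided by the count of the modal panel response, so acceptable variation among experts earns partial credit. The panel below is invented.

    ```python
    from collections import Counter

    def sct_credit(panel_responses):
        """Aggregate scoring: credit for each Likert response equals
        (panelists choosing it) / (count of the modal response)."""
        counts = Counter(panel_responses)
        modal = max(counts.values())
        return {resp: n / modal for resp, n in counts.items()}

    # Invented panel of 21 experts answering one item on a -2..+2 scale.
    panel = [-1] * 3 + [0] * 5 + [1] * 11 + [2] * 2
    credit = sct_credit(panel)

    for response in [-2, -1, 0, 1, 2]:
        print(f"examinee answers {response:+d}: credit {credit.get(response, 0.0):.2f}")
    ```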

  3. Interim reliability-evaluation program: analysis of the Browns Ferry, Unit 1, nuclear plant. Appendix C - sequence quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mays, S.E.; Poloski, J.P.; Sullivan, W.H.

    1982-07-01

    This report describes a risk study of the Browns Ferry, Unit 1, nuclear plant. The study is one of four such studies sponsored by the NRC Office of Research, Division of Risk Assessment, as part of its Interim Reliability Evaluation Program (IREP), Phase II. This report is contained in four volumes: a main report and three appendixes. Appendix C generally describes the methods used to estimate accident sequence frequency values. Information is presented concerning the approach, example collection, failure data, candidate dominant sequences, uncertainty analysis, and sensitivity analysis.

  4. Uncertainty in the Himalayan energy-water nexus: estimating regional exposure to glacial lake outburst floods

    NASA Astrophysics Data System (ADS)

    Schwanghart, Wolfgang; Worni, Raphael; Huggel, Christian; Stoffel, Markus; Korup, Oliver

    2016-07-01

    Himalayan water resources attract a rapidly growing number of hydroelectric power projects (HPP) to satisfy Asia's soaring energy demands. Yet HPP operating or planned in steep, glacier-fed mountain rivers face hazards from glacial lake outburst floods (GLOFs) that can damage hydropower infrastructure, alter water and sediment yields, and compromise livelihoods downstream. Detailed appraisals of such GLOF hazards are limited to case studies, however, and a more comprehensive, systematic analysis remains elusive. To this end we estimate the regional exposure of 257 Himalayan HPP to GLOFs, using a flood-wave propagation model fed by Monte Carlo-derived outburst volumes of >2300 glacial lakes. We interpret the spread of the modeled peak discharges as a predictive uncertainty that arises mainly from outburst volumes and dam-breach rates that are difficult to assess before dams fail. With 66% of sampled HPP on potential GLOF tracks, up to one third of these HPP could experience GLOF discharges well above local design floods, as hydropower development continues to seek higher sites closer to glacial lakes. We compute that this systematic push of HPP into headwaters effectively doubles the uncertainty about GLOF peak discharge in these locations. Peak discharges farther downstream, in contrast, are easier to predict because GLOF waves attenuate rapidly. Considering this systematic pattern of regional GLOF exposure might aid the site selection of future Himalayan HPP. Our method can augment, and help to regularly update, current hazard assessments, given that global warming is likely changing the number and size of Himalayan meltwater lakes.
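
    The Monte Carlo element can be caricatured for a single hypothetical lake: sample the poorly known outburst volume, convert it to a peak discharge with a power law of the same form as published dam-breach relations (the coefficients here are placeholders, not the study's), and read off the exceedance probability of a hypothetical design flood.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    # Hypothetical lake: outburst volume known only within wide bounds (m^3).
    volume = rng.lognormal(mean=np.log(1e6), sigma=0.6, size=50_000)

    # Power-law peak discharge Q = a * V^b (placeholder coefficients of the
    # same form as published empirical breach relations).
    a, b = 0.0048, 0.896
    q_peak = a * volume**b                   # m^3/s near the breach

    design_flood = 1500.0                    # hypothetical HPP design flood
    p5, p50, p95 = np.percentile(q_peak, [5, 50, 95])
    print(f"peak discharge: median {p50:.0f} m^3/s, 90% range [{p5:.0f}, {p95:.0f}]")
    print(f"P(Q_peak > design flood) = {(q_peak > design_flood).mean():.2f}")
    ```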

  5. Improvement of Modeling HTGR Neutron Physics by Uncertainty Analysis with the Use of Cross-Section Covariance Information

    NASA Astrophysics Data System (ADS)

    Boyarinov, V. F.; Grol, A. V.; Fomichenko, P. A.; Ternovykh, M. Yu

    2017-01-01

    This work aims to improve HTGR neutron physics design calculations through uncertainty analysis with the use of cross-section covariance information. Methodology and codes for the preparation of multigroup libraries of covariance information for individual isotopes from the basic 44-group library of the SCALE-6 code system were developed. A 69-group library of covariance information in a special format for the main isotopes and elements typical of high temperature gas-cooled reactors (HTGR) was generated. This library can be used for estimating the uncertainties associated with nuclear data in analyses of HTGR neutron physics with design codes. As an example, calculations of one-group cross-section uncertainties for fission and capture reactions for the main isotopes of the MHTGR-350 benchmark, as well as uncertainties of the multiplication factor (k∞) for the MHTGR-350 fuel compact cell model and fuel block model, were performed. These uncertainties were estimated with the developed technology using the WIMS-D code and modules of the SCALE-6 code system, namely TSUNAMI, KENO-VI and SAMS. The eight most important reactions on isotopes for the MHTGR-350 benchmark were identified, namely: 10B(capt), 238U(n,γ), ν5, 235U(n,γ), 238U(el), natC(el), 235U(fiss)-235U(n,γ), 235U(fiss).
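
    The propagation behind such estimates is the "sandwich rule": for a response such as k∞ with relative sensitivity vector S to the group cross sections and relative covariance matrix C from the covariance library, the relative variance is SᵀCS. A toy two-group illustration with invented numbers:

    ```python
    import numpy as np

    # Invented relative sensitivities of k-infinity to two cross sections
    # (e.g. 10B capture and 238U capture): dk/k per dsigma/sigma.
    S = np.array([-0.15, -0.30])

    # Invented relative covariance matrix of those cross sections, of the
    # kind read from a processed covariance library.
    C = np.array([[0.04**2,            0.3 * 0.04 * 0.06],
                  [0.3 * 0.04 * 0.06,  0.06**2          ]])

    rel_var = S @ C @ S                  # sandwich rule: S^T C S
    print(f"relative uncertainty in k-inf: {100 * np.sqrt(rel_var):.2f}%")
    ```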

  6. Making Invasion models useful for decision makers; incorporating uncertainty, knowledge gaps, and decision-making preferences

    Treesearch

    Denys Yemshanov; Frank H Koch; Mark Ducey

    2015-01-01

    Uncertainty is inherent in model-based forecasts of ecological invasions. In this chapter, we explore how the perceptions of that uncertainty can be incorporated into the pest risk assessment process. Uncertainty changes a decision maker’s perceptions of risk; therefore, the direct incorporation of uncertainty may provide a more appropriate depiction of risk. Our...

  7. Uncertainty Assessment: What Good Does it Do? (Invited)

    NASA Astrophysics Data System (ADS)

    Oreskes, N.; Lewandowsky, S.

    2013-12-01

    The scientific community has devoted considerable time and energy to understanding, quantifying and articulating the uncertainties related to anthropogenic climate change. However, informed decision-making and good public policy arguably rely far more on a central core of understanding of matters that are scientifically well established than on detailed understanding and articulation of all relevant uncertainties. Advocates of vaccination, for example, stress its overall efficacy in preventing morbidity and mortality--not the uncertainties over how long the protective effects last. Advocates for colonoscopy for cancer screening stress its capacity to detect polyps before they become cancerous, with relatively little attention paid to the fact that many, if not most, polyps, would not become cancerous even if left unremoved. So why has the climate science community spent so much time focused on uncertainty? One reason, of course, is that articulation of uncertainty is a normal and appropriate part of scientific work. However, we argue that there is another reason that involves the pressure that the scientific community has experienced from individuals and groups promoting doubt about anthropogenic climate change. Specifically, doubt-mongering groups focus public attention on scientific uncertainty as a means to undermine scientific claims, equating uncertainty with untruth. Scientists inadvertently validate these arguments by agreeing that much of the science is uncertain, and thus seemingly implying that our knowledge is insecure. The problem goes further, as the scientific community attempts to articulate more clearly, and reduce, those uncertainties, thus, seemingly further agreeing that the knowledge base is insufficient to warrant public and governmental action. We refer to this effect as 'seepage,' as the effects of doubt-mongering seep into the scientific community and the scientific agenda, despite the fact that addressing these concerns does little to alter the public debate or advance public policy. We argue that attempts to address public doubts by improving uncertainty assessment are bound to fail, insofar as the motives for doubt-mongering are independent of scientific uncertainty, and therefore remain unaffected even as those uncertainties are diminished. We illustrate this claim by consideration of the evolution of the debate over the past ten years over the relationship between hurricanes and anthropogenic climate change. We suggest that scientists should pursue uncertainty assessment if such assessment improves scientific understanding, but not as a means to reduce public doubts or advance public policy in relation to anthropogenic climate change.

  8. Environmental impact and risk assessments and key factors contributing to the overall uncertainties.

    PubMed

    Salbu, Brit

    2016-01-01

    There is a significant number of nuclear and radiological sources that have contributed, are still contributing, or have the potential to contribute to radioactive contamination of the environment in the future. To protect the environment from radioactive contamination, impact and risk assessments are performed prior to or during a release event, short or long term after deposition, or prior to and after implementation of countermeasures. When environmental impact and risks are assessed, however, a series of factors will contribute to the overall uncertainties. To provide environmental impact and risk assessments, information on processes, kinetics and a series of input variables is needed. Added to problems such as variability, questionable assumptions, gaps in knowledge, extrapolations and poor conceptual model structures, a series of factors contribute to large and often unacceptable uncertainties in impact and risk assessments. Information on the source term and the release scenario is an essential starting point in impact and risk models; the source determines the activity concentrations and atom ratios of the radionuclides released, while the release scenario determines the physico-chemical forms of released radionuclides such as particle size distribution, structure and density. Releases will most often contain other contaminants such as metals, and due to interactions, contaminated sites should be assessed as a multiple stressor scenario. Following deposition, a series of stressors, interactions and processes will influence the ecosystem transfer of radionuclide species and thereby influence biological uptake (toxicokinetics) and responses (toxicodynamics) in exposed organisms. Due to the variety of biological species, extrapolation is frequently needed to fill gaps in knowledge, e.g. from effects to no effects, from effects in one organism to others, from one stressor to mixtures. Most toxicity tests are, however, performed as short-term exposures of adult organisms, ignoring sensitive life history stages of organisms and transgenerational effects. To link sources, ecosystem transfer and biological effects to future impact and risks, a series of models are usually interfaced, while uncertainty estimates are seldom given. The model predictions are, however, only valid within the boundaries of the overall uncertainties. Furthermore, the model predictions are only useful and relevant when uncertainties are estimated, communicated and understood. Among the key factors contributing most to uncertainties, the present paper focuses especially on structure uncertainties (model bias or discrepancies), as aspects such as particle releases, ecosystem dynamics, mixed exposure, sensitive life history stages and transgenerational effects are usually ignored in assessment models. Research focus on these aspects should significantly reduce the overall uncertainties in the impact and risk assessment of radioactively contaminated ecosystems. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Variability of blood alcohol content (BAC) determinations: the role of measurement uncertainty, significant figures, and decision rules for compliance assessment in the frame of a multiple BAC threshold law.

    PubMed

    Zamengo, Luca; Frison, Giampietro; Tedeschi, Gianpaola; Frasson, Samuela; Zancanaro, Flavio; Sciarrone, Rocco

    2014-10-01

    The measurement of blood-alcohol content (BAC) is a crucial analytical determination required to assess whether an offence (e.g. driving under the influence of alcohol) has been committed. For various reasons, results of forensic alcohol analysis are often challenged by the defence. As a consequence, measurement uncertainty becomes a critical topic when assessing compliance with specification limits for forensic purposes. The aims of this study were: (1) to investigate major sources of variability for BAC determinations; (2) to estimate measurement uncertainty for routine BAC determinations; (3) to discuss the role of measurement uncertainty in compliance assessment; (4) to set decision rules for a multiple BAC threshold law, as provided in the Italian Highway Code; (5) to address the topic of the zero-alcohol limit from the forensic toxicology point of view; and (6) to discuss the role of significant figures and rounding errors in measurement uncertainty and compliance assessment. Measurement variability was investigated by the analysis of data collected from real cases and internal quality control. The contribution of both pre-analytical and analytical processes to measurement variability was considered. The resulting expanded measurement uncertainty was 8.0%. Decision rules for the multiple BAC threshold Italian law were set by adopting a guard-banding approach. A level of 0.1 g/L was chosen as the cut-off to assess compliance with the zero-alcohol limit. The role of significant figures and rounding errors in compliance assessment was discussed by providing examples which stressed the importance of these topics for forensic purposes. Copyright © 2014 John Wiley & Sons, Ltd.
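
    The decision rules described above can be illustrated with a small guard-banding sketch. The 8.0% expanded uncertainty is taken from the abstract; the threshold and test values are invented, and the rule shown (declare non-compliance only when the result minus its expanded uncertainty still exceeds the limit) is one common convention.

    ```python
    # Guard-banding sketch for BAC compliance, assuming an expanded relative
    # measurement uncertainty of 8.0% as reported in the study.
    K_REL = 0.080

    def exceeds(bac_g_per_l: float, limit_g_per_l: float) -> bool:
        """Declare non-compliance only if the result minus its expanded
        uncertainty still lies above the legal limit (guard-banding)."""
        guard_band = K_REL * bac_g_per_l
        return (bac_g_per_l - guard_band) > limit_g_per_l

    # Illustrative results against a hypothetical 0.5 g/L threshold
    for bac in (0.52, 0.55, 0.60):
        print(bac, "g/L above 0.5 g/L beyond doubt:", exceeds(bac, 0.5))
    ```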

  10. Micropollutants throughout an integrated urban drainage model: Sensitivity and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Mannina, Giorgio; Cosenza, Alida; Viviani, Gaspare

    2017-11-01

    The paper presents a sensitivity and uncertainty analysis of an integrated urban drainage model which includes micropollutants. Specifically, a bespoke integrated model developed in previous studies has been modified to include micropollutant assessment (namely, of sulfamethoxazole - SMX). The model also takes into account the interactions between the three components of the system: sewer system (SS), wastewater treatment plant (WWTP) and receiving water body (RWB). The analysis has been applied to an experimental catchment near Palermo (Italy): the Nocella catchment. Overall, five scenarios, each characterized by a different combination of uncertain sub-systems (i.e., SS, WWTP and RWB), have been considered, applying the Extended-FAST method for the sensitivity analysis in order to select the key factors affecting RWB quality and to design a reliable/useful experimental campaign. Results demonstrate that sensitivity analysis is a powerful tool for increasing operator confidence in modelling results. The approach adopted here can be used to fix (block) some non-identifiable factors, thus wisely modifying the structure of the model and reducing the related uncertainty. The model factors related to the SS were found to be the most relevant factors affecting SMX modelling in the RWB when all model factors (scenario 1) or the SS model factors (scenarios 2 and 3) are varied. If only the factors related to the WWTP are changed (scenarios 4 and 5), the SMX concentration in the RWB is mainly influenced (up to 95% of the total variance of S_SMX,max) by the aerobic sorption coefficient. A progressive uncertainty reduction from upstream to downstream was found for the soluble fraction of SMX in the RWB.
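
    Extended-FAST estimates first-order variance contributions through a frequency decomposition; the sketch below estimates the same first-order index, Si = Var(E[Y|Xi])/Var(Y), with a simpler binning estimator on a toy model. The factors and response are invented stand-ins, not the integrated SS-WWTP-RWB model.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 20000
    # Toy stand-ins for uncertain model factors
    x1 = rng.uniform(0, 1, n)   # e.g. a sewer-system washoff factor
    x2 = rng.uniform(0, 1, n)   # e.g. the aerobic sorption coefficient
    y = 4 * x1 + np.sin(2 * np.pi * x2) + rng.normal(0, 0.1, n)

    def first_order_index(x, y, bins=50):
        """Si = Var_x(E[Y|X]) / Var(Y), estimated by binning X."""
        edges = np.quantile(x, np.linspace(0, 1, bins + 1))
        idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
        cond_means = np.array([y[idx == b].mean() for b in range(bins)])
        return cond_means.var() / y.var()

    print("S1 =", first_order_index(x1, y))
    print("S2 =", first_order_index(x2, y))
    ```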

  11. Uncertainties in Future Regional Sea Level Trends: How to Deal with the Internal Climate Variability?

    NASA Astrophysics Data System (ADS)

    Becker, M.; Karpytchev, M.; Hu, A.; Deser, C.; Lennartz-Sassinek, S.

    2017-12-01

    Today, climate models (CMs) are the main tools for forecasting sea level rise (SLR) at global and regional scales. CM forecasts are accompanied by inherent uncertainties. Understanding and reducing these uncertainties is becoming a matter of increasing urgency in order to provide robust estimates of SLR impacts on coastal societies, which must make sustainable choices of climate adaptation strategy. These CM uncertainties are linked to structural model formulation, initial conditions, emission scenario and internal variability. The internal variability is due to complex non-linear interactions within the Earth climate system and can induce diverse quasi-periodic oscillatory modes and long-term persistence. To quantify the effects of internal variability, most studies use multi-model ensembles or sea level projections from a single model run with perturbed initial conditions. However, large ensembles are generally unavailable or too small, and are computationally expensive. In this study, we use a power-law scaling of sea level fluctuations, as observed in many other geophysical signals and natural systems, to characterize the internal climate variability. Within this statistical framework, we (1) use the pre-industrial control run of the National Center for Atmospheric Research Community Climate System Model (NCAR-CCSM) to test the robustness of the power-law scaling hypothesis; (2) employ the power-law statistics as a tool for assessing the spread of regional sea level projections due to internal climate variability over the 21st century in NCAR-CCSM; (3) compare the uncertainties in predicted sea level changes obtained from NCAR-CCSM multi-member ensemble simulations with estimates derived for power-law processes; and (4) explore the sensitivity of spatial patterns of the internal variability and its effects on regional sea level projections.
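
    A hedged sketch of the kind of scaling analysis involved: fit the log-log slope of the periodogram of a control-run-like series. For a true power-law process, S(f) ∝ f^-β, this slope estimates the scaling exponent; here a persistent AR(1) series, which is invented, stands in for actual model output.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for a pre-industrial control run of local sea level:
    # a persistent AR(1) process (the study tests a power-law hypothesis on
    # actual NCAR-CCSM output).
    n = 4096
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = 0.9 * x[t - 1] + rng.normal()

    # Periodogram and log-log fit of spectral density vs frequency
    freqs = np.fft.rfftfreq(n)[1:]
    power = np.abs(np.fft.rfft(x - x.mean()))[1:] ** 2 / n
    slope, _ = np.polyfit(np.log(freqs), np.log(power), 1)
    print(f"estimated spectral exponent beta = {-slope:.2f}")  # ~0 white, >0 persistent
    ```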

  12. Research priority setting in Barrett's oesophagus and gastro-oesophageal reflux disease.

    PubMed

    Britton, James; Gadeke, Lisa; Lovat, Laurence; Hamdy, Shaheen; Hawkey, Chris; McLaughlin, John; Ang, Yeng

    2017-11-01

    The incidence of gastro-oesophageal reflux disease and Barrett's oesophagus is increasing. Barrett's oesophagus is the main precursor to oesophageal adenocarcinoma, which has a poor prognosis. In view of the vast potential burden of these diseases on patients and health-care resources, there is a real need to define and focus research efforts. This priority setting exercise aimed to produce a list of the top ten uncertainties in the field that reflect the priorities of patients and health-care providers. We adopted the robust and transparent methodologies previously outlined by the James Lind Alliance. This qualitative approach firstly involves an ideas gathering survey that, once distilled, generates a longlist of research uncertainties. These uncertainties are then prioritised via an interim ranking survey and a final workshop to achieve consensus agreement. The initial 629 uncertainties, generated from a survey of 170 individual respondents (47% professional, 53% non-professional) and one workshop, were narrowed down to the final top ten uncertainties of priority for future research. These priorities covered a range of issues, including a need for improved patient risk stratification, alternative diagnostic and surveillance tests, efficacy of a dedicated service for Barrett's oesophagus, cost-effectiveness and appropriateness of current surveillance, advances in development of non-drug treatments for gastro-oesophageal reflux disease, safety of long-term drug treatment, and questions regarding the durability and role of different endoscopic therapies for dysplastic Barrett's oesophagus. This is the first patient-centred assessment of priorities for researchers in this chronic disease setting. We hope that recognition and dissemination of these results will shape the future direction of research and translate into meaningful gains for patients. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Uncertainty and risk in wildland fire management: a review.

    PubMed

    Thompson, Matthew P; Calkin, Dave E

    2011-08-01

    Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to treatments, and of spatiotemporal dynamics involving disturbance regimes and climate change. This work attempts to systematically align sources of uncertainty with the most appropriate decision support methodologies, in order to facilitate cost-effective, risk-based wildfire planning efforts. We review the state of wildfire risk assessment and management, with a specific focus on uncertainties challenging implementation of integrated risk assessments that consider a suite of human and ecological values. Recent advances in wildfire simulation and geospatial mapping of highly valued resources have enabled robust risk-based analyses to inform planning across a variety of scales, although improvements are needed in fire behavior and ignition occurrence models. A key remaining challenge is a better characterization of non-market resources at risk, both in terms of their response to fire and how society values those resources. Our findings echo earlier literature identifying wildfire effects analysis and value uncertainty as the primary challenges to integrated wildfire risk assessment and wildfire management. We stress the importance of identifying and characterizing uncertainties in order to better quantify and manage them. Leveraging the most appropriate decision support tools can facilitate wildfire risk assessment and ideally improve decision-making. Published by Elsevier Ltd.

  14. Incorporating climate change into ecosystem service assessments and decisions: a review.

    PubMed

    Runting, Rebecca K; Bryan, Brett A; Dee, Laura E; Maseyk, Fleur J F; Mandle, Lisa; Hamel, Perrine; Wilson, Kerrie A; Yetka, Kathleen; Possingham, Hugh P; Rhodes, Jonathan R

    2017-01-01

    Climate change is having a significant impact on ecosystem services and is likely to become increasingly important as this phenomenon intensifies. Future impacts can be difficult to assess as they often involve long timescales, dynamic systems with high uncertainties, and are typically confounded by other drivers of change. Despite a growing literature on climate change impacts on ecosystem services, no quantitative syntheses exist. Hence, we lack an overarching understanding of the impacts of climate change, how they are being assessed, and the extent to which other drivers, uncertainties, and decision making are incorporated. To address this, we systematically reviewed the peer-reviewed literature that assesses climate change impacts on ecosystem services at subglobal scales. We found that the impact of climate change on most types of services was predominantly negative (59% negative, 24% mixed, 4% neutral, 13% positive), but varied across services, drivers, and assessment methods. Although uncertainty was usually incorporated, there were substantial gaps in the sources of uncertainty included, along with the methods used to incorporate them. We found that relatively few studies integrated decision making, and even fewer studies aimed to identify solutions that were robust to uncertainty. For management or policy to ensure the delivery of ecosystem services, integrated approaches that incorporate multiple drivers of change and account for multiple sources of uncertainty are needed. This is undoubtedly a challenging task, but ignoring these complexities can result in misleading assessments of the impacts of climate change, suboptimal management outcomes, and the inefficient allocation of resources for climate adaptation. © 2016 John Wiley & Sons Ltd.

  15. Risk assessment from exposure to arsenic, antimony, and selenium in urban gardens (Madrid, Spain).

    PubMed

    De Miguel, Eduardo; Izquierdo, Miguel; Gómez, Amaia; Mingot, Juan; Barrio-Parra, Fernando

    2017-02-01

    The authors discuss the geochemical behavior of arsenic (As), antimony (Sb), and selenium (Se) in urban gardens and the human health implications associated with urban agriculture. A total of 42 samples from 7 urban gardens in Madrid, Spain, were collected from the top 20 cm of soil. Concentrations of As, Sb, and Se and the main soil properties (i.e., total iron, pH, texture, calcium carbonate, and organic matter) were determined. A significant correlation was found between As and Sb and calcium carbonate, indicating the possibility of surface adsorption or ligand exchange with the carbonate group. Also, Sb seemed to form stable chelates with soil organic matter. On the other hand, Se showed a significant association with clay and iron content. The concentration of Sb in soil exceeded the recommended value for agricultural use in 70% of the urban gardens. A human health risk assessment resulted in acceptable levels of both noncarcinogenic and carcinogenic risks (although with elevated values of the latter), with As as the main risk driver and soil and food ingestion as the main exposure pathways. The numerical results of the risk assessment should be interpreted with caution given the considerable uncertainties in some exposure variables and the lack of quantitative values for the suspected carcinogenicity of Sb and Se. Environ Toxicol Chem 2017;36:544-550. © 2016 SETAC.
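
    The exposure arithmetic behind such soil-ingestion risk estimates follows the standard average-daily-dose and incremental-lifetime-cancer-risk equations sketched below; every parameter value here is an illustrative default, not a value from the study.

    ```python
    # Deterministic soil-ingestion pathway sketch (all values hypothetical)
    C_AS = 20.0      # arsenic concentration in soil, mg/kg
    IR   = 100e-6    # soil ingestion rate, kg/day
    EF   = 350.0     # exposure frequency, days/year
    ED   = 24.0      # exposure duration, years
    BW   = 70.0      # body weight, kg
    AT_C = 70 * 365  # averaging time for carcinogens, days
    SF_O = 1.5       # assumed oral slope factor, (mg/kg-day)^-1

    add = C_AS * IR * EF * ED / (BW * AT_C)  # average daily dose, mg/kg-day
    ilcr = add * SF_O                        # incremental lifetime cancer risk
    print(f"ILCR = {ilcr:.2e} (often compared against 1e-6 to 1e-4)")
    ```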

  16. Calibrating White Dwarf Asteroseismic Fitting Techniques

    NASA Astrophysics Data System (ADS)

    Castanheira, B. G.; Romero, A. D.; Bischoff-Kim, A.

    2017-03-01

    The main goal of looking for intrinsic variability in stars is the unique opportunity to study their internal structure. Once we have extracted independent modes from the data, it appears to be a simple matter of comparing the period spectrum with those from theoretical model grids to learn the inner structure of that star. However, asteroseismology is much more complicated than this simple description. We must account not only for observational uncertainties in period determination, but most importantly for the limitations of the model grids, coming from the uncertainties in the constitutive physics, and of the fitting techniques. In this work, we will discuss results of numerical experiments where we used different independently calculated model grids (white dwarf cooling models WDEC and fully evolutionary LPCODE-PUL) and fitting techniques to fit synthetic stars. The advantage of using synthetic stars is that we know the details of their interior structure, so we can assess how well our models and fitting techniques are able to recover the interior structure, as well as the stellar parameters.
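
    The period-matching step can be sketched as a simple merit function over a model grid: score each model by the rms difference between its computed periods and the observed ones. The grid points and periods below are invented for illustration.

    ```python
    import numpy as np

    observed = np.array([215.2, 271.0, 304.5])  # pulsation periods, seconds

    model_grid = {
        # (Teff [K], mass [Msun]) -> computed periods (hypothetical)
        (11500, 0.60): np.array([214.8, 270.1, 306.0]),
        (11800, 0.60): np.array([216.5, 272.9, 303.1]),
        (11500, 0.65): np.array([210.0, 268.4, 300.2]),
    }

    def merit(model_periods, obs):
        # Match each observed period to its closest model period, then rms.
        diffs = [np.min(np.abs(model_periods - p)) for p in obs]
        return np.sqrt(np.mean(np.square(diffs)))

    best = min(model_grid, key=lambda k: merit(model_grid[k], observed))
    print("best-fit (Teff, M):", best, "rms =", merit(model_grid[best], observed))
    ```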

  17. Beyond Social Constructivist Perspectives on Assessment: The Centring of Knowledge

    ERIC Educational Resources Information Center

    Shay, Suellen

    2008-01-01

    Over the past few decades assessment has been heralded for its key role in the improvement of teaching and learning. However, more recently there have been expressions of uncertainty about whether assessment is in fact delivering on its promised potential. Against this backdrop of uncertainty and circumspection this paper offers a critical…

  18. Assessing Uncertainty of Interspecies Correlation Estimation Models for Aromatic Compounds

    EPA Science Inventory

    We developed Interspecies Correlation Estimation (ICE) models for aromatic compounds containing 1 to 4 benzene rings to assess uncertainty in toxicity extrapolation in two data compilation approaches. ICE models are mathematical relationships between surrogate and predicted test ...
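
    An ICE model is essentially a log-log regression between surrogate-species and predicted-species toxicity values; the sketch below fits one and attaches a rough multiplicative uncertainty band. The toxicity data are invented.

    ```python
    import numpy as np

    # Invented acute toxicity values (e.g. LC50 in mg/L) for two species
    surrogate = np.array([1.2, 3.5, 10.0, 32.0, 90.0])
    predicted = np.array([0.8, 2.9, 7.1, 25.0, 60.0])

    X, Y = np.log10(surrogate), np.log10(predicted)
    b, a = np.polyfit(X, Y, 1)        # log10(pred) = a + b * log10(surr)

    resid = Y - (a + b * X)
    s = np.sqrt(resid.var(ddof=2))    # residual standard error (2 fitted params)

    y_hat = a + b * np.log10(15.0)    # predict for a new surrogate value
    print(f"predicted LC50 ~ {10**y_hat:.1f} mg/L "
          f"(within a factor of ~{10**(2*s):.2f} at roughly 95%)")
    ```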

  19. SU-E-J-125: Classification of CBCT Noises in Terms of Their Contribution to Proton Range Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brousmiche, S; Orban de Xivry, J; Macq, B

    2014-06-01

    Purpose: This study assesses the potential use of CBCT images in adaptive protontherapy by estimating the contribution of the main sources of noise and calibration errors to the proton range uncertainty. Methods: Measurements intended to highlight each particular source have been achieved by adapting either the testbench configuration, e.g. use of filtration, fan-beam collimation, beam stop arrays, phantoms and detector reset light, or the sequence of correction algorithms including water precorrection. Additional Monte-Carlo simulations have been performed to complement these measurements, especially for the beam hardening and the scatter cases. Simulations of proton beams penetration through the resulting images havemore » then been carried out to quantify the range change due to these effects. The particular case of a brain irradiation is considered mainly because of the multiple effects that the skull bones have on the internal soft tissues. Results: On top of the range error sources is the undercorrection of scatter. Its influence has been analyzed from a comparison of fan-beam and full axial FOV acquisitions. In this case, large range errors of about 12 mm can be reached if the assumption is made that the scatter has only a constant contribution over the projection images. Even the detector lag, which a priori induces a much smaller effect, has been shown to contribute for up to 2 mm to the overall error if its correction only aims at reducing the skin artefact. This last result can partially be understood by the larger interface between tissues and bones inside the skull. Conclusion: This study has set the basis of a more systematical analysis of the effect CBCT noise on range uncertainties based on a combination of measurements, simulations and theoretical results. With our method, even more subtle effects such as the cone-beam artifact or the detector lag can be assessed. SBR and JOR are financed by iMagX, a public-private partnership between the region Wallone of Belgium and IBA under convention #1217662.« less

  20. Impact of climate change on the water cycle of agricultural landscapes in Southwest Germany

    NASA Astrophysics Data System (ADS)

    Witte, Irene; Ingwersen, Joachim; Gayler, Sebastian; Streck, Thilo

    2016-04-01

    For agricultural production and life in general, water is a necessity. To ensure food and drinking water security in the future, an understanding of the impact of climate change on the water cycle is indispensable. The objective of this PhD research is to assess how higher temperatures, higher atmospheric CO2 concentration and changing precipitation patterns will alter the water cycle of agricultural landscapes in Southwest Germany. As representative key characteristics, data evaluation will focus on water use efficiency (WUE) and groundwater recharge. The main research question is whether the positive effect of elevated atmospheric CO2 on WUE will be overcompensated by a decrease in net primary production due to warming and to altered seasonal water availability caused by higher rainfall variability. Elevated atmospheric CO2 stimulates plant growth and improves WUE, whereas higher temperatures are expected to reduce net primary production and groundwater recharge. A further research question is whether groundwater recharge will increase in winter and decrease in summer in Southwest Germany. Changed groundwater recharge directly affects drinking water supply and is an indicator of possible temporary water shortages in agricultural production. A multi-model ensemble composed of 16 combinations of four crop growth models, two water regime models and two nitrogen models will be calibrated and validated against sets of field data. Field data will be provided by FOR 1965 from 2009-2015 for the Kraichgau region and the Swabian Alb, two contrasting areas with regard to climate and agricultural intensity. By using a multi-model ensemble, uncertainties in predictions due to different model structures (epistemic uncertainty) can be quantified. The uncertainty related to the randomness of inputs and parameters, the so-called aleatory uncertainty, will additionally be assessed for each of the 16 models. Hence, a more reliable range of future scenarios can be derived, supporting the development of practicable mitigation strategies.
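
    The epistemic/aleatory split described above can be sketched as a variance decomposition over an ensemble: between-model spread approximates structural (epistemic) uncertainty, within-model spread the aleatory part. All numbers below are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # 16 models x 30 stochastic replicates of a summary output
    # (e.g. annual groundwater recharge in mm); all values synthetic.
    n_models, n_reps = 16, 30
    model_means = rng.normal(150.0, 20.0, n_models)          # structural differences
    sims = model_means[:, None] + rng.normal(0, 10.0, (n_models, n_reps))

    between = sims.mean(axis=1).var(ddof=1)   # epistemic (model-structure) part
    within = sims.var(axis=1, ddof=1).mean()  # aleatory (input/parameter) part
    print(f"epistemic share of total variance ~ {between / (between + within):.0%}")
    ```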

  1. Sources of uncertainty in annual forest inventory estimates

    Treesearch

    Ronald E. McRoberts

    2000-01-01

    Although design and estimation aspects of annual forest inventories have begun to receive considerable attention within the forestry and natural resources communities, little attention has been devoted to identifying the sources of uncertainty inherent in these systems or to assessing the impact of those uncertainties on the total uncertainties of inventory estimates....

  2. Assessing uncertainties in land cover projections.

    PubMed

    Alexander, Peter; Prestele, Reinhard; Verburg, Peter H; Arneth, Almut; Baranzelli, Claudia; Batista E Silva, Filipe; Brown, Calum; Butler, Adam; Calvin, Katherine; Dendoncker, Nicolas; Doelman, Jonathan C; Dunford, Robert; Engström, Kerstin; Eitelberg, David; Fujimori, Shinichiro; Harrison, Paula A; Hasegawa, Tomoko; Havlik, Petr; Holzhauer, Sascha; Humpenöder, Florian; Jacobs-Crisioni, Chris; Jain, Atul K; Krisztin, Tamás; Kyle, Page; Lavalle, Carlo; Lenton, Tim; Liu, Jiayi; Meiyappan, Prasanth; Popp, Alexander; Powell, Tom; Sands, Ronald D; Schaldach, Rüdiger; Stehfest, Elke; Steinbuks, Jevgenijs; Tabeau, Andrzej; van Meijl, Hans; Wise, Marshall A; Rounsevell, Mark D A

    2017-02-01

    Understanding uncertainties in land cover projections is critical to investigating land-based climate mitigation policies, assessing the potential of climate adaptation strategies and quantifying the impacts of land cover change on the climate system. Here, we identify and quantify uncertainties in global and European land cover projections over a diverse range of model types and scenarios, extending the analysis beyond the agro-economic models included in previous comparisons. The results from 75 simulations over 18 models are analysed and show a large range in land cover area projections, with the highest variability occurring in future cropland areas. We demonstrate systematic differences in land cover areas associated with the characteristics of the modelling approach, which is at least as great as the differences attributed to the scenario variations. The results lead us to conclude that a higher degree of uncertainty exists in land use projections than currently included in climate or earth system projections. To account for land use uncertainty, it is recommended to use a diverse set of models and approaches when assessing the potential impacts of land cover change on future climate. Additionally, further work is needed to better understand the assumptions driving land use model results and reveal the causes of uncertainty in more depth, to help reduce model uncertainty and improve the projections of land cover. © 2016 John Wiley & Sons Ltd.

  3. Integrating info-gap decision theory with robust population management: a case study using the Mountain Plover.

    PubMed

    van der Burg, Max Post; Tyre, Andrew J

    2011-01-01

    Wildlife managers often make decisions under considerable uncertainty. In the most extreme case, a complete lack of data leads to uncertainty that is unquantifiable. Information-gap decision theory deals with assessing management decisions under extreme uncertainty, but it is not widely used in wildlife management. Similarly, robust population management methods were developed to deal with uncertainties in multiple model parameters. However, the two methods have not yet been used in tandem to assess population management decisions. We provide a novel combination of the robust population management approach for matrix models with the information-gap decision theory framework for making conservation decisions under extreme uncertainty. We applied our model to the problem of nest survival management in an endangered bird species, the Mountain Plover (Charadrius montanus). Matrix sensitivities suggested that nest management is unlikely to have a strong effect on population growth rate, confirming previous analyses. However, given the amount of uncertainty about adult and juvenile survival, our analysis suggested that maximizing nest marking effort was the more robust decision for maintaining a stable population. Focusing on the twin concepts of opportunity and robustness in an information-gap model provides a useful method of assessing conservation decisions under extreme uncertainty.
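
    A minimal info-gap sketch under invented parameters (not the plover study's model): for a simple two-stage growth rate lambda = F * s_j + s_a, the robustness of a decision is the largest fractional uncertainty in the survival rates under which the worst case still yields lambda >= 1.

    ```python
    import numpy as np

    def growth_rate(F, s_j, s_a):
        # F: fecundity term (depends on the nest-marking decision),
        # s_j, s_a: juvenile and adult survival
        return F * s_j + s_a

    def robustness(F, s_j0, s_a0, alphas=np.linspace(0, 0.5, 501)):
        """Largest alpha such that the worst case (both survivals reduced
        by the fraction alpha) still gives a non-declining population."""
        best = 0.0
        for a in alphas:
            if growth_rate(F, s_j0 * (1 - a), s_a0 * (1 - a)) >= 1.0:
                best = a
        return best

    # Hypothetical decisions: no nest marking vs maximal marking effort
    for label, F in (("no marking", 1.9), ("max marking", 2.3)):
        print(label, "robustness:", robustness(F, s_j0=0.35, s_a0=0.55))
    ```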

  4. Quantification for complex assessment: uncertainty estimation in final year project thesis assessment

    NASA Astrophysics Data System (ADS)

    Kim, Ho Sung

    2013-12-01

    A quantitative method was developed for estimating the expected uncertainty (reliability and validity) of assessment results arising from the interplay between four variables, viz. examiner's expertise, examinee's achieved expertise, assessment task difficulty and examinee's performance, for complex assessment applicable to final-year project thesis assessment, including peer assessment. The method can generate a guide map for finding expected uncertainties prior to implementing the assessment with a given set of variables. It employs a scale for visualisation of expertise levels, the derivation of which is based on quantified clarities of mental images for the levels of the examiner's expertise and the examinee's achieved expertise. To identify the relevant expertise areas that depend on the complexity of the assessment format, a graphical continuum model was developed. The continuum model consists of the assessment task, assessment standards and criterion for the transition towards complex assessment, owing to the relativity between implicitness and explicitness, and is capable of identifying the areas of expertise required for scale development.

  5. Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BABA,T.; ISHIGURO,K.; ISHIHARA,Y.

    1999-08-30

    Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference case and many alternative cases representing various groups of FEPs were defined, and individual numerical simulations were performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters and a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment.

  6. Verifying and Validating Simulation Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemez, Francois M.

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state of knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack of knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance for assessing the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  7. Trajectory analysis of land use and land cover maps to improve spatial-temporal patterns, and impact assessment on groundwater recharge

    NASA Astrophysics Data System (ADS)

    Zomlot, Z.; Verbeiren, B.; Huysmans, M.; Batelaan, O.

    2017-11-01

    Land use/land cover (LULC) change is a consequence of human-induced global environmental change. It is also considered one of the major factors affecting groundwater recharge. Uncertainties and inconsistencies in LULC maps are among the difficulties that LULC timeseries analyses face, and they have a significant effect on hydrological impact analysis. Therefore, an accuracy assessment approach for LULC timeseries is needed for more reliable hydrological analysis and prediction. The objective of this paper is to assess the impact of land use uncertainty and to improve the accuracy of a timeseries of CORINE (coordination of information on the environment) land cover maps by using a new approach that identifies spatial-temporal LULC change trajectories as a pre-processing tool. This ensures consistency of model input when dealing with land-use dynamics and as such improves the accuracy of land use maps and consequently of groundwater recharge estimation. As a case study, the impact of consistent land use changes from 1990 until 2013 on groundwater recharge for the Flanders-Brussels region is assessed. The change trajectory analysis successfully assigned a rational trajectory to 99% of all pixels. The methodology is shown to be powerful in correcting interpretation inconsistencies and overestimation errors in CORINE land cover maps. The overall kappa (cell-by-cell map comparison) improved from 0.6 to 0.8 and from 0.2 to 0.7 for the forest and pasture land use classes, respectively. The study shows that the inconsistencies in the land use maps introduce an uncertainty in groundwater recharge estimation in the range of 10-30%. The analysis showed that during the period 1990-2013 the LULC changes were mainly driven by urban expansion. The results show that the resolution at which the spatial analysis is performed is important; the recharge differences using original and corrected CORINE land cover maps increase considerably with increasing spatial resolution. This study indicates that improving the consistency of land use map timeseries is of critical importance for assessing land use change and its environmental impact.
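
    The overall kappa used for the cell-by-cell map comparison is computed from a confusion matrix as (po - pe)/(1 - pe); the matrix below is invented for illustration.

    ```python
    import numpy as np

    # Invented two-class confusion matrix: rows = map A, columns = map B
    cm = np.array([[80, 10],
                   [ 5, 55]], dtype=float)

    n = cm.sum()
    po = np.trace(cm) / n                                 # observed agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2   # chance agreement
    kappa = (po - pe) / (1 - pe)
    print(f"kappa = {kappa:.2f}")
    ```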

  8. Dispelling urban myths about default uncertainty factors in chemical risk assessment – sufficient protection against mixture effects?

    PubMed Central

    2013-01-01

    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we expose that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can account both for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data and recommend adopting a pragmatic, but scientifically better founded approach to mixture risk assessment. PMID:23816180

  9. Dispelling urban myths about default uncertainty factors in chemical risk assessment--sufficient protection against mixture effects?

    PubMed

    Martin, Olwenn V; Martin, Scholze; Kortenkamp, Andreas

    2013-07-01

    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we expose that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can account both for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data and recommend adopting a pragmatic, but scientifically better founded approach to mixture risk assessment.
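
    The factor arithmetic discussed in the two records above can be sketched directly, together with a probabilistic variant showing how the outcome depends on the assumed sub-factor distributions; the NOAEL and the spread parameters below are invented.

    ```python
    import numpy as np

    # Deterministic default: interspecies (10x) times intraspecies (10x)
    noael = 5.0          # mg/kg bw/day from an animal study (invented)
    print("deterministic ADI:", noael / (10 * 10), "mg/kg bw/day")

    # Probabilistic variant: four sub-factors (toxicokinetic/toxicodynamic
    # halves of each 10x factor), each lognormal with median 10**0.5. The
    # chosen spread (sigma) strongly affects the lower-percentile 'safe' dose.
    rng = np.random.default_rng(7)
    for sigma in (0.2, 0.4):
        subs = rng.lognormal(mean=np.log(10**0.5), sigma=sigma, size=(100000, 4))
        adi_dist = noael / subs.prod(axis=1)
        print(f"sigma={sigma}: 5th percentile ADI = "
              f"{np.percentile(adi_dist, 5):.4f} mg/kg bw/day")
    ```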

  10. LICARA nanoSCAN - A tool for the self-assessment of benefits and risks of nanoproducts.

    PubMed

    van Harmelen, Toon; Zondervan-van den Beuken, Esther K; Brouwer, Derk H; Kuijpers, Eelco; Fransman, Wouter; Buist, Harrie B; Ligthart, Tom N; Hincapié, Ingrid; Hischier, Roland; Linkov, Igor; Nowack, Bernd; Studer, Jennifer; Hilty, Lorenz; Som, Claudia

    2016-05-01

    The fast penetration of nanoproducts on the market under conditions of significant uncertainty of their environmental properties and risks to humans creates a need for companies to assess sustainability of their products. Evaluation of the potential benefits and risks to build a coherent story for communication with clients, authorities, consumers, and other stakeholders is getting to be increasingly important, but SMEs often lack the knowledge and expertise to assess risks and communicate them appropriately. This paper introduces LICARA nanoSCAN, a modular web based tool that supports SMEs in assessing benefits and risks associated with new or existing nanoproducts. This tool is unique because it is scanning both the benefits and risks over the nanoproducts life cycle in comparison to a reference product with a similar functionality in order to enable the development of sustainable and competitive nanoproducts. SMEs can use data and expert judgment to answer mainly qualitative and semi-quantitative questions as a part of tool application. Risks to public, workers and consumers are assessed, while the benefits are evaluated for economic, environmental and societal opportunities associated with the product use. The tool provides an easy way to visualize results as well as to identify gaps, missing data and associated uncertainties. The LICARA nanoSCAN has been positively evaluated by several companies and was tested in a number of case studies. The tool helps to develop a consistent and comprehensive argument on the weaknesses and strengths of a nanoproduct that may be valuable for the communication with authorities, clients and among stakeholders in the value chain. LICARA nanoSCAN identifies areas for more detailed assessments, product design improvement or application of risk mitigation measures. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Future sea level rise constrained by observations and long-term commitment.

    PubMed

    Mengel, Matthias; Levermann, Anders; Frieler, Katja; Robinson, Alexander; Marzeion, Ben; Winkelmann, Ricarda

    2016-03-08

    Sea level has been steadily rising over the past century, predominantly due to anthropogenic climate change. The rate of sea level rise will keep increasing with continued global warming, and, even if temperatures are stabilized through the phasing out of greenhouse gas emissions, sea level is still expected to rise for centuries. This will affect coastal areas worldwide, and robust projections are needed to assess mitigation options and guide adaptation measures. Here we combine the equilibrium response of the main sea level rise contributions with their last century's observed contribution to constrain projections of future sea level rise. Our model is calibrated to a set of observations for each contribution, and the observational and climate uncertainties are combined to produce uncertainty ranges for 21st century sea level rise. We project anthropogenic sea level rise of 28-56 cm, 37-77 cm, and 57-131 cm in 2100 for the greenhouse gas concentration scenarios RCP26, RCP45, and RCP85, respectively. Our uncertainty ranges for total sea level rise overlap with the process-based estimates of the Intergovernmental Panel on Climate Change. The "constrained extrapolation" approach generalizes earlier global semiempirical models and may therefore lead to a better understanding of the discrepancies with process-based projections.

  12. Future sea level rise constrained by observations and long-term commitment

    PubMed Central

    Mengel, Matthias; Levermann, Anders; Frieler, Katja; Robinson, Alexander; Marzeion, Ben; Winkelmann, Ricarda

    2016-01-01

    Sea level has been steadily rising over the past century, predominantly due to anthropogenic climate change. The rate of sea level rise will keep increasing with continued global warming, and, even if temperatures are stabilized through the phasing out of greenhouse gas emissions, sea level is still expected to rise for centuries. This will affect coastal areas worldwide, and robust projections are needed to assess mitigation options and guide adaptation measures. Here we combine the equilibrium response of the main sea level rise contributions with their last century's observed contribution to constrain projections of future sea level rise. Our model is calibrated to a set of observations for each contribution, and the observational and climate uncertainties are combined to produce uncertainty ranges for 21st century sea level rise. We project anthropogenic sea level rise of 28–56 cm, 37–77 cm, and 57–131 cm in 2100 for the greenhouse gas concentration scenarios RCP26, RCP45, and RCP85, respectively. Our uncertainty ranges for total sea level rise overlap with the process-based estimates of the Intergovernmental Panel on Climate Change. The “constrained extrapolation” approach generalizes earlier global semiempirical models and may therefore lead to a better understanding of the discrepancies with process-based projections. PMID:26903648

  13. TSUNAMI Primer: A Primer for Sensitivity/Uncertainty Calculations with SCALE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Mueller, Don; Bowman, Stephen M

    2009-01-01

    This primer presents examples of the application of the SCALE/TSUNAMI tools to generate keff sensitivity data for one- and three-dimensional models using TSUNAMI-1D and -3D, and to examine uncertainties in the computed keff values due to uncertainties in the cross-section data used in their calculation. The proper use of unit cell data and the need to confirm the appropriate selection of input parameters through direct perturbations are described. The uses of sensitivity and uncertainty data to identify and rank potential sources of computational bias in an application system, and of TSUNAMI tools for assessment of system similarity using sensitivity and uncertainty criteria, are demonstrated. Uses of these criteria in trending analyses to assess computational biases, bias uncertainties, and gap analyses are also described. Additionally, an application of the data adjustment tool TSURFER is provided, including identification of specific details of sources of computational bias.
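
    The direct-perturbation check mentioned above can be sketched as a central-difference sensitivity coefficient; `run_keff` below is a hypothetical placeholder standing in for a transport calculation, not a SCALE interface.

    ```python
    def run_keff(sigma_scale: float) -> float:
        # Placeholder for a transport run in which one cross section is
        # scaled by `sigma_scale`; the response below is invented.
        return 1.0023 * (sigma_scale ** -0.15)

    d = 0.01  # +/- 1% perturbation of the cross section
    k0, kp, km = run_keff(1.0), run_keff(1.0 + d), run_keff(1.0 - d)
    sens = ((kp - km) / k0) / (2 * d)   # S = (dk/k) / (dsigma/sigma)
    print(f"direct-perturbation sensitivity: {sens:.3f}")  # ~ -0.15 here
    ```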

  14. Uncertainty analysis on simple mass balance model to calculate critical loads for soil acidity.

    PubMed

    Li, Harbin; McNulty, Steven G

    2007-10-01

    Simple mass balance equations (SMBE) of critical acid loads (CAL) in forest soil were developed to assess potential risks of air pollutants to ecosystems. However, to apply SMBE reliably at large scales, SMBE must be tested for adequacy and uncertainty. Our goal was to provide a detailed analysis of uncertainty in SMBE so that sound strategies for scaling up CAL estimates to the national scale could be developed. Specifically, we wanted to quantify CAL uncertainty under natural variability in 17 model parameters, and determine their relative contributions in predicting CAL. Results indicated that uncertainty in CAL came primarily from components of base cation weathering (BC(w); 49%) and acid neutralizing capacity (46%), whereas the most critical parameters were BC(w) base rate (62%), soil depth (20%), and soil temperature (11%). Thus, improvements in estimates of these factors are crucial to reducing uncertainty and successfully scaling up SMBE for national assessments of CAL.
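
    A hedged sketch of such a Monte Carlo analysis on a simplified mass balance, CAL = BCdep + BCw - Bcu - ANCle, with contribution-to-variance shares approximated by squared correlation coefficients; all distributions are invented, not the study's.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 50000

    # Invented input distributions (keq/ha/yr)
    bc_dep = rng.normal(0.60, 0.05, n)             # base cation deposition
    bc_w   = rng.lognormal(np.log(1.0), 0.40, n)   # base cation weathering
    bc_u   = rng.normal(0.30, 0.05, n)             # base cation uptake
    anc_le = rng.normal(0.20, 0.10, n)             # critical ANC leaching

    cal = bc_dep + bc_w - bc_u - anc_le

    inputs = {"BCdep": bc_dep, "BCw": bc_w, "Bcu": bc_u, "ANCle": anc_le}
    r2 = {k: np.corrcoef(v, cal)[0, 1] ** 2 for k, v in inputs.items()}
    total = sum(r2.values())
    for name, val in r2.items():
        print(f"{name}: ~{val / total:.0%} of CAL variance")
    ```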

  15. Assessing the inherent uncertainty of one-dimensional diffusions

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Cohen, Morrel H.

    2013-01-01

    In this paper we assess the inherent uncertainty of one-dimensional diffusion processes via a stochasticity classification which provides an à la Mandelbrot categorization into five states of uncertainty: infra-mild, mild, borderline, wild, and ultra-wild. Two settings are considered. (i) Stopped diffusions: the diffusion initiates from a high level and is stopped once it first reaches a low level; in this setting we analyze the inherent uncertainty of the diffusion's maximal exceedance above its initial high level. (ii) Stationary diffusions: the diffusion is in dynamical statistical equilibrium; in this setting we analyze the inherent uncertainty of the diffusion's equilibrium level. In both settings general closed-form analytic results are established, and their application is exemplified by stock prices in the stopped-diffusions setting, and by interest rates in the stationary-diffusions setting. These results provide a highly implementable decision-making tool for the classification of uncertainty in the context of one-dimensional diffusions.

  16. Epistemic uncertainties and natural hazard risk assessment - Part 1: A review of the issues

    NASA Astrophysics Data System (ADS)

    Beven, K. J.; Aspinall, W. P.; Bates, P. D.; Borgomeo, E.; Goda, K.; Hall, J. W.; Page, T.; Phillips, J. C.; Rougier, J. T.; Simpson, M.; Stephenson, D. B.; Smith, P. J.; Wagener, T.; Watson, M.

    2015-12-01

    Uncertainties in natural hazard risk assessment are generally dominated by the sources arising from lack of knowledge or understanding of the processes involved. There is a lack of knowledge about frequencies, process representations, parameters, present and future boundary conditions, consequences and impacts, and the meaning of observations in evaluating simulation models. These are the epistemic uncertainties that can be difficult to constrain, especially in terms of event or scenario probabilities, even as elicited probabilities rationalized on the basis of expert judgements. This paper reviews the issues raised by trying to quantify the effects of epistemic uncertainties. Such scientific uncertainties might have significant influence on decisions that are made for risk management, so it is important to communicate the meaning of an uncertainty estimate and to provide an audit trail of the assumptions on which it is based. Some suggestions for good practice in doing so are made.

  17. Analyzing the Effects of Stellar Evolution on White Dwarf Ages

    NASA Astrophysics Data System (ADS)

    Moss, Adam; Von Hippel, Ted, Dr.

    2018-01-01

    White dwarfs are among the oldest objects in our Galaxy, thus if we can determine their ages, we can derive the star formation history of our Galaxy. As part of a larger project that will use Gaia parallaxes to derive the ages of tens of thousands of white dwarfs, we explore the impact on the total white dwarf age of various modern models of main sequence and red giant branch stellar evolution, as well as uncertainties in progenitor metallicity. In addition, we study the effect on white dwarf ages caused by uncertainties in the Initial Final Mass Relation, which is the mapping between zero age main sequence and white dwarf masses. We find that for old and high mass white dwarfs, uncertainties in these factors have little effect on the total white dwarf age.

  18. High-resolution Fourier transform measurements of air-induced broadening and shift coefficients in the 0002-0000 main isotopologue band of nitrous oxide

    NASA Astrophysics Data System (ADS)

    Werwein, Viktor; Li, Gang; Serdyukov, Anton; Brunzendorf, Jens; Werhahn, Olav; Ebert, Volker

    2018-06-01

    In the present study, we report highly accurate air-induced broadening and shift coefficients for the nitrous oxide (N2O) 0002-0000 band at 2.26 μm of the main isotopologue, retrieved from high-resolution Fourier transform infrared (FTIR) measurements with metrologically determined pressure, temperature, absorption path length and chemical composition. Most of our retrieved air-broadening coefficients agree with previously generated datasets within the expanded (95% confidence interval) uncertainties. For the air-shift coefficients, our results suggest a different rotational dependence than reported in the literature. The present study benefits from improved measurement conditions and a detailed metrological uncertainty description. Compared with the literature, the uncertainties of the broadening and shift coefficients are improved by factors of up to 39 and up to 22, respectively.

  19. Assessment of uncertainties in radiation-induced cancer risk predictions at clinically relevant doses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, J.; Moteabbed, M.; Paganetti, H., E-mail: hpaganetti@mgh.harvard.edu

    2015-01-15

    Purpose: Theoretical dose–response models offer the possibility to assess second cancer induction risks after external beam therapy. The parameters used in these models are determined with limited data from epidemiological studies. Risk estimations are thus associated with considerable uncertainties. This study aims to illustrate these uncertainties when predicting the risk of organ-specific second cancers in the primary radiation field, using selected treatment plans for brain cancer patients. Methods: A widely used risk model was considered in this study. The uncertainties of the model parameters were estimated with reported data on second cancer incidences for various organs. Standard error propagation was then applied to assess the uncertainty in the risk model. Next, second cancer risks of five pediatric patients treated for cancer in the head and neck regions were calculated. For each case, treatment plans for proton and photon therapy were designed to estimate the uncertainties (a) in the lifetime attributable risk (LAR) for a given treatment modality and (b) when comparing risks of two different treatment modalities. Results: Uncertainties in excess of 100% of the risk were found for almost all organs considered. When applied to treatment plans, the calculated LAR values have uncertainties of the same magnitude. A comparison between cancer risks of different treatment modalities, however, does allow statistically significant conclusions. In the studied cases, the patient-averaged LAR ratio of proton and photon treatments was 0.35, 0.56, and 0.59 for brain carcinoma, brain sarcoma, and bone sarcoma, respectively. Their corresponding uncertainties were estimated to be potentially below 5%, depending on uncertainties in dosimetry. Conclusions: The uncertainty in the dose–response curve in cancer risk models makes it currently impractical to predict the risk for an individual external beam treatment. On the other hand, the ratio of absolute risks between two modalities is less sensitive to the uncertainties in the risk model and can provide statistically significant estimates.
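
    Why the modality ratio is so much tighter than the absolute risks can be sketched with a shared (correlated) dose-response factor that cancels in the ratio; the magnitudes below are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 100000

    # Shared dose-response model uncertainty affects both modalities equally
    model_factor = rng.lognormal(0.0, 0.8, n)
    lar_photon = 2.0e-2 * model_factor * rng.lognormal(0, 0.05, n)
    lar_proton = 0.7e-2 * model_factor * rng.lognormal(0, 0.05, n)

    ratio = lar_proton / lar_photon   # shared factor cancels
    print("absolute LAR rel. sd:", np.std(lar_photon) / np.mean(lar_photon))
    print("LAR ratio rel. sd:   ", np.std(ratio) / np.mean(ratio))
    ```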

  20. Uncertainties in climate change projections for viticulture in Portugal

    NASA Astrophysics Data System (ADS)

    Fraga, Helder; Malheiro, Aureliano C.; Moutinho-Pereira, José; Pinto, Joaquim G.; Santos, João A.

    2013-04-01

    The assessment of climate change impacts on viticulture is often carried out using regional climate model (RCM) outputs. These studies rely on either multi-model ensembles or single-model approaches. RCM ensembles account for uncertainties inherent to the different models. In this study, using a 16-RCM ensemble under the IPCC A1B scenario, the climate change signal (future minus recent past, 2041-2070 minus 1961-2000) of 4 bioclimatic indices (Huglin Index - HI, Dryness Index - DI, Hydrothermal Index - HyI and Composite Index - CompI) over mainland Portugal is analysed. A normalized interquartile range (NIQR) of the 16-member ensemble for each bioclimatic index is assessed in order to quantify the ensemble uncertainty. The results show significant increases in HI over most of Portugal, with higher values in the Alentejo, Trás-os-Montes and Douro/Porto wine regions, also depicting very low uncertainty. Conversely, the decreases in the DI pattern throughout the country show large uncertainties, except in Minho (northwestern Portugal), where precipitation reaches the highest amounts in Portugal. The HyI shows significant decreases in northwestern Portugal, with relatively low uncertainty all across the country. The CompI depicts significant decreases over Alentejo and increases over Minho, though the decreases over Alentejo reveal high uncertainty, while the increases over Minho show low uncertainty. The assessment of uncertainty in climate change projections is of great relevance for the wine industry. Quantifying this uncertainty is crucial, since different models may lead to quite different outcomes and may thereby be as crucial as climate change itself to the winemaking sector. This work is supported by European Union Funds (FEDER/COMPETE - Operational Competitiveness Programme) and by national funds (FCT - Portuguese Foundation for Science and Technology) under the project FCOMP-01-0124-FEDER-022692.
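
    A small sketch of the ensemble-uncertainty measure: the interquartile range of the 16 climate change signals, normalized here by the ensemble median (the normalization choice is an assumption); the member signals are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    # Invented change signal (e.g. change in Huglin Index) for 16 RCM members
    signal = rng.normal(250.0, 40.0, 16)

    q25, q50, q75 = np.percentile(signal, [25, 50, 75])
    niqr = (q75 - q25) / abs(q50)
    print(f"ensemble median change = {q50:.0f}, NIQR = {niqr:.2f}")
    ```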

  1. Uncertainty assessment of a model for biological nitrogen and phosphorus removal: Application to a large wastewater treatment plant

    NASA Astrophysics Data System (ADS)

    Mannina, Giorgio; Cosenza, Alida; Viviani, Gaspare

    In the last few years, the use of mathematical models of WasteWater Treatment Plant (WWTP) processes has become a common way to predict WWTP behaviour. However, mathematical models generally demand extensive input data that must be obtained through data-gathering campaigns which cannot always be carried out. This fact, together with the intrinsic complexity of the model structure, leads to model results that may be very uncertain, so quantification of the uncertainty is imperative. Despite its importance, however, only a few uncertainty studies have been carried out in the wastewater treatment field, and those studies included only some of the sources of model uncertainty. To help develop this area, the paper presents the uncertainty assessment of a mathematical model simulating biological nitrogen and phosphorus removal. The uncertainty assessment was conducted according to the Generalised Likelihood Uncertainty Estimation (GLUE) methodology, which has rarely been applied in the wastewater field. The model was based on Activated Sludge Models 1 (ASM1) and 2 (ASM2). Among the different approaches available for uncertainty analysis, the GLUE methodology requires a large number of Monte Carlo simulations in which random samples of individual parameters drawn from probability distributions are used to determine sets of parameter values. Using this approach, model reliability was evaluated based on its capacity to globally limit the uncertainty. The method was applied to a large full-scale WWTP for which quantity and quality data were gathered. The analysis yielded useful insights for WWTP modelling, identifying the crucial aspects where uncertainty is highest and where, therefore, more effort should be invested in both data gathering and modelling practice.
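
    As an illustration of the GLUE procedure described above, the following minimal Python sketch applies the method to a hypothetical first-order decay model standing in for the ASM1/ASM2-based WWTP model; the prior ranges, the Nash-Sutcliffe likelihood measure, and the 0.5 behavioural threshold are illustrative assumptions, not values from the study:

      import numpy as np

      rng = np.random.default_rng(42)

      def model(theta, t):
          # Placeholder process model (first-order decay), standing in for
          # the far more complex ASM1/ASM2-based WWTP model.
          k, c0 = theta
          return c0 * np.exp(-k * t)

      t = np.linspace(0.0, 10.0, 25)
      observed = model((0.3, 5.0), t) + rng.normal(0.0, 0.2, t.size)

      # 1. Monte Carlo sampling of parameter sets from (uniform) priors.
      n = 20000
      thetas = np.column_stack([rng.uniform(0.05, 1.0, n),    # decay rate k
                                rng.uniform(1.0, 10.0, n)])   # initial conc. c0

      # 2. Informal likelihood (Nash-Sutcliffe efficiency) per parameter set.
      sims = np.array([model(th, t) for th in thetas])
      sse = ((sims - observed) ** 2).sum(axis=1)
      nse = 1.0 - sse / ((observed - observed.mean()) ** 2).sum()

      # 3. Retain "behavioural" sets above a threshold, renormalize weights.
      keep = nse > 0.5
      weights = nse[keep] / nse[keep].sum()

      # 4. Likelihood-weighted 5-95% uncertainty bounds for the prediction.
      def weighted_quantile(values, q, w):
          order = np.argsort(values)
          cum = np.cumsum(w[order])
          return np.interp(q, cum / cum[-1], values[order])

      lower = np.array([weighted_quantile(sims[keep][:, j], 0.05, weights)
                        for j in range(t.size)])
      upper = np.array([weighted_quantile(sims[keep][:, j], 0.95, weights)
                        for j in range(t.size)])
      print(f"{keep.sum()} behavioural sets; mean band width "
            f"{(upper - lower).mean():.2f}")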

  2. Uncertainty characterization and quantification in air pollution models. Application to the ADMS-Urban model.

    NASA Astrophysics Data System (ADS)

    Debry, E.; Malherbe, L.; Schillinger, C.; Bessagnet, B.; Rouil, L.

    2009-04-01

    Evaluation of human exposure to atmospheric pollution usually requires knowledge of pollutant concentrations in ambient air. In the framework of the PAISA project, which studies the influence of socio-economic status on relationships between air pollution and short-term health effects, the concentrations of gas and particle pollutants are computed over Strasbourg with the ADMS-Urban model. As for any modeling result, simulated concentrations come with uncertainties which have to be characterized and quantified. There are several sources of uncertainty: those related to input data and parameters, i.e. the fields used to run the model, such as meteorology, boundary conditions and emissions; those related to the model formulation, because of incomplete or inaccurate treatment of dynamical and chemical processes; and those inherent to the stochastic behavior of the atmosphere and of human activities [1]. Our aim here is to assess the uncertainties of the simulated concentrations with respect to input data and model parameters. The first step was therefore to identify the input data and model parameters that contribute most to the space and time variability of the predicted concentrations. Concentrations of several pollutants were simulated for two months in winter 2004 and two months in summer 2004 over five areas of Strasbourg. The sensitivity analysis shows the dominating influence of boundary conditions and emissions. Among model parameters, the roughness and Monin-Obukhov lengths appear to have non-negligible local effects. Dry deposition is also an important dynamic process. The second step consists of assigning a probability distribution to each input datum and model parameter and propagating the joint distribution through the model, so as to associate a probability distribution with the modeled concentrations. Several analytical and numerical methods exist to perform such an uncertainty analysis. We chose the Monte Carlo method, which has already been applied to atmospheric dispersion models [2, 3, 4]. Its main advantage is that it is insensitive to the number of perturbed parameters; its drawbacks are its computational cost and slow convergence. To speed up convergence, we used the method of antithetic variates (sketched below), which exploits the symmetry of the probability laws. The air quality model simulations were carried out by the Association for the Study and Monitoring of Atmospheric Pollution in Alsace (ASPA). The output concentration distributions can then be updated with a Bayesian method. This work is part of an INERIS research project also aiming at assessing the uncertainty of the CHIMERE dispersion model used in the Prev'Air forecasting platform (www.prevair.org), in order to deliver more accurate predictions. (1) Rao, K.S. Uncertainty Analysis in Atmospheric Dispersion Modeling, Pure and Applied Geophysics, 2005, 162, 1893-1917. (2) Beekmann, M. and Derognat, C. Monte Carlo uncertainty analysis of a regional-scale transport chemistry model constrained by measurements from the Atmospheric Pollution Over the PAris Area (ESQUIF) campaign, Journal of Geophysical Research, 2003, 108, 8559-8576. (3) Hanna, S.R., Lu, Z., Frey, H.C., Wheeler, N., Vukovich, J., Arunachalam, S., Fernau, M. and Hansen, D.A. Uncertainties in predicted ozone concentrations due to input uncertainties for the UAM-V photochemical grid model applied to the July 1995 OTAG domain, Atmospheric Environment, 2001, 35, 891-903. (4) Romanowicz, R., Higson, H. and Teasdale, I. Bayesian uncertainty estimation methodology applied to air pollution modelling, Environmetrics, 2000, 11, 351-371.
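
    A minimal sketch of the antithetic-variates idea mentioned above: each standard normal draw z is paired with its mirror image -z, so over- and under-estimates partially cancel and the Monte Carlo mean converges faster for the same number of model runs. The smooth one-input response function is a hypothetical stand-in for a dispersion model run:

      import numpy as np

      rng = np.random.default_rng(1)

      def response(z):
          # Hypothetical smooth response of a dispersion model to a perturbed
          # input (e.g., an emission factor); a stand-in for a full model run.
          return np.exp(0.5 * z)

      n = 5000
      plain = response(rng.standard_normal(2 * n))    # 2n independent draws

      z = rng.standard_normal(n)
      anti = 0.5 * (response(z) + response(-z))       # n antithetic pairs

      truth = np.exp(0.125)  # E[exp(0.5 Z)] = exp(0.5**2 / 2) for Z ~ N(0, 1)
      print(f"plain MC mean:     {plain.mean():.4f} "
            f"(se {plain.std(ddof=1) / np.sqrt(2 * n):.4f})")
      print(f"antithetic mean:   {anti.mean():.4f} "
            f"(se {anti.std(ddof=1) / np.sqrt(n):.4f})")
      print(f"exact expectation: {truth:.4f}")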

  3. Uncertainty analysis of thermal quantities measurement in a centrifugal compressor

    NASA Astrophysics Data System (ADS)

    Hurda, Lukáš; Matas, Richard

    2017-09-01

    The process of evaluating compressor performance characteristics, based on the measurement of pressure, temperature and other quantities, is examined to find the uncertainties of directly measured and derived quantities. CFD is used as a tool to quantify the influence of different sources of measurement uncertainty for single- and multi-thermocouple total temperature probes. The heat conduction through the body of the thermocouple probe and the heat-up of the air in the intake piping are the main phenomena of interest.

  4. Quantifying and Reducing Uncertainty in Correlated Multi-Area Short-Term Load Forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Yannan; Hou, Zhangshuan; Meng, Da

    2016-07-17

    In this study, we represent and reduce the uncertainties in short-term electric load forecasting by integrating time series analysis tools including ARIMA modeling, sequential Gaussian simulation, and principal component analysis. The approaches focus mainly on maintaining the inter-dependency between multiple geographically related areas, and are applied to cross-correlated load time series as well as to their forecast errors. Multiple short-term prediction realizations are then generated from the reduced uncertainty ranges, which are useful for power system risk analyses.
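
    The key step in the approach above is generating forecast-error realizations that preserve the cross-correlation between areas. A hedged sketch, using Cholesky sampling of the estimated error covariance as a simple stand-in for the paper's sequential Gaussian simulation; the three-area data are synthetic:

      import numpy as np

      rng = np.random.default_rng(7)

      # Synthetic forecast errors for 3 geographically related areas.
      n = 1000
      true_corr = np.array([[1.0, 0.8, 0.6],
                            [0.8, 1.0, 0.7],
                            [0.6, 0.7, 1.0]])
      errors = rng.standard_normal((n, 3)) @ np.linalg.cholesky(true_corr).T

      # Estimate the cross-area covariance, then draw new joint realizations
      # that maintain the inter-area dependency.
      cov = np.cov(errors, rowvar=False)
      L = np.linalg.cholesky(cov)
      realizations = rng.standard_normal((500, 3)) @ L.T  # 500 joint scenarios

      print(np.round(np.corrcoef(realizations, rowvar=False), 2))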

  5. Quantification of uncertainty in flood risk assessment for flood protection planning: a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel

    2017-04-01

    Flood risk estimates are subject to significant uncertainties, e.g. due to limited records of historic flood events, uncertainty in flood modeling, the uncertain impact of climate change, or uncertainty in exposure and loss estimates. In traditional design of flood protection systems, these uncertainties are typically accounted for only implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for planning flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, incomplete ensembles, and local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties may be reduced in the future when new evidence - such as new climate models, observed extreme events, and socio-economic data - becomes available. Finally, we look into how this new evidence influences the risk assessment and the effectiveness of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria, including the city of Rosenheim, which suffered significant losses during the 2013 flood event.
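
    Of the five uncertainty components listed above, component 1 (statistical uncertainty from a limited record) is the simplest to illustrate. A minimal sketch, assuming a Gumbel flood-frequency model and a synthetic 40-year record (not the Mangfall data), uses a bootstrap to put an interval on the 100-year flood:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      # Synthetic 40-year annual-maximum record (illustrative only).
      record = stats.gumbel_r.rvs(loc=100.0, scale=30.0, size=40,
                                  random_state=rng)

      def q100(sample):
          # 100-year flood quantile from a Gumbel fit to the sample.
          loc, scale = stats.gumbel_r.fit(sample)
          return stats.gumbel_r.ppf(1.0 - 1.0 / 100.0, loc, scale)

      boot = np.array([q100(rng.choice(record, record.size, replace=True))
                       for _ in range(2000)])
      print(f"100-yr flood estimate: {q100(record):.0f}; "
            f"90% interval [{np.percentile(boot, 5):.0f}, "
            f"{np.percentile(boot, 95):.0f}]")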

  6. Assessment of the Broadleaf Crops Leaf Area Index Product from the Terra MODIS Instrument

    NASA Technical Reports Server (NTRS)

    Tan, Bin; Hu, Jiannan; Huang, Dong; Yang, Wenze; Zhang, Ping; Shabanov, Nikolay V.; Knyazikhin, Yuri; Nemani, Ramakrishna R.; Myneni, Ranga B.

    2005-01-01

    The first significant processing of Terra MODIS data, called Collection 3, covered the period from November 2000 to December 2002. The Collection 3 leaf area index (LAI) and fraction of photosynthetically active radiation absorbed by vegetation (FPAR) products for broadleaf crops exhibited three anomalies: (a) high LAI values during the peak growing season, (b) differences in LAI seasonality between the radiative-transfer-based main algorithm and the vegetation-index-based back-up algorithm, and (c) too few retrievals from the main algorithm during the summer period when the crops are at full flush. The cause of these anomalies is a mismatch between reflectances modeled by the algorithm and MODIS measurements. Therefore, the Look-Up Tables accompanying the algorithm were revised and implemented in Collection 4 processing. The main algorithm with the revised Look-Up Tables generated retrievals for over 80% of the pixels with valid data. Retrievals from the back-up algorithm, although few, should be used with caution, as they are generated from surface reflectances with high uncertainties.

  7. Optimal Timing of Oceanic, Geological and Biological Carbon Sequestration to Safeguard Climate

    NASA Astrophysics Data System (ADS)

    Gitz, V.; Ambrosi, P.; Ciais, P.; Orr, J.; Magne, B.; Hourcade, J.

    2005-12-01

    We address the issue of safeguarding climate in the presence of a cascade of uncertainties through a portfolio of mitigation options: emissions reductions (M), biological carbon sequestration (BCS), and carbon capture and storage - both geological (GCS) and oceanic (OCS). Within a sequential decision framework (i.e. as uncertainties are progressively resolved with time), we use a global optimal control model, RESPONSE, to examine the relative advantages of the three sequestration options in lowering fossil fuel abatement expenditures. Moreover, we show to what extent these options offer additional flexibility for short- and long-term decisions given uncertainties on climate sensitivity and on "safe" climate targets. To do so, we compute the value of information regarding these uncertainties and assess the timeliness of learning (i.e. which uncertainty is more "urgent" to resolve). Finally, we show to what extent short-term optimal paths of fossil emissions abatement and carbon sequestration are robust to these uncertainties. We find that BCS, GCS and OCS are complementary, both in alleviating the constraint on the energy sector and in tackling the uncertainties. BCS is used more in the short term as a brake, whereas OCS and GCS are used more in the long term as a safety valve. In other words, a portfolio approach is preferable to an approach based solely on emissions reduction: with a fully diversified mitigation portfolio, discounted global climate policy costs are up to 38% lower than with an abatement-only policy, and discounted abatement costs decrease by up to 54%. Short-term costs are lower mainly (81%) thanks to BCS, a result relatively independent of the emissions scenario. Long-term costs are lower mainly thanks to GCS or OCS, the two options competing with each other. In the case of high-emissions scenarios (like A2), OCS proves highly helpful (up to 25% of the A2 reference scenario's cumulative emissions could be stored). Though limited in duration, given the opportunity cost of permanently immobilizing land, BCS proves helpful in the short term (when the rate constraint bites most) against all uncertainties. GCS and OCS deploy later and prove helpful against uncertainties that weigh on the longer term (like the magnitude constraint), or that are expected to be resolved only in the long run.

    Table 1. The value of information in 1990 (T US90$) for the climate sensitivity (t2x), the rate constraint (ryt) and the magnitude constraint (tmax), in early-learning (2030) and late-learning (2060) scenarios, under the policy options listed.

                           early learning          late learning
      policy option        t2x    ryt    tmax      t2x    ryt    tmax
      M                    1.64   6.32   1.24      2.22   7.81   4.46
      M+BCS                1.05   4.44   0.84      1.50   5.55   3.29
      M+GCS                1.51   5.91   0.80      2.02   6.90   2.47
      M+OCS                1.62   6.02   0.89      2.13   6.93   2.47
      M+BCS+GCS+OCS        1.02   4.22   0.59      1.45   5.07   2.04

  8. Quantifying uncertainties in the structural response of SSME blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.

    1987-01-01

    To quantify the uncertainties associated with the geometry and material properties of a Space Shuttle Main Engine (SSME) turbopump blade, a computer code known as STAEBL was used. A finite element model of the blade used 80 triangular shell elements with 55 nodes and five degrees of freedom per node. The whole study was simulated on the computer and no real experiments were conducted. The structural response was evaluated in terms of three variables: natural frequencies, root (maximum) stress, and blade tip displacement. The results of the study indicate that only the geometric uncertainties have significant effects on the response; uncertainties in material properties have insignificant effects.

  9. The uncertainty cascade in flood risk assessment under changing climatic conditions - the Biala Tarnowska case study

    NASA Astrophysics Data System (ADS)

    Doroszkiewicz, Joanna; Romanowicz, Renata

    2016-04-01

    Uncertainty in the results of a hydraulic model is associated not only with the limitations of that model and the shortcomings of the data. An important factor with a major impact on the uncertainty of flood risk assessment under changing climate conditions is the uncertainty of future climate scenarios (IPCC WG I, 2013). Future climate projections provided by global climate models are used to generate the future runoff required as input to the hydraulic models applied in the derivation of flood risk maps. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections from the EURO-CORDEX project. The study describes the cascade of uncertainty through the different stages of deriving flood risk maps under changing climate conditions. In this context, it takes into account the uncertainty of future climate projections, the uncertainty of the flow routing model, the propagation of that uncertainty through the hydraulic model and, finally, the uncertainty related to the derivation of the flood risk maps. One of the aims of this study is an assessment of the relative impact of the different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the process, assessing the total uncertainty of maps of inundation probability can be computationally very time-consuming. As a way forward, we present an application of a hydraulic model simulator based on a nonlinear transfer function model for chosen locations along the river reach. The transfer function model parameters are estimated from simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of the simulator substantially reduces the computational requirements of deriving flood risk maps under future climatic conditions. Acknowledgements: This work was supported by the project CHIHE (Climate Change Impact on Hydrological Extremes), carried out in the Institute of Geophysics, Polish Academy of Sciences, funded by Norway Grants (contract No. Pol-Nor/196243/80/2013). The hydro-meteorological observations were provided by the Institute of Meteorology and Water Management (IMGW), Poland.

  10. Dealing with unquantifiable uncertainties in landslide modelling for urban risk reduction in developing countries

    NASA Astrophysics Data System (ADS)

    Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten

    2016-04-01

    Landslides have many negative economic and societal impacts, including the potential for significant loss of life and damage to infrastructure. Slope stability assessment can be used to guide decisions about the management of landslide risk, but its usefulness can be challenged by high levels of uncertainty in predicting landslide occurrence. Prediction uncertainty may be associated with the choice of model used to assess slope stability, the quality of the available input data, or a lack of knowledge of how future climatic and socio-economic changes may affect future landslide risk. While some of these uncertainties can be characterised by relatively well-defined probability distributions, for others, such as those linked to climate change, no probability distribution is available. This latter type, often referred to as deep uncertainty, means that robust policies need to be developed that are expected to perform acceptably well over a wide range of future conditions. In our study, the impact of deep uncertainty on slope stability predictions is assessed in a quantitative and structured manner using Global Sensitivity Analysis (GSA) and the Combined Hydrology and Stability Model (CHASM). In particular, we use several GSA methods, including the Method of Morris (sketched below), Regional Sensitivity Analysis and Classification and Regression Trees (CART), as well as advanced visualization tools, to assess the combinations of conditions that may lead to slope failure. Our example application is a slope in the Caribbean, an area that is naturally susceptible to landslides due to a combination of high rainfall rates during the hurricane season, steep slopes, and highly weathered residual soils. Rapid unplanned urbanisation and a changing climate may further exacerbate landslide risk in the future. Our example shows how we can gain useful information in the presence of deep uncertainty by combining physically based models with GSA in a scenario discovery framework.
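
    As a concrete illustration of one of the GSA methods named above, the following sketch computes Method-of-Morris elementary effects for a toy linear "factor of safety" function; the function and its three inputs are hypothetical placeholders for CHASM, whose inputs and physics are far richer:

      import numpy as np

      rng = np.random.default_rng(0)

      def factor_of_safety(x):
          # Hypothetical slope-stability response to three normalized inputs.
          cohesion, tan_phi, pore_pressure = x
          return 0.8 * cohesion + 1.2 * tan_phi - 1.5 * pore_pressure

      names = ["cohesion", "tan_phi", "pore_pressure"]
      k, r, delta = 3, 50, 0.25   # no. of inputs, trajectories, step size

      effects = [[] for _ in range(k)]
      for _ in range(r):
          x = rng.uniform(0.0, 1.0 - delta, k)  # random base point in [0,1]^k
          y0 = factor_of_safety(x)
          for i in rng.permutation(k):          # move one input at a time
              x[i] += delta
              y1 = factor_of_safety(x)
              effects[i].append((y1 - y0) / delta)
              y0 = y1

      for name, ee in zip(names, effects):
          ee = np.array(ee)
          # mu* ranks overall influence; sigma flags nonlinearity/interactions.
          print(f"{name:14s} mu* = {np.abs(ee).mean():.2f} "
                f"sigma = {ee.std():.2f}")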

  11. Assessment of Laminar, Convective Aeroheating Prediction Uncertainties for Mars Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Hollis, Brian R.; Prabhu, Dinesh K.

    2011-01-01

    An assessment of computational uncertainties is presented for numerical methods used by NASA to predict laminar, convective aeroheating environments for Mars entry vehicles. A survey was conducted of existing experimental heat-transfer and shock-shape data for high enthalpy, reacting-gas CO2 flows and five relevant test series were selected for comparison to predictions. Solutions were generated at the experimental test conditions using NASA state-of-the-art computational tools and compared to these data. The comparisons were evaluated to establish predictive uncertainties as a function of total enthalpy and to provide guidance for future experimental testing requirements to help lower these uncertainties.

  12. Assessment of Laminar, Convective Aeroheating Prediction Uncertainties for Mars-Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Hollis, Brian R.; Prabhu, Dinesh K.

    2013-01-01

    An assessment of computational uncertainties is presented for numerical methods used by NASA to predict laminar, convective aeroheating environments for Mars-entry vehicles. A survey was conducted of existing experimental heat transfer and shock-shape data for high-enthalpy reacting-gas CO2 flows, and five relevant test series were selected for comparison with predictions. Solutions were generated at the experimental test conditions using NASA state-of-the-art computational tools and compared with these data. The comparisons were evaluated to establish predictive uncertainties as a function of total enthalpy and to provide guidance for future experimental testing requirements to help lower these uncertainties.

  13. A Methodology for Robust Comparative Life Cycle Assessments Incorporating Uncertainty.

    PubMed

    Gregory, Jeremy R; Noshadravan, Arash; Olivetti, Elsa A; Kirchain, Randolph E

    2016-06-21

    We propose a methodology for conducting robust comparative life cycle assessments (LCA) by leveraging uncertainty. The method evaluates a broad range of the possible scenario space in a probabilistic fashion while simultaneously considering uncertainty in input data. It is intended to ascertain which scenarios have a definitive environmentally preferable choice among the alternatives being compared, how significant the differences are given the parameter uncertainty, which parameters have the most influence on this difference, and how the resolvable scenarios (those where one alternative has a clearly lower environmental impact) can be identified. This is accomplished via an aggregated probabilistic scenario-aware analysis, followed by an assessment of which scenarios have resolvable alternatives. Decision-tree partitioning algorithms are used to isolate meaningful scenario groups. In instances where the alternatives cannot be resolved for scenarios of interest, influential parameters are identified using sensitivity analysis. If those parameters can be refined, the process can be iterated using the refined parameters. We also present definitions of uncertainty quantities that have not previously been applied in the field of LCA, and approaches for characterizing uncertainty in those quantities. We then demonstrate the methodology through a case study of pavements.
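
    The core of such a probabilistic comparison can be reduced to one statement: the probability, across input-data draws, that one alternative has the lower impact. A minimal sketch under assumed lognormal impact distributions (the alternatives, numbers, and impact category are invented, not the paper's pavement case study):

      import numpy as np

      rng = np.random.default_rng(5)
      n = 100000

      # Hypothetical impact scores (kg CO2e) for two designs, with input-data
      # uncertainty represented by lognormal distributions.
      impact_a = rng.lognormal(mean=np.log(420.0), sigma=0.15, size=n)
      impact_b = rng.lognormal(mean=np.log(450.0), sigma=0.20, size=n)

      p = (impact_a < impact_b).mean()
      print(f"P(A preferable to B) = {p:.2f}")
      # A scenario is "resolvable" when this probability is near 0 or 1;
      # otherwise sensitivity analysis identifies the parameters to refine.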

  14. Applying High-Speed Vision Sensing to an Industrial Robot for High-Performance Position Regulation under Uncertainties

    PubMed Central

    Huang, Shouren; Bergström, Niklas; Yamakawa, Yuji; Senoo, Taku; Ishikawa, Masatoshi

    2016-01-01

    It is traditionally difficult to implement fast and accurate position regulation on an industrial robot in the presence of uncertainties. The uncertain factors can be attributed either to the industrial robot itself (e.g., a mismatch of dynamics, mechanical defects such as backlash, etc.) or to the external environment (e.g., calibration errors, misalignment or perturbations of a workpiece, etc.). This paper proposes a systematic approach to implement high-performance position regulation under uncertainties on a general industrial robot (referred to as the main robot) with minimal or no manual teaching. The method is based on a coarse-to-fine strategy that involves configuring an add-on module for the main robot's end effector. The add-on module consists of a 1000 Hz vision sensor and a high-speed actuator to compensate for the accumulated uncertainties. The main robot focuses only on fast, coarse motion, with its trajectories planned automatically from the image information of a static low-cost camera. Fast and accurate peg-and-hole alignment in one dimension was implemented as an application scenario using a commercial parallel-link robot and an add-on compensation module with one degree of freedom (DoF). Experimental results yielded an almost 100% success rate for fast peg-in-hole manipulation (with regulation accuracy of about 0.1 mm) when the workpiece was randomly placed. PMID:27483274

  15. Evaluating uncertainty to strengthen epidemiologic data for use in human health risk assessments

    EPA Science Inventory

    Background: There is a recognized need to improve the application of epidemiologic data in human health risk assessment especially for understanding and characterizing risks from environmental and occupational exposures. While most epidemiologic studies result in uncertainty, tec...

  16. Prioritizing Chemicals and Data Requirements for Screening-Level Exposure and Risk Assessment

    PubMed Central

    Brown, Trevor N.; Wania, Frank; Breivik, Knut; McLachlan, Michael S.

    2012-01-01

    Background: Scientists and regulatory agencies strive to identify chemicals that may cause harmful effects to humans and the environment; however, prioritization is challenging because of the large number of chemicals requiring evaluation and limited data and resources. Objectives: We aimed to prioritize chemicals for exposure and exposure potential and obtain a quantitative perspective on research needs to better address uncertainty in screening assessments. Methods: We used a multimedia mass balance model to prioritize > 12,000 organic chemicals using four far-field human exposure metrics. The propagation of variance (uncertainty) in key chemical information used as model input for calculating exposure metrics was quantified. Results: Modeled human concentrations and intake rates span approximately 17 and 15 orders of magnitude, respectively. Estimates of exposure potential using human concentrations and a unit emission rate span approximately 13 orders of magnitude, and intake fractions span 7 orders of magnitude. The actual chemical emission rate contributes the greatest variance (uncertainty) in exposure estimates. The human biotransformation half-life is the second greatest source of uncertainty in estimated concentrations. In general, biotransformation and biodegradation half-lives are greater sources of uncertainty in modeled exposure and exposure potential than chemical partition coefficients. Conclusions: Mechanistic exposure modeling is suitable for screening and prioritizing large numbers of chemicals. By including uncertainty analysis and uncertainty in chemical information in the exposure estimates, these methods can help identify and address the important sources of uncertainty in human exposure and risk assessment in a systematic manner. PMID:23008278
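
    A hedged sketch of the propagation-of-variance bookkeeping described above, on the log scale commonly used for multimedia models: assuming independent inputs, each input's contribution to Var(ln C) is its squared log-log sensitivity times its log-variance. The sensitivities and variances below are invented for illustration, though the ranking mirrors the abstract's finding that emission rate dominates, followed by the biotransformation half-life:

      # (sensitivity d ln C / d ln X, variance of ln X) -- illustrative values.
      inputs = {
          "emission_rate":            (1.0, 1.0 ** 2),
          "biotransformation_hl":     (0.8, 0.9 ** 2),
          "partition_coefficient":    (0.4, 0.5 ** 2),
      }

      # First-order propagation: Var(ln C) ~ sum_i s_i**2 * Var(ln X_i),
      # assuming the inputs are independent.
      contrib = {name: s ** 2 * v for name, (s, v) in inputs.items()}
      total = sum(contrib.values())
      for name, c in sorted(contrib.items(), key=lambda kv: -kv[1]):
          print(f"{name:24s} {100.0 * c / total:5.1f}% of Var(ln C)")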

  17. Calibration of ground-based microwave radiometers - Accuracy assessment and recommendations for network users

    NASA Astrophysics Data System (ADS)

    Pospichal, Bernhard; Küchler, Nils; Löhnert, Ulrich; Crewell, Susanne; Czekala, Harald; Güldner, Jürgen

    2016-04-01

    Ground-based microwave radiometers (MWR) are becoming widely used in atmospheric remote sensing and are starting to be routinely operated by national weather services and other institutions. However, common standards for the calibration of these radiometers and detailed knowledge of their error characteristics are needed in order to assimilate the data into models. Intercomparisons of calibrations by different MWRs have rarely been done. Therefore, two calibration experiments, in Lindenberg (2014) and Meckenheim (2015), were performed in the frame of TOPROF (COST Action ES1303) in order to assess uncertainties and differences between various instruments. In addition, a series of experiments was conducted in Oklahoma in autumn 2014. The focus lay on the performance of the two main instrument types currently used operationally: the MP-Profiler series by Radiometrics Corporation and the HATPRO series by Radiometer Physics GmbH (RPG). Both instrument types operate in two frequency bands, one along the 22 GHz water vapour line, the other at the lower wing of the 60 GHz oxygen absorption complex. The goal was to establish protocols for providing quality-controlled (QC) MWR data and their uncertainties. To this end, standardized calibration procedures for MWR were developed and recommendations for radiometer users were compiled. We focus here mainly on data types, integration times and optimal settings for calibration intervals, for absolute (liquid nitrogen, tipping curve) as well as relative (hot load, noise diode) calibrations. Besides the recommendations for ground-based MWR operators, we will present methods to determine the accuracy of the calibration as well as means for automatic data quality control. In addition, some results from the intercomparison of different radiometers will be discussed.

  18. Risk, Uncertainty and Precaution in Science: The Threshold of the Toxicological Concern Approach in Food Toxicology.

    PubMed

    Bschir, Karim

    2017-04-01

    Environmental risk assessment is often affected by severe uncertainty. The frequently invoked precautionary principle helps to guide risk assessment and decision-making in the face of scientific uncertainty. In many contexts, however, uncertainties play a role not only in the application of scientific models but also in their development. Building on recent literature in the philosophy of science, this paper argues that precaution should be exercised at the stage when tools for risk assessment are developed as well as when they are used to inform decision-making. The relevance and consequences of this claim are discussed in the context of the threshold of the toxicological concern approach in food toxicology. I conclude that the approach does not meet the standards of an epistemic version of the precautionary principle.

  19. Quantifying uncertainty in the measurement of arsenic in suspended particulate matter by Atomic Absorption Spectrometry with hydride generator

    PubMed Central

    2011-01-01

    Arsenic is a toxic element which creates several problems in human beings, especially when inhaled through air. Accurate and precise measurement of arsenic in suspended particulate matter (SPM) is therefore of prime importance, as it gives information about the level of toxicity in the environment so that preventive measures can be taken in the affected areas. Quality assurance is equally important in the measurement of arsenic in SPM samples before making any decision. The quality and reliability of such data depend upon the measurement uncertainty of each step involved, from sampling to analysis. Analytical results with a quantified uncertainty give a measure of the confidence level of the laboratory concerned. The main objective of this study was thus to determine the arsenic content of SPM samples with an uncertainty budget and to identify the potential sources of uncertainty that affect the results. With this in mind, we selected seven diverse sites in Delhi (the national capital of India) for quantification of the arsenic content of SPM samples with an uncertainty budget, from sampling by high-volume sampler (HVS) to analysis by Atomic Absorption Spectrometry with hydride generation (AAS-HG). Many steps are involved from sampling to the final result, and we have considered the various potential sources of uncertainty in each. The calculation of uncertainty is based on the ISO/IEC 17025:2005 document and the EURACHEM guideline. It was found that the final results depend mostly on the measurement uncertainties due to repeatability, the final volume prepared for analysis, the weighing balance, and sampling by HVS. From the analysis of the data from the seven sites, it is concluded that during the period from 31 January 2008 to 7 February 2008 the arsenic concentration varied from 1.44 ± 0.25 to 5.58 ± 0.55 ng/m3 at the 95% confidence level (k = 2). PMID:21466671
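
    The uncertainty budget described above combines the component uncertainties in quadrature, per the GUM/EURACHEM approach the authors cite, and expands by a coverage factor k = 2 for ~95% confidence. A minimal sketch with invented relative uncertainties for the four dominant sources named in the abstract:

      import numpy as np

      # Relative standard uncertainties (illustrative, not the study's values).
      components = {
          "repeatability":     0.040,
          "final_volume":      0.015,
          "weighing_balance":  0.010,
          "HVS_sampling":      0.060,
      }

      u_c = np.sqrt(sum(u ** 2 for u in components.values()))  # combined (rel.)
      result = 5.58                                            # ng/m3, example
      U = 2.0 * u_c * result                                   # expanded, k = 2
      print(f"As = {result:.2f} +/- {U:.2f} ng/m3 (k = 2, ~95% confidence)")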

  20. The state of the art of the impact of sampling uncertainty on measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Leite, V. J.; Oliveira, E. C.

    2018-03-01

    Measurement uncertainty is a parameter that characterizes the reliability of a result and can be divided into two large groups: sampling and analytical variation. Analytical uncertainty arises in a controlled process performed in the laboratory. The same does not hold for sampling uncertainty which, because it faces several practical obstacles and there is no clarity on how to perform the procedures, has been neglected, although it is admittedly indispensable to the measurement process. This paper aims at describing the state of the art of sampling uncertainty and at assessing its relevance to measurement uncertainty.
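
    Assuming the sampling and analytical steps are independent, the two groups named above combine in quadrature,

      s_{\text{measurement}}^2 = s_{\text{sampling}}^2 + s_{\text{analytical}}^2,

    so when the sampling term dominates, as is often found in practice, further refining the laboratory step does little to tighten the overall measurement uncertainty.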

  1. Sequential simulation approach to modeling of multi-seam coal deposits with an application to the assessment of a Louisiana lignite

    USGS Publications Warehouse

    Olea, Ricardo A.; Luppens, James A.

    2012-01-01

    There are multiple ways to characterize uncertainty in the assessment of coal resources, but not all of them are equally satisfactory. Increasingly, the tendency is toward borrowing from the statistical tools developed in the last 50 years for the quantitative assessment of other mineral commodities. Here, we briefly review the most recent of such methods and formulate a procedure for the systematic assessment of multi-seam coal deposits taking into account several geological factors, such as fluctuations in thickness, erosion, oxidation, and bed boundaries. A lignite deposit explored in three stages is used for validating models based on comparing a first set of drill holes against data from infill and development drilling. Results were fully consistent with reality, providing a variety of maps, histograms, and scatterplots characterizing the deposit and associated uncertainty in the assessments. The geostatistical approach was particularly informative in providing a probability distribution modeling deposit wide uncertainty about total resources and a cumulative distribution of coal tonnage as a function of local uncertainty.

  2. Soil sampling strategies for site assessments in petroleum-contaminated areas.

    PubMed

    Kim, Geonha; Chowdhury, Saikat; Lin, Yen-Min; Lu, Chih-Jen

    2017-04-01

    Environmental site assessments are frequently executed for monitoring and remediation performance evaluation purposes, especially in total petroleum hydrocarbon (TPH)-contaminated areas such as gas stations. As a key issue, the reproducibility of the assessment results must be ensured, especially if attempts are made to compare results between different institutions. Although it is widely known that the uncertainties associated with soil sampling are much higher than those associated with chemical analysis, field guides or protocols to deal with these uncertainties are not stipulated in detail in the relevant regulations, causing serious errors and undermining the reliability of environmental site assessments. In this research, the uncertainties associated with soil sampling and with sample reduction for chemical analysis were quantified using laboratory-scale experiments and the theory of sampling. The results showed that the TPH mass assessed by sampling tends to be overestimated and that sampling errors are high, especially in the low range of TPH concentrations. Homogenization of the soil was found to be an efficient way to suppress this uncertainty, while high-resolution sampling could be essential to minimizing it.

  3. Making sense of genetic uncertainty: the role of religion and spirituality.

    PubMed

    White, Mary T

    2009-02-15

    This article argues that to the extent that religious and spiritual beliefs can help people cope with genetic uncertainty, a limited spiritual assessment may be appropriate in genetic counseling. The article opens by establishing why genetic information is inherently uncertain and why this uncertainty can be medically, morally, and spiritually problematic. This is followed by a review of the range of factors that can contribute to risk assessments, including a few heuristics commonly used in responses to uncertainty. The next two sections summarize recent research on the diverse roles of religious and spiritual beliefs in genetic decisions and challenges to conducting spiritual assessments in genetic counseling. Based on these findings, religious and spiritual beliefs are posited as serving essentially as a heuristic that some people will utilize in responding to their genetic risks. In the interests of helping such clients make informed decisions, a limited spiritual assessment is recommended and described. Some of the challenges and risks associated with this limited assessment are discussed. Since some religious and spiritual beliefs can conflict with the values of medicine, some decisions will remain problematic. (c) 2009 Wiley-Liss, Inc.

  4. Characterizing Uncertainty and Variability in PBPK Models: State of the Science and Needs for Research and Implementation

    EPA Science Inventory

    Mode-of-action based risk and safety assessments can rely upon tissue dosimetry estimates in animals and humans obtained from physiologically-based pharmacokinetic (PBPK) modeling. However, risk assessment also increasingly requires characterization of uncertainty and variabilit...

  5. Uncertainty in simulating wheat yields under climate change

    USDA-ARS?s Scientific Manuscript database

    Anticipating the impacts of climate change on crop yields is critical for assessing future food security. Process-based crop simulation models are the most commonly used tools in such assessments. Analysis of uncertainties in future greenhouse gas emissions and their impacts on future climate change...

  6. The impacts of uncertainty and variability in groundwater-driven health risk assessment. (Invited)

    NASA Astrophysics Data System (ADS)

    Maxwell, R. M.

    2010-12-01

    Potential human health risk from contaminated groundwater is becoming an important, quantitative measure used in management decisions in a range of applications from Superfund to CO2 sequestration. Quantitatively assessing the potential human health risks from contaminated groundwater is challenging due to the many coupled processes, the uncertainty in transport parameters, and the variability in individual physiology and behavior. Perspective on human health risk assessment techniques will be presented, and a framework used to predict potential increased human health risk from contaminated groundwater will be discussed. This framework incorporates the transport of contaminants through the subsurface from source to receptor and the health risks to individuals via household exposure pathways. The subsurface is shown to be subject to both physical and chemical heterogeneity, which affects downstream concentrations at receptors. Cases are presented where hydraulic conductivity exhibits both uncertainty and spatial variability, in addition to situations where hydraulic conductivity is the dominant source of uncertainty in risk assessment. Management implications, such as characterization and remediation, will also be discussed.

  7. Water supply infrastructure planning under multiple uncertainties: A differentiated approach

    NASA Astrophysics Data System (ADS)

    Fletcher, S.; Strzepek, K.

    2017-12-01

    Many water planners face increased pressure on water supply systems from increasing demands from population and economic growth in combination with uncertain water supply. Supply uncertainty arises from short-term climate variability and long-term climate change as well as from uncertainty in groundwater availability. Social and economic uncertainties - such as sectoral competition for water, food and energy security, urbanization, and environmental protection - compound the physical uncertainty. Further, the varying risk aversion of stakeholders and water managers makes it difficult to assess the necessity of expensive infrastructure investments to reduce risk. We categorize these uncertainties along two dimensions: whether they can be updated over time by collecting additional information, and whether they can be described probabilistically or are "deep" uncertainties whose likelihood is unknown. Based on this, we apply a decision framework that combines simulation for probabilistic uncertainty, scenario analysis for deep uncertainty, and multi-stage decision analysis for uncertainties that are reduced over time with additional information. In light of these uncertainties and the investment costs of large infrastructure, we propose the assessment of staged, modular infrastructure and information updating as a hedge against risk. We apply this framework to cases in Melbourne, Australia and Riyadh, Saudi Arabia. Melbourne is a surface water system facing uncertain population growth and variable rainfall and runoff. A severe drought from 1997 to 2009 prompted investment in a 150 MCM/y reverse osmosis desalination plant with a capital cost of $3.5 billion. Our analysis shows that a flexible design, in which a smaller portion of capacity is developed initially with the option to add modular capacity in the future, can mitigate uncertainty and reduce the expected lifetime costs by up to $1 billion. In Riyadh, urban water use relies on fossil groundwater aquifers and desalination. Intense withdrawals for urban and agricultural use will lower the water table in the aquifer at rapid but uncertain rates due to poor groundwater characterization. We assess the potential for additional groundwater data collection and a flexible infrastructure approach similar to that in Melbourne to mitigate this risk.

  8. Urban water supply infrastructure planning under predictive groundwater uncertainty: Bayesian updating and flexible design

    NASA Astrophysics Data System (ADS)

    Fletcher, S.; Strzepek, K.

    2017-12-01

    Many urban water planners face increased pressure on water supply systems from increasing demands from population and economic growth in combination with uncertain water supply, driven by short-term climate variability and long-term climate change. These uncertainties are often exacerbated in groundwater-dependent water systems due to the extra difficulty in measuring groundwater storage, recharge, and sustainable yield. Groundwater models are typically under-parameterized due to the high data requirements for calibration and limited data availability, leading to uncertainty in the models' predictions. We develop an integrated approach to urban water supply planning that combines predictive groundwater uncertainty analysis with adaptive water supply planning using multi-stage decision analysis. This allows us to compare the value of collecting additional groundwater data and reducing predictive uncertainty with the value of using water infrastructure planning that is flexible, modular, and can react quickly in response to unexpected changes in groundwater availability. We apply this approach to a case from Riyadh, Saudi Arabia. Riyadh relies on fossil groundwater aquifers and desalination for urban use. The main fossil aquifers incur minimal recharge and face depletion as a result of intense withdrawals for urban and agricultural use. As the water table declines and pumping becomes uneconomical, Riyadh will have to build new supply infrastructure, decrease demand, or increase the efficiency of its distribution system. However, poor groundwater characterization has led to severe uncertainty in aquifer parameters such as hydraulic conductivity, and therefore severe uncertainty in how the water table will respond to pumping over time and when these transitions will be necessary: the potential depletion time varies from approximately five years to 100 years. This case is an excellent candidate for flexible planning both because of its severity and the potential for learning: additional information can be collected over time and flexible options exercised in response. Stochastic dynamic programming is used to find optimal policies for using flexibility under different information scenarios. The performance of each strategy is then assessed using a simulation model of Riyadh's water system.

  9. A Practical Approach to Address Uncertainty in Stakeholder Deliberations.

    PubMed

    Gregory, Robin; Keeney, Ralph L

    2017-03-01

    This article addresses the difficulties of incorporating uncertainty about consequence estimates as part of stakeholder deliberations involving multiple alternatives. Although every prediction of future consequences necessarily involves uncertainty, a large gap exists between common practices for addressing uncertainty in stakeholder deliberations and the procedures of prescriptive decision-aiding models advanced by risk and decision analysts. We review the treatment of uncertainty at four main phases of the deliberative process: with experts asked to describe possible consequences of competing alternatives, with stakeholders who function both as individuals and as members of coalitions, with the stakeholder committee composed of all stakeholders, and with decisionmakers. We develop and recommend a model that uses certainty equivalents as a theoretically robust and practical approach for helping diverse stakeholders to incorporate uncertainties when evaluating multiple-objective alternatives as part of public policy decisions. © 2017 Society for Risk Analysis.
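
    A minimal sketch of the certainty-equivalent computation the recommended model relies on, assuming an exponential utility u(x) = -exp(-x/r) with risk tolerance r, so that CE = u^{-1}(E[u(X)]); the outcome distribution and the value of r below are invented, not from the article:

      import numpy as np

      rng = np.random.default_rng(11)
      outcomes = rng.normal(100.0, 30.0, 100000)  # uncertain consequence
      r = 50.0                                    # stakeholder risk tolerance

      # CE = -r * ln(E[exp(-X/r)]) for exponential utility.
      ce = -r * np.log(np.mean(np.exp(-outcomes / r)))
      print(f"expected value = {outcomes.mean():.1f}, "
            f"certainty equivalent = {ce:.1f}")
      # For normal X, CE = mu - sigma**2 / (2r) = 100 - 900/100 = 91: the risk
      # discount grows with variance, so more uncertain alternatives score
      # lower, which lets diverse stakeholders fold uncertainty into a single
      # comparable number per alternative.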

  10. Uncertainty in BMP evaluation and optimization for watershed management

    NASA Astrophysics Data System (ADS)

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    The use of computer simulation models has increased substantially to make watershed management decisions and to develop strategies for water quality improvements. These models are often used to evaluate the potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, the use of simulation models to optimize the selection and placement of BMPs under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One limitation of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. measured precipitation, streamflow, sediment, nutrient and pesticide losses, and land use) and in model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management, and we have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify the uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We used the Soil and Water Assessment Tool (SWAT) to simulate the hydrology and water quality of a mixed land use watershed located in the Midwest USA, and to represent the various BMPs needed in the watershed to improve water quality. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use, and climate changes resulted in considerable uncertainty in BMP performance in reducing P, N, and sediment loads. Climate change scenarios also affected uncertainties in SWAT-simulated crop yields. Considerable uncertainties in the net cost and in the water quality improvements resulted from uncertainties in land use, climate change, and model parameter values.

  11. Assessment of BTEX-induced health risk under multiple uncertainties at a petroleum-contaminated site: An integrated fuzzy stochastic approach

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodong; Huang, Guo H.

    2011-12-01

    Groundwater pollution has attracted increasing attention in the past decades. An assessment of groundwater contamination risk is desirable to provide a sound basis for supporting risk-based management decisions. The objective of this study is therefore to develop an integrated fuzzy stochastic approach to evaluate the risks of BTEX-contaminated groundwater under multiple uncertainties. It consists of an integrated interval fuzzy subsurface modeling system (IIFMS) and an integrated fuzzy second-order stochastic risk assessment (IFSOSRA) model. The IIFMS is developed based on factorial design, interval analysis, and a fuzzy sets approach to predict contaminant concentrations under hybrid uncertainties. Two input parameters (longitudinal dispersivity and porosity) are considered to be uncertain with known fuzzy membership functions, and intrinsic permeability is considered to be an interval number with unknown distribution information. A factorial design is conducted to evaluate the interactive effects of the three uncertain factors on the modeling outputs through the developed IIFMS. The IFSOSRA model can systematically quantify variability and uncertainty, as well as their hybrids, presented as fuzzy, stochastic and second-order stochastic parameters in health risk assessment. The developed approach has been applied to the management of a real-world petroleum-contaminated site in a western Canada context. The results indicate that multiple uncertainties, under a combination of information with various data-quality levels, can be effectively addressed to support the identification of proper remedial efforts. A unique contribution of this research is the development of an integrated fuzzy stochastic approach for handling various forms of uncertainty associated with simulation and risk assessment efforts.

  12. Tightness of the Entropic Uncertainty Relation in a Quantum Markovian-Davies Environment

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Liu, Liang; Han, Yan

    2018-05-01

    In this paper, we investigate the tightness of the entropic uncertainty relation in the absence (presence) of quantum memory, with the memory particle weakly coupled to a decohering Davies-type Markovian environment. The results show that the tightness of the uncertainty relation can be controlled by the energy relaxation time F, the dephasing time G and the rescaled temperature p; perfect tightness is reached when dephasing and energy relaxation satisfy F = 2G and p = 1/2. In addition, the tightness of both the memory-assisted entropic uncertainty relation and the plain entropic uncertainty relation is influenced mainly by the purity. In the memory-assisted model, purity and quantum correlations strongly influence the tightness, while quantum entanglement influences it only slightly.
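
    For reference, the memory-assisted entropic uncertainty relation whose tightness is at issue here is presumably the Berta et al. (2010) bound: for observables Q and R with eigenbases {|psi_i>} and {|phi_j>}, measured on a particle A correlated with a quantum memory B,

      S(Q|B) + S(R|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
      \qquad c = \max_{i,j} \left| \langle \psi_i | \phi_j \rangle \right|^2,

    where the tightness studied in the paper is the gap between the left- and right-hand sides; without the memory B, the relation reduces to the Maassen-Uffink bound S(Q) + S(R) >= log_2(1/c).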

  13. Sources of uncertainty in flood inundation maps

    USGS Publications Warehouse

    Bales, J.D.; Wagner, C.R.

    2009-01-01

    Flood inundation maps typically have been used to depict inundated areas for floods having specific exceedance levels. The uncertainty associated with the inundation boundaries is seldom quantified, in part because not all of the sources of uncertainty are recognized and because the data needed to quantify uncertainty seldom are available. Sources of uncertainty discussed in this paper include the hydrologic data used for hydraulic model development and validation, the topographic data, and the hydraulic model itself. The assumption of steady flow, which typically is made to produce inundation maps, has less of an effect on predicted inundation at lower flows than at higher flows because more time typically is required to inundate areas at high flows than at low flows. Difficulties with establishing reasonable cross sections that do not intersect and that represent water-surface slopes in tributaries contribute additional uncertainties in the hydraulic modelling. As a result, uncertainty in the flood inundation polygons simulated with a one-dimensional model increases with distance from the main channel.

  14. Incorporating rainfall uncertainty in a SWAT model: the river Zenne basin (Belgium) case study

    NASA Astrophysics Data System (ADS)

    Tolessa Leta, Olkeba; Nossent, Jiri; van Griensven, Ann; Bauwens, Willy

    2013-04-01

    The European Union Water Framework Directive (EU-WFD) called on its member countries to achieve a good ecological status for all inland and coastal water bodies by 2015. According to recent studies, the river Zenne (Belgium) is far from this objective. Therefore, an interuniversity and multidisciplinary project, "Towards a Good Ecological Status in the river Zenne (GESZ)", was launched to evaluate the effects of wastewater management plans on the river. In this project, different models have been developed and integrated using the Open Modelling Interface (OpenMI). The hydrologic, semi-distributed Soil and Water Assessment Tool (SWAT) is used as one of the components in the integrated modelling chain to model the upland catchment processes. The assessment of the uncertainty of SWAT is an essential aspect of the decision-making process, in order to design robust management strategies that take the predicted uncertainties into account. Model uncertainty stems from uncertainties in the model parameters, the input data (e.g., rainfall), the calibration data (e.g., stream flows) and the model structure itself. The objective of this paper is to assess the first three sources of uncertainty in a SWAT model of the river Zenne basin. To assess rainfall measurement uncertainty, we first identified independent rainfall periods based on the daily precipitation and stream flow observations, using the Water Engineering Time Series PROcessing tool (WETSPRO). Secondly, we assigned a rainfall multiplier parameter to each independent rainfall period, which serves as a multiplicative input-error corruption. Finally, we treated these multipliers as latent parameters in the model optimization and uncertainty analysis (UA). For parameter uncertainty assessment, given the high number of SWAT parameters, we first screened for the most sensitive parameters using the Latin Hypercube One-factor-At-a-Time (LH-OAT) technique, and considered only these in the parameter optimization and UA. To explicitly account for stream flow uncertainty, we assumed that the stream flow measurement error increases linearly with the stream flow value. To assess the uncertainty and infer posterior distributions of the parameters, we used a Markov chain Monte Carlo (MCMC) sampler, the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm, which generates candidate points in each individual chain by sampling from an archive of past states. It is shown that the marginal posterior distributions of the rainfall multipliers vary widely between individual events, as a consequence of rainfall measurement errors and the spatial variability of the rain; only a few of the rainfall events are well defined. The marginal posterior distributions of the SWAT model parameters are well defined and identified by DREAM within their prior ranges. The posterior distributions of the output uncertainty parameters also show that the stream flow data are highly uncertain. The approach of using rainfall multipliers to treat rainfall uncertainty in a complex model has an impact on the marginal posterior distributions of the model parameters and on the model results.
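
    A minimal, runnable sketch of the two uncertainty treatments described above: one latent multiplier per independent rainfall period enters the forcing multiplicatively, and the residual standard deviation grows linearly with the observed flow. The toy linear-runoff model and all numbers are placeholders for SWAT and the Zenne data:

      import numpy as np

      rng = np.random.default_rng(13)

      def toy_model(theta, rain):
          # Placeholder rainfall-runoff model (runoff coefficient only).
          return theta[0] * np.concatenate(rain)

      # Three independent rainfall periods (as identified with WETSPRO).
      rain_events = [rng.gamma(2.0, 5.0, 10) for _ in range(3)]
      q_obs = 0.6 * np.concatenate(rain_events) * rng.lognormal(0.0, 0.1, 30)

      def log_likelihood(theta, multipliers):
          # Corrupt each rainfall period with its own latent multiplier.
          rain = [m * r for m, r in zip(multipliers, rain_events)]
          q_sim = toy_model(theta, rain)
          sigma = 0.1 + 0.1 * q_obs   # error grows linearly with flow
          resid = (q_obs - q_sim) / sigma
          return -0.5 * np.sum(resid ** 2 + np.log(2.0 * np.pi * sigma ** 2))

      # A sampler such as DREAM would explore theta and the multipliers
      # jointly; here we just evaluate the likelihood once.
      print(log_likelihood(np.array([0.6]), np.ones(3)))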

  15. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

    The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks, such as scenario-based and vulnerability-based approaches, exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change. While climate uncertainty may be dominant in many cases, other factors, such as the future evolution of the river basin, the hydrologic response and reservoir operations, are potentially significant sources of uncertainty. Whereas the uncertainty associated with modeling the hydrologic response has received attention, very little attention has focused on the range of uncertainty and the possible effects of the water resources infrastructure and its management. This work presents a holistic framework for the analysis of climate, hydrologic and water management uncertainty in water resources systems, with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about the model parameters. The framework is demonstrated in a case study of the St. Croix Basin, located at the border of the United States and Canada.

  16. Methodology for quantifying uncertainty in coal assessments with an application to a Texas lignite deposit

    USGS Publications Warehouse

    Olea, R.A.; Luppens, J.A.; Tewalt, S.J.

    2011-01-01

    A common practice for characterizing uncertainty in coal resource assessments has been the itemization of tonnage at the mining unit level and the classification of such units according to distance to drilling holes. Distance criteria, such as those used in U.S. Geological Survey Circular 891, are still widely used for public disclosure. A major deficiency of distance methods is that they do not provide a quantitative measure of uncertainty. Additionally, relying on distance between data points alone does not take into consideration other factors known to have an influence on uncertainty, such as spatial correlation, type of probability distribution followed by the data, geological discontinuities, and boundary of the deposit. Several geostatistical methods have been combined to formulate a quantitative characterization for appraising uncertainty. Drill hole datasets ranging from widespread exploration drilling to detailed development drilling from a lignite deposit in Texas were used to illustrate the modeling. The results show that distance to the nearest drill hole is almost completely unrelated to uncertainty, which confirms the inadequacy of characterizing uncertainty based solely on a simple classification of resources by distance classes. The more complex statistical methods used in this study quantify uncertainty and show good agreement between confidence intervals in the uncertainty predictions and data from additional drilling. © 2010.
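    The contrast drawn above can be illustrated with a toy comparison between distance-to-nearest-drill-hole and a kriging-type standard deviation from a Gaussian process. The synthetic thickness field, kernel and grid below are invented; in a smooth toy like this the two measures correlate far more strongly than they typically do in real deposits with proportional effects and discontinuities.

```python
# Minimal sketch: nearest-hole distance vs. a geostatistical (GP/kriging-type)
# uncertainty measure on a synthetic "coal thickness" field.
import numpy as np
from scipy.spatial import cKDTree
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
holes = rng.uniform(0.0, 10.0, size=(60, 2))              # drill-hole locations
thickness = np.sin(holes[:, 0]) + 0.1 * rng.standard_normal(60)

gp = GaussianProcessRegressor(kernel=1.0 * RBF(2.0) + WhiteKernel(0.01))
gp.fit(holes, thickness)

grid = rng.uniform(0.0, 10.0, size=(500, 2))              # assessment grid
_, std = gp.predict(grid, return_std=True)                # kriging-type std dev
dist, _ = cKDTree(holes).query(grid)                      # distance to nearest hole

print("corr(distance, kriging std):", np.corrcoef(dist, std)[0, 1].round(2))
```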

  17. Toxicity interaction between chlorpyrifos, mancozeb and soil moisture to the terrestrial isopod Porcellionides pruinosus.

    PubMed

    Morgado, Rui G; Gomes, Pedro A D; Ferreira, Nuno G C; Cardoso, Diogo N; Santos, Miguel J G; Soares, Amadeu M V M; Loureiro, Susana

    2016-02-01

    A main source of uncertainty currently associated with environmental risk assessment of chemicals is the poor understanding of the influence of environmental factors on the toxicity of xenobiotics. Aiming to reduce this uncertainty, here we evaluate the joint effects of two pesticides (chlorpyrifos and mancozeb) on the terrestrial isopod Porcellionides pruinosus under different soil moisture regimes. A full factorial design, including three treatments of each pesticide and an untreated control, was performed under three soil moisture regimes: 25%, 50%, and 75% WHC. Our results showed that soil moisture had no effect on isopod survival, at the levels assessed in this experiment, in either single-pesticide or mixture treatments. Additivity was always the most parsimonious result when both pesticides were present. In contrast, both feeding activity and biomass change showed a higher sensitivity to soil moisture, with isopods generally showing worse performance when exposed to pesticides under dry or moist conditions. Most of the significant differences between soil moisture regimes were found in single-pesticide treatments, yet different responses to mixtures could still be distinguished depending on the soil moisture regime assessed. This study shows that while soil moisture has the potential to influence the effects of the pesticide mixture itself, such effects might become less important in a context of complex combinations of stressors, as the major contribution comes from its individual interaction with each pesticide. Finally, the implications of our results are discussed in light of the current state of environmental risk assessment procedures and some future perspectives are advanced.
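    A full factorial design of this kind is commonly analysed with an interaction ANOVA; the sketch below shows one hedged way to do so in Python with statsmodels, on synthetic feeding-rate data. The factor levels, effect sizes and replicate counts are invented, not the isopod measurements.

```python
# Minimal sketch: three-way factorial ANOVA with interaction terms; the
# interaction terms test whether toxicity depends on soil moisture.
import numpy as np
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(6)
rows = []
for c in (0, 1, 2, 3):                  # chlorpyrifos: control + 3 treatments
    for m in (0, 1, 2, 3):              # mancozeb: control + 3 treatments
        for w in (25, 50, 75):          # soil moisture (% WHC)
            for _ in range(5):          # 5 replicates per treatment cell
                feeding = (10.0 - 0.8 * c - 0.6 * m
                           - 0.02 * abs(w - 50) + rng.normal(0.0, 1.0))
                rows.append({"chlorpyrifos": c, "mancozeb": m,
                             "moisture": w, "feeding": feeding})
df = pd.DataFrame(rows)

model = ols("feeding ~ C(chlorpyrifos) * C(mancozeb) * C(moisture)", data=df).fit()
print(anova_lm(model, typ=2).round(3))  # main effects and interactions
```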

  18. Assessing Uncertainties in Surface Water Security: A Probabilistic Multi-model Resampling approach

    NASA Astrophysics Data System (ADS)

    Rodrigues, D. B. B.

    2015-12-01

    Various uncertainties are involved in the representation of processes that characterize interactions between societal needs, ecosystem functioning, and hydrological conditions. Here, we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multi-model and resampling framework. We consider several uncertainty sources, including those related to: i) observed streamflow data; ii) hydrological model structure; iii) residual analysis; iv) the method used to define the Environmental Flow Requirement; v) the definition of critical conditions for water provision; and vi) the critical demand imposed by human activities. We estimate the overall uncertainty coming from the hydrological model by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multi-model framework and those provided by each model uncertainty estimation approach. The method is general and can be easily extended, forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.
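    The residual block-bootstrap step can be sketched compactly: blocks of autocorrelated residuals are resampled and re-added to the simulated series to yield 95% confidence bands. The daily flow series and AR(1)-like residuals below are invented placeholders.

```python
# Minimal sketch: block bootstrap of model residuals to obtain 95% confidence
# bands around a simulated streamflow series. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n = 365
q_sim = 10.0 + 3.0 * np.sin(np.linspace(0.0, 12.0, n))   # simulated daily flow
eps = np.zeros(n)
for t in range(1, n):                                    # autocorrelated residuals
    eps[t] = 0.8 * eps[t - 1] + rng.normal(0.0, 0.3)
residuals = eps                                          # stands in for q_obs - q_sim

block, n_rep = 30, 1000
n_blocks = int(np.ceil(n / block))
reps = np.empty((n_rep, n))
for i in range(n_rep):
    starts = rng.integers(0, n - block, size=n_blocks)   # random block origins
    boot = np.concatenate([residuals[s:s + block] for s in starts])[:n]
    reps[i] = q_sim + boot                               # one bootstrap realization
lo, hi = np.percentile(reps, [2.5, 97.5], axis=0)        # 95% band per time step
print(f"mean 95% band width: {(hi - lo).mean():.2f}")
```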

  19. Assessing uncertainties in crop and pasture ensemble model simulations of productivity and N2O emissions

    USDA-ARS?s Scientific Manuscript database

    Simulation models are extensively used to predict agricultural productivity and greenhouse gas (GHG) emissions. However, the uncertainties of (reduced) model ensemble simulations have not been assessed systematically for variables affecting food security and climate change mitigation, within multisp...

  20. Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin

    EPA Science Inventory

    We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...

  1. Probability and Confidence Trade-Space (PACT) Evaluation: Accounting for Uncertainty in Sparing Assessments

    NASA Technical Reports Server (NTRS)

    Anderson, Leif; Box, Neil; Carter-Journet, Katrina; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael

    2012-01-01

    Purpose of presentation: (1) To give a status update on the developing methodology to revise sub-system sparing targets. (2) To describe how to incorporate uncertainty into sparing assessments and why it is important to do so. (3) To demonstrate hardware risk postures through PACT evaluation.

  2. Is the aerosol emission detectable in the thermal infrared?

    NASA Astrophysics Data System (ADS)

    Hollweg, H.-D.; Bakan, S.; Taylor, J. P.

    2006-08-01

    The impact of aerosols on the thermal infrared radiation can be assessed by combining observations and radiative transfer calculations. Both have uncertainties, which are discussed in this paper. Observational uncertainties are obtained for two FTIR instruments operated side by side on the ground during the LACE 1998 field campaign. Radiative transfer uncertainties are assessed using a line-by-line model taking into account the uncertainties of the HITRAN 2004 spectroscopic database, uncertainties in the determination of the atmospheric profiles of water vapor and ozone, and differences in the treatment of the water vapor continuum absorption by the CKD 2.4.1 and MT_CKD 1.0 algorithms. The software package OPAC was used to describe the optical properties of aerosols for climate modeling. The corresponding radiative signature serves as a guideline for assessing the uncertainty ranges of observations and models. We found that the detection of aerosols depends strongly on the measurement accuracy of atmospheric profiles of water vapor and ozone and is easier under drier conditions. Within the atmospheric window, only the forcing of downward radiation at the surface by desert aerosol emerges clearly from the uncertainties of modeling and FTIR measurement. Urban and polluted continental aerosols are only partially detectable depending on the wave number and on the atmospheric water vapor amount. Simulations for the space-borne interferometer IASI show that only upward radiation above transported mineral dust aloft emerges from the uncertainties. The detection of aerosols with weak radiative impact by FTIR instruments like ARIES and OASIS is made difficult by noise, as demonstrated by the signal-to-noise ratio for clean continental aerosols. Altogether, the uncertainties found suggest that it is difficult to detect the optical depths of nonmineral and unpolluted aerosols.

  3. Assessing uncertainties in surface water security: An empirical multimodel approach

    NASA Astrophysics Data System (ADS)

    Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo M.; Oliveira, Paulo Tarso S.

    2015-11-01

    Various uncertainties are involved in the representation of processes that characterize interactions among societal needs, ecosystem functioning, and hydrological conditions. Here we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multimodel and resampling framework. We consider several uncertainty sources including those related to (i) observed streamflow data; (ii) hydrological model structure; (iii) residual analysis; (iv) the method for defining Environmental Flow Requirements; (v) the definition of critical conditions for water provision; and (vi) the critical demand imposed by human activities. We estimate the overall hydrological model uncertainty by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km2 agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multimodel framework and the uncertainty estimates provided by each model uncertainty estimation approach. The range of values obtained for the water security indicators suggests that the models/methods are robust and perform well in a range of plausible situations. The method is general and can be easily extended, thereby forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.

  4. Graphical Representations of Data Improve Student Understanding of Measurement and Uncertainty: An Eye-Tracking Study

    ERIC Educational Resources Information Center

    Susac, Ana; Bubic, Andreja; Martinjak, Petra; Planinic, Maja; Palmovic, Marijan

    2017-01-01

    Developing a better understanding of the measurement process and measurement uncertainty is one of the main goals of university physics laboratory courses. This study investigated the influence of graphical representation of data on student understanding and interpreting of measurement results. A sample of 101 undergraduate students (48 first year…

  5. Use of Total Possibilistic Uncertainty as a Measure of Students' Modelling Capacities

    ERIC Educational Resources Information Center

    Voskoglou, Michael Gr.

    2010-01-01

    We represent the main stages of the process of mathematical modelling as fuzzy sets in the set of the linguistic labels of negligible, low, intermediate, high and complete success by students in each of these stages, and we use the total possibilistic uncertainty as a measure of students' modelling capacities. A classroom experiment is also…

  6. The U.S. Federal Government's Efforts to Estimate an Economic Value for Reduced Carbon Emissions (Invited)

    NASA Astrophysics Data System (ADS)

    Wolverton, A.

    2010-12-01

    This presentation will summarize the technical process and results from recent U.S. Federal government efforts to estimate the “social cost of carbon” (SCC): the monetized damages associated with an incremental increase in carbon dioxide emissions in a given year. The purpose of the SCC estimates is to make it possible for Federal agencies to incorporate the social benefits from reducing CO2 emissions into cost-benefit analyses of regulatory actions that have relatively small impacts on cumulative global emissions. An interagency working group initiated a comprehensive analysis using three integrated assessment models. The interagency group chose to rely on three of the most widely recognized peer-reviewed models to fairly represent differences in the way in which economic impacts from climate change are modeled (DICE, PAGE, and FUND). The main objective of this process was to develop a range of SCC values using a defensible set of input assumptions grounded in the existing scientific and economic literatures. In this way, key uncertainties and model differences transparently and consistently inform the range of SCC estimates used in the rulemaking process. This proved challenging since the literature did not always agree on the best path forward. In some cases the group agreed to a range of assumptions to allow for uncertainty analysis (e.g., the inclusion of 5 different socioeconomic scenarios in the Monte Carlo analysis to reflect uncertainty about how future economic and population growth and energy systems will develop over the next 100 years). The four values selected for regulatory analysis included three estimates based on the average SCC from the three integrated assessment models over a range of discount rates, since there is wide disagreement on which to apply in an inter-generational context. The fourth value represents the 95th percentile SCC estimate across all three models at a 3 percent discount rate and is included to represent higher-than-expected impacts from temperature change further out in the tails of the SCC distribution. The interagency working group recognized that there are a number of serious challenges in attempting to assess the incremental economic impacts of CO2 emissions. A recent report from the National Academies of Science (NRC 2009) points out that any assessment will suffer from uncertainty, speculation, and lack of information about future emissions of greenhouse gases, the effects of emissions on the climate system, the impact of changes in climate on the physical and biological environment, and the translation of these environmental impacts into economic damages. In addition to highlighting key areas in need of future research in the interagency technical document, the Department of Energy and the Environmental Protection Agency are planning two workshops that will bring the very best climate modelers from the scientific and economic community together to discuss key gaps in the literature and how to improve current modeling capabilities. I will briefly summarize the main outcomes from the first of the two workshops, scheduled to occur in November 2010.

  7. Testing Map Features Designed to Convey the Uncertainty of Cancer Risk: Insights Gained From Assessing Judgments of Information Adequacy and Communication Goals

    PubMed Central

    Severtson, Dolores J.

    2015-01-01

    Barriers to communicating the uncertainty of environmental health risks include preferences for certain information and low numeracy. Map features designed to communicate the magnitude and uncertainty of estimated cancer risk from air pollution were tested among 826 participants to assess how map features influenced judgments of adequacy and the intended communication goals. An uncertain versus certain visual feature was judged as less adequate but met both communication goals and addressed numeracy barriers. Expressing relative risk using words communicated uncertainty and addressed numeracy barriers but was judged as highly inadequate. Risk communication and visual cognition concepts were applied to explain findings. PMID:26412960

  8. Testing Map Features Designed to Convey the Uncertainty of Cancer Risk: Insights Gained From Assessing Judgments of Information Adequacy and Communication Goals.

    PubMed

    Severtson, Dolores J

    2015-02-01

    Barriers to communicating the uncertainty of environmental health risks include preferences for certain information and low numeracy. Map features designed to communicate the magnitude and uncertainty of estimated cancer risk from air pollution were tested among 826 participants to assess how map features influenced judgments of adequacy and the intended communication goals. An uncertain versus certain visual feature was judged as less adequate but met both communication goals and addressed numeracy barriers. Expressing relative risk using words communicated uncertainty and addressed numeracy barriers but was judged as highly inadequate. Risk communication and visual cognition concepts were applied to explain findings.

  9. Development of a software framework for data assimilation and its applications for streamflow forecasting in Japan

    NASA Astrophysics Data System (ADS)

    Noh, S. J.; Tachikawa, Y.; Shiiba, M.; Yorozu, K.; Kim, S.

    2012-04-01

    Data assimilation methods have received increased attention as a means to accomplish uncertainty assessment and to enhance forecasting capability in various areas. Despite their potential, software frameworks applicable to probabilistic approaches and data assimilation are still limited because most hydrologic modeling software is based on a deterministic approach. In this study, we developed a hydrological modeling framework for sequential data assimilation, the so-called MPI-OHyMoS. MPI-OHyMoS allows users to develop their own element models and to easily build a total simulation system model for hydrological simulations. Unlike a process-based modeling framework, this software framework benefits from its object-oriented design to flexibly represent hydrological processes without any change to the main library. Sequential data assimilation based on particle filters is available for any hydrologic model built on MPI-OHyMoS, considering various sources of uncertainty originating from input forcing, parameters and observations. The particle filters implement a Bayesian learning process in which the propagation of all uncertainties is carried out by a suitable selection of randomly generated particles without any assumptions about the nature of the distributions. In MPI-OHyMoS, ensemble simulations are parallelized, which can take advantage of high-performance computing (HPC) systems. We applied this software framework to short-term streamflow forecasting in several catchments in Japan using a distributed hydrologic model. Uncertainty in model parameters and in remotely sensed rainfall data such as X-band or C-band radar is estimated and mitigated in the sequential data assimilation.
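    The assimilation loop described above reduces, in its simplest form, to sequential importance resampling; the sketch below runs it on a one-state toy storage model. The dynamics, noise levels and observation error are illustrative assumptions, not MPI-OHyMoS code.

```python
# Minimal sketch: sequential importance resampling (SIR) particle filter on a
# toy storage model with synthetic rainfall and noisy discharge observations.
import numpy as np

rng = np.random.default_rng(7)
n_part, n_steps, obs_sigma = 500, 100, 0.5

rain = rng.gamma(2.0, 1.0, size=n_steps)
truth = np.zeros(n_steps)
for t in range(1, n_steps):                        # toy storage dynamics
    truth[t] = 0.9 * truth[t - 1] + rain[t]
obs = truth + rng.normal(0.0, obs_sigma, n_steps)  # noisy discharge observations

particles = np.zeros(n_part)
est = np.zeros(n_steps)
for t in range(n_steps):
    # propagate each particle with multiplicative process noise
    particles = 0.9 * particles + rain[t] * rng.lognormal(0.0, 0.2, n_part)
    w = np.exp(-0.5 * ((obs[t] - particles) / obs_sigma) ** 2) + 1e-300
    w /= w.sum()                                   # importance weights
    particles = particles[rng.choice(n_part, size=n_part, p=w)]  # resample
    est[t] = particles.mean()
print(f"RMSE of filtered estimate: {np.sqrt(np.mean((est - truth) ** 2)):.3f}")
```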

  10. Medical errors and uncertainty in primary healthcare: A comparative study of coping strategies among young and experienced GPs

    PubMed Central

    Kuikka, Liisa; Pitkälä, Kaisu

    2014-01-01

    Abstract Objective. To study coping differences between young and experienced GPs in primary care who experience medical errors and uncertainty. Design. Questionnaire-based survey (self-assessment) conducted in 2011. Setting. Finnish primary practice offices in Southern Finland. Subjects. Finnish GPs engaged in primary health care from two different respondent groups: young (working experience ≤ 5 years, n = 85) and experienced (working experience > 5 years, n = 80). Main outcome measures. Outcome measures included the experiences and attitudes expressed by the participants towards medical errors and tolerance of uncertainty, their coping strategies, and factors that may influence (positively or negatively) sources of errors. Results. In total, 165/244 GPs responded (response rate: 68%). Young GPs expressed significantly more often fear of committing a medical error (70.2% vs. 48.1%, p = 0.004) and admitted more often than experienced GPs that they had committed a medical error during the past year (83.5% vs. 68.8%, p = 0.026). Young GPs were less prone to apologize to a patient for an error (44.7% vs. 65.0%, p = 0.009) and found, more often than their more experienced colleagues, on-site consultations and electronic databases useful for avoiding mistakes. Conclusion. Experienced GPs seem to tolerate uncertainty better and to fear medical errors less than their young colleagues. Young and more experienced GPs use different coping strategies for dealing with medical errors. Implications. When GPs become more experienced, they seem to get better at coping with medical errors. Means to support these skills should be studied in future research. PMID:24914458

  11. Greenhouse gas network design using backward Lagrangian particle dispersion modelling - Part 1: Methodology and Australian test case

    NASA Astrophysics Data System (ADS)

    Ziehn, T.; Nickless, A.; Rayner, P. J.; Law, R. M.; Roff, G.; Fraser, P.

    2014-03-01

    This paper describes the generation of optimal atmospheric measurement networks for determining carbon dioxide fluxes over Australia using inverse methods. A Lagrangian particle dispersion model is used in reverse mode together with a Bayesian inverse modelling framework to calculate the relationship between weekly surface fluxes and hourly concentration observations for the Australian continent. Meteorological driving fields are provided by the regional version of the Australian Community Climate and Earth System Simulator (ACCESS) at 12 km resolution at an hourly time scale. Prior uncertainties are derived on a weekly time scale for biosphere fluxes and fossil fuel emissions from high resolution BIOS2 model runs and from the Fossil Fuel Data Assimilation System (FFDAS), respectively. The influence from outside the modelled domain is investigated, but proves to be negligible for the network design. Existing ground based measurement stations in Australia are assessed in terms of their ability to constrain local flux estimates from the land. We find that the six stations that are currently operational are already able to reduce the uncertainties on surface flux estimates by about 30%. A candidate list of 59 stations is generated based on logistic constraints and an incremental optimization scheme is used to extend the network of existing stations. In order to achieve an uncertainty reduction of about 50% we need to double the number of measurement stations in Australia. Assuming equal data uncertainties for all sites, new stations would be mainly located in the northern and eastern part of the continent.

  12. Greenhouse gas network design using backward Lagrangian particle dispersion modelling - Part 1: Methodology and Australian test case

    NASA Astrophysics Data System (ADS)

    Ziehn, T.; Nickless, A.; Rayner, P. J.; Law, R. M.; Roff, G.; Fraser, P.

    2014-09-01

    This paper describes the generation of optimal atmospheric measurement networks for determining carbon dioxide fluxes over Australia using inverse methods. A Lagrangian particle dispersion model is used in reverse mode together with a Bayesian inverse modelling framework to calculate the relationship between weekly surface fluxes, comprising contributions from the biosphere and fossil fuel combustion, and hourly concentration observations for the Australian continent. Meteorological driving fields are provided by the regional version of the Australian Community Climate and Earth System Simulator (ACCESS) at 12 km resolution at an hourly timescale. Prior uncertainties are derived on a weekly timescale for biosphere fluxes and fossil fuel emissions from high-resolution model runs using the Community Atmosphere Biosphere Land Exchange (CABLE) model and the Fossil Fuel Data Assimilation System (FFDAS) respectively. The influence from outside the modelled domain is investigated, but proves to be negligible for the network design. Existing ground-based measurement stations in Australia are assessed in terms of their ability to constrain local flux estimates from the land. We find that the six stations that are currently operational are already able to reduce the uncertainties on surface flux estimates by about 30%. A candidate list of 59 stations is generated based on logistic constraints and an incremental optimisation scheme is used to extend the network of existing stations. In order to achieve an uncertainty reduction of about 50%, we need to double the number of measurement stations in Australia. Assuming equal data uncertainties for all sites, new stations would be mainly located in the northern and eastern part of the continent.
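    The incremental optimisation idea can be sketched in a Bayesian linear inverse setting: each candidate station contributes rows to the Jacobian, and stations are added greedily so as to minimise the trace of the posterior flux covariance. The random "footprints", dimensions and error levels below are invented placeholders for transport-model output.

```python
# Minimal sketch: greedy network extension for a Bayesian linear inverse
# problem; posterior covariance A^-1 with A = B^-1 + sum_s H_s^T R^-1 H_s.
import numpy as np

rng = np.random.default_rng(3)
n_flux, n_candidates = 40, 15
B = np.eye(n_flux)                                    # prior flux covariance
H_all = [rng.normal(0.0, 1.0, size=(24, n_flux))      # per-station "footprints"
         for _ in range(n_candidates)]
R_inv = np.eye(24) / 0.5 ** 2                         # data precision per station

def posterior_trace(stations):
    A = np.linalg.inv(B)
    for s in stations:
        A += H_all[s].T @ R_inv @ H_all[s]
    return np.trace(np.linalg.inv(A))

network, remaining = [], set(range(n_candidates))
for _ in range(5):                                    # add 5 stations greedily
    best = min(remaining, key=lambda s: posterior_trace(network + [s]))
    network.append(best)
    remaining.discard(best)
    red = 1.0 - posterior_trace(network) / np.trace(B)
    print(f"added station {best}: uncertainty reduction {red:.0%}")
```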

  13. Flood risk and adaptation strategies under climate change and urban expansion: A probabilistic analysis using global data.

    PubMed

    Muis, Sanne; Güneralp, Burak; Jongman, Brenden; Aerts, Jeroen C J H; Ward, Philip J

    2015-12-15

    An accurate understanding of flood risk and its drivers is crucial for effective risk management. Detailed risk projections, including uncertainties, are however rarely available, particularly in developing countries. This paper presents a method that integrates recent advances in global-scale modeling of flood hazard and land change, which enables the probabilistic analysis of future trends in national-scale flood risk. We demonstrate its application to Indonesia. We develop 1000 spatially-explicit projections of urban expansion from 2000 to 2030 that account for uncertainty associated with population and economic growth projections, as well as uncertainty in where urban land change may occur. The projections show that the urban extent increases by 215%-357% (5th and 95th percentiles). Urban expansion is particularly rapid on Java, which accounts for 79% of the national increase. From 2000 to 2030, increases in exposure will elevate flood risk by, on average, 76% and 120% for river and coastal floods. While sea level rise will further increase the exposure-induced trend by 19%-37%, the response of river floods to climate change is highly uncertain. However, as urban expansion is the main driver of future risk, the implementation of adaptation measures is increasingly urgent, regardless of the wide uncertainty in climate projections. Using probabilistic urban projections, we show that spatial planning can be a very effective adaptation strategy. Our study emphasizes that global data can be used successfully for probabilistic risk assessment in data-scarce countries.

  14. Assessing uncertainty in ecological systems using global sensitivity analyses: a case example of simulated wolf reintroduction effects on elk

    USGS Publications Warehouse

    Fieberg, J.; Jenkins, Kurt J.

    2005-01-01

    Often landmark conservation decisions are made despite an incomplete knowledge of system behavior and inexact predictions of how complex ecosystems will respond to management actions. For example, predicting the feasibility and likely effects of restoring top-level carnivores such as the gray wolf (Canis lupus) to North American wilderness areas is hampered by incomplete knowledge of the predator-prey system processes and properties. In such cases, global sensitivity measures, such as Sobol' indices, allow one to quantify the effect of these uncertainties on model predictions. Sobol' indices are calculated by decomposing the variance in model predictions (due to parameter uncertainty) into main effects of model parameters and their higher order interactions. Model parameters with large sensitivity indices can then be identified for further study in order to improve predictive capabilities. Here, we illustrate the use of Sobol' sensitivity indices to examine the effect of parameter uncertainty on the predicted decline of elk (Cervus elaphus) population sizes following a hypothetical reintroduction of wolves to Olympic National Park, Washington, USA. The strength of density dependence acting on survival of adult elk and the magnitude of predation were the most influential factors controlling elk population size following a simulated wolf reintroduction. In particular, the form of density dependence in natural survival rates and the per-capita predation rate together accounted for over 90% of variation in simulated elk population trends. Additional research on wolf predation rates on elk and natural compensations in prey populations is needed to reliably predict the outcome of predator-prey system behavior following wolf reintroductions.
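    Variance-based decomposition of this kind is readily reproduced with the SALib package; the sketch below computes first-order and total-order Sobol' indices for a hypothetical three-parameter elk-decline response, which stands in for the actual simulation model.

```python
# Minimal sketch: Sobol' sensitivity indices with SALib on an invented
# three-parameter response surface (not the published elk model).
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["density_dependence", "predation_rate", "fecundity"],
    "bounds": [[0.1, 0.9], [0.0, 0.15], [0.2, 0.5]],
}

X = saltelli.sample(problem, 1024)            # Saltelli cross-sampling scheme

def elk_decline(p):
    dd, pred, fec = p
    return 100.0 * pred / (dd * fec + 0.05)   # hypothetical response surface

Y = np.apply_along_axis(elk_decline, 1, X)
Si = sobol.analyze(problem, Y)                # variance decomposition
print("first-order S1:", dict(zip(problem["names"], Si["S1"].round(2))))
print("total-order ST:", dict(zip(problem["names"], Si["ST"].round(2))))
```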

  15. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    NASA Technical Reports Server (NTRS)

    Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, this calculation approach produces an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from exercising this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a substantial number of events are downgraded from, or upgraded to, a serious risk designation on the basis of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.
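    The essence of the method, producing a Pc distribution rather than a point value, can be sketched by Monte Carlo: sample the uncertain inputs (here a covariance scale factor and the hard-body radius) and compute a two-dimensional encounter-plane Pc for each draw. All geometry and numbers below are illustrative, not the CARA algorithm.

```python
# Minimal sketch: a Pc probability density via Monte Carlo over uncertain
# inputs; inner Pc is the fraction of relative-position draws inside the
# hard-body-radius disk in the encounter plane.
import numpy as np

rng = np.random.default_rng(11)
mu = np.array([120.0, 80.0])                     # nominal miss vector (m)
C_nom = np.array([[200.0 ** 2, 0.3 * 200 * 150],
                  [0.3 * 200 * 150, 150.0 ** 2]])  # nominal combined covariance

def pc(mu, C, hbr, n=100_000):
    """Monte Carlo Pc for one set of inputs."""
    xy = rng.multivariate_normal(mu, C, size=n)
    return np.mean(np.hypot(xy[:, 0], xy[:, 1]) < hbr)

pcs = []
for _ in range(300):
    scale = rng.lognormal(0.0, 0.4)              # uncertain covariance realism
    hbr = rng.uniform(15.0, 25.0)                # uncertain hard-body radius (m)
    pcs.append(pc(mu, scale * C_nom, hbr))
pcs = np.array(pcs)
print(f"median Pc {np.median(pcs):.1e}, "
      f"5-95% [{np.percentile(pcs, 5):.1e}, {np.percentile(pcs, 95):.1e}]")
```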

  16. Assessing performance of flaw characterization methods through uncertainty propagation

    NASA Astrophysics Data System (ADS)

    Miorelli, R.; Le Bourdais, F.; Artusi, X.

    2018-04-01

    In this work, we assess the inversion performance in terms of crack characterization and localization based on synthetic signals associated with ultrasonic and eddy current physics. More precisely, two different standard iterative inversion algorithms are used to minimize the discrepancy between measurements (i.e., the tested data) and simulations. Furthermore, in order to speed up the computation and get rid of the computational burden often associated with iterative inversion algorithms, we replace the standard forward solver by a suitable metamodel fitted to a database built offline. In a second step, we assess the inversion performance by adding uncertainties on a subset of the database parameters and then, through the metamodel, we propagate these uncertainties within the inversion procedure. The fast propagation of uncertainties enables efficiently evaluating the impact due to the lack of knowledge on some parameters employed to describe the inspection scenarios, which is a situation commonly encountered in the industrial NDE context.
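    A minimal sketch of surrogate-assisted inversion with uncertainty propagation, assuming a toy forward model in place of a full ultrasonic/eddy-current solver: a Gaussian-process metamodel is fitted to an offline database, a scalar crack length is inverted by least squares, and an uncertain nuisance parameter (a hypothetical lift-off) is propagated by repeating the inversion.

```python
# Minimal sketch: metamodel replaces the forward solver; uncertainty on a
# nuisance parameter is propagated by repeated inversion.
import numpy as np
from scipy.optimize import minimize_scalar
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(5)

def forward(length, liftoff):                 # toy stand-in for a FEM/BEM solver
    return length ** 1.5 * np.exp(-2.0 * liftoff)

# offline database and metamodel fit
X = rng.uniform([1.0, 0.0], [10.0, 0.5], size=(300, 2))
y = forward(X[:, 0], X[:, 1])
surrogate = GaussianProcessRegressor(kernel=1.0 * RBF([2.0, 0.1]),
                                     alpha=1e-6).fit(X, y)

measured = forward(6.0, 0.20) * (1.0 + 0.02 * rng.standard_normal())

def invert(liftoff):
    """Least-squares crack length given an assumed lift-off value."""
    cost = lambda L: (surrogate.predict([[L, liftoff]])[0] - measured) ** 2
    return minimize_scalar(cost, bounds=(1.0, 10.0), method="bounded").x

# propagate lift-off uncertainty through the inversion
estimates = [invert(l) for l in rng.normal(0.20, 0.02, size=50)]
print(f"crack length: {np.mean(estimates):.2f} +/- {np.std(estimates):.2f}")
```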

  17. Assessment of the Gaussian Covariance Approximation over an Earth-Asteroid Encounter Period

    NASA Technical Reports Server (NTRS)

    Mattern, Daniel

    2017-01-01

    In assessing the risk an asteroid may pose to the Earth, the asteroid's state is often predicted many years, often decades, ahead. Only by accounting for the asteroid's initial state uncertainty can a measure of the risk be calculated. With the asteroid's state uncertainty growing as a function of the initial velocity uncertainty, the orbit velocity at the last state update, and the time from the last update to the epoch of interest, the asteroid's position uncertainties can grow to many times the size of the Earth when propagated to the encounter risk corridor. This paper examines the merits of propagating the asteroid's state covariance as an analytical matrix. The results of this study help to bound the efficacy of applying different metrics for assessing the risk an asteroid poses to the Earth. Additionally, this work identifies a criterion for when different covariance propagation methods are needed to continue predictions after an Earth-encounter period.
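    The two propagation approaches compared in such studies can be sketched side by side: an analytical linear mapping P1 = F P0 F^T of the covariance through a state transition matrix, versus Monte Carlo propagation through the full nonlinear dynamics. The toy dynamics below are illustrative, not orbital mechanics.

```python
# Minimal sketch: linear (STM-based) covariance propagation vs. Monte Carlo
# through a mildly nonlinear map; divergence between the two indicates where
# the Gaussian/linear approximation breaks down.
import numpy as np

rng = np.random.default_rng(2)

def propagate(x, dt=1.0):
    """Toy nonlinear dynamics: position advances with velocity, velocity
    decays nonlinearly with distance (stand-in for orbital dynamics)."""
    r, v = x
    return np.array([r + v * dt, v - 0.001 * r ** 2 * dt])

x0 = np.array([10.0, 1.0])
P0 = np.diag([0.1 ** 2, 0.05 ** 2])

# finite-difference state transition matrix around the nominal state
eps = 1e-6
F = np.column_stack([(propagate(x0 + eps * e) - propagate(x0 - eps * e)) / (2 * eps)
                     for e in np.eye(2)])
P_lin = F @ P0 @ F.T                              # analytic linear mapping

# Monte Carlo propagation of the same initial uncertainty
samples = rng.multivariate_normal(x0, P0, size=20000)
prop = np.array([propagate(s) for s in samples])
P_mc = np.cov(prop.T)
print("linear:\n", P_lin.round(5), "\nMonte Carlo:\n", P_mc.round(5))
```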

  18. Quantifying uncertainty in Bayesian calibrated animal-to-human PBPK models with informative prior distributions

    EPA Science Inventory

    Understanding and quantifying the uncertainty of model parameters and predictions has gained more interest in recent years with the increased use of computational models in chemical risk assessment. Fully characterizing the uncertainty in risk metrics derived from linked quantita...

  19. Uncertainty Modeling of Pollutant Transport in Atmosphere and Aquatic Route Using Soft Computing

    NASA Astrophysics Data System (ADS)

    Datta, D.

    2010-10-01

    Hazardous radionuclides are released as pollutants into the atmospheric and aquatic environment (ATAQE) during the normal operation of nuclear power plants. Atmospheric and aquatic dispersion models are routinely used to assess the impact of radionuclide releases from any nuclear facility, or of hazardous chemicals from any chemical plant, on the ATAQE. The effect of exposure to the hazardous nuclides or chemicals is measured in terms of risk. Uncertainty modeling is an integral part of the risk assessment. The paper focuses on uncertainty modeling of pollutant transport in the atmospheric and aquatic environment using soft computing. Soft computing is employed due to the lack of information on the parameters of the corresponding models. Soft computing in this domain basically addresses the use of fuzzy set theory to explore the uncertainty of the model parameters; such uncertainty is called epistemic uncertainty. Each uncertain input parameter of the model is described by a triangular membership function.
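    Triangular membership functions propagate naturally through alpha-cuts and interval arithmetic; the sketch below pushes two fuzzy parameters through a Gaussian-plume-like expression. The formula, distances and fuzzy numbers are illustrative assumptions, not the paper's dispersion model.

```python
# Minimal sketch: alpha-cut propagation of triangular fuzzy numbers through a
# simple dispersion-like formula, giving a fuzzy concentration interval.
import numpy as np

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at membership level alpha."""
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

Q = (0.8, 1.0, 1.2)          # fuzzy release rate
sigma = (40.0, 50.0, 70.0)   # fuzzy dispersion parameter

def conc(q, s, x=100.0, u=3.0):
    """Illustrative plume-like concentration at downwind distance x."""
    return q / (2.0 * np.pi * s ** 2 * u) * np.exp(-x ** 2 / (2.0 * s ** 2))

for alpha in (0.0, 0.5, 1.0):
    q_lo, q_hi = alpha_cut(Q, alpha)
    s_lo, s_hi = alpha_cut(sigma, alpha)
    # grid search over the cut because conc is not monotone in sigma
    qs = np.linspace(q_lo, q_hi, 20)
    ss = np.linspace(s_lo, s_hi, 20)
    vals = conc(qs[:, None], ss[None, :])
    print(f"alpha={alpha:.1f}: concentration in "
          f"[{vals.min():.2e}, {vals.max():.2e}]")
```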

  20. The role of risk perception in making flood risk management more effective

    NASA Astrophysics Data System (ADS)

    Buchecker, M.; Salvini, G.; Di Baldassarre, G.; Semenzin, E.; Maidl, E.; Marcomini, A.

    2013-11-01

    Over the last few decades, Europe has suffered from a number of severe flood events and, as a result, there has been a growing interest in probing alternative approaches to managing flood risk via prevention measures. A literature review reveals that, although in the last decades risk evaluation has been recognized as a key element of risk management, and risk assessment methodologies (including risk analysis and evaluation) have been improved by including social, economic, cultural, historical and political conditions, the theoretical schemes are not yet applied in practice. One main reason for this shortcoming is that the risk perception literature is mainly universal and theoretical in nature and cannot provide the necessary details to implement a comprehensive risk evaluation. This paper therefore aims to explore a procedure that allows the inclusion of stakeholders' perceptions of prevention measures in risk assessment. It proposes to adopt methods of risk communication (both one-way and two-way communication) in risk assessment with the final aim of making flood risk management more effective. The proposed procedure not only focuses on the effect of discursive risk communication on risk perception, and on achieving a shared assessment of the prevention alternatives, but also considers the effects of the communication process on perceived uncertainties, accepted risk levels, and trust in the managing institutions. The effectiveness of this combined procedure has been studied and illustrated using the example of the participatory flood prevention assessment process on the Sihl River in Zurich, Switzerland. The main findings of the case study suggest that the proposed procedure performed well, but that it needs some adaptations for it to be applicable in different contexts and to allow a (semi-) quantitative estimation of risk perception to be used as an indicator of adaptive capacity.

  1. Strengthening Agricultural Decisions in Countries at Risk of Food Insecurity: The GEOGLAM Crop Monitor for Early Warning

    NASA Astrophysics Data System (ADS)

    Becker-Reshef, I.; Barker, B.; McGaughey, K.; Humber, M. L.; Sanchez, A.; Justice, C. O.; Rembold, F.; Verdin, J. P.

    2016-12-01

    Timely, reliable information on crop conditions and prospects at the subnational scale is critical for making informed policy and agricultural decisions for ensuring food security, particularly for the most vulnerable countries. However, such information is often incomplete or lacking. As such, the Crop Monitor for Early Warning (CM for EW) was developed with the goal of reducing uncertainty and strengthening decision support by providing actionable information on a monthly basis to national, regional and global food security agencies through timely consensus assessments of crop conditions. This information is especially critical in recent years, given the extreme weather conditions impacting food supplies, including the most recent El Nino event. This initiative brings together the main international food security monitoring agencies and organizations to develop monthly crop assessments based on satellite observations, meteorological information, field observations and ground reports, which reflect an international consensus. This activity grew out of the successful Crop Monitor for the G20 Agricultural Market Information System (AMIS), which provides operational monthly crop assessments of the main producing countries of the world. The CM for EW was launched in February 2016 and has already become a trusted source of information internationally and regionally. Its assessments have been featured in a large number of news articles, reports, and press releases, including a joint statement by USAID's FEWS NET, the UN World Food Programme, the European Commission Joint Research Centre, and the UN Food and Agriculture Organization on the devastating impacts of the southern African drought due to El Nino. One of the main priorities for this activity going forward is to expand its partnership with regional and national monitoring agencies and strengthen capacity for national crop condition assessments.

  2. Theoretical uncertainties on the radius of low- and very-low-mass stars

    NASA Astrophysics Data System (ADS)

    Tognelli, E.; Prada Moroni, P. G.; Degl'Innocenti, S.

    2018-05-01

    We performed an analysis of the main theoretical uncertainties that affect the radius of low- and very-low-mass stars predicted by current stellar models. We focused on stars in the mass range 0.1-1 M⊙, on the zero-age main sequence (ZAMS) and on 1, 2, and 5 Gyr isochrones. First, we quantified the impact on the radius of the uncertainties in several input quantities, namely the equation of state, radiative opacity, atmospheric models, convection efficiency, and initial chemical composition. Then, we computed the cumulative radius error stripe obtained by adding the radius variation due to all the analysed quantities. As a general trend, the radius uncertainty increases with the stellar mass. For ZAMS structures the cumulative error stripe of very-low-mass stars is about ±2 and ±3 per cent, while at larger masses it increases up to ±4 and ±5 per cent. The radius uncertainty gets larger and age dependent if isochrones are considered, reaching for M ˜ 1 M⊙ about +12(-15) per cent at an age of 5 Gyr. We also investigated the radius uncertainty at a fixed luminosity. In this case, the cumulative error stripe is the same for both ZAMS and isochrone models and it ranges from about ±4 to +7 and +9(-5) per cent. We also showed that the sole uncertainty on the chemical composition plays an important role in determining the radius error stripe, producing a radius variation that ranges between about ±1 and ±2 per cent on ZAMS models with fixed mass and about ±3 and ±5 per cent at a fixed luminosity.

  3. A model-averaging method for assessing groundwater conceptual model uncertainty.

    PubMed

    Ye, Ming; Pohlmann, Karl F; Chapman, Jenny B; Pohll, Greg M; Reeves, Donald M

    2010-01-01

    This study evaluates alternative groundwater models with different recharge and geologic components at the northern Yucca Flat area of the Death Valley Regional Flow System (DVRFS), USA. Recharge over the DVRFS has been estimated using five methods, and five geological interpretations are available at the northern Yucca Flat area. Combining the recharge and geological components together with additional modeling components that represent other hydrogeological conditions yields a total of 25 groundwater flow models. As all the models are plausible given available data and information, evaluating model uncertainty becomes inevitable. On the other hand, hydraulic parameters (e.g., hydraulic conductivity) are uncertain in each model, giving rise to parametric uncertainty. Propagation of the uncertainty in the models and model parameters through groundwater modeling causes uncertainty in model predictions (e.g., hydraulic head and flow). Parametric uncertainty within each model is assessed using Monte Carlo simulation, and model uncertainty is evaluated using the model averaging method. Two model-averaging techniques (based on information criteria and on GLUE) are discussed. This study shows that the contribution of model uncertainty to predictive uncertainty is significantly larger than that of parametric uncertainty. For the recharge and geological components, uncertainty in the geological interpretations has a more significant effect on model predictions than uncertainty in the recharge estimates. In addition, weighted residuals vary more between the different geological models than between different recharge models. Most of the calibrated observations are not important for discriminating between the alternative models, because their weighted residuals vary only slightly from one model to another.
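    The model-averaging arithmetic itself is compact; the sketch below uses hypothetical information-criterion scores to weight four alternative models and splits the predictive variance into within-model (parametric) and between-model (conceptual) terms, mirroring the comparison made in the study. All numbers are invented.

```python
# Minimal sketch: information-criterion model averaging; comparing the
# between-model and within-model variance terms shows which uncertainty
# (conceptual vs. parametric) dominates.
import numpy as np

# hypothetical calibrated results for 4 alternative conceptual models
aic  = np.array([112.3, 110.1, 118.7, 111.5])   # information-criterion scores
pred = np.array([52.0, 48.5, 60.2, 50.1])       # e.g. predicted head (m)
var  = np.array([4.0, 3.2, 6.5, 3.8])           # within-model (parametric) variance

delta = aic - aic.min()
w = np.exp(-0.5 * delta)
w /= w.sum()                                    # posterior model weights

mean = np.sum(w * pred)
between = np.sum(w * (pred - mean) ** 2)        # conceptual (model) uncertainty
within = np.sum(w * var)                        # parametric uncertainty
print(f"averaged prediction {mean:.1f} m, "
      f"between-model var {between:.2f}, within-model var {within:.2f}")
```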

  4. Diagnosing and Assessing Uncertainties of the Carbon Cycle in Terrestrial Ecosystem Models from a Multi-Model Ensemble Experiment

    NASA Astrophysics Data System (ADS)

    Wang, W.; Dungan, J. L.; Hashimoto, H.; Michaelis, A.; Milesi, C.; Ichii, K.; Nemani, R. R.

    2009-12-01

    We are conducting an ensemble modeling exercise using the Terrestrial Observation and Prediction System (TOPS) to characterize structural uncertainty in carbon flux and stock estimates from different ecosystem models. The experiment uses public-domain versions of Biome-BGC, LPJ, TOPS-BGC, and CASA, driven by a consistent set of climate fields for North America at 8 km resolution and daily/monthly time steps over the period 1982-2006. A set of diagnostics is developed to characterize the behavior of the models in the climate (temperature-precipitation) space, and to evaluate the simulated carbon cycle in an integrated way. The key findings of this study include the following: (relative) optimal primary production is generally found in climate regions where the relationship between annual temperature (T, °C) and precipitation (P, mm) is defined by P = 50*T + 500; the ratios between NPP and GPP are close to 50% on average, yet can vary between models and across climate regions; the allocation of carbon to leaf growth represents a positive feedback to primary production, and different approaches to constraining this process have significant impacts on the simulated carbon cycle; substantial differences in biomass stocks may be induced by small differences in the tissue turnover rate and the plant mortality; the mean residence time of soil carbon pools is strongly influenced by schemes of temperature regulation; non-respiratory disturbances (e.g., fires) are the main driver of NEP, yet their magnitudes vary between models. Overall, these findings indicate that although the structures of the models are similar, the uncertainties among them can be large, highlighting the problem inherent in relying on only one modeling approach to map surface carbon fluxes or to assess vegetation-climate interactions.

  5. Smart climate ensemble exploring approaches: the example of climate impacts on air pollution in Europe.

    NASA Astrophysics Data System (ADS)

    Lemaire, Vincent; Colette, Augustin; Menut, Laurent

    2016-04-01

    Because of its sensitivity to weather patterns, climate change will have an impact on air pollution, so that, in the future, a climate penalty could jeopardize the expected efficiency of air pollution mitigation measures. A common method to assess the impact of climate on air quality consists in implementing chemistry-transport models forced by climate projections. However, at present, such impact assessments lack multi-model ensemble approaches to address uncertainties because of the substantial computing cost. Therefore, as a preliminary step towards exploring large climate ensembles with air quality models, we developed an ensemble exploration technique in order to point out the climate models that should be investigated in priority. Using a training dataset from a deterministic projection of climate and air quality over Europe, we identified the main meteorological drivers of air quality for 8 regions in Europe and developed statistical models that could be used to estimate future air pollutant concentrations. Applying this statistical model to the whole EuroCordex ensemble of climate projections, we find a climate penalty for six subregions out of eight (Eastern Europe, France, Iberian Peninsula, Mid Europe and Northern Italy). On the contrary, a climate benefit for PM2.5 was identified for three regions (Eastern Europe, Mid Europe and Northern Italy). The uncertainty of this statistical model limits, however, the confidence we can attribute to the associated quantitative projections. The technique does allow selecting a subset of relevant regional climate model members that should be used in priority for future deterministic projections, so as to provide adequate coverage of the uncertainties. We are thereby proposing a smart ensemble exploration strategy that can also be used for other impact studies beyond air quality.
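    The emulation step can be sketched in a few lines: fit a statistical model linking meteorological drivers to a pollutant concentration in one training projection, then apply it across ensemble members. The variables, coefficients and linear form below are invented placeholders, not the drivers identified in the study.

```python
# Minimal sketch: a statistical emulator trained on one deterministic
# climate/air-quality run and applied across (synthetic) ensemble members.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(8)
# hypothetical daily meteorological drivers from the training projection
T = rng.normal(15.0, 5.0, 3650)            # temperature (deg C)
blh = rng.gamma(3.0, 300.0, 3650)          # boundary-layer height (m)
wind = rng.gamma(2.0, 2.0, 3650)           # wind speed (m/s)
o3 = 40.0 + 1.5 * T - 0.004 * blh - 0.8 * wind + rng.normal(0.0, 3.0, 3650)

X = np.column_stack([T, blh, wind])
emulator = LinearRegression().fit(X, o3)   # statistical model of the drivers

# apply the emulator to each member of a hypothetical climate ensemble
for member in range(3):
    Xm = X + rng.normal([1.0, -20.0, 0.1], [0.5, 10.0, 0.1], size=X.shape)
    change = emulator.predict(Xm).mean() - o3.mean()
    print(f"member {member}: mean O3 change {change:+.1f}")
```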

  6. Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  7. PROBABILISTIC RISK ANALYSIS OF RADIOACTIVE WASTE DISPOSALS - a case study

    NASA Astrophysics Data System (ADS)

    Trinchero, P.; Delos, A.; Tartakovsky, D. M.; Fernandez-Garcia, D.; Bolster, D.; Dentz, M.; Sanchez-Vila, X.; Molinero, J.

    2009-12-01

    The storage of contaminant material in superficial or sub-superficial repositories, such as tailing piles for mine waste or disposal sites for low- and intermediate-level nuclear waste, poses a potential threat to the surrounding biosphere. These risks can be minimized by supporting decision-makers with quantitative tools capable of incorporating all sources of uncertainty within a rigorous probabilistic framework. A case study is presented where we assess the risks associated with the superficial storage of hazardous waste close to a populated area. The intrinsic complexity of the problem, involving many events with different spatial and time scales and many uncertain parameters, is overcome by using a formal PRA (probabilistic risk assessment) procedure that allows decomposing the system into a number of key events. Hence, the failure of the system is directly linked to the potential contamination of one of the three main receptors: the underlying karst aquifer, a superficial stream that flows near the storage piles, and a protection area surrounding a number of wells used for water supply. The minimal cut sets leading to the failure of the system are obtained by defining a fault tree that incorporates different events, including the failure of the engineered system (e.g., the cover of the piles) and the failure of the geological barrier (e.g., the clay layer that separates the bottom of the pile from the karst formation). Finally, the probability of failure is quantitatively assessed by combining individual independent or conditional probabilities that are computed numerically or borrowed from reliability databases.
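    The fault-tree arithmetic reduces to combining minimal cut sets; the sketch below assumes hypothetical, independent basic events and two invented cut sets for the aquifer and stream receptors, not the case study's actual event tree.

```python
# Minimal sketch: system failure probability from minimal cut sets, assuming
# independent basic events. All events and probabilities are hypothetical.
import math

basic = {
    "cover_fails": 1e-2,    # engineered cover of the pile fails
    "clay_breach": 5e-3,    # geological clay barrier is breached
    "karst_path": 2e-2,     # preferential pathway to the karst aquifer exists
    "levee_fails": 1e-3,    # protection towards the nearby stream fails
}

# each minimal cut set is a conjunction of basic events
cut_sets = [
    ("cover_fails", "clay_breach", "karst_path"),  # aquifer contamination path
    ("cover_fails", "levee_fails"),                # stream contamination path
]

p_cut = [math.prod(basic[e] for e in cs) for cs in cut_sets]
p_top = 1.0 - math.prod(1.0 - p for p in p_cut)    # union of independent cut sets
print(f"cut-set probabilities: {p_cut}")
print(f"system failure probability: {p_top:.2e} "
      f"(rare-event approximation: {sum(p_cut):.2e})")
```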

  8. DEVELOPMENTS AT U.S. EPA IN ADDRESSING UNCERTAINTY IN RISK ASSESSMENT

    EPA Science Inventory

    An emerging trend in risk assessment is to be more explicit about uncertainties, both during the analytical procedures and in communicating results. In February 1992, then-Deputy EPA Administrator Henry Habicht set out Agency goals in a memorandum stating that the Agency will "p...

  9. Lived Experiences of "Illness Uncertainty" of Iranian Cancer Patients: A Phenomenological Hermeneutic Study.

    PubMed

    Sajjadi, Moosa; Rassouli, Maryam; Abbaszadeh, Abbas; Brant, Jeannine; Majd, Hamid Alavi

    2016-01-01

    For cancer patients, uncertainty is a pervasive experience and a major psychological stressor that affects many aspects of their lives. Uncertainty is a multifaceted concept, and how patients understand it depends on many factors, including factors associated with various sociocultural contexts. Unfortunately, little is known about the concept of uncertainty in Iranian society and culture. This study aimed to clarify the concept and explain the lived experiences of illness uncertainty in Iranian cancer patients. In this hermeneutic phenomenological study, 8 cancer patients participated in semistructured in-depth interviews about their experiences of uncertainty in illness. Interviews continued until data saturation was reached. All interviews were recorded, transcribed, analyzed, and interpreted using the 6 stages of the van Manen phenomenological approach. Seven main themes emerged from patients' experiences of the illness uncertainty of cancer. Four themes contributed to uncertainty: "Complexity of Cancer," "Confusion About Cancer," "Contradictory Information," and "Unknown Future." Two themes facilitated coping with uncertainty: "Seeking Knowledge" and "Need for Spiritual Peace." One theme, "Knowledge Ambivalence," revealed the struggle between wanting to know and not wanting to know, especially if bad news was delivered. The experience of uncertainty for cancer patients in different societies is largely similar. However, some experiences (e.g., ambiguity in access to medical resources) seemed unique to Iranian patients. This study provided an overview of cancer patients' experiences of illness uncertainty in Iran. Cancer patients' ability to cope with uncertainty can be improved.

  10. Treatment of uncertainties in the IPCC: a philosophical analysis

    NASA Astrophysics Data System (ADS)

    Jebeile, J.; Drouet, I.

    2014-12-01

    The IPCC produces scientific reports out of findings on climate and climate change. Because the findings are uncertain in many respects, the production of reports requires aggregating assessments of uncertainties of different kinds. This difficult task is currently regulated by the Guidance note for lead authors of the IPCC fifth assessment report on consistent treatment of uncertainties. The note recommends that two metrics, confidence and likelihood, be used for communicating the degree of certainty in findings. Confidence is expressed qualitatively "based on the type, amount, quality, and consistency of evidence […] and the degree of agreement", while likelihood is expressed probabilistically "based on statistical analysis of observations or model results, or expert judgment". Therefore, depending on the evidence evaluated, authors have the choice to present either an assigned level of confidence or a quantified measure of likelihood. But these two kinds of uncertainty assessment express distinct and potentially conflicting methodologies, so the question arises whether the treatment of uncertainties in the IPCC is rationally justified. In order to answer this question, it is worth comparing the IPCC procedures with the formal normative theories of epistemic rationality which have been developed by philosophers. These theories, which include contributions to the philosophy of probability and to Bayesian probabilistic confirmation theory, are relevant for our purpose because they are commonly used to assess the rationality of collective judgement formation based on uncertain knowledge. In this paper we make the comparison and pursue the following objectives: (i) we determine whether the IPCC confidence and likelihood can be compared with the notions of uncertainty targeted by or underlying the formal normative theories of epistemic rationality; (ii) we investigate whether the formal normative theories of epistemic rationality justify treating uncertainty along those two dimensions, and indicate how this can be avoided.

  11. Assessing risk of baleen whale hearing loss from seismic surveys: The effect of uncertainty and individual variation.

    PubMed

    Gedamke, Jason; Gales, Nick; Frydman, Sascha

    2011-01-01

    The potential for seismic airgun "shots" to cause acoustic trauma in marine mammals is poorly understood. There are just two empirical measurements of temporary threshold shift (TTS) onset levels from airgun-like sounds in odontocetes. Considering these limited data, a model was developed examining the impact of individual variability and uncertainty on the risk assessment of baleen whale TTS from seismic surveys. In each of 100 simulations, 10,000 "whales" are assigned TTS onset levels accounting for inter-individual variation, uncertainty over the population's mean, and uncertainty over the weighting of odontocete data used to obtain baleen whale onset levels. Randomly distributed whales are exposed to one seismic survey passage, with the cumulative exposure level calculated. In the base scenario, 29% of whales (5th/95th percentiles of 10%/62%) that approached to within 1-1.2 km were exposed to levels sufficient for TTS onset. By comparison, no whales are at risk outside 0.6 km when uncertainty and variability are not considered. Potentially "exposure altering" parameters (movement, avoidance, surfacing, and effective quiet) were also simulated. Until more research refines the model inputs, the results suggest a reasonable likelihood that whales at a kilometer or more from seismic surveys could be susceptible to TTS, and demonstrate the large impact that uncertainty and variability can have on risk assessment.
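    The nested simulation structure can be sketched as a two-level Monte Carlo: an outer loop draws an uncertain population-mean onset level, an inner population adds inter-individual variation, and whales at 1-1.2 km receive a range-dependent level. The source level, spreading law and threshold statistics below are invented placeholders chosen only to produce a spread comparable in shape to the one reported, not the study's parameterization.

```python
# Minimal sketch: nested Monte Carlo over (a) uncertainty in the population
# mean TTS onset level and (b) inter-individual variation.
import numpy as np

rng = np.random.default_rng(9)
n_sim, n_whales = 100, 10000

def received_level(r_km):
    """Toy cumulative exposure level vs. range (spherical spreading)."""
    return 215.0 - 20.0 * np.log10(np.maximum(r_km, 0.05) * 1000.0)

frac = []
for _ in range(n_sim):
    mean_onset = rng.normal(158.0, 4.0)                # uncertain population mean
    onsets = rng.normal(mean_onset, 5.0, n_whales)     # inter-individual variation
    r = rng.uniform(1.0, 1.2, n_whales)                # whales at 1-1.2 km range
    frac.append(np.mean(received_level(r) >= onsets))  # fraction exceeding onset

print(f"median fraction exceeding onset: {np.median(frac):.0%}, "
      f"5th/95th: {np.percentile(frac, 5):.0%}/{np.percentile(frac, 95):.0%}")
```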

  12. Distribution of uncertainties at the municipality level for flood risk modelling along the river Meuse: implications for policy-making

    NASA Astrophysics Data System (ADS)

    Pirotton, Michel; Stilmant, Frédéric; Erpicum, Sébastien; Dewals, Benjamin; Archambeau, Pierre

    2016-04-01

    Flood risk modelling has been conducted for the whole course of the river Meuse in Belgium. Major cities, such as Liege (200,000 inh.) and Namur (110,000 inh.), are located in the floodplains of river Meuse. Particular attention has been paid to uncertainty analysis and its implications for decision-making. The modelling chain contains flood frequency analysis, detailed 2D hydraulic computations, damage modelling and risk calculation. The relative importance of each source of uncertainty to the overall results uncertainty has been estimated by considering several alternate options for each step of the analysis: different distributions were considered in the flood frequency analysis; the influence of modelling assumptions and boundary conditions (e.g., steady vs. unsteady) were taken into account for the hydraulic computation; two different landuse classifications and two sets of damage functions were used; the number of exceedance probabilities involved in the risk calculation (by integration of the risk-curves) was varied. In addition, the sensitivity of the results with respect to increases in flood discharges was assessed. The considered increases are consistent with a "wet" climate change scenario for the time horizons 2021-2050 and 2071-2100 (Detrembleur et al., 2015). The results of hazard computation differ significantly between the upper and lower parts of the course of river Meuse in Belgium. In the former, inundation extents grow gradually as the considered flood discharge is increased (i.e. the exceedance probability is reduced), while in the downstream part, protection structures (mainly concrete walls) prevent inundation for flood discharges corresponding to exceedance probabilities of 0.01 and above (in the present climate). For higher discharges, large inundation extents are obtained in the floodplains. The highest values of risk (mean annual damage) are obtained in the municipalities which undergo relatively frequent flooding (upper part of the river), as well as in those of the downstream part of the Meuse in which flow depths in the urbanized floodplains are particularly high when inundation occurs. This is the case of the city of Liege, as a result of a subsidence process following former mining activities. For a given climate scenario, the uncertainty ranges affecting flood risk estimates are significant; but not so much that the results for the different municipalities would overlap substantially. Therefore, these uncertainties do not hamper prioritization in terms of allocation of risk reduction measures at the municipality level. In the present climate, the uncertainties arising from flood frequency analysis have a negligible influence in the upper part of the river, while they have a considerable impact on risk modelling in the lower part, where a threshold effect was observed due to the flood protection structures (sudden transition from no inundation to massive flooding when a threshold discharge is exceeded). Varying the number of exceedance probabilities in the integration of the risk curve has different effects for different municipalities; but it does not change the ranking of the municipalities in terms of flood risk. For the other scenarios, damage estimation contributes most to the overall uncertainties. As shown by this study, the magnitude of the uncertainty and its main origin vary in space and in time. This emphasizes the paramount importance of conducting distributed uncertainty analyses. 
    In the considered study area, prioritization of risk reduction measures can therefore be performed reliably despite the modelling uncertainties. Reference: Detrembleur, S., Stilmant, F., Dewals, B., Erpicum, S., Archambeau, P., & Pirotton, M. (2015). Impacts of climate change on future flood damage on the river Meuse, with a distributed uncertainty analysis. Natural Hazards, 77(3), 1533-1549. Acknowledgement: Part of this research was funded through the ARC grant for Concerted Research Actions, financed by the Wallonia-Brussels Federation. It was also supported by the NWE Interreg IVB Program.
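
    As an illustration of the risk-curve integration step described above, the following sketch computes a mean annual damage (expected annual damage, EAD) by integrating hypothetical scenario damages over their exceedance probabilities, and shows how coarsening the set of probabilities changes the estimate; none of the figures are taken from the study.

        import numpy as np

        # Exceedance probabilities of the modelled flood scenarios (per year)
        p = np.array([0.1, 0.04, 0.02, 0.01, 0.004, 0.001])
        # Corresponding damage estimates for one municipality (hypothetical, MEUR)
        damage = np.array([2.0, 8.0, 15.0, 30.0, 55.0, 90.0])

        # EAD is the integral of damage over exceedance probability; integrate
        # with ascending probabilities (trapezoidal rule on the risk curve)
        ead = np.trapz(damage[::-1], p[::-1])
        print(f"Expected annual damage: {ead:.2f} MEUR/a")

        # Effect of the number of exceedance probabilities used in the
        # integration: dropping intermediate scenarios changes the estimate
        idx = [0, 2, 5]  # keep only three scenarios of the reversed arrays
        ead_coarse = np.trapz(damage[::-1][idx], p[::-1][idx])
        print(f"EAD with 3 scenarios only: {ead_coarse:.2f} MEUR/a")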

  13. Risk Assessment of Groundwater Contamination: A Multilevel Fuzzy Comprehensive Evaluation Approach Based on DRASTIC Model

    PubMed Central

    Zhang, Yan; Zhong, Ming

    2013-01-01

    Groundwater contamination is a serious threat to water supply. Risk assessment of groundwater contamination is an effective way to protect the safety of groundwater resources. Groundwater is a complex and fuzzy system with many uncertainties, impacted by a range of geological and hydrological factors. In order to deal with the uncertainty in the risk assessment of groundwater contamination, we propose an approach that integrates the analytic hierarchy process (AHP) with fuzzy comprehensive evaluation. Firstly, the risk factors of groundwater contamination are identified by the source-pathway-receptor-consequence method, and a corresponding index system for risk assessment based on the DRASTIC model is established. Given the complexity of the transitions between possible pollution risks and the uncertainty of the factors involved, AHP is applied to determine the weight of each factor, and fuzzy set theory is adopted to calculate the membership degree of each factor. Finally, a case study is presented to illustrate and test this methodology. It is concluded that the proposed approach integrates the advantages of both AHP and fuzzy comprehensive evaluation, providing a more flexible and reliable way to deal with the linguistic uncertainty and mechanism uncertainty in groundwater contamination without losing important information. PMID:24453883
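
    The two building blocks named above can be sketched as follows, with hypothetical judgement and membership values: AHP weights obtained from the principal eigenvector of a pairwise-comparison matrix (with a consistency check), followed by a fuzzy comprehensive evaluation B = w . R.

        import numpy as np

        # Pairwise comparison of three DRASTIC-type factors (hypothetical judgements)
        A = np.array([[1.0,  3.0, 5.0],
                      [1/3., 1.0, 2.0],
                      [1/5., 1/2., 1.0]])

        # AHP weights: normalized principal eigenvector of A
        eigvals, eigvecs = np.linalg.eig(A)
        w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
        w = w / w.sum()

        # Consistency ratio check (random index RI = 0.58 for n = 3)
        n = A.shape[0]
        ci = (np.max(np.real(eigvals)) - n) / (n - 1)
        assert ci / 0.58 < 0.1, "pairwise judgements are inconsistent"

        # Membership degrees of each factor in the risk grades low/medium/high
        R = np.array([[0.1, 0.3, 0.6],
                      [0.5, 0.4, 0.1],
                      [0.2, 0.6, 0.2]])

        B = w @ R                       # fuzzy comprehensive evaluation vector
        grade = ["low", "medium", "high"][int(np.argmax(B))]
        print("weights:", w.round(3), "evaluation:", B.round(3), "->", grade)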

  14. Uncertainty analysis of the Operational Simplified Surface Energy Balance (SSEBop) model at multiple flux tower sites

    USGS Publications Warehouse

    Chen, Mingshi; Senay, Gabriel B.; Singh, Ramesh K.; Verdin, James P.

    2016-01-01

    Evapotranspiration (ET) is an important component of the water cycle: ET from the land surface returns approximately 60% of the global precipitation back to the atmosphere. ET also plays an important role in energy transport among the biosphere, atmosphere, and hydrosphere. Current regional to global and daily to annual ET estimation relies mainly on surface energy balance (SEB) ET models or statistical and empirical methods driven by remote sensing data and various climatological databases. These models have uncertainties due to inevitable input errors, poorly defined parameters, and inadequate model structures. The eddy covariance measurements of water, energy, and carbon fluxes at the AmeriFlux tower sites provide an opportunity to assess these ET modeling uncertainties. In this study, we focused on uncertainty analysis of the Operational Simplified Surface Energy Balance (SSEBop) model for ET estimation at multiple AmeriFlux tower sites with diverse land cover characteristics and climatic conditions. The 8-day composite 1-km MODerate resolution Imaging Spectroradiometer (MODIS) land surface temperature (LST) product was used as the input LST for the SSEBop algorithm. The other input data were taken from the AmeriFlux database. Statistical analysis indicated that the SSEBop model performed well in estimating ET, with an R2 of 0.86 between estimated ET and eddy covariance measurements at 42 AmeriFlux tower sites during 2001–2007. The best performance was observed for croplands, where R2 was 0.92 with a root mean square error of 13 mm/month. The uncertainties or random errors from input variables and parameters of the SSEBop model led to monthly ET estimates with relative errors of less than 20% across multiple flux tower sites distributed across different biomes. This uncertainty of the SSEBop model lies within the error range of other SEB models, suggesting that the systematic error or bias of the SSEBop model is within the normal range. This finding implies that the simplified parameterization of the SSEBop model did not significantly affect the accuracy of the ET estimates while increasing the ease of model setup for operational applications. The sensitivity analysis indicated that the SSEBop model is most sensitive to the input variables land surface temperature (LST) and reference ET (ETo), and to the parameters differential temperature (dT) and maximum ET scalar (Kmax), particularly during the non-growing season and in dry areas. In summary, the uncertainty assessment verifies that the SSEBop model is a reliable and robust method for large-area ET estimation. The SSEBop estimates can be further improved by reducing errors in two input variables (ETo and LST) and two key parameters (Kmax and dT).
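
    For orientation, the sketch below shows the core scaling idea of SSEBop-type models in strongly simplified form: an ET fraction derived from where the observed LST sits between hot and cold boundary temperatures, multiplied by the scaled reference ET. The boundary value, dT, and Kmax used here are hypothetical; the operational model derives them from climatological data.

        import numpy as np

        def ssebop_eta(lst_k, t_hot_k, dt_k, eto_mm, kmax=1.25):
            """Simplified SSEBop-style actual ET for one pixel.

            lst_k   : satellite land surface temperature (K), e.g. MODIS 8-day LST
            t_hot_k : idealized 'hot/dry' boundary temperature (K)
            dt_k    : predefined hot-cold temperature difference (K)
            eto_mm  : reference ET (mm) for the compositing period
            kmax    : scaling factor from reference to maximum ET
            """
            etf = (t_hot_k - lst_k) / dt_k      # ET fraction: 1 at cold, 0 at hot
            etf = np.clip(etf, 0.0, 1.05)       # keep the fraction physical
            return etf * kmax * eto_mm

        # Hypothetical 8-day values for one pixel
        print(ssebop_eta(lst_k=300.0, t_hot_k=312.0, dt_k=15.0, eto_mm=40.0))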

  15. Development and Testing of Neutron Cross Section Covariance Data for SCALE 6.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marshall, William BJ J; Williams, Mark L; Wiarda, Dorothea

    2015-01-01

    Neutron cross-section covariance data are essential for many sensitivity/uncertainty and uncertainty quantification assessments performed both within the TSUNAMI suite and more broadly throughout the SCALE code system. The release of ENDF/B-VII.1 included a more complete set of neutron cross-section covariance data; these data form the basis for a new cross-section covariance library to be released in SCALE 6.2. A range of testing was conducted to investigate the properties of these covariance data and ensure that the data are reasonable. These tests include examination of the uncertainty in critical experiment benchmark model k_eff values due to nuclear data uncertainties, as well as similarity assessments of irradiated pressurized water reactor (PWR) and boiling water reactor (BWR) fuel with suites of critical experiments. The contents of the new covariance library, the testing performed, and the behavior of the new covariance data are described in this paper. The neutron cross-section covariances can be combined with a sensitivity data file generated using the TSUNAMI suite of codes within SCALE to determine the uncertainty in system k_eff caused by nuclear data uncertainties. The Verified, Archived Library of Inputs and Data (VALID) maintained at Oak Ridge National Laboratory (ORNL) contains over 400 critical experiment benchmark models, and sensitivity data are generated for each of these models. The nuclear data uncertainty in k_eff is generated for each experiment, and the resulting uncertainties are tabulated and compared to the differences between measured and calculated results. The magnitude of the uncertainty for categories of nuclides (such as actinides, fission products, and structural materials) is calculated for irradiated PWR and BWR fuel to quantify the effect of covariance library changes between the SCALE 6.1 and 6.2 libraries. One of the primary applications of sensitivity/uncertainty methods within SCALE is the assessment of similarities between benchmark experiments and safety applications. This is described by a c_k value for each experiment with each application. Several studies have analyzed typical c_k values for a range of critical experiments compared with hypothetical irradiated fuel applications. The c_k value is sensitive to the cross-section covariance data because the contribution of each nuclide is influenced by its uncertainty; large uncertainties indicate more likely bias sources and are thus given more weight. Changes in c_k values resulting from different covariance data can be used to examine and assess underlying data changes. These comparisons are performed for PWR and BWR fuel in storage and transportation systems.
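
    The quadratic "sandwich rule" underlying both uses described above can be sketched as follows, with hypothetical sensitivity vectors and a hypothetical relative covariance matrix standing in for TSUNAMI-generated data: the relative variance of k_eff is S^T C S, and c_k is the correlation between the nuclear-data-induced uncertainties of an application and an experiment.

        import numpy as np

        # Relative covariance matrix for three nuclide-reaction pairs (hypothetical)
        C = np.array([[4.0e-5, 1.0e-5, 0.0],
                      [1.0e-5, 9.0e-5, 2.0e-5],
                      [0.0,    2.0e-5, 1.6e-4]])

        s_app = np.array([0.30, 0.15, 0.05])  # application sensitivities (dk/k per dsigma/sigma)
        s_exp = np.array([0.28, 0.18, 0.02])  # experiment sensitivities

        var_app = s_app @ C @ s_app           # relative variance of application k_eff
        var_exp = s_exp @ C @ s_exp
        cov_ae = s_app @ C @ s_exp

        print(f"k_eff uncertainty (application): {np.sqrt(var_app)*100:.3f} %")
        print(f"c_k similarity coefficient:      {cov_ae/np.sqrt(var_app*var_exp):.3f}")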

  16. Analysis of flood hazard under consideration of dike breaches

    NASA Astrophysics Data System (ADS)

    Vorogushyn, S.; Apel, H.; Lindenschmidt, K.-E.; Merz, B.

    2009-04-01

    The study focuses on the development and application of a new modelling system which allows a comprehensive flood hazard assessment along diked river reaches under consideration of dike failures. The proposed Inundation Hazard Assessment Model (IHAM) represents a hybrid probabilistic-deterministic model. It comprises three models interactively coupled at runtime: (1) a 1D unsteady hydrodynamic model of river channel and floodplain flow between the dikes, (2) a probabilistic dike breach model which determines possible dike breach locations, breach widths and breach outflow discharges, and (3) a 2D raster-based diffusion wave storage cell model of the hinterland areas behind the dikes. Due to the unsteady nature of the coupled 1D and 2D models, the dependence between hydraulic loads at various locations along the reach is explicitly considered. The probabilistic dike breach model describes dike failures due to three failure mechanisms: overtopping, piping and slope instability caused by the seepage flow through the dike core (micro-instability). Dike failures for each mechanism are simulated based on fragility functions. The probability of breach is conditioned by the uncertainty in geometrical and geotechnical dike parameters. The 2D storage cell model, driven by the breach outflow boundary conditions, computes an extended spectrum of flood intensity indicators such as water depth, flow velocity, impulse, inundation duration and rate of water rise. IHAM is embedded in a Monte Carlo simulation in order to account for the natural variability of the flood generation processes, reflected in the form of the input hydrographs, and for the randomness of dike failures, given by breach locations, times and widths. The scenario calculations for the developed synthetic input hydrographs for the main river and tributary were carried out for floods with return periods of T = 100; 200; 500; 1000 a. Based on the modelling results, probabilistic dike hazard maps could be generated that indicate the failure probability of each discretised dike section for every scenario magnitude. Besides the binary inundation patterns that indicate the probability of raster cells being inundated, IHAM generates probabilistic flood hazard maps. These maps display spatial patterns of the considered flood intensity indicators and their associated return periods. The probabilistic nature of IHAM allows for the generation of percentile flood hazard maps that indicate the median and uncertainty bounds of the flood intensity indicators. The uncertainty results from the natural variability of the flow hydrographs and the randomness of dike breach processes. The same uncertainty sources determine the uncertainty in the flow hydrographs along the study reach. The simulations showed that dike breach stochasticity has an increasing impact on hydrograph uncertainty in the downstream direction. Whereas in the upstream part of the reach the hydrograph uncertainty is mainly governed by the variability of the flood wave form, the dike failures strongly shape the uncertainty bounds in the downstream part of the reach. Finally, scenarios of polder deployment for the extreme floods with T = 200; 500; 1000 a were simulated with IHAM. The results indicate a rather weak reduction of the mean and median flow hydrographs in the river channel.
However, the capping of the flow peaks resulted in a considerable reduction of the overtopping failures downstream of the polder, with a simultaneous slight increase in the piping and slope micro-instability frequencies, explained by a longer-lasting average impoundment. The developed IHAM simulation system represents a new scientific tool for studying fluvial inundation dynamics under extreme conditions, incorporating the effects of technical flood protection measures. With its major outputs in the form of novel probabilistic inundation and dike hazard maps, the IHAM system has a high practical value for decision support in flood management.
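
    The probabilistic breach step can be sketched as below: in each Monte Carlo run, every dike section fails with a probability read from a fragility function of its hydraulic load. The lognormal-shaped overtopping fragility and its parameters are hypothetical placeholders for the geotechnically derived functions used in IHAM.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(42)

        def overtopping_fragility(head_m, mu=0.3, sigma=0.15):
            """P(failure) given overflow head above the dike crest (lognormal curve)."""
            head_m = np.asarray(head_m, dtype=float)
            p = np.zeros_like(head_m)
            pos = head_m > 0
            p[pos] = norm.cdf((np.log(head_m[pos]) - np.log(mu)) / sigma)
            return p

        # Hydraulic load (overflow head, m) at 5 dike sections from the 1D model
        head = np.array([0.0, 0.05, 0.20, 0.35, 0.60])

        n_runs = 10_000
        breach = rng.random((n_runs, head.size)) < overtopping_fragility(head)
        print("breach probability per section:", breach.mean(axis=0).round(3))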

  17. Parameter and model uncertainty in a life-table model for fine particles (PM2.5): a statistical modeling study

    PubMed Central

    Tainio, Marko; Tuomisto, Jouni T; Hänninen, Otto; Ruuskanen, Juhani; Jantunen, Matti J; Pekkanen, Juha

    2007-01-01

    Background The estimation of health impacts often involves uncertain input variables and assumptions which have to be incorporated into the model structure. These uncertainties may have significant effects on the results obtained with the model and, thus, on decision making. Fine particles (PM2.5) are believed to cause major health impacts, and, consequently, uncertainties in their health impact assessment have clear relevance to policy-making. We studied the effects of various uncertain input variables by building a life-table model for fine particles. Methods Life expectancy of the Helsinki metropolitan area population and the change in life expectancy due to fine particle exposures were predicted using a life-table model. A number of parameter and model uncertainties were estimated. Sensitivity analysis for input variables was performed by calculating rank-order correlations between input and output variables. The studied model uncertainties were (i) plausibility of mortality outcomes and (ii) lag, and the studied parameter uncertainties were (iii) exposure-response coefficients for different mortality outcomes and (iv) exposure estimates for different age groups. The monetary value of the years of life lost was also predicted, to compare the relative importance of the uncertainties in monetary valuation with the health effect uncertainties. Results The magnitude of the health effect costs depended mostly on the discount rate, the exposure-response coefficient, and the plausibility of cardiopulmonary mortality. Other mortality outcomes (lung cancer, other non-accidental and infant mortality) and lag had only a minor impact on the output. The results highlight the importance of the uncertainties associated with cardiopulmonary mortality in fine particle impact assessment when compared with other uncertainties. Conclusion When estimating life expectancy, the estimates used for the cardiopulmonary exposure-response coefficient, discount rate, and plausibility require careful assessment, while complicated lag estimates can be omitted without any major effect on the results. PMID:17714598
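
    The rank-order correlation sensitivity analysis mentioned in the Methods can be sketched as below; the toy output is a hypothetical monotone function of three sampled inputs, standing in for the full life-table model.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(0)
        n = 5_000

        # Uncertain inputs (hypothetical distributions)
        er_coeff = rng.lognormal(mean=np.log(0.008), sigma=0.3, size=n)  # per ug/m3
        exposure = rng.normal(9.0, 1.5, size=n)                          # ug/m3
        discount = rng.uniform(0.0, 0.05, size=n)                        # rate

        # Toy output: discounted years-of-life-lost attributable to PM2.5
        yll = er_coeff * exposure * 1e5 / (1.0 + 20.0 * discount)

        for name, x in [("exposure-response", er_coeff),
                        ("exposure", exposure),
                        ("discount rate", discount)]:
            rho, _ = spearmanr(x, yll)
            print(f"{name:18s} rank correlation with output: {rho:+.2f}")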

  19. Addressing uncertainty in modelling cumulative impacts within maritime spatial planning in the Adriatic and Ionian region.

    PubMed

    Gissi, Elena; Menegon, Stefano; Sarretta, Alessandro; Appiotti, Federica; Maragno, Denis; Vianello, Andrea; Depellegrin, Daniel; Venier, Chiara; Barbanti, Andrea

    2017-01-01

    Maritime spatial planning (MSP) is envisaged as a tool to apply an ecosystem-based approach to the marine and coastal realms, aiming to ensure that the collective pressure of human activities is kept within acceptable limits. Cumulative impacts (CI) assessment can support science-based MSP, in order to understand the existing and potential impacts of human uses on the marine environment. A CI assessment includes several sources of uncertainty that can hinder the correct interpretation of its results if not explicitly incorporated into the decision-making process. This study proposes a three-level methodology to perform a general uncertainty analysis integrated with the CI assessment for MSP, applied to the Adriatic and Ionian Region (AIR). We describe the nature and level of uncertainty with the help of expert judgement and elicitation, to include all of the possible sources of uncertainty related to the CI model, along with the assumptions and gaps related to the case-based MSP process in the AIR. Next, we use the results to tailor the global uncertainty analysis so as to spatially describe the uncertainty distribution and the variations of the CI scores dependent on the CI model factors. The results show the variability of the uncertainty in the AIR, with only limited portions robustly identified as the most or the least impacted areas under multiple model-factor hypotheses. The results are discussed in terms of the level and type of reliable information and insight they provide for decision-making. The most significant uncertainty factors are identified, to facilitate the adaptive MSP process and to establish research priorities for filling knowledge gaps in subsequent planning cycles. The method aims to depict the potential CI effects as well as the extent and spatial variation of the data and scientific uncertainty; therefore, it constitutes a suitable tool to inform the potential establishment of the precautionary principle in MSP.

  20. Evidence Theory Based Uncertainty Quantification in Radiological Risk due to Accidental Release of Radioactivity from a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Ingale, S. V.; Datta, D.

    2010-10-01

    The consequences of an accidental release of radioactivity from a nuclear power plant are assessed in terms of exposure or dose to members of the public. Risk assessment builds on this dose computation. Dose computation depends on the underlying dose assessment model and the exposure pathways, one of which is the ingestion of contaminated food. The aim of the present paper is to compute the uncertainty associated with the risk to members of the public due to the ingestion of contaminated food. Since the governing parameters of the ingestion dose assessment model are imprecise, we apply evidence theory to compute bounds on the risk. The uncertainty is expressed through the belief and plausibility fuzzy measures.
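
    A minimal evidence-theory sketch follows: an imprecise model parameter is described by focal intervals with basic probability masses (all numbers hypothetical), and propagating the interval endpoints through a monotone dose model yields belief and plausibility that the dose meets a criterion.

        # Focal elements for a soil-to-plant transfer factor: (interval, mass)
        focal = [((0.01, 0.05), 0.5),
                 ((0.03, 0.10), 0.3),
                 ((0.05, 0.20), 0.2)]

        intake_bq = 2.0e3      # activity entering the food chain (Bq, hypothetical)
        dcf = 1.3e-8           # dose conversion factor (Sv/Bq, hypothetical)
        limit_sv = 1.5e-6      # dose criterion (Sv, hypothetical)

        def dose(tf):          # dose is monotone in the transfer factor
            return intake_bq * tf * dcf

        # Belief: mass of focal elements that certainly satisfy the criterion;
        # plausibility: mass of those that can possibly satisfy it
        belief = sum(m for (lo, hi), m in focal if dose(hi) <= limit_sv)
        plausibility = sum(m for (lo, hi), m in focal if dose(lo) <= limit_sv)
        print(f"Bel(dose <= limit) = {belief:.2f}, Pl(dose <= limit) = {plausibility:.2f}")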

  1. Assessing uncertain human exposure to ambient air pollution using environmental models in the Web

    NASA Astrophysics Data System (ADS)

    Gerharz, L. E.; Pebesma, E.; Denby, B.

    2012-04-01

    Ambient air quality can have a significant impact on human health by causing respiratory and cardiovascular diseases. The pollutant concentration a person is exposed to can, however, differ considerably between individuals, depending on their daily routine and movement patterns. Using a straightforward approach, this exposure can be estimated by integrating individual space-time paths with spatio-temporally resolved ambient air quality data. To allow a realistic exposure assessment, it is furthermore important to consider uncertainties due to input and model errors. In this work, we present a generic, web-based approach for estimating individual exposure by integration of uncertain position and air quality information, implemented as a web service. Following the Model Web initiative, which envisions an infrastructure for deploying, executing and chaining environmental models as services, existing models and data sources, e.g. for air quality, can be used to assess exposure. The service therefore needs to deal with the different formats, resolutions and uncertainty representations provided by model or data services. Potential mismatches can be accounted for by transformation of uncertainties and (dis-)aggregation of data, under consideration of the resulting changes in the uncertainties, using components developed in the UncertWeb project. In UncertWeb, the Model Web vision is extended to an uncertainty-enabled Model Web, where services can process and communicate uncertainties in the data and models. The propagation of uncertainty to the exposure results is quantified using Monte Carlo simulation, by combining different realisations of positions and ambient concentrations. Two case studies were used to evaluate the developed exposure assessment service. In the first study, GPS tracks with a positional uncertainty of a few metres, collected in the urban area of Münster, Germany, were used to assess exposure to PM10 (particulate matter smaller than 10 µm). Air quality data were provided by an uncertainty-enabled air quality model system which provided realisations of concentrations per hour on a 250 m x 250 m grid over Münster. The second case study uses modelled human trajectories in Rotterdam, The Netherlands. The trajectories were provided as realisations at 15 min resolution per 4-digit postal code from an activity model. Air quality estimates were provided for different pollutants as ensembles by a coupled meteorology and air quality model system on a 1 km x 1 km grid with hourly resolution. Both case studies show the successful application of the service to different resolutions and uncertainty representations.
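
    The Monte Carlo combination of position and concentration realisations can be sketched as below, with a synthetic track and synthetic PM10 fields; grid size, noise levels, and concentrations are hypothetical.

        import numpy as np

        rng = np.random.default_rng(7)
        n_real, n_t = 500, 24            # realisations, hourly positions

        # "True" track across a 250 m grid (cell indices) and ~1-cell GPS noise
        track = np.column_stack([np.linspace(5, 20, n_t), np.linspace(3, 12, n_t)])

        # Realisations of hourly PM10 fields on a 30 x 30 grid (ug/m3)
        fields = np.clip(25 + 8 * rng.standard_normal((n_real, n_t, 30, 30)), 0, None)

        exposure = np.empty(n_real)
        for r in range(n_real):
            # one perturbed position per hour, clipped to the grid
            pos = np.clip(np.rint(track + rng.normal(0, 1, track.shape)), 0, 29).astype(int)
            exposure[r] = fields[r, np.arange(n_t), pos[:, 0], pos[:, 1]].mean()

        print(f"daily mean PM10 exposure: {exposure.mean():.1f} "
              f"+/- {exposure.std():.1f} ug/m3 (MC spread)")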

  2. Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Britton, Paul; Al Hassan, Mohammad; Ring, Robert

    2017-01-01

    Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper explains common measures of centrality and dispersion and, with examples, provides guidelines for how they may be estimated to ensure effective technical contributions to decision-making.
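
    By way of example, the sketch below computes common centrality and dispersion measures for a skewed sample and adds a bootstrap interval for the mean as one way to report the dispersion of an estimate; the sample is synthetic.

        import numpy as np

        rng = np.random.default_rng(1)
        sample = rng.lognormal(mean=0.0, sigma=0.6, size=200)   # skewed test data

        print(f"mean   = {sample.mean():.3f}")                  # centrality
        print(f"median = {np.median(sample):.3f}")
        print(f"std    = {sample.std(ddof=1):.3f}")             # dispersion
        q25, q75 = np.percentile(sample, [25, 75])
        print(f"IQR    = {q75 - q25:.3f}")

        # Bootstrap 90% interval for the mean
        boots = [rng.choice(sample, sample.size, replace=True).mean()
                 for _ in range(2000)]
        lo, hi = np.percentile(boots, [5, 95])
        print(f"90% bootstrap interval for the mean: [{lo:.3f}, {hi:.3f}]")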

  3. Impact of Synoptic-Scale Factors on Rainfall Forecast in Different Stages of a Persistent Heavy Rainfall Event in South China

    NASA Astrophysics Data System (ADS)

    Zhang, Murong; Meng, Zhiyong

    2018-04-01

    This study investigates the stage-dependent rainfall forecast skill and the associated synoptic-scale features in a persistent heavy rainfall event in Guangdong Province, south China, during 29-31 March 2014, using operational global ensemble forecasts from the European Centre for Medium-Range Weather Forecasts. The persistent rainfall was divided into two stages, with a better precipitation forecast skill in Stage 2 (S2) than in Stage 1 (S1), although S2 had a longer lead time. Using ensemble-based sensitivity analysis, the key synoptic-scale factors that affected the rainfall were diagnosed by correlating the accumulated precipitation of each stage with atmospheric state variables in the middle of the respective stage. The precipitation in both stages was found to be significantly correlated with the midlevel trough, the low-level vortex, and particularly the low-level jet on the southeast flank of the vortex and its associated moisture transport. The rainfall forecast skill was mainly determined by the forecast accuracy of the location of the low-level jet, which was possibly related to the different juxtapositions between the direction of movement of the low-level vortex and the orientation of the low-level jet. The uncertainty in the rainfall forecast in S1 stemmed mainly from the location uncertainty of the low-level jet, while that in S2 stemmed mainly from the width uncertainty of the low-level jet, whose location was forecast relatively accurately.
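
    Ensemble-based sensitivity analysis of this kind reduces, at its core, to regressing the forecast metric on a state variable across ensemble members, as sketched below with a synthetic 50-member ensemble (all numbers hypothetical).

        import numpy as np

        rng = np.random.default_rng(3)
        n_members = 50

        # Synthetic state variable and rainfall that partly depends on it
        v850 = rng.normal(8.0, 2.0, n_members)                       # 850-hPa wind (m/s)
        rain = 40 + 6 * (v850 - 8.0) + rng.normal(0, 8, n_members)   # stage rainfall (mm)

        # Sensitivity: regression slope and correlation across members
        slope = np.cov(rain, v850)[0, 1] / np.var(v850, ddof=1)
        corr = np.corrcoef(rain, v850)[0, 1]
        print(f"d(rain)/d(v850) = {slope:.2f} mm per m/s, r = {corr:.2f}")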

  4. Improving uncertainty estimates: Inter-annual variability in Ireland

    NASA Astrophysics Data System (ADS)

    Pullinger, D.; Zhang, M.; Hill, N.; Crutchley, T.

    2017-11-01

    This paper addresses the uncertainty associated with the inter-annual variability (IAV) used within wind resource assessments for Ireland, in order to represent more accurately the uncertainties within wind resource and energy yield assessments. The study was undertaken using a total of 16 ground stations (Met Eireann) and corresponding reanalysis datasets, providing an update to previous work on this topic undertaken nearly 20 years ago. The results demonstrate that the previously reported 5.4% inter-annual variability of wind speed remains appropriate; guidance is also given on how to provide a robust assessment of IAV using available sources of data, including ground stations, MERRA-2 and ERA-Interim.
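
    The IAV statistic and its effect on the uncertainty of a long-term mean can be sketched as below; the annual mean wind speeds are hypothetical, and the 1/sqrt(N) scaling assumes approximately independent years.

        import numpy as np

        annual_means = np.array([7.1, 6.8, 7.4, 7.0, 6.6, 7.3, 7.2, 6.9,
                                 7.5, 6.7, 7.0, 7.1, 6.8, 7.2, 7.4, 6.9])  # m/s

        iav = annual_means.std(ddof=1) / annual_means.mean()
        print(f"IAV = {iav*100:.1f} % of mean wind speed")

        # Uncertainty of the long-term mean estimated from N years of data
        for n_years in (1, 5, 10):
            print(f"wind-speed uncertainty from {n_years:2d} year(s): "
                  f"{iav / np.sqrt(n_years) * 100:.1f} %")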

  5. Biophysics of NASA radiation quality factors.

    PubMed

    Cucinotta, Francis A

    2015-09-01

    NASA has implemented new radiation quality factors (QFs) for projecting cancer risks from space radiation exposures to astronauts. The NASA QFs are based on particle track structure concepts with parameters derived from available radiobiology data, and NASA introduces distinct QFs for solid cancer and leukaemia risk estimates. The NASA model was reviewed by the US National Research Council and approved for use by NASA for risk assessment for International Space Station missions and trade studies of future exploration missions to Mars and other destinations. A key feature of the NASA QFs is to represent the uncertainty in the QF assessments and evaluate the importance of the QF uncertainty to overall uncertainties in cancer risk projections. In this article, the biophysical basis for the probability distribution functions representing QF uncertainties was reviewed, and approaches needed to reduce uncertainties were discussed.

  6. Selection of climate policies under the uncertainties in the Fifth Assessment Report of the IPCC

    NASA Astrophysics Data System (ADS)

    Drouet, L.; Bosetti, V.; Tavoni, M.

    2015-10-01

    Strategies for dealing with climate change must incorporate and quantify all the relevant uncertainties, and be designed to manage the resulting risks. Here we employ the best available knowledge so far, summarized by the three working groups of the Fifth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC AR5), to quantify the uncertainty of mitigation costs, climate change dynamics, and economic damage for alternative carbon budgets. We rank climate policies according to different decision-making criteria concerning uncertainty, risk aversion and intertemporal preferences. Our findings show that preferences over uncertainties are as important as the choice of the widely discussed time discount factor. Climate policies consistent with limiting warming to 2 °C above preindustrial levels are compatible with a subset of decision-making criteria and some model parametrizations, but not with the commonly adopted expected utility framework.

  7. Cross-Sectional And Longitudinal Uncertainty Propagation In Drinking Water Risk Assessment

    NASA Astrophysics Data System (ADS)

    Tesfamichael, A. A.; Jagath, K. J.

    2004-12-01

    Pesticide residues in drinking water can vary significantly from day to day. However, drinking water quality monitoring performed under the Safe Drinking Water Act (SDWA) at most community water systems (CWSs) is typically limited to four data points per year over a few years. Due to this limited sampling, likely maximum residues may be underestimated in risk assessment. In this work, a statistical methodology is proposed to study the cross-sectional and longitudinal uncertainties in observed samples and their propagated effect on risk estimates. The methodology is demonstrated using data from 16 CWSs across the US that have three independent databases of atrazine residue, to estimate the uncertainty of risk in infants and children. The results showed that in 85% of the CWSs, chronic risks predicted with the proposed approach may be two- to four-fold higher than those predicted with the current approach, while intermediate risks may be two- to three-fold higher in 50% of the CWSs. In 12% of the CWSs, however, the proposed methodology yielded a lower intermediate risk. A closed-form solution for the propagated uncertainty will be developed to calculate the number of years (seasons) of water quality data and the sampling frequency needed to reduce the uncertainty in risk estimates. In general, this methodology provides good insight into the importance of addressing the uncertainty of observed water quality data and the need to predict likely maximum residues in risk assessment by considering the propagation of uncertainties.

  8. Sensitivity and uncertainty analysis for the annual phosphorus loss estimator model

    USDA-ARS?s Scientific Manuscript database

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that there are inherent uncertainties with model predictions, limited studies have addressed model prediction uncertainty. In this study we assess the effect of model input error on predict...

  9. Intolerance of Uncertainty, Fear of Anxiety, and Adolescent Worry

    ERIC Educational Resources Information Center

    Dugas, Michel J.; Laugesen, Nina; Bukowski, William M.

    2012-01-01

    A 5 year, ten wave longitudinal study of 338 adolescents assessed the association between two forms of cognitive vulnerability (intolerance of uncertainty and fear of anxiety) and worry. Multilevel mediational analyses revealed a bidirectional and reciprocal relation between intolerance of uncertainty and worry in which change in one variable…

  10. The Intolerance of Uncertainty Index: Replication and Extension with an English Sample

    ERIC Educational Resources Information Center

    Carleton, R. Nicholas; Gosselin, Patrick; Asmundson, Gordon J. G.

    2010-01-01

    Intolerance of uncertainty (IU) is related to anxiety, depression, worry, and anxiety sensitivity. Precedent IU measures were criticized for psychometric instability and redundancy; alternative measures include the novel 45-item measure (Intolerance of Uncertainty Index; IUI). The IUI was developed in French with 2 parts, assessing general…

  11. A modified F-test for evaluating model performance by including both experimental and simulation uncertainties

    USDA-ARS?s Scientific Manuscript database

    Experimental and simulation uncertainties have not been included in many of the statistics used in assessing agricultural model performance. The objectives of this study were to develop an F-test that can be used to evaluate model performance considering experimental and simulation uncertainties, an...

  12. Uncertainty

    USGS Publications Warehouse

    Hunt, Randall J.

    2012-01-01

    Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.

  13. Uncertainty Categorization, Modeling, and Management for Regional Water Supply Planning

    NASA Astrophysics Data System (ADS)

    Fletcher, S.; Strzepek, K. M.; AlSaati, A.; Alhassan, A.

    2016-12-01

    Many water planners face increased pressure on water supply systems from growing demands, variability in supply and a changing climate. Short-term variation in water availability and demand; long-term uncertainty in climate, groundwater storage, and sectoral competition for water; and varying stakeholder perspectives on the impacts of water shortages make it difficult to assess the necessity of expensive infrastructure investments. We categorize these uncertainties on two dimensions: whether they are the result of stochastic variation or epistemic uncertainty, and whether the uncertainties can be described probabilistically or are deep uncertainties whose likelihood is unknown. We develop a decision framework that combines simulation for probabilistic uncertainty, sensitivity analysis for deep uncertainty and Bayesian decision analysis for uncertainties that are reduced over time with additional information. We apply this framework to two contrasting case studies - drought preparedness in Melbourne, Australia and fossil groundwater depletion in Riyadh, Saudi Arabia - to assess the impacts of different types of uncertainty on infrastructure decisions. Melbourne's water supply system relies on surface water, which is impacted by natural variation in rainfall, and a market-based system for managing water rights. Our results show that small, flexible investment increases can mitigate shortage risk considerably at reduced cost. Riyadh, by contrast, relies primarily on desalination for municipal use and fossil groundwater for agriculture, and a centralized planner makes allocation decisions. Poor regional groundwater measurement makes it difficult to know when groundwater pumping will become uneconomical, resulting in epistemic uncertainty. However, collecting more data can reduce the uncertainty, suggesting the need for different uncertainty modeling and management strategies in Riyadh than in Melbourne. We will categorize the two systems and propose appropriate decision making under uncertainty methods from the state of the art. We will compare the efficiency of alternative approaches to the two case studies. Finally, we will present a hybrid decision analytic tool to address the synthesis of uncertainties.

  14. Some suggested future directions of quantitative resource assessments

    USGS Publications Warehouse

    Singer, D.A.

    2001-01-01

    Future quantitative assessments will be expected to estimate quantities, values, and locations of undiscovered mineral resources in a form that conveys both economic viability and the uncertainty associated with the resources. Historically, declining metal prices point to the need for larger deposits over time. Sensitivity analysis demonstrates that the greatest opportunity for reducing uncertainty in assessments lies in lowering the uncertainty associated with tonnage estimates. Of all errors possible in assessments, those affecting tonnage estimates are by far the most important. Selecting the correct deposit model is the most important way of controlling errors, because of the dominance of tonnage: deposit models are the best known predictor of tonnage. In many large regions, much of the surface is covered with apparently barren rocks and sediments. Because many exposed mineral deposits are believed to have been found, a prime concern is the presence of possible mineralized rock under cover. Assessments of areas with resources under cover must rely on extrapolation from surrounding areas, new geologic maps of rocks under cover, or analogy with other well-explored areas that can serve as training tracts. Cover has a profound effect on uncertainty and on the methods and procedures of assessments, because the geology is seldom known and geophysical methods typically have attenuated responses. Many earlier assessment methods were based on relationships of geochemical and geophysical variables to deposits learned from deposits exposed on the surface; these will need to be relearned for covered deposits. Mineral-deposit models are important in quantitative resource assessments for two reasons: (1) grades and tonnages of most deposit types are significantly different, and (2) deposit types are present in different geologic settings that can be identified from geologic maps. Mineral-deposit models are the keystone in combining the diverse geoscience information on geology, mineral occurrences, geophysics, and geochemistry used in resource assessments and mineral exploration. Grade and tonnage models and the development of quantitative descriptive, economic, and deposit-density models will help reduce the uncertainty of these new assessments.

  15. "Stultifera Navis": Institutional Tensions, Conceptual Chaos, and Professional Uncertainty at the Beginning of the Decade of Education for Sustainable Development

    ERIC Educational Resources Information Center

    Perez, Jose Gutierrez; Llorente, Ma Teresa Pozo

    2005-01-01

    The main idea this article develops is the conceptual chaos, methodological tensions and epistemological conflicts that are being experienced in the field of environmental education as a result of the uncertainty generated by some institutions and international organisms. The authors' perspective starts from the idea that too many expectations…

  16. Absolute Ages and Distances of 22 GCs Using Monte Carlo Main-sequence Fitting

    NASA Astrophysics Data System (ADS)

    O'Malley, Erin M.; Gilligan, Christina; Chaboyer, Brian

    2017-04-01

    The recent Gaia Data Release 1 of stellar parallaxes provides ample opportunity to find metal-poor main-sequence stars with precise parallaxes. We select 21 such stars with parallax uncertainties σ_π/π ≤ 0.10 and accurate abundance determinations, suitable for testing metal-poor stellar evolution models and determining the distances to Galactic globular clusters (GCs). A Monte Carlo analysis, taking into account uncertainties in the model construction parameters, was used to generate stellar models and isochrones to fit to the calibration stars. The isochrones that best fit the calibration stars were then used to determine the distances and ages of 22 GCs with metallicities ranging from -2.4 dex to -0.7 dex. We find distances with an average uncertainty of 0.15 mag and absolute ages ranging from 10.8 to 13.6 Gyr, with an average uncertainty of 1.6 Gyr. Using literature proper motion data, we calculate orbits for the clusters, finding six that reside within the Galactic disk/bulge, while the rest are considered halo clusters. We find no strong evidence for a relationship between age and Galactocentric distance, but we do find a decreasing age-[Fe/H] relation.
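
    The way a 10% parallax uncertainty propagates into the distance modulus, the quantity that anchors such cluster distance fits, can be sketched as below with hypothetical numbers.

        import numpy as np

        pi_mas, sigma_pi = 10.0, 0.8           # parallax and uncertainty (mas)

        d_pc = 1000.0 / pi_mas                 # distance (pc)
        mu = 5.0 * np.log10(d_pc) - 5.0        # distance modulus (mag)

        # First-order propagation: sigma_mu = (5 / ln 10) * (sigma_pi / pi)
        sigma_mu = 5.0 / np.log(10.0) * (sigma_pi / pi_mas)
        print(f"d = {d_pc:.0f} pc, mu = {mu:.3f} +/- {sigma_mu:.3f} mag")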

  17. Calculation of the compounded uncertainty of 14C AMS measurements

    NASA Astrophysics Data System (ADS)

    Nadeau, Marie-Josée; Grootes, Pieter M.

    2013-01-01

    The correct method to calculate conventional 14C ages from carbon isotopic ratios was summarised 35 years ago by Stuiver and Polach (1977) and is now accepted as the only method to calculate 14C ages. There is, however, no consensus regarding the treatment of AMS data, mainly regarding the uncertainty of the final result. The estimation and treatment of machine background, process blank, and/or in situ contamination are not uniform between laboratories, leading to differences in 14C results, mainly for older ages. As Donahue (1987) and Currie (1994), among others, have noted, some laboratories find it important to use the scatter of several measurements as the uncertainty, while others prefer to use Poisson statistics. The contribution of the scatter of the standards, machine background, process blank, and in situ contamination to the uncertainty of the final 14C result is also treated in different ways. In the early years of AMS, several laboratories found it important to describe their calculation process in detail. In recent years, this practice has declined. We present an overview of the calculation process for 14C AMS measurements, looking at calculation practices published from the beginning of AMS until the present.
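
    The contrast between the two conventions can be sketched as below for a conventional age t = -8033 ln(F), where F is the normalized 14C/12C ratio: a Poisson uncertainty from counting statistics versus the scatter of repeated measurements. All values are hypothetical.

        import numpy as np

        F_runs = np.array([0.6052, 0.6081, 0.6037, 0.6095, 0.6060])  # repeat runs
        counts = 250_000                      # total 14C counts behind each run

        F = F_runs.mean()
        age = -8033.0 * np.log(F)             # conventional 14C age (yr BP)

        # Poisson: relative error of F from the combined counting statistics
        sigma_F_poisson = F / np.sqrt(counts * len(F_runs))
        # Scatter: standard error of the mean of the repeated runs
        sigma_F_scatter = F_runs.std(ddof=1) / np.sqrt(len(F_runs))

        for label, sF in [("Poisson", sigma_F_poisson), ("scatter", sigma_F_scatter)]:
            print(f"{label:8s}: age = {age:.0f} +/- {8033.0 * sF / F:.0f} yr BP")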

  18. Uncertainties in Forecasting Streamflow using Entropy Theory

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, uncertainty always accompanies a forecast, may affect the forecasting results, and can lead to large variations. Uncertainties must therefore be considered and properly assessed when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Streamflow forecasting therefore entails modeling the seasonality, periodicity, and correlation structure, and assessing the uncertainties. This study applies entropy theory to forecast streamflow and to measure the uncertainties arising during the forecasting process. To apply entropy theory to streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize the patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory to streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model, and the forecast results are measured separately using entropy. Using information theory, we describe how these uncertainties are transported and aggregated through these processes.

  19. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    NASA Astrophysics Data System (ADS)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision-making under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including the estimation of uncertainty propagation, the transferability of methods, and the development of visualization tools, but they also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.

  20. Characterization of XR-RV3 GafChromic® films in standard laboratory and in clinical conditions and means to evaluate uncertainties and reduce errors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farah, J., E-mail: jad.farah@irsn.fr; Clairand, I.; Huet, C.

    2015-07-15

    Purpose: To investigate the optimal use of XR-RV3 GafChromic® films to assess patient skin dose in interventional radiology, while addressing the means to reduce uncertainties in dose assessment. Methods: XR-type R GafChromic films have been shown to represent the most efficient and suitable solution for determining patient skin dose in interventional procedures. As film dosimetry can be associated with high uncertainty, this paper presents the EURADOS WG 12 initiative to carry out a comprehensive study of film characteristics with a multisite approach. The considered sources of uncertainty include scanner-, film-, and fitting-related errors. The work focused on studying film behavior with clinical high-dose-rate pulsed beams (previously unavailable in the literature) together with reference standard laboratory beams. Results: First, the performance analysis of six different scanner models showed that scan uniformity perpendicular to the lamp motion axis and long-term stability are the main sources of scanner-related uncertainties. These can induce errors of up to 7% on the film readings unless regularly checked and corrected. Typically, scan uniformity correction matrices should be applied and readings normalized to the scanner-specific daily background reading. In addition, the analysis of multiple film batches showed that XR-RV3 films generally have good uniformity within one batch (<1.5%), require 24 h to stabilize after irradiation, and respond roughly independently of dose rate (<5%). However, XR-RV3 films showed large variations (up to 15%) with radiation quality, both in standard laboratory and in clinical conditions. As such, and prior to conducting patient skin dose measurements, it is mandatory to choose the appropriate calibration beam quality depending on the characteristics of the x-ray systems that will be used clinically. In addition, yellow-side film irradiations should preferentially be used, since they showed a lower dependence on beam parameters compared with white-side film irradiations. Finally, among the six different fit equations tested in this work, the typically used third-order polynomials and more rational and simplistic equations, of the form dose inversely proportional to pixel value, were both found to provide satisfactory results. Fitting-related uncertainty was clearly identified as a major contributor to the overall film dosimetry uncertainty, with up to 40% error on the dose estimate. Conclusions: The overall uncertainty associated with the use of XR-RV3 films to determine skin dose in the interventional environment can realistically be estimated to be around 20% (k = 1). This uncertainty can be reduced to within 5% if scanner-, film-, and fitting-related errors are carefully monitored, or it can easily increase to over 40% if minimal care is not taken. This work demonstrates the importance of appropriate calibration, reading, fitting, and other film-related and scan-related processes, which will help improve the accuracy of skin dose measurements in interventional procedures.
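
    The fitting step can be sketched as below, comparing a third-order polynomial with a simple form in which dose is inversely proportional to pixel value; the calibration pairs are hypothetical and the exact rational form used in the study may differ.

        import numpy as np
        from scipy.optimize import curve_fit

        pv = np.array([41000, 36500, 33000, 30000, 27500, 25500, 24000])  # scanner PV
        dose = np.array([0.25, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])             # Gy

        def rational(pv, a, b):            # dose ~ inversely proportional to PV
            return a / pv + b

        p_rat, _ = curve_fit(rational, pv, dose, p0=(1e5, -2.0))
        p_poly = np.polyfit(pv, dose, 3)

        for name, pred in [("rational", rational(pv, *p_rat)),
                           ("3rd-order poly", np.polyval(p_poly, pv))]:
            rms = np.sqrt(np.mean((pred - dose) ** 2))
            print(f"{name:15s} fit RMS: {rms:.3f} Gy")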

  1. REDUCING UNCERTAINTY IN RISK ASSESSMENT USING MECHANISTIC DATA: ENHANCING THE U.S. EPA DEVELOPMENTAL NEUROTOXICITY TESTING GUIDELINES

    EPA Science Inventory

    SUMMARY: Mechanistic data should provide the Agency with a more accurate basis to estimate risk than do the Agency’s default assumptions (10x uncertainty factors, etc.), thereby improving risk assessment decisions. NTD is providing mechanistic data for toxicant effects on two maj...

  2. Assessment of Competence in Clinical Reasoning and Decision-Making under Uncertainty: The Script Concordance Test Method

    ERIC Educational Resources Information Center

    Ramaekers, Stephan; Kremer, Wim; Pilot, Albert; van Beukelen, Peter; van Keulen, Hanno

    2010-01-01

    Real-life, complex problems often require that decisions are made despite limited information or insufficient time to explore all relevant aspects. Incorporating authentic uncertainties into an assessment, however, poses problems in establishing results and analysing their methodological qualities. This study aims at developing a test on clinical…

  3. Assessing Uncertainties in a Simple and Cheap Experiment

    ERIC Educational Resources Information Center

    de Souza, Paulo A., Jr.; Brasil, Gutemberg Hespanha

    2009-01-01

    This paper describes how to calculate measurement uncertainties using as a practical example the assessment of the thickness of ping-pong balls and their material density. The advantages of a randomized experiment are also discussed. This experiment can be reproduced in the physics laboratory for undergraduate students. (Contains 7 tables, 1…

  4. Accepting uncertainty, assessing risk: decision quality in managing wildfire, forest resource values, and new technology

    Treesearch

    Jeffrey G. Borchers

    2005-01-01

    The risks, uncertainties, and social conflicts surrounding uncharacteristic wildfire and forest resource values have defied conventional approaches to planning and decision-making. Paradoxically, the adoption of technological innovations such as risk assessment, decision analysis, and landscape simulation models by land management organizations has been limited. The...

  5. Measuring and explaining eco-efficiencies of wastewater treatment plants in China: An uncertainty analysis perspective.

    PubMed

    Dong, Xin; Zhang, Xinyi; Zeng, Siyu

    2017-04-01

    In the context of sustainable development, there has been an increasing requirement for eco-efficiency assessment of wastewater treatment plants (WWTPs). Data envelopment analysis (DEA), a technique that is widely applied for relative efficiency assessment, is used in combination with the tolerances approach to handle WWTPs' multiple inputs and outputs as well as their uncertainty. The economic cost, energy consumption, contaminant removal, and global warming effect during the treatment processes are integrated to interpret the eco-efficiency of WWTPs. A total of 736 sample plants from across China are assessed, and large sensitivities to variations in inputs and outputs are observed for most samples, with only three WWTPs identified as being stably efficient. Size of plant, overcapacity, climate type, and influent characteristics are shown to have a significant influence on both the mean efficiency and the performance sensitivity of WWTPs, while no clear relationship was found between eco-efficiency and technology under the framework of uncertainty analysis. The incorporation of uncertainty quantification and environmental impact considerations has improved the reliability and applicability of the assessment.
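
    As background, a plain input-oriented CCR DEA model (without the tolerances extension used in the paper) can be sketched as a small linear program per plant; the inputs and outputs below are hypothetical.

        import numpy as np
        from scipy.optimize import linprog

        X = np.array([[1.2,  0.9,  1.5,  1.1],       # inputs: cost (CNY/m3)
                      [0.45, 0.30, 0.60, 0.35]])     #         energy (kWh/m3)
        Y = np.array([[0.92, 0.88, 0.95, 0.90]])     # outputs: COD removal fraction

        m, n = X.shape           # number of inputs, number of plants
        s = Y.shape[0]           # number of outputs

        for o in range(n):
            # variables z = [theta, lambda_1 .. lambda_n]; minimize theta
            c = np.r_[1.0, np.zeros(n)]
            # sum_j lambda_j x_ij - theta * x_io <= 0 for each input i
            A_in = np.hstack([-X[:, [o]], X])
            # -sum_j lambda_j y_rj <= -y_ro for each output r
            A_out = np.hstack([np.zeros((s, 1)), -Y])
            res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                          b_ub=np.r_[np.zeros(m), -Y[:, o]],
                          bounds=[(0, None)] * (n + 1), method="highs")
            print(f"plant {o + 1}: efficiency = {res.x[0]:.3f}")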

  6. Assessment of Radiative Heating Uncertainty for Hyperbolic Earth Entry

    NASA Technical Reports Server (NTRS)

    Johnston, Christopher O.; Mazaheri, Alireza; Gnoffo, Peter A.; Kleb, W. L.; Sutton, Kenneth; Prabhu, Dinesh K.; Brandis, Aaron M.; Bose, Deepak

    2011-01-01

    This paper investigates the shock-layer radiative heating uncertainty for hyperbolic Earth entry, with the main focus being a Mars return. In Part I of this work, a baseline simulation approach involving the LAURA Navier-Stokes code with coupled ablation and radiation is presented, with the HARA radiation code being used for the radiation predictions. Flight cases representative of peak-heating Mars or asteroid return are defined, and the strong influence of coupled ablation and radiation on their aerothermodynamic environments is shown. Structural uncertainties inherent in the baseline simulations are identified, with turbulence modeling, precursor absorption, grid convergence, and radiation transport uncertainties combining for a +34% and -24% structural uncertainty on the radiative heating. A parametric uncertainty analysis, which assumes interval uncertainties, is presented. This analysis accounts for uncertainties in the radiation models as well as heat-of-formation uncertainties in the flow field model. Discussions and references are provided to support the uncertainty range chosen for each parameter. A parametric uncertainty of +47.3% and -28.3% is computed for the stagnation-point radiative heating for the 15 km/s Mars-return case. A breakdown of the largest individual uncertainty contributors is presented, which includes the C3 Swings cross-section, the photoionization edge shift, and the Opacity Project atomic lines. Combining the structural and parametric uncertainty components results in a total uncertainty of +81.3% and -52.3% for the Mars-return case. In Part II, the computational technique and uncertainty analysis presented in Part I are applied to 1960s-era shock-tube and constricted-arc experimental cases. It is shown that the experiments contain shock-layer temperatures and radiative flux values relevant to the Mars-return cases of present interest. Comparisons between the predictions and measurements, accounting for the uncertainty in both, are made for a range of experiments. A measure of comparison quality is defined, which consists of the percent overlap of the predicted uncertainty bar with the corresponding measurement uncertainty bar. For nearly all cases, this percent overlap is greater than zero, and for most of the higher temperature cases (T > 13,000 K) it is greater than 50%. These favorable comparisons provide evidence that the baseline computational technique and uncertainty analysis presented in Part I are adequate for Mars-return simulations. In Part III, the computational technique and uncertainty analysis presented in Part I are applied to EAST shock-tube cases. These experimental cases contain wavelength-dependent intensity measurements in a wavelength range that covers 60% of the radiative intensity for the 11 km/s, 5 m radius flight case studied in Part I. Comparisons between the predictions and EAST measurements are made for a range of experiments. The uncertainty analysis presented in Part I is applied to each prediction, and comparisons are made using the metrics defined in Part II. The agreement between predictions and measurements is excellent for velocities greater than 10.5 km/s. Both the wavelength-dependent and wavelength-integrated intensities agree within 30% for nearly all cases considered. This agreement provides confidence in the computational technique and uncertainty analysis presented in Part I, and provides further evidence that this approach is adequate for Mars-return simulations.
Part IV of this paper reviews existing experimental data that include the influence of massive ablation on radiative heating. It is concluded that this existing data is not sufficient for the present uncertainty analysis. Experiments to capture the influence of massive ablation on radiation are suggested as future work, along with further studies of the radiative precursor and improvements in the radiation properties of ablation products.
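
    The Part II comparison metric lends itself to a compact calculation. Below is a minimal sketch under the assumption that both uncertainty bars are symmetric intervals; the function name and all numbers are hypothetical, not taken from the paper.

    ```python
    import numpy as np

    def interval_overlap_percent(pred, pred_unc, meas, meas_unc):
        """Percent of the predicted uncertainty bar that overlaps the
        measured uncertainty bar (one plausible reading of the metric)."""
        lo = max(pred - pred_unc, meas - meas_unc)
        hi = min(pred + pred_unc, meas + meas_unc)
        overlap = max(0.0, hi - lo)
        return 100.0 * overlap / (2.0 * pred_unc)

    # Example: a prediction of 100 W/cm^2 with a +/-60% bar compared with a
    # measurement of 130 W/cm^2 with a +/-40 W/cm^2 bar (illustration only).
    print(interval_overlap_percent(100.0, 60.0, 130.0, 40.0))
    ```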

  7. Deriving persistence indicators from regulatory water-sediment studies – opportunities and limitations in OECD 308 data.

    PubMed

    Honti, Mark; Fenner, Kathrin

    2015-05-19

    The OECD guideline 308 describes a laboratory test method to assess aerobic and anaerobic transformation of organic chemicals in aquatic sediment systems and is an integral part of tiered testing strategies in different legislative frameworks for the environmental risk assessment of chemicals. The results from experiments carried out according to OECD 308 are generally used to derive persistence indicators for hazard assessment or half-lives for exposure assessment. We used Bayesian parameter estimation and system representations of various complexities to systematically assess opportunities and limitations for estimating these indicators from existing data generated according to OECD 308 for 23 pesticides and pharmaceuticals. We found that there is a disparity between the uncertainty and the conceptual robustness of persistence indicators. Disappearance half-lives are directly extractable with limited uncertainty, but they lump degradation and phase transfer information and are not robust against changes in system geometry. Transformation half-lives are less system-specific but require inverse modeling to extract, resulting in considerable uncertainty. Available data were thus insufficient to derive indicators that had both acceptable robustness and uncertainty, which further supports previously voiced concerns about the usability and efficiency of these costly experiments. Despite the limitations of existing data, we suggest the time until 50% of the parent compound has been transformed in the entire system (DegT50,system) could still be a useful indicator of persistence in the upper, partially aerobic sediment layer in the context of PBT assessment. This should, however, be accompanied by a mandatory reporting or full standardization of the geometry of the experimental system. We recommend transformation half-lives determined by inverse modeling to be used as input parameters into fate models for exposure assessment, if due consideration is given to their uncertainty.
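
    As context for the indicator itself: DegT50,system follows from fitting a degradation model to total-system parent residues. The paper uses Bayesian inference with system representations of varying complexity; the sketch below shows only the simplest single first-order (SFO) variant, with invented residue data.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical total-system parent residues (% applied) vs time (days)
    t = np.array([0, 7, 14, 30, 60, 100], dtype=float)
    parent = np.array([100, 78, 61, 38, 15, 6], dtype=float)

    def sfo(t, k):
        # Single first-order (SFO) decline of the parent compound
        return 100.0 * np.exp(-k * t)

    (k,), cov = curve_fit(sfo, t, parent, p0=[0.05])
    degt50_system = np.log(2.0) / k
    k_sd = np.sqrt(cov[0, 0])
    print(f"DegT50,system = {degt50_system:.1f} d (k = {k:.3f} +/- {k_sd:.3f} 1/d)")
    ```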

  8. Modelling of plasma-based dry reforming: how do uncertainties in the input data affect the calculation results?

    NASA Astrophysics Data System (ADS)

    Wang, Weizong; Berthelot, Antonin; Zhang, Quanzhi; Bogaerts, Annemie

    2018-05-01

    One of the main issues in plasma chemistry modeling is that the cross sections and rate coefficients are subject to uncertainties, which propagate into the modeling results and hence hinder the predictive capabilities. In this paper, we reveal the impact of these uncertainties on the model predictions of plasma-based dry reforming in a dielectric barrier discharge. For this purpose, we performed a detailed uncertainty analysis and sensitivity study. 2000 different combinations of rate coefficients, sampled from log-normal distributions reflecting the uncertainty of each coefficient, are used to predict the uncertainties in the model output. The uncertainties in the electron density and electron temperature are around 11% and 8% at the maximum of the power deposition for a 70% confidence level. Still, this can have a major effect on the electron impact rates and hence on the calculated conversions of CO2 and CH4, as well as on the selectivities of CO and H2. For the CO2 and CH4 conversion, we obtain uncertainties of 24% and 33%, respectively. For the CO and H2 selectivity, the corresponding uncertainties are 28% and 14%, respectively. We also identify which reactions contribute most to the uncertainty in the model predictions. In order to improve the accuracy and reliability of plasma chemistry models, we recommend using only verified rate coefficients, and we point out the need for dedicated verification experiments.
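
    A minimal sketch of this kind of rate-coefficient sampling is given below, assuming log-normal perturbations whose spread is set by an uncertainty factor; the reactions, nominal values and factor are invented, not taken from the model.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_samples = 2000  # as in the study

    # Hypothetical nominal rate coefficients and a common uncertainty factor f
    # (true value assumed to lie within k/f .. k*f); illustrative values only.
    k_nominal = {"e + CO2 -> e + CO + O": 1.0e-15,
                 "e + CH4 -> e + CH3 + H": 5.0e-16}
    f = 2.0

    # Log-normal sampling: ln k ~ N(ln k_nominal, sigma), with sigma chosen so
    # that the ~2-sigma range matches the uncertainty factor.
    sigma = np.log(f) / 2.0
    samples = {name: k * np.exp(rng.normal(0.0, sigma, n_samples))
               for name, k in k_nominal.items()}

    for name, s in samples.items():
        print(name, f"median = {np.median(s):.2e}, 70% CI =",
              np.percentile(s, [15, 85]))
    ```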

  9. Real options analysis for land use management: Methods, application, and implications for policy.

    PubMed

    Regan, Courtney M; Bryan, Brett A; Connor, Jeffery D; Meyer, Wayne S; Ostendorf, Bertram; Zhu, Zili; Bao, Chenming

    2015-09-15

    Discounted cash flow analysis, including net present value (NPV), is an established way to value land use and management investments which accounts for the time-value of money. However, it provides a static view and assumes passive commitment to an investment strategy, whereas real-world land use and management investment decisions are characterised by uncertainty, irreversibility, change, and adaptation. Real options analysis has been proposed as a better valuation method under uncertainty, where the opportunity exists to delay investment decisions pending more information. We briefly review the use of discounted cash flow methods in land use and management and discuss their benefits and limitations. We then provide an overview of real options analysis, describe the main analytical methods, and summarize its application to land use investment decisions. Real options analysis is largely underutilized in evaluating land use decisions, despite the uncertainty in policy and economic drivers and the irreversibility and sunk costs involved. New simulation methods offer the potential for overcoming current technical challenges to implementation, as demonstrated with a real options simulation model used to evaluate an agricultural land use decision in South Australia. We conclude that considering option values in future policy design will provide a more realistic assessment of landholder investment decision making and provide insights for improved policy performance. Copyright © 2015 Elsevier Ltd. All rights reserved.
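
    To make the option-value idea concrete, here is a minimal binomial-lattice sketch of the option to delay an irreversible land-use conversion. This is a generic textbook construction, not the simulation model used in the paper, and every parameter value is invented.

    ```python
    import numpy as np

    # Value of the flexibility to delay an irreversible land-use conversion.
    V0, I = 1000.0, 950.0                 # PV of converted use; sunk conversion cost
    sigma, r, T, n = 0.25, 0.04, 5.0, 60  # volatility, risk-free rate, years, steps

    dt = T / n
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    p = (np.exp(r * dt) - d) / (u - d)    # risk-neutral probability

    # Project values at the final step, then backward induction keeping the
    # option alive whenever waiting is worth more than converting now.
    V = V0 * u ** np.arange(n, -1, -1) * d ** np.arange(0, n + 1)
    option = np.maximum(V - I, 0.0)
    for _ in range(n):
        option = np.exp(-r * dt) * (p * option[:-1] + (1 - p) * option[1:])
        V = V[:-1] * d
        option = np.maximum(option, V - I)

    print(f"NPV now: {V0 - I:.0f}; option value with flexibility: {option[0]:.0f}")
    ```

    Because waiting preserves flexibility, the option value exceeds the immediate NPV whenever uncertainty (sigma) is material, which is precisely the effect a static discounted cash flow analysis misses.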

  10. Assessing the Agreement Between Eo-Based Semi-Automated Landslide Maps with Fuzzy Manual Landslide Delineation

    NASA Astrophysics Data System (ADS)

    Albrecht, F.; Hölbling, D.; Friedl, B.

    2017-09-01

    Landslide mapping benefits from the ever-increasing availability of Earth Observation (EO) data resulting from programmes like the Copernicus Sentinel missions and from improved infrastructure for data access. However, the need arises for improved automated landslide information extraction from EO data, as the dominant method is still manual delineation. Object-based image analysis (OBIA) provides the means for fast and efficient extraction of landslide information. To prove its quality, automated results are often compared to manually delineated landslide maps. Although there is awareness of the uncertainties inherent in manual delineations, there is a lack of understanding of how they affect the levels of agreement in a direct comparison of OBIA-derived and manually derived landslide maps. In order to provide an improved reference, we present a fuzzy approach for the manual delineation of landslides on optical satellite images, thereby making the inherent uncertainties of the delineation explicit. The fuzzy manual delineation and the OBIA classification are compared by accuracy metrics accepted in the remote sensing community. We have tested this approach on high resolution (HR) satellite images of three large landslides in Austria and Italy. We were able to show that the deviation of the OBIA result from the manual delineation can mainly be attributed to the uncertainty inherent in the manual delineation process, a relevant issue for the design of validation processes for OBIA-derived landslide maps.

  11. Improving Future Ecosystem Benefits through Earth Observations: the H2020 Project ECOPOTENTIAL

    NASA Astrophysics Data System (ADS)

    Provenzale, Antonello; Beierkuhnlein, Carl; Ziv, Guy

    2016-04-01

    Terrestrial and marine ecosystems provide essential goods and services to human societies. In the last decades, however, anthropogenic pressures have caused serious threats to ecosystem integrity, functions and processes, potentially leading to the loss of essential ecosystem services. ECOPOTENTIAL is a large European-funded H2020 project which focuses its activities on a targeted set of internationally recognised protected areas in Europe, European Territories and beyond, blending Earth Observations from remote sensing and field measurements, data analysis and modelling of current and future ecosystem conditions and services. The definition of future scenarios is based on climate and land-use change projections, addressing the issue of uncertainties and uncertainty propagation across the modelling chain. The ECOPOTENTIAL project addresses cross-scale geosphere-biosphere interactions and landscape-ecosystem dynamics at regional to continental scales, using geostatistical methods and the emerging approaches in Macrosystem Ecology and Earth Critical Zone studies, addressing long-term and large-scale environmental and ecological challenges. The project started its activities in 2015, by defining a set of storylines which make it possible to tackle some of the most crucial issues in the assessment of present conditions and the estimation of the future state of selected ecosystem services. In this contribution, we present some of the main storylines of the project and discuss the general approach, with emphasis on the interplay of data and models and on the estimation of projection uncertainties.

  12. Carbon storage in Chinese grassland ecosystems: Influence of different integrative methods.

    PubMed

    Ma, Anna; He, Nianpeng; Yu, Guirui; Wen, Ding; Peng, Shunlei

    2016-02-17

    The accurate estimation of grassland carbon (C) storage is affected by many factors at the large scale. Here, we used six methods (three spatial interpolation methods and three grassland classification methods) to estimate the C storage of Chinese grasslands based on published data from 2004 to 2014, and assessed the uncertainty resulting from the different integrative methods. The uncertainty (coefficient of variation, CV, %) of grassland C storage was approximately 4.8% for the six methods tested and was mainly determined by soil C storage. C density and C storage to a soil depth of 100 cm were estimated to be 8.46 ± 0.41 kg C m⁻² and 30.98 ± 1.25 Pg C, respectively. Ecosystem C storage was composed of 0.23 ± 0.01 Pg C (0.7%) in above-ground biomass, 1.38 ± 0.14 Pg C (4.5%) in below-ground biomass, and 29.37 ± 1.2 Pg C (94.8%) in the 0-100 cm soil layer. Carbon storage calculated with the grassland classification methods (18 grassland types) was closer to the mean value than that calculated with the spatial interpolation methods. Differences in integrative methods may partially explain the high uncertainty in C storage estimates across studies. This first evaluation demonstrates the importance of multi-methodological approaches for accurately estimating C storage in large-scale terrestrial ecosystems.
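
    The reported spread across methods is a simple coefficient of variation. A minimal sketch, with invented estimates standing in for the six methods:

    ```python
    import numpy as np

    # Hypothetical C-storage estimates (Pg C) from six integrative methods
    # (three spatial interpolations, three classifications); invented values.
    estimates = np.array([30.1, 31.5, 29.8, 32.0, 30.6, 31.9])

    mean = estimates.mean()
    cv = estimates.std(ddof=1) / mean * 100.0  # coefficient of variation, %
    print(f"mean = {mean:.2f} Pg C, CV = {cv:.1f}%")
    ```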

  13. Temporal models for the episodic volcanism of Campi Flegrei caldera (Italy) with uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Bevilacqua, Andrea; Flandoli, Franco; Neri, Augusto; Isaia, Roberto; Vitale, Stefano

    2016-11-01

    After the large-scale event of the Neapolitan Yellow Tuff (~15 ka B.P.), intense and mostly explosive volcanism has occurred within and along the boundaries of the Campi Flegrei caldera (Italy). Eruptions occurred closely spaced in time, over periods from a few centuries to a few millennia, alternating with periods of quiescence lasting up to several millennia. Events often also occurred closely in space, thus generating clusters. This study had two main objectives: (1) to describe the uncertainty in the geologic record by using a quantitative model and (2) to develop, based on the uncertainty assessment, a long-term subdomain-specific temporal probability model that describes the temporal and spatial eruptive behavior of the caldera. In particular, the study adopts a space-time doubly stochastic nonhomogeneous Poisson-type model with a local self-excitation feature able to generate clustering of events consistent with the reconstructed record of Campi Flegrei. Results allow the evaluation of similarities and differences between the three epochs of activity, as well as the derivation of the eruptive base rate of the caldera and its capacity to generate clusters of events. The temporal probability model is also used to investigate the effect of the most recent eruption of Monte Nuovo (A.D. 1538) on a possible reactivation of the caldera and to estimate the time to the next eruption under different volcanological and modeling assumptions.
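
    The self-excitation feature is closely related to Hawkes-type point processes. As a rough illustration of how such a process generates temporal clusters, here is a minimal Ogata-thinning simulation; it is a generic stand-in for, not a reproduction of, the paper's doubly stochastic model, and all rates are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Self-exciting (Hawkes) process: base rate plus exponentially decaying
    # excitation from each past event. Parameters are purely illustrative.
    mu, alpha, beta = 0.02, 0.5, 0.1  # events/yr, branching ratio, decay (1/yr)

    def intensity(t, events):
        past = events[events < t]
        return mu + alpha * beta * np.exp(-beta * (t - past)).sum()

    events, t, horizon = np.array([]), 0.0, 1000.0
    while t < horizon:
        # Upper bound on the (non-increasing) intensity until the next event;
        # the extra alpha*beta covers the jump of an event occurring exactly at t.
        lam_bar = intensity(t, events) + alpha * beta
        t += rng.exponential(1.0 / lam_bar)
        if t < horizon and rng.uniform() < intensity(t, events) / lam_bar:
            events = np.append(events, t)

    print(f"{events.size} simulated eruptions in {horizon:.0f} yr")
    ```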

  14. Comparing methods for Earthquake Location

    NASA Astrophysics Data System (ADS)

    Turkaya, Semih; Bodin, Thomas; Sylvander, Matthieu; Parroucau, Pierre; Manchuel, Kevin

    2017-04-01

    There are plenty of methods available for locating small-magnitude point-source earthquakes. However, it is known that these different approaches produce different results. For each approach, results also depend on a number of parameters, which can be separated into two main branches: (1) parameters related to observations (their number and distribution, for example) and (2) parameters related to the inversion process (velocity model, weighting parameters, initial location, etc.). Currently, the results obtained from most of the location methods do not systematically include quantitative uncertainties. The effect of the selected parameters on location uncertainties is also poorly known. Understanding the importance of these different parameters and their effect on uncertainties is clearly required to better constrain knowledge of fault geometry and seismotectonic processes and, in the end, to improve seismic hazard assessment. In this work, realized in the frame of the SINAPS@ research program (http://www.institut-seism.fr/projets/sinaps/), we analyse the effect of different parameters on earthquake location (e.g. type of phase, max. hypocentral separation, etc.). We compare several available codes (Hypo71, HypoDD, NonLinLoc, etc.) and determine their strengths and weaknesses in different cases by means of synthetic tests. The work, performed for the moment on synthetic data, is planned to be applied, in a second step, to data collected by the Midi-Pyrénées Observatory (OMP).

  15. Dynamic rating curve assessment for hydrometric stations and computation of the associated uncertainties: Quality and station management indicators

    NASA Astrophysics Data System (ADS)

    Morlot, Thomas; Perret, Christian; Favre, Anne-Catherine; Jalbert, Jonathan

    2014-09-01

    A rating curve is used to indirectly estimate the discharge in rivers based on water level measurements. The discharge values obtained from a rating curve include uncertainties related to the direct stage-discharge measurements (gaugings) used to build the curve, the quality of fit of the curve to these measurements, and the constant changes in river bed morphology. Moreover, the uncertainty of discharges estimated from a rating curve increases with the “age” of the rating curve. The level of uncertainty at a given point in time is therefore particularly difficult to assess. A “dynamic” method has been developed to compute rating curves while calculating the associated uncertainties, thus making it possible to regenerate streamflow data with uncertainty estimates. The method is based on historical gaugings at hydrometric stations. A rating curve is computed for each gauging and a model of the uncertainty is fitted for each of them. The model of uncertainty takes into account the uncertainties in the measurement of the water level, the quality of fit of the curve, the uncertainty of the gaugings, and the increase of the uncertainty of discharge estimates with the age of the rating curve, computed with a variographic analysis (Jalbert et al., 2011). The presented dynamic method can answer important questions in the field of hydrometry such as “How many gaugings a year are required to produce streamflow data with an average uncertainty of X%?” and “When and in what range of water flow rates should these gaugings be carried out?”. The Rocherousse hydrometric station (France, Haute-Durance watershed, 946 km²) is used as an example throughout the paper. Other stations are used to illustrate certain points.
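
    The starting point of such a method is fitting a rating curve to the gaugings. A minimal sketch of the classical power-law fit, with invented gaugings (the dynamic, age-dependent uncertainty model of the paper is not reproduced here):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical gaugings: stage h (m) and discharge Q (m^3/s); invented values.
    h = np.array([0.42, 0.55, 0.71, 0.90, 1.10, 1.35, 1.60])
    Q = np.array([3.1, 5.6, 9.8, 16.0, 24.5, 38.0, 54.0])

    def rating(h, a, h0, b):
        # Classical power-law rating curve Q = a * (h - h0)^b
        return a * (h - h0) ** b

    popt, pcov = curve_fit(rating, h, Q, p0=[20.0, 0.2, 1.8],
                           bounds=([0.0, 0.0, 0.5], [200.0, 0.41, 3.0]))
    perr = np.sqrt(np.diag(pcov))  # 1-sigma parameter uncertainties from the fit
    print("a, h0, b =", popt, "+/-", perr)
    ```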

  16. Measuring the uncertainties of discharge measurements: interlaboratory experiments in hydrometry

    NASA Astrophysics Data System (ADS)

    Le Coz, Jérôme; Blanquart, Bertrand; Pobanz, Karine; Dramais, Guillaume; Pierrefeu, Gilles; Hauet, Alexandre; Despax, Aurélien

    2015-04-01

    Quantifying the uncertainty of streamflow data is key for the hydrological sciences. Conventional uncertainty analysis based on error propagation techniques is restricted by the absence of traceable discharge standards and by the weight of difficult-to-predict errors related to the operator, the procedure and the measurement environment. Field interlaboratory experiments have recently emerged as an efficient, standardized method to 'measure' the uncertainties of a given streamgauging technique in given measurement conditions. Both uncertainty approaches are compatible and should be developed jointly in the field of hydrometry. In recent years, several interlaboratory experiments have been reported by different hydrological services. They involved different streamgauging techniques, including acoustic profilers (ADCP), current-meters and handheld radars (SVR). Uncertainty analysis was not always their primary goal: most often, testing the proficiency and homogeneity of instruments, makes and models, procedures and operators was the original motivation. When interlaboratory experiments are processed for uncertainty analysis, once outliers have been discarded, all participants are assumed to be equally skilled and to apply the same streamgauging technique in equivalent conditions. A universal requirement is that all participants simultaneously measure the same discharge, which shall be kept constant within negligible variations. To the best of our knowledge, we were the first to apply the interlaboratory method to computing the uncertainties of streamgauging techniques according to the authoritative international documents (ISO standards). Several specific issues arise due to the measurement conditions in outdoor canals and rivers. The main limitation is that the best available river discharge references are usually too uncertain to quantify the bias of the streamgauging technique, i.e. the systematic errors that are common to all participants in the experiment. A reference, or a sensitivity analysis of the fixed parameters of the streamgauging technique, remains very useful for estimating the uncertainty related to the (non-quantified) bias correction. In the absence of a reference, the uncertainty estimate is referenced to the average of all discharge measurements in the interlaboratory experiment, ignoring the technique bias. Simple equations can be used to assess the uncertainty of the uncertainty results as a function of the number of participants and of repeated measurements. The interlaboratory method was applied to several interlaboratory experiments on ADCPs and current-meters mounted on wading rods, in streams of different sizes and aspects, with typically 10 to 30 instruments. The uncertainty results were consistent with the usual expert judgment and highly depended on the measurement environment. Approximately, the expanded uncertainties (within the 95% probability interval) were ±5% to ±10% for ADCPs in good or poor conditions, and ±10% to ±15% for current-meters in shallow creeks. Due to the specific limitations related to a slow measurement process and to small, natural streams, uncertainty results for current-meters were more uncertain than for ADCPs, for which the site-specific errors were significantly evidenced. The proposed method can be applied, in a standardized way, to a wide range of interlaboratory experiments conducted in contrasted environments for different streamgauging techniques.
Ideally, an international open database would enhance the investigation of hydrological data uncertainties, according to the characteristics of the measurement conditions and procedures. Such a dataset could be used for implementing and validating uncertainty propagation methods in hydrometry.
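
    A simplified, ISO 5725-style reduction of an interlaboratory experiment might look as follows; the participant count, repetitions and numbers are invented, and the technique bias is ignored exactly as discussed above.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical interlaboratory data: 12 participants gauge the same constant
    # discharge, each repeating 4 measurements (m^3/s); all values invented.
    Q = 100.0 + rng.normal(0.0, 2.0, size=(12, 1)) + rng.normal(0.0, 1.0, size=(12, 4))

    n_rep = Q.shape[1]
    s_r = np.sqrt(np.mean(Q.var(axis=1, ddof=1)))   # repeatability (within)
    s_means = Q.mean(axis=1).std(ddof=1)
    s_L2 = max(0.0, s_means**2 - s_r**2 / n_rep)    # between-participant variance
    s_R = np.sqrt(s_L2 + s_r**2)                    # reproducibility

    # Without a traceable discharge reference, the estimate is referenced to the
    # grand mean and the technique bias is ignored (simplified treatment).
    U95 = 2.0 * s_R / Q.mean() * 100.0              # expanded uncertainty, % (k = 2)
    print(f"s_r = {s_r:.2f}, s_R = {s_R:.2f} m^3/s, U95 = +/-{U95:.1f}%")
    ```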

  17. Wildfire Decision Making Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Thompson, M.

    2013-12-01

    Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bao, C.; Hanany, S.; Baccigalupi, C.

    We extend a general maximum likelihood foreground estimation for cosmic microwave background (CMB) polarization data to include estimation of instrumental systematic effects. We focus on two particular effects: frequency band measurement uncertainty and instrumentally induced frequency-dependent polarization rotation. We assess the bias induced on the estimation of the B-mode polarization signal by these two systematic effects in the presence of instrumental noise and uncertainties in the polarization and spectral index of Galactic dust. Degeneracies between uncertainties in the band and polarization angle calibration measurements and in the dust spectral index and polarization increase the uncertainty in the extracted CMB B-mode power, and may give rise to a biased estimate. We provide a quantitative assessment of the potential bias and increased uncertainty in an example experimental configuration. For example, we find that with 10% polarized dust, a tensor-to-scalar ratio of r = 0.05, and the instrumental configuration of the E and B Experiment (EBEX) balloon payload, the estimated CMB B-mode power spectrum is recovered without bias when the frequency band measurement has 5% uncertainty or less, and the polarization angle calibration has an uncertainty of up to 4°.

  19. [Application of robustness test for assessment of the measurement uncertainty at the end of development phase of a chromatographic method for quantification of water-soluble vitamins].

    PubMed

    Ihssane, B; Bouchafra, H; El Karbane, M; Azougagh, M; Saffaj, T

    2016-05-01

    We propose in this work an efficient way to evaluate the measurement uncertainty at the end of the development step of an analytical method, since this assessment provides an indication of the performance of the optimization process. The uncertainty is estimated through a robustness test applying a Plackett-Burman design, investigating six parameters influencing the simultaneous chromatographic assay of five water-soluble vitamins. The estimated effects of the variation of each parameter are translated into a standard uncertainty value at each concentration level. The relative uncertainty values obtained do not exceed the acceptance limit of 5%, showing that the procedure development was well done. In addition, a statistical comparison between the standard uncertainties after the development stage and those of the validation step indicates that the estimated uncertainties are equivalent. The results obtained clearly show the performance and capacity of the chromatographic method to simultaneously assay the five vitamins and its suitability for routine application. Copyright © 2015 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.
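
    A two-level robustness design of this kind is easy to reproduce. Below is a sketch that builds an 8-run Hadamard-based design for six factors and converts the estimated main effects into a combined standard uncertainty; the responses are invented, and the effect-to-uncertainty conversion rule (a uniform-interval treatment) is an assumption, not necessarily the one used by the authors.

    ```python
    import numpy as np
    from scipy.linalg import hadamard

    # 8-run two-level design (Hadamard-based, Plackett-Burman-like), 6 factors.
    X = hadamard(8)[:, 1:7]  # drop the all-ones column; columns = factor levels +/-1

    # Hypothetical responses: recovery (%) of one vitamin in the 8 robustness runs.
    y = np.array([99.1, 100.4, 98.8, 99.6, 100.9, 99.3, 100.1, 99.8])

    effects = X.T @ y / (len(y) / 2.0)  # main effect of each factor
    # Assumed convention: each effect is the full width of a uniform interval,
    # u_i = |effect| / 2 / sqrt(3), combined in quadrature.
    u = np.sqrt(np.sum((np.abs(effects) / 2.0 / np.sqrt(3.0)) ** 2))
    print("effects:", np.round(effects, 2), f"-> combined u = {u:.2f}%")
    ```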

  20. Validation of Globsnow-2 Snow Water Equivalent Over Eastern Canada

    NASA Technical Reports Server (NTRS)

    Larue, Fanny; Royer, Alain; De Seve, Danielle; Langlois, Alexandre; Roy, Alexandre R.; Brucker, Ludovic

    2017-01-01

    In Québec, Eastern Canada, snowmelt runoff contributes more than 30% of the annual energy reserve for hydroelectricity production, and uncertainties in annual maximum snow water equivalent (SWE) over the region are one of the main constraints on improved hydrological forecasting. Current satellite-based methods for mapping SWE over Québec's main hydropower basins do not meet Hydro-Québec's operational requirement of SWE accuracies with less than 15% error. This paper assesses the accuracy of the GlobSnow-2 (GS-2) SWE product, which combines microwave satellite data and in situ measurements, for hydrological applications in Québec. GS-2 SWE values for a 30-year period (1980 to 2009) were compared with space- and time-matched values from a comprehensive dataset of in situ SWE measurements (a total of 38,990 observations in Eastern Canada). The root mean square error (RMSE) of the GS-2 SWE product is 94.1 ± 20.3 mm, corresponding to an overall relative percentage error (RPE) of 35.9%. The main sources of uncertainty are wet and deep snow conditions (when SWE is higher than 150 mm) and forest cover type. However, compared to a typical stand-alone brightness temperature channel difference algorithm, the assimilation of surface information in the GS-2 algorithm clearly improves SWE accuracy, reducing the RPE by about 30%. Comparison of trends in annual mean and maximum SWE between surface observations and GS-2 over 1980-2009 showed agreement for increasing trends over southern Québec, but less agreement on the sign and magnitude of trends over northern Québec. Extended to a continental scale, the GS-2 SWE trends highlight a strong regional variability.
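
    The two headline scores are straightforward to compute from matched pairs. A minimal sketch with invented SWE values (the RPE definition shown, RMSE relative to the mean observation, is an assumption):

    ```python
    import numpy as np

    def rmse(sat, insitu):
        return np.sqrt(np.mean((sat - insitu) ** 2))

    def relative_percentage_error(sat, insitu):
        # Overall RPE as RMSE relative to the mean observed SWE (one plausible
        # definition; the paper's exact formula may differ).
        return 100.0 * rmse(sat, insitu) / np.mean(insitu)

    # Hypothetical matched SWE pairs (mm); invented values.
    sat = np.array([120.0, 210.0, 95.0, 300.0, 180.0])
    insitu = np.array([150.0, 260.0, 110.0, 340.0, 230.0])
    print(f"RMSE = {rmse(sat, insitu):.1f} mm, "
          f"RPE = {relative_percentage_error(sat, insitu):.1f}%")
    ```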

  1. Assessing the reliability of dose coefficients for exposure to radioiodine by members of the public, accounting for dosimetric and risk model uncertainties.

    PubMed

    Puncher, M; Zhang, W; Harrison, J D; Wakeford, R

    2017-06-26

    Assessments of the risk to a specific population group resulting from internal exposure to a particular radionuclide can be used to assess the reliability of the appropriate International Commission on Radiological Protection (ICRP) dose coefficients used as a radiation protection device for the specified exposure pathway. An estimate of the uncertainty on the associated risk is important for informing judgments on reliability; a derived uncertainty factor, UF, is an estimate of the 95% probable geometric difference between the best risk estimate and the nominal risk, and is a useful tool for making this assessment. This paper describes the application of parameter uncertainty analysis to quantify uncertainties resulting from internal exposures to radioiodine by members of the public, specifically 1-, 10- and 20-year-old females from the population of England and Wales. Best estimates of thyroid cancer incidence risk (lifetime attributable risk) are calculated for ingestion or inhalation of 129I and 131I, accounting for uncertainties in biokinetic model and cancer risk model parameter values. These estimates are compared with the equivalent ICRP-derived nominal age-, sex- and population-averaged estimates of excess thyroid cancer incidence to obtain UFs. Derived UF values for ingestion or inhalation of 131I for 1-, 10- and 20-year-olds are around 28, 12 and 6, respectively, when compared with ICRP Publication 103 nominal values, and 9, 7 and 14, respectively, when compared with ICRP Publication 60 values. Broadly similar results were obtained for 129I. The uncertainties on risk estimates are largely determined by uncertainties on risk model parameters rather than uncertainties on biokinetic model parameters. An examination of the sensitivity of the results to the risk models and populations used in the calculations shows variations in the central estimates of risk of a factor of around 2-3. It is assumed that the direct proportionality of excess thyroid cancer risk and dose observed at low to moderate acute doses and incorporated in the risk models also applies to very small doses received at very low dose rates; the uncertainty in this assumption is considerable, but largely unquantifiable. The UF values illustrate the need for an informed approach to the use of ICRP dose and risk coefficients.
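
    Given Monte Carlo samples of the population-specific risk, a UF of the kind described can be estimated directly. The sketch below is one plausible reading of the definition (95th percentile of the absolute log ratio to the nominal value); the distribution and nominal risk are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    # Monte Carlo samples of lifetime attributable risk per unit intake, drawn
    # here from an invented log-normal distribution for illustration only.
    risk_samples = rng.lognormal(mean=np.log(2.0e-2), sigma=0.8, size=100_000)
    nominal = 1.0e-2  # stand-in for the ICRP nominal risk estimate

    # UF as the 95% probable geometric difference between the risk estimate
    # and the nominal value (an assumed operationalization).
    ratios = risk_samples / nominal
    uf = np.exp(np.percentile(np.abs(np.log(ratios)), 95))
    print(f"UF = {uf:.1f}")
    ```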

  2. Nationwide tsunami hazard assessment project in Japan

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Ohsumi, T.; Morikawa, N.; Kawai, S.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.

    2014-12-01

    In 2012, we began a project of nationwide Probabilistic Tsunami Hazard Assessment (PTHA) in Japan to support various measures (Fujiwara et al., 2013, JpGU; Hirata et al., 2014, AOGS). The most important strategy in the nationwide PTHA is the predominance of aleatory uncertainty in the assessment, with the use of epistemic uncertainties limited to a minimum, because the number of possible combinations of epistemic uncertainties diverges quickly as their number increases; we consider only the type of earthquake occurrence probability distribution as an epistemic uncertainty. We briefly outline the nationwide PTHA as follows: (i) we consider all possible future earthquakes, including those that the Headquarters for Earthquake Research Promotion (HERP) of the Japanese Government has already assessed. (ii) We construct a set of simplified earthquake fault models, called "Characterized Earthquake Fault Models (CEFMs)", for all of the earthquakes, by following prescribed rules (Toyama et al., 2014, JpGU; Korenaga et al., 2014, JpGU). (iii) For all initial water surface distributions caused by the CEFMs, we calculate tsunamis by solving a nonlinear long wave equation using an FDM, including runup calculation, over a nested grid system with a minimum grid size of 50 meters. (iv) Finally, we integrate the information about the tsunamis calculated from the numerous CEFMs to obtain nationwide tsunami hazard assessments. One of the most popular representations of the integrated information is a tsunami hazard curve for coastal tsunami heights, incorporating uncertainties inherent in the tsunami simulation and earthquake fault slip heterogeneity (Abe et al., 2014, JpGU). We will show a PTHA along the eastern coast of Honshu, Japan, based on approximately 1,800 tsunami sources located within the subduction zone along the Japan Trench, as a prototype of the nationwide PTHA. This study is supported by the research project on the evaluation of hazard and risk of natural disasters, under the direction of the HERP of the Japanese Government.
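
    Step (iv) is, at a single site, the aggregation of per-source rates into a hazard curve. A minimal sketch, with invented source rates and simulated coastal heights, and with the uncertainty treatment of the real assessment omitted:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Hypothetical inputs: for each of N sources, an annual occurrence rate and a
    # simulated maximum coastal tsunami height at one site; invented values.
    n_sources = 1800
    rates = rng.uniform(1e-5, 1e-3, n_sources)             # events / yr
    heights = rng.lognormal(np.log(1.5), 0.8, n_sources)   # m

    # Hazard curve: annual exceedance rate, then Poisson probability of
    # exceeding each height within a 50-year window.
    h_grid = np.linspace(0.0, 10.0, 101)
    exceed_rate = np.array([rates[heights > h].sum() for h in h_grid])
    p50yr = 1.0 - np.exp(-exceed_rate * 50.0)

    print(f"P(height > 3 m in 50 yr) = {p50yr[h_grid.searchsorted(3.0)]:.3f}")
    ```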

  3. Influence of Subjectivity in Geological Mapping on the Net Penetration Rate Prediction for a Hard Rock TBM

    NASA Astrophysics Data System (ADS)

    Seo, Yongbeom; Macias, Francisco Javier; Jakobsen, Pål Drevland; Bruland, Amund

    2018-05-01

    The net penetration rate of hard rock tunnel boring machines (TBMs) is influenced by the rock mass degree of fracturing. This influence is taken into account in the NTNU prediction model by the rock mass fracturing factor (ks). ks is evaluated by geological mapping: the measurement of the orientation and spacing of fractures and of the fracture type. Geological mapping is a subjective procedure, so mapping results can contain considerable uncertainty. The mapping data of a tunnel mapped by three researchers were compared, and the variation between them was used to assess the influence of subjectivity in geological mapping. This study compares predicted net penetration rates and actual net penetration rates for TBM tunneling (from field data) and suggests mapping methods that can reduce the error related to subjectivity. The main findings of this paper are as follows: (1) the variation of mapping data between individuals; (2) the effect of the observed variation on uncertainty in predicted net penetration rates; (3) the influence of mapping methods on the difference between predicted and actual net penetration rates.

  4. Spectroscopy Made Easy: Evolution

    NASA Astrophysics Data System (ADS)

    Piskunov, Nikolai; Valenti, Jeff A.

    2017-01-01

    Context. The Spectroscopy Made Easy (SME) package has become a popular tool for analyzing stellar spectra, often in connection with large surveys or exoplanet research. SME has evolved significantly since it was first described in 1996, but many of the original caveats and potholes still haunt users. The main drivers for this paper are complexity of the modeling task, the large user community, and the massive effort that has gone into SME. Aims: We do not intend to give a comprehensive introduction to stellar atmospheres, but will describe changes to key components of SME: the equation of state, opacities, and radiative transfer. We will describe the analysis and fitting procedure and investigate various error sources that affect inferred parameters. Methods: We review the current status of SME, emphasizing new algorithms and methods. We describe some best practices for using the package, based on lessons learned over two decades of SME usage. We present a new way to assess uncertainties in derived stellar parameters. Results: Improvements made to SME, better line data, and new model atmospheres yield more realistic stellar spectra, but in many cases systematic errors still dominate over measurement uncertainty. Future enhancements are outlined.

  5. Reconstruction of MODIS total suspended matter time series maps by DINEOF and validation with autonomous platform data

    NASA Astrophysics Data System (ADS)

    Nechad, Bouchra; Alvera-Azcaràte, Aida; Ruddick, Kevin; Greenwood, Naomi

    2011-08-01

    In situ measurements of total suspended matter (TSM) over the period 2003-2006, collected with two autonomous platforms from the Centre for Environment, Fisheries and Aquaculture Science (Cefas) measuring optical backscatter (OBS) in the southern North Sea, are used to assess the accuracy of TSM time series extracted from satellite data. Since there are gaps in the remote sensing (RS) data, due mainly to cloud cover, the Data Interpolating Empirical Orthogonal Functions (DINEOF) method is used to fill in the TSM time series and build a continuous daily "recoloured" dataset. The RS datasets consist of TSM maps derived from MODIS imagery using the bio-optical model of Nechad et al. (Remote Sensing of Environment 114: 854-866, 2010). In this study, the DINEOF time series are compared to the in situ OBS measured in moderately to very turbid waters at West Gabbard and Warp Anchorage, respectively, in the southern North Sea. The discrepancies between instantaneous RS, DINEOF-filled RS and Cefas data are analysed in terms of TSM algorithm uncertainties, space-time variability and DINEOF reconstruction uncertainty.
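
    The core of DINEOF is an iterative truncated-SVD reconstruction of the gappy data matrix. A minimal sketch of that inner loop (without the cross-validated selection of the optimal number of modes that the full method uses), on an invented pixels-by-days matrix:

    ```python
    import numpy as np

    def dineof_fill(X, mask, n_modes=3, n_iter=50):
        """Minimal DINEOF-style gap filling: iteratively replace missing values
        with a truncated-SVD reconstruction (no cross-validated mode choice)."""
        Xf = np.where(mask, X, np.nanmean(X[mask]))  # initialize gaps with the mean
        for _ in range(n_iter):
            U, s, Vt = np.linalg.svd(Xf, full_matrices=False)
            recon = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes]
            Xf = np.where(mask, X, recon)            # keep observed values fixed
        return Xf

    # Hypothetical TSM matrix (pixels x days) with ~40% cloud gaps; invented data.
    rng = np.random.default_rng(2)
    truth = np.outer(rng.uniform(1, 10, 200),
                     1 + 0.3 * np.sin(np.linspace(0, 6, 150)))
    mask = rng.uniform(size=truth.shape) > 0.4
    filled = dineof_fill(np.where(mask, truth, np.nan), mask)
    print(f"gap RMSE: {np.sqrt(np.mean((filled[~mask] - truth[~mask])**2)):.3f}")
    ```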

  6. Review article: Hydrological modeling in glacierized catchments of central Asia - status and challenges

    NASA Astrophysics Data System (ADS)

    Chen, Yaning; Li, Weihong; Fang, Gonghuan; Li, Zhi

    2017-02-01

    Meltwater from glacierized catchments is one of the most important water supplies in central Asia. Therefore, the effects of climate change on glaciers and snow cover will have increasingly significant consequences for runoff. Hydrological modeling has become an indispensable research approach to water resources management in large glacierized river basins, but there is a lack of focus on the modeling of glacial discharge. This paper reviews the status of hydrological modeling in glacierized catchments of central Asia, discussing the limitations of the available models and extrapolating these to future challenges and directions. After reviewing recent efforts, we conclude that the main sources of uncertainty in assessing the regional hydrological impacts of climate change are unreliable and incomplete data sets and the limited understanding of the hydrological regimes of glacierized catchments of central Asia. Runoff trends indicate a complex response to changes in climate. To anticipate the future variation of water resources, it is essential to quantify the responses of hydrologic processes to both climate change and shrinking glaciers in glacierized catchments, and scientific focus should be on reducing the uncertainties linked to these processes.

  7. Development of a European Ensemble System for Seasonal Prediction: Application to crop yield

    NASA Astrophysics Data System (ADS)

    Terres, J. M.; Cantelaube, P.

    2003-04-01

    Western European agriculture is highly intensive, and weather is the main source of uncertainty for crop yield assessment and crop management. In the current system, at the time when a crop yield forecast is issued, the weather conditions leading up to harvest are unknown and are therefore a major source of uncertainty. The use of seasonal weather forecasts would bring additional information for the remaining crop season and has valuable benefits for improving the management of agricultural markets and environmentally sustainable farm practices. An innovative method for supplying seasonal forecast information to crop simulation models has been developed within the framework of the EU-funded research project DEMETER. It consists of running a crop model on each individual member of the seasonal hindcasts to derive a probability distribution of crop yield. Preliminary results, in the form of cumulative probability functions of wheat yield, provide information on both the yield anomaly and the reliability of the forecast. Based on the spread of the probability distribution, the end-user can directly quantify the benefits and risks of taking weather-sensitive decisions.
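
    The mechanics of the method are simple: one crop-model run per ensemble member, then an empirical distribution over the resulting yields. A minimal sketch with invented yields:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Hypothetical yields (t/ha) from a crop model run on each of 63 members
    # of a seasonal hindcast ensemble; invented values.
    yields = rng.normal(6.0, 0.7, 63)

    # Empirical CDF: probability that yield stays below a threshold.
    ys = np.sort(yields)
    cdf = np.arange(1, ys.size + 1) / ys.size
    threshold = 5.0
    p_below = np.interp(threshold, ys, cdf)
    print(f"P(yield < {threshold} t/ha) = {p_below:.2f}")
    ```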

  8. Aggregate exposure approaches for parabens in personal care products: a case assessment for children between 0 and 3 years old

    PubMed Central

    Gosens, Ilse; Delmaar, Christiaan J E; ter Burg, Wouter; de Heer, Cees; Schuur, A Gerlienke

    2014-01-01

    In the risk assessment of chemical substances, aggregation of exposure to a substance from different sources via different pathways is not common practice. Focusing the exposure assessment on a substance from a single source can lead to a significant underestimation of the risk. To gain more insight into how to perform an aggregate exposure assessment, we applied a deterministic (tier 1) and a person-oriented probabilistic approach (tier 2) for exposure to the four most common parabens through personal care products in children between 0 and 3 years old. Following a deterministic approach, a worst-case exposure estimate is calculated for methyl-, ethyl-, propyl- and butylparaben. As an illustration for risk assessment, Margins of Exposure (MoE) are calculated. These are 991 and 4966 for methyl- and ethylparaben, and 8 and 10 for propyl- and butylparaben, respectively. In tier 2, more detailed information on product use was obtained from a small survey of consumer product use. A probabilistic exposure assessment is performed to estimate the variability and uncertainty of exposure in a population. Results show that the internal exposure for each paraben is below the level determined in tier 1. However, for propyl- and butylparaben, the percentage of the population with an exposure corresponding to an MoE below the assumed “safe” value of 100 is 13% and 7%, respectively. In conclusion, a tier 1 approach can be performed using simple equations and default point estimates, and serves as a starting point for exposure and risk assessment. If refinement is warranted, the more data-demanding person-oriented probabilistic approach should be used. This probabilistic approach results in a more realistic exposure estimate, including its uncertainty, and allows the main drivers of exposure to be determined. Furthermore, it allows estimation of the percentage of the population for which the exposure is likely to be above a specific value. PMID:23801276
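
    A tier 1 calculation of this kind reduces to a few multiplications. The sketch below illustrates the pattern (exposure summed over products, then MoE = point of departure / exposure); the product amounts, concentrations, absorption fractions and the NOAEL-like value are all invented, not the study's defaults.

    ```python
    # Deterministic (tier 1) aggregate-exposure sketch; all numbers invented.
    products = {  # product: (amount g/day, paraben fraction, dermal absorption)
        "lotion": (7.5, 0.004, 0.036),
        "shampoo": (0.6, 0.004, 0.036),
    }
    body_weight = 10.0  # kg, small child

    exposure_g = sum(amt * conc * absorb
                     for amt, conc, absorb in products.values())
    exposure_mg_kg = exposure_g * 1000.0 / body_weight  # mg/kg bw/day

    noael = 2.0  # mg/kg bw/day, hypothetical point of departure
    moe = noael / exposure_mg_kg
    print(f"aggregate exposure = {exposure_mg_kg:.3f} mg/kg bw/day, MoE = {moe:.0f}")
    ```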

  9. A comparative experimental evaluation of uncertainty estimation methods for two-component PIV

    NASA Astrophysics Data System (ADS)

    Boomsma, Aaron; Bhattacharya, Sayantan; Troolin, Dan; Pothos, Stamatios; Vlachos, Pavlos

    2016-09-01

    Uncertainty quantification in planar particle image velocimetry (PIV) measurement is critical for proper assessment of the quality and significance of reported results. New uncertainty estimation methods have recently been introduced, generating interest about their applicability and utility. The present study compares and contrasts current methods across two separate experiments and three software packages in order to provide a diversified assessment. We evaluated the performance of four uncertainty estimation methods: primary peak ratio (PPR), mutual information (MI), image matching (IM) and correlation statistics (CS). The PPR method was implemented and tested in two processing codes, using in-house open-source PIV processing software (PRANA, Purdue University) and Insight4G (TSI, Inc.). The MI method was evaluated in PRANA, as was the IM method. The CS method was evaluated using DaVis (LaVision, GmbH). Utilizing two PIV systems for high- and low-resolution measurements and a laser Doppler velocimetry (LDV) system, data were acquired for a total of three cases: a jet flow and a cylinder in cross flow at two Reynolds numbers. LDV measurements were used to establish a point validation against which the high-resolution PIV measurements were validated. Subsequently, the high-resolution PIV measurements were used as a reference against which the low-resolution PIV data were assessed for error and uncertainty. We compared error and uncertainty distributions, spatially varying RMS error and RMS uncertainty, and standard uncertainty coverages. We observed that, qualitatively, each method responded to spatially varying error (i.e. higher-error regions resulted in higher uncertainty predictions in those regions). However, the PPR and MI methods demonstrated a reduced uncertainty dynamic-range response. In contrast, the IM and CS methods showed better response, but under-predicted the uncertainty ranges. The standard coverages (68% confidence interval) ranged from approximately 65%-77% for the PPR and MI methods, 40%-50% for IM and near 50% for CS. These observations illustrate some of the strengths and weaknesses of the methods considered herein and identify future directions for development and improvement.
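
    The standard coverage reported at the end is simply the fraction of vectors whose true error falls inside the estimated uncertainty band. A minimal sketch with synthetic errors and uncertainty estimates:

    ```python
    import numpy as np

    def standard_coverage(error, uncertainty):
        """Fraction of vectors whose true error is within the 1-sigma
        estimated uncertainty; ideally ~68% for well-calibrated estimates."""
        return np.mean(np.abs(error) <= uncertainty)

    # Hypothetical per-vector errors (vs. the reference) and estimated
    # 1-sigma uncertainties from one method; invented values.
    rng = np.random.default_rng(6)
    error = rng.normal(0.0, 0.1, 10_000)           # px
    uncertainty = rng.uniform(0.08, 0.14, 10_000)  # px
    print(f"coverage = {100 * standard_coverage(error, uncertainty):.1f}%")
    ```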

  10. Integrating uncertainties for climate change mitigation

    NASA Astrophysics Data System (ADS)

    Rogelj, Joeri; McCollum, David; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-04-01

    The target of keeping the global average temperature increase below 2°C emerged in the international climate debate more than a decade ago. In response, the scientific community has tried to estimate the costs of reaching such a target through modelling and scenario analysis. Producing such estimates remains a challenge, particularly because of relatively well-known but ill-quantified uncertainties and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on one side, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other side, has worked on achieving an increasingly better understanding of the geophysical response of the Earth system to emissions of greenhouse gases (GHG). This geophysical response remains a key uncertainty for the cost of mitigation scenarios but has only been integrated with assessments of other uncertainties in a rudimentary manner, i.e., for equilibrium conditions. To bridge this gap between the two research communities, we generate distributions of the costs associated with limiting the transient global temperature increase to below specific temperature limits, taking into account uncertainties in multiple dimensions: geophysical, technological, social and political. In other words, we account for uncertainties resulting from our incomplete knowledge of how precisely the climate system reacts to GHG emissions (geophysical uncertainties), of how society will develop (social uncertainties and choices), of which technologies will be available (technological uncertainties and choices), of when we choose to start acting globally on climate change (political choices), and of how much money we are, or are not, willing to spend to achieve climate change mitigation. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by geophysical, future energy demand, and mitigation technology uncertainties. This information is central for policy making, since it helps to understand the relationship between mitigation costs and their potential to reduce the risk of exceeding 2°C, or other temperature limits such as 3°C or 1.5°C, under a wide range of scenarios.

  11. Changing Global Risk Landscape - Challenges for Risk Management (Invited)

    NASA Astrophysics Data System (ADS)

    Wenzel, F.

    2009-12-01

    The exponentially growing losses related to natural disasters on a global scale reflect a changing risk landscape characterized by the influence of climate change and a growing population, particularly in urban agglomerations and coastal zones. As a consequence of these trends we witness (a) new hazards such as landslides due to dwindling permafrost, new patterns of strong precipitation and related floods, the potential for tropical cyclones in the Mediterranean, sea level rise and others; (b) new risks related to large numbers of people in very dense urban areas, and risks related to the vulnerability of infrastructure such as energy supply, water supply, transportation, communication, etc.; and (c) extreme events of unprecedented size and implications. An appropriate answer to these challenges goes beyond classical views of risk assessment and protection. It must include an understanding of risk as changing with time, so that risk assessment needs to be supplemented by risk monitoring. It requires decision making under high uncertainty. The risks (i.e. potentials for future losses) of extreme events are not only high but also very difficult to quantify, as they are characterized by high levels of uncertainty. Uncertainties relate to the frequency, time of occurrence, strength and impact of extreme events, but also to the coping capacities of society in response to them. The characterization and quantification of these uncertainties, and their reduction to the extent possible, are inherent topics of extreme event research. However, the uncertainties will not disappear, so a rational approach to extreme events must include more than reducing them. It requires us to assess and rate the irreducible uncertainties, to evaluate options for mitigation under large uncertainties, and to communicate these to societal sectors. Thus scientists need to develop methodologies that aim at a rational approach to extreme events associated with high levels of uncertainty.

  12. Using Predictive Uncertainty Analysis to Assess Hydrologic Model Performance for a Watershed in Oregon

    NASA Astrophysics Data System (ADS)

    Brannan, K. M.; Somor, A.

    2016-12-01

    A variety of statistics are used to assess watershed model performance, but these statistics do not directly answer the question: what is the uncertainty of my prediction? Understanding predictive uncertainty is important when using a watershed model to develop a Total Maximum Daily Load (TMDL). TMDLs are a key component of the US Clean Water Act and specify the amount of a pollutant that can enter a waterbody while the waterbody still meets water quality criteria. TMDL developers use watershed models to estimate pollutant loads from nonpoint sources of pollution. We are developing a TMDL for bacteria impairments in a watershed in the Coastal Range of Oregon. We set up an HSPF model of the watershed and used the calibration software PEST to estimate HSPF hydrologic parameters and then perform predictive uncertainty analysis of stream flow. We used Monte Carlo simulation to run the model with 1,000 different parameter sets and assess predictive uncertainty. In order to reduce the chance of specious parameter sets, we accounted for the relationships among parameter values by using mathematically based regularization techniques and an estimate of the parameter covariance when generating random parameter sets. We used a novel approach to select flow data for predictive uncertainty analysis: we set aside flow data that occurred on days when bacteria samples were collected and did not use these flows in the estimation of the model parameters. We calculated a percent uncertainty for each flow observation based on 1,000 model runs. We also used several methods to visualize results, with an emphasis on making the data accessible to both technical and general audiences. We will use the predictive uncertainty estimates in the next phase of our work, simulating bacteria fate and transport in the watershed.
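
    The key detail is that the random parameter sets honor the estimated parameter covariance rather than being drawn independently. A minimal sketch of that step and of the per-observation percent uncertainty; the parameter names, moments and the stand-in "model" are all hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    # Draw 1,000 parameter sets from a multivariate normal whose mean and
    # covariance stand in for a PEST-estimated posterior (invented values).
    names = ["LZSN", "UZSN", "INFILT"]   # illustrative HSPF-style parameters
    mean = np.array([6.0, 1.2, 0.08])
    cov = np.array([[0.50, 0.02, 0.001],
                    [0.02, 0.04, 0.000],
                    [0.001, 0.000, 0.0004]])
    param_sets = rng.multivariate_normal(mean, cov, size=1000)

    def run_model(p):
        # Stand-in for the watershed model: returns flow for one observation day.
        return 10.0 + 1.5 * p[0] - 2.0 * p[1] + 40.0 * p[2]

    flows = np.apply_along_axis(run_model, 1, param_sets)
    pct_unc = 100.0 * (np.percentile(flows, 97.5)
                       - np.percentile(flows, 2.5)) / np.median(flows)
    print(f"percent uncertainty for this observation = {pct_unc:.1f}%")
    ```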

  13. Choice of baseline climate data impacts projected species' responses to climate change.

    PubMed

    Baker, David J; Hartley, Andrew J; Butchart, Stuart H M; Willis, Stephen G

    2016-07-01

    Climate data created from historic climate observations are integral to most assessments of potential climate change impacts, and frequently comprise the baseline period used to infer species-climate relationships. They are often also central to downscaling coarse resolution climate simulations from General Circulation Models (GCMs) to project future climate scenarios at ecologically relevant spatial scales. Uncertainty in these baseline data can be large, particularly where weather observations are sparse and climate dynamics are complex (e.g. over mountainous or coastal regions). Yet, importantly, this uncertainty is almost universally overlooked when assessing potential responses of species to climate change. Here, we assessed the importance of historic baseline climate uncertainty for projections of species' responses to future climate change. We built species distribution models (SDMs) for 895 African bird species of conservation concern, using six different climate baselines. We projected these models to two future periods (2040-2069, 2070-2099), using downscaled climate projections, and calculated species turnover and changes in species-specific climate suitability. We found that the choice of baseline climate data constituted an important source of uncertainty in projections of both species turnover and species-specific climate suitability, often comparable with, or more important than, uncertainty arising from the choice of GCM. Importantly, the relative contribution of these factors to projection uncertainty varied spatially. Moreover, when projecting SDMs to sites of biodiversity importance (Important Bird and Biodiversity Areas), these uncertainties altered site-level impacts, which could affect conservation prioritization. Our results highlight that projections of species' responses to climate change are sensitive to uncertainty in the baseline climatology. We recommend that this should be considered routinely in such analyses. © 2016 John Wiley & Sons Ltd.

  14. Uncertainty Analysis in Humidity Measurements by the Psychrometer Method

    PubMed Central

    Chen, Jiunyuan; Chen, Chiachung

    2017-01-01

    The most common and cheapest indirect technique for measuring relative humidity is the psychrometer, based on a dry and a wet temperature sensor. In this study, the measurement uncertainty of relative humidity was evaluated by this indirect method with some empirical equations for calculating relative humidity. Among the six equations tested, the Penman equation had the best predictive ability for the dry bulb temperature range of 15–50 °C. At a fixed dry bulb temperature, an increase in the wet bulb depression increased the error. A new equation for the psychrometer constant was established by regression analysis. This equation can be computed using a calculator. The average predictive error of relative humidity was <0.1% with this new equation. The measurement uncertainty of the relative humidity as affected by the accuracy of the dry and wet bulb temperatures and the numeric values of the measurement uncertainty were evaluated for various conditions. The uncertainty of the wet bulb temperature was the main factor in the RH measurement uncertainty. PMID:28216599

  15. Uncertainty Analysis in Humidity Measurements by the Psychrometer Method.

    PubMed

    Chen, Jiunyuan; Chen, Chiachung

    2017-02-14

    The most common and cheapest indirect technique for measuring relative humidity is the psychrometer, based on a dry and a wet temperature sensor. In this study, the measurement uncertainty of relative humidity was evaluated by this indirect method with some empirical equations for calculating relative humidity. Among the six equations tested, the Penman equation had the best predictive ability for the dry bulb temperature range of 15-50 °C. At a fixed dry bulb temperature, an increase in the wet bulb depression increased the error. A new equation for the psychrometer constant was established by regression analysis. This equation can be computed using a calculator. The average predictive error of relative humidity was <0.1% with this new equation. The measurement uncertainty of the relative humidity as affected by the accuracy of the dry and wet bulb temperatures and the numeric values of the measurement uncertainty were evaluated for various conditions. The uncertainty of the wet bulb temperature was the main factor in the RH measurement uncertainty.
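
    For reference, the psychrometric calculation itself: vapour pressure from the wet-bulb reading minus a psychrometer-constant correction, divided by the saturation vapour pressure at the dry-bulb temperature. The sketch below uses the Magnus formula and a common ventilated-psychrometer constant, not the new equation derived in the paper.

    ```python
    import numpy as np

    def e_sat(T):
        # Magnus formula for saturation vapour pressure (hPa), T in deg C
        return 6.112 * np.exp(17.62 * T / (243.12 + T))

    def relative_humidity(T_dry, T_wet, pressure=1013.25, A=6.62e-4):
        """Psychrometric RH (%): e = e_s(T_wet) - A * P * (T_dry - T_wet).
        A is the psychrometer 'constant' (1/degC); 6.62e-4 is a common value
        for ventilated psychrometers, used here for illustration."""
        e = e_sat(T_wet) - A * pressure * (T_dry - T_wet)
        return 100.0 * e / e_sat(T_dry)

    print(f"RH = {relative_humidity(30.0, 24.0):.1f}%")
    ```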

  16. Dispersal and extrapolation on the accuracy of temporal predictions from distribution models for the Darwin's frog.

    PubMed

    Uribe-Rivera, David E; Soto-Azat, Claudio; Valenzuela-Sánchez, Andrés; Bizama, Gustavo; Simonetti, Javier A; Pliscoff, Patricio

    2017-07-01

    Climate change is a major threat to biodiversity; the development of models that reliably predict its effects on species distributions is a priority for conservation biogeography. Two of the main issues for accurate temporal predictions from Species Distribution Models (SDMs) are model extrapolation and unrealistic dispersal scenarios. We assessed the consequences of these issues for the accuracy of climate-driven SDM predictions for the dispersal-limited Darwin's frog Rhinoderma darwinii in South America. We calibrated models using historical data (1950-1975) and projected them across 40 yr to predict distribution under current climatic conditions, assessing predictive accuracy through the area under the ROC curve (AUC) and the True Skill Statistic (TSS), contrasting binary model predictions against a temporally independent validation data set (i.e., current presences/absences). To assess the effects of incorporating dispersal processes we compared the predictive accuracy of dispersal-constrained models with SDMs without dispersal limitation; and to assess the effects of model extrapolation, we compared predictive accuracy between extrapolated and non-extrapolated areas. The incorporation of dispersal processes enhanced predictive accuracy, mainly due to a decrease in the false presence rate of model predictions, which is consistent with discrimination of suitable but inaccessible habitat. This also had consequences for range size changes over time, which is the most used proxy for extinction risk from climate change. The area of current climatic conditions that was absent in the baseline conditions (i.e., extrapolated areas) represents 39% of the study area, leading to a significant decrease in the predictive accuracy of model predictions for those areas. Our results highlight that (1) incorporating dispersal processes can improve the predictive accuracy of temporal transference of SDMs and reduce the uncertainties of extinction risk assessments from global change; and (2) as geographical areas subject to novel climates are expected to arise, they must be reported, as they show less accurate predictions under future climate scenarios. Consequently, environmental extrapolation and dispersal processes should be explicitly incorporated to report and reduce uncertainties in temporal predictions of SDMs, respectively. In doing so, we expect to improve the reliability of the information we provide for conservation decision makers under future climate change scenarios. © 2017 by the Ecological Society of America.
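
    TSS, one of the two accuracy scores used, is sensitivity plus specificity minus one on the binary predictions. A minimal sketch with invented presence/absence data:

    ```python
    import numpy as np

    def tss(pred, obs):
        """True Skill Statistic = sensitivity + specificity - 1 for binary
        presence/absence predictions."""
        pred, obs = np.asarray(pred, bool), np.asarray(obs, bool)
        sens = (pred & obs).sum() / obs.sum()
        spec = (~pred & ~obs).sum() / (~obs).sum()
        return sens + spec - 1.0

    # Hypothetical current presences/absences vs. binary model predictions.
    obs  = [1, 1, 1, 0, 0, 0, 0, 1, 0, 0]
    pred = [1, 1, 0, 0, 0, 1, 0, 1, 0, 0]
    print(f"TSS = {tss(pred, obs):.2f}")
    ```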

  17. High-resolution modeling assessment of tidal stream resource in Western Passage of Maine, USA

    NASA Astrophysics Data System (ADS)

    Yang, Zhaoqing; Wang, Taiping; Feng, Xi; Xue, Huijie; Kilcher, Levi

    2017-04-01

    Although significant efforts have been made to assess the maximum potential of tidal stream energy at the system-wide scale, accurate assessment of the tidal stream energy resource at the project-design scale requires detailed hydrodynamic simulations using high-resolution three-dimensional (3-D) numerical models. Extensive model validation against high-quality measured data is essential to minimize the uncertainties of the resource assessment. Western Passage in the State of Maine has been identified as one of the top-ranking sites for tidal stream energy development in U.S. coastal waters, based on a number of criteria including tidal power density, market value and transmission distance. This study presents an ongoing modeling effort to simulate the tidal hydrodynamics of Western Passage using the 3-D unstructured-grid Finite Volume Community Ocean Model (FVCOM). The model domain covers a large region including the entire Bay of Fundy, with grid resolution varying from 20 m in Western Passage to approximately 1000 m along the open boundary near the mouth of the Bay of Fundy. Preliminary model validation was conducted using existing NOAA measurements within the model domain. Spatial distributions of tidal power density were calculated, and extractable tidal energy was estimated using a tidal turbine module embedded in FVCOM under different tidal farm scenarios. Additional field measurements to characterize the resource and support model validation are discussed. This study provides an example of high-resolution resource assessment based on the guidance recommended by the International Electrotechnical Commission Technical Specification.
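
    The tidal power density underlying such resource maps is conventionally P = 0.5 ρ |u|³ per unit swept area. The sketch below evaluates it for an idealised semidiurnal velocity record; the velocity samples are illustrative, not FVCOM output.

```python
# Illustrative tidal power density calculation: P = 0.5 * rho * |u|^3.
import numpy as np

RHO_SEAWATER = 1025.0  # kg/m^3, typical value

def power_density(u: np.ndarray, v: np.ndarray) -> np.ndarray:
    """Instantaneous tidal power density (W/m^2) from velocity components (m/s)."""
    speed = np.hypot(u, v)
    return 0.5 * RHO_SEAWATER * speed**3

# Depth-averaged velocities over one idealised semidiurnal (M2) tidal cycle.
t = np.linspace(0.0, 12.42 * 3600.0, 200)              # seconds
u = 2.5 * np.sin(2.0 * np.pi * t / (12.42 * 3600.0))   # peak ~2.5 m/s (assumed)
v = np.zeros_like(u)

pd = power_density(u, v)
print(f"peak = {pd.max()/1e3:.1f} kW/m^2, mean = {pd.mean()/1e3:.2f} kW/m^2")
```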

  18. Comparison of the uncertainties of several European low-dose calibration facilities

    NASA Astrophysics Data System (ADS)

    Dombrowski, H.; Cornejo Díaz, N. A.; Toni, M. P.; Mihelic, M.; Röttger, A.

    2018-04-01

    The typical uncertainty of a low-dose-rate calibration of a detector performed in a dedicated secondary national calibration laboratory is investigated, including measurements in the photon fields of metrology institutes. Calibrations at low ambient dose equivalent rates (at the level of the natural ambient radiation) are needed when environmental radiation monitors are to be characterised. The uncertainties of calibration measurements in conventional above-ground irradiation facilities are compared with those obtained in a low-dose-rate irradiation facility located deep underground. Four laboratories quantitatively evaluated the uncertainties of their calibration facilities, in particular for calibrations at low dose rates (250 nSv/h and 1 μSv/h). For the first time, typical uncertainties of European calibration facilities are documented in a comparison and the main sources of uncertainty are identified. All sources of uncertainty are analysed, including the irradiation geometry, scattering, deviations of real spectra from standardised spectra, etc. As a fundamental metrological consequence, no instrument calibrated in such a facility can have a lower total uncertainty in subsequent measurements. For the first time, the need to perform calibrations at very low dose rates (<100 nSv/h) deep underground is underpinned by quantitative data.
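
    A calibration uncertainty budget of the kind described above is typically combined GUM-style: independent standard uncertainties are added in quadrature and expanded with a coverage factor. The sketch below shows the arithmetic; the component values are invented for illustration and do not reproduce the paper's budgets.

```python
# Illustrative GUM-style uncertainty budget for a low-dose-rate calibration.
import math

budget = {  # relative standard uncertainties in %, hypothetical values
    "reference dose rate": 1.5,
    "irradiation geometry / positioning": 0.8,
    "room and environmental scatter": 1.2,
    "spectral deviation from standard spectrum": 1.0,
    "instrument reading (statistics)": 0.9,
}

# Combine independent components in quadrature, then expand with k = 2.
u_combined = math.sqrt(sum(u**2 for u in budget.values()))
U_expanded = 2.0 * u_combined  # coverage factor k = 2 (~95 % coverage)
print(f"combined standard uncertainty: {u_combined:.2f} %")
print(f"expanded uncertainty (k=2):    {U_expanded:.2f} %")
```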

  19. Uncertainty based modeling of rainfall-runoff: Combined differential evolution adaptive Metropolis (DREAM) and K-means clustering

    NASA Astrophysics Data System (ADS)

    Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara

    2015-09-01

    Simulation of the rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large-scale climate signals have proved effective for rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling that considers different sources of uncertainty. The scheme comprises three main steps: rainfall forecasting, rainfall-runoff simulation, and future runoff prediction. In the first step, data-driven models are developed and used to forecast rainfall using large-scale climate signals as predictors. Because different sources of uncertainty strongly affect the output of hydrologic models, in the second step the uncertainty associated with input data, model parameters and model structure is incorporated into rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed to account for model conceptual (structural) uncertainty in real-time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by the alternative rainfall-runoff models are combined through a weighting method based on K-means clustering. Model parameter and input uncertainty are investigated using an adaptive Markov chain Monte Carlo method. Finally, the calibrated rainfall-runoff models are driven with the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is applied to the case study of the Bronx River watershed, New York City. Results of the uncertainty analysis reveal that simultaneous estimation of model parameter and input uncertainty significantly changes the probability distribution of the model parameters. It is also observed that combining the outputs of the hydrological models with the proposed clustering scheme improves the accuracy of runoff simulation in the watershed by up to 50% compared with the simulations of the individual models. The results indicate that the developed methodology not only provides reliable tools for rainfall and runoff modeling, but also allows adequate lead time for incorporating mitigation measures when dealing with potentially extreme runoff events and flood hazard. The results of this study can be used to identify the main factors affecting flood hazard analysis.
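
    One plausible reading of the K-means-based combination is sketched below: time steps are clustered on the ensemble of simulated flows, and within each cluster the models are weighted by inverse mean-squared error against observations. The abstract does not give the paper's exact weighting rule, so this is an assumed variant with synthetic data.

```python
# Sketch of a K-means-based combination of ensemble streamflow simulations
# (assumed weighting rule; data are synthetic, not the Bronx River case study).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 5.0, size=300)                      # synthetic observed flow
# Three competing models = observations plus model-specific noise levels.
sims = obs[None, :] + rng.normal(0, [[1.0], [2.0], [3.0]], size=(3, 300))

# Cluster time steps by the vector of simulated flows across models.
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(sims.T)
combined = np.empty_like(obs)
for c in range(km.n_clusters):
    idx = km.labels_ == c
    mse = ((sims[:, idx] - obs[idx])**2).mean(axis=1)    # per-model error in cluster
    w = (1.0 / mse) / (1.0 / mse).sum()                  # inverse-MSE weights
    combined[idx] = w @ sims[:, idx]

rmse = lambda q: np.sqrt(((q - obs)**2).mean())
print("per-model RMSE:", [f"{rmse(s):.2f}" for s in sims])
print(f"combined RMSE: {rmse(combined):.2f}")
```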

  20. Impacts of uncertainties in weather and streamflow observations in calibration and evaluation of an elevation distributed HBV-model

    NASA Astrophysics Data System (ADS)

    Engeland, K.; Steinsland, I.; Petersen-Øverleir, A.; Johansen, S.

    2012-04-01

    The aim of this study is to assess the uncertainties in streamflow simulations when uncertainties in both the observed inputs (precipitation and temperature) and the streamflow observations used to calibrate the hydrological model are explicitly accounted for. To achieve this goal we applied the elevation-distributed HBV model, operating on daily time steps, to a small high-elevation catchment in southern Norway where the seasonal snow cover is important. The uncertainties in precipitation inputs were quantified using conditional simulation. This procedure accounts for the uncertainty related to the density of the precipitation network, but neglects uncertainties related to measurement bias/errors and possible elevation gradients in precipitation. The uncertainties in temperature inputs were quantified using a Bayesian temperature interpolation procedure in which the temperature lapse rate is re-estimated every day. The uncertainty in the lapse rate was accounted for, whereas the sampling uncertainty related to network density was neglected. For every day, a random sample of precipitation and temperature inputs was drawn and applied as input to the hydrologic model. The uncertainties in observed streamflow were assessed from the uncertainties in the rating-curve model. A Bayesian procedure was applied to estimate the probabilities of rating-curve models with one to three segments and the uncertainties in their parameters. This method neglects uncertainties related to errors in the observed water levels. Note that one rating curve was drawn to generate one realisation of a whole streamflow time series, so rating-curve errors lead to a systematic bias in the streamflow observations. All these uncertainty sources were linked together in both calibration and evaluation of the hydrologic model using a DREAM-based MCMC routine. The effects of having less information (e.g. missing one streamflow measurement for defining the rating curve, or missing one precipitation station) were also investigated.
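
    The sketch below illustrates how rating-curve parameter uncertainty propagates into streamflow realisations, assuming a single-segment power-law curve Q = a(h - h0)^b. The paper's full Bayesian multi-segment estimation is not reproduced; parameter uncertainty is mimicked by sampling (a, b) around nominal values, and, as noted above, one (a, b) draw per realisation makes the error act as a systematic bias within each series.

```python
# Sketch of rating-curve uncertainty propagation (hypothetical parameters).
import numpy as np

rng = np.random.default_rng(42)
stage = np.array([0.6, 0.9, 1.3, 1.8, 2.4])   # observed water levels (m)
h0 = 0.2                                       # cease-to-flow level (m), assumed known

# One (a, b) draw per realisation -> one coherent streamflow series.
n_real = 1000
a = rng.normal(12.0, 0.8, size=n_real)         # hypothetical posterior spread
b = rng.normal(1.7, 0.05, size=n_real)
q = a[:, None] * (stage[None, :] - h0) ** b[:, None]   # shape (n_real, n_stage)

q50 = np.percentile(q, 50, axis=0)
q05, q95 = np.percentile(q, 5, axis=0), np.percentile(q, 95, axis=0)
for h, lo, med, hi in zip(stage, q05, q50, q95):
    print(f"h = {h:.1f} m: Q = {med:6.1f} m3/s  (90% band {lo:6.1f}-{hi:6.1f})")
```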
