Sample records for uncertainty assessment volume

  1. Preliminary performance assessment for the Waste Isolation Pilot Plant, December 1992. Volume 5, Uncertainty and sensitivity analyses of gas and brine migration for undisturbed performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-08-01

    Before disposing of transuranic radioactive waste in the Waste Isolation Pilot Plant (WIPP), the United States Department of Energy (DOE) must evaluate compliance with applicable long-term regulations of the United States Environmental Protection Agency (EPA). Sandia National Laboratories is conducting iterative performance assessments (PAs) of the WIPP for the DOE to provide interim guidance while preparing for a final compliance evaluation. This volume of the 1992 PA contains results of uncertainty and sensitivity analyses with respect to migration of gas and brine from the undisturbed repository. Additional information about the 1992 PA is provided in other volumes. Volume 1 contains an overview of WIPP PA and results of a preliminary comparison with 40 CFR 191, Subpart B. Volume 2 describes the technical basis for the performance assessment, including descriptions of the linked computational models used in the Monte Carlo analyses. Volume 3 contains the reference data base and values for input parameters used in consequence and probability modeling. Volume 4 contains uncertainty and sensitivity analyses with respect to the EPA's Environmental Standards for the Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes (40 CFR 191, Subpart B). Finally, guidance derived from the entire 1992 PA is presented in Volume 6. Results of the 1992 uncertainty and sensitivity analyses indicate that, conditional on the modeling assumptions and the assigned parameter-value distributions, the most important parameters for which uncertainty has the potential to affect gas and brine migration from the undisturbed repository are: initial liquid saturation in the waste, anhydrite permeability, biodegradation-reaction stoichiometry, gas-generation rates for both corrosion and biodegradation under inundated conditions, and the permeability of the long-term shaft seal.
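
    The kind of sampling-based sensitivity analysis summarized above can be illustrated with a minimal sketch: draw a Latin hypercube sample over a few uncertain inputs, evaluate a scalar response, and rank the inputs by rank correlation with that response. The parameter names, ranges, and stand-in response model below are invented placeholders, not the distributions or linked process models of the 1992 PA.

    ```python
    # Minimal sketch of sampling-based sensitivity analysis: Latin hypercube
    # sampling plus Spearman rank-correlation screening. All parameter names,
    # ranges, and the response model are hypothetical.
    import numpy as np
    from scipy.stats import qmc, spearmanr

    rng = np.random.default_rng(42)
    params = {
        "initial_liquid_saturation": (0.0, 0.3),         # illustrative range
        "log10_anhydrite_permeability": (-21.0, -17.0),  # illustrative range
        "corrosion_gas_rate": (0.0, 1e-8),               # illustrative range
    }
    sampler = qmc.LatinHypercube(d=len(params), seed=42)
    unit = sampler.random(n=100)
    lows = [lo for lo, hi in params.values()]
    highs = [hi for lo, hi in params.values()]
    X = qmc.scale(unit, lows, highs)

    def stand_in_model(x):
        # Placeholder for the linked process models: any scalar response works.
        return 0.8 * x[0] + 0.05 * 10 ** (x[1] + 19.0) + 2e7 * x[2]

    y = np.array([stand_in_model(row) for row in X]) + rng.normal(0, 0.01, 100)

    # Rank inputs by |Spearman rho|, a common importance measure in this setting.
    for name, col in zip(params, X.T):
        rho, _ = spearmanr(col, y)
        print(f"{name:30s} rho = {rho:+.2f}")
    ```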

  2. A general method for assessing the effects of uncertainty in individual-tree volume model predictions on large-area volume estimates with a subtropical forest illustration

    Treesearch

    Ronald E. McRoberts; Paolo Moser; Laio Zimermann Oliveira; Alexander C. Vibrans

    2015-01-01

    Forest inventory estimates of tree volume for large areas are typically calculated by adding the model predictions of volumes for individual trees at the plot level, calculating the mean over plots, and expressing the result on a per unit area basis. The uncertainty in the model predictions is generally ignored, with the result that the precision of the large-area...
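
    A minimal sketch of the effect the abstract describes, under assumed numbers: perturbing each tree-level volume prediction with a hypothetical 20% relative model error and recomputing the large-area mean many times reveals the uncertainty that is ignored when model predictions are treated as exact. The plot data are simulated, not the paper's subtropical inventory.

    ```python
    # Sketch: propagating individual-tree volume-model prediction error to a
    # large-area mean by Monte Carlo. Data and error structure are simulated.
    import numpy as np

    rng = np.random.default_rng(0)
    n_plots, trees_per_plot = 200, 25
    pred = rng.lognormal(mean=-1.0, sigma=0.5, size=(n_plots, trees_per_plot))  # m^3

    # Conventional estimate: sum tree predictions per plot, average over plots.
    naive_mean = pred.sum(axis=1).mean()

    # Monte Carlo: apply an assumed 20% relative model error to every tree
    # prediction, then recompute the large-area mean in each replicate.
    reps = 2000
    means = np.empty(reps)
    for r in range(reps):
        perturbed = pred * (1 + 0.20 * rng.standard_normal(pred.shape))
        means[r] = perturbed.sum(axis=1).mean()

    print(f"plot-level volume estimate: {naive_mean:.2f} m^3/plot")
    print(f"std. error from model uncertainty alone: {means.std(ddof=1):.2f} m^3/plot")
    ```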

  3. Spatial resolution and measurement uncertainty of strains in bone and bone-cement interface using digital volume correlation.

    PubMed

    Zhu, Ming-Liang; Zhang, Qing-Hang; Lupton, Colin; Tong, Jie

    2016-04-01

    The measurement uncertainty of strains has been assessed in a bone analogue (sawbone), bovine trabecular bone and bone-cement interface specimens under zero load using the Digital Volume Correlation (DVC) method. The effects of sub-volume size, sample constraint and preload on the measured strain uncertainty have been examined. There is generally a trade-off between the measurement uncertainty and the spatial resolution. Suitable sub-volume sizes have been selected based on a compromise between the measurement uncertainty and the spatial resolution for the cases considered. A ratio of sub-volume size to a microstructure characteristic (Tb.Sp) was introduced to reflect a suitable spatial resolution, and the associated measurement uncertainty was assessed. Specifically, ratios between 1.6 and 4 appear to give rise to standard deviations in the measured strains between 166 and 620 με in all the cases considered, which would seem to suffice for strain analysis in pre- as well as post-yield loading regimes. A microscale finite element (μFE) model was built from the CT images of the sawbone, and the results from the μFE model and a continuum FE model were compared with those from the DVC. The strain results were found to differ significantly between the two methods at the tissue level, consistent in trend with results found in human bones, indicating mainly a limitation of the current DVC method in mapping strains at this level.
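
    A toy version of the zero-load uncertainty assessment: strains measured on an unloaded specimen should be zero, so their standard deviation estimates the measurement uncertainty at each sub-volume size. The displacement-noise model (noise amplitude shrinking with sub-volume size) and all magnitudes below are assumptions for illustration, not the paper's measurements.

    ```python
    # Sketch: zero-load strain uncertainty versus DVC sub-volume size.
    # The displacement-noise magnitudes are invented; only the trade-off
    # (smaller sub-volumes -> noisier strains) is the point.
    import numpy as np

    rng = np.random.default_rng(2)
    tb_sp_um = 600.0                            # assumed Tb.Sp, micrometres

    for subvol_um in (960, 1440, 1920, 2400):   # candidate sub-volume sizes, um
        disp_noise_um = 0.5 * (960.0 / subvol_um)   # assumed noise model
        n = 500                                 # sub-volumes in the region
        # Strain ~ displacement difference over the gauge length (sub-volume).
        strains = (rng.normal(0, disp_noise_um, n)
                   - rng.normal(0, disp_noise_um, n)) / subvol_um
        print(f"sub-volume {subvol_um:4d} um (ratio {subvol_um / tb_sp_um:.1f} x Tb.Sp): "
              f"strain SD = {1e6 * strains.std(ddof=1):.0f} microstrain")
    ```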

  4. Geostatistical approach for assessing soil volumes requiring remediation: validation using lead-polluted soils underlying a former smelting works.

    PubMed

    Demougeot-Renard, Helene; De Fouquet, Chantal

    2004-10-01

    Assessing the volume of soil requiring remediation and the accuracy of this assessment constitutes an essential step in polluted site management. If this remediation volume is not properly assessed, misclassification may lead both to environmental risks (polluted soils may not be remediated) and financial risks (unexpected discovery of polluted soils may generate additional remediation costs). To minimize such risks, this paper proposes a geostatistical methodology based on stochastic simulations that allows the remediation volume and its uncertainty to be assessed using investigation data. The methodology thoroughly reproduces the conditions in which the soils are classified and extracted at the remediation stage. The validity of the approach is tested by applying it to the data collected during the investigation phase of a former lead smelting works and by comparing the results with the volume that was actually remediated. This real remediated volume was composed of all the remediation units that were classified as polluted after systematic sampling and analysis during the clean-up stage. The volume estimated from the 75 samples collected during site investigation slightly overestimates (5.3% relative error) the remediated volume deduced from 212 remediation units. Furthermore, the real volume falls within the range of uncertainty predicted using the proposed methodology.
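
    The volume-with-uncertainty idea can be sketched as follows: generate many equally probable realizations of the contamination field, classify each cell against the remediation threshold, and read one remediation volume off each realization; the spread of those volumes is the uncertainty. A real application would use conditional geostatistical simulation honoring the investigation samples; here each realization is a placeholder random field with invented parameters.

    ```python
    # Sketch: a remediation-volume distribution from stochastic simulations.
    # Each "realization" below is an unconditional placeholder field; real use
    # would rely on conditional simulation (e.g., sequential Gaussian simulation).
    import numpy as np

    rng = np.random.default_rng(7)
    threshold = 300.0      # hypothetical Pb threshold, mg/kg
    cell_volume = 2.0      # hypothetical volume per remediation unit, m^3
    n_real, n_cells = 500, 10_000

    volumes = np.empty(n_real)
    for i in range(n_real):
        field = rng.lognormal(mean=5.0, sigma=0.8, size=n_cells)  # simulated Pb
        volumes[i] = cell_volume * np.count_nonzero(field > threshold)

    lo, med, hi = np.percentile(volumes, [5, 50, 95])
    print(f"remediation volume: median {med:.0f} m^3, "
          f"90% interval [{lo:.0f}, {hi:.0f}] m^3")
    ```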

  5. Assessment of Uncertainty-Based Screening Volumes for NASA Robotic LEO and GEO Conjunction Risk Assessment

    NASA Technical Reports Server (NTRS)

    Narvet, Steven W.; Frigm, Ryan C.; Hejduk, Matthew D.

    2011-01-01

    Conjunction Assessment operations require screening assets against the space object catalog by placing a pre-determined spatial volume around each asset and predicting when another object will violate that volume. The selection of the screening volume used for each spacecraft is a trade-off between observing all conjunction events that may pose a potential risk to the primary spacecraft and the ability to analyze those predicted events. If the screening volumes are larger, then more conjunctions can be observed and therefore the probability of a missed detection of a high-risk conjunction event is small; however, the amount of data which needs to be analyzed increases. This paper characterizes the sensitivity of screening volume size to capturing typical orbit uncertainties and the expected number of conjunction events observed. These sensitivities are quantified in the form of a trade space that allows for selection of appropriate screening volumes to fit the desired concept of operations, system limitations, and tolerable analyst workloads. This analysis will specifically highlight the screening volume determination and selection process for use in the NASA Conjunction Assessment Risk Analysis process but will also provide a general framework for other Owner/Operators faced with similar decisions.
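
    In one dimension the trade-off reduces to a short calculation, sketched below with assumed numbers: a larger screening half-width lowers the probability that a genuine close approach escapes the screening volume, but the analyst workload grows roughly in proportion. The position-uncertainty sigma and the event rate are illustrative, not values from the NASA study.

    ```python
    # Sketch: missed-detection probability versus analyst workload for candidate
    # screening half-widths, assuming Gaussian in-track position uncertainty.
    from scipy.stats import norm

    sigma = 2.0          # km, assumed 1-sigma position uncertainty
    events_per_km = 5.0  # assumed conjunctions observed per km of half-width

    for half_width in (2.0, 5.0, 10.0, 25.0):     # candidate half-widths, km
        p_miss = 2 * norm.sf(half_width / sigma)  # approach outside the volume
        workload = events_per_km * half_width     # events requiring analysis
        print(f"half-width {half_width:5.1f} km: P(missed) = {p_miss:.2e}, "
              f"~{workload:.0f} events to analyze")
    ```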

  6. Assessment of grassland ecosystem conditions in the Southwestern United States. Vol. 1

    Treesearch

    Deborah M. Finch

    2004-01-01

    This report is volume 1 of a two-volume ecological assessment of grassland ecosystems in the Southwestern United States. Broadscale assessments are syntheses of current scientific knowledge, including a description of uncertainties and assumptions, to provide a characterization and comprehensive description of ecological, social, and economic components within an...

  7. Assessment of grassland ecosystem conditions in the Southwestern United States: Wildlife and fish. Vol. 2

    Treesearch

    Deborah M. Finch

    2005-01-01

    This report is volume 2 of a two-volume ecological assessment of grassland ecosystems in the Southwestern United States. Broad-scale assessments are syntheses of current scientific knowledge, including a description of uncertainties and assumptions, to provide a characterization and comprehensive description of ecological, social, and economic components within an...

  8. Uncertainty Assessment of Synthetic Design Hydrographs for Gauged and Ungauged Catchments

    NASA Astrophysics Data System (ADS)

    Brunner, Manuela I.; Sikorska, Anna E.; Furrer, Reinhard; Favre, Anne-Catherine

    2018-03-01

    Design hydrographs described by peak discharge, hydrograph volume, and hydrograph shape are essential for engineering tasks involving storage. Such design hydrographs are inherently uncertain, as are classical flood estimates focusing on peak discharge only. Various sources of uncertainty contribute to the total uncertainty of synthetic design hydrographs for gauged and ungauged catchments. These comprise model uncertainties, sampling uncertainty, and uncertainty due to the choice of a regionalization method. A quantification of the uncertainties associated with flood estimates is essential for reliable decision making and allows for the identification of important uncertainty sources. We therefore propose an uncertainty assessment framework for the quantification of the uncertainty associated with synthetic design hydrographs. The framework is based on bootstrap simulations and consists of three levels of complexity. On the first level, we assess the uncertainty due to individual uncertainty sources. On the second level, we quantify the total uncertainty of design hydrographs for gauged catchments and the total uncertainty of regionalizing them to ungauged catchments, but independently of the construction uncertainty. On the third level, we assess the coupled uncertainty of synthetic design hydrographs in ungauged catchments, jointly considering construction and regionalization uncertainty. We find that the most important sources of uncertainty in design hydrograph construction are the record length and the choice of the flood sampling strategy. The total uncertainty of design hydrographs in ungauged catchments depends on the catchment properties and is not negligible in our case.
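
    Level one of such a framework, sampling uncertainty, can be sketched with a plain bootstrap: resample the annual-maximum record with replacement, refit the flood-frequency distribution, and recompute the design quantile. The synthetic 40-year record and the GEV distribution below are assumptions for illustration, not the Swiss data or the model choices of the paper.

    ```python
    # Sketch: bootstrap sampling uncertainty of a 100-year design peak.
    # Record and distribution choice (GEV) are illustrative assumptions.
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(1)
    record = genextreme.rvs(c=-0.1, loc=100, scale=30, size=40, random_state=rng)

    def q100(sample):
        # Fit a GEV and return the 100-year quantile (exceedance prob. 1/100).
        c, loc, scale = genextreme.fit(sample)
        return genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)

    boot = np.array([q100(rng.choice(record, size=record.size, replace=True))
                     for _ in range(500)])
    lo, hi = np.percentile(boot, [5, 95])
    print(f"Q100 = {q100(record):.0f}, 90% bootstrap interval [{lo:.0f}, {hi:.0f}]")
    ```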

  9. Mercury study report to Congress. Volume 5. Health effects of mercury and mercury compounds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hassett-Sipple, B.; Swartout, J.; Schoeny, R.

    1997-12-01

    This volume summarizes the available information on human health effects and animal data for hazard identification and dose-response assessment for three forms of mercury: elemental mercury, mercury chloride (inorganic mercury), and methylmercury (organic mercury). Effects are summarized by endpoint. The risk assessment evaluates carcinogenicity, mutagenicity, developmental toxicity and general systemic toxicity of these chemical species of mercury. Toxicokinetics (absorption, distribution, metabolism and excretion) are described for each of the three mercury species. Reference doses are calculated for inorganic and methylmercury; a reference concentration for inhaled elemental mercury is provided. A quantitative analysis of factors contributing to variability and uncertainty in the methylmercury RfD is provided in an appendix. Interactions and sensitive populations are described. The draft volume assesses ongoing research and research needs to reduce uncertainty surrounding adverse human health consequences of methylmercury exposure.

  10. Factors controlling volume errors through 2D gully erosion assessment: guidelines for optimal survey design

    NASA Astrophysics Data System (ADS)

    Castillo, Carlos; Pérez, Rafael

    2017-04-01

    The assessment of gully erosion volumes is essential for the quantification of soil losses derived from this relevant degradation process. Traditionally, 2D and 3D approaches have been applied for this purpose (Casalí et al., 2006). Although innovative 3D approaches have recently been proposed for gully volume quantification, a renewed interest can be found in the literature regarding the useful information that cross-section analysis still provides in gully erosion research. Moreover, the application of methods based on 2D approaches can be the most cost-effective approach in many situations, such as preliminary studies with low accuracy requirements or surveys under time or budget constraints. The main aim of this work is to examine the key factors controlling volume error variability in 2D gully assessment by means of a stochastic experiment involving a Monte Carlo analysis over synthetic gully profiles, in order to (1) contribute to a better understanding of the drivers and magnitude of uncertainty in 2D gully-erosion surveys and (2) provide guidelines for optimal survey designs. Owing to the stochastic properties of error generation in 2D volume assessment, a statistical approach was followed to generate a large and significant set of gully reach configurations to evaluate quantitatively the influence of the main factors controlling the uncertainty of the volume assessment. For this purpose, a simulation algorithm in Matlab® code was written, involving the following stages: (1) generation of synthetic gully area profiles with different degrees of complexity (characterized by the cross-section variability); (2) simulation of field measurements characterised by a survey intensity and the precision of the measurement method; and (3) quantification of the volume error uncertainty as a function of the key factors. In this communication we will present the relationships between volume error and the studied factors and propose guidelines for 2D field surveys based on the minimal survey densities required to achieve a certain accuracy, given the cross-sectional variability of a gully and the measurement method applied. Reference: Casalí, J., Loizu, J., Campo, M.A., De Santisteban, L.M., Álvarez-Mozos, J., 2006. Accuracy of methods for field assessment of rill and ephemeral gully erosion. Catena 67, 128-138. doi:10.1016/j.catena.2006.03.005
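
    A miniature version of the Monte Carlo experiment described above, with all magnitudes invented: a synthetic cross-sectional-area profile stands in for a gully reach, and the 2D volume estimate obtained from cross-sections at a given spacing is compared against the true volume for several survey intensities.

    ```python
    # Sketch: 2D gully-volume error as a function of survey spacing on one
    # synthetic area profile. Profile shape and noise levels are invented.
    import numpy as np

    rng = np.random.default_rng(3)
    dx = 0.1                                    # dense profile step, m
    x = np.arange(0.0, 200.0 + dx, dx)          # reach coordinate, m
    area = 2.0 + 0.8 * np.sin(x / 7.0) + 0.3 * rng.standard_normal(x.size)
    area = np.clip(area, 0.1, None)             # cross-sectional area, m^2
    true_volume = np.trapezoid(area, x)         # np.trapz on NumPy < 2.0

    for spacing in (5, 10, 25, 50):             # survey intensity, m
        idx = np.arange(0, x.size, round(spacing / dx))
        est = np.trapezoid(area[idx], x[idx])   # 2D method: integrate sampled areas
        err = 100 * (est - true_volume) / true_volume
        print(f"spacing {spacing:3d} m: volume error {err:+.1f}%")
    ```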

  11. Fukushima Daiichi Unit 1 Uncertainty Analysis-Exploration of Core Melt Progression Uncertain Parameters-Volume II.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denman, Matthew R.; Brooks, Dusty Marie

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. Volume I of the 1F1 UA discusses the physical modeling details and time history results of the UA. Volume II of the 1F1 UA discusses the statistical viewpoint. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). The goal of this work was to perform a focused evaluation of uncertainty in core damage progression behavior and its effect on key figures of merit (e.g., hydrogen production, fraction of intact fuel, vessel lower head failure) and in doing so assess the applicability of traditional sensitivity analysis techniques.

  12. County-Level Climate Uncertainty for Risk Assessments: Volume 14 Appendix M - Historical Surface Runoff.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  13. County-Level Climate Uncertainty for Risk Assessments: Volume 15 Appendix N - Forecast Surface Runoff.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-05-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  14. County-Level Climate Uncertainty for Risk Assessments: Volume 10 Appendix I - Historical Evaporation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  15. County-Level Climate Uncertainty for Risk Assessments: Volume 8 Appendix G - Historical Precipitation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  16. County-Level Climate Uncertainty for Risk Assessments: Volume 12 Appendix K - Historical Rel. Humidity.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  17. County-Level Climate Uncertainty for Risk Assessments: Volume 23 Appendix V - Forecast Sea Ice Thickness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-04-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  18. County-Level Climate Uncertainty for Risk Assessments: Volume 24 Appendix W - Historical Sea Ice Age.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M

    2017-05-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  19. County-Level Climate Uncertainty for Risk Assessments: Volume 22 Appendix U - Historical Sea Ice Thickness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  20. County-Level Climate Uncertainty for Risk Assessments: Volume 21 Appendix T - Forecast Sea Ice Area Fraction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  21. County-Level Climate Uncertainty for Risk Assessments: Volume 25 Appendix X - Forecast Sea Ice Age.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-05-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  22. County-Level Climate Uncertainty for Risk Assessments: Volume 27 Appendix Z - Forecast Ridging Rate.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  23. County-Level Climate Uncertainty for Risk Assessments: Volume 18 Appendix Q - Historical Maximum Near-Surface Wind Speed.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  24. County-Level Climate Uncertainty for Risk Assessments: Volume 17 Appendix P - Forecast Soil Moisture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  25. County-Level Climate Uncertainty for Risk Assessments: Volume 16 Appendix O - Historical Soil Moisture.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  26. County-Level Climate Uncertainty for Risk Assessments: Volume 6 Appendix E - Historical Minimum Near-Surface Air Temperature.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  27. County-Level Climate Uncertainty for Risk Assessments: Volume 26 Appendix Y - Historical Ridging Rate.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-05-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  28. County-Level Climate Uncertainty for Risk Assessments: Volume 4 Appendix C - Historical Maximum Near-Surface Air Temperature.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  29. County-Level Climate Uncertainty for Risk Assessments: Volume 2 Appendix A - Historical Near-Surface Air Temperature.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  30. County-Level Climate Uncertainty for Risk Assessments: Volume 1.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  31. SU-E-T-493: Analysis of the Impact of Range and Setup Uncertainties On the Dose to Brain Stem and Whole Brain in the Passively Scattered Proton Therapy Plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sahoo, N; Zhu, X; Zhang, X

    Purpose: To quantify the impact of range and setup uncertainties on various dosimetric indices that are used to assess normal tissue toxicities of patients receiving passive scattering proton beam therapy (PSPBT). Methods: Robustness analysis of sample treatment plans of six brain cancer patients treated with PSPBT at our facility, for whom the maximum brain stem dose exceeded 5800 CcGE, was performed. The DVH of each plan was calculated in an Eclipse treatment planning system (TPS) version 11 applying ±3.5% range uncertainty and ±3 mm shift of the isocenter in x, y and z directions to account for setup uncertainties. Worst-case dose indices for brain stem and whole brain were compared to their values in the nominal plan to determine the average change in their values. For the brain stem, maximum dose to 1 cc of volume, dose to 10%, 50%, 90% of volume (D10, D50, D90) and volume receiving 6000, 5400, 5000, 4500, 4000 CcGE (V60, V54, V50, V45, V40) were evaluated. For the whole brain, maximum dose to 1 cc of volume, and volume receiving 5400, 5000, 4500, 4000, 3000 CcGE (V54, V50, V45, V40 and V30) were assessed. Results: The average changes in the values of these indices in the worst-case scenarios from the nominal plan were as follows. Brain stem: maximum dose to 1 cc of volume: 1.1%, D10: 1.4%, D50: 8.0%, D90: 73.3%, V60: 116.9%, V54: 27.7%, V50: 21.2%, V45: 16.2%, V40: 13.6%. Whole brain: maximum dose to 1 cc of volume: 0.3%, V54: 11.4%, V50: 13.0%, V45: 13.6%, V40: 14.1%, V30: 13.5%. Conclusion: Large to modest changes in the dosimetric indices for brain stem and whole brain compared to the nominal plan due to range and setup uncertainties were observed. Such potential changes should be taken into account while using any dosimetric parameters for outcome evaluation of patients receiving proton therapy.
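
    The indices in this abstract are simple functionals of a structure's dose array, and the robustness evaluation is a worst case over scenario plans. The sketch below uses random arrays as stand-ins for the per-scenario brain-stem doses exported from a TPS; the voxel volume and dose levels are placeholders.

    ```python
    # Sketch: DVH indices (Dx%, Vdose) and a worst case over uncertainty
    # scenarios. Dose arrays here are random placeholders, not TPS output.
    import numpy as np

    rng = np.random.default_rng(5)
    voxel_cc = 0.027  # hypothetical voxel volume, cc

    def d_percent(dose, pct):
        """Minimum dose received by the hottest pct% of the structure."""
        return np.percentile(dose, 100 - pct)

    def v_dose(dose, level, voxel_volume):
        """Absolute volume (cc) receiving at least `level`."""
        return np.count_nonzero(dose >= level) * voxel_volume

    # Nominal plan plus range/setup scenarios (e.g., +/-3.5% range, +/-3 mm).
    scenarios = [rng.normal(4000, 900, size=40_000) for _ in range(9)]
    worst_d10 = max(d_percent(d, 10) for d in scenarios)
    worst_v54 = max(v_dose(d, 5400, voxel_cc) for d in scenarios)
    print(f"worst-case D10 = {worst_d10:.0f}, worst-case V54 = {worst_v54:.1f} cc")
    ```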

  32. TARGET ORGAN TOXICITY IN MARINE AND FRESHWATER TELEOSTS: VOLUME 1 - ORGANS

    EPA Science Inventory

    In any given aquatic ecosystem, fish serve a multitude of critical functions and so, are typically included in the risk assessment of various chemicals in waterways. However, uncertainties in toxicity evaluation can arise since these assessments are usually based solely on acute ...

  33. TARGET ORGAN TOXICITY IN MARINE AND FRESHWATER TELEOSTS: VOLUME 2 - SYSTEMS

    EPA Science Inventory

    In any given aquatic ecosystem, fish serve a multitude of critical functions and so, are typically included in the risk assessment of various chemicals in waterways. However, uncertainties in toxicity evaluation can arise since these assessments are usually based solely on acute ...

  34. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  35. Evaluation of pollutant loads from stormwater BMPs to receiving water using load frequency curves with uncertainty analysis.

    PubMed

    Park, Daeryong; Roesner, Larry A

    2012-12-15

    This study examined pollutant loads released to receiving water from a typical urban watershed in the Los Angeles (LA) Basin of California by applying a best management practice (BMP) performance model that includes uncertainty. This BMP performance model uses the k-C model and incorporates uncertainty analysis and the first-order second-moment (FOSM) method to assess the effectiveness of BMPs for removing stormwater pollutants. Uncertainties were considered for the influent event mean concentration (EMC) and the areal removal rate constant of the k-C model. The Storage, Treatment, Overflow, Runoff Model (STORM) was used to simulate the flow volume from the watershed, the bypass flow volume, and the flow volume that passes through the BMP. Detention basins and total suspended solids (TSS) were chosen as the representative stormwater BMP and pollutant, respectively. This paper applies load frequency curves (LFCs), which replace the exceedance percentage with an exceedance frequency, as an alternative to load duration curves (LDCs) for evaluating the effectiveness of BMPs. An evaluation method based on uncertainty analysis is suggested because it applies a water quality standard exceedance based on frequency and magnitude. As a result, the incorporation of uncertainty in the estimates of pollutant loads can assist stormwater managers in determining the degree of total maximum daily load (TMDL) compliance that could be expected from a given BMP in a watershed.
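
    The FOSM step can be sketched directly: propagate the variances of the two uncertain inputs (influent EMC and the areal removal rate constant) through a k-C*-type removal relation using analytic first derivatives. All parameter values below are illustrative, not the LA Basin calibration.

    ```python
    # Sketch: first-order second-moment (FOSM) propagation through a k-C*-type
    # BMP removal relation. Values are illustrative assumptions.
    import numpy as np

    Q, C_STAR = 30.0, 12.0     # assumed loading rate (m/yr), background (mg/L)

    def effluent_emc(c_in, k, q=Q, c_star=C_STAR):
        """Effluent TSS concentration (mg/L) from a k-C*-type relation."""
        return c_star + (c_in - c_star) * np.exp(-k / q)

    c_in, k = 150.0, 50.0      # means: influent EMC (mg/L), rate constant (m/yr)
    var_c_in, var_k = 40.0**2, 15.0**2

    # FOSM: Var(f) ~ (df/dc_in)^2 Var(c_in) + (df/dk)^2 Var(k), with the
    # derivatives evaluated analytically at the input means.
    df_dcin = np.exp(-k / Q)
    df_dk = -(c_in - C_STAR) / Q * np.exp(-k / Q)
    var_out = df_dcin**2 * var_c_in + df_dk**2 * var_k

    print(f"effluent EMC: {effluent_emc(c_in, k):.1f} "
          f"+/- {np.sqrt(var_out):.1f} mg/L (1 sigma)")
    ```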

  36. Preliminary risk assessment for nuclear waste disposal in space, volume 1

    NASA Technical Reports Server (NTRS)

    Rice, E. E.; Denning, R. S.; Friedlander, A. L.

    1982-01-01

    The feasibility, desirability and preferred approaches for disposal of selected high-level nuclear wastes in space were analyzed. Preliminary space disposal risk estimates and estimates of risk uncertainty are provided.

  37. Introduction to “Global tsunami science: Past and future, Volume I”

    USGS Publications Warehouse

    Geist, Eric L.; Fritz, Hermann; Rabinovich, Alexander B.; Tanioka, Yuichiro

    2016-01-01

    Twenty-five papers on the study of tsunamis are included in Volume I of the PAGEOPH topical issue “Global Tsunami Science: Past and Future”. Six papers examine various aspects of tsunami probability and uncertainty analysis related to hazard assessment. Three papers relate to deterministic hazard and risk assessment. Five more papers present new methods for tsunami warning and detection. Six papers describe new methods for modeling tsunami hydrodynamics. Two papers investigate tsunamis generated by non-seismic sources: landslides and meteorological disturbances. The final three papers describe important case studies of recent and historical events. Collectively, this volume highlights contemporary trends in global tsunami research, both fundamental and applied toward hazard assessment and mitigation.

  38. Introduction to "Global Tsunami Science: Past and Future, Volume I"

    NASA Astrophysics Data System (ADS)

    Geist, Eric L.; Fritz, Hermann M.; Rabinovich, Alexander B.; Tanioka, Yuichiro

    2016-12-01

    Twenty-five papers on the study of tsunamis are included in Volume I of the PAGEOPH topical issue "Global Tsunami Science: Past and Future". Six papers examine various aspects of tsunami probability and uncertainty analysis related to hazard assessment. Three papers relate to deterministic hazard and risk assessment. Five more papers present new methods for tsunami warning and detection. Six papers describe new methods for modeling tsunami hydrodynamics. Two papers investigate tsunamis generated by non-seismic sources: landslides and meteorological disturbances. The final three papers describe important case studies of recent and historical events. Collectively, this volume highlights contemporary trends in global tsunami research, both fundamental and applied toward hazard assessment and mitigation.

  39. Risk analysis under uncertainty, the precautionary principle, and the new EU chemicals strategy.

    PubMed

    Rogers, Michael D

    2003-06-01

    Three categories of uncertainty in relation to risk assessment are defined: uncertainty in effect, uncertainty in cause, and uncertainty in the relationship between a hypothesised cause and effect. The Precautionary Principle (PP) relates to the third type of uncertainty. Three broad descriptions of the PP are set out: uncertainty justifies action, uncertainty requires action, and uncertainty requires a reversal of the burden of proof for risk assessments. The application of the PP is controversial, but what matters in practice is the precautionary action (PA) that follows. The criteria by which the PAs should be judged are detailed. This framework for risk assessment and management under uncertainty is then applied to the envisaged European system for the regulation of chemicals. A new EU regulatory system has been proposed which shifts the burden of proof concerning risk assessments from the regulator to the producer, and embodies the PP in all three of its main regulatory stages. The proposals are critically discussed in relation to three chemicals, namely, atrazine (an endocrine disrupter), cadmium (toxic and possibly carcinogenic), and hydrogen fluoride (a toxic, high-production-volume chemical). Reversing the burden of proof will speed up the regulatory process, but the examples demonstrate that applying the PP appropriately, and balancing the countervailing risks and the socio-economic benefits, will continue to be a difficult task for the regulator. The paper concludes with a discussion of the role of precaution in the management of change and of the importance of trust in the effective regulation of uncertain risks.

  40. Radiologist Uncertainty and the Interpretation of Screening Mammography

    PubMed Central

    Carney, Patricia A.; Elmore, Joann G.; Abraham, Linn A.; Gerrity, Martha S.; Hendrick, R. Edward; Taplin, Stephen H.; Barlow, William E.; Cutter, Gary R.; Poplack, Steven P.; D’Orsi, Carl J.

    2011-01-01

    Objective To determine radiologists’ reactions to uncertainty when interpreting mammography and the extent to which radiologist uncertainty explains variability in interpretive performance. Methods The authors used a mailed survey to assess demographic and clinical characteristics of radiologists and reactions to uncertainty associated with practice. Responses were linked to radiologists’ actual interpretive performance data obtained from 3 regionally located mammography registries. Results More than 180 radiologists were eligible to participate, and 139 consented for a response rate of 76.8%. Radiologist gender, more years interpreting, and higher volume were associated with lower uncertainty scores. Positive predictive value, recall rates, and specificity were more affected by reactions to uncertainty than sensitivity or negative predictive value; however, none of these relationships was statistically significant. Conclusion Certain practice factors, such as gender and years of interpretive experience, affect uncertainty scores. Radiologists’ reactions to uncertainty do not appear to affect interpretive performance. PMID:15155014

  41. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  42. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  43. Exposure Factors Handbook (1996, External Review Draft)

    EPA Science Inventory

    This handbook provided a summary of the available statistical data on various factors used in assessing human exposure.

    Volume I, General Factors, includes an introduction and discussion of uncertainty and provides data for drinking water consumption, soil ingestion, inha...

  44. Role of beach morphology in wave overtopping hazard assessment

    NASA Astrophysics Data System (ADS)

    Phillips, Benjamin; Brown, Jennifer; Bidlot, Jean-Raymond; Plater, Andrew

    2017-04-01

    Understanding the role of beach morphology in controlling wave overtopping volume will further minimise uncertainties in flood risk assessments at coastal locations defended by engineered structures worldwide. XBeach is used to model wave overtopping volume for a 1:200 yr joint probability distribution of waves and water levels with measured, pre- and post-storm beach profiles. The simulation with measured bathymetry is repeated with and without morphological evolution enabled during the modelled storm event. This research assesses the role of morphology in controlling wave overtopping volumes for hazardous events that meet the typical design level of coastal defence structures. Results show disabling storm-driven morphology under-represents modelled wave overtopping volumes by up to 39% under high Hs conditions, and has a greater impact on the wave overtopping rate than the variability applied within the boundary conditions due to the range of wave-water level combinations that meet the 1:200 yr joint probability criterion. Accounting for morphology in flood modelling is therefore critical for accurately predicting wave overtopping volumes and the resulting flood hazard and to assess economic losses.

  5. Simulation Credibility: Advances in Verification, Validation, and Uncertainty Quantification

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B. (Editor); Eklund, Dean R.; Romero, Vicente J.; Pearce, Jeffrey A.; Keim, Nicholas S.

    2016-01-01

    Decision makers and other users of simulations need to know quantified simulation credibility to make simulation-based critical decisions and effectively use simulations, respectively. The credibility of a simulation is quantified by its accuracy in terms of uncertainty, and the responsibility of establishing credibility lies with the creator of the simulation. In this volume, we present some state-of-the-art philosophies, principles, and frameworks. The contributing authors involved in this publication have been dedicated to advancing simulation credibility. They detail and provide examples of key advances over the last 10 years in the processes used to quantify simulation credibility: verification, validation, and uncertainty quantification. The philosophies and assessment methods presented here are anticipated to be useful to other technical communities conducting continuum physics-based simulations; for example, issues related to the establishment of simulation credibility in the discipline of propulsion are discussed. We envision that simulation creators will find this volume very useful to guide and assist them in quantitatively conveying the credibility of their simulations.

  6. Impact of Non-Gaussian Error Volumes on Conjunction Assessment Risk Analysis

    NASA Technical Reports Server (NTRS)

    Ghrist, Richard W.; Plakalovic, Dragan

    2012-01-01

    An understanding of how an initially Gaussian error volume becomes non-Gaussian over time is an important consideration for space-vehicle conjunction assessment. Traditional assumptions applied to the error volume artificially suppress the true non-Gaussian nature of the space-vehicle position uncertainties. For typical conjunction assessment objects, representation of the error volume by a state error covariance matrix in a Cartesian reference frame is a more significant limitation than is the assumption of linearized dynamics for propagating the error volume. In this study, the impact of each assumption is examined and isolated for each point in the volume. Limitations arising from representing the error volume in a Cartesian reference frame are corrected by employing a Monte Carlo approach to probability of collision (Pc), using equinoctial samples from the Cartesian position covariance at the time of closest approach (TCA) between the pair of space objects. A set of actual, higher-risk (Pc >= 10^-4) conjunction events in various low-Earth orbits is analyzed using Monte Carlo methods. The impact of non-Gaussian error volumes on Pc for these cases is minimal, even when the deviation from a Gaussian distribution is significant.
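
    A minimal sketch of the Monte Carlo Pc idea described above, in the simpler Cartesian-Gaussian setting rather than the paper's equinoctial sampling; the miss distance, covariance, and hard-body radius below are hypothetical:

      import numpy as np

      def monte_carlo_pc(mean_miss, cov, hard_body_radius, n=400_000, seed=0):
          # Sample relative positions at TCA from the combined position covariance
          # and estimate Pc as the fraction falling inside the hard-body sphere.
          rng = np.random.default_rng(seed)
          rel_pos = rng.multivariate_normal(mean_miss, cov, size=n)
          return np.mean(np.linalg.norm(rel_pos, axis=1) < hard_body_radius)

      # Hypothetical event: 120 m mean miss, anisotropic covariance, 20 m radius.
      cov = np.diag([50.0**2, 300.0**2, 25.0**2])  # m^2, radial/in-track/cross-track
      print(monte_carlo_pc(np.array([120.0, 0.0, 0.0]), cov, 20.0))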

  7. Quantifying radar-rainfall uncertainties in urban drainage flow modelling

    NASA Astrophysics Data System (ADS)

    Rico-Ramirez, M. A.; Liguori, S.; Schellart, A. N. A.

    2015-09-01

    This work presents the results of the implementation of a probabilistic system to model the uncertainty associated with radar rainfall (RR) estimates and the way this uncertainty propagates through the sewer system of an urban area located in the North of England. The spatial and temporal correlations of the RR errors as well as the error covariance matrix were computed to build a RR error model able to generate RR ensembles that reproduce the uncertainty associated with the measured rainfall. The results showed that the RR ensembles provide important information about the uncertainty in the rainfall measurement that can be propagated through the urban sewer system, and that the measured flow peaks and flow volumes are often bounded within the uncertainty band produced by the RR ensembles. In 55% of the simulated events, the uncertainties in RR measurements can explain the uncertainties observed in the simulated flow volumes. However, there are also some events where the RR uncertainty cannot explain the whole uncertainty observed in the simulated flow volumes, indicating that there are additional sources of uncertainty that must be considered, such as the uncertainty in the urban drainage model structure, the uncertainty in the calibrated parameters of the urban drainage model, and the uncertainty in the measured sewer flows.
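
    A sketch of one way such RR ensembles can be generated, assuming a multiplicative log-normal error model and a known error covariance matrix (the Cholesky factor induces the spatial correlation); the field and covariance below are toy values, not the authors' model:

      import numpy as np

      def rainfall_ensemble(radar_field, error_cov, n_members=50, seed=1):
          # Draw spatially correlated log-space errors using the Cholesky factor
          # of the error covariance, then apply them multiplicatively.
          rng = np.random.default_rng(seed)
          L = np.linalg.cholesky(error_cov)      # covariance must be positive definite
          shape, n_pix = radar_field.shape, radar_field.size
          eps = (L @ rng.standard_normal((n_pix, n_members))).T
          return radar_field[None, :, :] * np.exp(eps.reshape(n_members, *shape))

      # Toy 4x4 radar field with exponentially decaying inter-pixel error correlation.
      field = np.full((4, 4), 5.0)               # mm/h
      xy = np.indices((4, 4)).reshape(2, -1).T
      dist = np.linalg.norm(xy[:, None] - xy[None, :], axis=-1)
      cov = 0.04 * np.exp(-dist / 2.0)           # log-error variance 0.04
      members = rainfall_ensemble(field, cov)
      print(members.shape, members.mean())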

  8. Solar power satellite system definition study. Volume 1, phase 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A systems definition study of the solar power satellite (SPS) system is presented. The technical feasibility of solar power satellites is assessed based on forecasts of technical capability in the various applicable technologies. The performance, cost, operational characteristics, reliability, and the suitability of SPSs as power generators for typical commercial electricity grids are discussed. The uncertainties inherent in the system characteristics forecasts are assessed.

  9. WE-AB-207B-06: Dose and Biological Uncertainties in Sarcoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marteinsdottir, M; University of Iceland, Reykjavik; Schuemann, J

    2016-06-15

    Purpose: To understand the clinical impact of key uncertainties in proton therapy potentially affecting the analysis of clinical trials, namely the assumption of using a constant relative biological effectiveness (RBE) of 1.1 compared to variable RBE for proton therapy and the use of analytical dose calculation (ADC) methods. Methods: Proton dose distributions were compared for analytical and Monte Carlo (TOPAS) dose calculations. In addition, dose distributions computed with a constant RBE of 1.1 (RBE-constant) were compared with those from four different variable-RBE models (to assess model variations). Ten patients were selected from an ongoing clinical trial on IMRT versus scanned protons for sarcoma. Comparisons were performed using dosimetric indices based on dose-volume histogram analyses and γ-index analyses. Results: For three of the RBE-models the mean dose, D95, D50 and D02 (dose values covering 95%, 50% and 2% of the target volume, respectively) were up to 5% lower than for RBE-constant. The dosimetric indices for one of the RBE-models were around 9% lower than for the RBE-constant model. The differences for V90 (the percentage of the target volume covered by 90% of the prescription dose) were up to 40% for three RBE-models, whereas for one the difference was around 95%. All ADC dosimetric indices were up to 5% larger than for RBE-constant. The γ-index passing rate for the target volume with a 3%/3mm criterion was above 97% for all models except for one, which was below 24%. Conclusion: Interpretation of clinical trials on sarcoma may depend on dose calculation uncertainties (as assessed by Monte Carlo). In addition, the biological dose distribution depends notably on which RBE model is utilized. The current practice of using a constant RBE of 1.1 may overestimate the target dose by as much as 5% for biological dose calculations. Performing an RBE uncertainty analysis is recommended for trial analysis. U19 projects - U19 CA 021239. PI: Delaney.

  10. Dealing with uncertainty in the probability of overtopping of a flood mitigation dam

    NASA Astrophysics Data System (ADS)

    Michailidi, Eleni Maria; Bacchi, Baldassare

    2017-05-01

    In recent years, copula multivariate functions have been used to model, probabilistically, the most important variables of flood events: discharge peak, flood volume and duration. However, in most cases the sampling uncertainty, from which small-sized samples suffer, is neglected. In this paper, considering a real reservoir controlled by a dam as a case study, we apply a structure-based approach to estimate the probability of reaching specific reservoir levels, taking into account the key components of an event (flood peak, volume, hydrograph shape) and of the reservoir (rating curve, volume-water depth relation). Additionally, we improve information about the peaks from historical data and reports through a Bayesian framework, allowing the incorporation of supplementary knowledge from different sources and its associated error. As shown here, the extra information can result in a very different inferred parameter set, which is reflected in strong variability of the reservoir level associated with a given return period. Most importantly, the sampling uncertainty is accounted for in both cases (single-site and multi-site with historical information scenarios), and Monte Carlo confidence intervals for the maximum water level are calculated. It is shown that water levels of specific return periods in many cases overlap, thus making risk assessment misleading unless confidence intervals are provided.

  11. Quantifying uncertainty in carbon and nutrient pools of coarse woody debris

    NASA Astrophysics Data System (ADS)

    See, C. R.; Campbell, J. L.; Fraver, S.; Domke, G. M.; Harmon, M. E.; Knoepp, J. D.; Woodall, C. W.

    2016-12-01

    Woody detritus constitutes a major pool of both carbon and nutrients in forested ecosystems. Estimating coarse wood stocks relies on many assumptions, even when full surveys are conducted. Researchers rarely report error in coarse wood pool estimates, despite the importance to ecosystem budgets and modelling efforts. To date, no study has attempted a comprehensive assessment of error rates and uncertainty inherent in the estimation of this pool. Here, we use Monte Carlo analysis to propagate the error associated with the major sources of uncertainty present in the calculation of coarse wood carbon and nutrient (i.e., N, P, K, Ca, Mg, Na) pools. We also evaluate individual sources of error to identify the importance of each source of uncertainty in our estimates. We quantify sampling error by comparing the three most common field methods used to survey coarse wood (two transect methods and a whole-plot survey). We quantify the measurement error associated with length and diameter measurement, and technician error in species identification and decay class using plots surveyed by multiple technicians. We use previously published values of model error for the four most common methods of volume estimation: Smalian's, conical frustum, conic paraboloid, and average-of-ends. We also use previously published values for error in the collapse ratio (cross-sectional height/width) of decayed logs that serves as a surrogate for the volume remaining. We consider sampling error in chemical concentration and density for all decay classes, using distributions from both published and unpublished studies. Analytical uncertainty is calculated using standard reference plant material from the National Institute of Standards. Our results suggest that technician error in decay classification can have a large effect on uncertainty, since many of the error distributions included in the calculation (e.g. density, chemical concentration, volume-model selection, collapse ratio) are decay-class specific.
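
    A minimal Monte Carlo propagation sketch in the spirit of the study, for a single log using Smalian's volume formula (one of the four volume models named above); all distributions below are illustrative assumptions, not values from the paper:

      import numpy as np

      rng = np.random.default_rng(42)
      n = 10_000

      # Assumed error distributions for one log (illustrative values only).
      length  = rng.normal(5.00, 0.05, n)    # m, length measurement error
      d_base  = rng.normal(0.34, 0.01, n)    # m, end-diameter measurement error
      d_top   = rng.normal(0.26, 0.01, n)    # m
      density = rng.normal(350.0, 40.0, n)   # kg/m^3, decay-class specific
      c_conc  = rng.normal(0.50, 0.02, n)    # carbon fraction

      # Smalian's formula: volume from the average of the two end areas.
      area = lambda d: np.pi * (d / 2.0) ** 2
      volume = 0.5 * (area(d_base) + area(d_top)) * length   # m^3
      carbon = volume * density * c_conc                     # kg C

      print(f"{carbon.mean():.1f} kg C, 95% interval "
            f"[{np.percentile(carbon, 2.5):.1f}, {np.percentile(carbon, 97.5):.1f}]")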

  12. Is it necessary to plan with safety margins for actively scanned proton therapy?

    NASA Astrophysics Data System (ADS)

    Albertini, F.; Hug, E. B.; Lomax, A. J.

    2011-07-01

    In radiation therapy, a plan is robust if the calculated and the delivered dose are in agreement, even in the presence of various uncertainties. The current practice is to use safety margins, expanding the clinical target volume sufficiently to account for treatment uncertainties. This, however, might not be ideal for proton therapy, and in particular for intensity modulated proton therapy (IMPT) plans, as degradation of dose conformity can also occur in the middle of the target, resulting from misalignment of steep in-field dose gradients. Single field uniform dose (SFUD) and IMPT plans have been calculated for different anatomical sites, and the need for margins has been assessed by analyzing plan robustness to set-up and range uncertainties. We found that the use of safety margins is a good way to improve plan robustness for SFUD and IMPT plans with low in-field dose gradients, but not necessarily for highly modulated IMPT plans, for which only a marginal improvement in plan robustness could be detected through the definition of a planning target volume.

  13. Dynamic Target Definition: a novel approach for PTV definition in ion beam therapy.

    PubMed

    Cabal, Gonzalo A; Jäkel, Oliver

    2013-05-01

    To present a beam-arrangement-specific approach for PTV definition in ion beam therapy. By means of a Monte Carlo error propagation analysis, a criterion is formulated to assess whether a voxel is safely treated. Based on this, a non-isotropic expansion rule is proposed, aiming to minimize the impact of uncertainties on the dose delivered. The method is exemplified in two cases: a head-and-neck case and a prostate case. In both cases the modality used is proton beam irradiation, and the sources of uncertainty taken into account are positioning (set-up) errors and range uncertainties. It is shown how different beam arrangements affect plan robustness, leading to different target expansions being necessary to assure a predefined level of plan robustness. The relevance of appropriate beam angle arrangements as a way to minimize uncertainties is demonstrated. A novel method for PTV definition in ion beam therapy is presented. The method shows promising results, improving the probability of correct CTV dose coverage while reducing the size of the PTV. In a clinical scenario this translates into an enhanced tumor control probability while reducing the volume of healthy tissue being irradiated. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  14. From field data to volumes: constraining uncertainties in pyroclastic eruption parameters

    NASA Astrophysics Data System (ADS)

    Klawonn, Malin; Houghton, Bruce F.; Swanson, Donald A.; Fagents, Sarah A.; Wessel, Paul; Wolfe, Cecily J.

    2014-07-01

    In this study, we aim to understand the variability in eruption volume estimates derived from field studies of pyroclastic deposits. We distributed paper maps of the 1959 Kīlauea Iki tephra to 101 volcanologists worldwide, who produced hand-drawn isopachs. Across the returned maps, uncertainty in isopach areas is 7 % across the well-sampled deposit but increases to over 30 % for isopachs that are governed by the largest and smallest thickness measurements. We fit the exponential, power-law, and Weibull functions through the isopach thickness versus area^(1/2) values and find volume estimate variations up to a factor of 4.9 for a single map. Across all maps and methodologies, we find an average standard deviation for a total volume of s = 29 %. The volume uncertainties are largest for the most proximal (s = 62 %) and distal field (s = 53 %) and small for the densely sampled intermediate deposit (s = 8 %). For the Kīlauea Iki 1959 eruption, we find that the deposit beyond the 5-cm isopach contains only 2 % of the total erupted volume, whereas the near-source deposit contains 48 % and the intermediate deposit 50 % of the total volume. Thus, the relative uncertainty within each zone impacts the total volume estimates differently. The observed uncertainties for the different deposit regions in this study illustrate a fundamental problem of estimating eruption volumes: while some methodologies may provide better fits to the isopach data or rely on fewer free parameters, the main issue remains the predictive capabilities of the empirical functions for the regions where measurements are missing.
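
    For context, the exponential fit mentioned above has a closed-form volume: if thickness decays as T = T0 exp(-k area^(1/2)), the deposit volume integrates to V = 2 T0 / k^2 (Fierstein and Nathenson, 1992). A sketch with hypothetical isopach data:

      import numpy as np

      # Hypothetical isopach data: thickness [m] and enclosed area [km^2].
      thickness = np.array([1.21, 0.74, 0.27, 0.10, 0.037])
      area = np.array([1.0, 4.0, 16.0, 36.0, 64.0])

      slope, intercept = np.polyfit(np.sqrt(area), np.log(thickness), 1)
      T0, k = np.exp(intercept), -slope      # extrapolated vent thickness, decay rate

      volume_km3 = 2.0 * T0 / k**2 * 1e-3    # m * km^2 -> km^3
      print(f"T0 = {T0:.2f} m, k = {k:.3f} /km, V = {volume_km3:.4f} km^3")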

  15. A novel method for the evaluation of uncertainty in dose-volume histogram computation.

    PubMed

    Henríquez, Francisco Cutanda; Castrillón, Silvia Vargas

    2008-03-15

    Dose-volume histograms (DVHs) are a useful tool in state-of-the-art radiotherapy treatment planning, and it is essential to recognize their limitations. Even after a specific dose-calculation model is optimized, dose distributions computed by using treatment-planning systems are affected by several sources of uncertainty, such as algorithm limitations, measurement uncertainty in the data used to model the beam, and residual differences between measured and computed dose. This report presents a novel method to take these uncertainties into account: a probabilistic approach using a new kind of histogram, the dose-expected volume histogram. The expected value of the volume in the region of interest receiving an absorbed dose equal to or greater than a certain value is found by using the probability distribution of the dose at each point. A rectangular probability distribution is assumed for this point dose, and a formulation that accounts for the uncertainties associated with the point dose is presented for practical computations. This method is applied to a set of DVHs for different regions of interest, including 6 brain patients, 8 lung patients, 8 pelvis patients, and 6 prostate patients planned for intensity-modulated radiation therapy. Results show a greater effect on planning target volume coverage than on organs at risk. In cases of steep DVH gradients, such as planning target volumes, this new method shows the largest differences from the corresponding DVH; thus, the effect of the uncertainty is larger.
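
    A compact sketch of the dose-expected volume histogram: each point dose is given a rectangular distribution of half-width delta, and the expected volume receiving at least dose D is the sum of per-voxel exceedance probabilities. The doses and voxel size below are hypothetical:

      import numpy as np

      def expected_dvh(point_doses, voxel_volume, dose_axis, half_width):
          # Each point dose d is uniform on [d - half_width, d + half_width]; the
          # expected volume receiving >= D sums per-voxel exceedance probabilities.
          d = np.asarray(point_doses)[:, None]           # (voxels, 1)
          D = np.asarray(dose_axis)[None, :]             # (1, dose bins)
          p_exceed = np.clip((d + half_width - D) / (2.0 * half_width), 0.0, 1.0)
          return voxel_volume * p_exceed.sum(axis=0)

      doses = np.random.default_rng(0).normal(60.0, 2.0, 5000)  # Gy, hypothetical PTV
      axis = np.linspace(50.0, 70.0, 201)
      evh = expected_dvh(doses, 0.001, axis, half_width=1.5)    # cm^3 voxels, +/-1.5 Gy
      print(evh[0], evh[-1])                                    # full volume -> ~0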

  16. Uncertainty Estimation for the Determination of Ni, Pb and Al in Natural Water Samples by SPE-ICP-OES

    NASA Astrophysics Data System (ADS)

    Ghorbani, A.; Farahani, M. Mahmoodi; Rabbani, M.; Aflaki, F.; Waqifhosain, Syed

    2008-01-01

    In this paper we propose uncertainty estimation for the analytical results we obtained from the determination of Ni, Pb and Al by solid-phase extraction and inductively coupled plasma optical emission spectrometry (SPE-ICP-OES). The procedure is based on the retention of analytes in the form of 8-hydroxyquinoline (8-HQ) complexes on a mini column of XAD-4 resin and subsequent elution with nitric acid. The influence of various analytical parameters including the amount of solid phase, pH, elution factors (concentration and volume of eluting solution), volume of sample solution, and amount of ligand on the extraction efficiency of analytes was investigated. To estimate the uncertainty of the analytical results obtained, we propose assessing trueness by employing spiked samples. Two types of bias are calculated in the assessment of trueness: a proportional bias and a constant bias. We applied a nested design for calculating the proportional bias and the Youden method to calculate the constant bias. The proportional bias is calculated from spiked samples: the concentration found is plotted against the concentration added, and the slope of the standard-addition curve is an estimate of the method recovery. The estimated average method recovery in Karaj river water is: (1.004±0.0085) for Ni, (0.999±0.010) for Pb and (0.987±0.008) for Al.
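
    The proportional-bias (recovery) estimate described above is the slope of the found-versus-added regression line; a minimal sketch with hypothetical spike data:

      import numpy as np

      # Found-vs-added regression on spiked samples: the slope estimates the
      # proportional bias (method recovery); values below are hypothetical.
      added = np.array([0.0, 10.0, 20.0, 40.0])   # ug/L spiked
      found = np.array([5.1, 15.0, 25.2, 44.9])   # ug/L measured
      slope, intercept = np.polyfit(added, found, 1)
      print(f"recovery ~ {slope:.3f}, native concentration ~ {intercept:.1f} ug/L")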

  17. Quantifying uncertainty in soot volume fraction estimates using Bayesian inference of auto-correlated laser-induced incandescence measurements

    NASA Astrophysics Data System (ADS)

    Hadwin, Paul J.; Sipkens, T. A.; Thomson, K. A.; Liu, F.; Daun, K. J.

    2016-01-01

    Auto-correlated laser-induced incandescence (AC-LII) infers the soot volume fraction (SVF) of soot particles by comparing the spectral incandescence from laser-energized particles to the pyrometrically inferred peak soot temperature. This calculation requires detailed knowledge of model parameters such as the absorption function of soot, which may vary with combustion chemistry, soot age, and the internal structure of the soot. This work presents a Bayesian methodology to quantify such uncertainties. This technique treats the additional "nuisance" model parameters, including the soot absorption function, as stochastic variables and incorporates the current state of knowledge of these parameters into the inference process through maximum entropy priors. While standard AC-LII analysis provides a point estimate of the SVF, Bayesian techniques infer the posterior probability density, which will allow scientists and engineers to better assess the reliability of AC-LII inferred SVFs in the context of environmental regulations and competing diagnostics.

  18. Assuring Life in Composite Systems

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A computational simulation method is presented to assure life in composite systems, using dynamic buckling of smart composite shells as an example. The combined use of composite mechanics, finite element computer codes, and probabilistic analysis enables the effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right below the outer plies. The results show that, on average, the use of smart fibers improved the shell buckling resistance by about 9% at different probabilities and delayed the buckling occurrence time. The probabilistic sensitivity results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load, and thereby on the assured life of the shell; uncertainties in the electric field strength and smart material volume fraction have moderate effects.

  19. Predicting uncertainty in future marine ice sheet volume using Bayesian statistical methods

    NASA Astrophysics Data System (ADS)

    Davis, A. D.

    2015-12-01

    The marine ice instability can trigger rapid retreat of marine ice streams. Recent observations suggest that marine ice systems in West Antarctica have begun retreating. However, unknown ice dynamics, computationally intensive mathematical models, and uncertain parameters in these models make predicting retreat rate and ice volume difficult. In this work, we fuse current observational data with ice stream/shelf models to develop probabilistic predictions of future grounded ice sheet volume. Given observational data (e.g., thickness, surface elevation, and velocity) and a forward model that relates uncertain parameters (e.g., basal friction and basal topography) to these observations, we use a Bayesian framework to define a posterior distribution over the parameters. A stochastic predictive model then propagates uncertainties in these parameters to uncertainty in a particular quantity of interest (QoI)---here, the volume of grounded ice at a specified future time. While the Bayesian approach can in principle characterize the posterior predictive distribution of the QoI, the computational cost of both the forward and predictive models makes this effort prohibitively expensive. To tackle this challenge, we introduce a new Markov chain Monte Carlo method that constructs convergent approximations of the QoI target density in an online fashion, yielding accurate characterizations of future ice sheet volume at significantly reduced computational cost. Our second goal is to attribute uncertainty in these Bayesian predictions to uncertainties in particular parameters. Doing so can help target data collection, for the purpose of constraining the parameters that contribute most strongly to uncertainty in the future volume of grounded ice. For instance, smaller uncertainties in parameters to which the QoI is highly sensitive may account for more variability in the prediction than larger uncertainties in parameters to which the QoI is less sensitive. We use global sensitivity analysis to help answer this question, and make the computation of sensitivity indices computationally tractable using a combination of polynomial chaos and Monte Carlo techniques.

  20. Design and Uncertainty Analysis for a PVTt Gas Flow Standard

    PubMed Central

    Wright, John D.; Johnson, Aaron N.; Moldover, Michael R.

    2003-01-01

    A new pressure, volume, temperature, and time (PVTt) primary gas flow standard at the National Institute of Standards and Technology has an expanded uncertainty (k = 2) of between 0.02 % and 0.05 %. The standard spans the flow range of 1 L/min to 2000 L/min using two collection tanks and two diverter valve systems. The standard measures flow by collecting gas in a tank of known volume during a measured time interval. We describe the significant and novel features of the standard and analyze its uncertainty. The gas collection tanks have a small diameter and are immersed in a uniform, stable, thermostatted water bath. The collected gas achieves thermal equilibrium rapidly and the uncertainty of the average gas temperature is only 7 mK (22 × 10−6 T). A novel operating method leads to essentially zero mass change in and very low uncertainty contributions from the inventory volume. Gravimetric and volume expansion techniques were used to determine the tank and the inventory volumes. Gravimetric determinations of collection tank volume made with nitrogen and argon agree with a standard deviation of 16 × 10−6 VT. The largest source of uncertainty in the flow measurement is drift of the pressure sensor over time, which contributes a relative standard uncertainty of 60 × 10−6 to the determinations of the volumes of the collection tanks and to the flow measurements. Throughout the range 3 L/min to 110 L/min, flows were measured independently using the 34 L and the 677 L collection systems, and the two systems agreed within a relative difference of 150 × 10−6. Double diversions were used to evaluate the 677 L system over a range of 300 L/min to 1600 L/min, and the relative differences between single and double diversions were less than 75 × 10−6. PMID:27413592
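
    The core PVTt computation is the real-gas law applied to the collection tank, m = pVM/(ZRT), with flow given by the mass change over the collection interval. A sketch with illustrative numbers (compressibility Z is set to 1 here for brevity, which a primary standard at this uncertainty level would not do):

      R = 8.314462618  # J/(mol K), molar gas constant

      def gas_mass(p_pa, t_k, volume_m3, molar_mass_kg, z=1.0):
          # Mass of gas in the collection tank: m = p V M / (Z R T).
          return p_pa * volume_m3 * molar_mass_kg / (z * R * t_k)

      M_N2 = 0.0280134                               # kg/mol, nitrogen
      m0 = gas_mass(2_000.0, 295.15, 0.034, M_N2)    # evacuated 34 L tank
      m1 = gas_mass(150_000.0, 295.20, 0.034, M_N2)  # after a 60 s collection
      print(f"mass flow = {(m1 - m0) / 60.0 * 1e3:.4f} g/s")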

  1. Quantifying model uncertainty in seasonal Arctic sea-ice forecasts

    NASA Astrophysics Data System (ADS)

    Blanchard-Wrigglesworth, Edward; Barthélemy, Antoine; Chevallier, Matthieu; Cullather, Richard; Fučkar, Neven; Massonnet, François; Posey, Pamela; Wang, Wanqiu; Zhang, Jinlun; Ardilouze, Constantin; Bitz, Cecilia; Vernieres, Guillaume; Wallcraft, Alan; Wang, Muyin

    2017-04-01

    Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or post-processing techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.

  2. From field data to volumes: constraining uncertainties in pyroclastic eruption parameters

    USGS Publications Warehouse

    Klawonn, Malin; Houghton, Bruce F.; Swanson, Don; Fagents, Sarah A.; Wessel, Paul; Wolfe, Cecily J.

    2014-01-01

    In this study, we aim to understand the variability in eruption volume estimates derived from field studies of pyroclastic deposits. We distributed paper maps of the 1959 Kīlauea Iki tephra to 101 volcanologists worldwide, who produced hand-drawn isopachs. Across the returned maps, uncertainty in isopach areas is 7 % across the well-sampled deposit but increases to over 30 % for isopachs that are governed by the largest and smallest thickness measurements. We fit the exponential, power-law, and Weibull functions through the isopach thickness versus area^(1/2) values and find volume estimate variations up to a factor of 4.9 for a single map. Across all maps and methodologies, we find an average standard deviation for a total volume of s = 29 %. The volume uncertainties are largest for the most proximal (s = 62 %) and distal field (s = 53 %) and small for the densely sampled intermediate deposit (s = 8 %). For the Kīlauea Iki 1959 eruption, we find that the deposit beyond the 5-cm isopach contains only 2 % of the total erupted volume, whereas the near-source deposit contains 48 % and the intermediate deposit 50 % of the total volume. Thus, the relative uncertainty within each zone impacts the total volume estimates differently. The observed uncertainties for the different deposit regions in this study illustrate a fundamental problem of estimating eruption volumes: while some methodologies may provide better fits to the isopach data or rely on fewer free parameters, the main issue remains the predictive capabilities of the empirical functions for the regions where measurements are missing.

  3. Uncertainty in peat volume and soil carbon estimated using ground-penetrating radar and probing

    Treesearch

    Andrew D. Parsekian; Lee Slater; Dimitrios Ntarlagiannis; James Nolan; Stephen D. Sebestyen; Randall K. Kolka; Paul J. Hanson

    2012-01-01

    Estimating soil C stock in a peatland is highly dependent on accurate measurement of the peat volume. In this study, we evaluated the uncertainty in calculations of peat volume using high-resolution data to resolve the three-dimensional structure of a peat basin based on both direct (push probes) and indirect geophysical (ground-penetrating radar) measurements. We...

  4. Uncertainty modelling and analysis of volume calculations based on a regular grid digital elevation model (DEM)

    NASA Astrophysics Data System (ADS)

    Li, Chang; Wang, Qing; Shi, Wenzhong; Zhao, Sisi

    2018-05-01

    The accuracy of earthwork calculations that compute terrain volume is critical to digital terrain analysis (DTA). The uncertainties in volume calculations (VCs) based on a DEM are primarily related to three factors: 1) model error (ME), which is caused by the algorithm adopted for the VC model, 2) discrete error (DE), which is usually caused by DEM resolution and terrain complexity, and 3) propagation error (PE), which is caused by errors in the input variables. Based on these factors, the uncertainty modelling and analysis of VCs based on a regular grid DEM are investigated in this paper. In particular, a way to quantify the uncertainty of VCs with a confidence interval based on truncation error (TE) is proposed. In the experiments, the trapezoidal double rule (TDR) and Simpson's double rule (SDR) were used to calculate volume, where the TE is the major ME, and six simulated regular grid DEMs with different terrain complexity and resolution (i.e. DE) were generated from a Gauss synthetic surface to easily obtain the theoretical true value and eliminate the interference of data errors. For PE, Monte Carlo simulation techniques and spatial autocorrelation were used to represent DEM uncertainty. This study can enrich the uncertainty modelling and analysis theories of geographic information science.
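
    A sketch of the two quadrature rules named above, applied to a Gauss synthetic surface on a regular grid; grid size and surface parameters are illustrative:

      import numpy as np

      def trapezoid_volume(z, dx, dy):
          # Trapezoidal double rule over a regular grid of elevations z.
          w = np.ones(z.shape[0]); w[1:-1] = 2.0
          v = np.ones(z.shape[1]); v[1:-1] = 2.0
          return dx * dy / 4.0 * np.einsum('i,ij,j->', w, z, v)

      def simpson_volume(z, dx, dy):
          # Simpson's double rule; both grid dimensions must have an odd
          # number of points (an even number of cells).
          def weights(n):
              w = np.ones(n); w[1:-1:2] = 4.0; w[2:-1:2] = 2.0
              return w
          return dx * dy / 9.0 * np.einsum('i,ij,j->',
                                           weights(z.shape[0]), z, weights(z.shape[1]))

      # Gauss synthetic surface on a 101 x 101 grid.
      x = np.linspace(-3.0, 3.0, 101)
      X, Y = np.meshgrid(x, x)
      Z = np.exp(-(X**2 + Y**2))
      h = x[1] - x[0]
      print(trapezoid_volume(Z, h, h), simpson_volume(Z, h, h))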

  5. Uncertainty propagation for SPECT/CT-based renal dosimetry in 177Lu peptide receptor radionuclide therapy

    NASA Astrophysics Data System (ADS)

    Gustafsson, Johan; Brolin, Gustav; Cox, Maurice; Ljungberg, Michael; Johansson, Lena; Sjögreen Gleisner, Katarina

    2015-11-01

    A computer model of a patient-specific clinical 177Lu-DOTATATE therapy dosimetry system is constructed and used for investigating the variability of renal absorbed dose and biologically effective dose (BED) estimates. As patient models, three anthropomorphic computer phantoms coupled to a pharmacokinetic model of 177Lu-DOTATATE are used. Aspects included in the dosimetry-process model are the gamma-camera calibration via measurement of the system sensitivity, selection of imaging time points, generation of mass-density maps from CT, SPECT imaging, volume-of-interest delineation, calculation of absorbed-dose rate via a combination of local energy deposition for electrons and Monte Carlo simulations of photons, curve fitting and integration to absorbed dose and BED. By introducing variabilities in these steps the combined uncertainty in the output quantity is determined. The importance of different sources of uncertainty is assessed by observing the decrease in standard deviation when removing a particular source. The obtained absorbed dose and BED standard deviations are approximately 6% and slightly higher if considering the root mean square error. The most important sources of variability are the compensation for partial volume effects via a recovery coefficient and the gamma-camera calibration via the system sensitivity.
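
    The curve-fitting and integration step can be illustrated with a mono-exponential fit to hypothetical kidney time-activity data; scaling the time-integrated activity to absorbed dose (the paper's local-energy-deposition plus Monte Carlo step) is omitted:

      import numpy as np
      from scipy.optimize import curve_fit

      # Hypothetical kidney time-activity data from three SPECT/CT time points.
      t = np.array([24.0, 96.0, 168.0])    # h after injection
      a = np.array([0.95, 0.40, 0.18])     # activity concentration, MBq/mL

      mono_exp = lambda t, a0, lam: a0 * np.exp(-lam * t)
      (a0, lam), _ = curve_fit(mono_exp, t, a, p0=(1.0, 0.01))

      # Time-integrated activity concentration: analytic tail integral of the fit.
      tia = a0 / lam                       # MBq h / mL
      print(f"a0 = {a0:.3f} MBq/mL, T_eff = {np.log(2)/lam:.1f} h, TIA = {tia:.1f} MBq h/mL")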

  6. Uncertainties in stormwater runoff data collection from a small urban catchment, Southeast China.

    PubMed

    Huang, Jinliang; Tu, Zhenshun; Du, Pengfei; Lin, Jie; Li, Qingsheng

    2010-01-01

    Monitoring data are often used to identify stormwater runoff characteristics and in stormwater runoff modelling without consideration of their inherent uncertainties. Integrated with discrete sample analysis and error propagation analysis, this study attempted to quantify the uncertainties of discrete chemical oxygen demand (COD), total suspended solids (TSS) concentration, stormwater flowrate, stormwater event volumes, COD event mean concentration (EMC), and COD event loads in terms of flow measurement, sample collection, storage and laboratory analysis. The results showed that the uncertainties due to sample collection, storage and laboratory analysis of COD from stormwater runoff are 13.99%, 19.48% and 12.28%. Meanwhile, flow measurement uncertainty was 12.82%, and the sample collection uncertainty of TSS from stormwater runoff was 31.63%. Based on the law of propagation of uncertainties, the uncertainties regarding event flow volume, COD EMC and COD event loads were quantified as 7.03%, 10.26% and 18.47%.
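
    The law of propagation of uncertainties invoked above reduces, for a product of independent quantities, to combining relative uncertainties in quadrature; the sketch below shows the generic rule with illustrative inputs (a building block only, not a reproduction of the study's event-load budget, whose terms are not all independent):

      import numpy as np

      def combined_rel_uncertainty(*rel_u):
          # GUM first-order rule for a product/quotient of independent quantities:
          # relative standard uncertainties combine in quadrature.
          return float(np.sqrt(np.sum(np.square(rel_u))))

      # Illustrative: a load computed as concentration x volume with 10% and 7%
      # relative uncertainties on the two factors.
      print(f"{combined_rel_uncertainty(0.10, 0.07):.1%}")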

  7. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harper, F.T.; Young, M.L.; Miller, L.A.

    Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

  8. Transforming Medical Assessment: Integrating Uncertainty Into the Evaluation of Clinical Reasoning in Medical Education.

    PubMed

    Cooke, Suzette; Lemay, Jean-Francois

    2017-06-01

    In an age where practicing physicians have access to an overwhelming volume of clinical information and are faced with increasingly complex medical decisions, the ability to execute sound clinical reasoning is essential to optimal patient care. The authors propose two concepts that are philosophically paramount to the future assessment of clinical reasoning in medicine: assessment in the context of "uncertainty" (when, despite all of the information that is available, there is still significant doubt as to the best diagnosis, investigation, or treatment), and acknowledging that it is entirely possible (and reasonable) to have more than "one correct answer." The purpose of this article is to highlight key elements related to these two core concepts and discuss genuine barriers that currently exist on the pathway to creating such assessments. These include acknowledging situations of uncertainty, creating clear frameworks that define progressive levels of clinical reasoning skills, providing validity evidence to increase the defensibility of such assessments, considering the comparative feasibility with other forms of assessment, and developing strategies to evaluate the impact of these assessment methods on future learning and practice. The authors recommend that concerted efforts be directed toward these key areas to help advance the field of clinical reasoning assessment, improve the clinical care decisions made by current and future physicians, and have positive outcomes for patients. It is anticipated that these and subsequent efforts will aid in reaching the goal of making future assessment in medical education more representative of current-day clinical reasoning and decision making.

  9. [Interpretation and use of routine pulmonary function tests: Spirometry, static lung volumes, lung diffusion, arterial blood gas, methacholine challenge test and 6-minute walk test].

    PubMed

    Bokov, P; Delclaux, C

    2016-02-01

    Resting pulmonary function tests (PFT) include the assessment of ventilatory capacity: spirometry (forced expiratory flows and mobilisable volumes) and static volume assessment, notably using body plethysmography. Spirometry allows the potential definition of an obstructive defect, while static volume assessment allows the potential definition of a restrictive defect (decrease in total lung capacity) and thoracic hyperinflation (increase in static volumes). It must be kept in mind that this evaluation is incomplete and that an assessment of ventilatory demand is often warranted, especially when facing dyspnoea: evaluation of arterial blood gas (searching for respiratory insufficiency) and measurement of the transfer coefficient of the lung, which, together with the measurement of alveolar volume, allows calculation of the diffusing capacity of the lung for CO (DLCO: assessment of the alveolar-capillary wall and capillary blood volume). All these pulmonary function tests have been the subject of an American-European task force (standardisation of lung function testing) published in 2005 and translated into French in 2007. Interpretative strategies for lung function tests have been recommended, which define abnormal lung function tests using the 5th and 95th percentiles of predicted values (lower and upper limits of normal values). Thus, these recommendations need to be implemented in all pulmonary function test units. A methacholine challenge test will only be performed in the presence of an intermediate pre-test probability for asthma (diagnostic uncertainty), which is an infrequent setting. The most convenient exertional test is the 6-minute walk test, which allows the assessment of walking performance, the search for arterial desaturation and the quantification of the dyspnoea complaint. Copyright © 2015 Société nationale française de médecine interne (SNFMI). Published by Elsevier SAS. All rights reserved.

  10. Multi-model seasonal forecast of Arctic sea-ice: forecast uncertainty at pan-Arctic and regional scales

    NASA Astrophysics Data System (ADS)

    Blanchard-Wrigglesworth, E.; Barthélemy, A.; Chevallier, M.; Cullather, R.; Fučkar, N.; Massonnet, F.; Posey, P.; Wang, W.; Zhang, J.; Ardilouze, C.; Bitz, C. M.; Vernieres, G.; Wallcraft, A.; Wang, M.

    2017-08-01

    Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or forecast post-processing (bias correction) techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.

  11. Uncertainty Quantification and Assessment of CO2 Leakage in Groundwater Aquifers

    NASA Astrophysics Data System (ADS)

    Carroll, S.; Mansoor, K.; Sun, Y.; Jones, E.

    2011-12-01

    The complexity of subsurface aquifers and the geochemical reactions that control drinking water compositions complicate our ability to estimate the impact of leaking CO2 on groundwater quality. We combined lithologic field data from the High Plains Aquifer, numerical simulations, and uncertainty quantification analysis to assess the role of aquifer heterogeneity and physical transport on the extent of the CO2-impacted plume over a 100-year period. The High Plains aquifer is a major aquifer over much of the central United States, where CO2 may be sequestered in depleted oil and gas reservoirs or deep saline formations. Input parameters considered included aquifer heterogeneity, permeability, porosity, regional groundwater flow, CO2 and TDS leakage rates over time, and the number of leakage source points. Sensitivity analysis suggests that variations in sand and clay permeability, correlation lengths, van Genuchten parameters, and CO2 leakage rate have the greatest impact on the impacted volume or the maximum distance from the leak source. A key finding is that the relative sensitivity of the parameters changes over the 100-year period. Reduced-order models developed from regression of the numerical simulations show that the volume of the CO2-impacted aquifer increases over time, with two orders of magnitude of variance.

  12. Bayesian updating in a fault tree model for shipwreck risk assessment.

    PubMed

    Landquist, H; Rosén, L; Lindhe, A; Norberg, T; Hassellöv, I-M

    2017-07-15

    Shipwrecks containing oil and other hazardous substances have been deteriorating on the seabeds of the world for many years and are threatening to pollute the marine environment. The status of the wrecks and the potential volume of harmful substances present in the wrecks are affected by a multitude of uncertainties. Each shipwreck poses a unique threat, the nature of which is determined by the structural status of the wreck and possible damage resulting from hazardous activities that could potentially cause a discharge. Decision support is required to ensure the efficiency of the prioritisation process and the allocation of resources required to carry out risk mitigation measures. Whilst risk assessments can provide the requisite decision support, comprehensive methods that take into account key uncertainties related to shipwrecks are limited. The aim of this paper was to develop a method for estimating the probability of discharge of hazardous substances from shipwrecks. The method is based on Bayesian updating of generic information on the hazards posed by different activities in the surroundings of the wreck, with information on site-specific and wreck-specific conditions in a fault tree model. Bayesian updating is performed using Monte Carlo simulations for estimating the probability of a discharge of hazardous substances and formal handling of intrinsic uncertainties. An example application involving two wrecks located off the Swedish coast is presented. Results show the estimated probability of opening, discharge and volume of the discharge for the two wrecks and illustrate the capability of the model to provide decision support. Together with consequence estimations of a discharge of hazardous substances, the suggested model enables comprehensive and probabilistic risk assessments of shipwrecks to be made. Copyright © 2017 Elsevier B.V. All rights reserved.
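
    A heavily simplified sketch of the approach: generic hazard frequencies encoded as Beta priors, Bayesian-updated with hypothetical site- and wreck-specific observations, then combined through an OR gate by Monte Carlo. All events, counts and priors below are invented for illustration and are not from the paper:

      import numpy as np

      rng = np.random.default_rng(7)
      N = 100_000

      def beta_posterior_samples(alpha, beta, hits, trials):
          # Conjugate Beta-binomial update of a generic hazard probability with
          # site-specific observations (hits out of trials).
          return rng.beta(alpha + hits, beta + trials - hits, N)

      p_trawling = beta_posterior_samples(1, 99, hits=2, trials=20)   # fishing activity
      p_corrosion = beta_posterior_samples(2, 48, hits=0, trials=5)   # structural decay

      # OR gate: the wreck opens if either mechanism causes a breach
      # (independence assumed).
      p_open = 1.0 - (1.0 - p_trawling) * (1.0 - p_corrosion)
      print(f"P(opening): median {np.median(p_open):.3f}, "
            f"90% CI [{np.percentile(p_open, 5):.3f}, {np.percentile(p_open, 95):.3f}]")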

  13. Long-term stormwater quantity and quality analysis using continuous measurements in a French urban catchment.

    PubMed

    Sun, Siao; Barraud, Sylvie; Castebrunet, Hélène; Aubin, Jean-Baptiste; Marmonier, Pierre

    2015-11-15

    The assessment of urban stormwater quantity and quality is important for evaluating and controlling the impact of the stormwater on natural water and the environment. This study mainly addresses the long-term evolution of stormwater quantity and quality in a French urban catchment using continuous measured data from 2004 to 2011. Storm event-based data series are obtained from the measured continuous time series (716 rainfall events and 521 runoff events are available). The Mann-Kendall test is applied to these event-based data series for trend detection. A lack of trend is found in rainfall, and an increasing trend in runoff is detected. As a result, an increasing trend is present in the runoff coefficient, likely due to growing imperviousness of the catchment caused by urbanization. The event mean concentration of the total suspended solid (TSS) in stormwater does not present a trend, whereas the event load of TSS has an increasing tendency, which is attributed to the increasing event runoff volume. Uncertainty analysis suggests that the major uncertainty in trend detection results lies in the availability of data: a lack of events due to missing data leads to dramatically increased uncertainty in trend detection results. In contrast, measurement uncertainty in the time series data plays a trivial role. The intra-event distribution of TSS is studied based on both M(V) curves and pollutant concentrations of absolute runoff volumes. The trend detection test reveals no significant change in intra-event distributions of TSS in the studied catchment. Copyright © 2015 Elsevier Ltd. All rights reserved.
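
    A minimal Mann-Kendall implementation (normal approximation, no tie correction; a real analysis would use the tie-corrected variance), applied to a synthetic runoff-coefficient series:

      import numpy as np
      from scipy.stats import norm

      def mann_kendall(x):
          # Mann-Kendall trend test: S statistic and two-sided p-value
          # (normal approximation with continuity correction, no tie handling).
          x = np.asarray(x)
          n = len(x)
          s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
          var_s = n * (n - 1) * (2 * n + 5) / 18.0
          z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
          return s, 2 * norm.sf(abs(z))

      # Illustrative: an upward-drifting series of annual runoff coefficients.
      rc = 0.35 + 0.01 * np.arange(8) + np.random.default_rng(3).normal(0, 0.02, 8)
      print(mann_kendall(rc))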

  14. Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urrego-Blanco, Jorge Rolando; Urban, Nathan Mark; Hunke, Elizabeth Clare

    Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. Lastly, it is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.

  15. Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model

    DOE PAGES

    Urrego-Blanco, Jorge Rolando; Urban, Nathan Mark; Hunke, Elizabeth Clare; ...

    2016-04-01

    Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. Lastly, it is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.

  16. Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model

    NASA Astrophysics Data System (ADS)

    Urrego-Blanco, Jorge R.; Urban, Nathan M.; Hunke, Elizabeth C.; Turner, Adrian K.; Jeffery, Nicole

    2016-04-01

    Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. It is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.
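
    A compact, self-contained version of the variance-based approach described above: first-order Sobol' indices via a Saltelli-style pick-and-freeze estimator, with a toy additive model standing in for the sea-ice emulator (all names and coefficients are illustrative):

      import numpy as np

      def sobol_first_order(f, n_params, n=4096, seed=0):
          # Monte Carlo first-order Sobol' indices for a model f acting on rows
          # of unit-hypercube samples (Saltelli-style estimator).
          rng = np.random.default_rng(seed)
          A = rng.random((n, n_params))
          B = rng.random((n, n_params))
          fA, fB = f(A), f(B)
          var = np.var(np.concatenate([fA, fB]))
          s1 = []
          for i in range(n_params):
              ABi = A.copy()
              ABi[:, i] = B[:, i]          # freeze parameter i from the B sample
              s1.append(np.mean(fB * (f(ABi) - fA)) / var)
          return np.array(s1)

      # Toy surrogate for an emulator: additive model with unequal sensitivities.
      model = lambda X: 4.0 * X[:, 0] + 1.0 * X[:, 1] + 0.25 * X[:, 2]
      print(sobol_first_order(model, 3))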

  17. Spent Fuel Working Group Report. Volume 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Toole, T.

    1993-11-01

    The Department of Energy is storing large amounts of spent nuclear fuel and other reactor irradiated nuclear materials (herein referred to as RINM). In the past, the Department reprocessed RINM to recover plutonium, tritium, and other isotopes. However, the Department has ceased or is phasing out reprocessing operations. As a consequence, Department facilities designed, constructed, and operated to store RINM for relatively short periods of time now store RINM pending decisions on the disposition of these materials. The extended use of the facilities, combined with their known degradation and that of their stored materials, has led to uncertainties about safety. To ensure that extended storage is safe (i.e., that protection exists for workers, the public, and the environment), the conditions of these storage facilities had to be assessed. The compelling need for such an assessment led to the Secretary's initiative on spent fuel, which is the subject of this report. This report comprises three volumes: Volume I, Summary Results of the Spent Fuel Working Group Evaluation; Volume II, Working Group Assessment Team Reports and Protocol; and Volume III, Operating Contractor Site Team Reports. This volume presents the overall results of the Working Group's evaluation. The group assessed 66 facilities spread across 11 sites. It identified: (1) facilities that should be considered for priority attention, (2) programmatic issues to be considered in decision making about interim storage plans, and (3) specific vulnerabilities for some of these facilities.

  18. An NTCP Analysis of Urethral Complications from Low Doserate Mono- and Bi-Radionuclide Brachytherapy.

    PubMed

    Nuttens, V E; Nahum, A E; Lucas, S

    2011-01-01

    Urethral NTCP has been determined for three prostates implanted with seeds based on (125)I (145 Gy), (103)Pd (125 Gy), (131)Cs (115 Gy), (103)Pd-(125)I (145 Gy), or (103)Pd-(131)Cs (115 Gy or 130 Gy). First, DU(20), meaning that 20% of the urethral volume receives a dose of at least DU(20), is converted into an I-125 LDR-equivalent DU(20) in order to use the urethral NTCP model. Second, the propagation of uncertainties through the steps in the NTCP calculation was assessed in order to identify the parameters responsible for large data uncertainties. Two sets of radiobiological parameters were studied. The NTCP results all fall in the 19%-23% range and are associated with large uncertainties, making the comparison difficult. Depending on the dataset chosen, the ranking of NTCP values among the six seed implants studied changes. Moreover, the large uncertainties in the fitting parameters of the urethral NTCP model result in large uncertainty in the NTCP value. In conclusion, the use of an NTCP model for permanent brachytherapy is feasible, but it is essential that the uncertainties in the parameters of the model be reduced.
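
    A sketch of the probit (Lyman-Kutcher-Burman) form commonly used for such NTCP evaluations; the paper's DU(20)-based model and fitted parameters differ, and the values below are purely illustrative:

      import numpy as np
      from scipy.stats import norm

      def lkb_ntcp(d_eff, td50, m):
          # Lyman-Kutcher-Burman NTCP: probit of the effective (here, the
          # LDR-equivalent) dose relative to the 50%-complication dose TD50.
          return norm.cdf((d_eff - td50) / (m * td50))

      # Hypothetical fitting parameters and an LDR-equivalent dose of 120 Gy.
      print(f"urethral NTCP ~ {lkb_ntcp(120.0, td50=150.0, m=0.20):.2f}")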

  19. Accounting for sensor calibration, data validation, measurement and sampling uncertainties in monitoring urban drainage systems.

    PubMed

    Bertrand-Krajewski, J L; Bardin, J P; Mourad, M; Béranger, Y

    2003-01-01

    Assessing the functioning and the performance of urban drainage systems on both rainfall event and yearly time scales is usually based on online measurements of flow rates and on samples of influent and effluent for some rainfall events per year. In order to draw pertinent scientific and operational conclusions from the measurement results, it is absolutely necessary to use appropriate methods and techniques in order to i) calibrate sensors and analytical methods, ii) validate raw data, iii) evaluate measurement uncertainties, iv) evaluate the number of rainfall events to sample per year in order to determine performance indicators with a given uncertainty. Based on previous work, the paper gives a synthetic review of the required methods and techniques, and illustrates their application to storage and settling tanks. Experiments show that, under controlled and careful experimental conditions, relative uncertainties are about 20% for flow rates in sewer pipes, 6-10% for volumes, 25-35% for TSS concentrations and loads, and 18-276% for TSS removal rates. In order to evaluate the annual pollutant interception efficiency of storage and settling tanks with a given uncertainty, efforts should first be devoted to decreasing the sampling uncertainty by increasing the number of sampled events.
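    The quoted relative uncertainties combine in the usual GUM fashion for derived quantities. A minimal sketch for the uncertainty of an event pollutant load L = C*V, assuming independent inputs; the concentration and volume values are invented, and the relative uncertainties are illustrative mid-range figures from the abstract.

```python
import numpy as np

# Illustrative relative standard uncertainties from the abstract:
u_rel_volume = 0.08   # volumes: 6-10%
u_rel_tss = 0.30      # TSS concentrations: 25-35%

C, V = 250.0, 1200.0          # mg/L and m3, hypothetical event values
load = C * V / 1000.0         # event TSS load in kg

# For a product L = C*V with independent inputs, relative variances add:
u_rel_load = np.sqrt(u_rel_volume**2 + u_rel_tss**2)
print(f"load = {load:.0f} kg +/- {u_rel_load * load:.0f} kg "
      f"({u_rel_load:.0%} relative)")
```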

  20. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sig Drellack, Lance Prothro

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial extent of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. With multiple alternative flow models advanced, the sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.

  1. Using stereo satellite imagery to account for ablation, entrainment, and compaction in volume calculations for rock avalanches on Glaciers: Application to the 2016 Lamplugh Rock Avalanche in Glacier Bay National Park, Alaska

    USGS Publications Warehouse

    Bessette-Kirton, Erin; Coe, Jeffrey A.; Zhou, Wendy

    2018-01-01

    The use of preevent and postevent digital elevation models (DEMs) to estimate the volume of rock avalanches on glaciers is complicated by ablation of ice before and after the rock avalanche, scour of material during rock avalanche emplacement, and postevent ablation and compaction of the rock avalanche deposit. We present a model to account for these processes in volume estimates of rock avalanches on glaciers. We applied our model by calculating the volume of the 28 June 2016 Lamplugh rock avalanche in Glacier Bay National Park, Alaska. We derived preevent and postevent 2‐m resolution DEMs from WorldView satellite stereo imagery. Using data from DEM differencing, we reconstructed the rock avalanche and adjacent surfaces at the time of occurrence by accounting for elevation changes due to ablation and scour of the ice surface, and postevent deposit changes. We accounted for uncertainties in our DEMs through precise coregistration and an assessment of relative elevation accuracy in bedrock control areas. The rock avalanche initially displaced 51.7 ± 1.5 Mm3 of intact rock and then scoured and entrained 13.2 ± 2.2 Mm3 of snow and ice during emplacement. We calculated the total deposit volume to be 69.9 ± 7.9 Mm3. Volume estimates that did not account for topographic changes due to ablation, scour, and compaction underestimated the deposit volume by 31.0–46.8 Mm3. Our model provides an improved framework for estimating uncertainties affecting rock avalanche volume measurements in glacial environments. These improvements can contribute to advances in the understanding of rock avalanche hazards and dynamics.
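    A schematic numpy version of the DEM-differencing volume calculation, with an elevation-error term anchored to bedrock control areas as described above; the grid size, elevation changes, and the 0.5 m error figure are invented, and the two bounds bracket the independent and fully correlated error cases.

```python
import numpy as np

cell = 2.0                      # DEM resolution, m (as in the study)
rng = np.random.default_rng(2)
dh = rng.normal(5.0, 2.0, (500, 500))   # synthetic elevation change, m

volume = dh.sum() * cell**2 / 1e6       # deposit volume, Mm3

# Uncertainty from the coregistration / bedrock-control error sigma_dh.
# If per-cell errors were independent the sum would scale with sqrt(N);
# spatially correlated error gives a conservative upper bound.
sigma_dh = 0.5                          # m, hypothetical control-area RMSE
n = dh.size
u_uncorr = sigma_dh * cell**2 * np.sqrt(n) / 1e6
u_corr = sigma_dh * cell**2 * n / 1e6
print(f"V = {volume:.1f} Mm3, u between {u_uncorr:.3f} and {u_corr:.1f} Mm3")
```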

  2. Relative Evaluation of the Independent Volume Measures of Caverns

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MUNSON,DARRELL E.

    2000-08-01

    Throughout the construction and operation of the caverns of the Strategic Petroleum Reserve (SPR), three types of cavern volume measurements have been maintained. These are: (1) the calculated solution volume determined during initial construction by solution mining and any subsequent solutioning during oil transfers, (2) the calculated sonar volume determined through sonar surveys of the cavern dimensions, and (3) the direct metering of oil to determine the volume of the cavern occupied by the oil. The objective of this study is to compare these measurements to each other and determine, if possible, the uncertainties associated with a given type of measurement. Over time, each type of measurement has acquired a customary, or industry-accepted, stated uncertainty. This uncertainty is not necessarily the result of a technical analysis. Ultimately there is one definitive quantity, the oil volume measured by the oil custody transfer meters, taken by all parties to the transfer as the correct ledger amount and for which the SPR Project is accountable. However, subsequent transfers within a site may not be with meters of the same accuracy. In this study, a very simple theory of the perfect relationship is used to evaluate the correlation (deviation) of the various measures. This theory permits separation of uncertainty and bias. Each of the four SPR sites is examined, first with comparisons between the calculated solution volumes and the sonar volumes determined during construction, then with comparisons of the oil inventories and the sonar volumes obtained either by surveying through brine prior to oil filling or through the oil directly.

  3. A Review of Methods Applied by the U.S. Geological Survey in the Assessment of Identified Geothermal Resources

    USGS Publications Warehouse

    Williams, Colin F.; Reed, Marshall J.; Mariner, Robert H.

    2008-01-01

    The U. S. Geological Survey (USGS) is conducting an updated assessment of geothermal resources in the United States. The primary method applied in assessments of identified geothermal systems by the USGS and other organizations is the volume method, in which the recoverable heat is estimated from the thermal energy available in a reservoir. An important focus in the assessment project is on the development of geothermal resource models consistent with the production histories and observed characteristics of exploited geothermal fields. The new assessment will incorporate some changes in the models for temperature and depth ranges for electric power production, preferred chemical geothermometers for estimates of reservoir temperatures, estimates of reservoir volumes, and geothermal energy recovery factors. Monte Carlo simulations are used to characterize uncertainties in the estimates of electric power generation. These new models for the recovery of heat from heterogeneous, fractured reservoirs provide a physically realistic basis for evaluating the production potential of natural geothermal reservoirs.
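    A compact Monte Carlo sketch of the volume method for recoverable heat, in the spirit of the approach described above; the reservoir distributions, heat capacity, and recovery-factor range are all hypothetical, not the assessment's values.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Hypothetical input distributions (not USGS values):
volume = rng.triangular(2e9, 5e9, 9e9, n)    # reservoir volume, m3
t_res = rng.normal(250.0, 15.0, n)           # reservoir temperature, C
t_ref = 15.0                                 # reference temperature, C
rho_c = 2.7e6                                # volumetric heat capacity, J/(m3 K)
recovery = rng.uniform(0.08, 0.2, n)         # geothermal energy recovery factor

q_recoverable = rho_c * volume * (t_res - t_ref) * recovery  # joules
p5, p50, p95 = np.percentile(q_recoverable, [5, 50, 95])
print(f"recoverable heat: P5={p5:.2e} J, P50={p50:.2e} J, P95={p95:.2e} J")
```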

  4. Flood Frequency Analysis using different flood descriptors - the Warsaw reach of the river Vistula case study

    NASA Astrophysics Data System (ADS)

    Karamuz, Emilia; Kochanek, Krzysztof; Romanowicz, Renata

    2014-05-01

    Flood frequency analysis (FFA) is customarily performed using annual maximum flows. However, there are a number of different flood descriptors that could be used. Among them are water levels, peaks over the threshold, flood-wave duration, flood volume, etc. In this study we compare different approaches to FFA for their suitability for flood risk assessment. The main goal is to obtain the FFA curve with the smallest possible uncertainty limits, in particular for the distribution tail. The extrapolation of FFA curves is crucial in future flood risk assessment in a changing climate. We compare the FFA curves together with their uncertainty limits obtained using flows, water levels, flood inundation area and volumes for the Warsaw reach of the river Vistula. Moreover, we derive the FFA curves obtained using simulated flows. The results are used to derive the error distribution for the maximum simulated and observed values under different modelling techniques and assess its influence on flood risk predictions for ungauged catchments. MIKE11, HEC-RAS and a transfer function model are applied in average and extreme conditions to model flow propagation in the Warsaw Vistula reach. The additional questions we want to answer are what the range of application of different modelling tools under various flow conditions is, and how the uncertainty of flood risk assessment can be decreased. This work was partly supported by the projects "Stochastic flood forecasting system (The River Vistula reach from Zawichost to Warsaw)" and "Modern statistical models for analysis of flood frequency and features of flood waves", carried out by the Institute of Geophysics, Polish Academy of Sciences on the order of the National Science Centre (contracts Nos. 2011/01/B/ST10/06866 and 2012/05/B/ST10/00482, respectively). The water level and flow data were provided by the Institute of Meteorology and Water Management (IMGW), Poland.
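    A minimal sketch of FFA on annual maxima, fitting a GEV distribution and bootstrapping the uncertainty of the 100-year quantile; the synthetic record merely stands in for gauge data and has no connection to the Vistula series.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(4)
# Synthetic annual maximum flows (m3/s); a real analysis uses gauge records.
ams = genextreme.rvs(-0.1, loc=2500, scale=800, size=60, random_state=rng)

def q100(sample):
    # Fit GEV and return the 1% annual exceedance (100-year) quantile.
    c, loc, scale = genextreme.fit(sample)
    return genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)

est = q100(ams)
boot = [q100(rng.choice(ams, ams.size, replace=True)) for _ in range(1000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"100-year flood: {est:.0f} m3/s (95% CI {lo:.0f}-{hi:.0f})")
```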

  5. Volume determination of two spheres of the new 28Si crystal of PTB

    NASA Astrophysics Data System (ADS)

    Nicolaus, A.; Bartl, G.; Peter, A.; Kuhn, E.; Mai, T.

    2017-08-01

    Within the scope of the redetermination of Avogadro’s constant N_A, a new isotopically enriched silicon crystal has been produced, from which two spheres were manufactured. After the crystal properties, the lattice parameter and molar mass, as well as the masses of the two spheres had been determined, the volume of the spheres was also measured. For this, the sphere interferometer of PTB was used. The methods of the interferometric measurements have been improved and the major contributions to the uncertainty have been investigated thoroughly. As a result, the total uncertainty could be reduced significantly, yielding a substantial impact on the determination of Avogadro’s constant. The mean diameter of each sphere was measured twice with a repeatability of ±2 × 10⁻¹⁰, and the relative uncertainty of the ‘apparent’ volume, which disregards the comparatively small influence of the optical effects of surface layers, was reduced to 7 × 10⁻⁹. The final results of the volumes and comments on their uncertainties are given.

  6. DSM-flux: A new technology for reliable Combined Sewer Overflow discharge monitoring with low uncertainties.

    PubMed

    Maté Marín, Ainhoa; Rivière, Nicolas; Lipeme Kouyi, Gislain

    2018-06-01

    In the past ten years, governments from the European Union have been encouraged to collect volume and quality data for all the effluent overflows from separated stormwater and combined sewer systems that result in a significant environmental impact on receiving water bodies. Methods to monitor and control these flows require improvements, particularly for complex Combined Sewer Overflow (CSO) structures. The DSM-flux (Device for Stormwater and combined sewer flows Monitoring and the control of pollutant fluxes) is a new pre-designed and pre-calibrated channel that provides hydraulic conditions suitable for the measurement of overflow rates and volumes by means of a single water level gauge. In this paper, a stage-discharge relation for the DSM-flux is obtained experimentally and validated for multiple inflow hydraulic configurations. Uncertainties in CSO discharges and volumes are estimated within the Guide to the expression of Uncertainty in Measurement (GUM) framework. Whatever the upstream hydraulic conditions are, relative uncertainties are lower than 15% and 2% for the investigated discharges and volumes, respectively. Copyright © 2018 Elsevier Ltd. All rights reserved.
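    A sketch of GUM-style first-order propagation through a power-law stage-discharge relation of the generic form Q = a*(h - h0)^b; the coefficients and uncertainty figures below are placeholders, not the DSM-flux calibration.

```python
import numpy as np

# Hypothetical calibrated rating parameters for a DSM-flux-like channel:
a, b, h0 = 1.8, 1.5, 0.02      # Q = a * (h - h0)**b, SI units

def discharge(h):
    return a * (h - h0) ** b

h = 0.35                       # measured water level, m
u_h = 0.003                    # standard uncertainty of the gauge, m
u_rel_fit = 0.05               # relative uncertainty of the fitted curve

q = discharge(h)
# GUM first-order propagation: u_Q^2 = (dQ/dh)^2 * u_h^2 + (fit term)^2
dq_dh = a * b * (h - h0) ** (b - 1)
u_q = np.sqrt((dq_dh * u_h) ** 2 + (u_rel_fit * q) ** 2)
print(f"Q = {q:.4f} m3/s +/- {u_q:.4f} m3/s ({u_q / q:.1%})")
```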

  7. Anthropometric approaches and their uncertainties to assigning computational phantoms to individual patients in pediatric dosimetry studies

    NASA Astrophysics Data System (ADS)

    Whalen, Scott; Lee, Choonsik; Williams, Jonathan L.; Bolch, Wesley E.

    2008-01-01

    Current efforts to reconstruct organ doses in children undergoing diagnostic imaging or therapeutic interventions using ionizing radiation typically rely upon the use of reference anthropomorphic computational phantoms coupled to Monte Carlo radiation transport codes. These phantoms are generally matched to individual patients based upon nearest age or sometimes total body mass. In this study, we explore alternative methods of phantom-to-patient matching with the goal of identifying those methods which yield the lowest residual errors in internal organ volumes. Various thoracic and abdominal organs were segmented and organ volumes obtained from chest-abdominal-pelvic (CAP) computed tomography (CT) image sets from 38 pediatric patients ranging in age from 2 months to 15 years. The organs segmented included the skeleton, heart, kidneys, liver, lungs and spleen. For each organ, least-squared regression lines, 95th percentile confidence intervals and 95th percentile prediction intervals were established as a function of patient age, trunk volume, estimated trunk mass, trunk height, and three estimates of the ventral body cavity volume based on trunk height alone, or in combination with circumferential, width and/or breadth measurements in the mid-chest of the patient. When matching phantom to patient based upon age, residual uncertainties in organ volumes ranged from 53% (lungs) to 33% (kidneys), and when trunk mass was used (surrogate for total body mass as we did not have images of patient head, arms or legs), these uncertainties ranged from 56% (spleen) to 32% (liver). When trunk height is used as the matching parameter, residual uncertainties in organ volumes were reduced to between 21 and 29% for all organs except the spleen (40%). In the case of the lungs and skeleton, the two-fold reduction in organ volume uncertainties was seen in moving from patient age to trunk height, a parameter easily measured in the clinic. When ventral body cavity volumes were used, residual uncertainties were lowered even further to a range of between 14 and 20% for all organs except the spleen, which continued to remain at around 40%. The results of this study suggest that a more anthropometric pairing of computational phantom to individual patient based on simple measurements of trunk height and possibly mid-chest circumference or thickness (where influences of subcutaneous fat are minimized) can lead to significant reductions in organ volume uncertainties: ranges of 40-50% (based on patient age) to between 15 and 20% (based on body cavity volumes tied to trunk height). An expanded series of non-uniform rational B-spline (NURBS) pediatric phantoms are being created at the University of Florida to allow the full application of this new approach in pediatric medical imaging studies.
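    The matching residuals above are essentially prediction-interval half-widths from least-squares regression. A self-contained sketch of a 95% prediction interval for a new patient; the trunk-height/liver-volume relation and noise level are synthetic, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Synthetic stand-ins: trunk height (cm) vs liver volume (cm3), 38 patients.
x = rng.uniform(30, 70, 38)
y = 25 * x + rng.normal(0, 120, x.size)

res = stats.linregress(x, y)
n = x.size
resid = y - (res.intercept + res.slope * x)
s = np.sqrt(np.sum(resid**2) / (n - 2))   # residual standard error

x_new = 55.0
y_hat = res.intercept + res.slope * x_new
# 95% prediction interval for a single new observation at x_new:
se_pred = s * np.sqrt(1 + 1/n + (x_new - x.mean())**2 / np.sum((x - x.mean())**2))
t = stats.t.ppf(0.975, n - 2)
print(f"predicted {y_hat:.0f} cm3, 95% PI +/- {t * se_pred:.0f} cm3")
```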

  8. Mercury study report to Congress. Volume 4. Health effects of mercury and mercury compounds. Sab review draft

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schoeny, R.

    1996-06-01

    This volume of the draft Mercury Study Report to Congress summarizes the available information on human health effects and animal data for hazard identification and dose-response assessment for three forms of mercury: elemental mercury, mercury chloride (inorganic mercury), and methylmercury (organic mercury). Effects are summarized by endpoint. The risk assessment evaluates carcinogenicity, mutagenicity, developmental toxicity and general systemic toxicity of these chemical species of mercury. Toxicokinetics (absorption, distribution, metabolism and excretion) are described for each of the three mercury species. PBPK models are described, but not applied in risk assessment. Reference doses are calculated for inorganic and methylmercury; a reference concentration for inhaled elemental mercury is provided. A quantitative analysis of factors contributing to variability and uncertainty in the methylmercury RfD is provided in an appendix. Interactions and sensitive populations are described.

  9. Joint Peru/United States report on Peru/United States cooperative energy assessment. Volume 1. Executive summary, main report and appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1979-08-01

    In 1978, the US and Peru conducted a comprehensive assessment of Peru's energy resources, needs, and uses and developed several alternative energy strategies that utilize the available resources to meet their energy requirements. This Volume I reports the findings of the assessment and contains the executive summary, the main report, and five appendices of information that support the integrated energy supply and demand analysis. The following chapters are included: The Energy Situation in Peru (economic context and background, energy resources and production, energy consumption patterns); Reference Supply and Demand Projection (approach, procedures, and assumptions; economic projections; energy demand and supply projections; supply/demand integration; uncertainties); and The Development of Strategies and Options (the analysis of options; strategies; increased use of renewables, hydropower, coal; increased energy efficiency; and financial analysis of strategies).

  10. Beam-specific planning volumes for scattered-proton lung radiotherapy

    NASA Astrophysics Data System (ADS)

    Flampouri, S.; Hoppe, B. S.; Slopsema, R. L.; Li, Z.

    2014-08-01

    This work describes the clinical implementation of a beam-specific planning treatment volume (bsPTV) calculation for lung cancer proton therapy and its integration into the treatment planning process. Uncertainties incorporated in the calculation of the bsPTV included setup errors, machine delivery variability, breathing effects, inherent proton range uncertainties and combinations of the above. Margins were added for translational and rotational setup errors and breathing motion variability during the course of treatment, as well as for their effect on the proton range of each treatment field. The effect of breathing motion and deformation on the proton range was calculated from 4D computed tomography data. Range uncertainties were considered taking into account the individual voxel HU uncertainty along each proton beamlet. Beam-specific treatment volumes generated for 12 patients were used: a) as planning targets, b) for routine plan evaluation, c) to aid beam angle selection and d) to create beam-specific margins for organs at risk to ensure sparing. The alternative planning technique based on the bsPTVs produced target coverage similar to that of the conventional proton plans while better sparing the surrounding tissues. Conventional proton plans were evaluated by comparing the dose distributions per beam with the corresponding bsPTV. The bsPTV volume as a function of beam angle revealed some unexpected sources of uncertainty and could help the planner choose more robust beams. A beam-specific planning volume for the spinal cord was used for dose distribution shaping to ensure organ sparing laterally and distally to the beam.

  11. Breaking through the uncertainty ceiling in LA-ICP-MS U-Pb geochronology

    NASA Astrophysics Data System (ADS)

    Horstwood, M.

    2016-12-01

    Sources of systematic uncertainty associated with session-to-session bias are the dominant contributor to the 2% (2s) uncertainty ceiling that currently limits the accuracy of LA-ICP-MS U-Pb geochronology. Sources include differential downhole fractionation (LIEF), 'matrix effects' and ablation volume differences, which result in irreproducibility of the same reference material across sessions. Current mitigation methods include correcting for LIEF mathematically, using matrix-matched reference materials, annealing material to reduce or eliminate radiation damage effects and tuning for robust plasma conditions. Reducing the depth and volume of ablation can also mitigate these problems and should contribute to the reduction of the uncertainty ceiling. Reducing analysed volume leads to increased detection efficiency, reduced matrix-effects, eliminates LIEF, obviates ablation rate differences and reduces the likelihood of intercepting complex growth zones with depth, thereby apparently improving material homogeneity. High detection efficiencies (% level) and low sampling volumes (20 µm box, 1-2 µm deep) can now be achieved using MC-ICP-MS such that low volume ablations should be considered part of the toolbox of methods targeted at improving the reproducibility of LA-ICP-MS U-Pb geochronology. In combination with other strategies these improvements should be feasible on any ICP platform. However, reducing the volume of analysis reduces detected counts and requires a change of analytical approach in order to mitigate this. Appropriate strategies may include the use of high efficiency cell and torch technologies and the optimisation of acquisition protocols and data handling techniques such as condensing signal peaks, using log ratios and total signal integration. The tools required to break the 2% (2s) uncertainty ceiling in LA-ICP-MS U-Pb geochronology are likely now known but require a coherent strategy and change of approach to combine their implementation and realise this goal. This study will highlight these changes and efforts towards reducing the uncertainty contribution for LA-ICP-MS U-Pb geochronology.

  12. Uncertainty in the Himalayan energy-water nexus: estimating regional exposure to glacial lake outburst floods

    NASA Astrophysics Data System (ADS)

    Schwanghart, Wolfgang; Worni, Raphael; Huggel, Christian; Stoffel, Markus; Korup, Oliver

    2016-07-01

    Himalayan water resources attract a rapidly growing number of hydroelectric power projects (HPP) to satisfy Asia’s soaring energy demands. Yet HPP operating or planned in steep, glacier-fed mountain rivers face hazards of glacial lake outburst floods (GLOFs) that can damage hydropower infrastructure, alter water and sediment yields, and compromise livelihoods downstream. Detailed appraisals of such GLOF hazards are limited to case studies, however, and a more comprehensive, systematic analysis remains elusive. To this end we estimate the regional exposure of 257 Himalayan HPP to GLOFs, using a flood-wave propagation model fed by Monte Carlo-derived outburst volumes of >2300 glacial lakes. We interpret the spread of thus modeled peak discharges as a predictive uncertainty that arises mainly from outburst volumes and dam-breach rates that are difficult to assess before dams fail. With 66% of sampled HPP on potential GLOF tracks, up to one third of these HPP could experience GLOF discharges well above local design floods, as hydropower development continues to seek higher sites closer to glacial lakes. We compute that this systematic push of HPP into headwaters effectively doubles the uncertainty about GLOF peak discharge in these locations. Peak discharges farther downstream, in contrast, are easier to predict because GLOF waves attenuate rapidly. Considering this systematic pattern of regional GLOF exposure might aid the site selection of future Himalayan HPP. Our method can augment, and help to regularly update, current hazard assessments, given that global warming is likely changing the number and size of Himalayan meltwater lakes.
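    A toy version of the Monte Carlo outburst-discharge chain for a single lake, using a generic power-law volume-to-peak-discharge relation; the lognormal volume distribution and the coefficients k and m are placeholders, not the study's calibrated values.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 10_000

# Hypothetical lognormal outburst-volume distribution for one lake (m3):
volume = rng.lognormal(mean=np.log(5e6), sigma=0.8, size=n)

# Generic empirical power law Qp = k * V**m linking outburst volume to
# peak discharge; k and m here are invented for illustration.
k, m = 0.0048, 0.896
qp = k * volume ** m

p5, p50, p95 = np.percentile(qp, [5, 50, 95])
print(f"GLOF peak discharge: P5={p5:.0f}, P50={p50:.0f}, P95={p95:.0f} m3/s")
```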

  13. Measurement uncertainty of ester number, acid number and patchouli alcohol of patchouli oil produced in Yogyakarta

    NASA Astrophysics Data System (ADS)

    Istiningrum, Reni Banowati; Saepuloh, Azis; Jannah, Wirdatul; Aji, Didit Waskito

    2017-03-01

    Yogyakarta is one of the patchouli oil distillation centers in Indonesia. The quality of patchouli oil greatly affects its market price. Therefore, testing the quality parameters of patchouli oil is an important concern, in part through determination of the measurement uncertainty. This study determines the measurement uncertainty of the ester number, the acid number and the patchouli alcohol content through a bottom-up approach. Contributors to the measurement uncertainty of the ester number are the sample mass, the blank and sample titration volumes, the molar mass of KOH, the HCl normality, and replication. Contributors to the measurement uncertainty of the acid number are the sample mass, the sample titration volume, the relative molar mass and normality of KOH, and repetition. Determination of patchouli alcohol by gas chromatography considers only repeatability as a source of measurement uncertainty because reference materials are not available.
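    A bottom-up combination in the GUM sense for the acid number, AN = 56.1*N*V/m (mg KOH per g of sample): for a purely multiplicative model, relative standard uncertainties add in quadrature. The measured values and uncertainties below are invented for illustration.

```python
import numpy as np

# Acid number AN = 56.1 * N_KOH * V_titrant / m_sample (mg KOH per g)
N_koh, u_N = 0.1012, 0.0008      # KOH normality and its standard uncertainty
V_t, u_V = 2.35, 0.02            # titration volume, mL
m, u_m = 5.0021, 0.0004          # sample mass, g
u_rep_rel = 0.006                # relative repeatability contribution

an = 56.1 * N_koh * V_t / m
# Multiplicative model: relative standard uncertainties add in quadrature.
u_rel = np.sqrt((u_N / N_koh)**2 + (u_V / V_t)**2 +
                (u_m / m)**2 + u_rep_rel**2)
print(f"acid number = {an:.3f} +/- {2 * u_rel * an:.3f} mg KOH/g (k=2)")
```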

  14. Assessing the robustness of passive scattering proton therapy with regard to local recurrence in stage III non-small cell lung cancer: a secondary analysis of a phase II trial.

    PubMed

    Zhu, Zhengfei; Liu, Wei; Gillin, Michael; Gomez, Daniel R; Komaki, Ritsuko; Cox, James D; Mohan, Radhe; Chang, Joe Y

    2014-05-06

    We assessed the robustness of passive scattering proton therapy (PSPT) plans for patients in a phase II trial of PSPT for stage III non-small cell lung cancer (NSCLC) by using the worst-case scenario method, and compared the worst-case dose distributions with the appearance of locally recurrent lesions. Worst-case dose distributions were generated for each of 9 patients who experienced recurrence after concurrent chemotherapy and PSPT to 74 Gy(RBE) for stage III NSCLC by simulating and incorporating uncertainties associated with set-up, respiration-induced organ motion, and proton range in the planning process. The worst-case CT scans were then fused with the positron emission tomography (PET) scans to locate the recurrence. Although the volumes enclosed by the prescription isodose lines in the worst-case dose distributions were consistently smaller than enclosed volumes in the nominal plans, the target dose coverage was not significantly affected: only one patient had a recurrence outside the prescription isodose lines in the worst-case plan. PSPT is a relatively robust technique. Local recurrence was not associated with target underdosage resulting from estimated uncertainties in 8 of 9 cases.

  15. Variability and epistemic uncertainty in water ingestion rates and pharmacokinetic parameters, and impact on the association between perfluorooctanoate and preeclampsia in the C8 Health Project population.

    PubMed

    Avanasi, Raghavendhran; Shin, Hyeong-Moo; Vieira, Veronica M; Bartell, Scott M

    2016-04-01

    We recently utilized a suite of environmental fate and transport models and an integrated exposure and pharmacokinetic model to estimate individual perfluorooctanoate (PFOA) serum concentrations, and also assessed the association of those concentrations with preeclampsia for participants in the C8 Health Project (a cross-sectional study of over 69,000 people who were environmentally exposed to PFOA near a major U.S. fluoropolymer production facility located in West Virginia). However, the exposure estimates from this integrated model relied on default values for key independent exposure parameters including water ingestion rates, the serum PFOA half-life, and the volume of distribution for PFOA. The aim of the present study is to assess the impact of inter-individual variability and epistemic uncertainty in these parameters on the exposure estimates and subsequently, the epidemiological association between PFOA exposure and preeclampsia. We used Monte Carlo simulation to propagate inter-individual variability/epistemic uncertainty in the exposure assessment and reanalyzed the epidemiological association. Inter-individual variability in these parameters mildly impacted the serum PFOA concentration predictions (the lowest mean rank correlation between the estimated serum concentrations in our study and the original predicted serum concentrations was 0.95) and there was a negligible impact on the epidemiological association with preeclampsia (no change in the mean adjusted odds ratio (AOR) and the contribution of exposure uncertainty to the total uncertainty including sampling variability was 7%). However, when epistemic uncertainty was added along with the inter-individual variability, serum PFOA concentration predictions and their association with preeclampsia were moderately impacted (the mean AOR of preeclampsia occurrence was reduced from 1.12 to 1.09, and the contribution of exposure uncertainty to the total uncertainty was increased up to 33%). In conclusion, our study shows that the change of the rank exposure among the study participants due to variability and epistemic uncertainty in the independent exposure parameters was large enough to cause a 25% bias towards the null. This suggests that the true AOR of the association between PFOA and preeclampsia in this population might be higher than the originally reported AOR and has more uncertainty than indicated by the originally reported confidence interval. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Buoyancy contribution to uncertainty of mass, conventional mass and force

    NASA Astrophysics Data System (ADS)

    Malengo, Andrea; Bich, Walter

    2016-04-01

    The conventional mass is a useful concept introduced to reduce the impact of the buoyancy correction in everyday mass measurements, thus avoiding in most cases its accurate determination, which is necessary in measurements of ‘true’ mass. Although usage of conventional mass is universal and standardized, the concept is considered a sort of second-choice tool, to be avoided in high-accuracy applications. In this paper we show that this is a false belief, by elucidating the role played by covariances between volume and mass and between volume and conventional mass at the various stages of the dissemination chain and in the relationship between the uncertainties of mass and conventional mass. We arrive at somewhat counter-intuitive results: the volume of the transfer standard plays a comparatively minor role in the uncertainty budget of the standard under calibration. In addition, conventional mass is preferable to mass in normal, in-air operation, as its uncertainty is smaller than that of mass if covariance terms are properly taken into account, and the overstatement of uncertainty that (typically) results from neglecting them is less severe than the overstatement that (always) occurs with mass. The same considerations hold for force. In this respect, we show that the associated uncertainty is the same using mass or conventional mass, and, again, that the latter is preferable if covariance terms are neglected.
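    For reference, a small sketch of the conventional-mass convention that underlies the discussion: the mass of a hypothetical 8000 kg/m³ standard that would balance the body in air of density 1.2 kg/m³ at 20 °C (the OIML D 28 definition, to my understanding); the example density is arbitrary.

```python
RHO_AIR_REF = 1.2    # kg/m3, conventional reference air density
RHO_REF = 8000.0     # kg/m3, conventional reference density of the standard

def conventional_mass(true_mass_kg, density_kg_m3):
    # Equal buoyancy-corrected weight in reference air defines the
    # conventional mass of the body.
    buoyancy_body = 1.0 - RHO_AIR_REF / density_kg_m3
    buoyancy_ref = 1.0 - RHO_AIR_REF / RHO_REF
    return true_mass_kg * buoyancy_body / buoyancy_ref

# A 1 kg stainless-steel standard with density 7950 kg/m3 (illustrative):
m_c = conventional_mass(1.0, 7950.0)
print(f"conventional mass = {m_c:.9f} kg, "
      f"offset from true mass = {(m_c - 1.0) * 1e6:.3f} mg")
```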

  17. SU-E-J-159: Intra-Patient Deformable Image Registration Uncertainties Quantified Using the Distance Discordance Metric

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saleh, Z; Thor, M; Apte, A

    2014-06-01

    Purpose: The quantitative evaluation of deformable image registration (DIR) is currently challenging due to the lack of a ground truth. In this study we test a new method proposed for quantifying multiple-image based DIR-related uncertainties, for DIR of pelvic images. Methods: 19 patients who had previously received radiotherapy for prostate cancer were analyzed, each with 6 CT scans. Manually delineated structures for the rectum and bladder, which served as ground truth structures, were delineated on the planning CT and each subsequent scan. For each patient, voxel-by-voxel DIR-related uncertainties were evaluated, following B-spline based DIR, by applying a previously developed metric, the distance discordance metric (DDM; Saleh et al., PMB (2014) 59:733). The DDM map was superimposed on the first acquired CT scan and DDM statistics were assessed, also relative to two metrics estimating the agreement between the propagated and the manually delineated structures. Results: The highest DDM values, which correspond to the greatest spatial uncertainties, were observed near the body surface and in the bowel due to the presence of gas. The mean rectal and bladder DDM values ranged from 1.1–11.1 mm and 1.5–12.7 mm, respectively. There was a strong correlation in the DDMs between the rectum and bladder (Pearson R = 0.68 for the max DDM). For both structures, DDM was correlated with the ratio between the DIR-propagated and manually delineated volumes (R = 0.74 for the max rectal DDM). The maximum rectal DDM was negatively correlated with the Dice Similarity Coefficient between the propagated and the manually delineated volumes (R = −0.52). Conclusion: The multiple-image based DDM map quantified considerable DIR variability across different structures and among patients. Besides using the DDM for quantifying DIR-related uncertainties, it could potentially be used to adjust for uncertainties in DIR-based accumulated dose distributions.

  18. Estimating the volume of glaciers in the Himalayan-Karakoram region using different methods

    NASA Astrophysics Data System (ADS)

    Frey, H.; Machguth, H.; Huss, M.; Huggel, C.; Bajracharya, S.; Bolch, T.; Kulkarni, A.; Linsbauer, A.; Salzmann, N.; Stoffel, M.

    2014-12-01

    Ice volume estimates are crucial for assessing water reserves stored in glaciers. Due to its large glacier coverage, such estimates are of particular interest for the Himalayan-Karakoram (HK) region. In this study, different existing methodologies are used to estimate the ice reserves: three area-volume relations, one slope-dependent volume estimation method, and two ice-thickness distribution models are applied to a recent, detailed, and complete glacier inventory of the HK region, spanning the period 2000-2010 and revealing an ice coverage of 40 775 km2. An uncertainty and sensitivity assessment is performed to investigate the influence of the observed glacier area and important model parameters on the resulting total ice volume. Results of the two ice-thickness distribution models are validated with local ice-thickness measurements at six glaciers. The resulting ice volumes for the entire HK region range from 2955 to 4737 km3, depending on the approach. This range is lower than most previous estimates. Results from the ice-thickness distribution models and the slope-dependent thickness estimations agree well with measured local ice thicknesses. However, total volume estimates from area-related relations are larger than those from other approaches. The study provides evidence on the significant effect of the selected method on results and underlines the importance of a careful and critical evaluation.
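    The area-related relations mentioned above are typically power laws of the form V = c*A^gamma. A minimal sketch with commonly quoted valley-glacier coefficients (indicative values only; studies of this kind calibrate and vary them to explore sensitivity), applied to an invented mini-inventory.

```python
import numpy as np

# Volume-area scaling V = c * A**gamma; c and gamma as commonly quoted
# for valley glaciers (V in km3 when A is in km2), used here illustratively.
c, gamma = 0.034, 1.375

def glacier_volume_km3(area_km2):
    return c * area_km2 ** gamma

areas = np.array([0.5, 2.0, 25.0, 310.0])   # hypothetical inventory, km2
total = glacier_volume_km3(areas).sum()
print(f"total ice volume ~ {total:.1f} km3")

# Sensitivity of the total to the scaling exponent:
for g in (1.3, 1.375, 1.45):
    print(f"gamma = {g}: total = {(c * areas ** g).sum():.1f} km3")
```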

  19. Modification, calibration, and performance of the Ultra-High Sensitivity Aerosol Spectrometer for particle size distribution and volatility measurements during the Atmospheric Tomography Mission (ATom) airborne campaign

    NASA Astrophysics Data System (ADS)

    Kupc, Agnieszka; Williamson, Christina; Wagner, Nicholas L.; Richardson, Mathews; Brock, Charles A.

    2018-01-01

    Atmospheric aerosol is a key component of the chemistry and climate of the Earth's atmosphere. Accurate measurement of the concentration of atmospheric particles as a function of their size is fundamental to investigations of particle microphysics, optical characteristics, and chemical processes. We describe the modification, calibration, and performance of two commercially available, Ultra-High Sensitivity Aerosol Spectrometers (UHSASs) as used on the NASA DC-8 aircraft during the Atmospheric Tomography Mission (ATom). To avoid sample flow issues related to pressure variations during aircraft altitude changes, we installed a laminar flow meter on each instrument to measure sample flow directly at the inlet as well as flow controllers to maintain constant volumetric sheath flows. In addition, we added a compact thermodenuder operating at 300 °C to the inlet line of one of the instruments. With these modifications, the instruments are capable of making accurate (ranging from 7 % for Dp < 0.07 µm to 1 % for Dp > 0.13 µm), precise (< ±1.2 %), and continuous (1 Hz) measurements of size-resolved particle number concentration over the diameter range of 0.063-1.0 µm at ambient pressures of > 1000 to 225 hPa, while simultaneously providing information on particle volatility. We assessed the effect of uncertainty in the refractive index (n) of ambient particles that are sized by the UHSAS assuming the refractive index of ammonium sulfate (n = 1.52). For calibration particles with n between 1.44 and 1.58, the UHSAS diameter varies by +4/-10 % relative to ammonium sulfate. This diameter uncertainty associated with the range of refractive indices (i.e., particle composition) translates to aerosol surface area and volume uncertainties of +8.4/-17.8 and +12.4/-27.5 %, respectively. In addition to sizing uncertainty, low counting statistics can lead to uncertainties of < 20 % for aerosol surface area and < 30 % for volume with 10 s time resolution. The reduction in UHSAS counting efficiency was corrected for concentrations > 1000 cm⁻³. Examples of thermodenuded and non-thermodenuded aerosol number and volume size distributions as well as propagated uncertainties are shown for several cases encountered during the ATom project. Uncertainties in particle number concentration were limited by counting statistics, especially in the tropical upper troposphere where accumulation-mode concentrations were sometimes < 20 cm⁻³ (counting rates ~5 Hz) at standard temperature and pressure.

  20. Accurate green water loads calculation using naval hydro pack

    NASA Astrophysics Data System (ADS)

    Jasak, H.; Gatin, I.; Vukčević, V.

    2017-12-01

    An extensive verification and validation of the Finite Volume based CFD software Naval Hydro, built on foam-extend, is presented in this paper for green water loads. A two-phase numerical model with advanced methods for treating the free surface is employed. Pressure loads on the horizontal deck of a Floating Production Storage and Offloading (FPSO) vessel model are compared to experimental results from [1] for three incident regular waves. Pressure peaks and integrals of pressure in time are measured at ten different locations on the deck for each case. Pressure peaks and integrals are evaluated as average values among the measured incident wave periods, where the periodic uncertainty is assessed for both numerical and experimental results. A spatial and temporal discretization refinement study is performed, providing numerical discretization uncertainties.

  1. Assessing state-of-the-art capabilities for probing the atmospheric boundary layer: The XPIA field campaign

    DOE PAGES

    Lundquist, Julie K.; Wilczak, James M.; Ashton, Ryan; ...

    2017-03-07

    To assess current capabilities for measuring flow within the atmospheric boundary layer, including within wind farms, the U.S. Dept. of Energy sponsored the eXperimental Planetary boundary layer Instrumentation Assessment (XPIA) campaign at the Boulder Atmospheric Observatory (BAO) in spring 2015. Herein, we summarize the XPIA field experiment, highlight novel measurement approaches, and quantify uncertainties associated with these measurement methods. Line-of-sight velocities measured by scanning lidars and radars exhibit close agreement with tower measurements, despite differences in measurement volumes. Virtual towers of wind measurements, from multiple lidars or radars, also agree well with tower and profiling lidar measurements. Estimates of winds over volumes from scanning lidars and radars are in close agreement, enabling assessment of spatial variability. Strengths of the radar systems used here include high scan rates, large domain coverage, and availability during most precipitation events, but they struggle at times to provide data during periods with limited atmospheric scatterers. In contrast, for the deployment geometry tested here, the lidars have slower scan rates and less range, but provide more data during non-precipitating atmospheric conditions. Microwave radiometers provide temperature profiles with approximately the same uncertainty as Radio-Acoustic Sounding Systems (RASS). Using a motion platform, we assess motion-compensation algorithms for lidars to be mounted on offshore platforms. Finally, we highlight cases for validation of mesoscale or large-eddy simulations, providing information on accessing the archived dataset. We conclude that modern remote sensing systems provide a generational improvement in observational capabilities, enabling resolution of fine-scale processes critical to understanding inhomogeneous boundary-layer flows.

  2. Reduced order models for prediction of groundwater quality impacts from CO₂ and brine leakage

    DOE PAGES

    Zheng, Liange; Carroll, Susan; Bianchi, Marco; ...

    2014-12-31

    A careful assessment of the risk associated with geologic CO₂ storage is critical to the deployment of large-scale storage projects. A potential risk is the deterioration of groundwater quality caused by the leakage of CO₂ and brine from deep subsurface reservoirs. In probabilistic risk assessment studies, numerical modeling is the primary tool employed to assess risk. However, the application of traditional numerical models to fully evaluate the impact of CO₂ leakage on groundwater can be computationally complex, demanding large processing times and resources, and involving large uncertainties. As an alternative, reduced order models (ROMs) can be used as highly efficient surrogates for the complex process-based numerical models. In this study, we represent the complex hydrogeological and geochemical conditions in a heterogeneous aquifer and subsequent risk by developing and using two separate ROMs. The first ROM is derived from a model that accounts for the heterogeneous flow and transport conditions in the presence of complex leakage functions for CO₂ and brine. The second ROM is obtained from models that feature similar, but simplified flow and transport conditions, and allow for a more complex representation of all relevant geochemical reactions. To quantify possible impacts to groundwater aquifers, the basic risk metric is taken as the aquifer volume in which the water quality of the aquifer may be affected by an underlying CO₂ storage project. The integration of the two ROMs provides an estimate of the impacted aquifer volume taking into account uncertainties in flow, transport and chemical conditions. These two ROMs can be linked in a comprehensive system level model for quantitative risk assessment of the deep storage reservoir, wellbore leakage, and shallow aquifer impacts to assess the collective risk of CO₂ storage projects.

  3. Assessing state-of-the-art capabilities for probing the atmospheric boundary layer: The XPIA field campaign

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundquist, Julie K.; Wilczak, James M.; Ashton, Ryan

    To assess current capabilities for measuring flow within the atmospheric boundary layer, including within wind farms, the U.S. Dept. of Energy sponsored the eXperimental Planetary boundary layer Instrumentation Assessment (XPIA) campaign at the Boulder Atmospheric Observatory (BAO) in spring 2015. Herein, we summarize the XPIA field experiment, highlight novel measurement approaches, and quantify uncertainties associated with these measurement methods. Line-of-sight velocities measured by scanning lidars and radars exhibit close agreement with tower measurements, despite differences in measurement volumes. Virtual towers of wind measurements, from multiple lidars or radars, also agree well with tower and profiling lidar measurements. Estimates of winds over volumes from scanning lidars and radars are in close agreement, enabling assessment of spatial variability. Strengths of the radar systems used here include high scan rates, large domain coverage, and availability during most precipitation events, but they struggle at times to provide data during periods with limited atmospheric scatterers. In contrast, for the deployment geometry tested here, the lidars have slower scan rates and less range, but provide more data during non-precipitating atmospheric conditions. Microwave radiometers provide temperature profiles with approximately the same uncertainty as Radio-Acoustic Sounding Systems (RASS). Using a motion platform, we assess motion-compensation algorithms for lidars to be mounted on offshore platforms. Finally, we highlight cases for validation of mesoscale or large-eddy simulations, providing information on accessing the archived dataset. We conclude that modern remote sensing systems provide a generational improvement in observational capabilities, enabling resolution of fine-scale processes critical to understanding inhomogeneous boundary-layer flows.

  4. Assessing State-of-the-Art Capabilities for Probing the Atmospheric Boundary Layer: The XPIA Field Campaign

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundquist, Julie K.; Wilczak, James M.; Ashton, Ryan

    The synthesis of new measurement technologies with advances in high performance computing provides an unprecedented opportunity to advance our understanding of the atmosphere, particularly with regard to the complex flows in the atmospheric boundary layer. To assess current measurement capabilities for quantifying features of atmospheric flow within wind farms, the U.S. Dept. of Energy sponsored the eXperimental Planetary boundary layer Instrumentation Assessment (XPIA) campaign at the Boulder Atmospheric Observatory (BAO) in spring 2015. Herein, we summarize the XPIA field experiment design, highlight novel approaches to boundary-layer measurements, and quantify measurement uncertainties associated with these experimental methods. Line-of-sight velocities measured by scanning lidars and radars exhibit close agreement with tower measurements, despite differences in measurement volumes. Virtual towers of wind measurements, from multiple lidars or dual radars, also agree well with tower and profiling lidar measurements. Estimates of winds over volumes, conducted with rapid lidar scans, agree with those from scanning radars, enabling assessment of spatial variability. Microwave radiometers provide temperature profiles within and above the boundary layer with approximately the same uncertainty as operational remote sensing measurements. Using a motion platform, we assess motion-compensation algorithms for lidars to be mounted on offshore platforms. Finally, we highlight cases that could be useful for validation of large-eddy simulations or mesoscale numerical weather prediction, providing information on accessing the archived dataset. We conclude that modern remote sensing systems provide a generational improvement in observational capabilities, enabling resolution of refined processes critical to understanding inhomogeneous boundary-layer flows such as those found in wind farms.

  5. Integrating quantitative PCR and Bayesian statistics in quantifying human adenoviruses in small volumes of source water.

    PubMed

    Wu, Jianyong; Gronewold, Andrew D; Rodriguez, Roberto A; Stewart, Jill R; Sobsey, Mark D

    2014-02-01

    Rapid quantification of viral pathogens in drinking and recreational water can help reduce waterborne disease risks. For this purpose, small-volume samples (e.g. 1 L) are favored because of the convenience of collection, transportation and processing. However, the results of viral analysis are often subject to uncertainty. To overcome this limitation, we propose an approach that integrates Bayesian statistics, efficient concentration methods, and quantitative PCR (qPCR) to quantify viral pathogens in water. Using this approach, we quantified human adenoviruses (HAdVs) in eighteen samples of source water collected from six drinking water treatment plants. HAdVs were found in seven samples. In the other eleven samples, HAdVs were not detected by qPCR, but might have existed based on Bayesian inference. Our integrated approach that quantifies uncertainty provides a better understanding than conventional assessments of potential risks to public health, particularly in cases when pathogens may present a threat but cannot be detected by traditional methods. © 2013 Elsevier B.V. All rights reserved.
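    One simple way to formalize "not detected but possibly present" is a conjugate Gamma-Poisson update, where a non-detect in an assayed volume still shifts the posterior concentration downward. The sketch below is a generic example of that idea, not the paper's model; the prior parameters, effective volume, and the assumption of a single shared concentration across samples are all invented.

```python
from scipy import stats

# Gamma-Poisson sketch: concentration lam (copies/L) with a Gamma prior;
# a non-detect in an effective assayed volume v is Poisson zero-count
# evidence, so the posterior stays Gamma with an updated rate.
a0, b0 = 0.5, 0.1          # hypothetical weakly informative prior
v_assayed = 0.8            # effective volume per sample after recovery, L
n_nondetect = 11           # samples with no qPCR detection

a_post = a0                          # zero counts add nothing to the shape
b_post = b0 + n_nondetect * v_assayed

post = stats.gamma(a_post, scale=1 / b_post)
print(f"posterior median = {post.median():.3f} copies/L, "
      f"95% upper bound = {post.ppf(0.95):.3f} copies/L")
```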

  6. Assessing the seasonality and uncertainty in evapotranspiration partitioning using a tracer-aided model

    NASA Astrophysics Data System (ADS)

    Smith, A. A.; Welch, C.; Stadnyk, T. A.

    2018-05-01

    Evapotranspiration (ET) partitioning is a growing field of research in hydrology due to the significant fraction of watershed water loss it represents. The use of tracer-aided models has improved understanding of watershed processes, and has significant potential for identifying time-variable partitioning of evaporation (E) from ET. A tracer-aided model was used to establish a time-series of E/ET using differences in riverine δ18O and δ2H in four northern Canadian watersheds (lower Nelson River, Manitoba, Canada). On average E/ET follows a parabolic trend ranging from 0.7 in the spring and autumn to 0.15 (three watersheds) and 0.5 (fourth watershed) during the summer growing season. In the fourth watershed wetlands and shrubs dominate land cover. During the summer, E/ET ratios are highest in wetlands for three watersheds (10% higher than unsaturated soil storage), while lowest for the fourth watershed (20% lower than unsaturated soil storage). Uncertainty of the ET partition parameters is strongly influenced by storage volumes, with large storage volumes increasing partition uncertainty. In addition, higher simulated soil moisture increases estimated E/ET. Although unsaturated soil storage accounts for larger surface areas in these watersheds than wetlands, riverine isotopic composition is more strongly affected by E from wetlands. Comparisons of E/ET to measurement-intensive studies in similar ecoregions indicate that the methodology proposed here adequately partitions ET.

  7. Inter- and Intrafraction Uncertainty in Prostate Bed Image-Guided Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Kitty; Palma, David A.; Department of Oncology, University of Western Ontario, London

    2012-10-01

    Purpose: The goals of this study were to measure inter- and intrafraction setup error and prostate bed motion (PBM) in patients undergoing post-prostatectomy image-guided radiotherapy (IGRT) and to propose appropriate population-based three-dimensional clinical target volume to planning target volume (CTV-PTV) margins in both non-IGRT and IGRT scenarios. Methods and Materials: In this prospective study, 14 patients underwent adjuvant or salvage radiotherapy to the prostate bed under image guidance using linac-based kilovoltage cone-beam CT (kV-CBCT). Inter- and intrafraction uncertainty/motion was assessed by offline analysis of three consecutive daily kV-CBCT images of each patient: (1) after initial setup to skin marks, (2) after correction for positional error/immediately before radiation treatment, and (3) immediately after treatment. Results: The magnitude of interfraction PBM was 2.1 mm, and intrafraction PBM was 0.4 mm. The maximum inter- and intrafraction prostate bed motion was primarily in the anterior-posterior direction. Margins of at least 3-5 mm with IGRT and 4-7 mm without IGRT (aligning to skin marks) will ensure 95% of the prescribed dose to the clinical target volume in 90% of patients. Conclusions: PBM is a predominant source of intrafraction error compared with setup error and has implications for appropriate PTV margins. Based on inter- and estimated intrafraction motion of the prostate bed using pre- and post-kV-CBCT images, CBCT IGRT to correct for day-to-day variances can potentially reduce CTV-PTV margins by 1-2 mm. CTV-PTV margins for prostate bed treatment in the IGRT and non-IGRT scenarios are proposed; however, in cases with more uncertainty of target delineation and image guidance accuracy, larger margins are recommended.
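    The coverage criterion quoted above (95% of the prescribed dose to the CTV in 90% of patients) matches the well-known van Herk population margin recipe M = 2.5*Sigma + 0.7*sigma, where Sigma and sigma are the quadrature sums of systematic and random error components. A sketch with hypothetical error components, not the study's measurements:

```python
import numpy as np

def ctv_ptv_margin(systematic_mm, random_mm):
    # van Herk et al. (2000) population margin: M = 2.5*Sigma + 0.7*sigma.
    big_sigma = np.sqrt(np.sum(np.square(systematic_mm)))
    small_sigma = np.sqrt(np.sum(np.square(random_mm)))
    return 2.5 * big_sigma + 0.7 * small_sigma

# Hypothetical anterior-posterior error components in mm (setup, PBM):
with_igrt = ctv_ptv_margin([1.0, 0.5], [1.5, 2.1])
without_igrt = ctv_ptv_margin([2.0, 1.0], [1.5, 2.1])
print(f"AP margin: {with_igrt:.1f} mm with IGRT, "
      f"{without_igrt:.1f} mm without IGRT")
```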

  8. TU-AB-202-03: Prediction of PET Transfer Uncertainty by DIR Error Estimating Software, AUTODIRECT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, H; Chen, J; Phillips, J

    2016-06-15

    Purpose: Deformable image registration (DIR) is a powerful tool, but DIR errors can adversely affect its clinical applications. To estimate voxel-specific DIR uncertainty, a software tool, called AUTODIRECT (automated DIR evaluation of confidence tool), has been developed and validated. This work tests the ability of this software to predict uncertainty for the transfer of standard uptake values (SUV) from positron-emission tomography (PET) with DIR. Methods: Virtual phantoms are used for this study. Each phantom has a planning computed tomography (CT) image and a diagnostic PET-CT image set. A deformation was digitally applied to the diagnostic CT to create the planning CT image and establish a known deformation between the images. One lung and three rectum patient datasets were employed to create the virtual phantoms. Both of these sites have difficult deformation scenarios associated with them, which can affect DIR accuracy (lung tissue sliding and changes in rectal filling). The virtual phantoms were created to simulate these scenarios by introducing discontinuities in the deformation field at the lung and rectum borders. The DIR algorithm from Plastimatch software was applied to these phantoms. The SUV mapping errors from the DIR were then compared to those predicted by AUTODIRECT. Results: The SUV error distributions closely followed the AUTODIRECT-predicted error distribution for the 4 test cases. The minimum and maximum PET SUVs were produced from AUTODIRECT at the 95% confidence interval before applying gradient-based SUV segmentation for each of these volumes. Notably, 93.5% of the target volume warped by the true deformation was included within the AUTODIRECT-predicted maximum SUV volume after the segmentation, while 78.9% of the target volume was within the target volume warped by Plastimatch. Conclusion: The AUTODIRECT framework is able to predict PET transfer uncertainty caused by DIR, which enables an understanding of the associated target volume uncertainty.

  9. Source processes for the probabilistic assessment of tsunami hazards

    USGS Publications Warehouse

    Geist, Eric L.; Lynett, Patrick J.

    2014-01-01

    The importance of tsunami hazard assessment has increased in recent years as a result of catastrophic consequences from events such as the 2004 Indian Ocean and 2011 Japan tsunamis. In particular, probabilistic tsunami hazard assessment (PTHA) methods have been emphasized to include all possible ways a tsunami could be generated. Owing to the scarcity of tsunami observations, a computational approach is used to define the hazard. This approach includes all relevant sources that may cause a tsunami to impact a site and all quantifiable uncertainty. Although only earthquakes were initially considered for PTHA, recent efforts have also attempted to include landslide tsunami sources. Including these sources in PTHA is considerably more difficult because of a general lack of information relating landslide area and volume to mean return period. The large variety of failure types and rheologies associated with submarine landslides translates to considerable uncertainty in determining the efficiency of tsunami generation. Approaches to resolving these and several other outstanding problems are described; they will further advance PTHA methodologies, leading to a more accurate understanding of tsunami hazard.
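
    As a concrete illustration of the computational approach sketched above, the hazard at a site can be aggregated as a sum over modeled sources of each source's mean annual rate times its conditional probability of exceedance. The sketch below is a generic, hypothetical example, not the authors' implementation.

      # Toy PTHA aggregation: annual rate of exceeding a wave height eta0 at a site.
      sources = [
          # (mean annual rate of events, P(eta > eta0 at site | event))
          (1.0e-2, 0.30),   # e.g. a nearby earthquake source
          (2.0e-3, 0.80),   # e.g. a submarine landslide source
      ]
      rate = sum(nu * p for nu, p in sources)
      print(f"annual exceedance rate: {rate:.1e}")                   # 4.6e-03
      print(f"50-year exceedance prob.: {1 - (1 - rate)**50:.2f}")   # ~0.21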

  10. Assessing risk of navigational hazard from sea-level-related datum in the South West of Java Sea, Indonesia

    NASA Astrophysics Data System (ADS)

    Poerbandono

    2017-07-01

    This paper assesses the presence of navigational hazards due to underestimation of charted depths originating from the establishment of a sea-level-related reference plane, i.e. datum. The study domain is situated in one of Indonesia's densest marine traffic areas, the SW Java Sea, Indonesia. The assessment is based on a comparison of the authorized Chart Datum (CD), uniformly located at 0.6 m below Mean Sea Level (MSL), with a spatially varying Lowest Astronomical Tide (LAT) generated for the purpose of this research. Hazard is considered here as the deviation of LAT from CD and is quantified as the ratio of the LAT-CD deviation to the allowable Total Vertical Uncertainty (TVU), i.e. the international standard for the accuracy of depth information on nautical charts. Underestimation of charted depth is expected where LAT falls below CD. Such a risk magnifies with decreasing depth, as well as with increasing traffic volume and vessel draught. Most of the domain is found to lie within a risk-free zone for the use of a uniform CD. However, 8% and 19% of the area lie in zones where the uncertainty of CD contributes 50% and 30%, respectively, of the allowable TVU. These are zones where the navigational hazard is expected to increase due to an underestimated lowest tidal level.
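
    A minimal sketch of the hazard ratio used above, assuming the IHO S-44 form of the allowable TVU, sqrt(a^2 + (b*d)^2), with the commonly quoted Order 1a coefficients (a = 0.5 m, b = 0.013); the water levels and depth below are hypothetical.

      import math

      def tvu(depth_m, a=0.5, b=0.013):
          # Allowable Total Vertical Uncertainty at depth d (IHO S-44 form).
          return math.sqrt(a**2 + (b * depth_m)**2)

      def hazard_ratio(lat_below_msl_m, cd_below_msl_m, depth_m):
          # Fraction of the TVU budget consumed where LAT falls below CD
          # (the water level can then drop below the charted datum).
          return (lat_below_msl_m - cd_below_msl_m) / tvu(depth_m)

      print(hazard_ratio(lat_below_msl_m=0.85, cd_below_msl_m=0.60, depth_m=10.0))  # ~0.48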

  11. Predicting morphological changes DS New Naga-Hammadi Barrage for extreme Nile flood flows: A Monte Carlo analysis

    PubMed Central

    Sattar, Ahmed M.A.; Raslan, Yasser M.

    2013-01-01

    While construction of the Aswan High Dam (AHD) has stopped recurrent flooding events, the River Nile is still subject to low-intensity flood waves resulting from controlled releases of water from the dam reservoir. Analysis of flow released from the New Naga-Hammadi Barrage, located 3460 km downstream of the AHD, indicated an increase in the magnitude of floods released from the barrage in the past 10 years. A 2D numerical mobile-bed model is utilized to investigate the possible morphological changes downstream of the Naga-Hammadi Barrage under possible higher flood releases. Monte Carlo simulation (MCS) analyses are applied to the deterministic results of the 2D model to account for and assess the uncertainty in sediment parameters and formulations, in addition to the scarcity of field measurements. Results showed that the predicted volume of erosion carried the highest uncertainty and variation from the deterministic run, while navigation velocity carried the least uncertainty. Furthermore, the error budget method is used to rank the sediment parameters by their contribution to the total prediction uncertainty. It is found that suspended sediment contributed more to output uncertainty than the other sediment parameters, followed by bed load, whose contribution is roughly an order of magnitude smaller. PMID:25685476
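
    The Monte Carlo treatment described above can be sketched generically: sample the uncertain sediment parameters, evaluate the model for each draw, and compare the spread of each output. The model function and parameter ranges below are toy placeholders, not the paper's 2D mobile-bed model.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 5000
      d50 = rng.uniform(0.2e-3, 0.6e-3, n)          # median grain size (m)
      ssc = rng.lognormal(np.log(0.5), 0.4, n)      # suspended sediment conc. (kg/m3)

      def toy_erosion_volume(d50, ssc):
          return 1.0e5 * ssc / (d50 * 1e3)          # placeholder response surface

      vol = toy_erosion_volume(d50, ssc)
      print(f"erosion-volume coefficient of variation: {vol.std() / vol.mean():.2f}")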

  12. Predicting morphological changes DS New Naga-Hammadi Barrage for extreme Nile flood flows: A Monte Carlo analysis.

    PubMed

    Sattar, Ahmed M A; Raslan, Yasser M

    2014-01-01

    While construction of the Aswan High Dam (AHD) has stopped recurrent flooding events, the River Nile is still subject to low-intensity flood waves resulting from controlled releases of water from the dam reservoir. Analysis of flow released from the New Naga-Hammadi Barrage, located 3460 km downstream of the AHD, indicated an increase in the magnitude of floods released from the barrage in the past 10 years. A 2D numerical mobile-bed model is utilized to investigate the possible morphological changes downstream of the Naga-Hammadi Barrage under possible higher flood releases. Monte Carlo simulation (MCS) analyses are applied to the deterministic results of the 2D model to account for and assess the uncertainty in sediment parameters and formulations, in addition to the scarcity of field measurements. Results showed that the predicted volume of erosion carried the highest uncertainty and variation from the deterministic run, while navigation velocity carried the least uncertainty. Furthermore, the error budget method is used to rank the sediment parameters by their contribution to the total prediction uncertainty. It is found that suspended sediment contributed more to output uncertainty than the other sediment parameters, followed by bed load, whose contribution is roughly an order of magnitude smaller.

  13. Impact of Hydrogeological Uncertainty on Estimation of Environmental Risks Posed by Hydrocarbon Transportation Networks

    NASA Astrophysics Data System (ADS)

    Ciriello, V.; Lauriola, I.; Bonvicini, S.; Cozzani, V.; Di Federico, V.; Tartakovsky, Daniel M.

    2017-11-01

    Ubiquitous hydrogeological uncertainty undermines the veracity of quantitative predictions of soil and groundwater contamination due to accidental hydrocarbon spills from onshore pipelines. Such predictions, therefore, must be accompanied by quantification of predictive uncertainty, especially when they are used for environmental risk assessment. We quantify the impact of parametric uncertainty on quantitative forecasting of the temporal evolution of two key risk indices, the volumes of unsaturated and saturated soil contaminated by a surface spill of light nonaqueous-phase liquids. This is accomplished by treating the relevant uncertain parameters as random variables and deploying two alternative probabilistic models to estimate their effect on predictive uncertainty. A physics-based model is solved with a stochastic collocation method and is supplemented by a global sensitivity analysis. A second model represents the quantities of interest as polynomials of the random inputs and has a virtually negligible computational cost, which enables one to explore any number of risk-related contamination scenarios. For a typical oil-spill scenario, our method can be used to identify the key flow and transport parameters affecting the risk indices, to elucidate the texture-dependent behavior of different soils, and to evaluate, with a degree of confidence specified by the decision-maker, the extent of contamination and the corresponding remediation costs.
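
    The second model described above represents the quantities of interest as polynomials of the random inputs; the sketch below shows that general idea with a stand-in "physics" function and hypothetical parameter names.

      import numpy as np

      rng = np.random.default_rng(1)

      def expensive_model(log_k, porosity):       # placeholder for the physics model
          return np.exp(-log_k) * (1.2 - porosity)

      def features(X):                            # quadratic polynomial basis
          x1, x2 = X[:, 0], X[:, 1]
          return np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])

      X = rng.uniform([-2.0, 0.2], [2.0, 0.5], size=(50, 2))    # training runs
      coef, *_ = np.linalg.lstsq(features(X), expensive_model(X[:, 0], X[:, 1]), rcond=None)

      Xnew = rng.uniform([-2.0, 0.2], [2.0, 0.5], size=(3, 2))
      print(features(Xnew) @ coef)                # near-instant surrogate predictions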

  14. Reducing Contingency through Sampling at the Luckey FUSRAP Site - 13186

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frothingham, David; Barker, Michelle; Buechi, Steve

    2013-07-01

    Typically, the greatest risk in developing accurate cost estimates for the remediation of hazardous, toxic, and radioactive waste sites is the uncertainty in the estimated volume of contaminated media requiring remediation. Efforts to address this risk in the remediation cost estimate can result in large cost contingencies that are often considered unacceptable when budgeting for site cleanups. Such was the case for the Luckey Formerly Utilized Sites Remedial Action Program (FUSRAP) site near Luckey, Ohio, which had significant uncertainty surrounding the estimated volume of site soils contaminated with radium, uranium, thorium, beryllium, and lead. Funding provided by the American Recovery and Reinvestment Act (ARRA) allowed the U.S. Army Corps of Engineers (USACE) to conduct additional environmental sampling and analysis at the Luckey Site between November 2009 and April 2010, with the objective of further delineating the horizontal and vertical extent of contaminated soils in order to reduce the uncertainty in the soil volume estimate. Investigative work included radiological, geophysical, and topographic field surveys, subsurface borings, and soil sampling. Results from the investigative sampling were used in conjunction with Argonne National Laboratory's Bayesian Approaches for Adaptive Spatial Sampling (BAASS) software to update the contaminated soil volume estimate for the site. This updated volume estimate was then used to update the project cost-to-complete estimate using the USACE Cost and Schedule Risk Analysis process, which develops cost contingencies based on project risks. An investment of $1.1 M of ARRA funds in additional investigative work resulted in a reduction of 135,000 in-situ cubic meters (177,000 in-situ cubic yards) in the estimated base soil volume. This refinement of the estimated soil volume resulted in a $64.3 M reduction in the estimated project cost-to-complete, through a reduction in the uncertainty in the contaminated soil volume estimate and the associated contingency costs. (authors)

  15. Evolution of Mobil's methods to evaluate exploration and producing opportunities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaynor, C.B.; Cook, D.M. Jr.

    1996-08-01

    Over the past decade, Mobil has changed significantly in size, structure, and focus to improve profitability. Concurrently, work processes and methodologies have been modified to improve resource utilization and opportunity selection. The key imperative has been recognition of the full range of hydrocarbon volume uncertainty, its risk, and its value. Exploration has focused on increasing success through improved geotechnical estimates and demonstrating value addition. The important tasks were as follows: (1) A centralized Exploration and Producing team was formed to help ensure an integrated, consistent worldwide approach to prospect and field assessments. Monte Carlo simulation was instituted to recognize probability-weighted ranges of possible outcomes for prospects and fields, and hydrocarbon volume category definitions were standardized. (2) Exploration instituted a global Prospect Inventory, tracking wildcat predictions vs. results. Performance analyses led to initiatives to improve the quality and consistency of assessments. Process improvement efforts included the use of multidisciplinary teams and peer reviews. Continued overestimates of hydrocarbon volumes prompted methodology changes such as the use of "reality checks" and log-normal distributions. The communication of value predictions and additions became paramount. (3) Producing now recognizes the need for Exploration's commercial discoveries and new Producing ventures, notwithstanding the associated risk. Multidisciplinary teams of engineers and geoscientists work on post-discovery assessments to optimize field development and maximize the value of opportunities. Mobil now integrates volume and risk assessment with correlative future capital investment programs to make proactive strategic choices that maximize shareholder value.
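
    The probability-weighted ranges and log-normal distributions mentioned above are commonly implemented as Monte Carlo volumetrics of the generic form recoverable oil = A*h*phi*(1 - Sw)*RF/Bo; the sketch below is a hypothetical illustration, not Mobil's method, and all parameter values are placeholders.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 100_000
      area = rng.lognormal(np.log(8e6), 0.5, n)                 # drainage area (m2)
      pay  = rng.lognormal(np.log(20.0), 0.4, n)                # net pay (m)
      phi  = rng.normal(0.22, 0.03, n).clip(0.05, 0.35)         # porosity
      sw   = rng.normal(0.35, 0.05, n).clip(0.10, 0.80)         # water saturation
      rf, bo = 0.30, 1.2                                        # recovery factor, FVF

      barrels = area * pay * phi * (1 - sw) * rf / bo / 0.159   # m3 -> bbl
      for p in (90, 50, 10):   # exceedance convention: P90 low case, P10 high case
          print(f"P{p}: {np.percentile(barrels, 100 - p) / 1e6:,.0f} MMbbl")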

  16. SU-F-BRD-05: Robustness of Dose Painting by Numbers in Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montero, A Barragan; Sterpin, E; Lee, J

    Purpose: Proton range uncertainties may cause important dose perturbations within the target volume, especially when steep dose gradients are present, as in dose painting. The aim of this study is to assess the robustness against setup and range errors of highly heterogeneous dose prescriptions (i.e., dose painting by numbers) delivered by proton pencil beam scanning. Methods: An automatic workflow, based on MATLAB functions, was implemented through scripting in RayStation (RaySearch Laboratories). It performs a gradient-based segmentation of the dose painting volume from 18FDG-PET images (GTVPET) and calculates the dose prescription as a linear function of the FDG-uptake value in each voxel. The workflow was applied to two patients with head and neck cancer. Robustness against setup and range errors of the conventional PTV margin strategy (prescription dilated by 2.5 mm) versus CTV-based (minimax) robust optimization (2.5 mm setup, 3% range error) was assessed by comparing the prescription with the planned dose for a set of error scenarios. Results: In order to ensure dose coverage above 95% of the prescribed dose in more than 95% of the GTVPET voxels while compensating for the uncertainties, the plans with a PTV generated a high overdose. For the nominal case, up to 35% of the GTVPET received doses 5% beyond the prescription. For the worst of the evaluated error scenarios, the volume with 5% overdose increased to 50%. In contrast, for CTV-based plans this 5% overdose was present only in a small fraction of the GTVPET, ranging from 7% in the nominal case to 15% in the worst of the evaluated scenarios. Conclusion: The use of a PTV leads to non-robust dose distributions with excessive overdose in the painted volume. In contrast, robust optimization yields robust dose distributions with limited overdose. RaySearch Laboratories is sincerely acknowledged for providing us with the RayStation treatment planning system and for the support provided.
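
    The linear uptake-to-dose mapping used by the workflow above is commonly written as D(v) = D_low + (SUV_v - SUV_min)/(SUV_max - SUV_min) * (D_high - D_low); a minimal sketch follows, with hypothetical dose levels and SUVs.

      import numpy as np

      def dpbn_prescription(suv, d_low=70.0, d_high=86.0):
          # Per-voxel prescription (Gy) as a linear function of FDG uptake.
          suv = np.asarray(suv, dtype=float)
          t = (suv - suv.min()) / (suv.max() - suv.min())
          return d_low + t * (d_high - d_low)

      print(dpbn_prescription([2.0, 5.0, 9.5]))   # [70.  76.4  86. ]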

  17. PTV margin determination in conformal SRT of intracranial lesions

    PubMed Central

    Parker, Brent C.; Shiu, Almon S.; Maor, Moshe H.; Lang, Frederick F.; Liu, H. Helen; White, R. Allen; Antolak, John A.

    2002-01-01

    The planning target volume (PTV) includes the clinical target volume (CTV) to be irradiated and a margin to account for uncertainties in the treatment process. Uncertainties in miniature multileaf collimator (mMLC) leaf positioning, CT scanner spatial localization, CT‐MRI image fusion spatial localization, and Gill‐Thomas‐Cosman (GTC) relocatable head frame repositioning were quantified for the purpose of determining a minimum PTV margin that still delivers a satisfactory CTV dose. The measured uncertainties were then incorporated into a simple Monte Carlo calculation for evaluation of various margin and fraction combinations. Satisfactory CTV dosimetric criteria were selected to be a minimum CTV dose of 95% of the PTV dose and at least 95% of the CTV receiving 100% of the PTV dose. The measured uncertainties were assumed to be Gaussian distributions. Systematic errors were added linearly and random errors were added in quadrature assuming no correlation to arrive at the total combined error. The Monte Carlo simulation written for this work examined the distribution of cumulative dose volume histograms for a large patient population using various margin and fraction combinations to determine the smallest margin required to meet the established criteria. The program examined 5 and 30 fraction treatments, since those are the only fractionation schemes currently used at our institution. The fractionation schemes were evaluated using no margin, a margin of just the systematic component of the total uncertainty, and a margin of the systematic component plus one standard deviation of the total uncertainty. It was concluded that (i) a margin of the systematic error plus one standard deviation of the total uncertainty is the smallest PTV margin necessary to achieve the established CTV dose criteria, and (ii) it is necessary to determine the uncertainties introduced by the specific equipment and procedures used at each institution since the uncertainties may vary among locations. PACS number(s): 87.53.Kn, 87.53.Ly PMID:12132939
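
    The error bookkeeping described above (systematic components added linearly, random components in quadrature, and a candidate margin equal to the systematic total plus one standard deviation of the combined random error, on one reading of "total uncertainty") can be sketched as follows; the component values are hypothetical, not the paper's measurements.

      import math

      components = {                        # (systematic mm, random SD mm), hypothetical
          "mMLC leaf position":      (0.3, 0.3),
          "CT localization":         (0.4, 0.5),
          "CT-MRI fusion":           (0.5, 0.6),
          "GTC frame repositioning": (0.4, 0.8),
      }

      systematic = sum(s for s, _ in components.values())                 # linear sum
      random_sd = math.sqrt(sum(r**2 for _, r in components.values()))    # quadrature
      print(f"candidate PTV margin: {systematic + random_sd:.1f} mm")     # ~2.8 mm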

  18. Assessing Uncertainty in Risk Assessment Models (BOSC CSS meeting)

    EPA Science Inventory

    In vitro assays are increasingly being used in risk assessments. Uncertainty in assays leads to uncertainty in the models used for risk assessments. This poster assesses uncertainty in the ER and AR (estrogen and androgen receptor) models.

  19. SeaWiFS technical report series. Volume 16: The second SeaWiFS Intercalibration Round-Robin Experiment, SIRREX-2, June 1993

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Mueller, James L.; Mclean, James T.; Johnson, B. Carol; Westphal, Todd L.; Cooper, John W.

    1994-01-01

    The results of the second Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Intercalibration Round-Robin Experiment (SIRREX-2), which was held at the Center for Hydro-Optics and Remote Sensing (CHORS) at San Diego State University on 14-25 June 1993, are presented. SeaWiFS is an ocean color radiometer that is scheduled for launch in 1994. The SIRREXs are part of the SeaWiFS Calibration and Validation Program, which includes the GSFC, CHORS, NIST, and several other laboratories. GSFC maintains the radiometric scales (spectral radiance and irradiance) for the SeaWiFS program using spectral irradiance standard lamps, which are calibrated by NIST. The purpose of each SIRREX is to assure that the radiometric scales realized by the laboratories that participate in the SeaWiFS Calibration and Validation Program are correct; that is, that the uncertainties of the radiometric scales are such that measurements of normalized water-leaving radiance using oceanographic radiometers have uncertainties of 5%. SIRREX-1 demonstrated, from the internal consistency of the results, that the program goals would not be met without improvements to the instrumentation. The results of SIRREX-2 demonstrate that spectral irradiance scales realized using the GSFC standard irradiance lamp (F269) are consistent with the program goals, as the uncertainty of these measurements is assessed to be about 1%. However, this is not true for the spectral radiance scales, where again the internal consistency of the results is used to assess the uncertainty. This is attributed to inadequate performance and characterization of the instrumentation. For example, spatial nonuniformities, spectral features, and sensitivity to illumination configuration were observed in some of the integrating sphere sources. The results of SIRREX-2 clearly indicate the direction for future work, with the main emphasis on instrument characterization and the assessment of measurement uncertainties, so that the results may be stated in a more definitive manner.

  20. Where do uncertainties reside within environmental risk assessments? Expert opinion on uncertainty distributions for pesticide risks to surface water organisms.

    PubMed

    Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T

    2016-12-01

    A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations, in which subject-matter experts for pesticide risks to surface water organisms validate and assess three dimensions of uncertainty: its level (the severity of uncertainty, ranging from determinism to ignorance); its nature (whether the uncertainty is epistemic or aleatory); and its location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating, and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment, and effects assessment. The comprehensive description of uncertainty presented will enable risk analysts to prioritise the required phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during environmental risk assessments. In turn, it is expected to inform investment in uncertainty reduction or targeted risk-management action. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
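
    For tabulating elicitation responses, the three dimensions can be captured in a small record type; the encoding below is illustrative only and is not taken from the paper.

      from dataclasses import dataclass
      from enum import Enum

      class Level(Enum):          # severity, from determinism toward ignorance
          DETERMINISM = 0
          STATISTICAL = 1
          SCENARIO = 2
          IGNORANCE = 3

      class Nature(Enum):
          EPISTEMIC = "epistemic"
          ALEATORY = "aleatory"

      @dataclass
      class UncertaintyRecord:
          location: str           # ERA phase or task where the uncertainty arises
          level: Level
          nature: Nature

      print(UncertaintyRecord("risk characterisation", Level.SCENARIO, Nature.EPISTEMIC))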

  1. Operationally efficient propulsion system study (OEPSS) data book. Volume 10; Air Augmented Rocket Afterburning

    NASA Technical Reports Server (NTRS)

    Farhangi, Shahram; Trent, Donnie (Editor)

    1992-01-01

    A study was directed towards assessing the viability and effectiveness of an air-augmented ejector/rocket. Successful thrust augmentation could potentially reduce a multi-stage vehicle to a single-stage-to-orbit (SSTO) vehicle and thereby eliminate the associated ground support facility infrastructure and ground processing required by the eliminated stage. The results of this preliminary study indicate that an air-augmented ejector/rocket propulsion system is viable. However, uncertainties resulting from the simplified approach and assumptions must be resolved by further investigation.

  2. Ku-band antenna acquisition and tracking performance study, volume 4

    NASA Technical Reports Server (NTRS)

    Huang, T. C.; Lindsey, W. C.

    1977-01-01

    The results pertaining to the tradeoff analysis and performance of the Ku-band shuttle antenna pointing and signal acquisition system are presented. The square, hexagonal, and spiral antenna trajectories were investigated assuming the postulated TDRS uncertainty region and a flexible statistical model for the location of the TDRS within the uncertainty volume. The scanning trajectories, shuttle/TDRS signal parameters and dynamics, and three signal acquisition algorithms were integrated into a hardware simulation. The hardware simulation is quite flexible in that it allows for the evaluation of signal acquisition performance for an arbitrary (programmable) antenna pattern, a large range of C/N0 values, various TDRS/shuttle a priori uncertainty distributions, and three distinct signal search algorithms.

  3. Validation of sea ice models using an uncertainty-based distance metric for multiple model variables: NEW METRIC FOR SEA ICE MODEL VALIDATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urrego-Blanco, Jorge R.; Hunke, Elizabeth C.; Urban, Nathan M.

    Here, we implement a variance-based distance metric (Dn) to objectively assess the skill of sea ice models when multiple output variables or uncertainties in both model predictions and observations need to be considered. The metric compares observation and model data pairs on common spatial and temporal grids, improving upon highly aggregated metrics (e.g., total sea ice extent or volume) by capturing the spatial character of model skill. The Dn metric is a gamma-distributed statistic that is more general than the χ² statistic commonly used to assess model fit, which requires the assumption that the model is unbiased and can only incorporate observational error in the analysis. The Dn statistic does not assume that the model is unbiased, and it allows the incorporation of multiple observational data sets for the same variable, and simultaneously for different variables, along with different types of variances that can characterize uncertainties in both the observations and the model. This approach represents a step toward establishing a systematic framework for probabilistic validation of sea ice models. The methodology is also useful for model tuning, by using the Dn metric as a cost function and incorporating model parametric uncertainty as part of a scheme to optimize model functionality. We apply this approach to evaluate different configurations of the standalone Los Alamos sea ice model (CICE), encompassing the parametric uncertainty in the model, and to find new sets of model configurations that produce better agreement than previous configurations between model and observational estimates of sea ice concentration and thickness.
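
    The abstract does not reproduce the exact definition of Dn, so the sketch below shows only the general shape of a variance-weighted model-observation distance in which both model and observational variances appear, unlike a plain chi-squared with observational error alone; the paper's metric may differ in detail.

      import numpy as np

      def variance_weighted_distance(model, obs, var_model, var_obs):
          return np.sum((model - obs) ** 2 / (var_model + var_obs))

      m = np.array([0.80, 0.60, 0.90])    # e.g. modeled ice concentration
      o = np.array([0.70, 0.70, 0.95])    # observed
      print(variance_weighted_distance(m, o, var_model=0.02, var_obs=0.01))  # 0.75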

  4. Uncertainty Quantification and Risk Mitigation of CO2 Leakage in Groundwater Aquifers

    NASA Astrophysics Data System (ADS)

    Sun, Y.; Tong, C.; Mansoor, K.; Carroll, S.

    2013-12-01

    The risk of CO2 leakage into shallow aquifers through various pathways such as faults and abandoned wells is a concern of CO2 geological sequestration. If a leak is detected in an aquifer system, a contingency plan is required to manage the CO2 storage and to protect the groundwater source. Among many remediation and mitigation strategies, the simplest is to stop CO2 leakage at a wellbore. Therefore, it is necessary to address whether and when the CO2 leaks should be sealed, and how much risk can be mitigated. In the presence of various uncertainties, including geological-structure uncertainty and parametric uncertainty, the risk of CO2 leakage into an aquifer needs to be assessed with probabilistic distributions of the uncertain parameters. In this study, we developed an integrated model to simulate multiphase flow of CO2 and brine in a deep storage reservoir, through a leaky well at an uncertain location, and subsequently multicomponent reactive transport in a shallow aquifer. Each sub-model covers its domain-specific physics. Uncertainties of geological structure and parameters are considered together with decision variables (CO2 injection rate and mitigation time) for risk assessment of the leakage-impacted aquifer volume. High-resolution, inexpensive reduced-order models (ROMs) of the risk profiles are constructed as polynomial functions of the decision variables and all uncertain parameters. These reduced-order models are then used in place of the computationally expensive numerical models for future decision-making on if and when the leaky well is sealed. The tradeoff between CO2 storage capacity in the reservoir and the leakage-induced risk in the aquifer is evaluated. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344.

  5. Validation of sea ice models using an uncertainty-based distance metric for multiple model variables: NEW METRIC FOR SEA ICE MODEL VALIDATION

    DOE PAGES

    Urrego-Blanco, Jorge R.; Hunke, Elizabeth C.; Urban, Nathan M.; ...

    2017-04-01

    Here, we implement a variance-based distance metric (Dn) to objectively assess the skill of sea ice models when multiple output variables or uncertainties in both model predictions and observations need to be considered. The metric compares observation and model data pairs on common spatial and temporal grids, improving upon highly aggregated metrics (e.g., total sea ice extent or volume) by capturing the spatial character of model skill. The Dn metric is a gamma-distributed statistic that is more general than the χ² statistic commonly used to assess model fit, which requires the assumption that the model is unbiased and can only incorporate observational error in the analysis. The Dn statistic does not assume that the model is unbiased, and it allows the incorporation of multiple observational data sets for the same variable, and simultaneously for different variables, along with different types of variances that can characterize uncertainties in both the observations and the model. This approach represents a step toward establishing a systematic framework for probabilistic validation of sea ice models. The methodology is also useful for model tuning, by using the Dn metric as a cost function and incorporating model parametric uncertainty as part of a scheme to optimize model functionality. We apply this approach to evaluate different configurations of the standalone Los Alamos sea ice model (CICE), encompassing the parametric uncertainty in the model, and to find new sets of model configurations that produce better agreement than previous configurations between model and observational estimates of sea ice concentration and thickness.

  6. Global Sampling for Integrating Physics-Specific Subsystems and Quantifying Uncertainties of CO2 Geological Sequestration

    DOE PAGES

    Sun, Y.; Tong, C.; Trainor-Guitten, W. J.; ...

    2012-12-20

    The risk of CO2 leakage from a deep storage reservoir into a shallow aquifer through a fault is assessed and studied using physics-specific computer models. The hypothetical CO2 geological sequestration system is composed of three subsystems: a deep storage reservoir, a fault in the caprock, and a shallow aquifer, which are modeled respectively by considering sub-domain-specific physics. Supercritical CO2 is injected into the reservoir subsystem with uncertain permeabilities of the reservoir, caprock, and aquifer, an uncertain fault location, and the injection rate (as a decision variable). The simulated pressure and CO2/brine saturation are connected to the fault-leakage model as a boundary condition. CO2 and brine fluxes from the fault-leakage model at the fault outlet are then imposed on the aquifer model as a source term. Moreover, uncertainties are propagated from the deep reservoir model, to the fault-leakage model, and eventually to the geochemical model in the shallow aquifer, thus contributing to risk profiles. To quantify the uncertainties and assess leakage-relevant risk, we propose a global sampling-based method to allocate sub-dimensions of uncertain parameters to sub-models. The risk profiles are defined and related to CO2 plume development for pH value and total dissolved solids (TDS) below the EPA's Maximum Contaminant Levels (MCL) for drinking water quality. A global sensitivity analysis is conducted to select the parameters most sensitive to the risk profiles. The resulting uncertainty of the pH- and TDS-defined aquifer volume, which is impacted by CO2 and brine leakage, mainly results from the uncertainty of the fault permeability. Subsequently, high-resolution, reduced-order models of the risk profiles are developed as functions of all the decision variables and uncertain parameters in all three subsystems.

  7. Probabilistic Dynamic Buckling of Smart Composite Shells

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2003-01-01

    A computational simulation method is presented to evaluate the deterministic and nondeterministic dynamic buckling of smart composite shells. The combined use of composite mechanics, finite element computer codes, and probabilistic analysis enables the effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right below the outer plies. The results show that, on average, the use of smart fibers improved the shell buckling resistance by about 10 percent at different probabilities and delayed the buckling occurrence time. The probabilistic sensitivity results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load, while uncertainties in the electric field strength and smart material volume fraction have moderate effects. For the specific shell considered in this evaluation, the use of smart composite material is not recommended, because the shell buckling resistance can be improved by simply rearranging the orientation of the outer plies, as shown in the dynamic buckling analysis results presented in this report.

  8. Probabilistic Dynamic Buckling of Smart Composite Shells

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2007-01-01

    A computational simulation method is presented to evaluate the deterministic and nondeterministic dynamic buckling of smart composite shells. The combined use of intraply hybrid composite mechanics, finite element computer codes, and probabilistic analysis enables the effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right next to the outer plies. The results show that, on average, the use of smart fibers improved the shell buckling resistance by about 10% at different probabilities and delayed the buckling occurrence time. The probabilistic sensitivity results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load, while uncertainties in the electric field strength and smart material volume fraction have moderate effects. For the specific shell considered in this evaluation, the use of smart composite material is not recommended, because the shell buckling resistance can be improved by simply rearranging the orientation of the outer plies, as shown in the dynamic buckling analysis results presented in this report.

  9. Bounding uncertainty in volumetric geometric models for terrestrial lidar observations of ecosystems.

    PubMed

    Paynter, Ian; Genest, Daniel; Peri, Francesco; Schaaf, Crystal

    2018-04-06

    Volumetric models with known biases are shown to provide bounds for the uncertainty in estimations of volume for ecologically interesting objects, observed with a terrestrial laser scanner (TLS) instrument. Bounding cuboids, three-dimensional convex hull polygons, voxels, the Outer Hull Model and Square Based Columns (SBCs) are considered for their ability to estimate the volume of temperate and tropical trees, as well as geomorphological features such as bluffs and saltmarsh creeks. For temperate trees, supplementary geometric models are evaluated for their ability to bound the uncertainty in cylinder-based reconstructions, finding that coarser volumetric methods do not currently constrain volume meaningfully, but may be helpful with further refinement, or in hybridized models. Three-dimensional convex hull polygons consistently overestimate object volume, and SBCs consistently underestimate volume. Voxel estimations vary in their bias, due to the point density of the TLS data, and occlusion, particularly in trees. The response of the models to parametrization is analysed, observing unexpected trends in the SBC estimates for the drumlin dataset. Establishing that this result is due to the resolution of the TLS observations being insufficient to support the resolution of the geometric model, it is suggested that geometric models with predictable outcomes can also highlight data quality issues when they produce illogical results.
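
    A voxel estimate of the kind compared above can be sketched in a few lines: occupied voxels are counted on a regular grid and multiplied by the voxel volume. The point cloud below is synthetic; with real TLS data the bias depends on point density and occlusion, as the abstract notes.

      import numpy as np

      rng = np.random.default_rng(3)
      points = rng.uniform(0.0, 1.0, size=(20_000, 3))   # stand-in TLS returns (m)

      voxel = 0.05                                       # voxel edge length (m)
      occupied = np.unique(np.floor(points / voxel).astype(int), axis=0)
      print(f"estimated volume: {len(occupied) * voxel**3:.3f} m^3")  # < 1.0 here: an underestimate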

  10. Bounding uncertainty in volumetric geometric models for terrestrial lidar observations of ecosystems

    PubMed Central

    Genest, Daniel; Peri, Francesco; Schaaf, Crystal

    2018-01-01

    Volumetric models with known biases are shown to provide bounds for the uncertainty in estimations of volume for ecologically interesting objects, observed with a terrestrial laser scanner (TLS) instrument. Bounding cuboids, three-dimensional convex hull polygons, voxels, the Outer Hull Model and Square Based Columns (SBCs) are considered for their ability to estimate the volume of temperate and tropical trees, as well as geomorphological features such as bluffs and saltmarsh creeks. For temperate trees, supplementary geometric models are evaluated for their ability to bound the uncertainty in cylinder-based reconstructions, finding that coarser volumetric methods do not currently constrain volume meaningfully, but may be helpful with further refinement, or in hybridized models. Three-dimensional convex hull polygons consistently overestimate object volume, and SBCs consistently underestimate volume. Voxel estimations vary in their bias, due to the point density of the TLS data, and occlusion, particularly in trees. The response of the models to parametrization is analysed, observing unexpected trends in the SBC estimates for the drumlin dataset. Establishing that this result is due to the resolution of the TLS observations being insufficient to support the resolution of the geometric model, it is suggested that geometric models with predictable outcomes can also highlight data quality issues when they produce illogical results. PMID:29503722

  11. Assessment of parameter uncertainty in hydrological model using a Markov-Chain-Monte-Carlo-based multilevel-factorial-analysis method

    NASA Astrophysics Data System (ADS)

    Zhang, Junlong; Li, Yongping; Huang, Guohe; Chen, Xi; Bao, Anming

    2016-07-01

    Without a realistic assessment of parameter uncertainty, decision makers may encounter difficulties in accurately describing hydrologic processes and assessing relationships between model parameters and watershed characteristics. In this study, a Markov-Chain-Monte-Carlo-based multilevel-factorial-analysis (MCMC-MFA) method is developed, which can not only generate samples of parameters from a well-constructed Markov chain and assess parameter uncertainties with straightforward Bayesian inference, but also investigate the individual and interactive effects of multiple parameters on model output through measuring the specific variations of hydrological responses. A case study is conducted for addressing parameter uncertainties in the Kaidu watershed of northwest China. Effects of multiple parameters and their interactions are quantitatively investigated using the MCMC-MFA with a three-level factorial experiment (81 runs in total). A variance-based sensitivity analysis method is used to validate the results of the parameters' effects. Results disclose that (i) the soil conservation service runoff curve number for moisture condition II (CN2) and the fraction of snow volume corresponding to 50% snow cover (SNO50COV) are the most significant factors for hydrological responses, implying that infiltration-excess overland flow and snow water equivalent represent important water inputs to the hydrological system of the Kaidu watershed; (ii) saturated hydraulic conductivity (SOL_K) and the soil evaporation compensation factor (ESCO) have obvious effects on hydrological responses, implying that the processes of percolation and evaporation impact the hydrological process in this watershed; (iii) the interactions of ESCO and SNO50COV, as well as CN2 and SNO50COV, have an obvious effect, implying that snow cover can impact the generation of runoff on the land surface and the extraction of soil evaporative demand in lower soil layers. These findings can help enhance the hydrological model's capability for simulating and predicting water resources.
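
    The MCMC ingredient of the method can be illustrated with a compact random-walk Metropolis sampler; the model, data, and prior below are toy placeholders, not the Kaidu watershed setup.

      import numpy as np

      rng = np.random.default_rng(42)
      obs = 2.0 + 0.1 * rng.standard_normal(30)          # toy observations

      def log_post(theta):                               # flat prior, Gaussian likelihood
          return -0.5 * np.sum((obs - theta) ** 2 / 0.1**2)

      theta, samples = 1.0, []
      for _ in range(20_000):
          prop = theta + 0.05 * rng.standard_normal()    # random-walk proposal
          if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
              theta = prop                               # Metropolis accept
          samples.append(theta)

      burned = samples[5_000:]
      print(np.mean(burned), np.std(burned))             # posterior mean and SD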

  12. Uncertainty associated with assessing semen volume: are volumetric and gravimetric methods that different?

    PubMed

    Woodward, Bryan; Gossen, Nicole; Meadows, Jessica; Tomlinson, Mathew

    2016-12-01

    The World Health Organization laboratory manual for the examination of human semen suggests that an indirect measurement of semen volume by weighing (the gravimetric method) is more accurate than a direct measure using a serological pipette. A series of experiments was performed to determine the level of discrepancy between the two methods, using pipettes and a balance calibrated to a traceable standard. The median weights of 1.0 ml and 5.0 ml of semen were 1.03 g (range 1.02-1.05 g) and 5.11 g (range 4.95-5.16 g), respectively, suggesting a density for semen between 1.03 and 1.04 g/ml. When the containers were re-weighed after the removal of 5.0 ml of semen using a serological pipette, the mean residual loss was 0.12 ml (120 μl) or 0.12 g (median 100 μl, range 70-300 μl). Direct comparison of the volumetric and gravimetric methods in a total of 40 samples showed a mean difference of 0.25 ml (median 0.32 ± 0.67 ml), representing an error of 8.5%. Residual semen left in the container by weight was on average 0.11 g (median 0.10 g, range 0.05-0.19 g). Assuming a density of 1 g/ml, the average error between the volumetric and gravimetric methods was approximately 8% (p < 0.001). If, however, the WHO value for density is assumed (1.04 g/ml), the difference is reduced to 4.2%. At least 2.4-3.5% of this difference is also explained by the residual semen remaining in the container. This study suggests that, by assuming the density of semen to be 1 g/ml, there is significant uncertainty associated with the average gravimetric measurement of semen volume. Laboratories may therefore prefer to provide in-house quality assurance data in order to be satisfied that 'estimating' semen volume is 'fit for purpose', as opposed to assuming a lower uncertainty associated with the WHO-recommended method.
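
    The density effect reported above is simple to verify: converting the same weighed mass to volume with a density of 1.00 g/ml versus the WHO value of 1.04 g/ml changes the result by about 4%, roughly half of the 8.5% volumetric-versus-gravimetric discrepancy.

      mass_g = 5.11   # median weight of a nominal 5.0 ml aliquot, from the study
      for rho in (1.00, 1.04):
          print(f"density {rho:.2f} g/ml -> volume {mass_g / rho:.2f} ml")
      # density 1.00 g/ml -> volume 5.11 ml
      # density 1.04 g/ml -> volume 4.91 ml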

  13. Uncertainty in Agricultural Impact Assessment

    NASA Technical Reports Server (NTRS)

    Wallach, Daniel; Mearns, Linda O.; Rivington, Michael; Antle, John M.; Ruane, Alexander C.

    2014-01-01

    This chapter considers issues concerning uncertainty associated with modeling and its use within agricultural impact assessments. Information about uncertainty is important for those who develop assessment methods, since that information indicates the need for, and the possibility of, improvement of the methods and databases. Such information also allows one to compare alternative methods. Information about the sources of uncertainties is an aid in prioritizing further work on the impact assessment method. Uncertainty information is also necessary for those who apply assessment methods, e.g., for projecting climate change impacts on agricultural production and for stakeholders who want to use the results as part of a decision-making process (e.g., for adaptation planning). For them, uncertainty information indicates the degree of confidence they can place in the simulated results. Quantification of uncertainty also provides stakeholders with an important guideline for making decisions that are robust across the known uncertainties. Thus, uncertainty information is important for any decision based on impact assessment. Ultimately, we are interested in knowledge about uncertainty so that information can be used to achieve positive outcomes from agricultural modeling and impact assessment.

  14. Uncertainty analysis for low-level radioactive waste disposal performance assessment at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, D.W.; Yambert, M.W.; Kocher, D.C.

    1994-12-31

    A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter III. An analysis of the uncertainty incorporated into the assessment was performed which addressed the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and human exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.

  15. A modified ATI technique for nowcasting convective rain volumes over areas. [area-time integrals

    NASA Technical Reports Server (NTRS)

    Makarau, Amos; Johnson, L. Ronald; Doneaud, Andre A.

    1988-01-01

    This paper explores the applicability of the area-time-integral (ATI) technique for the estimation of the growth portion only of a convective storm (while the rain volume is computed using the entire life history of the event) and for nowcasting the total rain volume of a convective system at the stage of its maximum development. For these purposes, the ATIs were computed from digital radar data (for 1981-1982) from the North Dakota Cloud Modification Project, using the maximum echo area (ATIA) at reflectivities of no less than 25 dBZ, the maximum reflectivity, and the maximum echo height as markers of the end of the growth portion of the convective event. Linear regression analysis demonstrated that the correlations of total rain volume or maximum rain volume versus ATIA were the strongest. The uncertainties obtained were comparable to those which typically occur in rain volume estimates obtained from radar data employing Z-R conversion followed by space and time integration. This demonstrates that the total rain volume of a storm can be nowcasted at its maximum stage of development.
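
    The nowcasting step can be sketched as a linear regression of historical total rain volumes on growth-portion ATIs, then applied to a new storm at its maximum development; all numbers below are synthetic placeholders.

      import numpy as np

      # historical events: growth-portion ATI (km^2 h) vs total rain volume (m^3)
      ati = np.array([120.0, 300.0, 540.0, 800.0, 1100.0])
      rain = np.array([0.9e6, 2.4e6, 4.1e6, 6.3e6, 8.6e6])

      slope, intercept = np.polyfit(ati, rain, 1)
      print(f"nowcast for ATI = 650 km^2 h: {slope * 650 + intercept:.2e} m^3")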

  16. Comparative evaluation of 1D and quasi-2D hydraulic models based on benchmark and real-world applications for uncertainty assessment in flood mapping

    NASA Astrophysics Data System (ADS)

    Dimitriadis, Panayiotis; Tegos, Aristoteles; Oikonomou, Athanasios; Pagana, Vassiliki; Koukouvinos, Antonios; Mamassis, Nikos; Koutsoyiannis, Demetris; Efstratiadis, Andreas

    2016-03-01

    One-dimensional and quasi-two-dimensional hydraulic freeware models (HEC-RAS, LISFLOOD-FP and FLO-2d) are widely used for flood inundation mapping. These models are tested on a benchmark case with a mixed rectangular-triangular channel cross section. Using a Monte-Carlo approach, we employ extended sensitivity analysis by simultaneously varying the input discharge, the longitudinal and lateral gradients, and the roughness coefficients, as well as the grid cell size. Based on statistical analysis of three output variables of interest, i.e. the water depths at the inflow and outflow locations and the total flood volume, we investigate the uncertainty enclosed in different model configurations and flow conditions, without the influence of errors and other assumptions on topography, channel geometry, and boundary conditions. Moreover, we estimate the uncertainty associated with each input variable and compare it to the overall one. The outcomes of the benchmark analysis are further highlighted by applying the three models to real-world flood propagation problems, in the context of two challenging case studies in Greece.

  17. I had a dream… Continuous InSAR measurement and transparent earth, the beauty of analogue modeling to assess direct model uncertainties

    NASA Astrophysics Data System (ADS)

    Taisne, B.; Pansino, S.; Manta, F.; Tay Wen Jing, C.

    2017-12-01

    Have you ever dreamed about continuous, high-resolution InSAR data? Have you ever dreamed about a transparent earth allowing you to see what is actually going on under a volcano? Well, you likely dreamed about an analogue facility that allows you to scale down the natural system to fit into a room, with a controlled environment and a complex visualisation system. Analogue modeling has been widely used to understand magmatic processes, and thanks to a transparent analogue for the elastic Earth's crust, we can see, as it evolves with time, the migration of a dyke, the volume change of a chamber, or the rise of a bubble in a conduit. All those phenomena are modeled theoretically or numerically, with their own simplifications. Therefore, how well are we really constraining the physical parameters describing the evolution of a dyke or a chamber? Access to those parameters, in real time and with a high level of confidence, is of paramount importance when dealing with unrest at volcanoes. The aim of this research is to estimate the uncertainties of the widely used Okada and Mogi models. To do so, we designed a set of analogue experiments allowing us to explore different elastic properties of the medium and characteristics of the fluid injected into the medium, as well as the depth, size, and volume change of a reservoir. The associated surface deformation is extracted using an array of synchronised cameras, with digital image correlation and structure-from-motion for the horizontal and vertical deformation, respectively. The surface deformation is then inverted to retrieve the controlling parameters (e.g. location and volume change of a chamber, or orientation, position, length, breadth, and opening of a dyke). By comparing those results with the known parameters, which we can see and measure independently, we estimate the uncertainties of the models themselves, and the associated level of confidence for each of the inverted parameters.
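
    For reference, the forward Mogi point-source model whose uncertainty the experiments probe predicts a vertical surface displacement uz = (1 - nu)/pi * dV * d / (d^2 + r^2)^(3/2) for a volume change dV at depth d; the values below are illustrative.

      import numpy as np

      def mogi_uz(r, depth, dvol, nu=0.25):
          # Vertical displacement (m) at radial distance r from a Mogi point source.
          return (1.0 - nu) / np.pi * dvol * depth / (depth**2 + r**2) ** 1.5

      r = np.linspace(0.0, 5000.0, 6)                    # radial distance (m)
      print(mogi_uz(r, depth=2000.0, dvol=1e6))          # uplift for dV = 1e6 m^3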

  18. Final report on the EURAMET.M.FF-K4.2.2014 volume comparison at 100 μL—calibration of micropipettes

    NASA Astrophysics Data System (ADS)

    Batista, Elsa; Matus, Michael; Metaxiotou, Zoe; Tudor, Maria; Lenard, Elzbieta; Buker, Oliver; Wennergren, Per; Piluri, Erinda; Miteva, Mariana; Vicarova, Martina; Vospĕlová, Alena; Turnsek, Urska; Micic, Ljiljana; Grue, Lise-Lote; Mihailovic, Mirjana; Sarevska, Anastazija

    2017-01-01

    During the EURAMET TC-F meeting of 2014, and following the finalization of the CCM.FF-K4.2.2011 comparison, it was agreed to start a Regional Key Comparison (KC) on volume measurements using two 100 μL micropipettes (piston pipettes), allowing the participating laboratories to assess the agreement of their results and uncertainties. Two 100 μL micropipettes were tested by 15 participants. One participant was not a member or associate member of the BIPM and was removed from this report. The comparison started in July 2015 and ended in March 2016. The Volume and Flow Laboratory of the Portuguese Institute for Quality (IPQ) was the pilot laboratory and performed the initial and final measurements of the micropipettes. The micropipettes showed a stable volume during the whole comparison, which was confirmed by the results from the pilot laboratory. The original results of all participant NMIs were corrected to standard atmospheric pressure in order to compare results under the same calibration conditions, and the 'process-related handling' contribution was added to the uncertainty budget of each participant. In general, the declared CMCs are in accordance with the KCDB. For micropipette 354828Z, two laboratories had inconsistent results. For micropipette 354853Z, three laboratories had inconsistent results. Main text: To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).
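
    Consistency in key comparisons is commonly checked with the normalized error En = (x_lab - x_ref)/sqrt(U_lab^2 + U_ref^2) using expanded (k = 2) uncertainties, with |En| <= 1 indicating agreement. This is offered as a generic sketch (the report's own analysis may differ), and the microlitre values are hypothetical.

      import math

      def en_number(x_lab, u_lab, x_ref, u_ref):
          # Normalized error; U values are expanded (k = 2) uncertainties.
          return (x_lab - x_ref) / math.sqrt(u_lab**2 + u_ref**2)

      print(en_number(x_lab=100.12, u_lab=0.20, x_ref=100.02, u_ref=0.15))  # 0.4 -> consistent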

  19. Facial biometrics of peri-oral changes in Crohn's disease.

    PubMed

    Zou, L; Adegun, O K; Willis, A; Fortune, Farida

    2014-05-01

    Crohn's disease is a chronic relapsing and remitting inflammatory condition which can affect any part of the gastrointestinal tract. In the oro-facial region, patients can present with peri-oral swellings, which result in severe facial disfigurement. To date, assessing the degree of facial change and evaluating treatment outcomes has relied on clinical observation and semi-quantitative methods. In this paper, we describe the development of a robust and reproducible measurement strategy using 3-D facial biometrics to objectively quantify the extent and progression of oro-facial Crohn's disease. Using facial laser scanning, 32 serial images from 13 Crohn's patients attending the Oral Medicine clinic were acquired during relapse, remission, and post-treatment phases. Utilising theories of coordinate metrology, the facial images were subjected to registration, identification of regions of interest, and reproducible repositioning prior to obtaining volume measurements. To quantify the changes in tissue volume, scan images from consecutive appointments were compared to the baseline (first scan image). A reproducibility test was performed to ascertain the degree of uncertainty in the volume measurements. 3-D facial biometric imaging is a reliable method to identify and quantify peri-oral swelling in Crohn's patients. Comparison of facial scan images at different phases of the disease precisely revealed profile and volume changes. The volume measurements were highly reproducible, as judged from the 1% standard deviation. 3-D facial biometric measurement in Crohn's patients with oro-facial involvement offers a quick, robust, economical, and objective approach for guided therapeutic intervention and routine assessment of treatment efficacy in the clinic.

  20. Computational Fluid Dynamics Best Practice Guidelines in the Analysis of Storage Dry Cask

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zigh, A.; Solis, J.

    2008-07-01

    Computational fluid dynamics (CFD) methods are used to evaluate the thermal performance of a dry cask under long-term storage conditions in accordance with NUREG-1536 [NUREG-1536, 1997]. A three-dimensional CFD model was developed and validated using data for a ventilated storage cask (VSC-17) collected by Idaho National Laboratory (INL). The developed Fluent CFD model was validated to minimize the modeling and application uncertainties. To address modeling uncertainties, the paper focused on turbulence modeling of buoyancy-driven air flow. Similarly, for the application uncertainties, the pressure boundary conditions used to model the air inlet and outlet vents were investigated and validated. Different turbulence models were used to reduce the modeling uncertainty in the CFD simulation of the air flow through the annular gap between the overpack and the multi-assembly sealed basket (MSB). Among the chosen turbulence models, the validation showed that the low-Reynolds k-ε and the transitional k-ω turbulence models predicted the measured temperatures closely. To assess the impact of the pressure boundary conditions used at the air inlet and outlet channels on the application uncertainties, a sensitivity analysis of the operating density was undertaken. For convergence purposes, all available commercial CFD codes include the operating density in the pressure-gradient term of the momentum equation. The validation showed that the correct operating density corresponds to the density evaluated at the air inlet conditions of pressure and temperature. Next, the validated CFD method was used to predict the thermal performance of an existing dry cask storage system. The evaluation uses two distinct models: a three-dimensional and an axisymmetric representation of the cask. In the 3-D model, porous media was used to model only the volume occupied by the rodded region that is surrounded by the BWR channel box. In the axisymmetric model, porous media was used to model the entire region that encompasses the fuel assemblies as well as the gaps in between. Consequently, a larger volume is represented by porous media in the second model; hence, a higher frictional flow resistance is introduced in the momentum equations. The conservatism and the safety margins of these models were compared to assess the applicability and the realism of the two models. The three-dimensional model included fewer geometry simplifications and is recommended, as it predicted less conservative fuel cladding temperature values while still assuring the existence of adequate safety margins. (authors)

  1. Assessing Degree of Susceptibility to Landslide Hazard

    NASA Astrophysics Data System (ADS)

    Sheridan, M. F.; Cordoba, G. A.; Delgado, H.; Stefanescu, R.

    2013-05-01

    The modeling of hazardous mass flows, both dry and water-saturated, is currently an area of active research, and several stable models have now emerged that have differing degrees of physical and mathematical fidelity. Models based on the early work of Savage and Hutter (1989) assume that very large dense granular flows can be modeled as incompressible continua governed by a Coulomb failure criterion. Based on this concept, Patra et al. (2005) developed a code for dry avalanches, which proposes a thin-layer mathematical model similar to the shallow-water equations. This concept was implemented in the widely used TITAN2D program, which integrates the shock-capturing Godunov solution methodology for the equation system. We propose a method, using the TITAN2D code, to assess the susceptibility of specific locations to landslides following heavy tephra fall. Successful application requires framing the range of several uncertainties in the selection of model input data: (1) initial conditions, such as the volume and location of origin of the landslide, (2) bed and internal friction parameters, and (3) digital elevation model (DEM) uncertainties. Among the possible ways of coping with these uncertainties, we chose to use Latin Hypercube Sampling (LHS). This statistical technique reduces a computationally intractable problem to such an extent that it is possible to apply it even with current personal computers. LHS requires that there be only one sample in each row and each column of the sampling matrix, where each (multi-dimensional) row corresponds to one uncertainty. LHS requires fewer than 10% of the sample runs needed by Monte Carlo approaches to achieve a stable solution. In our application, the LHS output provides model samples for 4 input parameters: initial random volumes, UTM location (x and y), and bed friction; a sketch of this step follows below. We developed a simple Octave script to link the output of LHS with TITAN2D. In this way, TITAN2D can be run many times with successively different initial conditions provided by the LHS routine. Finally, the set of results from TITAN2D is processed to obtain the distribution of the maximum exceedance probability, given that a landslide occurs at a place of interest. We apply this method to find the sectors least prone to be affected by landslides in a region along the Panamerican Highway in the southern part of Colombia. The goal of such a study is to help decision makers improve their assessments regarding permission for development along the highway.
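
    A sketch of the LHS step referenced above, using scipy's qmc module rather than the authors' Octave script; the parameter bounds are hypothetical.

      from scipy.stats import qmc

      sampler = qmc.LatinHypercube(d=4, seed=0)
      unit = sampler.random(n=64)            # 64 runs; one stratum per row and column

      #        volume (m^3)  UTM x     UTM y     bed friction (deg)
      lower = [1.0e4,        2.30e5,   1.050e6,  10.0]
      upper = [5.0e5,        2.35e5,   1.100e6,  25.0]
      runs = qmc.scale(unit, lower, upper)   # each row = one TITAN2D input set
      print(runs[:2])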

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas for temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture, and evaporation for the historical period (1976-2005) and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation, because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  3. Western oil shale development: a technology assessment. Volume 1. Main report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1981-11-01

    The general goal of this study is to present the prospects of shale oil within the context of (1) environmental constraints, (2) available natural and economic resources, and (3) the characteristics of existing and emerging technology. The objectives are: to review shale oil technologies objectively as a means of supplying domestically produced fuels within environmental, social, economic, and legal/institutional constraints; using available data, analyses, and experienced judgment, to examine the major points of uncertainty regarding potential impacts of oil shale development; to resolve issues where data and analyses are compelling or where conclusions can be reached on judgmental grounds; to specify issues which cannot be resolved on the basis of the data, analyses, and experienced judgment currently available; and, when appropriate and feasible, to suggest ways to remove existing uncertainties that stand in the way of resolving outstanding issues.

  4. Patient volume, staffing, and workload in relation to risk-adjusted outcomes in a random stratified sample of UK neonatal intensive care units: a prospective evaluation.

    PubMed

    Tucker, Janet

    2002-01-12

    UK recommendations suggest that large neonatal intensive-care units (NICUs) have better outcomes than small units, although this suggestion remains unproven. We assessed whether patient volume, staffing levels, and workload are associated with risk-adjusted outcomes, and with costs or staff wellbeing. 186 UK NICUs were stratified according to volume of patients, nursing provision, and neonatal consultant provision. Primary outcomes were hospital mortality, mortality or cerebral damage, and nosocomial bacteraemia. We studied 13515 infants of all birthweights consecutively admitted to 54 randomly selected NICUs. Multiple logistic regression analyses were done with every primary outcome as the dependent variable. Staff wellbeing and stress were assessed by anonymous mental health index (MHI)-5 questionnaires. Data were available for 13334 (99%) infants. High-volume NICUs treated the sickest infants and had highest crude mortality. Risk-adjusted mortality and mortality or cerebral damage were unrelated to patient volume or staffing provision; however, nosocomial bacteraemia was less frequent in NICUs with low neonatal consultant provision (odds ratio 0.65, 95% CI 0.43-0.98). Mortality was raised with increasing workload in all types of NICUs. Infants admitted at full capacity versus half capacity were about 50% more likely to die, but there was wide uncertainty around this estimate. Most staff had MHI-5 scores that suggested good mental health. The implications of this report for staffing policy, medicolegal risk management, and ethical practice remain to be tested. Centralisation of only the sickest infants could improve efficiency, provided that this does not create excessive workload for staff. Assessment of increased staffing levels that are closer to those in adult intensive care might be appropriate.

  5. Volume Measurements of Laser-generated Pits for In Situ Geochronology using KArLE (Potassium-Argon Laser Experiment)

    NASA Technical Reports Server (NTRS)

    French, R. A.; Cohen, B. A.; Miller, J. S.

    2014-01-01

    The Potassium-Argon Laser Experiment (KArLE) is composed of two main instruments: a spectrometer used for Laser-Induced Breakdown Spectroscopy (LIBS) and a mass spectrometer (MS). The LIBS laser ablates a sample and creates a plasma cloud, generating a pit in the sample. The LIBS plasma is measured for K abundance in weight percent, and the released gas is measured using the MS, which yields the Ar abundance in moles. To relate the K and Ar measurements, the total mass of the ablated sample is needed but can be difficult to measure directly. Instead, density and volume are used to calculate mass, where density is calculated from the elemental composition of the rock (from the emission spectrum) and volume is determined from the pit morphology. This study aims to reduce the uncertainty in KArLE by analyzing pit volume relationships in several analog materials and comparing methods of pit volume measurement and their associated uncertainties.
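
    The mass bookkeeping described above reduces to mass = density × volume, after which the K measurement can be expressed in moles for pairing with the MS argon measurement. A toy sketch with invented values; the pit volume, density, and K weight fraction are illustrative only:

```python
# Toy illustration: ablated mass from pit volume and composition-based
# density, then moles of K. All numbers are invented placeholders.
M_K = 39.0983  # molar mass of potassium, g/mol

def ablated_mass_g(pit_volume_cm3: float, density_g_cm3: float) -> float:
    """Mass of the ablated sample from pit volume and rock density."""
    return pit_volume_cm3 * density_g_cm3

mass = ablated_mass_g(pit_volume_cm3=2.0e-6, density_g_cm3=2.7)  # ~5 micrograms
k_mol = (mass * 0.015) / M_K  # assuming 1.5 wt% K from the LIBS spectrum
print(f"ablated mass = {mass:.2e} g, K = {k_mol:.2e} mol")
```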

  6. The molar volume /density/ of solid oxygen in equilibrium with vapor

    NASA Technical Reports Server (NTRS)

    Roder, H. M.

    1979-01-01

    Data from the literature on the molar volume of solid oxygen have been compiled and critically analyzed. A correlated and thermodynamically consistent set of molar volumes, including the volume changes at the various solid phase transitions, is presented. Evidence for the existence of a delta-solid phase is reviewed. Uncertainties in the data and in the recommended set of values are discussed.

  7. Coverage-based constraints for IMRT optimization

    NASA Astrophysics Data System (ADS)

    Mescher, H.; Ulrich, S.; Bangert, M.

    2017-09-01

    Radiation therapy treatment planning requires an incorporation of uncertainties in order to guarantee adequate irradiation of the tumor volumes. In current clinical practice, uncertainties are accounted for implicitly by expanding the target volume according to generic margin recipes. Alternatively, it is possible to account for uncertainties explicitly during treatment planning by minimizing objectives that describe worst-case treatment scenarios, the expectation value of the treatment, or the coverage probability of the target volumes. In this note we show that approaches relying on objectives to induce a specific coverage of the clinical target volumes are inevitably sensitive to variation of the relative weighting of the objectives. To address this issue, we introduce coverage-based constraints for intensity-modulated radiation therapy (IMRT) treatment planning. Our implementation follows the concept of coverage-optimized planning, which considers explicit error scenarios to calculate and optimize patient-specific probabilities q(d, v) of covering a specific target volume fraction v with a certain dose d. Using a constraint-based reformulation of coverage-based objectives, we eliminate the trade-off between coverage and competing objectives during treatment planning. In-depth convergence tests including 324 treatment plan optimizations demonstrate the reliability of coverage-based constraints for varying levels of probability, dose, and volume. General clinical applicability of coverage-based constraints is demonstrated for two cases. A sensitivity analysis of penalty variations within this planning study, comparing IMRT treatment planning using (1) coverage-based constraints, (2) coverage-based objectives, (3) probabilistic optimization, (4) robust optimization, and (5) conventional margins, illustrates the potential benefit of coverage-based constraints, which do not require tedious adjustment of target volume objectives.
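
    The coverage probability q(d, v) above can be estimated empirically from explicit error scenarios as the fraction of scenarios in which at least a volume fraction v of the target receives dose d or more. A minimal sketch, with random placeholder values standing in for scenario dose calculations:

```python
# Empirical coverage probability q(d, v) over explicit error scenarios.
# The scenario dose matrix is a random placeholder for recomputed dose cubes.
import numpy as np

rng = np.random.default_rng(0)
n_scenarios, n_voxels = 100, 5000
scenario_dose = rng.normal(60.0, 2.0, size=(n_scenarios, n_voxels))  # Gy

def coverage_probability(dose: np.ndarray, d: float, v: float) -> float:
    covered = (dose >= d).mean(axis=1)   # per-scenario volume fraction at >= d
    return float((covered >= v).mean())  # fraction of scenarios reaching v

print(coverage_probability(scenario_dose, d=57.0, v=0.95))
```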

  8. Runoff measurements and hydrological modelling for the estimation of rainfall volumes in an Alpine basin

    NASA Astrophysics Data System (ADS)

    Ranzi, R.; Bacchi, B.; Grossi, G.

    2003-01-01

    Streamflow data and water levels in reservoirs have been collected at 30 recording sites in the Toce river basin and its surroundings, upstream of Lago Maggiore, one of the target areas of the Mesoscale Alpine Programme (MAP) experiment. These data have been used for two purposes: firstly, the verification of a hydrological model, forced by rain-gauge data and the output of a mesoscale meteorological model, for flood simulation and forecasting; secondly, to solve an inverse problem--to estimate rainfall volumes from the runoff data in mountain areas where the influence of orography and the limits of actual monitoring systems prevent accurate measurement of precipitation. The methods are illustrated for 19-20 September 1999, MAP Intensive Observing Period 2b, an event with a 4-year return period for the Toce river basin. Uncertainties in the estimates of the areal rainfall volumes based on rain-gauge data and via the inverse solution are assessed.

  9. SU-E-T-429: Uncertainties of Cell Surviving Fractions Derived From Tumor-Volume Variation Curves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chvetsov, A

    2014-06-01

    Purpose: To evaluate uncertainties of the cell surviving fraction reconstructed from tumor-volume variation curves during radiation therapy, using sensitivity analysis based on linear perturbation theory. Methods: The time-dependent tumor-volume functions V(t) were calculated using a two-level cell population model based on the separation of the entire tumor cell population into two subpopulations: oxygenated viable cells and lethally damaged cells. The sensitivity function is defined as S(t) = [δV(t)/V(t)]/[δx/x], where δV(t)/V(t) is the time-dependent relative variation of the volume V(t) and δx/x is the relative variation of the radiobiological parameter x. The sensitivity analysis was performed using the direct perturbation method, in which the radiobiological parameter x is changed by a certain error and the tumor volume is recalculated to evaluate the corresponding tumor-volume variation. Tumor-volume variation curves and sensitivity functions were computed for different values of the cell surviving fraction in the practically important interval S2 = 0.1-0.7 using the two-level cell population model. Results: The sensitivity function of tumor volume to cell surviving fraction reached a relatively large value of 2.7 for S2 = 0.7 and approached zero as S2 approached zero. Assuming a systematic error of 3-4%, we obtain that the relative error in S2 is less than 20% in the range S2 = 0.4-0.7. This result is important because large values of S2, which are associated with poor treatment outcome, should be measurable with relatively small uncertainties. For very small values of S2 < 0.3, the relative error can be larger than 20%; however, the absolute error does not increase significantly. Conclusion: Tumor-volume curves measured during radiotherapy can be used to evaluate the cell surviving fractions usually observed in radiation therapy with conventional fractionation.
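
    A minimal sketch of the direct perturbation step: perturb the radiobiological parameter x by a small relative amount, recompute V(t), and form S(t) = [δV(t)/V(t)]/[δx/x]. The volume model below is an invented exponential-decay placeholder, not the two-level cell population model of the abstract:

```python
# Direct-perturbation estimate of the sensitivity function S(t).
# tumor_volume() is a stand-in: s2^t decay plus a residual component.
import numpy as np

def tumor_volume(t, s2):
    return np.exp(np.log(s2) * t) + 0.1   # placeholder V(t), t in fractions

def sensitivity(t, x, rel_step=0.01):
    v0 = tumor_volume(t, x)
    v1 = tumor_volume(t, x * (1.0 + rel_step))
    return ((v1 - v0) / v0) / rel_step    # [dV/V] / [dx/x]

t = np.linspace(0, 10, 11)
print(sensitivity(t, x=0.7))
```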

  10. Where do uncertainties reside within environmental risk assessments? Testing UnISERA, a guide for uncertainty assessment.

    PubMed

    Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T

    2017-06-01

    A means for identifying and prioritising the treatment of uncertainty (UnISERA) in environmental risk assessments (ERAs) is tested, using three risk domains where ERA is an established requirement and one in which ERA practice is emerging. UnISERA's development draws on 19 expert elicitations across genetically modified higher plants, particulate matter, and agricultural pesticide release, and it is stress-tested here for engineered nanomaterials (ENM). We are concerned with the severity of uncertainty, its nature, and its location across four accepted stages of ERAs. On an established uncertainty scale, the risk characterisation stage of ERA harbours the highest severity level of uncertainty, associated with estimating, aggregating and evaluating expressions of risk. Combined epistemic and aleatory uncertainty is the dominant nature of uncertainty, and the dominant location of uncertainty is associated with data in problem formulation, exposure assessment and effects assessment. Testing UnISERA produced agreements of 55%, 90%, and 80% for the severity level, nature, and location dimensions of uncertainty between the combined case studies and the ENM stress test. UnISERA enables environmental risk analysts to prioritise risk assessment phases, groups of tasks, or individual ERA tasks, and it can direct them towards established methods for uncertainty treatment. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  11. Dealing with uncertainties in environmental burden of disease assessment

    PubMed Central

    2009-01-01

    Disability Adjusted Life Years (DALYs) combine the number of people affected by disease or mortality in a population and the duration and severity of their condition into one number. The environmental burden of disease is the number of DALYs that can be attributed to environmental factors. Environmental burden of disease estimates enable policy makers to evaluate, compare and prioritize dissimilar environmental health problems or interventions. These estimates often have various uncertainties and assumptions which are not always made explicit. Besides statistical uncertainty in input data and parameters – which is commonly addressed – a variety of other types of uncertainties may substantially influence the results of the assessment. We have reviewed how different types of uncertainties affect environmental burden of disease assessments, and we give suggestions as to how researchers could address these uncertainties. We propose the use of an uncertainty typology to identify and characterize uncertainties. Finally, we argue that uncertainties need to be identified, assessed, reported and interpreted in order for assessment results to adequately support decision making. PMID:19400963

  12. Fishing in the Water: Effect of Sampled Water Volume on Environmental DNA-Based Detection of Macroinvertebrates.

    PubMed

    Mächler, Elvira; Deiner, Kristy; Spahn, Fabienne; Altermatt, Florian

    2016-01-05

    Accurate detection of organisms is crucial for the effective management of threatened and invasive species because false detections directly affect the implementation of management actions. The use of environmental DNA (eDNA) as a species detection tool is in a rapid development stage; however, concerns about accurate detections using eDNA have been raised. We evaluated the effect of sampled water volume (0.25 to 2 L) on the detection rate for three macroinvertebrate species. Additionally, we tested (depending on the sampled water volume) what amount of total extracted DNA should be screened to reduce uncertainty in detections. We found that all three species were detected in all volumes of water. Surprisingly, however, only one species had a positive relationship between an increased sample volume and an increase in the detection rate. We conclude that the optimal sample volume might depend on the species-habitat combination and should be tested for the system where management actions are warranted. Nevertheless, we minimally recommend sampling water volumes of 1 L and screening at least 14 μL of extracted eDNA for each sample to reduce uncertainty in detections when studying macroinvertebrates in rivers and using our molecular workflow.

  13. Paleozoic shale gas resources in the Sichuan Basin, China

    USGS Publications Warehouse

    Potter, Christopher J.

    2018-01-01

    The Sichuan Basin, China, is commonly considered to contain the world’s most abundant shale gas resources. Although its Paleozoic marine shales share many basic characteristics with successful United States gas shales, numerous geologic uncertainties exist, and Sichuan Basin shale gas production is nascent. Gas retention was likely compromised by the age of the shale reservoirs, multiple uplifts and orogenies, and migration pathways along unconformities. High thermal maturities raise questions about gas storage potential in lower Paleozoic shales. Given these uncertainties, a new look at Sichuan Basin shale gas resources is advantageous. As part of a systematic effort to quantitatively assess continuous oil and gas resources in priority basins worldwide, the US Geological Survey (USGS) completed an assessment of Paleozoic shale gas in the Sichuan Basin in 2015. Three organic-rich marine Paleozoic shale intervals meet the USGS geologic criteria for quantitative assessment of shale gas resources: the lower Cambrian Qiongzhusi Formation, the uppermost Ordovician Wufeng through lowermost Silurian Longmaxi Formations (currently producing shale gas), and the upper Permian Longtan and Dalong Formations. This study defined geologically based assessment units and calculated probabilistic distributions of technically recoverable shale gas resources using the USGS well productivity–based method. For six assessment units evaluated in 2015, the USGS estimated a mean value of 23.9 tcf (677 billion cubic meters) of undiscovered, technically recoverable shale gas. This result is considerably lower than volumes calculated in previous shale gas assessments of the Sichuan Basin, highlighting a need for caution in this geologically challenging setting.

  14. Impact of 4D image quality on the accuracy of target definition.

    PubMed

    Nielsen, Tine Bjørn; Hansen, Christian Rønn; Westberg, Jonas; Hansen, Olfred; Brink, Carsten

    2016-03-01

    Delineation accuracy of target shape and position depends on the image quality. This study investigates whether the image quality of standard 4D systems has an influence comparable to the overall delineation uncertainty. A moving lung target was imaged using a dynamic thorax phantom on three different 4D computed tomography (CT) systems and a 4D cone beam CT (CBCT) system using pre-defined clinical scanning protocols. Peak-to-peak motion and target volume were registered using rigid registration and automatic delineation, respectively. A spatial distribution of the imaging uncertainty was calculated as the distance deviation between the imaged target and the true target shape. The measured motions were smaller than the actual motions, and the imaged target volume differed between respiration phases. Imaging uncertainties of >0.4 cm were measured in the motion direction, showing a large distortion of the imaged target shape. Imaging uncertainties of standard 4D systems are of similar size to typical GTV-CTV expansions (0.5-1 cm) and contribute considerably to the target definition uncertainty. Optimising and validating 4D systems is recommended in order to obtain the most accurate imaged target shape.

  15. The influence of patient positioning uncertainties in proton radiotherapy on proton range and dose distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liebl, Jakob, E-mail: jakob.liebl@medaustron.at; Francis H. Burr Proton Therapy Center, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts 02114; Department of Therapeutic Radiology and Oncology, Medical University of Graz, 8036 Graz

    2014-09-15

    Purpose: Proton radiotherapy allows radiation treatment delivery with high dose gradients. The nature of such dose distributions increases the influence of patient positioning uncertainties on their fidelity when compared to photon radiotherapy. The present work quantitatively analyzes the influence of setup uncertainties on proton range and dose distributions. Methods: Thirty-eight clinical passive scattering treatment fields for small lesions in the head were studied. Dose distributions for shifted and rotated patient positions were Monte Carlo-simulated. Proton range uncertainties at the 50%- and 90%-dose falloff positions were calculated considering 18 arbitrary combinations of maximal patient position shifts and rotations for two patient positioning methods. Normal tissue complication probabilities (NTCPs), equivalent uniform doses (EUDs), and tumor control probabilities (TCPs) were studied for organs at risk (OARs) and target volumes of eight patients. Results: The authors identified a median 1σ proton range uncertainty at the 50%-dose falloff of 2.8 mm for anatomy-based patient positioning and 1.6 mm for fiducial-based patient positioning, as well as 7.2 and 5.8 mm at the 90%-dose falloff position, respectively. These range uncertainties were correlated with heterogeneity indices (HIs) calculated for each treatment field (38% < R² < 50%). An NTCP increase of more than 10% (absolute) was observed for less than 2.9% (anatomy-based positioning) and 1.2% (fiducial-based positioning) of the studied OARs and patient shifts. For target volumes, TCP decreases of more than 10% (absolute) occurred in less than 2.2% of the considered treatment scenarios for anatomy-based patient positioning and were nonexistent for fiducial-based patient positioning. EUD changes for target volumes were up to 35% (anatomy-based positioning) and 16% (fiducial-based positioning). Conclusions: The influence of patient positioning uncertainties on proton range in therapy of small lesions in the human brain, as well as on target and OAR dosimetry, was studied. Observed range uncertainties were correlated with HIs. The clinical practice of using multiple fields with smeared compensators while avoiding distal OAR sparing is considered to be safe.

  16. The influence of patient positioning uncertainties in proton radiotherapy on proton range and dose distributions

    PubMed Central

    Liebl, Jakob; Paganetti, Harald; Zhu, Mingyao; Winey, Brian A.

    2014-01-01

    Purpose: Proton radiotherapy allows radiation treatment delivery with high dose gradients. The nature of such dose distributions increases the influence of patient positioning uncertainties on their fidelity when compared to photon radiotherapy. The present work quantitatively analyzes the influence of setup uncertainties on proton range and dose distributions. Methods: Thirty-eight clinical passive scattering treatment fields for small lesions in the head were studied. Dose distributions for shifted and rotated patient positions were Monte Carlo-simulated. Proton range uncertainties at the 50%- and 90%-dose falloff positions were calculated considering 18 arbitrary combinations of maximal patient position shifts and rotations for two patient positioning methods. Normal tissue complication probabilities (NTCPs), equivalent uniform doses (EUDs), and tumor control probabilities (TCPs) were studied for organs at risk (OARs) and target volumes of eight patients. Results: The authors identified a median 1σ proton range uncertainty at the 50%-dose falloff of 2.8 mm for anatomy-based patient positioning and 1.6 mm for fiducial-based patient positioning, as well as 7.2 and 5.8 mm at the 90%-dose falloff position, respectively. These range uncertainties were correlated with heterogeneity indices (HIs) calculated for each treatment field (38% < R² < 50%). An NTCP increase of more than 10% (absolute) was observed for less than 2.9% (anatomy-based positioning) and 1.2% (fiducial-based positioning) of the studied OARs and patient shifts. For target volumes, TCP decreases of more than 10% (absolute) occurred in less than 2.2% of the considered treatment scenarios for anatomy-based patient positioning and were nonexistent for fiducial-based patient positioning. EUD changes for target volumes were up to 35% (anatomy-based positioning) and 16% (fiducial-based positioning). Conclusions: The influence of patient positioning uncertainties on proton range in therapy of small lesions in the human brain, as well as on target and OAR dosimetry, was studied. Observed range uncertainties were correlated with HIs. The clinical practice of using multiple fields with smeared compensators while avoiding distal OAR sparing is considered to be safe. PMID:25186386

  17. Objective rapid delineation of areas at risk from block-and-ash pyroclastic flows and surges

    USGS Publications Warehouse

    Widiwijayanti, C.; Voight, B.; Hidayat, D.; Schilling, S.P.

    2009-01-01

    Assessments of pyroclastic flow (PF) hazards are commonly based on mapping of PF and surge deposits and estimations of inundation limits, and/or computer models of varying degrees of sophistication. In volcanic crises a PF hazard map may be sorely needed, but limited time, exposures, or safety aspects may preclude fieldwork, and insufficient time or baseline data may be available for reliable dynamic simulations. We have developed a statistically constrained simulation model for block-and-ash type PFs to estimate potential areas of inundation by adapting methodology from Iverson et al. (Geol Soc America Bull 110:972-984, 1998) for lahars. The predictive equations for block-and-ash PFs are calibrated with data from several volcanoes and given by A = (0.05 to 0.1) V^(2/3) and B = (35 to 40) V^(2/3), where A is the cross-sectional area of inundation, B is the planimetric area, and V is the deposit volume. The proportionality coefficients were obtained from regression analyses and comparison of simulations to mapped deposits. The method embeds the predictive equations in a GIS program coupled with DEM topography, using the LAHARZ program of Schilling (1998). Although the method is objective and reproducible, any PF hazard zone so computed should be considered an approximate guide only, due to uncertainties in the coefficients applicable to individual PFs, the accuracy of DEM details, and the volume of future collapses. The statistical uncertainty of the predictive equations, which implies a factor of two or more in predicting A or B for a specified V, is superposed on the uncertainty of forecasting V for the next PF to descend a particular valley. Multiple inundation zones, produced by simulations using a selected range of volumes, partly accommodate these uncertainties. The resulting maps show graphically that PF inundation potentials are highest nearest volcano sources and along valley thalwegs, and diminish with distance from source and lateral distance from thalweg. The model does not explicitly consider dynamic behavior, which can be important. Ash-cloud surge impact limits must be extended beyond PF hazard zones, and we provide several approaches to do this. The method has been used to supply PF and surge hazard maps in two crises: Merapi 2006 and Montserrat 2006-2007. © Springer-Verlag 2008.
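
    The calibrated predictive equations quoted above are simple power laws in the deposit volume, so nested hazard zones follow from evaluating them over a range of volumes. A minimal sketch using coefficients from the calibrated ranges in the abstract; the volumes are arbitrary examples:

```python
# A = c_a * V^(2/3) and B = c_b * V^(2/3); the upper-end coefficients
# from the abstract's calibrated ranges are used here for illustration.
def inundation(volume_m3: float, c_a: float = 0.1, c_b: float = 40.0):
    """Cross-sectional (A) and planimetric (B) inundation areas, m^2."""
    v23 = volume_m3 ** (2.0 / 3.0)
    return c_a * v23, c_b * v23

for v in (1e5, 1e6, 1e7):  # example deposit volumes, m^3
    a, b = inundation(v)
    print(f"V = {v:.0e} m^3 -> A = {a:.0f} m^2, B = {b:.0f} m^2")
```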

  18. Water Footprint and Water Consumption for the Main Crops and Biofuels Produced in Brazil

    NASA Astrophysics Data System (ADS)

    Sun, Y.; Tong, C.; Mansoor, K.; Carroll, S.

    2011-12-01

    The risk of CO2 leakage into shallow aquifers through various pathways, such as faults and abandoned wells, is a concern for CO2 geological sequestration. If a leak is detected in an aquifer system, a contingency plan is required to manage the CO2 storage and to protect the groundwater source. Among many remediation and mitigation strategies, the simplest is to stop CO2 leakage at a wellbore. Therefore, it is necessary to address whether and when CO2 leaks should be sealed, and how much risk can thereby be mitigated. In the presence of various uncertainties, including geological-structure uncertainty and parametric uncertainty, the risk of CO2 leakage into an aquifer needs to be assessed with probabilistic distributions of the uncertain parameters. In this study, we developed an integrated model to simulate multiphase flow of CO2 and brine in a deep storage reservoir, through a leaky well at an uncertain location, and subsequently multicomponent reactive transport in a shallow aquifer. Each sub-model covers its domain-specific physics. Uncertainties of geological structure and parameters are considered together with decision variables (CO2 injection rate and mitigation time) for risk assessment of the leakage-impacted aquifer volume. High-resolution and less expensive reduced-order models (ROMs) of the risk profiles are approximated as polynomial functions of the decision variables and all uncertain parameters. These reduced-order models are then used in place of computationally expensive numerical models for future decision-making on whether and when the leaky well is sealed. The trade-off between CO2 storage capacity in the reservoir and the leakage-induced risk in the aquifer is evaluated. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Y.; Tong, C.; Trainor-Guitten, W. J.

    The risk of CO2 leakage from a deep storage reservoir into a shallow aquifer through a fault is assessed and studied using physics-specific computer models. The hypothetical CO2 geological sequestration system is composed of three subsystems: a deep storage reservoir, a fault in the caprock, and a shallow aquifer, each modeled with its sub-domain-specific physics. Supercritical CO2 is injected into the reservoir subsystem with uncertain permeabilities of the reservoir, caprock, and aquifer, an uncertain fault location, and the injection rate as a decision variable. The simulated pressure and CO2/brine saturation are connected to the fault-leakage model as a boundary condition. CO2 and brine fluxes from the fault-leakage model at the fault outlet are then imposed on the aquifer model as a source term. Moreover, uncertainties are propagated from the deep reservoir model, to the fault-leakage model, and eventually to the geochemical model in the shallow aquifer, thus contributing to the risk profiles. To quantify the uncertainties and assess leakage-relevant risk, we propose a global sampling-based method to allocate sub-dimensions of uncertain parameters to sub-models. The risk profiles are defined and related to CO2 plume development for pH value and total dissolved solids (TDS) below the EPA's Maximum Contaminant Levels (MCL) for drinking water quality. A global sensitivity analysis is conducted to identify the parameters to which the risk profiles are most sensitive. The resulting uncertainty of the pH- and TDS-defined aquifer volume impacted by CO2 and brine leakage mainly results from the uncertainty of the fault permeability. Subsequently, high-resolution, reduced-order models of the risk profiles are developed as functions of all the decision variables and uncertain parameters in all three subsystems.

  20. The effect of rainfall measurement uncertainties on rainfall-runoff processes modelling.

    PubMed

    Stransky, D; Bares, V; Fatka, P

    2007-01-01

    Rainfall data are a crucial input for various tasks concerning wet weather periods. Nevertheless, their measurement is affected by random and systematic errors that cause an underestimation of the rainfall volume. The general objective of the presented work was therefore to assess the credibility of measured rainfall data and to evaluate the effect of measurement errors on urban drainage modelling tasks. Within the project, a calibration methodology for tipping bucket rain gauges (TBRs) was defined and assessed in terms of uncertainty analysis. A set of 18 TBRs was calibrated and the results were compared to a previous calibration, which made it possible to evaluate the ageing of the TBRs. The propagation of calibration and other systematic errors through a rainfall-runoff model was performed on an experimental catchment. It was found that TBR calibration is important mainly for tasks connected with the assessment of peak values and high-flow durations. Omitting calibration leads to an underestimation of up to 30%, and other systematic errors can add a further 15%. TBR calibration should be repeated every two years in order to keep pace with the ageing of the TBR mechanics. Further, the authors recommend adjusting the dynamic test duration in proportion to the generated rainfall intensity.

  1. Regarding the Focal Treatment of Prostate Cancer: Inference of the Gleason Grade From Magnetic Resonance Spectroscopic Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brame, Ryan S.; Zaider, Marco; Zakian, Kristen L.

    2009-05-01

    Purpose: To quantify, as a function of average magnetic resonance spectroscopy (MRS) score and tumor volume, the probability that a cancer-suspected lesion has an elevated Gleason grade. Methods and Materials: The data consist of MRS imaging ratios R stratified by patient, lesion (contiguous abnormal voxels), voxels, biopsy and pathologic Gleason grade, and lesion volume. The data were analyzed using a logistic model. Results: For both low and high Gleason score biopsy lesions, the probability of pathologic Gleason score ≥4+3 increases with lesion volume. At low values of R, a lesion volume of at least 15-20 voxels is needed to reach a probability of success of 80%; the biopsy result helps reduce the prediction uncertainty. At larger MRS ratios (R > 6) the biopsy result becomes essentially uninformative once the lesion volume is >12 voxels. With the exception of low values of R, for lesions with a low Gleason score at biopsy, the MRS ratios serve primarily as a selection tool for assessing lesion volumes. Conclusions: In patients with biopsy Gleason score ≥4+3, high MRS imaging tumor volume and (creatine + choline)/citrate ratio may justify the initiation of voxel-specific dose escalation. This is an example of biologically motivated focal treatment for which intensity-modulated radiotherapy and especially brachytherapy are ideally suited.
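
    A minimal sketch of the kind of logistic model the abstract fits: the probability of pathologic Gleason score ≥4+3 as a function of the MRS ratio R and lesion volume in voxels. The coefficients below are invented for illustration and are not the paper's fitted values:

```python
# Logistic model P(Gleason >= 4+3 | R, volume); coefficients are hypothetical.
import math

def p_high_grade(r: float, n_voxels: int,
                 b0: float = -4.0, b_r: float = 0.35, b_v: float = 0.15) -> float:
    z = b0 + b_r * r + b_v * n_voxels
    return 1.0 / (1.0 + math.exp(-z))

for r, v in [(3.0, 8), (6.0, 12), (6.0, 20)]:
    print(f"R={r}, volume={v} voxels -> P = {p_high_grade(r, v):.2f}")
```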

  2. Assessing and reporting uncertainties in dietary exposure analysis - Part II: Application of the uncertainty template to a practical example of exposure assessment.

    PubMed

    Tennant, David; Bánáti, Diána; Kennedy, Marc; König, Jürgen; O'Mahony, Cian; Kettler, Susanne

    2017-11-01

    A previous publication described methods for assessing and reporting uncertainty in dietary exposure assessments. This follow-up publication uses a case study to develop proposals for representing and communicating uncertainty to risk managers. The food ingredient aspartame is used as the case study in a simple deterministic model (the EFSA FAIM template) and with more sophisticated probabilistic exposure assessment software (FACET). Parameter and model uncertainties are identified for each modelling approach and tabulated. The relative importance of each source of uncertainty is then evaluated using a semi-quantitative scale, and the results are expressed using two different forms of graphical summary. The value of this approach in expressing uncertainties in a manner that is relevant to the exposure assessment and useful to risk managers is then discussed. It was observed that the majority of uncertainties are often associated with data sources rather than the model itself. However, differences in modelling methods can have the greatest impact on uncertainties overall, particularly when the underlying data are the same. It was concluded that improved methods for communicating uncertainties for risk management is the research area where future effort would best be placed. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Quantifying uncertainty in health impact assessment: a case-study example on indoor housing ventilation.

    PubMed

    Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M

    2014-01-01

    Quantitative health impact assessment (HIA) is increasingly being used to assess the health impacts attributable to an environmental policy or intervention. As a consequence, there is a need to assess uncertainties in the assessments because of the uncertainty in the HIA models. In this paper, a framework is developed to quantify the uncertainty in the health impacts of environmental interventions and is applied to evaluate the impacts of poor housing ventilation. The paper describes the development of the framework through three steps: (i) selecting the relevant exposure metric and quantifying the evidence of potential health effects of the exposure; (ii) estimating the size of the population affected by the exposure and selecting the associated outcome measure; (iii) quantifying the health impact and its uncertainty. The framework introduces a novel application for the propagation of uncertainty in HIA, based on fuzzy set theory. Fuzzy sets are used to propagate parametric uncertainty in a non-probabilistic space and are applied to calculate the uncertainty in the morbidity burdens associated with three indoor ventilation exposure scenarios: poor, fair and adequate. The case-study example demonstrates how the framework can be used in practice, to quantify the uncertainty in health impact assessment where there is insufficient information to carry out a probabilistic uncertainty analysis. © 2013.
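
    A minimal sketch of the fuzzy, non-probabilistic propagation the framework relies on, via alpha-cuts: each input is represented as a triangular fuzzy number, and for a model that increases monotonically in its inputs, the output interval at each membership level follows from interval arithmetic on the input cuts. All numbers are illustrative placeholders, not the case study's values:

```python
# Alpha-cut propagation of triangular fuzzy numbers through a monotone
# burden model (exposed population x risk per person). Values invented.
def alpha_cut(tri, alpha):
    lo, mode, hi = tri
    return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

def burden(exposed_pop, risk_per_person):
    return exposed_pop * risk_per_person  # monotone increasing in both inputs

pop = (9_000.0, 10_000.0, 12_000.0)   # affected population (low, mode, high)
risk = (0.001, 0.002, 0.004)          # cases per person-year (low, mode, high)

for alpha in (0.0, 0.5, 1.0):
    (p_lo, p_hi), (r_lo, r_hi) = alpha_cut(pop, alpha), alpha_cut(risk, alpha)
    print(f"alpha={alpha}: burden in "
          f"[{burden(p_lo, r_lo):.1f}, {burden(p_hi, r_hi):.1f}] cases/yr")
```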

  4. SU-E-J-159: Analysis of Total Imaging Uncertainty in Respiratory-Gated Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suzuki, J; Okuda, T; Sakaino, S

    Purpose: In respiratory-gated radiotherapy, the gating phase during treatment delivery needs to coincide with the corresponding phase determined during treatment planning. However, because treatment is delivered based on the images obtained for the treatment plan, the time delay, motion artifacts, volume effects, and resolution of those images introduce uncertainty. Imaging uncertainty is thus the most basic factor affecting localization accuracy, and these uncertainties should be analyzed. This study aims to analyze the total imaging uncertainty in respiratory-gated radiotherapy. Methods: Two sources of imaging uncertainty related to respiratory-gated radiotherapy were analyzed. First, CT images were used to determine the target volume and to perform 4D treatment planning for the Varian Real-time Position Management (RPM) system. Second, an X-ray image was acquired for image-guided radiotherapy (IGRT) with the BrainLAB ExacTrac system. These factors were measured using a respiratory gating phantom. The conditions applied during phantom operation were as follows: respiratory waveform, sine curve; respiratory cycle, 4 s; phantom target motion amplitude, 10, 20, and 29 mm (the maximum phantom longitudinal motion). The coverage of the target and of a cylindrical marker implanted in the phantom was measured on the CT images and compared with the coverage calculated theoretically from the phantom motion. The theoretical position of the cylindrical marker was also compared with its position in the X-ray image. The total imaging uncertainty was analyzed from these two factors. Results: In the CT images, the uncertainties between the actual coverage of the target and cylindrical marker and their coverage in the CT images were 1.19 mm and 2.50 mm, respectively. In the X-ray image, the uncertainty was 0.39 mm. The total imaging uncertainty from the two factors was 1.62 mm. Conclusion: The total imaging uncertainty in respiratory-gated radiotherapy was clinically acceptable. However, an internal margin should be added to account for the total imaging uncertainty.

  5. Integrated probabilistic risk assessment for nanoparticles: the case of nanosilica in food.

    PubMed

    Jacobs, Rianne; van der Voet, Hilko; Ter Braak, Cajo J F

    Insight into the risks of nanotechnology and the use of nanoparticles is an essential condition for the social acceptance and safe use of nanotechnology. One of the problems the risk assessment of nanoparticles faces is the lack of data, resulting in uncertainty in the risk assessment. We attempt to quantify some of this uncertainty by expanding a previous deterministic study on nanosilica (5-200 nm) in food into a fully integrated probabilistic risk assessment. We use the integrated probabilistic risk assessment method, in which statistical distributions and bootstrap methods are used to quantify uncertainty and variability in the risk assessment. Given the large amount of uncertainty present, this probabilistic method, which separates variability from uncertainty, contributed to a more understandable risk assessment. We found that quantifying the uncertainties did not increase the perceived risk relative to the outcome of the deterministic study. We pinpointed the particular aspects of the hazard characterization that contributed most to the total uncertainty in the risk assessment, suggesting that further research would benefit most from obtaining more reliable data on those aspects.

  6. Water and Proppant Requirements and Water Production Associated with Undiscovered Petroleum in the Bakken and Three Forks Formations, North Dakota and Montana, USA

    NASA Astrophysics Data System (ADS)

    Haines, S. S.; Varela, B. A.; Thamke, J.; Hawkins, S. J.; Gianoutsos, N. J.; Tennyson, M. E.

    2017-12-01

    Water is used for several stages of oil and gas production, in particular for hydraulic fracturing, which is typically used during production of petroleum from low-permeability shales and other rock types (referred to as "continuous" petroleum accumulations). Proppant, often sand, is also consumed during hydraulic fracturing. Water is then produced from the reservoir along with the oil and gas, representing either a disposal consideration or a possible source of water for further petroleum development or other purposes. The U.S. Geological Survey (USGS) has developed an approach for regional-scale estimation of these water and proppant quantities in order to provide an improved understanding of possible impacts and to help with planning and decision-making. Using the new methodology, the USGS has conducted a quantitative assessment of water and proppant requirements, and water production volumes, associated with possible future production of undiscovered petroleum resources in the Bakken and Three Forks Formations, Williston Basin, USA. This water and proppant assessment builds directly from the 2013 USGS petroleum assessment for the Bakken and Three Forks Formations. USGS petroleum assessments incorporate all available geologic and petroleum production information, and include the definition of assessment units (AUs) that specify the geographic regions and geologic formations for the assessment. The 2013 petroleum assessment included 5 continuous AUs for the Bakken Formation and one continuous AU for the Three Forks Formation. The assessment inputs are defined probabilistically, and a Monte Carlo approach provides outputs that include uncertainty bounds. We can summarize the assessment outputs with the mean values of the associated distributions. The mean estimated total volume of water for well drilling and cement for all six continuous AUs is 5.9 billion gallons, and the mean estimated volume of water for hydraulic fracturing for all AUs is 164.3 billion gallons. The mean estimated quantity of proppant for hydraulic fracturing is 101.3 million tons. Summing over all of the AUs, the mean estimated total flowback water volume is 9.9 billion gallons and the mean estimated total produced water is 414.5 billion gallons.

  7. Uncertainty associated with the gravimetric measurement of particulate matter concentration in ambient air.

    PubMed

    Lacey, Ronald E; Faulkner, William Brock

    2015-07-01

    This work applied a propagation-of-uncertainty method to typical total suspended particulate (TSP) sampling apparatus in order to estimate the overall measurement uncertainty. The objectives of this study were to estimate the uncertainty for three TSP samplers, develop an uncertainty budget, and determine the sensitivity of the total uncertainty to environmental parameters. The samplers evaluated were the TAMU High Volume TSP Sampler at a nominal volumetric flow rate of 1.42 m^3/min (50 CFM), the TAMU Low Volume TSP Sampler at a nominal volumetric flow rate of 17 L/min (0.6 CFM), and the EPA TSP Sampler at nominal volumetric flow rates of 1.1 and 1.7 m^3/min (39 and 60 CFM). Under nominal operating conditions the overall measurement uncertainty was found to vary from 6.1×10^-6 g/m^3 to 18.0×10^-6 g/m^3, which represented an uncertainty of 1.7% to 5.2% of the measurement. Analysis of the uncertainty budget determined that three of the instrument parameters contributed significantly to the overall uncertainty: the uncertainty in the pressure drop measurement across the orifice meter during both calibration and testing, and the uncertainty of the airflow standard used during calibration of the orifice meter. Five environmental parameters occurring during field measurements were considered for their effect on overall uncertainty: ambient TSP concentration, volumetric airflow rate, ambient temperature, ambient pressure, and ambient relative humidity. Of these, only ambient TSP concentration and volumetric airflow rate were found to have a strong effect on the overall uncertainty. Estimation of uncertainty in gravimetric measurements is of particular interest because, as ambient particulate matter (PM) concentrations approach regulatory limits, the uncertainty of the measurement is essential in determining the sample size and the probability of type II errors in hypothesis testing, and hence in determining whether ambient PM concentrations exceed regulatory limits. The technique described in this paper can be applied to other measurement systems and is especially useful where no methods are available to generate these values empirically.
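
    A minimal sketch of the first-order (GUM-style) propagation underlying such an uncertainty budget, for a gravimetric concentration C = Δm/(Q·t) with independent inputs; the values below are illustrative, not the paper's budget:

```python
# Relative standard uncertainties of independent inputs combined in
# quadrature for C = dm / (Q * t). All values are illustrative.
import math

dm, u_dm = 2.0e-3, 5.0e-5   # filter mass gain and its uncertainty, g
q, u_q = 1.42, 0.03         # volumetric flow rate and uncertainty, m^3/min
t, u_t = 1440.0, 1.0        # sampling time and uncertainty, min

c = dm / (q * t)            # concentration, g/m^3
rel_u = math.sqrt((u_dm / dm) ** 2 + (u_q / q) ** 2 + (u_t / t) ** 2)
print(f"C = {c:.2e} g/m^3 +/- {rel_u * c:.2e} g/m^3 ({100 * rel_u:.1f}%)")
```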

  8. Assessing Uncertainty in Expert Judgments About Natural Resources

    Treesearch

    David A. Cleaves

    1994-01-01

    Judgments are necessary in natural resources management, but uncertainty about these judgments should be assessed. When all judgments are rejected in the absence of hard data, valuable professional experience and knowledge are not utilized fully. The objective of assessing uncertainty is to get the best representation of knowledge and its bounds. Uncertainty...

  9. Uncertainty and sensitivity assessment of flood risk assessments

    NASA Astrophysics Data System (ADS)

    de Moel, H.; Aerts, J. C.

    2009-12-01

    Floods are one of the most frequent and costly natural disasters. In order to protect human lives and valuable assets from the effects of floods, many defensive structures have been built. Despite these efforts, economic losses due to catastrophic flood events have risen substantially during the past couple of decades because of continuing economic development in flood-prone areas. On top of that, climate change is expected to affect the magnitude and frequency of flood events. Because these ongoing trends are expected to continue, a transition can be observed in various countries from a protective flood management approach to a more risk-based flood management approach. In a risk-based approach, flood risk assessments play an important role in supporting decision making. Most flood risk assessments express flood risk in monetary terms (damage estimated for specific situations, or expected annual damage) in order to feed cost-benefit analyses of management measures. Such flood risk assessments contain, however, considerable uncertainties. This is the result of uncertainties in the many different input parameters propagating through the risk assessment and accumulating in the final estimate. Whilst common in some other disciplines, as with integrated assessment models, full uncertainty and sensitivity analyses of flood risk assessments are not so common. Various studies have addressed uncertainties in flood risk assessments, but have mainly focussed on the hydrological conditions. However, uncertainties in other components of the risk assessment, like the relation between water depth and monetary damage, can be substantial as well. This research therefore assesses the uncertainties of all components of monetary flood risk assessments using a Monte Carlo based approach, and attributes the total uncertainty to the different input parameters using a variance-based sensitivity analysis. Assessing and visualizing the uncertainties of the final risk estimate will help decision makers make better informed decisions, and attributing this uncertainty to the input parameters helps identify which parameters matter most for the uncertainty in the final estimate and therefore deserve additional attention in further research.
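
    A minimal sketch of the Monte Carlo approach described above: sample uncertain inputs of a deliberately simplified damage model, propagate them to a damage distribution, and attribute output variance with a crude rank-correlation measure standing in for a full variance-based sensitivity analysis. All distributions are illustrative placeholders:

```python
# Monte Carlo propagation through a toy flood-damage model with a crude
# sensitivity attribution. Distributions are invented placeholders.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 100_000
depth = rng.lognormal(mean=0.0, sigma=0.4, size=n)   # flood depth, m
coeff = rng.normal(0.3, 0.08, size=n)                # depth-damage slope, 1/m
exposure = rng.normal(2.0e9, 3.0e8, size=n)          # exposed value, EUR

damage = np.clip(coeff * depth, 0.0, 1.0) * exposure
print(f"mean damage = {damage.mean():.3e} EUR, 90% interval = "
      f"[{np.quantile(damage, 0.05):.3e}, {np.quantile(damage, 0.95):.3e}]")

for name, x in [("depth", depth), ("coeff", coeff), ("exposure", exposure)]:
    rho, _ = spearmanr(x, damage)
    print(f"{name}: squared rank correlation with damage = {rho**2:.2f}")
```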

  10. TU-EF-304-03: 4D Monte Carlo Robustness Test for Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Souris, K; Sterpin, E; Lee, J

    Purpose: Breathing motion and approximate dose calculation engines may increase proton range uncertainties. We address these two issues using a comprehensive 4D robustness evaluation tool based on an efficient Monte Carlo (MC) engine, which can simulate breathing with no significant increase in computation time. Methods: To assess the robustness of the treatment plan, multiple uncertainty scenarios are simulated, taking into account systematic and random setup errors, range uncertainties, and organ motion. Our fast MC dose engine, called MCsquare, implements optimized models on a massively parallel computation architecture and allows us to accurately simulate a scenario in less than one minute. The deviations of the uncertainty scenarios are then reported as a DVH band and compared to the nominal plan. The robustness evaluation tool is illustrated in a lung case by comparing three 60 Gy treatment plans. First, a plan is optimized on a PTV obtained by extending the CTV with an 8 mm margin, in order to take into account systematic geometrical uncertainties, as in our current practice in radiotherapy. No specific strategy is employed to correct for tumor and organ motion. The second plan involves a PTV generated from the ITV, which encompasses the tumor volume in all breathing phases. The last plan results from robust optimization performed on the ITV, with robustness parameters of 3% for tissue density and 8 mm for positioning errors. Results: The robustness test revealed that the first two plans could not properly cover the target in the presence of uncertainties. CTV coverage (D95) in the three plans ranged between 39.4-55.5 Gy, 50.2-57.5 Gy, and 55.1-58.6 Gy, respectively. Conclusion: A realistic robustness verification tool based on a fast MC dose engine has been developed. This test is essential to assess the quality of a proton therapy plan and very useful to study various planning strategies for mobile tumors. This work is partly funded by IBA (Louvain-la-Neuve, Belgium).

  11. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments, needed to understand correctly the strength and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can differ. A key objective of this work has been the mapping of different sources and types of uncertainties, to better understand how uncertainty analysis can best be used to generate a more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters, and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. Analysis and reduction of the uncertainty of the assessment of children's lead exposure around an old mine.

    PubMed

    Glorennec, Philippe

    2006-02-01

    Exposure to lead is a special problem in children, because they are more highly exposed than adults and because this pollutant, which accumulates in the body, induces neurobehavioral and cognitive effects. The objective of this study was to determine the probability density of the lead exposure dose of a 2-year-old child around an old mine site and to analyze its uncertainties, especially those associated with the bioavailability of lead in soil. Children's exposure was estimated indirectly from environmental samples (soils, domestic dust, water, air) and from parameters (volume inhaled, body weight, soil intake rate, water intake, dietary intake) taken from the literature. Uncertainty and variability were analyzed separately in a two-dimensional Monte Carlo simulation with Crystal Ball software. Exposure doses were simulated with different methods for assessing the bioavailability of lead in soil. The exposure dose per kilogram of body weight varied from 2 μg/kg/day at the 5th percentile to 5.5 μg/kg/day at the 95th percentile (and from 2 to 10 μg/kg/day, respectively, when ignoring bioavailability). The principal factors of variation were dietary intake, soil concentrations, and soil ingestion. The principal uncertainties were associated with the level of soil ingestion and the bioavailability of lead. Reducing the uncertainty about the bioavailability of lead in soil, by taking into account information about the type of mineral, made it possible to increase our degree of confidence (from 25% to more than 95%) that the median exposure dose does not exceed the Tolerable Daily Intake. Knowledge of the mineral thus very substantially increases the degree of confidence in estimates of children's lead exposure around an old mining site by reducing the uncertainty associated with lead's bioavailability.
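
    A minimal sketch of a two-dimensional Monte Carlo of the kind used here: an outer loop samples uncertain quantities (held fixed within each iteration), an inner loop samples inter-child variability, and the spread of the inner-loop statistic across outer iterations expresses uncertainty. Distributions and values are illustrative placeholders, not the study's inputs:

```python
# Two-dimensional Monte Carlo: uncertainty (outer) vs variability (inner).
import numpy as np

rng = np.random.default_rng(2)
n_outer, n_inner = 200, 5000

medians = []
for _ in range(n_outer):
    bioavail = rng.uniform(0.1, 0.5)      # uncertain, fixed within an outer draw
    soil_pb = rng.uniform(300.0, 600.0)   # soil lead, mg/kg (uncertain)
    soil_intake = rng.lognormal(np.log(1e-4), 0.6, n_inner)  # kg/day, per child
    body_weight = rng.normal(12.0, 1.5, n_inner)             # kg, per child
    dose = bioavail * soil_pb * soil_intake / body_weight * 1e3  # ug/kg/day
    medians.append(np.median(dose))

lo, hi = np.quantile(medians, [0.05, 0.95])
print(f"median dose, 5th-95th uncertainty interval: [{lo:.2f}, {hi:.2f}] ug/kg/day")
```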

  13. Evaluating uncertainty in environmental life-cycle assessment. A case study comparing two insulation options for a Dutch one-family dwelling.

    PubMed

    Huijbregts, Mark A J; Gilijamse, Wim; Ragas, Ad M J; Reijnders, Lucas

    2003-06-01

    The evaluation of uncertainty is relatively new in environmental life-cycle assessment (LCA). It provides useful information to assess the reliability of LCA-based decisions and to guide future research toward reducing uncertainty. Most uncertainty studies in LCA quantify only one type of uncertainty, i.e., uncertainty due to input data (parameter uncertainty). However, LCA outcomes can also be uncertain due to normative choices (scenario uncertainty) and the mathematical models involved (model uncertainty). The present paper outlines a new methodology that quantifies parameter, scenario, and model uncertainty simultaneously in environmental life-cycle assessment. The procedure is illustrated in a case study that compares two insulation options for a Dutch one-family dwelling. Parameter uncertainty was quantified by means of Monte Carlo simulation. Scenario and model uncertainty were quantified by resampling different decision scenarios and model formulations, respectively. Although scenario and model uncertainty were not quantified comprehensively, the results indicate that both types of uncertainty influence the case study outcomes. This stresses the importance of quantifying parameter, scenario, and model uncertainty simultaneously. The two insulation options studied were found to have significantly different impact scores for global warming, stratospheric ozone depletion, and eutrophication. The thickest insulation option has the lowest impact on global warming and eutrophication, and the highest impact on stratospheric ozone depletion.

  14. Interim reliability-evaluation program: analysis of the Browns Ferry, Unit 1, nuclear plant. Appendix C - sequence quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mays, S.E.; Poloski, J.P.; Sullivan, W.H.

    1982-07-01

    This report describes a risk study of the Browns Ferry, Unit 1, nuclear plant. The study is one of four such studies sponsored by the NRC Office of Research, Division of Risk Assessment, as part of its Interim Reliability Evaluation Program (IREP), Phase II. This report is contained in four volumes: a main report and three appendixes. Appendix C generally describes the methods used to estimate accident sequence frequency values. Information is presented concerning the approach, example collection, failure data, candidate dominant sequences, uncertainty analysis, and sensitivity analysis.

  15. Impact assessment of extreme storm events using a Bayesian network

    USGS Publications Warehouse

    den Heijer, C.(Kees); Knipping, Dirk T.J.A.; Plant, Nathaniel G.; van Thiel de Vries, Jaap S. M.; Baart, Fedor; van Gelder, Pieter H. A. J. M.

    2012-01-01

    This paper describes an investigation into the usefulness of Bayesian Networks in the safety assessment of dune coasts. A network has been created that predicts the erosion volume based on hydraulic boundary conditions and a number of cross-shore profile indicators. Field measurement data along a large part of the Dutch coast have been used to train the network. The corresponding storm impact on the dunes was calculated with an empirical dune erosion model named DUROS+. Comparison between the Bayesian Network predictions and the original DUROS+ results, here considered as observations, yields a skill of up to 0.88, provided that the training data cover the range of predictions. Hence, the predictions from a deterministic model (DUROS+) can be captured in a probabilistic model (Bayesian Network) such that both process knowledge and uncertainties can be included in impact and vulnerability assessments.
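
    Conceptually, the trained network stores, for each combination of binned input conditions, a distribution of erosion volume learned from the training runs, so each prediction carries its own spread. The Python sketch below imitates that structure on synthetic data; the node names, bins and numbers are invented placeholders, not the study's DUROS+ training set.

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(1)

      # Synthetic stand-in for model output: two parent nodes and one child.
      n = 2000
      df = pd.DataFrame({
          "surge": rng.choice(["low", "mid", "high"], n),
          "dune_height": rng.choice(["low", "high"], n),
      })
      base = {"low": 50.0, "mid": 120.0, "high": 250.0}     # m3/m, assumed
      df["erosion"] = [base[s] * (0.7 if d == "high" else 1.3) * rng.lognormal(0, 0.2)
                       for s, d in zip(df["surge"], df["dune_height"])]

      # The "network": conditional erosion distributions given the parents.
      cpd = df.groupby(["surge", "dune_height"])["erosion"].agg(["mean", "std"])
      print(cpd)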

  16. Sensitivity to experimental data of pollutant site mean concentration in stormwater runoff.

    PubMed

    Mourad, M; Bertrand-Krajewski, J L; Chebbo, G

    2005-01-01

    Urban wet weather discharges are known to be a major source of pollutants for receiving waters, whose protection requires the estimation of long-term discharged pollutant loads. Pollutant loads can be estimated by multiplying a site mean concentration (SMC) by the total runoff volume during a given period of time. The estimation of the SMC value as a weighted mean value, with event runoff volumes as weights, is affected by uncertainties due to the variability of event mean concentrations and to the number of events used. This study, carried out on 13 catchments, gives orders of magnitude of these uncertainties and shows the limitations of usual practices based on a few measured events. The results obtained show that it is not possible to propose a standard minimal number of events to be measured on any catchment in order to evaluate the SMC value with a given uncertainty.
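
    The SMC definition used here is simply a runoff-volume-weighted mean of event mean concentrations (EMCs). A minimal sketch, with invented event values:

      import numpy as np

      emc = np.array([120.0, 85.0, 240.0, 60.0])        # mg/L per measured event (assumed)
      volume = np.array([500.0, 1200.0, 300.0, 900.0])  # m3 of runoff per event (assumed)

      smc = np.sum(emc * volume) / np.sum(volume)       # volume-weighted site mean
      print(f"SMC = {smc:.1f} mg/L from {len(emc)} events")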

  17. Radiation Dose-Volume Effects in the Larynx and Pharynx

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rancati, Tiziana; Schwarz, Marco; Allen, Aaron M.

    2010-03-01

    The dose-volume outcome data for RT-associated laryngeal edema, laryngeal dysfunction, and dysphagia have only recently been addressed and are summarized here. For late dysphagia, a major issue is the accurate definition of, and the uncertainty in, the relevant anatomical structures. These and other issues are discussed.

  18. Operationalising uncertainty in data and models for integrated water resources management.

    PubMed

    Blind, M W; Refsgaard, J C

    2007-01-01

    Key sources of uncertainty of importance for water resources management are (1) uncertainty in data; (2) uncertainty related to hydrological models (parameter values, model technique, model structure); and (3) uncertainty related to the context and the framing of the decision-making process. The European funded project 'Harmonised techniques and representative river basin data for assessment and use of uncertainty information in integrated water management (HarmoniRiB)' has resulted in a range of tools and methods to assess such uncertainties, focusing on items (1) and (2). The project also engaged in a number of discussions surrounding uncertainty and risk assessment in support of decision-making in water management. Based on the project's results and experiences, and on the subsequent discussions, a number of conclusions can be drawn about the future needs for successful adoption of uncertainty analysis in decision support. These conclusions range from additional scientific research on specific uncertainties and dedicated guidelines for operational use to capacity building at all levels. The purpose of this paper is to elaborate on these conclusions and to anchor them in the broad objective of making uncertainty and risk assessment an essential and natural part of future decision-making processes.

  19. A data-driven approach for modeling post-fire debris-flow volumes and their uncertainty

    USGS Publications Warehouse

    Friedel, Michael J.

    2011-01-01

    This study demonstrates the novel application of genetic programming to evolve nonlinear post-fire debris-flow volume equations from variables associated with a data-driven conceptual model of the western United States. The search space is constrained using a multi-component objective function that simultaneously minimizes root-mean-squared and unit errors in the evolution of the fittest equations. An optimization technique is then used to estimate the limits of nonlinear prediction uncertainty associated with the debris-flow equations. In contrast to a published three-variable multiple linear regression equation, linking basin area with slopes greater than or equal to 30 percent, burn severity characterized as area burned moderate plus high, and total storm rainfall, the data-driven approach discovers many nonlinear, and several dimensionally consistent, equations that are unbiased and have less prediction uncertainty. Of the nonlinear equations, the best performance (lowest prediction uncertainty) is achieved when using three variables: average basin slope, total burned area, and total storm rainfall. Further reduction in uncertainty is possible for the nonlinear equations when dimensional consistency is not a priority and a gradient solver is subsequently applied to the fittest solutions. The data-driven modeling approach can be applied to nonlinear multivariate problems in all fields of study.
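
    The multi-component objective can be pictured as a data misfit plus a penalty on dimensional inconsistency, so candidate equations trade accuracy against unit errors during evolution. A minimal sketch, with an assumed penalty weight:

      import numpy as np

      # unit_error would be computed from a candidate equation's dimensions;
      # here it is passed in directly, and w_units = 10 is an assumed weight.
      def fitness(predicted, observed, unit_error, w_units=10.0):
          rmse = np.sqrt(np.mean((predicted - observed) ** 2))
          return rmse + w_units * unit_error

      obs = np.array([1.2e3, 4.5e3, 9.0e2])      # debris-flow volumes, m3 (invented)
      pred = np.array([1.0e3, 5.0e3, 1.1e3])
      print(fitness(pred, obs, unit_error=0.0))  # dimensionally consistent candidate
      print(fitness(pred, obs, unit_error=2.0))  # inconsistent candidate, penalized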

  20. Addressing uncertainty in vulnerability assessments [Chapter 5

    Treesearch

    Linda Joyce; Molly Cross; Evan Girvatz

    2011-01-01

    This chapter addresses issues and approaches for dealing with uncertainty specifically within the context of conducting climate change vulnerability assessments (i.e., uncertainties related to identifying and modeling the sensitivities, levels of exposure, and adaptive capacity of the assessment targets).

  1. Regional scale landslide risk assessment with a dynamic physical model - development, application and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Luna, Byron Quan; Vidar Vangelsten, Bjørn; Liu, Zhongqiang; Eidsvig, Unni; Nadim, Farrokh

    2013-04-01

    Landslide risk must be assessed at the appropriate scale in order to allow effective risk management. At present, few deterministic models exist that can perform all the computations required for a complete landslide risk assessment at a regional scale. This arises from the difficulty of precisely defining the location and volume of the released mass, and from the inability of models to compute the displacement of a large number of individual initiation areas (a computationally exhaustive task). This paper presents a medium-scale, dynamic physical model for rapid mass movements in mountainous and volcanic areas. The deterministic nature of the approach makes it possible to apply the model to other sites, since it considers the frictional equilibrium conditions for the initiation process, the rheological resistance of the displaced flow for the run-out process, and a fragility curve that links intensity to economic loss for each building. The model takes into account the triggering effect of an earthquake, of intense rainfall, and of a combination of both (spatial and temporal). The run-out module of the model considers the flow as a 2-D continuum medium, solving the equations of mass balance and momentum conservation. The model is embedded in an open-source geographical information system (GIS) environment; it is computationally efficient and transparent (understandable and comprehensible) for the end user. The model was applied to a virtual region, assessing landslide hazard, vulnerability and risk. A Monte Carlo simulation scheme was applied to quantify, propagate and communicate the effects of uncertainty in input parameters on the final results. In this technique, the input distributions are recreated through sampling and the failure criteria are calculated for each stochastic realisation of the site properties. The model is able to identify the released volumes of the critical slopes and the areas threatened by the run-out intensity. The final outcome is an estimate of individual building damage and total economic risk. The research leading to these results has received funding from the European Community's Seventh Framework Programme [FP7/2007-2013] under grant agreement No. 265138, New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe (MATRIX).
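
    The abstract does not reproduce its governing equations; for orientation, a common depth-averaged ("shallow water") form of the mass and momentum balance solved by such run-out modules is, with h the flow depth, (v_x, v_y) the depth-averaged velocity, z the bed elevation, g gravity, and S_f the rheological resistance term:

      \frac{\partial h}{\partial t} + \frac{\partial (h v_x)}{\partial x} + \frac{\partial (h v_y)}{\partial y} = 0,

      \frac{\partial (h v_x)}{\partial t} + \frac{\partial}{\partial x}\left(h v_x^2 + \frac{g h^2}{2}\right) + \frac{\partial (h v_x v_y)}{\partial y} = -g h \frac{\partial z}{\partial x} - S_{f,x},

    and analogously for the y-component. The choice of S_f encodes the assumed rheology (e.g. frictional or viscous); this is a generic formulation, not necessarily the exact one implemented in the model described here.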

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turley, Jessica; Claridge Mackonis, Elizabeth

    To evaluate in-field megavoltage (MV) imaging of simultaneously integrated boost (SIB) breast fields to determine its feasibility for treatment verification in the SIB breast radiotherapy technique, and to assess whether the current imaging protocol and treatment margins are sufficient. For nine patients undergoing SIB breast radiotherapy, in-field MV images of the SIB fields were acquired on days when regular treatment verification imaging was performed. The in-field images were matched offline according to the scar wire on digitally reconstructed radiographs. The offline image correction results were then applied to a margin recipe formula to calculate safe margins that account for random and systematic uncertainties in the position of the boost volume when an offline correction protocol has been applied. After offline assessment of the acquired images, 96% were within the tolerance set in the current department imaging protocol. Retrospectively applying the maximum position deviations in the Eclipse™ treatment planning system demonstrated that the clinical target volume (CTV) boost received a minimum dose difference of 0.4% and a maximum dose difference of 1.4% less than planned. Furthermore, applying our results to the Van Herk margin formula to ensure that 90% of patients receive 95% of the prescribed dose, the calculated CTV margins were comparable to those of the current departmental procedure. Based on the in-field boost images acquired and the feasible application of these results to the margin formula, the current CTV-planning target volume margins used are appropriate for the accurate treatment of the SIB boost volume without additional imaging.
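
    The Van Herk margin recipe referred to here is usually written, for a CTV-to-PTV margin M that ensures 90% of patients receive at least 95% of the prescribed dose, as

      M = 2.5\,\Sigma + 0.7\,\sigma,

    where \Sigma is the combined standard deviation of the systematic (preparation) uncertainties and \sigma that of the random (execution) uncertainties.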

  3. Approximating uncertainty of annual runoff and reservoir yield using stochastic replicates of global climate model data

    NASA Astrophysics Data System (ADS)

    Peel, M. C.; Srikanthan, R.; McMahon, T. A.; Karoly, D. J.

    2015-04-01

    Two key sources of uncertainty in projections of future runoff for climate change impact assessments are uncertainty between global climate models (GCMs) and within a GCM. Within-GCM uncertainty is the variability in GCM output that occurs when running a scenario multiple times, where each run has slightly different, but equally plausible, initial conditions. The limited number of runs available for each GCM and scenario combination within the Coupled Model Intercomparison Project phase 3 (CMIP3) and phase 5 (CMIP5) data sets limits the assessment of within-GCM uncertainty. In this second of two companion papers, the primary aim is to present a proof-of-concept approximation of within-GCM uncertainty for monthly precipitation and temperature projections and to assess the impact of within-GCM uncertainty on modelled runoff for climate change impact assessments. A secondary aim is to assess the impact of between-GCM uncertainty on modelled runoff. Here we approximate within-GCM uncertainty by developing non-stationary stochastic replicates of GCM monthly precipitation and temperature data. These replicates are input to an off-line hydrologic model to assess the impact of within-GCM uncertainty on projected annual runoff and reservoir yield. We adopt stochastic replicates of available GCM runs to approximate within-GCM uncertainty because large ensembles (hundreds of runs) for a given GCM and scenario are unavailable, other than the Climateprediction.net data set for the Hadley Centre GCM. To date, within-GCM uncertainty has received little attention in the hydrologic climate change impact literature, and this analysis provides an approximation of the uncertainty in projected runoff, and reservoir yield, due to within- and between-GCM uncertainty of precipitation and temperature projections. In the companion paper, McMahon et al. (2015) sought to reduce between-GCM uncertainty by removing poorly performing GCMs, resulting in a selection of five better performing GCMs from CMIP3 for use in this paper. Here we present within- and between-GCM uncertainty results in mean annual precipitation (MAP), mean annual temperature (MAT), mean annual runoff (MAR), the standard deviation of annual precipitation (SDP), standard deviation of runoff (SDR) and reservoir yield for five CMIP3 GCMs at 17 worldwide catchments. Based on 100 stochastic replicates of each GCM run at each catchment, within-GCM uncertainty was assessed in relative form as the standard deviation expressed as a percentage of the mean of the 100 replicate values of each variable. The average relative within-GCM uncertainties from the 17 catchments and 5 GCMs for 2015-2044 (A1B) were MAP 4.2%, SDP 14.2%, MAT 0.7%, MAR 10.1% and SDR 17.6%. The Gould-Dincer Gamma (G-DG) procedure was applied to each annual runoff time series for hypothetical reservoir capacities of 1 × MAR and 3 × MAR, and the average uncertainties in reservoir yield due to within-GCM uncertainty from the 17 catchments and 5 GCMs were 25.1% (1 × MAR) and 11.9% (3 × MAR). Our approximation of within-GCM uncertainty is expected to be an underestimate due to not replicating the GCM trend. However, our results indicate that within-GCM uncertainty is important when interpreting climate change impact assessments. Approximately 95% of values of MAP, SDP, MAT, MAR, SDR and reservoir yield from 1 × MAR or 3 × MAR capacity reservoirs are expected to fall within twice their respective relative uncertainty (standard deviation/mean).
Within-GCM uncertainty has significant implications for interpreting climate change impact assessments that report future changes within our range of uncertainty for a given variable - these projected changes may be due solely to within-GCM uncertainty. Since within-GCM variability is amplified from precipitation to runoff and then to reservoir yield, climate change impact assessments that do not take into account within-GCM uncertainty risk providing water resources management decision makers with a sense of certainty that is unjustified.
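
    The relative within-GCM uncertainty reported above is the standard deviation across the 100 replicates expressed as a percentage of their mean. A minimal sketch for one GCM run at one catchment, with synthetic replicates standing in for the stochastic model output:

      import numpy as np

      rng = np.random.default_rng(0)

      # 100 replicates of a 30-year annual runoff series, mm/yr (synthetic).
      replicates = rng.lognormal(np.log(300.0), 0.12, size=(100, 30))

      mar = replicates.mean(axis=1)                      # MAR per replicate
      rel = 100.0 * mar.std() / mar.mean()               # std as % of mean
      print(f"within-GCM uncertainty in MAR: {rel:.1f}%")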

  4. Simultaneous calibration of end-member thermodynamic data and solution properties with correlated uncertainties

    NASA Astrophysics Data System (ADS)

    Antoshechkina, P. M.; Wolf, A. S.; Hamecher, E. A.; Asimow, P. D.; Ghiorso, M. S.

    2013-12-01

    Community databases such as EarthChem, LEPR, and AMCSD both increase the demand for quantitative petrological tools, including thermodynamic models like the MELTS family of algorithms, and are invaluable in the development of such tools. The need to extend existing solid solution models to include minor components such as Cr and Na has been evident for years, but as the number of components increases it becomes impossible to completely separate the derivation of end-member thermodynamic data from the calibration of solution properties. In Hamecher et al. (2012; 2013) we developed a calibration scheme that directly interfaces with a MySQL database based on LEPR, with volume data from AMCSD and elsewhere. Here we combine that scheme with a Bayesian approach, in which independent constraints on parameter values (e.g. the existence of miscibility gaps) are combined with uncertainty propagation to give a more reliable best fit along with associated model uncertainties. We illustrate the scheme with a new model of molar volume for (Ca,Fe,Mg,Mn,Na)3(Al,Cr,Fe3+,Fe2+,Mg,Mn,Si,Ti)2Si3O12 cubic garnets. For a garnet in this chemical system, the model molar volume is obtained by adding excess volume terms to a linear combination of nine independent end-member volumes. The model calibration is broken into three main stages: (1) estimation of individual end-member thermodynamic properties; (2) calibration of standard state volumes for all available independent and dependent end members; and (3) fitting of binary and mixed-composition data. For each calibration step, the goodness-of-fit includes weighted residuals as well as χ2-like penalty terms representing the (not necessarily Gaussian) prior constraints on parameter values. Using the Bayesian approach, uncertainties are correctly propagated forward to subsequent steps, allowing determination of final parameter values and correlated uncertainties that account for the entire calibration process. For the aluminosilicate garnets, optimal values of the bulk modulus and its pressure derivative are obtained by fitting published compression data using the Vinet equation of state, with the Mie-Grüneisen-Debye thermal pressure formalism to model thermal expansion. End-member thermal parameters are obtained by fitting volume data while ensuring that the heat capacity is consistent with the thermodynamic database of Berman and co-workers. For other end members, data for related compositions are used where such data exist; otherwise ultrasonic data or density functional theory results are taken or, for thermal parameters, systematics in cation radii are used. In stages (2) and (3) the remaining data at ambient conditions are fit. Using this step-wise calibration scheme, most parameters are modified little by subsequent calibration steps, but some, such as the standard state volume of the Ti-bearing end member, can vary within the calculated uncertainties. The final model satisfies the desired criteria and fits almost all the data (more than 1000 points); only excess parameters that are justified by the data are activated. The scheme can easily be extended to the calibration of end-member and solution properties from experimental phase equilibria. As a first step we obtain the internally consistent standard state entropy and enthalpy of formation for knorringite and discuss differences between our results and those of Klemme and co-workers.

  5. An empirical application of transaction-costs theory to organizational design characteristics.

    PubMed

    Williams, S

    2000-01-01

    The environmental uncertainty component of transaction-costs theory was used to predict the organizational structural characteristics of size (number of employees) and horizontal differentiation (number of vice presidents), using financial and management information from the COMPACT DISCLOSURE database (which contains the most recent annual and periodic reports for more than 12,000 public companies). Organizations were categorized as belonging to low- or high-uncertainty industries according to Dess and Beard's (1984) Dynamism Scale, and net sales volume was controlled for. As predicted, high-uncertainty companies had significantly higher horizontal differentiation than low-uncertainty firms, a finding that supports the transaction-costs expectation that organizations may require more departments or personnel to cope with increasing uncertainty. Surprisingly, low-uncertainty firms were found to have significantly more employees than high-uncertainty organizations, which is the opposite of what transaction-costs theory predicts. Possible explanations for this unexpected finding and further potential limitations are discussed.

  6. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data.

    PubMed

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-12

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.

  7. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data

    NASA Astrophysics Data System (ADS)

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-01

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.

  8. Effects of Measurement Errors on Individual Tree Stem Volume Estimates for the Austrian National Forest Inventory

    Treesearch

    Ambros Berger; Thomas Gschwantner; Ronald E. McRoberts; Klemens Schadauer

    2014-01-01

    National forest inventories typically estimate individual tree volumes using models that rely on measurements of predictor variables such as tree height and diameter, both of which are subject to measurement error. The aim of this study was to quantify the impacts of these measurement errors on the uncertainty of the model-based tree stem volume estimates. The impacts...

  9. Evaluation of rainfall structure on hydrograph simulation: Comparison of radar and interpolated methods, a study case in a tropical catchment

    NASA Astrophysics Data System (ADS)

    Velasquez, N.; Ochoa, A.; Castillo, S.; Hoyos Ortiz, C. D.

    2017-12-01

    The skill of river discharge simulation using hydrological models strongly depends on the quality and spatio-temporal representativeness of precipitation during storm events. All precipitation measurement strategies have their own strengths and weaknesses, which translate into discharge simulation uncertainties. Distributed hydrological models are based on evolving rainfall fields at the same time scale as the hydrological simulation. In general, rainfall measurements from a dense and well-maintained rain gauge network provide a very good estimate of the total volume of each rainfall event; however, the spatial structure relies on interpolation strategies, introducing considerable uncertainty into the simulation process. On the other hand, rainfall retrievals from radar reflectivity achieve a better representation of the spatial structure, but with higher uncertainty in the surface precipitation intensity and volume, depending on the vertical rainfall characteristics and the radar scan strategy. To assess the impact of both rainfall measurement methodologies on hydrological simulations, and in particular the effects of rainfall spatio-temporal variability, a numerical modeling experiment is proposed that includes a novel QPE (Quantitative Precipitation Estimation) method based on disdrometer data to estimate surface rainfall from radar reflectivity. The experiment is based on the simulation of 84 storms; the hydrological simulations are carried out using radar QPE and two different interpolation methods (IDW and TIN), and are assessed in terms of simulated peak flow. Results show significant rainfall differences between the radar QPE and the interpolated fields, evidencing a poor representation of storms in the interpolated fields, which tend to miss the precise location of the intense precipitation cores and to artificially generate rainfall in some areas of the catchment. Regarding streamflow modelling, the potential improvement achieved by using radar QPE depends on the density of the rain gauge network and its distribution relative to the precipitation events. The results for the 84 storms show better model skill using radar QPE than using the interpolated fields. Results using interpolated fields are highly affected by the dominant rainfall type and the basin scale.
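
    For reference, the IDW interpolation used as one of the two benchmarks assigns each cell a weighted mean of gauge readings, with weights decaying as an inverse power of distance. A minimal sketch, with an assumed gauge layout and the common power p = 2:

      import numpy as np

      gauges = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # km (assumed)
      rain = np.array([12.0, 3.0, 7.0])                          # mm (assumed)

      def idw(target, pts, vals, p=2.0, eps=1e-9):
          d = np.linalg.norm(pts - target, axis=1)
          w = 1.0 / (d + eps) ** p                 # inverse-distance weights
          return np.sum(w * vals) / np.sum(w)

      print(idw(np.array([4.0, 3.0]), gauges, rain))  # interpolated rainfall, mm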

  10. Uncertainty in natural hazards, modeling and decision support: An introduction to this volume [Chapter 1

    Treesearch

    Karin Riley; Matthew Thompson; Peter Webley; Kevin D. Hyde

    2017-01-01

    Modeling has been used to characterize and map natural hazards and hazard susceptibility for decades. Uncertainties are pervasive in natural hazards analysis, including a limited ability to predict where and when extreme events will occur, with what consequences, and driven by what contributing factors. Modeling efforts are challenged by the intrinsic...

  11. A new estimate of the volume and distribution of gas hydrate in the northern Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Majumdar, U.; Cook, A.

    2016-12-01

    In spite of the wealth of information gained over the last several decades about gas hydrate in the northern Gulf of Mexico, there is still considerable uncertainty about the distribution and volume of gas hydrate. In our assessment, we build a dataset of basin-wide gas hydrate distribution and thickness, as appraised from publicly available petroleum industry well logs within the gas hydrate stability zone (HSZ), and subsequently develop a Monte Carlo simulation to determine a volumetric estimate of gas hydrate using the dataset. We evaluate the presence of gas hydrate from electrical resistivity well logs, and categorize the possible reservoir type (either sand or clay) based on the gamma ray response and resistivity curve characteristics. Of the 798 wells with resistivity well log data within the HSZ that we analyzed, we found evidence of gas hydrate in 124 wells. In this research we present a new stochastic estimate of the gas hydrate volume in the northern Gulf of Mexico guided by our well log dataset. For our Monte Carlo simulation, we divided our assessment area of 200,000 km2 into 1 km2 grid cells. Our volume assessment model incorporates variables unique to our well log dataset, such as the likelihood of gas hydrate occurrence, the fraction of the HSZ occupied by gas hydrate, the reservoir type, and the gas hydrate saturation depending on the reservoir, in each grid cell, in addition to other basic variables such as HSZ thickness and porosity. Preliminary results from our model suggest that the total volume of gas at standard temperature and pressure in gas hydrate in the northern Gulf of Mexico is in the range of 430 trillion cubic feet (TCF) to 730 TCF, with a mean volume of 585 TCF. While our well log dataset found gas hydrate in sand reservoirs in 30 of the 124 wells with evidence of gas hydrate (~24%), we find that sand reservoirs contain over half of the total volume of gas hydrate in the Gulf of Mexico, as a result of the relatively high gas hydrate saturation in sand.
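
    The structure of such a cell-by-cell Monte Carlo is straightforward: each grid cell draws an occurrence flag and reservoir properties, the in-place hydrate volume is summed, and the sum is converted to gas at standard conditions. The Python sketch below shows the pattern; every distribution, the occurrence probability and the cell count are illustrative assumptions (only the ~164 volume expansion of methane hydrate to gas is a standard figure).

      import numpy as np

      rng = np.random.default_rng(3)

      n_cells, n_draws = 1000, 2000
      area = 1.0e6                                   # m2 per 1-km2 grid cell
      totals = np.empty(n_draws)
      for i in range(n_draws):
          occur = rng.random(n_cells) < 0.16         # hydrate occurrence (assumed)
          hsz = rng.uniform(200, 600, n_cells)       # HSZ thickness, m (assumed)
          frac = rng.uniform(0.01, 0.05, n_cells)    # fraction of HSZ with hydrate
          phi = rng.uniform(0.3, 0.5, n_cells)       # porosity (assumed)
          sat = rng.uniform(0.05, 0.8, n_cells)      # hydrate saturation (assumed)
          hydrate_m3 = (occur * area * hsz * frac * phi * sat).sum()
          totals[i] = hydrate_m3 * 164.0             # gas at STP, m3
      print(np.percentile(totals, [5, 50, 95]))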

  12. Relating Data and Models to Characterize Parameter and Prediction Uncertainty

    EPA Science Inventory

    Applying PBPK models in risk analysis requires that we realistically assess the uncertainty of relevant model predictions in as quantitative a way as possible. The reality of human variability may add a confusing feature to the overall uncertainty assessment, as uncertainty and v...

  13. From Global to Local and Vice Versa: On the Importance of the `Globalization' Agenda in Continental Groundwater Research and Policy-Making

    NASA Astrophysics Data System (ADS)

    Filimonau, Viachaslau; Barth, Johannes A. C.

    2016-09-01

    Groundwater is one of the most important environmental resources, and its use rises continuously worldwide for industrial, agricultural, and drinking water supply purposes. Because of its importance, more knowledge about the volume of usable groundwater is necessary to satisfy the global demand. Due to the challenges of quantifying the volume of available global groundwater, studies that aim to assess its magnitude are limited in number. They are further restricted in scope and depth of analysis as, in most cases, they do not explain how the estimates of global groundwater resources have been obtained, what methods have been used to generate the figures, and what levels of uncertainty exist. This article reviews the estimates of global groundwater resources. It finds that the level of uncertainty attached to existing numbers often exceeds 100% and strives to establish the reasons for this discrepancy. The outcome of this study outlines the need for a new agenda in water research with a more pronounced focus on groundwater. This new research agenda should aim at enhancing the quality and quantity of data provision on local and regional groundwater stocks and flows. This knowledge enhancement can serve as a basis for improved policy-making on groundwater resources globally. Research-informed policies will facilitate more effective groundwater management practices, ensuring more rapid progress of the global water sector towards the goal of sustainability.

  14. [Evaluation of measurement uncertainty of welding fume in welding workplace of a shipyard].

    PubMed

    Ren, Jie; Wang, Yanrang

    2015-12-01

    To evaluate the measurement uncertainty of welding fume in the air of the welding workplace of a shipyard, and to provide quality assurance for measurement. According to GBZ/T 192.1-2007 "Determination of dust in the air of workplace-Part 1: Total dust concentration" and JJF 1059-1999 "Evaluation and expression of measurement uncertainty", the uncertainty for determination of welding fume was evaluated and the measurement results were completely described. The concentration of welding fume was 3.3 mg/m(3), and the expanded uncertainty was 0.24 mg/m(3). The repeatability for determination of dust concentration introduced an uncertainty of 1.9%, the measurement using electronic balance introduced a standard uncertainty of 0.3%, and the measurement of sample quality introduced a standard uncertainty of 3.2%. During the determination of welding fume, the standard uncertainty introduced by the measurement of sample quality is the dominant uncertainty. In the process of sampling and measurement, quality control should be focused on the collection efficiency of dust, air humidity, sample volume, and measuring instruments.
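
    The quoted expanded uncertainty is consistent with combining the three relative standard uncertainties in quadrature and applying a coverage factor, assumed here to be k = 2 as is usual under JJF 1059:

      u_c = \sqrt{1.9^2 + 0.3^2 + 3.2^2}\,\% \approx 3.7\,\%,

      U = k\,u_c \approx 7.5\,\% \quad\Rightarrow\quad 0.075 \times 3.3\ \text{mg/m}^3 \approx 0.24\ \text{mg/m}^3,

    which reproduces the reported value and shows why the sample-quality term (3.2%) dominates the budget.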

  15. Uncertainty on shallow landslide hazard assessment: from field data to hazard mapping

    NASA Astrophysics Data System (ADS)

    Trefolini, Emanuele; Tolo, Silvia; Patelli, Eduardo; Broggi, Matteo; Disperati, Leonardo; Le Tuan, Hai

    2015-04-01

    Shallow landsliding that involves Hillslope Deposits (HD), the surficial soil that covers the bedrock, is an important process of erosion, transport and deposition of sediment along hillslopes. Although shallow landslides generally mobilize relatively small volumes of material, they represent the most hazardous factor in mountain regions due to their high velocity and the common absence of warning signs. Moreover, increasing urbanization and likely climate change make shallow landslides a source of widespread risk; therefore, the interest of the scientific community in this process has grown over the last three decades. One of the main aims of research projects on this topic is to perform robust shallow landslide hazard assessment for wide areas (regional assessment) in order to support sustainable spatial planning. Currently, three main methodologies may be implemented to assess regional shallow landslide hazard: expert evaluation, probabilistic (or data mining) methods, and methods based on physical models. The aim of this work is to evaluate the uncertainty of shallow landslide hazard assessment based on physical models, taking into account spatial variables such as geotechnical and hydrogeologic parameters as well as hillslope morphometry. To achieve this goal, a wide dataset of geotechnical properties (shear strength, permeability, depth and unit weight) of HD was gathered by integrating field survey, in situ tests and laboratory tests. This spatial database was collected from a study area of about 350 km2 including different bedrock lithotypes and geomorphological features. The uncertainty associated with each step of the hazard assessment process (e.g. field data collection, regionalization of site-specific information, and numerical modelling of hillslope stability) was carefully characterized. The most appropriate probability density function (PDF) was chosen for each numerical variable, and we assessed the uncertainty propagation on HD strength parameters obtained by empirical relations with geotechnical index properties. Site-specific information was regionalized at map scale by (hard and fuzzy) clustering analysis, taking into account spatial variables such as geology, geomorphology and hillslope morphometric variables (longitudinal and transverse curvature, flow accumulation and slope), the latter derived from a DEM with 10 m cell size. In order to map shallow landslide hazard, Monte Carlo simulation was performed for some common physically based models available in the literature (e.g. SINMAP, SHALSTAB, TRIGRS). Furthermore, a new approach based on the use of Bayesian Networks was proposed and validated. Different models, such as Intervals, Convex Models and Fuzzy Sets, were adopted for the modelling of input parameters. Finally, an accuracy assessment was carried out on the resulting maps, and the propagation of the uncertainty of input parameters into the final shallow landslide hazard estimate was assessed. The outcomes of the analysis are compared and discussed in terms of the discrepancy among map pixel values and the related estimated error. The novelty of the proposed method lies in the estimation of the confidence of shallow landslide hazard mapping at the regional level. This allows one i) to discriminate regions where the hazard assessment is robust from areas where more data are necessary to increase the confidence level, and ii) to assess the reliability of the procedure used for hazard assessment.
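
    Physically based models such as SINMAP, SHALSTAB and TRIGRS rest on the infinite-slope limit equilibrium, so the quantity propagated through the Monte Carlo simulation is typically a factor of safety of the standard form (not necessarily the exact formulation used here):

      FS = \frac{c' + (\gamma z - \gamma_w z_w)\cos^2\theta\,\tan\phi'}{\gamma z \sin\theta\cos\theta},

    where c' and \phi' are the effective cohesion and friction angle, \gamma and \gamma_w the soil and water unit weights, z the soil depth, z_w the water table height above the failure surface, and \theta the slope angle. The PDFs assigned to c', \phi', z and permeability are what drive the simulated failure probability.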

  16. Capturing total chronological and spatial uncertainties in palaeo-ice sheet reconstructions: the DATED example

    NASA Astrophysics Data System (ADS)

    Hughes, Anna; Gyllencreutz, Richard; Mangerud, Jan; Svendsen, John Inge

    2017-04-01

    Glacial geologists generate empirical reconstructions of former ice-sheet dynamics by combining evidence from the preserved record of glacial landforms (e.g. end moraines, lineations) and sediments with chronological evidence (mainly numerical dates derived predominantly from radiocarbon, exposure and luminescence techniques). However, the geomorphological and sedimentological footprints and the chronological data are incomplete records in both space and time, and all have multiple types of uncertainty associated with them. To understand ice sheets' response to climate we need numerical models of ice-sheet dynamics based on physical principles. To test and/or constrain such models, empirical reconstructions of past ice sheets that capture and acknowledge all uncertainties are required. In 2005 we started a project (Database of the Eurasian Deglaciation, DATED) to produce an empirical reconstruction of the evolution of the last Eurasian ice sheets (including the British-Irish, Scandinavian and Svalbard-Barents-Kara Seas ice sheets) that is fully documented, specified in time, and includes uncertainty estimates. Over 5000 dates relevant to constraining ice build-up and retreat were assessed for reliability and used, together with published ice-sheet margin positions based on glacial geomorphology, to reconstruct time-slice maps of the ice sheets' extent. The DATED maps show synchronous ice margins with maximum-minimum uncertainty bounds for every 1000 years between 25 and 10 kyr ago. In the first version of results (DATED-1; Hughes et al. 2016), all uncertainties (both quantitative and qualitative, e.g. precision and accuracy of numerical dates, correlation of moraines, stratigraphic interpretations) were combined based on our best glaciological-geological assessment and expressed in terms of distance as a 'fuzzy' margin. Large uncertainties (>100 km) exist, predominantly across marine sectors and other locations where there are spatial gaps in the dating record (e.g. the timing of coalescence and separation of the Scandinavian and Svalbard-Barents-Kara ice sheets), but also in well-studied areas due to conflicting yet apparently equally robust data. In the four years since the DATED-1 census (1 January 2013), the volume of new information (from both dates and mapped glacial geomorphology) has grown significantly (~1000 new dates). Here, we present work towards the updated version of the results, DATED-2, which attempts to further reduce and explicitly report all uncertainties inherent in ice-sheet reconstructions. Hughes, A. L. C., Gyllencreutz, R., Lohne, Ø. S., Mangerud, J., Svendsen, J. I. 2016: The last Eurasian ice sheets - a chronological database and time-slice reconstruction, DATED-1. Boreas 45, 1-45, doi:10.1111/bor.12142.

  17. Contamination of packaged food by substances migrating from a direct-contact plastic layer: Assessment using a generic quantitative household scale methodology.

    PubMed

    Vitrac, Olivier; Challe, Blandine; Leblanc, Jean-Charles; Feigenbaum, Alexandre

    2007-01-01

    The contamination risk for 12 packaged foods from substances released by the plastic contact layer has been evaluated using a novel modeling technique, which predicts migration while accounting for (i) possible variations in the time of contact between foodstuffs and packaging and (ii) uncertainty in the physico-chemical parameters used to predict migration. Contamination data, which are subject to variability and uncertainty, are derived through a stochastic resolution of the transport equations that control migration into food. Distributions of contact times between packaging materials and foodstuffs were reconstructed from the volumes and frequencies of purchases of a panel of 6422 households, making assumptions about household storage behaviour. The risk of contamination of the packaged foods was estimated for styrene (a monomer found in polystyrene yogurt pots) and 2,6-di-tert-butyl-4-hydroxytoluene (a representative of the widely used phenolic antioxidants). The results are analysed and discussed with regard to the sensitivity of the model to the parameter settings and the chosen assumptions.

  18. Uncertainty of fast biological radiation dose assessment for emergency response scenarios.

    PubMed

    Ainsbury, Elizabeth A; Higueras, Manuel; Puig, Pedro; Einbeck, Jochen; Samaga, Daniel; Barquinero, Joan Francesc; Barrios, Lleonard; Brzozowska, Beata; Fattibene, Paola; Gregoire, Eric; Jaworska, Alicja; Lloyd, David; Oestreicher, Ursula; Romm, Horst; Rothkamm, Kai; Roy, Laurence; Sommer, Sylwester; Terzoudi, Georgia; Thierens, Hubert; Trompier, Francois; Vral, Anne; Woda, Clemens

    2017-01-01

    Reliable dose estimation is an important factor in appropriate dosimetric triage categorization of exposed individuals to support radiation emergency response. Following work done under the EU FP7 MULTIBIODOSE and RENEB projects, formal methods for defining uncertainties on biological dose estimates are compared using simulated and real data from recent exercises. The results demonstrate that a Bayesian method of uncertainty assessment is the most appropriate, even in the absence of detailed prior information. The relative accuracy and relevance of techniques for calculating uncertainty and combining assay results to produce single dose and uncertainty estimates is further discussed. Finally, it is demonstrated that whatever uncertainty estimation method is employed, ignoring the uncertainty on fast dose assessments can have an important impact on rapid biodosimetric categorization.
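
    As one concrete instance of the Bayesian approach favoured here, a dose can be estimated from a dicentric chromosome count with a Poisson likelihood and a linear-quadratic calibration curve evaluated over a dose grid. The sketch below is illustrative only; the curve coefficients and the observed counts are assumptions, not calibration data from these projects.

      import numpy as np
      from scipy.stats import poisson

      c, alpha, beta = 0.001, 0.03, 0.06     # dicentrics/cell (assumed curve)
      n_cells, n_dic = 500, 40               # scored cells, observed dicentrics

      dose = np.linspace(0.0, 6.0, 601)      # Gy grid
      lam = n_cells * (c + alpha * dose + beta * dose**2)
      post = poisson.pmf(n_dic, lam)         # likelihood; flat prior on dose
      post /= post.sum()

      mean = np.sum(dose * post)
      lo, hi = dose[np.searchsorted(post.cumsum(), [0.025, 0.975])]
      print(f"dose = {mean:.2f} Gy, 95% credible interval {lo:.2f}-{hi:.2f} Gy")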

  19. Framework for Uncertainty Assessment - Hanford Site-Wide Groundwater Flow and Transport Modeling

    NASA Astrophysics Data System (ADS)

    Bergeron, M. P.; Cole, C. R.; Murray, C. J.; Thorne, P. D.; Wurstner, S. K.

    2002-05-01

    Pacific Northwest National Laboratory is in the process of developing and implementing an uncertainty estimation methodology for use in future site assessments that addresses parameter uncertainty as well as uncertainties related to the groundwater conceptual model. The long-term goals of the effort are the development and implementation of an uncertainty estimation methodology for use in future assessments and analyses being made with the Hanford site-wide groundwater model. The basic approach of the framework developed for uncertainty assessment consists of: 1) Alternate conceptual model (ACM) identification, to identify and document the major features and assumptions of each conceptual model; the process must also include a periodic review of the existing and proposed new conceptual models as data or understanding become available. 2) ACM development of each identified conceptual model through inverse modeling with historical site data. 3) ACM evaluation, to identify which conceptual models are plausible and should be included in any subsequent uncertainty assessments. 4) ACM uncertainty assessments, carried out only for those ACMs determined to be plausible through comparison with historical observations and model structure identification measures. The parameter uncertainty assessment process generally involves: a) Model Complexity Optimization, to identify the important or relevant parameters for the uncertainty analysis; b) Characterization of Parameter Uncertainty, to develop the pdfs for the important uncertain parameters, including identification of any correlations among parameters; c) Propagation of Uncertainty, to propagate parameter uncertainties (e.g., by first-order second-moment methods if applicable, or by a Monte Carlo approach) through the model to determine the uncertainty in the model predictions of interest. 5) Estimation of combined ACM and scenario uncertainty by a double sum, with each component of the inner sum (an individual CCDF) representing the parameter uncertainty associated with a particular scenario and ACM, and the outer sum enumerating the various plausible ACM and scenario combinations in order to represent the combined estimate of uncertainty (a family of CCDFs). A final important part of the framework includes identification, enumeration, and documentation of all the assumptions: those made during conceptual model development, those required by the mathematical model, those required by the numerical model, those made during the spatial and temporal discretization process, those needed to assign the statistical model and associated parameters that describe the uncertainty in the relevant input parameters, and finally those required by the propagation method. Pacific Northwest National Laboratory is operated for the U.S. Department of Energy under Contract DE-AC06-76RL01830.

  20. Assessing the Impact of Climate Change on Extreme Streamflow and Reservoir Operation for Nuuanu Watershed, Oahu, Hawaii

    NASA Astrophysics Data System (ADS)

    Leta, O. T.; El-Kadi, A. I.; Dulaiova, H.

    2016-12-01

    Extreme events, such as flooding and drought, are expected to occur at increased frequencies worldwide due to climate change influencing the water cycle. This is particularly critical for tropical islands, where local freshwater resources are very sensitive to climate. This study examined the impact of climate change on extreme streamflow, reservoir water volume and outflow for the Nuuanu watershed, using the Soil and Water Assessment Tool (SWAT) model. Based on the sensitive parameters screened by the Latin Hypercube-One-factor-At-a-Time (LH-OAT) method, SWAT was calibrated and validated against daily streamflow using the SWAT Calibration and Uncertainty Program (SWAT-CUP) at three streamflow gauging stations. Results showed that SWAT adequately reproduced the observed daily streamflow hydrographs at all stations. This was verified by Nash-Sutcliffe Efficiency values in the acceptable range of 0.58 to 0.88, with more than 90% of observations bracketed within the 95% model prediction uncertainty interval for both the calibration and validation periods, signifying the potential applicability of SWAT for future prediction. The climate change impact on extreme flows, reservoir water volume and outflow was assessed under the Representative Concentration Pathway (RCP) 4.5 and 8.5 scenarios. We found wide changes in extreme peak and low flows, ranging from -44% to 20% and from -50% to -2%, respectively, compared to the baseline. Consequently, the amount of water stored in Nuuanu reservoir is projected to decrease by up to 27%, while the corresponding outflow rates are expected to decrease by up to 37% relative to the baseline. In addition, the stored water and extreme flows are highly sensitive to changes in rainfall compared to changes in temperature and solar radiation. It is concluded that such changes in extreme low and peak flows can have serious consequences, such as flooding and drought, with detrimental effects on riparian ecological functioning. This study's results are expected to aid in reservoir operation as well as in identifying appropriate climate change adaptation strategies.
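
    The Nash-Sutcliffe Efficiency used for this verification compares the residual variance of the simulation with the variance of the observations:

      NSE = 1 - \frac{\sum_t \left(Q_t^{obs} - Q_t^{sim}\right)^2}{\sum_t \left(Q_t^{obs} - \bar{Q}^{obs}\right)^2},

    so NSE = 1 indicates a perfect fit and NSE \le 0 a model no better than the mean of the observations; the reported values of 0.58 to 0.88 fall within the commonly accepted range.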

  1. Probability and Confidence Trade-space (PACT) Evaluation: Accounting for Uncertainty in Sparing Assessments

    NASA Technical Reports Server (NTRS)

    Anderson, Leif; Box, Neil; Carter, Katrina; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael

    2012-01-01

    There are two general shortcomings to the current annual sparing assessment: 1. The vehicle functions are currently assessed according to confidence targets, which can be misleading (overly conservative or optimistic). 2. The current confidence levels are arbitrarily determined and do not account for epistemic uncertainty (lack of knowledge) in the ORU failure rate. There are two major categories of uncertainty that impact the sparing assessment: (a) Aleatory Uncertainty: natural variability in the distribution of actual failures around a Mean Time Between Failure (MTBF); (b) Epistemic Uncertainty: lack of knowledge about the true value of an Orbital Replacement Unit's (ORU) MTBF. We propose an approach that revises the confidence targets and accounts for both categories of uncertainty, an approach we call Probability and Confidence Trade-space (PACT) evaluation.

  2. Bathymetric map and area/capacity table for Castle Lake, Washington

    USGS Publications Warehouse

    Mosbrucker, Adam R.; Spicer, Kurt R.

    2017-11-14

    The May 18, 1980, eruption of Mount St. Helens produced a 2.5-cubic-kilometer debris avalanche that dammed South Fork Castle Creek, causing Castle Lake to form behind a 20-meter-tall blockage. Risk of a catastrophic breach of the newly impounded lake led to outlet channel stabilization work, aggressive monitoring programs, mapping efforts, and blockage stability studies. Despite relatively large uncertainty, early mapping efforts adequately supported several lake breakout models, but have limited applicability to current lake monitoring and hazard assessment. Here, we present the results of a bathymetric survey conducted in August 2012 with the purpose of (1) verifying previous volume estimates, (2) computing an area/capacity table, and (3) producing a bathymetric map. Our survey found seasonal lake volume ranges between 21.0 and 22.6 million cubic meters with a fundamental vertical accuracy representing 0.88 million cubic meters. Lake surface area ranges between 1.13 and 1.16 square kilometers. Relationships developed by our results allow the computation of lake volume from near real-time lake elevation measurements or from remotely sensed imagery.

  3. OAST Space Theme Workshop. Volume 3: Working group summary. 9: Aerothermodynamics (M-3). A: Statement. B: Technology needs (form 1). C. Priority assessment (form 2). D. Additional assessments

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Twelve aerothermodynamic space technology needs were identified to reduce the design uncertainties in aerodynamic heating and forces experienced by heavy lift launch vehicles, orbit transfer vehicles, and advanced single stage to orbit vehicles for the space transportation system, and for probes, planetary surface landers, and sample return vehicles for solar system exploration vehicles. Research and technology needs identified include: (1) increasing the fluid dynamics capability by at least two orders of magnitude by developing an advanced computer processor for the solution of fluid dynamic problems with improved software; (2) predicting multi-engine base flow fields for launch vehicles; and (3) developing methods to conserve energy in aerothermodynamic ground test facilities.

  4. The importance of hydrological uncertainty assessment methods in climate change impact studies

    NASA Astrophysics Data System (ADS)

    Honti, M.; Scheidegger, A.; Stamm, C.

    2014-08-01

    Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with a recent boost after the publication of the IPCC AR4 report. From hundreds of impact studies, a quasi-standard methodology has emerged, shaped to a large extent by the growing public demand for predictions of how water resources management or flood protection should change in the coming decades. The "standard" workflow relies on a model cascade from global circulation model (GCM) predictions for selected IPCC scenarios to future catchment hydrology. Uncertainty is present at each level and propagates through the model cascade. There is an emerging consensus among many studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. Our hypothesis was that the relative importance of climatic and hydrologic uncertainty is (among other factors) heavily influenced by the uncertainty assessment method. To test this, we carried out a climate change impact assessment and estimated the relative importance of the uncertainty sources. The study was performed on two small catchments on the Swiss Plateau with a lumped conceptual rainfall-runoff model. In the climatic part we applied the standard ensemble approach to quantify uncertainty, but in hydrology we used formal Bayesian uncertainty assessment with two different likelihood functions. One was a time series error model able to deal with the complicated statistical properties of hydrological model residuals. The second was an approximate likelihood function for the flow quantiles. The results showed that the expected climatic impact on flow quantiles was small compared to the prediction uncertainty. The choice of uncertainty assessment method actually determined which sources of uncertainty could be identified at all. This demonstrates that one can arrive at rather different conclusions about the causes of predictive uncertainty for the same hydrological model and calibration data when considering different objective functions for calibration.
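
    As a minimal illustration of the first likelihood type, assume first-order autoregressive residuals (the paper's actual error model may be richer): the errors between observed and simulated flows are written

      E_t = Q_t^{obs} - Q_t^{sim}(\theta), \qquad E_t = \phi\,E_{t-1} + \eta_t, \qquad \eta_t \sim N(0, \sigma_\eta^2),

    and the likelihood of the model parameters \theta is the product of the Gaussian densities of the innovations \eta_t. Neglecting the autocorrelation term \phi is a common shortcut that typically understates predictive uncertainty, which is one reason the choice of likelihood matters so much here.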

  5. Uncertainty Analysis of OC5-DeepCwind Floating Semisubmersible Offshore Wind Test Campaign

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Amy N

    This paper examines how to assess the uncertainty levels for test measurements of the Offshore Code Comparison, Continued, with Correlation (OC5)-DeepCwind floating offshore wind system, examined within the OC5 project. The goal of the OC5 project was to validate the accuracy of ultimate and fatigue load estimates from a numerical model of the floating semisubmersible using data measured during scaled tank testing of the system under wind and wave loading. The examination of uncertainty was done after the test, and it was found that the limited amount of data available did not allow for an acceptable uncertainty assessment. Therefore, this paper instead qualitatively examines the sources of uncertainty associated with this test to start a discussion of how to assess uncertainty for these types of experiments and to summarize what should be done during future testing to acquire the information needed for a proper uncertainty assessment. Foremost, future validation campaigns should initiate numerical modeling before testing to guide the test campaign, which should include a rigorous assessment of uncertainty, and perform validation during testing to ensure that the tests address all of the validation needs.

  6. Uncertainty Analysis of OC5-DeepCwind Floating Semisubmersible Offshore Wind Test Campaign: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Amy N

    This paper examines how to assess the uncertainty levels for test measurements of the Offshore Code Comparison, Continued, with Correlation (OC5)-DeepCwind floating offshore wind system, examined within the OC5 project. The goal of the OC5 project was to validate the accuracy of ultimate and fatigue load estimates from a numerical model of the floating semisubmersible using data measured during scaled tank testing of the system under wind and wave loading. The examination of uncertainty was done after the test, and it was found that the limited amount of data available did not allow for an acceptable uncertainty assessment. Therefore, this paper instead qualitatively examines the sources of uncertainty associated with this test to start a discussion of how to assess uncertainty for these types of experiments and to summarize what should be done during future testing to acquire the information needed for a proper uncertainty assessment. Foremost, future validation campaigns should initiate numerical modeling before testing to guide the test campaign, which should include a rigorous assessment of uncertainty, and perform validation during testing to ensure that the tests address all of the validation needs.

  7. Comparison of linear and nonlinear programming approaches for "worst case dose" and "minmax" robust optimization of intensity-modulated proton therapy dose distributions.

    PubMed

    Zaghian, Maryam; Cao, Wenhua; Liu, Wei; Kardar, Laleh; Randeniya, Sharmalee; Mohan, Radhe; Lim, Gino

    2017-03-01

    Robust optimization of intensity-modulated proton therapy (IMPT) takes uncertainties into account during spot weight optimization and leads to dose distributions that are resilient to uncertainties. Previous studies demonstrated benefits of linear programming (LP) for IMPT in terms of delivery efficiency by considerably reducing the number of spots required for the same quality of plans. However, a reduction in the number of spots may lead to a loss of robustness. The purpose of this study was to evaluate and compare the performance, in terms of plan quality and robustness, of two robust optimization approaches using LP and nonlinear programming (NLP) models. The so-called "worst case dose" and "minmax" robust optimization approaches and the conventional planning target volume (PTV)-based optimization approach were applied to designing IMPT plans for five patients: two with prostate cancer, one with skull-base cancer, and two with head and neck cancer. For each approach, both LP and NLP models were used. Thus, for each case, six sets of IMPT plans were generated and assessed: LP-PTV-based, NLP-PTV-based, LP-worst case dose, NLP-worst case dose, LP-minmax, and NLP-minmax. The four robust optimization methods behaved differently from patient to patient, and no method emerged as superior to the others in terms of nominal plan quality and robustness against uncertainties. The plans generated using LP-based robust optimization were more robust regarding patient setup and range uncertainties than were those generated using NLP-based robust optimization for the prostate cancer patients. However, the robustness of plans generated using NLP-based methods was superior for the skull-base and head and neck cancer patients. Overall, LP-based methods were suitable for the less challenging cancer cases, in which all uncertainty scenarios were able to satisfy tight dose constraints, while NLP performed better in more difficult cases, in which tight dose limits were hard to satisfy under most uncertainty scenarios. For robust optimization, the worst case dose approach was less sensitive to uncertainties than was the minmax approach for the prostate and skull-base cancer patients, whereas the minmax approach was superior for the head and neck cancer patients. The robustness of the IMPT plans was remarkably better after robust optimization than after PTV-based optimization, and the NLP-PTV-based optimization outperformed the LP-PTV-based optimization regarding robustness of clinical target volume coverage. In addition, plans generated using LP-based methods had notably fewer scanning spots than did those generated using NLP-based methods. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
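    The optimization models themselves are not reproduced in the abstract. Purely as an illustration of the "minmax" idea, the toy sketch below minimizes the worst-case deviation of delivered from prescribed dose across uncertainty scenarios as a linear program; the dose-influence matrices and the infinity-norm objective are assumptions, not the clinical formulation:

        import numpy as np
        from scipy.optimize import linprog

        def minmax_spot_weights(dose_matrices, d_presc):
            # Solve min_w max_s ||D_s w - d||_inf with w >= 0 by adding the
            # worst-case deviation t as an extra decision variable.
            n_spots = dose_matrices[0].shape[1]
            c = np.zeros(n_spots + 1)
            c[-1] = 1.0                              # objective: minimize t
            A, b = [], []
            for D in dose_matrices:                  # one matrix per scenario
                col_t = -np.ones((D.shape[0], 1))
                A.append(np.hstack([D, col_t]))      #  D w - t <= d
                b.append(d_presc)
                A.append(np.hstack([-D, col_t]))     # -D w - t <= -d
                b.append(-d_presc)
            res = linprog(c, A_ub=np.vstack(A), b_ub=np.concatenate(b))
            return res.x[:n_spots], res.x[-1]        # weights, worst deviation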

  8. Influence of safety measures on the risks of transporting dangerous goods through road tunnels.

    PubMed

    Saccomanno, Frank; Haastrup, Palle

    2002-12-01

    Quantitative risk assessment (QRA) models are used to estimate the risks of transporting dangerous goods and to assess the merits of introducing alternative risk reduction measures for different transportation scenarios and assumptions. A comprehensive QRA model was recently developed in Europe for application to road tunnels. This model can assess the merits of a limited number of "native safety measures." In this article, we introduce a procedure for extending its scope to include the treatment of a number of important "nonnative safety measures" of interest to tunnel operators and decision makers; nonnative safety measures were not included in the original model specification. The suggested procedure makes use of expert judgment and Monte Carlo simulation methods to model uncertainty in the revised risk estimates. The results of a case study application are presented involving the risks of transporting a given volume of flammable liquid through a 10-km road tunnel.
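    As a sketch of how expert judgment and Monte Carlo simulation can be combined in the suggested way, one might elicit the effect of a nonnative safety measure as a triangular distribution on a risk-adjustment factor and apply it to the native-model estimate. All names and numbers below are hypothetical:

        import numpy as np

        def adjusted_risk(base_risk, low, mode, high, n=100_000, seed=4):
            # Expert-elicited triangular adjustment factor for a nonnative
            # safety measure, applied to the native-model risk by MC sampling.
            rng = np.random.default_rng(seed)
            factor = rng.triangular(low, mode, high, n)
            return np.percentile(base_risk * factor, [5, 50, 95])

        # e.g. experts judge a measure cuts risk to 40-90% (mode 60%):
        print(adjusted_risk(base_risk=2.0e-6, low=0.4, mode=0.6, high=0.9))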

  9. A Statistical Theory of Bidirectionality

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard; Ulbrich, Norbert

    2013-01-01

    Original concepts related to the quantification and assessment of bidirectionality in strain-gage balances were introduced by Ulbrich in 2012. These concepts are extended here in three ways: 1) the metric originally proposed by Ulbrich is normalized, 2) a categorical variable is introduced in the regression analysis to account for load polarity, and 3) the uncertainty in both normalized and non-normalized bidirectionality metrics is quantified. These extensions are applied to four representative balances to assess the bidirectionality characteristics of each. The paper is tutorial in nature, featuring reviews of certain elements of regression and formal inference. Principal findings are that bidirectionality appears to be a common characteristic of most balance outputs and that unless it is taken into account, it is likely to consume the entire error budget of a typical balance calibration experiment. Data volume and correlation among calibration loads are shown to have a significant impact on the precision with which bidirectionality metrics can be assessed.
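    One plausible form of the polarity categorical variable (extension 2 above) is an indicator that lets the calibration slope differ by load sign; the single-load least-squares model below is an illustrative assumption, not the actual balance calibration model:

        import numpy as np

        def fit_with_polarity(load, response):
            # Indicator z = 1 for negative loads lets the slope differ by
            # polarity; the z*load coefficient is a bidirectionality signal.
            load = np.asarray(load, dtype=float)
            z = (load < 0).astype(float)            # categorical variable
            X = np.column_stack([np.ones_like(load), load, z * load])
            coef, *_ = np.linalg.lstsq(X, response, rcond=None)
            return coef          # intercept, slope, polarity slope shift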

  10. An appraisal of Indonesia's immense peat carbon stock using national peatland maps: uncertainties and potential losses from conversion.

    PubMed

    Warren, Matthew; Hergoualc'h, Kristell; Kauffman, J Boone; Murdiyarso, Daniel; Kolka, Randall

    2017-12-01

    A large proportion of the world's tropical peatlands occur in Indonesia, where rapid conversion and associated losses of carbon, biodiversity and ecosystem services have brought peatland management to the forefront of Indonesia's climate mitigation efforts. We evaluated peat volume from two commonly referenced maps of peat distribution and depth published by Wetlands International (WI) and the Indonesian Ministry of Agriculture (MoA), and used regionally specific values of carbon density to calculate carbon stocks. Peatland extent and volume published in the MoA maps are lower than those in the WI maps, resulting in lower estimates of carbon storage. We estimate Indonesia's total peat carbon store to be between 13.6 GtC (the low MoA map estimate) and 40.5 GtC (the high WI map estimate), with a best estimate of 28.1 GtC: the midpoint of the medium carbon stock estimates derived from the WI (30.8 GtC) and MoA (25.3 GtC) maps. This estimate is about half that of previous assessments, which used an assumed average value of peat thickness for all Indonesian peatlands, and it revises the current global tropical peat carbon pool to 75 GtC. Yet these results do not diminish the significance of Indonesia's peatlands, which store an estimated 30% more carbon than the biomass of all Indonesian forests. The largest discrepancy between maps is for the Papua province, which accounts for 62-71% of the overall differences in peat area, volume and carbon storage. According to the MoA map, 80% of Indonesian peatlands are <300 cm thick and thus vulnerable to conversion outside of protected areas according to environmental regulations. The carbon contained in these shallower peatlands is conservatively estimated to be 10.6 GtC, equivalent to 42% of Indonesia's total peat carbon and about 12 years of global emissions from land use change at current rates. Considering the high uncertainties in peatland extent, volume and carbon storage revealed in this assessment of current maps, a systematic revision of Indonesia's peat maps to produce a single, universally accepted geospatial reference would improve national peat carbon storage estimates and greatly benefit carbon cycle research, land use management and spatial planning.

  11. Reducing, Maintaining, or Escalating Uncertainty? The Development and Validation of Four Uncertainty Preference Scales Related to Cancer Information Seeking and Avoidance.

    PubMed

    Carcioppolo, Nick; Yang, Fan; Yang, Qinghua

    2016-09-01

    Uncertainty is a central characteristic of many aspects of cancer prevention, screening, diagnosis, and treatment. Brashers's (2001) uncertainty management theory details the multifaceted nature of uncertainty and describes situations in which uncertainty can both positively and negatively affect health outcomes. The current study extends theory on uncertainty management by developing four scale measures of uncertainty preferences in the context of cancer. Two national surveys were conducted to validate the scales and assess convergent and concurrent validity. Results support the factor structure of each measure and provide general support across multiple validity assessments. These scales can advance research on uncertainty and cancer communication by providing researchers with measures that address multiple aspects of uncertainty management.

  12. Position uncertainty distribution for articulated arm coordinate measuring machine based on simplified definite integration

    NASA Astrophysics Data System (ADS)

    You, Xu; Zhi-jian, Zong; Qun, Gao

    2018-07-01

    This paper describes a methodology for determining the position uncertainty distribution of an articulated arm coordinate measuring machine (AACMM). First, a model of the structural parameter uncertainties was established by a statistical method. Second, the position uncertainty space volume of the AACMM in a certain configuration was expressed using a simplified definite integration method based on the structural parameter uncertainties; it was then used to evaluate the position accuracy of the AACMM in that configuration. Third, the configurations of a given working point were calculated by an inverse solution, and the position uncertainty distribution of the working point was determined; the uncertainty of a working point can then be evaluated by a weighting method. Lastly, the position uncertainty distribution in the workspace of the AACMM was described by a map. A single-point contrast test of a 6-joint AACMM was carried out to verify the effectiveness of the proposed method. It was shown that the method can describe the position uncertainty of the AACMM and can be used to guide the calibration of the AACMM and the choice of its accuracy area.

  13. Development and comparison of Bayesian modularization method in uncertainty assessment of hydrological models

    NASA Astrophysics Data System (ADS)

    Li, L.; Xu, C.-Y.; Engeland, K.

    2012-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, different approaches have been used in hydrological modelling. The Bayesian method is one of the most widely used for uncertainty assessment of hydrological models; it incorporates different sources of information into a single analysis through Bayes' theorem. However, existing applications do not treat the uncertainty in the extreme flows of hydrological simulations well. This study proposes a Bayesian modularization approach for uncertainty assessment of conceptual hydrological models that explicitly considers the extreme flows. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and by traditional Bayesian models using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used with the traditional Bayesian approach: the AR(1) plus Normal, time-period-independent model (Model 1); the AR(1) plus Normal, time-period-dependent model (Model 2); and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimates of the entire flow range and in terms of application and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models via Bayesian inference. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
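    For reference, a bare-bones random-walk Metropolis-Hastings sampler of the kind named above might look as follows; the step size, iteration count and posterior function are placeholders, not the study's settings:

        import numpy as np

        def metropolis_hastings(log_post, theta0, n_iter=10_000, step=0.1, seed=0):
            # Random-walk MH: propose a Gaussian step, accept with
            # probability min(1, posterior ratio).
            rng = np.random.default_rng(seed)
            theta = np.asarray(theta0, dtype=float)
            lp = log_post(theta)
            chain = np.empty((n_iter, theta.size))
            for i in range(n_iter):
                prop = theta + step * rng.standard_normal(theta.size)
                lp_prop = log_post(prop)
                if rng.uniform() < np.exp(min(0.0, lp_prop - lp)):
                    theta, lp = prop, lp_prop
                chain[i] = theta
            return chain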

  14. Estimating the rates of mass change, ice volume change and snow volume change in Greenland from ICESat and GRACE data

    NASA Astrophysics Data System (ADS)

    Slobbe, D. C.; Ditmar, P.; Lindenbergh, R. C.

    2009-01-01

    The focus of this paper is on the quantification of ongoing mass and volume changes over the Greenland ice sheet. For that purpose, we used elevation changes derived from the Ice, Cloud, and land Elevation Satellite (ICESat) laser altimetry mission and monthly variations of the Earth's gravity field as observed by the Gravity Recovery and Climate Experiment (GRACE) mission. Based on a stand-alone processing scheme of ICESat data, the most probable estimate of the mass change rate from February 2003 to April 2007 equals -139 +/- 68 Gton/yr. Here, we used a density of 600 +/- 300 kg/m^3 to convert the estimated elevation change rate in the region above 2000 m into a mass change rate; for the region below 2000 m, we used a density of 900 +/- 300 kg/m^3. Based on GRACE gravity models from mid-2002 to mid-2007 as processed by CNES, CSR, DEOS and GFZ, the estimated mass change rate for the whole of Greenland ranges between -128 and -218 Gton/yr. Most GRACE solutions show much stronger mass losses than obtained with ICESat, which might be related to a local undersampling of the mass loss by ICESat and to uncertainties in the snow/ice densities used. To address the uncertainties in the snow and ice densities, two independent joint inversion concepts are proposed to profit from both GRACE and ICESat observations simultaneously. The first concept, developed to reduce the uncertainty of the mass change rate, estimates this rate in combination with an effective snow/ice density. However, it turns out that the uncertainties are not reduced, probably because of the unrealistic assumption that the effective density is constant in space and time. The second concept is designed to convert GRACE and ICESat data into two entirely new products: variations of ice volume and variations of snow volume separately. Such an approach is expected to lead to new insights into ongoing mass change processes over the Greenland ice sheet. Our results show, for the different GRACE solutions, a snow volume change of -11 to 155 km^3/yr and an ice loss at a rate of -136 to -292 km^3/yr.
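    The density conversion is a simple product with first-order uncertainty propagation. A small sketch, using hypothetical input values rather than the paper's regional breakdown:

        import numpy as np

        def mass_rate(vol_rate, vol_rate_sd, rho, rho_sd):
            # 1 km^3 at density rho (kg/m^3) weighs rho * 1e-3 Gton, so the
            # mass rate (Gton/yr) is the product of volume rate and density,
            # with first-order error propagation for a product.
            m = vol_rate * rho * 1e-3
            sd = 1e-3 * np.hypot(vol_rate * rho_sd, rho * vol_rate_sd)
            return m, sd

        # Hypothetical -200 +/- 40 km^3/yr of ice at 900 +/- 300 kg/m^3:
        print(mass_rate(-200.0, 40.0, 900.0, 300.0))  # ~ -180 +/- 70 Gton/yr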

  15. Development of probabilistic internal dosimetry computer code

    NASA Astrophysics Data System (ADS)

    Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki

    2017-02-01

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which propagates into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was designed. Based on this system, we developed a probabilistic internal-dose-assessment code in MATLAB to estimate dose distributions from measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations. In cases of severe internal exposure, the causation probability of a deterministic health effect can be derived from the dose distribution, and a high statistical value (e.g., the 95th percentile of the distribution) can be used to determine the appropriate intervention. The distribution-based sensitivity analysis can also be used to quantify the contribution of each factor to the dose uncertainty, which is essential information for reducing and optimizing the uncertainty in the internal dose assessment. Therefore, the present study can contribute to retrospective dose assessment for accidental internal exposure scenarios, as well as to internal dose monitoring optimization and uncertainty reduction.
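    The propagation step reduces to sampling each uncertain component and reading percentiles off the resulting dose distribution. A sketch in Python (the study's code is in MATLAB; the sampling function here is a hypothetical placeholder):

        import numpy as np

        def dose_percentiles(sample_dose, n=100_000, seed=1):
            # sample_dose(rng) draws one committed-dose realization with all
            # uncertain components (intake, biokinetics, measurement) sampled
            # from their distributions; a placeholder, not the actual code.
            rng = np.random.default_rng(seed)
            doses = np.array([sample_dose(rng) for _ in range(n)])
            return np.percentile(doses, [2.5, 5, 50, 95, 97.5])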

  16. Methods for Assessing Uncertainties in Climate Change, Impacts and Responses (Invited)

    NASA Astrophysics Data System (ADS)

    Manning, M. R.; Swart, R.

    2009-12-01

    Assessing the scientific uncertainties or confidence levels for the many different aspects of climate change is particularly important because of the seriousness of potential impacts and the magnitude of economic and political responses that are needed to mitigate climate change effectively. This has made the treatment of uncertainty and confidence a key feature in the assessments carried out by the Intergovernmental Panel on Climate Change (IPCC). Because climate change is very much a cross-disciplinary area of science, adequately dealing with uncertainties requires recognition of their wide range and of the different perspectives on assessing and communicating those uncertainties. The structural differences that exist across disciplines are often embedded deeply in the corresponding literature that is used as the basis for an IPCC assessment. The assessment of climate change science by the IPCC has from its outset tried to report the levels of confidence and uncertainty in the degree of understanding in both the underlying multi-disciplinary science and in projections for future climate. The growing recognition of the seriousness of this led to the formation of a detailed approach for consistent treatment of uncertainties in the IPCC’s Third Assessment Report (TAR) [Moss and Schneider, 2000]. However, in completing the TAR there remained some systematic differences between the disciplines, raising concerns about the level of consistency. So further consideration of a systematic approach to uncertainties was undertaken for the Fourth Assessment Report (AR4). The basis for the approach used in the AR4 was developed at an expert meeting of scientists representing many different disciplines. This led to the introduction of a broader way of addressing uncertainties in the AR4 [Manning et al., 2004], which was further refined through lengthy discussions among many IPCC Lead Authors over more than a year, resulting in a short summary of a standard approach to be followed for that assessment [IPCC, 2005]. This paper extends a review of the treatment of uncertainty in the IPCC assessments by Swart et al. [2009]. It is shown that progress towards consistency has been made but that there also appears to be a need for continued use of several complementary approaches in order to cover the wide range of circumstances across the different disciplines involved in climate change. While this reflects the situation in the science community, it also raises the level of complexity for policymakers and other users of the assessments, who would prefer one common consensus approach. References: IPCC (2005), Guidance Notes for Lead Authors of the IPCC Fourth Assessment Report on Addressing Uncertainties, IPCC, Geneva. Manning, M., et al. (2004), IPCC Workshop on Describing Scientific Uncertainties in Climate Change to Support Analysis of Risk and of Options, IPCC. Moss, R., and S. Schneider (2000), Uncertainties, in Guidance Papers on the Cross Cutting Issues of the Third Assessment Report of the IPCC, edited by R. Pachauri, et al., Intergovernmental Panel on Climate Change (IPCC), Geneva. Swart, R., et al. (2009), Agreeing to disagree: uncertainty management in assessing climate change, impacts and responses by the IPCC, Climatic Change, 92(1-2), 1-29.

  17. Using a Meniscus to Teach Uncertainty in Measurement

    NASA Astrophysics Data System (ADS)

    Backman, Philip

    2008-02-01

    I have found that students easily understand that a measurement cannot be exact, but they often seem to lack an understanding of why it is important to know something about the magnitude of the uncertainty. This tends to promote an attitude that almost any uncertainty value will do. Such indifference may exist because once an uncertainty is determined or calculated, it remains as only a number without a concrete physical connection back to the experiment. For the activity described here—presented as a challenge—groups of students are given a container and asked to make certain measurements and to estimate the uncertainty in each of those measurements. They are then challenged to complete a particular task involving the container and a volume of water. Whether the assigned task is actually achievable, however, slowly comes into question once the magnitude of the uncertainties in the original measurements is compared to the specific requirements of the challenge.

  18. Error and Uncertainty Quantification in the Numerical Simulation of Complex Fluid Flows

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.

    2010-01-01

    The failure of numerical simulation to predict physical reality is often a direct consequence of the compounding effects of numerical error arising from finite-dimensional approximation and physical model uncertainty resulting from inexact knowledge and/or statistical representation. In this topical lecture, we briefly review systematic theories for quantifying numerical errors and restricted forms of model uncertainty occurring in simulations of fluid flow. A goal of this lecture is to elucidate both positive and negative aspects of applying these theories to practical fluid flow problems. Finite-element and finite-volume calculations of subsonic and hypersonic fluid flow are presented to contrast the differing roles of numerical error and model uncertainty for these problems.

  19. Measuring Research Data Uncertainty in the 2010 NRC Assessment of Geography Graduate Education

    ERIC Educational Resources Information Center

    Shortridge, Ashton; Goldsberry, Kirk; Weessies, Kathleen

    2011-01-01

    This article characterizes and measures errors in the 2010 National Research Council (NRC) assessment of research-doctorate programs in geography. This article provides a conceptual model for data-based sources of uncertainty and reports on a quantitative assessment of NRC research data uncertainty for a particular geography doctoral program.…

  20. Uncertainty quantification in flood risk assessment

    NASA Astrophysics Data System (ADS)

    Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto

    2017-04-01

    Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with the assessment of the existing risk to extreme events, additional uncertainty arises because of temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take into account all these sources of uncertainty. They should be based on the understanding of how flood extremes are generated and how they change over time. They should also account for the dynamics of risk perception of decision makers and population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach of integrated flood risk management aiming at enhancing resilience rather than searching for optimal defense strategies.

  1. Local setup errors in image-guided radiotherapy for head and neck cancer patients immobilized with a custom-made device.

    PubMed

    Giske, Kristina; Stoiber, Eva M; Schwarz, Michael; Stoll, Armin; Muenter, Marc W; Timke, Carmen; Roeder, Falk; Debus, Juergen; Huber, Peter E; Thieke, Christian; Bendl, Rolf

    2011-06-01

    To evaluate the local positioning uncertainties during fractionated radiotherapy of head-and-neck cancer patients immobilized using a custom-made fixation device and discuss the effect of possible patient correction strategies for these uncertainties. A total of 45 head-and-neck patients underwent regular control computed tomography scanning using an in-room computed tomography scanner. The local and global positioning variations of all patients were evaluated by applying a rigid registration algorithm. One bounding box around the complete target volume and nine local registration boxes containing relevant anatomic structures were introduced. The resulting uncertainties for a stereotactic setup and the deformations referenced to one anatomic local registration box were determined. Local deformations of the patients immobilized using our custom-made device were compared with previously published results. Several patient positioning correction strategies were simulated, and the residual local uncertainties were calculated. The patient anatomy in the stereotactic setup showed local systematic positioning deviations of 1-4 mm. The deformations referenced to a particular anatomic local registration box were similar to the reported deformations assessed from patients immobilized with commercially available Aquaplast masks. A global correction, including the rotational error compensation, decreased the remaining local translational errors. Depending on the chosen patient positioning strategy, the remaining local uncertainties varied considerably. Local deformations in head-and-neck patients occur even if an elaborate, custom-made patient fixation method is used. A rotational error correction decreased the required margins considerably. None of the considered correction strategies achieved perfect alignment. Therefore, weighting of anatomic subregions to obtain the optimal correction vector should be investigated in the future. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. Uncertainties for seismic moment tensors and applications to nuclear explosions, volcanic events, and earthquakes

    NASA Astrophysics Data System (ADS)

    Tape, C.; Alvizuri, C. R.; Silwal, V.; Tape, W.

    2017-12-01

    When considered as a point source, a seismic source can be characterized in terms of its origin time, hypocenter, moment tensor, and source time function. The seismologist's task is to estimate these parameters--and their uncertainties--from three-component ground motion recorded at irregularly spaced stations. We will focus on one portion of this problem: the estimation of the moment tensor and its uncertainties. With magnitude estimated separately, we are left with five parameters describing the normalized moment tensor. A lune of normalized eigenvalue triples can be used to visualize the two parameters (lune longitude and lune latitude) describing the source type, while the conventional strike, dip, and rake angles can be used to characterize the orientation. Slight modifications of these five parameters lead to a uniform parameterization of moment tensors--uniform in the sense that equal volumes in the coordinate domain of the parameterization correspond to equal volumes of moment tensors. For a moment tensor m that we have inferred from seismic data for an earthquake, we define P(V) to be the probability that the true moment tensor for the earthquake lies in the neighborhood of m that has fractional volume V. The average value of P(V) is then a measure of our confidence in our inference of m. The calculation of P(V) requires knowing both the probability P(w) and the fractional volume V(w) of the set of moment tensors within a given angular radius w of m. We apply this approach to several different data sets, including nuclear explosions from the Nevada Test Site, volcanic events from Uturuncu (Bolivia), and earthquakes. Several challenges remain: choosing an appropriate misfit function, handling time shifts between data and synthetic waveforms, and extending the uncertainty estimation to include more source parameters (e.g., hypocenter and source time function).
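    Under the definitions in the abstract, with P(w) and V(w) the probability and the fractional volume of the set of moment tensors within angular radius w of m, the confidence measure (the average of P(V)) can be written compactly, e.g. in LaTeX form:

        \bar{P} \;=\; \int_{0}^{1} P(V)\,\mathrm{d}V
                \;=\; \int_{0}^{w_{\max}} P(w)\,\frac{\mathrm{d}V}{\mathrm{d}w}\,\mathrm{d}w

    The second form follows from the change of variables V = V(w) and shows why both P(w) and V(w) are needed for the calculation.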

  3. TU-H-CAMPUS-JeP2-02: Interobserver Variability of CT, PET-CT and MRI Based Primary Tumor Delineation for Lung Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karki, K; Hugo, G; Saraiya, S

    Purpose: Target delineation in lung cancer radiotherapy has, in general, large variability. MRI has so far not been investigated in detail for lung cancer delineation variability. The purpose of this study is to investigate delineation variability for lung tumors using MRI and compare it to CT alone and PET-CT based delineations. Methods: Seven physicians delineated the primary tumor volumes of nine patients for the following scenarios: (1) CT only; (2) post-contrast T1-weighted MRI registered with diffusion-weighted MRI; and (3) PET-CT fusion images. To compute interobserver variability, the median surface was generated from all observers’ contours and used as the reference surface. A single physician labeled the interface types (tumor to lung, atelectasis (collapsed lung), hilum, mediastinum, or chest-wall) on the median surface. Volume variation (normalized to PET-CT volume), minimum distance (MD), and bidirectional local distance (BLD) between individual observers’ contours and the reference contour were measured. Results: CT- and MRI-based normalized volumes were 1.61±0.76 (mean±SD) and 1.38±0.44, respectively, both significantly larger than PET-CT (p<0.05, paired t-test). The overall uncertainty (root mean square of SD values over all points) of both BLD and MD measures of the observers for the interfaces were not significantly different (p>0.05, two-sample t-test) for all imaging modalities except between tumor-mediastinum and tumor-atelectasis in PET-CT. The largest mean overall uncertainty was observed for the tumor-atelectasis interface, the smallest for the tumor-mediastinum and tumor-lung interfaces for all modalities. The whole tumor uncertainties for both BLD and MD were not significantly different between any two modalities (p>0.05, paired t-test). Overall uncertainties for the interfaces using BLD were similar to those using MD. Conclusion: Large volume variations were observed between the three imaging modalities. Contouring variability appeared to depend on the interface type. This study will be useful for understanding the delineation uncertainty for radiotherapy planning of lung cancer using different imaging modalities. Disclosures: Research agreement with Phillips Healthcare (GH and EW), National Institutes of Health Licensing agreement with Varian Medical Systems (GH and EW), research grants from the National Institute of Health (GH and EW), UpToDate royalties (EW), and none (others). Authors have no potential conflicts of interest to disclose.

  4. Impact of geometric uncertainties on dose calculations for intensity modulated radiation therapy of prostate cancer

    NASA Astrophysics Data System (ADS)

    Jiang, Runqing

    Intensity-modulated radiation therapy (IMRT) uses non-uniform beam intensities within a radiation field to provide patient-specific dose shaping, resulting in a dose distribution that conforms tightly to the planning target volume (PTV). Unavoidable geometric uncertainty arising from patient repositioning and internal organ motion can lead to a lower conformality index (CI) during treatment delivery, a decrease in tumor control probability (TCP) and an increase in normal tissue complication probability (NTCP). The CI of the IMRT plan depends heavily on steep dose gradients between the PTV and organ at risk (OAR). Geometric uncertainties reduce the planned dose gradients and result in a less steep or "blurred" dose gradient. The blurred dose gradients can be maximized by constraining the dose objective function in the static IMRT plan or by reducing geometric uncertainty during treatment with corrective verification imaging. Internal organ motion and setup error were evaluated simultaneously for 118 individual patients with implanted fiducials and MV electronic portal imaging (EPI). A Gaussian probability density function (PDF) is reasonable for modeling geometric uncertainties, as indicated by the group of 118 patients. The Gaussian PDF is patient-specific, and the group standard deviation (SD) should not be used for accurate treatment planning for individual patients. In addition, individual SDs should not be determined or predicted from small imaging samples because of the random nature of the fluctuations. Frequent verification imaging should be employed in situations where geometric uncertainties are expected. Cumulative PDF data can be used for re-planning to assess the accuracy of delivered dose. Group data are useful for determining the worst-case discrepancy between planned and delivered dose. The margins for the PTV should ideally represent true geometric uncertainties. The measured geometric uncertainties were used in this thesis to assess PTV coverage, dose to OAR, equivalent uniform dose per fraction (EUDf) and NTCP. The dose distribution including geometric uncertainties was determined from integration of the convolution of the static dose gradient with the PDF. Integration of the convolution of the static dose and the derivative of the PDF can also be used to determine the dose including geometric uncertainties, although this method was not investigated in detail. The local maximum dose gradient (LMDG) was determined via optimization of the dose objective function by manually adjusting DVH control points or selecting beam numbers and directions during IMRT treatment planning. The minimum SD (SDmin) is used when geometric uncertainty is corrected with verification imaging; the maximum SD (SDmax) is used when the geometric uncertainty is known to be large and difficult to manage. SDmax was 4.38 mm in the anterior-posterior (AP) direction, 2.70 mm in the left-right (LR) direction and 4.35 mm in the superior-inferior (SI) direction; SDmin was 1.1 mm in all three directions if a threshold of less than 2 mm was used for uncorrected fractions in every direction. EUDf is a useful QA parameter for interpreting the biological impact of geometric uncertainties on the static dose distribution, and it has been used as the basis for the time-course NTCP evaluation in the thesis. Relative NTCP values are useful for comparative QA checking by normalizing known complications (e.g. those reported in the RTOG studies) to specific DVH control points. For prostate cancer patients, rectal complications were evaluated from specific RTOG clinical trials and a detailed evaluation of the treatment techniques (e.g. dose prescription, DVH, number of beams, beam angles). Treatment plans that did not meet DVH constraints represented additional complication risk. Geometric uncertainties improved or worsened rectal NTCP depending on the individual internal organ motion within each patient.
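    The "blurred dose" construction described above amounts to convolving the static dose with the geometric-uncertainty PDF. A minimal one-dimensional sketch, assuming a Gaussian PDF as in the thesis (function and parameter names are illustrative):

        import numpy as np
        from scipy.ndimage import gaussian_filter1d

        def blurred_dose(static_dose, sigma_mm, voxel_mm):
            # Convolve a 1-D static dose profile with a Gaussian geometric-
            # uncertainty PDF of standard deviation sigma_mm.
            return gaussian_filter1d(static_dose, sigma=sigma_mm / voxel_mm,
                                     mode='nearest')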

  5. Limited Impact of Setup and Range Uncertainties, Breathing Motion, and Interplay Effects in Robustly Optimized Intensity Modulated Proton Therapy for Stage III Non-small Cell Lung Cancer.

    PubMed

    Inoue, Tatsuya; Widder, Joachim; van Dijk, Lisanne V; Takegawa, Hideki; Koizumi, Masahiko; Takashina, Masaaki; Usui, Keisuke; Kurokawa, Chie; Sugimoto, Satoru; Saito, Anneyuko I; Sasai, Keisuke; Van't Veld, Aart A; Langendijk, Johannes A; Korevaar, Erik W

    2016-11-01

    To investigate the impact of setup and range uncertainties, breathing motion, and interplay effects using scanning pencil beams in robustly optimized intensity modulated proton therapy (IMPT) for stage III non-small cell lung cancer (NSCLC). Three-field IMPT plans were created using a minimax robust optimization technique for 10 NSCLC patients. The plans accounted for 5- or 7-mm setup errors with ±3% range uncertainties. The robustness of the nominal IMPT plans was evaluated considering (1) isotropic 5-mm setup errors with ±3% range uncertainties; (2) breathing motion; (3) interplay effects; and (4) a combination of items 1 and 2. The plans were calculated using 4-dimensional and average intensity projection computed tomography images. The target coverage (TC, the volume receiving 95% of the prescribed dose) and homogeneity index (D2 - D98, where D2 and D98 are the least doses received by 2% and 98% of the volume) for the internal clinical target volume, and dose indexes for lung, esophagus, heart and spinal cord, were compared with those of clinical volumetric modulated arc therapy plans. The TC and homogeneity index for all plans were within clinical limits when the breathing motion and interplay effects were considered independently. The setup and range uncertainties had a larger effect when considered in combination. The TC decreased to <98% (the clinical threshold) in 3 of 10 patients for the robust 5-mm evaluations; however, the TC remained >98% for the robust 7-mm evaluations for all patients. The organ-at-risk dose parameters did not vary significantly between the respective robust 5-mm and robust 7-mm evaluations for the 4 error types. Compared with the volumetric modulated arc therapy plans, the IMPT plans showed better target homogeneity, with mean lung and heart dose parameters reduced by about 40% and 60%, respectively. In robustly optimized IMPT for stage III NSCLC, the setup and range uncertainties, breathing motion, and interplay effects have limited impact on target coverage, dose homogeneity, and organ-at-risk dose parameters. Copyright © 2016 Elsevier Inc. All rights reserved.
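    The two target metrics quoted above are simple to compute from the voxel doses inside the target volume; a small sketch under the abstract's definitions:

        import numpy as np

        def target_metrics(ctv_dose, d_presc):
            # TC: fraction of target voxels receiving >= 95% of d_presc.
            # HI: D2 - D98, the 98th minus the 2nd dose percentile.
            ctv_dose = np.asarray(ctv_dose, dtype=float)
            tc = np.mean(ctv_dose >= 0.95 * d_presc)
            d2, d98 = np.percentile(ctv_dose, [98, 2])
            return tc, d2 - d98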

  6. Improving the effectiveness of real-time flood forecasting through Predictive Uncertainty estimation: the multi-temporal approach

    NASA Astrophysics Data System (ADS)

    Barbetta, Silvia; Coccia, Gabriele; Moramarco, Tommaso; Todini, Ezio

    2015-04-01

    The negative effects of severe flood events are usually contrasted through structural measures which, however, do not fully eliminate flood risk. Non-structural measures, such as real-time flood forecasting and warning, are also required. Accurate stage/discharge predictions with an appropriate forecast lead time are sought by decision-makers for implementing strategies to mitigate the adverse effects of floods. Traditionally, flood forecasting has been approached using rainfall-runoff and/or flood routing modelling. Neither type of forecast can be considered a perfect representation of future outcomes, because of incomplete knowledge of the processes involved (Todini, 2004). Nonetheless, although aware of this, decision makers de facto implicitly treat the forecast of water level/discharge/volume, etc. as "deterministic" and coinciding with what is going to occur. Recently the concept of Predictive Uncertainty (PU) was introduced in hydrology (Krzysztofowicz, 1999), and several uncertainty processors have been developed (Todini, 2008). PU is defined as the probability of occurrence of the future realization of a predictand (water level/discharge/volume) conditional on: i) prior observations and knowledge, and ii) the available information on the future value, typically provided by one or more forecast models. Unfortunately, PU has frequently been interpreted as a measure of lack of accuracy rather than as the appropriate tool for taking the most appropriate decisions given one or several models' forecasts. With the aim of shedding light on the benefits of appropriately using PU, a multi-temporal approach based on the MCP approach (Todini, 2008; Coccia and Todini, 2011) is here applied to stage forecasts at sites along the Upper Tiber River. Specifically, the STAge Forecasting-Rating Curve Model Muskingum-based (STAFOM-RCM) (Barbetta et al., 2014) and the Rating-Curve Model in Real Time (RCM-RT) (Barbetta and Moramarco, 2014) are used to this end. Both models, without explicitly using rainfall information, account at each forecast time for the estimated lateral contribution along the river reach for which the downstream stage forecast is issued. The analysis is performed for several reaches using different lead times according to the channel length. References: Barbetta, S., Moramarco, T., Brocca, L., Franchini, M. and Melone, F. 2014. Confidence interval of real-time forecast stages provided by the STAFOM-RCM model: the case study of the Tiber River (Italy). Hydrological Processes, 28(3), 729-743. Barbetta, S. and Moramarco, T. 2014. Real-time flood forecasting by relating local stage and remote discharge. Hydrological Sciences Journal, 59(9), 1656-1674. Coccia, G. and Todini, E. 2011. Recent developments in predictive uncertainty assessment based on the Model Conditional Processor approach. Hydrology and Earth System Sciences, 15, 3253-3274. doi:10.5194/hess-15-3253-2011. Krzysztofowicz, R. 1999. Bayesian theory of probabilistic forecasting via deterministic hydrologic model. Water Resources Research, 35, 2739-2750. Todini, E. 2004. Role and treatment of uncertainty in real-time flood forecasting. Hydrological Processes, 18(14), 2743-2746. Todini, E. 2008. A model conditional processor to assess predictive uncertainty in flood forecasting. Intl. J. River Basin Management, 6(2), 123-137.

  7. Expert agreements and disagreements on induced seismicity by Enhanced Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Trutnevyte, E.; Azevedo, I. L.

    2016-12-01

    Enhanced or Engineered Geothermal Systems (EGS) are at an early stage of development, and only a handful of projects exist worldwide. In the face of limited empirical evidence on EGS induced seismicity, expert elicitation provides a complementary view to quantitative assessments and basic science. We present the results of an international expert elicitation exercise with 14 experts from 6 countries. The elicitation aimed at evaluating induced seismicity hazard and risk for EGS and characterizing the associated uncertainty. A state-of-the-art expert elicitation method was used: it combines technical analysis with behavioral-science-informed elicitation of expert judgement in order to minimize subjectivity. The experts assessed a harmonized scenario of an EGS plant, its operational characteristics, geological context, and surrounding buildings and infrastructures. They provided quantitative estimates of the probabilities of inducing events of M>=3 and M>=5 and of the maximum magnitudes that could be observed, and made judgements on economic loss, injuries, and fatalities for M=3 and M=5 events. The experts also rated the importance of factors that influence induced seismicity hazard and risk (e.g. reservoir depth, injected volumes, exposed building stock) and the potential uncertainty reductions through future research. We present the findings of this elicitation and highlight the points of expert agreement and disagreement.

  8. On-orbit servicing system assessment and optimization methods based on lifecycle simulation under mixed aleatory and epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; van Tooren, Michel

    2013-06-01

    To assess the on-orbit servicing (OOS) paradigm and optimize its utility by taking advantage of its inherent flexibility and responsiveness, OOS system assessment and optimization methods based on lifecycle simulation under uncertainties are studied. The uncertainty sources considered in this paper include both the aleatory (random launch/OOS operation failure and on-orbit component failure) and the epistemic (the unknown trend of the end-user market price) types. Firstly, the lifecycle simulation under uncertainties is discussed: the chronological flowchart is presented, the cost and benefit models are established, the uncertainties thereof are modeled, and the dynamic programming method used to make optimal decisions in the face of uncertain events is introduced. Secondly, the method to analyze the propagation effects of the uncertainties on the OOS utilities is studied. With combined probability and evidence theory, a Monte Carlo lifecycle Simulation based Unified Uncertainty Analysis (MCS-UUA) approach is proposed, based on which the OOS utility assessment tool under mixed uncertainties is developed. Thirdly, to further optimize the OOS system under mixed uncertainties, the reliability-based optimization (RBO) method is studied. To alleviate the computational burden of the traditional RBO method, which involves nested optimum search and uncertainty analysis, the framework of Sequential Optimization and Mixed Uncertainty Analysis (SOMUA) is employed to integrate MCS-UUA, and the RBO algorithm SOMUA-MCS is developed. Fourthly, a case study on the OOS system for a hypothetical GEO commercial communication satellite is investigated with the proposed assessment tool, and the OOS system is further optimized with SOMUA-MCS. Lastly, some conclusions are given and future research prospects are highlighted.
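    Mixed aleatory/epistemic propagation is often sketched as a double loop: the epistemic quantity (here the market-price trend, known only as an interval) is scanned in an outer loop while aleatory events are sampled inside, yielding an interval of expected utilities rather than a single number. A schematic sketch of that pattern, not the paper's MCS-UUA implementation:

        import numpy as np

        def utility_bounds(utility, price_interval, n_outer=50, n_inner=2000, seed=0):
            # Outer loop scans the epistemic interval (market-price trend);
            # inner loop samples aleatory events (launch/component failures)
            # inside utility(price, rng). Returns bounds on expected utility.
            rng = np.random.default_rng(seed)
            means = []
            for price in np.linspace(*price_interval, n_outer):
                u = [utility(price, rng) for _ in range(n_inner)]
                means.append(np.mean(u))
            return min(means), max(means)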

  9. An Integrated Systems Approach to Designing Climate Change Adaptation Policy in Water Resources

    NASA Astrophysics Data System (ADS)

    Ryu, D.; Malano, H. M.; Davidson, B.; George, B.

    2014-12-01

    Climate change projections are characterised by large uncertainties, with rainfall variability being the key challenge in designing adaptation policies. Climate change adaptation in water resources shows all the typical characteristics of 'wicked' problems, typified by cognitive uncertainty as new scientific knowledge becomes available, problem instability, knowledge imperfection and strategic uncertainty due to institutional changes that inevitably occur over time. Planning that is characterised by uncertainties and instability requires an approach that can accommodate flexibility and adaptive capacity for decision-making, including an ability to take corrective measures in the event that the scenarios and responses initially envisaged evolve into different forms at some future stage. We present an integrated, multidisciplinary and comprehensive framework designed to interface and inform science and decision making in the formulation of water resource management strategies to deal with climate change in the Musi Catchment of Andhra Pradesh, India. At the core of this framework is a dialogue between stakeholders, decision makers and scientists to define a set of plausible responses to an ensemble of climate change scenarios derived from global climate modelling. The modelling framework used to evaluate the resulting combinations of climate scenarios and adaptation responses includes the surface water and groundwater assessment models (SWAT and MODFLOW) and the water allocation model (REALM) to determine the water security of each adaptation strategy. Three climate scenarios extracted from downscaled climate models were selected for evaluation together with four agreed responses—changing cropping patterns, increasing watershed development, changing the volume of groundwater extraction and improving irrigation efficiency. Water security in this context is represented by the combination of the level of water availability and its associated security of supply for three economic activities (agriculture, urban, industrial) on a spatially distributed basis. The resulting combinations of climate scenarios and adaptation responses were subjected to a combined hydro-economic assessment based on the degree of water security together with its cost-effectiveness against the business-as-usual scenario.

  10. Advanced NASA Earth Science Mission Concept for Vegetation 3D Structure, Biomass and Disturbance

    NASA Technical Reports Server (NTRS)

    Ranson, K. Jon

    2007-01-01

    Carbon in forest canopies represents about 85% of the total carbon in the Earth's aboveground biomass (Olson et al., 1983). A major source of uncertainty in global carbon budgets derives from large errors in the current estimates of these carbon stocks (IPCC, 2001). The magnitudes and distributions of terrestrial carbon storage, along with changes in sources and sinks for atmospheric CO2 due to land use change, remain the most significant uncertainties in Earth's carbon budget. These uncertainties severely limit accurate terrestrial carbon accounting, our ability to evaluate terrestrial carbon management schemes, and the veracity of atmospheric CO2 projections in response to further fossil fuel combustion and other human activities. Measurements of vegetation three-dimensional (3D) structural characteristics over the Earth's land surface are needed to estimate biomass and carbon stocks and to quantify biomass recovery following disturbance. These measurements include vegetation height, the vertical profile of canopy elements (i.e., leaves, stems, branches), and/or the volume scattering of canopy elements. They are critical for reducing uncertainties in the global carbon budget. Disturbance by natural phenomena, such as fire or wind, as well as by human activities, such as forest harvest, and the subsequent recovery, complicate the quantification of carbon storage and release. The resulting spatial and temporal heterogeneity of terrestrial biomass and carbon in vegetation makes it very difficult to estimate terrestrial carbon stocks and quantify their dynamics. Vegetation height profiles and disturbance recovery patterns are also required to assess ecosystem health and characterize habitat. The three-dimensional structure of vegetation provides habitats for many species and is a control on biodiversity. Canopy height and structure influence habitat use and specialization, two fundamental processes that modify species richness and abundance across ecosystems. Accurate and consistent 3D measurements of forest structure at the landscape scale are needed for assessing impacts to animal habitats and biodiversity following disturbance.

  11. Development of an Uncertainty Model for the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Walter, Joel A.; Lawrence, William R.; Elder, David W.; Treece, Michael D.

    2010-01-01

    This paper introduces an uncertainty model being developed for the National Transonic Facility (NTF). The model uses a Monte Carlo technique to propagate standard uncertainties of measured values through the NTF data reduction equations to calculate the combined uncertainties of the key aerodynamic force and moment coefficients and freestream properties. The uncertainty propagation approach to assessing data variability is compared with ongoing data quality assessment activities at the NTF, notably check standard testing using statistical process control (SPC) techniques. It is shown that the two approaches are complementary and both are necessary tools for data quality assessment and improvement activities. The SPC approach is the final arbiter of variability in a facility. Its result encompasses variation due to people, processes, test equipment, and test article. The uncertainty propagation approach is limited mainly to the data reduction process. However, it is useful because it helps to assess the causes of variability seen in the data and consequently provides a basis for improvement. For example, it is shown that Mach number random uncertainty is dominated by static pressure variation over most of the dynamic pressure range tested. However, the random uncertainty in the drag coefficient is generally dominated by axial and normal force uncertainty with much less contribution from freestream conditions.
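    The NTF data reduction equations are not given in the abstract; as a stand-in, the sketch below propagates standard pressure uncertainties through the textbook subsonic isentropic Mach relation by Monte Carlo, following the same propagation pattern the model uses (all names and values are hypothetical):

        import numpy as np

        GAMMA = 1.4  # ratio of specific heats for air

        def mach(p_total, p_static):
            # Subsonic isentropic Mach relation (requires p_total > p_static).
            pr = (p_total / p_static) ** ((GAMMA - 1.0) / GAMMA)
            return np.sqrt(2.0 / (GAMMA - 1.0) * (pr - 1.0))

        def mach_uncertainty(p0, u_p0, ps, u_ps, n=200_000, seed=2):
            # Monte Carlo propagation of standard pressure uncertainties.
            rng = np.random.default_rng(seed)
            m = mach(rng.normal(p0, u_p0, n), rng.normal(ps, u_ps, n))
            return m.mean(), m.std(ddof=1)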

  12. Glass Property Models and Constraints for Estimating the Glass to be Produced at Hanford by Implementing Current Advanced Glass Formulation Efforts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vienna, John D.; Kim, Dong-Sang; Skorski, Daniel C.

    2013-07-01

    Recent glass formulation and melter testing data have suggested that significant increases in waste loading in HLW and LAW glasses are possible over current system planning estimates. The data (although limited in some cases) were evaluated to determine a set of constraints and models that could be used to estimate the maximum loading of specific waste compositions in glass. It is recommended that these models and constraints be used to estimate the likely HLW and LAW glass volumes that would result if the current glass formulation studies are successfully completed. It is recognized that some of the models are preliminary in nature and will change in the coming years. In addition, the models do not currently address the prediction uncertainties that would be needed before they could be used in plant operations. The models and constraints are only meant to give an indication of rough glass volumes and are not intended to be used in plant operation or waste form qualification activities; a current research program is in place to develop the data, models, and uncertainty descriptions for that purpose. A fundamental tenet underlying the research reported in this document is to try to be less conservative than previous studies when developing constraints for estimating the glass to be produced by implementing current advanced glass formulation efforts. The less conservative approach documented herein should allow for the estimate of glass masses that may be realized if the current efforts in advanced glass formulations are completed over the coming years and are as successful as early indications suggest they may be. Because of this approach there is an unquantifiable uncertainty in the ultimate glass volume projections due to model prediction uncertainties, which has to be considered along with other system uncertainties such as the waste compositions and amounts to be immobilized, split factors between LAW and HLW, etc.

  13. Assessing the debris flow run-out frequency of a catchment in the French Alps using a parameterization analysis with the RAMMS numerical run-out model

    NASA Astrophysics Data System (ADS)

    Hussin, H. Y.; Luna, B. Quan; van Westen, C. J.; Christen, M.; Malet, J.-P.; van Asch, Th. W. J.

    2012-04-01

    Debris flows occurring in the European Alps frequently cause significant damage to settlements, power lines and transportation infrastructure, leading to traffic disruptions, economic loss and even death. Estimating the debris flow run-out extent and the parameter uncertainty related to run-out modeling are among the difficulties in the Quantitative Risk Assessment (QRA) of debris flows. The process of entrainment of material into a debris flow is also still not completely understood; debris flows observed in the French Alps entrain 5-50 times the initially mobilized source volume. In this study we analyze a debris flow that occurred in 2003 at the Faucon catchment in the Barcelonnette Basin (Southern French Alps). The analysis was carried out using the Voellmy rheology and an entrainment model embedded in the RAMMS 2D numerical modeling software. The historic event was back-calibrated based on source, entrainment and deposit volumes, including the run-out distance, velocities and deposit heights of the debris flow. This was followed by a sensitivity analysis of the rheological and entrainment parameters to produce 120 debris flow scenarios, leading to a frequency assessment of the run-out distance and deposit height on the debris fan. The study shows that the Voellmy frictional parameters mainly influence the run-out distance and velocity of the flow, while the entrainment parameter has a major impact on the debris flow height. The frequency assessment of the 120 simulated scenarios further gives an indication of the most likely debris flow run-out extents and heights for this catchment. Such an assessment can be an important link between the rheological model parameters and the spatial probability of the run-out for the Quantitative Risk Assessment (QRA) of debris flows.

  14. Methods for determining and processing 3D errors and uncertainties for AFM data analysis

    NASA Astrophysics Data System (ADS)

    Klapetek, P.; Nečas, D.; Campbellová, A.; Yacoot, A.; Koenders, L.

    2011-02-01

    This paper describes the processing of three-dimensional (3D) scanning probe microscopy (SPM) data. It is shown that 3D volumetric calibration error and uncertainty data can be acquired for both metrological atomic force microscope systems and commercial SPMs. These data can be used within nearly all the standard SPM data processing algorithms to determine local values of uncertainty of the scanning system. If the error function of the scanning system is determined for the whole measurement volume of an SPM, it can be converted to yield local dimensional uncertainty values that can in turn be used for evaluation of uncertainties related to the acquired data and for further data processing applications (e.g. area, ACF, roughness) within direct or statistical measurements. These have been implemented in the software package Gwyddion.

  15. Incorporating climate-system and carbon-cycle uncertainties in integrated assessments of climate change. (Invited)

    NASA Astrophysics Data System (ADS)

    Rogelj, J.; McCollum, D. L.; Reisinger, A.; Knutti, R.; Riahi, K.; Meinshausen, M.

    2013-12-01

    The field of integrated assessment draws from a large body of knowledge across a range of disciplines to gain robust insights about possible interactions, trade-offs, and synergies. Integrated assessment of climate change, for example, uses knowledge from the fields of energy system science, economics, geophysics, demography, climate change impacts, and many others. Each of these fields comes with its associated caveats and uncertainties, which should be taken into account when assessing any results. The geophysical system and its associated uncertainties are often represented by models of reduced complexity in integrated assessment modelling frameworks. Such models include simple representations of the carbon cycle and climate system, and are often based on the global energy balance equation. A prominent example of such a model is the 'Model for the Assessment of Greenhouse Gas Induced Climate Change', MAGICC. Here we show how a model like MAGICC can be used to represent geophysical uncertainties. Its strengths, weaknesses, and limitations are discussed and illustrated by means of an analysis that attempts to integrate socio-economic and geophysical uncertainties. The uncertainties in the geophysical response of the Earth system to greenhouse gases remain key for estimating the cost of greenhouse gas emission mitigation scenarios. We look at uncertainties in four dimensions: geophysical, technological, social and political. Our results indicate that while geophysical uncertainties are an important factor influencing projections of mitigation costs, political choices that delay mitigation by one or two decades have a much more pronounced effect.

  16. Uncertainty analysis in geospatial merit matrix–based hydropower resource assessment

    DOE PAGES

    Pasha, M. Fayzul K.; Yeasmin, Dilruba; Saetern, Sen; ...

    2016-03-30

    Hydraulic head and mean annual streamflow, two main input parameters in hydropower resource assessment, are not measured at every point along a stream. Translation and interpolation are used to derive these parameters, resulting in uncertainties. This study estimates the uncertainties and their effects on the model output parameters: the total potential power and the number of potential locations (stream-reaches). These effects are quantified through Monte Carlo simulation (MCS) linked with a geospatial merit matrix-based hydropower resource assessment (GMM-HRA) model. The methodology is applied to flat, mild, and steep terrains. Results show that the uncertainty associated with the hydraulic head is within 20% for mild and steep terrains, and the uncertainty associated with streamflow is around 16% for all three terrains. Output uncertainty increases as input uncertainty increases. However, output uncertainty is around 10% to 20% of the input uncertainty, demonstrating the robustness of the GMM-HRA model. The output parameters are more sensitive to hydraulic head in steep terrain than in flat and mild terrains, and more sensitive to mean annual streamflow in flat terrain.
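
    The core propagation step is straightforward to emulate: gross hydropower potential is P = ρ g Q H, and sampling head and flow with the relative uncertainties quoted above shows how input spread maps onto output spread. A minimal sketch follows; the nominal reach values and the choice of normal, 1-sigma perturbations are assumptions for illustration, not the GMM-HRA implementation.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000
      rho, g = 1000.0, 9.81          # water density (kg/m^3), gravity (m/s^2)

      # Nominal values for one hypothetical stream reach.
      head = 12.0                    # hydraulic head, m
      flow = 3.5                     # mean annual streamflow, m^3/s

      # Relative input uncertainties quoted above (assumed normal, 1 sigma).
      h_samples = head * (1 + rng.normal(0.0, 0.20, n))
      q_samples = flow * (1 + rng.normal(0.0, 0.16, n))

      power = rho * g * q_samples * h_samples / 1e6   # potential power, MW
      print(f"mean = {power.mean():.2f} MW, 90% interval = "
            f"[{np.percentile(power, 5):.2f}, {np.percentile(power, 95):.2f}] MW")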

  17. SU-E-T-657: Quantitative Assessment of Plan Robustness for Helical Tomotherapy for Head and Neck Cancer Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matney, J; Lian, J; Chera, B

    2015-06-15

    Introduction: Geometric uncertainties in daily patient setup can lead to variations in the planned dose, especially when using highly conformal techniques such as helical Tomotherapy. To account for the potential effect of geometric uncertainty, our clinical practice is to expand critical structures by a 3 mm margin into planning risk volumes (PRV). The PRV concept assumes the spatial dose cloud is insensitive to patient positioning. However, no tools currently exist to determine whether a Tomotherapy plan is robust to the effects of daily setup variation. We objectively quantified the impact of geometric uncertainties on the 3D doses to critical normal tissues during helical Tomotherapy. Methods: Using a Matlab-based program created and validated by Accuray (Madison, WI), the planned Tomotherapy delivery sinogram was used to recalculate dose on shifted CT datasets. Ten head and neck patients were selected for analysis. To simulate setup uncertainty, the patient anatomy was shifted ±3 mm in the longitudinal, lateral and vertical axes. For each potential shift, the recalculated doses to various critical normal tissues were compared to the doses delivered to the PRV in the original plan. Results: 18 shifted scenarios created from Tomotherapy plans for three patients with head and neck cancers were analyzed. For all simulated setup errors, the maximum doses to the brainstem, spinal cord, parotids and cochlea were within 0.6 Gy of the respective original PRV maximum. Despite 3 mm setup shifts, the minimum dose delivered to 95% of the CTVs and PTVs was always within 0.4 Gy of the original plan. Conclusions: For head and neck sites treated with Tomotherapy, the use of a 3 mm PRV expansion provides a reasonable estimate of the dosimetric effects of 3 mm setup uncertainties. Similarly, target coverage appears minimally affected by a 3 mm setup uncertainty. Data from a larger number of patients will be presented. Future work will include other anatomical sites.

  18. An Applied Framework for Incorporating Multiple Sources of Uncertainty in Fisheries Stock Assessments.

    PubMed

    Scott, Finlay; Jardim, Ernesto; Millar, Colin P; Cerviño, Santiago

    2016-01-01

    Estimating fish stock status is very challenging given the many sources and high levels of uncertainty surrounding the biological processes (e.g. natural variability in the demographic rates), model selection (e.g. choosing growth or stock assessment models) and parameter estimation. Incorporating multiple sources of uncertainty in a stock assessment allows advice to better account for the risks associated with proposed management options, promoting decisions that are more robust to such uncertainty. However, a typical assessment only reports the model fit and variance of estimated parameters, thereby underreporting the overall uncertainty. Additionally, although multiple candidate models may be considered, only one is selected as the 'best' result, effectively rejecting the plausible assumptions behind the other models. We present an applied framework to integrate multiple sources of uncertainty in the stock assessment process. The first step is the generation and conditioning of a suite of stock assessment models that contain different assumptions about the stock and the fishery. The second step is the estimation of parameters, including fitting of the stock assessment models. The final step integrates across all of the results to reconcile the multi-model outcome. The framework is flexible enough to be tailored to particular stocks and fisheries and can draw on information from multiple sources to implement a broad variety of assumptions, making it applicable to stocks with varying levels of data availability. The Iberian hake stock in International Council for the Exploration of the Sea (ICES) Divisions VIIIc and IXa is used to demonstrate the framework, starting from length-based stock and index data. Process and model uncertainty are considered through the growth, natural mortality, fishing mortality, survey catchability and stock-recruitment relationship. Estimation uncertainty is included as part of the fitting process. Simple model averaging is used to integrate across the results and produce a single assessment that considers the multiple sources of uncertainty.
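
    The integration step in the final stage, simple model averaging, can be pictured as pooling the estimation draws of every conditioned model with equal weight, so that model uncertainty and estimation uncertainty both end up in the reported distribution. The sketch below illustrates this with made-up bootstrap draws of spawning stock biomass from three hypothetical model variants; the names and numbers are illustrative only.

      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical bootstrap draws of spawning stock biomass (kt) from three
      # stock assessment models conditioned on different natural-mortality assumptions.
      draws_by_model = {
          "low_M":  rng.normal(48.0, 4.0, 1000),
          "base":   rng.normal(55.0, 5.0, 1000),
          "high_M": rng.normal(63.0, 7.0, 1000),
      }

      # Simple (equal-weight) model averaging: pool the draws so the final
      # distribution reflects model uncertainty as well as estimation uncertainty.
      pooled = np.concatenate(list(draws_by_model.values()))
      print(f"median SSB = {np.median(pooled):.1f} kt, "
            f"90% interval = {np.percentile(pooled, [5, 95]).round(1)} kt")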

  19. Optoelectronic system for the determination of blood volume in pneumatic heart assist devices.

    PubMed

    Konieczny, Grzegorz; Pustelny, Tadeusz; Setkiewicz, Maciej; Gawlikowski, Maciej

    2015-12-10

    The following article describes the concept of optical measurement of blood volume in ventricular assist devices (VADs) of the pulsatile type. The paper presents the current state of the art in blood volume measurement for such devices and introduces a newly developed solution in the optical domain. The objective of the research is to overcome the main disadvantage of the previously developed acoustic method: the requirement of an additional sensor chamber. The idea of a compact measurement system is introduced, followed by laboratory measurements. Static tests of the system are presented, followed by dynamic measurements on a physical model of the human ventricular system. The results involving the measurements of blood chamber volume acquired by means of the optical system were compared with the results acquired by means of the Transonic T410 ultrasound flow rate sensor (11PLX transducer, uncertainty ±5%). Preliminary dynamic measurements conducted on the physical model of the human cardiovascular system show that the proposed optical measurement system may be used to measure the transient blood chamber volumes of pulsatile VADs with uncertainties (standard mean deviation) lower than 10%. The results show that the noninvasive measurement of the temporary blood chamber volume in the POLVAD prosthesis with the use of the developed optical system allows accurate static and dynamic measurements to be carried out.

  20. Interobserver delineation uncertainty in involved-node radiation therapy (INRT) for early-stage Hodgkin lymphoma: on behalf of the Radiotherapy Committee of the EORTC lymphoma group.

    PubMed

    Aznar, Marianne C; Girinsky, Theodore; Berthelsen, Anne Kiil; Aleman, Berthe; Beijert, Max; Hutchings, Martin; Lievens, Yolande; Meijnders, Paul; Meidahl Petersen, Peter; Schut, Deborah; Maraldo, Maja V; van der Maazen, Richard; Specht, Lena

    2017-04-01

    In early-stage classical Hodgkin lymphoma (HL) the target volume nowadays consists of the volume of the originally involved nodes. Delineation of this volume on a post-chemotherapy CT scan is challenging. We report on the interobserver variability in target volume definition and its impact on the resulting treatment plans. Two representative cases were selected (1: male, stage IB, localization: left axilla; 2: female, stage IIB, localizations: mediastinum and bilateral neck). Eight experienced observers individually defined the clinical target volume (CTV) using involved-node radiotherapy (INRT) as defined by the EORTC-GELA guidelines for the H10 trial. A consensus contour was generated and the standard deviation computed. We investigated the overlap between observer and consensus contours [Sørensen-Dice coefficient (DSC)] and the magnitude of gross deviations between the surfaces of the observer and consensus contours (Hausdorff distance). 3D-conformal (3D-CRT) and intensity-modulated radiotherapy (IMRT) plans were calculated for each contour in order to investigate the impact of interobserver variability on each treatment modality. Similar target coverage was enforced for all plans. The median CTV was 120 cm³ (IQR: 95-173 cm³) for Case 1, and 255 cm³ (IQR: 183-293 cm³) for Case 2. DSC values were generally high (>0.7), and Hausdorff distances were about 30 mm. The SDs between all observer contours, providing an estimate of the systematic error associated with delineation uncertainty, ranged from 1.9 to 3.8 mm (median: 3.2 mm). Variations in mean dose resulting from different observer contours were small and were not higher in IMRT plans than in 3D-CRT plans. We observed considerable differences in target volume delineation, but the systematic delineation uncertainty of around 3 mm is comparable to that reported for other tumour sites. This report is a first step towards calculating an evidence-based planning target volume margin for INRT in HL.
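
    Both agreement metrics used above are easy to compute from segmentation masks: the Sørensen-Dice coefficient is twice the overlap divided by the sum of the two volumes, and the symmetric Hausdorff distance is the larger of the two directed surface-to-surface distances. A minimal sketch on toy voxel masks, assuming numpy and scipy are available:

      import numpy as np
      from scipy.spatial.distance import directed_hausdorff

      def dice(a, b):
          """Sørensen-Dice coefficient of two boolean masks."""
          a, b = a.astype(bool), b.astype(bool)
          return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

      def hausdorff(points_a, points_b):
          """Symmetric Hausdorff distance between two point sets (N x 3)."""
          return max(directed_hausdorff(points_a, points_b)[0],
                     directed_hausdorff(points_b, points_a)[0])

      # Toy example: observer mask vs. consensus mask on a coarse voxel grid.
      obs = np.zeros((20, 20, 20), dtype=bool); obs[5:15, 5:15, 5:15] = True
      con = np.zeros_like(obs);                 con[6:16, 5:15, 5:15] = True
      print(f"DSC = {dice(obs, con):.3f}")
      print(f"Hausdorff = {hausdorff(np.argwhere(obs), np.argwhere(con)):.1f} voxels")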

  1. From Global to Local and Vice Versa: On the Importance of the 'Globalization' Agenda in Continental Groundwater Research and Policy-Making.

    PubMed

    Filimonau, Viachaslau; Barth, Johannes A C

    2016-09-01

    Groundwater is one of the most important environmental resources, and its use continuously rises globally for industrial, agricultural, and drinking water supply purposes. Because of its importance, more knowledge about the volume of usable groundwater is necessary to satisfy the global demand. Due to the challenges in quantifying the volume of available global groundwater, studies which aim to assess its magnitude are limited in number. They are further restricted in scope and depth of analysis as, in most cases, they do not explain how the estimates of global groundwater resources have been obtained, what methods have been used to generate the figures and what levels of uncertainty exist. This article reviews the estimates of global groundwater resources. It finds that the level of uncertainty attached to existing numbers often exceeds 100% and strives to establish the reasons for this discrepancy. The outcome of this study outlines the need for a new agenda in water research with a more pronounced focus on groundwater. This new research agenda should aim at enhancing the quality and quantity of data provision on local and regional groundwater stocks and flows. This knowledge enhancement can serve as a basis to improve policy-making on groundwater resources globally. Research-informed policies will facilitate more effective groundwater management practices to ensure more rapid progress of the global water sector towards the goal of sustainability.

  2. Assessment of Uncertainty-Infused Scientific Argumentation

    ERIC Educational Resources Information Center

    Lee, Hee-Sun; Liu, Ou Lydia; Pallant, Amy; Roohr, Katrina Crotts; Pryputniewicz, Sarah; Buck, Zoë E.

    2014-01-01

    Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students' scientific argumentation. In this study, we initially defined a scientific argumentation construct in four structural elements consisting of claim, justification, uncertainty qualifier, and uncertainty…

  3. The Density of Mid-sized Kuiper Belt Objects from ALMA Thermal Observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Michael E.; Butler, Bryan J.

    The densities of mid-sized Kuiper Belt objects (KBOs) are a key constraint in understanding the assembly of objects in the outer solar system. These objects are critical for understanding the currently unexplained transition from the smallest KBOs, with densities lower than that of water, to the largest objects, with significant rock content. Mapping this transition is made difficult by the uncertainties in the diameters of these objects, which map into an even larger uncertainty in volume and thus density. The substantial collecting area of the Atacama Large Millimeter Array allows significantly more precise measurements of thermal emission from outer solar system objects and could potentially greatly improve the density measurements. Here we use new thermal observations of four objects with satellites to explore the improvements possible with millimeter data. We find that effects due to effective emissivity at millimeter wavelengths make it difficult to use the millimeter data directly to find diameters, and thus volumes, for these bodies. In addition, we find that when including the effects of model uncertainty, the true uncertainties on the sizes of outer solar system objects measured with radiometry are likely larger than those previously published. Substantial improvement in object sizes will likely require precise occultation measurements.
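
    The leverage of the diameter uncertainty can be made explicit with first-order error propagation: for a sphere, ρ = M / ((π/6) D³), so a fractional diameter error is amplified threefold in the density. A small sketch of this arithmetic (the input percentages are illustrative, not the paper's values):

      import numpy as np

      def density_rel_uncertainty(rel_mass_unc, rel_diam_unc):
          """Relative density uncertainty for rho = M / ((pi/6) * D**3).

          A fractional diameter error enters the volume three-fold, so
          sigma_rho/rho = sqrt((sigma_M/M)**2 + (3*sigma_D/D)**2).
          """
          return np.hypot(rel_mass_unc, 3.0 * rel_diam_unc)

      # Illustrative numbers: a 1% system-mass uncertainty combined with a 5%
      # radiometric diameter uncertainty already gives ~15% in density.
      print(f"{density_rel_uncertainty(0.01, 0.05):.3f}")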

  4. Flood resilience and uncertainty in flood risk assessment

    NASA Astrophysics Data System (ADS)

    Beven, K.; Leedal, D.; Neal, J.; Bates, P.; Hunter, N.; Lamb, R.; Keef, C.

    2012-04-01

    Flood risk assessments do not normally take account of the uncertainty in assessing flood risk; there is no requirement in the EU Floods Directive to do so. But given the generally short series (and potential non-stationarity) of flood discharges, the extrapolation to smaller exceedance probabilities may be highly uncertain. This means that flood risk mapping may also be highly uncertain, with additional uncertainties introduced by the representation of flood plain and channel geometry, conveyance and infrastructure. This suggests that decisions about flood plain management should be based on the exceedance probability of risk rather than the deterministic hazard maps that are common in most EU countries. Some examples are given from two case studies in the UK, where a framework for good practice in assessing uncertainty in flood risk mapping has been produced as part of the Flood Risk Management Research Consortium and Catchment Change Network projects. This framework provides a structure for the communication and audit of assumptions about uncertainties.
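
    One way to see how short discharge series translate into uncertain extrapolations is to fit an extreme-value distribution to an annual-maximum series and bootstrap the fit. The sketch below does this for a synthetic 40-year record; the GEV parameters and record length are assumptions for illustration, not taken from the case studies.

      import numpy as np
      from scipy.stats import genextreme

      rng = np.random.default_rng(7)

      # Hypothetical short annual-maximum discharge series (m^3/s).
      amax = genextreme.rvs(-0.1, loc=120, scale=35, size=40, random_state=rng)

      def q100(sample):
          """100-year flood estimate from a GEV fit to annual maxima."""
          c, loc, scale = genextreme.fit(sample)
          return genextreme.isf(0.01, c, loc, scale)   # 1% annual exceedance

      # A nonparametric bootstrap of the fit quantifies the extrapolation uncertainty.
      boot = [q100(rng.choice(amax, size=amax.size, replace=True)) for _ in range(500)]
      print(f"Q100 = {q100(amax):.0f} m^3/s, "
            f"90% CI = [{np.percentile(boot, 5):.0f}, {np.percentile(boot, 95):.0f}]")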

  5. Carbon Emissions from Residue Burn Piles Estimated Using LiDAR or Ground Based Measurements of Pile Volumes in a Coastal Douglas-Fir Forest

    NASA Astrophysics Data System (ADS)

    Trofymow, J. A.; Coops, N.; Hayhurst, D.

    2012-12-01

    Following forest harvest, residues left on site and at roadsides are often disposed of to reduce fire risk and free planting space. In coastal British Columbia, burn piles are the main method of disposal, particularly for accumulations from log processing. Quantification of residue wood in piles is required for smoke emission estimates, C budget calculations, billable waste assessment, harvest efficiency monitoring, and determination of bioenergy potentials. A second-growth Douglas-fir dominated (DF1949) site on eastern Vancouver Island, the subject of C flux and budget studies since 1998, was clearcut in winter 2011; residues were piled in spring and burned in fall. Prior to harvest, the site was divided into 4 blocks to account for harvest plans and ecosite conditions. Total harvested wood volume was scaled for each block. Residue pile wood volume was determined by a standard Waste and Residue Survey (WRS) using field estimates of pile base area and plot density (wood volume / 0.005 ha plot) on 2 piles per block; by a smoke-emissions geometric method with pile volumes estimated as ellipsoidal paraboloids and packing ratios (wood volume / pile volume) for 2 piles per block; and by five other GIS methods using pile volumes and areas from LiDAR and orthophotography flown August 2011, a LiDAR-derived digital elevation model (DEM) from 2008, and total scaled wood volumes of 8 sample piles disassembled November 2011. A weak but significant negative relationship was found between pile packing ratio and pile volume. Block-level avoidable+unavoidable residue pile wood volumes from the WRS method (20.0 m³ ha⁻¹, SE 2.8) were 30%-50% of those from the geometric (69.0 m³ ha⁻¹, SE 18.0) or five GIS/LiDAR (48.0 to 65.7 m³ ha⁻¹) methods. Block volumes using the 2008 LiDAR DEM (unshifted 48.0 m³ ha⁻¹, SE 3.9; shifted 53.6 m³ ha⁻¹, SE 4.2) to account for pre-existing humps or hollows beneath piles were not different from those using the 2011 LiDAR DEM (50.3 m³ ha⁻¹, SE 4.0). The block volume ratio (total residue pile wood volume / harvest scale wood volume × 100) for the WRS method (3.3%, SE 0.45) was lower than for the LiDAR 2011 method (8.1%, SE 0.31). Using wood densities from in situ samples and LiDAR 2011 method wood volumes, total residue pile wood biomass in the blocks was 21.5 t dry mass ha⁻¹ (SE 1.9). Post-burn charred residues were ~1.5 t dry mass ha⁻¹, resulting in C emission estimates of 10 t C ha⁻¹ (SE 0.91), assuming 50% C, equivalent to 2-3 years of pre-harvest stand C uptake (NEP 4.8 t C ha⁻¹ y⁻¹, SE 0.58). Results suggest the WRS method may underestimate residue pile wood volumes, while the geometric method may overestimate depending on the packing ratio used. While remote sensing methods reduce uncertainty in estimating volumes or areas of all piles in a block, quantification of packing ratios remains a significant source of uncertainty in determining block-level residue pile wood volumes. Additional studies are needed for other forest and harvest types to determine the wider applicability of these findings.
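
    The geometric method above reduces to a closed-form volume: a pile idealized as an ellipsoidal paraboloid of base semi-axes a and b and height h has gross volume V = (1/2) π a b h, and multiplying by the packing ratio gives the wood volume. A small sketch, with pile dimensions and packing ratio chosen purely for illustration:

      import math

      def pile_wood_volume(width, length, height, packing_ratio):
          """Wood volume of a residue pile modelled as an ellipsoidal paraboloid.

          The solid bounded by an elliptic paraboloid with base semi-axes a, b
          and height h has volume V = 0.5 * pi * a * b * h; multiplying by the
          packing ratio (wood volume / pile volume) gives the wood content.
          """
          a, b = width / 2.0, length / 2.0
          gross = 0.5 * math.pi * a * b * height
          return gross * packing_ratio

      # Illustrative pile (values are not from the survey): 8 m x 12 m x 3 m
      # with packing ratio 0.2 -> about 23 m^3 of wood.
      print(f"{pile_wood_volume(8.0, 12.0, 3.0, 0.2):.1f} m^3")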

  6. Uncertainty in the Modeling of Tsunami Sediment Transport

    NASA Astrophysics Data System (ADS)

    Jaffe, B. E.; Sugawara, D.; Goto, K.; Gelfenbaum, G. R.; La Selle, S.

    2016-12-01

    Erosion and deposition from tsunamis record information about tsunami hydrodynamics and size that can be interpreted to improve tsunami hazard assessment. A recent study (Jaffe et al., 2016) explores sources and methods for quantifying uncertainty in tsunami sediment transport modeling. Uncertainty varies with tsunami properties, study site characteristics, available input data, sediment grain size, and the model used. Although uncertainty has the potential to be large, case studies for both forward and inverse models have shown that sediment transport modeling provides useful information on tsunami inundation and hydrodynamics that can be used to improve tsunami hazard assessment. New techniques for quantifying uncertainty, such as Ensemble Kalman Filtering inversion, and more rigorous reporting of uncertainties will advance the science of tsunami sediment transport modeling. Uncertainty may be decreased with additional laboratory studies that increase our understanding of the semi-empirical parameters and physics of tsunami sediment transport, standardized benchmark tests to assess model performance, and the development of hybrid modeling approaches to exploit the strengths of forward and inverse models. As uncertainty in tsunami sediment transport modeling is reduced, and with increased ability to quantify uncertainty, the geologic record of tsunamis will become more valuable in the assessment of tsunami hazard. Jaffe, B., Goto, K., Sugawara, D., Gelfenbaum, G., and La Selle, S., "Uncertainty in Tsunami Sediment Transport Modeling", Journal of Disaster Research Vol. 11 No. 4, pp. 647-661, 2016, doi: 10.20965/jdr.2016.p0647 https://www.fujipress.jp/jdr/dr/dsstr001100040647/

  7. Interpretation of a 3D Seismic-Reflection Volume in the Basin and Range, Hawthorne, Nevada

    NASA Astrophysics Data System (ADS)

    Louie, J. N.; Kell, A. M.; Pullammanappallil, S.; Oldow, J. S.; Sabin, A.; Lazaro, M.

    2009-12-01

    A collaborative effort by the Great Basin Center for Geothermal Energy at the University of Nevada, Reno, and Optim Inc. of Reno has interpreted a 3D seismic data set recorded by the U.S. Navy Geothermal Programs Office (GPO) at the Hawthorne Army Depot, Nevada. The 3D survey incorporated about 20 NNW-striking lines covering an area of approximately 3 by 10 km. The survey covered an alluvial area below the eastern flank of the Wassuk Range. In the reflection volume the most prominent events are interpreted to be the base of Quaternary alluvium, the Quaternary Wassuk Range-front normal fault zone, and sequences of intercalated Tertiary volcanic flows and sediments. Such a data set is rare in the Basin and Range. Our interpretation reveals structural and stratigraphic details that form a basis for rapid development of the geothermal-energy resources underlying the Depot. We interpret a map of the time-elevation of the Wassuk Range fault and its associated splays and basin-ward step faults. The range-front fault is the deepest, and its isochron map provides essentially a map of "economic basement" under the prospect area. Three faults are the most readily picked through vertical sections. The fault reflections show an uncertainty of 50 to 200 ms in the time-depth that we can interpret for them, due to the over-migrated appearance of the processing contractor’s prestack time-migrated data set. Proper assessment of velocities for mitigating the migration artifacts through prestack depth migration is not possible from this data set alone, as the offsets are not long enough for sufficiently deep velocity tomography. The three faults we interpreted appear as gradients in potential-field maps. In addition, the southern boundary of a major Tertiary graben may be seen within the volume as the northward termination of the strong reflections from older Tertiary volcanics. A transparent volume view across the survey shows the volcanics in full, providing a clear picture of prominent structures. Potential drill targets and areas of development are defined within the data volume by the intersections of the fault surfaces with the tracked, strong stratigraphic reflections. Target volumes for drilling and development are defined by the intersections of the faults and bright-spot stratigraphy, together with their uncertainty bounds; a few such intersections are present within the 3D volume. Analyzing seismic attributes gives the opportunity to identify characteristics common in geothermal environments.

  8. Imaging and quantification of anomaly volume using an eight-electrode 'hemiarray' EIT reconstruction method.

    PubMed

    Sadleir, R J; Zhang, S U; Tucker, A S; Oh, Sungho

    2008-08-01

    Electrical impedance tomography (EIT) is particularly well-suited to applications where its portability, rapid acquisition speed and sensitivity give it a practical advantage over other monitoring or imaging systems. An EIT system's patient interface can potentially be adapted to match the target environment, and thereby increase its utility. It may thus be appropriate to use different electrode positions from those conventionally used in EIT in these cases. One application that may require this is the use of EIT on emergency medicine patients; in particular those who have suffered blunt abdominal trauma. In patients who have suffered major trauma, it is desirable to minimize the risk of spinal cord injury by avoiding lifting them. To adapt EIT to this requirement, we devised and evaluated a new electrode topology (the 'hemiarray') which comprises a set of eight electrodes placed only on the subject's anterior surface. Images were obtained using a two-dimensional sensitivity matrix and weighted singular value decomposition reconstruction. The hemiarray method's ability to quantify bleeding was evaluated by comparing its performance with conventional 2D reconstruction methods using data gathered from a saline phantom. We found that without applying corrections to reconstructed images it was possible to estimate blood volume in a two-dimensional hemiarray case with an uncertainty of around 27 ml. In an approximately 3D hemiarray case, volume prediction was possible with a maximum uncertainty of around 38 ml in the centre of the electrode plane. After application of a QI normalizing filter, average uncertainties in a two-dimensional hemiarray case were reduced to about 15 ml. Uncertainties in the approximate 3D case were reduced to about 30 ml.

  9. Efficacy of robust optimization plan with partial-arc VMAT for photon volumetric-modulated arc therapy: A phantom study.

    PubMed

    Miura, Hideharu; Ozawa, Shuichi; Nagata, Yasushi

    2017-09-01

    This study investigated position dependence in planning target volume (PTV)-based and robust optimization plans using full-arc and partial-arc volumetric modulated arc therapy (VMAT). The gantry angles at the periphery, intermediate, and center CTV positions were 181°-180° (full-arc VMAT) and 181°-360° (partial-arc VMAT). A PTV-based optimization plan was defined by a 5 mm margin expansion of the CTV to a PTV volume, on which the dose constraints were applied. The robust optimization plan consisted of a directly optimized dose to the CTV under a maximum setup uncertainty of 5 mm. The prescription dose was normalized to the CTV D99% (the minimum relative dose that covers 99% of the volume of the CTV) as the original plan. The isocenter was rigidly shifted at 1 mm intervals in the anterior-posterior (A-P), superior-inferior (S-I), and right-left (R-L) directions from the original position up to the maximum setup uncertainty of 5 mm in the original plan, yielding recalculated dose distributions. It was found that for the intermediate and center positions, the uncertainties in the D99% doses to the CTV for all directions did not significantly differ between the PTV-based and robust optimization plans (P > 0.05). For the periphery position, uncertainties in the D99% doses to the CTV in the R-L direction for the robust optimization plan were found to be lower than those in the PTV-based optimization plan (P < 0.05). Our study demonstrated that a robust optimization plan's efficacy using partial-arc VMAT depends on the periphery CTV position. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
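
    The evaluation metric here, D99%, is simply the dose exceeded by 99% of the structure's voxels, i.e. the 1st percentile of dose inside the CTV mask, and the shift test recomputes it for each displaced geometry. A toy sketch of that bookkeeping (random dose grid and 1 mm voxels are assumptions; a real evaluation would recalculate dose rather than roll the grid):

      import numpy as np

      def d99(dose, mask):
          """D99%: the minimum dose received by 99% of the structure volume,
          i.e. the 1st percentile of the dose inside the mask."""
          return np.percentile(dose[mask], 1)

      rng = np.random.default_rng(3)
      dose = rng.normal(60.0, 0.5, (40, 40, 40))   # toy dose grid (Gy), 1 mm voxels
      ctv = np.zeros(dose.shape, dtype=bool)
      ctv[15:25, 15:25, 15:25] = True

      baseline = d99(dose, ctv)
      # Rigid 3 mm shifts are emulated by rolling the dose grid against the mask.
      for axis in range(3):
          for shift in (-3, 3):
              delta = d99(np.roll(dose, shift, axis=axis), ctv) - baseline
              print(f"axis {axis}, shift {shift:+d} mm: dD99 = {delta:+.2f} Gy")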

  10. Uncertainty assessment in geodetic network adjustment by combining GUM and Monte-Carlo-simulations

    NASA Astrophysics Data System (ADS)

    Niemeier, Wolfgang; Tengen, Dieter

    2017-06-01

    In this article, first ideas are presented to extend the classical concept of geodetic network adjustment by introducing a new method for uncertainty assessment as a two-step analysis. In the first step, the raw data and possible influencing factors are analyzed using uncertainty modeling according to the GUM (Guide to the Expression of Uncertainty in Measurement). This approach is well established in metrology, but rarely adopted within geodesy. The second step consists of Monte-Carlo simulations (MC simulations) for the complete processing chain, from raw input data and pre-processing to adjustment computations and quality assessment. To perform these simulations, possible realizations of the raw data and the influencing factors are generated, using probability distributions for all variables and the established concept of pseudo-random number generators. The final result is a point cloud which represents the uncertainty of the estimated coordinates; a confidence region can be assigned to these point clouds as well. This concept may replace the common concept of variance propagation and the quality assessment of adjustment parameters by means of their covariance matrix. It allows a new way of uncertainty assessment in accordance with the GUM concept for uncertainty modelling and propagation. As a practical example, the local tie network at the Metsähovi Fundamental Station, Finland is used, where classical geodetic observations are combined with GNSS data.
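
    The two-step recipe is easy to sketch end-to-end on a toy levelling network: assign GUM-style standard uncertainties to the raw height-difference observations, then Monte-Carlo the whole least-squares adjustment and read the coordinate uncertainty off the resulting point cloud. Everything below (network geometry, 2 mm observation uncertainty) is an illustrative assumption:

      import numpy as np

      rng = np.random.default_rng(5)

      # Step 1 (GUM): standard uncertainty of each raw height-difference
      # observation, here simply taken as 2 mm (illustrative).
      sigma = 0.002                       # m

      # Toy levelling network: heights of B and C relative to datum A = 0.
      # Observations A->B, B->C, A->C; design matrix maps unknowns (hB, hC).
      A = np.array([[1.0, 0.0],
                    [-1.0, 1.0],
                    [0.0, 1.0]])
      true_heights = np.array([1.200, 2.950])   # m
      obs0 = A @ true_heights

      # Step 2: Monte-Carlo simulation of the complete processing chain.
      estimates = []
      for _ in range(10_000):
          obs = obs0 + rng.normal(0.0, sigma, obs0.size)
          est, *_ = np.linalg.lstsq(A, obs, rcond=None)
          estimates.append(est)
      estimates = np.array(estimates)

      # The point cloud of estimates characterizes the coordinate uncertainty.
      print("std of (hB, hC) in mm:", (estimates.std(axis=0) * 1e3).round(2))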

  11. Multi-model approach to assess the impact of climate change on runoff

    NASA Astrophysics Data System (ADS)

    Dams, J.; Nossent, J.; Senbeta, T. B.; Willems, P.; Batelaan, O.

    2015-10-01

    The assessment of climate change impacts on hydrology is subject to uncertainties related to the climate change scenarios, stochastic uncertainties of the hydrological model and structural uncertainties of the hydrological model. This paper focuses on the contribution of the structural uncertainty of hydrological models to the overall uncertainty of the climate change impact assessment. To quantify the structural uncertainty of hydrological models, four physically based hydrological models (SWAT, PRMS, and semi-distributed and fully distributed versions of the WetSpa model) are set up for a catchment in Belgium. Each model is calibrated using four different objective functions. Three climate change scenarios with a high, mean and low hydrological impact are statistically perturbed from a large ensemble of climate change scenarios and are used to force the hydrological models. This methodology allows the uncertainty introduced by the climate change scenarios to be assessed and compared with the uncertainty introduced by the hydrological model structure. Results show that the hydrological model structure introduces a large uncertainty in both the average monthly discharge and the extreme peak and low flow predictions under the climate change scenarios. For the low-impact climate change scenario, the uncertainty range of the mean monthly runoff is comparable to the range of these runoff values in the reference period; for the mean- and high-impact scenarios, this range is significantly larger. The uncertainty introduced by the climate change scenarios is larger than the uncertainty due to the hydrological model structure for the low and mean hydrological impact scenarios, but the reverse is true for the high-impact climate change scenario. The mean- and high-impact scenarios project increasing peak discharges, while the low-impact scenario projects increasing peak discharges only for peak events with return periods larger than 1.6 years. For all scenarios, all models suggest a decrease of the lowest flows, except for the SWAT model with the mean hydrological impact climate change scenario. The results of this study indicate that, besides the uncertainty introduced by the climate change scenarios, the hydrological model structure uncertainty should also be taken into account in the assessment of climate change impacts on hydrology. To make it more straightforward and transparent to include model structural uncertainty in hydrological impact studies, there is a need for hydrological modelling tools that allow flexible structures, and for methods to validate model structures in their ability to assess impacts under unobserved future climatic conditions.

  12. Uncertainties for two-dimensional models of solar rotation from helioseismic eigenfrequency splitting

    NASA Technical Reports Server (NTRS)

    Genovese, Christopher R.; Stark, Philip B.; Thompson, Michael J.

    1995-01-01

    Observed solar p-mode frequency splittings can be used to estimate angular velocity as a function of position in the solar interior. Formal uncertainties of such estimates depend on the method of estimation (e.g., least-squares), the distribution of errors in the observations, and the parameterization imposed on the angular velocity. We obtain lower bounds on the uncertainties that do not depend on the method of estimation; the bounds depend on an assumed parameterization, but the fact that they are lower bounds for the 'true' uncertainty does not. Ninety-five percent confidence intervals for estimates of the angular velocity from 1986 Big Bear Solar Observatory (BBSO) data, based on a 3659 element tensor-product cubic-spline parameterization, are everywhere wider than 120 nHz, and exceed 60,000 nHz near the core. When compared with estimates of the solar rotation, these bounds reveal that useful inferences based on pointwise estimates of the angular velocity using 1986 BBSO splitting data are not feasible over most of the Sun's volume. The discouraging size of the uncertainties is due principally to the fact that helioseismic measurements are insensitive to changes in the angular velocity at individual points, so estimates of point values based on splittings are extremely uncertain. Functionals that measure distributed 'smooth' properties are, in general, better constrained than estimates of the rotation at a point. For example, the uncertainties in estimated differences of average rotation between adjacent blocks of about 0.001 solar volumes across the base of the convective zone are much smaller, and one of several estimated differences we compute appears significant at the 95% level.

  13. Uncertainties in estimating health risks associated with exposure to ionising radiation.

    PubMed

    Preston, R Julian; Boice, John D; Brill, A Bertrand; Chakraborty, Ranajit; Conolly, Rory; Hoffman, F Owen; Hornung, Richard W; Kocher, David C; Land, Charles E; Shore, Roy E; Woloschak, Gayle E

    2013-09-01

    The information for the present discussion on the uncertainties associated with estimation of radiation risks and probability of disease causation was assembled for the recently published NCRP Report No. 171 on this topic. This memorandum provides a timely overview of the topic, given that quantitative uncertainty analysis is the state of the art in health risk assessment and given its potential importance to developments in radiation protection. Over the past decade the increasing volume of epidemiology data and the supporting radiobiology findings have aided in the reduction of uncertainty in the risk estimates derived. However, it is equally apparent that there remain significant uncertainties related to dose assessment, low dose and low dose-rate extrapolation approaches (e.g. the selection of an appropriate dose and dose-rate effectiveness factor), the biological effectiveness of high-LET and lower-energy low-LET radiations where consideration of their health effects is required, and the transfer of risks from a population for which health effects data are available to one for which such data are not available. Research on the impact of radiation on human health has focused in recent years on cancer, although there has been a decided increase in the data for noncancer effects together with more reliable estimates of the risk following radiation exposure, even at relatively low doses (notably for cataracts and cardiovascular disease). New approaches for the estimation of hereditary risk have been developed with the use of human data whenever feasible, although the current estimates of heritable radiation effects are still based on mouse data because of an absence of effects in human studies. Uncertainties associated with estimation of these different types of health effects are discussed in a qualitative and semi-quantitative manner as appropriate. The way forward would seem to require additional epidemiological studies, especially studies of low dose and low dose-rate occupational and perhaps environmental exposures, and of exposures to x rays and high-LET radiations used in medicine. The development of models for more reliably combining the epidemiology data with experimental laboratory animal and cellular data can enhance the overall risk assessment approach by providing biologically refined data to strengthen the estimation of effects at low doses, as opposed to the sole use of mathematical models of epidemiological data that are primarily driven by medium/high doses. NASA's approach to radiation protection for astronauts, although a unique occupational group, indicates the possible applicability of estimates of risk and their uncertainty in a broader context for developing recommendations on: (1) dose limits for occupational exposure and exposure of members of the public; (2) criteria to limit exposures of workers and members of the public to radon and its short-lived decay products; and (3) the dosimetric quantity (effective dose) used in radiation protection.

  14. MODEL UNCERTAINTY ANALYSIS, FIELD DATA COLLECTION AND ANALYSIS OF CONTAMINATED VAPOR INTRUSION INTO BUILDINGS

    EPA Science Inventory

    To address uncertainty associated with the evaluation of vapor intrusion problems, we are working on a three-part strategy that includes: (1) evaluation of uncertainty in model-based assessments; (2) collection of field data; and (3) assessment of sites using EPA and state protocols.

  15. The Uncertainties on the GIS Based Land Suitability Assessment for Urban and Rural Planning

    NASA Astrophysics Data System (ADS)

    Liu, H.; Zhan, Q.; Zhan, M.

    2017-09-01

    The majority of the research on the uncertainties of spatial data and spatial analysis focuses on a specific data feature or analysis tool. Few studies have addressed the uncertainties of the whole process of an application like planning, leaving the research on uncertainties detached from practical applications. This paper discusses the uncertainties of geographical information systems (GIS) based land suitability assessment in planning on the basis of a literature review. The uncertainties considered range from the establishment of the index system to the classification of the final result. Methods to reduce the uncertainties arising from the discretization of continuous raster data and from index weight determination are summarized. The paper analyzes the merits and demerits of the "Natural Breaks" method, which is broadly used by planners. It also explores other factors which impact the accuracy of the final classification, such as the selection of class numbers and intervals and the autocorrelation of the spatial data. In the conclusion, the paper indicates that the adoption of machine learning methods should be modified to integrate the complexity of land suitability assessment. The work contributes to the application of spatial data and spatial analysis uncertainty research to land suitability assessment, and promotes the scientific level of subsequent planning and decision-making.

  16. Enhancing soil moisture monitoring via cosmic-ray neutron sensing in farmlands by combining field site tests with an uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Oswald, S. E.; Scheiffele, L. M.; Baroni, G.; Ingwersen, J.; Schrön, M.

    2017-12-01

    One application of Cosmic-Ray Neutron Sensing (CRNS) is to investigate soil moisture on agricultural fields during the crop season. This fully exploits the non-invasive character of CRNS, without interference with the agricultural practices of the farmland. The changing influence of vegetation on CRNS has to be dealt with, as well as spatio-temporal influences, e.g. by irrigation or harvest. Previous work revealed that the CRNS signal on farmland shows a complex and non-unique response because of the hydrogen pools at different depths and distances. This creates a challenge for soil moisture estimation and its subsequent use for irrigation management or hydrological modelling. Thus, a special aim of our study was to assess the uncertainty of CRNS in cropped fields and to identify the underlying causes of uncertainty. We applied CRNS at two field sites during the growing season, accompanied by intensive measurements of soil moisture, vegetation parameters, and irrigation events. Sources of uncertainty were identified from the experimental data. A Monte Carlo approach was used to propagate these uncertainties to the CRNS soil moisture estimates. In addition, a sensitivity analysis was performed to identify the most important factors explaining this uncertainty. Results showed that CRNS soil moisture compares well to the soil moisture network when the point values were converted to weighted water content with all hydrogen pools included. However, when considered as a stand-alone method to retrieve volumetric soil moisture, the performance decreased. The support volume, including its penetration depth, also showed considerable uncertainty, especially under relatively dry soil moisture conditions. Of the seven factors analyzed, the actual soil moisture profile, bulk density, the incoming-neutron correction and the calibrated parameter N0 were found to play an important role. One possible improvement could be a simple correction factor based on independent data of soil moisture profiles, to better account for the sensitivity of the CRNS signal to the upper soil layers. This is an important step to improve the method for the validation of remote sensing products or agricultural water management, and to establish CRNS as an applied monitoring tool on farmland.
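
    The role of the calibrated parameter N0 can be illustrated with the standard Desilets-type CRNS calibration function, θ(N) = a0/(N/N0 − a1) − a2, which converts a corrected neutron count rate into gravimetric water content. The sketch below perturbs N0 by 1% under wetter and drier conditions; the count rates and the N0 value are made-up inputs for illustration:

      def crns_soil_moisture(N, N0, a0=0.0808, a1=0.372, a2=0.115):
          """Standard Desilets-type CRNS calibration function.

          Converts a corrected neutron count rate N into gravimetric water
          content, given the site parameter N0 (count rate over dry soil).
          """
          return a0 / (N / N0 - a1) - a2

      # Illustrative: a 1% error in the calibrated N0 shifts the retrieval by a
      # larger relative amount under the drier (higher count rate) conditions.
      N0 = 1000.0
      for N in (650.0, 850.0):                 # wetter vs. drier conditions
          theta = crns_soil_moisture(N, N0)
          theta_pert = crns_soil_moisture(N, N0 * 1.01)
          print(f"N={N:.0f}: theta={theta:.3f}, dtheta={theta_pert - theta:+.4f}")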

  17. Determination of the absorption coefficient of chromophoric dissolved organic matter from underway spectrophotometry.

    PubMed

    Dall'Olmo, Giorgio; Brewin, Robert J W; Nencioli, Francesco; Organelli, Emanuele; Lefering, Ina; McKee, David; Röttgers, Rüdiger; Mitchell, Catherine; Boss, Emmanuel; Bricaud, Annick; Tilstone, Gavin

    2017-11-27

    Measurements of the absorption coefficient of chromophoric dissolved organic matter (ay) are needed to validate existing ocean-color algorithms. In the surface open ocean, these measurements are challenging because of low ay values. Yet, existing global datasets demonstrate that ay could contribute between 30% and 50% of the total absorption budget in the 400-450 nm spectral range, making accurate measurement of ay essential to constrain these uncertainties. In this study, we present a simple way of determining ay using a commercially available in-situ spectrophotometer operated in underway mode. The obtained ay values were validated using independent collocated measurements. The method is simple to implement, can provide measurements with very high spatio-temporal resolution, and has an accuracy of about 0.0004 m⁻¹ and a precision of about 0.0025 m⁻¹ when compared to independent data (at 440 nm). The only limitation to using this method at sea is that it relies on the availability of relatively large volumes of ultrapure water. Despite this limitation, the method can deliver the ay data needed for validating and assessing uncertainties in ocean-colour algorithms.

  18. Full uncertainty quantification of N2O and NO emissions using the biogeochemical model LandscapeDNDC on site and regional scale

    NASA Astrophysics Data System (ADS)

    Haas, Edwin; Santabarbara, Ignacio; Kiese, Ralf; Butterbach-Bahl, Klaus

    2017-04-01

    Numerical simulation models are increasingly used to estimate greenhouse gas emissions at site to regional/national scale and are outlined as the most advanced methodology (Tier 3) in the framework of UNFCCC reporting. Process-based models incorporate the major processes of the carbon and nitrogen cycles of terrestrial ecosystems and are thus thought to be widely applicable under various conditions and at various spatial scales. Process-based modelling requires high-spatial-resolution input data on soil properties, climate drivers and management information. The acceptance of model-based inventory calculations depends on the assessment of the inventory's uncertainty (model-, input data- and parameter-induced uncertainties). In this study we fully quantify the uncertainty in modelling soil N2O and NO emissions from arable, grassland and forest soils using the biogeochemical model LandscapeDNDC. We address model-induced uncertainty (MU) by contrasting two different soil biogeochemistry modules within LandscapeDNDC. The parameter-induced uncertainty (PU) was assessed by using joint parameter distributions for key parameters describing microbial C and N turnover processes, as obtained by different Bayesian calibration studies for each model configuration. Input data-induced uncertainty (DU) was addressed by Bayesian calibration of soil properties, climate drivers and agricultural management practice data. For each of the MU, DU and PU assessments we performed several hundred simulations. For the overall uncertainty quantification we assessed the model prediction probability across sampled sets of input datasets and parameter distributions. Statistical analysis of the simulation results has been used to quantify the overall full uncertainty of the modelling approach. With this study we can contrast the variation in model results against the different sources of uncertainty for each ecosystem, and we have been able to perform the full uncertainty analysis for modelling N2O and NO emissions from arable, grassland and forest soils that is necessary for the comprehensibility of modelling results. We applied the methodology to assess the overall modelling uncertainty of a regional N2O and NO emission inventory for the state of Saxony, Germany.

  19. Uncertainty assessment of urban pluvial flood risk in a context of climate change adaptation decision making

    NASA Astrophysics Data System (ADS)

    Arnbjerg-Nielsen, Karsten; Zhou, Qianqian

    2014-05-01

    There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Together with improved flood modelling technologies, this has enabled the economic assessment of climate change adaptation to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of the current risk, the drivers of change of risk over time, and the measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from basic assumptions in the economic analysis and the hydrological model, but also from the projection of future societies to local climate change impacts and suitable adaptation options. This presents a challenge to decision makers when trying to identify robust measures. We present an integrated uncertainty analysis, which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that, by means of Monte Carlo simulations of flood risk assessments, incorporates climate change impacts as a key driver of risk changes over time. The overall uncertainty is then attributed to six bulk processes: climate change impact, urban rainfall-runoff processes, stage-depth functions, unit cost of repair, cost of adaptation measures, and discount rate. We apply the approach to an urban hydrological catchment in Odense, Denmark, and find that the uncertainty on the climate change impact appears to have the least influence on the net present value of the studied adaptation measures. This does not imply that the climate change impact is not important, but that its uncertainties are not dominating when deciding between action and inaction. We then consider the uncertainty related to choosing between adaptation options given that a decision to act has been taken. In this case the major part of the uncertainty on the estimated net present values is identical for all adaptation options and will therefore not affect a comparison between adaptation measures, which makes the choice among the options easier. Furthermore, the explicit attribution of uncertainty also enables a reduction of the overall uncertainty by identifying the processes which contribute the most. This knowledge can then be used to further reduce the uncertainty related to decision making, as a substantial part of the remaining uncertainty is epistemic.
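
    The uncertainty cascade lends itself to a compact Monte Carlo sketch: sample the bulk uncertainty sources, push each sample through a simple risk-change and discounting model, and read off the distribution of net present value (NPV). The distributions, time horizon and fixed measure cost below are illustrative assumptions, not the Odense study's inputs:

      import numpy as np

      rng = np.random.default_rng(11)
      n, years = 20_000, 50

      # Illustrative distributions for three of the six bulk uncertainty sources.
      climate_factor = rng.normal(1.4, 0.2, n)          # risk increase by end of horizon
      damage_now = rng.lognormal(np.log(2.0), 0.3, n)   # expected annual damage, MEUR
      discount = rng.uniform(0.01, 0.04, n)             # discount rate

      cost_of_measure = 15.0                            # MEUR, assumed fixed here
      t = np.arange(1, years + 1)

      # Avoided damages grow linearly toward the climate-changed level; the NPV
      # of the measure is the discounted avoided damage minus its cost.
      avoided = damage_now[:, None] * (1 + (climate_factor[:, None] - 1) * t / years)
      npv = (avoided / (1 + discount[:, None]) ** t).sum(axis=1) - cost_of_measure

      print(f"P(NPV > 0) = {(npv > 0).mean():.2f}, "
            f"median NPV = {np.median(npv):.1f} MEUR")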

  20. Effects of exchange rate volatility on export volume and prices of forest products

    Treesearch

    Sijia Zhang; Joseph Buongiorno

    2010-01-01

    The relative value of currencies varies considerably over time. These fluctuations bring uncertainty to international traders. As a result, the volatility in exchange rate movements may influence the volume and the price of traded commodities. The volatility of exchange rates was measured by the variance of residuals in a GARCH(1,1) model of the exchange rate. We...

  1. SU-E-T-397: Evaluation of Planned Dose Distributions by Monte Carlo (0.5%) and Ray Tracing Algorithm for the Spinal Tumors with CyberKnife

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cho, H; Brindle, J; Hepel, J

    2015-06-15

    Purpose: To analyze and evaluate the dose distributions calculated with the Ray Tracing (RT) and Monte Carlo (MC, 0.5% uncertainty) algorithms for the spinal cord as a critical structure and for the gross and planning target volumes. Methods: Twenty-four spinal tumor patients were treated with stereotactic body radiotherapy (SBRT) by CyberKnife in 2013 and 2014. The MC algorithm with 0.5% uncertainty was used to recalculate the dose distribution for the treatment plans of the patients using the same beams, beam directions, and monitor units (MUs). Results: The prescription doses are uniformly larger for the MC plans than for the RT plans, except in one case. Dose differences of up to a factor of 1.19 for the 0.25 cc threshold volume and 1.14 for the 1.2 cc threshold volume are observed for the spinal cord. Conclusion: The MC recalculated dose distributions are larger than the original RT calculations for the spinal tumor cases. Based on the accuracy of the MC calculations, more radiation dose might be delivered to the tumor targets and spinal cords with the increased prescription dose.

  2. Hippocampal volume change measurement: quantitative assessment of the reproducibility of expert manual outlining and the automated methods FreeSurfer and FIRST.

    PubMed

    Mulder, Emma R; de Jong, Remko A; Knol, Dirk L; van Schijndel, Ronald A; Cover, Keith S; Visser, Pieter J; Barkhof, Frederik; Vrenken, Hugo

    2014-05-15

    To measure hippocampal volume change in Alzheimer's disease (AD) or mild cognitive impairment (MCI), expert manual delineation is often used because of its supposed accuracy. It has been suggested that expert outlining yields poorer reproducibility as compared to automated methods, but this has not been investigated. To determine the reproducibilities of expert manual outlining and two common automated methods for measuring hippocampal atrophy rates in healthy aging, MCI and AD, 80 subjects were selected from the Alzheimer's Disease Neuroimaging Initiative (ADNI): 20 patients with AD, 40 patients with mild cognitive impairment (MCI) and 20 healthy controls (HCs). Left and right hippocampal volume change between baseline and the month-12 visit was assessed by using expert manual delineation, and by the automated software packages FreeSurfer (longitudinal processing stream) and FIRST. To assess reproducibility of the measured hippocampal volume change, both back-to-back (BTB) MPRAGE scans available for each visit were analyzed. Hippocampal volume change was expressed in μL, and as a percentage of baseline volume. Reproducibility of the 1-year hippocampal volume change was estimated from the BTB measurements by using a linear mixed model to calculate the limits of agreement (LoA) of each method, reflecting its measurement uncertainty. Using the delta method, approximate p-values were calculated for the pairwise comparisons between methods. Statistical analyses were performed both with inclusion and exclusion of visibly incorrect segmentations. Visibly incorrect automated segmentation in either one or both scans of a longitudinal scan pair occurred in 7.5% of the hippocampi for FreeSurfer and in 6.9% of the hippocampi for FIRST. After excluding these failed cases, reproducibility analysis for 1-year percentage volume change yielded LoA of ±7.2% for FreeSurfer, ±9.7% for expert manual delineation, and ±10.0% for FIRST. Methods ranked the same for reproducibility of 1-year μL volume change, with LoA of ±218 μL for FreeSurfer, ±319 μL for expert manual delineation, and ±333 μL for FIRST. Approximate p-values indicated that reproducibility was better for FreeSurfer than for manual or FIRST, and that manual and FIRST did not differ. Inclusion of failed automated segmentations led to worsening of the reproducibility of both automated methods for 1-year raw and percentage volume change. Quantitative reproducibility values of 1-year microliter and percentage hippocampal volume change were roughly similar between expert manual outlining, FIRST and FreeSurfer, but FreeSurfer reproducibility was statistically significantly superior to both manual outlining and FIRST after exclusion of failed segmentations. Copyright © 2014 Elsevier Inc. All rights reserved.
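
    Because each visit has two back-to-back (BTB) acquisitions, the 1-year change can be measured twice per subject, and the spread of the paired differences yields the limits of agreement. The paper estimates the LoA with a linear mixed model; the sketch below uses the simpler Bland-Altman-style computation on simulated data to show the idea:

      import numpy as np

      def limits_of_agreement(change_btb1, change_btb2):
          """Bland-Altman-style limits of agreement for a volume-change measure.

          Each subject's 1-year change is computed twice, once per BTB scan;
          the spread of the paired differences quantifies the method's
          measurement uncertainty.
          """
          diff = np.asarray(change_btb1) - np.asarray(change_btb2)
          lo = diff.mean() - 1.96 * diff.std(ddof=1)
          hi = diff.mean() + 1.96 * diff.std(ddof=1)
          return lo, hi

      rng = np.random.default_rng(2)
      true_change = rng.normal(-3.0, 2.0, 80)          # % volume change per subject
      meas1 = true_change + rng.normal(0, 2.5, 80)     # BTB measurement noise
      meas2 = true_change + rng.normal(0, 2.5, 80)
      print("LoA (%):", [round(x, 1) for x in limits_of_agreement(meas1, meas2)])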

  3. Facing uncertainty in ecosystem services-based resource management.

    PubMed

    Grêt-Regamey, Adrienne; Brunner, Sibyl H; Altwegg, Jürg; Bebi, Peter

    2013-09-01

    The concept of ecosystem services is increasingly used as a support for natural resource management decisions. While the science for assessing ecosystem services is improving, appropriate methods to address uncertainties in a quantitative manner are missing. Ignoring parameter uncertainties, modeling uncertainties and uncertainties related to human-environment interactions can modify decisions and lead to overlooking important management possibilities. In this contribution, we present a new approach for mapping the uncertainties in the assessment of multiple ecosystem services. The spatially explicit risk approach links Bayesian networks to a Geographic Information System for forecasting the value of a bundle of ecosystem services and quantifies the uncertainties related to the outcomes in a spatially explicit manner. We demonstrate, in a case study in the Swiss Alps, that mapping uncertainties in ecosystem services assessments provides key information for decision-makers seeking critical areas in the delivery of ecosystem services. The results suggest not only that the total value of the bundle of ecosystem services is highly dependent on uncertainties, but also that the spatial pattern of the ecosystem service values changes substantially when uncertainties are considered. This is particularly important for the long-term management of mountain forest ecosystems, which have long rotation stands and are highly sensitive to pressing climate and socio-economic changes. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Variability And Uncertainty Analysis Of Contaminant Transport Model Using Fuzzy Latin Hypercube Sampling Technique

    NASA Astrophysics Data System (ADS)

    Kumar, V.; Nayagum, D.; Thornton, S.; Banwart, S.; Schuhmacher, M.; Lerner, D.

    2006-12-01

    Characterization of uncertainty associated with groundwater quality models is often of critical importance, as for example in cases where environmental models are employed in risk assessment. Insufficient data, inherent variability and estimation errors of environmental model parameters introduce uncertainty into model predictions. However, uncertainty analysis using conventional methods such as standard Monte Carlo sampling (MCS) may not be efficient, or even suitable, for complex, computationally demanding models, or for parameters whose variability and uncertainty differ in nature. General MCS, or variants such as Latin Hypercube Sampling (LHS), treats variability and uncertainty as a single random entity, and the generated samples are treated as crisp values, in effect equating vagueness with randomness. Also, when the models are used as purely predictive tools, uncertainty and variability lead to the need to assess the plausible range of model outputs. An improved, systematic variability and uncertainty analysis can provide insight into the level of confidence in model estimates, and can aid in assessing how various possible model estimates should be weighed. The present study introduces Fuzzy Latin Hypercube Sampling (FLHS), a hybrid approach incorporating both cognitive and noncognitive uncertainties. Noncognitive uncertainty, such as physical randomness and statistical uncertainty due to limited information, can be described by a probability density function (PDF), whereas cognitive uncertainty, such as estimation error, can be described by a membership function for its fuzziness, with confidence intervals given by α-cuts. An important property of this theory is its ability to merge the inexact data generated by the LHS approach to increase the quality of information. The FLHS technique ensures that the entire range of each variable is sampled with proper incorporation of uncertainty and variability. A fuzzified statistical summary of the model results produces indices of sensitivity and uncertainty that relate the effects of heterogeneity and uncertainty of input variables to model predictions. The feasibility of the method is demonstrated by assessing uncertainty propagation of parameter values in estimating the contamination level of a drinking water supply well due to transport of dissolved phenolics from a contaminated site in the UK.
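
    A minimal sketch of the FLHS idea, assuming a toy transport model: a probabilistic input is sampled by Latin Hypercube while a fuzzy input is explored through α-cuts of a triangular membership function. All parameter values are illustrative, not taken from the study:

      import numpy as np
      from scipy.stats import norm, qmc

      def model(k, theta):
          """Toy transport model: output scales with conductivity k and a
          fuzzy retardation-like factor theta (purely illustrative)."""
          return k * theta

      # Noncognitive input: hydraulic conductivity with its own PDF,
      # sampled by Latin Hypercube.
      u = qmc.LatinHypercube(d=1, seed=1).random(n=1000).ravel()
      k = norm(loc=1e-5, scale=2e-6).ppf(u)

      # Cognitive input: triangular fuzzy number (a, m, b) explored by alpha-cuts.
      a, m, b = 0.5, 1.0, 2.0
      for alpha in (0.0, 0.5, 1.0):
          lo, hi = a + alpha * (m - a), b - alpha * (b - m)
          print(f"alpha={alpha:.1f}: output 5th-95th pct "
                f"[{np.percentile(model(k, lo), 5):.2e}, "
                f"{np.percentile(model(k, hi), 95):.2e}]")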

  5. Relationship Between LIBS Ablation and Pit Volume for Geologic Samples: Applications for the In Situ Absolute Geochronology

    NASA Technical Reports Server (NTRS)

    Devismes, Damien; Cohen, Barbara; Miller, J.-S.; Gillot, P.-Y.; Lefevre, J.-C.; Boukari, C.

    2014-01-01

    These first results demonstrate that LIBS spectra can be a useful tool for estimating the ablated volume. When the ablated volume is larger than 9×10^6 cubic micrometers, the method has an uncertainty of less than 10%, which is sufficient for direct implementation in the KArLE experiment protocol. Nevertheless, depending on the sample and its mean grain size, the difficulty of obtaining homogeneous spectra increases with the ablated volume. Several K-Ar dating studies based on this approach will be carried out, and the results will be presented and discussed.

  6. International survey for good practices in forecasting uncertainty assessment and communication

    NASA Astrophysics Data System (ADS)

    Berthet, Lionel; Piotte, Olivier

    2014-05-01

    Achieving technically sound flood forecasts is a crucial objective for forecasters, but the forecasts remain of little use if users do not properly understand their significance and do not use them appropriately in decision making. One usual way to make the forecasts' limitations explicit is to communicate information about their uncertainty. Uncertainty assessment and communication to stakeholders are thus important issues for operational flood forecasting services (FFS) but remain open fields for research. The French FFS plans to publish graphical streamflow and level forecasts along with uncertainty assessments on its website (available to the general public) in the near future. In order to choose the technical options best adapted to its operational context, it carried out a survey among more than 15 fellow institutions. Most of these provide forecasts and warnings to civil protection officers, while some work mostly for hydroelectricity suppliers. A questionnaire was prepared in order to standardize the analysis of the practices of the surveyed institutions. The survey was conducted by gathering information from technical reports or from the scientific literature, as well as through interviews conducted by phone, email discussions or meetings. The questionnaire helped in the exploration of practices in uncertainty assessment, evaluation and communication. In the analysis drawn from the raw results, attention was paid to the particular context within which each institution works. Results show that most services interviewed assess their forecast uncertainty. However, practices can differ significantly from one country to another. Ensemble approaches are popular techniques, as they allow several uncertainty sources to be taken into account. Statistical analysis of past forecasts (such as quantile regression) is also commonly used. Contrary to what was expected, only few services emphasize the role of the forecaster (subjective assessment). Similar contrasts can be observed in uncertainty communication practices. Some countries are quite advanced in uncertainty communication to the general public, whereas most of them restrict this communication to pre-defined stakeholders who have previously been sensitized or trained. Different forms of communication were encountered during the survey, from written comments to complex graphics; no single form could claim clear leadership. This survey proved useful in identifying difficulties in the design of the next French forecast uncertainty assessment and communication schemes.

  7. Deciphering the evolution of the last Eurasian ice sheets

    NASA Astrophysics Data System (ADS)

    Hughes, Anna; Gyllencreutz, Richard; Mangerud, Jan; Svendsen, John Inge

    2016-04-01

    Glacial geologists need ice sheet-scale chronological reconstructions of former ice extent to set individual records in a wider context and compare interpretations of ice sheet response to records of past environmental changes. Ice sheet modellers require empirical reconstructions of the size and volume of past ice sheets that are fully documented, specified in time and include uncertainty estimates for model validation or constraints. Motivated by these demands, in 2005 we started a project (Database of the Eurasian Deglaciation, DATED) to compile and archive all published dates relevant to constraining the build-up and retreat of the last Eurasian ice sheets, including the British-Irish, Scandinavian and Svalbard-Barents-Kara Seas ice sheets (BIIS, SIS and SBKIS respectively). Over 5000 dates were assessed for reliability and used together with published ice-sheet margin positions to reconstruct time-slice maps of the ice sheets' extent, with uncertainty bounds, every 1000 years between 25-10 kyr ago and at four additional periods back to 40 kyr ago. Ten years after the idea for a database was conceived, the first version of results (DATED-1) has now been released (Hughes et al. 2016). We observe that: i) both the BIIS and SBKIS achieve maximum extent and commence retreat earlier than the larger SIS; ii) the eastern terrestrial margin of the SIS reached its maximum extent up to 7000 years later than the westernmost marine margin; iii) the combined maximum ice volume (~24 m sea-level equivalent) was reached c. 21 ka; iv) large uncertainties exist, predominantly across marine sectors (e.g. the timing of coalescence and separation of the SIS and SBKIS) but also in well-studied areas due to conflicting yet equally robust data. In just three years since the DATED-1 census (1 January 2013), the volume of new information (from both dates and mapped glacial geomorphology) has grown significantly (~1000 new dates). Here, we present the DATED-1 results in the context of the climatic changes of the last glacial, discuss the implications of emerging post-census data, and describe plans for the next version of the database, DATED-2. Hughes, A. L. C., Gyllencreutz, R., Lohne, Ø. S., Mangerud, J., Svendsen, J. I. 2016: The last Eurasian ice sheets - a chronological database and time-slice reconstruction, DATED-1. Boreas, 45, 1-45. 10.1111/bor.12142

  8. Initial Report of Pencil Beam Scanning Proton Therapy for Posthysterectomy Patients With Gynecologic Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Lilie L., E-mail: lin@xrt.upenn.edu; Kirk, Maura; Scholey, Jessica

    2016-05-01

    Purpose: To report the acute toxicities associated with pencil beam scanning proton beam radiation therapy (PBS) for whole pelvis radiation therapy in women with gynecologic cancers and the results of a dosimetric comparison of PBS versus intensity modulated radiation therapy (IMRT) plans. Methods and Materials: Eleven patients with posthysterectomy gynecologic cancer received PBS to the whole pelvis. The patients received a dose of 45 to 50.4 Gy relative biological effectiveness (RBE) in 1.8 Gy (RBE) daily fractions. Acute toxicity was scored according to the Common Terminology Criteria for Adverse Events, version 4. A dosimetric comparison between a 2-field posterior oblique beam PBS plan and an IMRT plan was conducted. The Wilcoxon signed rank test was used to assess the potential dosimetric differences between the 2 plans and PBS target coverage robustness relative to setup uncertainties. Results: The median patient age was 55 years (range 23-76). The primary site was cervical in 7, vaginal in 1, and endometrial in 3. Of the 11 patients, 7 received concurrent cisplatin; 1 each received sandwich carboplatin and paclitaxel chemotherapy, both sandwich and concurrent chemotherapy, and concurrent and adjuvant chemotherapy; and 1 received no chemotherapy. All patients completed treatment. Of the 9 patients who received concurrent chemotherapy, the rates of grade 2 and 3 hematologic toxicities were 33% and 11%, respectively. One patient (9%) developed grade 3 acute gastrointestinal toxicity; no patient developed grade ≥3 genitourinary toxicity. The volume of pelvic bone marrow, bladder, and small bowel receiving 10 to 30 Gy was significantly lower with PBS than with IMRT (P<.001). The target coverage for all PBS plans was robust relative to the setup uncertainties (P>.05), with changes in the clinical target volume mean dose and in the dose received by 95% and 98% of the target volume remaining within 2% for the individual plans. Conclusions: Our results have demonstrated the clinical feasibility of PBS and its dosimetric advantages, especially the low-dose sparing of normal tissues in the pelvis, with target robustness maintained relative to the setup uncertainties. Future studies with larger patient numbers are planned to further validate our preliminary findings.
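
    The record's dosimetric comparison rests on the Wilcoxon signed rank test for paired plan metrics. A generic sketch using scipy (the per-patient values below are synthetic placeholders, not the study's data):

      import numpy as np
      from scipy.stats import wilcoxon

      # Paired per-patient dose metrics for the two plans (synthetic values).
      v20_imrt = np.array([33.1, 28.4, 35.0, 30.2, 27.8, 31.5,
                           29.9, 32.6, 30.8, 28.1, 34.2])
      v20_pbs = np.array([21.0, 18.7, 24.5, 19.9, 17.2, 20.4,
                          19.1, 22.8, 20.0, 18.5, 23.3])

      stat, p = wilcoxon(v20_imrt, v20_pbs)   # paired, non-parametric
      print(f"Wilcoxon statistic = {stat:.1f}, p = {p:.4f}")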

  9. An evaluation of the treatment of risk and uncertainties in the IPCC reports on climate change.

    PubMed

    Aven, Terje; Renn, Ortwin

    2015-04-01

    Few global threats rival global climate change in scale and potential consequence. The principal international authority assessing climate risk is the Intergovernmental Panel on Climate Change (IPCC). Through repeated assessments the IPCC has devoted considerable effort and interdisciplinary competence to articulating a common characterization of climate risk and uncertainties. We have reviewed the assessment and its foundation for the Fifth Assessment Reports published in 2013 and 2014, in particular the guidance note for lead authors of the fifth IPCC assessment report on consistent treatment of uncertainties. Our analysis shows that the work carried out by the IPCC falls short of providing a theoretically and conceptually convincing foundation for the treatment of risk and uncertainties. The main reasons for our assessment are: (i) the concept of risk is given too narrow a definition (a function of consequences and probability/likelihood); and (ii) the reports lack precision in delineating their concepts and methods. The goal of this article is to contribute to improving the handling of uncertainty and risk in future IPCC studies, thereby obtaining a more theoretically substantiated characterization as well as enhanced scientific quality for risk analysis in this area. Several suggestions for how to improve the risk and uncertainty treatment are provided. © 2014 Society for Risk Analysis.

  10. Uncertainty and Variability in Physiologically-Based ...

    EPA Pesticide Factsheets

    EPA announced the availability of the final report, Uncertainty and Variability in Physiologically-Based Pharmacokinetic (PBPK) Models: Key Issues and Case Studies. This report summarizes some of the recent progress in characterizing uncertainty and variability in physiologically-based pharmacokinetic models and their predictions for use in risk assessment.

  11. To what extent does variability of historical rainfall series influence extreme event statistics of sewer system surcharge and overflows?

    PubMed

    Schaarup-Jensen, K; Rasmussen, M R; Thorndahl, S

    2009-01-01

    In urban drainage modelling, long-term extreme event statistics have become an important basis for decision-making, e.g. in connection with renovation projects. It is therefore of great importance to minimize the uncertainties with regard to long-term prediction of maximum water levels and combined sewer overflow (CSO) in drainage systems. These uncertainties originate from large uncertainties in rainfall inputs, model parameters, and the assessment of return periods. This paper investigates how the choice of rainfall time series influences the extreme event statistics of maximum water levels in manholes and CSO volumes. Traditionally, long-term rainfall series from a local rain gauge are unavailable. In the present case study, however, long and local rain series are available. Two rainfall gauges have recorded events for approximately 9 years at two locations within the catchment. Besides these two gauges, another seven gauges are located within a distance of at most 20 kilometers from the catchment. All gauges are included in the Danish national rain gauge system, which was launched in 1976. The paper describes to what extent the extreme event statistics based on these 9 series diverge from each other and how this diversity can be handled, e.g. by introducing an "averaging procedure" based on the variability within the set of statistics. All simulations are performed by means of the MOUSE LTS model.

  12. Uncertainty Assessment of Hypersonic Aerothermodynamics Prediction Capability

    NASA Technical Reports Server (NTRS)

    Bose, Deepak; Brown, James L.; Prabhu, Dinesh K.; Gnoffo, Peter; Johnston, Christopher O.; Hollis, Brian

    2011-01-01

    The present paper provides the background of a focused effort to assess uncertainties in predictions of heat flux and pressure in hypersonic flight (airbreathing or atmospheric entry) using state-of-the-art aerothermodynamics codes. The assessment is performed for four mission-relevant problems: (1) shock turbulent boundary layer interaction on a compression corner, (2) shock turbulent boundary layer interaction due to an impinging shock, (3) high-mass Mars entry and aerocapture, and (4) high-speed return to Earth. A validation-based uncertainty assessment approach with reliance on subject matter expertise is used. A code verification exercise with code-to-code comparisons and comparisons against well-established correlations is also included in this effort. A thorough review of the literature in search of validation experiments was performed, which identified a scarcity of ground-based validation experiments at hypersonic conditions. In particular, a shortage of usable experimental data at flight-like enthalpies and Reynolds numbers was found. The uncertainty was quantified using metrics that measure the discrepancy between model predictions and experimental data. The discrepancy data are statistically analyzed and investigated for physics-based trends in order to define a meaningful quantified uncertainty. The detailed uncertainty assessment of each mission-relevant problem is found in the four companion papers.

  13. Uncertainty Management in Remote Sensing of Climate Data. Summary of A Workshop

    NASA Technical Reports Server (NTRS)

    McConnell, M.; Weidman, S.

    2009-01-01

    Great advances have been made in our understanding of the climate system over the past few decades, and remotely sensed data have played a key role in supporting many of these advances. Improvements in satellites and in computational and data-handling techniques have yielded high quality, readily accessible data. However, rapid increases in data volume have also led to large and complex datasets that pose significant challenges in data analysis (NRC, 2007). Uncertainty characterization is needed for every satellite mission and scientists continue to be challenged by the need to reduce the uncertainty in remotely sensed climate records and projections. The approaches currently used to quantify the uncertainty in remotely sensed data, including statistical methods used to calibrate and validate satellite instruments, lack an overall mathematically based framework.

  14. Uncertainties in the estimation of specific absorption rate during radiofrequency alternating magnetic field induced non-adiabatic heating of ferrofluids

    NASA Astrophysics Data System (ADS)

    Lahiri, B. B.; Ranoo, Surojit; Philip, John

    2017-11-01

    Magnetic fluid hyperthermia (MFH) is becoming a viable cancer treatment methodology in which the alternating magnetic field induced heating of a magnetic fluid is utilized for ablating cancerous cells or making them more susceptible to conventional treatments. The heating efficiency in MFH is quantified in terms of the specific absorption rate (SAR), defined as the heating power generated per unit mass. In the majority of experimental studies, SAR is evaluated from temperature rise curves obtained under non-adiabatic experimental conditions, which are prone to various thermodynamic uncertainties. A proper understanding of the experimental uncertainties and their remedies is a prerequisite for obtaining accurate and reproducible SAR. Here, we study the thermodynamic uncertainties associated with peripheral heating, delayed heating, heat loss from the sample and spatial variation in the temperature profile within the sample. Using first order approximations, an adiabatic reconstruction protocol for the measured temperature rise curves is developed for SAR estimation, which is found to be in good agreement with results obtained from the computationally intense slope corrected method. Our experimental findings clearly show that the peripheral and delayed heating are due to radiation heat transfer from the heating coils and the slower response time of the sensor, respectively. Our results suggest that the peripheral heating is linearly proportional to the sample area to volume ratio and the coil temperature. It is also observed that peripheral heating decreases in the presence of a non-magnetic insulating shielding. The delayed heating is found to contribute up to ~25% uncertainty in SAR values. As the SAR values are very sensitive to the initial slope determination method, explicit mention of the range used for the linear regression analysis is necessary to reproduce the results. The effect of the sample volume to area ratio on the linear heat loss rate is systematically studied and the results are compared using a lumped system thermal model. The various uncertainties involved in SAR estimation are categorized as material uncertainties, thermodynamic uncertainties and parametric uncertainties. The adiabatic reconstruction is found to decrease the uncertainties in SAR measurement approximately threefold. Additionally, a set of experimental guidelines for accurate SAR estimation using the adiabatic reconstruction protocol is recommended. These results warrant a universal experimental and data analysis protocol for SAR measurements during field induced heating of magnetic fluids under non-adiabatic conditions.
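
    The initial-slope SAR estimate that the abstract cautions about can be sketched as follows; the heating curve, specific heat, masses and fit window are all assumed values, and the explicit fit_window argument mirrors the recommendation to report the regression range:

      import numpy as np

      def sar_initial_slope(t, T, c_p, m_sample, m_np, fit_window=(5.0, 20.0)):
          """SAR (W per g of nanoparticles) from the initial slope of a
          heating curve: SAR = c_p * (m_sample/m_np) * dT/dt.
          fit_window is the time range (s) used for the linear regression;
          as the abstract stresses, this range must be reported explicitly."""
          mask = (t >= fit_window[0]) & (t <= fit_window[1])
          slope = np.polyfit(t[mask], T[mask], 1)[0]   # dT/dt in K/s
          return c_p * (m_sample / m_np) * slope

      t = np.linspace(0.0, 60.0, 121)
      T = 25.0 + 1.5 * (1.0 - np.exp(-t / 80.0))       # synthetic non-adiabatic rise
      print(f"SAR = {sar_initial_slope(t, T, 4.18, 1.0, 0.01):.1f} W/g")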

  15. Uncertainty after treatment for prostate cancer: definition, assessment, and management.

    PubMed

    Yu Ko, Wellam F; Degner, Lesley F

    2008-10-01

    Prostate cancer is the second most common type of cancer in men living in the United States and the most common type of malignancy in Canadian men, accounting for 186,320 new cases in the United States and 24,700 in Canada in 2008. Uncertainty, a component of all illness experiences, influences how men perceive the processes of treatment and adaptation. The Reconceptualized Uncertainty in Illness Theory explains the chronic nature of uncertainty in cancer survivorship by describing a shift from an emergent acute phase of uncertainty in survivors to a new level of uncertainty that is no longer acute and becomes a part of daily life. Proper assessment of certainty and uncertainty may allow nurses to maximize the effectiveness of patient-provider communication, cognitive reframing, and problem-solving interventions to reduce uncertainty after cancer treatment.

  16. Temporal and spatial variation in allocating annual traffic activity across an urban region and implications for air quality assessments

    PubMed Central

    Batterman, Stuart

    2015-01-01

    Patterns of traffic activity, including changes in the volume and speed of vehicles, vary over time and across urban areas and can substantially affect vehicle emissions of air pollutants. Time-resolved activity at the street scale typically is derived using temporal allocation factors (TAFs) that allow the development of emissions inventories needed to predict concentrations of traffic-related air pollutants. This study examines the spatial and temporal variation of TAFs, and characterizes prediction errors resulting from their use. Methods to estimate TAFs and their spatial and temporal variability are presented and applied to total, commercial and non-commercial traffic in the Detroit, Michigan, U.S. metropolitan area. The variability of total volume estimates, quantified by the coefficient of variation (COV) representing the percentage departure from expected hourly volume, was 21, 33, 24 and 33% for weekdays, Saturdays, Sundays and holidays, respectively. Prediction errors mostly resulted from hour-to-hour variability on weekdays and Saturdays, and from day-to-day variability on Sundays and holidays. Spatial variability was limited across the study roads, most of which were large freeways. Commercial traffic had different temporal patterns and greater variability than non-commercial vehicle traffic, e.g., the weekday variability of hourly commercial volume was 28%. The results indicate that TAFs for a metropolitan region can provide reasonably accurate estimates of hourly vehicle volume on major roads. While vehicle volume is only one of many factors that govern on-road emission rates, air quality analyses would be strengthened by incorporating information regarding the uncertainty and variability of traffic activity. PMID:26688671
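
    A hedged sketch of how TAFs and the COV of hourly volumes might be computed from a year of hourly counts; the traffic data here are synthetic, and the real study's estimation details may differ:

      import numpy as np
      import pandas as pd

      # Hypothetical hourly counts for one road link over a full year.
      idx = pd.date_range("2014-01-01", periods=24 * 365, freq="h")
      rng = np.random.default_rng(5)
      expected = 1000 + 400 * np.sin((idx.hour - 5) / 24 * 2 * np.pi)
      counts = pd.DataFrame({"volume": rng.poisson(expected)}, index=idx)

      counts["hour"] = counts.index.hour
      counts["daytype"] = np.where(counts.index.dayofweek < 5, "weekday", "weekend")
      g = counts.groupby(["daytype", "hour"])["volume"]
      taf = g.mean() / counts["volume"].mean()     # temporal allocation factors
      cov = 100 * g.std(ddof=1) / g.mean()         # % departure from expected volume
      print(taf.loc["weekday"].head())
      print(cov.groupby("daytype").mean())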

  17. Interobserver reliability of computed tomographic contouring of canine tonsils in radiation therapy treatment planning.

    PubMed

    Murakami, Keiko; Rancilio, Nicholas J; Plantenga, Jeannie Poulson; Moore, George E; Heng, Hock Gan; Lim, Chee Kin

    2018-05-01

    In radiation therapy (RT) treatment planning for canine head and neck cancer, the tonsils may be included as part of the treated volume. Delineation of tonsils on computed tomography (CT) scans is difficult. Error or uncertainty in the volume and location of contoured structures may result in treatment failure. The purpose of this prospective, observer agreement study was to assess the interobserver agreement of tonsillar contouring by two groups of trained observers. Thirty dogs undergoing pre- and post-contrast CT studies of the head were included. After the pre- and postcontrast CT scans, the tonsils were identified via direct visualization, barium paste was applied bilaterally to the visible tonsils, and a third CT scan was acquired. Data from each of the three CT scans were registered in an RT treatment planning system. Two groups of observers (one veterinary radiologist and one veterinary radiation oncologist in each group) contoured bilateral tonsils by consensus, obtaining three sets of contours. Tonsil volume and location data were obtained from both groups. The contour volumes and locations were compared between groups using mixed (fixed and random effect) linear models. There was no significant difference between each group's contours in terms of three-dimensional coordinates. However there was a significant difference between each group's contours in terms of the tonsillar volume (P < 0.0001). Pre- and postcontrast CT can be used to identify the location of canine tonsils with reasonable agreement between trained observers. Discrepancy in tonsillar volume between groups of trained observers may affect RT treatment outcome. © 2017 American College of Veterinary Radiology.

  18. First-principles calibration of 40Ar/39Ar mineral standards and complete extraction of 40Ar* from sanidine

    NASA Astrophysics Data System (ADS)

    Morgan, L. E.; Kuiper, K.; Mark, D.; Postma, O.; Villa, I. M.; Wijbrans, J. R.

    2010-12-01

    40Ar/39Ar geochronology relies on comparing argon isotopic data for unknowns to those for knowns. Mineral standards used as neutron fluence monitors must be dated by the K-Ar method (or at least referenced to a mineral of known K-Ar age). The commonly used age of 28.02 ± 0.28 Ma for the Fish Canyon sanidine (FCs) (Renne et al., 1998) is based upon measurements of radiogenic 40Ar in GA1550 biotite (McDougall and Roksandic, 1974), but underlying full data were not published (these measurements were never intended for use as an international standard), so uncertainties are difficult to assess. Recent developments by Kuiper et al. (2008) and Renne et al. (2010) are limited by their reliance on the accuracy of other systems. Modern technology should allow for more precise and accurate calibration of primary K-Ar and 40Ar/39Ar standards. From the ideal gas law, the number of moles of 40Ar in a system can be calculated from measurements of pressure, volume, and temperature. Thus we have designed and are proceeding to build a pipette system to introduce well-determined amounts of 40Ar into noble gas extraction lines and mass spectrometers. This system relies on components with calibrations traceable to SI unit prototypes, including a diaphragm pressure gauge (MKS Instruments), thermocouples, and a “slug” of an accurately determined volume to be inserted into the reservoir for volume determinations of the reservoir and pipette. The system will be renewable, with a lifetime of ca. 1 month for gas in the reservoir, and portable, to permit interlaboratory calibrations. The quantitative extraction of 40Ar* from the mineral standard is of highest importance; for sanidine standards this is complicated by high melt viscosity during heating. Experiments adding basaltic “zero age glass” (ZAG) to decrease melt viscosity are underway. This has previously been explored by McDowell (1983) with a resistance furnace, but has not been quantitatively addressed with laser heating. The sensitivity of each participating mass spectrometer will be calibrated by the bracketing standards approach, alternating measurements of pipette gas and mineral standards. This will convert relative abundances into absolute molar quantities and allow for quantification of interlaboratory systematic bias. Uncertainty propagation indicates uncertainties of the molar quantity of 40Ar in mineral standards will be < 0.25% (2σ), a considerable improvement of one component of the uncertainties involved in 40Ar/39Ar geochronology. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement n° [215458].
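
    The pipette design rests on the ideal gas law, n = PV/(RT); a minimal sketch with hypothetical pressure, volume and temperature values (the actual system's calibrated values are traceable to SI prototypes):

      R = 8.314462618  # J mol^-1 K^-1

      def moles_ar40(pressure_pa, volume_m3, temperature_k):
          """Ideal-gas estimate of the amount of 40Ar in a pipette aliquot:
          n = PV / (RT)."""
          return pressure_pa * volume_m3 / (R * temperature_k)

      # Hypothetical aliquot: 0.2 cm^3 at 100 Pa and 295 K.
      print(f"n(40Ar) = {moles_ar40(100.0, 0.2e-6, 295.0):.3e} mol")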

  19. Bayesian Revision of Residual Detection Power

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard

    2013-01-01

    This paper addresses some issues with quality assessment and quality assurance in response surface modeling experiments executed in wind tunnels. The role of data volume in quality assurance for response surface models is reviewed. Specific wind tunnel response surface modeling experiments are considered for which apparent discrepancies exist between fit quality expectations based on implemented quality assurance tactics and the actual fit quality achieved in those experiments. These discrepancies are resolved by using Bayesian inference to account for certain imperfections in the assessment methodology. Estimates of the fraction of out-of-tolerance model predictions based on traditional frequentist methods are revised to account for uncertainty in the residual assessment process. The number of sites in the design space for which residuals are out of tolerance is seen to exceed the number of sites where the model actually fails to fit the data. A method is presented to estimate how much of the design space is inadequately modeled by low-order polynomial approximations to the true but unknown underlying response function.
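
    A toy illustration of the qualitative point that flagged residuals can outnumber truly misfit sites when the assessment itself is uncertain; the sensitivity and false-alarm rate below are invented, and the paper's actual Bayesian revision is more elaborate:

      def true_misfit_fraction(flag_rate, sensitivity=0.95, false_alarm=0.05):
          """Invert P(flag) = p*sens + (1 - p)*fpr for the fraction p of
          design-space sites where the model actually fails to fit.
          sensitivity and false_alarm characterize the residual check and
          are invented here for illustration."""
          p = (flag_rate - false_alarm) / (sensitivity - false_alarm)
          return min(max(p, 0.0), 1.0)

      # 12% of residuals flagged out of tolerance, but only ~7.8% of sites
      # are estimated to be truly misfit once assessment error is removed.
      print(f"{true_misfit_fraction(0.12):.3f}")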

  20. Estimating the volume of supra-glacial melt lakes across Greenland: A study of uncertainties derived from multi-platform water-reflectance models

    NASA Astrophysics Data System (ADS)

    Cordero-Llana, L.; Selmes, N.; Murray, T.; Scharrer, K.; Booth, A. D.

    2012-12-01

    Large volumes of water are necessary to propagate cracks to the glacial bed via hydrofracture. Hydrological models have shown that lakes above a critical volume can supply the necessary water for this process, so the ability to measure lake water depth remotely is important for studying these processes. Previously, water depth has been derived from the optical properties of water using data from high resolution optical satellite imagery, such as ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer), IKONOS and Landsat. These studies used water-reflectance models based on the Bouguer-Lambert-Beer law and lacked any estimation of model uncertainties. We propose an optimized model based on Sneed and Hamilton's (2007) approach to estimate water depths in supraglacial lakes and undertake a robust analysis of the errors for the first time. We used atmospherically corrected ASTER and MODIS data as input to the water-reflectance model. Three physical parameters are needed: bed albedo, the water attenuation coefficient and the reflectance of optically deep water. These parameters were derived for each wavelength using standard calibrations. As a reference dataset, we obtained lake geometries using ICESat measurements over empty lakes. Differences between modeled and reference depths are used in a minimization model to obtain parameters for the water-reflectance model, yielding optimized lake depth estimates. Our key contribution is the development of a Monte Carlo simulation to run the water-reflectance model, which allows us to quantify the uncertainties in water depth and hence water volume. This robust statistical analysis provides a better understanding of the sensitivity of the water-reflectance model to the choice of input parameters, which should contribute to the understanding of the influence of surface-derived meltwater on ice sheet dynamics. Sneed, W.A. and Hamilton, G.S., 2007: Evolution of melt pond volume on the surface of the Greenland Ice Sheet. Geophysical Research Letters, 34, 1-4.
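
    A minimal sketch of the Monte Carlo step, assuming the Sneed and Hamilton (2007) form of the Bouguer-Lambert-Beer depth equation; all parameter distributions are illustrative assumptions, not the study's calibrated values:

      import numpy as np

      rng = np.random.default_rng(42)

      def depth(R, Ad, Rinf, g):
          """Bouguer-Lambert-Beer depth (Sneed & Hamilton 2007 form):
          z = [ln(Ad - Rinf) - ln(R - Rinf)] / g"""
          return (np.log(Ad - Rinf) - np.log(R - Rinf)) / g

      R = 0.35                            # observed lake-pixel reflectance
      N = 10_000                          # Monte Carlo draws
      Ad = rng.normal(0.55, 0.03, N)      # bed albedo
      Rinf = rng.normal(0.05, 0.01, N)    # optically deep water reflectance
      g = rng.normal(0.80, 0.10, N)       # attenuation coefficient (1/m)

      z = depth(R, Ad, Rinf, g)
      lo, hi = np.percentile(z, [2.5, 97.5])
      print(f"depth = {z.mean():.2f} m, 95% interval [{lo:.2f}, {hi:.2f}] m")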

  1. Error and Uncertainty in the Accuracy Assessment of Land Cover Maps

    NASA Astrophysics Data System (ADS)

    Sarmento, Pedro Alexandre Reis

    Traditionally, the accuracy assessment of land cover maps is performed by comparing these maps with a reference database intended to represent the "real" land cover, with this comparison reported through thematic accuracy measures derived from confusion matrices. However, these reference databases are themselves representations of reality, containing errors due to human uncertainty in assigning the land cover class that best characterizes a certain area, which biases the thematic accuracy measures reported to the end users of these maps. The main goal of this dissertation is to develop a methodology that allows the integration of the human uncertainty present in reference databases into the accuracy assessment of land cover maps, and to analyse the impacts that uncertainty may have on the thematic accuracy measures reported to the end users of land cover maps. The utility of including human uncertainty in the accuracy assessment of land cover maps is investigated. Specifically, we studied the utility of fuzzy set theory, more precisely fuzzy arithmetic, for a better understanding of the human uncertainty associated with the elaboration of reference databases, and of its impacts on the thematic accuracy measures derived from confusion matrices. For this purpose, linguistic values transformed into fuzzy intervals that address the uncertainty in the elaboration of reference databases were used to compute fuzzy confusion matrices. The proposed methodology is illustrated using a case study in which the accuracy of a land cover map for Continental Portugal derived from Medium Resolution Imaging Spectrometer (MERIS) data is assessed. The obtained results demonstrate that the inclusion of human uncertainty in reference databases provides much more information about the quality of land cover maps, when compared with the traditional approach to accuracy assessment of land cover maps.
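
    A much-reduced sketch of the fuzzy-interval idea: interval memberships attached to reference labels propagate into an interval-valued overall accuracy. The membership values are illustrative, not taken from the dissertation:

      import numpy as np

      rng = np.random.default_rng(9)

      # Synthetic map and reference labels for 100 sample sites.
      map_lab = rng.integers(0, 2, 100)
      ref_lab = np.where(rng.random(100) < 0.9, map_lab, 1 - map_lab)

      # Interpreter confidence per site (0 = "sure", 1 = "somewhat sure"),
      # mapped to interval memberships at a chosen alpha level.
      conf_cls = rng.integers(0, 2, 100)
      mu = np.array([[1.0, 1.0],      # "sure"          -> [1.0, 1.0]
                     [0.6, 0.9]])     # "somewhat sure" -> [0.6, 0.9]

      agree = map_lab == ref_lab
      oa_lo = mu[conf_cls, 0][agree].sum() / len(map_lab)
      oa_hi = mu[conf_cls, 1][agree].sum() / len(map_lab)
      print(f"fuzzy overall accuracy in [{oa_lo:.2f}, {oa_hi:.2f}]")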

  2. A systematic uncertainty analysis of an evaluative fate and exposure model.

    PubMed

    Hertwich, E G; McKone, T E; Pease, W S

    2000-08-01

    Multimedia fate and exposure models are widely used to regulate the release of toxic chemicals, to set cleanup standards for contaminated sites, and to evaluate emissions in life-cycle assessment. CalTOX, one of these models, is used to calculate the potential dose, an outcome that is combined with the toxicity of the chemical to determine the Human Toxicity Potential (HTP), used to aggregate and compare emissions. The comprehensive assessment of the uncertainty in the potential dose calculation in this article serves to provide the information necessary to evaluate the reliability of decisions based on the HTP. A framework for uncertainty analysis in multimedia risk assessment is proposed and evaluated with four types of uncertainty. Parameter uncertainty is assessed through Monte Carlo analysis. The variability in landscape parameters is assessed through a comparison of potential dose calculations for different regions in the United States. Decision rule uncertainty is explored through a comparison of the HTP values under open and closed system boundaries. Model uncertainty is evaluated through two case studies, one using alternative formulations for calculating the plant concentration and the other testing the steady state assumption for wet deposition. This investigation shows that steady state conditions for the removal of chemicals from the atmosphere are not appropriate and result in an underestimate of the potential dose for 25% of the 336 chemicals evaluated.

  3. Crown fuel spatial variability and predictability of fire spread

    Treesearch

    Russell A. Parsons; Jeremy Sauer; Rodman R. Linn

    2010-01-01

    Fire behavior predictions, as well as measures of uncertainty in those predictions, are essential in operational and strategic fire management decisions. While it is becoming common practice to assess uncertainty in fire behavior predictions arising from variability in weather inputs, uncertainty arising from the fire models themselves is difficult to assess. This is...

  4. Experimental uncertainty survey and assessment. [Space Shuttle Main Engine testing

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.

    1992-01-01

    An uncertainty analysis and assessment of the specific impulse determination during Space Shuttle Main Engine testing is reported. It is concluded that in planning and designing tests and in interpreting the results of tests, the bias and precision components of experimental uncertainty should be considered separately. Recommendations for future research efforts are presented.
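
    Treating bias and precision separately, as the record recommends, leads to combinations such as the root-sum-square form below; a sketch with illustrative numbers, not values from the SSME analysis:

      import math

      def total_uncertainty_rss(bias_limit, precision_index, n, t=2.0):
          """Combine separately estimated bias and precision components by
          root-sum-square: U = sqrt(B^2 + (t*S/sqrt(n))^2), with t ~ 2 for
          95% coverage and large samples. Numbers below are illustrative."""
          return math.sqrt(bias_limit**2 + (t * precision_index / math.sqrt(n))**2)

      print(f"U = {total_uncertainty_rss(0.5, 1.2, 30):.3f} (same units as B and S)")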

  5. Limited Impact of Setup and Range Uncertainties, Breathing Motion, and Interplay Effects in Robustly Optimized Intensity Modulated Proton Therapy for Stage III Non-small Cell Lung Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Inoue, Tatsuya; Widder, Joachim; Dijk, Lisanne V. van

    2016-11-01

    Purpose: To investigate the impact of setup and range uncertainties, breathing motion, and interplay effects using scanning pencil beams in robustly optimized intensity modulated proton therapy (IMPT) for stage III non-small cell lung cancer (NSCLC). Methods and Materials: Three-field IMPT plans were created using a minimax robust optimization technique for 10 NSCLC patients. The plans accounted for 5- or 7-mm setup errors with ±3% range uncertainties. The robustness of the IMPT nominal plans was evaluated considering (1) isotropic 5-mm setup errors with ±3% range uncertainties; (2) breathing motion; (3) interplay effects; and (4) a combination of items 1 and 2. The plans were calculated using 4-dimensional and average intensity projection computed tomography images. The target coverage (TC, volume receiving 95% of prescribed dose) and homogeneity index (D2 − D98, where D2 and D98 are the least doses received by 2% and 98% of the volume) for the internal clinical target volume, and dose indexes for lung, esophagus, heart and spinal cord, were compared with those of clinical volumetric modulated arc therapy (VMAT) plans. Results: The TC and homogeneity index for all plans were within clinical limits when considering the breathing motion and interplay effects independently. The setup and range uncertainties had a larger effect when considering their combined effect. The TC decreased to <98% (clinical threshold) in 3 of 10 patients for robust 5-mm evaluations. However, the TC remained >98% for robust 7-mm evaluations for all patients. The organ at risk dose parameters did not significantly vary between the respective robust 5-mm and robust 7-mm evaluations for the 4 error types. Compared with the VMAT plans, the IMPT plans showed better target homogeneity, and mean lung and heart dose parameters were reduced by about 40% and 60%, respectively. Conclusions: In robustly optimized IMPT for stage III NSCLC, the setup and range uncertainties, breathing motion, and interplay effects have limited impact on target coverage, dose homogeneity, and organ-at-risk dose parameters.

  6. Determination of the active volumes of solid-state photon-beam dosimetry detectors using the PTB proton microbeam.

    PubMed

    Poppinga, Daniela; Delfs, Bjoern; Meyners, Jutta; Langner, Frank; Giesen, Ulrich; Harder, Dietrich; Poppe, Bjoern; Looe, Hui K

    2018-05-04

    This study aims at the experimental determination of the diameters and thicknesses of the active volumes of solid-state photon-beam detectors for clinical dosimetry. The 10 MeV proton microbeam of the PTB (Physikalisch-Technische Bundesanstalt, Braunschweig) was used to examine two synthetic diamond detectors, type microDiamond (PTW Freiburg, Germany), and the silicon detectors Diode E (PTW Freiburg, Germany) and Razor Diode (Iba Dosimetry, Germany). The knowledge of the dimensions of their active volumes is essential for their Monte Carlo simulation and their applications in small-field photon-beam dosimetry. The diameter of the active detector volume was determined from the detector current profile recorded by radially scanning the proton microbeam across the detector. The thickness of the active detector volume was determined from the detector's electrical current, the number of protons incident per time interval and their mean stopping power in the active volume. The mean energy of the protons entering this volume was assessed by comparing the measured and the simulated influence of the thickness of a stack of aluminum preabsorber foils on the detector signal. For all detector types investigated, the diameters measured for the active volume closely agreed with the manufacturers' data. For the silicon Diode E detector, the thickness determined for the active volume agreed with the manufacturer's data, while for the microDiamond detectors and the Razor Diode, the thicknesses measured slightly exceeded those stated by the manufacturers. The PTB microbeam facility was used to analyze the diameters and thicknesses of the active volumes of photon dosimetry detectors for the first time. A new method of determining the thickness values with an uncertainty of ±10% was applied. The results appear useful for further consolidating detailed geometrical knowledge of the solid-state detectors investigated, which are used in clinical small-field photon-beam dosimetry. © 2018 American Association of Physicists in Medicine.
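
    The thickness determination described here inverts the relation between induced current, proton rate and stopping power; a sketch under assumed values (pair-creation energy for silicon, hypothetical beam and signal numbers):

      Q_E = 1.602176634e-19   # elementary charge, C
      W_SI = 3.6              # eV per electron-hole pair in silicon (assumed)

      def active_thickness_um(current_a, protons_per_s, stopping_ev_per_um,
                              pair_energy_ev=W_SI):
          """Active-layer thickness from the induced detector current:
          I = N_dot * (S * t / W) * e  =>  t = I * W / (N_dot * e * S)."""
          return (current_a * pair_energy_ev
                  / (protons_per_s * Q_E * stopping_ev_per_um))

      # Hypothetical numbers: 67 nA signal, 1e7 protons/s, S = 5 keV/um.
      print(f"t = {active_thickness_um(6.7e-8, 1.0e7, 5.0e3):.1f} um")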

  7. An Analysis of Plan Robustness for Esophageal Tumors: Comparing Volumetric Modulated Arc Therapy Plans and Spot Scanning Proton Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warren, Samantha, E-mail: samantha.warren@oncology.ox.ac.uk; Partridge, Mike; Bolsi, Alessandra

    Purpose: Planning studies to compare x-ray and proton techniques and to select the most suitable technique for each patient have been hampered by the nonequivalence of several aspects of treatment planning and delivery. A fair comparison should compare similarly advanced delivery techniques from current clinical practice and also assess the robustness of each technique. The present study therefore compared volumetric modulated arc therapy (VMAT) and single-field optimization (SFO) spot scanning proton therapy plans created using a simultaneous integrated boost (SIB) for dose escalation in midesophageal cancer and analyzed the effect of setup and range uncertainties on these plans. Methods and Materials: For 21 patients, SIB plans with a physical dose prescription of 2 Gy or 2.5 Gy/fraction in 25 fractions to planning target volume (PTV)50Gy or PTV62.5Gy (primary tumor with 0.5 cm margins) were created and evaluated for robustness to random setup errors and proton range errors. Dose–volume metrics were compared for the optimal and uncertainty plans, with P<.05 (Wilcoxon) considered significant. Results: SFO reduced the mean lung dose by 51.4% (range 35.1%-76.1%) and the mean heart dose by 40.9% (range 15.0%-57.4%) compared with VMAT. Proton plan robustness to a 3.5% range error was acceptable. For all patients, the clinical target volume D98 was 95.0% to 100.4% of the prescribed dose and gross tumor volume (GTV) D98 was 98.8% to 101%. Setup error robustness was patient anatomy dependent, and the potential minimum dose per fraction was always lower with SFO than with VMAT. The clinical target volume D98 was lower by 0.6% to 7.8% of the prescribed dose, and the GTV D98 was lower by 0.3% to 2.2% of the prescribed GTV dose. Conclusions: The SFO plans achieved significant sparing of normal tissue compared with the VMAT plans for midesophageal cancer. The target dose coverage in the SIB proton plans was less robust to random setup errors and might be unacceptable for certain patients. Robust optimization to ensure adequate target coverage of SIB proton plans might be beneficial.

  10. MELCOR computer code manuals: Primer and user's guides, Version 1.8.3 September 1994. Volume 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the US Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users' Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  11. Uncertainty Analysis for Peer Assessment: Oral Presentation Skills for Final Year Project

    ERIC Educational Resources Information Center

    Kim, Ho Sung

    2014-01-01

    Peer assessment plays an important role in engineering education for an active involvement in the assessment process, developing autonomy, enhancing reflection, and understanding of how to achieve the learning outcomes. Peer assessment uncertainty for oral presentation skills as part of the FYP assessment is studied. Validity and reliability for…

  12. Quantification for Complex Assessment: Uncertainty Estimation in Final Year Project Thesis Assessment

    ERIC Educational Resources Information Center

    Kim, Ho Sung

    2013-01-01

    A quantitative method for estimating an expected uncertainty (reliability and validity) in assessment results arising from the relativity between four variables, viz examiner's expertise, examinee's expertise achieved, assessment task difficulty and examinee's performance, was developed for the complex assessment applicable to final…

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardenas, Ibsen C., E-mail: c.cardenas@utwente.nl; Halman, Johannes I.M., E-mail: J.I.M.Halman@utwente.nl

    Uncertainty is virtually unavoidable in environmental impact assessments (EIAs). From the literature related to treating and managing uncertainty, we have identified specific techniques for coping with uncertainty in EIAs. Here, we have focused on basic steps in the decision-making process that take place within an EIA setting. More specifically, we have identified uncertainties involved in each decision-making step and discussed the extent to which these can be treated and managed in the context of an activity or project that may have environmental impacts. To further demonstrate the relevance of the techniques identified, we have examined the extent to which the EIA guidelines currently used in Colombia consider and provide guidance on managing the uncertainty involved in these assessments. Some points that should be considered in order to provide greater robustness in impact assessments in Colombia have been identified. These include the management of stakeholder values, the systematic generation of project options and their associated impacts as well as the associated management actions, and the evaluation of uncertainties and assumptions. We believe that the relevant and specific techniques reported here can be a reference for future evaluations of other EIA guidelines in different countries. - Highlights: • Uncertainty is unavoidable in environmental impact assessments (EIAs). • We have identified specific techniques for treating and managing uncertainty in these assessments. • Points that should be considered in order to provide greater robustness in EIAs in Colombia have been identified. • The paper provides a substantiated reference for future examinations of EIA guidelines in other countries.

  14. Transfer Standard Uncertainty Can Cause Inconclusive Inter-Laboratory Comparisons

    PubMed Central

    Wright, John; Toman, Blaza; Mickan, Bodo; Wübbeler, Gerd; Bodnar, Olha; Elster, Clemens

    2016-01-01

    Inter-laboratory comparisons use the best available transfer standards to check the participants’ uncertainty analyses, identify underestimated uncertainty claims or unknown measurement biases, and improve the global measurement system. For some measurands, instability of the transfer standard can lead to an inconclusive comparison result. If the transfer standard uncertainty is large relative to a participating laboratory’s uncertainty, the commonly used standardized degree of equivalence ≤ 1 criterion does not always correctly assess whether a participant is working within their uncertainty claims. We show comparison results that demonstrate this issue and propose several criteria for assessing a comparison result as passing, failing, or inconclusive. We investigate the behavior of the standardized degree of equivalence and alternative comparison measures for a range of values of the transfer standard uncertainty relative to the individual laboratory uncertainty values. The proposed alternative criteria successfully discerned between passing, failing, and inconclusive comparison results for the cases we examined. PMID:28090123
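
    The standardized degree of equivalence at issue can be sketched as below; the numbers chosen illustrate the inconclusive case the paper describes, where a large transfer-standard uncertainty masks a real bias:

      import math

      def standardized_doe(x_lab, U_lab, x_ref, U_ref):
          """Standardized degree of equivalence E_n = (x_lab - x_ref) /
          sqrt(U_lab^2 + U_ref^2); |E_n| <= 1 is the usual pass criterion
          (U are expanded uncertainties, k = 2)."""
          return (x_lab - x_ref) / math.sqrt(U_lab**2 + U_ref**2)

      # A 1% bias with a 0.2% lab claim still "passes" when the transfer
      # standard contributes a 1% uncertainty -- the inconclusive case.
      print(f"E_n = {standardized_doe(1.010, 0.002, 1.000, 0.010):.2f}")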

  15. Assessing and reducing hydrogeologic model uncertainty

    USDA-ARS?s Scientific Manuscript database

    NRC is sponsoring research that couples model abstraction techniques with model uncertainty assessment methods. Insights and information from this program will be useful in decision making by NRC staff, licensees and stakeholders in their assessment of subsurface radionuclide transport. All analytic...

  16. Practical no-gold-standard evaluation framework for quantitative imaging methods: application to lesion segmentation in positron emission tomography

    PubMed Central

    Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.

    2017-01-01

    Recently, a class of no-gold-standard (NGS) techniques have been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from 18F-fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
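
    A hedged sketch of the bootstrap step for patient-sampling-related uncertainty; the figure of merit here is a simple placeholder for the actual NGS estimator, and the lesion volumes are synthetic:

      import numpy as np

      rng = np.random.default_rng(7)

      def ngs_fom(volumes):
          """Placeholder figure of merit computed from one patient sample;
          the real estimator is the no-gold-standard fit."""
          return volumes.std(ddof=1)

      def bootstrap_fom(volumes, n_boot=2000):
          """Patient-sampling uncertainty: resample lesions with replacement
          and report a percentile interval for the figure of merit."""
          n = len(volumes)
          foms = [ngs_fom(volumes[rng.integers(0, n, n)]) for _ in range(n_boot)]
          return np.percentile(foms, [2.5, 97.5])

      mtv = rng.lognormal(2.0, 0.6, 100)   # synthetic metabolic tumor volumes
      print(bootstrap_fom(mtv))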

  17. Assessing the relative importance of parameter and forcing uncertainty and their interactions in conceptual hydrological model simulations

    NASA Astrophysics Data System (ADS)

    Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.

    2016-11-01

    Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
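
    The variance decomposition can be sketched with the standard two-way ANOVA identity, assuming a full factorial design in which every rainfall ensemble member is run with every behavioural parameter set (the names and data layout below are illustrative):

```python
import numpy as np

def anova_decomposition(sim):
    """Split simulation variance into forcing, parameter, and interaction
    fractions.

    sim: 2-D array, sim[i, j] = streamflow statistic for rainfall ensemble
         member i and behavioural parameter set j.
    """
    grand = sim.mean()
    forcing_eff = sim.mean(axis=1) - grand        # main effect of forcing
    param_eff = sim.mean(axis=0) - grand          # main effect of parameters
    interaction = sim - grand - forcing_eff[:, None] - param_eff[None, :]
    total = ((sim - grand) ** 2).mean()
    return {
        "forcing": (forcing_eff ** 2).mean() / total,
        "parameters": (param_eff ** 2).mean() / total,
        "interaction": (interaction ** 2).mean() / total,
    }   # the three fractions sum to 1 by construction
```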

  18. A new framework for quantifying uncertainties in modelling studies for future climates - how more certain are CMIP5 precipitation and temperature simulations compared to CMIP3?

    NASA Astrophysics Data System (ADS)

    Sharma, A.; Woldemeskel, F. M.; Sivakumar, B.; Mehrotra, R.

    2014-12-01

    We outline a new framework for assessing uncertainties in model simulations, be they hydro-ecological simulations for known scenarios, or climate simulations for assumed scenarios representing the future. This framework is illustrated here using GCM projections for future climates for hydrologically relevant variables (precipitation and temperature), with the uncertainty segregated into three dominant components - model uncertainty, scenario uncertainty (representing greenhouse gas emission scenarios), and ensemble uncertainty (representing uncertain initial conditions and states). A novel uncertainty metric, the Square Root Error Variance (SREV), is used to quantify the uncertainties involved. The SREV requires: (1) interpolating raw and corrected GCM outputs to a common grid; (2) converting these to percentiles; (3) estimating SREV for model, scenario, initial condition and total uncertainty at each percentile; and (4) transforming SREV to a time series. The outcome is a spatially varying series of SREVs associated with each model that can be used to assess how uncertain the system is at each simulated point or time. This framework, while illustrated in a climate change context, is applicable to the assessment of uncertainties in any modelling framework. The proposed method is applied to monthly precipitation and temperature from 6 CMIP3 and 13 CMIP5 GCMs across the world. For CMIP3, the B1, A1B and A2 scenarios are considered, whereas for CMIP5, the RCP2.6, RCP4.5 and RCP8.5 scenarios, representing low, medium and high emissions, are used. For both CMIP3 and CMIP5, model structure is the largest source of uncertainty, which reduces significantly after correcting for biases. Scenario uncertainty increases in future, especially for temperature, due to the divergence of the three emission scenarios analysed. While CMIP5 precipitation simulations exhibit a small reduction in total uncertainty over CMIP3, there is almost no reduction observed for temperature projections. Estimation of uncertainty in both space and time sheds light on the spatial and temporal patterns of uncertainties in GCM outputs, providing an effective platform for risk-based assessments of any alternative plans or decisions that may be formulated using GCM simulations.
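
    A rough sketch of the percentile-based spread computation, for the model-uncertainty component only (this is a loose reading of steps (2)-(4); the published SREV definition should be consulted for the exact estimator):

```python
import numpy as np

def srev_model_component(projections):
    """projections: array (n_models, n_times) for one grid cell, already
    interpolated to a common grid (step 1)."""
    # Step 2: convert each model's series to percentiles (sort the values)
    sorted_vals = np.sort(projections, axis=1)
    # Step 3: spread across models at each percentile ~ model uncertainty
    mean_at_pct = sorted_vals.mean(axis=0)
    srev = np.sqrt(((sorted_vals - mean_at_pct) ** 2).mean(axis=0))
    # Step 4: map the percentile-wise SREV back onto the time axis via the
    # rank of the multi-model-mean value at each time step
    ranks = np.argsort(np.argsort(projections.mean(axis=0)))
    return srev[ranks]
```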

  19. Propagating uncertainty from hydrology into human health risk assessment

    NASA Astrophysics Data System (ADS)

    Siirila, E. R.; Maxwell, R. M.

    2013-12-01

    Hydro-geologic modeling and uncertainty assessment of flow and transport parameters can be incorporated into human health risk (both cancer and non-cancer) assessment to better understand the associated uncertainties. This interdisciplinary approach is needed now more than ever as societal problems concerning water quality are increasingly interdisciplinary as well. For example, uncertainty can originate from environmental conditions such as a lack of information or measurement error, or can manifest as variability, such as differences in physiological and exposure parameters between individuals. To complicate the matter, traditional risk assessment methodologies are independent of time, virtually neglecting any temporal dependence. Here we present not only how uncertainty and variability can be incorporated into a risk assessment, but also how time-dependent risk assessment (TDRA) allows for the calculation of risk as a function of time. The development of TDRA and the inclusion of quantitative risk analysis in this research provide a means to inform decision makers faced with water quality issues and challenges. The stochastic nature of this work also provides a means to address the question of uncertainty in management decisions, a component that is frequently difficult to quantify. To illustrate this new formulation and to investigate hydraulic mechanisms for sensitivity, an example of varying environmental concentration signals resulting from rate dependencies in geochemical reactions is used. Cancer risk is computed and compared using environmental concentration ensembles with sorption modeled as (1) a linear equilibrium assumption and (2) first-order kinetics. Results show that the upscaling of these small-scale processes controls the distribution, magnitude, and associated uncertainty of cancer risk.

  20. Aiding alternatives assessment with an uncertainty-tolerant hazard scoring method.

    PubMed

    Faludi, Jeremy; Hoang, Tina; Gorman, Patrick; Mulvihill, Martin

    2016-11-01

    This research developed a single-score system to simplify and clarify decision-making in chemical alternatives assessment, accounting for uncertainty. Today, assessing alternatives to hazardous constituent chemicals is a difficult task: rather than comparing alternatives by a single definitive score, many independent toxicological variables must be considered at once, and data gaps are rampant. Thus, most hazard assessments are only comprehensible to toxicologists, but business leaders and politicians need simple scores to make decisions. In addition, they must balance hazard against other considerations, such as product functionality, and they must be aware of the high degrees of uncertainty in chemical hazard data. This research proposes a transparent, reproducible method to translate eighteen hazard endpoints into a simple numeric score with quantified uncertainty, alongside a similar product functionality score, to aid decisions between alternative products. The scoring method uses Clean Production Action's GreenScreen as a guide, but with a different method of score aggregation. It provides finer differentiation between scores than GreenScreen's four-point scale, and it displays uncertainty quantitatively in the final score. Displaying uncertainty also illustrates which alternatives are early in product development versus well-defined commercial products. This paper tested the proposed assessment method through a case study in the building industry, assessing alternatives to spray polyurethane foam insulation containing methylene diphenyl diisocyanate (MDI). The new hazard scoring method successfully identified trade-offs between different alternatives, showing finer resolution than GreenScreen Benchmarking. Sensitivity analysis showed that different weighting schemes in hazard scores had almost no effect on alternatives ranking, compared to uncertainty from data gaps. Copyright © 2016 Elsevier Ltd. All rights reserved.
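
    The idea of a single score with quantified uncertainty from data gaps can be illustrated with a small Monte Carlo sketch (the uniform sampling of gaps, the unweighted mean aggregation, and the 0-100 scale are assumptions for illustration, not the paper's aggregation rule):

```python
import numpy as np

def hazard_score(endpoints, n_draws=5000, seed=1):
    """Aggregate hazard endpoints into one score with quantified uncertainty.

    endpoints: list of (low, high) score bounds per endpoint on a common
    0-100 scale; a known endpoint has low == high, a data gap spans a range.
    """
    rng = np.random.default_rng(seed)
    lows = np.array([lo for lo, hi in endpoints], dtype=float)
    highs = np.array([hi for lo, hi in endpoints], dtype=float)
    draws = rng.uniform(lows, highs, size=(n_draws, len(endpoints)))
    totals = draws.mean(axis=1)            # simple unweighted aggregation
    return totals.mean(), totals.std()    # score and its uncertainty

# Two well-characterized endpoints and one data gap (0-60 plausible range)
print(hazard_score([(20, 20), (45, 45), (0, 60)]))
```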

  1. Sci-Thur PM – Brachytherapy 03: Identifying the impact of seroma visualization on permanent breast seed implant brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morton, Daniel; Batchelar, Deidre; Hilts, Michelle

    Purpose: Uncertainties in target identification can reduce treatment accuracy in permanent breast seed implant (PBSI) brachytherapy. This study evaluates the relationship between seroma visualization and seed placement accuracy. Methods: Spatially co-registered CT and 3D ultrasound (US) images were acquired for 10 patients receiving PBSI. Seromas were retrospectively contoured independently by 3 radiation oncologists on both CT and US, and respective consensus volumes were defined, CTV_CT and CTV_US. The seroma clarity and inter-user conformity index (CI), as well as inter-modality CI, volume, and positional differences were evaluated. Correlations with seed placement accuracy were then assessed. CTVs were expanded by 1.25 cm to create PTV_CT and PTV_US and evaluate the conformity with PTV_Clinical (CTV_Clinical + 1.25 cm) used in treatment planning. The change in PTV coincidence from expanding PTV_Clinical by a further 0.25 cm was determined. Results: CTV_US were on average 68 ± 12% smaller than CTV_CT and generally had improved clarity and inter-user conformity. No correlations between seed displacement and CTV_US-CTV_CT positional difference or CI were observed. Greater seed displacements were associated with larger CTV_US-CTV_CT volume differences (r = -0.65) and inter-user CT CI (r = -0.74). A median (range) 88% (71-99%) of PTV_CT and 83% (69-100%) of PTV_US were contained within PTV_Clinical. Expanding treatment margins to 1.5 cm increased coincidence to 98% (86-100%) and 94% (82-100%), respectively. Conclusions: Differences in seroma visualization impact seed displacement in PBSI. Reducing dependence on CT by incorporating 3D US into target identification, or expanding CT-based treatment margins to 1.5 cm, may reduce or mitigate uncertainties related to seroma visualization.

  2. Conflict or Caveats? Effects of Media Portrayals of Scientific Uncertainty on Audience Perceptions of New Technologies.

    PubMed

    Binder, Andrew R; Hillback, Elliott D; Brossard, Dominique

    2016-04-01

    Research indicates that uncertainty in science news stories affects public assessment of risk and uncertainty. However, the form in which uncertainty is presented may also affect people's risk and uncertainty assessments. For example, a news story that features an expert discussing both what is known and what is unknown about a topic may convey a different form of scientific uncertainty than a story that features two experts who hold conflicting opinions about the status of scientific knowledge of the topic, even when both stories contain the same information about knowledge and its boundaries. This study focuses on audience uncertainty and risk perceptions regarding the emerging science of nanotechnology by manipulating whether uncertainty in a news story about potential risks is attributed to expert sources in the form of caveats (individual uncertainty) or conflicting viewpoints (collective uncertainty). Results suggest that the type of uncertainty portrayed does not impact audience feelings of uncertainty or risk perceptions directly. Rather, the presentation of the story influences risk perceptions only among those who are highly deferent to scientific authority. Implications for risk communication theory and practice are discussed. © 2015 Society for Risk Analysis.

  3. Assessment of Current Jet Noise Prediction Capabilities

    NASA Technical Reports Server (NTRS)

    Hunter, Craid A.; Bridges, James E.; Khavaran, Abbas

    2008-01-01

    An assessment was made of the capability of jet noise prediction codes over a broad range of jet flows, with the objective of quantifying current capabilities and identifying areas requiring future research investment. Three separate codes in NASA's possession, representative of two classes of jet noise prediction codes, were evaluated: one empirical and two statistical. The empirical code is the Stone Jet Noise Module (ST2JET) contained within the ANOPP aircraft noise prediction code. It is well documented, and represents the state of the art in semi-empirical acoustic prediction codes where virtual sources are attributed to various aspects of noise generation in each jet. These sources, in combination, predict the spectral directivity of a jet plume. A total of 258 jet noise cases were examined with the ST2JET code, each run requiring only fractions of a second to complete. Two statistical jet noise prediction codes were also evaluated, JeNo v1 and Jet3D. Fewer cases were run for the statistical prediction methods because they require substantially more resources, typically a Reynolds-Averaged Navier-Stokes solution of the jet, volume integration of the source statistical models over the entire plume, and a numerical solution of the governing propagation equation within the jet. In the evaluation process, substantial justification was provided for the experimental datasets used. In the end, none of the current codes can predict jet noise within experimental uncertainty. The empirical code came within 2 dB on a 1/3-octave spectral basis for a wide range of flows. The statistical code Jet3D was within experimental uncertainty at broadside angles for hot supersonic jets, but errors in peak frequency and amplitude put it outside experimental uncertainty at cooler, lower-speed conditions. Jet3D did not predict changes in directivity at the downstream angles. The statistical code JeNo v1 was within experimental uncertainty in predicting noise from cold subsonic jets at all angles, but did not predict changes with heating of the jet and did not account for directivity changes at supersonic conditions. The shortcomings addressed here give direction for future work relevant to the statistical prediction methods. A full report will be released as a chapter in a NASA publication assessing the state of the art in aircraft noise prediction.

  4. Combined Estimation of Hydrogeologic Conceptual Model and Parameter Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, Philip D.; Ye, Ming; Neuman, Shlomo P.

    2004-03-01

    The objective of the research described in this report is the development and application of a methodology for comprehensively assessing the hydrogeologic uncertainties involved in dose assessment, including uncertainties associated with conceptual models, parameters, and scenarios. This report describes and applies a statistical method to quantitatively estimate the combined uncertainty in model predictions arising from conceptual model and parameter uncertainties. The method relies on model averaging to combine the predictions of a set of alternative models. Implementation is driven by the available data. When there is minimal site-specific data the method can be carried out with prior parameter estimates based on generic data and subjective prior model probabilities. For sites with observations of system behavior (and optionally data characterizing model parameters), the method uses model calibration to update the prior parameter estimates and model probabilities based on the correspondence between model predictions and site observations. The set of model alternatives can contain both simplified and complex models, with the requirement that all models be based on the same set of data. The method was applied to the geostatistical modeling of air permeability at a fractured rock site. Seven alternative variogram models of log air permeability were considered to represent data from single-hole pneumatic injection tests in six boreholes at the site. Unbiased maximum likelihood estimates of variogram and drift parameters were obtained for each model. Standard information criteria provided an ambiguous ranking of the models, which would not justify selecting one of them and discarding all others as is commonly done in practice. Instead, some of the models were eliminated based on their negligibly small updated probabilities and the rest were used to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. These four projections, and associated kriging variances, were averaged using the posterior model probabilities as weights. Finally, cross-validation was conducted by eliminating from consideration all data from one borehole at a time, repeating the above process, and comparing the predictive capability of the model-averaged result with that of each individual model. Using two quantitative measures of comparison, the model-averaged result was superior to any individual geostatistical model of log permeability considered.
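
    The averaging step can be sketched with the familiar information-criterion weighting (a generic model-averaging recipe; in the report the weights come from updated posterior model probabilities rather than from this shortcut):

```python
import numpy as np

def model_average(predictions, variances, ic_values):
    """Weighted average of alternative model predictions.

    predictions, variances: arrays (n_models, n_points), e.g. kriging
    estimates and kriging variances per model
    ic_values: one information-criterion value per model (lower is better)
    """
    ic = np.asarray(ic_values, dtype=float)
    w = np.exp(-0.5 * (ic - ic.min()))
    w /= w.sum()                                  # model weights
    mean = (w[:, None] * predictions).sum(axis=0)
    # within-model variance plus between-model spread
    var = (w[:, None] * (variances + (predictions - mean) ** 2)).sum(axis=0)
    return mean, var, w
```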

  5. Influences of system uncertainties on the numerical transfer path analysis of engine systems

    NASA Astrophysics Data System (ADS)

    Acri, A.; Nijman, E.; Acri, A.; Offner, G.

    2017-10-01

    Practical mechanical systems operate with some degree of uncertainty. In numerical models uncertainties can result from poorly known or variable parameters, from geometrical approximation, from discretization or numerical errors, from uncertain inputs or from rapidly changing forcing that can be best described in a stochastic framework. Recently, random matrix theory was introduced to take parameter uncertainties into account in numerical modeling problems. In particular in this paper, Wishart random matrix theory is applied on a multi-body dynamic system to generate random variations of the properties of system components. Multi-body dynamics is a powerful numerical tool largely implemented during the design of new engines. In this paper the influence of model parameter variability on the results obtained from the multi-body simulation of engine dynamics is investigated. The aim is to define a methodology to properly assess and rank system sources when dealing with uncertainties. Particular attention is paid to the influence of these uncertainties on the analysis and the assessment of the different engine vibration sources. Examples of the effects of different levels of uncertainties are illustrated by means of examples using a representative numerical powertrain model. A numerical transfer path analysis, based on system dynamic substructuring, is used to derive and assess the internal engine vibration sources. The results obtained from this analysis are used to derive correlations between parameter uncertainties and statistical distribution of results. The derived statistical information can be used to advance the knowledge of the multi-body analysis and the assessment of system sources when uncertainties in model parameters are considered.
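
    Drawing a randomized system matrix centred on the nominal one can be sketched with scipy's Wishart distribution (the degrees-of-freedom heuristic linking `dispersion` to the scatter is an assumption for illustration; formal random-matrix treatments define the dispersion parameter differently):

```python
import numpy as np
from scipy.stats import wishart

def randomized_spd_matrix(nominal, dispersion=0.05, seed=0):
    """Draw a random symmetric positive-definite matrix (e.g. a stiffness
    or mass matrix) whose mean equals the nominal matrix."""
    n = nominal.shape[0]
    dof = max(n + 1, int(1.0 / dispersion**2))  # more dof -> less scatter
    scale = nominal / dof                       # E[draw] = dof * scale = nominal
    return wishart(df=dof, scale=scale).rvs(random_state=seed)

# Example: perturb a small nominal stiffness matrix
K = np.array([[2.0, -1.0], [-1.0, 2.0]])
print(randomized_spd_matrix(K, dispersion=0.1))
```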

  6. Assessment of Intrafraction Breathing Motion on Left Anterior Descending Artery Dose During Left-Sided Breast Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    El-Sherif, Omar, E-mail: Omar.ElSherif@lhsc.on.ca; Department of Physics, London Regional Cancer Program, London, Ontario; Yu, Edward

    Purpose: To use 4-dimensional computed tomography (4D-CT) imaging to predict the level of uncertainty in cardiac dose estimates of the left anterior descending artery that arises due to breathing motion during radiation therapy for left-sided breast cancer. Methods and Materials: The fast helical CT (FH-CT) and 4D-CT of 30 left-sided breast cancer patients were retrospectively analyzed. Treatment plans were created on the FH-CT. The original treatment plan was then superimposed onto all 10 phases of the 4D-CT to quantify the dosimetric impact of respiratory motion through 4D dose accumulation (4D-dose). Dose-volume histograms for the heart, left ventricle (LV), and left anterior descending (LAD) artery obtained from the FH-CT were compared with those obtained from the 4D-dose. Results: The 95% confidence interval of 4D-dose and FH-CT differences in mean dose estimates for the heart, LV, and LAD were ±0.5 Gy, ±1.0 Gy, and ±8.7 Gy, respectively. Conclusion: Fast helical CT is a good approximation for doses to the heart and LV; however, dose estimates for the LAD are susceptible to uncertainties that arise due to intrafraction breathing motion that cannot be ascertained without the additional information obtained from 4D-CT and dose accumulation. For future clinical studies, we suggest the use of 4D-CT–derived dose-volume histograms for estimating the dose to the LAD.

  7. How to Develop and Interpret a Credibility Assessment of Numerical Models for Human Research: NASA-STD-7009 Demystified

    NASA Technical Reports Server (NTRS)

    Nelson, Emily S.; Mulugeta, Lealem; Walton, Marlei; Myers, Jerry G.

    2014-01-01

    In the wake of the Columbia accident, the NASA-STD-7009 [1] credibility assessment was developed as a unifying platform to describe model credibility and the uncertainties in its modeling predictions. This standard is now being adapted by NASA's Human Research Program to cover a wide range of numerical models for human research. When used properly, the standard can improve the process of code development by encouraging the use of best practices. It can also give management more insight in making informed decisions through a better understanding of the model's capabilities and limitations. To a newcomer, the abstractions presented in NASA-STD-7009 and the sheer volume of information that must be absorbed can be overwhelming. This talk is aimed at describing the credibility assessment, which is the heart of the standard, in plain terms. It will outline how to develop a credibility assessment under the standard. It will also show how to quickly interpret the graphs and tables that result from the assessment and how to drill down from the top-level view to the foundation of the assessment. Finally, it will highlight some of the resources that are available for further study.

  8. Two Primary Standards for Low Flows of Gases

    PubMed Central

    Berg, Robert F.; Tison, Stuart A.

    2004-01-01

    We describe two primary standards for gas flow in the range from 0.1 to 1000 μmol/s. (1 μmol/s ≅ 1.3 cm³/min at 0 °C and 1 atmosphere.) The first standard is a volumetric technique in which measurements of pressure, volume, temperature, and time are recorded while gas flows in or out of a stainless steel bellows at constant pressure. The second standard is a gravimetric technique. A small aluminum pressure cylinder supplies gas to a laminar flow meter, and the integrated throughput of the laminar flow meter is compared to the weight decrease of the cylinder. The two standards, which have standard uncertainties of 0.019 %, agree to within combined uncertainties with each other and with a third primary standard at NIST based on pressure measurements at constant volume. PMID:27366623
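
    The equivalence quoted in parentheses follows directly from the ideal-gas law; a quick check using standard constants (not data from the paper):

```python
R = 8.314462618   # molar gas constant, J/(mol K)
P = 101325.0      # 1 atmosphere, Pa
T = 273.15        # 0 degrees C, K

V_dot = 1.3e-6 / 60.0          # 1.3 cm^3/min expressed in m^3/s
n_dot = P * V_dot / (R * T)    # ideal gas: n_dot = P*V_dot/(R*T), mol/s
print(f"{n_dot * 1e6:.2f} umol/s")   # -> about 0.97, i.e. ~1 umol/s
```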

  9. Monte Carlo calibration of avalanches described as Coulomb fluid flows.

    PubMed

    Ancey, Christophe

    2005-07-15

    The idea that snow avalanches might behave as granular flows, and thus be described as Coulomb fluid flows, came up very early in the scientific study of avalanches, but it is not until recently that field evidence has been provided that demonstrates the reliability of this idea. This paper aims to specify the bulk frictional behaviour of snow avalanches by seeking a universal friction law. Since the bulk friction coefficient cannot be measured directly in the field, the friction coefficient must be calibrated by adjusting the model outputs to closely match the recorded data. Field data are readily available but are of poor quality and accuracy. We used Bayesian inference techniques to specify the model uncertainty relative to data uncertainty and to robustly and efficiently solve the inverse problem. A sample of 173 events taken from seven paths in the French Alps was used. The first analysis showed that the friction coefficient behaved as a random variable with a smooth and bell-shaped empirical distribution function. Evidence was provided that the friction coefficient varied with the avalanche volume, but any attempt to adjust a one-to-one relationship relating friction to volume produced residual errors that could be as large as three times the maximum uncertainty of field data. A tentative universal friction law is proposed: the friction coefficient is a random variable, the distribution of which can be approximated by a normal distribution with a volume-dependent mean.

  10. Quantifying uncertainty in discharge measurements: A new approach

    USGS Publications Warehouse

    Kiang, J.E.; Cohn, T.A.; Mason, R.R.

    2009-01-01

    The accuracy of discharge measurements using velocity meters and the velocity-area method is typically assessed based on empirical studies that may not correspond to conditions encountered in practice. In this paper, a statistical approach for assessing uncertainty based on interpolated variance estimation (IVE) is introduced. The IVE method quantifies all sources of random uncertainty in the measured data. This paper presents results employing data from sites where substantial over-sampling allowed for the comparison of IVE-estimated uncertainty and observed variability among repeated measurements. These results suggest that the IVE approach can provide approximate estimates of measurement uncertainty. The use of IVE to estimate the uncertainty of a discharge measurement would provide the hydrographer an immediate determination of uncertainty and help determine whether there is a need for additional sampling in problematic river cross sections. © 2009 ASCE.

  11. Assessment of unconventional (tight) gas resources in Upper Cook Inlet Basin, South-central Alaska

    USGS Publications Warehouse

    Schenk, Christopher J.; Nelson, Philip H.; Klett, Timothy R.; Le, Phuong A.; Anderson, Christopher P.; Schenk, Christopher J.

    2015-01-01

    A geologic model was developed for the assessment of potential Mesozoic tight-gas resources in the deep, central part of upper Cook Inlet Basin, south-central Alaska. The basic premise of the geologic model is that organic-bearing marine shales of the Middle Jurassic Tuxedni Group achieved adequate thermal maturity for oil and gas generation in the central part of the basin largely due to several kilometers of Paleogene and Neogene burial. In this model, hydrocarbons generated in Tuxedni source rocks resulted in overpressure, causing fracturing and local migration of oil and possibly gas into low-permeability sandstone and siltstone reservoirs in the Jurassic Tuxedni Group and Chinitna and Naknek Formations. Oil that was generated either remained in the source rock and subsequently was cracked to gas which then migrated into low-permeability reservoirs, or oil initially migrated into adjacent low-permeability reservoirs, where it subsequently cracked to gas as adequate thermal maturation was reached in the central part of the basin. Geologic uncertainty exists on the (1) presence of adequate marine source rocks, (2) degree and timing of thermal maturation, generation, and expulsion, (3) migration of hydrocarbons into low-permeability reservoirs, and (4) preservation of this petroleum system. Given these uncertainties and using known U.S. tight gas reservoirs as geologic and production analogs, a mean volume of 0.64 trillion cubic feet of gas was assessed in the basin-center tight-gas system that is postulated to exist in Mesozoic rocks of the upper Cook Inlet Basin. This assessment of Mesozoic basin-center tight gas does not include potential gas accumulations in Cenozoic low-permeability reservoirs.

  12. An application of a hydraulic model simulator in flood risk assessment under changing climatic conditions

    NASA Astrophysics Data System (ADS)

    Doroszkiewicz, J. M.; Romanowicz, R. J.

    2016-12-01

    The standard procedure of climate change impact assessment on future hydrological extremes consists of a chain of consecutive actions, starting from the choice of GCM driven by an assumed CO2 scenario, through downscaling of climatic forcing to a catchment scale, estimation of hydrological extreme indices using hydrological modelling tools and subsequent derivation of flood risk maps with the help of a hydraulic model. Among the many possible sources of uncertainty, the main ones are related to future climate scenarios, climate models, downscaling techniques and hydrological and hydraulic models. Unfortunately, we cannot directly assess the impact of these different sources of uncertainty on future flood risk due to the lack of observations of future climate realizations. The aim of this study is an assessment of the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the processes involved, an assessment of the total uncertainty of maps of inundation probability can be very computationally demanding. As a way forward we present an application of a hydraulic model simulator based on a nonlinear transfer function model for chosen locations along the river reach. The transfer function model parameters are estimated based on the simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of a simulator substantially reduces the computational requirements of deriving flood risk maps under future climatic conditions. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections from the EURO-CORDEX project. The study describes a cascade of uncertainty related to the different stages of the process of deriving flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow routing model, the propagation of that uncertainty through the hydraulic model, and finally, the uncertainty related to the derivation of flood risk maps.
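
    The emulator idea, training a cheap input-output surrogate on a handful of full hydraulic-model runs and then sweeping it over many scenario realisations, can be sketched generically (the polynomial form and the toy numbers are illustrative; the study uses a nonlinear transfer function model, which is not reproduced here):

```python
import numpy as np

def fit_emulator(discharge_runs, level_runs, degree=3):
    """Fit a cheap surrogate mapping discharge to water level at one
    cross-section, using pairs simulated with the full hydraulic model."""
    return np.polyfit(discharge_runs, level_runs, deg=degree)

def emulate(coeffs, discharge):
    """Evaluate the surrogate for new discharge values (e.g. thousands of
    climate-driven inflow realisations) at negligible cost."""
    return np.polyval(coeffs, discharge)

# Train on a few expensive runs, then sweep an ensemble cheaply
q_train = np.array([50.0, 100.0, 200.0, 400.0, 800.0])   # m^3/s (toy values)
h_train = np.array([1.2, 1.9, 2.8, 4.0, 5.6])            # m (toy values)
coeffs = fit_emulator(q_train, h_train)
print(emulate(coeffs, np.array([150.0, 600.0])))
```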

  13. Consideration of vertical uncertainty in elevation-based sea-level rise assessments: Mobile Bay, Alabama case study

    USGS Publications Warehouse

    Gesch, Dean B.

    2013-01-01

    The accuracy with which coastal topography has been mapped directly affects the reliability and usefulness of elevation-based sea-level rise vulnerability assessments. Recent research has shown that the qualities of the elevation data must be well understood to properly model potential impacts. The cumulative vertical uncertainty has contributions from elevation data error, water level data uncertainties, and vertical datum and transformation uncertainties. The concepts of minimum sea-level rise increment and minimum planning timeline, important parameters for an elevation-based sea-level rise assessment, are used in recognition of the inherent vertical uncertainty of the underlying data. These concepts were applied to conduct a sea-level rise vulnerability assessment of the Mobile Bay, Alabama, region based on high-quality lidar-derived elevation data. The results that detail the area and associated resources (land cover, population, and infrastructure) vulnerable to a 1.18-m sea-level rise by the year 2100 are reported as a range of values (at the 95% confidence level) to account for the vertical uncertainty in the base data. Examination of the tabulated statistics about land cover, population, and infrastructure in the minimum and maximum vulnerable areas shows that these resources are not uniformly distributed throughout the overall vulnerable zone. The methods demonstrated in the Mobile Bay analysis provide an example of how to consider and properly account for vertical uncertainty in elevation-based sea-level rise vulnerability assessments, and the advantages of doing so.
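
    Cumulative vertical uncertainty combines independent error sources in quadrature; a minimal sketch at the 95% confidence level (the component values are illustrative, not the study's):

```python
import math

def cumulative_vertical_uncertainty(components, confidence_factor=1.96):
    """Root-sum-of-squares combination of independent vertical error
    sources, expressed at the 95 % confidence level.

    components: standard uncertainties in metres, e.g. lidar error,
    water-level datum uncertainty, datum-transformation uncertainty.
    """
    rss = math.sqrt(sum(c**2 for c in components))
    return confidence_factor * rss

# A sea-level rise increment finer than this value cannot be
# meaningfully resolved by the underlying elevation data.
print(cumulative_vertical_uncertainty([0.10, 0.05, 0.07]))  # ~0.26 m
```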

  14. Development and comparison in uncertainty assessment based Bayesian modularization method in hydrological modeling

    NASA Astrophysics Data System (ADS)

    Li, Lu; Xu, Chong-Yu; Engeland, Kolbjørn

    2013-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, various regression and probabilistic approaches are used in hydrological modeling. A family of Bayesian methods, which incorporates different sources of information into a single analysis through Bayes' theorem, is widely used for uncertainty assessment. However, none of these approaches can adequately treat the impact of high flows in hydrological modeling. This study proposes a Bayesian modularization uncertainty assessment approach in which the highest streamflow observations are treated as suspect information that should not influence the inference of the main bulk of the model parameters. This study includes a comprehensive comparison and evaluation of uncertainty assessments by our new Bayesian modularization method and standard Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions were used in combination with the standard Bayesian method: the AR(1) plus Normal model independent of time (Model 1), the AR(1) plus Normal model dependent on time (Model 2) and the AR(1) plus Multi-normal model (Model 3). The results reveal that the Bayesian modularization method provides the most accurate streamflow estimates, as measured by the Nash-Sutcliffe efficiency, and provides the best uncertainty estimates for low, medium and entire flows compared to standard Bayesian methods. The study thus provides a new approach for reducing the impact of high flows on the discharge uncertainty assessment of hydrological models via Bayesian methods.
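
    For reference, the time-independent AR(1) plus Normal error model (Model 1) leads to a simple conditional log-likelihood that can be dropped into a Metropolis-Hastings loop; a sketch that conditions on the first residual (parameter names are illustrative):

```python
import numpy as np

def ar1_normal_loglik(obs, sim, sigma, phi):
    """Conditional log-likelihood of residuals under an AR(1) plus Normal
    error model with time-independent variance.

    obs, sim: observed and simulated streamflow series
    sigma: innovation standard deviation; phi: AR(1) coefficient
    """
    resid = obs - sim
    innov = resid[1:] - phi * resid[:-1]          # AR(1) innovations
    n = innov.size
    return (-0.5 * n * np.log(2.0 * np.pi * sigma**2)
            - 0.5 * np.sum(innov**2) / sigma**2)
```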

  15. Uncertainty contribution on the density of liquids due to unknown sinker temperature in hydrostatic weighing apparatus

    NASA Astrophysics Data System (ADS)

    Schiebl, M.; Zelenka, Z.; Buchner, C.; Pohl, R.; Steindl, D.

    2018-02-01

    In this study, the influence of the unknown sinker temperature on the measured density of liquids is evaluated. Generally, due to the intrinsic temperature instability of the heat bath temperature controller, the system will never reach thermal equilibrium but instead will oscillate around a mean temperature. The sinker temperature follows this temperature oscillation with a certain time lag. Since the sinker temperature is not measured directly in a hydrostatic weighing apparatus, the temperature of the sinker, and thus in turn the volume of the sinker, is not known exactly. As a consequence, this leads to uncertainty in the value of the density of the liquid. From an analysis of the volume relaxation of the sinker immersed into a heat bath with time-dependent temperature characteristics, the heat transfer coefficient can be estimated, and thus a characteristic time constant for achieving quasi thermal equilibrium for a hydrostatic weighing apparatus is proposed. Additionally, from a theoretical analysis of the transient behavior of the sinker volume, the systematic deviation of the theoretical to the actual measured liquid density is calculated.
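
    The size of the effect can be sketched from first-order thermal relaxation combined with volumetric expansion (all numbers below are assumed for illustration, not taken from the study):

```python
import numpy as np

# Relative density error caused by an unknown sinker temperature offset dT
# in hydrostatic weighing: V(T) = V0 * (1 + 3*alpha*dT) for a solid with
# linear expansion coefficient alpha, and the liquid density scales ~ 1/V.
alpha = 0.5e-6        # 1/K, e.g. a low-expansion glass sinker (assumed)
tau = 600.0           # s, heat-transfer time constant (assumed)
dT0 = 0.05            # K, initial sinker-bath temperature offset (assumed)

t = np.linspace(0.0, 3600.0, 7)
dT = dT0 * np.exp(-t / tau)              # first-order thermal relaxation
rel_density_error = 3.0 * alpha * dT     # fractional error in liquid density
print(rel_density_error)                 # decays toward 0 at equilibrium
```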

  16. A multiple-image-based method to evaluate the performance of deformable image registration in the pelvis

    NASA Astrophysics Data System (ADS)

    Saleh, Ziad; Thor, Maria; Apte, Aditya P.; Sharp, Gregory; Tang, Xiaoli; Veeraraghavan, Harini; Muren, Ludvig; Deasy, Joseph

    2016-08-01

    Deformable image registration (DIR) is essential for adaptive radiotherapy (RT) for tumor sites subject to motion, changes in tumor volume, as well as changes in patient normal anatomy due to weight loss. Several methods have been published to evaluate DIR-related uncertainties but they are not widely adopted. The aim of this study was, therefore, to evaluate intra-patient DIR for two highly deformable organs—the bladder and the rectum—in prostate cancer RT using a quantitative metric based on multiple image registration, the distance discordance metric (DDM). Voxel-by-voxel DIR uncertainties of the bladder and rectum were evaluated using DDM on weekly CT scans of 38 subjects previously treated with RT for prostate cancer (six scans/subject). The DDM was obtained from group-wise B-spline registration of each patient’s collection of repeat CT scans. For each structure, registration uncertainties were derived from DDM-related metrics. In addition, five other quantitative measures, including inverse consistency error (ICE), transitivity error (TE), Dice similarity coefficient (DSC) and volume ratios between corresponding structures from pre- and post-registered images, were computed and compared with the DDM. The DDM varied across subjects and structures; DDM_mean ranged from 2 to 13 mm for the bladder and from 1 to 11 mm for the rectum. There was a high correlation between DDM_mean of the bladder and the rectum (Pearson's correlation coefficient, R_p = 0.62). The correlation between DDM_mean and the volume ratios post-DIR was stronger (R_p = 0.51-0.68) than the correlation with the TE (bladder: R_p = 0.46; rectum: R_p = 0.47) or the ICE (bladder: R_p = 0.34; rectum: R_p = 0.37). There was a negative correlation between DSC and DDM_mean for both the bladder (R_p = -0.23) and the rectum (R_p = -0.63). The DDM uncertainty metric indicated considerable DIR variability across subjects and structures. Our results show a stronger correlation with volume ratios and with the DSC using the DDM compared to using the ICE and TE. The DDM has the potential to quantitatively identify regions of large DIR uncertainties and consequently identify anatomical/scan outliers. The DDM can, thus, be applied to improve the adaptive RT process for tumor sites subject to motion.
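
    The core of the DDM can be sketched as the scatter of a voxel's position after it is mapped into a target scan along different registration pathways (a simplified reading; consult the published definition for the exact formulation):

```python
import numpy as np

def distance_discordance(mapped_points):
    """Scatter of one voxel's mapped position across registration pathways.

    mapped_points: array (n_pathways, 3) holding the voxel's coordinates in
    the target scan after being mapped via registrations that pass through
    each of the other scans in the group.  Concordant pathways land close
    together; discordant ones spread out.
    """
    centroid = mapped_points.mean(axis=0)
    return np.linalg.norm(mapped_points - centroid, axis=1).mean()

# Example: five pathways disagree by a few millimetres (toy coordinates)
pts = np.array([[10.0, 5.0, 2.0], [11.2, 5.1, 2.3], [9.4, 4.8, 1.7],
                [10.6, 5.4, 2.1], [10.1, 4.9, 2.4]])
print(distance_discordance(pts))   # mean distance to the centroid, in mm
```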

  17. A TIERED APPROACH TO PERFORMING UNCERTAINTY ANALYSIS IN CONDUCTING EXPOSURE ANALYSIS FOR CHEMICALS

    EPA Science Inventory

    The WHO/IPCS draft Guidance Document on Characterizing and Communicating Uncertainty in Exposure Assessment provides guidance on recommended strategies for conducting uncertainty analysis as part of human exposure analysis. Specifically, a tiered approach to uncertainty analysis ...

  18. The impact of land use on estimates of pesticide leaching potential: Assessments and uncertainties

    NASA Astrophysics Data System (ADS)

    Loague, Keith

    1991-11-01

    This paper illustrates the magnitude of uncertainty which can exist for pesticide leaching assessments, due to data uncertainties, both between soil orders and within a single soil order. The current work differs from previous efforts because the impact of uncertainty in recharge estimates is considered. The examples are for diuron leaching in the Pearl Harbor Basin. The results clearly indicate that land use has a significant impact on both estimates of pesticide leaching potential and the uncertainties associated with those estimates. It appears that the regulation of agricultural chemicals in the future should include consideration for changing land use.

  19. Validation of a model with climatic and flow scenario analysis: case of Lake Burrumbeet in southeastern Australia.

    PubMed

    Yihdego, Yohannes; Webb, John

    2016-05-01

    Forecast evaluation is an important topic that addresses the development of reliable hydrological probabilistic forecasts, mainly through the use of climate uncertainties. Validation is often given no place in hydrology, despite the fact that the parameters of a model are uncertain and the structure of the model can be incorrectly chosen. A calibrated and verified dynamic hydrologic water balance spreadsheet model has been used to assess the effect of climate variability on Lake Burrumbeet, southeastern Australia. The model has been verified against lake level, lake volume, lake surface area, surface outflow and lake salinity. The current study aims to increase confidence in the model's lake level predictions through historical validation for the years 2008-2013 under different climatic scenarios. The observed climatic conditions (2008-2013) match fairly well with a hybrid of scenarios, since the period corresponds to both dry and wet climatic conditions. In addition to the uncertainty in hydrologic stresses, uncertainty in the calibrated model is among the major drawbacks involved in making scenario simulations. In line with this, the uncertainty in the calibrated model was tested using sensitivity analysis, which showed that errors in the model can largely be attributed to erroneous estimates of evaporation and rainfall, and to a lesser extent surface inflow. The study demonstrates that several climatic scenarios should be analysed, with a combination of extreme climate, stream flow and climate change instead of one assumed climatic sequence, to improve climate variability prediction in the future. Performing such scenario analysis is a valid exercise to comprehend the uncertainty in the model structure and hydrology in a meaningful way, without missing scenarios that, even when considered less probable, may ultimately turn out to be crucial for decision making; this will increase the confidence of model predictions for management of water resources.

  20. SU-E-T-503: IMRT Optimization Using Monte Carlo Dose Engine: The Effect of Statistical Uncertainty.

    PubMed

    Tian, Z; Jia, X; Graves, Y; Uribe-Sanchez, A; Jiang, S

    2012-06-01

    With the development of ultra-fast GPU-based Monte Carlo (MC) dose engines, it becomes clinically realistic to compute the dose-deposition coefficients (DDC) for IMRT optimization using MC simulation. However, it is still time-consuming to compute the DDC with small statistical uncertainty. This work studies the effects of the statistical error in the DDC matrix on IMRT optimization. The MC-computed DDC matrices are simulated here by adding statistical uncertainties at a desired level to the ones generated with a finite-size pencil beam algorithm. A statistical uncertainty model for MC dose calculation is employed. We adopt a penalty-based quadratic optimization model and a gradient descent method to optimize the fluence map and then recalculate the corresponding actual dose distribution using the noise-free DDC matrix. The impacts of DDC noise are assessed in terms of the deviation of the resulting dose distributions. We have also used a stochastic perturbation theory to theoretically estimate the statistical errors of dose distributions on a simplified optimization model. A head-and-neck case is used to investigate the perturbation to the IMRT plan due to MC statistical uncertainty. The relative errors of the final dose distributions of the optimized IMRT are found to be much smaller than those in the DDC matrix, which is consistent with our theoretical estimation. When the history number is decreased from 10^8 to 10^6, the dose-volume histograms are still very similar to the error-free DVHs while the error in the DDC is about 3.8%. The results illustrate that the statistical errors in the DDC matrix have a relatively small effect on IMRT optimization in the dose domain. This indicates we can use a relatively small number of histories to obtain the DDC matrix with MC simulation within a reasonable amount of time, without considerably compromising the accuracy of the optimized treatment plan. This work is supported by Varian Medical Systems through a Master Research Agreement. © 2012 American Association of Physicists in Medicine.
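
    The experiment, adding DDC noise at a level set by the history count, optimizing the fluence, then scoring with the noise-free DDC, can be sketched as follows (the toy dimensions and the projected-gradient solver are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def optimize_fluence(ddc, d_presc, n_iter=2000, lr=1e-4):
    """Penalty-based quadratic optimization: minimize ||D x - d||^2 with
    x >= 0, by projected gradient descent."""
    x = np.ones(ddc.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * ddc.T @ (ddc @ x - d_presc)
        x = np.maximum(x - lr * grad, 0.0)        # projected gradient step
    return x

rng = np.random.default_rng(0)
ddc_true = rng.uniform(0.0, 1.0, size=(200, 50))   # toy voxels x beamlets
rel_noise = 0.038                                  # ~3.8 % DDC noise, as quoted
ddc_noisy = ddc_true * (1.0 + rel_noise * rng.standard_normal(ddc_true.shape))

d_presc = np.full(200, 60.0)
x = optimize_fluence(ddc_noisy, d_presc)
# Score the noisy-DDC plan with the noise-free DDC: the dose error is far
# smaller than the DDC noise because it averages over many beamlets.
rel_dose_err = np.abs((ddc_true - ddc_noisy) @ x) / d_presc
print(rel_dose_err.mean())
```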

  1. Assessment of the uncertainty in future projection for summer climate extremes over the East Asia

    NASA Astrophysics Data System (ADS)

    Park, Changyong; Min, Seung-Ki; Cha, Dong-Hyun

    2017-04-01

    Future projections of climate extremes at regional and local scales are essential information for better adapting to climate change. However, future projections carry large uncertainties arising from internal and external processes, which reduce projection confidence. Using CMIP5 (Coupled Model Intercomparison Project Phase 5) multi-model simulations, we assess uncertainties in future projections of East Asian temperature and precipitation extremes, focusing on summer. Projected changes in summer mean and extreme temperature and precipitation over East Asia grow larger with time. Moreover, the uncertainty cascades show widening scenario differences and inter-model ranges with time. A positive mean-extreme relation is found in projections of both temperature and precipitation. Assessing the uncertainty factors for these projections shows that the dominant factors for temperature and precipitation change over time. For uncertainty in mean and extreme temperature, the contributions of internal variability and model uncertainty decline after the mid-21st century, while the role of scenario uncertainty grows rapidly. For uncertainty in mean precipitation projections, internal variability is more important than scenario uncertainty. Unlike mean precipitation, extreme precipitation shows that scenario uncertainty is expected to be the dominant factor by the 2090s. Model uncertainty remains an important factor for both mean and extreme precipitation until the late 21st century. The spatial patterns of the uncertainty factors for mean and extreme projections generally follow the temporal changes in the fraction of total variance contributed by each factor across many grid cells of East Asia. ACKNOWLEDGEMENTS: The research was supported by the Korea Meteorological Administration Research and Development program under grant KMIPA 2015-2083 and the National Research Foundation of Korea Grant funded by the Ministry of Science, ICT and Future Planning of Korea (NRF-2016M3C4A7952637), whose support assisted the completion of the study.

  2. Relationship between Physicians' Uncertainty about Clinical Assessments and Patient-Centered Recommendations for Colorectal Cancer Screening in the Elderly.

    PubMed

    Dalton, Alexandra F; Golin, Carol E; Esserman, Denise; Pignone, Michael P; Pathman, Donald E; Lewis, Carmen L

    2015-05-01

    The goal of this study was to examine associations between physicians' clinical assessments, their certainty in these assessments, and the likelihood of a patient-centered recommendation about colorectal cancer (CRC) screening in the elderly. Two hundred seventy-six primary care physicians in the United States read 3 vignettes about an 80-year-old female patient and answered questions about her life expectancy, their confidence in their life expectancy estimate, the balance of benefits/downsides of CRC screening, their certainty in their benefit/downside assessment, and the best course of action regarding CRC screening. We used logistic regression to determine the relationship between these variables and patient-centered recommendations about CRC screening. In bivariate analyses, physicians had higher odds of making a patient-centered recommendation about CRC screening when their clinical assessments did not lead to a clear screening recommendation or when they experienced uncertainty in their clinical assessments. However, in a multivariate regression model, only benefit/downside assessment and best course of action remained statistically significant predictors of a patient-centered recommendation. Our findings demonstrate that when the results of clinical assessments do not lead to obvious screening decisions or when physicians feel uncertain about their clinical assessments, they are more likely to make patient-centered recommendations. Existing uncertainty frameworks do not adequately describe the uncertainty associated with patient-centered recommendations found in this study. Adapting or modifying these frameworks to better reflect the constructs associated with uncertainty and the interactions between uncertainty and the complexity inherent in clinical decisions will facilitate a more complete understanding of how and when physicians choose to include patients in clinical decisions. © The Author(s) 2015.

  3. Predicting Consumer Biomass, Size-Structure, Production, Catch Potential, Responses to Fishing and Associated Uncertainties in the World's Marine Ecosystems.

    PubMed

    Jennings, Simon; Collingridge, Kate

    2015-01-01

    Existing estimates of fish and consumer biomass in the world's oceans are disparate. This creates uncertainty about the roles of fish and other consumers in biogeochemical cycles and ecosystem processes, the extent of human and environmental impacts and fishery potential. We develop and use a size-based macroecological model to assess the effects of parameter uncertainty on predicted consumer biomass, production and distribution. Resulting uncertainty is large (e.g. median global biomass 4.9 billion tonnes for consumers weighing 1 g to 1000 kg; 50% uncertainty intervals of 2 to 10.4 billion tonnes; 90% uncertainty intervals of 0.3 to 26.1 billion tonnes) and driven primarily by uncertainty in trophic transfer efficiency and its relationship with predator-prey body mass ratios. Even the upper uncertainty intervals for global predictions of consumer biomass demonstrate the remarkable scarcity of marine consumers, with less than one part in 30 million by volume of the global oceans comprising tissue of macroscopic animals. Thus the apparently high densities of marine life seen in surface and coastal waters and frequently visited abundance hotspots will likely give many in society a false impression of the abundance of marine animals. Unexploited baseline biomass predictions from the simple macroecological model were used to calibrate a more complex size- and trait-based model to estimate fisheries yield and impacts. Yields are highly dependent on baseline biomass and fisheries selectivity. Predicted global sustainable fisheries yield increases ≈4 fold when smaller individuals (< 20 cm from species of maximum mass < 1 kg) are targeted in all oceans, but the predicted yields would rarely be accessible in practice and this fishing strategy leads to the collapse of larger species if fishing mortality rates on different size classes cannot be decoupled. Our analyses show that models with minimal parameter demands that are based on a few established ecological principles can support equitable analysis and comparison of diverse ecosystems. The analyses provide insights into the effects of parameter uncertainty on global biomass and production estimates, which have yet to be achieved with complex models, and will therefore help to highlight priorities for future research and data collection. However, the focus on simple model structures and global processes means that non-phytoplankton primary production and several groups, structures and processes of ecological and conservation interest are not represented. Consequently, our simple models become increasingly less useful than more complex alternatives when addressing questions about food web structure and function, biodiversity, resilience and human impacts at smaller scales and for areas closer to coasts.

  4. High Resolution Viscosity Measurement by Thermal Noise Detection

    PubMed Central

    Aguilar Sandoval, Felipe; Sepúlveda, Manuel; Bellon, Ludovic; Melo, Francisco

    2015-01-01

    An interferometric method is implemented in order to accurately assess the thermal fluctuations of a micro-cantilever sensor in liquid environments. The power spectrum density (PSD) of thermal fluctuations together with Sader’s model of the cantilever allow for the indirect measurement of the liquid viscosity with good accuracy. The good quality of the deflection signal and the characteristic low noise of the instrument allow for the detection and correction of drawbacks due to both cantilever shape irregularities and uncertainty in the position of the laser spot at the fluctuating end of the cantilever. Variations in viscosity below 0.03 mPa·s were detected, with the possibility of performing measurements on a volume as small as 50 μL. PMID:26540061

  5. Characterizing spatial uncertainty when integrating social data in conservation planning.

    PubMed

    Lechner, A M; Raymond, C M; Adams, V M; Polyakov, M; Gordon, A; Rhodes, J R; Mills, M; Stein, A; Ives, C D; Lefroy, E C

    2014-12-01

    Recent conservation planning studies have presented approaches for integrating spatially referenced social (SRS) data with a view to improving the feasibility of conservation action. We reviewed the growing conservation literature on SRS data, focusing on elicited or stated preferences derived through social survey methods such as choice experiments and public participation geographic information systems. Elicited SRS data includes the spatial distribution of willingness to sell, willingness to pay, willingness to act, and assessments of social and cultural values. We developed a typology for assessing elicited SRS data uncertainty which describes how social survey uncertainty propagates when projected spatially and the importance of accounting for spatial uncertainty such as scale effects and data quality. These uncertainties will propagate when elicited SRS data is integrated with biophysical data for conservation planning and may have important consequences for assessing the feasibility of conservation actions. To explore this issue further, we conducted a systematic review of the elicited SRS data literature. We found that social survey uncertainty was commonly tested for, but that these uncertainties were ignored when projected spatially. Based on these results we developed a framework which will help researchers and practitioners estimate social survey uncertainty and use these quantitative estimates to systematically address uncertainty within an analysis. This is important when using SRS data in conservation applications because decisions need to be made irrespective of data quality and well characterized uncertainty can be incorporated into decision theoretic approaches. © 2014 Society for Conservation Biology.

  6. Communicating uncertainties in assessments of future sea level rise

    NASA Astrophysics Data System (ADS)

    Wikman-Svahn, P.

    2013-12-01

    How uncertainty should be managed and communicated in policy-relevant scientific assessments is directly connected to the role of science and the responsibility of scientists. These fundamentally philosophical issues influence how scientific assessments are made and how scientific findings are communicated to policymakers. It is therefore of high importance to discuss implicit assumptions and value judgments that are made in policy-relevant scientific assessments. The present paper examines these issues for the case of scientific assessments of future sea level rise. The magnitude of future sea level rise is very uncertain, mainly due to poor scientific understanding of all physical mechanisms affecting the great ice sheets of Greenland and Antarctica, which together hold enough land-based ice to raise sea levels by more than 60 meters if completely melted. There has been much confusion among policymakers about how different assessments of future sea levels should be interpreted. Much of this confusion is probably due to how uncertainties are characterized and communicated in these assessments. The present paper draws on the recent philosophical debate on the so-called "value-free ideal of science" - the view that science should not be based on social and ethical values. Issues related to how uncertainty is handled in scientific assessments are central to this debate. This literature has largely focused on how uncertainty in data, parameters or models implies that choices have to be made, which can have social consequences. However, less emphasis has been placed on how uncertainty is characterized when communicating the findings of a study, which is the focus of the present paper. The paper argues that there is a tension between on the one hand the value-free ideal of science and on the other hand usefulness for practical applications in society. This means that even if the value-free ideal could be upheld in theory, by carefully constructing and hedging statements characterizing scientific uncertainty, it will in most cases not be very useful for society. Instead, it is argued that scientific assessments that are used to inform societal decision-making should try to anticipate applications and aim to construct statements that characterize knowledge and uncertainty in a way that is more useful for those anticipated applications, even if this means that the value-free ideal cannot be upheld. This means that scientific assessments should ideally be intertwined with societal applications, and that co-produced knowledge engaging both scientists and end-users is likely to provide better and more useful assessments. The argument is illustrated using real examples from scientific assessments of future sea level rise, with special emphasis on the approaches used by the Intergovernmental Panel on Climate Change in the fourth assessment report from 2007, and the fifth assessment report due in September 2013. Finally, it is argued that recent developments in "bottom-up" and "robust" decision-making frameworks provide a way forward to remove many of the pitfalls and problems of communicating uncertainties in policy-relevant scientific assessments.

  8. Combining exposure and effect modeling into an integrated probabilistic environmental risk assessment for nanoparticles.

    PubMed

    Jacobs, Rianne; Meesters, Johannes A J; Ter Braak, Cajo J F; van de Meent, Dik; van der Voet, Hilko

    2016-12-01

    There is a growing need for good environmental risk assessment of engineered nanoparticles (ENPs). Environmental risk assessment of ENPs has been hampered by lack of data and knowledge about ENPs, their environmental fate, and their toxicity, which leads to uncertainty in the risk assessment. To deal with this uncertainty effectively, probabilistic methods are advantageous. In the present study, the authors developed a method to model both the variability and the uncertainty in environmental risk assessment of ENPs. The method is based on the concentration ratio, that is, the ratio of the exposure concentration to the critical effect concentration, with both concentrations considered to be random. In this method, variability and uncertainty are modeled separately, so as to allow the user to see which part of the total variation in the concentration ratio is attributable to uncertainty and which part to variability. The authors illustrate the use of the method with a simplified aquatic risk assessment of nano-titanium dioxide. The method allows a more transparent risk assessment and can also direct further environmental and toxicological research to the areas in which it is most needed. Environ Toxicol Chem 2016;35:2958-2967. © 2016 The Authors. Environmental Toxicology and Chemistry published by Wiley Periodicals, Inc. on behalf of SETAC.
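
    The separation of variability and uncertainty described above is commonly implemented as a nested ("two-dimensional") Monte Carlo loop. The sketch below illustrates the general idea in Python; all distributions and numbers are invented for illustration, not the authors' actual model of nano-titanium dioxide.

```python
import numpy as np

rng = np.random.default_rng(42)

# Outer loop: epistemic uncertainty (imprecisely known distribution
# parameters); inner loop: variability (spread between environments).
n_outer, n_inner = 200, 1000

exceedance = np.empty(n_outer)
for i in range(n_outer):
    # Hypothetical uncertain hyper-parameters drawn from assumed priors:
    mu_exp = rng.normal(-1.0, 0.3)   # mean log10 exposure conc. (ug/L)
    mu_eff = rng.normal(1.0, 0.2)    # mean log10 critical effect conc.
    # Variability of concentrations around those uncertain means:
    log_exp = rng.normal(mu_exp, 0.5, n_inner)
    log_eff = rng.normal(mu_eff, 0.4, n_inner)
    ratio = 10.0 ** (log_exp - log_eff)        # concentration ratio
    exceedance[i] = np.mean(ratio > 1.0)       # fraction at risk

# Spread across the outer loop reflects uncertainty, not variability.
lo, med, hi = np.quantile(exceedance, [0.05, 0.5, 0.95])
print(f"P(ratio > 1): median {med:.3f}, 90% band [{lo:.3f}, {hi:.3f}]")
```

    The width of the outer-loop band shows how much of the apparent risk is driven by lack of knowledge, which is exactly the part that further research can reduce.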

  9. An overview of methods to identify and manage uncertainty for modelling problems in the water-environment-agriculture cross-sector

    DOE PAGES

    Jakeman, Anthony J.; Jakeman, John Davis

    2018-03-14

    Uncertainty pervades the representation of systems in the water–environment–agriculture cross-sector. Successful methods to address uncertainties have largely focused on standard mathematical formulations of biophysical processes in a single sector, such as partial or ordinary differential equations. More attention to integrated models of such systems is warranted. Model components representing the different sectors of an integrated model can have less standard, and different, formulations to one another, as well as different levels of epistemic knowledge and data informativeness. Thus, uncertainty is not only pervasive but also crosses boundaries and propagates between system components. Uncertainty assessment (UA) cries out for more eclectic treatment in these circumstances, some of it being more qualitative and empirical. In this paper, we discuss the various sources of uncertainty in such a cross-sectoral setting and ways to assess and manage them. We outline a fast-growing set of methodologies, particularly in the computational mathematics literature on uncertainty quantification (UQ), that seem highly pertinent for uncertainty assessment. There appears to be considerable scope for advancing UA by integrating relevant UQ techniques into cross-sectoral problem applications. Of course, this will entail considerable collaboration between domain specialists, who often take first ownership of the problem, and computational methods experts.

  11. How does uncertainty shape patient experience in advanced illness? A secondary analysis of qualitative data.

    PubMed

    Etkind, Simon Noah; Bristowe, Katherine; Bailey, Katharine; Selman, Lucy Ellen; Murtagh, Fliss Em

    2017-02-01

    Uncertainty is common in advanced illness but is infrequently studied in this context. If poorly addressed, uncertainty can lead to adverse patient outcomes. We aimed to understand patient experiences of uncertainty in advanced illness and to develop a typology of patients' responses and preferences to inform practice. Secondary analysis of qualitative interview transcripts. Studies were assessed for inclusion and interviews were sampled using maximum-variation sampling. Analysis used a thematic approach with 10% of coding cross-checked to enhance reliability. Qualitative interviews were drawn from six studies including patients with heart failure, chronic obstructive pulmonary disease, renal disease, cancer and liver failure. A total of 30 transcripts were analysed. Median age was 75 years (range 43-95); 12 patients were women. The impact of uncertainty was frequently discussed: the main related themes were engagement with illness, information needs, patient priorities and the period of time on which patients mainly focused their attention (temporal focus). A typology of patient responses to uncertainty was developed from these themes. Uncertainty influences patient experience in advanced illness by affecting patients' information needs, preferences and future priorities for care. Our typology aids understanding of how patients with advanced illness respond to uncertainty. Assessment of these three factors may be a useful starting point to guide clinical assessment and shared decision making.

  12. Performance of internal covariance estimators for cosmic shear correlation functions

    DOE PAGES

    Friedrich, O.; Seitz, S.; Eifler, T. F.; ...

    2015-12-31

    Data re-sampling methods such as the delete-one jackknife are a common tool for estimating the covariance of large scale structure probes. In this paper we investigate the concepts of internal covariance estimation in the context of cosmic shear two-point statistics. We demonstrate how to use log-normal simulations of the convergence field and the corresponding shear field to carry out realistic tests of internal covariance estimators and find that most estimators such as jackknife or sub-sample covariance can reach a satisfactory compromise between bias and variance of the estimated covariance. In a forecast for the complete, 5-year DES survey we show that internally estimated covariance matrices can provide a large fraction of the true uncertainties on cosmological parameters in a 2D cosmic shear analysis. The volume inside contours of constant likelihood in the $\Omega_m$-$\sigma_8$ plane as measured with internally estimated covariance matrices is on average $\gtrsim 85\%$ of the volume derived from the true covariance matrix. The uncertainty on the parameter combination $\Sigma_8 \sim \sigma_8 \Omega_m^{0.5}$ derived from internally estimated covariances is $\sim 90\%$ of the true uncertainty.
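
    For readers unfamiliar with the delete-one jackknife mentioned above, the following minimal Python sketch shows how an internal covariance estimate is built from sub-samples of a survey. The data are synthetic and the function name is ours; a real analysis would use correlation functions measured in sky patches.

```python
import numpy as np

def jackknife_covariance(samples):
    """Delete-one jackknife covariance of the mean data vector.

    samples: (n, p) array, e.g. a two-point correlation function
    measured in n sub-regions ("jackknife patches"), p angular bins.
    """
    n = samples.shape[0]
    totals = samples.sum(axis=0)
    # Delete-one resamples: mean of the data with patch i removed.
    resamples = (totals[None, :] - samples) / (n - 1)
    diff = resamples - resamples.mean(axis=0)
    # The (n - 1)/n prefactor compensates the strong overlap of resamples.
    return (n - 1) / n * (diff.T @ diff)

# Toy data: 100 correlated measurements of a 5-bin statistic.
rng = np.random.default_rng(0)
true_cov = np.eye(5) + 0.5
samples = rng.multivariate_normal(np.zeros(5), true_cov, size=100)
cov_mean = jackknife_covariance(samples)
print(np.round(cov_mean * samples.shape[0], 2))  # times n: ~ true_cov
```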

  13. Phase shifts in I = 2 ππ-scattering from two lattice approaches

    NASA Astrophysics Data System (ADS)

    Kurth, T.; Ishii, N.; Doi, T.; Aoki, S.; Hatsuda, T.

    2013-12-01

    We present a lattice QCD study of the phase shift of I = 2 ππ scattering on the basis of two different approaches: the standard finite volume approach by Lüscher and the recently introduced HAL QCD potential method. Quenched QCD simulations are performed on lattices with extents N_s = 16, 24, 32, 48 and N_t = 128, a lattice spacing a ≈ 0.115 fm and a pion mass of m_π ≈ 940 MeV. The phase shift and the scattering length are calculated with both methods. In the potential method, the error is dominated by the systematic uncertainty associated with the violation of rotational symmetry due to the finite lattice spacing. In Lüscher's approach, such systematic uncertainty is difficult to evaluate and thus is not included in this work. A systematic uncertainty attributed to the quenched approximation, however, is not evaluated in either method. In the case of the potential method, the phase shift can be calculated for arbitrary energies below the inelastic threshold. The energy dependence of the phase shift is also obtained from Lüscher's method using different volumes and/or its non-rest-frame extension. The results are found to agree well with the potential method.

  14. Toward quantifying the effectiveness of water trading under uncertainty.

    PubMed

    Luo, B; Huang, G H; Zou, Y; Yin, Y Y

    2007-04-01

    This paper presents a methodology for quantifying the effectiveness of water trading under uncertainty by developing an optimization model based on the interval-parameter two-stage stochastic program (TSP) technique. In the study, the effectiveness of a water-trading program is measured by the water volume that can be released through trading from a statistical point of view. The methodology can also deal with recourse water allocation problems generated by randomness in water availability and, at the same time, tackle uncertainties expressed as intervals in the trading system. The developed methodology was tested with a hypothetical water-trading program in an agricultural system in the Swift Current Creek watershed, Canada. Study results indicate that the methodology can effectively measure the long-term effectiveness of a trading program by estimating the water volume released through trading. A sensitivity analysis was also conducted to analyze the effects of different trading costs on the trading program; it shows that trading becomes ineffective when the trading costs are too high. The case study also demonstrates that the trading program is more effective in a dry season, when total water availability is in shortage.
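
    A minimal illustration of the two-stage stochastic programming idea (without the interval-parameter extension) can be written as a deterministic-equivalent linear program. All coefficients below are invented for illustration and have no connection to the Swift Current Creek case study.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical coefficients: benefit per unit of water promised to users,
# penalty per unit recalled (recourse) when availability turns out low.
benefit, penalty, x_max = 100.0, 160.0, 12.0
avail = np.array([6.0, 9.0, 13.0])     # water availability scenarios
prob = np.array([0.3, 0.5, 0.2])       # scenario probabilities

# Decision vector [x, y1, y2, y3]: first-stage target x, recourse y_s.
# Minimise  -benefit*x + sum_s prob_s * penalty * y_s.
c = np.concatenate([[-benefit], penalty * prob])
# Feasibility: x - y_s <= avail_s  (shortage y_s covers the deficit).
A_ub = np.hstack([np.ones((3, 1)), -np.eye(3)])
res = linprog(c, A_ub=A_ub, b_ub=avail,
              bounds=[(0, x_max)] + [(0, None)] * 3)
print(f"first-stage allocation target: {res.x[0]:.1f}")
print(f"expected net benefit: {-res.fun:.1f}")
```

    The optimum stops raising the first-stage target once the probability-weighted recourse penalty of one further unit exceeds its marginal benefit; the interval-parameter TSP resolves the same trade-off with interval rather than point coefficients.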

  15. A robust scientific workflow for assessing fire danger levels using open-source software

    NASA Astrophysics Data System (ADS)

    Vitolo, Claudia; Di Giuseppe, Francesca; Smith, Paul

    2017-04-01

    Modelling forest fires is theoretically and computationally challenging because it involves a wide variety of information, in large volumes and affected by high uncertainties. In-situ observations of wildfire, for instance, are highly sparse and need to be complemented by remotely sensed measurements of biomass burning to achieve homogeneous coverage at the global scale. Fire models use weather reanalysis products to estimate energy release and rate of spread but can only assess the potential predictability of fire danger, as actual ignition is due to human behaviour and therefore very unpredictable. Lastly, fire forecasting systems rely on weather forecasts to extend the advance warning but are currently calibrated using fire danger thresholds that are defined at the global scale and do not take into account the spatial variability of fuel availability. As a consequence, uncertainties increase sharply, cascading from the observational to the modelling stage, and they may be further inflated by non-reproducible analyses. Although uncertainties in observations will only decrease with technological advances over the next decades, the other uncertainties (i.e. those generated during modelling and post-processing) can already be addressed by developing transparent and reproducible analysis workflows, all the more so if implemented within open-source initiatives. Reproducible workflows streamline the processing task by presenting ready-made solutions to handle and manipulate complex and heterogeneous datasets, and opening the code to the scrutiny of other experts increases the chances of implementing more robust solutions while avoiding duplication of effort. In this work we present our contribution to the forest fire modelling community: an open-source tool called "caliver" for the calibration and verification of forest fire model results. The tool is developed in the R programming language and publicly available under an open license. We present the caliver R package, illustrate its main functionalities and show the results of our preliminary experiments calculating fire danger thresholds for various regions on Earth. We compare these with the existing global thresholds and, lastly, demonstrate how these newly calculated regional thresholds can lead to improved calibration of fire forecast models in an operational setting.
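
    caliver itself is an R package; purely to illustrate the idea of regionally calibrated danger thresholds, the sketch below derives percentile-based thresholds from a synthetic fire weather index record in Python. The distribution and percentile choices are assumptions of ours, not caliver's actual defaults.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic daily fire-weather-index record for one region (20 years);
# a gamma shape is a crude stand-in for a real reanalysis-driven FWI.
fwi = rng.gamma(shape=2.0, scale=8.0, size=365 * 20)

# Danger classes as climatological percentiles of the regional record,
# one plausible way to regionalise fixed global thresholds.
classes = {"moderate": 50, "high": 75, "very high": 90, "extreme": 98}
for name, p in classes.items():
    print(f"{name:>9}: FWI >= {np.percentile(fwi, p):.1f}")
```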

  16. Pragmatic geometric model evaluation

    NASA Astrophysics Data System (ADS)

    Pamer, Robert

    2015-04-01

    Quantification of subsurface model reliability is mathematically and technically demanding, as there are many different sources of uncertainty and some of the factors can be assessed only subjectively. For many practical applications in industry or risk assessment (e.g. geothermal drilling), a quantitative estimate of possible geometric variations in depth units is preferred over relative numbers because of cost calculations for different scenarios. The talk gives an overview of several factors that affect the geometry of structural subsurface models that are based upon typical geological survey organization (GSO) data such as geological maps, borehole data and conceptually driven construction of subsurface elements (e.g. fault networks). Within the context of the trans-European project "GeoMol", uncertainty analysis has to be very pragmatic, also because of differing data rights, data policies and modelling software among the project partners. In a case study, a two-step evaluation methodology for geometric subsurface model uncertainty is being developed. In a first step, several models of the same volume of interest have been calculated by omitting successively more and more input data types (seismic constraints, fault network, outcrop data). The positions of the various horizon surfaces are then compared. The procedure is equivalent to comparing data of various levels of detail, and therefore structural complexity. This gives a measure of the structural significance of each data set in space, and as a consequence areas of geometric complexity are identified. These areas are usually very data sensitive, hence geometric variability between individual data points in these areas is higher than in areas of low structural complexity. Instead of calculating a multitude of different models by varying some input data or parameters, as is done in Monte Carlo simulations, the aim of the second step of the evaluation procedure (which is part of the ongoing work) is to calculate essentially two model variations that can be seen as geometric extremes of all available input data. This does not lead to a probability distribution for the spatial position of geometric elements, but it defines zones of major (or minor, respectively) geometric variation due to data uncertainty. Both model evaluations are then analyzed together to give ranges of possible model outcomes in metric units.

  17. Uncertainty in tsunami sediment transport modeling

    USGS Publications Warehouse

    Jaffe, Bruce E.; Goto, Kazuhisa; Sugawara, Daisuke; Gelfenbaum, Guy R.; La Selle, SeanPaul M.

    2016-01-01

    Erosion and deposition from tsunamis record information about tsunami hydrodynamics and size that can be interpreted to improve tsunami hazard assessment. We explore sources and methods for quantifying uncertainty in tsunami sediment transport modeling. Uncertainty varies with tsunami, study site, available input data, sediment grain size, and model. Although uncertainty has the potential to be large, published case studies indicate that both forward and inverse tsunami sediment transport models perform well enough to be useful for deciphering tsunami characteristics, including size, from deposits. New techniques for quantifying uncertainty, such as Ensemble Kalman Filtering inversion, and more rigorous reporting of uncertainties will advance the science of tsunami sediment transport modeling. Uncertainty may be decreased with additional laboratory studies that increase our understanding of the semi-empirical parameters and physics of tsunami sediment transport, standardized benchmark tests to assess model performance, and development of hybrid modeling approaches to exploit the strengths of forward and inverse models.
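
    As a toy illustration of the ensemble-Kalman-style inversion mentioned above, the sketch below updates an ensemble of flow-speed guesses against synthetic deposit observations. The forward model, error levels and prior are all invented stand-ins for a real tsunami sediment transport model.

```python
import numpy as np

rng = np.random.default_rng(1)

def forward(speed):
    """Toy forward model: deposit thickness (cm) at 3 sites vs. flow
    speed; a stand-in for a real tsunami sediment transport model."""
    return np.array([0.5 * speed, 0.2 * speed**2, 1.5 * np.sqrt(speed)])

obs = np.array([2.0, 3.1, 3.0])         # observed thicknesses (cm)
obs_err = 0.2                           # assumed observation std

n_ens = 500
speed = rng.normal(3.0, 0.5, n_ens)     # prior ensemble of flow speeds
preds = np.array([forward(s) for s in speed])          # (n_ens, 3)

# Ensemble Kalman update with perturbed observations:
# speed_a = speed + C_sp (C_pp + R)^-1 (y + eps - pred)
C = np.cov(speed, preds.T)              # (4, 4): parameter + 3 predictions
C_sp, C_pp = C[0, 1:], C[1:, 1:]
K = C_sp @ np.linalg.inv(C_pp + obs_err**2 * np.eye(3))
perturbed = obs + rng.normal(0.0, obs_err, (n_ens, 3))
speed_post = speed + (perturbed - preds) @ K

print(f"prior:     {speed.mean():.2f} +/- {speed.std():.2f}")
print(f"posterior: {speed_post.mean():.2f} +/- {speed_post.std():.2f}")
```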

  18. Assessment of SFR Wire Wrap Simulation Uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delchini, Marc-Olivier G.; Popov, Emilian L.; Pointer, William David

    Predictive modeling and simulation of nuclear reactor performance and fuel are challenging due to the large number of coupled physical phenomena that must be addressed. Models that will be used for design or operational decisions must be analyzed for uncertainty to ascertain impacts to safety or performance. Rigorous, structured uncertainty analyses are performed by characterizing the model's input uncertainties and then propagating the uncertainties through the model to estimate output uncertainty. This project is part of the ongoing effort to assess modeling uncertainty in Nek5000 simulations of flow configurations relevant to the advanced reactor applications of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. Three geometries are under investigation in these preliminary assessments: a 3-D pipe, a 3-D 7-pin bundle, and a single pin from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) facility. Initial efforts have focused on gaining an understanding of Nek5000 modeling options and integrating Nek5000 with Dakota. These tasks are being accomplished by demonstrating the use of Dakota to assess parametric uncertainties in a simple pipe flow problem. This problem is used to optimize performance of the uncertainty quantification strategy and to estimate computational requirements for assessments of complex geometries. A sensitivity analysis with respect to three turbulence models was conducted for turbulent flow in a single wire-wrapped pin (THORS) geometry. Section 2 briefly describes the software tools used in this study and provides appropriate references. Section 3 presents the coupling interface between Dakota and a computational fluid dynamics (CFD) code (Nek5000 or STARCCM+), with details on the workflow, the scripts used for setting up the run, and the scripts used for post-processing the output files. In Section 4, the meshing methods used to generate the THORS and 7-pin bundle meshes are explained. Sections 5, 6 and 7 present numerical results for the 3-D pipe, the single pin THORS mesh, and the 7-pin bundle mesh, respectively.

  19. A multi-model assessment of terrestrial biosphere model data needs

    NASA Astrophysics Data System (ADS)

    Gardella, A.; Cowdery, E.; De Kauwe, M. G.; Desai, A. R.; Duveneck, M.; Fer, I.; Fisher, R.; Knox, R. G.; Kooper, R.; LeBauer, D.; McCabe, T.; Minunno, F.; Raiho, A.; Serbin, S.; Shiklomanov, A. N.; Thomas, A.; Walker, A.; Dietze, M.

    2017-12-01

    Terrestrial biosphere models provide us with the means to simulate the impacts of climate change and their uncertainties. Going beyond direct observation and experimentation, models synthesize our current understanding of ecosystem processes and can give us insight on data needed to constrain model parameters. In previous work, we leveraged the Predictive Ecosystem Analyzer (PEcAn) to assess the contribution of different parameters to the uncertainty of the Ecosystem Demography model v2 (ED2) outputs across various North American biomes (Dietze et al., JGR-G, 2014). While this analysis identified key research priorities, the extent to which these priorities were model- and/or biome-specific was unclear. Furthermore, because the analysis only studied one model, we were unable to comment on the effect of variability in model structure on overall predictive uncertainty. Here, we expand this analysis to all biomes globally and a wide sample of models that vary in complexity: BioCro, CABLE, CLM, DALEC, ED2, FATES, G'DAY, JULES, LANDIS, LINKAGES, LPJ-GUESS, MAESPA, PRELES, SDGVM, SIPNET, and TEM. Prior to performing the uncertainty analyses, model parameter uncertainties were assessed by assimilating all available trait data from the combination of the BETYdb and TRY trait databases, using an updated multivariate version of PEcAn's hierarchical Bayesian meta-analysis. Next, sensitivity analyses were performed for all models across a range of sites globally to assess sensitivities for a range of different outputs (GPP, ET, SH, Ra, NPP, Rh, NEE, LAI) at multiple time scales from the sub-annual to the decadal. Finally, parameter uncertainties and model sensitivities were combined to evaluate the fractional contribution of each parameter to the predictive uncertainty for a specific variable at a specific site and timescale. Facilitated by PEcAn's automated workflows, this analysis represents the broadest assessment of the sensitivities and uncertainties in terrestrial models to date, and provides a comprehensive roadmap for constraining model uncertainties through model development and data collection.

  20. Understanding extreme sea levels for broad-scale coastal impact and adaptation analysis

    NASA Astrophysics Data System (ADS)

    Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Dangendorf, S.; Hinkel, J.; Slangen, A. B. A.

    2017-07-01

    One of the main consequences of mean sea level rise (SLR) on human settlements is an increase in flood risk due to an increase in the intensity and frequency of extreme sea levels (ESL). While substantial research efforts are directed towards quantifying projections and uncertainties of future global and regional SLR, corresponding uncertainties in contemporary ESL have not been assessed and projections are limited. Here we quantify, for the first time at global scale, the uncertainties in present-day ESL estimates, which have by default been ignored in broad-scale sea-level rise impact assessments to date. ESL uncertainties exceed those from global SLR projections and, assuming that we meet the Paris agreement goals, the projected SLR itself by the end of the century in many regions. Both uncertainties in SLR projections and ESL estimates need to be understood and combined to fully assess potential impacts and adaptation needs.

  1. Assessment of Experimental Uncertainty for a Floating Wind Semisubmersible under Hydrodynamic Loading: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Amy N; Wendt, Fabian F; Jonkman, Jason

    The objective of this paper is to assess the sources of experimental uncertainty in an offshore wind validation campaign focused on better understanding the nonlinear hydrodynamic response behavior of a floating semisubmersible. The test specimen and conditions were simplified compared to other floating wind test campaigns to reduce potential sources of uncertainties and better focus on the hydrodynamic load attributes. Repeat tests were used to understand the repeatability of the test conditions and to assess the level of random uncertainty in the measurements. Attention was also given to understanding bias in all components of the test. The end goal of this work is to set uncertainty bounds on the response metrics of interest, which will be used in future work to evaluate the success of modeling tools in accurately calculating hydrodynamic loads and the associated motion responses of the system.

  2. Quantitative body DW-MRI biomarkers uncertainty estimation using unscented wild-bootstrap.

    PubMed

    Freiman, M; Voss, S D; Mulkern, R V; Perez-Rossello, J M; Warfield, S K

    2011-01-01

    We present a new method for the uncertainty estimation of diffusion parameters for quantitative body DW-MRI assessment. Estimating the uncertainty of diffusion parameters from DW-MRI is necessary for clinical applications that use these parameters to assess pathology. However, uncertainty estimation using traditional techniques requires repeated acquisitions, which is undesirable in routine clinical use. Model-based bootstrap techniques, for example, assume an underlying linear model for residual rescaling and cannot be utilized directly for body diffusion parameter uncertainty estimation due to the non-linearity of the body diffusion model. To offset this limitation, our method uses the unscented transform to compute the residual rescaling parameters from the non-linear body diffusion model, and then applies the wild-bootstrap method to infer the uncertainty of the body diffusion parameters. Validation through phantom and human subject experiments shows that our method correctly identifies the regions with higher uncertainty in body DW-MRI model parameters, with a relative error of -36% in the uncertainty values.
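
    The wild-bootstrap step can be sketched as follows for a mono-exponential diffusion model. For brevity this sketch resamples raw residuals with random sign flips and omits the unscented-transform rescaling that is the paper's actual contribution; all acquisition parameters and noise levels are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(7)

def monoexp(b, s0, adc):
    """Mono-exponential DW-MRI signal model (a simplified stand-in for
    the non-linear body diffusion models used clinically)."""
    return s0 * np.exp(-b * adc)

b_values = np.array([0.0, 50.0, 100.0, 400.0, 800.0])      # s/mm^2
signal = monoexp(b_values, 1000.0, 1.5e-3) + rng.normal(0, 15, 5)

p0 = [900.0, 1.0e-3]
popt, _ = curve_fit(monoexp, b_values, signal, p0=p0)
fitted = monoexp(b_values, *popt)
residuals = signal - fitted

# Wild bootstrap: flip residual signs at random and refit; the spread
# of refitted ADC values estimates its uncertainty from one acquisition.
adc = []
for _ in range(1000):
    flip = rng.choice([-1.0, 1.0], size=residuals.size)
    popt_b, _ = curve_fit(monoexp, b_values, fitted + flip * residuals, p0=p0)
    adc.append(popt_b[1])

print(f"ADC = {popt[1]:.2e} +/- {np.std(adc):.2e} mm^2/s")
```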

  3. Assessment and visualization of uncertainty for countrywide soil organic matter map of Hungary using local entropy

    NASA Astrophysics Data System (ADS)

    Szatmári, Gábor; Pásztor, László

    2016-04-01

    Uncertainty is a general term expressing our imperfect knowledge in describing an environmental process, of which we are aware (Bárdossy and Fodor, 2004). Sampling, laboratory measurements, models and so on are all subject to uncertainty. Effective quantification and visualization of uncertainty is indispensable to stakeholders (e.g. policy makers, society). Soil-related features and their spatial models should be key targets of uncertainty assessment because inferences from them are further used in modelling and decision-making processes. The aim of our present study was to assess and effectively visualize the local uncertainty of the countrywide soil organic matter (SOM) spatial distribution model of Hungary using geostatistical tools and concepts. The Hungarian Soil Information and Monitoring System's SOM data (approximately 1,200 observations) and environmentally related, spatially exhaustive secondary information (i.e. digital elevation model, climatic maps, MODIS satellite images and geological map) were used to model the countrywide SOM spatial distribution by regression kriging. It would be common to use the calculated estimation (or kriging) variance as a measure of uncertainty; however, the normality and homoscedasticity hypotheses had to be rejected according to our preliminary analysis of the data. Therefore, a normal score transformation and a sequential stochastic simulation approach were introduced to model and assess the local uncertainty. Five hundred equally probable realizations (i.e. stochastic images) were generated. This number of stochastic images is large enough to provide a model of uncertainty at each location, which is a complete description of uncertainty in geostatistics (Deutsch and Journel, 1998). Furthermore, these models can be applied, for example, to contour the probability of events of interest, which can be regarded as goal-oriented digital soil maps and are of interest for agricultural management and decision making as well. A standardized measure of the local entropy was used to visualize uncertainty, where entropy values close to 1 correspond to high uncertainty, whilst values close to 0 correspond to low uncertainty. The advantage of using local entropy in this context is that it combines probabilities from multiple members into a single number for each location of the model. In conclusion, it is straightforward to use a sequential stochastic simulation approach for the assessment of uncertainty when normality and homoscedasticity are violated. The visualization of uncertainty using the local entropy is effective and communicative to stakeholders because it represents the uncertainty as a single number on a [0, 1] scale. References: Bárdossy, Gy. & Fodor, J., 2004. Evaluation of Uncertainties and Risks in Geology. Springer-Verlag, Berlin Heidelberg. Deutsch, C.V. & Journel, A.G., 1998. GSLIB: Geostatistical Software Library and User's Guide. Oxford University Press, New York. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).
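
    The standardized local entropy used for visualization can be computed directly from the stack of simulated realizations. The sketch below shows one plausible implementation; the realizations and class boundaries are synthetic, and the function name is ours.

```python
import numpy as np

def local_entropy(realizations, bins):
    """Standardised local (Shannon) entropy per grid cell.

    realizations: (n_real, n_cells) array of simulated SOM values.
    bins: class boundaries discretising the continuous values.
    Returns values in [0, 1]: 0 = certain, 1 = maximally uncertain.
    """
    n_classes = len(bins) + 1
    classes = np.digitize(realizations, bins)        # (n_real, n_cells)
    entropy = np.zeros(realizations.shape[1])
    for k in range(n_classes):
        p = (classes == k).mean(axis=0)              # local class probability
        nz = p > 0
        entropy[nz] -= p[nz] * np.log(p[nz])
    return entropy / np.log(n_classes)               # standardise to [0, 1]

# Toy stack: 500 realizations over 4 cells of increasing uncertainty.
rng = np.random.default_rng(3)
sims = np.column_stack([rng.normal(m, s, 500)
                        for m, s in [(2, 0.1), (2, 0.7), (2, 1.5), (2, 3.0)]])
print(np.round(local_entropy(sims, bins=[1.5, 2.5, 3.5]), 2))
```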

  4. Ignoring correlation in uncertainty and sensitivity analysis in life cycle assessment: what is the risk?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groen, E.A., E-mail: Evelyne.Groen@gmail.com; Heijungs, R.; Leiden University, Einsteinweg 2, Leiden 2333 CC

    Life cycle assessment (LCA) is an established tool to quantify the environmental impact of a product. A good assessment of uncertainty is important for making well-informed decisions in comparative LCA, as well as for correctly prioritising data collection efforts. Under- or overestimation of output uncertainty (e.g. output variance) will lead to incorrect decisions in such matters. The presence of correlations between input parameters during uncertainty propagation can increase or decrease the output variance. However, most LCA studies that include uncertainty analysis ignore correlations between input parameters during uncertainty propagation, which may lead to incorrect conclusions. Two approaches to include correlations between input parameters during uncertainty propagation and global sensitivity analysis were studied: an analytical approach and a sampling approach. The use of both approaches is illustrated for an artificial case study of electricity production. Results demonstrate that both approaches yield approximately the same output variance and sensitivity indices for this specific case study. Furthermore, we demonstrate that the analytical approach can be used to quantify the risk of ignoring correlations between input parameters during uncertainty propagation in LCA. We demonstrate that: (1) we can predict whether including correlations among input parameters in uncertainty propagation will increase or decrease output variance; (2) we can quantify the risk of ignoring correlations on the output variance and the global sensitivity indices. Moreover, this procedure requires little data. - Highlights: • Ignoring correlation leads to under- or overestimation of the output variance. • We demonstrated that the risk of ignoring correlation can be quantified. • The procedure proposed is generally applicable in life cycle assessment. • In some cases, ignoring correlation has a minimal effect on decision-making tools.
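
    The core effect is already visible for a linearised model g = a·x1 + b·x2, where the covariance term 2abρσ1σ2 is exactly what is dropped when correlation is ignored. The sketch below contrasts the analytical variance with a sampling check; all coefficients are arbitrary and unrelated to the paper's electricity case study.

```python
import numpy as np

# Linearised output g = a*x1 + b*x2. Analytically,
#   Var(g) = a^2 s1^2 + b^2 s2^2 + 2*a*b*rho*s1*s2,
# and the last term is what ignoring correlation throws away.
a, b, s1, s2 = 2.0, -1.5, 0.3, 0.4        # hypothetical coefficients
for rho in (0.0, 0.8, -0.8):
    var = a**2 * s1**2 + b**2 * s2**2 + 2 * a * b * rho * s1 * s2
    print(f"rho = {rho:+.1f}: analytical Var(g) = {var:.3f}")

# Sampling check for rho = 0.8:
rng = np.random.default_rng(0)
rho = 0.8
cov = [[s1**2, rho * s1 * s2], [rho * s1 * s2, s2**2]]
x1, x2 = rng.multivariate_normal([0.0, 0.0], cov, size=200_000).T
print(f"sampled Var(g) = {np.var(a * x1 + b * x2):.3f}")
```

    Note that with sensitivities of opposite sign, a positive correlation here reduces the output variance, which is why ignoring correlation can bias the variance in either direction.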

  5. How to find what you don't know: Visualising variability in 3D geological models

    NASA Astrophysics Data System (ADS)

    Lindsay, Mark; Wellmann, Florian; Jessell, Mark; Ailleres, Laurent

    2014-05-01

    Uncertainties in input data can have compounding effects on the predictive reliability of three-dimensional (3D) geological models. Resource exploration, tectonic studies and environmental modelling can be compromised by using 3D models that misrepresent the target geology, and drilling campaigns that attempt to intersect particular geological units guided by 3D models are at risk of failure if the exploration geologist is unaware of inherent uncertainties. In addition, the visual inspection of 3D models is often the first contact decision makers have with the geology, so visually communicating the presence and magnitude of uncertainties contained within geological 3D models is critical. Unless uncertainties are presented early in the relationship between decision maker and model, the model will be considered more truthful than the uncertainties allow with each subsequent viewing. We present a selection of visualisation techniques that provide the viewer with an insight into the location and amount of uncertainty contained within a model, and the geological characteristics that are most affected. A model of the Gippsland Basin, southeastern Australia, is used as a case study to demonstrate the concepts of information entropy, stratigraphic variability and geodiversity. Central to the techniques shown here is the creation of a model suite, performed by creating similar (but not identical) versions of the original model through perturbation of the input data; specifically, structural data in the form of strike and dip measurements are perturbed in the creation of the model suite. The visualisation techniques presented are: (i) information entropy; (ii) stratigraphic variability; and (iii) geodiversity. Information entropy is used to analyse uncertainty in a spatial context, combining the empirical probability distributions of multiple outcomes into a single quantitative measure. Stratigraphic variability displays the number of possible lithologies that may exist at a given point within the model volume. Geodiversity analyses various model characteristics (or 'geodiversity metrics'), including the depth and volume of units, the curvature of interfaces, the geological complexity of contacts and the contact relationships between units. Principal component analysis, a multivariate statistical technique, is used to simultaneously examine the geodiversity metrics to determine the boundaries of model space and to identify which metrics contribute most to model uncertainty (see the sketch below). The combination of information entropy, stratigraphic variability and geodiversity analysis provides a descriptive and thorough representation of uncertainty with effective visualisation techniques that clearly communicate the geological uncertainty contained within the geological model.
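
    As a rough illustration of the geodiversity analysis step, the following sketch applies principal component analysis to a synthetic suite of perturbed models described by four metrics. The metric values and scalings are invented; the loadings simply show which metrics dominate the between-model variability.

```python
import numpy as np

rng = np.random.default_rng(8)

# Synthetic model suite: 50 perturbed models x 4 geodiversity metrics
# (say unit depth, unit volume, interface curvature, contact complexity);
# column scalings mimic metrics of very different variability.
metrics = rng.normal(size=(50, 4)) * np.array([3.0, 2.0, 0.7, 0.3])

# PCA via SVD of the centred metric matrix: the components span the
# model space explored by the suite; loadings show which metrics dominate.
centred = metrics - metrics.mean(axis=0)
_, s, Vt = np.linalg.svd(centred, full_matrices=False)
print("explained variance ratio:", np.round(s**2 / np.sum(s**2), 2))
print("PC1 loadings per metric: ", np.round(Vt[0], 2))
```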

  6. MELCOR computer code manuals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  7. Quantifying the intra-annual uncertainties in climate change assessment over 10 sub-basins across the Pacific Northwest US

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, Ali; Moradkhani, Hamid; Rana, Arun

    2017-04-01

    Uncertainty is an inevitable feature of climate change impact assessments. Understanding and quantifying different sources of uncertainty is of high importance and can help modeling agencies improve the current models and scenarios. In this study, we assessed the future changes in three climate variables (precipitation, maximum temperature, and minimum temperature) over 10 sub-basins across the Pacific Northwest US. To conduct the study, 10 statistically downscaled CMIP5 GCMs from two downscaling methods (BCSD and MACA) were utilized at 1/16 degree spatial resolution for the historical period of 1970-2000 and the future period of 2010-2099. For the future projections, the two scenarios RCP4.5 and RCP8.5 were used. Furthermore, Bayesian Model Averaging (BMA) was employed to develop a probabilistic future projection for each climate variable. Results indicate the superiority of the BMA simulations over individual models. Increasing temperature and precipitation are projected at the annual timescale; however, the changes are not uniform among seasons. Model uncertainty proves to be the major source of uncertainty, while downscaling uncertainty contributes significantly to the total uncertainty, especially in summer.
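
    Bayesian Model Averaging weights are typically estimated with an EM algorithm over a training period. The sketch below implements a deliberately simplified version (a common error scale, no bias correction, loosely after the general forecast-BMA scheme of Raftery et al.) on synthetic hindcasts; it is not the study's actual configuration, and all data are invented.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

# Synthetic training period: observations plus three model hindcasts
# with different biases and noise levels (all values invented).
n = 200
obs = rng.normal(10.0, 2.0, n)
models = np.stack([obs + rng.normal(bias, noise, n)
                   for bias, noise in [(0.0, 1.0), (0.5, 1.5), (-1.0, 3.0)]])

# EM iteration for BMA weights w_k and a common spread sigma.
w, sigma = np.full(3, 1 / 3), 2.0
for _ in range(200):
    z = w[:, None] * norm.pdf(obs, loc=models, scale=sigma)   # (3, n)
    z /= z.sum(axis=0, keepdims=True)                         # responsibilities
    w = z.mean(axis=1)
    sigma = np.sqrt((z * (obs - models) ** 2).sum() / n)

print("BMA weights:", np.round(w, 3), "| sigma:", round(sigma, 2))
```

    The resulting weighted mixture yields a probabilistic projection whose spread reflects both model disagreement and the fitted error scale.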

  8. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    NASA Astrophysics Data System (ADS)

    O'Neill, B. C.

    2015-12-01

    Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g. economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies difference in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed. Options under consideration for improving the approach include the use of perturbed physics ensembles of CESM, employing results from multiple climate models, and combining the results from single impact models with statistical representations of uncertainty across multiple models. A key consideration is the relationship between the question being addressed and the uncertainty approach.

  9. Contribution of crop model structure, parameters and climate projections to uncertainty in climate change impact assessments.

    PubMed

    Tao, Fulu; Rötter, Reimund P; Palosuo, Taru; Gregorio Hernández Díaz-Ambrona, Carlos; Mínguez, M Inés; Semenov, Mikhail A; Kersebaum, Kurt Christian; Nendel, Claas; Specka, Xenia; Hoffmann, Holger; Ewert, Frank; Dambreville, Anaelle; Martre, Pierre; Rodríguez, Lucía; Ruiz-Ramos, Margarita; Gaiser, Thomas; Höhn, Jukka G; Salo, Tapio; Ferrise, Roberto; Bindi, Marco; Cammarano, Davide; Schulman, Alan H

    2018-03-01

    Climate change impact assessments are plagued with uncertainties from many sources, such as climate projections or the inadequacies in structure and parameters of the impact model. Previous studies tried to account for the uncertainty from one or two of these. Here, we developed a triple-ensemble probabilistic assessment using seven crop models, multiple sets of model parameters and eight contrasting climate projections together, to comprehensively account for uncertainties from these three important sources. We demonstrated the approach by assessing the climate change impact on barley growth and yield at Jokioinen, Finland, in the Boreal climatic zone and Lleida, Spain, in the Mediterranean climatic zone, for the 2050s. We further quantified and compared the contributions of crop model structure, crop model parameters and climate projections to the total variance of the ensemble output using analysis of variance (ANOVA). Based on the triple-ensemble probabilistic assessment, the median of simulated yield change was -4% and +16%, and the probability of decreasing yield was 63% and 31%, in the 2050s at Jokioinen and Lleida, respectively, relative to 1981-2010. The contribution of crop model structure to the total variance of ensemble output was larger than that from downscaled climate projections and model parameters. The relative contributions of crop model parameters and downscaled climate projections to the total variance of ensemble output varied greatly among the seven crop models and between the two sites. The contribution of downscaled climate projections was on average larger than that of crop model parameters. This information on the uncertainty from different sources can be quite useful for model users to decide where to put the most effort when preparing or choosing models or parameters for impact analyses. We concluded that the triple-ensemble probabilistic approach, which accounts for the uncertainties from multiple important sources, provides more comprehensive information for quantifying uncertainties in climate change impact assessments than conventional approaches that are deterministic or account for the uncertainties from only one or two of the sources. © 2017 John Wiley & Sons Ltd.
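
    The ANOVA attribution of ensemble variance to crop model structure, parameters and climate projections can be sketched for a balanced factorial ensemble as below. The effect sizes are synthetic stand-ins for real simulations, and interactions are lumped into the residual.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic balanced ensemble: yield change simulated for every
# combination of crop model (M), parameter set (P) and climate
# projection (C); additive effects plus noise mimic model output.
nM, nP, nC = 7, 5, 8
y = (rng.normal(0, 6, (nM, 1, 1))       # structural (model) effects
     + rng.normal(0, 2, (1, nP, 1))     # parameter effects
     + rng.normal(0, 4, (1, 1, nC))     # climate projection effects
     + rng.normal(0, 1, (nM, nP, nC)))  # interactions / residual

grand = y.mean()
total_ss = ((y - grand) ** 2).sum()

# Balanced-design main-effect sums of squares for each factor.
for name, axes, levels in [("crop model", (1, 2), nM),
                           ("parameters", (0, 2), nP),
                           ("climate", (0, 1), nC)]:
    ss = (y.size / levels) * ((y.mean(axis=axes) - grand) ** 2).sum()
    print(f"{name:>11}: {100 * ss / total_ss:5.1f}% of total variance")
```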

  10. Accounting for methodological, structural, and parameter uncertainty in decision-analytic models: a practical guide.

    PubMed

    Bilcke, Joke; Beutels, Philippe; Brisson, Marc; Jit, Mark

    2011-01-01

    Accounting for uncertainty is now a standard part of decision-analytic modeling and is recommended by many health technology agencies and published guidelines. However, the scope of such analyses is often limited, even though techniques have been developed for presenting the effects of methodological, structural, and parameter uncertainty on model results. To help bring these techniques into mainstream use, the authors present a step-by-step guide that offers an integrated approach to account for different kinds of uncertainty in the same model, along with a checklist for assessing the way in which uncertainty has been incorporated. The guide also addresses special situations, such as when a source of uncertainty is difficult to parameterize, when resources are limited for an ideal exploration of uncertainty, or when evidence to inform the model is not available or not reliable. Methods for identifying the sources of uncertainty that most influence results are also described. Besides guiding analysts, the guide and checklist may be useful to decision makers who need to assess how well uncertainty has been accounted for in a decision-analytic model before using the results to make a decision.

  11. Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.

    PubMed

    Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James

    2009-04-01

    The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.

  12. Is the impact of eutrophication on phytoplankton diversity dependent on lake volume/ecosystem size?

    USGS Publications Warehouse

    Baho, Didier L.; Drakare, Stina; Johnson, Richard K.; Allen, Craig R.; Angeler, David G.

    2017-01-01

    Research focusing on biodiversity responses to the interactions of ecosystem size and anthropogenic stressors is based mainly on correlative gradient studies, which may confound size-stress relationships owing to spatial context and differences in local habitat features across ecosystems. We investigated how local factors related to anthropogenic stressors (e.g. eutrophication) interact with ecosystem size to influence species diversity. In this study, constructed lake mesocosms with two contrasting volumes, 1020 litres (shallow) and 2150 litres (deep), were used to simulate ecosystems of different size, and nutrient levels were manipulated to simulate mesotrophic and hypertrophic conditions. Using a factorial design, we assessed how the interaction between ecosystem size and nutrients influences phytoplankton diversity. We assessed community metrics (richness, diversity, evenness and total biovolumes) and multivariate community structure over a growing season (May to early November 2011). Different community structures were found between deep and shallow mesocosms with nutrient enrichment: Cyanobacteria dominated in the deep and Charophyta in the shallow mesocosms. In contrast, phytoplankton communities were more similar to each other in the low-nutrient treatments; only Chlorophyta generally had a higher biovolume in the shallow compared to the deep mesocosms. These results suggest that ecosystem size is not only a determinant of species diversity, but that it can also mediate the influence of anthropogenic effects on biodiversity. Such interactions increase the uncertainty of global change outcomes, and should therefore not be ignored in risk/impact assessment and management.

  13. How predictable is the timing of a summer ice-free Arctic?

    NASA Astrophysics Data System (ADS)

    Jahn, Alexandra; Kay, Jennifer E.; Holland, Marika M.; Hall, David M.

    2016-09-01

    Climate model simulations give a large range of over 100 years for predictions of when the Arctic could first become ice free in the summer, and many studies have attempted to narrow this uncertainty range. However, given the chaotic nature of the climate system, what amount of spread in the prediction of an ice-free summer Arctic is inevitable? Based on results from large ensemble simulations with the Community Earth System Model, we show that internal variability alone leads to a prediction uncertainty of about two decades, while scenario uncertainty between the strong (Representative Concentration Pathway (RCP) 8.5) and medium (RCP4.5) forcing scenarios adds at least another 5 years. Common metrics of the past and present mean sea ice state (such as ice extent, volume, and thickness) as well as global mean temperatures do not allow a reduction of the prediction uncertainty from internal variability.

  14. Uncertainty principles for inverse source problems for electromagnetic and elastic waves

    NASA Astrophysics Data System (ADS)

    Griesmaier, Roland; Sylvester, John

    2018-06-01

    In isotropic homogeneous media, far fields of time-harmonic electromagnetic waves radiated by compactly supported volume currents, and elastic waves radiated by compactly supported body force densities can be modelled in very similar fashions. Both are projected restricted Fourier transforms of vector-valued source terms. In this work we generalize two types of uncertainty principles recently developed for far fields of scalar-valued time-harmonic waves in Griesmaier and Sylvester (2017 SIAM J. Appl. Math. 77 154–80) to this vector-valued setting. These uncertainty principles yield stability criteria and algorithms for splitting far fields radiated by collections of well-separated sources into the far fields radiated by individual source components, and for the restoration of missing data segments. We discuss proper regularization strategies for these inverse problems, provide stability estimates based on the new uncertainty principles, and comment on reconstruction schemes. A numerical example illustrates our theoretical findings.

  15. Quantifying model-structure- and parameter-driven uncertainties in spring wheat phenology prediction with Bayesian analysis

    DOE PAGES

    Alderman, Phillip D.; Stanfill, Bryan

    2016-10-06

    Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions of estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis, and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models, based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. This study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.
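
    A random walk Metropolis sampler of the kind used here is short to write down. The sketch below estimates a single thermal-time parameter of a deliberately trivial phenology model from invented heading dates; the study's nine actual models and their parameterisations are of course far richer.

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented observations: days to heading at five trial locations.
obs = np.array([58.0, 61.0, 55.0, 63.0, 60.0])

def predict(tsum, mean_temp=15.0):
    """Toy phenology model: days to heading equals a thermal-time
    target (degree-days) divided by the mean daily temperature."""
    return tsum / mean_temp

def log_post(tsum, sigma=3.0):
    if not 600.0 < tsum < 1200.0:            # uniform prior bounds
        return -np.inf
    return -0.5 * np.sum((obs - predict(tsum)) ** 2) / sigma**2

# Random walk Metropolis: Gaussian proposal, accept with probability
# min(1, posterior ratio); the kept chain samples the full posterior.
chain = np.empty(20_000)
tsum, lp = 900.0, log_post(900.0)
for i in range(chain.size):
    prop = tsum + rng.normal(0.0, 10.0)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        tsum, lp = prop, lp_prop
    chain[i] = tsum

post = chain[5_000:]                         # discard burn-in
print(f"thermal-time target: {post.mean():.0f} +/- {post.std():.0f} degree-days")
```

    Keeping the full chain, rather than just a point estimate, is what allows the parameter-driven share of prediction uncertainty to be quantified.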

  16. The Intolerance of Uncertainty Inventory: Validity and Comparison of Scoring Methods to Assess Individuals Screening Positive for Anxiety and Depression.

    PubMed

    Lauriola, Marco; Mosca, Oriana; Trentini, Cristina; Foschi, Renato; Tambelli, Renata; Carleton, R Nicholas

    2018-01-01

    Intolerance of Uncertainty is a fundamental transdiagnostic personality construct hierarchically organized with a core general factor underlying diverse clinical manifestations. The current study evaluated the construct validity of the Intolerance of Uncertainty Inventory, a two-part scale separately assessing a unitary Intolerance of Uncertainty disposition to consider uncertainties unacceptable and threatening (Part A) and the consequences of such a disposition, regarding experiential avoidance, chronic doubt, overestimation of threat, worrying, control of uncertain situations, and seeking reassurance (Part B). Community members (N = 1046; mean age = 36.69 ± 12.31 years; 61% female) completed the Intolerance of Uncertainty Inventory along with the Beck Depression Inventory-II and the State-Trait Anxiety Inventory. Part A demonstrated a robust unidimensional structure and excellent convergent validity with Part B. A bifactor model was the best-fitting model for Part B. Based on these results, we compared the hierarchical factor scores with summated rating scores in clinical proxy groups reporting anxiety and depression symptoms. Summated rating scores were associated with both depression and anxiety and increased proportionally with the co-occurrence of depressive and anxious symptoms. By contrast, hierarchical scores were useful for detecting which facets best separated the depression and anxiety groups. In sum, Part A was a reliable and valid transdiagnostic measure of Intolerance of Uncertainty. Part B was arguably more useful for assessing clinical manifestations of Intolerance of Uncertainty for specific disorders, provided that hierarchical scores are used. Overall, our study suggests that clinical assessments might need to shift toward hierarchical factor scores.

  18. Proceedings of the NASA Conference on Space Telerobotics, volume 1

    NASA Technical Reports Server (NTRS)

    Rodriguez, Guillermo (Editor); Seraji, Homayoun (Editor)

    1989-01-01

    The theme of the Conference was man-machine collaboration in space. Topics addressed include: redundant manipulators; man-machine systems; telerobot architecture; remote sensing and planning; navigation; neural networks; fundamental AI research; and reasoning under uncertainty.

  19. Calculation of airborne radioactivity in a Technegas lung ventilation unit.

    PubMed

    López Medina, A; Miñano, J A; Terrón, J A; Bullejos, J A; Guerrero, R; Arroyo, T; Ramírez, A; Llamas, J M

    1999-12-01

    Airborne contamination by 99Tcm has been monitored in the Nuclear Medicine Department of our hospital to assess the risk of internal contamination to occupational workers exposed during Technegas studies. An air sampler fitted with a membrane filter was used. The optimum sampling time for obtaining the maximum activity in the filter was calculated. Maximum activity in the membrane filter ensures minimum uncertainty, which is especially important when low-level activities are being measured. The optimum time depends on the air sampling velocity, the room volume and the filter efficiency for isotope collection. It tends to 1/lambda (lambda = disintegration constant of 99Tcm) for large volumes and low velocities. Room activity with the air pump switched on was related to filter activity, and its variation with time was studied. The free activity in air for each study was approximately 7 × 10⁻⁴ of the activity used, and the effective half-life of the isotope in the room was 13.9 min (decay and diffusion). For a typical study (630 MBq), the effective dose to staff was 0.01 microSv for 10 min spent in the room.
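
    The optimum-sampling-time result quoted above follows from a simple competition between collection and decay. Below is a minimal sketch, assuming a well-mixed room model in which air concentration decays at lambda + k with k = eps*F/V; the filter efficiency eps, pump flow F and room volume V are hypothetical values, not the paper's. The analytic optimum ln(1 + k/lambda)/k indeed tends to 1/lambda for large volume and low sampling velocity, as the abstract states.

```python
import numpy as np

# Well-mixed room model (an assumption; the paper's exact model may differ).
# Air concentration decays as C(t) = C0*exp(-(lam + k)*t), where lam is the
# 99Tcm decay constant and k = eps*F/V is the removal rate by the pump.
# Filter activity A(t) = (eps*F*C0/k) * exp(-lam*t) * (1 - exp(-k*t))
# is maximised at t* = ln(1 + k/lam) / k, which tends to 1/lam as k -> 0.

lam = np.log(2) / (6.01 * 3600)      # 99Tcm decay constant, 1/s (6.01 h half-life)
eps, F, V = 0.9, 1e-3, 100.0         # assumed efficiency, flow (m3/s), room volume (m3)
k = eps * F / V

t_opt = np.log(1 + k / lam) / k
print(f"optimal sampling time: {t_opt/3600:.2f} h (1/lambda = {1/(lam*3600):.2f} h)")

# numerical check of the analytic optimum
t = np.linspace(1.0, 12 * 3600, 100000)
A = np.exp(-lam * t) * (1 - np.exp(-k * t)) / k   # up to the constant eps*F*C0
assert abs(t[np.argmax(A)] - t_opt) < t[1] - t[0]
```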

  20. The Vulnerability of People to Landslides: A Case Study on the Relationship between the Casualties and Volume of Landslides in China.

    PubMed

    Lin, Qigen; Wang, Ying; Liu, Tianxue; Zhu, Yingqi; Sui, Qi

    2017-02-21

    The lack of a detailed landslide inventory makes research on the vulnerability of people to landslides highly limited. In this paper, the authors collected information on the landslides that have caused casualties in China and established the Landslides Casualties Inventory of China. 100 landslide cases from 2003 to 2012 were utilized to develop an empirical relationship between the volume of a landslide event and the casualties caused by its occurrence. Error bars were used to describe the uncertainty of casualties resulting from landslides and to establish a threshold curve of casualties caused by landslides in China. The threshold curve was then applied to the landslide cases that occurred in 2013 and 2014. The validation results show that the casualties estimated from the threshold curve were in good agreement with the actual casualties, with only small deviations. The threshold curve can therefore be used for estimating potential casualties and landslide vulnerability, which is meaningful for emergency rescue operations after landslides occur and for risk assessment research.

  1. The Vulnerability of People to Landslides: A Case Study on the Relationship between the Casualties and Volume of Landslides in China

    PubMed Central

    Lin, Qigen; Wang, Ying; Liu, Tianxue; Zhu, Yingqi; Sui, Qi

    2017-01-01

    The lack of a detailed landslide inventory makes research on the vulnerability of people to landslides highly limited. In this paper, the authors collected information on the landslides that have caused casualties in China and established the Landslides Casualties Inventory of China. 100 landslide cases from 2003 to 2012 were utilized to develop an empirical relationship between the volume of a landslide event and the casualties caused by its occurrence. Error bars were used to describe the uncertainty of casualties resulting from landslides and to establish a threshold curve of casualties caused by landslides in China. The threshold curve was then applied to the landslide cases that occurred in 2013 and 2014. The validation results show that the casualties estimated from the threshold curve were in good agreement with the actual casualties, with only small deviations. The threshold curve can therefore be used for estimating potential casualties and landslide vulnerability, which is meaningful for emergency rescue operations after landslides occur and for risk assessment research. PMID:28230810
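
    The abstract describes an empirical volume-casualty relationship with an uncertainty envelope. The sketch below illustrates that general recipe on synthetic data (the inventory itself is not reproduced here): a power law fitted in log-log space plus a residual-based upper threshold curve. All numbers are made up.

```python
import numpy as np

# Illustrative sketch (synthetic data): fit casualties ~ a * volume^b in
# log-log space, then use the residual spread to draw an upper "threshold"
# curve of the kind described in the abstract.
rng = np.random.default_rng(0)
volume = 10 ** rng.uniform(2, 7, 100)                    # m3, hypothetical
casualties = np.maximum(1, (0.05 * volume ** 0.4) *
                        10 ** rng.normal(0, 0.3, 100))   # hypothetical

b, log_a = np.polyfit(np.log10(volume), np.log10(casualties), 1)
resid = np.log10(casualties) - (log_a + b * np.log10(volume))
sigma = resid.std(ddof=2)

def threshold(v, n_sigma=2.0):
    """Upper envelope: expected casualties plus n_sigma residual spread."""
    return 10 ** (log_a + b * np.log10(v) + n_sigma * sigma)

print(f"fit: casualties ~ {10**log_a:.3f} * V^{b:.2f}, "
      f"2-sigma threshold at V=1e5 m3: {threshold(1e5):.1f}")
```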

  2. Health risk assessment for nanoparticles: A case for using expert judgment

    NASA Astrophysics Data System (ADS)

    Kandlikar, Milind; Ramachandran, Gurumurthy; Maynard, Andrew; Murdock, Barbara; Toscano, William A.

    2007-01-01

    Uncertainties in conventional quantitative risk assessment typically relate to the values of parameters in risk models. For many environmental contaminants, there is a lack of sufficient information about multiple components of the risk assessment framework. In such cases, the use of default assumptions and extrapolations to fill in the data gaps is a common practice. Nanoparticle risks, however, pose a new form of risk assessment challenge. Besides a lack of data, there is deep scientific uncertainty regarding every aspect of the risk assessment framework: (a) particle characteristics that may affect toxicity; (b) their fate and transport through the environment; (c) the routes of exposure and the metrics by which exposure ought to be measured; (d) the mechanisms of translocation to different parts of the body; and (e) the mechanisms of toxicity and disease. In each of these areas, there are multiple and competing models and hypotheses. These are not merely parametric uncertainties but uncertainties about the choice of the causal mechanisms themselves and the proper model variables to be used, i.e., structural uncertainties. While these uncertainties exist for PM2.5 as well, risk assessment for PM2.5 has avoided dealing with these issues because of a plethora of epidemiological studies. However, such studies do not exist for the case of nanoparticles. Even if such studies are done in the future, they will be very specific to a particular type of engineered nanoparticle and not generalizable to other nanoparticles. Therefore, risk assessment for nanoparticles will have to deal with the various uncertainties that were avoided in the case of PM2.5. Consequently, uncertainties in estimating risks due to nanoparticle exposures may be characterized as 'extreme'. This paper proposes a methodology by which risk analysts can cope with such extreme uncertainty. One way to make these problems analytically tractable is to use expert judgment approaches to study the degree of consensus and/or disagreement among experts on different parts of the exposure-response paradigm. This can be done by eliciting judgments from a wide range of experts on different parts of the risk causal chain. We also use examples to illustrate how studying expert consensus/disagreement helps in research prioritization and budget allocation exercises. The expert elicitation can be repeated over the course of several years, over which time the state of scientific knowledge will improve and uncertainties may diminish. Results from the expert elicitation exercise can be used by risk managers or managers of funding agencies as a tool for research prioritization.
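
    As a concrete illustration of one simple way to combine elicited judgments, the sketch below averages experts' quantiles and contrasts between-expert spread with within-expert uncertainty. This is a generic aggregation device, not necessarily the methodology the paper proposes, and all elicited values are hypothetical.

```python
import numpy as np

# Averaging experts' quantiles is one simple aggregation approach (a
# linear opinion pool would average probabilities instead). Each expert
# gives (5th, 50th, 95th) percentiles for a toxicity-relevant parameter;
# disagreement is summarised by the spread of the experts' medians
# relative to their own stated uncertainty.
percentiles = np.array([          # hypothetical elicited values
    [0.10, 1.0, 10.0],
    [0.50, 2.0,  8.0],
    [0.05, 0.3,  3.0],
])
weights = np.ones(len(percentiles)) / len(percentiles)  # equal weights

pooled = weights @ percentiles                           # pooled percentile estimates
medians = percentiles[:, 1]
within = np.mean(percentiles[:, 2] - percentiles[:, 0])  # mean individual range
between = medians.max() - medians.min()                  # spread across experts

print("pooled (5th, 50th, 95th):", pooled)
print(f"between-expert spread / within-expert range: {between/within:.2f}")
```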

  3. Impact of input data uncertainty on environmental exposure assessment models: A case study for electromagnetic field modelling from mobile phone base stations.

    PubMed

    Beekhuizen, Johan; Heuvelink, Gerard B M; Huss, Anke; Bürgi, Alfred; Kromhout, Hans; Vermeulen, Roel

    2014-11-01

    With the increased availability of spatial data and computing power, spatial prediction approaches have become a standard tool for exposure assessment in environmental epidemiology. However, such models are largely dependent on accurate input data. Uncertainties in the input data can therefore have a large effect on model predictions, but they are rarely quantified. Using Monte Carlo simulation, we assessed the effect of input uncertainty on the prediction of radio-frequency electromagnetic fields (RF-EMF) from mobile phone base stations at 252 receptor sites in Amsterdam, The Netherlands. The impact on ranking and classification was determined by computing the Spearman correlations and weighted Cohen's kappas (based on tertiles of the RF-EMF exposure distribution) between modelled values and RF-EMF measurements performed at the receptor sites. The uncertainty in modelled RF-EMF levels was large, with a median coefficient of variation of 1.5. Uncertainty in receptor site height, building damping and building height contributed most to model output uncertainty. For exposure ranking and classification, the heights of buildings and receptor sites were the most important sources of uncertainty, followed by building damping and antenna and site locations. Uncertainty in antenna power, tilt, height and direction had a smaller impact on model performance. We quantified the effect of input data uncertainty on the prediction accuracy of an RF-EMF environmental exposure model, thereby identifying the most important sources of uncertainty and estimating the total uncertainty stemming from potential errors in the input data. This approach can be used to optimize the model and better interpret model output. Copyright © 2014 Elsevier Inc. All rights reserved.
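
    The general Monte Carlo recipe described above can be sketched compactly: draw the uncertain inputs, rerun the model, and summarise the effect on ranking agreement. The toy model, input distributions and site count below are assumptions for illustration, not the study's actual propagation chain.

```python
import numpy as np
from scipy.stats import spearmanr

# Sketch of Monte Carlo input-uncertainty propagation: perturb an
# uncertain model input, rerun a toy exposure model, and track how the
# rank agreement with "measurements" degrades. Everything is synthetic.
rng = np.random.default_rng(1)
n_sites, n_draws = 252, 1000

true_field = rng.lognormal(0.0, 1.0, n_sites)            # hypothetical RF-EMF levels
measured = true_field * rng.lognormal(0.0, 0.1, n_sites) # measurement noise

rho = np.empty(n_draws)
for i in range(n_draws):
    input_error = rng.lognormal(0.0, 0.5, n_sites)       # assumed input uncertainty
    modelled = true_field * input_error                  # toy model output
    rho[i] = spearmanr(modelled, measured).correlation

print(f"Spearman rho across draws: median={np.median(rho):.2f}, "
      f"5-95%: {np.percentile(rho, 5):.2f}-{np.percentile(rho, 95):.2f}")
```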

  4. Uncertainty of a hydrological climate change impact assessment - Is it really all about climate uncertainty?

    NASA Astrophysics Data System (ADS)

    Honti, Mark; Reichert, Peter; Scheidegger, Andreas; Stamm, Christian

    2013-04-01

    Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with a further boost after the publication of the IPCC AR4 report. Over hundreds of impact studies a quasi-standard methodology has emerged, shaped mainly by the growing public demand for predicting how water resources management or flood protection should change in the near future. The "standard" workflow considers future climate under a specific IPCC emission scenario simulated by global circulation models (GCMs), possibly downscaled by a regional climate model (RCM) and/or a stochastic weather generator. The output from the climate models is typically corrected for bias before feeding it into a calibrated hydrological model, which is run on the past and future meteorological data to analyse the impacts of climate change on the hydrological indicators of interest. The impact predictions are as uncertain as any forecast that tries to describe the behaviour of an extremely complex system decades into the future. Future climate predictions are uncertain due to scenario uncertainty and to GCM model uncertainty, which is obvious at any resolution finer than continental scale. As in any hierarchical model system, uncertainty propagates through the descendant components. Downscaling adds the deficiencies of RCMs and/or weather generators. Bias correction adds a strong deterministic shift to the input data. Finally, the predictive uncertainty of the hydrological model ends the cascade that leads to the total uncertainty of the hydrological impact assessment. There is an emerging consensus among many studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. Only a few studies have found that the predictive uncertainty of hydrological models can be in the same range as, or even larger than, climatic uncertainty. We carried out a climate change impact assessment and estimated the relative importance of the uncertainty sources. The study was performed on two small catchments in the Swiss Plateau with a lumped conceptual rainfall-runoff model. In the climatic part we applied the standard ensemble approach to quantify uncertainty, but in hydrology we used a formal Bayesian uncertainty assessment method with two different likelihood functions. One was a time-series error model able to deal with the complicated statistical properties of hydrological model residuals; the second was a likelihood function for the flow quantiles directly. Thanks to better data coverage and smaller hydrological complexity in one of our test catchments, the hydrological model performed better there, and we could observe that the relative importance of the different uncertainty sources varied between sites, boundary conditions and flow indicators. The uncertainty of future climate was important, but not dominant. The deficiencies of the hydrological model were on the same scale, especially for the sites and flow components where model performance for past observations was further from optimal (Nash-Sutcliffe index = 0.5-0.7). The overall uncertainty of the predictions was well beyond the expected change signal even for the best performing site and flow indicator.
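
    A toy version of the underlying bookkeeping, comparing the climate-ensemble spread with the hydrological predictive spread, is sketched below. It is purely illustrative: the study used a formal Bayesian assessment, which this two-term variance split does not reproduce.

```python
import numpy as np

# Toy decomposition of prediction uncertainty into a climate-ensemble
# component and a hydrological-model component. All magnitudes are
# invented for illustration.
rng = np.random.default_rng(2)
n_members, n_param = 10, 500

climate_signal = rng.normal(0.0, 8.0, n_members)           # % change per GCM chain
hydro_draws = rng.normal(0.0, 10.0, (n_members, n_param))  # hydrological predictive spread

var_climate = climate_signal.var(ddof=1)
var_hydro = hydro_draws.var(ddof=1)

print(f"share of variance - climate: {var_climate/(var_climate+var_hydro):.0%}, "
      f"hydrology: {var_hydro/(var_climate+var_hydro):.0%}")
```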

  5. Quantification of the effect of electrical and thermal parameters on radiofrequency ablation for concentric tumour model of different sizes.

    PubMed

    Jamil, Muhammad; Ng, E Y K

    2015-07-01

    Radiofrequency ablation (RFA) has been increasingly used to treat cancer in a multitude of situations and tissue types. To perform the therapy safely and reliably, the effect of critical parameters needs to be known beforehand. Temperature plays an important role in the outcome of the therapy, and any uncertainty in temperature assessment can be lethal. This study presents the RFA case of fixed tip temperature, in which we analysed the effect of the electrical conductivity, thermal conductivity and blood perfusion rate of the tumour and of the surrounding normal tissue on the ablation. Ablation volume was chosen as the characteristic to be optimised, and temperature control was achieved via a PID controller. The effect of all six parameters, each having three levels, was quantified with a minimum number of experiments by harnessing the fractional factorial character of Taguchi's orthogonal arrays. It was observed that as blood perfusion increases, the ablation volume decreases. Increasing the electrical conductivity of the tumour increases the ablation volume, whereas increasing the electrical conductivity of the normal tissue tends to decrease it, and vice versa. Likewise, increasing the thermal conductivity of the tumour enhances the ablation volume, whereas increasing the thermal conductivity of the surrounding normal tissue reduces it, and vice versa. As the tumour size increases (i.e., from 2 to 3 cm), the effect of each parameter is not linear: the parameter effects vary with tumour size, as manifested by the different gradients observed in ablation volume. Most important is the relative insensitivity of the ablation volume to the blood perfusion rate for the smaller tumour size (2 cm), which is also in accordance with previous results in the literature. These findings provide initial insight for safe, reliable and improved treatment planning. Copyright © 2015 Elsevier Ltd. All rights reserved.
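
    The design logic referred to above, estimating the main effects of six three-level factors, can be sketched as follows. For simplicity the sketch runs a full 3^6 factorial with a made-up response; a Taguchi orthogonal array (e.g., L27) recovers the same main effects from far fewer runs.

```python
import numpy as np
from itertools import product

# Main-effects analysis of six 3-level factors. The response below is a
# made-up stand-in for the simulated ablation volume, not the paper's model.
levels = np.array([-1, 0, 1])
factors = ["sigma_tum", "sigma_tis", "k_tum", "k_tis", "w_tum", "w_tis"]
design = np.array(list(product(levels, repeat=6)))   # 729 runs (full factorial)

# hypothetical response: volume grows with tumour conductivities,
# shrinks with perfusion and normal-tissue conductivities
coef = np.array([0.8, -0.5, 0.6, -0.4, -1.0, -0.3])
volume = 10 + design @ coef + np.random.default_rng(3).normal(0, 0.1, len(design))

# main effect of a factor: spread of the mean response across its levels
for j, name in enumerate(factors):
    means = [volume[design[:, j] == lev].mean() for lev in levels]
    print(f"{name:10s} effect range: {max(means) - min(means):.2f}")
```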

  6. Uncertainty indication in soil function maps - transparent and easy-to-use information to support sustainable use of soil resources

    NASA Astrophysics Data System (ADS)

    Greiner, Lucie; Nussbaum, Madlene; Papritz, Andreas; Zimmermann, Stephan; Gubler, Andreas; Grêt-Regamey, Adrienne; Keller, Armin

    2018-05-01

    Spatial information on soil function fulfillment (SFF) is increasingly being used to inform decision-making in spatial planning programs and to support the sustainable use of soil resources. Soil function maps visualize soils' abilities to fulfill their functions, e.g., regulating water and nutrient flows, providing habitats, and supporting biomass production, based on soil properties. Such information must be reliable for informed and transparent decision-making in spatial planning programs. In this study, we add to the transparency of soil function maps by (1) indicating uncertainties arising from the prediction of soil properties generated by digital soil mapping (DSM) that are used for soil function assessment (SFA) and (2) showing the response of different SFA methods to the propagation of uncertainties through the assessment. For a study area of 170 km2 in the Swiss Plateau, we map 10 static soil sub-functions for agricultural soils at a spatial resolution of 20 × 20 m, together with their uncertainties. Mapping the 10 soil sub-functions using simple ordinal assessment scales reveals pronounced spatial patterns with a high variability of SFF scores across the region, linked to the inherent properties of the soils, terrain attributes and climate conditions. Uncertainties in soil properties propagated through SFA methods generally lead to substantial uncertainty in the mapped soil sub-functions. We propose two types of uncertainty maps that can be readily understood by stakeholders. Cumulative distribution functions of SFF scores indicate that SFA methods respond differently to the propagated uncertainty of soil properties. Even where methods are comparable in complexity and assessment scale, their comparability in view of uncertainty propagation might differ. We conclude that comparable uncertainty indications in soil function maps are relevant to enable informed and transparent decisions on the sustainable use of soil resources.
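
    One way to picture the propagation step is sketched below: draw realizations of a predicted soil property from its DSM error model and push each through an ordinal assessment scale. The property, its standard error and the class thresholds are placeholders, not the study's SFA methods.

```python
import numpy as np

# Propagate predicted-soil-property uncertainty through an ordinal
# assessment scale (illustrative thresholds). The spread of scores across
# draws indicates the uncertainty of the mapped sub-function at one pixel.
rng = np.random.default_rng(4)
pred_clay, se_clay = 22.0, 6.0           # DSM prediction and standard error (%)
draws = rng.normal(pred_clay, se_clay, 5000)

thresholds = [15, 25, 35]                # hypothetical class limits
scores = np.digitize(draws, thresholds)  # ordinal SFF score 0..3

values, counts = np.unique(scores, return_counts=True)
for v, c in zip(values, counts):
    print(f"score {v}: {c/len(draws):.0%}")
```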

  7. Why Quantify Uncertainty in Ecosystem Studies: Obligation versus Discovery Tool?

    NASA Astrophysics Data System (ADS)

    Harmon, M. E.

    2016-12-01

    There are multiple motivations for quantifying uncertainty in ecosystem studies. One is obligation; the other is its use as a tool for moving ecosystem science toward discovery. While reporting uncertainty should become a routine expectation, the more convincing motivation involves discovery. By clarifying what is known and to what degree it is known, uncertainty analyses can point the way toward improvements in measurements, sampling designs, and models. While some of these improvements (e.g., better sampling designs) may lead to incremental gains, those involving models (particularly model selection) may require large gains in knowledge. For uncertainty analysis to be fully harnessed as a discovery tool, attitudes toward uncertainty may have to change: rather than viewing uncertainty as a negative assessment of what was done, it should be viewed as a positive, helpful assessment of what remains to be done.

  8. Offshore Storage Resource Assessment - Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savage, Bill; Ozgen, Chet

    The DOE-developed volumetric equation for estimating Prospective Resources (CO2 storage) in oil and gas reservoirs was applied to each depleted field in the Federal Gulf of Mexico (GOM). This required assessment of the in-situ hydrocarbon fluid volumes for the fields under evaluation. The project utilized public data from the U.S. Department of the Interior, Bureau of Ocean Energy Management (BOEM) Reserves database and from a large, well-reputed database (250,000+ wells) of GOM well and production data marketed by IHS, Inc. IHS-interpreted structure map files were also accessed for a limited number of fields. The databases were used along with geological and petrophysical software to identify depleted oil and gas fields in the Federal GOM region. BOEM arranged access for the project team to proprietary reservoir-level maps under an NDA. Review of BOEM's Reserves database as of December 31, 2013 indicated that 675 fields in the region were depleted. NITEC identified and ranked these 675 fields, containing 3,514 individual reservoirs, based on the estimated OOIP or OGIP values available in the Reserves database. The estimated BOEM OOIP or OGIP values for five fields were validated by an independent evaluation using available petrophysical, geologic and engineering data in the databases. Once this validation was completed, the BOEM-ranked list was used to calculate the estimated CO2 storage volume for each field/reservoir using the DOE CO2 Resource Estimate Equation. This calculation assumed a range for the CO2 efficiency factor in the equation, as it was not known at that point in time. NITEC then utilized reservoir simulation to further refine the range of CO2 storage volumes estimated with the DOE equation. NITEC used a purpose-built, publicly available, 4-component, compositional reservoir simulator developed under DOE funding (DE-FE0006015) to assess CO2-EOR and CO2 storage in 73 fields/461 reservoirs. This simulator was fast and easy to use and provided a valuable refinement of the estimated CO2 storage volume for each reservoir simulated. The user interface was expanded to allow calculation of a probability-based assessment of the CO2 storage volume, based on typical uncertainties in operating conditions and reservoir properties during the CO2 injection period. This modeling of the CO2 storage estimates for the simulated reservoirs yielded correlations applicable to all reservoir types (a refined DOE equation) which can be used for predictive purposes with available public data. Application of the correlations to the 675 depleted fields yielded a total CO2 storage capacity of 4,748 MM tons. The CO2 storage assessments were supplemented with simulation modeling of eleven (11) oil reservoirs that quantified the change in stored CO2 volume when CO2-EOR (Enhanced Oil Recovery) production is added. Application of CO2-EOR to oil reservoirs resulted in higher volumes of CO2 storage.
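
    The volumetric logic, storage mass proportional to the in-situ hydrocarbon pore volume, CO2 density and an efficiency factor, can be sketched as below. This is a hedged simplification of the general DOE-equation form; the actual equation includes further terms, and every number here is a placeholder.

```python
# Hedged sketch of a volumetric CO2 storage estimate of the general form
# used by DOE methodologies: mass = hydrocarbon-occupied reservoir volume
# x CO2 density x storage efficiency. All inputs are placeholders.
OOIP_stb = 50e6                 # original oil in place, stock-tank barrels (assumed)
Bo = 1.2                        # oil formation volume factor, rb/stb (assumed)
rho_co2 = 0.65                  # CO2 density at reservoir conditions, t/m3 (assumed)
bbl_to_m3 = 0.158987

reservoir_volume_m3 = OOIP_stb * Bo * bbl_to_m3

# scan a range of CO2 efficiency factors, as the study did before the
# simulator-based refinement fixed the value
for eff in (0.4, 0.6, 0.8):
    mass_mt = reservoir_volume_m3 * rho_co2 * eff / 1e6
    print(f"E = {eff:.0%}: ~{mass_mt:.1f} million tonnes CO2")
```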

  9. Three-dimensional dose verification of the clinical application of gamma knife stereotactic radiosurgery using polymer gel and MRI.

    PubMed

    Papagiannis, P; Karaiskos, P; Kozicki, M; Rosiak, J M; Sakelliou, L; Sandilos, P; Seimenis, I; Torrens, M

    2005-05-07

    This work seeks to verify multi-shot clinical applications of stereotactic radiosurgery with a Leksell Gamma Knife model C unit, employing a polymer gel-MRI based experimental procedure which has already been shown capable of verifying the precision and accuracy of dose delivery in single-shot gamma knife applications. The treatment plan studied in the present work resembles a clinical treatment of pituitary adenoma using four 8 mm and one 14 mm collimator helmet shots to deliver a prescription dose of 15 Gy to the 50% isodose line (30 Gy maximum dose). For the experimental dose verification of the treatment plan, the same criteria as those used in the clinical treatment planning evaluation were employed. These included comparison of measured and GammaPlan-calculated data in terms of percentage isodose contours on axial, coronal and sagittal planes, as well as 3D plan evaluation criteria such as dose-volume histograms for the target volume, target coverage and conformity indices. Measured percentage isodose contours compared favourably with calculated ones despite individual point fluctuations at low dose contours (e.g., 20%), mainly due to the effect of T2 measurement uncertainty on dose resolution. Dose-volume histogram data were also found to be in good agreement, while the experimental results for the percentage target coverage and conformity index were 94% and 1.17, relative to corresponding GammaPlan calculations of 96% and 1.12, respectively. Overall, the polymer gel results verified the planned dose distribution within the experimental uncertainties and the uncertainty related to the digitization of selected GammaPlan output data.
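
    The two 3D evaluation metrics mentioned, target coverage and conformity index, can be computed directly on voxel grids, as in the sketch below with a synthetic spherical dose distribution. The PITV-style conformity definition used here (prescription isodose volume over target volume) is a common one and may differ in detail from GammaPlan's.

```python
import numpy as np

# Synthetic dose grid: target coverage = fraction of target voxels at or
# above the prescription dose; conformity = prescription isodose volume
# divided by target volume. All geometry and doses are made up.
n = 64
x, y, z = np.meshgrid(*(np.linspace(-1, 1, n),) * 3, indexing="ij")
r = np.sqrt(x**2 + y**2 + z**2)

dose = 30.0 * np.exp(-(r / 0.45) ** 2)   # synthetic dose, 30 Gy maximum
target = r < 0.35                        # synthetic spherical target
rx = 15.0                                # prescription: 15 Gy at the 50% isodose

coverage = (dose[target] >= rx).mean()
conformity = (dose >= rx).sum() / target.sum()
print(f"target coverage: {coverage:.0%}, conformity index: {conformity:.2f}")
```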

  10. Storage flux uncertainty impact on eddy covariance net ecosystem exchange measurements

    NASA Astrophysics Data System (ADS)

    Nicolini, Giacomo; Aubinet, Marc; Feigenwinter, Christian; Heinesch, Bernard; Lindroth, Anders; Mamadou, Ossénatou; Moderow, Uta; Mölder, Meelis; Montagnani, Leonardo; Rebmann, Corinna; Papale, Dario

    2017-04-01

    Subject to several assumptions and simplifications, most carbon budget studies based on eddy covariance (EC) measurements quantify the net ecosystem exchange (NEE) by summing the flux obtained by EC (Fc) and the storage flux (Sc). Sc is the rate of change of CO2 within the so-called control volume below the EC measurement level, given by the difference between the instantaneous concentration profiles at the beginning and end of the EC averaging period, divided by the averaging period. While Sc tends to cancel out when accumulated over time, it can be significant over short periods. The approaches used to estimate Sc vary widely, from measurements based on a single sampling point (usually located at the EC measurement height) to measurements based on several sampling profiles distributed within the control volume. Furthermore, the number of sampling points within each profile varies according to profile height and ecosystem type. It follows that measurement accuracy increases with the sampling intensity within the control volume. In this work we use the experimental dataset collected during the ADVEX campaign, in which the Sc flux was measured at three similar forest sites using 5 sampling profiles (towers). Our main objective is to quantify the impact of Sc measurement uncertainty on NEE estimates. Results show that different methods may produce substantially different Sc estimates, with problematic consequences where high-frequency (half-hourly) data are needed for the analysis. The uncertainty in long-term estimates, however, may be tolerable.
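
    The definition of Sc given above translates directly into code: integrate the change in CO2 concentration over the air column below the EC level and divide by the averaging period. Profile heights, concentrations and the molar density in the sketch are placeholder values.

```python
import numpy as np

# Single-profile storage-flux estimate: Sc is the change in the CO2 amount
# stored in the air column below the EC level between the start and end of
# the averaging period, divided by that period.
z = np.array([0.5, 2.0, 5.0, 10.0, 20.0, 30.0])       # sampling heights, m
c_start = np.array([420, 415, 410, 405, 402, 400.0])  # ppm at period start
c_end = np.array([430, 424, 416, 409, 404, 401.0])    # ppm at period end

dt = 1800.0                    # averaging period, s (half-hourly)
molar_density = 41.6           # mol air per m3, approx. near the surface

dc = (c_end - c_start) * molar_density                # micromol CO2 per m3
# trapezoidal column integral, then divide by the averaging period
Sc = np.sum((dc[1:] + dc[:-1]) / 2 * np.diff(z)) / dt # micromol m-2 s-1
print(f"Sc = {Sc:.2f} umol m-2 s-1")
```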

  11. Making Invasion models useful for decision makers; incorporating uncertainty, knowledge gaps, and decision-making preferences

    Treesearch

    Denys Yemshanov; Frank H Koch; Mark Ducey

    2015-01-01

    Uncertainty is inherent in model-based forecasts of ecological invasions. In this chapter, we explore how the perceptions of that uncertainty can be incorporated into the pest risk assessment process. Uncertainty changes a decision maker’s perceptions of risk; therefore, the direct incorporation of uncertainty may provide a more appropriate depiction of risk. Our...

  12. Uncertainty Assessment: What Good Does it Do? (Invited)

    NASA Astrophysics Data System (ADS)

    Oreskes, N.; Lewandowsky, S.

    2013-12-01

    The scientific community has devoted considerable time and energy to understanding, quantifying and articulating the uncertainties related to anthropogenic climate change. However, informed decision-making and good public policy arguably rely far more on a central core of understanding of matters that are scientifically well established than on detailed understanding and articulation of all relevant uncertainties. Advocates of vaccination, for example, stress its overall efficacy in preventing morbidity and mortality, not the uncertainties over how long the protective effects last. Advocates of colonoscopy for cancer screening stress its capacity to detect polyps before they become cancerous, with relatively little attention paid to the fact that many, if not most, polyps would not become cancerous even if left unremoved. So why has the climate science community spent so much time focused on uncertainty? One reason, of course, is that articulation of uncertainty is a normal and appropriate part of scientific work. However, we argue that there is another reason that involves the pressure the scientific community has experienced from individuals and groups promoting doubt about anthropogenic climate change. Specifically, doubt-mongering groups focus public attention on scientific uncertainty as a means to undermine scientific claims, equating uncertainty with untruth. Scientists inadvertently validate these arguments by agreeing that much of the science is uncertain, thus seemingly implying that our knowledge is insecure. The problem goes further, as the scientific community attempts to articulate more clearly, and reduce, those uncertainties, thus seemingly further agreeing that the knowledge base is insufficient to warrant public and governmental action. We refer to this effect as 'seepage', as the effects of doubt-mongering seep into the scientific community and the scientific agenda, despite the fact that addressing these concerns does little to alter the public debate or advance public policy. We argue that attempts to address public doubts by improving uncertainty assessment are bound to fail, insofar as the motives for doubt-mongering are independent of scientific uncertainty and therefore remain unaffected even as those uncertainties are diminished. We illustrate this claim by considering the evolution of the debate over the past ten years on the relationship between hurricanes and anthropogenic climate change. We suggest that scientists should pursue uncertainty assessment if such assessment improves scientific understanding, but not as a means to reduce public doubts or advance public policy in relation to anthropogenic climate change.

  13. Environmental impact and risk assessments and key factors contributing to the overall uncertainties.

    PubMed

    Salbu, Brit

    2016-01-01

    There are a significant number of nuclear and radiological sources that have contributed, are still contributing, or could in the future contribute to radioactive contamination of the environment. To protect the environment from radioactive contamination, impact and risk assessments are performed prior to or during a release event, in the short or long term after deposition, or before and after implementation of countermeasures. When environmental impacts and risks are assessed, however, a series of factors contribute to the overall uncertainties. Environmental impact and risk assessments require information on processes, kinetics and a series of input variables. Together with variability, questionable assumptions, gaps in knowledge, extrapolations and poor conceptual model structures, these factors contribute to large and often unacceptable uncertainties in impact and risk assessments. Information on the source term and the release scenario is an essential starting point in impact and risk models; the source determines the activity concentrations and atom ratios of the radionuclides released, while the release scenario determines the physico-chemical forms of the released radionuclides, such as particle size distribution, structure and density. Releases will most often contain other contaminants such as metals, and due to interactions, contaminated sites should be assessed as multiple-stressor scenarios. Following deposition, a series of stressors, interactions and processes influence the ecosystem transfer of radionuclide species and thereby biological uptake (toxicokinetics) and responses (toxicodynamics) in exposed organisms. Due to the variety of biological species, extrapolation is frequently needed to fill gaps in knowledge, e.g., from effects to no effects, from effects in one organism to others, or from one stressor to mixtures. Most toxicity tests are, however, performed as short-term exposures of adult organisms, ignoring sensitive life history stages and transgenerational effects. To link sources, ecosystem transfer and biological effects to future impacts and risks, a series of models is usually interfaced, while uncertainty estimates are seldom given. Model predictions are only valid within the boundaries of the overall uncertainties, and they are only useful and relevant when uncertainties are estimated, communicated and understood. Among the key factors contributing most to uncertainties, the present paper focuses especially on structural uncertainties (model bias or discrepancies), as aspects such as particle releases, ecosystem dynamics, mixed exposure, sensitive life history stages and transgenerational effects are usually ignored in assessment models. Research focused on these aspects should significantly reduce the overall uncertainties in the impact and risk assessment of radioactively contaminated ecosystems. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Explaining the discrepancy between forced fold amplitude and sill thickness.

    NASA Astrophysics Data System (ADS)

    Hoggett, Murray; Jones, Stephen M.; Reston, Timothy; Magee, Craig; Jackson, Christopher AL

    2017-04-01

    Understanding the behaviour of Earth's surface in response to movement and emplacement of magma underground is important because it assists calculation of subsurface magma volumes, and could feed into eruption forecasting. Studies of seismic reflection data have observed that the amplitude of a forced fold above an igneous sill is usually smaller than the thickness of the sill itself. This observation implies that fold amplitude alone provides only a lower bound for magma volume, and an understanding of the mechanism(s) behind the fold amplitude/sill thickness discrepancy is also required to obtain a true estimate of magma volume. Mechanisms suggested to explain the discrepancy include problems with seismic imaging and varying strain behaviour of the host rock. Here we examine the extent to which host-rock compaction can explain the fold amplitude/sill thickness discrepancy. This mechanism operates in cases where a sill is injected into the upper few kilometres of sedimentary rock that contain significant porosity. Accumulation of sediment after sill intrusion reduces the amplitude of the forced fold by compaction, but the sill itself undergoes little compaction since its starting porosity is almost zero. We compiled a database of good-quality 2D and 3D seismic observations where sill thickness has been measured independently of forced fold geometry. We then backstripped the post-intrusion sedimentary section to reconstruct the amplitude of the forced fold at the time of intrusion. We used the standard compaction model in which porosity decays exponentially below the sediment surface. In all examples we studied, post-sill-emplacement compaction can explain all of the fold amplitude/sill thickness discrepancy, subject to uncertainty in compaction model parameters. This result leads directly to an improved method of predicting magma volume from fold amplitude, including how uncertainty in compaction parameters maps onto uncertainty in magma volume. Our work implies that host-rock deformation at the time of magma intrusion is less important than post-intrusion pure-shear compaction in response to ongoing sediment accumulation. This inference could be tested in cases where an independent direct measurement of the porosity-depth profile overlying the sill is available to better constrain compaction model parameters.
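
    The decompaction step at the core of this argument can be sketched as follows: conserve the grain (solid) thickness of the folded interval under the exponential porosity-depth model phi(z) = phi0*exp(-z/L) and solve for the interval's thickness at deposition. The porosity parameters and fold geometry below are typical placeholders, not the paper's values.

```python
import numpy as np
from scipy.optimize import brentq

# Backstripping-style decompaction: restore a present-day fold amplitude
# to its value at the time of intrusion by conserving solid thickness
# under phi(z) = phi0 * exp(-z/L). Shale-like placeholder parameters.
phi0, L = 0.6, 2000.0                     # surface porosity, decay length (m)

def solid_thickness(z_top, z_bot):
    """Grain thickness of the interval (z_top, z_bot), analytic integral."""
    return (z_bot - z_top) - phi0 * L * (np.exp(-z_top / L) - np.exp(-z_bot / L))

# present-day folded interval: top at 1500 m, amplitude (thickness) 80 m
z_top, amp_now = 1500.0, 80.0
grains = solid_thickness(z_top, z_top + amp_now)

# original amplitude: same grain thickness with the interval at the surface
amp_orig = brentq(lambda T: solid_thickness(0.0, T) - grains, grains, 10 * amp_now)
print(f"fold amplitude today: {amp_now:.0f} m, restored at intrusion: {amp_orig:.0f} m")
```

    With these placeholder parameters the restored amplitude is appreciably larger than the present-day one, which is the sense of the discrepancy the abstract sets out to explain.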

  15. Exploratory Study of 4D Versus 3D Robust Optimization in Intensity-Modulated Proton Therapy for Lung Cancer

    PubMed Central

    Liu, Wei; Schild, Steven E.; Chang, Joe Y.; Liao, Zhongxing; Chang, Yu-Hui; Wen, Zhifei; Shen, Jiajian; Stoker, Joshua B.; Ding, Xiaoning; Hu, Yanle; Sahoo, Narayan; Herman, Michael G.; Vargas, Carlos; Keole, Sameer; Wong, William; Bues, Martin

    2015-01-01

    Background To compare the impact of uncertainties and the interplay effect on 3D and 4D robustly optimized intensity-modulated proton therapy (IMPT) plans for lung cancer in an exploratory methodology study. Methods IMPT plans were created for 11 non-randomly selected non-small-cell lung cancer (NSCLC) cases: 3D robustly optimized plans on average CTs with internal gross tumor volume density overridden to irradiate the internal target volume, and 4D robustly optimized plans on 4D CTs to irradiate the clinical target volume (CTV). Regular fractionation (66 Gy[RBE] in 33 fractions) was considered. In 4D optimization, the CTV of individual phases received non-uniform doses to achieve a uniform cumulative dose. The root-mean-square-dose volume histograms (RVH) measured the sensitivity of the dose to uncertainties, and the areas under the RVH curve (AUCs) were used to evaluate plan robustness. Dose evaluation software modeled time-dependent spot delivery to incorporate the interplay effect with randomized starting phases of each field per fraction. Dose-volume histogram indices comparing CTV coverage, homogeneity, and normal tissue sparing were evaluated using the Wilcoxon signed-rank test. Results 4D robust optimization led to smaller AUCs for the CTV [14.26 vs. 18.61 (p=0.001)], better CTV coverage (Gy[RBE]) [D95% CTV: 60.6 vs. 55.2 (p=0.001)], and better CTV homogeneity [D5%-D95% CTV: 10.3 vs. 17.7 (p=0.002)] in the face of uncertainties. With the interplay effect considered, 4D robust optimization produced plans with better target coverage [D95% CTV: 64.5 vs. 63.8 (p=0.0068)], comparable target homogeneity, and comparable normal tissue protection. The benefits from 4D robust optimization were most obvious for the 2 typical stage III lung cancer patients. Conclusions Our exploratory methodology study showed that, compared to 3D robust optimization, 4D robust optimization produced significantly more robust and interplay-effect-resistant plans for targets, with comparable dose distributions for normal tissues. A further study with a larger and more realistic patient population is warranted to generalize the conclusions. PMID:26725727
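
    A minimal sketch of the RVH-based robustness score described above (per-voxel RMS dose deviation across uncertainty scenarios, then the area under the volume-vs-RMS-dose curve) is given below on synthetic doses; a smaller AUC indicates a more robust plan. The exact RVH construction in the study may differ in detail.

```python
import numpy as np

# RVH robustness sketch: RMS dose deviation per voxel across scenarios,
# curve "fraction of CTV with RMS deviation >= d", AUC as a scalar score.
# All doses are synthetic placeholders.
rng = np.random.default_rng(5)
n_voxels, n_scenarios = 5000, 9
nominal = np.full(n_voxels, 66.0)                       # nominal CTV dose, Gy[RBE]
scenario_dose = nominal + rng.normal(0, 2.0, (n_scenarios, n_voxels))

rms = np.sqrt(((scenario_dose - nominal) ** 2).mean(axis=0))

d = np.linspace(0, rms.max(), 200)
frac = np.array([(rms >= di).mean() for di in d])
auc = np.sum((frac[1:] + frac[:-1]) / 2 * np.diff(d))   # trapezoidal AUC
print(f"RVH AUC: {auc:.2f} Gy")
```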

  16. Variability of blood alcohol content (BAC) determinations: the role of measurement uncertainty, significant figures, and decision rules for compliance assessment in the frame of a multiple BAC threshold law.

    PubMed

    Zamengo, Luca; Frison, Giampietro; Tedeschi, Gianpaola; Frasson, Samuela; Zancanaro, Flavio; Sciarrone, Rocco

    2014-10-01

    The measurement of blood-alcohol content (BAC) is a crucial analytical determination required to assess whether an offence (e.g. driving under the influence of alcohol) has been committed. For various reasons, results of forensic alcohol analysis are often challenged by the defence. As a consequence, measurement uncertainty becomes a critical topic when assessing compliance with specification limits for forensic purposes. The aims of this study were: (1) to investigate major sources of variability in BAC determinations; (2) to estimate measurement uncertainty for routine BAC determinations; (3) to discuss the role of measurement uncertainty in compliance assessment; (4) to set decision rules for a multiple BAC threshold law, as provided in the Italian Highway Code; (5) to address the topic of the zero-alcohol limit from the forensic toxicology point of view; and (6) to discuss the role of significant figures and rounding errors in measurement uncertainty and compliance assessment. Measurement variability was investigated by analysing data collected from real cases and internal quality control. The contribution of both pre-analytical and analytical processes to measurement variability was considered. The resulting expanded measurement uncertainty was 8.0%. Decision rules for the multiple BAC threshold Italian law were set by adopting a guard-banding approach. 0.1 g/L was chosen as the cut-off level to assess compliance with the zero-alcohol limit. The role of significant figures and rounding errors in compliance assessment was discussed through examples that stressed the importance of these topics for forensic purposes. Copyright © 2014 John Wiley & Sons, Ltd.
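
    A guard-banded decision rule of the kind discussed above can be stated in a few lines: an offence tier is asserted only if the measured BAC minus its expanded uncertainty still exceeds the tier limit. The 8.0% figure comes from the abstract; the rule itself is a generic sketch, not necessarily the authors' exact procedure.

```python
# Guard-band decision rule for a multiple-threshold BAC law: a result is
# reported as exceeding a limit only if the value minus its expanded
# uncertainty is still above the limit. Tiers follow the Italian Highway
# Code (0.5/0.8/1.5 g/L); the rule itself is a generic sketch.
U_REL = 0.080                      # expanded relative uncertainty (8.0%)
LIMITS = [0.5, 0.8, 1.5]           # g/L tiers

def assess(bac_g_per_l):
    """Highest tier exceeded beyond the uncertainty guard band, or None."""
    exceeded = [lim for lim in LIMITS if bac_g_per_l * (1 - U_REL) > lim]
    return exceeded[-1] if exceeded else None

for bac in (0.52, 0.55, 0.87, 1.60):
    lim = assess(bac)
    print(f"BAC {bac:.2f} g/L -> " +
          (f"above the {lim} g/L tier" if lim else "no tier proven"))
```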

  17. Uncertainty and risk in wildland fire management: a review.

    PubMed

    Thompson, Matthew P; Calkin, Dave E

    2011-08-01

    Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to treatments, and of spatiotemporal dynamics involving disturbance regimes and climate change. This work attempts to systematically align sources of uncertainty with the most appropriate decision support methodologies, in order to facilitate cost-effective, risk-based wildfire planning efforts. We review the state of wildfire risk assessment and management, with a specific focus on uncertainties challenging implementation of integrated risk assessments that consider a suite of human and ecological values. Recent advances in wildfire simulation and geospatial mapping of highly valued resources have enabled robust risk-based analyses to inform planning across a variety of scales, although improvements are needed in fire behavior and ignition occurrence models. A key remaining challenge is a better characterization of non-market resources at risk, both in terms of their response to fire and how society values those resources. Our findings echo earlier literature identifying wildfire effects analysis and value uncertainty as the primary challenges to integrated wildfire risk assessment and wildfire management. We stress the importance of identifying and characterizing uncertainties in order to better quantify and manage them. Leveraging the most appropriate decision support tools can facilitate wildfire risk assessment and ideally improve decision-making. Published by Elsevier Ltd.

  18. Incorporating climate change into ecosystem service assessments and decisions: a review.

    PubMed

    Runting, Rebecca K; Bryan, Brett A; Dee, Laura E; Maseyk, Fleur J F; Mandle, Lisa; Hamel, Perrine; Wilson, Kerrie A; Yetka, Kathleen; Possingham, Hugh P; Rhodes, Jonathan R

    2017-01-01

    Climate change is having a significant impact on ecosystem services and is likely to become increasingly important as this phenomenon intensifies. Future impacts can be difficult to assess as they often involve long timescales, dynamic systems with high uncertainties, and are typically confounded by other drivers of change. Despite a growing literature on climate change impacts on ecosystem services, no quantitative syntheses exist. Hence, we lack an overarching understanding of the impacts of climate change, how they are being assessed, and the extent to which other drivers, uncertainties, and decision making are incorporated. To address this, we systematically reviewed the peer-reviewed literature that assesses climate change impacts on ecosystem services at subglobal scales. We found that the impact of climate change on most types of services was predominantly negative (59% negative, 24% mixed, 4% neutral, 13% positive), but varied across services, drivers, and assessment methods. Although uncertainty was usually incorporated, there were substantial gaps in the sources of uncertainty included, along with the methods used to incorporate them. We found that relatively few studies integrated decision making, and even fewer studies aimed to identify solutions that were robust to uncertainty. For management or policy to ensure the delivery of ecosystem services, integrated approaches that incorporate multiple drivers of change and account for multiple sources of uncertainty are needed. This is undoubtedly a challenging task, but ignoring these complexities can result in misleading assessments of the impacts of climate change, suboptimal management outcomes, and the inefficient allocation of resources for climate adaptation. © 2016 John Wiley & Sons Ltd.

  19. How well do we understand oil spill hazard mapping?

    NASA Astrophysics Data System (ADS)

    Sepp Neves, Antonio Augusto; Pinardi, Nadia

    2017-04-01

    In simple terms, the marine oil spill hazard can be described as related to three main factors: the spill event itself, the spill trajectory, and the arrival and adsorption of oil on the shore, or beaching. Regarding the first factor, spill occurrence rates and magnitude distributions, and their respective uncertainties, have been estimated mainly from maritime casualty reports. Abascal et al. (2010) and Sepp Neves et al. (2015) demonstrated for the Prestige (Spain, 2002) and Jiyeh (Lebanon, 2006) spills that ensemble numerical oil spill simulations can generate reliable estimates of the most likely oil trajectories and impacted coasts. Although paramount for estimating spill impacts on coastal resources, the third component of the oil spill hazard (i.e., oil beaching) is still a subject of discussion. Analysts have employed different methodologies to estimate the coastal component of the hazard, relying, for instance, on the beaching frequency alone, on the time during which a given coastal segment is subject to oil concentrations above a preset threshold, on percentages of beached oil relative to the originally spilled volume, and on many others. Obviously, the results are not comparable and are sometimes not consistent with present knowledge about the environmental impacts of oil spills. The observed inconsistency in hazard mapping methodologies suggests that there is still a lack of understanding of the beaching component of the oil spill hazard itself. A careful statistical description of the beaching process could finally set a common ground for oil spill hazard mapping studies, as observed for other hazards such as earthquakes and landslides. This paper is the last in a series of efforts to standardize oil spill hazard and risk assessments through an ISO-compliant framework (IT-OSRA; see Sepp Neves et al. (2015)). We performed two large ensemble oil spill experiments addressing uncertainties in the spill characteristics and location and in the meteocean conditions for two different areas (Algarve and Uruguay), aiming to quantify the hazard due to accidental (rare, large-volume) and operational (frequent, usually small-volume) spills associated with maritime traffic. In total, over 60,000 240 h-long simulations were run and the statistical behaviour of the beached concentrations was described. The concentration distributions for both study areas were successfully fitted with a Gamma distribution, demonstrating the generality of our conclusions. The oil spill hazard and its uncertainties were quantified for accidental and operational events from the fitted distribution parameters. The hazard estimates were therefore comparable between areas and allowed priority coastal segments for protection to be identified and sources of hazard to be ranked.
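
    The distribution-fitting step can be sketched as follows: fit a Gamma law to ensemble beached-concentration output and read hazard quantiles from the fitted distribution. The data below are synthetic draws standing in for simulation output.

```python
import numpy as np
from scipy import stats

# Fit a Gamma distribution to beached oil concentrations from an ensemble
# of spill simulations and read hazard quantiles off the fitted law.
rng = np.random.default_rng(6)
beached = rng.gamma(shape=0.8, scale=50.0, size=5000)   # synthetic, e.g. kg/km

shape, loc, scale = stats.gamma.fit(beached, floc=0.0)  # fix location at zero
p95 = stats.gamma.ppf(0.95, shape, loc=loc, scale=scale)

print(f"fitted shape={shape:.2f}, scale={scale:.1f}; "
      f"95th-percentile beached concentration: {p95:.1f}")
```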

  20. Introducing nonpoint source transferable quotas in nitrogen trading: The effects of transaction costs and uncertainty.

    PubMed

    Zhou, Xiuru; Ye, Weili; Zhang, Bing

    2016-03-01

    Transaction costs and uncertainty are considered significant obstacles in the emissions trading market, especially for including nonpoint sources in water quality trading. This study develops a nonlinear programming model to simulate how uncertainty and transaction costs affect the performance of point/nonpoint source (PS/NPS) water quality trading in the Lake Tai watershed, China. The results demonstrate that PS/NPS water quality trading is a highly cost-effective instrument for emissions abatement in the Lake Tai watershed, saving 89.33% of pollution abatement costs compared to trading only between nonpoint sources. However, uncertainty can significantly reduce this cost-effectiveness by reducing trading volume. In addition, transaction costs from bargaining and decision making raise total pollution abatement costs directly and cause the offset system to deviate from the optimal state. Proper investment in the monitoring and measurement of nonpoint emissions, however, can decrease uncertainty and reduce total abatement costs. Finally, we show that the dispersed ownership of China's farmland will introduce high uncertainty and transaction costs into the PS/NPS offset system, even though nonpoint pollution abatement costs are lower than those of point sources. Copyright © 2015 Elsevier Ltd. All rights reserved.
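
    A toy version of the least-cost trading logic, with a trading ratio discounting uncertain nonpoint reductions, is sketched below. It is a textbook-style stand-in, not the paper's nonlinear programming model of the Lake Tai watershed; all coefficients are invented.

```python
from scipy.optimize import minimize

# Two-source least-cost abatement with a PS/NPS trading ratio reflecting
# nonpoint uncertainty (a common device). Quadratic abatement costs; a
# required effective reduction must be met, with nonpoint reductions
# discounted by the trading ratio t_ratio > 1.
c_ps, c_np = 4.0, 1.0          # cost coefficients (point source is pricier)
target = 100.0                 # required effective reduction
t_ratio = 2.0                  # units of NPS reduction per unit of credit

cost = lambda x: c_ps * x[0] ** 2 + c_np * x[1] ** 2
cons = {"type": "ineq", "fun": lambda x: x[0] + x[1] / t_ratio - target}
res = minimize(cost, x0=[50, 50], constraints=cons, bounds=[(0, None)] * 2)

ps, nps = res.x
print(f"PS abatement: {ps:.1f}, NPS abatement: {nps:.1f}, cost: {res.fun:.0f}")
```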

  1. Beyond Social Constructivist Perspectives on Assessment: The Centring of Knowledge

    ERIC Educational Resources Information Center

    Shay, Suellen

    2008-01-01

    Over the past few decades assessment has been heralded for its key role in the improvement of teaching and learning. However, more recently there have been expressions of uncertainty about whether assessment is in fact delivering on its promised potential. Against this backdrop of uncertainty and circumspection this paper offers a critical…

  2. Disdrometer-based C-Band Radar Quantitative Precipitation Estimation (QPE) in a highly complex terrain region in tropical Colombia.

    NASA Astrophysics Data System (ADS)

    Sepúlveda, J.; Hoyos Ortiz, C. D.

    2017-12-01

    An adequate quantification of precipitation over land is critical for many societal applications, including agriculture, hydroelectricity generation, water supply, and risk management associated with extreme events. Rain gauges, the traditional method of precipitation estimation, are excellent for estimating the volume of liquid water during a particular precipitation event, but they do not fully capture the high spatial variability of the phenomenon, which is a requirement for almost all practical applications. The weather radar, on the other hand, an active remote sensing instrument, provides a proxy for rainfall with fine spatial resolution and adequate temporal sampling; however, it does not measure surface precipitation. In order to fully exploit the capabilities of the weather radar, it is necessary to develop quantitative precipitation estimation (QPE) techniques combining radar information with in-situ measurements. Different QPE methodologies are explored and adapted to local observations in a highly complex terrain region in tropical Colombia using a C-band radar and a relatively dense network of rain gauges and disdrometers. One important result is that the expressions reported in the literature for extratropical locations are not representative of the conditions found in the studied tropical region. In addition to reproducing the state-of-the-art techniques, a new multi-stage methodology based on radar-derived variables and disdrometer data is proposed in order to achieve the best QPE possible. The main motivation for this new methodology is that most traditional QPE methods do not directly take into account the different uncertainty sources involved in the process. The main advantage of the multi-stage model compared to traditional models is that it allows assessing and quantifying the uncertainty in the surface rain rate estimation. The sub-hourly rainfall estimates from the multi-stage methodology are realistic compared to observed data, in spite of the many sources of uncertainty, including the sampling volume, the different physical principles of the sensors, the incomplete understanding of the microphysics of precipitation and, most importantly, the rapidly varying drop size distribution.
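
    The classic first stage of radar QPE, fitting a Z-R power law from disdrometer-derived pairs and inverting it for rain rate, can be sketched as below on synthetic data. The multi-stage, uncertainty-aware method described above goes well beyond this single regression.

```python
import numpy as np

# Fit a Z-R power law Z = a * R^b in log space from disdrometer-derived
# (Z, R) pairs, then invert it for a radar-measured reflectivity. Data
# are synthetic around the Marshall-Palmer values (a=200, b=1.6).
rng = np.random.default_rng(7)
R = 10 ** rng.uniform(-0.5, 1.5, 300)          # rain rate, mm/h
Z_dB = 10 * np.log10(200.0 * R ** 1.6)         # "true" reflectivity, dBZ
Z_dB += rng.normal(0, 1.5, R.size)             # sampling noise

# linear fit in (log10 R, dBZ) space: dBZ = 10*log10(a) + 10*b*log10(R)
slope, intercept = np.polyfit(np.log10(R), Z_dB, 1)
a, b = 10 ** (intercept / 10), slope / 10
print(f"fitted Z = {a:.0f} * R^{b:.2f}")

def rain_rate(dbz):
    """Invert the fitted power law for a radar-measured dBZ."""
    return (10 ** (dbz / 10) / a) ** (1 / b)

print(f"R at 40 dBZ: {rain_rate(40.0):.1f} mm/h")
```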

  3. Assessing Uncertainty of Interspecies Correlation Estimation Models for Aromatic Compounds

    EPA Science Inventory

    We developed Interspecies Correlation Estimation (ICE) models for aromatic compounds containing 1 to 4 benzene rings to assess uncertainty in toxicity extrapolation in two data compilation approaches. ICE models are mathematical relationships between surrogate and predicted test ...

  4. Flood risk assessment and robust management under deep uncertainty: Application to Dhaka City

    NASA Astrophysics Data System (ADS)

    Mojtahed, Vahid; Gain, Animesh Kumar; Giupponi, Carlo

    2014-05-01

    Socio-economic and climatic changes have been the main drivers of uncertainty in environmental risk assessment, and in flood risk assessment in particular. The level of future uncertainty that researchers face when dealing with problems in a future perspective focused on climate change is known as Deep Uncertainty (also known as Knightian uncertainty): nobody has experienced those changes before, our knowledge is limited to the extent that we have no notion of probabilities, and consolidated risk management approaches therefore have limited potential. Deep uncertainty refers to circumstances that analysts and experts do not know, or on which parties to decision making cannot agree: (i) the appropriate models describing the interaction among system variables; (ii) the probability distributions representing uncertainty about key parameters in the models; and (iii) how to value the desirability of alternative outcomes. The need thus emerges to assist policy-makers by providing them not with a single optimal solution to the problem at hand, such as crisp estimates of the costs of damages from the natural hazards considered, but with ranges of possible future costs, based on the outcomes of ensembles of assessment models and sets of plausible scenarios. Accordingly, we need to replace optimality as a decision criterion with robustness. Under conditions of deep uncertainty, decision-makers have no statistical or mathematical basis for identifying optimal solutions; they should instead prefer to implement "robust" decisions that perform relatively well across all conceivable outcomes of the unknown future scenarios. Under deep uncertainty, analysts cannot employ probability theory or other statistics usually derived from observed historical data, and therefore turn to non-statistical devices such as scenario analysis. We construct several plausible scenarios, each being a full description of what may happen in the future, based on a meaningful synthesis of parameter values with control of their correlations to maintain internal consistency. This paper aims to incorporate a set of data mining and sampling tools to assess the uncertainty of model outputs under future climatic and socio-economic changes for Dhaka City, and to provide a decision support system for robust flood management and mitigation policies. After constructing an uncertainty matrix to identify the main sources of uncertainty for Dhaka City, we derive several hazard and vulnerability maps based on future climatic and socio-economic scenarios. The vulnerability of each flood management alternative under different sets of scenarios is determined, and finally the robustness of each plausible solution considered is defined based on the above assessment.
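
    Robustness as a decision criterion can be made concrete with a minimax-regret sketch: with no probabilities over scenarios, choose the alternative whose worst-case regret is smallest. The options, scenarios and cost matrix below are purely illustrative, not the Dhaka City analysis.

```python
import numpy as np

# Minimax-regret choice across plausible scenarios (no probabilities).
# Rows: flood-management alternatives; columns: scenarios. Costs invented.
#                 scenario:  S1    S2    S3    S4
costs = np.array([[120.0, 150.0, 300.0, 420.0],   # do nothing
                  [180.0, 190.0, 230.0, 260.0],   # embankments
                  [210.0, 215.0, 225.0, 240.0]])  # embankments + zoning
options = ["do nothing", "embankments", "embankments + zoning"]

regret = costs - costs.min(axis=0)        # regret vs best choice per scenario
worst_regret = regret.max(axis=1)
best = int(np.argmin(worst_regret))
print(f"robust (minimax-regret) choice: {options[best]}, "
      f"max regret {worst_regret[best]:.0f}")
```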

  5. Predicting the Earth encounters of (99942) Apophis

    NASA Technical Reports Server (NTRS)

    Giorgini, Jon D.; Benner, Lance A. M.; Ostro, Steven J.; Nolan, Michael C.; Busch, Michael W.

    2007-01-01

    Arecibo delay-Doppler measurements of (99942) Apophis in 2005 and 2006 resulted in a five standard-deviation trajectory correction to the optically predicted close approach distance to Earth in 2029. The radar measurements reduced the volume of the statistical uncertainty region entering the encounter to 7.3% of the pre-radar solution, but increased the trajectory uncertainty growth rate across the encounter by 800% due to the closer predicted approach to the Earth. A small estimated Earth impact probability remained for 2036. With standard-deviation plane-of-sky position uncertainties for 2007-2010 already less than 0.2 arcsec, the best near-term ground-based optical astrometry can only weakly affect the trajectory estimate. While the potential for impact in 2036 will likely be excluded in 2013 (if not 2011) using ground-based optical measurements, approximations within the Standard Dynamical Model (SDM) used to estimate and predict the trajectory from the current era are sufficient to obscure the difference between a predicted impact and a miss in 2036 by altering the dynamics leading into the 2029 encounter. Normal impact probability assessments based on the SDM become problematic without knowledge of the object's physical properties; impact could be excluded while the actual dynamics still permit it. Calibrated position uncertainty intervals are developed to compensate for this by characterizing the minimum and maximum effect of physical parameters on the trajectory. Uncertainty in accelerations related to solar radiation can cause between 82 and 4720 Earth-radii of trajectory change relative to the SDM by 2036. If an actionable hazard exists, alteration by 2-10% of Apophis' total absorption of solar radiation in 2018 could be sufficient to produce a six standard-deviation trajectory change by 2036 given physical characterization; even a 0.5% change could produce a trajectory shift of one Earth-radius by 2036 for all possible spin-poles and likely masses. Planetary ephemeris uncertainties are the next greatest source of systematic error, causing up to 23 Earth-radii of uncertainty. The SDM Earth point-mass assumption introduces an additional 2.9 Earth-radii of prediction error by 2036. Unmodeled asteroid perturbations produce as much as 2.3 Earth-radii of error. We find no future small-body encounters likely to yield an Apophis mass determination prior to 2029. However, asteroid (144898) 2004 VD17, itself having a statistical Earth impact in 2102, will probably encounter Apophis at 6.7 lunar distances in 2034, their uncertainty regions coming as close as 1.6 lunar distances near the center of both SDM probability distributions.

  6. Sources of uncertainty in annual forest inventory estimates

    Treesearch

    Ronald E. McRoberts

    2000-01-01

    Although design and estimation aspects of annual forest inventories have begun to receive considerable attention within the forestry and natural resources communities, little attention has been devoted to identifying the sources of uncertainty inherent in these systems or to assessing the impact of those uncertainties on the total uncertainties of inventory estimates....

  7. Assessing uncertainties in land cover projections.

    PubMed

    Alexander, Peter; Prestele, Reinhard; Verburg, Peter H; Arneth, Almut; Baranzelli, Claudia; Batista E Silva, Filipe; Brown, Calum; Butler, Adam; Calvin, Katherine; Dendoncker, Nicolas; Doelman, Jonathan C; Dunford, Robert; Engström, Kerstin; Eitelberg, David; Fujimori, Shinichiro; Harrison, Paula A; Hasegawa, Tomoko; Havlik, Petr; Holzhauer, Sascha; Humpenöder, Florian; Jacobs-Crisioni, Chris; Jain, Atul K; Krisztin, Tamás; Kyle, Page; Lavalle, Carlo; Lenton, Tim; Liu, Jiayi; Meiyappan, Prasanth; Popp, Alexander; Powell, Tom; Sands, Ronald D; Schaldach, Rüdiger; Stehfest, Elke; Steinbuks, Jevgenijs; Tabeau, Andrzej; van Meijl, Hans; Wise, Marshall A; Rounsevell, Mark D A

    2017-02-01

    Understanding uncertainties in land cover projections is critical to investigating land-based climate mitigation policies, assessing the potential of climate adaptation strategies and quantifying the impacts of land cover change on the climate system. Here, we identify and quantify uncertainties in global and European land cover projections over a diverse range of model types and scenarios, extending the analysis beyond the agro-economic models included in previous comparisons. The results from 75 simulations over 18 models are analysed and show a large range in land cover area projections, with the highest variability occurring in future cropland areas. We demonstrate systematic differences in land cover areas associated with the characteristics of the modelling approach, differences that are at least as great as those attributed to the scenario variations. The results lead us to conclude that a higher degree of uncertainty exists in land use projections than is currently included in climate or earth system projections. To account for land use uncertainty, it is recommended to use a diverse set of models and approaches when assessing the potential impacts of land cover change on future climate. Additionally, further work is needed to better understand the assumptions driving land use model results and to reveal the causes of uncertainty in more depth, to help reduce model uncertainty and improve the projections of land cover. © 2016 John Wiley & Sons Ltd.

  8. Integrating info-gap decision theory with robust population management: a case study using the Mountain Plover.

    PubMed

    van der Burg, Max Post; Tyre, Andrew J

    2011-01-01

    Wildlife managers often make decisions under considerable uncertainty. In the most extreme case, a complete lack of data leads to uncertainty that is unquantifiable. Information-gap decision theory deals with assessing management decisions under such extreme uncertainty, but it is not widely used in wildlife management. Likewise, robust population management methods were developed to deal with uncertainties in multiple model parameters, but the two methods have not, as yet, been used in tandem to assess population management decisions. We provide a novel combination of the robust population management approach for matrix models with the information-gap decision theory framework for making conservation decisions under extreme uncertainty. We applied our model to the problem of nest survival management in an endangered bird species, the Mountain Plover (Charadrius montanus). Our matrix sensitivity results suggest that nest management is unlikely to have a strong effect on population growth rate, confirming previous analyses. However, given the amount of uncertainty about adult and juvenile survival, our analysis suggested that maximizing nest marking effort was the more robust decision for maintaining a stable population. Focusing on the twin concepts of opportunity and robustness in an information-gap model provides a useful method of assessing conservation decisions under extreme uncertainty.
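
    The following sketch illustrates the info-gap robustness idea in this setting: for a simple two-stage matrix model, it finds the largest fractional error in the survival estimates for which the worst-case growth rate still meets a stability requirement. The model structure, nominal survival values and fecundity figures are hypothetical, not the fitted Mountain Plover parameters.

```python
import numpy as np

def lam(F, sj, sa):
    """Dominant eigenvalue (growth rate) of a 2-stage projection matrix:
    top row = fecundities, bottom row = survival transitions."""
    A = np.array([[0.0, F], [sj, sa]])
    return max(abs(np.linalg.eigvals(A)))

def robustness(F, sj_hat, sa_hat, req=1.0, alphas=np.linspace(0.0, 0.5, 501)):
    """Info-gap robustness: the largest fractional error alpha in the
    survival estimates for which the worst case (survivals reduced by
    a factor 1 - alpha) still satisfies lambda >= req."""
    h = 0.0
    for a in alphas:
        if lam(F, sj_hat * (1 - a), sa_hat * (1 - a)) < req:
            break
        h = a
    return h

# Hypothetical nominal survival estimates and two management options
# differing in fecundity via nest-marking effort.
sj_hat, sa_hat = 0.35, 0.55
for label, F in [("low nest-marking effort", 1.6),
                 ("max nest-marking effort", 2.0)]:
    print(f"{label}: robustness = {robustness(F, sj_hat, sa_hat):.3f}")
```

    The more robust option is the one whose performance requirement tolerates the larger horizon of uncertainty, which in this toy setup is the higher-effort alternative, mirroring the paper's qualitative conclusion.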

  9. Quantification for complex assessment: uncertainty estimation in final year project thesis assessment

    NASA Astrophysics Data System (ADS)

    Kim, Ho Sung

    2013-12-01

    A quantitative method for estimating an expected uncertainty (reliability and validity) in assessment results arising from the relativity between four variables, viz examiner's expertise, examinee's expertise achieved, assessment task difficulty and examinee's performance, was developed for the complex assessment applicable to final year project thesis assessment including peer assessment. A guide map can be generated by the method for finding expected uncertainties prior to the assessment implementation with a given set of variables. It employs a scale for visualisation of expertise levels, derivation of which is based on quantified clarities of mental images for levels of the examiner's expertise and the examinee's expertise achieved. To identify the relevant expertise areas that depend on the complexity in assessment format, a graphical continuum model was developed. The continuum model consists of assessment task, assessment standards and criterion for the transition towards the complex assessment owing to the relativity between implicitness and explicitness and is capable of identifying areas of expertise required for scale development.

  10. Towards a Uniform Metrological Assessment of Grating-Based Optical Fiber Sensors: From Refractometers to Biosensors

    PubMed Central

    Chiavaioli, Francesco; Gouveia, Carlos A. J.; Jorge, Pedro A. S.; Baldini, Francesco

    2017-01-01

    A metrological assessment of grating-based optical fiber sensors is proposed with the aim of providing an objective evaluation of the performance of this sensor category. Attention was focused on the most common parameters, used to describe the performance of both optical refractometers and biosensors, which encompassed sensitivity, with a distinction between volume or bulk sensitivity and surface sensitivity, resolution, response time, limit of detection, specificity (or selectivity), reusability (or regenerability) and some other parameters of generic interest, such as measurement uncertainty, accuracy, precision, stability, drift, repeatability and reproducibility. Clearly, the concepts discussed here can also be applied to any resonance-based sensor, thus providing the basis for an easier and direct performance comparison of a great number of sensors published in the literature up to now. In addition, common mistakes present in the literature made for the evaluation of sensor performance are highlighted, and lastly a uniform performance assessment is discussed and provided. Finally, some design strategies will be proposed to develop a grating-based optical fiber sensing scheme with improved performance. PMID:28635665

  11. Towards a Uniform Metrological Assessment of Grating-Based Optical Fiber Sensors: From Refractometers to Biosensors.

    PubMed

    Chiavaioli, Francesco; Gouveia, Carlos A J; Jorge, Pedro A S; Baldini, Francesco

    2017-06-21

    A metrological assessment of grating-based optical fiber sensors is proposed with the aim of providing an objective evaluation of the performance of this sensor category. Attention was focused on the most common parameters, used to describe the performance of both optical refractometers and biosensors, which encompassed sensitivity, with a distinction between volume or bulk sensitivity and surface sensitivity, resolution, response time, limit of detection, specificity (or selectivity), reusability (or regenerability) and some other parameters of generic interest, such as measurement uncertainty, accuracy, precision, stability, drift, repeatability and reproducibility. Clearly, the concepts discussed here can also be applied to any resonance-based sensor, thus providing the basis for an easier and direct performance comparison of a great number of sensors published in the literature up to now. In addition, common mistakes present in the literature made for the evaluation of sensor performance are highlighted, and lastly a uniform performance assessment is discussed and provided. Finally, some design strategies will be proposed to develop a grating-based optical fiber sensing scheme with improved performance.

  12. Comparison of 2D numerical models for river flood hazard assessment: simulation of the Secchia River flood in January, 2014

    NASA Astrophysics Data System (ADS)

    Shustikova, Iuliia; Domeneghetti, Alessio; Neal, Jeffrey; Bates, Paul; Castellarin, Attilio

    2017-04-01

    Hydrodynamic modeling of inundation events still carries a large array of uncertainties. This effect is especially evident in models run over geographically large areas. Recent studies suggest using fully two-dimensional (2D) models at high resolution in order to avoid the uncertainties and limitations arising from an incorrect interpretation of flood dynamics and an unrealistic reproduction of the terrain topography. This, however, affects computational efficiency, increasing run times and hardware demands. Concerning this point, our study evaluates and compares numerical models of different complexity by testing them on a flood event that occurred in the basin of the Secchia River, Northern Italy, on 19 January 2014. The event was characterized by a levee breach and the consequent flooding of over 75 km2 of the plain behind the dike within 48 hours, causing population displacement, one death and economic losses in excess of 400 million Euro. We test the well-established TELEMAC 2D and LISFLOOD-FP codes, together with the recently released HEC-RAS 5.0.3 (2D model); all models are implemented at different grid sizes (2-200 m) based on a 1 m digital elevation model. TELEMAC is a fully 2D hydrodynamic model based on the finite-element or finite-volume approach, whereas HEC-RAS 5.0.3 and LISFLOOD-FP are both coupled 1D-2D models. All models are calibrated against observed inundation extent and maximum water depths, retrieved from remotely sensed data and field survey reports. Our study quantitatively compares the three modeling strategies, highlighting differences in ease of implementation, accuracy of representation of hydraulic processes within floodplains, and computational efficiency. Additionally, we examine the different grid resolutions in terms of result accuracy and computation time. Our study is a preliminary assessment focusing on smaller areas in order to identify modeling schemes that would be efficient for simulating flooding scenarios for large and very large floodplains. This research aims at contributing to the reduction of uncertainties and limitations in hazard and risk assessment.

  13. Development of a primary standard for absorbed dose from unsealed radionuclide solutions

    NASA Astrophysics Data System (ADS)

    Billas, I.; Shipley, D.; Galer, S.; Bass, G.; Sander, T.; Fenwick, A.; Smyth, V.

    2016-12-01

    Currently, the determination of the internal absorbed dose to tissue from an administered radionuclide solution relies on Monte Carlo (MC) calculations based on published nuclear decay data, such as emission probabilities and energies. In order to validate these methods with measurements, it is necessary to achieve the required traceability of the internal absorbed dose measurements of a radionuclide solution to a primary standard of absorbed dose. The purpose of this work was to develop a suitable primary standard. A comparison between measurements and calculations of absorbed dose allows the validation of the internal radiation dose assessment methods. The absorbed dose from an yttrium-90 chloride (90YCl) solution was measured with an extrapolation chamber. A phantom was developed at the National Physical Laboratory (NPL), the UK’s National Measurement Institute, to position the extrapolation chamber as closely as possible to the surface of the solution. The performance of the extrapolation chamber was characterised and a full uncertainty budget for the absorbed dose determination was obtained. Absorbed dose to air in the collecting volume of the chamber was converted to absorbed dose at the centre of the radionuclide solution by applying a MC calculated correction factor. This allowed a direct comparison of the analytically calculated and experimentally determined absorbed dose of an 90YCl solution. The relative standard uncertainty in the measurement of absorbed dose at the centre of an 90YCl solution with the extrapolation chamber was found to be 1.6% (k  =  1). The calculated 90Y absorbed doses from published medical internal radiation dose (MIRD) and radiation dose assessment resource (RADAR) data agreed with measurements to within 1.5% and 1.4%, respectively. This study has shown that it is feasible to use an extrapolation chamber for performing primary standard absorbed dose measurements of an unsealed radionuclide solution. Internal radiation dose assessment methods based on MIRD and RADAR data for 90Y have been validated with experimental absorbed dose determination and they agree within the stated expanded uncertainty (k  =  2).

  14. Optimization of geothermal well trajectory in order to minimize borehole failure

    NASA Astrophysics Data System (ADS)

    Dahrabou, A.; Valley, B.; Ladner, F.; Guinot, F.; Meier, P.

    2017-12-01

    In projects based on the Enhanced Geothermal System (EGS) principle, deep boreholes are drilled into low-permeability rock masses. As part of the completion operations, the permeability of existing fractures in the rock mass is enhanced by injecting large volumes of water. These stimulation treatments aim at achieving sufficient water circulation for heat extraction at commercial rates, which makes the stimulation operations critical to project success. The accurate placement of the stimulation treatments requires well completion with effective zonal isolation, and wellbore stability is a prerequisite to all zonal isolation techniques, be it packer sealing or cement placement. In this project, a workflow allowing a fast decision-making process for selecting an optimal well trajectory for EGS projects is developed. The well is first drilled vertically; then, based on logging data whose acquisition is costly (100 kCHF/day), the direction in which the strongly deviated borehole section will be drilled must be determined in order to optimize borehole stability and to intersect the highest number of fractures favourably oriented for stimulation. The workflow applies to crystalline rock and includes an uncertainty and risk assessment framework. An initial sensitivity study was performed to identify the parameters most influential on borehole stability. The main challenge in these analyses is that the strength and stress profiles are not known independently. A geomechanical model was therefore calibrated on the borehole failure observed in the Basel geothermal well BS-1. As a first approximation, a purely elastic-static analytical solution combined with a purely cohesive failure criterion was used, as it provides the most consistent prediction across failure indicators. A systematic analysis of the uncertainty on all parameters was performed to assess the reliability of the optimal trajectory selection. For each drilling scenario, the failure probability and the associated risks are computed stochastically. In addition, model uncertainty is assessed by confronting various failure modelling approaches with the available failure data from the Basel project. Together, these results form the basis of an integrated workflow for optimizing geothermal (EGS) well trajectories.

  15. Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BABA,T.; ISHIGURO,K.; ISHIHARA,Y.

    1999-08-30

    Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference and many alternative cases representing various groups of FEPs were defined and individual numerical simulations performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters as well as a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment.

  16. Verifying and Validating Simulation Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemez, Francois M.

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of-knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  17. Dispelling urban myths about default uncertainty factors in chemical risk assessment – sufficient protection against mixture effects?

    PubMed Central

    2013-01-01

    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we expose that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can account both for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data and recommend adopting a pragmatic, but scientifically better founded approach to mixture risk assessment. PMID:23816180
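
    As a minimal worked example of the default factor discussed above, the snippet below derives a tolerable daily intake by dividing a hypothetical animal NOAEL by the two sub-factors of 10 whose multiplication yields the default uncertainty factor of 100; the NOAEL value is invented for illustration.

```python
# Illustrative only: deriving a tolerable daily intake (TDI) from a
# hypothetical animal NOAEL with the default uncertainty factor of 100,
# conventionally split into two sub-factors of 10 (interspecies
# extrapolation and inter-individual human variability).
noael = 5.0                      # mg/kg bw/day, hypothetical NOAEL
uf_interspecies = 10.0           # animal-to-human extrapolation
uf_intraspecies = 10.0           # variability within the human population
tdi = noael / (uf_interspecies * uf_intraspecies)
print(f"TDI = {tdi} mg/kg bw/day")   # -> 0.05 mg/kg bw/day
```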

  18. Dispelling urban myths about default uncertainty factors in chemical risk assessment--sufficient protection against mixture effects?

    PubMed

    Martin, Olwenn V; Martin, Scholze; Kortenkamp, Andreas

    2013-07-01

    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we expose that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can account both for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data and recommend adopting a pragmatic, but scientifically better founded approach to mixture risk assessment.

  19. Defining robustness protocols: a method to include and evaluate robustness in clinical plans

    NASA Astrophysics Data System (ADS)

    McGowan, S. E.; Albertini, F.; Thomas, S. J.; Lomax, A. J.

    2015-04-01

    We aim to define a site-specific robustness protocol to be used during the clinical plan evaluation process. The robustness of 16 skull-base IMPT plans to systematic range and random set-up errors has been retrospectively and systematically analysed. This was done by calculating the error-bar dose distribution (ebDD) for all plans and by defining metrics used to build protocols aiding plan assessment. Additionally, an example of how to use the robustness database clinically is given, whereby a plan with sub-optimal brainstem robustness was identified. The advantage of using different beam arrangements to improve plan robustness was analysed. Using the ebDD it was found that range errors had a smaller effect on the dose distribution than the corresponding set-up errors in a single fraction, and that organs at risk were most robust to range errors, whereas the target was more robust to set-up errors. A database was created to aid planners in terms of plan robustness aims in these volumes. This resulted in the definition of site-specific robustness protocols. The use of robustness constraints allowed the identification of a specific patient who may have benefited from a more individualized treatment. A new beam arrangement proved preferable when balancing conformality and robustness for this case. The ebDD and error-bar volume histogram proved effective in analysing plan robustness. This process of retrospective analysis could be used to establish site-specific robustness planning protocols in proton therapy. Such protocols allow the planner to identify plans that, although delivering a dosimetrically adequate dose distribution, have sub-optimal robustness to these uncertainties. For these cases the use of different beam start conditions may improve plan robustness to set-up and range uncertainties.

  20. Quantitative analysis of in situ optical diagnostics for inferring particle/aggregate parameters in flames: Implications for soot surface growth and total emissivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koeylue, U.O.

    1997-05-01

    An in situ particulate diagnostic/analysis technique is outlined based on the Rayleigh-Debye-Gans polydisperse fractal aggregate (RDG/PFA) scattering interpretation of absolute angular light scattering and extinction measurements. Using proper particle refractive index, the proposed data analysis method can quantitatively yield all aggregate parameters (particle volume fraction, f{sub v}, fractal dimension, D{sub f}, primary particle diameter, d{sub p}, particle number density, n{sub p}, and aggregate size distribution, pdf(N)) without any prior knowledge about the particle-laden environment. The present optical diagnostic/interpretation technique was applied to two different soot-containing laminar and turbulent ethylene/air nonpremixed flames in order to assess its reliability. The aggregate interpretation of optical measurements yielded D{sub f}, d{sub p}, and pdf(N) that are in excellent agreement with ex situ thermophoretic sampling/transmission electron microscope (TS/TEM) observations within experimental uncertainties. However, volume-equivalent single particle models (Rayleigh/Mie) overestimated d{sub p} by about a factor of 3, causing an order of magnitude underestimation in n{sub p}. Consequently, soot surface areas and growth rates were in error by a factor of 3, emphasizing that aggregation effects need to be taken into account when using optical diagnostics for a reliable understanding of soot formation/evolution mechanism in flames. The results also indicated that total soot emissivities were generally underestimated using Rayleigh analysis (up to 50%), mainly due to the uncertainties in soot refractive indices at infrared wavelengths. This suggests that aggregate considerations may not be essential for reasonable radiation heat transfer predictions from luminous flames because of fortuitous error cancellation, resulting in typically a 10 to 30% net effect.

  1. TSUNAMI Primer: A Primer for Sensitivity/Uncertainty Calculations with SCALE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Mueller, Don; Bowman, Stephen M

    2009-01-01

    This primer presents examples in the application of the SCALE/TSUNAMI tools to generate k{sub eff} sensitivity data for one- and three-dimensional models using TSUNAMI-1D and -3D and to examine uncertainties in the computed k{sub eff} values due to uncertainties in the cross-section data used in their calculation. The proper use of unit cell data and need for confirming the appropriate selection of input parameters through direct perturbations are described. The uses of sensitivity and uncertainty data to identify and rank potential sources of computational bias in an application system and TSUNAMI tools for assessment of system similarity using sensitivity and uncertainty criteria are demonstrated. Uses of these criteria in trending analyses to assess computational biases, bias uncertainties, and gap analyses are also described. Additionally, an application of the data adjustment tool TSURFER is provided, including identification of specific details of sources of computational bias.
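
    The sensitivity/uncertainty propagation underlying this kind of analysis rests on the standard "sandwich rule": the relative variance of k-eff is S^T C S, where S holds the k-eff sensitivities to the nuclear data and C their relative covariances. The three-parameter numbers below are invented to show the mechanics only; they are not SCALE data.

```python
import numpy as np

# Invented sensitivity vector: relative change in k-eff per relative
# change in each of three (hypothetical) cross-section parameters.
S = np.array([0.30, -0.12, 0.05])

# Invented relative covariance matrix of those parameters.
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    2.5e-3]])

# Sandwich rule: relative variance of k-eff due to data uncertainty.
var_k = S @ C @ S
print(f"k-eff relative std. dev. from data: {np.sqrt(var_k) * 100:.3f}%")
```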

  2. Uncertainty analysis on simple mass balance model to calculate critical loads for soil acidity.

    PubMed

    Li, Harbin; McNulty, Steven G

    2007-10-01

    Simple mass balance equations (SMBE) of critical acid loads (CAL) in forest soil were developed to assess potential risks of air pollutants to ecosystems. However, to apply SMBE reliably at large scales, SMBE must be tested for adequacy and uncertainty. Our goal was to provide a detailed analysis of uncertainty in SMBE so that sound strategies for scaling up CAL estimates to the national scale could be developed. Specifically, we wanted to quantify CAL uncertainty under natural variability in 17 model parameters, and determine their relative contributions in predicting CAL. Results indicated that uncertainty in CAL came primarily from components of base cation weathering (BC(w); 49%) and acid neutralizing capacity (46%), whereas the most critical parameters were BC(w) base rate (62%), soil depth (20%), and soil temperature (11%). Thus, improvements in estimates of these factors are crucial to reducing uncertainty and successfully scaling up SMBE for national assessments of CAL.
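
    A schematic version of the Monte Carlo exercise described above is sketched below: sample the SMBE inputs from assumed distributions, evaluate a simplified critical-load expression, and rank parameter contributions by squared correlation with the output. The functional form and all distributions are illustrative stand-ins, not the paper's calibrated model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Schematic stand-ins for SMBE inputs (distributions are illustrative).
base_rate = rng.normal(50.0, 10.0, n)     # BCw base weathering rate
depth     = rng.normal(0.5, 0.05, n)      # soil depth (m)
temp      = rng.normal(8.0, 1.0, n)       # soil temperature (deg C)
anc       = rng.normal(20.0, 6.0, n)      # acid neutralizing capacity term

# Simplified critical-load expression: weathering scales with depth and
# an Arrhenius-like temperature factor, plus the ANC leaching term.
bcw = base_rate * depth * np.exp(0.05 * (temp - 8.0))
cl = bcw + anc

# Rank relative contributions by squared correlation with the output.
for name, x in [("BCw base rate", base_rate), ("soil depth", depth),
                ("soil temperature", temp), ("ANC", anc)]:
    r2 = np.corrcoef(x, cl)[0, 1] ** 2
    print(f"{name:16s} explains ~{100 * r2:.0f}% of CL variance")
```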

  3. Assessing the inherent uncertainty of one-dimensional diffusions

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Cohen, Morrel H.

    2013-01-01

    In this paper we assess the inherent uncertainty of one-dimensional diffusion processes via a stochasticity classification which provides an à la Mandelbrot categorization into five states of uncertainty: infra-mild, mild, borderline, wild, and ultra-wild. Two settings are considered. (i) Stopped diffusions: the diffusion initiates from a high level and is stopped once it first reaches a low level; in this setting we analyze the inherent uncertainty of the diffusion's maximal exceedance above its initial high level. (ii) Stationary diffusions: the diffusion is in dynamical statistical equilibrium; in this setting we analyze the inherent uncertainty of the diffusion's equilibrium level. In both settings general closed-form analytic results are established, and their application is exemplified by stock prices in the stopped-diffusions setting, and by interest rates in the stationary-diffusions setting. These results provide a highly implementable decision-making tool for the classification of uncertainty in the context of one-dimensional diffusions.

  4. Uncertainties in land use data

    NASA Astrophysics Data System (ADS)

    Castilla, G.; Hay, G. J.

    2006-11-01

    This paper deals with the description and assessment of uncertainties in gridded land use data derived from Remote Sensing observations, in the context of hydrological studies. Land use is a categorical regionalised variable returning the main socio-economic role of each location, where the role is inferred from the pattern of occupation of land. There are two main uncertainties surrounding land use data, positional and categorical. This paper focuses on the latter, as the former generally has less serious implications and is easier to tackle. The conventional method used to assess categorical uncertainty, the confusion matrix, is criticised in depth, the main critique being its inability to inform on a basic requirement for propagating uncertainty through distributed hydrological models, namely the spatial distribution of errors. Some existing alternative methods are reported, and finally the need for metadata is stressed as a more reliable means to assess the quality, and hence the uncertainty, of these data.
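
    For reference, the snippet below computes the summary statistics that a confusion matrix does provide (overall, user's and producer's accuracy) for an invented land use map; none of these numbers says where in the map the errors occur, which is exactly the limitation criticised above.

```python
import numpy as np

# Hypothetical confusion matrix for a 3-class land use map
# (rows: mapped class, columns: reference class).
cm = np.array([[50,  5,  2],
               [ 8, 40,  6],
               [ 3,  4, 32]])

overall_accuracy = np.trace(cm) / cm.sum()
users_acc = np.diag(cm) / cm.sum(axis=1)       # per mapped class
producers_acc = np.diag(cm) / cm.sum(axis=0)   # per reference class

print(f"overall accuracy    = {overall_accuracy:.2f}")
print(f"user's accuracy     = {users_acc.round(2)}")
print(f"producer's accuracy = {producers_acc.round(2)}")
```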

  5. Epistemic uncertainties and natural hazard risk assessment - Part 1: A review of the issues

    NASA Astrophysics Data System (ADS)

    Beven, K. J.; Aspinall, W. P.; Bates, P. D.; Borgomeo, E.; Goda, K.; Hall, J. W.; Page, T.; Phillips, J. C.; Rougier, J. T.; Simpson, M.; Stephenson, D. B.; Smith, P. J.; Wagener, T.; Watson, M.

    2015-12-01

    Uncertainties in natural hazard risk assessment are generally dominated by the sources arising from lack of knowledge or understanding of the processes involved. There is a lack of knowledge about frequencies, process representations, parameters, present and future boundary conditions, consequences and impacts, and the meaning of observations in evaluating simulation models. These are the epistemic uncertainties that can be difficult to constrain, especially in terms of event or scenario probabilities, even as elicited probabilities rationalized on the basis of expert judgements. This paper reviews the issues raised by trying to quantify the effects of epistemic uncertainties. Such scientific uncertainties might have significant influence on decisions that are made for risk management, so it is important to communicate the meaning of an uncertainty estimate and to provide an audit trail of the assumptions on which it is based. Some suggestions for good practice in doing so are made.

  6. Assessment of uncertainties in radiation-induced cancer risk predictions at clinically relevant doses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, J.; Moteabbed, M.; Paganetti, H., E-mail: hpaganetti@mgh.harvard.edu

    2015-01-15

    Purpose: Theoretical dose–response models offer the possibility to assess second cancer induction risks after external beam therapy. The parameters used in these models are determined with limited data from epidemiological studies. Risk estimations are thus associated with considerable uncertainties. This study aims at illustrating uncertainties when predicting the risk for organ-specific second cancers in the primary radiation field, illustrated by choosing selected treatment plans for brain cancer patients. Methods: A widely used risk model was considered in this study. The uncertainties of the model parameters were estimated with reported data of second cancer incidences for various organs. Standard error propagation was then applied to assess the uncertainty in the risk model. Next, second cancer risks of five pediatric patients treated for cancer in the head and neck regions were calculated. For each case, treatment plans for proton and photon therapy were designed to estimate the uncertainties (a) in the lifetime attributable risk (LAR) for a given treatment modality and (b) when comparing risks of two different treatment modalities. Results: Uncertainties in excess of 100% of the risk were found for almost all organs considered. When applied to treatment plans, the calculated LAR values have uncertainties of the same magnitude. A comparison between cancer risks of different treatment modalities, however, does allow statistically significant conclusions. In the studied cases, the patient-averaged LAR ratio of proton and photon treatments was 0.35, 0.56, and 0.59 for brain carcinoma, brain sarcoma, and bone sarcoma, respectively. Their corresponding uncertainties were estimated to be potentially below 5%, depending on uncertainties in dosimetry. Conclusions: The uncertainty in the dose–response curve in cancer risk models makes it currently impractical to predict the risk for an individual external beam treatment. On the other hand, the ratio of absolute risks between two modalities is less sensitive to the uncertainties in the risk model and can provide statistically significant estimates.
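
    The contrast between very uncertain absolute risks and a well-determined risk ratio can be reproduced with a toy Monte Carlo simulation: a shared, highly uncertain dose-response factor cancels in the ratio, leaving only the small modality-specific dosimetric terms. All numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Shared model uncertainty: a multiplicative dose-response factor that
# is highly uncertain (>100% relative) but identical for both modalities.
shared = rng.lognormal(0.0, 1.0, n)

# Small, independent dosimetric uncertainties per modality (~3% each).
dosim_p = rng.normal(1.0, 0.03, n)
dosim_x = rng.normal(1.0, 0.03, n)

lar_proton = 0.35 * shared * dosim_p     # hypothetical baseline LARs
lar_photon = 1.00 * shared * dosim_x

for name, x in [("LAR (proton)", lar_proton),
                ("LAR ratio   ", lar_proton / lar_photon)]:
    print(f"{name}: relative std = {x.std() / x.mean():.2f}")
# The absolute risk inherits the >100% model uncertainty; the ratio
# retains only the few-percent dosimetric terms.
```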

  7. Uncertainties in climate change projections for viticulture in Portugal

    NASA Astrophysics Data System (ADS)

    Fraga, Helder; Malheiro, Aureliano C.; Moutinho-Pereira, José; Pinto, Joaquim G.; Santos, João A.

    2013-04-01

    The assessment of climate change impacts on viticulture is often carried out using regional climate model (RCM) outputs. These studies rely on either multi-model ensembles or on single-model approaches. RCM ensembles account for uncertainties inherent to the different models. In this study, using a 16-RCM ensemble under the IPCC A1B scenario, the climate change signal (future minus recent past, 2041-2070 minus 1961-2000) of four bioclimatic indices (Huglin Index - HI, Dryness Index - DI, Hydrothermal Index - HyI and Composite Index - CompI) over mainland Portugal is analysed. A normalized interquartile range (NIQR) of the 16-member ensemble is computed for each bioclimatic index in order to quantify the ensemble uncertainty. The results show significant increases in the HI index over most of Portugal, with higher values in the Alentejo, Trás-os-Montes and Douro/Porto wine regions, also depicting very low uncertainty. Conversely, the decreases in the DI pattern throughout the country show large uncertainties, except in Minho (northwestern Portugal), where precipitation reaches the highest amounts in Portugal. The HyI shows significant decreases in northwestern Portugal, with relatively low uncertainty all across the country. The CompI depicts significant decreases over Alentejo and increases over Minho, though the decreases over Alentejo reveal high uncertainty, while the increases over Minho show low uncertainty. The assessment of the uncertainty in climate change projections is of great relevance for the wine industry. Quantifying this uncertainty is crucial, since different models may lead to quite different outcomes and may thereby be as crucial as climate change itself to the winemaking sector. This work is supported by European Union Funds (FEDER/COMPETE - Operational Competitiveness Programme) and by national funds (FCT - Portuguese Foundation for Science and Technology) under the project FCOMP-01-0124-FEDER-022692.
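
    A minimal sketch of the ensemble-uncertainty measure used above: compute the interquartile range of the 16-member climate change signal and normalize it, here by the absolute ensemble median (one plausible normalization; the paper's exact definition may differ). The grid-cell values are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical climate-change signals of a bioclimatic index from a
# 16-member RCM ensemble at two grid cells (values are invented).
signal = {
    "Douro":    rng.normal(300.0, 15.0, 16),  # strong, consistent increase
    "Alentejo": rng.normal(40.0, 35.0, 16),   # weak, model-dependent change
}

# Normalized interquartile range: ensemble IQR scaled by the absolute
# ensemble median, as a unitless measure of ensemble spread.
for cell, vals in signal.items():
    q25, q50, q75 = np.percentile(vals, [25, 50, 75])
    niqr = (q75 - q25) / abs(q50)
    print(f"{cell:9s} median signal = {q50:7.1f}, NIQR = {niqr:.2f}")
```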

  8. Uncertainty assessment of a model for biological nitrogen and phosphorus removal: Application to a large wastewater treatment plant

    NASA Astrophysics Data System (ADS)

    Mannina, Giorgio; Cosenza, Alida; Viviani, Gaspare

    In the last few years, the use of mathematical models of WasteWater Treatment Plant (WWTP) processes has become a common way to predict WWTP behaviour. However, mathematical models generally demand advanced input for their implementation that must be evaluated by an extensive data-gathering campaign, which cannot always be carried out. This fact, together with the intrinsic complexity of the model structure, leads to model results that may be very uncertain. Quantification of the uncertainty is imperative. However, despite the importance of uncertainty quantification, only few studies have been carried out in the wastewater treatment field, and those studies only included a few of the sources of model uncertainty. Seeking to develop the area, this paper presents the uncertainty assessment of a mathematical model simulating biological nitrogen and phosphorus removal. The uncertainty assessment was conducted according to the Generalised Likelihood Uncertainty Estimation (GLUE) methodology, which has scarcely been applied in the wastewater field. The model was based on activated-sludge models 1 (ASM1) and 2 (ASM2). Different approaches can be used for uncertainty analysis. The GLUE methodology requires a large number of Monte Carlo simulations in which a random sampling of individual parameters drawn from probability distributions is used to determine a set of parameter values. Using this approach, model reliability was evaluated based on its capacity to globally limit the uncertainty. The method was applied to a large full-scale WWTP for which quantity and quality data were gathered. The analysis provided useful insights for WWTP modelling, identifying the crucial aspects where the greatest uncertainty lies and where, therefore, more effort should be invested in both data gathering and modelling practice.
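
    A compact sketch of the GLUE procedure described above, using a toy one-output model in place of the ASM-based plant model: sample parameters from prior ranges, score each set with an informal likelihood, retain the behavioural sets above a subjective threshold, and derive prediction limits from the retained ensemble. All values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(k, s, t):
    """Toy stand-in for the plant model: first-order decay of an
    effluent concentration toward a steady-state value of 2."""
    return s * np.exp(-k * t) + 2.0

# Synthetic "observations" from known parameters plus noise.
t_obs = np.linspace(0.0, 10.0, 20)
obs = model(0.35, 8.0, t_obs) + rng.normal(0.0, 0.3, t_obs.size)

# GLUE step 1: Monte Carlo sampling of parameters from prior ranges.
n = 20_000
k = rng.uniform(0.05, 1.0, n)
s = rng.uniform(2.0, 15.0, n)
sse = np.array([np.sum((model(ki, si, t_obs) - obs) ** 2)
                for ki, si in zip(k, s)])

# GLUE step 2: informal likelihood and subjective behavioural cutoff.
likelihood = np.exp(-sse / sse.min())
behavioural = likelihood > 0.1

# GLUE step 3: prediction limits from the behavioural ensemble
# (a full implementation would use likelihood-weighted quantiles).
sims = np.array([model(ki, si, t_obs)
                 for ki, si in zip(k[behavioural], s[behavioural])])
lo, hi = np.percentile(sims, [5, 95], axis=0)
print(f"{behavioural.sum()} behavioural sets; "
      f"90% band width at mid-record: {(hi - lo)[10]:.2f}")
```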

  9. Uncertainty principle in loop quantum cosmology by Moyal formalism

    NASA Astrophysics Data System (ADS)

    Perlov, Leonid

    2018-03-01

    In this paper, we derive the uncertainty principle for the loop quantum cosmology homogeneous and isotropic Friedmann-Lemaître-Robertson-Walker model with the holonomy-flux algebra. The uncertainty principle is between the variable c, with the meaning of a connection, and μ, with the meaning of the physical cell volume to the power 2/3, i.e., v^(2/3), or a plaquette area. Since both μ and c are not operators, but rather random variables, the Robertson uncertainty principle derivation that works for Hermitian operators cannot be used. Instead we use the Wigner-Moyal-Groenewold phase space formalism. The Wigner-Moyal-Groenewold formalism was originally applied to the Heisenberg algebra of quantum mechanics. One can derive from it both the canonical and path integral quantum mechanics as well as the uncertainty principle. In this paper, we apply it to the holonomy-flux algebra in the case of the homogeneous and isotropic space. Another result is the expression for the Wigner function on the space of the cylindrical wave functions defined on R_b in the c variables rather than in the dual-space μ variables.

  10. The triaxial ellipsoid size, density, and rotational pole of asteroid (16) Psyche from Keck and Gemini AO observations 2004-2015

    NASA Astrophysics Data System (ADS)

    Drummond, Jack D.; Merline, William J.; Carry, Benoit; Conrad, Al; Reddy, Vishnu; Tamblyn, Peter; Chapman, Clark R.; Enke, Brian L.; Pater, Imke de; Kleer, Katherine de; Christou, Julian; Dumas, Christophe

    2018-05-01

    We analyze a comprehensive set of our adaptive optics (AO) images taken at the 10 m W. M. Keck telescope and the 8 m Gemini telescope to derive values for the size, shape, and rotational pole of asteroid (16) Psyche. Our fit of a large number of AO images, spanning 14 years and covering a range of viewing geometries, allows a well-constrained model that yields small uncertainties in all measured and derived parameters, including triaxial ellipsoid dimensions, rotational pole, volume, and density. We find a best-fit set of triaxial ellipsoid diameters of (a,b,c) = (274 ± 9, 231 ± 7, 176 ± 7) km, with an average diameter of 223 ± 7 km. Continuing the literature review of Carry (2012), we find a new mass for Psyche of (2.43 ± 0.35) × 10^19 kg that, with the volume from our size, leads to a density estimate of 4.16 ± 0.64 g/cm3. The largest contribution to the uncertainty in the density, however, still comes from the uncertainty in the mass, not our volume. Psyche's M classification, combined with its high radar albedo, suggests at least a surface metallic composition. If Psyche is composed of pure nickel-iron, the density we derive implies a macro-porosity of 47%, suggesting that it may be an exposed, disrupted, and reassembled core of a Vesta-like planetesimal. The rotational pole position (critical for planning spacecraft mission operations) that we find is consistent with others, but with a reduced uncertainty: [RA; Dec] = [32°; +5°], or ecliptic [λ; δ] = [32°; -8°], with an uncertainty radius of 3°. Our results provide independent measurements of fundamental parameters for this M-type asteroid, and demonstrate that the parameters are well determined by all techniques, including setting the prime meridian over the longest principal axis. The 5.00 year orbital period of Psyche produces only four distinct opposition geometries, suggesting that observations before the arrival of Psyche Mission in 2030 should perhaps emphasize observations away from opposition, although the penalty then would be that the asteroid will be fainter and further than at opposition.
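
    The density and its uncertainty quoted above follow from the reported mass and triaxial diameters by elementary propagation, as the short check below reproduces (ellipsoid volume V = (π/6)abc for diameters a, b, c; first-order combination of relative errors):

```python
import numpy as np

# Numbers from the abstract: triaxial diameters and mass, with 1-sigma
# uncertainties, converted to SI units.
a, b, c = 274e3, 231e3, 176e3          # metres
sa, sb, sc = 9e3, 7e3, 7e3             # metres
mass, smass = 2.43e19, 0.35e19         # kg

# Ellipsoid volume V = (pi/6) * a * b * c for triaxial diameters,
# with its relative uncertainty from the three diameter terms.
vol = np.pi / 6.0 * a * b * c
rel_vol = np.sqrt((sa / a) ** 2 + (sb / b) ** 2 + (sc / c) ** 2)

# Density and its uncertainty by standard first-order propagation.
rho = mass / vol                        # kg/m^3
rel_rho = np.sqrt((smass / mass) ** 2 + rel_vol ** 2)
print(f"volume  = {vol:.2e} m^3 (+/- {100 * rel_vol:.1f}%)")
print(f"density = {rho / 1000:.2f} +/- {rel_rho * rho / 1000:.2f} g/cm^3")
# -> ~4.17 +/- 0.65 g/cm^3, matching the reported 4.16 +/- 0.64; the
#    mass term (~14%) dominates the volume term (~6%), as stated.
```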

  11. Experimental techniques for the calibration of lidar depolarization channels in EARLINET

    NASA Astrophysics Data System (ADS)

    Belegante, Livio; Bravo-Aranda, Juan Antonio; Freudenthaler, Volker; Nicolae, Doina; Nemuc, Anca; Ene, Dragos; Alados-Arboledas, Lucas; Amodeo, Aldo; Pappalardo, Gelsomina; D'Amico, Giuseppe; Amato, Francesco; Engelmann, Ronny; Baars, Holger; Wandinger, Ulla; Papayannis, Alexandros; Kokkalis, Panos; Pereira, Sérgio N.

    2018-02-01

    Particle depolarization ratios retrieved from lidar measurements are commonly used for aerosol-typing studies, microphysical inversions, or mass concentration retrievals. The particle depolarization ratio is one of the primary parameters that can differentiate several major aerosol components, but only if the measurements are accurate enough. The accuracy of the retrieval of particle depolarization ratios is the driving factor for assessing and improving the uncertainties of the depolarization products. This paper presents different depolarization calibration procedures used to improve the quality of the depolarization data. The results illustrate a significant improvement of the depolarization lidar products for all the selected lidar stations that have implemented depolarization calibration procedures. The calibrated volume and particle depolarization profiles at 532 nm show values that fall within a range that is generally accepted in the literature.

  12. Assessment of interpatient heterogeneity in tumor radiosensitivity for nonsmall cell lung cancer using tumor-volume variation data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chvetsov, Alexei V., E-mail: chvetsov2@gmail.com; Schwartz, Jeffrey L.; Mayr, Nina

    2014-06-15

    Purpose: In our previous work, the authors showed that a distribution of cell surviving fractions S2 in a heterogeneous group of patients could be derived from tumor-volume variation curves during radiotherapy for head and neck cancer. In this research study, the authors show that this algorithm can be applied to other tumors, specifically in nonsmall cell lung cancer. This new application includes larger patient volumes and includes comparison of data sets obtained at independent institutions. Methods: Our analysis was based on two data sets of tumor-volume variation curves for heterogeneous groups of 17 patients treated for nonsmall cell lung cancer with conventional dose fractionation. The data sets were obtained previously at two independent institutions by using megavoltage computed tomography. Statistical distributions of cell surviving fractions S2 and clearance half-lives of lethally damaged cells T1/2 have been reconstructed in each patient group by using a version of the two-level cell population model of tumor response and a simulated annealing algorithm. The reconstructed statistical distributions of the cell surviving fractions have been compared to the distributions measured using predictive assays in vitro. Results: Nonsmall cell lung cancer presents certain difficulties for modeling surviving fractions using tumor-volume variation curves because of relatively large fractional hypoxic volume, low gradient of tumor-volume response, and possible uncertainties due to breathing motion. Despite these difficulties, cell surviving fractions S2 for nonsmall cell lung cancer derived from tumor-volume variation measured at different institutions have similar probability density functions (PDFs) with mean values of 0.30 and 0.43 and standard deviations of 0.13 and 0.18, respectively. The PDFs for cell surviving fractions S2 reconstructed from tumor volume variation agree with the PDF measured in vitro. Conclusions: The data obtained in this work, when taken together with the data obtained previously for head and neck cancer, suggests that the cell surviving fractions S2 can be reconstructed from the tumor volume variation curves measured during radiotherapy with conventional fractionation. The proposed method can be used for treatment evaluation and adaptation.

  13. Assessment of interpatient heterogeneity in tumor radiosensitivity for nonsmall cell lung cancer using tumor-volume variation data.

    PubMed

    Chvetsov, Alexei V; Yartsev, Slav; Schwartz, Jeffrey L; Mayr, Nina

    2014-06-01

    In our previous work, the authors showed that a distribution of cell surviving fractions S2 in a heterogeneous group of patients could be derived from tumor-volume variation curves during radiotherapy for head and neck cancer. In this research study, the authors show that this algorithm can be applied to other tumors, specifically in nonsmall cell lung cancer. This new application includes larger patient volumes and includes comparison of data sets obtained at independent institutions. Our analysis was based on two data sets of tumor-volume variation curves for heterogeneous groups of 17 patients treated for nonsmall cell lung cancer with conventional dose fractionation. The data sets were obtained previously at two independent institutions by using megavoltage computed tomography. Statistical distributions of cell surviving fractions S2 and clearance half-lives of lethally damaged cells T(1/2) have been reconstructed in each patient group by using a version of the two-level cell population model of tumor response and a simulated annealing algorithm. The reconstructed statistical distributions of the cell surviving fractions have been compared to the distributions measured using predictive assays in vitro. Nonsmall cell lung cancer presents certain difficulties for modeling surviving fractions using tumor-volume variation curves because of relatively large fractional hypoxic volume, low gradient of tumor-volume response, and possible uncertainties due to breathing motion. Despite these difficulties, cell surviving fractions S2 for nonsmall cell lung cancer derived from tumor-volume variation measured at different institutions have similar probability density functions (PDFs) with mean values of 0.30 and 0.43 and standard deviations of 0.13 and 0.18, respectively. The PDFs for cell surviving fractions S2 reconstructed from tumor volume variation agree with the PDF measured in vitro. The data obtained in this work, when taken together with the data obtained previously for head and neck cancer, suggests that the cell surviving fractions S2 can be reconstructed from the tumor volume variation curves measured during radiotherapy with conventional fractionation. The proposed method can be used for treatment evaluation and adaptation.

  14. Quantification of uncertainty in flood risk assessment for flood protection planning: a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel

    2017-04-01

    Flood risk estimates are subject to significant uncertainties, e.g. due to limited records of historic flood events, uncertainty in flood modeling, uncertain impact of climate change or uncertainty in the exposure and loss estimates. In traditional design of flood protection systems, these uncertainties are typically just accounted for implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for planning of flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, incomplete ensemble, local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties are possibly reduced in the future when new evidence - such as new climate models, observed extreme events, and socio-economic data - becomes available. Finally, we look into how this new evidence influences the risk assessment and effectivity of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria that includes the city of Rosenheim, which suffered significant losses during the 2013 flood event.
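
    A stripped-down sketch of the Bayesian treatment of component 1) above (inherent plus limited-record uncertainty): with a short synthetic annual-maximum record and a Gumbel model, the posterior predictive exceedance probability integrates parameter uncertainty out, unlike a plain best-fit estimate. The distribution choice, flat priors and all numbers are illustrative, not the AdaptRisk setup.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Short synthetic record of annual maximum discharge (m^3/s); with so
# few years, the statistical (limited-record) uncertainty is large.
record = stats.gumbel_r.rvs(loc=300.0, scale=80.0, size=30, random_state=rng)

# Grid-based Bayesian inference for the Gumbel parameters (flat priors).
locs = np.linspace(200.0, 400.0, 120)
scales = np.linspace(40.0, 160.0, 120)
L, S = np.meshgrid(locs, scales)
loglik = stats.gumbel_r.logpdf(record[:, None, None], loc=L, scale=S).sum(axis=0)
post = np.exp(loglik - loglik.max())
post /= post.sum()

# Posterior predictive probability that a design discharge is exceeded
# in a given year: parameter uncertainty is integrated out, unlike the
# plain best-fit estimate below it.
design = 700.0
p_exceed = np.sum(post * stats.gumbel_r.sf(design, loc=L, scale=S))
p_fit = stats.gumbel_r.sf(design, *stats.gumbel_r.fit(record))
print(f"posterior predictive P(exceed) = {p_exceed:.4f}")
print(f"best-fit estimate              = {p_fit:.4f}")
```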

  15. A Review On Accuracy and Uncertainty of Spatial Data and Analyses with special reference to Urban and Hydrological Modelling

    NASA Astrophysics Data System (ADS)

    Devendran, A. A.; Lakshmanan, G.

    2014-11-01

    Data quality for GIS processing and analysis is becoming an increasing concern due to the accelerated application of GIS technology to problem solving and decision making. Uncertainty in the geographic representation of the real world arises because these representations are incomplete. Identifying the sources of these uncertainties and the ways in which they operate in GIS-based representations becomes crucial in any spatial data representation and in geospatial analysis applied to any field. This paper reviews articles on the various components of spatial data quality and the uncertainties inherent in them, with special focus on two fields of application: urban simulation and hydrological modelling. Urban growth is a complicated process involving the spatio-temporal changes of all socio-economic and physical components at different scales. The Cellular Automata (CA) model is one of the simulation models used; it randomly selects potential cells for urbanisation, and transition rules evaluate the properties of each cell and its neighbours. Uncertainty arising from CA modelling is assessed mainly using sensitivity analysis, including the Monte Carlo simulation method. Likewise, the importance of hydrological uncertainty analysis has been emphasized in recent years, and there is an urgent need to incorporate uncertainty estimation into water resources assessment procedures. The Soil and Water Assessment Tool (SWAT) is a continuous-time watershed model for evaluating various impacts of land use management and climate on hydrology and water quality. Hydrological model uncertainties in SWAT are addressed primarily with the Generalized Likelihood Uncertainty Estimation (GLUE) method.

  16. The significance of spatial variability of rainfall on streamflow: A synthetic analysis at the Upper Lee catchment, UK

    NASA Astrophysics Data System (ADS)

    Pechlivanidis, Ilias; McIntyre, Neil; Wheater, Howard

    2017-04-01

    Rainfall, one of the main inputs in hydrological modeling, is a highly heterogeneous process over a wide range of spatial scales, and hence ignoring spatial rainfall information can affect the simulated streamflow. Calibration of hydrological model parameters is rarely a straightforward task due to parameter equifinality and the tendency of parameters to compensate for other uncertainties, i.e. structural and forcing-input errors. Here, we analyse the significance of the spatial variability of rainfall for streamflow as a function of catchment scale and type, and antecedent conditions, using the continuous-time, semi-distributed PDM hydrological model at the Upper Lee catchment, UK. The impact of catchment scale and type is assessed using 11 nested catchments ranging in scale from 25 to 1040 km2, and further assessed by artificially changing the catchment characteristics and translating these to model parameters, with uncertainty, using model regionalisation. Synthetic rainfall events are introduced to directly relate the change in simulated streamflow to the spatial variability of rainfall. Overall, we conclude that antecedent catchment wetness and catchment type play an important role in controlling the significance of the spatial distribution of rainfall for streamflow. Results show a relationship between hydrograph characteristics (streamflow peak and volume) and the degree of spatial variability of rainfall for the impermeable catchments under dry antecedent conditions, although this decreases at larger scales; this sensitivity is significantly undermined under wet antecedent conditions. Although there is an indication that the impact of spatial rainfall on streamflow varies as a function of catchment scale, the variability of antecedent conditions between the synthetic catchments seems to mask this significance. Finally, hydrograph responses to different spatial patterns in rainfall depend on the assumptions used for model parameter estimation and on the spatial variation in parameters, indicating the need for an uncertainty framework in such investigations.

  17. SU-F-T-316: A Model to Deal with Dosimetric and Delivery Uncertainties in Radiotherapy Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haering, P; Lang, C; Splinter, M

    2016-06-15

    Purpose: The conventional way of dealing with uncertainties resulting from dose calculation or beam delivery in IMRT is to perform verification measurements for the plan in question. Here we present an alternative based on recommendations given in the AAPM 142 report and on treatment-specific parameters that model the uncertainties of the plan delivery. Methods: The basis of the model is the assignment of uncertainty parameters to all segment fields or control-point sequences of a plan. Each field shape is analyzed for complexity, dose rate, number of MU, and field-size-related output, as well as factors for in/out-of-field position and penumbra regions. Together with depth-related uncertainties, a 3D matrix is generated by a projection algorithm. Patient anatomy is included as an uncertainty CT data set as well: object density is classified into four categories (close to water, lung, bone, and gradient regions) with additional uncertainties. The result is then exported as a DICOM dose file by the software tool (written in IDL, Exelis) with the given resolution and target point. Results: Uncertainty matrices for several patient cases have been calculated and compared side by side in the planning system. The result is not always intuitive, but it clearly indicates high and low uncertainties related to OARs and target volumes, as well as to measured gamma distributions. Conclusion: The imported uncertainty data sets may help the treatment planner understand the complexity of the treatment plan. The planner might then decide to change the plan to produce a more suitable uncertainty distribution, e.g. by changing the beam angles to influence high-uncertainty spots, or to try another treatment setup, resulting in a plan with lower uncertainties. A next step could be to include such a model in the optimization algorithm to add a new dose-uncertainty constraint.
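
    A sketch of the anatomy-classification step described under Methods, assuming illustrative HU thresholds and uncertainty factors; the abstract names the four density classes but not their numeric values.

        import numpy as np

        def anatomy_uncertainty(ct_hu, base=0.01):
            # Attach a relative uncertainty to each voxel according to the four
            # density classes named in the abstract; thresholds and factors are
            # placeholders, not the authors' calibrated values.
            u = np.full(ct_hu.shape, base)                    # close to water
            u[ct_hu < -300] = 3 * base                        # lung-like
            u[ct_hu > 300] = 2 * base                         # bone-like
            u[np.abs(np.gradient(ct_hu)) > 200] += 2 * base   # gradient regions
            return u

        profile = np.array([-700.0, -650.0, 20.0, 40.0, 500.0, 520.0])  # HU
        print(anatomy_uncertainty(profile))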

  18. When Spreadsheets Become Software - Quality Control Challenges and Approaches - 13360

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fountain, Stefanie A.; Chen, Emmie G.; Beech, John F.

    2013-07-01

    As part of a preliminary waste acceptance criteria (PWAC) development, several commercial models were employed, including the Hydrologic Evaluation of Landfill Performance model (HELP) [1], the Disposal Unit Source Term - Multiple Species model (DUSTMS) [2], and the Analytical Transient One, Two, and Three-Dimensional model (AT123D) [3]. The results of these models were post-processed in MS Excel spreadsheets to convert the model results to alternate units, compare the groundwater concentrations to the groundwater concentration thresholds, and then to adjust the waste contaminant masses (based on average concentration over the waste volume) as needed in an attempt to achieve groundwater concentrations at the limiting point of assessment that would meet the compliance concentrations while maximizing the potential use of the landfill (i.e., maximizing the volume of projected waste being generated that could be placed in the landfill). During the course of the PWAC calculation development, one of the Microsoft (MS) Excel spreadsheets used to post-process the results of the commercial model packages grew to include more than 575,000 formulas across 18 worksheets. This spreadsheet was used to assess six base scenarios as well as nine uncertainty/sensitivity scenarios. The complexity of the spreadsheet resulted in the need for a rigorous quality control (QC) procedure to verify data entry and confirm the accuracy of formulas. (authors)
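
    One QC step that can be automated is a formula inventory. The sketch below (Python with openpyxl) counts formulas and unique formula patterns per worksheet, so reviewers can spot cells that deviate from the expected pattern; the file name is a placeholder and the approach is an illustration, not the procedure stipulated in the paper.

        from openpyxl import load_workbook

        # Load with data_only=False so formula strings, not cached values, are read.
        wb = load_workbook("pwac_postprocessing.xlsx", data_only=False)
        formulas = {}
        for ws in wb.worksheets:
            for row in ws.iter_rows():
                for cell in row:
                    if cell.data_type == "f":   # formula cell
                        formulas.setdefault(ws.title, []).append(
                            (cell.coordinate, cell.value))

        for sheet, items in formulas.items():
            unique = {f for _, f in items}
            print(f"{sheet}: {len(items)} formulas, {len(unique)} unique patterns")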

  19. The uncertainty cascade in flood risk assessment under changing climatic conditions - the Biala Tarnowska case study

    NASA Astrophysics Data System (ADS)

    Doroszkiewicz, Joanna; Romanowicz, Renata

    2016-04-01

    Uncertainty in the results of a hydraulic model is not only associated with the limitations of that model and the shortcomings of data. An important factor with a major impact on the uncertainty of flood risk assessment under changing climate conditions is the uncertainty of future climate scenarios (IPCC WG I, 2013). Future climate projections provided by global climate models are used to generate the future runoff required as input to the hydraulic models applied in the derivation of flood risk maps. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections from the EURO-CORDEX project. The study describes the cascade of uncertainty through the different stages of the derivation of flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow routing model, the propagation of that uncertainty through the hydraulic model and, finally, the uncertainty related to the derivation of flood risk maps. One of the aims of this study is an assessment of the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the process, an assessment of the total uncertainty of maps of inundation probability might be computationally very time-consuming. As a way forward, we present an application of a hydraulic model simulator based on a nonlinear transfer function model for chosen locations along the river reach. The transfer function model parameters are estimated from simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of the simulator substantially reduces the computing requirements of deriving flood risk maps under future climatic conditions. Acknowledgements: This work was supported by the project CHIHE (Climate Change Impact on Hydrological Extremes), carried out in the Institute of Geophysics Polish Academy of Sciences, funded by Norway Grants (contract No. Pol-Nor/196243/80/2013). The hydro-meteorological observations were provided by the Institute of Meteorology and Water Management (IMGW), Poland.
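
    In outline, the emulator idea reads like the sketch below: fit a cheap nonlinear function to a handful of full hydraulic-model runs at one cross-section, then evaluate it for thousands of scenario flows. The quadratic-in-log form and all numbers are illustrative assumptions; the paper's transfer function model is not reproduced here.

        import numpy as np

        # Water levels from a few full hydraulic-model runs at one cross-section
        # (values invented for illustration).
        inflow_runs = np.array([50.0, 100.0, 200.0, 400.0, 800.0])  # m3/s
        level_runs = np.array([1.2, 1.9, 2.8, 4.0, 5.6])            # m

        # Cheap emulator: low-order polynomial in log(inflow).
        coeff = np.polyfit(np.log(inflow_runs), level_runs, deg=2)
        def emulator(q):
            return np.polyval(coeff, np.log(q))

        # Propagate a large ensemble of climate-driven flows at negligible cost.
        rng = np.random.default_rng(1)
        scenario_flows = rng.lognormal(np.log(200.0), 0.5, size=10_000)
        levels = emulator(scenario_flows)
        print(f"90% range of emulated levels: {np.percentile(levels, 5):.2f}"
              f" to {np.percentile(levels, 95):.2f} m")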

  20. Dealing with unquantifiable uncertainties in landslide modelling for urban risk reduction in developing countries

    NASA Astrophysics Data System (ADS)

    Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten

    2016-04-01

    Landslides have many negative economic and societal impacts, including the potential for significant loss of life and damage to infrastructure. Slope stability assessment can be used to guide decisions about the management of landslide risk, but its usefulness can be challenged by high levels of uncertainty in predicting landslide occurrence. Prediction uncertainty may be associated with the choice of model that is used to assess slope stability, the quality of the available input data, or a lack of knowledge of how future climatic and socio-economic changes may affect future landslide risk. While some of these uncertainties can be characterised by relatively well-defined probability distributions, for other uncertainties, such as those linked to climate change, no probability distribution is available to characterise them. This latter type of uncertainty, often referred to as deep uncertainty, means that robust policies need to be developed that are expected to perform acceptably well over a wide range of future conditions. In our study the impact of deep uncertainty on slope stability predictions is assessed in a quantitative and structured manner using Global Sensitivity Analysis (GSA) and the Combined Hydrology and Stability Model (CHASM). In particular, we use several GSA methods including the Method of Morris, Regional Sensitivity Analysis and Classification and Regression Trees (CART), as well as advanced visualization tools, to assess the combination of conditions that may lead to slope failure. Our example application is a slope in the Caribbean, an area that is naturally susceptible to landslides due to a combination of high rainfall rates during the hurricane season, steep slopes, and highly weathered residual soils. Rapid unplanned urbanisation and changing climate may further exacerbate landslide risk in the future. Our example shows how we can gain useful information in the presence of deep uncertainty by combining physically based models with GSA in a scenario discovery framework.
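
    A minimal sketch of Morris screening, one of the GSA methods named above, using the SALib library and a hypothetical factor-of-safety surrogate in place of CHASM (parameter names, bounds and the response function are all invented for illustration):

        import numpy as np
        from SALib.sample.morris import sample as morris_sample
        from SALib.analyze.morris import analyze as morris_analyze

        def factor_of_safety(x):
            # Toy response: stability falls with slope angle and pore pressure,
            # rises with cohesion. A stand-in for the CHASM model.
            cohesion, angle, pore_pressure = x
            return (cohesion + 10.0) / (0.5 * angle + 20.0 * pore_pressure)

        problem = {
            "num_vars": 3,
            "names": ["cohesion_kPa", "slope_angle_deg", "pore_pressure_ratio"],
            "bounds": [[5, 30], [20, 45], [0.0, 0.6]],
        }
        X = morris_sample(problem, N=100, num_levels=4)
        Y = np.apply_along_axis(factor_of_safety, 1, X)
        res = morris_analyze(problem, X, Y, print_to_console=False)
        for name, mu_star, sigma in zip(problem["names"],
                                        res["mu_star"], res["sigma"]):
            print(f"{name}: mu*={mu_star:.3f}, sigma={sigma:.3f}")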

  1. Hotspots of uncertainty in land-use and land-cover change projections: A global-scale model comparison

    DOE PAGES

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D. A.; ...

    2016-05-02

    Model-based global projections of future land use and land cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socio-economic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g. boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process as well as improving the allocation mechanisms of LULC change models remain important challenges. Furthermore, current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity.
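
    The attribution step ("regression analysis and analysis of variance") can be sketched as follows; the twelve rows and factor levels below are invented stand-ins for the 43-simulation ensemble, purely to show the mechanics:

        import pandas as pd
        import statsmodels.formula.api as smf
        from statsmodels.stats.anova import anova_lm

        # One row per model run: projected LULC change plus the factors to
        # which its variance is attributed (all values invented).
        df = pd.DataFrame({
            "change": [3.1, 2.9, 5.0, 4.8, 1.2, 1.1,
                       3.3, 3.0, 5.2, 4.7, 1.4, 0.9],
            "input_data": ["A", "A", "B", "B"] * 3,
            "model_type": ["IAM", "CGE"] * 6,
            "scenario": ["SSP1"] * 4 + ["SSP3"] * 4 + ["SSP5"] * 4,
        })

        # Apportion variance to input data, model structure and scenario
        # storyline; the leftover is the residual term.
        fit = smf.ols("change ~ C(input_data) + C(model_type) + C(scenario)",
                      df).fit()
        table = anova_lm(fit)
        print(table["sum_sq"] / table["sum_sq"].sum())  # fraction of variance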

  2. Hotspots of uncertainty in land-use and land-cover change projections: a global-scale model comparison.

    PubMed

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D A; Arneth, Almut; Calvin, Katherine; Doelman, Jonathan; Eitelberg, David A; Engström, Kerstin; Fujimori, Shinichiro; Hasegawa, Tomoko; Havlik, Petr; Humpenöder, Florian; Jain, Atul K; Krisztin, Tamás; Kyle, Page; Meiyappan, Prasanth; Popp, Alexander; Sands, Ronald D; Schaldach, Rüdiger; Schüngel, Jan; Stehfest, Elke; Tabeau, Andrzej; Van Meijl, Hans; Van Vliet, Jasper; Verburg, Peter H

    2016-12-01

    Model-based global projections of future land-use and land-cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socioeconomic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios, we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g., boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process and improving the allocation mechanisms of LULC change models remain important challenges. Current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches, and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity. © 2016 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.

  3. Hotspots of uncertainty in land-use and land-cover change projections: A global-scale model comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D. A.

    Model-based global projections of future land use and land cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socio-economic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g. boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process as well as improving the allocation mechanisms of LULC change models remain important challenges. Furthermore, current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity.

  4. Uncertainty of sensory signal explains variation of color constancy.

    PubMed

    Witzel, Christoph; van Alphen, Carlijn; Godau, Christoph; O'Regan, J Kevin

    2016-12-01

    Color constancy is the ability to recognize the color of an object (or more generally of a surface) under different illuminations. Without color constancy, surface color as a perceptual attribute would not be meaningful in the visual environment, where illumination changes all the time. Nevertheless, it is not obvious how color constancy is possible in the light of metamer mismatching. Surfaces that produce exactly the same sensory color signal under one illumination (metamerism) may produce utterly different sensory signals under another illumination (metamer mismatching). Here we show that this phenomenon explains to a large extent the variation of color constancy across different colors. For this purpose, color constancy was measured for different colors in an asymmetric matching task with photorealistic images. Color constancy performance was strongly correlated with the size of metamer mismatch volumes, which describe the uncertainty of the sensory signal due to metamer mismatching for a given color. The higher the uncertainty of the sensory signal, the lower the observers' color constancy. At the same time, sensory singularities, color categories, and cone ratios did not affect color constancy. The present findings not only provide considerable insight into the determinants of color constancy but also show that metamer mismatch volumes must be taken into account when investigating color as a perceptual property of objects and surfaces.

  5. Assessment of Laminar, Convective Aeroheating Prediction Uncertainties for Mars Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Hollis, Brian R.; Prabhu, Dinesh K.

    2011-01-01

    An assessment of computational uncertainties is presented for numerical methods used by NASA to predict laminar, convective aeroheating environments for Mars entry vehicles. A survey was conducted of existing experimental heat-transfer and shock-shape data for high enthalpy, reacting-gas CO2 flows and five relevant test series were selected for comparison to predictions. Solutions were generated at the experimental test conditions using NASA state-of-the-art computational tools and compared to these data. The comparisons were evaluated to establish predictive uncertainties as a function of total enthalpy and to provide guidance for future experimental testing requirements to help lower these uncertainties.

  6. Assessment of Laminar, Convective Aeroheating Prediction Uncertainties for Mars-Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Hollis, Brian R.; Prabhu, Dinesh K.

    2013-01-01

    An assessment of computational uncertainties is presented for numerical methods used by NASA to predict laminar, convective aeroheating environments for Mars-entry vehicles. A survey was conducted of existing experimental heat transfer and shock-shape data for high-enthalpy reacting-gas CO2 flows, and five relevant test series were selected for comparison with predictions. Solutions were generated at the experimental test conditions using NASA state-of-the-art computational tools and compared with these data. The comparisons were evaluated to establish predictive uncertainties as a function of total enthalpy and to provide guidance for future experimental testing requirements to help lower these uncertainties.

  7. A Methodology for Robust Comparative Life Cycle Assessments Incorporating Uncertainty.

    PubMed

    Gregory, Jeremy R; Noshadravan, Arash; Olivetti, Elsa A; Kirchain, Randolph E

    2016-06-21

    We propose a methodology for conducting robust comparative life cycle assessments (LCA) by leveraging uncertainty. The method evaluates a broad range of the possible scenario space in a probabilistic fashion while simultaneously considering uncertainty in input data. It is intended to ascertain which scenarios have a definitively preferable alternative among those being compared (and how significant the differences are given parameter uncertainty), which parameters have the most influence on this difference, and how resolvable scenarios (where one alternative has a clearly lower environmental impact) can be identified. This is accomplished via an aggregated probabilistic scenario-aware analysis, followed by an assessment of which scenarios have resolvable alternatives. Decision-tree partitioning algorithms are used to isolate meaningful scenario groups. In instances where the alternatives cannot be resolved for scenarios of interest, influential parameters are identified using sensitivity analysis. If those parameters can be refined, the process can be iterated using the refined parameters. We also present definitions of uncertainty quantities that have not been applied in the field of LCA, and approaches for characterizing uncertainty in those quantities. We then demonstrate the methodology through a case study of pavements.
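
    The core of the scenario-aware probabilistic comparison can be sketched in a few lines; the impact distributions, the traffic scenarios and the 0.9 resolvability cut-off below are invented for illustration and are not the paper's values.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 20_000

        # Two alternatives with uncertain life-cycle impacts under two scenarios.
        for scenario, traffic in [("low traffic", 0.8), ("high traffic", 1.3)]:
            impact_a = rng.lognormal(np.log(100 * traffic), 0.15, n)
            impact_b = rng.lognormal(np.log(110 * traffic), 0.25, n)
            p_a_better = np.mean(impact_a < impact_b)
            # A scenario is "resolvable" when one alternative wins with high
            # probability despite input uncertainty.
            verdict = ("resolvable" if max(p_a_better, 1 - p_a_better) > 0.9
                       else "unresolved")
            print(f"{scenario}: P(A < B) = {p_a_better:.2f} -> {verdict}")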

  8. Evaluating uncertainty to strengthen epidemiologic data for use in human health risk assessments

    EPA Science Inventory

    Background: There is a recognized need to improve the application of epidemiologic data in human health risk assessment especially for understanding and characterizing risks from environmental and occupational exposures. While most epidemiologic studies result in uncertainty, tec...

  9. Prioritizing Chemicals and Data Requirements for Screening-Level Exposure and Risk Assessment

    PubMed Central

    Brown, Trevor N.; Wania, Frank; Breivik, Knut; McLachlan, Michael S.

    2012-01-01

    Background: Scientists and regulatory agencies strive to identify chemicals that may cause harmful effects to humans and the environment; however, prioritization is challenging because of the large number of chemicals requiring evaluation and limited data and resources. Objectives: We aimed to prioritize chemicals for exposure and exposure potential and obtain a quantitative perspective on research needs to better address uncertainty in screening assessments. Methods: We used a multimedia mass balance model to prioritize > 12,000 organic chemicals using four far-field human exposure metrics. The propagation of variance (uncertainty) in key chemical information used as model input for calculating exposure metrics was quantified. Results: Modeled human concentrations and intake rates span approximately 17 and 15 orders of magnitude, respectively. Estimates of exposure potential using human concentrations and a unit emission rate span approximately 13 orders of magnitude, and intake fractions span 7 orders of magnitude. The actual chemical emission rate contributes the greatest variance (uncertainty) in exposure estimates. The human biotransformation half-life is the second greatest source of uncertainty in estimated concentrations. In general, biotransformation and biodegradation half-lives are greater sources of uncertainty in modeled exposure and exposure potential than chemical partition coefficients. Conclusions: Mechanistic exposure modeling is suitable for screening and prioritizing large numbers of chemicals. By including uncertainty analysis and uncertainty in chemical information in the exposure estimates, these methods can help identify and address the important sources of uncertainty in human exposure and risk assessment in a systematic manner. PMID:23008278
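
    The propagation-of-variance bookkeeping referred to in the Methods can be illustrated with first-order propagation in log space; the sensitivities and variances below are invented placeholders, not the study's fitted values.

        import numpy as np

        # name: (d ln(exposure) / d ln(x), variance of ln(x)) -- all invented
        inputs = {
            "emission_rate":         (1.0, 2.5),
            "biotransformation_hl":  (0.8, 1.8),
            "biodegradation_hl":     (0.5, 1.2),
            "partition_coefficient": (0.3, 0.4),
        }
        contributions = {k: s ** 2 * v for k, (s, v) in inputs.items()}
        total = sum(contributions.values())
        for name, c in sorted(contributions.items(), key=lambda kv: -kv[1]):
            print(f"{name}: {100 * c / total:.0f}% of Var(ln exposure)")
        print(f"approx. 95% range spans a factor of "
              f"{np.exp(2 * 1.96 * total ** 0.5):.0f}")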

  10. Assessment of CALIPSO's level 3 climatological product

    NASA Astrophysics Data System (ADS)

    Papagiannopoulos, Nikolaos; Mona, Lucia; Pappalardo, Gelsomina

    2015-04-01

    The latest CALIPSO Level 3 (CL3) monthly product has been released since December 2011 and is subject to calibration/validation studies. EARLINET, as the only European lidar network on a continental scale, is a key candidate for this kind of study. CALIPSO Level 3 data were compared against EARLINET monthly averages obtained from profiles during satellite overpasses. Data from the stations of Potenza, Naples, Granada, Évora and Leipzig, equipped with advanced multi-wavelength Raman lidars, were used for this study. EARLINET monthly profiles yielded higher extinction values compared with the CALIPSO ones. In order to mitigate uncertainties due to spatial and temporal differences, we applied the CL3 filtering rubric to the CALIPSO Level 2 data. Only CALIPSO grid overflights during EARLINET correlative measurements were used, and from these data, monthly averages on the 2x5 grid were reconstructed. The CALIPSO monthly mean profiles following the new approach are called CALIPSO Level 3* (CL3*). This offers the possibility of achieving directly comparable datasets, even though it greatly reduces the number of satellite grid overflights. Moreover, the comparison of matched observations reduces uncertainties from the spatial variability that affects the sampled volumes. The agreement typically improved, in particular above areas directly affected by anthropogenic activities within the planetary boundary layer. In contrast to the CL3 product, the CL3* data offer the possibility of assessing CALIPSO performance also in terms of the backscatter coefficient, keeping the same quality assurance criteria applied to the extinction coefficient. Lastly, the typing capabilities of CALIPSO were assessed, outlining the importance of correct aerosol type assessment for the CALIPSO aerosol properties retrieval. This work is the first in-depth assessment of the aerosol optical properties reported in the CL3 data product. The outcome will assist the establishment of independently derived uncertainty estimates that can be used to create more reliable model forecasts based on CALIPSO data. Moreover, the presented work can contribute to current and future studies that use space-based lidar data. Acknowledgments: The financial support for EARLINET provided by the European Union under grant RICA 025991 within the framework of the Sixth Framework Programme is gratefully acknowledged. Since 2011 EARLINET has been integrated in the ACTRIS Research Infrastructure Project supported by the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement no. 262254.

  11. Predicting Consumer Biomass, Size-Structure, Production, Catch Potential, Responses to Fishing and Associated Uncertainties in the World’s Marine Ecosystems

    PubMed Central

    Jennings, Simon; Collingridge, Kate

    2015-01-01

    Existing estimates of fish and consumer biomass in the world’s oceans are disparate. This creates uncertainty about the roles of fish and other consumers in biogeochemical cycles and ecosystem processes, the extent of human and environmental impacts and fishery potential. We develop and use a size-based macroecological model to assess the effects of parameter uncertainty on predicted consumer biomass, production and distribution. Resulting uncertainty is large (e.g. median global biomass 4.9 billion tonnes for consumers weighing 1 g to 1000 kg; 50% uncertainty intervals of 2 to 10.4 billion tonnes; 90% uncertainty intervals of 0.3 to 26.1 billion tonnes) and driven primarily by uncertainty in trophic transfer efficiency and its relationship with predator-prey body mass ratios. Even the upper uncertainty intervals for global predictions of consumer biomass demonstrate the remarkable scarcity of marine consumers, with less than one part in 30 million by volume of the global oceans comprising tissue of macroscopic animals. Thus the apparently high densities of marine life seen in surface and coastal waters and frequently visited abundance hotspots will likely give many in society a false impression of the abundance of marine animals. Unexploited baseline biomass predictions from the simple macroecological model were used to calibrate a more complex size- and trait-based model to estimate fisheries yield and impacts. Yields are highly dependent on baseline biomass and fisheries selectivity. Predicted global sustainable fisheries yield increases ≈4-fold when smaller individuals (< 20 cm, from species of maximum mass < 1 kg) are targeted in all oceans, but the predicted yields would rarely be accessible in practice, and this fishing strategy leads to the collapse of larger species if fishing mortality rates on different size classes cannot be decoupled. Our analyses show that models with minimal parameter demands that are based on a few established ecological principles can support equitable analysis and comparison of diverse ecosystems. The analyses provide insights into the effects of parameter uncertainty on global biomass and production estimates, which have yet to be achieved with complex models, and will therefore help to highlight priorities for future research and data collection. However, the focus on simple model structures and global processes means that non-phytoplankton primary production and several groups, structures and processes of ecological and conservation interest are not represented. Consequently, our simple models become increasingly less useful than more complex alternatives when addressing questions about food web structure and function, biodiversity, resilience and human impacts at smaller scales and for areas closer to coasts. PMID:26226590

  12. Equation of State for Solid Phase I of Carbon Dioxide Valid for Temperatures up to 800 K and Pressures up to 12 GPa

    NASA Astrophysics Data System (ADS)

    Martin Trusler, J. P.

    2011-12-01

    The available thermodynamic-property data for solid phase I of carbon dioxide ("dry ice") are reviewed and used to determine the parameters of a new fundamental equation of state constructed in the form of a Helmholtz energy function with temperature and molar volume as the independent variables. The experimental data considered include the pressure, molar volume, and isobaric heat capacity along the sublimation curve, the melting-pressure curve, and molar volume in the compressed solid at temperatures from 295 to 764 K and pressures up to 12 GPa. The equation of state is based on the quasi-harmonic approximation, incorporating a Debye oscillator distribution for the vibrons, two discrete modes for the librons and a further three distinct modes for the internal vibrations of the CO2 molecule. A small anharmonic correction term is included, which is significant mainly in the region of the triple point. The estimated relative uncertainty of molar volume at specified temperature and pressure calculated from the equation of state is 0.02% on the sublimation curve and 1.5% in the compressed solid; for isobaric heat capacity on the sublimation curve, the uncertainty varies from 5.0% to 0.5% between 2 and 195 K. Auxiliary equations for the pressure and molar volume on the sublimation and melting curves are given. The equation of state is valid at temperatures from 0 to 800 K and at pressures from the solid-fluid phase boundary to 12 GPa.
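
    In generic notation (the paper's fitted coefficients and the volume dependences of the characteristic temperatures are not reproduced here), a quasi-harmonic Helmholtz energy of the kind described combines a static lattice term, a Debye term for the vibrons, Einstein terms for the discrete modes, and an anharmonic correction:

        F(T,V) = U_0(V) + F_D\!\left(T,\theta_D(V)\right)
                 + \sum_j F_E\!\left(T,\theta_j(V)\right) + F_{\mathrm{anh}}(T,V)

        F_D(T,\theta_D) = \tfrac{9}{8} R\,\theta_D
                 + 9RT \left(\frac{T}{\theta_D}\right)^{3}
                   \int_0^{\theta_D/T} x^{2} \ln\!\left(1 - e^{-x}\right) dx

        F_E(T,\theta_j) = \tfrac{1}{2} R\,\theta_j + RT \ln\!\left(1 - e^{-\theta_j/T}\right)

    Here U_0(V) is the static lattice energy, \theta_D the Debye temperature, and \theta_j the Einstein temperatures of the two libron modes and the three internal modes (per mole of CO2); these are the standard quasi-harmonic forms, shown only to make the abstract's construction concrete.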

  13. Risk, Uncertainty and Precaution in Science: The Threshold of the Toxicological Concern Approach in Food Toxicology.

    PubMed

    Bschir, Karim

    2017-04-01

    Environmental risk assessment is often affected by severe uncertainty. The frequently invoked precautionary principle helps to guide risk assessment and decision-making in the face of scientific uncertainty. In many contexts, however, uncertainties play a role not only in the application of scientific models but also in their development. Building on recent literature in the philosophy of science, this paper argues that precaution should be exercised at the stage when tools for risk assessment are developed as well as when they are used to inform decision-making. The relevance and consequences of this claim are discussed in the context of the threshold of the toxicological concern approach in food toxicology. I conclude that the approach does not meet the standards of an epistemic version of the precautionary principle.

  14. The state of the art of the impact of sampling uncertainty on measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Leite, V. J.; Oliveira, E. C.

    2018-03-01

    Measurement uncertainty is a parameter that underpins the reliability of a result and can be divided into two large groups: sampling and analytical variations. Analytical uncertainty arises from a controlled process performed in the laboratory. The same does not hold for sampling uncertainty, which has been neglected because it faces several practical obstacles and there is no clarity on how to perform the procedures, although it is admittedly indispensable to the measurement process. This paper describes the state of the art of sampling uncertainty and assesses its relevance to measurement uncertainty.
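
    For independent contributions, the standard combination is quadrature, which makes the point that the larger term dominates the achievable measurement uncertainty (a general relation, not a formula quoted from this paper):

        u_{\mathrm{measurement}}^{2} = u_{\mathrm{sampling}}^{2} + u_{\mathrm{analytical}}^{2}

    If sampling contributes, say, three times the analytical standard uncertainty, then refining the laboratory analysis further changes the combined uncertainty by only a few percent.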

  15. Continuum topology optimization considering uncertainties in load locations based on the cloud model

    NASA Astrophysics Data System (ADS)

    Liu, Jie; Wen, Guilin

    2018-06-01

    Few researchers have paid attention to designing structures in consideration of uncertainties in loading locations, which may significantly influence structural performance. In this work, cloud models are employed to depict the uncertainties in the loading locations. A robust algorithm is developed in the context of minimizing the expectation of the structural compliance while conforming to a material volume constraint. To guarantee optimal solutions, sufficient cloud drops are used, which in turn leads to low computational efficiency; an innovative strategy is then implemented to improve the computational efficiency enormously. A modified soft-kill bi-directional evolutionary structural optimization method using derived sensitivity numbers is used to generate the robust configurations. Several numerical examples are presented to demonstrate the effectiveness and efficiency of the proposed algorithm.
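
    A sketch of the forward normal cloud generator that underlies this kind of cloud-model sampling, with invented numerical characteristics (Ex, En, He) for a load location; the robust objective is then an expectation of compliance over the drops.

        import numpy as np

        rng = np.random.default_rng(3)

        def normal_cloud_drops(Ex, En, He, n):
            # Forward normal cloud generator: Ex is the expectation (nominal
            # load location), En the entropy (fuzziness), He the hyper-entropy
            # (randomness of the fuzziness).
            En_i = rng.normal(En, He, n)          # per-drop entropy
            x = rng.normal(Ex, np.abs(En_i))      # drop position
            mu = np.exp(-(x - Ex) ** 2 / (2 * En_i ** 2))
            return x, mu

        x, mu = normal_cloud_drops(Ex=40.0, En=2.0, He=0.3, n=1000)
        print(f"drop positions span {x.min():.1f} to {x.max():.1f}; "
              f"mean membership {mu.mean():.2f}")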

  16. Simultaneous retrieval of sea ice thickness and snow depth using concurrent active altimetry and passive L-band remote sensing data

    NASA Astrophysics Data System (ADS)

    Zhou, L.; Xu, S.; Liu, J.

    2017-12-01

    The retrieval of sea ice thickness mainly relies on satellite altimetry, with freeboard measurements converted to sea ice thickness (hi) under certain assumptions about snow loading. The uncertainty in snow depth (hs) is a major source of uncertainty in the retrieved sea ice thickness and total volume for both radar and laser altimetry. In this study, novel algorithms for the simultaneous retrieval of hi and hs are proposed for the data synergy of L-band (1.4 GHz) passive remote sensing and both types of active altimetry: (1) L-band brightness temperature (TB) from the Soil Moisture Ocean Salinity (SMOS) satellite and sea ice freeboard (FBice) from radar altimetry, and (2) L-band TB data and snow freeboard (FBsnow) from laser altimetry. Two physical models serve as the forward models for the retrieval: an L-band radiation model and the hydrostatic equilibrium model. Verification with SMOS and Operation IceBridge (OIB) data is carried out, showing overall good retrieval accuracy for both sea ice parameters. Specifically, we show that the covariability between hs and FBsnow is crucial for the synergy between TB and FBsnow. Comparison with existing algorithms shows lower uncertainty in both sea ice parameters, and that the uncertainty in the retrieved sea ice thickness caused by that of snow depth is spatially uncorrelated, with the potential to reduce the volume uncertainty through spatial sampling. The proposed algorithms can be applied to the retrieval of sea ice parameters at the basin scale, using concurrent satellite-based active and passive remote sensing data.
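
    The hydrostatic-equilibrium half of the retrieval is compact enough to sketch: with densities assumed known, snow depth plus either freeboard fixes the ice thickness. The density values below are typical literature numbers, not necessarily those used in the study.

        RHO_W, RHO_I, RHO_S = 1024.0, 917.0, 320.0  # kg/m3: water, ice, snow

        def hi_from_radar(fb_ice, hs):
            # Ice thickness from radar (ice-surface) freeboard and snow depth.
            return (RHO_W * fb_ice + RHO_S * hs) / (RHO_W - RHO_I)

        def hi_from_laser(fb_snow, hs):
            # Ice thickness from laser (snow-surface) freeboard and snow depth.
            return (RHO_W * fb_snow - (RHO_W - RHO_S) * hs) / (RHO_W - RHO_I)

        # In the full algorithm hs is not prescribed: it is adjusted until the
        # L-band radiation model reproduces the observed SMOS brightness
        # temperature while staying consistent with the measured freeboard.
        print(hi_from_radar(fb_ice=0.15, hs=0.20))   # ~2.03 m
        print(hi_from_laser(fb_snow=0.35, hs=0.20))  # same geometry, ~2.03 m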

  17. Sequential simulation approach to modeling of multi-seam coal deposits with an application to the assessment of a Louisiana lignite

    USGS Publications Warehouse

    Olea, Ricardo A.; Luppens, James A.

    2012-01-01

    There are multiple ways to characterize uncertainty in the assessment of coal resources, but not all of them are equally satisfactory. Increasingly, the tendency is toward borrowing from the statistical tools developed in the last 50 years for the quantitative assessment of other mineral commodities. Here, we briefly review the most recent of such methods and formulate a procedure for the systematic assessment of multi-seam coal deposits, taking into account several geological factors, such as fluctuations in thickness, erosion, oxidation, and bed boundaries. A lignite deposit explored in three stages is used for validating models based on comparing a first set of drill holes against data from infill and development drilling. Results were fully consistent with reality, providing a variety of maps, histograms, and scatterplots characterizing the deposit and the associated uncertainty in the assessments. The geostatistical approach was particularly informative in providing a probability distribution modeling deposit-wide uncertainty about total resources and a cumulative distribution of coal tonnage as a function of local uncertainty.

  18. Soil sampling strategies for site assessments in petroleum-contaminated areas.

    PubMed

    Kim, Geonha; Chowdhury, Saikat; Lin, Yen-Min; Lu, Chih-Jen

    2017-04-01

    Environmental site assessments are frequently executed for monitoring and remediation performance evaluation purposes, especially in total petroleum hydrocarbon (TPH)-contaminated areas, such as gas stations. As a key issue, reproducibility of the assessment results must be ensured, especially if attempts are made to compare results between different institutions. Although it is widely known that uncertainties associated with soil sampling are much higher than those with chemical analyses, field guides or protocols to deal with these uncertainties are not stipulated in detail in the relevant regulations, causing serious errors and distortion of the reliability of environmental site assessments. In this research, uncertainties associated with soil sampling and sample reduction for chemical analysis were quantified using laboratory-scale experiments and the theory of sampling. The research results showed that the TPH mass assessed by sampling tends to be overestimated and sampling errors are high, especially for the low range of TPH concentrations. Homogenization of soil was found to be an efficient method to suppress uncertainty, but high-resolution sampling could be an essential way to minimize this.

  19. Making sense of genetic uncertainty: the role of religion and spirituality.

    PubMed

    White, Mary T

    2009-02-15

    This article argues that to the extent that religious and spiritual beliefs can help people cope with genetic uncertainty, a limited spiritual assessment may be appropriate in genetic counseling. The article opens by establishing why genetic information is inherently uncertain and why this uncertainty can be medically, morally, and spiritually problematic. This is followed by a review of the range of factors that can contribute to risk assessments, including a few heuristics commonly used in responses to uncertainty. The next two sections summarize recent research on the diverse roles of religious and spiritual beliefs in genetic decisions and challenges to conducting spiritual assessments in genetic counseling. Based on these findings, religious and spiritual beliefs are posited as serving essentially as a heuristic that some people will utilize in responding to their genetic risks. In the interests of helping such clients make informed decisions, a limited spiritual assessment is recommended and described. Some of the challenges and risks associated with this limited assessment are discussed. Since some religious and spiritual beliefs can conflict with the values of medicine, some decisions will remain problematic. (c) 2009 Wiley-Liss, Inc.

  20. Characterizing Uncertainty and Variability in PBPK Models: State of the Science and Needs for Research and Implementation

    EPA Science Inventory

    Mode-of-action based risk and safety assessments can rely upon tissue dosimetry estimates in animals and humans obtained from physiologically-based pharmacokinetic (PBPK) modeling. However, risk assessment also increasingly requires characterization of uncertainty and variabilit...

  1. Uncertainty in simulating wheat yields under climate change

    USDA-ARS?s Scientific Manuscript database

    Anticipating the impacts of climate change on crop yields is critical for assessing future food security. Process-based crop simulation models are the most commonly used tools in such assessments. Analysis of uncertainties in future greenhouse gas emissions and their impacts on future climate change...

  2. The impacts of uncertainty and variability in groundwater-driven health risk assessment. (Invited)

    NASA Astrophysics Data System (ADS)

    Maxwell, R. M.

    2010-12-01

    Potential human health risk from contaminated groundwater is becoming an important, quantitative measure used in management decisions in a range of applications from Superfund to CO2 sequestration. Quantitatively assessing the potential human health risks from contaminated groundwater is challenging due to the many coupled processes, uncertainty in transport parameters and the variability in individual physiology and behavior. Perspective on human health risk assessment techniques will be presented, and a framework used to predict potential, increased human health risk from contaminated groundwater will be discussed. This framework incorporates transport of contaminants through the subsurface from source to receptor and health risks to individuals via household exposure pathways. The subsurface is shown to be subject to both physical and chemical heterogeneity, which affects downstream concentrations at receptors. Cases are presented where hydraulic conductivity can exhibit both uncertainty and spatial variability, in addition to situations where hydraulic conductivity is the dominant source of uncertainty in risk assessment. Management implications, such as characterization and remediation, will also be discussed.

  3. Poster — Thur Eve — 32: Stereotactic Body Radiation Therapy for Peripheral Lung Lesion: Treatment Planning and Quality Assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wan, Shuying; Oliver, Michael; Wang, Xiaofang

    2014-08-15

    Stereotactic body radiation therapy (SBRT), due to its high precision in target localization, has become widely used to treat tumours at various locations, including the lungs. A lung SBRT program was started at our institution a year ago. Eighteen patients with peripheral lesions up to 3 cm in diameter have been treated with 48 Gy in 4 fractions. Based on four-dimensional computed tomography (4DCT) simulation, the internal target volume (ITV) was delineated to encompass the respiratory motion of the lesion. A margin of 5 mm was then added to create the planning target volume (PTV), accounting for setup uncertainties. There was no expansion from gross tumour volume (GTV) to clinical target volume (CTV). Pinnacle 9.6 was used as the primary treatment planning system. The volumetric modulated arc therapy (VMAT) technique, with one or two coplanar arcs, generally worked well. For quality assurance (QA), each plan was exported to Eclipse 10 and the dose calculation was repeated. Dose volume histograms (DVHs) of the targets and organs at risk (OARs) were then compared between the two treatment planning systems. Winston-Lutz tests were carried out as routine machine QA. Patient-specific QA included ArcCheck measurement with an insert, where an ionization chamber was placed at the centre to measure dose at the isocenter. For the first several patients, and subsequently for plans with extremely strong modulation, Gafchromic film dosimetry was also employed. For each patient, a mock setup was scheduled prior to treatments. Daily pre- and post-treatment CBCT were acquired for setup and assessment of intra-fractional motion, respectively.

  4. Water supply infrastructure planning under multiple uncertainties: A differentiated approach

    NASA Astrophysics Data System (ADS)

    Fletcher, S.; Strzepek, K.

    2017-12-01

    Many water planners face increased pressure on water supply systems from increasing demands from population and economic growth in combination with uncertain water supply. Supply uncertainty arises from short-term climate variability and long-term climate change as well as uncertainty in groundwater availability. Social and economic uncertainties - such as sectoral competition for water, food and energy security, urbanization, and environmental protection - compound physical uncertainty. Further, the varying risk aversion of stakeholders and water managers makes it difficult to assess the necessity of expensive infrastructure investments to reduce risk. We categorize these uncertainties on two dimensions: whether they can be updated over time by collecting additional information, and whether the uncertainties can be described probabilistically or are "deep" uncertainties whose likelihood is unknown. Based on this, we apply a decision framework that combines simulation for probabilistic uncertainty, scenario analysis for deep uncertainty, and multi-stage decision analysis for uncertainties that are reduced over time with additional information. In light of these uncertainties and the investment costs of large infrastructure, we propose the assessment of staged, modular infrastructure and information updating as a hedge against risk. We apply this framework to cases in Melbourne, Australia and Riyadh, Saudi Arabia. Melbourne is a surface water system facing uncertain population growth and variable rainfall and runoff. A severe drought from 1997 to 2009 prompted investment in a 150 MCM/y reverse osmosis desalination plant with a capital cost of $3.5 billion. Our analysis shows that a flexible design, in which a smaller portion of capacity is developed initially with the option to add modular capacity in the future, can mitigate uncertainty and reduce the expected lifetime costs by up to $1 billion. In Riyadh, urban water use relies on fossil groundwater aquifers and desalination. Intense withdrawals for urban and agricultural use will lead to lowering of the water table in the aquifer at rapid but uncertain rates due to poor groundwater characterization. We assess the potential for additional groundwater data collection and a flexible infrastructure approach similar to that in Melbourne to mitigate risk.
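
    A stylised two-stage calculation of the kind that favours modular designs (all numbers invented, loosely echoing the Melbourne figures): build full capacity now, or build half now and add the second module only if it turns out to be needed, at a per-unit cost premium.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 100_000

        p_need_full = 0.4                # probability full capacity is needed
        need_full = rng.random(n) < p_need_full

        cost_big_now = np.full(n, 3.5)   # $bn, single large build
        cost_flexible = 2.0 + np.where(need_full, 1.8, 0.0)  # $bn, staged build

        print(f"E[cost, build big now] = {cost_big_now.mean():.2f} $bn")
        print(f"E[cost, flexible]      = {cost_flexible.mean():.2f} $bn")
        # Flexibility wins on expected cost whenever the probability of needing
        # the deferred module is low enough to offset its cost premium.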

  5. Parameter and input data uncertainty estimation for the assessment of water resources in two sub-basins of the Limpopo River Basin

    NASA Astrophysics Data System (ADS)

    Oosthuizen, Nadia; Hughes, Denis A.; Kapangaziwiri, Evison; Mwenge Kahinda, Jean-Marc; Mvandaba, Vuyelwa

    2018-05-01

    The demand for water resources is rapidly growing, placing more strain on access to water and its management. In order to appropriately manage water resources, there is a need to accurately quantify available water resources. Unfortunately, the data required for such assessment are frequently far from sufficient in terms of availability and quality, especially in southern Africa. In this study, the uncertainty related to the estimation of water resources of two sub-basins of the Limpopo River Basin - the Mogalakwena in South Africa and the Shashe shared between Botswana and Zimbabwe - is assessed. Input data (and model parameters) are significant sources of uncertainty that should be quantified. In southern Africa water use data are among the most unreliable sources of model input data because available databases generally consist of only licensed information and actual use is generally unknown. The study assesses how these uncertainties impact the estimation of surface water resources of the sub-basins. Data on farm reservoirs and irrigated areas from various sources were collected and used to run the model. Many farm dams and large irrigation areas are located in the upper parts of the Mogalakwena sub-basin. Results indicate that water use uncertainty is small. Nevertheless, the medium to low flows are clearly impacted. The simulated mean monthly flows at the outlet of the Mogalakwena sub-basin were between 22.62 and 24.68 Mm3 per month when incorporating only the uncertainty related to the main physical runoff generating parameters. The range of total predictive uncertainty of the model increased to between 22.15 and 24.99 Mm3 when water use data, such as small farm dams, large reservoirs and irrigation, were included. For the Shashe sub-basin, incorporating only uncertainty related to the main runoff parameters resulted in mean monthly flows between 11.66 and 14.54 Mm3. The range of predictive uncertainty changed to between 11.66 and 17.72 Mm3 after the uncertainty in water use information was added.

  6. Proceedings of the Space Surveillance Workshop (12th) Held in Lexington, Massachusetts on 5-7 April 1994. Volume 2

    DTIC Science & Technology

    1994-04-07

    detector mated to wide-angle optics to continuously view a large conical volume of space in the vicinity of the orbiting spacecraft. When a debris... large uncertainties. This lack of reliable data for debris particles in the millimeter/centimeter size range presents a problem to spacecraft designers... by smaller particles (<1 mm) can be negated by the use of meteor bumpers covering the critical parts of a spacecraft, without incurring too large a

  7. Density Measurement of Tridecane by using Hydrostatic Weighing System at Density Laboratory, NML-SIRIM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nor, Mohd. Fazrul Hisyam Mohd.; Othman, Hafidzah; Abidin, Abd. Rashid Zainal

    2009-07-07

    This paper presents the density measurement of tridecane by using a hydrostatic weighing system, as currently practised in the Density Laboratory of the National Metrology Laboratory (NML), SIRIM Berhad. The system weighs a crystal sphere while the sphere is immersed in the tridecane. The volume and mass in air of the crystal sphere were calibrated at KRISS, Korea. The uncertainties of the volume and mass in air of the crystal sphere were 4 ppm and 0.3 ppm, respectively.
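
    In outline, hydrostatic weighing recovers the liquid density from the sphere's buoyancy. The sketch below uses invented mass and volume readings; only the 4 ppm and 0.3 ppm calibration uncertainties come from the abstract, and air-buoyancy and temperature corrections are omitted.

        m_air = 0.2500000   # kg, sphere mass calibrated in air (invented)
        m_app = 0.1700000   # kg, apparent mass immersed in tridecane (invented)
        V = 1.0500e-4       # m3, calibrated sphere volume (invented)

        rho = (m_air - m_app) / V   # Archimedes: buoyant mass loss / volume

        # First-order propagation of the two stated calibration uncertainties.
        u_m = 0.3e-6 * m_air        # 0.3 ppm of the mass in air
        u_V = 4e-6 * V              # 4 ppm of the volume
        u_rho = rho * ((u_m / (m_air - m_app)) ** 2 + (u_V / V) ** 2) ** 0.5
        print(f"rho = {rho:.3f} kg/m3, u(rho) = {u_rho:.5f} kg/m3")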

  8. QUALITY ASSURANCE HANDBOOK FOR AIR POLLUTION MEASUREMENT SYSTEMS: VOLUME IV - METEOROLOGICAL MEASUREMENTS (REVISED - AUGUST 1994)

    EPA Science Inventory

    Procedures on installing, acceptance testing, operating, maintaining and quality assuring three types of ground-based, upper-air meteorological measurement systems are described. The limitations and uncertainties in precision and accuracy measurements associated with these systems...

  9. Uncertainty in BMP evaluation and optimization for watershed management

    NASA Astrophysics Data System (ADS)

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    Use of computer simulation models has increased substantially to make watershed management decisions and to develop strategies for water quality improvements. These models are often used to evaluate potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models in optimizing selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One of the limitations of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. measured precipitation, streamflow, sediment, nutrient and pesticide losses, land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management. We have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We have used a watershed model (Soil and Water Assessment Tool or SWAT) to simulate the hydrology and water quality in a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent various BMPs in the watershed needed to improve water quality. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT simulated crop yields. Considerable uncertainties in the net cost and the water quality improvements resulted due to uncertainties in land use, climate change, and model parameter values.

  10. Assessment of BTEX-induced health risk under multiple uncertainties at a petroleum-contaminated site: An integrated fuzzy stochastic approach

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodong; Huang, Guo H.

    2011-12-01

    Groundwater pollution has attracted increasing attention in recent decades. Assessing groundwater contamination risk provides a sound basis for supporting risk-based management decisions. Therefore, the objective of this study is to develop an integrated fuzzy stochastic approach to evaluate risks of BTEX-contaminated groundwater under multiple uncertainties. It consists of an integrated interval fuzzy subsurface modeling system (IIFMS) and an integrated fuzzy second-order stochastic risk assessment (IFSOSRA) model. The IIFMS is developed based on factorial design, interval analysis, and a fuzzy sets approach to predict contaminant concentrations under hybrid uncertainties. Two input parameters (longitudinal dispersivity and porosity) are considered to be uncertain with known fuzzy membership functions, and intrinsic permeability is considered to be an interval number with unknown distribution information. A factorial design is conducted to evaluate interactive effects of the three uncertain factors on the modeling outputs through the developed IIFMS. The IFSOSRA model can systematically quantify variability and uncertainty, as well as their hybrids, presented as fuzzy, stochastic and second-order stochastic parameters in health risk assessment. The developed approach has been applied to the management of a real-world petroleum-contaminated site within a western Canada context. The results indicate that multiple uncertainties, under a combination of information with various data-quality levels, can be effectively addressed to provide support in identifying proper remedial efforts. A unique contribution of this research is the development of an integrated fuzzy stochastic approach for handling various forms of uncertainties associated with simulation and risk assessment efforts.

  11. Lake-level frequency analysis for Devils Lake, North Dakota

    USGS Publications Warehouse

    Wiche, Gregg J.; Vecchia, Aldo V.

    1996-01-01

    Two approaches were used to estimate future lake-level probabilities for Devils Lake. The first approach is based on an annual lake-volume model, and the second approach is based on a statistical water mass-balance model that generates seasonal lake volumes on the basis of seasonal precipitation, evaporation, and inflow. Autoregressive moving average models were used to model the annual mean lake volume and the difference between the annual maximum lake volume and the annual mean lake volume. Residuals from both models were determined to be uncorrelated with zero mean and constant variance. However, a nonlinear relation between the residuals of the two models was included in the final annual lake-volume model. Because of high autocorrelation in the annual lake levels of Devils Lake, the annual lake-volume model was verified using annual lake-level changes. The annual lake-volume model closely reproduced the statistics of the recorded lake-level changes for 1901-93 except for the skewness coefficient. However, the model output is less skewed than the data indicate because of some unrealistically large lake-level declines. The statistical water mass-balance model requires as inputs seasonal precipitation, evaporation, and inflow data for Devils Lake. Analysis of annual precipitation, evaporation, and inflow data for 1950-93 revealed no significant trends or long-range dependence, so the input time series were assumed to be stationary and short-range dependent. Normality transformations were used to approximately maintain the marginal probability distributions; and a multivariate, periodic autoregressive model was used to reproduce the correlation structure. Each of the coefficients in the model is significantly different from zero at the 5-percent significance level. Coefficients relating spring inflow from one year to spring and fall inflows from the previous year had the largest effect on the lake-level frequency analysis. Inclusion of parameter uncertainty in the model for generating precipitation, evaporation, and inflow indicates that the upper lake-level exceedance levels from the water mass-balance model are particularly sensitive to parameter uncertainty. The sensitivity in the upper exceedance levels was caused almost entirely by uncertainty in the fitted probability distributions of the quarterly inflows. A method was developed for using long-term streamflow data for the Red River of the North at Grand Forks to reduce the variance in the estimated mean. Comparison of the annual lake-volume model and the water mass-balance model indicates the upper exceedance levels of the water mass-balance model increase much more rapidly than those of the annual lake-volume model. As an example, for simulation year 5, the 99-percent exceedance for the lake level is 1,417.6 feet above sea level for the annual lake-volume model and 1,423.2 feet above sea level for the water mass-balance model. The rapid increase is caused largely by the record precipitation and inflow in the summer and fall of 1993. Because the water mass-balance model produces lake-level traces that closely match the hydrology of Devils Lake, the water mass-balance model is superior to the annual lake-volume model for computing exceedance levels for the 50-year planning horizon.
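
    The time-series half of the first approach can be sketched with statsmodels; the AR(1) series below is a synthetic stand-in for the annual mean lake-volume record (not reproduced here), and the residual checks mirror the report's zero-mean, constant-variance, uncorrelated requirements.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(11)

        # Synthetic stand-in for 93 years of annual mean lake volumes.
        n = 93
        vol = np.zeros(n)
        for t in range(1, n):
            vol[t] = 0.85 * vol[t - 1] + rng.normal(0.0, 1.0)

        # Fit a low-order ARMA model and check the residuals.
        fit = ARIMA(vol, order=(1, 0, 1)).fit()
        resid = fit.resid
        print(fit.params)
        print(f"residual mean {resid.mean():.3f}, lag-1 autocorr "
              f"{np.corrcoef(resid[:-1], resid[1:])[0, 1]:.3f}")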

  12. Lake-level frequency analysis for Devils Lake, North Dakota

    USGS Publications Warehouse

    Wiche, Gregg J.; Vecchia, Aldo V.

    1995-01-01

    Two approaches were used to estimate future lake-level probabilities for Devils Lake. The first approach is based on an annual lake-volume model, and the second approach is based on a statistical water mass-balance model that generates seasonal lake volumes on the basis of seasonal precipitation, evaporation, and inflow. Autoregressive moving average models were used to model the annual mean lake volume and the difference between the annual maximum lake volume and the annual mean lake volume. Residuals from both models were determined to be uncorrelated with zero mean and constant variance. However, a nonlinear relation between the residuals of the two models was included in the final annual lake-volume model. Because of high autocorrelation in the annual lake levels of Devils Lake, the annual lake-volume model was verified using annual lake-level changes. The annual lake-volume model closely reproduced the statistics of the recorded lake-level changes for 1901-93 except for the skewness coefficient. However, the model output is less skewed than the data indicate because of some unrealistically large lake-level declines. The statistical water mass-balance model requires as inputs seasonal precipitation, evaporation, and inflow data for Devils Lake. Analysis of annual precipitation, evaporation, and inflow data for 1950-93 revealed no significant trends or long-range dependence, so the input time series were assumed to be stationary and short-range dependent. Normality transformations were used to approximately maintain the marginal probability distributions, and a multivariate, periodic autoregressive model was used to reproduce the correlation structure. Each of the coefficients in the model is significantly different from zero at the 5-percent significance level. Coefficients relating spring inflow from one year to spring and fall inflows from the previous year had the largest effect on the lake-level frequency analysis. Inclusion of parameter uncertainty in the model for generating precipitation, evaporation, and inflow indicates that the upper lake-level exceedance levels from the water mass-balance model are particularly sensitive to parameter uncertainty. The sensitivity in the upper exceedance levels was caused almost entirely by uncertainty in the fitted probability distributions of the quarterly inflows. A method was developed for using long-term streamflow data for the Red River of the North at Grand Forks to reduce the variance in the estimated mean. Comparison of the annual lake-volume model and the water mass-balance model indicates the upper exceedance levels of the water mass-balance model increase much more rapidly than those of the annual lake-volume model. As an example, for simulation year 5, the 99-percent exceedance for the lake level is 1,417.6 feet above sea level for the annual lake-volume model and 1,423.2 feet above sea level for the water mass-balance model. The rapid increase is caused largely by the record precipitation and inflow in the summer and fall of 1993. Because the water mass-balance model produces lake-level traces that closely match the hydrology of Devils Lake, the water mass-balance model is superior to the annual lake-volume model for computing exceedance levels for the 50-year planning horizon.

  13. Geological maps and models: are we certain how uncertain they are?

    NASA Astrophysics Data System (ADS)

    Mathers, Steve; Waters, Colin; McEvoy, Fiona

    2014-05-01

    Geological maps and, latterly, 3D models provide the spatial framework for geology at diverse scales or resolutions. As demands continue to rise for sustainable use of the subsurface, these maps and models inform decisions on the management of natural resources, hazards, and environmental change. Inaccuracies and uncertainties in geological maps and models can impact substantially on the perception, assessment, and management of opportunities and the associated risks. Lithostratigraphical classification schemes predominate and are used in most geological mapping and modelling. The definition of unit boundaries, as 2D lines or 3D surfaces, is the prime objective. The intervening area or volume is rarely described other than by its bulk attributes, those relating to the whole unit. Where sufficient data exist on the spatial and/or statistical distribution of properties, it can be gridded or voxelated with integrity. Here we discuss only the uncertainty involved in defining the boundary conditions. The primary uncertainty of any geological map or model is the accuracy of the geological boundaries, i.e. tops, bases, limits, fault intersections, etc. Traditionally these have been depicted on BGS maps using three line styles that reflect the uncertainty of the boundary, e.g. observed, inferred, conjectural. Most geological maps tend to neglect the subsurface expression (subcrops etc). Models could also be built with subsurface geological boundaries (as digital node strings) tagged with levels of uncertainty; initial experience suggests three levels may again be practicable. Once tagged, these values could be used to autogenerate uncertainty plots. Whilst maps are predominantly explicit, based upon evidence and the conceptual understanding of the geologist, models of this type are less common and tend to be restricted to certain software methodologies. Many modelling packages are implicit, being driven by simple statistical interpolation or complex algorithms for building surfaces in ways that are invisible to, and so not controlled by, the working geologist. Such models have the advantage of being replicable within a software package and so can discount some interpretational differences between modellers. They can, however, create geologically implausible results unless good geological rules and control are established prior to model calculation. Comparisons of results from varied software packages yield surprisingly diverse results. This is a significant and often overlooked source of uncertainty in models. Expert elicitation is commonly employed to establish values used in statistical treatments of model uncertainty. However, this introduces another possible source of uncertainty created by the differing judgements of the modellers. The pragmatic solution appears to be using panels of experienced geologists to elicit the values. Treatments of uncertainty in maps and models yield relative rather than absolute values, even though many of these are expressed numerically. This makes it extremely difficult to devise standard methodologies to determine uncertainty or to propose fixed numerical scales for expressing the results. Furthermore, these may give a misleading impression of greater certainty than actually exists. This contribution outlines general perceptions with regard to uncertainty in our maps and models and presents results from recent BGS studies.

  14. Incorporating rainfall uncertainty in a SWAT model: the river Zenne basin (Belgium) case study

    NASA Astrophysics Data System (ADS)

    Tolessa Leta, Olkeba; Nossent, Jiri; van Griensven, Ann; Bauwens, Willy

    2013-04-01

    The European Union Water Framework Directive (EU-WFD) called on its member countries to achieve a good ecological status for all inland and coastal water bodies by 2015. According to recent studies, the river Zenne (Belgium) is far from this objective. Therefore, an interuniversity and multidisciplinary project, "Towards a Good Ecological Status in the river Zenne (GESZ)", was launched to evaluate the effects of wastewater management plans on the river. In this project, different models have been developed and integrated using the Open Modelling Interface (OpenMI). The hydrologic, semi-distributed Soil and Water Assessment Tool (SWAT) is hereby used as one of the model components in the integrated modelling chain in order to model the upland catchment processes. The assessment of the uncertainty of SWAT is an essential aspect of the decision-making process, in order to design robust management strategies that take the predicted uncertainties into account. Model uncertainty stems from uncertainties in the model parameters, the input data (e.g., rainfall), the calibration data (e.g., stream flows), and the model structure itself. The objective of this paper is to assess the first three sources of uncertainty in a SWAT model of the river Zenne basin. For the assessment of rainfall measurement uncertainty, first, we identified independent rainfall periods, based on the daily precipitation and stream flow observations, using the Water Engineering Time Series PROcessing tool (WETSPRO). Secondly, we assigned a rainfall multiplier parameter to each of the independent rainfall periods, which serves as a multiplicative input error corruption. Finally, we treated these multipliers as latent parameters in the model optimization and uncertainty analysis (UA). For the parameter uncertainty assessment, given the high number of parameters of the SWAT model, we first screened out its most sensitive parameters using the Latin Hypercube One-factor-At-a-Time (LH-OAT) technique, and subsequently considered only the most sensitive parameters for parameter optimization and UA. To explicitly account for the stream flow uncertainty, we assumed that the stream flow measurement error increases linearly with the stream flow value. To assess the uncertainty and infer posterior distributions of the parameters, we used a Markov chain Monte Carlo (MCMC) sampler, DiffeRential Evolution Adaptive Metropolis (DREAM), which uses sampling from an archive of past states to generate candidate points in each individual chain. It is shown that the marginal posterior distributions of the rainfall multipliers vary widely between individual events, as a consequence of rainfall measurement errors and the spatial variability of the rain. Only a few of the rainfall events are well defined. The marginal posterior distributions of the SWAT model parameter values are well defined and identified by DREAM within their prior ranges. The posterior distributions of the output uncertainty parameter values also show that the stream flow data are highly uncertain. The approach of using rainfall multipliers to treat rainfall uncertainty for a complex model has an impact on the model parameter marginal posterior distributions and on the model results.
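
    A minimal sketch of the rainfall-multiplier treatment is given below: each independent rainfall period receives a multiplicative correction treated as a latent parameter, and stream flow errors are assumed to grow linearly with flow. The toy simulate_flow model, the data, and the error coefficients a and b are hypothetical stand-ins for SWAT and the Zenne observations.

```python
# Minimal sketch of rainfall multipliers as latent parameters in a
# heteroscedastic Gaussian likelihood; everything here is illustrative.
import numpy as np

def simulate_flow(params, rainfall):
    """Stand-in for the SWAT simulation (illustrative linear store)."""
    k, c = params
    flow = np.empty_like(rainfall)
    s = 0.0
    for t, p in enumerate(rainfall):
        s += c * p
        flow[t] = k * s
        s -= flow[t]
    return flow

def log_likelihood(params, multipliers, rainfall, period_id, q_obs,
                   a=0.1, b=0.1):
    """Gaussian log-likelihood with error SD growing linearly with flow."""
    corrupted = rainfall * multipliers[period_id]   # per-period corruption
    q_sim = simulate_flow(params, corrupted)
    sigma = a + b * q_obs                           # sigma = a + b * Q
    return np.sum(-0.5 * ((q_obs - q_sim) / sigma) ** 2 - np.log(sigma))

# Toy data: 100 days split into 4 independent rainfall periods.
rng = np.random.default_rng(0)
rain = rng.gamma(2.0, 2.0, 100)
period = np.repeat(np.arange(4), 25)
q_obs = simulate_flow((0.3, 0.8), rain * 1.1) * rng.lognormal(0, 0.05, 100)

print(log_likelihood((0.3, 0.8), np.ones(4), rain, period, q_obs))
```

    In a full analysis this likelihood would be handed to an MCMC sampler such as DREAM, with the four multipliers sampled jointly with the model parameters.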

  15. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

    The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks, such as scenario-based and vulnerability-based approaches, exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, the hydrologic response, and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling the hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic, and water management uncertainty in water resources systems analysis with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, using prior and likelihood information about the model parameters. The framework is demonstrated in a case study for the St. Croix Basin, located on the border of the United States and Canada.

  16. Methodology for quantifying uncertainty in coal assessments with an application to a Texas lignite deposit

    USGS Publications Warehouse

    Olea, R.A.; Luppens, J.A.; Tewalt, S.J.

    2011-01-01

    A common practice for characterizing uncertainty in coal resource assessments has been the itemization of tonnage at the mining unit level and the classification of such units according to distance to drilling holes. Distance criteria, such as those used in U.S. Geological Survey Circular 891, are still widely used for public disclosure. A major deficiency of distance methods is that they do not provide a quantitative measure of uncertainty. Additionally, relying on distance between data points alone does not take into consideration other factors known to have an influence on uncertainty, such as spatial correlation, type of probability distribution followed by the data, geological discontinuities, and boundary of the deposit. Several geostatistical methods have been combined to formulate a quantitative characterization for appraising uncertainty. Drill hole datasets ranging from widespread exploration drilling to detailed development drilling from a lignite deposit in Texas were used to illustrate the modeling. The results show that distance to the nearest drill hole is almost completely unrelated to uncertainty, which confirms the inadequacy of characterizing uncertainty based solely on a simple classification of resources by distance classes. The more complex statistical methods used in this study quantify uncertainty and show good agreement between confidence intervals in the uncertainty predictions and data from additional drilling.

  17. Defining the uncertainty of electro-optical identification system performance estimates using a 3D optical environment derived from satellite

    NASA Astrophysics Data System (ADS)

    Ladner, S. D.; Arnone, R.; Casey, B.; Weidemann, A.; Gray, D.; Shulman, I.; Mahoney, K.; Giddings, T.; Shirron, J.

    2009-05-01

    Current United States Navy Mine-Counter-Measure (MCM) operations primarily use electro-optical identification (EOID) sensors to identify underwater targets after detection via acoustic sensors. These EOID sensors, which are based on underwater laser imaging, by design work best in "clear" waters and are limited in coastal waters, especially where strong optical layers are present. Optical properties, in particular scattering and absorption, play an important role in system performance. Surface optical properties from satellite alone are not adequate to determine how well a system will perform at depth, because of the existence of subsurface optical layers. Characterizing the spatial and temporal variability of the 3-D optics of coastal waters, along with the strength and location of subsurface optical layers, maximizes the chances of identifying underwater targets by enabling optimum sensor deployment. Advanced methods have been developed to fuse optical measurements from gliders, "surface" optical properties from satellite snapshots, and 3-D ocean circulation models to extend the two-dimensional (2-D) surface satellite optical image into a three-dimensional (3-D) optical volume with subsurface optical layers. Modifications were made to an EOID performance model to take as input a 3-D optical volume covering an entire region of interest and to derive a system performance field. These enhancements extend the present capability, based on glider optics and EOID sensor models, to estimate the system's "image quality", which yields performance information only at a single glider profile location in a very large operational region. Finally, we define the uncertainty of the system performance by coupling the EOID performance model with the 3-D optical volume uncertainties. Knowing the ensemble spread of the EOID performance field provides a new and unique capability for tactical decision makers and Navy operations.

  18. Kalman filter approach for uncertainty quantification in time-resolved laser-induced incandescence.

    PubMed

    Hadwin, Paul J; Sipkens, Timothy A; Thomson, Kevin A; Liu, Fengshan; Daun, Kyle J

    2018-03-01

    Time-resolved laser-induced incandescence (TiRe-LII) data can be used to infer spatially and temporally resolved volume fractions and primary particle size distributions of soot-laden aerosols, but these estimates are corrupted by measurement noise as well as uncertainties in the spectroscopic and heat transfer submodels used to interpret the data. Estimates of the temperature, concentration, and size distribution of soot primary particles within a sample aerosol are typically made by nonlinear regression of modeled spectral incandescence decay, or effective temperature decay, to experimental data. In this work, we employ nonstationary Bayesian estimation techniques to infer aerosol properties from simulated and experimental LII signals, specifically the extended Kalman filter and Schmidt-Kalman filter. These techniques exploit the time-varying nature of both the measurements and the models, and they reveal how uncertainty in the estimates computed from TiRe-LII data evolves over time. Both techniques perform better when compared with standard deterministic estimates; however, we demonstrate that the Schmidt-Kalman filter produces more realistic uncertainty estimates.
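
    To make the filtering idea concrete, here is a minimal scalar extended Kalman filter on an LII-like problem, tracking a decaying particle temperature from a noisy radiation-style signal; the cooling law, noise variances, and measurement model are assumed placeholders rather than the paper's spectroscopic and heat transfer submodels.

```python
# Minimal scalar EKF sketch: the posterior variance P shows how estimate
# uncertainty evolves over the decay, the point emphasized above.
import numpy as np

rng = np.random.default_rng(1)
dt, n_steps = 1e-8, 200          # 10 ns steps (assumed)
tau = 1.5e-6                     # assumed cooling time constant (s)
T_env, T0 = 300.0, 3200.0        # ambient and peak temperatures (K)

def f(T):                        # state transition: exponential cooling
    return T_env + (T - T_env) * np.exp(-dt / tau)

def h(T):                        # measurement: T^4 radiation-like signal
    return (T / 1000.0) ** 4

def H_jac(T):                    # dh/dT
    return 4.0 * T ** 3 / 1000.0 ** 4

F = np.exp(-dt / tau)            # df/dT (linear here)
Q, R = 25.0, 2.0 ** 2            # process and measurement noise variances

# Simulate a "true" trajectory and a noisy signal
T_true = np.empty(n_steps); T_true[0] = T0
for k in range(1, n_steps):
    T_true[k] = f(T_true[k - 1]) + rng.normal(0, np.sqrt(Q))
y = h(T_true) + rng.normal(0, np.sqrt(R), n_steps)

# EKF recursion: predict with the cooling model, update with each sample
T_est, P = 3000.0, 200.0 ** 2
for k in range(n_steps):
    T_est, P = f(T_est), F * P * F + Q           # predict
    Hk = H_jac(T_est)
    S = Hk * P * Hk + R                          # innovation variance
    K = P * Hk / S                               # Kalman gain
    T_est += K * (y[k] - h(T_est))               # update
    P *= (1.0 - K * Hk)
print(f"final estimate {T_est:.0f} K +/- {np.sqrt(P):.1f} K "
      f"(true {T_true[-1]:.0f} K)")
```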

  19. Assessing Uncertainties in Surface Water Security: A Probabilistic Multi-model Resampling approach

    NASA Astrophysics Data System (ADS)

    Rodrigues, D. B. B.

    2015-12-01

    Various uncertainties are involved in the representation of processes that characterize interactions between societal needs, ecosystem functioning, and hydrological conditions. Here, we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multi-model and resampling framework. We consider several uncertainty sources, including those related to: i) observed streamflow data; ii) hydrological model structure; iii) residual analysis; iv) the method for defining the Environmental Flow Requirement; v) the definition of critical conditions for water provision; and vi) the critical demand imposed by human activities. We estimate the overall uncertainty coming from the hydrological model by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multi-model framework with the uncertainty estimates provided by each model uncertainty estimation approach. The method is general and can be easily extended, forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.
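
    The block-bootstrap step can be sketched as follows: residuals between observed and simulated flows are resampled in contiguous blocks (preserving autocorrelation) and added back to the simulation to produce an ensemble from which confidence intervals are read. The series, block length, and 95% band below are illustrative assumptions, not the Cantareira data.

```python
# Minimal block-bootstrap sketch for confidence intervals on simulated flow.
import numpy as np

rng = np.random.default_rng(42)
n, block = 365, 30                       # one year of daily flows, 30-day blocks

q_obs = 5 + 2 * np.sin(np.arange(n) / 58.0) + rng.normal(0, 0.5, n)
q_sim = 5 + 2 * np.sin(np.arange(n) / 58.0)         # stand-in model output
resid = q_obs - q_sim

def block_bootstrap(resid, block, rng):
    """Resample residuals in contiguous blocks to preserve autocorrelation."""
    n_blocks = int(np.ceil(len(resid) / block))
    starts = rng.integers(0, len(resid) - block, size=n_blocks)
    draws = np.concatenate([resid[s:s + block] for s in starts])
    return draws[:len(resid)]

# Ensemble of plausible "observed" series: simulation + resampled residuals
ens = np.array([q_sim + block_bootstrap(resid, block, rng) for _ in range(2000)])
lo, hi = np.percentile(ens, [2.5, 97.5], axis=0)
print(f"mean 95% band width: {np.mean(hi - lo):.2f} m3/s")
```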

  20. SU-C-BRA-02: Gradient Based Method of Target Delineation On PET/MR Image of Head and Neck Cancer Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dance, M; Chera, B; Falchook, A

    2015-06-15

    Purpose: Validate the consistency of a gradient-based segmentation tool to facilitate accurate delineation of PET/CT-based GTVs in head and neck cancers by comparing against hybrid PET/MR-derived GTV contours. Materials and Methods: A total of 18 head and neck target volumes (10 primary and 8 nodal) were retrospectively contoured using a gradient-based segmentation tool by two observers. Each observer independently contoured each target five times. Inter-observer variability was evaluated via absolute percent differences. Intra-observer variability was examined by percentage uncertainty. All target volumes were also contoured using the SUV percent-threshold method. The thresholds were explored case by case so that the derived volume matched the gradient-based volume. Dice similarity coefficients (DSC) were calculated to determine the overlap of PET/CT GTVs and PET/MR GTVs. Results: Levene's test showed no statistically significant difference between the variances of the two observers' gradient-derived contours. However, the mean absolute difference between the observers' volumes was 10.83%, with a range from 0.39% up to 42.89%. PET-avid regions with qualitatively non-uniform shapes and intensity levels had a higher absolute percent difference, near 25%, while regions with uniform shapes and intensity levels had an absolute percent difference of 2% between observers. The average percentage uncertainty was 4.83% and 7% for the two observers. As the volume of the gradient-derived contours increased, the SUV threshold percent needed to match the volume decreased. Dice coefficients showed good agreement of the PET/CT and PET/MR GTVs, with an average DSC value across all volumes of 0.69. Conclusion: Gradient-based segmentation of PET volumes showed good consistency in general but can vary considerably for non-uniform target shapes and intensity levels. PET/CT-derived GTV contours stemming from the gradient-based tool show good agreement with the anatomically and metabolically more accurate PET/MR-derived GTV contours, but tumor delineation accuracy can be further improved with the use of PET/MR.
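
    For reference, the DSC used above is straightforward to compute from two binary masks; the sketch below uses synthetic spherical volumes, not patient contours.

```python
# Minimal Dice similarity coefficient (DSC) sketch on synthetic 3D masks.
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """DSC = 2|A ∩ B| / (|A| + |B|) for boolean masks (1.0 = identical)."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Synthetic example: two overlapping spheres on a 64^3 voxel grid
grid = np.indices((64, 64, 64))
ct_gtv = ((grid - np.array([32, 32, 32])[:, None, None, None]) ** 2).sum(0) < 10 ** 2
mr_gtv = ((grid - np.array([32, 32, 35])[:, None, None, None]) ** 2).sum(0) < 10 ** 2
print(f"DSC = {dice(ct_gtv, mr_gtv):.2f}")   # ~0.8 for a 3-voxel shift
```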

  1. Assessing uncertainties in crop and pasture ensemble model simulations of productivity and N2O emissions

    USDA-ARS?s Scientific Manuscript database

    Simulation models are extensively used to predict agricultural productivity and greenhouse gas (GHG) emissions. However, the uncertainties of (reduced) model ensemble simulations have not been assessed systematically for variables affecting food security and climate change mitigation, within multisp...

  2. Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin

    EPA Science Inventory

    We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...

  3. Probability and Confidence Trade-Space (PACT) Evaluation: Accounting for Uncertainty in Sparing Assessments

    NASA Technical Reports Server (NTRS)

    Anderson, Leif; Box, Neil; Carter-Journet, Katrina; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael

    2012-01-01

    Purpose of presentation: (1) Provide a status update on the developing methodology to revise sub-system sparing targets. (2) Describe how to incorporate uncertainty into sparing assessments and why it is important to do so. (3) Demonstrate hardware risk postures through PACT evaluation.

  4. Is the aerosol emission detectable in the thermal infrared?

    NASA Astrophysics Data System (ADS)

    Hollweg, H.-D.; Bakan, S.; Taylor, J. P.

    2006-08-01

    The impact of aerosols on thermal infrared radiation can be assessed by combining observations and radiative transfer calculations. Both have uncertainties, which are discussed in this paper. Observational uncertainties are obtained for two FTIR instruments operated side by side on the ground during the LACE 1998 field campaign. Radiative transfer uncertainties are assessed using a line-by-line model taking into account the uncertainties of the HITRAN 2004 spectroscopic database, uncertainties in the determination of the atmospheric profiles of water vapor and ozone, and differences in the treatment of the water vapor continuum absorption by the CKD 2.4.1 and MT_CKD 1.0 algorithms. The software package OPAC was used to describe the optical properties of aerosols for climate modeling. The corresponding radiative signature is a guideline to the assessment of the uncertainty ranges of observations and models. We found that the detection of aerosols depends strongly on the measurement accuracy of atmospheric profiles of water vapor and ozone and is easier under drier conditions. Within the atmospheric window, only the forcing of downward radiation at the surface by desert aerosol emerges clearly from the uncertainties of modeling and FTIR measurement. Urban and polluted continental aerosols are only partially detectable, depending on the wave number and on the atmospheric water vapor amount. Simulations for the space-borne interferometer IASI show that only upward radiation above transported mineral dust aloft emerges out of the uncertainties. The detection of aerosols with weak radiative impact by FTIR instruments like ARIES and OASIS is made difficult by noise, as demonstrated by the signal-to-noise ratio for clean continental aerosols. Altogether, the uncertainties found suggest that it is difficult to detect the optical depths of nonmineral and unpolluted aerosols.

  5. RECOVERY ACT - Methods for Decision under Technological Change Uncertainty and Risk Assessment for Integrated Assessment of Climate Change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webster, Mort David

    2015-03-10

    This report presents the final outcomes and products of the project as performed at the Massachusetts Institute of Technology. The research project consists of three main components: methodology development for decision-making under uncertainty, increasing the resolution of the electricity sector representation to improve integrated assessment, and application of these methods to integrated assessment. Results in each area are described in the report.

  6. Assessing uncertainties in surface water security: An empirical multimodel approach

    NASA Astrophysics Data System (ADS)

    Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo M.; Oliveira, Paulo Tarso S.

    2015-11-01

    Various uncertainties are involved in the representation of processes that characterize interactions among societal needs, ecosystem functioning, and hydrological conditions. Here we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multimodel and resampling framework. We consider several uncertainty sources including those related to (i) observed streamflow data; (ii) hydrological model structure; (iii) residual analysis; (iv) the method for defining Environmental Flow Requirement; (v) the definition of critical conditions for water provision; and (vi) the critical demand imposed by human activities. We estimate the overall hydrological model uncertainty by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km2 agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multimodel framework and the uncertainty estimates provided by each model uncertainty estimation approach. The range of values obtained for the water security indicators suggests that the models/methods are robust and perform well in a range of plausible situations. The method is general and can be easily extended, thereby forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.

  7. Evolution of motion uncertainty in rectal cancer: implications for adaptive radiotherapy

    NASA Astrophysics Data System (ADS)

    Kleijnen, Jean-Paul J. E.; van Asselen, Bram; Burbach, Johannes P. M.; Intven, Martijn; Philippens, Marielle E. P.; Reerink, Onne; Lagendijk, Jan J. W.; Raaymakers, Bas W.

    2016-01-01

    Reduction of motion uncertainty by applying adaptive radiotherapy strategies depends largely on the temporal behavior of this motion. To fully optimize adaptive strategies, insight into target motion is needed. The purpose of this study was to analyze the stability and evolution in time of the motion uncertainty of both the gross tumor volume (GTV) and the clinical target volume (CTV) for patients with rectal cancer. We scanned 16 patients daily during one week, on a 1.5 T MRI scanner in treatment position, prior to each radiotherapy fraction. Single-slice sagittal cine-MRIs were made at the beginning, middle, and end of each scan session, for one minute at 2 Hz temporal resolution. GTV and CTV motion were determined by registering a delineated reference frame to later time-points. The 95th percentile of observed motion (dist95%) was taken as a measure of motion. The stability of motion in time was evaluated within each cine-MRI separately. The evolution of motion was investigated between the reference frame and the cine-MRIs of a single scan session, and between the reference frame and the cine-MRIs several days later in the course of treatment. This observed motion was then converted into a PTV-margin estimate. Within a one-minute cine-MRI scan, motion was found to be stable and small. Independent of the time-point within the scan session, the average dist95% remains below 3.6 mm and 2.3 mm for the CTV and GTV, respectively, 90% of the time. We found similar motion over time intervals from 18 min to 4 days. Reducing the time interval from 18 min to 1 min yields a large reduction in motion uncertainty: the motion uncertainty, and thus the PTV-margin estimate, decreases by 71% and 75% for the CTV and tumor, respectively. Time intervals of 15 and 30 s yield no further reduction in motion uncertainty compared to a 1 min time interval.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liang, X; Li, Z; Zheng, D

    Purpose: In the context of evaluating dosimetric impacts of a variety of uncertainties involved in HDR Tandem-and-Ovoid treatment, to study the correlations between conventional point doses and 3D volumetric doses. Methods: For 5 cervical cancer patients treated with HDR T&O, 150 plans were retrospectively created to study the dosimetric impacts of the following uncertainties: (1) inter-fractional applicator displacement between two treatment fractions within a single insertion, by applying the Fraction#1 plan to the Fraction#2 CT; (2) positional dwell error simulated from −5mm to 5mm in 1mm steps; (3) simulated temporal dwell error of 0.05s, 0.1s, 0.5s, and 1s. The original plans were based on point dose prescription, from which the volume covered by the prescription dose was generated as the pseudo target volume to study the 3D target dose effect. OARs were contoured. The point and volumetric dose errors were calculated by taking the differences between the original and simulated plans. The correlations between the point and volumetric dose errors were analyzed. Results: For the most clinically relevant positional dwell uncertainty of 1mm, temporal uncertainty of 0.05s, and inter-fractional applicator displacement within the same insertion, the mean target D90 and V100 deviations were within 1%. Among these uncertainties, the applicator displacement showed the largest potential target coverage impact (2.6% on D90) as well as the largest OAR dose impact (2.5% and 3.4% on bladder D2cc and rectum D2cc). The Spearman correlation analysis shows a correlation coefficient of 0.43 with a p-value of 0.11 between target D90 coverage and H point dose. Conclusion: With the most clinically relevant positional and temporal dwell uncertainties and patient inter-fractional applicator displacement within the same insertion, the dose error is within the clinically acceptable range. The lack of correlation between H point and 3D volumetric dose errors is a motivator for the use of 3D treatment planning in cervical HDR brachytherapy.
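
    A minimal sketch of the correlation analysis mentioned in the Results is shown below, using scipy's Spearman test on synthetic point-dose and D90 errors; the coupling strength and sample size are arbitrary stand-ins for the study's perturbed plans.

```python
# Minimal Spearman correlation sketch between point-dose and D90 errors.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
n = 15
h_point_err = rng.normal(0.0, 1.5, n)                    # % error at point H
d90_err = 0.4 * h_point_err + rng.normal(0.0, 1.5, n)    # weakly coupled D90 error

rho, p = spearmanr(h_point_err, d90_err)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
# A weak, non-significant correlation (the abstract reports rho = 0.43,
# p = 0.11) argues for 3D dose evaluation rather than point doses alone.
```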

  9. Testing Map Features Designed to Convey the Uncertainty of Cancer Risk: Insights Gained From Assessing Judgments of Information Adequacy and Communication Goals

    PubMed Central

    Severtson, Dolores J.

    2015-01-01

    Barriers to communicating the uncertainty of environmental health risks include preferences for certain information and low numeracy. Map features designed to communicate the magnitude and uncertainty of estimated cancer risk from air pollution were tested among 826 participants to assess how map features influenced judgments of adequacy and the intended communication goals. An uncertain versus certain visual feature was judged as less adequate but met both communication goals and addressed numeracy barriers. Expressing relative risk using words communicated uncertainty and addressed numeracy barriers but was judged as highly inadequate. Risk communication and visual cognition concepts were applied to explain findings. PMID:26412960

  10. Testing Map Features Designed to Convey the Uncertainty of Cancer Risk: Insights Gained From Assessing Judgments of Information Adequacy and Communication Goals.

    PubMed

    Severtson, Dolores J

    2015-02-01

    Barriers to communicating the uncertainty of environmental health risks include preferences for certain information and low numeracy. Map features designed to communicate the magnitude and uncertainty of estimated cancer risk from air pollution were tested among 826 participants to assess how map features influenced judgments of adequacy and the intended communication goals. An uncertain versus certain visual feature was judged as less adequate but met both communication goals and addressed numeracy barriers. Expressing relative risk using words communicated uncertainty and addressed numeracy barriers but was judged as highly inadequate. Risk communication and visual cognition concepts were applied to explain findings.

  11. SU-E-T-287: Robustness Study of Passive-Scattering Proton Therapy in Lung: Is Range and Setup Uncertainty Calculation On the Initial CT Enough to Predict the Plan Robustness?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, X; Dormer, J; Kenton, O

    Purpose: The plan robustness of passive-scattering proton therapy treatment of lung tumors has been studied previously using combined uncertainties of 3.5% in CT number and 3 mm geometric shifts. In this study, we investigate whether this method is sufficient to predict proton plan robustness by comparing against plans computed on weekly verification CT scans. Methods: Ten lung cancer patients treated with passive-scattering proton therapy were randomly selected. All plans were prescribed 6660cGy in 37 fractions. Each initial plan was calculated using +/−3.5% range and +/−0.3cm setup uncertainty in the x, y, and z directions in the Eclipse TPS (Method-A). Throughout the treatment course, patients received weekly verification CT scans to assess the daily treatment variation (Method-B). After contours and image registrations were verified by the physician, the initial plan with the same beamline and compensator was mapped onto the verification CT. Dose volume histograms (DVH) were evaluated for the robustness study. Results: Differences are observed between Methods A and B in terms of iCTV coverage and lung dose. Method-A shows all iCTV D95 values within +/−1% difference, while 20% of cases fall outside the +/−1% range in Method-B. In the worst-case scenario (WCS), the iCTV D95 is reduced by 2.5%. All lung V5 and V20 values are within +/−5% in Method-A, while 15% of V5 and 10% of V20 values fall outside +/−5% in Method-B. In the WCS, lung V5 increased by 15% and V20 increased by 9%. Methods A and B show good agreement with regard to the cord maximum and esophagus mean doses. Conclusion: This study suggests that using a range and setup uncertainty calculation (+/−3.5% and +/−3mm) may not be sufficient to predict the WCS. In the absence of regular verification scans, expanding the conventional uncertainty parameters (e.g., to +/−3.5% and +/−4mm) may be needed to better reflect actual plan robustness.

  12. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    NASA Technical Reports Server (NTRS)

    Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, this calculation approach produces an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from the exercise of this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from, or upgraded to, a serious risk designation on the basis of consideration of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.
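
    One way to realize such a Pc probability density is by Monte Carlo: sample the uncertain inputs (here a scale factor on the combined covariance and the hard-body radius) and recompute Pc for each draw. The encounter geometry, covariance, and uncertainty ranges below are hypothetical, and the simple 2-D sampling estimator stands in for CARA's actual Pc computation.

```python
# Minimal sketch of turning a single Pc into a distribution over Pc.
import numpy as np

rng = np.random.default_rng(11)
miss = np.array([120.0, 80.0])                    # miss vector, encounter plane (m)
cov = np.array([[90.0**2, 0.0], [0.0, 60.0**2]])  # combined covariance (m^2)

def pc_mc(miss, cov, hbr, n=200_000, rng=rng):
    """Monte Carlo Pc: fraction of sampled relative positions within HBR."""
    pts = rng.multivariate_normal(miss, cov, size=n)
    return np.mean(np.hypot(pts[:, 0], pts[:, 1]) < hbr)

# Propagate input uncertainty: lognormal covariance scale, uniform HBR
pcs = np.array([
    pc_mc(miss, cov * rng.lognormal(0.0, 0.3), rng.uniform(15.0, 25.0), n=20_000)
    for _ in range(500)
])
print(f"nominal Pc ~ {pc_mc(miss, cov, 20.0):.2e}")
print(f"Pc percentiles (5/50/95): {np.percentile(pcs, [5, 50, 95])}")
```

    The spread between the 5th and 95th percentiles is what can move an event across a risk threshold even when the nominal Pc alone would not.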

  13. Assessment of check-dam groundwater recharge with water-balance calculations

    NASA Astrophysics Data System (ADS)

    Djuma, Hakan; Bruggeman, Adriana; Camera, Corrado; Eliades, Marinos

    2017-04-01

    Studies on the enhancement of groundwater recharge by check-dams in arid and semi-arid environments mainly focus on deriving water infiltration rates from the check-dam ponding areas. This is usually achieved by applying simple water balance models, more advanced models (e.g., two-dimensional groundwater models), or field tests (e.g., infiltrometer or soil pit tests). Recharge behind the check-dam can be affected by the build-up of sediment resulting from erosion in the upstream watershed area. This natural process can increase the uncertainty in the estimates of the recharged water volume, especially for water balance calculations. Few water balance field studies of individual check-dams have been presented in the literature, and none of them reported the associated uncertainties of their estimates. The objectives of this study are i) to assess the effect of a check-dam on groundwater recharge from an ephemeral river; and ii) to assess annual sedimentation at the check-dam during a 4-year period. The study was conducted on a check-dam on the semi-arid island of Cyprus. Field campaigns were carried out to measure water flow, water depth, and check-dam topography in order to establish check-dam water height, volume, evaporation, outflow, and recharge relations. Topographic surveys were repeated at the end of consecutive hydrological years to estimate the sediment built up in the reservoir area of the check-dam. Also, sediment samples were collected from the check-dam reservoir area for bulk-density analyses. To quantify the groundwater recharge, a water balance model was applied at two locations: at the check-dam and corresponding reservoir area, and at a 4-km stretch of the river bed without a check-dam. Results showed that a check-dam with a storage capacity of 25,000 m3 was able to recharge to the aquifer, in four years, a total of 12 million m3 out of the 42 million m3 of measured (or modelled) streamflow. Recharge from the analyzed 4-km river section without a check-dam was estimated to be 1 million m3. Upper and lower limits of prediction intervals were computed to assess the uncertainties of the results. The model was rerun with these values, resulting in recharge values of 0.4 m3 as the lower limit and 38 million m3 as the upper limit. The sediment survey in the check-dam reservoir area showed that the reservoir area was filled with 2,000 to 3,000 tons of sediment after one rainfall season. This amount of sediment corresponds to a sediment yield of 0.2 to 2 t ha-1 y-1 at the watershed level and reduces the check-dam storage capacity by approximately 10%. Results indicate that check-dams are valuable structures for increasing groundwater resources, but special attention should be given to soil erosion occurring in the upstream area and the resulting sediment build-up in the check-dam reservoir area. This study has received funding from the EU FP7 RECARE Project (GA 603498).
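
    A minimal daily water-balance sketch in the spirit of the method is shown below; the stage-area relation, infiltration fraction, and forcing series are hypothetical placeholders, with only the 25,000 m3 storage capacity taken from the abstract.

```python
# Minimal daily water balance for a check-dam reservoir: inflow fills
# storage, spill overtops the crest, evaporation scales with ponded area,
# and an assumed first-order infiltration loss stands in for recharge.
import numpy as np

rng = np.random.default_rng(5)
days = 120
inflow = np.maximum(rng.normal(0, 400, days), 0)     # m3/day, ephemeral flow
evap_rate = 0.004                                    # m/day (assumed)
capacity = 25_000.0                                  # m3, as in the abstract

def area_from_storage(v):
    """Assumed ponded area (m2) as a power law of storage (m3)."""
    return 60.0 * v ** 0.6 if v > 0 else 0.0

storage, recharge_total, spill_total = 0.0, 0.0, 0.0
k_inf = 0.08                                         # assumed daily infiltration fraction
for q_in in inflow:
    storage += q_in
    spill = max(storage - capacity, 0.0)             # overtops the dam crest
    storage -= spill
    evap = min(evap_rate * area_from_storage(storage), storage)
    storage -= evap
    recharge = k_inf * storage                       # assumed first-order loss
    storage -= recharge
    recharge_total += recharge
    spill_total += spill

print(f"recharged {recharge_total:,.0f} m3; spilled {spill_total:,.0f} m3 "
      f"of {inflow.sum():,.0f} m3 inflow")
```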

  14. Assessing performance of flaw characterization methods through uncertainty propagation

    NASA Astrophysics Data System (ADS)

    Miorelli, R.; Le Bourdais, F.; Artusi, X.

    2018-04-01

    In this work, we assess inversion performance in terms of crack characterization and localization based on synthetic signals associated with ultrasonic and eddy current physics. More precisely, two different standard iterative inversion algorithms are used to minimize the discrepancy between measurements (i.e., the tested data) and simulations. Furthermore, in order to reduce the computational time and avoid the computational burden often associated with iterative inversion algorithms, we replace the standard forward solver by a suitable metamodel fitted on a database built offline. In a second step, we assess the inversion performance by adding uncertainties on a subset of the database parameters and then, through the metamodel, propagating these uncertainties within the inversion procedure. The fast propagation of uncertainties enables efficient evaluation of the impact of the lack of knowledge of some parameters employed to describe the inspection scenarios, a situation commonly encountered in the industrial NDE context.
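
    The metamodel-accelerated inversion can be sketched as follows: fit a surrogate on an offline database of forward simulations, then invert a measured signal by least squares over the surrogate. The stand-in simulator, the (depth, length) flaw parameterization, and the RBF surrogate choice are illustrative assumptions, not the paper's solvers.

```python
# Minimal surrogate-based inversion sketch: database -> metamodel -> fit.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

def simulator(params):
    """Stand-in forward model: maps (depth, length) to a 16-sample signal."""
    depth, length = params
    t = np.linspace(0, 1, 16)
    return depth * np.exp(-((t - 0.5) / (0.1 + 0.2 * length)) ** 2)

# Offline database (built once); the surrogate then replaces the solver
rng = np.random.default_rng(2)
X = rng.uniform([0.5, 0.1], [3.0, 1.0], size=(400, 2))   # (depth, length) samples
Y = np.array([simulator(x) for x in X])
surrogate = RBFInterpolator(X, Y, smoothing=1e-10)

truth = np.array([1.8, 0.6])
measured = simulator(truth) + rng.normal(0, 0.02, 16)

def misfit(p):
    return np.sum((surrogate(p[None, :])[0] - measured) ** 2)

fit = minimize(misfit, x0=np.array([1.0, 0.5]),
               bounds=[(0.5, 3.0), (0.1, 1.0)], method="L-BFGS-B")
print(f"estimated flaw: depth={fit.x[0]:.2f}, length={fit.x[1]:.2f} (true {truth})")
```

    Because each surrogate evaluation is cheap, the same loop can be rerun thousands of times with perturbed database parameters to propagate their uncertainty through the inversion.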

  15. Assessment of the Gaussian Covariance Approximation over an Earth-Asteroid Encounter Period

    NASA Technical Reports Server (NTRS)

    Mattern, Daniel

    2017-01-01

    In assessing the risk an asteroid may pose to the Earth, the asteroid's state is often predicted for many years, often decades. Only by accounting for the asteroid's initial state uncertainty can a measure of the risk be calculated. With the asteroid's state uncertainty growing as a function of the initial velocity uncertainty, the orbit velocity at the last state update, and the time from the last update to the epoch of interest, the asteroid's position uncertainties can grow to many times the size of the Earth when propagated to the encounter risk corridor. This paper examines the merits of propagating the asteroid's state covariance as an analytical matrix. The results of this study help to bound the efficacy of applying different metrics for assessing the risk an asteroid poses to the Earth. Additionally, this work identifies a criterion for when different covariance propagation methods are needed to continue predictions after an Earth-encounter period.
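
    For background, the Gaussian approximation in question propagates the covariance linearly, P(t) = Phi P0 Phi^T for a state transition matrix Phi; the one-dimensional constant-velocity sketch below, with hypothetical numbers, shows how a small velocity uncertainty inflates the position uncertainty to Earth scale over a decade.

```python
# Minimal linear (Gaussian) covariance propagation sketch.
import numpy as np

dt = 10 * 365.25 * 86400.0                 # ten years, in seconds
Phi = np.array([[1.0, dt],                 # position += velocity * dt
                [0.0, 1.0]])
P0 = np.diag([1.0e3 ** 2, 0.05 ** 2])      # SDs: 1 km position, 5 cm/s velocity

P = Phi @ P0 @ Phi.T                       # Gaussian (linear) propagation
print(f"position SD after 10 yr: {np.sqrt(P[0, 0]) / 1e3:,.0f} km")
# ~15,800 km: many times the size of the Earth, as the abstract notes. The
# approximation holds only while the dynamics stay near-linear; a close
# planetary encounter distorts the distribution, motivating the assessment.
```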

  16. Quantifying uncertainty in Bayesian calibrated animal-to-human PBPK models with informative prior distributions

    EPA Science Inventory

    Understanding and quantifying the uncertainty of model parameters and predictions has gained more interest in recent years with the increased use of computational models in chemical risk assessment. Fully characterizing the uncertainty in risk metrics derived from linked quantita...

  17. Uncertainty Modeling of Pollutant Transport in Atmosphere and Aquatic Route Using Soft Computing

    NASA Astrophysics Data System (ADS)

    Datta, D.

    2010-10-01

    Hazardous radionuclides are released as pollutants into the atmospheric and aquatic environment (ATAQE) during the normal operation of nuclear power plants. Atmospheric and aquatic dispersion models are routinely used to assess the impact of releases of radionuclides from any nuclear facility, or of hazardous chemicals from any chemical plant, on the ATAQE. The effect of exposure to the hazardous nuclides or chemicals is measured in terms of risk. Uncertainty modeling is an integral part of risk assessment. This paper focuses on uncertainty modeling of pollutant transport in the atmospheric and aquatic environment using soft computing. Soft computing is employed because of the lack of information on the parameters of the corresponding models. In this domain, soft computing essentially means using fuzzy set theory to explore the uncertainty of the model parameters; this type of uncertainty is called epistemic uncertainty. Each uncertain input parameter of the model is described by a triangular membership function.
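
    The standard way to propagate a triangular fuzzy parameter is via alpha-cuts: each membership level alpha defines an interval for the parameter, and the interval endpoints bound the output of a monotone model. The sketch below uses an assumed wind-speed membership function and a toy dispersion relation, not the paper's transport models.

```python
# Minimal alpha-cut propagation of a triangular fuzzy parameter.

def concentration(u):
    """Toy ground-level concentration vs wind speed u (monotone decreasing)."""
    return 100.0 / u

# Triangular membership for wind speed: (low, mode, high) in m/s (assumed)
lo, mode, hi = 2.0, 5.0, 9.0

for alpha in (0.0, 0.5, 1.0):
    # The alpha-cut of a triangular fuzzy number is an interval
    u_lo = lo + alpha * (mode - lo)
    u_hi = hi - alpha * (hi - mode)
    # Monotone decreasing model: the output interval flips the endpoints
    c_lo, c_hi = concentration(u_hi), concentration(u_lo)
    print(f"alpha={alpha:.1f}: u in [{u_lo:.1f}, {u_hi:.1f}] m/s "
          f"-> C in [{c_lo:.1f}, {c_hi:.1f}] ug/m3")
```

    Stacking the output intervals over all alpha levels reconstructs the fuzzy membership function of the predicted concentration.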

  18. A model-averaging method for assessing groundwater conceptual model uncertainty.

    PubMed

    Ye, Ming; Pohlmann, Karl F; Chapman, Jenny B; Pohll, Greg M; Reeves, Donald M

    2010-01-01

    This study evaluates alternative groundwater models with different recharge and geologic components at the northern Yucca Flat area of the Death Valley Regional Flow System (DVRFS), USA. Recharge over the DVRFS has been estimated using five methods, and five geological interpretations are available at the northern Yucca Flat area. Combining the recharge and geological components together with additional modeling components that represent other hydrogeological conditions yields a total of 25 groundwater flow models. As all the models are plausible given available data and information, evaluating model uncertainty becomes inevitable. On the other hand, hydraulic parameters (e.g., hydraulic conductivity) are uncertain in each model, giving rise to parametric uncertainty. Propagation of the uncertainty in the models and model parameters through groundwater modeling causes predictive uncertainty in model predictions (e.g., hydraulic head and flow). Parametric uncertainty within each model is assessed using Monte Carlo simulation, and model uncertainty is evaluated using the model averaging method. Two model-averaging techniques (on the basis of information criteria and GLUE) are discussed. This study shows that the contribution of model uncertainty to predictive uncertainty is significantly larger than that of parametric uncertainty. For the recharge and geological components, uncertainty in the geological interpretations has a more significant effect on model predictions than uncertainty in the recharge estimates. In addition, weighted residuals vary more for the different geological models than for the different recharge models. Most of the calibrated observations are not important for discriminating between the alternative models, because their weighted residuals vary only slightly from one model to another.
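
    A minimal sketch of information-criterion model averaging, one of the two techniques discussed, is shown below: per-model AIC values are converted into weights, and the prediction variance splits into within-model (parametric) and between-model terms. The AIC values, predictions, and variances are illustrative placeholders for the 25 models.

```python
# Minimal information-criterion model averaging sketch.
import numpy as np

aic = np.array([210.3, 212.1, 215.8, 211.0])        # one value per model
pred = np.array([12.4, 13.1, 11.8, 12.9])           # predicted head (m)
var_within = np.array([0.4, 0.5, 0.3, 0.6])         # parametric variance per model

# Akaike weights: w_k proportional to exp(-delta_k / 2)
delta = aic - aic.min()
w = np.exp(-0.5 * delta)
w /= w.sum()

mean = np.sum(w * pred)
# Law of total variance: parametric (within-model) + model (between-model)
var_between = np.sum(w * (pred - mean) ** 2)
var = np.sum(w * var_within) + var_between
print(f"weights: {np.round(w, 3)}")
print(f"averaged prediction: {mean:.2f} m, SD {np.sqrt(var):.2f} m "
      f"(between-model share: {var_between / var:.0%})")
```

    The between-model share printed at the end is exactly the quantity the study finds to dominate: model uncertainty contributing more to predictive uncertainty than parametric uncertainty.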

  19. Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  20. DEVELOPMENTS AT U.S. EPA IN ADDRESSING UNCERTAINTY IN RISK ASSESSMENT

    EPA Science Inventory

    An emerging trend in risk assessment is to be more explicit about uncertainties, both during the analytical procedures and in communicating results. In February 1 992, then-Deputy EPA Administrator Henry Habicht set out Agency goals in a memorandum stating that the Agency will "p...

  1. SU-E-T-615: Plan Comparison Between Photon IMRT and Proton Plans Incorporating Uncertainty Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, C; Wessels, B; Jesseph, F

    2015-06-15

    Purpose: In this study, we investigate the effect of setup uncertainty on DVH calculations, which may impact plan comparison. Methods: Treatment plans (6 MV VMAT calculated on the Pinnacle TPS) were chosen for different disease sites (brain, prostate, H&N, and spine) in this retrospective study. A proton plan (PP) using double-scattering beams was generated for each selected VMAT plan, subject to the same set of dose-volume constraints as the VMAT plan. An uncertainty analysis was incorporated into the DVH calculations, in which isocenter shifts from 1 to 5 mm in each of the ±x, ±y, and ±z directions were used to simulate setup uncertainty and residual positioning errors. A total of 40 different combinations of isocenter shifts were used in the recalculation of the DVH of the PTV and the various OARs for both the VMAT plan and the corresponding PP. Results: For the brain case, the VMAT plan and PP are comparable in PTV coverage and OAR sparing, and VMAT is a clear choice for treatment due to its ease of delivery. However, when incorporating isocenter shifts in the DVH calculations, a significant change in the dose-volume relationship emerges. For example, both plans provide adequate coverage, even with a ±3mm isocenter shift. However, a +3mm shift increases V40(L cochlea, VMAT) from 7.2% in the original plan to 45% and V40(R cochlea, VMAT) from 75% to 92%. For protons, V40(L cochlea, PP) increases from 62% in the initial plan to 75%, while V40(R cochlea, PP) increases from 7% to 26%. Conclusion: DVH alone may not be sufficient to allow an unequivocal decision in plan comparison, especially when two rival plans are very similar in both PTV coverage and OAR sparing. It is good practice to incorporate uncertainty analysis in photon and proton plan comparison studies to test plan robustness in plan evaluation.
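
    To make the recalculation step concrete, the sketch below re-evaluates a DVH point metric (V40) for a toy structure after shifting a synthetic dose grid; the dose model, mask, and voxel size are assumptions, not data from the study.

```python
# Minimal sketch of a V40 DVH metric recomputed under isocenter shifts.
import numpy as np

rng = np.random.default_rng(6)
shape = (60, 60, 60)                       # 2 mm voxels (assumed)
z, y, x = np.indices(shape)
dose = 60.0 * np.exp(-((x - 30) ** 2 + (y - 30) ** 2 + (z - 30) ** 2) / 300.0)

cochlea = np.zeros(shape, dtype=bool)      # toy OAR near the field edge
cochlea[28:32, 28:32, 38:42] = True

def v40(dose, mask):
    """Fraction of the structure receiving at least 40 Gy."""
    return float(np.mean(dose[mask] >= 40.0))

print(f"nominal V40: {v40(dose, cochlea):.1%}")
for shift in (-2, -1, 1, 2):               # voxel shifts along x (+/- 2-4 mm)
    shifted = np.roll(dose, shift, axis=2)
    print(f"x shift {2 * shift:+d} mm -> V40: {v40(shifted, cochlea):.1%}")
```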

  2. SU-E-T-625: Robustness Evaluation and Robust Optimization of IMPT Plans Based on Per-Voxel Standard Deviation of Dose Distributions.

    PubMed

    Liu, W; Mohan, R

    2012-06-01

    Proton dose distributions, IMPT in particular, are highly sensitive to setup and range uncertainties. We report a novel method, based on the per-voxel standard deviation (SD) of dose distributions, to evaluate the robustness of proton plans and to robustly optimize IMPT plans so as to render them less sensitive to uncertainties. For each optimization iteration, nine dose distributions are computed: the nominal one, and one each for ± setup uncertainties along the x, y, and z axes and for ± range uncertainty. The SD of dose in each voxel is used to create an SD-volume histogram (SVH) for each structure. The SVH may be considered a quantitative representation of the robustness of the dose distribution. For optimization, the desired robustness may be specified in terms of an SD-volume (SV) constraint on the CTV and incorporated as a term in the objective function. Results of optimization with and without this constraint were compared in terms of plan optimality and robustness using the so-called 'worst-case' dose distributions, which are obtained by assigning the lowest among the nine doses to each voxel in the clinical target volume (CTV) and the highest to normal-tissue voxels outside the CTV. The SVH curve and the area under it for each structure were used as quantitative measures of robustness. The penalty parameter of the SV constraint may be varied to control the trade-off between robustness and plan optimality. We applied these methods to one case each of H&N and lung. In both cases, we found that imposing the SV constraint improved plan robustness, but at the cost of normal-tissue sparing. SVH-based optimization and evaluation is an effective tool for robustness evaluation and robust optimization of IMPT plans. Studies need to be conducted to test the methods for larger cohorts of patients and for other sites. This research is supported by National Cancer Institute (NCI) grant P01CA021239, the University Cancer Foundation via the Institutional Research Grant program at the University of Texas MD Anderson Cancer Center, and MD Anderson's cancer center support grant CA016672.
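
    The per-voxel SD metric and the SVH are simple to compute once the nine scenario doses are available; the sketch below uses synthetic dose grids and an arbitrary CTV mask, and scores robustness by the area under the SVH as described.

```python
# Minimal per-voxel SD and SD-volume histogram (SVH) sketch.
import numpy as np

rng = np.random.default_rng(4)
shape = (40, 40, 40)
nominal = rng.gamma(50.0, 1.0, shape)            # stand-in dose grid

# Nominal plus 8 perturbed scenarios (± x/y/z setup shifts, ± range)
doses = np.stack([nominal] +
                 [nominal * rng.normal(1.0, 0.03, shape) for _ in range(8)])

sd = doses.std(axis=0)                           # per-voxel SD over 9 scenarios

ctv = np.zeros(shape, dtype=bool)                # toy CTV mask
ctv[15:25, 15:25, 15:25] = True

# SVH: fraction of CTV volume whose SD exceeds each threshold
thresholds = np.linspace(0.0, sd[ctv].max(), 50)
svh = np.array([(sd[ctv] >= t).mean() for t in thresholds])
auc = np.sum(np.diff(thresholds) * (svh[:-1] + svh[1:]) / 2)   # area under SVH
print(f"mean CTV SD: {sd[ctv].mean():.2f} Gy; "
      f"SVH area: {auc:.2f} (lower = more robust)")
```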

  3. Treatment of uncertainties in the IPCC: a philosophical analysis

    NASA Astrophysics Data System (ADS)

    Jebeile, J.; Drouet, I.

    2014-12-01

    The IPCC produces scientific reports out of findings on climate and climate change. Because the findings are uncertain in many respects, the production of reports requires aggregating assessments of uncertainties of different kinds. This difficult task is currently regulated by the Guidance Note for Lead Authors of the IPCC Fifth Assessment Report on Consistent Treatment of Uncertainties. The note recommends that two metrics, confidence and likelihood, be used for communicating the degree of certainty in findings. Confidence is expressed qualitatively, "based on the type, amount, quality, and consistency of evidence [...] and the degree of agreement", while likelihood is expressed probabilistically, "based on statistical analysis of observations or model results, or expert judgment". Therefore, depending on the evidence evaluated, authors may present either an assigned level of confidence or a quantified measure of likelihood. But the two metrics express distinct and potentially conflicting methodologies for aggregating assessments of uncertainty. So the question arises whether the treatment of uncertainties in the IPCC is rationally justified. In order to answer the question, it is worth comparing the IPCC procedures with the formal normative theories of epistemic rationality that have been developed by philosophers. These theories, which include contributions to the philosophy of probability and to Bayesian probabilistic confirmation theory, are relevant for our purpose because they are commonly used to assess the rationality of collective judgement formation based on uncertain knowledge. In this paper we make the comparison and pursue the following objectives: (i) we determine whether the IPCC confidence and likelihood can be compared with the notions of uncertainty targeted by, or underlying, the formal normative theories of epistemic rationality; (ii) we investigate whether the formal normative theories of epistemic rationality justify treating uncertainty along those two dimensions, and indicate how this can be avoided.

  4. Assessing risk of baleen whale hearing loss from seismic surveys: The effect of uncertainty and individual variation.

    PubMed

    Gedamke, Jason; Gales, Nick; Frydman, Sascha

    2011-01-01

    The potential for seismic airgun "shots" to cause acoustic trauma in marine mammals is poorly understood. There are just two empirical measurements of temporary threshold shift (TTS) onset levels from airgun-like sounds in odontocetes. Considering these limited data, a model was developed to examine the impact of individual variability and uncertainty on risk assessment of baleen whale TTS from seismic surveys. In each of 100 simulations, 10000 "whales" are assigned TTS onset levels accounting for inter-individual variation, uncertainty over the population's mean, and uncertainty over the weighting of odontocete data used to obtain baleen whale onset levels. Randomly distributed whales are exposed to one seismic survey passage, with the cumulative exposure level calculated. In the base scenario, 29% of whales (5th/95th percentiles of 10%/62%) that approached to within 1-1.2 km range were exposed to levels sufficient for TTS onset. By comparison, no whales are at risk outside 0.6 km when uncertainty and variability are not considered. Potentially "exposure-altering" parameters (movement, avoidance, surfacing, and effective quiet) were also simulated. Until more research refines the model inputs, the results suggest a reasonable likelihood that whales at a kilometer or more from seismic surveys could be susceptible to TTS, and demonstrate the large impact uncertainty and variability can have on risk assessment.
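
    A minimal Monte Carlo version of this simulation structure is sketched below; the source level, spreading law, and threshold mean and spread are hypothetical placeholders chosen only to produce risk fractions of the same order as those reported.

```python
# Minimal Monte Carlo sketch: per-whale TTS thresholds combine individual
# variation with uncertainty about the population mean; exposure falls off
# with range. All numbers are illustrative, not the study's inputs.
import numpy as np

rng = np.random.default_rng(8)
n_sims, n_whales = 100, 10_000

def cumulative_sel(r_m, source_sel=240.0):
    """Assumed spherical-spreading cumulative SEL (dB) at range r (m)."""
    return source_sel - 20.0 * np.log10(r_m)

risk_fracs = []
for _ in range(n_sims):
    pop_mean = rng.normal(183.0, 3.0)                     # uncertain population mean
    onset = pop_mean + rng.normal(0.0, 4.0, n_whales)     # individual variation
    ranges = rng.uniform(1000.0, 1200.0, n_whales)        # whales at 1-1.2 km
    risk_fracs.append(np.mean(cumulative_sel(ranges) >= onset))

p5, p50, p95 = np.percentile(risk_fracs, [5, 50, 95])
print(f"fraction at risk of TTS: median {p50:.0%} (5th/95th: {p5:.0%}/{p95:.0%})")
```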

  5. Risk Assessment of Groundwater Contamination: A Multilevel Fuzzy Comprehensive Evaluation Approach Based on DRASTIC Model

    PubMed Central

    Zhang, Yan; Zhong, Ming

    2013-01-01

    Groundwater contamination is a serious threat to water supply. Risk assessment of groundwater contamination is an effective way to protect the safety of groundwater resources. Groundwater is a complex and fuzzy system with many uncertainties, influenced by different geological and hydrological factors. To deal with the uncertainty in the risk assessment of groundwater contamination, we propose an approach that integrates the analytic hierarchy process with fuzzy comprehensive evaluation. First, the risk factors of groundwater contamination are identified by the source-pathway-receptor-consequence method, and a corresponding index system for risk assessment based on the DRASTIC model is established. Because of the complexity of the transitions between possible pollution risks and the uncertainty of the factors, the analytic hierarchy process is applied to determine the weight of each factor, and fuzzy set theory is adopted to calculate the membership degree of each factor. Finally, a case study is presented to illustrate and test the methodology. It is concluded that the proposed approach combines the advantages of the analytic hierarchy process and fuzzy comprehensive evaluation, providing a more flexible and reliable way to deal with the linguistic and mechanism uncertainty in groundwater contamination without losing important information. PMID:24453883
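    A minimal sketch of the two ingredients, analytic-hierarchy-process weights from a pairwise comparison matrix and a fuzzy composite of membership degrees, might look as follows. The 3x3 matrix and membership values are placeholders rather than the DRASTIC factors of the paper, and the weighted-average composition operator is one of several possible choices:

```python
import numpy as np

# AHP: weights from a pairwise comparison matrix (illustrative 3x3 example)
# via the principal eigenvector.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()                      # normalised factor weights

# Consistency check (random index RI = 0.58 for n = 3).
lam = np.max(np.real(eigvals))
CI = (lam - len(A)) / (len(A) - 1)
assert CI / 0.58 < 0.1, "pairwise judgements are inconsistent"

# Fuzzy comprehensive evaluation: R[i, j] = membership of factor i in
# risk grade j (low/medium/high); values are placeholders.
R = np.array([[0.1, 0.3, 0.6],
              [0.5, 0.4, 0.1],
              [0.2, 0.5, 0.3]])
B = w @ R                            # weighted-average composition operator
print("risk-grade memberships:", B.round(3))
```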

  6. Effect of soil property uncertainties on permafrost thaw projections: a calibration-constrained analysis

    NASA Astrophysics Data System (ADS)

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2016-02-01

    The effects of soil property uncertainties on permafrost thaw projections are studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The null-space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) under the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of predictive uncertainty (due to soil property, i.e. parametric, uncertainty) and of the inter-annual climate variability due to year-to-year differences in the CESM climate forcings. Even after calibration to measured borehole temperature data at this well-characterized site, soil property uncertainties remain significant and result in significant predictive uncertainties in projected active layer thickness (ALT) and annual thaw depth-duration, even with a specified future climate. Inter-annual climate variability in projected soil moisture content and Stefan number is small. A volume- and time-integrated Stefan number decreases significantly, indicating a shift in subsurface energy utilization in the future climate (latent heat of phase change becomes more important than heat conduction). Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics obtained with different climate models, we quantify the relative magnitude of soil property uncertainty against another source of permafrost projection uncertainty, structural climate model uncertainty. We show that the effect of calibration-constrained uncertainty in soil properties, although significant, is less than that produced by structural climate model uncertainty for this location.
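    For readers unfamiliar with the diagnostic, the Stefan number compares sensible heat to the latent heat of phase change, so a decreasing value signals that melting dominates the subsurface energy budget. A sketch of the classic bulk definition follows; the paper uses a volume- and time-integrated variant that is not reproduced here, and the numbers below are generic textbook values:

```python
def stefan_number(c_p, delta_T, latent_heat):
    """Classic Stefan number St = c_p * dT / L (per unit mass, SI units):
    ratio of sensible heat to latent heat of phase change."""
    return c_p * delta_T / latent_heat

# Illustrative values: heat capacity of ice ~2100 J/(kg K), a 5 K seasonal
# temperature swing, latent heat of fusion of water ~3.34e5 J/kg.
print(f"St = {stefan_number(2100.0, 5.0, 3.34e5):.3f}")  # ~0.031
```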

  7. Development and Testing of Neutron Cross Section Covariance Data for SCALE 6.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marshall, William BJ J; Williams, Mark L; Wiarda, Dorothea

    2015-01-01

    Neutron cross-section covariance data are essential for many sensitivity/uncertainty and uncertainty quantification assessments performed both within the TSUNAMI suite and more broadly throughout the SCALE code system. The release of ENDF/B-VII.1 included a more complete set of neutron cross-section covariance data; these data form the basis for a new cross-section covariance library to be released in SCALE 6.2. A range of testing is conducted to investigate the properties of these covariance data and to ensure that the data are reasonable. These tests include examination of the uncertainty in critical experiment benchmark model keff values due to nuclear data uncertainties, as well as similarity assessments of irradiated pressurized water reactor (PWR) and boiling water reactor (BWR) fuel with suites of critical experiments. The contents of the new covariance library, the testing performed, and the behavior of the new covariance data are described in this paper. The neutron cross-section covariances can be combined with a sensitivity data file generated using the TSUNAMI suite of codes within SCALE to determine the uncertainty in system keff caused by nuclear data uncertainties. The Verified, Archived Library of Inputs and Data (VALID) maintained at Oak Ridge National Laboratory (ORNL) contains over 400 critical experiment benchmark models, and sensitivity data are generated for each of these models. The nuclear data uncertainty in keff is generated for each experiment, and the resulting uncertainties are tabulated and compared to the differences between measured and calculated results. The magnitude of the uncertainty for categories of nuclides (such as actinides, fission products, and structural materials) is calculated for irradiated PWR and BWR fuel to quantify the effect of covariance library changes between the SCALE 6.1 and 6.2 libraries. One of the primary applications of sensitivity/uncertainty methods within SCALE is the assessment of similarities between benchmark experiments and safety applications, described by a ck value for each experiment with each application. Several studies have analyzed typical ck values for a range of critical experiments compared with hypothetical irradiated fuel applications. The ck value is sensitive to the cross-section covariance data because the contribution of each nuclide is influenced by its uncertainty; large uncertainties indicate more likely bias sources and are thus given more weight. Changes in ck values resulting from different covariance data can be used to examine and assess the underlying data changes. These comparisons are performed for PWR and BWR fuel in storage and transportation systems.
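    Both quantities discussed above, the nuclear-data uncertainty in keff and the ck similarity index, follow from the standard first-order "sandwich rule". A toy NumPy sketch with assumed two-parameter sensitivity vectors and covariance matrix (illustrative numbers only, not SCALE data):

```python
import numpy as np

def keff_variance(S, C):
    """First-order sandwich rule: var(keff) = S^T C S, where S holds the
    relative sensitivities of keff to each nuclide-reaction pair and C is
    the relative cross-section covariance matrix."""
    return S @ C @ S

def c_k(S_app, S_exp, C):
    """Correlation of nuclear-data-induced uncertainty between an
    application and an experiment (the similarity index used by TSUNAMI)."""
    cov = S_app @ C @ S_exp
    return cov / np.sqrt(keff_variance(S_app, C) * keff_variance(S_exp, C))

# Toy two-parameter example.
C = np.array([[4.0e-4, 1.0e-4],
              [1.0e-4, 9.0e-4]])
S_app = np.array([0.30, 0.10])
S_exp = np.array([0.28, 0.12])
print(f"c_k = {c_k(S_app, S_exp, C):.3f}")  # close to 1 => similar systems
```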

  8. Methodology for qualitative uncertainty assessment of climate impact indicators

    NASA Astrophysics Data System (ADS)

    Otto, Juliane; Keup-Thiel, Elke; Rechid, Diana; Hänsler, Andreas; Pfeifer, Susanne; Roth, Ellinor; Jacob, Daniela

    2016-04-01

    The FP7 project "Climate Information Portal for Copernicus" (CLIPC) is developing an integrated platform of climate data services to provide a single point of access for authoritative scientific information on climate change and climate change impacts. Within this project, the Climate Service Center Germany (GERICS) has been in charge of developing a methodology for assessing the uncertainties related to climate impact indicators. Existing climate data portals mainly treat uncertainties in one of two ways: either they provide generic guidance, or they express the quantifiable fraction of the uncertainty with statistical measures. None of them, however, gives users qualitative guidance on how confident they can be in the validity of the displayed data. The need for such guidance was identified in CLIPC user consultations. We therefore aim to provide an uncertainty assessment that gives users indicator-specific guidance on the degree to which they can trust the outcome. We will present an approach that provides information on the importance of the different sources of uncertainty associated with a specific climate impact indicator and on how these sources affect the overall 'degree of confidence' in that indicator. To meet users' requirements for effective communication of uncertainties, user feedback was incorporated throughout the development of the methodology. Assessing and visualising the quantitative component of uncertainty is part of the qualitative guidance. As a visual analysis method, we apply Climate Signal Maps (Pfeifer et al. 2015), which highlight only those areas with robust climate change signals. Here, robustness is defined as a combination of model agreement and the significance of the individual model projections. Reference: Pfeifer, S., Bülow, K., Gobiet, A., Hänsler, A., Mudelsee, M., Otto, J., Rechid, D., Teichmann, C. and Jacob, D.: Robustness of Ensemble Climate Projections Analyzed with Climate Signal Maps: Seasonal and Extreme Precipitation for Germany, Atmosphere (Basel), 6(5), 677-698, doi:10.3390/atmos6050677, 2015.
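    A robustness criterion of this kind (model agreement combined with the significance of individual projections) can be sketched as a simple ensemble mask. The thresholds and synthetic data below are placeholders, not the actual settings of Pfeifer et al. (2015):

```python
import numpy as np

def robust_signal_mask(changes, pvals, agree_frac=0.66, alpha=0.05):
    """Flag grid cells with a robust climate change signal.

    changes: (n_models, ny, nx) projected changes
    pvals:   (n_models, ny, nx) significance of each model's change
    Robust where most models agree on the sign AND most individual
    signals are statistically significant; thresholds are placeholders.
    """
    # mean of signs equals 2f - 1 when a fraction f of models is positive
    sign_agree = np.abs(np.mean(np.sign(changes), axis=0)) >= (2 * agree_frac - 1)
    significant = np.mean(pvals < alpha, axis=0) >= agree_frac
    return sign_agree & significant

rng = np.random.default_rng(0)
changes = rng.normal(0.5, 1.0, size=(10, 4, 4))   # synthetic 10-member ensemble
pvals = rng.uniform(0.0, 0.1, size=(10, 4, 4))
print(robust_signal_mask(changes, pvals).sum(), "of 16 cells robust")
```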

  9. Impact of the dynamical core on the direct simulation of tropical cyclones in a high-resolution global model

    DOE PAGES

    Reed, K. A.; Bacmeister, J. T.; Rosenbloom, N. A.; ...

    2015-05-13

    Our paper examines the impact of the dynamical core on the simulation of tropical cyclone (TC) frequency, distribution, and intensity. The dynamical core, the central fluid flow component of any general circulation model (GCM), is often overlooked in analyses of a model's ability to simulate TCs compared with more commonly documented components (e.g., physical parameterizations). The Community Atmosphere Model version 5 is configured with multiple dynamics packages. This analysis demonstrates that the dynamical core has a significant impact on storm intensity and frequency, even in the presence of similar large-scale environments. In particular, the spectral element core produces stronger TCs and more hurricanes than the finite-volume core using very similar parameterization packages, despite the latter having a slightly more favorable TC environment. Furthermore, these results suggest that more detailed investigations into the impact of the GCM dynamical core on TC climatology are needed to fully understand these uncertainties. Key points: (1) the impact of the GCM dynamical core is often overlooked in TC assessments; (2) the CAM5 dynamical core has a significant impact on TC frequency and intensity; (3) a larger effort is needed to better understand this uncertainty.

  10. The past as prelude to the future for understanding 21st-century climate effects on Rocky Mountain Trout

    USGS Publications Warehouse

    Isaak, Daniel J.; Muhlfeld, Clint C.; Todd, Andrew S.; Al-chokhachy, Robert; Roberts, James; Kershner, Jeffrey L.; Fausch, Kurt D.; Hostetler, Steven W.

    2012-01-01

    Bioclimatic models predict large reductions in native trout across the Rocky Mountains in the 21st century but lack details about how changes will occur. Through five case histories across the region, we explore how a changing climate has been affecting streams and the potential consequences for trout. Monitoring records show trends in temperature and hydrographs consistent with a warming climate in recent decades. Biological implications include upstream shifts in thermal habitats, risk of egg scour, increased wildfire disturbances, and declining summer habitat volumes. The importance of these factors depends on the context, but temperature increases are most relevant where population boundaries are mediated by thermal constraints. Summer flow declines and wildfires will be important where trout populations are fragmented and constrained to small refugia. A critical information gap is evidence documenting how populations are adjusting to long-term habitat trends, so biological monitoring is a priority. Biological, temperature, and discharge data from monitoring networks could be used to develop accurate vulnerability assessments that provide information regarding where conservation actions would best improve population resilience. Even with better information, future uncertainties will remain large due to unknowns regarding Earth's ultimate warming trajectory and how effects translate across scales. Maintaining or increasing the size of habitats could provide a buffer against these uncertainties.

  11. Position reconstruction in LUX

    NASA Astrophysics Data System (ADS)

    Akerib, D. S.; Alsum, S.; Araújo, H. M.; Bai, X.; Bailey, A. J.; Balajthy, J.; Beltrame, P.; Bernard, E. P.; Bernstein, A.; Biesiadzinski, T. P.; Boulton, E. M.; Brás, P.; Byram, D.; Cahn, S. B.; Carmona-Benitez, M. C.; Chan, C.; Currie, A.; Cutter, J. E.; Davison, T. J. R.; Dobi, A.; Druszkiewicz, E.; Edwards, B. N.; Fallon, S. R.; Fan, A.; Fiorucci, S.; Gaitskell, R. J.; Genovesi, J.; Ghag, C.; Gilchriese, M. G. D.; Hall, C. R.; Hanhardt, M.; Haselschwardt, S. J.; Hertel, S. A.; Hogan, D. P.; Horn, M.; Huang, D. Q.; Ignarra, C. M.; Jacobsen, R. G.; Ji, W.; Kamdin, K.; Kazkaz, K.; Khaitan, D.; Knoche, R.; Larsen, N. A.; Lenardo, B. G.; Lesko, K. T.; Lindote, A.; Lopes, M. I.; Manalaysay, A.; Mannino, R. L.; Marzioni, M. F.; McKinsey, D. N.; Mei, D.-M.; Mock, J.; Moongweluwan, M.; Morad, J. A.; Murphy, A. St. J.; Nehrkorn, C.; Nelson, H. N.; Neves, F.; O'Sullivan, K.; Oliver-Mallory, K. C.; Palladino, K. J.; Pease, E. K.; Rhyne, C.; Shaw, S.; Shutt, T. A.; Silva, C.; Solmaz, M.; Solovov, V. N.; Sorensen, P.; Sumner, T. J.; Szydagis, M.; Taylor, D. J.; Taylor, W. C.; Tennyson, B. P.; Terman, P. A.; Tiedt, D. R.; To, W. H.; Tripathi, M.; Tvrznikova, L.; Uvarov, S.; Velan, V.; Verbus, J. R.; Webb, R. C.; White, J. T.; Whitis, T. J.; Witherell, M. S.; Wolfs, F. L. H.; Xu, J.; Yazdani, K.; Young, S. K.; Zhang, C.

    2018-02-01

    The (x, y) position reconstruction method used in the analysis of the complete exposure of the Large Underground Xenon (LUX) experiment is presented. The algorithm is based on a statistical test that uses an iterative method to recover the photomultiplier tube (PMT) light response directly from the calibration data. The light response functions use a two-dimensional functional form to account for photons reflected off the inner walls of the detector. To increase the resolution for small pulses, a photon counting technique was employed to describe the response of the PMTs. The reconstruction was assessed with calibration data including 83mKr (releasing a total energy of 41.5 keV) and 3H (β- decay with Q = 18.6 keV), and a deuterium-deuterium (D-D) neutron beam (2.45 MeV). Within the detector's fiducial volume, the reconstruction achieved (x, y) position uncertainties of σ = 0.82 cm and σ = 0.17 cm for events with only 200 and with 4,000 detected electroluminescence photons, respectively. Such signals are associated with electron recoils of energies ~0.25 keV and ~10 keV, respectively. The reconstructed position of the smallest events, with a single electron emitted from the liquid surface (22 detected photons), has a horizontal (x, y) uncertainty of 2.13 cm.

  12. Parameter and model uncertainty in a life-table model for fine particles (PM2.5): a statistical modeling study

    PubMed Central

    Tainio, Marko; Tuomisto, Jouni T; Hänninen, Otto; Ruuskanen, Juhani; Jantunen, Matti J; Pekkanen, Juha

    2007-01-01

    Background The estimation of health impacts often involves uncertain input variables and assumptions that have to be incorporated into the model structure. These uncertainties may have significant effects on the results obtained with the model and, thus, on decision making. Fine particles (PM2.5) are believed to cause major health impacts, and, consequently, uncertainties in their health impact assessment have clear relevance to policy-making. We studied the effects of various uncertain input variables by building a life-table model for fine particles. Methods Life expectancy of the Helsinki metropolitan area population and the change in life expectancy due to fine particle exposure were predicted using a life-table model. A number of parameter and model uncertainties were estimated. Sensitivity analysis for the input variables was performed by calculating rank-order correlations between input and output variables. The studied model uncertainties were (i) plausibility of mortality outcomes and (ii) lag; the parameter uncertainties were (iii) exposure-response coefficients for different mortality outcomes and (iv) exposure estimates for different age groups. The monetary value of the years of life lost was also predicted, in order to compare the relative importance of the uncertainties related to monetary valuation with that of the health-effect uncertainties. Results The magnitude of the health effect costs depended mostly on the discount rate, the exposure-response coefficient, and the plausibility of cardiopulmonary mortality. Other mortality outcomes (lung cancer, other non-accidental, and infant mortality) and lag had only a minor impact on the output. The results highlight the importance of the uncertainties associated with cardiopulmonary mortality in fine particle impact assessment when compared with other uncertainties. Conclusion When estimating life expectancy, the estimates used for the cardiopulmonary exposure-response coefficient, the discount rate, and plausibility require careful assessment, while complicated lag estimates can be omitted without any major effect on the results. PMID:17714598
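    The sensitivity measure used here, rank-order (Spearman) correlation between sampled inputs and the model output, is straightforward to reproduce. The sketch below uses stand-in distributions and a placeholder output formula in place of the actual life-table model:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 5_000

# Illustrative stand-ins for uncertain inputs of a life-table model:
# exposure-response coefficient, discount rate, and exposure level.
inputs = {
    "exposure_response": rng.lognormal(np.log(0.01), 0.3, n),
    "discount_rate":     rng.uniform(0.0, 0.05, n),
    "exposure_ugm3":     rng.normal(9.0, 1.5, n),
}
# Placeholder output: change in life expectancy (days); a real life-table
# model would replace this formula.
output = (inputs["exposure_response"] * inputs["exposure_ugm3"]
          * np.exp(-20 * inputs["discount_rate"]) * 365)

for name, x in inputs.items():
    rho, _ = spearmanr(x, output)
    print(f"{name:>18}: Spearman rho = {rho:+.2f}")
```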

  13. Parameter and model uncertainty in a life-table model for fine particles (PM2.5): a statistical modeling study.

    PubMed

    Tainio, Marko; Tuomisto, Jouni T; Hänninen, Otto; Ruuskanen, Juhani; Jantunen, Matti J; Pekkanen, Juha

    2007-08-23

    The estimation of health impacts often involves uncertain input variables and assumptions that have to be incorporated into the model structure. These uncertainties may have significant effects on the results obtained with the model and, thus, on decision making. Fine particles (PM2.5) are believed to cause major health impacts, and, consequently, uncertainties in their health impact assessment have clear relevance to policy-making. We studied the effects of various uncertain input variables by building a life-table model for fine particles. Life expectancy of the Helsinki metropolitan area population and the change in life expectancy due to fine particle exposure were predicted using a life-table model. A number of parameter and model uncertainties were estimated. Sensitivity analysis for the input variables was performed by calculating rank-order correlations between input and output variables. The studied model uncertainties were (i) plausibility of mortality outcomes and (ii) lag; the parameter uncertainties were (iii) exposure-response coefficients for different mortality outcomes and (iv) exposure estimates for different age groups. The monetary value of the years of life lost was also predicted, in order to compare the relative importance of the uncertainties related to monetary valuation with that of the health-effect uncertainties. The magnitude of the health effect costs depended mostly on the discount rate, the exposure-response coefficient, and the plausibility of cardiopulmonary mortality. Other mortality outcomes (lung cancer, other non-accidental, and infant mortality) and lag had only a minor impact on the output. The results highlight the importance of the uncertainties associated with cardiopulmonary mortality in fine particle impact assessment when compared with other uncertainties. When estimating life expectancy, the estimates used for the cardiopulmonary exposure-response coefficient, the discount rate, and plausibility require careful assessment, while complicated lag estimates can be omitted without any major effect on the results.

  14. Addressing uncertainty in modelling cumulative impacts within maritime spatial planning in the Adriatic and Ionian region.

    PubMed

    Gissi, Elena; Menegon, Stefano; Sarretta, Alessandro; Appiotti, Federica; Maragno, Denis; Vianello, Andrea; Depellegrin, Daniel; Venier, Chiara; Barbanti, Andrea

    2017-01-01

    Maritime spatial planning (MSP) is envisaged as a tool to apply an ecosystem-based approach to the marine and coastal realms, aiming to ensure that the collective pressure of human activities is kept within acceptable limits. Cumulative impact (CI) assessment can support science-based MSP by clarifying the existing and potential impacts of human uses on the marine environment. A CI assessment includes several sources of uncertainty that can hinder the correct interpretation of its results if they are not explicitly incorporated into the decision-making process. This study proposes a three-level methodology for a general uncertainty analysis integrated with CI assessment for MSP, applied to the Adriatic and Ionian Region (AIR). We describe the nature and level of uncertainty with the help of expert judgement and elicitation, covering the possible sources of uncertainty in the CI model together with the assumptions and gaps of the case-based MSP process in the AIR. We then use these results to tailor the global uncertainty analysis so as to describe spatially the distribution of uncertainty and the variation of the CI scores as a function of the CI model factors. The results show the variability of the uncertainty across the AIR, with only limited portions robustly identified as the most or least impacted areas under multiple hypotheses about the model factors. We discuss the level and type of reliable information the results provide and the insights they offer to decision-making. The most significant uncertainty factors are identified in order to facilitate the adaptive MSP process and to establish research priorities for filling knowledge gaps before subsequent planning cycles. The method aims to depict the potential CI effects as well as the extent and spatial variation of the data and scientific uncertainty; it therefore constitutes a suitable tool to inform the potential application of the precautionary principle in MSP.
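    The global uncertainty analysis over CI model factors can be illustrated with a generic Halpern-type cumulative impact score whose expert-elicited sensitivity weights are perturbed by Monte Carlo sampling. The score, weights, and perturbation below are illustrative assumptions, not the CI model used in the study:

```python
import numpy as np

rng = np.random.default_rng(7)

# Halpern-type cumulative impact score for one grid cell:
# CI = sum_i D_i * s_i, with use intensities D and sensitivity weights s.
D = np.array([0.8, 0.3, 0.5])        # intensities of three human uses
s_mean = np.array([2.0, 1.0, 1.5])   # expert-elicited sensitivity weights

# Global uncertainty analysis: perturb the weights (a key uncertain model
# factor identified by expert judgement) and look at the spread of CI.
n = 10_000
s_samples = rng.normal(s_mean, 0.3 * s_mean, size=(n, 3)).clip(min=0)
CI = s_samples @ D
print(f"CI = {CI.mean():.2f} +/- {CI.std():.2f} "
      f"(90% interval {np.percentile(CI, 5):.2f}-{np.percentile(CI, 95):.2f})")
```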

  15. Evidence Theory Based Uncertainty Quantification in Radiological Risk due to Accidental Release of Radioactivity from a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Ingale, S. V.; Datta, D.

    2010-10-01

    The consequence of an accidental release of radioactivity from a nuclear power plant is assessed in terms of the exposure or dose to members of the public, and the assessment of risk is routed through this dose computation. Dose computation depends on the underlying dose assessment model and the exposure pathways, one of which is the ingestion of contaminated food. The aim of the present paper is to compute the uncertainty associated with the risk to members of the public due to the ingestion of contaminated food. Because the governing parameters of the ingestion dose assessment model are imprecise, we apply evidence theory to compute bounds on the risk. The uncertainty is expressed through the belief and plausibility measures of evidence theory.
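    The bounding step can be sketched with the basic constructs of evidence (Dempster-Shafer) theory: interval-valued focal elements with basic probability masses, from which belief (necessary support) and plausibility (consistency) are computed. The intervals and masses below are invented for illustration:

```python
# Each focal element is an interval for a dose-governing quantity,
# with a basic probability mass (all numbers illustrative).
focal = [((0.0, 2.0), 0.5),   # source of evidence 1: dose in 0-2 mSv
         ((1.0, 3.0), 0.3),   # source of evidence 2
         ((0.5, 4.0), 0.2)]   # source of evidence 3

def belief(lo, hi):
    """Mass of evidence that *necessarily* supports dose in [lo, hi]:
    sum over focal elements entirely contained in the interval."""
    return sum(m for (a, b), m in focal if lo <= a and b <= hi)

def plausibility(lo, hi):
    """Mass of evidence *consistent* with dose in [lo, hi]:
    sum over focal elements that intersect the interval."""
    return sum(m for (a, b), m in focal if b >= lo and a <= hi)

# Lower and upper bounds on 'dose (and hence risk) below 2 mSv':
print(belief(0.0, 2.0), plausibility(0.0, 2.0))  # 0.5 and 1.0
```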

  16. Assessing uncertain human exposure to ambient air pollution using environmental models in the Web

    NASA Astrophysics Data System (ADS)

    Gerharz, L. E.; Pebesma, E.; Denby, B.

    2012-04-01

    Ambient air quality can have a significant impact on human health by causing respiratory and cardiovascular diseases. The pollutant concentration a person is exposed to can differ considerably between individuals, depending on their daily routines and movement patterns. In a straightforward approach, this exposure can be estimated by integrating individual space-time paths with spatio-temporally resolved ambient air quality data. To allow a realistic exposure assessment, it is furthermore important to consider uncertainties due to input and model errors. In this work, we present a generic, web-based approach for estimating individual exposure by integrating uncertain position and air quality information, implemented as a web service. Following the Model Web initiative, which envisions an infrastructure for deploying, executing and chaining environmental models as services, existing models and data sources, e.g. for air quality, can be used to assess exposure. The service therefore needs to deal with the different formats, resolutions and uncertainty representations provided by model or data services. Potential mismatches can be accounted for by transformation of uncertainties and (dis-)aggregation of data, with changes in the uncertainties handled by components developed in the UncertWeb project. In UncertWeb, the Model Web vision is extended to an uncertainty-enabled Model Web, where services can process and communicate uncertainties in the data and models. The propagation of uncertainty to the exposure results is quantified using Monte Carlo simulation, by combining different realisations of positions and ambient concentrations. Two case studies were used to evaluate the developed exposure assessment service. In the first, GPS tracks with a positional uncertainty of a few metres, collected in the urban area of Münster, Germany, were used to assess exposure to PM10 (particulate matter smaller than 10 µm). Air quality data were provided by an uncertainty-enabled air quality model system that delivered hourly realisations of concentrations on a 250 m x 250 m grid over Münster. The second case study uses modelled human trajectories in Rotterdam, The Netherlands. The trajectories were provided by an activity model as realisations at 15-min resolution per four-digit postal code. Air quality estimates for different pollutants were provided as ensembles by a coupled meteorology and air quality model system on a 1 km x 1 km grid with hourly resolution. Both case studies show the successful application of the service to different resolutions and uncertainty representations.
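    The core Monte Carlo step, sampling a concentration field along each realisation of a space-time path and integrating over time, can be sketched as follows. The grid size, noise models, and concentration values are placeholder assumptions, not the Münster or Rotterdam data:

```python
import numpy as np

rng = np.random.default_rng(3)
n_real, n_steps = 500, 24          # realisations, hourly time steps

# Realisations of a person's hourly positions (grid cell indices, with
# positional uncertainty collapsed to +/- one cell) and matching
# realisations of the hourly PM10 field on a small grid (all placeholders).
true_path = rng.integers(0, 10, size=(n_steps, 2))
paths = (true_path[None] + rng.integers(-1, 2, size=(n_real, n_steps, 2))) % 10
conc = rng.lognormal(np.log(25.0), 0.4, size=(n_real, n_steps, 10, 10))

# Time-integrated exposure per realisation: sample the concentration field
# along each space-time path and average over the day.
hours = np.arange(n_steps)
exposure = conc[np.arange(n_real)[:, None], hours,
                paths[..., 0], paths[..., 1]].mean(axis=1)
print(f"daily mean PM10 exposure: {exposure.mean():.1f} "
      f"+/- {exposure.std():.1f} ug/m3")
```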

  17. Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Britton, Paul; Al Hassan, Mohammad; Ring, Robert

    2017-01-01

    Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables, as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper explains common measures of centrality and dispersion and, with examples, provides guidelines for how they may be estimated to ensure effective technical contributions to decision-making.
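    A minimal illustration of such measures, computed from Monte Carlo samples of a notional PRA output; the lognormal distribution and all numbers are our own assumptions, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(11)
# Monte Carlo samples of a notional PRA output, e.g. a failure probability
# propagated through a fault tree (lognormal is a common assumption).
samples = rng.lognormal(mean=np.log(1e-4), sigma=0.8, size=100_000)

print(f"mean           {samples.mean():.2e}")   # pulled up by the tail
print(f"median         {np.median(samples):.2e}")
print(f"std deviation  {samples.std():.2e}")
print(f"5th-95th pct   {np.percentile(samples, 5):.2e} "
      f"- {np.percentile(samples, 95):.2e}")
# For a lognormal, the 'error factor' EF = 95th/50th percentile is a
# traditional PRA dispersion measure.
print(f"error factor   {np.percentile(samples, 95) / np.median(samples):.2f}")
```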

  18. Dosimetric effects of energy spectrum uncertainties in radiation therapy with laser-driven particle beams.

    PubMed

    Schell, S; Wilkens, J J

    2012-03-07

    Laser-driven particle acceleration is a potentially cost-efficient and compact new technology that might replace synchrotrons or cyclotrons for future proton or heavy-ion radiation therapy. Since the energy spectrum of laser-accelerated particles is rather wide compared to the monoenergetic beams of conventional machines, studies have proposed using broader spectra for the treatment of at least certain parts of the target volume to make the process more efficient. The additional uncertainty in the applied energy spectrum thereby introduced is analysed in this note. It is shown that the uncertainty can be separated into a change in the total number of particles and a change in the energy distribution of the particles. The former can be monitored by a simple fluence detector and cancels out over a high number of statistically fluctuating shots. The latter, the redistribution of a fixed number of particles among the energy bins within the window of transmitted energies of the energy selection system, introduces only minor changes to the resulting depth-dose curve. It might therefore not be necessary to monitor this uncertainty for every applied shot. These findings might enable easier uncertainty management for particle therapy with broad energy spectra.

  19. Sensitivity and uncertainty analysis for Abreu & Johnson numerical vapor intrusion model.

    PubMed

    Ma, Jie; Yan, Guangxu; Li, Haiyan; Guo, Shaohui

    2016-03-05

    This study conducted a one-at-a-time (OAT) sensitivity and uncertainty analysis of the Abreu & Johnson numerical vapor intrusion model for nine input parameters: soil porosity, soil moisture, soil air permeability, aerobic biodegradation rate, building depressurization, crack width, floor thickness, building volume, and indoor air exchange rate. Simulations were performed for three soil types (clay, silt, and sand), two source depths (3 and 8 m), and two source concentrations (1 and 400 g/m³). Model sensitivity and uncertainty for shallow, high-concentration vapor sources (3 m and 400 g/m³) are much smaller than for deep, low-concentration sources (8 m and 1 g/m³). For high-concentration sources, soil air permeability, indoor air exchange rate, and building depressurization (for highly permeable soil like sand) are the key contributors to model output uncertainty. For low-concentration sources, soil porosity, soil moisture, aerobic biodegradation rate, and soil gas permeability are the key contributors. Another important finding is that the impact of aerobic biodegradation on the vapor intrusion potential of petroleum hydrocarbons is negligible when the vapor source concentration is high, because insufficient oxygen supply limits aerobic biodegradation activity. Copyright © 2015 Elsevier B.V. All rights reserved.
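    The OAT procedure itself is simple: vary one parameter across its range while holding the others at baseline, and record the swing in the output. The sketch below uses a made-up closed-form attenuation function as a stand-in for the numerical Abreu & Johnson model; parameter ranges are illustrative:

```python
# One-at-a-time (OAT) sensitivity: vary each input over its range while
# holding the others at baseline, and record the swing in the output.
# 'attenuation' is a placeholder returning an indoor/source concentration
# ratio; the real model is a numerical simulator.
def attenuation(porosity, moisture, permeability, air_exchange):
    return 1e-4 * permeability**0.5 * (porosity - moisture) / air_exchange

baseline = dict(porosity=0.40, moisture=0.20, permeability=1.0, air_exchange=0.5)
ranges = dict(porosity=(0.30, 0.50), moisture=(0.05, 0.35),
              permeability=(0.1, 10.0), air_exchange=(0.2, 1.5))

for name, (lo, hi) in ranges.items():
    out = [attenuation(**{**baseline, name: v}) for v in (lo, hi)]
    print(f"{name:>12}: swing = {max(out) / min(out):.1f}x")
```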

  20. Can Atmospheric Reanalysis Data Sets Be Used to Reproduce Flooding Over Large Scales?

    NASA Astrophysics Data System (ADS)

    Andreadis, Konstantinos M.; Schumann, Guy J.-P.; Stampoulis, Dimitrios; Bates, Paul D.; Brakenridge, G. Robert; Kettner, Albert J.

    2017-10-01

    Floods are costly to global economies and can be exceptionally lethal. The ability to produce consistent flood hazard maps over large areas could contribute significantly to reducing such losses, as the lack of knowledge concerning flood risk is a major factor in the transformation of river floods into flood disasters. To accurately reproduce flooding in river channels and floodplains, high-spatial-resolution hydrodynamic models are needed. Despite being computationally expensive, recent advances have made their continental- to global-scale implementation feasible, although inputs for long-term simulations may require the use of reanalysis meteorological products, especially in data-poor regions. We employ a coupled hydrologic/hydrodynamic model cascade forced by the 20CRv2 reanalysis data set and evaluate its ability to reproduce flood inundation area and volume for Australia during the 1973-2012 period. Ensemble simulations using the reanalysis data were performed to account for uncertainty in the meteorology and compared with a validated benchmark simulation. Results show that the reanalysis ensemble captures the inundated areas and volumes relatively well, with correlations for the ensemble mean of 0.82 and 0.85 for area and volume, respectively, although the meteorological ensemble spread propagates into large uncertainty in the simulated flood characteristics.
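    The evaluation statistic, correlation of the ensemble mean against a validated benchmark together with the ensemble spread, can be sketched on synthetic series; the data below merely stand in for the simulated flooded-volume time series:

```python
import numpy as np

rng = np.random.default_rng(5)
t = 480                                   # monthly series, 1973-2012

# Benchmark (validated) flooded-volume series and a 5-member reanalysis
# ensemble scattered around it (synthetic stand-ins for model output).
benchmark = 10 + 3 * np.sin(np.linspace(0, 40 * np.pi, t)) + rng.normal(0, 1, t)
ensemble = benchmark[None] + rng.normal(0, 2.5, size=(5, t))

ens_mean = ensemble.mean(axis=0)
r = np.corrcoef(ens_mean, benchmark)[0, 1]
spread = ensemble.std(axis=0).mean()      # mean ensemble spread
print(f"ensemble-mean correlation r = {r:.2f}, mean spread = {spread:.2f}")
```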
