Sample records for spatial uncertainty analysis

  1. 'spup' - An R Package for Analysis of Spatial Uncertainty Propagation and Application to Trace Gas Emission Simulations

    NASA Astrophysics Data System (ADS)

    Sawicka, K.; Breuer, L.; Houska, T.; Santabarbara Ruiz, I.; Heuvelink, G. B. M.

    2016-12-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation analysis and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable, including for case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation from input data and model parameters, through the environmental model, to model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo techniques, as well as several uncertainty visualization functions. Here we demonstrate that the 'spup' package is an effective and easy-to-use tool that can be applied even in a very complex case study, and that it can be used in multi-disciplinary research and model-based decision support. As an example, we use the ecological LandscapeDNDC model to analyse the propagation of uncertainties associated with spatial variability of the model driving forces such as rainfall, nitrogen deposition and fertilizer inputs. The uncertainty propagation is analysed for the prediction of emissions of N2O and CO2 for a German low-mountainous, agriculturally developed catchment. The study tests the effect of spatial correlations on spatially aggregated model outputs, and could serve as guidance for developing best management practices and model improvement strategies.
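
    The core idea in this record, Monte Carlo propagation of a spatially correlated input through a model, can be sketched in a few lines of R. The snippet below is illustrative only: it uses gstat's unconditional Gaussian simulation in place of the paper's calibrated rainfall uncertainty model, and a made-up emission function in place of LandscapeDNDC.

    ```r
    # Illustrative sketch: propagate a spatially autocorrelated rainfall
    # input through a toy emission model and aggregate over the area.
    # The variogram and the emission function are hypothetical stand-ins.
    library(gstat)

    xy <- expand.grid(x = 1:20, y = 1:20)   # toy catchment grid

    # Unconditional Gaussian simulation of 100 correlated rainfall fields
    g <- gstat(formula = z ~ 1, locations = ~x + y, dummy = TRUE,
               beta = 800,                               # mean rainfall, mm
               model = vgm(psill = 100^2, model = "Exp", range = 5),
               nmax = 20)
    sims <- predict(g, newdata = xy, nsim = 100)         # columns sim1..sim100

    emission <- function(rain) 0.002 * pmax(rain, 0)^1.3 # toy nonlinear model

    # Spatially aggregated output per realization, and its uncertainty
    totals <- sapply(1:100, function(i) sum(emission(sims[[paste0("sim", i)]])))
    quantile(totals, c(0.05, 0.5, 0.95))
    ```

    Because the fields are simulated jointly in space, the spread of `totals` reflects the effect of spatial correlation that the study highlights; replacing the simulation with independent cell-wise draws would shrink it.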

  2. The Uncertainties on the GIS Based Land Suitability Assessment for Urban and Rural Planning

    NASA Astrophysics Data System (ADS)

    Liu, H.; Zhan, Q.; Zhan, M.

    2017-09-01

    The majority of the research on the uncertainties of spatial data and spatial analysis focuses on some specific data feature or analysis tool. Few have addressed the uncertainties of the whole process of an application such as planning, leaving the research on uncertainties detached from practical applications. The paper discusses the uncertainties of geographical information systems (GIS) based land suitability assessment in planning on the basis of a literature review. The uncertainties considered range from the establishment of the index system to the classification of the final result. Methods to reduce the uncertainties arising from the discretization of continuous raster data and from index weight determination are summarized. The paper analyzes the merits and demerits of the "Natural Breaks" method, which is broadly used by planners. It also explores other factors that impact the accuracy of the final classification, such as the selection of the number of classes, the class intervals and the autocorrelation of the spatial data. In conclusion, the paper indicates that machine learning methods should be adapted to accommodate the complexity of land suitability assessment. The work contributes to the application of spatial data and spatial analysis uncertainty research to land suitability assessment, and raises the scientific rigor of subsequent planning and decision-making.
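
    A quick way to see the classification uncertainty discussed above is to vary the class number in a Jenks "Natural Breaks" classification and watch the break points move. The sketch below uses the classInt package on simulated suitability scores; the data are hypothetical.

    ```r
    # Sketch: sensitivity of a "Natural Breaks" (Jenks) classification to
    # the chosen number of classes; scores are simulated stand-ins.
    library(classInt)

    set.seed(1)
    suitability <- runif(500)   # hypothetical composite suitability scores

    for (k in 4:6) {
      br <- classIntervals(suitability, n = k, style = "jenks")
      cat(k, "classes, breaks:", round(br$brks, 2), "\n")
    }
    # Cells whose class membership changes with k are exactly where the
    # final suitability map is least certain.
    ```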

  3. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Chen, Xingyuan; Ye, Ming

    Sensitivity analysis is an important tool for quantifying uncertainty in the outputs of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a hierarchical sensitivity analysis method that (1) constructs an uncertainty hierarchy by analyzing the input uncertainty sources, and (2) accounts for the spatial correlation among parameters at each level of the hierarchy using geostatistical tools. The contribution of the uncertainty source at each hierarchy level is measured by sensitivity indices calculated using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally as driven by the dynamic interaction between groundwater and river water at the site. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed parameters.
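
    The variance-decomposition indices at the heart of this record can be illustrated with a minimal first-order estimator. The two-input toy model below is purely illustrative and stands in for the Hanford flow-and-transport model; the binning estimator approximates Var(E[Y|X_i]) / Var(Y).

    ```r
    # Sketch: variance-based first-order sensitivity indices estimated by
    # conditional-expectation binning on a toy two-input model.
    set.seed(42)
    n  <- 1e5
    x1 <- rnorm(n)                      # e.g. boundary-condition uncertainty
    x2 <- rnorm(n)                      # e.g. permeability-field uncertainty
    y  <- x1 + 2 * x2 + 0.5 * x1 * x2   # toy model output

    first_order <- function(x, y, bins = 50) {
      grp <- cut(x, breaks = quantile(x, seq(0, 1, length.out = bins + 1)),
                 include.lowest = TRUE)
      var(tapply(y, grp, mean)) / var(y)  # Var(E[Y|X]) / Var(Y)
    }
    c(S1 = first_order(x1, y), S2 = first_order(x2, y))
    ```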

  4. 'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling

    NASA Astrophysics Data System (ADS)

    Sawicka, Kasia; Heuvelink, Gerard

    2017-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable and able to deal with case studies involving spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation from input data and model parameters, through the environmental model, to model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as input to environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy-to-use tool that can be used in multi-disciplinary research and model-based decision support.
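
    For readers who want to try the workflow, the fragment below follows the function names used in the spup vignettes (makeCRM, defineUM, genSample, propagate); treat it as a sketch rather than a verified recipe, since argument details may differ between package versions. The rasters, correlogram parameters and model are placeholders.

    ```r
    # Sketch of the 'spup' Monte Carlo workflow; all inputs are placeholders.
    library(spup)
    library(raster)

    set.seed(1)
    r_mean <- raster(matrix(100 + rnorm(400), 20, 20))  # mean map of the input
    r_sd   <- raster(matrix(5, 20, 20))                 # sd map of the input

    # Spatial correlation of the input error (exponential correlogram;
    # range is in map units of the toy raster)
    crm <- makeCRM(acf0 = 0.8, range = 0.3, model = "Exp")

    # Uncertainty model: normally distributed, spatially correlated input
    input_UM <- defineUM(uncertain = TRUE, distribution = "norm",
                         distr_param = c(r_mean, r_sd), crm = crm)

    # Unconditional Gaussian simulation of input realizations
    realizations <- genSample(UMobject = input_UM, n = 50,
                              samplemethod = "ugs", nmax = 20, asList = TRUE)

    toy_model <- function(x) calc(x, function(v) 0.1 * v^2)  # stand-in model

    out <- propagate(realizations, model = toy_model, n = 50)
    ```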

  5. A geostatistical approach to the change-of-support problem and variable-support data fusion in spatial analysis

    NASA Astrophysics Data System (ADS)

    Wang, Jun; Wang, Yang; Zeng, Hui

    2016-01-01

    A key issue to address in synthesizing spatial data with variable support in spatial analysis and modeling is the change-of-support problem. We present an approach for solving the change-of-support and variable-support data fusion problems. This approach is based on geostatistical inverse modeling that explicitly accounts for differences in spatial support. The inverse model is applied here to produce both best predictions at a target support and their prediction uncertainties, based on one or more sets of measurements, while honoring those measurements. Spatial data covering large geographic areas often exhibit spatial nonstationarity and can lead to computational challenges due to large data sizes. We developed a local-window geostatistical inverse modeling approach to accommodate spatial nonstationarity and alleviate the computational burden. We conducted experiments using synthetic and real-world raster data. Synthetic data were generated, aggregated to multiple supports and downscaled back to the original support to analyze the accuracy of spatial predictions and the correctness of prediction uncertainties. Similar experiments were conducted for real-world raster data. Real-world data with variable support were statistically fused to produce single-support predictions and associated uncertainties. The modeling results demonstrate that geostatistical inverse modeling can produce accurate predictions and associated prediction uncertainties. It is shown that the suggested local-window geostatistical inverse modeling approach offers a practical way to solve the well-known change-of-support problem and the variable-support data fusion problem in spatial analysis and modeling.
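
    The change-of-support effect itself is easy to demonstrate with ordinary geostatistical tools. The sketch below contrasts point-support and block-support kriging with gstat on synthetic data; it is not the paper's inverse model, only an illustration of why support matters.

    ```r
    # Sketch: point-support vs block-support (10 x 10) kriging predictions;
    # synthetic data illustrate the change-of-support effect only.
    library(gstat)
    library(sp)

    set.seed(7)
    pts <- data.frame(x = runif(200, 0, 100), y = runif(200, 0, 100))
    pts$z <- sin(pts$x / 10) + rnorm(200, sd = 0.2)
    coordinates(pts) <- ~x + y

    target <- expand.grid(x = seq(5, 95, 10), y = seq(5, 95, 10))
    coordinates(target) <- ~x + y

    vg <- fit.variogram(variogram(z ~ 1, pts),
                        vgm(psill = 0.5, model = "Exp", range = 20,
                            nugget = 0.05))

    pred_point <- krige(z ~ 1, pts, target, model = vg)
    pred_block <- krige(z ~ 1, pts, target, model = vg, block = c(10, 10))

    # Block-support kriging variances are smaller because aggregation
    # filters within-block variability: the change-of-support effect.
    c(point = mean(pred_point$var1.var), block = mean(pred_block$var1.var))
    ```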

  6. Assessment of flood susceptible areas using spatially explicit, probabilistic multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan

    2018-03-01

    GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied by considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty related to criteria weights, the spatial heterogeneity of preferences and the risk attitude of the analyst. The approach is applied in a pilot study for Gucheng County, central China, which was heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. Therefore, the proposed approach is an improvement over the conventional deterministic method and provides a more rational, objective and unbiased tool for flood susceptibility evaluation.
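
    The probabilistic OWA step described above can be prototyped compactly: draw criteria weights from a distribution, apply order weights that encode the analyst's risk attitude, and summarize the cell-wise spread. Everything below (criteria, weight distribution, order weights) is a hypothetical stand-in.

    ```r
    # Sketch: probabilistic OWA over raster cells via Monte Carlo weight
    # sampling; Dirichlet-like weights via normalized gamma variates.
    set.seed(3)
    ncell <- 1000
    crit  <- matrix(runif(ncell * 6), ncell, 6)   # standardized criteria

    owa <- function(vals, order_w) sum(sort(vals, decreasing = TRUE) * order_w)
    order_w <- c(0.3, 0.25, 0.2, 0.1, 0.1, 0.05)  # analyst's risk attitude

    nsim   <- 500
    scores <- replicate(nsim, {
      w <- rgamma(6, shape = 2); w <- w / sum(w)  # uncertain criteria weights
      apply(crit * rep(w, each = ncell), 1, owa, order_w = order_w)
    })

    # Cell-wise mean susceptibility and its uncertainty across weight draws
    susceptibility <- rowMeans(scores)
    uncertainty    <- apply(scores, 1, sd)
    ```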

  7. Gridded uncertainty in fossil fuel carbon dioxide emission maps, a CDIAC example

    DOE PAGES

    Andres, Robert J.; Boden, Thomas A.; Higdon, David M.

    2016-12-05

    Due to a current lack of physical measurements at appropriate spatial and temporal scales, all current global maps and distributions of fossil fuel carbon dioxide (FFCO2) emissions use one or more proxies to distribute those emissions. These proxies and distribution schemes introduce additional uncertainty into these maps. This paper examines the uncertainty associated with the magnitude of gridded FFCO2 emissions. This uncertainty is gridded at the same spatial and temporal scales as the mass magnitude maps. This gridded uncertainty includes uncertainty contributions from the spatial, temporal, proxy, and magnitude components used to create the magnitude map of FFCO2 emissions. Throughout this process, when assumptions had to be made or expert judgment employed, the general tendency in most cases was toward overestimating or increasing the magnitude of uncertainty. The results of the uncertainty analysis reveal a range of 4-190 %, with an average of 120 % (2 σ) for populated and FFCO2-emitting grid spaces over annual timescales. This paper also describes a methodological change specific to the creation of the Carbon Dioxide Information Analysis Center (CDIAC) FFCO2 emission maps: the change from a temporally fixed population proxy to a temporally varying population proxy.

  8. Gridded uncertainty in fossil fuel carbon dioxide emission maps, a CDIAC example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andres, Robert J.; Boden, Thomas A.; Higdon, David M.

    Due to a current lack of physical measurements at appropriate spatial and temporal scales, all current global maps and distributions of fossil fuel carbon dioxide (FFCO2) emissions use one or more proxies to distribute those emissions. These proxies and distribution schemes introduce additional uncertainty into these maps. This paper examines the uncertainty associated with the magnitude of gridded FFCO2 emissions. This uncertainty is gridded at the same spatial and temporal scales as the mass magnitude maps. This gridded uncertainty includes uncertainty contributions from the spatial, temporal, proxy, and magnitude components used to create the magnitude map of FFCO2 emissions. Throughout this process, when assumptions had to be made or expert judgment employed, the general tendency in most cases was toward overestimating or increasing the magnitude of uncertainty. The results of the uncertainty analysis reveal a range of 4-190 %, with an average of 120 % (2 σ) for populated and FFCO2-emitting grid spaces over annual timescales. This paper also describes a methodological change specific to the creation of the Carbon Dioxide Information Analysis Center (CDIAC) FFCO2 emission maps: the change from a temporally fixed population proxy to a temporally varying population proxy.

  9. Gridded uncertainty in fossil fuel carbon dioxide emission maps, a CDIAC example

    NASA Astrophysics Data System (ADS)

    Andres, Robert J.; Boden, Thomas A.; Higdon, David M.

    2016-12-01

    Due to a current lack of physical measurements at appropriate spatial and temporal scales, all current global maps and distributions of fossil fuel carbon dioxide (FFCO2) emissions use one or more proxies to distribute those emissions. These proxies and distribution schemes introduce additional uncertainty into these maps. This paper examines the uncertainty associated with the magnitude of gridded FFCO2 emissions. This uncertainty is gridded at the same spatial and temporal scales as the mass magnitude maps. This gridded uncertainty includes uncertainty contributions from the spatial, temporal, proxy, and magnitude components used to create the magnitude map of FFCO2 emissions. Throughout this process, when assumptions had to be made or expert judgment employed, the general tendency in most cases was toward overestimating or increasing the magnitude of uncertainty. The results of the uncertainty analysis reveal a range of 4-190 %, with an average of 120 % (2σ) for populated and FFCO2-emitting grid spaces over annual timescales. This paper also describes a methodological change specific to the creation of the Carbon Dioxide Information Analysis Center (CDIAC) FFCO2 emission maps: the change from a temporally fixed population proxy to a temporally varying population proxy.

  10. Characterizing spatial uncertainty when integrating social data in conservation planning.

    PubMed

    Lechner, A M; Raymond, C M; Adams, V M; Polyakov, M; Gordon, A; Rhodes, J R; Mills, M; Stein, A; Ives, C D; Lefroy, E C

    2014-12-01

    Recent conservation planning studies have presented approaches for integrating spatially referenced social (SRS) data with a view to improving the feasibility of conservation action. We reviewed the growing conservation literature on SRS data, focusing on elicited or stated preferences derived through social survey methods such as choice experiments and public participation geographic information systems. Elicited SRS data includes the spatial distribution of willingness to sell, willingness to pay, willingness to act, and assessments of social and cultural values. We developed a typology for assessing elicited SRS data uncertainty, which describes how social survey uncertainty propagates when projected spatially and the importance of accounting for spatial uncertainty such as scale effects and data quality. These uncertainties will propagate when elicited SRS data is integrated with biophysical data for conservation planning and may have important consequences for assessing the feasibility of conservation actions. To explore this issue further, we conducted a systematic review of the elicited SRS data literature. We found that social survey uncertainty was commonly tested for, but that these uncertainties were ignored when projected spatially. Based on these results we developed a framework which will help researchers and practitioners estimate social survey uncertainty and use these quantitative estimates to systematically address uncertainty within an analysis. This is important when using SRS data in conservation applications because decisions need to be made irrespective of data quality, and well-characterized uncertainty can be incorporated into decision-theoretic approaches.

  11. Parameter uncertainty and nonstationarity in regional extreme rainfall frequency analysis in Qu River Basin, East China

    NASA Astrophysics Data System (ADS)

    Zhu, Q.; Xu, Y. P.; Gu, H.

    2014-12-01

    Traditionally, regional frequency analysis methods were developed for stationary environmental conditions. Nevertheless, recent studies have identified significant changes in hydrological records, leading to the 'death' of stationarity. Moreover, uncertainty is pervasive in hydrological frequency analysis. This study investigates the impact of one of the most important uncertainty sources, parameter uncertainty, together with nonstationarity, on design rainfall depth in Qu River Basin, East China. A spatial bootstrap is first proposed to analyze the uncertainty of design rainfall depth estimated by regional frequency analysis based on L-moments, as well as at the at-site scale. Meanwhile, a method combining generalized additive models with a 30-year moving window is employed to analyze non-stationarity in the extreme rainfall regime. The results show that the uncertainties of design rainfall depth with a 100-year return period under stationary conditions, estimated by the regional spatial bootstrap, can reach 15.07% and 12.22% with GEV and PE3, respectively. At the at-site scale, the uncertainties can reach 17.18% and 15.44% with GEV and PE3, respectively. Under non-stationary conditions, the uncertainties of maximum rainfall depth (corresponding to design rainfall depth) with 0.01 annual exceedance probability (corresponding to a 100-year return period) are 23.09% and 13.83% with GEV and PE3, respectively. Comparing the 90% confidence intervals, the uncertainty of design rainfall depth resulting from parameter uncertainty is smaller than that from non-stationary frequency analysis with GEV, but slightly larger with PE3. This study indicates that the spatial bootstrap can be successfully applied to analyze the uncertainty of design rainfall depth on both regional and at-site scales. The non-stationary analysis shows that the differences between non-stationary quantiles and their stationary equivalents are important for decision makers in water resources management and risk management.
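
    A minimal version of the bootstrap step is easy to reproduce with Hosking's lmom package; the sketch below is the at-site variant (the paper's spatial bootstrap resamples gauges region-wide) and uses synthetic annual maxima rather than the Qu River Basin records.

    ```r
    # Sketch: bootstrap uncertainty of the GEV 100-year quantile fitted by
    # L-moments; 60 synthetic annual maxima stand in for observed records.
    library(lmom)

    set.seed(11)
    amax <- quagev(runif(60), c(80, 25, -0.1))    # synthetic annual maxima

    q100 <- replicate(1000, {
      resample <- sample(amax, replace = TRUE)
      quagev(0.99, pelgev(samlmu(resample)))      # 100-yr design depth
    })

    # Half-width of the 90% interval relative to the median, comparable
    # to the percentage uncertainties quoted in the record
    diff(quantile(q100, c(0.05, 0.95))) / (2 * median(q100)) * 100
    ```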

  12. 'spup' - an R package for uncertainty propagation in spatial environmental modelling

    NASA Astrophysics Data System (ADS)

    Sawicka, Kasia; Heuvelink, Gerard

    2016-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable, including for case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation from input data and model parameters, through the environmental model, to model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as input to environmental models called from R, or externally. Selected static and interactive visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy-to-use tool that can be used in multi-disciplinary research and model-based decision support.

  13. A Review On Accuracy and Uncertainty of Spatial Data and Analyses with special reference to Urban and Hydrological Modelling

    NASA Astrophysics Data System (ADS)

    Devendran, A. A.; Lakshmanan, G.

    2014-11-01

    Data quality for GIS processing and analysis is becoming an increasing concern due to the accelerated application of GIS technology in problem-solving and decision-making roles. Uncertainty in the geographic representation of the real world arises because these representations are incomplete. Identification of the sources of these uncertainties and of the ways in which they operate in GIS-based representations becomes crucial in any spatial data representation and in geospatial analysis applied to any field of application. This paper reviews articles on the various components of spatial data quality and the uncertainties inherent in them, with special focus on two fields of application: urban simulation and hydrological modelling. Urban growth is a complicated process involving spatio-temporal changes of all socio-economic and physical components at different scales. The Cellular Automata (CA) model is one such simulation model; it randomly selects potential cells for urbanisation, and transition rules evaluate the properties of each cell and its neighbours. Uncertainty arising from CA modelling is assessed mainly using sensitivity analysis, including the Monte Carlo simulation method. Likewise, the importance of hydrological uncertainty analysis has been emphasized in recent years, and there is an urgent need to incorporate uncertainty estimation into water resources assessment procedures. The Soil and Water Assessment Tool (SWAT) is a continuous-time watershed model for evaluating various impacts of land use management and climate on hydrology and water quality. Hydrological model uncertainties in SWAT are addressed primarily with the Generalized Likelihood Uncertainty Estimation (GLUE) method.
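
    The GLUE procedure the review refers to fits in a short script: sample parameters, score each set with an informal likelihood, keep the "behavioral" sets, and form likelihood-weighted prediction bounds. The five-step "model" below is a toy stand-in for SWAT.

    ```r
    # Sketch of GLUE with a toy rainfall-runoff model; all data invented.
    set.seed(5)
    obs       <- c(3.1, 4.0, 2.7, 5.2, 4.4)              # observed flows
    toy_model <- function(k, s) k * c(3, 4, 3, 5, 4) + s # placeholder model

    n      <- 5000
    params <- data.frame(k = runif(n, 0.5, 1.5), s = runif(n, -1, 1))
    sims   <- mapply(toy_model, params$k, params$s)      # 5 x n matrix

    # Nash-Sutcliffe efficiency as the informal GLUE likelihood
    nse <- apply(sims, 2, function(q)
      1 - sum((q - obs)^2) / sum((obs - mean(obs))^2))

    behavioral <- nse > 0.5                              # subjective threshold
    w <- nse[behavioral]; w <- w / sum(w)                # normalized weights

    # Likelihood-weighted 5-95% bounds for the first time step
    wq <- function(x, w, p) {
      o <- order(x); cw <- cumsum(w[o]) / sum(w)
      sapply(p, function(pp) x[o][which(cw >= pp)[1]])
    }
    wq(sims[1, behavioral], w, c(0.05, 0.95))
    ```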

  14. Spatial variability versus parameter uncertainty in freshwater fate and exposure factors of chemicals.

    PubMed

    Nijhof, Carl O P; Huijbregts, Mark A J; Golsteijn, Laura; van Zelm, Rosalie

    2016-04-01

    We compared the influence of spatial variability in environmental characteristics and of the uncertainty in measured substance properties of seven chemicals on freshwater fate factors (FFs), representing the residence time in the freshwater environment, and on exposure factors (XFs), representing the dissolved fraction of a chemical. The influence of spatial variability was quantified using the SimpleBox model, in which Europe was divided into 100 × 100 km regions, nested in a regional (300 × 300 km) and supra-regional (500 × 500 km) scale. Uncertainty in substance properties was quantified by means of probabilistic modelling. Spatial variability and parameter uncertainty were expressed by the ratio k of the 95%ile and 5%ile of the FF and XF. Our analysis shows that for persistent chemicals that partition predominantly into one environmental compartment, spatial variability in FFs was up to 2 orders of magnitude larger than uncertainty. For the other (less persistent) chemicals, uncertainty in the FF was up to 1 order of magnitude larger than spatial variability. Variability and uncertainty in the freshwater XFs of the seven chemicals were negligible (k < 1.5). We found that, depending on the chemical and emission scenario, accounting for region-specific environmental characteristics in multimedia fate modelling, as well as accounting for parameter uncertainty, can have a significant influence on freshwater fate factor predictions. Therefore, we conclude that fate factors should account not only for parameter uncertainty but for spatial variability as well, as this further increases the reliability of ecotoxicological impact estimates in LCA.
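
    The paper's spread measure is just a quantile ratio, k = P95/P5, applied to Monte Carlo samples. The two lognormal samples below are illustrative stand-ins for a variability-dominated and an uncertainty-dominated fate factor.

    ```r
    # Sketch: the dimensionless spread measure k = P95/P5 on MC samples.
    set.seed(2)
    ff_variability <- rlnorm(1e4, meanlog = 2.0, sdlog = 1.0)  # across regions
    ff_uncertainty <- rlnorm(1e4, meanlog = 2.0, sdlog = 0.3)  # substance props

    k <- function(x) unname(quantile(x, 0.95) / quantile(x, 0.05))
    c(k_variability = k(ff_variability), k_uncertainty = k(ff_uncertainty))
    # A larger k for variability than for uncertainty reproduces the pattern
    # reported for persistent, single-compartment chemicals.
    ```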

  15. A method to estimate the effect of deformable image registration uncertainties on daily dose mapping

    PubMed Central

    Murphy, Martin J.; Salguero, Francisco J.; Siebers, Jeffrey V.; Staub, David; Vaman, Constantin

    2012-01-01

    Purpose: To develop a statistical sampling procedure for spatially-correlated uncertainties in deformable image registration and then use it to demonstrate their effect on daily dose mapping. Methods: Sequential daily CT studies are acquired to map anatomical variations prior to fractionated external beam radiotherapy. The CTs are deformably registered to the planning CT to obtain displacement vector fields (DVFs). The DVFs are used to accumulate the dose delivered each day onto the planning CT. Each DVF has spatially-correlated uncertainties associated with it. Principal components analysis (PCA) is applied to measured DVF error maps to produce decorrelated principal component modes of the errors. The modes are sampled independently and reconstructed to produce synthetic registration error maps. The synthetic error maps are convolved with dose mapped via deformable registration to model the resulting uncertainty in the dose mapping. The results are compared to the dose mapping uncertainty that would result from uncorrelated DVF errors that vary randomly from voxel to voxel. Results: The error sampling method is shown to produce synthetic DVF error maps that are statistically indistinguishable from the observed error maps. Spatially-correlated DVF uncertainties modeled by our procedure produce patterns of dose mapping error that are different from that due to randomly distributed uncertainties. Conclusions: Deformable image registration uncertainties have complex spatial distributions. The authors have developed and tested a method to decorrelate the spatial uncertainties and make statistical samples of highly correlated error maps. The sample error maps can be used to investigate the effect of DVF uncertainties on daily dose mapping via deformable image registration. An initial demonstration of this methodology shows that dose mapping uncertainties can be sensitive to spatial patterns in the DVF uncertainties. PMID:22320766
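
    The PCA-based error sampling in this record reduces to a few base-R steps: decompose observed error maps, sample the decorrelated mode scores independently, and reconstruct. The snippet below uses simulated "error maps" and is only a sketch of the decorrelate-and-resample idea, not the authors' clinical pipeline.

    ```r
    # Sketch: PCA sampling of spatially correlated registration errors.
    # Rows are flattened DVF error maps; data are simulated stand-ins.
    set.seed(9)
    n_maps <- 40; n_vox <- 500
    base   <- matrix(rnorm(n_maps * 5), n_maps, 5) %*%
              matrix(rnorm(5 * n_vox), 5, n_vox)        # correlated structure
    errors <- base + matrix(rnorm(n_maps * n_vox, sd = 0.1), n_maps, n_vox)

    pca <- prcomp(errors, center = TRUE)

    # Sample each retained mode's score from a normal with that mode's sd,
    # then reconstruct one synthetic, statistically consistent error map
    k <- 5
    scores    <- sapply(1:k, function(j) rnorm(1, 0, pca$sdev[j]))
    synthetic <- pca$center + as.vector(pca$rotation[, 1:k] %*% scores)
    ```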

  16. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    PubMed

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.

  17. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.

  18. Spatial uncertainty analysis: Propagation of interpolation errors in spatially distributed models

    USGS Publications Warehouse

    Phillips, D.L.; Marks, D.G.

    1996-01-01

    In simulation modelling, it is desirable to quantify model uncertainties and provide not only point estimates for output variables but confidence intervals as well. Spatially distributed physical and ecological process models are becoming widely used, with runs being made over a grid of points that represent the landscape. This requires input values at each grid point, which often have to be interpolated from irregularly scattered measurement sites, e.g., weather stations. Interpolation introduces spatially varying errors which propagate through the model. We extended established uncertainty analysis methods to a spatial domain for quantifying spatial patterns of input variable interpolation errors and how they propagate through a model to affect the uncertainty of the model output. We applied this to a model of potential evapotranspiration (PET) as a demonstration. We modelled PET for three time periods in 1990 as a function of temperature, humidity, and wind on a 10-km grid across the U.S. portion of the Columbia River Basin. Temperature, humidity, and wind speed were interpolated using kriging from 700-1000 supporting data points. Kriging standard deviations (SD) were used to quantify the spatially varying interpolation uncertainties. For each of 5693 grid points, 100 Monte Carlo simulations were done, using the kriged values of temperature, humidity, and wind, plus random error terms determined by the kriging SDs and the correlations of interpolation errors among the three variables. For the spring season example, kriging SDs averaged 2.6 °C for temperature, 8.7% for relative humidity, and 0.38 m s-1 for wind. The resultant PET estimates had coefficients of variation (CVs) ranging from 14% to 27% for the 10-km grid cells. Maps of PET means and CVs showed the spatial patterns of PET with a measure of its uncertainty due to interpolation of the input variables. This methodology should be applicable to a variety of spatially distributed models using interpolated inputs.
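
    At a single grid point, the procedure amounts to drawing correlated input errors and pushing them through the PET function. The sketch below reuses the kriging SDs quoted in the abstract but assumes the error correlation matrix and uses a placeholder PET formula, since neither is given here.

    ```r
    # Sketch: Monte Carlo propagation of correlated interpolation errors
    # through a toy PET function at one grid point.
    library(MASS)

    kriged <- c(temp = 12, rh = 60, wind = 3)   # kriged estimates
    sds    <- c(2.6, 8.7, 0.38)                 # kriging SDs from the record
    rho    <- matrix(c( 1.0, -0.3,  0.1,
                       -0.3,  1.0, -0.2,
                        0.1, -0.2,  1.0), 3, 3) # assumed error correlations
    Sigma  <- diag(sds) %*% rho %*% diag(sds)

    pet <- function(t, rh, w)                   # placeholder PET formula
      pmax(0, 0.05 * t * (100 - rh) / 100 * (1 + 0.2 * w))

    draws  <- mvrnorm(100, mu = kriged, Sigma = Sigma)  # 100 MC simulations
    pet_mc <- pet(draws[, 1], draws[, 2], draws[, 3])
    sd(pet_mc) / mean(pet_mc)                   # CV of PET, as mapped
    ```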

  19. Quantifying uncertainty in forest nutrient budgets

    Treesearch

    Ruth D. Yanai; Carrie R. Levine; Mark B. Green; John L. Campbell

    2012-01-01

    Nutrient budgets for forested ecosystems have rarely included error analysis, in spite of the importance of uncertainty to interpretation and extrapolation of the results. Uncertainty derives from natural spatial and temporal variation and also from knowledge uncertainty in measurement and models. For example, when estimating forest biomass, researchers commonly report...

  20. Transfer of Satellite Rainfall Uncertainty from Gauged to Ungauged Regions at Regional and Seasonal Timescales

    NASA Technical Reports Server (NTRS)

    Tang, Ling; Hossain, Faisal; Huffman, George J.

    2010-01-01

    Hydrologists and other users need to know the uncertainty of the satellite rainfall data sets across the range of time/space scales over the whole domain of the data set. Here, 'uncertainty' refers to the general concept of the 'deviation' of an estimate from the reference (or ground truth), where the deviation may be defined in multiple ways. This uncertainty information can provide insight to the user on the realistic limits of utility, such as hydrologic predictability, that can be achieved with these satellite rainfall data sets. However, satellite rainfall uncertainty estimation requires ground validation (GV) precipitation data. On the other hand, satellite data will be most useful over regions that lack GV data, for example developing countries. This paper addresses the open issues for developing an appropriate uncertainty transfer scheme that can routinely estimate various uncertainty metrics across the globe by leveraging a combination of spatially-dense GV data and temporally sparse surrogate (or proxy) GV data, such as the Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar and the Global Precipitation Measurement (GPM) mission Dual-Frequency Precipitation Radar. The TRMM Multi-satellite Precipitation Analysis (TMPA) products over the US spanning a record of 6 years are used as a representative example of satellite rainfall. It is shown that there exists a quantifiable spatial structure in the uncertainty of satellite data for spatial interpolation. Probabilistic analysis of sampling offered by the existing constellation of passive microwave sensors indicates that transfer of uncertainty for hydrologic applications may be effective at daily time scales or higher during the GPM era. Finally, a commonly used spatial interpolation technique (kriging), which leverages the spatial correlation of estimation uncertainty, is assessed at climatologic, seasonal, monthly and weekly timescales. It is found that the effectiveness of kriging is sensitive to the type of uncertainty metric, the time scale of transfer and the density of GV data within the transfer domain. Transfer accuracy is lowest at weekly timescales, with the error doubling from monthly to weekly. However, at very low GV data density (<20% of the domain), the transfer accuracy is too low to show any distinction as a function of the timescale of transfer.

  21. Communicating spatial uncertainty to non-experts using R

    NASA Astrophysics Data System (ADS)

    Luzzi, Damiano; Sawicka, Kasia; Heuvelink, Gerard; de Bruin, Sytze

    2016-04-01

    Effective visualisation methods are important for the efficient use of uncertainty information by various groups of users. Uncertainty propagation analysis is often used with spatial environmental models to quantify the uncertainty in the resulting information. A challenge arises when trying to communicate this uncertainty information effectively to non-experts (not statisticians) in a wide range of cases. Due to the growing popularity and applicability of the open source programming language R, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. The package implements Monte Carlo algorithms for uncertainty propagation, the output of which is represented by an ensemble of model outputs (i.e. a sample from a probability distribution). Numerous visualisation methods exist that aim to present such spatial uncertainty information statically, dynamically and interactively. To provide the most universal visualisation tools for non-experts, we conducted a survey among a group of 20 university students and assessed the effectiveness of selected static and interactive methods for visualising uncertainty in spatial variables such as a DEM and land cover. The static methods included adjacent maps and glyphs for continuous variables. Both allow for displaying maps with information about the ensemble mean, variance/standard deviation and prediction intervals. Adjacent maps were also used for categorical data, displaying maps of the most probable class, as well as its associated probability. The interactive methods included a graphical user interface, which in addition to displaying the previously mentioned variables also allowed for comparison of joint uncertainties at multiple locations. The survey indicated that users could understand the basics of the uncertainty information displayed in the static maps, with the interactive interface allowing for more in-depth information. Subsequently, the R package included a collation of the plotting functions that were evaluated in the survey. The implementation of static visualisations was done via calls to the 'ggplot2' package, which gives the user control over the content, legend, colours, axes and titles. The interactive methods were implemented using the 'shiny' package, allowing users to activate the visualisation of statistical descriptions of uncertainty through interaction with a plotted map of means. This research brings uncertainty visualisation to a broader audience through the development of tools for visualising uncertainty using open source software.
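
    The "adjacent maps" idea translates directly into a facetted ggplot2 figure: one panel for the ensemble mean, one for the ensemble standard deviation. The surface below is simulated, not the DEM or land cover data used in the survey.

    ```r
    # Sketch: adjacent maps of ensemble mean and sd with ggplot2 facets.
    library(ggplot2)

    set.seed(4)
    grid <- expand.grid(x = 1:50, y = 1:50)
    ens  <- replicate(100,
      with(grid, sin(x / 8) + cos(y / 10) + rnorm(2500, sd = 0.3)))

    df <- rbind(
      data.frame(grid, value = rowMeans(ens),      stat = "mean"),
      data.frame(grid, value = apply(ens, 1, sd),  stat = "sd")
    )

    ggplot(df, aes(x, y, fill = value)) +
      geom_raster() +
      facet_wrap(~stat) +
      scale_fill_viridis_c() +
      labs(title = "Ensemble mean and standard deviation as adjacent maps")
    ```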

  22. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    PubMed Central

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987

  23. Evaluation of habitat suitability index models by global sensitivity and uncertainty analyses: a case study for submerged aquatic vegetation

    USGS Publications Warehouse

    Zajac, Zuzanna; Stith, Bradley M.; Bowling, Andrea C.; Langtimm, Catherine A.; Swain, Eric D.

    2015-01-01

    Habitat suitability index (HSI) models are commonly used to predict habitat quality and species distributions and are used to develop biological surveys, assess reserve and management priorities, and anticipate possible change under different management or climate change scenarios. Important management decisions may be based on model results, often without a clear understanding of the level of uncertainty associated with model outputs. We present an integrated methodology to assess the propagation of uncertainty from both inputs and structure of the HSI models on model outputs (uncertainty analysis: UA) and relative importance of uncertain model inputs and their interactions on the model output uncertainty (global sensitivity analysis: GSA). We illustrate the GSA/UA framework using simulated hydrology input data from a hydrodynamic model representing sea level changes and HSI models for two species of submerged aquatic vegetation (SAV) in southwest Everglades National Park: Vallisneria americana (tape grass) and Halodule wrightii (shoal grass). We found considerable spatial variation in uncertainty for both species, but distributions of HSI scores still allowed discrimination of sites with good versus poor conditions. Ranking of input parameter sensitivities also varied spatially for both species, with high habitat quality sites showing higher sensitivity to different parameters than low-quality sites. HSI models may be especially useful when species distribution data are unavailable, providing means of exploiting widely available environmental datasets to model past, current, and future habitat conditions. The GSA/UA approach provides a general method for better understanding HSI model dynamics, the spatial and temporal variation in uncertainties, and the parameters that contribute most to model uncertainty. Including an uncertainty and sensitivity analysis in modeling efforts as part of the decision-making framework will result in better-informed, more robust decisions.

  24. Evaluating uncertainty in predicting spatially variable representative elementary scales in fractured aquifers, with application to Turkey Creek Basin, Colorado

    USGS Publications Warehouse

    Wellman, Tristan P.; Poeter, Eileen P.

    2006-01-01

    Computational limitations and sparse field data often mandate use of continuum representation for modeling hydrologic processes in large‐scale fractured aquifers. Selecting appropriate element size is of primary importance because continuum approximation is not valid for all scales. The traditional approach is to select elements by identifying a single representative elementary scale (RES) for the region of interest. Recent advances indicate RES may be spatially variable, prompting unanswered questions regarding the ability of sparse data to spatially resolve continuum equivalents in fractured aquifers. We address this uncertainty of estimating RES using two techniques. In one technique we employ data‐conditioned realizations generated by sequential Gaussian simulation. For the other we develop a new approach using conditioned random walks and nonparametric bootstrapping (CRWN). We evaluate the effectiveness of each method under three fracture densities, three data sets, and two groups of RES analysis parameters. In sum, 18 separate RES analyses are evaluated, which indicate RES magnitudes may be reasonably bounded using uncertainty analysis, even for limited data sets and complex fracture structure. In addition, we conduct a field study to estimate RES magnitudes and resulting uncertainty for Turkey Creek Basin, a crystalline fractured rock aquifer located 30 km southwest of Denver, Colorado. Analyses indicate RES does not correlate to rock type or local relief in several instances but is generally lower within incised creek valleys and higher along mountain fronts. Results of this study suggest that (1) CRWN is an effective and computationally efficient method to estimate uncertainty, (2) RES predictions are well constrained using uncertainty analysis, and (3) for aquifers such as Turkey Creek Basin, spatial variability of RES is significant and complex.

  25. Estimation and impact assessment of input and parameter uncertainty in predicting groundwater flow with a fully distributed model

    NASA Astrophysics Data System (ADS)

    Touhidul Mustafa, Syed Md.; Nossent, Jiri; Ghysels, Gert; Huysmans, Marijke

    2017-04-01

    Transient numerical groundwater flow models have been used to understand and forecast groundwater flow systems under anthropogenic and climatic effects, but the reliability of the predictions is strongly influenced by different sources of uncertainty. Hence, researchers in hydrological sciences are developing and applying methods for uncertainty quantification. Nevertheless, spatially distributed flow models pose significant challenges for parameter and spatially distributed input estimation and uncertainty quantification. In this study, we present a general and flexible approach for input and parameter estimation and uncertainty analysis of groundwater models. The proposed approach combines a fully distributed groundwater flow model (MODFLOW) with the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm. To avoid over-parameterization, the uncertainty of the spatially distributed model input is represented by multipliers. The posterior distributions of these multipliers and of the regular model parameters were estimated using DREAM. The proposed methodology has been applied to an overexploited aquifer in Bangladesh where groundwater pumping and recharge data are highly uncertain. The results confirm that input uncertainty has a considerable effect on the model predictions and parameter distributions. Our approach also provides a new way to optimize the spatially distributed recharge and pumping data along with the parameter values under uncertain input conditions. We conclude that considering model input uncertainty along with parameter uncertainty is important for obtaining realistic model predictions and a correct estimation of the uncertainty bounds.
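
    The multiplier idea can be prototyped without MODFLOW. The sketch below calibrates a recharge multiplier and a log-conductivity parameter of a toy head model with the DREAMzs sampler from the BayesianTools R package; the model, data and priors are invented, and the sampler choice stands in for the paper's DREAM implementation.

    ```r
    # Sketch: posterior of an input multiplier and a parameter via DREAMzs.
    library(BayesianTools)

    obs_head  <- c(10.2, 9.8, 9.5)         # invented head observations
    toy_heads <- function(par) {           # par = (recharge mult., log K)
      r_mult <- par[1]; K <- exp(par[2])
      10 * r_mult / K^0.1 + c(0.2, -0.2, -0.5)
    }
    loglik <- function(par)
      sum(dnorm(obs_head, toy_heads(par), 0.3, log = TRUE))

    setup <- createBayesianSetup(loglik, lower = c(0.5, -1), upper = c(1.5, 1))
    out   <- runMCMC(setup, sampler = "DREAMzs",
                     settings = list(iterations = 9000))
    summary(out)   # posterior of the multiplier and the parameter
    ```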

  26. Addressing uncertainty in modelling cumulative impacts within maritime spatial planning in the Adriatic and Ionian region.

    PubMed

    Gissi, Elena; Menegon, Stefano; Sarretta, Alessandro; Appiotti, Federica; Maragno, Denis; Vianello, Andrea; Depellegrin, Daniel; Venier, Chiara; Barbanti, Andrea

    2017-01-01

    Maritime spatial planning (MSP) is envisaged as a tool to apply an ecosystem-based approach to the marine and coastal realms, aiming at ensuring that the collective pressure of human activities is kept within acceptable limits. Cumulative impacts (CI) assessment can support science-based MSP, in order to understand the existing and potential impacts of human uses on the marine environment. A CI assessment includes several sources of uncertainty that can hinder the correct interpretation of its results if not explicitly incorporated in the decision-making process. This study proposes a three-level methodology to perform a general uncertainty analysis integrated with the CI assessment for MSP, applied to the Adriatic and Ionian Region (AIR). We describe the nature and level of uncertainty with the help of expert judgement and elicitation, covering all possible sources of uncertainty related to the CI model as well as the assumptions and gaps of the case-based MSP process in the AIR. Next, we use the results to tailor the global uncertainty analysis to spatially describe the uncertainty distribution and the variations of the CI scores dependent on the CI model factors. The results show the variability of the uncertainty in the AIR, with only limited portions robustly identified as the most or the least impacted areas under multiple model-factor hypotheses. The results are discussed in terms of the level and type of reliable information and the insights they provide for decision-making. The most significant uncertainty factors are identified to facilitate the adaptive MSP process and to establish research priorities for filling knowledge gaps in subsequent planning cycles. The method aims to depict the potential CI effects, as well as the extent and spatial variation of the data and scientific uncertainty; it therefore constitutes a suitable tool to inform the potential application of the precautionary principle in MSP.

  27. Visualizing Uncertainty of Point Phenomena by Redesigned Error Ellipses

    NASA Astrophysics Data System (ADS)

    Murphy, Christian E.

    2018-05-01

    Visualizing uncertainty remains one of the great challenges in modern cartography. There is no overarching strategy to display the nature of uncertainty, as an effective and efficient visualization depends, besides on the spatial data feature type, heavily on the type of uncertainty. This work presents a design strategy to visualize uncertainty connected to point features. The error ellipse, well known from mathematical statistics, is adapted to display the uncertainty of point information originating from spatial generalization. Modified designs of the error ellipse show the potential of quantitative and qualitative symbolization and simultaneous point-based uncertainty symbolization. The user can intuitively read the centers of gravity and the major orientation of the point arrays, as well as estimate the extents and possible spatial distributions of multiple point phenomena. The error ellipse represents uncertainty in an intuitive way, particularly suitable for laymen. Furthermore, it is shown how applicable an adapted design of the error ellipse is to display the uncertainty of point features originating from incomplete data. The suitability of the error ellipse to display the uncertainty of point information is demonstrated within two showcases: (1) the analysis of formations of association football players, and (2) uncertain positioning of events on maps for the media.
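
    A standard error ellipse of the kind this record redesigns can be drawn in base R from the eigendecomposition of a point sample's covariance matrix. The points below are simulated; they could stand in for, say, the match positions of one football player.

    ```r
    # Sketch: 95% error ellipse for a 2-D point sample (base R only).
    set.seed(8)
    x   <- rnorm(50, 30, 6)
    y   <- 0.5 * x + rnorm(50, 25, 2)     # correlated, so the ellipse tilts
    pts <- cbind(x = x, y = y)

    mu  <- colMeans(pts)
    ev  <- eigen(cov(pts))                # axes = eigenvectors, scaled by
    r   <- sqrt(qchisq(0.95, df = 2))     # the chi-square 95% radius
    ang <- seq(0, 2 * pi, length.out = 200)
    circ <- rbind(cos(ang), sin(ang)) * r
    ell  <- t(ev$vectors %*% (sqrt(ev$values) * circ)) + rep(mu, each = 200)

    plot(pts, asp = 1, pch = 16, col = "grey40")
    lines(ell, lwd = 2)   # orientation follows the major eigenvector
    ```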

  28. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling

    NASA Astrophysics Data System (ADS)

    Dai, Heng; Chen, Xingyuan; Ye, Ming; Song, Xuehang; Zachara, John M.

    2017-05-01

    Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study, we developed a new sensitivity analysis method that integrates the concept of variance-based method with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multilayer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices are defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially distributed input variables.

  29. A Geostatistics-Informed Hierarchical Sensitivity Analysis Method for Complex Groundwater Flow and Transport Modeling

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2017-12-01

    Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a new sensitivity analysis method that integrates the concept of variance-based method with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multi-layer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices are defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed input variables.

  10. Assessment of spectral, misregistration, and spatial uncertainties inherent in the cross-calibration study

    USGS Publications Warehouse

    Chander, G.; Helder, D.L.; Aaron, David; Mishra, N.; Shrestha, A.K.

    2013-01-01

    Cross-calibration of satellite sensors permits the quantitative comparison of measurements obtained from different Earth Observing (EO) systems. Cross-calibration studies usually use simultaneous or near-simultaneous observations from several spaceborne sensors to develop band-by-band relationships through regression analysis. The investigation described in this paper focuses on evaluating the uncertainties inherent in the cross-calibration process, including contributions due to different spectral responses, spectral resolution, spectral filter shift, geometric misregistration, and spatial resolution. The hyperspectral data from the Environmental Satellite SCanning Imaging Absorption SpectroMeter for Atmospheric CartograpHY and the EO-1 Hyperion, along with the relative spectral responses (RSRs) from the Landsat 7 Enhanced Thematic Mapper Plus and the Terra Moderate Resolution Imaging Spectroradiometer sensors, were used for the spectral uncertainty study. The data from the Landsat 5 Thematic Mapper (TM) over five representative land cover types (desert, rangeland, grassland, deciduous forest, and coniferous forest) were used for the geometric misregistration and spatial-resolution study. The spectral resolution uncertainty was found to be within 0.25%, spectral filter shift within 2.5%, geometric misregistration within 0.35%, and spatial-resolution effects within 0.1% for the Libya 4 site. The one-sigma uncertainties presented in this paper are uncorrelated and can therefore be summed orthogonally, from which an overall total uncertainty was derived. In general, the results suggest that the spectral uncertainty dominates the other uncertainties presented in this paper; the effect of sensor RSR differences therefore needs to be quantified and compensated for to avoid large uncertainties in cross-calibration results.
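
    Because the one-sigma components are treated as uncorrelated, the total is their sum in quadrature. A minimal check in Python, using the component values quoted above:

      import math

      # One-sigma cross-calibration uncertainty components (percent), from the text.
      components = {
          "spectral resolution": 0.25,
          "spectral filter shift": 2.5,
          "geometric misregistration": 0.35,
          "spatial resolution": 0.10,
      }
      total = math.sqrt(sum(u ** 2 for u in components.values()))
      print(f"combined uncertainty: {total:.2f}%")  # ~2.54%, dominated by the spectral term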

  11. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that the uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena, and the results are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining a connection to, and communicating, the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated from sample density, sample variance, interpolation error, or uncertainty derived from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate the team's probabilistic VGM approach with ESRI ArcMap. The VGM-Hadoop tool has been built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementing data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations. The presentation includes examples of the approach applied to a range of subsurface geospatial studies (e.g. induced seismicity risk).
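
    Stripped of the Hadoop machinery, the variable-grid idea is that output cells are sized inversely to data support, so sparsely sampled regions are represented by coarser cells carrying an explicit uncertainty attribute. The following is a hypothetical, single-machine Python illustration of that cell-sizing rule, not the VGM-Hadoop implementation:

      import numpy as np

      rng = np.random.default_rng(1)
      pts = rng.uniform(0, 100, size=(300, 2))   # hypothetical sample locations

      def cell_size(n_samples, fine=5.0, coarse=25.0, threshold=15):
          # Fine output cells where sampling supports them, coarse cells elsewhere.
          return fine if n_samples >= threshold else coarse

      for x0 in range(0, 100, 25):               # sweep 25 m blocks of the domain
          for y0 in range(0, 100, 25):
              in_block = ((pts[:, 0] >= x0) & (pts[:, 0] < x0 + 25) &
                          (pts[:, 1] >= y0) & (pts[:, 1] < y0 + 25))
              n = int(in_block.sum())
              print(f"block ({x0:>2},{y0:>2}): {n:>2} samples -> {cell_size(n):4.1f} m cells")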

  12. Identifying and Analyzing Uncertainty Structures in the TRMM Microwave Imager Precipitation Product over Tropical Ocean Basins

    NASA Technical Reports Server (NTRS)

    Liu, Jianbo; Kummerow, Christian D.; Elsaesser, Gregory S.

    2016-01-01

    Despite continuous improvements in microwave sensors and retrieval algorithms, our understanding of precipitation uncertainty is quite limited, due primarily to inconsistent findings in studies that compare satellite estimates to in situ observations over different parts of the world. This study seeks to characterize the temporal and spatial properties of uncertainty in the Tropical Rainfall Measuring Mission Microwave Imager surface rainfall product over tropical ocean basins. Two uncertainty analysis frameworks are introduced to qualitatively evaluate the properties of uncertainty under a hierarchy of spatiotemporal data resolutions. The first framework (i.e. 'climate method') demonstrates that, apart from random errors and regionally dependent biases, a large component of the overall precipitation uncertainty is manifested in cyclical patterns that are closely related to large-scale atmospheric modes of variability. By estimating the magnitudes of major uncertainty sources independently, the climate method is able to explain 45-88% of the monthly uncertainty variability. The percentage is largely resolution dependent (with the lowest percentage explained associated with a 1 deg x 1 deg spatial/1 month temporal resolution, and highest associated with a 3 deg x 3 deg spatial/3 month temporal resolution). The second framework (i.e. 'weather method') explains regional mean precipitation uncertainty as a summation of uncertainties associated with individual precipitation systems. By further assuming that self-similar recurring precipitation systems yield qualitatively comparable precipitation uncertainties, the weather method can consistently resolve about 50% of the daily uncertainty variability, with only limited dependence on the regions of interest.

  13. Hyperspectral imaging spectroradiometer improves radiometric accuracy

    NASA Astrophysics Data System (ADS)

    Prel, Florent; Moreau, Louis; Bouchard, Robert; Bullis, Ritchie D.; Roy, Claude; Vallières, Christian; Levesque, Luc

    2013-06-01

    Reliable and accurate infrared characterization is necessary to measure the specific spectral signatures of aircraft and the associated infrared counter-measure protections (i.e. flares). Infrared characterization is essential to improve counter-measure efficiency, improve friend-foe identification, and reduce the risk of friendly fire. Typical infrared characterization measurement setups include a variety of panchromatic cameras and spectroradiometers. Each instrument brings essential information: cameras measure the spatial distribution of targets, and spectroradiometers provide the spectral distribution of the emitted energy. However, the combination of separate instruments introduces possible radiometric errors and uncertainties that can be reduced with hyperspectral imagers, which combine spectral and spatial information in the same data by measuring the spectral and spatial distribution of the energy at the same time, ensuring the temporal and spatial cohesion of the collected information. This paper presents a quantitative analysis of the main contributors to radiometric uncertainties and shows how a hyperspectral imager can reduce these uncertainties.

  14. Diving into the consumer nutrition environment: A Bayesian spatial factor analysis of neighborhood restaurant environment.

    PubMed

    Luan, Hui; Law, Jane; Lysy, Martin

    2018-02-01

    Neighborhood restaurant environment (NRE) plays a vital role in shaping residents' eating behaviors. While NRE 'healthfulness' is a multi-faceted concept, most studies evaluate it based only on restaurant type, thus largely ignoring variations in in-restaurant features. In the few studies that do account for such features, healthfulness scores are simply averaged over accessible restaurants, thereby concealing any uncertainty attributable to neighborhood size or spatial correlation. To address these limitations, this paper presents a Bayesian Spatial Factor Analysis for assessing NRE healthfulness in the city of Kitchener, Canada. Several in-restaurant characteristics are included. By treating NRE healthfulness as a spatially correlated latent variable, the adopted modeling approach can: (i) identify the specific indicators most relevant to NRE healthfulness, (ii) provide healthfulness estimates for neighborhoods without accessible restaurants, and (iii) readily quantify uncertainties in the healthfulness index. Implications of the analysis for intervention program development and community food planning are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Challenges and regulatory considerations in the acoustic measurement of high-frequency (>20 MHz) ultrasound.

    PubMed

    Nagle, Samuel M; Sundar, Guru; Schafer, Mark E; Harris, Gerald R; Vaezy, Shahram; Gessert, James M; Howard, Samuel M; Moore, Mary K; Eaton, Richard M

    2013-11-01

    This article examines the challenges associated with making acoustic output measurements at high ultrasound frequencies (>20 MHz) in the context of regulatory considerations contained in the US Food and Drug Administration industry guidance document for diagnostic ultrasound devices. Error sources in the acoustic measurement, including hydrophone calibration and spatial averaging, nonlinear distortion, and mechanical alignment, are evaluated, and the limitations of currently available acoustic measurement instruments are discussed. An uncertainty analysis of acoustic intensity and power measurements is presented, and an example uncertainty calculation is carried out for a hypothetical 30-MHz high-frequency ultrasound system. This analysis concludes that the estimated measurement uncertainty of the acoustic intensity is +73%/-86%, and the uncertainty in the mechanical index is +37%/-43%. These values exceed the corresponding levels of 30% and 15% in the Food and Drug Administration guidance document, which are more representative of the measurement uncertainty associated with characterizing lower-frequency ultrasound systems. Recommendations for minimizing the measurement uncertainty include implementing a mechanical positioning system with sufficient repeatability and precision, reconstructing the time-pressure waveform via deconvolution using the hydrophone frequency response, and correcting for hydrophone spatial averaging.

  16. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. Two sources of uncertainty in geometric discretization are addressed in this paper, both of which need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. The first source is ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose versus depth curves that is needed to determine the radiation exposure. The question, then, is how many shield thicknesses are needed to obtain an accurate result, so convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
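
    The ray-count trade-off lends itself to a simple convergence test: double the number of rays until the quantity of interest stabilizes within a tolerance. A generic Python sketch; the lognormal "thickness sampler" is a stand-in, not the actual vehicle geometry:

      import numpy as np

      rng = np.random.default_rng(2)

      def mean_thickness(n_rays):
          # Stand-in for ray tracing: sample shield thickness (g/cm^2) over directions.
          return rng.lognormal(mean=1.0, sigma=0.5, size=n_rays).mean()

      tol, n = 0.01, 1000
      prev = mean_thickness(n)
      while True:
          n *= 2
          cur = mean_thickness(n)
          if abs(cur - prev) / prev < tol:   # relative change below tolerance
              break
          prev = cur
      print(f"converged with {n} rays (mean thickness ~ {cur:.3f} g/cm^2)")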

  17. DGSA: A Matlab toolbox for distance-based generalized sensitivity analysis of geoscientific computer experiments

    NASA Astrophysics Data System (ADS)

    Park, Jihoon; Yang, Guang; Satija, Addy; Scheidt, Céline; Caers, Jef

    2016-12-01

    Sensitivity analysis plays an important role in geoscientific computer experiments, whether for forecasting, data assimilation or model calibration. In this paper we focus on an extension of a method of regionalized sensitivity analysis (RSA) to applications typical of the Earth Sciences. Such applications involve the building of large, complex spatial models, the application of computationally intensive forward modeling codes, and the integration of heterogeneous sources of model uncertainty. The aim of this paper is to be practical: 1) provide a Matlab code, 2) provide novel visualization methods to aid users in getting a better understanding of the sensitivity, 3) provide a method based on kernel principal component analysis (KPCA) and self-organizing maps (SOM) to account for the spatial uncertainty typical of Earth Science applications, and 4) provide an illustration on a real field case where the above-mentioned complexities present themselves. We present methods that extend the original RSA method in several ways. First, we present the calculation of conditional effects, defined as the sensitivity of a parameter given a level of another parameter. Second, we show how this conditional effect can be used to choose nominal values or ranges at which to fix insensitive parameters while minimally affecting uncertainty in the response. Third, we develop a method based on KPCA and SOM to assign a rank to spatial models in order to calculate the sensitivity of the response to spatial variability in the models. A large oil/gas reservoir case is used as an illustration of these ideas.
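
    The toolbox itself is Matlab, but the regionalized-sensitivity idea behind it can be sketched in a few lines of Python: score a parameter by the distance between its distributions conditioned on different response classes. A toy analogue using an L1 distance between empirical CDFs:

      import numpy as np

      rng = np.random.default_rng(3)
      n = 5000
      x = rng.uniform(0, 1, size=(n, 3))      # three uncertain parameters
      y = x[:, 0] + 0.1 * x[:, 1]             # response: driven mostly by parameter 0

      low = y <= np.median(y)                 # two response classes (clusters in DGSA)
      grid = np.linspace(0, 1, 101)

      def ecdf(sample, grid):
          return np.searchsorted(np.sort(sample), grid, side="right") / len(sample)

      for j in range(3):
          d = np.abs(ecdf(x[low, j], grid) - ecdf(x[~low, j], grid)).mean()
          print(f"parameter {j}: CDF distance = {d:.3f}")  # larger => more sensitive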

  18. Impact of spatial proxies on the representation of bottom-up emission inventories: A satellite-based analysis

    NASA Astrophysics Data System (ADS)

    Geng, Guannan; Zhang, Qiang; Martin, Randall V.; Lin, Jintai; Huo, Hong; Zheng, Bo; Wang, Siwen; He, Kebin

    2017-03-01

    Spatial proxies used in bottom-up emission inventories to derive the spatial distributions of emissions are usually empirical and involve additional levels of uncertainty. Although uncertainties in current emission inventories have been discussed extensively, uncertainties resulting from improper spatial proxies have rarely been evaluated. In this work, we investigate the impact of spatial proxies on the representation of gridded emissions by comparing six gridded NOx emission datasets over China developed from the same magnitude of emissions and different spatial proxies. GEOS-Chem-modeled tropospheric NO2 vertical columns simulated from different gridded emission inventories are compared with satellite-based columns. The results show that differences between modeled and satellite-based NO2 vertical columns are sensitive to the spatial proxies used in the gridded emission inventories. The total population density is less suitable for allocating NOx emissions than nighttime light data because population density tends to allocate more emissions to rural areas. Determining the exact locations of large emission sources could significantly strengthen the correlation between modeled and observed NO2 vertical columns. Using vehicle population and an updated road network for the on-road transport sector could substantially enhance urban emissions and improve the model performance. When further applying industrial gross domestic product (IGDP) values for the industrial sector, modeled NO2 vertical columns could better capture pollution hotspots in urban areas and exhibit the best performance of the six cases compared to satellite-based NO2 vertical columns (slope = 1.01 and R2 = 0.85). This analysis provides a framework for using information from satellite observations to inform bottom-up inventory development. In the future, more effort should be devoted to the representation of spatial proxies to improve spatial patterns in bottom-up emission inventories.
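
    Gridding a bottom-up inventory amounts to distributing a regional total by normalized proxy weights, so the choice of proxy changes only the weights. A schematic Python example with hypothetical numbers:

      import numpy as np

      total_nox = 1000.0                                    # regional total (arbitrary units)
      # Hypothetical proxy values on a four-cell grid (urban -> rural).
      population = np.array([50.0, 30.0, 15.0, 5.0])        # spreads emissions to rural cells
      night_light = np.array([80.0, 15.0, 4.0, 1.0])        # concentrates them in urban cells

      for name, proxy in [("population", population), ("nighttime light", night_light)]:
          gridded = total_nox * proxy / proxy.sum()         # proxy-weighted allocation
          print(name, "->", np.round(gridded, 1))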

  19. Using spatial uncertainty to manipulate the size of the attention focus.

    PubMed

    Huang, Dan; Xue, Linyan; Wang, Xin; Chen, Yao

    2016-09-01

    Preferentially processing behaviorally relevant information is vital for primate survival. In visuospatial attention studies, manipulating the spatial extent of attention focus is an important question. Although many studies have claimed to successfully adjust attention field size by either varying the uncertainty about the target location (spatial uncertainty) or adjusting the size of the cue orienting the attention focus, no systematic studies have assessed and compared the effectiveness of these methods. We used a multiple cue paradigm with 2.5° and 7.5° rings centered around a target position to measure the cue size effect, while the spatial uncertainty levels were manipulated by changing the number of cueing positions. We found that spatial uncertainty had a significant impact on reaction time during target detection, while the cue size effect was less robust. We also carefully varied the spatial scope of potential target locations within a small or large region and found that this amount of variation in spatial uncertainty can also significantly influence target detection speed. Our results indicate that adjusting spatial uncertainty is more effective than varying cue size when manipulating attention field size.

  20. Spatial resolution and measurement uncertainty of strains in bone and bone-cement interface using digital volume correlation.

    PubMed

    Zhu, Ming-Liang; Zhang, Qing-Hang; Lupton, Colin; Tong, Jie

    2016-04-01

    The measurement uncertainty of strains has been assessed in a bone analogue (sawbone), bovine trabecular bone, and bone-cement interface specimens under zero load using the Digital Volume Correlation (DVC) method. The effects of sub-volume size, sample constraint, and preload on the measured strain uncertainty have been examined. There is generally a trade-off between the measurement uncertainty and the spatial resolution. Suitable sub-volume sizes were selected as a compromise between the measurement uncertainty and the spatial resolution for the cases considered. A ratio of sub-volume size to a microstructural characteristic (Tb.Sp) was introduced to reflect a suitable spatial resolution, and the associated measurement uncertainty was assessed. Specifically, ratios between 1.6 and 4 give rise to standard deviations in the measured strains between 166 and 620 με in all the cases considered, which would seem to suffice for strain analysis in pre- as well as post-yield loading regimes. A microscale finite element (μFE) model was built from the CT images of the sawbone, and the results from the μFE model and a continuum FE model were compared with those from the DVC. The strain results were found to differ significantly between the two methods at the tissue level, consistent in trend with the results found in human bones, indicating mainly a limitation of the current DVC method in mapping strains at this level. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Examination of the uncertainty in contaminant fate and transport modeling: a case study in the Venice Lagoon.

    PubMed

    Sommerfreund, J; Arhonditsis, G B; Diamond, M L; Frignani, M; Capodaglio, G; Gerino, M; Bellucci, L; Giuliani, S; Mugnai, C

    2010-03-01

    A Monte Carlo analysis is used to quantify environmental parametric uncertainty in a multi-segment, multi-chemical model of the Venice Lagoon. Scientific knowledge, expert judgment and observational data are used to formulate prior probability distributions that characterize the uncertainty pertaining to 43 environmental system parameters. The propagation of this uncertainty through the model is then assessed by a comparative analysis of the moments (central tendency, dispersion) of the model output distributions. We also apply principal component analysis in combination with correlation analysis to identify the most influential parameters, thereby gaining mechanistic insights into the ecosystem functioning. We found that modeled concentrations of Cu, Pb, OCDD/F and PCB-180 varied by up to an order of magnitude, exhibiting both contaminant- and site-specific variability. These distributions generally overlapped with the measured concentration ranges. We also found that the uncertainty of the contaminant concentrations in the Venice Lagoon was characterized by two modes of spatial variability, mainly driven by the local hydrodynamic regime, which separate the northern and central parts of the lagoon and the more isolated southern basin. While spatial contaminant gradients in the lagoon were primarily shaped by hydrology, our analysis also shows that the interplay amongst the in-place historical pollution in the central lagoon, the local suspended sediment concentrations and the sediment burial rates exerts significant control on the variability of the contaminant concentrations. We conclude that the probabilistic analysis presented herein is valuable for quantifying uncertainty and probing its cause in over-parameterized models, while some of our results can be used to dictate where additional data collection efforts should focus and the directions that future model refinement should follow. (c) 2009 Elsevier Inc. All rights reserved.
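
    The parametric Monte Carlo step has a generic shape: draw the uncertain parameters from their priors, run the model per draw, and screen the output distribution for influential inputs. A minimal Python sketch with a stand-in model and rank correlations in place of the paper's principal component/correlation analysis; the parameter names are hypothetical:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      n = 2000

      # Hypothetical priors for three environmental parameters.
      theta = np.column_stack([
          rng.lognormal(0.0, 0.3, n),   # e.g. suspended sediment concentration
          rng.normal(1.0, 0.2, n),      # e.g. sediment burial rate
          rng.uniform(0.5, 1.5, n),     # e.g. partition coefficient scaling
      ])

      conc = theta[:, 0] * theta[:, 2] / theta[:, 1]   # stand-in fate model output

      for j, name in enumerate(["sediment", "burial", "partition"]):
          rho, _ = stats.spearmanr(theta[:, j], conc)
          print(f"{name}: Spearman rho = {rho:+.2f}")
      print("output 5th-95th percentiles:", np.percentile(conc, [5, 95]).round(2))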

  2. Earth Observation, Spatial Data Quality, and Neglected Tropical Diseases.

    PubMed

    Hamm, Nicholas A S; Soares Magalhães, Ricardo J; Clements, Archie C A

    2015-12-01

    Earth observation (EO) is the use of remote sensing and in situ observations to gather data on the environment. It finds increasing application in the study of environmentally modulated neglected tropical diseases (NTDs). Obtaining and assuring the quality of the relevant spatially and temporally indexed EO data remain challenges. Our objective was to review the Earth observation products currently used in studies of NTD epidemiology and to discuss fundamental issues relating to spatial data quality (SDQ), which limit the utilization of EO and pose challenges for its more effective use. We searched Web of Science and PubMed for studies related to EO and echinococcosis, leptospirosis, schistosomiasis, and soil-transmitted helminth infections. Relevant literature was also identified from the bibliographies of those papers. We found that extensive use is made of EO products in the study of NTD epidemiology; however, the quality of these products is usually given little explicit attention. We review key issues in SDQ concerning spatial and temporal scale, uncertainty, and the documentation and use of quality information. We give examples of how these issues may interact with uncertainty in NTD data to affect the output of an epidemiological analysis. We conclude that researchers should give careful attention to SDQ when designing NTD spatial-epidemiological studies. This should be used to inform uncertainty analysis in the epidemiological study. SDQ should be documented and made available to other researchers.

  3. Uncertainty in the modelling of spatial and temporal patterns of shallow groundwater flow paths: The role of geological and hydrological site information

    NASA Astrophysics Data System (ADS)

    Woodward, Simon J. R.; Wöhling, Thomas; Stenger, Roland

    2016-03-01

    Understanding the hydrological and hydrogeochemical responses of hillslopes and other small scale groundwater systems requires mapping the velocity and direction of groundwater flow relative to the controlling subsurface material features. Since point observations of subsurface materials and groundwater head are often the basis for modelling these complex, dynamic, three-dimensional systems, considerable uncertainties are inevitable, but are rarely assessed. This study explored whether piezometric head data measured at high spatial and temporal resolution over six years at a hillslope research site provided sufficient information to determine the flow paths that transfer nitrate leached from the soil zone through the shallow saturated zone into a nearby wetland and stream. Transient groundwater flow paths were modelled using MODFLOW and MODPATH, with spatial patterns of hydraulic conductivity in the three material layers at the site being estimated by regularised pilot point calibration using PEST, constrained by slug test estimates of saturated hydraulic conductivity at several locations. Subsequent Null Space Monte Carlo uncertainty analysis showed that this data was not sufficient to definitively determine the spatial pattern of hydraulic conductivity at the site, although modelled water table dynamics matched the measured heads with acceptable accuracy in space and time. Particle tracking analysis predicted that the saturated flow direction was similar throughout the year as the water table rose and fell, but was not aligned with either the ground surface or subsurface material contours; indeed the subsurface material layers, having relatively similar hydraulic properties, appeared to have little effect on saturated water flow at the site. Flow path uncertainty analysis showed that, while accurate flow path direction or velocity could not be determined on the basis of the available head and slug test data alone, the origin of well water samples relative to the material layers and site contour could still be broadly deduced. This study highlights both the challenge of collecting suitably informative field data with which to characterise subsurface hydrology, and the power of modern calibration and uncertainty modelling techniques to assess flow path uncertainty in hillslopes and other small scale systems.

  4. Time-Resolved Particle Image Velocimetry Measurements with Wall Shear Stress and Uncertainty Quantification for the FDA Nozzle Model.

    PubMed

    Raben, Jaime S; Hariharan, Prasanna; Robinson, Ronald; Malinauskas, Richard; Vlachos, Pavlos P

    2016-03-01

    We present advanced particle image velocimetry (PIV) processing, post-processing, and uncertainty estimation techniques to support the validation of computational fluid dynamics analyses of medical devices. This work is an extension of a previous FDA-sponsored multi-laboratory study, which used a medical device mimicking geometry referred to as the FDA benchmark nozzle model. Experimental measurements were performed using time-resolved PIV at five overlapping regions of the model for Reynolds numbers in the nozzle throat of 500, 2000, 5000, and 8000. Images included a twofold increase in spatial resolution in comparison to the previous study. Data was processed using ensemble correlation, dynamic range enhancement, and phase correlations to increase signal-to-noise ratios and measurement accuracy, and to resolve flow regions with large velocity ranges and gradients, which is typical of many blood-contacting medical devices. Parameters relevant to device safety, including shear stress at the wall and in bulk flow, were computed using radial basis functions. In addition, in-field spatially resolved pressure distributions, Reynolds stresses, and energy dissipation rates were computed from PIV measurements. Velocity measurement uncertainty was estimated directly from the PIV correlation plane, and uncertainty analysis for wall shear stress at each measurement location was performed using a Monte Carlo model. Local velocity uncertainty varied greatly and depended largely on local conditions such as particle seeding, velocity gradients, and particle displacements. Uncertainty in low velocity regions in the sudden expansion section of the nozzle was greatly reduced by over an order of magnitude when dynamic range enhancement was applied. Wall shear stress uncertainty was dominated by uncertainty contributions from velocity estimations, which were shown to account for 90-99% of the total uncertainty. This study provides advancements in the PIV processing methodologies over the previous work through increased PIV image resolution, use of robust image processing algorithms for near-wall velocity measurements and wall shear stress calculations, and uncertainty analyses for both velocity and wall shear stress measurements. The velocity and shear stress analysis, with spatially distributed uncertainty estimates, highlights the challenges of flow quantification in medical devices and provides potential methods to overcome such challenges.
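
    The Monte Carlo treatment of wall shear stress uncertainty can be sketched generically: perturb the near-wall velocities within their estimated uncertainty, recompute the wall gradient for each draw, and read off the spread. The values below are hypothetical, not the study's data:

      import numpy as np

      rng = np.random.default_rng(5)

      mu = 3.5e-3                        # dynamic viscosity (Pa*s), blood analogue
      y = np.array([1.0e-4, 2.0e-4])     # wall-normal positions of first two points (m)
      u = np.array([0.05, 0.11])         # measured velocities (m/s)
      sigma_u = 0.004                    # one-sigma PIV velocity uncertainty (m/s)

      n = 20000
      u_mc = u + rng.normal(0.0, sigma_u, size=(n, 2))
      # Least-squares slope through the wall (u = 0 at y = 0) for each draw.
      dudy = (u_mc * y).sum(axis=1) / (y ** 2).sum()
      tau = mu * dudy
      print(f"tau_w = {tau.mean():.3f} +/- {tau.std():.3f} Pa")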

  5. Evaluation of Uncertainty in Precipitation Datasets for New Mexico, USA

    NASA Astrophysics Data System (ADS)

    Besha, A. A.; Steele, C. M.; Fernald, A.

    2014-12-01

    Climate change, population growth and other factors are endangering water availability and sustainability in semiarid/arid areas, particularly in the southwestern United States. Wide spatial and temporal coverage of precipitation measurements is key for regional water budget analysis and hydrological operations, which are themselves valuable tools for water resource planning and management. Rain gauge measurements are usually reliable and accurate at a point. They measure rainfall continuously, but spatial sampling is limited. Ground-based radar and satellite remotely sensed precipitation have wide spatial and temporal coverage. However, these measurements are indirect and subject to errors because of equipment, meteorological variability, the heterogeneity of the land surface itself, and lack of regular recording. This study seeks to understand precipitation uncertainty and, in doing so, lessen uncertainty propagation into hydrological applications and operations. We reviewed, compared and evaluated the TRMM (Tropical Rainfall Measuring Mission) precipitation products, NOAA's (National Oceanic and Atmospheric Administration) Global Precipitation Climatology Centre (GPCC) monthly precipitation dataset, PRISM (Parameter elevation Regression on Independent Slopes Model) data, and data from individual climate stations including Cooperative Observer Program (COOP), Remote Automated Weather Stations (RAWS), Soil Climate Analysis Network (SCAN) and Snowpack Telemetry (SNOTEL) stations. Though not yet finalized, this study finds that the uncertainty within precipitation datasets is influenced by regional topography, season, climate and precipitation rate. Ongoing work aims to further evaluate the precipitation datasets based on the relative influence of these phenomena so that we can identify the optimum datasets for input to statewide water budget analysis.

  6. Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California

    NASA Astrophysics Data System (ADS)

    Mahdyiar, M.; Guin, J.

    2005-12-01

    Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. It is the objective of this study to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and sites' spatial correlation for typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuations and regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. From the hazard side, there are uncertainties in the magnitudes, rates and mechanisms of the seismic sources, local site conditions, and ground motion site amplifications. From the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. From the analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and in integrating thousands of loss distribution curves with different degrees of correlation. In this paper we propose to address some of these issues by conducting loss analyses of a typical small portfolio in southern California, taking into consideration various source and ground motion uncertainties. The approach is designed to integrate loss distribution functions with different degrees of correlation for portfolio analysis. The analysis is based on the USGS 2002 regional seismicity model.

  7. Estimating Uncertainties in the Multi-Instrument SBUV Profile Ozone Merged Data Set

    NASA Technical Reports Server (NTRS)

    Frith, Stacey; Stolarski, Richard

    2015-01-01

    The MOD data set is uniquely qualified for use in long-term ozone analysis because of its long record, high spatial coverage, and consistent instrument design and algorithm. The estimated MOD uncertainty term significantly increases the uncertainty over the statistical error alone. Trends in the post-2000 period are generally positive in the upper stratosphere, but only significant at 1-1.6 hPa. Remaining uncertainties not yet included in the Monte Carlo model are: smoothing error (~1% from 10 to 1 hPa); relative calibration uncertainty between N11 and N17; and seasonal cycle differences between SBUV records.

  8. Reducing Spatial Uncertainty Through Attentional Cueing Improves Contrast Sensitivity in Regions of the Visual Field With Glaucomatous Defects

    PubMed Central

    Phu, Jack; Kalloniatis, Michael; Khuu, Sieu K.

    2018-01-01

    Purpose: Current clinical perimetric test paradigms present stimuli randomly to various locations across the visual field (VF), inherently introducing spatial uncertainty, which reduces contrast sensitivity. In the present study, we determined the extent to which spatial uncertainty affects contrast sensitivity in glaucoma patients by minimizing spatial uncertainty through attentional cueing. Methods: Six patients with open-angle glaucoma and six healthy subjects underwent laboratory-based psychophysical testing to measure contrast sensitivity at preselected locations at two eccentricities (9.5° and 17.5°) with two stimulus sizes (Goldmann sizes III and V) under different cueing conditions: 1, 2, 4, or 8 points verbally cued. The Method of Constant Stimuli and a single-interval forced-choice procedure were used to generate frequency-of-seeing (FOS) curves at locations with and without VF defects. Results: At locations with VF defects, cueing minimizes spatial uncertainty and improves sensitivity under all conditions. The effect of cueing was maximal when one point was cued and rapidly diminished when more points were cued (no change from baseline with 8 points cued). The slope of the FOS curve steepened with reduced spatial uncertainty. Locations with normal sensitivity in glaucomatous eyes had performance similar to that of healthy subjects. There was a systematic increase in uncertainty with the depth of VF loss. Conclusions: Sensitivity measurements across the VF are negatively affected by spatial uncertainty, which increases with greater VF loss. Minimizing uncertainty can improve sensitivity at locations of deficit. Translational Relevance: Current perimetric techniques introduce spatial uncertainty and may therefore underestimate sensitivity in regions of VF loss. PMID:29600116
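
    Frequency-of-seeing data of this kind are commonly summarized by fitting a sigmoid whose midpoint gives the threshold and whose slope tightens as spatial uncertainty drops. A generic Python fit on made-up proportions (not the study's data):

      import numpy as np
      from scipy.optimize import curve_fit

      def fos(contrast_db, threshold, slope, gamma=0.5):
          # Logistic FOS curve with guess rate gamma (single-interval forced choice).
          return gamma + (1 - gamma) / (1 + np.exp(-(contrast_db - threshold) / slope))

      levels = np.array([0.0, 3.0, 6.0, 9.0, 12.0, 15.0])      # stimulus contrast (dB)
      p_seen = np.array([0.52, 0.55, 0.68, 0.85, 0.96, 0.99])  # hypothetical proportions

      (thr, slp), _ = curve_fit(fos, levels, p_seen, p0=[7.0, 2.0])
      print(f"threshold = {thr:.1f} dB, slope = {slp:.2f} dB")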

  9. Spatial and temporal study of nitrate concentration in groundwater by means of coregionalization

    USGS Publications Warehouse

    D'Agostino, V.; Greene, E.A.; Passarella, G.; Vurro, M.

    1998-01-01

    Spatial and temporal behavior of hydrochemical parameters in groundwater can be studied using tools provided by geostatistics. The cross-variogram can be used to measure the spatial increments between observations at two given times as a function of distance (spatial structure). Taking into account the existence of such a spatial structure, two different data sets (sampled at two different times), representing concentrations of the same hydrochemical parameter, can be analyzed by cokriging in order to reduce the uncertainty of the estimation. In particular, if one of the two data sets is a subset of the other (that is, an undersampled set), cokriging allows us to study the spatial distribution of the hydrochemical parameter at that time, while also considering the statistical characteristics of the full data set established at a different time. This paper presents an application of cokriging using temporal subsets to study the spatial distribution of nitrate concentration in the aquifer of the Lucca Plain, central Italy. Three data sets of nitrate concentration in groundwater were collected during three different periods in 1991. The first set was from 47 wells; the second and third are undersampled and represent 28 and 27 wells, respectively. Comparing the results of cokriging with ordinary kriging showed an improvement in terms of reduced estimation variance. The application of cokriging to the undersampled data sets reduced the uncertainty in estimating nitrate concentration and at the same time decreased the cost of the field sampling and laboratory analysis.

  10. MODIS land cover uncertainty in regional climate simulations

    NASA Astrophysics Data System (ADS)

    Li, Xue; Messina, Joseph P.; Moore, Nathan J.; Fan, Peilei; Shortridge, Ashton M.

    2017-12-01

    MODIS land cover datasets are used extensively across the climate modeling community, but inherent uncertainties and associated propagating impacts are rarely discussed. This paper modeled uncertainties embedded within the annual MODIS Land Cover Type (MCD12Q1) products and propagated these uncertainties through the Regional Atmospheric Modeling System (RAMS). First, land cover uncertainties were modeled using pixel-based trajectory analyses from a time series of MCD12Q1 for Urumqi, China. Second, alternative land cover maps were produced based on these categorical uncertainties and passed into RAMS. Finally, simulations from RAMS were analyzed temporally and spatially to reveal impacts. Our study found that MCD12Q1 struggles to discriminate between grasslands and croplands, or grasslands and barren land, in this study area. Such categorical uncertainties have significant impacts on regional climate model outputs. All climate variables examined demonstrated impacts across the various regions, with latent heat flux affected most, by 4.32 W/m2 on domain average. Impacted areas were spatially connected to locations of greater land cover uncertainty. Both the biophysical characteristics and the soil moisture settings associated with land cover types contribute to the variations among simulations. These results indicate that formal land cover uncertainty analysis should be included in MCD12Q1-fed climate modeling as a routine procedure.
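
    The pixel-based trajectory idea can be made concrete: a pixel whose annual labels flip between classes is flagged as categorically uncertain, for example by its disagreement with the modal class. A toy Python sketch with hypothetical labels, not actual MCD12Q1 processing:

      import numpy as np

      # Annual land cover labels for four pixels over six years
      # (codes: 1 = grassland, 2 = cropland, 3 = barren).
      labels = np.array([
          [1, 1, 1, 1, 1, 1],   # stable grassland
          [1, 2, 1, 2, 2, 1],   # grassland/cropland confusion
          [3, 3, 1, 3, 1, 3],   # grassland/barren confusion
          [2, 2, 2, 2, 2, 2],   # stable cropland
      ])

      mode = np.array([np.bincount(row).argmax() for row in labels])
      instability = (labels != mode[:, None]).mean(axis=1)
      print("modal class :", mode)
      print("instability :", instability)   # fraction of years off the modal class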

  11. Visual scanning with or without spatial uncertainty and time-sharing performance

    NASA Technical Reports Server (NTRS)

    Liu, Yili; Wickens, Christopher D.

    1989-01-01

    An experiment is reported that examines the pattern of task interference between visual scanning as a sequential and selective attention process and other concurrent spatial or verbal processing tasks. A distinction is proposed between visual scanning with or without spatial uncertainty, regarding the possible differential effects of these two types of scanning on interference with other concurrent processes. The experiment required the subject to perform a simulated primary tracking task, which was time-shared with a secondary spatial or verbal decision task. The relevant information needed to perform the decision tasks was displayed with or without spatial uncertainty. The experiment employed a 2 x 2 x 2 design with types of scanning (with or without spatial uncertainty), expected scanning distance (low/high), and codes of concurrent processing (spatial/verbal) as the three experimental factors. The results provide strong evidence that visual scanning as a spatial exploratory activity produces greater task interference with concurrent spatial tasks than with concurrent verbal tasks. Furthermore, spatial uncertainty in visual scanning is identified as the crucial factor in producing this differential effect.

  12. Multivariate Probabilistic Analysis of an Hydrological Model

    NASA Astrophysics Data System (ADS)

    Franceschini, Samuela; Marani, Marco

    2010-05-01

    Model predictions based on rainfall measurements and hydrological model results are often limited by the systematic error of measuring instruments, by the intrinsic variability of the natural processes, and by the uncertainty of the mathematical representation. We propose a means to identify such sources of uncertainty and to quantify their effects based on point-estimate approaches, as a valid alternative to cumbersome Monte Carlo methods. We present uncertainty analyses of the hydrologic response to selected meteorological events in the mountain streamflow-generating portion of the Brenta basin at Bassano del Grappa, Italy. The Brenta river catchment has a relatively uniform morphology and quite a heterogeneous rainfall pattern. In the present work, we evaluate two sources of uncertainty: data uncertainty (the uncertainty due to data handling and analysis) and model uncertainty (the uncertainty related to the formulation of the model). We thus evaluate the effects of the measurement error of tipping-bucket rain gauges, the uncertainty in estimating spatially distributed rainfall through block kriging, and the uncertainty associated with estimated model parameters. To this end, we coupled a deterministic model based on the geomorphological theory of the hydrologic response with probabilistic methods. In particular, we compare the results of Monte Carlo Simulations (MCS) to the results obtained, in the same conditions, using Li's Point Estimate Method (LiM). The LiM is a probabilistic technique that approximates the continuous probability distribution function of the considered stochastic variables by means of discrete points and associated weights. This allows results to be satisfactorily reproduced with only a few evaluations of the model function. The comparison between the LiM and MCS results highlights the pros and cons of using an approximating method: the LiM is less computationally demanding than MCS, but has limited applicability, especially when the model response is highly nonlinear. Higher-order approximations can provide more accurate estimations, but reduce the numerical advantage of the LiM. The results of the uncertainty analysis identify the main sources of uncertainty in the computation of river discharge. In this particular case, the spatial variability of rainfall and the uncertainty of the model parameters are shown to have the greatest impact on discharge evaluation. This, in turn, highlights the need to support any estimated hydrological response with probability information and risk analysis results in order to provide a robust, systematic framework for decision making.
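
    Point-estimate methods approximate output moments from a handful of deterministic runs at chosen points. For flavor, here is Rosenblueth's classic two-point scheme (a relative of Li's method, not the LiM itself) applied in Python to a toy rainfall-runoff relation, checked against Monte Carlo:

      import numpy as np
      from itertools import product

      def runoff(rain, k):
          # Toy nonlinear response standing in for the geomorphological model.
          return k * rain ** 1.5

      mu = np.array([50.0, 0.8])    # means of (rainfall, coefficient k)
      sd = np.array([10.0, 0.1])    # standard deviations

      # Rosenblueth: evaluate at every mu +/- sd corner, equal weights
      # (uncorrelated, symmetric inputs).
      vals = np.array([runoff(mu[0] + s0 * sd[0], mu[1] + s1 * sd[1])
                       for s0, s1 in product((-1, 1), repeat=2)])
      print(f"point estimate (4 runs): {vals.mean():.1f} +/- {vals.std():.1f}")

      rng = np.random.default_rng(6)
      mc = runoff(rng.normal(mu[0], sd[0], 100_000), rng.normal(mu[1], sd[1], 100_000))
      print(f"Monte Carlo (1e5 runs) : {mc.mean():.1f} +/- {mc.std():.1f}")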

  13. Attentional Mechanisms in Simple Visual Detection: A Speed-Accuracy Trade-Off Analysis

    ERIC Educational Resources Information Center

    Liu, Charles C.; Wolfgang, Bradley J.; Smith, Philip L.

    2009-01-01

    Recent spatial cuing studies have shown that detection sensitivity can be increased by the allocation of attention. This increase has been attributed to one of two mechanisms: signal enhancement or uncertainty reduction. Signal enhancement is an increase in the signal-to-noise ratio at the cued location; uncertainty reduction is a reduction in the…

  14. Use of paired simple and complex models to reduce predictive bias and quantify uncertainty

    NASA Astrophysics Data System (ADS)

    Doherty, John; Christensen, Steen

    2011-12-01

    Modern environmental management and decision-making is based on the use of increasingly complex numerical models. Such models have the advantage of allowing representation of complex processes and heterogeneous system property distributions inasmuch as these are understood at any particular study site. The latter are often represented stochastically, this reflecting knowledge of the character of system heterogeneity at the same time as it reflects a lack of knowledge of its spatial details. Unfortunately, however, complex models are often difficult to calibrate because of their long run times and sometimes questionable numerical stability. Analysis of predictive uncertainty is also a difficult undertaking when using models such as these. Such analysis must reflect a lack of knowledge of spatial hydraulic property details. At the same time, it must be subject to constraints on the spatial variability of these details born of the necessity for model outputs to replicate observations of historical system behavior. In contrast, the rapid run times and general numerical reliability of simple models often promulgates good calibration and ready implementation of sophisticated methods of calibration-constrained uncertainty analysis. Unfortunately, however, many system and process details on which uncertainty may depend are, by design, omitted from simple models. This can lead to underestimation of the uncertainty associated with many predictions of management interest. The present paper proposes a methodology that attempts to overcome the problems associated with complex models on the one hand and simple models on the other hand, while allowing access to the benefits each of them offers. It provides a theoretical analysis of the simplification process from a subspace point of view, this yielding insights into the costs of model simplification, and into how some of these costs may be reduced. It then describes a methodology for paired model usage through which predictive bias of a simplified model can be detected and corrected, and postcalibration predictive uncertainty can be quantified. The methodology is demonstrated using a synthetic example based on groundwater modeling environments commonly encountered in northern Europe and North America.

  15. The effect of short-range spatial variability on soil sampling uncertainty.

    PubMed

    Van der Perk, Marcel; de Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Fajgelj, Ales; Sansone, Umberto; Jeran, Zvonka; Jaćimović, Radojko

    2008-11-01

    This paper aims to quantify the soil sampling uncertainty arising from the short-range spatial variability of elemental concentrations in the topsoils of agricultural, semi-natural, and contaminated environments. For the agricultural site, the relative standard sampling uncertainty ranges between 1% and 5.5%. For the semi-natural area, the sampling uncertainties are 2-4 times larger than in the agricultural area. The contaminated site exhibited significant short-range spatial variability in elemental composition, which resulted in sampling uncertainties of 20-30%.

  16. Reducing Multisensor Satellite Monthly Mean Aerosol Optical Depth Uncertainty: 1. Objective Assessment of Current AERONET Locations

    NASA Technical Reports Server (NTRS)

    Li, Jing; Li, Xichen; Carlson, Barbara E.; Kahn, Ralph A.; Lacis, Andrew A.; Dubovik, Oleg; Nakajima, Teruyuki

    2016-01-01

    Various space-based sensors have been designed and corresponding algorithms developed to retrieve aerosol optical depth (AOD), the most basic aerosol optical property, yet considerable disagreement still exists across these different satellite data sets. Surface-based observations aim to provide ground truth for validating satellite data; hence, their deployment locations should preferably contain as much spatial information as possible, i.e., high spatial representativeness. Using a novel Ensemble Kalman Filter (EnKF)-based approach, we objectively evaluate the spatial representativeness of current Aerosol Robotic Network (AERONET) sites. Multisensor monthly mean AOD data sets from the Moderate Resolution Imaging Spectroradiometer, Multiangle Imaging Spectroradiometer, Sea-viewing Wide Field-of-view Sensor, Ozone Monitoring Instrument, and Polarization and Anisotropy of Reflectances for Atmospheric Sciences coupled with Observations from a Lidar are combined into a 605-member ensemble, and AERONET data are considered as the observations to be assimilated into this ensemble using the EnKF. The assessment is made by comparing the analysis error variance (which has been constrained by ground-based measurements) with the background error variance (based on satellite data alone). Results show that the total uncertainty is reduced by approximately 27% on average and can exceed 50% over certain places. The uncertainty reduction also shows distinct seasonal patterns, corresponding to the spatial distribution of seasonally varying aerosol types, such as dust in the spring for the Northern Hemisphere and biomass burning in the fall for the Southern Hemisphere. Dust and biomass burning sites have the highest spatial representativeness, rural and oceanic sites can also represent moderate spatial information, whereas the representativeness of urban sites is relatively localized. A spatial score ranging from 1 to 3 is assigned to each AERONET site based on the uncertainty reduction, indicating its representativeness level.
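
    The representativeness score rests on a standard Kalman identity: assimilating a point observation shrinks the ensemble variance, and the fractional shrinkage measures how much spatial information the site carries. A toy two-cell Python illustration (not the 605-member AOD ensemble):

      import numpy as np

      rng = np.random.default_rng(7)

      # Hypothetical AOD ensemble at two correlated grid cells; cell 0 hosts the site.
      n_ens = 500
      base = rng.normal(0.2, 0.05, n_ens)
      ens = np.column_stack([base, 0.8 * base + rng.normal(0, 0.02, n_ens)])

      r = 0.01 ** 2                          # observation error variance
      p = np.cov(ens.T)                      # background error covariance (2 x 2)
      h = np.array([1.0, 0.0])               # observation operator: cell 0 only
      k = p @ h / (h @ p @ h + r)            # Kalman gain
      p_analysis = p - np.outer(k, h) @ p    # analysis error covariance

      reduction = 1 - np.trace(p_analysis) / np.trace(p)
      print(f"total variance reduced by {100 * reduction:.0f}%")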

  17. Coupled semivariogram uncertainty of hydrogeological and geophysical data on capture zone uncertainty analysis

    USGS Publications Warehouse

    Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.

    2008-01-01

    This study investigates capture zone uncertainty that relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of structural parameters. The probability distributions of structural parameters were further updated through the Bayesian approach with the Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and pseudo-cross-semivariogram. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in Southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. © 2008 ASCE.

  18. Uncertainty quantification in flux balance analysis of spatially lumped and distributed models of neuron-astrocyte metabolism.

    PubMed

    Calvetti, Daniela; Cheng, Yougan; Somersalo, Erkki

    2016-12-01

    Identifying feasible steady state solutions of a brain energy metabolism model is an inverse problem that allows infinitely many solutions. The characterization of the non-uniqueness, or the uncertainty quantification of the flux balance analysis, is tantamount to identifying the degrees of freedom of the solution. The degrees of freedom of multi-compartment mathematical models for energy metabolism of a neuron-astrocyte complex may offer a key to understand the different ways in which the energetic needs of the brain are met. In this paper we study the uncertainty in the solution, using techniques of linear algebra to identify the degrees of freedom in a lumped model, and Markov chain Monte Carlo methods in its extension to a spatially distributed case. The interpretation of the degrees of freedom in metabolic terms, more specifically, glucose and oxygen partitioning, is then leveraged to derive constraints on the free parameters to guarantee that the model is energetically feasible. We demonstrate how the model can be used to estimate the stoichiometric energy needs of the cells as well as the household energy based on the measured oxidative cerebral metabolic rate of glucose and glutamate cycling. Moreover, our analysis shows that in the lumped model the net direction of lactate dehydrogenase (LDH) in the cells can be deduced from the glucose partitioning between the compartments. The extension of the lumped model to a spatially distributed multi-compartment setting that includes diffusion fluxes from capillary to tissue increases the number of degrees of freedom, requiring the use of statistical sampling techniques. The analysis of the distributed model reveals that some of the conclusions valid for the spatially lumped model, e.g., concerning the LDH activity and glucose partitioning, may no longer hold.
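
    The degrees of freedom of a steady-state flux balance are simply the dimension of the null space of the stoichiometric matrix, which linear algebra delivers directly. A generic Python sketch on a hypothetical three-metabolite, five-flux network (not the neuron-astrocyte stoichiometry):

      import numpy as np
      from scipy.linalg import null_space

      # Rows: metabolites; columns: fluxes. Steady state requires S @ v = 0.
      S = np.array([
          [1, -1,  0,  0,  0],
          [0,  1, -1, -1,  0],
          [0,  0,  0,  1, -1],
      ])

      N = null_space(S)   # orthonormal basis of the feasible steady-state fluxes
      print(f"degrees of freedom: {N.shape[1]}")   # 5 fluxes - rank 3 = 2
      # Every steady-state flux vector is N @ alpha for free parameters alpha,
      # which is what the metabolic constraints then restrict.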

  19. Sensitivity and Uncertainty Analysis of the GFR MOX Fuel Subassembly

    NASA Astrophysics Data System (ADS)

    Lüley, J.; Vrban, B.; Čerba, Š.; Haščík, J.; Nečas, V.; Pelloni, S.

    2014-04-01

    We performed a sensitivity and uncertainty analysis as well as a benchmark similarity assessment of the MOX fuel subassembly designed for the Gas-Cooled Fast Reactor (GFR) as a representative material of the core. The material composition was defined for each assembly ring separately, allowing us to decompose the sensitivities not only by isotope and reaction but also by spatial region. This approach was confirmed by direct perturbation calculations for chosen materials and isotopes. The similarity assessment identified only ten partly comparable benchmark experiments that can be utilized in the field of GFR development. Based on the determined uncertainties, we also identified the main contributors to the calculation bias.

  20. Characterizing permafrost active layer dynamics and sensitivity to landscape spatial heterogeneity in Alaska

    NASA Astrophysics Data System (ADS)

    Yi, Yonghong; Kimball, John S.; Chen, Richard H.; Moghaddam, Mahta; Reichle, Rolf H.; Mishra, Umakant; Zona, Donatella; Oechel, Walter C.

    2018-01-01

    An important feature of the Arctic is large spatial heterogeneity in active layer conditions, which is generally poorly represented by global models and can lead to large uncertainties in predicting regional ecosystem responses and climate feedbacks. In this study, we developed a spatially integrated modeling and analysis framework combining field observations, local-scale (~50 m resolution) active layer thickness (ALT) and soil moisture maps derived from low-frequency (L + P-band) airborne radar measurements, and global satellite environmental observations to investigate the ALT sensitivity to recent climate trends and landscape heterogeneity in Alaska. Modeled ALT results show good correspondence with in situ measurements in higher-permafrost-probability (PP ≥ 70 %) areas (n = 33; R = 0.60; mean bias = 1.58 cm; RMSE = 20.32 cm), but with larger uncertainty in sporadic and discontinuous permafrost areas. The model results also reveal widespread ALT deepening since 2001, with smaller ALT increases in northern Alaska (mean trend = 0.32 ± 1.18 cm yr⁻¹) and much larger increases (> 3 cm yr⁻¹) across interior and southern Alaska. The positive ALT trend coincides with regional warming and a longer snow-free season (R = 0.60 ± 0.32). A spatially integrated analysis of the radar retrievals and model sensitivity simulations demonstrated that uncertainty in the spatial and vertical distribution of soil organic carbon (SOC) was the largest factor affecting modeled ALT accuracy, while soil moisture played a secondary role. Potential improvements in characterizing SOC heterogeneity, including better spatial sampling of soil conditions and advances in remote sensing of SOC and soil moisture, will enable more accurate predictions of active layer conditions and refinement of the modeling framework across a larger domain.

  1. Statistical and Spatial Analysis of Bathymetric Data for the St. Clair River, 1971-2007

    USGS Publications Warehouse

    Bennion, David

    2009-01-01

    To address questions concerning ongoing geomorphic processes in the St. Clair River, selected bathymetric datasets spanning 36 years were analyzed. Comparisons of recent high-resolution datasets covering the upper river indicate a highly variable, active environment. Although statistical and spatial comparisons of the datasets show that some changes to channel size and shape have taken place during the study period, the uncertainty associated with the various survey methods and interpolation processes limits how many of the results are statistically certain. The methods used to spatially compare the datasets are sensitive to small variations in position and depth that are within the range of uncertainty associated with the datasets. Characteristics of the data, such as the density of measured points and the range of values surveyed, can also influence the results of spatial comparison. With due consideration of these limitations, apparently active and ongoing areas of elevation change in the river are mapped and discussed.

  2. Geostatistical uncertainty of assessing air quality using high-spatial-resolution lichen data: A health study in the urban area of Sines, Portugal.

    PubMed

    Ribeiro, Manuel C; Pinho, P; Branquinho, C; Llop, Esteve; Pereira, Maria J

    2016-08-15

    In most studies correlating health outcomes with air pollution, personal exposure assignments are based on measurements collected at air-quality monitoring stations that do not coincide with the locations of the health data. In such cases, interpolators are needed to predict air quality at unsampled locations and to assign personal exposures. Moreover, a measure of the spatial uncertainty of exposures should be incorporated, especially in urban areas where concentrations vary over short distances due to changes in land use and pollution intensity. Such studies are limited by the lack of literature comparing exposure uncertainty derived from distinct spatial interpolators. Here, we addressed these issues with two interpolation methods: regression kriging (RK) and ordinary kriging (OK). These methods were used to generate air-quality simulations with a geostatistical algorithm. For each method, the geostatistical uncertainty was drawn from generalized linear model (GLM) analysis. We analyzed the association between air quality and birth weight. Personal health data (n=227) and exposure data were collected in Sines (Portugal) during 2007-2010. Because the air-quality monitoring stations in the city do not offer high-spatial-resolution measurements (n=1), we used lichen data as an ecological indicator of air quality (n=83). We found no significant difference in the fit of the GLMs between the two geostatistical methods; with RK, however, the models more often fit better and less often fit worse. Moreover, the geostatistical uncertainty results showed a marginally higher mean and precision with RK. Combined with lichen data and high-spatial-resolution land-use data, RK is thus the more effective geostatistical method for relating health outcomes to air quality in urban areas. This is particularly important in small cities, which generally do not have expensive high-spatial-resolution air-quality monitoring networks. Further, alternative ways of linking human activities with their environment are needed to improve human well-being. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Spatial and Temporal Uncertainty of Crop Yield Aggregations

    NASA Technical Reports Server (NTRS)

    Porwollik, Vera; Mueller, Christoph; Elliott, Joshua; Chryssanthacopoulos, James; Iizumi, Toshichika; Ray, Deepak K.; Ruane, Alex C.; Arneth, Almut; Balkovic, Juraj; Ciais, Philippe; hide

    2016-01-01

    The aggregation of simulated gridded crop yields to national or regional scale requires information on temporal and spatial patterns of crop-specific harvested areas. This analysis estimates the uncertainty of simulated gridded yield time series related to the aggregation with four different harvested-area data sets. We compare aggregated yield time series from the Global Gridded Crop Model Intercomparison project for four crop types from 14 models at global, national, and regional scale to determine aggregation-driven differences in mean yields and temporal patterns as measures of uncertainty. The quantity and spatial patterns of harvested areas differ for individual crops among the four datasets applied for the aggregation. Simulated spatial yield patterns also differ among the 14 models. These differences in harvested areas and simulated yield patterns lead to differences in aggregated productivity estimates, both in mean yield and in temporal dynamics. Among the four investigated crops, wheat yield (17% relative difference) is most affected by the uncertainty introduced by aggregation at the global scale. The correlation of temporal patterns of globally aggregated yield time series can be as low as r = 0.28 (soybean). For the majority of countries, mean relative differences of nationally aggregated yields account for 10% or less, but the spatial and temporal differences can be substantially higher for individual countries. Among the top-10 crop producers, the aggregated national multi-annual mean relative difference of yields can be up to 67% (maize, South Africa), 43% (wheat, Pakistan), 51% (rice, Japan), and 427% (soybean, Bolivia). Correlations of differently aggregated yield time series can be as low as r = 0.56 (maize, India), r = 0.05 (wheat, Russia), r = 0.13 (rice, Vietnam), and r = -0.01 (soybean, Uruguay). The aggregation to sub-national scale, in comparison to country scale, shows that spatial uncertainties can cancel out in countries with large harvested areas per crop type. We conclude that the aggregation uncertainty can be substantial for crop productivity and production estimates in the context of food security, impact assessment, and model evaluation exercises.
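
    The aggregation step itself is simple area weighting; the hedged sketch below uses synthetic numbers (not GGCMI data) to show how two alternative harvested-area maps turn one gridded yield series into two national series, compared by relative mean difference and temporal correlation as in the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    years, cells = 30, 100
    yield_grid = rng.normal(5.0, 1.0, size=(years, cells))   # t/ha per grid cell

    area_a = rng.uniform(0.0, 1.0, cells)                    # two alternative
    area_b = area_a * rng.uniform(0.5, 1.5, cells)           # harvested-area maps

    agg_a = yield_grid @ area_a / area_a.sum()   # area-weighted "national" yield
    agg_b = yield_grid @ area_b / area_b.sum()

    rel_diff = abs(agg_a.mean() - agg_b.mean()) / agg_a.mean()
    r = np.corrcoef(agg_a, agg_b)[0, 1]
    print(f"relative mean difference: {rel_diff:.1%}, temporal r = {r:.2f}")
    ```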

  4. Analysis of algal bloom risk with uncertainties in lakes by integrating self-organizing map and fuzzy information theory.

    PubMed

    Chen, Qiuwen; Rui, Han; Li, Weifeng; Zhang, Yanhui

    2014-06-01

    Algal blooms are a serious problem in waters, as they damage aquatic ecosystems and threaten drinking water safety. However, the outbreak mechanism of algal blooms is very complex and subject to great uncertainty, especially in large water bodies where environmental conditions vary markedly in both space and time. This study developed an innovative method that integrates a self-organizing map (SOM) and fuzzy information diffusion theory to comprehensively analyze algal bloom risks and their uncertainties. Lake Taihu was taken as the study case, using long-term (2004-2010) on-site monitoring data. The results showed that algal blooms in Lake Taihu were classified into four categories and exhibited clear spatial-temporal patterns. The lake was mainly characterized by moderate blooms with high uncertainty, whereas severe blooms with low uncertainty were observed in the northwestern part of the lake. The study gives insight into the spatial-temporal dynamics of algal blooms and should help governments and decision-makers outline policies and practices for bloom monitoring and prevention. The developed method provides a promising approach to estimating algal bloom risks under uncertainty. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. The application of Global Sensitivity Analysis to quantify the dominant input factors for hydraulic model simulations

    NASA Astrophysics Data System (ADS)

    Savage, James; Pianosi, Francesca; Bates, Paul; Freer, Jim; Wagener, Thorsten

    2015-04-01

    Predicting flood inundation extents using hydraulic models is subject to a number of critical uncertainties. For a specific event, these uncertainties are known to have a large influence on model outputs and on any subsequent analyses made by risk managers. Hydraulic modellers often approach such problems by applying uncertainty analysis techniques such as the Generalised Likelihood Uncertainty Estimation (GLUE) methodology. However, these methods do not allow one to attribute which source of uncertainty has the most influence on the various model outputs that inform flood risk decision making. Another issue facing modellers is the amount of computational resource available for modelling flood inundations that are 'fit for purpose' for the modelling objectives. A balance therefore needs to be struck between computation time, realism and spatial resolution, and effectively characterising the uncertainty spread of predictions (for example from boundary conditions and model parameterisations). However, it is not fully understood how much impact each factor has on model performance, for example how much influence changing the spatial resolution of a model has on inundation predictions in comparison to the other uncertainties inherent in the modelling process. Furthermore, when resampling fine-scale topographic data in the form of a Digital Elevation Model (DEM) to coarser resolutions, there are a number of possible coarser DEMs that can be produced, and the choice of DEM used to represent the surface elevations in the model could also influence model performance. In this study we model a flood event using the hydraulic model LISFLOOD-FP and apply Sobol' sensitivity analysis to estimate which input factor, among the uncertainty in model boundary conditions, uncertain model parameters, the spatial resolution of the DEM and the choice of resampled DEM, has the most influence on a range of model outputs. These outputs include whole-domain maximum inundation indicators and flood wave travel time, in addition to temporally and spatially variable indicators. This enables us to assess whether the sensitivity of the model to the various input factors is stationary in both time and space. Furthermore, competing models are assessed against observations of water depths from a historical flood event, allowing us to determine which of the input factors has the most influence on model performance. Initial findings suggest the sensitivity of the model to different input factors varies depending on the type of model output assessed and on the stage of the flood hydrograph at which the output is assessed. We have also found that initial decisions regarding the characterisation of the input factors, for example defining the upper and lower bounds of the parameter sample space, can significantly influence the implied sensitivities.
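
    For readers unfamiliar with the machinery, the following sketch applies the Saltelli-style Monte Carlo estimator of first-order Sobol' indices to a cheap stand-in function (the Ishigami function); running an actual hydraulic model such as LISFLOOD-FP in this loop is far more expensive but structurally identical.

    ```python
    import numpy as np

    def model(x):          # Ishigami function as a cheap stand-in for the model
        return (np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1])**2
                + 0.1 * x[:, 2]**4 * np.sin(x[:, 0]))

    rng = np.random.default_rng(1)
    n, d = 100_000, 3
    A = rng.uniform(-np.pi, np.pi, (n, d))     # two independent sample matrices
    B = rng.uniform(-np.pi, np.pi, (n, d))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))

    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                    # swap in column i from B
        S1 = np.mean(fB * (model(ABi) - fA)) / var   # Saltelli (2010) estimator
        print(f"S{i + 1} = {S1:.2f}")          # expect ~0.31, ~0.44, ~0.00
    ```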

  6. Soil pH Errors Propagation from Measurements to Spatial Predictions - Cost Benefit Analysis and Risk Assessment Implications for Practitioners and Modelers

    NASA Astrophysics Data System (ADS)

    Owens, P. R.; Libohova, Z.; Seybold, C. A.; Wills, S. A.; Peaslee, S.; Beaudette, D.; Lindbo, D. L.

    2017-12-01

    The measurement errors and spatial prediction uncertainties of soil properties are usually assessed by the modeling community against measured values, when available. Of equal importance, however, is the assessment of the impacts of errors and uncertainty on cost-benefit analysis and risk assessments. Soil pH was selected as one of the most commonly measured soil properties used for liming recommendations. The objective of this study was to assess the size of errors from different sources and their implications for management decisions. Error sources include measurement methods, laboratory sources, pedotransfer functions, database transactions, spatial aggregations, etc. Several databases of measured and predicted soil pH were used for this study, including the United States National Cooperative Soil Survey Characterization Database (NCSS-SCDB) and the US Soil Survey Geographic (SSURGO) Database. The distribution of errors among the different sources, from measurement methods to spatial aggregation, showed a wide range of values. The greatest RMSE, 0.79 pH units, was from spatial aggregation (SSURGO vs kriging), while the measurement methods had the lowest RMSE of 0.06 pH units. Assuming the order of data acquisition follows the transaction distance, i.e., from measurement method to spatial aggregation, the RMSE increased from 0.06 to 0.8 pH units, suggesting an "error propagation". This has major implications for practitioners and the modeling community. Most soil liming rate recommendations are based on 0.1 pH unit increments, while the desired soil pH level increments are based on 0.4 to 0.5 pH units. Thus, even when the measured and desired target soil pH are the same, most guidelines recommend 1 ton ha-1 of lime, which translates into 111 ha-1 that the farmer has to factor into the cost-benefit analysis. This analysis, however, needs to be based on uncertainty predictions (0.5-1.0 pH units) rather than measurement errors (0.1 pH units), which would translate into an investment of 555-1,111 that needs to be assessed against the risk. The modeling community can benefit from such analyses; however, the size and spatial distribution of errors for global and regional predictions need to be assessed against the variability of other drivers and their impact on management decisions.
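
    If the stage errors were independent, the stage RMSEs would combine in quadrature; the snippet below illustrates this with the two endpoint values from the abstract (0.06 and 0.79 pH units) and placeholder values for the intermediate stages, which are assumptions for illustration only.

    ```python
    import numpy as np

    rmse = {"measurement method":  0.06,   # pH units, from the study
            "laboratory":          0.15,   # placeholder
            "pedotransfer":        0.30,   # placeholder
            "spatial aggregation": 0.79}   # pH units, from the study

    combined = np.sqrt(sum(v**2 for v in rmse.values()))
    print(f"combined RMSE: {combined:.2f} pH units")   # ~0.86 here
    ```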

  7. Sampling in freshwater environments: suspended particle traps and variability in the final data.

    PubMed

    Barbizzi, Sabrina; Pati, Alessandra

    2008-11-01

    This paper reports a practical method to estimate measurement uncertainty including sampling, derived from the approach implemented by Ramsey for soil investigations. The methodology was applied to estimate the measurement uncertainty (sampling and analysis) of (137)Cs activity concentration (Bq kg(-1)) and total carbon content (%) in suspended particle sampling in a freshwater ecosystem. Uncertainty estimates for the between-location, sampling, and analysis components were evaluated. For the considered measurands, the relative expanded measurement uncertainties are 12.3% for (137)Cs and 4.5% for total carbon. For (137)Cs, the measurement (sampling + analysis) variance gives the major contribution to the total variance, while for total carbon the spatial variance is the dominant contributor. The limitations and advantages of this basic method are discussed.
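
    A conceptual sketch of the duplicate-design calculation behind this kind of approach is given below (made-up numbers, and a simplification of Ramsey's robust-ANOVA treatment): analytical variance is estimated from analysis duplicates, sampling variance from sample duplicates after subtracting the analytical contribution, and the expanded relative uncertainty uses a coverage factor of 2.

    ```python
    import numpy as np

    # paired analyses of the same sample -> analytical variance
    a1 = np.array([10.2, 11.5,  9.8, 10.9])
    a2 = np.array([10.0, 11.9,  9.5, 11.2])
    s2_anal = np.mean((a1 - a2)**2) / 2.0

    # paired samples at the same location (each value the mean of its analyses)
    s1 = np.array([10.1, 11.7,  9.6, 11.0])
    s2 = np.array([ 9.0, 12.8, 10.4, 10.1])
    s2_samp = max(np.mean((s1 - s2)**2) / 2.0 - s2_anal / 2.0, 0.0)

    s_meas = np.sqrt(s2_samp + s2_anal)        # measurement = sampling + analysis
    u_rel = 2.0 * s_meas / np.concatenate([s1, s2]).mean() * 100.0
    print(f"expanded relative measurement uncertainty: {u_rel:.1f}%")
    ```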

  8. The Variable Grid Method, an Approach for the Simultaneous Visualization and Assessment of Spatial Trends and Uncertainty

    NASA Astrophysics Data System (ADS)

    Rose, K.; Glosser, D.; Bauer, J. R.; Barkhurst, A.

    2015-12-01

    The products of spatial analyses that leverage the interpolation of sparse point data to represent continuous phenomena are often presented without clear explanations of the uncertainty associated with the interpolated values. As a result, there is frequently insufficient information provided to effectively support advanced computational analyses and individual research and policy decisions utilizing these results. This highlights the need for a reliable approach capable of quantitatively producing and communicating spatial data analyses and their inherent uncertainties for a broad range of uses. To address this need, we have developed the Variable Grid Method (VGM), and an associated Python tool, which is a flexible approach that can be applied to a variety of analyses and use-case scenarios where users need a method to effectively study, evaluate, and analyze spatial trends and patterns while communicating the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization representative of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, uncertainty calculated from multiple simulations, etc. We will present examples of our research utilizing the VGM to quantify key spatial trends and patterns for subsurface data interpolations and their uncertainties, and to leverage these results to evaluate storage estimates and potential impacts associated with underground injection for CO2 storage and unconventional resource production and development. These examples identify how the VGM can provide the critical information about the relationship between uncertainty and spatial data that is necessary to better support their use in advanced computational analyses and to inform research, management and policy decisions.

  9. Spatial interpolation of forest conditions using co-conditional geostatistical simulation

    Treesearch

    H. Todd Mowrer

    2000-01-01

    In recent work the author used the geostatistical Monte Carlo technique of sequential Gaussian simulation (s.G.s.) to investigate uncertainty in a GIS analysis of potential old-growth forest areas. The current study compares this earlier technique to that of co-conditional simulation, wherein the spatial cross-correlations between variables are included. As in the...

  10. Spatial Uncertainty Modeling of Fuzzy Information in Images for Pattern Classification

    PubMed Central

    Pham, Tuan D.

    2014-01-01

    The modeling of the spatial distribution of image properties is important for many pattern recognition problems in science and engineering. Mathematical methods are needed to quantify the variability of this spatial distribution based on which a decision of classification can be made in an optimal sense. However, image properties are often subject to uncertainty due to both incomplete and imprecise information. This paper presents an integrated approach for estimating the spatial uncertainty of vagueness in images using the theory of geostatistics and the calculus of probability measures of fuzzy events. Such a model for the quantification of spatial uncertainty is utilized as a new image feature extraction method, based on which classifiers can be trained to perform the task of pattern recognition. Applications of the proposed algorithm to the classification of various types of image data suggest the usefulness of the proposed uncertainty modeling technique for texture feature extraction. PMID:25157744

  11. Chaotic Brillouin optical correlation-domain analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Jianzhong; Zhang, Mingtao; Zhang, Mingjiang; Liu, Yi; Feng, Changkun; Wang, Yahui; Wang, Yuncai

    2018-04-01

    We propose and experimentally demonstrate a chaotic Brillouin optical correlation-domain analysis (BOCDA) system for distributed fiber sensing. The utilization of a chaotic laser in a low-coherence state ensures high spatial resolution. The experimental results demonstrate a 3.92-cm spatial resolution over a 906-m measurement range. The uncertainty in the measurement of the local Brillouin frequency shift is 1.2 MHz. The measurement signal-to-noise ratio is given, which is in agreement with the theoretical value.

  12. Full uncertainty quantification of N2O and NO emissions using the biogeochemical model LandscapeDNDC on site and regional scale

    NASA Astrophysics Data System (ADS)

    Haas, Edwin; Santabarbara, Ignacio; Kiese, Ralf; Butterbach-Bahl, Klaus

    2017-04-01

    Numerical simulation models are increasingly used to estimate greenhouse gas emissions at site to regional/national scales and are outlined as the most advanced methodology (Tier 3) in the framework of UNFCCC reporting. Process-based models incorporate the major processes of the carbon and nitrogen cycles of terrestrial ecosystems and are thus thought to be widely applicable across various conditions and spatial scales. Process-based modelling requires high-spatial-resolution input data on soil properties, climate drivers and management information. The acceptance of model-based inventory calculations depends on the assessment of the inventory's uncertainty (model-, input data- and parameter-induced uncertainties). In this study we fully quantify the uncertainty in modelling soil N2O and NO emissions from arable, grassland and forest soils using the biogeochemical model LandscapeDNDC. We address model-induced uncertainty (MU) by contrasting two different soil biogeochemistry modules within LandscapeDNDC. Parameter-induced uncertainty (PU) was assessed by using joint parameter distributions for key parameters describing microbial C and N turnover processes, as obtained by different Bayesian calibration studies for each model configuration. Input-data-induced uncertainty (DU) was addressed by Bayesian calibration of soil properties, climate drivers and agricultural management practice data. For MU, DU and PU we performed several hundred simulations each to contribute to the individual uncertainty assessments. For the overall uncertainty quantification we assessed the model prediction probability, followed by sampled sets of input datasets and parameter distributions. Statistical analysis of the simulation results was used to quantify the overall full uncertainty of the modelling approach. With this study we can contrast the variation in model results with the different sources of uncertainty for each ecosystem, and we were able to perform a full uncertainty analysis for modelling N2O and NO emissions from arable, grassland and forest soils, which is necessary for the comprehensibility of modelling results. We applied the methodology to a regional inventory to assess the overall modelling uncertainty of a regional N2O and NO emission inventory for the state of Saxony, Germany.

  13. Quantifying the impact of the longitudinal dispersion coefficient parameter uncertainty on the physical transport processes in rivers

    NASA Astrophysics Data System (ADS)

    Camacho Suarez, V. V.; Shucksmith, J.; Schellart, A.

    2016-12-01

    Analytical and numerical models can be used to represent the advection-dispersion processes governing the transport of pollutants in rivers (Fan et al., 2015; Van Genuchten et al., 2013). Simplifications, assumptions and parameter estimations in these models result in various uncertainties within the modelling process and in the estimated pollutant concentrations. In this study, we explore both 1) the structural uncertainty due to the one-dimensional simplification of the Advection Dispersion Equation (ADE) and 2) the parameter uncertainty due to the semi-empirical estimation of the longitudinal dispersion coefficient. The relative significance of these uncertainties has not previously been examined. By analysing both the relative structural uncertainty of analytical solutions of the ADE and the parameter uncertainty due to the longitudinal dispersion coefficient via Monte Carlo analysis, an evaluation of the dominant uncertainties is presented over a range of spatial scales for a case study in the Chillan River, Chile.
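
    The parameter-uncertainty part of such an analysis can be sketched compactly (illustrative values, not the Chillan data): the 1-D analytical ADE solution for an instantaneous release is evaluated under Monte Carlo samples of the longitudinal dispersion coefficient drawn from an assumed lognormal distribution.

    ```python
    import numpy as np

    def ade_1d(x, t, M=1.0, A=10.0, u=0.5, D=5.0):
        """1-D ADE solution for an instantaneous release of mass M (SI units):
        cross-section A, velocity u, longitudinal dispersion coefficient D."""
        return (M / A) / np.sqrt(4.0 * np.pi * D * t) * \
               np.exp(-(x - u * t)**2 / (4.0 * D * t))

    rng = np.random.default_rng(7)
    D_samples = rng.lognormal(mean=np.log(5.0), sigma=0.5, size=5000)  # assumed

    x, t = 2000.0, 3600.0                     # observation point (m) and time (s)
    c = ade_1d(x, t, D=D_samples)
    print(f"median {np.median(c):.2e}, 95% band "
          f"[{np.percentile(c, 2.5):.2e}, {np.percentile(c, 97.5):.2e}] kg m^-3")
    ```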

  14. Bayesian Analysis of the Cosmic Microwave Background

    NASA Technical Reports Server (NTRS)

    Jewell, Jeffrey

    2007-01-01

    There is a wealth of cosmological information encoded in the spatial power spectrum of temperature anisotropies of the cosmic microwave background. Experiments designed to map the microwave sky are returning a flood of data (time streams of instrument response as a beam is swept over the sky) at several different frequencies (from 30 to 900 GHz), all with different resolutions and noise properties. The resulting analysis challenge is to estimate, and quantify our uncertainty in, the spatial power spectrum of the cosmic microwave background given the complexities of "missing data", foreground emission, and complicated instrumental noise. A Bayesian formulation of this problem allows consistent treatment of many complexities, including complicated instrumental noise and foregrounds, and can be numerically implemented with Gibbs sampling. Gibbs sampling has now been validated as an efficient, statistically exact, and practically useful method for low-resolution analyses (as demonstrated on WMAP 1-year and 3-year temperature and polarization data). Development is continuing for Planck; the goal is to exploit the unique capabilities of Gibbs sampling to directly propagate uncertainties in both foreground and instrument models through to the total uncertainty in cosmological parameters.

  15. Scaling uncertainties in estimating canopy foliar maintenance respiration for black spruce ecosystems in Alaska

    USGS Publications Warehouse

    Zhang, X.; McGuire, A.D.; Ruess, Roger W.

    2006-01-01

    A major challenge confronting the scientific community is to understand both patterns of and controls over spatial and temporal variability of carbon exchange between boreal forest ecosystems and the atmosphere. An understanding of the sources of variability of carbon processes at fine scales and how these contribute to uncertainties in estimating carbon fluxes is relevant to representing these processes at coarse scales. To explore some of the challenges and uncertainties in estimating carbon fluxes at fine to coarse scales, we conducted a modeling analysis of canopy foliar maintenance respiration for black spruce ecosystems of Alaska by scaling empirical hourly models of foliar maintenance respiration (Rm) to estimate canopy foliar Rm for individual stands. We used variation in foliar N concentration among stands to develop hourly stand-specific models and then developed an hourly pooled model. An uncertainty analysis identified that the most important parameter affecting estimates of canopy foliar Rm was one that describes Rm at 0 °C per g N, which explained more than 55% of variance in annual estimates of canopy foliar Rm. The comparison of simulated annual canopy foliar Rm identified significant differences between stand-specific and pooled models for each stand. This result indicates that control over foliar N concentration should be considered in models that estimate canopy foliar Rm of black spruce stands across the landscape. In this study, we also temporally scaled the hourly stand-level models to estimate canopy foliar Rm of black spruce stands using mean monthly temperature data. Comparisons of monthly Rm between the hourly and monthly versions of the models indicated that there was very little difference between the estimates of hourly and monthly models, suggesting that hourly models can be aggregated to use monthly input data with little loss of precision. We conclude that uncertainties in the use of a coarse-scale model for estimating canopy foliar Rm at regional scales depend on uncertainties in representing needle-level respiration and on uncertainties in representing the spatial variability of canopy foliar N across a region. The development of spatial data sets of canopy foliar N represents a major challenge in estimating canopy foliar maintenance respiration at regional scales. © Springer 2006.
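
    The temporal-scaling test can be mimicked with a few lines (synthetic temperatures and an assumed Q10-type response, not the paper's fitted model): summing an hourly respiration model over a month is compared with driving the same model by the monthly mean temperature.

    ```python
    import numpy as np

    def rm(T, r0=0.5, q10=2.0):
        """Q10-type respiration: base rate r0 at 0 C doubles every 10 C."""
        return r0 * q10 ** (T / 10.0)

    rng = np.random.default_rng(3)
    hours = 30 * 24
    T_hourly = (10.0 + 5.0 * np.sin(np.arange(hours) * 2.0 * np.pi / 24.0)
                + rng.normal(0.0, 2.0, hours))     # synthetic hourly temperatures

    monthly_from_hourly = rm(T_hourly).sum()         # sum of the hourly model
    monthly_from_mean = rm(T_hourly.mean()) * hours  # monthly-mean-driven model

    bias = (monthly_from_mean - monthly_from_hourly) / monthly_from_hourly
    print(f"aggregation bias: {bias:.1%}")   # small when variability is modest
    ```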

  16. Transfer of uncertainty of space-borne high resolution rainfall products at ungauged regions

    NASA Astrophysics Data System (ADS)

    Tang, Ling

    Hydrologically relevant characteristics of high-resolution (~0.25 degree, 3-hourly) satellite rainfall uncertainty were derived as a function of season and location using a six-year (2002-2007) archive of the National Aeronautics and Space Administration (NASA) Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) precipitation data. The Next Generation Radar (NEXRAD) Stage IV rainfall data over the continental United States were used as ground validation (GV) data. A geostatistical mapping scheme was developed and tested for the transfer (i.e., spatial interpolation) of uncertainty information from GV regions to the vast non-GV regions by leveraging the error characterization carried out in the earlier step. The open question explored here was: "If 'error' is defined on the basis of independent ground validation (GV) data, how can error metrics be estimated for a satellite rainfall data product without the need for extensive GV data?" After a quantitative analysis of the spatial and temporal structure of the satellite rainfall uncertainty, a proof-of-concept geostatistical mapping scheme (based on the kriging method) was evaluated. The idea was to understand how realistic the idea of 'transfer' is for the GPM era. It was found to be technically possible to transfer error metrics from a gauged to an ungauged location for certain error metrics, and a regionalized error metric scheme for GPM may therefore be possible. The uncertainty transfer scheme based on a commonly used kriging method (ordinary kriging) was then assessed further at various timescales (climatologic, seasonal, monthly and weekly) and as a function of the density of GV coverage. The results indicated that when a transfer scheme for estimating uncertainty metrics was applied at scales finer than seasonal (ranging from 3-6 hourly to weekly-monthly), the effectiveness of the uncertainty transfer worsened significantly. Next, a comprehensive assessment of different kriging methods for the spatial transfer (interpolation) of error metrics was performed. Three kriging methods were compared: ordinary kriging (OK), indicator kriging (IK) and disjunctive kriging (DK). An additional comparison with the simple inverse distance weighting (IDW) method was also performed to quantify the added benefit (if any) of using geostatistical methods. The overall performance ranking of the methods was found to be OK = DK > IDW > IK. Lastly, various metrics of satellite rainfall uncertainty were identified for two large continental landmasses that share many similar Köppen climate zones, the United States and Australia. The dependence of uncertainty on gauge density was then investigated, revealing that only the first- and second-order moments of error are readily amenable to a Köppen-type climate classification across continental landmasses.

  17. Using geostatistics to evaluate cleanup goals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcon, M.F.; Hopkins, L.P.

    1995-12-01

    Geostatistical analysis is a powerful predictive tool typically used to define spatial variability in environmental data. The information from a geostatistical analysis using kriging, a geostatistical tool, can be taken a step further to optimize sampling location and frequency and to help quantify sampling uncertainty in both the remedial investigation and the remedial design at a hazardous waste site. Geostatistics were used to quantify sampling uncertainty in the attainment of a risk-based cleanup goal and to determine the optimal sampling frequency necessary to delineate the horizontal extent of impacted soils at a Gulf Coast waste site.

  18. Development and Implementation of a Formal Framework for Bottom-up Uncertainty Analysis of Input Emissions: Case Study of Residential Wood Combustion

    NASA Astrophysics Data System (ADS)

    Zhao, S.; Mashayekhi, R.; Saeednooran, S.; Hakami, A.; Ménard, R.; Moran, M. D.; Zhang, J.

    2016-12-01

    We have developed a formal framework for documentation, quantification, and propagation of uncertainties in upstream emissions inventory data at various stages leading to the generation of model-ready gridded emissions through emissions processing software such as the EPA's SMOKE (Sparse Matrix Operator Kernel Emissions) system. To illustrate this framework we present a proof-of-concept case study of a bottom-up quantitative assessment of uncertainties in emissions from residential wood combustion (RWC) in the U.S. and Canada. Uncertainties associated with key inventory parameters are characterized based on existing information sources, including the American Housing Survey (AHS) from the U.S. Census Bureau, Timber Products Output (TPO) surveys from the U.S. Forest Service, TNS Canadian Facts surveys, and the AP-42 emission factor document from the U.S. EPA. The propagation of uncertainties is based on Monte Carlo simulation code external to SMOKE. Latin Hypercube Sampling (LHS) is implemented to generate a set of random realizations of each RWC inventory parameter, for which the uncertainties are assumed to be normally distributed. Random realizations are also obtained for each RWC temporal and chemical speciation profile and spatial surrogate field external to SMOKE using the LHS approach. SMOKE outputs for primary emissions (e.g., CO, VOC) using both RWC emission inventory realizations and perturbed temporal and chemical profiles and spatial surrogates show relative uncertainties of about 30-50% across the U.S. and about 70-100% across Canada. Positive skewness values (up to 2.7) and variable kurtosis values (up to 4.8) were also found. Spatial allocation contributes significantly to the overall uncertainty, particularly in Canada. By applying this framework we are able to produce random realizations of model-ready gridded emissions that along with available meteorological ensembles can be used to propagate uncertainties through chemical transport models. The approach described here provides an effective means for formal quantification of uncertainties in estimated emissions from various source sectors and for continuous documentation, assessment, and reduction of emission uncertainties.
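
    The LHS step is illustrated below with scipy's quasi-Monte Carlo sampler; the parameter names, means, and standard deviations are placeholders, not the inventory's actual values, and the normality assumption follows the abstract.

    ```python
    import numpy as np
    from scipy.stats import norm, qmc

    params = {"wood_burned_t_per_hh": (1.2, 0.3),   # (mean, sd) -- placeholders
              "ef_co_g_per_kg":       (80.0, 20.0),
              "fraction_homes_rwc":   (0.15, 0.04)}

    sampler = qmc.LatinHypercube(d=len(params), seed=42)
    u = sampler.random(n=1000)                  # stratified uniforms in [0, 1)

    realizations = np.column_stack([
        norm.ppf(u[:, j], loc=m, scale=s)       # invert each assumed normal CDF
        for j, (m, s) in enumerate(params.values())])

    print(realizations.mean(axis=0))            # close to the assumed means
    ```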

  19. Assessment and visualization of uncertainty for countrywide soil organic matter map of Hungary using local entropy

    NASA Astrophysics Data System (ADS)

    Szatmári, Gábor; Pásztor, László

    2016-04-01

    Uncertainty is a general term expressing our imperfect knowledge in describing an environmental process, of which we are aware (Bárdossy and Fodor, 2004). Sampling, laboratory measurements, models and so on are all subject to uncertainty. Effective quantification and visualization of uncertainty is indispensable to stakeholders (e.g., policy makers, society). Soil-related features and their spatial models should be a particular target of uncertainty assessment because their inferences are further used in modelling and in decision-making processes. The aim of our present study was to assess and effectively visualize the local uncertainty of the countrywide soil organic matter (SOM) spatial distribution model of Hungary using geostatistical tools and concepts. The Hungarian Soil Information and Monitoring System's SOM data (approximately 1,200 observations) and environmentally related, spatially exhaustive secondary information (i.e., digital elevation model, climatic maps, MODIS satellite images and a geological map) were used to model the countrywide SOM spatial distribution by regression kriging. It would be common to use the calculated estimation (or kriging) variance as a measure of uncertainty; however, the normality and homoscedasticity hypotheses had to be rejected according to our preliminary analysis of the data. Therefore, a normal score transformation and a sequential stochastic simulation approach were introduced to model and assess the local uncertainty. Five hundred equally probable realizations (i.e., stochastic images) were generated. This number of stochastic images is large enough to provide a model of uncertainty at each location, which is a complete description of uncertainty in geostatistics (Deutsch and Journel, 1998). Furthermore, these models can be applied, for example, to contour the probability of any event, which can be regarded as a goal-oriented digital soil map and is of interest for agricultural management and decision making as well. A standardized measure of the local entropy was used to visualize uncertainty, where entropy values close to 1 correspond to high uncertainty, whilst values close to 0 correspond to low uncertainty. The advantage of using local entropy in this context is that it combines probabilities from multiple members into a single number for each location of the model. In conclusion, it is straightforward to use a sequential stochastic simulation approach for the assessment of uncertainty when normality and homoscedasticity are violated. The visualization of uncertainty using the local entropy is effective and communicative to stakeholders because it represents uncertainty as a single number on a [0, 1] scale. References: Bárdossy, Gy. & Fodor, J., 2004. Evaluation of Uncertainties and Risks in Geology. Springer-Verlag, Berlin Heidelberg. Deutsch, C.V. & Journel, A.G., 1998. GSLIB: geostatistical software library and user's guide. Oxford University Press, New York. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).
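
    The standardized local entropy itself is a small calculation once the realizations exist; the sketch below (synthetic class rasters standing in for the 500 sequential simulations) estimates per-cell class probabilities from the realization stack and normalizes the entropy by log(K) so the result lies on a [0, 1] scale.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_real, rows, cols, K = 500, 50, 50, 4     # K = number of mapped classes
    stack = rng.integers(0, K, size=(n_real, rows, cols))   # stand-in realizations

    # per-cell probability of each class across the realization stack
    p = np.stack([(stack == k).mean(axis=0) for k in range(K)])

    with np.errstate(divide="ignore"):
        logp = np.where(p > 0, np.log(p), 0.0)  # define 0*log(0) = 0
    h = -(p * logp).sum(axis=0) / np.log(K)     # standardized entropy in [0, 1]

    print(f"local entropy: min {h.min():.2f}, max {h.max():.2f}")  # 0 certain, 1 uncertain
    ```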

  20. Ambient Ozone Exposure in Czech Forests: A GIS-Based Approach to Spatial Distribution Assessment

    PubMed Central

    Hůnová, I.; Horálek, J.; Schreiberová, M.; Zapletal, M.

    2012-01-01

    Ambient ozone (O3) is an important phytotoxic pollutant, and detailed knowledge of its spatial distribution is becoming increasingly important. The aim of the paper is to compare different spatial interpolation techniques and to recommend the best approach for producing a reliable map of O3 with respect to its phytotoxic potential. For the evaluation we used real-time ambient O3 concentrations measured by UV absorbance at 24 Czech rural sites in the 2007 and 2008 vegetation seasons. We considered eleven approaches to spatial interpolation for the development of maps of mean vegetation-season O3 concentrations and of the AOT40F exposure index for forests. The uncertainty of the maps was assessed by cross-validation analysis, with the root mean square error (RMSE) of the map used as the criterion. Our results indicate that the optimal interpolation approach is linear regression of O3 data on altitude with subsequent interpolation of its residuals by ordinary kriging. Using the optimal method, the relative uncertainty of the map of mean vegetation-season O3 is less than 10% in both examined years, which is a very acceptable value. In the case of AOT40F, however, the relative uncertainty of the map is notably worse, reaching nearly 20% in both years. PMID:22566757
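
    For reference, the AOT40 index is the accumulated hourly exceedance over 40 ppb during daylight hours of the growing season; the sketch below computes it from synthetic hourly data, with daylight simplified to a fixed clock window (an assumption for illustration).

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    hours = 184 * 24                               # April-September vegetation season
    o3_ppb = np.clip(rng.normal(45.0, 15.0, hours), 0.0, None)   # synthetic hourly O3

    hour_of_day = np.arange(hours) % 24
    daylight = (hour_of_day >= 8) & (hour_of_day < 20)   # simplified daylight window

    aot40 = np.sum(np.maximum(o3_ppb[daylight] - 40.0, 0.0))   # units: ppb h
    print(f"AOT40: {aot40 / 1000:.1f} ppm h")
    ```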

  1. Uncertainty of future projections of species distributions in mountainous regions.

    PubMed

    Tang, Ying; Winkler, Julie A; Viña, Andrés; Liu, Jianguo; Zhang, Yuanbin; Zhang, Xiaofeng; Li, Xiaohong; Wang, Fang; Zhang, Jindong; Zhao, Zhiqiang

    2018-01-01

    Multiple factors introduce uncertainty into projections of species distributions under climate change. The uncertainty introduced by the choice of baseline climate information used to calibrate a species distribution model and to downscale global climate model (GCM) simulations to a finer spatial resolution is a particular concern for mountainous regions, as the spatial resolution of climate observing networks is often insufficient to detect the steep climatic gradients in these areas. Using the maximum entropy (MaxEnt) modeling framework together with occurrence data on 21 understory bamboo species distributed across the mountainous geographic range of the Giant Panda, we examined the differences in projected species distributions obtained from two contrasting sources of baseline climate information, one derived from spatial interpolation of coarse-scale station observations and the other derived from fine-spatial resolution satellite measurements. For each bamboo species, the MaxEnt model was calibrated separately for the two datasets and applied to 17 GCM simulations downscaled using the delta method. Greater differences in the projected spatial distributions of the bamboo species were observed for the models calibrated using the different baseline datasets than between the different downscaled GCM simulations for the same calibration. In terms of the projected future climatically-suitable area by species, quantification using a multi-factor analysis of variance suggested that the sum of the variance explained by the baseline climate dataset used for model calibration and the interaction between the baseline climate data and the GCM simulation via downscaling accounted for, on average, 40% of the total variation among the future projections. Our analyses illustrate that the combined use of gridded datasets developed from station observations and satellite measurements can help estimate the uncertainty introduced by the choice of baseline climate information to the projected changes in species distribution.

  2. Effect of pesticide fate parameters and their uncertainty on the selection of 'worst-case' scenarios of pesticide leaching to groundwater.

    PubMed

    Vanderborght, Jan; Tiktak, Aaldrik; Boesten, Jos J T I; Vereecken, Harry

    2011-03-01

    For the registration of pesticides in the European Union, model simulations for worst-case scenarios are used to demonstrate that leaching concentrations to groundwater do not exceed a critical threshold. A worst-case scenario is a combination of soil and climate properties for which predicted leaching concentrations are higher than a certain percentile of the spatial concentration distribution within a region. The derivation of scenarios is complicated by uncertainty about soil and pesticide fate parameters. As the ranking of climate and soil property combinations according to predicted leaching concentrations is different for different pesticides, the worst-case scenario for one pesticide may misrepresent the worst case for another pesticide, which leads to 'scenario uncertainty'. Pesticide fate parameter uncertainty led to higher concentrations in the higher percentiles of spatial concentration distributions, especially for distributions in smaller and more homogeneous regions. The effect of pesticide fate parameter uncertainty on the spatial concentration distribution was small when compared with the uncertainty of local concentration predictions and with the scenario uncertainty. Uncertainty in pesticide fate parameters and scenario uncertainty can be accounted for using higher percentiles of spatial concentration distributions and considering a range of pesticides for the scenario selection. Copyright © 2010 Society of Chemical Industry.

  3. Trajectory analysis of land use and land cover maps to improve spatial-temporal patterns, and impact assessment on groundwater recharge

    NASA Astrophysics Data System (ADS)

    Zomlot, Z.; Verbeiren, B.; Huysmans, M.; Batelaan, O.

    2017-11-01

    Land use/land cover (LULC) change is a consequence of human-induced global environmental change and is considered one of the major factors affecting groundwater recharge. Uncertainties and inconsistencies in LULC maps are among the difficulties that LULC time series analysis faces, and they have a significant effect on hydrological impact analysis. An accuracy assessment approach for LULC time series is therefore needed for more reliable hydrological analysis and prediction. The objective of this paper is to assess the impact of land use uncertainty and to improve the accuracy of a time series of CORINE (coordination of information on the environment) land cover maps by using a new approach that identifies spatial-temporal LULC change trajectories as a pre-processing tool. This ensures consistency of model input when dealing with land-use dynamics and as such improves the accuracy of land use maps and, consequently, of groundwater recharge estimation. As a case study, the impact of consistent land use changes from 1990 until 2013 on groundwater recharge for the Flanders-Brussels region is assessed. The change trajectory analysis successfully assigned a rational trajectory to 99% of all pixels. The methodology is shown to be powerful in correcting interpretation inconsistencies and overestimation errors in CORINE land cover maps. The overall kappa (cell-by-cell map comparison) improved from 0.6 to 0.8 and from 0.2 to 0.7 for the forest and pasture land use classes, respectively. The study shows that the inconsistencies in the land use maps introduce an uncertainty of 10-30% in groundwater recharge estimation. The analysis showed that during the period 1990-2013 the LULC changes were mainly driven by urban expansion. The results show that the resolution at which the spatial analysis is performed matters: the recharge differences between the original and corrected CORINE land cover maps increase considerably with increasing spatial resolution. This study indicates that improving the consistency of land use map time series is of critical importance for assessing land use change and its environmental impact.
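
    The cell-by-cell kappa used in such comparisons is straightforward to compute; the sketch below (synthetic class rasters standing in for the CORINE layers) builds the confusion matrix and applies the usual chance-corrected agreement formula.

    ```python
    import numpy as np

    def cohen_kappa(map_a, map_b, n_classes):
        """Cell-by-cell agreement between two class rasters, corrected for chance."""
        cm = np.zeros((n_classes, n_classes))
        for a, b in zip(map_a.ravel(), map_b.ravel()):
            cm[a, b] += 1
        cm /= cm.sum()
        po = np.trace(cm)                          # observed agreement
        pe = cm.sum(axis=1) @ cm.sum(axis=0)       # agreement expected by chance
        return (po - pe) / (1.0 - pe)

    rng = np.random.default_rng(2)
    a = rng.integers(0, 5, (100, 100))             # "reference" land cover classes
    b = np.where(rng.random((100, 100)) < 0.8,     # second map: ~80% raw agreement
                 a, rng.integers(0, 5, (100, 100)))
    print(f"kappa = {cohen_kappa(a, b, 5):.2f}")
    ```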

  4. Uncertainty modelling and analysis of volume calculations based on a regular grid digital elevation model (DEM)

    NASA Astrophysics Data System (ADS)

    Li, Chang; Wang, Qing; Shi, Wenzhong; Zhao, Sisi

    2018-05-01

    The accuracy of earthwork calculations that compute terrain volume is critical to digital terrain analysis (DTA). The uncertainties in volume calculations (VCs) based on a DEM are primarily related to three factors: 1) model error (ME), caused by the algorithm adopted for the VC model; 2) discretization error (DE), usually caused by the DEM resolution and terrain complexity; and 3) propagation error (PE), caused by errors in the input variables. Based on these factors, the uncertainty modelling and analysis of VCs based on a regular grid DEM are investigated in this paper. In particular, we propose quantifying the uncertainty of VCs by a confidence interval based on the truncation error (TE). In the experiments, the trapezoidal double rule (TDR) and Simpson's double rule (SDR) were used to calculate volume, where the TE is the major ME, and six simulated regular grid DEMs with different terrain complexity and resolution (i.e., DE) were generated from a Gauss synthetic surface, making it easy to obtain the theoretical true value and to eliminate the interference of data errors. For PE, Monte Carlo simulation techniques and spatial autocorrelation were used to represent DEM uncertainty. This study can enrich the theories of uncertainty modelling and analysis in geographic information science.
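
    A minimal version of the TDR on a regular grid is shown below, evaluated on a Gauss synthetic surface in the spirit of the paper's simulated DEMs; the weighting (corners 1, edges 2, interior 4, scaled by dx·dy/4) is the standard composite two-dimensional trapezoidal rule.

    ```python
    import numpy as np
    from scipy.special import erf

    def volume_tdr(z, dx, dy):
        """Trapezoidal double rule: corners weighted 1, edges 2, interior 4."""
        w = np.ones_like(z)
        w[1:-1, :] *= 2
        w[:, 1:-1] *= 2
        return dx * dy / 4.0 * np.sum(w * z)

    x = np.linspace(-3.0, 3.0, 201)
    y = np.linspace(-3.0, 3.0, 201)
    X, Y = np.meshgrid(x, y)
    Z = np.exp(-(X**2 + Y**2))                  # Gauss synthetic surface

    v = volume_tdr(Z, dx=x[1] - x[0], dy=y[1] - y[0])
    exact = np.pi * erf(3.0)**2                 # analytical volume over the square
    print(f"TDR volume: {v:.5f}  (analytical: {exact:.5f})")
    ```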

  5. Analysis of Sensitivity and Uncertainty in an Individual-Based Model of a Threatened Wildlife Species

    EPA Science Inventory

    We present a multi-faceted sensitivity analysis of a spatially explicit, individual-based model (IBM) (HexSim) of a threatened species, the Northern Spotted Owl (Strix occidentalis caurina) on a national forest in Washington, USA. Few sensitivity analyses have been conducted on ...

  6. Quantification of uncertainties in global grazing systems assessment

    NASA Astrophysics Data System (ADS)

    Fetzel, T.; Havlik, P.; Herrero, M.; Kaplan, J. O.; Kastner, T.; Kroisleitner, C.; Rolinski, S.; Searchinger, T.; Van Bodegom, P. M.; Wirsenius, S.; Erb, K.-H.

    2017-07-01

    Livestock systems play a key role in global sustainability challenges like food security and climate change, yet many unknowns and large uncertainties prevail. We present a systematic, spatially explicit assessment of uncertainties related to grazing intensity (GI), a key metric for assessing ecological impacts of grazing, by combining existing data sets on (a) grazing feed intake, (b) the spatial distribution of livestock, (c) the extent of grazing land, and (d) its net primary productivity (NPP). An analysis of the resulting 96 maps implies that on average 15% of the grazing land NPP is consumed by livestock. GI is low in most of the world's grazing lands, but hotspots of very high GI prevail in 1% of the total grazing area. The agreement between GI maps is good on one fifth of the world's grazing area, while on the remainder, it is low to very low. Largest uncertainties are found in global drylands and where grazing land bears trees (e.g., the Amazon basin or the Taiga belt). In some regions like India or Western Europe, massive uncertainties even result in GI > 100% estimates. Our sensitivity analysis indicates that the input data for NPP, animal distribution, and grazing area contribute about equally to the total variability in GI maps, while grazing feed intake is a less critical variable. We argue that a general improvement in quality of the available global level data sets is a precondition for improving the understanding of the role of livestock systems in the context of global environmental change or food security.

  7. Generalized Likelihood Uncertainty Estimation (GLUE) Using Multi-Optimization Algorithm as Sampling Method

    NASA Astrophysics Data System (ADS)

    Wang, Z.

    2015-12-01

    For decades, distributed and lumped hydrological models have furthered our understanding of hydrological systems. The development of large-scale, high-precision hydrological simulation has refined the spatial description of hydrological behaviour. This trend, however, is accompanied by increasing model complexity and numbers of parameters, which bring new challenges for uncertainty quantification. Generalized Likelihood Uncertainty Estimation (GLUE), a Monte Carlo method coupled with Bayesian estimation, has been widely used in uncertainty analysis for hydrological models. However, the stochastic sampling of prior parameters adopted by GLUE is inefficient, especially in high-dimensional parameter spaces, whereas heuristic optimization algorithms based on iterative evolution show better convergence speed and optimality-searching performance. In light of these features, this study adopted a genetic algorithm, differential evolution, and the shuffled complex evolution algorithm to search the parameter space and obtain parameter sets with large likelihoods. Based on this multi-algorithm sampling, hydrological model uncertainty analysis is conducted within the typical GLUE framework. To demonstrate the superiority of the new method, two hydrological models of different complexity are examined. The results show that the adaptive method tends to be efficient in sampling and effective in uncertainty analysis, providing an alternative path for uncertainty quantification.
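
    The GLUE loop itself is compact; in the hedged sketch below a toy linear-reservoir model stands in for a real hydrological model, prior parameters are sampled uniformly (plain random sampling rather than the paper's evolutionary samplers), and a Nash-Sutcliffe threshold defines the behavioural sets.

    ```python
    import numpy as np

    def simulate(k, rain):
        """Toy linear reservoir: outflow is a fraction k of current storage."""
        s, q = 0.0, np.empty(rain.size)
        for t, r in enumerate(rain):
            s += r
            q[t] = k * s
            s -= q[t]
        return q

    rng = np.random.default_rng(8)
    rain = rng.exponential(2.0, 200)
    obs = simulate(0.3, rain) + rng.normal(0.0, 0.3, 200)   # synthetic "observations"

    ks = rng.uniform(0.01, 0.99, 5000)                      # prior sampling
    nse = np.array([1.0 - np.sum((simulate(k, rain) - obs)**2)
                          / np.sum((obs - obs.mean())**2) for k in ks])

    behavioural = ks[nse > 0.7]                             # GLUE likelihood threshold
    print(f"{behavioural.size} behavioural parameter sets, "
          f"k in [{behavioural.min():.2f}, {behavioural.max():.2f}]")
    ```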

  8. Communicating Geographical Risks in Crisis Management: The Need for Research.

    PubMed

    French, Simon; Argyris, Nikolaos; Haywood, Stephanie M; Hort, Matthew C; Smith, Jim Q

    2017-10-23

    In any crisis, there is a great deal of uncertainty, often geographical uncertainty or, more precisely, spatiotemporal uncertainty. Examples include the spread of contamination from an industrial accident, drifting volcanic ash, and the path of a hurricane. Estimating spatiotemporal probabilities is usually a difficult task, but that is not our primary concern. Rather, we ask how analysts can communicate spatiotemporal uncertainty to those handling the crisis. We comment on the somewhat limited literature on the representation of spatial uncertainty on maps. We note that many cognitive issues arise and that the potential for confusion is high. We note that in the early stages of handling a crisis, the uncertainties involved may be deep, i.e., difficult or impossible to quantify in the time available. In such circumstances, we suggest the idea of presenting multiple scenarios. © 2017 Society for Risk Analysis.

  9. Reliability analysis of hydrologic containment of liquefied petroleum gas within unlined rock caverns.

    NASA Astrophysics Data System (ADS)

    Gao, X.; Yan, E. C.; Yeh, T. C. J.; Wang, Y.; Liang, Y.; Hao, Y.

    2017-12-01

    Most underground liquefied petroleum gas (LPG) storage caverns are constructed in unlined rock caverns (URCs), where the variability of hydraulic properties (in particular, hydraulic conductivity) has significant impacts on hydrologic containment performance. However, it is practically impossible to characterize the spatial distribution of these properties in detail at URC sites. This dilemma forces us to cope with uncertainty in our evaluations of gas containment; consequently, uncertainty-based analysis is more appropriate than traditional deterministic analysis. The objectives of this paper are 1) to introduce a numerical first-order method to calculate the gas containment reliability within heterogeneous, two-dimensional unlined rock caverns, and 2) to suggest a strategy for improving the gas containment reliability. To achieve these goals, we first introduced the stochastic continuum representation of the saturated hydraulic conductivity (Ks) of fractured rock and analyzed the spatial variability of Ks at a field site. We then conducted deterministic simulations to demonstrate the importance of the heterogeneity of Ks in the analysis of the gas tightness performance of URCs. Considering the uncertainty of this heterogeneity in real-world situations, we subsequently developed a numerical first-order method (NFOM) to determine the gas tightness reliability at crucial locations of URCs. Using the NFOM, the effect of the spatial variability of Ks on gas tightness reliability was investigated. Results show that as the variance or spatial structure anisotropy of Ks increases, the gas tightness reliability at most crucial locations decreases. We also compare the results of the NFOM with those of Monte Carlo simulation and find that the accuracy of the NFOM is mainly affected by the magnitude of the variance of Ks. Finally, to improve gas containment reliability at crucial locations at this study site, we suggest that vertical water-curtain holes be installed in the pillar rather than increasing the density of horizontal water-curtain boreholes.
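
    For orientation, a generic first-order second-moment (FOSM) reliability calculation is sketched below; this is not the authors' NFOM, and the performance function, means, and standard deviations are invented for illustration.

    ```python
    import numpy as np
    from scipy.stats import norm

    def g(x):
        """Invented performance function: containment margin; g < 0 means gas escape."""
        log_ks, poro = x
        return 1.0 - 0.8 * (log_ks + 6.0) - 3.0 * (poro - 0.10)

    mu = np.array([-6.0, 0.10])            # assumed means of log10(Ks) and porosity
    sigma = np.array([0.5, 0.02])          # assumed standard deviations

    grad = np.empty(2)
    for i in range(2):
        e = np.zeros(2); e[i] = 1e-4
        grad[i] = (g(mu + e) - g(mu - e)) / 2e-4   # central finite differences

    beta = g(mu) / np.sqrt(np.sum((grad * sigma)**2))   # reliability index
    print(f"beta = {beta:.2f}, P(failure) ~ {norm.cdf(-beta):.4f}")
    ```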

  11. Linear, multivariable robust control with a mu perspective

    NASA Technical Reports Server (NTRS)

    Packard, Andy; Doyle, John; Balas, Gary

    1993-01-01

    The structured singular value is a linear algebra tool developed to study a particular class of matrix perturbation problems arising in robust feedback control of multivariable systems. These perturbations are called linear fractional, and are a natural way to model many types of uncertainty in linear systems, including state-space parameter uncertainty, multiplicative and additive unmodeled dynamics uncertainty, and coprime factor and gap metric uncertainty. The structured singular value theory provides a natural extension of classical SISO robustness measures and concepts to MIMO systems. Structured singular value analysis, coupled with approximate synthesis methods, makes it possible to study the tradeoff between performance and uncertainty that occurs in all feedback systems. In MIMO systems, the complexity of the spatial interactions in the loop gains makes it difficult to quantify heuristically the tradeoffs that must occur. This paper examines the role played by the structured singular value (and its computable bounds) in answering these questions, as well as its role in the general robust, multivariable control analysis and design problem.
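
    As a toy numerical illustration of the computable bounds mentioned here, the Python sketch below minimizes the scaled maximum singular value sigma_max(D M D^-1) over positive diagonal scalings D, which is the standard D-scaling upper bound on mu for a diagonal complex uncertainty structure. The matrix M is random and purely illustrative.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(1)
    n = 4
    # Toy closed-loop interconnection matrix (random, for illustration only).
    M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

    def scaled_norm(logd, M):
        """sigma_max(D M D^-1) for positive diagonal D = diag(exp(logd))."""
        d = np.exp(logd)
        return np.linalg.norm((d[:, None] * M) / d[None, :], ord=2)

    # Upper bound for a diagonal (scalar-block) complex uncertainty structure:
    #   mu(M) <= inf_D sigma_max(D M D^-1), D positive diagonal.
    res = minimize(scaled_norm, np.zeros(n), args=(M,), method="Nelder-Mead")
    print(f"sigma_max(M)              = {np.linalg.norm(M, 2):.3f}  (unscaled)")
    print(f"D-scaled upper bound on mu = {res.fun:.3f}")
    ```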

  12. Investigation of the interpolation method to improve the distributed strain measurement accuracy in optical frequency domain reflectometry systems.

    PubMed

    Cui, Jiwen; Zhao, Shiyuan; Yang, Di; Ding, Zhenyang

    2018-02-20

    We use a spectrum interpolation technique to improve the distributed strain measurement accuracy in a Rayleigh-scatter-based optical frequency domain reflectometry (OFDR) sensing system. We demonstrate that strain accuracy is not limited by the "uncertainty principle" that exists in time-frequency analysis. Different interpolation methods are investigated and used to improve the accuracy of the peak position of the cross-correlation and, therefore, the accuracy of the strain. Interpolation implemented by padding zeros on one side of the windowed data in the spatial domain, before the inverse fast Fourier transform, is found to have the best accuracy. Using this method, the strain accuracy and resolution are both improved without decreasing the spatial resolution. A strain of 3 μϵ within a spatial resolution of 1 cm at a position of 21.4 m is distinguished, and the measurement uncertainty is 3.3 μϵ.
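
    As a rough, self-contained illustration of the general idea (zero-padding before an inverse FFT to interpolate a cross-correlation peak), the Python sketch below recovers a fractional-sample shift between two synthetic signals. It pads the cross-spectrum symmetrically rather than reproducing the paper's one-sided spatial-domain padding, and all signals are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, true_shift, pad = 256, 5.3, 16     # pad = upsampling (interpolation) factor

    x = rng.normal(size=n)
    X = np.fft.fft(x)
    f = np.fft.fftfreq(n)
    Y = X * np.exp(-2j * np.pi * f * true_shift)   # fractionally delayed copy

    S = np.conj(X) * Y                              # cross-spectrum, peak at +shift
    # Zero-pad between positive- and negative-frequency halves to upsample:
    Sp = np.concatenate([S[: n // 2], np.zeros((pad - 1) * n, complex), S[n // 2:]])
    r = np.abs(np.fft.ifft(Sp))                     # interpolated cross-correlation

    k = int(np.argmax(r))
    est = k / pad if k < pad * n / 2 else k / pad - n  # map peak index to signed lag
    print(f"true shift {true_shift}, estimated {est:.2f} samples")
    ```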

  13. Fuzzification of continuous-value spatial evidence for mineral prospectivity mapping

    NASA Astrophysics Data System (ADS)

    Yousefi, Mahyar; Carranza, Emmanuel John M.

    2015-01-01

    Complexities of geological processes portrayed as certain features (e.g., faults) in a map are natural sources of uncertainty in decision-making for the exploration of mineral deposits. Besides natural sources of uncertainty, knowledge-driven (e.g., fuzzy logic) mineral prospectivity mapping (MPM) incurs further uncertainty from the subjective judgment of the analyst, because there is no reliable, directly measurable value of the evidential scores corresponding to the relative importance of geological features. In this regard, analysts apply expert opinion to assess the relative importance of spatial evidence as meaningful decision support. This paper aims at the fuzzification of continuous spatial data used as proxy evidence, to facilitate and support fuzzy MPM in generating exploration target areas for further examination of undiscovered deposits. In addition, this paper proposes to adapt the concept of expected value to further improve fuzzy logic MPM, because the analysis of uncertain variables can be presented in terms of their expected value. The proposed modified expected value approach to MPM is not only a multi-criteria approach; it also treats the uncertainty of geological processes, as depicted by maps or spatial data, more realistically with respect to biased weighting than classified evidential maps do, because fuzzy membership scores are defined continuously: for example, there is no need to categorize distances from evidential features into proximity classes using arbitrary intervals. The proposed continuous weighting approach, together with the integration of the weighted evidence layers by the modified expected value function described in this paper, can be used efficiently in either greenfields or brownfields.
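
    To make the continuous-weighting idea concrete, here is a minimal Python sketch that maps a continuous evidence variable (distance to faults) to fuzzy membership scores with a logistic function; the inflection point and slope are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    def logistic_membership(x, infl, slope):
        """Map continuous evidence x (e.g., distance to faults, in metres) to a
        fuzzy membership in [0, 1]. infl is the inflection point, slope the
        steepness; both are illustrative assumptions."""
        return 1.0 / (1.0 + np.exp(slope * (x - infl)))

    # Distances from evidential features need no arbitrary class intervals:
    dist_to_fault = np.array([0.0, 250.0, 500.0, 1000.0, 2500.0, 5000.0])
    scores = logistic_membership(dist_to_fault, infl=1000.0, slope=0.005)
    print(np.round(scores, 3))   # membership decays smoothly with distance
    ```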

  14. Experiments to Evaluate and Implement Passive Tracer Gas Methods to Measure Ventilation Rates in Homes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lunden, Melissa; Faulkner, David; Heredia, Elizabeth

    2012-10-01

    This report documents experiments performed in three homes to assess the methodology used to determine air exchange rates using passive tracer techniques. The experiments used four different tracer gases emitted simultaneously but implemented with different spatial coverage in the home. Two different tracer gas sampling methods were used. The results characterize the factors of the execution and analysis of the passive tracer technique that affect the uncertainty in the calculated air exchange rates. These factors include uncertainties in tracer gas emission rates, differences in measured concentrations for different tracer gases, temporal and spatial variability of the concentrations, the comparison between different gas sampling methods, and the effect of different ventilation conditions.

  15. Location error uncertainties - an advanced use of probabilistic inverse theory

    NASA Astrophysics Data System (ADS)

    Debski, Wojciech

    2016-04-01

    The spatial location of sources of seismic waves is one of the first tasks when transient waves from natural (uncontrolled) sources are analyzed in many branches of physics, including seismology and oceanology, to name a few. Source activity and its spatial variability in time, the geometry of the recording network, and the complexity and heterogeneity of the wave velocity distribution are all factors influencing the performance of location algorithms and the accuracy of the achieved results. While estimating the location of earthquake foci is relatively simple, a quantitative estimation of the location accuracy is a real challenge, even if a probabilistic inverse method is used, because it requires knowledge of the statistics of observational, modelling, and a priori uncertainties. In this presentation we address this task for the case when the statistics of observational and/or modelling errors are unknown. This common situation requires the introduction of a priori constraints on the likelihood (misfit) function, which significantly influence the estimated errors. Based on the results of an analysis of 120 seismic events from the Rudna copper mine operating in southwestern Poland, we illustrate an approach based on the Shannon entropy calculated for the a posteriori distribution. We show that this meta-characteristic of the a posteriori distribution carries some information on the uncertainties of the solution found.
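
    As a schematic of the entropy meta-characteristic, the Python sketch below computes the Shannon entropy of a gridded a posteriori location PDF; a broader PDF (larger location uncertainty) yields a larger entropy. The Gaussian PDFs and the grid are invented for illustration.

    ```python
    import numpy as np

    def shannon_entropy(pdf, cell_area):
        """Differential (Shannon) entropy of a gridded a posteriori location PDF.
        Lower entropy -> more concentrated PDF -> smaller location uncertainty."""
        p = pdf / (pdf.sum() * cell_area)        # normalise to integrate to 1
        mask = p > 0
        return -np.sum(p[mask] * np.log(p[mask])) * cell_area

    # Toy a posteriori PDF on a 2-D grid: a Gaussian around the located focus.
    x, y = np.meshgrid(np.linspace(-500, 500, 201), np.linspace(-500, 500, 201))
    for sigma in (50.0, 150.0):                  # metres; narrow vs broad solution
        pdf = np.exp(-(x**2 + y**2) / (2 * sigma**2))
        h = shannon_entropy(pdf, cell_area=5.0 * 5.0)
        print(f"sigma = {sigma:5.1f} m  ->  entropy = {h:.2f} nats")
    ```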

  16. Optimal configurations of spatial scale for grid cell firing under noise and uncertainty

    PubMed Central

    Towse, Benjamin W.; Barry, Caswell; Bush, Daniel; Burgess, Neil

    2014-01-01

    We examined the accuracy with which the location of an agent moving within an environment could be decoded from the simulated firing of systems of grid cells. Grid cells were modelled with Poisson spiking dynamics and organized into multiple ‘modules’ of cells, with firing patterns of similar spatial scale within modules and a wide range of spatial scales across modules. The number of grid cells per module, the spatial scaling factor between modules and the size of the environment were varied. Errors in decoded location can take two forms: small errors of precision and larger errors resulting from ambiguity in decoding periodic firing patterns. With enough cells per module (e.g. eight modules of 100 cells each) grid systems are highly robust to ambiguity errors, even over ranges much larger than the largest grid scale (e.g. over a 500 m range when the maximum grid scale is 264 cm). Results did not depend strongly on the precise organization of scales across modules (geometric, co-prime or random). However, independent spatial noise across modules, which would occur if modules receive independent spatial inputs and might increase with spatial uncertainty, dramatically degrades the performance of the grid system. This effect of spatial uncertainty can be mitigated by uniform expansion of grid scales. Thus, in the realistic regimes simulated here, the optimal overall scale for a grid system represents a trade-off between minimizing spatial uncertainty (requiring large scales) and maximizing precision (requiring small scales). Within this view, the temporary expansion of grid scales observed in novel environments may be an optimal response to increased spatial uncertainty induced by the unfamiliarity of the available spatial cues. PMID:24366144

  17. Quantifying measurement uncertainty and spatial variability in the context of model evaluation

    NASA Astrophysics Data System (ADS)

    Choukulkar, A.; Brewer, A.; Pichugina, Y. L.; Bonin, T.; Banta, R. M.; Sandberg, S.; Weickmann, A. M.; Djalalova, I.; McCaffrey, K.; Bianco, L.; Wilczak, J. M.; Newman, J. F.; Draxl, C.; Lundquist, J. K.; Wharton, S.; Olson, J.; Kenyon, J.; Marquis, M.

    2017-12-01

    In an effort to improve wind forecasts for the wind energy sector, the Department of Energy and NOAA funded the second Wind Forecast Improvement Project (WFIP2). As part of the WFIP2 field campaign, a large suite of in situ and remote sensing instrumentation was deployed to the Columbia River Gorge in Oregon and Washington from October 2015 to March 2017. The array of instrumentation deployed included 915-MHz wind profiling radars, sodars, wind-profiling lidars, and scanning lidars. The role of these instruments was to provide wind measurements at high spatial and temporal resolution for model evaluation and improvement of model physics. To properly determine model errors, the uncertainties in instrument-model comparisons need to be quantified accurately. These uncertainties arise from several factors such as measurement uncertainty, spatial variability, and interpolation of model output to instrument locations, to name a few. In this presentation, we will introduce a formalism to quantify measurement uncertainty and spatial variability. The accuracy of this formalism will be tested using existing datasets such as the eXperimental Planetary boundary layer Instrumentation Assessment (XPIA) campaign. Finally, the uncertainties in wind measurement and the spatial variability estimates from the WFIP2 field campaign will be discussed to understand the challenges involved in model evaluation.

  18. Quantifying acoustic doppler current profiler discharge uncertainty: A Monte Carlo based tool for moving-boat measurements

    USGS Publications Warehouse

    Mueller, David S.

    2017-01-01

    This paper presents a method using Monte Carlo simulations for assessing uncertainty of moving-boat acoustic Doppler current profiler (ADCP) discharge measurements using a software tool known as QUant, which was developed for this purpose. Analysis was performed on 10 data sets from four Water Survey of Canada gauging stations in order to evaluate the relative contribution of a range of error sources to the total estimated uncertainty. The factors that differed among data sets included the fraction of unmeasured discharge relative to the total discharge, flow nonuniformity, and operator decisions about instrument programming and measurement cross section. As anticipated, it was found that the estimated uncertainty is dominated by uncertainty of the discharge in the unmeasured areas, highlighting the importance of appropriate selection of the site, the instrument, and the user inputs required to estimate the unmeasured discharge. The main contributor to uncertainty was invalid data, but spatial inhomogeneity in water velocity and bottom-track velocity also contributed, as did variation in the edge velocity, uncertainty in the edge distances, edge coefficients, and the top and bottom extrapolation methods. To a lesser extent, spatial inhomogeneity in the bottom depth also contributed to the total uncertainty, as did uncertainty in the ADCP draft at shallow sites. The estimated uncertainties from QUant can be used to assess the adequacy of standard operating procedures. They also provide quantitative feedback to the ADCP operators about the quality of their measurements, indicating which parameters are contributing most to uncertainty, and perhaps even highlighting ways in which uncertainty can be reduced. Additionally, QUant can be used to account for self-dependent error sources such as heading errors, which are a function of heading. The results demonstrate the importance of a Monte Carlo method tool such as QUant for quantifying random and bias errors when evaluating the uncertainty of moving-boat ADCP measurements.
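
    A heavily simplified Monte Carlo sketch of this idea (not QUant itself) follows: the total discharge is assembled from measured and unmeasured (top, bottom, edge) components, each input is perturbed within an assumed error model, and the spread of the resulting totals estimates the measurement uncertainty. All nominal values and error magnitudes below are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 20_000                                   # Monte Carlo realisations

    # Nominal discharge components (m^3/s) -- illustrative values only.
    q_measured, q_top, q_bottom, q_edges = 80.0, 12.0, 6.0, 2.0

    # Assumed standard uncertainties: small for the measured portion, larger
    # for the extrapolated and poorly constrained unmeasured areas.
    q_m = q_measured * (1 + rng.normal(0.0, 0.02, n))   # 2% measured portion
    q_t = q_top * (1 + rng.normal(0.0, 0.10, n))        # 10% top extrapolation
    q_b = q_bottom * (1 + rng.normal(0.0, 0.10, n))     # 10% bottom extrapolation
    q_e = q_edges * (1 + rng.uniform(-0.3, 0.3, n))     # +/-30% edge estimate

    q_total = q_m + q_t + q_b + q_e
    q50 = np.median(q_total)
    q025, q975 = np.quantile(q_total, [0.025, 0.975])
    print(f"Q = {q50:.1f} m^3/s, 95% interval [{q025:.1f}, {q975:.1f}] "
          f"(relative U ~ {(q975 - q025) / 2 / q50 * 100:.1f}%)")
    ```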

  19. Analysis of sensitivity and uncertainty in an individual-based model of a threatened wildlife species

    Treesearch

    Bruce G. Marcot; Peter H. Singleton; Nathan H. Schumaker

    2015-01-01

    Sensitivity analysis—determination of how prediction variables affect response variables—of individual-based models (IBMs) are few but important to the interpretation of model output. We present sensitivity analysis of a spatially explicit IBM (HexSim) of a threatened species, the Northern Spotted Owl (NSO; Strix occidentalis caurina) in Washington...

  20. Applying Metrological Techniques to Satellite Fundamental Climate Data Records

    NASA Astrophysics Data System (ADS)

    Woolliams, Emma R.; Mittaz, Jonathan PD; Merchant, Christopher J.; Hunt, Samuel E.; Harris, Peter M.

    2018-02-01

    Quantifying long-term environmental variability, including climatic trends, requires decadal-scale time series of observations. The reliability of such trend analysis depends on the long-term stability of the data record and on understanding the sources of uncertainty in historic, current and future sensors. We give a brief overview of how metrological techniques can be applied to historical satellite data sets. In particular, we discuss the implications of error correlation at different spatial and temporal scales and the forms of such correlation, and consider how uncertainty is propagated with partial correlation. We give a form of the Law of Propagation of Uncertainties that considers the propagation of uncertainties associated with common errors to give the covariance associated with Earth observations in different spectral channels.
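
    For orientation, the standard law of propagation of uncertainty (as given in the GUM), from which such covariance-aware forms are derived for a measurand y = f(x_1, ..., x_N), is:

    ```latex
    u_c^2(y) \;=\; \sum_{i=1}^{N} \left(\frac{\partial f}{\partial x_i}\right)^{\!2} u^2(x_i)
    \;+\; 2\sum_{i=1}^{N-1}\sum_{j=i+1}^{N} \frac{\partial f}{\partial x_i}\,\frac{\partial f}{\partial x_j}\, u(x_i, x_j)
    ```

    The covariances u(x_i, x_j) are nonzero whenever errors are shared between inputs; for a fully common error, u(x_i, x_j) = u(x_i) u(x_j), which is what makes the off-diagonal terms dominate the covariance of multi-channel Earth observations.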

  1. Uncertainty Analysis in Large Area Aboveground Biomass Mapping

    NASA Astrophysics Data System (ADS)

    Baccini, A.; Carvalho, L.; Dubayah, R.; Goetz, S. J.; Friedl, M. A.

    2011-12-01

    Satellite and aircraft-based remote sensing observations are being used more frequently to generate spatially explicit estimates of the aboveground carbon stock of forest ecosystems. Because deforestation and forest degradation account for circa 10% of anthropogenic carbon emissions to the atmosphere, policy mechanisms are increasingly recognized as a low-cost mitigation option to reduce carbon emissions. They are, however, contingent upon the capacity to accurately measure carbon stored in forests. Here we examine the sources of uncertainty and error propagation in generating maps of aboveground biomass. We focus on characterizing uncertainties associated with maps at the pixel and spatially aggregated national scales. We pursue three strategies to describe the error and uncertainty properties of aboveground biomass maps: (1) model-based assessment using confidence intervals derived from linear regression methods; (2) data-mining algorithms such as regression trees and ensembles of these; and (3) empirical assessments using independently collected data sets. The latter effort explores error propagation using field data acquired within satellite-based lidar (GLAS) acquisitions versus alternative in situ methods that rely upon field measurements that have not been systematically collected for this purpose (e.g., from forest inventory data sets). A key goal of our effort is to provide multi-level characterizations that yield both pixel- and biome-level estimates of uncertainties at different scales.

  2. Monitoring and modeling as a continuing learning process: the use of hydrological models in a general probabilistic framework.

    NASA Astrophysics Data System (ADS)

    Baroni, G.; Gräff, T.; Reinstorf, F.; Oswald, S. E.

    2012-04-01

    Nowadays, uncertainty and sensitivity analysis are considered basic tools for the assessment of hydrological models and the evaluation of the most important sources of uncertainty. In this context, several methods have been developed and applied under different hydrological conditions over the last decades. In most cases, however, studies have investigated mainly the influence of parameter uncertainty on the simulated outputs, and few approaches have also considered other sources of uncertainty, i.e., input data and model structure. Moreover, several constraints arise when spatially distributed parameters are involved. To overcome these limitations, a general probabilistic framework based on Monte Carlo simulations and the Sobol method has been proposed. In this study, the general probabilistic framework was applied at the field scale using a 1D physically based hydrological model (SWAP). Furthermore, the framework was extended to the catchment scale in combination with a spatially distributed hydrological model (SHETRAN). The models are applied at two different experimental sites in Germany: a relatively flat cropped field close to Potsdam (Brandenburg) and a small mountainous catchment with agricultural land use (Schaefertal, Harz Mountains). For both cases, input and parameters are considered the major sources of uncertainty. Evaluation of the models was based on soil moisture measured at plot scale at different depths and, for the catchment site, also on daily discharge values. The study shows how the framework can take into account all the various sources of uncertainty, i.e., input data, parameters (either in scalar or spatially distributed form) and model structures. The framework can be used in a loop in order to optimize further monitoring activities aimed at improving the performance of the model. In these particular applications, the results show how the sources of uncertainty are specific to each process considered. The influence of the input data, as well as the presence of compensating errors, also becomes clear for the different simulated processes.
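
    As a minimal illustration of variance-based sensitivity analysis of this kind, the sketch below uses the SALib package to compute first- and total-order Sobol indices for a toy three-parameter response; the model and parameter ranges are stand-ins, not those of SWAP or SHETRAN.

    ```python
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    # Variance-based (Sobol') sensitivity analysis of a toy soil-water response.
    problem = {
        "num_vars": 3,
        "names": ["Ksat", "theta_s", "rain_mult"],    # hypothetical inputs
        "bounds": [[0.1, 10.0], [0.30, 0.55], [0.8, 1.2]],
    }

    def toy_model(X):
        ksat, theta_s, rain = X.T
        return rain * theta_s * np.log1p(ksat)        # arbitrary smooth response

    X = saltelli.sample(problem, 1024)                # N * (2D + 2) samples
    Y = toy_model(X)
    Si = sobol.analyze(problem, Y)
    for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
        print(f"{name:9s}  first-order = {s1:5.2f}   total-order = {st:5.2f}")
    ```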

  3. Latin hypercube sampling and geostatistical modeling of spatial uncertainty in a spatially explicit forest landscape model simulation

    Treesearch

    Chonggang Xu; Hong S. He; Yuanman Hu; Yu Chang; Xiuzhen Li; Rencang Bu

    2005-01-01

    Geostatistical stochastic simulation is commonly combined with the Monte Carlo method to quantify the uncertainty in spatial model simulations. However, due to the relatively long running time of spatially explicit forest models that results from their complexity, it is often infeasible to generate hundreds or thousands of Monte Carlo simulations. Thus, it is of great...
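
    A minimal example of drawing a Latin hypercube sample, which stratifies each parameter's range so that far fewer realisations cover the input space than plain Monte Carlo would need, can be written with SciPy's quasi-Monte Carlo module as below; the two parameters and their bounds are purely illustrative.

    ```python
    import numpy as np
    from scipy.stats import qmc

    # Latin hypercube sampling: each of the 10 rows falls in a distinct stratum
    # of each parameter's range, giving even coverage with few samples.
    sampler = qmc.LatinHypercube(d=2, seed=42)
    unit = sampler.random(n=10)                    # 10 points in [0, 1)^2
    low, high = [0.0, 500.0], [1.0, 3000.0]        # hypothetical parameter bounds
    samples = qmc.scale(unit, low, high)
    print(samples.round(2))
    ```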

  4. The significance of spatial variability of rainfall on streamflow: A synthetic analysis at the Upper Lee catchment, UK

    NASA Astrophysics Data System (ADS)

    Pechlivanidis, Ilias; McIntyre, Neil; Wheater, Howard

    2017-04-01

    Rainfall, one of the main inputs in hydrological modeling, is a highly heterogeneous process over a wide range of scales in space, and hence ignoring spatial rainfall information can affect the simulated streamflow. Calibration of hydrological model parameters is rarely a straightforward task, due to parameter equifinality and the tendency of parameters to compensate for other uncertainties, i.e., structural and forcing-input errors. Here, we analyse the significance of the spatial variability of rainfall on streamflow as a function of catchment scale, catchment type and antecedent conditions, using the continuous-time, semi-distributed PDM hydrological model at the Upper Lee catchment, UK. The impact of catchment scale and type is assessed using 11 nested catchments ranging in scale from 25 to 1040 km2, and further assessed by artificially changing the catchment characteristics and translating these to model parameters, with uncertainty, using model regionalisation. Synthetic rainfall events are introduced to directly relate the change in simulated streamflow to the spatial variability of rainfall. Overall, we conclude that antecedent catchment wetness and catchment type play an important role in controlling the significance of the spatial distribution of rainfall for streamflow. Results show a relationship between hydrograph characteristics (streamflow peak and volume) and the degree of spatial variability of rainfall for the impermeable catchments under dry antecedent conditions, although this decreases at larger scales; this sensitivity is significantly undermined under wet antecedent conditions. Although there is an indication that the impact of spatial rainfall on streamflow varies as a function of catchment scale, the variability of antecedent conditions between the synthetic catchments seems to mask this effect. Finally, hydrograph responses to different spatial patterns in rainfall depend on the assumptions used for model parameter estimation, and the spatial variation in parameters indicates the need for an uncertainty framework in such investigations.

  5. Evaluation of MODIS aerosol optical depth for semi­-arid environments in complex terrain

    NASA Astrophysics Data System (ADS)

    Holmes, H.; Loria Salazar, S. M.; Panorska, A. K.; Arnott, W. P.; Barnard, J.

    2015-12-01

    The use of satellite remote sensing to estimate spatially resolved, ground-level air pollutant concentrations is increasing due to advancements in remote sensing technology and the limited number of surface observations. Satellite retrievals provide global, spatiotemporal air quality information and are used to track plumes, estimate human exposures, model emissions, and determine sources (i.e., natural versus anthropogenic) in regulatory applications. Ground-level PM2.5 concentrations can be estimated using columnar aerosol optical depth (AOD) from MODIS, where the satellite retrieval serves as a spatial surrogate to simulate surface PM2.5 gradients. Spatial statistical models and MODIS AOD retrieval algorithms have been evaluated for the dark, vegetated eastern US, while the semi-arid western US remains an understudied region with added complexity due to heterogeneous emissions, smoke from wildfires, and complex terrain. The objective of this work is to evaluate the uncertainty of MODIS AOD retrievals by comparison with columnar AOD and surface PM2.5 measurements from the AERONET and EPA networks. Data are analyzed from multiple stations in California and Nevada over three years during which four major wildfires occurred. Results indicate that MODIS retrievals fail to estimate column-integrated aerosol pollution in the summer months. This is further investigated by quantifying the statistical relationships between MODIS AOD, AERONET AOD, and surface PM2.5 concentrations. Data analysis indicates that the distribution of MODIS AOD is significantly (p<0.05) different from that of AERONET AOD. Further, using the results of the distributional and association analysis, the impacts of MODIS AOD uncertainties on spatial gradients are evaluated. Additionally, the relationships between these uncertainties and physical parameters in the retrieval algorithm (e.g., surface reflectance, Ångström extinction exponent) are discussed.

  6. Uncertainty assessment method for the Cs-137 fallout inventory and penetration depth.

    PubMed

    Papadakos, G N; Karangelos, D J; Petropoulos, N P; Anagnostakis, M J; Hinis, E P; Simopoulos, S E

    2017-05-01

    Within the presented study, soil samples were collected in 2007 at 20 different locations of the Greek terrain, both from the surface and from depths down to 26 cm. Sampling locations were selected primarily from areas where high levels of 137Cs deposition after the Chernobyl accident had already been identified by the Nuclear Engineering Laboratory of the National Technical University of Athens during and after 1986. At one location of relatively higher deposition, soil core samples were collected following a 60 m by 60 m Cartesian grid with a 20 m node-to-node distance. Single or paired core samples were also collected from the remaining 19 locations. Sample measurements and analysis were used to estimate the 137Cs inventory and the corresponding depth migration twenty years after the deposition on Greek terrain. Based on these data, the uncertainty components of the whole sampling-to-results procedure were investigated. A cause-and-effect assessment process was used to apply the law of error propagation and demonstrate that the dominating significant component of the combined uncertainty is that due to the spatial variability of the contemporary (2007) 137Cs inventory. A secondary, yet also significant, component was identified to be the activity measurement process itself. Other, less significant uncertainty parameters were the sampling methods, the variation of the soil field density with depth and the preparation of samples for measurement. The sampling grid experiment allowed for the quantitative evaluation of the uncertainty due to spatial variability, aided by semivariance analysis. A denser, optimized grid could return more accurate values for this component, but at a significantly elevated laboratory cost in terms of both human and material resources. Using the collected data, and for the case of single-core soil sampling under a well-defined sampling methodology quality assurance, the uncertainty component due to spatial variability was evaluated at about 19% for the 137Cs inventory and up to 34% for the 137Cs penetration depth. Based on the presented results and on related literature, it is argued that such high uncertainties should be anticipated for single-core samplings conducted using similar methodology and employed as 137Cs inventory and penetration depth estimators. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Technical Note: Atmospheric CO2 inversions on the mesoscale using data-driven prior uncertainties: methodology and system evaluation

    NASA Astrophysics Data System (ADS)

    Kountouris, Panagiotis; Gerbig, Christoph; Rödenbeck, Christian; Karstens, Ute; Koch, Thomas Frank; Heimann, Martin

    2018-03-01

    Atmospheric inversions are widely used in the optimization of surface carbon fluxes on a regional scale using information from atmospheric CO2 dry mole fractions. In many studies the prior flux uncertainty applied in the inversion scheme does not directly reflect the true flux uncertainties but is used to regularize the inverse problem. Here, we aim to implement an inversion scheme using the Jena inversion system and applying a prior flux error structure derived from a model-data residual analysis at high spatial and temporal resolution over a full-year period in the European domain. We analyzed the performance of the inversion system with a synthetic experiment, in which the flux constraint is derived following the same residual analysis but applied to the model-model mismatch. The synthetic study showed quite good agreement between posterior and true fluxes on European, country, annual and monthly scales. Posterior monthly, country-aggregated fluxes improved their correlation coefficient with the known truth by 7 % compared to the prior estimates, with a mean correlation of 0.92. The ratio of the standard deviations of posterior to reference and of prior to reference was also reduced by 33 %, with a mean value of 1.15. We identified the temporal and spatial scales at which the inversion system maximizes the derived information; monthly temporal scales at around 200 km spatial resolution seem to maximize the information gain.

  8. How much swamp are we talking here?: Propagating uncertainty about the area of coastal wetlands into the U.S. greenhouse gas inventory

    NASA Astrophysics Data System (ADS)

    Holmquist, J. R.; Crooks, S.; Windham-Myers, L.; Megonigal, P.; Weller, D.; Lu, M.; Bernal, B.; Byrd, K. B.; Morris, J. T.; Troxler, T.; McCombs, J.; Herold, N.

    2017-12-01

    Stable coastal wetlands can store substantial amounts of carbon (C) that can be released when they are degraded or eroded. The EPA recently incorporated coastal wetland net storage and emissions within the Agriculture, Forestry and Other Land Uses category of the U.S. National Greenhouse Gas Inventory (NGGI). This was a seminal analysis, but its quantification of uncertainty needs improvement. We provide a value-added analysis by estimating that uncertainty, focusing initially on the most basic assumption: the area of coastal wetlands. We considered three sources: uncertainty in the areas of vegetation and salinity subclasses, uncertainty in the areas of changing or stable wetlands, and uncertainty in the inland extent of coastal wetlands. The areas of vegetation and salinity subtypes, as well as of stable or changing wetlands, were estimated from 2006 and 2010 maps derived from Landsat imagery by the Coastal Change Analysis Program (C-CAP). We generated unbiased area estimates and confidence intervals for C-CAP, taking into account the mapped area, the proportional areas of commission and omission errors, and the number of observations. We defined the inland extent of wetlands as all land below the current elevation of twice-monthly highest tides. We generated probabilistic inundation maps integrating wetland-specific bias and random error in light detection and ranging (lidar) elevation maps with the spatially explicit random error in tidal surfaces generated from tide gauges. This initial uncertainty analysis will be extended to calculate the total propagated uncertainty in the NGGI by including the uncertainties in the amount of C lost from eroded and degraded wetlands, stored annually in stable wetlands, and emitted in the form of methane by tidal freshwater wetlands.

  9. Monthly and spatially resolved black carbon emission inventory of India: uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Paliwal, Umed; Sharma, Mukesh; Burkhart, John F.

    2016-10-01

    Black carbon (BC) emissions from India for the year 2011 are estimated to be 901.11 ± 151.56 Gg yr-1 based on a new ground-up, GIS-based inventory. The grid-based, spatially resolved emission inventory includes, in addition to conventional sources, emissions from kerosene lamps, forest fires, diesel-powered irrigation pumps and electricity generators at mobile towers. The emissions have been estimated at district level and were spatially distributed onto grids at a resolution of 40 × 40 km2. The uncertainty in emissions has been estimated using a Monte Carlo simulation by considering the variability in activity data and emission factors. Monthly variation of BC emissions has also been estimated to account for the seasonal variability. To the total BC emissions, domestic fuels contributed most significantly (47 %), followed by industry (22 %), transport (17 %), open burning (12 %) and others (2 %). The spatial and seasonal resolution of the inventory will be useful for modeling BC transport in the atmosphere for air quality, global warming and other process-level studies that require greater temporal resolution than traditional inventories.

  10. Using FOSM-Based Data Worth Analyses to Design Geophysical Surveys to Reduce Uncertainty in a Regional Groundwater Model Update

    NASA Astrophysics Data System (ADS)

    Smith, B. D.; White, J.; Kress, W. H.; Clark, B. R.; Barlow, J.

    2016-12-01

    Hydrogeophysical surveys have become an integral part of understanding the hydrogeological frameworks used in groundwater models. Regional models cover large areas where water well data are, at best, scattered and irregular. Since budgets are finite, priorities must be assigned to select optimal areas for geophysical surveys. For airborne electromagnetic (AEM) geophysical surveys, optimization of mapping depth and line spacing needs to take into account the objectives of the groundwater models. The approach discussed here uses a first-order, second-moment (FOSM) uncertainty analysis, which assumes an approximately linear relation between model parameters and observations. This assumption allows FOSM analysis to be applied to estimate the value of increased parameter knowledge in reducing forecast uncertainty, and here it is used to optimize yet-to-be-completed geophysical surveying for that purpose. The main objective of the geophysical surveying is assumed to be estimating the values and spatial variation of hydrologic parameters (i.e., hydraulic conductivity), as well as mapping lower-permeability layers that influence the spatial distribution of recharge flux. The proposed data worth analysis was applied to the Mississippi Embayment Regional Aquifer Study (MERAS), which is being updated. The objective of MERAS is to assess the groundwater availability (status and trends) of the Mississippi embayment aquifer system. The study area covers portions of eight states: Alabama, Arkansas, Illinois, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. The active model grid covers approximately 70,000 square miles and incorporates some 6,000 miles of major rivers and over 100,000 water wells. In the FOSM analysis, a dense network of pilot points was used to capture uncertainty in hydraulic conductivity and recharge. To simulate the effect of AEM flight lines, the prior uncertainty of the hydraulic conductivity and recharge pilot points along potential flight lines was reduced. The FOSM forecast uncertainty estimates were then recalculated and compared to the base forecast uncertainty estimates; the resulting reduction in forecast uncertainty is a measure of the effect of the AEM survey on the model. Iterating through this process results in optimized flight-line locations.
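
    A compact numerical sketch of the FOSM data-worth logic follows (with small invented matrices, not the MERAS model): under a linearised model, hypothetical observations shrink the parameter covariance via a Schur-complement update, which in turn shrinks the first-order forecast variance; the reduction is the estimated worth of the yet-to-be-flown survey.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    npar = 8

    C_prior = np.diag(rng.uniform(0.5, 2.0, npar))   # prior parameter covariance
    y = rng.normal(size=npar)                        # forecast sensitivity vector

    def forecast_var(C):
        """First-order (FOSM) forecast variance for sensitivity vector y."""
        return y @ C @ y

    def posterior_cov(C, J, sig_obs):
        """Schur-complement update of C for observations with Jacobian J."""
        R = np.eye(J.shape[0]) * sig_obs**2          # observation error covariance
        G = C @ J.T @ np.linalg.inv(J @ C @ J.T + R)
        return C - G @ J @ C

    print(f"prior forecast variance:     {forecast_var(C_prior):.3f}")

    # Hypothetical AEM flight lines informing parameters 0-3 (e.g. K pilot points):
    J_aem = np.zeros((4, npar))
    J_aem[range(4), range(4)] = 1.0
    C_post = posterior_cov(C_prior, J_aem, sig_obs=0.3)
    print(f"posterior forecast variance: {forecast_var(C_post):.3f}")
    ```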

  11. Uncertainty indication in soil function maps - transparent and easy-to-use information to support sustainable use of soil resources

    NASA Astrophysics Data System (ADS)

    Greiner, Lucie; Nussbaum, Madlene; Papritz, Andreas; Zimmermann, Stephan; Gubler, Andreas; Grêt-Regamey, Adrienne; Keller, Armin

    2018-05-01

    Spatial information on soil function fulfillment (SFF) is increasingly being used to inform decision-making in spatial planning programs to support sustainable use of soil resources. Soil function maps visualize soils' abilities to fulfill their functions, e.g., regulating water and nutrient flows, providing habitats, and supporting biomass production, based on soil properties. Such information must be reliable for informed and transparent decision-making in spatial planning programs. In this study, we add to the transparency of soil function maps by (1) indicating uncertainties arising from the prediction of soil properties generated by digital soil mapping (DSM) that are used for soil function assessment (SFA), and (2) showing the response of different SFA methods to the propagation of uncertainties through the assessment. For a study area of 170 km2 in the Swiss Plateau, we map 10 static soil sub-functions for agricultural soils at a spatial resolution of 20 × 20 m, together with their uncertainties. Mapping the 10 soil sub-functions using simple ordinal assessment scales reveals pronounced spatial patterns with a high variability of SFF scores across the region, linked to inherent soil properties, terrain attributes and climate conditions. Uncertainties in soil properties propagated through SFA methods generally lead to substantial uncertainty in the mapped soil sub-functions. We propose two types of uncertainty maps that can be readily understood by stakeholders. Cumulative distribution functions of SFF scores indicate that SFA methods respond differently to the propagated uncertainty of soil properties. Even where methods are comparable in complexity and assessment scale, their comparability in view of uncertainty propagation may differ. We conclude that comparable uncertainty indications in soil function maps are needed to enable informed and transparent decisions on the sustainable use of soil resources.

  12. The use of composite fingerprints to quantify sediment sources in a wildfire impacted landscape, Alberta, Canada.

    PubMed

    Stone, M; Collins, A L; Silins, U; Emelko, M B; Zhang, Y S

    2014-03-01

    There is increasing global concern regarding the impacts of large-scale land disturbance by wildfire on a wide range of water and related ecological services. This study explores the impact of the 2003 Lost Creek wildfire in the Crowsnest River basin, Alberta, Canada on regional-scale sediment sources using a tracing approach. A composite geochemical fingerprinting procedure was used to apportion the sediment efflux among three key spatial sediment sources: (1) unburned (reference) sub-basins, (2) burned sub-basins, and (3) burned sub-basins that were subsequently salvage logged. Spatial sediment sources were characterized by collecting time-integrated suspended sediment samples using passive devices during the entire ice-free periods of 2009 and 2010. The tracing procedure combines the Kruskal-Wallis H-test, principal component analysis and genetic-algorithm-driven discriminant function analysis for source discrimination. Source apportionment was based on a numerical mass balance model deployed within a Monte Carlo framework incorporating both local and global (genetic algorithm) optimization. The mean relative frequency-weighted average median inputs from the three spatial source units were estimated to be 17% (inter-quartile uncertainty range 0-32%) from the reference areas, 45% (inter-quartile uncertainty range 25-65%) from the burned areas and 38% (inter-quartile uncertainty range 14-59%) from the burned and salvage-logged areas. High sediment inputs from the burned and burned-salvage-logged areas (spatial source units 2 and 3) reflect the lasting effects of forest canopy and forest floor organic matter disturbance during the 2003 wildfire, including increased runoff and sediment availability related to high terrestrial erosion, streamside mass wasting and river bank collapse. The results demonstrate the impact of wildfire, and of the incremental pressures associated with salvage logging, on catchment spatial sediment sources in higher-elevation Montane regions where forest growth and vegetation recovery are relatively slow. Copyright © 2013 Elsevier B.V. All rights reserved.
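
    A schematic Python sketch of this kind of Monte Carlo mass-balance unmixing is given below: source proportions are constrained to be non-negative and to sum to one, the source tracer signatures are resampled on each iteration, and the spread of the resulting proportions gives inter-quartile uncertainty ranges. All tracer values are fictitious.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(11)
    # Mean tracer signatures (3 sources x 2 tracers) and the target sediment
    # mixture -- fictitious values for illustration only.
    src_mean = np.array([[12.0, 3.0], [20.0, 9.0], [16.0, 6.0]])
    src_sd = src_mean * 0.15
    sed = np.array([17.0, 6.8])

    def unmix(src, sed):
        """Proportions p >= 0 with sum(p) = 1 minimising tracer misfit."""
        obj = lambda p: np.sum((sed - p @ src) ** 2)
        cons = {"type": "eq", "fun": lambda p: p.sum() - 1.0}
        res = minimize(obj, np.full(3, 1 / 3), bounds=[(0, 1)] * 3,
                       constraints=cons)
        return res.x

    # Monte Carlo loop: perturb source signatures, re-solve the mass balance.
    P = np.array([unmix(src_mean + rng.normal(0, src_sd), sed)
                  for _ in range(500)])
    for i, name in enumerate(["reference", "burned", "burned+logged"]):
        q25, q50, q75 = np.percentile(P[:, i], [25, 50, 75])
        print(f"{name:14s} median {q50:.2f}  IQR [{q25:.2f}, {q75:.2f}]")
    ```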

  13. Evaluating critical uncertainty thresholds in a spatial model of forest pest invasion risk

    Treesearch

    Frank H. Koch; Denys Yemshanov; Daniel W. McKenney; William D. Smith

    2009-01-01

    Pest risk maps can provide useful decision support in invasive species management, but most do not adequately consider the uncertainty associated with predicted risk values. This study explores how increased uncertainty in a risk model’s numeric assumptions might affect the resultant risk map. We used a spatial stochastic model, integrating components for...

  14. Facing uncertainty in ecosystem services-based resource management.

    PubMed

    Grêt-Regamey, Adrienne; Brunner, Sibyl H; Altwegg, Jürg; Bebi, Peter

    2013-09-01

    The concept of ecosystem services is increasingly used to support natural resource management decisions. While the science for assessing ecosystem services is improving, appropriate methods to address uncertainties in a quantitative manner are missing. Ignoring parameter uncertainties, modeling uncertainties and uncertainties related to human-environment interactions can alter decisions and lead to overlooking important management possibilities. In this contribution, we present a new approach for mapping the uncertainties in the assessment of multiple ecosystem services. The spatially explicit risk approach links Bayesian networks to a geographic information system to forecast the value of a bundle of ecosystem services, and quantifies the uncertainties related to the outcomes in a spatially explicit manner. In a case study in the Swiss Alps, we demonstrate that mapping uncertainties in ecosystem services assessments provides key information for decision-makers seeking critical areas in the delivery of ecosystem services. The results suggest that not only is the total value of the bundle of ecosystem services highly dependent on uncertainties, but the spatial pattern of the ecosystem services values also changes substantially when uncertainties are considered. This is particularly important for the long-term management of mountain forest ecosystems, which have long rotation stands and are highly sensitive to pressing climate and socio-economic changes. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. The Effect of Rainfall Measurement Technique and Its Spatiotemporal Resolution on Discharge Predictions in the Netherlands

    NASA Astrophysics Data System (ADS)

    Uijlenhoet, R.; Brauer, C.; Overeem, A.; Sassi, M.; Rios Gaona, M. F.

    2014-12-01

    Several rainfall measurement techniques are available for hydrological applications, each with its own spatial and temporal resolution. We investigated the effect of these spatiotemporal resolutions on discharge simulations in lowland catchments by forcing a rainfall-runoff model with rainfall data from gauges, radars and microwave links. The hydrological model used for this analysis is the recently developed Wageningen Lowland Runoff Simulator (WALRUS), a rainfall-runoff model accounting for hydrological processes relevant to areas with shallow groundwater (e.g., groundwater-surface water feedback). Here, we used WALRUS for case studies in a freely draining lowland catchment and in a polder with controlled water levels. We used rain gauge networks with automatic gauges (hourly resolution but low spatial density) and manual gauges (high spatial density but daily resolution). Operational (real-time) and climatological (gauge-adjusted) C-band radar products and country-wide rainfall maps derived from microwave link data from a cellular telecommunication network were also used. Discharges simulated with these different inputs were compared to observations. We also investigated the effect of spatiotemporal resolution with a high-resolution X-band radar data set for catchments of different sizes. Uncertainty in rainfall forcing is a major source of uncertainty in discharge predictions, both with lumped and with distributed models. For lumped rainfall-runoff models, the main source of input uncertainty is associated with the way in which (effective) catchment-average rainfall is estimated. When catchments are divided into sub-catchments, rainfall spatial variability can become more important, especially during convective rainfall events, leading to spatially varying catchment wetness and spatially varying contributions of quick flow routes. Improving rainfall measurements and their spatiotemporal resolution can improve the performance of rainfall-runoff models, indicating their potential for reducing flood damage through real-time control.

  16. Assessment of Observational Uncertainty in Extreme Precipitation Events over the Continental United States

    NASA Astrophysics Data System (ADS)

    Slinskey, E. A.; Loikith, P. C.; Waliser, D. E.; Goodman, A.

    2017-12-01

    Extreme precipitation events are associated with numerous societal and environmental impacts. Furthermore, anthropogenic climate change is projected to alter precipitation intensity across portions of the continental United States (CONUS). Therefore, a spatial understanding of, and an intuitive means of monitoring, extreme precipitation over time is critical. Towards this end, we apply an event-based indicator, developed as part of NASA's support of the ongoing efforts of the US National Climate Assessment, which assigns categories to extreme precipitation events based on 3-day storm totals, as a basis for dataset intercomparison. To assess observational uncertainty across a wide range of historical precipitation measurement approaches, we intercompare in situ station data from the Global Historical Climatology Network (GHCN), satellite-derived precipitation data from NASA's Tropical Rainfall Measuring Mission (TRMM), gridded in situ station data from the Parameter-elevation Regressions on Independent Slopes Model (PRISM), global reanalysis from NASA's Modern Era Retrospective-Analysis version 2 (MERRA-2), and regional reanalysis with gauge data assimilation from NCEP's North American Regional Reanalysis (NARR). Results suggest considerable variability across the five-dataset suite in the frequency, spatial extent, and magnitude of extreme precipitation events. Consistent with expectations, higher-resolution datasets were found to resemble station data best and to capture a greater frequency of high-end extreme events relative to lower-resolution datasets. The degree of dataset agreement varies regionally; however, all datasets successfully capture the seasonal cycle of precipitation extremes across the CONUS. These intercomparison results provide additional insight into observational uncertainty and the ability of a range of precipitation measurement and analysis products to capture extreme precipitation event climatology. While the event category threshold is fixed in this analysis, preliminary results from the development of a flexible categorization scheme that scales with grid resolution are presented.

  17. A study protocol to evaluate the relationship between outdoor air pollution and pregnancy outcomes

    PubMed Central

    2010-01-01

    Background The present study protocol is designed to assess the relationship between outdoor air pollution and low birth weight and preterm birth outcomes by performing a semi-ecological analysis. Semi-ecological design studies are widely used to assess the effects of air pollution in humans. In this type of analysis, health outcomes and covariates are measured in individuals, and exposure assignments are usually based on air quality monitoring stations. Therefore, estimating individual exposures is one of the major challenges when investigating these relationships with a semi-ecologic design. Methods/Design A semi-ecologic study, consisting of a retrospective cohort study with ecologic assignment of exposure, is applied. Health outcomes and covariates are collected at the Primary Health Care Center. Data from the pregnancy registry, clinical records and a specific questionnaire administered orally to the mothers of children born in the period 2007-2010 in the Portuguese Alentejo Litoral region are collected by the research team. Outdoor air pollution data are collected with a lichen diversity biomonitoring program, and individual pregnancy exposures are assessed with spatial geostatistical simulation, which provides the basis for uncertainty analysis of individual exposures. Awareness of outdoor air pollution uncertainty will improve the validity of individual exposure assignments for further statistical analysis with multivariate regression models. Discussion Exposure misclassification is an issue of concern in semi-ecological designs. In this study, personal exposures are assigned to each pregnant woman using geocoded address data. A stochastic simulation method is applied to the lichen diversity value index measured at biomonitoring survey locations, in order to assess the spatial uncertainty of the lichen diversity value index at each geocoded address. These methods assume a model for the spatial autocorrelation of exposure and provide a distribution of exposures at each study location. We believe that the variability of simulated exposure values at geocoded addresses will improve knowledge of the variability of exposures, thereby improving the validity of the individual exposures input to the subsequent statistical analysis. PMID:20950449

  18. A study protocol to evaluate the relationship between outdoor air pollution and pregnancy outcomes.

    PubMed

    Ribeiro, Manuel C; Pereira, Maria J; Soares, Amílcar; Branquinho, Cristina; Augusto, Sofia; Llop, Esteve; Fonseca, Susana; Nave, Joaquim G; Tavares, António B; Dias, Carlos M; Silva, Ana; Selemane, Ismael; de Toro, Joaquin; Santos, Mário J; Santos, Fernanda

    2010-10-15

    The present study protocol is designed to assess the relationship between outdoor air pollution and low birth weight and preterm birth outcomes by performing a semi-ecological analysis. Semi-ecological design studies are widely used to assess the effects of air pollution in humans. In this type of analysis, health outcomes and covariates are measured in individuals, and exposure assignments are usually based on air quality monitoring stations. Therefore, estimating individual exposures is one of the major challenges when investigating these relationships with a semi-ecologic design. A semi-ecologic study, consisting of a retrospective cohort study with ecologic assignment of exposure, is applied. Health outcomes and covariates are collected at the Primary Health Care Center. Data from the pregnancy registry, clinical records and a specific questionnaire administered orally to the mothers of children born in the period 2007-2010 in the Portuguese Alentejo Litoral region are collected by the research team. Outdoor air pollution data are collected with a lichen diversity biomonitoring program, and individual pregnancy exposures are assessed with spatial geostatistical simulation, which provides the basis for uncertainty analysis of individual exposures. Awareness of outdoor air pollution uncertainty will improve the validity of individual exposure assignments for further statistical analysis with multivariate regression models. Exposure misclassification is an issue of concern in semi-ecological designs. In this study, personal exposures are assigned to each pregnant woman using geocoded address data. A stochastic simulation method is applied to the lichen diversity value index measured at biomonitoring survey locations, in order to assess the spatial uncertainty of the lichen diversity value index at each geocoded address. These methods assume a model for the spatial autocorrelation of exposure and provide a distribution of exposures at each study location. We believe that the variability of simulated exposure values at geocoded addresses will improve knowledge of the variability of exposures, thereby improving the validity of the individual exposures input to the subsequent statistical analysis.

  19. Assessing the importance of rainfall uncertainty on hydrological models with different spatial and temporal scale

    NASA Astrophysics Data System (ADS)

    Nossent, Jiri; Pereira, Fernando; Bauwens, Willy

    2015-04-01

    Precipitation is one of the key inputs for hydrological models. As long as the values of the hydrological model parameters are fixed, a variation of the rainfall input is expected to induce a change in the model output. Given the increased awareness of uncertainty in rainfall records, it becomes more important to understand the impact of this input-output dynamic. Yet modellers often still intend to mimic the observed flow, whatever the deviation of the employed records from the actual rainfall might be, by recklessly adapting the model parameter values. But is it actually possible to vary the model parameter values in such a way that a certain (observed) model output can be generated from inaccurate rainfall inputs? In other words, how important is the rainfall uncertainty for the model output relative to the importance of the model parameters? To address this question, we apply the Sobol' sensitivity analysis method to assess and compare the importance of the rainfall uncertainty and of the model parameters for the output of the hydrological model. In order to treat the regular model parameters and the input uncertainty in the same way, and to allow a comparison of their influence, a possible approach is to represent the rainfall uncertainty by a parameter. To this end, we apply so-called rainfall multipliers on hydrologically independent storm events, as a probabilistic parameter representation of the possible rainfall variation. As available rainfall records are very often point measurements at a discrete time step (hourly, daily, monthly, …), they contain uncertainty due to a latent lack of spatial and temporal variability. The influence of the latter variability can also differ for hydrological models of different spatial and temporal scale. Therefore, we perform the sensitivity analyses on a semi-distributed model (SWAT) and a lumped model (NAM). The assessment and comparison of the importance of the rainfall uncertainty and the model parameters is achieved by considering different scenarios for the included parameters and the state of the models.

  20. Using heat as a tracer to estimate spatially distributed mean residence times in the hyporheic zone

    NASA Astrophysics Data System (ADS)

    Naranjo, R. C.; Pohll, G. M.; Stone, M. C.; Niswonger, R. G.; McKay, W. A.

    2013-12-01

    Biogeochemical reactions that occur in the hyporheic zone are highly dependent on the time solutes are in contact with riverbed sediments. In this investigation, we developed a two-dimensional longitudinal flow and solute transport model to estimate the spatial distribution of mean residence time in the hyporheic zone along a riffle-pool sequence, to gain a better understanding of nitrogen reactions. The model, which accounts for the mixing of ages under advection and dispersion, was calibrated using observations of temperature and pressure. Uncertainty of the flow and transport parameters was evaluated using standard Monte Carlo analysis and the generalized likelihood uncertainty estimation method. Results of parameter estimation indicate the presence of a low-permeability zone in the riffle area that induced horizontal flow at shallow depth; this establishes shallow, localized flow paths and limits deep vertical exchange. From the optimal model, mean residence times were found to be relatively long (9-40 days). The uncertainty of hydraulic conductivity resulted in a mean interquartile range of 13 days across all piezometers, and was reduced by 24% with the inclusion of temperature and pressure observations. To a lesser extent, uncertainty in streambed porosity and dispersivity resulted in mean interquartile ranges of 2.2 and 4.7 days, respectively. Alternative conceptual models demonstrate the importance of accounting for the spatial distribution of hydraulic conductivity in simulating mean residence times in a riffle-pool sequence. It is demonstrated that spatially variable mean residence time beneath a riffle-pool system does not conform to simple conceptual models of hyporheic flow through a riffle-pool sequence; rather, the mixing behavior between the river and the hyporheic flow is largely controlled by layered heterogeneity and anisotropy of the subsurface.

  1. Development of Semi-distributed ecohydrological model in the Rio Grande De Manati River Basin, Puerto Rico

    NASA Astrophysics Data System (ADS)

    Setegn, S. G.; Ortiz, J.; Melendez, J.; Barreto, M.; Torres-Perez, J. L.; Guild, L. S.

    2015-12-01

    There are limited studies in Puerto Rico that show the availability and variability of water resources under changing climate and land use. The main goal of the NASA-funded project Human Impacts to Coastal Ecosystems in Puerto Rico (HICE-PR), centred on the Río Loco watershed (southwest coast of Puerto Rico), is to evaluate the impacts of land use/land cover changes on the quality and extent of coastal and marine ecosystems (CMEs) in two priority watersheds in Puerto Rico (Manatí and Guánica). The main objective of this study is to set up a physically based, spatially distributed hydrological model, the Soil and Water Assessment Tool (SWAT), for the analysis of hydrological processes in the Rio Grande de Manati river basin. SWAT is a spatially distributed watershed model developed to predict the impact of land management practices on water, sediment and agricultural chemical yields in large, complex watersheds. For efficient use of distributed models in hydrological and scenario analyses, it is important that these models pass through careful calibration and uncertainty analysis. The model was calibrated and validated using the Sequential Uncertainty Fitting (SUFI-2) calibration and uncertainty analysis algorithm. The model evaluation statistics for streamflow prediction show good agreement between measured and simulated flows, verified by coefficients of determination and Nash-Sutcliffe efficiencies greater than 0.5. Keywords: hydrological modeling; SWAT; SUFI-2; Rio Grande de Manati; Puerto Rico
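
    Since the acceptance criterion above rests on the Nash-Sutcliffe efficiency, here is a minimal Python implementation of that statistic; the flow values are invented, not data from the Manati basin.

    ```python
    import numpy as np

    def nash_sutcliffe(obs, sim):
        """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means no better
        than the observed mean; > 0.5 is the acceptance level cited above."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    # Illustrative daily streamflows (m^3/s).
    obs = np.array([12.0, 15.0, 30.0, 22.0, 18.0, 14.0])
    sim = np.array([11.0, 16.5, 26.0, 24.0, 17.0, 13.5])
    print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")
    ```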

  2. Evaluation of single and multiple Doppler lidar techniques to measure complex flow during the XPIA field campaign

    NASA Astrophysics Data System (ADS)

    Choukulkar, Aditya; Brewer, W. Alan; Sandberg, Scott P.; Weickmann, Ann; Bonin, Timothy A.; Hardesty, R. Michael; Lundquist, Julie K.; Delgado, Ruben; Valerio Iungo, G.; Ashton, Ryan; Debnath, Mithu; Bianco, Laura; Wilczak, James M.; Oncley, Steven; Wolfe, Daniel

    2017-01-01

    Accurate three-dimensional information of wind flow fields can be an important tool in not only visualizing complex flow but also understanding the underlying physical processes and improving flow modeling. However, a thorough analysis of the measurement uncertainties is required to properly interpret results. The XPIA (eXperimental Planetary boundary layer Instrumentation Assessment) field campaign conducted at the Boulder Atmospheric Observatory (BAO) in Erie, CO, from 2 March to 31 May 2015 brought together a large suite of in situ and remote sensing measurement platforms to evaluate complex flow measurement strategies. In this paper, measurement uncertainties for different single and multi-Doppler strategies using simple scan geometries (conical, vertical plane and staring) are investigated. The tradeoffs (such as time-space resolution vs. spatial coverage) among the different measurement techniques are evaluated using co-located measurements made near the BAO tower. Sensitivity of the single-/multi-Doppler measurement uncertainties to averaging period is investigated using the sonic anemometers installed on the BAO tower as the standard reference. Finally, the radiometer measurements are used to partition the measurement periods as a function of atmospheric stability to determine their effect on measurement uncertainty. It was found that with an increase in spatial coverage and measurement complexity, the uncertainty in the wind measurement also increased. For multi-Doppler techniques, the increase in uncertainty for temporally uncoordinated measurements is possibly due to requiring additional assumptions of stationarity along with horizontal homogeneity and less representative line-of-sight velocity statistics. It was also found that wind speed measurement uncertainty was lower during stable conditions compared to unstable conditions.

  3. Assessing uncertainty in high-resolution spatial climate data across the US Northeast.

    PubMed

    Bishop, Daniel A; Beier, Colin M

    2013-01-01

    Local and regional-scale knowledge of climate change is needed to model ecosystem responses, assess vulnerabilities and devise effective adaptation strategies. High-resolution gridded historical climate (GHC) products address this need, but come with multiple sources of uncertainty that are typically not well understood by data users. To better understand this uncertainty in a region with a complex climatology, we conducted a ground-truthing analysis of two 4 km GHC temperature products (PRISM and NRCC) for the US Northeast using 51 Cooperative Network (COOP) weather stations utilized by both GHC products. We estimated GHC prediction error for monthly temperature means and trends (1980-2009) across the US Northeast and evaluated any landscape effects (e.g., elevation, distance from coast) on those prediction errors. Results indicated that station-based prediction errors for the two GHC products were similar in magnitude, but on average, the NRCC product predicted cooler than observed temperature means and trends, while PRISM was cooler for means and warmer for trends. We found no evidence for systematic sources of uncertainty across the US Northeast, although errors were largest at high elevations. Errors in the coarse-scale (4 km) digital elevation models used by each product were correlated with temperature prediction errors, more so for NRCC than PRISM. In summary, uncertainty in spatial climate data has many sources and we recommend that data users develop an understanding of uncertainty at the appropriate scales for their purposes. To this end, we demonstrate a simple method for utilizing weather stations to assess local GHC uncertainty and inform decisions among alternative GHC products.
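
    The ground-truthing idea used here (difference gridded-product statistics against station statistics) reduces to a few lines of arithmetic. The series below are synthetic stand-ins for a COOP station and a GHC product, not real data:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic 1980-2009 July means at one station (deg C) and a cool-biased product
    years = np.arange(1980, 2010)
    station = 20.0 + 0.03 * (years - 1980) + rng.normal(0, 0.6, years.size)
    gridded = station - 0.4 + rng.normal(0, 0.3, years.size)

    # Prediction error of the 30-year mean
    mean_error = gridded.mean() - station.mean()

    # Prediction error of the trend (deg C per decade), from least-squares slopes
    trend_station = np.polyfit(years, station, 1)[0] * 10
    trend_gridded = np.polyfit(years, gridded, 1)[0] * 10
    print(f"mean error = {mean_error:+.2f} C, "
          f"trend error = {trend_gridded - trend_station:+.3f} C/decade")
    ```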

  4. Estimating the spatial distribution of wintering little brown bat populations in the eastern United States

    USGS Publications Warehouse

    Russell, Robin E.; Tinsley, Karl; Erickson, Richard A.; Thogmartin, Wayne E.; Szymanski, Jennifer A.

    2014-01-01

    Depicting the spatial distribution of wildlife species is an important first step in developing management and conservation programs for particular species. Accurate representation of a species distribution is important for predicting the effects of climate change, land-use change, management activities, disease, and other landscape-level processes on wildlife populations. We developed models to estimate the spatial distribution of little brown bat (Myotis lucifugus) wintering populations in the United States east of the 100th meridian, based on known hibernacula locations. From these data, we developed several scenarios of wintering population counts per county that incorporated uncertainty in the spatial distribution of the hibernacula as well as uncertainty in the size of the current little brown bat population. We then assessed the variability in our results arising from these uncertainties. Despite considerable uncertainty in the known locations of overwintering little brown bats in the eastern United States, we believe that models that accurately depict the effects of this uncertainty are useful for making management decisions, because they coherently organize the best available information.

  5. Uncertainty in predicting soil hydraulic properties at the hillslope scale with indirect methods

    NASA Astrophysics Data System (ADS)

    Chirico, G. B.; Medina, H.; Romano, N.

    2007-02-01

    Several hydrological applications require the characterisation of soil hydraulic properties at large spatial scales. Pedotransfer functions (PTFs) are being developed as simplified methods to estimate soil hydraulic properties, as an alternative to direct measurements, which are unfeasible in most practical circumstances. The objective of this study is to quantify the uncertainty in PTF spatial predictions at the hillslope scale as related to the sampling density, due to: (i) the error in estimated soil physico-chemical properties and (ii) PTF model error. The analysis is carried out on a 2-km-long experimental hillslope in South Italy. The method adopted is based on a stochastic generation of patterns of soil variables using sequential Gaussian simulation, conditioned to the observed sample data. The following PTFs are applied: Vereecken's PTF [Vereecken, H., Diels, J., van Orshoven, J., Feyen, J., Bouma, J., 1992. Functional evaluation of pedotransfer functions for the estimation of soil hydraulic properties. Soil Sci. Soc. Am. J. 56, 1371-1378] and the HYPRES PTF [Wösten, J.H.M., Lilly, A., Nemes, A., Le Bas, C., 1999. Development and use of a database of hydraulic properties of European soils. Geoderma 90, 169-185]. The two PTFs reliably estimate the soil water retention characteristic even for a relatively coarse sampling resolution, with prediction uncertainties comparable to the uncertainties in direct laboratory or field measurements. The uncertainty in soil water retention prediction due to model error is at least as significant as the uncertainty associated with the estimated input, even for a relatively coarse sampling resolution. Prediction uncertainties are much larger when PTFs are applied to estimate the saturated hydraulic conductivity; in this case model error dominates the overall prediction uncertainty, making the effect of input error negligible.

  6. Estimation of uncertainty for contour method residual stress measurements

    DOE PAGES

    Olson, Mitchell D.; DeWald, Adrian T.; Prime, Michael B.; ...

    2014-12-03

    This paper describes a methodology for the estimation of measurement uncertainty for the contour method, where the contour method is an experimental technique for measuring a two-dimensional map of residual stress over a plane. Random error sources including the error arising from noise in displacement measurements and the smoothing of the displacement surfaces are accounted for in the uncertainty analysis. The output is a two-dimensional, spatially varying uncertainty estimate such that every point on the cross-section where residual stress is determined has a corresponding uncertainty value. Both numerical and physical experiments are reported, which are used to support the usefulness of the proposed uncertainty estimator. The uncertainty estimator shows the contour method to have larger uncertainty near the perimeter of the measurement plane. For the experiments, which were performed on a quenched aluminum bar with a cross section of 51 × 76 mm, the estimated uncertainty was approximately 5 MPa (σ/E = 7 · 10⁻⁵) over the majority of the cross-section, with localized areas of higher uncertainty, up to 10 MPa (σ/E = 14 · 10⁻⁵).
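
    A sketch of the Monte Carlo ingredient of such an estimator, on synthetic data: noise is repeatedly re-added to a displacement surface, the smoothing step is re-applied, and the pointwise spread is taken as the spatially varying uncertainty. The real method propagates the displacements through a stress solution, which is omitted here; all values below are invented.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(2)

    # Synthetic "measured" displacement surface on the cut plane (mm)
    ny, nx = 40, 60
    yy, xx = np.mgrid[0:ny, 0:nx]
    disp = 0.002 * np.sin(np.pi * xx / nx) * np.sin(np.pi * yy / ny)

    # Monte Carlo: re-add measurement noise, re-smooth, and track the spread
    n_mc, noise_sigma = 200, 2e-4
    stack = np.empty((n_mc, ny, nx))
    for i in range(n_mc):
        noisy = disp + rng.normal(0, noise_sigma, (ny, nx))
        stack[i] = gaussian_filter(noisy, sigma=3)   # stand-in for the spline fit

    # One uncertainty value per point on the cross-section
    disp_uncertainty = stack.std(axis=0)
    print(f"interior: {disp_uncertainty[20, 30]:.1e} mm, "
          f"edge: {disp_uncertainty[0, 30]:.1e} mm")
    ```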

  7. Monthly Fossil-Fuel CO2 Emissions: Uncertainty of Emissions Gridded by One Degree Latitude by One Degree Longitude (Uncertainties, V.2016)

    DOE Data Explorer

    Andres, R.J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Boden, T.A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-01-01

    The monthly, gridded fossil-fuel CO2 emissions uncertainty estimates from 1950-2013 provided in this database are derived from time series of global, regional, and national fossil-fuel CO2 emissions (Boden et al. 2016). Andres et al. (2016) describe the basic methodology for estimating the uncertainty in the gridded fossil-fuel data product. This uncertainty is gridded at the same spatial and temporal scales as the mass magnitude maps. This gridded uncertainty includes uncertainty contributions from the spatial, temporal, proxy, and magnitude components used to create the magnitude map of FFCO2 emissions. Throughout this process, when assumptions had to be made or expert judgment employed, the general tendency in most cases was toward overestimating or increasing the magnitude of uncertainty.

  8. Uncertainty in mixing models: a blessing in disguise?

    NASA Astrophysics Data System (ADS)

    Delsman, J. R.; Oude Essink, G. H. P.

    2012-04-01

    Despite the abundance of tracer-based studies in catchment hydrology over the past decades, relatively few of these studies have addressed the associated uncertainty in much detail. This uncertainty stems from analytical error, spatial and temporal variance in end-member composition, and from not incorporating all relevant processes in the necessarily simplistic mixing models. Instead of applying standard EMMA methodology, we used end-member mixing analysis within a Monte Carlo framework to quantify the uncertainty surrounding our analysis. Borrowing from the well-known GLUE methodology, we discarded mixing models that could not satisfactorily explain sample concentrations and analyzed the posterior parameter set. This use of environmental tracers aided in disentangling hydrological pathways in a Dutch polder catchment. This 10 km2 agricultural catchment is situated in the coastal region of the Netherlands. Brackish groundwater seepage, originating from Holocene marine transgressions, adversely affects water quality in this catchment. Current water management practice is aimed at improving water quality by flushing the catchment with fresh water from the river Rhine. Climate change is projected to decrease future fresh water availability, signifying the need for more sustainable water management practice and a better understanding of the functioning of the catchment. The end-member mixing analysis increased our understanding of the hydrology of the studied catchment. The use of a GLUE-like framework for the end-member mixing analysis not only quantified the uncertainty associated with the analysis; examination of the posterior parameter set also identified catchment processes that would otherwise have been overlooked.
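
    A hedged sketch of the approach: mixing fractions are solved by non-negative least squares with a sum-to-one row, inside a Monte Carlo loop that perturbs end-member and sample concentrations, with a GLUE-like acceptance cut. The tracers, end-member values, and thresholds are invented for illustration.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(1)

    # Rows: tracers (Cl in mg/L, EC in uS/cm); columns: end-members
    # (rainwater, fresh surface water, brackish seepage) -- hypothetical values
    E = np.array([[5.0,   40.0,  900.0],
                  [50.0, 450.0, 3500.0]])
    sample = np.array([300.0, 1400.0])     # measured mixed-water sample

    scale = np.append(sample, 1.0)         # normalize rows to comparable size
    fractions = []
    for _ in range(5000):
        E_mc = E * rng.normal(1.0, 0.15, E.shape)   # end-member variability
        b_mc = sample * rng.normal(1.0, 0.05, 2)    # analytical error
        A = np.vstack([E_mc, np.ones(3)]) / scale[:, None]
        b = np.append(b_mc, 1.0) / scale            # sum-to-one constraint
        f, residual = nnls(A, b)
        if residual < 0.1:                 # GLUE-like behavioral cut-off
            fractions.append(f)

    fractions = np.array(fractions)
    print("posterior median fractions:", np.round(np.median(fractions, axis=0), 2))
    ```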

  9. Optimal Groundwater Extraction under Uncertainty and a Spatial Stock Externality

    EPA Science Inventory

    We introduce a model that incorporates two important elements to estimating welfare gains from groundwater management: stochasticity and a spatial stock externality. We estimate welfare gains resulting from optimal management under uncertainty as well as a gradual stock externali...

  10. Statistical considerations in creating water vapor data records from combinations of satellite and other observation types, including in situ and ground-based remote sensing

    NASA Astrophysics Data System (ADS)

    Dykema, J. A.; Anderson, J. G.

    2014-12-01

    Measuring water vapor at the highest spatial and temporal resolution at all vertical levels and at arbitrary times requires strategic utilization of disparate observations from satellites, ground-based remote sensing, and in situ measurements. These different measurement types have different response times and very different spatial averaging properties, both horizontally and vertically. Accounting for these different measurement properties and explicitly propagating the associated uncertainties is necessary to test particular scientific hypotheses, especially for the detection of weak signals in the presence of natural fluctuations and for process studies with small ensembles. This is also true where ancillary data from meteorological analyses are required, which have their own sampling limitations and uncertainties. This study reviews two investigations pertaining to measurements of water vapor in the mid-troposphere and lower stratosphere that mix satellite observations with observations from other sources. The focus of the mid-troposphere analysis is to obtain improved estimates of water vapor at the instant of a sounding satellite overpass. The lower stratosphere work examines the uncertainty inherent in a small ensemble of anomalously elevated lower stratospheric water vapor observations when meteorological analysis products and aircraft in situ observations are required for interpretation.

  11. Uncertainty Analysis in the Creation of a Fine-Resolution Leaf Area Index (LAI) Reference Map for Validation of Moderate Resolution LAI Products

    EPA Science Inventory

    The validation process for a moderate resolution leaf area index (LAI) product (i.e., MODIS) involves the creation of a high spatial resolution LAI reference map (Lai-RM), which when scaled to the moderate LAI resolution (i.e., >1 km) allows for comparison and analysis with this ...

  12. Measuring high-density built environment for public health research: Uncertainty with respect to data, indicator design and spatial scale.

    PubMed

    Sun, Guibo; Webster, Chris; Ni, Michael Y; Zhang, Xiaohu

    2018-05-07

    Uncertainty with respect to built environment (BE) data collection, measure conceptualization and spatial scales is evident in urban health research, but most findings come from relatively low-density contexts. We selected Hong Kong, an iconic high-density city, as the study area, as limited research has been conducted on uncertainty in such areas. We used geocoded home addresses (n=5732) from a large population-based cohort in Hong Kong to extract BE measures for the participants' places of residence based on an internationally recognized BE framework. Variability of the measures was mapped, and Spearman's rank correlations were calculated to assess how well the relationships among indicators are preserved across variables and spatial scales. We found extreme variations and uncertainties for the 180 measures collected using comprehensive data and advanced geographic information systems modelling techniques. We highlight the implications of methodological selection and of the spatial scales of the measures. The results suggest that more robust information for urban health research in high-density cities would emerge if greater consideration were given to BE data, design methods and the spatial scales of BE measures.
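
    The cross-scale consistency check described here boils down to rank correlations between the same indicator computed at different buffer radii. A minimal sketch with synthetic densities (the radii and values are arbitrary):

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(4)

    # Synthetic intersection densities for 500 addresses at three buffer radii
    base = rng.gamma(5.0, 20.0, 500)
    buffers = {
        "200m":  base + rng.normal(0, 15, 500),
        "500m":  base + rng.normal(0, 30, 500),
        "1000m": base + rng.normal(0, 60, 500),
    }

    # How well is the ranking of addresses preserved across spatial scales?
    rho_near, _ = spearmanr(buffers["200m"], buffers["500m"])
    rho_far, _ = spearmanr(buffers["200m"], buffers["1000m"])
    print(f"rho(200m, 500m) = {rho_near:.2f}; rho(200m, 1000m) = {rho_far:.2f}")
    ```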

  13. Web-based access, aggregation, and visualization of future climate projections with emphasis on agricultural assessments

    NASA Astrophysics Data System (ADS)

    Villoria, Nelson B.; Elliott, Joshua; Müller, Christoph; Shin, Jaewoo; Zhao, Lan; Song, Carol

    2018-01-01

    Access to climate and spatial datasets by non-specialists is restricted by technical barriers involving hardware, software and data formats. We discuss an open-source online tool that facilitates downloading the climate data from the global circulation models used by the Inter-Sectoral Impacts Model Intercomparison Project. The tool also offers temporal and spatial aggregation capabilities for incorporating future climate scenarios in applications where spatial aggregation is important. We hope that streamlined access to these data facilitates analysis of climate related issues while considering the uncertainties derived from future climate projections and temporal aggregation choices.

  14. Probabilistic stability analysis: the way forward for stability analysis of sustainable power systems.

    PubMed

    Milanović, Jovica V

    2017-08-13

    Future power systems will be significantly different compared with their present states. They will be characterized by an unprecedented mix of a wide range of electricity generation and transmission technologies, as well as responsive and highly flexible demand and storage devices with significant temporal and spatial uncertainty. The importance of probabilistic approaches towards power system stability analysis, as a subsection of power system studies routinely carried out by power system operators, has been highlighted in previous research. However, it may not be feasible (or even possible) to accurately model all of the uncertainties that exist within a power system. This paper describes for the first time an integral approach to probabilistic stability analysis of power systems, including small and large angular stability and frequency stability. It provides guidance for handling uncertainties in power system stability studies and some illustrative examples of the most recent results of probabilistic stability analysis of uncertain power systems.

  15. Understanding the origins of uncertainty in landscape-scale variations of emissions of nitrous oxide

    NASA Astrophysics Data System (ADS)

    Milne, Alice; Haskard, Kathy; Webster, Colin; Truan, Imogen; Goulding, Keith

    2014-05-01

    Nitrous oxide is a potent greenhouse gas which is over 300 times more radiatively effective than carbon dioxide. In the UK, the agricultural sector is estimated to be responsible for over 80% of nitrous oxide emissions, with these emissions resulting from livestock and farmers adding nitrogen fertilizer to soils. For the purposes of reporting emissions to the IPCC, the estimates are calculated using simple models whereby readily-available national or international statistics are combined with IPCC default emission factors. The IPCC emission factor for direct emissions of nitrous oxide from soils has a very large uncertainty. This is primarily because the variability of nitrous oxide emissions in space is large and this results in uncertainty that may be regarded as sample noise. To both reduce uncertainty through improved modelling, and to communicate an understanding of this uncertainty, we must understand the origins of the variation. We analysed data on nitrous oxide emission rate and some other soil properties collected from a 7.5-km transect across contrasting land uses and parent materials in eastern England. We investigated the scale-dependence and spatial uniformity of the correlations between soil properties and emission rates from farm to landscape scale using wavelet analysis. The analysis revealed a complex pattern of scale-dependence. Emission rates were strongly correlated with a process-specific function of the water-filled pore space at the coarsest scale and nitrate at intermediate and coarsest scales. We also found significant correlations between pH and emission rates at the intermediate scales. The wavelet analysis showed that these correlations were not spatially uniform and that at certain scales changes in parent material coincided with significant changes in correlation. Our results indicate that, at the landscape scale, nitrate content and water-filled pore space are key soil properties for predicting nitrous oxide emissions and should therefore be incorporated into process models and emission factors for inventory calculations.

  16. Effects of Buffer Size and Shape on Associations between the Built Environment and Energy Balance

    PubMed Central

    Berrigan, David; Hart, Jaime E.; Hipp, J. Aaron; Hoehner, Christine M.; Kerr, Jacqueline; Major, Jacqueline M.; Oka, Masayoshi; Laden, Francine

    2014-01-01

    Uncertainty in the relevant spatial context may drive heterogeneity in findings on the built environment and energy balance. To estimate the effect of this uncertainty, we conducted a sensitivity analysis defining intersection and business densities and counts within different buffer sizes and shapes on associations with self-reported walking and body mass index. Linear regression results indicated that the scale and shape of buffers influenced study results and may partly explain the inconsistent findings in the built environment and energy balance literature. PMID:24607875

  17. Theoretical analysis on the measurement errors of local 2D DIC: Part I, temporal and spatial uncertainty quantification of displacement measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yueqi; Lava, Pascal; Reu, Phillip

    This study presents a theoretical uncertainty quantification of displacement measurements by subset-based 2D-digital image correlation. A generalized solution to estimate the random error of displacement measurement is presented. The obtained solution suggests that the random error of displacement measurements is determined by the image noise, the summation of the intensity gradient in a subset, the subpixel part of displacement, and the interpolation scheme. The proposed method is validated with virtual digital image correlation tests.
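
    The stated dependence on image noise, the subset's intensity-gradient sum, and the subpixel displacement is often summarized, to first order, by an expression of the following general form (a reconstruction for orientation, not a quotation of the paper's exact result):

    ```latex
    \sigma_u \;\approx\; \frac{\sqrt{2}\,\sigma_I}{\sqrt{\sum_{\text{subset}} \left(\frac{\partial f}{\partial x}\right)^{2}}}
    ```

    Here sigma_I is the standard deviation of the grey-level noise and df/dx the local intensity gradient summed over the subset; the interpolation scheme and the subpixel part of the displacement modulate this baseline.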

  18. Theoretical analysis on the measurement errors of local 2D DIC: Part I, temporal and spatial uncertainty quantification of displacement measurements

    DOE PAGES

    Wang, Yueqi; Lava, Pascal; Reu, Phillip; ...

    2015-12-23

    This study presents a theoretical uncertainty quantification of displacement measurements by subset-based 2D-digital image correlation. A generalized solution to estimate the random error of displacement measurement is presented. The obtained solution suggests that the random error of displacement measurements is determined by the image noise, the summation of the intensity gradient in a subset, the subpixel part of displacement, and the interpolation scheme. The proposed method is validated with virtual digital image correlation tests.

  19. Exploring prediction uncertainty of spatial data in geostatistical and machine learning approaches

    NASA Astrophysics Data System (ADS)

    Klump, J. F.; Fouedjio, F.

    2017-12-01

    Geostatistical methods such as kriging with external drift, as well as machine learning techniques such as quantile regression forest, have been used intensively for modelling spatial data. In addition to providing predictions for target variables, both approaches are able to deliver a quantification of the uncertainty associated with the prediction at a target location. Geostatistical approaches are inherently suited to providing such prediction uncertainties, and their behaviour is well understood. However, they often require significant data pre-processing and rely on assumptions that are rarely met in practice. Machine learning algorithms such as random forest regression, on the other hand, require less data pre-processing and are non-parametric. This makes the application of machine learning algorithms to geostatistical problems an attractive proposition. The objective of this study is to compare kriging with external drift and quantile regression forest with respect to their ability to deliver reliable prediction uncertainties for spatial data. In our comparison we use both simulated and real-world datasets. Apart from classical performance indicators, the comparisons make use of accuracy plots, probability interval width plots, and visual examination of the uncertainty maps provided by the two approaches. By comparing random forest regression to kriging, we found that both methods produced comparable maps of estimated values for our variables of interest. However, the measure of uncertainty provided by random forest seems to be quite different from the measure of uncertainty provided by kriging. In particular, the lack of spatial context can give misleading results in areas without ground truth data. These preliminary results raise questions about assessing the risks associated with decisions based on the predictions from geostatistical and machine learning algorithms in a spatial context, e.g. mineral exploration.
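
    As a rough illustration of the contrast drawn here, the spread of per-tree predictions in a random forest can serve as a crude proxy for quantile-regression-forest intervals (a simplification: true QRF uses the full conditional distribution over leaf samples). Note that the interval barely widens at the extrapolation point, which is the "lack of spatial context" issue; kriging variance would grow with distance from the data. All data below are synthetic.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(7)

    # Synthetic spatial data: coordinates -> target with a trend plus noise
    X = rng.uniform(0, 100, (400, 2))                 # easting, northing (m)
    y = 0.05 * X[:, 0] + np.sin(X[:, 1] / 15.0) + rng.normal(0, 0.2, 400)

    rf = RandomForestRegressor(n_estimators=500, min_samples_leaf=5, random_state=0)
    rf.fit(X, y)

    # One point inside the sampled area, one far outside it
    targets = np.array([[50.0, 50.0], [150.0, 150.0]])

    # Approximate prediction intervals from the per-tree spread
    per_tree = np.stack([tree.predict(targets) for tree in rf.estimators_])
    lo, hi = np.percentile(per_tree, [5, 95], axis=0)
    for pt, l, h in zip(targets, lo, hi):
        print(f"point {pt}: 90% interval [{l:.2f}, {h:.2f}]")
    ```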

  20. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sig Drellack, Lance Prothro

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.

  1. Cloudy Windows: What GCM Ensembles, Reanalyses and Observations Tell Us About Uncertainty in Greenland's Future Climate and Surface Melting

    NASA Astrophysics Data System (ADS)

    Reusch, D. B.

    2016-12-01

    Any analysis that wants to use a GCM-based scenario of future climate benefits from knowing how much uncertainty the GCM's inherent variability adds to the development of climate change predictions. This is especially relevant in the polar regions due to the potential of global impacts (e.g., sea level rise) from local (ice sheet) climate changes such as more frequent/intense surface melting. High-resolution, regional-scale models using GCMs for boundary/initial conditions in future scenarios inherit a measure of GCM-derived externally-driven uncertainty. We investigate these uncertainties for the Greenland ice sheet using the 30-member CESM1.0-CAM5-BGC Large Ensemble (CESMLE) for recent (1981-2000) and future (2081-2100, RCP 8.5) decades. Recent simulations are skill-tested against the ERA-Interim reanalysis and AWS observations, with results informing future scenarios. We focus on key variables influencing surface melting through decadal climatologies, nonlinear analysis of variability with self-organizing maps (SOMs), regional-scale modeling (Polar WRF), and simple melt models. Relative to the ensemble average, spatially averaged climatological July temperature anomalies over a Greenland ice-sheet/ocean domain are mostly between +/- 0.2 °C. The spatial average hides larger local anomalies of up to +/- 2 °C. The ensemble average itself is 2 °C cooler than ERA-Interim. SOMs extend our diagnostics by providing a concise, objective summary of model variability as a set of generalized patterns. For CESMLE, the SOM patterns summarize the variability of multiple realizations of climate. Changes in pattern frequency by ensemble member show the influence of initial conditions. For example, basic statistical analysis of pattern frequency yields interquartile ranges of 2-4% for individual patterns across the ensemble. In climate terms, this tells us about climate state variability through the range of the ensemble, a potentially significant source of melt-prediction uncertainty. SOMs can also capture the different trajectories of climate due to intramodel variability over time. Polar WRF provides higher resolution regional modeling with improved, polar-centric model physics. Simple melt models allow us to characterize impacts of the upstream uncertainties on estimates of surface melting.

  2. Uncertainty of High Intensity Therapeutic Ultrasound (HITU) Field Characterization with Hydrophones: Effects of Nonlinearity, Spatial Averaging, and Complex Sensitivity

    PubMed Central

    Liu, Yunbo; Wear, Keith A.; Harris, Gerald R.

    2017-01-01

    Reliable acoustic characterization is fundamental for patient safety and clinical efficacy during high intensity therapeutic ultrasound (HITU) treatment. Technical challenges, such as measurement uncertainty and signal analysis, still exist for HITU exposimetry using ultrasound hydrophones. In this work, four hydrophones were compared for pressure measurement: a robust needle hydrophone, a small PVDF capsule hydrophone and two different fiber-optic hydrophones. The focal waveform and beam distribution of a single-element HITU transducer (1.05 MHz and 3.3 MHz) were evaluated. Complex deconvolution between the hydrophone voltage signal and the frequency-dependent complex sensitivity was performed to obtain the pressure waveform. Compressional pressure, rarefactional pressure, and focal beam distribution were compared up to 10.6/−6.0 MPa (p+ and p−) (1.05 MHz) and 20.65/−7.20 MPa (3.3 MHz). In particular, the effects of spatial averaging, local nonlinear distortion, complex deconvolution and hydrophone damage thresholds were investigated. This study showed an uncertainty of no better than 10-15% for hydrophone-based HITU pressure characterization. PMID:28735734
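
    The complex deconvolution step amounts to spectral division by the hydrophone's complex sensitivity, with some regularization where the sensitivity is small. The sketch below uses an invented waveform and sensitivity model M(f), not calibration data:

    ```python
    import numpy as np

    fs = 100e6                              # sampling rate (Hz), assumed
    t = np.arange(2048) / fs

    # Hypothetical distorted focal waveform at 1.05 MHz (volts)
    v = np.sin(2 * np.pi * 1.05e6 * t) + 0.3 * np.sin(2 * np.pi * 2.1e6 * t)

    # Hypothetical frequency-dependent complex sensitivity M(f) in V/Pa
    f = np.fft.rfftfreq(t.size, 1 / fs)
    M = 40e-9 * (1 + 0.05 * (f / 1e6)) * np.exp(-1j * 0.1 * (f / 1e6))

    # Complex deconvolution: divide spectra, regularizing small |M| (Tikhonov-style)
    V = np.fft.rfft(v)
    eps = 1e-3 * np.abs(M).max()
    P = V * np.conj(M) / (np.abs(M) ** 2 + eps ** 2)
    p = np.fft.irfft(P, n=t.size)           # pressure waveform (Pa)

    print(f"peak compressional pressure: {p.max() / 1e6:.2f} MPa")
    ```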

  3. Economic analysis of fuel treatments

    Treesearch

    D. Evan Mercer; Jeffrey P. Prestemon

    2012-01-01

    The economics of wildfire is complicated because wildfire behavior depends on the spatial and temporal scale at which management decisions are made, and because of uncertainties surrounding the results of management actions. Like the wildfire processes they seek to manage, interventions through fire prevention programs, suppression, and fuels management are scale dependent...

  4. Annual Fossil-Fuel CO2 Emissions: Uncertainty of Emissions Gridded by One Degree Latitude by One Degree Longitude (1950-2013) (V. 2016)

    DOE Data Explorer

    Andres, R. J. [CDIAC; Boden, T. A. [CDIAC

    2016-01-01

    The annual, gridded fossil-fuel CO2 emissions uncertainty estimates from 1950-2013 provided in this database are derived from time series of global, regional, and national fossil-fuel CO2 emissions (Boden et al. 2016). Andres et al. (2016) describe the basic methodology for estimating the uncertainty in the gridded fossil-fuel data product. This uncertainty is gridded at the same spatial and temporal scales as the mass magnitude maps. This gridded uncertainty includes uncertainty contributions from the spatial, temporal, proxy, and magnitude components used to create the magnitude map of FFCO2 emissions. Throughout this process, when assumptions had to be made or expert judgment employed, the general tendency in most cases was toward overestimating or increasing the magnitude of uncertainty.

  5. Bayesian Estimation of the Spatially Varying Completeness Magnitude of Earthquake Catalogs

    NASA Astrophysics Data System (ADS)

    Mignan, A.; Werner, M.; Wiemer, S.; Chen, C.; Wu, Y.

    2010-12-01

    Assessing the completeness magnitude Mc of earthquake catalogs is an essential prerequisite for any seismicity analysis. We employ a simple model to compute Mc in space, based on the proximity to seismic stations in a network. We show that a relationship of the form Mcpred(d) = ad^b + c, with d the distance to the 5th nearest seismic station, fits the observations well. We then propose a new Mc mapping approach, the Bayesian Magnitude of Completeness (BMC) method, based on a 2-step procedure: (1) a spatial resolution optimization to minimize spatial heterogeneities and uncertainties in Mc estimates and (2) a Bayesian approach that merges prior information about Mc based on the proximity to seismic stations with locally observed values weighted by their respective uncertainties. This new methodology eliminates most weaknesses associated with current Mc mapping procedures: the radius that defines which earthquakes to include in the local magnitude distribution is chosen according to an objective criterion, and there are no gaps in the spatial estimation of Mc. The method solely requires the coordinates of seismic stations. Here, we investigate the Taiwan Central Weather Bureau (CWB) earthquake catalog by computing an Mc map for the period 1994-2010.
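
    Both BMC ingredients are compact. A sketch assuming scipy and synthetic calibration data: fit Mcpred(d) = ad^b + c, then merge the distance-based prior with a locally observed Mc by precision weighting.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(3)

    def mc_pred(d, a, b, c):
        """Completeness magnitude vs. distance d (km) to the 5th nearest station."""
        return a * d ** b + c

    # Synthetic calibration data standing in for locally observed Mc values
    d_obs = rng.uniform(5.0, 150.0, 200)
    mc_obs = mc_pred(d_obs, 0.01, 1.2, 1.8) + rng.normal(0, 0.15, 200)
    params, _ = curve_fit(mc_pred, d_obs, mc_obs, p0=[0.01, 1.0, 2.0])

    # Bayesian merge at one grid node: precision-weighted prior/observation blend
    prior, prior_var = mc_pred(60.0, *params), 0.04   # from the fitted relationship
    local, local_var = 2.3, 0.09                      # locally observed Mc
    w_p, w_l = 1.0 / prior_var, 1.0 / local_var
    mc_post = (w_p * prior + w_l * local) / (w_p + w_l)
    post_sd = (w_p + w_l) ** -0.5
    print(f"posterior Mc = {mc_post:.2f} +/- {post_sd:.2f}")
    ```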

  6. Evaluation of single and multiple Doppler lidar techniques to measure complex flow during the XPIA field campaign

    DOE PAGES

    Choukulkar, Aditya; Brewer, W. Alan; Sandberg, Scott P.; ...

    2017-01-23

    Accurate three-dimensional information of wind flow fields can be an important tool in not only visualizing complex flow but also understanding the underlying physical processes and improving flow modeling. However, a thorough analysis of the measurement uncertainties is required to properly interpret results. The XPIA (eXperimental Planetary boundary layer Instrumentation Assessment) field campaign conducted at the Boulder Atmospheric Observatory (BAO) in Erie, CO, from 2 March to 31 May 2015 brought together a large suite of in situ and remote sensing measurement platforms to evaluate complex flow measurement strategies. In this paper, measurement uncertainties for different single and multi-Doppler strategies using simple scan geometries (conical, vertical plane and staring) are investigated. The tradeoffs (such as time–space resolution vs. spatial coverage) among the different measurement techniques are evaluated using co-located measurements made near the BAO tower. Sensitivity of the single-/multi-Doppler measurement uncertainties to averaging period is investigated using the sonic anemometers installed on the BAO tower as the standard reference. Finally, the radiometer measurements are used to partition the measurement periods as a function of atmospheric stability to determine their effect on measurement uncertainty. It was found that with an increase in spatial coverage and measurement complexity, the uncertainty in the wind measurement also increased. For multi-Doppler techniques, the increase in uncertainty for temporally uncoordinated measurements is possibly due to requiring additional assumptions of stationarity along with horizontal homogeneity and less representative line-of-sight velocity statistics. Lastly, it was found that wind speed measurement uncertainty was lower during stable conditions compared to unstable conditions.

  7. Quantifying the importance of spatial resolution and other factors through global sensitivity analysis of a flood inundation model

    NASA Astrophysics Data System (ADS)

    Thomas Steven Savage, James; Pianosi, Francesca; Bates, Paul; Freer, Jim; Wagener, Thorsten

    2016-11-01

    Where high-resolution topographic data are available, modelers are faced with the decision of whether it is better to spend computational resource on resolving topography at finer resolutions or on running more simulations to account for various uncertain input factors (e.g., model parameters). In this paper we apply global sensitivity analysis to explore how influential the choice of spatial resolution is when compared to uncertainties in Manning's friction coefficient parameters, the inflow hydrograph, and those stemming from the coarsening of topographic data used to produce Digital Elevation Models (DEMs). We apply the hydraulic model LISFLOOD-FP to produce several temporally and spatially variable model outputs that represent different aspects of flood inundation processes, including flood extent, water depth, and time of inundation. We find that the most influential input factor for flood extent predictions changes during the flood event, starting with the inflow hydrograph during the rising limb before switching to the channel friction parameter during peak flood inundation, and finally to the floodplain friction parameter during the drying phase of the flood event. Spatial resolution and uncertainty introduced by resampling topographic data to coarser resolutions are much more important for water depth predictions, which are also sensitive to different input factors spatially and temporally. Our findings indicate that the sensitivity of LISFLOOD-FP predictions is more complex than previously thought. Consequently, the input factors that modelers should prioritize will differ depending on the model output assessed, and the location and time of when and where this output is most relevant.
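
    A variance-based GSA of this kind can be set up with the SALib package (an assumption; the paper's exact tooling is not stated here). The toy flood_extent function stands in for a LISFLOOD-FP run, and the parameter ranges are illustrative:

    ```python
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    problem = {
        "num_vars": 3,
        "names": ["n_channel", "n_floodplain", "inflow_scale"],
        "bounds": [[0.02, 0.06], [0.04, 0.12], [0.8, 1.2]],
    }

    def flood_extent(x):
        """Toy stand-in for a LISFLOOD-FP run: returns peak flood extent (km^2)."""
        n_ch, n_fp, q = x
        return 40.0 * q / (1.0 + 5.0 * n_ch) + 15.0 * n_fp * q

    X = saltelli.sample(problem, 1024)           # Saltelli sampling design
    Y = np.apply_along_axis(flood_extent, 1, X)
    Si = sobol.analyze(problem, Y)               # first-order and total Sobol indices
    for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
        print(f"{name}: S1 = {s1:.2f}, ST = {st:.2f}")
    ```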

  8. Spatial curvilinear path following control of underactuated AUV with multiple uncertainties.

    PubMed

    Miao, Jianming; Wang, Shaoping; Zhao, Zhiping; Li, Yuan; Tomovic, Mileta M

    2017-03-01

    This paper investigates the problem of spatial curvilinear path following control of underactuated autonomous underwater vehicles (AUVs) with multiple uncertainties. Firstly, in order to design the appropriate controller, a path following error dynamics model is constructed in a moving Serret-Frenet frame, and the five degrees of freedom (DOF) dynamic model with multiple uncertainties is established. Secondly, the proposed control law is separated into a kinematic controller and a dynamic controller via the back-stepping technique. In the case of the kinematic controller, to overcome the drawback of dependence on an accurate vehicle model that is present in a number of path following control strategies described in the literature, the unknown side-slip angular velocity and attack angular velocity are treated as uncertainties. In the case of the dynamic controller, the model parameter perturbations, unknown external environmental disturbances and the nonlinear hydrodynamic damping terms are treated as lumped uncertainties. Both kinematic and dynamic uncertainties are estimated and compensated by designed reduced-order linear extended state observers (LESOs). Thirdly, a feedback linearization (FL) based control law is implemented for the control model using the estimates generated by the reduced-order LESOs. To handle the computational complexity inherent in the conventional back-stepping method, nonlinear tracking differentiators (NTDs) are applied to construct derivatives of the virtual control commands. Finally, closed-loop stability for the overall system is established. Simulation and comparative analysis demonstrate that the proposed controller exhibits enhanced performance in the presence of internal parameter variations, external unknown disturbances, unmodeled nonlinear damping terms, and measurement noises.

  9. Multiobjective design of aquifer monitoring networks for optimal spatial prediction and geostatistical parameter estimation

    NASA Astrophysics Data System (ADS)

    Alzraiee, Ayman H.; Bau, Domenico A.; Garcia, Luis A.

    2013-06-01

    Effective sampling of hydrogeological systems is essential in guiding groundwater management practices. Optimal sampling of groundwater systems has previously been formulated based on the assumption that heterogeneous subsurface properties can be modeled using a geostatistical approach. Therefore, the monitoring schemes have been developed to concurrently minimize the uncertainty in the spatial distribution of systems' states and parameters, such as the hydraulic conductivity K and the hydraulic head H, and the uncertainty in the geostatistical model of system parameters using a single objective function that aggregates all objectives. However, it has been shown that the aggregation of possibly conflicting objective functions is sensitive to the adopted aggregation scheme and may lead to distorted results. In addition, the uncertainties in geostatistical parameters affect the uncertainty in the spatial prediction of K and H according to a complex nonlinear relationship, which has often been ineffectively evaluated using a first-order approximation. In this study, we propose a multiobjective optimization framework to assist the design of monitoring networks of K and H with the goal of optimizing their spatial predictions and estimating the geostatistical parameters of the K field. The framework stems from the combination of a data assimilation (DA) algorithm and a multiobjective evolutionary algorithm (MOEA). The DA algorithm is based on the ensemble Kalman filter, a Monte-Carlo-based Bayesian update scheme for nonlinear systems, which is employed to approximate the posterior uncertainty in K, H, and the geostatistical parameters of K obtained by collecting new measurements. Multiple MOEA experiments are used to investigate the trade-off among design objectives and identify the corresponding monitoring schemes. The methodology is applied to design a sampling network for a shallow unconfined groundwater system located in Rocky Ford, Colorado. Results indicate that the effect of uncertainties associated with the geostatistical parameters on the spatial prediction might be significantly alleviated (by up to 80% of the prior uncertainty in K and by 90% of the prior uncertainty in H) by sampling evenly distributed measurements with a spatial measurement density of more than 1 observation per 60 m × 60 m grid block. In addition, exploration of the interaction of objective functions indicates that the ability of head measurements to reduce the uncertainty associated with the correlation scale is comparable to the effect of hydraulic conductivity measurements.
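
    The ensemble Kalman filter update at the core of the DA step can be written in a few lines of numpy. The dimensions, prior, observation operator, and measurements below are arbitrary stand-ins for the K/H fields and new data, not values from the study:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_state, n_ens, n_obs = 50, 100, 8

    ensemble = rng.normal(-4.0, 0.5, (n_state, n_ens))   # prior log-K ensemble

    H = np.zeros((n_obs, n_state))                       # observe 8 random cells
    H[np.arange(n_obs), rng.choice(n_state, n_obs, replace=False)] = 1.0
    R = 0.05 * np.eye(n_obs)                             # observation error cov.
    obs = rng.normal(-4.0, 0.5, n_obs)                   # hypothetical measurements

    # EnKF analysis step: covariances from ensemble anomalies, then Kalman gain
    A = ensemble - ensemble.mean(axis=1, keepdims=True)
    HA = H @ A
    K = (A @ HA.T / (n_ens - 1)) @ np.linalg.inv(HA @ HA.T / (n_ens - 1) + R)

    # Update each member against a perturbed copy of the observations
    for j in range(n_ens):
        d = obs + rng.multivariate_normal(np.zeros(n_obs), R)
        ensemble[:, j] += K @ (d - H @ ensemble[:, j])

    print(f"mean posterior spread: {ensemble.std(axis=1).mean():.3f}")
    ```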

  10. Bayesian spatio-temporal discard model in a demersal trawl fishery

    NASA Astrophysics Data System (ADS)

    Grazia Pennino, M.; Muñoz, Facundo; Conesa, David; López-Quílez, Antonio; Bellido, José M.

    2014-07-01

    Spatial management of discards has recently been proposed as a useful tool for the protection of juveniles, by reducing discard rates, and can be used as a buffer against management errors and recruitment failure. In this study Bayesian hierarchical spatial models have been used to analyze about 440 trawl fishing operations of two different metiers, sampled between 2009 and 2012, in order to improve our understanding of factors that influence the quantity of discards and to identify their spatio-temporal distribution in the study area. Our analysis showed that the relative importance of each variable was different for each metier, with a few similarities. In particular, the random vessel effect and seasonal variability were identified as main driving variables for both metiers. Predictive maps of the abundance of discards and maps of the posterior mean of the spatial component show several hot spots with high discard concentration for each metier. We discuss how the seasonal/spatial effects, and knowledge of the factors that influence discarding, could be exploited as potential mitigation measures for future fisheries management strategies. However, misidentification of hotspots and uncertain predictions can culminate in inappropriate mitigation practices which can sometimes be irreversible. The proposed Bayesian spatial method overcomes these issues, since it offers a unified approach which allows the incorporation of spatial random-effect terms, spatial correlation of the variables and the uncertainty of the parameters in the modeling process, resulting in a better quantification of uncertainty and more accurate predictions.

  11. Flood risk and adaptation strategies under climate change and urban expansion: A probabilistic analysis using global data.

    PubMed

    Muis, Sanne; Güneralp, Burak; Jongman, Brenden; Aerts, Jeroen C J H; Ward, Philip J

    2015-12-15

    An accurate understanding of flood risk and its drivers is crucial for effective risk management. Detailed risk projections, including uncertainties, are however rarely available, particularly in developing countries. This paper presents a method that integrates recent advances in global-scale modeling of flood hazard and land change, which enables the probabilistic analysis of future trends in national-scale flood risk. We demonstrate its application to Indonesia. We develop 1000 spatially-explicit projections of urban expansion from 2000 to 2030 that account for uncertainty associated with population and economic growth projections, as well as uncertainty in where urban land change may occur. The projections show that the urban extent increases by 215%-357% (5th and 95th percentiles). Urban expansion is particularly rapid on Java, which accounts for 79% of the national increase. From 2000 to 2030, increases in exposure will elevate flood risk by, on average, 76% and 120% for river and coastal floods. While sea level rise will further increase the exposure-induced trend by 19%-37%, the response of river floods to climate change is highly uncertain. However, as urban expansion is the main driver of future risk, the implementation of adaptation measures is increasingly urgent, regardless of the wide uncertainty in climate projections. Using probabilistic urban projections, we show that spatial planning can be a very effective adaptation strategy. Our study emphasizes that global data can be used successfully for probabilistic risk assessment in data-scarce countries.

  12. Robustness of Thirty Meter Telescope primary mirror control

    NASA Astrophysics Data System (ADS)

    Macmynowski, Douglas G.; Thompson, Peter M.; Shelton, Chris; Roberts, Lewis C., Jr.

    2010-07-01

    The primary mirror control system for the Thirty Meter Telescope (TMT) maintains the alignment of the 492 segments in the presence of both quasi-static (gravity and thermal) and dynamic disturbances due to unsteady wind loads. The latter results in a desired control bandwidth of 1 Hz at high spatial frequencies. The achievable bandwidth is limited by robustness to (i) uncertain telescope structural dynamics (control-structure interaction) and (ii) small perturbations in the ill-conditioned influence matrix that relates segment edge sensor response to actuator commands. Both of these effects are considered herein using models of TMT. The former is explored through multivariable sensitivity analysis on a reduced-order Zernike-basis representation of the structural dynamics. The interaction matrix ("A-matrix") uncertainty has been analyzed theoretically elsewhere, and is examined here for realistic amplitude perturbations due to segment and sensor installation errors, and gravity and thermal induced segment motion. The primary influence of A-matrix uncertainty is on the control of "focus-mode"; this is the least observable mode, measurable only through the edge-sensor (gap-dependent) sensitivity to the dihedral angle between segments. Accurately estimating focus-mode will require updating the A-matrix as a function of the measured gap. A-matrix uncertainty also results in a higher gain-margin requirement for focus-mode, and hence the A-matrix and CSI robustness need to be understood simultaneously. Based on the robustness analysis, the desired 1 Hz bandwidth is achievable in the presence of uncertainty for all except the lowest spatial-frequency response patterns of the primary mirror.

  13. Uncertainties in Coastal Ocean Color Products: Impacts of Spatial Sampling

    NASA Technical Reports Server (NTRS)

    Pahlevan, Nima; Sarkar, Sudipta; Franz, Bryan A.

    2016-01-01

    With increasing demands for ocean color (OC) products with improved accuracy and well characterized, per-retrieval uncertainty budgets, it is vital to decompose overall estimated errors into their primary components. Amongst various contributing elements (e.g., instrument calibration, atmospheric correction, inversion algorithms) in the uncertainty of an OC observation, less attention has been paid to uncertainties associated with spatial sampling. In this paper, we simulate MODIS (aboard both Aqua and Terra) and VIIRS OC products using 30 m resolution OC products derived from the Operational Land Imager (OLI) aboard Landsat-8, to examine impacts of spatial sampling on both cross-sensor product intercomparisons and in-situ validations of R(sub rs) products in coastal waters. Various OLI OC products representing different productivity levels and in-water spatial features were scanned for one full orbital-repeat cycle of each ocean color satellite. While some view-angle dependent differences in simulated Aqua-MODIS and VIIRS were observed, the average uncertainties (absolute) in product intercomparisons (due to differences in spatial sampling) at regional scales are found to be 1.8%, 1.9%, 2.4%, 4.3%, 2.7%, 1.8%, and 4% for the R(sub rs)(443), R(sub rs)(482), R(sub rs)(561), R(sub rs)(655), Chla, K(sub d)(482), and b(sub bp)(655) products, respectively. It is also found that, depending on in-water spatial variability and the sensor's footprint size, the errors for an in-situ validation station in coastal areas can reach as high as +/- 18%. We conclude that a) expected biases induced by the spatial sampling in product intercomparisons are mitigated when products are averaged over at least 7 km × 7 km areas, b) VIIRS observations, with improved consistency in cross-track spatial sampling, yield more precise calibration/validation statistics than that of MODIS, and c) use of a single pixel centered on in-situ coastal stations provides an optimal sampling size for validation efforts. These findings will have implications for enhancing our understanding of uncertainties in ocean color retrievals and for planning of future ocean color missions and the associated calibration/validation exercises.

  14. A Statistics-Based Material Property Analysis to Support TPS Characterization

    NASA Technical Reports Server (NTRS)

    Copeland, Sean R.; Cozmuta, Ioana; Alonso, Juan J.

    2012-01-01

    Accurate characterization of entry capsule heat shield material properties is a critical component in modeling and simulating Thermal Protection System (TPS) response in a prescribed aerothermal environment. The thermal decomposition of the TPS material during the pyrolysis and charring processes is poorly characterized and typically results in large uncertainties in material properties as inputs for ablation models. These material property uncertainties contribute to large design margins on flight systems and cloud reconstruction efforts for data collected during flight and ground testing, making revision of existing models for entry systems more challenging. The analysis presented in this work quantifies how material property uncertainties propagate through an ablation model and guides an experimental test regimen aimed at reducing these uncertainties and characterizing the dependencies between properties in the virgin and charred states for a Phenolic Impregnated Carbon Ablator (PICA) based TPS. A sensitivity analysis identifies how the high-fidelity model behaves in the expected flight environment, while a Monte Carlo based uncertainty propagation strategy is used to quantify the expected spread in the in-depth temperature response of the TPS. An examination of how perturbations to the input probability density functions affect output temperature statistics is accomplished using a Kriging response surface of the high-fidelity model. Simulations are based on capsule configuration and aerothermal environments expected during the Mars Science Laboratory (MSL) entry sequence. We identify and rank the primary sources of uncertainty from material properties in a flight-relevant environment, show how those uncertainty contributors depend on spatial orientation and in-depth location, and quantify how sensitive the expected results are.

  15. SpatialEpiApp: A Shiny web application for the analysis of spatial and spatio-temporal disease data.

    PubMed

    Moraga, Paula

    2017-11-01

    In recent years, public health surveillance has been facilitated by several packages implementing statistical methods for the analysis of spatial and spatio-temporal disease data. However, these methods remain inaccessible to many researchers who lack the programming skills to use the required software effectively. In this paper we present SpatialEpiApp, a Shiny web application that integrates two of the most common approaches in health surveillance: disease mapping and detection of clusters. SpatialEpiApp is easy to use and does not require any programming knowledge. Given information about the cases, population and, optionally, covariates for each of the areas and dates of study, the application allows users to fit Bayesian models to obtain disease risk estimates and their uncertainty by using R-INLA, and to detect disease clusters by using SaTScan. The application supports user interaction and the creation of interactive data visualizations and reports showing the analyses performed.

  16. Spatial Analysis of Geothermal Resource Potential in New York and Pennsylvania: A Stratified Kriging Approach

    NASA Astrophysics Data System (ADS)

    Smith, J. D.; Whealton, C. A.; Stedinger, J. R.

    2014-12-01

    Resource assessments for low-grade geothermal applications employ available well temperature measurements to determine if the resource potential is sufficient for supporting district heating opportunities. This study used a compilation of bottomhole temperature (BHT) data from recent unconventional shale oil and gas wells, along with legacy oil, gas, and storage wells, in Pennsylvania (PA) and New York (NY). Our study's goal was to predict the geothermal resource potential and associated uncertainty for the NY-PA region using kriging interpolation. The dataset was scanned for outliers, and some observations were removed. Because these wells were drilled for reasons other than geothermal resource assessment, their spatial density varied widely. An exploratory spatial statistical analysis revealed differences in the spatial structure of the geothermal gradient data (the kriging semi-variogram and its nugget variance, shape, sill, and the degree of anisotropy). As a result, a stratified kriging procedure was adopted to better capture the statistical structure of the data, to generate an interpolated surface, and to quantify the uncertainty of the computed surface. The area was stratified reflecting different physiographic provinces in NY and PA that have geologic properties likely related to variations in the value of the geothermal gradient. The kriging prediction and the variance-of-prediction were determined for each province by the generation of a semi-variogram using only the wells that were located within that province. A leave-one-out cross validation (LOOCV) was conducted as a diagnostic tool. The results of stratified kriging were compared to kriging using the whole region to determine the impact of stratification. The two approaches provided similar predictions of the geothermal gradient. However, the variance-of-prediction was different. The stratified approach is recommended because it gave a more appropriate site-specific characterization of uncertainty based upon a more realistic description of the statistical structure of the data given the geologic characteristics of each province.
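
    Stratified kriging of this kind can be sketched with the pykrige package (an assumption; any geostatistics library would do): a separate variogram is fitted, and prediction plus variance computed, per province. In real use each province's grid would be masked to its own polygon; the two provinces below are purely synthetic.

    ```python
    import numpy as np
    from pykrige.ok import OrdinaryKriging

    rng = np.random.default_rng(9)

    def make_province(n, mean, sill):
        """Synthetic BHT-derived geothermal gradients (deg C/km) for one province."""
        x, y = rng.uniform(0, 100, n), rng.uniform(0, 100, n)
        return x, y, mean + rng.normal(0, np.sqrt(sill), n)

    provinces = {"Province A": make_province(80, 25.0, 4.0),
                 "Province B": make_province(60, 20.0, 9.0)}

    gridx, gridy = np.linspace(0, 100, 25), np.linspace(0, 100, 25)

    # Stratified kriging: per-province variogram, prediction, and variance
    for name, (x, y, z) in provinces.items():
        ok = OrdinaryKriging(x, y, z, variogram_model="spherical")
        zhat, ss = ok.execute("grid", gridx, gridy)
        print(f"{name}: mean kriging variance = {ss.mean():.2f}")
    ```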

  17. Multi-temporal InSAR analysis to reduce uncertainties and assess time-dependence of deformation in the northern Chilean forearc

    NASA Astrophysics Data System (ADS)

    Manjunath, D.; Gomez, F.; Loveless, J.

    2005-12-01

    Interferometric Synthetic Aperture Radar (InSAR) provides unprecedented spatial imaging of crustal deformation. However, for small deformations, such as those due to interseismic strain accumulation, potentially significant uncertainty may result from other sources of interferometric phase, such as atmospheric effects, errors in the satellite baseline, and height errors in the reference digital elevation model (DEM). We aim to constrain spatial and temporal variations in crustal deformation of the northern Chilean forearc region of the Andean subduction zone (19° - 22°S) using multiple interferograms spanning 1995 - 2000. The study area includes the region of the 1995 Mw 8.1 Antofagasta earthquake and the region to the north. In contrast to previous InSAR-based studies of the Chilean forearc, we seek to distinguish interferometric phase contributions from linear and nonlinear deformation, height errors in the DEM, and atmospheric effects. Understanding these phase contributions reduces the uncertainties on the deformation rates and provides a view of the time-dependence of deformation. The interferograms cover a 150 km-wide swath spanning two adjacent orbital tracks. Our study involves the analysis of more than 28 interferograms along each track. Coherent interferograms in the hyper-arid Atacama Desert permit spatial phase unwrapping. Initial estimates of topographic phase were determined using 3'' DEM data from the SRTM mission. We perform a pixel-by-pixel analysis of the unwrapped phase to identify time- and baseline-dependent phase contributions, using the Gamma Remote Sensing radar software. Atmospheric phase, non-linear deformation, and phase noise were further distinguished using a combination of spatial and temporal filters. Non-linear deformation is evident for up to 2.5 years following the 1995 earthquake, followed by a return to time-linear, interseismic strain accumulation. The regional trend of linear deformation, characterized by coastal subsidence and relative uplift inland, is consistent with the displacement field expected for a locked subduction zone. Our improved determination of deformation rates is used to formulate a new elastic model of interseismic strain in the Chilean forearc.

  18. Regional landslide hazard assessment in a deep uncertain future

    NASA Astrophysics Data System (ADS)

    Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten

    2017-04-01

    Landslides have many negative economic and societal impacts, including the potential for significant loss of life and damage to infrastructure. These risks are likely to be exacerbated in the future by a combination of climatic and socio-economic factors. Climate change, for example, is expected to increase the occurrence of rainfall-triggered landslides, because a warmer atmosphere tends to produce more high-intensity rainfall events. Prediction of future changes in rainfall, however, is subject to high levels of uncertainty, making it challenging for decision-makers to identify the areas and populations that are most vulnerable to landslide hazards. In this study, we demonstrate how a physically-based model - the Combined Hydrology and Stability Model (CHASM) - can be used together with Global Sensitivity Analysis (GSA) to explore the underlying factors controlling the spatial distribution of landslide risks across a regional landscape, while also accounting for deep uncertainty around future rainfall conditions. We demonstrate how GSA can be used to analyse CHASM, which in turn represents the spatial variability of hillslope characteristics in the study region, while accounting for other uncertainties. Results are presented in the form of landslide hazard maps, utilising high-resolution digital elevation datasets for a case study in St Lucia in the Caribbean. Our findings about spatial landslide hazard drivers have important implications for data collection approaches and for long-term decision-making about land management practices.
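
    A sketch of the variance-based flavour of GSA, using a pick-freeze Monte Carlo estimator of first-order Sobol indices on a stand-in stability response (CHASM itself is not reproduced here; the factor names and toy function are illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 100_000

    # Stand-in for a slope stability response; the real study runs CHASM.
    # Inputs (scaled to [0, 1]): soil cohesion, slope angle, peak rainfall.
    def model(x):
        cohesion, slope, rain = x[:, 0], x[:, 1], x[:, 2]
        return 1.5 * cohesion - 1.0 * slope - 0.8 * rain * slope

    A = rng.uniform(size=(N, 3))
    B = rng.uniform(size=(N, 3))
    yA, yB = model(A), model(B)
    var = np.var(np.concatenate([yA, yB]))

    # First-order index S_i: take column i from A, the rest from B.
    for i, name in enumerate(["cohesion", "slope", "rainfall"]):
        ABi = B.copy()
        ABi[:, i] = A[:, i]
        S_i = np.mean(yA * (model(ABi) - yB)) / var    # Saltelli-type estimator
        print(f"S_{name} = {S_i:.2f}")
    ```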

  19. Regional Landslide Hazard Assessment Considering Potential Climate Change

    NASA Astrophysics Data System (ADS)

    Almeida, S.; Holcombe, E.; Pianosi, F.; Wagener, T.

    2016-12-01

    Landslides have many negative economic and societal impacts, including the potential for significant loss of life and damage to infrastructure. These risks are likely to be exacerbated in the future by a combination of climatic and socio-economic factors. Climate change, for example, is expected to increase the occurrence of rainfall-triggered landslides, because a warmer atmosphere tends to produce more high intensity rainfall events. Prediction of future changes in rainfall, however, is subject to high levels of uncertainty, making it challenging for decision-makers to identify the areas and populations that are most vulnerable to landslide hazards. In this study, we demonstrate how a physically-based model - the Combined Hydrology and Stability Model (CHASM) - can be used together with Global Sensitivity Analysis (GSA) to explore the underlying factors controlling the spatial distribution of landslide risks across a regional landscape, while also accounting for deep uncertainty around potential future rainfall triggers. We demonstrate how GSA can be used to analyse CHASM, which in turn represents the spatial variability of hillslope characteristics in the study region, while accounting for other uncertainties. Results are presented in the form of landslide hazard maps, utilising high-resolution digital elevation datasets for a case study in St Lucia in the Caribbean. Our findings about spatial landslide hazard drivers have important implications for data collection approaches and for long-term decision-making about land management practices.

  20. Evaluating sources of uncertainties in finite-fault source models: lessons from the 2009 Mw6.1 L'Aquila earthquake, Italy

    NASA Astrophysics Data System (ADS)

    Ragon, T.; Sladen, A.; Bletery, Q.; Simons, M.; Magnoni, F.; Avallone, A.; Cavalié, O.; Vergnolle, M.

    2016-12-01

    Despite the diversity of available data for the Mw 6.1 2009 earthquake in L'Aquila, Italy, published finite-fault slip models are surprisingly different. For instance, the amplitude of the maximum coseismic slip patch varies from 80 cm to 225 cm, and its depth oscillates between 5 and 15 km. Discrepancies between proposed source parameters are believed to result from three sources: observational uncertainties, epistemic uncertainties, and the inherent non-uniqueness of inverse problems. We explore the whole solution space of fault-slip models compatible with the data within the range of both observational and epistemic uncertainties by performing a fully Bayesian analysis. In this initial stage, we restrict our analysis to the static problem. In terms of observational uncertainty, we must take into account the difference in time span associated with the different data types: InSAR images provide excellent spatial coverage but usually correspond to a period of a few days to weeks after the mainshock and can thus be potentially biased by significant afterslip. Continuous GPS stations do not have the same shortcoming, but in contrast do not have the desired spatial coverage near the fault. In the case of the L'Aquila earthquake, InSAR images include a minimum of 6 days of afterslip. Here, we explicitly account for these different time windows in the inversion by jointly inverting for coseismic and post-seismic fault slip. Regarding epistemic or modeling uncertainties, we focus on the impact of uncertain fault geometry and elastic structure. Modeling errors, which result from inaccurate model predictions and are generally neglected, are estimated for both the earth model and the fault geometry as non-diagonal covariance matrices. The L'Aquila earthquake is particularly suited to investigating these effects given the availability of a detailed aftershock catalog and 3D velocity models. This work aims to improve our knowledge of the L'Aquila earthquake and to provide a more general perspective on which uncertainties are the most critical in finite-fault source studies.
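
    A compact linear-Gaussian analogue of the strategy described above: a non-diagonal prediction-error covariance (from, e.g., uncertain fault geometry) is added to the observational covariance before inverting for slip. The Green's functions, covariances, and dimensions below are illustrative; the actual study samples a fully Bayesian posterior:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_obs, n_slip = 40, 8

    G = rng.normal(size=(n_obs, n_slip))          # illustrative Green's functions
    m_true = rng.exponential(0.2, size=n_slip)    # "true" slip per patch (m)

    Cd = 0.01 * np.eye(n_obs)                     # observational covariance
    # Non-diagonal prediction-error covariance standing in for the derived
    # fault-geometry / earth-model error terms.
    x = np.linspace(0, 1, n_obs)
    Cp = 0.02 * np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)
    C = Cd + Cp                                   # total misfit covariance

    d = G @ m_true + rng.multivariate_normal(np.zeros(n_obs), C)

    Cm = np.eye(n_slip)                           # Gaussian prior on slip
    Cpost = np.linalg.inv(G.T @ np.linalg.solve(C, G) + np.linalg.inv(Cm))
    m_post = Cpost @ G.T @ np.linalg.solve(C, d)  # posterior mean slip
    print("posterior std per patch:", np.sqrt(np.diag(Cpost)).round(3))
    ```

    Neglecting Cp (setting it to zero) would shrink the reported slip uncertainties without making the model predictions any more accurate.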

  1. Multi-criteria decision analysis in conservation planning: Designing conservation area networks in San Diego County

    NASA Astrophysics Data System (ADS)

    MacDonald, Garrick Richard

    To limit biodiversity loss caused by human activity, conservation planning must protect biodiversity while considering socio-economic cost criteria. This research aimed to determine the effects of socio-economic criteria and spatial configurations on the development of conservation area networks (CANs) for three species with different distribution patterns, while simultaneously attempting to address the uncertainty and sensitivity of the CANs produced by ConsNet. The socio-economic factors and spatial criteria included the cost of land, population density, agricultural output value, area, average cluster area, number of clusters, shape, and perimeter. Three sensitive mammal species with different distribution patterns were selected: the Bobcat, the Ringtail, and a custom-created mammal distribution. Forty problems and the corresponding number of CANs were formulated and computed by running each predicted-presence species model with and without the four different socio-economic threshold groups at two different resolutions. Thirty-two percent less area was conserved when multiple socio-economic constraints and spatial configurations were considered than in CANs that did not consider them. Without including socio-economic costs, ConsNet's ALL_CELLS heuristic solution was the highest-ranking CAN. After considering multiple socio-economic costs, the top-ranking CAN was no longer the ALL_CELLS heuristic solution but a spatially different meta-heuristic solution. The effects of multiple constraints and objectives on the design of CANs with different distribution patterns did not vary significantly across the criteria. The CANs produced by ConsNet appeared to demonstrate some uncertainty surrounding particular criteria, but did not demonstrate substantial uncertainty across all criteria used to rank the CANs. Similarly, the range of socio-economic criteria thresholds did not have a substantial impact. ConsNet was very applicable to the research project; however, it did exhibit a few limitations, and both its advantages and disadvantages should be considered before using it for future conservation planning projects. The research project is an example of a large-data scenario undertaken with a multiple criteria decision analysis (MCDA) approach.

  2. Methods Development for a Spatially Explicit Population-Level Risk Assessment, Uncertainty Analysis, and Comparison with Risk Quotient Approaches

    EPA Science Inventory

    The standard framework of Ecological Risk Assessment (ERA) uses organism-level assessment endpoints to qualitatively determine the risk to populations. While organism-level toxicity data provide the pathway by which a species may be affected by a chemical stressor, they neither i...

  3. Stochastic Analysis and Probabilistic Downscaling of Soil Moisture

    NASA Astrophysics Data System (ADS)

    Deshon, J. P.; Niemann, J. D.; Green, T. R.; Jones, A. S.

    2017-12-01

    Soil moisture is a key variable for rainfall-runoff response estimation, ecological and biogeochemical flux estimation, and biodiversity characterization, each of which is useful for watershed condition assessment. These applications require not only accurate, fine-resolution soil-moisture estimates but also confidence limits on those estimates and soil-moisture patterns that exhibit realistic statistical properties (e.g., variance and spatial correlation structure). The Equilibrium Moisture from Topography, Vegetation, and Soil (EMT+VS) model downscales coarse-resolution (9-40 km) soil moisture from satellite remote sensing or land-surface models to produce fine-resolution (10-30 m) estimates. The model was designed to produce accurate deterministic soil-moisture estimates at multiple points, but the resulting patterns do not reproduce the variance or spatial correlation of observed soil-moisture patterns. The primary objective of this research is to generalize the EMT+VS model to produce a probability density function (pdf) for soil moisture at each fine-resolution location and time. Each pdf has a mean that is equal to the deterministic soil-moisture estimate, and the pdf can be used to quantify the uncertainty in the soil-moisture estimates and to simulate soil-moisture patterns. Different versions of the generalized model are hypothesized based on how uncertainty enters the model, whether the uncertainty is additive or multiplicative, and which distributions describe the uncertainty. These versions are then tested by application to four catchments with detailed soil-moisture observations (Tarrawarra, Satellite Station, Cache la Poudre, and Nerrigundah). The performance of the generalized models is evaluated by comparing the statistical properties of the simulated soil-moisture patterns to those of the observations and the deterministic EMT+VS model. The versions of the generalized EMT+VS model with normally distributed stochastic components produce soil-moisture patterns with more realistic statistical properties than the deterministic model. Additionally, the results suggest that the variance and spatial correlation of the stochastic soil-moisture variations do not vary consistently with the spatial-average soil moisture.
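
    A minimal sketch of the additive versus multiplicative noise variants described above, under the constraint that each pdf mean equals the deterministic estimate; the parameter values are illustrative, not calibrated EMT+VS components:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Deterministic fine-resolution soil-moisture estimates (stand-in values).
    theta_det = np.full(1000, 0.25)

    # Additive version: theta = theta_det + e, with e ~ N(0, sigma^2).
    sigma_add = 0.04
    theta_add = theta_det + rng.normal(0.0, sigma_add, theta_det.shape)

    # Multiplicative version: theta = theta_det * m, lognormal m with E[m] = 1.
    sigma_log = 0.15
    mu_log = -0.5 * sigma_log**2     # makes E[m] = exp(mu + sigma^2 / 2) = 1
    theta_mul = theta_det * rng.lognormal(mu_log, sigma_log, theta_det.shape)

    for name, th in [("additive", theta_add), ("multiplicative", theta_mul)]:
        print(f"{name}: mean = {th.mean():.3f}, std = {th.std():.3f}")
    ```

    Both variants preserve the deterministic mean; they differ in how the spread scales with the estimate itself, which is one of the modelling choices the study tests.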

  4. Quantification of uncertainty for fluid flow in heterogeneous petroleum reservoirs

    NASA Astrophysics Data System (ADS)

    Zhang, Dongxiao

    Detailed description of the heterogeneity of oil/gas reservoirs is needed to make performance predictions of oil/gas recovery. However, only limited measurements at a few locations are usually available. This combination of significant spatial heterogeneity with incomplete information about it leads to uncertainty about the values of reservoir properties and thus, to uncertainty in estimates of production potential. The theory of stochastic processes provides a natural method for evaluating these uncertainties. In this study, we present a stochastic analysis of transient, single phase flow in heterogeneous reservoirs. We derive general equations governing the statistical moments of flow quantities by perturbation expansions. These moments can be used to construct confidence intervals for the flow quantities (e.g., pressure and flow rate). The moment equations are deterministic and can be solved numerically with existing solvers. The proposed moment equation approach has certain advantages over the commonly used Monte Carlo approach.

  5. Uncertainty quantification of environmental performance metrics in heterogeneous aquifers with long-range correlations

    NASA Astrophysics Data System (ADS)

    Moslehi, Mahsa; de Barros, Felipe P. J.

    2017-01-01

    We investigate how the uncertainty stemming from disordered porous media that display long-range correlation in the hydraulic conductivity (K) field propagates to predictions of environmental performance metrics (EPMs). In this study, the EPMs are quantities of relevance to risk analysis and remediation, such as peak flux-averaged concentration and early and late arrival times, among others. Using stochastic simulations, we quantify the uncertainty associated with the EPMs for a given disordered spatial structure of the K-field and identify the probability distribution function (PDF) model that best captures the statistics of the EPMs of interest. Results indicate that the probabilistic distribution of the EPMs considered in this study follows a lognormal PDF. Finally, through the use of information theory, we reveal how the persistent/anti-persistent correlation structure of the K-field influences the EPMs and the corresponding uncertainties.
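
    A small sketch of the PDF-identification step, assuming Monte Carlo samples of an EPM are already in hand (the synthetic samples below stand in for flow-and-transport outputs):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)

    # Stand-in Monte Carlo samples of an EPM, e.g. peak flux-averaged
    # concentration across K-field realizations.
    epm = rng.lognormal(mean=1.2, sigma=0.6, size=500)

    # Fit a lognormal PDF (location fixed at zero) and test the fit.
    shape, loc, scale = stats.lognorm.fit(epm, floc=0)
    ks = stats.kstest(epm, "lognorm", args=(shape, loc, scale))
    print(f"sigma = {shape:.2f}, median = {scale:.2f}, KS p-value = {ks.pvalue:.2f}")
    ```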

  6. Transitioning from MODIS to VIIRS: an analysis of inter-consistency of NDVI data sets for agricultural monitoring.

    PubMed

    Skakun, Sergii; Justice, Christopher O; Vermote, Eric; Roger, Jean-Claude

    2018-01-01

    The Visible/Infrared Imager/Radiometer Suite (VIIRS) aboard the Suomi National Polar-orbiting Partnership (S-NPP) satellite was launched in 2011, in part to provide continuity with the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument aboard the National Aeronautics and Space Administration's (NASA) Terra and Aqua remote sensing satellites. VIIRS will eventually replace MODIS for both land science and applications and add to the coarse-resolution, long-term data record. It is, therefore, important to provide the user community with an assessment of the consistency of equivalent products from the two sensors. For this study, we do this in the context of example agricultural monitoring applications. Surface reflectance, routinely delivered within the M{O,Y}D09 and VNP09 series of products, provides critical input for generating downstream products. Given the range of applications utilizing the normalized difference vegetation index (NDVI) generated from M{O,Y}D09 and VNP09 products, and the inherent differences between the MODIS and VIIRS sensors in calibration, spatial sampling, and spectral bands, the main objective of this study is to quantify the uncertainties involved in transitioning from MODIS-based to VIIRS-based NDVIs. In particular, we compare NDVIs derived from two sets of Level 3 MYD09 and VNP09 products with various spatial-temporal characteristics, namely 8-day composites at 500 m spatial resolution and daily Climate Modelling Grid (CMG) images at 0.05° spatial resolution. Spectral adjustment of the VIIRS I1 (red) and I2 (near infra-red, NIR) bands to match the MODIS/Aqua b1 (red) and b2 (NIR) bands is performed to remove a bias between the MODIS- and VIIRS-based red, NIR, and NDVI estimates. Overall, the red reflectance, NIR reflectance, and NDVI uncertainties were 0.014, 0.029, and 0.056, respectively, for the 500 m product and 0.013, 0.016, and 0.032 for the 0.05° product. The study shows that MODIS and VIIRS NDVI data can be used interchangeably for applications with an uncertainty of less than 0.02 to 0.05, depending on the scale of spatial aggregation, which is typically the uncertainty of the individual dataset.
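
    A minimal sketch of the spectral band-adjustment idea: fit a linear gain and offset between matched observations from the two sensors and apply it before computing NDVI. The paired reflectances below are synthetic, and the coefficients are not those derived in the study:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Synthetic paired reflectances standing in for matched VIIRS I2 / MODIS b2
    # (NIR) observations; the real adjustment is fit on large matchup datasets.
    viirs_nir = rng.uniform(0.1, 0.5, 2000)
    modis_nir = 0.97 * viirs_nir - 0.004 + rng.normal(0, 0.01, 2000)

    # Least-squares linear adjustment of VIIRS NIR toward MODIS NIR.
    X = np.column_stack([viirs_nir, np.ones_like(viirs_nir)])
    gain, offset = np.linalg.lstsq(X, modis_nir, rcond=None)[0]
    viirs_nir_adj = gain * viirs_nir + offset

    print(f"gain = {gain:.3f}, offset = {offset:.4f}")
    print(f"NIR bias before = {np.mean(viirs_nir - modis_nir):.4f}, "
          f"after = {np.mean(viirs_nir_adj - modis_nir):.4f}")
    ```

    The same adjustment is applied to the red band, after which NDVI = (NIR - red) / (NIR + red) is computed from the harmonized reflectances.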

  7. Report on an Informal Survey of Groundwater Modeling Practitioners About How They Quantify Uncertainty: Which Tools They Use, Why, and Why Not.

    NASA Astrophysics Data System (ADS)

    Ginn, T. R.; Scheibe, T. D.

    2006-12-01

    Hydrogeology is among the most data-limited of the earth sciences, so that uncertainty arises in every aspect of subsurface flow and transport modeling, from the conceptual model to the spatial discretization to parameter values. Treatment of uncertainty is therefore unavoidable, and the literature and conference proceedings are replete with approaches, templates, and paradigms for doing so. However, such tools remain underused, especially those of the stochastic analytic sort, which has recently prompted explicit inquiries into why this is the case, in response to which entire journal issues have been dedicated. In an effort to continue this discussion in a constructive way, we report on an informal yet extensive survey of hydrogeology practitioners, as the "marketplace" for techniques to deal with uncertainty. The survey includes scientists, engineers, regulators, and others, and reports on the quantitative (or qualitative) methods used for uncertainty characterization and analysis, their frequency and level of usage, and the reasons behind the selection or avoidance of available methods. Results shed light on fruitful directions for future research in uncertainty quantification in hydrogeology.

  8. Moving across scales: Challenges and opportunities in upscaling carbon fluxes

    NASA Astrophysics Data System (ADS)

    Naithani, K. J.

    2016-12-01

    Light use efficiency (LUE) type models are commonly used to upscale terrestrial C fluxes and estimate regional and global C budgets. Model parameters are often estimated for each land cover type (LCT) using flux observations from one or more eddy covariance towers, and then spatially extrapolated by integrating land cover, meteorological, and remotely sensed data. Decisions regarding the type of input data (spatial resolution of land cover data, spatial and temporal length of flux data), the representation of landscape structure (land use vs. disturbance regime), and the type of modeling framework (common risk vs. hierarchical) all influence the estimated CO2 fluxes and the associated uncertainties, but are rarely considered together. This work presents a synthesis of past and present efforts to upscale CO2 fluxes and associated uncertainties in the ChEAS (Chequamegon Ecosystem Atmosphere Study) region in northern Wisconsin and the Upper Peninsula of Michigan. It highlights two key future research needs. First, the characterization of uncertainties due to all of the abovementioned factors reflects only a (hopefully relevant) subset of the overall uncertainties. Second, interactions among these factors are likely critical, but are poorly represented by the tower network at landscape scales. Yet, results indicate significant spatial and temporal heterogeneity of uncertainty in CO2 fluxes, which can inform carbon management efforts and prioritize data needs.

  9. Fuzzy geometry, entropy, and image information

    NASA Technical Reports Server (NTRS)

    Pal, Sankar K.

    1991-01-01

    Presented here are various uncertainty measures arising from grayness ambiguity and spatial ambiguity in an image, and their possible applications as image information measures. Definitions are given of an image in the light of fuzzy set theory, and of information measures and tools relevant for processing/analysis, e.g., fuzzy geometrical properties, correlation, bound functions, and entropy measures. Also given is a formulation of algorithms, along with the management of uncertainties, for segmentation, object extraction, and edge detection. The output obtained here is both fuzzy and nonfuzzy. Ambiguities in the evaluation and assessment of membership functions are also described.

  10. Effects of buffer size and shape on associations between the built environment and energy balance.

    PubMed

    James, Peter; Berrigan, David; Hart, Jaime E; Hipp, J Aaron; Hoehner, Christine M; Kerr, Jacqueline; Major, Jacqueline M; Oka, Masayoshi; Laden, Francine

    2014-05-01

    Uncertainty in the relevant spatial context may drive heterogeneity in findings on the built environment and energy balance. To estimate the effect of this uncertainty, we conducted a sensitivity analysis defining intersection and business densities and counts within different buffer sizes and shapes on associations with self-reported walking and body mass index. Linear regression results indicated that the scale and shape of buffers influenced study results and may partly explain the inconsistent findings in the built environment and energy balance literature. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Uncertainties in building a strategic defense.

    PubMed

    Zraket, C A

    1987-03-27

    Building a strategic defense against nuclear ballistic missiles involves complex and uncertain functional, spatial, and temporal relations. Such a defensive system would evolve and grow over decades. It is too complex, dynamic, and interactive to be fully understood initially by design, analysis, and experiments. Uncertainties exist in the formulation of requirements and in the research and design of a defense architecture that can be implemented incrementally and be fully tested to operate reliably. The analysis and measurement of system survivability, performance, and cost-effectiveness are critical to this process. Similar complexities exist for an adversary's system that would suppress or use countermeasures against a missile defense. Problems and opportunities posed by these relations are described, with emphasis on the unique characteristics and vulnerabilities of space-based systems.

  12. Tower-Based Greenhouse Gas Measurement Network Design---The National Institute of Standards and Technology North East Corridor Testbed.

    PubMed

    Lopez-Coto, Israel; Ghosh, Subhomoy; Prasad, Kuldeep; Whetstone, James

    2017-09-01

    The North-East Corridor (NEC) Testbed project is the third of three NIST (National Institute of Standards and Technology) greenhouse gas emissions testbeds designed to advance greenhouse gas measurement capabilities. A design approach for a dense observing network combined with atmospheric inversion methodologies is described. The Advanced Research Weather Research and Forecasting Model with the Stochastic Time-Inverted Lagrangian Transport model were used to derive the sensitivity of hypothetical observations to surface greenhouse gas emissions (footprints). Unlike other network design algorithms, an iterative selection algorithm based on a k-means clustering method was applied to minimize the similarities between the temporal responses of the sites and maximize sensitivity to the urban emissions contribution. Once a network was selected, a synthetic inversion Bayesian Kalman filter was used to evaluate observing system performance. We present the performance of various measurement network configurations consisting of differing numbers of towers and tower locations. Results show that an overly compact network has decreased spatial coverage, since each added site then contributes little new spatial information, whilst networks dispersed too broadly lose the ability to constrain flux uncertainties. In addition, we explore the possibility of using a very high density network of lower-cost, lower-performance sensors characterized by larger uncertainties and temporal drift. Analysis convergence is faster with a large number of observing locations, reducing the response time of the filter. Larger uncertainties in the observations imply lower values of uncertainty reduction. Drift, on the other hand, is a bias in nature: it is added to the observations and therefore biases the retrieved fluxes.
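
    A toy illustration of the clustering-based site selection idea, assuming synthetic footprint time series and a simple urban-sensitivity score (the study derives footprints from the WRF and STILT models):

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(6)

    # Rows = candidate tower sites, columns = hourly footprint sensitivities to
    # surface emissions (synthetic stand-ins for the model-derived footprints).
    n_sites, n_hours = 60, 500
    footprints = np.abs(rng.normal(size=(n_sites, n_hours)))
    urban_sens = footprints.sum(axis=1)   # crude proxy for urban-emissions sensitivity

    # Cluster sites by the similarity of their temporal response, then keep the
    # most urban-sensitive site in each cluster, so selected sites complement
    # one another rather than duplicate information.
    k = 8                                 # number of towers to deploy
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(footprints)
    selected = [int(np.argmax(np.where(labels == c, urban_sens, -np.inf)))
                for c in range(k)]
    print("selected sites:", sorted(selected))
    ```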

  13. Population-level differences in disease transmission: A Bayesian analysis of multiple smallpox epidemics

    PubMed Central

    Elderd, Bret D.; Dwyer, Greg; Dukic, Vanja

    2013-01-01

    Estimates of a disease’s basic reproductive rate R0 play a central role in understanding outbreaks and planning intervention strategies. In many calculations of R0, a simplifying assumption is that different host populations have effectively identical transmission rates. This assumption can lead to an underestimate of the overall uncertainty associated with R0, which, due to the non-linearity of epidemic processes, may result in a mis-estimate of epidemic intensity and miscalculated expenditures associated with public-health interventions. In this paper, we utilize a Bayesian method for quantifying the overall uncertainty arising from differences in population-specific basic reproductive rates. Using this method, we fit spatial and non-spatial susceptible-exposed-infected-recovered (SEIR) models to a series of 13 smallpox outbreaks. Five outbreaks occurred in populations that had been previously exposed to smallpox, while the remaining eight occurred in Native-American populations that were naïve to the disease at the time. The Native-American outbreaks were close in a spatial and temporal sense. Using Bayesian Information Criterion (BIC), we show that the best model includes population-specific R0 values. These differences in R0 values may, in part, be due to differences in genetic background, social structure, or food and water availability. As a result of these inter-population differences, the overall uncertainty associated with the “population average” value of smallpox R0 is larger, a finding that can have important consequences for controlling epidemics. In general, Bayesian hierarchical models are able to properly account for the uncertainty associated with multiple epidemics, provide a clearer understanding of variability in epidemic dynamics, and yield a better assessment of the range of potential risks and consequences that decision makers face. PMID:24021521

  14. Merging gauge and satellite rainfall with specification of associated uncertainty across Australia

    NASA Astrophysics Data System (ADS)

    Woldemeskel, Fitsum M.; Sivakumar, Bellie; Sharma, Ashish

    2013-08-01

    Accurate estimation of spatial rainfall is crucial for modelling hydrological systems and for the planning and management of water resources. While spatial rainfall can be estimated either from rain gauge-based measurements or from satellite-based measurements, such estimates are subject to uncertainties due to various sources of error in either case, including interpolation and retrieval errors. The purpose of the present study is twofold: (1) to investigate the benefit of merging rain gauge measurements and satellite rainfall data for Australian conditions and (2) to produce a database of retrospective rainfall along with a new uncertainty metric for each grid location at any timestep. The analysis involves four steps. First, a comparison of rain gauge measurements and the Tropical Rainfall Measuring Mission (TRMM) 3B42 data at the rain gauge locations is carried out. Second, gridded monthly rain gauge rainfall is determined using thin plate smoothing splines (TPSS) and the modified inverse distance weight (MIDW) method. Third, the gridded rain gauge rainfall is merged with the monthly accumulated TRMM 3B42 using a linearised weighting procedure, with the weights at each grid cell calculated from the error variances of each dataset. Finally, cross validation (CV) errors at rain gauge locations and standard errors at gridded locations for each timestep are estimated. The CV error statistics indicate that merging the two datasets improves the estimation of spatial rainfall, and more so where the rain gauge network is sparse. The provision of spatio-temporal standard errors with the retrospective dataset is particularly useful for subsequent modelling applications where knowledge of input errors can help reduce the uncertainty associated with modelling outcomes.
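
    A minimal sketch of the variance-weighted merging step, assuming the gauge interpolation and satellite retrieval error variances are already estimated at each grid cell (all numbers below are illustrative):

    ```python
    import numpy as np

    gauge = np.array([[110.0, 95.0], [80.0, 60.0]])       # gridded gauge rainfall (mm)
    trmm = np.array([[120.0, 90.0], [70.0, 75.0]])        # TRMM 3B42 rainfall (mm)
    var_gauge = np.array([[25.0, 400.0], [36.0, 900.0]])  # large where gauges are sparse
    var_trmm = np.array([[100.0, 100.0], [100.0, 100.0]])

    # Inverse-variance weighting: each cell leans on the more certain source.
    w = var_trmm / (var_gauge + var_trmm)                 # weight on the gauge field
    merged = w * gauge + (1.0 - w) * trmm
    merged_var = var_gauge * var_trmm / (var_gauge + var_trmm)

    print(merged.round(1))                # merged rainfall field
    print(np.sqrt(merged_var).round(1))   # standard error per grid cell
    ```

    The merged variance is never larger than either input variance, which is why the benefit is greatest where the gauge network is sparse and its interpolation variance dominates.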

  15. Tower-based greenhouse gas measurement network design—The National Institute of Standards and Technology North East Corridor Testbed

    NASA Astrophysics Data System (ADS)

    Lopez-Coto, Israel; Ghosh, Subhomoy; Prasad, Kuldeep; Whetstone, James

    2017-09-01

    The North-East Corridor (NEC) Testbed project is the third of three NIST (National Institute of Standards and Technology) greenhouse gas emissions testbeds designed to advance greenhouse gas measurement capabilities. A design approach for a dense observing network combined with atmospheric inversion methodologies is described. The Advanced Research Weather Research and Forecasting Model with the Stochastic Time-Inverted Lagrangian Transport model were used to derive the sensitivity of hypothetical observations to surface greenhouse gas emissions (footprints). Unlike other network design algorithms, an iterative selection algorithm based on a k-means clustering method was applied to minimize the similarities between the temporal responses of the sites and maximize sensitivity to the urban emissions contribution. Once a network was selected, a synthetic inversion Bayesian Kalman filter was used to evaluate observing system performance. We present the performance of various measurement network configurations consisting of differing numbers of towers and tower locations. Results show that an overly compact network has decreased spatial coverage, since each added site then contributes little new spatial information, whilst networks dispersed too broadly lose the ability to constrain flux uncertainties. In addition, we explore the possibility of using a very high density network of lower-cost, lower-performance sensors characterized by larger uncertainties and temporal drift. Analysis convergence is faster with a large number of observing locations, reducing the response time of the filter. Larger uncertainties in the observations imply lower values of uncertainty reduction. Drift, on the other hand, is a bias in nature: it is added to the observations and therefore biases the retrieved fluxes.

  16. MODFLOW 2000 Head Uncertainty, a First-Order Second Moment Method

    USGS Publications Warehouse

    Glasgow, H.S.; Fortney, M.D.; Lee, J.; Graettinger, A.J.; Reeves, H.W.

    2003-01-01

    A computationally efficient method to estimate the variance and covariance in piezometric head results computed through MODFLOW 2000 using a first-order second moment (FOSM) approach is presented. This methodology employs a first-order Taylor series expansion to combine model sensitivity with uncertainty in geologic data. MODFLOW 2000 is used to calculate both the ground water head and the sensitivity of head to changes in input data. From a limited number of samples, geologic data are extrapolated and their associated uncertainties are computed through a conditional probability calculation. Combining the spatially related sensitivity and input uncertainty produces the variance-covariance matrix, the diagonal of which is used to yield the standard deviation in MODFLOW 2000 head. The variance in piezometric head can be used for calibrating the model, estimating confidence intervals, directing exploration, and evaluating the reliability of a design. A case study illustrates the approach, where aquifer transmissivity is the spatially related uncertain geologic input data. The FOSM methodology is shown to be applicable for calculating output uncertainty for (1) spatially related input and output data, and (2) multiple input parameters (transmissivity and recharge).
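
    A compact numerical sketch of the FOSM propagation step, with a hypothetical Jacobian and transmissivity covariance standing in for the MODFLOW 2000 sensitivities and the conditionally simulated geologic data:

    ```python
    import numpy as np

    # Sensitivities of head at two locations to log-transmissivity in three
    # zones, as the MODFLOW 2000 sensitivity process would supply them
    # (values hypothetical).
    J = np.array([[0.8, 0.3, 0.1],
                  [0.2, 0.5, 0.6]])

    # Covariance of log-transmissivity from conditioning on limited samples.
    C_T = np.array([[0.30, 0.12, 0.05],
                    [0.12, 0.25, 0.10],
                    [0.05, 0.10, 0.40]])

    # First-order second moment propagation: C_h = J C_T J^T.
    C_h = J @ C_T @ J.T
    print("head standard deviations:", np.sqrt(np.diag(C_h)).round(3))
    ```

    The diagonal of C_h gives the head variances used for confidence intervals; the off-diagonal terms carry the spatial covariance between head predictions.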

  17. Benchmarking observational uncertainties for hydrology (Invited)

    NASA Astrophysics Data System (ADS)

    McMillan, H. K.; Krueger, T.; Freer, J. E.; Westerberg, I.

    2013-12-01

    There is a pressing need for authoritative and concise information on the expected error distributions and magnitudes in hydrological data, in order to understand their information content. Many studies have discussed how to incorporate uncertainty information into model calibration and implementation, and have shown how model results can be biased if uncertainty is not appropriately characterised. However, it is not always possible (for example, due to financial or time constraints) to make detailed studies of uncertainty for every research study. Instead, we propose that the hydrological community could benefit greatly from sharing information on likely uncertainty characteristics and the main factors that control the resulting magnitudes. In this presentation, we review the current knowledge of uncertainty for a number of key hydrological variables: rainfall, flow and water quality (suspended solids, nitrogen, phosphorus). We collated information on the specifics of the data measurement (data type, temporal and spatial resolution), the error characteristics measured (e.g. standard error, confidence bounds) and the error magnitude. Our results were primarily split by data type. Rainfall uncertainty was controlled most strongly by spatial scale; flow uncertainty was controlled by flow state (low or high) and gauging method. Water quality presented a more complex picture with many component errors. For all variables, it was easy to find examples where the relative error magnitude exceeded 40%. We discuss some of the recent developments in hydrology which increase the need for guidance on typical error magnitudes, in particular for comparative/regionalisation and multi-objective analyses. Increased sharing of data, comparisons between multiple catchments, and storage in national/international databases can mean that data users are far removed from data collection, yet require good uncertainty information to reduce bias in comparisons or catchment regionalisation studies. Recently it has become more common for hydrologists to use multiple data types and sources within a single study. This may be driven by complex water management questions which integrate water quantity, quality and ecology, or by recognition of the value of auxiliary data for understanding hydrological processes. We briefly discuss the impact of data uncertainty on the increasingly popular use of diagnostic signatures for hydrological process understanding and model development.

  18. Spatial modeling of litter and soil carbon stocks with associated uncertainty on forest land in the conterminous United States

    NASA Astrophysics Data System (ADS)

    Cao, B.; Domke, G. M.; Russell, M.; McRoberts, R. E.; Walters, B. F.

    2017-12-01

    Forest ecosystems contribute substantially to carbon (C) storage. The dynamics of litter decomposition, translocation and stabilization into soil layers are essential processes in the functioning of forest ecosystems, as they control the cycling of soil organic matter and the accumulation and release of C to the atmosphere. Therefore, the spatial distributions of litter and soil C stocks are important in greenhouse gas estimation and reporting and inform land management decisions, policy, and climate change mitigation strategies. In this study, we explored the effects of spatial aggregation of climatic, biotic, topographic and soil input data on national estimates of litter and soil C stocks and characterized the spatial distribution of litter and soil C stocks in the conterminous United States. Data from the Forest Inventory and Analysis (FIA) program within the US Forest Service were used with vegetation phenology data estimated from Landsat imagery (30 m) and raster data describing relevant environmental parameters (e.g., temperature, precipitation, topographic properties) for the entire conterminous US. Litter and soil C stocks were estimated and mapped through geostatistical analysis, and statistical uncertainty bounds on the pixel-level predictions were constructed using a Monte Carlo bootstrap technique, by which credible variance estimates for the C stocks were calculated. The sensitivity of model estimates to spatial aggregation depends on geographic region. Further, using long-term (30-year) climate averages during periods with strong climatic trends results in large differences in litter and soil C stock estimates. In addition, results suggest that local topographic aspect is an important variable in litter and soil C estimation at the continental scale.

  19. Decerns: A framework for multi-criteria decision analysis

    DOE PAGES

    Yatsalo, Boris; Didenko, Vladimir; Gritsyuk, Sergey; ...

    2015-02-27

    A new framework, Decerns, for multicriteria decision analysis (MCDA) of a wide range of practical risk management problems is introduced. The Decerns framework contains a library of modules that form the basis for two scalable systems: DecernsMCDA for analysis of multicriteria problems, and DecernsSDSS for multicriteria analysis of spatial options. DecernsMCDA includes well-known MCDA methods and original methods for uncertainty treatment based on probabilistic approaches and fuzzy numbers. These MCDA methods are described along with a case study on the analysis of a multicriteria location problem.

  20. The Generalized Uncertainty Principle and Harmonic Interaction in Three Spatial Dimensions

    NASA Astrophysics Data System (ADS)

    Hassanabadi, H.; Hooshmand, P.; Zarrinkamar, S.

    2015-01-01

    In three spatial dimensions, the generalized uncertainty principle is considered under an isotropic harmonic oscillator interaction in both the non-relativistic and relativistic regimes. By using novel transformations and separations of variables, exact analytical solutions for the energy eigenvalues as well as the wave functions are obtained. The time evolution in the non-relativistic regime is also reported.

  1. Spatial regression methods capture prediction uncertainty in species distribution model projections through time

    Treesearch

    Alan K. Swanson; Solomon Z. Dobrowski; Andrew O. Finley; James H. Thorne; Michael K. Schwartz

    2013-01-01

    The uncertainty associated with species distribution model (SDM) projections is poorly characterized, despite its potential value to decision makers. Error estimates from most modelling techniques have been shown to be biased due to their failure to account for spatial autocorrelation (SAC) of residual error. Generalized linear mixed models (GLMM) have the ability to...

  2. Communicating uncertainties in earth sciences in view of user needs

    NASA Astrophysics Data System (ADS)

    de Vries, Wim; Kros, Hans; Heuvelink, Gerard

    2014-05-01

    Uncertainties are inevitable in all results obtained in the earth sciences, regardless of whether these are based on field observations, experimental research or predictive modelling. When informing decision and policy makers or stakeholders, it is important that these uncertainties are also communicated. In communicating results, it is important to apply a "Progressive Disclosure of Information" (PDI) approach, moving from non-technical through more specialised information according to user needs. Generalized information is typically directed towards non-scientific audiences and intended for policy advice. Decision makers have to be aware of the implications of the uncertainty associated with results, so that they can account for it in their decisions. Detailed information on the uncertainties is generally intended for scientific audiences, to give insight into the underlying approaches and results. When communicating uncertainties, it is important to distinguish between scientific results that allow presentation in terms of probabilistic measures of uncertainty and more intrinsic uncertainties and errors that cannot be expressed in mathematical terms. Examples of earth science research that allow probabilistic measures of uncertainty, involving sophisticated statistical methods, are uncertainties in spatial and/or temporal variations in results of: • Observations, such as soil properties measured at sampling locations. In this case, the interpolation uncertainty, caused by a lack of data collected in space, can be quantified by e.g. kriging standard deviation maps or animations of conditional simulations. • Experimental measurements, comparing impacts of treatments at different sites and/or under different conditions. In this case, an indication of the average and range in measured responses to treatments can be obtained from a meta-analysis, summarizing experimental findings between replicates and across studies, sites, ecosystems, etc. • Model predictions due to uncertain model parameters (parametric variability). These uncertainties can be quantified by uncertainty propagation methods such as Monte Carlo simulation. Examples of intrinsic uncertainties that generally cannot be expressed in mathematical terms are errors or biases in: • Results of experiments and observations due to inadequate sampling and errors in analyzing data in the laboratory, and even in data reporting. • Results of (laboratory) experiments that are limited to a specific domain or performed under circumstances that differ from field circumstances. • Model structure, due to lack of knowledge of the underlying processes. Structural uncertainty, which may cause model inadequacy/bias, is inherent in model approaches since models are approximations of reality. Intrinsic uncertainties often occur in an emerging field where ongoing new findings, from either experiments, field observations or new model results, challenge earlier work. In this context, climate scientists working within the IPCC have adopted a lexicon to communicate confidence in their findings, ranging from "very high" through "high", "medium" and "low" to "very low" confidence. In fact, there are also statistical methods to gain insight into uncertainties in model predictions due to model assumptions (i.e. model structural error). Examples are comparing model results with independent observations or a systematic intercomparison of predictions from multiple models.
In the latter case, Bayesian model averaging techniques can be used, in which each model considered is assigned a prior probability of being the 'true' model. This approach works well with statistical (regression) models, but its extension to physically-based models is cumbersome. An alternative is the use of state-space models in which structural errors are represented as (additive) noise terms. In this presentation, we focus on approaches that are relevant at the science-policy interface, involving multiple scientific disciplines and policy makers with different subject areas. Approaches to communicate uncertainties in results of observations or model predictions are discussed, distinguishing results that include probabilistic measures of uncertainty from more intrinsic uncertainties. Examples concentrate on uncertainties in nitrogen (N) related environmental issues, including: • Spatio-temporal trends in atmospheric N deposition, in view of the policy question whether there is a declining or increasing trend. • The carbon response to N inputs to terrestrial ecosystems, based on meta-analysis of N addition experiments and other approaches, in view of the policy relevance of N emission control. • Calculated spatial variations in the emissions of nitrous oxide and ammonia, in view of the need for emission policies at different spatial scales. • Calculated N emissions and losses by model intercomparisons, in view of the policy need to apply no-regret decisions with respect to the control of those emissions.
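
    A minimal sketch of the Monte Carlo propagation of parametric uncertainty mentioned above; the toy emission model, parameter values, and distributions are hypothetical placeholders:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_draws = 10_000

    # Toy nitrous-oxide emission model: emission factor times fertilizer N input.
    def emission(ef, n_input):
        return ef * n_input

    ef = rng.normal(0.010, 0.002, n_draws)      # emission factor (kg N2O-N per kg N)
    n_input = rng.normal(150.0, 20.0, n_draws)  # fertilizer N input (kg N/ha)

    em = emission(ef, n_input)
    lo, med, hi = np.percentile(em, [2.5, 50, 97.5])
    print(f"median = {med:.2f} kg N2O-N/ha, 95% interval = ({lo:.2f}, {hi:.2f})")
    ```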

  3. Model-based scenario planning to develop climate change adaptation strategies for rare plant populations in grassland reserves

    Treesearch

    Laura Phillips-Mao; Susan M. Galatowitsch; Stephanie A. Snyder; Robert G. Haight

    2016-01-01

    Incorporating climate change into conservation decision-making at site and population scales is challenging due to uncertainties associated with localized climate change impacts and population responses to multiple interacting impacts and adaptation strategies. We explore the use of spatially explicit population models to facilitate scenario analysis, a conservation...

  4. Analysis of Coupled Model Uncertainties in Source to Dose Modeling of Human Exposures to Ambient Air Pollution: a PM2.5 Case-Study

    EPA Science Inventory

    Quantitative assessment of human exposures and health effects due to air pollution involve detailed characterization of impacts of air quality on exposure and dose. A key challenge is to integrate these three components on a consistent spatial and temporal basis taking into acco...

  5. Adding uncertainty to forest inventory plot locations: effects on analyses using geospatial data

    Treesearch

    Alexia A. Sabor; Volker C. Radeloff; Ronald E. McRoberts; Murray Clayton; Susan I. Stewart

    2007-01-01

    The Forest Inventory and Analysis (FIA) program of the USDA Forest Service alters plot locations before releasing data to the public to ensure landowner confidentiality and sample integrity, but using data with altered plot locations in conjunction with other spatially explicit data layers produces analytical results with unknown amounts of error. We calculated the...

  6. Quantifying geological uncertainty for flow and transport modeling in multi-modal heterogeneous formations

    NASA Astrophysics Data System (ADS)

    Feyen, Luc; Caers, Jef

    2006-06-01

    In this work, we address the problem of characterizing the heterogeneity and uncertainty of hydraulic properties for complex geological settings. Here, we distinguish between two scales of heterogeneity, namely the hydrofacies structure and the intrafacies variability of the hydraulic properties. We employ multiple-point geostatistics to characterize the hydrofacies architecture. The multiple-point statistics are borrowed from a training image that is designed to reflect the prior geological conceptualization. The intrafacies variability of the hydraulic properties is represented using conventional two-point correlation methods, more precisely, spatial covariance models under a multi-Gaussian spatial law. We address the different levels and sources of uncertainty in characterizing the subsurface heterogeneity, and explore their effect on groundwater flow and transport predictions. Typically, uncertainty is assessed by way of many images, termed realizations, of a fixed statistical model. However, in many cases, sampling from a fixed stochastic model does not adequately represent the space of uncertainty. It neglects the uncertainty related to the selection of the stochastic model and the estimation of its input parameters. We acknowledge the uncertainty inherent in the definition of the prior conceptual model of aquifer architecture and in the estimation of global statistics, anisotropy, and correlation scales. Spatial bootstrap is used to assess the uncertainty of the unknown statistical parameters. As an illustrative example, we employ a synthetic field that represents a fluvial setting consisting of an interconnected network of channel sands embedded within finer-grained floodplain material. For this highly non-stationary setting we quantify the groundwater flow and transport model prediction uncertainty for various levels of hydrogeological uncertainty. Results indicate the importance of accurately describing the facies geometry, especially for transport predictions.

  7. Uncertainties in historical pollution data from sedimentary records from an Australian urban floodplain lake

    NASA Astrophysics Data System (ADS)

    Lintern, A.; Leahy, P.; Deletic, A.; Heijnis, H.; Zawadzki, A.; Gadd, P.; McCarthy, D.

    2018-05-01

    Sediment cores from aquatic environments can provide valuable information about historical pollution levels and sources. However, there is little understanding of the uncertainties associated with these findings. The aim of this study is to fill this knowledge gap by proposing a framework for quantifying the uncertainties in historical heavy metal pollution records reconstructed from sediment cores. This uncertainty framework consists of six sources of uncertainty: uncertainties in (1) metals analysis methods, (2) the spatial variability of sediment core heavy metal profiles, (3) sub-sampling intervals, (4) the sediment chronology, (5) the assumption that metal levels in bed sediments reflect the magnitude of metal inputs into the aquatic system, and (6) the post-depositional transformation of metals. We apply this uncertainty framework to an urban floodplain lake in south-east Australia (Willsmere Billabong). We find that for this site, uncertainties in dated historical heavy metal profiles can be up to 176%, largely due to uncertainties in the sediment chronology and in the assumption that the settled heavy metal mass is equivalent to the heavy metal mass entering the aquatic system. As such, we recommend that future studies reconstructing historical pollution records from sediment cores undertake an investigation of the uncertainties in the reconstructed record, using the uncertainty framework provided in this study. We envisage that quantifying and understanding the uncertainties associated with reconstructed pollution records will facilitate the practical application of sediment core heavy metal profiles in environmental management projects.

  8. Optimal Integration of Departures and Arrivals in Terminal Airspace

    NASA Technical Reports Server (NTRS)

    Xue, Min; Zelinski, Shannon Jean

    2013-01-01

    Coordination of operations with spatially and temporally shared resources, such as route segments, fixes, and runways, improves the efficiency of terminal airspace management. Problems in this category are, in general, computationally difficult compared to conventional scheduling problems. This paper presents a fast-time algorithm formulation using a non-dominated sorting genetic algorithm (NSGA). The algorithm was first applied to a test problem introduced in the existing literature, and solved the 20-aircraft problem in fast time with a 65% (440 second) delay reduction using shared departure fixes. To test its application to a more realistic and complicated problem, the NSGA algorithm was then applied to a problem in LAX terminal airspace, where interactions between 28% of LAX arrivals and 10% of LAX departures are resolved by spatial separation in current operations, which may introduce unnecessary delays. In this work, three types of separation - spatial, temporal, and hybrid - were formulated using the new algorithm; hybrid separation combines temporal and spatial separation. Results showed that although temporal separation achieved less delay than spatial separation with a small uncertainty buffer, spatial separation outperformed temporal separation when the uncertainty buffer was increased. Hybrid separation introduced much less delay than both the spatial and temporal approaches. For a total of 15 interacting departures and arrivals, when compared to spatial separation, the delay reduction of hybrid separation varied between 11% (3.1 minutes) and 64% (10.7 minutes) for uncertainty buffers from 0 to 60 seconds. Furthermore, as a comparison with the NSGA algorithm, a First-Come-First-Serve based heuristic method was implemented for hybrid separation. Experiments showed that the NSGA algorithm results have 9% to 42% less delay than the heuristic method across the varied uncertainty buffer sizes.

  9. A heteroskedastic error covariance matrix estimator using a first-order conditional autoregressive Markov simulation for deriving asymptotically efficient estimates from ecologically sampled Anopheles arabiensis aquatic habitat covariates

    PubMed Central

    Jacob, Benjamin G; Griffith, Daniel A; Muturi, Ephantus J; Caamano, Erick X; Githure, John I; Novak, Robert J

    2009-01-01

    Background Autoregressive regression coefficients for Anopheles arabiensis aquatic habitat models are usually assessed using global error techniques and are reported as error covariance matrices. A global statistic, however, will summarize error estimates from multiple habitat locations. This makes it difficult to identify where there are clusters of An. arabiensis aquatic habitats of acceptable prediction. It is therefore useful to conduct some form of spatial error analysis to detect clusters of An. arabiensis aquatic habitats based on uncertainty residuals from individually sampled habitats. In this research, a method of error estimation for spatial simulation models was demonstrated using autocorrelation indices and eigenfunction spatial filters to distinguish among the effects of parameter uncertainty on a stochastic simulation of ecologically sampled Anopheles aquatic habitat covariates. A test for diagnostic checking of error residuals in an An. arabiensis aquatic habitat model may enable intervention efforts targeting productive habitat clusters, based on larval/pupal productivity, by using the asymptotic distribution of parameter estimates from a residual autocovariance matrix. The models considered in this research extend a normal regression analysis previously considered in the literature. Methods Field and remote-sampled data were collected from July 2006 to December 2007 in the Karima rice-village complex in Mwea, Kenya. SAS 9.1.4® was used to explore univariate statistics, correlations and distributions, and to generate global autocorrelation statistics from the ecologically sampled datasets. A local autocorrelation index was also generated using spatial covariance parameters (i.e., Moran's Indices) in a SAS/GIS® database. The Moran's statistic was decomposed into orthogonal and uncorrelated synthetic map pattern components using a Poisson model with a gamma-distributed mean (i.e., negative binomial regression). The eigenfunction values from the spatial configuration matrices were then used to define expectations for prior distributions using a Markov chain Monte Carlo (MCMC) algorithm. A set of posterior means was defined in WinBUGS 1.4.3®. After the model had converged, samples from the conditional distributions were used to summarize the posterior distribution of the parameters. Thereafter, a spatial residual trend analysis was used to evaluate variance uncertainty propagation in the model using an autocovariance error matrix. Results By specifying coefficient estimates in a Bayesian framework, the covariate number of tillers was found to be a significant predictor, positively associated with An. arabiensis aquatic habitats. The spatial filter models accounted for approximately 19% redundant locational information in the ecologically sampled An. arabiensis aquatic habitat data. In the residual error estimation model there was significant positive autocorrelation (i.e., clustering of habitats in geographic space) based on log-transformed larval/pupal data and the sampled covariate depth of habitat. Conclusion An autocorrelation error covariance matrix and a spatial filter analysis can prioritize mosquito control strategies by providing a computationally attractive and feasible description of variance uncertainty estimates for correctly identifying clusters of prolific An. arabiensis aquatic habitats based on larval/pupal productivity. PMID:19772590
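
    A sketch of the global spatial autocorrelation index used above (Moran's I), computed here with hypothetical habitat locations and inverse-distance weights rather than the study's SAS/GIS® workflow:

    ```python
    import numpy as np

    def morans_i(z, W):
        """Global Moran's I for values z and spatial weights matrix W."""
        z = z - z.mean()
        n = len(z)
        return (n / W.sum()) * (z @ W @ z) / (z @ z)

    rng = np.random.default_rng(8)
    xy = rng.uniform(0, 10, size=(50, 2))     # hypothetical sampled habitat locations
    z = xy[:, 0] + rng.normal(0, 1, 50)       # larval counts with a spatial trend

    # Row-standardized inverse-distance weights with a zero diagonal.
    d = np.linalg.norm(xy[:, None] - xy[None, :], axis=-1)
    W = np.where(d > 0, 1.0 / d, 0.0)
    W /= W.sum(axis=1, keepdims=True)

    # Values well above E[I] = -1/(n-1) indicate clustering in geographic space.
    print(f"Moran's I = {morans_i(z, W):.3f}")
    ```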

  10. Estimating Prediction Uncertainty from Geographical Information System Raster Processing: A User's Manual for the Raster Error Propagation Tool (REPTool)

    USGS Publications Warehouse

    Gurdak, Jason J.; Qi, Sharon L.; Geisler, Michael L.

    2009-01-01

    The U.S. Geological Survey Raster Error Propagation Tool (REPTool) is a custom tool for use with the Environmental Systems Research Institute (ESRI) ArcGIS Desktop application to estimate error propagation and prediction uncertainty in raster processing operations and geospatial modeling. REPTool is designed to introduce concepts of error and uncertainty in geospatial data and modeling and to provide users of ArcGIS Desktop with a geoprocessing tool and methodology to consider how error affects geospatial model output. Similar to other geoprocessing tools available in ArcGIS Desktop, REPTool can be run from a dialog window, from the ArcMap command line, or from a Python script. REPTool consists of public-domain, Python-based packages that implement Latin Hypercube Sampling within a probabilistic framework to track error propagation in geospatial models and quantitatively estimate the uncertainty of the model output. Users may specify error for each input raster or model coefficient represented in the geospatial model. The error for the input rasters may be specified as either spatially invariant or spatially variable across the spatial domain. Users may specify model output as a distribution of uncertainty for each raster cell. REPTool uses the Relative Variance Contribution method to quantify the relative error contribution from the two primary components in the geospatial model - errors in the model input data and coefficients of the model variables. REPTool is appropriate for many types of geospatial processing operations, modeling applications, and related research questions, including applications that consider spatially invariant or spatially variable error in geospatial data.
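
    For readers unfamiliar with the underlying mechanics, here is a minimal Python sketch of the general approach the manual describes (Latin hypercube sampling of input error, Monte Carlo propagation through a raster operation, per-cell output uncertainty). The raster, the spatially invariant and fully correlated Gaussian error model, and the gradient-magnitude operation are invented stand-ins, not REPTool's actual API.

        import numpy as np
        from scipy.stats import norm, qmc

        rng = np.random.default_rng(42)
        elev = rng.normal(100.0, 5.0, size=(20, 20))   # hypothetical input raster
        sigma = 0.5                                    # spatially invariant std. error

        def model(raster):
            # stand-in geospatial operation: gradient magnitude of the raster
            gy, gx = np.gradient(raster)
            return np.hypot(gx, gy)

        n_real = 200
        lhs = qmc.LatinHypercube(d=1, seed=1).random(n_real)  # stratified U(0,1)
        shifts = norm.ppf(lhs[:, 0], loc=0.0, scale=sigma)    # stratified N(0, sigma)

        # propagate each equally likely error realization through the model
        outputs = np.stack([model(elev + s) for s in shifts])
        print("mean per-cell output std (uncertainty):", outputs.std(axis=0).mean())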

  11. The worth of data to reduce predictive uncertainty of an integrated catchment model by multi-constraint calibration

    NASA Astrophysics Data System (ADS)

    Koch, J.; Jensen, K. H.; Stisen, S.

    2017-12-01

    Hydrological models that integrate numerical process descriptions across compartments of the water cycle are typically required to undergo thorough model calibration in order to estimate suitable effective model parameters. In this study, we apply a spatially distributed hydrological model code which couples the saturated zone with the unsaturated zone and the energy partitioning at the land surface. We conduct a comprehensive multi-constraint model calibration against nine independent observational datasets which reflect both the temporal and the spatial behavior of the hydrological response of a 1,000 km2 catchment in Denmark. The datasets are obtained from satellite remote sensing and in-situ measurements and cover five keystone hydrological variables: discharge, evapotranspiration, groundwater head, soil moisture and land surface temperature. Results indicate that a balanced optimization can be achieved where errors on the objective functions for all nine observational datasets are reduced simultaneously. The applied calibration framework was tailored with a focus on improving the spatial pattern performance; however, results suggest that the optimization is still more prone to improve the temporal dimension of model performance. This study also features a post-calibration linear uncertainty analysis, which allows quantifying parameter identifiability, that is, the worth of a specific observational dataset for inferring model parameter values through calibration. Furthermore, the ability of an observation to reduce predictive uncertainty is assessed. Such findings have concrete implications for the design of model calibration frameworks and, in more general terms, for the acquisition of data in hydrological observatories.
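
    The post-calibration linear analysis mentioned above follows first-order second-moment (FOSM) logic, sketched below in Python with an invented Jacobian, covariance matrices and prediction sensitivity vector; data worth is measured here as the rise in predictive variance when one observation is omitted.

        import numpy as np

        def posterior_cov(J, C_obs, C_prior):
            # Bayes-linear (FOSM) posterior parameter covariance:
            # C_post = (J' C_obs^-1 J + C_prior^-1)^-1
            A = J.T @ np.linalg.inv(C_obs) @ J + np.linalg.inv(C_prior)
            return np.linalg.inv(A)

        rng = np.random.default_rng(3)
        n_par, n_obs = 4, 9                      # e.g. nine observation datasets
        J = rng.normal(size=(n_obs, n_par))      # Jacobian of obs w.r.t. parameters
        C_obs = np.diag(np.full(n_obs, 0.1**2))  # observation error covariance
        C_prior = np.eye(n_par)                  # prior parameter covariance
        y = rng.normal(size=n_par)               # sensitivity of a model prediction

        base = y @ posterior_cov(J, C_obs, C_prior) @ y
        for k in range(n_obs):                   # data worth: drop one obs at a time
            keep = [i for i in range(n_obs) if i != k]
            s2 = y @ posterior_cov(J[keep], C_obs[np.ix_(keep, keep)], C_prior) @ y
            print(f"obs {k}: predictive variance rises {100*(s2/base - 1):.1f}% if omitted")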

  12. Quantifying uncertainty in national forest carbon stocks: challenges and opportunities for the United States National Greenhouse Gas Inventory

    NASA Astrophysics Data System (ADS)

    Clough, B.; Russell, M.; Domke, G. M.; Woodall, C. W.

    2016-12-01

    Uncertainty estimates are needed to establish confidence in national forest carbon stocks and to verify changes reported to the United Nations Framework Convention on Climate Change. Good practice guidance from the Intergovernmental Panel on Climate Change stipulates that uncertainty assessments should neither exaggerate nor underestimate the actual error within carbon stocks, yet methodological guidance for forests has been hampered by limited understanding of how complex dynamics give rise to errors across spatial scales (i.e., individuals to continents). This talk highlights efforts to develop a multi-scale, data-driven framework for assessing uncertainty within the United States (US) forest carbon inventory, and focuses on challenges and opportunities for improving the precision of national forest carbon stock estimates. Central to our approach is the calibration of allometric models with a newly established legacy biomass database for North American tree species, and the use of hierarchical models to link these data with the Forest Inventory and Analysis (FIA) database as well as remote sensing datasets. Our work suggests substantial risk for misestimating key sources of uncertainty including: (1) attributing more confidence in allometric models than what is warranted by the best available data; (2) failing to capture heterogeneity in biomass stocks due to environmental variation at regional scales; and (3) ignoring spatial autocorrelation and other random effects that are characteristic of national forest inventory data. Our results suggest these sources of error may be much higher than is generally assumed, though these results must be understood with the limited scope and availability of appropriate calibration data in mind. In addition to reporting on important sources of uncertainty, this talk will discuss opportunities to improve the precision of national forest carbon stocks that are motivated by our use of data-driven forecasting including: (1) improving the taxonomic and geographic scope of available biomass data; (2) direct attribution of landscape-level heterogeneity in biomass stocks to specific ecological processes; and (3) integration of expert opinion and meta-analysis to lessen the influence of often highly variable datasets on biomass stock forecasts.

  13. Estimation and Uncertainty Analysis of Impacts of Future Heat Waves on Mortality in the Eastern United States

    PubMed Central

    Wu, Jianyong; Zhou, Ying; Gao, Yang; Fu, Joshua S.; Johnson, Brent A.; Huang, Cheng; Kim, Young-Min

    2013-01-01

    Background: Climate change is anticipated to influence heat-related mortality in the future. However, estimates of excess mortality attributable to future heat waves are subject to large uncertainties and have not been projected under the latest greenhouse gas emission scenarios. Objectives: We estimated future heat wave mortality in the eastern United States (approximately 1,700 counties) under two Representative Concentration Pathways (RCPs) and investigated sources of uncertainty. Methods: Using dynamically downscaled hourly temperature projections for 2057–2059, we projected heat wave days that were defined using four heat wave metrics and estimated the excess mortality attributable to them. We apportioned the sources of uncertainty in excess mortality estimates using a variance-decomposition method. Results: Estimates suggest that excess mortality attributable to heat waves in the eastern United States would result in 200–7,807 deaths/year (mean 2,379 deaths/year) in 2057–2059. Average excess mortality projections under RCP4.5 and RCP8.5 scenarios were 1,403 and 3,556 deaths/year, respectively. Excess mortality would be relatively high in the southern states and eastern coastal areas (excluding Maine). The major sources of uncertainty were the relative risk estimates for mortality on heat wave versus non–heat wave days, the RCP scenarios, and the heat wave definitions. Conclusions: Mortality risks from future heat waves may be an order of magnitude higher than the mortality risks reported in 2002–2004, with thousands of heat wave–related deaths per year in the study area projected under the RCP8.5 scenario. Substantial spatial variability in county-level heat mortality estimates suggests that effective mitigation and adaptation measures should be developed based on spatially resolved data. Citation: Wu J, Zhou Y, Gao Y, Fu JS, Johnson BA, Huang C, Kim YM, Liu Y. 2014. Estimation and uncertainty analysis of impacts of future heat waves on mortality in the eastern United States. Environ Health Perspect 122:10–16; http://dx.doi.org/10.1289/ehp.1306670 PMID:24192064
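
    A hedged Python sketch of the kind of variance decomposition used to apportion projection uncertainty across factors such as RCP scenario, heat wave definition and relative-risk estimate; the factorial array of mortality estimates is fabricated, and the main-effect shares need not sum to one because interaction terms are ignored.

        import numpy as np

        rng = np.random.default_rng(7)
        # hypothetical grid of excess-mortality estimates: 2 RCPs x 4 heat wave
        # definitions x 3 relative-risk (RR) draws (all values are made up)
        est = rng.gamma(2.0, 1200.0, size=(2, 4, 3))

        total_var = est.var()
        for axis, name in enumerate(["RCP scenario", "HW definition", "RR estimate"]):
            other = tuple(a for a in range(est.ndim) if a != axis)
            main = est.mean(axis=other)          # factor-level means
            frac = main.var() / total_var        # main-effect variance share
            print(f"{name}: {100*frac:.0f}% of variance")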

  14. Evaluating land cover influences on model uncertainties—A case study of cropland carbon dynamics in the Mid-Continent Intensive Campaign region

    USGS Publications Warehouse

    Li, Zhengpeng; Liu, Shuguang; Zhang, Xuesong; West, Tristram O.; Ogle, Stephen M.; Zhou, Naijun

    2016-01-01

    Quantifying spatial and temporal patterns of carbon sources and sinks and their uncertainties across agriculture-dominated areas remains challenging for understanding regional carbon cycles. Characteristics of local land cover inputs can impact regional carbon estimates, but the effect has not been fully evaluated in the past. Within the North American Carbon Program Mid-Continent Intensive (MCI) Campaign, three models were developed to estimate carbon fluxes on croplands: an inventory-based model, the Environmental Policy Integrated Climate (EPIC) model, and the General Ensemble biogeochemical Modeling System (GEMS) model. They all provided estimates of three major carbon fluxes on cropland: net primary production (NPP), net ecosystem production (NEP), and soil organic carbon (SOC) change. Using data mining and spatial statistics, we studied the spatial distribution of the carbon flux uncertainties and the relationships between the uncertainties and land cover characteristics. Results indicated that uncertainties for all three carbon fluxes were not randomly distributed, but instead formed multiple clusters within the MCI region. We investigated the impacts of three land cover characteristics on the flux uncertainties: cropland percentage, cropland richness and cropland diversity. The results indicated that cropland percentage significantly influenced the uncertainties of NPP and NEP, but not the uncertainties of SOC change. Greater uncertainties of NPP and NEP were found in counties with small cropland percentage than in counties with large cropland percentage. Cropland species richness and diversity also showed negative correlations with the model uncertainties. Our study demonstrated that land cover characteristics contributed to the uncertainties of regional carbon flux estimates. The approaches we used in this study can be applied to other ecosystem models to identify areas with high uncertainties and where models can be improved to reduce overall uncertainties in regional carbon flux estimates.

  15. Estimates of Global Rangeland Net Primary Productivity and its Consumption Based on Climate and Livestock Distribution Data

    NASA Astrophysics Data System (ADS)

    Asrar, G.; Wolf, J.; Rafique, R.; West, T. O.; Ogle, S. M.

    2016-12-01

    Rangelands play an important role in providing ecosystem services such as food, forage, and fuels in many parts of the world. The net primary productivity (NPP), the difference between CO2 fixed by plants and CO2 lost to autotrophic respiration, is a good indicator of the productivity of rangeland ecosystems and their contribution to the cycling of carbon in the Earth system. In this study, we estimated the NPP of global rangelands, its consumption by grazing livestock, and the associated uncertainties, to better understand and quantify the contribution of rangelands to land-based carbon storage. We estimated rangeland NPP using mean annual precipitation data from the Climatic Research Unit (CRU) and a regression model based on global observations (Del Grosso et al., 2008). Spatial distributions of annual livestock consumption of rangeland NPP (Wolf et al., 2015) were combined with gridded annual rangeland NPP for the years 2000-2011. The uncertainty analysis of these estimates was conducted using a Monte Carlo approach. The rangeland NPP estimates with associated uncertainties were also compared with total modeled GPP estimates obtained from dynamic vegetation model simulations. Our results showed that mean above-ground NPP of rangelands is 1017.5 MgC/km2, while mean below-ground NPP is 847.6 MgC/km2. The total rangeland NPP represents a significant portion of the total NPP of the terrestrial ecosystem. The livestock area requirements used to spatially distribute livestock are based on optimal pasturage and are low relative to area requirements on less productive land. Even so, ca. 90% of annual livestock consumption of rangeland NPP was met with no adjustment of livestock distributions. Moreover, the results of this study allowed us to explicitly quantify the temporal and spatial variations of rangeland NPP under different climatic conditions. Uncertainty analysis was helpful in identifying the strengths and weaknesses of the methods used to estimate rangeland NPP. Overall, the results from this study are useful for quantifying the contribution of rangelands to the carbon cycle and for providing geospatially distributed carbon fluxes associated with the production and consumption of rangeland biomass.
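
    A minimal Python sketch of the Monte Carlo step, assuming an invented saturating precipitation-NPP curve with invented coefficient uncertainties (not the published Del Grosso et al. regression or the CRU data):

        import numpy as np

        rng = np.random.default_rng(11)
        map_mm = rng.uniform(200, 900, size=1000)   # mean annual precipitation grid

        def npp(map_mm, a, b):
            # illustrative saturating NPP-precipitation curve (gC m-2 yr-1);
            # the coefficients are made up, not the published parameters
            return a * (1.0 - np.exp(-b * map_mm))

        n_mc = 500
        a = rng.normal(600.0, 60.0, n_mc)           # uncertain regression coefficients
        b = rng.normal(0.0025, 0.0003, n_mc)
        sims = np.array([npp(map_mm, a[i], b[i]).mean() for i in range(n_mc)])
        lo, hi = np.percentile(sims, [2.5, 97.5])
        print(f"regional mean NPP: {sims.mean():.0f} (95% CI {lo:.0f}-{hi:.0f})")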

  16. Deriving Continuous Fields of Tree Cover at 1-m over the Continental United States From the National Agriculture Imagery Program (NAIP) Imagery to Reduce Uncertainties in Forest Carbon Stock Estimation

    NASA Astrophysics Data System (ADS)

    Ganguly, S.; Basu, S.; Mukhopadhyay, S.; Michaelis, A.; Milesi, C.; Votava, P.; Nemani, R. R.

    2013-12-01

    An unresolved issue with coarse-to-medium resolution satellite-based forest carbon mapping over regional to continental scales is the high level of uncertainty in above ground biomass (AGB) estimates caused by the absence of forest cover information at a high enough spatial resolution (current spatial resolution is limited to 30 m). To put confidence in existing satellite-derived AGB density estimates, it is imperative to create continuous fields of tree cover at a sufficiently high resolution (e.g., 1 m) such that large uncertainties in forested area are reduced. The proposed work will provide means to reduce uncertainty in present satellite-derived AGB maps and Forest Inventory and Analysis (FIA) based regional estimates. Our primary objective will be to create Very High Resolution (VHR) estimates of tree cover at a spatial resolution of 1 m for the Continental United States using all available National Agriculture Imagery Program (NAIP) color-infrared imagery from 2010 through 2012. We will leverage the existing capabilities of the NASA Earth Exchange (NEX) high performance computing and storage facilities. The proposed 1-m tree cover map can be further aggregated to provide percent tree cover at any medium-to-coarse resolution spatial grid, which will aid in reducing uncertainties in AGB density estimation at the respective grid and overcome current limitations imposed by medium-to-coarse resolution land cover maps. We have implemented a scalable and computationally efficient parallelized framework for tree-cover delineation; the core components of the algorithm include a feature extraction process, a Statistical Region Merging image segmentation algorithm, and a classification algorithm based on a Deep Belief Network and a Feedforward Backpropagation Neural Network. An initial pilot exercise has been performed over the state of California (~11,000 scenes) to create a wall-to-wall 1-m tree cover map, and the classification accuracy has been assessed. Results show an improvement in the accuracy of tree-cover delineation as compared to existing forest cover maps from NLCD, especially over fragmented, heterogeneous and urban landscapes. Estimates of VHR tree cover will complement and enhance the accuracy of present remote-sensing based AGB modeling approaches and forest inventory based estimates at both national and local scales. A requisite step will be to characterize the inherent uncertainties in tree cover estimates and propagate them to estimate AGB.

  17. A Bayesian Framework of Uncertainties Integration in 3D Geological Model

    NASA Astrophysics Data System (ADS)

    Liang, D.; Liu, X.

    2017-12-01

    A 3D geological model can describe complicated geological phenomena in an intuitive way, but its application may be limited by uncertain factors. Great progress has been made over the years, yet many studies decompose the uncertainties of a geological model and analyze them item by item from each source, ignoring the comprehensive impact of multi-source uncertainties. To evaluate the synthetical uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework of uncertainty integration. With this framework, we integrate data errors, spatial randomness, and cognitive information into a posterior distribution to evaluate the synthetical uncertainty of a geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty is a kind of simulation of the uncertainty propagation. Bayesian inference accomplishes uncertainty updating in the modeling process. The maximum entropy principle is effective for estimating the prior probability distribution, as it ensures that the prior is subject to the constraints supplied by the given information with minimum prejudice. In the end, we obtain a posterior distribution that evaluates the synthetical uncertainty of the geological model. This posterior distribution represents the synthetical impact of all the uncertain factors on the spatial structure of the geological model. The framework provides a solution for evaluating the synthetical impact of multi-source uncertainties on a geological model and an approach to studying the uncertainty propagation mechanism in geological modeling.
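
    A minimal grid-based Python sketch of the updating step described here: a maximum-entropy prior (Gaussian, given only a mean and a variance constraint) for a horizon depth is combined with a noisy borehole observation via Bayes' rule; all numbers are invented.

        import numpy as np

        # maximum-entropy prior: with only a mean and variance specified, the
        # least-prejudiced prior for a horizon depth is Gaussian
        depth = np.linspace(80.0, 120.0, 401)          # candidate depths (m)
        prior = np.exp(-0.5 * ((depth - 100.0) / 8.0) ** 2)
        prior /= prior.sum()

        # borehole observation with measurement error (data-error uncertainty)
        obs, obs_sd = 104.0, 2.0
        like = np.exp(-0.5 * ((depth - obs) / obs_sd) ** 2)

        post = prior * like                            # Bayesian updating
        post /= post.sum()
        mean = (depth * post).sum()
        sd = np.sqrt(((depth - mean) ** 2 * post).sum())
        print(f"posterior depth: {mean:.1f} +/- {sd:.1f} m")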

  18. Consistency of Aquarius version-4 sea surface salinity with Argo products on various spatial and temporal scales

    NASA Astrophysics Data System (ADS)

    Lee, Tong

    2017-04-01

    Understanding the accuracies of satellite-derived sea surface salinity (SSS) measurements in depicting temporal changes, and the dependence of the accuracies on spatiotemporal scales, is important to capability assessment, future mission design, and applications to study oceanic phenomena of different spatiotemporal scales. This study quantifies the consistency between Aquarius Version-4 monthly gridded SSS (released in late 2015) and two widely used Argo monthly gridded near-surface salinity products. The analysis focused on their consistency in depicting temporal changes (including seasonal and non-seasonal) on various spatial scales: 1°x1°, 3°x3°, and 10°x10°. Globally averaged standard deviation (STD) values for Aquarius-Argo salinity differences on these three spatial scales are 0.16, 0.14, and 0.09 psu, compared to values between the two Argo products of 0.10, 0.09, and 0.04 psu. Aquarius SSS compares better with Argo data on non-seasonal (e.g., interannual and intraseasonal) than on seasonal time scales. The seasonal Aquarius-Argo SSS differences are mostly concentrated at high latitudes. The Aquarius team is making active efforts to further reduce these high-latitude seasonal biases. The consistency between Aquarius and Argo salinity is similar to that between the two Argo products in the tropics and subtropics for non-seasonal signals, and in the tropics for seasonal signals. Therefore, the representativeness errors of the Argo products for various spatial scales (related to sampling and gridding) need to be taken into account when estimating the uncertainty of Aquarius SSS. The globally averaged uncertainty of large-scale (10°x10°) non-seasonal Aquarius SSS is approximately 0.04 psu. These estimates reflect the significant improvements of Aquarius Version-4 SSS over the previous versions. The estimates can be used as baseline requirements for future ocean salinity missions from space. The spatial distribution of the uncertainty estimates is also useful for assimilation of Aquarius SSS.

  19. Predicting long-range transport: a systematic evaluation of two multimedia transport models.

    PubMed

    Bennett, D H; Scheringer, M; McKone, T E; Hungerbühler, K

    2001-03-15

    The United Nations Environment Program has recently developed criteria to identify and restrict chemicals with a potential for persistence and long-range transport (persistent organic pollutants or POPs). There are many stakeholders involved, and the issues are not only scientific but also include social, economic, and political factors. This work focuses on one aspect of the POPs debate, the criteria for determining the potential for long-range transport (LRT). Our goal is to determine if current models are reliable enough to support decisions that classify a chemical based on the LRT potential. We examine the robustness of two multimedia fate models for determining the relative ranking and absolute spatial range of various chemicals in the environment. We also consider the effect of parameter uncertainties and the model uncertainty associated with the selection of an algorithm for gas-particle partitioning on the model results. Given the same chemical properties, both models give virtually the same ranking. However, when chemical parameter uncertainties and model uncertainties such as particle partitioning are considered, the spatial range distributions obtained for the individual chemicals overlap, preventing a distinct rank order. The absolute values obtained for the predicted spatial range or travel distance differ significantly between the two models for the uncertainties evaluated. We find that to evaluate a chemical when large and unresolved uncertainties exist, it is more informative to use two or more models and include multiple types of uncertainty. Model differences and uncertainties must be explicitly confronted to determine how the limitations of scientific knowledge impact predictions in the decision-making process.

  20. Effects of uncertain topographic input data on two-dimensional flow modeling in a gravel-bed river

    USGS Publications Warehouse

    Legleiter, C.J.; Kyriakidis, P.C.; McDonald, R.R.; Nelson, J.M.

    2011-01-01

    Many applications in river research and management rely upon two-dimensional (2D) numerical models to characterize flow fields, assess habitat conditions, and evaluate channel stability. Predictions from such models are potentially highly uncertain due to the uncertainty associated with the topographic data provided as input. This study used a spatial stochastic simulation strategy to examine the effects of topographic uncertainty on flow modeling. Many equally likely bed elevation realizations for a simple meander bend were generated and propagated through a typical 2D model to produce distributions of water-surface elevation, depth, velocity, and boundary shear stress at each node of the model's computational grid. Ensemble summary statistics were used to characterize the uncertainty associated with these predictions and to examine the spatial structure of this uncertainty in relation to channel morphology. Simulations conditioned on different data configurations indicated that model predictions became increasingly uncertain as the spacing between surveyed cross sections increased. Model sensitivity to topographic uncertainty was greater for base flow conditions than for a higher, subbankfull flow (75% of bankfull discharge). The degree of sensitivity also varied spatially throughout the bend, with the greatest uncertainty occurring over the point bar where the flow field was influenced by topographic steering effects. Uncertain topography can therefore introduce significant uncertainty to analyses of habitat suitability and bed mobility based on flow model output. In the presence of such uncertainty, the results of these studies are most appropriately represented in probabilistic terms using distributions of model predictions derived from a series of topographic realizations. Copyright 2011 by the American Geophysical Union.
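
    A hedged Python sketch of the realization-propagation idea: correlated bed-elevation errors with an exponential covariance are added to a surveyed profile and pushed through a deliberately simplified one-dimensional Manning calculation standing in for the 2D flow model; all values are invented.

        import numpy as np

        rng = np.random.default_rng(5)
        x = np.linspace(0.0, 200.0, 101)                 # streamwise coordinate (m)
        bed = 10.0 - 0.005 * x                           # surveyed mean bed profile

        # exponential covariance for survey error, range ~30 m, std 0.15 m
        d = np.abs(x[:, None] - x[None, :])
        C = 0.15**2 * np.exp(-d / 30.0)
        L = np.linalg.cholesky(C + 1e-10 * np.eye(len(x)))

        wse, n_man, S = 10.4, 0.035, 0.005               # stage, Manning n, slope
        depths, vels = [], []
        for _ in range(300):                             # equally likely bed realizations
            z = bed + L @ rng.standard_normal(len(x))
            h = np.clip(wse - z, 0.01, None)
            depths.append(h)
            vels.append(h**(2/3) * S**0.5 / n_man)       # wide-channel Manning velocity
        print("max depth std (m):   ", np.std(depths, axis=0).max().round(3))
        print("max velocity std (m/s):", np.std(vels, axis=0).max().round(3))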

  1. Role of turbulence fluctuations on uncertainties of acoustic Doppler current profiler discharge measurements

    USGS Publications Warehouse

    Tarrab, Leticia; Garcia, Carlos M.; Cantero, Mariano I.; Oberg, Kevin

    2012-01-01

    This work presents a systematic analysis quantifying the role of turbulence fluctuations in the uncertainties (random errors) of acoustic Doppler current profiler (ADCP) discharge measurements from moving platforms. Data sets of three-dimensional flow velocities with high temporal and spatial resolution were generated from direct numerical simulation (DNS) of turbulent open channel flow. Dimensionless functions relating parameters quantifying the uncertainty in discharge measurements due to flow turbulence (relative variance and relative maximum random error) to the sampling configuration were developed from the DNS simulations and then validated with field-scale discharge measurements. The validated functions were used to evaluate the role of flow turbulence fluctuations in the uncertainties of ADCP discharge measurements. The results of this work indicate that random errors due to flow turbulence are significant when: (a) a small number of transects is used for a discharge measurement, and (b) measurements are made in shallow rivers using high boat velocity (short time for the boat to cross a flow turbulence structure).

  2. Spatial planning using probabilistic flood maps

    NASA Astrophysics Data System (ADS)

    Alfonso, Leonardo; Mukolwe, Micah; Di Baldassarre, Giuliano

    2015-04-01

    Probabilistic flood maps account for uncertainty in flood inundation modelling and convey a degree of certainty in the outputs. Major sources of uncertainty include input data, topographic data, model structure, observation data and parametric uncertainty. Decision makers prefer less ambiguous information from modellers; this implies that uncertainty is suppressed to yield binary flood maps. However, suppressing this information may lead to surprises or to misleading decisions. Including uncertain information in the decision-making process is therefore desirable and more transparent. To this end, we utilise Prospect theory and information from a probabilistic flood map to evaluate potential decisions. Consequences related to the decisions were evaluated using flood risk analysis. Prospect theory explains how choices are made given options for which probabilities of occurrence are known, and accounts for decision makers' characteristics such as loss aversion and risk seeking. Our results show that the effect on decision making is most pronounced when there are high gains and losses, implying higher payoffs and penalties and therefore a higher gamble. Thus the methodology may be appropriately considered when making decisions based on uncertain information.
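
    A hedged Python sketch of a prospect-theoretic evaluation of a floodplain decision under a probabilistic flood map. The value and weighting functional forms follow Tversky and Kahneman (1992), with their median parameter estimates; a single weighting function is applied to gains and losses for brevity, and the flood probability and payoffs are invented.

        def value(x, alpha=0.88, beta=0.88, lam=2.25):
            # Tversky-Kahneman value function: risk averse for gains,
            # loss averse (lam > 1) for losses
            return x**alpha if x >= 0 else -lam * (-x)**beta

        def weight(p, gamma=0.61):
            # inverse-S probability weighting: overweights small probabilities
            return p**gamma / (p**gamma + (1 - p)**gamma)**(1 / gamma)

        # decide whether to develop a floodplain parcel, given the flooding
        # probability read off a probabilistic flood map (payoffs notional)
        p_flood = 0.2
        gain, loss = 100.0, -400.0    # develop: gain if dry, loss if flooded
        develop = weight(p_flood) * value(loss) + weight(1 - p_flood) * value(gain)
        print("prospect value of developing:    ", round(develop, 1))
        print("prospect value of not developing:", 0.0)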

  3. Underwater passive acoustic localization of Pacific walruses in the northeastern Chukchi Sea.

    PubMed

    Rideout, Brendan P; Dosso, Stan E; Hannay, David E

    2013-09-01

    This paper develops and applies a linearized Bayesian localization algorithm based on acoustic arrival times of marine mammal vocalizations at spatially-separated receivers which provides three-dimensional (3D) location estimates with rigorous uncertainty analysis. To properly account for uncertainty in receiver parameters (3D hydrophone locations and synchronization times) and environmental parameters (water depth and sound-speed correction), these quantities are treated as unknowns constrained by prior estimates and prior uncertainties. Unknown scaling factors on both the prior and arrival-time uncertainties are estimated by minimizing Akaike's Bayesian information criterion (a maximum entropy condition). Maximum a posteriori estimates for sound source locations and times, receiver parameters, and environmental parameters are calculated simultaneously using measurements of arrival times for direct and interface-reflected acoustic paths. Posterior uncertainties for all unknowns incorporate both arrival time and prior uncertainties. Monte Carlo simulation results demonstrate that, for the cases considered here, linearization errors are small and the lack of an accurate sound-speed profile does not cause significant biases in the estimated locations. A sequence of Pacific walrus vocalizations, recorded in the Chukchi Sea northwest of Alaska, is localized using this technique, yielding a track estimate and uncertainties with an estimated speed comparable to normal walrus swim speeds.

  4. Uncertainty analysis for the steady-state flows in a dual throat nozzle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Q.-Y.; Gottlieb, David; Hesthaven, Jan S.

    2005-03-20

    It is well known that the steady state of an isentropic flow in a dual-throat nozzle with equal throat areas is not unique. In particular there is a possibility that the flow contains a shock wave, whose location is determined solely by the initial condition. In this paper, we consider cases with uncertainty in this initial condition and use generalized polynomial chaos methods to study the steady-state solutions for stochastic initial conditions. Special interest is given to the statistics of the shock location. The polynomial chaos (PC) expansion modes are shown to be smooth functions of the spatial variable x, although each solution realization is discontinuous in the spatial variable x. When the variance of the initial condition is small, the probability density function of the shock location is computed with high accuracy. Otherwise, many terms are needed in the PC expansion to produce reasonable results due to the slow convergence of the PC expansion, caused by non-smoothness in random space.
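
    A hedged Python sketch of the convergence issue described above: a Hermite polynomial chaos expansion of a discontinuous function of a Gaussian input, with coefficients obtained by Gauss-Hermite quadrature; the step function is an invented stand-in for the shock location's dependence on the initial condition.

        import numpy as np
        from numpy.polynomial.hermite_e import hermegauss, hermeval
        from math import factorial, sqrt, pi, erf

        # discontinuous response of a standard Gaussian input xi
        f = lambda xi: np.where(xi < 0.3, -1.0, 1.0)

        nodes, wts = hermegauss(150)            # probabilists' Gauss-Hermite rule
        wts = wts / sqrt(2.0 * pi)              # normalize to the N(0,1) density

        P = 15                                  # polynomial chaos order
        c = np.array([(wts * f(nodes) * hermeval(nodes, np.eye(P + 1)[k])).sum()
                      / factorial(k) for k in range(P + 1)])

        mean_pc = c[0]                          # E[f] from the PC expansion
        var_pc = sum(factorial(k) * c[k] ** 2 for k in range(1, P + 1))
        Phi = 0.5 * (1.0 + erf(0.3 / sqrt(2.0)))
        print(f"PC    mean {mean_pc:+.4f}  var {var_pc:.4f}")
        print(f"exact mean {1 - 2*Phi:+.4f}  var {1 - (1 - 2*Phi)**2:.4f}")
        # var_pc approaches the exact value only slowly with P, the hallmark
        # of slow PC convergence for responses that are non-smooth in random space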

  5. Understanding spatial organizations of chromosomes via statistical analysis of Hi-C data

    PubMed Central

    Hu, Ming; Deng, Ke; Qin, Zhaohui; Liu, Jun S.

    2015-01-01

    Understanding how chromosomes fold provides insights into transcription regulation and, hence, the functional state of the cell. Using next generation sequencing technology, the recently developed Hi-C approach enables a global view of spatial chromatin organization in the nucleus, which substantially expands our knowledge about genome organization and function. However, due to multiple layers of bias, noise and uncertainty buried in the protocol of Hi-C experiments, analyzing and interpreting Hi-C data poses great challenges and requires novel statistical methods to be developed. This article provides an overview of recent Hi-C studies and their impacts on biomedical research, describes major challenges in the statistical analysis of Hi-C data, and discusses some perspectives for future research. PMID:26124977

  6. Approximate spatial reasoning

    NASA Technical Reports Server (NTRS)

    Dutta, Soumitra

    1988-01-01

    A model for approximate spatial reasoning using fuzzy logic to represent the uncertainty in the environment is presented. Algorithms are developed which can be used to reason about spatial information expressed in the form of approximate linguistic descriptions similar to the kind of spatial information processed by humans. Particular attention is given to static spatial reasoning.

  7. Probabilistic Flood Maps to support decision-making: Mapping the Value of Information

    NASA Astrophysics Data System (ADS)

    Alfonso, L.; Mukolwe, M. M.; Di Baldassarre, G.

    2016-02-01

    Floods are one of the most frequent and disruptive natural hazards that affect man. Annually, significant flood damage is documented worldwide. Flood mapping is a common pre-impact flood hazard mitigation measure, for which advanced methods and tools (such as flood inundation models) are used to estimate potential flood extent maps that are used in spatial planning. However, these tools are affected, largely to an unknown degree, by both epistemic and aleatory uncertainty. Over the past few years, advances in uncertainty analysis with respect to flood inundation modeling have shown that it is appropriate to adopt Probabilistic Flood Maps (PFM) to account for uncertainty. However, the following question arises: how can probabilistic flood hazard information be incorporated into spatial planning? A consistent framework to incorporate PFMs into decision-making is therefore required. In this paper, a novel methodology based on Decision-Making under Uncertainty theories, in particular Value of Information (VOI), is proposed. Specifically, the methodology entails the use of a PFM to generate a VOI map, which highlights floodplain locations where additional information is valuable with respect to available floodplain management actions and their potential consequences. The methodology is illustrated with a simplified example and also applied to a real case study in the South of France, where a VOI map is analyzed on the basis of historical land use change decisions over a period of 26 years. Results show that uncertain flood hazard information encapsulated in PFMs can aid decision-making in floodplain planning.

  8. Spatial Intensity Duration Frequency Relationships Using Hierarchical Bayesian Analysis for Urban Areas

    NASA Astrophysics Data System (ADS)

    Rupa, Chandra; Mujumdar, Pradeep

    2016-04-01

    In urban areas, quantification of extreme precipitation is important in the design of storm water drains and other infrastructure. Intensity Duration Frequency (IDF) relationships are generally used to obtain the design return level for a given duration and return period. Due to the lack of extreme precipitation data for a sufficiently large number of years, estimating the probability of extreme events is difficult. Typically, single-station data are used to obtain the design return levels for various durations and return periods, which are then used in the design of urban infrastructure for the entire city. In an urban setting, the spatial variation of precipitation can be high; precipitation amounts and patterns often vary within short distances of less than 5 km. Therefore it is crucial to study the uncertainties in the spatial variation of return levels for various durations. In this work, extreme precipitation is modeled spatially using Bayesian hierarchical analysis and the spatial variation of return levels is studied. The analysis is carried out with the Block Maxima approach for defining extreme precipitation, using the Generalized Extreme Value (GEV) distribution, for Bangalore city, Karnataka state, India. Daily data for nineteen stations in and around Bangalore city are considered in the study. The analysis is carried out for summer maxima (March-May), monsoon maxima (June-September) and annual maxima rainfall. In the hierarchical analysis, the statistical model is specified in three layers. The data layer models the block maxima, pooling the extreme precipitation from all the stations. In the process layer, the latent spatial process characterized by geographical and climatological covariates (latitude-longitude, elevation, mean temperature, etc.) which drives the extreme precipitation is modeled, and in the prior level, the prior distributions that govern the latent process are modeled. A Markov Chain Monte Carlo (MCMC) algorithm (Metropolis-Hastings algorithm within a Gibbs sampler) is used to obtain samples of parameters from the posterior distribution. The spatial maps of return levels for specified return periods, along with the associated uncertainties, are obtained for the summer, monsoon and annual maxima rainfall. Considering various covariates, the best fit model is selected using the Deviance Information Criterion. It is observed that the geographical covariates (latitude and longitude) outweigh the climatological covariates for the monsoon maxima rainfall. The best covariates for summer maxima and annual maxima rainfall are mean summer precipitation and mean monsoon precipitation respectively, including elevation in both cases. The scale invariance theory, which states that statistical properties of a process observed at various scales are governed by the same relationship, is used to disaggregate the daily rainfall to hourly scales. The spatial maps of the scale are obtained for the study area. The spatial maps of IDF relationships thus generated are useful in storm water design, adequacy analysis and identifying areas vulnerable to flooding.
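
    A minimal Python sketch of the block-maxima/GEV building block behind such IDF estimation, using scipy's genextreme (whose shape parameter c equals minus the conventional GEV shape xi) on fabricated annual maxima, with a parametric bootstrap for return-level uncertainty; the study itself uses a hierarchical Bayesian spatial model rather than this single-station fit.

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(21)
        # hypothetical annual maximum daily rainfall (mm) for one gauge
        annmax = genextreme.rvs(-0.1, loc=60.0, scale=15.0, size=50,
                                random_state=rng)

        c, loc, scale = genextreme.fit(annmax)          # ML fit of the GEV
        for T in (2, 10, 50, 100):
            z = genextreme.isf(1.0 / T, c, loc, scale)  # T-year return level
            print(f"{T:>3}-yr return level: {z:.0f} mm")

        # parametric bootstrap for the 50-yr return level uncertainty
        boot = []
        for _ in range(200):
            resample = genextreme.rvs(c, loc=loc, scale=scale,
                                      size=annmax.size, random_state=rng)
            cb, lb, sb = genextreme.fit(resample)
            boot.append(genextreme.isf(1.0 / 50.0, cb, lb, sb))
        print("50-yr level 90% CI:", np.percentile(boot, [5, 95]).round(0))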

  9. Novel patch modelling method for efficient simulation and prediction uncertainty analysis of multi-scale groundwater flow and transport processes

    NASA Astrophysics Data System (ADS)

    Sreekanth, J.; Moore, Catherine

    2018-04-01

    The application of global sensitivity and uncertainty analysis techniques to groundwater models of deep sedimentary basins is typically challenged by large computational burdens combined with associated numerical stability issues. The highly parameterized approaches required for exploring the predictive uncertainty associated with the heterogeneous hydraulic characteristics of multiple aquifers and aquitards in these sedimentary basins exacerbate these issues. A novel Patch Modelling Methodology is proposed for improving the computational feasibility of stochastic modelling analysis of large-scale and complex groundwater models. The method incorporates a nested groundwater modelling framework that enables efficient simulation of groundwater flow and transport across multiple spatial and temporal scales. The method also allows different processes to be simulated within different model scales. Existing nested model methodologies are extended by employing 'joining predictions' for extrapolating prediction-salient information from one model scale to the next. This establishes a feedback mechanism supporting the transfer of information from child models to parent models, as well as from parent models to child models, in a computationally efficient manner. This feedback mechanism is simple and flexible, and ensures that the salient small-scale features influencing larger-scale predictions are transferred back to the larger scale without requiring the live coupling of models. The method allows multiple groundwater flow and transport processes to be modelled using separate groundwater models, built for the appropriate spatial and temporal scales, within a stochastic framework, while also removing the computational burden associated with live model coupling. The utility of the method is demonstrated by application to an actual large-scale aquifer injection scheme in Australia.

  10. Preferential sampling and Bayesian geostatistics: Statistical modeling and examples.

    PubMed

    Cecconi, Lorenzo; Grisotto, Laura; Catelan, Dolores; Lagazio, Corrado; Berrocal, Veronica; Biggeri, Annibale

    2016-08-01

    Preferential sampling refers to any situation in which the spatial process and the sampling locations are not stochastically independent. In this paper, we present two examples of geostatistical analysis in which the usual assumption of stochastic independence between the point process and the measurement process is violated. To account for preferential sampling, we specify a flexible and general Bayesian geostatistical model that includes a shared spatial random component. We apply the proposed model to two different case studies that allow us to highlight three different modeling and inferential aspects of geostatistical modeling under preferential sampling: (1) continuous or finite spatial sampling frame; (2) underlying causal model and relevant covariates; and (3) inferential goals related to mean prediction surface or prediction uncertainty. © The Author(s) 2016.

  11. Impact of Satellite Viewing-Swath Width on Global and Regional Aerosol Optical Thickness Statistics and Trends

    NASA Technical Reports Server (NTRS)

    Colarco, P. R.; Kahn, R. A.; Remer, L. A.; Levy, R. C.

    2014-01-01

    We use the Moderate Resolution Imaging Spectroradiometer (MODIS) satellite aerosol optical thickness (AOT) product to assess the impact of reduced swath width on global and regional AOT statistics and trends. Along-track and across-track sampling strategies are employed, in which the full MODIS data set is sub-sampled with various narrow-swath (approximately 400-800 km) and single pixel width (approximately 10 km) configurations. Although view-angle artifacts in the MODIS AOT retrieval confound direct comparisons between averages derived from different sub-samples, careful analysis shows that with many portions of the Earth essentially unobserved, spatial sampling introduces uncertainty in the derived seasonal-regional mean AOT. These AOT spatial sampling artifacts comprise up to 60% of the full-swath AOT value under moderate aerosol loading, and can be as large as 0.1 in some regions under high aerosol loading. Compared to full-swath observations, narrower swath and single pixel width sampling exhibits a reduced ability to detect AOT trends with statistical significance. On the other hand, estimates of the global, annual mean AOT do not vary significantly from the full-swath values as spatial sampling is reduced. Aggregation of the MODIS data at coarse grid scales (10 deg) shows consistency in the aerosol trends across sampling strategies, with increased statistical confidence, but quantitative errors in the derived trends are found even for the full-swath data when compared to high spatial resolution (0.5 deg) aggregations. Using results of a model-derived aerosol reanalysis, we find consistency in our conclusions about a seasonal-regional spatial sampling artifact in AOT. Furthermore, the model shows that reduced spatial sampling can amount to uncertainty in computed shortwave top-of-atmosphere aerosol radiative forcing of 2-3 W m^-2. These artifact estimates are lower bounds, as other sampling strategies not considered here could perform less well. These results suggest that future aerosol satellite missions having significantly less than full-swath viewing are unlikely to sample the true AOT distribution well enough to obtain the statistics needed to reduce uncertainty in aerosol direct forcing of climate.
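
    A hedged Python sketch of the sub-sampling logic (no real MODIS data or orbital geometry): a fabricated season of daily gridded AOT is reduced to a single-pixel-width 'curtain', and the resulting seasonal mean and its sampling error are compared with the full field.

        import numpy as np

        rng = np.random.default_rng(9)
        # hypothetical daily 1-deg AOT fields for one 90-day season (made up)
        aot = np.clip(rng.gamma(2.0, 0.08, size=(90, 180, 360)), 0.0, 3.0)

        full_mean = aot.mean()
        # single-pixel-width curtain: keep one longitude column per day,
        # sweeping across the globe day by day
        cols = (np.arange(90) * 7) % 360
        curtain = np.array([aot[d, :, cols[d]].mean() for d in range(90)])
        print("full-swath seasonal mean:", round(full_mean, 3))
        print("curtain-sampled mean:    ", round(curtain.mean(), 3),
              "+/-", round(curtain.std(ddof=1) / np.sqrt(90), 3))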

  12. Current and future pluvial flood hazard analysis for the city of Antwerp

    NASA Astrophysics Data System (ADS)

    Willems, Patrick; Tabari, Hossein; De Niel, Jan; Van Uytven, Els; Lambrechts, Griet; Wellens, Geert

    2016-04-01

    For the city of Antwerp in Belgium, higher rainfall extremes were observed in comparison with surrounding areas. The differences were found statistically significant for some areas and may be the result of the heat island effect in combination with higher concentrations of aerosols. A network of 19 rain gauges with varying record lengths (the longest since the 1960s) and 10 years of continuous radar data were combined to map the spatial variability of rainfall extremes over the city at durations from 15 minutes to 1 day, together with the uncertainty. The improved spatial rainfall information was used as input in the sewer system model of the city to analyze the frequency of urban pluvial floods. Comparison with historical flood observations from various sources (fire brigade and media) confirmed that the improved spatial rainfall information also improved the simulated magnitude and frequency of sewer floods. Next to these improved urban flood impact results for recent and current climatological conditions, the new insights on the local rainfall microclimate were also helpful to enhance future projections of rainfall extremes and pluvial floods in the city. This was done by improved statistical downscaling of all available CMIP5 global climate model runs (160 runs) for the 4 RCP scenarios, as well as the available EURO-CORDEX regional climate model runs. Two types of statistical downscaling methods were applied for that purpose (a weather typing based method and a quantile perturbation approach), making use of the microclimate results and their dependency on specific weather types. Changes in extreme rainfall intensities were analyzed and mapped as a function of the RCP scenario, together with the uncertainty, decomposed into the uncertainties related to the climate models, the climate model initialization or limited length of the 30-year time series (natural climate variability), and the statistical downscaling (albeit limited to two types of methods). These were finally transferred into future pluvial flash flood hazard maps for the city, together with the uncertainties, and are considered as a basis for spatial planning and adaptation.
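
    A hedged Python sketch of the quantile perturbation idea named above: per-quantile change factors derived from a climate model's control and future runs rescale an observed series; all three series are synthetic gamma draws, not the Antwerp observations.

        import numpy as np

        rng = np.random.default_rng(17)
        obs = rng.gamma(0.6, 8.0, 3000)          # observed daily rainfall (mm)
        ctl = rng.gamma(0.6, 8.0, 3000)          # climate model, control period
        fut = rng.gamma(0.6, 9.5, 3000)          # climate model, future period

        q = np.linspace(0.01, 0.99, 99)
        factor = np.quantile(fut, q) / np.maximum(np.quantile(ctl, q), 1e-6)

        # perturb each observed value by the change factor of its own quantile
        ranks = np.searchsorted(np.quantile(obs, q), obs).clip(0, len(q) - 1)
        obs_future = obs * factor[ranks]
        print("obs 99th percentile:      ", np.quantile(obs, 0.99).round(1))
        print("perturbed 99th percentile:", np.quantile(obs_future, 0.99).round(1))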

  13. Endogenous spatial attention: evidence for intact functioning in adults with autism

    PubMed Central

    Grubb, Michael A.; Behrmann, Marlene; Egan, Ryan; Minshew, Nancy J.; Carrasco, Marisa; Heeger, David J.

    2012-01-01

    Lay Abstract Attention allows us to selectively process the vast amount of information with which we are confronted. Focusing on a certain location of the visual scene (visual spatial attention) enables the prioritization of some aspects of information while ignoring others. Rapid manipulation of the attention field (i.e., the location and spread of visual spatial attention) is a critical aspect of human cognition, and previous research on spatial attention in individuals with autism spectrum disorders (ASD) has produced inconsistent results. In a series of three experiments, we evaluated claims in the literature that individuals with ASD exhibit a deficit in voluntarily controlling the deployment and size of the spatial attention field. We measured how well participants perform a visual discrimination task (accuracy) and how quickly they do so (reaction time), with and without spatial uncertainty (i.e., the lack of predictability concerning the spatial position of the upcoming stimulus). We found that high-functioning adults with autism exhibited slower reaction times overall with spatial uncertainty, but the effects of attention on performance accuracies and reaction times were indistinguishable between individuals with autism and typically developing individuals, in all three experiments. These results provide evidence of intact endogenous spatial attention function in high-functioning adults with ASD, suggesting that atypical endogenous spatial attention cannot be a latent characteristic of autism in general. Scientific Abstract Rapid manipulation of the attention field (i.e., the location and spread of visual spatial attention) is a critical aspect of human cognition, and previous research on spatial attention in individuals with autism spectrum disorders (ASD) has produced inconsistent results. In a series of three psychophysical experiments, we evaluated claims in the literature that individuals with ASD exhibit a deficit in voluntarily controlling the deployment and size of the spatial attention field. We measured the spatial distribution of performance accuracies and reaction times to quantify the sizes and locations of the attention field, with and without spatial uncertainty (i.e., the lack of predictability concerning the spatial position of the upcoming stimulus). We found that high-functioning adults with autism exhibited slower reaction times overall with spatial uncertainty, but the effects of attention on performance accuracies and reaction times were indistinguishable between individuals with autism and typically developing individuals, in all three experiments. These results provide evidence of intact endogenous spatial attention function in high-functioning adults with ASD, suggesting that atypical endogenous attention cannot be a latent characteristic of autism in general. PMID:23427075

  14. Numerical uncertainty in computational engineering and physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemez, Francois M

    2009-01-01

    Obtaining a solution that approximates ordinary or partial differential equations on a computational mesh or grid does not necessarily mean that the solution is accurate or even 'correct'. Unfortunately, assessing the quality of discrete solutions by questioning the role played by spatial and temporal discretizations generally comes as a distant third to test-analysis comparison and model calibration. This publication is contributed to raise awareness of the fact that discrete solutions introduce numerical uncertainty. This uncertainty may, in some cases, overwhelm in complexity and magnitude other sources of uncertainty that include experimental variability, parametric uncertainty and modeling assumptions. The concepts of consistency, convergence and truncation error are overviewed to explain the articulation between the exact solution of continuous equations, the solution of modified equations and discrete solutions computed by a code. The current state-of-the-practice of code and solution verification activities is discussed. An example in the discipline of hydro-dynamics illustrates the significant effect that meshing can have on the quality of code predictions. A simple method is proposed to derive bounds of solution uncertainty in cases where the exact solution of the continuous equations, or its modified equations, is unknown. It is argued that numerical uncertainty originating from mesh discretization should always be quantified and accounted for in the overall uncertainty 'budget' that supports decision-making for applications in computational physics and engineering.
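
    A hedged Python sketch of one standard recipe for quantifying mesh-discretization uncertainty: the observed order of convergence, Richardson extrapolation, and Roache's grid convergence index computed from solutions on three systematically refined grids; the three solution values are invented placeholders.

        from math import log

        r = 2.0                                # grid refinement ratio
        f1, f2, f3 = 0.9713, 0.9652, 0.9421    # fine, medium, coarse solutions

        p = log((f3 - f2) / (f2 - f1)) / log(r)        # observed convergence order
        f_exact = f1 + (f1 - f2) / (r**p - 1.0)        # Richardson extrapolation
        gci_fine = 1.25 * abs((f2 - f1) / f1) / (r**p - 1.0)  # safety factor 1.25

        print(f"observed order p      = {p:.2f}")
        print(f"extrapolated solution = {f_exact:.4f}")
        print(f"fine-grid GCI         = {100 * gci_fine:.2f}%")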

  15. Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems

    NASA Astrophysics Data System (ADS)

    Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros

    2015-04-01

    In hydrogeological applications involving flow and transport in heterogeneous porous media, the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling and are subsequently used as inputs to physically-based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields; hence, it can become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]. Here, the term representative implies realizations spanning efficiently the range of possible conductivity values corresponding to the lognormal random field. In this work we investigate the efficiency of alternative methods to classical LH sampling within the context of simulation of flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploring the geometrical properties of the multivariate Gaussian distribution function. In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the surface of a M-dimensional, unit radius hyper-sphere, (ii) relocating the N points on a representative set of N hyper-spheres of different radii, and (iii) transforming the coordinates of those points to lie on N different hyper-ellipsoids spanning the multivariate Gaussian distribution. The above method is applied in a dimensionality reduction context by defining flow-controlling points over which representative sampling of hydraulic conductivity is performed, thus also accounting for the sensitivity of the flow and transport model to the input hydraulic conductivity field. The performance of the various stratified sampling methods, LH, SL, and ME, is compared to that of SR sampling in terms of reproduction of ensemble statistics of hydraulic conductivity and solute concentration for different sample sizes N (numbers of realizations). The results indicate that ME sampling constitutes an equally if not more efficient simulation method than LH and SL sampling, as it can reproduce to a similar extent statistics of the conductivity and concentration fields, yet with smaller sampling variability than SR sampling. References [1] Gutjahr A.L. and Bras R.L. Spatial variability in subsurface flow and transport: A review. Reliability Engineering & System Safety, 42, 293-316, (1993). [2] Helton J.C. and Davis F.J. Latin hypercube sampling and the propagation of uncertainty in analyses of complex systems. Reliability Engineering & System Safety, 81, 23-69, (2003). [3] Switzer P. Multiple simulation of spatial fields. In: Heuvelink G, Lemmens M (eds) Proceedings of the 4th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, Coronet Books Inc., pp 629-635 (2000).
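
    A hedged Python sketch contrasting simple random and Latin hypercube sampling of a lognormal quantity: repeated small ensembles expose the reduced sampling variability of the ensemble mean under LH sampling; scipy's qmc module and a plain lognormal draw stand in for a geostatistical simulator.

        import numpy as np
        from scipy.stats import norm, qmc

        rng = np.random.default_rng(13)
        n_real, n_rep = 20, 500     # realizations per ensemble, repeated ensembles

        def ensemble_mean(sampler):
            u = sampler(n_real)                      # n_real uniforms in (0,1)
            k = np.exp(norm.ppf(u))                  # lognormal conductivity draws
            return k.mean()

        sr = [ensemble_mean(lambda n: rng.random(n)) for _ in range(n_rep)]
        lh = [ensemble_mean(lambda n: qmc.LatinHypercube(d=1).random(n)[:, 0])
              for _ in range(n_rep)]
        print("sampling variability of the ensemble mean:")
        print(f"  simple random  : {np.std(sr):.4f}")
        print(f"  Latin hypercube: {np.std(lh):.4f}   (smaller for the same N)")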

  16. Validation of sea ice models using an uncertainty-based distance metric for multiple model variables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urrego-Blanco, Jorge R.; Hunke, Elizabeth C.; Urban, Nathan M.

    Here, we implement a variance-based distance metric (Dn) to objectively assess the skill of sea ice models when multiple output variables or uncertainties in both model predictions and observations need to be considered. The metric compares observation and model data pairs on common spatial and temporal grids, improving upon highly aggregated metrics (e.g., total sea ice extent or volume) by capturing the spatial character of model skill. The Dn metric is a gamma-distributed statistic that is more general than the χ2 statistic commonly used to assess model fit, which requires the assumption that the model is unbiased and can only incorporate observational error in the analysis. The Dn statistic does not assume that the model is unbiased, and allows the incorporation of multiple observational data sets for the same variable and simultaneously for different variables, along with different types of variances that can characterize uncertainties in both observations and the model. This approach represents a step toward establishing a systematic framework for probabilistic validation of sea ice models. The methodology is also useful for model tuning, by using the Dn metric as a cost function and incorporating model parametric uncertainty as part of a scheme to optimize model functionality. We apply this approach to evaluate different configurations of the standalone Los Alamos sea ice model (CICE), encompassing the parametric uncertainty in the model, and to find new sets of model configurations that produce better agreement than previous configurations between model and observational estimates of sea ice concentration and thickness.
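
    A hedged Python sketch of a variance-based model-observation distance in the spirit described here; the form below (squared misfit scaled by the sum of model and observation variances, averaged over a common grid) is only an illustrative analogue of the paper's Dn, applied to fabricated sea ice concentration fields.

        import numpy as np

        def dn_like(model, obs, var_model, var_obs):
            # chi-square-like misfit that admits variance in both the model
            # and the observations; an illustrative form, not the paper's
            # exact Dn definition
            return np.nanmean((model - obs) ** 2 / (var_model + var_obs))

        rng = np.random.default_rng(2)
        obs = rng.uniform(0.0, 1.0, size=(40, 40))     # e.g. sea ice concentration
        model = np.clip(obs + rng.normal(0.0, 0.08, obs.shape) + 0.03, 0.0, 1.0)
        score = dn_like(model, obs, var_model=0.05**2, var_obs=0.06**2)
        print("Dn-like score (smaller is better):", round(float(score), 2))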

  17. Evaluating the combined effects of source zone mass release rates and aquifer heterogeneity on solute discharge uncertainty

    NASA Astrophysics Data System (ADS)

    de Barros, Felipe P. J.

    2018-07-01

    Quantifying the uncertainty in solute mass discharge at an environmentally sensitive location is key to assessing the risks due to groundwater contamination. Solute mass fluxes are strongly affected by the spatial variability of hydrogeological properties as well as by release conditions at the source zone. This paper provides a methodological framework to investigate how the interaction between the ubiquitous heterogeneity of the hydraulic conductivity and the mass release rate at the source zone affects the uncertainty of mass discharge. Through the use of perturbation theory, we derive analytical and semi-analytical expressions for the statistics of the solute mass discharge at a control plane in a three-dimensional aquifer while accounting for the solute mass release rates at the source. The derived solutions are limited to aquifers displaying low-to-mild heterogeneity. Results illustrate the significance of the source zone mass release rate in controlling the mass discharge uncertainty. The relative importance of the mass release rate for the mean solute discharge depends on the distance between the source and the control plane. On the other hand, we find that the solute release rate at the source zone has a strong impact on the variance of the mass discharge. Within a risk context, we also compute the peak mean discharge as a function of the parameters governing the spatial heterogeneity of the hydraulic conductivity field and the mass release rates at the source zone. The proposed physically-based framework is application-oriented, computationally efficient and capable of propagating uncertainty from different parameters onto risk metrics. Furthermore, it can be used for preliminary screening purposes to guide site managers in performing system-level sensitivity analysis and better allocating resources.

  18. The impact of lake and reservoir parameterization on global streamflow simulation.

    PubMed

    Zajac, Zuzanna; Revilla-Romero, Beatriz; Salamon, Peter; Burek, Peter; Hirpa, Feyera A; Beck, Hylke

    2017-05-01

    Lakes and reservoirs affect the timing and magnitude of streamflow, and are therefore essential hydrological model components, especially in the context of global flood forecasting. However, the parameterization of lake and reservoir routines on a global scale is subject to considerable uncertainty due to lack of information on lake hydrographic characteristics and reservoir operating rules. In this study we estimated the effect of lakes and reservoirs on global daily streamflow simulations of a spatially-distributed LISFLOOD hydrological model. We applied state-of-the-art global sensitivity and uncertainty analyses for selected catchments to examine the effect of uncertain lake and reservoir parameterization on model performance. Streamflow observations from 390 catchments around the globe and multiple performance measures were used to assess model performance. Results indicate a considerable geographical variability in the lake and reservoir effects on the streamflow simulation. Nash-Sutcliffe Efficiency (NSE) and Kling-Gupta Efficiency (KGE) metrics improved for 65% and 38% of catchments respectively, with median skill score values of 0.16 and 0.2 while scores deteriorated for 28% and 52% of the catchments, with median values -0.09 and -0.16, respectively. The effect of reservoirs on extreme high flows was substantial and widespread in the global domain, while the effect of lakes was spatially limited to a few catchments. As indicated by global sensitivity analysis, parameter uncertainty substantially affected uncertainty of model performance. Reservoir parameters often contributed to this uncertainty, although the effect varied widely among catchments. The effect of reservoir parameters on model performance diminished with distance downstream of reservoirs in favor of other parameters, notably groundwater-related parameters and channel Manning's roughness coefficient. This study underscores the importance of accounting for lakes and, especially, reservoirs and using appropriate parameterization in large-scale hydrological simulations.
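
    For reference, the two skill metrics quoted above have standard closed-form definitions; a minimal Python/NumPy sketch:

      import numpy as np

      def nse(sim, obs):
          # Nash-Sutcliffe Efficiency: 1 = perfect, 0 = no better than the mean of obs
          sim, obs = np.asarray(sim, float), np.asarray(obs, float)
          return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def kge(sim, obs):
          # Kling-Gupta Efficiency: combines correlation r, variability ratio alpha,
          # and bias ratio beta; 1 = perfect agreement
          sim, obs = np.asarray(sim, float), np.asarray(obs, float)
          r = np.corrcoef(sim, obs)[0, 1]
          alpha = sim.std() / obs.std()
          beta = sim.mean() / obs.mean()
          return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)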

  19. Validation of sea ice models using an uncertainty-based distance metric for multiple model variables: NEW METRIC FOR SEA ICE MODEL VALIDATION

    DOE PAGES

    Urrego-Blanco, Jorge R.; Hunke, Elizabeth C.; Urban, Nathan M.; ...

    2017-04-01

    Here, we implement a variance-based distance metric (D_n) to objectively assess the skill of sea ice models when multiple output variables or uncertainties in both model predictions and observations need to be considered. The metric compares observation and model data pairs on common spatial and temporal grids, improving upon highly aggregated metrics (e.g., total sea ice extent or volume) by capturing the spatial character of model skill. The D_n metric is a gamma-distributed statistic that is more general than the χ² statistic commonly used to assess model fit, which requires the assumption that the model is unbiased and can only incorporate observational error in the analysis. The D_n statistic does not assume that the model is unbiased, and allows the incorporation of multiple observational data sets for the same variable, and simultaneously for different variables, along with different types of variances that can characterize uncertainties in both observations and the model. This approach represents a step toward establishing a systematic framework for probabilistic validation of sea ice models. The methodology is also useful for model tuning, using the D_n metric as a cost function and incorporating model parametric uncertainty as part of a scheme to optimize model functionality. We apply this approach to evaluate different configurations of the standalone Los Alamos sea ice model (CICE) encompassing the parametric uncertainty in the model, and to find new sets of model configurations that produce better agreement than previous configurations between model and observational estimates of sea ice concentration and thickness.
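
    A minimal Python sketch in the spirit of such a variance-normalized distance, for intuition only; the per-cell weighting and the mean aggregation are illustrative assumptions, not the record's exact formulation:

      import numpy as np

      def dn_metric(model, obs, var_model, var_obs):
          """Variance-normalized distance between model and observation fields on a
          common grid: each cell's squared misfit is weighted by the combined
          variance of model and observational uncertainty at that cell."""
          model, obs = np.asarray(model, float), np.asarray(obs, float)
          w = 1.0 / (np.asarray(var_model, float) + np.asarray(var_obs, float))
          return np.mean(w * (model - obs) ** 2)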

  20. Detectability of change in winter precipitation within mountain landscapes: Spatial patterns and uncertainty

    NASA Astrophysics Data System (ADS)

    Silverman, N. L.; Maneta, M. P.

    2016-06-01

    Detecting long-term change in seasonal precipitation using ground observations is dependent on the representativity of the point measurement to the surrounding landscape. In mountainous regions, representativity can be poor and lead to large uncertainties in precipitation estimates at high elevations or in areas where observations are sparse. If the uncertainty in the estimate is large compared to the long-term shifts in precipitation, then the change will likely go undetected. In this analysis, we examine the minimum detectable change across mountainous terrain in western Montana, USA. We ask the question: What is the minimum amount of change that is necessary to be detected using our best estimates of precipitation in complex terrain? We evaluate the spatial uncertainty in the precipitation estimates by conditioning historic regional climate model simulations to ground observations using Bayesian inference. By using this uncertainty as a null hypothesis, we test for detectability across the study region. To provide context for the detectability calculations, we look at a range of future scenarios from the Coupled Model Intercomparison Project 5 (CMIP5) multimodel ensemble downscaled to 4 km resolution using the MACAv2-METDATA data set. When using the ensemble averages we find that approximately 65% of the significant increases in winter precipitation go undetected at midelevations. At high elevation, approximately 75% of significant increases in winter precipitation are undetectable. Areas where change can be detected are largely controlled by topographic features. Elevation and aspect are key characteristics that determine whether or not changes in winter precipitation can be detected. Furthermore, we find that undetected increases in winter precipitation at high elevation will likely remain as snow under climate change scenarios. Therefore, there is potential for these areas to offset snowpack loss at lower elevations and confound the effects of climate change on water resources.
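
    A simplified Python sketch of the detectability logic described above: treat the spread of the precipitation estimate as the null distribution and call a projected change detectable only if it falls outside that spread. The Gaussian null and all numbers are hypothetical stand-ins for the Bayesian conditioning used in the record.

      import numpy as np

      def detectability_bounds(uncertainty_samples, alpha=0.05):
          # Two-sided threshold: changes inside these bounds cannot be separated
          # from estimation uncertainty at the chosen significance level.
          return np.quantile(uncertainty_samples, [alpha / 2, 1 - alpha / 2])

      rng = np.random.default_rng(0)
      null = rng.normal(0.0, 25.0, 10000)    # mm of winter precipitation, hypothetical
      lo, hi = detectability_bounds(null)
      projected_change = 30.0                # mm, e.g. from a downscaled CMIP5 scenario
      print("detectable" if (projected_change < lo or projected_change > hi) else "undetected")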

  1. Estimation and Uncertainty Analysis of Impacts of Future Heat Waves on Mortality in the Eastern United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Jianyong; Zhou, Ying; Gao, Yang

    Background: It is anticipated that climate change will influence heat-related mortality in the future. However, the estimation of excess mortality attributable to future heat waves is subject to large uncertainties, which have not been examined under the latest greenhouse gas emission scenarios. Objectives: We estimated the future heat wave impact on mortality in the eastern United States (~ 1,700 counties) under two Representative Concentration Pathways (RCPs) and analyzed the sources of uncertainties. Methods: Using dynamically downscaled hourly temperature projections in 2057-2059, we calculated heat wave days and episodes based on four heat wave metrics, and estimated the excess mortality attributable to them. The sources of uncertainty in estimated excess mortality were apportioned using a variance-decomposition method. Results: In the eastern U.S., the excess mortality attributable to heat waves could range from 200 to 7,807 persons/year, with a mean of 2,379 persons/year in 2057-2059. The projected average excess mortality under the RCP 4.5 and 8.5 scenarios was 1,403 and 3,556 persons/year, respectively. Excess mortality would be relatively high in the southern and eastern coastal areas. The major sources of uncertainty in the estimates are the relative risk of heat wave mortality, the RCP scenarios, and the heat wave definitions. Conclusions: The estimated mortality risks from future heat waves are likely an order of magnitude higher than current levels and could lead to thousands of deaths each year under the RCP 8.5 scenario. The substantial spatial variability in estimated county-level heat mortality suggests that effective mitigation and adaptation measures should be developed based on spatially resolved data.
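
    A toy Python sketch of a first-order variance decomposition over a factorial ensemble, in the spirit of the apportionment described above. Only the two scenario means come from the record; the factor levels and multipliers are invented.

      import itertools
      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical factorial ensemble of excess-mortality estimates indexed by
      # RCP scenario, heat wave definition, and relative-risk draw.
      rcp_levels = np.array([1403.0, 3556.0])      # scenario means from the record
      hw_levels = 0.8 + 0.1 * np.arange(4)         # four heat wave definitions (assumed)
      rr_levels = rng.lognormal(0.0, 0.4, 50)      # fifty relative-risk draws (assumed)

      labels, estimates = [], []
      for i, j, k in itertools.product(range(2), range(4), range(50)):
          labels.append((i, j, k))
          estimates.append(rcp_levels[i] * hw_levels[j] * rr_levels[k])
      labels, estimates = np.array(labels), np.array(estimates)

      # First-order (main-effect) variance share per factor: variance of the
      # factor-level means divided by the total ensemble variance.
      for col, name in enumerate(["RCP scenario", "heat wave definition", "relative risk"]):
          means = [estimates[labels[:, col] == lev].mean() for lev in np.unique(labels[:, col])]
          print(f"{name}: {np.var(means) / estimates.var():.1%} of variance")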

  2. Latin hypercube approach to estimate uncertainty in ground water vulnerability

    USGS Publications Warehouse

    Gurdak, J.J.; McCray, J.E.; Thyne, G.; Qi, S.L.

    2007-01-01

    A methodology is proposed to quantify prediction uncertainty associated with ground water vulnerability models that were developed through an approach that coupled multivariate logistic regression with a geographic information system (GIS). This method uses Latin hypercube sampling (LHS) to illustrate the propagation of input error and estimate uncertainty associated with the logistic regression predictions of ground water vulnerability. Central to the proposed method is the assumption that prediction uncertainty in ground water vulnerability models is a function of input error propagation from uncertainty in the estimated logistic regression model coefficients (model error) and the values of explanatory variables represented in the GIS (data error). Input probability distributions that represent both model and data error sources of uncertainty were simultaneously sampled using a Latin hypercube approach with logistic regression calculations of probability of elevated nonpoint source contaminants in ground water. The resulting probability distribution represents the prediction intervals and associated uncertainty of the ground water vulnerability predictions. The method is illustrated through a ground water vulnerability assessment of the High Plains regional aquifer. Results of the LHS simulations reveal significant prediction uncertainties that vary spatially across the regional aquifer. Additionally, the proposed method enables a spatial deconstruction of the prediction uncertainty that can lead to improved prediction of ground water vulnerability. © 2007 National Ground Water Association.
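
    A minimal Python sketch of the LHS idea using scipy.stats.qmc: jointly sample uncertain regression coefficients (model error) and an explanatory variable (data error), then propagate through the logistic prediction. All coefficient values and bounds are hypothetical.

      import numpy as np
      from scipy.stats import qmc

      sampler = qmc.LatinHypercube(d=3, seed=7)
      u = sampler.random(n=2000)
      # columns: intercept b0, slope b1, explanatory variable x (e.g., N loading);
      # the bounds below are illustrative assumptions only
      lo, hi = [-2.5, 0.1, 0.0], [-1.5, 0.5, 10.0]
      b0, b1, x = qmc.scale(u, lo, hi).T

      p = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))   # logistic prediction per LHS sample
      print(np.quantile(p, [0.05, 0.5, 0.95]))   # prediction interval of vulnerability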

  3. Understanding high magnitude flood risk: evidence from the past

    NASA Astrophysics Data System (ADS)

    MacDonald, N.

    2009-04-01

    The average length of gauged river flow records in the UK is ~25 years, which presents a problem in determining flood risk for high-magnitude flood events. Severe floods have been recorded in many UK catchments during the past 10 years, increasing the uncertainty in conventional flood risk estimates based on river flow records. Current uncertainty in flood risk has implications for society (insurance costs), individuals (personal vulnerability) and water resource managers (flood/drought risk). An alternative approach is required which can improve current understanding of the flood frequency/magnitude relationship. Historical documentary accounts are now recognised as a valuable resource when considering the flood frequency/magnitude relationship, but little consideration has been given to the temporal and spatial distribution of these records. Building on previous research based on British rivers (urban centre): Ouse (York), Trent (Nottingham), Tay (Perth), Severn (Shrewsbury), Dee (Chester), Great Ouse (Cambridge), Sussex Ouse (Lewes), Thames (Oxford), Tweed (Kelso) and Tyne (Hexham), this work considers the spatial and temporal distribution of historical flooding. The selected sites provide a network covering many of the largest river catchments in Britain, based on urban centres with long detailed documentary flood histories. The chronologies offer an opportunity to assess long-term patterns of flooding, indirectly determining periods of climatic variability and potentially increased geomorphic activity. This research represents the first coherent large scale analysis undertaken of historical multi-catchment flood chronologies, providing an unparalleled network of sites, permitting analysis of the spatial and temporal distribution of historical flood patterns on a national scale.

  4. Advances in Parameter and Uncertainty Quantification Using Bayesian Hierarchical Techniques with a Spatially Referenced Watershed Model (Invited)

    NASA Astrophysics Data System (ADS)

    Alexander, R. B.; Boyer, E. W.; Schwarz, G. E.; Smith, R. A.

    2013-12-01

    Estimating water and material stores and fluxes in watershed studies is frequently complicated by uncertainties in quantifying hydrological and biogeochemical effects of factors such as land use, soils, and climate. Although these process-related effects are commonly measured and modeled in separate catchments, researchers are especially challenged by their complexity across catchments and diverse environmental settings, leading to a poor understanding of how model parameters and prediction uncertainties vary spatially. To address these concerns, we illustrate the use of Bayesian hierarchical modeling techniques with a dynamic version of the spatially referenced watershed model SPARROW (SPAtially Referenced Regression On Watershed attributes). The dynamic SPARROW model is designed to predict streamflow and other water cycle components (e.g., evapotranspiration, soil and groundwater storage) for monthly varying hydrological regimes, using mechanistic functions, mass conservation constraints, and statistically estimated parameters. In this application, the model domain includes nearly 30,000 NHD (National Hydrologic Data) stream reaches and their associated catchments in the Susquehanna River Basin. We report the results of our comparisons of alternative models of varying complexity, including models with different explanatory variables as well as hierarchical models that account for spatial and temporal variability in model parameters and variance (error) components. The model errors are evaluated for changes with season and catchment size and correlations in time and space. The hierarchical models consist of a two-tiered structure in which climate forcing parameters are modeled as random variables, conditioned on watershed properties. Quantification of spatial and temporal variations in the hydrological parameters and model uncertainties in this approach leads to more efficient (lower variance) and less biased model predictions throughout the river network. Moreover, predictions of water-balance components are reported according to probabilistic metrics (e.g., percentiles, prediction intervals) that include both parameter and model uncertainties. These improvements in predictions of streamflow dynamics can inform the development of more accurate predictions of spatial and temporal variations in biogeochemical stores and fluxes (e.g., nutrients and carbon) in watersheds.

  5. Understanding extreme sea levels for coastal impact and adaptation analysis

    NASA Astrophysics Data System (ADS)

    Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Hinkel, J.; Dangendorf, S.; Slangen, A.

    2016-12-01

    Coastal impact and adaptation assessments require detailed knowledge of extreme sea levels, because increasing damage due to extreme events, such as storm surges and tropical cyclones, is one of the major consequences of sea level rise and climate change. In fact, the IPCC highlighted in its AR4 report that "societal impacts of sea level change primarily occur via the extreme levels rather than as a direct consequence of mean sea level changes". Over the last few decades, substantial research efforts have been directed towards improved understanding of past and future mean sea level; different scenarios were developed with process-based or semi-empirical models and used for coastal impact assessments at various spatial scales to guide coastal management and adaptation efforts. The uncertainties in future sea level rise are typically accounted for by analyzing the impacts associated with a range of scenarios leading to a vertical displacement of the distribution of extreme sea levels. Indeed, most regional and global studies find little or no evidence for changes in storminess with climate change, although there is still low confidence in the results. More importantly, however, there is still a limited understanding of present-day extreme sea levels, which is largely ignored in most impact and adaptation analyses. The two key uncertainties stem from: (1) the numerical models that are used to generate long time series of extreme sea levels, whose bias varies spatially and can reach values much larger than the expected sea level rise, although it can be accounted for in most regions by making use of in-situ measurements; and (2) the statistical models used for determining present-day extreme sea-level exceedance probabilities. There is no universally accepted approach to obtain such values for flood risk assessments, and while substantial research has explored inter-model uncertainties for mean sea level, we explore here, for the first time, inter-model uncertainties for extreme sea levels at large spatial scales and compare them to the uncertainties in mean sea level projections.

  6. National-scale aboveground biomass geostatistical mapping with FIA inventory and GLAS data: Preparation for sparsely sampled lidar assisted forest inventory

    NASA Astrophysics Data System (ADS)

    Babcock, C. R.; Finley, A. O.; Andersen, H. E.; Moskal, L. M.; Morton, D. C.; Cook, B.; Nelson, R.

    2017-12-01

    Upcoming satellite lidar missions, such as GEDI and ICESat-2, are designed to collect laser altimetry data from space for narrow bands along orbital tracts. As a result, lidar metric sets derived from these sources will not provide complete spatial coverage. This lack of complete coverage, or sparsity, means traditional regression approaches that consider lidar metrics as explanatory variables (without error) cannot be used to generate wall-to-wall maps of forest inventory variables. We implement a coregionalization framework to jointly model sparsely sampled lidar information and point-referenced forest variable measurements to create wall-to-wall maps with full probabilistic uncertainty quantification of all inputs. We inform the model with USFS Forest Inventory and Analysis (FIA) in-situ forest measurements and GLAS lidar data to spatially predict aboveground forest biomass (AGB) across the contiguous US. We cast our model within a Bayesian hierarchical framework to better model complex space-varying correlation structures among the lidar metrics and FIA data, which yields improved prediction and uncertainty assessment. To circumvent computational difficulties that arise when fitting complex geostatistical models to massive datasets, we use a Nearest Neighbor Gaussian Process (NNGP) prior. Results indicate that a coregionalization modeling approach to leveraging sampled lidar data to improve AGB estimation is effective. Further, fitting the coregionalization model within a Bayesian mode of inference allows for AGB quantification across scales, ranging from individual pixel estimates of AGB density to total AGB for the continental US, with uncertainty. The coregionalization framework examined here is directly applicable to future spaceborne lidar acquisitions from GEDI and ICESat-2. Pairing these lidar sources with the extensive FIA forest monitoring plot network using a joint prediction framework, such as the coregionalization model explored here, offers the potential to improve forest AGB accounting certainty and provide maps for post-model-fitting analysis of the spatial distribution of AGB.

  7. Stochastical analysis of surfactant-enhanced remediation of denser-than-water nonaqueous phase liquid (DNAPL)-contaminated soils.

    PubMed

    Zhang, Renduo; Wood, A Lynn; Enfield, Carl G; Jeong, Seung-Woo

    2003-01-01

    Stochastical analysis was performed to assess the effect of soil spatial variability and heterogeneity on the recovery of denser-than-water nonaqueous phase liquids (DNAPL) during the process of surfactant-enhanced remediation. UTCHEM, a three-dimensional, multicomponent, multiphase, compositional model, was used to simulate water flow and chemical transport processes in heterogeneous soils. Soil spatial variability and heterogeneity were accounted for by considering the soil permeability as a spatial random variable and a geostatistical method was used to generate random distributions of the permeability. The randomly generated permeability fields were incorporated into UTCHEM to simulate DNAPL transport in heterogeneous media and stochastical analysis was conducted based on the simulated results. From the analysis, an exponential relationship between average DNAPL recovery and soil heterogeneity (defined as the standard deviation of log of permeability) was established with a coefficient of determination (r²) of 0.991, which indicated that DNAPL recovery decreased exponentially with increasing soil heterogeneity. Temporal and spatial distributions of relative saturations in the water phase, DNAPL, and microemulsion in heterogeneous soils were compared with those in homogeneous soils and related to soil heterogeneity. Cleanup time and uncertainty to determine DNAPL distributions in heterogeneous soils were also quantified. The study would provide useful information to design strategies for the characterization and remediation of nonaqueous phase liquid-contaminated soils with spatial variability and heterogeneity.
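
    The reported exponential relationship can be illustrated with a small curve fit; the data points below are invented stand-ins for the UTCHEM simulation output, not the study's values.

      import numpy as np
      from scipy.optimize import curve_fit

      def recovery(sigma, a, b):
          # Exponential recovery-heterogeneity model; sigma is the standard
          # deviation of log permeability
          return a * np.exp(-b * sigma)

      sigma = np.array([0.25, 0.5, 1.0, 1.5, 2.0])
      rec = np.array([0.95, 0.85, 0.65, 0.50, 0.38])   # illustrative values only
      (a, b), _ = curve_fit(recovery, sigma, rec, p0=(1.0, 0.5))

      resid = rec - recovery(sigma, a, b)
      r2 = 1.0 - resid.var() / rec.var()
      print(f"recovery ~ {a:.2f} * exp(-{b:.2f} * sigma),  r2 = {r2:.3f}")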

  8. Innovating Big Data Computing Geoprocessing for Analysis of Engineered-Natural Systems

    NASA Astrophysics Data System (ADS)

    Rose, K.; Baker, V.; Bauer, J. R.; Vasylkivska, V.

    2016-12-01

    Big data computing and analytical techniques offer opportunities to improve predictions about subsurface systems while quantifying and characterizing associated uncertainties from these analyses. Spatial analysis, big data and otherwise, of subsurface natural and engineered systems are based on variable resolution, discontinuous, and often point-driven data to represent continuous phenomena. We will present examples from two spatio-temporal methods that have been adapted for use with big datasets and big data geo-processing capabilities. The first approach uses regional earthquake data to evaluate spatio-temporal trends associated with natural and induced seismicity. The second algorithm, the Variable Grid Method (VGM), is a flexible approach that presents spatial trends and patterns, such as those resulting from interpolation methods, while simultaneously visualizing and quantifying uncertainty in the underlying spatial datasets. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analyses to efficiently consume and utilize large geospatial data in these custom analytical algorithms through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom `Big Data' geospatial applications that run on the Hadoop cluster and integrate with ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations.

  9. Effects of input uncertainty on cross-scale crop modeling

    NASA Astrophysics Data System (ADS)

    Waha, Katharina; Huth, Neil; Carberry, Peter

    2014-05-01

    The quality of data on climate, soils and agricultural management in the tropics is in general low, or data are scarce, leading to uncertainty in process-based modeling of cropping systems. Process-based crop models are common tools for simulating crop yields and crop production in climate change impact studies, studies on mitigation and adaptation options, or food security studies. Crop modelers are concerned about input data accuracy, as this, together with an adequate representation of plant physiology processes and the choice of model parameters, is a key factor for a reliable simulation. For example, assuming an error in measurements of air temperature, radiation and precipitation of ±0.2°C, ±2% and ±3% respectively, Fodor & Kovacs (2005) estimate that this translates into an uncertainty of 5-7% in yield and biomass simulations. In our study we seek to answer the following questions: (1) are there important uncertainties in the spatial variability of simulated crop yields on the grid-cell level displayed on maps, (2) are there important uncertainties in the temporal variability of simulated crop yields on the aggregated, national level displayed in time series, and (3) how does the accuracy of different soil, climate and management information influence the simulated crop yields in two crop models designed for use at different spatial scales? The study will help to determine whether more detailed information improves the simulations and to advise model users on the uncertainty related to input data. We analyse the performance of the point-scale crop model APSIM (Keating et al., 2003) and the global-scale crop model LPJmL (Bondeau et al., 2007) with different climate information (monthly and daily) and soil conditions (global soil map and African soil map) under different agricultural management (uniform and variable sowing dates) for the low-input maize-growing areas in Burkina Faso/West Africa. We test the models' response to different levels of input data, from very little to very detailed information, and compare the models' abilities to represent the spatial and temporal variability in crop yields. We display the uncertainty in crop yield simulations from different input data and crop models in Taylor diagrams, which are a graphical summary of the similarity between simulations and observations (Taylor, 2001). The observed spatial variability can be represented well by both models (R = 0.6-0.8), but APSIM predicts higher spatial variability than LPJmL due to its sensitivity to soil parameters. Simulations with the same crop model, climate and sowing dates have similar statistics and therefore similar skill in reproducing the observed spatial variability. Soil data are less important for the skill of a crop model in reproducing the observed spatial variability. However, the uncertainty in simulated spatial variability from the two crop models is larger than that from the input data settings, and APSIM is more sensitive to input data than LPJmL. Even with a detailed, point-scale crop model and detailed input data it is difficult to capture the complexity and diversity in maize cropping systems.
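
    For reference, the statistics summarized by a Taylor diagram can be computed in a few lines; a minimal Python sketch:

      import numpy as np

      def taylor_stats(sim, obs):
          """Statistics displayed in a Taylor diagram (Taylor, 2001): correlation R,
          simulated/observed standard deviation ratio, and the centered
          (bias-removed) RMS difference."""
          sim, obs = np.asarray(sim, float), np.asarray(obs, float)
          r = np.corrcoef(sim, obs)[0, 1]
          std_ratio = sim.std() / obs.std()
          crmsd = np.sqrt(np.mean(((sim - sim.mean()) - (obs - obs.mean())) ** 2))
          return r, std_ratio, crmsd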

  10. Application of a Monte Carlo framework with bootstrapping for quantification of uncertainty in baseline map of carbon emissions from deforestation in Tropical Regions

    Treesearch

    William Salas; Steve Hagen

    2013-01-01

    This presentation will provide an overview of an approach for quantifying uncertainty in spatial estimates of carbon emission from land use change. We generate uncertainty bounds around our final emissions estimate using a randomized, Monte Carlo (MC)-style sampling technique. This approach allows us to combine uncertainty from different sources without making...
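
    A minimal Python sketch of the bootstrap idea: resample a (hypothetical) per-pixel emissions map with replacement and re-total, yielding uncertainty bounds on the aggregate estimate. The gamma-distributed pixel values are invented stand-ins for a deforestation-emissions map.

      import numpy as np

      rng = np.random.default_rng(3)
      pixel_emissions = rng.gamma(shape=2.0, scale=50.0, size=10_000)  # tC per pixel, assumed

      # Each bootstrap replicate resamples the pixels and re-computes the total
      totals = np.array([
          rng.choice(pixel_emissions, size=pixel_emissions.size, replace=True).sum()
          for _ in range(1000)
      ])
      print("total emissions 90% bounds:", np.quantile(totals, [0.05, 0.95]))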

  11. Spatial distribution of forest aboveground biomass estimated from remote sensing and forest inventory data in New England, USA

    Treesearch

    Daolan Zheng; Linda S. Heath; Mark J. Ducey

    2008-01-01

    We combined satellite (Landsat 7 and Moderate Resolution Imaging Spectrometer) and U.S. Department of Agriculture forest inventory and analysis (FIA) data to estimate forest aboveground biomass (AGB) across New England, USA. This is practical for large-scale carbon studies and may reduce uncertainty of AGB estimates. We estimate that total regional forest AGB was 1,867...

  12. Physics Verification Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
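
    Solution verification of the kind described rests on comparing the observed order of convergence under grid refinement against the scheme's formal order; a minimal Python sketch of that standard calculation:

      import numpy as np

      def observed_order(e_coarse, e_fine, refinement_ratio=2.0):
          """Observed order of convergence from errors on two grids; solution
          verification checks this against the discretization's formal order."""
          return np.log(e_coarse / e_fine) / np.log(refinement_ratio)

      # Example: halving the cell size cuts the error by ~4x => ~2nd order
      print(observed_order(1.0e-2, 2.6e-3))   # ~1.94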

  13. Presentation of uncertainties on web platforms for climate change information

    NASA Astrophysics Data System (ADS)

    Nocke, Thomas; Wrobel, Markus; Reusser, Dominik

    2014-05-01

    Climate research has a long tradition, but there is still uncertainty about the specific effects of climate change. One of the key tasks, beyond discussing climate change and its impacts in specialist groups, is to present these findings to a wider audience. In that respect, decision-makers in the public sector as well as directly affected professional groups need easy-to-understand information. These groups are not made up of specialist scientists. This gives rise to the challenge that the scientific information must be presented such that it is commonly understood, while the complexity of the underlying science is still conveyed. In particular, this requires the explicit representation of spatial and temporal uncertainty information for lay people. Within this talk/poster we survey how climate change and climate impact uncertainty information is presented on various web-based climate service platforms. We outline how the specifics of this medium make it challenging to find adequate and readable representations of uncertainties. First, we introduce a multi-step approach to communicating uncertainty, based on a typology that distinguishes between epistemic, natural stochastic, and human reflexive uncertainty. Then, we compare existing concepts and representations for uncertainty communication with current practices on web-based platforms, including our own solutions within the web platforms ClimateImpactsOnline and ci:grasp. Finally, we review surveys on how spatial uncertainty visualization techniques are perceived by untrained users.

  14. Advances in audio source separation and multisource audio content retrieval

    NASA Astrophysics Data System (ADS)

    Vincent, Emmanuel

    2012-06-01

    Audio source separation aims to extract the signals of individual sound sources from a given recording. In this paper, we review three recent advances which improve the robustness of source separation in real-world challenging scenarios and enable its use for multisource content retrieval tasks, such as automatic speech recognition (ASR) or acoustic event detection (AED) in noisy environments. We present a Flexible Audio Source Separation Toolkit (FASST) and discuss its advantages compared to earlier approaches such as independent component analysis (ICA) and sparse component analysis (SCA). We explain how cues as diverse as harmonicity, spectral envelope, temporal fine structure or spatial location can be jointly exploited by this toolkit. We subsequently present the uncertainty decoding (UD) framework for the integration of audio source separation and audio content retrieval. We show how the uncertainty about the separated source signals can be accurately estimated and propagated to the features. Finally, we explain how this uncertainty can be efficiently exploited by a classifier, both at the training and the decoding stage. We illustrate the resulting performance improvements in terms of speech separation quality and speaker recognition accuracy.

  15. Paleoclimate networks: a concept meeting central challenges in the reconstruction of paleoclimate dynamics

    NASA Astrophysics Data System (ADS)

    Rehfeld, Kira; Goswami, Bedartha; Marwan, Norbert; Breitenbach, Sebastian; Kurths, Jürgen

    2013-04-01

    Statistical analysis of dependencies amongst paleoclimate data helps to infer the climatic processes they reflect. Three key challenges have to be addressed, however: the datasets are heterogeneous in (i) time and (ii) space, and (iii) time itself is a variable that needs to be reconstructed, which introduces additional uncertainties. To address these issues in a flexible way we developed the paleoclimate network framework, inspired by the increasing application of complex networks in climate research. Each node in the paleoclimate network represents a paleoclimate archive and an associated time series. Links between nodes are assigned if their time series are significantly similar. The base of the paleoclimate network is therefore formed by linear and nonlinear estimators of Pearson correlation, mutual information and event synchronization, which quantify similarity from irregularly sampled time series. Age uncertainties are propagated into the final network analysis using time series ensembles which reflect the uncertainty. We discuss how spatial heterogeneity influences the results obtained from network measures, and demonstrate the power of the approach by inferring teleconnection variability of the Asian summer monsoon for the past 1000 years.
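
    A stripped-down Python sketch of the network construction: nodes are archives, and a link is drawn when two series are significantly correlated. Regular sampling and a plain Pearson threshold are simplifying assumptions here; the actual framework handles irregular sampling and propagates age uncertainty through ensembles.

      import numpy as np

      rng = np.random.default_rng(5)
      n_nodes, n_time = 6, 200

      # Hypothetical archives: each series shares a common signal to varying degree
      base = rng.normal(size=n_time)
      series = np.array([base * rng.uniform(0, 1) + rng.normal(size=n_time)
                         for _ in range(n_nodes)])

      corr = np.corrcoef(series)
      threshold = 2.0 / np.sqrt(n_time)          # rough 95% significance level for r
      adjacency = (np.abs(corr) > threshold) & ~np.eye(n_nodes, dtype=bool)
      print(adjacency.sum(axis=1))               # degree of each node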

  16. Approaches to highly parameterized inversion: A guide to using PEST for model-parameter and predictive-uncertainty analysis

    USGS Publications Warehouse

    Doherty, John E.; Hunt, Randall J.; Tonkin, Matthew J.

    2010-01-01

    Analysis of the uncertainty associated with parameters used by a numerical model, and with predictions that depend on those parameters, is fundamental to the use of modeling in support of decisionmaking. Unfortunately, predictive uncertainty analysis with regard to models can be very computationally demanding, due in part to complex constraints on parameters that arise from expert knowledge of system properties on the one hand (knowledge constraints) and from the necessity for the model parameters to assume values that allow the model to reproduce historical system behavior on the other hand (calibration constraints). Enforcement of knowledge and calibration constraints on parameters used by a model does not eliminate the uncertainty in those parameters. In fact, in many cases, enforcement of calibration constraints simply reduces the uncertainties associated with a number of broad-scale combinations of model parameters that collectively describe spatially averaged system properties. The uncertainties associated with other combinations of parameters, especially those that pertain to small-scale parameter heterogeneity, may not be reduced through the calibration process. To the extent that a prediction depends on system-property detail, its postcalibration variability may be reduced very little, if at all, by applying calibration constraints; knowledge constraints remain the only limits on the variability of predictions that depend on such detail. Regrettably, in many common modeling applications, these constraints are weak. Though the PEST software suite was initially developed as a tool for model calibration, recent developments have focused on the evaluation of model-parameter and predictive uncertainty. As a complement to functionality that it provides for highly parameterized inversion (calibration) by means of formal mathematical regularization techniques, the PEST suite provides utilities for linear and nonlinear error-variance and uncertainty analysis in these highly parameterized modeling contexts. Availability of these utilities is particularly important because, in many cases, a significant proportion of the uncertainty associated with model parameters-and the predictions that depend on them-arises from differences between the complex properties of the real world and the simplified representation of those properties that is expressed by the calibrated model. This report is intended to guide intermediate to advanced modelers in the use of capabilities available with the PEST suite of programs for evaluating model predictive error and uncertainty. A brief theoretical background is presented on sources of parameter and predictive uncertainty and on the means for evaluating this uncertainty. Applications of PEST tools are then discussed for overdetermined and underdetermined problems, both linear and nonlinear. PEST tools for calculating contributions to model predictive uncertainty, as well as optimization of data acquisition for reducing parameter and predictive uncertainty, are presented. The appendixes list the relevant PEST variables, files, and utilities required for the analyses described in the document.

  17. Sources of Uncertainty in the Prediction of LAI / fPAR from MODIS

    NASA Technical Reports Server (NTRS)

    Dungan, Jennifer L.; Ganapol, Barry D.; Brass, James A. (Technical Monitor)

    2002-01-01

    To explicate the sources of uncertainty in the prediction of biophysical variables over space, consider the general equation z_v(u) = f(y_v(x), B), where z is a variable with values on some nominal, ordinal, interval or ratio scale; y is a vector of input variables; v is the spatial support of y and z; x and u are the spatial locations of y and z, respectively; f is a model; and B is the vector of the parameters of this model. Any y or z has a value and a spatial extent, which is called its support. Viewed in this way, categories of uncertainty arise from variable (e.g. measurement), parameter, positional, support and model (e.g. structural) sources. The prediction of Leaf Area Index (LAI) and the fraction of absorbed photosynthetically active radiation (fPAR) are examples of z variables predicted using model(s) as a function of y variables and spatially constant parameters. The MOD15 algorithm is an example of f, called f_1, with parameters including those defined by one of six biome types and solar and view angles. The Leaf Canopy Model (LCM2), a nested model that combines leaf radiative transfer with a full canopy reflectance model through the phase function, is a simpler though similar radiative transfer approach to f_1. In a previous study, MOD15 and LCM2 gave similar results for the broadleaf forest biome. Differences between these two models can be used to consider the structural uncertainty in prediction results. In an effort to quantify each of the five sources of uncertainty and rank their relative importance for the LAI/fPAR prediction problem, we used recent data for an EOS Core Validation Site in the broadleaf biome with coincident surface reflectance, vegetation index, fPAR and LAI products from the Moderate Resolution Imaging Spectroradiometer (MODIS). Uncertainty due to support on the input reflectance variable was characterized using Landsat ETM+ data. Input uncertainties were propagated through the LCM2 model and compared with published uncertainties from the MOD15 algorithm.
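
    A toy Python sketch separating two of the uncertainty categories above for a generic z = f(y, B): input (variable) error in y versus parameter error in B. The saturating model and all numbers are hypothetical, chosen only to mimic an fPAR-vs-LAI-like response.

      import numpy as np

      rng = np.random.default_rng(8)

      def f(y, B):
          # Toy saturating response; NOT the MOD15 or LCM2 model
          return B[0] * (1.0 - np.exp(-B[1] * y))

      y_samples = rng.normal(3.0, 0.3, 10_000)                      # input (variable) error
      B_samples = rng.normal([0.95, 0.5], [0.02, 0.05], size=(10_000, 2))  # parameter error

      z_var_only = f(y_samples, [0.95, 0.5])   # propagate variable error alone
      z_par_only = f(3.0, B_samples.T)         # propagate parameter error alone
      print(f"variable-error spread:  {z_var_only.std():.4f}")
      print(f"parameter-error spread: {z_par_only.std():.4f}")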

  18. Hypersonic Boundary Layer Measurements with Variable Blowing Rates Using Molecular Tagging Velocimetry

    NASA Technical Reports Server (NTRS)

    Bathel, Brett F.; Danehy, Paul M.; Johansen, Craig T.; Jones, Stephen B.; Goyne, Christopher P.

    2012-01-01

    Measurements of mean and instantaneous streamwise velocity profiles in a hypersonic boundary layer with variable rates of mass injection (blowing) of nitrogen dioxide (NO2) were obtained over a 10-degree half-angle wedge model. The NO2 was seeded into the flow from a slot located 29.4 mm downstream of the sharp leading edge. The top surface of the wedge was oriented at a 20 degree angle in the Mach 10 flow, yielding an edge Mach number of approximately 4.2. The streamwise velocity profiles and streamwise fluctuating velocity component profiles were obtained using a three-laser NO2->NO photolysis molecular tagging velocimetry method. Observed trends in the mean streamwise velocity profiles and profiles of the fluctuating component of streamwise velocity as functions of the blowing rate are described. An effort is made to distinguish between the effect of blowing rate and wall temperature on the measured profiles. An analysis of the mean velocity profiles for a constant blowing rate is presented to determine the uncertainty in the measurement for different probe laser delay settings. Measurements of streamwise velocity were made to within approximately 120 μm of the model surface. The streamwise spatial resolution in this experiment ranged from 0.6 mm to 2.6 mm. An improvement in the spatial precision of the measurement technique has been made, with spatial uncertainties reduced by about a factor of 2 compared to previous measurements. For the quiescent flow calibration measurements presented, uncertainties as low as 2 m/s are obtained at 95% confidence for long delay times (25 μs). For the velocity measurements obtained with the wind tunnel operating, average single-shot uncertainties of less than 44 m/s are obtained at 95% confidence with a probe laser delay setting of 1 μs. The measurements were performed in the 31-inch Mach 10 Air Tunnel at the NASA Langley Research Center.
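
    The reported precision figures follow directly from velocity being displacement divided by delay: a fixed positional uncertainty shrinks, in velocity units, as the delay grows. A short worked example in Python, with the positional uncertainty chosen only to roughly reproduce the quoted numbers:

      # sigma_x is an assumed positional uncertainty of the tagged line, meters
      sigma_x = 50e-6
      for dt in (1e-6, 25e-6):   # probe laser delays of 1 us and 25 us
          sigma_u = sigma_x / dt
          print(f"delay {dt * 1e6:4.0f} us -> velocity uncertainty ~ {sigma_u:6.1f} m/s")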

  19. Satellite-based drought monitoring in Kenya in an operational setting

    NASA Astrophysics Data System (ADS)

    Klisch, A.; Atzberger, C.; Luminari, L.

    2015-04-01

    The University of Natural Resources and Life Sciences (BOKU) in Vienna (Austria), in cooperation with the National Drought Management Authority (NDMA) in Nairobi (Kenya), has set up an operational processing chain for mapping drought occurrence and strength for the territory of Kenya using the Moderate Resolution Imaging Spectroradiometer (MODIS) NDVI at 250 m ground resolution from 2000 onwards. The processing chain employs a modified Whittaker smoother providing consistent NDVI "Monday images" in near real-time (NRT) at a 7-day update interval. The approach constrains temporally extrapolated NDVI values based on reasonable temporal NDVI paths. Contrary to other competing approaches, the processing chain provides a modelled uncertainty range for each pixel and time step. The uncertainties are calculated by a hindcast analysis of the NRT products against an "optimum" filtering. To detect droughts, the vegetation condition index (VCI) is calculated at pixel level and is spatially aggregated to administrative units. Starting from weekly temporal resolution, the indicator is also aggregated over 1- and 3-month intervals, considering the available uncertainty information. Analysts at NDMA use the spatially/temporally aggregated VCI and basic image products for their monthly bulletins. Based on the provided bio-physical indicators as well as a number of socio-economic indicators, contingency funds are released by NDMA to sustain counties in drought conditions. The paper shows the successful application of the products within NDMA by providing a retrospective analysis applied to the droughts of 2006, 2009 and 2011. Some comparisons with alternative products (e.g. FEWS NET, the Famine Early Warning Systems Network) highlight the main differences.
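
    The VCI is commonly defined per pixel from the NDVI record's historical envelope; a minimal Python sketch:

      import numpy as np

      def vci(ndvi, ndvi_min, ndvi_max):
          """Vegetation Condition Index per pixel: current NDVI scaled between the
          historical minimum and maximum for the same location and time of year.
          0 = driest conditions on record, 100 = best conditions on record."""
          return 100.0 * (ndvi - ndvi_min) / (ndvi_max - ndvi_min)

      print(vci(np.array([0.35]), np.array([0.2]), np.array([0.6])))   # [37.5]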

  20. Effects of temporal and spatial resolution of calibration data on integrated hydrologic water quality model identification

    NASA Astrophysics Data System (ADS)

    Jiang, Sanyuan; Jomaa, Seifeddine; Büttner, Olaf; Rode, Michael

    2014-05-01

    Hydrological water quality modeling is increasingly used for investigating runoff and nutrient transport processes as well as watershed management, but it is mostly unclear how data availability determines model identification. In this study, the HYPE (HYdrological Predictions for the Environment) model, which is a process-based, semi-distributed hydrological water quality model, was applied in two different mesoscale catchments (Selke (463 km²) and Weida (99 km²)) located in central Germany to simulate discharge and inorganic nitrogen (IN) transport. PEST and DREAM(ZS) were combined with the HYPE model to conduct parameter calibration and uncertainty analysis. A split-sample test was used for model calibration (1994-1999) and validation (1999-2004). IN concentration and daily IN load were found to be highly correlated with discharge, indicating that IN leaching is mainly controlled by runoff. Both dynamics and balances of water and IN load were well captured, with NSE greater than 0.83 during the validation period. Multi-objective calibration (calibrating hydrological and water quality parameters simultaneously) was found to outperform step-wise calibration in terms of model robustness. Multi-site calibration was able to improve model performance at internal sites, decrease parameter posterior uncertainty and reduce prediction uncertainty. Nitrogen-process parameters calibrated using continuous daily averages of nitrate-N concentration observations produced better and more robust simulations of IN concentration and load, lower posterior parameter uncertainty and lower IN concentration prediction uncertainty, compared to calibration against discontinuous biweekly nitrate-N concentration measurements. Both PEST and DREAM(ZS) are efficient in parameter calibration. However, DREAM(ZS) is more sound in terms of parameter identification and uncertainty analysis than PEST because of its capability to evolve parameter posterior distributions and estimate prediction uncertainty based on global search and Bayesian inference schemes.

  1. Hydrological model uncertainty due to spatial evapotranspiration estimation methods

    NASA Astrophysics Data System (ADS)

    Yu, Xuan; Lamačová, Anna; Duffy, Christopher; Krám, Pavel; Hruška, Jakub

    2016-05-01

    Evapotranspiration (ET) continues to be a difficult process to estimate in seasonal and long-term water balances in catchment models. Approaches to estimate ET typically use vegetation parameters (e.g., leaf area index [LAI], interception capacity) obtained from field observation, remote sensing data, national or global land cover products, and/or simulated by ecosystem models. In this study we attempt to quantify the uncertainty that spatial evapotranspiration estimation introduces into hydrological simulations when the age of the forest is not precisely known. The Penn State Integrated Hydrologic Model (PIHM) was implemented for the Lysina headwater catchment, located at 50°03′N, 12°40′E in the western part of the Czech Republic. The spatial forest patterns were digitized from forest age maps made available by the Czech Forest Administration. Two ET methods were implemented in the catchment model: the Biome-BGC forest growth sub-model (one-way coupled to PIHM) and a fixed-seasonal LAI method. From these two approaches simulation scenarios were developed: we combined the estimated spatial forest age maps and the two ET estimation methods to drive PIHM. A set of spatial hydrologic regime and streamflow regime indices were calculated from the modeling results for each method. Intercomparison of the hydrological responses to the spatial vegetation patterns suggested considerable variation in soil moisture and recharge and a small uncertainty in the groundwater table elevation and streamflow. The hydrologic modeling with ET estimated by Biome-BGC generated less uncertainty, owing to its plant physiology-based method. The implication of this research is that overall hydrologic variability induced by uncertain management practices was reduced by implementing vegetation models in the catchment models.

  2. On the Character and Mitigation of Atmospheric Noise in InSAR Time Series Analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Barnhart, W. D.; Fielding, E. J.; Fishbein, E.

    2013-12-01

    Time series analysis of interferometric synthetic aperture radar (InSAR) data, with its broad spatial coverage and ability to image regions that are sometimes very difficult to access, is a powerful tool for characterizing continental surface deformation and its temporal variations. With the impending launch of dedicated SAR missions such as Sentinel-1, ALOS-2, and the planned NASA L-band SAR mission, large-volume data sets will allow researchers to further probe ground displacement processes with increased fidelity. Unfortunately, the precision of measurements in individual interferograms is impacted by several sources of noise, notably spatially correlated signals caused by path delays through the stratified and turbulent atmosphere and ionosphere. Spatial and temporal variations in atmospheric water vapor often introduce several to tens of centimeters of apparent deformation in the radar line-of-sight, correlated over short spatial scales (<10 km). Signals resulting from atmospheric path delays are particularly problematic because, like the subsidence and uplift signals associated with tectonic deformation, they are often spatially correlated with topography. In this talk, we provide an overview of the effects of spatially correlated tropospheric noise in individual interferograms and InSAR time series analysis, and we highlight where common assumptions about the temporal and spatial characteristics of tropospheric noise fail. Next, we discuss two classes of methods for mitigating the effects of tropospheric water vapor noise in InSAR time series analysis and single interferograms: noise estimation and characterization with independent observations from multispectral sensors such as MODIS and MERIS; and noise estimation and removal with weather models, multispectral sensor observations, and GPS. Each of these techniques can provide independent assessments of the contribution of water vapor in interferograms, but each technique also suffers from several pitfalls that we outline. Multispectral near-infrared (NIR) sensors provide high spatial resolution (~1 km) estimates of total column tropospheric water vapor by measuring the absorption of reflected solar illumination, and may provide excellent estimates of the wet delay. The Online Services for Correcting Atmosphere in Radar (OSCAR) project currently provides water vapor products through web services (http://oscar.jpl.nasa.gov). Unfortunately, such sensors require daytime and cloudless observations. Global and regional numerical weather models can provide an additional estimate of both the dry and wet atmospheric delays with spatial resolutions of 3-100 km and time scales of 1-3 hours, though these models are of lower accuracy than imaging observations and benefit from independent observations of atmospheric water vapor. Despite these issues, the integration of these techniques for InSAR correction and uncertainty estimation may contribute substantially to the reduction and rigorous characterization of uncertainty in InSAR time series analysis, helping to expand the range of tectonic displacements imaged with InSAR, to robustly constrain geophysical models, and to generate a priori assessments of satellite acquisition goals.
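
    As a point of contrast with the multi-sensor and weather-model corrections discussed above, the simplest stratified-delay mitigation regresses phase against elevation; a minimal Python sketch of that common first-order correction (it fails where deformation itself correlates with topography):

      import numpy as np

      def remove_stratified_delay(phase, elevation):
          """First-order mitigation of the stratified tropospheric delay: fit a
          linear trend of unwrapped interferogram phase against elevation and
          subtract it. A common simple correction, not the talk's full method."""
          phase, elevation = np.ravel(phase), np.ravel(elevation)
          slope, intercept = np.polyfit(elevation, phase, 1)
          return phase - (slope * elevation + intercept)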

  3. Trend analysis of the aerosol optical depth from fusion of MISR and MODIS retrievals over China

    NASA Astrophysics Data System (ADS)

    Guo, Jing; Gu, Xingfa; Yu, Tao; Cheng, Tianhai; Chen, Hao

    2014-03-01

    Atmospheric aerosol plays an important role in climate change through direct and indirect processes. In order to evaluate the effects of aerosols on climate, it is necessary to study their spatial and temporal distributions. Satellite aerosol remote sensing is a developing technology that may provide good temporal sampling and superior spatial coverage for studying aerosols. The Moderate Resolution Imaging Spectroradiometer (MODIS) and Multi-angle Imaging Spectroradiometer (MISR) have provided aerosol observations since 2000, with large coverage and high accuracy. However, due to complex surfaces, cloud contamination, and the aerosol models used in the retrieval process, uncertainties still exist in current satellite aerosol products. There are several observed differences when comparing the MISR and MODIS AOD data with the AERONET AOD. Combining multiple sensors can reduce uncertainties and improve observational accuracy. The validation results reveal a better agreement between the fused AOD and the AERONET AOD, confirming that the fused AOD values are more accurate than those from a single sensor. We analyzed trends in aerosol properties over China based on nine years (2002-2010) of fused data. For the trend analysis in Jingjintang and the Yangtze River Delta, the accuracy increased by 5% and 3%, respectively. An obvious increasing trend of AOD occurred in the Yangtze River Delta, where human activities may be the main source of the increasing AOD.
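
    One plausible way to combine two retrievals, shown as a sketch rather than the authors' actual fusion scheme, is an inverse-variance weighted average; the numbers below are hypothetical:

      import numpy as np

      def fuse_aod(aod_misr, var_misr, aod_modis, var_modis):
          """Inverse-variance weighted fusion of two AOD retrievals: the fused
          variance is lower than either input variance."""
          w1, w2 = 1.0 / np.asarray(var_misr), 1.0 / np.asarray(var_modis)
          fused = (w1 * aod_misr + w2 * aod_modis) / (w1 + w2)
          fused_var = 1.0 / (w1 + w2)
          return fused, fused_var

      print(fuse_aod(0.30, 0.004, 0.36, 0.009))   # fused estimate sits nearer MISR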

  4. Crown fuel spatial variability and predictability of fire spread

    Treesearch

    Russell A. Parsons; Jeremy Sauer; Rodman R. Linn

    2010-01-01

    Fire behavior predictions, as well as measures of uncertainty in those predictions, are essential in operational and strategic fire management decisions. While it is becoming common practice to assess uncertainty in fire behavior predictions arising from variability in weather inputs, uncertainty arising from the fire models themselves is difficult to assess. This is...

  5. Spatial inter-comparison of Top-down emission inventories in European urban areas

    NASA Astrophysics Data System (ADS)

    Trombetti, Marco; Thunis, Philippe; Bessagnet, Bertrand; Clappier, Alain; Couvidat, Florian; Guevara, Marc; Kuenen, Jeroen; López-Aparicio, Susana

    2018-01-01

    This paper presents an inter-comparison of the main Top-down emission inventories currently used for air quality modelling studies at the European level. The comparison is developed for eleven European cities and compares the distribution of emissions of NOx, SO2, VOC and PPM2.5 from the road transport, residential combustion and industry sectors. The analysis shows that substantial differences in terms of total emissions, sectorial emission shares and spatial distribution exist between the datasets. The possible reasons in terms of downscaling approaches and choice of spatial proxies are analysed and recommendations are provided for each inventory in order to work towards the harmonisation of spatial downscaling and proxy calibration, in particular for policy purposes. The proposed methodology may be useful for the development of consistent and harmonised European-wide inventories with the aim of reducing the uncertainties in air quality modelling activities.

  6. Assessment of Spatial Transferability of Process-Based Hydrological Model Parameters in Two Neighboring Catchments in the Himalayan Region

    NASA Astrophysics Data System (ADS)

    Nepal, S.

    2016-12-01

    The spatial transferability of the model parameters of the process-oriented distributed J2000 hydrological model was investigated in two glaciated sub-catchments of the Koshi river basin in eastern Nepal. The basins have a high degree of similarity with respect to their static landscape features. The model was first calibrated (1986-1991) and validated (1992-1997) in the Dudh Koshi sub-catchment. The calibrated and validated model parameters were then transferred to the nearby Tamor catchment (2001-2009). A sensitivity and uncertainty analysis was carried out for both sub-catchments to discover the sensitivity range of the parameters in the two catchments. The model represented the overall hydrograph well in both sub-catchments, including baseflow and medium-range flows (rising and recession limbs). The efficiency results according to both Nash-Sutcliffe and the coefficient of determination were above 0.84 in both cases. The sensitivity analysis showed that the same parameter was most sensitive for Nash-Sutcliffe (ENS) and Log Nash-Sutcliffe (LNS) efficiencies in both catchments. However, there were some differences in sensitivity to ENS and LNS for moderately and weakly sensitive parameters, although the majority (13 out of 16 for ENS and 16 out of 16 for LNS) had a sensitivity response in a similar range. Generalized likelihood uncertainty estimation (GLUE) results suggest that the observed runoff mostly lies within the parameter uncertainty range, although values occasionally fall outside it, especially during flood peaks and more often in the Tamor. This may be due to the limited input data resulting from the small number of precipitation stations and the lack of representative stations in high-altitude areas, as well as to model structural uncertainty. The results indicate that transfer of the J2000 parameters to a neighboring catchment in the Himalayan region with similar physiographic landscape characteristics is viable. This suggests that the process-based J2000 model could be applied to ungauged catchments in the Himalayan region, which could provide important insights into hydrological system dynamics and much needed information to support water resources planning and management.
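
    A minimal Python sketch of the GLUE procedure named above: keep "behavioral" parameter sets whose likelihood measure exceeds a threshold, then form likelihood-weighted prediction quantiles. The array shapes and the threshold choice are assumptions, not this study's settings.

      import numpy as np

      def glue_bounds(likelihoods, predictions, threshold, q=(0.05, 0.95)):
          """GLUE uncertainty band. Shapes: likelihoods (n,), predictions (n, t).
          Returns an array of shape (t, 2): lower and upper band per time step."""
          keep = likelihoods > threshold           # behavioral parameter sets only
          w = likelihoods[keep] / likelihoods[keep].sum()
          preds = predictions[keep]
          order = np.argsort(preds, axis=0)
          bounds = []
          for t in range(preds.shape[1]):
              # Weighted quantiles via the weighted empirical CDF at time step t
              p_sorted = preds[order[:, t], t]
              cdf = np.cumsum(w[order[:, t]])
              bounds.append(np.interp(q, cdf, p_sorted))
          return np.array(bounds)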

  7. A Spatial-frequency Method for Analyzing Antenna-to-Probe Interactions in Near-field Antenna Measurements.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brock, Billy C.

    The measurement of the radiation characteristics of an antenna on a near-field range requires that the antenna under test be located very close to the near-field probe. Although the direct coupling is utilized for characterizing the near field, this close proximity also presents the opportunity for significant undesired interactions (for example, reflections) to occur between the antenna and the near-field probe. When uncompensated, these additional interactions will introduce error into the measurement, increasing the uncertainty in the final gain pattern obtained through the near-field-to-far-field transformation. Quantifying this gain-uncertainty contribution requires quantifying the various additional interactions. A method incorporating spatial-frequency analysis is described which allows the dominant interaction contributions to be easily identified and quantified. In addition to identifying the additional antenna-to-probe interactions, the method also allows identification and quantification of interactions with other nearby objects within the measurement room. Because the method is a spatial-frequency method, wide-bandwidth data is not required, and it can be applied even when data is available at only a single temporal frequency. This feature ensures that the method can be applied to narrow-band antennas, where a similar time-domain analysis would not be possible.
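
    The abstract leaves the transform implicit, but the core of a spatial-frequency analysis fits in a few lines: even at a single temporal frequency, each propagating contribution appears as a lobe at its own spatial frequency kx = k0*sin(theta), so a weak reflection separates cleanly from the direct coupling. The geometry, angles and levels below are purely illustrative, not the paper's setup.

    ```python
    import numpy as np

    # Synthetic single-frequency near-field scan: the direct antenna-probe
    # coupling plus a weaker reflection arriving from a different direction.
    wavelength = 0.03                         # ~10 GHz, in metres (illustrative)
    k0 = 2 * np.pi / wavelength
    x = np.linspace(-0.5, 0.5, 512)           # probe positions along the scan, m
    direct = 1.00 * np.exp(1j * k0 * np.sin(np.deg2rad(5)) * x)
    reflect = 0.05 * np.exp(1j * k0 * np.sin(np.deg2rad(40)) * x)
    field = direct + reflect

    # In the spatial-frequency domain each propagating contribution shows up
    # as a lobe at kx = k0*sin(theta), even with data at one temporal frequency.
    window = np.hanning(x.size)
    spectrum = np.fft.fftshift(np.fft.fft(field * window))
    kx = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(x.size, d=x[1] - x[0]))
    mag_db = 20 * np.log10(np.abs(spectrum) / np.abs(spectrum).max())

    # Locate the direct lobe, mask it out, then find the reflection lobe
    k_direct = kx[np.argmax(mag_db)]
    mask = np.abs(kx - k_direct) > 0.2 * k0
    k_reflect = kx[mask][np.argmax(mag_db[mask])]
    print(k_direct / k0, k_reflect / k0)  # ~ sin(5 deg) = 0.087 and sin(40 deg) = 0.64
    ```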

  8. Assessing concentration uncertainty estimates from passive microwave sea ice products

    NASA Astrophysics Data System (ADS)

    Meier, W.; Brucker, L.; Miller, J. A.

    2017-12-01

    Sea ice concentration is an essential climate variable, and passive-microwave-derived estimates of concentration form one of the longest satellite-derived climate records. However, until recently uncertainty estimates were not provided. Numerous validation studies have provided insight into general error characteristics, but these studies found that concentration error varied greatly depending on sea ice conditions. Thus, an uncertainty estimate for each observation is desired, particularly for initialization, assimilation, and validation of models. Here we investigate three sea ice products that include an uncertainty for each concentration estimate: the NASA Team 2 algorithm product, the EUMETSAT Ocean and Sea Ice Satellite Application Facility (OSI-SAF) product, and the NOAA/NSIDC Climate Data Record (CDR) product. Each product estimates uncertainty with a completely different approach. The NASA Team 2 product derives uncertainty internally from the algorithm method itself. The OSI-SAF uses atmospheric reanalysis fields and a radiative transfer model. The CDR uses spatial variability from two algorithms. Each approach has merits and limitations. Here we evaluate the uncertainty estimates by comparing the passive microwave concentration products with fields derived from the NOAA VIIRS sensor. The results show that the relationship between the product uncertainty estimates and the concentration error (relative to VIIRS) is complex. This may be due to the sea ice conditions, the uncertainty methods, as well as the spatial and temporal variability of the passive microwave and VIIRS products.

  9. Global sensitivity and uncertainty analysis of an atmospheric chemistry transport model: the FRAME model (version 9.15.0) as a case study

    NASA Astrophysics Data System (ADS)

    Aleksankina, Ksenia; Heal, Mathew R.; Dore, Anthony J.; Van Oijen, Marcel; Reis, Stefan

    2018-04-01

    Atmospheric chemistry transport models (ACTMs) are widely used to underpin policy decisions associated with the impact of potential changes in emissions on future pollutant concentrations and deposition. It is therefore essential to have a quantitative understanding of the uncertainty in model output arising from uncertainties in the input pollutant emissions. ACTMs incorporate complex and non-linear descriptions of chemical and physical processes, which means that interactions and non-linearities in input-output relationships may not be revealed through the local one-at-a-time sensitivity analysis typically used. The aim of this work is to demonstrate a global sensitivity and uncertainty analysis approach for an ACTM, using as an example the FRAME model, which is extensively employed in the UK to generate source-receptor matrices for the UK Integrated Assessment Model and to estimate critical load exceedances. An optimised Latin hypercube sampling design was used to construct model runs within a ±40 % variation range for the UK emissions of SO2, NOx, and NH3, from which regression coefficients for each input-output combination and each model grid cell ( > 10 000 across the UK) were calculated. Surface concentrations of SO2, NOx, and NH3 (and of deposition of S and N) were found to be predominantly sensitive to the emissions of the respective pollutant, while sensitivities of secondary species such as HNO3 and particulate SO42-, NO3-, and NH4+ to pollutant emissions were more complex and geographically variable. The uncertainties in model output variables were propagated from the uncertainty ranges reported by the UK National Atmospheric Emissions Inventory for the emissions of SO2, NOx, and NH3 (±4 %, ±10 %, and ±20 %, respectively). The uncertainties in the surface concentrations of NH3 and NOx and the depositions of NHx and NOy were dominated by the uncertainties in emissions of NH3 and NOx, respectively, whilst concentrations of SO2 and deposition of SOy were affected by the uncertainties in both SO2 and NH3 emissions. Likewise, the relative uncertainties in the modelled surface concentrations of each of the secondary pollutant variables (NH4+, NO3-, SO42-, and HNO3) were due to uncertainties in at least two input variables. In all cases the spatial distribution of relative uncertainty was found to be geographically heterogeneous. The global methods used here can be applied to conduct sensitivity and uncertainty analyses of other ACTMs.
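
    A minimal sketch of the sampling-plus-regression machinery described above, with a hypothetical one-cell response standing in for FRAME: draw a Latin hypercube over ±40 % emission perturbations, run the "model", and rank inputs by standardised regression coefficients. All response coefficients below are invented for illustration.

    ```python
    import numpy as np
    from scipy.stats import qmc

    rng = np.random.default_rng(1)

    # Latin hypercube design over fractional emission perturbations of
    # SO2, NOx and NH3 in a +/-40 % range (mirroring the design described above)
    sampler = qmc.LatinHypercube(d=3, seed=1)
    X = qmc.scale(sampler.random(n=64), l_bounds=[-0.4] * 3, u_bounds=[0.4] * 3)

    # Hypothetical stand-in for one grid cell of the ACTM: a mildly
    # non-linear response with an interaction term, plus noise
    y = 2.0 * X[:, 0] - 0.3 * X[:, 1] + 0.8 * X[:, 1] * X[:, 2] \
        + rng.normal(0, 0.01, len(X))

    # Standardised regression coefficients: fit a linear surface per cell,
    # then scale by input/output standard deviations to rank sensitivities
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    src = beta[1:] * X.std(axis=0) / y.std()
    print("standardised regression coefficients:", np.round(src, 3))
    ```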

  10. Improvements in 2016 to Natural Reservoir Analysis in Low-Temperature Geothermal Play Fairway Analysis for the Appalachian Basin

    DOE Data Explorer

    Teresa E. Jordan

    2016-08-18

    *These files add to and replace same-named files found within Submission 559 (https://gdr.openei.org/submissions/559)* The files included in this submission contain all data pertinent to the methods and results of a cohesive multi-state analysis of all known potential geothermal reservoirs in sedimentary rocks in the Appalachian Basin region, ranked by their potential favorability. Favorability is quantified using three metrics: Reservoir Productivity Index for water; Reservoir Productivity Index; Reservoir Flow Capacity. The metrics are explained in the Reservoirs Methodology Memo (included in zip file). The product represents a minimum spatial extent of potential sedimentary rock geothermal reservoirs. Only natural porosity and permeability were analyzed. Shapefile and images of the spatial distributions of these reservoir quality metrics and of the uncertainty on these metrics are included as well. UPDATE: Accompanying geologic reservoirs data may be found at: https://gdr.openei.org/submissions/881 (linked below).

  11. Quantifying volcanic hazard at Campi Flegrei caldera (Italy) with uncertainty assessment: 2. Pyroclastic density current invasion maps

    NASA Astrophysics Data System (ADS)

    Neri, Augusto; Bevilacqua, Andrea; Esposti Ongaro, Tomaso; Isaia, Roberto; Aspinall, Willy P.; Bisson, Marina; Flandoli, Franco; Baxter, Peter J.; Bertagnini, Antonella; Iannuzzi, Enrico; Orsucci, Simone; Pistolesi, Marco; Rosi, Mauro; Vitale, Stefano

    2015-04-01

    Campi Flegrei (CF) is an example of an active caldera containing densely populated settlements at very high risk of pyroclastic density currents (PDCs). We present here an innovative method for assessing background spatial PDC hazard in a caldera setting with probabilistic invasion maps conditional on the occurrence of an explosive event. The method encompasses the probabilistic assessment of potential vent opening positions, derived in the companion paper, combined with inferences about the spatial density distribution of PDC invasion areas from a simplified flow model, informed by reconstruction of deposits from eruptions in the last 15 ka. The flow model describes the PDC kinematics and accounts for the main effects of topography on flow propagation. Structured expert elicitation is used to incorporate certain sources of epistemic uncertainty, and a Monte Carlo approach is adopted to produce a set of probabilistic hazard maps for the whole CF area. Our findings show that, in the case of an eruption, almost the entire caldera is exposed to invasion with a mean probability of at least 5%, with peaks greater than 50% in some central areas. Some areas outside the caldera are also exposed to this danger, with mean probabilities of invasion of the order of 5-10%. Our analysis suggests that these probability estimates have location-specific uncertainties which can be substantial. The results prove to be robust with respect to alternative elicitation models and allow the influence on hazard mapping of different sources of uncertainty, and of theoretical and numerical assumptions, to be quantified.
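
    The probabilistic chain described here, sample a vent location, run a flow footprint, accumulate cell-wise invasion counts, can be sketched compactly. The two-component Gaussian vent density and the radial lognormal runout below are assumed stand-ins for the elicited vent map and the topography-aware flow model of the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 200
    xx, yy = np.meshgrid(np.linspace(-6, 6, n), np.linspace(-6, 6, n))  # km

    def sample_vent():
        """Assumed two-component vent-opening density (a stand-in for the
        elicitation-based vent map of the companion paper)."""
        centre = np.array([0.0, 0.0]) if rng.random() < 0.7 else np.array([2.5, 1.0])
        return centre + rng.normal(0.0, 0.8, 2)

    # Monte Carlo: draw a vent and a runout per event; a radial footprint
    # stands in for the topography-aware flow model used in the study
    n_sims = 5000
    hits = np.zeros((n, n))
    for _ in range(n_sims):
        vx, vy = sample_vent()
        runout = rng.lognormal(mean=0.5, sigma=0.5)  # km, assumed distribution
        hits += np.hypot(xx - vx, yy - vy) <= runout

    invasion_prob = hits / n_sims  # PDC invasion probability, conditional on eruption
    print("peak conditional invasion probability:", invasion_prob.max().round(2))
    ```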

  12. Sensitivity and Uncertainty Analysis for Streamflow Prediction Using Different Objective Functions and Optimization Algorithms: San Joaquin California

    NASA Astrophysics Data System (ADS)

    Paul, M.; Negahban-Azar, M.

    2017-12-01

    Hydrologic models usually need to be calibrated against observed streamflow at the outlet of a particular drainage area through a careful model calibration. However, a large number of parameters must be fitted in the model because field measurements of them are unavailable. It is therefore difficult to calibrate the model over a large number of potentially uncertain parameters. This becomes even more challenging if the model covers a large watershed with multiple land uses and varied geophysical characteristics. Sensitivity analysis (SA) can be used as a tool to identify the most sensitive model parameters, i.e., those which most affect the calibrated model performance. Many different calibration and uncertainty analysis algorithms can be run with different objective functions. By incorporating sensitive parameters in streamflow simulation, the effect of a suitable algorithm on improving model performance can be demonstrated with Soil and Water Assessment Tool (SWAT) modeling. In this study, SWAT was applied to calibrate daily streamflow in the San Joaquin Watershed in California, covering 19704 km2. Recently, severe water stress has been escalating in this watershed due to intensified climate variability, prolonged drought and groundwater depletion for agricultural irrigation. It is therefore important to perform a proper uncertainty analysis, given the uncertainties inherent in hydrologic modeling, to predict the spatial and temporal variation of the hydrologic processes and to evaluate the impacts of different hydrologic variables. The purpose of this study was to evaluate the sensitivity and uncertainty of the calibrated parameters for predicting streamflow. To evaluate the sensitivity of the calibrated parameters, three different optimization algorithms (Sequential Uncertainty Fitting, SUFI-2; Generalized Likelihood Uncertainty Estimation, GLUE; and Parameter Solution, ParaSol) were used with four different objective functions (coefficient of determination, r2; Nash-Sutcliffe efficiency, NSE; percent bias, PBIAS; and Kling-Gupta efficiency, KGE). Preliminary results showed that the SUFI-2 algorithm with the NSE and KGE objective functions significantly improved the calibration (e.g., r2 and NSE of 0.52 and 0.47, respectively, for daily streamflow).
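
    Of the objective functions compared above, KGE and PBIAS vary most across tools, so a compact reference implementation may help. The PBIAS sign convention below (positive means underestimation) follows common SWAT usage but differs between packages; check the tool at hand.

    ```python
    import numpy as np

    def kge(obs, sim):
        """Kling-Gupta efficiency: combines correlation, variability and bias."""
        r = np.corrcoef(obs, sim)[0, 1]
        alpha = np.std(sim) / np.std(obs)   # variability ratio
        beta = np.mean(sim) / np.mean(obs)  # bias ratio
        return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

    def pbias(obs, sim):
        """Percent bias; positive values indicate underestimation in the
        convention commonly used with SWAT."""
        return 100.0 * np.sum(obs - sim) / np.sum(obs)

    # Toy check: a slightly biased, slightly noisy simulation
    rng = np.random.default_rng(0)
    obs = rng.gamma(2.0, 5.0, 365)
    sim = 0.9 * obs + rng.normal(0, 1.0, 365)
    print(f"KGE = {kge(obs, sim):.2f}, PBIAS = {pbias(obs, sim):.1f}%")
    ```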

  13. Uncertainty of the global oceanic CO2 exchange at the air-water interface induced by the choice of the gas exchange velocity formulation and the wind product: quantification and spatial analysis

    NASA Astrophysics Data System (ADS)

    Roobaert, Alizee; Laruelle, Goulven; Landschützer, Peter; Regnier, Pierre

    2017-04-01

    In lakes, rivers, estuaries and the ocean, the quantification of air-water CO2 exchange (FCO2) is still characterized by large uncertainties, partly due to the lack of agreement over the parameterization of the gas exchange velocity (k). Although the ocean is generally regarded as the best constrained system because k is only controlled by the wind speed, numerous formulations are still in use, leading to potentially large differences in FCO2. Here, a quantitative global spatial analysis of FCO2 is presented using several k-wind speed formulations in order to compare the effect of the choice of parameterization of k on FCO2. This analysis is performed at a 1 degree resolution using a sea surface pCO2 product generated with a two-step artificial neural network by Landschützer et al. (2015) over the 1991-2011 period. Four different global wind speed datasets (CCMP, ERA, NCEP 1 and NCEP 2) are also used to assess the effect of the choice of one wind speed product over another when calculating the global and regional oceanic FCO2. Results indicate that this choice of wind speed product only leads to small discrepancies globally (6 %), except with NCEP 2, which produces a more intense global FCO2 compared to the other wind products. Regionally, these differences are even more pronounced. For a given wind speed product, the choice of parameterization of k yields global FCO2 differences ranging from 7 % to 16 % depending on the wind product used. We also provide latitudinal profiles of FCO2 and its uncertainty, calculated from all combinations of the different k-relationships and the four wind speed products. Wind speeds >14 m s-1, which only account for 7 % of all observations, contribute disproportionately to the global oceanic FCO2 and, for this range of wind speeds, the uncertainty induced by the choice of formulation for k is greatest (~50 %).
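
    The flux calculation under comparison has the simple form F = k · K0 · ΔpCO2, with all the disagreement sitting in k(U10). The sketch below contrasts several commonly cited quadratic and hybrid parameterisations; the coefficients are quoted as commonly reported and should be verified against the original references, and the abstract does not state which formulations this study actually used.

    ```python
    import numpy as np

    # Commonly cited k(U10) parameterisations, normalised to a Schmidt
    # number of 660 (units cm/h); coefficients should be checked against
    # the original references before any quantitative use.
    K660 = {
        "Wanninkhof (1992)":  lambda u: 0.31 * u**2,
        "Wanninkhof (2014)":  lambda u: 0.251 * u**2,
        "Nightingale (2000)": lambda u: 0.333 * u + 0.222 * u**2,
        "Ho et al. (2006)":   lambda u: 0.266 * u**2,
    }

    def fco2(u10, dpco2, k0, sc=660.0, formula="Wanninkhof (2014)"):
        """Air-sea CO2 flux F = k * K0 * dpCO2.
        u10: wind speed (m/s); dpco2: pCO2,sea - pCO2,air (uatm);
        k0: solubility (mol m-3 uatm-1). Returns mol m-2 yr-1."""
        k_cm_h = K660[formula](u10) * (sc / 660.0) ** -0.5  # Schmidt scaling
        k_m_yr = k_cm_h * 0.01 * 24 * 365.25                # cm/h -> m/yr
        return k_m_yr * k0 * dpco2

    # The spread between formulations grows roughly quadratically with wind
    u = np.linspace(0.0, 20.0, 41)
    k_curves = np.array([f(u) for f in K660.values()])
    spread = np.ptp(k_curves, axis=0)
    print("relative spread at 14 m/s:",
          spread[u == 14] / k_curves.mean(axis=0)[u == 14])
    ```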

  14. The timecourse of space- and object-based attentional prioritization with varying degrees of certainty

    PubMed Central

    Drummond, Leslie; Shomstein, Sarah

    2013-01-01

    The relative contributions of objects (i.e., object-based representations) and underlying spatial locations (i.e., space-based representations) to attentional prioritization and selection remain unclear. In most experimental circumstances the two representations overlap; thus, their respective contributions cannot be evaluated. Here, a dynamic version of the two-rectangle paradigm allowed for a successful de-coupling of spatial and object representations. Space-based (cued spatial location), cued-end-of-object, and object-based (locations within the cued object) effects were sampled at several timepoints following the cue, with high or low certainty as to target location. In the high uncertainty condition spatial benefits prevailed throughout most of the timecourse, as evidenced by facilitatory and inhibitory effects. Additionally, the cued end of the object, rather than the whole object, received the attentional benefit. When target location was predictable (low uncertainty manipulation), only probabilities guided selection (i.e., evidenced by a benefit for the statistically biased location). These results suggest that with high spatial uncertainty, all available information present within the stimulus display is used for the purposes of attentional selection (e.g., spatial locations, the cued end of the object), albeit to varying degrees and at different time points. However, as certainty increases, only spatial certainty guides selection (i.e., object ends and whole objects are filtered out). Taken together, these results further elucidate the contributing roles of space- and object-based representations in attentional guidance. PMID:24367302

  15. Using meta-information of a posteriori Bayesian solutions of the hypocentre location task for improving accuracy of location error estimation

    NASA Astrophysics Data System (ADS)

    Debski, Wojciech

    2015-06-01

    The spatial location of sources of seismic waves is one of the first tasks when transient waves from natural (uncontrolled) sources are analysed in many branches of physics, including seismology and oceanology, to name a few. Source activity and its spatial variability in time, the geometry of the recording network, and the complexity and heterogeneity of the wave velocity distribution are all factors influencing the performance of location algorithms and the accuracy of the achieved results. Although estimating the location of earthquake foci is relatively simple, a quantitative estimation of the location accuracy is a genuinely challenging task, even when the probabilistic inverse method is used, because it requires knowledge of the statistics of observational, modelling and a priori uncertainties. In this paper, we address this task for the common situation in which the statistics of observational and/or modelling errors are unknown. This situation requires the introduction of a priori constraints on the likelihood (misfit) function, which significantly influence the estimated errors. Based on the results of an analysis of 120 seismic events from the Rudna copper mine operating in southwestern Poland, we propose an approach based on an analysis of the Shannon entropy calculated for the a posteriori distribution. We show that this meta-characteristic of the a posteriori distribution carries some information on the uncertainties of the solution found.
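
    The entropy meta-characteristic is straightforward to compute from a gridded a posteriori PDF: a sharp, well-constrained location solution yields low entropy and a diffuse one high entropy. A one-dimensional sketch with synthetic posteriors:

    ```python
    import numpy as np

    def shannon_entropy(posterior, cell):
        """Differential Shannon entropy of a gridded a posteriori PDF.
        `cell` is the grid cell size (or volume in higher dimensions)."""
        p = np.asarray(posterior, float)
        p = p / (p.sum() * cell)   # normalise to a proper density
        p = p[p > 0]               # 0*log(0) treated as 0
        return -np.sum(p * np.log(p)) * cell

    # A sharp posterior (well-located event) carries less entropy than a broad one
    x = np.linspace(-5.0, 5.0, 2001)
    dx = x[1] - x[0]
    sharp = np.exp(-x**2 / (2 * 0.1**2))
    broad = np.exp(-x**2 / (2 * 1.0**2))
    print(shannon_entropy(sharp, dx))  # ~ 0.5*ln(2*pi*e*0.1**2) = -0.88
    print(shannon_entropy(broad, dx))  # ~ 0.5*ln(2*pi*e*1.0**2) =  1.42
    ```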

  16. Sensitivity analysis of ozone formation and transport for a central California air pollution episode.

    PubMed

    Jin, Ling; Tonse, Shaheen; Cohan, Daniel S; Mao, Xiaoling; Harley, Robert A; Brown, Nancy J

    2008-05-15

    We developed a first- and second-order sensitivity analysis approach with the decoupled direct method to examine spatial and temporal variations of ozone-limiting reagents and the importance of local vs. upwind emission sources in the San Joaquin Valley of central California for a 5 day ozone episode (Jul 29th to Aug 3rd, 2000). Despite considerable spatial variations, nitrogen oxide (NOx) emission reductions are overall more effective than volatile organic compound (VOC) control for attaining the 8 h ozone standard in this region for this episode, in contrast to VOC control, which works better for attaining the prior 1 h ozone standard. Interbasin source contributions of NOx emissions are limited to the northern part of the SJV, while anthropogenic VOC (AVOC) emissions, especially those emitted at night, influence ozone formation in the SJV further downwind. Among the model input parameters studied here, uncertainties in the emissions of NOx and AVOC, and in the rate coefficient of the OH + NO2 termination reaction, have the greatest effect on first-order ozone responses to changes in NOx emissions. Uncertainties in biogenic VOC emissions have only a modest effect because they are generally not collocated with anthropogenic sources in this region.

  17. A first hazard analysis of the Harrat Ash Shamah volcanic field, Syria-Jordan Borderline

    NASA Astrophysics Data System (ADS)

    Cagnan, Zehra; Akkar, Sinan; Moghimi, Saed

    2017-04-01

    The northernmost part of the Saudi Cenozoic Volcanic Fields, the 100,000 km2 Harrat Ash Shamah, has hosted some of the most recent volcanic eruptions along the Syria-Jordan borderline. With the rapid growth of the cities in this region, exposure to any potential renewed volcanism has increased considerably. We present here a first-order probabilistic hazard analysis related to new vent formation and subsequent lava flow from Harrat Ash Shamah. The 733 visible eruption vent sites were utilized to develop a probability density function for new eruption sites using Gaussian kernel smoothing. This revealed a NNW-striking zone of high spatial hazard surrounding the cities of Amman and Irbid in Jordan. The temporal eruption recurrence rate is estimated to be approximately one vent per 3500 years, but the temporal record of the field is so poorly constrained that the bounds on the recurrence interval range from 70 yrs to 17,700 yrs. A Poisson temporal model is employed within the scope of this study. In order to treat the uncertainties associated with the spatio-temporal models as well as the size of the area affected by the lava flow, a logic tree approach is adopted. For the Syria-Jordan borderline, the spatial variation of volcanic hazard is computed, as well as the uncertainty associated with these estimates.
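
    Gaussian kernel smoothing of vent locations into a spatial density, as used here, is a one-liner with standard tools. The synthetic, NNW-elongated vent cloud below is an assumed stand-in for the 733 mapped vents:

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(7)

    # Stand-in for the 733 mapped vent coordinates (km, arbitrary local datum);
    # the elongation mimics a roughly NNW-trending alignment of vents.
    vents = rng.normal(loc=0.0, scale=(5.0, 15.0), size=(733, 2)).T  # shape (2, n)

    kde = gaussian_kde(vents)  # Gaussian kernel smoothing
    xx, yy = np.mgrid[-30:30:200j, -60:60:200j]
    density = kde(np.vstack([xx.ravel(), yy.ravel()])).reshape(xx.shape)

    # Convert the density into a per-cell probability that the next vent
    # opens there (conditional on an eruption occurring at all)
    cell_area = (xx[1, 0] - xx[0, 0]) * (yy[0, 1] - yy[0, 0])
    p_vent = density * cell_area
    p_vent /= p_vent.sum()
    print("highest single-cell vent-opening probability:", p_vent.max())
    ```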

  18. Multiple Velocity Profile Measurements in Hypersonic Flows using Sequentially-Imaged Fluorescence Tagging

    NASA Technical Reports Server (NTRS)

    Bathel, Brett F.; Danehy, Paul M.; Inman, Jennifer A.; Jones, Stephen B.; Ivey, Christopher B.; Goyne, Christopher P.

    2010-01-01

    Nitric-oxide planar laser-induced fluorescence (NO PLIF) was used to perform velocity measurements in hypersonic flows by generating multiple tagged lines which fluoresce as they convect downstream. For each laser pulse, a single interline, progressive-scan intensified CCD camera was used to obtain separate images of the initial undelayed and delayed NO molecules that had been tagged by the laser. The CCD configuration allowed for sub-microsecond acquisition of both images, resulting in sub-microsecond temporal resolution as well as sub-mm spatial resolution (0.5-mm x 0.7-mm). Axial velocity was determined by applying a cross-correlation analysis to the horizontal shift of individual tagged lines. Systematic errors, the contribution of gating/exposure-duration errors, and the influence of the collision rate on fluorescence were quantified in terms of their contribution to the temporal uncertainty. The spatial uncertainty depended upon the analysis technique and the signal-to-noise ratio of the acquired profiles. This investigation focused on two hypersonic flow experiments: (1) a reaction control system (RCS) jet on an Orion Crew Exploration Vehicle (CEV) wind tunnel model and (2) a 10-degree half-angle wedge containing a 2-mm tall, 4-mm wide cylindrical boundary layer trip. The experiments were performed at the NASA Langley Research Center's 31-inch Mach 10 wind tunnel.
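
    The cross-correlation step is the heart of the velocity measurement: find the shift that best aligns the delayed tagged-line profile with the undelayed one, refine it to sub-pixel precision, and divide by the known time delay. The profiles, pixel pitch and delay below are illustrative, not the instrument's values.

    ```python
    import numpy as np

    def line_shift(profile0, profile1, dx_mm):
        """Shift between the undelayed and delayed tagged-line profiles from
        the cross-correlation peak, refined by a three-point parabolic fit."""
        a = profile0 - profile0.mean()
        b = profile1 - profile1.mean()
        xc = np.correlate(b, a, mode="full")
        k = int(np.argmax(xc))
        if 0 < k < xc.size - 1:  # sub-pixel refinement around the integer peak
            denom = xc[k - 1] - 2.0 * xc[k] + xc[k + 1]
            k = k + 0.5 * (xc[k - 1] - xc[k + 1]) / denom
        return (k - (profile0.size - 1)) * dx_mm

    # Synthetic tagged line convecting 12 pixels between the two exposures
    x = np.arange(256)
    i0 = np.exp(-(x - 100.0) ** 2 / 18.0)
    i1 = np.exp(-(x - 112.0) ** 2 / 18.0)

    shift_mm = line_shift(i0, i1, dx_mm=0.1)  # 0.1 mm/pixel, assumed
    velocity = shift_mm * 1e-3 / 1e-6         # assumed 1 microsecond delay -> m/s
    print(f"shift = {shift_mm:.2f} mm, velocity = {velocity:.0f} m/s")
    ```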

  19. Evaluating Precipitation from Orbital Data Products of TRMM and GPM over the Indian Subcontinent

    NASA Astrophysics Data System (ADS)

    Jayaluxmi, I.; Kumar, D. N.

    2015-12-01

    The rapidly growing records of microwave-based precipitation data made available from various earth observation satellites have instigated a pressing need to evaluate the associated uncertainty, which arises from different sources such as retrieval error, spatial/temporal sampling error and sensor-dependent error. Pertaining to microwave remote sensing, most studies in the literature focus on gridded data products; fewer studies exist on evaluating the uncertainty inherent in orbital data products. Evaluation of the latter is essential, as these uncertainties can become large during real-time flood forecasting studies, especially at the watershed scale. The present study evaluates the uncertainty of precipitation data derived from the orbital data products of the Tropical Rainfall Measuring Mission (TRMM) satellite, namely the 2A12, 2A25 and 2B31 products. Case study results over the flood-prone basin of the Mahanadi, India, are analyzed for precipitation uncertainty through four facets: a) uncertainty quantification using the volumetric metrics from the contingency table [AghaKouchak and Mehran 2014]; b) error characterization using additive and multiplicative error models; c) error decomposition to identify systematic and random errors; d) comparative assessment with the orbital data from the GPM mission. The homoscedastic random errors from the multiplicative error models justify a better representation of precipitation estimates by the 2A12 algorithm. It can be concluded that although the radiometer-derived 2A12 precipitation data are known to suffer from many sources of uncertainty, spatial analysis over the case study region of India testifies that they are in excellent agreement with the reference estimates for the data period considered [Indu and Kumar 2015]. References: A. AghaKouchak and A. Mehran (2014), Extended contingency table: Performance metrics for satellite observations and climate model simulations, Water Resources Research, vol. 49, 7144-7149; J. Indu and D. Nagesh Kumar (2015), Evaluation of Precipitation Retrievals from Orbital Data Products of TRMM over a Subtropical Basin in India, IEEE Transactions on Geoscience and Remote Sensing, in press, doi:10.1109/TGRS.2015.2440338.

  20. Rainfall: State of the Science

    NASA Astrophysics Data System (ADS)

    Testik, Firat Y.; Gebremichael, Mekonnen

    Rainfall: State of the Science offers the most up-to-date knowledge on the fundamental and practical aspects of rainfall. Each chapter, self-contained and written by prominent scientists in their respective fields, provides three forms of information: fundamental principles, a detailed overview of current knowledge and description of existing methods, and emerging techniques and future research directions. The book discusses:

    • Rainfall microphysics: raindrop morphodynamics, interactions, size distribution, and evolution

    • Rainfall measurement and estimation: ground-based direct measurement (disdrometer and rain gauge), weather radar rainfall estimation, polarimetric radar rainfall estimation, and satellite rainfall estimation

    • Statistical analyses: intensity-duration-frequency curves, frequency analysis of extreme events, spatial analyses, simulation and disaggregation, ensemble approach for radar rainfall uncertainty, and uncertainty analysis of satellite rainfall products

    The book is tailored to be an indispensable reference for researchers, practitioners, and graduate students who study any aspect of rainfall or utilize rainfall information in various science and engineering disciplines.

  1. A statistical method for lung tumor segmentation uncertainty in PET images based on user inference.

    PubMed

    Zheng, Chaojie; Wang, Xiuying; Feng, Dagan

    2015-01-01

    PET has been widely accepted as an effective imaging modality for lung tumor diagnosis and treatment. However, standard criteria for delineating tumor boundaries from PET have yet to be developed, largely due to the relatively low quality of PET images, uncertain tumor boundary definition, and the variety of tumor characteristics. In this paper, we propose a statistical solution to segmentation uncertainty on the basis of user inference. We first define the uncertainty segmentation band on the basis of a segmentation probability map constructed with the Random Walks (RW) algorithm; then, based on the extracted features of the user inference, we use Principal Component Analysis (PCA) to formulate the statistical model for labeling the uncertainty band. We validated our method on 10 lung PET-CT phantom studies from the public RIDER collection [1] and 16 clinical PET studies in which tumors were manually delineated by two experienced radiologists. The methods were validated using the Dice similarity coefficient (DSC) to measure spatial volume overlap. Our method achieved an average DSC of 0.878 ± 0.078 on the phantom studies and 0.835 ± 0.039 on the clinical studies.
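
    The DSC used for validation is worth writing out, since it is the standard figure of merit for segmentation overlap. A minimal sketch on two toy masks:

    ```python
    import numpy as np

    def dice(mask_a, mask_b):
        """Dice similarity coefficient between two binary segmentation masks."""
        a = np.asarray(mask_a, bool)
        b = np.asarray(mask_b, bool)
        denom = a.sum() + b.sum()
        return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

    # Two overlapping toy "tumor" masks on a small grid
    yy, xx = np.mgrid[0:64, 0:64]
    seg_auto = (xx - 30) ** 2 + (yy - 30) ** 2 < 12 ** 2
    seg_manual = (xx - 33) ** 2 + (yy - 31) ** 2 < 12 ** 2
    print(f"DSC = {dice(seg_auto, seg_manual):.3f}")
    ```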

  2. Examination of elevation dependency in observed and projected temperature change in the Upper Indus Basin and Western Himalaya

    NASA Astrophysics Data System (ADS)

    Fowler, H. J.; Forsythe, N. D.; Blenkinsop, S.; Archer, D.; Hardy, A.; Janes, T.; Jones, R. G.; Holderness, T.

    2013-12-01

    We present results of two distinct, complementary analyses to assess evidence of elevation dependency in temperature change in the UIB (Karakoram, Eastern Hindu Kush) and the wider WH. The first analysis component examines historical remotely sensed land surface temperature (LST) from the second and third generations of the Advanced Very High Resolution Radiometer (AVHRR/2, AVHRR/3) instrument flown on NOAA satellite platforms from the mid-1980s through the present day. The high spatial resolution (<4 km) of the AVHRR instrument enables precise consideration of the relationship between estimated LST and surface topography. The LST data product was developed as part of an initiative to produce continuous time-series for key remotely sensed spatial products (LST, snow covered area, cloud cover, NDVI) extending as far back into the historical record as feasible. Context for the AVHRR LST data product is provided by the results of bias assessment and validation procedures against available local observations from both manned and automatic weather stations. Local observations provide meaningful validation and bias assessment of the vertical gradients found in the AVHRR LST, as the elevation range from the lowest manned meteorological station (at 1460 m asl) to the highest automatic weather station (4733 m asl) covers much of the key range yielding runoff from seasonal snowmelt. Furthermore, the common available record period of these stations (1995 to 2007) enables assessment not only of the AVHRR LST but also performance comparisons with the more recent MODIS LST data product. A range of spatial aggregations (from minor tributary catchments to primary basin headwaters) is performed to assess regional homogeneity and to identify potential latitudinal or longitudinal gradients in elevation dependency. The second analysis component investigates elevation dependency, including its uncertainty, in projected temperature change trajectories in the downscaling of a seventeen-member Global Climate Model (GCM) perturbed physics ensemble (PPE) of transient (130-year) simulations using a moderate resolution (25 km) regional climate model (RCM). The GCM ensemble is the 17-member QUMP (Quantifying Uncertainty in Model Projections) ensemble, and the downscaling is done using HadRM3P, part of the PRECIS regional climate modelling system. Both the RCM and the GCMs were developed by the UK Met Office Hadley Centre and are based on the HadCM3 GCM. Use of the multi-member PPE enables quantification of uncertainty in projected temperature change, while the spatial resolution of the RCM improves insight into the role of elevation in projected rates of change. Furthermore, comparison with the results of the remote sensing analysis component - considered to provide an 'observed climatology' - permits evaluation of individual ensemble members with regard to biases in spatial gradients of temperature as well as the timing and magnitude of annual cycles.

  3. Climate model uncertainty in impact assessments for agriculture: A multi-ensemble case study on maize in sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Dale, Amy; Fant, Charles; Strzepek, Kenneth; Lickley, Megan; Solomon, Susan

    2017-03-01

    We present maize production in sub-Saharan Africa as a case study in the exploration of how uncertainties in global climate change, as reflected in projections from a range of climate model ensembles, influence climate impact assessments for agriculture. The crop model AquaCrop-OS (Food and Agriculture Organization of the United Nations) was modified to run on a 2° × 2° grid and coupled to 122 climate model projections from multi-model ensembles for three emission scenarios (Coupled Model Intercomparison Project Phase 3 [CMIP3] SRES A1B and CMIP5 Representative Concentration Pathway [RCP] scenarios 4.5 and 8.5) as well as two "within-model" ensembles (NCAR CCSM3 and ECHAM5/MPI-OM) designed to capture internal variability (i.e., uncertainty due to chaos in the climate system). In spite of high uncertainty, most notably in the high-producing semi-arid zones, we observed robust regional and sub-regional trends across all ensembles. In agreement with previous work, we project widespread yield losses in the Sahel region and Southern Africa, resilience in Central Africa, and sub-regional increases in East Africa and at the southern tip of the continent. Spatial patterns of yield losses corresponded with spatial patterns of aridity increases, which were explicitly evaluated. Internal variability was a major source of uncertainty in both within-model and between-model ensembles and explained the majority of the spatial distribution of uncertainty in yield projections. Projected climate change impacts on maize production in different regions and nations ranged from near-zero or positive (upper quartile estimates) to substantially negative (lower quartile estimates), highlighting a need for risk management strategies that are adaptive and robust to uncertainty.

  4. Object-based vegetation classification with high resolution remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Yu, Qian

    Vegetation species are valuable indicators for understanding the earth system. Information from mapping of vegetation species and community distribution at large scales provides important insight for studying the phenological (growth) cycles of vegetation and plant physiology. Such information plays an important role in land process modeling, including climate, ecosystem and hydrological models. The rapidly growing remote sensing technology has increased its potential in vegetation species mapping. However, extracting information at a species level is still a challenging research topic. I proposed an effective method for extracting vegetation species distribution from remotely sensed data and investigated some ways to improve accuracy. The study consists of three phases. Firstly, a statistical analysis was conducted to explore the spatial variation and class separability of vegetation as a function of image scale. This analysis aimed to confirm that high resolution imagery contains information on spatial vegetation variation and that these species classes are potentially separable. The second phase was a major effort in advancing classification by proposing a method for extracting vegetation species from high spatial resolution remote sensing data. The proposed classification employs an object-based approach that integrates GIS and remote sensing data and explores the usefulness of ancillary information. The whole process includes image segmentation, feature generation and selection, and nearest neighbor classification. The third phase introduces a spatial regression model for evaluating the mapping quality of the above vegetation classification results. The effects of six categories of sample characteristics on the classification uncertainty are examined: topography, sample membership, sample density, spatial composition characteristics, training reliability and sample object features. This evaluation analysis answered several interesting scientific questions, such as (1) whether the sample characteristics affect the classification accuracy and, if so, how significantly; and (2) how much of the variance in classification uncertainty can be explained by the above factors. This research is carried out on a hilly peninsular area in a Mediterranean climate, Point Reyes National Seashore (PRNS) in Northern California. The area mainly consists of a heterogeneous, semi-natural broadleaf and conifer woodland, shrub land, and annual grassland. A detailed list of vegetation alliances is used in this study. Research results from the first phase indicate that vegetation spatial variation, as reflected by the average local variance (ALV), remains high between 1 m and 4 m resolution. (Abstract shortened by UMI.)

  5. Remotely Sensed Data for High Resolution Agro-Environmental Policy Analysis

    NASA Astrophysics Data System (ADS)

    Welle, Paul

    Policy analyses of agricultural and environmental systems are often limited due to data constraints. Measurement campaigns can be costly, especially when the area of interest includes oceans, forests, agricultural regions or other dispersed spatial domains. Satellite-based remote sensing offers a way to increase the spatial and temporal resolution of policy analysis concerning these systems. However, there are key limitations to the implementation of satellite data. Uncertainty in data derived from remote sensing can be significant, and traditional methods of policy analysis for managing uncertainty on large datasets can be computationally expensive. Moreover, while satellite data can increasingly offer estimates of some parameters such as weather or crop use, other information, such as demographic or economic data, is unlikely to be estimated using these techniques. Managing these limitations in practical policy analysis remains a challenge. In this dissertation, I conduct five case studies which rely heavily on data sourced from orbital sensors. First, I assess the magnitude of climate and anthropogenic stress on coral reef ecosystems. Second, I conduct an impact assessment of soil salinity on California agriculture. Third, I measure the propensity of growers to adapt their cropping practices to soil salinization in agriculture. Fourth, I analyze whether small-scale desalination units could be applied on farms in California in order to mitigate the effects of drought and salinization as well as prevent agricultural drainage from entering vulnerable ecosystems. And fifth, I assess the feasibility of satellite-based remote sensing for salinity measurement at global scale. Through these case studies, I confront both the challenges and benefits associated with implementing satellite-based remote sensing for improved policy analysis.

  6. A High Performance Bayesian Computing Framework for Spatiotemporal Uncertainty Modeling

    NASA Astrophysics Data System (ADS)

    Cao, G.

    2015-12-01

    All types of spatiotemporal measurements are subject to uncertainty. As spatiotemporal data become increasingly involved in scientific research and decision making, it is important to appropriately model the impact of uncertainty. Quantitatively modeling spatiotemporal uncertainty, however, is a challenging problem given the complex dependence structures and data heterogeneities. State-space models provide a unifying and intuitive framework for dynamic systems modeling. In this paper, we aim to extend conventional state-space models for uncertainty modeling in space-time contexts while accounting for spatiotemporal effects and data heterogeneities. Gaussian Markov Random Field (GMRF) models, also known as conditional autoregressive models, are arguably the most commonly used methods for modeling spatially dependent data. GMRF models basically assume that a geo-referenced variable primarily depends on its neighborhood (the Markov property), and the spatial dependence structure is described via a precision matrix. Recent studies have shown that GMRFs are efficient approximations to the commonly used Gaussian fields (e.g., Kriging), and compared with Gaussian fields, GMRFs enjoy a series of appealing features, such as fast computation and easy accounting for heterogeneities in spatial data (e.g., point and areal). This paper represents each spatial dataset as a GMRF and integrates them into a state-space form to statistically model the temporal dynamics. Different types of spatial measurements (e.g., categorical, count or continuous) can be accounted for by corresponding link functions. A fast alternative to the MCMC framework, the so-called Integrated Nested Laplace Approximation (INLA), was adopted for model inference. Preliminary case studies will be conducted to showcase the advantages of the described framework. In the first case, we apply the proposed method to modeling the water table elevation of the Ogallala aquifer over the past decades. In the second case, we analyze drought impacts in Texas counties in past years, where the spatiotemporal dynamics are represented in areal data.
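
    The precision-matrix construction at the core of a GMRF is compact: each cell is conditioned on its four grid neighbours, giving a very sparse matrix, which is what makes GMRFs fast relative to dense-covariance Gaussian fields. A minimal sketch (the kappa nugget term is an assumed regularisation, not from the paper):

    ```python
    import numpy as np
    import scipy.sparse as sp

    def gmrf_precision(nx, ny, kappa=0.1):
        """Sparse precision matrix Q = kappa*I + (D - W) for a first-order
        GMRF on an nx-by-ny lattice: W is the 4-neighbour adjacency, D its
        degree matrix, so each cell is conditioned on its grid neighbours."""
        idx = np.arange(nx * ny).reshape(ny, nx)
        pairs = []
        for a, b in [(idx[:, :-1], idx[:, 1:]),   # horizontal neighbours
                     (idx[:-1, :], idx[1:, :])]:  # vertical neighbours
            pairs.append((a.ravel(), b.ravel()))
            pairs.append((b.ravel(), a.ravel()))
        rows = np.concatenate([p[0] for p in pairs])
        cols = np.concatenate([p[1] for p in pairs])
        W = sp.coo_matrix((np.ones(rows.size), (rows, cols)),
                          shape=(nx * ny, nx * ny))
        D = sp.diags(np.asarray(W.sum(axis=1)).ravel())
        return (D - W) + kappa * sp.identity(nx * ny)

    Q = gmrf_precision(50, 40)
    print(Q.shape, f"{Q.nnz / Q.shape[0]:.1f} nonzeros per row")  # ~5 per row
    ```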

  7. The influence of uncertainty and location-specific conditions on the environmental prioritisation of human pharmaceuticals in Europe.

    PubMed

    Oldenkamp, Rik; Huijbregts, Mark A J; Ragas, Ad M J

    2016-05-01

    The selection of priority APIs (Active Pharmaceutical Ingredients) can benefit from a spatially explicit approach, since an API might exceed the threshold of environmental concern in one location while staying below that same threshold in another. However, such a spatially explicit approach is relatively data intensive and subject to parameter uncertainty due to limited data. This raises the question to what extent a spatially explicit approach for the environmental prioritisation of APIs remains worthwhile when accounting for uncertainty in parameter settings. We show here that the inclusion of spatially explicit information enables a more efficient environmental prioritisation of APIs in Europe, compared with a non-spatial EU-wide approach, also under uncertain conditions. In a case study with nine antibiotics, uncertainty distributions of the PAF (Potentially Affected Fraction) of aquatic species were calculated in 100 × 100 km2 environmental grid cells throughout Europe and used for the selection of priority APIs. Two APIs have median PAF values that exceed a threshold PAF of 1% in at least one environmental grid cell in Europe, i.e., oxytetracycline and erythromycin. At a tenfold lower threshold PAF (i.e., 0.1%), two additional APIs would be selected, i.e., cefuroxime and ciprofloxacin. However, in 94% of the environmental grid cells in Europe, no APIs exceed either of the thresholds. This illustrates the advantage of following a location-specific approach in the prioritisation of APIs. This added value remains when accounting for uncertainty in parameter settings, i.e., if the 95th percentile of the PAF instead of its median value is compared with the threshold. In 96% of the environmental grid cells, the location-specific approach still enables a reduction of the selection of priority APIs of at least 50%, compared with an EU-wide prioritisation.

  8. Rain radar measurement error estimation using data assimilation in an advection-based nowcasting system

    NASA Astrophysics Data System (ADS)

    Merker, Claire; Ament, Felix; Clemens, Marco

    2017-04-01

    The quantification of measurement uncertainty for rain radar data remains challenging. Radar reflectivity measurements are affected, amongst other things, by calibration errors, noise, blocking and clutter, and attenuation. Their combined impact on measurement accuracy is difficult to quantify due to incomplete process understanding and complex interdependencies. An improved quality assessment of rain radar measurements is of interest for applications both in meteorology and hydrology, for example for precipitation ensemble generation, rainfall runoff simulations, or in data assimilation for numerical weather prediction. Especially a detailed description of the spatial and temporal structure of errors is beneficial in order to make best use of the areal precipitation information provided by radars. Radar precipitation ensembles are one promising approach to represent spatially variable radar measurement errors. We present a method combining ensemble radar precipitation nowcasting with data assimilation to estimate radar measurement uncertainty at each pixel. This combination of ensemble forecast and observation yields a consistent spatial and temporal evolution of the radar error field. We use an advection-based nowcasting method to generate an ensemble reflectivity forecast from initial data of a rain radar network. Subsequently, reflectivity data from single radars is assimilated into the forecast using the Local Ensemble Transform Kalman Filter. The spread of the resulting analysis ensemble provides a flow-dependent, spatially and temporally correlated reflectivity error estimate at each pixel. We will present first case studies that illustrate the method using data from a high-resolution X-band radar network.

  9. Aquifer Hydrogeologic Layer Zonation at the Hanford Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savelieva-Trofimova, Elena A.; Kanevski, Mikhail; Timonin, V.

    2003-09-10

    Sedimentary aquifer layers are characterized by spatial variability of hydraulic properties. Nevertheless, zones with similar values of hydraulic parameters (parameter zones) can be distinguished. This parameter zonation approach is an alternative to the analysis of spatial variation of the continuous hydraulic parameters. The parameter zonation approach is primarily motivated by the lack of measurements that would be needed for direct spatial modeling of the hydraulic properties. The current work is devoted to the problem of zonation of the Hanford formation, the uppermost sedimentary aquifer unit (U1) included in hydrogeologic models at the Hanford site. U1 is characterized by 5 zones with different hydraulic properties. Each sampled location is ascribed to a parameter zone by an expert. This initial classification is accompanied by a measure of quality (also indicated by an expert) that addresses the level of classification confidence. In the current study, the conceptual zonation map developed by an expert geologist was used as an a priori model. The parameter zonation problem was formulated as a multiclass classification task. Different geostatistical and machine learning algorithms were adapted and applied to solve this problem, including: indicator kriging, conditional simulations, neural networks of different architectures, and support vector machines. All methods were trained using additional soft information based on expert estimates. Regularization methods were used to overcome possible overfitting. The zonation problem was complicated by the scarcity of samples for some zones (classes) and by the spatial non-stationarity of the data. Special approaches were developed to overcome these complications. The comparison of different methods was performed using qualitative and quantitative statistical methods and image analysis. We examined the correspondence of the results with the geologically based interpretation, including the reproduction of the spatial orientation of the different classes and the spatial correlation structure of the classes. The uncertainty of the classification task was examined using both probabilistic interpretation of the estimators and the results of a set of stochastic realizations. Characterization of the classification uncertainty is the main advantage of the proposed methods.

  10. Uncertainty in gridded CO2 emissions estimates

    DOE PAGES

    Hogue, Susannah; Marland, Eric; Andres, Robert J.; ...

    2016-05-19

    We are interested in the spatial distribution of fossil-fuel-related emissions of CO2 for both geochemical and geopolitical reasons, but it is important to understand the uncertainty that exists in spatially explicit emissions estimates. Working from one of the widely used gridded data sets of CO2 emissions, we examine the elements of uncertainty, focusing on gridded data for the United States at the scale of 1° latitude by 1° longitude. Uncertainty is introduced in the magnitude of total United States emissions, the magnitude and location of large point sources, the magnitude and distribution of non-point sources, and from the use of proxy data to characterize emissions. For the United States, we develop estimates of the contribution of each component of uncertainty. At 1° resolution, in most grid cells, the largest contribution to uncertainty comes from how well the distribution of the proxy (in this case population density) represents the distribution of emissions. In other grid cells, the magnitude and location of large point sources make the major contribution to uncertainty. Uncertainty in population density can be important where a large gradient in population density occurs near a grid cell boundary. Uncertainty is strongly scale-dependent, with uncertainty increasing as grid size decreases. In conclusion, uncertainty for our data set with 1° grid cells for the United States is typically on the order of ±150%, but this is perhaps not excessive in a data set where emissions per grid cell vary over 8 orders of magnitude.
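
    The proxy-downscaling idea and its dominant error term can be sketched directly: allocate a national total by population share, then run a Monte Carlo over the assumed error sources and look at the per-cell spread. The ±5 % total uncertainty and the lognormal proxy-quality factor below are illustrative assumptions, not the paper's calibrated values.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    national_total = 5000.0                          # national emissions, arbitrary units
    population = rng.lognormal(3.0, 1.0, (10, 10))   # proxy on a coarse grid

    # Proxy downscaling: each cell gets the national total times its
    # population share (the core assumption whose quality dominates at 1 degree)
    emissions = national_total * population / population.sum()

    # Monte Carlo over the two assumed error sources: uncertainty in the
    # national total (+/-5 %) and a lognormal "proxy quality" factor standing
    # in for how imperfectly population represents emissions per cell
    n_mc = 2000
    draws = np.empty((n_mc, *population.shape))
    for i in range(n_mc):
        total_i = national_total * rng.normal(1.0, 0.05)
        proxy_i = population * rng.lognormal(0.0, 0.5, population.shape)
        draws[i] = total_i * proxy_i / proxy_i.sum()

    rel_unc = draws.std(axis=0) / draws.mean(axis=0)  # per-cell relative uncertainty
    print("median per-cell relative uncertainty:", f"{np.median(rel_unc):.0%}")
    ```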

  11. Inter-comparison of interpolated background nitrogen dioxide concentrations across Greater Manchester, UK

    NASA Astrophysics Data System (ADS)

    Lindley, S. J.; Walsh, T.

    There are many modelling methods dedicated to the estimation of spatial patterns in pollutant concentrations, each with their distinctive advantages and disadvantages. The derivation of a surface of air quality values from monitoring data alone requires the conversion of point-based data from a limited number of monitoring stations to a continuous surface using interpolation. Since interpolation techniques involve the estimation of data at unsampled points based on calculated relationships between data measured at a number of known sample points, they are subject to some uncertainty, both in terms of the values estimated and their spatial distribution. These uncertainties, which are incorporated into many empirical and semi-empirical mapping methodologies, could be recognised in any further usage of the data and also in the assessment of the extent of an exceedence of an air quality standard and the degree of exposure this may represent. There is a wide range of available interpolation techniques, and the differences in their characteristics result in variations in the output surfaces estimated from the same set of input points. The work presented in this paper provides an examination of these uncertainties through the application of a number of interpolation techniques, available in standard GIS packages, to a case study nitrogen dioxide data set for the Greater Manchester conurbation in northern England. The implications of the use of different techniques are discussed through application to hourly concentrations during an air quality episode and to annual average concentrations in 2001. Patterns of concentrations demonstrate considerable differences in the estimated spatial pattern of maxima, reflecting the combined effects of chemical processes, topography and meteorology. In the case of air quality episodes, the considerable spatial variability of concentrations results in large uncertainties in the surfaces produced, but these uncertainties vary widely from area to area. In view of the uncertainties associated with classical techniques, research is ongoing to develop alternative methods which should in time help improve the suite of tools available to air quality managers.
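
    The experiment described, interpolating the same station data with different techniques and comparing the surfaces, is easy to reproduce in miniature with leave-one-out validation, which also puts a number on the uncertainty at unsampled points. Synthetic stations and two generic interpolators stand in for the GIS methods compared in the paper:

    ```python
    import numpy as np
    from scipy.interpolate import RBFInterpolator, griddata

    rng = np.random.default_rng(5)
    pts = rng.uniform(0.0, 10.0, (40, 2))                    # monitoring sites, km
    no2 = 30 + 5 * np.sin(pts[:, 0]) + rng.normal(0, 2, 40)  # synthetic NO2, ug/m3

    def loo_rmse(interp):
        """Leave-one-out RMSE: hide one station, predict it from the rest."""
        errs = []
        for i in range(len(pts)):
            keep = np.arange(len(pts)) != i
            pred = interp(pts[keep], no2[keep], pts[i:i + 1])
            errs.append(float(pred[0]) - no2[i])
        return float(np.sqrt(np.mean(np.square(errs))))

    linear = lambda p, v, q: griddata(p, v, q, method="linear",
                                      fill_value=float(v.mean()))
    rbf = lambda p, v, q: RBFInterpolator(p, v, smoothing=1.0)(q)

    print("LOO RMSE, linear triangulation:", round(loo_rmse(linear), 2))
    print("LOO RMSE, smoothed RBF:       ", round(loo_rmse(rbf), 2))
    ```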

  12. Study the effect of reservoir spatial heterogeneity on CO2 sequestration under an uncertainty quantification (UQ) software framework

    NASA Astrophysics Data System (ADS)

    Fang, Y.; Hou, J.; Engel, D.; Lin, G.; Yin, J.; Han, B.; Fang, Z.; Fountoulakis, V.

    2011-12-01

    In this study, we introduce an uncertainty quantification (UQ) software framework for carbon sequestration, with a focus on the effect of spatial heterogeneity of reservoir properties on CO2 migration. We use a sequential Gaussian simulation method (SGSIM) to generate realizations of permeability fields with various spatial statistical attributes. To deal with the computational difficulties, we integrate the following ideas and approaches: 1) we use three different sampling approaches (probabilistic collocation, quasi-Monte Carlo, and adaptive sampling) to reduce the required forward calculations while exploring the parameter space and quantifying the input uncertainty; 2) we use eSTOMP as the forward modeling simulator. eSTOMP is implemented using the Global Arrays toolkit (GA), which is based on one-sided inter-processor communication and supports a shared-memory programming style on distributed-memory platforms, providing highly scalable performance. It uses a data model to partition most of the large-scale data structures into a relatively small number of distinct classes, and the lower-level simulator infrastructure (e.g., meshing support, associated data structures, and data mapping to processors) is separated from the higher-level physics and chemistry algorithmic routines using a grid component interface; and 3) besides the faster model and more efficient algorithms to speed up the forward calculation, we built an adaptive system infrastructure to select the best possible data transfer mechanisms, to optimally allocate system resources to improve performance, and to integrate software packages and data for composing carbon sequestration simulation, computation, analysis, estimation and visualization. We will demonstrate the framework with a given CO2 injection scenario in a heterogeneous sandstone reservoir.

  13. Control of experimental uncertainties in filtered Rayleigh scattering measurements

    NASA Technical Reports Server (NTRS)

    Forkey, Joseph N.; Finkelstein, N. D.; Lempert, Walter R.; Miles, Richard B.

    1995-01-01

    Filtered Rayleigh Scattering is a technique which allows for the measurement of velocity, temperature, and pressure in unseeded flows, spatially resolved in two dimensions. We present an overview of the major components of a Filtered Rayleigh Scattering system. In particular, we develop and discuss a detailed theoretical model along with the associated model parameters and their related uncertainties. Based on this model, we then present experimental results for ambient room air and for a Mach 2 free jet, including spatially resolved measurements of velocity, temperature, and pressure.

  14. Uncertainty Analysis of A Flood Risk Mapping Procedure Applied In Urban Areas

    NASA Astrophysics Data System (ADS)

    Krause, J.; Uhrich, S.; Bormann, H.; Diekkrüger, B.

    In the framework of the IRMA-Sponge program, the presented study was part of the joint research project FRHYMAP (flood risk and hydrological mapping). A simple conceptual flooding model (FLOODMAP) has been developed to simulate flooded areas beside rivers within cities. FLOODMAP requires a minimum of input data (digital elevation model (DEM), river line, water level plain) and parameters, and calculates the flood extent as well as the spatial distribution of flood depths. Of course, the simulated model results are affected by errors and uncertainties; possible sources of uncertainty are the model structure, the model parameters and the input data. Thus, after the model validation (comparison of the simulated flood extent to the observed extent taken from airborne pictures), the uncertainty of the essential input data set (the digital elevation model) was analysed. Monte Carlo simulations were performed to assess the effect of uncertainties concerning the statistics of DEM quality and to derive flooding probabilities from the set of simulations. The questions concerning the minimum DEM resolution required for flood simulation and the best aggregation procedure for a given DEM were answered by comparing the results obtained using all available standard GIS aggregation procedures. Seven different aggregation procedures were applied to high resolution DEMs (1-2 m) in three cities (Bonn, Cologne, Luxembourg). Based on this analysis, the effect of 'uncertain' DEM data was estimated and compared with other sources of uncertainty. Socio-economic information and the monetary transfer functions required for a damage risk analysis in particular show a high uncertainty. This study therefore helps to analyse the weak points of the flood risk and damage risk assessment procedure.
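
    The Monte Carlo treatment of DEM error is the most transferable piece of this study and can be sketched in a few lines: perturb the DEM with its assumed standard error, flag cells below the water plane, and average over realisations to get per-cell flooding probabilities. All numbers below are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    # Synthetic DEM: a gentle slope away from the river plus small-scale relief
    ny, nx = 100, 100
    dem = 0.02 * np.arange(nx) + rng.normal(0.0, 0.05, (ny, nx))  # metres
    water_level = 1.0   # simulated water surface elevation, m
    sigma_dem = 0.15    # assumed DEM standard error, m

    # Monte Carlo over DEM error: each realisation floods the cells lying
    # below the water plane; averaging yields per-cell flooding probabilities
    n_mc = 500
    flooded = np.zeros_like(dem)
    for _ in range(n_mc):
        flooded += (dem + rng.normal(0.0, sigma_dem, dem.shape)) < water_level
    flood_prob = flooded / n_mc

    # Cells with probabilities near 0.5 mark where DEM quality matters most
    uncertain_area = np.mean((flood_prob > 0.2) & (flood_prob < 0.8))
    print(f"fraction of cells with ambiguous flood state: {uncertain_area:.0%}")
    ```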

  15. Uncertainty in Random Forests: What does it mean in a spatial context?

    NASA Astrophysics Data System (ADS)

    Klump, Jens; Fouedjio, Francky

    2017-04-01

    Geochemical surveys are an important part of exploration for mineral resources and of environmental studies. The samples and chemical analyses are often laborious and difficult to obtain and therefore come at a high cost. As a consequence, these surveys are characterised by datasets with large numbers of variables but relatively few data points when compared to conventional big data problems. With more remote sensing platforms and sensor networks being deployed, large volumes of auxiliary data for the surveyed areas are becoming available. The use of these auxiliary data has the potential to improve the prediction of chemical element concentrations over the whole study area. Kriging is a well-established geostatistical method for the prediction of spatial data, but it requires significant pre-processing and makes some basic assumptions about the underlying distribution of the data. Some machine learning algorithms, on the other hand, may require less data pre-processing and are non-parametric. In this study we used a dataset provided by Kirkwood et al. [1] to explore the potential use of Random Forest in geochemical mapping. We chose Random Forest because it is a well understood machine learning method and has the advantage that it provides a measure of uncertainty. Comparing Random Forest to Kriging, we found that both methods produced comparable maps of estimated values for our variables of interest. Kriging outperformed Random Forest for variables with relatively strong spatial correlation. The measure of uncertainty provided by Random Forest seems to be quite different from the measure of uncertainty provided by Kriging; in particular, the lack of spatial context can give misleading results in areas without ground truth data. In conclusion, our preliminary results show that the model-driven approach of geostatistics gives more reliable estimates for our target variables than Random Forest for variables with relatively strong spatial correlation. However, in cases of weak spatial correlation, Random Forest, as a non-parametric method, may give better results once we have a better understanding of the meaning of its uncertainty measures in a spatial context. References: [1] Kirkwood, C., M. Cave, D. Beamish, S. Grebby, and A. Ferreira (2016), A machine learning approach to geochemical mapping, Journal of Geochemical Exploration, 163, 28-40, doi:10.1016/j.gexplo.2016.05.003.
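
    The uncertainty measure discussed for Random Forest is typically taken from the spread of the individual trees' predictions, which reflects model variance rather than anything spatially aware, exactly the limitation noted above. A minimal sketch with synthetic covariates (the data and hyperparameters are placeholders, not the study's setup):

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(11)

    # Synthetic survey: a few hundred samples with four auxiliary covariates
    X = rng.uniform(0.0, 1.0, (300, 4))
    y = 10 * X[:, 0] + np.sin(6 * X[:, 1]) + rng.normal(0, 0.3, 300)

    rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)

    # One common "uncertainty" proxy: the spread of the individual trees'
    # predictions (note this measures model variance, not spatial context)
    X_new = rng.uniform(0.0, 1.0, (50, 4))
    per_tree = np.stack([tree.predict(X_new) for tree in rf.estimators_])
    pred = per_tree.mean(axis=0)
    spread = per_tree.std(axis=0)
    print("mean prediction spread across trees:", round(float(spread.mean()), 3))
    ```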

  16. Uncertainty Assessment of the NASA Earth Exchange Global Daily Downscaled Climate Projections (NEX-GDDP) Dataset

    NASA Technical Reports Server (NTRS)

    Wang, Weile; Nemani, Ramakrishna R.; Michaelis, Andrew; Hashimoto, Hirofumi; Dungan, Jennifer L.; Thrasher, Bridget L.; Dixon, Keith W.

    2016-01-01

    The NASA Earth Exchange Global Daily Downscaled Projections (NEX-GDDP) dataset comprises downscaled climate projections that are derived from 21 General Circulation Model (GCM) runs conducted under the Coupled Model Intercomparison Project Phase 5 (CMIP5) and across two of the four greenhouse gas emissions scenarios (RCP4.5 and RCP8.5). Each of the climate projections includes daily maximum temperature, minimum temperature, and precipitation for the periods from 1950 through 2100, and the spatial resolution is 0.25 degrees (approximately 25 km x 25 km). The GDDP dataset has been warmly welcomed by the science community for conducting studies of climate change impacts at local to regional scales, but a comprehensive evaluation of its uncertainties is still missing. In this study, we apply the Perfect Model Experiment framework (Dixon et al. 2016) to quantify the key sources of uncertainties from the observational baseline dataset, the downscaling algorithm, and some intrinsic assumptions (e.g., the stationarity assumption) inherent to the statistical downscaling techniques. We developed a set of metrics to evaluate downscaling errors resulting from bias correction ("quantile-mapping") and spatial disaggregation, as well as from the temporal-spatial non-stationarity of climate variability. Our results highlight the spatial disaggregation (or interpolation) errors, which dominate the overall uncertainties of the GDDP dataset, especially over heterogeneous and complex terrains (e.g., mountains and coastal areas). In comparison, the temporal errors in the GDDP dataset tend to be more constrained. Our results also indicate that the downscaled daily precipitation has relatively larger uncertainties than the temperature fields, reflecting the rather stochastic nature of precipitation in space. Therefore, our results provide insights for improving statistical downscaling algorithms and products in the future.
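
    A minimal sketch of empirical quantile-mapping bias correction, the "quantile-mapping" step named above (a common textbook variant; the exact NEX-GDDP implementation differs in detail, and all data here are synthetic):

      import numpy as np

      def quantile_map(model_hist, obs_hist, model_fut, n_q=100):
          """Empirical quantile-mapping bias correction.

          model_hist : model values over the historical baseline period
          obs_hist   : observations over the same period
          model_fut  : model values to be corrected
          """
          q = np.linspace(0.0, 1.0, n_q)
          mq = np.quantile(model_hist, q)   # model climatology quantiles
          oq = np.quantile(obs_hist, q)     # observed climatology quantiles
          # Map each future value through the model CDF onto the
          # corresponding observed quantile.
          return np.interp(model_fut, mq, oq)

      # Example: correct a warm-biased daily temperature series.
      rng = np.random.default_rng(1)
      obs = rng.normal(15.0, 5.0, 3650)
      mod = rng.normal(17.0, 6.0, 3650)          # biased model baseline
      fut = rng.normal(19.0, 6.0, 3650)          # future projection
      corrected = quantile_map(mod, obs, fut)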

  17. Quantitative Measures for Evaluation of Ultrasound Therapies of the Prostate

    NASA Astrophysics Data System (ADS)

    Kobelevskiy, Ilya; Burtnyk, Mathieu; Bronskill, Michael; Chopra, Rajiv

    2010-03-01

    Development of non-invasive techniques for prostate cancer treatment requires the implementation of quantitative measures for evaluation of the treatment results. In this paper, we introduce measures that estimate spatial targeting accuracy and potential thermal damage to the structures surrounding the prostate. The measures were developed for the technique of treating prostate cancer with transurethral ultrasound heating applicators guided by active MR temperature feedback. Variations of ultrasound element length and related MR imaging parameters such as MR slice thickness and update time were investigated by performing numerical simulations of the treatment on a database of ten patient prostate geometries segmented from clinical MR images. The susceptibility of each parameter configuration to uncertainty in MR temperature measurements was studied by adding noise to the temperature measurements. Gaussian noise with zero mean and standard deviation of 0, 1, 3 and 5 °C was used to model different levels of uncertainty in MR temperature measurements. Results of simulations for each parameter configuration were averaged over the database of the ten prostate patient geometries studied. The results have shown that for an update time of 5 seconds both 3- and 5-mm elements achieve appropriate performance for temperature uncertainty up to 3 °C, while a temperature uncertainty of 5 °C leads to a noticeable reduction in spatial accuracy and an increased risk of damaging the rectal wall. Ten-mm elements lacked spatial accuracy and had a higher risk of damaging the rectal wall compared to 3- and 5-mm elements, but were less sensitive to the level of temperature uncertainty. The effect of changing the update time was studied for 5-mm elements. Simulations showed that the update time had minor effects on all aspects of treatment for temperature uncertainties of 0 and 1 °C, while temperature uncertainties of 3 and 5 °C led to reduced spatial accuracy, increased potential damage to the rectal wall, and longer treatment times for update times above 5 seconds. Overall evaluation of the results suggested that 5-mm elements showed the best performance under practically achievable MR imaging parameters.
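
    The noise model used in these simulations is straightforward to reproduce; a toy sketch (a hypothetical 1-D temperature profile, not the patient geometries of the study) shows how zero-mean Gaussian noise at the stated levels can be layered onto simulated MR temperature maps in a Monte Carlo loop:

      import numpy as np

      def noisy_temperature(true_temp, sigma, rng):
          """Add zero-mean Gaussian noise (std sigma, in deg C) to a
          simulated MR temperature map, mimicking measurement uncertainty."""
          return true_temp + rng.normal(0.0, sigma, size=true_temp.shape)

      rng = np.random.default_rng(0)
      true_temp = 37.0 + 20.0 * np.exp(-np.linspace(0, 4, 64) ** 2)  # 1-D profile

      # Replicate the noise levels studied above and average many trials.
      for sigma in (0.0, 1.0, 3.0, 5.0):
          overshoot = np.mean([
              np.max(noisy_temperature(true_temp, sigma, rng)) - np.max(true_temp)
              for _ in range(1000)
          ])
          print(f"sigma={sigma} degC -> mean peak overshoot {overshoot:.2f} degC")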

  18. Application of Spectral Analysis Techniques in the Intercomparison of Aerosol Data. Part II: Using Maximum Covariance Analysis to Effectively Compare Spatiotemporal Variability of Satellite and AERONET Measured Aerosol Optical Depth

    NASA Technical Reports Server (NTRS)

    Li, Jing; Carlson, Barbara E.; Lacis, Andrew A.

    2014-01-01

    The Moderate Resolution Imaging SpectroRadiometer (MODIS) and Multi-angle Imaging SpectroRadiometer (MISR) provide regular aerosol observations with global coverage. It is essential to examine the coherency between space- and ground-measured aerosol parameters in representing aerosol spatial and temporal variability, especially in the climate forcing and model validation context. In this paper, we introduce Maximum Covariance Analysis (MCA), also known as Singular Value Decomposition analysis, as an effective way to compare correlated aerosol spatial and temporal patterns between satellite measurements and AERONET data. This technique not only successfully extracts the variability of major aerosol regimes but also allows the simultaneous examination of aerosol variability both spatially and temporally. More importantly, it accommodates the sparsely distributed AERONET data well, for which other spectral decomposition methods, such as Principal Component Analysis, do not yield satisfactory results. The comparison shows overall good agreement between MODIS/MISR and AERONET AOD variability. The correlations between the first three modes of the MCA results for both MODIS/AERONET and MISR/AERONET are above 0.8 for the full data set and above 0.75 for the AOD anomaly data. The correlations between MODIS and MISR modes are also quite high (greater than 0.9). We also examine the extent of spatial agreement between satellite and AERONET AOD data at the selected stations. Some sites with disagreements in the MCA results, such as Kanpur, also have low spatial coherency. This is likely associated partly with high AOD spatial variability and partly with uncertainties in satellite retrievals due to seasonally varying aerosol types and surface properties.
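
    For readers unfamiliar with MCA, a minimal sketch of the core computation, an SVD of the cross-covariance matrix (synthetic anomaly matrices are assumed; in the study, X and Y would hold satellite and AERONET AOD anomalies):

      import numpy as np

      def mca(X, Y, n_modes=3):
          """Maximum Covariance Analysis of two anomaly data sets.

          X : (n_time, n_space_x) e.g. gridded satellite AOD anomalies
          Y : (n_time, n_space_y) e.g. AERONET station AOD anomalies
          Returns spatial patterns and expansion coefficients for the
          leading modes of co-variability.
          """
          X = X - X.mean(axis=0)
          Y = Y - Y.mean(axis=0)
          C = X.T @ Y / (X.shape[0] - 1)            # cross-covariance matrix
          U, s, Vt = np.linalg.svd(C, full_matrices=False)
          Px, Py = U[:, :n_modes], Vt[:n_modes].T   # spatial patterns
          Ax, Ay = X @ Px, Y @ Py                   # expansion coefficients
          return Px, Py, Ax, Ay, s[:n_modes]

      # Correlating the paired expansion coefficients (Ax[:, k] vs Ay[:, k])
      # gives the per-mode agreement statistics quoted above.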

  19. Recent developments in measurement and evaluation of FAC damage in power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garud, Y.S.; Besuner, P.; Cohn, M.J.

    1999-11-01

    This paper describes some recent developments in the measurement and evaluation of flow-accelerated corrosion (FAC) damage in power plants. The evaluation focuses on data checking and smoothing to account for gross errors, noise, and uncertainty in the wall thickness measurements from ultrasonic or pulsed eddy-current data. Also, the evaluation method utilizes advanced regression analysis for spatial and temporal evolution of the wall loss, providing statistically robust predictions of wear rates and associated uncertainty. Results of the application of these new tools are presented for several components in actual service. More importantly, the practical implications of using these advances are discussed in relation to the likely impact on the scope and effectiveness of FAC-related inspection programs.

  20. Advances on the Failure Analysis of the Dam-Foundation Interface of Concrete Dams.

    PubMed

    Altarejos-García, Luis; Escuder-Bueno, Ignacio; Morales-Torres, Adrián

    2015-12-02

    Failure analysis of the dam-foundation interface in concrete dams is characterized by complexity, uncertainties in models and parameters, and a strong non-linear softening behavior. In practice, these uncertainties are dealt with through a well-structured mixture of experience, best practices and prudent, conservative design approaches based on the safety factor concept. Yet a sound, deep knowledge of some aspects of this failure mode is still lacking, as these aspects have been offset in practical applications by the use of this conservative approach. In this paper we show a strategy to analyse this failure mode under a reliability-based approach. The proposed methodology of analysis integrates epistemic uncertainty on the spatial variability of strength parameters and data from dam monitoring. The purpose is to produce meaningful and useful information regarding the probability of occurrence of this failure mode that can be incorporated in risk-informed dam safety reviews. In addition, relationships between probability of failure and factors of safety are obtained. This research is supported by more than a decade of intensive professional practice on real-world cases, and its final purpose is to bring some clarity and guidance, and to contribute to the improvement of current knowledge and best practices, on such an important dam safety concern.
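
    A heavily simplified sketch of the reliability-based idea (an illustrative sliding-stability safety factor with made-up loads and strength distributions; the paper additionally models the spatial variability of strength along the interface, which this omits), showing how Monte Carlo sampling links probability of failure to factors of safety:

      import numpy as np

      rng = np.random.default_rng(7)
      n = 200_000

      # Hypothetical sliding stability of the dam-foundation interface:
      # FS = (c*A + (W - U) * tan(phi)) / H, with uncertain strength
      # parameters and deterministic loads for simplicity.
      A, W, U, H = 1200.0, 9.0e5, 2.5e5, 1.8e5   # m2, kN, kN, kN (made up)
      c = rng.lognormal(mean=np.log(50.0), sigma=0.4, size=n)   # cohesion, kPa
      phi = np.radians(rng.normal(45.0, 5.0, size=n))           # friction angle

      fs = (c * A + (W - U) * np.tan(phi)) / H
      p_failure = np.mean(fs < 1.0)
      print(f"mean FS = {fs.mean():.2f}, P(failure) = {p_failure:.2e}")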

  1. Advances on the Failure Analysis of the Dam—Foundation Interface of Concrete Dams

    PubMed Central

    Altarejos-García, Luis; Escuder-Bueno, Ignacio; Morales-Torres, Adrián

    2015-01-01

    Failure analysis of the dam-foundation interface in concrete dams is characterized by complexity, uncertainties in models and parameters, and a strong non-linear softening behavior. In practice, these uncertainties are dealt with through a well-structured mixture of experience, best practices and prudent, conservative design approaches based on the safety factor concept. Yet a sound, deep knowledge of some aspects of this failure mode is still lacking, as these aspects have been offset in practical applications by the use of this conservative approach. In this paper we show a strategy to analyse this failure mode under a reliability-based approach. The proposed methodology of analysis integrates epistemic uncertainty on the spatial variability of strength parameters and data from dam monitoring. The purpose is to produce meaningful and useful information regarding the probability of occurrence of this failure mode that can be incorporated in risk-informed dam safety reviews. In addition, relationships between probability of failure and factors of safety are obtained. This research is supported by more than a decade of intensive professional practice on real-world cases, and its final purpose is to bring some clarity and guidance, and to contribute to the improvement of current knowledge and best practices, on such an important dam safety concern. PMID:28793709

  2. Use of NARCCAP data to characterize regional climate uncertainty in the impact of global climate change on large river fish population: Missouri River sturgeon example

    NASA Astrophysics Data System (ADS)

    Anderson, C. J.; Wildhaber, M. L.; Wikle, C. K.; Moran, E. H.; Franz, K. J.; Dey, R.

    2012-12-01

    Climate change operates over a broad range of spatial and temporal scales. Understanding the effects of change on ecosystems requires accounting for the propagation of information and uncertainty across these scales. For example, to understand potential climate change effects on fish populations in riverine ecosystems, climate conditions predicted by coarse-resolution atmosphere-ocean global climate models must first be translated to the regional climate scale. In turn, this regional information is used to force watershed models, which are used to force river condition models, which impact the population response. A critical challenge in such a multiscale modeling environment is to quantify sources of uncertainty given the highly nonlinear nature of interactions between climate variables and the individual organism. We use a hierarchical modeling approach for accommodating uncertainty in multiscale ecological impact studies. This framework allows for uncertainty due to system models, model parameter settings, and stochastic parameterizations. This approach is a hybrid between physical (deterministic) downscaling and statistical downscaling, recognizing that there is uncertainty in both. We use NARCCAP data to determine confidence in the capability of climate models to simulate relevant processes and to quantify regional climate variability within the context of the hierarchical model of uncertainty quantification. By confidence, we mean the ability of the regional climate model to replicate observed mechanisms. We use the NCEP-driven simulations for this analysis. This provides a base from which regional change can be categorized as either a modification of previously observed mechanisms or the emergence of new processes. The management implications of these categories of change are significantly different, in that procedures to address impacts from existing processes may already be known and need only adjustment, whereas emergent processes may require new management strategies. The results from the hierarchical analysis of uncertainty are used to study the relative change in weights of the endangered Missouri River pallid sturgeon (Scaphirhynchus albus) under a 21st century climate scenario.

  3. Advances in Applications of Hierarchical Bayesian Methods with Hydrological Models

    NASA Astrophysics Data System (ADS)

    Alexander, R. B.; Schwarz, G. E.; Boyer, E. W.

    2017-12-01

    Mechanistic and empirical watershed models are increasingly used to inform water resource decisions. Growing access to historical stream measurements and data from in-situ sensor technologies has increased the need for improved techniques for coupling models with hydrological measurements. Techniques that account for the intrinsic uncertainties of both models and measurements are especially needed. Hierarchical Bayesian methods provide an efficient modeling tool for quantifying model and prediction uncertainties, including those associated with measurements. Hierarchical methods can also be used to explore spatial and temporal variations in model parameters and uncertainties that are informed by hydrological measurements. We used hierarchical Bayesian methods to develop a hybrid (statistical-mechanistic) SPARROW (SPAtially Referenced Regression On Watershed attributes) model of long-term mean annual streamflow across diverse environmental and climatic drainages in 18 U.S. hydrological regions. Our application illustrates the use of a new generation of Bayesian methods that offer more advanced computational efficiencies than the prior generation. Evaluations of the effects of hierarchical (regional) variations in model coefficients and uncertainties on model accuracy indicate improved prediction accuracies (median of 10-50%), primarily in humid eastern regions, where model uncertainties are one-third of those in arid western regions. Generally moderate regional variability is observed for most hierarchical coefficients. Accounting for measurement and structural uncertainties, using hierarchical state-space techniques, revealed the effects of spatially heterogeneous, latent hydrological processes in the "localized" drainages between calibration sites; this improved model precision, with only minor changes in regional coefficients. Our study can inform advances in the use of hierarchical methods with hydrological models to improve their integration with stream measurements.

  4. Modeling sugar cane yield with a process-based model from site to continental scale: uncertainties arising from model structure and parameter values

    NASA Astrophysics Data System (ADS)

    Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Huth, N.; Marin, F.; Martiné, J.-F.

    2014-01-01

    Agro-Land Surface Models (agro-LSM) have been developed from the integration of specific crop processes into large-scale generic land surface models that allow calculating the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum. When developing agro-LSM models, particular attention must be given to the effects of crop phenology and management on the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty of agro-LSM models is related to their usually large number of parameters. In this study, we quantify the parameter-values uncertainty in the simulation of sugar cane biomass production with the agro-LSM ORCHIDEE-STICS, using a multi-regional approach with data from sites in Australia, La Réunion and Brazil. In ORCHIDEE-STICS, two models are chained: STICS, an agronomy model that calculates phenology and management, and ORCHIDEE, a land surface model that calculates biomass and other ecosystem variables forced by STICS' phenology. First, the parameters that dominate the uncertainty of simulated biomass at harvest date are determined through a screening of 67 different parameters of both STICS and ORCHIDEE on a multi-site basis. Secondly, the uncertainty of harvested biomass attributable to those most sensitive parameters is quantified and specifically attributed to either STICS (phenology, management) or to ORCHIDEE (other ecosystem variables including biomass) through distinct Monte Carlo runs. The uncertainty on parameter values is constrained using observations by calibrating the model independently at seven sites. In a third step, a sensitivity analysis is carried out by varying the most sensitive parameters to investigate their effects at continental scale. A Monte Carlo sampling method associated with the calculation of Partial Ranked Correlation Coefficients is used to quantify the sensitivity of harvested biomass to input parameters on a continental scale across the large regions of intensive sugar cane cultivation in Australia and Brazil. Ten parameters driving most of the uncertainty in the ORCHIDEE-STICS modeled biomass at the 7 sites are identified by the screening procedure. We found that the 10 most sensitive parameters control phenology (maximum rate of increase of LAI) and root uptake of water and nitrogen (root profile and root growth rate, nitrogen stress threshold) in STICS, and photosynthesis (optimal temperature of photosynthesis, optimal carboxylation rate), radiation interception (extinction coefficient), and transpiration and respiration (stomatal conductance, growth and maintenance respiration coefficients) in ORCHIDEE. We find that the optimal carboxylation rate and photosynthesis temperature parameters contribute most to the uncertainty in harvested biomass simulations at site scale. The spatial variation of the ranked correlation between input parameters and modeled biomass at harvest is well explained by rain and temperature drivers, suggesting different climate-mediated sensitivities of modeled sugar cane yield to the model parameters for Australia and Brazil. This study reveals the spatial and temporal patterns of uncertainty variability for a highly parameterized agro-LSM and calls for more systematic uncertainty analyses of such models.

  5. Modeling sugarcane yield with a process-based model from site to continental scale: uncertainties arising from model structure and parameter values

    NASA Astrophysics Data System (ADS)

    Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Caubel, A.; Huth, N.; Marin, F.; Martiné, J.-F.

    2014-06-01

    Agro-land surface models (agro-LSM) have been developed from the integration of specific crop processes into large-scale generic land surface models that allow calculating the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum. When developing agro-LSM models, particular attention must be given to the effects of crop phenology and management on the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty of agro-LSM models is related to their usually large number of parameters. In this study, we quantify the parameter-values uncertainty in the simulation of sugarcane biomass production with the agro-LSM ORCHIDEE-STICS, using a multi-regional approach with data from sites in Australia, La Réunion and Brazil. In ORCHIDEE-STICS, two models are chained: STICS, an agronomy model that calculates phenology and management, and ORCHIDEE, a land surface model that calculates biomass and other ecosystem variables forced by STICS phenology. First, the parameters that dominate the uncertainty of simulated biomass at harvest date are determined through a screening of 67 different parameters of both STICS and ORCHIDEE on a multi-site basis. Secondly, the uncertainty of harvested biomass attributable to those most sensitive parameters is quantified and specifically attributed to either STICS (phenology, management) or to ORCHIDEE (other ecosystem variables including biomass) through distinct Monte Carlo runs. The uncertainty on parameter values is constrained using observations by calibrating the model independently at seven sites. In a third step, a sensitivity analysis is carried out by varying the most sensitive parameters to investigate their effects at continental scale. A Monte Carlo sampling method associated with the calculation of partial ranked correlation coefficients is used to quantify the sensitivity of harvested biomass to input parameters on a continental scale across the large regions of intensive sugarcane cultivation in Australia and Brazil. The ten parameters driving most of the uncertainty in the ORCHIDEE-STICS modeled biomass at the 7 sites are identified by the screening procedure. We found that the 10 most sensitive parameters control phenology (maximum rate of increase of LAI) and root uptake of water and nitrogen (root profile and root growth rate, nitrogen stress threshold) in STICS, and photosynthesis (optimal temperature of photosynthesis, optimal carboxylation rate), radiation interception (extinction coefficient), and transpiration and respiration (stomatal conductance, growth and maintenance respiration coefficients) in ORCHIDEE. We find that the optimal carboxylation rate and photosynthesis temperature parameters contribute most to the uncertainty in harvested biomass simulations at site scale. The spatial variation of the ranked correlation between input parameters and modeled biomass at harvest is well explained by rain and temperature drivers, suggesting different climate-mediated sensitivities of modeled sugarcane yield to the model parameters, for Australia and Brazil. This study reveals the spatial and temporal patterns of uncertainty variability for a highly parameterized agro-LSM and calls for more systematic uncertainty analyses of such models.
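
    A minimal sketch of the partial ranked correlation coefficient (PRCC) computation used in the continental-scale sensitivity analysis above (synthetic parameter samples and output are assumed; each parameter's ranks are partialled on all other parameters' ranks before correlating with the output ranks):

      import numpy as np
      from scipy import stats

      def prcc(X, y):
          """Partial Ranked Correlation Coefficients between each sampled
          parameter (columns of X) and a scalar model output y."""
          R = np.column_stack([stats.rankdata(c) for c in X.T])
          ry = stats.rankdata(y)
          out = []
          for j in range(R.shape[1]):
              others = np.delete(R, j, axis=1)
              A = np.column_stack([np.ones(len(ry)), others])
              # Residuals after removing the (rank) effect of all other
              # parameters from both the j-th parameter and the output.
              res_x = R[:, j] - A @ np.linalg.lstsq(A, R[:, j], rcond=None)[0]
              res_y = ry - A @ np.linalg.lstsq(A, ry, rcond=None)[0]
              out.append(np.corrcoef(res_x, res_y)[0, 1])
          return np.array(out)

      # Toy check: y depends strongly on p0, weakly on p1, not on p2.
      rng = np.random.default_rng(3)
      X = rng.uniform(0, 1, size=(500, 3))
      y = 5 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.2, 500)
      print(prcc(X, y))   # expect: strong, moderate, ~0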

  6. Impact of Uncertainty on the Porous Media Description in the Subsurface Transport Analysis

    NASA Astrophysics Data System (ADS)

    Darvini, G.; Salandin, P.

    2008-12-01

    In the modelling of flow and transport phenomena in naturally heterogeneous media, the spatial variability of hydraulic properties, typically the hydraulic conductivity, is generally described by use of a variogram of constant sill and spatial correlation. While some analyses reported in the literature discuss spatial inhomogeneity related to a trend in the mean hydraulic conductivity, the effect on flow and transport of an inexact definition of the spatial statistical properties of the media has, as far as we know, never been taken into account. The relevance of this topic is manifest: it is related to the uncertainty in the definition of the spatial moments of hydraulic log-conductivity from a (usually) small number of data, as well as to the modelling of flow and transport processes by the Monte Carlo technique, whose numerical fields have poor ergodic properties and are not strictly statistically homogeneous. In this work we investigate the effects related to mean log-conductivity (logK) field behaviours that differ from the constant one due to different sources of inhomogeneity: (i) a deterministic trend; (ii) a deterministic sinusoidal pattern; (iii) a random behaviour deriving from the hierarchical sedimentary architecture of porous formations; and (iv) conditioning on available measurements of the hydraulic conductivity. These mean log-conductivity behaviours are superimposed on a correlated, weakly fluctuating logK field. The time evolution of the spatial moments of the plume driven by a statistically inhomogeneous steady-state random velocity field is analyzed in a 2-D finite domain, taking into account different sizes of the injection area. The problem is approached by both a classical Monte Carlo procedure and the SFEM (stochastic finite element method). With the latter, the moments are obtained by space-time integration of the velocity field covariance structure derived according to a first-order Taylor series expansion. Two different goals are foreseen: (1) from the results it will be possible to distinguish the contribution to plume dispersion of the uncertainty in the statistics of the medium hydraulic properties in all the cases considered, and (2) we will try to highlight the loss of performance that seems to affect first-order approaches for transport phenomena that take place in hierarchical architectures of porous formations.

  7. Modeling of 2D diffusion processes based on microscopy data: parameter estimation and practical identifiability analysis.

    PubMed

    Hock, Sabrina; Hasenauer, Jan; Theis, Fabian J

    2013-01-01

    Diffusion is a key component of many biological processes such as chemotaxis, developmental differentiation and tissue morphogenesis. Recently, it has become possible to assess the spatial gradients caused by diffusion in-vitro and in-vivo using microscopy-based imaging techniques. The resulting time series of two-dimensional, high-resolution images in combination with mechanistic models enable the quantitative analysis of the underlying mechanisms. However, such a model-based analysis is still challenging due to measurement noise and sparse observations, which result in uncertainties in the model parameters. We introduce a likelihood function for image-based measurements with log-normally distributed noise. Based upon this likelihood function we formulate the maximum likelihood estimation problem, which is solved using PDE-constrained optimization methods. To assess the uncertainty and practical identifiability of the parameters we introduce profile likelihoods for diffusion processes. As a proof of concept, we model certain aspects of the guidance of dendritic cells towards lymphatic vessels, an example of haptotaxis. Using a realistic set of artificial measurement data, we estimate the five kinetic parameters of this model and compute profile likelihoods. Our novel approach for the estimation of model parameters from image data, as well as the proposed identifiability analysis approach, is widely applicable to diffusion processes. The profile-likelihood-based method provides more rigorous uncertainty bounds than local approximation methods.
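
    A toy sketch of the profile-likelihood idea (a one-parameter decay model with log-normal noise standing in for the PDE model of the paper; all names are illustrative): the parameter of interest is fixed on a grid while the remaining parameters are re-optimized, and the resulting curve yields likelihood-based confidence bounds:

      import numpy as np
      from scipy.optimize import minimize

      def neg_log_lik(params, t, y):
          """Toy log-normal noise model: y ~ LogNormal(log(mu(t; k)), sigma),
          standing in for the image-based likelihood described above."""
          k, sigma = params
          mu = np.exp(-k * t)                     # simple decay "model"
          z = np.log(y) - np.log(mu)
          return np.sum(0.5 * (z / sigma) ** 2 + np.log(sigma * y))

      def profile_likelihood(k_grid, t, y):
          """Re-optimize the nuisance parameter (sigma) at each fixed k."""
          prof = []
          for k in k_grid:
              r = minimize(lambda s: neg_log_lik((k, s[0]), t, y),
                           x0=[0.3], bounds=[(1e-3, None)])
              prof.append(r.fun)
          return np.array(prof)

      rng = np.random.default_rng(0)
      t = np.linspace(0.1, 5, 40)
      y = np.exp(-0.8 * t) * rng.lognormal(0.0, 0.2, t.size)
      k_grid = np.linspace(0.4, 1.2, 41)
      prof = profile_likelihood(k_grid, t, y)
      # Points with prof - prof.min() < chi2(1, 0.95)/2 = 1.92 form the
      # 95% profile-likelihood confidence interval for k.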

  8. Error discrimination of an operational hydrological forecasting system at a national scale

    NASA Astrophysics Data System (ADS)

    Jordan, F.; Brauchli, T.

    2010-09-01

    The use of operational hydrological forecasting systems is recommended for hydropower production as well as flood management. However, the forecast uncertainties can be large and lead to bad decisions such as false alarms and inappropriate reservoir management of hydropower plants. In order to improve forecasting systems, it is important to discriminate between the different sources of uncertainty. To achieve this task, reanalyses of past predictions can be performed to provide information about the structure of the overall uncertainty. In order to discriminate between uncertainty due to the numerical weather model and uncertainty due to the rainfall-runoff model, simulations assuming a perfect weather forecast must be performed. This contribution presents a spatial analysis of the weather uncertainties and their influence on the river discharge prediction for several river basins where an operational forecasting system exists. The forecast is based on the RS 3.0 system [1], [2], which also runs the open Internet platform www.swissrivers.ch [3]. The uncertainty related to the hydrological model is compared to the uncertainty related to the weather prediction. A comparison between numerous weather prediction models [4] at different lead times is also presented. The results highlight significant potential for improvement in both forecasting components: the hydrological rainfall-runoff model and the numerical weather prediction models. The hydrological processes must be accurately represented during the model calibration procedure, while the weather prediction models suffer from a systematic spatial bias. REFERENCES [1] Garcia, J., Jordan, F., Dubois, J. & Boillat, J.-L. 2007. "Routing System II, Modélisation d'écoulements dans des systèmes hydrauliques", Communication LCH n° 32, Ed. Prof. A. Schleiss, Lausanne [2] Jordan, F. 2007. Modèle de prévision et de gestion des crues - optimisation des opérations des aménagements hydroélectriques à accumulation pour la réduction des débits de crue, thèse de doctorat n° 3711, Ecole Polytechnique Fédérale, Lausanne [3] Keller, R. 2009. "Le débit des rivières au peigne fin", Revue Technique Suisse, N°7/8 2009, Swiss engineering RTS, UTS SA, Lausanne, p. 11 [4] Kaufmann, P., Schubiger, F. & Binder, P. 2003. Precipitation forecasting by a mesoscale numerical weather prediction (NWP) model: eight years of experience, Hydrology and Earth System

  9. A Multi-Band Uncertainty Set Based Robust SCUC With Spatial and Temporal Budget Constraints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Chenxi; Wu, Lei; Wu, Hongyu

    2016-11-01

    The dramatic increase of renewable energy resources in recent years, together with the long-existing load forecast errors and increasingly involved price-sensitive demands, has introduced significant uncertainties into power systems operation. In order to guarantee the operational security of power systems with such uncertainties, robust optimization has been extensively studied in security-constrained unit commitment (SCUC) problems, for immunizing the system against worst uncertainty realizations. However, traditional robust SCUC models with single-band uncertainty sets may yield over-conservative solutions in most cases. This paper proposes a multi-band robust model to accurately formulate various uncertainties with higher resolution. By properly tuning band intervals and weight coefficients of individual bands, the proposed multi-band robust model can rigorously and realistically reflect spatial/temporal relationships and asymmetric characteristics of various uncertainties, and in turn could effectively leverage the tradeoff between robustness and economics of robust SCUC solutions. The proposed multi-band robust SCUC model is solved by Benders decomposition (BD) and outer approximation (OA), while taking advantage of the integral property of the proposed multi-band uncertainty set. In addition, several accelerating techniques are developed for enhancing the computational performance and the convergence speed. Numerical studies on a 6-bus system and the modified IEEE 118-bus system verify the effectiveness of the proposed robust SCUC approach for enhancing uncertainty modeling capabilities and mitigating conservativeness of the robust SCUC solution.

  10. Application of data fusion modeling (DFM) to site characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Porter, D.W.; Gibbs, B.P.; Jones, W.F.

    1996-01-01

    Subsurface characterization is faced with substantial uncertainties because the earth is very heterogeneous, and typical data sets are fragmented and disparate. DFM removes many of the data limitations of current methods to quantify and reduce uncertainty for a variety of data types and models. DFM is a methodology to compute hydrogeological state estimates and their uncertainties from three sources of information: measured data, physical laws, and statistical models for spatial heterogeneities. The benefits of DFM are savings in time and cost through the following: the ability to update models in real time to help guide site assessment, improved quantification of uncertainty for risk assessment, and improved remedial design by quantifying the uncertainty in safety margins. A Bayesian inverse modeling approach is implemented with a Gauss Newton method where spatial heterogeneities are viewed as Markov random fields. Information from data, physical laws, and Markov models is combined in a Square Root Information Smoother (SRIS). Estimates and uncertainties can be computed for heterogeneous hydraulic conductivity fields in multiple geological layers from the usually sparse hydraulic conductivity data and the often more plentiful head data. An application of DFM to the Old Burial Ground at the DOE Savannah River Site will be presented. DFM estimates and quantifies uncertainty in hydrogeological parameters using variably saturated flow numerical modeling to constrain the estimation. Then uncertainties are propagated through the transport modeling to quantify the uncertainty in tritium breakthrough curves at compliance points.

  11. Application of data fusion modeling (DFM) to site characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Porter, D.W.; Gibbs, B.P.; Jones, W.F.

    1996-12-31

    Subsurface characterization is faced with substantial uncertainties because the earth is very heterogeneous, and typical data sets are fragmented and disparate. DFM removes many of the data limitations of current methods to quantify and reduce uncertainty for a variety of data types and models. DFM is a methodology to compute hydrogeological state estimates and their uncertainties from three sources of information: measured data, physical laws, and statistical models for spatial heterogeneities. The benefits of DFM are savings in time and cost through the following: the ability to update models in real time to help guide site assessment, improved quantification of uncertainty for risk assessment, and improved remedial design by quantifying the uncertainty in safety margins. A Bayesian inverse modeling approach is implemented with a Gauss Newton method where spatial heterogeneities are viewed as Markov random fields. Information from data, physical laws, and Markov models is combined in a Square Root Information Smoother (SRIS). Estimates and uncertainties can be computed for heterogeneous hydraulic conductivity fields in multiple geological layers from the usually sparse hydraulic conductivity data and the often more plentiful head data. An application of DFM to the Old Burial Ground at the DOE Savannah River Site will be presented. DFM estimates and quantifies uncertainty in hydrogeological parameters using variably saturated flow numerical modeling to constrain the estimation. Then uncertainties are propagated through the transport modeling to quantify the uncertainty in tritium breakthrough curves at compliance points.

  12. Predictive uncertainty analysis of plume distribution for geological carbon sequestration using sparse-grid Bayesian method

    NASA Astrophysics Data System (ADS)

    Shi, X.; Zhang, G.

    2013-12-01

    Because of the extensive computational burden, parametric uncertainty analyses are rarely conducted for process-based multi-phase models of geological carbon sequestration (GCS). The difficulty of predictive uncertainty analysis for the CO2 plume migration in realistic GCS models is due not only to the spatial distribution of the caprock and reservoir (i.e. heterogeneous model parameters), but also to the fact that the GCS optimization estimation problem has multiple local minima, owing to the complex nonlinear multi-phase (gas and aqueous) and multi-component (water, CO2, salt) transport equations. The geological model built by Doughty and Pruess (2004) for the Frio pilot site (Texas) was selected and assumed to represent the 'true' system, which was composed of seven different facies (geological units) distributed among 10 layers. We chose to calibrate the permeabilities of these facies. Pressure and gas saturation values from this true model were then extracted and used as observations for subsequent model calibration. Random noise was added to the observations to approximate realistic field conditions. Each simulation of the model takes about 2 hours. In this study, we develop a new approach that improves the computational efficiency of Bayesian inference by constructing a surrogate system based on an adaptive sparse-grid stochastic collocation method. This surrogate response surface global optimization algorithm is first used to calibrate the model parameters; the prediction uncertainty of the CO2 plume position propagated from the parametric uncertainty is then quantified in the numerical experiments and compared to the actual plume from the 'true' model. Results show that the approach is computationally efficient for multi-modal optimization and prediction uncertainty quantification with computationally expensive simulation models. Both our inverse methodology and our findings are broadly applicable to GCS in heterogeneous storage formations.

  13. Mapping the Risk of Soil-Transmitted Helminthic Infections in the Philippines

    PubMed Central

    Leonardo, Lydia; Gray, Darren J.; Carabin, Hélène; Halton, Kate; McManus, Donald P.; Williams, Gail M.; Rivera, Pilarita; Saniel, Ofelia; Hernandez, Leda; Yakob, Laith; McGarvey, Stephen T.; Clements, Archie C. A.

    2015-01-01

    Background In order to increase the efficient allocation of soil-transmitted helminth (STH) disease control resources in the Philippines, we aimed to describe for the first time the spatial variation in the prevalence of A. lumbricoides, T. trichiura and hookworm across the country, quantify the association between the physical environment and spatial variation of STH infection and develop predictive risk maps for each infection. Methodology/Principal Findings Data on STH infection from 35,573 individuals across the country were geolocated at the barangay level and included in the analysis. The analysis was stratified geographically in two major regions: 1) Luzon and the Visayas and 2) Mindanao. Bayesian geostatistical models of STH prevalence were developed, including age and sex of individuals and environmental variables (rainfall, land surface temperature and distance to inland water bodies) as predictors, and diagnostic uncertainty was incorporated. The role of environmental variables was different between regions of the Philippines. This analysis revealed that while A. lumbricoides and T. trichiura infections were widespread and highly endemic, hookworm infections were more circumscribed to smaller foci in the Visayas and Mindanao. Conclusions/Significance This analysis revealed significant spatial variation in STH infection prevalence within provinces of the Philippines. This suggests that a spatially targeted approach to STH interventions, including mass drug administration, is warranted. When financially possible, additional STH surveys should be prioritized to high-risk areas identified by our study in Luzon. PMID:26368819

  14. Uncertainty in spatially explicit animal dispersal models

    USGS Publications Warehouse

    Mooij, Wolf M.; DeAngelis, Donald L.

    2003-01-01

    Uncertainty in estimates of survival of dispersing animals is a vexing difficulty in conservation biology. The current notion is that this uncertainty decreases the usefulness of spatially explicit population models in particular. We examined this problem by comparing dispersal models of three levels of complexity: (1) an event-based binomial model that considers only the occurrence of mortality or arrival, (2) a temporally explicit exponential model that employs mortality and arrival rates, and (3) a spatially explicit grid-walk model that simulates the movement of animals through an artificial landscape. Each model was fitted to the same set of field data. A first objective of the paper is to illustrate how the maximum-likelihood method can be used in all three cases to estimate the means and confidence limits for the relevant model parameters, given a particular set of data on dispersal survival. Using this framework we show that the structure of the uncertainty for all three models is strikingly similar. In fact, the results of our unified approach imply that spatially explicit dispersal models, which take advantage of information on landscape details, suffer less from uncertainty than do simpler models. Moreover, we show that the proposed strategy of model development safeguards one from error propagation in these more complex models. Finally, our approach shows that all models related to animal dispersal, ranging from simple to complex, can be related in a hierarchical fashion, so that the various approaches to modeling such dispersal can be viewed from a unified perspective.
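
    A minimal sketch of maximum-likelihood fits for the first two model levels (synthetic dispersal records are assumed; the binomial model uses outcomes only, while the exponential model treats mortality and arrival as competing risks with constant rates):

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import chi2

      # Hypothetical dispersal records: event time (days) and outcome
      # (True = arrived in new territory, False = died while dispersing).
      rng = np.random.default_rng(5)
      n = 120
      arrive_rate, death_rate = 0.08, 0.04          # per day (made up)
      t = rng.exponential(1.0 / (arrive_rate + death_rate), n)
      arrived = rng.uniform(size=n) < arrive_rate / (arrive_rate + death_rate)

      # (1) Event-based binomial model: only the outcome matters.
      p_hat = arrived.mean()                        # MLE of arrival probability

      # (2) Temporally explicit exponential (competing risks) model.
      def nll(log_rates):
          a, m = np.exp(log_rates)                  # arrival, mortality rates
          return -(np.sum(arrived) * np.log(a) + np.sum(~arrived) * np.log(m)
                   - (a + m) * t.sum())

      fit = minimize(nll, x0=np.log([0.05, 0.05]))
      a_hat, m_hat = np.exp(fit.x)
      print(p_hat, a_hat / (a_hat + m_hat))         # two routes to survival

      # Likelihood-ratio confidence limits follow by profiling nll against
      # the chi-square cutoff below.
      cutoff = chi2.ppf(0.95, df=1) / 2             # ~1.92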

  15. Parameter optimization of a hydrologic model in a snow-dominated basin using a modular Python framework

    NASA Astrophysics Data System (ADS)

    Volk, J. M.; Turner, M. A.; Huntington, J. L.; Gardner, M.; Tyler, S.; Sheneman, L.

    2016-12-01

    Many distributed models that simulate watershed hydrologic processes require a collection of multi-dimensional parameters as input, some of which need to be calibrated before the model can be applied. The Precipitation Runoff Modeling System (PRMS) is a physically based and spatially distributed hydrologic model that contains a considerable number of parameters that often need to be calibrated. Modelers can also benefit from uncertainty analysis of these parameters. To meet these needs, we developed a modular framework in Python to conduct PRMS parameter optimization, uncertainty analysis, interactive visual inspection of parameters and outputs, and other common modeling tasks. Here we present results for multi-step calibration of sensitive parameters controlling solar radiation, potential evapotranspiration, and streamflow in a PRMS model that we applied to the snow-dominated Dry Creek watershed in Idaho. We also demonstrate how our modular approach enables the user to use a variety of parameter optimization and uncertainty methods or easily define their own, such as Monte Carlo random sampling, uniform sampling, or optimization methods such as the downhill simplex method or its commonly used, more robust counterpart, shuffled complex evolution.
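
    A minimal sketch of the downhill simplex (Nelder-Mead) calibration step mentioned above (a toy two-parameter degree-day objective stands in for PRMS; the parameter names are invented for illustration):

      import numpy as np
      from scipy.optimize import minimize

      # A stand-in objective: mismatch between observed and simulated
      # streamflow as a function of two hypothetical PRMS-like parameters
      # (a degree-day melt factor and a base temperature).
      def objective(params, temps, observed):
          melt_factor, t_base = params
          simulated = melt_factor * np.maximum(temps - t_base, 0.0)
          return np.sqrt(np.mean((simulated - observed) ** 2))   # RMSE

      rng = np.random.default_rng(11)
      temps = rng.uniform(0, 10, 365)                  # toy temperature series
      observed = 2.2 * np.maximum(temps - 1.5, 0.0) + rng.normal(0, 0.3, 365)

      # Downhill simplex, one of the interchangeable methods a modular
      # framework can expose alongside Monte Carlo sampling.
      result = minimize(objective, x0=[1.0, 0.0], args=(temps, observed),
                        method="Nelder-Mead")
      print(result.x, result.fun)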

  16. Dynamic wake prediction and visualization with uncertainty analysis

    NASA Technical Reports Server (NTRS)

    Holforty, Wendy L. (Inventor); Powell, J. David (Inventor)

    2005-01-01

    A dynamic wake avoidance system utilizes aircraft and atmospheric parameters readily available in flight to model and predict airborne wake vortices in real time. A novel combination of algorithms allows for a relatively simple yet robust wake model to be constructed based on information extracted from a broadcast. The system predicts the location and movement of the wake based on the nominal wake model and correspondingly performs an uncertainty analysis on the wake model to determine a wake hazard zone (no fly zone), which comprises a plurality of wake planes, each moving independently from another. The system selectively adjusts dimensions of each wake plane to minimize spatial and temporal uncertainty, thereby ensuring that the actual wake is within the wake hazard zone. The predicted wake hazard zone is communicated in real time directly to a user via a realistic visual representation. In an example, the wake hazard zone is visualized on a 3-D flight deck display to enable a pilot to visualize or see a neighboring aircraft as well as its wake. The system substantially enhances the pilot's situational awareness and allows for a further safe decrease in spacing, which could alleviate airport and airspace congestion.

  17. Using discharge data to reduce structural deficits in a hydrological model with a Bayesian inference approach and the implications for the prediction of critical source areas

    NASA Astrophysics Data System (ADS)

    Frey, M. P.; Stamm, C.; Schneider, M. K.; Reichert, P.

    2011-12-01

    A distributed hydrological model was used to simulate the distribution of fast runoff formation as a proxy for critical source areas for herbicide pollution in a small agricultural catchment in Switzerland. We tested to what degree predictions based on prior knowledge without local measurements could be improved by relying on observed discharge. This learning process consisted of five steps: For the prior prediction (step 1), knowledge of the model parameters was coarse and predictions were fairly uncertain. In the second step, discharge data were used to update the prior parameter distribution. Effects of uncertainty in input data and model structure were accounted for by an autoregressive error model. This step decreased the width of the marginal distributions of parameters describing the lower boundary (percolation rates) but hardly affected soil hydraulic parameters. Residual analysis (step 3) revealed model structure deficits. We modified the model, and in the subsequent Bayesian updating (step 4) the widths of the posterior marginal distributions were reduced for most parameters compared to those of the prior. This incremental procedure led to a strong reduction in the uncertainty of the spatial prediction. Thus, although only spatially integrated data (discharge) were used, the improved model structure can be expected to improve the spatially distributed predictions as well. The fifth step consisted of a test with independent spatial data on herbicide losses and revealed ambiguous results. The comparison depended critically on the ratio of event to pre-event water that was discharged. This ratio cannot be estimated from hydrological data only. The results demonstrate that the value of local data is strongly dependent on a correct model structure. An iterative procedure of Bayesian updating, model testing, and model modification is suggested.
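
    A minimal sketch of the exact likelihood of a stationary AR(1) error model, the device used above to keep input and structural errors from making the discharge series look more informative than it is (residuals, phi and sigma are generic placeholders):

      import numpy as np

      def ar1_neg_log_lik(residuals, phi, sigma):
          """Negative log-likelihood of model-minus-observation residuals
          under a stationary AR(1) error model:
              e_t = phi * e_{t-1} + eta_t,  eta_t ~ N(0, sigma^2)."""
          e = np.asarray(residuals)
          # First value from the stationary marginal N(0, sigma^2/(1-phi^2))
          var0 = sigma ** 2 / (1.0 - phi ** 2)
          ll = -0.5 * (np.log(2 * np.pi * var0) + e[0] ** 2 / var0)
          innov = e[1:] - phi * e[:-1]
          ll += np.sum(-0.5 * (np.log(2 * np.pi * sigma ** 2)
                               + innov ** 2 / sigma ** 2))
          return -ll

      # Example: evaluate the likelihood for white-noise-like residuals.
      e = np.random.default_rng(0).normal(size=200)
      print(ar1_neg_log_lik(e, phi=0.6, sigma=1.0))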

  18. Global economic trade-offs between wild nature and tropical agriculture.

    PubMed

    Carrasco, Luis R; Webb, Edward L; Symes, William S; Koh, Lian P; Sodhi, Navjot S

    2017-07-01

    Global demands for agricultural and forestry products provide economic incentives for deforestation across the tropics. Much of this deforestation occurs with a lack of information on the spatial distribution of benefits and costs of deforestation. To inform global sustainable land-use policies, we combine geographic information systems (GIS) with a meta-analysis of ecosystem services (ES) studies to perform a spatially explicit analysis of the trade-offs between agricultural benefits, carbon emissions, and losses of multiple ecosystem services because of tropical deforestation from 2000 to 2012. Even though the value of ecosystem services presents large inherent uncertainties, we find a pattern supporting the argument that the externalities of destroying tropical forests are greater than the current direct economic benefits derived from agriculture in all cases bar one: when yield and rent potentials of high-value crops could be realized in the future. Our analysis identifies the Atlantic Forest, areas around the Gulf of Guinea, and Thailand as areas where agricultural conversion appears economically efficient, indicating a major impediment to the long-term financial sustainability of Reducing Emissions from Deforestation and forest Degradation (REDD+) schemes in those countries. By contrast, Latin America, insular Southeast Asia, and Madagascar present areas with low agricultural rents (ARs) and high values in carbon stocks and ES, suggesting that they are economically viable conservation targets. Our study helps identify optimal areas for conservation and agriculture together with their associated uncertainties, which could enhance the efficiency and sustainability of pantropical land-use policies and help direct future research efforts.

  19. Potential Applications of Gosat Based Carbon Budget Products to Refine Terrestrial Ecosystem Model

    NASA Astrophysics Data System (ADS)

    Kondo, M.; Ichii, K.

    2011-12-01

    Estimation of carbon exchange in terrestrial ecosystems is associated with difficulties due to the complex entanglement of physical and biological processes; thus, the net ecosystem productivity (NEP) estimated by simulation often differs among process-based terrestrial ecosystem models. In addition to the complexity of the system, validation can only be conducted at the point scale, since reliable observations are only available from ground measurements. Lacking large-scale spatial data, extension of model simulations to the global scale results in significant uncertainty in the future carbon balance and climate change. The Greenhouse gases Observing SATellite (GOSAT), launched by the Japanese space agency (JAXA) in January 2009, is the first operational satellite expected to deliver the net land-atmosphere carbon budget to the terrestrial biosphere research community. Using that information, the reproducibility of the modeled carbon budget is expected to improve, and hence to give a better estimation of future climate change. This initial analysis seeks to evaluate potential applications of GOSAT observations for the refinement of terrestrial ecosystem models. The present study was conducted in two stages: a site-based analysis using eddy covariance observation data to assess the potential use of terrestrial carbon fluxes (GPP, RE, and NEP) to refine the model, and an extension of the point-scale analysis to the spatial domain using the Carbon Tracker product as a prototype of the GOSAT product. In the first phase of the experiment, it was verified that an optimization routine adapted to a terrestrial model, Biome-BGC, yielded improved results with respect to eddy covariance observation data from the AsiaFlux Network. The spatial data sets used in the second phase consisted of GPP from an empirical algorithm (e.g. support vector machine), NEP from Carbon Tracker, and RE from the combination of these. These spatial carbon flux estimates were used to refine the model by applying exactly the same optimization procedure as in the point analysis, and we found that these spatial data help to improve the model's overall reproducibility. The GOSAT product is expected to have higher accuracy since it uses global CO2 observations. Therefore, with the application of GOSAT data, a better estimation of the terrestrial carbon cycle can be achieved through optimization. More detailed analysis is anticipated upon the arrival of the GOSAT product, to verify the reduction in the uncertainty of the future carbon budget and climate change with the calibrated models, which is the major contribution that can be achieved from GOSAT.

  20. Medical Geography: a Promising Field of Application for Geostatistics.

    PubMed

    Goovaerts, P

    2009-01-01

    The analysis of health data and putative covariates, such as environmental, socio-economic, behavioral or demographic factors, is a promising application for geostatistics. It presents, however, several methodological challenges that arise from the fact that data are typically aggregated over irregular spatial supports and consist of a numerator and a denominator (i.e. population size). This paper presents an overview of recent developments in the field of health geostatistics, with an emphasis on three main steps in the analysis of areal health data: estimation of the underlying disease risk, detection of areas with significantly higher risk, and analysis of relationships with putative risk factors. The analysis is illustrated using age-adjusted cervix cancer mortality rates recorded over the 1970-1994 period for 118 counties of four states in the Western USA. Poisson kriging allows the filtering of noisy mortality rates computed from small population sizes, enhancing the correlation with two putative explanatory variables: the percentage of inhabitants living below the federally defined poverty line, and the percentage of Hispanic females. The area-to-point kriging formulation creates continuous maps of mortality risk, reducing the visual bias associated with the interpretation of choropleth maps. Stochastic simulation is used to generate realizations of cancer mortality maps, which allows one to quantify numerically how the uncertainty about the spatial distribution of health outcomes translates into uncertainty about the location of clusters of high values or the correlation with covariates. Lastly, geographically weighted regression highlights the non-stationarity in the explanatory power of covariates: the higher mortality values along the coast are better explained by the two covariates than the lower risk recorded in Utah.

  1. Variability of ICA decomposition may impact EEG signals when used to remove eyeblink artifacts

    PubMed Central

    Pontifex, Matthew B.; Gwizdala, Kathryn L.; Parks, Andrew C.; Billinger, Martin; Brunner, Clemens

    2017-01-01

    Despite the growing use of independent component analysis (ICA) algorithms for isolating and removing eyeblink-related activity from EEG data, we have limited understanding of how variability associated with ICA uncertainty may be influencing the reconstructed EEG signal after removing the eyeblink artifact components. To characterize the magnitude of this ICA uncertainty and to understand the extent to which it may influence findings within ERP and EEG investigations, ICA decompositions of EEG data from 32 college-aged young adults were repeated 30 times for three popular ICA algorithms. Following each decomposition, eyeblink components were identified and removed. The remaining components were back-projected, and the resulting clean EEG data were further used to analyze ERPs. Findings revealed that ICA uncertainty results in variation in P3 amplitude as well as variation across all EEG sampling points, but differs across ICA algorithms as a function of the spatial location of the EEG channel. This investigation highlights the potential of ICA uncertainty to introduce additional sources of variance when the data are back-projected without artifact components. Careful selection of ICA algorithms and parameters can reduce the extent to which ICA uncertainty may introduce an additional source of variance within ERP/EEG studies. PMID:28026876
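
    A minimal sketch of the decompose/remove/back-project cycle using scikit-learn's FastICA on synthetic data (the channel count, blink topography and component-identification rule are all illustrative; the study used other algorithms and real EEG). Repeating the decomposition with different random_state values exposes the run-to-run variability ("ICA uncertainty") discussed above:

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(2017)
      n_channels, n_samples = 8, 5000

      # Synthetic stand-in for multi-channel EEG: neural-like sources plus
      # a large, frontally weighted eyeblink source.
      sources = rng.normal(0, 1, (n_channels, n_samples))
      blink = (rng.uniform(size=n_samples) < 0.002).astype(float)
      blink = np.convolve(blink, np.hanning(100), mode="same") * 50
      mixing = rng.normal(0, 1, (n_channels, n_channels))
      mixing[:, 0] = np.linspace(3, 0.1, n_channels)   # blink topography
      eeg = mixing @ np.vstack([blink, sources[1:]])

      # Decompose, zero the blink component, and back-project.
      ica = FastICA(n_components=n_channels, random_state=0, max_iter=1000)
      comps = ica.fit_transform(eeg.T)                 # (samples, components)
      blink_idx = np.argmax(np.abs(np.corrcoef(
          np.vstack([blink, comps.T]))[0, 1:]))        # identify blink comp.
      comps[:, blink_idx] = 0.0
      clean = ica.inverse_transform(comps).T           # reconstructed EEG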

  2. Impact of input data uncertainty on environmental exposure assessment models: A case study for electromagnetic field modelling from mobile phone base stations.

    PubMed

    Beekhuizen, Johan; Heuvelink, Gerard B M; Huss, Anke; Bürgi, Alfred; Kromhout, Hans; Vermeulen, Roel

    2014-11-01

    With the increased availability of spatial data and computing power, spatial prediction approaches have become a standard tool for exposure assessment in environmental epidemiology. However, such models are largely dependent on accurate input data. Uncertainties in the input data can therefore have a large effect on model predictions, but are rarely quantified. With Monte Carlo simulation we assessed the effect of input uncertainty on the prediction of radio-frequency electromagnetic fields (RF-EMF) from mobile phone base stations at 252 receptor sites in Amsterdam, The Netherlands. The impact on ranking and classification was determined by computing the Spearman correlations and weighted Cohen's Kappas (based on tertiles of the RF-EMF exposure distribution) between modelled values and RF-EMF measurements performed at the receptor sites. The uncertainty in modelled RF-EMF levels was large, with a median coefficient of variation of 1.5. Uncertainty in receptor site height, building damping and building height contributed most to model output uncertainty. For exposure ranking and classification, the heights of buildings and receptor sites were the most important sources of uncertainty, followed by building damping and antenna and site location. Uncertainty in antenna power, tilt, height and direction had a smaller impact on model performance. We quantified the effect of input data uncertainty on the prediction accuracy of an RF-EMF environmental exposure model, thereby identifying the most important sources of uncertainty and estimating the total uncertainty stemming from potential errors in the input data. This approach can be used to optimize the model and better interpret model output. Copyright © 2014 Elsevier Inc. All rights reserved.
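
    A minimal sketch of the ranking and classification metrics described above (synthetic modelled and measured RF-EMF values are assumed; the study used Kappas based on tertiles, shown here with linear weights as one plausible choice):

      import numpy as np
      from scipy.stats import spearmanr
      from sklearn.metrics import cohen_kappa_score

      def ranking_and_classification_agreement(modelled, measured):
          """Spearman correlation for ranking agreement and a weighted
          Cohen's kappa on exposure tertiles for classification agreement."""
          rho, _ = spearmanr(modelled, measured)
          # Assign tertile classes (0 = low, 1 = medium, 2 = high exposure).
          def tertiles(x):
              return np.digitize(x, np.quantile(x, [1 / 3, 2 / 3]))
          kappa = cohen_kappa_score(tertiles(modelled), tertiles(measured),
                                    weights="linear")
          return rho, kappa

      rng = np.random.default_rng(8)
      measured = rng.lognormal(0.0, 1.0, 252)             # 252 receptor sites
      modelled = measured * rng.lognormal(0.0, 0.8, 252)  # noisy predictions
      print(ranking_and_classification_agreement(modelled, measured))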

  3. A Comparison of Traditional, Step-Path, and Geostatistical Techniques in the Stability Analysis of a Large Open Pit

    NASA Astrophysics Data System (ADS)

    Mayer, J. M.; Stead, D.

    2017-04-01

    With the increased drive towards deeper and more complex mine designs, geotechnical engineers are often forced to reconsider traditional deterministic design techniques in favour of probabilistic methods. These alternative techniques allow for the direct quantification of uncertainties within a risk and/or decision analysis framework. However, conventional probabilistic practices typically discretize geological materials into discrete, homogeneous domains, with attributes defined by spatially constant random variables, despite the fact that geological media display inherent heterogeneous spatial characteristics. This research directly simulates this phenomenon using a geostatistical approach, known as sequential Gaussian simulation. The method utilizes the variogram which imposes a degree of controlled spatial heterogeneity on the system. Simulations are constrained using data from the Ok Tedi mine site in Papua New Guinea and designed to randomly vary the geological strength index and uniaxial compressive strength using Monte Carlo techniques. Results suggest that conventional probabilistic techniques have a fundamental limitation compared to geostatistical approaches, as they fail to account for the spatial dependencies inherent to geotechnical datasets. This can result in erroneous model predictions, which are overly conservative when compared to the geostatistical results.
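
    A minimal sketch of the contrast the abstract draws, assuming a 1-D transect and an exponential variogram; Cholesky factorisation of the covariance stands in for sequential Gaussian simulation, and the GSI statistics are illustrative.

        # Correlated Gaussian fields from a variogram model vs. draws with no
        # spatial structure: spatial averaging damps i.i.d. noise far more,
        # which is why ignoring spatial dependence understates block-scale spread.
        import numpy as np

        rng = np.random.default_rng(2)
        n = 40                                     # points along a 1-D transect
        x = np.linspace(0.0, 200.0, n)             # metres
        sill, a = 1.0, 50.0                        # variogram sill and range
        C = sill * np.exp(-np.abs(x[:, None] - x[None, :]) / a)  # exponential covariance
        L = np.linalg.cholesky(C + 1e-10 * np.eye(n))

        gsi_mean, gsi_sd, n_sim = 45.0, 8.0, 2000  # illustrative GSI statistics
        corr = gsi_mean + gsi_sd * (L @ rng.standard_normal((n, n_sim)))  # correlated
        iid = gsi_mean + gsi_sd * rng.standard_normal((n, n_sim))         # independent

        print("SD of transect-average GSI, correlated :", corr.mean(axis=0).std().round(2))
        print("SD of transect-average GSI, independent:", iid.mean(axis=0).std().round(2))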

  4. Identifying and assessing critical uncertainty thresholds in a forest pest risk model

    Treesearch

    Frank H. Koch; Denys Yemshanov

    2015-01-01

    Pest risk maps can provide helpful decision support for invasive alien species management, but often fail to address adequately the uncertainty associated with their predicted risk values. This chapter explores how increased uncertainty in a risk model’s numeric assumptions (i.e. its principal parameters) might affect the resulting risk map. We used a spatial...

  5. Sensitivity analysis of the GEMS soil organic carbon model to land cover land use classification uncertainties under different climate scenarios in Senegal

    USGS Publications Warehouse

    Dieye, A.M.; Roy, David P.; Hanan, N.P.; Liu, S.; Hansen, M.; Toure, A.

    2012-01-01

    Spatially explicit land cover land use (LCLU) change information is needed to drive biogeochemical models that simulate soil organic carbon (SOC) dynamics. Such information is increasingly being mapped using remotely sensed satellite data with classification schemes and uncertainties constrained by the sensing system, classification algorithms and land cover schemes. In this study, automated LCLU classification of multi-temporal Landsat satellite data was used to assess the sensitivity of SOC modeled by the Global Ensemble Biogeochemical Modeling System (GEMS). The GEMS was run for an area of 1560 km2 in Senegal under three climate change scenarios with LCLU maps generated using different Landsat classification approaches. This research provides a method to estimate the variability of SOC, specifically the SOC uncertainty due to satellite classification errors, which we show is dependent not only on the LCLU classification errors but also on where the LCLU classes occur relative to the other GEMS model inputs.

  6. The GeV Excess Shining Through: Background Systematics for the Inner Galaxy Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calore, Francesca; Cholis, Ilias; Weniger, Christoph

    2015-02-10

    Recently, a spatially extended excess of gamma rays collected by the Fermi-LAT from the inner region of the Milky Way has been detected by different groups and with increasingly sophisticated techniques. Yet, any final conclusion about the morphology and spectral properties of such an extended diffuse emission is subject to a number of potentially critical uncertainties, related to the high density of cosmic rays, gas, magnetic fields and abundance of point sources. We will present a thorough study of the systematic uncertainties related to the modelling of the diffuse background and to the propagation of cosmic rays in the inner part of our Galaxy. We will test a large set of models for the Galactic diffuse emission, generated by varying the propagation parameters within extreme conditions. By using those models in the fit of Fermi-LAT data as Galactic foreground, we will show that the gamma-ray excess survives, and we will quantify the uncertainties on the excess emission morphology and energy spectrum.

  7. Why we need to estimate the sampling uncertainty of eddy covariance flux measurement?

    NASA Astrophysics Data System (ADS)

    Kim, W.; Seo, H. H.

    2015-12-01

    Fruitful studies on the exchanges of energy, water and carbon dioxide between the atmosphere and terrestrial ecosystems have been produced under a global network (http://fluxnet.ornl.gov). The exchange is defined by a flux, and traditionally the flux is estimated with the eddy covariance (EC) method as a mean flux F over 30 min or 1 h, because no technique has been established for directly measuring a momentary flux itself. Exchange analysis with F must therefore pay attention to how spatial or temporal means are estimated: an exchange estimated by the arithmetic mean Fa may be inappropriate, because the samples of F entering the average have nonidentical inherent quality, reflecting different micrometeorological and ecophysiological conditions, even though they are measured with the same instruments. To overcome this issue, we propose the weighted mean Fw, which uses a relative sampling uncertainty ɛ estimated from a sampled F and its uncertainty, and we introduce the performance of Fw tested with EC measurements from various sites.
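
    A short worked sketch of the weighting idea; the inverse-variance weights built from ɛ are a common choice and are assumed here, not taken from the paper, and the flux values are synthetic.

        # Weight each 30-min flux sample by the inverse square of its sampling
        # uncertainty, so noisier samples contribute less than in the
        # arithmetic mean Fa.
        import numpy as np

        rng = np.random.default_rng(3)
        F = rng.normal(5.0, 1.0, 48)        # 30-min fluxes over one day (arbitrary units)
        eps = rng.uniform(0.05, 0.5, 48)    # relative sampling uncertainty of each sample

        w = 1.0 / (eps * np.abs(F)) ** 2    # inverse-variance weights from eps * |F|
        F_a = F.mean()                      # arithmetic mean Fa
        F_w = np.sum(w * F) / np.sum(w)     # weighted mean Fw
        print(f"arithmetic mean {F_a:.3f}, weighted mean {F_w:.3f}")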

  8. Assessment of the impact of climate change on spatiotemporal variability of blue and green water resources under CMIP3 and CMIP5 models in a highly mountainous watershed

    NASA Astrophysics Data System (ADS)

    Fazeli Farsani, Iman; Farzaneh, M. R.; Besalatpour, A. A.; Salehi, M. H.; Faramarzi, M.

    2018-04-01

    The variability and uncertainty of water resources associated with climate change are critical issues in arid and semi-arid regions. In this study, we used the soil and water assessment tool (SWAT) to evaluate the impact of climate change on the spatial and temporal variability of water resources in the Bazoft watershed, Iran. The analysis was based on changes of blue water flow, green water flow, and green water storage for a future period (2010-2099) compared to a historical period (1992-2008). The r-factor, p-factor, R2, and Nash-Sutcliffe coefficients for discharge were 1.02, 0.89, 0.80, and 0.80 for the calibration period and 1.03, 0.76, 0.57, and 0.59 for the validation period, respectively. General circulation models (GCMs) under 18 emission scenarios from the IPCC's Fourth (AR4) and Fifth (AR5) Assessment Reports were fed into the SWAT model. At the sub-basin level, blue water tended to decrease, while green water flow tended to increase under the future scenarios, and green water storage was predicted to continue its historical trend into the future. At the monthly time scale, the 95% prediction uncertainty bands (95PPUs) of blue and green water flows varied widely in the watershed. A large number (18) of climate change scenarios fell within the estimated uncertainty band of the historical period. The large differences among scenarios indicated high levels of uncertainty in the watershed. Our results reveal that the spatial patterns of water resource components and their uncertainties in the context of climate change are notably different between IPCC AR4 and AR5 in the Bazoft watershed. This study provides a strong basis for water supply-demand analyses, and the general analytical framework can be applied to other study areas with similar challenges.

  9. Modeling uncertainty and correlation in soil properties using Restricted Pairing and implications for ensemble-based hillslope-scale soil moisture and temperature estimation

    NASA Astrophysics Data System (ADS)

    Flores, A. N.; Entekhabi, D.; Bras, R. L.

    2007-12-01

    Soil hydraulic and thermal properties (SHTPs) affect both the rate of moisture redistribution in the soil column and the volumetric soil water capacity. Adequately constraining these properties through field and lab analysis to parameterize spatially-distributed hydrology models is often prohibitively expensive. Because SHTPs vary significantly at small spatial scales, individual soil samples are also only reliably indicative of local conditions, and these properties remain a significant source of uncertainty in soil moisture and temperature estimation. In ensemble-based soil moisture data assimilation, uncertainty in the model-produced prior estimate due to associated uncertainty in SHTPs must be taken into account to avoid under-dispersive ensembles. To treat SHTP uncertainty for purposes of supplying inputs to a distributed watershed model we use the restricted pairing (RP) algorithm, an extension of Latin Hypercube (LH) sampling. The RP algorithm generates an arbitrary number of SHTP combinations by sampling the appropriate marginal distributions of the individual soil properties using the LH approach, while imposing a target rank correlation among the properties. A previously-published meta-database of 1309 soils representing 12 textural classes is used to fit appropriate marginal distributions to the properties and compute the target rank correlation structure, conditioned on soil texture. Given categorical soil textures, our implementation of the RP algorithm generates an arbitrarily-sized ensemble of realizations of the SHTPs required as input to the TIN-based Realtime Integrated Basin Simulator with vegetation dynamics (tRIBS+VEGGIE) distributed parameter ecohydrology model. Soil moisture ensembles simulated with RP-generated SHTPs exhibit less variance than ensembles simulated with SHTPs generated by a scheme that neglects correlation among properties. Neglecting correlation among SHTPs can lead to physically unrealistic combinations of parameters that exhibit implausible hydrologic behavior when input to the tRIBS+VEGGIE model.
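
    A sketch of Latin hypercube sampling with an imposed rank correlation, using the classic Iman-Conover construction as a stand-in for the restricted pairing algorithm; the two marginals and the target correlation are illustrative, not fitted to the soils database.

        # LH-sample two marginals, then rank-reorder the samples so their
        # Spearman correlation approximately matches a target value.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        n = 500
        target = np.array([[1.0, -0.6],      # hypothetical target rank correlation
                           [-0.6, 1.0]])     # between two soil properties

        # Latin hypercube samples of the marginals (one stratum per sample)
        u = (np.argsort(rng.random((2, n)), axis=1) + rng.random((2, n))) / n
        ks = stats.lognorm(s=1.0).ppf(u[0])      # e.g. saturated conductivity
        phi = stats.norm(0.4, 0.05).ppf(u[1])    # e.g. porosity

        # Iman-Conover: build correlated "scores", then pair samples by rank
        scores = stats.norm.ppf(np.arange(1, n + 1) / (n + 1))
        S = np.column_stack([rng.permutation(scores), rng.permutation(scores)])
        S = S @ np.linalg.cholesky(target).T     # scores with the target correlation
        X = np.column_stack([ks, phi])
        for j in range(2):
            ranks = np.argsort(np.argsort(S[:, j]))
            X[:, j] = np.sort(X[:, j])[ranks]    # marginals preserved, ranks imposed

        print("achieved rank correlation:",
              round(stats.spearmanr(X[:, 0], X[:, 1])[0], 2))

    Because only the pairing of samples changes, each marginal distribution is preserved exactly while the joint rank structure approximately attains the target, which is the property the abstract exploits.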

  10. Methane emissions from global wetlands: An assessment of the uncertainty associated with various wetland extent data sets

    NASA Astrophysics Data System (ADS)

    Zhang, Bowen; Tian, Hanqin; Lu, Chaoqun; Chen, Guangsheng; Pan, Shufen; Anderson, Christopher; Poulter, Benjamin

    2017-09-01

    A wide range of estimates of global wetland methane (CH4) fluxes has been reported during the recent two decades. This gives rise to an urgent need to clarify and identify the uncertainty sources and to reach a reconciled estimate of global CH4 fluxes from wetlands. Most estimates using a bottom-up approach rely on wetland data sets, but these data sets are largely inconsistent in terms of both wetland extent and spatiotemporal distribution. A quantitative assessment of the uncertainties associated with these discrepancies among wetland data sets has not yet been well investigated. By comparing five widely used global wetland data sets (GISS, GLWD, Kaplan, GIEMS and SWAMPS-GLWD) in this study, we found large differences in wetland extent, ranging from 5.3 to 10.2 million km2, as well as in their spatial and temporal distributions. These discrepancies in wetland data sets resulted in large biases in model-estimated global wetland CH4 emissions as simulated by the Dynamic Land Ecosystem Model (DLEM). The model simulations indicated that mean global wetland CH4 emissions during 2000-2007 were 177.2 ± 49.7 Tg CH4 yr-1, based on the five different data sets. The tropical regions contributed the largest portion of estimated CH4 emissions from global wetlands, but also had the largest discrepancy. Among the six continents, the largest uncertainty was found in South America. Thus, improved estimates of wetland extent and CH4 emissions in the tropical regions and South America would be a critical step toward an accurate estimate of global CH4 emissions. This uncertainty analysis also reveals an important need for the scientific community to generate a global-scale wetland data set with higher spatial resolution and shorter time interval, by integrating multiple sources of field and satellite data with modeling approaches, for cross-scale extrapolation.

  11. And yet it moves! Involving transient flow conditions is the logical next step for WHPA analysis

    NASA Astrophysics Data System (ADS)

    Rodriguez-Pretelin, A.; Nowak, W.

    2017-12-01

    As the first line of defense among different safety measures, Wellhead Protection Areas (WHPAs) have been broadly used to protect drinking water wells against sources of pollution. In most cases, their implementation relies on simplifications, such as assuming homogeneous or zonated aquifer conditions or considering steady-state flow scenarios. Both assumptions inevitably introduce errors. However, while uncertainty due to aquifer heterogeneity has been studied extensively in the literature, the impact of transient flow conditions has received very little attention. For instance, the WHPA maps in the offices of water supply companies are fixed maps derived from steady-state models, although the actual catchments are transient. To mitigate high computational costs, we approximate transiency by means of a dynamic superposition of steady-state flow solutions. We then analyze four transient drivers that often act on the seasonal scale: (I) regional groundwater flow direction, (II) strength of the regional hydraulic gradient, (III) natural recharge to the groundwater and (IV) pumping rate. Integrating transiency in WHPA analysis leads to time-frequency maps, which express for each location the temporal frequency of catchment membership. Furthermore, we account for the uncertainty due to incomplete knowledge of geological and transiency conditions through Monte Carlo simulations. The main contribution of this study is to show the need to enhance groundwater well protection by considering transient flow conditions during WHPA analysis. To support and complement this statement, we demonstrate that 1) each transient driver imprints an individual spatial pattern on the required WHPA, and we rank their influence through a global sensitivity analysis; 2) we compare the influence of transient conditions with that of geological uncertainty in terms of areal WHPA demand; 3) we show that considering geological uncertainty alone is insufficient in the presence of transient conditions; and 4) we propose a practical decision rule for selecting a proper protection reliability level in the presence of both transiency and geological uncertainty.

  12. Volcano deformation source parameters estimated from InSAR: Sensitivities to uncertainties in seismic tomography

    USGS Publications Warehouse

    Masterlark, Timothy; Donovan, Theodore; Feigl, Kurt L.; Haney, Matt; Thurber, Clifford H.; Tung, Sui

    2016-01-01

    The eruption cycle of a volcano is controlled in part by the upward migration of magma. The characteristics of the magma flux produce a deformation signature at the Earth's surface. Inverse analyses use geodetic data to estimate strategic controlling parameters that describe the position and pressurization of a magma chamber at depth. The specific distribution of material properties controls how observed surface deformation translates to source parameter estimates. Seismic tomography models describe the spatial distributions of material properties that are necessary for accurate models of volcano deformation. This study investigates how uncertainties in seismic tomography models propagate into variations in the estimates of volcano deformation source parameters inverted from geodetic data. We conduct finite element model-based nonlinear inverse analyses of interferometric synthetic aperture radar (InSAR) data for Okmok volcano, Alaska, as an example. We then analyze the estimated parameters and their uncertainties to characterize the magma chamber. Analyses are performed separately for models simulating a pressurized chamber embedded in a homogeneous domain as well as for a domain having a heterogeneous distribution of material properties according to seismic tomography. The estimated depth of the source is sensitive to the distribution of material properties. The estimated depths for the homogeneous and heterogeneous domains are 2666 ± 42 and 3527 ± 56 m below mean sea level, respectively (99% confidence). A Monte Carlo analysis indicates that uncertainties of the seismic tomography cannot account for this discrepancy at the 99% confidence level. Accounting for the spatial distribution of elastic properties according to seismic tomography significantly improves the fit of the deformation model predictions and significantly influences estimates for parameters that describe the location of a pressurized magma chamber.

  13. Quantifying aggregated uncertainty in Plasmodium falciparum malaria prevalence and populations at risk via efficient space-time geostatistical joint simulation.

    PubMed

    Gething, Peter W; Patil, Anand P; Hay, Simon I

    2010-04-01

    Risk maps estimating the spatial distribution of infectious diseases are required to guide public health policy from local to global scales. The advent of model-based geostatistics (MBG) has allowed these maps to be generated in a formal statistical framework, providing robust metrics of map uncertainty that enhances their utility for decision-makers. In many settings, decision-makers require spatially aggregated measures over large regions such as the mean prevalence within a country or administrative region, or national populations living under different levels of risk. Existing MBG mapping approaches provide suitable metrics of local uncertainty--the fidelity of predictions at each mapped pixel--but have not been adapted for measuring uncertainty over large areas, due largely to a series of fundamental computational constraints. Here the authors present a new efficient approximating algorithm that can generate for the first time the necessary joint simulation of prevalence values across the very large prediction spaces needed for global scale mapping. This new approach is implemented in conjunction with an established model for P. falciparum allowing robust estimates of mean prevalence at any specified level of spatial aggregation. The model is used to provide estimates of national populations at risk under three policy-relevant prevalence thresholds, along with accompanying model-based measures of uncertainty. By overcoming previously unchallenged computational barriers, this study illustrates how MBG approaches, already at the forefront of infectious disease mapping, can be extended to provide large-scale aggregate measures appropriate for decision-makers.
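
    A toy sketch of why joint simulation matters for aggregated measures: per-pixel intervals alone cannot yield the distribution of a population-weighted national mean, but joint realisations can. The correlated field below is a crude stand-in for MBG posterior draws; all numbers are invented.

        # Aggregate joint realisations of a prevalence field to national-level
        # summaries, the operation the approximating algorithm makes feasible.
        import numpy as np

        rng = np.random.default_rng(5)
        n_pix, n_real = 400, 1000
        x = np.arange(n_pix, dtype=float)
        C = 0.02 * np.exp(-np.abs(x[:, None] - x[None, :]) / 30.0)  # spatial covariance
        L = np.linalg.cholesky(C + 1e-12 * np.eye(n_pix))
        prev = np.clip(0.2 + L @ rng.standard_normal((n_pix, n_real)), 0, 1)
        pop = rng.integers(100, 5000, n_pix).astype(float)          # pixel populations

        national = (pop[:, None] * prev).sum(axis=0) / pop.sum()    # one value per draw
        lo, hi = np.percentile(national, [2.5, 97.5])
        print(f"national mean prevalence: {national.mean():.3f} (95% CI {lo:.3f}-{hi:.3f})")

        par = (pop[:, None] * (prev > 0.25)).sum(axis=0)            # population at risk
        print(f"population at risk above 25% prevalence: median {np.median(par):,.0f}")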

  14. Towards robust quantification and reduction of uncertainty in hydrologic predictions: Integration of particle Markov chain Monte Carlo and factorial polynomial chaos expansion

    NASA Astrophysics Data System (ADS)

    Wang, S.; Huang, G. H.; Baetz, B. W.; Ancell, B. C.

    2017-05-01

    Particle filtering techniques have been receiving increasing attention from the hydrologic community due to their ability to properly estimate model parameters and states of nonlinear and non-Gaussian systems. To facilitate a robust quantification of uncertainty in hydrologic predictions, it is necessary to explicitly examine the forward propagation and evolution of parameter uncertainties and their interactions that affect the predictive performance. This paper presents a unified probabilistic framework that merges the strengths of particle Markov chain Monte Carlo (PMCMC) and factorial polynomial chaos expansion (FPCE) algorithms to robustly quantify and reduce uncertainties in hydrologic predictions. A Gaussian anamorphosis technique is used to establish a seamless bridge between the data assimilation using the PMCMC and the uncertainty propagation using the FPCE through a straightforward transformation of posterior distributions of model parameters. The unified probabilistic framework is applied to the Xiangxi River watershed of the Three Gorges Reservoir (TGR) region in China to demonstrate its validity and applicability. Results reveal that the degree of spatial variability of soil moisture capacity is the most identifiable model parameter with the fastest convergence through the streamflow assimilation process. The potential interaction between the spatial variability in soil moisture conditions and the maximum soil moisture capacity has the most significant effect on the performance of streamflow predictions. In addition, parameter sensitivities and interactions vary in magnitude and direction over time due to temporal and spatial dynamics of hydrologic processes.
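
    A minimal sketch of the propagation half of such a framework: a non-intrusive one-dimensional polynomial chaos expansion in Hermite polynomials of a Gaussian parameter, built by quadrature. The "model" is a toy stand-in for a hydrologic simulator, and the posterior mean and standard deviation are placeholders.

        # Expand a model output in probabilists' Hermite polynomials of a
        # Gaussian parameter; the coefficients give mean and variance directly.
        import numpy as np
        from numpy.polynomial import hermite_e as He
        from math import factorial, sqrt, pi

        def model(theta):                     # hypothetical model response
            return np.exp(0.3 * theta) + 0.1 * theta ** 2

        mu, sigma, order = 1.0, 0.25, 6       # assumed posterior mean/sd of parameter
        nodes, weights = He.hermegauss(20)    # quadrature for weight exp(-x^2/2)
        weights = weights / sqrt(2 * pi)      # normalise to the standard normal pdf

        # Spectral coefficients c_k = E[f(mu + sigma*x) He_k(x)] / k!
        coef = np.array([
            np.sum(weights * model(mu + sigma * nodes)
                   * He.hermeval(nodes, np.eye(order + 1)[k])) / factorial(k)
            for k in range(order + 1)
        ])
        var_pce = sum(factorial(k) * coef[k] ** 2 for k in range(1, order + 1))

        mc = model(mu + sigma * np.random.default_rng(6).standard_normal(200_000))
        print(f"PCE mean {coef[0]:.4f}, var {var_pce:.6f}")
        print(f"MC  mean {mc.mean():.4f}, var {mc.var():.6f}")

    Once the coefficients are known, moments and sensitivities come from algebra on the expansion rather than from repeated model runs, which is the computational appeal of PCE-based propagation.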

  15. A hybrid finite element - statistical energy analysis approach to robust sound transmission modeling

    NASA Astrophysics Data System (ADS)

    Reynders, Edwin; Langley, Robin S.; Dijckmans, Arne; Vermeir, Gerrit

    2014-09-01

    When considering the sound transmission through a wall in between two rooms, in an important part of the audio frequency range, the local response of the rooms is highly sensitive to uncertainty in spatial variations in geometry, material properties and boundary conditions, which have a wave scattering effect, while the local response of the wall is rather insensitive to such uncertainty. For this mid-frequency range, a computationally efficient modeling strategy is adopted that accounts for this uncertainty. The partitioning wall is modeled deterministically, e.g. with finite elements. The rooms are modeled in a very efficient, nonparametric stochastic way, as in statistical energy analysis. All components are coupled by means of a rigorous power balance. This hybrid strategy is extended so that the mean and variance of the sound transmission loss can be computed as well as the transition frequency that loosely marks the boundary between low- and high-frequency behavior of a vibro-acoustic component. The method is first validated in a simulation study, and then applied for predicting the airborne sound insulation of a series of partition walls of increasing complexity: a thin plastic plate, a wall consisting of gypsum blocks, a thicker masonry wall and a double glazing. It is found that the uncertainty caused by random scattering is important except at very high frequencies, where the modal overlap of the rooms is very high. The results are compared with laboratory measurements, and both are found to agree within the prediction uncertainty in the considered frequency range.

  16. Water resources of the Black Sea Basin at high spatial and temporal resolution

    NASA Astrophysics Data System (ADS)

    Rouholahnejad, Elham; Abbaspour, Karim C.; Srinivasan, Raghavan; Bacu, Victor; Lehmann, Anthony

    2014-07-01

    The pressure on water resources, deteriorating water quality, and uncertainties associated with climate change create an environment of conflict in large and complex river systems. The Black Sea Basin (BSB), in particular, suffers from ecological unsustainability and inadequate resource management leading to severe environmental, social, and economical problems. To better tackle the future challenges, we used the Soil and Water Assessment Tool (SWAT) to model the hydrology of the BSB coupling water quantity, water quality, and crop yield components. The hydrological model of the BSB was calibrated and validated considering sensitivity and uncertainty analysis. River discharges, nitrate loads, and crop yields were used to calibrate the model. Employing grid technology improved calibration computation time by more than an order of magnitude. We calculated components of water resources such as river discharge, infiltration, aquifer recharge, soil moisture, and actual and potential evapotranspiration. Furthermore, available water resources were calculated at subbasin spatial and monthly temporal levels. Within this framework, a comprehensive database of the BSB was created to fill the existing gaps in water resources data in the region. In this paper, we discuss the challenges of building a large-scale model in fine spatial and temporal detail. This study provides the basis for further research on the impacts of climate and land use change on water resources in the BSB.

  17. Shale Gas Development and Brook Trout: Scaling Best Management Practices to Anticipate Cumulative Effects

    USGS Publications Warehouse

    Smith, David; Snyder, Craig D.; Hitt, Nathaniel P.; Young, John A.; Faulkner, Stephen P.

    2012-01-01

    Shale gas development may involve trade-offs between energy development and benefits provided by natural ecosystems. However, current best management practices (BMPs) focus on mitigating localized ecological degradation. We review evidence for cumulative effects of natural gas development on brook trout (Salvelinus fontinalis) and conclude that BMPs should account for potential watershed-scale effects in addition to localized influences. The challenge is to develop BMPs in the face of uncertainty in the predicted response of brook trout to landscape-scale disturbance caused by gas extraction. We propose a decision-analysis approach to formulating BMPs in the specific case of relatively undisturbed watersheds where there is consensus to maintain brook trout populations during gas development. The decision analysis was informed by existing empirical models that describe brook trout occupancy responses to landscape disturbance and set bounds on the uncertainty in the predicted responses to shale gas development. The decision analysis showed that a high efficiency of gas development (e.g., 1 well pad per square mile and 7 acres per pad) was critical to achieving a win-win solution characterized by maintaining brook trout and maximizing extraction of available gas. This finding was invariant to uncertainty in predicted response of brook trout to watershed-level disturbance. However, as the efficiency of gas development decreased, the optimal BMP depended on the predicted response, and there was considerable potential value in discriminating among predictive models through adaptive management or research. The proposed decision-analysis framework provides an opportunity to anticipate the cumulative effects of shale gas development, account for uncertainty, and inform management decisions at the appropriate spatial scales.

  18. Observational uncertainty and regional climate model evaluation: A pan-European perspective

    NASA Astrophysics Data System (ADS)

    Kotlarski, Sven; Szabó, Péter; Herrera, Sixto; Räty, Olle; Keuler, Klaus; Soares, Pedro M.; Cardoso, Rita M.; Bosshard, Thomas; Pagé, Christian; Boberg, Fredrik; Gutiérrez, José M.; Jaczewski, Adam; Kreienkamp, Frank; Liniger, Mark A.; Lussana, Cristian; Szepszo, Gabriella

    2017-04-01

    Local and regional climate change assessments based on downscaling methods crucially depend on the existence of accurate and reliable observational reference data. In dynamical downscaling via regional climate models (RCMs) observational data can influence model development itself and, later on, model evaluation, parameter calibration and added value assessment. In empirical-statistical downscaling, observations serve as predictand data and directly influence model calibration with corresponding effects on downscaled climate change projections. Focusing on the evaluation of RCMs, we here analyze the influence of uncertainties in observational reference data on evaluation results in a well-defined performance assessment framework and on a European scale. For this purpose we employ three different gridded observational reference datasets, namely (1) the well-established EOBS dataset, (2) the recently developed EURO4M-MESAN regional re-analysis, and (3) several national high-resolution and quality-controlled gridded datasets that recently became available. In terms of climate models, five reanalysis-driven experiments carried out by five different RCMs within the EURO-CORDEX framework are used. Two variables (temperature and precipitation) and a range of evaluation metrics that reflect different aspects of RCM performance are considered. We furthermore include an illustrative model ranking exercise and relate observational spread to RCM spread. The results obtained indicate a varying influence of observational uncertainty on model evaluation depending on the variable, the season, the region and the specific performance metric considered. Over most parts of the continent, the influence of the choice of the reference dataset for temperature is rather small for seasonal mean values and inter-annual variability. Here, model uncertainty (as measured by the spread between the five RCM simulations considered) is typically much larger than reference data uncertainty. For parameters of the daily temperature distribution and for the spatial pattern correlation, however, important dependencies on the reference dataset can arise. The related evaluation uncertainties can be as large or even larger than model uncertainty. For precipitation the influence of observational uncertainty is, in general, larger than for temperature. It often dominates model uncertainty especially for the evaluation of the wet day frequency, the spatial correlation and the shape and location of the distribution of daily values. But even the evaluation of large-scale seasonal mean values can be considerably affected by the choice of the reference. When employing a simple and illustrative model ranking scheme on these results it is found that RCM ranking in many cases depends on the reference dataset employed.
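
    A compact sketch of the ranking exercise under stated assumptions: synthetic fields stand in for the RCM simulations and the three reference datasets, and RMSE is used as the single illustrative metric.

        # Score several "RCMs" against references with different error levels
        # and print the resulting rankings, which may differ across references.
        import numpy as np

        rng = np.random.default_rng(7)
        truth = rng.normal(8.0, 3.0, 500)                  # unobservable real climate
        refs = {name: truth + rng.normal(0, s, 500)        # references with own errors
                for name, s in [("refA", 0.3), ("refB", 0.8), ("refC", 1.5)]}
        models = {f"RCM{i}": truth + rng.normal(0.2 * i, 0.5 + 0.2 * i, 500)
                  for i in range(1, 6)}

        for ref_name, ref in refs.items():
            rmse = {m: float(np.sqrt(np.mean((sim - ref) ** 2)))
                    for m, sim in models.items()}
            ranking = sorted(rmse, key=rmse.get)
            print(f"{ref_name}: {' < '.join(ranking)}")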

  19. Uncertainty on shallow landslide hazard assessment: from field data to hazard mapping

    NASA Astrophysics Data System (ADS)

    Trefolini, Emanuele; Tolo, Silvia; Patelli, Eduardo; Broggi, Matteo; Disperati, Leonardo; Le Tuan, Hai

    2015-04-01

    Shallow landsliding that involves Hillslope Deposits (HD), the surficial soil that covers the bedrock, is an important process of erosion, transport and deposition of sediment along hillslopes. Although shallow landslides generally mobilize relatively small volumes of material, they represent the most hazardous factor in mountain regions due to their high velocity and the common absence of warning signs. Moreover, increasing urbanization and likely climate change make shallow landslides a source of widespread risk; the interest of the scientific community in this process has therefore grown over the last three decades. One of the main aims of research projects on this topic is to perform robust shallow landslide hazard assessment for wide areas (regional assessment), in order to support sustainable spatial planning. Currently, three main methodologies may be implemented to assess regional shallow landslide hazard: expert evaluation, probabilistic (or data mining) methods, and methods based on physical models. The aim of this work is to evaluate the uncertainty of shallow landslide hazard assessment based on physical models, taking into account spatial variables such as geotechnical and hydrogeologic parameters as well as hillslope morphometry. To achieve this goal, a wide dataset of geotechnical properties (shear strength, permeability, depth and unit weight) of HD was gathered by integrating field survey, in situ and laboratory tests. This spatial database was collected from a study area of about 350 km2 including different bedrock lithotypes and geomorphological features. The uncertainty associated with each step of the hazard assessment process (e.g. field data collection, regionalization of site-specific information and numerical modelling of hillslope stability) was carefully characterized. The most appropriate probability density function (PDF) was chosen for each numerical variable, and we assessed the uncertainty propagation on HD strength parameters obtained by empirical relations with geotechnical index properties. Site-specific information was regionalized at map scale by (hard and fuzzy) clustering analysis taking into account spatial variables such as geology, geomorphology and hillslope morphometric variables (longitudinal and transverse curvature, flow accumulation and slope), the latter derived from a DEM with 10 m cell size. In order to map shallow landslide hazard, Monte Carlo simulation was performed for some common physically based models available in the literature (e.g. SINMAP, SHALSTAB, TRIGRS). Furthermore, a new approach based on the use of Bayesian Networks was proposed and validated. Different models, such as intervals, convex models and fuzzy sets, were adopted for the modelling of input parameters. Finally, an accuracy assessment was carried out on the resulting maps, and the propagation of the uncertainty of input parameters into the final shallow landslide hazard estimation was quantified. The outcomes of the analysis are compared and discussed in terms of the discrepancy among map pixel values and the related estimated error. The novelty of the proposed method lies in the estimation of the confidence of shallow landslide hazard mapping at the regional level. This allows i) discriminating regions where the hazard assessment is robust from areas where more data are necessary to increase the confidence level, and ii) assessing the reliability of the procedure used for hazard assessment.
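
    A minimal sketch of the Monte Carlo step behind SINMAP/SHALSTAB-type mapping: propagate parameter PDFs through the infinite-slope factor-of-safety equation for one map cell. All distributions and site values below are illustrative, not from the study dataset.

        # Monte Carlo propagation through the infinite-slope stability model:
        # FS = [c' + (gamma - m*gamma_w) z cos^2(theta) tan(phi)]
        #      / [gamma z sin(theta) cos(theta)]
        import numpy as np

        rng = np.random.default_rng(8)
        n = 100_000
        slope = np.deg2rad(32.0)                    # hillslope angle
        z = 1.2                                     # soil depth (m)
        gamma, gamma_w = 18.0, 9.81                 # unit weights (kN/m3)
        c = rng.lognormal(np.log(4.0), 0.4, n)      # effective cohesion (kPa)
        phi = np.deg2rad(rng.normal(33.0, 3.0, n))  # friction angle
        m = rng.uniform(0.2, 1.0, n)                # relative water-table height

        fs = (c + (gamma - m * gamma_w) * z * np.cos(slope) ** 2 * np.tan(phi)) / (
            gamma * z * np.sin(slope) * np.cos(slope))
        print(f"P(FS < 1) = {np.mean(fs < 1):.3f}")

    Repeating this per cell with regionalized parameter PDFs yields a failure-probability map, and the spread of P(FS < 1) across plausible PDFs is one way to express the mapping confidence the abstract targets.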

  20. Spatially resolved estimation of ozone-related mortality in the United States under two representative concentration pathways (RCPs) and their uncertainty

    DOE PAGES

    Kim, Young-Min; Zhou, Ying; Gao, Yang; ...

    2014-11-16

    We report that the spatial pattern of the uncertainty in air pollution-related health impacts due to climate change has rarely been studied due to the lack of high-resolution model simulations, especially under the Representative Concentration Pathways (RCPs), the latest greenhouse gas emission pathways. We estimated future tropospheric ozone (O3) and related excess mortality and evaluated the associated uncertainties in the continental United States under RCPs. Based on dynamically downscaled climate model simulations, we calculated changes in O3 level at 12 km resolution between the future (2057 and 2059) and base years (2001–2004) under a low-to-medium emission scenario (RCP4.5) and a fossil fuel intensive emission scenario (RCP8.5). We then estimated the excess mortality attributable to changes in O3. Finally, we analyzed the sensitivity of the excess mortality estimates to the input variables and the uncertainty in the excess mortality estimation using Monte Carlo simulations. O3-related premature deaths in the continental U.S. were estimated to be 1312 deaths/year under RCP8.5 (95% confidence interval (CI): 427 to 2198) and −2118 deaths/year under RCP4.5 (95% CI: −3021 to −1216), when allowing for climate change and emissions reduction. The uncertainty of O3-related excess mortality estimates was mainly caused by RCP emissions pathways. Finally, excess mortality estimates attributable to the combined effect of climate and emission changes on O3 as well as the associated uncertainties vary substantially in space and so do the most influential input variables. Spatially resolved data is crucial to develop effective community level mitigation and adaptation policy.
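
    A sketch of the Monte Carlo mortality estimation using a standard log-linear health impact function; the concentration-response coefficient, ozone change and population below are hypothetical placeholders, not the study's inputs.

        # Propagate uncertainty in the concentration-response coefficient and
        # the ozone change into excess-mortality estimates for one grid cell.
        import numpy as np

        rng = np.random.default_rng(9)
        n = 50_000
        pop = 2.0e6                                 # exposed population (hypothetical)
        y0 = 0.008                                  # baseline mortality (per person-yr)
        beta = rng.normal(0.00052, 0.00013, n)      # CR coefficient per ppb (hypothetical)
        d_o3 = rng.normal(3.0, 1.5, n)              # change in O3 (ppb) under a scenario

        excess = pop * y0 * (1.0 - np.exp(-beta * d_o3))   # deaths/year per realisation
        lo, med, hi = np.percentile(excess, [2.5, 50, 97.5])
        print(f"excess mortality: {med:.0f} deaths/yr (95% CI {lo:.0f} to {hi:.0f})")

    Running this cell by cell is what makes the resulting uncertainty spatially resolved: both the central estimate and the width of the interval vary with local population and local ozone change.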

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardin, Ernest; Hadgu, Teklu; Greenberg, Harris

    This report is one follow-on to a study of reference geologic disposal design concepts (Hardin et al. 2011a). Based on an analysis of maximum temperatures, that study concluded that certain disposal concepts would require extended decay storage prior to emplacement, or the use of small waste packages, or both. The study used nominal values for thermal properties of host geologic media and engineered materials, demonstrating the need for uncertainty analysis to support the conclusions. This report is a first step that identifies the input parameters of the maximum temperature calculation, surveys published data on measured values, uses an analytical approach to determine which parameters are most important, and performs an example sensitivity analysis. Using results from this first step, temperature calculations planned for FY12 can focus on only the important parameters, and can use the uncertainty ranges reported here. The survey of published information on thermal properties of geologic media and engineered materials is intended to be sufficient for use in generic calculations to evaluate the feasibility of reference disposal concepts. A full compendium of literature data is beyond the scope of this report. The term “uncertainty” is used here to represent both measurement uncertainty and spatial variability, or variability across host geologic units. For the most important parameters (e.g., buffer thermal conductivity) the extent of literature data surveyed samples these different forms of uncertainty and variability. Finally, this report is intended to be one chapter or section of a larger FY12 deliverable summarizing all the work on design concepts and thermal load management for geologic disposal (M3FT-12SN0804032, due 15Aug2012).

  2. An uncertainty and sensitivity analysis applied to the prioritisation of pharmaceuticals as surface water contaminants from wastewater treatment plant direct emissions.

    PubMed

    Morais, Sérgio Alberto; Delerue-Matos, Cristina; Gabarrell, Xavier

    2014-08-15

    In this study, the concentration probability distributions of 82 pharmaceutical compounds detected in the effluents of 179 European wastewater treatment plants were computed and inserted into a multimedia fate model. The comparative ecotoxicological impact of the direct emission of these compounds from wastewater treatment plants on freshwater ecosystems, based on a potentially affected fraction (PAF) of species approach, was assessed to rank compounds based on priority. As many pharmaceuticals are acids or bases, the multimedia fate model accounts for regressions to estimate pH-dependent fate parameters. An uncertainty analysis was performed by means of Monte Carlo analysis, which included the uncertainty of fate and ecotoxicity model input variables, as well as the spatial variability of landscape characteristics on the European continental scale. Several pharmaceutical compounds were identified as being of greatest concern, including 7 analgesics/anti-inflammatories, 3 β-blockers, 3 psychiatric drugs, and 1 each of 6 other therapeutic classes. The fate and impact modelling relied extensively on estimated data, given that most of these compounds have little or no experimental fate or ecotoxicity data available, as well as a limited reported occurrence in effluents. The contribution of estimated model input variables to the variance of freshwater ecotoxicity impact, as well as the lack of experimental abiotic degradation data for most compounds, helped in establishing priorities for further testing. Generally, the effluent concentration and the ecotoxicity effect factor were the model input variables with the most significant effect on the uncertainty of output results. Copyright © 2014. Published by Elsevier B.V.

  3. Sources of Uncertainty in Predicting Land Surface Fluxes Using Diverse Data and Models

    NASA Technical Reports Server (NTRS)

    Dungan, Jennifer L.; Wang, Weile; Michaelis, Andrew; Votava, Petr; Nemani, Ramakrishna

    2010-01-01

    In the domain of predicting land surface fluxes, models are used to bring data from large observation networks and satellite remote sensing together to make predictions about present and future states of the Earth. Characterizing the uncertainty about such predictions is a complex process and one that is not yet fully understood. Uncertainty exists about initialization, measurement and interpolation of input variables; model parameters; model structure; and mixed spatial and temporal supports. Multiple models or structures often exist to describe the same processes. Uncertainty about structure is currently addressed by running an ensemble of different models and examining the distribution of model outputs. To illustrate structural uncertainty, a multi-model ensemble experiment we have been conducting using the Terrestrial Observation and Prediction System (TOPS) will be discussed. TOPS uses public versions of process-based ecosystem models that use satellite-derived inputs along with surface climate data and land surface characterization to produce predictions of ecosystem fluxes including gross and net primary production and net ecosystem exchange. Using the TOPS framework, we have explored the uncertainty arising from the application of models with different assumptions, structures, parameters, and variable definitions. With a small number of models, this only begins to capture the range of possible spatial fields of ecosystem fluxes. Few attempts have been made to systematically address the components of uncertainty in such a framework. We discuss the characterization of uncertainty for this approach including both quantifiable and poorly known aspects.

  4. Assessment of a Bayesian Belief Network-GIS framework as a practical tool to support marine planning.

    PubMed

    Stelzenmüller, V; Lee, J; Garnacho, E; Rogers, S I

    2010-10-01

    For the UK continental shelf we developed a Bayesian Belief Network-GIS framework to visualise relationships between cumulative human pressures, sensitive marine landscapes and landscape vulnerability, to assess the consequences of potential marine planning objectives, and to map uncertainty-related changes in management measures. Results revealed that the spatial assessment of footprints and intensities of human activities had more influence on landscape vulnerabilities than the type of landscape sensitivity measure used. We addressed questions regarding consequences of potential planning targets, and necessary management measures with spatially-explicit assessment of their consequences. We conclude that the BN-GIS framework is a practical tool allowing for the visualisation of relationships, the spatial assessment of uncertainty related to spatial management scenarios, the engagement of different stakeholder views, and enables a quick update of new spatial data and relationships. Ultimately, such BN-GIS based tools can support the decision-making process used in adaptive marine management. Copyright © 2010 Elsevier Ltd. All rights reserved.

  5. Intensity-Duration-Frequency curves from remote sensing datasets: direct comparison of weather radar and CMORPH over the Eastern Mediterranean

    NASA Astrophysics Data System (ADS)

    Morin, Efrat; Marra, Francesco; Peleg, Nadav; Mei, Yiwen; Anagnostou, Emmanouil N.

    2017-04-01

    Rainfall frequency analysis is used to quantify the probability of occurrence of extreme rainfall and is traditionally based on rain gauge records. The limited spatial coverage of rain gauges is insufficient to sample the spatiotemporal variability of extreme rainfall and to provide the areal information required by management and design applications. Conversely, remote sensing instruments, even if quantitatively uncertain, offer the coverage and spatiotemporal detail needed to overcome these issues. In recent years, remote sensing datasets began to be used for frequency analyses, taking advantage of increased record lengths and quantitative adjustments of the data. However, the studies so far made use of concepts and techniques developed for rain gauge (i.e. point or multiple-point) data and have been validated by comparison with gauge-derived analyses. These procedures add further sources of uncertainty, prevent separating data uncertainties from methodological ones, and do not fully exploit the available information. In this study, we step out of the gauge-centered concept, presenting a direct comparison between at-site Intensity-Duration-Frequency (IDF) curves derived from different remote sensing datasets on corresponding spatial scales, temporal resolutions and records. We analyzed 16 years of homogeneously corrected and gauge-adjusted C-band weather radar estimates, high-resolution CMORPH and gauge-adjusted high-resolution CMORPH over the Eastern Mediterranean. Results of this study include: (a) good spatial correlation between radar and satellite IDFs (~0.7 for 2-5 year return periods); (b) consistent correlation and dispersion in the raw and gauge-adjusted CMORPH; (c) bias that is almost uniform with return period for 12-24 h durations; (d) radar identifying thicker-tailed distributions than CMORPH, with the tail of the distributions depending on the spatial and temporal scales. These results demonstrate the potential of remote sensing datasets for rainfall frequency analysis for management (e.g. warning and early-warning systems) and design (e.g. sewer design, large-scale drainage planning).
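
    A minimal sketch of the at-site frequency analysis underlying an IDF curve: fit a GEV distribution to annual maximum intensities for one duration and read off return levels. The 16 synthetic annual maxima mirror the record length in the abstract; the GEV parameters are invented.

        # Fit a GEV to annual maxima and compute return levels for one duration.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(10)
        ann_max = stats.genextreme.rvs(c=-0.1, loc=20.0, scale=6.0,
                                       size=16, random_state=rng)  # 1-h maxima, mm/h
        shape, loc, scale = stats.genextreme.fit(ann_max)

        for T in (2, 5, 10, 25):                   # return periods in years
            intensity = stats.genextreme.ppf(1 - 1 / T, shape, loc, scale)
            print(f"{T:>3}-yr 1-h intensity: {intensity:5.1f} mm/h")

    Repeating the fit per pixel and per duration produces the gridded IDF curves that the paper compares directly between radar and CMORPH.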

  6. Inverse analysis and regularisation in conditional source-term estimation modelling

    NASA Astrophysics Data System (ADS)

    Labahn, Jeffrey W.; Devaud, Cecile B.; Sipkens, Timothy A.; Daun, Kyle J.

    2014-05-01

    Conditional Source-term Estimation (CSE) obtains the conditional species mass fractions by inverting a Fredholm integral equation of the first kind. In the present work, a Bayesian framework is used to compare two different regularisation methods: zeroth-order temporal Tikhonov regularisation and first-order spatial Tikhonov regularisation. The objectives of the current study are: (i) to elucidate the ill-posedness of the inverse problem; (ii) to understand the origin of the perturbations in the data and quantify their magnitude; (iii) to quantify the uncertainty in the solution using different priors; and (iv) to determine the regularisation method best suited to this problem. A singular value decomposition shows that the current inverse problem is ill-posed. Perturbations to the data may be caused by the use of a discrete mixture fraction grid for calculating the mixture fraction PDF. The magnitude of the perturbations is estimated using a box filter, and the uncertainty in the solution is determined based on the width of the credible intervals. The width of the credible intervals is significantly reduced with the inclusion of a smoothing prior, and the recovered solution is in better agreement with the exact solution. The credible intervals for temporal and spatial smoothing are shown to be similar. Credible intervals for temporal smoothing depend on the solution from the previous time step, and a smooth solution is not guaranteed. For spatial smoothing, the credible intervals are not dependent upon a previous solution and better predict characteristics for higher mixture fraction values. These characteristics make spatial smoothing a promising alternative method for recovering a solution from the CSE inversion process.
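
    A sketch of the two regularisation flavours on a generic discrete ill-posed Fredholm problem; the smoothing kernel and "true" solution are toy choices, not the CSE operator.

        # Zeroth-order Tikhonov penalises the solution norm; first-order
        # penalises its gradient, favouring smooth solutions.
        import numpy as np

        rng = np.random.default_rng(11)
        n = 100
        s = np.linspace(0, 1, n)
        A = np.exp(-30.0 * (s[:, None] - s[None, :]) ** 2)   # smoothing kernel
        A /= A.sum(axis=1, keepdims=True)                    # ill-conditioned operator
        x_true = np.exp(-((s - 0.4) / 0.1) ** 2)
        b = A @ x_true + rng.normal(0, 1e-3, n)              # perturbed data

        lam = 1e-2
        I = np.eye(n)
        D = np.diff(I, axis=0)                               # first-difference operator
        x0 = np.linalg.solve(A.T @ A + lam**2 * I, A.T @ b)          # zeroth order
        x1 = np.linalg.solve(A.T @ A + lam**2 * D.T @ D, A.T @ b)    # first order

        print("zeroth-order error:", round(float(np.linalg.norm(x0 - x_true)), 3))
        print("first-order  error:", round(float(np.linalg.norm(x1 - x_true)), 3))

    The first-order penalty does not depend on a previous solution, which parallels the abstract's argument for spatial over temporal smoothing in the CSE inversion.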

  7. Analysis of Trace Siderophile Elements at High Spatial Resolution Using Laser Ablation ICP-MS

    NASA Astrophysics Data System (ADS)

    Campbell, A. J.; Humayun, M.

    2006-05-01

    Laser ablation inductively coupled plasma mass spectrometry is an increasingly important method of performing spatially resolved trace element analyses. Over the last several years we have applied this technique to measure siderophile element distributions at the ppm level in a variety of natural and synthetic samples, especially metallic phases in meteorites and experimental run products intended for trace element partitioning studies. These samples frequently require trace element analyses to be made at a finer spatial resolution (25 microns or better) than is typically attained using LA-ICP-MS. In this presentation we review analytical protocols that were developed to optimize the LA-ICP-MS measurements for high spatial resolution. Particular attention is paid to the trade-offs involving sensitivity, ablation pit depth and diameter, background levels, and number of elements measured. To maximize signal/background ratios and avoid difficulties associated with ablating to depths greater than the ablation pit diameter, measurement involved integration of rapidly varying, transient but well-behaved signals. The abundances of platinum group elements and other siderophile elements in ferrous metals were calibrated against well-characterized standards, including iron meteorites and NIST certified steels. The calibrations can be set against the known abundance of an independently determined element, but normalization to 100 percent can also be employed, and was more useful in many circumstances. Evaluation of uncertainties incorporated counting statistics as well as a measure of instrumental uncertainty, determined by replicate analyses of the standards. These methods have led to a number of insights into the formation and chemical processing of metal in the early solar system.

  8. An integrated GIS-based interval-probabilistic programming model for land-use planning management under uncertainty--a case study at Suzhou, China.

    PubMed

    Lu, Shasha; Zhou, Min; Guan, Xingliang; Tao, Lizao

    2015-03-01

    A large number of mathematical models have been developed for supporting optimization of land-use allocation; however, few of them simultaneously consider land suitability (e.g., physical features and spatial information) and various uncertainties existing in many factors (e.g., land availabilities, land demands, land-use patterns, and ecological requirements). This paper incorporates geographic information system (GIS) technology into interval-probabilistic programming (IPP) for land-use planning management (IPP-LUPM). GIS is utilized to assemble data for the aggregated land-use alternatives, and IPP is developed for tackling uncertainties presented as discrete intervals and probability distribution. Based on GIS, the suitability maps of different land users are provided by the outcomes of land suitability assessment and spatial analysis. The maximum area of every type of land use obtained from the suitability maps, as well as various objectives/constraints (i.e., land supply, land demand of socioeconomic development, future development strategies, and environmental capacity), is used as input data for the optimization of land-use areas with IPP-LUPM model. The proposed model not only considers the outcomes of land suitability evaluation (i.e., topography, ground conditions, hydrology, and spatial location) but also involves economic factors, food security, and eco-environmental constraints, which can effectively reflect various interrelations among different aspects in a land-use planning management system. The case study results at Suzhou, China, demonstrate that the model can help to examine the reliability of satisfying (or risk of violating) system constraints under uncertainty. Moreover, it may identify the quantitative relationship between land suitability and system benefits. Willingness to arrange the land areas based on the condition of highly suitable land will not only reduce the potential conflicts on the environmental system but also lead to a lower economic benefit. However, a strong desire to develop lower suitable land areas will bring not only a higher economic benefit but also higher risks of violating environmental and ecological constraints. The land manager should make decisions through trade-offs between economic objectives and environmental/ecological objectives.
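
    A toy sketch of the interval side of such a model: solve the allocation LP at both ends of the interval benefit coefficients to bracket the optimum. Two land uses, one suitability-derived cap and all numbers are hypothetical; the real IPP-LUPM model has many more constraints and a probabilistic component.

        # Bracket the optimal land allocation under interval benefit coefficients.
        from scipy.optimize import linprog

        # net benefit per ha as intervals [lo, hi]; maximise => negate for linprog
        benefit = {"agric": (1.0, 1.4), "urban": (2.5, 3.2)}
        land_total = 1000.0                    # ha available (hypothetical)
        urban_cap = 300.0                      # cap from the suitability map (hypothetical)

        for tag, idx in (("pessimistic", 0), ("optimistic", 1)):
            c = [-benefit["agric"][idx], -benefit["urban"][idx]]
            res = linprog(c, A_ub=[[1, 1], [0, 1]], b_ub=[land_total, urban_cap],
                          bounds=[(0, None), (0, None)])
            print(f"{tag}: agric {res.x[0]:.0f} ha, urban {res.x[1]:.0f} ha, "
                  f"benefit {-res.fun:.0f}")

    The gap between the two solutions is the interval of system benefit the decision-maker must trade off against the risk of violating the environmental constraints.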

  9. Identification of gene regulation models from single-cell data

    NASA Astrophysics Data System (ADS)

    Weber, Lisa; Raymond, William; Munsky, Brian

    2018-09-01

    In quantitative analyses of biological processes, one may use many different scales of models (e.g. spatial or non-spatial, deterministic or stochastic, time-varying or at steady-state) or many different approaches to match models to experimental data (e.g. model fitting or parameter uncertainty/sloppiness quantification with different experiment designs). These different analyses can lead to surprisingly different results, even when applied to the same data and the same model. We use a simplified gene regulation model to illustrate many of these concerns, especially for ODE analyses of deterministic processes, chemical master equation and finite state projection analyses of heterogeneous processes, and stochastic simulations. For each analysis, we employ MATLAB and PYTHON software to consider a time-dependent input signal (e.g. a kinase nuclear translocation) and several model hypotheses, along with simulated single-cell data. We illustrate different approaches (e.g. deterministic and stochastic) to identify the mechanisms and parameters of the same model from the same simulated data. For each approach, we explore how uncertainty in parameter space varies with respect to the chosen analysis approach or specific experiment design. We conclude with a discussion of how our simulated results relate to the integration of experimental and computational investigations to explore signal-activated gene expression models in yeast (Neuert et al 2013 Science 339 584–7) and human cells (Senecal et al 2014 Cell Rep. 8 75–83).
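
    A minimal sketch of the finite state projection idea on a constitutive birth-death model (much simpler than the regulation model in the paper): truncate the chemical master equation state space and integrate the truncated generator directly.

        # FSP for a birth-death process: states {0..N}; the leaked probability
        # 1 - sum(p) bounds the truncation error.
        import numpy as np
        from scipy.linalg import expm

        k_prod, deg, N = 8.0, 1.0, 60          # production, degradation, truncation
        A = np.zeros((N + 1, N + 1))
        for n in range(N + 1):                 # assemble the CME generator
            A[n, n] -= k_prod                  # production outflow (leaks at n = N)
            if n + 1 <= N:
                A[n + 1, n] += k_prod          # birth: n -> n+1
            if n > 0:
                A[n - 1, n] += deg * n         # death: n -> n-1
                A[n, n] -= deg * n

        p0 = np.zeros(N + 1)
        p0[0] = 1.0                            # start with zero molecules
        p_t = expm(A * 3.0) @ p0               # distribution at t = 3
        mean = np.arange(N + 1) @ p_t
        print(f"mean copy number {mean:.2f}, FSP truncation error {1 - p_t.sum():.2e}")

    The certified error bound is what distinguishes FSP from plain truncation, and full distributions like p_t are what allow stochastic analyses to extract more parameter information than ODE moments alone.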

  10. Uncertainty in exposure to air pollution

    NASA Astrophysics Data System (ADS)

    Pebesma, Edzer; Helle, Kristina; Stasch, Christoph; Rasouli, Soora; Timmermans, Harry; Walker, Sam-Erik; Denby, Bruce

    2013-04-01

    To assess exposure to air pollution for a person or for a group of people, one needs to know where the person or group is as a function of time, and what the air pollution is at these times and locations. In this study we used the Albatross activity-based model to assess the whereabouts of people and the uncertainties in this, and a probabilistic air quality system based on TAPM/EPISODE to assess air quality probabilistically. The outcomes of the two models were combined to assess exposure to air pollution, and the errors in it. We used the area around Rotterdam (Netherlands) as a case study. As the outcomes of both models come as Monte Carlo realizations, it was relatively easy to cancel one of the sources of uncertainty (movement of persons, air pollution) in order to identify their respective contributions, and also to compare evaluations for individuals with averages for a population of persons. As the output is probabilistic, and in addition spatially and temporally varying, the visual analysis of the complete results poses some challenges. This case study was one of the test cases in the UncertWeb project, which has built concepts and tools to realize the uncertainty-enabled model web. Some of the tools and protocols will be shown and evaluated in this presentation. For the uncertainty of exposure, the uncertainty of air quality was more important than the uncertainty of people's locations. This difference was stronger for PM10 than for NO2. The workflow was implemented as generic Web services in UncertWeb that also allow for other inputs than the simulated activity schedules and air quality at other resolutions. However, due to this flexibility, the Web services require standardized formats and the overlay algorithm is not optimized for the specific use case, resulting in a data and processing overhead. Hence, we implemented the full analysis in parallel in R for this specific case, as the model web solution had difficulties with massive data.

  12. Calculating surface ocean pCO2 from biogeochemical Argo floats equipped with pH: An uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Williams, N. L.; Juranek, L. W.; Feely, R. A.; Johnson, K. S.; Sarmiento, J. L.; Talley, L. D.; Dickson, A. G.; Gray, A. R.; Wanninkhof, R.; Russell, J. L.; Riser, S. C.; Takeshita, Y.

    2017-03-01

    More than 74 biogeochemical profiling floats that measure water column pH, oxygen, nitrate, fluorescence, and backscattering at 10 day intervals have been deployed throughout the Southern Ocean. Calculating the surface ocean partial pressure of carbon dioxide (pCO2sw) from float pH has uncertainty contributions from the pH sensor, the alkalinity estimate, and carbonate system equilibrium constants, resulting in a relative standard uncertainty in pCO2sw of 2.7% (or 11 µatm at pCO2sw of 400 µatm). The pCO2sw values calculated from several floats spanning a range of oceanographic regimes are compared to existing climatologies. In some locations, such as the subantarctic zone, the float data closely match the climatologies, but in the polar Antarctic zone significantly higher pCO2sw values are calculated in wintertime, implying a greater air-sea CO2 efflux. Our results based on four representative floats suggest that despite their uncertainty relative to direct measurements, the float data can be used to improve estimates for air-sea carbon flux, as well as to increase knowledge of spatial, seasonal, and interannual variability in this flux.
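
    The quadrature combination of independent uncertainty components described above can be sketched in a few lines of Python; the three component values below are illustrative assumptions (chosen to give a total near the reported 2.7%), not the study's actual budget.

        import numpy as np

        # Hypothetical relative standard uncertainties (fractions of pCO2sw)
        components = {"pH sensor": 0.020,
                      "alkalinity estimate": 0.015,
                      "equilibrium constants": 0.010}

        rel_u = np.sqrt(sum(u**2 for u in components.values()))  # quadrature sum
        pco2 = 400.0                                             # uatm
        print(f"relative standard uncertainty: {rel_u:.1%}")
        print(f"absolute uncertainty at {pco2:.0f} uatm: {rel_u * pco2:.1f} uatm")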

  13. Uncertainty propagation by using spectral methods: A practical application to a two-dimensional turbulence fluid model

    NASA Astrophysics Data System (ADS)

    Riva, Fabio; Milanese, Lucio; Ricci, Paolo

    2017-10-01

    To reduce the computational cost of uncertainty propagation analysis, which is used to study the impact of input parameter variations on the results of a simulation, a general and simple-to-apply methodology based on decomposing the solution to the model equations in terms of Chebyshev polynomials is discussed. This methodology, based on the work by Scheffel [Am. J. Comput. Math. 2, 173-193 (2012)], approximates the model equation solution with a semi-analytic expression that depends explicitly on time, spatial coordinates, and input parameters. By employing a weighted residual method, a set of nonlinear algebraic equations for the coefficients appearing in the Chebyshev decomposition is then obtained. The methodology is applied to a two-dimensional Braginskii model used to simulate plasma turbulence in basic plasma physics experiments and in the scrape-off layer of tokamaks, in order to study the impact on the simulation results of the input parameter that describes the parallel losses. The uncertainties that characterize the time-averaged density gradient lengths, time-averaged densities, and fluctuation density levels are evaluated. A reasonable estimate of the uncertainty of these distributions can be obtained with a single reduced-cost simulation.
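
    A simplified, non-intrusive Python sketch of the underlying idea: approximate the dependence of a model output on an uncertain input with a Chebyshev series, then propagate the input distribution through the cheap surrogate. The model function and parameter range are hypothetical; the actual method solves for the Chebyshev coefficients of the full space-time solution via a weighted residual method.

        import numpy as np
        from numpy.polynomial import chebyshev as C

        # Hypothetical "model": output as a function of an uncertain input s
        model = lambda s: np.exp(-s) * np.sin(3.0 * s) + 2.0

        s_lo, s_hi = 0.5, 1.5
        nodes = 0.5 * (s_hi + s_lo) + 0.5 * (s_hi - s_lo) * np.cos(
            np.pi * (np.arange(9) + 0.5) / 9.0)        # Chebyshev nodes in [s_lo, s_hi]
        coeffs = C.chebfit(nodes, model(nodes), deg=8)  # surrogate from 9 model runs

        rng = np.random.default_rng(1)
        samples = rng.uniform(s_lo, s_hi, 100_000)      # input parameter uncertainty
        qoi = C.chebval(samples, coeffs)                # cheap surrogate evaluations
        print("output mean:", round(float(qoi.mean()), 4),
              " std:", round(float(qoi.std()), 4))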

  14. Uncertainty Quantification of Medium-Term Heat Storage From Short-Term Geophysical Experiments Using Bayesian Evidential Learning

    NASA Astrophysics Data System (ADS)

    Hermans, Thomas; Nguyen, Frédéric; Klepikova, Maria; Dassargues, Alain; Caers, Jef

    2018-04-01

    In theory, aquifer thermal energy storage (ATES) systems can recover in winter the heat stored in the aquifer during summer to increase the energy efficiency of the system. In practice, the energy efficiency is often lower than expected from simulations due to spatial heterogeneity of hydraulic properties or non-favorable hydrogeological conditions. A proper design of ATES systems should therefore consider the uncertainty of the prediction related to those parameters. We use a novel framework called Bayesian Evidential Learning (BEL) to estimate the heat storage capacity of an alluvial aquifer using a heat tracing experiment. BEL is based on two main stages: pre- and post-field data acquisition. Before data acquisition, Monte Carlo simulations and global sensitivity analysis are used to assess the information content of the data for reducing the uncertainty of the prediction. After data acquisition, prior falsification and machine learning based on the same Monte Carlo simulations are used to directly assess the uncertainty in key prediction variables from observations. The result is a full quantification of the posterior distribution of the prediction conditioned on observed data, without any explicit full model inversion. We demonstrate the methodology in field conditions and validate the framework using independent measurements.
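
    A heavily simplified Python sketch of the BEL logic under stated assumptions: sample the prior by Monte Carlo, run the forward model for both the data variables and the prediction variable, learn a direct statistical relation between them (here ordinary least squares on synthetic low-dimensional data), and condition on the observed data without any model inversion.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 500

        # Prior Monte Carlo: a hypothetical hydraulic property controls both the
        # observable response (20 data values) and the prediction (heat storage).
        k = rng.lognormal(0.0, 0.5, n)
        data = (np.outer(np.log(k), np.linspace(1.0, 2.0, 20))
                + rng.normal(0.0, 0.1, (n, 20)))
        pred = 5.0 + 2.0 * np.log(k) + rng.normal(0.0, 0.2, n)

        # Learn the data -> prediction relation on the prior ensemble.
        X = np.hstack([np.ones((n, 1)), data])
        beta, *_ = np.linalg.lstsq(X, pred, rcond=None)
        resid_sd = float((pred - X @ beta).std())

        # Condition on "observed" data (here, one held-out prior realization).
        d_obs = data[0]
        post_mean = float(np.concatenate([[1.0], d_obs]) @ beta)
        print(f"prediction: {post_mean:.2f} +/- {resid_sd:.2f} (true {pred[0]:.2f})")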

  15. Collocation mismatch uncertainties in satellite aerosol retrieval validation

    NASA Astrophysics Data System (ADS)

    Virtanen, Timo H.; Kolmonen, Pekka; Sogacheva, Larisa; Rodríguez, Edith; Saponaro, Giulia; de Leeuw, Gerrit

    2018-02-01

    Satellite-based aerosol products are routinely validated against ground-based reference data, usually obtained from sun photometer networks such as AERONET (AEROsol RObotic NETwork). In a typical validation exercise a spatial sample of the instantaneous satellite data is compared against a temporal sample of the point-like ground-based data. The observations do not correspond to exactly the same column of the atmosphere at the same time, and the representativeness of the reference data depends on the spatiotemporal variability of the aerosol properties in the samples. The associated uncertainty is known as the collocation mismatch uncertainty (CMU). The validation results depend on the sampling parameters. While small samples involve less variability, they are more sensitive to the inevitable noise in the measurement data. In this paper we systematically study the effect of the sampling parameters on the validation of the AATSR (Advanced Along-Track Scanning Radiometer) aerosol optical depth (AOD) product against AERONET data, and the associated collocation mismatch uncertainty. To this end, we study the spatial AOD variability in the satellite data, compare it against the corresponding values obtained from densely located AERONET sites, and assess the possible reasons for the observed differences. We find that the spatial AOD variability in the satellite data is approximately 2 times larger than in the ground-based data, and that the spatial variability correlates only weakly with that of AERONET for short distances. We interpret this as indicating that only half of the variability in the satellite data is due to natural variability in the AOD, with the rest being noise due to retrieval errors. However, for larger distances (~0.5°) the correlation improves as the noise is averaged out, and the day-to-day changes in regional AOD variability are well captured. Furthermore, we assess the usefulness of the spatial variability of the satellite AOD data as an estimate of CMU by comparing the retrieval errors to total uncertainty estimates that include the CMU in the validation. We find that accounting for CMU increases the fraction of consistent observations.
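
    A small Python sketch of the final step described above, with hypothetical error magnitudes: adding an assumed collocation mismatch term to the retrieval uncertainty in quadrature increases the fraction of satellite-ground pairs judged consistent with their total uncertainty.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 1000
        aod = rng.gamma(2.0, 0.1, n)                  # hypothetical true AOD values
        aeronet = aod + rng.normal(0.0, 0.01, n)      # ground reference
        satellite = aod + rng.normal(0.0, 0.05, n)    # retrieval with error

        u_retrieval = 0.05                            # assumed retrieval uncertainty
        cmu = 0.03                                    # assumed collocation mismatch term
        u_total = np.hypot(u_retrieval, cmu)          # combined in quadrature

        err = np.abs(satellite - aeronet)
        print("consistent w/ retrieval uncertainty only:", (err <= u_retrieval).mean())
        print("consistent w/ total uncertainty (incl. CMU):", (err <= u_total).mean())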

  16. Improving snow density estimation for mapping SWE with Lidar snow depth: assessment of uncertainty in modeled density and field sampling strategies in NASA SnowEx

    NASA Astrophysics Data System (ADS)

    Raleigh, M. S.; Smyth, E.; Small, E. E.

    2017-12-01

    The spatial distribution of snow water equivalent (SWE) is not sufficiently monitored with either remotely sensed or ground-based observations for water resources management. Recent applications of airborne Lidar have yielded basin-wide mapping of SWE when combined with a snow density model. However, in the absence of snow density observations, the uncertainty in these SWE maps is dominated by uncertainty in modeled snow density rather than in the Lidar measurement of snow depth. Available observations tend to have a bias in physiographic regime (e.g., flat open areas) and are often insufficient in number to support testing of models across a range of conditions. Thus, there is a need for targeted sampling strategies and controlled model experiments to understand where and why different snow density models diverge. This will enable identification of robust model structures that represent dominant processes controlling snow densification, in support of basin-scale estimation of SWE with remotely sensed snow depth datasets. The NASA SnowEx mission is a unique opportunity to evaluate sampling strategies of snow density and to quantify and reduce uncertainty in modeled snow density. In this presentation, we report initial field data analyses and modeling results over the Colorado SnowEx domain in the 2016-2017 winter campaign. We detail a framework for spatially mapping the uncertainty in snowpack density, as represented across multiple models. Leveraging the modular SUMMA model, we construct a series of physically based models to systematically assess the importance of specific process representations to snow density estimates. We will show how models and snow pit observations characterize snow density variations with forest cover in the SnowEx domains. Finally, we will use the spatial maps of density uncertainty to evaluate the selected locations of snow pits, thereby assessing the adequacy of the sampling strategy for targeting uncertainty in modeled snow density.
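
    The dominance of density uncertainty can be illustrated with first-order error propagation for SWE = depth x density; the numbers below are plausible assumed values, not SnowEx results.

        import numpy as np

        depth, u_depth = 2.0, 0.05        # m; lidar snow depth and uncertainty (assumed)
        density, u_density = 300.0, 45.0  # kg m-3; modeled density and uncertainty (assumed)

        swe = depth * density             # kg m-2, numerically equal to mm of water
        rel_u = np.hypot(u_depth / depth, u_density / density)  # first-order propagation
        print(f"SWE = {swe:.0f} mm +/- {rel_u * swe:.0f} mm ({rel_u:.1%})")
        print("density term dominates:", u_density / density > u_depth / depth)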

  17. Epistemic uncertainty in the location and magnitude of earthquakes in Italy from Macroseismic data

    USGS Publications Warehouse

    Bakun, W.H.; Gomez, Capera A.; Stucchi, M.

    2011-01-01

    Three independent techniques (Bakun and Wentworth, 1997; Boxer from Gasperini et al., 1999; and Macroseismic Estimation of Earthquake Parameters [MEEP; see Data and Resources section, deliverable D3] from R.M.W. Musson and M.J. Jimenez) have been proposed for estimating an earthquake location and magnitude from intensity data alone. The locations and magnitudes obtained for a given set of intensity data are almost always different, and no one technique is consistently best at matching instrumental locations and magnitudes of recent well-recorded earthquakes in Italy. Rather than attempting to select one of the three solutions as best, we use all three techniques to estimate the location and the magnitude and the epistemic uncertainties among them. The estimates are calculated using bootstrap resampled data sets with Monte Carlo sampling of a decision tree. The decision-tree branch weights are based on goodness-of-fit measures of location and magnitude for recent earthquakes. The location estimates are based on the spatial distribution of locations calculated from the bootstrap resampled data. The preferred source location is the locus of the maximum bootstrap location spatial density. The location uncertainty is obtained from contours of the bootstrap spatial density: 68% of the bootstrap locations are within the 68% confidence region, and so on. For large earthquakes, our preferred location is not associated with the epicenter but with a location on the extended rupture surface. For small earthquakes, the epicenters are generally consistent with the location uncertainties inferred from the intensity data if an epicenter inaccuracy of 2-3 km is allowed. The preferred magnitude is the median of the distribution of bootstrap magnitudes. As with location uncertainties, the uncertainties in magnitude are obtained from the distribution of bootstrap magnitudes: the bounds of the 68% uncertainty range enclose 68% of the bootstrap magnitudes, and so on. The instrumental magnitudes for large and small earthquakes are generally consistent with the confidence intervals inferred from the distribution of bootstrap resampled magnitudes.
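
    The bootstrap percentile logic described above is easy to sketch in Python; the toy magnitude estimator below stands in for the three intensity-based techniques, and the data are synthetic.

        import numpy as np

        rng = np.random.default_rng(4)
        intensities = rng.normal(6.0, 1.0, 40)   # hypothetical site intensities

        def magnitude(sample):
            # toy stand-in for an intensity-based magnitude estimator
            return 0.8 * sample.mean() + 1.0

        boot = np.array([magnitude(rng.choice(intensities, intensities.size))
                         for _ in range(2000)])
        lo, med, hi = np.percentile(boot, [16, 50, 84])
        print(f"preferred magnitude (median): {med:.2f}; 68% bounds: [{lo:.2f}, {hi:.2f}]")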

  18. Uncertainty visualisation in the Model Web

    NASA Astrophysics Data System (ADS)

    Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.

    2012-04-01

    Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and, moreover, uncertainty information are included in the data, efficient visualisation methods are required. For uncertain spatial and spatio-temporal data, numerous visualisation methods have been developed and proposed, but only few tools for visualisation of data in a standardised way exist. Furthermore, they are usually realised as thick clients and lack functionality for handling data coming from web services, as envisaged in the Model Web. We present an interactive web tool for visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, i.e. pan and zoom in/out, to allow interactive control for the user. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations & Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data, and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. The uncertainty information considered for the tool comprises probabilistic, quantified attribute uncertainties, which can be provided as realisations or samples, full probability distribution functions, or statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Based on previously conducted usability studies, a differentiation between expert (in statistics or mapping) and non-expert users has been indicated as useful. Therefore, two different modes are realised together in the tool: (i) adjacent maps showing data and uncertainty separately, and (ii) multidimensional mapping providing different visualisation methods in combination to explore the spatial, temporal and uncertainty distribution of the data. Adjacent maps allow a simpler visualisation, separating value and uncertainty maps, suitable for non-experts and for a first overview. The multidimensional approach allows a more complex exploration of the data for experts by browsing through the different dimensions. It offers the visualisation of maps, statistics plots and time series in different windows, and sliders to interactively move through time, space and uncertainty (thresholds).

  19. Daily ambient air pollution metrics for five cities: Evaluation of data-fusion-based estimates and uncertainties

    NASA Astrophysics Data System (ADS)

    Friberg, Mariel D.; Kahn, Ralph A.; Holmes, Heather A.; Chang, Howard H.; Sarnat, Stefanie Ebelt; Tolbert, Paige E.; Russell, Armistead G.; Mulholland, James A.

    2017-06-01

    Spatiotemporal characterization of ambient air pollutant concentrations is increasingly relying on the combination of observations and air quality models to provide well-constrained, spatially and temporally complete pollutant concentration fields. Air quality models, in particular, are attractive, as they characterize the emissions, meteorological, and physiochemical process linkages explicitly while providing continuous spatial structure. However, such modeling is computationally intensive and has biases. The limitations of spatially sparse and temporally incomplete observations can be overcome by blending the data with estimates from a physically and chemically coherent model, driven by emissions and meteorological inputs. We recently developed a data fusion method that blends ambient ground observations and chemical-transport-modeled (CTM) data to estimate daily, spatially resolved pollutant concentrations and associated correlations. In this study, we assess the ability of the data fusion method to produce daily metrics (i.e., 1-hr max, 8-hr max, and 24-hr average) of ambient air pollution that capture spatiotemporal air pollution trends for 12 pollutants (CO, NO2, NOx, O3, SO2, PM10, PM2.5, and five PM2.5 components) across five metropolitan areas (Atlanta, Birmingham, Dallas, Pittsburgh, and St. Louis) from 2002 to 2008. Three sets of comparisons are performed: (1) the CTM concentrations are evaluated for each pollutant and metropolitan domain, (2) the data fusion concentrations are compared with the monitor data, and (3) a comprehensive cross-validation analysis against observed data evaluates the quality of the data fusion model simulations across multiple metropolitan domains. The resulting daily spatial field estimates of air pollutant concentrations and uncertainties are not only consistent with observations, emissions, and meteorology, but substantially improve CTM-derived results for nearly all pollutants and all cities, with the exception of NO2 for Birmingham. The greatest improvements occur for O3 and PM2.5. Squared spatiotemporal correlation coefficients between simulations and observations, determined using cross-validation across all cities, are R2 = 0.88-0.93 (O3), 0.81-0.89 (SO4), 0.67-0.83 (PM2.5), 0.52-0.72 (NO3), 0.43-0.80 (NH4), 0.32-0.51 (OC), and 0.14-0.71 (PM10) for pollutants of secondary and mixed origin. Results for relatively homogeneous pollutants of secondary origin tend to be better than those for more spatially heterogeneous (larger spatial gradients) pollutants of primary origin (NOx, CO, SO2 and EC). Generally, background concentrations and spatial concentration gradients reflect interurban airshed complexity and the effects of regional transport, whereas daily spatial pattern variability shows intra-urban consistency in the fused data. With sufficiently high CTM spatial resolution, traffic-related pollutants exhibit gradual concentration gradients that peak toward the urban centers. Ambient pollutant concentration uncertainty estimates for the fused data are both more accurate and smaller than those for either the observations or the model simulations alone.

  20. Uncertainty quantification and risk analyses of CO2 leakage in heterogeneous geological formations

    NASA Astrophysics Data System (ADS)

    Hou, Z.; Murray, C. J.; Rockhold, M. L.

    2012-12-01

    A stochastic sensitivity analysis framework is adopted to evaluate the impact of spatial heterogeneity in permeability on CO2 leakage risk. Leakage is defined as the total mass of CO2 moving into the overburden through the caprock-overburden interface, in both gaseous and liquid (dissolved) phases. The entropy-based framework has the ability to quantify the uncertainty associated with the input parameters in the form of prior pdfs (probability density functions). Effective sampling of the prior pdfs enables us to fully explore the parameter space and systematically evaluate the individual and combined effects of the parameters of interest on CO2 leakage risk. The parameters considered in the study include the mean, variance, and horizontal-to-vertical spatial anisotropy ratio of the caprock permeability, and the same parameters for the reservoir permeability. Given the sampled spatial variogram parameters, multiple realizations of the permeability fields were generated using GSLIB subroutines. For each permeability field, the numerical simulator STOMP (in the water-salt-CO2-energy operational mode) is used to simulate the CO2 migration within the reservoir and caprock up to 50 years after injection. Due to the intensive computational demand, we ran both the scalable simulator eSTOMP and the serial STOMP on various supercomputers. We then perform statistical analyses and summarize the relationships between the parameters of interest (mean/variance/anisotropy ratio of caprock and reservoir permeability) and the CO2 leakage ratio. We also present the effects of those parameters on the CO2 plume radius and reservoir injectivity. The statistical analysis provides a reduced-order model that can be used to estimate the impact of heterogeneity on caprock leakage.
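
    A minimal Python sketch of that last step, fitting a reduced-order model that maps heterogeneity parameters to simulated leakage; the ensemble below is synthetic, with an assumed linear response standing in for the STOMP simulations.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 200
        mean_k = rng.uniform(-18.0, -16.0, n)   # log10 mean caprock permeability
        var_k = rng.uniform(0.1, 1.0, n)        # log-permeability variance
        aniso = rng.uniform(1.0, 100.0, n)      # horizontal:vertical anisotropy ratio
        leak = (0.02 * (mean_k + 18.0) + 0.05 * var_k + 1e-4 * aniso
                + rng.normal(0.0, 0.005, n))    # synthetic leakage fraction

        # Reduced-order model: linear response surface via least squares
        X = np.column_stack([np.ones(n), mean_k, var_k, aniso])
        beta, *_ = np.linalg.lstsq(X, leak, rcond=None)
        print("ROM coefficients (const, mean, variance, anisotropy):", beta.round(5))
        new_case = np.array([1.0, -17.0, 0.5, 50.0])
        print("predicted leakage fraction for a new case:", float(new_case @ beta))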

  1. Large-area settlement pattern recognition from Landsat-8 data

    NASA Astrophysics Data System (ADS)

    Wieland, Marc; Pittore, Massimiliano

    2016-09-01

    The study presents an image processing and analysis pipeline that combines object-based image analysis with a Support Vector Machine (SVM) to derive a multi-layered settlement product from Landsat-8 data over large areas. 43 image scenes are processed over large parts of Central Asia (Southern Kazakhstan, Kyrgyzstan, Tajikistan and Eastern Uzbekistan). The main tasks tackled by this work include built-up area identification, settlement type classification and urban structure type pattern recognition. Besides commonly used accuracy assessments of the resulting map products, thorough performance evaluations are carried out under varying conditions to tune algorithm parameters and assess their applicability for the given tasks. As part of this, several research questions are addressed. In particular, the influence of the improved spatial and spectral resolution of Landsat-8 on the SVM performance in identifying built-up areas and urban structure types is evaluated. The influence of an extended feature space including digital elevation model features is also tested for mountainous regions. Moreover, the spatial distribution of classification uncertainties is analyzed and compared to the heterogeneity of the building stock within the computational unit of the segments. The study concludes that the information content of Landsat-8 images is sufficient for the tested classification tasks, and that even detailed urban structures can be extracted with satisfactory accuracy. Freely available ancillary settlement point location data could further improve the built-up area classification. Digital elevation features and pan-sharpening could, however, not significantly improve the classification results. The study highlights the importance of dynamically tuned classifier parameters, and underlines the use of Shannon entropy computed from the soft answers of the SVM as a valid measure of the spatial distribution of classification uncertainties.
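
    The Shannon entropy measure highlighted above can be sketched with scikit-learn on synthetic features; the class structure and feature values are hypothetical.

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(6)
        # Hypothetical training data: 6 spectral features for 3 settlement classes
        X = rng.normal(0.0, 1.0, (300, 6)) + np.repeat(np.eye(3, 6) * 2.0, 100, axis=0)
        y = np.repeat([0, 1, 2], 100)

        clf = SVC(probability=True, random_state=0).fit(X, y)
        proba = clf.predict_proba(rng.normal(0.0, 1.0, (5, 6)))   # soft answers per pixel
        entropy = -(proba * np.log(np.clip(proba, 1e-12, None))).sum(axis=1)
        print("class probabilities:\n", proba.round(2))
        print("per-pixel Shannon entropy (higher = more uncertain):", entropy.round(2))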

  2. Assessing TCE source bioremediation by geostatistical analysis of a flux fence.

    PubMed

    Cai, Zuansi; Wilson, Ryan D; Lerner, David N

    2012-01-01

    Mass discharge across transect planes is increasingly used as a metric for performance assessment of in situ groundwater remediation systems. Mass discharge estimates using concentrations measured in multilevel transects are often made by assuming a uniform flow field, and uncertainty contributions from spatial concentration and flow field variability are often overlooked. We extend our recently developed geostatistical approach to estimate mass discharge using transect data of concentration and hydraulic conductivity, thereby accounting for the spatial variability of both datasets. The magnitude and uncertainty of the mass discharge were quantified by conditional simulation. An important benefit of the approach is that uncertainty is quantified as an integral part of the mass discharge estimate. We use this approach for performance assessment of a bioremediation experiment on a trichloroethene (TCE) source zone. Analyses of dissolved parent and daughter compounds demonstrated that the engineered bioremediation elevated the degradation rate of TCE, resulting in a two-thirds reduction in the TCE mass discharge from the source zone. The biologically enhanced dissolution of TCE was not significant (~5%), and was less than expected. However, the discharges of the daughter products cis-1,2-dichloroethene (cDCE) and vinyl chloride (VC) increased, probably because of the rapid transformation of TCE between the source zone and the measurement transect. This suggests that enhancing the biodegradation of cDCE and VC will be crucial to successful engineered bioremediation of TCE source zones.

  3. Developing a spatial-temporal method for the geographic investigation of shoeprint evidence.

    PubMed

    Lin, Ge; Elmes, Gregory; Walnoha, Mike; Chen, Xiannian

    2009-01-01

    This article examines the potential of a spatial-temporal method for the analysis of forensic shoeprint data. The large volume of shoeprint evidence recovered at crime scenes results in varied success in matching a print to a known shoe type and subsequently linking sets of matched prints to suspected offenders. Unlike with DNA and fingerprint data, a major challenge is to reduce the uncertainty in linking sets of matched shoeprints to a suspected serial offender. Shoeprint data for 2004 were imported from the Greater London Metropolitan Area Bigfoot database into a geographic information system, and a spatial-temporal algorithm was developed for this project. The results show that by using distance and time constraints interactively, the number of candidate shoeprints that can implicate one or a few suspects can be substantially reduced. The article concludes that the use of space-time and other ancillary information within a geographic information system can be quite helpful for forensic investigation.
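
    The interactive distance and time constraints can be sketched in Python; the record fields and thresholds below are hypothetical.

        from datetime import datetime, timedelta
        from math import hypot

        # Hypothetical matched-print records: (x_km, y_km, timestamp)
        prints = [
            (1.0, 2.0, datetime(2004, 3, 1, 22, 15)),
            (1.4, 2.2, datetime(2004, 3, 1, 23, 5)),
            (9.0, 8.5, datetime(2004, 3, 2, 3, 30)),
        ]

        def linked(a, b, max_km=2.0, max_gap=timedelta(hours=2)):
            """Could the same offender plausibly have left both prints?"""
            near = hypot(a[0] - b[0], a[1] - b[1]) <= max_km
            close_in_time = abs(a[2] - b[2]) <= max_gap
            return near and close_in_time

        pairs = [(i, j) for i in range(len(prints)) for j in range(i + 1, len(prints))
                 if linked(prints[i], prints[j])]
        print("candidate linked pairs:", pairs)   # only the nearby, near-in-time pair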

  4. Parameter and uncertainty estimation for mechanistic, spatially explicit epidemiological models

    NASA Astrophysics Data System (ADS)

    Finger, Flavio; Schaefli, Bettina; Bertuzzo, Enrico; Mari, Lorenzo; Rinaldo, Andrea

    2014-05-01

    Epidemiological models can be a crucially important tool for decision-making during disease outbreaks. The range of possible applications spans from real-time forecasting and allocation of health-care resources to testing alternative intervention mechanisms such as vaccines, antibiotics or the improvement of sanitary conditions. Our spatially explicit, mechanistic models for cholera epidemics have been successfully applied to several epidemics, including the one that struck Haiti in late 2010 and is still ongoing. Calibration and parameter estimation of such models represent a major challenge because of properties unusual in traditional geoscientific domains such as hydrology. Firstly, the epidemiological data available might be subject to high uncertainties due to error-prone diagnosis as well as manual (and possibly incomplete) data collection. Secondly, long-term time series of epidemiological data are often unavailable. Finally, the spatially explicit character of the models requires the comparison of several time series of model outputs with their real-world counterparts, which calls for an appropriate weighting scheme. It follows that the usual assumption of a homoscedastic Gaussian error distribution, used in combination with classical calibration techniques based on Markov chain Monte Carlo algorithms, is likely to be violated, whereas the construction of an appropriate formal likelihood function seems close to impossible. Alternative calibration methods that allow for accurate estimation of the total model uncertainty, particularly regarding the envisaged use of the models for decision-making, are thus needed. Here we present the most recent developments regarding methods for parameter and uncertainty estimation to be used with our mechanistic, spatially explicit models for cholera epidemics, based on informal measures of goodness of fit.
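
    One family of such alternative methods uses rejection sampling against an informal fit measure; the Python sketch below illustrates this idea on a toy epidemic curve and is not the authors' actual calibration scheme.

        import numpy as np

        rng = np.random.default_rng(7)
        t = np.arange(30)

        def cases(beta):
            # toy stand-in for a spatially explicit epidemic model
            return 10.0 * np.exp(beta * t)

        obs = cases(0.12) * rng.lognormal(0.0, 0.3, t.size)   # noisy "reported cases"

        # Rejection step: keep the prior draws with the best informal fit (log-RMSE)
        prior = rng.uniform(0.01, 0.3, 5000)
        score = np.array([np.sqrt(np.mean((np.log(cases(b)) - np.log(obs)) ** 2))
                          for b in prior])
        kept = prior[score <= np.quantile(score, 0.01)]       # best 1% of draws
        print(f"beta estimate: {kept.mean():.3f} +/- {kept.std():.3f} (true 0.120)")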

  5. Geostatistical estimation of forest biomass in interior Alaska combining Landsat-derived tree cover, sampled airborne lidar and field observations

    NASA Astrophysics Data System (ADS)

    Babcock, Chad; Finley, Andrew O.; Andersen, Hans-Erik; Pattison, Robert; Cook, Bruce D.; Morton, Douglas C.; Alonzo, Michael; Nelson, Ross; Gregoire, Timothy; Ene, Liviu; Gobakken, Terje; Næsset, Erik

    2018-06-01

    The goal of this research was to develop and examine the performance of a geostatistical coregionalization modeling approach for combining field inventory measurements, strip samples of airborne lidar and Landsat-based remote sensing data products to predict aboveground biomass (AGB) in interior Alaska's Tanana Valley. The proposed modeling strategy facilitates pixel-level mapping of AGB density predictions across the entire spatial domain. Additionally, the coregionalization framework allows for statistically sound estimation of total AGB for arbitrary areal units within the study area, a key advance to support diverse management objectives in interior Alaska. This research focuses on appropriate characterization of prediction uncertainty in the form of posterior predictive coverage intervals and standard deviations. Using the framework detailed here, it is possible to quantify estimation uncertainty for any spatial extent, ranging from pixel-level predictions of AGB density to estimates of AGB stocks for the full domain. The lidar-informed coregionalization models consistently outperformed their counterpart lidar-free models in terms of point-level predictive performance and total AGB precision. Additionally, the inclusion of Landsat-derived forest cover as a covariate further improved estimation precision in regions with lower lidar sampling intensity. Our findings also demonstrate that model-based approaches that do not explicitly account for residual spatial dependence can grossly underestimate uncertainty, resulting in falsely precise estimates of AGB. On the other hand, in a geostatistical setting, residual spatial structure can be modeled within a Bayesian hierarchical framework to obtain statistically defensible assessments of uncertainty for AGB estimates.

  6. The methane distribution on Titan: high resolution spectroscopy in the near-IR with Keck NIRSPEC/AO

    NASA Astrophysics Data System (ADS)

    Adamkovics, Mate; Mitchell, Jonathan L.

    2014-11-01

    The distribution of methane on Titan is a diagnostic of regional-scale meteorology and large-scale atmospheric circulation. The observed formation of clouds and the transport of heat through the atmosphere both depend on spatial and temporal variations in methane humidity. We have performed observations to measure the distribution of methane on Titan using high-spectral-resolution near-IR (H-band) observations made with NIRSPEC, with adaptive optics, at Keck Observatory in July 2014. This work builds on previous attempts at this measurement, with improvements in the observing protocol and data reduction, together with increased integration times. Radiative transfer models using line-by-line calculation of methane opacities from the HITRAN2012 database are used to retrieve methane abundances. We will describe the analysis of the reduced observations, which show latitudinal spatial variation in the region of the spectrum that is thought to be sensitive to methane abundance. Quantifying the methane abundance variation requires models that include the spatial variation in surface albedo and the meridional haze gradient; we will describe (currently preliminary) analysis of the methane distribution and the uncertainties in the retrieval.

  7. Multi-perspective analysis and spatiotemporal mapping of air pollution monitoring data.

    PubMed

    Kolovos, Alexander; Skupin, André; Jerrett, Michael; Christakos, George

    2010-09-01

    Space-time data analysis and assimilation techniques in atmospheric sciences typically consider input from monitoring measurements. The input is often processed in a manner that acknowledges characteristics of the measurements (e.g., underlying patterns, fluctuation features) under conditions of uncertainty; it also leads to the derivation of secondary information that serves study-oriented goals, and provides input to space-time prediction techniques. We present a novel approach that blends a rigorous space-time prediction model (Bayesian maximum entropy, BME) with a cognitively informed visualization of high-dimensional data (spatialization). The combined BME and spatialization approach (BME-S) is used to study monthly averaged NO2 and mean annual SO4 measurements in California over the 15-year period 1988-2002. Using the original scattered measurements of these two pollutants, BME generates spatiotemporal predictions on a regular grid across the state. Subsequently, the prediction network undergoes the spatialization transformation into a lower-dimensional geometric representation, aimed at revealing patterns and relationships that exist within the input data. The proposed BME-S provides a powerful spatiotemporal framework to study a variety of air pollution data sources.

  8. Application of geo-spatial technology in schistosomiasis modelling in Africa: a review.

    PubMed

    Manyangadze, Tawanda; Chimbari, Moses John; Gebreslasie, Michael; Mukaratirwa, Samson

    2015-11-04

    Schistosomiasis continues to impact socio-economic development negatively in sub-Saharan Africa. The advent of spatial technologies, including geographic information systems (GIS), Earth observation (EO) and global positioning systems (GPS), assists modelling efforts. However, there is increasing concern regarding the accuracy and precision of the current spatial models. This paper reviews the literature on the progress and challenges in the development and utilization of spatial technology, with special reference to predictive models for schistosomiasis in Africa. Peer-reviewed papers were identified through a PubMed search using the following keywords: geo-spatial analysis OR remote sensing OR modelling OR earth observation OR geographic information systems OR prediction OR mapping AND schistosomiasis AND Africa. Statistical uncertainty, low spatial and temporal resolution of satellite data and poor validation were identified as some of the factors that compromise the precision and accuracy of the existing predictive models. The need for high-spatial-resolution remote sensing data in conjunction with ancillary data, viz. ground-measured climatic and environmental information, local presence/absence intermediate host snail surveys, and the prevalence and intensity of human infection for model calibration and validation, is discussed. The importance of a multidisciplinary approach in developing robust spatial data capturing and modelling techniques and products applicable in epidemiology is highlighted.

  9. Sensitivity of geological, geochemical and hydrologic parameters in complex reactive transport systems for in-situ uranium bioremediation

    NASA Astrophysics Data System (ADS)

    Yang, G.; Maher, K.; Caers, J.

    2015-12-01

    Groundwater contamination associated with remediated uranium mill tailings is a challenging environmental problem, particularly within the Colorado River Basin. To examine the effectiveness of in-situ bioremediation of U(VI), acetate injection has been proposed and tested at the Rifle pilot site. There have been several geologic modeling and simulated contaminant transport investigations to evaluate the potential outcomes of the process and identify crucial factors for successful uranium reduction. Ultimately, findings from these studies would contribute to accurate predictions of the efficacy of uranium reduction. However, all these previous studies have considered limited model complexities, either because of the concern that data is too sparse to resolve such complex systems or because some parameters are assumed to be less important. Such simplified initial modeling, however, limits the predictive power of the model. Moreover, previous studies have not yet focused on the spatial heterogeneity of various modeling components and its impact on the spatial distribution of the immobilized uranium (U(IV)). In this study, we examine the impact of uncertainty in 21 parameters on model responses by means of the recently developed distance-based global sensitivity analysis (DGSA), studying the main effects and interactions of parameters of various types. The 21 parameters include, for example, the spatial variability of the initial uranium concentration, the mean hydraulic conductivity, and the variogram structure of hydraulic conductivity. DGSA allows for studying multi-variate model responses based on spatial and non-spatial model parameters. When calculating the distances between model responses, in addition to the overall uranium reduction efficacy, we also considered the spatial profiles of the immobilized uranium concentration as a target response. Results show that the mean hydraulic conductivity and the mineral reaction rate are the two most sensitive parameters with regard to overall uranium reduction, but in terms of the spatial distribution of immobilized uranium, the initial uranium concentration and the spatial uncertainty in hydraulic conductivity also become important. These analyses serve as the first step toward prediction of the complex uranium transport and reaction system.

  10. Toward a probabilistic acoustic emission source location algorithm: A Bayesian approach

    NASA Astrophysics Data System (ADS)

    Schumacher, Thomas; Straub, Daniel; Higgins, Christopher

    2012-09-01

    Acoustic emissions (AE) are stress waves initiated by sudden strain releases within a solid body. These can be caused by internal mechanisms such as crack opening or propagation, crushing, or rubbing of crack surfaces. One application for the AE technique in the field of Structural Engineering is Structural Health Monitoring (SHM). With piezo-electric sensors mounted to the surface of the structure, stress waves can be detected, recorded, and stored for later analysis. An important step in quantitative AE analysis is the estimation of the stress wave source locations. Commonly, source location results are presented in a rather deterministic manner as spatial and temporal points, excluding information about uncertainties and errors. Due to variability in the material properties and uncertainty in the mathematical model, measures of uncertainty are needed beyond best-fit point solutions for source locations. This paper introduces a novel holistic framework for the development of a probabilistic source location algorithm. Bayesian analysis methods with Markov Chain Monte Carlo (MCMC) simulation are employed where all source location parameters are described with posterior probability density functions (PDFs). The proposed methodology is applied to an example employing data collected from a realistic section of a reinforced concrete bridge column. The selected approach is general and has the advantage that it can be extended and refined efficiently. Results are discussed and future steps to improve the algorithm are suggested.
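
    A compact Metropolis sketch of the idea in Python, for a 2-D source located from arrival-time differences at four sensors; the geometry, wave speed and timing error are assumed values, and the output is a posterior cloud rather than a single best-fit point.

        import numpy as np

        rng = np.random.default_rng(8)
        sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # m
        v, sigma = 4000.0, 2e-6      # assumed wave speed (m/s) and timing error (s)
        true_src = np.array([0.3, 0.7])
        t_obs = np.linalg.norm(sensors - true_src, axis=1) / v + rng.normal(0, sigma, 4)

        def log_post(src):
            t_pred = np.linalg.norm(sensors - src, axis=1) / v
            dt = (t_obs - t_obs[0]) - (t_pred - t_pred[0])   # arrival-time differences
            return -0.5 * np.sum((dt / sigma) ** 2)

        x, chain = np.array([0.5, 0.5]), []
        for _ in range(20000):
            prop = x + rng.normal(0.0, 0.02, 2)              # random-walk proposal
            if np.log(rng.random()) < log_post(prop) - log_post(x):
                x = prop
            chain.append(x)
        chain = np.array(chain[5000:])                        # discard burn-in
        print("posterior mean:", chain.mean(axis=0).round(3),
              " sd:", chain.std(axis=0).round(3))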

  11. Synthesis of instrumentally and historically recorded earthquakes and studying their spatial statistical relationship (A case study: Dasht-e-Biaz, Eastern Iran)

    NASA Astrophysics Data System (ADS)

    Jalali, Mohammad; Ramazi, Hamidreza

    2018-06-01

    Earthquake catalogues are the main source of statistical seismology for long-term studies of earthquake occurrence. Therefore, addressing spatiotemporal data problems is important to reduce the related uncertainties in statistical seismology studies. A statistical tool, the time normalization method, was used to revise the time-frequency relationship in one of the most active regions of Asia, eastern Iran and western Afghanistan (a and b were calculated as approximately 8.84 and 1.99 on the exponential scale, not the logarithmic scale). A geostatistical simulation method was further utilized to reduce the uncertainties in the spatial domain, producing a representative synthetic catalogue of 5361 events. The synthetic database is classified using a Geographical Information System (GIS), based on simulated magnitudes, to reveal the underlying seismicity patterns. Although some regions of high seismicity correspond to known faults, significantly, as far as seismic patterns are concerned, the new method highlights possible locations of interest that have not been previously identified. It also reveals some previously unrecognized lineations and clusters of likely future strain release.

  12. Flood risk analysis and adaptive strategy in context of uncertainties: a case study of Nhieu Loc Thi Nghe Basin, Ho Chi Minh City

    NASA Astrophysics Data System (ADS)

    Ho, Long-Phi; Chau, Nguyen-Xuan-Quang; Nguyen, Hong-Quan

    2013-04-01

    The Nhieu Loc - Thi Nghe basin is the most important administrative and business area of Ho Chi Minh City. Due to the complexity of the basin system, including increasing trends in rainfall intensity, (tidal) water levels and land subsidence, the simulation of hydrological and hydraulic variables for flood prediction is often inadequate in practical projects. The basin remains highly vulnerable despite multi-million-USD investments in urban drainage improvement projects over the last decade. In this paper, an integrated system analysis covering both spatial and temporal aspects, based on statistical, GIS and modelling approaches, has been conducted in order to: (1) analyse risks before and after the projects, (2) foresee water-related risk under the uncertainties of unfavourable driving factors and (3) develop a sustainable flood risk management strategy for the basin. The results show that, within this framework of risk analysis and adaptive strategy, certain urban development plans in the basin must be carefully revised and/or checked in order to reduce unexpected losses in the future.

  13. The NASA Carbon Airborne Flux Experiment (CARAFE): instrumentation and methodology

    NASA Astrophysics Data System (ADS)

    Wolfe, Glenn M.; Kawa, S. Randy; Hanisco, Thomas F.; Hannun, Reem A.; Newman, Paul A.; Swanson, Andrew; Bailey, Steve; Barrick, John; Thornhill, K. Lee; Diskin, Glenn; DiGangi, Josh; Nowak, John B.; Sorenson, Carl; Bland, Geoffrey; Yungel, James K.; Swenson, Craig A.

    2018-03-01

    The exchange of trace gases between the Earth's surface and atmosphere strongly influences atmospheric composition. Airborne eddy covariance can quantify surface fluxes at local to regional scales (1-1000 km), potentially helping to bridge gaps between top-down and bottom-up flux estimates and offering novel insights into biophysical and biogeochemical processes. The NASA Carbon Airborne Flux Experiment (CARAFE) utilizes the NASA C-23 Sherpa aircraft with a suite of commercial and custom instrumentation to acquire fluxes of carbon dioxide, methane, sensible heat, and latent heat at high spatial resolution. Key components of the CARAFE payload are described, including the meteorological, greenhouse gas, water vapor, and surface imaging systems. Continuous wavelet transforms deliver spatially resolved fluxes along aircraft flight tracks. Flux analysis methodology is discussed in depth, with special emphasis on quantification of uncertainties. Typical uncertainties in derived surface fluxes are 40-90 % for a nominal resolution of 2 km or 16-35 % when averaged over a full leg (typically 30-40 km). CARAFE has successfully flown two missions in the eastern US in 2016 and 2017, quantifying fluxes over forest, cropland, wetlands, and water. Preliminary results from these campaigns are presented to highlight the performance of this system.
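
    At its core, eddy covariance computes a flux as the covariance of vertical wind and scalar fluctuations; the Python sketch below shows the leg-averaged version with synthetic 20 Hz data (the wavelet approach additionally resolves this quantity along the flight track).

        import numpy as np

        rng = np.random.default_rng(9)
        n = 20 * 600                                   # 20 Hz over a 10-minute leg

        w = rng.normal(0.0, 0.5, n)                    # vertical wind (m/s)
        c = 400.0 + 0.8 * w + rng.normal(0.0, 2.0, n)  # CO2 (umol/mol), correlated with w

        w_p = w - w.mean()                             # Reynolds decomposition:
        c_p = c - c.mean()                             # fluctuations about leg means
        flux = np.mean(w_p * c_p)                      # kinematic eddy flux
        print(f"leg-averaged kinematic CO2 flux: {flux:.3f} (m/s)(umol/mol)")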

  14. Determining an empirical estimate of the tracking inconsistency component for true astrometric uncertainties

    NASA Astrophysics Data System (ADS)

    Ramanjooloo, Yudish; Tholen, David J.; Fohring, Dora; Claytor, Zach; Hung, Denise

    2017-10-01

    The asteroid community is moving towards the implementation of a new astrometric reporting format. This new format will finally include complementary astrometric uncertainties in the reported observations. The availability of uncertainties will allow ephemeris predictions and orbit solutions to be constrained with greater reliability, thereby improving the efficiency of the community's follow-up and recovery efforts. Our current uncertainty model comprises the uncertainties in centroiding on the trailed stars and the asteroid, and the uncertainty due to the astrometric solution. The accuracy of our astrometric measurements relies on how well we can minimise the offset between the spatial and temporal centroids of the stars and the asteroid. This offset is currently unmodelled and can be caused by variations in cloud transparency, seeing and tracking inconsistencies. The magnitude zero point of the image, which is affected by fluctuating weather conditions and catalog bias in the photometric magnitudes, can serve as an indicator of the presence and thickness of clouds. Through comparison of the astrometric uncertainties with the orbit solution residuals, it was apparent that a component of the error analysis remained unaccounted for, resulting from cloud coverage and thickness, telescope tracking inconsistencies and variable seeing. This work will attempt to quantify the tracking inconsistency component. We have acquired a rich dataset with the University of Hawaii 2.24 metre telescope (UH-88 inch) that is well positioned for constructing an empirical estimate of the tracking inconsistency component. This work is funded by NASA grant NXX13AI64G.

  15. The visualization of spatial uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Srivastava, R.M.

    1994-12-31

    Geostatistical conditional simulation is gaining acceptance as a numerical modeling tool in the petroleum industry. Unfortunately, many of the new users of conditional simulation work with only one outcome or "realization" and ignore the many other outcomes that could be produced by their conditional simulation tools. 3-D visualization tools allow them to create very realistic images of this single outcome as reality. There are many methods currently available for presenting the uncertainty information from a family of possible outcomes; most of these, however, use static displays and many present uncertainty in a format that is not intuitive. This paper explores the visualization of uncertainty through dynamic displays that exploit the intuitive link between uncertainty and change by presenting the user with a constantly evolving model. The key technical challenge for such a dynamic presentation is the ability to create numerical models that honor the available well data and geophysical information and yet are incrementally different, so that successive frames can be viewed rapidly as an animated cartoon. An example of volumetric uncertainty from a Gulf Coast reservoir is used to demonstrate that such animation is possible and to show that such dynamic displays can be an effective tool in risk analysis for the petroleum industry.

  16. Institutional Mapping Towards Developing a Framework for Sustainable Marine Spatial Planning

    NASA Astrophysics Data System (ADS)

    Yatim, M. H. M.; Omar, A. H.; Abdullah, N. M.; Hashim, N. M.

    2016-09-01

    In recent years, the impetus to implement marine spatial planning has grown with the increasing number of marine activities, which leads to uncertainties in the rights, restrictions and responsibilities of maritime nations. The marine authorities that deal with national rights and legislation are the government institutions that engage with marine spatial information. Several elements must be considered when dealing with marine spatial planning, among them institutional sustainability governance. Given the importance of marine spatial planning for sustainable marine spatial governance, the focus should be on the role of marine institutions in achieving a sustainable marine plan. The iterative process of marine spatial planning among marine institutions is important, as the governance of spatial information is scattered across the rights, restrictions and responsibilities of marine government institutions. Malaysia is one of the maritime nations taking initial steps towards establishing sustainable marine spatial planning. Achieving institutional sustainability in the marine spatial planning process involves four main stages: a planning phase, a plan evaluation phase, an implementation phase and a post-implementation phase. The current situation is marked by an unclear direction and role for marine government institutions in managing marine spatial information. This review paper focuses on institutional sustainability in the interaction of marine government institutions within the marine spatial planning process, based on the Institutional Analysis Framework. The integration of institutional sustainability and the marine spatial planning process will yield a proposed framework for a sustainable marine institutional plan.

  17. Fine-grained suspended sediment source identification for the Kharaa River basin, northern Mongolia

    NASA Astrophysics Data System (ADS)

    Rode, Michael; Theuring, Philipp; Collins, Adrian L.

    2015-04-01

    Fine sediment inputs into river systems can be a major source of nutrients and heavy metals and have a strong impact on the water quality and ecosystem functions of rivers and lakes, including those in semiarid regions. However, little is known to date about the spatial distribution of sediment sources in most large-scale river basins in Central Asia. Accordingly, a sediment source fingerprinting technique was used to assess the spatial sources of fine-grained (<10 microns) sediment in the 15 000 km2 Kharaa River basin in northern Mongolia. Five field sampling campaigns, in late summer 2009 and in spring and late summer of both 2010 and 2011, were conducted directly after high water flows to collect a total of 900 sediment samples. The work used a statistical approach for sediment source discrimination with geochemical composite fingerprints, based on a new Genetic Algorithm (GA)-driven Discriminant Function Analysis, the Kruskal-Wallis H-test and Principal Component Analysis. The composite fingerprints were subsequently used for numerical mass balance modelling with uncertainty analysis. The contributions of the individual sub-catchment spatial sediment sources varied from 6.4% (the headwater sub-catchment of Sugnugur Gol) to 36.2% (the Kharaa II sub-catchment in the middle reaches of the study basin), with the pattern generally showing higher contributions from the sub-catchments in the middle, rather than the upstream, portions of the study area. The importance of riverbank erosion was shown to increase from the upstream to the midstream tributaries. The source tracing procedure provides results in reasonable accordance with previous findings in the study region and demonstrates the general applicability, and the associated uncertainties, of an approach for fine-grained sediment source investigation in large-scale semi-arid catchments. The combined application of source fingerprinting and catchment modelling approaches can be used to assess whether tracing estimates are credible, and in combination such approaches provide a basis for making sediment source apportionment more compelling to catchment stakeholders and managers.
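
    The numerical mass balance step can be sketched as a constrained least-squares unmixing with Monte Carlo repeats for uncertainty; the tracer values and error magnitudes below are hypothetical.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(10)
        # Hypothetical tracer means (rows: 3 sources; columns: 4 tracers)
        sources = np.array([[12.0, 3.0, 45.0, 0.8],
                            [8.0, 5.5, 30.0, 1.2],
                            [15.0, 2.0, 60.0, 0.5]])
        true_p = np.array([0.2, 0.5, 0.3])
        mixture = true_p @ sources                     # downstream sediment signature

        def unmix(mix):
            loss = lambda p: float(np.sum((p @ sources - mix) ** 2))
            res = minimize(loss, np.full(3, 1.0 / 3.0), bounds=[(0.0, 1.0)] * 3,
                           constraints={"type": "eq", "fun": lambda p: p.sum() - 1.0})
            return res.x

        # Monte Carlo uncertainty: perturb the mixture within assumed measurement error
        sols = np.array([unmix(mixture * rng.normal(1.0, 0.02, 4)) for _ in range(200)])
        print("source proportions:", sols.mean(axis=0).round(3),
              "+/-", sols.std(axis=0).round(3))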

  18. The Use of Radar-Based Products for Deriving Extreme Rainfall Frequencies Using Regional Frequency Analysis with Application in South Louisiana

    NASA Astrophysics Data System (ADS)

    Eldardiry, H. A.; Habib, E. H.

    2014-12-01

    Radar-based technologies have made spatially and temporally distributed quantitative precipitation estimates (QPE) available in an operational environment, in contrast to rain gauges. The floods identified through flash flood monitoring and prediction systems are subject to at least three sources of uncertainty: (a) rainfall estimation errors, (b) streamflow prediction errors due to model structural issues, and (c) errors in defining a flood event. The current study focuses on the first source of uncertainty and its effect on deriving important climatological characteristics of extreme rainfall statistics. Examples of such characteristics are rainfall amounts with certain Average Recurrence Intervals (ARI) or Annual Exceedance Probabilities (AEP), which are highly valuable for hydrologic and civil engineering design purposes. Gauge-based precipitation frequency estimates (PFE) have been maturely developed and widely used over the last several decades. More recently, there has been growing interest in the research community in exploring the use of radar-based rainfall products for developing PFE and understanding the associated uncertainties. This study uses radar-based multi-sensor precipitation estimates (MPE) for 11 years to derive PFEs corresponding to various return periods over a spatial domain that covers the state of Louisiana in the southern USA. The PFE estimation approach used in this study is based on fitting a generalized extreme value (GEV) distribution to extreme rainfall data based on annual maximum series (AMS). Among the estimation problems that may arise from fitting GEV distributions at each radar pixel are large variance and seriously biased quantile estimators. Hence, a regional frequency analysis (RFA) approach is applied, which involves the use of data from the pixels surrounding each pixel within a defined homogeneous region. In this study, the region-of-influence approach along with the index flood technique is used in the RFA. A bootstrap procedure is carried out to account for the uncertainty in the distribution parameters and to construct 90% confidence intervals (i.e., 5% and 95% confidence limits) on AMS-based precipitation frequency curves.
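
    The per-pixel building block of this approach, a GEV fit to a short annual maximum series with bootstrap confidence limits, can be sketched with scipy; the synthetic 11-year record below deliberately mimics the short sample length that motivates the regional analysis.

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(11)
        ams = genextreme.rvs(-0.1, loc=80.0, scale=25.0, size=11,
                             random_state=3)           # 11 years of annual maxima (mm)

        def q_ari(data, ari=100.0):
            shape, loc, scale = genextreme.fit(data)   # MLE fit of the GEV
            return genextreme.ppf(1.0 - 1.0 / ari, shape, loc=loc, scale=scale)

        est = q_ari(ams)
        boot = np.array([q_ari(rng.choice(ams, ams.size)) for _ in range(500)])
        lo, hi = np.percentile(boot, [5, 95])
        print(f"100-yr rainfall: {est:.0f} mm, 90% CI [{lo:.0f}, {hi:.0f}] mm")

    The wide interval that typically results from only 11 years of data illustrates the estimation variance that motivates pooling data across a homogeneous region.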

  19. Effective Integration of Earth Observation Data and Flood Modeling for Rapid Disaster Response: The Texas 2015 Case

    NASA Astrophysics Data System (ADS)

    Schumann, G.

    2016-12-01

    Routinely obtaining real-time 2-D inundation patterns of a flood event at a meaningful spatial resolution and over large scales is at the moment only feasible with either operational aircraft flights or satellite imagery. Of course, having model simulations of floodplain inundation available to complement the remote sensing data is highly desirable, for both event re-analysis and forecasting of event inundation. Using the Texas 2015 flood disaster, we demonstrate the value of multi-scale EO data for large-scale 2-D floodplain inundation modeling and forecasting. A dynamic re-analysis of the Texas 2015 flood disaster was run using a 2-D flood model developed for accurate large-scale simulations. We simulated the major rivers entering the Gulf of Mexico and used flood maps produced from both optical and SAR satellite imagery to examine regional model sensitivities and assess associated performance. It was demonstrated that satellite flood maps can complement model simulations and add value, although this is largely dependent on a number of important factors, such as image availability, regional landscape topology, and model uncertainty. In the favorable case where model uncertainty is high, landscape topology is complex (i.e. an urbanized coastal area) and satellite flood maps are available (as with SAR, for instance), satellite data can significantly reduce model uncertainty by identifying the "best possible" model parameter set. However, the more common situation is one where model uncertainty is low and spatially contiguous flooding can be mapped from satellites easily enough, such as in rural large inland river floodplains; consequently, not much value can be added from satellites. Nevertheless, where a large number of flood maps are available, model credibility can be increased substantially. In the case presented here this was true for at least 60% of the many thousands of kilometers of simulated river flow length for which satellite flood maps existed. The next step of this project is to employ a technique termed the "targeted observation" approach, an assimilation-based procedure that allows quantifying the impact observations have on model predictions at the local scale and along the entire river system, when assimilated with the model at specific "overpass" locations.

  20. Space-Time Urban Air Pollution Forecasts

    NASA Astrophysics Data System (ADS)

    Russo, A.; Trigo, R. M.; Soares, A.

    2012-04-01

    Air pollution, like other natural phenomena, may be considered a space-time process. However, the simultaneous integration of time and space is not an easy task to perform, due to the existence of different uncertainty levels and data characteristics. In this work we propose a hybrid method that combines geostatistical and neural models to analyze PM10 time series recorded in the urban area of Lisbon (Portugal) for the 2002-2006 period and to produce forecasts. Geostatistical models have been widely used to characterize air pollution in urban areas, where the pollutant sources are considered diffuse, and also in industrial areas with localized emission sources. It should be stressed, however, that most geostatistical models correspond basically to an interpolation methodology (estimation, simulation) for a set of variables in a spatial or space-time domain. The temporal prediction of a pollutant usually requires knowledge of the main trends and complex patterns of the physical dispersion phenomenon. To deal with low-resolution problems and to enhance the reliability of predictions, an approach is presented here based on neural network short-term predictions at the monitoring stations, which act as local conditioning data for a fine-grid stochastic simulation model. After the pollutant concentration is predicted for a given time period at the monitoring stations, we can use the local conditional distributions of observed values, given the predicted value for that period, to perform spatial simulations for the entire area and consequently evaluate the spatial uncertainty of pollutant concentration. To attain this objective, we propose the use of direct sequential simulation with local distributions. With this approach one succeeds in predicting the space-time distribution of pollutant concentration in a way that accounts for the time prediction uncertainty (reflecting the neural networks' efficiency at each local monitoring station) and the spatial uncertainty as revealed by the spatial variograms. The dataset used consists of PM10 concentrations recorded hourly by 12 monitoring stations within the Lisbon area for the period 2002-2006. In addition, meteorological data recorded at 3 monitoring stations and boundary layer height (BLH) daily values from the ECMWF (European Centre for Medium-Range Weather Forecasts) ERA-Interim reanalysis were also used. Based on the large-scale standard pressure fields from ERA40/ECMWF, prevailing circulation patterns at the regional scale were determined and used in the construction of the models. After the daily forecasts were produced, the differences between the average maps based on real observations and predicted values were determined and the model's performance was assessed. Based on the analysis of the results, we conclude that the proposed approach is a very promising alternative for urban air quality characterization because of its good results and simplicity of application.

  1. The use of multi temporal LiDAR to assess basin-scale erosion and deposition following the catastrophic January 2011 Lockyer flood, SE Queensland, Australia

    NASA Astrophysics Data System (ADS)

    Croke, Jacky; Todd, Peter; Thompson, Chris; Watson, Fiona; Denham, Robert; Khanal, Giri

    2013-02-01

    Advances in remote sensing and digital terrain processing now allow for a sophisticated analysis of spatial and temporal changes in erosion and deposition. Digital elevation models (DEMs) can now be constructed and differenced to produce DEMs of Difference (DoD), which are used to assess net landscape change for morphological budgeting. To date this has been most effectively achieved in gravel-bed rivers over relatively small spatial scales. If the full potential of the technology is to be realised, additional studies are required at larger scales and across a wider range of geomorphic features. This study presents an assessment of the basin-scale spatial patterns of erosion, deposition, and net morphological change that resulted from a catastrophic flood event in the Lockyer Creek catchment of SE Queensland (SEQ) in January 2011. Multitemporal Light Detection and Ranging (LiDAR) DEMs were used to construct a DoD that was then combined with the one-dimensional hydraulic model HEC-RAS to delineate five major geomorphic landforms, including the inner-channel area, within-channel benches, macrochannel banks, and floodplain. The LiDAR uncertainties were quantified and applied together with a probabilistic representation of uncertainty thresholded at a conservative 95% confidence interval. The elevation change distribution (ECD) for the 100-km2 study area indicates a magnitude of elevation change spanning almost 10 m, but the mean elevation change of 0.04 m confirms that a large part of the landscape was characterised by relatively low-magnitude changes over a large spatial area. Mean elevation changes varied by geomorphic feature and only two, the within-channel benches and macrochannel banks, were net erosional, with an estimated combined loss of 1,815,149 m3 of sediment. The floodplain was the zone of major net deposition, but mean elevation changes approached the defined critical limit of uncertainty. Areal and volumetric ECDs for this extreme event provide a representative expression of the balance between erosion and deposition, and importantly of sediment redistribution, which is extremely difficult to quantify using more traditional channel planform or cross-sectional surveys. The ability of LiDAR to make a rapid and accurate assessment of key geomorphic processes over large spatial scales contributes to our understanding of those processes and, as demonstrated here, to the assessment of major geomorphological hazards such as extreme flood events.
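
    The probabilistic thresholding used above follows the standard DoD error-propagation workflow: the vertical errors of the two LiDAR surfaces combine in quadrature, and only cells whose elevation change exceeds a critical multiple of that combined error are retained. A minimal sketch of this workflow (not the authors' code), assuming independent Gaussian survey errors and synthetic DEMs:

```python
import numpy as np
from scipy import stats

def threshold_dod(dem_new, dem_old, sigma_new, sigma_old, confidence=0.95):
    """Difference two DEMs and mask change not distinguishable from noise.

    Survey errors are assumed independent and Gaussian, so the error of the
    difference propagates in quadrature: sigma_DoD = sqrt(s1^2 + s2^2).
    """
    dod = dem_new - dem_old
    sigma_dod = np.sqrt(sigma_new ** 2 + sigma_old ** 2)
    z_crit = stats.norm.ppf(0.5 + confidence / 2.0)   # 1.96 at 95%
    return np.where(np.abs(dod) >= z_crit * sigma_dod, dod, np.nan)

# Two synthetic 100 x 100 LiDAR DEMs with 0.10 m and 0.12 m vertical error.
rng = np.random.default_rng(0)
dem_2010 = rng.normal(250.0, 5.0, (100, 100))
dem_2011 = dem_2010 + rng.normal(0.0, 0.15, (100, 100))
dod = threshold_dod(dem_2011, dem_2010, 0.12, 0.10)
print("cells with significant change:", int(np.isfinite(dod).sum()))
print("net volume change (m3, 1 m cells): %.1f" % np.nansum(dod))
```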

  2. Analyzing the uncertainty of ensemble-based gridded observations in land surface simulations and drought assessment

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, Ali; Moradkhani, Hamid

    2017-12-01

    Hydrologic modeling is one of the primary tools utilized for drought monitoring and drought early warning systems. Several sources of uncertainty in hydrologic modeling have been addressed in the literature. However, few studies have assessed the uncertainty of gridded observation datasets from a drought monitoring perspective. This study provides a hydrologic-modeling-oriented analysis of gridded observation data uncertainties over the Pacific Northwest (PNW) and their implications for drought assessment. We utilized a recently developed 100-member ensemble of observed forcing data to simulate hydrologic fluxes at 1/8° spatial resolution using the Variable Infiltration Capacity (VIC) model, and compared the results with a deterministic observation dataset. Meteorological and hydrological droughts are studied at multiple timescales over the basin, and seasonal long-term trends and variations of drought extent are investigated for each case. Results reveal large uncertainty in the observed datasets at the monthly timescale, with systematic differences in temperature records, mainly due to different lapse rates. This uncertainty results in large disparities in drought characteristics. In general, an increasing trend is found for winter drought extent across the PNW. Furthermore, a ∼3% decrease per decade is detected for snow water equivalent (SWE) over the PNW, with the region being more susceptible to SWE variations in the northern Rockies than in the western Cascades. The agricultural areas of southern Idaho demonstrate a decreasing trend in natural soil moisture as a result of precipitation decline, which implies a growing need for anthropogenic water storage and irrigation systems.
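
    How an observation-forcing ensemble translates into a distribution of drought statistics can be illustrated compactly. The sketch below uses synthetic data, not the study's VIC output; the 20th-percentile threshold is an assumption standing in for a generic percentile-based drought index. It computes drought extent per ensemble member and the across-member spread:

```python
import numpy as np

rng = np.random.default_rng(1)
members, months, cells = 100, 240, 500   # stand-in for the 100-member ensemble

# Synthetic member-specific soil moisture simulations.
soil_moisture = rng.gamma(4.0, 25.0, (members, months, cells))

# A cell is "in drought" when soil moisture falls below the 20th percentile
# of its own climatology (a generic percentile-based index, assumed here).
clim_p20 = np.percentile(soil_moisture, 20, axis=1, keepdims=True)
in_drought = soil_moisture < clim_p20

# Drought extent = fraction of cells in drought, per member and month;
# the across-member spread is the forcing-data uncertainty.
extent = in_drought.mean(axis=2)
spread = extent.max(axis=0) - extent.min(axis=0)
print("mean drought extent:       %.2f" % extent.mean())
print("mean across-member spread: %.2f" % spread.mean())
```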

  3. Reducing uncertainty in dust monitoring to detect aeolian sediment transport responses to land cover change

    NASA Astrophysics Data System (ADS)

    Webb, N.; Chappell, A.; Van Zee, J.; Toledo, D.; Duniway, M.; Billings, B.; Tedela, N.

    2017-12-01

    Anthropogenic land use and land cover change (LULCC) influence global rates of wind erosion and dust emission, yet our understanding of the magnitude of the responses remains poor. Field measurements and monitoring provide essential data to resolve aeolian sediment transport patterns and assess the impacts of human land use and management intensity. Data collected in the field are also required for dust model calibration and testing, as models have become the primary tool for assessing LULCC-dust cycle interactions. However, there is considerable uncertainty in estimates of dust emission due to the spatial variability of sediment transport. Field sampling designs are currently rudimentary and considerable opportunities are available to reduce the uncertainty. Establishing the minimum detectable change is critical for measuring spatial and temporal patterns of sediment transport, detecting potential impacts of LULCC and land management, and for quantifying the uncertainty of dust model estimates. Here, we evaluate the effectiveness of common sampling designs (e.g., simple random sampling, systematic sampling) used to measure and monitor aeolian sediment transport rates. Using data from the US National Wind Erosion Research Network across diverse rangeland and cropland cover types, we demonstrate how only large changes in sediment mass flux (of the order 200% to 800%) can be detected when small sample sizes are used, crude sampling designs are implemented, or when the spatial variation is large. We then show how statistical rigour and the straightforward application of a sampling design can reduce the uncertainty and detect change in sediment transport over time and between land use and land cover types.
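
    The detectability argument above is essentially a two-sample power calculation: the smallest detectable relative change grows with the coefficient of variation of the flux measurements and shrinks with sample size. A hedged sketch using a standard power approximation (parameter values are illustrative, not Network data):

```python
import numpy as np
from scipy import stats

def minimum_detectable_change(cv, n, alpha=0.05, power=0.80):
    """Approximate minimum detectable relative change (fraction of the mean)
    between two sampling campaigns, each with n samples.

    Standard two-sample power approximation:
        MDC ~= (t_{1-alpha/2} + t_{power}) * CV * sqrt(2 / n)
    with CV the coefficient of variation of sediment mass flux measurements.
    """
    df = 2 * (n - 1)
    t_alpha = stats.t.ppf(1 - alpha / 2, df)
    t_beta = stats.t.ppf(power, df)
    return (t_alpha + t_beta) * cv * np.sqrt(2.0 / n)

# With a CV of 1.5 (large spatial variability) and few samplers, only
# changes of several hundred percent are detectable.
for n in (3, 10, 30):
    mdc = minimum_detectable_change(1.5, n)
    print("%2d samplers -> MDC = %.0f%%" % (n, 100 * mdc))
```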

  4. Mapping the spatial distribution of chloride deposition across Australia

    NASA Astrophysics Data System (ADS)

    Davies, P. J.; Crosbie, R. S.

    2018-06-01

    The high solubility and conservative behaviour of chloride make it ideal for use as an environmental tracer of water and salt movement through the hydrologic cycle. For such use, the spatial distribution of chloride deposition in rainfall must be known at a suitable scale. A number of authors have used point data acquired from field studies of chloride deposition around Australia to construct relationships that characterise chloride deposition as a function of distance from the coast; these relationships have allowed chloride deposition to be interpolated in different regions around Australia. In this paper we took this a step further and developed a chloride deposition map for all of Australia that includes a quantification of uncertainty. A previously developed four-parameter model of chloride deposition as a function of distance from the coast for Australia was used as the basis for producing a continental-scale chloride deposition map. Each of the four model parameters was made spatially variable by creating parameter surfaces that were interpolated using a pilot-point regularisation approach within parameter estimation software. The observations of chloride deposition were drawn from a literature review that identified 291 point measurements of chloride deposition over a period of 80 years, spread unevenly across all Australian States and Territories. A best-estimate chloride deposition map was developed from the resulting surfaces on a 0.05 degree grid. The uncertainty in the chloride deposition map was quantified as the 5th and 95th percentiles of 1000 calibrated models produced via Null Space Monte Carlo analysis, and the spatial variability of chloride deposition across the continent was consistent with landscape morphology. The temporal variability in chloride deposition on a decadal scale was investigated in the Murray-Darling Basin; this highlighted the need for long-term monitoring of chloride deposition if the uncertainty of the continental-scale map is to be reduced. Use of the derived chloride deposition map was demonstrated for a probabilistic estimation of groundwater recharge for the southeast of South Australia using the chloride mass balance method.
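
    The core of such a map is a deposition-versus-distance-from-coast relationship fitted to the point observations. The paper's actual four-parameter model is not reproduced here; the sketch below fits a hypothetical four-parameter decay curve to synthetic observations, simply to illustrate the curve-fitting step and the parameter uncertainty it yields:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical four-parameter deposition model (kg/ha/yr) vs distance from
# the coast d (km); the paper's actual functional form is not reproduced here.
def deposition(d, a, b, c, e):
    return a + b * np.exp(-d / c) + e / np.sqrt(d + 1.0)

rng = np.random.default_rng(2)
d_obs = rng.uniform(0.0, 2000.0, 291)           # stand-in for the 291 sites
y_true = deposition(d_obs, 2.0, 60.0, 30.0, 20.0)
y_obs = y_true * rng.lognormal(0.0, 0.2, 291)   # multiplicative scatter

params, cov = curve_fit(deposition, d_obs, y_obs,
                        p0=[1.0, 50.0, 50.0, 10.0],
                        bounds=(0.0, np.inf), maxfev=20000)
print("fitted parameters:    ", np.round(params, 2))
print("1-sigma param. errors:", np.round(np.sqrt(np.diag(cov)), 2))
```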

  5. Towards the Development of a More Accurate Monitoring Procedure for Invertebrate Populations, in the Presence of an Unknown Spatial Pattern of Population Distribution in the Field

    PubMed Central

    Petrovskaya, Natalia B.; Forbes, Emily; Petrovskii, Sergei V.; Walters, Keith F. A.

    2018-01-01

    Studies addressing many ecological problems require accurate evaluation of the total population size. In this paper, we revisit a sampling procedure used for the evaluation of the abundance of an invertebrate population from assessment data collected on a spatial grid of sampling locations. We first discuss how insufficient information about the spatial population density obtained on a coarse sampling grid may affect the accuracy of an evaluation of total population size. Such information deficit in field data can arise because of inadequate spatial resolution of the population distribution (spatially variable population density) when coarse grids are used, which is especially true when a strongly heterogeneous spatial population density is sampled. We then argue that the average trap count (the quantity routinely used to quantify abundance), if obtained from a sampling grid that is too coarse, is a random variable because of the uncertainty in sampling spatial data. Finally, we show that a probabilistic approach similar to bootstrapping techniques can be an efficient tool to quantify the uncertainty in the evaluation procedure in the presence of a spatial pattern reflecting a patchy distribution of invertebrates within the sampling grid. PMID:29495513
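
    The paper's approach is described as similar to bootstrapping, which can be sketched directly: resample the trap counts with replacement and recompute the abundance estimate to obtain its sampling distribution. A minimal illustration with invented counts (not the paper's data):

```python
import numpy as np

def bootstrap_abundance(trap_counts, area_per_trap, total_area,
                        n_boot=10000, seed=0):
    """Bootstrap the total-population estimate obtained from a sampling grid.

    Resampling trap counts with replacement mimics the sampling uncertainty
    that arises when a coarse grid under-resolves a patchy distribution.
    """
    rng = np.random.default_rng(seed)
    counts = np.asarray(trap_counts, dtype=float)
    resamples = rng.choice(counts, size=(n_boot, counts.size), replace=True)
    totals = resamples.mean(axis=1) / area_per_trap * total_area
    return np.percentile(totals, [2.5, 50, 97.5])

# Illustrative patchy counts from a coarse grid of 16 traps.
counts = [0, 0, 1, 0, 14, 2, 0, 23, 1, 0, 0, 3, 18, 0, 1, 0]
lo, med, hi = bootstrap_abundance(counts, area_per_trap=0.25, total_area=100.0)
print("estimated abundance: %.0f (95%% CI %.0f-%.0f)" % (med, lo, hi))
```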

  6. IMPROVING PARTICULATE MATTER SOURCE APPORTIONMENT FOR HEALTH STUDIES: A TRAINED RECEPTOR MODELING APPROACH WITH SENSITIVITY, UNCERTAINTY AND SPATIAL ANALYSES

    EPA Science Inventory

    An approach for conducting PM source apportionment will be developed, tested, and applied that directly addresses limitations in current SA methods, in particular variability, biases, and intensive resource requirements. Uncertainties in SA results and sensitivities to SA inpu...

  7. Climate change and the economics of biomass energy feedstocks in semi-arid agricultural landscapes: A spatially explicit real options analysis.

    PubMed

    Regan, Courtney M; Connor, Jeffery D; Raja Segaran, Ramesh; Meyer, Wayne S; Bryan, Brett A; Ostendorf, Bertram

    2017-05-01

    The economics of establishing perennial species as renewable energy feedstocks has been widely investigated as a climate-change-adapted diversification option for landholders, primarily using net present value (NPV) analysis. NPV does not account for key uncertainties likely to influence relevant landholder decision making. While real options analysis (ROA) is an alternative method that accounts for uncertainty over future conditions and for the large upfront irreversible investment involved in establishing perennials, there have been limited applications of ROA to evaluating the economics of land use change decisions, and even fewer applications considering climate change risks. Further, while the influence of spatially varying climate risk on biomass conversion economics has been widely evaluated using NPV methods, the effects of spatial variability and climate on land use change have scarcely been assessed with ROA. In this study we applied a simulation-based ROA model to evaluate a landholder's decision to convert land from agriculture to biomass. This spatially explicit model considers price and yield risks under a baseline climate and two climate change scenarios over a geographically diverse farming region. We found that underlying variability in primary productivity across the study area had a substantial effect on the conversion thresholds required to trigger land use change when compared to results from NPV analysis. Areas traditionally thought of as quite similar in average productive capacity can display large differences once production and price risks are included. The effects of climate change broadly reduced the returns required for land use change to biomass in low- and medium-rainfall zones and increased them in higher-rainfall areas. Additionally, the risks posed by climate change can further exacerbate the tendency of NPV methods to underestimate true conversion thresholds. Our results show that even under severe drying and warming, where crop yield variability is more affected than perennial biomass plantings, comparatively little of the study area is economically viable for conversion to biomass at prices under $200/DM t, and it is not until prices exceed that level that significant areas become profitable for biomass plantings. We conclude that for biomass to become a valuable diversification option, the synchronisation of the products and services derived from biomass with the development of markets is vital.
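
    The intuition behind simulation-based ROA is that an irreversible conversion competes with the option of waiting for better prices. The sketch below is a generic deferral-option illustration under geometric-Brownian-motion prices, with invented parameter values; it is not the authors' model, which additionally includes yield risk and climate scenarios:

```python
import numpy as np

rng = np.random.default_rng(3)

def npv_convert(price, yield_t=8.0, annual_cost=400.0, estab=2500.0,
                r=0.07, horizon=25):
    """NPV per hectare of converting now at biomass price(s) in $/DM t;
    constant yield and costs (all values illustrative)."""
    annuity = (1 - (1 + r) ** -horizon) / r      # present value of $1/year
    return -estab + (yield_t * np.asarray(price) - annual_cost) * annuity

# Option to defer for 5 years: simulate prices under geometric Brownian
# motion and convert then only if conversion is worthwhile at that time.
p0, mu, sigma, t, r = 150.0, 0.01, 0.20, 5.0, 0.07
z = rng.standard_normal(100_000)
p_t = p0 * np.exp((mu - 0.5 * sigma ** 2) * t + sigma * np.sqrt(t) * z)

value_now = npv_convert(p0)
value_wait = np.exp(-r * t) * np.maximum(npv_convert(p_t), 0.0).mean()
print("convert-now NPV:    $%.0f/ha" % value_now)
print("value of deferring: $%.0f/ha" % value_wait)
print("decision:", "wait" if value_wait > value_now else "convert now")
```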

  8. Uncertainty Analysis of Downscaled CMIP5 Precipitation Data for Louisiana, USA

    NASA Astrophysics Data System (ADS)

    Sumi, S. J.; Tamanna, M.; Chivoiu, B.; Habib, E. H.

    2014-12-01

    The downscaled CMIP3 and CMIP5 Climate and Hydrology Projections dataset contains fine-spatial-resolution translations of climate projections over the contiguous United States, developed using two downscaling techniques (monthly Bias Correction Spatial Disaggregation (BCSD) and daily Bias Correction Constructed Analogs (BCCA)). The objective of this study is to assess the uncertainty of the CMIP5 downscaled general circulation models (GCMs). We performed an analysis of the daily, monthly, seasonal and annual variability of precipitation downloaded from the Downscaled CMIP3 and CMIP5 Climate and Hydrology Projections website for the state of Louisiana, USA at 0.125° x 0.125° resolution. A dataset of daily gridded observations of precipitation for a rectangular boundary covering Louisiana is used to assess the validity of 21 downscaled GCMs for the 1950-1999 period. The following statistics are computed for each of the 21 models with respect to the observed dataset: the correlation coefficient, the bias, the normalized bias, the mean absolute error (MAE), the mean absolute percentage error (MAPE), and the root mean square error (RMSE). A measure of the variability simulated by each model is computed as the ratio of its standard deviation, in both space and time, to the corresponding standard deviation of the observations. The correlation and MAPE statistics are also computed for each of the nine climate divisions of Louisiana. Some of the patterns we observed are: 1) Average annual precipitation rate shows a similar spatial distribution for all the models, within a range of 3.27 to 4.75 mm/day from Northwest to Southeast. 2) The standard deviation of summer (JJA) precipitation (mm/day) for the models remains lower than that of the observations, whereas the models have similar spatial patterns and ranges of values in winter (NDJ). 3) Correlation coefficients of annual precipitation of the models against observations range from -0.48 to 0.36, with spatial distributions that vary by model. 4) Most of the models show negative correlation coefficients in summer and positive ones in winter. 5) MAE shows a similar spatial distribution for all the models, within a range of 5.20 to 7.43 mm/day from Northwest to Southeast of Louisiana. 6) The highest correlation coefficients are found at the seasonal scale, within a range of 0.36 to 0.46.
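
    The statistics listed above are straightforward to compute once the model and observed fields share a grid. A minimal sketch on synthetic arrays (all field names and values are invented):

```python
import numpy as np

def verification_stats(model, obs):
    """Skill statistics for a downscaled field against gridded observations
    (arrays of identical shape, e.g. (time, lat, lon))."""
    m, o = model.ravel(), obs.ravel()
    err = m - o
    return {
        "bias": err.mean(),
        "normalized_bias": err.mean() / o.mean(),
        "mae": np.abs(err).mean(),
        "mape": 100.0 * np.abs(err / o).mean(),
        "rmse": np.sqrt((err ** 2).mean()),
        "corr": np.corrcoef(m, o)[0, 1],
        "std_ratio": m.std() / o.std(),  # simulated vs observed variability
    }

# Synthetic stand-ins for one model and the observations (mm/day).
rng = np.random.default_rng(4)
obs = rng.gamma(2.0, 2.0, (600, 8, 12)) + 0.1   # keep obs > 0 for MAPE
model = 0.8 * obs + rng.normal(0.5, 1.0, obs.shape)
for name, value in verification_stats(model, obs).items():
    print("%-16s %8.3f" % (name, value))
```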

  9. Estimates of tropical analysis differences in daily values produced by two operational centers

    NASA Technical Reports Server (NTRS)

    Kasahara, Akira; Mizzi, Arthur P.

    1992-01-01

    To assess the uncertainty of daily synoptic analyses for the atmospheric state, the intercomparison of three First GARP Global Experiment level IIIb datasets is performed. Daily values of divergence, vorticity, temperature, static stability, vertical motion, mixing ratio, and diagnosed diabatic heating rate are compared for the period of 26 January-11 February 1979. The spatial variance and mean, temporal mean and variance, 2D wavenumber power spectrum, anomaly correlation, and normalized square difference are employed for comparison.

  10. Zonal average earth radiation budget measurements from satellites for climate studies

    NASA Technical Reports Server (NTRS)

    Ellis, J. S.; Haar, T. H. V.

    1976-01-01

    Data from 29 months of satellite radiation budget measurements, taken intermittently over the period 1964 through 1971, are composited into mean monthly, seasonal and annual zonally averaged meridional profiles. The individual months comprising the 29-month set were selected as representing the best available total flux data for compositing into large-scale statistics for climate studies. A discussion of the spatial resolution of the measurements, along with an error analysis including both the uncertainty and the standard error of the mean, is presented.

  11. Modelling ecosystem service flows under uncertainty with stochastic SPAN

    USGS Publications Warehouse

    Johnson, Gary W.; Snapp, Robert R.; Villa, Ferdinando; Bagstad, Kenneth J.

    2012-01-01

    Ecosystem service models are increasingly in demand for decision making. However, the data required to run these models are often patchy, missing, outdated, or untrustworthy. Further, communication of data and model uncertainty to decision makers is often either absent or unintuitive. In this work, we introduce a systematic approach to addressing both the data gap and the difficulty in communicating uncertainty through a stochastic adaptation of the Service Path Attribution Networks (SPAN) framework. The SPAN formalism assesses ecosystem services through a set of up to 16 maps, which characterize the services in a study area in terms of flow pathways between ecosystems and human beneficiaries. Although the SPAN algorithms were originally defined deterministically, we present them here in a stochastic framework which combines probabilistic input data with a stochastic transport model in order to generate probabilistic spatial outputs. This enables a novel feature among ecosystem service models: the ability to spatially visualize uncertainty in the model results. The stochastic SPAN model can analyze areas where data limitations are prohibitive for deterministic models. Greater uncertainty in the model inputs (including missing data) should lead to greater uncertainty expressed in the model’s output distributions. By using Bayesian belief networks to fill data gaps and expert-provided trust assignments to augment untrustworthy or outdated information, we can account for uncertainty in input data, producing a model that is still able to run and provide information where strictly deterministic models could not. Taken together, these attributes enable more robust and intuitive modelling of ecosystem services under uncertainty.
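
    The essence of the stochastic formulation is that probabilistic input maps are sampled repeatedly and pushed through a flow model, so every output cell accumulates a distribution rather than a single value. The toy sketch below illustrates this pattern with a one-dimensional downslope transport; it is not the SPAN algorithms themselves, and all maps and parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(5)
ncell, nreal = 50, 1000

# Probabilistic inputs: per-cell source means with per-cell uncertainty
# (a wide sigma plays the role of missing or untrusted data), and the
# fraction of passing flow each beneficiary cell captures.
src_mu = rng.uniform(0.0, 1.0, ncell)
src_sd = rng.uniform(0.05, 0.5, ncell)
use_frac = rng.uniform(0.0, 0.3, ncell)

outputs = np.empty((nreal, ncell))
for k in range(nreal):
    source = np.clip(rng.normal(src_mu, src_sd), 0.0, None)  # sampled input map
    flow = 0.0
    for i in range(ncell):             # transport along a 1-D flow path
        flow += source[i]              # service generated upslope
        captured = use_frac[i] * flow  # service delivered to beneficiaries here
        outputs[k, i] = captured
        flow -= captured               # remainder continues downslope

mean_map = outputs.mean(axis=0)        # best-estimate service map
sd_map = outputs.std(axis=0)           # spatially explicit uncertainty map
print("cells with the most uncertain delivered service:", np.argsort(sd_map)[-5:])
```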

  12. Uncertainties in the estimation of specific absorption rate during radiofrequency alternating magnetic field induced non-adiabatic heating of ferrofluids

    NASA Astrophysics Data System (ADS)

    Lahiri, B. B.; Ranoo, Surojit; Philip, John

    2017-11-01

    Magnetic fluid hyperthermia (MFH) is becoming a viable cancer treatment methodology in which the alternating-magnetic-field-induced heating of a magnetic fluid is utilized for ablating cancerous cells or making them more susceptible to conventional treatments. The heating efficiency in MFH is quantified in terms of the specific absorption rate (SAR), defined as the heating power generated per unit mass. In the majority of experimental studies, SAR is evaluated from temperature-rise curves obtained under non-adiabatic experimental conditions, a procedure prone to various thermodynamic uncertainties. A proper understanding of the experimental uncertainties and their remedies is a prerequisite for obtaining accurate and reproducible SAR. Here, we study the thermodynamic uncertainties associated with peripheral heating, delayed heating, heat loss from the sample and spatial variation in the temperature profile within the sample. Using first-order approximations, an adiabatic reconstruction protocol for the measured temperature-rise curves is developed for SAR estimation, which is found to be in good agreement with results obtained from the computationally intense slope-corrected method. Our experimental findings clearly show that peripheral and delayed heating are due to radiative heat transfer from the heating coils and the slower response time of the sensor, respectively. Our results suggest that peripheral heating is linearly proportional to the sample area-to-volume ratio and to the coil temperature. It is also observed that peripheral heating decreases in the presence of a non-magnetic insulating shield. Delayed heating is found to contribute up to ~25% uncertainty in SAR values. As SAR values are very sensitive to the initial-slope determination method, explicit mention of the range used for linear regression analysis is necessary to reproduce the results. The effect of the sample volume-to-area ratio on the linear heat loss rate is systematically studied and the results are compared using a lumped-system thermal model. The various uncertainties involved in SAR estimation are categorized as material uncertainties, thermodynamic uncertainties and parametric uncertainties. The adiabatic reconstruction is found to decrease the uncertainties in SAR measurement approximately threefold. Additionally, a set of experimental guidelines for accurate SAR estimation using the adiabatic reconstruction protocol is recommended. These results warrant a universal experimental and data analysis protocol for SAR measurements during field-induced heating of magnetic fluids under non-adiabatic conditions.
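
    The sensitivity to the regression range mentioned above is easy to make concrete. A minimal initial-slope SAR sketch using the commonly used relation SAR = c_p (m_sample/m_particles) dT/dt; the masses, the fit window and the synthetic heating curve are all illustrative assumptions:

```python
import numpy as np

def sar_initial_slope(t, T, m_sample, m_particles, c_p=4186.0,
                      fit_window=(5.0, 30.0)):
    """Estimate SAR (W per gram of magnetic material) from a temperature-rise
    curve by the initial-slope method.

    dT/dt comes from a linear fit over an explicitly stated time window
    (here 5-30 s), since SAR is sensitive to the regression range.
    """
    mask = (t >= fit_window[0]) & (t <= fit_window[1])
    slope, _ = np.polyfit(t[mask], T[mask], 1)          # K/s
    sar_w_per_kg = c_p * (m_sample / m_particles) * slope
    return sar_w_per_kg / 1000.0                        # W/kg -> W/g

# Synthetic non-adiabatic heating curve: exponential approach to steady state.
t = np.linspace(0.0, 300.0, 601)                        # s
T = 25.0 + 12.0 * (1.0 - np.exp(-t / 120.0))            # deg C
print("SAR = %.1f W/g" %
      sar_initial_slope(t, T, m_sample=1.0e-3, m_particles=1.0e-5))
```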

  13. Spatial analysis of health risk assessment with arsenic intake of drinking water in the LanYang plain

    NASA Astrophysics Data System (ADS)

    Chen, C. F.; Liang, C. P.; Jang, C. S.; Chen, J. S.

    2016-12-01

    Groundwater is one of the most important water resources in the Lanyang Plain. The groundwater of the Lanyang Plain contains arsenic levels that exceed the current Taiwan Environmental Protection Administration (Taiwan EPA) limit of 10 μg/L. Arsenic in the groundwater of some areas of the Lanyang Plain poses a serious threat to the safe use of groundwater resources, and such poor water quality can adversely impact drinking water uses, leading to human health risks. This study analyzed the potential health risk associated with the ingestion of arsenic-affected groundwater in the arseniasis-endemic Lanyang Plain. Geostatistical approaches are widely used to analyze the spatial variability and distribution of field data with uncertainty, and the estimation of the spatial distribution of arsenic contamination in groundwater is very important for health risk assessment. This study used the indicator kriging (IK) and ordinary kriging (OK) methods to explore the spatial variability of arsenic pollution, and the differences between the IK and OK estimates were compared. The extent of arsenic pollution was spatially determined, and the target cancer risk (TR) and dose response associated with the ingestion of arsenic in groundwater were evaluated. On this basis, a zonal management plan based on safe groundwater use is formulated. The research findings can serve as a reference for local government administrators in planning regional water supplies and developing groundwater resources in the Lanyang Plain.
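
    Target cancer risk for ingestion is conventionally computed with the standard US EPA exposure equation. The sketch below implements it with common textbook default exposure parameters, which are assumptions and not necessarily the values used in this study:

```python
def target_cancer_risk(c_as_ug_l, ir_l_day=2.0, ef_day_yr=365, ed_yr=30,
                       sf_mg_kg_day=1.5, bw_kg=70.0, at_day=70 * 365):
    """Target cancer risk (TR) for arsenic ingestion via drinking water,
    following the standard US EPA formulation:

        TR = (C * IR * EF * ED * SF) / (BW * AT)

    C is given in ug/L and converted to mg/L; the exposure defaults are
    common textbook values, not those of this study.
    """
    c_mg_l = c_as_ug_l / 1000.0
    return (c_mg_l * ir_l_day * ef_day_yr * ed_yr * sf_mg_kg_day) \
        / (bw_kg * at_day)

# Even at the 10 ug/L Taiwan EPA limit, TR exceeds the commonly used
# 1e-6 acceptable-risk threshold.
for c in (10, 50, 250):
    print("C = %3d ug/L -> TR = %.1e" % (c, target_cancer_risk(c)))
```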

  14. Medical Geography: a Promising Field of Application for Geostatistics

    PubMed Central

    Goovaerts, P.

    2008-01-01

    The analysis of health data and putative covariates, such as environmental, socio-economic, behavioral or demographic factors, is a promising application for geostatistics. It presents, however, several methodological challenges that arise from the fact that data are typically aggregated over irregular spatial supports and consist of a numerator and a denominator (i.e. population size). This paper presents an overview of recent developments in the field of health geostatistics, with an emphasis on three main steps in the analysis of areal health data: estimation of the underlying disease risk, detection of areas with significantly higher risk, and analysis of relationships with putative risk factors. The analysis is illustrated using age-adjusted cervix cancer mortality rates recorded over the 1970–1994 period for 118 counties of four states in the Western USA. Poisson kriging allows the filtering of noisy mortality rates computed from small population sizes, enhancing the correlation with two putative explanatory variables: percentage of inhabitants living below the federally defined poverty line, and percentage of Hispanic females. The area-to-point kriging formulation creates continuous maps of mortality risk, reducing the visual bias associated with the interpretation of choropleth maps. Stochastic simulation is used to generate realizations of cancer mortality maps, which allows one to quantify numerically how the uncertainty about the spatial distribution of health outcomes translates into uncertainty about the location of clusters of high values or the correlation with covariates. Last, geographically-weighted regression highlights the non-stationarity in the explanatory power of covariates: the higher mortality values along the coast are better explained by the two covariates than the lower risk recorded in Utah. PMID:19412347

  15. Global economic trade-offs between wild nature and tropical agriculture

    PubMed Central

    Webb, Edward L.; Symes, William S.; Koh, Lian P.

    2017-01-01

    Global demands for agricultural and forestry products provide economic incentives for deforestation across the tropics. Much of this deforestation occurs with a lack of information on the spatial distribution of benefits and costs of deforestation. To inform global sustainable land-use policies, we combine geographic information systems (GIS) with a meta-analysis of ecosystem services (ES) studies to perform a spatially explicit analysis of the trade-offs between agricultural benefits, carbon emissions, and losses of multiple ecosystem services because of tropical deforestation from 2000 to 2012. Even though the value of ecosystem services presents large inherent uncertainties, we find a pattern supporting the argument that the externalities of destroying tropical forests are greater than the current direct economic benefits derived from agriculture in all cases bar one: when yield and rent potentials of high-value crops could be realized in the future. Our analysis identifies the Atlantic Forest, areas around the Gulf of Guinea, and Thailand as areas where agricultural conversion appears economically efficient, indicating a major impediment to the long-term financial sustainability of Reducing Emissions from Deforestation and forest Degradation (REDD+) schemes in those countries. By contrast, Latin America, insular Southeast Asia, and Madagascar present areas with low agricultural rents (ARs) and high values in carbon stocks and ES, suggesting that they are economically viable conservation targets. Our study helps identify optimal areas for conservation and agriculture together with their associated uncertainties, which could enhance the efficiency and sustainability of pantropical land-use policies and help direct future research efforts. PMID:28732022

  16. Land cover mapping and change detection in urban watersheds using QuickBird high spatial resolution satellite imagery

    NASA Astrophysics Data System (ADS)

    Hester, David Barry

    The objective of this research was to develop methods for urban land cover analysis using QuickBird high spatial resolution satellite imagery. Such imagery has emerged as a rich commercially available remote sensing data source and has enjoyed high-profile broadcast news media and Internet applications, but methods of quantitative analysis have not been thoroughly explored. The research described here consists of three studies focused on the use of pan-sharpened 61-cm spatial resolution QuickBird imagery, the spatial resolution of which is the highest of any commercial satellite. In the first study, a per-pixel land cover classification method is developed for use with this imagery. This method utilizes a per-pixel classification approach to generate an accurate six-category high spatial resolution land cover map of a developing suburban area. The primary objective of the second study was to develop an accurate land cover change detection method for use with QuickBird land cover products. This work presents an efficient fuzzy framework for transforming map uncertainty into accurate and meaningful high spatial resolution land cover change analysis. The third study described here is an urban planning application of the high spatial resolution QuickBird-based land cover product developed in the first study. This work both meaningfully connects this exciting new data source to urban watershed management and makes an important empirical contribution to the study of suburban watersheds. Its analysis of residential roads and driveways as well as retail parking lots sheds valuable light on the impact of transportation-related land use on the suburban landscape. Broadly, these studies provide new methods for using state-of-the-art remote sensing data to inform land cover analysis and urban planning. These methods are widely adaptable and produce land cover products that are both meaningful and accurate. As additional high spatial resolution satellites are launched and the cost of high resolution imagery continues to decline, this research makes an important contribution to this exciting era in the science of remote sensing.

  17. Daniel Goodman’s empirical approach to Bayesian statistics

    USGS Publications Warehouse

    Gerrodette, Tim; Ward, Eric; Taylor, Rebecca L.; Schwarz, Lisa K.; Eguchi, Tomoharu; Wade, Paul; Himes Boor, Gina

    2016-01-01

    Bayesian statistics, in contrast to classical statistics, uses probability to represent uncertainty about the state of knowledge. Bayesian statistics has often been associated with the idea that knowledge is subjective and that a probability distribution represents a personal degree of belief. Dr. Daniel Goodman considered this viewpoint problematic for issues of public policy. He sought to ground his Bayesian approach in data, and advocated the construction of a prior as an empirical histogram of “similar” cases. In this way, the posterior distribution that results from a Bayesian analysis combined comparable previous data with case-specific current data, using Bayes’ formula. Goodman championed such a data-based approach, but he acknowledged that it was difficult in practice. If based on a true representation of our knowledge and uncertainty, Goodman argued that risk assessment and decision-making could be an exact science, despite the uncertainties. In his view, Bayesian statistics is a critical component of this science because a Bayesian analysis produces the probabilities of future outcomes. Indeed, Goodman maintained that the Bayesian machinery, following the rules of conditional probability, offered the best legitimate inference from available data. We give an example of an informative prior in a recent study of Steller sea lion spatial use patterns in Alaska.
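
    Goodman's empirical-prior idea can be shown in a few lines: the prior is a histogram over parameter values from comparable past cases, and Bayes' formula combines it with the case-specific likelihood. A toy sketch with invented numbers (a binomial survival example chosen for illustration, not one of Goodman's analyses):

```python
import numpy as np

# Survival rates estimated in 12 comparable past studies -> empirical prior.
past_rates = np.array([0.78, 0.82, 0.75, 0.88, 0.80, 0.84,
                       0.79, 0.86, 0.81, 0.77, 0.83, 0.85])
grid = np.linspace(0.5, 1.0, 501)
dg = grid[1] - grid[0]

hist, edges = np.histogram(past_rates, bins=25, range=(0.5, 1.0), density=True)
bin_of = np.clip(np.searchsorted(edges, grid, side="right") - 1, 0, hist.size - 1)
prior = np.maximum(hist[bin_of], 1e-9)   # keep a sliver of support everywhere

# Case-specific data: 47 of 60 tracked animals survived (binomial likelihood).
k, n = 47, 60
likelihood = grid ** k * (1.0 - grid) ** (n - k)

posterior = prior * likelihood
posterior /= posterior.sum() * dg        # normalize on the grid

post_mean = (grid * posterior).sum() * dg
print("posterior mean survival rate: %.3f" % post_mean)
```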

  18. Planning spatial sampling of the soil from an uncertain reconnaissance variogram

    NASA Astrophysics Data System (ADS)

    Lark, R. Murray; Hamilton, Elliott M.; Kaninga, Belinda; Maseka, Kakoma K.; Mutondo, Moola; Sakala, Godfrey M.; Watts, Michael J.

    2017-12-01

    An estimated variogram of a soil property can be used to support a rational choice of sampling intensity for geostatistical mapping. However, it is known that estimated variograms are subject to uncertainty. In this paper we address two practical questions. First, how can we make a robust decision on sampling intensity, given the uncertainty in the variogram? Second, what are the costs incurred in terms of oversampling because of uncertainty in the variogram model used to plan sampling? To achieve this we show how samples of the posterior distribution of variogram parameters, from a computational Bayesian analysis, can be used to characterize the effects of variogram parameter uncertainty on sampling decisions. We show how one can select a sample intensity so that a target value of the kriging variance is not exceeded with some specified probability. This will lead to oversampling, relative to the sampling intensity that would be specified if there were no uncertainty in the variogram parameters. One can estimate the magnitude of this oversampling by treating the tolerable grid spacing for the final sample as a random variable, given the target kriging variance and the posterior sample values. We illustrate these concepts with some data on total uranium content in a relatively sparse sample of soil from agricultural land near mine tailings in the Copperbelt Province of Zambia.
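
    Operationally, the procedure described above amounts to evaluating the kriging variance implied by each posterior variogram sample at a candidate grid spacing and checking how often the target is met. The sketch below does this for the worst-case point at the centre of a square grid cell, with an exponential variogram and synthetic posterior samples (all parameter values are invented):

```python
import numpy as np

def gamma_exp(h, c0, c1, a):
    """Exponential variogram: nugget c0, partial sill c1, range parameter a."""
    return np.where(h > 0, c0 + c1 * (1.0 - np.exp(-h / a)), 0.0)

def ok_variance_center(spacing, c0, c1, a):
    """Ordinary-kriging variance at the centre of a square sampling cell,
    predicted from the 4 surrounding grid nodes (a worst-case location)."""
    half = spacing / 2.0
    pts = np.array([[-half, -half], [half, -half], [-half, half], [half, half]])
    n = len(pts)
    A = np.ones((n + 1, n + 1))
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
    A[:n, :n] = gamma_exp(d, c0, c1, a)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = gamma_exp(np.linalg.norm(pts, axis=1), c0, c1, a)
    w = np.linalg.solve(A, b)
    return float(w @ b)   # = sum(lambda_i * gamma_i0) + Lagrange multiplier

# Synthetic posterior samples of (nugget, partial sill, range) standing in
# for the Bayesian variogram analysis.
rng = np.random.default_rng(6)
post = np.column_stack([rng.uniform(0.05, 0.15, 500),    # c0
                        rng.uniform(0.40, 0.90, 500),    # c1
                        rng.uniform(40.0, 120.0, 500)])  # a (m)

target_kv = 0.45
for spacing in (200, 150, 100, 75, 50, 25):
    kv = np.array([ok_variance_center(spacing, *p) for p in post])
    print("spacing %4d m: P(kriging variance <= target) = %.2f"
          % (spacing, np.mean(kv <= target_kv)))
# Choose the largest spacing whose probability meets the assurance level,
# e.g. 0.90; tighter spacings quantify the cost of variogram uncertainty.
```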

  19. Multivariate Geostatistical Analysis of Uncertainty for the Hydrodynamic Model of a Geological Trap for Carbon Dioxide Storage. Case study: Multilayered Geological Structure Vest Valcele, ROMANIA

    NASA Astrophysics Data System (ADS)

    Scradeanu, D.; Pagnejer, M.

    2012-04-01

    The purpose of this work is to evaluate the uncertainty of the hydrodynamic model of a multilayered geological structure, a potential trap for carbon dioxide storage. The hydrodynamic model is based on a conceptual model of the multilayered hydrostructure with three components: 1) a spatial model; 2) a parametric model; and 3) an energy model. The data necessary to build the three components of the conceptual model were obtained from 240 boreholes explored by geophysical logging and seismic investigation, for the first two components, and from an experimental water injection test for the last one. The hydrodynamic model is a finite-difference numerical model based on a 3D stratigraphic model with nine stratigraphic units (Badenian and Oligocene) and a 3D multiparameter model (porosity, permeability, hydraulic conductivity, storage coefficient, leakage, etc.). The uncertainty of the two 3D models was evaluated using multivariate geostatistical tools: a) the cross-semivariogram for structural analysis, especially the study of anisotropy, and b) cokriging, to reduce estimation variances in the specific situation where there is cross-correlation between a variable and one or more undersampled variables. Important differences between univariate and bivariate anisotropy were identified. The minimised uncertainty of the parametric model (obtained by cokriging) was transferred to the hydrodynamic model. The uncertainty distribution of the pressures generated by the water injection test was additionally filtered by the sensitivity of the numerical model. The resulting relative errors of the pressure distribution in the hydrodynamic model are 15-20%. The scientific research was performed in the framework of the European FP7 project "A multiple space and time scale approach for the quantification of deep saline formations for CO2 storage (MUSTANG)".

  20. A new framework for quantifying uncertainties in modelling studies for future climates - how more certain are CMIP5 precipitation and temperature simulations compared to CMIP3?

    NASA Astrophysics Data System (ADS)

    Sharma, A.; Woldemeskel, F. M.; Sivakumar, B.; Mehrotra, R.

    2014-12-01

    We outline a new framework for assessing uncertainties in model simulations, be they hydro-ecological simulations for known scenarios or climate simulations for assumed scenarios representing the future. This framework is illustrated here using GCM projections of future climates for hydrologically relevant variables (precipitation and temperature), with the uncertainty segregated into three dominant components: model uncertainty, scenario uncertainty (representing greenhouse gas emission scenarios), and ensemble uncertainty (representing uncertain initial conditions and states). A novel uncertainty metric, the Square Root Error Variance (SREV), is used to quantify the uncertainties involved. The SREV requires: (1) interpolating raw and corrected GCM outputs to a common grid; (2) converting these to percentiles; (3) estimating SREV for model, scenario, initial-condition and total uncertainty at each percentile; and (4) transforming SREV to a time series. The outcome is a spatially varying series of SREVs associated with each model that can be used to assess how uncertain the system is at each simulated point or time. This framework, while illustrated in a climate change context, is applicable to the assessment of uncertainties in any modelling framework. The proposed method is applied to monthly precipitation and temperature from 6 CMIP3 and 13 CMIP5 GCMs across the world. For CMIP3, the B1, A1B and A2 scenarios are considered, whereas for CMIP5, RCP2.6, RCP4.5 and RCP8.5, representing low, medium and high emissions, are used. For both CMIP3 and CMIP5, model structure is the largest source of uncertainty, which reduces significantly after correcting for biases. Scenario uncertainty increases in the future, especially for temperature, due to the divergence of the three emission scenarios analysed. While CMIP5 precipitation simulations exhibit a small reduction in total uncertainty over CMIP3, almost no reduction is observed for temperature projections. Estimation of uncertainty in both space and time sheds light on the spatial and temporal patterns of uncertainties in GCM outputs, providing an effective platform for risk-based assessments of any alternate plans or decisions that may be formulated using GCM simulations.
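
    One plausible reading of steps (1)-(4) is sketched below on synthetic data: after the rank transform, the spread of the models' quantile values at each percentile level gives the model-uncertainty SREV, which is then mapped back onto each simulated value through its percentile. The exact estimator in the paper may differ; everything here is illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)
models, years, cells = 13, 100, 200
sims = rng.gamma(2.0, 40.0, (models, years, cells)) * \
       rng.uniform(0.8, 1.2, (models, 1, 1))   # model-specific bias

# Step 2: rank-transform each model's series into percentiles.
ranks = sims.argsort(axis=1).argsort(axis=1)
pct = (ranks + 0.5) / years

# Step 3: at each percentile level, model uncertainty is the spread of the
# models' quantile values; take its root mean square about the ensemble mean.
levels = np.linspace(0.05, 0.95, 19)
q = np.quantile(sims, levels, axis=1)          # shape (levels, models, cells)
srev = np.sqrt(((q - q.mean(axis=1, keepdims=True)) ** 2).mean(axis=1))

# Step 4: attach an SREV to every simulated value via its percentile,
# giving an uncertainty time series (shown for model 0, cell 0).
idx = np.clip(np.searchsorted(levels, pct), 0, len(levels) - 1)
srev_ts = srev[idx[0, :, 0], 0]
print("mean model-uncertainty SREV (mm):", round(float(srev.mean()), 1))
print("first 5 SREV values, model 0 / cell 0:", np.round(srev_ts[:5], 1))
```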

  1. Line-averaging measurement methods to estimate the gap in the CO2 balance closure - possibilities, challenges, and uncertainties

    NASA Astrophysics Data System (ADS)

    Ziemann, Astrid; Starke, Manuela; Schütze, Claudia

    2017-11-01

    An imbalance of surface energy fluxes measured with the eddy covariance (EC) method is observed in global measurement networks even though all necessary corrections and conversions are applied to the raw data. Mainly during nighttime, advection can occur, resulting in a closure gap that should consequently also affect CO2 balances. There is a crucial need for representative concentration and wind data to measure advective fluxes. Ground-based remote sensing techniques are an ideal tool as they provide spatially representative CO2 concentrations together with wind components within the same voxel structure. For this purpose, the presented SQuAd (Spatially resolved Quantification of the Advection influence on the balance closure of greenhouse gases) approach applies an integrated combination of acoustic and optical remote sensing methods. The innovative combination of acoustic travel-time tomography (A-TOM) and open-path Fourier-transform infrared spectroscopy (OP-FTIR) will enable an upscaling and enhancement of EC measurements. OP-FTIR instrumentation offers the significant advantage of real-time simultaneous measurements of line-averaged concentrations of CO2 and other greenhouse gases (GHGs). A-TOM is a scalable method to remotely resolve 3-D wind and temperature fields. The paper gives an overview of the proposed SQuAd approach and first results of experimental tests at the FLUXNET site Grillenburg in Germany. Preliminary results of the comprehensive experiments reveal a mean nighttime horizontal advection of CO2 of about 10 µmol m-2 s-1 estimated by the spatially integrating and representative SQuAd method. Additionally, the uncertainties in determining CO2 concentrations using passive OP-FTIR and wind speed using A-TOM are systematically quantified. The maximum uncertainty for CO2 concentration, accounting for environmental parameters, instrumental characteristics and the retrieval procedure, was estimated at approximately 30% for a single measurement. Instantaneous wind components can be derived with a maximum uncertainty of 0.3 m s-1, depending on sampling, signal analysis, and environmental influences on sound propagation. Averaging over a period of 30 min, the standard error of the mean values can be decreased by a factor of at least 0.5 for OP-FTIR and 0.1 for A-TOM, depending on the required spatial resolution. The presented validation of the joint application of the two independent, nonintrusive methods focuses on their ability to quantify advective fluxes.

  2. Uncertainties in Life Cycle Greenhouse Gas Emissions from Advanced Biomass Feedstock Logistics Supply Chains in Kansas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cafferty, Kara G.; Searcy, Erin M.; Nguyen, Long

    To meet Energy Independence and Security Act (EISA) cellulosic biofuel mandates, the United States will require an annual domestic supply of about 242 million Mg of biomass by 2022. To improve the feedstock logistics of lignocellulosic biofuels and access available biomass resources from areas with varying yields, commodity systems have been proposed and designed to deliver on-spec biomass feedstocks at preprocessing “depots”, which densify and stabilize the biomass prior to long-distance transport and delivery to centralized biorefineries. The harvesting, preprocessing, and logistics (HPL) of biomass commodity supply chains thus could introduce spatially variable environmental impacts into the biofuel life cycle due to the need to harvest, move, and preprocess biomass from multiple distances with variable spatial density. This study examines the uncertainty in greenhouse gas (GHG) emissions of corn stover HPL within a bio-ethanol supply chain in the state of Kansas, where the sustainable biomass supply varies spatially. Two scenarios were evaluated, each having a different number of depots of varying capacity and location within Kansas relative to a central commodity-receiving biorefinery, to test GHG emissions uncertainty. Monte Carlo simulation was used to estimate the spatial uncertainty in the HPL gate-to-gate sequence. The results show that the transport of densified biomass introduces the highest variability and contribution to the carbon footprint of the HPL supply chain (0.2-13 g CO2e/MJ). Moreover, depending upon the biomass availability, its spatial density, and the surrounding transportation infrastructure (road and rail), HPL processes can increase the variability in life cycle environmental impacts for lignocellulosic biofuels. Within Kansas, life cycle GHG emissions could range from 24 to 41 g CO2e/MJ depending upon the location, size and number of preprocessing depots constructed. However, this range can be minimized by optimizing the siting of preprocessing depots where ample rail infrastructure exists to supply biomass commodity to a regional biorefinery supply system.

  3. Uncertainties in Life Cycle Greenhouse Gas Emissions from Advanced Biomass Feedstock Logistics Supply Chains in Kansas

    DOE PAGES

    Cafferty, Kara G.; Searcy, Erin M.; Nguyen, Long; ...

    2014-11-04

    To meet Energy Independence and Security Act (EISA) cellulosic biofuel mandates, the United States will require an annual domestic supply of about 242 million Mg of biomass by 2022. To improve the feedstock logistics of lignocellulosic biofuels and access available biomass resources from areas with varying yields, commodity systems have been proposed and designed to deliver on-spec biomass feedstocks at preprocessing “depots”, which densify and stabilize the biomass prior to long-distance transport and delivery to centralized biorefineries. The harvesting, preprocessing, and logistics (HPL) of biomass commodity supply chains thus could introduce spatially variable environmental impacts into the biofuel life cycle due to the need to harvest, move, and preprocess biomass from multiple distances with variable spatial density. This study examines the uncertainty in greenhouse gas (GHG) emissions of corn stover HPL within a bio-ethanol supply chain in the state of Kansas, where the sustainable biomass supply varies spatially. Two scenarios were evaluated, each having a different number of depots of varying capacity and location within Kansas relative to a central commodity-receiving biorefinery, to test GHG emissions uncertainty. Monte Carlo simulation was used to estimate the spatial uncertainty in the HPL gate-to-gate sequence. The results show that the transport of densified biomass introduces the highest variability and contribution to the carbon footprint of the HPL supply chain (0.2-13 g CO2e/MJ). Moreover, depending upon the biomass availability, its spatial density, and the surrounding transportation infrastructure (road and rail), HPL processes can increase the variability in life cycle environmental impacts for lignocellulosic biofuels. Within Kansas, life cycle GHG emissions could range from 24 to 41 g CO2e/MJ depending upon the location, size and number of preprocessing depots constructed. However, this range can be minimized by optimizing the siting of preprocessing depots where ample rail infrastructure exists to supply biomass commodity to a regional biorefinery supply system.
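
    The Monte Carlo step described in these two records is structurally simple: each gate-to-gate segment gets an uncertain emission factor, transport adds a spatially variable haul distance, and the segments are summed per realization. A sketch with invented distributions (not the study's inventory data):

```python
import numpy as np

rng = np.random.default_rng(8)
n = 100_000   # Monte Carlo realizations (all distributions illustrative)

harvest = rng.normal(4.0, 0.6, n)                       # g CO2e/MJ
preprocess = rng.normal(6.5, 1.0, n)                    # depot densification
distance_km = rng.lognormal(np.log(120.0), 0.5, n)      # spatially variable haul
transport = distance_km * rng.normal(0.035, 0.007, n)   # g CO2e/MJ

total = harvest + preprocess + transport
lo, hi = np.percentile(total, [5, 95])
print("transport share of total variance: %.0f%%"
      % (100.0 * transport.var() / total.var()))
print("total HPL emissions: %.1f g CO2e/MJ (90%% interval %.1f-%.1f)"
      % (total.mean(), lo, hi))
```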

  4. Using high-resolution soil moisture modelling to assess the uncertainty of microwave remotely sensed soil moisture products at the correct spatial and temporal support

    NASA Astrophysics Data System (ADS)

    Wanders, N.; Karssenberg, D.; Bierkens, M. F. P.; Van Dam, J. C.; De Jong, S. M.

    2012-04-01

    Soil moisture is a key variable in the hydrological cycle and important in hydrological modelling. When assimilating soil moisture into flood forecasting models, the improvement in forecasting skill depends on the ability to accurately estimate the spatial and temporal patterns of soil moisture content throughout the river basin. Space-borne remote sensing may provide this information with a high temporal and spatial resolution and with global coverage. Currently three microwave soil moisture products are available: AMSR-E, ASCAT and SMOS. The quality of these satellite-based products is often assessed by comparing them with in-situ observations of soil moisture. This comparison is however hampered by the difference in spatial and temporal support (i.e., resolution, scale), because the spatial resolution of microwave satellites is rather low compared to in-situ field measurements. Thus, the aim of this study is to derive a method to assess the uncertainty of microwave satellite soil moisture products at the correct spatial support. To overcome the difference in support size between in-situ soil moisture observations and remotely sensed soil moisture, we used a stochastic, distributed unsaturated zone model (SWAP; Van Dam, 2000) that is upscaled to the support of the different satellite products. A detailed assessment of the SWAP model uncertainty is included to ensure that the uncertainty in satellite soil moisture is not overestimated due to an underestimation of the model uncertainty. We simulated unsaturated water flow to a depth of 1.5 m with a vertical resolution of 1 to 10 cm and on a horizontal grid of 1 km2 for the period Jan 2010 - Jun 2011. The SWAP model was first calibrated and validated on in-situ data of the REMEDHUS soil moisture network (Spain). Next, to evaluate the satellite products, the model was run for areas in the proximity of 79 meteorological stations in Spain, where model results were aggregated to the correct support of each satellite product by averaging model results from the 1 km2 grid within the remote sensing footprint. Overall, 440 (AMSR-E, SMOS) to 680 (ASCAT) time series were compared to the aggregated SWAP model results, providing valuable information on the uncertainty of satellite soil moisture at the proper support. Our results show that temporal dynamics are best captured by ASCAT, resulting in an average correlation of 0.72 with the model, while AMSR-E (0.41) and SMOS (0.42) are less capable of representing these dynamics. Standard deviations found for ASCAT and SMOS are low, 0.049 and 0.051 m3 m-3 respectively, while AMSR-E has a higher value of 0.062 m3 m-3. All standard deviations are higher than the average model uncertainty of 0.017 m3 m-3. All satellite products show a negative bias compared to the model results, with the largest value for SMOS. Satellite uncertainty is not found to be significantly related to topography, but is found to increase in densely vegetated areas. In general AMSR-E has the most difficulty capturing soil moisture dynamics in Spain, while SMOS and especially ASCAT have a fair to good performance. However, all products contain valuable information about the near-surface soil moisture over Spain. Reference: Van Dam, J.C. (2000). Field scale water flow and solute transport: SWAP model concepts, parameter estimation and case studies. Ph.D. thesis, Wageningen University.
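
    The upscaling step, averaging the 1 km2 model cells inside a satellite footprint before computing validation statistics, is the methodological core here. A minimal sketch on synthetic fields (the mask shape, error magnitudes and all numbers are invented):

```python
import numpy as np

def validate_at_footprint(model_1km, sat_series, footprint_mask):
    """Aggregate 1 km2 model soil moisture to a satellite footprint and
    compare the satellite retrieval at that support.

    model_1km: (time, ny, nx) simulated soil moisture
    footprint_mask: (ny, nx) boolean, True for cells inside the footprint
    """
    model_fp = model_1km[:, footprint_mask].mean(axis=1)  # upscaled series
    diff = sat_series - model_fp
    return np.corrcoef(model_fp, sat_series)[0, 1], diff.mean(), diff.std()

# Synthetic example: a 25 x 25 km footprint on a 1 km model grid.
rng = np.random.default_rng(9)
t, ny, nx = 500, 50, 50
season = 0.25 + 0.08 * np.sin(np.linspace(0.0, 20.0, t))[:, None, None]
model = season + rng.normal(0.0, 0.02, (t, ny, nx))
mask = np.zeros((ny, nx), dtype=bool)
mask[10:35, 10:35] = True
sat = model[:, mask].mean(axis=1) + rng.normal(-0.03, 0.05, t)  # biased, noisy

r, bias, sd = validate_at_footprint(model, sat, mask)
print("r = %.2f, bias = %.3f, sd of diff = %.3f m3/m3" % (r, bias, sd))
```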

  5. Optimal portfolio design to reduce climate-related conservation uncertainty in the Prairie Pothole Region.

    PubMed

    Ando, Amy W; Mallory, Mindy L

    2012-04-24

    Climate change is likely to alter the spatial distributions of species and habitat types but the nature of such change is uncertain. Thus, climate change makes it difficult to implement standard conservation planning paradigms. Previous work has suggested some approaches to cope with such uncertainty but has not harnessed all of the benefits of risk diversification. We adapt Modern Portfolio Theory (MPT) to optimal spatial targeting of conservation activity, using wetland habitat conservation in the Prairie Pothole Region (PPR) as an example. This approach finds the allocations of conservation activity among subregions of the planning area that maximize the expected conservation returns for a given level of uncertainty or minimize uncertainty for a given expected level of returns. We find that using MPT instead of simple diversification in the PPR can achieve a value of the conservation objective per dollar spent that is 15% higher for the same level of risk. MPT-based portfolios can also have 21% less uncertainty over benefits or 6% greater expected benefits than the current portfolio of PPR conservation. Total benefits from conservation investment are higher if returns are defined in terms of benefit-cost ratios rather than benefits alone. MPT-guided diversification can work to reduce the climate-change-induced uncertainty of future ecosystem-service benefits from many land policy and investment initiatives, especially when outcomes are negatively correlated between subregions of a planning area.

  6. Optimal portfolio design to reduce climate-related conservation uncertainty in the Prairie Pothole Region

    PubMed Central

    Ando, Amy W.; Mallory, Mindy L.

    2012-01-01

    Climate change is likely to alter the spatial distributions of species and habitat types but the nature of such change is uncertain. Thus, climate change makes it difficult to implement standard conservation planning paradigms. Previous work has suggested some approaches to cope with such uncertainty but has not harnessed all of the benefits of risk diversification. We adapt Modern Portfolio Theory (MPT) to optimal spatial targeting of conservation activity, using wetland habitat conservation in the Prairie Pothole Region (PPR) as an example. This approach finds the allocations of conservation activity among subregions of the planning area that maximize the expected conservation returns for a given level of uncertainty or minimize uncertainty for a given expected level of returns. We find that using MPT instead of simple diversification in the PPR can achieve a value of the conservation objective per dollar spent that is 15% higher for the same level of risk. MPT-based portfolios can also have 21% less uncertainty over benefits or 6% greater expected benefits than the current portfolio of PPR conservation. Total benefits from conservation investment are higher if returns are defined in terms of benefit–cost ratios rather than benefits alone. MPT-guided diversification can work to reduce the climate-change–induced uncertainty of future ecosystem-service benefits from many land policy and investment initiatives, especially when outcomes are negatively correlated between subregions of a planning area. PMID:22451914
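
    The portfolio step in both records above is the classic mean-variance problem: choose budget shares across subregions to minimize outcome variance at a target expected return. A hedged sketch with invented returns and covariances (not the paper's PPR data), adding a no-negative-allocation constraint appropriate for conservation spending:

```python
import numpy as np
from scipy.optimize import minimize

# Expected conservation return per dollar in four subregions, and the
# covariance of those returns under climate uncertainty (invented numbers;
# the negative covariances are what diversification exploits).
mu = np.array([1.10, 0.95, 1.25, 0.80])
cov = np.array([[0.040, -0.012,  0.008, -0.004],
                [-0.012, 0.030, -0.010,  0.004],
                [ 0.008, -0.010, 0.050, -0.010],
                [-0.004,  0.004, -0.010, 0.020]])

def min_variance_weights(target_return):
    """Budget shares minimizing outcome variance at a target expected
    return, with no negative allocations of conservation effort."""
    n = len(mu)
    cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},
            {"type": "eq", "fun": lambda w: w @ mu - target_return})
    res = minimize(lambda w: w @ cov @ w, np.full(n, 1.0 / n),
                   bounds=[(0.0, 1.0)] * n, constraints=cons, method="SLSQP")
    return res.x

w = min_variance_weights(1.05)
print("allocation by subregion:", np.round(w, 3))
print("expected return %.3f, std. dev. %.3f" % (w @ mu, np.sqrt(w @ cov @ w)))
```

    Sweeping the target return traces out the efficient frontier, which is what allows statements like "15% higher value per dollar at the same level of risk" relative to naive diversification.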

  7. Modeling the uncertainty of estimating forest carbon stocks in China

    NASA Astrophysics Data System (ADS)

    Yue, T. X.; Wang, Y. F.; Du, Z. P.; Zhao, M. W.; Zhang, L. L.; Zhao, N.; Lu, M.; Larocque, G. R.; Wilson, J. P.

    2015-12-01

    Earth surface systems are controlled by a combination of global and local factors and cannot be understood without accounting for both components; the system dynamics cannot be recovered from the global or local controls alone. Ground forest inventory can accurately estimate forest carbon stocks at sample plots, but these sample plots are too sparse to support the spatial simulation of carbon stocks with the required accuracy. Satellite observation is an important source of global information for the simulation of carbon stocks. Satellite remote sensing can supply spatially continuous information about the surface of forest carbon stocks, which is impossible from ground-based investigations, but this description carries considerable uncertainty. In this paper, we validated the Lund-Potsdam-Jena dynamic global vegetation model (LPJ), the kriging method for spatial interpolation of ground sample plots, and a satellite-observation-based approach, as well as an approach for fusing the ground sample plots with satellite observations and an assimilation method for incorporating the ground sample plots into LPJ. The validation results indicated that both the data fusion and data assimilation approaches reduced the uncertainty of the carbon stock estimates. Data fusion, which used an existing high-accuracy surface modelling method to fuse the ground sample plots with the satellite observations (HASM-SOA), had the lowest uncertainty. The estimates produced with HASM-SOA were 26.1 and 28.4% more accurate than those of the satellite-based approach and the spatial interpolation of the sample plots, respectively. Forest carbon stocks of 7.08 Pg were estimated for China for the period from 2004 to 2008, an increase of 2.24 Pg from 1984 to 2008, using the preferred HASM-SOA method.

  8. High Resolution Insights into Snow Distribution Provided by Drone Photogrammetry

    NASA Astrophysics Data System (ADS)

    Redpath, T.; Sirguey, P. J.; Cullen, N. J.; Fitzsimons, S.

    2017-12-01

    Dynamic in time and space, New Zealand's seasonal snow is largely confined to remote alpine areas, complicating ongoing in situ measurement and characterisation. Improved understanding and modeling of the seasonal snowpack requires fine scale resolution of snow distribution and spatial variability. The potential of remotely piloted aircraft system (RPAS) photogrammetry to resolve spatial and temporal variability of snow depth and water equivalent in a New Zealand alpine catchment is assessed in the Pisa Range, Central Otago. This approach yielded orthophotomosaics and digital surface models (DSM) at 0.05 and 0.15 m spatial resolution, respectively. An autumn reference DSM allowed mapping of winter (02/08/2016) and spring (10/09/2016) snow depth at 0.15 m spatial resolution, via DSM differencing. The consistency and accuracy of the RPAS-derived surface was assessed by comparison of snow-free regions of the spring and autumn DSMs, while the accuracy of RPAS-retrieved snow depth was assessed with 86 in situ snow probe measurements. Results show a mean vertical residual of 0.024 m between DSMs acquired in autumn and spring. This residual approximated a Laplace distribution, reflecting the influence of large outliers on the small overall bias. Propagation of errors associated with successive DSMs saw snow depth mapped with an accuracy of ± 0.09 m (95% c.l.). Comparing RPAS and in situ snow depth measurements revealed the influence of geo-location uncertainty and interactions between vegetation and the snowpack on snow depth uncertainty and bias. Semi-variogram analysis revealed that the RPAS outperformed systematic in situ measurements in resolving fine scale spatial variability. Despite limitations accompanying RPAS photogrammetry, this study demonstrates a repeatable means of accurately mapping snow depth for an entire, yet relatively small, hydrological basin (~0.5 km2), at high resolution. Resolving snowpack features associated with re-distribution and preferential accumulation and ablation, snow depth maps provide geostatistically robust insights into seasonal snow processes, with unprecedented detail. Such data may enhance understanding of physical processes controlling spatial and temporal distribution of seasonal snow, and their relative importance at varying spatial and temporal scales.

  9. Spatial assessment of soil organic carbon and physicochemical properties in a horticultural orchard at arid zone of India using geostatistical approaches.

    PubMed

    Singh, Akath; Santra, Priyabrata; Kumar, Mahesh; Panwar, Navraten; Meghwal, P R

    2016-09-01

    Soil organic carbon (SOC) is a major indicator of the long-term sustainability of an agricultural production system. Apart from sustaining productivity, SOC plays a crucial role in the context of climate change. With these potentials in mind, the spatial variation of SOC content in a fruit orchard comprising several arid fruit plantations in the arid region of India is assessed in this study through geostatistical approaches. For this purpose, surface and subsurface soil samples from 175 locations in a fruit orchard spread over a 14.33 ha area were collected along with their geographical coordinates. SOC content and soil physicochemical properties of the collected samples were determined, followed by geostatistical analysis for mapping purposes. The average SOC stock density of the orchard was 14.48 Mg ha⁻¹ for the 0- to 30-cm soil layer, ranging from 9.01 Mg ha⁻¹ in the Carissa carandas block to 19.52 Mg ha⁻¹ in the Prosopis cineraria block. The range of spatial variation of SOC content was about 100 m, and two other soil physicochemical properties, pH and electrical conductivity (EC), showed a similar spatial trend. This indicates that the minimum sampling distance for future SOC mapping programmes should be kept below 100 m for better accuracy. The ordinary kriging technique satisfactorily predicted SOC contents (in percent) at unsampled locations, with a root-mean-squared residual (RMSR) of 0.35-0.37. The co-kriging approach was slightly superior (RMSR = 0.26-0.28) to ordinary kriging for spatial prediction of SOC contents because of significant correlations of SOC with pH and EC. Uncertainty of SOC estimation is also presented in terms of a 90% confidence interval. Spatial estimates of SOC stock through ordinary kriging or co-kriging also had lower estimation uncertainty than non-spatial estimates such as arithmetic averaging. Among the different fruit block plantations of the orchard, the block with Prosopis cineraria ('khejri') had a higher SOC stock density than the others.
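
    For readers unfamiliar with the method, ordinary kriging solves a small linear system built from a semivariogram to weight nearby samples, and also returns a prediction variance, which is where the mapped uncertainty comes from. A self-contained numpy sketch (spherical variogram, parameters loosely echoing the ~100 m range reported above; illustrative only, not the study's code):

    ```python
    import numpy as np

    def spherical(h, nugget, sill, rng_):
        """Spherical semivariogram; rng_ is the range parameter (~100 m here)."""
        h = np.asarray(h, dtype=float)
        g = nugget + (sill - nugget) * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)
        return np.where(h < rng_, g, sill)

    def ordinary_krige(xy, z, xy0, nugget=0.0, sill=0.02, rng_=100.0):
        """Ordinary kriging prediction and kriging variance at target point xy0."""
        n = len(z)
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
        A = np.ones((n + 1, n + 1))
        A[:n, :n] = spherical(d, nugget, sill, rng_)  # gamma(dij); 0 on diagonal
        A[n, n] = 0.0                                 # Lagrange multiplier row/column
        b = np.ones(n + 1)
        b[:n] = spherical(np.linalg.norm(xy - xy0, axis=1), nugget, sill, rng_)
        w = np.linalg.solve(A, b)
        return w[:n] @ z, w @ b                       # prediction, kriging variance

    rng = np.random.default_rng(1)
    xy = rng.uniform(0, 300, (30, 2))                 # sample locations (m)
    z = 0.3 + 0.05 * np.sin(xy[:, 0] / 50) + rng.normal(0, 0.02, 30)  # SOC (%)
    pred, var = ordinary_krige(xy, z, np.array([150.0, 150.0]))
    print(f"SOC estimate {pred:.3f}% +/- {np.sqrt(var):.3f}%")
    ```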

  10. Towards a global harmonized permafrost soil organic carbon stock estimates.

    NASA Astrophysics Data System (ADS)

    Hugelius, G.; Mishra, U.; Yang, Y.

    2017-12-01

    Permafrost-affected soils store a disproportionately large amount of organic carbon due to multiple cryopedogenic processes. Previous permafrost soil organic carbon (SOC) stock estimates used a variety of approaches and reported substantial uncertainty in the SOC stocks of permafrost soils. Here, we used spatially referenced data on soil-forming factors (topographic attributes, land cover types, climate, and bedrock geology) and SOC pedon description data (n = 2552) in a regression kriging approach to predict the spatial and vertical heterogeneity of SOC stocks across the Northern Circumpolar and Tibetan permafrost regions. Our approach allowed us to take into account both environmental correlation and spatial autocorrelation to separately estimate SOC stocks and their spatial uncertainties (95% CI) for three depth intervals at 250 m spatial resolution. In the Northern Circumpolar region, our results show 1278.1 (1009.33-1550.45) Pg C in the 0-3 m depth interval, with 542.09 (451.83-610.15), 422.46 (306.48-550.82), and 313.55 (251.02-389.48) Pg C in the 0-1, 1-2, and 2-3 m depth intervals, respectively. In the Tibetan region, our results show 26.68 (9.82-79.92) Pg C in the 0-3 m depth interval, with 13.98 (6.2-32.96), 6.49 (1.73-25.86), and 6.21 (1.889-20.90) Pg C in the 0-1, 1-2, and 2-3 m depth intervals, respectively. Our estimates show large spatial variability (50-100% coefficient of variation, depending on study region and depth interval) and a wider uncertainty range than existing estimates. We will present the observed controls of different environmental factors on SOC at the AGU meeting.

  11. Weathering the Storm: Developing a Spatial Data Infrastructure and Online Research Platform for Oil Spill Preparedness

    NASA Astrophysics Data System (ADS)

    Bauer, J. R.; Rose, K.; Romeo, L.; Barkhurst, A.; Nelson, J.; Duran-Sesin, R.; Vielma, J.

    2016-12-01

    Efforts to prepare for and reduce the risk of hazards, from both natural and anthropogenic sources, that threaten our oceans and coasts require an understanding of the dynamics and interactions between the physical, ecological, and socio-economic systems. Understanding these coupled dynamics is essential as offshore oil & gas exploration and production continues to push into harsher, more extreme environments where risks and uncertainty increase. However, working with these large, complex data from various sources and scales to assess risks and potential impacts associated with offshore energy exploration and production poses several challenges to research. To address these challenges, an integrated assessment model (IAM) was developed at the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) that combines a spatial data infrastructure and an online research platform to manage, process, analyze, and share these large, multidimensional datasets, research products, and the tools and models used to evaluate risk and reduce uncertainty for the entire offshore system, from the subsurface, through the water column, to coastal ecosystems and communities. Here, we will discuss the spatial data infrastructure and online research platform, NETL's Energy Data eXchange (EDX), that underpin the offshore IAM, providing information on how the framework combines multidimensional spatial data and spatio-temporal tools to evaluate risks across the complex matrix of potential environmental, social, and economic impacts stemming from modeled offshore hazard scenarios, such as oil spills or hurricanes. In addition, we will discuss the online analytics, tools, and visualization methods integrated into this framework that support availability of and access to data, and allow rapid analysis and effective communication of analytical results to aid a range of decision-making needs.

  12. Geotechnical parameter spatial distribution stochastic analysis based on multi-precision information assimilation

    NASA Astrophysics Data System (ADS)

    Wang, C.; Rubin, Y.

    2014-12-01

    The spatial distribution of the compression modulus Es, an important geotechnical parameter, contributes considerably to understanding the underlying geological processes and to adequately assessing its mechanical effects on the differential settlement of large continuous structure foundations. Such analyses should use an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To this end, the Es distribution of a silty clay stratum in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method, which rigorously and efficiently integrates geotechnical investigations of different precision and their sources of uncertainty. Individual CPT soundings were modeled as probability density curves by maximum entropy theory. A spatial prior multivariate probability density function (PDF) and the likelihood PDF at the CPT positions were built from the borehole experiments and the potential value at the prediction point; then, by numerical integration over the CPT probability density curves, the posterior probability density curve at the prediction point was calculated within a Bayesian inverse interpolation framework. The results were compared with Gaussian sequential stochastic simulation, and the differences between treating single CPT soundings as normal distributions versus maximum-entropy probability density curves were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, and that more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculations illustrate the significance of stochastic Es characterization within a stratum and identify the limitations of inadequate geostatistical interpolation techniques. These characterization results provide a multi-precision information assimilation approach for other geotechnical parameters.
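
    Stripped of the maximum-entropy and spatial machinery, the Bayesian step in such an assimilation is a pointwise product of a prior density (here, from boreholes) and a likelihood (from a CPT sounding), normalized on a grid of candidate Es values. A generic sketch with hypothetical Gaussian stand-ins for both curves:

    ```python
    import numpy as np

    # grid of candidate compression-modulus values Es (MPa)
    es = np.linspace(2.0, 12.0, 1001)

    # prior at the prediction point, e.g. interpolated from borehole tests
    prior = np.exp(-0.5 * ((es - 6.0) / 1.5) ** 2)

    # likelihood from a nearby CPT sounding (any density curve works here,
    # e.g. one fitted by maximum entropy; a Gaussian stands in for it)
    likelihood = np.exp(-0.5 * ((es - 7.2) / 0.8) ** 2)

    posterior = prior * likelihood
    posterior /= np.trapz(posterior, es)       # normalize to a proper pdf

    mean = np.trapz(es * posterior, es)
    sd = np.sqrt(np.trapz((es - mean) ** 2 * posterior, es))
    print(f"posterior Es: {mean:.2f} +/- {sd:.2f} MPa")
    ```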

  13. Climate change, ecosystem impacts, and management for Pacific salmon

    Treesearch

    D.E. Schindler; X. Augerot; E. Fleishman; N.J. Mantua; B. Riddell; M. Ruckelshaus; J. Seeb; M. Webster

    2008-01-01

    As climate change intensifies, there is increasing interest in developing models that reduce uncertainties in projections of global climate and refine these projections to finer spatial scales. Forecasts of climate impacts on ecosystems are far more challenging and their uncertainties even larger because of a limited understanding of physical controls on biological...

  14. Uncertainty Exposed: A Field Lab Exercise Where GIS Meets the Real World

    ERIC Educational Resources Information Center

    Prisley, Stephen P.; Luebbering, Candice

    2011-01-01

    Students in natural resources programs commonly take courses in geospatial technologies. An awareness of the uncertainty of spatial data and algorithms can be an important outcome of such courses. This article describes a laboratory exercise in a graduate geographic information system (GIS) class that involves collection of data for the assessment…

  15. Assessment of spatial variation of risks in small populations.

    PubMed Central

    Riggan, W B; Manton, K G; Creason, J P; Woodbury, M A; Stallard, E

    1991-01-01

    Often environmental hazards are assessed by examining the spatial variation of disease-specific mortality or morbidity rates. These rates, when estimated for small local populations, can have a high degree of random variation or uncertainty associated with them. If those rate estimates are used to prioritize environmental clean-up actions or to allocate resources, then those decisions may be influenced by this high degree of uncertainty. Unfortunately, the effect of this uncertainty is not to add "random noise" into the decision-making process, but to systematically bias action toward the smallest populations where uncertainty is greatest and where extreme high and low rate deviations are most likely to be manifest by chance. We present a statistical procedure for adjusting rate estimates for differences in variability due to differentials in local area population sizes. Such adjustments produce rate estimates for areas that have better properties than the unadjusted rates for use in making statistically based decisions about the entire set of areas. Examples are provided for county variation in bladder, stomach, and lung cancer mortality rates for U.S. white males for the period 1970 to 1979. PMID:1820268
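
    One standard way to implement such an adjustment is empirical Bayes shrinkage under a Poisson-gamma model, in which rates from small populations are pulled toward the overall rate in proportion to their sampling variability. The sketch below illustrates that general idea with a crude moment-based prior fit; it is not the authors' exact procedure:

    ```python
    import numpy as np

    def shrink_rates(deaths, person_years):
        """Empirical Bayes (Poisson-gamma) shrinkage of small-area rates."""
        raw = deaths / person_years
        m = np.average(raw, weights=person_years)            # overall rate
        # rough moment estimate of between-area rate variance
        between = max(np.average((raw - m) ** 2, weights=person_years)
                      - m / person_years.mean(), 1e-12)
        alpha, beta = m ** 2 / between, m / between          # gamma prior
        # posterior mean: a weighted blend of raw rate and overall rate
        return (deaths + alpha) / (person_years + beta)

    deaths = np.array([0.0, 3.0, 12.0, 250.0])
    pop = np.array([800.0, 2500.0, 40000.0, 900000.0])       # person-years
    print(shrink_rates(deaths, pop))  # the tiniest areas move most toward the mean
    ```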

  16. Global Surface Temperature Change and Uncertainties Since 1861

    NASA Technical Reports Server (NTRS)

    Shen, Samuel S. P.; Lau, William K. M. (Technical Monitor)

    2002-01-01

    The objective of this talk is to analyze the warming trend, and its uncertainties, of the global and hemispheric surface temperatures. Using a statistical optimal averaging scheme, land surface air temperature and sea surface temperature observations are used to compute the spatially averaged annual mean surface air temperature. The optimal averaging method is derived from minimizing the mean square error between the true and estimated averages and uses empirical orthogonal functions. The method can accurately estimate the errors of the spatial average due to observational gaps and random measurement errors. In addition, three independent uncertainty factors are quantified: urbanization, changes in in-situ observational practices, and sea surface temperature data corrections. Based on these uncertainties, the best linear fit to annual global surface temperature gives an increase of 0.61 ± 0.16 °C between 1861 and 2000. The lecture will also touch on the impacts of global change on nature and the environment, as well as the latest assessment methods for the attribution of global change.
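
    The quoted trend and uncertainty are of the kind produced by an ordinary least-squares fit of annual mean anomalies against time, with the slope's standard error scaled to the full period. A sketch on synthetic data (the trend and noise levels are invented to roughly mimic the reported 0.61 ± 0.16 °C; no optimal averaging is attempted here):

    ```python
    import numpy as np

    years = np.arange(1861, 2001)
    rng = np.random.default_rng(42)
    # synthetic annual global-mean anomalies (deg C)
    anom = 0.0044 * (years - 1861) - 0.3 + rng.normal(0, 0.12, years.size)

    # OLS fit with the covariance of the estimated coefficients
    (slope, intercept), cov = np.polyfit(years, anom, 1, cov=True)
    se = np.sqrt(cov[0, 0])                   # standard error of the slope
    span = years[-1] - years[0]
    print(f"warming {slope * span:.2f} +/- {2 * se * span:.2f} C over 1861-2000")
    ```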

  17. Rumor diffusion model with spatio-temporal diffusion and uncertainty of behavior decision in complex social networks

    NASA Astrophysics Data System (ADS)

    Zhu, Liang; Wang, Youguo

    2018-07-01

    In this paper, a rumor diffusion model with uncertainty of human behavior is established under a spatio-temporal diffusion framework. Taking the physical significance of spatial diffusion into account, a diffusion threshold is set below which the rumor is not a trending topic and only spreads along determined physical connections. Heterogeneity of the degree distribution and the distance distribution is also considered in the theoretical model. The global existence and uniqueness of the classical solution are proved with a Lyapunov function, and an approximate classical solution in the form of an infinite series is constructed from a system of eigenfunctions. Simulations and numerical solutions on both Watts-Strogatz (WS) and Barabási-Albert (BA) networks display the variation of the density of infected connections along the spatial and temporal dimensions. The results show that the density of infected connections is dominated by network topology and by the uncertainty of human behavior at the threshold time. As social capability increases, the rumor diffuses to the steady state at a higher speed, and the trends of diffusion size with uncertainty differ across the artificial networks.

  18. PROBING X-RAY ABSORPTION AND OPTICAL EXTINCTION IN THE INTERSTELLAR MEDIUM USING CHANDRA OBSERVATIONS OF SUPERNOVA REMNANTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foight, Dillon R.; Slane, Patrick O.; Güver, Tolga

    We present a comprehensive study of interstellar X-ray extinction using the extensive Chandra supernova remnant (SNR) archive and use our results to refine the empirical relation between the hydrogen column density and optical extinction. In our analysis, we make use of the large, uniform data sample to assess various systematic uncertainties in the measurement of the interstellar X-ray absorption. Specifically, we address systematic uncertainties that originate from (i) the emission models used to fit SNR spectra; (ii) the spatial variations within individual remnants; (iii) the physical conditions of the remnant such as composition, temperature, and non-equilibrium regions; and (iv) the model used for the absorption of X-rays in the interstellar medium. Using a Bayesian framework to quantify these systematic uncertainties, and combining the resulting hydrogen column density measurements with measurements of optical extinction toward the same remnants, we find the empirical relation N_H = (2.87 ± 0.12) × 10²¹ A_V cm⁻², which is significantly higher than previous measurements.

  19. When size matters: attention affects performance by contrast or response gain.

    PubMed

    Herrmann, Katrin; Montaser-Kouhsari, Leila; Carrasco, Marisa; Heeger, David J

    2010-12-01

    Covert attention, the selective processing of visual information in the absence of eye movements, improves behavioral performance. We found that attention, both exogenous (involuntary) and endogenous (voluntary), can affect performance by contrast or response gain changes, depending on the stimulus size and the relative size of the attention field. These two variables were manipulated in a cueing task while stimulus contrast was varied. We observed a change in behavioral performance consonant with a change in contrast gain for small stimuli paired with spatial uncertainty and a change in response gain for large stimuli presented at one location (no uncertainty) and surrounded by irrelevant flanking distracters. A complementary neuroimaging experiment revealed that observers' attention fields were wider with than without spatial uncertainty. Our results support important predictions of the normalization model of attention and reconcile previous, seemingly contradictory findings on the effects of visual attention.

  20. Effect of Temporal and Spatial Rainfall Resolution on HSPF Predictive Performance and Parameter Estimation

    EPA Science Inventory

    Watershed-scale rainfall-runoff models are used for environmental management and regulatory modeling applications, but their effectiveness is limited by predictive uncertainties associated with model input data. This study evaluated the effect of temporal and spatial rainfall re...

  1. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key to improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation, called the Land surface Verification Toolkit (LVT), are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed, and other model and reanalysis datasets in their native formats. In addition to traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics, and scale decomposition techniques that provide novel ways of performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization, and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model-data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  2. Spatial entanglement patterns and Einstein-Podolsky-Rosen steering in Bose-Einstein condensates

    NASA Astrophysics Data System (ADS)

    Fadel, Matteo; Zibold, Tilman; Décamps, Boris; Treutlein, Philipp

    2018-04-01

    Many-particle entanglement is a fundamental concept of quantum physics that still presents conceptual challenges. Although nonclassical states of atomic ensembles were used to enhance measurement precision in quantum metrology, the notion of entanglement in these systems was debated because the correlations among the indistinguishable atoms were witnessed by collective measurements only. Here, we use high-resolution imaging to directly measure the spin correlations between spatially separated parts of a spin-squeezed Bose-Einstein condensate. We observe entanglement that is strong enough for Einstein-Podolsky-Rosen steering: We can predict measurement outcomes for noncommuting observables in one spatial region on the basis of corresponding measurements in another region with an inferred uncertainty product below the Heisenberg uncertainty bound. This method could be exploited for entanglement-enhanced imaging of electromagnetic field distributions and quantum information tasks.
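
    The steering condition referenced here can be stated as an inferred uncertainty product beating the Heisenberg bound: outcomes in region A are predicted from measurements in region B, and steering is demonstrated when the product of the inferred standard deviations falls below the bound. In textbook spin notation (a generic form, not transcribed from the paper):

    ```latex
    % Heisenberg bound for spin components:
    %   \Delta S_y \, \Delta S_z \ge \tfrac{1}{2} |\langle S_x \rangle|
    % EPR steering of region A by region B is demonstrated when the
    % *inferred* uncertainties (conditioned on outcomes in B) violate it:
    \Delta_{\mathrm{inf}} S_y^{A} \; \Delta_{\mathrm{inf}} S_z^{A}
      \;<\; \tfrac{1}{2} \bigl| \langle S_x^{A} \rangle \bigr|
    ```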

  3. The Value of Learning about Natural History in Biodiversity Markets

    PubMed Central

    Bruggeman, Douglas J.

    2015-01-01

    Markets for biodiversity have generated much controversy because of the often unstated and untested assumptions included in transactions rules. Simple trading rules are favored to reduce transaction costs, but others have argued that this leads to markets that favor development and erode biodiversity. Here, I describe how embracing complexity and uncertainty within a tradable credit system for the Red-cockaded Woodpecker (Picoides borealis) creates opportunities to achieve financial and conservation goals simultaneously. Reversing the effects of habitat fragmentation is one of the main reasons for developing markets. I include uncertainty in habitat fragmentation effects by evaluating market transactions using five alternative dispersal models that were able to approximate observed patterns of occupancy and movement. Further, because dispersal habitat is often not included in market transactions, I contrast how changes in breeding versus dispersal habitat affect credit values. I use an individually-based, spatially-explicit population model for the Red-cockaded Woodpecker (Picoides borealis) to predict spatial and temporal influences of landscape change on species occurrence and genetic diversity. Results indicated that the probability of no net loss of abundance and genetic diversity responded differently to the transient dynamics in breeding and dispersal habitat. Trades that do not violate the abundance cap may simultaneously violate the cap for the erosion of genetic diversity. To highlight how economic incentives may help reduce uncertainty, I demonstrate tradeoffs between the value of tradable credits and the value of information needed to predict the influence of habitat trades on population viability. For the trade with the greatest uncertainty regarding the change in habitat fragmentation, I estimate that the value of using 13 years of data to reduce uncertainty in dispersal behaviors is $6.2 million. Future guidance for biodiversity markets should at least encourage the use of spatially and temporally explicit techniques that include population genetic estimates and the influence of uncertainty. PMID:26675488

  5. Uncertainties in mapping forest carbon in urban ecosystems.

    PubMed

    Chen, Gang; Ozelkan, Emre; Singh, Kunwar K; Zhou, Jun; Brown, Marilyn R; Meentemeyer, Ross K

    2017-02-01

    Spatially explicit urban forest carbon estimation provides a baseline map for understanding the variation in forest vertical structure, informing sustainable forest management and urban planning. While high-resolution remote sensing has proven promising for carbon mapping in highly fragmented urban landscapes, data cost and availability are the major obstacle prohibiting accurate, consistent, and repeated measurement of forest carbon pools in cities. This study aims to evaluate the uncertainties of forest carbon estimation in response to the combined impacts of remote sensing data resolution and neighborhood spatial patterns in Charlotte, North Carolina. The remote sensing data for carbon mapping were resampled to a range of resolutions, i.e., LiDAR point-cloud densities of 5.8, 4.6, 2.3, and 1.2 pts/m², and aerial optical NAIP (National Agricultural Imagery Program) imagery at 1, 5, 10, and 20 m. Urban spatial patterns were extracted to represent area, shape complexity, dispersion/interspersion, diversity, and connectivity of landscape patches across residential neighborhoods with built-up densities ranging from low and medium-low to medium-high and high. Through statistical analyses, we found that changing remote sensing data resolution introduced noticeable uncertainties (variation) in forest carbon estimation at the neighborhood level. Higher uncertainties were caused by changes in LiDAR point density (causing 8.7-11.0% of variation) than by changes in NAIP image resolution (causing 6.2-8.6% of variation). For both LiDAR and NAIP, urban neighborhoods with a higher degree of anthropogenic disturbance unveiled a higher level of uncertainty in carbon mapping. However, LiDAR-based results were more likely to be affected by landscape patch connectivity, and the NAIP-based estimation was found to be significantly influenced by the complexity of patch shape. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. A Development of Nonstationary Regional Frequency Analysis Model with Large-scale Climate Information: Its Application to Korean Watershed

    NASA Astrophysics Data System (ADS)

    Kim, Jin-Young; Kwon, Hyun-Han; Kim, Hung-Soo

    2015-04-01

    Existing regional frequency analysis has the disadvantage that it is difficult to consider geographical characteristics when estimating areal rainfall. In this regard, this study aims to develop a hierarchical-Bayesian-model-based nonstationary regional frequency analysis in which spatial patterns of the design rainfall are explicitly incorporated with geographical information (e.g., latitude, longitude and altitude). This study assumes that the parameters of the Gumbel (or GEV) distribution are a function of geographical characteristics within a general linear regression framework. Posterior distributions of the regression parameters are estimated by a Bayesian Markov chain Monte Carlo (MCMC) method, and the identified functional relationship is used to spatially interpolate the parameters of the distributions, using digital elevation models (DEM) as inputs. The proposed model is applied to derive design rainfalls over the entire Han River watershed. The proposed Bayesian regional frequency analysis model showed results similar to an L-moment-based regional frequency analysis, with the added advantages of quantifying the uncertainty of the design rainfall and estimating areal rainfall while accounting for geographical information. Finally, a comprehensive discussion of design rainfall in the nonstationary context will be presented. KEYWORDS: Regional frequency analysis, Nonstationary, Spatial information, Bayesian. Acknowledgement: This research was supported by a grant (14AWMP-B082564-01) from the Advanced Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
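
    The MCMC step can be sketched compactly: write the Gumbel log-likelihood with a location parameter that is a linear function of a geographical covariate, and sample the regression coefficients with a random-walk Metropolis update. The sketch below uses a single covariate, flat priors, and a fixed proposal scale, all simplifications of the hierarchical model described above:

    ```python
    import numpy as np

    def gumbel_loglik(x, mu, beta):
        """Log-likelihood of maxima x under Gumbel(mu, beta); mu may vary per site."""
        z = (x - mu) / beta
        return np.sum(-np.log(beta) - z - np.exp(-z))

    def metropolis(x, cov, n_iter=20000, step=(1.0, 1.0, 0.05), seed=0):
        """Random-walk Metropolis for (a, b, log beta), with mu = a + b * cov."""
        rng = np.random.default_rng(seed)
        theta = np.array([x.mean(), 0.0, np.log(x.std())])

        def logpost(t):
            a, b, logb = t
            return gumbel_loglik(x, a + b * cov, np.exp(logb))  # flat priors

        lp, chain = logpost(theta), []
        for _ in range(n_iter):
            prop = theta + rng.normal(0.0, step, 3)
            lp_prop = logpost(prop)
            if np.log(rng.uniform()) < lp_prop - lp:            # accept/reject
                theta, lp = prop, lp_prop
            chain.append(theta.copy())
        return np.array(chain)

    alt = np.linspace(0.0, 1.0, 60)                         # normalized altitude
    x = np.random.default_rng(1).gumbel(60 + 20 * alt, 15)  # synthetic annual maxima
    chain = metropolis(x, alt)
    print(chain[10000:].mean(axis=0))         # posterior means of a, b, log beta
    ```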

  7. Visualizing uncertainties with the North Wyke Farm Platform Data Sets

    NASA Astrophysics Data System (ADS)

    Harris, Paul; Brunsdon, Chris; Lee, Michael

    2016-04-01

    The North Wyke Farm Platform (NWFP) is a systems-based, farm-scale experiment with the aim of addressing agricultural productivity and ecosystem responses to different management practices. The 63 ha site captures the spatial and/or temporal data necessary to develop a better understanding of the dynamic processes and underlying mechanisms that can be used to model how agricultural grassland systems respond to different management inputs. Via beef cattle and sheep production, the underlying principle is to manage each of three farmlets (each consisting of five hydrologically-isolated sub-catchments) in three contrasting ways: (i) improvement of permanent pasture through use of mineral fertilizers; (ii) improvement through use of legumes; and (iii) improvement through innovation. The connectivity between the timing and intensity of the different management operations, together with the transport of nutrients and potential pollutants from the NWFP, is evaluated using numerous inter-linked data collection exercises. In this paper, we introduce some of the visualization opportunities that are possible with this rich data resource, and methods of analysis that might be applied to it, in particular with respect to data and model uncertainty operating across both temporal and spatial dimensions. An important component of the NWFP experiment is the representation of trade-offs with respect to: (a) economic profits, (b) environmental concerns, and (c) societal benefits, under the umbrella of sustainable intensification. Various visualizations exist to display such trade-offs, and here we demonstrate ways to tailor them to relay key uncertainties and assessments of risk, and also consider how these visualizations can be honed to suit different audiences.

  8. Efficient spatial privacy preserving scheme for sensor network

    NASA Astrophysics Data System (ADS)

    Debnath, Ashmita; Singaravelu, Pradheepkumar; Verma, Shekhar

    2013-03-01

    The privacy of sensitive events observed by a wireless sensor network (WSN) needs to be protected. Adversaries with knowledge of the sensor deployment and network protocols can infer the location of a sensed event by monitoring the communication from the sensors, even when the messages are encrypted. Encryption provides confidentiality; however, the context of the event can be used to breach the privacy of sensed objects. An adversary can track the trajectory of a moving object, or determine the location of a critical event, to breach its privacy. In this paper, we propose a ring signature scheme to obfuscate the spatial information. First, the extended region of location of an event of interest, as estimated from a sensor communication, is presented. Then, the increase in this region of spatial uncertainty due to the effect of the ring signature is determined. We observe that the ring signature can effectively enlarge the region of location uncertainty of a sensed event. As the event of interest can be situated anywhere in the enlarged region of uncertainty, its privacy against a local or global adversary is ensured. Both analytical and simulation results show that the induced delay and throughput overheads are insignificant, with negligible impact on the performance of a WSN.

  9. The effects of environmental variability and spatial sampling on the three-dimensional inversion problem.

    PubMed

    Bender, Christopher M; Ballard, Megan S; Wilson, Preston S

    2014-06-01

    The overall goal of this work is to quantify the effects of environmental variability and spatial sampling on the accuracy and uncertainty of estimates of the three-dimensional ocean sound-speed field. In this work, ocean sound speed estimates are obtained with acoustic data measured by a sparse autonomous observing system using a perturbative inversion scheme [Rajan, Lynch, and Frisk, J. Acoust. Soc. Am. 82, 998-1017 (1987)]. The vertical and horizontal resolution of the solution depends on the bandwidth of the acoustic data and on the quantity of sources and receivers, respectively. Thus, for a simple, range-independent ocean sound speed profile, a single source-receiver pair is sufficient to estimate the water-column sound-speed field. On the other hand, an environment with significant variability may not be fully characterized by a large number of sources and receivers, resulting in uncertainty in the solution. This work explores the interrelated effects of environmental variability and spatial sampling on the accuracy and uncertainty of the inversion solution through a set of case studies. Synthetic data representative of the ocean variability on the New Jersey shelf are used.

  10. Spatiotemporal integration for tactile localization during arm movements: a probabilistic approach.

    PubMed

    Maij, Femke; Wing, Alan M; Medendorp, W Pieter

    2013-12-01

    It has been shown that people make systematic errors in the localization of a brief tactile stimulus that is delivered to the index finger while they are making an arm movement. Here we modeled these spatial errors with a probabilistic approach, assuming that they follow from temporal uncertainty about the occurrence of the stimulus. In the model, this temporal uncertainty converts into a spatial likelihood about the external stimulus location, depending on arm velocity. We tested the model's prediction that the localization errors depend on arm velocity. Participants (n = 8) were instructed to localize a tactile stimulus presented to their index finger while making either slow or fast targeted arm movements. Our results confirm the model's prediction that participants make larger localization errors when making faster arm movements. The model, which was used to fit the errors for both slow and fast arm movements simultaneously, accounted very well for all the characteristics of these data, with temporal uncertainty in stimulus processing as the only free parameter. We conclude that spatial errors in dynamic tactile perception stem from the temporal precision with which tactile inputs are processed.
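
    The model's core assumption, that temporal uncertainty converts into spatial error through arm velocity, fits in a few lines: a stimulus registered with timing jitter δt is localized at the arm's position at that shifted time, so the spread of errors scales with speed. A sketch with hypothetical parameter values:

    ```python
    import numpy as np

    def localization_errors(velocity, sigma_t=0.05, n=10000, seed=0):
        """Spatial localization errors (m) induced by temporal uncertainty.

        velocity : arm speed at stimulus time (m/s)
        sigma_t  : sd of temporal uncertainty about stimulus onset (s)
        """
        rng = np.random.default_rng(seed)
        dt = rng.normal(0.0, sigma_t, n)   # perceived timing jitter (s)
        return velocity * dt               # position error = speed * time error

    for v in (0.3, 1.0):                   # slow vs fast movement
        err = localization_errors(v)
        print(f"v = {v} m/s -> error sd {err.std():.3f} m")
    ```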

  11. Understanding discharge data uncertainty and its consequences for analyses of spatial and temporal change in hydrological response

    NASA Astrophysics Data System (ADS)

    Westerberg, Ida

    2017-04-01

    Understanding and quantifying how hydrological response behaviour varies across catchments, or how catchments change with time requires reliable discharge data. For reliable estimation of spatial and temporal change, the change in the response behaviour needs to be larger than the uncertainty in the response behaviour estimates that are compared. Understanding how discharge data uncertainty varies between catchments and over time, and how these uncertainties propagate to information derived from the data, is therefore key to drawing the right conclusions in comparative analyses. Uncertainty in discharge data is often highly place-specific and reliable estimation depends on detailed analyses of the rating curve model and stage-discharge measurements used to calculate discharge time series from stage (water level) at the gauging station. This underlying information is often not available when discharge data is provided by monitoring agencies. However, even without detailed analyses, the chance that the discharge data would be uncertain at particular flow ranges can be assessed based on information about the gauging station, the flow regime, and the catchment. This type of information is often available for most catchments even if the rating curve data are not. Such 'soft information' on discharge uncertainty may aid interpretation of results from regional and temporal change analyses. In particular, it can help reduce the risk of wrongly interpreting differences in response behaviour caused by discharge uncertainty as real changes. In this presentation I draw on several previous studies to discuss some of the factors that affect discharge data uncertainty and give examples from catchments worldwide. I aim to 1) illustrate the consequences of discharge data uncertainty on comparisons of different types of hydrological response behaviour across catchments and when analysing temporal change, and 2) give practical advice as to what factors may help identify catchments with potentially large discharge uncertainty.
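
    One concrete source of the place-specific uncertainty discussed here is the rating curve, commonly of the form Q = a(h − h0)^b, used to convert stage to discharge; Monte Carlo sampling of its fitted parameters shows how strongly the discharge uncertainty depends on the flow range. A sketch with invented parameter uncertainties:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # hypothetical rating-curve parameters with fitting uncertainty
    a = rng.normal(12.0, 1.0, 5000)     # scale coefficient
    b = rng.normal(1.8, 0.08, 5000)     # exponent
    h0 = rng.normal(0.20, 0.03, 5000)   # cease-to-flow stage (m)

    for h in (0.5, 1.0, 3.0):           # low, medium, high stage (m)
        q = a * np.clip(h - h0, 0.0, None) ** b
        lo, hi = np.percentile(q, [2.5, 97.5])
        print(f"stage {h:.1f} m: Q = {np.median(q):6.1f} m3/s "
              f"(95% CI {lo:.1f}-{hi:.1f})")
    ```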

  12. How Well Will MODIS Measure Top of Atmosphere Aerosol Direct Radiative Forcing?

    NASA Technical Reports Server (NTRS)

    Remer, Lorraine A.; Kaufman, Yoram J.; Levin, Zev; Ghan, Stephen; Einaudi, Franco (Technical Monitor)

    2000-01-01

    The new generation of satellite sensors such as the Moderate Resolution Imaging Spectroradiometer (MODIS) will be able to detect and characterize global aerosols with unprecedented accuracy. The question remains whether this accuracy will be sufficient to narrow the uncertainties in our estimates of aerosol radiative forcing at the top of the atmosphere. Satellite remote sensing detects aerosol optical thickness with the least relative error when aerosol loading is high; satellites are less effective when aerosol loading is low. We use the monthly mean results of two global aerosol transport models to simulate the spatial distribution of smoke aerosol in the Southern Hemisphere during the tropical biomass burning season. This spatial distribution allows us to determine that 87-94% of the smoke aerosol forcing at the top of the atmosphere occurs in grid squares with a sufficient signal-to-noise ratio to be detectable from space. The uncertainty in quantifying the smoke aerosol forcing in the Southern Hemisphere depends on errors in estimating the background aerosol, errors resulting from uncertainties in surface properties, and errors resulting from uncertainties in assumed aerosol properties. These three errors combine to give overall uncertainties of 1.5 to 2.2 W m⁻² (21-56%) in determining the Southern Hemisphere smoke aerosol forcing at the top of the atmosphere. The range of values depends on which estimate of MODIS retrieval uncertainty is used, either the theoretical calculation (upper bound) or the empirical estimate (lower bound). Strategies that use the satellite data to derive flux directly, or use the data in conjunction with ground-based remote sensing and aerosol transport models, can reduce these uncertainties.

  13. Uncertainty Quantification for Ice Sheet Science and Sea Level Projections

    NASA Astrophysics Data System (ADS)

    Boening, C.; Schlegel, N.; Limonadi, D.; Schodlok, M.; Seroussi, H. L.; Larour, E. Y.; Watkins, M. M.

    2017-12-01

    In order to better quantify uncertainties in global mean sea level rise projections and in particular upper bounds, we aim at systematically evaluating the contributions from ice sheets and potential for extreme sea level rise due to sudden ice mass loss. Here, we take advantage of established uncertainty quantification tools embedded within the Ice Sheet System Model (ISSM) as well as sensitivities to ice/ocean interactions using melt rates and melt potential derived from MITgcm/ECCO2. With the use of these tools, we conduct Monte-Carlo style sampling experiments on forward simulations of the Antarctic ice sheet, by varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges. Uncertainty bounds for climate forcing are informed by CMIP5 ensemble precipitation and ice melt estimates for year 2100, and uncertainty bounds for ocean melt rates are derived from a suite of regional sensitivity experiments using MITgcm. Resulting statistics allow us to assess how regional uncertainty in various parameters affect model estimates of century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.

  14. Choice of baseline climate data impacts projected species' responses to climate change.

    PubMed

    Baker, David J; Hartley, Andrew J; Butchart, Stuart H M; Willis, Stephen G

    2016-07-01

    Climate data created from historic climate observations are integral to most assessments of potential climate change impacts, and frequently comprise the baseline period used to infer species-climate relationships. They are often also central to downscaling coarse resolution climate simulations from General Circulation Models (GCMs) to project future climate scenarios at ecologically relevant spatial scales. Uncertainty in these baseline data can be large, particularly where weather observations are sparse and climate dynamics are complex (e.g. over mountainous or coastal regions). Yet, importantly, this uncertainty is almost universally overlooked when assessing potential responses of species to climate change. Here, we assessed the importance of historic baseline climate uncertainty for projections of species' responses to future climate change. We built species distribution models (SDMs) for 895 African bird species of conservation concern, using six different climate baselines. We projected these models to two future periods (2040-2069, 2070-2099), using downscaled climate projections, and calculated species turnover and changes in species-specific climate suitability. We found that the choice of baseline climate data constituted an important source of uncertainty in projections of both species turnover and species-specific climate suitability, often comparable with, or more important than, uncertainty arising from the choice of GCM. Importantly, the relative contribution of these factors to projection uncertainty varied spatially. Moreover, when projecting SDMs to sites of biodiversity importance (Important Bird and Biodiversity Areas), these uncertainties altered site-level impacts, which could affect conservation prioritization. Our results highlight that projections of species' responses to climate change are sensitive to uncertainty in the baseline climatology. We recommend that this should be considered routinely in such analyses. © 2016 John Wiley & Sons Ltd.

  15. An improved method to represent DEM uncertainty in glacial lake outburst flood propagation using stochastic simulations

    NASA Astrophysics Data System (ADS)

    Watson, Cameron S.; Carrivick, Jonathan; Quincey, Duncan

    2015-10-01

    Modelling glacial lake outburst floods (GLOFs), or 'jökulhlaups', necessarily involves the propagation of large and often stochastic uncertainties throughout the source-to-impact process chain. Since flood routing is primarily a function of underlying topography, communication of digital elevation model (DEM) uncertainty should accompany such modelling efforts. Here, a new stochastic first-pass assessment technique was evaluated against an existing GIS-based model and an existing 1D hydrodynamic model, using three DEMs with different spatial resolution. The analysis revealed the effect of DEM uncertainty and model choice on several flood parameters and on the prediction of socio-economic impacts. Our new model, which we call MC-LCP (Monte Carlo Least Cost Path) and which is distributed in the supplementary information, demonstrated enhanced 'stability' when compared to the two existing methods, and this 'stability' was independent of DEM choice. The MC-LCP model outputs an uncertainty continuum within its extent, from which relative socio-economic risk can be evaluated. In a comparison of all DEM and model combinations, results with the Shuttle Radar Topography Mission (SRTM) DEM exhibited fewer artefacts than those with the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM), and were comparable to those with a finer-resolution Advanced Land Observing Satellite Panchromatic Remote-sensing Instrument for Stereo Mapping (ALOS PRISM) derived DEM. Overall, we contend that the variability we find between flood routing model results suggests that consideration of DEM uncertainty and pre-processing methods is important when assessing flow routing and when evaluating the potential socio-economic implications of a GLOF event. Incorporation of a stochastic variable provides an illustration of uncertainty that is important when modelling and communicating assessments of an inherently complex process.
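
    The MC-LCP code itself is distributed with the paper's supplementary information; the sketch below only illustrates the general idea, perturbing a DEM with vertical noise and accumulating how often each cell falls on the least-cost route, using a generic least-cost-path routine from scikit-image rather than the authors' implementation:

    ```python
    import numpy as np
    from skimage.graph import route_through_array

    def mc_lcp(dem, start, end, sigma=5.0, n=200, seed=0):
        """Monte Carlo least-cost path: perturb the DEM with vertical noise
        and accumulate how often each cell lies on the least-cost route."""
        rng = np.random.default_rng(seed)
        hits = np.zeros_like(dem, dtype=float)
        for _ in range(n):
            noisy = dem + rng.normal(0.0, sigma, dem.shape)
            cost = noisy - noisy.min() + 1.0      # lower terrain = cheaper
            path, _ = route_through_array(cost, start, end, geometric=True)
            rows, cols = zip(*path)
            hits[rows, cols] += 1.0
        return hits / n                           # per-cell traversal frequency

    # toy sloping terrain (80 x 120 cells)
    dem = np.add.outer(np.linspace(50, 0, 80), np.linspace(0, 30, 120))
    freq = mc_lcp(dem, start=(0, 0), end=(79, 119))
    print(f"{(freq > 0).sum()} cells touched; max frequency {freq.max():.2f}")
    ```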

  16. Quantifying the uncertainty of nonpoint source attribution in distributed water quality models: A Bayesian assessment of SWAT's sediment export predictions

    NASA Astrophysics Data System (ADS)

    Wellen, Christopher; Arhonditsis, George B.; Long, Tanya; Boyd, Duncan

    2014-11-01

    Spatially distributed nonpoint source watershed models are essential tools to estimate the magnitude and sources of diffuse pollution. However, little work has been undertaken to understand the sources and ramifications of the uncertainty involved in their use. In this study we conduct the first Bayesian uncertainty analysis of the water quality components of the SWAT model, one of the most commonly used distributed nonpoint source models. Working in Southern Ontario, we apply three Bayesian configurations for calibrating SWAT to Redhill Creek, an urban catchment, and Grindstone Creek, an agricultural one. We answer four interrelated questions: can SWAT determine suspended sediment sources with confidence when end of basin data is used for calibration? How does uncertainty propagate from the discharge submodel to the suspended sediment submodels? Do the estimated sediment sources vary when different calibration approaches are used? Can we combine the knowledge gained from different calibration approaches? We show that: (i) despite reasonable fit at the basin outlet, the simulated sediment sources are subject to uncertainty sufficient to undermine the typical approach of reliance on a single, best fit simulation; (ii) more than a third of the uncertainty of sediment load predictions may stem from the discharge submodel; (iii) estimated sediment sources do vary significantly across the three statistical configurations of model calibration despite end-of-basin predictions being virtually identical; and (iv) Bayesian model averaging is an approach that can synthesize predictions when a number of adequate distributed models make divergent source apportionments. We conclude with recommendations for future research to reduce the uncertainty encountered when using distributed nonpoint source models for source apportionment.
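
    The Bayesian model averaging step mentioned in point (iv) reduces to weighting each calibrated model's predictions by its (approximate) posterior model probability and tracking the between-model spread. A generic sketch, with weights derived from hypothetical log-evidence values rather than a SWAT calibration:

    ```python
    import numpy as np

    def bma_combine(preds, log_evidence):
        """Bayesian model averaging of K model predictions.

        preds        : (K, T) array of per-model predictions
        log_evidence : (K,) approximate log marginal likelihoods (or -BIC/2)
        """
        w = np.exp(log_evidence - log_evidence.max())
        w /= w.sum()                          # posterior model weights
        mean = w @ preds                      # averaged prediction
        between = w @ (preds - mean) ** 2     # between-model variance term
        return mean, between, w

    # three models' sediment-load predictions at three times (invented)
    preds = np.array([[3.1, 2.8, 4.0], [2.5, 2.9, 3.1], [3.8, 2.7, 4.4]])
    mean, between, w = bma_combine(preds, np.array([-10.2, -11.0, -13.5]))
    print(w.round(3), mean.round(2))
    ```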

  17. Modeling spatial-temporal dynamics of global wetlands: Comprehensive evaluation of a new sub-grid TOPMODEL parameterization and uncertainties

    NASA Astrophysics Data System (ADS)

    Zhang, Z.; Zimmermann, N. E.; Poulter, B.

    2015-12-01

    Simulations of the spatial-temporal dynamics of wetlands are key to understanding the role of wetland biogeochemistry under past and future climate variability. Hydrologic inundation models, such as TOPMODEL, are based on a fundamental parameter known as the compound topographic index (CTI) and provide a computationally cost-efficient approach to simulating global wetland dynamics. However, large discrepancies remain among the implementations of TOPMODEL in land-surface models (LSMs), and thus in their performance against observations. This study describes new improvements to the TOPMODEL implementation and estimates of global wetland dynamics using the LPJ-wsl DGVM, and quantifies uncertainties by comparing three digital elevation model products (HYDRO1k, GMTED, and HydroSHEDS) of different spatial resolution and accuracy in terms of simulated inundation dynamics. We found that calibrating TOPMODEL against a benchmark dataset helps to successfully predict the seasonal and interannual variations of wetlands, and improves the spatial distribution of wetlands so that it is consistent with inventories. The HydroSHEDS DEM, using a river-basin scheme for aggregating the CTI, shows the best accuracy in capturing the spatio-temporal dynamics of wetlands among the three DEM products. This study demonstrates the feasibility of capturing the spatial heterogeneity of inundation and of estimating seasonal and interannual variations in wetlands by coupling a hydrological module in LSMs with appropriate benchmark datasets. It additionally highlights the importance of an adequate understanding of topographic indices for simulating global wetlands and shows an opportunity to converge wetland estimates across LSMs by identifying the uncertainty associated with existing wetland products.
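
    The compound topographic index at the heart of these TOPMODEL schemes is defined from the upslope contributing area per unit contour length a_i and the local slope tan β_i, and a cell is treated as inundated once its index exceeds a threshold set by the catchment-mean water-table deficit (standard TOPMODEL form):

    ```latex
    % compound topographic (wetness) index of cell i
    \lambda_i = \ln\!\left( \frac{a_i}{\tan\beta_i} \right)
    % with local saturation deficit  D_i = \bar{D} + m\,(\bar{\lambda} - \lambda_i),
    % cell i is saturated (inundated) when D_i \le 0, i.e. when
    \lambda_i \;\ge\; \bar{\lambda} + \frac{\bar{D}}{m}
    ```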

  18. Assimilation of Spatially Sparse In Situ Soil Moisture Networks into a Continuous Model Domain

    NASA Astrophysics Data System (ADS)

    Gruber, A.; Crow, W. T.; Dorigo, W. A.

    2018-02-01

    Growth in the availability of near-real-time soil moisture observations from ground-based networks has spurred interest in the assimilation of these observations into land surface models via a two-dimensional data assimilation system. However, the design of such systems is currently hampered by our ignorance concerning the spatial structure of error afflicting ground and model-based soil moisture estimates. Here we apply newly developed triple collocation techniques to provide the spatial error information required to fully parameterize a two-dimensional (2-D) data assimilation system designed to assimilate spatially sparse observations acquired from existing ground-based soil moisture networks into a spatially continuous Antecedent Precipitation Index (API) model for operational agricultural drought monitoring. Over the contiguous United States (CONUS), the posterior uncertainty of surface soil moisture estimates associated with this 2-D system is compared to that obtained from the 1-D assimilation of remote sensing retrievals to assess the value of ground-based observations to constrain a surface soil moisture analysis. Results demonstrate that a fourfold increase in existing CONUS ground station density is needed for ground network observations to provide a level of skill comparable to that provided by existing satellite-based surface soil moisture retrievals.
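
    Triple collocation recovers the error variance of each of three collocated estimates of the same variable, assuming their errors are mutually independent, from the pairwise covariances alone. A minimal numpy sketch with synthetic station, model, and satellite series (all values invented):

    ```python
    import numpy as np

    def triple_collocation(x, y, z):
        """Error variances of three collocated datasets with independent errors
        (covariance notation; affine calibration differences cancel)."""
        c = np.cov(np.vstack([x, y, z]))
        ex2 = c[0, 0] - c[0, 1] * c[0, 2] / c[1, 2]
        ey2 = c[1, 1] - c[0, 1] * c[1, 2] / c[0, 2]
        ez2 = c[2, 2] - c[0, 2] * c[1, 2] / c[0, 1]
        return ex2, ey2, ez2

    rng = np.random.default_rng(0)
    truth = rng.normal(0.25, 0.05, 5000)                    # soil moisture "truth"
    station = truth + rng.normal(0, 0.02, truth.size)       # ground network
    model = 0.9 * truth + rng.normal(0, 0.03, truth.size)   # API-type model
    sat = 1.1 * truth + rng.normal(0, 0.04, truth.size)     # satellite retrieval
    print([f"{np.sqrt(e):.3f}" for e in triple_collocation(station, model, sat)])
    ```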

  19. Towards ground-truthing of spaceborne estimates of above-ground life biomass and leaf area index in tropical rain forests

    NASA Astrophysics Data System (ADS)

    Köhler, P.; Huth, A.

    2010-08-01

    The canopy height h of forests is a key variable which can be obtained using air- or spaceborne remote sensing techniques such as radar interferometry or LIDAR. If new allometric relationships between canopy height and the biomass stored in the vegetation can be established, this would offer the possibility of global monitoring of the above-ground carbon content on land. In the absence of adequate field data, we use simulation results from a tropical rain forest growth model to propose what degree of information might be generated from canopy height, and thus to enable ground-truthing of potential future satellite observations. We analyse the correlation between canopy height in a tropical rain forest and other structural characteristics, such as above-ground life biomass (AGB) (and thus the carbon content of the vegetation) and leaf area index (LAI), and identify how correlation and uncertainty vary between two different spatial scales. The process-based forest growth model FORMIND2.0 was applied to simulate (a) undisturbed forest growth and (b) a wide range of possible disturbance regimes typical of local tree logging conditions, for a tropical rain forest site on Borneo (Sabah, Malaysia) in South-East Asia. In both undisturbed and disturbed forests, AGB can be expressed as a power-law function of canopy height h (AGB = a·h^b) with r² ~ 60% if data are analysed at a spatial resolution of 20 m × 20 m (0.04 ha, also called plot size). The correlation becomes significantly better at the disturbed forest sites (r² = 91%) when data are analysed at the one-hectare scale. There seems to be no functional dependency between LAI and canopy height, but there is a linear correlation (r² ~ 60%) between AGB and the area fraction of gaps in which the canopy is highly disturbed. A reasonable agreement of our results with observations is obtained from a comparison of the simulations with permanent sampling plot (PSP) data from the same region and with the large-scale forest inventory in Lambir. We conclude that spaceborne remote sensing techniques such as LIDAR and radar interferometry have the potential to quantify the carbon contained in the vegetation, although, due to the heterogeneity of the forest landscape, this calculation contains structural uncertainties which restrict future applications to spatial averages of about one hectare in size. The uncertainties in AGB for a given canopy height are here 20-40% (95% confidence level), corresponding to a standard deviation of less than ±10%. This uncertainty at the 1 ha scale is much smaller than in the analysis of 0.04 ha-scale data. At this small scale (0.04 ha), AGB can only be calculated from canopy height with an uncertainty that is at least of the magnitude of the signal itself, due to the natural spatial heterogeneity of these forests.
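
    Fitting the AGB = a·h^b relation reported above is a log-log linear regression, and the r² of that regression is the quantity that improves from roughly 60% at 0.04 ha to about 91% at the hectare scale. A sketch on synthetic plot data (coefficients are invented, not FORMIND output):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    h = rng.uniform(5.0, 45.0, 400)                          # canopy height (m)
    agb = 2.5 * h ** 1.6 * rng.lognormal(0.0, 0.35, h.size)  # synthetic AGB (t/ha)

    # power law AGB = a * h^b  <=>  log(AGB) = log(a) + b * log(h)
    b, log_a = np.polyfit(np.log(h), np.log(agb), 1)
    a = np.exp(log_a)
    resid = np.log(agb) - (log_a + b * np.log(h))
    r2 = 1.0 - resid.var() / np.log(agb).var()
    print(f"AGB = {a:.2f} * h^{b:.2f},  r^2 = {r2:.2f}")
    ```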

  20. Using internal discharge data in a distributed conceptual model to reduce uncertainty in streamflow simulations

    NASA Astrophysics Data System (ADS)

    Guerrero, J.; Halldin, S.; Xu, C.; Lundin, L.

    2011-12-01

    Distributed hydrological models are important tools in water management, as they account for the spatial variability of hydrological data and are able to produce spatially distributed outputs. They can directly incorporate and assess potential changes in the characteristics of our basins. A recognized problem for models in general is equifinality, which is only exacerbated for distributed models, which tend to have a large number of parameters. We need to deal with the fundamentally ill-posed nature of the problem that such models force us to face, i.e., a large number of parameters and very few variables that can be used to constrain them, often only the catchment discharge. There is a growing but still limited literature showing how the internal states of a distributed model can be used to calibrate/validate its predictions. In this paper, a distributed version of WASMOD, a conceptual rainfall-runoff model with only three parameters, combined with a routing algorithm based on the high-resolution HydroSHEDS data, was used to simulate the discharge in the Paso La Ceiba basin in Honduras. The parameter space was explored using Monte Carlo simulations, and the region of space containing the parameter sets considered behavioral according to two different criteria was delimited using the geometric concept of alpha-shapes. The discharge data from five internal sub-basins were used to aid the calibration of the model and to answer the following questions: Can this information improve the simulations at the outlet of the catchment, or decrease their uncertainty? Also, after reducing the number of model parameters needing calibration through sensitivity analysis: Is it possible to relate them to basin characteristics? The analysis revealed that in most cases the internal discharge data can be used to reduce the uncertainty in the discharge at the outlet, albeit with little improvement in the overall simulation results.
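
    The Monte Carlo screening described here follows the familiar behavioral/non-behavioral pattern: sample parameter sets, run the model, and retain the sets whose performance (for example, Nash-Sutcliffe efficiency) passes a threshold; the retained cloud is what the alpha-shapes then delimit. A generic sketch with a three-parameter stand-in model (not WASMOD):

    ```python
    import numpy as np

    def nse(sim, obs):
        """Nash-Sutcliffe efficiency."""
        return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def toy_model(params, rain):
        """Stand-in for a 3-parameter conceptual rainfall-runoff model."""
        a, b, c = params
        store, q = 0.0, []
        for r in rain:
            store += a * r                     # effective rainfall into storage
            out = min(store, b * store ** c)   # nonlinear storage release
            store -= out
            q.append(out)
        return np.array(q)

    rng = np.random.default_rng(0)
    rain = rng.gamma(2.0, 3.0, 365)
    obs = toy_model((0.6, 0.3, 1.1), rain) + rng.normal(0, 0.3, 365)

    samples = rng.uniform([0.1, 0.05, 0.8], [1.0, 0.8, 1.4], (2000, 3))
    behavioral = [p for p in samples if nse(toy_model(p, rain), obs) > 0.7]
    print(f"{len(behavioral)} of {len(samples)} parameter sets are behavioral")
    ```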

  1. Predicting wildfire occurrence distribution with spatial point process models and its uncertainty assessment: a case study in the Lake Tahoe Basin, USA

    Treesearch

    Jian Yang; Peter J. Weisberg; Thomas E. Dilts; E. Louise Loudermilk; Robert M. Scheller; Alison Stanton; Carl Skinner

    2015-01-01

    Strategic fire and fuel management planning benefits from detailed understanding of how wildfire occurrences are distributed spatially under current climate, and from predictive models of future wildfire occurrence given climate change scenarios. In this study, we fitted historical wildfire occurrence data from 1986 to 2009 to a suite of spatial point process (SPP)...

  2. Connecting Systems Model Design to Decision-Maker and Stakeholder Needs: Lessons from Louisiana's Coastal Master Plan

    NASA Astrophysics Data System (ADS)

    Fischbach, J. R.; Johnson, D.

    2017-12-01

    Louisiana's Comprehensive Master Plan for a Sustainable Coast is a 50-year plan designed to reduce flood risk and minimize land loss while allowing for the continued provision of economic and ecosystem services from this critical coastal region. Conceived in 2007 in response to hurricanes Katrina and Rita in 2005, the master plan is updated on a five-year planning cycle by the state's Coastal Protection and Restoration Authority (CPRA). Under the plan's middle-of-the-road (Medium) environmental scenario, the master plan is projected to reduce expected annual damage from storm surge flooding by approximately 65% relative to a future without action: from $5.3 billion to $2.2 billion in 2040, and from $12.1 billion to $3.7 billion in 2065. The Coastal Louisiana Risk Assessment model (CLARA) is used to estimate the risk reduction impacts of projects that have been considered for implementation as part of the plan. Evaluation of projects involves estimation of cost effectiveness in multiple future time periods and under a range of environmental uncertainties (e.g., the rates of sea level rise and land subsidence, changes in future hurricane intensity and frequency), operational uncertainties (e.g., system fragility), and economic uncertainties (e.g., patterns of population change and asset exposure). Between the 2012 and 2017 planning cycles, many improvements were made to the CLARA model. These included changes to the model's spatial resolution and definition of policy-relevant spatial units, an improved treatment of parametric uncertainty and uncertainty propagation between model components, the addition of a module to consider critical infrastructure exposure, and a new population growth model. CPRA also developed new scenarios for analysis in 2017 that were responsive to new scientific literature and that accommodate a new approach to modeling coastal morphology. In this talk, we discuss how CLARA has evolved over the 2012 and 2017 planning cycles in response to the needs of policy makers and CPRA managers. While changes will be illustrated through examples from Louisiana's 2017 Coastal Master Plan, we endeavor to provide generalizable and actionable insights about how modeling choices should be guided by the decision support process being used by planners.

  3. Impact of Climate Change on high and low flows across Great Britain: a temporal analysis and uncertainty assessment.

    NASA Astrophysics Data System (ADS)

    Beevers, Lindsay; Collet, Lila

    2017-04-01

    Over the past decade there have been significant challenges to water management posed by both floods and droughts. In the UK, flooding has caused over £5Bn worth of damage since 2000, and direct costs from the recent drought (2011-12) are estimated at between £70M and £165M, arising from impacts on public and industrial water supply. Projections of future climate change suggest increasing temperature and precipitation trends which may exacerbate the frequency and severity of such hazards, but there is significant uncertainty associated with these projections. It thus becomes urgent to assess the possible impact of these changes on extreme flows and to evaluate the uncertainties related to these projections, particularly changes in the seasonality of such hazards. This paper aims to assess the changes in seasonality of peak and low flows across Great Britain as a result of climate change. It is based on the Future Flow database, an 11-member ensemble of transient river flow projections from January 1951 to December 2098. We analyse the daily river flow over the baseline (1961-1990) and the 2080s (2069-2098) for 281 gauging stations. For each ensemble member, annual maxima (AMAX) and minima (AMIN) are extracted for both time periods at each gauging station. The month of the year in which the AMAX and AMIN occur is recorded for each of the 30 years in the past and future time periods. The uncertainty in the monthly timing of AMAX and AMIN occurrence is assessed across the 11 ensemble members, as well as the changes in this temporal signal between the baseline and the 2080s. Ultimately, this work gives a national, spatial picture of the timing of high and low flows and allows the assessment of possible changes in hydrological dynamics as a result of climate change in a statistical framework. Results will quantify the uncertainty related to the climate model parameters which are cascaded into the modelling chain. This study highlights the issues facing water management as the spatial and temporal patterns of hydro-hazards change, and the need to anticipate and adapt to these changes in an uncertain context.
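
    A short sketch of the AMAX/AMIN month-extraction step described above, for one synthetic gauging-station series standing in for a single ensemble member; the seasonal toy series is an assumption.

    ```python
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    dates = pd.date_range("1961-01-01", "1990-12-31", freq="D")

    # Synthetic daily flow with a seasonal cycle plus noise, standing in for
    # one ensemble member at one gauging station (assumption).
    doy = dates.dayofyear.to_numpy()
    flow = 50 + 30 * np.sin(2 * np.pi * (doy - 30) / 365.25) + rng.gamma(2, 5, len(dates))
    s = pd.Series(flow, index=dates)

    amax_month = s.groupby(s.index.year).apply(lambda x: x.idxmax().month)
    amin_month = s.groupby(s.index.year).apply(lambda x: x.idxmin().month)

    # The spread of these month histograms across the 11 ensemble members is
    # one way to express the temporal uncertainty of AMAX/AMIN occurrence.
    print(amax_month.value_counts().sort_index())
    print(amin_month.value_counts().sort_index())
    ```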

  4. Management Choices in an Uncertain Future: Navigating Snow, Precipitation, and Temperature Projections in the Pacific Northwest U.S. to Assess Water Management Alternatives

    NASA Astrophysics Data System (ADS)

    Luce, C.

    2014-12-01

    Climate and hydrology models are regularly applied to assess potential changes in water resources and to inform adaptation decisions. An increasingly common question is, "What if we are wrong?" While climate models show substantial agreement on metrics such as pressure, temperature, and wind, they are notoriously uncertain in projecting precipitation change. The response to that uncertainty varies depending on the water management context and the nature of the uncertainty. In the southwestern U.S., large storage reservoirs (relative to annual supply) and general expectations of decreasing precipitation have guided extensive discussion on water management towards uncertainties in annual-scale water balances, precipitation, and evapotranspiration. In contrast, smaller reservoirs and little expectation of change in annual precipitation have focused discussions of Pacific Northwest water management toward shifts in runoff seasonality. The relative certainty of temperature impacts on snowpacks compared to the substantial uncertainty in precipitation has yielded a consistent narrative of earlier snowmelt. This narrative has been reinforced by a perception of essentially the same behavior in the historical record. This perception has led to calls in the political arena for more reservoir storage to replace snowpack storage for water supplies. Recent findings on differences in precipitation trends at high versus low elevations, however, have revived attention to the uncertainty in precipitation futures and generated questions about alternative water management strategies. An important question with respect to snowpacks is whether the precipitation changes matter in the context of such substantial projections for temperature change. Here we apply an empirical snowpack model to analyze spatial differences in the uncertainty of snowpack responses to temperature and precipitation forcing across the Pacific Northwest U.S. The analysis reveals a strong geographic gradient in the uncertainty of snowpack response to future climate, from the coastal regions, where precipitation uncertainty is relatively inconsequential for snowpack changes, to interior mountains, where even minor uncertainties in precipitation are on par with the changes expected from temperature.
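
    As a hedged illustration of how such a sensitivity analysis can be framed, the sketch below runs a toy temperature-index snow model over a grid of temperature and precipitation perturbations; the model form, parameter values, and synthetic forcing are assumptions, not the empirical snowpack model used in the study.

    ```python
    import numpy as np

    def end_of_season_swe(temps_c, precip_mm, melt_factor=3.0):
        """Accumulate snow when T<0; melt at melt_factor*T (mm/day/degC) when T>0."""
        swe = 0.0
        for t, p in zip(temps_c, precip_mm):
            if t < 0:
                swe += p                       # precipitation falls as snow
            else:
                swe = max(0.0, swe - melt_factor * t)
        return swe

    rng = np.random.default_rng(1)
    base_t = 5 * np.sin(np.linspace(-np.pi, 0, 182)) - 2   # Oct-Mar temperatures
    base_p = rng.gamma(1.5, 4.0, 182)                      # daily precipitation (mm)

    for dt in [0.0, 1.5, 3.0]:                 # warming scenarios (degC)
        for dp in [0.9, 1.0, 1.1]:             # precipitation scaling scenarios
            swe = end_of_season_swe(base_t + dt, base_p * dp)
            print(f"dT={dt:+.1f}C  P x{dp:.1f}  ->  SWE {swe:7.1f} mm")
    ```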

  5. Towards a hierarchical optimization framework for spatially targeting incentive policies to promote green infrastructure amidst multiple objectives and uncertainty

    EPA Science Inventory

    We introduce a hierarchical optimization framework for spatially targeting green infrastructure (GI) incentive policies in order to meet objectives related to cost and environmental effectiveness. The framework explicitly simulates the interaction between multiple levels of polic...

  6. Robustness of risk maps and survey networks to knowledge gaps about a new invasive pest.

    PubMed

    Yemshanov, Denys; Koch, Frank H; Ben-Haim, Yakov; Smith, William D

    2010-02-01

    In pest risk assessment it is frequently necessary to make management decisions regarding emerging threats under severe uncertainty. Although risk maps provide useful decision support for invasive alien species, they rarely address knowledge gaps associated with the underlying risk model or how those gaps may change the risk estimates. Failure to recognize uncertainty leads to risk-ignorant decisions and miscalculation of expected impacts as well as of the costs required to minimize these impacts. Here we use the information gap concept to evaluate the robustness of risk maps to uncertainties in key assumptions about an invading organism. We generate risk maps with a spatial model of invasion that simulates potential entries of an invasive pest via international marine shipments, their spread through a landscape, and establishment on a susceptible host. In particular, we focus on the question of how much uncertainty in risk model assumptions can be tolerated before the risk map loses its value. We outline this approach with an example of a forest pest recently detected in North America, Sirex noctilio Fabricius. The results provide a spatial representation of the robustness of predictions of S. noctilio invasion risk to uncertainty and show major geographic hotspots where the consideration of uncertainty in model parameters may change management decisions about a new invasive pest. We then illustrate how the dependency between the extent of uncertainties and the degree of robustness of a risk map can be used to select a surveillance network design that is most robust to knowledge gaps about the pest.
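
    A minimal information-gap robustness sketch: the robustness of a decision is the largest uncertainty horizon alpha for which the worst-case outcome still meets a performance requirement. The toy damage model and threshold below are assumptions, not the paper's invasion model.

    ```python
    import numpy as np

    def worst_case_damage(spread_rate_nominal, alpha):
        """Worst case within horizon alpha: the spread rate errs upward by alpha."""
        rate = spread_rate_nominal * (1 + alpha)
        return 100.0 * rate ** 2            # toy damage curve (assumption)

    def robustness(nominal, damage_threshold, alphas=np.linspace(0, 2, 2001)):
        """Largest alpha whose worst-case damage still meets the requirement."""
        ok = [a for a in alphas if worst_case_damage(nominal, a) <= damage_threshold]
        return max(ok) if ok else 0.0

    # How much relative error in the assumed spread rate can be tolerated
    # before expected damage exceeds the acceptable level?
    print(robustness(nominal=0.5, damage_threshold=60.0))
    ```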

  7. Part 2. Development of Enhanced Statistical Methods for Assessing Health Effects Associated with an Unknown Number of Major Sources of Multiple Air Pollutants.

    PubMed

    Park, Eun Sug; Symanski, Elaine; Han, Daikwon; Spiegelman, Clifford

    2015-06-01

    A major difficulty with assessing source-specific health effects is that source-specific exposures cannot be measured directly; rather, they need to be estimated by a source-apportionment method such as multivariate receptor modeling. The uncertainty in source apportionment (uncertainty in source-specific exposure estimates and model uncertainty due to the unknown number of sources and identifiability conditions) has been largely ignored in previous studies. Also, spatial dependence of multipollutant data collected from multiple monitoring sites has not yet been incorporated into multivariate receptor modeling. The objectives of this project are (1) to develop a multipollutant approach that incorporates both sources of uncertainty in source apportionment into the assessment of source-specific health effects and (2) to develop enhanced multivariate receptor models that can account for spatial correlations in the multipollutant data collected from multiple sites. We employed a Bayesian hierarchical modeling framework consisting of multivariate receptor models, health-effects models, and a hierarchical model on latent source contributions. For the health model, we focused on the time-series design in this project. Each combination of number of sources and identifiability conditions (additional constraints on model parameters) defines a different model. We built a set of plausible models with extensive exploratory data analyses and with information from previous studies, and then computed posterior model probability to estimate model uncertainty. Parameter estimation and model uncertainty estimation were implemented simultaneously by Markov chain Monte Carlo (MCMC) methods. We validated the methods using simulated data. We illustrated the methods using PM2.5 (particulate matter ≤ 2.5 μm in aerodynamic diameter) speciation data and mortality data from Phoenix, Arizona, and Houston, Texas. The Phoenix data included counts of cardiovascular deaths and daily PM2.5 speciation data from 1995 to 1997. The Houston data included respiratory mortality data and 24-hour PM2.5 speciation data sampled every six days from a region near the Houston Ship Channel during 2002-2005. We also developed a Bayesian spatial multivariate receptor modeling approach that, while simultaneously dealing with the unknown number of sources and identifiability conditions, incorporated spatial correlations in the multipollutant data collected from multiple sites into the estimation of source profiles and contributions based on the discrete process convolution model for multivariate spatial processes. This new modeling approach was applied to 24-hour ambient air concentrations of 17 volatile organic compounds (VOCs) measured at nine monitoring sites in Harris County, Texas, during years 2000 to 2005. Simulation results indicated that our methods were accurate in identifying the true model and estimated parameters were close to the true values. The results from our methods agreed in general with previous studies on the source apportionment of the Phoenix data in terms of estimated source profiles and contributions. However, we had a greater number of statistically insignificant findings, which was likely a natural consequence of incorporating uncertainty in the estimated source contributions into the health-effects parameter estimation.
For the Houston data, a model with five sources (that seemed to be Sulfate-Rich Secondary Aerosol, Motor Vehicles, Industrial Combustion, Soil/Crustal Matter, and Sea Salt) showed the highest posterior model probability among the candidate models considered when fitted simultaneously to the PM2.5 and mortality data. There was a statistically significant positive association between respiratory mortality and same-day PM2.5 concentrations attributed to one of the sources (probably industrial combustion). The Bayesian spatial multivariate receptor modeling approach applied to the VOC data led to the highest posterior model probability for a model with five sources (that seemed to be refinery, petrochemical production, gasoline evaporation, natural gas, and vehicular exhaust) among several candidate models, with the number of sources varying between three and seven and with different identifiability conditions. Our multipollutant approach to assessing source-specific health effects is more advantageous than a single-pollutant approach in that it can estimate total health effects from multiple pollutants and can also identify emission sources that are responsible for adverse health effects. Our Bayesian approach can incorporate not only uncertainty in the estimated source contributions, but also model uncertainty that has not been addressed in previous studies on assessing source-specific health effects. The new Bayesian spatial multivariate receptor modeling approach enables predictions of source contributions at unmonitored sites, minimizing exposure misclassification and providing improved exposure estimates along with their uncertainty estimates, as well as accounting for uncertainty in the number of sources and identifiability conditions.

  8. Sources of errors and uncertainties in the assessment of forest soil carbon stocks at different scales-review and recommendations.

    PubMed

    Vanguelova, E I; Bonifacio, E; De Vos, B; Hoosbeek, M R; Berger, T W; Vesterdal, L; Armolaitis, K; Celi, L; Dinca, L; Kjønaas, O J; Pavlenda, P; Pumpanen, J; Püttsepp, Ü; Reidy, B; Simončič, P; Tobin, B; Zhiyanski, M

    2016-11-01

    Spatially explicit knowledge of recent and past soil organic carbon (SOC) stocks in forests will improve our understanding of the effect of human- and non-human-induced changes on forest C fluxes. For SOC accounting, a minimum detectable difference must be defined in order to adequately determine temporal changes and spatial differences in SOC. This requires sufficiently detailed data to predict SOC stocks at appropriate scales within the required accuracy so that only significant changes are accounted for. When designing sampling campaigns, factors influencing the spatial and temporal distribution of SOC (such as soil type, topography, climate and vegetation) need to be taken into account to optimise sampling depths and numbers of samples, thereby ensuring that samples accurately reflect the distribution of SOC at a site. Furthermore, the appropriate scales related to the research question need to be defined: profile, plot, forest, catchment, national or wider. Scaling up SOC stocks from point samples to landscape units is challenging and thus requires reliable baseline data. Knowledge of the uncertainties associated with SOC measures at each particular scale, and of how to reduce them, is crucial for assessing SOC stocks with the highest possible accuracy at each scale. This review identifies where potential sources of errors and uncertainties related to forest SOC stock estimation occur at five different scales: sample, profile, plot, landscape/regional and European. Recommendations are also provided on how to reduce forest SOC uncertainties and increase the efficiency of SOC assessment at each scale.

  9. Phase correction and error estimation in InSAR time series analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Fattahi, H.; Amelung, F.

    2017-12-01

    During the last decade several InSAR time series approaches have been developed in response to the non-ideal acquisition strategies of SAR satellites, such as large spatial and temporal baselines and irregular acquisitions. The small baseline tubes and regular acquisitions of new SAR satellites such as Sentinel-1 allow us to form fully connected networks of interferograms and simplify the time series analysis into a weighted least-squares inversion of an over-determined system. Such a robust inversion allows us to focus more on understanding the different components in InSAR time series and their uncertainties. We present an open-source Python-based package for InSAR time series analysis, called PySAR (https://yunjunz.github.io/PySAR/), with unique functionalities for obtaining unbiased ground displacement time series, geometrical and atmospheric correction of InSAR data, and quantification of InSAR uncertainty. Our implemented strategy contains several features, including: 1) improved spatial coverage using a coherence-based network of interferograms, 2) unwrapping error correction using phase closure or bridging, 3) tropospheric delay correction using weather models and empirical approaches, 4) DEM error correction, 5) optimal selection of reference date and automatic outlier detection, 6) InSAR uncertainty due to the residual tropospheric delay, decorrelation and residual DEM error, and 7) variance-covariance matrix of final products for geodetic inversion. We demonstrate the performance using SAR datasets acquired by COSMO-SkyMed, TerraSAR-X, Sentinel-1 and ALOS/ALOS-2, with application to highly non-linear volcanic deformation in Japan and Ecuador (figure 1). Our results show precursory deformation before the 2015 eruptions of Cotopaxi volcano, with a maximum uplift of 3.4 cm on the western flank (fig. 1b) and a standard deviation of 0.9 cm (fig. 1a), supporting the finding by Morales-Rivera et al. (2017, GRL), and a post-eruptive subsidence in the same area, with a maximum of -3 +/- 0.9 cm (fig. 1c). The time-series displacement map (fig. 2) shows a highly non-linear deformation behavior, indicating a complicated magma propagation process during this eruption cycle.
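
    The network inversion mentioned above can be sketched compactly: with the first acquisition fixed as reference, each interferogram is a phase difference between two dates, giving an over-determined linear system solved by weighted least squares. The toy network, coherence-based weights, and noise model below are assumptions, not PySAR's implementation.

    ```python
    import numpy as np

    dates = 6                                   # acquisitions d0..d5 (d0 = reference)
    pairs = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5),
             (0, 2), (1, 3), (2, 4), (3, 5)]    # small-baseline network

    rng = np.random.default_rng(7)
    true_phase = np.cumsum(rng.normal(0, 1.0, dates))
    true_phase -= true_phase[0]                 # phase at d0 fixed to zero

    # Design matrix: one row per interferogram, one column per non-reference date.
    A = np.zeros((len(pairs), dates - 1))
    for k, (i, j) in enumerate(pairs):
        if i > 0:
            A[k, i - 1] = -1.0
        A[k, j - 1] = 1.0

    coherence = rng.uniform(0.4, 0.95, len(pairs))       # assumed quality proxy
    ifg = A @ true_phase[1:] + rng.normal(0, 0.1 / coherence)

    W = np.diag(coherence ** 2)                          # better ifg, heavier weight
    est = np.linalg.solve(A.T @ W @ A, A.T @ W @ ifg)    # weighted normal equations
    print(np.round(est - true_phase[1:], 3))             # residual vs truth
    ```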

  10. Common mode error in Antarctic GPS coordinate time series and its effect on bedrock-uplift estimates

    NASA Astrophysics Data System (ADS)

    Liu, Bin; King, Matt; Dai, Wujiao

    2018-05-01

    Spatially correlated common mode error (CME) always exists in regional or larger GPS networks. We applied independent component analysis (ICA) to GPS vertical coordinate time series in Antarctica from 2010 to 2014 and made a comparison with principal component analysis (PCA). Using PCA/ICA, the time series can be decomposed into a set of temporal components and their spatial responses. We assume the components with common spatial responses are the CME. An average reduction of ~40% in the RMS values was achieved with both PCA and ICA filtering. However, the common mode components obtained from the two approaches have different spatial and temporal features. The ICA time series present interesting correlations with modeled atmospheric and non-tidal ocean loading displacements. A white noise (WN) plus power law noise (PL) model was adopted in the GPS velocity estimation using maximum likelihood estimation (MLE) analysis, with a ~55% reduction of the velocity uncertainties after filtering using ICA. Meanwhile, spatiotemporal filtering reduces the amplitude of the PL and periodic terms in the GPS time series. Finally, we compare the GPS uplift velocities, after correction for elastic effects, with recent models of glacial isostatic adjustment (GIA). The agreement of the GPS observed velocities with four GIA models is generally improved after the spatiotemporal filtering, with a mean reduction of ~0.9 mm/yr in the WRMS values, possibly allowing for more confident separation of the various GIA model predictions.
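
    A minimal sketch of the PCA variant of this spatiotemporal filtering: stack the centred vertical series, extract the leading mode via SVD, and subtract its reconstruction as the common mode. The synthetic network below is an assumption; the paper's ICA decomposition and noise modelling involve considerably more machinery.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_epochs, n_sites = 500, 12
    cme = np.cumsum(rng.normal(0, 0.5, n_epochs))        # shared wandering signal
    local = rng.normal(0, 1.0, (n_epochs, n_sites))      # site-specific noise
    X = cme[:, None] * rng.uniform(0.8, 1.2, n_sites) + local

    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)    # PCA via SVD
    pc1 = U[:, 0] * s[0]                                 # leading temporal mode
    response = Vt[0]                                     # spatial response per site

    filtered = Xc - np.outer(pc1, response)              # remove the common mode
    rms_before = np.sqrt((Xc ** 2).mean())
    rms_after = np.sqrt((filtered ** 2).mean())
    print(f"RMS before {rms_before:.2f}, after {rms_after:.2f}")
    ```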

  11. Quantification of the impact of precipitation spatial distribution uncertainty on predictive uncertainty of a snowmelt runoff model

    NASA Astrophysics Data System (ADS)

    Jacquin, A. P.

    2012-04-01

    This study is intended to quantify the impact of uncertainty about precipitation spatial distribution on the predictive uncertainty of a snowmelt runoff model. This problem is especially relevant in mountain catchments with a sparse precipitation observation network and relatively short precipitation records. The model analysed is a conceptual watershed model operating at a monthly time step. The model divides the catchment into five elevation zones, where the fifth zone corresponds to the catchment's glaciers. Precipitation amounts at each elevation zone i are estimated as the product of observed precipitation at a station and a precipitation factor FPi. If other precipitation data are not available, these precipitation factors must be adjusted during the calibration process and are thus seen as parameters of the model. In the case of the fifth zone, glaciers are seen as an inexhaustible source of water that melts when the snow cover is depleted. The catchment case study is the Aconcagua River at Chacabuquito, located in the Andean region of Central Chile. The model's predictive uncertainty is measured in terms of the output variance of the mean squared error of the Box-Cox transformed discharge, the relative volumetric error, and the weighted average of snow water equivalent in the elevation zones at the end of the simulation period. Sobol's variance decomposition (SVD) method is used for assessing the impact of precipitation spatial distribution, represented by the precipitation factors FPi, on the model's predictive uncertainty. In the SVD method, the first order effect of a parameter (or group of parameters) indicates the fraction of predictive uncertainty that could be reduced if the true value of this parameter (or group) was known. Similarly, the total effect of a parameter (or group) measures the fraction of predictive uncertainty that would remain if the true value of this parameter (or group) was unknown, but all the remaining model parameters could be fixed. In this study, first order and total effects of the group of precipitation factors FP1-FP4, and of the precipitation factor FP5, are calculated separately. First order and total effects of the group FP1-FP4 are much higher than first order and total effects of the factor FP5, which are negligible. This situation is due to the fact that the actual value taken by FP5 does not have much influence on the contribution of the glacier zone to the catchment's output discharge, which is mainly limited by incident solar radiation. In addition, first order effects indicate that, on average, nearly 25% of predictive uncertainty could be reduced if the true values of the precipitation factors FPi were known, even if no information was available on the appropriate values of the remaining model parameters. Finally, the total effects of the precipitation factors FP1-FP4 average close to 41%, implying that even if the appropriate values of the remaining model parameters could be fixed, predictive uncertainty would still be quite high if the spatial distribution of precipitation remained unknown. Acknowledgements: This research was funded by FONDECYT, Research Project 1110279.
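
    A compact sketch of Monte Carlo estimation of Sobol first-order and total effects, using the standard Saltelli (2010) and Jansen (1999) estimators on a stand-in function; in the study the "model" is the full snowmelt runoff code and the inputs include the precipitation factors FPi.

    ```python
    import numpy as np

    def model(x):
        """Toy model: x[:,0] dominant, x[:,2] interacting (assumption)."""
        return x[:, 0] + 0.3 * x[:, 1] + x[:, 0] * x[:, 2]

    rng = np.random.default_rng(11)
    n, k = 100_000, 3
    A = rng.uniform(0, 1, (n, k))                    # two independent sample matrices
    B = rng.uniform(0, 1, (n, k))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))

    for i in range(k):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                          # swap column i only
        fABi = model(ABi)
        s1 = np.mean(fB * (fABi - fA)) / var         # first-order effect (Saltelli)
        st = 0.5 * np.mean((fA - fABi) ** 2) / var   # total effect (Jansen)
        print(f"x{i}: S1={s1:.3f}  ST={st:.3f}")
    ```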

  12. Multi-scale landslide hazard assessment: Advances in global and regional methodologies

    NASA Astrophysics Data System (ADS)

    Kirschbaum, Dalia; Peters-Lidard, Christa; Adler, Robert; Hong, Yang

    2010-05-01

    The increasing availability of remotely sensed surface data and precipitation provides a unique opportunity to explore how smaller-scale landslide susceptibility and hazard assessment methodologies may be applicable at larger spatial scales. This research first considers an emerging satellite-based global algorithm framework, which evaluates how landslide susceptibility and satellite-derived rainfall estimates can forecast potential landslide conditions. An analysis of this algorithm using a newly developed global landslide inventory catalog suggests that forecasting errors are geographically variable due to improper weighting of surface observables, the resolution of the current susceptibility map, and limitations in the availability of landslide inventory data. These methodological and data limitation issues can be more thoroughly assessed at the regional level, where available higher-resolution landslide inventories can be applied to empirically derive relationships between surface variables and landslide occurrence. The regional empirical model shows improvement over the global framework in advancing near real-time landslide forecasting efforts; however, there are many uncertainties and assumptions surrounding such a methodology that decrease the functionality and utility of this system. This research seeks to improve upon this initial concept by exploring the potential opportunities and methodological structure needed to advance larger-scale landslide hazard forecasting and make it more of an operational reality. Sensitivity analysis of the surface and rainfall parameters in the preliminary algorithm indicates that surface data resolution and the interdependency of variables must be more appropriately quantified at local and regional scales. Additionally, integrating available surface parameters must be approached in a more theoretical, physically-based manner to better represent the physical processes underlying slope instability and landslide initiation. Several rainfall infiltration and hydrological flow models have been developed to model slope instability at small spatial scales. This research investigates the potential of applying a more quantitative hydrological model to larger spatial scales, utilizing satellite and surface data inputs that are obtainable over different geographic regions. Due to the significant role that data and methodological uncertainties play in the effectiveness of landslide hazard assessment outputs, the methodology and data inputs are considered within an ensemble uncertainty framework in order to better resolve the contribution and limitations of model inputs and to more effectively communicate the model skill for improved landslide hazard assessment.

  13. The role of historical fire disturbance in the carbon dynamics of the pan-boreal region: A process-based analysis

    USGS Publications Warehouse

    Balshi, M. S.; McGuire, A.D.; Zhuang, Q.; Melillo, J.; Kicklighter, D.W.; Kasischke, E.; Wirth, C.; Flannigan, M.; Harden, J.; Clein, Joy S.; Burnside, T.J.; McAllister, J.; Kurz, W.A.; Apps, M.; Shvidenko, A.

    2007-01-01

    Wildfire is a common occurrence in ecosystems of northern high latitudes, and changes in the fire regime of this region have consequences for carbon feedbacks to the climate system. To improve our understanding of how wildfire influences carbon dynamics of this region, we used the process-based Terrestrial Ecosystem Model to simulate fire emissions and changes in carbon storage north of 45°N from the start of spatially explicit historically recorded fire records in the twentieth century through 2002, and evaluated the role of fire in the carbon dynamics of the region within the context of ecosystem responses to changes in atmospheric CO2 concentration and climate. Our analysis indicates that fire plays an important role in interannual and decadal scale variation of source/sink relationships of northern terrestrial ecosystems and also suggests that atmospheric CO2 may be important to consider in addition to changes in climate and fire disturbance. There are substantial uncertainties in the effects of fire on carbon storage in our simulations. These uncertainties are associated with sparse fire data for northern Eurasia, uncertainty in estimating carbon consumption, and difficulty in verifying assumptions about the representation of fires that occurred prior to the start of the historical fire record. To improve the ability to better predict how fire will influence carbon storage of this region in the future, new analyses of the retrospective role of fire in the carbon dynamics of northern high latitudes should address these uncertainties. Copyright 2007 by the American Geophysical Union.

  14. Experimental Concepts for Testing Seismic Hazard Models

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Jordan, T. H.

    2015-12-01

    Seismic hazard analysis is the primary interface through which useful information about earthquake rupture and wave propagation is delivered to society. To account for the randomness (aleatory variability) and limited knowledge (epistemic uncertainty) of these natural processes, seismologists must formulate and test hazard models using the concepts of probability. In this presentation, we will address the scientific objections that have been raised over the years against probabilistic seismic hazard analysis (PSHA). Owing to the paucity of observations, we must rely on expert opinion to quantify the epistemic uncertainties of PSHA models (e.g., in the weighting of individual models from logic-tree ensembles of plausible models). The main theoretical issue is a frequentist critique: subjectivity is immeasurable; ergo, PSHA models cannot be objectively tested against data; ergo, they are fundamentally unscientific. We have argued (PNAS, 111, 11973-11978) that the Bayesian subjectivity required for casting epistemic uncertainties can be bridged with the frequentist objectivity needed for pure significance testing through "experimental concepts." An experimental concept specifies collections of data, observed and not yet observed, that are judged to be exchangeable (i.e., with a joint distribution independent of the data ordering) when conditioned on a set of explanatory variables. We illustrate, through concrete examples, experimental concepts useful in the testing of PSHA models for ontological errors in the presence of aleatory variability and epistemic uncertainty. In particular, we describe experimental concepts that lead to exchangeable binary sequences that are statistically independent but not identically distributed, showing how the Bayesian concept of exchangeability generalizes the frequentist concept of experimental repeatability. We also address the issue of testing PSHA models using spatially correlated data.

  15. Accounting for rate instability and spatial patterns in the boundary analysis of cancer mortality maps

    PubMed Central

    Goovaerts, Pierre

    2006-01-01

    Boundary analysis of cancer maps may highlight areas where causative exposures change through geographic space, the presence of local populations with distinct cancer incidences, or the impact of different cancer control methods. Too often, such analysis ignores the spatial pattern of incidence or mortality rates and overlooks the fact that rates computed from sparsely populated geographic entities can be very unreliable. This paper proposes a new methodology that accounts for the uncertainty and spatial correlation of rate data in the detection of significant edges between adjacent entities or polygons. Poisson kriging is first used to estimate the risk value and the associated standard error within each polygon, accounting for the population size and the risk semivariogram computed from raw rates. The boundary statistic is then defined as half the absolute difference between kriged risks. Its reference distribution, under the null hypothesis of no boundary, is derived through the generation of multiple realizations of the spatial distribution of cancer risk values. This paper presents three types of neutral models generated using methods of increasing complexity: the common random shuffle of estimated risk values, a spatial re-ordering of these risks, or p-field simulation that accounts for the population size within each polygon. The approach is illustrated using age-adjusted pancreatic cancer mortality rates for white females in 295 US counties of the Northeast (1970-1994). Simulation studies demonstrate that Poisson kriging yields more accurate estimates of the cancer risk and of how its value changes between polygons (i.e. the boundary statistic), relative to the use of raw rates or the local empirical Bayes smoother. When used in conjunction with spatial neutral models generated by p-field simulation, the boundary analysis based on Poisson kriging estimates minimizes the proportion of type I errors (i.e. edges wrongly declared significant) while the frequency of these errors is predicted well by the p-value of the statistical test. PMID:19023455
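
    A minimal sketch of the boundary statistic with the simplest of the three neutral models (random shuffling of estimated risks); the risk values and adjacency structure below are synthetic stand-ins for the Poisson-kriged county map.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_polys = 50
    risk = rng.gamma(5.0, 2.0, n_polys)                 # stand-in kriged risks
    edges = [(i, j) for i in range(n_polys) for j in range(i + 1, n_polys)
             if rng.random() < 0.08]                    # synthetic adjacency

    def boundary_stats(values, edge_list):
        """Half the absolute difference in risk across each edge."""
        return np.array([0.5 * abs(values[i] - values[j]) for i, j in edge_list])

    observed = boundary_stats(risk, edges)

    n_sims = 999
    null = np.empty((n_sims, len(edges)))
    for s in range(n_sims):                             # neutral model: shuffle risks
        null[s] = boundary_stats(rng.permutation(risk), edges)

    # One-sided p-value per edge: how often a shuffle produces as large a jump.
    p = (1 + (null >= observed).sum(axis=0)) / (n_sims + 1)
    print(f"{(p < 0.05).sum()} of {len(edges)} edges significant at 5%")
    ```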

  16. Constraining the effects of permeability uncertainty for geologic CO2 sequestration in a basalt reservoir

    NASA Astrophysics Data System (ADS)

    Jayne, R., Jr.; Pollyea, R.

    2016-12-01

    Carbon capture and sequestration (CCS) in geologic reservoirs is one strategy for reducing anthropogenic CO2 emissions from large-scale point-source emitters. Recent developments at the CarbFix CCS pilot in Iceland have shown that basalt reservoirs are highly effective for permanent mineral trapping on the basis of CO2-water-rock interactions, which result in the formation of carbonate minerals. In order to advance our understanding of basalt sequestration in large igneous provinces, this research uses numerical simulation to evaluate the feasibility of industrial-scale CO2 injections in the Columbia River Basalt Group (CRBG). Although bulk reservoir properties are well constrained on the basis of field and laboratory testing from the Wallula Basalt Sequestration Pilot Project, there remains significant uncertainty in the spatial distribution of permeability at the scale of individual basalt flows. Geostatistical analysis of hydrologic data from 540 wells illustrates that CRBG reservoirs are reasonably modeled as layered heterogeneous systems on the basis of basalt flow morphology; however, the regional dataset is insufficient to constrain permeability variability at the scale of an individual basalt flow. As a result, the permeability distribution for this modeling study is established by centering the lognormal permeability distribution from the regional dataset on the bulk permeability measured at the Wallula site, which results in a spatially random permeability distribution within the target reservoir. In order to quantify the effects of this permeability uncertainty, CO2 injections are simulated within 50 equally probable synthetic reservoir domains. Each model domain comprises three-dimensional geometry with 530,000 grid blocks, and fracture-matrix interaction is simulated as interacting continua for the two low permeability layers (flow interiors) bounding the injection zone. Results from this research illustrate that permeability uncertainty at the scale of individual basalt flows may significantly impact both injection pressure accumulation and CO2 distribution.
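
    A hedged sketch of the realization-generation step described above: draw equally probable, spatially random permeability fields from a lognormal distribution centred on a measured bulk permeability. The bulk value, log-variance, and grid size are placeholders, and real realizations would be assigned cell-by-cell over the reservoir grid.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    k_bulk = 1e-13            # bulk permeability, m^2 (placeholder)
    sigma_log10 = 0.5         # spread of log10(k) from regional data (assumption)

    n_realizations, n_cells = 50, 1000
    log10_k = np.log10(k_bulk) + rng.normal(0, sigma_log10, (n_realizations, n_cells))
    k = 10.0 ** log10_k       # spatially random within each realization

    # Each row is one equally probable synthetic reservoir; summarize the spread:
    print(f"median k: {np.median(k):.2e} m^2, "
          f"5-95% cell range: {np.percentile(k, 5):.2e} - {np.percentile(k, 95):.2e} m^2")
    ```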

  17. Categorical Biases in Spatial Memory: The Role of Certainty

    ERIC Educational Resources Information Center

    Holden, Mark P.; Newcombe, Nora S.; Shipley, Thomas F.

    2015-01-01

    Memories for spatial locations often show systematic errors toward the central value of the surrounding region. The Category Adjustment (CA) model suggests that this bias is due to a Bayesian combination of categorical and metric information, which offers an optimal solution under conditions of uncertainty (Huttenlocher, Hedges, & Duncan,…

  18. Application of the CO2-PENS risk analysis tool to the Rock Springs Uplift, Wyoming

    USGS Publications Warehouse

    Stauffer, P.H.; Pawar, R.J.; Surdam, R.C.; Jiao, Z.; Deng, H.; Lettelier, B.C.; Viswanathan, H.S.; Sanzo, D.L.; Keating, G.N.

    2011-01-01

    We describe preliminary application of the CO2-PENS performance and risk analysis tool to a planned geologic CO2 sequestration demonstration project in the Rock Springs Uplift (RSU), located in southwestern Wyoming. We use data from the RSU to populate CO2-PENS, an evolving system-level modeling tool developed at Los Alamos National Laboratory. This tool has been designed to generate performance and risk assessment calculations for the geologic sequestration of carbon dioxide. Our approach follows Systems Analysis logic and includes estimates of uncertainty in model parameters and Monte-Carlo simulations that lead to probabilistic results. Probabilistic results provide decision makers with a range in the likelihood of different outcomes. Herein we present results from a newly implemented approach in CO2-PENS that captures site-specific spatially coherent details such as topography on the reservoir/cap-rock interface, changes in saturation and pressure during injection, and dip on overlying aquifers that may be impacted by leakage upward through wellbores and faults. We present simulations of CO2 injection under different uncertainty distributions for hypothetical leaking wells and faults. Although results are preliminary and to be used only for demonstration of the approach, future results of the risk analysis will form the basis for a discussion on methods to reduce uncertainty in the risk calculations. Additionally, we present ideas on using the model to help locate monitoring equipment to detect potential leaks. By maintaining site-specific details in the CO2-PENS analysis we provide a tool that allows more logical presentations to stakeholders in the region. © 2011 Published by Elsevier Ltd.

  19. Uncertainty in predictions of forest carbon dynamics: separating driver error from model error.

    PubMed

    Spadavecchia, L; Williams, M; Law, B E

    2011-07-01

    We present an analysis of the relative magnitude and contribution of parameter and driver uncertainty to the confidence intervals on estimates of net carbon fluxes. Model parameters may be difficult or impractical to measure, while driver fields are rarely complete, with data gaps due to sensor failure and sparse observational networks. Parameters are generally derived through some optimization method, while driver fields may be interpolated from available data sources. For this study, we used data from a young ponderosa pine stand at Metolius, Central Oregon, and a simple daily model of coupled carbon and water fluxes (DALEC). An ensemble of acceptable parameterizations was generated using an ensemble Kalman filter and eddy covariance measurements of net C exchange. Geostatistical simulations generated an ensemble of meteorological driving variables for the site, consistent with the spatiotemporal autocorrelations inherent in the observational data from 13 local weather stations. Simulated meteorological data were propagated through the model to derive the uncertainty on the CO2 flux resultant from driver uncertainty typical of spatially extensive modeling studies. Furthermore, the model uncertainty was partitioned between temperature and precipitation. With at least one meteorological station within 25 km of the study site, driver uncertainty was relatively small (~10% of the total net flux), while parameterization uncertainty was larger, ~50% of the total net flux. The largest source of driver uncertainty was due to temperature (8% of the total flux). The combined effect of parameter and driver uncertainty was 57% of the total net flux. However, when the nearest meteorological station was > 100 km from the study site, uncertainty in net ecosystem exchange (NEE) predictions introduced by meteorological drivers increased by 88%. Precipitation estimates were a larger source of bias in NEE estimates than were temperature estimates, although the biases partly compensated for each other. The time scales on which precipitation errors occurred in the simulations were shorter than the temporal scales over which drought developed in the model, so drought events were reasonably simulated. The approach outlined here provides a means to assess the uncertainty and bias introduced by meteorological drivers in regional-scale ecological forecasting.
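
    A minimal sketch of separating parameter from driver uncertainty with crossed ensembles: run every parameter set against every driver realization and compare variances along the two axes. The toy flux model and random ensembles are assumptions; the study used an ensemble Kalman filter and geostatistical simulation to build its ensembles.

    ```python
    import numpy as np

    rng = np.random.default_rng(13)

    def flux_model(param, driver):
        """Toy net C flux responding to one parameter and one met driver (assumption)."""
        return param * np.tanh(driver)

    params = rng.normal(1.0, 0.2, 100)         # stand-in parameter ensemble
    drivers = rng.normal(0.8, 0.3, 100)        # stand-in met realizations

    flux = flux_model(params[:, None], drivers[None, :])   # 100 x 100 grid of runs

    var_param = flux.mean(axis=1).var()        # spread attributable to parameters
    var_driver = flux.mean(axis=0).var()       # spread attributable to drivers
    total = flux.var()
    print(f"parameter share ~{var_param/total:.0%}, driver share ~{var_driver/total:.0%}")
    ```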

  20. Aiding planning in air traffic control: an experimental investigation of the effects of perceptual information integration.

    PubMed

    Moertl, Peter M; Canning, John M; Gronlund, Scott D; Dougherty, Michael R P; Johansson, Joakim; Mills, Scott H

    2002-01-01

    Prior research examined how controllers plan in their traditional environment and identified various information uncertainties as detriments to planning. A planning aid was designed to reduce this uncertainty by perceptually representing important constraints. This included integrating spatial information on the radar screen with discrete information (planned sequences of air traffic). Previous research reported improved planning performance and decreased workload in the planning aid condition. The purpose of this paper was to determine the source of these performance improvements. Analysis of computer interactions using log-linear modeling showed that the planning interface led to less repetitive--but more integrated--information retrieval compared with the traditional planning environment. Ecological interface design principles helped explain how the integrated information retrieval gave rise to the performance improvements. Actual or potential applications of this research include the design and evaluation of interface automation that keeps users in active control by modification of perceptual task characteristics.

  1. Framework for Uncertainty Assessment - Hanford Site-Wide Groundwater Flow and Transport Modeling

    NASA Astrophysics Data System (ADS)

    Bergeron, M. P.; Cole, C. R.; Murray, C. J.; Thorne, P. D.; Wurstner, S. K.

    2002-05-01

    Pacific Northwest National Laboratory is in the process of development and implementation of an uncertainty estimation methodology for use in future site assessments that addresses parameter uncertainty as well as uncertainties related to the groundwater conceptual model. The long-term goals of the effort are development and implementation of an uncertainty estimation methodology for use in future assessments and analyses being made with the Hanford site-wide groundwater model. The basic approach in the framework developed for uncertainty assessment consists of: 1) Alternate conceptual model (ACM) identification to identify and document the major features and assumptions of each conceptual model. The process must also include a periodic review of the existing and proposed new conceptual models as data or understanding become available. 2) ACM development of each identified conceptual model through inverse modeling with historical site data. 3) ACM evaluation to identify which of the conceptual models are plausible and should be included in any subsequent uncertainty assessments. 4) ACM uncertainty assessments, which will only be carried out for those ACMs determined to be plausible through comparison with historical observations and model structure identification measures. The parameter uncertainty assessment process generally involves: a) Model Complexity Optimization - to identify the important or relevant parameters for the uncertainty analysis; b) Characterization of Parameter Uncertainty - to develop the pdfs for the important uncertain parameters, including identification of any correlations among parameters; c) Propagation of Uncertainty - to propagate parameter uncertainties (e.g., by first order second moment methods if applicable or by a Monte Carlo approach) through the model to determine the uncertainty in the model predictions of interest. 5) Estimation of combined ACM and scenario uncertainty by a double sum, with each component of the inner sum (an individual CCDF) representing parameter uncertainty associated with a particular scenario and ACM, and the outer sum enumerating the various plausible ACM and scenario combinations in order to represent the combined estimate of uncertainty (a family of CCDFs). A final important part of the framework includes identification, enumeration, and documentation of all the assumptions, which include those made during conceptual model development, required by the mathematical model, required by the numerical model, made during the spatial and temporal discretization process, needed to assign the statistical model and associated parameters that describe the uncertainty in the relevant input parameters, and finally those assumptions required by the propagation method. Pacific Northwest National Laboratory is operated for the U.S. Department of Energy under Contract DE-AC06-76RL01830.
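
    A compact sketch of the double-sum structure in step 5: an inner Monte Carlo loop over parameter uncertainty yields one CCDF per plausible ACM-scenario pair, and enumerating the pairs yields the family of CCDFs expressing combined uncertainty. The stand-in model, ACM biases, scenario loadings, and parameter pdf are all assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(17)

    def predict(acm_bias, scenario_load, params):
        """Stand-in model prediction, e.g. a peak concentration (assumption)."""
        return scenario_load * np.exp(acm_bias + params)

    acms = {"ACM-1": 0.0, "ACM-2": 0.3}        # plausible conceptual models
    scenarios = {"low": 1.0, "high": 2.5}      # scenario loadings

    threshold = 10.0                           # prediction level of interest (placeholder)
    for acm_name, bias in acms.items():        # outer sum: ACM x scenario combinations
        for scen_name, load in scenarios.items():
            params = rng.normal(0.5, 0.4, 5000)          # inner sum: parameter pdf
            pred = predict(bias, load, params)
            # One point of the CCDF for this ACM-scenario combination:
            print(f"{acm_name}/{scen_name}: P(prediction > {threshold}) = "
                  f"{(pred > threshold).mean():.3f}")
    ```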

  2. Optimal Interpolation scheme to generate reference crop evapotranspiration

    NASA Astrophysics Data System (ADS)

    Tomas-Burguera, Miquel; Beguería, Santiago; Vicente-Serrano, Sergio; Maneta, Marco

    2018-05-01

    We used an Optimal Interpolation (OI) scheme to generate a reference crop evapotranspiration (ETo) grid, grids of the forcing meteorological variables, and their respective error variances in the Iberian Peninsula for the period 1989-2011. To perform the OI we used observational data from the Spanish Meteorological Agency (AEMET) and outputs from a physically-based climate model. To compute ETo we used five OI schemes to generate grids for the five observed climate variables necessary to compute ETo using the FAO-recommended form of the Penman-Monteith equation (FAO-PM). The granularity of the resulting grids is less sensitive to variations in the density and distribution of the observational network than that of grids generated by other interpolation methods. This is because our implementation of the OI method uses a physically-based climate model as prior background information about the spatial distribution of the climatic variables, which is critical for under-observed regions. This provides temporal consistency in the spatial variability of the climatic fields. We also show that increases in the density and improvements in the distribution of the observational network substantially reduce the uncertainty of the climatic and ETo estimates. Finally, a sensitivity analysis of observational uncertainties and network densification suggests the existence of a trade-off between quantity and quality of observations.
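
    A one-variable Optimal Interpolation sketch: the analysis is the model background plus a gain times the observation-minus-background innovations, with the gain built from assumed background (B) and observation (R) error covariances. The 1-D grid, covariance shapes, length scale, and station values below are illustrative, not the paper's configuration.

    ```python
    import numpy as np

    rng = np.random.default_rng(21)
    grid_x = np.linspace(0, 100, 51)           # 1-D analysis grid (km)
    obs_x = np.array([12.0, 40.0, 85.0])       # station locations
    obs = np.array([14.2, 16.8, 13.5])         # observed temperature (degC)

    background = 15.0 + 0.02 * grid_x          # prior field from the climate model

    def cov(xa, xb, sigma2=1.5, L=25.0):
        """Assumed Gaussian-shaped error covariance with length scale L (km)."""
        return sigma2 * np.exp(-0.5 * ((xa[:, None] - xb[None, :]) / L) ** 2)

    B_go = cov(grid_x, obs_x)                  # background covariance, grid vs obs
    B_oo = cov(obs_x, obs_x)
    R = 0.3 * np.eye(len(obs_x))               # observation error covariance

    H_background = np.interp(obs_x, grid_x, background)
    K = B_go @ np.linalg.inv(B_oo + R)         # OI gain
    analysis = background + K @ (obs - H_background)

    # Analysis error variance at each grid point: diag(B) - diag(K B_og).
    err_var = cov(grid_x, grid_x).diagonal() - np.einsum("ij,ji->i", K, cov(obs_x, grid_x))
    print(np.round(analysis[:5], 2), np.round(err_var[:5], 3))
    ```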

  3. Spatial extreme value analysis to project extremes of large-scale indicators for severe weather

    PubMed Central

    Gilleland, Eric; Brown, Barbara G; Ammann, Caspar M

    2013-01-01

    Concurrently high values of the maximum potential wind speed of updrafts (Wmax) and 0-6 km wind shear (Shear) have been found to represent conducive environments for severe weather, which subsequently provides a way to study severe weather in future climates. Here, we employ a model for the product of these variables (WmSh) from the National Center for Atmospheric Research/United States National Centers for Environmental Prediction reanalysis over North America conditioned on their having extreme energy in the spatial field in order to project the predominant spatial patterns of WmSh. The approach is based on the Heffernan and Tawn conditional extreme value model. Results suggest that this technique estimates the spatial behavior of WmSh well, which allows for exploring possible changes in the patterns over time. While the model enables a method for inferring the uncertainty in the patterns, such analysis is difficult with the currently available inference approach. A variation of the method is also explored to investigate how this type of model might be used to qualitatively understand how the spatial patterns of WmSh correspond to extreme river flow events. A case study for river flows from three rivers in northwestern Tennessee is studied, and it is found that advection of WmSh from the Gulf of Mexico prevails while elsewhere, WmSh is generally very low during such extreme events. © 2013 The Authors. Environmetrics published by John Wiley & Sons, Ltd. PMID:24223482

  4. Understanding the Perception of Global Climate Change: Research into the Role of Media

    NASA Astrophysics Data System (ADS)

    Kundargi, R.; Gopal, S.; Tsay-Vogel, M.

    2016-12-01

    Here we present preliminary results for a novel study investigating the perception of climate change media in relation to two pre-selected dimensions. We administered a questionnaire varying in two dimensions (spatial proximity and scientific literacy) to 155 participants, mostly students, in order to evaluate their emotional and cognitive reactions towards a series of video clips depicting the impacts of global climate change (GCC) events or the science behind global climate change. 19 videos were selected and vetted by experts for content and relevance to the subject matter. Our preliminary analysis indicates that the further away an event is perceived to be (spatial proximity), the lower the uncertainty about the risks of GCC, the lower the self-efficacy to affect GCC, and the lower the personal responsibility to influence GCC. Furthermore, our results show that videos with a higher perceived background scientific knowledge requirement (scientific literacy) result in greater viewer engagement with the video. A full analysis and results of this study will be presented within the poster presentation.

  5. Towards ground-truthing of spaceborne estimates of above-ground biomass and leaf area index in tropical rain forests

    NASA Astrophysics Data System (ADS)

    Köhler, P.; Huth, A.

    2010-05-01

    The canopy height of forests is a key variable which can be obtained using air- or spaceborne remote sensing techniques such as radar interferometry or lidar. If new allometric relationships between canopy height and the biomass stored in the vegetation can be established, this would offer the possibility of a global monitoring of the above-ground carbon content on land. In the absence of adequate field data we use simulation results of a tropical rain forest growth model to propose what degree of information might be generated from canopy height and thus to enable ground-truthing of potential future satellite observations. We here analyse the correlation between canopy height in a tropical rain forest and other structural characteristics, such as above-ground biomass (AGB) (and thus carbon content of vegetation) and leaf area index (LAI). The process-based forest growth model FORMIND2.0 was applied to simulate (a) undisturbed forest growth and (b) a wide range of possible disturbance regimes typical of local tree-logging conditions for a tropical rain forest site on Borneo (Sabah, Malaysia) in South-East Asia. It is found that for undisturbed forest and a variety of disturbed forest situations AGB can be expressed as a power-law function of canopy height h (AGB=a·h^b) with an r2~60% for a spatial resolution of 20 m×20 m (0.04 ha, also called plot size). The regression becomes significantly better for the hectare-wide analysis of the disturbed forest sites (r2=91%). There appears to be no functional dependency between LAI and canopy height, but there is a linear correlation (r2~60%) between AGB and the area fraction in which the canopy is highly disturbed. A reasonable agreement of our results with observations is obtained from a comparison of the simulations with permanent sampling plot data from the same region and with the large-scale forest inventory in Lambir. We conclude that spaceborne remote sensing techniques have the potential to quantify the carbon contained in the vegetation, although, due to the heterogeneity of the forest landscape, this calculation contains structural uncertainties which restrict future applications to spatial averages of about one hectare in size. The uncertainties in AGB for a given canopy height are here 20-40% (95% confidence level), corresponding to a standard deviation of less than ±10%. This uncertainty on the 1 ha-scale is much smaller than in the analysis of 0.04 ha-scale data. At this small scale (0.04 ha) AGB can only be calculated from canopy height with an uncertainty which is at least of the magnitude of the signal itself, due to the natural spatial heterogeneity of these forests.

  6. Uncertainty in structural interpretation: Lessons to be learnt

    NASA Astrophysics Data System (ADS)

    Bond, Clare E.

    2015-05-01

    Uncertainty in the interpretation of geological data is an inherent element of geology. Datasets from different sources: remotely sensed seismic imagery, field data and borehole data, are often combined and interpreted to create a geological model of the sub-surface. The data have limited resolution and spatial distribution that results in uncertainty in the interpretation of the data and in the subsequent geological model(s) created. Methods to determine the extent of interpretational uncertainty of a dataset, how to capture and express that uncertainty, and consideration of uncertainties in terms of risk have been investigated. Here I review the work that has taken place and discuss best practice in accounting for uncertainties in structural interpretation workflows. Barriers to best practice are reflected on, including the use of software packages for interpretation. Experimental evidence suggests that minimising interpretation error through the use of geological reasoning and rules can help decrease interpretation uncertainty; through identification of inadmissible interpretations and in highlighting areas of uncertainty. Understanding expert thought processes and reasoning, including the use of visuospatial skills, during interpretation may aid in the identification of uncertainties, and in the education of new geoscientists.

  7. Using demography and movement behavior to predict range expansion of the southern sea otter.

    USGS Publications Warehouse

    Tinker, M.T.; Doak, D.F.; Estes, J.A.

    2008-01-01

    In addition to forecasting population growth, basic demographic data combined with movement data provide a means for predicting rates of range expansion. Quantitative models of range expansion have rarely been applied to large vertebrates, although such tools could be useful for restoration and management of many threatened but recovering populations. Using the southern sea otter (Enhydra lutris nereis) as a case study, we utilized integro-difference equations in combination with a stage-structured projection matrix that incorporated spatial variation in dispersal and demography to make forecasts of population recovery and range recolonization. In addition to these basic predictions, we emphasize how to make these modeling predictions useful in a management context through the inclusion of parameter uncertainty and sensitivity analysis. Our models resulted in hind-cast (1989–2003) predictions of net population growth and range expansion that closely matched observed patterns. We next made projections of future range expansion and population growth, incorporating uncertainty in all model parameters, and explored the sensitivity of model predictions to variation in spatially explicit survival and dispersal rates. The predicted rate of southward range expansion (median = 5.2 km/yr) was sensitive to both dispersal and survival rates; elasticity analysis indicated that changes in adult survival would have the greatest potential effect on the rate of range expansion, while perturbation analysis showed that variation in subadult dispersal contributed most to variance in model predictions. Variation in survival and dispersal of females at the south end of the range contributed most of the variance in predicted southward range expansion. Our approach provides guidance for the acquisition of further data and a means of forecasting the consequence of specific management actions. Similar methods could aid in the management of other recovering populations.
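
    A one-dimensional integro-difference sketch of the range-expansion machinery described above: local demographic growth followed by redistribution through a dispersal kernel. Beverton-Holt growth stands in for the study's stage-structured matrix, and the growth rate, kernel width, and carrying capacity are placeholders.

    ```python
    import numpy as np

    x = np.linspace(-200, 200, 801)             # coastline coordinate (km)
    dx = x[1] - x[0]
    n = np.where(np.abs(x) < 10, 50.0, 0.0)     # initial occupied patch

    def kernel(d, sigma=8.0):
        """Gaussian dispersal kernel (assumption); integrates to ~1."""
        return np.exp(-0.5 * (d / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    K = kernel(x[:, None] - x[None, :])         # redistribution matrix

    for year in range(10):
        grown = 1.5 * n / (1.0 + n / 100.0)     # Beverton-Holt growth (placeholder)
        n = (K * grown[None, :]).sum(axis=1) * dx   # disperse the new cohort

    front = x[n > 1.0]                          # crude occupancy threshold
    print(f"occupied range after 10 years: {front.min():.0f} to {front.max():.0f} km")
    ```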

  8. Modelling the growth of Populus species using Ecosystem Demography (ED) model

    NASA Astrophysics Data System (ADS)

    Wang, D.; Lebauer, D. S.; Feng, X.; Dietze, M. C.

    2010-12-01

    Hybrid poplar plantations are an important candidate source being evaluated for biomass production. Effective management of such plantations requires adequate growth and yield models. The Ecosystem Demography model (ED) predicts above- and belowground ecosystem structure and the fluxes of carbon and water at the large scales of interest from a description of fine-scale physiological processes. In this study, we used a workflow management tool, the Predictive Ecophysiological Carbon flux Analyzer (PECAn), to integrate literature data, field measurements and the ED model to provide predictions of ecosystem functioning. Parameters for the ED ensemble runs were sampled from the posterior distribution of ecophysiological traits of Populus species, compiled from the literature using a Bayesian meta-analysis approach. Sensitivity analysis was performed to identify the parameters that contribute the most to the uncertainty of the ED model output. Model emulation techniques were used to update parameter posterior distributions using field-observed data from hybrid poplar plantations in northern Wisconsin. Model results were evaluated against 5 years of field observations from a hybrid poplar plantation at New Franklin, MO. ED was then used to predict the spatial variability of poplar yield in the coterminous United States (the United States minus Alaska and Hawaii). Sensitivity analysis showed that root respiration, dark respiration, growth respiration, stomatal slope and specific leaf area contribute the most to the uncertainty, which suggests that our field measurements and data collection should focus on these parameters. The ED model successfully captured the inter-annual and spatial variability of poplar yield. Analyses in progress with the ED model focus on evaluating the ecosystem services of short-rotation woody plantations, such as impacts on soil carbon storage, water use, and nutrient retention.
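
    A sketch of the general ensemble-and-sensitivity workflow described above, assuming nothing about ED or PEcAn internals: trait parameters are drawn from (hypothetical) posterior distributions, a toy function stands in for a model run, and a cheap binned estimate of the variance of the conditional mean approximates first-order sensitivity indices.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical posteriors for three traits (names and values illustrative).
    posteriors = {
        "SLA":        lambda n: rng.lognormal(2.8, 0.2, n),
        "dark_resp":  lambda n: rng.normal(1.5, 0.3, n),
        "stom_slope": lambda n: rng.normal(9.0, 1.0, n),
    }

    def toy_model(p):
        """Stand-in for an ED run: yield as a simple function of traits."""
        return 0.05 * p["SLA"] - 0.8 * p["dark_resp"] + 0.3 * p["stom_slope"]

    n = 5000
    samples = {k: f(n) for k, f in posteriors.items()}
    yield_ens = toy_model(samples)

    # First-order sensitivity ~ Var(E[Y | X_i]) / Var(Y), estimated by binning.
    total_var = yield_ens.var()
    for name, x in samples.items():
        edges = np.quantile(x, np.linspace(0, 1, 21))
        idx = np.clip(np.digitize(x, edges) - 1, 0, 19)
        means = np.array([yield_ens[idx == b].mean() for b in range(20)])
        counts = np.array([(idx == b).sum() for b in range(20)])
        s1 = np.average((means - yield_ens.mean()) ** 2, weights=counts) / total_var
        print(f"{name:10s} first-order index ~ {s1:.2f}")
    ```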

  9. Characterize Aerosols from MODIS/MISR/OMI/MERRA-2: Dynamic Image Browse Perspective

    NASA Astrophysics Data System (ADS)

    Wei, J. C.; Yang, W.; Shen, S.; Zhao, P.; Albayrak, A.; Johnson, J. E.; Kempler, S. J.; Pham, L.

    2016-12-01

    Among the known atmospheric constituents, aerosols still represent the greatest uncertainty in climate research. Understanding this uncertainty requires bringing together observational (in-situ and remote sensing) and modeling datasets and inter-comparing them synergistically, for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if these earth science data (satellite and modeling) are well utilized and interpreted. Unfortunately, this is not always the case, despite the abundance and relative maturity of the numerous satellite-borne sensors that routinely measure aerosols. There is often disagreement between similar aerosol parameters retrieved from different sensors, leaving users confused as to which sensors to trust for answering important science questions about the distribution, properties, and impacts of aerosols. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed multiple MAPSS (Multi-sensor Aerosol Products Sampling System) applications as part of the Giovanni (Geospatial Interactive Online Visualization and Analysis Interface) data visualization and analysis tool since 2007. The MAPSS database provides spatio-temporal statistics for multiple spaceborne Level 2 aerosol products (MODIS Terra, MODIS Aqua, MISR, POLDER, OMI, CALIOP, SeaWiFS Deep Blue, and VIIRS) sampled over AERONET ground stations. In this presentation, I will demonstrate a new visualization service (NASA Level 2 Data Quality Visualization, DQViz) supporting various visualization and data access capabilities for satellite Level 2 products (MODIS/MISR/OMI) and long-term assimilated aerosols from the NASA Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2), displayed at their native retrieved spatial resolution. Functionality will include selecting data sources (e.g., multiple parameters under the same measurement), defining area-of-interest and temporal extents, zooming, panning, overlaying, sliding, and data subsetting and reformatting.

  10. Estimating regional evapotranspiration from remotely sensed data by surface energy balance models

    NASA Technical Reports Server (NTRS)

    Asrar, Ghassem; Kanemasu, Edward; Myneni, R. B.; Lapitan, R. L.; Harris, T. R.; Killeen, J. M.; Cooper, D. I.; Hwang, C.

    1987-01-01

    Spatial and temporal variations of surface radiative temperatures of the burned and unburned areas of the Konza tallgrass prairie were studied. The roles of management practices and topographic conditions, and the uncertainties associated with in situ or airborne surface temperature measurements, were assessed. Diurnal and seasonal spectral characteristics of the burned and unburned areas of the prairie were also evaluated. This was accomplished through analysis of the measured spectral reflectance of the grass canopies under field conditions, and by modelling their spectral behavior using a one-dimensional radiative transfer model.

  11. On the action of Heisenberg's uncertainty principle in discrete linear methods for calculating the components of the deflection of the vertical

    NASA Astrophysics Data System (ADS)

    Mazurova, Elena; Lapshin, Aleksey

    2013-04-01

    The method of discrete linear transformations that can be implemented through the algorithms of the Standard Fourier Transform (SFT), Short-Time Fourier Transform (STFT) or Wavelet Transform (WT) is effective for calculating the components of the deflection of the vertical from discrete values of gravity anomaly. The SFT, due to the action of Heisenberg's uncertainty principle, exhibits weak spatial localization, which manifests as follows: firstly, it is necessary to know the initial digital signal on the complete number line (in the case of a one-dimensional transform) or in the whole two-dimensional space (if a two-dimensional transform is performed) in order to find the SFT. Secondly, the localization and values of the "peaks" of the initial function cannot be derived from its Fourier transform, as the coefficients of the Fourier transform are formed by taking into account all the values of the initial function. Thus, the SFT gives global information on all frequencies available in the digital signal throughout the whole time period. To overcome this limitation it is necessary to localize the signal in time and apply the Fourier transform only to a small portion of the signal; the STFT, which differs from the SFT only by the presence of an additional factor (a window), is used for this purpose. A narrow window is chosen to localize the signal in time and, according to Heisenberg's uncertainty principle, this results in significant uncertainty in frequency. If one chooses a wide window then, by the same principle, time uncertainty increases. Thus, if the signal is narrowly localized in time, its spectrum, on the contrary, is spread over the complete axis of frequencies, and vice versa. The STFT makes it possible to improve spatial localization, that is, it allows one to detect the presence of any frequency in the signal and the interval of its presence. However, owing to Heisenberg's uncertainty principle, it is impossible to tell precisely which frequency is present in the signal at a given moment of time: one can speak only of a range of frequencies. Likewise, it is impossible to specify precisely the moment at which a given frequency is present: one can speak only of a time frame. It is this feature that imposes major constraints on the applicability of the STFT. Although the problems of resolution in time and frequency result from a physical phenomenon (Heisenberg's uncertainty principle) and exist independently of the transform applied, it is possible to analyze any signal using an alternative approach: multiresolution analysis (MRA). The wavelet transform is one of the methods for performing an MRA-type analysis; with it, low frequencies can be resolved in more detail with respect to frequency, and high frequencies in more detail with respect to time. The paper presents the results of calculating the components of the deflection of the vertical by the SFT, STFT and WT. The results are presented in the form of 3D models that visually show the action of Heisenberg's uncertainty principle in the specified algorithms. The research conducted allows us to recommend the wavelet transform for calculating the components of the deflection of the vertical in the near-field zone. Keywords: Standard Fourier Transform, Short-Time Fourier Transform, Wavelet Transform, Heisenberg's uncertainty principle.
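
    The window trade-off described above is easy to demonstrate numerically. A minimal sketch (a synthetic test signal, not the paper's geodetic data) using the standard `scipy.signal.stft`:

    ```python
    import numpy as np
    from scipy.signal import stft

    fs = 1000.0
    t = np.arange(0, 2.0, 1 / fs)
    # A 50 Hz tone that switches to 120 Hz halfway through the record.
    sig = np.where(t < 1.0, np.sin(2 * np.pi * 50 * t),
                            np.sin(2 * np.pi * 120 * t))

    for nperseg in (64, 1024):      # narrow vs wide window
        f, tt, Z = stft(sig, fs=fs, nperseg=nperseg)
        print(f"window={nperseg:5d} samples -> "
              f"freq resolution {f[1] - f[0]:6.2f} Hz, "
              f"time step {tt[1] - tt[0]:.3f} s")
    # The narrow window localizes the switch in time but smears frequency;
    # the wide window sharpens frequency but blurs when the switch occurred.
    ```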

  12. Multiscale Structure of UXO Site Characterization: Spatial Estimation and Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ostrouchov, George; Doll, William E.; Beard, Les P.

    2009-01-01

    Unexploded ordnance (UXO) site characterization must consider both how the contamination is generated and how we observe that contamination. Within the generation and observation processes, dependence structures can be exploited at multiple scales. We describe a conceptual site characterization process and the dependence structures available at several scales, and consider their statistical estimation aspects. It is evident that most of the statistical methods needed to address the estimation problems are known, but their application-specific implementation may not be available. We demonstrate estimation at one scale and propose a representation for site contamination intensity that takes full account of uncertainty, is flexible enough to answer regulatory requirements, and is a practical tool for managing detailed spatial site characterization and remediation. The representation is based on point process spatial estimation methods that require modern computational resources for practical application. These methods have provisions for including prior and covariate information.

  13. What if the Hubbard Brook weirs had been built somewhere else? Spatial uncertainty in the application of catchment budgets

    NASA Astrophysics Data System (ADS)

    Bailey, S. W.

    2016-12-01

    Nine catchments are gaged at Hubbard Brook Experimental Forest, Woodstock, NH, USA, with weirs installed on adjacent first-order streams. These catchments have been used as unit ecosystems for analysis of chemical budgets, including evaluation of long-term trends and response to disturbance. This study examines uncertainty in the representativeness of these budgets to other nearby catchments, or as representatives of the broader northern hardwood ecosystem, depending on the choice of location of the stream gaging station. Within forested northern hardwood catchments across the Hubbard Brook region, there is relatively little spatial variation in the amount or chemistry of precipitation inputs or in the amount of streamwater outputs. For example, runoff per unit catchment area varies by less than 10% at gaging stations on first- to sixth-order streams. In contrast, concentrations of major solutes vary by an order of magnitude or more across stream sampling sites, with a similar range of concentrations within individual first-order catchments as across the third-order Hubbard Brook valley or across the White Mountain region. These spatial variations in stream chemistry are temporally persistent across a range of flow conditions. Thus first-order catchment budgets vary greatly with very local variations in stream chemistry, driven by the choice of site at which to develop a stream gage. For example, carbon output in dissolved organic matter varies by a factor of five depending on where the catchment output is defined at Watershed 3. I hypothesize that catchment outputs from first-order streams are driven by the spatially variable chemistry of shallow groundwater, reflecting local variations in the distribution of soils and vegetation. In contrast, spatial variability in stream chemistry decreases with stream order, hypothesized to reflect deeper groundwater inputs on larger streams, which are more regionally uniform. Thus, the choice of a gaging site, and the definition of an ecosystem as a unit of analysis at a larger scale such as the Hubbard Brook valley, would have less impact on calculated budgets than at the headwater scale. Monitoring of a larger catchment is more likely to be representative of other similarly sized catchments. However, particular research questions may be better studied at the smaller headwater scale.

  14. Emulation and Sobol' sensitivity analysis of an atmospheric dispersion model applied to the Fukushima nuclear accident

    NASA Astrophysics Data System (ADS)

    Girard, Sylvain; Mallet, Vivien; Korsakissok, Irène; Mathieu, Anne

    2016-04-01

    Simulations of the atmospheric dispersion of radionuclides involve large uncertainties originating from the limited knowledge of meteorological input data, composition, amount and timing of emissions, and some model parameters. The estimation of these uncertainties is an essential complement to modeling for decision making in case of an accidental release. We have studied the relative influence of a set of uncertain inputs on several outputs from the Eulerian model Polyphemus/Polair3D on the Fukushima case. We chose to use the variance-based sensitivity analysis method of Sobol'. This method requires a large number of model evaluations which was not achievable directly due to the high computational cost of Polyphemus/Polair3D. To circumvent this issue, we built a mathematical approximation of the model using Gaussian process emulation. We observed that aggregated outputs are mainly driven by the amount of emitted radionuclides, while local outputs are mostly sensitive to wind perturbations. The release height is notably influential, but only in the vicinity of the source. Finally, averaging either spatially or temporally tends to cancel out interactions between uncertain inputs.
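
    A minimal sketch of variance-based (Sobol') sensitivity analysis in the spirit of the study, using the SALib package; the cheap analytic function below merely stands in for the Gaussian-process emulator of Polyphemus/Polair3D, and the input names and ranges are invented for illustration.

    ```python
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    problem = {
        "num_vars": 3,
        "names": ["emission_rate", "wind_perturb", "release_height"],
        "bounds": [[0.5, 2.0], [-1.0, 1.0], [20.0, 160.0]],
    }

    X = saltelli.sample(problem, 1024)       # N*(2D+2) input samples

    def emulator(x):
        """Toy surrogate: aggregated output dominated by emissions."""
        return x[:, 0] ** 2 + 0.3 * x[:, 1] * x[:, 0] + 0.001 * x[:, 2]

    Y = emulator(X)
    Si = sobol.analyze(problem, Y)           # first-order and total indices
    for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
        print(f"{name:15s} S1={s1:5.2f}  ST={st:5.2f}")
    ```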

  15. Temporal and Spatial Analysis of Monogenetic Volcanic Fields

    NASA Astrophysics Data System (ADS)

    Kiyosugi, Koji

    Achieving an understanding of the nature of monogenetic volcanic fields depends on identification of the spatial and temporal patterns of volcanism in these fields, and on their relationships to structures mapped in the shallow crust and inferred in the deep crust and mantle through interpretation of geochemical, radiometric and geophysical data. We investigate the spatial and temporal distributions of volcanism in the Abu Monogenetic Volcano Group, Southwest Japan. The E-W elongated volcano distribution, identified by a nonparametric kernel method, is found to be consistent with the spatial extent of P-wave velocity anomalies in the lower crust and upper mantle, supporting the idea that the spatial density map of volcanic vents reflects the geometry of a mantle diapir. Estimated basalt supply to the lower crust is constant. This observation and the spatial distribution of volcanic vents suggest stability of magma productivity and an essentially constant two-dimensional size of the source mantle diapir. We mapped conduits, dike segments, and sills in the San Rafael sub-volcanic field, Utah, where the shallowest part of a Pliocene magmatic system is exceptionally well exposed. The distribution of conduits matches the major features of the dike distribution, including the development of clusters and the distribution of outliers. The comparison of the San Rafael conduit distribution with the distributions of volcanoes in several recently active volcanic fields supports the use of statistical models, such as nonparametric kernel methods, in probabilistic hazard assessment for distributed volcanism. We developed a new recurrence rate calculation method that uses a Monte Carlo procedure to better reflect and understand the impact of uncertainties in radiometric age determinations on the uncertainty of recurrence rate estimates for volcanic activity in the Abu, Yucca Mountain Region, and Izu-Tobu volcanic fields. Results suggest that the recurrence rates of volcanic fields can change by more than one order of magnitude on time scales of several hundred thousand to several million years. This suggests that the magma generation rate beneath volcanic fields may change over these time scales. Also, recurrence rate varies by more than one order of magnitude between these volcanic fields, consistent with the idea that distributed volcanism may be influenced both by the rate of magma generation and by the potential for dike interaction during ascent.
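
    A minimal sketch of the Monte Carlo idea described above: propagate radiometric age uncertainty into uncertainty in the recurrence rate by repeatedly resampling event ages from their error distributions. The ages and 1-sigma errors below are invented for illustration, not values from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    ages = np.array([12., 45., 130., 210., 380., 460.])   # event ages (ka)
    sigma = np.array([3., 5., 15., 20., 30., 40.])        # 1-sigma errors (ka)

    n_sims = 10000
    rates = np.empty(n_sims)
    for i in range(n_sims):
        sampled = np.sort(rng.normal(ages, sigma))        # one age realization
        span = sampled[-1] - sampled[0]
        rates[i] = (len(sampled) - 1) / span              # events per ka

    lo, med, hi = np.percentile(rates, [2.5, 50, 97.5])
    print(f"recurrence rate: {med:.4f} /ka (95% CI {lo:.4f}-{hi:.4f})")
    ```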

  16. How to find what you don't know: Visualising variability in 3D geological models

    NASA Astrophysics Data System (ADS)

    Lindsay, Mark; Wellmann, Florian; Jessell, Mark; Ailleres, Laurent

    2014-05-01

    Uncertainties in input data can have compounding effects on the predictive reliability of three-dimensional (3D) geological models. Resource exploration, tectonic studies and environmental modelling can be compromised by using 3D models that misrepresent the target geology, and drilling campaigns that attempt to intersect particular geological units guided by 3D models are at risk of failure if the exploration geologist is unaware of inherent uncertainties. In addition, the visual inspection of 3D models is often the first contact decision makers have with the geology; thus visually communicating the presence and magnitude of uncertainties contained within geological 3D models is critical. Unless uncertainties are presented early in the relationship between decision maker and model, the model will be considered more truthful than the uncertainties allow with each subsequent viewing. We present a selection of visualisation techniques that provide the viewer with insight into the location and amount of uncertainty contained within a model, and the geological characteristics that are most affected. A model of the Gippsland Basin, southeastern Australia, is used as a case study to demonstrate the concepts of information entropy, stratigraphic variability and geodiversity. Central to the techniques shown here is the creation of a model suite, performed by creating similar (but not identical) versions of the original model through perturbation of the input data. Specifically, structural data in the form of strike and dip measurements are perturbed in the creation of the model suite. The visualisation techniques presented are: (i) information entropy; (ii) stratigraphic variability and (iii) geodiversity. Information entropy is used to analyse uncertainty in a spatial context, combining the empirical probability distributions of multiple outcomes into a single quantitative measure. Stratigraphic variability displays the number of possible lithologies that may exist at a given point within the model volume. Geodiversity analyses various model characteristics (or 'geodiversity metrics'), including the depth and volume of a unit, the curvature of an interface, the geological complexity of a contact and the contact relationships units have with each other. Principal component analysis, a multivariate statistical technique, is used to simultaneously examine each of the geodiversity metrics to determine the boundaries of model space, and to identify which metrics contribute most to model uncertainty. The combination of information entropy, stratigraphic variability and geodiversity analysis provides a descriptive and thorough representation of uncertainty, with effective visualisation techniques that clearly communicate the geological uncertainty contained within the geological model.
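
    A minimal sketch of the per-voxel information entropy computation over a perturbed model suite; the random lithology IDs below simply stand in for real model realizations.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # 'suite' holds a lithology ID per voxel in each model realization.
    n_models, n_vox, n_lith = 50, 1000, 4
    suite = rng.integers(0, n_lith, size=(n_models, n_vox))

    entropy = np.zeros(n_vox)
    for lith in range(n_lith):
        p = (suite == lith).mean(axis=0)       # empirical probability per voxel
        nz = p > 0
        entropy[nz] -= p[nz] * np.log2(p[nz])  # Shannon entropy contribution

    # entropy = 0 where all models agree; log2(n_lith) where maximally uncertain
    print("mean voxel entropy (bits):", entropy.mean())
    ```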

  17. PTV margin determination in conformal SRT of intracranial lesions

    PubMed Central

    Parker, Brent C.; Shiu, Almon S.; Maor, Moshe H.; Lang, Frederick F.; Liu, H. Helen; White, R. Allen; Antolak, John A.

    2002-01-01

    The planning target volume (PTV) includes the clinical target volume (CTV) to be irradiated and a margin to account for uncertainties in the treatment process. Uncertainties in miniature multileaf collimator (mMLC) leaf positioning, CT scanner spatial localization, CT‐MRI image fusion spatial localization, and Gill‐Thomas‐Cosman (GTC) relocatable head frame repositioning were quantified for the purpose of determining a minimum PTV margin that still delivers a satisfactory CTV dose. The measured uncertainties were then incorporated into a simple Monte Carlo calculation for evaluation of various margin and fraction combinations. Satisfactory CTV dosimetric criteria were selected to be a minimum CTV dose of 95% of the PTV dose and at least 95% of the CTV receiving 100% of the PTV dose. The measured uncertainties were assumed to be Gaussian distributions. Systematic errors were added linearly and random errors were added in quadrature assuming no correlation to arrive at the total combined error. The Monte Carlo simulation written for this work examined the distribution of cumulative dose volume histograms for a large patient population using various margin and fraction combinations to determine the smallest margin required to meet the established criteria. The program examined 5 and 30 fraction treatments, since those are the only fractionation schemes currently used at our institution. The fractionation schemes were evaluated using no margin, a margin of just the systematic component of the total uncertainty, and a margin of the systematic component plus one standard deviation of the total uncertainty. It was concluded that (i) a margin of the systematic error plus one standard deviation of the total uncertainty is the smallest PTV margin necessary to achieve the established CTV dose criteria, and (ii) it is necessary to determine the uncertainties introduced by the specific equipment and procedures used at each institution since the uncertainties may vary among locations. PACS number(s): 87.53.Kn, 87.53.Ly PMID:12132939
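
    A 1-D sketch of the error-combination rule described above (systematic components added linearly, random components in quadrature), followed by a simple Monte Carlo coverage check; all numeric values are invented for illustration and this is not the authors' simulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    sys_sd  = [0.3, 0.2, 0.4]   # mm: mMLC, CT localization, image fusion
    rand_sd = [0.4, 0.3, 0.5]   # mm: per-fraction random components

    sys_total  = sum(sys_sd)                            # linear addition
    rand_total = np.sqrt(sum(s**2 for s in rand_sd))    # quadrature
    margin = sys_total + rand_total                     # systematic + 1 SD
    print(f"PTV margin = {margin:.2f} mm")

    # Monte Carlo check: fraction of simulated 5-fraction courses whose mean
    # displacement stays within the margin.
    n_pat, n_frac = 100000, 5
    sys_err  = rng.normal(0.0, sys_total, n_pat)             # fixed per patient
    rand_err = rng.normal(0.0, rand_total, (n_pat, n_frac))  # per fraction
    mean_disp = np.abs(sys_err[:, None] + rand_err).mean(axis=1)
    print(f"courses covered by margin: {(mean_disp < margin).mean():.3f}")
    ```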

  18. Identification of sensitive parameters in the modeling of SVOC reemission processes from soil to atmosphere.

    PubMed

    Loizeau, Vincent; Ciffroy, Philippe; Roustan, Yelva; Musson-Genon, Luc

    2014-09-15

    Semi-volatile organic compounds (SVOCs) are subject to long-range atmospheric transport because of successive transport-deposition-reemission processes. Several experimental datasets available in the literature suggest that soil is a non-negligible contributor of SVOCs to the atmosphere. Coupling soil and atmosphere in integrated models and simulating reemission processes can therefore be essential for estimating atmospheric concentrations of several pollutants. However, the sources of uncertainty and variability are multiple (soil properties, meteorological conditions, chemical-specific parameters) and can significantly influence the determination of reemissions. In order to identify the key parameters in reemission modeling and their effect on global modeling uncertainty, we conducted a sensitivity analysis targeted on the 'reemission' output variable. Different parameters were tested, including soil properties, partition coefficients and meteorological conditions. We performed an EFAST sensitivity analysis for four chemicals (benzo-a-pyrene, hexachlorobenzene, PCB-28 and lindane) and different spatial scenarios (regional and continental scales). Partition coefficients between air, solid and water phases are influential, depending on the precision of the data and the global behavior of the chemical. Reemissions showed lower sensitivity to soil parameters (soil organic matter and water contents at field capacity and wilting point). A mapping of these parameters at a regional scale is sufficient to correctly estimate reemissions when compared to other sources of uncertainty. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. The development of an hourly gridded rainfall product for hydrological applications in England and Wales

    NASA Astrophysics Data System (ADS)

    Liguori, Sara; O'Loughlin, Fiachra; Souvignet, Maxime; Coxon, Gemma; Freer, Jim; Woods, Ross

    2014-05-01

    This research presents a newly developed observed sub-daily gridded precipitation product for England and Wales. Importantly, our analysis specifically allows quantification of rainfall errors from the grid to the catchment scale, which is useful for hydrological model simulation and the evaluation of prediction uncertainties. Our methodology involves the disaggregation of the current one-kilometre daily gridded precipitation records available for the United Kingdom [1]. The hourly product is created using information from: 1) 2000 tipping-bucket rain gauges; and 2) the United Kingdom Met Office weather radar network. These two independent datasets provide rainfall estimates at temporal resolutions much finer than the current daily gridded rainfall product, thus allowing the disaggregation of the daily rainfall records to an hourly timestep. Our analysis is conducted for the period 2004 to 2008, limited by the current availability of the datasets. We analyse the uncertainty components affecting the accuracy of this product. Specifically, we explore how these uncertainties vary spatially, temporally and with climatic regime. Preliminary results indicate scope for improvement of hydrological model performance through the use of this new hourly gridded rainfall product. Such a product will improve our ability to diagnose and identify structural errors in hydrological modelling by including the quantification of input errors. References: [1] Keller V, Young AR, Morris D, Davies H (2006) Continuous Estimation of River Flows. Technical Report: Estimation of Precipitation Inputs. Environment Agency.

  20. A comparative experimental evaluation of uncertainty estimation methods for two-component PIV

    NASA Astrophysics Data System (ADS)

    Boomsma, Aaron; Bhattacharya, Sayantan; Troolin, Dan; Pothos, Stamatios; Vlachos, Pavlos

    2016-09-01

    Uncertainty quantification in planar particle image velocimetry (PIV) measurement is critical for proper assessment of the quality and significance of reported results. New uncertainty estimation methods have recently been introduced, generating interest about their applicability and utility. The present study compares and contrasts current methods across two separate experiments and three software packages in order to provide a diversified assessment of the methods. We evaluated the performance of four uncertainty estimation methods: primary peak ratio (PPR), mutual information (MI), image matching (IM) and correlation statistics (CS). The PPR method was implemented and tested in two processing codes, using in-house open source PIV processing software (PRANA, Purdue University) and Insight4G (TSI, Inc.). The MI method was evaluated in PRANA, as was the IM method. The CS method was evaluated using DaVis (LaVision, GmbH). Utilizing two PIV systems for high- and low-resolution measurements and a laser Doppler velocimetry (LDV) system, data were acquired for a total of three cases: a jet flow and a cylinder in cross flow at two Reynolds numbers. LDV measurements were used to establish a point validation against which the high-resolution PIV measurements were validated. Subsequently, the high-resolution PIV measurements were used as a reference against which the low-resolution PIV data were assessed for error and uncertainty. We compared error and uncertainty distributions, spatially varying RMS error and RMS uncertainty, and standard uncertainty coverages. We observed that, qualitatively, each method responded to spatially varying error (i.e. higher-error regions resulted in higher uncertainty predictions in those regions). However, the PPR and MI methods demonstrated a reduced uncertainty dynamic range response. In contrast, the IM and CS methods showed better response, but under-predicted the uncertainty ranges. The standard coverages (68% confidence interval) ranged from approximately 65%-77% for the PPR and MI methods, 40%-50% for IM, and near 50% for CS. These observations illustrate some of the strengths and weaknesses of the methods considered herein and identify future directions for development and improvement.

  1. CrossWater - Modelling micropollutant loads from different sources in the Rhine basin

    NASA Astrophysics Data System (ADS)

    Moser, Andreas; Bader, Hans-Peter; Scheidegger, Ruth; Honti, Mark; Stamm, Christian

    2017-04-01

    The pressure on rivers from micropollutants (MPs) originating from various sources is a growing environmental issue that requires political regulation. The challenges for water management are numerous, particularly for international water basins. Spatial knowledge of MP sources and of water quality is a prerequisite for an effective water quality policy. In this study within the Rhine basin, the spatial patterns of MP sources and concentrations from different use classes of chemicals are investigated with a mass flow analysis and compared to the territorial jurisdictions that shape the spatial arrangement of water management. The source area of MPs depends on the specific use of a compound. Here, we focus on i) herbicides from agricultural land use, ii) biocides from material protection on buildings and iii) human pharmaceuticals from households. The total mass of MPs available for release to the stream network is estimated from statistics of sales and consumption data. Based on GIS data of agricultural land use, vector data of buildings, and wastewater treatment plant (WWTP) locations, respectively, the available mass of MPs is spatially distributed to the subcatchments of the Rhine basin. The modelling of concentrations in the rivers consists of two principal components. The first component - the substance transfer module - simulates the actual release of MPs to the stream network. This transfer is affected by many factors, rendering spatially distributed modeling a serious challenge. Here we use a parsimonious approach that tries to represent the first-order controls of the transfer processes. We use empirical loss rates relating concentration to river discharge for agricultural herbicides, and to precipitation for biocides. For the pharmaceuticals, the release is coupled to human metabolism rates and elimination rates in WWTPs. The prediction uncertainty was quantified by an error model that takes the seasonality of the herbicide input into account. The second component - the routing module - links the contributions of the subcatchments and represents the in-stream transport and fate processes of the substances. The substance transfer module was calibrated using field studies providing simultaneous data on application amounts of substances and on losses to the rivers. However, the predictive uncertainty was often large because of mismatches in high peaks. The model was subsequently validated with independent data from several comprehensive sampling campaigns in Switzerland. Despite acceptable performance in general, some compounds were poorly simulated for some catchments. Data inspection suggests that uncertainty about timing and application amounts is a major limitation. Finally, the calibrated model is used to simulate concentration time series for the Rhine and its main tributaries. The corresponding results will be presented.

  2. Quantifying radar-rainfall uncertainties in urban drainage flow modelling

    NASA Astrophysics Data System (ADS)

    Rico-Ramirez, M. A.; Liguori, S.; Schellart, A. N. A.

    2015-09-01

    This work presents the results of the implementation of a probabilistic system to model the uncertainty associated with radar rainfall (RR) estimates and the way this uncertainty propagates through the sewer system of an urban area located in the north of England. The spatial and temporal correlations of the RR errors, as well as the error covariance matrix, were computed to build a RR error model able to generate RR ensembles that reproduce the uncertainty associated with the measured rainfall. The RR ensembles provide important information about the uncertainty in the rainfall measurement that can be propagated through the urban sewer system, and the measured flow peaks and flow volumes are often bounded within the uncertainty area produced by the RR ensembles. In 55% of the simulated events, the uncertainties in RR measurements can explain the uncertainties observed in the simulated flow volumes. However, there are also some events where the RR uncertainty cannot explain the whole uncertainty observed in the simulated flow volumes, indicating that additional sources of uncertainty must be considered, such as the uncertainty in the urban drainage model structure, the uncertainty in the calibrated parameters of the urban drainage model, and the uncertainty in the measured sewer flows.
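
    A minimal sketch of one standard way to generate such error ensembles: draw spatially correlated perturbations from a covariance model (here an exponential covariance on a 1-D transect, via a Cholesky factor) and apply them multiplicatively to the radar field. The covariance form, parameters and field are illustrative, not the study's fitted error model.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    n_pix = 100
    x = np.arange(n_pix, dtype=float)            # pixel coordinates (km)
    corr_len = 10.0                              # error correlation length (km)
    sigma = 0.3                                  # error SD in log space

    # Exponential covariance and its Cholesky factor (jitter for stability)
    C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    L = np.linalg.cholesky(C + 1e-10 * np.eye(n_pix))

    radar = rng.gamma(2.0, 2.0, n_pix)           # stand-in radar field (mm/h)
    n_members = 50
    # Multiplicative (lognormal) perturbations keep rain rates non-negative
    ensemble = radar * np.exp(L @ rng.standard_normal((n_pix, n_members))).T
    print("ensemble shape:", ensemble.shape)     # (members, pixels)
    ```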

  3. Approximating prediction uncertainty for random forest regression models

    Treesearch

    John W. Coulston; Christine E. Blinn; Valerie A. Thomas; Randolph H. Wynne

    2016-01-01

    Machine learning approaches such as random forest are increasingly used for the spatial modeling and mapping of continuous variables. Random forest is a non-parametric ensemble approach, and unlike traditional regression approaches there is no direct quantification of prediction error. Understanding prediction uncertainty is important when using model-based continuous maps as...
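
    The abstract is truncated, so the estimator below is one common approximation rather than necessarily the authors' method: use the spread of individual tree predictions in a scikit-learn random forest as a per-prediction uncertainty proxy (synthetic data for illustration).

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(4)

    X = rng.uniform(0, 1, (500, 4))
    y = 3 * X[:, 0] + np.sin(6 * X[:, 1]) + rng.normal(0, 0.2, 500)

    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

    X_new = rng.uniform(0, 1, (10, 4))
    per_tree = np.stack([t.predict(X_new) for t in rf.estimators_])
    mean, sd = per_tree.mean(axis=0), per_tree.std(axis=0)
    for m, s in zip(mean, sd):
        print(f"prediction {m:5.2f} +/- {s:4.2f}")
    ```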

  4. Anxious and egocentric: how specific emotions influence perspective taking.

    PubMed

    Todd, Andrew R; Forstmann, Matthias; Burgmer, Pascal; Brooks, Alison Wood; Galinsky, Adam D

    2015-04-01

    People frequently feel anxious. Although prior research has extensively studied how feeling anxious shapes intrapsychic aspects of cognition, much less is known about how anxiety affects interpersonal aspects of cognition. Here, we examine the influence of incidental experiences of anxiety on perceptual and conceptual forms of perspective taking. Compared with participants experiencing other negative, high-arousal emotions (i.e., anger or disgust) or neutral feelings, anxious participants displayed greater egocentrism in their mental-state reasoning: They were more likely to describe an object using their own spatial perspective, had more difficulty resisting egocentric interference when identifying an object from others' spatial perspectives, and relied more heavily on privileged knowledge when inferring others' beliefs. Using both experimental-causal-chain and measurement-of-mediation approaches, we found that these effects were explained, in part, by uncertainty appraisal tendencies. Further supporting the role of uncertainty, a positive emotion associated with uncertainty (i.e., surprise) produced increases in egocentrism similar to those of anxiety. Collectively, the results suggest that incidentally experiencing emotions associated with uncertainty increases reliance on one's own egocentric perspective when reasoning about the mental states of others. (c) 2015 APA, all rights reserved.

  5. Incorporating climate change and morphological uncertainty into coastal change hazard assessments

    USGS Publications Warehouse

    Baron, Heather M.; Ruggiero, Peter; Wood, Nathan J.; Harris, Erica L.; Allan, Jonathan; Komar, Paul D.; Corcoran, Patrick

    2015-01-01

    Documented and forecasted trends in rising sea levels and changes in storminess patterns have the potential to increase the frequency, magnitude, and spatial extent of coastal change hazards. To develop realistic adaptation strategies, coastal planners need information about coastal change hazards that recognizes the dynamic temporal and spatial scales of beach morphology, the climate controls on coastal change hazards, and the uncertainties surrounding the drivers and impacts of climate change. We present a probabilistic approach for quantifying and mapping coastal change hazards that incorporates the uncertainty associated with both climate change and morphological variability. To demonstrate the approach, coastal change hazard zones of arbitrary confidence levels are developed for the Tillamook County (State of Oregon, USA) coastline using a suite of simple models and a range of possible climate futures related to wave climate, sea-level rise projections, and the frequency of major El Niño events. Extreme total water levels are more influenced by wave height variability, whereas the magnitude of erosion is more influenced by sea-level rise scenarios. Morphological variability has a stronger influence on the width of coastal hazard zones than the uncertainty associated with the range of climate change scenarios.

  6. Assessing population exposure for landslide risk analysis using dasymetric cartography

    NASA Astrophysics Data System (ADS)

    Garcia, Ricardo A. C.; Oliveira, Sérgio C.; Zêzere, José L.

    2016-12-01

    Assessing the number and locations of exposed people is a crucial step in landslide risk management and emergency planning. The available population statistical data frequently have insufficient detail for an accurate assessment of the people potentially exposed to hazardous events, particularly events that occur at the local scale, such as landslides. The present study aims to apply dasymetric cartography to improve the spatial resolution of population data and to assess the potentially exposed population. An additional objective is to compare the results with those obtained with a more common approach that uses basic census units as spatial units; these are the most disaggregated and detailed spatial data available for regional studies in Portugal. Considering the Portuguese census data and a layer of residential building footprints, used as ancillary information, the number of exposed inhabitants differs significantly according to the approach used. When the census unit approach is used, considering the three highest landslide susceptibility classes, the number of exposed inhabitants is in general overestimated. Despite the uncertainties associated with a general cost-benefit analysis, the presented methodology seems to be a reliable approach for obtaining a first, more detailed approximation of the exposed population. The approach based on dasymetric cartography allows the spatial resolution of population data over large areas to be increased and enables the use of detailed landslide susceptibility maps, which is valuable for improving the assessment of the exposed population.
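
    A minimal sketch of the dasymetric reallocation step: each census unit's population is distributed over its grid cells in proportion to residential building footprint area (the ancillary layer). Unit names, cell counts and areas are invented for illustration.

    ```python
    import numpy as np

    census_pop = {"unit_A": 1200, "unit_B": 400}     # inhabitants per unit
    # building footprint area (m^2) per grid cell, keyed by census unit
    footprint = {
        "unit_A": np.array([0.0, 3500.0, 1500.0, 0.0]),
        "unit_B": np.array([800.0, 0.0, 200.0]),
    }

    for unit, pop in census_pop.items():
        areas = footprint[unit]
        weights = areas / areas.sum()    # zero weight where no buildings
        cell_pop = pop * weights         # population reallocated to cells
        print(unit, "->", np.round(cell_pop, 1))
    ```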

  7. Design and implementation of a risk assessment module in a spatial decision support system

    NASA Astrophysics Data System (ADS)

    Zhang, Kaixi; van Westen, Cees; Bakker, Wim

    2014-05-01

    The spatial decision support system named 'Changes SDSS' is currently under development. The goal of this system is to analyze changing hydro-meteorological hazards and the effects of risk reduction alternatives, to support decision makers in choosing the best alternatives. The risk assessment module within the system assesses the current risk, analyzes the risk after implementation of risk reduction alternatives, and analyzes the risk in future years under scenarios such as climate change, land use change and population growth. The objective of this work is to present the detailed design and implementation plan of the risk assessment module. The main challenges faced consist of how to shift the risk assessment from traditional desktop software to an open source web-based platform, the availability of input data, and the inclusion of uncertainties in the risk analysis. The risk assessment module is developed using the Ext JS library for the implementation of the user interface on the client side, Python for scripting, and PostGIS spatial functions for complex computations on the server side. Comprehensive consideration of the underlying uncertainties in input data can lead to a better quantification of risk and a more reliable Changes SDSS, since the outputs of the risk assessment module are the basis for the decision making module within the system. The implementation of this module will contribute to the development of open source web-based modules for multi-hazard risk assessment in the future. This work is part of the "CHANGES SDSS" project, funded by the European Community's 7th Framework Program.

  8. Modeling spatial-temporal dynamics of global wetlands: comprehensive evaluation of a new sub-grid TOPMODEL parameterization and uncertainties

    NASA Astrophysics Data System (ADS)

    Zhang, Z.; Zimmermann, N. E.; Poulter, B.

    2015-11-01

    Simulations of the spatial-temporal dynamics of wetlands are key to understanding the role of wetland biogeochemistry under past and future climate variability. Hydrologic inundation models, such as TOPMODEL, are based on a fundamental parameter known as the compound topographic index (CTI) and provide a computationally cost-efficient approach to simulating wetland dynamics at global scales. However, there remain large discrepancies in the implementations of TOPMODEL in land-surface models (LSMs), and thus in their performance against observations. This study describes new improvements to the TOPMODEL implementation and estimates of global wetland dynamics using the LPJ-wsl dynamic global vegetation model (DGVM), and quantifies uncertainties by comparing the effects of three digital elevation model products (HYDRO1k, GMTED, and HydroSHEDS) of different spatial resolution and accuracy on simulated inundation dynamics. In addition, we found that calibrating TOPMODEL with a benchmark wetland dataset can help to successfully delineate the seasonal and interannual variations of wetlands, as well as improve the spatial distribution of wetlands to be consistent with inventories. The HydroSHEDS DEM, using a river-basin scheme for aggregating the CTI, shows the best accuracy for capturing the spatio-temporal dynamics of wetlands among the three DEM products. The estimate of the global wetland potential/maximum is ~10.3 Mkm2 (10^6 km2), with a mean annual maximum of ~5.17 Mkm2 for 1980-2010. This study demonstrates the feasibility of capturing the spatial heterogeneity of inundation and estimating seasonal and interannual variations in wetlands by coupling a hydrological module in LSMs with appropriate benchmark datasets. It additionally highlights the importance of an adequate investigation of topographic indices for simulating global wetlands, and shows an opportunity to converge wetland estimates across LSMs by identifying the uncertainty associated with existing wetland products.
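
    A minimal sketch of the TOPMODEL saturation logic that underlies such schemes: a grid cell is treated as inundated when its local water deficit reaches zero, with the local deficit related to the grid-cell mean deficit through the CTI. The CTI distribution and parameter values below are illustrative, not LPJ-wsl values.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    cti = rng.gamma(4.0, 2.0, 100000)       # compound topographic index values
    m = 2.0                                 # transmissivity decay parameter

    def inundated_fraction(mean_deficit):
        """Local deficit S_i = S_mean + m * (mean(CTI) - CTI_i);
        a pixel is saturated (inundated) where S_i <= 0."""
        local_deficit = mean_deficit + m * (cti.mean() - cti)
        return (local_deficit <= 0).mean()

    for s_mean in (0.5, 2.0, 5.0):          # wetter -> drier grid-cell states
        print(f"S_mean={s_mean:3.1f} -> inundated fraction "
              f"{inundated_fraction(s_mean):.3f}")
    ```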

  9. The impact of forest structure and spatial scale on the relationship between ground plot above ground biomass and GEDI lidar waveforms

    NASA Astrophysics Data System (ADS)

    Armston, J.; Marselis, S.; Hancock, S.; Duncanson, L.; Tang, H.; Kellner, J. R.; Calders, K.; Disney, M.; Dubayah, R.

    2017-12-01

    The NASA Global Ecosystem Dynamics Investigation (GEDI) will place a multi-beam waveform lidar instrument on the International Space Station (ISS) to provide measurements of forest vertical structure globally. These measurements of structure will underpin empirical modelling of above-ground biomass density (AGBD) at the scale of individual GEDI lidar footprints (25 m diameter). The GEDI pre-launch calibration strategy for footprint-level models relies on linking AGBD estimates from ground plots with GEDI lidar waveforms simulated from coincident discrete-return airborne laser scanning data. Currently available ground plot data have variable and often large uncertainty at the spatial resolution of GEDI footprints due to poor colocation, allometric model error, sample size and plot edge effects. The relative importance of these sources of uncertainty partly depends on the quality of the ground measurements and the region. It is usually difficult to know the magnitude of these uncertainties a priori, so a common approach to mitigate their influence on model training is to aggregate ground plot and waveform lidar data to a coarser spatial scale (0.25-1 ha). Here we examine the impacts of these principal sources of uncertainty using a 3D simulation approach. Sets of realistic tree models generated from terrestrial laser scanning (TLS) data or parametric modelling matched to tree inventory data were assembled for four contrasting forest plots across tropical rainforest, deciduous temperate forest, and sclerophyll eucalypt woodland sites. These tree models were used to simulate geometrically explicit 3D scenes with variable tree density, size class and spatial distribution. GEDI lidar waveforms are simulated over ground plots within these scenes using Monte Carlo ray tracing, allowing the impact of varying ground plot and waveform colocation error, forest structure and edge effects on the relationship between ground plot AGBD and GEDI lidar waveforms to be directly assessed. We quantify the sensitivity of calibration equations relating GEDI lidar structure measurements and AGBD to these factors at a range of spatial scales (0.0625-1 ha) and discuss the implications for the expanding use of existing in situ ground plot data by GEDI.
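
    A minimal sketch of the basic waveform-simulation idea (not the full Monte Carlo ray tracer used in the study): bin discrete-return elevations within a footprint and convolve with a Gaussian transmit pulse. Returns and pulse width are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    z = np.concatenate([rng.normal(25.0, 3.0, 400),    # canopy returns (m)
                        rng.normal(0.0, 0.3, 200)])    # ground returns (m)

    dz = 0.15                                          # ~1 ns vertical sampling
    bins = np.arange(-5.0, 40.0, dz)
    hist, _ = np.histogram(z, bins=bins)

    sigma_bins = 0.7 / dz                              # 0.7 m pulse SD, in bins
    k = np.arange(-30, 31)
    pulse = np.exp(-0.5 * (k / sigma_bins) ** 2)
    waveform = np.convolve(hist.astype(float), pulse / pulse.sum(), mode="same")
    print("pseudo-waveform samples:", waveform.size)
    ```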

  10. Uncertainty in the spatial distribution of tropical forest biomass: a comparison of pan-tropical maps.

    PubMed

    Mitchard, Edward Ta; Saatchi, Sassan S; Baccini, Alessandro; Asner, Gregory P; Goetz, Scott J; Harris, Nancy L; Brown, Sandra

    2013-10-26

    Mapping the aboveground biomass of tropical forests is essential both for implementing conservation policy and for reducing uncertainties in the global carbon cycle. Two medium-resolution (500 m - 1000 m) pantropical maps of vegetation biomass have recently been published and have been widely used by sub-national and national-level activities in relation to Reducing Emissions from Deforestation and forest Degradation (REDD+). Both maps use similar input data layers and are driven by the same spaceborne LiDAR dataset providing systematic forest height and canopy structure estimates, but they use different ground datasets for calibration and different spatial modelling methodologies. Here, we compare these two maps to each other, to the FAO's Forest Resource Assessment (FRA) 2010 country-level data, and to a high-resolution (100 m) biomass map generated for a portion of the Colombian Amazon. We find substantial differences between the two maps, in particular in central Amazonia, the Congo basin, the south of Papua New Guinea, the Miombo woodlands of Africa, and the dry forests and savannas of South America. There is little consistency in the direction of the difference. However, when the maps are aggregated to the country or biome scale there is greater agreement, with differences cancelling out to a certain extent. When comparing country-level biomass stocks, the two maps agree with each other to a much greater extent than with the FRA 2010 estimates. In the Colombian Amazon, both pantropical maps estimate higher biomass than the independent high-resolution map, but show a similar spatial distribution of this biomass. Biomass mapping has progressed enormously over the past decade, to the stage where we can produce globally consistent maps of aboveground biomass. We show that there are still large uncertainties in these maps, in particular in areas with little field data. However, when used at a regional scale, different maps appear to converge, suggesting we can provide reasonable stock estimates when aggregated over large regions. Therefore we believe the largest uncertainties for REDD+ activities relate to the spatial distribution of biomass and to the spatial pattern of forest cover change, rather than to total globally or nationally summed carbon density.

  11. Estimation of Spatial Trends in LAI in Heterogeneous Semi-arid Ecosystems using Full Waveform Lidar

    NASA Astrophysics Data System (ADS)

    Glenn, N. F.; Ilangakoon, N.; Spaete, L.; Dashti, H.

    2017-12-01

    Leaf area index (LAI) is a key structural trait that is defined by the plant functional type (PFT) and controlled by prevailing climate- and human-driven ecosystem stresses. Estimates of LAI using remote sensing techniques are limited by the uncertainties of vegetation inter- and intra-gap fraction estimates; this is especially the case in sparse, low-stature vegetated ecosystems. Small-footprint full waveform lidar digitizes the total return energy, together with direction information, as a near-continuous waveform at high vertical resolution (1 ns). Waveform lidar thus provides additional metrics to capture vegetation gaps as well as PFTs, which can be used to constrain the uncertainties of LAI estimates. In this study, we calculated a radiometrically calibrated full waveform parameter called the backscatter cross section, along with other metrics from the waveform, to estimate vegetation gaps across plots (10 m x 10 m) in a semi-arid ecosystem in the western US. LAI was then estimated using empirical relationships with the directional gap fraction. Full-waveform-derived, gap-fraction-based LAI showed a high correlation with field-observed shrub LAI (R2 = 0.66, RMSE = 0.24) compared to discrete-return lidar-based LAI (R2 = 0.01, RMSE = 0.5). The metrics derived from full waveform lidar classified a number of deciduous and evergreen tree species, shrub species, and bare ground with an overall accuracy of 89% at 10 m. A similar analysis was performed at 1 m, with an overall accuracy of 80%. The next step is to use these relationships to map PFT LAI at 10 m spatial scale across the larger study regions. The results show the exciting potential of full waveform lidar to identify plant functional types and LAI in semi-arid ecosystems dominated by low-stature vegetation, ecosystems in which many other remote sensing techniques fail. These results can be used to assess ecosystem state and habitat suitability, as well as to constrain uncertainties in dynamic vegetation models in combination with other remote sensing techniques. Multi-spatial-resolution (1 m and 10 m) studies provide basic information on the applicability and detection thresholds of future global satellite sensors designed at coarser spatial resolutions (e.g. GEDI, ICESat-2) in semi-arid ecosystems.
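
    The study's exact empirical relationship is not given in the abstract; below is the standard Beer-Lambert gap-fraction inversion often used for such LAI estimates. The projection coefficient G and the gap fractions are illustrative values.

    ```python
    import numpy as np

    def lai_from_gap(p_gap, theta_deg=0.0, G=0.5):
        """Invert directional gap fraction to LAI (Beer-Lambert)."""
        theta = np.radians(theta_deg)
        return -np.cos(theta) * np.log(p_gap) / G

    for p_gap in (0.8, 0.5, 0.3):
        print(f"gap fraction {p_gap:.1f} -> LAI ~ {lai_from_gap(p_gap):.2f}")
    ```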

  12. Risk assessments of regional climate change over Europe: generation of probabilistic ensemble and uncertainty assessment for EURO-CORDEX

    NASA Astrophysics Data System (ADS)

    Yuan, J.; Kopp, R. E.

    2017-12-01

    Quantitative risk analysis of regional climate change is crucial for risk management and impact assessment of climate change. Two major challenges in assessing the risks of climate change are that the CMIP5 model runs, which drive the EURO-CORDEX downscaling runs, do not cover the full range of uncertainty of future projections, and that climate models may underestimate the probability of tail risks (i.e. extreme events). To overcome these difficulties, this study offers a viable avenue, in which a set of probabilistic climate ensembles is generated using the Surrogate/Model Mixed Ensemble (SMME) method. The probabilistic ensembles for temperature and precipitation are used to assess the range of uncertainty covered by five bias-corrected simulations from the high-resolution (0.11º) EURO-CORDEX database, which were selected by the PESETA (Projection of Economic impacts of climate change in Sectors of the European Union based on bottom-up Analysis) III project. Results show that the distribution of the SMME ensemble is notably wider than both the distribution of the raw ensemble of GCMs and the spread of the five EURO-CORDEX runs in RCP8.5. Tail risks are well represented by the SMME ensemble. Both the SMME ensemble and the EURO-CORDEX projections are aggregated to the administrative level and integrated into the impact functions of PESETA III to assess climate risks in Europe. To further evaluate the uncertainties introduced by the downscaling process, we compare the five runs from EURO-CORDEX with runs from the corresponding GCMs. Time series of regional means, spatial patterns, and climate indices are examined for the future climate (2080-2099) relative to the present climate (1981-2010). The downscaling processes do not appear to be trend-preserving; e.g., the increase in regional mean temperature from EURO-CORDEX is slower than that from the corresponding GCM. The spatial pattern comparison reveals that the differences between each pair of GCM and EURO-CORDEX runs are small in winter. In summer, the temperatures of EURO-CORDEX are generally lower than those of the GCMs, while the drying trends in precipitation of EURO-CORDEX are smaller than those of the GCMs. Climate indices are significantly affected by the bias-correction and downscaling process. Our study provides valuable information for selecting climate indices in different regions of Europe.

  13. Alpine Grassland Soil Organic Carbon Stock and Its Uncertainty in the Three Rivers Source Region of the Tibetan Plateau

    PubMed Central

    Chang, Xiaofeng; Wang, Shiping; Cui, Shujuan; Zhu, Xiaoxue; Luo, Caiyun; Zhang, Zhenhua; Wilkes, Andreas

    2014-01-01

    Alpine grassland of the Tibetan Plateau is an important component of global soil organic carbon (SOC) stocks, but insufficient field observations and large spatial heterogeneity lead to great uncertainty in their estimation. In the Three Rivers Source Region (TRSR), alpine grasslands account for more than 75% of the total area. However, the regional carbon (C) stock estimate and its uncertainty have seldom been tested. Here we quantified the regional SOC stock and its uncertainty using 298 soil profiles surveyed at 35 sites across the TRSR during 2006–2008. We showed that the upper soil (0–30 cm depth) in alpine grasslands of the TRSR stores 2.03 Pg C, with a 95% confidence interval ranging from 1.25 to 2.81 Pg C. Alpine meadow soils comprised 73% (i.e. 1.48 Pg C) of the regional SOC estimate, but had the greatest uncertainty, at 51%. The statistical power to detect a deviation of 10% uncertainty in the grassland C stock was less than 0.50. The sample size required to detect this deviation at a power of 90% was about 6–7 times the number of sites surveyed. Comparison of our observed SOC densities with the corresponding values from the dataset of Yang et al. indicates that the two datasets are comparable. The combined dataset did not reduce the uncertainty in the estimate of the regional grassland soil C stock. This result can mainly be explained by the underrepresentation of sampling sites in large areas with poor accessibility. Further research to improve the regional SOC stock estimate should optimize the sampling strategy by considering the number of samples and their spatial distribution. PMID:24819054
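
    A minimal sketch of the power/sample-size logic referenced above: the number of sites needed to detect a relative deviation `delta` in the mean at a given power, for a coefficient of variation `cv`. The formula is the standard normal-approximation result; the CV values are illustrative.

    ```python
    from scipy.stats import norm

    def required_n(cv, delta=0.10, alpha=0.05, power=0.90):
        """n = ((z_{1-alpha/2} + z_power) * cv / delta)^2 (two-sided test)."""
        z_a = norm.ppf(1.0 - alpha / 2.0)
        z_b = norm.ppf(power)
        return ((z_a + z_b) * cv / delta) ** 2

    for cv in (0.3, 0.5, 0.8):
        print(f"CV = {cv:.1f} -> n ~ {required_n(cv):.0f} sites")
    ```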

  14. Dynamical attribution of oceanic prediction uncertainty in the North Atlantic: application to the design of optimal monitoring systems

    NASA Astrophysics Data System (ADS)

    Sévellec, Florian; Dijkstra, Henk A.; Drijfhout, Sybren S.; Germe, Agathe

    2017-11-01

    In this study, the relation between two approaches to assessing ocean predictability on interannual to decadal time scales is investigated. The first, pragmatic approach consists of sampling the initial-condition uncertainty and assessing predictability through the divergence of this ensemble in time. The second approach is provided by a theoretical framework that determines error growth by estimating optimal linear growing modes. In this paper, it is shown that under the assumptions of linearized dynamics and normally distributed uncertainty, the exact quantitative spread of the ensemble can be determined from the theoretical framework. This spread is at least an order of magnitude less expensive to compute than the approximate solution given by the pragmatic approach. This result is applied to a state-of-the-art Ocean General Circulation Model to assess the predictability in the North Atlantic of four typical oceanic metrics: the strength of the Atlantic Meridional Overturning Circulation (AMOC), the intensity of its heat transport, the two-dimensional spatially averaged Sea Surface Temperature (SST) over the North Atlantic, and the three-dimensional spatially averaged temperature in the North Atlantic. For all tested metrics except SST, ~75% of the total uncertainty on interannual time scales can be attributed to oceanic initial-condition uncertainty rather than atmospheric stochastic forcing. The theoretical method also provides the sensitivity pattern to the initial-condition uncertainty, allowing for targeted measurements to improve the skill of the prediction. It is suggested that a relatively small fleet of several autonomous underwater vehicles could reduce the uncertainty in AMOC strength prediction by 70% for 1-5 year lead times.
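
    A toy sketch of the equivalence described above (a random linear system, not the OGCM): under linear dynamics x(t) = M x(0), an initial-condition covariance C0 maps exactly to the forecast covariance M C0 M^T, so the ensemble spread is available without integrating an ensemble, and the optimal growing mode is the leading singular vector of M.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    n = 20
    M = np.eye(n) + 0.3 * rng.standard_normal((n, n))   # toy linear propagator
    C0 = 0.1 * np.eye(n)                                # initial uncertainty

    C_fore = M @ C0 @ M.T                               # exact forecast covariance
    print("exact forecast spread (trace):", np.trace(C_fore))

    # Optimal linear growing mode: leading right singular vector of M
    U, s, Vt = np.linalg.svd(M)
    print("optimal growth factor:", s[0])

    # Pragmatic (ensemble) estimate converges to the same spread
    members = rng.multivariate_normal(np.zeros(n), C0, size=20000)
    forecast = members @ M.T                            # propagate each member
    print("ensemble estimate (trace):", forecast.var(axis=0).sum())
    ```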

  15. Assessing and mapping spatial associations among oral cancer mortality rates, concentrations of heavy metals in soil, and land use types based on multiple scale data.

    PubMed

    Lin, Wei-Chih; Lin, Yu-Pin; Wang, Yung-Chieh; Chang, Tsun-Kuo; Chiang, Li-Chi

    2014-02-21

    In this study, a deconvolution procedure was used to create a variogram of oral cancer (OC) rates. Based on the variogram, area-to-point (ATP) Poisson kriging and p-field simulation were used to downscale and simulate, respectively, the OC rate data for Taiwan from the district scale to a 1 km × 1 km grid scale. Local cluster analysis (LCA) of OC mortality rates was then performed to identify OC mortality rate hot spots based on the downscaled and the p-field-simulated OC mortality maps. The relationship between OC mortality and land use was studied by overlaying the maps of the downscaled OC mortality, the LCA results, and the land uses. One thousand simulations were performed to quantify local and spatial uncertainties in the LCA identification of OC mortality hot spots. Scatter plots and Spearman's rank correlation were used to quantify the relationship between OC mortality and the concentrations of seven heavy metals on the 1 km grid. The correlation analysis for the 1 km scale revealed a weak correlation between OC mortality rate and the concentrations of the seven studied heavy metals in soil. Accordingly, the heavy metal concentrations in soil are not major determinants of OC mortality rates at the 1 km scale at which soils were sampled. The LCA statistical results for the local indicator of spatial association (LISA) revealed that sites with a high probability of high-high (high value surrounded by high values) OC mortality at the 1 km grid scale were clustered in southern, eastern, and mid-western Taiwan. The number of such sites was also significantly higher on agricultural land and in urban regions than on land with other uses. The proposed approach can be used to downscale mortality data, and to evaluate its uncertainty, from a coarse scale to a fine scale at which useful additional information can be obtained for assessing and managing land use and risk.

  16. Surface Temperature Data Analysis

    NASA Technical Reports Server (NTRS)

    Hansen, James; Ruedy, Reto

    2012-01-01

    Small global mean temperature changes may have significant to disastrous consequences for the Earth's climate if they persist for an extended period. Obtaining global means from local weather reports is hampered by the uneven spatial distribution of reliably reporting weather stations, and methods had to be developed to minimize the impact of that situation as far as possible. This software is a method of combining temperature data from individual stations to obtain a global mean trend, overcoming or estimating the uncertainty introduced by the spatial and temporal gaps in the available data. Useful estimates were obtained by introducing a special grid that subdivides the Earth's surface into 8,000 equal-area boxes, using the existing data to create virtual stations at the center of each box, and combining temperature anomalies (after assessing the radius of high correlation) rather than temperatures.
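
    A minimal sketch of the gridding idea described above, combining station anomalies into a virtual-station series with distance-based weights. The linear weighting and the 1200 km radius are illustrative assumptions, not the exact scheme in this software:

    ```python
    import numpy as np

    def box_anomaly(station_anoms, distances_km, radius_km=1200.0):
        """Combine station temperature *anomalies* into one virtual-station
        series at a box centre, linearly down-weighting distant stations.
        station_anoms: (n_stations, n_months), NaN where a station has no data.
        distances_km: (n_stations,) distance of each station to the box centre.
        """
        w = np.clip(1.0 - distances_km / radius_km, 0.0, None)[:, None]
        w = np.where(np.isnan(station_anoms), 0.0, w)   # ignore missing months
        data = np.nan_to_num(station_anoms)
        with np.errstate(invalid="ignore"):             # NaN where no coverage
            return (w * data).sum(axis=0) / w.sum(axis=0)

    # Two hypothetical stations, three months of anomalies (deg C):
    anoms = np.array([[0.3, np.nan, 0.5],
                      [0.1, 0.2, np.nan]])
    print(box_anomaly(anoms, np.array([200.0, 900.0])))
    ```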

  17. SRNL PARTICIPATION IN THE MULTI-SCALE ENSEMBLE EXERCISES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buckley, R

    2007-10-29

    Consequence assessment during emergency response often requires atmospheric transport and dispersion modeling to guide decision making. A statistical analysis of the ensemble of results from several models is a useful way of estimating the uncertainty for a given forecast. ENSEMBLE is a European Union program that utilizes an internet-based system to ingest transport results from numerous modeling agencies. A recent set of exercises required output on three distinct spatial and temporal scales. The Savannah River National Laboratory (SRNL) uses a regional prognostic model nested within a larger-scale synoptic model to generate the meteorological conditions, which are in turn used in a Lagrangian particle dispersion model. A discussion of SRNL participation in these exercises is given, with particular emphasis on requirements for provision of results in a timely manner with regard to the various spatial scales.

  18. Nutrient Budgets in Successional Northern Hardwood Forests: Uncertainty in soil, root, and tree concentrations and pools (Invited)

    NASA Astrophysics Data System (ADS)

    Yanai, R. D.; Bae, K.; Levine, C. R.; Lilly, P.; Vadeboncoeur, M. A.; Fatemi, F. R.; Blum, J. D.; Arthur, M.; Hamburg, S.

    2013-12-01

    Ecosystem nutrient budgets are difficult to construct and even more difficult to replicate. As a result, uncertainty in the estimates of pools and fluxes is rarely reported, and opportunities to assess confidence through replicated measurements are rare. In this study, we report nutrient concentrations and contents of soil and biomass pools in northern hardwood stands in replicate plots within replicate stands in 3 age classes (14-19 yr, 26-29 yr, and > 100 yr) at the Bartlett Experimental Forest, USA. Soils were described by quantitative soil pits in three plots per stand, excavated by depth increment to the C horizon and analyzed by a sequential extraction procedure. Variation in soil mass among pits within stands averaged 28% (coefficient of variation); variation among stands within an age class ranged from 9-25%. Variation in nutrient concentrations was higher still (averaging 38% within element, depth increment, and extraction type), perhaps because the depth increments contained varying proportions of genetic horizons. To estimate nutrient contents of aboveground biomass, we propagated model uncertainty through allometric equations and found errors ranging from 3-7%, depending on the stand. The variation in biomass among plots within stands (6-19%) was always larger than the allometric uncertainties. Measured nutrient concentrations of tree tissues were more variable than the biomass estimates. Foliage had the lowest variability (averaging 16% for Ca, Mg, K, N and P within age class and species) and wood the highest (averaging 30%), when reported in proportion to the mean, because concentrations in wood are low. For Ca content of aboveground biomass, sampling variation was the greatest source of uncertainty: coefficients of variation among plots within a stand averaged 16%, and stands within an age class ranged from 5-25% CV, including uncertainties in tree allometry and tissue chemistry. Uncertainty analysis can help direct research effort to the areas most in need of improvement. In systems such as the one we studied, more intensive sampling would be the best approach to reducing uncertainty, as natural spatial variation was higher than model or measurement uncertainties.

  19. Uncertainties in land use data

    NASA Astrophysics Data System (ADS)

    Castilla, G.; Hay, G. J.

    2006-11-01

    This paper deals with the description and assessment of uncertainties in gridded land use data derived from Remote Sensing observations, in the context of hydrological studies. Land use is a categorical regionalised variable returning the main socio-economic role each location has, where the role is inferred from the pattern of occupation of land. There are two main uncertainties surrounding land use data: positional and categorical. This paper focuses on the second, as the first generally has less serious implications and is easier to tackle. The conventional method used to assess categorical uncertainty, the confusion matrix, is criticised in depth, the main critique being its inability to inform on a basic requirement for propagating uncertainty through distributed hydrological models, namely the spatial distribution of errors. Some existing alternative methods are reported, and finally the need for metadata is stressed as a more reliable means of assessing the quality, and hence the uncertainty, of these data.

  20. Number-phase minimum-uncertainty state with reduced number uncertainty in a Kerr nonlinear interferometer

    NASA Astrophysics Data System (ADS)

    Kitagawa, M.; Yamamoto, Y.

    1987-11-01

    An alternative scheme for generating amplitude-squeezed states of photons based on unitary evolution which can properly be described by quantum mechanics is presented. This scheme is a nonlinear Mach-Zehnder interferometer containing an optical Kerr medium. The quasi-probability density (QPD) and photon-number distribution of the output field are calculated, and it is demonstrated that the reduced photon-number uncertainty and enhanced phase uncertainty maintain the minimum-uncertainty product. A self-phase-modulation of the single-mode quantized field in the Kerr medium is described based on localized operators. The spatial evolution of the state is demonstrated by QPD in the Schroedinger picture. It is shown that photon-number variance can be reduced to a level far below the limit for an ordinary squeezed state, and that the state prepared using this scheme remains a number-phase minimum-uncertainty state until the maximum reduction of number fluctuations is surpassed.

  1. Use of meteorological information in the risk analysis of a mixed wind farm and solar power plant portfolio

    NASA Astrophysics Data System (ADS)

    Mengelkamp, H.-T.; Bendel, D.

    2010-09-01

    The renewable energy industry has rapidly developed during the last two decades, and so have the needs for high-quality, comprehensive meteorological services. It is, however, only recently that international financial institutions have begun to bundle wind farms and solar power plants and offer shares in these aggregate portfolios. The monetary value of a mixed wind farm and solar power plant portfolio is determined by legal and technical aspects, the expected annual energy production of each wind farm and solar power plant, and the associated uncertainty of the energy yield estimation, i.e. the investment risk. Building an aggregate portfolio reduces the overall uncertainty through diversification, in contrast to the single wind farm or solar power plant energy yield uncertainty; this is similar to equity funds based on a variety of companies or products. Meteorological aspects contribute to the diversification in various ways. There is the uncertainty in the estimation of the expected long-term mean energy production of the wind and solar power plants, where different components of uncertainty have to be considered depending on whether the power plant is already in operation or in the planning phase. The uncertainty related to a wind farm in the planning phase comprises the methodology of the wind potential estimation, the uncertainty of the site-specific wind turbine power curve, and the uncertainty of the wind farm effect calculation. The uncertainty related to a solar power plant in the pre-operational phase comprises the uncertainty of the radiation database and that of the performance curve. The long-term mean annual energy yield of operational wind farms and solar power plants is estimated on the basis of the actual energy production and its relation to a climatologically stable long-term reference period. These components of uncertainty are of a technical nature and based on subjective estimations rather than on a statistically sound data analysis. And then there is the temporal and spatial variability of wind speed and radiation, whose influence on the overall risk is determined by the regional distribution of the power plants. These uncertainty components are calculated on the basis of wind speed observations and simulations and satellite-derived radiation data. The respective volatility (temporal variability) is calculated from the site-specific time series, and its influence on the portfolio through regional correlation. For an exemplary portfolio comprising fourteen wind farms and eight solar power plants, the expected annual mean energy production is calculated, and the different components of uncertainty are estimated for each single wind farm and solar power plant and for the portfolio as a whole. The reduction in uncertainty (or risk) through bundling the wind farms and the solar power plants (the portfolio effect) is calculated with Markowitz' Modern Portfolio Theory. This theory is applied separately to the wind farm bundle, the solar power plant bundle, and the combination of both. The combination of wind and photovoltaic assets clearly shows potential for risk reduction. Even assets with a comparably low expected return can lead to a significant risk reduction, depending on their individual characteristics.
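
    The portfolio effect described above follows from Markowitz' variance formula: the uncertainty of the bundle depends on the single-asset uncertainties, their weights, and their correlations. A minimal sketch with hypothetical uncertainties and correlations for a small wind/solar portfolio:

    ```python
    import numpy as np

    # Hypothetical annual-yield uncertainties (as fractions of expected yield)
    # for three wind farms and two PV plants, plus an assumed correlation
    # matrix: wind-wind and solar-solar correlate regionally, wind-solar weakly.
    sigma = np.array([0.12, 0.11, 0.13, 0.08, 0.09])
    corr = np.array([
        [1.0, 0.7, 0.6, 0.1, 0.1],
        [0.7, 1.0, 0.6, 0.1, 0.1],
        [0.6, 0.6, 1.0, 0.1, 0.1],
        [0.1, 0.1, 0.1, 1.0, 0.8],
        [0.1, 0.1, 0.1, 0.8, 1.0],
    ])
    weights = np.full(5, 0.2)           # equal share of portfolio value

    cov = corr * np.outer(sigma, sigma)
    portfolio_sigma = np.sqrt(weights @ cov @ weights)
    undiversified = weights @ sigma     # weighted average single-asset risk
    print(portfolio_sigma, undiversified)  # portfolio risk is the smaller one
    ```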

  2. A TIERED APPROACH TO PERFORMING UNCERTAINTY ANALYSIS IN CONDUCTING EXPOSURE ANALYSIS FOR CHEMICALS

    EPA Science Inventory

    The WHO/IPCS draft Guidance Document on Characterizing and Communicating Uncertainty in Exposure Assessment provides guidance on recommended strategies for conducting uncertainty analysis as part of human exposure analysis. Specifically, a tiered approach to uncertainty analysis ...

  3. Heavy Metal Pollution Delineation Based on Uncertainty in a Coastal Industrial City in the Yangtze River Delta, China

    PubMed Central

    Zhao, Ruiying; Chen, Songchao; Zhou, Yue; Jin, Bin; Li, Yan

    2018-01-01

    Assessing heavy metal pollution and delineating polluted areas are the bases for evaluating pollution and determining a cost-effective remediation plan. Most existing studies are based on the spatial distribution of pollutants but ignore the related uncertainty. In this study, eight heavy-metal concentrations (Cr, Pb, Cd, Hg, As, Cu, Ni, and Zn) were collected at 1040 sampling sites in a coastal industrial city in the Yangtze River Delta, China. The single pollution index (PI) and Nemerow integrated pollution index (NIPI) were calculated for every surface sample (0–20 cm) to assess the degree of heavy metal pollution. Ordinary kriging (OK) was used to map the spatial distribution of heavy metal content and NIPI. Then, we delineated composite heavy metal contamination based on the uncertainty produced by indicator kriging (IK). The results showed that the mean values of all PIs and NIPIs were at safe levels. Heavy metals were most accumulated in the central portion of the study area. Based on IK, the spatial probability of composite heavy metal pollution was computed; the probability of composite contamination was highest in the central core urban area. A probability of 0.6 was found to be the optimum threshold for delineating polluted from unpolluted areas for integrative heavy metal contamination. Delineation based on uncertainty yielded a false negative error area of 6.34% and a false positive error area of 0.86%, with a classification accuracy of 92.80%. This indicates that the method we developed is a valuable tool for delineating heavy metal pollution. PMID:29642623

  4. Heavy Metal Pollution Delineation Based on Uncertainty in a Coastal Industrial City in the Yangtze River Delta, China.

    PubMed

    Hu, Bifeng; Zhao, Ruiying; Chen, Songchao; Zhou, Yue; Jin, Bin; Li, Yan; Shi, Zhou

    2018-04-10

    Assessing heavy metal pollution and delineating polluted areas are the bases for evaluating pollution and determining a cost-effective remediation plan. Most existing studies are based on the spatial distribution of pollutants but ignore the related uncertainty. In this study, eight heavy-metal concentrations (Cr, Pb, Cd, Hg, As, Cu, Ni, and Zn) were collected at 1040 sampling sites in a coastal industrial city in the Yangtze River Delta, China. The single pollution index (PI) and Nemerow integrated pollution index (NIPI) were calculated for every surface sample (0-20 cm) to assess the degree of heavy metal pollution. Ordinary kriging (OK) was used to map the spatial distribution of heavy metal content and NIPI. Then, we delineated composite heavy metal contamination based on the uncertainty produced by indicator kriging (IK). The results showed that the mean values of all PIs and NIPIs were at safe levels. Heavy metals were most accumulated in the central portion of the study area. Based on IK, the spatial probability of composite heavy metal pollution was computed; the probability of composite contamination was highest in the central core urban area. A probability of 0.6 was found to be the optimum threshold for delineating polluted from unpolluted areas for integrative heavy metal contamination. Delineation based on uncertainty yielded a false negative error area of 6.34% and a false positive error area of 0.86%, with a classification accuracy of 92.80%. This indicates that the method we developed is a valuable tool for delineating heavy metal pollution.
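
    The delineation step in this record reduces to thresholding a probability surface and scoring the resulting misclassification areas. A minimal sketch with synthetic grids standing in for the IK probability map and the validation data; the 0.6 threshold is the study's, everything else is hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical grids: IK-derived probability of pollution per cell,
    # and "true" pollution status from validation data.
    prob = rng.random((100, 100))
    truth = prob + rng.normal(0, 0.15, prob.shape) > 0.5

    threshold = 0.6                         # optimum found in the study
    flagged = prob > threshold

    false_neg = np.mean(~flagged & truth)   # polluted but not delineated
    false_pos = np.mean(flagged & ~truth)   # delineated but clean
    accuracy = np.mean(flagged == truth)
    print(f"FN area: {false_neg:.2%}, FP area: {false_pos:.2%}, acc: {accuracy:.2%}")
    ```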

  5. Using heat as a tracer to estimate spatially distributed mean residence times in the hyporheic zone of a riffle-pool sequence

    USGS Publications Warehouse

    Naranjo, Ramon C.

    2013-01-01

    Biochemical reactions that occur in the hyporheic zone are highly dependent on the time that solutes are in contact with sediments of the riverbed. In this investigation, we developed a 2-D longitudinal flow and solute-transport model to estimate the spatial distribution of mean residence time in the hyporheic zone. The flow model was calibrated using observations of temperature and pressure, and the mean residence times were simulated using the age-mass approach for steady-state flow conditions. The approach used in this investigation includes the mixing of water of different ages and flow paths through advection and dispersion. Uncertainty of flow and transport parameters was evaluated using standard Monte Carlo simulation and the generalized likelihood uncertainty estimation method. Results of parameter estimation support the presence of a low-permeability zone that induced horizontal flow at shallow depth within the riffle area, establishing shallow, localized flow paths and limiting deep vertical exchange. For the optimal model, mean residence times were found to be relatively long (9–40 days). The uncertainty of hydraulic conductivity resulted in a mean interquartile range (IQR) of 13 days across all piezometers, which was reduced by 24% with the inclusion of temperature and pressure observations. To a lesser extent, uncertainty in streambed porosity and dispersivity resulted in mean IQRs of 2.2 and 4.7 days, respectively. Alternative conceptual models demonstrate the importance of accounting for the spatial distribution of hydraulic conductivity in simulating mean residence times in a riffle-pool sequence.

  6. Automating the evaluation of flood damages: methodology and potential gains

    NASA Astrophysics Data System (ADS)

    Eleutério, Julian; Martinez, Edgar Daniel

    2010-05-01

    The evaluation of flood damage potential consists of three main steps: assessing and processing data, combining data, and calculating potential damages. The first step consists of modelling hazard and assessing vulnerability; in general, this step demands more time and investment than the others. The second step consists of combining spatial data on hazard with spatial data on vulnerability. A Geographic Information System (GIS) is a fundamental tool in this step, since GIS software allows the simultaneous analysis of spatial and matrix data. The third step consists of calculating potential damages by means of damage functions or contingent analysis. All steps demand time and expertise. However, the last two steps must be repeated several times when comparing different management scenarios, and uncertainty analyses and sensitivity tests are made during the second and third steps. The feasibility of these steps can therefore determine the extent of the evaluation: low feasibility could lead to choosing not to evaluate uncertainty, or to limiting the number of scenario comparisons. Several computer models have been developed over time to evaluate flood risk, and GIS software is widely used to carry out flood risk analyses: to combine and process different types of data, and to visualise the risk and the evaluation results. The main advantages of using a GIS in these analyses are the possibility of "easily" repeating the analyses, in order to compare different scenarios and study uncertainty; the generation of datasets which can be reused at any time in future to support territorial decision making; and the possibility of adding information over time to update the dataset and make other analyses. However, these analyses require personnel specialisation and time. The use of GIS software to evaluate flood risk requires personnel with a double professional specialisation: proficiency in GIS software and in flood damage analysis (which is already a multidisciplinary field). Great effort is necessary to correctly evaluate flood damages, and updating and improving the evaluation over time becomes a difficult task. The automation of this process should bring great advances in flood management studies, especially for public utilities. This study has two specific objectives: (1) show the entire process of automating the second and third steps of flood damage evaluations; and (2) analyse the resulting potential gains in terms of the time and expertise needed for the analysis. A programming language is used within GIS software to automate the combination of hazard and vulnerability data and the calculation of potential damages. We discuss the overall process of flood damage evaluation. The main result of this study is a computational tool which allows significant operational gains in flood loss analyses. We quantify these gains by means of a hypothetical example: the tool significantly reduces the time of analysis and the need for expertise. An indirect gain is that sensitivity and cost-benefit analyses can be realized more easily.
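
    Steps two and three of the evaluation, overlaying hazard and vulnerability grids and applying damage functions, are the part the study automates. A minimal sketch of that overlay-and-calculate logic; the grids, land-use classes, asset values, and depth-damage curves are all hypothetical:

    ```python
    import numpy as np

    # Step 2: combine hazard and vulnerability grids (same extent/resolution).
    depth = np.array([[0.0, 0.4], [1.2, 2.5]])       # flood depth in metres
    landuse = np.array([[0, 1], [1, 2]])             # 0=green, 1=housing, 2=industry
    asset_value = np.array([0.0, 200.0, 500.0])      # value per cell, by class

    # Step 3: depth-damage functions per land-use class (fraction of value).
    def damage_fraction(depth_m, cls):
        curves = {0: lambda d: 0.0 * d,
                  1: lambda d: np.clip(0.4 * d, 0, 1),
                  2: lambda d: np.clip(0.25 * d, 0, 1)}
        return curves[cls](depth_m)

    damage = np.zeros_like(depth)
    for cls in np.unique(landuse):
        mask = landuse == cls
        damage[mask] = damage_fraction(depth[mask], cls) * asset_value[cls]

    print(damage.sum())   # total potential damage for this scenario
    ```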

  7. Spatial distribution of child pedestrian injuries along census tract boundaries: Implications for identifying area-based correlates.

    PubMed

    Curtis, Jacqueline W

    2017-01-01

    Census tracts are often used to investigate area-based correlates of a variety of health outcomes. This approach has been shown to be valuable in understanding the ways that health is shaped by place and in designing appropriate interventions that account for community-level processes. Following this line of inquiry, it is common in the study of pedestrian injuries to aggregate the point-level locations of these injuries to the census tracts in which they occur. Such aggregation enables investigation of the relationships between a range of socioeconomic variables and areas of notably high or low incidence. This study reports on the spatial distribution of child pedestrian injuries in a mid-sized U.S. city over a three-year period. Utilizing a combination of geospatial approaches (Near Analysis, Kernel Density Estimation, and Local Moran's I) enables identification, visualization, and quantification of the close proximity between incidents and tract boundaries. Specifically, results reveal that nearly half of the 100 incidents occur on roads that are also census tract boundaries. Results also uncover incidents that occur on tract boundaries, not merely near them. This geographic pattern raises the question of the utility of associating area-based census data from any one tract with the injuries occurring in these border zones. Furthermore, using a standard spatial join technique in a Geographic Information System (GIS), points located on the border are counted as falling into the census tracts on both sides of the boundary, which introduces uncertainty into any subsequent analysis. Therefore, two additional approaches to aggregating points to polygons were tested in this study. Results differ with each approach, but without any alert of such differences to the GIS user. This finding raises a fundamental concern about the techniques through which points are aggregated to polygons in any study using point-level incidents and their surrounding census tract socioeconomic data to understand health and place. The study concludes with a suggested protocol to test for this source of uncertainty and an approach that may remove it.

  8. Determination of Interannual to Decadal Changes in Ice Sheet Mass Balance from Satellite Altimetry

    NASA Technical Reports Server (NTRS)

    Zwally, H. Jay; Busalacchi, Antonio J. (Technical Monitor)

    2001-01-01

    A major uncertainty in predicting sea level rise is the sensitivity of ice sheet mass balance to climate change, as well as the uncertainty in the present mass balance. Since the annual water exchange is about 8 mm of global sea level equivalent, the +/- 25% uncertainty in the current mass balance corresponds to +/- 2 mm/yr in sea level change. Furthermore, estimates of the sensitivity of the mass balance to temperature change range from perhaps as much as -10% to +10% per K. Although the overall ice mass balance and its seasonal and inter-annual variations can be derived from time series of ice surface elevations from satellite altimetry, satellite radar altimeters have been limited in spatial coverage and elevation accuracy. Nevertheless, new data analysis shows mixed patterns of ice elevation increases and decreases that are significant in terms of regional-scale mass balances. In addition, observed seasonal and interannual variations in elevation demonstrate the potential for relating the variability in mass balance to changes in precipitation, temperature, and melting. From 2001, NASA's ICESat laser altimeter mission will provide significantly better elevation accuracy and spatial coverage, to 86 deg latitude and to the margins of the ice sheets. During 3 to 5 years of ICESat-1 operation, an estimate of the overall ice sheet mass balance and sea level contribution will be obtained. The importance of continued ice monitoring after the first ICESat is illustrated by the variability in the area of Greenland surface melt observed over 17 years and its correlation with temperature. In addition, measurement of ice sheet changes, along with measurements of sea level change by a series of ocean altimeters, should enable direct detection of correlations between ice level and global sea level.

  9. A Novel Method to Handle the Effect of Uneven Sampling Effort in Biodiversity Databases

    PubMed Central

    Pardo, Iker; Pata, María P.; Gómez, Daniel; García, María B.

    2013-01-01

    How reliable are results on the spatial distribution of biodiversity based on databases? Many studies have evidenced the uncertainty that sampling effort bias introduces into this kind of analysis, and the need for its quantification. Although a number of methods are available for this, little is known about their statistical limitations and discrimination capability, which could seriously constrain their use. We assess for the first time the discrimination capacity of two widely used methods and a proposed new one (FIDEGAM), all based on species accumulation curves, under different scenarios of sampling exhaustiveness, using Receiver Operating Characteristic (ROC) analyses. Additionally, we examine to what extent the output of each method represents the sampling completeness in a simulated scenario where the true species richness is known. Finally, we apply FIDEGAM to a real situation and explore the spatial patterns of plant diversity in a National Park. FIDEGAM showed an excellent capability to discriminate between well and poorly sampled areas regardless of sampling exhaustiveness, whereas the other methods failed. Accordingly, FIDEGAM values were strongly correlated with the true percentage of species detected in a simulated scenario, whereas sampling completeness estimated with the other methods showed no relationship, due to null discrimination capability. Quantifying sampling effort is necessary to account for the uncertainty in biodiversity analyses; however, not all proposed methods are equally reliable. Our comparative analysis demonstrated that FIDEGAM was the most accurate discriminator method in all scenarios of sampling exhaustiveness, and therefore it can be efficiently applied to most databases to enhance the reliability of biodiversity analyses. PMID:23326357

  10. A novel method to handle the effect of uneven sampling effort in biodiversity databases.

    PubMed

    Pardo, Iker; Pata, María P; Gómez, Daniel; García, María B

    2013-01-01

    How reliable are results on the spatial distribution of biodiversity based on databases? Many studies have evidenced the uncertainty that sampling effort bias introduces into this kind of analysis, and the need for its quantification. Although a number of methods are available for this, little is known about their statistical limitations and discrimination capability, which could seriously constrain their use. We assess for the first time the discrimination capacity of two widely used methods and a proposed new one (FIDEGAM), all based on species accumulation curves, under different scenarios of sampling exhaustiveness, using Receiver Operating Characteristic (ROC) analyses. Additionally, we examine to what extent the output of each method represents the sampling completeness in a simulated scenario where the true species richness is known. Finally, we apply FIDEGAM to a real situation and explore the spatial patterns of plant diversity in a National Park. FIDEGAM showed an excellent capability to discriminate between well and poorly sampled areas regardless of sampling exhaustiveness, whereas the other methods failed. Accordingly, FIDEGAM values were strongly correlated with the true percentage of species detected in a simulated scenario, whereas sampling completeness estimated with the other methods showed no relationship, due to null discrimination capability. Quantifying sampling effort is necessary to account for the uncertainty in biodiversity analyses; however, not all proposed methods are equally reliable. Our comparative analysis demonstrated that FIDEGAM was the most accurate discriminator method in all scenarios of sampling exhaustiveness, and therefore it can be efficiently applied to most databases to enhance the reliability of biodiversity analyses.

  11. Field infiltration measurements in grassed roadside drainage ditches: Spatial and temporal variability

    NASA Astrophysics Data System (ADS)

    Ahmed, Farzana; Gulliver, John S.; Nieber, J. L.

    2015-11-01

    Roadside drainage ditches (grassed swales) are an attractive stormwater control measure (SCM) since they can reduce runoff volume by infiltrating water into the soil, filter sediments and associated pollutants out of the water, and settle solids onto the soil surface. In this study, a total of 722 infiltration measurements were collected in five swales located in the Twin Cities, MN, and one swale located in Madison, WI, to characterize the field-saturated hydraulic conductivity (Kfs) of these swales. Measurements were taken with a falling head device, the Modified Philip Dunne (MPD) infiltrometer, which allows the collection of simultaneous infiltration measurements at multiple locations with several infiltrometers. Field-saturated hydraulic conductivity was higher than expected for the different soil texture classes; we hypothesize that this is due to plant roots creating macropores that break up the soil for infiltration. Statistical analysis was performed on the Kfs values to analyze the effects of initial soil moisture content, season, soil texture class, and distance in the downstream direction on the geometric mean Kfs value of a swale. Because of the high spatial variation of Kfs within the same swale, no effect of initial soil moisture content, season, or soil texture class was observed on the geometric mean Kfs value, but the distance in the downstream direction may have a positive or negative effect. An uncertainty analysis on the Kfs value indicated that approximately twenty infiltration measurements are the minimum number needed to obtain a representative geometric mean Kfs value, within an acceptable level of uncertainty, for a swale that is less than 350 m long.
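
    The conclusion that roughly twenty measurements suffice can be explored with a bootstrap on the geometric mean, since Kfs is commonly treated as lognormal. A minimal sketch on synthetic data; the lognormal parameters are hypothetical, not fitted to the study's measurements:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical Kfs measurements (cm/hr): lognormal, as field data often are.
    kfs = rng.lognormal(mean=1.0, sigma=0.8, size=722)

    def geomean_ci_width(sample_size, n_boot=2000):
        """Bootstrap 95% CI width of the geometric mean for a given sample size."""
        gms = []
        for _ in range(n_boot):
            resample = rng.choice(kfs, size=sample_size, replace=True)
            gms.append(np.exp(np.mean(np.log(resample))))
        lo, hi = np.percentile(gms, [2.5, 97.5])
        return hi - lo

    for n in (5, 10, 20, 40):
        print(n, round(geomean_ci_width(n), 2))  # width narrows as n grows
    ```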

  12. Image/video understanding systems based on network-symbolic models

    NASA Astrophysics Data System (ADS)

    Kuvich, Gary

    2004-03-01

    Vision is part of a larger information system that converts visual information into knowledge structures. These structures drive the vision process, resolve ambiguity and uncertainty via feedback projections, and provide image understanding, that is, an interpretation of visual information in terms of such knowledge models. Computer simulation models are built on the basis of graphs/networks, and the human brain has been found to emulate similar graph/network models. Symbols, predicates and grammars naturally emerge in such networks, and logic is simply a way of restructuring such models. The brain analyzes an image as a graph-type relational structure created via multilevel hierarchical compression of visual information. Primary areas provide active fusion of image features on a spatial grid-like structure, where nodes are cortical columns; spatial logic and topology are naturally present in such structures. Mid-level vision processes, like perceptual grouping and separation of figure from ground, are special kinds of network transformations. They convert the primary image structure into a set of more abstract structures, which represent objects and the visual scene, making them easier to analyze with higher-level knowledge structures; higher-level vision phenomena are the results of such analysis. The composition of network-symbolic models combines learning, classification, and analogy with higher-level model-based reasoning in a single framework, and it works similarly to frames and agents. Computational intelligence methods transform images into a model-based knowledge representation. Based on such principles, an image/video understanding system can convert images into knowledge models and resolve uncertainty and ambiguity, allowing the creation of intelligent computer vision systems for design and manufacturing.

  13. Does the uncertainty in the representation of terrestrial water flows affect precipitation predictability? A WRF-Hydro ensemble analysis for Central Europe

    NASA Astrophysics Data System (ADS)

    Arnault, Joel; Rummler, Thomas; Baur, Florian; Lerch, Sebastian; Wagner, Sven; Fersch, Benjamin; Zhang, Zhenyu; Kerandi, Noah; Keil, Christian; Kunstmann, Harald

    2017-04-01

    Precipitation predictability can be assessed by the spread within an ensemble of atmospheric simulations perturbed in the initial conditions, lateral boundary conditions, and/or modeled processes within a range of uncertainty. Surface-related processes are more likely to change precipitation when synoptic forcing is weak. This study investigates the effect of uncertainty in the representation of terrestrial water flows on precipitation predictability. The tools used for this investigation are the Weather Research and Forecasting (WRF) model and its hydrologically enhanced version WRF-Hydro, applied over Central Europe during April-October 2008. The WRF grid is that of COSMO-DE, with a resolution of 2.8 km. In WRF-Hydro, the WRF grid is coupled with a sub-grid at 280 m resolution to resolve lateral terrestrial water flows. Vertical flow uncertainty is considered by modifying the parameter controlling the partitioning between surface runoff and infiltration in WRF, and horizontal flow uncertainty is considered by comparing WRF with WRF-Hydro. Precipitation predictability is deduced from the spread of an ensemble based on three turbulence parameterizations. Model results are validated with E-OBS precipitation and surface temperature, ESA-CCI soil moisture, FLUXNET-MTE surface evaporation and GRDC discharge. It is found that the uncertainty in the representation of terrestrial water flows is more likely to significantly affect precipitation predictability when surface flux spatial variability is high. In comparison to the WRF ensemble, WRF-Hydro slightly improves the adjusted continuous ranked probability score of daily precipitation. The reproduction of observed daily discharge with Nash-Sutcliffe model efficiency coefficients of up to 0.91 demonstrates the potential of WRF-Hydro for flood forecasting.

  14. ICESat laser altimetry over small mountain glaciers

    NASA Astrophysics Data System (ADS)

    Treichler, Désirée; Kääb, Andreas

    2016-09-01

    Using sparsely glaciated southern Norway as a case study, we assess the potential and limitations of ICESat laser altimetry for analysing regional glacier elevation change in rough mountain terrain. Differences between ICESat GLAS elevations and reference elevation data are plotted over time to derive a glacier surface elevation trend for the ICESat acquisition period 2003-2008. We find spatially varying biases between ICESat and three tested digital elevation models (DEMs): the Norwegian national DEM, the SRTM DEM, and a high-resolution lidar DEM. For regional glacier elevation change, the spatial inconsistency of the reference DEMs - a result of spatio-temporal merging - has the potential to significantly affect or dilute trends. Elevation uncertainties of all three tested DEMs exceed the ICESat elevation uncertainty by an order of magnitude; it is thus the reference DEMs, rather than ICESat, that limit the accuracy of the method. ICESat matches the glacier size distribution of the study area well and measures small ice patches not commonly monitored in situ. The sample is large enough for spatial and thematic subsetting. Vertical offsets to ICESat elevations vary for different glaciers in southern Norway due to spatially inconsistent reference DEM age. We introduce a per-glacier correction that removes these spatially varying offsets and considerably increases trend significance; only after application of this correction do individual campaigns fit observed in situ glacier mass balance. Our correction also has the potential to improve glacier trend significance for other causes of spatially varying vertical offsets, for instance radar penetration into ice and snow for the SRTM DEM, or the mosaicking and merging that is common for national and global DEMs. After correction of the reference elevation bias, we find that ICESat provides a robust and realistic estimate of a moderately negative glacier mass balance of around -0.36 ± 0.07 m ice per year. This regional estimate agrees well with the heterogeneous but overall negative in situ glacier mass balance observed in the area.

  15. Hydraulic Conductivity Estimation using Bayesian Model Averaging and Generalized Parameterization

    NASA Astrophysics Data System (ADS)

    Tsai, F. T.; Li, X.

    2006-12-01

    Non-uniqueness of the parameterization scheme is an inherent problem in groundwater inverse modeling due to limited data. To cope with this non-uniqueness, we introduce a Bayesian Model Averaging (BMA) method to integrate a set of selected parameterization methods. The estimation uncertainty in BMA includes the uncertainty in individual parameterization methods, as the within-parameterization variance, and the uncertainty from using different parameterization methods, as the between-parameterization variance. Moreover, the generalized parameterization (GP) method is considered in the geostatistical framework in this study. The GP method aims at increasing the flexibility of parameterization through the combination of a zonation structure and an interpolation method. The use of BMA with GP avoids over-confidence in a single parameterization method. A normalized least-squares estimation (NLSE) is adopted to calculate the posterior probability for each GP. We employ the adjoint state method for the sensitivity analysis of the weighting coefficients in the GP method; the adjoint state method is also applied to the NLSE problem. The proposed methodology is applied to the Alamitos Barrier Project (ABP) in California, where the spatially distributed hydraulic conductivity is estimated. The optimal weighting coefficients embedded in GP are identified through maximum likelihood estimation (MLE), where the misfits between the observed and calculated groundwater heads are minimized. The conditional mean and conditional variance of the estimated hydraulic conductivity distribution using BMA are obtained to assess the estimation uncertainty.
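
    The BMA bookkeeping described above combines a weighted mean with within- and between-parameterization variances. A minimal sketch at a single estimation point, with hypothetical estimates, variances, and misfits for three parameterizations; the exponential weighting used here is an illustrative normalization, not the paper's exact NLSE formula:

    ```python
    import numpy as np

    # Hypothetical per-parameterization results at one location:
    # estimated ln-K, its estimation variance, and a least-squares misfit.
    est = np.array([2.1, 1.8, 2.4])        # conditional means from 3 GP schemes
    var_within = np.array([0.30, 0.25, 0.40])
    misfit = np.array([12.0, 10.5, 15.0])  # weighted least-squares objectives

    # Posterior model probabilities from the misfits (illustrative).
    w = np.exp(-0.5 * (misfit - misfit.min()))
    w /= w.sum()

    bma_mean = w @ est
    between = w @ (est - bma_mean) ** 2    # spread across parameterizations
    within = w @ var_within                # average single-model uncertainty
    print(bma_mean, within + between)      # BMA estimate and total variance
    ```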

  16. Uncertainty Analysis of Radar and Gauge Rainfall Estimates in the Russian River Basin

    NASA Astrophysics Data System (ADS)

    Cifelli, R.; Chen, H.; Willie, D.; Reynolds, D.; Campbell, C.; Sukovich, E.

    2013-12-01

    Radar Quantitative Precipitation Estimation (QPE) has been an important application of weather radar since radar was introduced and made widely available after World War II. Although great progress has been made over the last two decades, it remains a challenging process, especially in regions of complex terrain such as the western U.S. It is also extremely difficult to make direct use of radar precipitation data in quantitative hydrologic forecasting models. To improve the understanding of rainfall estimation and distributions in the NOAA Hydrometeorology Testbed in northern California (HMT-West), extensive evaluation of radar and gauge QPE products has been performed using a set of independent rain gauge data. This study focuses on rainfall evaluation in the Russian River Basin. The statistical properties of the different gridded QPE products are compared quantitatively, with the main emphasis on the analysis of uncertainties of the radar and gauge rainfall products, which are subject to various sources of error. The spatial variation analysis of the radar estimates is performed by measuring the statistical distribution of the radar base data, such as reflectivity, and by comparison with a rain gauge cluster. The application of mean field bias values to the radar rainfall data is also described. The uncertainty analysis of the gauge rainfall focuses on the comparison of traditional kriging and conditional bias-penalized kriging (Seo 2012) methods. This comparison is performed with the retrospective Multisensor Precipitation Estimator (MPE) system installed at the NOAA Earth System Research Laboratory. The independent gauge set is again used as the verification tool for the newly generated rainfall products.
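
    Mean field bias adjustment, mentioned above, typically scales the radar field by the ratio of gauge totals to collocated radar totals. A minimal sketch with hypothetical accumulations:

    ```python
    import numpy as np

    # Hypothetical hourly accumulations: gauge readings and the radar
    # estimates at the collocated pixels (mm).
    gauge = np.array([5.2, 3.1, 7.8, 2.4])
    radar_at_gauges = np.array([4.0, 2.5, 6.1, 2.0])

    mfb = gauge.sum() / radar_at_gauges.sum()   # mean field bias factor

    radar_field = np.array([[3.5, 4.2], [1.8, 6.0]])
    corrected = mfb * radar_field               # bias-adjusted QPE grid
    print(round(mfb, 2))
    print(corrected)
    ```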

  17. Modeling Area-Level Health Rankings.

    PubMed

    Courtemanche, Charles; Soneji, Samir; Tchernis, Rusty

    2015-10-01

    We rank county health using a Bayesian factor analysis model, drawing on secondary county data from the National Center for Health Statistics (through 2007) and the Behavioral Risk Factor Surveillance System (through 2009). Our model builds on the existing county health rankings (CHRs) by using data-derived weights to compute ranks from mortality and morbidity variables, and by quantifying uncertainty based on population, spatial correlation, and missing data. We apply our model to Wisconsin, which has comprehensive data, and Texas, which has substantial missing information. The data were downloaded from www.countyhealthrankings.org. Our estimated rankings are more similar to the CHRs for Wisconsin than for Texas, as the data-derived factor weights are closer to the assigned weights for Wisconsin. The correlations between the CHRs and our ranks are 0.89 for Wisconsin and 0.65 for Texas. Uncertainty is especially severe for Texas given the state's substantial missing data. The reliability of comprehensive CHRs varies from state to state. We advise focusing on the counties that remain among the least healthy after incorporating alternate weighting methods and accounting for uncertainty. Our results also highlight the need for broader geographic coverage in health data.

  18. A hierarchical spatial model for well yield in complex aquifers

    NASA Astrophysics Data System (ADS)

    Montgomery, J.; O'Sullivan, F.

    2017-12-01

    Efficiently siting and managing groundwater wells requires reliable estimates of the amount of water that can be produced, or the well yield. This can be challenging to predict in highly complex, heterogeneous fractured aquifers due to the uncertainty around local hydraulic properties. Promising statistical approaches have been advanced in recent years. For instance, kriging and multivariate regression analysis have been applied to well test data with limited but encouraging levels of prediction accuracy. Additionally, some analytical solutions to diffusion in homogeneous porous media have been used to infer "effective" properties consistent with observed flow rates or drawdown. However, this is an under-specified inverse problem with substantial and irreducible uncertainty. We describe a flexible machine learning approach capable of combining diverse datasets with constraining physical and geostatistical models for improved well yield prediction accuracy and uncertainty quantification. Our approach can be implemented within a hierarchical Bayesian framework using Markov Chain Monte Carlo, which allows for additional sources of information to be incorporated in priors to further constrain and improve predictions and reduce the model order. We demonstrate the usefulness of this approach using data from over 7,000 wells in a fractured bedrock aquifer.
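
    A minimal sketch of the hierarchical Bayesian idea: region-level mean log-yields that share a common regional prior, sampled with a simple Metropolis step. The data, hyperparameters, and two-region structure are hypothetical, and a full hierarchical model would also sample the hyperparameters rather than fixing them:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical log well yields in two sub-regions sharing a regional prior;
    # stand-ins for fractured-bedrock well-test data.
    y = {"A": rng.normal(1.0, 0.5, 40), "B": rng.normal(1.6, 0.5, 25)}

    # Partial pooling: mu_r ~ N(mu0, tau^2), y_i ~ N(mu_r, sigma^2).
    mu0, tau, sigma = 1.2, 0.6, 0.5

    def log_post(mu, data):
        """Log posterior of a region mean (Gaussian prior + likelihood)."""
        prior = -0.5 * ((mu - mu0) / tau) ** 2
        like = -0.5 * np.sum((data - mu) ** 2) / sigma**2
        return prior + like

    for region, data in y.items():
        mu, draws = data.mean(), []
        for _ in range(5000):                  # random-walk Metropolis
            prop = mu + rng.normal(0.0, 0.1)
            if np.log(rng.random()) < log_post(prop, data) - log_post(mu, data):
                mu = prop
            draws.append(mu)
        post = np.array(draws[1000:])          # drop burn-in
        print(region, post.mean().round(2), post.std().round(2))
    ```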

  19. Propagation of neutron-reaction uncertainties through multi-physics models of novel LWR's

    NASA Astrophysics Data System (ADS)

    Hernandez-Solis, Augusto; Sjöstrand, Henrik; Helgesson, Petter

    2017-09-01

    The novel design of the renewable boiling water reactor (RBWR) allows a breeding ratio greater than unity and thus aims to provide a self-sustained fuel cycle. The neutron reactions that compose the different microscopic cross-sections and angular distributions are uncertain, so when they are employed to determine the spatial distribution of the neutron flux in a nuclear reactor, a methodology is needed to account for the associated uncertainties. In this work, the Total Monte Carlo (TMC) method is used to propagate the covariances of the different neutron reactions (as well as angular distributions) that are part of the TENDL-2014 nuclear data (ND) library. The main objective is to propagate them through coupled neutronic and thermal-hydraulic models in order to assess the uncertainty of important multi-physics safety parameters, such as the peak cladding temperature along the axial direction of an RBWR fuel assembly. In particular, the study quantifies the impact that the ND covariances of important nuclides, such as U-235, U-238, Pu-239 and the thermal scattering of hydrogen in H2O, have on the deterministic safety analysis of novel nuclear reactor designs.
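
    In TMC, the entire coupled calculation is repeated once per random nuclear-data realization, and the spread of the outputs gives the ND-induced uncertainty. A minimal sketch of that loop; `run_coupled_model`, the file names, and the output statistics are hypothetical placeholders for the actual code system:

    ```python
    import numpy as np

    def run_coupled_model(nuclear_data_file):
        """Placeholder for one coupled neutronic/thermal-hydraulic run that
        returns the peak cladding temperature (K). Hypothetical stub: in TMC
        this would be a full code-system execution per random ND file."""
        rng = np.random.default_rng(abs(hash(nuclear_data_file)) % 2**32)
        return 600.0 + rng.normal(0.0, 8.0)

    # One random TENDL realization per run; file names are illustrative.
    files = [f"random_files/U235_rand_{i:04d}.endf" for i in range(300)]
    pct = np.array([run_coupled_model(f) for f in files])

    # TMC estimate of the ND-induced uncertainty on the safety parameter.
    print(pct.mean(), pct.std(ddof=1))
    ```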

  20. Spatial, temporal and frequency based climate change assessment in Columbia River Basin using multi downscaled-scenarios

    NASA Astrophysics Data System (ADS)

    Rana, Arun; Moradkhani, Hamid

    2016-07-01

    Uncertainties in climate modelling are well documented in the literature. Outputs of Global Climate Models (GCMs) are often downscaled to represent climatic parameters at a regional scale. In the present work, we analyze the changes in precipitation and temperature for the future scenario period 2070-2099 with respect to the historical period 1970-2000, using statistically downscaled GCM projections for the Columbia River Basin (CRB). The analysis is performed using two different statistically downscaled climate projections (with ten downscaled GCM products each, for RCP 4.5 and RCP 8.5, from the CMIP5 dataset): those from the Bias Correction and Spatial Downscaling (BCSD) technique, generated at Portland State University, and those from the Multivariate Adaptive Constructed Analogs (MACA) technique, generated at the University of Idaho, totaling 40 different scenarios. Both the BCSD and MACA datasets are downscaled from observed data for both scenario projections, i.e. RCP4.5 and RCP8.5. The analysis comprises spatial change (yearly scale), temporal change (monthly scale), percentile change (seasonal scale), quantile change (yearly scale), and wavelet analysis (yearly scale) of the future period relative to the historical period, at a resolution of 1/16th of a degree for the entire CRB. Results indicate varying degrees of spatial change across the Columbia River Basin, especially in the western part of the basin. At temporal scales, winter precipitation has higher variability than summer precipitation, and vice versa for temperature. Most of the models indicate considerable positive changes in quantiles and percentiles for both precipitation and temperature. Wavelet analysis provided insights into possible explanations for the changes in precipitation.
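
    The quantile-change diagnostic used above compares the distribution of a variable between the historical and future periods, quantile by quantile. A minimal sketch for one grid cell with synthetic precipitation series; the gamma parameters are hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Hypothetical annual precipitation series (mm) for one grid cell.
    hist = rng.gamma(shape=8.0, scale=100.0, size=31)    # 1970-2000
    future = rng.gamma(shape=8.0, scale=110.0, size=30)  # 2070-2099

    qs = [0.1, 0.25, 0.5, 0.75, 0.9]
    change = np.quantile(future, qs) - np.quantile(hist, qs)
    for q, d in zip(qs, change):
        print(f"q{int(q * 100):02d}: {d:+.1f} mm")
    ```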
