The Uncertainties on the GIS Based Land Suitability Assessment for Urban and Rural Planning
NASA Astrophysics Data System (ADS)
Liu, H.; Zhan, Q.; Zhan, M.
2017-09-01
Most research on the uncertainties of spatial data and spatial analysis focuses on a specific data feature or analysis tool. Few studies have addressed the uncertainties of the whole process of an application such as planning, which leaves uncertainty research detached from practical applications. On the basis of a literature review, this paper discusses the uncertainties of geographical information systems (GIS) based land suitability assessment in planning. The uncertainties considered range from the establishment of the index system to the classification of the final result. Methods are summarized for reducing the uncertainties arising from the discretization of continuous raster data and from index weight determination. The paper analyzes the merits and demerits of the "Natural Breaks" method, which is widely used by planners, and explores other factors that affect the accuracy of the final classification, such as the choice of the number of classes, the class intervals, and the autocorrelation of the spatial data. The paper concludes that machine learning methods should be adapted to accommodate the complexity of land suitability assessment. The work applies spatial data and spatial analysis uncertainty research to land suitability assessment, and improves the scientific rigor of subsequent planning and decision-making.
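The class-interval choices discussed above can be made concrete with a small sketch. The Python snippet below (illustrative only, not the paper's implementation) approximates "Natural Breaks" with a one-dimensional k-means clustering, the usual heuristic behind Jenks-style classification, and contrasts the resulting breaks with equal-interval breaks on a skewed suitability distribution; the data and function names are hypothetical.

```python
import numpy as np

def equal_interval(values, k):
    """Break points that divide the data range into k equal-width classes."""
    return np.linspace(values.min(), values.max(), k + 1)[1:-1]

def natural_breaks(values, k, n_iter=100, seed=0):
    """1-D k-means approximation of Jenks 'Natural Breaks': breaks are
    placed midway between adjacent cluster means."""
    rng = np.random.default_rng(seed)
    centers = np.sort(rng.choice(values, k, replace=False))
    for _ in range(n_iter):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        updated = [values[labels == j].mean() if np.any(labels == j) else centers[j]
                   for j in range(k)]
        centers = np.sort(np.array(updated))
    return (centers[:-1] + centers[1:]) / 2

# Skewed, bimodal suitability scores: many low-suitability cells, few high.
rng = np.random.default_rng(1)
suitability = np.concatenate([rng.normal(0.2, 0.05, 500),
                              rng.normal(0.8, 0.05, 100)])
print("equal interval:", equal_interval(suitability, 3))
print("natural breaks:", natural_breaks(suitability, 3))
```

With a skewed distribution, equal intervals leave some classes nearly empty, whereas the clustering-based breaks track the modes; this is the trade-off the abstract points to when the number of classes or the intervals change.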
Characterizing spatial uncertainty when integrating social data in conservation planning.
Lechner, A M; Raymond, C M; Adams, V M; Polyakov, M; Gordon, A; Rhodes, J R; Mills, M; Stein, A; Ives, C D; Lefroy, E C
2014-12-01
Recent conservation planning studies have presented approaches for integrating spatially referenced social (SRS) data with a view to improving the feasibility of conservation action. We reviewed the growing conservation literature on SRS data, focusing on elicited or stated preferences derived through social survey methods such as choice experiments and public participation geographic information systems. Elicited SRS data include the spatial distribution of willingness to sell, willingness to pay, willingness to act, and assessments of social and cultural values. We developed a typology for assessing elicited SRS data uncertainty, which describes how social survey uncertainty propagates when projected spatially and highlights the importance of accounting for spatial uncertainties such as scale effects and data quality. These uncertainties will propagate when elicited SRS data are integrated with biophysical data for conservation planning and may have important consequences for assessing the feasibility of conservation actions. To explore this issue further, we conducted a systematic review of the elicited SRS data literature. We found that social survey uncertainty was commonly tested for, but that these uncertainties were ignored when projected spatially. Based on these results, we developed a framework that helps researchers and practitioners estimate social survey uncertainty and use these quantitative estimates to systematically address uncertainty within an analysis. This is important when using SRS data in conservation applications because decisions need to be made irrespective of data quality, and well-characterized uncertainty can be incorporated into decision-theoretic approaches. © 2014 Society for Conservation Biology.
Using spatial uncertainty to manipulate the size of the attention focus.
Huang, Dan; Xue, Linyan; Wang, Xin; Chen, Yao
2016-09-01
Preferentially processing behaviorally relevant information is vital for primate survival. In visuospatial attention studies, manipulating the spatial extent of attention focus is an important question. Although many studies have claimed to successfully adjust attention field size by either varying the uncertainty about the target location (spatial uncertainty) or adjusting the size of the cue orienting the attention focus, no systematic studies have assessed and compared the effectiveness of these methods. We used a multiple cue paradigm with 2.5° and 7.5° rings centered around a target position to measure the cue size effect, while the spatial uncertainty levels were manipulated by changing the number of cueing positions. We found that spatial uncertainty had a significant impact on reaction time during target detection, while the cue size effect was less robust. We also carefully varied the spatial scope of potential target locations within a small or large region and found that this amount of variation in spatial uncertainty can also significantly influence target detection speed. Our results indicate that adjusting spatial uncertainty is more effective than varying cue size when manipulating attention field size.
NASA Astrophysics Data System (ADS)
Greiner, Lucie; Nussbaum, Madlene; Papritz, Andreas; Zimmermann, Stephan; Gubler, Andreas; Grêt-Regamey, Adrienne; Keller, Armin
2018-05-01
Spatial information on soil function fulfillment (SFF) is increasingly being used to inform decision-making in spatial planning programs to support sustainable use of soil resources. Soil function maps visualize soils' abilities to fulfill their functions, e.g., regulating water and nutrient flows, providing habitats, and supporting biomass production, based on soil properties. Such information must be reliable for informed and transparent decision-making in spatial planning programs. In this study, we add to the transparency of soil function maps by (1) indicating uncertainties arising from the prediction of soil properties generated by digital soil mapping (DSM) that are used for soil function assessment (SFA) and (2) showing the response of different SFA methods to the propagation of uncertainties through the assessment. For a study area of 170 km2 in the Swiss Plateau, we map 10 static soil sub-functions for agricultural soils at a spatial resolution of 20 × 20 m together with their uncertainties. Mapping the 10 soil sub-functions using simple ordinal assessment scales reveals pronounced spatial patterns with a high variability of SFF scores across the region, linked to the inherent properties of the soils, terrain attributes, and climate conditions. Uncertainties in soil properties propagated through SFA methods generally lead to substantial uncertainty in the mapped soil sub-functions. We propose two types of uncertainty maps that can be readily understood by stakeholders. Cumulative distribution functions of SFF scores indicate that SFA methods respond differently to the propagated uncertainty of soil properties. Even where methods are comparable in complexity and assessment scale, they may respond quite differently to uncertainty propagation. We conclude that comparable uncertainty indications in soil function maps are relevant to enable informed and transparent decisions on the sustainable use of soil resources.
Facing uncertainty in ecosystem services-based resource management.
Grêt-Regamey, Adrienne; Brunner, Sibyl H; Altwegg, Jürg; Bebi, Peter
2013-09-01
The concept of ecosystem services is increasingly used to support natural resource management decisions. While the science for assessing ecosystem services is improving, appropriate methods to address uncertainties in a quantitative manner are missing. Ignoring parameter uncertainties, modeling uncertainties and uncertainties related to human-environment interactions can modify decisions and lead to overlooking important management possibilities. In this contribution, we present a new approach for mapping the uncertainties in the assessment of multiple ecosystem services. The spatially explicit risk approach links Bayesian networks to a Geographic Information System to forecast the value of a bundle of ecosystem services and quantifies the uncertainties related to the outcomes in a spatially explicit manner. In a case study in the Swiss Alps, we demonstrate that mapping uncertainties in ecosystem services assessments provides key information for decision-makers seeking critical areas in the delivery of ecosystem services. The results suggest not only that the total value of the bundle of ecosystem services is highly dependent on uncertainties, but also that the spatial pattern of the ecosystem services values changes substantially when uncertainties are considered. This is particularly important for the long-term management of mountain forest ecosystems, which have long rotation stands and are highly sensitive to pressing climate and socio-economic changes. Copyright © 2012 Elsevier Ltd. All rights reserved.
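The core mechanism, propagating probabilities through a network node and mapping the residual uncertainty per cell, can be sketched with a deliberately tiny example. Everything below (the single driver, the states, and all probability values) is hypothetical and far simpler than the study's networks.

```python
import numpy as np

# Per-cell prior probability that forest density is high (four example cells).
p_forest_high = np.array([0.9, 0.6, 0.5, 0.1])

# Conditional probability table: P(protection service = high | forest state).
p_service_given_forest = np.array([0.85, 0.20])  # [forest high, forest low]

# Marginalize the network per cell: P(high) = sum over forest states.
p_high = (p_forest_high * p_service_given_forest[0]
          + (1 - p_forest_high) * p_service_given_forest[1])

# Binary entropy of the outcome as a per-cell uncertainty map (max at p = 0.5).
eps = 1e-12
uncertainty = -(p_high * np.log2(p_high + eps)
                + (1 - p_high) * np.log2(1 - p_high + eps))
print(p_high.round(3))
print(uncertainty.round(3))
```

The cell whose service probability lands closest to 0.5 receives the highest entropy; mapping that value alongside the service value itself is the kind of paired output the study argues decision-makers need.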
Crown fuel spatial variability and predictability of fire spread
Russell A. Parsons; Jeremy Sauer; Rodman R. Linn
2010-01-01
Fire behavior predictions, as well as measures of uncertainty in those predictions, are essential in operational and strategic fire management decisions. While it is becoming common practice to assess uncertainty in fire behavior predictions arising from variability in weather inputs, uncertainty arising from the fire models themselves is difficult to assess. This is...
Gissi, Elena; Menegon, Stefano; Sarretta, Alessandro; Appiotti, Federica; Maragno, Denis; Vianello, Andrea; Depellegrin, Daniel; Venier, Chiara; Barbanti, Andrea
2017-01-01
Maritime spatial planning (MSP) is envisaged as a tool to apply an ecosystem-based approach to the marine and coastal realms, aiming at ensuring that the collective pressure of human activities is kept within acceptable limits. Cumulative impacts (CI) assessment can support science-based MSP by clarifying the existing and potential impacts of human uses on the marine environment. A CI assessment includes several sources of uncertainty that can hinder the correct interpretation of its results if they are not explicitly incorporated in the decision-making process. This study proposes a three-level methodology to perform a general uncertainty analysis integrated with the CI assessment for MSP, applied to the Adriatic and Ionian Region (AIR). We describe the nature and level of uncertainty with the help of expert judgement and elicitation to include all of the possible sources of uncertainty related to the CI model, with assumptions and gaps related to the case-based MSP process in the AIR. Next, we use the results to tailor the global uncertainty analysis to spatially describe the uncertainty distribution and the variations of the CI scores dependent on the CI model factors. The results show the variability of the uncertainty in the AIR, with only limited portions robustly identified as the most or the least impacted areas under multiple model-factor hypotheses. The results are discussed in terms of the level and type of reliable information and the insights they provide for decision-making. The most significant uncertainty factors are identified to facilitate the adaptive MSP process and to establish research priorities to fill knowledge gaps for subsequent planning cycles. The method aims to depict the potential CI effects, as well as the extent and spatial variation of the data and scientific uncertainty; it therefore constitutes a suitable tool to inform the potential application of the precautionary principle in MSP.
NASA Astrophysics Data System (ADS)
Sawicka, K.; Breuer, L.; Houska, T.; Santabarbara Ruiz, I.; Heuvelink, G. B. M.
2016-12-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation analysis and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation starting from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo techniques, as well as several uncertainty visualization functions. Here we will demonstrate that the 'spup' package is an effective and easy-to-use tool that can be applied even in a very complex case study, and that it can be used in multi-disciplinary research and model-based decision support. As an example, we use the ecological LandscapeDNDC model to analyse the propagation of uncertainties associated with spatial variability of the model driving forces such as rainfall, nitrogen deposition and fertilizer inputs. The uncertainty propagation is analysed for the prediction of emissions of N2O and CO2 for a German low mountainous, agriculturally developed catchment. The study tests the effect of spatial correlations on spatially aggregated model outputs, and can serve as guidance for developing best management practices and model improvement strategies.
Stelzenmüller, V; Lee, J; Garnacho, E; Rogers, S I
2010-10-01
For the UK continental shelf, we developed a Bayesian Belief Network (BN)-GIS framework to visualise relationships between cumulative human pressures, sensitive marine landscapes and landscape vulnerability, to assess the consequences of potential marine planning objectives, and to map uncertainty-related changes in management measures. Results revealed that the spatial assessment of footprints and intensities of human activities had more influence on landscape vulnerabilities than the type of landscape sensitivity measure used. We addressed questions regarding the consequences of potential planning targets and the necessary management measures, with a spatially explicit assessment of their effects. We conclude that the BN-GIS framework is a practical tool that allows for the visualisation of relationships, the spatial assessment of uncertainty related to spatial management scenarios, and the engagement of different stakeholder views, and that enables quick updates with new spatial data and relationships. Ultimately, such BN-GIS based tools can support the decision-making process used in adaptive marine management. Copyright © 2010 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Devendran, A. A.; Lakshmanan, G.
2014-11-01
Data quality for GIS processing and analysis is a growing concern due to the accelerated application of GIS technology in problem solving and decision making. Uncertainty in the geographic representation of the real world arises because these representations are incomplete. Identifying the sources of these uncertainties and the ways in which they operate in GIS-based representations is crucial for any spatial data representation and geospatial analysis. This paper reviews the literature on the various components of spatial data quality and the uncertainties inherent in them, with special focus on two fields of application: urban simulation and hydrological modelling. Urban growth is a complicated process involving spatio-temporal changes of all socio-economic and physical components at different scales. The cellular automata (CA) model is one such simulation model: it randomly selects potential cells for urbanisation, and transition rules evaluate the properties of each cell and its neighbours. Uncertainty arising from CA modelling is assessed mainly using sensitivity analysis, including the Monte Carlo simulation method. Likewise, the importance of hydrological uncertainty analysis has been emphasized in recent years, and there is an urgent need to incorporate uncertainty estimation into water resources assessment procedures. The Soil and Water Assessment Tool (SWAT) is a continuous-time watershed model to evaluate various impacts of land use management and climate on hydrology and water quality. Hydrological model uncertainties in SWAT are addressed primarily with the Generalized Likelihood Uncertainty Estimation (GLUE) method.
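The GLUE procedure mentioned above can be sketched with a toy rainfall-runoff model (a linear stand-in, not SWAT): sample parameter sets from broad priors, score each with an informal likelihood such as the Nash-Sutcliffe efficiency, keep the "behavioural" sets above a threshold, and form likelihood-weighted predictions. All data, parameters and thresholds below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "hydrological" model: runoff as a linear function of rainfall.
def model(rain, a, b):
    return a * rain + b

rain = np.linspace(0, 10, 20)
observed = model(rain, 0.6, 1.0) + rng.normal(0, 0.3, rain.size)  # synthetic obs

# 1. Sample parameter sets from broad uniform priors.
a_s = rng.uniform(0, 2, 5000)
b_s = rng.uniform(-2, 4, 5000)
sims = a_s[:, None] * rain[None, :] + b_s[:, None]

# 2. Informal likelihood: Nash-Sutcliffe efficiency per parameter set.
sse = ((sims - observed) ** 2).sum(axis=1)
nse = 1 - sse / ((observed - observed.mean()) ** 2).sum()

# 3. Retain "behavioural" sets above a threshold and normalize the weights.
behavioural = nse > 0.8
weights = nse[behavioural] / nse[behavioural].sum()

# 4. Likelihood-weighted median prediction from the behavioural ensemble.
pred = sims[behavioural]
order = np.argsort(pred, axis=0)
median = np.array([np.interp(0.5,
                             np.cumsum(weights[order[:, i]]),
                             pred[order[:, i], i])
                   for i in range(rain.size)])
print(behavioural.sum(), "behavioural parameter sets")
```

The same machinery yields prediction bounds (e.g. weighted 5th/95th percentiles), which is how GLUE expresses the equifinality of many acceptable parameter sets.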
NASA Astrophysics Data System (ADS)
Szatmári, Gábor; Pásztor, László
2016-04-01
Uncertainty is a general term expressing our acknowledged, imperfect knowledge in describing an environmental process (Bárdossy and Fodor, 2004). Sampling, laboratory measurements, models and so on are all subject to uncertainty. Effective quantification and visualization of uncertainty are indispensable to stakeholders (e.g. policy makers, society). Soil-related features and their spatial models are particularly important targets for uncertainty assessment because the resulting inferences are further used in modelling and decision-making processes. The aim of our present study was to assess and effectively visualize the local uncertainty of the countrywide soil organic matter (SOM) spatial distribution model of Hungary using geostatistical tools and concepts. The Hungarian Soil Information and Monitoring System's SOM data (approximately 1,200 observations) and environment-related, spatially exhaustive secondary information (i.e. digital elevation model, climatic maps, MODIS satellite images and geological map) were used to model the countrywide SOM spatial distribution by regression kriging. It would be common to use the calculated estimation (or kriging) variance as a measure of uncertainty; however, the normality and homoscedasticity hypotheses had to be rejected according to our preliminary analysis of the data. Therefore, a normal score transformation and a sequential stochastic simulation approach were introduced to model and assess the local uncertainty. Five hundred equally probable realizations (i.e. stochastic images) were generated. This number of stochastic images is sufficient to provide a model of uncertainty at each location, which is a complete description of uncertainty in geostatistics (Deutsch and Journel, 1998). Furthermore, these models can be applied, e.g., to contour the probability of any event; such maps can be regarded as goal-oriented digital soil maps and are of interest for agricultural management and decision making as well.
A standardized measure of the local entropy was used to visualize uncertainty, where entropy values close to 1 correspond to high uncertainty, whilst values close to 0 correspond to low uncertainty. The advantage of local entropy in this context is that it combines probabilities from multiple ensemble members into a single number for each location of the model. In conclusion, it is straightforward to use a sequential stochastic simulation approach for the assessment of uncertainty when the normality and homoscedasticity assumptions are violated. The visualization of uncertainty using local entropy is effective and communicative to stakeholders because it represents the uncertainty as a single number on a [0, 1] scale. References: Bárdossy, Gy. & Fodor, J., 2004. Evaluation of Uncertainties and Risks in Geology. Springer-Verlag, Berlin Heidelberg. Deutsch, C.V. & Journel, A.G., 1998. GSLIB: Geostatistical Software Library and User's Guide. Oxford University Press, New York. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).
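The standardized local entropy described above can be computed directly from a stack of simulated realizations. The sketch below uses hypothetical data and equal-interval classes (not the study's setup): realizations are discretized, class frequencies give per-location probabilities, and the entropy is normalized to [0, 1].

```python
import numpy as np

def local_entropy(realizations, n_classes=5):
    """Standardized local entropy per location from simulation realizations.

    realizations: array of shape (n_sims, n_locations).
    Returns values in [0, 1]: 0 where all simulations agree on one class,
    1 where they spread evenly over all classes.
    """
    n_sims, n_loc = realizations.shape
    breaks = np.linspace(realizations.min(), realizations.max(),
                         n_classes + 1)[1:-1]
    classes = np.digitize(realizations, breaks)
    entropy = np.empty(n_loc)
    for i in range(n_loc):
        p = np.bincount(classes[:, i], minlength=n_classes) / n_sims
        p = p[p > 0]  # 0 * log(0) is taken as 0
        entropy[i] = -(p * np.log(p)).sum() / np.log(n_classes)
    return entropy

rng = np.random.default_rng(0)
agreeing = 3.0 + rng.normal(0, 0.01, (500, 1))  # realizations agree here
spread = rng.uniform(0, 10, (500, 1))           # realizations disagree here
print(local_entropy(np.hstack([agreeing, spread])).round(2))
```

The first location, where all realizations fall in one class, maps to an entropy near 0; the second, where realizations scatter over all classes, maps to a value near 1, matching the interpretation given in the abstract.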
NASA Astrophysics Data System (ADS)
Rose, K.; Glosser, D.; Bauer, J. R.; Barkhurst, A.
2015-12-01
The products of spatial analyses that leverage the interpolation of sparse point data to represent continuous phenomena are often presented without clear explanations of the uncertainty associated with the interpolated values. As a result, there is frequently insufficient information provided to effectively support advanced computational analyses and individual research and policy decisions utilizing these results. This highlights the need for a reliable approach capable of quantitatively producing and communicating spatial data analyses and their inherent uncertainties for a broad range of uses. To address this need, we have developed the Variable Grid Method (VGM), and an associated Python tool, which is a flexible approach that can be applied to a variety of analyses and use case scenarios where users need a method to effectively study, evaluate, and analyze spatial trends and patterns while communicating the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization representative of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, uncertainty calculated from multiple simulations, etc. We will present examples of our research utilizing the VGM to quantify key spatial trends and patterns for subsurface data interpolations and their uncertainties, and to leverage these results to evaluate storage estimates and potential impacts associated with underground injection for CO2 storage and unconventional resource production and development. These examples identify how the VGM can provide critical information about the relationship between uncertainty and spatial data that is necessary to better support their use in advanced computational analyses and to inform research, management and policy decisions.
NASA Astrophysics Data System (ADS)
Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan
2018-03-01
GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied by considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood-susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty related to criteria weights, the spatial heterogeneity of preferences, and the risk attitude of the analyst. The approach is applied to a pilot study of Gucheng County, central China, an area heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. Therefore, the proposed approach is an improvement over the conventional deterministic method and provides a more rational, objective and unbiased tool for flood susceptibility evaluation.
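The combination of OWA with Monte Carlo sampling of criteria weights can be sketched as follows. The snippet uses hypothetical layers, weights and a made-up threshold (not the paper's data): pessimistic order weights are applied to each cell's sorted weighted criteria, the criterion weights are jittered with a Dirichlet distribution, and the output is the probability that a cell's score exceeds a susceptibility threshold.

```python
import numpy as np

rng = np.random.default_rng(7)

# Standardized criterion layers: 4 example cells (rows) x 3 criteria (cols),
# each already rescaled to [0, 1].
criteria = np.array([[0.9, 0.8, 0.7],
                     [0.2, 0.9, 0.1],
                     [0.5, 0.5, 0.5],
                     [0.1, 0.2, 0.1]])

# Pessimistic order weights: emphasise each cell's worst weighted criterion.
order_weights = np.array([0.2, 0.3, 0.5])  # applied to values sorted descending

def owa(criteria, criterion_weights, order_weights):
    """Weight the criteria, sort per cell (descending), apply order weights."""
    weighted = criteria * criterion_weights
    return -np.sort(-weighted, axis=1) @ order_weights

# Monte Carlo over uncertain criterion weights; a Dirichlet draw keeps the
# weights positive, and scaling by 3 keeps their mean at 1.
n_sim = 2000
scores = np.empty((n_sim, criteria.shape[0]))
for s in range(n_sim):
    w = rng.dirichlet([10.0, 10.0, 10.0]) * 3
    scores[s] = owa(criteria, w, order_weights)

p_susceptible = (scores > 0.5).mean(axis=0)
print(p_susceptible)
```

Cells that stay above (or below) the threshold across almost all weight draws are robust classifications; cells with intermediate probabilities are exactly where the weight uncertainty matters, which is what the paper's sensitivity analysis isolates.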
Zhu, Ming-Liang; Zhang, Qing-Hang; Lupton, Colin; Tong, Jie
2016-04-01
The measurement uncertainty of strains has been assessed in a bone analogue (sawbone), bovine trabecular bone and bone-cement interface specimens under zero load using the Digital Volume Correlation (DVC) method. The effects of sub-volume size, sample constraint and preload on the measured strain uncertainty have been examined. There is generally a trade-off between the measurement uncertainty and the spatial resolution. Suitable sub-volume sizes have been selected based on a compromise between the measurement uncertainty and the spatial resolution for the cases considered. A ratio of sub-volume size to a microstructural characteristic (Tb.Sp) was introduced to reflect a suitable spatial resolution, and the associated measurement uncertainty was assessed. Specifically, ratios between 1.6 and 4 appear to give rise to standard deviations in the measured strains between 166 and 620 με in all the cases considered, which would seem to suffice for strain analysis in pre- as well as post-yield loading regimes. A microscale finite element (μFE) model was built from the CT images of the sawbone, and the results from the μFE model and a continuum FE model were compared with those from the DVC. The strain results were found to differ significantly between the two methods at the tissue level, consistent in trend with the results found in human bones, indicating mainly a limitation of the current DVC method in mapping strains at this level. Copyright © 2015 Elsevier Ltd. All rights reserved.
Quantifying measurement uncertainty and spatial variability in the context of model evaluation
NASA Astrophysics Data System (ADS)
Choukulkar, A.; Brewer, A.; Pichugina, Y. L.; Bonin, T.; Banta, R. M.; Sandberg, S.; Weickmann, A. M.; Djalalova, I.; McCaffrey, K.; Bianco, L.; Wilczak, J. M.; Newman, J. F.; Draxl, C.; Lundquist, J. K.; Wharton, S.; Olson, J.; Kenyon, J.; Marquis, M.
2017-12-01
In an effort to improve wind forecasts for the wind energy sector, the Department of Energy and the NOAA funded the second Wind Forecast Improvement Project (WFIP2). As part of the WFIP2 field campaign, a large suite of in-situ and remote sensing instrumentation was deployed to the Columbia River Gorge in Oregon and Washington from October 2015 - March 2017. The array of instrumentation deployed included 915-MHz wind profiling radars, sodars, wind- profiling lidars, and scanning lidars. The role of these instruments was to provide wind measurements at high spatial and temporal resolution for model evaluation and improvement of model physics. To properly determine model errors, the uncertainties in instrument-model comparisons need to be quantified accurately. These uncertainties arise from several factors such as measurement uncertainty, spatial variability, and interpolation of model output to instrument locations, to name a few. In this presentation, we will introduce a formalism to quantify measurement uncertainty and spatial variability. The accuracy of this formalism will be tested using existing datasets such as the eXperimental Planetary boundary layer Instrumentation Assessment (XPIA) campaign. Finally, the uncertainties in wind measurement and the spatial variability estimates from the WFIP2 field campaign will be discussed to understand the challenges involved in model evaluation.
Russell, Robin E.; Tinsley, Karl; Erickson, Richard A.; Thogmartin, Wayne E.; Szymanski, Jennifer A.
2014-01-01
Depicting the spatial distribution of wildlife species is an important first step in developing management and conservation programs for particular species. Accurate representation of a species distribution is important for predicting the effects of climate change, land-use change, management activities, disease, and other landscape-level processes on wildlife populations. We developed models to estimate the spatial distribution of little brown bat (Myotis lucifugus) wintering populations in the United States east of the 100th meridian, based on known hibernacula locations. From these data, we developed several scenarios of wintering population counts per county that incorporated uncertainty in the spatial distribution of the hibernacula as well as uncertainty in the size of the current little brown bat population. We assessed the variability in our results arising from these uncertainties. Despite considerable uncertainty in the known locations of overwintering little brown bats in the eastern United States, we believe that models accurately depicting the effects of this uncertainty are useful for making management decisions, as these models are a coherent organization of the best available information.
NASA Astrophysics Data System (ADS)
Touhidul Mustafa, Syed Md.; Nossent, Jiri; Ghysels, Gert; Huysmans, Marijke
2017-04-01
Transient numerical groundwater flow models have been used to understand and forecast groundwater flow systems under anthropogenic and climatic effects, but the reliability of the predictions is strongly influenced by different sources of uncertainty. Hence, researchers in hydrological sciences are developing and applying methods for uncertainty quantification. Nevertheless, spatially distributed flow models pose significant challenges for parameter and spatially distributed input estimation and uncertainty quantification. In this study, we present a general and flexible approach for input and parameter estimation and uncertainty analysis of groundwater models. The proposed approach combines a fully distributed groundwater flow model (MODFLOW) with the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm. To avoid over-parameterization, the uncertainty of the spatially distributed model input has been represented by multipliers. The posterior distributions of these multipliers and the regular model parameters were estimated using DREAM. The proposed methodology has been applied to an overexploited aquifer in Bangladesh where groundwater pumping and recharge data are highly uncertain. The results confirm that input uncertainty does have a considerable effect on the model predictions and parameter distributions. Additionally, our approach provides a new way to optimize the spatially distributed recharge and pumping data along with the parameter values under uncertain input conditions. We conclude that considering model input uncertainty along with parameter uncertainty is important for obtaining realistic model predictions and a correct estimation of the uncertainty bounds.
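The multiplier idea can be illustrated with a drastically simplified analogue: a one-line "groundwater model" replaces MODFLOW, and a plain Metropolis sampler stands in for DREAM (which is an adaptive multi-chain generalization of the same Markov chain Monte Carlo principle). The model, data, and priors below are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy model: head rise proportional to recharge over a known conductivity;
# the reported recharge series is scaled by one uncertain multiplier.
recharge_reported = np.array([1.0, 1.2, 0.8, 1.1])
conductivity = 2.0

def head(multiplier):
    return multiplier * recharge_reported / conductivity

obs = head(0.7) + rng.normal(0, 0.01, recharge_reported.size)  # synthetic obs

def log_post(m):
    """Gaussian likelihood with a flat prior on the multiplier."""
    if not 0.1 < m < 2.0:
        return -np.inf
    resid = obs - head(m)
    return -0.5 * np.sum(resid ** 2) / 0.01 ** 2

# Plain Metropolis sampling of the multiplier's posterior.
m, lp = 1.0, log_post(1.0)
chain = []
for _ in range(20000):
    prop = m + rng.normal(0, 0.02)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        m, lp = prop, lp_prop
    chain.append(m)
posterior = np.array(chain[5000:])  # discard burn-in
print(posterior.mean(), posterior.std())
```

The posterior concentrates near the true multiplier (0.7 here), and its spread quantifies the input uncertainty; the study does the analogous estimation jointly for several multipliers and regular parameters with DREAM.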
NASA Astrophysics Data System (ADS)
Feyen, Luc; Caers, Jef
2006-06-01
In this work, we address the problem of characterizing the heterogeneity and uncertainty of hydraulic properties for complex geological settings. Here, we distinguish between two scales of heterogeneity, namely the hydrofacies structure and the intrafacies variability of the hydraulic properties. We employ multiple-point geostatistics to characterize the hydrofacies architecture. The multiple-point statistics are borrowed from a training image that is designed to reflect the prior geological conceptualization. The intrafacies variability of the hydraulic properties is represented using conventional two-point correlation methods, more precisely, spatial covariance models under a multi-Gaussian spatial law. We address the different levels and sources of uncertainty in characterizing the subsurface heterogeneity, and explore their effect on groundwater flow and transport predictions. Typically, uncertainty is assessed by way of many images, termed realizations, of a fixed statistical model. However, in many cases, sampling from a fixed stochastic model does not adequately represent the space of uncertainty: it neglects the uncertainty related to the selection of the stochastic model and the estimation of its input parameters. We acknowledge the uncertainty inherent in the definition of the prior conceptual model of aquifer architecture and in the estimation of global statistics, anisotropy, and correlation scales. Spatial bootstrap is used to assess the uncertainty of the unknown statistical parameters. As an illustrative example, we employ a synthetic field that represents a fluvial setting consisting of an interconnected network of channel sands embedded within finer-grained floodplain material. For this highly non-stationary setting, we quantify the groundwater flow and transport model prediction uncertainty for various levels of hydrogeological uncertainty.
Results indicate the importance of accurately describing the facies geometry, especially for transport predictions.
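The spatial bootstrap used for the statistical parameters can be illustrated in one dimension with a moving-block bootstrap, a simplified stand-in for its spatial counterpart: resampling contiguous blocks preserves short-range correlation, so the estimated standard error of a statistic honestly exceeds the naive i.i.d. value. All data and block sizes below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic 1-D transect of a spatially correlated property: a moving
# average of white noise gives short-range correlation.
noise = rng.normal(0, 1, 1100)
field = np.convolve(noise, np.ones(10) / 10, mode="valid")[:1000] + 5.0

def block_bootstrap_means(series, block, n_boot, rng):
    """Resample contiguous blocks so short-range correlation is preserved,
    returning the bootstrap sampling distribution of the mean."""
    n = series.size
    k = n // block                      # blocks per bootstrap sample
    means = np.empty(n_boot)
    for b in range(n_boot):
        starts = rng.integers(0, n - block, k)
        sample = np.concatenate([series[s:s + block] for s in starts])
        means[b] = sample.mean()
    return means

naive_se = field.std(ddof=1) / np.sqrt(field.size)  # pretends i.i.d. data
boot_se = block_bootstrap_means(field, block=50, n_boot=500, rng=rng).std(ddof=1)
print(naive_se, boot_se)
```

Under positive spatial correlation the bootstrap standard error is substantially larger than the naive one, which is why parameter uncertainty estimated this way widens, rather than artificially narrows, the space of realizations.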
Sun, Guibo; Webster, Chris; Ni, Michael Y; Zhang, Xiaohu
2018-05-07
Uncertainty with respect to built environment (BE) data collection, measure conceptualization and spatial scales is evident in urban health research, but most findings are from relatively low-density contexts. We selected Hong Kong, an iconic high-density city, as the study area, as limited research has been conducted on uncertainty in such areas. We used geocoded home addresses (n=5732) from a large population-based cohort in Hong Kong to extract BE measures for the participants' place of residence based on an internationally recognized BE framework. Variability of the measures was mapped and Spearman's rank correlation calculated to assess how well the relationships among indicators are preserved across variables and spatial scales. We found extreme variations and uncertainties for the 180 measures collected using comprehensive data and advanced geographic information systems modelling techniques. We highlight the implications of methodological selection and spatial scales of the measures. The results suggest that more robust information regarding urban health research in high-density cities would emerge if greater consideration were given to BE data, design methods and spatial scales of the BE measures.
'spup' - an R package for uncertainty propagation in spatial environmental modelling
NASA Astrophysics Data System (ADS)
Sawicka, Kasia; Heuvelink, Gerard
2016-04-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable, including to case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation from input data and model parameters, via the environmental model, to model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as input to the environmental models called from R, or externally.
Selected static and interactive visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy-to-apply tool that can be used in multi-disciplinary research and model-based decision support.
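The Monte Carlo propagation workflow described above (sample uncertain inputs, run the model, summarize the output distribution) can be sketched generically. The snippet below is a Python illustration, not the package's R API; the Latin hypercube sampler is a minimal hand-rolled version, and the toy model and input ranges are assumptions.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """n stratified samples in d dimensions on [0, 1): one sample per
    stratum along every axis, strata randomly paired across axes."""
    strata = np.stack([rng.permutation(n) for _ in range(d)], axis=1)
    return (rng.random((n, d)) + strata) / n

def propagate(model, bounds, n=1000, seed=0):
    """Monte Carlo uncertainty propagation with uniform input marginals."""
    rng = np.random.default_rng(seed)
    u = latin_hypercube(n, len(bounds), rng)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    x = lo + u * (hi - lo)            # scale unit hypercube to input ranges
    return np.array([model(*row) for row in x])

# Toy 'environmental model': runoff = coeff * rain - loss (units arbitrary)
out = propagate(lambda c, r, l: c * r - l,
                bounds=[(0.4, 0.6), (80.0, 120.0), (5.0, 15.0)], n=2000)
q05, q50, q95 = np.percentile(out, [5, 50, 95])  # prediction uncertainty band
```

The percentile band plays the role of the uncertainty summary that the package's visualization functions would present to a decision maker.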
'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling
NASA Astrophysics Data System (ADS)
Sawicka, Kasia; Heuvelink, Gerard
2017-04-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable and able to deal with case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation from input data and model parameters, via the environmental model, to model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as input to the environmental models called from R, or externally.
Selected visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy-to-apply tool that can be used in multi-disciplinary research and model-based decision support.
Zajac, Zuzanna; Stith, Bradley M.; Bowling, Andrea C.; Langtimm, Catherine A.; Swain, Eric D.
2015-01-01
Habitat suitability index (HSI) models are commonly used to predict habitat quality and species distributions, to develop biological surveys, to assess reserve and management priorities, and to anticipate possible change under different management or climate change scenarios. Important management decisions may be based on model results, often without a clear understanding of the level of uncertainty associated with model outputs. We present an integrated methodology to assess the propagation of uncertainty from both inputs and structure of the HSI models on model outputs (uncertainty analysis: UA) and the relative importance of uncertain model inputs and their interactions on the model output uncertainty (global sensitivity analysis: GSA). We illustrate the GSA/UA framework using simulated hydrology input data from a hydrodynamic model representing sea level changes and HSI models for two species of submerged aquatic vegetation (SAV) in southwest Everglades National Park: Vallisneria americana (tape grass) and Halodule wrightii (shoal grass). We found considerable spatial variation in uncertainty for both species, but distributions of HSI scores still allowed discrimination of sites with good versus poor conditions. Ranking of input parameter sensitivities also varied spatially for both species, with high habitat quality sites showing higher sensitivity to different parameters than low-quality sites. HSI models may be especially useful when species distribution data are unavailable, providing a means of exploiting widely available environmental datasets to model past, current, and future habitat conditions. The GSA/UA approach provides a general method for better understanding HSI model dynamics, the spatial and temporal variation in uncertainties, and the parameters that contribute most to model uncertainty.
Including an uncertainty and sensitivity analysis in modeling efforts as part of the decision-making framework will result in better-informed, more robust decisions.
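A variance-based global sensitivity analysis of the kind used in the GSA step can be sketched with a Saltelli-style first-order estimator: each input's index measures the share of output variance explained by that input alone. The toy linear model below is hypothetical, not the SAV HSI model.

```python
import numpy as np

def sobol_first_order(model, d, n=4096, seed=0):
    """Crude Saltelli estimator of first-order Sobol indices for a model
    with d independent U(0,1) inputs (a sketch, not production GSA)."""
    rng = np.random.default_rng(seed)
    A, B = rng.random((n, d)), rng.random((n, d))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))   # total output variance
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                  # swap in column i from B
        S[i] = np.mean(fB * (model(ABi) - fA)) / var
    return S

# Toy habitat-suitability model: output dominated by input 0
model = lambda X: 4.0 * X[:, 0] + 1.0 * X[:, 1] + 0.1 * X[:, 2]
S = sobol_first_order(model, d=3)   # per-input first-order indices
```

Ranking `S` per site, as in the study, reveals which uncertain input drives output uncertainty at that location.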
Chander, G.; Helder, D.L.; Aaron, David; Mishra, N.; Shrestha, A.K.
2013-01-01
Cross-calibration of satellite sensors permits the quantitative comparison of measurements obtained from different Earth Observing (EO) systems. Cross-calibration studies usually use simultaneous or near-simultaneous observations from several spaceborne sensors to develop band-by-band relationships through regression analysis. The investigation described in this paper focuses on evaluation of the uncertainties inherent in the cross-calibration process, including contributions due to different spectral responses, spectral resolution, spectral filter shift, geometric misregistrations, and spatial resolutions. The hyperspectral data from the Environmental Satellite SCanning Imaging Absorption SpectroMeter for Atmospheric CartograpHY and the EO-1 Hyperion, along with the relative spectral responses (RSRs) from the Landsat 7 Enhanced Thematic Mapper (TM) Plus and the Terra Moderate Resolution Imaging Spectroradiometer sensors, were used for the spectral uncertainty study. The data from Landsat 5 TM over five representative land cover types (desert, rangeland, grassland, deciduous forest, and coniferous forest) were used for the geometric misregistrations and spatial-resolution study. The spectral resolution uncertainty was found to be within 0.25%, spectral filter shift within 2.5%, geometric misregistrations within 0.35%, and spatial-resolution effects within 0.1% for the Libya 4 site. The one-sigma uncertainties presented in this paper are uncorrelated, and therefore, the uncertainties can be summed orthogonally. Furthermore, an overall total uncertainty was developed. In general, the results suggested that the spectral uncertainty is more dominant compared to other uncertainties presented in this paper. Therefore, the effect of the sensor RSR differences needs to be quantified and compensated to avoid large uncertainties in cross-calibration results.
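Because the one-sigma component uncertainties are uncorrelated, they combine in quadrature (root-sum-of-squares). A minimal check using the component values quoted above for the Libya 4 site:

```python
import math

# One-sigma component uncertainties (percent), assumed uncorrelated,
# so they sum orthogonally (root-sum-of-squares).
components = {"spectral resolution": 0.25,
              "spectral filter shift": 2.5,
              "geometric misregistration": 0.35,
              "spatial resolution": 0.1}
total = math.sqrt(sum(u ** 2 for u in components.values()))  # combined 1-sigma
```

The quadrature sum is dominated by the largest term, which is why the spectral (filter shift) uncertainty controls the combined cross-calibration uncertainty.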
NASA Technical Reports Server (NTRS)
Li, Jing; Li, Xichen; Carlson, Barbara E.; Kahn, Ralph A.; Lacis, Andrew A.; Dubovik, Oleg; Nakajima, Teruyuki
2016-01-01
Various space-based sensors have been designed and corresponding algorithms developed to retrieve aerosol optical depth (AOD), the most basic aerosol optical property, yet considerable disagreement still exists across these different satellite data sets. Surface-based observations aim to provide ground truth for validating satellite data; hence, their deployment locations should preferably contain as much spatial information as possible, i.e., high spatial representativeness. Using a novel Ensemble Kalman Filter (EnKF)-based approach, we objectively evaluate the spatial representativeness of current Aerosol Robotic Network (AERONET) sites. Multisensor monthly mean AOD data sets from the Moderate Resolution Imaging Spectroradiometer, Multiangle Imaging Spectroradiometer, Sea-viewing Wide Field-of-view Sensor, Ozone Monitoring Instrument, and Polarization and Anisotropy of Reflectances for Atmospheric Sciences coupled with Observations from a Lidar are combined into a 605-member ensemble, and AERONET data are considered as the observations to be assimilated into this ensemble using the EnKF. The assessment is made by comparing the analysis error variance (which has been constrained by ground-based measurements) with the background error variance (based on satellite data alone). Results show that the total uncertainty is reduced by approximately 27% on average and can exceed 50% in some places. The uncertainty reduction also has distinct seasonal patterns, corresponding to the spatial distribution of seasonally varying aerosol types, such as dust in the spring for the Northern Hemisphere and biomass burning in the fall for the Southern Hemisphere. Dust and biomass burning sites have the highest spatial representativeness, rural and oceanic sites represent moderate spatial information, whereas the representativeness of urban sites is relatively localized.
A spatial score ranging from 1 to 3 is assigned to each AERONET site based on the uncertainty reduction, indicating its representativeness level.
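The core of the representativeness score, the reduction from background to analysis error variance, can be illustrated with a scalar Kalman update. This is a drastic simplification of the 605-member EnKF, and the variances below are hypothetical.

```python
def variance_reduction(pb, r):
    """Fractional reduction of background error variance pb after
    assimilating one observation with error variance r. For a scalar
    Kalman update, pa = (1/pb + 1/r)^-1, so reduction = pb / (pb + r)."""
    pa = 1.0 / (1.0 / pb + 1.0 / r)   # analysis error variance
    return 1.0 - pa / pb

# A precise, representative ground site (small r) constrains the satellite
# ensemble more strongly than a noisy or poorly representative one.
red_precise = variance_reduction(pb=1.0, r=0.25)   # large reduction
red_noisy = variance_reduction(pb=1.0, r=4.0)      # small reduction
```

Averaging such reductions spatially yields the kind of per-site score the study maps and bins into its 1-to-3 representativeness levels.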
NASA Astrophysics Data System (ADS)
Haas, Edwin; Santabarbara, Ignacio; Kiese, Ralf; Butterbach-Bahl, Klaus
2017-04-01
Numerical simulation models are increasingly used to estimate greenhouse gas emissions at site to regional / national scales and are outlined as the most advanced methodology (Tier 3) in the framework of UNFCCC reporting. Process-based models incorporate the major processes of the carbon and nitrogen cycle of terrestrial ecosystems and are thus thought to be widely applicable at various conditions and spatial scales. Process-based modelling requires high spatial resolution input data on soil properties, climate drivers and management information. The acceptance of model-based inventory calculations depends on the assessment of the inventory's uncertainty (model, input data and parameter induced uncertainties). In this study we fully quantify the uncertainty in modelling soil N2O and NO emissions from arable, grassland and forest soils using the biogeochemical model LandscapeDNDC. We address model-induced uncertainty (MU) by contrasting two different soil biogeochemistry modules within LandscapeDNDC. The parameter-induced uncertainty (PU) was assessed by using joint parameter distributions for key parameters describing microbial C and N turnover processes, as obtained by different Bayesian calibration studies for each model configuration. Input data induced uncertainty (DU) was addressed by Bayesian calibration of soil properties, climate drivers and agricultural management practices data. For the MU, DU and PU we performed several hundred simulations each to contribute to the individual uncertainty assessment. For the overall uncertainty quantification we assessed the model prediction probability across sampled sets of input datasets and parameter distributions. Statistical analysis of the simulation results has been used to quantify the overall full uncertainty of the modelling approach. With this study we can contrast the variation in model results with the different sources of uncertainty for each ecosystem.
Further, we performed a full uncertainty analysis for modelling N2O and NO emissions from arable, grassland and forest soils, which is necessary for the interpretability of modelling results. We have applied the methodology to a regional inventory to assess the overall modelling uncertainty for a regional N2O and NO emissions inventory for the state of Saxony, Germany.
Beekhuizen, Johan; Heuvelink, Gerard B M; Huss, Anke; Bürgi, Alfred; Kromhout, Hans; Vermeulen, Roel
2014-11-01
With the increased availability of spatial data and computing power, spatial prediction approaches have become a standard tool for exposure assessment in environmental epidemiology. However, such models are largely dependent on accurate input data. Uncertainties in the input data can therefore have a large effect on model predictions, but are rarely quantified. With Monte Carlo simulation we assessed the effect of input uncertainty on the prediction of radio-frequency electromagnetic fields (RF-EMF) from mobile phone base stations at 252 receptor sites in Amsterdam, The Netherlands. The impact on ranking and classification was determined by computing the Spearman correlations and weighted Cohen's kappas (based on tertiles of the RF-EMF exposure distribution) between modelled values and RF-EMF measurements performed at the receptor sites. The uncertainty in modelled RF-EMF levels was large, with a median coefficient of variation of 1.5. Uncertainty in receptor site height, building damping and building height contributed most to model output uncertainty. For exposure ranking and classification, the heights of buildings and receptor sites were the most important sources of uncertainty, followed by building damping and antenna and site locations. Uncertainty in antenna power, tilt, height and direction had a smaller impact on model performance. We quantified the effect of input data uncertainty on the prediction accuracy of an RF-EMF environmental exposure model, thereby identifying the most important sources of uncertainty and estimating the total uncertainty stemming from potential errors in the input data. This approach can be used to optimize the model and better interpret model output. Copyright © 2014 Elsevier Inc. All rights reserved.
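The ranking and classification comparison can be sketched as follows. The synthetic exposure values below stand in for the Amsterdam measurements, the no-ties Spearman implementation is a simplification, and the weighted kappa uses linear weights over tertile classes.

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation (assumes no ties, for brevity)."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

def weighted_kappa(a, b, k):
    """Linearly weighted Cohen's kappa for class labels 0..k-1."""
    O = np.zeros((k, k))
    for i, j in zip(a, b):
        O[i, j] += 1
    O /= O.sum()                               # observed proportions
    E = np.outer(O.sum(1), O.sum(0))           # expected under independence
    W = np.abs(np.subtract.outer(np.arange(k), np.arange(k)))
    return 1.0 - (W * O).sum() / (W * E).sum()

# Synthetic modelled vs measured exposures (multiplicative model noise)
rng = np.random.default_rng(0)
measured = rng.lognormal(size=300)
modelled = measured * rng.lognormal(sigma=0.3, size=300)

rho = spearman(measured, modelled)
tert = lambda v: np.digitize(v, np.quantile(v, [1 / 3, 2 / 3]))  # tertile class
kappa = weighted_kappa(tert(measured), tert(modelled), k=3)
```

Repeating this over Monte Carlo input realizations shows how much each input's uncertainty degrades ranking (`rho`) and tertile classification (`kappa`).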
Mutel, Christopher L; Pfister, Stephan; Hellweg, Stefanie
2012-01-17
We describe a new methodology for performing regionalized life cycle assessment and systematically choosing the spatial scale of regionalized impact assessment methods. We extend standard matrix-based calculations to include matrices that describe the mapping from inventory to impact assessment spatial supports. Uncertainty in inventory spatial data is modeled using a discrete spatial distribution function, which in a case study is derived from empirical data. The minimization of global spatial autocorrelation is used to choose the optimal spatial scale of impact assessment methods. We demonstrate these techniques on electricity production in the United States, using regionalized impact assessment methods for air emissions and freshwater consumption. Case study results show important differences between site-generic and regionalized calculations, and provide specific guidance for future improvements of inventory data sets and impact assessment methods.
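Global spatial autocorrelation, whose minimization is used here to choose the impact assessment scale, is commonly measured with Moran's I. A minimal sketch with a hypothetical chain of adjacent regions (not the study's US electricity data):

```python
import numpy as np

def morans_i(values, weight):
    """Global Moran's I; weight[i, j] is the spatial weight between
    units i and j (zero diagonal). Positive I: similar values cluster."""
    x = values - values.mean()
    n, w_sum = len(values), weight.sum()
    return (n / w_sum) * (x @ weight @ x) / (x @ x)

# Adjacency weights for a 1-D chain of 50 regions
n = 50
W = np.zeros((n, n))
idx = np.arange(n - 1)
W[idx, idx + 1] = W[idx + 1, idx] = 1.0

smooth = np.cumsum(np.random.default_rng(2).normal(size=n))  # autocorrelated
rough = np.random.default_rng(3).normal(size=n)              # ~independent
I_smooth, I_rough = morans_i(smooth, W), morans_i(rough, W)
```

Choosing the coarsest scale at which a characterization factor's Moran's I is near zero, as the methodology proposes, avoids splitting spatially redundant regions.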
NASA Astrophysics Data System (ADS)
Sharma, A.; Woldemeskel, F. M.; Sivakumar, B.; Mehrotra, R.
2014-12-01
We outline a new framework for assessing uncertainties in model simulations, be they hydro-ecological simulations for known scenarios, or climate simulations for assumed scenarios representing the future. This framework is illustrated here using GCM projections for future climates for hydrologically relevant variables (precipitation and temperature), with the uncertainty segregated into three dominant components - model uncertainty, scenario uncertainty (representing greenhouse gas emission scenarios), and ensemble uncertainty (representing uncertain initial conditions and states). A novel uncertainty metric, the Square Root Error Variance (SREV), is used to quantify the uncertainties involved. The SREV requires: (1) Interpolating raw and corrected GCM outputs to a common grid; (2) Converting these to percentiles; (3) Estimating SREV for model, scenario, initial condition and total uncertainty at each percentile; and (4) Transforming SREV to a time series. The outcome is a spatially varying series of SREVs associated with each model that can be used to assess how uncertain the system is at each simulated point or time. This framework, while illustrated in a climate change context, is applicable to the assessment of uncertainties in any modelling framework. The proposed method is applied to monthly precipitation and temperature from 6 CMIP3 and 13 CMIP5 GCMs across the world. For CMIP3, the B1, A1B and A2 scenarios are considered, whereas for CMIP5, RCP2.6, RCP4.5 and RCP8.5, representing low, medium and high emissions, are used. For both CMIP3 and CMIP5, model structure is the largest source of uncertainty, which reduces significantly after correcting for biases. Scenario uncertainty increases, especially for temperature, in the future due to divergence of the three emission scenarios analysed. While CMIP5 precipitation simulations exhibit a small reduction in total uncertainty over CMIP3, there is almost no reduction observed for temperature projections.
Estimation of uncertainty in both space and time sheds light on the spatial and temporal patterns of uncertainties in GCM outputs, providing an effective platform for risk-based assessments of any alternate plans or decisions that may be formulated using GCM simulations.
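Steps (2) and (3) of the SREV recipe can be sketched for a single grid cell. The simplified formulation below, the spread of per-model percentile values around the ensemble mean, is an assumption for illustration, not necessarily the paper's exact estimator; the synthetic ensemble is hypothetical.

```python
import numpy as np

def srev_sketch(ensemble, q=np.linspace(1, 99, 99)):
    """SREV-style metric at one grid cell: at each percentile, the
    root-mean-square deviation of each model's percentile value from
    the ensemble-mean percentile value (a simplified stand-in)."""
    # ensemble: (n_models, n_time) array of simulated values
    pv = np.percentile(ensemble, q, axis=1)            # (n_q, n_models)
    dev = pv - pv.mean(axis=1, keepdims=True)          # deviation per model
    return np.sqrt(np.mean(dev ** 2, axis=1))          # one value per percentile

# Synthetic 8-model ensemble with model-to-model offsets (sd ~ 2)
rng = np.random.default_rng(4)
offsets = rng.normal(0.0, 2.0, size=(8, 1))
models = rng.normal(loc=offsets, size=(8, 240))        # 20 years of months
s = srev_sketch(models)                                # SREV per percentile
```

Mapping `s` over grid cells and time, per component (model, scenario, initial condition), gives the spatially varying uncertainty series the framework describes.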
Identifying and assessing critical uncertainty thresholds in a forest pest risk model
Frank H. Koch; Denys Yemshanov
2015-01-01
Pest risk maps can provide helpful decision support for invasive alien species management, but often fail to address adequately the uncertainty associated with their predicted risk values. This chapter explores how increased uncertainty in a risk model's numeric assumptions (i.e. its principal parameters) might affect the resulting risk map. We used a spatial...
NASA Astrophysics Data System (ADS)
Raleigh, M. S.; Smyth, E.; Small, E. E.
2017-12-01
The spatial distribution of snow water equivalent (SWE) is not sufficiently monitored with either remotely sensed or ground-based observations for water resources management. Recent applications of airborne Lidar have yielded basin-wide mapping of SWE when combined with a snow density model. However, in the absence of snow density observations, the uncertainty in these SWE maps is dominated by uncertainty in modeled snow density rather than in Lidar measurement of snow depth. Available observations tend to have a bias in physiographic regime (e.g., flat open areas) and are often insufficient in number to support testing of models across a range of conditions. Thus, there is a need for targeted sampling strategies and controlled model experiments to understand where and why different snow density models diverge. This will enable identification of robust model structures that represent dominant processes controlling snow densification, in support of basin-scale estimation of SWE with remotely-sensed snow depth datasets. The NASA SnowEx mission is a unique opportunity to evaluate sampling strategies of snow density and to quantify and reduce uncertainty in modeled snow density. In this presentation, we present initial field data analyses and modeling results over the Colorado SnowEx domain in the 2016-2017 winter campaign. We detail a framework for spatially mapping the uncertainty in snowpack density, as represented across multiple models. Leveraging the modular SUMMA model, we construct a series of physically-based models to assess systematically the importance of specific process representations to snow density estimates. We will show how models and snow pit observations characterize snow density variations with forest cover in the SnowEx domains. Finally, we will use the spatial maps of density uncertainty to evaluate the selected locations of snow pits, thereby assessing the adequacy of the sampling strategy for targeting uncertainty in modeled snow density.
Exploring prediction uncertainty of spatial data in geostatistical and machine learning Approaches
NASA Astrophysics Data System (ADS)
Klump, J. F.; Fouedjio, F.
2017-12-01
Geostatistical methods such as kriging with external drift as well as machine learning techniques such as quantile regression forest have been intensively used for modelling spatial data. In addition to providing predictions for target variables, both approaches are able to deliver a quantification of the uncertainty associated with the prediction at a target location. Geostatistical approaches are, by essence, adequate for providing such prediction uncertainties and their behaviour is well understood. However, they often require significant data pre-processing and rely on assumptions that are rarely met in practice. Machine learning algorithms such as random forest regression, on the other hand, require less data pre-processing and are non-parametric. This makes the application of machine learning algorithms to geostatistical problems an attractive proposition. The objective of this study is to compare kriging with external drift and quantile regression forest with respect to their ability to deliver reliable prediction uncertainties of spatial data. In our comparison we use both simulated and real world datasets. Apart from classical performance indicators, comparisons make use of accuracy plots, probability interval width plots, and the visual examinations of the uncertainty maps provided by the two approaches. By comparing random forest regression to kriging we found that both methods produced comparable maps of estimated values for our variables of interest. However, the measure of uncertainty provided by random forest seems to be quite different to the measure of uncertainty provided by kriging. In particular, the lack of spatial context can give misleading results in areas without ground truth data. These preliminary results raise questions about assessing the risks associated with decisions based on the predictions from geostatistical and machine learning algorithms in a spatial context, e.g. mineral exploration.
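The contrast drawn here, kriging uncertainty that grows with distance from data while forest-based uncertainty lacks spatial context, can be illustrated with a minimal ordinary kriging sketch. The exponential covariance model and its parameters below are assumed rather than fitted, and the data are synthetic.

```python
import numpy as np

def ordinary_kriging(xy, z, xy0, sill=1.0, corr_range=10.0, nugget=1e-6):
    """Ordinary kriging at points xy0 with an assumed exponential
    covariance C(h) = sill * exp(-h / corr_range)."""
    def cov(a, b):
        h = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
        return sill * np.exp(-h / corr_range)
    n = len(z)
    K = np.empty((n + 1, n + 1))
    K[:n, :n] = cov(xy, xy) + nugget * np.eye(n)  # data-data covariance
    K[n, :n] = K[:n, n] = 1.0                     # Lagrange row/col: sum(w)=1
    K[n, n] = 0.0
    k = np.vstack([cov(xy, xy0), np.ones((1, len(xy0)))])
    w = np.linalg.solve(K, k)                     # weights + multiplier
    pred = w[:n].T @ z
    var = sill - np.sum(w * k, axis=0)            # kriging variance
    return pred, var

rng = np.random.default_rng(5)
xy = rng.uniform(0, 50, size=(40, 2))
z = np.sin(xy[:, 0] / 10.0)
# One target inside the data cloud, one far outside it
pred, var = ordinary_kriging(xy, z, np.array([[25.0, 25.0], [200.0, 200.0]]))
```

The kriging variance is small inside the sampled area and approaches (or exceeds) the sill far from any sample, which is exactly the spatial context the study found missing from the random forest uncertainty measure.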
Application of data fusion modeling (DFM) to site characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Porter, D.W.; Gibbs, B.P.; Jones, W.F.
1996-01-01
Subsurface characterization is faced with substantial uncertainties because the earth is very heterogeneous, and typical data sets are fragmented and disparate. DFM removes many of the data limitations of current methods to quantify and reduce uncertainty for a variety of data types and models. DFM is a methodology to compute hydrogeological state estimates and their uncertainties from three sources of information: measured data, physical laws, and statistical models for spatial heterogeneities. The benefits of DFM are savings in time and cost through the following: the ability to update models in real time to help guide site assessment, improved quantification of uncertainty for risk assessment, and improved remedial design by quantifying the uncertainty in safety margins. A Bayesian inverse modeling approach is implemented with a Gauss-Newton method where spatial heterogeneities are viewed as Markov random fields. Information from data, physical laws, and Markov models is combined in a Square Root Information Smoother (SRIS). Estimates and uncertainties can be computed for heterogeneous hydraulic conductivity fields in multiple geological layers from the usually sparse hydraulic conductivity data and the often more plentiful head data. An application of DFM to the Old Burial Ground at the DOE Savannah River Site will be presented. DFM estimates and quantifies uncertainty in hydrogeological parameters using variably saturated flow numerical modeling to constrain the estimation. Then uncertainties are propagated through the transport modeling to quantify the uncertainty in tritium breakthrough curves at compliance points.
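The fusion of measured data, physical laws, and a spatial prior reduces, in the linear Gaussian case, to a Bayesian update of the kind iterated inside a Gauss-Newton scheme. The two-cell example below is a hypothetical sketch, not the SRIS implementation; all values are illustrative.

```python
import numpy as np

def gaussian_fusion(m0, C0, H, d, R):
    """One linear(ized) Bayesian update: prior N(m0, C0), data
    d = H m + noise with noise covariance R. Returns posterior
    mean and covariance (the building block of Gauss-Newton inversion)."""
    S = H @ C0 @ H.T + R                       # innovation covariance
    K = C0 @ H.T @ np.linalg.inv(S)            # gain
    m = m0 + K @ (d - H @ m0)                  # posterior mean
    C = C0 - K @ H @ C0                        # posterior covariance
    return m, C

# Two-cell 'conductivity' field: spatially correlated (Markov-like) prior,
# one head-like measurement that averages both cells.
m0 = np.array([0.0, 0.0])
C0 = np.array([[1.0, 0.6], [0.6, 1.0]])
H = np.array([[0.5, 0.5]])                     # measurement operator
m, C = gaussian_fusion(m0, C0, H, d=np.array([1.0]), R=np.array([[0.1]]))
```

The posterior variances on the diagonal of `C` fall below the prior's, quantifying how much the data reduced uncertainty; propagating such posteriors through a transport model yields uncertainty on breakthrough curves.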
NASA Astrophysics Data System (ADS)
Dale, Amy; Fant, Charles; Strzepek, Kenneth; Lickley, Megan; Solomon, Susan
2017-03-01
We present maize production in sub-Saharan Africa as a case study in the exploration of how uncertainties in global climate change, as reflected in projections from a range of climate model ensembles, influence climate impact assessments for agriculture. The crop model AquaCrop-OS (Food and Agriculture Organization of the United Nations) was modified to run on a 2° × 2° grid and coupled to 122 climate model projections from multi-model ensembles for three emission scenarios (Coupled Model Intercomparison Project Phase 3 [CMIP3] SRES A1B and CMIP5 Representative Concentration Pathway [RCP] scenarios 4.5 and 8.5) as well as two "within-model" ensembles (NCAR CCSM3 and ECHAM5/MPI-OM) designed to capture internal variability (i.e., uncertainty due to chaos in the climate system). In spite of high uncertainty, most notably in the high-producing semi-arid zones, we observed robust regional and sub-regional trends across all ensembles. In agreement with previous work, we project widespread yield losses in the Sahel region and Southern Africa, resilience in Central Africa, and sub-regional increases in East Africa and at the southern tip of the continent. Spatial patterns of yield losses corresponded with spatial patterns of aridity increases, which were explicitly evaluated. Internal variability was a major source of uncertainty in both within-model and between-model ensembles and explained the majority of the spatial distribution of uncertainty in yield projections. Projected climate change impacts on maize production in different regions and nations ranged from near-zero or positive (upper quartile estimates) to substantially negative (lower quartile estimates), highlighting a need for risk management strategies that are adaptive and robust to uncertainty.
NASA Technical Reports Server (NTRS)
Tang, Ling; Hossain, Faisal; Huffman, George J.
2010-01-01
Hydrologists and other users need to know the uncertainty of the satellite rainfall data sets across the range of time/space scales over the whole domain of the data set. Here, 'uncertainty' refers to the general concept of the 'deviation' of an estimate from the reference (or ground truth), where the deviation may be defined in multiple ways. This uncertainty information can provide insight to the user on the realistic limits of utility, such as hydrologic predictability, that can be achieved with these satellite rainfall data sets. However, satellite rainfall uncertainty estimation requires ground validation (GV) precipitation data. On the other hand, satellite data will be most useful over regions that lack GV data, for example developing countries. This paper addresses the open issues for developing an appropriate uncertainty transfer scheme that can routinely estimate various uncertainty metrics across the globe by leveraging a combination of spatially-dense GV data and temporally sparse surrogate (or proxy) GV data, such as the Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar and the Global Precipitation Measurement (GPM) mission Dual-Frequency Precipitation Radar. The TRMM Multi-satellite Precipitation Analysis (TMPA) products over the US spanning a record of 6 years are used as a representative example of satellite rainfall. It is shown that there exists a quantifiable spatial structure in the uncertainty of satellite data for spatial interpolation. Probabilistic analysis of sampling offered by the existing constellation of passive microwave sensors indicates that transfer of uncertainty for hydrologic applications may be effective at daily time scales or higher during the GPM era. Finally, a commonly used spatial interpolation technique (kriging), which leverages the spatial correlation of estimation uncertainty, is assessed at climatologic, seasonal, monthly and weekly timescales.
It is found that the effectiveness of kriging is sensitive to the type of uncertainty metric, the time scale of transfer and the density of GV data within the transfer domain. Transfer accuracy is lowest at weekly timescales, with the error doubling from monthly to weekly. However, at very low GV data density (<20% of the domain), the transfer accuracy is too low to show any distinction as a function of the timescale of transfer.
Vanguelova, E I; Bonifacio, E; De Vos, B; Hoosbeek, M R; Berger, T W; Vesterdal, L; Armolaitis, K; Celi, L; Dinca, L; Kjønaas, O J; Pavlenda, P; Pumpanen, J; Püttsepp, Ü; Reidy, B; Simončič, P; Tobin, B; Zhiyanski, M
2016-11-01
Spatially explicit knowledge of recent and past soil organic carbon (SOC) stocks in forests will improve our understanding of the effect of human- and non-human-induced changes on forest C fluxes. For SOC accounting, a minimum detectable difference must be defined in order to adequately determine temporal changes and spatial differences in SOC. This requires sufficiently detailed data to predict SOC stocks at appropriate scales within the required accuracy so that only significant changes are accounted for. When designing sampling campaigns, factors influencing the spatial and temporal distribution of SOC (such as soil type, topography, climate and vegetation) need to be taken into account to optimise sampling depths and numbers of samples, thereby ensuring that samples accurately reflect the distribution of SOC at a site. Furthermore, the appropriate scales related to the research question need to be defined: profile, plot, forest, catchment, national or wider. Scaling up SOC stocks from point samples to landscape units is challenging and thus requires reliable baseline data. Knowledge of the uncertainties associated with SOC measures at each particular scale, and of how to reduce them, is crucial for assessing SOC stocks with the highest possible accuracy at each scale. This review identifies where potential sources of errors and uncertainties related to forest SOC stock estimation occur at five different scales: sample, profile, plot, landscape/regional and European. Recommendations are also provided on how to reduce forest SOC uncertainties and increase the efficiency of SOC assessment at each scale.
Communicating spatial uncertainty to non-experts using R
NASA Astrophysics Data System (ADS)
Luzzi, Damiano; Sawicka, Kasia; Heuvelink, Gerard; de Bruin, Sytze
2016-04-01
Effective visualisation methods are important for the efficient use of uncertainty information by various groups of users. Uncertainty propagation analysis is often used with spatial environmental models to quantify the uncertainty within the information. A challenge arises when trying to communicate the uncertainty information effectively to non-experts (non-statisticians) in a wide range of cases. Given the growing popularity and applicability of the open-source programming language R, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. The package implements Monte Carlo algorithms for uncertainty propagation, the output of which is represented by an ensemble of model outputs (i.e. a sample from a probability distribution). Numerous visualisation methods exist that aim to present such spatial uncertainty information statically, dynamically and interactively. To provide the most universal visualisation tools for non-experts, we conducted a survey of 20 university students and assessed the effectiveness of selected static and interactive methods for visualising uncertainty in spatial variables such as DEMs and land cover. The static methods included adjacent maps and glyphs for continuous variables. Both allow for displaying maps with information about the ensemble mean, variance/standard deviation and prediction intervals. Adjacent maps were also used for categorical data, displaying maps of the most probable class, as well as its associated probability. The interactive methods included a graphical user interface which, in addition to displaying the previously mentioned variables, also allowed for comparison of joint uncertainties at multiple locations. The survey indicated that users could understand the basics of the uncertainty information displayed in the static maps, with the interactive interface allowing for more in-depth information.
Subsequently, the plotting functions that were evaluated in the survey were collated into the R package. The static visualisations were implemented via calls to the 'ggplot2' package, which gives the user control over the content, legend, colours, axes and titles. The interactive methods were implemented using the 'shiny' package, allowing users to activate the visualisation of statistical descriptions of uncertainty through interaction with a plotted map of means. This research brings uncertainty visualisation to a broader audience through the development of tools for visualising uncertainty using open-source software.
Assessing uncertainty in high-resolution spatial climate data across the US Northeast.
Bishop, Daniel A; Beier, Colin M
2013-01-01
Local and regional-scale knowledge of climate change is needed to model ecosystem responses, assess vulnerabilities and devise effective adaptation strategies. High-resolution gridded historical climate (GHC) products address this need, but come with multiple sources of uncertainty that are typically not well understood by data users. To better understand this uncertainty in a region with a complex climatology, we conducted a ground-truthing analysis of two 4 km GHC temperature products (PRISM and NRCC) for the US Northeast using 51 Cooperative Network (COOP) weather stations utilized by both GHC products. We estimated GHC prediction error for monthly temperature means and trends (1980-2009) across the US Northeast and evaluated any landscape effects (e.g., elevation, distance from coast) on those prediction errors. Results indicated that station-based prediction errors for the two GHC products were similar in magnitude, but on average, the NRCC product predicted cooler than observed temperature means and trends, while PRISM was cooler for means and warmer for trends. We found no evidence for systematic sources of uncertainty across the US Northeast, although errors were largest at high elevations. Errors in the coarse-scale (4 km) digital elevation models used by each product were correlated with temperature prediction errors, more so for NRCC than PRISM. In summary, uncertainty in spatial climate data has many sources and we recommend that data users develop an understanding of uncertainty at the appropriate scales for their purposes. To this end, we demonstrate a simple method for utilizing weather stations to assess local GHC uncertainty and inform decisions among alternative GHC products.
Ambient Ozone Exposure in Czech Forests: A GIS-Based Approach to Spatial Distribution Assessment
Hůnová, I.; Horálek, J.; Schreiberová, M.; Zapletal, M.
2012-01-01
Ambient ozone (O3) is an important phytotoxic pollutant, and detailed knowledge of its spatial distribution is becoming increasingly important. The aim of the paper is to compare different spatial interpolation techniques and to recommend the best approach for producing a reliable map of O3 with respect to its phytotoxic potential. For evaluation we used real-time ambient O3 concentrations measured by UV absorbance at 24 Czech rural sites in the 2007 and 2008 vegetation seasons. We considered eleven spatial interpolation approaches for developing maps of mean vegetation-season O3 concentrations and of the AOT40F exposure index for forests. The uncertainty of the maps was assessed by cross-validation analysis, with the root mean square error (RMSE) of the map used as the criterion. Our results indicate that the optimal interpolation approach is linear regression of O3 data on altitude with subsequent interpolation of the residuals by ordinary kriging. The relative uncertainty of the map of mean vegetation-season O3 is less than 10% using the optimal method for both explored years, which is a very acceptable value. In the case of AOT40F, however, the relative uncertainty of the map is notably worse, reaching nearly 20% in both examined years. PMID:22566757
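The regression-kriging procedure recommended above (a linear trend on altitude, followed by ordinary kriging of the residuals) can be sketched as follows. This is an illustrative numpy-only implementation under assumed settings: an exponential covariance model with hypothetical sill and range parameters, not the authors' actual geostatistical setup.

```python
import numpy as np

def regression_kriging(coords, z, alt, coords_new, alt_new,
                       sill=1.0, vrange=1.0):
    """Linear trend on altitude + ordinary kriging of the residuals."""
    # Step 1: fit the trend z ~ a + b * altitude by least squares.
    A = np.column_stack([np.ones_like(alt), alt])
    beta, *_ = np.linalg.lstsq(A, z, rcond=None)
    resid = z - A @ beta

    # Step 2: ordinary kriging of the residuals with an (assumed)
    # exponential covariance model C(h) = sill * exp(-h / vrange).
    def cov(h):
        return sill * np.exp(-h / vrange)

    n = len(z)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    K = np.empty((n + 1, n + 1))
    K[:n, :n] = cov(d)
    K[:n, n] = K[n, :n] = 1.0   # unbiasedness constraint (Lagrange row/col)
    K[n, n] = 0.0

    d0 = np.linalg.norm(coords_new[:, None, :] - coords[None, :, :], axis=-1)
    rhs = np.column_stack([cov(d0), np.ones(len(coords_new))])
    w = np.linalg.solve(K, rhs.T)        # kriging weights + multiplier
    resid_hat = w[:n].T @ resid

    trend_new = np.column_stack([np.ones_like(alt_new), alt_new]) @ beta
    return trend_new + resid_hat
```

Leave-one-out cross-validation (refitting with each station withheld in turn) would then yield the RMSE criterion used in the abstract.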
NASA Astrophysics Data System (ADS)
Nossent, Jiri; Pereira, Fernando; Bauwens, Willy
2015-04-01
Precipitation is one of the key inputs for hydrological models. As long as the values of the hydrological model parameters are fixed, a variation of the rainfall input is expected to induce a change in the model output. Given the increased awareness of uncertainty in rainfall records, it becomes more important to understand the impact of this input-output dynamic. Yet modellers often still intend to mimic the observed flow, whatever the deviation of the employed records from the actual rainfall might be, by recklessly adapting the model parameter values. But is it actually possible to vary the model parameter values in such a way that a certain (observed) model output can be generated from inaccurate rainfall inputs? In other words, how important is the rainfall uncertainty for the model output relative to the importance of the model parameters? To address this question, we apply the Sobol' sensitivity analysis method to assess and compare the importance of the rainfall uncertainty and of the model parameters for the output of the hydrological model. In order to treat the regular model parameters and the input uncertainty in the same way, and to allow a comparison of their influence, a possible approach is to represent the rainfall uncertainty by a parameter. To this end, we apply so-called rainfall multipliers to hydrologically independent storm events, as a probabilistic parameter representation of the possible rainfall variation. As available rainfall records are very often point measurements at a discrete time step (hourly, daily, monthly, ...), they contain uncertainty due to a latent lack of spatial and temporal variability. The influence of this variability can also differ between hydrological models of different spatial and temporal scales. Therefore, we perform the sensitivity analyses on a semi-distributed model (SWAT) and a lumped model (NAM).
The assessment and comparison of the importance of the rainfall uncertainty and the model parameters is achieved by considering different scenarios for the included parameters and the state of the models.
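The Sobol' approach described above, in which a rainfall multiplier is treated exactly like any other model parameter, can be sketched as follows. The toy model, the parameter bounds and the estimator choice (Saltelli's 2010 first-order estimator) are assumptions for illustration, not the SWAT/NAM setup used in the study.

```python
import numpy as np

def sobol_first_order(model, bounds, n=8192, seed=0):
    """First-order Sobol' indices via the Saltelli (2010) estimator."""
    rng = np.random.default_rng(seed)
    k = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    A = lo + (hi - lo) * rng.random((n, k))   # two independent sample blocks
    B = lo + (hi - lo) * rng.random((n, k))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))
    S = np.empty(k)
    for i in range(k):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                   # replace column i of A by B's
        S[i] = np.mean(fB * (model(ABi) - fA)) / var
    return S

# Hypothetical toy model: runoff = multiplier * rain * coefficient.
# The rainfall multiplier enters exactly like an ordinary parameter.
def toy_runoff(X):
    mult, coeff = X[:, 0], X[:, 1]
    return mult * 10.0 * coeff
```

Comparing the index of the multiplier against those of the regular parameters is precisely the comparison of importances the abstract describes.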
Choice of baseline climate data impacts projected species' responses to climate change.
Baker, David J; Hartley, Andrew J; Butchart, Stuart H M; Willis, Stephen G
2016-07-01
Climate data created from historic climate observations are integral to most assessments of potential climate change impacts, and frequently comprise the baseline period used to infer species-climate relationships. They are often also central to downscaling coarse resolution climate simulations from General Circulation Models (GCMs) to project future climate scenarios at ecologically relevant spatial scales. Uncertainty in these baseline data can be large, particularly where weather observations are sparse and climate dynamics are complex (e.g. over mountainous or coastal regions). Yet, importantly, this uncertainty is almost universally overlooked when assessing potential responses of species to climate change. Here, we assessed the importance of historic baseline climate uncertainty for projections of species' responses to future climate change. We built species distribution models (SDMs) for 895 African bird species of conservation concern, using six different climate baselines. We projected these models to two future periods (2040-2069, 2070-2099), using downscaled climate projections, and calculated species turnover and changes in species-specific climate suitability. We found that the choice of baseline climate data constituted an important source of uncertainty in projections of both species turnover and species-specific climate suitability, often comparable with, or more important than, uncertainty arising from the choice of GCM. Importantly, the relative contribution of these factors to projection uncertainty varied spatially. Moreover, when projecting SDMs to sites of biodiversity importance (Important Bird and Biodiversity Areas), these uncertainties altered site-level impacts, which could affect conservation prioritization. Our results highlight that projections of species' responses to climate change are sensitive to uncertainty in the baseline climatology. We recommend that this should be considered routinely in such analyses. 
© 2016 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Zhao, S.; Mashayekhi, R.; Saeednooran, S.; Hakami, A.; Ménard, R.; Moran, M. D.; Zhang, J.
2016-12-01
We have developed a formal framework for documentation, quantification, and propagation of uncertainties in upstream emissions inventory data at various stages leading to the generation of model-ready gridded emissions through emissions processing software such as the EPA's SMOKE (Sparse Matrix Operator Kernel Emissions) system. To illustrate this framework we present a proof-of-concept case study of a bottom-up quantitative assessment of uncertainties in emissions from residential wood combustion (RWC) in the U.S. and Canada. Uncertainties associated with key inventory parameters are characterized based on existing information sources, including the American Housing Survey (AHS) from the U.S. Census Bureau, Timber Products Output (TPO) surveys from the U.S. Forest Service, TNS Canadian Facts surveys, and the AP-42 emission factor document from the U.S. EPA. The propagation of uncertainties is based on Monte Carlo simulation code external to SMOKE. Latin Hypercube Sampling (LHS) is implemented to generate a set of random realizations of each RWC inventory parameter, for which the uncertainties are assumed to be normally distributed. Random realizations are also obtained for each RWC temporal and chemical speciation profile and spatial surrogate field external to SMOKE using the LHS approach. SMOKE outputs for primary emissions (e.g., CO, VOC) using both RWC emission inventory realizations and perturbed temporal and chemical profiles and spatial surrogates show relative uncertainties of about 30-50% across the U.S. and about 70-100% across Canada. Positive skewness values (up to 2.7) and variable kurtosis values (up to 4.8) were also found. Spatial allocation contributes significantly to the overall uncertainty, particularly in Canada. By applying this framework we are able to produce random realizations of model-ready gridded emissions that along with available meteorological ensembles can be used to propagate uncertainties through chemical transport models. 
The approach described here provides an effective means for formal quantification of uncertainties in estimated emissions from various source sectors and for continuous documentation, assessment, and reduction of emission uncertainties.
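The LHS step described above can be sketched as follows, assuming (as in the study) normally distributed inventory parameters. The stratification scheme and the parameter values in the test are illustrative, not SMOKE's actual inputs.

```python
import numpy as np
from statistics import NormalDist

def latin_hypercube_normal(means, sds, n, seed=0):
    """n LHS realizations of k independent, normally distributed parameters."""
    rng = np.random.default_rng(seed)
    k = len(means)
    # One uniform draw per stratum [i/n, (i+1)/n), shuffled per parameter.
    u = (np.arange(n)[:, None] + rng.random((n, k))) / n
    for j in range(k):
        rng.shuffle(u[:, j])
    u = np.clip(u, 1e-12, 1 - 1e-12)           # keep inv_cdf well-defined
    z = np.vectorize(NormalDist().inv_cdf)(u)  # standard-normal quantiles
    return np.asarray(means) + np.asarray(sds) * z
```

Each row is one joint realization of the inventory parameters; running such realizations through the emissions processing chain yields the ensemble of model-ready gridded emissions.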
Assessment of spatial variation of risks in small populations.
Riggan, W B; Manton, K G; Creason, J P; Woodbury, M A; Stallard, E
1991-01-01
Often environmental hazards are assessed by examining the spatial variation of disease-specific mortality or morbidity rates. These rates, when estimated for small local populations, can have a high degree of random variation or uncertainty associated with them. If those rate estimates are used to prioritize environmental clean-up actions or to allocate resources, then those decisions may be influenced by this high degree of uncertainty. Unfortunately, the effect of this uncertainty is not to add "random noise" into the decision-making process, but to systematically bias action toward the smallest populations where uncertainty is greatest and where extreme high and low rate deviations are most likely to be manifest by chance. We present a statistical procedure for adjusting rate estimates for differences in variability due to differentials in local area population sizes. Such adjustments produce rate estimates for areas that have better properties than the unadjusted rates for use in making statistically based decisions about the entire set of areas. Examples are provided for county variation in bladder, stomach, and lung cancer mortality rates for U.S. white males for the period 1970 to 1979. PMID:1820268
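The kind of variance adjustment described above can be illustrated with a simple moment-based Poisson-gamma (empirical Bayes) smoother. This is a generic textbook-style estimator shown for illustration, not necessarily the authors' exact procedure.

```python
import numpy as np

def eb_shrunk_rates(events, pop):
    """Shrink small-area rates toward the pooled rate (Poisson-gamma model)."""
    raw = events / pop
    m = np.average(raw, weights=pop)              # pooled (global) rate
    s2 = np.average((raw - m) ** 2, weights=pop)  # total weighted variance
    # Method-of-moments estimate of the true between-area variance:
    between = max(s2 - m / np.mean(pop), 0.0)
    # Small populations get small weights, i.e. strong shrinkage toward m.
    w = between / (between + m / pop)
    return w * raw + (1.0 - w) * m
```

The effect is exactly the one the abstract argues for: extreme rates in tiny populations are pulled strongly toward the pooled rate, while rates in large populations are left nearly unchanged.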
NASA Astrophysics Data System (ADS)
Owens, P. R.; Libohova, Z.; Seybold, C. A.; Wills, S. A.; Peaslee, S.; Beaudette, D.; Lindbo, D. L.
2017-12-01
The measurement errors and spatial prediction uncertainties of soil properties in the modeling community are usually assessed against measured values when available. Of equal importance, however, is the assessment of the impacts of errors and uncertainty on cost-benefit analyses and risk assessments. Soil pH was selected as one of the most commonly measured soil properties used for liming recommendations. The objective of this study was to assess the size of the errors from different sources and their implications for management decisions. Error sources include measurement methods, laboratory sources, pedotransfer functions, database transactions, spatial aggregations, etc. Several databases of measured and predicted soil pH were used for this study, including the United States National Cooperative Soil Survey Characterization Database (NCSS-SCDB) and the US Soil Survey Geographic (SSURGO) Database. The distribution of errors among the different sources, from measurement methods to spatial aggregation, showed a wide range of values. The greatest RMSE, 0.79 pH units, came from spatial aggregation (SSURGO vs kriging), while the measurement methods had the lowest RMSE of 0.06 pH units. Assuming an order of data acquisition based on the transaction distance, i.e. from measurement method to spatial aggregation, the RMSE increased from 0.06 to 0.8 pH units, suggesting error propagation. This has major implications for practitioners and the modeling community. Most soil liming rate recommendations are based on 0.1 pH unit increments, while the desired soil pH level increments are based on 0.4 to 0.5 pH units. Thus, even when the measured and desired target soil pH are the same, most guidelines recommend 1 ton ha-1 of lime, which translates into $111 ha-1 that the farmer has to factor into the cost-benefit analysis.
However, this analysis needs to be based on prediction uncertainties (0.5-1.0 pH units) rather than measurement errors (0.1 pH units), which would translate into a $555-1,111 ha-1 investment that needs to be assessed against the risk. The modeling community can benefit from such analyses; however, the error size and spatial distribution of global and regional predictions need to be assessed against the variability of other drivers and the impact on management decisions.
NASA Astrophysics Data System (ADS)
Savage, James; Pianosi, Francesca; Bates, Paul; Freer, Jim; Wagener, Thorsten
2015-04-01
Predicting flood inundation extents using hydraulic models is subject to a number of critical uncertainties. For a specific event, these uncertainties are known to have a large influence on model outputs and any subsequent analyses made by risk managers. Hydraulic modellers often approach such problems by applying uncertainty analysis techniques such as the Generalised Likelihood Uncertainty Estimation (GLUE) methodology. However, these methods do not allow one to attribute which source of uncertainty has the most influence on the various model outputs that inform flood risk decision making. Another issue facing modellers is the amount of computational resource that is available to spend on modelling flood inundations that are 'fit for purpose' to the modelling objectives. Therefore a balance needs to be struck between computation time, realism and spatial resolution, and effectively characterising the uncertainty spread of predictions (for example from boundary conditions and model parameterisations). However, it is not fully understood how much of an impact each factor has on model performance, for example how much influence changing the spatial resolution of a model has on inundation predictions in comparison to other uncertainties inherent in the modelling process. Furthermore, when resampling fine scale topographic data in the form of a Digital Elevation Model (DEM) to coarser resolutions, there are a number of possible coarser DEMs that can be produced. Deciding which DEM is then chosen to represent the surface elevations in the model could also influence model performance. In this study we model a flood event using the hydraulic model LISFLOOD-FP and apply Sobol' Sensitivity Analysis to estimate which input factor, among the uncertainty in model boundary conditions, uncertain model parameters, the spatial resolution of the DEM and the choice of resampled DEM, have the most influence on a range of model outputs. 
These outputs include whole domain maximum inundation indicators and flood wave travel time in addition to temporally and spatially variable indicators. This enables us to assess whether the sensitivity of the model to various input factors is stationary in both time and space. Furthermore, competing models are assessed against observations of water depths from a historical flood event. Consequently we are able to determine which of the input factors has the most influence on model performance. Initial findings suggest the sensitivity of the model to different input factors varies depending on the type of model output assessed and at what stage during the flood hydrograph the model output is assessed. We have also found that initial decisions regarding the characterisation of the input factors, for example defining the upper and lower bounds of the parameter sample space, can be significant in influencing the implied sensitivities.
Uncertainties in land use data
NASA Astrophysics Data System (ADS)
Castilla, G.; Hay, G. J.
2006-11-01
This paper deals with the description and assessment of uncertainties in gridded land use data derived from remote sensing observations, in the context of hydrological studies. Land use is a categorical regionalised variable returning the main socio-economic role of each location, where the role is inferred from the pattern of occupation of the land. There are two main uncertainties surrounding land use data: positional and categorical. This paper focuses on the second, as the first generally has less serious implications and is easier to tackle. The conventional method used to assess categorical uncertainty, the confusion matrix, is criticised in depth, the main critique being its inability to provide a basic requirement for propagating uncertainty through distributed hydrological models, namely the spatial distribution of errors. Some existing alternative methods are reported, and finally the need for metadata is stressed as a more reliable means to assess the quality, and hence the uncertainty, of these data.
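A short sketch of the accuracy measures conventionally derived from a confusion matrix. Note that, as the critique above points out, none of these numbers says anything about where on the map the errors occur; the example matrix is hypothetical.

```python
import numpy as np

def confusion_metrics(cm):
    """Overall, user's (rows = mapped) and producer's (cols = reference) accuracy."""
    cm = np.asarray(cm, dtype=float)
    overall = np.trace(cm) / cm.sum()
    users = np.diag(cm) / cm.sum(axis=1)       # 1 - commission error
    producers = np.diag(cm) / cm.sum(axis=0)   # 1 - omission error
    return overall, users, producers
```

All three statistics are global summaries over the whole map, which is exactly why the confusion matrix alone cannot feed a spatially distributed uncertainty propagation.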
NASA Astrophysics Data System (ADS)
Sévellec, Florian; Dijkstra, Henk A.; Drijfhout, Sybren S.; Germe, Agathe
2017-11-01
In this study, the relation between two approaches to assessing ocean predictability on interannual to decadal time scales is investigated. The first, pragmatic approach consists of sampling the initial condition uncertainty and assessing the predictability through the divergence of this ensemble in time. The second approach is provided by a theoretical framework that determines error growth by estimating optimal linear growing modes. In this paper, it is shown that under the assumption of linearized dynamics and normally distributed uncertainty, the exact quantitative spread of the ensemble can be determined from the theoretical framework. This spread is at least an order of magnitude less expensive to compute than the approximate solution given by the pragmatic approach. This result is applied to a state-of-the-art Ocean General Circulation Model to assess the predictability in the North Atlantic of four typical oceanic metrics: the strength of the Atlantic Meridional Overturning Circulation (AMOC), the intensity of its heat transport, the two-dimensional spatially-averaged Sea Surface Temperature (SST) over the North Atlantic, and the three-dimensional spatially-averaged temperature in the North Atlantic. For all tested metrics except SST, ~75% of the total uncertainty on interannual time scales can be attributed to oceanic initial condition uncertainty rather than to atmospheric stochastic forcing. The theoretical method also provides the sensitivity pattern to the initial condition uncertainty, allowing for targeted measurements to improve the skill of the prediction. It is suggested that a relatively small fleet of autonomous underwater vehicles could reduce the uncertainty in AMOC strength prediction by 70% for 1-5 year lead times.
NASA Astrophysics Data System (ADS)
Wanders, N.; Karssenberg, D.; Bierkens, M. F. P.; Van Dam, J. C.; De Jong, S. M.
2012-04-01
Soil moisture is a key variable in the hydrological cycle and important in hydrological modelling. When assimilating soil moisture into flood forecasting models, the improvement of forecasting skill depends on the ability to accurately estimate the spatial and temporal patterns of soil moisture content throughout the river basin. Space-borne remote sensing may provide this information with high temporal and spatial resolution and global coverage. Currently three microwave soil moisture products are available: AMSR-E, ASCAT and SMOS. The quality of these satellite-based products is often assessed by comparing them with in-situ observations of soil moisture. This comparison is, however, hampered by the difference in spatial and temporal support (i.e., resolution, scale), because the spatial resolution of microwave satellites is rather low compared to in-situ field measurements. Thus, the aim of this study is to derive a method to assess the uncertainty of microwave satellite soil moisture products at the correct spatial support. To overcome the difference in support size between in-situ soil moisture observations and remotely sensed soil moisture, we used a stochastic, distributed unsaturated zone model (SWAP, van Dam (2000)) that is upscaled to the support of the different satellite products. A detailed assessment of the SWAP model uncertainty is included to ensure that the uncertainty in satellite soil moisture is not overestimated due to an underestimation of the model uncertainty. We simulated unsaturated water flow up to a depth of 1.5 m with a vertical resolution of 1 to 10 cm and on a horizontal grid of 1 km2 for the period Jan 2010 - Jun 2011. The SWAP model was first calibrated and validated on in-situ data of the REMEDHUS soil moisture network (Spain).
Next, to evaluate the satellite products, the model was run for areas in the proximity of 79 meteorological stations in Spain, where model results were aggregated to the correct support of the satellite product by averaging model results from the 1 km2 grid within the remote sensing footprint. Overall, 440 (AMSR-E, SMOS) to 680 (ASCAT) time series were compared to the aggregated SWAP model results, providing valuable information on the uncertainty of satellite soil moisture at the proper support. Our results show that temporal dynamics are best captured by ASCAT, with an average correlation of 0.72 with the model, while AMSR-E (0.41) and SMOS (0.42) are less capable of representing these dynamics. Standard deviations found for ASCAT and SMOS are low, 0.049 and 0.051 m3 m-3 respectively, while AMSR-E has a higher value of 0.062 m3 m-3. All standard deviations are higher than the average model uncertainty of 0.017 m3 m-3. All satellite products show a negative bias compared to the model results, with the largest value for SMOS. Satellite uncertainty is not found to be significantly related to topography, but is found to increase in densely vegetated areas. In general, AMSR-E has the most difficulty capturing soil moisture dynamics in Spain, while SMOS and especially ASCAT have a fair to good performance. However, all products contain valuable information about near-surface soil moisture over Spain. Van Dam, J.C., 2000. Field scale water flow and solute transport: SWAP model concepts, parameter estimation and case studies. Ph.D. thesis, Wageningen University.
Jian Yang; Peter J. Weisberg; Thomas E. Dilts; E. Louise Loudermilk; Robert M. Scheller; Alison Stanton; Carl Skinner
2015-01-01
Strategic fire and fuel management planning benefits from detailed understanding of how wildfire occurrences are distributed spatially under current climate, and from predictive models of future wildfire occurrence given climate change scenarios. In this study, we fitted historical wildfire occurrence data from 1986 to 2009 to a suite of spatial point process (SPP)...
NASA Astrophysics Data System (ADS)
Webb, N.; Chappell, A.; Van Zee, J.; Toledo, D.; Duniway, M.; Billings, B.; Tedela, N.
2017-12-01
Anthropogenic land use and land cover change (LULCC) influence global rates of wind erosion and dust emission, yet our understanding of the magnitude of the responses remains poor. Field measurements and monitoring provide essential data to resolve aeolian sediment transport patterns and assess the impacts of human land use and management intensity. Data collected in the field are also required for dust model calibration and testing, as models have become the primary tool for assessing LULCC-dust cycle interactions. However, there is considerable uncertainty in estimates of dust emission due to the spatial variability of sediment transport. Field sampling designs are currently rudimentary and considerable opportunities are available to reduce the uncertainty. Establishing the minimum detectable change is critical for measuring spatial and temporal patterns of sediment transport, detecting potential impacts of LULCC and land management, and for quantifying the uncertainty of dust model estimates. Here, we evaluate the effectiveness of common sampling designs (e.g., simple random sampling, systematic sampling) used to measure and monitor aeolian sediment transport rates. Using data from the US National Wind Erosion Research Network across diverse rangeland and cropland cover types, we demonstrate how only large changes in sediment mass flux (of the order 200% to 800%) can be detected when small sample sizes are used, crude sampling designs are implemented, or when the spatial variation is large. We then show how statistical rigour and the straightforward application of a sampling design can reduce the uncertainty and detect change in sediment transport over time and between land use and land cover types.
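The link between sample size, spatial variability and minimum detectable change discussed above can be sketched with a standard two-sample power calculation (normal approximation). The coefficient of variation in the example is an assumed value for illustration, not a Network measurement.

```python
import math
from statistics import NormalDist

def min_detectable_change(cv, n, alpha=0.05, power=0.8):
    """Smallest relative change (%) in mean flux detectable between two
    campaigns of n samples each, under a normal approximation."""
    z_a = NormalDist().inv_cdf(1.0 - alpha / 2.0)   # significance level
    z_b = NormalDist().inv_cdf(power)               # statistical power
    return 100.0 * (z_a + z_b) * cv * math.sqrt(2.0 / n)
```

With an assumed coefficient of variation of 1.5 (150%, plausible for aeolian mass flux), five samples per campaign can only detect changes of roughly 270%, consistent with the 200-800% range quoted above; fifty samples bring the detectable change below 100%.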
Uncertainty Analysis in Large Area Aboveground Biomass Mapping
NASA Astrophysics Data System (ADS)
Baccini, A.; Carvalho, L.; Dubayah, R.; Goetz, S. J.; Friedl, M. A.
2011-12-01
Satellite and aircraft-based remote sensing observations are increasingly used to generate spatially explicit estimates of the aboveground carbon stock of forest ecosystems. Because deforestation and forest degradation account for circa 10% of anthropogenic carbon emissions to the atmosphere, policy mechanisms are increasingly recognized as a low-cost option to mitigate carbon emissions. They are, however, contingent upon the capacity to accurately measure the carbon stored in forests. Here we examine the sources of uncertainty and error propagation in generating maps of aboveground biomass. We focus on characterizing uncertainties associated with maps at the pixel scale and at spatially aggregated national scales. We pursue three strategies to describe the error and uncertainty properties of aboveground biomass maps: (1) model-based assessment using confidence intervals derived from linear regression methods; (2) data-mining algorithms such as regression trees and ensembles of these; and (3) empirical assessments using independently collected data sets. The latter effort explores error propagation using field data acquired within satellite-based lidar (GLAS) acquisitions versus alternative in situ methods that rely upon field measurements that have not been systematically collected for this purpose (e.g. from forest inventory data sets). A key goal of our effort is to provide multi-level characterizations yielding both pixel- and biome-level estimates of uncertainties at different scales.
Sensitivity and Uncertainty Analysis of the GFR MOX Fuel Subassembly
NASA Astrophysics Data System (ADS)
Lüley, J.; Vrban, B.; Čerba, Š.; Haščík, J.; Nečas, V.; Pelloni, S.
2014-04-01
We performed sensitivity and uncertainty analysis as well as benchmark similarity assessment of the MOX fuel subassembly designed for the Gas-Cooled Fast Reactor (GFR) as a representative material of the core. Material composition was defined for each assembly ring separately allowing us to decompose the sensitivities not only for isotopes and reactions but also for spatial regions. This approach was confirmed by direct perturbation calculations for chosen materials and isotopes. Similarity assessment identified only ten partly comparable benchmark experiments that can be utilized in the field of GFR development. Based on the determined uncertainties, we also identified main contributors to the calculation bias.
Latin hypercube approach to estimate uncertainty in ground water vulnerability
Gurdak, J.J.; McCray, J.E.; Thyne, G.; Qi, S.L.
2007-01-01
A methodology is proposed to quantify prediction uncertainty associated with ground water vulnerability models that were developed through an approach that coupled multivariate logistic regression with a geographic information system (GIS). This method uses Latin hypercube sampling (LHS) to illustrate the propagation of input error and estimate uncertainty associated with the logistic regression predictions of ground water vulnerability. Central to the proposed method is the assumption that prediction uncertainty in ground water vulnerability models is a function of input error propagation from uncertainty in the estimated logistic regression model coefficients (model error) and the values of explanatory variables represented in the GIS (data error). Input probability distributions that represent both model and data error sources of uncertainty were simultaneously sampled using a Latin hypercube approach with logistic regression calculations of probability of elevated nonpoint source contaminants in ground water. The resulting probability distribution represents the prediction intervals and associated uncertainty of the ground water vulnerability predictions. The method is illustrated through a ground water vulnerability assessment of the High Plains regional aquifer. Results of the LHS simulations reveal significant prediction uncertainties that vary spatially across the regional aquifer. Additionally, the proposed method enables a spatial deconstruction of the prediction uncertainty that can lead to improved prediction of ground water vulnerability. © 2007 National Ground Water Association.
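The LHS propagation described above can be sketched in a few lines. All coefficients, standard errors, and explanatory values below are hypothetical placeholders for a single grid cell; `scipy.stats.qmc` provides the Latin hypercube sampler.

```python
import numpy as np
from scipy.stats import qmc, norm

# Hypothetical logistic model: logit(p) = b0 + b1*x1 + b2*x2
beta_mean = np.array([-2.0, 0.8, 1.2])   # estimated coefficients
beta_se = np.array([0.5, 0.2, 0.3])      # their standard errors (model error)
x_mean = np.array([1.0, 0.5])            # explanatory values at one GIS cell
x_se = np.array([0.1, 0.05])             # data error in those values

n = 1000
sampler = qmc.LatinHypercube(d=5, seed=0)
u = sampler.random(n)                    # stratified uniform sample in [0, 1)
# Transform to normal margins for both error sources, then push through
# the logistic regression to get a distribution of vulnerability predictions
betas = norm.ppf(u[:, :3]) * beta_se + beta_mean
xs = norm.ppf(u[:, 3:]) * x_se + x_mean
logit = betas[:, 0] + (betas[:, 1:] * xs).sum(axis=1)
p = 1.0 / (1.0 + np.exp(-logit))

lo, hi = np.percentile(p, [2.5, 97.5])
print(f"vulnerability probability 95% interval: [{lo:.2f}, {hi:.2f}]")
```

Repeating this per cell yields the spatially varying prediction intervals the paper describes.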
Mueller, David S.
2017-01-01
This paper presents a method using Monte Carlo simulations for assessing uncertainty of moving-boat acoustic Doppler current profiler (ADCP) discharge measurements using a software tool known as QUant, which was developed for this purpose. Analysis was performed on 10 data sets from four Water Survey of Canada gauging stations in order to evaluate the relative contribution of a range of error sources to the total estimated uncertainty. The factors that differed among data sets included the fraction of unmeasured discharge relative to the total discharge, flow nonuniformity, and operator decisions about instrument programming and measurement cross section. As anticipated, it was found that the estimated uncertainty is dominated by uncertainty of the discharge in the unmeasured areas, highlighting the importance of appropriate selection of the site, the instrument, and the user inputs required to estimate the unmeasured discharge. The main contributor to uncertainty was invalid data, but spatial inhomogeneity in water velocity and bottom-track velocity also contributed, as did variation in the edge velocity, uncertainty in the edge distances, edge coefficients, and the top and bottom extrapolation methods. To a lesser extent, spatial inhomogeneity in the bottom depth also contributed to the total uncertainty, as did uncertainty in the ADCP draft at shallow sites. The estimated uncertainties from QUant can be used to assess the adequacy of standard operating procedures. They also provide quantitative feedback to the ADCP operators about the quality of their measurements, indicating which parameters are contributing most to uncertainty, and perhaps even highlighting ways in which uncertainty can be reduced. Additionally, QUant can be used to account for self-dependent error sources such as heading errors, which are a function of heading. 
The results demonstrate the importance of a Monte Carlo method tool such as QUant for quantifying random and bias errors when evaluating the uncertainty of moving-boat ADCP measurements.
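A simplified version of such a Monte Carlo uncertainty budget is shown below. The discharge components and their standard uncertainties are hypothetical, and the additive Gaussian error model is a deliberate simplification of QUant's actual error treatment.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical discharge components (m^3/s) with assumed standard
# uncertainties; the unmeasured areas (top, bottom, edges) dominate.
components = {
    "measured":   (80.0, 1.0),
    "top_extrap": (10.0, 2.5),
    "bot_extrap": (6.0, 2.0),
    "edges":      (4.0, 1.5),
}

n = 100_000
total = sum(rng.normal(mean, sd, n) for mean, sd in components.values())

q = total.mean()
u = total.std(ddof=1)
print(f"Q = {q:.1f} m^3/s, expanded uncertainty (k=2) = {2 * u:.1f} m^3/s")

# Relative contribution of each source to the combined variance
var_total = sum(sd**2 for _, sd in components.values())
for name, (_, sd) in components.items():
    print(f"{name}: {100 * sd**2 / var_total:.0f}% of variance")
```

The variance decomposition at the end mirrors the kind of feedback QUant gives operators about which parameters contribute most to uncertainty.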
Quantification of uncertainties in global grazing systems assessment
NASA Astrophysics Data System (ADS)
Fetzel, T.; Havlik, P.; Herrero, M.; Kaplan, J. O.; Kastner, T.; Kroisleitner, C.; Rolinski, S.; Searchinger, T.; Van Bodegom, P. M.; Wirsenius, S.; Erb, K.-H.
2017-07-01
Livestock systems play a key role in global sustainability challenges like food security and climate change, yet many unknowns and large uncertainties prevail. We present a systematic, spatially explicit assessment of uncertainties related to grazing intensity (GI), a key metric for assessing ecological impacts of grazing, by combining existing data sets on (a) grazing feed intake, (b) the spatial distribution of livestock, (c) the extent of grazing land, and (d) its net primary productivity (NPP). An analysis of the resulting 96 maps implies that on average 15% of the grazing land NPP is consumed by livestock. GI is low in most of the world's grazing lands, but hotspots of very high GI prevail in 1% of the total grazing area. The agreement between GI maps is good on one fifth of the world's grazing area, while on the remainder, it is low to very low. The largest uncertainties are found in global drylands and where grazing land bears trees (e.g., the Amazon basin or the Taiga belt). In some regions like India or Western Europe, large uncertainties even result in GI estimates exceeding 100%. Our sensitivity analysis indicates that the input data for NPP, animal distribution, and grazing area contribute about equally to the total variability in GI maps, while grazing feed intake is a less critical variable. We argue that a general improvement in the quality of the available global-level data sets is a precondition for improving the understanding of the role of livestock systems in the context of global environmental change or food security.
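The map-combination approach can be illustrated with a toy version: two alternative estimates per input layer on a two-cell grid, combined in all possible ways. The layer values and the exact form of the GI ratio are hypothetical stand-ins for the full global data sets.

```python
import itertools
import numpy as np

# Two hypothetical variants of each input layer (values per grid cell),
# mimicking the paper's combination of alternative data sets.
feed_intake = [np.array([1.0, 2.0]), np.array([1.2, 1.8])]   # (a) per head
livestock   = [np.array([0.9, 1.1]), np.array([1.1, 0.9])]   # (b) head counts
graze_area  = [np.array([0.8, 1.0]), np.array([1.0, 0.7])]   # (c) area fraction
npp         = [np.array([10.0, 5.0]), np.array([8.0, 6.0])]  # (d) NPP density

# GI = total feed intake / NPP available on grazing land, for every
# combination of input variants (2^4 = 16 maps here; 96 in the paper)
gi_maps = [fi * lv / (ga * n)
           for fi, lv, ga, n in itertools.product(feed_intake, livestock,
                                                  graze_area, npp)]
stack = np.stack(gi_maps)
mean_gi = stack.mean(axis=0)                   # ensemble-mean GI per cell
spread = stack.max(axis=0) - stack.min(axis=0)  # agreement proxy per cell
print(mean_gi, spread)
```

Cells where `spread` is large relative to `mean_gi` correspond to the low-agreement regions the paper identifies.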
Effects of uncertain topographic input data on two-dimensional flow modeling in a gravel-bed river
Legleiter, C.J.; Kyriakidis, P.C.; McDonald, R.R.; Nelson, J.M.
2011-01-01
Many applications in river research and management rely upon two-dimensional (2D) numerical models to characterize flow fields, assess habitat conditions, and evaluate channel stability. Predictions from such models are potentially highly uncertain due to the uncertainty associated with the topographic data provided as input. This study used a spatial stochastic simulation strategy to examine the effects of topographic uncertainty on flow modeling. Many, equally likely bed elevation realizations for a simple meander bend were generated and propagated through a typical 2D model to produce distributions of water-surface elevation, depth, velocity, and boundary shear stress at each node of the model's computational grid. Ensemble summary statistics were used to characterize the uncertainty associated with these predictions and to examine the spatial structure of this uncertainty in relation to channel morphology. Simulations conditioned to different data configurations indicated that model predictions became increasingly uncertain as the spacing between surveyed cross sections increased. Model sensitivity to topographic uncertainty was greater for base flow conditions than for a higher, subbankfull flow (75% of bankfull discharge). The degree of sensitivity also varied spatially throughout the bend, with the greatest uncertainty occurring over the point bar where the flow field was influenced by topographic steering effects. Uncertain topography can therefore introduce significant uncertainty to analyses of habitat suitability and bed mobility based on flow model output. In the presence of such uncertainty, the results of these studies are most appropriately represented in probabilistic terms using distributions of model predictions derived from a series of topographic realizations. Copyright 2011 by the American Geophysical Union.
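The ensemble workflow can be caricatured in a few lines, with a trivial stage-minus-bed depth calculation standing in for the 2D hydrodynamic model and a hypothetical correlated error model for the bed realizations. All numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

n_real, n_nodes = 100, 50
bed_mean = np.linspace(10.0, 9.0, n_nodes)   # hypothetical reach profile (m)

# Equally likely bed-elevation realizations with spatially correlated error
# (exponential covariance, 0.2 m standard deviation, 5-node range)
lags = np.abs(np.subtract.outer(np.arange(n_nodes), np.arange(n_nodes)))
cov = 0.04 * np.exp(-lags / 5.0)
bed_real = bed_mean + rng.multivariate_normal(np.zeros(n_nodes), cov,
                                              size=n_real)

def toy_flow_model(bed, wse=10.5):
    """Placeholder for the 2D model: depth = water-surface elevation - bed."""
    return np.clip(wse - bed, 0.0, None)

# Propagate every realization and summarize the ensemble at each node
depths = np.array([toy_flow_model(b) for b in bed_real])
depth_mean = depths.mean(axis=0)
depth_sd = depths.std(axis=0)   # per-node predictive uncertainty
```

The per-node standard deviation `depth_sd` is the kind of ensemble summary statistic the study maps against channel morphology.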
NASA Astrophysics Data System (ADS)
Pechlivanidis, Ilias; McIntyre, Neil; Wheater, Howard
2017-04-01
Rainfall, one of the main inputs in hydrological modeling, is a highly heterogeneous process over a wide range of spatial scales, and hence ignoring spatial rainfall information can affect the simulated streamflow. Calibration of hydrological model parameters is rarely a straightforward task due to parameter equifinality and the tendency of parameters to compensate for other uncertainties, i.e. structural and forcing-input errors. Here, we analyse the significance of the spatial variability of rainfall for streamflow as a function of catchment scale, catchment type, and antecedent conditions using the continuous-time, semi-distributed PDM hydrological model at the Upper Lee catchment, UK. The impact of catchment scale and type is assessed using 11 nested catchments ranging in scale from 25 to 1040 km2, and further assessed by artificially changing the catchment characteristics and translating these to model parameters with uncertainty using model regionalisation. Synthetic rainfall events are introduced to directly relate the change in simulated streamflow to the spatial variability of rainfall. Overall, we conclude that the antecedent catchment wetness and catchment type play an important role in controlling the significance of the spatial distribution of rainfall on streamflow. Results show a relationship between hydrograph characteristics (streamflow peak and volume) and the degree of spatial variability of rainfall for the impermeable catchments under dry antecedent conditions, although this decreases at larger scales; however, this sensitivity is significantly undermined under wet antecedent conditions. Although there is an indication that the impact of spatial rainfall on streamflow varies as a function of catchment scale, the variability of antecedent conditions between the synthetic catchments seems to mask this significance. 
Finally, hydrograph responses to different spatial patterns of rainfall depend on the assumptions used for model parameter estimation and on the spatial variation in parameters, indicating the need for an uncertainty framework in such investigations.
A comparative experimental evaluation of uncertainty estimation methods for two-component PIV
NASA Astrophysics Data System (ADS)
Boomsma, Aaron; Bhattacharya, Sayantan; Troolin, Dan; Pothos, Stamatios; Vlachos, Pavlos
2016-09-01
Uncertainty quantification in planar particle image velocimetry (PIV) measurement is critical for proper assessment of the quality and significance of reported results. New uncertainty estimation methods have been recently introduced, generating interest in their applicability and utility. The present study compares and contrasts current methods across two separate experiments and three software packages in order to provide a diversified assessment of the methods. We evaluated the performance of four uncertainty estimation methods: primary peak ratio (PPR), mutual information (MI), image matching (IM), and correlation statistics (CS). The PPR method was implemented and tested in two processing codes, using in-house open source PIV processing software (PRANA, Purdue University) and Insight4G (TSI, Inc.). The MI method was evaluated in PRANA, as was the IM method. The CS method was evaluated using DaVis (LaVision, GmbH). Utilizing two PIV systems for high- and low-resolution measurements and a laser Doppler velocimetry (LDV) system, data were acquired in a total of three cases: a jet flow and a cylinder in cross flow at two Reynolds numbers. LDV measurements were used to establish a point validation against which the high-resolution PIV measurements were validated. Subsequently, the high-resolution PIV measurements were used as a reference against which the low-resolution PIV data were assessed for error and uncertainty. We compared error and uncertainty distributions, spatially varying RMS error and RMS uncertainty, and standard uncertainty coverages. We observed that qualitatively, each method responded to spatially varying error (i.e. higher error regions resulted in higher uncertainty predictions in that region). However, the PPR and MI methods demonstrated reduced uncertainty dynamic range response. In contrast, the IM and CS methods showed better response, but under-predicted the uncertainty ranges. 
The standard coverages (68% confidence interval) ranged from approximately 65%-77% for PPR and MI methods, 40%-50% for IM and near 50% for CS. These observations illustrate some of the strengths and weaknesses of the methods considered herein and identify future directions for development and improvement.
Collocation mismatch uncertainties in satellite aerosol retrieval validation
NASA Astrophysics Data System (ADS)
Virtanen, Timo H.; Kolmonen, Pekka; Sogacheva, Larisa; Rodríguez, Edith; Saponaro, Giulia; de Leeuw, Gerrit
2018-02-01
Satellite-based aerosol products are routinely validated against ground-based reference data, usually obtained from sun photometer networks such as AERONET (AEROsol RObotic NETwork). In a typical validation exercise a spatial sample of the instantaneous satellite data is compared against a temporal sample of the point-like ground-based data. The observations do not correspond to exactly the same column of the atmosphere at the same time, and the representativeness of the reference data depends on the spatiotemporal variability of the aerosol properties in the samples. The associated uncertainty is known as the collocation mismatch uncertainty (CMU). The validation results depend on the sampling parameters. While small samples involve less variability, they are more sensitive to the inevitable noise in the measurement data. In this paper we systematically study the effect of the sampling parameters in the validation of the AATSR (Advanced Along-Track Scanning Radiometer) aerosol optical depth (AOD) product against AERONET data and the associated collocation mismatch uncertainty. To this end, we study the spatial AOD variability in the satellite data, compare it against the corresponding values obtained from densely located AERONET sites, and assess the possible reasons for observed differences. We find that the spatial AOD variability in the satellite data is approximately 2 times larger than in the ground-based data, and the spatial variability correlates only weakly with that of AERONET for short distances. We interpret this as indicating that only half of the variability in the satellite data is due to natural variability in the AOD, while the rest is noise due to retrieval errors. However, for larger distances (˜ 0.5°) the correlation is improved as the noise is averaged out, and the day-to-day changes in regional AOD variability are well captured. 
Furthermore, we assess the usefulness of the spatial variability of the satellite AOD data as an estimate of CMU by comparing the retrieval errors to the total uncertainty estimates including the CMU in the validation. We find that accounting for CMU increases the fraction of consistent observations.
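Folding a CMU term into the validation can be sketched as follows. The match-up values, retrieval uncertainties, and CMU estimates below are hypothetical; the CMU is combined with the retrieval uncertainty in quadrature, as is standard for independent error sources.

```python
import numpy as np

# Hypothetical validation match-ups: satellite AOD, AERONET AOD, per-match
# retrieval uncertainty, and CMU estimated from local spatial AOD variability
sat  = np.array([0.21, 0.35, 0.12, 0.50, 0.28])
aero = np.array([0.18, 0.30, 0.15, 0.41, 0.27])
u_retrieval = np.array([0.04, 0.04, 0.04, 0.05, 0.03])
cmu = np.array([0.04, 0.05, 0.04, 0.08, 0.01])

def consistent_fraction(u):
    """Fraction of match-ups whose error lies within the stated uncertainty."""
    return np.mean(np.abs(sat - aero) <= u)

u_total = np.sqrt(u_retrieval**2 + cmu**2)   # add CMU in quadrature
print(consistent_fraction(u_retrieval), consistent_fraction(u_total))
```

As in the paper's finding, accounting for CMU increases the fraction of observations consistent with their uncertainty estimates.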
The impacts of uncertainty and variability in groundwater-driven health risk assessment. (Invited)
NASA Astrophysics Data System (ADS)
Maxwell, R. M.
2010-12-01
Potential human health risk from contaminated groundwater is becoming an important, quantitative measure used in management decisions in a range of applications from Superfund to CO2 sequestration. Quantitatively assessing the potential human health risks from contaminated groundwater is challenging due to the many coupled processes, uncertainty in transport parameters, and the variability in individual physiology and behavior. Perspective on human health risk assessment techniques will be presented, and a framework used to predict potential, increased human health risk from contaminated groundwater will be discussed. This framework incorporates transport of contaminants through the subsurface from source to receptor and health risks to individuals via household exposure pathways. The subsurface is subject to both physical and chemical heterogeneity, which affects downstream concentrations at receptors. Cases are presented where hydraulic conductivity can exhibit both uncertainty and spatial variability, in addition to situations where hydraulic conductivity is the dominant source of uncertainty in risk assessment. Management implications, such as characterization and remediation, will also be discussed.
Exploring Land Use and Land Cover Change and Feedbacks in the Global Change Assessment Model
NASA Astrophysics Data System (ADS)
Chen, M.; Vernon, C. R.; Huang, M.; Calvin, K. V.; Le Page, Y.; Kraucunas, I.
2017-12-01
Land Use and Land Cover Change (LULCC) is a major driver of global and regional environmental change. Projections of land use change are thus an essential component in Integrated Assessment Models (IAMs) to study feedbacks between transformation of energy systems and land productivity in the context of climate change. However, the spatial scale of IAMs, e.g., the Global Change Assessment Model (GCAM), is typically larger than the scale of terrestrial processes in the human-Earth system; LULCC downscaling therefore becomes a critical linkage among these multi-scale and multi-sector processes. Parametric uncertainties in LULCC downscaling algorithms, however, have been underexplored, especially in the context of how such uncertainties could propagate to affect energy systems in a changing climate. In this study, we use a LULCC downscaling model, Demeter, to downscale GCAM-based future land use scenarios to fine spatial scales, and explore the sensitivity of downscaled land allocations to key parameters. Land productivity estimates (e.g., biomass production and crop yield) based on the downscaled LULCC scenarios are then fed to GCAM to evaluate how energy systems might change due to altered water and carbon cycle dynamics and their interactions with the human system, which would in turn affect future land use projections. We demonstrate that uncertainties in LULCC downscaling can result in significant differences in simulated scenarios, indicating the importance of quantifying parametric uncertainties in LULCC downscaling models for integrated assessment studies.
NASA Astrophysics Data System (ADS)
Ahmadalipour, Ali; Moradkhani, Hamid; Rana, Arun
2017-04-01
Uncertainty is an inevitable feature of climate change impact assessments. Understanding and quantifying the different sources of uncertainty is of high importance and can help modeling agencies improve current models and scenarios. In this study, we have assessed the future changes in three climate variables (i.e. precipitation, maximum temperature, and minimum temperature) over 10 sub-basins across the Pacific Northwest US. To conduct the study, 10 statistically downscaled CMIP5 GCMs from two downscaling methods (i.e. BCSD and MACA) were utilized at 1/16 degree spatial resolution for the historical period 1970-2000 and the future period 2010-2099. For the future projections, two scenarios, RCP4.5 and RCP8.5, were used. Furthermore, Bayesian Model Averaging (BMA) was employed to develop a probabilistic future projection for each climate variable. Results indicate the superiority of BMA simulations over individual models. Increasing temperature and precipitation are projected at the annual timescale. However, the changes are not uniform among different seasons. Model uncertainty is shown to be the major source of uncertainty, while downscaling uncertainty significantly contributes to the total uncertainty, especially in summer.
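A minimal BMA sketch is shown below. The observations, model simulations, and Gaussian likelihood with a fixed error scale are all hypothetical simplifications; in practice BMA weights are usually fit with an EM algorithm over a training period.

```python
import numpy as np

# Hypothetical historical observations and simulations from three models
obs = np.array([2.1, 2.4, 1.9, 2.6, 2.2])
sims = {
    "GCM-A": np.array([2.0, 2.5, 1.8, 2.7, 2.1]),
    "GCM-B": np.array([2.6, 2.9, 2.4, 3.1, 2.8]),
    "GCM-C": np.array([2.2, 2.3, 2.0, 2.5, 2.3]),
}
sigma = 0.3  # assumed observation/model error scale

# Gaussian log-likelihood of each model given the observations -> BMA weights
log_lik = {m: -0.5 * np.sum(((obs - s) / sigma) ** 2) for m, s in sims.items()}
max_ll = max(log_lik.values())
w = {m: np.exp(ll - max_ll) for m, ll in log_lik.items()}  # stable exponent
z = sum(w.values())
weights = {m: v / z for m, v in w.items()}

# Probabilistic projection: likelihood-weighted ensemble mean
bma_mean = sum(weights[m] * sims[m] for m in sims)
print(weights)
```

Models that track the observations closely (here GCM-A and GCM-C) dominate the weighted projection, while the biased GCM-B is effectively down-weighted.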
Lopiano, Kenneth K; Young, Linda J; Gotway, Carol A
2014-09-01
Spatially referenced datasets arising from multiple sources are routinely combined to assess relationships among various outcomes and covariates. The geographical units associated with the data, such as the geographical coordinates or areal-level administrative units, are often spatially misaligned, that is, observed at different locations or aggregated over different geographical units. As a result, the covariate is often predicted at the locations where the response is observed. The method used to align disparate datasets must be accounted for when subsequently modeling the aligned data. Here we consider the case where kriging is used to align datasets in point-to-point and point-to-areal misalignment problems when the response variable is non-normally distributed. If the relationship is modeled using generalized linear models, the additional uncertainty induced from using the kriging mean as a covariate introduces a Berkson error structure. In this article, we develop a pseudo-penalized quasi-likelihood algorithm to account for the additional uncertainty when estimating regression parameters and associated measures of uncertainty. The method is applied to a point-to-point example assessing the relationship between low-birth weights and PM2.5 levels after the onset of the largest wildfire in Florida history, the Bugaboo scrub fire. A point-to-areal misalignment problem is presented where the relationship between asthma events in Florida's counties and PM2.5 levels after the onset of the fire is assessed. Finally, the method is evaluated using a simulation study. Our results indicate the method performs well in terms of coverage for 95% confidence intervals and naive methods that ignore the additional uncertainty tend to underestimate the variability associated with parameter estimates. The underestimation is most profound in Poisson regression models. © 2014, The International Biometric Society.
Park, Eun Sug; Symanski, Elaine; Han, Daikwon; Spiegelman, Clifford
2015-06-01
A major difficulty with assessing source-specific health effects is that source-specific exposures cannot be measured directly; rather, they need to be estimated by a source-apportionment method such as multivariate receptor modeling. The uncertainty in source apportionment (uncertainty in source-specific exposure estimates and model uncertainty due to the unknown number of sources and identifiability conditions) has been largely ignored in previous studies. Also, spatial dependence of multipollutant data collected from multiple monitoring sites has not yet been incorporated into multivariate receptor modeling. The objectives of this project are (1) to develop a multipollutant approach that incorporates both sources of uncertainty in source apportionment into the assessment of source-specific health effects and (2) to develop enhanced multivariate receptor models that can account for spatial correlations in the multipollutant data collected from multiple sites. We employed a Bayesian hierarchical modeling framework consisting of multivariate receptor models, health-effects models, and a hierarchical model on latent source contributions. For the health model, we focused on the time-series design in this project. Each combination of number of sources and identifiability conditions (additional constraints on model parameters) defines a different model. We built a set of plausible models with extensive exploratory data analyses and with information from previous studies, and then computed posterior model probability to estimate model uncertainty. Parameter estimation and model uncertainty estimation were implemented simultaneously by Markov chain Monte Carlo (MCMC) methods. We validated the methods using simulated data. We illustrated the methods using PM2.5 (particulate matter ≤ 2.5 μm in aerodynamic diameter) speciation data and mortality data from Phoenix, Arizona, and Houston, Texas. 
The Phoenix data included counts of cardiovascular deaths and daily PM2.5 speciation data from 1995-1997. The Houston data included respiratory mortality data and 24-hour PM2.5 speciation data sampled every six days from a region near the Houston Ship Channel in years 2002-2005. We also developed a Bayesian spatial multivariate receptor modeling approach that, while simultaneously dealing with the unknown number of sources and identifiability conditions, incorporated spatial correlations in the multipollutant data collected from multiple sites into the estimation of source profiles and contributions based on the discrete process convolution model for multivariate spatial processes. This new modeling approach was applied to 24-hour ambient air concentrations of 17 volatile organic compounds (VOCs) measured at nine monitoring sites in Harris County, Texas, during years 2000 to 2005. Simulation results indicated that our methods were accurate in identifying the true model and estimated parameters were close to the true values. The results from our methods agreed in general with previous studies on the source apportionment of the Phoenix data in terms of estimated source profiles and contributions. However, we had a greater number of statistically insignificant findings, which was likely a natural consequence of incorporating uncertainty in the estimated source contributions into the health-effects parameter estimation. For the Houston data, a model with five sources (that seemed to be Sulfate-Rich Secondary Aerosol, Motor Vehicles, Industrial Combustion, Soil/Crustal Matter, and Sea Salt) showed the highest posterior model probability among the candidate models considered when fitted simultaneously to the PM2.5 and mortality data. There was a statistically significant positive association between respiratory mortality and same-day PM2.5 concentrations attributed to one of the sources (probably industrial combustion). 
The Bayesian spatial multivariate receptor modeling approach applied to the VOC data led to a highest posterior model probability for a model with five sources (that seemed to be refinery, petrochemical production, gasoline evaporation, natural gas, and vehicular exhaust) among several candidate models, with the number of sources varying between three and seven and with different identifiability conditions. Our multipollutant approach assessing source-specific health effects is more advantageous than a single-pollutant approach in that it can estimate total health effects from multiple pollutants and can also identify emission sources that are responsible for adverse health effects. Our Bayesian approach can incorporate not only uncertainty in the estimated source contributions, but also model uncertainty that has not been addressed in previous studies on assessing source-specific health effects. The new Bayesian spatial multivariate receptor modeling approach enables predictions of source contributions at unmonitored sites, minimizing exposure misclassification and providing improved exposure estimates along with their uncertainty estimates, as well as accounting for uncertainty in the number of sources and identifiability conditions.
NASA Astrophysics Data System (ADS)
Woodward, Simon J. R.; Wöhling, Thomas; Stenger, Roland
2016-03-01
Understanding the hydrological and hydrogeochemical responses of hillslopes and other small scale groundwater systems requires mapping the velocity and direction of groundwater flow relative to the controlling subsurface material features. Since point observations of subsurface materials and groundwater head are often the basis for modelling these complex, dynamic, three-dimensional systems, considerable uncertainties are inevitable, but are rarely assessed. This study explored whether piezometric head data measured at high spatial and temporal resolution over six years at a hillslope research site provided sufficient information to determine the flow paths that transfer nitrate leached from the soil zone through the shallow saturated zone into a nearby wetland and stream. Transient groundwater flow paths were modelled using MODFLOW and MODPATH, with spatial patterns of hydraulic conductivity in the three material layers at the site being estimated by regularised pilot point calibration using PEST, constrained by slug test estimates of saturated hydraulic conductivity at several locations. Subsequent Null Space Monte Carlo uncertainty analysis showed that this data was not sufficient to definitively determine the spatial pattern of hydraulic conductivity at the site, although modelled water table dynamics matched the measured heads with acceptable accuracy in space and time. Particle tracking analysis predicted that the saturated flow direction was similar throughout the year as the water table rose and fell, but was not aligned with either the ground surface or subsurface material contours; indeed the subsurface material layers, having relatively similar hydraulic properties, appeared to have little effect on saturated water flow at the site. 
Flow path uncertainty analysis showed that, while accurate flow path direction or velocity could not be determined on the basis of the available head and slug test data alone, the origin of well water samples relative to the material layers and site contour could still be broadly deduced. This study highlights both the challenge of collecting suitably informative field data with which to characterise subsurface hydrology, and the power of modern calibration and uncertainty modelling techniques to assess flow path uncertainty in hillslopes and other small scale systems.
The standard framework of Ecological Risk Assessment (ERA) uses organism-level assessment endpoints to qualitatively determine the risk to populations. While organism-level toxicity data provide the pathway by which a species may be affected by a chemical stressor, they neither i...
Probabilistic framework for assessing the ice sheet contribution to sea level change.
Little, Christopher M; Urban, Nathan M; Oppenheimer, Michael
2013-02-26
Previous sea level rise (SLR) assessments have excluded the potential for dynamic ice loss over much of Greenland and Antarctica, and recently proposed "upper bounds" on Antarctica's 21st-century SLR contribution are derived principally from regions where present-day mass loss is concentrated (basin 15, or B15, drained largely by Pine Island, Thwaites, and Smith glaciers). Here, we present a probabilistic framework for assessing the ice sheet contribution to sea level change that explicitly accounts for mass balance uncertainty over an entire ice sheet. Applying this framework to Antarctica, we find that ongoing mass imbalances in non-B15 basins give an SLR contribution by 2100 that: (i) is comparable to projected changes in B15 discharge and Antarctica's surface mass balance, and (ii) varies widely depending on the subset of basins and observational dataset used in projections. Increases in discharge uncertainty, or decreases in the exceedance probability used to define an upper bound, increase the fractional contribution of non-B15 basins; even weak spatial correlations in future discharge growth rates markedly enhance this sensitivity. Although these projections rely on poorly constrained statistical parameters, they may be updated with observations and/or models at many spatial scales, facilitating a more comprehensive account of uncertainty that, if implemented, will improve future assessments.
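The effect of spatial correlation among basins can be illustrated with a toy aggregation. The per-basin means and standard deviations below are hypothetical, and the equicorrelated Gaussian model is a deliberate simplification of the paper's statistical framework.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-basin SLR contributions by 2100 (mm): means and sds,
# with the concentrated-loss basin (B15) listed first
means = np.array([40.0, 15.0, 10.0, 8.0])
sds   = np.array([20.0, 8.0, 6.0, 5.0])

def total_slr(rho, n=200_000):
    """Sample the ice-sheet total under an equicorrelated growth-rate model."""
    k = len(means)
    corr = np.full((k, k), rho) + (1 - rho) * np.eye(k)
    cov = corr * np.outer(sds, sds)
    draws = rng.multivariate_normal(means, cov, size=n)
    return draws.sum(axis=1)

for rho in (0.0, 0.3):
    q95 = np.percentile(total_slr(rho), 95)
    print(f"rho = {rho}: 95th-percentile total = {q95:.0f} mm")
```

Even a weak inter-basin correlation (rho = 0.3) visibly raises the 95th-percentile total, echoing the paper's point that spatial correlations in discharge growth rates enhance upper-bound sensitivity.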
Assessing concentration uncertainty estimates from passive microwave sea ice products
NASA Astrophysics Data System (ADS)
Meier, W.; Brucker, L.; Miller, J. A.
2017-12-01
Sea ice concentration is an essential climate variable and passive microwave derived estimates of concentration are one of the longest satellite-derived climate records. However, until recently uncertainty estimates were not provided. Numerous validation studies provided insight into general error characteristics, but the studies have found that concentration error varied greatly depending on sea ice conditions. Thus, an uncertainty estimate from each observation is desired, particularly for initialization, assimilation, and validation of models. Here we investigate three sea ice products that include an uncertainty for each concentration estimate: the NASA Team 2 algorithm product, the EUMETSAT Ocean and Sea Ice Satellite Application Facility (OSI-SAF) product, and the NOAA/NSIDC Climate Data Record (CDR) product. Each product estimates uncertainty with a completely different approach. The NASA Team 2 product derives uncertainty internally from the algorithm method itself. The OSI-SAF uses atmospheric reanalysis fields and a radiative transfer model. The CDR uses spatial variability from two algorithms. Each approach has merits and limitations. Here we evaluate the uncertainty estimates by comparing the passive microwave concentration products with fields derived from the NOAA VIIRS sensor. The results show that the relationship between the product uncertainty estimates and the concentration error (relative to VIIRS) is complex. This may be due to the sea ice conditions, the uncertainty methods, as well as the spatial and temporal variability of the passive microwave and VIIRS products.
Phu, Jack; Kalloniatis, Michael; Khuu, Sieu K.
2018-01-01
Purpose: Current clinical perimetric test paradigms present stimuli randomly at various locations across the visual field (VF), inherently introducing spatial uncertainty, which reduces contrast sensitivity. In the present study, we determined the extent to which spatial uncertainty affects contrast sensitivity in glaucoma patients by minimizing spatial uncertainty through attentional cueing. Methods: Six patients with open-angle glaucoma and six healthy subjects underwent laboratory-based psychophysical testing to measure contrast sensitivity at preselected locations at two eccentricities (9.5° and 17.5°) with two stimulus sizes (Goldmann sizes III and V) under different cueing conditions: 1, 2, 4, or 8 points verbally cued. The Method of Constant Stimuli and a single-interval forced-choice procedure were used to generate frequency-of-seeing (FOS) curves at locations with and without VF defects. Results: At locations with VF defects, cueing minimized spatial uncertainty and improved sensitivity under all conditions. The effect of cueing was maximal when one point was cued and rapidly diminished when more points were cued (no change from baseline with 8 points cued). The slope of the FOS curve steepened with reduced spatial uncertainty. Locations with normal sensitivity in glaucomatous eyes performed similarly to those of healthy subjects. There was a systematic increase in uncertainty with the depth of VF loss. Conclusions: Sensitivity measurements across the VF are negatively affected by spatial uncertainty, which increases with greater VF loss. Minimizing uncertainty can improve sensitivity at locations of deficit. Translational Relevance: Current perimetric techniques introduce spatial uncertainty and may therefore underestimate sensitivity in regions of VF loss. PMID:29600116
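A frequency-of-seeing curve of the kind described above is commonly modelled as a cumulative Gaussian. A minimal sketch (the threshold and slope values in the test are hypothetical, not the study's data) shows how reduced spatial uncertainty, i.e. a smaller slope parameter, steepens the curve:

```python
import math

def fos(contrast, threshold, slope_sd):
    """Frequency-of-seeing curve as a cumulative Gaussian: probability of
    detecting a stimulus at the given contrast. threshold is the 50%-seen
    contrast; slope_sd controls how steep the psychometric function is."""
    z = (contrast - threshold) / slope_sd
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

At one unit above threshold, a curve with slope_sd = 0.5 (low uncertainty) already yields a much higher detection probability than one with slope_sd = 2.0, which is the "steepening with reduced spatial uncertainty" the abstract reports.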
Ribeiro, Manuel C; Pinho, P; Branquinho, C; Llop, Esteve; Pereira, Maria J
2016-08-15
In most studies correlating health outcomes with air pollution, personal exposure assignments are based on measurements collected at air-quality monitoring stations that do not coincide with health data locations. In such cases, interpolators are needed to predict air quality at unsampled locations and to assign personal exposures. Moreover, a measure of the spatial uncertainty of exposures should be incorporated, especially in urban areas where concentrations vary over short distances due to changes in land use and pollution intensity. These studies are limited by the lack of literature comparing exposure uncertainty derived from distinct spatial interpolators. Here, we addressed these issues with two interpolation methods: regression kriging (RK) and ordinary kriging (OK). These methods were used to generate air-quality simulations with a geostatistical algorithm. For each method, the geostatistical uncertainty was drawn from generalized linear model (GLM) analysis. We analyzed the association between air quality and birth weight. Personal health data (n=227) and exposure data were collected in Sines (Portugal) during 2007-2010. Because air-quality monitoring stations in the city do not offer high-spatial-resolution measurements (n=1), we used lichen data as an ecological indicator of air quality (n=83). We found no significant difference in the fit of GLMs with either geostatistical method. With RK, however, models more often fit better and less often fit worse. Moreover, the geostatistical uncertainty results showed a marginally higher mean and precision with RK. Combined with lichen data and land-use data of high spatial resolution, RK is a more effective geostatistical method for relating health outcomes with air quality in urban areas. This is particularly important in small cities, which generally do not have expensive air-quality monitoring stations with high spatial resolution. 
Further, alternative ways of linking human activities with their environment are needed to improve human well-being. Copyright © 2016 Elsevier B.V. All rights reserved.
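The regression-kriging decomposition used above (trend on a covariate plus kriging of residuals) can be roughly illustrated as follows. This is a sketch only: the residual-interpolation step is simplified to inverse-distance weighting rather than true kriging, the data are invented, and a one-dimensional coordinate stands in for map coordinates:

```python
def regression_kriging_sketch(xs, ys, covs, x0, cov0, p=2.0):
    """Regression-kriging decomposition sketch: fit a linear trend of the
    target on a covariate (e.g. a land-use index), then interpolate the
    trend residuals spatially (IDW stands in for kriging here)."""
    n = len(xs)
    mc = sum(covs) / n
    my = sum(ys) / n
    b = sum((c - mc) * (y - my) for c, y in zip(covs, ys)) \
        / sum((c - mc) ** 2 for c in covs)
    a = my - b * mc                              # least-squares trend y = a + b*cov
    resid = [y - (a + b * c) for y, c in zip(ys, covs)]
    num = den = 0.0
    for x, r in zip(xs, resid):
        if x == x0:                              # exact hit: reproduce the residual
            return a + b * cov0 + r
        w = abs(x - x0) ** -p
        num += w * r
        den += w
    return a + b * cov0 + num / den              # trend at x0 + interpolated residual
```

When the data are perfectly explained by the covariate, the residual term vanishes and the prediction reduces to the trend value — the covariate is doing all the work, which is exactly the regime where RK outperforms OK.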
NASA Astrophysics Data System (ADS)
Freer, Jim; Coxon, Gemma; Quinn, Niall; Dunne, Toby; Lane, Rosie; Bates, Paul; Wagener, Thorsten; Woods, Ross; Neal, Jeff; Howden, Nicholas; Musuuza, Jude
2017-04-01
There is a huge challenge in developing hydrological model structures that can be used for hypothesis testing, prediction, impact assessment and risk analyses over a wide range of spatial scales. There are many reasons why this is the case, from computational demands to how we define and characterize different landscape features and pathway connectivities, which differ depending on the objectives of the study. However, there is a greater need than ever to explore the trade-offs between the complexity of the modelling applied (i.e., spatial discretization, level of process representation, complexity of landscape representation) and the benefits realized in terms of predictive capability and the robustness of predictions during hydrological extremes and change. A further balance, particularly associated with prediction uncertainties, is that it is not desirable to have modelling systems more complex than the observed data that would ever be available to constrain them. This is particularly the case when models are applied to quantify national impact assessments, especially if these are based on validation assessments from smaller, more detailed case studies. The hydrological community therefore needs modelling tools and approaches that enable these trade-offs to be explored, and that help determine the level of representation needed for models to be 'fit-for-purpose' for a given application. This paper presents a catchment-scale national modelling framework based on Dynamic-TOPMODEL specifically set up to fulfil these aims. A key component of the modelling framework is its structural flexibility, as is the ability to assess model outputs using Monte Carlo simulation techniques. 
The model build has been automated to work at any spatial scale up to the national scale, and within that to control the level of spatial discretisation and the connectivity of locally accounted landscape elements in the form of hydrological response units (HRUs). This allows for the explicit consideration of spatial rainfall fields; landscape, soil and geological attributes; and the spatial connectivity of hydrological flow pathways, to explore what level of modelling complexity is needed for different prediction problems. We present this framework and show how it can be used in flood and drought risk analyses, as well as to include attributes and features within the landscape to explore societal and climate impacts effectively within an uncertainty analysis framework.
Spatial and Temporal Uncertainty of Crop Yield Aggregations
NASA Technical Reports Server (NTRS)
Porwollik, Vera; Mueller, Christoph; Elliott, Joshua; Chryssanthacopoulos, James; Iizumi, Toshichika; Ray, Deepak K.; Ruane, Alex C.; Arneth, Almut; Balkovic, Juraj; Ciais, Philippe;
2016-01-01
The aggregation of simulated gridded crop yields to national or regional scale requires information on temporal and spatial patterns of crop-specific harvested areas. This analysis estimates the uncertainty of simulated gridded yield time series related to the aggregation with four different harvested area data sets. We compare aggregated yield time series from the Global Gridded Crop Model Inter-comparison project for four crop types from 14 models at global, national, and regional scale to determine aggregation-driven differences in mean yields and temporal patterns as measures of uncertainty. The quantity and spatial patterns of harvested areas differ for individual crops among the four datasets applied for the aggregation. Simulated spatial yield patterns also differ among the 14 models. These differences in harvested areas and simulated yield patterns lead to differences in aggregated productivity estimates, both in mean yield and in the temporal dynamics. Among the four investigated crops, wheat yield (17% relative difference) is most affected by the uncertainty introduced by the aggregation at the global scale. The correlation of temporal patterns of globally aggregated yield time series can be as low as r = 0.28 (soybean). For the majority of countries, mean relative differences of nationally aggregated yields account for 10% or less. The spatial and temporal differences can be substantially higher for individual countries. Among the top-10 crop producers, aggregated national multi-annual mean relative differences of yields can be up to 67% (maize, South Africa), 43% (wheat, Pakistan), 51% (rice, Japan), and 427% (soybean, Bolivia). Correlations of differently aggregated yield time series can be as low as r = 0.56 (maize, India), r = 0.05 (wheat, Russia), r = 0.13 (rice, Vietnam), and r = -0.01 (soybean, Uruguay). 
The aggregation to sub-national scale in comparison to country scale shows that spatial uncertainties can cancel out in countries with large harvested areas per crop type. We conclude that the aggregation uncertainty can be substantial for crop productivity and production estimations in the context of food security, impact assessment, and model evaluation exercises.
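The aggregation step itself is a harvested-area-weighted mean. A toy example (the yields and areas below are invented, not the study's data) shows how two harvested-area datasets produce different national mean yields and hence a nonzero relative difference:

```python
def aggregate_yield(yields, areas):
    """Area-weighted national yield from gridded per-cell yields and
    crop-specific harvested areas (same units across cells)."""
    production = sum(y * a for y, a in zip(yields, areas))
    total_area = sum(areas)
    return production / total_area

def rel_diff(a, b):
    """Relative difference of two aggregates, normalized by their mean."""
    return abs(a - b) / ((a + b) / 2.0)
```

With the same gridded yields but two different area datasets, the aggregated means diverge purely because the weights differ — this is the aggregation-driven uncertainty the abstract quantifies.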
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
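The Monte Carlo weight-sensitivity phase can be sketched for a single map cell as follows. Assumptions (not from the paper): Gaussian weight perturbations with a chosen coefficient of variation and simple renormalization; the study's actual sampling scheme may differ:

```python
import random
import statistics

def weight_uncertainty(criteria_values, weights, cv=0.1, n=2000, seed=7):
    """Monte Carlo weight-sensitivity sketch for one map cell: perturb the
    criteria weights with Gaussian noise (coefficient of variation cv),
    renormalize them to sum to 1, and return the mean and standard
    deviation of the resulting weighted suitability score."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n):
        w = [max(1e-9, wi * (1.0 + rng.gauss(0.0, cv))) for wi in weights]
        s = sum(w)
        w = [wi / s for wi in w]                 # renormalize to sum to 1
        scores.append(sum(wi * ci for wi, ci in zip(w, criteria_values)))
    return statistics.mean(scores), statistics.stdev(scores)
```

A cell whose criteria all score the same is insensitive to the weights (zero spread), while a cell with contrasting criteria scores inherits uncertainty from the weights — which is how weight uncertainty is decomposed and attributed per cell.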
Visual scanning with or without spatial uncertainty and time-sharing performance
NASA Technical Reports Server (NTRS)
Liu, Yili; Wickens, Christopher D.
1989-01-01
An experiment is reported that examines the pattern of task interference between visual scanning, as a sequential and selective attention process, and other concurrent spatial or verbal processing tasks. A distinction is proposed between visual scanning with and without spatial uncertainty, regarding the possible differential effects of these two types of scanning on interference with other concurrent processes. The experiment required the subject to perform a simulated primary tracking task, which was time-shared with a secondary spatial or verbal decision task. The relevant information needed to perform the decision tasks was displayed with or without spatial uncertainty. The experiment employed a 2 x 2 x 2 design with type of scanning (with or without spatial uncertainty), expected scanning distance (low/high), and code of concurrent processing (spatial/verbal) as the three experimental factors. The results provide strong evidence that visual scanning as a spatial exploratory activity produces greater task interference with concurrent spatial tasks than with concurrent verbal tasks. Furthermore, spatial uncertainty in visual scanning is identified as the crucial factor producing this differential effect.
Spatially resolved hazard and exposure assessments: an example of lead in soil at Lavrion, Greece.
Tristán, E; Demetriades, A; Ramsey, M H; Rosenbaum, M S; Stavrakis, P; Thornton, I; Vassiliades, E; Vergou, K
2000-01-01
Spatially resolved hazard assessment (SRHA) and spatially resolved exposure assessment (SREA) are methodologies that have been devised for assessing child exposure to soil containing environmental pollutants. These are based on either a quantitative or a semiquantitative approach. The feasibility of the methodologies has been demonstrated in a study assessing child exposure to Pb accessible in soil at the town of Lavrion in Greece. Using a quantitative approach, both measured and kriged concentrations of Pb in soil are compared with an "established" statutory threshold value. The probabilistic approach gives a refined classification of the contaminated land, since it takes into consideration the uncertainty in both the actual measurement and estimated kriged values. Two exposure assessment models (i.e., IEUBK and HESP) are used as the basis of the quantitative SREA methodologies. The significant correlation between the blood-Pb predictions, using the IEUBK model, and measured concentrations provides a partial validation of the method, because it allows for the uncertainty in the measurements and the lack of some site-specific measurements. The semiquantitative applications of SRHA and SREA incorporate both qualitative information (e.g., land use and dustiness of waste) and quantitative information (e.g., distance from wastes and distance from industry). The significant correlation between the results of these assessments and the measured blood-Pb levels confirms the robust nature of this approach. Successful application of these methodologies could reduce the cost of the assessment and allow areas to be prioritized for further investigation, remediation, or risk management.
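The probabilistic refinement described above — comparing kriged concentrations with a statutory threshold while accounting for estimation uncertainty — can be expressed as an exceedance probability. A minimal sketch, assuming the kriging estimate is Gaussian with a known standard deviation (the concentrations in the test are illustrative, not the Lavrion data):

```python
import math

def prob_exceed(kriged_mean, kriging_sd, threshold):
    """Probability that the true soil concentration exceeds a statutory
    threshold, assuming a Gaussian kriging estimate (mean, sd)."""
    z = (threshold - kriged_mean) / kriging_sd
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

Classifying land by this probability, rather than by comparing the point estimate directly with the threshold, is what gives the refined classification: cells near the threshold with large kriging variance are flagged as uncertain rather than confidently clean or contaminated.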
Zhao, Ruiying; Chen, Songchao; Zhou, Yue; Jin, Bin; Li, Yan
2018-01-01
Assessing heavy metal pollution and delineating pollution are the bases for evaluating pollution and determining a cost-effective remediation plan. Most existing studies are based on the spatial distribution of pollutants but ignore related uncertainty. In this study, eight heavy-metal concentrations (Cr, Pb, Cd, Hg, Zn, Cu, Ni, and Zn) were collected at 1040 sampling sites in a coastal industrial city in the Yangtze River Delta, China. The single pollution index (PI) and Nemerow integrated pollution index (NIPI) were calculated for every surface sample (0–20 cm) to assess the degree of heavy metal pollution. Ordinary kriging (OK) was used to map the spatial distribution of heavy metals content and NIPI. Then, we delineated composite heavy metal contamination based on the uncertainty produced by indicator kriging (IK). The results showed that mean values of all PIs and NIPIs were at safe levels. Heavy metals were most accumulated in the central portion of the study area. Based on IK, the spatial probability of composite heavy metal pollution was computed. The probability of composite contamination in the central core urban area was highest. A probability of 0.6 was found as the optimum probability threshold to delineate polluted areas from unpolluted areas for integrative heavy metal contamination. Results of pollution delineation based on uncertainty showed the proportion of false negative error areas was 6.34%, while the proportion of false positive error areas was 0.86%. The accuracy of the classification was 92.80%. This indicated the method we developed is a valuable tool for delineating heavy metal pollution. PMID:29642623
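The two indices used above have standard closed forms: the single pollution index (PI) is the measured concentration over its regulatory standard, and the Nemerow integrated index (NIPI) combines the maximum and mean PI of a sample. A minimal sketch:

```python
import math

def single_pollution_index(conc, standard):
    """PI: measured concentration divided by its regulatory standard."""
    return conc / standard

def nemerow_index(pis):
    """Nemerow integrated pollution index (NIPI) over a sample's single
    PIs: emphasizes the worst pollutant while retaining the average."""
    p_avg = sum(pis) / len(pis)
    p_max = max(pis)
    return math.sqrt((p_max ** 2 + p_avg ** 2) / 2.0)
```

Because the maximum enters quadratically, a single strongly elevated metal pulls the NIPI above 1 even when the mean PI is safe — the behaviour that makes it useful for flagging composite contamination.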
Uncertainty on shallow landslide hazard assessment: from field data to hazard mapping
NASA Astrophysics Data System (ADS)
Trefolini, Emanuele; Tolo, Silvia; Patelli, Eduardo; Broggi, Matteo; Disperati, Leonardo; Le Tuan, Hai
2015-04-01
Shallow landsliding involving Hillslope Deposits (HD), the surficial soil that covers the bedrock, is an important process of erosion, transport and deposition of sediment along hillslopes. Although shallow landslides generally mobilize relatively small volumes of material, they represent the most hazardous factor in mountain regions due to their high velocity and the common absence of warning signs. Moreover, increasing urbanization and likely climate change make shallow landslides a source of widespread risk, and the interest of the scientific community in this process has therefore grown over the last three decades. One of the main aims of research projects on this topic is to perform robust shallow landslide hazard assessment for wide areas (regional assessment) in order to support sustainable spatial planning. Currently, three main methodologies may be implemented to assess regional shallow landslide hazard: expert evaluation, probabilistic (or data mining) methods, and methods based on physical models. The aim of this work is to evaluate the uncertainty of shallow landslide hazard assessment based on physical models, taking into account spatial variables such as geotechnical and hydrogeological parameters as well as hillslope morphometry. To achieve this goal, a wide dataset of geotechnical properties (shear strength, permeability, depth and unit weight) of HD was gathered by integrating field surveys with in situ and laboratory tests. This spatial database was collected from a study area of about 350 km2 including different bedrock lithotypes and geomorphological features. The uncertainty associated with each step of the hazard assessment process (e.g. field data collection, regionalization of site-specific information and numerical modelling of hillslope stability) was carefully characterized. 
The most appropriate probability density function (PDF) was chosen for each numerical variable, and we assessed the uncertainty propagation on HD strength parameters obtained by empirical relations with geotechnical index properties. Site-specific information was regionalized at map scale by (hard and fuzzy) clustering analysis, taking into account spatial variables such as geology, geomorphology and hillslope morphometric variables (longitudinal and transverse curvature, flow accumulation and slope), the latter derived from a DEM with 10 m cell size. In order to map shallow landslide hazard, Monte Carlo simulation was performed for some common physically based models available in the literature (e.g., SINMAP, SHALSTAB, TRIGRS). Furthermore, a new approach based on the use of Bayesian Networks was proposed and validated. Different models, such as Intervals, Convex Models and Fuzzy Sets, were adopted for the modelling of input parameters. Finally, an accuracy assessment was carried out on the resulting maps, and the propagation of the uncertainty of input parameters into the final shallow landslide hazard estimate was quantified. The outcomes of the analysis are compared and discussed in terms of discrepancy among map pixel values and the related estimated error. The novelty of the proposed method lies in the estimation of the confidence of shallow landslide hazard mapping at the regional level. This allows i) regions where the hazard assessment is robust to be discriminated from areas where more data are necessary to increase the confidence level, and ii) the reliability of the procedure used for hazard assessment to be assessed.
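The Monte Carlo step over physically based stability models can be illustrated with the classic infinite-slope factor of safety, which underlies tools like SINMAP and SHALSTAB. The distributions below for cohesion, friction angle and unit weight are illustrative placeholders, not the study's fitted PDFs:

```python
import math
import random

def failure_probability(slope_deg=35.0, depth=1.5, n=5000, seed=3):
    """Monte Carlo on the dry infinite-slope factor of safety with uncertain
    cohesion c (kPa), friction angle phi (deg) and unit weight (kN/m^3).
    Returns the fraction of draws with FS < 1 (i.e. predicted failure)."""
    rng = random.Random(seed)
    beta = math.radians(slope_deg)
    fails = 0
    for _ in range(n):
        c = max(0.0, rng.gauss(5.0, 2.0))          # effective cohesion
        phi = math.radians(rng.gauss(30.0, 3.0))   # friction angle
        gam = rng.gauss(18.0, 1.0)                 # soil unit weight
        resist = c + gam * depth * math.cos(beta) ** 2 * math.tan(phi)
        drive = gam * depth * math.sin(beta) * math.cos(beta)
        if resist / drive < 1.0:
            fails += 1
    return fails / n
```

Mapping this per-cell failure probability, rather than a single deterministic FS, is what lets the input-parameter uncertainty propagate into the final hazard map.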
The effect of short-range spatial variability on soil sampling uncertainty.
Van der Perk, Marcel; de Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Fajgelj, Ales; Sansone, Umberto; Jeran, Zvonka; Jaćimović, Radojko
2008-11-01
This paper aims to quantify the soil sampling uncertainty arising from the short-range spatial variability of elemental concentrations in the topsoils of agricultural, semi-natural, and contaminated environments. For the agricultural site, the relative standard sampling uncertainty ranges between 1% and 5.5%. For the semi-natural area, the sampling uncertainties are 2-4 times larger than in the agricultural area. The contaminated site exhibited significant short-range spatial variability in elemental composition, which resulted in sampling uncertainties of 20-30%.
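Sampling uncertainty of this kind is typically estimated from duplicate field samples. A sketch of the duplicate-difference estimator (a simplified form; the study's full design may additionally partition analytical variance from sampling variance):

```python
import math

def relative_sampling_uncertainty(dup_pairs):
    """Duplicate-sample sketch: standard sampling uncertainty estimated from
    n duplicate pairs taken a short distance apart, expressed as a
    percentage of the overall mean concentration."""
    n = len(dup_pairs)
    s = math.sqrt(sum((a - b) ** 2 for a, b in dup_pairs) / (2 * n))
    mean = sum(a + b for a, b in dup_pairs) / (2 * n)
    return 100.0 * s / mean
```

Identical duplicates give 0% (no short-range variability), while large within-pair differences drive the relative uncertainty up, as observed at the contaminated site.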
Frequency-Based Spatial Correlation Assessments of the Ares I Subscale Acoustic Model Test Firings
NASA Technical Reports Server (NTRS)
Kenny, R. Jeremy; Houston, J.
2012-01-01
The Marshall Space Flight Center has performed a series of test firings to simulate and understand the acoustic environments generated for the Ares I liftoff profiles. Part of the instrumentation package had special sensor groups to assess the acoustic field spatial correlation features for the various test configurations. The spatial correlation characteristics were evaluated for all of the test firings, inclusive of understanding the diffuse to propagating wave amplitude ratios, the acoustic wave decays, and the incident angle of propagating waves across the sensor groups. These parameters were evaluated across the measured frequency spectra and the associated uncertainties for each parameter were estimated.
Intercomparison and Uncertainty Assessment of Nine Evapotranspiration Estimates Over South America
NASA Astrophysics Data System (ADS)
Sörensson, Anna A.; Ruscica, Romina C.
2018-04-01
This study examines the uncertainties and the representations of anomalies of a set of evapotranspiration products over climatologically distinct regions of South America. The products, coming from land surface models, reanalysis, and remote sensing, are chosen from sources that are readily available to the community of users. The results show that the spatial patterns of maximum uncertainty differ among metrics, with dry regions showing maximum relative uncertainties of annual mean evapotranspiration, while energy-limited regions present maximum uncertainties in the representation of the annual cycle and monsoon regions in the representation of anomalous conditions. Furthermore, it is found that land surface models driven by observed atmospheric fields detect meteorological and agricultural droughts in dry regions unequivocally. The remote sensing products employed do not distinguish all agricultural droughts and this could be attributed to the forcing net radiation. The study also highlights important characteristics of individual data sets and recommends users to include assessments of sensitivity to evapotranspiration data sets in their studies, depending on region and nature of study to be conducted.
Transfer of uncertainty of space-borne high resolution rainfall products at ungauged regions
NASA Astrophysics Data System (ADS)
Tang, Ling
Hydrologically relevant characteristics of high-resolution (~0.25°, 3-hourly) satellite rainfall uncertainty were derived as a function of season and location using a six-year (2002-2007) archive of the National Aeronautics and Space Administration (NASA)'s Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) precipitation data. The Next Generation Radar (NEXRAD) Stage IV rainfall data over the continental United States were used as ground validation (GV) data. A geostatistical mapping scheme was developed and tested for the transfer (i.e., spatial interpolation) of uncertainty information from GV regions to the vast non-GV regions, leveraging the error characterization carried out in the earlier step. The open question explored here was: "If 'error' is defined on the basis of independent ground validation (GV) data, how are error metrics estimated for a satellite rainfall data product without the need for extensive GV data?" After a quantitative analysis of the spatial and temporal structure of the satellite rainfall uncertainty, a proof-of-concept geostatistical mapping scheme (based on the kriging method) was evaluated. The idea was to understand how realistic the idea of 'transfer' is for the GPM era. It was found to be technically possible to transfer error metrics from a gauged to an ungauged location for certain error metrics, and that a regionalized error metric scheme for GPM may be possible. The uncertainty transfer scheme based on a commonly used kriging method (ordinary kriging) was then assessed further at various timescales (climatologic, seasonal, monthly and weekly) and as a function of the density of GV coverage. The results indicated that when the transfer scheme operated at timescales finer than seasonal (ranging from 3-6 hourly to weekly-monthly), the effectiveness of the uncertainty transfer worsened significantly. 
Next, a comprehensive assessment of different kriging methods for the spatial transfer (interpolation) of error metrics was performed. Three kriging methods were compared: ordinary kriging (OK), indicator kriging (IK) and disjunctive kriging (DK). An additional comparison with the simple inverse distance weighting (IDW) method was also performed to quantify the added benefit (if any) of using geostatistical methods. The overall performance ranking of the methods was found to be: OK = DK > IDW > IK. Lastly, various metrics of satellite rainfall uncertainty were identified for two large continental landmasses that share many similar Köppen climate zones, the United States and Australia. The dependence of uncertainty as a function of gauge density was then investigated. The investigation revealed that only the first- and second-order moments of error are readily amenable to a Köppen-type climate classification across different continental landmasses.
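The IDW baseline used in the comparison is straightforward to sketch (the coordinates and metric values in the test are invented):

```python
def idw(points, x0, y0, p=2.0):
    """Inverse-distance-weighted estimate of a gauged error metric at an
    ungauged site (x0, y0); points is a list of (x, y, value) tuples and
    p is the distance-decay power (p=2 is the common default)."""
    num = den = 0.0
    for x, y, v in points:
        d2 = (x - x0) ** 2 + (y - y0) ** 2
        if d2 == 0.0:
            return v                 # exact hit on a gauged location
        w = d2 ** (-p / 2.0)
        num += w * v
        den += w
    return num / den
```

Unlike the kriging variants, IDW ignores the spatial covariance structure of the error metric entirely, which is consistent with it ranking between OK/DK and IK in the study's comparison.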
NASA Astrophysics Data System (ADS)
Zolfaghari, Mohammad R.
2009-07-01
Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping beyond its traditional engineering use to many other applications, such as risk mitigation, disaster management, post-disaster recovery planning, and catastrophe loss estimation and risk management. Due to the lack of proper knowledge of the factors controlling seismic hazards, uncertainties are associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, the majority of the data and assumptions used in seismic hazard studies remain highly uncertain and contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides a practical means of capturing spatial variations in seismological and tectonic characteristics, which allows better treatment of their uncertainties. GIS raster-based data models are used to represent geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors into seismic hazard modelling, such as seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology improves several aspects of the standard analytical tools currently used for the assessment and mapping of regional seismic hazard, makes the best use of recent advancements in both software and hardware, and is well structured for implementation with conventional GIS tools.
A method to estimate the effect of deformable image registration uncertainties on daily dose mapping
Murphy, Martin J.; Salguero, Francisco J.; Siebers, Jeffrey V.; Staub, David; Vaman, Constantin
2012-01-01
Purpose: To develop a statistical sampling procedure for spatially correlated uncertainties in deformable image registration and then use it to demonstrate their effect on daily dose mapping. Methods: Sequential daily CT studies are acquired to map anatomical variations prior to fractionated external beam radiotherapy. The CTs are deformably registered to the planning CT to obtain displacement vector fields (DVFs), which are used to accumulate the dose delivered each day onto the planning CT. Each DVF has spatially correlated uncertainties associated with it. Principal components analysis (PCA) is applied to measured DVF error maps to produce decorrelated principal component modes of the errors. The modes are sampled independently and reconstructed to produce synthetic registration error maps. The synthetic error maps are convolved with dose mapped via deformable registration to model the resulting uncertainty in the dose mapping. The results are compared to the dose mapping uncertainty that would result from uncorrelated DVF errors varying randomly from voxel to voxel. Results: The error sampling method is shown to produce synthetic DVF error maps that are statistically indistinguishable from the observed error maps. Spatially correlated DVF uncertainties modeled by our procedure produce patterns of dose mapping error different from those due to randomly distributed uncertainties. Conclusions: Deformable image registration uncertainties have complex spatial distributions. The authors have developed and tested a method to decorrelate the spatial uncertainties and make statistical samples of highly correlated error maps. The sample error maps can be used to investigate the effect of DVF uncertainties on daily dose mapping via deformable image registration. An initial demonstration of this methodology shows that dose mapping uncertainties can be sensitive to spatial patterns in the DVF uncertainties. PMID:22320766
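The PCA-based error sampling described above can be sketched with NumPy. The "measured" error maps, their dimensions, and the noise model below are hypothetical stand-ins for real DVF error data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for measured DVF error maps: 50 cases x 200 voxels,
# given spatial correlation by mixing voxels through a random matrix.
observed = rng.standard_normal((50, 200)) @ rng.standard_normal((200, 200)) * 0.1

# Decorrelate the error maps into principal component modes (PCA via SVD).
mean_map = observed.mean(axis=0)
centered = observed - mean_map
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
mode_std = s / np.sqrt(len(observed) - 1)   # standard deviation of each mode

def synthetic_error_map():
    """Sample each decorrelated mode independently, then reconstruct
    a spatially correlated synthetic registration error map."""
    coeffs = rng.standard_normal(mode_std.size) * mode_std
    return mean_map + coeffs @ Vt

sample = synthetic_error_map()
```

Because the modes are sampled independently but reconstructed through the same spatial basis, each synthetic map reproduces the observed spatial correlation structure.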
NASA Technical Reports Server (NTRS)
Wang, Weile; Nemani, Ramakrishna R.; Michaelis, Andrew; Hashimoto, Hirofumi; Dungan, Jennifer L.; Thrasher, Bridget L.; Dixon, Keith W.
2016-01-01
The NASA Earth Exchange Global Daily Downscaled Projections (NEX-GDDP) dataset comprises downscaled climate projections derived from 21 General Circulation Model (GCM) runs conducted under the Coupled Model Intercomparison Project Phase 5 (CMIP5), across two of the four greenhouse gas emissions scenarios (RCP4.5 and RCP8.5). Each climate projection includes daily maximum temperature, minimum temperature, and precipitation for the period 1950 through 2100 at a spatial resolution of 0.25 degrees (approximately 25 km x 25 km). The GDDP dataset has been warmly received by the science community for studies of climate change impacts at local to regional scales, but a comprehensive evaluation of its uncertainties is still missing. In this study, we apply the Perfect Model Experiment framework (Dixon et al. 2016) to quantify the key sources of uncertainty arising from the observational baseline dataset, the downscaling algorithm, and some intrinsic assumptions (e.g., the stationarity assumption) inherent to statistical downscaling techniques. We developed a set of metrics to evaluate downscaling errors resulting from bias correction ("quantile mapping"), spatial disaggregation, and the temporal-spatial non-stationarity of climate variability. Our results highlight the spatial disaggregation (or interpolation) errors, which dominate the overall uncertainties of the GDDP dataset, especially over heterogeneous and complex terrain (e.g., mountains and coastal areas). In comparison, the temporal errors in the GDDP dataset tend to be more constrained. Our results also indicate that the downscaled daily precipitation has relatively larger uncertainties than the temperature fields, reflecting the rather stochastic nature of precipitation in space. These results provide insights into improving statistical downscaling algorithms and products in the future.
Nijhof, Carl O P; Huijbregts, Mark A J; Golsteijn, Laura; van Zelm, Rosalie
2016-04-01
We compared the influence of spatial variability in environmental characteristics and of uncertainty in measured substance properties for seven chemicals on freshwater fate factors (FFs), representing the residence time in the freshwater environment, and on exposure factors (XFs), representing the dissolved fraction of a chemical. The influence of spatial variability was quantified using the SimpleBox model, in which Europe was divided into 100 × 100 km regions nested in a regional (300 × 300 km) and supra-regional (500 × 500 km) scale. Uncertainty in substance properties was quantified by means of probabilistic modelling. Spatial variability and parameter uncertainty were expressed by the ratio k of the 95th and 5th percentiles of the FF and XF. Our analysis shows that for persistent chemicals that partition predominantly into one environmental compartment, spatial variability in the FF was up to 2 orders of magnitude larger than uncertainty. For the other (less persistent) chemicals, uncertainty in the FF was up to 1 order of magnitude larger than spatial variability. Variability and uncertainty in the freshwater XFs of the seven chemicals were negligible (k < 1.5). We found that, depending on the chemical and emission scenario, accounting for region-specific environmental characteristics in multimedia fate modelling, as well as accounting for parameter uncertainty, can have a significant influence on freshwater fate factor predictions. We therefore conclude that fate factors should account not only for parameter uncertainty but also for spatial variability, as this further increases the reliability of ecotoxicological impacts in LCA. Copyright © 2016 Elsevier Ltd. All rights reserved.
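The ratio k used above to express spread can be illustrated with a Monte Carlo sample. The lognormal fate-factor distribution below is an assumed placeholder, not actual SimpleBox output:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical Monte Carlo sample of a freshwater fate factor (days),
# standing in for probabilistic multimedia-model output.
ff = rng.lognormal(mean=3.0, sigma=0.5, size=10_000)

# Spread expressed as the ratio k of the 95th to the 5th percentile.
k = np.percentile(ff, 95) / np.percentile(ff, 5)
```

A value of k close to 1 marks a tightly constrained quantity (as reported for the XFs, k < 1.5), while k spanning orders of magnitude signals dominant variability or uncertainty.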
Luan, Hui; Law, Jane; Lysy, Martin
2018-02-01
The neighborhood restaurant environment (NRE) plays a vital role in shaping residents' eating behaviors. While NRE 'healthfulness' is a multifaceted concept, most studies evaluate it based only on restaurant type, largely ignoring variations in in-restaurant features. In the few studies that do account for such features, healthfulness scores are simply averaged over accessible restaurants, concealing any uncertainty attributable to neighborhood size or spatial correlation. To address these limitations, this paper presents a Bayesian spatial factor analysis for assessing NRE healthfulness in the city of Kitchener, Canada. Several in-restaurant characteristics are included. By treating NRE healthfulness as a spatially correlated latent variable, the adopted modeling approach can: (i) identify the specific indicators most relevant to NRE healthfulness, (ii) provide healthfulness estimates for neighborhoods without accessible restaurants, and (iii) readily quantify uncertainties in the healthfulness index. Implications of the analysis for intervention program development and community food planning are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Olea, R.A.; Luppens, J.A.; Tewalt, S.J.
2011-01-01
A common practice for characterizing uncertainty in coal resource assessments has been the itemization of tonnage at the mining unit level and the classification of such units according to distance to drilling holes. Distance criteria, such as those used in U.S. Geological Survey Circular 891, are still widely used for public disclosure. A major deficiency of distance methods is that they do not provide a quantitative measure of uncertainty. Additionally, relying on distance between data points alone does not take into consideration other factors known to have an influence on uncertainty, such as spatial correlation, type of probability distribution followed by the data, geological discontinuities, and boundary of the deposit. Several geostatistical methods have been combined to formulate a quantitative characterization for appraising uncertainty. Drill hole datasets ranging from widespread exploration drilling to detailed development drilling from a lignite deposit in Texas were used to illustrate the modeling. The results show that distance to the nearest drill hole is almost completely unrelated to uncertainty, which confirms the inadequacy of characterizing uncertainty based solely on a simple classification of resources by distance classes. The more complex statistical methods used in this study quantify uncertainty and show good agreement between confidence intervals in the uncertainty predictions and data from additional drilling. ?? 2010.
Design and implementation of a risk assessment module in a spatial decision support system
NASA Astrophysics Data System (ADS)
Zhang, Kaixi; van Westen, Cees; Bakker, Wim
2014-05-01
The spatial decision support system named 'Changes SDSS' is currently under development. The goal of the system is to analyze changing hydro-meteorological hazards and the effects of risk reduction alternatives, in order to support decision makers in choosing the best alternatives. The risk assessment module within the system assesses current risk, analyzes risk after the implementation of risk reduction alternatives, and analyzes risk in different future years under scenarios such as climate change, land use change and population growth. The objective of this work is to present the detailed design and implementation plan of the risk assessment module. The main challenges consist of shifting the risk assessment from traditional desktop software to an open-source web-based platform, the availability of input data, and the inclusion of uncertainties in the risk analysis. The risk assessment module is developed using the Ext JS library for the user interface on the client side, Python for scripting, and PostGIS spatial functions for complex computations on the server side. Comprehensive consideration of the underlying uncertainties in input data can lead to a better quantification of risk and a more reliable Changes SDSS, since the outputs of the risk assessment module are the basis for the decision making module within the system. The implementation of this module will contribute to the development of open-source web-based modules for multi-hazard risk assessment in the future. This work is part of the "CHANGES SDSS" project, funded by the European Community's 7th Framework Program.
Understanding extreme sea levels for coastal impact and adaptation analysis
NASA Astrophysics Data System (ADS)
Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Hinkel, J.; Dangendorf, S.; Slangen, A.
2016-12-01
Coastal impact and adaptation assessments require detailed knowledge of extreme sea levels, because increasing damage due to extreme events, such as storm surges and tropical cyclones, is one of the major consequences of sea level rise and climate change. In fact, the IPCC highlighted in its AR4 report that "societal impacts of sea level change primarily occur via the extreme levels rather than as a direct consequence of mean sea level changes". Over the last few decades, substantial research efforts have been directed towards improved understanding of past and future mean sea level; different scenarios were developed with process-based or semi-empirical models and used for coastal impact assessments at various spatial scales to guide coastal management and adaptation efforts. The uncertainties in future sea level rise are typically accounted for by analyzing the impacts associated with a range of scenarios, leading to a vertical displacement of the distribution of extreme sea levels; indeed, most regional and global studies find little or no evidence for changes in storminess with climate change, although there is still low confidence in these results. More importantly, however, there is still a limited understanding of present-day extreme sea levels, which is largely ignored in most impact and adaptation analyses. The two key uncertainties stem from: (1) the numerical models used to generate long time series of extreme sea levels, whose bias varies spatially and can reach values much larger than the expected sea level rise, although it can be accounted for in most regions by making use of in-situ measurements; and (2) the statistical models used to determine present-day extreme sea-level exceedance probabilities.
There is no universally accepted approach to obtaining such values for flood risk assessments, and while substantial research has explored inter-model uncertainties for mean sea level, we explore here, for the first time, inter-model uncertainties for extreme sea levels at large spatial scales and compare them to the uncertainties in mean sea level projections.
Framework for Uncertainty Assessment - Hanford Site-Wide Groundwater Flow and Transport Modeling
NASA Astrophysics Data System (ADS)
Bergeron, M. P.; Cole, C. R.; Murray, C. J.; Thorne, P. D.; Wurstner, S. K.
2002-05-01
Pacific Northwest National Laboratory is developing and implementing an uncertainty estimation methodology, for use in future site assessments and analyses made with the Hanford site-wide groundwater model, that addresses parameter uncertainty as well as uncertainties related to the groundwater conceptual model. The basic approach of the framework developed for uncertainty assessment consists of: 1) Alternate conceptual model (ACM) identification, to identify and document the major features and assumptions of each conceptual model; the process must also include a periodic review of existing and proposed new conceptual models as data or understanding become available. 2) ACM development of each identified conceptual model through inverse modeling with historical site data. 3) ACM evaluation, to identify which conceptual models are plausible and should be included in any subsequent uncertainty assessments. 4) ACM uncertainty assessments, carried out only for those ACMs determined to be plausible through comparison with historical observations and model structure identification measures. The parameter uncertainty assessment process generally involves: a) model complexity optimization, to identify the important or relevant parameters for the uncertainty analysis; b) characterization of parameter uncertainty, to develop the pdfs for the important uncertain parameters, including identification of any correlations among parameters; c) propagation of uncertainty, to propagate parameter uncertainties (e.g., by first-order second-moment methods if applicable, or by a Monte Carlo approach) through the model to determine the uncertainty in the model predictions of interest.
5) Estimation of combined ACM and scenario uncertainty by a double sum, with each component of the inner sum (an individual CCDF) representing the parameter uncertainty associated with a particular scenario and ACM, and the outer sum enumerating the various plausible ACM and scenario combinations, to represent the combined estimate of uncertainty (a family of CCDFs). A final important part of the framework is the identification, enumeration, and documentation of all assumptions: those made during conceptual model development, those required by the mathematical model, those required by the numerical model, those made during the spatial and temporal discretization process, those needed to assign the statistical model and associated parameters that describe the uncertainty in the relevant input parameters, and finally those required by the propagation method. Pacific Northwest National Laboratory is operated for the U.S. Department of Energy under Contract DE-AC06-76RL01830.
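A minimal sketch of step (c), Monte Carlo propagation of parameter uncertainty into a single CCDF, assuming a toy Darcy-type model and placeholder parameter distributions (these are not Hanford site values):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy Darcy-type prediction q = K * i for one plausible ACM/scenario pair.
# The parameter pdfs below are illustrative placeholders.
K = rng.lognormal(-2.0, 0.8, 5_000)     # hydraulic conductivity (m/d)
i = rng.normal(0.01, 0.002, 5_000)      # hydraulic gradient (-)
q = K * i                               # predicted specific discharge

# One member of the inner sum: the CCDF giving the probability that the
# model prediction exceeds each threshold.
thresholds = np.linspace(q.min(), q.max(), 50)
ccdf = np.array([(q > t).mean() for t in thresholds])
```

Repeating this for every plausible ACM and scenario combination produces the family of CCDFs described above.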
Incorporating uncertainty in predictive species distribution modelling.
Beale, Colin M; Lennon, Jack J
2012-01-19
Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and source of which are often unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been performed, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where development of new statistical tools will improve predictions from distribution models, notably the development of hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, quantifying model goodness-of-fit and assessing the significance of model covariates.
Global Surface Temperature Change and Uncertainties Since 1861
NASA Technical Reports Server (NTRS)
Shen, Samuel S. P.; Lau, William K. M. (Technical Monitor)
2002-01-01
The objective of this talk is to analyze the warming trend, and its uncertainties, of the global and hemispheric surface temperatures. Using a statistical optimal averaging scheme, land surface air temperature and sea surface temperature observational data are used to compute the spatially averaged annual mean surface air temperature. The optimal averaging method is derived from minimizing the mean square error between the true and estimated averages, and uses empirical orthogonal functions. The method can accurately estimate the errors of the spatial average due to observational gaps and random measurement errors. In addition, three independent uncertainty factors are quantified: urbanization, changes in in situ observational practices, and sea surface temperature data corrections. Based on these uncertainties, the best linear fit to annual global surface temperature gives an increase of 0.61 +/- 0.16 °C between 1861 and 2000. This lecture will also touch on the impact of global change on nature and the environment, as well as the latest assessment methods for the attribution of global change.
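The trend-plus-uncertainty estimate can be illustrated with an ordinary least-squares fit to synthetic anomalies. The built-in trend, the 0.1 °C noise level, and the data themselves are assumptions for illustration, not the actual observational record or the optimal averaging method:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic annual anomalies with a built-in 0.61 C rise over 1861-2000.
years = np.arange(1861, 2001)
temps = 0.61 * (years - 1861) / (2000 - 1861) + rng.normal(0.0, 0.1, years.size)

# Ordinary least-squares trend with covariance-based uncertainty.
coef, cov = np.polyfit(years, temps, 1, cov=True)
warming = coef[0] * (2000 - 1861)           # total rise over the record (C)
sigma = np.sqrt(cov[0, 0]) * (2000 - 1861)  # 1-sigma uncertainty on that rise
```

The quoted 0.61 +/- 0.16 °C additionally folds in the systematic factors (urbanization, observational practice changes, data corrections), which a pure regression uncertainty like `sigma` does not capture.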
NASA Astrophysics Data System (ADS)
Lindley, S. J.; Walsh, T.
There are many modelling methods dedicated to the estimation of spatial patterns in pollutant concentrations, each with its distinctive advantages and disadvantages. The derivation of a surface of air quality values from monitoring data alone requires the conversion of point-based data from a limited number of monitoring stations to a continuous surface using interpolation. Since interpolation techniques involve the estimation of data at unsampled points based on calculated relationships between data measured at a number of known sample points, they are subject to some uncertainty, both in terms of the values estimated and their spatial distribution. These uncertainties, which are incorporated into many empirical and semi-empirical mapping methodologies, should be recognised in any further usage of the data and also in the assessment of the extent of an exceedance of an air quality standard and the degree of exposure this may represent. There is a wide range of available interpolation techniques, and the differences in their characteristics result in variations in the output surfaces estimated from the same set of input points. The work presented in this paper examines these uncertainties through the application of a number of interpolation techniques available in standard GIS packages to a case study nitrogen dioxide data set for the Greater Manchester conurbation in northern England. The implications of using different techniques are discussed through application to hourly concentrations during an air quality episode and annual average concentrations in 2001. Patterns of concentrations demonstrate considerable differences in the estimated spatial pattern of maxima, reflecting the combined effects of chemical processes, topography and meteorology. In the case of air quality episodes, the considerable spatial variability of concentrations results in large uncertainties in the surfaces produced, but these uncertainties vary widely from area to area.
In view of the uncertainties associated with classical techniques, research is ongoing to develop alternative methods, which should in time help improve the suite of tools available to air quality managers.
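The disagreement between interpolation techniques can be demonstrated by comparing inverse distance weighting (IDW) with nearest-neighbour assignment; the site locations and NO2 values below are invented for illustration, not the Greater Manchester data set:

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented monitoring network: 12 sites in a 10 x 10 km square with
# hypothetical annual-mean NO2 concentrations (ug/m3).
sites = rng.uniform(0.0, 10.0, (12, 2))
no2 = 30.0 + 10.0 * rng.standard_normal(12)

def idw(p, power=2):
    """Inverse distance weighted estimate at point p."""
    d = np.linalg.norm(sites - p, axis=1)
    if d.min() < 1e-9:
        return float(no2[d.argmin()])   # exactly on a station
    w = 1.0 / d**power
    return float((w * no2).sum() / w.sum())

def nearest(p):
    """Nearest-neighbour (Thiessen polygon) estimate at point p."""
    return float(no2[np.linalg.norm(sites - p, axis=1).argmin()])

# Disagreement between the two surfaces over a coarse output grid: a crude
# map of where the choice of interpolator matters most.
grid = [np.array([x, y]) for x in np.linspace(0, 10, 5)
                         for y in np.linspace(0, 10, 5)]
spread = [abs(idw(p) - nearest(p)) for p in grid]
```

Mapping `spread` highlights the paper's point: the same input points yield different surfaces, and the disagreement itself varies spatially.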
NASA Astrophysics Data System (ADS)
Zomlot, Z.; Verbeiren, B.; Huysmans, M.; Batelaan, O.
2017-11-01
Land use/land cover (LULC) change is a consequence of human-induced global environmental change and is considered one of the major factors affecting groundwater recharge. Uncertainties and inconsistencies in LULC maps are among the difficulties that LULC time-series analysis faces, and they have a significant effect on hydrological impact analysis. Therefore, an accuracy assessment approach for LULC time series is needed for more reliable hydrological analysis and prediction. The objective of this paper is to assess the impact of land use uncertainty and to improve the accuracy of a time series of CORINE (coordination of information on the environment) land cover maps by using a new approach that identifies spatial-temporal LULC change trajectories as a pre-processing tool. This ensures consistency of model input when dealing with land use dynamics, and thereby improves the accuracy of the land use maps and, consequently, of the groundwater recharge estimation. As a case study, the impact of consistent land use changes from 1990 until 2013 on groundwater recharge for the Flanders-Brussels region is assessed. The change trajectory analysis successfully assigned a rational trajectory to 99% of all pixels. The methodology is shown to be powerful in correcting interpretation inconsistencies and overestimation errors in CORINE land cover maps: the overall kappa (cell-by-cell map comparison) improved from 0.6 to 0.8 for the forest class and from 0.2 to 0.7 for the pasture class. The study shows that the inconsistencies in the land use maps introduce uncertainty in groundwater recharge estimation in the range of 10-30%. The analysis showed that during the period 1990-2013 the LULC changes were mainly driven by urban expansion. The results also show that the resolution at which the spatial analysis is performed is important; the recharge differences between the original and corrected CORINE land cover maps increase considerably with increasing spatial resolution.
This study indicates that improving the consistency of land use map time series is of critical importance for assessing land use change and its environmental impact.
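The cell-by-cell map comparison (overall kappa) reported above can be computed as follows; the tiny example rasters are illustrative, not CORINE data:

```python
import numpy as np

def kappa(map_a, map_b):
    """Cell-by-cell Cohen's kappa between two categorical rasters."""
    a, b = np.ravel(map_a), np.ravel(map_b)
    po = (a == b).mean()                         # observed agreement
    pe = sum((a == c).mean() * (b == c).mean()   # agreement expected by chance
             for c in np.union1d(a, b))
    return (po - pe) / (1.0 - pe)

# Tiny illustrative rasters (1 = forest, 2 = pasture, 3 = urban).
reference = np.array([[1, 1, 2], [2, 3, 3]])
corrected = np.array([[1, 1, 2], [2, 3, 1]])
agreement = kappa(reference, corrected)
```

Kappa discounts the agreement that two maps would show by chance alone, which is why it is preferred over raw percent agreement for map comparison.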
NASA Astrophysics Data System (ADS)
Croke, Jacky; Todd, Peter; Thompson, Chris; Watson, Fiona; Denham, Robert; Khanal, Giri
2013-02-01
Advances in remote sensing and digital terrain processing now allow sophisticated analysis of spatial and temporal changes in erosion and deposition. Digital elevation models (DEMs) can be constructed and differenced to produce DEMs of Difference (DoDs), which are used to assess net landscape change for morphological budgeting. To date this has been most effectively achieved in gravel-bed rivers over relatively small spatial scales; if the full potential of the technology is to be realised, additional studies are required at larger scales and across a wider range of geomorphic features. This study presents an assessment of the basin-scale spatial patterns of erosion, deposition, and net morphological change that resulted from a catastrophic flood event in the Lockyer Creek catchment of SE Queensland (SEQ) in January 2011. Multitemporal Light Detection and Ranging (LiDAR) DEMs were used to construct a DoD that was then combined with the one-dimensional hydraulic model HEC-RAS to delineate five major geomorphic landforms, including the inner-channel area, within-channel benches, macrochannel banks, and floodplain. The LiDAR uncertainties were quantified and applied together with a probabilistic representation of uncertainty thresholded at a conservative 95% confidence interval. The elevation change distribution (ECD) for the 100-km2 study area indicates a magnitude of elevation change spanning almost 10 m, but the mean elevation change of 0.04 m confirms that a large part of the landscape was characterised by relatively low-magnitude changes over a large spatial area. Mean elevation changes varied by geomorphic feature, and only two, the within-channel benches and macrochannel banks, were net erosional, with an estimated combined loss of 1,815,149 m3 of sediment. The floodplain was the zone of major net deposition, but mean elevation changes approached the defined critical limit of uncertainty.
Areal and volumetric ECDs for this extreme event provide a representative expression of the balance between erosion and deposition, and importantly sediment redistribution, which is extremely difficult to quantify using more traditional channel planform or cross-sectional surveys. The ability of LiDAR to make a rapid and accurate assessment of key geomorphic processes over large spatial scales contributes to our understanding of key processes and, as demonstrated here, to the assessment of major geomorphological hazards such as extreme flood events.
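A minimal sketch of DoD thresholding at a 95% confidence interval, with an assumed per-DEM vertical uncertainty and a synthetic difference raster (not the Lockyer Creek data):

```python
import numpy as np

rng = np.random.default_rng(5)

# Assumed vertical uncertainty of each LiDAR DEM; the DoD raster is synthetic.
sigma_dem = 0.15                          # m, per DEM
sigma_dod = np.sqrt(2.0) * sigma_dem      # m, propagated into the difference
threshold = 1.96 * sigma_dod              # 95% confidence threshold (~0.42 m)

cell_area = 1.0                           # m2 per cell (assumed)
dod = rng.normal(0.04, 0.5, (100, 100))   # synthetic DEM of Difference (m)

# Keep only elevation changes distinguishable from noise, then budget volumes.
significant = np.abs(dod) > threshold
erosion = dod[significant & (dod < 0)].sum() * cell_area      # m3, negative
deposition = dod[significant & (dod > 0)].sum() * cell_area   # m3, positive
net_change = erosion + deposition
```

Cells below the threshold are excluded from the budget; as in the study, changes near the critical limit of uncertainty (such as the floodplain's mean 0.04 m) should be interpreted cautiously.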
Spatial Downscaling of Alien Species Presences using Machine Learning
NASA Astrophysics Data System (ADS)
Daliakopoulos, Ioannis N.; Katsanevakis, Stelios; Moustakas, Aristides
2017-07-01
Large-scale, high-resolution data on alien species distributions are essential for spatially explicit assessments of their environmental and socio-economic impacts, and for management interventions aimed at mitigation. However, these data are often unavailable. This paper presents a method that relies on Random Forest (RF) models to distribute alien species presence counts onto a finer-resolution grid, thus achieving spatial downscaling. A sufficiently large number of RF models are trained using random subsets of the dataset as predictors, in a bootstrapping approach, to account for the uncertainty introduced by the subset selection. The method is tested with an approximately 8×8 km2 grid containing floral alien species presences and several climatic, habitat, and land use covariates for the Mediterranean island of Crete, Greece. Alien species presence is aggregated at 16×16 km2 and used as a predictor of presence at the original resolution, thus simulating spatial downscaling. Potential explanatory variables included habitat types, land cover richness, endemic species richness, soil type, temperature, precipitation, and freshwater availability. Uncertainty assessment of the spatial downscaling of alien species' occurrences was also performed, and true/false presences and absences were quantified. The approach is promising for downscaling alien species datasets of larger spatial extent but coarse resolution, where the underlying environmental information is available at a finer resolution than the alien species data. Furthermore, the RF architecture allows tuning towards operationally optimal sensitivity and specificity, thus providing a decision support tool for designing a resource-efficient alien species census.
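The bootstrap-of-models idea can be sketched with scikit-learn's RandomForestRegressor. The covariates, the latent presence field, and the subset sizes below are invented placeholders, not the Crete dataset:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(6)

# Invented fine-grid covariates (400 cells x 5 features) and a latent
# presence field standing in for the climatic/habitat/land use data.
X = rng.random((400, 5))
presence = (3.0 * X[:, 0] + rng.random(400) > 2.0).astype(float)

# Bootstrap ensemble: each RF is trained on a random subset of cells,
# so the spread across models reflects subset-selection uncertainty.
predictions = []
for seed in range(10):
    idx = rng.choice(400, 300, replace=True)
    rf = RandomForestRegressor(n_estimators=25, random_state=seed)
    rf.fit(X[idx], presence[idx])
    predictions.append(rf.predict(X))

downscaled = np.mean(predictions, axis=0)   # ensemble presence estimate
uncertainty = np.std(predictions, axis=0)   # per-cell bootstrap spread
```

Thresholding `downscaled` at different cut-offs is what allows the tuning towards operationally optimal sensitivity and specificity mentioned above.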
Petrovskaya, Natalia B.; Forbes, Emily; Petrovskii, Sergei V.; Walters, Keith F. A.
2018-01-01
Studies addressing many ecological problems require accurate evaluation of the total population size. In this paper, we revisit a sampling procedure used to evaluate the abundance of an invertebrate population from assessment data collected on a spatial grid of sampling locations. We first discuss how insufficient information about the spatial population density obtained on a coarse sampling grid may affect the accuracy of an evaluation of total population size. Such an information deficit in field data can arise from inadequate spatial resolution of a spatially variable population density when coarse grids are used, and is especially severe when a strongly heterogeneous spatial population density is sampled. We then argue that the average trap count (the quantity routinely used to quantify abundance), if obtained from a sampling grid that is too coarse, is a random variable because of the uncertainty in sampling spatial data. Finally, we show that a probabilistic approach similar to bootstrapping techniques can be an efficient tool for quantifying the uncertainty in the evaluation procedure in the presence of a spatial pattern reflecting a patchy distribution of invertebrates within the sampling grid. PMID:29495513
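The bootstrapping approach to quantifying trap-count uncertainty can be sketched as follows; the counts and the area represented per trap are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical trap counts from a coarse grid over a patchy distribution:
# most traps catch nothing, a few sit on population patches.
counts = np.array([0, 0, 1, 0, 42, 3, 0, 57, 2, 0, 0, 5])
area_per_trap = 100.0    # m2 represented by each location (assumed)

# Point estimate of total abundance from the average trap count.
total_estimate = counts.mean() * counts.size * area_per_trap

# Bootstrap the mean trap count to quantify evaluation uncertainty.
boot_means = np.array([rng.choice(counts, counts.size, replace=True).mean()
                       for _ in range(5_000)])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
# A wide (lo, hi) interval flags that the grid is too coarse for the patchiness.
```

With strongly heterogeneous counts like these, the interval spans several-fold differences in estimated abundance, illustrating why the average trap count from a coarse grid must be treated as a random variable.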
2017-11-01
magnitude, intensity, and seasonality of climate. For infrastructure projects, relevant design life often exceeds 30 years—a period of time of...uncertainty about future statistical properties of climate at time and spatial scales required for planning and design purposes. Information...about future statistical properties of climate at time and spatial scales required for planning and design, and for assessing future operational
Uncertainties in Emissions Inputs for Near-Road Assessments
Emissions, travel demand, and dispersion models are all needed to obtain temporally and spatially resolved pollutant concentrations. Current methodology combines these three models in a bottom-up approach based on hourly traffic and emissions estimates, and hourly dispersion conc...
Spatial Uncertainty Modeling of Fuzzy Information in Images for Pattern Classification
Pham, Tuan D.
2014-01-01
The modeling of the spatial distribution of image properties is important for many pattern recognition problems in science and engineering. Mathematical methods are needed to quantify the variability of this spatial distribution based on which a decision of classification can be made in an optimal sense. However, image properties are often subject to uncertainty due to both incomplete and imprecise information. This paper presents an integrated approach for estimating the spatial uncertainty of vagueness in images using the theory of geostatistics and the calculus of probability measures of fuzzy events. Such a model for the quantification of spatial uncertainty is utilized as a new image feature extraction method, based on which classifiers can be trained to perform the task of pattern recognition. Applications of the proposed algorithm to the classification of various types of image data suggest the usefulness of the proposed uncertainty modeling technique for texture feature extraction. PMID:25157744
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lunden, Melissa; Faulkner, David; Heredia, Elizabeth
2012-10-01
This report documents experiments performed in three homes to assess the methodology used to determine air exchange rates using passive tracer techniques. The experiments used four different tracer gases emitted simultaneously but implemented with different spatial coverage in the home. Two different tracer gas sampling methods were used. The results characterize the factors in the execution and analysis of the passive tracer technique that affect the uncertainty in the calculated air exchange rates. These factors include uncertainties in tracer gas emission rates, differences in measured concentrations for different tracer gases, temporal and spatial variability of the concentrations, the comparison between different gas sampling methods, and the effect of different ventilation conditions.
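Under a steady-state, well-mixed assumption, the air exchange rate follows from a constant-emission tracer mass balance. All numbers below, and the 5%/10% component uncertainties, are assumed for illustration; they are not values from the report:

```python
import math

# Steady-state, well-mixed tracer balance: emission E is removed only by
# air exchange, so ACH = E / (V * C).
E = 2.0e-3      # m3/h of tracer emitted (assumed)
V = 400.0       # m3, house volume (assumed)
C = 1.25e-6     # steady-state tracer volume fraction measured in the home

ach = E / (V * C)       # air changes per hour -> 4.0 for these inputs

# First-order combination of relative uncertainties in emission rate (5%)
# and measured concentration (10%), both assumed.
rel_uncertainty = math.sqrt(0.05**2 + 0.10**2)
```

The quadrature sum shows why the factors itemized above matter: the concentration term, which absorbs spatial variability and sampling-method differences, dominates the uncertainty in the calculated air exchange rate.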
NASA Astrophysics Data System (ADS)
Neri, Augusto; Bevilacqua, Andrea; Esposti Ongaro, Tomaso; Isaia, Roberto; Aspinall, Willy P.; Bisson, Marina; Flandoli, Franco; Baxter, Peter J.; Bertagnini, Antonella; Iannuzzi, Enrico; Orsucci, Simone; Pistolesi, Marco; Rosi, Mauro; Vitale, Stefano
2015-04-01
Campi Flegrei (CF) is an example of an active caldera containing densely populated settlements at very high risk of pyroclastic density currents (PDCs). We present here an innovative method for assessing background spatial PDC hazard in a caldera setting with probabilistic invasion maps conditional on the occurrence of an explosive event. The method encompasses the probabilistic assessment of potential vent opening positions, derived in the companion paper, combined with inferences about the spatial density distribution of PDC invasion areas from a simplified flow model, informed by reconstruction of deposits from eruptions in the last 15 ka. The flow model describes the PDC kinematics and accounts for main effects of topography on flow propagation. Structured expert elicitation is used to incorporate certain sources of epistemic uncertainty, and a Monte Carlo approach is adopted to produce a set of probabilistic hazard maps for the whole CF area. Our findings show that, in case of eruption, almost the entire caldera is exposed to invasion with a mean probability of at least 5%, with peaks greater than 50% in some central areas. Some areas outside the caldera are also exposed to this danger, with mean probabilities of invasion of the order of 5-10%. Our analysis suggests that these probability estimates have location-specific uncertainties which can be substantial. The results prove to be robust with respect to alternative elicitation models and allow the influence on hazard mapping of different sources of uncertainty, and of theoretical and numerical assumptions, to be quantified.
Spatial and Temporal Flood Risk Assessment for Decision Making Approach
NASA Astrophysics Data System (ADS)
Azizat, Nazirah; Omar, Wan-Mohd-Sabki Wan
2018-03-01
The extent of inundation caused by heavy rainfall depends on the magnitude of the flood. Significantly, the location of settlements, infrastructure and facilities in floodplains results in many regions facing flooding risks. A problem faced by the decision maker in an assessment of flood vulnerability and evaluation of adaptation measures is recurrent flooding in the same areas. Identification of recurrent flooding areas and the frequency of floods should be priorities for flood risk management. However, spatial and temporal variability are major sources of uncertainty in flood risk management. Therefore, capturing the dynamic and spatial characteristics of these changes in flood impact assessment is important for making decisions about the future of infrastructure development and community life. System dynamics (SD) simulation and hydrodynamic modelling are presented as tools for modelling the dynamic characteristics of flood risk and spatial variability. This paper discusses the integration of the spatial and temporal information required by the decision maker for the identification of multi-criteria decision problems involving multiple stakeholders.
Fuzzification of continuous-value spatial evidence for mineral prospectivity mapping
NASA Astrophysics Data System (ADS)
Yousefi, Mahyar; Carranza, Emmanuel John M.
2015-01-01
Complexities of geological processes portrayed as certain features in a map (e.g., faults) are natural sources of uncertainty in decision-making for the exploration of mineral deposits. Besides natural sources of uncertainty, knowledge-driven (e.g., fuzzy logic) mineral prospectivity mapping (MPM) incurs further uncertainty through the subjective judgment of the analyst when there is no reliable, directly measurable evidential score corresponding to the relative importance of geological features. In this regard, analysts apply expert opinion to assess the relative importance of spatial evidence as meaningful decision support. This paper aims at fuzzification of continuous spatial data used as proxy evidence, to facilitate and support fuzzy MPM in generating exploration target areas for further examination of undiscovered deposits. In addition, the paper proposes adapting the concept of expected value to further improve fuzzy logic MPM, because the analysis of uncertain variables can be presented in terms of their expected value. The proposed modified expected value approach to MPM is not only a multi-criteria approach; it also treats the uncertainty of geological processes as depicted by maps or spatial data more realistically, with respect to biased weighting, than classified evidential maps, because fuzzy membership scores are defined continuously; for example, there is no need to categorize distances from evidential features into proximity classes using arbitrary intervals. The proposed continuous weighting approach, followed by integration of the weighted evidence layers using the modified expected value function described in this paper, can be used efficiently in either greenfields or brownfields.
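A logistic function is one common choice for such continuous fuzzification of proxy evidence, avoiding arbitrary proximity classes; a minimal sketch (the inflection point and slope are illustrative assumptions, not the authors' calibration):

```python
import numpy as np

def logistic_membership(x, infl, slope):
    """Continuous fuzzy membership score in (0, 1) for an evidence value x.
    infl: inflection point (membership 0.5); slope: steepness (sign sets
    whether membership increases or decreases with x)."""
    return 1.0 / (1.0 + np.exp(-slope * (x - infl)))

# Illustrative proxy: distance to faults, so membership should *decrease*
# with distance -- hence a negative slope.
dist_m = np.array([0.0, 500.0, 1000.0, 2000.0, 5000.0])
scores = logistic_membership(dist_m, infl=1000.0, slope=-0.004)
```

Each raw distance maps to a continuous evidential weight, so no cell is forced into an arbitrary proximity interval before integration.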
NASA Astrophysics Data System (ADS)
Babcock, Chad; Finley, Andrew O.; Andersen, Hans-Erik; Pattison, Robert; Cook, Bruce D.; Morton, Douglas C.; Alonzo, Michael; Nelson, Ross; Gregoire, Timothy; Ene, Liviu; Gobakken, Terje; Næsset, Erik
2018-06-01
The goal of this research was to develop and examine the performance of a geostatistical coregionalization modeling approach for combining field inventory measurements, strip samples of airborne lidar and Landsat-based remote sensing data products to predict aboveground biomass (AGB) in interior Alaska's Tanana Valley. The proposed modeling strategy facilitates pixel-level mapping of AGB density predictions across the entire spatial domain. Additionally, the coregionalization framework allows for statistically sound estimation of total AGB for arbitrary areal units within the study area---a key advance to support diverse management objectives in interior Alaska. This research focuses on appropriate characterization of prediction uncertainty in the form of posterior predictive coverage intervals and standard deviations. Using the framework detailed here, it is possible to quantify estimation uncertainty for any spatial extent, ranging from pixel-level predictions of AGB density to estimates of AGB stocks for the full domain. The lidar-informed coregionalization models consistently outperformed their counterpart lidar-free models in terms of point-level predictive performance and total AGB precision. Additionally, the inclusion of Landsat-derived forest cover as a covariate further improved estimation precision in regions with lower lidar sampling intensity. Our findings also demonstrate that model-based approaches that do not explicitly account for residual spatial dependence can grossly underestimate uncertainty, resulting in falsely precise estimates of AGB. On the other hand, in a geostatistical setting, residual spatial structure can be modeled within a Bayesian hierarchical framework to obtain statistically defensible assessments of uncertainty for AGB estimates.
Hiloidhari, Moonmoon; Baruah, D C; Singh, Anoop; Kataki, Sampriti; Medhi, Kristina; Kumari, Shilpi; Ramachandra, T V; Jenkins, B M; Thakur, Indu Shekhar
2017-10-01
Sustainability of a bioenergy project depends on precise assessment of the biomass resource, planning of cost-effective logistics and evaluation of possible environmental implications. In this context, this paper reviews the role and applications of geo-spatial tools such as Geographical Information Systems (GIS) for precise agro-residue resource assessment, biomass logistics and power plant design. Further, the application of Life Cycle Assessment (LCA) in understanding the potential impact of agro-residue bioenergy generation on different ecosystem services is also reviewed, and limitations associated with LCA variability and uncertainty are discussed. The usefulness of integrating GIS into LCA (i.e., spatial LCA) to overcome the limitations of conventional LCA and to produce a holistic evaluation of the environmental benefits and concerns of bioenergy is also reviewed. Application of GIS, LCA and spatial LCA can help alleviate the challenges faced by ambitious bioenergy projects by addressing both economic and environmental goals. Copyright © 2017 Elsevier Ltd. All rights reserved.
Assessment of the uncertainty in future projection for summer climate extremes over the East Asia
NASA Astrophysics Data System (ADS)
Park, Changyong; Min, Seung-Ki; Cha, Dong-Hyun
2017-04-01
Future projections of climate extremes at regional and local scales are essential information for better adaptation to climate change. However, future projections carry large uncertainty arising from internal and external processes, which reduces projection confidence. Using CMIP5 (Coupled Model Intercomparison Project Phase 5) multi-model simulations, we assess uncertainties in future projections of East Asian temperature and precipitation extremes, focusing on summer. Projected changes in summer mean and extreme temperature and precipitation over East Asia grow larger with time. Moreover, uncertainty cascades show widening scenario differences and inter-model ranges as time progresses. A positive mean-extreme relation is found in the projections for both temperature and precipitation. The dominant uncertainty factors for temperature and precipitation also change over time. For the uncertainty of mean and extreme temperature, the contributions of internal variability and model uncertainty decline after the mid-21st century, while the role of scenario uncertainty grows rapidly. For the uncertainty of mean precipitation projections, internal variability is more important than scenario uncertainty. Unlike mean precipitation, for extreme precipitation the scenario uncertainty is expected to be the dominant factor by the 2090s. Model uncertainty remains an important factor for both mean and extreme precipitation until the late 21st century. Spatial patterns of the uncertainty factors for mean and extreme projections generally follow the temporal changes in the fraction of total variance attributable to each factor across many grid cells of East Asia.
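The fraction-of-total-variance attribution described above can be illustrated with a toy decomposition in the spirit of such cascade analyses (assuming independent, additive variance components; the numbers are invented for illustration):

```python
def variance_fractions(internal, model, scenario):
    """Fractions of total projection variance attributed to each source,
    assuming the three components are independent and additive."""
    total = internal + model + scenario
    return internal / total, model / total, scenario / total

# Illustrative late-century temperature case: scenario choice dominates,
# internal variability contributes least.
f_int, f_mod, f_scn = variance_fractions(internal=0.1, model=0.3, scenario=0.6)
```

Repeating this per grid cell and per decade yields the kind of spatially and temporally varying dominance maps the abstract describes.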
ACKNOWLEDGEMENTS The research was supported by the Korea Meteorological Administration Research and Development program under grant KMIPA 2015-2083 and by the National Research Foundation of Korea Grant funded by the Ministry of Science, ICT and Future Planning of Korea (NRF-2016M3C4A7952637).
Characterisation of a reference site for quantifying uncertainties related to soil sampling.
Barbizzi, Sabrina; de Zorzi, Paolo; Belli, Maria; Pati, Alessandra; Sansone, Umberto; Stellato, Luisa; Barbina, Maria; Deluisa, Andrea; Menegon, Sandro; Coletti, Valter
2004-01-01
The paper reports a methodology adopted to address problems related to quality assurance in soil sampling. The SOILSAMP project, funded by the Environmental Protection Agency of Italy (APAT), is aimed at (i) establishing protocols for soil sampling in different environments; (ii) assessing the uncertainties associated with different soil sampling methods in order to select the "fit-for-purpose" method; and (iii) qualifying, in terms of trace element spatial variability, a reference site for national and international inter-comparison exercises. Preliminary results and considerations are illustrated.
NASA Astrophysics Data System (ADS)
Slinskey, E. A.; Loikith, P. C.; Waliser, D. E.; Goodman, A.
2017-12-01
Extreme precipitation events are associated with numerous societal and environmental impacts. Furthermore, anthropogenic climate change is projected to alter precipitation intensity across portions of the Continental United States (CONUS). Therefore, a spatial understanding and intuitive means of monitoring extreme precipitation over time is critical. Towards this end, we apply an event-based indicator, developed as a part of NASA's support of the ongoing efforts of the US National Climate Assessment, which assigns categories to extreme precipitation events based on 3-day storm totals as a basis for dataset intercomparison. To assess observational uncertainty across a wide range of historical precipitation measurement approaches, we intercompare in situ station data from the Global Historical Climatology Network (GHCN), satellite-derived precipitation data from NASA's Tropical Rainfall Measuring Mission (TRMM), gridded in situ station data from the Parameter-elevation Regressions on Independent Slopes Model (PRISM), global reanalysis from NASA's Modern Era Retrospective-Analysis version 2 (MERRA 2), and regional reanalysis with gauge data assimilation from NCEP's North American Regional Reanalysis (NARR). Results suggest considerable variability across the five-dataset suite in the frequency, spatial extent, and magnitude of extreme precipitation events. Consistent with expectations, higher resolution datasets were found to resemble station data best and capture a greater frequency of high-end extreme events relative to lower spatial resolution datasets. The degree of dataset agreement varies regionally, however all datasets successfully capture the seasonal cycle of precipitation extremes across the CONUS. These intercomparison results provide additional insight about observational uncertainty and the ability of a range of precipitation measurement and analysis products to capture extreme precipitation event climatology. 
While the event category threshold is fixed in this analysis, preliminary results from the development of a flexible categorization scheme, that scales with grid resolution, are presented.
Uncertainty assessment method for the Cs-137 fallout inventory and penetration depth.
Papadakos, G N; Karangelos, D J; Petropoulos, N P; Anagnostakis, M J; Hinis, E P; Simopoulos, S E
2017-05-01
Within the presented study, soil samples were collected in 2007 at 20 different locations across the Greek terrain, both from the surface and from depths down to 26 cm. Sampling locations were selected primarily from areas where high levels of 137Cs deposition after the Chernobyl accident had already been identified by the Nuclear Engineering Laboratory of the National Technical University of Athens during and after 1986. At one location of relatively high deposition, soil core samples were collected following a 60 m by 60 m Cartesian grid with a 20 m node-to-node distance. Single or paired core samples were also collected from the remaining 19 locations. Sample measurements and analysis were used to estimate the 137Cs inventory and the corresponding depth migration, twenty years after the deposition on Greek terrain. Based on these data, the uncertainty components of the whole sampling-to-results procedure were investigated. A cause-and-effect assessment process was used to apply the law of error propagation and demonstrate that the dominant component of the combined uncertainty is that due to the spatial variability of the contemporary (2007) 137Cs inventory. A secondary, yet also significant, component was identified as the activity measurement process itself. Less significant uncertainty parameters were the sampling methods, the variation of the soil field density with depth and the preparation of samples for measurement. The sampling grid experiment allowed a quantitative evaluation of the uncertainty due to spatial variability, aided by semivariance analysis. A denser, optimized grid could return more accurate values for this component, but at a significantly elevated laboratory cost in terms of both human and material resources.
Using the collected data, and for the case of single-core soil sampling conducted under a well-defined, quality-assured sampling methodology, the uncertainty component due to spatial variability was evaluated at about 19% for the 137Cs inventory and up to 34% for the 137Cs penetration depth. Based on the presented results and on related literature, it is argued that such high uncertainties should be anticipated for single-core samplings conducted using similar methodology and employed as 137Cs inventory and penetration depth estimators. Copyright © 2017 Elsevier Ltd. All rights reserved.
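The semivariance analysis mentioned above estimates, for each separation distance, half the mean squared difference between sample pairs; a minimal isotropic estimator over a few hypothetical cores might look like this (coordinates, activities and distance bins are invented, not measured data):

```python
import numpy as np

def empirical_semivariogram(coords, values, bins):
    """Classical (Matheron) estimator: gamma(h) = mean of 0.5*(z_i - z_j)^2
    over all pairs whose separation distance falls in each bin."""
    n = len(values)
    d, sq = [], []
    for i in range(n):
        for j in range(i + 1, n):
            d.append(np.linalg.norm(coords[i] - coords[j]))
            sq.append(0.5 * (values[i] - values[j]) ** 2)
    d, sq = np.array(d), np.array(sq)
    idx = np.digitize(d, bins)          # bin index per pair
    return np.array([sq[idx == k].mean() if np.any(idx == k) else np.nan
                     for k in range(1, len(bins))])

# Hypothetical 20 m grid of 137Cs inventories (kBq/m^2), for illustration.
coords = np.array([[0, 0], [20, 0], [40, 0], [0, 20]], dtype=float)
values = np.array([30.0, 34.0, 25.0, 29.0])
gamma = empirical_semivariogram(coords, values, bins=[0.0, 25.0, 50.0])
```

Fitting a variogram model to such estimates is what supports grid-density optimization of the kind discussed above.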
NASA Astrophysics Data System (ADS)
Beevers, Lindsay; Collet, Lila
2017-04-01
Over the past decade there have been significant challenges to water management posed by both floods and droughts. In the UK, flooding since 2000 has caused over £5Bn worth of damage, and direct costs from the recent drought (2011-12) are estimated at between £70M and £165M, arising from impacts on public and industrial water supply. Projections of future climate change suggest increasing temperature and precipitation trends, which may exacerbate the frequency and severity of such hazards, but there is significant uncertainty associated with these projections. It thus becomes urgent to assess the possible impact of these changes on extreme flows and to evaluate the uncertainties related to these projections, particularly changes in the seasonality of such hazards. This paper aims to assess the changes in seasonality of peak and low flows across Great Britain as a result of climate change. It is based on the Future Flow database, an 11-member ensemble of transient river flow projections from January 1951 to December 2098. We analyse daily river flow over the baseline (1961-1990) and the 2080s (2069-2098) for 281 gauging stations. For each ensemble member, annual maxima (AMAX) and minima (AMIN) are extracted for both time periods at each gauging station. The month of the year in which the AMAX and AMIN occur is recorded for each of the 30 years in the past and future time periods. The monthly uncertainty of AMAX and AMIN occurrence is assessed across the 11 ensemble members, as well as the changes in this temporal signal between the baseline and the 2080s. Ultimately, this work gives a national, spatial picture of the timing of high and low flows and allows the assessment of possible changes in hydrological dynamics as a result of climate change in a statistical framework. Results quantify the uncertainty related to the climate model parameters which are cascaded through the modelling chain.
This study highlights the issues that changing spatial and temporal trends pose for the management of the hydrological cycle, helping practitioners anticipate and adapt to hydro-hazard changes in an uncertain context.
Modelling ecosystem service flows under uncertainty with stochastic SPAN
Johnson, Gary W.; Snapp, Robert R.; Villa, Ferdinando; Bagstad, Kenneth J.
2012-01-01
Ecosystem service models are increasingly in demand for decision making. However, the data required to run these models are often patchy, missing, outdated, or untrustworthy. Further, communication of data and model uncertainty to decision makers is often either absent or unintuitive. In this work, we introduce a systematic approach to addressing both the data gap and the difficulty in communicating uncertainty through a stochastic adaptation of the Service Path Attribution Networks (SPAN) framework. The SPAN formalism assesses ecosystem services through a set of up to 16 maps, which characterize the services in a study area in terms of flow pathways between ecosystems and human beneficiaries. Although the SPAN algorithms were originally defined deterministically, we present them here in a stochastic framework which combines probabilistic input data with a stochastic transport model in order to generate probabilistic spatial outputs. This enables a novel feature among ecosystem service models: the ability to spatially visualize uncertainty in the model results. The stochastic SPAN model can analyze areas where data limitations are prohibitive for deterministic models. Greater uncertainty in the model inputs (including missing data) should lead to greater uncertainty expressed in the model’s output distributions. By using Bayesian belief networks to fill data gaps and expert-provided trust assignments to augment untrustworthy or outdated information, we can account for uncertainty in input data, producing a model that is still able to run and provide information where strictly deterministic models could not. Taken together, these attributes enable more robust and intuitive modelling of ecosystem services under uncertainty.
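The core idea, probabilistic inputs propagated by Monte Carlo to per-cell output distributions that can be visualized as uncertainty maps, can be sketched independently of the SPAN formalism (the toy 3x3 "source strength" raster and clipping rule below are assumptions for illustration, not the SPAN transport model):

```python
import numpy as np

rng = np.random.default_rng(42)

# Probabilistic input rasters: per-cell means and standard deviations for a
# hypothetical service source strength on a 3x3 grid.
mean = np.array([[1.0, 2.0, 0.5], [0.0, 1.5, 1.0], [2.0, 0.5, 0.0]])
sd = np.full_like(mean, 0.3)
sd[0, 2] = 1.5          # one poorly known cell: wide input distribution

n_draws = 5000
draws = rng.normal(mean, sd, size=(n_draws, *mean.shape))
flow = np.clip(draws, 0.0, None)     # toy rule: no negative service flow

# Probabilistic outputs: a mean map plus an uncertainty map that can be
# visualized side by side, the feature highlighted in the abstract.
out_mean = flow.mean(axis=0)
out_sd = flow.std(axis=0)
```

As expected, greater uncertainty in an input cell propagates to greater spread in that cell's output distribution.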
NASA Astrophysics Data System (ADS)
Baroni, G.; Gräff, T.; Reinstorf, F.; Oswald, S. E.
2012-04-01
Nowadays, uncertainty and sensitivity analysis are considered basic tools for the assessment of hydrological models and the evaluation of the most important sources of uncertainty. In this context, several methods have been developed and applied under different hydrological conditions in recent decades. However, in most cases, studies have investigated mainly the influence of parameter uncertainty on the simulated outputs, and few approaches have also considered other sources of uncertainty, i.e., input and model structure. Moreover, several constraints arise when spatially distributed parameters are involved. To overcome these limitations, a general probabilistic framework based on Monte Carlo simulations and the Sobol method has been proposed. In this study, the general probabilistic framework was applied at field scale using a 1D physically-based hydrological model (SWAP). Furthermore, the framework was extended to catchment scale in combination with a spatially distributed hydrological model (SHETRAN). The models are applied at two different experimental sites in Germany: a relatively flat cropped field close to Potsdam (Brandenburg) and a small mountainous catchment with agricultural land use (Schaefertal, Harz Mountains). For both cases, input and parameters are considered the major sources of uncertainty. Evaluation of the models was based on soil moisture measured at plot scale at different depths and, for the catchment site, also on daily discharge values. The study shows how the framework can take into account all the various sources of uncertainty, i.e., input data, parameters (either in scalar or spatially distributed form) and model structures. The framework can be used in a loop in order to optimize further monitoring activities aimed at improving the performance of the model. In the particular applications, the results show how the sources of uncertainty are specific to each process considered.
The influence of the input data, as well as the presence of compensating errors, becomes clear across the different processes simulated.
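A minimal pick-freeze estimate of first-order Sobol indices for a toy two-input model illustrates the kind of computation such a framework wraps around a full hydrological model (the model, sample size and estimator choice here are illustrative):

```python
import numpy as np

def sobol_first_order(model, n, dim, rng):
    """Saltelli-style pick-freeze estimator of first-order Sobol indices
    for independent U(0,1) inputs."""
    A = rng.random((n, dim))
    B = rng.random((n, dim))
    yA, yB = model(A), model(B)
    var = np.var(np.concatenate([yA, yB]))
    s1 = []
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # replace only the i-th column by B's
        yABi = model(ABi)
        s1.append(np.mean(yB * (yABi - yA)) / var)
    return np.array(s1)

# Toy additive model Y = X1 + 2*X2 with X1, X2 ~ U(0, 1);
# analytic first-order indices are 1/5 and 4/5.
rng = np.random.default_rng(0)
s = sobol_first_order(lambda X: X[:, 0] + 2.0 * X[:, 1], 100_000, 2, rng)
```

In practice each "input" would be a whole forcing series or a spatially distributed parameter field rather than a scalar, but the variance-decomposition logic is the same.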
The effects of spatial population dataset choice on estimates of population at risk of disease
2011-01-01
Background The spatial modeling of infectious disease distributions and dynamics is increasingly being undertaken for health services planning and disease control monitoring, implementation, and evaluation. Where risks are heterogeneous in space or dependent on person-to-person transmission, spatial data on human population distributions are required to estimate infectious disease risks, burdens, and dynamics. Several different modeled human population distribution datasets are available and widely used, but the disparities among them and the implications for enumerating disease burdens and populations at risk have not been considered systematically. Here, we quantify some of these effects using global estimates of populations at risk (PAR) of P. falciparum malaria as an example. Methods The recent construction of a global map of P. falciparum malaria endemicity enabled the testing of different gridded population datasets for providing estimates of PAR by endemicity class. The estimated population numbers within each class were calculated for each country using four different global gridded human population datasets: GRUMP (~1 km spatial resolution), LandScan (~1 km), UNEP Global Population Databases (~5 km), and GPW3 (~5 km). More detailed assessments of PAR variation and accuracy were conducted for three African countries where census data were available at a higher administrative-unit level than used by any of the four gridded population datasets. Results The estimates of PAR based on the datasets varied by more than 10 million people for some countries, even accounting for the fact that estimates of population totals made by different agencies are used to correct national totals in these datasets and can vary by more than 5% for many low-income countries. In many cases, these variations in PAR estimates comprised more than 10% of the total national population. 
The detailed country-level assessments suggested that none of the datasets was consistently more accurate than the others in estimating PAR. The sizes of such differences among modeled human populations were related to variations in the methods, input resolution, and date of the census data underlying each dataset. Data quality varied from country to country within the spatial population datasets. Conclusions Detailed, highly spatially resolved human population data are an essential resource for planning health service delivery for disease control, for the spatial modeling of epidemics, and for decision-making processes related to public health. However, our results highlight that for the low-income regions of the world where disease burden is greatest, existing datasets display substantial variations in estimated population distributions, resulting in uncertainty in disease assessments that utilize them. Increased efforts are required to gather contemporary and spatially detailed demographic data to reduce this uncertainty, particularly in Africa, and to develop population distribution modeling methods that match the rigor, sophistication, and ability to handle uncertainty of contemporary disease mapping and spread modeling. In the meantime, studies that utilize a particular spatial population dataset need to acknowledge the uncertainties inherent within them and consider how the methods and data that comprise each will affect conclusions. PMID:21299885
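The PAR calculation itself reduces to a zonal sum, population counts masked by endemicity class and tallied per dataset; a minimal sketch with two hypothetical population grids whose totals agree but whose cell-level distributions differ:

```python
import numpy as np

def population_at_risk(pop, endemicity, risk_classes):
    """Sum population in cells whose endemicity class is in risk_classes."""
    mask = np.isin(endemicity, risk_classes)
    return float(pop[mask].sum())

# Hypothetical 2x3 rasters: two population datasets with identical national
# totals but different spatial allocation over the same endemicity surface.
endemicity = np.array([[0, 1, 2], [2, 1, 0]])   # 0 = malaria-free
pop_a = np.array([[10., 40., 30.], [20., 10., 5.]])
pop_b = np.array([[30., 20., 25.], [35., 2., 3.]])

par_a = population_at_risk(pop_a, endemicity, risk_classes=[1, 2])
par_b = population_at_risk(pop_b, endemicity, risk_classes=[1, 2])
```

Even with matching totals, the two datasets disagree substantially on PAR, which is the effect the study quantifies at national scale.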
Vanderborght, Jan; Tiktak, Aaldrik; Boesten, Jos J T I; Vereecken, Harry
2011-03-01
For the registration of pesticides in the European Union, model simulations for worst-case scenarios are used to demonstrate that leaching concentrations to groundwater do not exceed a critical threshold. A worst-case scenario is a combination of soil and climate properties for which predicted leaching concentrations are higher than a certain percentile of the spatial concentration distribution within a region. The derivation of scenarios is complicated by uncertainty about soil and pesticide fate parameters. As the ranking of climate and soil property combinations according to predicted leaching concentrations is different for different pesticides, the worst-case scenario for one pesticide may misrepresent the worst case for another pesticide, which leads to 'scenario uncertainty'. Pesticide fate parameter uncertainty led to higher concentrations in the higher percentiles of spatial concentration distributions, especially for distributions in smaller and more homogeneous regions. The effect of pesticide fate parameter uncertainty on the spatial concentration distribution was small when compared with the uncertainty of local concentration predictions and with the scenario uncertainty. Uncertainty in pesticide fate parameters and scenario uncertainty can be accounted for using higher percentiles of spatial concentration distributions and considering a range of pesticides for the scenario selection. Copyright © 2010 Society of Chemical Industry.
Incorporating climate change and morphological uncertainty into coastal change hazard assessments
Baron, Heather M.; Ruggiero, Peter; Wood, Nathan J.; Harris, Erica L.; Allan, Jonathan; Komar, Paul D.; Corcoran, Patrick
2015-01-01
Documented and forecasted trends in rising sea levels and changes in storminess patterns have the potential to increase the frequency, magnitude, and spatial extent of coastal change hazards. To develop realistic adaptation strategies, coastal planners need information about coastal change hazards that recognizes the dynamic temporal and spatial scales of beach morphology, the climate controls on coastal change hazards, and the uncertainties surrounding the drivers and impacts of climate change. We present a probabilistic approach for quantifying and mapping coastal change hazards that incorporates the uncertainty associated with both climate change and morphological variability. To demonstrate the approach, coastal change hazard zones of arbitrary confidence levels are developed for the Tillamook County (State of Oregon, USA) coastline using a suite of simple models and a range of possible climate futures related to wave climate, sea-level rise projections, and the frequency of major El Niño events. Extreme total water levels are more influenced by wave height variability, whereas the magnitude of erosion is more influenced by sea-level rise scenarios. Morphological variability has a stronger influence on the width of coastal hazard zones than the uncertainty associated with the range of climate change scenarios.
Impact of petrophysical uncertainty on Bayesian hydrogeophysical inversion and model selection
NASA Astrophysics Data System (ADS)
Brunetti, Carlotta; Linde, Niklas
2018-01-01
Quantitative hydrogeophysical studies rely heavily on petrophysical relationships that link geophysical properties to hydrogeological properties and state variables. Coupled inversion studies are frequently based on the questionable assumption that these relationships are perfect (i.e., no scatter). Using synthetic examples and crosshole ground-penetrating radar (GPR) data from the South Oyster Bacterial Transport Site in Virginia, USA, we investigate the impact of spatially-correlated petrophysical uncertainty on inferred posterior porosity and hydraulic conductivity distributions and on Bayes factors used in Bayesian model selection. Our study shows that accounting for petrophysical uncertainty in the inversion (I) decreases bias of the inferred variance of hydrogeological subsurface properties, (II) provides more realistic uncertainty assessment and (III) reduces the overconfidence in the ability of geophysical data to falsify conceptual hydrogeological models.
A study protocol to evaluate the relationship between outdoor air pollution and pregnancy outcomes
2010-01-01
Background The present study protocol is designed to assess the relationship between outdoor air pollution and low birth weight and preterm birth outcomes by performing a semi-ecological analysis. Semi-ecological design studies are widely used to assess the effects of air pollution in humans. In this type of analysis, health outcomes and covariates are measured in individuals, and exposure assignments are usually based on air quality monitoring stations. Therefore, estimating individual exposures is one of the major challenges when investigating these relationships with a semi-ecologic design. Methods/Design A semi-ecologic study consisting of a retrospective cohort study with ecologic assignment of exposure is applied. Health outcomes and covariates are collected at a Primary Health Care Center. Data from the pregnancy registry, clinical records and a specific questionnaire administered orally to the mothers of children born in the period 2007-2010 in the Portuguese Alentejo Litoral region are collected by the research team. Outdoor air pollution data are collected with a lichen diversity biomonitoring program, and individual pregnancy exposures are assessed with spatial geostatistical simulation, which provides the basis for uncertainty analysis of individual exposures. Awareness of outdoor air pollution uncertainty will improve the validity of individual exposure assignments for further statistical analysis with multivariate regression models. Discussion Exposure misclassification is an issue of concern in semi-ecological designs. In this study, personal exposures are assigned to each pregnant woman using geocoded address data. A stochastic simulation method is applied to the lichen diversity value index measured at biomonitoring survey locations, in order to assess the spatial uncertainty of the lichen diversity value index at each geocoded address. These methods assume a model for the spatial autocorrelation of exposure and provide a distribution of exposures at each study location.
We believe that variability of simulated exposure values at geocoded addresses will improve knowledge on variability of exposures, improving therefore validity of individual exposures to input in posterior statistical analysis. PMID:20950449
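The geostatistical idea described above, a distribution of simulated exposure values at each geocoded address conditioned on biomonitoring observations, can be illustrated with a minimal conditional-simulation sketch. The exponential covariance model, parameter values, and function names below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def exp_cov(d, sill=1.0, corr_range=500.0):
    """Exponential covariance model: C(d) = sill * exp(-d / range)."""
    return sill * np.exp(-d / corr_range)

def conditional_simulation(obs_xy, obs_val, target_xy, n_real=1000, seed=0):
    """Draw realizations of an exposure index at one geocoded address,
    conditioned on biomonitoring observations (simple-kriging form with
    an assumed zero mean; a sketch, not the published method)."""
    rng = np.random.default_rng(seed)
    # Pairwise distances: observation-observation and observation-target.
    d_oo = np.linalg.norm(obs_xy[:, None, :] - obs_xy[None, :, :], axis=-1)
    d_ot = np.linalg.norm(obs_xy - target_xy, axis=-1)
    C = exp_cov(d_oo)
    c = exp_cov(d_ot)
    w = np.linalg.solve(C, c)            # simple-kriging weights
    mean = w @ obs_val                   # conditional mean at the address
    var = exp_cov(0.0) - w @ c           # conditional (kriging) variance
    return rng.normal(mean, np.sqrt(max(var, 0.0)), size=n_real)
```

The spread of the returned realizations is the per-address exposure uncertainty that the protocol proposes to carry into the regression stage.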
A study protocol to evaluate the relationship between outdoor air pollution and pregnancy outcomes.
Ribeiro, Manuel C; Pereira, Maria J; Soares, Amílcar; Branquinho, Cristina; Augusto, Sofia; Llop, Esteve; Fonseca, Susana; Nave, Joaquim G; Tavares, António B; Dias, Carlos M; Silva, Ana; Selemane, Ismael; de Toro, Joaquin; Santos, Mário J; Santos, Fernanda
2010-10-15
NASA Astrophysics Data System (ADS)
Rasouli, K.; Pomeroy, J. W.; Hayashi, M.; Fang, X.; Gutmann, E. D.; Li, Y.
2017-12-01
The hydrology of mountainous cold regions shows large spatial variability, driven both by climate variability and by near-surface process variability associated with complex terrain and patterns of vegetation, soils, and hydrogeology. There is a need to downscale large-scale atmospheric circulations to the fine scales at which cold regions hydrological processes operate, in order to assess their spatial variability in complex terrain and to quantify uncertainties by comparison with field observations. In this research, three high-resolution numerical weather prediction models, namely the Intermediate Complexity Atmosphere Research (ICAR), Weather Research and Forecasting (WRF), and Global Environmental Multiscale (GEM) models, are used to represent spatial and temporal patterns of atmospheric conditions appropriate for hydrological modelling. An area covering the high mountains and foothills of the Canadian Rockies was selected to assess and compare high-resolution ICAR (1 km × 1 km), WRF (4 km × 4 km), and GEM (2.5 km × 2.5 km) model outputs against station-based meteorological measurements. ICAR, with its very low computational cost, was run with different initial and boundary conditions and at finer spatial resolution, which allowed an assessment of modelling uncertainty and scaling that would have been difficult with WRF. Results show that ICAR, when compared with WRF and GEM, performs very well for precipitation and air temperature modelling in the Canadian Rockies, while all three models show fair performance in simulating wind and humidity fields. The representation of local-scale atmospheric dynamics leading to realistic fields of temperature and precipitation by ICAR, WRF, and GEM makes these models suitable for high-resolution cold regions hydrological prediction in complex terrain, a key factor in estimating water security in western Canada.
Scaling range sizes to threats for robust predictions of risks to biodiversity.
Keith, David A; Akçakaya, H Resit; Murray, Nicholas J
2018-04-01
Assessments of risk to biodiversity often rely on spatial distributions of species and ecosystems. Range-size metrics used extensively in these assessments, such as area of occupancy (AOO), are sensitive to measurement scale, prompting proposals to measure them at finer scales or at different scales based on the shape of the distribution or ecological characteristics of the biota. Despite its dominant role in red-list assessments for decades, appropriate spatial scales of AOO for predicting risks of species' extinction or ecosystem collapse remain untested and contentious. There are no quantitative evaluations of the scale-sensitivity of AOO as a predictor of risks, the relationship between optimal AOO scale and threat scale, or the effect of grid uncertainty. We used stochastic simulation models to explore risks to ecosystems and species with clustered, dispersed, and linear distribution patterns subject to regimes of threat events with different frequency and spatial extent. Area of occupancy was an accurate predictor of risk (0.81<|r|<0.98) and performed optimally when measured with grid cells 0.1-1.0 times the largest plausible area threatened by an event. Contrary to previous assertions, estimates of AOO at these relatively coarse scales were better predictors of risk than finer-scale estimates of AOO (e.g., when measurement cells are <1% of the area of the largest threat). The optimal scale depended on the spatial scales of threats more than the shape or size of biotic distributions. Although we found appreciable potential for grid-measurement errors, current IUCN guidelines for estimating AOO neutralize geometric uncertainty and incorporate effective scaling procedures for assessing risks posed by landscape-scale threats to species and ecosystems. © 2017 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.
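The scale sensitivity of AOO discussed above is easy to reproduce in miniature: counting occupied grid cells at different cell sizes yields very different area estimates for the same point distribution. A minimal sketch (coordinates and cell sizes are made up for illustration):

```python
import numpy as np

def aoo(points, cell):
    """Area of occupancy: number of occupied grid cells times cell area,
    for a given cell size (same length unit as the coordinates)."""
    cells = {(int(np.floor(x / cell)), int(np.floor(y / cell)))
             for x, y in points}
    return len(cells) * cell ** 2

# Three occurrence records: two close together, one distant.
pts = [(0.5, 0.5), (1.5, 0.5), (12.0, 12.0)]
print(aoo(pts, 1))    # fine grid: 3 cells of area 1  -> 3
print(aoo(pts, 10))   # coarse grid: 2 cells of area 100 -> 200
```

The two orders of magnitude between the estimates show why IUCN guidance fixes a reference scale rather than letting assessors choose freely.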
NASA Astrophysics Data System (ADS)
Fazeli Farsani, Iman; Farzaneh, M. R.; Besalatpour, A. A.; Salehi, M. H.; Faramarzi, M.
2018-04-01
The variability and uncertainty of water resources associated with climate change are critical issues in arid and semi-arid regions. In this study, we used the soil and water assessment tool (SWAT) to evaluate the impact of climate change on the spatial and temporal variability of water resources in the Bazoft watershed, Iran. The analysis was based on changes in blue water flow, green water flow, and green water storage for a future period (2010-2099) compared to a historical period (1992-2008). The r-factor, p-factor, R², and Nash-Sutcliffe coefficients for discharge were 1.02, 0.89, 0.80, and 0.80 for the calibration period and 1.03, 0.76, 0.57, and 0.59 for the validation period, respectively. General circulation models (GCMs) under 18 emission scenarios from the IPCC's Fourth (AR4) and Fifth (AR5) Assessment Reports were fed into the SWAT model. At the sub-basin level, blue water tended to decrease while green water flow tended to increase in the future scenarios, and green water storage was predicted to continue its historical trend into the future. At the monthly time scale, the 95% prediction uncertainty bands (95PPU) of blue and green water flows varied widely across the watershed. A large number (18) of climate change scenarios fell within the estimated uncertainty band of the historical period. The large differences among scenarios indicated high levels of uncertainty in the watershed. Our results reveal that the spatial patterns of water resource components and their uncertainties in the context of climate change differ notably between IPCC AR4 and AR5 in the Bazoft watershed. This study provides a strong basis for water supply-demand analyses, and the general analytical framework can be applied to other study areas with similar challenges.
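The calibration statistics quoted above can be computed from observations and an ensemble of simulated discharges. The sketch below follows common SWAT-CUP usage (p-factor as the share of observations inside the 95% prediction band, r-factor as the mean band width relative to the observation standard deviation); the exact formulas used in the study are not given in the abstract:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def p_r_factors(obs, ensemble):
    """p-factor: fraction of observations inside the 95% prediction band
    (95PPU); r-factor: mean band width divided by the observation std.
    `ensemble` has shape (n_members, n_times)."""
    lo, hi = np.percentile(ensemble, [2.5, 97.5], axis=0)
    obs = np.asarray(obs, float)
    p = np.mean((obs >= lo) & (obs <= hi))
    r = np.mean(hi - lo) / np.std(obs)
    return p, r
```

A perfect simulation gives a Nash-Sutcliffe efficiency of 1; values near 0.8, as in the calibration period above, indicate a good fit.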
DOE Office of Scientific and Technical Information (OSTI.GOV)
Urrego-Blanco, Jorge R.; Hunke, Elizabeth C.; Urban, Nathan M.
Here, we implement a variance-based distance metric (Dn) to objectively assess the skill of sea ice models when multiple output variables, or uncertainties in both model predictions and observations, need to be considered. The metric compares observation and model data pairs on common spatial and temporal grids, improving upon highly aggregated metrics (e.g., total sea ice extent or volume) by capturing the spatial character of model skill. The Dn metric is a gamma-distributed statistic that is more general than the χ² statistic commonly used to assess model fit, which requires the assumption that the model is unbiased and can only incorporate observational error in the analysis. The Dn statistic does not assume that the model is unbiased, and allows the incorporation of multiple observational data sets for the same variable, and simultaneously for different variables, along with different types of variances that can characterize uncertainties in both the observations and the model. This approach represents a step toward establishing a systematic framework for probabilistic validation of sea ice models. The methodology is also useful for model tuning, by using the Dn metric as a cost function and incorporating model parametric uncertainty as part of a scheme to optimize model functionality. We apply this approach to evaluate different configurations of the standalone Los Alamos sea ice model (CICE) encompassing the parametric uncertainty in the model, and to find new sets of model configurations that produce better agreement than previous configurations between model and observational estimates of sea ice concentration and thickness.
2017-04-01
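The abstract does not give the closed form of Dn, so the sketch below only illustrates the underlying idea: a model-observation misfit in which both observational and model variances enter the normalization, so that neither error source is ignored. The function name and exact shape are assumptions:

```python
import numpy as np

def variance_weighted_distance(model, obs, var_model, var_obs):
    """Mean variance-weighted squared misfit between model and observation
    fields on a common grid. A sketch of the idea behind a Dn-style
    metric, not the published formula: each grid-point misfit is
    normalized by the combined model and observational variance."""
    num = (np.asarray(model, float) - np.asarray(obs, float)) ** 2
    den = np.asarray(var_model, float) + np.asarray(var_obs, float)
    return float(np.mean(num / den))
```

A value of 0 means a perfect match; a value near 1 means the misfit is comparable to the combined uncertainty, which is the kind of interpretation that makes such a metric usable as a tuning cost function.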
NASA Astrophysics Data System (ADS)
Villoria, Nelson B.; Elliott, Joshua; Müller, Christoph; Shin, Jaewoo; Zhao, Lan; Song, Carol
2018-01-01
Access to climate and spatial datasets by non-specialists is restricted by technical barriers involving hardware, software, and data formats. We discuss an open-source online tool that facilitates downloading climate data from the global circulation models used by the Inter-Sectoral Impact Model Intercomparison Project. The tool also offers temporal and spatial aggregation capabilities for incorporating future climate scenarios into applications where spatial aggregation is important. We hope that streamlined access to these data facilitates the analysis of climate-related issues while considering the uncertainties derived from future climate projections and temporal aggregation choices.
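Spatial aggregation of a gridded field to reporting regions, the core capability such a tool offers, can be sketched as a weighted average per region identifier. The interface and region ids below are hypothetical:

```python
import numpy as np

def aggregate_by_region(grid_values, region_mask, weights=None):
    """Aggregate a 2-D climate field to one value per region, optionally
    weighted (e.g., by grid-cell area or cropland share). `region_mask`
    holds an integer region id for every grid cell."""
    vals = np.asarray(grid_values, float)
    mask = np.asarray(region_mask)
    w = np.ones_like(vals) if weights is None else np.asarray(weights, float)
    out = {}
    for rid in np.unique(mask):
        m = mask == rid
        out[int(rid)] = float(np.sum(vals[m] * w[m]) / np.sum(w[m]))
    return out
```

Because the choice of weights and region boundaries changes the aggregated values, the aggregation step is itself one of the uncertainty sources the abstract mentions.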
A global wind resource atlas including high-resolution terrain effects
NASA Astrophysics Data System (ADS)
Hahmann, Andrea; Badger, Jake; Olsen, Bjarke; Davis, Neil; Larsen, Xiaoli; Badger, Merete
2015-04-01
Currently, no accurate global wind resource dataset is available to fill the needs of policy makers and strategic energy planners. Evaluating wind resources directly from coarse-resolution reanalysis datasets underestimates the true wind energy resource, as the small-scale spatial variability of winds is missing. This missing variability can account for a large part of the local wind resource. Crucially, it is the windiest sites that suffer the largest wind resource errors: in simple terrain the windiest sites may be underestimated by 25%; in complex terrain the underestimate can be as large as 100%. The small-scale spatial variability of winds can be modelled using novel statistical methods and by application of the established microscale models within WAsP, developed at DTU Wind Energy. We present the framework for a single global methodology, which is relatively fast and economical to complete. The method employs reanalysis datasets, which are downscaled to high-resolution wind resource datasets via a so-called generalization step and microscale modelling using WAsP. This method will create the first global wind atlas (GWA), covering all land areas (except Antarctica) and a 30 km coastal zone over water. Verification of the GWA estimates will be done at carefully selected test regions, against verified estimates from mesoscale modelling and satellite synthetic aperture radar (SAR). This verification exercise will also help in estimating the uncertainty of the new wind climate dataset. Uncertainty will be assessed as a function of spatial aggregation. It is expected that the uncertainty at verification sites will be larger than that of dedicated assessments, but that the uncertainty will be reduced at levels of aggregation appropriate for energy planning, and importantly will be much improved relative to what is used today.
In this presentation we discuss the methodology used, which includes the generalization of wind climatologies, and the differences in local and spatially aggregated wind resources that result from using different reanalyses in the various verification regions. A prototype web interface for public access to the data will also be showcased.
ECOLOGICAL RISK ASSESSMENT IN THE CONTEXT OF GLOBAL CLIMATE CHANGE
Landis, Wayne G; Durda, Judi L; Brooks, Marjorie L; Chapman, Peter M; Menzie, Charles A; Stahl, Ralph G; Stauber, Jennifer L
2013-01-01
Changes to sources, stressors, habitats, and geographic ranges; toxicological effects; end points; and uncertainty estimation require significant changes in the implementation of ecological risk assessment (ERA). Because of the lack of analog systems and circumstances in historically studied sites, there is a likelihood of type III error. As a first step, the authors propose a decision key to aid managers and risk assessors in determining when and to what extent climate change should be incorporated. Next, when global climate change is an important factor, the authors recommend seven critical changes to ERA. First, develop conceptual cause–effect diagrams that consider relevant management decisions as well as appropriate spatial and temporal scales to include both direct and indirect effects of climate change and the stressor of management interest. Second, develop assessment end points that are expressed as ecosystem services. Third, evaluate multiple stressors and nonlinear responses—include the chemicals and the stressors related to climate change. Fourth, estimate how climate change will affect or modify management options as the impacts become manifest. Fifth, consider the direction and rate of change relative to management objectives, recognizing that both positive and negative outcomes can occur. Sixth, determine the major drivers of uncertainty, estimating and bounding stochastic uncertainty spatially, temporally, and progressively. Seventh, plan for adaptive management to account for changing environmental conditions and consequent changes to ecosystem services. Good communication is essential for making risk-related information understandable and useful for managers and stakeholders to implement a successful risk-assessment and decision-making process. Environ. Toxicol. Chem. 2013;32:79–92. © 2012 SETAC PMID:23161373
NASA Astrophysics Data System (ADS)
Vergnes, Jean-Pierre; Habets, Florence
2018-05-01
This study aims to assess the sensitivity of river level estimations to the stream-aquifer exchanges within a hydrogeological model of the Upper Rhine alluvial aquifer (France/Germany), characterized as a large shallow aquifer with numerous hydropower dams. Two specific points are addressed: errors associated with digital elevation models (DEMs) and errors associated with the estimation of river level. The fine-resolution raw Shuttle Radar Topographic Mission dataset is used to assess the impact of the DEM uncertainties. Specific corrections are used to overcome these uncertainties: a simple moving average is applied to the topography along the rivers and additional data are used along the Rhine River to account for the numerous dams. Then, the impact of the river-level temporal variations is assessed through two different methods based on observed rating curves and on the Manning formula. Results are evaluated against observation data from 37 river-level points located over the aquifer, 190 piezometers, and a spatial database of wetlands. DEM uncertainties affect the spatial variability of the stream-aquifer exchanges by inducing strong noise and unrealistic peaks. The corrected DEM reduces the biases between observations and simulations by 22 and 51% for the river levels and the river discharges, respectively. It also improves the agreement between simulated groundwater overflows and observed wetlands. Introducing river-level time variability increases the stream-aquifer exchange range and reduces the piezometric head variability. These results confirm the need to better assess river levels in regional hydrogeological modeling, especially for applications in which stream-aquifer exchanges are important.
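The moving-average correction applied to the topography along the rivers can be sketched as a centered running mean over an elevation profile; the window length and edge handling below are illustrative, not the study's configuration:

```python
import numpy as np

def smooth_profile(elev, window=5):
    """Centered moving average along a river's elevation profile, a simple
    way to damp DEM noise and unrealistic peaks before deriving river
    levels. Edges are padded with the boundary value."""
    elev = np.asarray(elev, float)
    kernel = np.ones(window) / window
    pad = window // 2
    padded = np.pad(elev, pad, mode="edge")
    return np.convolve(padded, kernel, mode="valid")
```

A single-cell DEM spike is flattened into a gentle bump, which is exactly the kind of noise suppression that reduced the bias in the simulated river levels.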
While aerosol radiative effects have been recognized as some of the largest sources of uncertainty among the forcers of climate change, the verification of the spatial and temporal variability of aerosol radiative forcing has remained challenging. Anthropogenic emissions of prima...
NASA Astrophysics Data System (ADS)
Choukulkar, Aditya; Brewer, W. Alan; Sandberg, Scott P.; Weickmann, Ann; Bonin, Timothy A.; Hardesty, R. Michael; Lundquist, Julie K.; Delgado, Ruben; Valerio Iungo, G.; Ashton, Ryan; Debnath, Mithu; Bianco, Laura; Wilczak, James M.; Oncley, Steven; Wolfe, Daniel
2017-01-01
Accurate three-dimensional information of wind flow fields can be an important tool in not only visualizing complex flow but also understanding the underlying physical processes and improving flow modeling. However, a thorough analysis of the measurement uncertainties is required to properly interpret results. The XPIA (eXperimental Planetary boundary layer Instrumentation Assessment) field campaign conducted at the Boulder Atmospheric Observatory (BAO) in Erie, CO, from 2 March to 31 May 2015 brought together a large suite of in situ and remote sensing measurement platforms to evaluate complex flow measurement strategies. In this paper, measurement uncertainties for different single and multi-Doppler strategies using simple scan geometries (conical, vertical plane and staring) are investigated. The tradeoffs (such as time-space resolution vs. spatial coverage) among the different measurement techniques are evaluated using co-located measurements made near the BAO tower. Sensitivity of the single-/multi-Doppler measurement uncertainties to averaging period are investigated using the sonic anemometers installed on the BAO tower as the standard reference. Finally, the radiometer measurements are used to partition the measurement periods as a function of atmospheric stability to determine their effect on measurement uncertainty. It was found that with an increase in spatial coverage and measurement complexity, the uncertainty in the wind measurement also increased. For multi-Doppler techniques, the increase in uncertainty for temporally uncoordinated measurements is possibly due to requiring additional assumptions of stationarity along with horizontal homogeneity and less representative line-of-sight velocity statistics. It was also found that wind speed measurement uncertainty was lower during stable conditions compared to unstable conditions.
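A core step behind multi-Doppler wind retrieval is solving for the horizontal wind from line-of-sight velocities measured at several azimuths. A minimal least-squares sketch, assuming a near-zero elevation angle and negligible vertical wind (much stronger assumptions than the campaign's actual processing):

```python
import numpy as np

def retrieve_uv(azimuths_deg, v_radial):
    """Least-squares retrieval of the horizontal wind (u, v) from
    line-of-sight Doppler velocities, using the geometry
    v_r = u*sin(az) + v*cos(az) (elevation ~0, vertical wind ignored)."""
    az = np.radians(np.asarray(azimuths_deg, float))
    A = np.column_stack([np.sin(az), np.cos(az)])
    (u, v), *_ = np.linalg.lstsq(A, np.asarray(v_radial, float), rcond=None)
    return u, v
```

With more than two beams the system is overdetermined, and the residuals of the fit give one handle on the measurement uncertainty that the paper quantifies; the horizontal-homogeneity assumption noted above enters because all beams are taken to sample the same (u, v).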
Robustness of risk maps and survey networks to knowledge gaps about a new invasive pest.
Yemshanov, Denys; Koch, Frank H; Ben-Haim, Yakov; Smith, William D
2010-02-01
In pest risk assessment it is frequently necessary to make management decisions regarding emerging threats under severe uncertainty. Although risk maps provide useful decision support for invasive alien species, they rarely address knowledge gaps associated with the underlying risk model or how they may change the risk estimates. Failure to recognize uncertainty leads to risk-ignorant decisions and miscalculation of expected impacts as well as the costs required to minimize these impacts. Here we use the information gap concept to evaluate the robustness of risk maps to uncertainties in key assumptions about an invading organism. We generate risk maps with a spatial model of invasion that simulates potential entries of an invasive pest via international marine shipments, their spread through a landscape, and establishment on a susceptible host. In particular, we focus on the question of how much uncertainty in risk model assumptions can be tolerated before the risk map loses its value. We outline this approach with an example of a forest pest recently detected in North America, Sirex noctilio Fabricius. The results provide a spatial representation of the robustness of predictions of S. noctilio invasion risk to uncertainty and show major geographic hotspots where the consideration of uncertainty in model parameters may change management decisions about a new invasive pest. We then illustrate how the dependency between the extent of uncertainties and the degree of robustness of a risk map can be used to select a surveillance network design that is most robust to knowledge gaps about the pest.
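The information-gap question posed above, how much uncertainty in model assumptions can be tolerated before the risk map loses its value, can be sketched for a one-parameter risk model. The interval uncertainty model, the endpoint-only worst case (valid only for monotone risk functions), and the threshold are all illustrative assumptions:

```python
def robustness(risk_fn, p_nominal, threshold, alpha_max=1.0, step=0.01):
    """Info-gap robustness sketch: the largest uncertainty horizon alpha
    for which the worst-case risk over the interval
    [p_nominal*(1-alpha), p_nominal*(1+alpha)] stays below `threshold`.
    Worst case is evaluated at the interval endpoints, which is only
    valid when risk_fn is monotone in the parameter."""
    alpha = 0.0
    while alpha + step <= alpha_max:
        a = alpha + step
        worst = max(risk_fn(p_nominal * (1 - a)),
                    risk_fn(p_nominal * (1 + a)))
        if worst > threshold:
            break
        alpha = a
    return alpha
```

A location whose decision survives a large alpha is robust to knowledge gaps; mapping alpha over space gives the kind of robustness map the paper describes.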
Assessing TCE source bioremediation by geostatistical analysis of a flux fence.
Cai, Zuansi; Wilson, Ryan D; Lerner, David N
2012-01-01
Mass discharge across transect planes is increasingly used as a metric for performance assessment of in situ groundwater remediation systems. Mass discharge estimates using concentrations measured in multilevel transects are often made by assuming a uniform flow field, and uncertainty contributions from spatial concentration and flow field variability are often overlooked. We extend our recently developed geostatistical approach to estimate mass discharge using transect data of concentration and hydraulic conductivity, thereby accounting for the spatial variability of both datasets. The magnitude and uncertainty of mass discharge were quantified by conditional simulation. An important benefit of the approach is that uncertainty is quantified as an integral part of the mass discharge estimate. We use this approach for performance assessment of a bioremediation experiment on a trichloroethene (TCE) source zone. Analyses of dissolved parent and daughter compounds demonstrated that the engineered bioremediation elevated the degradation rate of TCE, resulting in a two-thirds reduction in the TCE mass discharge from the source zone. The biologically enhanced dissolution of TCE was not significant (~5%) and was less than expected. However, the discharges of the daughter products cis-1,2-dichloroethene (cDCE) and vinyl chloride (VC) increased, probably because of the rapid transformation of TCE between the source zone and the measurement transect. This suggests that enhancing the biodegradation of cDCE and VC will be crucial to successful engineered bioremediation of TCE source zones. © 2012, The Author(s). Ground Water © 2012, National Ground Water Association.
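A mass-discharge estimate with uncertainty can be sketched by perturbing concentration and hydraulic conductivity across the transect cells and summing cell-by-cell fluxes. The lognormal perturbations below stand in for the paper's conditional geostatistical simulation, and all coefficients of variation, units, and the function interface are illustrative:

```python
import numpy as np

def mass_discharge_mc(conc, K, gradient, cell_area,
                      cv_c=0.3, cv_k=0.5, n=5000, seed=1):
    """Monte Carlo estimate of mass discharge across a transect:
    M = sum_i C_i * q_i * A_i, with Darcy flux q_i = K_i * gradient.
    Independent lognormal perturbations stand in for conditional
    simulation (i.e., spatial correlation is ignored in this sketch)."""
    rng = np.random.default_rng(seed)
    conc = np.asarray(conc, float)
    K = np.asarray(K, float)
    sims = np.empty(n)
    for i in range(n):
        c = conc * rng.lognormal(0.0, cv_c, conc.size)
        k = K * rng.lognormal(0.0, cv_k, K.size)
        sims[i] = np.sum(c * k * gradient * cell_area)
    return sims.mean(), np.percentile(sims, [5, 95])
```

The 5th-95th percentile interval is the integral uncertainty estimate the approach delivers alongside the mass-discharge value itself.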
Observational uncertainty and regional climate model evaluation: A pan-European perspective
NASA Astrophysics Data System (ADS)
Kotlarski, Sven; Szabó, Péter; Herrera, Sixto; Räty, Olle; Keuler, Klaus; Soares, Pedro M.; Cardoso, Rita M.; Bosshard, Thomas; Pagé, Christian; Boberg, Fredrik; Gutiérrez, José M.; Jaczewski, Adam; Kreienkamp, Frank; Liniger, Mark A.; Lussana, Cristian; Szepszo, Gabriella
2017-04-01
Local and regional climate change assessments based on downscaling methods crucially depend on the existence of accurate and reliable observational reference data. In dynamical downscaling via regional climate models (RCMs), observational data can influence model development itself and, later on, model evaluation, parameter calibration, and added-value assessment. In empirical-statistical downscaling, observations serve as predictand data and directly influence model calibration, with corresponding effects on downscaled climate change projections. Focusing on the evaluation of RCMs, we here analyze the influence of uncertainties in observational reference data on evaluation results, in a well-defined performance assessment framework and on a European scale. For this purpose we employ three different gridded observational reference datasets, namely (1) the well-established E-OBS dataset, (2) the recently developed EURO4M-MESAN regional re-analysis, and (3) several national high-resolution and quality-controlled gridded datasets that recently became available. In terms of climate models, five reanalysis-driven experiments carried out by five different RCMs within the EURO-CORDEX framework are used. Two variables (temperature and precipitation) and a range of evaluation metrics that reflect different aspects of RCM performance are considered. We furthermore include an illustrative model ranking exercise and relate observational spread to RCM spread. The results obtained indicate a varying influence of observational uncertainty on model evaluation depending on the variable, the season, the region, and the specific performance metric considered. Over most parts of the continent, the influence of the choice of reference dataset for temperature is rather small for seasonal mean values and inter-annual variability. Here, model uncertainty (as measured by the spread between the five RCM simulations considered) is typically much larger than reference data uncertainty.
For parameters of the daily temperature distribution and for the spatial pattern correlation, however, important dependencies on the reference dataset can arise. The related evaluation uncertainties can be as large or even larger than model uncertainty. For precipitation the influence of observational uncertainty is, in general, larger than for temperature. It often dominates model uncertainty especially for the evaluation of the wet day frequency, the spatial correlation and the shape and location of the distribution of daily values. But even the evaluation of large-scale seasonal mean values can be considerably affected by the choice of the reference. When employing a simple and illustrative model ranking scheme on these results it is found that RCM ranking in many cases depends on the reference dataset employed.
Uncertainty Quantification for Ice Sheet Science and Sea Level Projections
NASA Astrophysics Data System (ADS)
Boening, C.; Schlegel, N.; Limonadi, D.; Schodlok, M.; Seroussi, H. L.; Larour, E. Y.; Watkins, M. M.
2017-12-01
In order to better quantify uncertainties in global mean sea level rise projections and in particular upper bounds, we aim at systematically evaluating the contributions from ice sheets and potential for extreme sea level rise due to sudden ice mass loss. Here, we take advantage of established uncertainty quantification tools embedded within the Ice Sheet System Model (ISSM) as well as sensitivities to ice/ocean interactions using melt rates and melt potential derived from MITgcm/ECCO2. With the use of these tools, we conduct Monte-Carlo style sampling experiments on forward simulations of the Antarctic ice sheet, by varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges. Uncertainty bounds for climate forcing are informed by CMIP5 ensemble precipitation and ice melt estimates for year 2100, and uncertainty bounds for ocean melt rates are derived from a suite of regional sensitivity experiments using MITgcm. Resulting statistics allow us to assess how regional uncertainty in various parameters affect model estimates of century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.
NASA Astrophysics Data System (ADS)
Knoefel, Patrick; Loew, Fabian; Conrad, Christopher
2015-04-01
Crop maps based on classification of remotely sensed data are receiving increasing attention in agricultural management, which calls for more detailed knowledge of the reliability of such spatial information. However, classification of agricultural land use is often limited by the high spectral similarity of the studied crop types. Moreover, spatially and temporally varying agro-ecological conditions can introduce confusion in crop mapping. Classification errors in crop maps may in turn influence model outputs, such as agricultural production monitoring. One major goal of the PhenoS project ("Phenological structuring to determine optimal acquisition dates for Sentinel-2 data for field crop classification") is the detection of optimal phenological time windows for land cover classification purposes. Since many crop species are spectrally highly similar, accurate classification requires the right selection of satellite images for a certain classification task. In the course of one growing season, phenological phases exist in which crops are separable with higher accuracy. For this purpose, coupling multi-temporal spectral characteristics with phenological events is promising. The focus of this study is on the separation of spectrally similar cereal crops such as winter wheat, barley, and rye at two test sites in Germany, "Harz/Central German Lowland" and "Demmin". The study uses object-based random forest (RF) classification to investigate the impact of image acquisition frequency and timing on crop classification uncertainty by permuting all possible combinations of the available RapidEye time series recorded at the test sites between 2010 and 2014. The permutations were applied to different segmentation parameters. Classification uncertainty was then assessed and analysed on a per-field basis using the probabilistic soft output of the RF algorithm. From this soft output, entropy was calculated as a spatial measure of classification uncertainty.
The results indicate that uncertainty estimates provide a valuable addition to traditional accuracy assessments and help the user locate error in crop maps.
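The entropy measure derived from the RF soft output can be sketched as follows; the per-field class probabilities below are hypothetical stand-ins for real classifier output over three cereal classes.

```python
import numpy as np

def per_field_entropy(probs):
    """Shannon entropy (bits) of per-field class-membership probabilities.

    probs: (n_fields, n_classes) array of soft classifier output, rows
    summing to 1. Entropy is 0 for a certain field and log2(n_classes)
    when all classes are equally likely.
    """
    p = np.clip(np.asarray(probs, dtype=float), 1e-12, 1.0)
    return -(p * np.log2(p)).sum(axis=1)

# Hypothetical soft output for three fields (wheat, barley, rye)
probs = np.array([
    [0.90, 0.05, 0.05],    # confidently wheat -> low entropy
    [0.40, 0.35, 0.25],    # spectrally confused -> higher entropy
    [1/3, 1/3, 1/3],       # maximal confusion -> log2(3) bits
])
H = per_field_entropy(probs)
```

Mapping H per field gives exactly the kind of spatial uncertainty layer the study adds to a traditional accuracy assessment.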
Optimal configurations of spatial scale for grid cell firing under noise and uncertainty
Towse, Benjamin W.; Barry, Caswell; Bush, Daniel; Burgess, Neil
2014-01-01
We examined the accuracy with which the location of an agent moving within an environment could be decoded from the simulated firing of systems of grid cells. Grid cells were modelled with Poisson spiking dynamics and organized into multiple ‘modules’ of cells, with firing patterns of similar spatial scale within modules and a wide range of spatial scales across modules. The number of grid cells per module, the spatial scaling factor between modules and the size of the environment were varied. Errors in decoded location can take two forms: small errors of precision and larger errors resulting from ambiguity in decoding periodic firing patterns. With enough cells per module (e.g. eight modules of 100 cells each) grid systems are highly robust to ambiguity errors, even over ranges much larger than the largest grid scale (e.g. over a 500 m range when the maximum grid scale is 264 cm). Results did not depend strongly on the precise organization of scales across modules (geometric, co-prime or random). However, independent spatial noise across modules, which would occur if modules receive independent spatial inputs and might increase with spatial uncertainty, dramatically degrades the performance of the grid system. This effect of spatial uncertainty can be mitigated by uniform expansion of grid scales. Thus, in the realistic regimes simulated here, the optimal overall scale for a grid system represents a trade-off between minimizing spatial uncertainty (requiring large scales) and maximizing precision (requiring small scales). Within this view, the temporary expansion of grid scales observed in novel environments may be an optimal response to increased spatial uncertainty induced by the unfamiliarity of the available spatial cues. PMID:24366144
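A minimal one-dimensional sketch of maximum-likelihood decoding from Poisson grid-cell firing, in the spirit of the simulations above. The module scales, tuning-curve shape, cell counts, and rates are illustrative choices, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

scales = np.array([30.0, 42.0, 59.0, 83.0])     # module scales (cm), illustrative
phases = np.linspace(0, 1, 20, endpoint=False)  # 20 phase-offset cells per module
xs = np.linspace(0, 500, 2001)                  # candidate positions (cm)
dt, rmax = 0.5, 10.0                            # decoding window (s), peak rate (Hz)

def rates(x):
    # Periodic (von Mises-like) 1-D tuning curves: shape (..., module, phase)
    x = np.asarray(x, dtype=float)[..., None, None]
    return rmax * np.exp(3.0 * (np.cos(2 * np.pi * (x / scales[:, None] - phases)) - 1))

x_true = 137.0
spikes = rng.poisson(rates(x_true) * dt)   # simulated Poisson spike counts

# Maximum-likelihood decoding: Poisson log-likelihood over candidate positions
lam = rates(xs) * dt
loglik = (spikes * np.log(lam + 1e-12) - lam).sum(axis=(1, 2))
x_hat = xs[np.argmax(loglik)]
```

With enough cells per module the likelihood has a single dominant peak; with fewer cells or added inter-module noise, secondary peaks at aliased positions produce exactly the ambiguity errors the paper analyses.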
NASA Astrophysics Data System (ADS)
Ahmadalipour, Ali; Moradkhani, Hamid
2017-12-01
Hydrologic modeling is one of the primary tools utilized for drought monitoring and drought early warning systems. Several sources of uncertainty in hydrologic modeling have been addressed in the literature. However, few studies have assessed the uncertainty of gridded observation datasets from a drought monitoring perspective. This study provides a hydrologic modeling oriented analysis of gridded observation data uncertainties over the Pacific Northwest (PNW) and its implications for drought assessment. We utilized a recently developed 100-member ensemble-based observed forcing dataset to simulate hydrologic fluxes at 1/8° spatial resolution using the Variable Infiltration Capacity (VIC) model, and compared the results with a deterministic observation. Meteorological and hydrological droughts are studied at multiple timescales over the basin, and seasonal long-term trends and variations of drought extent are investigated for each case. Results reveal large uncertainty in the observed datasets at the monthly timescale, with systematic differences for temperature records, mainly due to different lapse rates. This uncertainty translates into large disparities in drought characteristics. In general, an increasing trend is found for winter drought extent across the PNW. Furthermore, a ∼3% decrease per decade is detected for snow water equivalent (SWE) over the PNW, with the region being more susceptible to SWE variations in the northern Rockies than in the western Cascades. The agricultural areas of southern Idaho demonstrate a decreasing trend in natural soil moisture as a result of precipitation decline, which implies a greater need for anthropogenic water storage and irrigation systems.
Risk based adaptation of infrastructures to floods and storm surges induced by climate change.
NASA Astrophysics Data System (ADS)
Luna, Byron Quan; Garrè, Luca; Hansen, Peter Friis
2014-05-01
Coastal natural hazards are changing in frequency and intensity in association with climate change. These extreme events, combined with growth in the extent of vulnerable societies, will lead to substantial monetary losses. For this reason, adaptation planning is required to identify effective and adequate measures to withstand the impacts of climate change. Decision strategies are needed for the timing of investments and for the allocation of resources to safeguard the future in a sustainable manner. Adapting structures to climate change requires decision making under uncertainty. It is therefore vital that risk assessments are based on a reliable and appropriate evaluation of the uncertainties involved. Linking a Bayesian network (BN) to a geographic information system (GIS) for risk assessment makes it possible to model all the relevant parameters, their causal relations, and the associated uncertainties. Integrating the probabilistic approach into a GIS allows uncertainties to be quantified and visualized in a spatial manner. By addressing these uncertainties, the Bayesian network approach allows their effects to be quantified and facilitates the identification of future model improvements and of where other efforts should be concentrated. The final results can be applied as a supportive tool for presenting reliable risk assessments to decision-makers. Based on these premises, a case study was performed to assess the storm surge magnitude and flooding extent of an event with characteristics similar to Superstorm Sandy occurring in 2050 and 2090.
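The kind of causal, probabilistic reasoning a BN supports can be sketched with exact enumeration over a tiny surge-defence-flood network; all probability tables below are invented for illustration.

```python
# Priors and conditional tables (hypothetical numbers for illustration)
p_surge_high = 0.3                       # P(surge = high) under a climate scenario
p_fail = {False: 0.02, True: 0.25}       # P(defence fails | surge)
p_flood = {(False, False): 0.01, (False, True): 0.40,
           (True,  False): 0.15, (True,  True): 0.95}  # P(flood | surge, fail)

# Exact inference by enumeration: P(flood) = sum over hidden states
p_flood_total = 0.0
for surge in (False, True):
    p_s = p_surge_high if surge else 1 - p_surge_high
    for fail in (False, True):
        p_f = p_fail[surge] if fail else 1 - p_fail[surge]
        p_flood_total += p_s * p_f * p_flood[(surge, fail)]

# Diagnostic query via Bayes' rule: P(surge = high | flood observed)
joint_high = sum((p_fail[True] if fail else 1 - p_fail[True]) * p_flood[(True, fail)]
                 for fail in (False, True)) * p_surge_high
p_surge_given_flood = joint_high / p_flood_total
```

In the GIS coupling described above, tables like these would be populated per location, and the resulting probabilities mapped spatially.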
NASA Astrophysics Data System (ADS)
Westerberg, Ida
2017-04-01
Understanding and quantifying how hydrological response behaviour varies across catchments, or how catchments change with time, requires reliable discharge data. For reliable estimation of spatial and temporal change, the change in the response behaviour needs to be larger than the uncertainty in the response behaviour estimates that are compared. Understanding how discharge data uncertainty varies between catchments and over time, and how these uncertainties propagate to information derived from the data, is therefore key to drawing the right conclusions in comparative analyses. Uncertainty in discharge data is often highly place-specific, and reliable estimation depends on detailed analyses of the rating curve model and the stage-discharge measurements used to calculate discharge time series from stage (water level) at the gauging station. This underlying information is often not available when discharge data are provided by monitoring agencies. However, even without detailed analyses, the likelihood that the discharge data are uncertain in particular flow ranges can be assessed from information about the gauging station, the flow regime, and the catchment. This type of information is often available for most catchments even if the rating curve data are not. Such 'soft information' on discharge uncertainty may aid interpretation of results from regional and temporal change analyses. In particular, it can help reduce the risk of wrongly interpreting differences in response behaviour caused by discharge uncertainty as real changes. In this presentation I draw on several previous studies to discuss some of the factors that affect discharge data uncertainty and give examples from catchments worldwide.
I aim to 1) illustrate the consequences of discharge data uncertainty on comparisons of different types of hydrological response behaviour across catchments and when analysing temporal change, and 2) give practical advice as to what factors may help identify catchments with potentially large discharge uncertainty.
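One common source of the place-specific uncertainty discussed above is the fitted rating curve itself. A hedged sketch, assuming a standard power-law rating curve with hypothetical parameter uncertainties:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 20_000

# Power-law rating curve Q = a * (h - h0)^b; the parameter means and SDs
# are hypothetical values standing in for a fit to stage-discharge gaugings
a  = rng.normal(5.0, 0.5, N)
b  = rng.normal(1.8, 0.1, N)
h0 = rng.normal(0.20, 0.05, N)   # cease-to-flow stage (m)

def discharge(h):
    """Monte Carlo ensemble of discharge (m^3/s) at stage h (m)."""
    return a * np.maximum(h - h0, 0.0) ** b

# Relative 95% interval width at a low and an extrapolated high stage
q_low, q_high = discharge(0.5), discharge(5.0)
rel_width = [np.diff(np.percentile(q, [2.5, 97.5]))[0] / np.median(q)
             for q in (q_low, q_high)]
```

With these assumed parameters, low-flow uncertainty is dominated by the cease-to-flow stage h0, while high-flow uncertainty is driven by extrapolating the exponent b, which is one reason discharge uncertainty is so flow-range dependent.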
Chonggang Xu; Hong S. He; Yuanman Hu; Yu Chang; Xiuzhen Li; Rencang Bu
2005-01-01
Geostatistical stochastic simulation is commonly combined with Monte Carlo methods to quantify uncertainty in spatial model simulations. However, because of the long running times of spatially explicit forest models, a consequence of their complexity, it is often infeasible to generate hundreds or thousands of Monte Carlo simulations. Thus, it is of great...
Douziech, Mélanie; van Zelm, Rosalie; Oldenkamp, Rik; Franco, Antonio; Hendriks, A Jan; King, Henry; Huijbregts, Mark A J
2018-02-01
Deriving reliable estimates of chemical emissions to the environment is a key challenge for impact and risk assessment methods and typically the associated uncertainty is not characterised. We have developed an approach to spatially quantify annual chemical emission loads to the aquatic environment together with their associated uncertainty using consumer survey data and publicly accessible and non-confidential data sources. The approach is applicable for chemicals widely used across a product sector. Product usage data from consumer survey studies in France, the Netherlands, South Korea and the USA were combined with information on typical product formulations, wastewater removal rates, and the spatial distribution of populations and wastewater treatment plants (WWTPs) in the four countries. Results are presented for three chemicals common to three types of personal care products (shampoo, conditioner, and bodywash) at WWTP and national levels. Uncertainty in WWTP-specific emission estimates was characterised with a 95% confidence interval and ranged up to a factor of 4.8 around the mean, mainly due to uncertainty associated with removal efficiency. Estimates of whole country product usage were comparable to total market estimates derived from sectorial market sales data with differences ranging from a factor 0.8 (for the Netherlands) to 5 (for the USA). The proposed approach is suitable where measured data on chemical emissions is missing and is applicable for use in risk assessments and chemical footprinting methods when applied to specific product categories. Copyright © 2017 Elsevier Ltd. All rights reserved.
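The emission-load logic can be sketched as usage × formulation fraction × (1 − removal), with Monte Carlo sampling of the removal efficiency, which the study found to dominate uncertainty. All numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 10_000

# Hypothetical inputs for one chemical in one WWTP catchment
population  = 250_000              # people connected to the plant
usage_g_day = 10.0                 # product use per person (g/day), from surveys
inclusion   = 0.01                 # chemical mass fraction in the formulation
removal     = rng.beta(20, 5, N)   # uncertain WWTP removal efficiency (~0.8)

# Annual chemical load to the receiving water (kg/yr), one value per sample
load_kg_yr = population * usage_g_day * inclusion * (1 - removal) * 365 / 1000

mean = load_kg_yr.mean()
lo, hi = np.percentile(load_kg_yr, [2.5, 97.5])
```

Summing such per-plant distributions over all WWTPs in a country yields the national totals that the study compares against sectorial market sales data.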
NASA Astrophysics Data System (ADS)
McMullen, Kyla A.
Although the concept of virtual spatial audio has existed for almost twenty-five years, only in the past fifteen years has modern computing technology enabled the real-time processing needed to deliver high-precision spatial audio. Furthermore, the concept of virtually walking through an auditory environment had not previously been explored. Such an interface has numerous potential applications, ranging from enhancing sounds delivered in virtual gaming worlds to conveying spatial locations in real-time emergency response systems. To incorporate this technology into real-world systems, several concerns must be addressed. First, head-related transfer functions (HRTFs) must be inexpensively created for each user. The present study further investigated an HRTF subjective selection procedure previously developed within our research group. Users discriminated auditory cues to subjectively select their preferred HRTF from a publicly available database. Next, the issue of training to find virtual sources was addressed. Listeners participated in a localization training experiment using their selected HRTFs. The training procedure was created from the characterization of successful search strategies in prior auditory search experiments. Search accuracy significantly improved after listeners performed the training procedure. Next, in the investigation of auditory spatial memory, listeners completed three search and recall tasks with differing recall methods. Recall accuracy significantly decreased in tasks that required the storage of sound source configurations in memory. To assess the impacts of practical scenarios, the present work assessed the performance effects of signal uncertainty, visual augmentation, and different attenuation modeling. Fortunately, source uncertainty did not affect listeners' ability to recall or identify sound sources.
The present study also found that the presence of visual reference frames significantly increased recall accuracy. Additionally, the incorporation of drastic attenuation significantly improved environment recall accuracy. Through investigating the aforementioned concerns, the present study made initial footsteps guiding the design of virtual auditory environments that support spatial configuration recall.
Bayesian Estimation of the Spatially Varying Completeness Magnitude of Earthquake Catalogs
NASA Astrophysics Data System (ADS)
Mignan, A.; Werner, M.; Wiemer, S.; Chen, C.; Wu, Y.
2010-12-01
Assessing the completeness magnitude Mc of earthquake catalogs is an essential prerequisite for any seismicity analysis. We employ a simple model to compute Mc in space, based on the proximity to seismic stations in a network. We show that a relationship of the form Mc_pred(d) = a d^b + c, with d the distance to the 5th-nearest seismic station, fits the observations well. We then propose a new Mc mapping approach, the Bayesian Magnitude of Completeness (BMC) method, based on a two-step procedure: (1) a spatial resolution optimization to minimize spatial heterogeneities and uncertainties in Mc estimates, and (2) a Bayesian approach that merges prior information about Mc, based on the proximity to seismic stations, with locally observed values weighted by their respective uncertainties. This new methodology eliminates most weaknesses associated with current Mc mapping procedures: the radius that defines which earthquakes to include in the local magnitude distribution is chosen according to an objective criterion, and there are no gaps in the spatial estimation of Mc. The method solely requires the coordinates of seismic stations. Here, we investigate the Taiwan Central Weather Bureau (CWB) earthquake catalog by computing an Mc map for the period 1994-2010.
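The BMC merging step amounts to an inverse-variance weighted average of the distance-based prior and the locally observed Mc. A sketch with illustrative coefficients (in the BMC method, a, b, and c are fit to the actual station network):

```python
def mc_prior(d, a=1.0, b=0.25, c=0.5):
    """Distance-based prior Mc_pred(d) = a * d**b + c, with d the distance
    (km) to the 5th-nearest station; the coefficients here are illustrative."""
    return a * d**b + c

def bayes_merge(mc_obs, sd_obs, mc_pri, sd_pri):
    """Inverse-variance (precision-weighted) merge of the locally observed
    Mc with the distance-based prior; the posterior SD is always smaller
    than either input SD."""
    w_obs, w_pri = sd_obs**-2, sd_pri**-2
    mc = (w_obs * mc_obs + w_pri * mc_pri) / (w_obs + w_pri)
    sd = (w_obs + w_pri) ** -0.5
    return mc, sd

mc_pri = mc_prior(16.0)                      # prior from station geometry alone
mc, sd = bayes_merge(2.1, 0.2, mc_pri, 0.3)  # merge with a local estimate
```

Because the prior is defined everywhere stations exist, the merged map has no spatial gaps, which is the practical advantage the abstract highlights.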
USDA-ARS's Scientific Manuscript database
Spatial extrapolation of cropping systems models for regional crop growth and water use assessment and farm-level precision management has been limited by the vast model input requirements and the model sensitivity to parameter uncertainty. Remote sensing has been proposed as a viable source of spat...
Mohammad Safeeq; Guillaume S. Mauger; Gordon E. Grant; Ivan Arismendi; Alan F. Hamlet; Se-Yeun Lee
2014-01-01
Assessing uncertainties in hydrologic models can improve accuracy in predicting future streamflow. Here, simulated streamflows using the Variable Infiltration Capacity (VIC) model at coarse (1/16°) and fine (1/120°) spatial resolutions were evaluated against observed streamflows from 217 watersheds. In...
NASA Astrophysics Data System (ADS)
Troldborg, M.; Nowak, W.; Binning, P. J.; Bjerg, P. L.
2012-12-01
Estimates of mass discharge (mass/time) are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Mass discharge estimates are, however, prone to rather large uncertainties as they integrate uncertain spatial distributions of both concentration and groundwater flow velocities. For risk assessments or any other decisions that are being based on mass discharge estimates, it is essential to address these uncertainties. We present a novel Bayesian geostatistical approach for quantifying the uncertainty of the mass discharge across a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics (including the uncertainty in covariance functions), ii) measurement uncertainty, and iii) uncertain source zone geometry and transport parameters. The method generates multiple equally likely realizations of the spatial flow and concentration distribution, which all honour the measured data at the control plane. The flow realizations are generated by analytical co-simulation of the hydraulic conductivity and the hydraulic gradient across the control plane. These realizations are made consistent with measurements of both hydraulic conductivity and head at the site. 
An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed deviations from this mean solution. By combining the flow and concentration realizations, a mass discharge probability distribution is obtained. Tests show that the decoupled approach is both efficient and able to provide accurate uncertainty estimates. The method is demonstrated on a Danish field site contaminated with chlorinated ethenes. For this site, we show that including a physically meaningful concentration trend and the co-simulation of hydraulic conductivity and hydraulic gradient across the transect helps constrain the mass discharge uncertainty. The number of sampling points required for accurate mass discharge estimation and the relative influence of different data types on mass discharge uncertainty is discussed.
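The final step, combining flow and concentration realizations into a mass discharge distribution, can be sketched as follows; the unconditional lognormal fields below are simple stand-ins for the paper's conditional geostatistical realizations.

```python
import numpy as np

rng = np.random.default_rng(4)
n_real, n_cells = 2000, 50    # realizations x control-plane cells
cell_area = 0.5 * 0.5         # m^2 per multilevel sampling cell

# Stand-ins for conditional geostatistical realizations (lognormal fields);
# a real application would condition these on K, head and concentration data
K    = rng.lognormal(np.log(1e-4), 0.8, (n_real, n_cells))  # m/s
grad = 0.005                                                # hydraulic gradient
C    = rng.lognormal(np.log(2.0), 1.0, (n_real, n_cells))   # g/m^3

# Mass discharge per realization: sum of Darcy flux x concentration x area
q  = K * grad                                  # specific discharge (m/s)
md = (q * C * cell_area).sum(axis=1) * 86400   # g/day across the plane

p10, p50, p90 = np.percentile(md, [10, 50, 90])
```

The percentiles of md are the mass discharge probability distribution used for risk-based decisions; conditioning on measured data narrows it.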
NASA Astrophysics Data System (ADS)
Smith, J. D.; Whealton, C. A.; Stedinger, J. R.
2014-12-01
Resource assessments for low-grade geothermal applications employ available well temperature measurements to determine if the resource potential is sufficient for supporting district heating opportunities. This study used a compilation of bottomhole temperature (BHT) data from recent unconventional shale oil and gas wells, along with legacy oil, gas, and storage wells, in Pennsylvania (PA) and New York (NY). Our study's goal was to predict the geothermal resource potential and associated uncertainty for the NY-PA region using kriging interpolation. The dataset was scanned for outliers, and some observations were removed. Because these wells were drilled for reasons other than geothermal resource assessment, their spatial density varied widely. An exploratory spatial statistical analysis revealed differences in the spatial structure of the geothermal gradient data (the kriging semi-variogram and its nugget variance, shape, sill, and the degree of anisotropy). As a result, a stratified kriging procedure was adopted to better capture the statistical structure of the data, to generate an interpolated surface, and to quantify the uncertainty of the computed surface. The area was stratified reflecting different physiographic provinces in NY and PA that have geologic properties likely related to variations in the value of the geothermal gradient. The kriging prediction and the variance-of-prediction were determined for each province by the generation of a semi-variogram using only the wells that were located within that province. A leave-one-out cross validation (LOOCV) was conducted as a diagnostic tool. The results of stratified kriging were compared to kriging using the whole region to determine the impact of stratification. The two approaches provided similar predictions of the geothermal gradient. However, the variance-of-prediction was different. 
The stratified approach is recommended because it gave a more appropriate site-specific characterization of uncertainty based upon a more realistic description of the statistical structure of the data given the geologic characteristics of each province.
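A compact illustration of ordinary kriging with leave-one-out cross validation on synthetic gradient data; the covariance model and all parameter values are assumptions, not those fitted to the NY-PA wells.

```python
import numpy as np

def exp_cov(h, sill=1.0, rng_len=40.0, nugget=0.05):
    """Exponential covariance C(h) = sill * exp(-h / range), plus a nugget
    at h = 0 (all parameters illustrative)."""
    return sill * np.exp(-h / rng_len) + nugget * (h == 0)

def ok_predict(xy, z, xy0):
    """Ordinary kriging prediction and variance at point xy0."""
    n = len(z)
    d = np.linalg.norm(xy[:, None] - xy[None, :], axis=-1)
    A = np.ones((n + 1, n + 1)); A[:n, :n] = exp_cov(d); A[n, n] = 0.0
    b = np.append(exp_cov(np.linalg.norm(xy - xy0, axis=-1)), 1.0)
    w = np.linalg.solve(A, b)            # weights plus Lagrange multiplier
    return w[:n] @ z, exp_cov(np.array(0.0)) - w @ b

# Synthetic geothermal-gradient observations (degC/km) with a weak trend
rng = np.random.default_rng(5)
xy = rng.uniform(0, 100, (25, 2))
z = 25 + 0.05 * xy[:, 0] + rng.normal(0, 0.5, 25)

# Leave-one-out cross validation, as used diagnostically in the study
errs = []
for i in range(len(z)):
    mask = np.arange(len(z)) != i
    pred, _ = ok_predict(xy[mask], z[mask], xy[i])
    errs.append(pred - z[i])
rmse = float(np.sqrt(np.mean(np.square(errs))))

pred0, var0 = ok_predict(xy, z, np.array([50.0, 50.0]))
```

Stratified kriging simply repeats this with a separate covariance model per province, which changes the variance-of-prediction more than the prediction itself, matching the study's finding.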
Loague, Keith; Blanke, James S; Mills, Melissa B; Diaz-Diaz, Ricardo; Corwin, Dennis L
2012-01-01
Precious groundwater resources across the United States have been contaminated due to decades-long nonpoint-source applications of agricultural chemicals. Assessing the impact of past, ongoing, and future chemical applications for large-scale agriculture operations is timely for designing best-management practices to prevent subsurface pollution. Presented here are the results from a series of regional-scale vulnerability assessments for the San Joaquin Valley (SJV). Two relatively simple indices, the retardation and attenuation factors, are used to estimate near-surface vulnerabilities based on the chemical properties of 32 pesticides and the variability of both soil characteristics and recharge rates across the SJV. The uncertainties inherent in these assessments, derived from the uncertainties within the chemical and soil databases, are estimated using first-order analyses. The results are used to screen and rank the chemicals based on mobility and leaching potential, without and with consideration of data-related uncertainties. Chemicals of historic high visibility in the SJV (e.g., atrazine, DBCP [dibromochloropropane], ethylene dibromide, and simazine) are ranked in the top half of those considered. Vulnerability maps generated for atrazine and DBCP, featured for their legacy status in the study area, clearly illustrate variations within and across the assessments. For example, the leaching potential is greater for DBCP than for atrazine, the leaching potential for DBCP is greater for the spatially variable recharge values than for the average recharge rate, and the leaching potentials for both DBCP and atrazine are greater for the annual recharge estimates than for the monthly recharge estimates. The data-related uncertainties identified in this study can be significant, targeting opportunities for improving future vulnerability assessments.
Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
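The two screening indices can be written down directly; the soil and chemical property values below are rough illustrative figures, not the study's calibrated inputs.

```python
import math

def retardation_factor(koc_ml_g, foc, bulk_density=1.5, theta_fc=0.30):
    """RF = 1 + rho_b * f_oc * K_oc / theta_FC (dimensionless).
    Units: Koc in mL/g, bulk density in g/cm^3; volatile partitioning
    is neglected in this sketch."""
    return 1.0 + bulk_density * foc * koc_ml_g / theta_fc

def attenuation_factor(koc_ml_g, foc, half_life_d, depth_m=3.0,
                       recharge_m_d=0.001, bulk_density=1.5, theta_fc=0.30):
    """AF = exp(-ln 2 * travel time / half-life), where the travel time to
    the assessment depth is z * RF * theta_FC / q. Larger AF means more
    mass survives to depth, i.e. higher leaching potential."""
    rf = retardation_factor(koc_ml_g, foc, bulk_density, theta_fc)
    travel_d = depth_m * rf * theta_fc / recharge_m_d
    return math.exp(-math.log(2.0) * travel_d / half_life_d)

# Illustrative (approximate) properties for the two featured legacy chemicals
af_atrazine = attenuation_factor(koc_ml_g=100.0, foc=0.01, half_life_d=60.0)
af_dbcp     = attenuation_factor(koc_ml_g=70.0,  foc=0.01, half_life_d=180.0)
```

With these assumed properties the more persistent, less sorbing DBCP retains a larger attenuation factor than atrazine, consistent with the ranking reported above; first-order uncertainty analysis then propagates the variances of Koc, foc, and half-life through these formulas.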
Spatial uncertainty analysis: Propagation of interpolation errors in spatially distributed models
Phillips, D.L.; Marks, D.G.
1996-01-01
In simulation modelling, it is desirable to quantify model uncertainties and provide not only point estimates for output variables but confidence intervals as well. Spatially distributed physical and ecological process models are becoming widely used, with runs being made over a grid of points that represent the landscape. This requires input values at each grid point, which often have to be interpolated from irregularly scattered measurement sites, e.g., weather stations. Interpolation introduces spatially varying errors which propagate through the model. We extended established uncertainty analysis methods to a spatial domain for quantifying spatial patterns of input variable interpolation errors and how they propagate through a model to affect the uncertainty of the model output. We applied this to a model of potential evapotranspiration (PET) as a demonstration. We modelled PET for three time periods in 1990 as a function of temperature, humidity, and wind on a 10-km grid across the U.S. portion of the Columbia River Basin. Temperature, humidity, and wind speed were interpolated using kriging from 700-1000 supporting data points. Kriging standard deviations (SD) were used to quantify the spatially varying interpolation uncertainties. For each of 5693 grid points, 100 Monte Carlo simulations were done, using the kriged values of temperature, humidity, and wind, plus random error terms determined by the kriging SDs and the correlations of interpolation errors among the three variables. For the spring season example, kriging SDs averaged 2.6 °C for temperature, 8.7% for relative humidity, and 0.38 m s-1 for wind. The resultant PET estimates had coefficients of variation (CVs) ranging from 14% to 27% for the 10-km grid cells. Maps of PET means and CVs showed the spatial patterns of PET with a measure of its uncertainty due to interpolation of the input variables.
This methodology should be applicable to a variety of spatially distributed models using interpolated inputs.
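The error-propagation scheme, correlated random perturbations scaled by the kriging SDs and pushed through a point model, can be sketched with a toy aerodynamic PET term. The correlation matrix and the PET formula below are illustrative assumptions, not the study's model.

```python
import numpy as np

rng = np.random.default_rng(6)
N = 5000

# Kriged estimates at one grid point, with kriging SDs (cf. the paper's
# spring-season averages: 2.6 degC, 8.7 % RH, 0.38 m/s)
mu = np.array([12.0, 60.0, 3.0])          # T (degC), RH (%), wind (m/s)
sd = np.array([2.6, 8.7, 0.38])
corr = np.array([[1.0, -0.5,  0.1],
                 [-0.5, 1.0, -0.2],
                 [0.1, -0.2,  1.0]])      # assumed interpolation-error correlations

# Sample correlated input errors via the Cholesky factor of the covariance
L = np.linalg.cholesky(corr * np.outer(sd, sd))
T, RH, u = mu[:, None] + L @ rng.standard_normal((3, N))

# Toy aerodynamic PET term (mm/day): vapour-pressure deficit x wind function
es = 0.6108 * np.exp(17.27 * T / (T + 237.3))   # saturation vp (kPa, Tetens)
pet = 2.6 * es * (1 - RH / 100) * (1 + 0.54 * u)

cv = pet.std() / pet.mean()    # coefficient of variation at this grid point
```

Repeating this per grid cell yields the maps of PET means and CVs the paper describes; the off-diagonal correlation terms matter because temperature and humidity errors tend to push the vapour-pressure deficit in the same direction.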
Assessing uncertainty in SRTM elevations for global flood modelling
NASA Astrophysics Data System (ADS)
Hawker, L. P.; Rougier, J.; Neal, J. C.; Bates, P. D.
2017-12-01
The SRTM DEM is widely used as the topography input to flood models in data-sparse locations. Understanding spatial error in the SRTM product is crucial in constraining uncertainty about elevations and assessing the impact of these upon flood prediction. Assessment of SRTM error was carried out by Rodriguez et al. (2006), but this did not explicitly quantify the spatial structure of vertical errors in the DEM, nor did it distinguish between errors over different types of landscape. As a result, there is a lack of information about the spatial structure of vertical errors of the SRTM in the landscape that matters most to flood models - the floodplain. Therefore, this study attempts this task by comparing SRTM, an error-corrected SRTM product (the MERIT DEM of Yamazaki et al., 2017) and near-truth LIDAR elevations for 3 deltaic floodplains (Mississippi, Po, Wax Lake) and a large lowland region (the Fens, UK). Using the error covariance function, calculated by comparing SRTM elevations to the near-truth LIDAR, perturbations of the 90m SRTM DEM were generated, producing a catalogue of plausible DEMs. This allows modellers to simulate a suite of plausible DEMs at any aggregated block size above native SRTM resolution. Finally, the generated DEMs were input into a hydrodynamic model of the Mekong Delta, built using LISFLOOD-FP, to assess how DEM error affects the hydrodynamics and inundation extent across the domain. The end product is an inundation map with the probability of each pixel being flooded based on the catalogue of DEMs. In a world of increasing computer power, but a lack of detailed datasets, this powerful approach can be used throughout natural hazard modelling to understand how errors in the SRTM DEM can impact the hazard assessment.
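Generating a catalogue of plausible DEMs from an error covariance function can be sketched on a small grid; the error SD, correlation length, and base DEM below are invented for illustration (a real application would use the SRTM-vs-lidar covariances estimated in the study).

```python
import numpy as np

rng = np.random.default_rng(7)

n = 24                                   # small n x n grid of 90 m cells
cell = 90.0
yy, xx = np.mgrid[0:n, 0:n]
pts = np.column_stack([xx.ravel(), yy.ravel()]) * cell

# Hypothetical floodplain error model: ~1.2 m vertical error SD with a
# ~500 m correlation length, exponential covariance
sd, corr_len = 1.2, 500.0
d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
cov = sd**2 * np.exp(-d / corr_len)
L = np.linalg.cholesky(cov + 1e-8 * np.eye(n * n))   # jitter for stability

dem = 5.0 + 0.002 * pts[:, 0].reshape(n, n)          # toy base DEM (m)
plausible = [dem + (L @ rng.standard_normal(n * n)).reshape(n, n)
             for _ in range(100)]                    # catalogue of DEMs
spread = np.std(plausible, axis=0)                   # per-cell elevation spread
```

Running a flood model over each member of the catalogue and counting how often each pixel floods yields the probabilistic inundation map described above; for large domains, FFT-based samplers replace the dense Cholesky factorization used in this sketch.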
Choukulkar, Aditya; Brewer, W. Alan; Sandberg, Scott P.; ...
2017-01-23
Accurate three-dimensional information of wind flow fields can be an important tool in not only visualizing complex flow but also understanding the underlying physical processes and improving flow modeling. However, a thorough analysis of the measurement uncertainties is required to properly interpret results. The XPIA (eXperimental Planetary boundary layer Instrumentation Assessment) field campaign conducted at the Boulder Atmospheric Observatory (BAO) in Erie, CO, from 2 March to 31 May 2015 brought together a large suite of in situ and remote sensing measurement platforms to evaluate complex flow measurement strategies. In this paper, measurement uncertainties for different single and multi-Doppler strategies using simple scan geometries (conical, vertical plane and staring) are investigated. The tradeoffs (such as time–space resolution vs. spatial coverage) among the different measurement techniques are evaluated using co-located measurements made near the BAO tower. Sensitivity of the single-/multi-Doppler measurement uncertainties to averaging period are investigated using the sonic anemometers installed on the BAO tower as the standard reference. Finally, the radiometer measurements are used to partition the measurement periods as a function of atmospheric stability to determine their effect on measurement uncertainty. It was found that with an increase in spatial coverage and measurement complexity, the uncertainty in the wind measurement also increased. For multi-Doppler techniques, the increase in uncertainty for temporally uncoordinated measurements is possibly due to requiring additional assumptions of stationarity along with horizontal homogeneity and less representative line-of-sight velocity statistics. Lastly, it was also found that wind speed measurement uncertainty was lower during stable conditions compared to unstable conditions.
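The multi-Doppler retrieval underlying such comparisons is, at its core, a least-squares inversion of line-of-sight velocities; a sketch with an assumed conical scan geometry and noise level (real retrievals must additionally handle the stationarity and homogeneity assumptions noted above):

```python
import numpy as np

rng = np.random.default_rng(9)

# True wind vector (u east, v north, w up), m/s
truth = np.array([6.0, -3.0, 0.2])

# Assumed beam geometry: azimuths/elevations of Doppler beams (degrees)
az = np.radians([0, 45, 90, 135, 180, 225, 270, 315])
el = np.radians([60.0] * 8)        # a 60-degree conical (VAD-like) scan

# Unit line-of-sight vectors and simulated radial velocities with noise
G = np.column_stack([np.sin(az) * np.cos(el),
                     np.cos(az) * np.cos(el),
                     np.sin(el)])
v_r = G @ truth + rng.normal(0, 0.2, len(az))

# Least-squares wind retrieval and its formal error covariance
wind, *_ = np.linalg.lstsq(G, v_r, rcond=None)
cov = 0.2**2 * np.linalg.inv(G.T @ G)
```

The covariance term shows how uncertainty grows as the beam geometry degrades: at this high elevation the vertical component is better constrained than the horizontal ones, and poorly intersecting beams inflate the horizontal terms further.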
Evaluating critical uncertainty thresholds in a spatial model of forest pest invasion risk
Frank H. Koch; Denys Yemshanov; Daniel W. McKenney; William D. Smith
2009-01-01
Pest risk maps can provide useful decision support in invasive species management, but most do not adequately consider the uncertainty associated with predicted risk values. This study explores how increased uncertainty in a risk model's numeric assumptions might affect the resultant risk map. We used a spatial stochastic model, integrating components for...
Visualizing Uncertainty of Point Phenomena by Redesigned Error Ellipses
NASA Astrophysics Data System (ADS)
Murphy, Christian E.
2018-05-01
Visualizing uncertainty remains one of the great challenges in modern cartography. There is no overarching strategy for displaying the nature of uncertainty, as an effective and efficient visualization depends heavily on the type of uncertainty as well as on the spatial data feature type. This work presents a design strategy to visualize uncertainty connected to point features. The error ellipse, well known from mathematical statistics, is adapted to display the uncertainty of point information originating from spatial generalization. Modified designs of the error ellipse show the potential of quantitative and qualitative symbolization and simultaneous point-based uncertainty symbolization. The user can intuitively depict the centers of gravity and the major orientation of the point arrays, as well as estimate the extents and possible spatial distributions of multiple point phenomena. The error ellipse represents uncertainty in an intuitive way, particularly suitable for laymen. Furthermore, it is shown how applicable an adapted design of the error ellipse is to displaying the uncertainty of point features originating from incomplete data. The suitability of the error ellipse for displaying the uncertainty of point information is demonstrated in two showcases: (1) the analysis of formations of association football players, and (2) uncertain positioning of events on maps for the media.
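The geometry of the error ellipse follows directly from the eigendecomposition of a 2×2 positional covariance matrix; the covariance values below are illustrative.

```python
import numpy as np

def error_ellipse(cov, confidence_k=2.4477):
    """Semi-axes and orientation of the error ellipse of a 2x2 covariance
    matrix. k = 2.4477 = sqrt of the 95th-percentile chi-squared quantile
    with 2 degrees of freedom, giving the 95% confidence ellipse."""
    vals, vecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    semi_minor, semi_major = confidence_k * np.sqrt(vals)
    # Orientation of the major axis (degrees from the x-axis)
    angle = np.degrees(np.arctan2(vecs[1, 1], vecs[0, 1]))
    return semi_major, semi_minor, angle

# Positional covariance of an uncertain point (illustrative, in map units)
cov = np.array([[4.0, 1.5],
                [1.5, 2.0]])
a, b, theta = error_ellipse(cov)
```

The redesigned symbols in the paper vary the rendering of this same geometry (outline, fill, texture) to carry qualitative information alongside the quantitative extent.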
Assessing the Risks to Human Health in Heterogeneous Aquifers under Uncertainty
NASA Astrophysics Data System (ADS)
de Barros, Felipe
2015-04-01
Reliable quantification of human health risk from toxic chemicals present in groundwater is a challenging task. The main difficulty lies in the fact that many of the components that constitute human health risk assessment are uncertain and require interdisciplinary knowledge. Understanding the impact of each of these components on risk estimation can provide guidance for decision makers to manage contaminated sites and best allocate resources towards minimal prediction uncertainty. This presentation will focus on the impact of aquifer heterogeneity on human health risk. Spatial heterogeneity of the hydrogeological properties can lead to the formation of preferential flow channels which control the plume spreading rates and travel time statistics, both of which are critical in assessing the risk level. By making use of an integrated hydrogeological-health stochastic framework, the significance of characteristic length scales (e.g. those characterizing flow, transport and sampling devices) in both controlling the uncertainty of health risk and determining data needs is highlighted. Through a series of examples, we show how fundamental knowledge of the main physical mechanisms affecting solute pathways is necessary to understand the human health response to varying drivers.
High Resolution Insights into Snow Distribution Provided by Drone Photogrammetry
NASA Astrophysics Data System (ADS)
Redpath, T.; Sirguey, P. J.; Cullen, N. J.; Fitzsimons, S.
2017-12-01
Dynamic in time and space, New Zealand's seasonal snow is largely confined to remote alpine areas, complicating ongoing in situ measurement and characterisation. Improved understanding and modeling of the seasonal snowpack requires fine scale resolution of snow distribution and spatial variability. The potential of remotely piloted aircraft system (RPAS) photogrammetry to resolve spatial and temporal variability of snow depth and water equivalent in a New Zealand alpine catchment is assessed in the Pisa Range, Central Otago. This approach yielded orthophotomosaics and digital surface models (DSM) at 0.05 and 0.15 m spatial resolution, respectively. An autumn reference DSM allowed mapping of winter (02/08/2016) and spring (10/09/2016) snow depth at 0.15 m spatial resolution, via DSM differencing. The consistency and accuracy of the RPAS-derived surface were assessed by comparison of snow-free regions of the spring and autumn DSMs, while accuracy of RPAS retrieved snow depth was assessed with 86 in situ snow probe measurements. Results show a mean vertical residual of 0.024 m between DSMs acquired in autumn and spring. This residual approximated a Laplace distribution, reflecting the influence of large outliers on the small overall bias. Propagation of errors associated with successive DSMs saw snow depth mapped with an accuracy of ± 0.09 m (95% c.l.). Comparing RPAS and in situ snow depth measurements revealed the influence of geo-location uncertainty and interactions between vegetation and the snowpack on snow depth uncertainty and bias. Semi-variogram analysis revealed that the RPAS outperformed systematic in situ measurements in resolving fine scale spatial variability. Despite limitations accompanying RPAS photogrammetry, this study demonstrates a repeatable means of accurately mapping snow depth for an entire, yet relatively small, hydrological basin (~0.5 km²), at high resolution.
By resolving snowpack features associated with redistribution and preferential accumulation and ablation, the snow depth maps provide geostatistically robust insights into seasonal snow processes, with unprecedented detail. Such data may enhance understanding of physical processes controlling spatial and temporal distribution of seasonal snow, and their relative importance at varying spatial and temporal scales.
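The core DSM-differencing step, with independent vertical errors combined in quadrature, can be sketched as below. The grids and uncertainty values are illustrative placeholders, not the study's data.

```python
import numpy as np

def snow_depth_from_dsms(winter_dsm, reference_dsm, sigma_winter, sigma_ref):
    """Snow depth by DSM differencing, with propagated 1-sigma uncertainty.

    Assumes independent per-surface vertical errors, so the combined
    uncertainty of the difference is the quadrature sum of the two
    DSM uncertainties: sqrt(sigma_winter^2 + sigma_ref^2).
    """
    depth = winter_dsm - reference_dsm
    sigma = np.hypot(sigma_winter, sigma_ref)
    return depth, sigma

# Toy 3x3 surfaces (metres); per-DSM uncertainties are made-up values
ref = np.zeros((3, 3))                 # snow-free autumn reference DSM
winter = np.full((3, 3), 0.42)         # winter DSM, 0.42 m of snow everywhere
depth, sigma = snow_depth_from_dsms(winter, ref, 0.03, 0.03)
```

The same quadrature rule is what turns two individually accurate surfaces into a depth map whose error bound is slightly larger than either input's.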
Fuzzy geometry, entropy, and image information
NASA Technical Reports Server (NTRS)
Pal, Sankar K.
1991-01-01
Presented here are various uncertainty measures arising from grayness ambiguity and spatial ambiguity in an image, and their possible applications as image information measures. Definitions are given of an image in the light of fuzzy set theory, and of information measures and tools relevant for processing/analysis, e.g., fuzzy geometrical properties, correlation, bound functions and entropy measures. Also given is a formulation of algorithms, along with management of uncertainties, for segmentation, object extraction, and edge detection. The output obtained is both fuzzy and nonfuzzy. Ambiguity in the evaluation and assessment of membership functions is also described.
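One widely used grayness-ambiguity measure of the kind discussed above is the De Luca-Termini fuzzy entropy; the sketch below is a generic formulation and is not necessarily the exact measure used by the author.

```python
import numpy as np

def fuzzy_entropy(mu):
    """De Luca-Termini fuzzy entropy of a membership array mu in [0, 1].

    Each pixel contributes -mu*ln(mu) - (1-mu)*ln(1-mu); the sum is
    normalised so a fully ambiguous image (mu = 0.5 everywhere) scores 1
    and a crisp image (mu in {0, 1}) scores 0.
    """
    mu = np.clip(np.asarray(mu, dtype=float), 1e-12, 1 - 1e-12)
    s = -mu * np.log(mu) - (1 - mu) * np.log(1 - mu)
    return float(s.sum() / (mu.size * np.log(2)))

crisp = np.array([0.0, 1.0, 1.0, 0.0])   # no grayness ambiguity
ambiguous = np.full(4, 0.5)              # maximal grayness ambiguity
```

Thresholds or membership functions that minimise this entropy are one classical route to the segmentation and object-extraction algorithms the abstract mentions.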
Regional landslide hazard assessment in a deep uncertain future
NASA Astrophysics Data System (ADS)
Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten
2017-04-01
Landslides have many negative economic and societal impacts, including the potential for significant loss of life and damage to infrastructure. These risks are likely to be exacerbated in the future by a combination of climatic and socio-economic factors. Climate change, for example, is expected to increase the occurrence of rainfall-triggered landslides, because a warmer atmosphere tends to produce more high intensity rainfall events. Prediction of future changes in rainfall, however, is subject to high levels of uncertainty, making it challenging for decision-makers to identify the areas and populations that are most vulnerable to landslide hazards. In this study, we demonstrate how a physically-based model - the Combined Hydrology and Stability Model (CHASM) - can be used together with Global Sensitivity Analysis (GSA) to explore the underlying factors controlling the spatial distribution of landslide risks across a regional landscape, while also accounting for deep uncertainty around future rainfall conditions. We demonstrate how GSA can be used to analyse CHASM, which in turn represents the spatial variability of hillslope characteristics in the study region, while accounting for other uncertainties. Results are presented in the form of landslide hazard maps, utilising high-resolution digital elevation datasets for a case study in St Lucia in the Caribbean. Our findings about spatial landslide hazard drivers have important implications for data collection approaches and for long-term decision-making about land management practices.
Regional Landslide Hazard Assessment Considering Potential Climate Change
NASA Astrophysics Data System (ADS)
Almeida, S.; Holcombe, E.; Pianosi, F.; Wagener, T.
2016-12-01
Landslides have many negative economic and societal impacts, including the potential for significant loss of life and damage to infrastructure. These risks are likely to be exacerbated in the future by a combination of climatic and socio-economic factors. Climate change, for example, is expected to increase the occurrence of rainfall-triggered landslides, because a warmer atmosphere tends to produce more high intensity rainfall events. Prediction of future changes in rainfall, however, is subject to high levels of uncertainty, making it challenging for decision-makers to identify the areas and populations that are most vulnerable to landslide hazards. In this study, we demonstrate how a physically-based model - the Combined Hydrology and Stability Model (CHASM) - can be used together with Global Sensitivity Analysis (GSA) to explore the underlying factors controlling the spatial distribution of landslide risks across a regional landscape, while also accounting for deep uncertainty around potential future rainfall triggers. We demonstrate how GSA can be used to analyse CHASM which in turn represents the spatial variability of hillslope characteristics in the study region, while accounting for other uncertainties. Results are presented in the form of landslide hazard maps, utilising high-resolution digital elevation datasets for a case study in St Lucia in the Caribbean. Our findings about spatial landslide hazard drivers have important implications for data collection approaches and for long-term decision-making about land management practices.
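A minimal illustration of variance-based global sensitivity analysis, in the spirit of the GSA applied to CHASM above: the first-order index of a factor is estimated as the variance of the conditional mean response over the total response variance. The toy "slope-stability" response and the factor names are invented for illustration only.

```python
import numpy as np

def first_order_sensitivity(x, y, bins=20):
    """Crude first-order sensitivity index: Var(E[y | x]) / Var(y),
    with the conditional mean estimated by binning x into quantile bins."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.array([(idx == b).sum() for b in range(bins)])
    var_cond = np.average((cond_means - y.mean()) ** 2, weights=counts)
    return var_cond / y.var()

# Hypothetical response: rainfall dominates stability, cohesion is minor
rng = np.random.default_rng(1)
rain = rng.uniform(0, 1, 20000)
cohesion = rng.uniform(0, 1, 20000)
response = 4.0 * rain + 0.5 * cohesion + rng.normal(0, 0.1, 20000)
s_rain = first_order_sensitivity(rain, response)
s_coh = first_order_sensitivity(cohesion, response)
```

Ranking factors by such indices is what lets a regional study say which hillslope characteristics drive the hazard map and therefore which data are worth collecting.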
NASA Astrophysics Data System (ADS)
Fowler, H. J.; Forsythe, N. D.; Blenkinsop, S.; Archer, D.; Hardy, A.; Janes, T.; Jones, R. G.; Holderness, T.
2013-12-01
We present results of two distinct, complementary analyses to assess evidence of elevation dependency in temperature change in the Upper Indus Basin (UIB; Karakoram, Eastern Hindu Kush) and the wider Western Himalaya (WH). The first analysis component examines historical remotely sensed land surface temperature (LST) from the second and third generations of the Advanced Very High Resolution Radiometer (AVHRR/2, AVHRR/3) instrument flown on NOAA satellite platforms from the mid-1980s through the present day. The high spatial resolution (<4 km) of the AVHRR instrument enables precise consideration of the relationship between estimated LST and surface topography. The LST data product was developed as part of an initiative to produce continuous time series for key remotely sensed spatial products (LST, snow-covered area, cloud cover, NDVI) extending as far back into the historical record as feasible. Context for the AVHRR LST data product is provided by results of bias assessment and validation procedures against available local observations from both manned and automatic weather stations. Local observations provide meaningful validation and bias assessment of the vertical gradients found in the AVHRR LST, as the elevation range from the lowest manned meteorological station (at 1460 m asl) to the highest automatic weather station (4733 m asl) covers much of the key range yielding runoff from seasonal snowmelt. Furthermore, the common available record period of these stations (1995 to 2007) enables assessment not only of the AVHRR LST but also performance comparisons with the more recent MODIS LST data product. A range of spatial aggregations (from minor tributary catchments to primary basin headwaters) is performed to assess regional homogeneity and identify potential latitudinal or longitudinal gradients in elevation dependency.
The second analysis component investigates elevation dependency, including its uncertainty, in projected temperature change trajectories in the downscaling of a seventeen-member Global Climate Model (GCM) perturbed physics ensemble (PPE) of transient (130-year) simulations using a moderate-resolution (25 km) regional climate model (RCM). The GCM ensemble is the 17-member QUMP (Quantifying Uncertainty in Model Projections) ensemble, and the downscaling is done using HadRM3P, part of the PRECIS regional climate modelling system. Both the RCM and the GCMs are models developed by the UK Met Office Hadley Centre and are based on the HadCM3 GCM. Use of the multi-member PPE enables quantification of uncertainty in projected temperature change, while the spatial resolution of the RCM improves insight into the role of elevation in projected rates of change. Furthermore, comparison with the results of the remote sensing analysis component - considered to provide an 'observed climatology' - permits evaluation of individual ensemble members with regard to biases in spatial gradients in temperature as well as the timing and magnitude of annual cycles.
NASA Astrophysics Data System (ADS)
Watson, Cameron S.; Carrivick, Jonathan; Quincey, Duncan
2015-10-01
Modelling glacial lake outburst floods (GLOFs) or 'jökulhlaups', necessarily involves the propagation of large and often stochastic uncertainties throughout the source to impact process chain. Since flood routing is primarily a function of underlying topography, communication of digital elevation model (DEM) uncertainty should accompany such modelling efforts. Here, a new stochastic first-pass assessment technique was evaluated against an existing GIS-based model and an existing 1D hydrodynamic model, using three DEMs with different spatial resolution. The analysis revealed the effect of DEM uncertainty and model choice on several flood parameters and on the prediction of socio-economic impacts. Our new model, which we call MC-LCP (Monte Carlo Least Cost Path) and which is distributed in the supplementary information, demonstrated enhanced 'stability' when compared to the two existing methods, and this 'stability' was independent of DEM choice. The MC-LCP model outputs an uncertainty continuum within its extent, from which relative socio-economic risk can be evaluated. In a comparison of all DEM and model combinations, model outputs derived using the Shuttle Radar Topography Mission (SRTM) DEM exhibited fewer artefacts than those derived with the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM), and were comparable to those derived with a finer-resolution DEM from the Advanced Land Observing Satellite Panchromatic Remote-sensing Instrument for Stereo Mapping (ALOS PRISM). Overall, we contend that the variability we find between flood routing model results suggests that consideration of DEM uncertainty and pre-processing methods is important when assessing flow routing and when evaluating potential socio-economic implications of a GLOF event. Incorporation of a stochastic variable provides an illustration of uncertainty that is important when modelling and communicating assessments of an inherently complex process.
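The Monte Carlo idea behind such a stochastic first-pass routing assessment can be sketched as follows: perturb the DEM with vertical noise and accumulate a visit-frequency map over many routing realisations. For brevity this uses a simple steepest-descent router rather than the paper's least-cost-path formulation, and the terrain, noise level, and run count are all assumptions.

```python
import numpy as np

def downhill_path(dem, start):
    """Greedy steepest-descent path from `start` until a local minimum."""
    path, (r, c) = [start], start
    rows, cols = dem.shape
    while True:
        nbrs = [(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr or dc) and 0 <= r + dr < rows and 0 <= c + dc < cols]
        nxt = min(nbrs, key=lambda p: dem[p])
        if dem[nxt] >= dem[r, c]:          # no lower neighbour: stop
            return path
        r, c = nxt
        path.append(nxt)

def mc_flow_frequency(dem, start, sigma, n_runs=200, seed=0):
    """Monte Carlo route frequency: perturb the DEM with Gaussian vertical
    noise (std `sigma`) and count how often each cell lies on the route."""
    rng = np.random.default_rng(seed)
    freq = np.zeros(dem.shape)
    for _ in range(n_runs):
        noisy = dem + rng.normal(0, sigma, dem.shape)
        for cell in downhill_path(noisy, start):
            freq[cell] += 1
    return freq / n_runs

# Tilted toy terrain: flow should head toward the low (last) row
dem = np.add.outer(np.arange(10, 0, -1, dtype=float), np.zeros(10))
freq = mc_flow_frequency(dem, (0, 5), sigma=0.1)
```

The resulting frequency grid is the "uncertainty continuum" idea in miniature: cells visited in most realisations are robustly on the flow path, while low-frequency cells mark routes that only some DEM realisations support.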
Quantifying model uncertainty in seasonal Arctic sea-ice forecasts
NASA Astrophysics Data System (ADS)
Blanchard-Wrigglesworth, Edward; Barthélemy, Antoine; Chevallier, Matthieu; Cullather, Richard; Fučkar, Neven; Massonnet, François; Posey, Pamela; Wang, Wanqiu; Zhang, Jinlun; Ardilouze, Constantin; Bitz, Cecilia; Vernieres, Guillaume; Wallcraft, Alan; Wang, Muyin
2017-04-01
Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or post-processing techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.
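A minimal sketch of the kind of forecast post-processing that can shrink model uncertainty: an anomaly-style correction that replaces each model's own mean state with the observed climatology while keeping its forecast anomaly. The models, extents, and climatology below are hypothetical numbers, not SIO data.

```python
import numpy as np

def bias_correct(forecasts, hindcast_means, obs_climatology):
    """Anomaly-style post-processing: swap each model's own mean state
    for the observed climatology, preserving the forecast anomaly."""
    return obs_climatology + (forecasts - hindcast_means)

# Three hypothetical models forecasting September extent (million km^2)
raw = np.array([5.8, 4.2, 6.5])
model_means = np.array([6.4, 4.6, 7.0])   # each model's own hindcast mean
obs_mean = 5.0                             # observed climatological extent
corrected = bias_correct(raw, model_means, obs_mean)
```

Because each model's systematic offset is removed, the spread across corrected forecasts is much smaller than across raw ones, mirroring the finding that post-processing reduces the model-uncertainty contribution.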
Gridded uncertainty in fossil fuel carbon dioxide emission maps, a CDIAC example
Andres, Robert J.; Boden, Thomas A.; Higdon, David M.
2016-12-05
Due to a current lack of physical measurements at appropriate spatial and temporal scales, all current global maps and distributions of fossil fuel carbon dioxide (FFCO2) emissions use one or more proxies to distribute those emissions. These proxies and distribution schemes introduce additional uncertainty into these maps. This paper examines the uncertainty associated with the magnitude of gridded FFCO2 emissions. This uncertainty is gridded at the same spatial and temporal scales as the mass magnitude maps. This gridded uncertainty includes uncertainty contributions from the spatial, temporal, proxy, and magnitude components used to create the magnitude map of FFCO2 emissions. Throughout this process, when assumptions had to be made or expert judgment employed, the general tendency in most cases was toward overestimating or increasing the magnitude of uncertainty. The results of the uncertainty analysis reveal a range of 4–190 %, with an average of 120 % (2 σ) for populated and FFCO2-emitting grid spaces over annual timescales. This paper also describes a methodological change specific to the creation of the Carbon Dioxide Information Analysis Center (CDIAC) FFCO2 emission maps: the change from a temporally fixed population proxy to a temporally varying population proxy.
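Assuming the component uncertainties are independent, combining the spatial, temporal, proxy, and magnitude contributions cell by cell can be sketched as a quadrature sum on the same grid as the emissions. The fractional values below are illustrative placeholders, not CDIAC's numbers.

```python
import numpy as np

def combined_uncertainty(spatial, temporal, proxy, magnitude):
    """Combine independent per-cell uncertainty components (expressed as
    fractional 2-sigma values) in quadrature, yielding a gridded total
    uncertainty at the same resolution as the emission map itself."""
    return np.sqrt(spatial**2 + temporal**2 + proxy**2 + magnitude**2)

# Illustrative 2x2 grids of fractional (2-sigma) uncertainties
spatial = np.full((2, 2), 0.50)
temporal = np.full((2, 2), 0.30)
proxy = np.full((2, 2), 0.80)
magnitude = np.full((2, 2), 0.10)
total = combined_uncertainty(spatial, temporal, proxy, magnitude)
```

Because quadrature is dominated by the largest term, a single weak proxy can set the floor of the total gridded uncertainty, which is one reason proxy choice matters so much in these maps.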
NASA Astrophysics Data System (ADS)
Poppick, A. N.; McKinnon, K. A.; Dunn-Sigouin, E.; Deser, C.
2017-12-01
Initial condition climate model ensembles suggest that regional temperature trends can be highly variable on decadal timescales due to characteristics of internal climate variability. Accounting for trend uncertainty due to internal variability is therefore necessary to contextualize recent observed temperature changes. However, while the variability of trends in a climate model ensemble can be evaluated directly (as the spread across ensemble members), internal variability simulated by a climate model may be inconsistent with observations. Observation-based methods for assessing the role of internal variability on trend uncertainty are therefore required. Here, we use a statistical resampling approach to assess trend uncertainty due to internal variability in historical 50-year (1966-2015) winter near-surface air temperature trends over North America. We compare this estimate of trend uncertainty to simulated trend variability in the NCAR CESM1 Large Ensemble (LENS), finding that uncertainty in wintertime temperature trends over North America due to internal variability is overestimated by CESM1 by an average of 32%. Our observation-based resampling approach is combined with the forced signal from LENS to produce an 'Observational Large Ensemble' (OLENS). The members of OLENS indicate a range of spatially coherent fields of temperature trends resulting from different sequences of internal variability consistent with observations. The smaller trend variability in OLENS suggests that uncertainty in the historical climate change signal in observations due to internal variability is less than suggested by LENS.
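A block bootstrap of detrended residuals is one common observation-based way to estimate trend uncertainty arising from internal variability. The sketch below uses a synthetic 50-year record and should not be read as the OLENS methodology itself; the block length and noise model are assumptions.

```python
import numpy as np

def trend(y):
    """OLS linear trend (units of y per time step)."""
    t = np.arange(len(y))
    return np.polyfit(t, y, 1)[0]

def bootstrap_trend_sd(residuals, forced, block=5, n_boot=500, seed=0):
    """Trend spread from internal variability: resample the detrended
    residuals in contiguous blocks (to keep short-term persistence),
    add them back to the forced signal, and refit trends."""
    rng = np.random.default_rng(seed)
    n = len(residuals)
    trends = []
    for _ in range(n_boot):
        starts = rng.integers(0, n - block, size=n // block + 1)
        resampled = np.concatenate([residuals[s:s + block] for s in starts])[:n]
        trends.append(trend(forced + resampled))
    return np.std(trends)

# Synthetic 50-year record: linear warming plus noise (illustrative only)
rng = np.random.default_rng(2)
n = 50
forced = 0.02 * np.arange(n)          # 0.02 degrees per year forced trend
obs = forced + rng.normal(0, 0.15, n)
t = np.arange(n)
residuals = obs - np.polyval(np.polyfit(t, obs, 1), t)
sd = bootstrap_trend_sd(residuals, forced)
```

Comparing such an observation-derived trend spread to the spread across ensemble members is the essence of checking whether a model's internal variability is too large or too small.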
NASA Technical Reports Server (NTRS)
Oda, T.; Ott, L.; Lauvaux, T.; Feng, S.; Bun, R.; Roman, M.; Baker, D. F.; Pawson, S.
2017-01-01
Fossil fuel carbon dioxide (CO2) emissions (FFCO2) are the largest input to the global carbon cycle on a decadal time scale. Because total emissions are assumed to be reasonably well constrained by fuel statistics, FFCO2 often serves as a reference in order to deduce carbon uptake by poorly understood terrestrial and ocean sinks. Conventional atmospheric CO2 flux inversions solve for spatially explicit regional sources and sinks and estimate land and ocean fluxes by subtracting FFCO2. Thus, errors in FFCO2 can propagate into the final inferred flux estimates. Gridded emissions are often based on disaggregation of emissions estimated at national or regional level. Although national and regional total FFCO2 are well known, gridded emission fields are subject to additional uncertainties due to the emission disaggregation. Assessing such uncertainties is often challenging because of the lack of physical measurements for evaluation. We first review difficulties in assessing uncertainties associated with gridded FFCO2 emission data and present several approaches for evaluation of such uncertainties at multiple scales. Given known limitations, inter-emission data differences are often used as a proxy for the uncertainty. The popular approach allows us to characterize differences in emissions, but does not allow us to fully quantify emission disaggregation biases. Our work aims to vicariously evaluate FFCO2 emission data using atmospheric models and measurements. We show a global simulation experiment where uncertainty estimates are propagated as an atmospheric tracer (uncertainty tracer) alongside CO2 in NASA's GEOS model and discuss implications of FFCO2 uncertainties in the context of flux inversions. We also demonstrate the use of high resolution urban CO2 simulations as a tool for objectively evaluating FFCO2 data over intense emission regions. 
Though this study focuses on FFCO2 emission data, the outcome of this study could also help improve the knowledge of similar gridded emissions data for non-CO2 compounds with similar emission characteristics.
NASA Astrophysics Data System (ADS)
Oda, T.; Ott, L. E.; Lauvaux, T.; Feng, S.; Bun, R.; Roman, M. O.; Baker, D. F.; Pawson, S.
2017-12-01
Fossil fuel carbon dioxide (CO2) emissions (FFCO2) are the largest input to the global carbon cycle on a decadal time scale. Because total emissions are assumed to be reasonably well constrained by fuel statistics, FFCO2 often serves as a reference in order to deduce carbon uptake by poorly understood terrestrial and ocean sinks. Conventional atmospheric CO2 flux inversions solve for spatially explicit regional sources and sinks and estimate land and ocean fluxes by subtracting FFCO2. Thus, errors in FFCO2 can propagate into the final inferred flux estimates. Gridded emissions are often based on disaggregation of emissions estimated at national or regional level. Although national and regional total FFCO2 are well known, gridded emission fields are subject to additional uncertainties due to the emission disaggregation. Assessing such uncertainties is often challenging because of the lack of physical measurements for evaluation. We first review difficulties in assessing uncertainties associated with gridded FFCO2 emission data and present several approaches for evaluation of such uncertainties at multiple scales. Given known limitations, inter-emission data differences are often used as a proxy for the uncertainty. The popular approach allows us to characterize differences in emissions, but does not allow us to fully quantify emission disaggregation biases. Our work aims to vicariously evaluate FFCO2 emission data using atmospheric models and measurements. We show a global simulation experiment where uncertainty estimates are propagated as an atmospheric tracer (uncertainty tracer) alongside CO2 in NASA's GEOS model and discuss implications of FFCO2 uncertainties in the context of flux inversions. We also demonstrate the use of high resolution urban CO2 simulations as a tool for objectively evaluating FFCO2 data over intense emission regions. 
Though this study focuses on FFCO2 emission data, the outcome of this study could also help improve the knowledge of similar gridded emissions data for non-CO2 compounds that share emission sectors.
Quantitative assessment of human exposures and health effects due to air pollution involve detailed characterization of impacts of air quality on exposure and dose. A key challenge is to integrate these three components on a consistent spatial and temporal basis taking into acco...
Andres, J.A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Boden, T.A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
2016-01-01
The monthly, gridded fossil-fuel CO2 emissions uncertainty estimates from 1950-2013 provided in this database are derived from time series of global, regional, and national fossil-fuel CO2 emissions (Boden et al. 2016). Andres et al. (2016) describes the basic methodology for estimating the uncertainty in the gridded fossil-fuel data product. This uncertainty is gridded at the same spatial and temporal scales as the mass magnitude maps. This gridded uncertainty includes uncertainty contributions from the spatial, temporal, proxy, and magnitude components used to create the magnitude map of FFCO2 emissions. Throughout this process, when assumptions had to be made or expert judgment employed, the general tendency in most cases was toward overestimating or increasing the magnitude of uncertainty.
NASA Technical Reports Server (NTRS)
Liu, Jianbo; Kummerow, Christian D.; Elsaesser, Gregory S.
2016-01-01
Despite continuous improvements in microwave sensors and retrieval algorithms, our understanding of precipitation uncertainty is quite limited, due primarily to inconsistent findings in studies that compare satellite estimates to in situ observations over different parts of the world. This study seeks to characterize the temporal and spatial properties of uncertainty in the Tropical Rainfall Measuring Mission Microwave Imager surface rainfall product over tropical ocean basins. Two uncertainty analysis frameworks are introduced to qualitatively evaluate the properties of uncertainty under a hierarchy of spatiotemporal data resolutions. The first framework (i.e. 'climate method') demonstrates that, apart from random errors and regionally dependent biases, a large component of the overall precipitation uncertainty is manifested in cyclical patterns that are closely related to large-scale atmospheric modes of variability. By estimating the magnitudes of major uncertainty sources independently, the climate method is able to explain 45-88% of the monthly uncertainty variability. The percentage is largely resolution dependent (with the lowest percentage explained associated with a 1 deg x 1 deg spatial/1 month temporal resolution, and highest associated with a 3 deg x 3 deg spatial/3 month temporal resolution). The second framework (i.e. 'weather method') explains regional mean precipitation uncertainty as a summation of uncertainties associated with individual precipitation systems. By further assuming that self-similar recurring precipitation systems yield qualitatively comparable precipitation uncertainties, the weather method can consistently resolve about 50 % of the daily uncertainty variability, with only limited dependence on the regions of interest.
Optimal Groundwater Extraction under Uncertainty and a Spatial Stock Externality
We introduce a model that incorporates two important elements to estimating welfare gains from groundwater management: stochasticity and a spatial stock externality. We estimate welfare gains resulting from optimal management under uncertainty as well as a gradual stock externali...
Stochastic Analysis and Probabilistic Downscaling of Soil Moisture
NASA Astrophysics Data System (ADS)
Deshon, J. P.; Niemann, J. D.; Green, T. R.; Jones, A. S.
2017-12-01
Soil moisture is a key variable for rainfall-runoff response estimation, ecological and biogeochemical flux estimation, and biodiversity characterization, each of which is useful for watershed condition assessment. These applications require not only accurate, fine-resolution soil-moisture estimates but also confidence limits on those estimates and soil-moisture patterns that exhibit realistic statistical properties (e.g., variance and spatial correlation structure). The Equilibrium Moisture from Topography, Vegetation, and Soil (EMT+VS) model downscales coarse-resolution (9-40 km) soil moisture from satellite remote sensing or land-surface models to produce fine-resolution (10-30 m) estimates. The model was designed to produce accurate deterministic soil-moisture estimates at multiple points, but the resulting patterns do not reproduce the variance or spatial correlation of observed soil-moisture patterns. The primary objective of this research is to generalize the EMT+VS model to produce a probability density function (pdf) for soil moisture at each fine-resolution location and time. Each pdf has a mean that is equal to the deterministic soil-moisture estimate, and the pdf can be used to quantify the uncertainty in the soil-moisture estimates and to simulate soil-moisture patterns. Different versions of the generalized model are hypothesized based on how uncertainty enters the model, whether the uncertainty is additive or multiplicative, and which distributions describe the uncertainty. These versions are then tested by application to four catchments with detailed soil-moisture observations (Tarrawarra, Satellite Station, Cache la Poudre, and Nerrigundah). The performance of the generalized models is evaluated by comparing the statistical properties of the simulated soil-moisture patterns to those of the observations and the deterministic EMT+VS model. 
The versions of the generalized EMT+VS model with normally distributed stochastic components produce soil-moisture patterns with more realistic statistical properties than the deterministic model. Additionally, the results suggest that the variance and spatial correlation of the stochastic soil-moisture variations do not vary consistently with the spatial-average soil moisture.
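An additive-Gaussian version of such a probabilistic downscaler can be sketched as below: each fine-scale deterministic estimate becomes the mean of a pdf from which realisations are drawn, clipped to physical bounds. The porosity bound and noise level are assumptions for illustration, not the EMT+VS model's calibrated values.

```python
import numpy as np

def soil_moisture_pdf_samples(theta_det, sigma, n=1000, seed=0):
    """Draw soil-moisture realisations centred on the deterministic
    fine-scale estimate (additive Gaussian uncertainty), clipped to
    physically plausible bounds [0, porosity]."""
    rng = np.random.default_rng(seed)
    samples = rng.normal(theta_det, sigma, size=(n,) + np.shape(theta_det))
    return np.clip(samples, 0.0, 0.45)   # 0.45 = assumed porosity

theta = np.array([0.12, 0.25, 0.33])     # deterministic estimates (m^3/m^3)
samples = soil_moisture_pdf_samples(theta, sigma=0.02)
means = samples.mean(axis=0)
```

By construction the ensemble mean recovers the deterministic estimate, while the spread of realisations supplies the confidence limits and the pattern statistics the deterministic model alone cannot.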
ICESat laser altimetry over small mountain glaciers
NASA Astrophysics Data System (ADS)
Treichler, Désirée; Kääb, Andreas
2016-09-01
Using sparsely glaciated southern Norway as a case study, we assess the potential and limitations of ICESat laser altimetry for analysing regional glacier elevation change in rough mountain terrain. Differences between ICESat GLAS elevations and reference elevation data are plotted over time to derive a glacier surface elevation trend for the ICESat acquisition period 2003-2008. We find spatially varying biases between ICESat and three tested digital elevation models (DEMs): the Norwegian national DEM, the SRTM DEM, and a high-resolution lidar DEM. For regional glacier elevation change, the spatial inconsistency of reference DEMs - a result of spatio-temporal merging - has the potential to significantly affect or dilute trends. Elevation uncertainties of all three tested DEMs exceed the ICESat elevation uncertainty by an order of magnitude; it is thus the reference DEMs, rather than ICESat, that limit the accuracy of the method. The ICESat sample matches the glacier size distribution of the study area well, includes small ice patches not commonly monitored in situ, and is large enough for spatial and thematic subsetting. Vertical offsets to ICESat elevations vary between glaciers in southern Norway due to spatially inconsistent reference DEM age. We introduce a per-glacier correction that removes these spatially varying offsets and considerably increases trend significance. Only after application of this correction do individual campaigns fit observed in situ glacier mass balance. Our correction also has the potential to improve glacier trend significance for other causes of spatially varying vertical offsets, for instance radar penetration into ice and snow for the SRTM DEM, or the mosaicking and merging that are common for national and global DEMs. After correction of the reference elevation bias, we find that ICESat provides a robust and realistic estimate of a moderately negative glacier mass balance of around -0.36 ± 0.07 m of ice per year.
This regional estimate agrees well with the heterogeneous but overall negative in situ glacier mass balance observed in the area.
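The per-glacier offset correction can be illustrated with synthetic elevation differences (all numbers hypothetical except the -0.36 m/yr trend taken from the abstract); subtracting each glacier's median difference removes the DEM-age offsets while preserving the common trend:

```python
import numpy as np

rng = np.random.default_rng(0)

true_trend = -0.36                      # m ice per year, from the abstract
n_pts, n_glaciers = 600, 30
t = rng.uniform(2003, 2008, n_pts)      # ICESat acquisition times
glacier = rng.integers(0, n_glaciers, n_pts)       # glacier id per footprint
offset = rng.normal(0.0, 5.0, n_glaciers)          # DEM-age-driven offsets (m)

# Elevation differences: per-glacier offset + common trend + noise
dh = offset[glacier] + true_trend * (t - 2003) + rng.normal(0.0, 2.0, n_pts)

# Per-glacier correction: subtract each glacier's median elevation difference
med = np.array([np.median(dh[glacier == g]) for g in range(n_glaciers)])
dh_corr = dh - med[glacier]

slope_raw = np.polyfit(t - 2003, dh, 1)[0]
slope_corr = np.polyfit(t - 2003, dh_corr, 1)[0]
```

The corrected differences have far less scatter, which is what drives the increase in trend significance reported above.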
NASA Astrophysics Data System (ADS)
Jang, Cheng-Shin; Liu, Chen-Wuing
2005-10-01
This study aimed to analyze the contamination potential associated with the reactive transport of nitrate-N and ammonium-N in the Choushui River alluvial fan, Taiwan, and to delineate a risk region to support development of a groundwater protection policy by 2021. In this area, an aquifer redox sequence provided a good understanding of the spatial distributions of nitrate-N and ammonium-N and of aerobic and anaerobic environments. Equiprobable hydraulic conductivity (K) fields reproduced by geostatistical methods characterized the spatial uncertainty of contaminant transport in the heterogeneous aquifer. Nitrogen contamination potential fronts for high and low threshold concentrations, based on a 95% risk probability, were used to assess different levels of risk. The simulated results reveal that the spatial uncertainty of the highly heterogeneous K fields governs the contamination potential assessment of the nitrogen compounds along the regional flow directions. The contamination potential of nitrate-N is more uncertain than that of ammonium-N. High nitrate-N concentrations (≥ 3 mg/L) are prevalent in the aerobic environment. The low-concentration nitrate-N plumes (0.5-3 mg/L) gradually migrate to the mid-fan area, to a maximum distance of 15 km from the aerobic region. The nitrate-N plumes pose a potential human health risk in the aerobic and anaerobic environments, whereas the ammonium-N plumes remain stably confined to the distal-fan and parts of the mid-fan areas.
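The exceedance-probability logic behind the 95%-risk contamination-potential fronts can be sketched as follows, using a hypothetical stack of equiprobable concentration realizations in place of the actual transport simulations:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stack of simulated nitrate-N concentrations (mg/L): one layer
# per equiprobable hydraulic-conductivity realization, on a small grid
n_real, ny, nx = 200, 10, 10
conc = rng.lognormal(mean=0.5, sigma=0.8, size=(n_real, ny, nx))

threshold = 3.0                          # high-concentration threshold (mg/L)
# Probability, across realizations, that each cell exceeds the threshold
p_exceed = (conc >= threshold).mean(axis=0)

# Cells inside a 95%-risk contamination-potential front
front_95 = p_exceed >= 0.95
```

A front at a lower threshold (e.g., 0.5 mg/L) is obtained the same way, and the two fronts together delineate the different levels of risk.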
Duhalde, Denisse J; Arumí, José L; Oyarzún, Ricardo A; Rivera, Diego A
2018-06-11
A fuzzy logic approach has been proposed to address the uncertainty caused by sparse data in assessing the intrinsic vulnerability of a groundwater system with parametric methods in Las Trancas Valley, in the Andes of south-central Chile, a popular tourist destination that nevertheless lacks centralized public drinking-water and sewage systems, a situation that is a potential source of groundwater pollution. Based on the DRASTIC, GOD, and EKv methods and expert knowledge of the study area, a Mamdani fuzzy inference approach was generated and the spatial data were processed in ArcGIS. The groundwater system exhibited areas with high, medium, and low intrinsic vulnerability indices. The fuzzy results were compared with those of the traditional methods and, in general, showed good spatial agreement, although significant changes were also identified in the spatial distribution of the indices. The Mamdani fuzzy approach has proven to be a useful and practical tool for assessing the intrinsic vulnerability of an aquifer under sparse-data conditions.
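A minimal Mamdani inference sketch, with one hypothetical input (depth to water table, as in DRASTIC) and illustrative membership functions rather than the study's actual rule base:

```python
import numpy as np

def ramp_down(x, lo, hi):
    """Membership falling from 1 (below lo) to 0 (above hi)."""
    return np.clip((hi - x) / (hi - lo), 0.0, 1.0)

def ramp_up(x, lo, hi):
    """Membership rising from 0 (below lo) to 1 (above hi)."""
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

# Output universe: vulnerability index from 0 (low) to 10 (high)
v = np.linspace(0.0, 10.0, 101)
vuln_low = ramp_down(v, 0.0, 5.0)
vuln_high = ramp_up(v, 5.0, 10.0)

def mamdani_vulnerability(depth_m):
    """Toy two-rule Mamdani system driven by depth to water table (m)."""
    mu_shallow = ramp_down(depth_m, 0.0, 15.0)
    mu_deep = ramp_up(depth_m, 5.0, 30.0)
    r1 = np.minimum(mu_shallow, vuln_high)  # shallow table -> high vulnerability
    r2 = np.minimum(mu_deep, vuln_low)      # deep table -> low vulnerability
    agg = np.maximum(r1, r2)                # max aggregation of rule outputs
    return float((v * agg).sum() / agg.sum())  # centroid defuzzification

shallow_idx = mamdani_vulnerability(2.0)
deep_idx = mamdani_vulnerability(28.0)
```

The real system combines several DRASTIC/GOD/EKv inputs, but the min-implication, max-aggregation, centroid-defuzzification pattern is the same.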
Digital soil mapping in assessment of land suitability for organic farming
NASA Astrophysics Data System (ADS)
Ghambashidze, Giorgi; Kentchiashvili, Naira; Tarkhnishvili, Maia; Jolokhava, Tamar; Meskhi, Tea
2017-04-01
Digital soil mapping (DSM) is a fast-developing subdiscipline of soil science that is gaining importance with the increased availability of spatial data. DSM is based on three main components: the input, in the form of field and laboratory observational methods; the process, in terms of spatial and non-spatial soil inference systems; and the output, in the form of spatial soil information systems that include rasters of prediction along with the uncertainty of prediction. Georgia is among the countries currently developing a spatial data infrastructure, which also includes soil-related spatial data. It is therefore important to demonstrate the capacity of DSM techniques for the planning and decision-making process, in which the assessment of land suitability is of major interest to those willing to grow agricultural crops. In this regard, land suitability assessment for establishing organic farms is in high demand, as the market for organically produced commodities continues to grow. This is the first attempt in Georgia to use DSM to predict areas with potential for organic farming development. The current approach is based on risk assessment of soil pollution with toxic elements (As, Hg, Pb, Cd, Cr) and prediction of the bio-availability of those elements to plants, using the example of a region of western Georgia where a detailed soil survey was conducted and a spatial soil database was created. The results of the study show the advantages of DSM for early-stage assessment; depending on the availability and quality of the input data, it can achieve acceptable accuracy.
NASA Technical Reports Server (NTRS)
Hassler, B.; Petropavlovskikh, I.; Staehelin, J.; August, T.; Bhartia, P. K.; Clerbaux, C.; Degenstein, D.; Maziere, M. De; Dinelli, B. M.; Dudhia, A.;
2014-01-01
Peak stratospheric chlorofluorocarbon (CFC) and other ozone depleting substance (ODS) concentrations were reached in the mid- to late 1990s. Detection and attribution of the expected recovery of the stratospheric ozone layer in an atmosphere with reduced ODSs as well as efforts to understand the evolution of stratospheric ozone in the presence of increasing greenhouse gases are key current research topics. These require a critical examination of the ozone changes with an accurate knowledge of the spatial (geographical and vertical) and temporal ozone response. For such an examination, it is vital that the quality of the measurements used be as high as possible and measurement uncertainties well quantified. In preparation for the 2014 United Nations Environment Programme (UNEP)/World Meteorological Organization (WMO) Scientific Assessment of Ozone Depletion, the SPARC/IO3C/IGACO-O3/NDACC (SI2N) Initiative was designed to study and document changes in the global ozone profile distribution. This requires assessing long-term ozone profile data sets in regards to measurement stability and uncertainty characteristics. The ultimate goal is to establish suitability for estimating long-term ozone trends to contribute to ozone recovery studies. Some of the data sets have been improved as part of this initiative with updated versions now available. This summary presents an overview of stratospheric ozone profile measurement data sets (ground and satellite based) available for ozone recovery studies. Here we document measurement techniques, spatial and temporal coverage, vertical resolution, native units and measurement uncertainties. In addition, the latest data versions are briefly described (including data version updates as well as detailing multiple retrievals when available for a given satellite instrument). Archive location information for each data set is also given.
Uncertainty in exposure to air pollution
NASA Astrophysics Data System (ADS)
Pebesma, Edzer; Helle, Kristina; Christoph, Stasch; Rasouli, Soora; Timmermans, Harry; Walker, Sam-Erik; Denby, Bruce
2013-04-01
To assess exposure to air pollution for a person or a group of people, one needs to know where the person or group is as a function of time, and what the air pollution is at those times and locations. In this study we used the Albatross activity-based model to assess the whereabouts of people and the uncertainties therein, and a probabilistic air quality system based on TAPM/EPISODE to assess air quality probabilistically. The outcomes of the two models were combined to assess exposure to air pollution and the errors in it, using the area around Rotterdam (Netherlands) as a case study. As the outcomes of both models come as Monte Carlo realizations, it was relatively easy to cancel one of the sources of uncertainty (movement of persons, or air pollution) in order to identify their respective contributions, and also to compare evaluations for individuals with averages for a population. As the output is probabilistic, and in addition spatially and temporally varying, the visual analysis of the complete results poses some challenges. This case study was one of the test cases in the UncertWeb project, which has built concepts and tools to realize the uncertainty-enabled model web; some of these tools and protocols will be shown and evaluated in this presentation. For the uncertainty of exposure, the uncertainty of air quality was more important than the uncertainty of people's locations, and this difference was stronger for PM10 than for NO2. The workflow was implemented as generic Web services in UncertWeb that also allow for inputs other than the simulated activity schedules, and for air quality at other resolutions. Due to this flexibility, however, the Web services require standardized formats, and the overlay algorithm is not optimized for the specific use case, resulting in data and processing overhead. Hence, we also implemented the full analysis for this specific case in R, as the model-web solution had difficulties with the massive data volumes.
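The realization-pairing trick, cancelling one uncertainty source by fixing it across Monte Carlo draws, can be sketched as follows with hypothetical concentration and location realizations:

```python
import numpy as np

rng = np.random.default_rng(7)

n_real, n_hours, n_zones = 100, 24, 5

# Hypothetical Monte Carlo realizations of hourly NO2 concentration per zone
conc = rng.lognormal(3.0, 0.4, size=(n_real, n_hours, n_zones))
# Realizations of a person's zone per hour (stand-in for Albatross schedules)
loc = rng.integers(0, n_zones, size=(n_real, n_hours))

hours = np.arange(n_hours)
realz = np.arange(n_real)[:, None]

# Full uncertainty: pair each air-quality realization with a schedule realization
exposure_full = conc[realz, hours, loc].mean(axis=1)

# Cancel schedule uncertainty: hold the person on one fixed schedule,
# keeping only the air-quality realizations
fixed_schedule = loc[0]
exposure_air_only = conc[:, hours, fixed_schedule].mean(axis=1)

var_full = exposure_full.var()
var_air_only = exposure_air_only.var()
```

Comparing `var_air_only` with the analogous location-only variance attributes the exposure uncertainty to its two sources.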
NASA Astrophysics Data System (ADS)
Armston, J.; Marselis, S.; Hancock, S.; Duncanson, L.; Tang, H.; Kellner, J. R.; Calders, K.; Disney, M.; Dubayah, R.
2017-12-01
The NASA Global Ecosystem Dynamics Investigation (GEDI) will place a multi-beam waveform lidar instrument on the International Space Station (ISS) to provide measurements of forest vertical structure globally. These measurements of structure will underpin empirical modelling of above-ground biomass density (AGBD) at the scale of individual GEDI lidar footprints (25 m diameter). The GEDI pre-launch calibration strategy for footprint-level models relies on linking AGBD estimates from ground plots with GEDI lidar waveforms simulated from coincident discrete-return airborne laser scanning data. Currently available ground plot data have variable and often large uncertainty at the spatial resolution of GEDI footprints due to poor colocation, allometric model error, sample size and plot edge effects. The relative importance of these sources of uncertainty partly depends on the quality of ground measurements and the region. It is usually difficult to know the magnitude of these uncertainties a priori, so a common approach to mitigate their influence on model training is to aggregate ground plot and waveform lidar data to a coarser spatial scale (0.25-1 ha). Here we examine the impacts of these principal sources of uncertainty using a 3D simulation approach. Sets of realistic tree models, generated from terrestrial laser scanning (TLS) data or from parametric modelling matched to tree inventory data, were assembled from four contrasting forest plots across tropical rainforest, deciduous temperate forest, and sclerophyll eucalypt woodland sites. These tree models were used to simulate geometrically explicit 3D scenes with variable tree density, size class and spatial distribution. GEDI lidar waveforms are simulated over ground plots within these scenes using Monte Carlo ray tracing, allowing the impact of varying ground plot and waveform colocation error, forest structure and edge effects on the relationship between ground plot AGBD and GEDI lidar waveforms to be directly assessed.
We quantify the sensitivity of calibration equations relating GEDI lidar structure measurements and AGBD to these factors at a range of spatial scales (0.0625-1ha) and discuss the implications for the expanding use of existing in situ ground plot data by GEDI.
Mitchard, Edward Ta; Saatchi, Sassan S; Baccini, Alessandro; Asner, Gregory P; Goetz, Scott J; Harris, Nancy L; Brown, Sandra
2013-10-26
Mapping the aboveground biomass of tropical forests is essential both for implementing conservation policy and reducing uncertainties in the global carbon cycle. Two medium resolution (500 m - 1000 m) pantropical maps of vegetation biomass have been recently published, and have been widely used by sub-national and national-level activities in relation to Reducing Emissions from Deforestation and forest Degradation (REDD+). Both maps use similar input data layers, and are driven by the same spaceborne LiDAR dataset providing systematic forest height and canopy structure estimates, but use different ground datasets for calibration and different spatial modelling methodologies. Here, we compare these two maps to each other, to the FAO's Forest Resource Assessment (FRA) 2010 country-level data, and to a high resolution (100 m) biomass map generated for a portion of the Colombian Amazon. We find substantial differences between the two maps, in particular in central Amazonia, the Congo basin, the south of Papua New Guinea, the Miombo woodlands of Africa, and the dry forests and savannas of South America. There is little consistency in the direction of the difference. However, when the maps are aggregated to the country or biome scale there is greater agreement, with differences cancelling out to a certain extent. When comparing country level biomass stocks, the two maps agree with each other to a much greater extent than to the FRA 2010 estimates. In the Colombian Amazon, both pantropical maps estimate higher biomass than the independent high resolution map, but show a similar spatial distribution of this biomass. Biomass mapping has progressed enormously over the past decade, to the stage where we can produce globally consistent maps of aboveground biomass. We show that there are still large uncertainties in these maps, in particular in areas with little field data. 
However, when used at a regional scale, different maps appear to converge, suggesting we can provide reasonable stock estimates when aggregated over large regions. Therefore we believe the largest uncertainties for REDD+ activities relate to the spatial distribution of biomass and to the spatial pattern of forest cover change, rather than to total globally or nationally summed carbon density.
Skakun, Sergii; Justice, Christopher O; Vermote, Eric; Roger, Jean-Claude
2018-01-01
The Visible/Infrared Imager/Radiometer Suite (VIIRS) aboard the Suomi National Polar-orbiting Partnership (S-NPP) satellite was launched in 2011, in part to provide continuity with the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument aboard the National Aeronautics and Space Administration's (NASA) Terra and Aqua remote sensing satellites. VIIRS will eventually replace MODIS for both land science and applications and add to the coarse-resolution, long-term data record. It is, therefore, important to provide the user community with an assessment of the consistency of equivalent products from the two sensors. For this study, we do this in the context of example agricultural monitoring applications. Surface reflectance, routinely delivered within the M{O,Y}D09 and VNP09 series of products, provides critical input for generating downstream products. Given the range of applications utilizing the normalized difference vegetation index (NDVI) generated from M{O,Y}D09 and VNP09 products, and the inherent differences between the MODIS and VIIRS sensors in calibration, spatial sampling, and spectral bands, the main objective of this study is to quantify the uncertainties related to the transition from MODIS-based to VIIRS-based NDVIs. In particular, we compare NDVIs derived from two sets of Level 3 MYD09 and VNP09 products with various spatial-temporal characteristics, namely 8-day composites at 500 m spatial resolution and daily Climate Modelling Grid (CMG) images at 0.05° spatial resolution. Spectral adjustment of the VIIRS I1 (red) and I2 (near-infrared, NIR) bands to match the MODIS/Aqua b1 (red) and b2 (NIR) bands is performed to remove a bias between the MODIS- and VIIRS-based red, NIR, and NDVI estimates. Overall, red reflectance, NIR reflectance, and NDVI uncertainties were 0.014, 0.029 and 0.056, respectively, for the 500 m product and 0.013, 0.016 and 0.032 for the 0.05° product.
The study shows that MODIS and VIIRS NDVI data can be used interchangeably for applications with an uncertainty of less than 0.02 to 0.05, depending on the scale of spatial aggregation, which is typically the uncertainty of the individual dataset.
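The band-adjustment step can be sketched as a per-band linear regression (a simplification of the spectral adjustment actually used; all reflectance values and biases below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical coincident surface reflectances, MODIS/Aqua as the reference
n = 500
red_modis = rng.uniform(0.02, 0.25, n)
nir_modis = rng.uniform(0.15, 0.45, n)
# VIIRS bands with an invented linear spectral bias plus noise
red_viirs = 0.98 * red_modis + 0.004 + rng.normal(0.0, 0.005, n)
nir_viirs = 1.02 * nir_modis - 0.006 + rng.normal(0.0, 0.005, n)

def adjust(band_v, band_m):
    """Fit band_m ~ a * band_v + b and return the adjusted VIIRS band."""
    slope, intercept = np.polyfit(band_v, band_m, 1)
    return slope * band_v + intercept

red_adj = adjust(red_viirs, red_modis)
nir_adj = adjust(nir_viirs, nir_modis)

def ndvi(nir, red):
    return (nir - red) / (nir + red)

ndvi_ref = ndvi(nir_modis, red_modis)
rmsd_raw = np.sqrt(np.mean((ndvi(nir_viirs, red_viirs) - ndvi_ref) ** 2))
rmsd_adj = np.sqrt(np.mean((ndvi(nir_adj, red_adj) - ndvi_ref) ** 2))
```

The RMSD of the adjusted NDVI against the reference plays the role of the 0.032-0.056 uncertainties reported in the abstract.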
NASA Astrophysics Data System (ADS)
Zhang, Bowen; Tian, Hanqin; Lu, Chaoqun; Chen, Guangsheng; Pan, Shufen; Anderson, Christopher; Poulter, Benjamin
2017-09-01
A wide range of estimates of global wetland methane (CH4) fluxes has been reported during the past two decades, giving rise to an urgent need to clarify and identify the uncertainty sources and to arrive at a reconciled estimate of global CH4 fluxes from wetlands. Most bottom-up estimates rely on wetland data sets, but these data sets are largely inconsistent in terms of both wetland extent and spatiotemporal distribution. A quantitative assessment of the uncertainties associated with these discrepancies among wetland data sets has not yet been well investigated. By comparing five widely used global wetland data sets (GISS, GLWD, Kaplan, GIEMS and SWAMPS-GLWD), in this study we found large differences in wetland extent, ranging from 5.3 to 10.2 million km2, as well as in the spatial and temporal distributions among the five data sets. These discrepancies in wetland data sets resulted in large biases in model-estimated global wetland CH4 emissions as simulated using the Dynamic Land Ecosystem Model (DLEM). The model simulations indicated that mean global wetland CH4 emissions during 2000-2007 were 177.2 ± 49.7 Tg CH4 yr-1, based on the five different data sets. The tropical regions contributed the largest portion of estimated CH4 emissions from global wetlands, but also had the largest discrepancy. Among the six continents, the largest uncertainty was found in South America. Thus, improved estimates of wetland extent and CH4 emissions in the tropical regions and South America would be a critical step toward an accurate estimate of global CH4 emissions. This uncertainty analysis also reveals an important need for the scientific community to generate a global-scale wetland data set with higher spatial resolution and shorter time intervals, by integrating multiple sources of field and satellite data with modeling approaches, for cross-scale extrapolation.
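The data-set-driven spread can be illustrated with hypothetical per-data-set emission totals (chosen so the ensemble mean matches the reported 177.2 Tg CH4 yr-1; the individual values are invented for illustration):

```python
import numpy as np

# Hypothetical global wetland CH4 emissions (Tg CH4 / yr) from one model
# driven by the five wetland-extent data sets (values illustrative only,
# chosen so that the ensemble mean matches the reported 177.2)
emissions = {
    "GISS": 135.0,
    "GLWD": 205.0,
    "Kaplan": 160.0,
    "GIEMS": 190.0,
    "SWAMPS-GLWD": 196.0,
}

vals = np.array(list(emissions.values()))
mean_flux = vals.mean()           # ensemble mean across data sets
spread = vals.std(ddof=1)         # data-set-driven uncertainty
```

The same mean-and-spread summary, computed per region, is what localizes the largest discrepancy to the tropics and South America.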
NASA Astrophysics Data System (ADS)
Wang, Jun; Wang, Yang; Zeng, Hui
2016-01-01
A key issue to address in synthesizing spatial data with variable support in spatial analysis and modeling is the change-of-support problem. We present an approach for solving the change-of-support and variable-support data fusion problems, based on geostatistical inverse modeling that explicitly accounts for differences in spatial support. The inverse model is applied here to produce both the best predictions on a target support and the associated prediction uncertainties, based on one or more measurements, while honoring those measurements. Spatial data covering large geographic areas often exhibit spatial nonstationarity and can lead to computational challenges due to large data sizes. We developed a local-window geostatistical inverse modeling approach to accommodate spatial nonstationarity and alleviate the computational burden. We conducted experiments using synthetic and real-world raster data. Synthetic data were generated, aggregated to multiple supports, and downscaled back to the original support to analyze the accuracy of the spatial predictions and the correctness of the prediction uncertainties. Similar experiments were conducted for real-world raster data, in which variable-support data were statistically fused to produce single-support predictions and associated uncertainties. The modeling results demonstrate that geostatistical inverse modeling can produce accurate predictions and associated prediction uncertainties, and that the proposed local-window approach offers a practical way to solve the well-known change-of-support and variable-support data fusion problems in spatial analysis and modeling.
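For a linear-Gaussian special case, the geostatistical inverse can be written in closed form; here the change of support is encoded in an aggregation matrix H mapping fine cells to coarse block averages (grid size, covariance, and observations are all hypothetical):

```python
import numpy as np

# Fine-support grid along a transect of n cells
n = 20
s = np.arange(n, dtype=float)

# Prior for the fine-support field: constant mean, exponential covariance
mu = np.full(n, 10.0)
Q = 4.0 * np.exp(-np.abs(s[:, None] - s[None, :]) / 5.0)

# Coarse-support data: two block averages; the change of support is
# encoded entirely in the aggregation matrix H
H = np.zeros((2, n))
H[0, :10] = 0.1
H[1, 10:] = 0.1
R = 0.01 * np.eye(2)            # measurement-error covariance
z = np.array([8.5, 12.0])       # hypothetical coarse observations

# Geostatistical inverse: best linear unbiased prediction + uncertainty
K = Q @ H.T @ np.linalg.inv(H @ Q @ H.T + R)
x_hat = mu + K @ (z - H @ mu)   # fine-support predictions
Sigma = Q - K @ H @ Q           # posterior covariance
pred_sd = np.sqrt(np.diag(Sigma))
```

The local-window variant applies this machinery within moving windows so that the covariance can vary in space and the matrices stay small.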
NASA Astrophysics Data System (ADS)
Hermans, Thomas; Nguyen, Frédéric; Klepikova, Maria; Dassargues, Alain; Caers, Jef
2018-04-01
In theory, aquifer thermal energy storage (ATES) systems can recover in winter the heat stored in the aquifer during summer to increase the energy efficiency of the system. In practice, the energy efficiency is often lower than expected from simulations due to spatial heterogeneity of hydraulic properties or unfavorable hydrogeological conditions. A proper design of ATES systems should therefore consider the uncertainty of the prediction related to those parameters. We use a novel framework called Bayesian Evidential Learning (BEL) to estimate the heat storage capacity of an alluvial aquifer using a heat tracing experiment. BEL is based on two main stages: pre- and post-field data acquisition. Before data acquisition, Monte Carlo simulations and global sensitivity analysis are used to assess the information content of the data for reducing the uncertainty of the prediction. After data acquisition, prior falsification and machine learning based on the same Monte Carlo simulations are used to assess uncertainty on key prediction variables directly from the observations. The result is a full quantification of the posterior distribution of the prediction conditioned on the observed data, without any explicit full model inversion. We demonstrate the methodology in field conditions and validate the framework using independent measurements.
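The two BEL stages can be caricatured with a toy forward model (the parameter, forward relations, and observed value below are invented for illustration; the study's actual models are far richer). Prior Monte Carlo samples link simulated data to simulated predictions, and after data acquisition the prediction is read off from prior samples whose simulated data resemble the observation, with no model inversion:

```python
import numpy as np

rng = np.random.default_rng(11)

# Stage 1 (pre-field): Monte Carlo over an uncertain parameter (hypothetical
# log10 hydraulic conductivity) with toy forward models for the observable
# (tracer breakthrough time) and the prediction (recoverable heat fraction)
n_mc = 2000
log_K = rng.normal(-4.0, 0.5, n_mc)
data = 10.0 ** (-log_K - 2.0) + rng.normal(0.0, 5.0, n_mc)       # hours
pred = 1.0 / (1.0 + 10.0 ** (-log_K - 4.0)) + rng.normal(0.0, 0.02, n_mc)

# Stage 2 (post-field): select prior samples whose simulated data resemble
# the field observation (a crude nearest-neighbour stand-in for the
# regression-based machine learning used in BEL)
obs = 120.0                                  # hypothetical field observation
nearest = np.argsort(np.abs(data - obs))[:200]
posterior_samples = pred[nearest]

post_mean = posterior_samples.mean()
post_sd = posterior_samples.std()
```

The posterior spread is much narrower than the prior spread of the prediction, which is the uncertainty reduction BEL quantifies before drilling or testing.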
Spatial dependence of extreme rainfall
NASA Astrophysics Data System (ADS)
Radi, Noor Fadhilah Ahmad; Zakaria, Roslinazairimah; Satari, Siti Zanariah; Azman, Muhammad Az-zuhri
2017-05-01
This study aims to model the spatial extreme daily rainfall process using max-stable models, which capture the dependence structure of the spatial properties of extreme rainfall. Three max-stable models are considered, namely the Smith, Schlather and Brown-Resnick models. The methods are applied to 12 selected rainfall stations in Kelantan, Malaysia. Most of the extreme rainfall data occur during the wet season, from October to December of 1971 to 2012; this period is chosen to ensure that the available data are sufficient to satisfy the assumption of stationarity. The dependence parameters, including the range and smoothness, are estimated using a composite likelihood approach. The bootstrap approach is then applied to generate synthetic extreme rainfall data for all models using the estimated dependence parameters. The goodness of fit between the observed extreme rainfall and the synthetic data is assessed using the composite likelihood information criterion (CLIC). Results show that the Schlather model is the best, followed by the Brown-Resnick and Smith models, based on the smallest CLIC value. Thus, max-stable models are suitable for modelling extreme rainfall in Kelantan. Studying spatial dependence in extreme rainfall modelling is important to reduce the uncertainties of the point estimates for the tail index: if the spatial dependency is estimated individually, the uncertainties will be large, and when the joint return level is of interest, taking the spatial dependence properties into account will improve the estimation process.
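One standard nonparametric check of max-stable dependence is the F-madogram estimate of the pairwise extremal coefficient (1 for full dependence, 2 for independence); the sketch below applies it to synthetic unit-Fréchet maxima with known dependence, not to the Kelantan data:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic annual maxima at two stations with unit-Frechet margins:
# a shared component (weight w) induces spatial dependence, so the true
# pairwise extremal coefficient is 2 - w (here 1.5)
n, w = 5000, 0.5
common = 1.0 / rng.exponential(1.0, n)
x = np.maximum(w * common, (1.0 - w) / rng.exponential(1.0, n))
y = np.maximum(w * common, (1.0 - w) / rng.exponential(1.0, n))

def extremal_coefficient(x, y):
    """F-madogram estimate of the pairwise extremal coefficient (1 to 2)."""
    fx = (np.argsort(np.argsort(x)) + 1.0) / (len(x) + 1.0)   # empirical CDF
    fy = (np.argsort(np.argsort(y)) + 1.0) / (len(y) + 1.0)
    nu = 0.5 * np.mean(np.abs(fx - fy))                        # F-madogram
    return (1.0 + 2.0 * nu) / (1.0 - 2.0 * nu)

theta = extremal_coefficient(x, y)
```

Plotting such estimates against inter-station distance gives an empirical dependence curve against which the fitted Smith, Schlather and Brown-Resnick models can be compared.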
Elderd, Bret D.; Dwyer, Greg; Dukic, Vanja
2013-01-01
Estimates of a disease’s basic reproductive rate R0 play a central role in understanding outbreaks and planning intervention strategies. In many calculations of R0, a simplifying assumption is that different host populations have effectively identical transmission rates. This assumption can lead to an underestimate of the overall uncertainty associated with R0, which, due to the non-linearity of epidemic processes, may result in a mis-estimate of epidemic intensity and miscalculated expenditures associated with public-health interventions. In this paper, we utilize a Bayesian method for quantifying the overall uncertainty arising from differences in population-specific basic reproductive rates. Using this method, we fit spatial and non-spatial susceptible-exposed-infected-recovered (SEIR) models to a series of 13 smallpox outbreaks. Five outbreaks occurred in populations that had been previously exposed to smallpox, while the remaining eight occurred in Native-American populations that were naïve to the disease at the time. The Native-American outbreaks were close in a spatial and temporal sense. Using Bayesian Information Criterion (BIC), we show that the best model includes population-specific R0 values. These differences in R0 values may, in part, be due to differences in genetic background, social structure, or food and water availability. As a result of these inter-population differences, the overall uncertainty associated with the “population average” value of smallpox R0 is larger, a finding that can have important consequences for controlling epidemics. In general, Bayesian hierarchical models are able to properly account for the uncertainty associated with multiple epidemics, provide a clearer understanding of variability in epidemic dynamics, and yield a better assessment of the range of potential risks and consequences that decision makers face. PMID:24021521
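The model-selection step, comparing a shared R0 against population-specific values with BIC, can be sketched with hypothetical R0 estimates (a simple Gaussian likelihood stands in for the SEIR fits):

```python
import numpy as np

# Hypothetical per-outbreak R0 point estimates: five previously exposed
# populations and eight naive populations (values invented for illustration)
r0 = np.array([3.2, 3.5, 3.0, 3.4, 3.1,                    # previously exposed
               5.8, 6.4, 5.5, 6.1, 6.6, 5.9, 6.2, 6.0])    # naive
group = np.array([0] * 5 + [1] * 8)
n = len(r0)

def gauss_loglik(x, mu):
    """Profile Gaussian log-likelihood with the MLE variance plugged in."""
    sigma2 = np.mean((x - mu) ** 2)
    return -0.5 * len(x) * (np.log(2.0 * np.pi * sigma2) + 1.0)

def bic(loglik, k):
    return k * np.log(n) - 2.0 * loglik

# Model A: one shared R0 for all outbreaks (mean + variance -> k = 2)
bic_shared = bic(gauss_loglik(r0, r0.mean()), 2)

# Model B: population-group-specific R0 (two means + variance -> k = 3)
mu_group = np.array([r0[group == g].mean() for g in (0, 1)])[group]
bic_group = bic(gauss_loglik(r0, mu_group), 3)
```

When the groups differ as sharply as in the smallpox data, the population-specific model wins despite its extra parameter, mirroring the paper's BIC result.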
NASA Astrophysics Data System (ADS)
Tadini, A.; Bisson, M.; Neri, A.; Cioni, R.; Bevilacqua, A.; Aspinall, W. P.
2017-06-01
This study presents new and revised data sets about the spatial distribution of past volcanic vents, eruptive fissures, and regional/local structures of the Somma-Vesuvio volcanic system (Italy). The innovative features of the study are the identification and quantification of important sources of uncertainty affecting interpretations of the data sets. In this regard, the spatial uncertainty of each feature is modeled by an uncertainty area, i.e., a geometric element typically represented by a polygon drawn around points or lines. The new data sets have been assembled as an updatable geodatabase that integrates and complements existing databases for Somma-Vesuvio. The data are organized into 4 data sets and stored as 11 feature classes (points and lines for feature locations and polygons for the associated uncertainty areas), totaling more than 1700 elements. More specifically, volcanic vent and eruptive fissure elements are subdivided into feature classes according to their associated eruptive styles: (i) Plinian and sub-Plinian eruptions (i.e., large- or medium-scale explosive activity); (ii) violent Strombolian and continuous ash emission eruptions (i.e., small-scale explosive activity); and (iii) effusive eruptions (including eruptions from both parasitic vents and eruptive fissures). Regional and local structures (i.e., deep faults) are represented as linear feature classes. To support interpretation of the eruption data, additional data sets are provided for Somma-Vesuvio geological units and caldera morphological features. In the companion paper, the data presented here, and the associated uncertainties, are used to develop a first vent opening probability map for the Somma-Vesuvio caldera, with specific attention focused on large or medium explosive events.
Sommerfreund, J; Arhonditsis, G B; Diamond, M L; Frignani, M; Capodaglio, G; Gerino, M; Bellucci, L; Giuliani, S; Mugnai, C
2010-03-01
A Monte Carlo analysis is used to quantify environmental parametric uncertainty in a multi-segment, multi-chemical model of the Venice Lagoon. Scientific knowledge, expert judgment and observational data are used to formulate prior probability distributions that characterize the uncertainty pertaining to 43 environmental system parameters. The propagation of this uncertainty through the model is then assessed by a comparative analysis of the moments (central tendency, dispersion) of the model output distributions. We also apply principal component analysis in combination with correlation analysis to identify the most influential parameters, thereby gaining mechanistic insights into the ecosystem functioning. We found that modeled concentrations of Cu, Pb, OCDD/F and PCB-180 varied by up to an order of magnitude, exhibiting both contaminant- and site-specific variability. These distributions generally overlapped with the measured concentration ranges. We also found that the uncertainty of the contaminant concentrations in the Venice Lagoon was characterized by two modes of spatial variability, mainly driven by the local hydrodynamic regime, which separate the northern and central parts of the lagoon and the more isolated southern basin. While spatial contaminant gradients in the lagoon were primarily shaped by hydrology, our analysis also shows that the interplay amongst the in-place historical pollution in the central lagoon, the local suspended sediment concentrations and the sediment burial rates exerts significant control on the variability of the contaminant concentrations. We conclude that the probabilistic analysis presented herein is valuable for quantifying uncertainty and probing its cause in over-parameterized models, while some of our results can be used to dictate where additional data collection efforts should focus on and the directions that future model refinement should follow. (c) 2009 Elsevier Inc. All rights reserved.
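A stripped-down version of the workflow, Monte Carlo propagation of parameter priors through a model followed by a rank-correlation influence ranking (a simple stand-in for the principal component and correlation analysis), using a toy contaminant-fate model rather than the Venice Lagoon model:

```python
import numpy as np

rng = np.random.default_rng(9)

n = 2000
# Hypothetical priors for three environmental parameters of a toy
# contaminant-fate model (not the Venice Lagoon parameterization)
burial = rng.lognormal(np.log(0.5), 0.3, n)      # sediment burial rate, cm/yr
susp_sed = rng.lognormal(np.log(20.0), 0.4, n)   # suspended sediment, mg/L
flushing = rng.lognormal(np.log(0.1), 0.2, n)    # hydraulic flushing, 1/day

# Toy steady-state water concentration: a fixed loading diluted by flushing
# and scavenged by settling particles
load = 100.0
conc = load / (flushing * 1.0e4 + susp_sed * burial * 5.0)

def influence(param):
    """Absolute rank (Spearman-style) correlation with the model output."""
    rp = np.argsort(np.argsort(param)).astype(float)
    rc = np.argsort(np.argsort(conc)).astype(float)
    return abs(np.corrcoef(rp, rc)[0, 1])

scores = {
    "flushing": influence(flushing),
    "suspended sediment": influence(susp_sed),
    "burial rate": influence(burial),
}
```

In this toy setup the hydrodynamic parameter dominates the output spread, echoing the abstract's finding that the local hydrodynamic regime drives the spatial variability.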
The Potential for Spatial Distribution Indices to Signal Thresholds in Marine Fish Biomass
Reuchlin-Hugenholtz, Emilie
2015-01-01
The frequently observed positive relationship between fish population abundance and spatial distribution suggests that changes in distribution can be indicative of trends in abundance. If contractions in spatial distribution precede declines in spawning stock biomass (SSB), spatial distribution reference points could complement the SSB reference points that are commonly used in marine conservation biology and fisheries management. When relevant spatial distribution information is integrated into fisheries management and recovery plans, risks and uncertainties associated with a plan based solely on the SSB criterion would be reduced. To assess the added value of spatial distribution data, we examine the relationship between SSB and four metrics of spatial distribution intended to reflect changes in population range, concentration, and density for 10 demersal populations (9 species) inhabiting the Scotian Shelf, Northwest Atlantic. Our primary purpose is to assess their potential to serve as indices of SSB, using fisheries independent survey data. We find that metrics of density offer the best correlate of spawner biomass. A decline in the frequency of encountering high density areas is associated with, and in a few cases preceded by, rapid declines in SSB in 6 of 10 populations. Density-based indices have considerable potential to serve both as an indicator of SSB and as spatially based reference points in fisheries management. PMID:25789624
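The density-based metric, the frequency of encountering high-density areas, can be sketched as follows with hypothetical tow-level survey densities:

```python
import numpy as np

rng = np.random.default_rng(13)

# Hypothetical survey catch densities (kg per standard tow)
reference = rng.lognormal(2.0, 1.0, 500)   # reference years, healthy stock
current = rng.lognormal(1.2, 1.0, 500)     # survey year with depleted SSB

# High-density threshold: upper decile of the reference distribution
threshold = np.quantile(reference, 0.9)

def high_density_frequency(densities, threshold):
    """Proportion of tows encountering high-density aggregations."""
    return float((densities >= threshold).mean())

f_ref = high_density_frequency(reference, threshold)
f_cur = high_density_frequency(current, threshold)
```

Tracked over survey years, a decline in this frequency is the kind of signal that, in 6 of the 10 populations, accompanied or preceded rapid SSB declines.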
Andres, R. J. [CDIAC]; Boden, T. A. [CDIAC]
2016-01-01
The annual, gridded fossil-fuel CO2 (FFCO2) emissions uncertainty estimates for 1950-2013 provided in this database are derived from time series of global, regional, and national fossil-fuel CO2 emissions (Boden et al. 2016). Andres et al. (2016) describes the basic methodology for estimating the uncertainty in the gridded fossil-fuel data product. This uncertainty is gridded at the same spatial and temporal scales as the mass magnitude maps. The gridded uncertainty includes contributions from the spatial, temporal, proxy, and magnitude components used to create the magnitude map of FFCO2 emissions. Throughout this process, when assumptions had to be made or expert judgment employed, the general tendency in most cases was toward overestimating or increasing the magnitude of uncertainty.
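One common convention for combining per-grid-cell uncertainty components, sketched below under the assumption that the components are independent (the abstract does not state the actual combination rule, so both quadrature and a conservative straight sum are shown; the grid and numbers are hypothetical):

```python
import numpy as np

# Hypothetical 2x2 grid of FFCO2 emissions (arbitrary units) and
# fractional (1-sigma) uncertainties for each contributing component.
emissions = np.array([[10.0, 4.0], [2.5, 8.0]])
spatial   = np.array([[0.10, 0.15], [0.20, 0.12]])
temporal  = np.full((2, 2), 0.05)
proxy     = np.full((2, 2), 0.08)
magnitude = np.full((2, 2), 0.09)

# Quadrature combination assumes independent components; the straight
# sum is a conservative upper bound, in the spirit of the abstract's
# stated tendency toward overestimating uncertainty.
frac_quad = np.sqrt(spatial**2 + temporal**2 + proxy**2 + magnitude**2)
frac_sum = spatial + temporal + proxy + magnitude  # conservative bound

abs_unc = emissions * frac_quad  # absolute uncertainty per cell
print(frac_quad.round(3))
print(abs_unc.round(3))
```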
NASA Astrophysics Data System (ADS)
Setegn, S. G.; Ortiz, J.; Melendez, J.; Barreto, M.; Torres-Perez, J. L.; Guild, L. S.
2015-12-01
Few studies in Puerto Rico have examined water resources availability and variability with respect to changing climate and land use. The main goal of the NASA-funded Human Impacts to Coastal Ecosystems in Puerto Rico (HICE-PR) project (the Río Loco watershed, southwest coast PR) is to evaluate the impacts of land use/land cover changes on the quality and extent of coastal and marine ecosystems (CMEs) in two priority watersheds in Puerto Rico (Manatí and Guánica). The main objective of this study is to set up a physically based, spatially distributed hydrological model, the Soil and Water Assessment Tool (SWAT), for the analysis of hydrological processes in the Río Grande de Manatí river basin. SWAT is a spatially distributed watershed model developed to predict the impact of land management practices on water, sediment and agricultural chemical yields in large complex watersheds. For efficient use of distributed models for hydrological and scenario analysis, it is important that these models pass through careful calibration and uncertainty analysis. The model was calibrated and validated using the Sequential Uncertainty Fitting (SUFI-2) calibration and uncertainty analysis algorithm. The model evaluation statistics for streamflow prediction show good agreement between the measured and simulated flows, verified by coefficients of determination and Nash-Sutcliffe efficiencies greater than 0.5. Keywords: Hydrological Modeling; SWAT; SUFI-2; Río Grande de Manatí; Puerto Rico
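The two evaluation statistics cited above are standard and easy to compute. The flows below are made-up monthly values for illustration, not data from the Río Grande de Manatí:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, <0 is worse than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r2(obs, sim):
    """Coefficient of determination (squared Pearson correlation)."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

# Illustrative monthly streamflows (m^3/s), not data from the study.
obs = np.array([12.0, 30.0, 45.0, 22.0, 15.0, 60.0, 38.0, 19.0])
sim = np.array([14.0, 27.0, 49.0, 20.0, 18.0, 55.0, 35.0, 21.0])

print(nse(obs, sim), r2(obs, sim))
```

Both exceeding 0.5 is the acceptance criterion the abstract applies to the calibrated SWAT model.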
NASA Astrophysics Data System (ADS)
Clough, B.; Russell, M.; Domke, G. M.; Woodall, C. W.
2016-12-01
Uncertainty estimates are needed to establish confidence in national forest carbon stocks and to verify changes reported to the United Nations Framework Convention on Climate Change. Good practice guidance from the Intergovernmental Panel on Climate Change stipulates that uncertainty assessments should neither exaggerate nor underestimate the actual error within carbon stocks, yet methodological guidance for forests has been hampered by limited understanding of how complex dynamics give rise to errors across spatial scales (i.e., individuals to continents). This talk highlights efforts to develop a multi-scale, data-driven framework for assessing uncertainty within the United States (US) forest carbon inventory, and focuses on challenges and opportunities for improving the precision of national forest carbon stock estimates. Central to our approach is the calibration of allometric models with a newly established legacy biomass database for North American tree species, and the use of hierarchical models to link these data with the Forest Inventory and Analysis (FIA) database as well as remote sensing datasets. Our work suggests substantial risk for misestimating key sources of uncertainty including: (1) attributing more confidence in allometric models than what is warranted by the best available data; (2) failing to capture heterogeneity in biomass stocks due to environmental variation at regional scales; and (3) ignoring spatial autocorrelation and other random effects that are characteristic of national forest inventory data. Our results suggest these sources of error may be much higher than is generally assumed, though these results must be understood with the limited scope and availability of appropriate calibration data in mind. 
In addition to reporting on important sources of uncertainty, this talk will discuss opportunities to improve the precision of national forest carbon stocks that are motivated by our use of data-driven forecasting including: (1) improving the taxonomic and geographic scope of available biomass data; (2) direct attribution of landscape-level heterogeneity in biomass stocks to specific ecological processes; and (3) integration of expert opinion and meta-analysis to lessen the influence of often highly variable datasets on biomass stock forecasts.
NASA Astrophysics Data System (ADS)
Yuan, J.; Kopp, R. E.
2017-12-01
Quantitative risk analysis of regional climate change is crucial for risk management and impact assessment of climate change. Two major challenges in assessing the risks of climate change are that (1) the CMIP5 model runs, which drive the EURO-CORDEX downscaling runs, do not cover the full range of uncertainty of future projections, and (2) climate models may underestimate the probability of tail risks (i.e. extreme events). To overcome these difficulties, this study offers a viable avenue in which a set of probabilistic climate ensembles is generated using the Surrogate/Model Mixed Ensemble (SMME) method. The probabilistic ensembles for temperature and precipitation are used to assess the range of uncertainty covered by five bias-corrected simulations from the high-resolution (0.11°) EURO-CORDEX database, which were selected by the PESETA (Projection of Economic impacts of climate change in Sectors of the European Union based on bottom-up Analysis) III project. Results show that the distribution of the SMME ensemble is notably wider than both the distribution of the raw GCM ensemble and the spread of the five EURO-CORDEX runs under RCP8.5. Tail risks are well represented by the SMME ensemble. Both the SMME ensemble and the EURO-CORDEX projections are aggregated to administrative level, and are integrated into the impact functions of PESETA III to assess climate risks in Europe. To further evaluate the uncertainties introduced by the downscaling process, we compare the five runs from EURO-CORDEX with runs from the corresponding GCMs. Time series of regional means, spatial patterns, and climate indices are examined for the future climate (2080-2099) deviating from the present climate (1981-2010). The downscaling processes do not appear to be trend-preserving; e.g. the increase in regional mean temperature from EURO-CORDEX is slower than that from the corresponding GCM. The spatial pattern comparison reveals that the differences between each pair of GCM and EURO-CORDEX runs are small in winter.
In summer, the temperatures of EURO-CORDEX are generally lower than those of the GCMs, while the drying trends in precipitation of EURO-CORDEX are smaller than those of the GCMs. Climate indices are significantly affected by the bias-correction and downscaling processes. Our study provides valuable information for selecting climate indices in different regions over Europe.
Accounting for spatial effects in land use regression for urban air pollution modeling.
Bertazzon, Stefania; Johnson, Markey; Eccles, Kristin; Kaplan, Gilaad G
2015-01-01
In order to accurately assess air pollution risks, health studies require spatially resolved pollution concentrations. Land-use regression (LUR) models estimate ambient concentrations at a fine spatial scale. However, spatial effects such as spatial non-stationarity and spatial autocorrelation can reduce the accuracy of LUR estimates by increasing regression errors and uncertainty, and statistical methods for resolving these effects (e.g., spatially autoregressive (SAR) and geographically weighted regression (GWR) models) may be difficult to apply simultaneously. We used an alternate approach to address spatial non-stationarity and spatial autocorrelation in LUR models for nitrogen dioxide. Traditional models were re-specified to include a variable capturing wind speed and direction, and re-fit as GWR models. Mean R(2) values for the resulting GWR-wind models (summer: 0.86, winter: 0.73) showed a 10-20% improvement over traditional LUR models. GWR-wind models effectively addressed both spatial effects and produced meaningful predictive models. These results suggest a useful method for improving spatially explicit models. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
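The core of GWR is a separate weighted least-squares fit at each location, with weights from a distance kernel. The sketch below uses synthetic monitoring sites with a spatially drifting land-use coefficient (the non-stationarity GWR is meant to capture); the covariates, bandwidth, and Gaussian kernel are illustrative choices, not the paper's specification:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300

# Synthetic sites: coordinates, a land-use covariate, a wind covariate.
xy = rng.uniform(0, 10, size=(n, 2))
landuse = rng.normal(size=n)
wind = rng.normal(size=n)
beta_true = 1.0 + 0.3 * xy[:, 0]          # land-use slope drifts west-to-east
no2 = 5 + beta_true * landuse + 0.5 * wind + rng.normal(0, 0.2, n)

X = np.column_stack([np.ones(n), landuse, wind])

def gwr_coef(site, bandwidth=2.0):
    """Weighted least squares at one site with a Gaussian distance kernel."""
    d = np.linalg.norm(xy - site, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ no2)

# Local land-use slopes at the west and east edges differ, whereas a
# single global regression would average the drift away.
west = gwr_coef(np.array([1.0, 5.0]))
east = gwr_coef(np.array([9.0, 5.0]))
print(west[1], east[1])
```

As the bandwidth grows, the local fits collapse toward global ordinary least squares, which is why bandwidth selection matters in practice.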
NASA Astrophysics Data System (ADS)
Shoaib, Syed Abu; Marshall, Lucy; Sharma, Ashish
2018-06-01
Every model used to characterise a real-world process is affected by uncertainty. Selecting a suitable model is a vital aspect of engineering planning and design. Observation or input errors make the prediction of modelled responses more uncertain. Using a recently developed attribution metric, this study develops a method for analysing variability in model inputs together with model structure variability, to quantify their relative contributions in typical hydrological modelling applications. The Quantile Flow Deviation (QFD) metric is used to assess these alternate sources of uncertainty. The Australian Water Availability Project (AWAP) precipitation data for four different Australian catchments are used to analyse the impact of spatial rainfall variability on simulated streamflow variability via the QFD. The QFD metric attributes the variability in flow ensembles to uncertainty associated with the selection of a model structure and input time series. For the case study catchments, the relative contribution of input uncertainty due to rainfall is higher than that due to potential evapotranspiration, and overall input uncertainty is significant compared to model structure and parameter uncertainty. Overall, this study investigates the propagation of input uncertainty in a daily streamflow modelling scenario and demonstrates how input errors manifest across different streamflow magnitudes.
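A quantile-wise view of ensemble deviation, in the spirit of the QFD, can be sketched as below. This is one plausible reading of the metric (mean absolute deviation of each ensemble member's flow quantiles from a reference), with a synthetic flow ensemble; the paper's exact QFD definition may differ in detail:

```python
import numpy as np

rng = np.random.default_rng(3)
n_days, n_members = 3650, 8

# Synthetic daily streamflow ensemble: one flow regime perturbed by
# multiplicative input/structural error (illustrative only).
base = rng.gamma(shape=2.0, scale=5.0, size=n_days)
ensemble = base * rng.lognormal(0, 0.2, size=(n_members, n_days))

quantiles = np.linspace(0.05, 0.95, 19)
member_q = np.quantile(ensemble, quantiles, axis=1)   # (19, n_members)
ref_q = np.quantile(base, quantiles)

# Quantile-wise deviation of the ensemble from the reference flow:
# shows at which flow magnitudes uncertainty concentrates.
qfd = np.abs(member_q - ref_q[:, None]).mean(axis=1)
print(np.round(qfd, 2))
```

With multiplicative errors, the deviation grows toward the high-flow quantiles, which is exactly the "errors manifest across different streamflow magnitudes" behaviour the study examines.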
Babcock, Chad; Finley, Andrew O.; Bradford, John B.; Kolka, Randall K.; Birdsey, Richard A.; Ryan, Michael G.
2015-01-01
Many studies and production inventory systems have shown the utility of coupling covariates derived from Light Detection and Ranging (LiDAR) data with forest variables measured on georeferenced inventory plots through regression models. The objective of this study was to propose and assess the use of a Bayesian hierarchical modeling framework that accommodates both residual spatial dependence and non-stationarity of model covariates through the introduction of spatial random effects. We explored this objective using four forest inventory datasets that are part of the North American Carbon Program, each comprising point-referenced measures of above-ground forest biomass and discrete LiDAR. For each dataset, we considered at least five regression model specifications of varying complexity. Models were assessed based on goodness of fit criteria and predictive performance using a 10-fold cross-validation procedure. Results showed that the addition of spatial random effects to the regression model intercept improved fit and predictive performance in the presence of substantial residual spatial dependence. Additionally, in some cases, allowing either some or all regression slope parameters to vary spatially, via the addition of spatial random effects, further improved model fit and predictive performance. In other instances, models showed improved fit but decreased predictive performance—indicating over-fitting and underscoring the need for cross-validation to assess predictive ability. The proposed Bayesian modeling framework provided access to pixel-level posterior predictive distributions that were useful for uncertainty mapping, diagnosing spatial extrapolation issues, revealing missing model covariates, and discovering locally significant parameters.
NASA Astrophysics Data System (ADS)
Blanchard-Wrigglesworth, E.; Barthélemy, A.; Chevallier, M.; Cullather, R.; Fučkar, N.; Massonnet, F.; Posey, P.; Wang, W.; Zhang, J.; Ardilouze, C.; Bitz, C. M.; Vernieres, G.; Wallcraft, A.; Wang, M.
2017-08-01
Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or forecast post-processing (bias correction) techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.
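The effect of forecast post-processing described above can be illustrated with the simplest scheme, a per-model mean-bias correction estimated over a hindcast period. The numbers below are synthetic (five fictional models with fixed biases), and real SIO post-processing may use more elaborate techniques:

```python
import numpy as np

rng = np.random.default_rng(4)
n_models, n_years = 5, 20

# Synthetic September sea-ice extent (10^6 km^2): observations plus
# model forecasts that each carry a distinct systematic bias.
obs = 5.0 + rng.normal(0, 0.4, n_years)
bias = np.array([1.2, -0.8, 0.5, -1.5, 0.3])
forecasts = obs + bias[:, None] + rng.normal(0, 0.3, (n_models, n_years))

# Estimate each model's mean bias over a hindcast window, then remove it.
hindcast = slice(0, 15)
est_bias = (forecasts[:, hindcast] - obs[hindcast]).mean(axis=1)
corrected = forecasts - est_bias[:, None]

# Model-to-model spread (a proxy for "model uncertainty") shrinks
# sharply after correction, as the abstract describes.
spread_raw = forecasts[:, 15:].std(axis=0).mean()
spread_corr = corrected[:, 15:].std(axis=0).mean()
print(spread_raw, spread_corr)
```

After correction the residual spread is set by irreducible noise rather than systematic model differences, mirroring the abstract's finding that post-processing shifts the dominant uncertainty source.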
Multi-scale landslide hazard assessment: Advances in global and regional methodologies
NASA Astrophysics Data System (ADS)
Kirschbaum, Dalia; Peters-Lidard, Christa; Adler, Robert; Hong, Yang
2010-05-01
The increasing availability of remotely sensed surface data and precipitation provides a unique opportunity to explore how smaller-scale landslide susceptibility and hazard assessment methodologies may be applicable at larger spatial scales. This research first considers an emerging satellite-based global algorithm framework, which evaluates how the landslide susceptibility and satellite-derived rainfall estimates can forecast potential landslide conditions. An analysis of this algorithm using a newly developed global landslide inventory catalog suggests that forecasting errors are geographically variable due to improper weighting of surface observables, resolution of the current susceptibility map, and limitations in the availability of landslide inventory data. These methodological and data limitation issues can be more thoroughly assessed at the regional level, where available higher resolution landslide inventories can be applied to empirically derive relationships between surface variables and landslide occurrence. The regional empirical model shows improvement over the global framework in advancing near real-time landslide forecasting efforts; however, there are many uncertainties and assumptions surrounding such a methodology that decrease the functionality and utility of this system. This research seeks to improve upon this initial concept by exploring the potential opportunities and methodological structure needed to advance larger-scale landslide hazard forecasting and make it more of an operational reality. Sensitivity analysis of the surface and rainfall parameters in the preliminary algorithm indicates that surface data resolution and the interdependency of variables must be more appropriately quantified at local and regional scales. Additionally, integrating available surface parameters must be approached in a more theoretical, physically-based manner to better represent the physical processes underlying slope instability and landslide initiation. 
Several rainfall infiltration and hydrological flow models have been developed to model slope instability at small spatial scales. This research investigates the potential of applying a more quantitative hydrological model to larger spatial scales, utilizing satellite and surface data inputs that are obtainable over different geographic regions. Due to the significant role that data and methodological uncertainties play in the effectiveness of landslide hazard assessment outputs, the methodology and data inputs are considered within an ensemble uncertainty framework in order to better resolve the contribution and limitations of model inputs and to more effectively communicate the model skill for improved landslide hazard assessment.
The role of satellite remote sensing in structured ecosystem risk assessments.
Murray, Nicholas J; Keith, David A; Bland, Lucie M; Ferrari, Renata; Lyons, Mitchell B; Lucas, Richard; Pettorelli, Nathalie; Nicholson, Emily
2018-04-01
The current set of global conservation targets requires methods for monitoring the changing status of ecosystems. Protocols for ecosystem risk assessment are uniquely suited to this task, providing objective syntheses of a wide range of data to estimate the likelihood of ecosystem collapse. Satellite remote sensing can deliver ecologically relevant, long-term datasets suitable for analysing changes in ecosystem area, structure and function at temporal and spatial scales relevant to risk assessment protocols. However, there is considerable uncertainty about how to select and effectively utilise remotely sensed variables for risk assessment. Here, we review the use of satellite remote sensing for assessing spatial and functional changes of ecosystems, with the aim of providing guidance on the use of these data in ecosystem risk assessment. We suggest that decisions on the use of satellite remote sensing should be made a priori and deductively with the assistance of conceptual ecosystem models that identify the primary indicators representing the dynamics of a focal ecosystem. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Alzraiee, Ayman H.; Bau, Domenico A.; Garcia, Luis A.
2013-06-01
Effective sampling of hydrogeological systems is essential in guiding groundwater management practices. Optimal sampling of groundwater systems has previously been formulated based on the assumption that heterogeneous subsurface properties can be modeled using a geostatistical approach. Therefore, the monitoring schemes have been developed to concurrently minimize the uncertainty in the spatial distribution of systems' states and parameters, such as the hydraulic conductivity K and the hydraulic head H, and the uncertainty in the geostatistical model of system parameters using a single objective function that aggregates all objectives. However, it has been shown that the aggregation of possibly conflicting objective functions is sensitive to the adopted aggregation scheme and may lead to distorted results. In addition, the uncertainties in geostatistical parameters affect the uncertainty in the spatial prediction of K and H according to a complex nonlinear relationship, which has often been ineffectively evaluated using a first-order approximation. In this study, we propose a multiobjective optimization framework to assist the design of monitoring networks of K and H with the goal of optimizing their spatial predictions and estimating the geostatistical parameters of the K field. The framework stems from the combination of a data assimilation (DA) algorithm and a multiobjective evolutionary algorithm (MOEA). The DA algorithm is based on the ensemble Kalman filter, a Monte-Carlo-based Bayesian update scheme for nonlinear systems, which is employed to approximate the posterior uncertainty in K, H, and the geostatistical parameters of K obtained by collecting new measurements. Multiple MOEA experiments are used to investigate the trade-off among design objectives and identify the corresponding monitoring schemes. The methodology is applied to design a sampling network for a shallow unconfined groundwater system located in Rocky Ford, Colorado. 
Results indicate that the effect of uncertainties associated with the geostatistical parameters on the spatial prediction might be significantly alleviated (by up to 80% of the prior uncertainty in K and by 90% of the prior uncertainty in H) by sampling evenly distributed measurements with a spatial measurement density of more than 1 observation per 60 m × 60 m grid block. In addition, exploration of the interaction of objective functions indicates that the ability of head measurements to reduce the uncertainty associated with the correlation scale is comparable to the effect of hydraulic conductivity measurements.
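The data assimilation step at the heart of the framework above, the ensemble Kalman filter update, can be sketched for a tiny spatially correlated state. The four-cell field, correlation matrix, and observation are hypothetical; this is the standard stochastic (perturbed-observation) EnKF update, not the study's full implementation:

```python
import numpy as np

rng = np.random.default_rng(5)
n_ens, n_state = 100, 4

# Prior ensemble of log-conductivity at 4 locations, made spatially
# correlated so one observation also informs neighbouring cells.
prior = rng.normal(0.0, 1.0, size=(n_ens, n_state))
prior = prior @ np.linalg.cholesky(
    np.array([[1.0, 0.6, 0.3, 0.1],
              [0.6, 1.0, 0.6, 0.3],
              [0.3, 0.6, 1.0, 0.6],
              [0.1, 0.3, 0.6, 1.0]])).T

# Observe the first cell with noise (perturbed observations per member).
H = np.array([[1.0, 0.0, 0.0, 0.0]])
obs, obs_var = 1.5, 0.1**2
y_pert = obs + rng.normal(0, np.sqrt(obs_var), n_ens)

A = prior - prior.mean(axis=0)            # ensemble anomalies
P = A.T @ A / (n_ens - 1)                 # sample covariance
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + obs_var)  # Kalman gain
posterior = prior + (y_pert[:, None] - prior @ H.T) * K.T

# The observed cell collapses toward the datum, and correlated
# neighbours are updated too -- this is how new measurements reduce
# spatial prediction uncertainty inside the monitoring-design loop.
print(prior.std(axis=0).round(2))
print(posterior.std(axis=0).round(2))
```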
NASA Astrophysics Data System (ADS)
Troldborg, Mads; Nowak, Wolfgang; Lange, Ida V.; Santos, Marta C.; Binning, Philip J.; Bjerg, Poul L.
2012-09-01
Mass discharge estimates are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Such estimates are, however, rather uncertain as they integrate uncertain spatial distributions of both concentration and groundwater flow. Here a geostatistical simulation method for quantifying the uncertainty of the mass discharge across a multilevel control plane is presented. The method accounts for (1) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics, (2) measurement uncertainty, and (3) uncertain source zone and transport parameters. The method generates conditional realizations of the spatial flow and concentration distribution. An analytical macrodispersive transport solution is employed to simulate the mean concentration distribution, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed deviations from this mean solution. By combining the flow and concentration realizations, a mass discharge probability distribution is obtained. The method has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is demonstrated on a field site contaminated with chlorinated ethenes. For this site, we show that including a physically meaningful concentration trend and the cosimulation of hydraulic conductivity and hydraulic gradient across the transect helps constrain the mass discharge uncertainty. The number of sampling points required for accurate mass discharge estimation and the relative influence of different data types on mass discharge uncertainty is discussed.
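The final step of such a method, turning conditional realizations of flow and concentration into a mass discharge distribution, reduces to summing flux times concentration times area over the control-plane cells for each realization. The sketch below uses unconditioned lognormal fields as stand-ins for the paper's geostatistically conditioned realizations; all grid sizes and parameters are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(6)
n_real, n_cells = 5000, 24

cell_area = 2.0  # m^2 per control-plane cell (hypothetical grid)

# Stand-ins for conditional realizations: lognormal Darcy flux (m/d)
# and concentration (g/m^3) fields across the control plane.
q = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=(n_real, n_cells))
c = rng.lognormal(mean=np.log(1.0), sigma=1.0, size=(n_real, n_cells))

# Mass discharge per realization: sum of flux * concentration * area.
md = (q * c * cell_area).sum(axis=1)   # g/d

# Probability distribution of mass discharge for risk assessment.
p10, p50, p90 = np.percentile(md, [10, 50, 90])
print(p50, (p10, p90))
```

In the actual method the q and c realizations are conditioned on site data and mutually consistent, which is what constrains the resulting distribution.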
NASA Astrophysics Data System (ADS)
Schlegel, Nicole-Jeanne; Boening, Carmen; Larour, Eric; Limonadi, Daniel; Schodlok, Michael; Seroussi, Helene; Watkins, Michael
2017-04-01
Research and development activities at the Jet Propulsion Laboratory (JPL) currently support the creation of a framework to formally evaluate the observational needs within earth system science. One of the pilot projects of this effort aims to quantify uncertainties in global mean sea level rise projections, due to contributions from the continental ice sheets. Here, we take advantage of established uncertainty quantification tools embedded within the JPL-University of California at Irvine Ice Sheet System Model (ISSM). We conduct sensitivity and Monte-Carlo style sampling experiments on forward simulations of the Greenland and Antarctic ice sheets. By varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges, we assess the impact of the different parameter ranges on century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.
NASA Astrophysics Data System (ADS)
Koch, J.; Jensen, K. H.; Stisen, S.
2017-12-01
Hydrological models that integrate numerical process descriptions across compartments of the water cycle are typically required to undergo thorough model calibration in order to estimate suitable effective model parameters. In this study, we apply a spatially distributed hydrological model code which couples the saturated zone with the unsaturated zone and the energy partitioning at the land surface. We conduct a comprehensive multi-constraint model calibration against nine independent observational datasets which reflect both the temporal and the spatial behavior of the hydrological response of a 1000 km² catchment in Denmark. The datasets are obtained from satellite remote sensing and in-situ measurements and cover five keystone hydrological variables: discharge, evapotranspiration, groundwater head, soil moisture and land surface temperature. Results indicate that a balanced optimization can be achieved in which errors on the objective functions for all nine observational datasets are reduced simultaneously. The applied calibration framework was tailored with focus on improving the spatial pattern performance; however, results suggest that the optimization is still more prone to improve the temporal dimension of model performance. This study features a post-calibration linear uncertainty analysis, which allows quantifying parameter identifiability, i.e. the worth of a specific observational dataset for inferring model parameter values through calibration. Furthermore, the ability of an observation to reduce predictive uncertainty is assessed as well. Such findings have concrete implications for the design of model calibration frameworks and, in more general terms, for the acquisition of data in hydrological observatories.
Uncertainties in Coastal Ocean Color Products: Impacts of Spatial Sampling
NASA Technical Reports Server (NTRS)
Pahlevan, Nima; Sarkar, Sudipta; Franz, Bryan A.
2016-01-01
With increasing demands for ocean color (OC) products with improved accuracy and well-characterized, per-retrieval uncertainty budgets, it is vital to decompose overall estimated errors into their primary components. Amongst the various elements contributing to the uncertainty of an OC observation (e.g., instrument calibration, atmospheric correction, inversion algorithms), less attention has been paid to uncertainties associated with spatial sampling. In this paper, we simulate MODIS (aboard both Aqua and Terra) and VIIRS OC products using 30 m resolution OC products derived from the Operational Land Imager (OLI) aboard Landsat-8, to examine impacts of spatial sampling on both cross-sensor product intercomparisons and in-situ validations of Rrs products in coastal waters. Various OLI OC products representing different productivity levels and in-water spatial features were scanned for one full orbital-repeat cycle of each ocean color satellite. While some view-angle-dependent differences in simulated Aqua-MODIS and VIIRS were observed, the average (absolute) uncertainties in product intercomparisons (due to differences in spatial sampling) at regional scales are found to be 1.8%, 1.9%, 2.4%, 4.3%, 2.7%, 1.8%, and 4% for the Rrs(443), Rrs(482), Rrs(561), Rrs(655), Chla, Kd(482), and bbp(655) products, respectively. It is also found that, depending on in-water spatial variability and the sensor's footprint size, the errors for an in-situ validation station in coastal areas can reach as high as ±18%. 
We conclude that a) expected biases induced by spatial sampling in product intercomparisons are mitigated when products are averaged over at least 7 km × 7 km areas, b) VIIRS observations, with improved consistency in cross-track spatial sampling, yield more precise calibration/validation statistics than those of MODIS, and c) use of a single pixel centered on in-situ coastal stations provides an optimal sampling size for validation efforts. These findings will have implications for enhancing our understanding of uncertainties in ocean color retrievals and for planning future ocean color missions and the associated calibration/validation exercises.
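The spatial-sampling effect in conclusion (a) can be demonstrated with a toy experiment: two sensors with the same footprint but shifted sampling grids disagree at native resolution, and the disagreement shrinks once both products are averaged over larger blocks. The scene, footprint sizes, and offsets below are illustrative stand-ins, not OLI/MODIS/VIIRS geometry:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic 30 m reflectance scene (1200 x 1200 pixels = 36 km) with
# spatial structure imposed by smoothing white noise.
field = rng.normal(0.005, 0.002, size=(1200, 1200))
kernel = np.ones(15) / 15
field = np.apply_along_axis(lambda v: np.convolve(v, kernel, "same"), 1, field)
field = np.apply_along_axis(lambda v: np.convolve(v, kernel, "same"), 0, field)

def block_mean(a, b, offset=0):
    """Average b x b pixel blocks, optionally shifting the sampling grid."""
    a = a[offset:, offset:]
    n = min(a.shape) // b
    return a[: n * b, : n * b].reshape(n, b, n, b).mean(axis=(1, 3))

# Two sensors with ~720 m footprints whose sampling grids are shifted
# by half a footprint: a stand-in for cross-sensor spatial sampling.
sensor_a = block_mean(field, 24)              # 50 x 50 grid
sensor_b = block_mean(field, 24, offset=12)   # 49 x 49 grid

# Native-resolution intercomparison vs. comparison after averaging over
# ~6.5 km blocks (9 footprints), in the spirit of the 7 km x 7 km rule.
diff_native = np.abs(sensor_a[:49, :49] - sensor_b).mean()
agg_a = block_mean(sensor_a[:49, :49], 9)
agg_b = block_mean(sensor_b, 9)
diff_agg = np.abs(agg_a - agg_b).mean()
print(diff_native, diff_agg)
```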
NASA Astrophysics Data System (ADS)
Jacquin, A. P.
2012-04-01
This study is intended to quantify the impact of uncertainty about precipitation spatial distribution on the predictive uncertainty of a snowmelt runoff model. This problem is especially relevant in mountain catchments with a sparse precipitation observation network and relatively short precipitation records. The model analysed is a conceptual watershed model operating at a monthly time step. The model divides the catchment into five elevation zones, where the fifth zone corresponds to the catchment's glaciers. Precipitation amounts at each elevation zone i are estimated as the product between observed precipitation at a station and a precipitation factor FPi. If other precipitation data are not available, these precipitation factors must be adjusted during the calibration process and are thus seen as parameters of the model. In the case of the fifth zone, glaciers are seen as an inexhaustible source of water that melts when the snow cover is depleted. The catchment case study is Aconcagua River at Chacabuquito, located in the Andean region of Central Chile. The model's predictive uncertainty is measured in terms of the output variance of the mean squared error of the Box-Cox transformed discharge, the relative volumetric error, and the weighted average of snow water equivalent in the elevation zones at the end of the simulation period. Sobol's variance decomposition (SVD) method is used for assessing the impact of precipitation spatial distribution, represented by the precipitation factors FPi, on the model's predictive uncertainty. In the SVD method, the first-order effect of a parameter (or group of parameters) indicates the fraction of predictive uncertainty that could be reduced if the true value of this parameter (or group) was known. Similarly, the total effect of a parameter (or group) measures the fraction of predictive uncertainty that would remain if the true value of this parameter (or group) was unknown, but all the remaining model parameters could be fixed. 
In this study, first-order and total effects of the group of precipitation factors FP1-FP4, and of the precipitation factor FP5, are calculated separately. First-order and total effects of the group FP1-FP4 are much higher than those of the factor FP5, which are negligible. This situation is due to the fact that the actual value taken by FP5 does not have much influence on the contribution of the glacier zone to the catchment's output discharge, which is mainly limited by incident solar radiation. In addition, first-order effects indicate that, on average, nearly 25% of predictive uncertainty could be reduced if the true values of the precipitation factors FPi could be known, but no information was available on the appropriate values for the remaining model parameters. Finally, the total effects of the precipitation factors FP1-FP4 are close to 41% on average, implying that even if the appropriate values for the remaining model parameters could be fixed, predictive uncertainty would still be quite high if the spatial distribution of precipitation remained unknown. Acknowledgements: This research was funded by FONDECYT, Research Project 1110279.
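Sobol first-order and total effects can be estimated with the standard pick-freeze (A/B/AB-matrix) scheme. The toy model below mimics the structure the abstract reports: two influential inputs (cf. FP1-FP4) and one nearly inert input (cf. FP5); it is not the snowmelt model, and the Saltelli/Jansen estimators are one common choice among several:

```python
import numpy as np

rng = np.random.default_rng(8)
n, k = 20000, 3

# Toy model: strong dependence on x0 and x1, negligible on x2.
def model(x):
    return 4.0 * x[:, 0] + 2.0 * x[:, 1] * x[:, 0] + 0.05 * x[:, 2]

A = rng.uniform(size=(n, k))
B = rng.uniform(size=(n, k))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

S1, ST = [], []
for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]          # A with column i taken from B
    yABi = model(ABi)
    # Saltelli (2010) first-order and Jansen total-effect estimators.
    S1.append(np.mean(yB * (yABi - yA)) / var_y)
    ST.append(0.5 * np.mean((yA - yABi) ** 2) / var_y)

print(np.round(S1, 3), np.round(ST, 3))
```

As in the abstract, the inert input's total effect is effectively zero, so knowing its true value would barely reduce predictive uncertainty.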
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sig Drellack, Lance Prothro
2007-12-01
The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty, including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial extent of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, the sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations.
The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters, including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility, is considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.
Uncertainty in Citizen Science observations: from measurement to user perception
NASA Astrophysics Data System (ADS)
Lahoz, William; Schneider, Philipp; Castell, Nuria
2016-04-01
Citizen Science activities concern the engagement of the general public in scientific research, whereby citizens actively contribute to science either with their intellectual effort or surrounding knowledge, or with their tools and resources. The advent of technologies such as the Internet and smartphones, and the growth in their usage, has significantly increased the potential benefits of Citizen Science activities. Citizen Science observations from low-cost sensors, smartphones and Citizen Observatories provide a novel and recent development in platforms for observing the Earth System, with the opportunity to extend the range of observational platforms available to society to spatio-temporal scales (10-100s m; 1 hr or less) highly relevant to citizen needs. The potential value of Citizen Science is high, with applications in science, education, social aspects, and policy aspects, but this potential, particularly for citizens and policymakers, remains largely untapped. Key areas where Citizen Science data are starting to show demonstrable benefits include GEOSS Societal Benefit Areas such as Health and Weather. Citizen Science observations pose many challenges, including simulation of smaller spatial scales, noisy data, combination with traditional observational methods (satellite and in situ data), and the assessment, representation and visualization of uncertainty. Among these challenges, the assessment and representation of uncertainty and its communication to users is fundamental, as it provides qualitative and/or quantitative information that influences the belief users will have in environmental information. This presentation will discuss the challenges in the assessment and representation of uncertainty in Citizen Science observations, its communication to users, including the use of visualization, and the perception of this uncertainty information by users of Citizen Science observations.
NASA Astrophysics Data System (ADS)
Smith, B. D.; White, J.; Kress, W. H.; Clark, B. R.; Barlow, J.
2016-12-01
Hydrogeophysical surveys have become an integral part of understanding hydrogeological frameworks used in groundwater models. Regional models cover a large area where water well data are, at best, scattered and irregular. Since budgets are finite, priorities must be assigned to select optimal areas for geophysical surveys. For airborne electromagnetic (AEM) geophysical surveys, optimization of mapping depth and line spacing needs to take into account the objectives of the groundwater models. The approach discussed here uses a first-order, second-moment (FOSM) uncertainty analysis, which assumes an approximately linear relation between model parameters and observations. This assumption allows FOSM analyses to be applied to estimate the value of increased parameter knowledge in reducing forecast uncertainty. FOSM is used to facilitate optimization of yet-to-be-completed geophysical surveying to reduce model forecast uncertainty. The main objective of geophysical surveying is assumed to be the estimation of values and spatial variation in hydrologic parameters (i.e., hydraulic conductivity), as well as the mapping of lower-permeability layers that influence the spatial distribution of recharge flux. The proposed data worth analysis was applied to the Mississippi Embayment Regional Aquifer Study (MERAS), which is being updated. The objective of MERAS is to assess the groundwater availability (status and trends) of the Mississippi embayment aquifer system. The study area covers portions of eight states including Alabama, Arkansas, Illinois, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. The active model grid covers approximately 70,000 square miles, and incorporates some 6,000 miles of major rivers and over 100,000 water wells. In the FOSM analysis, a dense network of pilot points was used to capture uncertainty in hydraulic conductivity and recharge.
To simulate the effect of AEM flight lines, the prior uncertainty for hydraulic conductivity and recharge pilot points along potential flight lines was reduced. The FOSM forecast uncertainty estimates were then recalculated and compared to the base forecast uncertainty estimates. The resulting reduction in forecast uncertainty is a measure of the effect of the AEM survey on the model. Iterating through this process results in optimization of flight line locations.
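The data-worth calculation can be sketched in linear-Gaussian FOSM form. All dimensions, sensitivities, and prior standard deviations below are invented placeholders standing in for the MERAS pilot-point setup; the mechanics (tighten the prior on surveyed parameters, recompute forecast variance, compare) follow the procedure described above.

```python
import numpy as np

# Forecast variance under a linearized model: sigma2 = y @ P @ y, with
# posterior parameter covariance P = inv(J.T @ Rinv @ J + inv(C0)).
rng = np.random.default_rng(1)
n_par, n_obs = 20, 12
J = rng.normal(size=(n_obs, n_par))   # observation sensitivities (Jacobian)
y = rng.normal(size=n_par)            # forecast sensitivity vector
Rinv = np.eye(n_obs) / 0.1**2         # observation-noise precision

def forecast_var(prior_sd):
    C0_inv = np.diag(1.0 / prior_sd**2)
    P = np.linalg.inv(J.T @ Rinv @ J + C0_inv)   # FOSM posterior covariance
    return float(y @ P @ y)

base_sd = np.full(n_par, 1.0)
surveyed_sd = base_sd.copy()
surveyed_sd[:8] = 0.2                 # pilot points along notional AEM flight lines

base = forecast_var(base_sd)
reduced = forecast_var(surveyed_sd)
print(f"forecast variance: base = {base:.4f}, with AEM lines = {reduced:.4f}")
```

Because tightening any prior can only add information in this linear setting, the recomputed forecast variance is never larger than the base value; the size of the drop is the data worth of that flight-line layout.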
NASA Astrophysics Data System (ADS)
Vallam, P.; Qin, X. S.
2017-07-01
Flooding risk is increasing in many parts of the world and may worsen under climate change conditions. The accuracy of predicting flooding risk relies on reasonable projection of meteorological data (especially rainfall) at the local scale. Current statistical downscaling approaches face the difficulty of projecting multi-site climate information for future conditions while conserving spatial information. This study presents an approach combining the Long Ashton Research Station Weather Generator (LARS-WG) stochastic weather generator with the multi-site rainfall simulator RainSim (CLWRS) to investigate flow regimes under future conditions in the Kootenay Watershed, Canada. To understand the uncertainty effect stemming from different scenarios, the climate output is fed into a hydrologic model. The results showed different variation trends of annual peak flows (in 2080-2099) based on different climate change scenarios and demonstrated that the hydrological impact would be driven by the interaction between snowmelt and peak flows. The proposed CLWRS approach is useful where there is a need for projection of potential climate change scenarios.
NASA Astrophysics Data System (ADS)
Theunissen, Raf; Kadosh, Jesse S.; Allen, Christian B.
2015-06-01
Spatially varying signals are typically sampled by collecting uniformly spaced samples irrespective of the signal content. For signals with inhomogeneous information content, this leads to unnecessarily dense sampling in regions of low interest, insufficient sample density at important features, or both. A new adaptive sampling technique is presented that directs sample collection in proportion to local information content, adequately capturing short-period features while sparsely sampling less dynamic regions. The proposed method incorporates a data-adapted sampling strategy based on signal curvature, sample space-filling, variable experimental uncertainty and iterative improvement. Numerical assessment has indicated a reduction in the number of samples required to achieve a predefined overall uncertainty level while improving local accuracy for important features. The potential of the proposed method has been further demonstrated with Laser Doppler Anemometry experiments examining the wake behind a NACA0012 airfoil and the boundary layer characterisation of a flat plate.
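The core idea of curvature-proportional sample placement can be illustrated in one dimension. This is a toy version under invented assumptions, not the authors' algorithm: samples are drawn from a density proportional to local curvature plus a uniform floor (the floor standing in for the space-filling term).

```python
import numpy as np

# Test signal: a sharp Gaussian feature on a slowly varying background.
x = np.linspace(0.0, 1.0, 2001)
f = np.exp(-((x - 0.3) / 0.02) ** 2) + 0.2 * np.sin(2 * np.pi * x)

# Sampling density ~ |f''| plus a floor so quiet regions are still covered.
curv = np.abs(np.gradient(np.gradient(f, x), x))
density = curv / curv.max() + 0.05
cdf = np.cumsum(density)
cdf /= cdf[-1]

n_samples = 200
u = (np.arange(n_samples) + 0.5) / n_samples   # stratified uniform draws
samples = np.interp(u, cdf, x)                 # inverse-CDF placement

near_peak = np.mean(np.abs(samples - 0.3) < 0.05)
print(f"fraction of samples within 0.05 of the sharp feature: {near_peak:.2f}")
```

Uniform sampling would put only 10% of samples in that window; the curvature weighting concentrates a much larger share there while the floor keeps the smooth background sparsely covered.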
NASA Astrophysics Data System (ADS)
Lee, Tong
2017-04-01
Understanding the accuracies of satellite-derived sea surface salinity (SSS) measurements in depicting temporal changes, and the dependence of the accuracies on spatiotemporal scales, is important to capability assessment, future mission design, and applications to study oceanic phenomena of different spatiotemporal scales. This study quantifies the consistency of Aquarius Version-4 monthly gridded SSS (released in late 2015) with two widely used Argo monthly gridded near-surface salinity products. The analysis focused on their consistency in depicting temporal changes (including seasonal and non-seasonal) on various spatial scales: 1°x1°, 3°x3°, and 10°x10°. Globally averaged standard deviation (STD) values for Aquarius-Argo salinity differences on these three spatial scales are 0.16, 0.14, and 0.09 psu, compared with those between the two Argo products of 0.10, 0.09, and 0.04 psu. Aquarius SSS compare better with Argo data on non-seasonal (e.g., interannual and intraseasonal) than on seasonal time scales. The seasonal Aquarius-Argo SSS differences are mostly concentrated at high latitudes. The Aquarius team is making active efforts to further reduce these high-latitude seasonal biases. The consistency between Aquarius and Argo salinity is similar to that between the two Argo products in the tropics and subtropics for non-seasonal signals, and in the tropics for seasonal signals. Therefore, the representativeness errors of the Argo products for various spatial scales (related to sampling and gridding) need to be taken into account when estimating the uncertainty of Aquarius SSS. The globally averaged uncertainty of large-scale (10°x10°) non-seasonal Aquarius SSS is approximately 0.04 psu. These estimates reflect the significant improvements of Aquarius Version-4 SSS over previous versions. The estimates can be used as baseline requirements for future ocean salinity missions from space.
The spatial distribution of the uncertainty estimates is also useful for assimilation of Aquarius SSS.
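The scale-dependent comparison works by block-averaging both gridded fields to coarser cells before differencing, so uncorrelated small-scale errors average out and the STD of the difference shrinks with cell size. The sketch below uses synthetic stand-ins for the Aquarius and Argo grids with invented noise levels; only the mechanics mirror the analysis.

```python
import numpy as np

rng = np.random.default_rng(2)
truth = rng.normal(35.0, 0.3, size=(180, 360))        # "true" 1-degree SSS field
sat = truth + rng.normal(0.0, 0.15, truth.shape)      # satellite-like product
argo = truth + rng.normal(0.0, 0.05, truth.shape)     # in situ-like product

def block_mean(field, k):
    # Average the field over non-overlapping k x k blocks.
    h, w = field.shape
    return field[: h - h % k, : w - w % k].reshape(h // k, k, -1, k).mean(axis=(1, 3))

stds = []
for k in (1, 3, 10):
    diff = block_mean(sat, k) - block_mean(argo, k)
    stds.append(diff.std())
    print(f"{k} x {k} deg: STD of difference = {stds[-1]:.3f} psu")
```

For independent cell noise, averaging k x k cells reduces the difference STD roughly by a factor of k, which is the same qualitative decline (0.16 to 0.09 psu) the abstract reports across the 1° to 10° scales.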
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Heng; Chen, Xingyuan; Ye, Ming
Sensitivity analysis is an important tool for quantifying uncertainty in the outputs of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a hierarchical sensitivity analysis method that (1) constructs an uncertainty hierarchy by analyzing the input uncertainty sources, and (2) accounts for the spatial correlation among parameters at each level of the hierarchy using geostatistical tools. The contribution of the uncertainty source at each hierarchy level is measured by sensitivity indices calculated using the variance decomposition method. Using this methodology, we identified the most important uncertainty sources for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally, driven by the dynamic interaction between groundwater and river water at the site. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially distributed parameters.
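The variance decomposition underlying such a hierarchy can be illustrated with the law of total variance on a toy two-level problem (all numbers invented, not the Hanford model): discrete boundary-condition scenarios at the top level, correlated permeability realizations below.

```python
import numpy as np

# Var(Y) = Var_B(E[Y|B]) + E_B(Var[Y|B])
# between-level term  ->  boundary-condition contribution
# within-level term   ->  permeability contribution
rng = np.random.default_rng(3)
bc_scenarios = np.array([0.0, 1.0, 2.5])     # top-level uncertainty source
n_perm = 4000                                # permeability realizations per scenario

outputs = []
for bc in bc_scenarios:
    perm = rng.lognormal(mean=0.0, sigma=0.4, size=n_perm)
    outputs.append(bc + 0.3 * np.log(perm))  # toy head response
outputs = np.array(outputs)

var_between = outputs.mean(axis=1).var()     # contribution of boundary conditions
var_within = outputs.var(axis=1).mean()      # contribution of permeability
total = var_between + var_within
print(f"boundary conditions: {var_between / total:.0%} of variance")
print(f"permeability field:  {var_within / total:.0%} of variance")
```

The sensitivity index for each hierarchy level is its variance share of the total; in the real method the within-level variance is itself estimated with geostatistically correlated fields rather than independent draws.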
NASA Astrophysics Data System (ADS)
Ganguly, S.; Basu, S.; Mukhopadhyay, S.; Michaelis, A.; Milesi, C.; Votava, P.; Nemani, R. R.
2013-12-01
An unresolved issue with coarse-to-medium resolution satellite-based forest carbon mapping over regional to continental scales is the high level of uncertainty in above ground biomass (AGB) estimates caused by the absence of forest cover information at a high enough spatial resolution (the current spatial resolution is limited to 30 m). To put confidence in existing satellite-derived AGB density estimates, it is imperative to create continuous fields of tree cover at a sufficiently high resolution (e.g. 1 m) such that large uncertainties in forested area are reduced. The proposed work will provide means to reduce uncertainty in present satellite-derived AGB maps and Forest Inventory and Analysis (FIA) based regional estimates. Our primary objective will be to create Very High Resolution (VHR) estimates of tree cover at a spatial resolution of 1 m for the Continental United States using all available National Agriculture Imaging Program (NAIP) color-infrared imagery from 2010 to 2012. We will leverage the existing capabilities of the NASA Earth Exchange (NEX) high performance computing and storage facilities. The proposed 1-m tree cover map can be further aggregated to provide percent tree cover on any medium-to-coarse resolution spatial grid, which will aid in reducing uncertainties in AGB density estimation at the respective grid and overcome current limitations imposed by medium-to-coarse resolution land cover maps. We have implemented a scalable and computationally efficient parallelized framework for tree-cover delineation; the core components of the algorithm include a feature extraction process, a Statistical Region Merging image segmentation algorithm, and a classification algorithm based on a Deep Belief Network and a feedforward backpropagation neural network. An initial pilot exercise has been performed over the state of California (~11,000 scenes) to create a wall-to-wall 1-m tree cover map, and the classification accuracy has been assessed.
Results show an improvement in accuracy of tree-cover delineation as compared to existing forest cover maps from NLCD, especially over fragmented, heterogeneous and urban landscapes. Estimates of VHR tree cover will complement and enhance the accuracy of present remote-sensing based AGB modeling approaches and forest inventory based estimates at both national and local scales. A requisite step will be to characterize the inherent uncertainties in tree cover estimates and propagate them to estimate AGB.
NASA Technical Reports Server (NTRS)
Carrasco, M.; Penpeci-Talgar, C.; Eckstein, M.
2000-01-01
This study is the first to report the benefits of spatial covert attention on contrast sensitivity in a wide range of spatial frequencies when a target alone was presented in the absence of a local post-mask. We used a peripheral precue (a small circle indicating the target location) to explore the effects of covert spatial attention on contrast sensitivity as assessed by orientation discrimination (Experiments 1-4), detection (Experiments 2 and 3) and localization (Experiment 3) tasks. In all four experiments the target (a Gabor patch ranging in spatial frequency from 0.5 to 10 cpd) was presented alone in one of eight possible locations equidistant from fixation. Contrast sensitivity was consistently higher for peripherally- than for neutrally-cued trials, even though we eliminated variables (distracters, global masks, local masks, and location uncertainty) that are known to contribute to an external noise reduction explanation of attention. When observers were presented with vertical and horizontal Gabor patches an external noise reduction signal detection model accounted for the cueing benefit in a discrimination task (Experiment 1). However, such a model could not account for this benefit when location uncertainty was reduced, either by: (a) Increasing overall performance level (Experiment 2); (b) increasing stimulus contrast to enable fine discriminations of slightly tilted suprathreshold stimuli (Experiment 3); and (c) presenting a local post-mask (Experiment 4). Given that attentional benefits occurred under conditions that exclude all variables predicted by the external noise reduction model, these results support the signal enhancement model of attention.
Assessing population exposure for landslide risk analysis using dasymetric cartography
NASA Astrophysics Data System (ADS)
Garcia, Ricardo A. C.; Oliveira, Sérgio C.; Zêzere, José L.
2016-12-01
Assessing the number and locations of exposed people is a crucial step in landslide risk management and emergency planning. The available population statistical data frequently have insufficient detail for an accurate assessment of the people potentially exposed to hazardous events, mainly when these occur at the local scale, as landslides do. The present study aims to apply dasymetric cartography to improve the spatial resolution of population data and to assess the potentially exposed population. An additional objective is to compare the results with those obtained with a more common approach that uses, as spatial units, basic census units, which are the best spatial data disaggregation and the most detailed information available for regional studies in Portugal. Considering the Portuguese census data and a layer of residential building footprints, which was used as ancillary information, the number of exposed inhabitants differs significantly according to the approach used. When the census unit approach is used, considering the three highest landslide susceptibility classes, the number of exposed inhabitants is in general overestimated. Despite the uncertainties associated with a general cost-benefit analysis, the presented methodology seems to be a reliable approach for obtaining a first approximation of a more detailed estimate of exposed people. The approach based on dasymetric cartography allows the spatial resolution of population over large areas to be increased and enables the use of detailed landslide susceptibility maps, which are valuable for improving the assessment of the exposed population.
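The dasymetric step itself is a proportional reallocation: population reported per census unit is spread over the residential-building cells inside that unit in proportion to footprint area, and only the population landing in susceptible cells counts as exposed. A minimal sketch with invented unit names and areas:

```python
# Hypothetical census counts and building cells; not the Portuguese data.
census_pop = {"unit_A": 900, "unit_B": 400}
# Each cell: (census unit, building footprint area in m2, landslide-susceptible?)
cells = [
    ("unit_A", 250.0, True),
    ("unit_A", 750.0, False),
    ("unit_B", 500.0, True),
    ("unit_B", 500.0, True),
]

# Total footprint area per census unit (the dasymetric denominator).
area_per_unit = {}
for unit, area, _ in cells:
    area_per_unit[unit] = area_per_unit.get(unit, 0.0) + area

# Reallocate population by area share, then sum over susceptible cells.
exposed = 0.0
for unit, area, susceptible in cells:
    pop = census_pop[unit] * area / area_per_unit[unit]
    if susceptible:
        exposed += pop
print(f"exposed population: {exposed:.0f}")  # → exposed population: 625
```

The census-unit approach would instead count all 1300 inhabitants of both units as exposed (since each unit intersects a susceptible cell), illustrating the overestimation the study reports.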
Moving across scales: Challenges and opportunities in upscaling carbon fluxes
NASA Astrophysics Data System (ADS)
Naithani, K. J.
2016-12-01
Light use efficiency (LUE) type models are commonly used to upscale terrestrial C fluxes and estimate regional and global C budgets. Model parameters are often estimated for each land cover type (LCT) using flux observations from one or more eddy covariance towers, and then spatially extrapolated by integrating land cover, meteorological, and remotely sensed data. Decisions regarding the type of input data (spatial resolution of land cover data, spatial and temporal length of flux data), representation of landscape structure (land use vs. disturbance regime), and the type of modeling framework (common risk vs. hierarchical) all influence the estimated CO2 fluxes and the associated uncertainties, but are rarely considered together. This work presents a synthesis of past and present efforts for upscaling CO2 fluxes and associated uncertainties in the ChEAS (Chequamegon Ecosystem Atmosphere Study) region in northern Wisconsin and the Upper Peninsula of Michigan. This work highlights two key future research needs. First, the characterization of uncertainties due to all of the abovementioned factors reflects only a (hopefully relevant) subset of the overall uncertainties. Second, interactions among these factors are likely critical, but are poorly represented by the tower network at landscape scales. Yet, results indicate significant spatial and temporal heterogeneity of uncertainty in CO2 fluxes, which can inform carbon management efforts and prioritize data needs.
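The per-LCT upscaling step can be sketched with the generic LUE form GPP = epsilon_max x fAPAR x PAR applied cell by cell over a land cover grid. Everything below (the efficiency values, the 2x2 grid, the PAR value) is a hypothetical illustration, not the ChEAS parameterization:

```python
import numpy as np

# Maximum light use efficiency per land cover type (gC per MJ, invented values).
eps_max = {"deciduous": 1.2, "conifer": 0.9}

def gpp(lct, fapar, par):
    # Generic LUE model; a real model would also apply T/VPD down-regulation scalars.
    return eps_max[lct] * fapar * par

# Tiny 2x2 land cover grid with remotely sensed fAPAR and uniform PAR (MJ m-2 d-1).
lc = [["deciduous", "conifer"], ["conifer", "deciduous"]]
fapar = np.array([[0.8, 0.6], [0.5, 0.7]])
par = 10.0

flux = np.array([[gpp(lc[i][j], fapar[i, j], par) for j in range(2)]
                 for i in range(2)])
print(f"mean GPP over the grid: {flux.mean():.2f} gC m-2 d-1")
```

The uncertainty questions the abstract raises enter exactly here: which eps_max is assigned to each cell depends on the land cover map's resolution, and whether eps_max is fitted per tower or borrowed across a hierarchy changes both the flux estimate and its error bars.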
NASA Astrophysics Data System (ADS)
Enzenhoefer, R.; Rodriguez-Pretelin, A.; Nowak, W.
2012-12-01
"From an engineering standpoint, the quantification of uncertainty is extremely important not only because it allows estimating risk but mostly because it allows taking optimal decisions in an uncertain framework" (Renard, 2007). The most common way to account for uncertainty in the field of subsurface hydrology and wellhead protection is to randomize spatial parameters, e.g. the log-hydraulic conductivity or porosity. This enables water managers to take robust decisions in delineating wellhead protection zones with rationally chosen safety margins in the spirit of probabilistic risk management. Probabilistic wellhead protection zones are commonly based on steady-state flow fields. However, several past studies showed that transient flow conditions may substantially influence the shape and extent of catchments. Therefore, we believe they should be accounted for in the probabilistic assessment and in the delineation process. The aim of our work is to show the significance of flow transients and to investigate the interplay between spatial uncertainty and flow transients in wellhead protection zone delineation. To this end, we advance our concept of probabilistic capture zone delineation (Enzenhoefer et al., 2012) that works with capture probabilities and other probabilistic criteria for delineation. The extended framework is able to evaluate the time fraction during which any point on a map falls within a capture zone. In short, we separate capture probabilities into spatial/statistical and time-related frequencies. This provides water managers with additional information on how to manage a well catchment in the light of possible hazard conditions close to the capture boundary under uncertain and time-variable flow conditions. In order to save computational costs, we take advantage of superposed flow components with time-variable coefficients.
We assume an instantaneous development of steady-state flow conditions after each temporal change in driving forces, following an idea by Festger and Walter (2002). These quasi steady-state flow fields are cast into a geostatistical Monte Carlo framework to admit and evaluate the influence of parameter uncertainty on the delineation process. Furthermore, this framework enables conditioning on observed data with any conditioning scheme, such as rejection sampling, Ensemble Kalman Filters, etc. To further reduce the computational load, we use the reverse formulation of advective-dispersive transport. We simulate the reverse transport by particle-tracking random walk in order to avoid numerical dispersion when accounting for well arrival times.
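The separation of capture probabilities into statistical and time-related frequencies amounts to simple bookkeeping over a (realization, flow period, cell) indicator array. The sketch below fills that array with synthetic indicators in place of reverse particle-tracking output, and the 0.5 threshold is an arbitrary illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(4)
n_real, n_periods, n_cells = 200, 12, 50

# True per-cell capture chance decays with distance from the well (cell 0).
p_true = np.linspace(0.9, 0.1, n_cells)
# Boolean "in capture zone" per (realization, period, cell).
in_zone = rng.random((n_real, n_periods, n_cells)) < p_true

# Statistical frequency: capture probability per (period, cell),
# averaging over the geostatistical realizations.
p_stat = in_zone.mean(axis=0)
# Time-related frequency: fraction of flow periods in which a cell's
# capture probability exceeds a delineation threshold.
time_frac = (p_stat >= 0.5).mean(axis=0)
print(f"cells captured in every period: {(time_frac == 1.0).sum()}")
```

Cells near the well are captured in every quasi steady-state period regardless of the parameter realization, while cells near the fluctuating capture boundary show intermediate time fractions; those are exactly the locations where the extra time dimension informs management.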
NASA Astrophysics Data System (ADS)
Bauer, J. R.; Rose, K.; Romeo, L.; Barkhurst, A.; Nelson, J.; Duran-Sesin, R.; Vielma, J.
2016-12-01
Efforts to prepare for and reduce the risk of hazards, from both natural and anthropogenic sources, that threaten our oceans and coasts require an understanding of the dynamics and interactions between the physical, ecological, and socio-economic systems. Understanding these coupled dynamics is essential as offshore oil & gas exploration and production continues to push into harsher, more extreme environments where risks and uncertainty increase. However, working with large, complex data from various sources and scales to assess risks and potential impacts associated with offshore energy exploration and production poses several challenges to research. In order to address these challenges, an integrated assessment model (IAM) was developed at the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) that combines spatial data infrastructure and an online research platform to manage, process, analyze, and share large, multidimensional datasets, research products, and the tools and models used to evaluate risk and reduce uncertainty for the entire offshore system, from the subsurface, through the water column, to coastal ecosystems and communities. Here, we will discuss the spatial data infrastructure and online research platform, NETL's Energy Data eXchange (EDX), that underpin the offshore IAM, providing information on how the framework combines multidimensional spatial data and spatio-temporal tools to evaluate risks to the complex matrix of potential environmental, social, and economic impacts stemming from modeled offshore hazard scenarios, such as oil spills or hurricanes. In addition, we will discuss the online analytics, tools, and visualization methods integrated into this framework that support availability and access to data, as well as allow for the rapid analysis and effective communication of analytical results to aid a range of decision-making needs.
NASA Astrophysics Data System (ADS)
Yeo, I. Y.
2016-12-01
Wetlands are valuable landscape features that provide important ecosystem functions and services. The ecosystem processes in wetlands are highly dependent on hydrology. However, hydroperiod (i.e., the dynamics of change in inundation extent) is highly variable spatially and temporally, and extremely difficult to predict owing to the complexity of hydrological processes within wetlands and their interaction with surrounding areas. This study reports the challenges and progress in assessing the catchment-scale benefits of wetlands for regulating the hydrological regime and improving water quality in an agricultural watershed. A process-based watershed model, the Soil and Water Assessment Tool (SWAT), was improved to simulate the cumulative downstream impacts of wetlands. Newly developed remote sensing products from LiDAR intensity and time-series Landsat records, which show the inter-annual changes in fractional inundation, were utilized to describe the changing status of inundated areas within forested wetlands, develop spatially varying wetland parameters, and evaluate the predicted inundated areas at the landscape level. We outline the challenges of developing time-series inundation mapping products at a high spatial and temporal resolution and of reconciling the catchment-scale model with moderate-resolution remote sensing products. We then highlight the importance of integrating spatialized information into model calibration and evaluation to address the issues of equifinality and prediction uncertainty. This integrated approach was applied to the upper region of the Choptank River Watershed, an agricultural watershed in the Coastal Plain of the Chesapeake Bay Watershed (US). In the Mid-Atlantic US, the provision of pollution regulation services by wetlands has been emphasized due to declining water quality within the Chesapeake Bay and its watersheds, and the preservation and restoration of wetlands has become the top priority for managing nonpoint source water pollution.
NASA Astrophysics Data System (ADS)
Hughes, Anna; Gyllencreutz, Richard; Mangerud, Jan; Svendsen, John Inge
2017-04-01
Glacial geologists generate empirical reconstructions of former ice-sheet dynamics by combining evidence from the preserved record of glacial landforms (e.g. end moraines, lineations) and sediments with chronological evidence (mainly numerical dates derived from radiocarbon, exposure and luminescence techniques). However, the geomorphological and sedimentological footprints and the chronological data are incomplete records in both space and time, and all have multiple types of uncertainty associated with them. To understand ice sheets' response to climate we need numerical models of ice-sheet dynamics based on physical principles. To test and/or constrain such models, empirical reconstructions of past ice sheets that capture and acknowledge all uncertainties are required. In 2005 we started a project (Database of the Eurasian Deglaciation, DATED) to produce an empirical reconstruction of the evolution of the last Eurasian ice sheets (including the British-Irish, Scandinavian and Svalbard-Barents-Kara Seas ice sheets) that is fully documented, specified in time, and includes uncertainty estimates. Over 5000 dates relevant to constraining ice build-up and retreat were assessed for reliability and used together with published ice-sheet margin positions based on glacial geomorphology to reconstruct time-slice maps of the ice sheets' extent. The DATED maps show synchronous ice margins with maximum-minimum uncertainty bounds for every 1000 years between 25-10 kyr ago. In the first version of results (DATED-1; Hughes et al. 2016) all uncertainties (both quantitative and qualitative, e.g. precision and accuracy of numerical dates, correlation of moraines, stratigraphic interpretations) were combined based on our best glaciological-geological assessment and expressed in terms of distance as a 'fuzzy' margin. Large uncertainties (>100 km) exist, predominantly across marine sectors and other locations where there are spatial gaps in the dating record (e.g.
the timing of coalescence and separation of the Scandinavian and Svalbard-Barents-Kara ice sheets), but also in well-studied areas due to conflicting yet apparently equally robust data. In the four years since the DATED-1 census (1 January 2013), the volume of new information (from both dates and mapped glacial geomorphology) has grown significantly (~1000 new dates). Here, we present work towards the updated version of results, DATED-2, which attempts to further reduce and explicitly report all uncertainties inherent in ice-sheet reconstructions. Hughes, A. L. C., Gyllencreutz, R., Lohne, Ø. S., Mangerud, J., Svendsen, J. I. 2016: The last Eurasian ice sheets - a chronological database and time-slice reconstruction, DATED-1. Boreas 45, 1-45. doi:10.1111/bor.12142
Testing the robustness of management decisions to uncertainty: Everglades restoration scenarios.
Fuller, Michael M; Gross, Louis J; Duke-Sylvester, Scott M; Palmer, Mark
2008-04-01
To effectively manage large natural reserves, resource managers must prepare for future contingencies while balancing the often conflicting priorities of different stakeholders. To deal with these issues, managers routinely employ models to project the response of ecosystems to different scenarios that represent alternative management plans or environmental forecasts. Scenario analysis is often used to rank such alternatives to aid the decision making process. However, model projections are subject to uncertainty in assumptions about model structure, parameter values, environmental inputs, and subcomponent interactions. We introduce an approach for testing the robustness of model-based management decisions to the uncertainty inherent in complex ecological models and their inputs. We use relative assessment to quantify the relative impacts of uncertainty on scenario ranking. To illustrate our approach we consider uncertainty in parameter values and uncertainty in input data, with specific examples drawn from the Florida Everglades restoration project. Our examples focus on two alternative 30-year hydrologic management plans that were ranked according to their overall impacts on wildlife habitat potential. We tested the assumption that varying the parameter settings and inputs of habitat index models does not change the rank order of the hydrologic plans. We compared the average projected index of habitat potential for four endemic species and two wading-bird guilds to rank the plans, accounting for variations in parameter settings and water level inputs associated with hypothetical future climates. Indices of habitat potential were based on projections from spatially explicit models that are closely tied to hydrology. For the American alligator, the rank order of the hydrologic plans was unaffected by substantial variation in model parameters. 
By contrast, simulated major shifts in water levels led to reversals in the ranks of the hydrologic plans in 24.1-30.6% of the projections for the wading bird guilds and several individual species. By exposing the differential effects of uncertainty, relative assessment can help resource managers assess the robustness of scenario choice in model-based policy decisions.
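The relative assessment described above, checking whether scenario ranks survive parameter perturbation, can be sketched in miniature. The habitat-index functions below are invented stand-ins for illustration, not the authors' spatially explicit models:

```python
import random

def rank_reversal_rate(index_plan_a, index_plan_b, n_draws=1000, seed=0):
    """Estimate how often parameter uncertainty reverses a scenario ranking.

    The two callables map a sampled parameter value to a habitat-index
    score for each hydrologic plan (both are hypothetical stand-ins).
    """
    rng = random.Random(seed)
    baseline = index_plan_a(1.0) > index_plan_b(1.0)
    reversals = 0
    for _ in range(n_draws):
        p = rng.gauss(1.0, 0.2)  # perturb the nominal parameter by ~20%
        if (index_plan_a(p) > index_plan_b(p)) != baseline:
            reversals += 1
    return reversals / n_draws

# Toy indices: plan A ranks higher at the nominal parameter value, but
# the ranking flips when the parameter drifts low.
rate = rank_reversal_rate(lambda p: 0.6 * p, lambda p: 0.5 + 0.02 * p)
```

A reversal rate near zero indicates a ranking that is robust to the perturbation, mirroring the alligator result; a substantial rate mirrors the wading-bird case.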
A Scheme for the Integrated Assessment of Mitigation Options
NASA Astrophysics Data System (ADS)
Held, H.; Edenhofer, O.
2003-04-01
After some consensus has been achieved that the global mean temperature will have increased by 1.4 to 5.8 °C by the end of this century in case of continued "business as usual" greenhouse gas emissions, society has to decide if, or which, mitigation measures should be taken. A new integrated assessment project on this very issue will be started at PIK in spring 2003. The assessment will cover economic aspects as well as potential side effects of various measures. In the economic module, the effects of investment decisions on technological innovation will be explicitly taken into account. Special emphasis will be placed on the issue of uncertainty. Here we distinguish the uncertainty related to the Integrated Assessment modules, including the economic module, from the fact that no over-complex system can be fully captured by a model. Therefore, a scheme for the assessment of the "residual", the non-modelled part of the system, needs to be worked out. The scheme must be truly interdisciplinary, i.e. applicable to at least the natural-science and the economic aspects. A scheme based on meta-principles like minimum persistence, ubiquity, or irreversibility of potential measures appears to be a promising candidate. An implementation of ubiquity as at present successfully operated in environmental chemistry may serve as a guideline [1]. Here, the best-known mechanism within a complex impact chain of potentially harmful chemicals, their transport, is captured by a reaction-diffusion mechanism [2].
[1] M. Scheringer, Persistence and spatial range as endpoints of an exposure-based assessment of organic chemicals. Environ. Sci. Technol. 30: 1652-1659 (1996).
[2] H. Held, Robustness of spatial ranges of environmental chemicals with respect to model dimension. Accepted for publication in Stoch. Environ. Res. Risk Assessment.
Life cycle-based water assessment of a hand dishwashing product: opportunities and limitations.
Van Hoof, Gert; Buyle, Bea; Kounina, Anna; Humbert, Sebastien
2013-10-01
It is only recently that life cycle-based indicators have been used to evaluate products from a water use impact perspective. The applicability of some of these methods has been demonstrated primarily on agricultural materials or products, because irrigation requirements in food production can be water-intensive. In view of an increasing interest in life cycle-based water indicators for different products, we ran a study on a hand dishwashing product. A number of water assessment methods were applied with the purpose of both identifying product improvement opportunities and understanding the potential for underlying database and methodological improvements. The study covered the entire life cycle of the product and focused on environmental issues related to water use, looking in depth at inventory, midpoint, and endpoint methods. "Traditional" water emission-driven methods, such as freshwater eutrophication, were excluded from the analysis. The use of a single formula with the same global supply chain, manufactured in one location, was evaluated in two countries with different water scarcity conditions. The study shows differences ranging up to 4 orders of magnitude for indicators with similar units associated with different water use types (inventory methods) and different cause-effect chain models (midpoint and endpoint impact categories). No uncertainty information was available on the impact assessment methods, and uncertainty from stochastic variability was not available at the time of the study. For the majority of the indicators studied, the contribution from the consumer use stage is the most important (>90%), driven by both direct water use (the dishwashing process) and indirect water use (electricity generation to heat the water). Creating consumer awareness of how the product is used, particularly in water-scarce areas, is the largest improvement opportunity for a hand dishwashing product.
However, spatial differentiation in the inventory and impact assessment model may lead to very different results for the product used under exactly the same consumer use conditions, making the communication of results a real challenge. From a practitioner's perspective, the data collection step in relation to the goal and scope of the study sets high requirements for both foreground and background data. In particular, databases covering a broad spectrum of inventory data with spatially differentiated water use information are lacking. For some impact methods, it is unknown as to whether or not characterization factors should be spatially differentiated, which creates uncertainty in their interpretation and applicability. Finally, broad application of life cycle-based water assessment will require further development of commercial life cycle assessment software. © 2013 SETAC.
Singh, Akath; Santra, Priyabrata; Kumar, Mahesh; Panwar, Navraten; Meghwal, P R
2016-09-01
Soil organic carbon (SOC) is a major indicator of the long-term sustenance of an agricultural production system. Apart from sustaining productivity, SOC plays a crucial role in the context of climate change. Keeping in mind these potentials, the spatial variation of SOC content of a fruit orchard comprising several arid fruit plantations in the arid region of India is assessed in this study through geostatistical approaches. For this purpose, surface and subsurface soil samples from 175 locations in a fruit orchard spread over a 14.33 ha area were collected along with geographical coordinates. SOC content and soil physicochemical properties of the collected soil samples were determined, followed by geostatistical analysis for mapping purposes. The average SOC stock density of the orchard was 14.48 Mg ha(-1) for the 0- to 30-cm soil layer, ranging from 9.01 Mg ha(-1) in the Carissa carandas block to 19.52 Mg ha(-1) in the Prosopis cineraria block. The range of spatial variation of SOC content was found to be about 100 m, while two other soil physicochemical properties, pH and electrical conductivity (EC), showed a similar spatial trend. This indicates that the minimum sampling distance for future SOC mapping programmes should be kept below 100 m for better accuracy. The ordinary kriging technique satisfactorily predicted SOC contents (in percent) at unsampled locations with a root-mean-squared residual (RMSR) of 0.35-0.37. The co-kriging approach was found to be slightly superior (RMSR = 0.26-0.28) to ordinary kriging for spatial prediction of SOC contents because of significant correlations of SOC content with pH and EC. Uncertainty of SOC estimation was also presented in terms of a 90 % confidence interval. Spatial estimates of SOC stock through the ordinary kriging or co-kriging approach also showed lower estimation uncertainty than non-spatial estimates, e.g., the arithmetic averaging approach. 
Among the different fruit block plantations of the orchard, the Prosopis cineraria ('khejri') block had a higher SOC stock density than the others.
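The ordinary kriging step reported above can be sketched as follows. This is a minimal numpy illustration with hypothetical SOC samples; the exponential variogram and its ~100 m range are assumptions chosen to mirror the spatial dependence the study reports, not the fitted model itself:

```python
import numpy as np

def exp_variogram(h, nugget=0.0, sill=1.0, rng_param=100.0):
    """Exponential variogram model; the ~100 m range mirrors the
    spatial-dependence range reported for SOC in this study."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-h / rng_param))

def ordinary_krige(xy, z, xy0, **vgm):
    """Ordinary kriging prediction at a single location xy0."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    # Kriging system: variogram matrix bordered by the unbiasedness row.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d, **vgm)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = exp_variogram(np.linalg.norm(xy - xy0, axis=-1), **vgm)
    w = np.linalg.solve(A, b)
    return float(w[:n] @ z)

# Hypothetical SOC (%) samples on a three-point transect (coords in m).
xy = np.array([[0.0, 0.0], [50.0, 0.0], [100.0, 0.0]])
z = np.array([0.30, 0.40, 0.35])
pred = ordinary_krige(xy, z, np.array([25.0, 0.0]))
```

Co-kriging extends the same system with cross-variograms against the correlated covariates (here pH and EC), which is where the study's RMSR improvement comes from.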
Numerical uncertainty in computational engineering and physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hemez, Francois M
2009-01-01
Obtaining a solution that approximates ordinary or partial differential equations on a computational mesh or grid does not necessarily mean that the solution is accurate or even 'correct'. Unfortunately, assessing the quality of discrete solutions by questioning the role played by spatial and temporal discretizations generally comes as a distant third to test-analysis comparison and model calibration. This publication is intended to raise awareness of the fact that discrete solutions introduce numerical uncertainty. This uncertainty may, in some cases, overwhelm in complexity and magnitude other sources of uncertainty that include experimental variability, parametric uncertainty and modeling assumptions. The concepts of consistency, convergence and truncation error are overviewed to explain the articulation between the exact solution of continuous equations, the solution of modified equations and discrete solutions computed by a code. The current state-of-the-practice of code and solution verification activities is discussed. An example in the discipline of hydro-dynamics illustrates the significant effect that meshing can have on the quality of code predictions. A simple method is proposed to derive bounds of solution uncertainty in cases where the exact solution of the continuous equations, or its modified equations, is unknown. It is argued that numerical uncertainty originating from mesh discretization should always be quantified and accounted for in the overall uncertainty 'budget' that supports decision-making for applications in computational physics and engineering.
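One standard way to estimate discretization error when the exact solution is unknown is Richardson extrapolation over a sequence of refined meshes; this is a generic verification sketch, not necessarily the specific method proposed in this publication:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Observed order of convergence from solutions on three
    systematically refined meshes (constant refinement ratio r)."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

def richardson_error(f_medium, f_fine, p, r=2.0):
    """Estimated discretization error remaining in the fine-mesh value;
    f_fine + this error is the Richardson-extrapolated solution."""
    return (f_fine - f_medium) / (r ** p - 1.0)

# Hypothetical second-order scheme: f(h) = 1 + 0.4 * h**2, exact value 1.
f_c, f_m, f_f = (1.0 + 0.4 * h ** 2 for h in (0.4, 0.2, 0.1))
p = observed_order(f_c, f_m, f_f)   # recovers the scheme's order, p ≈ 2
err = richardson_error(f_m, f_f, p)
```

When the observed order disagrees with the scheme's formal order, the solutions are not yet in the asymptotic range and the error estimate should be treated as unreliable, which is exactly the kind of check the abstract argues should be routine.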
A Cascade Approach to Uncertainty Estimation for the Hydrological Simulation of Droughts
NASA Astrophysics Data System (ADS)
Smith, Katie; Tanguy, Maliko; Parry, Simon; Prudhomme, Christel
2016-04-01
Uncertainty poses a significant challenge in environmental research, and the characterisation and quantification of uncertainty have become a research priority over the past decade. Studies of extreme events are particularly affected by issues of uncertainty. This study focuses on the sources of uncertainty in the modelling of streamflow droughts in the United Kingdom. Droughts are a poorly understood natural hazard with no universally accepted definition. Meteorological, hydrological and agricultural droughts have different meanings and vary both spatially and temporally, yet each is inextricably linked to the others. The work presented here is part of two extensive interdisciplinary projects investigating drought reconstruction and drought forecasting capabilities in the UK. Lumped catchment models are applied to simulate streamflow drought, and uncertainties from five different sources are investigated: climate input data, potential evapotranspiration (PET) method, hydrological model, within-model structure, and model parameterisation. Latin Hypercube sampling is applied to develop large parameter ensembles for each model structure, which are run using parallel computing on a high performance computer cluster. Parameterisations are assessed using multi-objective evaluation criteria that include both general and drought performance metrics. The effect of different climate input data and PET methods on model output is then considered using the accepted model parameterisations. The uncertainty from each of the sources creates a cascade, and when presented as such the relative importance of each aspect of uncertainty can be determined.
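The Latin Hypercube sampling used to build such parameter ensembles can be sketched with the standard library alone. The parameter bounds below are hypothetical, not the projects' actual hydrological-model parameters:

```python
import random

def latin_hypercube(n_samples, bounds, seed=42):
    """Latin Hypercube sample: each parameter's range is split into
    n_samples equal strata and each stratum is sampled exactly once."""
    rng = random.Random(seed)
    samples = [[0.0] * len(bounds) for _ in range(n_samples)]
    for j, (lo, hi) in enumerate(bounds):
        strata = list(range(n_samples))
        rng.shuffle(strata)  # decorrelate the parameters' stratum order
        for i, s in enumerate(strata):
            u = (s + rng.random()) / n_samples  # uniform draw within stratum s
            samples[i][j] = lo + u * (hi - lo)
    return samples

# Hypothetical parameters: a storage coefficient and a PET scaling factor.
ens = latin_hypercube(100, [(0.01, 0.99), (0.7, 1.3)])
```

Compared with plain Monte Carlo, the stratification guarantees coverage of each parameter's full range even for modest ensemble sizes, which is why it suits large, expensive model ensembles.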
Enhancing the ecological risk assessment process.
Dale, Virginia H; Biddinger, Gregory R; Newman, Michael C; Oris, James T; Suter, Glenn W; Thompson, Timothy; Armitage, Thomas M; Meyer, Judith L; Allen-King, Richelle M; Burton, G Allen; Chapman, Peter M; Conquest, Loveday L; Fernandez, Ivan J; Landis, Wayne G; Master, Lawrence L; Mitsch, William J; Mueller, Thomas C; Rabeni, Charles F; Rodewald, Amanda D; Sanders, James G; van Heerden, Ivor L
2008-07-01
The Ecological Processes and Effects Committee of the US Environmental Protection Agency Science Advisory Board conducted a self-initiated study and convened a public workshop to characterize the state of ecological risk assessment (ERA), with a view toward advancing the science and application of the process. That survey and analysis of ERA in decision making shows that such assessments have been most effective when clear management goals were included in the problem formulation; translated into information needs; and developed in collaboration with decision makers, assessors, scientists, and stakeholders. This process is best facilitated when risk managers, risk assessors, and stakeholders are engaged in an ongoing dialogue about problem formulation. Identification and acknowledgment of uncertainties that have the potential to profoundly affect the results and outcome of risk assessments also improves assessment effectiveness. Thus we suggest that 1) thorough peer review of ERAs be conducted at the problem formulation stage and 2) the predictive power of risk-based decision making be expanded to reduce uncertainties through analytical and methodological approaches like life cycle analysis. Risk assessment and monitoring programs need better integration to reduce uncertainty and to evaluate risk management decision outcomes. Postdecision audit programs should be initiated to evaluate the environmental outcomes of risk-based decisions. In addition, a process should be developed to demonstrate how monitoring data can be used to reduce uncertainties. Ecological risk assessments should include the effects of chemical and nonchemical stressors at multiple levels of biological organization and spatial scale, and the extent and resolution of the pertinent scales and levels of organization should be explicitly considered during problem formulation. 
An approach to interpreting lines of evidence and weight of evidence is critically needed for complex assessments, and it would be useful to develop case studies and/or standards of practice for interpreting lines of evidence. In addition, tools for cumulative risk assessment should be developed because contaminants are often released into stressed environments.
Quantifying uncertainty in forest nutrient budgets
Ruth D. Yanai; Carrie R. Levine; Mark B. Green; John L. Campbell
2012-01-01
Nutrient budgets for forested ecosystems have rarely included error analysis, in spite of the importance of uncertainty to interpretation and extrapolation of the results. Uncertainty derives from natural spatial and temporal variation and also from knowledge uncertainty in measurement and models. For example, when estimating forest biomass, researchers commonly report...
MODFLOW 2000 Head Uncertainty, a First-Order Second Moment Method
Glasgow, H.S.; Fortney, M.D.; Lee, J.; Graettinger, A.J.; Reeves, H.W.
2003-01-01
A computationally efficient method to estimate the variance and covariance in piezometric head results computed through MODFLOW 2000 using a first-order second moment (FOSM) approach is presented. This methodology employs a first-order Taylor series expansion to combine model sensitivity with uncertainty in geologic data. MODFLOW 2000 is used to calculate both the ground water head and the sensitivity of head to changes in input data. From a limited number of samples, geologic data are extrapolated and their associated uncertainties are computed through a conditional probability calculation. Combining the spatially related sensitivity and input uncertainty produces the variance-covariance matrix, the diagonal of which is used to yield the standard deviation in MODFLOW 2000 head. The variance in piezometric head can be used for calibrating the model, estimating confidence intervals, directing exploration, and evaluating the reliability of a design. A case study illustrates the approach, where aquifer transmissivity is the spatially related uncertain geologic input data. The FOSM methodology is shown to be applicable for calculating output uncertainty for (1) spatially related input and output data, and (2) multiple input parameters (transmissivity and recharge).
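The core FOSM step, combining model sensitivities with input uncertainty through a first-order Taylor expansion, reduces to the matrix product Cov(h) ≈ J Σ Jᵀ. A minimal sketch with hypothetical sensitivities and input covariance (the real Jacobian would come from MODFLOW 2000's sensitivity process):

```python
import numpy as np

def fosm_head_variance(jacobian, input_cov):
    """First-order second moment propagation: Cov(h) ≈ J Σ Jᵀ, where J
    holds the sensitivities of head to each uncertain geologic input."""
    J = np.asarray(jacobian, dtype=float)
    S = np.asarray(input_cov, dtype=float)
    head_cov = J @ S @ J.T
    # The diagonal gives the head variance; its square root is the
    # standard deviation used for confidence intervals.
    return head_cov, np.sqrt(np.diag(head_cov))

# Hypothetical sensitivities of head at two nodes to log-transmissivity
# at two sample points, with spatially correlated input uncertainty.
J = [[0.8, 0.3], [0.2, 0.9]]
S = [[0.04, 0.01], [0.01, 0.04]]
cov, sd = fosm_head_variance(J, S)
```

The appeal of FOSM over Monte Carlo is cost: one sensitivity run per uncertain input instead of thousands of model realizations, at the price of the first-order (small-perturbation) assumption.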
NASA Astrophysics Data System (ADS)
Camporese, M.; Cassiani, G.; Deiana, R.; Salandin, P.
2011-12-01
In recent years geophysical methods have become increasingly popular for hydrological applications. Time-lapse electrical resistivity tomography (ERT) represents a potentially powerful tool for subsurface solute transport characterization since a full picture of the spatiotemporal evolution of the process can be obtained. However, the quantitative interpretation of tracer tests is difficult because of the uncertainty related to the geoelectrical inversion, the constitutive models linking geophysical and hydrological quantities, and the a priori unknown heterogeneous properties of natural formations. Here an approach based on the Lagrangian formulation of transport and the ensemble Kalman filter (EnKF) data assimilation technique is applied to assess the spatial distribution of hydraulic conductivity K by incorporating time-lapse cross-hole ERT data. Electrical data consist of three-dimensional cross-hole ERT images generated for a synthetic tracer test in a heterogeneous aquifer. Under the assumption that the solute spreads as a passive tracer, for high Peclet numbers the spatial moments of the evolving plume are dominated by the spatial distribution of the hydraulic conductivity. The assimilation of the electrical conductivity 4D images allows updating of the hydrological state as well as the spatial distribution of K. Thus, delineation of the tracer plume and estimation of the local aquifer heterogeneity can be achieved at the same time by means of this interpretation of time-lapse electrical images from tracer tests. We assess the impact on the performance of the hydrological inversion of (i) the uncertainty inherently affecting ERT inversions in terms of tracer concentration and (ii) the choice of the prior statistics of K. Our findings show that realistic ERT images can be integrated into a hydrological model even within an uncoupled inverse modeling framework. 
The reconstruction of the hydraulic conductivity spatial distribution is satisfactory in the portion of the domain directly covered by the passage of the tracer. Aside from the issues commonly affecting inverse models, the proposed approach is subject to the problem of filter inbreeding, and the retrieval performance is sensitive to the choice of the prior geostatistical parameters of K.
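The EnKF analysis step at the heart of this kind of data assimilation can be sketched generically. This is a standard stochastic-EnKF update with a toy state vector, not the authors' coupled ERT-transport implementation:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_operator, obs_err_var, seed=0):
    """Stochastic EnKF analysis step: update an ensemble of states
    (e.g. log-K values stacked with concentrations) with new data."""
    rng = np.random.default_rng(seed)
    X = np.asarray(ensemble, dtype=float)   # (n_members, n_state)
    HX = X @ obs_operator.T                 # predicted observations
    # Ensemble covariances from anomalies about the ensemble mean.
    Xa = X - X.mean(axis=0)
    Ha = HX - HX.mean(axis=0)
    n = X.shape[0]
    P_xy = Xa.T @ Ha / (n - 1)
    P_yy = Ha.T @ Ha / (n - 1) + np.eye(len(obs)) * obs_err_var
    K = P_xy @ np.linalg.inv(P_yy)          # Kalman gain
    # Perturb the observations so the analysis spread stays consistent.
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_err_var), size=HX.shape)
    return X + (perturbed - HX) @ K.T

# Toy example: 50-member ensemble of a 3-component state, observing
# only the first component.
H = np.array([[1.0, 0.0, 0.0]])
ens = np.random.default_rng(1).normal(0.0, 1.0, size=(50, 3))
updated = enkf_update(ens, np.array([2.0]), H, 0.1)
```

Because the gain is built from ensemble covariances, assimilating ERT-derived concentrations can update the unobserved K components too; the filter inbreeding noted above appears when those sampled covariances collapse after repeated updates.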
NASA Astrophysics Data System (ADS)
Xu, Rongting; Tian, Hanqin; Lu, Chaoqun; Pan, Shufen; Chen, Jian; Yang, Jia; Zhang, Bowen
2017-07-01
To accurately assess how increased global nitrous oxide (N2O) emission has affected the climate system requires a robust estimation of the preindustrial N2O emissions, since only the difference between current and preindustrial emissions represents net drivers of anthropogenic climate change. However, large uncertainty exists in previous estimates of preindustrial N2O emissions from the land biosphere, while preindustrial N2O emissions on finer scales, such as regional, biome, or sector scales, have not yet been well quantified. In this study, we applied a process-based Dynamic Land Ecosystem Model (DLEM) to estimate the magnitude and spatial patterns of preindustrial N2O fluxes at the biome, continental, and global levels as driven by multiple environmental factors. Uncertainties associated with key parameters were also evaluated. Our study indicates that the mean preindustrial N2O emission was approximately 6.20 Tg N yr-1, with an uncertainty range of 4.76 to 8.13 Tg N yr-1. The estimated N2O emission varied significantly at spatial and biome levels. South America, Africa, and Southern Asia accounted for 34.12, 23.85, and 18.93 %, respectively, together contributing 76.90 % of the global total emission. The tropics were identified as the major source of N2O released into the atmosphere, accounting for 64.66 % of the total emission. Our multi-scale estimates provide a robust reference for assessing the climate forcing of anthropogenic N2O emission from the land biosphere.
NASA Astrophysics Data System (ADS)
Rose, K.; Bauer, J. R.; Baker, D. V.
2015-12-01
As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining a connection to, and communicating the uncertainty in, the underlying spatial datasets. The VGM outputs a simultaneous visualization of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated from sample density, sample variance, interpolation error, or multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'big data' geospatial applications that run on the Hadoop cluster and integrate ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. 
Finally, we will share our approach for implementation of data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations. The presentation includes examples of the approach being applied to a range of subsurface, geospatial studies (e.g. induced seismicity risk).
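The essence of the VGM, carrying an uncertainty measure alongside each aggregated cell value, can be sketched without Hadoop. This minimal numpy illustration uses sample count and variance as the uncertainty proxies; the actual method supports several others, such as interpolation error or multi-simulation spread:

```python
import numpy as np

def variable_grid(xy, z, n_cells, extent=1.0):
    """Aggregate scattered samples to a grid, carrying per-cell mean,
    sample count and variance so uncertainty travels with the result."""
    mean = np.full((n_cells, n_cells), np.nan)
    count = np.zeros((n_cells, n_cells), dtype=int)
    var = np.full((n_cells, n_cells), np.nan)
    ix = np.minimum((xy[:, 0] / extent * n_cells).astype(int), n_cells - 1)
    iy = np.minimum((xy[:, 1] / extent * n_cells).astype(int), n_cells - 1)
    for i in range(n_cells):
        for j in range(n_cells):
            sel = z[(ix == i) & (iy == j)]
            count[i, j] = sel.size
            if sel.size:
                mean[i, j] = sel.mean()
                var[i, j] = sel.var()
    return mean, count, var

# Hypothetical point samples of a continuous attribute on a unit square.
rng = np.random.default_rng(3)
pts = rng.random((200, 2))
vals = pts[:, 0] + 0.1 * rng.normal(size=200)
m, c, v = variable_grid(pts, vals, n_cells=4)
```

A VGM-style display would then coarsen cells where `c` is low or `v` is high, so the map itself signals where the interpolated values are least trustworthy.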
Hyperspectral imaging spectroradiometer improves radiometric accuracy
NASA Astrophysics Data System (ADS)
Prel, Florent; Moreau, Louis; Bouchard, Robert; Bullis, Ritchie D.; Roy, Claude; Vallières, Christian; Levesque, Luc
2013-06-01
Reliable and accurate infrared characterization is necessary to measure the specific spectral signatures of aircraft and associated infrared countermeasure protections (i.e., flares). Infrared characterization is essential to improve countermeasure efficiency, improve friend-foe identification, and reduce the risk of friendly fire. Typical infrared characterization measurement setups include a variety of panchromatic cameras and spectroradiometers. Each instrument brings essential information; cameras measure the spatial distribution of targets, and spectroradiometers provide the spectral distribution of the emitted energy. However, the combination of separate instruments introduces possible radiometric errors and uncertainties that can be reduced with hyperspectral imagers, which combine spectral and spatial information in a single data set, measuring both distributions at the same time and thus ensuring the temporal and spatial cohesion of the collected information. This paper presents a quantitative analysis of the main contributors to radiometric uncertainty and shows how a hyperspectral imager can reduce these uncertainties.
The Generalized Uncertainty Principle and Harmonic Interaction in Three Spatial Dimensions
NASA Astrophysics Data System (ADS)
Hassanabadi, H.; Hooshmand, P.; Zarrinkamar, S.
2015-01-01
In three spatial dimensions, the generalized uncertainty principle is considered under an isotropic harmonic oscillator interaction in both non-relativistic and relativistic regions. By using novel transformations and separations of variables, the exact analytical solution of energy eigenvalues as well as the wave functions is obtained. Time evolution of the non-relativistic region is also reported.
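For reference, one common form of the generalized uncertainty principle is the quadratic-in-momentum deformation of the canonical commutator; the paper's exact convention may differ, so this is a representative sketch rather than the authors' definition:

```latex
% Deformed commutator with GUP parameter \beta > 0:
[\hat{x}_i, \hat{p}_j] = i\hbar\,\delta_{ij}\left(1 + \beta \hat{p}^2\right),
% which implies the deformed uncertainty relation
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left[1 + \beta\,(\Delta p)^2\right],
% and hence a nonzero minimal position uncertainty
(\Delta x)_{\min} = \hbar\sqrt{\beta}.
```

Minimizing the right-hand side over Δp (the minimum occurs at Δp = 1/√β) yields the minimal length, which is what makes the harmonic oscillator spectrum in this framework differ from the ordinary case.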
Alan K. Swanson; Solomon Z. Dobrowski; Andrew O. Finley; James H. Thorne; Michael K. Schwartz
2013-01-01
The uncertainty associated with species distribution model (SDM) projections is poorly characterized, despite its potential value to decision makers. Error estimates from most modelling techniques have been shown to be biased due to their failure to account for spatial autocorrelation (SAC) of residual error. Generalized linear mixed models (GLMM) have the ability to...
NASA Technical Reports Server (NTRS)
Colarco, P. R.; Kahn, R. A.; Remer, L. A.; Levy, R. C.
2014-01-01
We use the Moderate Resolution Imaging Spectroradiometer (MODIS) satellite aerosol optical thickness (AOT) product to assess the impact of reduced swath width on global and regional AOT statistics and trends. Along-track and across-track sampling strategies are employed, in which the full MODIS data set is sub-sampled with various narrow-swath (approximately 400-800 km) and single pixel width (approximately 10 km) configurations. Although view-angle artifacts in the MODIS AOT retrieval confound direct comparisons between averages derived from different sub-samples, careful analysis shows that with many portions of the Earth essentially unobserved, spatial sampling introduces uncertainty in the derived seasonal-regional mean AOT. These AOT spatial sampling artifacts comprise up to 60% of the full-swath AOT value under moderate aerosol loading, and can be as large as 0.1 in some regions under high aerosol loading. Compared to full-swath observations, narrower-swath and single pixel width sampling exhibits a reduced ability to detect AOT trends with statistical significance. On the other hand, estimates of the global, annual mean AOT do not vary significantly from the full-swath values as spatial sampling is reduced. Aggregation of the MODIS data at coarse grid scales (10 deg) shows consistency in the aerosol trends across sampling strategies, with increased statistical confidence, but quantitative errors in the derived trends are found even for the full-swath data when compared to high spatial resolution (0.5 deg) aggregations. Using results of a model-derived aerosol reanalysis, we find consistency in our conclusions about a seasonal-regional spatial sampling artifact in AOT. Furthermore, the model shows that reduced spatial sampling can amount to uncertainty in computed shortwave top-of-atmosphere aerosol radiative forcing of 2-3 W m(-2). These artifacts are lower bounds, as other, unconsidered sampling strategies could perform less well. 
These results suggest that future aerosol satellite missions having significantly less than full-swath viewing are unlikely to sample the true AOT distribution well enough to obtain the statistics needed to reduce uncertainty in aerosol direct forcing of climate.
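The spatial-sampling artifact described above can be reproduced in miniature: sub-sample a synthetic AOT field with narrow "swaths" and compare the regional means against the full field. This is a toy illustration with an invented field, not the MODIS processing chain:

```python
import numpy as np

def subsample_bias(field, swath_cols, n_repeats=200, seed=0):
    """Compare regional means from narrow column sub-samples (stand-ins
    for narrow swaths) against the full-field mean (full swath)."""
    rng = np.random.default_rng(seed)
    full_mean = field.mean()
    errors = []
    for _ in range(n_repeats):
        start = rng.integers(0, field.shape[1] - swath_cols + 1)
        errors.append(abs(field[:, start:start + swath_cols].mean() - full_mean))
    return full_mean, float(np.mean(errors))

# Hypothetical AOT field with a strong east-west gradient, so narrow
# swaths systematically miss part of the regional structure.
x = np.linspace(0.0, 1.0, 100)
aot = 0.1 + 0.3 * np.tile(x, (50, 1))   # 50 x 100 grid
full, err_narrow = subsample_bias(aot, swath_cols=10)
_, err_wide = subsample_bias(aot, swath_cols=80)
```

The narrow sub-sample's mean error exceeds the wide one whenever the field has spatial structure on scales larger than the swath, which is the mechanism behind the seasonal-regional artifact quantified in the study.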
NASA Astrophysics Data System (ADS)
de Barros, Felipe P. J.
2018-07-01
Quantifying the uncertainty in solute mass discharge at an environmentally sensitive location is key to assess the risks due to groundwater contamination. Solute mass fluxes are strongly affected by the spatial variability of hydrogeological properties as well as release conditions at the source zone. This paper provides a methodological framework to investigate the interaction between the ubiquitous heterogeneity of the hydraulic conductivity and the mass release rate at the source zone on the uncertainty of mass discharge. Through the use of perturbation theory, we derive analytical and semi-analytical expressions for the statistics of the solute mass discharge at a control plane in a three-dimensional aquifer while accounting for the solute mass release rates at the source. The derived solutions are limited to aquifers displaying low-to-mild heterogeneity. Results illustrate the significance of the source zone mass release rate in controlling the mass discharge uncertainty. The relative importance of the mass release rate on the mean solute discharge depends on the distance between the source and the control plane. On the other hand, we find that the solute release rate at the source zone has a strong impact on the variance of the mass discharge. Within a risk context, we also compute the peak mean discharge as a function of the parameters governing the spatial heterogeneity of the hydraulic conductivity field and mass release rates at the source zone. The proposed physically-based framework is application-oriented, computationally efficient and capable of propagating uncertainty from different parameters onto risk metrics. Furthermore, it can be used for preliminary screening purposes to guide site managers to perform system-level sensitivity analysis and better allocate resources.
The impact of lake and reservoir parameterization on global streamflow simulation.
Zajac, Zuzanna; Revilla-Romero, Beatriz; Salamon, Peter; Burek, Peter; Hirpa, Feyera A; Beck, Hylke
2017-05-01
Lakes and reservoirs affect the timing and magnitude of streamflow, and are therefore essential hydrological model components, especially in the context of global flood forecasting. However, the parameterization of lake and reservoir routines on a global scale is subject to considerable uncertainty due to lack of information on lake hydrographic characteristics and reservoir operating rules. In this study we estimated the effect of lakes and reservoirs on global daily streamflow simulations of a spatially-distributed LISFLOOD hydrological model. We applied state-of-the-art global sensitivity and uncertainty analyses for selected catchments to examine the effect of uncertain lake and reservoir parameterization on model performance. Streamflow observations from 390 catchments around the globe and multiple performance measures were used to assess model performance. Results indicate a considerable geographical variability in the lake and reservoir effects on the streamflow simulation. Nash-Sutcliffe Efficiency (NSE) and Kling-Gupta Efficiency (KGE) metrics improved for 65% and 38% of catchments respectively, with median skill score values of 0.16 and 0.2 while scores deteriorated for 28% and 52% of the catchments, with median values -0.09 and -0.16, respectively. The effect of reservoirs on extreme high flows was substantial and widespread in the global domain, while the effect of lakes was spatially limited to a few catchments. As indicated by global sensitivity analysis, parameter uncertainty substantially affected uncertainty of model performance. Reservoir parameters often contributed to this uncertainty, although the effect varied widely among catchments. The effect of reservoir parameters on model performance diminished with distance downstream of reservoirs in favor of other parameters, notably groundwater-related parameters and channel Manning's roughness coefficient. 
This study underscores the importance of accounting for lakes and, especially, reservoirs and using appropriate parameterization in large-scale hydrological simulations.
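The NSE and KGE metrics and the skill scores quoted above are standard and can be sketched directly. The flow series below are hypothetical, and the reference run in the skill score stands in for the simulation without lake and reservoir routines:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 is perfect, 0 is no better than
    predicting the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    """Kling-Gupta Efficiency: combines correlation, variability and bias."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()    # variability ratio
    beta = sim.mean() / obs.mean()   # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

def skill_score(metric_with, metric_without, perfect=1.0):
    """Skill of one run relative to a reference run; 0.16 and 0.2 are
    the median values reported above for NSE and KGE."""
    return (metric_with - metric_without) / (perfect - metric_without)

# Hypothetical daily flows (m3/s).
obs = [3.0, 4.0, 6.0, 5.0, 4.0]
sim = [2.8, 4.2, 5.5, 5.1, 4.1]
```

Because NSE squares the residuals, it weights high flows heavily, while KGE separates correlation, variability and bias; the study's diverging NSE and KGE results for the same catchments reflect exactly this difference.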
Lorenz, Marco; Fürst, Christine; Thiel, Enrico
2013-09-01
Regarding increasing pressures from global societal and climate change, the assessment of the impact of land use and land management practices on land degradation and the related decrease in the sustainable provision of ecosystem services gains increasing interest. Existing approaches to assessing agricultural practices focus on single crops or statistical data, because spatially explicit information on practically applied crop rotations is mostly not available. This provokes considerable uncertainties in crop production models, as regional specifics have to be neglected or cannot be considered in an appropriate way. In a case study in Saxony, we developed an approach to (i) derive representative regional crop rotations by combining different data sources and expert knowledge. This includes the integration of innovative crop sequences related to bio-energy production or organic farming and different soil tillage, soil management and soil protection techniques. Furthermore, we developed (ii) a regionalization approach for transferring crop rotations and related soil management strategies on the basis of statistical data and spatially explicit data taken from so-called field blocks. These field blocks are the smallest spatial entity for which agricultural practices must be reported to apply for agricultural funding within the frame of the European Agricultural Fund for Rural Development (EAFRD) program. The information was finally integrated into the spatial decision support tool GISCAME to assess and visualize, in a spatially explicit manner, the impact of alternative agricultural land use strategies on soil erosion risk and ecosystem services provision. The objective of this paper is to present the approach for creating spatially explicit information on agricultural management practices for a study area around Dresden, the capital of the German Federal State of Saxony. Copyright © 2013 Elsevier Ltd. All rights reserved.
Incorporating rainfall uncertainty in a SWAT model: the river Zenne basin (Belgium) case study
NASA Astrophysics Data System (ADS)
Tolessa Leta, Olkeba; Nossent, Jiri; van Griensven, Ann; Bauwens, Willy
2013-04-01
The European Union Water Framework Directive (EU-WFD) called on its member states to achieve a good ecological status for all inland and coastal water bodies by 2015. According to recent studies, the river Zenne (Belgium) is far from this objective. Therefore, an interuniversity, multidisciplinary project, "Towards a Good Ecological Status in the river Zenne (GESZ)", was launched to evaluate the effects of wastewater management plans on the river. In this project, different models have been developed and integrated using the Open Modelling Interface (OpenMI). The hydrologic, semi-distributed Soil and Water Assessment Tool (SWAT) is used as one of the components of the integrated modelling chain to model the upland catchment processes. Assessing the uncertainty of SWAT is an essential aspect of the decision-making process, in order to design robust management strategies that take the predicted uncertainties into account. Model uncertainty stems from uncertainties in the model parameters, the input data (e.g., rainfall), the calibration data (e.g., stream flows) and the model structure itself. The objective of this paper is to assess the first three sources of uncertainty in a SWAT model of the river Zenne basin. To assess rainfall measurement uncertainty, we first identified independent rainfall periods, based on the daily precipitation and stream flow observations, using the Water Engineering Time Series PROcessing tool (WETSPRO). Secondly, we assigned a rainfall multiplier parameter to each independent rainfall period, which acts as a multiplicative model of the input error. Finally, we treated these multipliers as latent parameters in the model optimization and uncertainty analysis (UA). For parameter uncertainty assessment, given the large number of SWAT model parameters, we first screened out the most sensitive parameters using the Latin Hypercube One-factor-At-a-Time (LH-OAT) technique.
Subsequently, we considered only the most sensitive parameters for parameter optimization and UA. To account explicitly for stream flow uncertainty, we assumed that the stream flow measurement error increases linearly with the stream flow value. To assess the uncertainty and infer the posterior distributions of the parameters, we used a Markov chain Monte Carlo (MCMC) sampler, DiffeRential Evolution Adaptive Metropolis (DREAM), which samples from an archive of past states to generate candidate points in each individual chain. It is shown that the marginal posterior distributions of the rainfall multipliers vary widely between individual events, as a consequence of rainfall measurement errors and the spatial variability of the rain. Only a few of the rainfall events are well defined. The marginal posterior distributions of the SWAT model parameters are well defined and identified by DREAM within their prior ranges. The posterior distributions of the output error parameters also show that the stream flow data are highly uncertain. The approach of using rainfall multipliers to treat rainfall uncertainty in a complex model has an impact on the marginal posterior distributions of the model parameters and on the model results.
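The abstract's two key error models, one latent multiplier per independent rainfall event and a stream flow error growing linearly with the flow, can be illustrated with a short sketch. This is not the authors' DREAM implementation; the function names and the 10 % relative-error default are illustrative assumptions.

```python
import numpy as np

def apply_multipliers(rainfall, event_ids, multipliers):
    """Scale rainfall with one latent multiplier per independent
    rainfall event (a multiplicative input-error model)."""
    r = np.asarray(rainfall, dtype=float)
    return r * np.asarray(multipliers, dtype=float)[np.asarray(event_ids)]

def heteroscedastic_loglik(obs, sim, rel_err=0.1):
    """Gaussian log-likelihood with measurement error growing linearly
    with stream flow: sigma = rel_err * observed flow."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    sigma = rel_err * obs
    return float(np.sum(-0.5 * np.log(2.0 * np.pi * sigma**2)
                        - 0.5 * ((obs - sim) / sigma) ** 2))

# toy demonstration: two events, the second under-caught by the gauge
rain = np.array([2.0, 5.0, 0.0, 3.0, 6.0])
event_ids = np.array([0, 0, 0, 1, 1])
corrected = apply_multipliers(rain, event_ids, [1.0, 1.3])
```

In an MCMC setting such as DREAM, the multipliers would simply be appended to the parameter vector and sampled jointly with the model parameters.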
NASA Astrophysics Data System (ADS)
Ahmadalipour, Ali; Moradkhani, Hamid; Rana, Arun
2018-01-01
Climate change is expected to have severe impacts on natural systems as well as on various socio-economic aspects of human life. This has urged scientific communities to improve the understanding of future climate and to reduce the uncertainties associated with projections. In the present study, ten statistically downscaled CMIP5 GCMs at 1/16th degree spatial resolution, from two different downscaling procedures, are utilized over the Columbia River Basin (CRB) to assess the changes in climate variables and characterize the associated uncertainties. Three climate variables, i.e. precipitation, maximum temperature, and minimum temperature, are studied for the historical period 1970-2000 as well as the future period 2010-2099, simulated under the representative concentration pathways RCP4.5 and RCP8.5. Bayesian Model Averaging (BMA) is employed to reduce the model uncertainty and develop a probabilistic projection for each variable in each scenario. Historical comparison of long-term attributes of the GCMs and observations suggests a more accurate representation by BMA than by the individual models. Furthermore, the BMA projections are used to investigate future seasonal to annual changes of the climate variables. Projections indicate a significant increase in annual precipitation and temperature, with varying degrees of change across the different sub-basins of the CRB. We then characterized the uncertainty of future projections for each season over the CRB. Results reveal that model uncertainty is the main source of uncertainty among those considered. However, downscaling uncertainty contributes considerably to the total uncertainty of future projections, especially in summer. Moreover, for precipitation, downscaling uncertainty appears to be higher than scenario uncertainty.
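The BMA step described above can be sketched in a few lines. This is a simplified illustration in which weights are taken proportional to each model's Gaussian likelihood of its historical errors, rather than trained with the EM algorithm used in full BMA implementations; all names are hypothetical.

```python
import numpy as np

def bma_weights(errors_by_model):
    """Weights proportional to each model's Gaussian likelihood of its
    historical errors (a simplification of EM-trained BMA weights)."""
    errs = np.asarray(errors_by_model, dtype=float)
    sigma2 = errs.var(axis=1) + 1e-12                  # per-model error variance
    loglik = np.sum(-0.5 * np.log(2 * np.pi * sigma2[:, None])
                    - 0.5 * errs**2 / sigma2[:, None], axis=1)
    w = np.exp(loglik - loglik.max())                  # stabilized softmax
    return w / w.sum()

def bma_mixture(weights, means, variances):
    """Mean and variance of the BMA predictive mixture: weighted
    within-model variance plus between-model spread."""
    w, m, v = map(np.asarray, (weights, means, variances))
    mean = np.sum(w * m)
    var = np.sum(w * (v + (m - mean) ** 2))
    return mean, var
```

The mixture variance makes explicit why BMA also serves as an uncertainty characterization: disagreement between model means inflates the predictive spread.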
Environmental assessment of spatial plan policies through land use scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geneletti, Davide, E-mail: davide.geneletti@ing.unitn.it
2012-01-15
This paper presents a method based on scenario analysis to compare the environmental effects of different spatial plan policies across a range of possible futures. The study aimed at contributing to overcoming two limitations encountered in Strategic Environmental Assessment (SEA) for spatial planning: poor exploration of how the future might unfold, and poor consideration of alternative plan policies. Scenarios were developed through what-if functions and spatial modeling in a Geographical Information System (GIS), and consisted of maps representing future land uses under different assumptions about key driving forces. The use of land use scenarios provided a representation of how the different policies would look on the ground. This allowed a better understanding of the policies' implications for the environment, which could be measured through a set of indicators. The research took a case-study approach by developing and assessing land use scenarios for the future growth of Caia, a strategically located and fast-developing town in rural Mozambique. The effects of alternative spatial plan policies were assessed against a set of environmental performance indicators, including deforestation, loss of agricultural land, encroachment of flood-prone areas and wetlands, and access to water sources. In this way, critical environmental effects related to the implementation of each policy were identified and discussed, suggesting possible strategies to address them. Research Highlights: The method contributes to two critical issues in SEA: exploration of the future and consideration of alternatives. Future scenarios are used to test the environmental performance of different spatial plan policies under conditions of uncertainty. Spatially explicit land use scenarios provide a representation of how different policies would look on the ground.
NASA Astrophysics Data System (ADS)
Luce, C.
2014-12-01
Climate and hydrology models are regularly applied to assess potential changes in water resources and to inform adaptation decisions. An increasingly common question is, "What if we are wrong?" While climate models show substantial agreement on metrics such as pressure, temperature, and wind, they are notoriously uncertain in projecting precipitation change. The response to that uncertainty varies depending on the water management context and the nature of the uncertainty. In the southwestern U.S., large storage reservoirs (relative to annual supply) and general expectations of decreasing precipitation have guided extensive discussion of water management towards uncertainties in annual-scale water balances, precipitation, and evapotranspiration. In contrast, smaller reservoirs and little expectation of change in annual precipitation have focused discussions of Pacific Northwest water management on shifts in runoff seasonality. The relative certainty of temperature impacts on snowpacks, compared to the substantial uncertainty in precipitation, has yielded a consistent narrative of earlier snowmelt. This narrative has been reinforced by a perception of essentially the same behavior in the historical record, which has led to calls in the political arena for more reservoir storage to replace snowpack storage for water supplies. Recent findings on differences in precipitation trends at high versus low elevations, however, have renewed attention to the uncertainty in precipitation futures and generated questions about alternative water management strategies. An important question with respect to snowpacks is whether the precipitation changes matter in the context of such substantial projected temperature change. Here we apply an empirical snowpack model to analyze spatial differences in the uncertainty of snowpack responses to temperature and precipitation forcing across the Pacific Northwest U.S.
The analysis reveals a strong geographic gradient in the uncertainty of the snowpack response to future climate: in the coastal regions, precipitation uncertainty is relatively inconsequential for snowpack changes, whereas in the interior mountains even minor uncertainties in precipitation are on par with the changes expected from temperature.
Uncertainty prediction for PUB
NASA Astrophysics Data System (ADS)
Mendiondo, E. M.; Tucci, C. M.; Clarke, R. T.; Castro, N. M.; Goldenfum, J. A.; Chevallier, P.
2003-04-01
The IAHS initiative on Prediction in Ungauged Basins (PUB) attempts to integrate monitoring needs and uncertainty prediction for river basins. This paper outlines alternative approaches to uncertainty prediction that could be linked with new blueprints for PUB, showing how equifinality-based models can be applied using practical gauging strategies such as the Nested Catchment Experiment (NCE). Uncertainty prediction is discussed from observations of the Potiribu Project, an NCE layout in representative basins of a subtropical biome of 300,000 km2 in South America. Uncertainty prediction is assessed at the microscale (1 m2 plots), at the hillslope scale (0.125 km2) and at the mesoscale (0.125-560 km2). At the microscale, uncertainty-based models are constrained by temporal variations of state variables, with changing likelihood surfaces for experiments using the Green-Ampt model. Two new blueprints emerged from this NCE for PUB: (1) the Scale Transferability Scheme (STS) at the hillslope scale and (2) the Integrating Process Hypothesis (IPH) at the mesoscale. The STS integrates multi-dimensional scaling with similarity thresholds, as a generalization of the Representative Elementary Area (REA), using spatial correlation from point (distributed) to area (lumped) processes. In this way, the STS addresses the uncertainty bounds of model parameters in an upscaling process at the hillslope. The IPH approach, on the other hand, regionalizes synthetic hydrographs, thereby interpreting the uncertainty bounds of streamflow variables. Multiscale evidence from the Potiribu NCE layout shows novel pathways for uncertainty prediction under a PUB perspective in representative basins of the world's biomes.
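For readers unfamiliar with the Green-Ampt model mentioned at the microscale, a minimal sketch of its cumulative-infiltration solution under assumed ponded conditions is given below. The fixed-point solver and parameter names are illustrative, not part of the Potiribu analysis.

```python
import math

def green_ampt_rate(F, Ks, psi, dtheta):
    """Potential infiltration rate f = Ks * (1 + psi*dtheta / F),
    where F is cumulative infiltration, Ks saturated conductivity,
    psi wetting-front suction head, dtheta the moisture deficit."""
    return Ks * (1.0 + psi * dtheta / F)

def cumulative_infiltration(Ks, psi, dtheta, t):
    """Cumulative infiltration F(t) under ponded conditions, from the
    implicit Green-Ampt equation F - S*ln(1 + F/S) = Ks*t with
    S = psi*dtheta, solved by fixed-point iteration."""
    S = psi * dtheta
    F = Ks * t + S                      # initial guess
    for _ in range(200):
        F = Ks * t + S * math.log(1.0 + F / S)
    return F
```

The fixed-point form converges because the update map is a contraction in F; the infiltration rate decays toward Ks as the wetting front deepens.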
Integrating data types to enhance shoreline change assessments
NASA Astrophysics Data System (ADS)
Long, J.; Henderson, R.; Plant, N. G.; Nelson, P. R.
2016-12-01
Shorelines represent the variable boundary between terrestrial and marine environments. Assessments of geographic and temporal variability in shoreline position, and of the related variability in shoreline change rates, are an important part of studies and applications concerning impacts from sea-level rise and storms. The results of these assessments are used to quantify future ecosystem services and coastal resilience and to guide the selection of appropriate coastal restoration and protection designs. But existing assessments typically fail to incorporate all available shoreline observations, because the observations are derived from multiple data types and have different or unknown biases and uncertainties. Shoreline-change research and assessments often focus either on the long-term trajectory, using sparse data spanning multiple decades, or on shorter-term evolution, using data collected more frequently but over a shorter period of time. The combination of data collected with significantly different temporal resolution is not often considered. Also, differences in the definition of the shoreline metric itself can occur, whether using a single data source or multiple sources, due to variation in the signal being detected in the data (e.g. instantaneous land/water interface, swash zone, wrack line, or topographic contours). Previous studies have not explored whether more robust shoreline change assessments are possible if all available data are utilized and all uncertainties are considered. In this study, we test the hypothesis that incorporating all available shoreline data leads both to improved historical assessments and to enhanced predictive capability of shoreline-change forecasts. Using over 250 observations of shoreline position at Dauphin Island, Alabama over the last century, we compare shoreline-change rates derived from individual data sources (airborne lidar, satellite, aerial photographs) with an assessment using the combination of all available data.
Biases and uncertainties in the shoreline metric from different data types, and the varying temporal and spatial resolution of the data, are examined. As part of this test, we also demonstrate the application of data assimilation techniques to predict shoreline position while accounting for the uncertainty in each type of data.
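One simple way to combine shoreline observations that carry different uncertainties, as discussed above, is an inverse-variance weighted least-squares fit of the change rate. This is only an illustrative sketch, not the data-assimilation scheme used in the study; the function name and inputs are assumptions.

```python
import numpy as np

def weighted_trend(times, positions, sigmas):
    """Shoreline-change rate and intercept from heterogeneous
    observations via inverse-variance weighted least squares.
    sigmas: 1-sigma positional uncertainty of each observation."""
    t = np.asarray(times, float)
    y = np.asarray(positions, float)
    w = 1.0 / np.asarray(sigmas, float) ** 2   # inverse-variance weights
    A = np.vstack([t, np.ones_like(t)]).T      # design matrix [t, 1]
    Aw = A * w[:, None]
    coef = np.linalg.solve(A.T @ Aw, Aw.T @ y) # normal equations A'WA x = A'Wy
    return coef[0], coef[1]                    # (rate, intercept)
```

Observations from a noisy source (large sigma) are automatically down-weighted rather than discarded, which is the basic argument for using all available data types.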
Optimal Integration of Departures and Arrivals in Terminal Airspace
NASA Technical Reports Server (NTRS)
Xue, Min; Zelinski, Shannon Jean
2013-01-01
Coordination of operations with spatially and temporally shared resources, such as route segments, fixes, and runways, improves the efficiency of terminal airspace management. Problems in this category are, in general, computationally difficult compared to conventional scheduling problems. This paper presents a fast-time algorithm based on a non-dominated sorting genetic algorithm (NSGA). It was first applied to a test problem introduced in the existing literature; the experiment showed that the new method can solve the 20-aircraft problem in fast time with a 65% (440 second) delay reduction using shared departure fixes. To test its application to a more realistic and complicated problem, the NSGA algorithm was then applied to a problem in LAX terminal airspace, where interactions between 28% of LAX arrivals and 10% of LAX departures are resolved by spatial separation in current operations, which may introduce unnecessary delays. In this work, three types of separation - spatial, temporal, and hybrid - were formulated with the new algorithm, where hybrid separation combines temporal and spatial separation. Results showed that although temporal separation achieved less delay than spatial separation with a small uncertainty buffer, spatial separation outperformed temporal separation when the uncertainty buffer was increased. Hybrid separation introduced much less delay than both the spatial and the temporal approach. For a total of 15 interacting departures and arrivals, when compared to spatial separation, the delay reduction of hybrid separation varied between 11% (3.1 minutes) and 64% (10.7 minutes) for an uncertainty buffer ranging from 0 to 60 seconds. Furthermore, as a comparison with the NSGA algorithm, a First-Come-First-Served heuristic method was implemented for the hybrid separation.
Experiments showed that the results from the NSGA algorithm have 9% to 42% less delay than those of the heuristic method, across the varied uncertainty buffer sizes.
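The non-dominated sorting at the heart of an NSGA can be sketched as follows. This is the generic NSGA-II-style Pareto ranking (minimization assumed in all objectives), not the paper's full scheduling formulation.

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    """Group solution indices into successive Pareto fronts, as in the
    first stage of NSGA-II."""
    fronts, remaining = [], list(range(len(points)))
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts
```

In a delay-versus-robustness formulation, each point would be a candidate schedule's objective vector, and the first front would hold the trade-off solutions presented to the decision maker.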
Gurdak, Jason J.; Qi, Sharon L.; Geisler, Michael L.
2009-01-01
The U.S. Geological Survey Raster Error Propagation Tool (REPTool) is a custom tool for use with the Environmental Systems Research Institute (ESRI) ArcGIS Desktop application to estimate error propagation and prediction uncertainty in raster processing operations and geospatial modeling. REPTool is designed to introduce concepts of error and uncertainty in geospatial data and modeling and provide users of ArcGIS Desktop a geoprocessing tool and methodology to consider how error affects geospatial model output. Similar to other geoprocessing tools available in ArcGIS Desktop, REPTool can be run from a dialog window, from the ArcMap command line, or from a Python script. REPTool consists of public-domain, Python-based packages that implement Latin Hypercube Sampling within a probabilistic framework to track error propagation in geospatial models and quantitatively estimate the uncertainty of the model output. Users may specify error for each input raster or model coefficient represented in the geospatial model. The error for the input rasters may be specified as either spatially invariant or spatially variable across the spatial domain. Users may specify model output as a distribution of uncertainty for each raster cell. REPTool uses the Relative Variance Contribution method to quantify the relative error contribution from the two primary components in the geospatial model - errors in the model input data and coefficients of the model variables. REPTool is appropriate for many types of geospatial processing operations, modeling applications, and related research questions, including applications that consider spatially invariant or spatially variable error in geospatial data.
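The core idea behind REPTool, Latin Hypercube Sampling for error propagation, can be sketched generically as follows. This is not REPTool's actual package code; the uniform multiplicative error model and all names are assumptions for illustration.

```python
import numpy as np

def latin_hypercube(n_samples, n_vars, rng):
    """One stratified draw per equal-probability interval, per variable,
    with the strata independently permuted for each variable."""
    u = (rng.random((n_samples, n_vars)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_vars):
        u[:, j] = u[rng.permutation(n_samples), j]
    return u  # uniform(0, 1) LHS design

def propagate(model, nominal_inputs, rel_errors, n_samples=500, seed=0):
    """Run the model over perturbed inputs and summarize the output
    distribution (mean and standard deviation)."""
    rng = np.random.default_rng(seed)
    u = latin_hypercube(n_samples, len(nominal_inputs), rng)
    # map the uniform design onto +/- rel_error multiplicative perturbations
    factors = 1.0 + (2.0 * u - 1.0) * np.asarray(rel_errors)
    outputs = np.array([model(np.asarray(nominal_inputs, float) * f) for f in factors])
    return outputs.mean(), outputs.std()
```

Compared with plain Monte Carlo, the stratification means far fewer model runs are needed for a stable output distribution, which matters when the "model" is an expensive per-cell raster operation.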
Quantifying and Qualifying USGS ShakeMap Uncertainty
Wald, David J.; Lin, Kuo-Wan; Quitoriano, Vincent
2008-01-01
We describe algorithms for quantifying and qualifying uncertainties associated with USGS ShakeMap ground motions. The uncertainty values computed consist of latitude/longitude grid-based multiplicative factors that scale the standard deviation associated with the ground motion prediction equation (GMPE) used within the ShakeMap algorithm for estimating ground motions. The resulting grid-based 'uncertainty map' is essential for evaluation of losses derived using ShakeMaps as the hazard input. For ShakeMap, ground motion uncertainty at any point is dominated by two main factors: (i) the influence of any proximal ground motion observations, and (ii) the uncertainty of estimating ground motions from the GMPE, most notably, elevated uncertainty due to initial, unconstrained source rupture geometry. The uncertainty is highest for larger magnitude earthquakes when source finiteness is not yet constrained and, hence, the distance to rupture is also uncertain. In addition to a spatially dependent, quantitative assessment, many users may prefer a simple, qualitative grading for the entire ShakeMap. We developed a grading scale that allows one to quickly gauge the appropriate level of confidence when using rapidly produced ShakeMaps as part of the post-earthquake decision-making process or for qualitative assessments of archived or historical earthquake ShakeMaps. We describe an uncertainty letter grading ('A' through 'F', for high to poor quality, respectively) based on the uncertainty map. A middle-range ('C') grade corresponds to a ShakeMap for a moderate-magnitude earthquake suitably represented with a point-source location. Lower grades 'D' and 'F' are assigned for larger events (M>6) where finite-source dimensions are not yet constrained. The addition of ground motion observations (or observed macroseismic intensities) reduces uncertainties over data-constrained portions of the map.
Higher grades ('A' and 'B') correspond to ShakeMaps with constrained fault dimensions and numerous stations, depending on the density of station/data coverage. Because of these dependencies, the letter grade can change with subsequent ShakeMap revisions as more data or finite-fault dimensions become available. We emphasize that the greatest uncertainties are associated with unconstrained source dimensions for large earthquakes, where the distance term in the GMPE is most uncertain; this uncertainty thus scales with magnitude (and consequently rupture dimension). Since this distance uncertainty produces potentially large uncertainties in ShakeMap ground-motion estimates, this factor dominates over compensating constraints for all but the densest station distributions.
Li, Zhengpeng; Liu, Shuguang; Zhang, Xuesong; West, Tristram O.; Ogle, Stephen M.; Zhou, Naijun
2016-01-01
Quantifying the spatial and temporal patterns of carbon sources and sinks, and their uncertainties, across agriculture-dominated areas remains challenging for understanding regional carbon cycles. Characteristics of local land cover inputs can affect regional carbon estimates, but this effect has not been fully evaluated in the past. Within the North American Carbon Program Mid-Continent Intensive (MCI) Campaign, three models were developed to estimate carbon fluxes on croplands: an inventory-based model, the Environmental Policy Integrated Climate (EPIC) model, and the General Ensemble biogeochemical Modeling System (GEMS). All three provided estimates of the major carbon fluxes on cropland: net primary production (NPP), net ecosystem production (NEP), and soil organic carbon (SOC) change. Using data mining and spatial statistics, we studied the spatial distribution of the carbon flux uncertainties and the relationships between these uncertainties and land cover characteristics. Results indicated that the uncertainties for all three carbon fluxes were not randomly distributed, but instead formed multiple clusters within the MCI region. We investigated the impacts of three land cover characteristics on the flux uncertainties: cropland percentage, cropland richness, and cropland diversity. The results indicated that cropland percentage significantly influenced the uncertainties of NPP and NEP, but not those of SOC change. Greater uncertainties of NPP and NEP were found in counties with small cropland percentages than in counties with large cropland percentages. Cropland species richness and diversity also showed negative correlations with the model uncertainties. Our study demonstrated that land cover characteristics contribute to the uncertainties of regional carbon flux estimates.
The approaches we used in this study can be applied to other ecosystem models to identify areas with high uncertainties, where the models can be improved to reduce the overall uncertainty of regional carbon flux estimates.
NASA Astrophysics Data System (ADS)
Llopis-Albert, Carlos; Palacios-Marqués, Daniel; Merigó, José M.
2014-04-01
In this paper a methodology for the stochastic management of groundwater quality problems is presented, which can be used to provide agricultural advisory services. A stochastic algorithm for solving the coupled flow and mass transport inverse problem is combined with a stochastic management approach to develop methods for integrating uncertainty, thus obtaining more reliable policies for controlling groundwater nitrate pollution from agriculture. The stochastic inverse model allows identifying non-Gaussian parameters and reducing uncertainty in heterogeneous aquifers by constraining stochastic simulations to data. The management model determines the spatial and temporal distribution of fertilizer application rates that maximizes net benefits in agriculture, constrained by quality requirements for groundwater at various control sites. The quality constraints can be taken, for instance, from water laws such as the EU Water Framework Directive (WFD). Furthermore, the methodology quantifies the trade-off between higher economic returns and reliability in meeting the environmental standards. This new technology can therefore help stakeholders in the decision-making process under uncertainty. The methodology has been successfully applied to a 2D synthetic aquifer, where an uncertainty assessment has been carried out by means of Monte Carlo simulation techniques.
Kleinmann, Joachim U; Wang, Magnus
2017-09-01
Spatial behavior is of crucial importance for the risk assessment of pesticides and for assessing the effects of agricultural practice or multiple stressors, because it determines field use, exposure, and recovery. Recently, population models have increasingly been used to understand the mechanisms driving risk and recovery, or to conduct landscape-level risk assessments. To include spatial behavior appropriately in population models for use in risk assessments, a new method, the "probabilistic walk", was developed, which simulates the detailed daily movement of individuals by taking into account food resources, vegetation cover, and the presence of conspecifics. At each movement step, animals decide where to move next based on probabilities determined from this information. The model was parameterized to simulate populations of brown hares (Lepus europaeus). A detailed validation demonstrated that the model can realistically reproduce various natural patterns of brown hare ecology and behavior. Simulated proportions of time animals spent in fields (PT values) were also comparable to field observations. It is shown that these parameters, important for the risk assessment, may nevertheless vary between landscapes. The results demonstrate the value of using population models to reduce uncertainties in risk assessment and to better understand which factors determine risk in a landscape context. Environ Toxicol Chem 2017;36:2299-2307. © 2017 SETAC.
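The "probabilistic walk" decision rule described above, choosing the next cell from probabilities built from food, cover, and conspecific presence, might be sketched as below. The linear scoring form and all weight values are illustrative assumptions, not the calibrated brown-hare model.

```python
import random

def step_probabilities(cells, w_food=1.0, w_cover=1.0, w_conspecific=-0.5):
    """Turn per-cell attractiveness scores into choice probabilities.
    Each cell is a dict with 'food', 'cover' and 'occupied' (0/1) keys;
    conspecific presence is assumed to be mildly repulsive."""
    scores = [w_food * c["food"] + w_cover * c["cover"]
              + w_conspecific * c["occupied"] for c in cells]
    lo = min(scores)
    pos = [s - lo + 1e-9 for s in scores]   # shift so all weights are positive
    total = sum(pos)
    return [p / total for p in pos]

def choose_next_cell(cells, rng):
    """Draw the next movement step from the cell probabilities."""
    probs = step_probabilities(cells)
    return rng.choices(range(len(cells)), weights=probs, k=1)[0]
```

Iterating this choice over a day's movement steps yields per-individual trajectories from which quantities like PT values (proportion of time in fields) can be accumulated.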
A Bayesian Framework of Uncertainties Integration in 3D Geological Model
NASA Astrophysics Data System (ADS)
Liang, D.; Liu, X.
2017-12-01
A 3D geological model can describe complicated geological phenomena in an intuitive way, but its application may be limited by uncertain factors. Although great progress has been made over the years, most studies decompose the uncertainties of a geological model and analyze each source separately, ignoring the combined impact of multi-source uncertainties. To evaluate this synthetical uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework for uncertainty integration. With this framework, we integrate data errors, spatial randomness, and cognitive information into a posterior distribution that evaluates the synthetical uncertainty of the geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty is a kind of simulation of uncertainty propagation, and Bayesian inference accomplishes the uncertainty updating during modeling. The maximum entropy principle is effective for estimating the prior probability distribution: it ensures that the prior satisfies the constraints supplied by the given information, with minimum prejudice. In the end, we obtain a posterior distribution that evaluates the synthetical uncertainty of the geological model, representing the combined impact of all uncertain factors on its spatial structure. The framework thus provides a way to evaluate the combined impact of multi-source uncertainties on a geological model, and an approach for studying the mechanism of uncertainty propagation in geological modeling.
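The framework's combination of a maximum-entropy prior with sequential Bayesian updating can be sketched for a discrete set of candidate model structures. The likelihood vectors below are made-up placeholders standing in for the data-error and spatial-randomness terms; with no constraints beyond normalization, maximum entropy reduces to a uniform prior.

```python
import numpy as np

def max_entropy_prior(n):
    """With only the normalization constraint, the maximum-entropy
    prior over n candidate structures is uniform."""
    return np.full(n, 1.0 / n)

def bayes_update(prior, likelihood):
    """Posterior over candidate structures: normalize prior * likelihood."""
    post = np.asarray(prior, float) * np.asarray(likelihood, float)
    return post / post.sum()

# sequential integration of independent uncertainty sources
prior = max_entropy_prior(3)
post = bayes_update(prior, [0.9, 0.5, 0.1])   # placeholder data-error term
post = bayes_update(post, [0.6, 0.9, 0.9])    # placeholder spatial-randomness term
```

Applying the update once per uncertainty source mirrors the paper's idea that gradual integration simulates how uncertainty propagates and accumulates through the modeling process.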
Snover, Amy K.; Mantua, Nathan J.; Littell, Jeremy; Alexander, Michael A.; McClure, Michelle M.; Nye, Janet
2013-01-01
Increased concern over climate change is demonstrated by the many efforts to assess climate effects and develop adaptation strategies. Scientists, resource managers, and decision makers are increasingly expected to use climate information, but they struggle with its uncertainty. With the current proliferation of climate simulations and downscaling methods, scientifically credible strategies for selecting a subset for analysis and decision making are needed. Drawing on a rich literature in climate science and impact assessment and on experience working with natural resource scientists and decision makers, we devised guidelines for choosing climate-change scenarios for ecological impact assessment that recognize irreducible uncertainty in climate projections and address common misconceptions about this uncertainty. This approach involves identifying primary local climate drivers by climate sensitivity of the biological system of interest; determining appropriate sources of information for future changes in those drivers; considering how well processes controlling local climate are spatially resolved; and selecting scenarios based on considering observed emission trends, relative importance of natural climate variability, and risk tolerance and time horizon of the associated decision. The most appropriate scenarios for a particular analysis will not necessarily be the most appropriate for another due to differences in local climate drivers, biophysical linkages to climate, decision characteristics, and how well a model simulates the climate parameters and processes of interest. Given these complexities, we recommend interaction among climate scientists, natural and physical scientists, and decision makers throughout the process of choosing and using climate-change scenarios for ecological impact assessment.
NASA Astrophysics Data System (ADS)
Rakovec, O.; Weerts, A. H.; Hazenberg, P.; Torfs, P. J. J. F.; Uijlenhoet, R.
2012-09-01
This paper presents a study on the optimal setup for discharge assimilation within a spatially distributed hydrological model. The ensemble Kalman filter (EnKF) is employed to update the grid-based distributed states of an hourly, spatially distributed version of the HBV-96 model. By using a physically based model for the routing, the time delay and attenuation are modelled more realistically. The discharge and states at a given time step are assumed to depend on the previous time step only (Markov property). Synthetic and real-world experiments are carried out for the Upper Ourthe (1600 km2), a relatively quickly responding catchment in the Belgian Ardennes. We assess the impact on the forecasted discharge of (1) various sets of spatially distributed discharge gauges and (2) the filtering frequency. The results show that the hydrological forecast at the catchment outlet is improved by assimilating interior gauges; this augmentation of the observation vector improves the forecast more than increasing the updating frequency does. In terms of the model states, the EnKF procedure is found to mainly change the pdfs of the two routing model storages, even when the uncertainty in the discharge simulations is smaller than the defined observation uncertainty.
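The EnKF analysis step used to update distributed states from a discharge observation can be sketched for a single scalar observation. This is the generic stochastic EnKF update with perturbed observations, not the study's HBV-96 configuration; variable names are illustrative.

```python
import numpy as np

def enkf_update(ensemble, hx, obs, obs_var, rng):
    """Stochastic EnKF analysis step with one scalar observation.
    ensemble: (n_ens, n_state) prior states; hx: modelled discharge
    per member; obs/obs_var: observation and its error variance."""
    X = np.asarray(ensemble, float)
    hx = np.asarray(hx, float)
    n = X.shape[0]
    x_anom = X - X.mean(axis=0)
    h_anom = hx - hx.mean()
    cov_xh = x_anom.T @ h_anom / (n - 1)   # state-observation covariance
    var_h = h_anom @ h_anom / (n - 1)      # modelled-discharge variance
    K = cov_xh / (var_h + obs_var)         # Kalman gain, one entry per state
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), n)
    return X + np.outer(perturbed - hx, K)
```

Because the gain is built from the ensemble cross-covariance, assimilating an interior gauge updates every correlated state in the grid, which is why augmenting the observation vector helps more than updating more often.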
The potential of satellite spectro-imagery for monitoring CO2 emissions from large cities
NASA Astrophysics Data System (ADS)
Broquet, Grégoire; Bréon, François-Marie; Renault, Emmanuel; Buchwitz, Michael; Reuter, Maximilian; Bovensmann, Heinrich; Chevallier, Frédéric; Wu, Lin; Ciais, Philippe
2018-02-01
This study assesses the potential of 2 to 10 km resolution imagery of CO2 concentrations retrieved from the shortwave infrared measurements of a space-borne passive spectrometer for monitoring the spatially integrated emissions from the Paris area. Such imagery could be provided by missions similar to CarbonSat, which was studied as a candidate Earth Explorer 8 mission by the European Space Agency (ESA). This assessment is based on observing system simulation experiments (OSSEs) with an atmospheric inversion approach at city scale. The inversion system solves for hourly city CO2 emissions and natural fluxes, or for these fluxes per main anthropogenic sector or ecosystem, during the 6 h before a given satellite overpass. These 6 h correspond to the period during which emissions produce CO2 plumes that can be identified on the image from this overpass. The statistical framework of the inversion accounts for the existence of some prior knowledge with 50 % uncertainty on the hourly or sectorial emissions, and with ˜ 25 % uncertainty on the 6 h mean emissions, from an inventory based on energy use and carbon fuel consumption statistics. The link between the hourly or sectorial emissions and the vertically integrated column of CO2 observed by the satellite is simulated using a coupled flux and atmospheric transport model. This coupled model is built with the information on the spatial and temporal distribution of emissions from the emission inventory produced by the local air-quality agency (Airparif) and a 2 km horizontal resolution atmospheric transport model. Tests are conducted for different realistic simulations of the spatial coverage, resolution, precision and accuracy of the imagery from sun-synchronous polar-orbiting missions, corresponding to the specifications of CarbonSat and Sentinel-5 or extrapolated from these specifications. 
First, OSSEs are conducted with a rather optimistic configuration in which the inversion system is perfectly informed about the statistics of the limited number of error sources. These OSSEs indicate that the image resolution has to be finer than 4 km to decrease the uncertainty in the 6 h mean emissions by more than 50 %. More complex experiments assess the impact of more realistic error estimates that current inversion methods do not properly account for, in particular systematic measurement errors with spatially correlated patterns. These experiments highlight the difficulty of improving current knowledge of CO2 emissions for urban areas like Paris with CO2 observations from satellites, and call for further technological innovation both in the remote sensing of vertically integrated columns of CO2 and in the inversion systems that exploit such data.
NASA Astrophysics Data System (ADS)
Thomas Steven Savage, James; Pianosi, Francesca; Bates, Paul; Freer, Jim; Wagener, Thorsten
2016-11-01
Where high-resolution topographic data are available, modelers are faced with the decision of whether it is better to spend computational resource on resolving topography at finer resolutions or on running more simulations to account for various uncertain input factors (e.g., model parameters). In this paper we apply global sensitivity analysis to explore how influential the choice of spatial resolution is when compared to uncertainties in the Manning's friction coefficient parameters, the inflow hydrograph, and those stemming from the coarsening of topographic data used to produce Digital Elevation Models (DEMs). We apply the hydraulic model LISFLOOD-FP to produce several temporally and spatially variable model outputs that represent different aspects of flood inundation processes, including flood extent, water depth, and time of inundation. We find that the most influential input factor for flood extent predictions changes during the flood event, starting with the inflow hydrograph during the rising limb before switching to the channel friction parameter during peak flood inundation, and finally to the floodplain friction parameter during the drying phase of the flood event. Spatial resolution and uncertainty introduced by resampling topographic data to coarser resolutions are much more important for water depth predictions, which are also sensitive to different input factors spatially and temporally. Our findings indicate that the sensitivity of LISFLOOD-FP predictions is more complex than previously thought. Consequently, the input factors that modelers should prioritize will differ depending on the model output assessed, and the location and time of when and where this output is most relevant.
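The trade-off the study explores rests on a variance-based main-effect (first-order) sensitivity index, Var(E[y|x])/Var(y), which can be estimated by binning one input factor and comparing conditional output means. The sketch below is a minimal illustration of that estimator only: the linear flood-extent surrogate and the parameter ranges are invented, and this is not LISFLOOD-FP or the study's actual GSA workflow.

```python
import numpy as np

def first_order_sensitivity(x, y, n_bins=20):
    """Estimate a first-order (main-effect) sensitivity index for input x
    against output y: Var_x(E[y|x]) / Var(y), via quantile binning."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(n_bins)])
    counts = np.array([(idx == b).sum() for b in range(n_bins)])
    var_cond = np.average((cond_means - y.mean()) ** 2, weights=counts)
    return var_cond / y.var()

rng = np.random.default_rng(0)
n = 20000
friction = rng.uniform(0.02, 0.10, n)  # Manning's n (hypothetical range)
inflow = rng.uniform(50.0, 500.0, n)   # peak inflow, m^3/s (hypothetical)
# Toy surrogate for flood extent: inflow dominates, friction is secondary.
extent = 2.0 * inflow + 300.0 * friction + rng.normal(0, 5, n)

s_inflow = first_order_sensitivity(inflow, extent)
s_friction = first_order_sensitivity(friction, extent)
print(s_inflow > s_friction)  # inflow should rank as more influential
```

With real model runs the same index can be computed per output (extent, depth, timing) and per time step, which is how a time-varying ranking like the one reported above emerges.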
Assessing the impact of future climate extremes on the US corn and soybean production
NASA Astrophysics Data System (ADS)
Jin, Z.
2015-12-01
Future climate change will pose major challenges to the US agricultural system, among which increasing heat stress and precipitation variability are the two major concerns. Reliable prediction of crop production in response to increasingly frequent and severe climate extremes is a prerequisite for developing adaptive strategies for agricultural risk management. However, progress has been slow on quantifying the uncertainty of such predictions at high spatial resolution. Here we assessed the risks of future climate extremes to US corn and soybean production using the Agricultural Production Systems sIMulator (APSIM) model under different climate scenarios. To quantify the uncertainty due to conceptual representations of heat, drought and flooding stress in crop models, we proposed a new algorithm-ensemble strategy in which different methods for simulating crop responses to these extreme climatic events were incorporated into APSIM. This strategy allowed us to set aside irrelevant structural differences among existing crop models and focus only on the process of interest. Future climate inputs were derived from high-spatial-resolution (12 km × 12 km) Weather Research and Forecasting (WRF) simulations under Representative Concentration Pathways 4.5 (RCP 4.5) and 8.5 (RCP 8.5). Based on crop model simulations, we analyzed the magnitude and frequency of heat, drought and flooding stress for the 21st century. We also evaluated water use efficiency and water deficit on regional scales if farmers were to boost yield by applying more fertilizer. Finally, we proposed spatially explicit adaptation strategies for irrigation and fertilization across different management zones.
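The algorithm-ensemble idea, swapping alternative process formulations inside a single model, can be illustrated with a toy example. The three heat-stress response functions below and all their parameters are hypothetical stand-ins, not APSIM's actual routines; the point is only that the spread across formulations isolates structural uncertainty for one process.

```python
import numpy as np

# Three hypothetical formulations of a daily heat-stress yield multiplier
# (fraction of potential yield retained), swapped inside one crop model.
def threshold_linear(tmax, t_crit=34.0, slope=0.04):
    return np.clip(1.0 - slope * np.maximum(tmax - t_crit, 0.0), 0.0, 1.0)

def threshold_quadratic(tmax, t_crit=34.0, coef=0.01):
    return np.clip(1.0 - coef * np.maximum(tmax - t_crit, 0.0) ** 2, 0.0, 1.0)

def step_penalty(tmax, t_crit=36.0, penalty=0.10):
    return np.where(tmax > t_crit, 1.0 - penalty, 1.0)

rng = np.random.default_rng(1)
tmax = rng.normal(31.0, 4.0, 120)  # synthetic grain-filling season, deg C

# Seasonal stress factor = product of daily multipliers, per algorithm.
ensemble = [np.prod(f(tmax)) for f in (threshold_linear,
                                       threshold_quadratic,
                                       step_penalty)]
spread = max(ensemble) - min(ensemble)  # structural-uncertainty proxy
print(all(0.0 <= m <= 1.0 for m in ensemble))
```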
Predicting long-range transport: a systematic evaluation of two multimedia transport models.
Bennett, D H; Scheringer, M; McKone, T E; Hungerbühler, K
2001-03-15
The United Nations Environment Program has recently developed criteria to identify and restrict chemicals with a potential for persistence and long-range transport (persistent organic pollutants or POPs). There are many stakeholders involved, and the issues are not only scientific but also include social, economic, and political factors. This work focuses on one aspect of the POPs debate, the criteria for determining the potential for long-range transport (LRT). Our goal is to determine if current models are reliable enough to support decisions that classify a chemical based on the LRT potential. We examine the robustness of two multimedia fate models for determining the relative ranking and absolute spatial range of various chemicals in the environment. We also consider the effect of parameter uncertainties and the model uncertainty associated with the selection of an algorithm for gas-particle partitioning on the model results. Given the same chemical properties, both models give virtually the same ranking. However, when chemical parameter uncertainties and model uncertainties such as particle partitioning are considered, the spatial range distributions obtained for the individual chemicals overlap, preventing a distinct rank order. The absolute values obtained for the predicted spatial range or travel distance differ significantly between the two models for the uncertainties evaluated. We find that to evaluate a chemical when large and unresolved uncertainties exist, it is more informative to use two or more models and include multiple types of uncertainty. Model differences and uncertainties must be explicitly confronted to determine how the limitations of scientific knowledge impact predictions in the decision-making process.
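Why parameter uncertainty blurs an LRT ranking can be sketched with a one-compartment caricature: sample an uncertain atmospheric residence time, convert it to a characteristic travel distance L = u·τ, and compare the resulting distributions. The wind speed, half-lives and uncertainty factors below are invented, and this formula is far simpler than either multimedia model in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
u = 4.0 * 86400.0  # advection speed: 4 m/s expressed in m/day (hypothetical)

def travel_distance_samples(half_life_days, spread_factor, n=5000):
    """Characteristic travel distance L = u * tau, with the effective
    residence time tau lognormally distributed to represent parameter
    uncertainty (a one-compartment caricature, not either model)."""
    sigma = np.log(spread_factor)
    tau = half_life_days / np.log(2) * rng.lognormal(0.0, sigma, n)
    return u * tau / 1000.0  # km

chem_a = travel_distance_samples(5.0, 2.5)  # hypothetical chemical A
chem_b = travel_distance_samples(8.0, 2.5)  # hypothetical chemical B

# Median ranks follow the half-lives, but the distributions overlap,
# so a distinct rank order cannot be claimed under uncertainty.
overlap = np.percentile(chem_a, 95) > np.percentile(chem_b, 5)
print(np.median(chem_b) > np.median(chem_a), overlap)
```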
Accurate green water loads calculation using naval hydro pack
NASA Astrophysics Data System (ADS)
Jasak, H.; Gatin, I.; Vukčević, V.
2017-12-01
An extensive verification and validation of the Finite Volume based CFD software Naval Hydro, built on foam-extend, is presented in this paper for green water loads. A two-phase numerical model with advanced methods for treating the free surface is employed. Pressure loads on the horizontal deck of a Floating Production Storage and Offloading (FPSO) vessel model are compared to experimental results from [1] for three incident regular waves. Pressure peaks and time integrals of pressure are measured at ten different locations on the deck for each case. Pressure peaks and integrals are evaluated as average values over the measured incident wave periods, and the periodic uncertainty is assessed for both numerical and experimental results. A spatial and temporal discretization refinement study is performed, providing numerical discretization uncertainties.
Muis, Sanne; Güneralp, Burak; Jongman, Brenden; Aerts, Jeroen C J H; Ward, Philip J
2015-12-15
An accurate understanding of flood risk and its drivers is crucial for effective risk management. Detailed risk projections, including uncertainties, are however rarely available, particularly in developing countries. This paper presents a method that integrates recent advances in global-scale modeling of flood hazard and land change, which enables the probabilistic analysis of future trends in national-scale flood risk. We demonstrate its application to Indonesia. We develop 1000 spatially-explicit projections of urban expansion from 2000 to 2030 that account for uncertainty associated with population and economic growth projections, as well as uncertainty in where urban land change may occur. The projections show that the urban extent increases by 215%-357% (5th and 95th percentiles). Urban expansion is particularly rapid on Java, which accounts for 79% of the national increase. From 2000 to 2030, increases in exposure will elevate flood risk by, on average, 76% and 120% for river and coastal floods. While sea level rise will further increase the exposure-induced trend by 19%-37%, the response of river floods to climate change is highly uncertain. However, as urban expansion is the main driver of future risk, the implementation of adaptation measures is increasingly urgent, regardless of the wide uncertainty in climate projections. Using probabilistic urban projections, we show that spatial planning can be a very effective adaptation strategy. Our study emphasizes that global data can be used successfully for probabilistic risk assessment in data-scarce countries. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhu, Q.; Xu, Y. P.; Gu, H.
2014-12-01
Traditionally, regional frequency analysis methods were developed for stationary environmental conditions. Nevertheless, recent studies have identified significant changes in hydrological records, leading to the 'death' of stationarity. In addition, uncertainty in hydrological frequency analysis is persistent. This study aims to investigate the impact of one of the most important uncertainty sources, parameter uncertainty, together with non-stationarity, on design rainfall depth in the Qu River Basin, East China. A spatial bootstrap is first proposed to analyze the uncertainty of design rainfall depth estimated by regional frequency analysis based on L-moments, and by at-site analysis. Meanwhile, a method combining generalized additive models with a 30-year moving window is employed to analyze the non-stationarity present in the extreme rainfall regime. The results show that the uncertainties of the design rainfall depth with a 100-year return period under stationary conditions, estimated by the regional spatial bootstrap, reach 15.07% and 12.22% with the GEV and PE3 distributions, respectively; at the at-site scale, the uncertainties reach 17.18% and 15.44%, respectively. Under non-stationary conditions, the uncertainties of the maximum rainfall depth (corresponding to the design rainfall depth) with 0.01 annual exceedance probability (corresponding to a 100-year return period) are 23.09% and 13.83% with GEV and PE3, respectively. Comparing the 90% confidence intervals, the uncertainty of design rainfall depth resulting from parameter uncertainty is smaller than that from non-stationary frequency analysis with GEV, but slightly larger with PE3. This study indicates that the spatial bootstrap can be successfully applied to analyze the uncertainty of design rainfall depth on both regional and at-site scales. The non-stationary analysis also shows that the differences between non-stationary quantiles and their stationary equivalents are important for decision makers in water resources management and risk management.
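The at-site version of the bootstrap uncertainty estimate can be sketched with scipy: fit a GEV to an annual-maximum record, resample with replacement, and look at the spread of the 100-year quantile. The record below is synthetic (not Qu River Basin data) and the regional L-moments machinery is omitted.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(7)

# Synthetic 60-year annual-maximum rainfall record (mm) from a known GEV.
true_dist = genextreme(c=-0.1, loc=80.0, scale=20.0)
record = true_dist.rvs(size=60, random_state=rng)

def design_depth(sample, return_period=100.0):
    """100-year design depth from a maximum-likelihood GEV fit."""
    c, loc, scale = genextreme.fit(sample)
    return genextreme.ppf(1.0 - 1.0 / return_period, c, loc=loc, scale=scale)

# Parameter uncertainty via nonparametric bootstrap of the record.
boot = np.array([design_depth(rng.choice(record, size=record.size))
                 for _ in range(200)])
lo, hi = np.percentile(boot, [5, 95])
rel_uncertainty = 100.0 * (hi - lo) / (2.0 * np.median(boot))  # half-width, %
print(round(float(rel_uncertainty), 1))
```

The spatial bootstrap of the paper resamples sites rather than years, but the quantile-spread summary is the same shape of calculation.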
Precision Requirements for Space-based XCO2 Data
NASA Technical Reports Server (NTRS)
Miller, C. E.; Crisp, D.; DeCola, P. C.; Olsen, S. C.; Randerson, J. T.; Rayner, P.; Jacob, D.J.; Jones, D.; Suntharalingam, P.
2005-01-01
Precision requirements have been determined for the column-averaged CO2 dry air mole fraction (X(sub CO2)) data products to be delivered by the Orbiting Carbon Observatory (OCO). These requirements result from an assessment of the amplitude and spatial gradients of X(sub CO2), the relationship between X(sub CO2) precision and surface CO2 flux uncertainties calculated from inversions of the X(sub CO2) data, and the effects of X(sub CO2) biases on CO2 flux inversions. Observing system simulation experiments and synthesis inversion modeling demonstrate that the OCO mission design and sampling strategy provide the means to achieve the X(sub CO2) precision requirements. The impact of X(sub CO2) biases on CO2 flux uncertainties depends on their spatial and temporal extent, since CO2 sources and sinks are inferred from regional-scale X(sub CO2) gradients. Simulated OCO sampling of the TRACE-P CO2 fields shows the ability of X(sub CO2) data to constrain CO2 flux inversions over Asia and distinguish regional fluxes from India and China.
A method for determining average beach slope and beach slope variability for U.S. sandy coastlines
Doran, Kara S.; Long, Joseph W.; Overbeck, Jacquelyn R.
2015-01-01
The U.S. Geological Survey (USGS) National Assessment of Hurricane-Induced Coastal Erosion Hazards compares measurements of beach morphology with storm-induced total water levels to produce forecasts of coastal change for storms impacting the Gulf of Mexico and Atlantic coastlines of the United States. The wave-induced water level component (wave setup and swash) is estimated by using modeled offshore wave height and period and measured beach slope (from dune toe to shoreline) through the empirical parameterization of Stockdon and others (2006). Spatial and temporal variability in beach slope leads to corresponding variability in predicted wave setup and swash. For instance, seasonal and storm-induced changes in beach slope can lead to differences on the order of 1 meter (m) in wave-induced water level elevation, making accurate specification of this parameter and its associated uncertainty essential to skillful forecasts of coastal change. A method for calculating spatially and temporally averaged beach slopes is presented here along with a method for determining total uncertainty for each 200-m alongshore section of coastline.
A spatial assessment framework for evaluating flood risk under extreme climates.
Chen, Yun; Liu, Rui; Barrett, Damian; Gao, Lei; Zhou, Mingwei; Renzullo, Luigi; Emelyanova, Irina
2015-12-15
Australian coal mines have been facing a major challenge of increasing flood risk caused by intensive rainfall events in recent years. In light of growing climate change concerns and the predicted escalation of flooding, estimating flood inundation risk becomes essential for sustainable mine water management in the Australian mining sector. This research develops a spatial multi-criteria decision making prototype for the evaluation of flood risk at a regional scale, using the Bowen Basin and its surroundings in Queensland as a case study. Spatial gridded data, including climate, hydrology, topography, vegetation and soils, were collected and processed in ArcGIS. Several indices were derived from time series of observations and spatial modeling, taking account of extreme rainfall, evapotranspiration, stream flow, potential soil water retention, elevation and slope generated from a digital elevation model (DEM), as well as drainage density and proximity extracted from a river network. These spatial indices were weighted using the analytic hierarchy process (AHP) and integrated in an AHP-based suitability assessment (AHP-SA) model under the spatial risk evaluation framework. A regional flood-risk map was delineated to represent the likely impacts of the criterion indices at different risk levels, and was verified against the maximum inundation extent detectable in a time series of remote sensing imagery. The result provides baseline information to help Bowen Basin coal mines identify and assess flood risk when developing adaptation strategies and implementing mitigation measures in the future. The framework and methodology developed in this research offer the Australian mining industry, and social and environmental studies around the world, an effective way to produce reliable assessments of flood risk for managing uncertainty in water availability under climate change. Copyright © 2015. Published by Elsevier B.V.
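The AHP weighting step is mechanical enough to sketch: criterion weights come from the principal eigenvector of a pairwise comparison matrix, and a consistency ratio (CR < 0.1 is the usual acceptance threshold) checks the expert judgments. The matrix below is illustrative only, not the values used for the Bowen Basin.

```python
import numpy as np

# Pairwise comparisons for four hypothetical flood-risk criteria
# (rainfall, drainage density, slope, soil retention), Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0, 4.0],
              [1/3, 1.0, 3.0, 2.0],
              [1/5, 1/3, 1.0, 1/2],
              [1/4, 1/2, 2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # Perron (principal) eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                    # normalized criterion weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
cr = ci / 0.90                              # random index RI = 0.90 for n = 4
print(np.round(weights, 3), round(cr, 3))
```

The weighted indices are then combined linearly per grid cell to produce the risk surface.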
NASA Astrophysics Data System (ADS)
Merker, Claire; Ament, Felix; Clemens, Marco
2017-04-01
The quantification of measurement uncertainty for rain radar data remains challenging. Radar reflectivity measurements are affected, amongst other things, by calibration errors, noise, blocking and clutter, and attenuation. Their combined impact on measurement accuracy is difficult to quantify due to incomplete process understanding and complex interdependencies. An improved quality assessment of rain radar measurements is of interest for applications in both meteorology and hydrology, for example for precipitation ensemble generation, rainfall-runoff simulations, or data assimilation for numerical weather prediction. A detailed description of the spatial and temporal structure of errors is especially beneficial in order to make best use of the areal precipitation information provided by radars. Radar precipitation ensembles are one promising approach to represent spatially variable radar measurement errors. We present a method combining ensemble radar precipitation nowcasting with data assimilation to estimate radar measurement uncertainty at each pixel. This combination of ensemble forecast and observation yields a consistent spatial and temporal evolution of the radar error field. We use an advection-based nowcasting method to generate an ensemble reflectivity forecast from initial data of a rain radar network. Subsequently, reflectivity data from single radars are assimilated into the forecast using the Local Ensemble Transform Kalman Filter. The spread of the resulting analysis ensemble provides a flow-dependent, spatially and temporally correlated reflectivity error estimate at each pixel. We present first case studies that illustrate the method using data from a high-resolution X-band radar network.
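The core idea, ensemble spread as a per-pixel error estimate, can be sketched without the full LETKF: perturb the advection of a reflectivity field and take the pixel-wise standard deviation across members. The field, shift magnitudes and ensemble size below are invented; the real system perturbs a complete nowcasting model and assimilates new radar scans, both omitted here.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic reflectivity field with a single rain cell (arbitrary units).
y, x = np.mgrid[0:64, 0:64]
field = np.exp(-((x - 20) ** 2 + (y - 32) ** 2) / 50.0)

def advect(f, shift):
    """Toy advection: shift the field by whole pixels (periodic domain)."""
    return np.roll(f, tuple(int(s) for s in shift), axis=(0, 1))

# Ensemble nowcast: perturb the advection vector to mimic motion
# uncertainty across members.
mean_shift = np.array([0, 10])
ensemble = [advect(field, mean_shift + rng.integers(-3, 4, size=2))
            for _ in range(50)]
spread = np.std(ensemble, axis=0)  # flow-dependent per-pixel error estimate

# Spread concentrates around the displaced cell, not in rain-free areas.
print(spread.max() > 5 * spread.mean())
```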
NASA Technical Reports Server (NTRS)
Dutta, Soumitra
1988-01-01
A model for approximate spatial reasoning using fuzzy logic to represent the uncertainty in the environment is presented. Algorithms are developed which can be used to reason about spatial information expressed in the form of approximate linguistic descriptions similar to the kind of spatial information processed by humans. Particular attention is given to static spatial reasoning.
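A minimal flavor of this kind of fuzzy spatial reasoning: represent a linguistic relation such as "near" as a graded membership over distance and chain relations with a t-norm (min). The membership function, its scale, and the "very" hedge below are common textbook choices, not ones prescribed by the paper.

```python
import numpy as np

def near(d, scale=100.0):
    """Fuzzy membership of 'near' as a function of distance d (metres):
    1 at zero distance, decaying smoothly (an illustrative choice)."""
    return float(np.exp(-(d / scale) ** 2))

# Linguistic facts: A is near B, B is near C (distances only roughly known).
mu_ab = near(60.0)
mu_bc = near(80.0)

# Chaining through B with min as the t-norm: the inferred support for
# "A is near C" is bounded by the weaker link.
mu_ac_inferred = min(mu_ab, mu_bc)

# Linguistic hedge: 'very near' as concentration (membership squared).
mu_very_ab = mu_ab ** 2

print(round(mu_ac_inferred, 3))
```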
NASA Astrophysics Data System (ADS)
Rasera, L. G.; Mariethoz, G.; Lane, S. N.
2017-12-01
Frequent acquisition of high-resolution digital elevation models (HR-DEMs) over large areas is expensive and difficult. Satellite-derived low-resolution digital elevation models (LR-DEMs) provide extensive coverage of Earth's surface but at coarser spatial and temporal resolutions. Although useful for large scale problems, LR-DEMs are not suitable for modeling hydrologic and geomorphic processes at scales smaller than their spatial resolution. In this work, we present a multiple-point geostatistical approach for downscaling a target LR-DEM based on available high-resolution training data and recurrent high-resolution remote sensing images. The method aims at generating several equiprobable HR-DEMs conditioned to a given target LR-DEM by borrowing small scale topographic patterns from an analogue containing data at both coarse and fine scales. An application of the methodology is demonstrated by using an ensemble of simulated HR-DEMs as input to a flow-routing algorithm. The proposed framework enables a probabilistic assessment of the spatial structures generated by natural phenomena operating at scales finer than the available terrain elevation measurements. A case study in the Swiss Alps is provided to illustrate the methodology.
Evaluating Precipitation from Orbital Data Products of TRMM and GPM over the Indian Subcontinent
NASA Astrophysics Data System (ADS)
Jayaluxmi, I.; Kumar, D. N.
2015-12-01
The rapidly growing record of microwave-based precipitation data made available from various earth observation satellites has instigated a pressing need to evaluate the associated uncertainty, which arises from different sources such as retrieval error, spatial/temporal sampling error and sensor-dependent error. In microwave remote sensing, most studies in the literature focus on gridded data products; fewer studies evaluate the uncertainty inherent in orbital data products. Evaluation of the latter is essential, as they potentially cause large uncertainties during real-time flood forecasting studies, especially at the watershed scale. The present study evaluates the uncertainty of precipitation data derived from the orbital data products of the Tropical Rainfall Measuring Mission (TRMM) satellite, namely the 2A12, 2A25 and 2B31 products. Case study results over the flood-prone Mahanadi basin, India, are analyzed for precipitation uncertainty through four facets: a) uncertainty quantification using the volumetric metrics from the contingency table [AghaKouchak and Mehran 2014]; b) error characterization using additive and multiplicative error models; c) error decomposition to identify systematic and random errors; d) comparative assessment with the orbital data from the GPM mission. The homoscedastic random errors from the multiplicative error models justify a better representation of precipitation estimates by the 2A12 algorithm. It can be concluded that although the radiometer-derived 2A12 precipitation data are known to suffer from many sources of uncertainty, spatial analysis over the case study region of India testifies that they are in excellent agreement with the reference estimates for the data period considered [Indu and Kumar 2015]. References: A. AghaKouchak and A. Mehran (2014), Extended contingency table: Performance metrics for satellite observations and climate model simulations, Water Resources Research, vol. 49, 7144-7149; J. Indu and D. Nagesh Kumar (2015), Evaluation of Precipitation Retrievals from Orbital Data Products of TRMM over a Subtropical basin in India, IEEE Transactions on Geoscience and Remote Sensing, in press, doi: 10.1109/TGRS.2015.2440338.
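Facet c), splitting total error into systematic and random parts, is commonly done by regressing the satellite estimate on the reference: the fitted line carries the systematic error, the residual scatter the random error. The sketch below uses synthetic rain rates with an invented 20 % multiplicative bias; it shows the decomposition pattern, not the study's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic reference rain rates and a satellite-like estimate with a
# 20 % multiplicative bias plus homoscedastic log-scale noise, echoing
# the multiplicative error model form (all values invented).
ref = rng.gamma(2.0, 3.0, 2000)                  # mm/h
sat = 1.2 * ref * rng.lognormal(0.0, 0.25, 2000)

# Decomposition: regress sat on ref; the fitted line captures the
# systematic error, the residual scatter is the random error.
slope, intercept = np.polyfit(ref, sat, 1)
fitted = slope * ref + intercept
total_mse = np.mean((sat - ref) ** 2)
systematic = np.mean((fitted - ref) ** 2)
random_part = np.mean((sat - fitted) ** 2)

# In-sample OLS orthogonality makes the split exact: total = sys + rand.
print(abs(total_mse - (systematic + random_part)) < 1e-6 * total_mse)
```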
Carbon storage in Chinese grassland ecosystems: Influence of different integrative methods.
Ma, Anna; He, Nianpeng; Yu, Guirui; Wen, Ding; Peng, Shunlei
2016-02-17
Accurate estimation of grassland carbon (C) storage at large scales is affected by many factors. Here, we used six methods (three spatial interpolation methods and three grassland classification methods) to estimate the C storage of Chinese grasslands based on published data from 2004 to 2014, and assessed the uncertainty resulting from the different integrative methods. The uncertainty (coefficient of variation, CV, %) of grassland C storage was approximately 4.8% across the six methods tested, and was mainly determined by soil C storage. C density and C storage to a soil depth of 100 cm were estimated to be 8.46 ± 0.41 kg C m(-2) and 30.98 ± 1.25 Pg C, respectively. Ecosystem C storage comprised 0.23 ± 0.01 Pg C (0.7%) in above-ground biomass, 1.38 ± 0.14 Pg C (4.5%) in below-ground biomass, and 29.37 ± 1.2 Pg C (94.8%) in the 0-100 cm soil layer. Carbon storage calculated by the grassland classification methods (18 grassland types) was closer to the mean value than that calculated by the spatial interpolation methods. Differences in integrative methods may partially explain the high uncertainty in C storage estimates across studies. This first evaluation demonstrates the importance of multi-methodological approaches for accurately estimating C storage in large-scale terrestrial ecosystems.
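The uncertainty metric used here is simply the coefficient of variation across method-specific estimates. The six values below are invented stand-ins chosen to resemble the reported magnitudes, not the paper's actual per-method results.

```python
import numpy as np

# Hypothetical C-storage estimates (Pg C) from six integrative methods.
estimates = np.array([30.1, 31.8, 29.9, 32.0, 30.5, 31.6])

mean_c = estimates.mean()
cv = 100.0 * estimates.std(ddof=1) / mean_c  # coefficient of variation, %
print(round(float(mean_c), 2), round(float(cv), 1))
```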
Wellman, Tristan P.; Poeter, Eileen P.
2006-01-01
Computational limitations and sparse field data often mandate use of continuum representation for modeling hydrologic processes in large‐scale fractured aquifers. Selecting appropriate element size is of primary importance because continuum approximation is not valid for all scales. The traditional approach is to select elements by identifying a single representative elementary scale (RES) for the region of interest. Recent advances indicate RES may be spatially variable, prompting unanswered questions regarding the ability of sparse data to spatially resolve continuum equivalents in fractured aquifers. We address this uncertainty of estimating RES using two techniques. In one technique we employ data‐conditioned realizations generated by sequential Gaussian simulation. For the other we develop a new approach using conditioned random walks and nonparametric bootstrapping (CRWN). We evaluate the effectiveness of each method under three fracture densities, three data sets, and two groups of RES analysis parameters. In sum, 18 separate RES analyses are evaluated, which indicate RES magnitudes may be reasonably bounded using uncertainty analysis, even for limited data sets and complex fracture structure. In addition, we conduct a field study to estimate RES magnitudes and resulting uncertainty for Turkey Creek Basin, a crystalline fractured rock aquifer located 30 km southwest of Denver, Colorado. Analyses indicate RES does not correlate to rock type or local relief in several instances but is generally lower within incised creek valleys and higher along mountain fronts. Results of this study suggest that (1) CRWN is an effective and computationally efficient method to estimate uncertainty, (2) RES predictions are well constrained using uncertainty analysis, and (3) for aquifers such as Turkey Creek Basin, spatial variability of RES is significant and complex.
Endogenous spatial attention: evidence for intact functioning in adults with autism
Grubb, Michael A.; Behrmann, Marlene; Egan, Ryan; Minshew, Nancy J.; Carrasco, Marisa; Heeger, David J.
2012-01-01
Lay Abstract Attention allows us to selectively process the vast amount of information with which we are confronted. Focusing on a certain location of the visual scene (visual spatial attention) enables the prioritization of some aspects of information while ignoring others. Rapid manipulation of the attention field (i.e., the location and spread of visual spatial attention) is a critical aspect of human cognition, and previous research on spatial attention in individuals with autism spectrum disorders (ASD) has produced inconsistent results. In a series of three experiments, we evaluated claims in the literature that individuals with ASD exhibit a deficit in voluntarily controlling the deployment and size of the spatial attention field. We measured how well participants perform a visual discrimination task (accuracy) and how quickly they do so (reaction time), with and without spatial uncertainty (i.e., the lack of predictability concerning the spatial position of the upcoming stimulus). We found that high-functioning adults with autism exhibited slower reaction times overall with spatial uncertainty, but the effects of attention on performance accuracies and reaction times were indistinguishable between individuals with autism and typically developing individuals, in all three experiments. These results provide evidence of intact endogenous spatial attention function in high-functioning adults with ASD, suggesting that atypical endogenous spatial attention cannot be a latent characteristic of autism in general. Scientific Abstract Rapid manipulation of the attention field (i.e., the location and spread of visual spatial attention) is a critical aspect of human cognition, and previous research on spatial attention in individuals with autism spectrum disorders (ASD) has produced inconsistent results. 
In a series of three psychophysical experiments, we evaluated claims in the literature that individuals with ASD exhibit a deficit in voluntarily controlling the deployment and size of the spatial attention field. We measured the spatial distribution of performance accuracies and reaction times to quantify the sizes and locations of the attention field, with and without spatial uncertainty (i.e., the lack of predictability concerning the spatial position of the upcoming stimulus). We found that high-functioning adults with autism exhibited slower reaction times overall with spatial uncertainty, but the effects of attention on performance accuracies and reaction times were indistinguishable between individuals with autism and typically developing individuals, in all three experiments. These results provide evidence of intact endogenous spatial attention function in high-functioning adults with ASD, suggesting that atypical endogenous attention cannot be a latent characteristic of autism in general. PMID:23427075
Spatial and temporal study of nitrate concentration in groundwater by means of coregionalization
D'Agostino, V.; Greene, E.A.; Passarella, G.; Vurro, M.
1998-01-01
Spatial and temporal behavior of hydrochemical parameters in groundwater can be studied using tools provided by geostatistics. The cross-variogram can be used to measure the spatial increments between observations at two given times as a function of distance (spatial structure). Taking into account the existence of such a spatial structure, two different data sets (sampled at two different times), representing concentrations of the same hydrochemical parameter, can be analyzed by cokriging in order to reduce the uncertainty of the estimation. In particular, if one of the two data sets is a subset of the other (that is, an undersampled set), cokriging allows us to study the spatial distribution of the hydrochemical parameter at that time, while also considering the statistical characteristics of the full data set established at a different time. This paper presents an application of cokriging by using temporal subsets to study the spatial distribution of nitrate concentration in the aquifer of the Lucca Plain, central Italy. Three data sets of nitrate concentration in groundwater were collected during three different periods in 1991. The first set was from 47 wells, but the second and the third are undersampled and represent 28 and 27 wells, respectively. Comparing the result of cokriging with ordinary kriging showed an improvement of the uncertainty in terms of reducing the estimation variance. The application of cokriging to the undersampled data sets reduced the uncertainty in estimating nitrate concentration and at the same time decreased the cost of the field sampling and laboratory analysis.
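The experimental cross-variogram that underpins cokriging can be computed directly from co-located samples at the two times: for each lag bin, average 0.5·(z1(i)−z1(j))·(z2(i)−z2(j)) over pairs at that separation. The 1-D transect below is synthetic, not the Lucca Plain data; a real application would also fit a model and check the coregionalization conditions.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic co-located nitrate samples at two times along a transect;
# the second field is a rescaled, noisy version of the first, so the
# two times are spatially cross-correlated (illustrative values).
x = np.sort(rng.uniform(0.0, 10.0, 80))          # well positions, km
base = np.sin(x) + 0.1 * rng.normal(size=x.size)
z1 = 20.0 + 5.0 * base                           # mg/L, time 1
z2 = 18.0 + 4.5 * base + 0.5 * rng.normal(size=x.size)  # mg/L, time 2

def cross_variogram(x, z1, z2, lags):
    """Experimental cross-variogram over distance bins [lags[i], lags[i+1])."""
    d = np.abs(x[:, None] - x[None, :])
    g = 0.5 * (z1[:, None] - z1[None, :]) * (z2[:, None] - z2[None, :])
    out = []
    for lo, hi in zip(lags[:-1], lags[1:]):
        mask = (d >= lo) & (d < hi) & (d > 0)
        out.append(g[mask].mean())
    return np.array(out)

lags = np.linspace(0.0, 3.0, 7)
gamma = cross_variogram(x, z1, z2, lags)
print(gamma[-1] > gamma[0])  # cross-variogram grows as correlation decays
```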
NASA Astrophysics Data System (ADS)
Cui, Tao; Moore, Catherine; Raiber, Matthias
2018-05-01
Modelling cumulative impacts of basin-scale coal seam gas (CSG) extraction is challenging due to the long time frames and spatial extent over which impacts occur combined with the need to consider local-scale processes. The computational burden of such models limits the ability to undertake calibration and sensitivity and uncertainty analyses. A framework is presented that integrates recently developed methods and tools to address the computational burdens of an assessment of drawdown impacts associated with rapid CSG development in the Surat Basin, Australia. The null space Monte Carlo method combined with singular value decomposition (SVD)-assisted regularisation was used to analyse the uncertainty of simulated drawdown impacts. The study also describes how the computational burden of assessing local-scale impacts was mitigated by adopting a novel combination of a nested modelling framework which incorporated a model emulator of drawdown in dual-phase flow conditions, and a methodology for representing local faulting. This combination provides a mechanism to support more reliable estimates of regional CSG-related drawdown predictions. The study indicates that uncertainties associated with boundary conditions are reduced significantly when expressing differences between scenarios. The results are analysed and distilled to enable the easy identification of areas where the simulated maximum drawdown impacts could exceed trigger points associated with legislative 'make good' requirements; trigger points require that either an adjustment in the development scheme or other measures are implemented to remediate the impact. This report contributes to the currently small body of work that describes modelling and uncertainty analyses of CSG extraction impacts on groundwater.
NASA Astrophysics Data System (ADS)
Mendoza, D. L.; Lin, J. C.; Mitchell, L.; Ehleringer, J. R.
2014-12-01
Accurate, high-resolution data on air pollutant emissions and concentrations are needed to understand human exposures and for both policy and pollutant management purposes. An important step in this process is also quantification of uncertainties. We present a spatially explicit and highly resolved emissions inventory for Salt Lake County, Utah, and trace gas concentration estimates for carbon monoxide (CO), carbon dioxide (CO2), nitrogen oxides (NOx) and fine particles (PM2.5) within Salt Lake City. We assess the validity of this approach by comparing measured concentrations against simulated values derived from combining the emissions inventory with an atmospheric model. The emissions inventory for the criteria pollutants was constructed using the 2011 National Emissions Inventory (NEI). The spatial and temporal allocation methods from the Emission Modeling Clearinghouse data set are used to downscale the NEI data from annual to hourly scales and from county-level to 500 m x 500 m resolution. Onroad mobile source emissions were estimated by combining a bottom-up emissions calculation approach for large roadway links with a top-down spatial allocation approach for other roadways. Vehicle activity data for road links were derived from automatic traffic recorder data. The emissions inventory for CO2 was obtained from the Hestia emissions data product at an hourly, building, facility, and road link resolution. The AERMOD and CALPUFF dispersion models were used to transport emissions and estimate air pollutant concentrations at an hourly temporal and 500 m x 500 m spatial resolution. Modeled results were compared against measurements from a mobile lab equipped with trace gas instrumentation traveling on pre-determined routes in the Salt Lake City area. The comparison between the two approaches to concentration estimation highlights spatial locations and hours of high variability/uncertainty. 
Results presented here will inform understanding of variability and uncertainty in emissions and concentrations to better inform future policy. This work will also facilitate the development of a systematic approach to incorporate measurement data and models to better inform estimates of pollutant concentrations that determine the extent to which urban populations are exposed to adverse air quality.
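The spatial and temporal allocation step (downscaling an annual county total to an hourly grid using surrogate weights and a diurnal profile) can be illustrated with a minimal numpy sketch. The surrogate field, diurnal profile, and annual total are hypothetical, and one representative day stands in for the year; a real application would use the Emission Modeling Clearinghouse profiles:

```python
import numpy as np

# Downscale an annual county-level emission total to an hourly 500 m grid
# using a spatial surrogate (e.g. road density) and a diurnal profile.
def downscale(annual_total, spatial_surrogate, hourly_profile):
    w = spatial_surrogate / spatial_surrogate.sum()  # spatial weights sum to 1
    p = hourly_profile / hourly_profile.sum()        # temporal weights sum to 1
    # result[h, i, j] = annual_total * p[h] * w[i, j]
    return annual_total * p[:, None, None] * w[None, :, :]

rng = np.random.default_rng(1)
surrogate = rng.random((4, 4))                       # toy 4x4 grid of road density
profile = np.ones(24)
profile[7:9] = 3.0                                   # morning rush hours
profile[16:18] = 3.0                                 # evening rush hours
hourly = downscale(8760.0, surrogate, profile)       # e.g. 8760 t/yr
```

Because both weight vectors are normalized, mass is conserved: summing the result over all hours and cells returns the annual total.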
MODIS land cover uncertainty in regional climate simulations
NASA Astrophysics Data System (ADS)
Li, Xue; Messina, Joseph P.; Moore, Nathan J.; Fan, Peilei; Shortridge, Ashton M.
2017-12-01
MODIS land cover datasets are used extensively across the climate modeling community, but inherent uncertainties and associated propagating impacts are rarely discussed. This paper modeled uncertainties embedded within the annual MODIS Land Cover Type (MCD12Q1) products and propagated these uncertainties through the Regional Atmospheric Modeling System (RAMS). First, land cover uncertainties were modeled using pixel-based trajectory analyses from a time series of MCD12Q1 for Urumqi, China. Second, alternative land cover maps were produced based on these categorical uncertainties and passed into RAMS. Finally, simulations from RAMS were analyzed temporally and spatially to reveal impacts. Our study found that MCD12Q1 struggles to discriminate between grasslands and croplands or grasslands and barren land in this study area. Such categorical uncertainties have significant impacts on regional climate model outputs. All climate variables examined demonstrated impacts across the various regions, with latent heat flux most affected, at a domain-average magnitude of 4.32 W/m2. Impacted areas were spatially connected to locations of greater land cover uncertainty. Both biophysical characteristics and soil moisture settings associated with land cover types contribute to the variations among simulations. These results indicate that formal land cover uncertainty analysis should be included in MCD12Q1-fed climate modeling as a routine procedure.
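The second step, generating alternative land cover maps from categorical uncertainty, amounts to drawing realizations from per-pixel class probabilities. A minimal sketch, assuming probabilities estimated from the frequency of each class in a pixel's multi-year MCD12Q1 trajectory (the grid, classes, and probabilities are invented for the example):

```python
import numpy as np

def draw_realizations(probs, n_maps, seed=0):
    # probs: (rows, cols, classes), summing to 1 over the last axis.
    # Inverse-CDF sampling: a uniform draw is compared against the
    # cumulative class probabilities to pick a class index per pixel.
    rng = np.random.default_rng(seed)
    r, c, k = probs.shape
    cdf = probs.cumsum(axis=-1)
    u = rng.random((n_maps, r, c, 1))
    return (u > cdf).sum(axis=-1)          # class index per pixel per map

# Toy example: every pixel is grassland 70% of years, cropland 30%
probs = np.zeros((2, 2, 3))
probs[..., 0] = 0.7                        # class 0: grassland
probs[..., 1] = 0.3                        # class 1: cropland
maps = draw_realizations(probs, n_maps=1000)
frac_crop = (maps == 1).mean()             # close to 0.3 by construction
```

Each realization is one plausible land cover map; feeding an ensemble of such maps into the climate model propagates the categorical uncertainty to the simulated fields.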
Bottom friction. A practical approach to modelling coastal oceanography
NASA Astrophysics Data System (ADS)
Bolanos, Rodolfo; Jensen, Palle; Kofoed-Hansen, Henrik; Tornsfeldt Sørensen, Jacob
2017-04-01
Coastal processes imply the interaction of the atmosphere, the sea, the coastline and the bottom. The spatial gradients in this area are normally large, induced by orographic and bathymetric features. Although it is nowadays possible to obtain high-resolution bathymetry, the details of the seabed, e.g. sediment type and the presence of biological material and living organisms, are not available. Additionally, these properties, as well as the bathymetry, can be highly dynamic. These bottom characteristics are very important for describing the boundary layer of currents and waves and control to a large degree the dissipation of flows. Bottom friction is thus typically a calibration parameter in numerical modelling of coastal processes. In this work, we assess this process and put it into the context of other physical process uncertainties influencing wind waves and currents in coastal areas. A case study in the North Sea is used, particularly the west coast of Denmark, where water depths of less than 30 m cover a wide fringe along the coast and several offshore wind farm developments are being carried out. We use the hydrodynamic model MIKE 21 HD and the spectral wave model MIKE 21 SW to simulate atmosphere- and tide-induced flows and wind wave generation and propagation. Both models represent the state of the art and have been developed for flexible meshes, which are ideal for coastal oceanography as they can better represent coastlines and allow a variable spatial resolution within the domain. Sensitivity tests of bottom friction formulations are carried out and put into the context of other processes (e.g. model forcing uncertainties, wind and wave interactions, wind drag coefficient). Additionally, a map of varying bottom properties is generated based on a literature survey to explore the impact of the spatial variability. Different approaches are assessed in order to establish a best practice regarding bottom friction in coastal oceanographic modelling. 
Its contribution is also assessed during storm conditions, when its most evident impact is expected because waves are affected by bottom processes over larger areas, making bottom dissipation more efficient. We use available wave and current measurements in the North Sea (e.g. the Ekofisk and FINO platforms and other coastal stations along the west coast of Denmark) to quantify the importance of processes influencing waves and currents in the coastal zone and to put bottom friction into context alongside other process uncertainties.
Assessing uncertainties of GRACE-derived terrestrial water-storage fields
NASA Astrophysics Data System (ADS)
Ferreira, Vagner; Montecino, Henry
2017-04-01
Space-borne sensors are producing large volumes of remotely sensed data and, consequently, different measurements of the same field are available to end users. Furthermore, different satellite processing centres are producing extensive products based on the data of a single mission. This is exactly the case with the Gravity Recovery and Climate Experiment (GRACE) mission, which has been monitoring terrestrial water storage (TWS) since April 2002, while the Centre for Space Research (CSR), the Jet Propulsion Laboratory (JPL), the GeoForschungsZentrum (GFZ), and the Groupe de Recherche de Géodésie Spatiale (GRGS), among others, provide individual monthly solutions in the form of Stokes's coefficients. The TWS maps inverted from Stokes's coefficients are being used in many applications and, because no ground truth data exist, their uncertainties are unknown. An assessment of the uncertainties associated with these different products is mandatory in order to guide data producers and to help users choose the best dataset. However, the estimation of uncertainties of space-borne products often relies on ground truth data, and in the absence of such data, an assessment of their quality is a challenge. A recent study (Ferreira et al. 2016) evaluated the quality of each processing centre (CSR, JPL, GFZ, and GRGS) by estimating their individual uncertainties using a generalised formulation of the three-cornered hat (TCH) method. The TCH results for the study period of August 2002 to June 2014 indicate that, on a global scale, the CSR, GFZ, GRGS, and JPL solutions present uncertainties of 9.4, 13.7, 14.8, and 13.2 mm, respectively. On a basin scale, the overall good performance of the CSR is observed at 91 river basins. The TCH-based results are confirmed by a comparison with an ensemble solution from the four GRACE processing centres. 
Reference: Ferreira VG, Montecino HDC, Yakubu CI and Heck B (2016) Uncertainties of the Gravity Recovery and Climate Experiment time-variable gravity-field solutions based on three-cornered hat method. Journal of Applied Remote Sensing, 10(1), 015015. doi: 10.1117/1.JRS.10.015015
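The classic form of the three-cornered hat method is simple to state: for three series observing the same signal with mutually uncorrelated errors, the variance of each pairwise difference is the sum of the two individual error variances, so the three pairwise variances determine the three unknowns. A minimal sketch on synthetic TWS-like series (the generalised formulation used by Ferreira et al. also handles error covariances, which this version omits):

```python
import numpy as np

def three_cornered_hat(x1, x2, x3):
    # Pairwise difference variances: the common signal cancels, so
    # var(xi - xj) = vi + vj for uncorrelated errors. Solve for vi.
    s12 = np.var(x1 - x2)
    s13 = np.var(x1 - x3)
    s23 = np.var(x2 - x3)
    v1 = 0.5 * (s12 + s13 - s23)
    v2 = 0.5 * (s12 + s23 - s13)
    v3 = 0.5 * (s13 + s23 - s12)
    return v1, v2, v3

# Synthetic common signal observed by three "processing centres"
# with noise standard deviations 1, 2, and 3
rng = np.random.default_rng(42)
n = 200_000
signal = np.cumsum(rng.standard_normal(n)) * 0.01   # slowly varying truth
x1 = signal + rng.standard_normal(n) * 1.0
x2 = signal + rng.standard_normal(n) * 2.0
x3 = signal + rng.standard_normal(n) * 3.0
v1, v2, v3 = three_cornered_hat(x1, x2, x3)
```

The square roots of the recovered variances approximate the injected noise levels without ever observing the truth, which is exactly why TCH is attractive when no ground truth exists.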
NASA Astrophysics Data System (ADS)
Alexander, R. B.; Boyer, E. W.; Schwarz, G. E.; Smith, R. A.
2013-12-01
Estimating water and material stores and fluxes in watershed studies is frequently complicated by uncertainties in quantifying hydrological and biogeochemical effects of factors such as land use, soils, and climate. Although these process-related effects are commonly measured and modeled in separate catchments, researchers are especially challenged by their complexity across catchments and diverse environmental settings, leading to a poor understanding of how model parameters and prediction uncertainties vary spatially. To address these concerns, we illustrate the use of Bayesian hierarchical modeling techniques with a dynamic version of the spatially referenced watershed model SPARROW (SPAtially Referenced Regression On Watershed attributes). The dynamic SPARROW model is designed to predict streamflow and other water cycle components (e.g., evapotranspiration, soil and groundwater storage) for monthly varying hydrological regimes, using mechanistic functions, mass conservation constraints, and statistically estimated parameters. In this application, the model domain includes nearly 30,000 NHD (National Hydrography Dataset) stream reaches and their associated catchments in the Susquehanna River Basin. We report the results of our comparisons of alternative models of varying complexity, including models with different explanatory variables as well as hierarchical models that account for spatial and temporal variability in model parameters and variance (error) components. The model errors are evaluated for changes with season and catchment size and correlations in time and space. The hierarchical models consist of a two-tiered structure in which climate forcing parameters are modeled as random variables, conditioned on watershed properties. Quantification of spatial and temporal variations in the hydrological parameters and model uncertainties in this approach leads to more efficient (lower variance) and less biased model predictions throughout the river network. 
Moreover, predictions of water-balance components are reported according to probabilistic metrics (e.g., percentiles, prediction intervals) that include both parameter and model uncertainties. These improvements in predictions of streamflow dynamics can inform the development of more accurate predictions of spatial and temporal variations in biogeochemical stores and fluxes (e.g., nutrients and carbon) in watersheds.
NASA Technical Reports Server (NTRS)
Talpe, Matthieu J.; Nerem, R. Steven; Forootan, Ehsan; Schmidt, Michael; Lemoine, Frank G.; Enderlin, Ellyn M.; Landerer, Felix W.
2017-01-01
We construct long-term time series of Greenland and Antarctic ice sheet mass change from satellite gravity measurements. A statistical reconstruction approach is developed based on a principal component analysis (PCA) to combine high-resolution spatial modes from the Gravity Recovery and Climate Experiment (GRACE) mission with the gravity information from conventional satellite tracking data. Uncertainties of this reconstruction are rigorously assessed; they include temporal limitations for short GRACE measurements, spatial limitations for the low-resolution conventional tracking data measurements, and limitations of the estimated statistical relationships between low- and high-degree potential coefficients reflected in the PCA modes. Trends of mass variations in Greenland and Antarctica are assessed against a number of previous studies. The resulting time series for Greenland show a higher rate of mass loss than other methods before 2000, while the Antarctic ice sheet appears heavily influenced by interannual variations.
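The core idea of the reconstruction, relating coarse (low-degree) observations to the leading PCA modes of the well-observed high-resolution period, can be sketched on synthetic data. This illustrates the statistical principle only, not the published algorithm; the epoch counts, grid size, and coarse-sampling scheme are invented:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic rank-2 "mass change" field: 120 epochs x 500 grid cells
t = np.linspace(0.0, 10.0, 120)
modes = rng.standard_normal((2, 500))          # spatial patterns
amps = np.vstack([np.sin(t), 0.5 * t])         # temporal amplitudes
X = amps.T @ modes                             # full-resolution field

# PCA of the well-observed (e.g. GRACE-era) later epochs
train = X[60:]
mean = train.mean(axis=0)
_, _, Vt = np.linalg.svd(train - mean, full_matrices=False)
EOFs = Vt[:2]                                  # leading spatial modes

# For earlier epochs only a coarse subset of cells is "observed"
# (every 10th cell, standing in for low-degree tracking-data information)
coarse = np.arange(0, 500, 10)
Xc = X[:60, coarse] - mean[coarse]

# Least-squares fit of PC amplitudes from the coarse observations,
# then reconstruct the full-resolution field from the EOFs
A, *_ = np.linalg.lstsq(EOFs[:, coarse].T, Xc.T, rcond=None)
X_rec = A.T @ EOFs + mean
err = np.abs(X_rec - X[:60]).max()
```

Because the synthetic field truly lies in the span of two modes, the coarse observations pin down the amplitudes almost exactly; for real data the recovery is only approximate, which is one of the uncertainty sources the abstract assesses.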
Effects of input uncertainty on cross-scale crop modeling
NASA Astrophysics Data System (ADS)
Waha, Katharina; Huth, Neil; Carberry, Peter
2014-05-01
The quality of data on climate, soils and agricultural management in the tropics is in general low, or data are scarce, leading to uncertainty in process-based modeling of cropping systems. Process-based crop models are common tools for simulating crop yields and crop production in climate change impact studies, studies on mitigation and adaptation options, or food security studies. Crop modelers are concerned about input data accuracy as this, together with an adequate representation of plant physiology processes and the choice of model parameters, is key to a reliable simulation. For example, assuming an error in measurements of air temperature, radiation and precipitation of ± 0.2°C, ± 2 % and ± 3 % respectively, Fodor & Kovacs (2005) estimate that this translates into an uncertainty of 5-7 % in yield and biomass simulations. In our study we seek to answer the following questions: (1) are there important uncertainties in the spatial variability of simulated crop yields on the grid-cell level displayed on maps, (2) are there important uncertainties in the temporal variability of simulated crop yields on the aggregated, national level displayed in time series, and (3) how does the accuracy of different soil, climate and management information influence the simulated crop yields in two crop models designed for use at different spatial scales? The study will help to determine whether more detailed information improves the simulations and to advise model users on the uncertainty related to input data. We analyse the performance of the point-scale crop model APSIM (Keating et al., 2003) and the global-scale crop model LPJmL (Bondeau et al., 2007) with different climate information (monthly and daily) and soil conditions (global soil map and African soil map) under different agricultural management (uniform and variable sowing dates) for the low-input maize-growing areas of Burkina Faso, West Africa. 
We test the models' response to different levels of input data from very little to very detailed information, and compare the models' abilities to represent the spatial and temporal variability in crop yields. We display the uncertainty in crop yield simulations from different input data and crop models in Taylor diagrams, which provide a graphical summary of the similarity between simulations and observations (Taylor, 2001). The observed spatial variability can be represented well by both models (R=0.6-0.8), but APSIM predicts higher spatial variability than LPJmL due to its sensitivity to soil parameters. Simulations with the same crop model, climate and sowing dates have similar statistics and therefore similar skill in reproducing the observed spatial variability. Soil data are less important for the skill of a crop model in reproducing the observed spatial variability. However, the uncertainty in simulated spatial variability from the two crop models is larger than that from input data settings, and APSIM is more sensitive to input data than LPJmL. Even with a detailed, point-scale crop model and detailed input data it is difficult to capture the complexity and diversity of maize cropping systems.
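A Taylor diagram summarises three linked statistics: the correlation R, the standard deviations of simulation and observation, and the centred RMS difference. A minimal sketch on synthetic series, showing the law-of-cosines identity that makes a single 2-D diagram possible:

```python
import numpy as np

def taylor_stats(sim, obs):
    # Statistics summarised by a Taylor diagram (Taylor, 2001):
    # correlation, the two standard deviations, and the centred RMS difference.
    R = np.corrcoef(sim, obs)[0, 1]
    sf, sr = sim.std(), obs.std()
    E = np.sqrt(np.mean(((sim - sim.mean()) - (obs - obs.mean())) ** 2))
    return R, sf, sr, E

# Hypothetical observed yields and a model with some skill
rng = np.random.default_rng(3)
obs = rng.standard_normal(500)
sim = 0.8 * obs + 0.4 * rng.standard_normal(500)
R, sf, sr, E = taylor_stats(sim, obs)

# The diagram works because of the law-of-cosines identity
#   E^2 = sf^2 + sr^2 - 2 * sf * sr * R
identity_gap = E**2 - (sf**2 + sr**2 - 2 * sf * sr * R)
```

The identity holds exactly (up to floating point), so correlation, standard deviation ratio, and centred RMS difference can all be read off one polar plot.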
William Salas; Steve Hagen
2013-01-01
This presentation will provide an overview of an approach for quantifying uncertainty in spatial estimates of carbon emission from land use change. We generate uncertainty bounds around our final emissions estimate using a randomized, Monte Carlo (MC)-style sampling technique. This approach allows us to combine uncertainty from different sources without making...
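A randomized, MC-style combination of uncertainty sources can be sketched as follows. The emission model (cleared area times carbon density times emitted fraction) and all distributions are hypothetical stand-ins, since the presentation's actual inputs are not given:

```python
import numpy as np

# Monte Carlo combination of uncertainty from independent input sources:
# sample each input from its own distribution, propagate through the
# emission model, and read uncertainty bounds off the output percentiles.
rng = np.random.default_rng(11)
n = 100_000
area = rng.normal(1000.0, 50.0, n)      # ha cleared (about +/- 5%)
density = rng.normal(150.0, 30.0, n)    # t C / ha (about +/- 20%)
factor = rng.uniform(0.4, 0.6, n)       # fraction of carbon emitted

emissions = area * density * factor      # t C, one value per draw
lo, mid, hi = np.percentile(emissions, [2.5, 50, 97.5])
```

The 2.5th and 97.5th percentiles give a 95% uncertainty bound around the central emission estimate; the approach extends naturally to spatially varying and non-Gaussian inputs.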
NASA Astrophysics Data System (ADS)
Poerbandono
2017-07-01
This paper assesses the presence of navigational hazards due to underestimation of charted depths originating from the establishment of a sea-level-related reference plane, i.e. datum. The study domain is situated in one of Indonesia's densest marine traffic areas, the SW Java Sea, Indonesia. The assessment is based on the comparison of the authorized Chart Datum (CD), uniformly located at 0.6 m below Mean Sea Level (MSL), and a spatially varying Lowest Astronomical Tide (LAT) generated for the purpose of this research. Hazards are considered here as the deviation of LAT from CD and quantified as the ratio of the LAT-CD deviation to the allowable Total Vertical Uncertainty (TVU), i.e. the international standard for accuracy of depth information on nautical charts. Underestimation of charted depth is expected where LAT falls below CD. Such a risk magnifies with decreasing depths, as well as with increasing volume of traffic and draught of the vessels. It is found that most of the domain lies within the risk-free zone for the use of a uniform CD. Fractions of 0.08 and 0.19 of the area lie in zones where the uncertainty of CD contributes 50% and 30% of the Total Vertical Uncertainty, respectively. These are zones where the hazard to navigation is expected to increase due to an underestimated lowest tidal level.
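The hazard ratio described above can be sketched as follows. The LAT grid and depth are hypothetical, and the TVU formula is the IHO S-44 form sqrt(a^2 + (b*d)^2) with Order 1a coefficients assumed, since the paper's exact parameters are not stated:

```python
import numpy as np

# Hypothetical grid of LAT below MSL (m); CD is uniformly 0.6 m below MSL.
lat_below_msl = np.array([[0.45, 0.62, 0.75],
                          [0.58, 0.80, 0.95]])
cd_below_msl = 0.6

# Positive deviation: LAT falls below CD, so the true lowest water level
# drops below the chart's reference and less depth is available than charted.
deviation = lat_below_msl - cd_below_msl

# Allowable TVU, assumed here as IHO S-44 Order 1a: sqrt(a^2 + (b*d)^2)
a, b, depth = 0.5, 0.013, 10.0           # a in m, b dimensionless, depth in m
tvu = np.sqrt(a**2 + (b * depth) ** 2)

# Hazard ratio: what fraction of the TVU budget the datum deviation consumes
ratio = np.clip(deviation, 0.0, None) / tvu
zone_50 = ratio >= 0.5                   # CD uncertainty takes >= 50% of TVU
zone_30 = (ratio >= 0.3) & (ratio < 0.5)
```

Mapping these boolean zones over the study area reproduces the kind of classification reported in the abstract (zones where datum uncertainty consumes 50% or 30% of the vertical uncertainty budget).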
The Key Role of Eyewitnesses in Rapid Impact Assessment of Global Earthquakes
NASA Astrophysics Data System (ADS)
Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.; Etivant, C.; Frobert, L.; Godey, S.
2014-12-01
Uncertainties in rapid impact assessments of global earthquakes are intrinsically large because they rely on 3 main elements (ground motion prediction models, building stock inventory and related vulnerability) whose values and/or spatial variations are poorly constrained. Furthermore, variations of hypocentral location and magnitude within their respective uncertainty domains can lead to significantly different shaking levels for centers of population and change the scope of the disaster. We present the strategy and methods implemented at the Euro-Med Seismological Centre (EMSC) to rapidly collect in-situ observations on earthquake effects from eyewitnesses in order to reduce the uncertainties of rapid earthquake impact assessment. It comprises crowdsourced information (online questionnaires, pictures) as well as information derived from real-time analysis of web traffic (flashsourcing technique), and more recently the deployment of QCN (Quake Catcher Network) low-cost sensors. We underline the importance of merging results of different methods to improve the performance and reliability of collected data. We try to better understand and respond to public demands and expectations after earthquakes through improved information services and diversification of information tools (social networks, smartphone apps, browser add-ons…), which, in turn, drive more eyewitnesses to our services and improve data collection. We will notably present our LastQuake Twitter feed (Quakebot) and smartphone applications (iOS and Android) which only report earthquakes that matter to the public and authorities, i.e. felt and damaging earthquakes identified thanks to citizen-generated information.
Presentation of uncertainties on web platforms for climate change information
NASA Astrophysics Data System (ADS)
Nocke, Thomas; Wrobel, Markus; Reusser, Dominik
2014-05-01
Climate research has a long tradition, yet there is still uncertainty about the specific effects of climate change. One of the key tasks is - beyond discussing climate change and its impacts in specialist groups - to present these to a wider audience. In that respect, decision-makers in the public sector as well as directly affected professional groups need to obtain easy-to-understand information. These groups are not made up of specialist scientists. This gives rise to the challenge that the scientific information must be presented in a commonly understandable way while the complexity of the underlying science is still conveyed. In particular, this requires the explicit representation of spatial and temporal uncertainty information for lay people. Within this talk/poster we survey how climate change and climate impact uncertainty information is presented on various web-based climate service platforms. We outline how the specifics of this medium make it challenging to find adequate and readable representations of uncertainties. First, we introduce a multi-step approach to communicating uncertainty based on a typology distinguishing between epistemic, natural stochastic, and human reflexive uncertainty. Then, we compare existing concepts and representations for uncertainty communication with current practices on web-based platforms, including our own solutions within the web platforms ClimateImpactsOnline and ci:grasp. Finally, we review surveys on how spatial uncertainty visualization techniques are perceived by untrained users.
Dieye, A.M.; Roy, David P.; Hanan, N.P.; Liu, S.; Hansen, M.; Toure, A.
2012-01-01
Spatially explicit land cover land use (LCLU) change information is needed to drive biogeochemical models that simulate soil organic carbon (SOC) dynamics. Such information is increasingly being mapped using remotely sensed satellite data with classification schemes and uncertainties constrained by the sensing system, classification algorithms and land cover schemes. In this study, automated LCLU classification of multi-temporal Landsat satellite data were used to assess the sensitivity of SOC modeled by the Global Ensemble Biogeochemical Modeling System (GEMS). The GEMS was run for an area of 1560 km2 in Senegal under three climate change scenarios with LCLU maps generated using different Landsat classification approaches. This research provides a method to estimate the variability of SOC, specifically the SOC uncertainty due to satellite classification errors, which we show is dependent not only on the LCLU classification errors but also on where the LCLU classes occur relative to the other GEMS model inputs.
Sources of Uncertainty in the Prediction of LAI / fPAR from MODIS
NASA Technical Reports Server (NTRS)
Dungan, Jennifer L.; Ganapol, Barry D.; Brass, James A. (Technical Monitor)
2002-01-01
To explicate the sources of uncertainty in the prediction of biophysical variables over space, consider a general equation of the form z = f(y; B), where z is a variable with values on some nominal, ordinal, interval or ratio scale; y is a vector of input variables; u is the spatial support of y and z; x and u are the spatial locations of y and z, respectively; f is a model and B is the vector of the parameters of this model. Any y or z has a value and a spatial extent, which is called its support. Viewed in this way, categories of uncertainty arise from variable (e.g. measurement), parameter, positional, support and model (e.g. structural) sources. The prediction of Leaf Area Index (LAI) and the fraction of absorbed photosynthetically active radiation (fPAR) are examples of z variables predicted using model(s) as a function of y variables and spatially constant parameters. The MOD15 algorithm is an example of f, called f_1, with parameters including those defined by one of six biome types and solar and view angles. The Leaf Canopy Model (LCM2), a nested model that combines leaf radiative transfer with a full canopy reflectance model through the phase function, is a simpler though similar radiative transfer approach to f_1. In a previous study, MOD15 and LCM2 gave similar results for the broadleaf forest biome. Differences between these two models can be used to consider the structural uncertainty in prediction results. In an effort to quantify each of the five sources of uncertainty and rank their relative importance for the LAI/fPAR prediction problem, we used recent data for an EOS Core Validation Site in the broadleaf biome with coincident surface reflectance, vegetation index, fPAR and LAI products from the Moderate Resolution Imaging Spectroradiometer (MODIS). Uncertainty due to support on the input reflectance variable was characterized using Landsat ETM+ data. 
Input uncertainties were propagated through the LCM2 model and compared with published uncertainties from the MOD15 algorithm.
Assessing climate change impacts on water resources in remote mountain regions
NASA Astrophysics Data System (ADS)
Buytaert, Wouter; De Bièvre, Bert
2013-04-01
From a water resources perspective, remote mountain regions are often considered a basket case. They are regions where poverty is often interlocked with multiple threats to water supply, data scarcity, and high uncertainties. In these environments, it is paramount to generate locally relevant knowledge about water resources and how they impact local livelihoods. This is often problematic. Existing environmental data collection tends to be geographically biased towards more densely populated regions and prioritized towards strategic economic activities. Data may also be locked behind institutional and technological barriers. These issues create a "knowledge trap" for data-poor regions, which is especially acute in remote and hard-to-reach mountain regions. We present lessons learned from a decade of water resources research in remote mountain regions of the Andes, Africa and South Asia. We review the entire tool chain of assessing climate change impacts on water resources, including the interrogation and downscaling of global circulation models, translating climate variables into water availability and access, and assessing local vulnerability. In global circulation models, mountain regions often stand out as regions of high uncertainty and lack of agreement on future trends. This is partly a technical artifact of the limited resolution and representation of mountain topography, but it also highlights fundamental uncertainties in climate impacts on mountain climate. This problem also affects downscaling efforts, because regional climate models must be run at very high spatial resolution to resolve local gradients, which is computationally very expensive. At the same time, statistical downscaling methods may fail to find significant relations between local climate properties and synoptic processes. 
Further uncertainties are introduced when downscaled climate variables such as precipitation and temperature are translated into hydrologically relevant variables such as streamflow and groundwater recharge. Fundamental limitations in both the understanding of hydrological processes in mountain regions (e.g., glacier melt, wetland attenuation, groundwater flows) and in data availability introduce large uncertainties. Lastly, assessing access to water resources is a major challenge. Topographical gradients and barriers, as well as strong spatiotemporal variations in hydrological processes, make it particularly difficult to assess which parts of the mountain population are most vulnerable to future perturbations of the water cycle.
Skaff, Nicholas K; Armstrong, Philip M; Andreadis, Theodore G; Cheruvelil, Kendra S
2017-10-18
Eastern equine encephalitis virus (EEEV) is an expanding mosquito-borne threat to humans and domestic animal populations in the northeastern United States. Outbreaks of EEEV are challenging to predict due to spatial and temporal uncertainty in the abundance and viral infection of Cs. melanura, the principal enzootic vector. EEEV activity may be closely linked to wetlands because they provide essential habitat for mosquito vectors and avian reservoir hosts. However, wetlands are not homogeneous and can vary by vegetation, connectivity, size, and inundation patterns. Wetlands may also have different effects on EEEV transmission depending on the assessed spatial scale. We investigated associations between wetland characteristics and Cs. melanura abundance and infection with EEEV at multiple spatial scales in Connecticut, USA. Our findings indicate that wetland vegetative characteristics have strong associations with Cs. melanura abundance. Deciduous and evergreen forested wetlands were associated with higher Cs. melanura abundance, likely because these wetlands provide suitable subterranean habitat for Cs. melanura development. In contrast, Cs. melanura abundance was negatively associated with emergent and scrub/shrub wetlands, and wetland connectivity to streams. These relationships were generally strongest at broad spatial scales. Additionally, the relationships between wetland characteristics and EEEV infection in Cs. melanura were generally weak. However, Cs. melanura abundance was strongly associated with EEEV infection, suggesting that wetland-associated changes in abundance may be indirectly linked to EEEV infection in Cs. melanura. Finally, we found that wet hydrological conditions during the transmission season and during the fall/winter preceding the transmission season were associated with higher Cs. melanura abundance and EEEV infection, indicating that wet conditions are favorable for EEEV transmission. 
These results expand the broad-scale understanding of the effects of wetlands on EEEV transmission and help to reduce the spatial and temporal uncertainty associated with EEEV outbreaks.
NASA Technical Reports Server (NTRS)
Narvet, Steven W.; Frigm, Ryan C.; Hejduk, Matthew D.
2011-01-01
Conjunction Assessment operations require screening assets against the space object catalog by placing a pre-determined spatial volume around each asset and predicting when another object will violate that volume. The selection of the screening volume used for each spacecraft is a trade-off between observing all conjunction events that may pose a potential risk to the primary spacecraft and the ability to analyze those predicted events. If the screening volumes are larger, then more conjunctions can be observed and therefore the probability of a missed detection of a high-risk conjunction event is small; however, the amount of data which needs to be analyzed increases. This paper characterizes the sensitivity of screening volume size to capturing typical orbit uncertainties and the expected number of conjunction events observed. These sensitivities are quantified in the form of a trade space that allows for selection of appropriate screening volumes to fit the desired concept of operations, system limitations, and tolerable analyst workloads. This analysis will specifically highlight the screening volume determination and selection process for use in the NASA Conjunction Assessment Risk Analysis process but will also provide a general framework for other owner/operators faced with similar decisions.
Wade, Alisa A.; Hand, Brian K.; Kovach, Ryan; Luikart, Gordon; Whited, Diane; Muhlfeld, Clint C.
2016-01-01
Climate change vulnerability assessments (CCVAs) are valuable tools for assessing species’ vulnerability to climatic changes, yet failure to include measures of adaptive capacity and to account for sources of uncertainty may limit their effectiveness. Here, we provide a more comprehensive CCVA approach that incorporates all three elements used for assessing species’ climate change vulnerability: exposure, sensitivity, and adaptive capacity. We illustrate our approach using case studies of two threatened salmonids with different life histories – anadromous steelhead trout (Oncorhynchus mykiss) and non-anadromous bull trout (Salvelinus confluentus) – within the Columbia River Basin, USA. We identified general patterns of high vulnerability in low-elevation and southernmost habitats for both species. However, vulnerability rankings varied widely depending on the factors (climate, habitat, demographic, and genetic) included in the CCVA and often differed for the two species at locations where they were sympatric. Our findings illustrate that CCVA results are highly sensitive to data inputs and that spatial differences can complicate multi-species conservation. Our results highlight how CCVAs should be considered within a broader conceptual and computational framework for refining hypotheses, guiding research, and comparing plausible scenarios of species’ vulnerability for ongoing and projected climate change.
NASA Astrophysics Data System (ADS)
Goulden, T.; Hopkinson, C.
2013-12-01
The quantification of LiDAR sensor measurement uncertainty is important for evaluating the quality of derived DEM products, compiling risk assessments of management decisions based on LiDAR information, and enhancing LiDAR mission planning capabilities. Current quality assurance estimates of LiDAR measurement uncertainty are limited to post-survey empirical assessments or vendor estimates from commercial literature. Empirical evidence can provide valuable information on the performance of the sensor in validated areas; however, it cannot characterize the spatial distribution of measurement uncertainty throughout the extensive coverage of typical LiDAR surveys. Vendor-advertised error estimates are often restricted to strict and optimal survey conditions, resulting in idealized values. Numerical modeling of individual pulse uncertainty provides an alternative method for estimating LiDAR measurement uncertainty. LiDAR measurement uncertainty is theoretically assumed to fall into three distinct categories: 1) sensor sub-system errors, 2) terrain influences, and 3) vegetative influences. This research details the procedures for numerical modeling of measurement uncertainty from the sensor sub-system (GPS, IMU, laser scanner, laser ranger) and terrain influences. Results show that errors tend to increase as the laser scan angle, altitude, or laser beam incidence angle increases. An experimental survey over a flat and paved runway site, performed with an Optech ALTM 3100 sensor, showed modeled vertical errors increasing from 5 cm at nadir to 8 cm at the scan edges, for an aircraft altitude of 1200 m and a half scan angle of 15°. In a survey with the same sensor at a steeply sloped, unvegetated glacial basin site, modeled vertical errors reached over 2 m. Validation of the error models within the glacial environment, over three separate flight lines, showed that 100%, 85%, and 75% of elevation residuals, respectively, fell below the error predictions. 
Future work in LiDAR sensor measurement uncertainty must focus on the development of vegetative error models to create more robust error prediction algorithms. To achieve this objective, comprehensive empirical exploratory analysis is recommended to relate vegetative parameters to observed errors.
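As a rough illustration of how sensor sub-system errors grow with scan geometry, the following first-order propagation of ranging and scan-angle noise into vertical error is a toy model under assumed noise levels; the paper's full model also includes GPS, IMU, and incidence-angle terms, so these numbers should not be read as reproducing its results.

```python
import math

def vertical_error(altitude_m, scan_angle_deg, sigma_range=0.02, sigma_angle=2e-5):
    """First-order propagation of ranging noise (m) and scan-angle noise
    (rad) into vertical error over a flat target. Toy model only.
    """
    theta = math.radians(scan_angle_deg)
    slant = altitude_m / math.cos(theta)   # slant range to the target
    dz_drange = math.cos(theta)            # sensitivity of z to range error
    dz_dtheta = slant * math.sin(theta)    # sensitivity of z to angle error
    # Root-sum-square of the two independent error contributions
    return math.hypot(dz_drange * sigma_range, dz_dtheta * sigma_angle)
```

Consistent with the abstract's qualitative finding, the modeled error grows with both scan angle and altitude.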
NASA Astrophysics Data System (ADS)
Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.
2016-11-01
Extreme rainfall events are the main triggers of hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of the lack of data on past events and causal factors, and the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and floods, in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, generation of risk curves and annual risk calculation. To compare the risk posed by each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability, resulting in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted of the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show good performance when compared to the historical damage reports.
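The annual risk calculation from a risk curve can be sketched as the standard trapezoidal area under the exceedance-probability/loss curve; the function and the example values are hypothetical, not taken from the Fella River study.

```python
def annual_risk(exceedance_curve):
    """Average annual loss as the area under a risk curve given as
    (annual exceedance probability, loss) points, ordered from the most
    frequent (highest probability) to the rarest scenario.

    Sketch of the common trapezoidal approximation; the study derives
    minimum/average/maximum curves from the uncertainty ranges of its
    inputs, which would simply mean calling this on each curve.
    """
    area = 0.0
    for (p1, l1), (p2, l2) in zip(exceedance_curve, exceedance_curve[1:]):
        # Trapezoid between consecutive scenarios on the probability axis
        area += 0.5 * (l1 + l2) * (p1 - p2)
    return area
```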
Hydrological model uncertainty due to spatial evapotranspiration estimation methods
NASA Astrophysics Data System (ADS)
Yu, Xuan; Lamačová, Anna; Duffy, Christopher; Krám, Pavel; Hruška, Jakub
2016-05-01
Evapotranspiration (ET) continues to be a difficult process to estimate in seasonal and long-term water balances in catchment models. Approaches to estimate ET typically use vegetation parameters (e.g., leaf area index [LAI], interception capacity) obtained from field observation, remote sensing data, national or global land cover products, and/or simulated by ecosystem models. In this study we attempt to quantify the uncertainty that spatial evapotranspiration estimation introduces into hydrological simulations when the age of the forest is not precisely known. The Penn State Integrated Hydrologic Model (PIHM) was implemented for the Lysina headwater catchment, located at 50°03′N, 12°40′E in the western part of the Czech Republic. The spatial forest patterns were digitized from forest age maps made available by the Czech Forest Administration. Two ET methods were implemented in the catchment model: the Biome-BGC forest growth sub-model (1-way coupled to PIHM) and a fixed seasonal LAI method. Simulation scenarios were developed by combining the estimated spatial forest age maps with the two ET estimation methods to drive PIHM. A set of spatial hydrologic regime and streamflow regime indices were calculated from the modeling results for each method. Intercomparison of the hydrological responses to the spatial vegetation patterns suggested considerable variation in soil moisture and recharge and small uncertainty in the groundwater table elevation and streamflow. The hydrologic modeling with ET estimated by Biome-BGC generated less uncertainty owing to its plant physiology-based method. The implication of this research is that overall hydrologic variability induced by uncertain management practices was reduced by implementing vegetation models in the catchment models.
Evaluating Predictive Uncertainty of Hyporheic Exchange Modelling
NASA Astrophysics Data System (ADS)
Chow, R.; Bennett, J.; Dugge, J.; Wöhling, T.; Nowak, W.
2017-12-01
Hyporheic exchange is the interaction of water between rivers and groundwater, and is difficult to predict. One of the largest contributions to predictive uncertainty for hyporheic fluxes has been attributed to the representation of heterogeneous subsurface properties. This research aims to evaluate which aspect of the subsurface representation - the spatial distribution of hydrofacies or the model for local-scale (within-facies) heterogeneity - most influences the predictive uncertainty. We also seek to identify the data types that best help reduce this uncertainty. For this investigation, we conduct a modelling study of the Steinlach River meander in Southwest Germany, an experimental site established in 2010 to monitor hyporheic exchange at the meander scale. We use HydroGeoSphere, a fully integrated surface water-groundwater model, to model hyporheic exchange and to assess the predictive uncertainty of hyporheic exchange transit times (HETT). A highly parameterized complex model is built and treated as 'virtual reality', which is in turn modelled with simpler subsurface parameterization schemes (Figure). We then conduct Monte-Carlo simulations with these models to estimate the predictive uncertainty. Results indicate that uncertainty in HETT is relatively small at early times and increases with transit time; uncertainty from local-scale heterogeneity is negligible compared to uncertainty in the hydrofacies distribution; introducing more data to a poor model structure may reduce predictive variance but does not reduce predictive bias; and hydraulic head observations alone cannot constrain the uncertainty of HETT, whereas an estimate of hyporheic exchange flux proves more effective at reducing it. Figure: Approach for evaluating predictive model uncertainty. A conceptual model is first developed from the field investigations. A complex model ('virtual reality') is then developed based on that conceptual model. 
This complex model then serves as the basis to compare simpler model structures. Through this approach, predictive uncertainty can be quantified relative to a known reference solution.
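The Monte-Carlo comparison against a known reference can be sketched as below; the stochastic model, the sample count, and the reference value are placeholders, not the HydroGeoSphere setup, but the variance/bias split mirrors the paper's distinction between predictive variance and predictive bias.

```python
import random
import statistics

def predictive_uncertainty(model, n_samples, reference, seed=0):
    """Monte-Carlo estimate of the predictive variance and bias of a
    stochastic model relative to a known reference ('virtual reality')
    value.

    `model` is any callable taking a random.Random instance and
    returning one predicted quantity (e.g., a transit time).
    """
    rng = random.Random(seed)
    preds = [model(rng) for _ in range(n_samples)]
    mean = statistics.fmean(preds)
    variance = statistics.pvariance(preds)   # spread of the predictions
    bias = mean - reference                  # systematic offset vs. truth
    return variance, bias
```

The paper's key finding maps onto this split: adding data to a poor model structure may shrink `variance` while leaving `bias` untouched.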
Large-area settlement pattern recognition from Landsat-8 data
NASA Astrophysics Data System (ADS)
Wieland, Marc; Pittore, Massimiliano
2016-09-01
The study presents an image processing and analysis pipeline that combines object-based image analysis with a Support Vector Machine (SVM) to derive a multi-layered settlement product from Landsat-8 data over large areas. 43 image scenes are processed over large parts of Central Asia (Southern Kazakhstan, Kyrgyzstan, Tajikistan and Eastern Uzbekistan). The main tasks tackled by this work include built-up area identification, settlement type classification and recognition of urban structure type patterns. Besides commonly used accuracy assessments of the resulting map products, thorough performance evaluations are carried out under varying conditions to tune algorithm parameters and assess their applicability for the given tasks. As part of this, several research questions are addressed. In particular, the influence of the improved spatial and spectral resolution of Landsat-8 on the SVM performance in identifying built-up areas and urban structure types is evaluated. The influence of an extended feature space including digital elevation model features is also tested for mountainous regions. Moreover, the spatial distribution of classification uncertainties is analyzed and compared to the heterogeneity of the building stock within the computational unit of the segments. The study concludes that the information content of Landsat-8 images is sufficient for the tested classification tasks and that even detailed urban structures can be extracted with satisfying accuracy. Freely available ancillary settlement point location data can further improve the built-up area classification. Digital elevation features and pan-sharpening, however, did not significantly improve the classification results. The study highlights the importance of dynamically tuned classifier parameters, and underlines the use of Shannon entropy computed from the soft answers of the SVM as a valid measure of the spatial distribution of classification uncertainties.
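Shannon entropy over a classifier's soft (per-class probability) answers, the uncertainty measure highlighted above, can be computed per segment as follows; the function name is an assumption.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy (bits) of a classifier's per-class soft answers.

    Near-zero entropy means a confident segment (one class dominates);
    the maximum, log2(n_classes), means maximal classification
    uncertainty (all classes equally likely).
    """
    # Zero-probability classes contribute nothing (lim p->0 of p*log p = 0)
    return -sum(p * math.log2(p) for p in probabilities if p > 0.0)
```

Mapping this value per segment gives the spatial distribution of classification uncertainty described in the abstract.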
Discrete distributed strain sensing of intelligent structures
NASA Technical Reports Server (NTRS)
Anderson, Mark S.; Crawley, Edward F.
1992-01-01
Techniques are developed for the design of discrete highly distributed sensor systems for use in intelligent structures. First the functional requirements for such a system are presented. Discrete spatially averaging strain sensors are then identified as satisfying the functional requirements. A variety of spatial weightings for spatially averaging sensors are examined, and their wave number characteristics are determined. Preferable spatial weightings are identified. Several numerical integration rules used to integrate such sensors in order to determine the global deflection of the structure are discussed. A numerical simulation is conducted using point and rectangular sensors mounted on a cantilevered beam under static loading. Gage factor and sensor position uncertainties are incorporated to assess the absolute error and standard deviation of the error in the estimated tip displacement found by numerically integrating the sensor outputs. An experiment is carried out using a statically loaded cantilevered beam with five point sensors. It is found that in most cases the actual experimental error is within one standard deviation of the absolute error as found in the numerical simulation.
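The numerical integration of discrete strain-sensor outputs into a global deflection estimate can be sketched for a cantilevered Bernoulli beam via the moment-area relation w(L) = ∫ κ(x)(L − x) dx with κ = ε/c; this is a minimal sketch using trapezoidal integration, omitting the sensor spatial weightings and gage-factor uncertainties that the paper studies.

```python
def tip_displacement(x, strain, half_depth, length):
    """Estimate cantilever tip deflection from discrete surface-strain
    readings at stations `x` (measured from the root).

    Curvature at each station is kappa = strain / c, where c
    (`half_depth`) is the distance from the neutral axis to the strain
    gage. The moment-area integrand kappa(x) * (L - x) is integrated
    with the trapezoidal rule.
    """
    kappa = [e / half_depth for e in strain]
    integrand = [k * (length - xi) for k, xi in zip(kappa, x)]
    total = 0.0
    for i in range(len(x) - 1):
        total += 0.5 * (integrand[i] + integrand[i + 1]) * (x[i + 1] - x[i])
    return total
```

For a tip-loaded cantilever with EI = 1, the analytical tip deflection PL³/(3EI) is recovered to within the trapezoidal truncation error.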
Manneh, Rima; Margni, Manuele; Deschênes, Louise
2010-06-01
Spatially differentiated intake fractions (iFs) linked to Canadian emissions of toxic organic chemicals were developed using the multimedia and multipathways fate and exposure model IMPACT 2002. The fate and exposure of chemicals released to the Canadian environment were modeled with a single regional mass-balance model and three models that provided multiple mass-balance regions within Canada. These three models were based on the Canadian subwatersheds (172 zones), ecozones (15 zones), and provinces (13 zones). Releases of 32 organic chemicals into water and air were considered. This was done in order to (i) assess and compare the spatial variability of iFs within and across the three levels of regionalization and (ii) compare the spatial iFs to nonspatial ones. Results showed that iFs calculated using the subwatershed resolution presented a higher spatial variability (up to 10 orders of magnitude for emissions into water) than the ones based on the ecozones and provinces, implying that higher spatial resolution could potentially reduce uncertainty in iFs and, therefore, increase the discriminating power when assessing and comparing toxic releases for known emission locations. Results also indicated that, for an unknown emission location, a model with high spatial resolution such as the subwatershed model could significantly improve the accuracy of a generic iF. Population weighted iFs span up to 3 orders of magnitude compared to nonspatial iFs calculated by the one-box model. Less significant differences were observed when comparing spatial versus nonspatial iFs from the ecozones and provinces, respectively.
Water resources of the Black Sea Basin at high spatial and temporal resolution
NASA Astrophysics Data System (ADS)
Rouholahnejad, Elham; Abbaspour, Karim C.; Srinivasan, Raghavan; Bacu, Victor; Lehmann, Anthony
2014-07-01
The pressure on water resources, deteriorating water quality, and uncertainties associated with climate change create an environment of conflict in large and complex river systems. The Black Sea Basin (BSB), in particular, suffers from ecological unsustainability and inadequate resource management, leading to severe environmental, social, and economic problems. To better tackle future challenges, we used the Soil and Water Assessment Tool (SWAT) to model the hydrology of the BSB, coupling water quantity, water quality, and crop yield components. The hydrological model of the BSB was calibrated and validated with sensitivity and uncertainty analysis. River discharges, nitrate loads, and crop yields were used to calibrate the model. Employing grid technology improved calibration computation time by more than an order of magnitude. We calculated components of water resources such as river discharge, infiltration, aquifer recharge, soil moisture, and actual and potential evapotranspiration. Furthermore, available water resources were calculated at subbasin spatial and monthly temporal resolution. Within this framework, a comprehensive database of the BSB was created to fill the existing gaps in water resources data in the region. In this paper, we discuss the challenges of building a large-scale model in fine spatial and temporal detail. This study provides the basis for further research on the impacts of climate and land use change on water resources in the BSB.
Are We Telling Decision-makers the Wrong Things - and with Too Much Confidence?
NASA Astrophysics Data System (ADS)
Arnold, J.; Nowak, K. C.; Vano, J. A.; Newman, A. J.; Mizukami, N.; Mendoza, P. A.; Nijssen, B.; Wood, A.; Gutmann, E. D.; Clark, M. P.; Rasmussen, R.
2016-12-01
Water-resource management relies on decision-making over a wide range of space-time scales, nearly none of which maps cleanly onto the scales of current hydroclimatic scenarios of anthropogenic change. Myriad choices are made during vulnerability and impact assessments to quantify the changed-climate sensitivities of models used in that decision-making, including choices of hydrologic models, parameters, and parameterizations; their input forcings determined with various climate downscaling approaches; selected GCMs and output variables to be downscaled; and the forcing emissions scenarios, to name a few. Choosing alternative methods for producing gridded meteorological fields, for example, can produce very different effects on the projected hydrologic outcomes they drive, with uncertainties across those methods revealed to be as large as or larger than the climate change signal itself in some cases. Additionally, many popular climate downscaling methods simply rescale GCM precipitation, producing hydroclimatic projections with too much drizzle, incorrect representations of extreme events, and improper spatial scaling of variables crucial to water-resource vulnerability assessments and, importantly, the decisions they seek to inform. Real-world water-resource vulnerability and impact assessments can be highly time-sensitive and resource-limited, though, so they typically do not confront or even fully represent the uncertainties associated with all of these choices. That deficiency results in assessments built on only partially revealed uncertainties, which can misrepresent significant sensitivities and impacts in the final assessments of climate threats and hydrologic vulnerabilities. This talk will describe recent work by the U.S. Army Corps of Engineers, Bureau of Reclamation, University of Washington, and National Center for Atmospheric Research to develop and test methods to characterize more fully the uncertainties in the modeling chain for real-world uses. 
Examples will illustrate new implementations for communicating that fuller characterization in the ways most useful to inform water-resource management across multiple space-time scales under climate-changed futures.
Does global warming amplify interannual climate variability?
NASA Astrophysics Data System (ADS)
He, Chao; Li, Tim
2018-06-01
Based on the outputs of 30 models from the Coupled Model Intercomparison Project Phase 5 (CMIP5), the fractional changes in the amplitude of interannual variability (σ) for precipitation (P') and vertical velocity (ω') are assessed, and simple theoretical models are constructed to quantitatively understand the changes in σ(P') and σ(ω'). The RCP8.5 and RCP4.5 scenarios show similar results in terms of the fractional change per degree of warming, with slightly lower inter-model uncertainty under RCP8.5. Based on the multi-model median, σ(P') generally increases but σ(ω') generally decreases under global warming, and both changes are characterized by non-uniform spatial patterns. σ(P') decreases over subtropical subsidence regions but increases elsewhere, with a regional averaged value of 1.4% K⁻¹ over 20°S-50°N under RCP8.5. Diagnoses show that the mechanisms for the change in σ(P') differ between climatological ascending and descending regions. Over ascending regions, the increase of mean-state specific humidity contributes to a general increase of σ(P'), but the change of σ(ω') dominates its spatial pattern and inter-model uncertainty. Over descending regions, however, the change of σ(P') and its inter-model uncertainty are constrained by the change of mean-state precipitation. σ(ω') is projected to weaken almost everywhere except over the equatorial Pacific, with a regional averaged fractional change of -3.4% K⁻¹ at 500 hPa. The overall reduction of σ(ω') results from the increased mean-state static stability, while the substantially increased σ(ω') in the mid-upper troposphere over the equatorial Pacific and the inter-model uncertainty of the changes in σ(ω') are dominated by the change in the interannual variability of diabatic heating.
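The quoted quantity, the fractional change in variability amplitude per degree of warming (% K⁻¹), is a simple ratio; a minimal sketch, with the symbol names assumed from context:

```python
def fractional_change_per_K(sigma_future, sigma_hist, delta_T):
    """Fractional change in interannual-variability amplitude per degree
    of global warming, in % per K, as used to compare scenarios with
    different warming magnitudes (e.g., RCP8.5 vs. RCP4.5).
    """
    return (sigma_future / sigma_hist - 1.0) * 100.0 / delta_T
```

Normalizing by `delta_T` is what makes the RCP8.5 and RCP4.5 results directly comparable despite their different total warming.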
Shih, Hsiu-Ching; Crawford-Brown, Douglas; Ma, Hwong-wen
2015-03-15
Assessment of the ability of climate policies to produce desired improvements in public health through co-benefits of air pollution reduction can consume resources in both time and research funds. These resources increase significantly as the spatial resolution of models increases. In addition, the level of spatial detail available in the macroeconomic models at the heart of climate policy assessments is much lower than that available in traditional human health risk modeling. It is therefore important to determine whether increasing spatial resolution considerably affects risk-based decisions, which kinds of decisions might be affected, and under what conditions they will be affected. Human health risk co-benefits from carbon emissions reductions that bring about concurrent reductions in particulate matter (PM10) emissions are therefore examined here at four levels of spatial resolution (Uniform Nation, Uniform Region, Uniform County/city, Health Risk Assessment) in a case study of Taiwan as one of the geographic regions of a global macroeconomic model, with results that are representative of small, industrialized nations within that global model. A metric of human health risk mortality (YOLL, years of life lost in life expectancy) is compared under assessments ranging from a "uniform simulation", in which there is no spatial resolution of changes in ambient air concentration under a policy, to a "highly spatially resolved simulation" (called here Health Risk Assessment). PM10 is chosen in this study as the indicator of air pollution for which risks are assessed due to its significance as a co-benefit of carbon emissions reductions within climate mitigation policy. For the policy examined, the four estimates of mortality in the entirety of Taiwan are 747 YOLL, 834 YOLL, 984 YOLL and 916 YOLL under Uniform Taiwan, Uniform Region, Uniform County and Health Risk Assessment, respectively; or differences of 18%, 9%, and 7% if the HRA methodology is taken as the baseline. 
While these differences are small compared to uncertainties in health risk assessment more generally, the ranks of different regions and of emissions categories as the focus of regulatory efforts estimated at these four levels of spatial resolution are quite different. The results suggest that issues of risk equity within a nation might be missed by the lower levels of spatial resolution, suggesting that low resolution models are suited to calculating national cost-benefit ratios but not as suited to assessing co-benefits of climate policies reflecting intersubject variability in risk, or in identifying sub-national regions and emissions sectors on which to focus attention (although even here, the errors introduced by low spatial resolution are generally less than 40%). Copyright © 2014 Elsevier Ltd. All rights reserved.
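The quoted differences of 18%, 9%, and 7% follow directly from the four YOLL estimates with the Health Risk Assessment value (916 YOLL) taken as the baseline:

```python
def relative_differences(estimates, baseline):
    """Percent difference of each mortality estimate from a baseline,
    rounded to whole percent. With the abstract's numbers this
    reproduces its 18%, 9%, 7% figures.
    """
    return [round(abs(e - baseline) / baseline * 100) for e in estimates]
```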
NASA Team 2 Sea Ice Concentration Algorithm Retrieval Uncertainty
NASA Technical Reports Server (NTRS)
Brucker, Ludovic; Cavalieri, Donald J.; Markus, Thorsten; Ivanoff, Alvaro
2014-01-01
Satellite microwave radiometers are widely used to estimate sea ice cover properties (concentration, extent, and area) through the use of sea ice concentration (IC) algorithms. Few algorithms provide associated IC uncertainty estimates. Algorithm uncertainty estimates are needed to accurately assess global and regional trends in IC (and thus extent and area), and to improve sea ice predictions on seasonal to interannual timescales using data assimilation approaches. This paper presents a method to provide relative IC uncertainty estimates using the enhanced NASA Team (NT2) IC algorithm. The proposed approach takes advantage of the NT2 calculations and relies solely on the brightness temperatures (TBs) used as input. NT2 IC and its associated relative uncertainty are obtained for both the Northern and Southern Hemispheres using Advanced Microwave Scanning Radiometer for the Earth Observing System (AMSR-E) TBs. NT2 IC relative uncertainties estimated on a footprint-by-footprint, swath-by-swath basis were averaged daily over each 12.5-km grid cell of the polar stereographic grid. For both hemispheres and throughout the year, the NT2 relative uncertainty is less than 5%. In the Southern Hemisphere, it is low in the interior ice pack and increases in the marginal ice zone (MIZ) up to 5%. In the Northern Hemisphere, areas with high uncertainties are also found in the high-IC area of the Central Arctic. Retrieval uncertainties are greater in areas corresponding to NT2 ice types associated with deep snow and new ice. Seasonal variations in uncertainty show larger values in summer as a result of melt conditions and greater atmospheric contributions. Our analysis also includes an evaluation of the NT2 algorithm sensitivity to AMSR-E sensor noise. 
There is a 60% probability that the IC does not change (to within the computed retrieval precision of 1%) due to sensor noise, and the cumulative probability shows that there is a 90% chance that the IC varies by less than +/-3%. We also examined the daily IC variability, which is dominated by sea ice drift and ice formation/melt. Daily IC variability is the highest, year round, in the MIZ (often up to 20%, locally 30%). The temporal and spatial distributions of the retrieval uncertainties and the daily IC variability are expected to be useful for algorithm intercomparisons, climate trend assessments, and possibly IC assimilation in models.
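The sensor-noise sensitivity analysis can be sketched as a perturbation Monte Carlo; the retrieval function, the Gaussian channel-noise model, and the noise level below are placeholders standing in for the actual NT2 algorithm and AMSR-E noise characteristics.

```python
import random

def noise_sensitivity(retrieve_ic, tb, sigma_tb, n_trials=1000, seed=0):
    """Fraction of sensor-noise perturbations that change a retrieved ice
    concentration (in percent IC) by <= 1% and <= 3%, the two thresholds
    quoted in the abstract.

    `retrieve_ic` maps a list of channel brightness temperatures (K) to
    an IC value; independent Gaussian noise of standard deviation
    `sigma_tb` is added to every channel.
    """
    rng = random.Random(seed)
    ic0 = retrieve_ic(tb)
    within_1 = within_3 = 0
    for _ in range(n_trials):
        noisy = [t + rng.gauss(0.0, sigma_tb) for t in tb]
        d = abs(retrieve_ic(noisy) - ic0)
        within_1 += d <= 1.0
        within_3 += d <= 3.0
    return within_1 / n_trials, within_3 / n_trials
```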
Estimating instream constituent loads using replicate synoptic sampling, Peru Creek, Colorado
NASA Astrophysics Data System (ADS)
Runkel, Robert L.; Walton-Day, Katherine; Kimball, Briant A.; Verplanck, Philip L.; Nimick, David A.
2013-05-01
The synoptic mass balance approach is often used to evaluate constituent mass loading in streams affected by mine drainage. Spatial profiles of constituent mass load are used to identify sources of contamination and prioritize sites for remedial action. This paper presents a field scale study in which replicate synoptic sampling campaigns are used to quantify the aggregate uncertainty in constituent load that arises from (1) laboratory analyses of constituent and tracer concentrations, (2) field sampling error, and (3) temporal variation in concentration from diel constituent cycles and/or source variation. Consideration of these factors represents an advance in the application of the synoptic mass balance approach by placing error bars on estimates of constituent load and by allowing all sources of uncertainty to be quantified in aggregate; previous applications of the approach have provided only point estimates of constituent load and considered only a subset of the possible errors. Given estimates of aggregate uncertainty, site specific data and expert judgement may be used to qualitatively assess the contributions of individual factors to uncertainty. This assessment can be used to guide the collection of additional data to reduce uncertainty. Further, error bars provided by the replicate approach can aid the investigator in the interpretation of spatial loading profiles and the subsequent identification of constituent source areas within the watershed. The replicate sampling approach is applied to Peru Creek, a stream receiving acidic, metal-rich effluent from the Pennsylvania Mine. Other sources of acidity and metals within the study reach include a wetland area adjacent to the mine and tributary inflow from Cinnamon Gulch. Analysis of data collected under low-flow conditions indicates that concentrations of Al, Cd, Cu, Fe, Mn, Pb, and Zn in Peru Creek exceed aquatic life standards. 
Constituent loading within the study reach is dominated by effluent from the Pennsylvania Mine, with over 50% of the Cd, Cu, Fe, Mn, and Zn loads attributable to a collapsed adit near the top of the study reach. These estimates of mass load may underestimate the effect of the Pennsylvania Mine as leakage from underground mine workings may contribute to metal loads that are currently attributed to the wetland area. This potential leakage confounds the evaluation of remedial options and additional research is needed to determine the magnitude and location of the leakage.
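The replicate load computation (load = discharge × concentration, with the spread across replicate campaigns providing the error bar) can be sketched as below; the site keys, unit handling, and the use of the sample standard deviation are illustrative assumptions, not the authors' exact computation.

```python
import statistics

def load_profile(replicates):
    """Mean constituent load and replicate standard deviation per site.

    `replicates` maps site -> list of (discharge m3/s, concentration
    mg/L) pairs, one per synoptic sampling campaign. Load is returned
    in mg/s (Q * C * 1000 L/m3). The standard deviation across
    replicates aggregates laboratory, field-sampling, and temporal
    variation into a single error bar, as in the replicate approach.
    """
    profile = {}
    for site, samples in replicates.items():
        loads = [q * c * 1000.0 for q, c in samples]
        mean = statistics.fmean(loads)
        sd = statistics.stdev(loads) if len(loads) > 1 else 0.0
        profile[site] = (mean, sd)
    return profile
```

Plotting the means with their standard deviations along the stream distance yields the error-barred spatial loading profile used to localize source areas.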
The role of internal climate variability for interpreting climate change scenarios
NASA Astrophysics Data System (ADS)
Maraun, Douglas
2013-04-01
When communicating information on climate change, the use of multi-model ensembles has been advocated to sample uncertainties over as wide a range as possible. To meet the demand for easily accessible results, the ensemble is often summarised by its multi-model mean signal. In rare cases, additional uncertainty measures are given to avoid losing all information on the ensemble spread, e.g., the highest and lowest projected values. Such approaches, however, disregard the fundamentally different nature of the different types of uncertainties and might cause wrong interpretations and subsequently wrong decisions for adaptation. Whereas scenario and climate model uncertainties are of epistemic nature, i.e., caused by an in-principle reducible lack of knowledge, uncertainties due to internal climate variability are aleatory, i.e., inherently stochastic and irreducible. As wisely stated in the proverb "climate is what you expect, weather is what you get", a specific region will experience one stochastic realisation of the climate system, but never exactly the expected climate change signal as given by a multi-model mean. Depending on the meteorological variable, region and lead time, the signal might be strong or weak compared to the stochastic component. In cases of a low signal-to-noise ratio, even if the climate change signal is a well-defined trend, no trends or even opposite trends might be experienced. Here I propose to use the time of emergence (TOE) to quantify and communicate when climate change trends will exceed the internal variability. The TOE provides a useful measure for end users to assess the time horizon for implementing adaptation measures. Furthermore, internal variability is scale dependent - the more local the scale, the stronger the influence of internal climate variability. Thus, investigating the TOE as a function of spatial scale could help to assess the required spatial scale for implementing adaptation measures. 
I exemplify this proposal with a recently published study on the TOE for mean and heavy precipitation trends in Europe. In some regions trends emerge only late in the 21st century or even later, suggesting that in these regions adaptation to internal variability rather than to climate change is required. Yet in other regions the climate change signal is strong, urging for timely adaptation. Douglas Maraun, When and at what scale will trends in European mean and heavy precipitation emerge? Env. Res. Lett., in press, 2013.
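The TOE idea above can be illustrated with a minimal sketch (all numbers hypothetical, not from the study): emergence is taken as the first year after which a linear trend signal permanently exceeds a k-sigma band of internal variability.

```python
import numpy as np

def time_of_emergence(years, signal, noise_std, k=2.0):
    """First year after which |signal| permanently exceeds k * noise_std."""
    emerged = np.abs(signal) > k * noise_std
    not_emerged = np.where(~emerged)[0]
    if len(not_emerged) == 0:
        return int(years[0])          # already emerged at the start
    last = not_emerged[-1]
    if last + 1 >= len(years):
        return None                   # does not emerge within the horizon
    return int(years[last + 1])

years = np.arange(2000, 2101)
signal = 0.03 * (years - 2000)        # hypothetical linear trend, units/yr
toe = time_of_emergence(years, signal, noise_std=0.5)
```

With a stronger noise level (lower signal-to-noise ratio, as on local scales) the same trend emerges later or not at all within the horizon, which is the scale dependence the abstract describes.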
Estimating instream constituent loads using replicate synoptic sampling, Peru Creek, Colorado
Runkel, Robert L.; Walton-Day, Katherine; Kimball, Briant A.; Verplanck, Philip L.; Nimick, David A.
2013-01-01
The synoptic mass balance approach is often used to evaluate constituent mass loading in streams affected by mine drainage. Spatial profiles of constituent mass load are used to identify sources of contamination and prioritize sites for remedial action. This paper presents a field-scale study in which replicate synoptic sampling campaigns are used to quantify the aggregate uncertainty in constituent load that arises from (1) laboratory analyses of constituent and tracer concentrations, (2) field sampling error, and (3) temporal variation in concentration from diel constituent cycles and/or source variation. Consideration of these factors represents an advance in the application of the synoptic mass balance approach by placing error bars on estimates of constituent load and by allowing all sources of uncertainty to be quantified in aggregate; previous applications of the approach have provided only point estimates of constituent load and considered only a subset of the possible errors. Given estimates of aggregate uncertainty, site specific data and expert judgement may be used to qualitatively assess the contributions of individual factors to uncertainty. This assessment can be used to guide the collection of additional data to reduce uncertainty. Further, error bars provided by the replicate approach can aid the investigator in the interpretation of spatial loading profiles and the subsequent identification of constituent source areas within the watershed. The replicate sampling approach is applied to Peru Creek, a stream receiving acidic, metal-rich effluent from the Pennsylvania Mine. Other sources of acidity and metals within the study reach include a wetland area adjacent to the mine and tributary inflow from Cinnamon Gulch. Analysis of data collected under low-flow conditions indicates that concentrations of Al, Cd, Cu, Fe, Mn, Pb, and Zn in Peru Creek exceed aquatic life standards.
Constituent loading within the study reach is dominated by effluent from the Pennsylvania Mine, with over 50% of the Cd, Cu, Fe, Mn, and Zn loads attributable to a collapsed adit near the top of the study reach. These estimates of mass load may underestimate the effect of the Pennsylvania Mine as leakage from underground mine workings may contribute to metal loads that are currently attributed to the wetland area. This potential leakage confounds the evaluation of remedial options and additional research is needed to determine the magnitude and location of the leakage.
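A minimal sketch of the replicate synoptic mass-balance idea (toy numbers, not Peru Creek data): load at each site is discharge times concentration, replicate campaigns put error bars on the spatial loading profile, and the largest downstream jump in mean load flags the dominant source reach.

```python
import numpy as np

discharge = np.array([0.10, 0.12, 0.15])      # m^3/s at three synoptic sites
# three replicate campaigns x three sites; hypothetical Zn, mg/L
zn_conc = np.array([[1.0, 2.2, 2.0],
                    [1.1, 2.0, 2.1],
                    [0.9, 2.1, 1.9]])

loads = discharge * zn_conc                    # mg/s, one row per replicate
mean_load = loads.mean(axis=0)                 # spatial loading profile
sd_load = loads.std(axis=0, ddof=1)            # replicate-based error bars

# largest increase between consecutive sites -> candidate source reach
source_reach = int(np.argmax(np.diff(mean_load)))
```

Here the jump between sites 0 and 1 dominates, so the reach between them would be prioritized; the error bars indicate whether that jump is distinguishable from sampling and temporal variation.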
Raben, Jaime S; Hariharan, Prasanna; Robinson, Ronald; Malinauskas, Richard; Vlachos, Pavlos P
2016-03-01
We present advanced particle image velocimetry (PIV) processing, post-processing, and uncertainty estimation techniques to support the validation of computational fluid dynamics analyses of medical devices. This work is an extension of a previous FDA-sponsored multi-laboratory study, which used a medical device mimicking geometry referred to as the FDA benchmark nozzle model. Experimental measurements were performed using time-resolved PIV at five overlapping regions of the model for Reynolds numbers in the nozzle throat of 500, 2000, 5000, and 8000. Images included a twofold increase in spatial resolution in comparison to the previous study. Data was processed using ensemble correlation, dynamic range enhancement, and phase correlations to increase signal-to-noise ratios and measurement accuracy, and to resolve flow regions with large velocity ranges and gradients, which is typical of many blood-contacting medical devices. Parameters relevant to device safety, including shear stress at the wall and in bulk flow, were computed using radial basis functions. In addition, in-field spatially resolved pressure distributions, Reynolds stresses, and energy dissipation rates were computed from PIV measurements. Velocity measurement uncertainty was estimated directly from the PIV correlation plane, and uncertainty analysis for wall shear stress at each measurement location was performed using a Monte Carlo model. Local velocity uncertainty varied greatly and depended largely on local conditions such as particle seeding, velocity gradients, and particle displacements. Uncertainty in low velocity regions in the sudden expansion section of the nozzle was greatly reduced by over an order of magnitude when dynamic range enhancement was applied. Wall shear stress uncertainty was dominated by uncertainty contributions from velocity estimations, which were shown to account for 90-99% of the total uncertainty. 
This study provides advancements in the PIV processing methodologies over the previous work through increased PIV image resolution, use of robust image processing algorithms for near-wall velocity measurements and wall shear stress calculations, and uncertainty analyses for both velocity and wall shear stress measurements. The velocity and shear stress analysis, with spatially distributed uncertainty estimates, highlights the challenges of flow quantification in medical devices and provides potential methods to overcome such challenges.
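The Monte Carlo propagation of velocity uncertainty into wall shear stress can be sketched as follows (assumed one-sided finite-difference estimator and made-up values; the study's actual model is more elaborate):

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 1.0e-3        # dynamic viscosity, Pa*s (water-like fluid)
dy = 1.0e-4        # wall-normal distance of first PIV vector, m
u_mean = 0.05      # measured near-wall velocity, m/s
u_sigma = 0.005    # velocity uncertainty, e.g. from the correlation plane

# Monte Carlo: sample the velocity within its uncertainty, map each
# sample through the wall shear stress estimator tau_w = mu * u / dy
u_samples = rng.normal(u_mean, u_sigma, size=100_000)
tau_samples = mu * u_samples / dy

tau_mean = tau_samples.mean()      # central estimate of wall shear stress
tau_unc = tau_samples.std()        # propagated uncertainty
```

Because the estimator is linear in u here, the shear stress uncertainty is dominated by the velocity uncertainty, mirroring the 90-99% contribution reported above.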
Towards physiologically meaningful water-use efficiency estimates from eddy covariance data.
Knauer, Jürgen; Zaehle, Sönke; Medlyn, Belinda E; Reichstein, Markus; Williams, Christopher A; Migliavacca, Mirco; De Kauwe, Martin G; Werner, Christiane; Keitel, Claudia; Kolari, Pasi; Limousin, Jean-Marc; Linderson, Maj-Lena
2018-02-01
Intrinsic water-use efficiency (iWUE) characterizes the physiological control on the simultaneous exchange of water and carbon dioxide in terrestrial ecosystems. Knowledge of iWUE is commonly gained from leaf-level gas exchange measurements, which are inevitably restricted in their spatial and temporal coverage. Flux measurements based on the eddy covariance (EC) technique can overcome these limitations, as they provide continuous and long-term records of carbon and water fluxes at the ecosystem scale. However, vegetation gas exchange parameters derived from EC data are subject to scale-dependent and method-specific uncertainties that compromise their ecophysiological interpretation as well as their comparability among ecosystems and across spatial scales. Here, we use estimates of canopy conductance and gross primary productivity (GPP) derived from EC data to calculate a measure of iWUE (G1, "stomatal slope") at the ecosystem level at six sites comprising tropical, Mediterranean, temperate, and boreal forests. We assess the following six mechanisms potentially causing discrepancies between leaf and ecosystem-level estimates of G1: (i) non-transpirational water fluxes; (ii) aerodynamic conductance; (iii) meteorological deviations between measurement height and canopy surface; (iv) energy balance non-closure; (v) uncertainties in net ecosystem exchange partitioning; and (vi) physiological within-canopy gradients. Our results demonstrate that an unclosed energy balance caused the largest uncertainties, in particular if it was associated with erroneous latent heat flux estimates. The effect of aerodynamic conductance on G1 was sufficiently captured with a simple representation. G1 was found to be less sensitive to meteorological deviations between canopy surface and measurement height and, given that data are appropriately filtered, to non-transpirational water fluxes.
Uncertainties in the derived GPP and physiological within-canopy gradients and their implications for parameter estimates at leaf and ecosystem level are discussed. Our results highlight the importance of adequately considering the sources of uncertainty outlined here when EC-derived water-use efficiency is interpreted in an ecophysiological context. © 2017 John Wiley & Sons Ltd.
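For readers unfamiliar with the stomatal slope, a minimal sketch of how an ecosystem-level G1 can be obtained by inverting the Medlyn optimal stomatal model, Gs = 1.6 (1 + g1/sqrt(D)) GPP/Ca, from EC-derived canopy conductance and GPP (illustrative values and assumed unit conventions, not the authors' code):

```python
import numpy as np

def g1_from_fluxes(gs, gpp, vpd, ca=400.0):
    """Invert the Medlyn model for the stomatal slope g1.

    gs  : canopy conductance, mol m-2 s-1
    gpp : gross primary productivity, umol CO2 m-2 s-1
    vpd : vapour pressure deficit D, kPa
    ca  : ambient CO2 mole fraction, umol mol-1 (assumed 400)
    """
    return np.sqrt(vpd) * (gs * ca / (1.6 * gpp) - 1.0)

# hypothetical half-hourly EC-derived values
gs = 0.3      # mol m-2 s-1
gpp = 20.0    # umol m-2 s-1
vpd = 1.0     # kPa
g1 = g1_from_fluxes(gs, gpp, vpd)   # units kPa^0.5
```

In practice gs itself is obtained by inverting a transpiration model from the latent heat flux, which is why energy balance non-closure propagates so strongly into G1.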
Remotely Sensed Data for High Resolution Agro-Environmental Policy Analysis
NASA Astrophysics Data System (ADS)
Welle, Paul
Policy analyses of agricultural and environmental systems are often limited due to data constraints. Measurement campaigns can be costly, especially when the area of interest includes oceans, forests, agricultural regions or other dispersed spatial domains. Satellite based remote sensing offers a way to increase the spatial and temporal resolution of policy analysis concerning these systems. However, there are key limitations to the implementation of satellite data. Uncertainty in data derived from remote-sensing can be significant, and traditional methods of policy analysis for managing uncertainty on large datasets can be computationally expensive. Moreover, while satellite data can increasingly offer estimates of some parameters such as weather or crop use, other information regarding demographic or economic data is unlikely to be estimated using these techniques. Managing these limitations in practical policy analysis remains a challenge. In this dissertation, I conduct five case studies which rely heavily on data sourced from orbital sensors. First, I assess the magnitude of climate and anthropogenic stress on coral reef ecosystems. Second, I conduct an impact assessment of soil salinity on California agriculture. Third, I measure the propensity of growers to adapt their cropping practices to soil salinization in agriculture. Fourth, I analyze whether small-scale desalination units could be applied on farms in California in order to mitigate the effects of drought and salinization as well as prevent agricultural drainage from entering vulnerable ecosystems. And fifth, I assess the feasibility of satellite-based remote sensing for salinity measurement at global scale. Through these case studies, I confront both the challenges and benefits associated with implementing satellite-based remote sensing for improved policy analysis.
NASA Astrophysics Data System (ADS)
Salvi, Kaustubh; Villarini, Gabriele; Vecchi, Gabriel A.
2017-10-01
Unprecedented alterations in precipitation characteristics over the last century and especially in the last two decades have posed serious socio-economic problems to society in terms of hydro-meteorological extremes, in particular flooding and droughts. The origin of these alterations has its roots in changing climatic conditions; however, its threatening implications can only be dealt with through meticulous planning that is based on realistic and skillful decadal precipitation predictions (DPPs). Skillful DPPs represent a very challenging prospect because of the complexities associated with precipitation predictions. Because of the limited skill and coarse spatial resolution, the DPPs provided by General Circulation Models (GCMs) fail to be directly applicable for impact assessment. Here, we focus on nine GCMs and quantify the seasonally and regionally averaged skill in DPPs over the continental United States. We address the problems pertaining to the limited skill and resolution by applying linear and kernel regression-based statistical downscaling approaches. For both the approaches, statistical relationships established over the calibration period (1961-1990) are applied to the retrospective and near future decadal predictions by GCMs to obtain DPPs at ∼4 km resolution. The skill is quantified across different metrics that evaluate potential skill, biases, long-term statistical properties, and uncertainty. Both the statistical approaches show improvements with respect to the raw GCM data, particularly in terms of the long-term statistical properties and uncertainty, irrespective of lead time. The outcome of the study is monthly DPPs from nine GCMs with 4-km spatial resolution, which can be used as a key input for impacts assessments.
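The linear-regression branch of the downscaling described above can be sketched per grid cell as follows (synthetic data; the study's actual implementation at ~4 km resolution over a 1961-1990 calibration period is far more involved):

```python
import numpy as np

rng = np.random.default_rng(1)

# calibration period: coarse GCM monthly precipitation vs. the
# collocated fine-scale observation (synthetic, gamma-distributed)
gcm_cal = rng.gamma(2.0, 1.5, size=360)
obs_cal = 0.8 * gcm_cal + 0.5 + rng.normal(0.0, 0.1, size=360)

# least-squares fit of the statistical relationship
slope, intercept = np.polyfit(gcm_cal, obs_cal, 1)

# apply the calibrated relationship to a new decadal prediction
gcm_future = np.array([2.0, 4.0, 6.0])
downscaled = slope * gcm_future + intercept
```

The kernel-regression alternative named in the abstract replaces the single global line with locally weighted fits, which can capture non-linear GCM-observation relationships at the cost of more tuning.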
Towards a meaningful assessment of marine ecological impacts in life cycle assessment (LCA).
Woods, John S; Veltman, Karin; Huijbregts, Mark A J; Verones, Francesca; Hertwich, Edgar G
2016-01-01
Human demands on marine resources and space are currently unprecedented and concerns are rising over observed declines in marine biodiversity. A quantitative understanding of the impact of industrial activities on the marine environment is thus essential. Life cycle assessment (LCA) is a widely applied method for quantifying the environmental impact of products and processes. LCA was originally developed to assess the impacts of land-based industries on mainly terrestrial and freshwater ecosystems. As such, impact indicators for major drivers of marine biodiversity loss are currently lacking. We review quantitative approaches for cause-effect assessment of seven major drivers of marine biodiversity loss: climate change, ocean acidification, eutrophication-induced hypoxia, seabed damage, overexploitation of biotic resources, invasive species and marine plastic debris. Our review shows that impact indicators can be developed for all identified drivers, albeit at different levels of coverage of cause-effect pathways and variable levels of uncertainty and spatial coverage. Modeling approaches to predict the spatial distribution and intensity of human-driven interventions in the marine environment are relatively well-established and can be employed to develop spatially-explicit LCA fate factors. Modeling approaches to quantify the effects of these interventions on marine biodiversity are less well-developed. We highlight specific research challenges to facilitate a coherent incorporation of marine biodiversity loss in LCA, thereby making LCA a more comprehensive and robust environmental impact assessment tool. Research challenges of particular importance include i) incorporation of the non-linear behavior of global circulation models (GCMs) within an LCA framework and ii) improving spatial differentiation, especially the representation of coastal regions in GCMs and ocean-carbon cycle models. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Pal, Manali; Suman, Mayank; Das, Sarit Kumar; Maity, Rajib
2017-04-01
Information on the spatio-temporal distribution of surface Soil Moisture Content (SMC) is essential in several hydrological, meteorological and agricultural applications. Microwave active remote sensing data have gained increasing importance for large-scale estimation of surface SMC because of their ability to monitor the spatial and temporal variation of surface SMC at regional, continental and global scales at a reasonably fine spatial and temporal resolution. Synthetic Aperture Radar (SAR) has high potential for catchment-scale applications due to its high spatial resolution (˜10-20 m) for both vegetated and bare soil surfaces, as well as its all-weather, day-and-night observation capability. However, one prime disadvantage of SAR is that its signal is sensitive not only to SMC but also to Land Use Land Cover (LULC) and surface roughness conditions, making the retrieval of SMC from SAR data an "ill-posed" problem. Moreover, the quantification of uncertainty due to inappropriate surface roughness characterization, soil texture, inversion techniques etc., even in the latest established retrieval methods, is little explored. This paper reports a recently developed method to estimate surface SMC with a probabilistic assessment of the uncertainty associated with the estimation (Pal et al., 2016). Quad-polarized SAR data from Radar Imaging Satellite1 (RISAT1), launched in 2012 by the Indian Space Research Organization (ISRO), and information on LULC regarding bareland and vegetated land (<30 cm height) are used in the estimation, exploiting the potential of multivariate probabilistic assessment through copulas. The salient features of the study are: 1) development of a combined index to understand the role of all the quad-polarized backscattering coefficients and soil texture information in SMC estimation; 2) applicability of the model for different incidence angles using the normalized incidence angle theory proposed by Zribi et al.
(2005); and 3) assessment of uncertainty range of the estimated SMC. Supervised Principal Component Analysis (SPCA) is used for development of combined index and Frank copula is found to be the best-fit copula. The developed model is validated with the field soil moisture values over 334 monitoring points within the study area and used for development of a soil moisture map. While the performance is promising, the model is applicable only for bare and vegetated land. References: Pal, M., Maity, R., Suman, M., Das, S.K., Patel, P., and Srivastava, H.S., (2016). "Satellite-Based Probabilistic Assessment of Soil Moisture Using C-Band Quad-Polarized RISAT1 Data." IEEE Transactions on Geoscience and Remote Sensing, In Press, doi:10.1109/TGRS.2016.2623378. Zribi, M., Baghdadi, N., Holah, N., and Fafin, O., (2005)."New methodology for soil surface moisture estimation and its application to ENVISAT-ASAR multi-incidence data inversion." Remote Sensing of Environment, vol. 96, nos. 3-4, pp. 485-496.
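The Frank copula the authors found to be best-fit has a closed form; a minimal sketch (hypothetical dependence parameter theta) joining the marginal CDF value u of a combined backscatter index with the marginal CDF value v of soil moisture:

```python
import numpy as np

def frank_copula(u, v, theta):
    """Frank copula CDF C(u, v; theta), theta != 0.

    C(u, v) = -(1/theta) * ln(1 + (e^{-theta u}-1)(e^{-theta v}-1)
                                   / (e^{-theta}-1))
    """
    num = (np.exp(-theta * u) - 1.0) * (np.exp(-theta * v) - 1.0)
    den = np.exp(-theta) - 1.0
    return -np.log1p(num / den) / theta

# boundary behaviour of any copula: C(u, 1) = u and C(u, 0) = 0
c = frank_copula(0.3, 1.0, theta=5.0)
```

In a retrieval like the one above, the fitted copula supplies the conditional distribution of soil moisture given the observed index, from which both a point estimate and an uncertainty range can be read off.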
Short-term ensemble radar rainfall forecasts for hydrological applications
NASA Astrophysics Data System (ADS)
Codo de Oliveira, M.; Rico-Ramirez, M. A.
2016-12-01
Flooding is a very common natural disaster around the world, putting local populations and economies at risk. Forecasting floods several hours ahead and issuing warnings are of major importance to permit a proper response in emergency situations. However, it is important to know the uncertainties related to the rainfall forecasting in order to produce more reliable forecasts. Nowcasting models (short-term rainfall forecasts) are able to produce high spatial and temporal resolution predictions that are useful in hydrological applications. Nonetheless, they are subject to uncertainties mainly due to the nowcasting model used, errors in radar rainfall estimation, the temporal development of the velocity field, and the fact that precipitation processes such as growth and decay are not taken into account. In this study an ensemble generation scheme using rain gauge data as a reference to estimate radar errors is used to produce forecasts with up to 3 h lead time. The ensembles try to assess in a realistic way the residual uncertainties that remain even after correction algorithms are applied to the radar data. The ensembles produced are compared with those from a stochastic ensemble generator. Furthermore, the rainfall forecast output was used as an input to a hydrodynamic sewer network model and also to a hydrological model for catchments of different sizes in northern England. A comparative analysis was carried out to assess how the radar uncertainties propagate into these models. The first named author is grateful to CAPES - Ciencia sem Fronteiras for funding this PhD research.
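A highly simplified sketch of the gauge-referenced ensemble idea (toy numbers; operational schemes use spatially and temporally correlated perturbations rather than the independent noise assumed here): estimate the radar error distribution in dB from collocated gauge-radar pairs, then perturb the deterministic nowcast.

```python
import numpy as np

rng = np.random.default_rng(2)

gauge = np.array([1.2, 3.4, 0.8, 2.0, 5.1])   # mm/h at gauge sites
radar = np.array([1.0, 3.0, 1.1, 1.6, 4.2])   # collocated radar estimates

# multiplicative radar error expressed in decibels
err_db = 10.0 * np.log10(gauge / radar)
mu, sigma = err_db.mean(), err_db.std(ddof=1)

# perturb a deterministic nowcast with samples from the error model
nowcast = np.array([2.5, 0.9, 4.0])            # mm/h at three pixels
n_members = 200
perturb = rng.normal(mu, sigma, size=(n_members, nowcast.size))
ensemble = nowcast * 10.0 ** (perturb / 10.0)  # ensemble rainfall members
```

Feeding each member through the sewer-network and catchment models, as in the study, then shows how the radar error spread translates into a spread of simulated flows.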
GIS, remote sensing and spatial modeling for conservation of stone forest landscape in Lunan, China
NASA Astrophysics Data System (ADS)
Zhang, Chuanrong
The Lunan Stone Forest is the world's premier pinnacle karst landscape, with considerable scientific and cultural importance. Because of its inherent ecological fragility and ongoing human disruption, especially the recently burgeoning tourism development, the landscape is stressed and is in danger of being destroyed. Conservation policies have been implemented by the local and national governments, but many problems remain in the national park. For example, there is no accurate detailed map and no computer system to help authorities manage the natural resources. By integrating GIS, remote sensing and spatial modeling, this dissertation investigates the issue of landscape conservation and develops some methodologies to assist in the management of the natural resources in the national park. Four elements are involved: (1) To help decision-makers and residents understand the scope of resource exploitation and develop appropriate protective strategies, the dissertation documents how the landscape has been changed by human activities over the past three decades; (2) To help authorities scientifically designate different levels of protection in the park and to let the public actively participate in conservation decision making, a web-based Spatial Decision Support System for the conservation of the landscape was developed; (3) To make data sharing and integration easy in the future, a GML-based interoperable database for the park was implemented; and (4) To acquire more information and provide uncertainty information to landscape conservation decision-makers, spatial land use patterns were modeled and the distributional uncertainty of land cover categories was assessed using a triplex Markov chain (TMC) model approach.
Drummond, Leslie; Shomstein, Sarah
2013-01-01
The relative contributions of objects (i.e., object-based representations) and underlying spatial locations (i.e., space-based representations) to attentional prioritization and selection remain unclear. In most experimental circumstances the two representations overlap, and thus their respective contributions cannot be evaluated. Here, a dynamic version of the two-rectangle paradigm allowed for a successful de-coupling of spatial and object representations. Space-based (cued spatial location), cued end of the object, and object-based (locations within the cued object) effects were sampled at several timepoints following the cue with high or low certainty as to target location. In the high uncertainty condition spatial benefits prevailed throughout most of the timecourse, as evidenced by facilitatory and inhibitory effects. Additionally, the cued end of the object, rather than the whole object, received the attentional benefit. When target location was predictable (low uncertainty manipulation), only probabilities guided selection (i.e., evidenced by a benefit for the statistically biased location). These results suggest that with high spatial uncertainty, all available information present within the stimulus display is used for the purposes of attentional selection (e.g., spatial locations, cued end of the object), albeit to varying degrees and at different time points. However, as certainty increases, only spatial certainty guides selection (i.e., object ends and whole objects are filtered out). Taken together, these results further elucidate the contributing roles of space- and object-based representations in attentional guidance. PMID:24367302
Estimates of tropical analysis differences in daily values produced by two operational centers
NASA Technical Reports Server (NTRS)
Kasahara, Akira; Mizzi, Arthur P.
1992-01-01
To assess the uncertainty of daily synoptic analyses for the atmospheric state, the intercomparison of three First GARP Global Experiment level IIIb datasets is performed. Daily values of divergence, vorticity, temperature, static stability, vertical motion, mixing ratio, and diagnosed diabatic heating rate are compared for the period of 26 January-11 February 1979. The spatial variance and mean, temporal mean and variance, 2D wavenumber power spectrum, anomaly correlation, and normalized square difference are employed for comparison.
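Two of the comparison statistics named above can be sketched as follows (the normalization of the square difference varies between studies; the form below is one common convention, assumed here):

```python
import numpy as np

def anomaly_correlation(a, b, clim):
    """Correlation of two analyses after removing a shared climatology."""
    da, db = a - clim, b - clim
    return float(np.sum(da * db) / np.sqrt(np.sum(da**2) * np.sum(db**2)))

def normalized_square_difference(a, b):
    """Squared analysis difference scaled by the fields' total power."""
    return float(np.sum((a - b) ** 2) / np.sum(a**2 + b**2))

# toy example: two analyses of the same field and a flat climatology
analysis_a = np.array([1.0, 2.0, 4.0, 3.0])
analysis_b = np.array([1.2, 1.8, 3.9, 3.1])
clim = np.full(4, 2.5)

ac = anomaly_correlation(analysis_a, analysis_b, clim)
nsd = normalized_square_difference(analysis_a, analysis_b)
```

Identical analyses give an anomaly correlation of 1 and a normalized square difference of 0, so departures from those values measure the analysis uncertainty the intercomparison targets.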
Jacob, Benjamin G; Griffith, Daniel A; Muturi, Ephantus J; Caamano, Erick X; Githure, John I; Novak, Robert J
2009-01-01
Background: Autoregressive regression coefficients for Anopheles arabiensis aquatic habitat models are usually assessed using global error techniques and are reported as error covariance matrices. A global statistic, however, will summarize error estimates from multiple habitat locations. This makes it difficult to identify where there are clusters of An. arabiensis aquatic habitats of acceptable prediction. It is therefore useful to conduct some form of spatial error analysis to detect clusters of An. arabiensis aquatic habitats based on uncertainty residuals from individual sampled habitats. In this research, a method of error estimation for spatial simulation models was demonstrated using autocorrelation indices and eigenfunction spatial filters to distinguish among the effects of parameter uncertainty on a stochastic simulation of ecologically sampled Anopheles aquatic habitat covariates. A test for diagnostic checking of error residuals in an An. arabiensis aquatic habitat model may enable intervention efforts targeting productive habitat clusters, based on larval/pupal productivity, by using the asymptotic distribution of parameter estimates from a residual autocovariance matrix. The models considered in this research extend a normal regression analysis previously considered in the literature. Methods: Field and remote-sampled data were collected during July 2006 to December 2007 in the Karima rice-village complex in Mwea, Kenya. SAS 9.1.4® was used to explore univariate statistics, correlations, distributions, and to generate global autocorrelation statistics from the ecologically sampled datasets. A local autocorrelation index was also generated using spatial covariance parameters (i.e., Moran's Indices) in a SAS/GIS® database. The Moran's statistic was decomposed into orthogonal and uncorrelated synthetic map pattern components using a Poisson model with a gamma-distributed mean (i.e., negative binomial regression).
The eigenfunction values from the spatial configuration matrices were then used to define expectations for prior distributions using a Markov chain Monte Carlo (MCMC) algorithm. A set of posterior means were defined in WinBUGS 1.4.3®. After the model had converged, samples from the conditional distributions were used to summarize the posterior distribution of the parameters. Thereafter, a spatial residual trend analysis was used to evaluate variance uncertainty propagation in the model using an autocovariance error matrix. Results: By specifying coefficient estimates in a Bayesian framework, the covariate number of tillers was found to be a significant predictor, positively associated with An. arabiensis aquatic habitats. The spatial filter models accounted for approximately 19% redundant locational information in the ecologically sampled An. arabiensis aquatic habitat data. In the residual error estimation model there was significant positive autocorrelation (i.e., clustering of habitats in geographic space) based on log-transformed larval/pupal data and the sampled covariate depth of habitat. Conclusion: An autocorrelation error covariance matrix and a spatial filter analysis can prioritize mosquito control strategies by providing a computationally attractive and feasible description of variance uncertainty estimates for correctly identifying clusters of prolific An. arabiensis aquatic habitats based on larval/pupal productivity. PMID:19772590
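A minimal sketch of global Moran's I, the autocorrelation index the study decomposes into spatial filter components (toy 4-site binary contiguity matrix; values near +1 indicate clustering of similar habitat values in geographic space):

```python
import numpy as np

def morans_i(x, W):
    """Global Moran's I for values x under spatial weights matrix W."""
    n = len(x)
    z = x - x.mean()
    num = n * np.sum(W * np.outer(z, z))
    den = W.sum() * np.sum(z**2)
    return num / den

# two neighbouring pairs of sites with similar values -> clustering
W = np.array([[0, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.array([10.0, 9.0, 1.0, 2.0])   # e.g. log-transformed larval counts
I = morans_i(x, W)
```

The positive I here is the simple analogue of the significant positive autocorrelation the residual error model reported; the eigenfunction spatial filters in the study are built from the eigenvectors of a centered version of W.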
Liu, Junguo; Folberth, Christian; Yang, Hong; Röckström, Johan; Abbaspour, Karim; Zehnder, Alexander J. B.
2013-01-01
Food security and water scarcity have become two major concerns for future sustainable human development, particularly in the context of climate change. Here we present a comprehensive assessment of climate change impacts on the production and water use of major cereal crops on a global scale with a spatial resolution of 30 arc-minutes for the 2030s (short term) and the 2090s (long term), respectively. Our findings show that impact uncertainties are higher on larger spatial scales (e.g., global and continental) but lower on smaller spatial scales (e.g., national and grid cell). Such patterns allow decision makers and investors to take adaptive measures without being puzzled by a highly uncertain future at the global level. Short-term gains in crop production from climate change are projected for many regions, particularly in African countries, but the gains will mostly vanish and turn to losses in the long run. Irrigation dependence in crop production is projected to increase in general. However, several water-poor regions will rely less heavily on irrigation, conducive to alleviating regional water scarcity. The heterogeneity of spatial patterns and the non-linearity of temporal changes of the impacts call for site-specific adaptive measures with perspectives of reducing short- and long-term risks of future food and water security. PMID:23460901
NASA Astrophysics Data System (ADS)
Chen, C. F.; Liang, C. P.; Jang, C. S.; Chen, J. S.
2016-12-01
Groundwater is one of the most important water resources in the Lanyang Plain. Groundwater in the Lanyang Plain contains arsenic at levels that exceed the current Taiwan Environmental Protection Administration (Taiwan EPA) limit of 10 μg/L. Arsenic in the groundwater of some areas of the Lanyang Plain poses a serious threat to the safe use of groundwater resources, and such poor water quality can adversely impact drinking water uses, leading to human health risks. This study analyzed the potential health risk associated with the ingestion of arsenic-affected groundwater in the arseniasis-endemic Lanyang Plain. Geostatistical approaches are widely used to analyze the spatial variability and distribution of field data with uncertainty, and the estimation of the spatial distribution of arsenic contamination in groundwater is very important in health risk assessment. This study used indicator kriging (IK) and ordinary kriging (OK) to explore the spatial variability of arsenic pollution, and the difference between the IK and OK estimates was compared. The extent of arsenic pollution was spatially determined, and the target cancer risk (TR) and dose response associated with the ingestion of arsenic in groundwater were explored. On this basis, a zonal management plan based on safe groundwater use is formulated. The research findings can serve as a reference for local government administrators in planning regional water supplies and developing groundwater resources in the Lanyang Plain.
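The target cancer risk (TR) calculation for arsenic ingestion typically follows the standard chronic-daily-intake formula; a sketch with assumed US EPA-style default parameters (not values taken from the abstract):

```python
# TR = (C * IR * EF * ED * SF) / (BW * AT)
# C  concentration in water, IR daily intake, EF exposure frequency,
# ED exposure duration, SF oral cancer slope factor, BW body weight,
# AT averaging time. All defaults below are assumed illustrative values.

def target_cancer_risk(conc_mg_per_L,
                       intake_L_per_day=2.0,   # drinking-water intake
                       exposure_freq=365,      # days/year
                       exposure_dur=30,        # years
                       slope_factor=1.5,       # (mg/kg/day)^-1 for arsenic
                       body_weight=70.0,       # kg
                       avg_time=70 * 365):     # days (lifetime)
    chronic_daily_intake = (conc_mg_per_L * intake_L_per_day *
                            exposure_freq * exposure_dur) / (body_weight * avg_time)
    return chronic_daily_intake * slope_factor

# 10 ug/L, the Taiwan EPA limit cited above, is 0.010 mg/L
tr_at_limit = target_cancer_risk(0.010)
```

Applying this formula cell by cell to the kriged arsenic surface yields the spatial TR map on which a zonal management plan can be based.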
NASA Technical Reports Server (NTRS)
Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Mueller, James L.; Mclean, James T.; Johnson, B. Carol; Westphal, Todd L.; Cooper, John W.
1994-01-01
The results of the second Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Intercalibration Round-Robin Experiment (SIRREX-2), which was held at the Center for Hydro-Optics and Remote Sensing (CHORS) at San Diego State University on 14-25 Jun. 1993, are presented. SeaWiFS is an ocean color radiometer that is scheduled for launch in 1994. The SIRREXs are part of the SeaWiFS Calibration and Validation Program that includes the GSFC, CHORS, NIST, and several other laboratories. GSFC maintains the radiometric scales (spectral radiance and irradiance) for the SeaWiFS program using spectral irradiance standard lamps, which are calibrated by NIST. The purpose of each SIRREX is to assure that the radiometric scales which are realized by the laboratories that participate in the SeaWiFS Calibration and Validation Program are correct; that is, the uncertainties of the radiometric scales are such that measurements of normalized water-leaving radiance using oceanographic radiometers have uncertainties of 5%. SIRREX-1 demonstrated, from the internal consistency of the results, that the program goals would not be met without improvements to the instrumentation. The results of SIRREX-2 demonstrate that spectral irradiance scales realized using the GSFC standard irradiance lamp (F269) are consistent with the program goals, as the uncertainty of these measurements is assessed to be about 1%. However, this is not true for the spectral radiance scales, where again the internal consistency of the results is used to assess the uncertainty. This is attributed to inadequate performance and characterization of the instrumentation. For example, spatial nonuniformities, spectral features, and sensitivity to illumination configuration were observed in some of the integrating sphere sources.
The results of SIRREX-2 clearly indicate the direction for future work, with the main emphasis on instrument characterization and the assessment of the measurement uncertainties so that the results may be stated in a more definitive manner.
Chart-Asa, Chidsanuphong; Gibson, Jacqueline MacDonald
2015-02-15
This paper develops and then demonstrates a new approach for quantifying health impacts of traffic-related particulate matter air pollution at the urban project scale that includes variability and uncertainty in the analysis. We focus on primary particulate matter having a diameter less than 2.5 μm (PM2.5). The new approach accounts for variability in vehicle emissions due to temperature, road grade, and traffic behavior variability; seasonal variability in concentration-response coefficients; demographic variability at a fine spatial scale; uncertainty in air quality model accuracy; and uncertainty in concentration-response coefficients. We demonstrate the approach for a case study roadway corridor with a population of 16,000, where a new extension of the University of North Carolina (UNC) at Chapel Hill campus is slated for construction. The results indicate that at this case study site, health impact estimates increased by factors of 4-9, depending on the health impact considered, compared to using a conventional health impact assessment approach that overlooks these variability and uncertainty sources. In addition, we demonstrate how the method can be used to assess health disparities. For example, in the case study corridor, our method demonstrates the existence of statistically significant racial disparities in exposure to traffic-related PM2.5 under present-day traffic conditions: the correlation between percent black and annual attributable deaths in each census block is 0.37 (t(114)=4.2, p<0.0001). Overall, our results show that the proposed new campus will cause only a small incremental increase in health risks (annual risk 6×10(-10); lifetime risk 4×10(-8)), compared to if the campus is not built. Nonetheless, the approach we illustrate could be useful for improving the quality of information to support decision-making for other urban development projects. Copyright © 2014 Elsevier B.V. All rights reserved.
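The Monte Carlo treatment of concentration-response uncertainty can be sketched as follows. The population, baseline mortality rate, PM2.5 increment, and beta distribution below are illustrative placeholders, not the study's inputs.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative inputs (not the study's actual values)
pop = 16_000            # corridor population
y0 = 0.008              # baseline annual mortality rate
delta_c = 1.5           # modeled PM2.5 increment (ug/m3)

# Concentration-response coefficient with uncertainty (hypothetical mean/sd)
beta = rng.normal(0.006, 0.002, 10_000)   # per ug/m3

# Log-linear health impact function, Monte Carlo over beta
deaths = y0 * pop * (1.0 - np.exp(-beta * delta_c))
lo, med, hi = np.percentile(deaths, [2.5, 50, 97.5])
```

Propagating the beta uncertainty yields an interval for annual attributable deaths rather than the single point estimate a conventional assessment would report.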
NASA Technical Reports Server (NTRS)
Roman, Miguel O.; Gatebe, Charles K.; Shuai, Yanmin; Wang, Zhuosen; Gao, Feng; Masek, Jeff; Schaaf, Crystal B.
2012-01-01
The quantification of uncertainty of global surface albedo data and products is a critical part of producing complete, physically consistent, and decadal land property data records for studying ecosystem change. A current challenge in validating satellite retrievals of surface albedo is the ability to overcome the spatial scaling errors that can contribute on the order of 20% disagreement between satellite and field-measured values. Here, we present the results from an uncertainty analysis of MODerate Resolution Imaging Spectroradiometer (MODIS) and Landsat albedo retrievals, based on collocated comparisons with tower and airborne multi-angular measurements collected at the Atmospheric Radiation Measurement Program's (ARM) Cloud and Radiation Testbed (CART) site during the 2007 Cloud and Land Surface Interaction Campaign (CLASIC '07). Using standard error propagation techniques, airborne measurements obtained by NASA's Cloud Absorption Radiometer (CAR) were used to quantify the uncertainties associated with MODIS and Landsat albedos across a broad range of mixed vegetation and structural types. Initial focus was on evaluating inter-sensor consistency through assessments of temporal stability, as well as examining the overall performance of satellite-derived albedos obtained at all diurnal solar zenith angles. In general, the accuracy of the MODIS and Landsat albedos remained under a 10% margin of error in the SW (0.3-5.0 μm) domain. However, results reveal a high degree of variability in the RMSE (root mean square error) and bias of albedos in both the visible (0.3-0.7 μm) and near-infrared (0.7-5.0 μm) broadband channels, where, in some cases, retrieval uncertainties were found to be in excess of 20%.
For the period of CLASIC '07, the primary factors that contributed to uncertainties in the satellite-derived albedo values include: (1) the assumption of temporal stability in the retrieval of 500 m MODIS BRDF values over extended periods of cloud-contaminated observations; and (2) the assumption of spatial and structural uniformity at the Landsat (30 m) pixel scale.
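The basic retrieval-accuracy metrics (bias, RMSE, and relative error compared against the 10% margin) can be sketched as follows, using synthetic albedo values rather than CAR data.

```python
import numpy as np

def retrieval_stats(satellite, reference):
    """Bias, RMSE, and relative RMSE of satellite vs. field-measured albedo."""
    diff = satellite - reference
    bias = diff.mean()
    rmse = np.sqrt((diff ** 2).mean())
    rel = rmse / reference.mean()   # fractional uncertainty, cf. the 10% margin
    return bias, rmse, rel

rng = np.random.default_rng(2)
ref = rng.uniform(0.15, 0.25, 100)        # hypothetical reference albedos
sat = ref + rng.normal(0.0, 0.01, 100)    # satellite retrievals with noise
bias, rmse, rel = retrieval_stats(sat, ref)
```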
Jones-Farrand, D. Todd; Fearer, Todd M.; Thogmartin, Wayne E.; Thompson, Frank R.; Nelson, Mark D.; Tirpak, John M.
2011-01-01
Selection of a modeling approach is an important step in the conservation planning process, but little guidance is available. We compared two statistical and three theoretical habitat modeling approaches representing those currently being used for avian conservation planning at landscape and regional scales: hierarchical spatial count (HSC), classification and regression tree (CRT), habitat suitability index (HSI), forest structure database (FS), and habitat association database (HA). We focused our comparison on models for five priority forest-breeding species in the Central Hardwoods Bird Conservation Region: Acadian Flycatcher, Cerulean Warbler, Prairie Warbler, Red-headed Woodpecker, and Worm-eating Warbler. Lacking complete knowledge on the distribution and abundance of each species with which we could illuminate differences between approaches and provide strong grounds for recommending one approach over another, we used two approaches to compare models: rank correlations among model outputs and comparison of spatial correspondence. In general, rank correlations were significantly positive among models for each species, indicating general agreement among the models. Worm-eating Warblers had the highest pairwise correlations, all of which were significant (P < 0.05). Red-headed Woodpeckers had the lowest agreement among models, suggesting greater uncertainty in the relative conservation value of areas within the region. We assessed model uncertainty by mapping the spatial congruence in priorities (i.e., top ranks) resulting from each model for each species and calculating the coefficient of variation across model ranks for each location. This allowed identification of areas more likely to be good targets of conservation effort for a species, those areas that were least likely, and those in between where uncertainty is higher and thus conservation action incorporates more risk. 
Based on our results, models developed independently for the same purpose (conservation planning for a particular species in a particular geography) yield different answers and thus different conservation strategies. We assert that using only one habitat model (even if validated) as the foundation of a conservation plan is risky. Using multiple models (i.e., ensemble prediction) can reduce uncertainty and increase efficacy of conservation action when models corroborate one another and increase understanding of the system when they do not.
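The two comparison devices described above, rank correlations among model outputs and the coefficient of variation across model ranks per location, can be sketched with synthetic habitat scores and three stand-in models.

```python
import numpy as np
from scipy.stats import spearmanr, rankdata

rng = np.random.default_rng(3)
n_sites = 200

# Hypothetical habitat scores from three models sharing a common signal
signal = rng.random(n_sites)
models = np.array([signal + rng.normal(0, 0.2, n_sites) for _ in range(3)])

# Pairwise rank agreement between two model outputs
rho, p = spearmanr(models[0], models[1])

# Per-site rank under each model, then coefficient of variation across models
ranks = np.array([rankdata(m) for m in models])
cv = ranks.std(axis=0) / ranks.mean(axis=0)   # high CV = high model disagreement
priority = np.argsort(cv)[:20]                # sites ranked most consistently
```

Low-CV sites are safer conservation targets; high-CV sites carry more model-driven risk, mirroring the ensemble logic of the abstract.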
Lu, Shasha; Zhou, Min; Guan, Xingliang; Tao, Lizao
2015-03-01
A large number of mathematical models have been developed for supporting optimization of land-use allocation; however, few of them simultaneously consider land suitability (e.g., physical features and spatial information) and various uncertainties existing in many factors (e.g., land availabilities, land demands, land-use patterns, and ecological requirements). This paper incorporates geographic information system (GIS) technology into interval-probabilistic programming (IPP) for land-use planning management (IPP-LUPM). GIS is utilized to assemble data for the aggregated land-use alternatives, and IPP is developed for tackling uncertainties presented as discrete intervals and probability distributions. Based on GIS, the suitability maps of different land uses are provided by the outcomes of land suitability assessment and spatial analysis. The maximum area of every type of land use obtained from the suitability maps, as well as various objectives/constraints (i.e., land supply, land demand of socioeconomic development, future development strategies, and environmental capacity), is used as input data for the optimization of land-use areas with the IPP-LUPM model. The proposed model not only considers the outcomes of land suitability evaluation (i.e., topography, ground conditions, hydrology, and spatial location) but also involves economic factors, food security, and eco-environmental constraints, which can effectively reflect various interrelations among different aspects in a land-use planning management system. The case study results for Suzhou, China, demonstrate that the model can help to examine the reliability of satisfying (or risk of violating) system constraints under uncertainty. Moreover, it may identify the quantitative relationship between land suitability and system benefits. 
A willingness to arrange land areas according to high land suitability will reduce potential conflicts with the environmental system but also leads to a lower economic benefit. Conversely, a strong desire to develop land areas of lower suitability will bring not only a higher economic benefit but also higher risks of violating environmental and ecological constraints. The land manager should therefore make decisions through trade-offs between economic objectives and environmental/ecological objectives.
A High Performance Bayesian Computing Framework for Spatiotemporal Uncertainty Modeling
NASA Astrophysics Data System (ADS)
Cao, G.
2015-12-01
All types of spatiotemporal measurements are subject to uncertainty. As spatiotemporal data become increasingly involved in scientific research and decision making, it is important to appropriately model the impact of uncertainty. Quantitatively modeling spatiotemporal uncertainty, however, is a challenging problem given the complex dependence structures and data heterogeneities. State-space models provide a unifying and intuitive framework for dynamic systems modeling. In this paper, we aim to extend the conventional state-space models for uncertainty modeling in space-time contexts while accounting for spatiotemporal effects and data heterogeneities. Gaussian Markov Random Field (GMRF) models, also known as conditional autoregressive models, are arguably the most commonly used methods for modeling spatially dependent data. GMRF models basically assume that a geo-referenced variable primarily depends on its neighborhood (Markov property), and the spatial dependence structure is described via a precision matrix. Recent studies have shown that GMRFs are efficient approximations to the commonly used Gaussian fields (e.g., kriging), and compared with Gaussian fields, GMRFs enjoy a series of appealing features, such as fast computation and easy accommodation of heterogeneities in spatial data (e.g., point and areal). This paper represents each spatial dataset as a GMRF and integrates them into a state-space form to statistically model the temporal dynamics. Different types of spatial measurements (e.g., categorical, count, or continuous) can be accounted for through appropriate link functions. A fast alternative to the MCMC framework, the so-called Integrated Nested Laplace Approximation (INLA), was adopted for model inference. Preliminary case studies will be conducted to showcase the advantages of the described framework. In the first case, we apply the proposed method to modeling the water table elevation of the Ogallala aquifer over the past decades. 
In the second case, we analyze the drought impacts in Texas counties in the past years, where the spatiotemporal dynamics are represented in areal data.
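A minimal sketch of the GMRF building block follows: a first-order precision matrix on a small illustrative lattice and one sample drawn from it. The state-space coupling and INLA inference are beyond this toy example; the grid size and kappa value are assumptions.

```python
import numpy as np

def grid_precision(nx, ny, kappa=0.1):
    """Precision matrix Q of a first-order GMRF on an nx-by-ny lattice:
    -1 for rook neighbours, node degree (+ a small kappa) on the diagonal.
    kappa > 0 makes Q positive definite (a proper GMRF)."""
    n = nx * ny
    Q = np.zeros((n, n))
    for i in range(nx):
        for j in range(ny):
            k = i * ny + j
            Q[k, k] = kappa
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                a, b = i + di, j + dj
                if 0 <= a < nx and 0 <= b < ny:
                    Q[k, a * ny + b] = -1.0
                    Q[k, k] += 1.0
    return Q

# Draw one realization x ~ N(0, Q^{-1}) using the Cholesky factor of Q:
# if Q = L L^T and z ~ N(0, I), then x = L^{-T} z has covariance Q^{-1}.
Q = grid_precision(8, 8)
L = np.linalg.cholesky(Q)
z = np.random.default_rng(4).standard_normal(Q.shape[0])
field = np.linalg.solve(L.T, z).reshape(8, 8)
```

The sparsity of Q (each row touches at most five entries) is exactly the Markov property that gives GMRFs their computational advantage over dense Gaussian-field covariances.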
Oldenkamp, Rik; Huijbregts, Mark A J; Ragas, Ad M J
2016-05-01
The selection of priority APIs (Active Pharmaceutical Ingredients) can benefit from a spatially explicit approach, since an API might exceed the threshold of environmental concern in one location, while staying below that same threshold in another. However, such a spatially explicit approach is relatively data intensive and subject to parameter uncertainty due to limited data. This raises the question to what extent a spatially explicit approach for the environmental prioritisation of APIs remains worthwhile when accounting for uncertainty in parameter settings. We show here that the inclusion of spatially explicit information enables a more efficient environmental prioritisation of APIs in Europe, compared with a non-spatial EU-wide approach, also under uncertain conditions. In a case study with nine antibiotics, uncertainty distributions of the PAF (Potentially Affected Fraction) of aquatic species were calculated in 100 × 100 km(2) environmental grid cells throughout Europe, and used for the selection of priority APIs. Two APIs have median PAF values that exceed a threshold PAF of 1% in at least one environmental grid cell in Europe, i.e., oxytetracycline and erythromycin. At a tenfold lower threshold PAF (i.e., 0.1%), two additional APIs would be selected, i.e., cefuroxime and ciprofloxacin. However, in 94% of the environmental grid cells in Europe, no APIs exceed either of the thresholds. This illustrates the advantage of following a location-specific approach in the prioritisation of APIs. This added value remains when accounting for uncertainty in parameter settings, i.e., if the 95th percentile of the PAF instead of its median value is compared with the threshold. In 96% of the environmental grid cells, the location-specific approach still enables a reduction of the selection of priority APIs of at least 50%, compared with an EU-wide prioritisation. Copyright © 2016 Elsevier Ltd. All rights reserved.
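The cell-by-cell prioritisation logic can be sketched as follows. The PAF distributions below are synthetic lognormals with an arbitrary gradient across APIs, not the case-study antibiotics.

```python
import numpy as np

rng = np.random.default_rng(5)
n_cells, n_apis, n_draws = 50, 9, 500

# Hypothetical PAF uncertainty draws per grid cell and API (fractions);
# a gradient in the lognormal mean makes some APIs riskier than others
means = np.linspace(-9.0, -4.0, n_apis)
paf = rng.lognormal(mean=means[None, :, None], sigma=1.5,
                    size=(n_cells, n_apis, n_draws))

threshold = 0.01   # 1% PAF threshold of environmental concern

# Location-specific prioritisation: select an API in a cell where its
# 95th percentile (a precautionary choice under uncertainty) exceeds the threshold
p95 = np.percentile(paf, 95, axis=2)
selected = p95 > threshold            # (cells, apis) boolean
apis_per_cell = selected.sum(axis=1)

# EU-wide style prioritisation: an API is selected everywhere if it
# exceeds the threshold anywhere
eu_wide = selected.any(axis=0).sum()
```

Comparing `apis_per_cell` with `eu_wide` reproduces, in miniature, the abstract's point that a location-specific approach shortens the priority list in most cells.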
NASA Astrophysics Data System (ADS)
Geng, Guannan; Zhang, Qiang; Martin, Randall V.; Lin, Jintai; Huo, Hong; Zheng, Bo; Wang, Siwen; He, Kebin
2017-03-01
Spatial proxies used in bottom-up emission inventories to derive the spatial distributions of emissions are usually empirical and involve additional levels of uncertainty. Although uncertainties in current emission inventories have been discussed extensively, uncertainties resulting from improper spatial proxies have rarely been evaluated. In this work, we investigate the impact of spatial proxies on the representation of gridded emissions by comparing six gridded NOx emission datasets over China developed from the same magnitude of emissions and different spatial proxies. GEOS-Chem-modeled tropospheric NO2 vertical columns simulated from the different gridded emission inventories are compared with satellite-based columns. The results show that differences between modeled and satellite-based NO2 vertical columns are sensitive to the spatial proxies used in the gridded emission inventories. Total population density is less suitable for allocating NOx emissions than nighttime light data because population density tends to allocate more emissions to rural areas. Determining the exact locations of large emission sources could significantly strengthen the correlation between modeled and observed NO2 vertical columns. Using vehicle population and an updated road network for the on-road transport sector could substantially enhance urban emissions and improve the model performance. When further applying industrial gross domestic product (IGDP) values for the industrial sector, modeled NO2 vertical columns could better capture pollution hotspots in urban areas and exhibit the best performance of the six cases compared to satellite-based NO2 vertical columns (slope = 1.01 and R2 = 0.85). This analysis provides a framework for using information from satellite observations to inform bottom-up inventory development. In the future, more effort should be devoted to the representation of spatial proxies to improve spatial patterns in bottom-up emission inventories.
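The core of proxy-based gridding, distributing a fixed emission total over a grid in proportion to a spatial proxy, can be sketched as follows. The proxies, the national total, and the grid are all synthetic.

```python
import numpy as np

def allocate(total_emissions, proxy):
    """Distribute a regional emission total onto a grid proportionally
    to a spatial proxy (population, nightlights, road density, ...)."""
    w = proxy / proxy.sum()
    return total_emissions * w

rng = np.random.default_rng(6)
shape = (20, 20)
population = rng.gamma(2.0, 50.0, shape)   # hypothetical proxy fields
nightlights = population * rng.uniform(0.5, 2.0, shape) * (population > 60)

total_nox = 1.0e6   # illustrative national total (tonnes)

by_pop = allocate(total_nox, population)
by_light = allocate(total_nox, nightlights + 1e-9)  # epsilon avoids zero-sum

# Same total magnitude, different spatial pattern -- and hence
# different modeled NO2 columns when fed to a chemical transport model
assert np.isclose(by_pop.sum(), by_light.sum())
```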
Assessing Uncertainty in Risk Assessment Models (BOSC CSS meeting)
In vitro assays are increasingly being used in risk assessments. Uncertainty in assays leads to uncertainty in the models used for risk assessments. This poster assesses uncertainty in the ER and AR models.
Alexeeff, Stacey E; Carroll, Raymond J; Coull, Brent
2016-04-01
Spatial modeling of air pollution exposures is widespread in air pollution epidemiology research as a way to improve exposure assessment. However, there are key sources of exposure model uncertainty when air pollution is modeled, including estimation error and model misspecification. We examine the use of predicted air pollution levels in linear health effect models under a measurement error framework. For the prediction of air pollution exposures, we consider a universal Kriging framework, which may include land-use regression terms in the mean function and a spatial covariance structure for the residuals. We derive the bias induced by estimation error and by model misspecification in the exposure model, and we find that a misspecified exposure model can induce asymptotic bias in the effect estimate of air pollution on health. We propose a new spatial simulation extrapolation (SIMEX) procedure, and we demonstrate that the procedure has good performance in correcting this asymptotic bias. We illustrate spatial SIMEX in a study of air pollution and birthweight in Massachusetts. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
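The SIMEX idea, inflating the measurement error and extrapolating the attenuated coefficient back to zero error, can be sketched for a simple linear model. The data are synthetic; the paper's spatial SIMEX additionally models the error structure of the kriged exposure surface.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000

# True exposure, mismeasured exposure, and health outcome
x_true = rng.normal(10.0, 2.0, n)
sigma_u = 1.0                                 # known measurement-error sd
x_obs = x_true + rng.normal(0.0, sigma_u, n)
beta_true = 0.5
y = beta_true * x_true + rng.normal(0.0, 1.0, n)

def slope(x, y):
    return np.polyfit(x, y, 1)[0]

# SIMEX: inflate the error variance by factor (1 + lam), track the
# attenuated slope, then extrapolate the trend back to lam = -1
# (i.e. zero measurement error)
lams = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
slopes = []
for lam in lams:
    sims = [slope(x_obs + rng.normal(0.0, np.sqrt(lam) * sigma_u, n), y)
            for _ in range(20)]
    slopes.append(np.mean(sims))
coef = np.polyfit(lams, slopes, 2)    # quadratic extrapolant
beta_simex = np.polyval(coef, -1.0)
beta_naive = slope(x_obs, y)          # attenuated by the error
```

The naive slope is biased toward zero by the classical attenuation factor; the SIMEX extrapolation recovers most of that bias, which is the behavior the paper demonstrates for its spatial variant.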
Uncertainty Analysis of A Flood Risk Mapping Procedure Applied In Urban Areas
NASA Astrophysics Data System (ADS)
Krause, J.; Uhrich, S.; Bormann, H.; Diekkrüger, B.
In the framework of the IRMA-Sponge program, the presented study was part of the joint research project FRHYMAP (flood risk and hydrological mapping). A simple conceptual flooding model (FLOODMAP) has been developed to simulate flooded areas besides rivers within cities. FLOODMAP requires a minimum of input data (digital elevation model (DEM), river line, water level plain) and parameters, and calculates the flood extent as well as the spatial distribution of flood depths. Of course, the simulated model results are affected by errors and uncertainties; possible sources of uncertainty are the model structure, model parameters, and input data. Thus, after the model validation (comparison of the simulated to the observed flood extent, taken from airborne pictures), the uncertainty of the essential input data set (the digital elevation model) was analysed. Monte Carlo simulations were performed to assess the effect of uncertainties in the statistics of DEM quality and to derive flooding probabilities from the set of simulations. The questions concerning the minimum DEM resolution required for flood simulation and the best aggregation procedure for a given DEM were answered by comparing the results obtained using all available standard GIS aggregation procedures. Seven different aggregation procedures were applied to high-resolution DEMs (1-2 m) in three cities (Bonn, Cologne, Luxembourg). Based on this analysis, the effect of 'uncertain' DEM data was estimated and compared with other sources of uncertainty. Socio-economic information and monetary transfer functions required for a damage risk analysis show especially high uncertainty. Therefore this study helps to analyse the weak points of the flood risk and damage risk assessment procedure.
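The Monte Carlo treatment of DEM uncertainty can be sketched as follows: perturb the DEM, flag flooded cells, and average over realizations to obtain per-cell flooding probabilities. The DEM, water level, and error standard deviation are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical DEM (m a.s.l.) sloping away from a river, and a flat water level
dem = np.cumsum(rng.normal(0.05, 0.02, (50, 50)), axis=1) + 10.0
water_level = 11.0
dem_sigma = 0.15          # assumed vertical DEM error (m)

# Monte Carlo: perturb the DEM, flag flooded cells, average to a probability
n_runs = 200
flooded = np.zeros_like(dem)
for _ in range(n_runs):
    realization = dem + rng.normal(0.0, dem_sigma, dem.shape)
    flooded += (realization < water_level)
flood_prob = flooded / n_runs   # per-cell flooding probability in [0, 1]
```

Cells with probabilities near 0 or 1 are insensitive to DEM error; intermediate values mark the zone where DEM quality genuinely drives the flood-extent uncertainty.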
Uncertainty in gridded CO2 emissions estimates
Hogue, Susannah; Marland, Eric; Andres, Robert J.; ...
2016-05-19
We are interested in the spatial distribution of fossil-fuel-related emissions of CO2 for both geochemical and geopolitical reasons, but it is important to understand the uncertainty that exists in spatially explicit emissions estimates. Working from one of the widely used gridded data sets of CO2 emissions, we examine the elements of uncertainty, focusing on gridded data for the United States at the scale of 1° latitude by 1° longitude. Uncertainty is introduced in the magnitude of total United States emissions, the magnitude and location of large point sources, the magnitude and distribution of non-point sources, and from the use of proxy data to characterize emissions. For the United States, we develop estimates of the contribution of each component of uncertainty. At 1° resolution, in most grid cells, the largest contribution to uncertainty comes from how well the distribution of the proxy (in this case population density) represents the distribution of emissions. In other grid cells, the magnitude and location of large point sources make the major contribution to uncertainty. Uncertainty in population density can be important where a large gradient in population density occurs near a grid cell boundary. Uncertainty is strongly scale-dependent, with uncertainty increasing as grid size decreases. In conclusion, uncertainty for our data set with 1° grid cells for the United States is typically on the order of ±150%, but this is perhaps not excessive in a data set where emissions per grid cell vary over 8 orders of magnitude.
Uncertainty Analysis of Downscaled CMIP5 Precipitation Data for Louisiana, USA
NASA Astrophysics Data System (ADS)
Sumi, S. J.; Tamanna, M.; Chivoiu, B.; Habib, E. H.
2014-12-01
The downscaled CMIP3 and CMIP5 Climate and Hydrology Projections dataset contains fine spatial resolution translations of climate projections over the contiguous United States developed using two downscaling techniques (monthly Bias Correction Spatial Disaggregation (BCSD) and daily Bias Correction Constructed Analogs (BCCA)). The objective of this study is to assess the uncertainty of the CMIP5 downscaled general circulation models (GCMs). We performed an analysis of the daily, monthly, seasonal and annual variability of precipitation downloaded from the Downscaled CMIP3 and CMIP5 Climate and Hydrology Projections website for the state of Louisiana, USA at 0.125° x 0.125° resolution. A data set of daily gridded observations of precipitation over a rectangular boundary covering Louisiana is used to assess the validity of 21 downscaled GCMs for the 1950-1999 period. The following statistics are computed for the 21 models with respect to the observed dataset: the correlation coefficient, the bias, the normalized bias, the mean absolute error (MAE), the mean absolute percentage error (MAPE), and the root mean square error (RMSE). A measure of the variability simulated by each model is computed as the ratio of its standard deviation, in both space and time, to the corresponding standard deviation of the observations. The correlation and MAPE statistics are also computed for each of the nine climate divisions of Louisiana. Some of the patterns that we observed are: 1) Average annual precipitation rate shows a similar spatial distribution for all the models within a range of 3.27 to 4.75 mm/day from Northwest to Southeast. 2) The standard deviation of summer (JJA) precipitation (mm/day) for the models maintains lower values than the observations, whereas they have similar spatial patterns and ranges of values in winter (NDJ). 3) Correlation coefficients of annual precipitation of the models against observations have a range of -0.48 to 0.36 with variable spatial distribution by model. 
4) Most of the models show negative correlation coefficients in summer and positive in winter. 5) MAE shows similar spatial distribution for all the models within a range of 5.20 to 7.43 mm/day from Northwest to Southeast of Louisiana. 6) Highest values of correlation coefficients are found at seasonal scale within a range of 0.36 to 0.46.
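The verification statistics listed above can be computed with a small helper, sketched here on synthetic daily precipitation series rather than the CMIP5/observation grids.

```python
import numpy as np

def verify(model, obs):
    """Verification statistics for comparing a downscaled GCM series
    with gridded observations (here on synthetic 1-D daily series)."""
    err = model - obs
    return {
        "corr": np.corrcoef(model, obs)[0, 1],
        "bias": err.mean(),
        "mae": np.abs(err).mean(),
        "mape": np.abs(err / obs).mean() * 100.0,   # percent
        "rmse": np.sqrt((err ** 2).mean()),
        "var_ratio": model.std() / obs.std(),  # simulated vs. observed variability
    }

rng = np.random.default_rng(9)
obs = rng.gamma(2.0, 2.0, 365) + 0.1       # hypothetical daily rain (mm/day)
model = obs * rng.uniform(0.6, 1.4, 365)   # a downscaled-model stand-in
stats = verify(model, obs)
```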
Control of experimental uncertainties in filtered Rayleigh scattering measurements
NASA Technical Reports Server (NTRS)
Forkey, Joseph N.; Finkelstein, N. D.; Lempert, Walter R.; Miles, Richard B.
1995-01-01
Filtered Rayleigh Scattering is a technique which allows for the measurement of velocity, temperature, and pressure in unseeded flows, spatially resolved in two dimensions. We present an overview of the major components of a Filtered Rayleigh Scattering system. In particular, we develop and discuss a detailed theoretical model along with the associated model parameters and related uncertainties. Based on this model, we then present experimental results for ambient room air and for a Mach 2 free jet, including spatially resolved measurements of velocity, temperature, and pressure.
Analyzing extreme sea levels for broad-scale impact and adaptation studies
NASA Astrophysics Data System (ADS)
Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Dangendorf, S.; Hinkel, J.; Slangen, A.
2017-12-01
Coastal impact and adaptation assessments require detailed knowledge of extreme sea levels (ESL), because increasing damage due to extreme events is one of the major consequences of sea-level rise (SLR) and climate change. Over the last few decades, substantial research efforts have been directed towards improved understanding of past and future SLR; different scenarios were developed with process-based or semi-empirical models and used for coastal impact studies at various temporal and spatial scales to guide coastal management and adaptation efforts. Uncertainties in future SLR are typically accounted for by analyzing the impacts associated with a range of scenarios and model ensembles. ESL distributions are then displaced vertically according to the SLR scenarios under the inherent assumption that we have perfect knowledge of the statistics of extremes. However, there is still a limited understanding of present-day ESL, which is largely ignored in most impact and adaptation analyses. The two key uncertainties stem from: (1) the numerical models that are used to generate long time series of storm surge water levels, and (2) the statistical models used for determining present-day ESL exceedance probabilities. There is no universally accepted approach to obtain such values for broad-scale flood risk assessments, and while substantial research has explored SLR uncertainties, we quantify, for the first time globally, key uncertainties in ESL estimates. We find that contemporary ESL uncertainties exceed those from SLR projections and, assuming that we meet the Paris agreement, the projected SLR itself by the end of the century. Our results highlight the necessity to further improve our understanding of uncertainties in ESL estimates through (1) continued improvement of the numerical and statistical models used to simulate and analyze coastal water levels and (2) exploiting the rich observational database and continuing data archeology to obtain longer time series and remove model bias. 
Finally, ESL uncertainties need to be integrated with SLR uncertainties. Otherwise, important improvements in providing more robust SLR projections are of less benefit for broad-scale impact and adaptation studies and decision processes.
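A common way to estimate present-day ESL exceedance probabilities, and the vertical-displacement assumption the abstract questions, can be sketched with a GEV fit to synthetic annual maxima using scipy's `genextreme`. The shape, location, scale, and SLR values are illustrative.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(10)

# Hypothetical annual-maximum still-water levels at a tide gauge (m)
annual_max = genextreme.rvs(-0.1, loc=2.0, scale=0.3, size=60, random_state=rng)

# Fit a GEV and read off the 100-year return level (1% annual exceedance)
c, loc, scale = genextreme.fit(annual_max)
rl100 = genextreme.isf(0.01, c, loc, scale)

# Bootstrap the fit to put an uncertainty band around the ESL estimate
boot = []
for _ in range(100):
    resample = rng.choice(annual_max, size=annual_max.size, replace=True)
    boot.append(genextreme.isf(0.01, *genextreme.fit(resample)))
lo, hi = np.percentile(boot, [5, 95])

# The common impact-study assumption: displace the ESL distribution
# vertically by a sea-level-rise scenario (0.5 m is illustrative)
slr = 0.5
rl100_future = rl100 + slr
```

Comparing the width of `[lo, hi]` with the SLR increment is, in miniature, the comparison the paper makes between ESL and SLR uncertainties.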
Probabilistic forecasts based on radar rainfall uncertainty
NASA Astrophysics Data System (ADS)
Liguori, S.; Rico-Ramirez, M. A.
2012-04-01
The potential advantages resulting from integrating weather radar rainfall estimates in hydro-meteorological forecasting systems are limited by the inherent uncertainty affecting radar rainfall measurements, which is due to various sources of error [1-3]. The improvement of quality control and correction techniques is recognized to play a role in the future improvement of radar-based flow predictions. However, knowledge of the uncertainty affecting radar rainfall data can also be effectively used to build a hydro-meteorological forecasting system in a probabilistic framework. This work discusses the results of the implementation of a novel probabilistic forecasting system developed to improve ensemble predictions over a small urban area located in the North of England. An ensemble of radar rainfall fields can be determined as the sum of a deterministic component and a perturbation field, the latter being informed by knowledge of the spatial-temporal characteristics of the radar error assessed with reference to rain-gauge measurements. This approach is similar to the REAL system [4] developed for use in the Southern Alps. The radar uncertainty estimate can then be propagated with a nowcasting model, used to extrapolate an ensemble of radar rainfall forecasts, which can ultimately drive hydrological ensemble predictions. A radar ensemble generator has been calibrated using radar rainfall data made available from the UK Met Office after applying post-processing and correction algorithms [5-6]. One-hour rainfall accumulations from 235 rain gauges recorded for the year 2007 have provided the reference to determine the radar error. Statistics describing the spatial characteristics of the error (i.e. mean and covariance) have been computed off-line at gauge locations, along with the parameters describing the error temporal correlation. 
A system has then been set up to impose the space-time error properties on stochastic perturbations, generated in real time at gauge locations and then interpolated back onto the radar domain, in order to obtain probabilistic radar rainfall fields in real time. The deterministic nowcasting model integrated in the STEPS system [7-8] has been used for the purpose of propagating the uncertainty and assessing the benefit of implementing the radar ensemble generator for probabilistic rainfall forecasts and, ultimately, sewer flow predictions. For this purpose, events representative of different types of precipitation (i.e. stratiform/convective) and significant at the urban catchment scale (i.e. in terms of sewer overflow within the urban drainage system) have been selected. As high spatial/temporal resolution is required of the forecasts for their use in urban areas [9-11], the probabilistic nowcasts have been set up to be produced at 1 km resolution and 5 min intervals. The forecasting chain is completed by a hydrodynamic model of the urban drainage network. The aim of this work is to discuss the implementation of this probabilistic system, which takes into account the radar error to characterize the forecast uncertainty, with consequent potential benefits in the management of urban systems. It will also allow a comparison with previous findings related to the analysis of different approaches to uncertainty estimation and quantification in terms of rainfall [12] and flows at the urban scale [13]. Acknowledgements The authors would like to acknowledge the BADC, the UK Met Office and Dr. Alan Seed from the Australian Bureau of Meteorology for providing the radar data and the nowcasting model. The authors acknowledge the support from the Engineering and Physical Sciences Research Council (EPSRC) via grant EP/I012222/1.
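Generating spatially correlated perturbations from an assumed error covariance, the heart of such a radar ensemble generator, can be sketched as follows. The exponential correlation model and all parameter values are illustrative.

```python
import numpy as np

def correlated_field(nx, ny, corr_len, sigma, rng):
    """Gaussian perturbation field with exponential spatial correlation,
    generated via Cholesky factorization of the error covariance."""
    xs, ys = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
    pts = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    cov = sigma**2 * np.exp(-d / corr_len)
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(pts)))  # jitter for stability
    return (L @ rng.standard_normal(len(pts))).reshape(nx, ny)

rng = np.random.default_rng(11)
deterministic = np.full((12, 12), 2.0)   # radar rainfall estimate (mm/h)

# Ensemble member = deterministic field + correlated error perturbation,
# clipped so rainfall stays non-negative
ensemble = np.stack([
    np.clip(deterministic + correlated_field(12, 12, 4.0, 0.8, rng), 0, None)
    for _ in range(20)
])
```

Each member can then be advected by the nowcasting model, so the spread of the resulting forecasts reflects the radar error statistics.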
Uncertainty in Random Forests: What does it mean in a spatial context?
NASA Astrophysics Data System (ADS)
Klump, Jens; Fouedjio, Francky
2017-04-01
Geochemical surveys are an important part of exploration for mineral resources and in environmental studies. The samples and chemical analyses are often laborious and difficult to obtain and therefore come at a high cost. As a consequence, these surveys are characterised by datasets with large numbers of variables but relatively few data points when compared to conventional big data problems. With more remote sensing platforms and sensor networks being deployed, large volumes of auxiliary data of the surveyed areas are becoming available. The use of these auxiliary data has the potential to improve the prediction of chemical element concentrations over the whole study area. Kriging is a well-established geostatistical method for the prediction of spatial data but requires significant pre-processing and makes some basic assumptions about the underlying distribution of the data. Some machine learning algorithms, on the other hand, may require less data pre-processing and are non-parametric. In this study we used a dataset provided by Kirkwood et al. [1] to explore the potential use of Random Forest in geochemical mapping. We chose Random Forest because it is a well-understood machine learning method and has the advantage that it provides us with a measure of uncertainty. By comparing Random Forest to Kriging we found that both methods produced comparable maps of estimated values for our variables of interest. Kriging outperformed Random Forest for variables of interest with relatively strong spatial correlation. The measure of uncertainty provided by Random Forest seems to be quite different to the measure of uncertainty provided by Kriging. In particular, the lack of spatial context can give misleading results in areas without ground truth data. 
In conclusion, our preliminary results show that the model-driven approach in geostatistics gives us more reliable estimates for our target variables than Random Forest for variables with relatively strong spatial correlation. However, in cases of weak spatial correlation Random Forest, as a non-parametric method, may give better results once we have a better understanding of the meaning of its uncertainty measures in a spatial context. References [1] Kirkwood, C., M. Cave, D. Beamish, S. Grebby, and A. Ferreira (2016), A machine learning approach to geochemical mapping, Journal of Geochemical Exploration, 163, 28-40, doi:10.1016/j.gexplo.2016.05.003.
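The Random Forest uncertainty measure discussed above is, in essence, the spread of predictions across a bootstrapped ensemble. A minimal sketch of that idea (not the authors' code; the data, the piecewise-constant base learners, and all parameter values are illustrative stand-ins for trees):

```python
# Illustrative sketch: ensemble spread as an uncertainty measure, analogous
# to the per-tree variance of a Random Forest. Synthetic data, not the
# Kirkwood et al. geochemical dataset.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "concentration" depending on one auxiliary covariate.
x = rng.uniform(0, 10, 200)
y = np.sin(x) + 0.3 * rng.standard_normal(200)

def fit_stump_ensemble(x, y, n_models=50, n_bins=8):
    """Bag simple piecewise-constant regressors (stand-ins for trees)."""
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, len(x), len(x))          # bootstrap resample
        xb, yb = x[idx], y[idx]
        bins = np.clip(np.digitize(xb, edges) - 1, 0, n_bins - 1)
        means = np.array([yb[bins == b].mean() if np.any(bins == b) else yb.mean()
                          for b in range(n_bins)])
        models.append(means)
    return edges, np.array(models)

def predict(edges, models, xq):
    bins = np.clip(np.digitize(xq, edges) - 1, 0, models.shape[1] - 1)
    preds = models[:, bins]                            # (n_models, n_query)
    return preds.mean(axis=0), preds.std(axis=0)       # estimate and spread

edges, models = fit_stump_ensemble(x, y)
mean, spread = predict(edges, models, np.array([2.0, 5.0, 8.0]))
```

Note that, unlike a kriging variance, this spread carries no notion of distance to the nearest sample, which is one way to read the abstract's caveat about areas without ground truth data.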
Quantitative Measures for Evaluation of Ultrasound Therapies of the Prostate
NASA Astrophysics Data System (ADS)
Kobelevskiy, Ilya; Burtnyk, Mathieu; Bronskill, Michael; Chopra, Rajiv
2010-03-01
Development of non-invasive techniques for prostate cancer treatment requires implementation of quantitative measures for evaluation of the treatment results. In this paper, we introduce measures that estimate spatial targeting accuracy and potential thermal damage to the structures surrounding the prostate. The measures were developed for the technique of treating prostate cancer with transurethral ultrasound heating applicators guided by active MR temperature feedback. Variations of ultrasound element length and related MR imaging parameters such as MR slice thickness and update time were investigated by performing numerical simulations of the treatment on a database of ten patient prostate geometries segmented from clinical MR images. Susceptibility of each parameter configuration to uncertainty in MR temperature measurements was studied by adding noise to the temperature measurements. Gaussian noise with zero mean and standard deviation of 0, 1, 3 and 5° C was used to model different levels of uncertainty in MR temperature measurements. Results of simulations for each parameter configuration were averaged over the database of the ten prostate patient geometries studied. Results have shown that for an update time of 5 seconds both 3- and 5-mm elements achieve appropriate performance for temperature uncertainty up to 3° C, while temperature uncertainty of 5° C leads to noticeable reduction in spatial accuracy and increased risk of damaging the rectal wall. Ten-mm elements lacked spatial accuracy and had higher risk of damaging the rectal wall compared to 3- and 5-mm elements, but were less sensitive to the level of temperature uncertainty. The effect of changing update time was studied for 5-mm elements.
Simulations showed that update time had minor effects on all aspects of treatment for temperature uncertainties of 0° C and 1° C, while temperature uncertainties of 3° C and 5° C led to reduced spatial accuracy, increased potential damage to the rectal wall, and longer treatment times for update times above 5 seconds. Overall evaluation of the results suggested that 5-mm elements showed the best performance under practically achievable MR imaging parameters.
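The noise model described above (zero-mean Gaussian noise at σ = 0, 1, 3 and 5 °C) can be sketched in a few lines. The temperature field, the 55 °C threshold, and the grid size below are illustrative assumptions, not values from the paper:

```python
# Hedged sketch of the study's noise model: add zero-mean Gaussian noise at
# the stated levels to a simulated MR temperature map and count how often a
# hypothetical 55 degC control threshold is falsely crossed.
import numpy as np

rng = np.random.default_rng(42)
true_temp = np.full((64, 64), 50.0)     # uniform 50 degC field, below threshold
threshold = 55.0                        # illustrative ablation threshold

false_positive_rates = {}
for sigma in (0.0, 1.0, 3.0, 5.0):      # noise levels used in the study
    noisy = true_temp + rng.normal(0.0, sigma, true_temp.shape)
    false_positive_rates[sigma] = float(np.mean(noisy > threshold))
```

With σ = 5 °C the threshold sits only one standard deviation above the true temperature, so a large fraction of pixels falsely trigger, which is consistent with the reported loss of spatial accuracy at that noise level.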
Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T
2016-12-01
A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations, in which subject-matter experts, for pesticide risks to surface water organisms, validate and assess three dimensions of uncertainty: its level (the severity of uncertainty, ranging from determinism to ignorance); nature (whether the uncertainty is epistemic or aleatory); and location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment and effects assessment. The comprehensive description of uncertainty presented here will enable risk analysts to prioritise the required phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during environmental risk assessments. In turn, it is expected to inform investment in uncertainty reduction or targeted risk management action. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Advances in Applications of Hierarchical Bayesian Methods with Hydrological Models
NASA Astrophysics Data System (ADS)
Alexander, R. B.; Schwarz, G. E.; Boyer, E. W.
2017-12-01
Mechanistic and empirical watershed models are increasingly used to inform water resource decisions. Growing access to historical stream measurements and data from in-situ sensor technologies has increased the need for improved techniques for coupling models with hydrological measurements. Techniques that account for the intrinsic uncertainties of both models and measurements are especially needed. Hierarchical Bayesian methods provide an efficient modeling tool for quantifying model and prediction uncertainties, including those associated with measurements. Hierarchical methods can also be used to explore spatial and temporal variations in model parameters and uncertainties that are informed by hydrological measurements. We used hierarchical Bayesian methods to develop a hybrid (statistical-mechanistic) SPARROW (SPAtially Referenced Regression On Watershed attributes) model of long-term mean annual streamflow across diverse environmental and climatic drainages in 18 U.S. hydrological regions. Our application illustrates the use of a new generation of Bayesian methods that offer more advanced computational efficiencies than the prior generation. Evaluations of the effects of hierarchical (regional) variations in model coefficients and uncertainties on model accuracy indicate improved prediction accuracies (median of 10-50%), but primarily in humid eastern regions, where model uncertainties are one-third of those in arid western regions. Generally moderate regional variability is observed for most hierarchical coefficients. Accounting for measurement and structural uncertainties, using hierarchical state-space techniques, revealed the effects of spatially heterogeneous, latent hydrological processes in the "localized" drainages between calibration sites; this improved model precision, with only minor changes in regional coefficients.
Our study can inform advances in the use of hierarchical methods with hydrological models to improve their integration with stream measurements.
Roberti, Joshua A.; SanClements, Michael D.; Loescher, Henry W.; Ayres, Edward
2014-01-01
Even though fine-root turnover is a highly studied topic, it is often poorly understood as a result of uncertainties inherent in its sampling, e.g., quantifying spatial and temporal variability. While many methods exist to quantify fine-root turnover, use of minirhizotrons has increased over the last two decades, making sensor errors another source of uncertainty. Currently, no standardized methodology exists to test and compare minirhizotron camera capability, imagery, and performance. This paper presents a reproducible, laboratory-based method by which minirhizotron cameras can be tested and validated in a traceable manner. The performance of camera characteristics was identified and test criteria were developed: we quantified the precision of camera location for successive images, estimated the trueness and precision of each camera's ability to quantify root diameter and root color, and also assessed the influence of heat dissipation introduced by the minirhizotron cameras and electrical components. We report detailed and defensible metrology analyses that examine the performance of two commercially available minirhizotron cameras. These cameras performed differently with regard to the various test criteria and uncertainty analyses. We recommend a defensible metrology approach to quantify the performance of minirhizotron camera characteristics and determine sensor-related measurement uncertainties prior to field use. This approach is also extensible to other digital imagery technologies. In turn, these approaches facilitate a greater understanding of measurement uncertainties (signal-to-noise ratio) inherent in the camera performance and allow such uncertainties to be quantified and mitigated so that estimates of fine-root turnover can be more confidently quantified. PMID:25391023
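The metrology terms used above (trueness and precision) have simple operational definitions: trueness is the systematic offset of repeated measurements from a known reference, and precision is their spread. A minimal sketch with made-up values, not the paper's camera data:

```python
# Illustrative sketch of trueness and precision for a camera measuring a
# known root-analog diameter; numbers are hypothetical.
import numpy as np

reference_diameter_mm = 2.00                          # known test-target diameter
repeats = np.array([2.05, 2.04, 2.06, 2.03, 2.05])    # repeated camera estimates

trueness = repeats.mean() - reference_diameter_mm     # systematic bias
precision = repeats.std(ddof=1)                       # repeatability (sample std)
```

A traceable laboratory test of the kind the paper recommends would report both quantities against calibrated targets, since a camera can be precise (small spread) yet still biased (poor trueness), or vice versa.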
Niches, models, and climate change: Assessing the assumptions and uncertainties
Wiens, John A.; Stralberg, Diana; Jongsomjit, Dennis; Howell, Christine A.; Snyder, Mark A.
2009-01-01
As the rate and magnitude of climate change accelerate, understanding the consequences becomes increasingly important. Species distribution models (SDMs) based on current ecological niche constraints are used to project future species distributions. These models contain assumptions that add to the uncertainty in model projections stemming from the structure of the models, the algorithms used to translate niche associations into distributional probabilities, the quality and quantity of data, and mismatches between the scales of modeling and data. We illustrate the application of SDMs using two climate models and two distributional algorithms, together with information on distributional shifts in vegetation types, to project fine-scale future distributions of 60 California landbird species. Most species are projected to decrease in distribution by 2070. Changes in total species richness vary over the state, with large losses of species in some “hotspots” of vulnerability. Differences in distributional shifts among species will change species co-occurrences, creating spatial variation in similarities between current and future assemblages. We use these analyses to consider how assumptions can be addressed and uncertainties reduced. SDMs can provide a useful way to incorporate future conditions into conservation and management practices and decisions, but the uncertainties of model projections must be balanced with the risks of taking the wrong actions or the costs of inaction. Doing this will require that the sources and magnitudes of uncertainty are documented, and that conservationists and resource managers be willing to act despite the uncertainties. The alternative, of ignoring the future, is not an option. PMID:19822750
NASA Astrophysics Data System (ADS)
Friberg, Mariel D.; Kahn, Ralph A.; Holmes, Heather A.; Chang, Howard H.; Sarnat, Stefanie Ebelt; Tolbert, Paige E.; Russell, Armistead G.; Mulholland, James A.
2017-06-01
Spatiotemporal characterization of ambient air pollutant concentrations is increasingly relying on the combination of observations and air quality models to provide well-constrained, spatially and temporally complete pollutant concentration fields. Air quality models, in particular, are attractive, as they characterize the emissions, meteorological, and physicochemical process linkages explicitly while providing continuous spatial structure. However, such modeling is computationally intensive and has biases. The limitations of spatially sparse and temporally incomplete observations can be overcome by blending the data with estimates from a physically and chemically coherent model, driven by emissions and meteorological inputs. We recently developed a data fusion method that blends ambient ground observations and chemical-transport-modeled (CTM) data to estimate daily, spatially resolved pollutant concentrations and associated correlations. In this study, we assess the ability of the data fusion method to produce daily metrics (i.e., 1-hr max, 8-hr max, and 24-hr average) of ambient air pollution that capture spatiotemporal air pollution trends for 12 pollutants (CO, NO2, NOx, O3, SO2, PM10, PM2.5, and five PM2.5 components) across five metropolitan areas (Atlanta, Birmingham, Dallas, Pittsburgh, and St. Louis), from 2002 to 2008. Three sets of comparisons are performed: (1) the CTM concentrations are evaluated for each pollutant and metropolitan domain, (2) the data fusion concentrations are compared with the monitor data, and (3) a comprehensive cross-validation analysis against observed data evaluates the quality of the data fusion model simulations across multiple metropolitan domains.
The resulting daily spatial field estimates of air pollutant concentrations and uncertainties are not only consistent with observations, emissions, and meteorology, but substantially improve CTM-derived results for nearly all pollutants and all cities, with the exception of NO2 for Birmingham. The greatest improvements occur for O3 and PM2.5. Squared spatiotemporal correlation coefficients between simulations and observations, determined using cross-validation across all cities, for air pollutants of secondary and mixed origins are R2 = 0.88-0.93 (O3), 0.81-0.89 (SO4), 0.67-0.83 (PM2.5), 0.52-0.72 (NO3), 0.43-0.80 (NH4), 0.32-0.51 (OC), and 0.14-0.71 (PM10). Results for relatively homogeneous pollutants of secondary origin tend to be better than those for more spatially heterogeneous (larger spatial gradients) pollutants of primary origin (NOx, CO, SO2 and EC). Generally, background concentrations and spatial concentration gradients reflect interurban airshed complexity and the effects of regional transport, whereas daily spatial pattern variability shows intra-urban consistency in the fused data. With sufficiently high CTM spatial resolution, traffic-related pollutants exhibit gradual concentration gradients that peak toward the urban centers. Ambient pollutant concentration uncertainty estimates for the fused data are both more accurate and smaller than those for either the observations or the model simulations alone.
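The squared correlation metric reported above is computable as the squared Pearson correlation between observed and cross-validated estimated concentrations. A hedged sketch with synthetic values (not the study's data; the gamma distribution and error level are arbitrary illustrations):

```python
# Illustrative sketch: squared Pearson correlation (R^2) between observed
# daily concentrations and withheld-site fused estimates. Synthetic data.
import numpy as np

rng = np.random.default_rng(1)
observed = rng.gamma(2.0, 10.0, 500)                 # daily O3-like values
estimated = observed + rng.normal(0.0, 4.0, 500)     # fused estimate with error

r = np.corrcoef(observed, estimated)[0, 1]
r2 = r ** 2                                          # the R^2 reported per pollutant
```

In a cross-validation of this kind, each monitor is withheld in turn, estimates are produced at its location from the remaining sites plus the CTM field, and R² is computed over the pooled withheld pairs.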
Zanini, Gabriele
2009-01-01
Selecting the best emissions abatement strategy is very difficult due to the complexity of the processes that determine the impact of atmospheric pollutants and to the connection with climate change issues. Atmospheric pollution models can provide policy makers with a tool for assessing the effectiveness of abatement measures and their associated costs. The MINNI integrated model has been developed to link policy and atmospheric science and to assess the costs of the measures. The results have been carefully verified in order to identify uncertainties and the models are continuously updated to represent the state of the art in atmospheric science. The fine spatial and temporal resolution of the simulations provide a strong basis for assessing impacts on environment and health.
Nagle, Samuel M; Sundar, Guru; Schafer, Mark E; Harris, Gerald R; Vaezy, Shahram; Gessert, James M; Howard, Samuel M; Moore, Mary K; Eaton, Richard M
2013-11-01
This article examines the challenges associated with making acoustic output measurements at high ultrasound frequencies (>20 MHz) in the context of regulatory considerations contained in the US Food and Drug Administration industry guidance document for diagnostic ultrasound devices. Error sources in the acoustic measurement, including hydrophone calibration and spatial averaging, nonlinear distortion, and mechanical alignment, are evaluated, and the limitations of currently available acoustic measurement instruments are discussed. An uncertainty analysis of acoustic intensity and power measurements is presented, and an example uncertainty calculation is done on a hypothetical 30-MHz high-frequency ultrasound system. This analysis concludes that the estimated measurement uncertainty of the acoustic intensity is +73%/-86%, and the uncertainty in the mechanical index is +37%/-43%. These values exceed the respective levels in the Food and Drug Administration guidance document of 30% and 15%, respectively, which are more representative of the measurement uncertainty associated with characterizing lower-frequency ultrasound systems. Recommendations made for minimizing the measurement uncertainty include implementing a mechanical positioning system that has sufficient repeatability and precision, reconstructing the time-pressure waveform via deconvolution using the hydrophone frequency response, and correcting for hydrophone spatial averaging.
Chen, Qiuwen; Rui, Han; Li, Weifeng; Zhang, Yanhui
2014-06-01
Algal blooms are a serious problem in waters, which damage aquatic ecosystems and threaten drinking water safety. However, the outbreak mechanism of algal blooms is very complex with great uncertainty, especially for large water bodies where environmental conditions vary markedly in both space and time. This study developed an innovative method which integrated a self-organizing map (SOM) and fuzzy information diffusion theory to comprehensively analyze algal bloom risks with uncertainties. Lake Taihu was taken as the study case and the long-term (2004-2010) on-site monitoring data were used. The results showed that algal blooms in Taihu Lake were classified into four categories and exhibited obvious spatial-temporal patterns. The lake was mainly characterized by moderate blooms with high uncertainty, whereas severe blooms with low uncertainty were observed in the northwest part of the lake. The study gives insight into the spatial-temporal dynamics of algal blooms, and should help government and decision-makers outline policies and practices on bloom monitoring and prevention. The developed method provides a promising approach to estimate algal bloom risks under uncertainties. Copyright © 2014 Elsevier B.V. All rights reserved.
Snover, Amy K; Mantua, Nathan J; Littell, Jeremy S; Alexander, Michael A; McClure, Michelle M; Nye, Janet
2013-12-01
Increased concern over climate change is demonstrated by the many efforts to assess climate effects and develop adaptation strategies. Scientists, resource managers, and decision makers are increasingly expected to use climate information, but they struggle with its uncertainty. With the current proliferation of climate simulations and downscaling methods, scientifically credible strategies for selecting a subset for analysis and decision making are needed. Drawing on a rich literature in climate science and impact assessment and on experience working with natural resource scientists and decision makers, we devised guidelines for choosing climate-change scenarios for ecological impact assessment that recognize irreducible uncertainty in climate projections and address common misconceptions about this uncertainty. This approach involves identifying primary local climate drivers by climate sensitivity of the biological system of interest; determining appropriate sources of information for future changes in those drivers; considering how well processes controlling local climate are spatially resolved; and selecting scenarios based on considering observed emission trends, relative importance of natural climate variability, and risk tolerance and time horizon of the associated decision. The most appropriate scenarios for a particular analysis will not necessarily be the most appropriate for another due to differences in local climate drivers, biophysical linkages to climate, decision characteristics, and how well a model simulates the climate parameters and processes of interest. Given these complexities, we recommend interaction among climate scientists, natural and physical scientists, and decision makers throughout the process of choosing and using climate-change scenarios for ecological impact assessment. [Title in Spanish: Selection and Use of Climate-Change Scenarios for Ecological Impact Studies and Conservation Decisions.] © 2013 Society for Conservation Biology.
NASA Astrophysics Data System (ADS)
Hosseiny, S. M. H.; Zarzar, C.; Gomez, M.; Siddique, R.; Smith, V.; Mejia, A.; Demir, I.
2016-12-01
The National Water Model (NWM) provides a platform for operationalizing nationwide flood inundation forecasting and mapping. The ability to model flood inundation on a national scale will provide invaluable information to decision makers and local emergency officials. Often, forecast products use deterministic model output to provide a visual representation of a single inundation scenario, which is subject to uncertainty from various sources. While this provides a straightforward representation of the potential inundation, the inherent uncertainty associated with the model output should be considered to optimize this tool for decision making support. The goal of this study is to produce ensembles of future flood inundation conditions (i.e. extent, depth, and velocity) to spatially quantify and visually assess uncertainties associated with the predicted flood inundation maps. The study is set in a highly urbanized watershed along Darby Creek in Pennsylvania. A forecasting framework coupling the NWM with multiple hydraulic models was developed to produce a suite of ensembles of future flood inundation predictions. Time-lagged ensembles from the NWM short-range forecasts were used to account for uncertainty associated with the hydrologic forecasts. The forecasts from the NWM were input to the iRIC and HEC-RAS two-dimensional software packages, from which water extent, depth, and flow velocity were output. Quantifying the agreement between output ensembles for each forecast grid provided the uncertainty metrics for predicted flood water inundation extent, depth, and flow velocity. For visualization, a series of flood maps that display flood extent, water depth, and flow velocity along with the underlying uncertainty associated with each of the forecasted variables were produced. The results from this study demonstrate the potential to incorporate and visualize model uncertainties in flood inundation maps in order to identify high flood risk zones.
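The per-grid agreement metric described above can be sketched as an exceedance-probability map over an ensemble of binary inundation extents. All values below are synthetic placeholders, not Darby Creek model output:

```python
# Illustrative sketch: per-cell agreement across an ensemble of inundation
# maps. Cells where members disagree (probability near 0.5) mark the
# high-uncertainty zones the study visualizes.
import numpy as np

rng = np.random.default_rng(7)
n_members, ny, nx = 20, 50, 50
depth = rng.gamma(2.0, 0.3, (n_members, ny, nx))   # ensemble water depths (m)

inundated = depth > 0.5                            # binary extent per member
prob_flood = inundated.mean(axis=0)                # fraction of members agreeing

# Flag cells with substantial ensemble disagreement.
uncertain = (prob_flood > 0.25) & (prob_flood < 0.75)
```

The same reduction over the ensemble axis applies to depth and velocity (e.g. per-cell spread instead of a binary agreement fraction).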
Characterizing Uncertainties in Atmospheric Inversions of Fossil Fuel CO2 Emissions in California
NASA Astrophysics Data System (ADS)
Brophy, K. J.; Graven, H. D.; Manning, A.; Arnold, T.; Fischer, M. L.; Jeong, S.; Cui, X.; Parazoo, N.
2016-12-01
In 2006 California passed a law requiring that greenhouse gas emissions be reduced to 1990 levels by 2020, equivalent to a 20% reduction over 2006-2020. Assessing compliance with greenhouse gas mitigation policies requires accurate determination of emissions, particularly for CO2 emitted by fossil fuel combustion (ffCO2). We found differences in inventory-based ffCO2 flux estimates for California total emissions of 11% (standard deviation relative to the mean), and even larger differences at some smaller sub-state levels. Top-down studies may be useful for validating ffCO2 flux estimates, but top-down studies of CO2 typically focus on biospheric CO2 fluxes and they are not yet well developed for ffCO2. Implementing top-down studies of ffCO2 requires observations of a fossil fuel combustion tracer such as 14C to distinguish ffCO2 from biospheric CO2. However, even if a large number of 14C observations are available, multiple other sources of uncertainty will contribute to the uncertainty in posterior ffCO2 flux estimates. With a Bayesian inverse modelling approach, we use simulated atmospheric observations of ffCO2 at a network of 11 tower sites across California in an observing system simulation experiment to investigate uncertainties. We use four different prior ffCO2 flux estimates, two different atmospheric transport models, different types of spatial aggregation, and different assumptions for observational and model transport uncertainties to investigate contributions to posterior ffCO2 emission uncertainties. We show how various sources of uncertainty compare and which uncertainties are likely to limit top-down estimation of ffCO2 fluxes in California.
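The Bayesian inverse step at the core of such an observing system simulation experiment has a compact linear-Gaussian form: posterior flux = prior + gain × (observations − modelled observations). A toy sketch (dimensions, covariances, and the transport operator are synthetic illustrations, far smaller and simpler than the study's setup):

```python
# Hedged sketch of a linear-Gaussian Bayesian flux inversion (OSSE style):
# simulate observations from a "true" flux, then recover a posterior.
import numpy as np

rng = np.random.default_rng(3)
n_flux, n_obs = 4, 11                         # e.g. 4 regions, 11 tower sites

H = rng.uniform(0.1, 1.0, (n_obs, n_flux))    # transport (footprint) operator
x_prior = np.array([10.0, 5.0, 8.0, 2.0])     # prior ffCO2 fluxes
P = np.diag(np.full(n_flux, 4.0))             # prior error covariance
R = np.diag(np.full(n_obs, 1.0))              # obs + transport error covariance

x_true = x_prior + rng.normal(0, 2.0, n_flux)
y = H @ x_true + rng.normal(0, 1.0, n_obs)    # simulated observations

K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman-type gain
x_post = x_prior + K @ (y - H @ x_prior)
P_post = (np.eye(n_flux) - K @ H) @ P         # posterior covariance
```

The uncertainty comparisons in the abstract amount to re-running such an inversion while varying the prior (`x_prior`, `P`), the transport operator (`H`), the aggregation (the flux dimension), and the assumed error covariance (`R`), and comparing the resulting `P_post`.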
A Multi-Band Uncertainty Set Based Robust SCUC With Spatial and Temporal Budget Constraints
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Chenxi; Wu, Lei; Wu, Hongyu
2016-11-01
The dramatic increase of renewable energy resources in recent years, together with the long-existing load forecast errors and increasingly involved price sensitive demands, has introduced significant uncertainties into power systems operation. In order to guarantee the operational security of power systems with such uncertainties, robust optimization has been extensively studied in security-constrained unit commitment (SCUC) problems, for immunizing the system against worst uncertainty realizations. However, traditional robust SCUC models with single-band uncertainty sets may yield over-conservative solutions in most cases. This paper proposes a multi-band robust model to accurately formulate various uncertainties with higher resolution. By properly tuning band intervals and weight coefficients of individual bands, the proposed multi-band robust model can rigorously and realistically reflect spatial/temporal relationships and asymmetric characteristics of various uncertainties, and in turn could effectively leverage the tradeoff between robustness and economics of robust SCUC solutions. The proposed multi-band robust SCUC model is solved by Benders decomposition (BD) and outer approximation (OA), while taking advantage of the integral property of the proposed multi-band uncertainty set. In addition, several accelerating techniques are developed for enhancing the computational performance and the convergence speed. Numerical studies on a 6-bus system and the modified IEEE 118-bus system verify the effectiveness of the proposed robust SCUC approach for enhancing uncertainty modeling capabilities and mitigating conservativeness of the robust SCUC solution.
NASA Astrophysics Data System (ADS)
Zhou, Y.; Gu, H.; Williams, C. A.
2017-12-01
Results from terrestrial carbon cycle models have multiple sources of uncertainty, each with its own behavior and range. Their relative importance and how they combine has received little attention. This study investigates how various sources of uncertainty propagate, temporally and spatially, in CASA-Disturbance (CASA-D). CASA-D simulates the impact of climatic forcing and disturbance legacies on forest carbon dynamics with the following steps. First, we infer annual growth and mortality rates from measured biomass stocks (FIA) over time and disturbance (e.g., fire, harvest, bark beetle) to represent annual post-disturbance carbon flux trajectories across forest types and site productivity settings. Then, annual carbon fluxes are estimated from these trajectories using the time since disturbance, which is inferred from biomass (NBCD 2000) and disturbance maps (NAFD, MTBS and ADS). Finally, we apply monthly climatic scalars derived from default CASA to temporally distribute annual carbon fluxes to each month. This study assesses carbon flux uncertainty from two sources: driving data, including climatic and forest biomass inputs, and the three most sensitive parameters in CASA-D, namely maximum light use efficiency, temperature sensitivity of soil respiration (Q10) and optimum temperature, identified by using EFAST (Extended Fourier Amplitude Sensitivity Testing). We quantify model uncertainties from each, and report their relative importance in estimating the forest carbon sink/source in the southeast United States from 2003 to 2010.
NASA Astrophysics Data System (ADS)
Manjunath, D.; Gomez, F.; Loveless, J.
2005-12-01
Interferometric Synthetic Aperture Radar (InSAR) provides unprecedented spatial imaging of crustal deformation. However, for small deformations, such as those due to interseismic strain accumulation, potentially significant uncertainty may result from other sources of interferometric phase, such as atmospheric effects, errors in satellite baseline, and height errors in the reference digital elevation model (DEM). We aim to constrain spatial and temporal variations in crustal deformation of the northern Chilean forearc region of the Andean subduction zone (19° - 22°S) using multiple interferograms spanning 1995 - 2000. The study area includes the region of the 1995 Mw 8.1 Antofagasta earthquake and the region to the north. In contrast to previous InSAR-based studies of the Chilean forearc, we seek to distinguish interferometric phase contributions from linear and nonlinear deformation, height errors in the DEM, and atmospheric effects. Understanding these phase contributions reduces the uncertainties on the deformation rates and provides a view of the time-dependence of deformation. The interferograms cover a 150 km-wide swath spanning two adjacent orbital tracks. Our study involves the analysis of more than 28 interferograms along each track. Coherent interferograms in the hyper-arid Atacama Desert permit spatial phase unwrapping. Initial estimates of topographic phase were determined using 3'' DEM data from the SRTM mission. We perform a pixel-by-pixel analysis of the unwrapped phase to identify time- and baseline-dependent phase contributions, using the Gamma Remote Sensing radar software. Atmospheric phase, non-linear deformation, and phase noise were further distinguished using a combination of spatial and temporal filters. Non-linear deformation is evident for up to 2.5 years following the 1995 earthquake, followed by a return to time-linear, interseismic strain accumulation.
The regional trend of linear deformation, characterized by coastal subsidence and relative uplift inland, is consistent with the displacement field expected for a locked subduction zone. Our improved determination of deformation rates is used to formulate a new elastic model of interseismic strain in the Chilean forearc.
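The pixel-by-pixel separation of time-dependent and baseline-dependent phase described above reduces, for a single pixel, to a small least-squares problem. A toy sketch with synthetic numbers (not the Chilean data; rates, baselines and noise levels are invented for illustration):

```python
# Illustrative sketch: for one pixel, jointly estimate a linear deformation
# rate and a DEM-error (baseline-proportional) term from many interferograms.
import numpy as np

rng = np.random.default_rng(5)
n_ifg = 28
dt = rng.uniform(0.2, 5.0, n_ifg)             # interferogram time spans (years)
bperp = rng.uniform(-300, 300, n_ifg)         # perpendicular baselines (m)

true_rate = -12.0                              # mm/yr (subsidence, hypothetical)
true_dem_coeff = 0.01                          # mm per metre of baseline
phase_mm = (true_rate * dt + true_dem_coeff * bperp
            + rng.normal(0, 2.0, n_ifg))       # plus atmospheric-like noise

# Design matrix [time span, baseline] -> unknowns [rate, DEM-error coeff.]
A = np.column_stack([dt, bperp])
(rate, dem_coeff), *_ = np.linalg.lstsq(A, phase_mm, rcond=None)
```

Because atmospheric phase is uncorrelated in time while deformation accumulates with it, stacking many interferograms in this way suppresses the atmospheric term, which is the rationale for using 28+ interferograms per track.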
NASA Astrophysics Data System (ADS)
Costa, F. A. F.; Keir, G.; McIntyre, N.; Bulovic, N.
2015-12-01
Most groundwater supply bores in Australia do not have flow metering equipment and so regional groundwater abstraction rates are not well known. Past estimates of unmetered abstraction for regional numerical groundwater modelling typically have not attempted to quantify the uncertainty inherent in the estimation process in detail. In particular, the spatial properties of errors in the estimates are almost always neglected. Here, we apply Bayesian spatial models to estimate these abstractions at a regional scale, using the state-of-the-art computationally inexpensive approaches of integrated nested Laplace approximation (INLA) and stochastic partial differential equations (SPDE). We examine a case study in the Condamine Alluvium aquifer in southern Queensland, Australia; even in this comparatively data-rich area with extensive groundwater abstraction for agricultural irrigation, approximately 80% of bores do not have reliable metered flow records. Additionally, the metering data in this area are characterised by complicated statistical features, such as zero-valued observations, non-normality, and non-stationarity. While this precludes the use of many classical spatial estimation techniques, such as kriging, our model (using the R-INLA package) is able to accommodate these features. We use a joint model to predict both probability and magnitude of abstraction from bores in space and time, and examine the effect of a range of high-resolution gridded meteorological covariates upon the predictive ability of the model. Deviance Information Criterion (DIC) scores are used to assess a range of potential models, which reward good model fit while penalising excessive model complexity. 
We conclude that maximum air temperature (as a reasonably effective surrogate for evapotranspiration) is the most significant single predictor of abstraction rate; and that a significant spatial effect exists (represented by the SPDE approximation of a Gaussian random field with a Matérn covariance function). Our final model adopts air temperature, solar exposure, and normalized difference vegetation index (NDVI) as covariates, shows good agreement with previous estimates at a regional scale, and additionally offers rigorous quantification of uncertainty in the estimate.
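The spatial effect named above is a Gaussian random field with a Matérn covariance function; for smoothness ν = 3/2 the covariance has a simple closed form. A sketch with illustrative parameter values (not those fitted by the R-INLA model):

```python
# Sketch of a Matern covariance function (nu = 3/2 closed form) for a
# Gaussian random field; sigma2 and length_scale are illustrative.
import numpy as np

def matern_32(h, sigma2=1.0, length_scale=10.0):
    """Matern covariance with nu = 3/2 at separation distance h (km)."""
    a = np.sqrt(3.0) * np.asarray(h, dtype=float) / length_scale
    return sigma2 * (1.0 + a) * np.exp(-a)

h = np.array([0.0, 5.0, 10.0, 50.0])
cov = matern_32(h)    # decays monotonically with distance
```

The SPDE approach used in the study approximates a field with exactly this covariance family by a sparse mesh-based representation, which is what makes the Bayesian computation tractable at regional scale.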
Uncertainty in spatially explicit animal dispersal models
Mooij, Wolf M.; DeAngelis, Donald L.
2003-01-01
Uncertainty in estimates of survival of dispersing animals is a vexing difficulty in conservation biology. The current notion is that this uncertainty decreases the usefulness of spatially explicit population models in particular. We examined this problem by comparing dispersal models of three levels of complexity: (1) an event-based binomial model that considers only the occurrence of mortality or arrival, (2) a temporally explicit exponential model that employs mortality and arrival rates, and (3) a spatially explicit grid-walk model that simulates the movement of animals through an artificial landscape. Each model was fitted to the same set of field data. A first objective of the paper is to illustrate how the maximum-likelihood method can be used in all three cases to estimate the means and confidence limits for the relevant model parameters, given a particular set of data on dispersal survival. Using this framework we show that the structure of the uncertainty for all three models is strikingly similar. In fact, the results of our unified approach imply that spatially explicit dispersal models, which take advantage of information on landscape details, suffer less from uncertainty than do simpler models. Moreover, we show that the proposed strategy of model development safeguards one from error propagation in these more complex models. Finally, our approach shows that all models related to animal dispersal, ranging from simple to complex, can be related in a hierarchical fashion, so that the various approaches to modeling such dispersal can be viewed from a unified perspective.
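For the simplest of the three models, the event-based binomial one, the maximum-likelihood estimate and approximate confidence limits can be sketched as follows (a generic Wald-interval illustration with hypothetical counts, not the authors' fitted analysis):

```python
import math

def binomial_mle_ci(k, n, z=1.96):
    """MLE and approximate 95% Wald confidence limits for the probability
    of successful arrival, given k arrivals among n dispersers."""
    p_hat = k / n                                  # maximum-likelihood estimate
    se = math.sqrt(p_hat * (1 - p_hat) / n)        # asymptotic standard error
    return p_hat, max(0.0, p_hat - z * se), min(1.0, p_hat + z * se)
```

The temporally and spatially explicit models replace this closed form with a numerically maximised likelihood, but the logic of estimate-plus-confidence-limits is the same.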
Uncertainty of future projections of species distributions in mountainous regions.
Tang, Ying; Winkler, Julie A; Viña, Andrés; Liu, Jianguo; Zhang, Yuanbin; Zhang, Xiaofeng; Li, Xiaohong; Wang, Fang; Zhang, Jindong; Zhao, Zhiqiang
2018-01-01
Multiple factors introduce uncertainty into projections of species distributions under climate change. The uncertainty introduced by the choice of baseline climate information used to calibrate a species distribution model and to downscale global climate model (GCM) simulations to a finer spatial resolution is a particular concern for mountainous regions, as the spatial resolution of climate observing networks is often insufficient to detect the steep climatic gradients in these areas. Using the maximum entropy (MaxEnt) modeling framework together with occurrence data on 21 understory bamboo species distributed across the mountainous geographic range of the Giant Panda, we examined the differences in projected species distributions obtained from two contrasting sources of baseline climate information, one derived from spatial interpolation of coarse-scale station observations and the other derived from fine-spatial resolution satellite measurements. For each bamboo species, the MaxEnt model was calibrated separately for the two datasets and applied to 17 GCM simulations downscaled using the delta method. Greater differences in the projected spatial distributions of the bamboo species were observed for the models calibrated using the different baseline datasets than between the different downscaled GCM simulations for the same calibration. In terms of the projected future climatically-suitable area by species, quantification using a multi-factor analysis of variance suggested that the sum of the variance explained by the baseline climate dataset used for model calibration and the interaction between the baseline climate data and the GCM simulation via downscaling accounted for, on average, 40% of the total variation among the future projections. 
Our analyses illustrate that the combined use of gridded datasets developed from station observations and satellite measurements can help estimate the uncertainty introduced by the choice of baseline climate information to the projected changes in species distribution.
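The delta method used above to downscale the GCM simulations can be sketched for an additive variable such as temperature (assuming the coarse change signal has already been regridded to the fine baseline grid):

```python
import numpy as np

def delta_downscale(baseline_fine, gcm_hist_coarse, gcm_future_coarse):
    """Delta-method downscaling: add the GCM change signal (future minus
    historical climatology) to a fine-resolution observed baseline."""
    delta = gcm_future_coarse - gcm_hist_coarse   # GCM change signal
    return baseline_fine + delta                  # fine-scale future estimate
```

Because the projection is baseline plus delta, any disagreement between candidate baseline datasets propagates directly into the projected distributions, which is the interaction effect the analysis of variance quantifies.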
NASA Astrophysics Data System (ADS)
Oriani, F.; Stisen, S.
2016-12-01
Rainfall amount is one of the most sensitive inputs to distributed hydrological models. Its spatial representation is of primary importance to correctly study the uncertainty of basin recharge and its propagation to the surface and underground circulation. We consider here the 10-km-grid rainfall product provided by the Danish Meteorological Institute as input to the National Water Resources Model of Denmark. Due to a drastic reduction in the rain gauge network in recent years (from approximately 500 stations in the period 1996-2006, to 250 in the period 2007-2014), the grid rainfall product, based on the interpolation of these data, is much less reliable. Consequently, the related hydrological model shows significantly lower predictive power. To give a better estimation of spatial rainfall at the grid points far from ground measurements, we use the direct sampling technique (DS) [1], belonging to the family of multiple-point geostatistics. DS, already applied to rainfall and spatial variable estimation [2, 3], simulates a grid value by sampling a training data set where a similar data neighborhood occurs. In this way, complex statistical relations are preserved by generating spatial patterns similar to the ones found in the training data set. Using the reliable grid product from the period 1996-2006 as training data set, we first test the technique by simulating part of this data set; we then apply the technique to the grid product of the period 2007-2014 and subsequently analyze the uncertainty propagation to the hydrological model. We show that DS can improve the reliability of the rainfall product by generating more realistic rainfall patterns, with significant repercussions on the hydrological model. The reduction of rain gauge networks is a global phenomenon which has huge implications for hydrological model performance and the uncertainty assessment of water resources.
Therefore, the presented methodology can potentially be used in many regions where historical records can act as training data. [1] G.Mariethoz et al. (2010), Water Resour. Res., 10.1029/2008WR007621.[2] F. Oriani et al. (2014), Hydrol. Earth Syst. Sc., 10.5194/hessd-11-3213-2014. [3] G. Mariethoz et al. (2012), Water Resour. Res., 10.1029/2012WR012115.
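The core idea of direct sampling (simulate a value by finding a similar data neighbourhood in a training data set and copying the co-located training value) can be illustrated with a toy one-dimensional sketch; the real DS algorithm works on 2-D/3-D grids with more elaborate neighbourhood and distance handling:

```python
import numpy as np

def direct_sampling_1d(training, known, n_neigh=2, threshold=0.0, rng=None):
    """Fill missing values (np.nan) in `known` by scanning the training
    series for a position whose neighbourhood best matches the data
    neighbourhood, then copying the central training value."""
    rng = np.random.default_rng(rng)
    out = known.copy()
    for i in np.where(np.isnan(out))[0]:
        # informed values around the unknown position
        lo, hi = max(0, i - n_neigh), min(len(out), i + n_neigh + 1)
        offsets = [j - i for j in range(lo, hi) if j != i and not np.isnan(out[j])]
        best_val, best_dist = float(np.nanmean(training)), np.inf
        # scan training positions in random order, keeping the best match
        for t in rng.permutation(np.arange(n_neigh, len(training) - n_neigh)):
            d = np.mean([(training[t + o] - out[i + o]) ** 2 for o in offsets]) if offsets else 0.0
            if d < best_dist:
                best_val, best_dist = float(training[t]), d
            if best_dist <= threshold:   # accept first match under threshold
                break
        out[i] = best_val
    return out
```

Because values are copied from matched neighbourhoods rather than averaged, the simulation reproduces patterns of the training data instead of smoothing them, which is what makes the simulated rainfall fields look realistic.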
Space-Time Urban Air Pollution Forecasts
NASA Astrophysics Data System (ADS)
Russo, A.; Trigo, R. M.; Soares, A.
2012-04-01
Air pollution, like other natural phenomena, may be considered a space-time process. However, the simultaneous integration of time and space is not an easy task to perform, due to the existence of different uncertainty levels and data characteristics. In this work we propose a hybrid method that combines geostatistical and neural models to analyze PM10 time series recorded in the urban area of Lisbon (Portugal) for the 2002-2006 period and to produce forecasts. Geostatistical models have been widely used to characterize air pollution in urban areas, where the pollutant sources are considered diffuse, and also in industrial areas with localized emission sources. It should be stressed, however, that most geostatistical models correspond basically to an interpolation methodology (estimation, simulation) of a set of variables in a spatial or space-time domain. The temporal prediction of a pollutant usually requires knowledge of the main trends and complex patterns of the physical dispersion phenomenon. To deal with low-resolution problems and to enhance the reliability of predictions, we present here an approach in which short-term neural-network predictions at the monitoring stations act as local conditioning data for a fine-grid stochastic simulation model. After the pollutant concentration is predicted for a given time period at the monitoring stations, we can use the local conditional distributions of observed values, given the predicted value for that period, to perform the spatial simulations for the entire area and consequently evaluate the spatial uncertainty of pollutant concentration. To attain this objective, we propose the use of direct sequential simulations with local distributions.
With this approach one succeeds in predicting a space-time distribution of pollutant concentration that accounts for both the time-prediction uncertainty (reflecting the neural networks' efficiency at each local monitoring station) and the spatial uncertainty as revealed by the spatial variograms. The dataset used consists of PM10 concentrations recorded hourly by 12 monitoring stations within the Lisbon area for the period 2002-2006. In addition, meteorological data recorded at 3 monitoring stations and boundary layer height (BLH) daily values from the ECMWF (European Centre for Medium-Range Weather Forecasts) ERA Interim reanalysis were also used. Based on the large-scale standard pressure fields from ERA40/ECMWF, prevailing circulation patterns at the regional scale were determined and used in the construction of the models. After the daily forecasts were produced, the differences between the average maps based on real observations and on predicted values were determined and the model's performance was assessed. Based on the analysis of the results, we conclude that the proposed approach proves to be a very promising alternative for urban air quality characterization owing to its good results and simplicity of application.
Validation Of TRMM For Hazard Assessment In The Remote Context Of Tropical Africa
NASA Astrophysics Data System (ADS)
Monsieurs, E.; Kirschbaum, D.; Tan, J.; Jacobs, L.; Kervyn, M.; Demoulin, A.; Dewitte, O.
2017-12-01
Accurate rainfall data is fundamental for understanding and mitigating the disastrous effects of many rainfall-triggered hazards, especially when one considers the challenges arising from climate change and rainfall variability. In tropical Africa in particular, the sparse operational rainfall gauging network hampers the ability to understand these hazards. Satellite rainfall estimates (SRE) can therefore be of great value. Yet, rigorous validation is required to identify the uncertainties when using SRE for hazard applications. We evaluated the Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) 3B42 Research Derived Daily Product from 1998 to 2017, at 0.25° x 0.25° spatial and 24 h temporal resolution. The validation was done over the western branch of the East African Rift, with the perspective of regional landslide hazard assessment in mind. Even though we collected an unprecedented dataset of 47 gauges with a minimum temporal resolution of 24 h, the sparse and heterogeneous temporal coverage in a region with high rainfall variability poses challenges for validation. In addition, the discrepancy between local-scale gauge data and spatially averaged (~775 km²) TMPA data in the context of local convective storms and orographic rainfall is a crucial source of uncertainty. We adopted a flexible framework for SRE validation that fosters explorative research in a remote context. Results show that TMPA performs reasonably well during the rainy seasons for rainfall intensities <20 mm/day. TMPA systematically underestimates rainfall, but most problematic is the decreasing probability of detection of high intensity rainfalls. We suggest that landslide hazard might be efficiently assessed if we take account of the systematic biases in TMPA data and determine rainfall thresholds modulated by controls on, and uncertainties of, TMPA revealed in this study.
Moreover, TMPA is found to be relevant for mapping regional-scale rainfall-triggered hazards, which are in any case poorly covered by the sparse available gauges. We anticipate validating TMPA's successor (Integrated Multi-satellitE Retrievals for Global Precipitation Measurement; 10 km × 10 km, half-hourly) using the proposed framework as soon as this product becomes available in early 2018 for the 1998-present period.
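Validation statistics such as the probability of detection mentioned above come from a simple contingency table of gauge and satellite events; a minimal sketch (the threshold and data are illustrative):

```python
import numpy as np

def contingency_scores(gauge, satellite, threshold=1.0):
    """Categorical validation of satellite rainfall against gauges:
    probability of detection (POD), false alarm ratio (FAR) and
    multiplicative bias for events at or above `threshold` (mm/day)."""
    g = np.asarray(gauge) >= threshold
    s = np.asarray(satellite) >= threshold
    hits = np.sum(g & s)
    misses = np.sum(g & ~s)
    false_alarms = np.sum(~g & s)
    pod = hits / (hits + misses) if hits + misses else np.nan
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else np.nan
    bias = np.sum(satellite) / np.sum(gauge)   # multiplicative rainfall bias
    return pod, far, bias
```

A POD that drops as the threshold rises is exactly the "decreasing probability of detection of high intensity rainfalls" that the study flags as most problematic.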
NASA Astrophysics Data System (ADS)
Lawrence, D. M.; Fisher, R.; Koven, C.; Oleson, K. W.; Swenson, S. C.; Hoffman, F. M.; Randerson, J. T.; Collier, N.; Mu, M.
2017-12-01
The International Land Model Benchmarking (ILAMB) project is a model-data intercomparison and integration project designed to assess and help improve land models. The current package includes assessment of more than 25 land variables across more than 60 global, regional, and site-level (e.g., FLUXNET) datasets. ILAMB employs a broad range of metrics including RMSE, mean error, spatial distributions, interannual variability, and functional relationships. Here, we apply ILAMB for the purpose of assessment of several generations of the Community Land Model (CLM4, CLM4.5, and CLM5). Encouragingly, CLM5, which is the result of model development over the last several years by more than 50 researchers from 15 different institutions, shows broad improvements across many ILAMB metrics including LAI, GPP, vegetation carbon stocks, and the historical net ecosystem carbon balance among others. We will also show that considerable uncertainty arises from the historical climate forcing data used (GSWP3v1 and CRUNCEPv7). ILAMB score variations due to forcing data can be as large for many variables as those due to model structural differences. Strengths and weaknesses and persistent biases across model generations will also be presented.
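Benchmarking packages of this kind typically map each error metric to a dimensionless [0, 1] score so that variables and models can be compared on one footing; a common construction, loosely in the spirit of ILAMB's scoring but not its exact formulation, is:

```python
import numpy as np

def rmse_score(model, obs):
    """Map RMSE, normalised by the observed variability, to a unitless
    [0, 1] score via exp(-RMSE/std(obs)): 1 is a perfect match and the
    score decays as errors grow relative to natural variability."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    return float(np.exp(-rmse / np.std(obs)))
```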
Conceptual uncertainty in crystalline bedrock: Is simple evaluation the only practical approach?
Geier, J.; Voss, C.I.; Dverstorp, B.
2002-01-01
A simple evaluation can be used to characterize the capacity of crystalline bedrock to act as a barrier to release radionuclides from a nuclear waste repository. Physically plausible bounds on groundwater flow and an effective transport-resistance parameter are estimated based on fundamental principles and idealized models of pore geometry. Application to an intensively characterized site in Sweden shows that, due to high spatial variability and uncertainty regarding properties of transport paths, the uncertainty associated with the geological barrier is too high to allow meaningful discrimination between good and poor performance. Application of more complex (stochastic-continuum and discrete-fracture-network) models does not yield a significant improvement in the resolution of geological barrier performance. Comparison with seven other less intensively characterized crystalline study sites in Sweden leads to similar results, raising a question as to what extent the geological barrier function can be characterized by state-of-the art site investigation methods prior to repository construction. A simple evaluation provides a simple and robust practical approach for inclusion in performance assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foight, Dillon R.; Slane, Patrick O.; Güver, Tolga
We present a comprehensive study of interstellar X-ray extinction using the extensive Chandra supernova remnant (SNR) archive and use our results to refine the empirical relation between the hydrogen column density and optical extinction. In our analysis, we make use of the large, uniform data sample to assess various systematic uncertainties in the measurement of the interstellar X-ray absorption. Specifically, we address systematic uncertainties that originate from (i) the emission models used to fit SNR spectra; (ii) the spatial variations within individual remnants; (iii) the physical conditions of the remnant such as composition, temperature, and non-equilibrium regions; and (iv) the model used for the absorption of X-rays in the interstellar medium. Using a Bayesian framework to quantify these systematic uncertainties, and combining the resulting hydrogen column density measurements with the measurements of optical extinction toward the same remnants, we find the empirical relation N_H = (2.87 ± 0.12) × 10²¹ A_V cm⁻², which is significantly higher than the previous measurements.
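The refined relation makes converting an optical extinction into a hydrogen column density a one-line calculation (central value only; the ±0.12 uncertainty on the coefficient is not propagated here):

```python
def nh_from_av(a_v):
    """Hydrogen column density (cm^-2) from optical extinction A_V (mag),
    using the empirical relation N_H = 2.87e21 * A_V reported in the study."""
    return 2.87e21 * a_v
```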
Understanding high magnitude flood risk: evidence from the past
NASA Astrophysics Data System (ADS)
MacDonald, N.
2009-04-01
The average length of gauged river flow records in the UK is ~25 years, which presents a problem in determining flood risk for high-magnitude flood events. Severe floods have been recorded in many UK catchments during the past 10 years, increasing the uncertainty in conventional flood risk estimates based on river flow records. Current uncertainty in flood risk has implications for society (insurance costs), individuals (personal vulnerability) and water resource managers (flood/drought risk). An alternative approach is required which can improve current understanding of the flood frequency/magnitude relationship. Historical documentary accounts are now recognised as a valuable resource when considering the flood frequency/magnitude relationship, but little consideration has been given to the temporal and spatial distribution of these records. Building on previous research based on British rivers (urban centre): Ouse (York), Trent (Nottingham), Tay (Perth), Severn (Shrewsbury), Dee (Chester), Great Ouse (Cambridge), Sussex Ouse (Lewes), Thames (Oxford), Tweed (Kelso) and Tyne (Hexham), this work considers the spatial and temporal distribution of historical flooding. The selected sites provide a network covering many of the largest river catchments in Britain, based on urban centres with long detailed documentary flood histories. The chronologies offer an opportunity to assess long-term patterns of flooding, indirectly determining periods of climatic variability and potentially increased geomorphic activity. This research represents the first coherent large scale analysis undertaken of historical multi-catchment flood chronologies, providing an unparalleled network of sites, permitting analysis of the spatial and temporal distribution of historical flood patterns on a national scale.
A review of uncertainty visualization within the IPCC reports
NASA Astrophysics Data System (ADS)
Nocke, Thomas; Reusser, Dominik; Wrobel, Markus
2015-04-01
Results derived from climate model simulations confront non-expert users with a variety of uncertainties. This gives rise to the challenge of communicating the scientific information so that it is easily understood while the complexity of the underlying science is still conveyed. With respect to the assessment reports of the IPCC, the situation is even more complicated, because heterogeneous sources and multiple types of uncertainties need to be compiled together. Within this work, we systematically (1) analyzed the visual representation of uncertainties in the IPCC AR4 and AR5 reports, and (2) administered a questionnaire to evaluate how different user groups, such as decision-makers and teachers, understand these uncertainty visualizations. Within the first step, we classified visual uncertainty metaphors for spatial, temporal and abstract representations. As a result, we clearly identified a high complexity of the IPCC visualizations compared to standard presentation graphics, sometimes even integrating two or more uncertainty classes / measures together with the "certain" (mean) information. Furthermore, we identified complex written uncertainty explanations within image captions, even within the "summary reports for policy makers". In the second step, based on these observations, we designed a questionnaire to investigate how non-climate experts understand these visual representations of uncertainties, how visual uncertainty coding might hinder the perception of the "non-uncertain" data, and whether alternatives for certain IPCC visualizations exist. Within the talk/poster, we will present first results from this questionnaire. Summarizing, we identified a clear trend towards complex images within the latest IPCC reports, with a tendency to incorporate as much information as possible into the visual representations, resulting in proprietary, non-standard graphic representations that are not necessarily easy to comprehend at a glance.
We conclude that further translation is required to (visually) present the IPCC results to non-experts, providing tailored static and interactive visualization solutions for different user groups.
NASA Astrophysics Data System (ADS)
Balbi, Stefano; Villa, Ferdinando; Mojtahed, Vahid; Hegetschweiler, Karin Tessa; Giupponi, Carlo
2016-06-01
This article presents a novel methodology to assess flood risk to people by integrating people's vulnerability and ability to cushion hazards through coping and adapting. The proposed approach extends traditional risk assessments beyond material damages; complements quantitative and semi-quantitative data with subjective and local knowledge, improving the use of commonly available information; and produces estimates of model uncertainty by providing probability distributions for all of its outputs. Flood risk to people is modeled using a spatially explicit Bayesian network model calibrated on expert opinion. Risk is assessed in terms of (1) likelihood of non-fatal physical injury, (2) likelihood of post-traumatic stress disorder and (3) likelihood of death. The study area covers the lower part of the Sihl valley (Switzerland) including the city of Zurich. The model is used to estimate the effect of improving an existing early warning system, taking into account the reliability, lead time and scope (i.e., coverage of people reached by the warning). Model results indicate that the potential benefits of an improved early warning in terms of avoided human impacts are particularly relevant in case of a major flood event.
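How the scope of an early warning system enters such a model can be illustrated by marginalising over whether a person is reached by the warning; the probabilities below are hypothetical, not the calibrated values of the paper's Bayesian network:

```python
def p_injury(p_injury_given_warned, p_injury_given_unwarned, scope):
    """Marginal likelihood of injury under a warning system with coverage
    `scope` (fraction of people reached): a weighted mixture of the
    warned and unwarned conditional probabilities (toy illustration)."""
    return scope * p_injury_given_warned + (1 - scope) * p_injury_given_unwarned
```

Improving scope shifts weight from the unwarned to the warned branch, which is how the model quantifies the avoided human impacts of a better warning.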
Liu, Xingmei; Wu, Jianjun; Xu, Jianming
2006-05-01
For many practical problems in environmental management, information about soil heavy metals relative to threshold values of practical importance is needed at unsampled sites. The Hangzhou-Jiaxing-Huzhou (HJH) Plain has always been one of the most important rice production areas in Zhejiang province, China, and the soil heavy metal concentration is directly related to crop quality and ultimately to the health of people. Four hundred and fifty soil samples were collected from topsoil in the HJH Plain to characterize the spatial variability of Cu, Zn, Pb, Cr and Cd. Ordinary kriging and lognormal kriging were carried out to map the spatial patterns of heavy metals, and disjunctive kriging was used to quantify the probability of heavy metal concentrations being higher than their guide values. A cokriging method was used to minimize the sampling density for Cu, Zn and Cr. The results of this study could give insight into risk assessment of environmental pollution and decision-making for agriculture.
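Ordinary kriging, the workhorse method above, predicts an unsampled value as a weighted average of observations, with weights obtained from a linear system built from a covariance model; a textbook sketch (the covariance function passed in the usage below is an assumption, not the study's fitted model):

```python
import numpy as np

def ordinary_kriging(coords, values, target, cov):
    """Ordinary kriging prediction at `target` from observations `values`
    at `coords`, given a covariance function `cov(h)` of separation distance."""
    coords = np.asarray(coords, float)
    n = len(values)
    # Kriging system: data-data covariances, plus a Lagrange-multiplier
    # row/column enforcing that the weights sum to one (unbiasedness).
    A = np.ones((n + 1, n + 1))
    A[n, n] = 0.0
    for i in range(n):
        for j in range(n):
            A[i, j] = cov(np.linalg.norm(coords[i] - coords[j]))
    b = np.ones(n + 1)
    for i in range(n):
        b[i] = cov(np.linalg.norm(coords[i] - np.asarray(target, float)))
    w = np.linalg.solve(A, b)[:n]   # kriging weights
    return float(w @ np.asarray(values, float))
```

For example, with an exponential covariance `cov = lambda h: np.exp(-h)` and two observations placed symmetrically about the target, the weights are equal and the prediction is their average.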
NASA Astrophysics Data System (ADS)
Garrigues, S.; Olioso, A.; Calvet, J.-C.; Lafont, S.; Martin, E.; Chanzy, A.; Marloie, O.; Bertrand, N.; Desfonds, V.; Renard, D.
2012-04-01
Vegetation productivity and water balance of Mediterranean regions will be particularly affected by climate and land-use changes. In order to analyze and predict these changes through land surface models, a critical step is to quantify the uncertainties associated with these models (processes, parameters) and their implementation over a long period of time. In addition, uncertainties attached to the data used to force these models (atmospheric forcing, vegetation and soil characteristics, crop management practices...), which are generally available at coarse spatial resolution (>1-10 km) and for a limited number of plant functional types, need to be evaluated. This paper aims at assessing the uncertainties in water (evapotranspiration) and energy fluxes estimated from a Soil Vegetation Atmosphere Transfer (SVAT) model over a Mediterranean agricultural site. While similar past studies focused on particular crop types and limited periods of time, the originality of this paper consists in implementing the SVAT model and assessing its uncertainties over a long period of time (10 years), encompassing several cycles of distinct crops (wheat, sorghum, sunflower, peas). The impacts on the SVAT simulations of the following sources of uncertainties are characterized: - Uncertainties in atmospheric forcing are assessed by comparing simulations forced with local meteorological measurements and simulations forced with a re-analysis atmospheric dataset (SAFRAN database). - Uncertainties in key surface characteristics (soil, vegetation, crop management practices) are tested by comparing simulations fed with standard values from global databases (e.g. ECOCLIMAP) and simulations based on in situ or site-calibrated values. - Uncertainties due to the implementation of the SVAT model over a long period of time are analyzed with regard to crop rotation.
The SVAT model being analyzed in this paper is ISBA in its A-gs version, which simulates photosynthesis and its coupling with the stomatal conductance, as well as the time course of the plant biomass and the Leaf Area Index (LAI). The experiment was conducted at the INRA-Avignon (France) crop site (ICOS associated site), for which 10 years of energy and water eddy fluxes, soil moisture profiles, vegetation measurements and agricultural practices are available for distinct crop types. The uncertainties in evapotranspiration and energy flux estimates are quantified from both 10-year trend analysis and selected daily cycles spanning a range of atmospheric conditions and phenological stages. While the net radiation flux is correctly simulated, the cumulated latent heat flux is underestimated. Daily plots indicate i) an overestimation of evapotranspiration over bare soil, probably due to an overestimation of the soil water reservoir available for evaporation, and ii) an underestimation of transpiration for developed canopies. Uncertainties attached to the re-analysis atmospheric data show little influence on the cumulated values of evapotranspiration. Better performances are reached using in situ soil depths and site-calibrated photosynthesis parameters compared to the simulations based on the ECOCLIMAP standard values. Finally, this paper highlights the impact of the temporal succession of vegetation cover and bare soil on the simulation of soil moisture and evapotranspiration over a long period of time. Thus, solutions to account for crop rotation in the implementation of SVAT models are discussed.
Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems
NASA Astrophysics Data System (ADS)
Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros
2015-04-01
In hydrogeological applications involving flow and transport in heterogeneous porous media, the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling and are subsequently used as inputs to physically-based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields; hence, it can become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]. Here, the term representative implies realizations spanning efficiently the range of possible conductivity values corresponding to the lognormal random field. In this work we investigate the efficiency of alternative methods to classical LH sampling within the context of simulation of flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploring the geometrical properties of the multivariate Gaussian distribution function.
In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the surface of a M-dimensional, unit radius hyper-sphere, (ii) relocating the N points on a representative set of N hyper-spheres of different radii, and (iii) transforming the coordinates of those points to lie on N different hyper-ellipsoids spanning the multivariate Gaussian distribution. The above method is applied in a dimensionality reduction context by defining flow-controlling points over which representative sampling of hydraulic conductivity is performed, thus also accounting for the sensitivity of the flow and transport model to the input hydraulic conductivity field. The performance of the various stratified sampling methods, LH, SL, and ME, is compared to that of SR sampling in terms of reproduction of ensemble statistics of hydraulic conductivity and solute concentration for different sample sizes N (numbers of realizations). The results indicate that ME sampling constitutes an equally if not more efficient simulation method than LH and SL sampling, as it can reproduce to a similar extent statistics of the conductivity and concentration fields, yet with smaller sampling variability than SR sampling. References [1] Gutjahr A.L. and Bras R.L. Spatial variability in subsurface flow and transport: A review. Reliability Engineering & System Safety, 42, 293-316, (1993). [2] Helton J.C. and Davis F.J. Latin hypercube sampling and the propagation of uncertainty in analyses of complex systems. Reliability Engineering & System Safety, 81, 23-69, (2003). [3] Switzer P. Multiple simulation of spatial fields. 
In: Heuvelink G, Lemmens M (eds) Proceedings of the 4th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, Coronet Books Inc., pp 629–635 (2000).
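The variance-reduction idea behind LH sampling can be sketched in a few lines. The toy below is hypothetical and one-dimensional (a single standard-normal log-conductivity value rather than a full field); it compares the spread of the sample mean under simple random and Latin hypercube draws, where stratification forces every sample to cover the distribution's quantiles:

```python
import random
from statistics import NormalDist, mean

def sr_sample(n, rng):
    """Simple random (SR) sample from N(0,1) log-conductivity."""
    return [rng.gauss(0.0, 1.0) for _ in range(n)]

def lh_sample(n, rng):
    """Latin hypercube sample: one draw per equal-probability stratum."""
    nd = NormalDist()
    u = [(i + rng.random()) / n for i in range(n)]  # one point per stratum
    rng.shuffle(u)
    return [nd.inv_cdf(p) for p in u]

def mean_spread(sampler, trials=200, n=20, seed=1):
    """Std. dev. of the sample mean across repeated experiments."""
    rng = random.Random(seed)
    means = [mean(sampler(n, rng)) for _ in range(trials)]
    mu = mean(means)
    return (sum((m - mu) ** 2 for m in means) / trials) ** 0.5

sr_sd = mean_spread(sr_sample)
lh_sd = mean_spread(lh_sample)
print(f"std of sample means  SR: {sr_sd:.3f}  LH: {lh_sd:.3f}")
```

Because each of the n strata contributes exactly one draw, the LH sample mean is far more stable across repeated experiments, which is why fewer realizations suffice as representative Monte Carlo input.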
Gething, Peter W; Patil, Anand P; Hay, Simon I
2010-04-01
Risk maps estimating the spatial distribution of infectious diseases are required to guide public health policy from local to global scales. The advent of model-based geostatistics (MBG) has allowed these maps to be generated in a formal statistical framework, providing robust metrics of map uncertainty that enhance their utility for decision-makers. In many settings, decision-makers require spatially aggregated measures over large regions, such as the mean prevalence within a country or administrative region, or national populations living under different levels of risk. Existing MBG mapping approaches provide suitable metrics of local uncertainty--the fidelity of predictions at each mapped pixel--but have not been adapted for measuring uncertainty over large areas, due largely to a series of fundamental computational constraints. Here the authors present a new efficient approximating algorithm that can generate, for the first time, the necessary joint simulation of prevalence values across the very large prediction spaces needed for global-scale mapping. This new approach is implemented in conjunction with an established model for P. falciparum, allowing robust estimates of mean prevalence at any specified level of spatial aggregation. The model is used to provide estimates of national populations at risk under three policy-relevant prevalence thresholds, along with accompanying model-based measures of uncertainty. By overcoming previously unchallenged computational barriers, this study illustrates how MBG approaches, already at the forefront of infectious disease mapping, can be extended to provide large-scale aggregate measures appropriate for decision-makers.
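The computational point above, that per-pixel (local) uncertainty cannot simply be summed into regional uncertainty when pixels are spatially correlated, can be illustrated with a toy joint simulation. Everything below is hypothetical (a 1-D strip of 50 pixels with an assumed exponential correlation model), not the paper's P. falciparum model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "prediction space": a strip of 50 pixels with exponential correlation.
n = 50
x = np.arange(n)
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / 10.0)  # assumed range ~10 px
L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))

# Joint simulation: realizations honour the spatial covariance.
joint = L @ rng.standard_normal((n, 2000))
# Marginal-only simulation: correct per-pixel variance, no cross-correlation.
indep = rng.standard_normal((n, 2000))

# Uncertainty of the regional (areal) mean under each scheme.
sd_joint = joint.mean(axis=0).std()
sd_indep = indep.mean(axis=0).std()
print(f"sd of regional mean  joint: {sd_joint:.3f}  marginal-only: {sd_indep:.3f}")
```

Positively correlated errors do not average out, so the marginal-only scheme badly understates aggregate uncertainty; this is exactly why joint simulation over the prediction space is needed for spatially aggregated measures.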
NASA Astrophysics Data System (ADS)
Wang, S.; Huang, G. H.; Baetz, B. W.; Ancell, B. C.
2017-05-01
Particle filtering techniques have been receiving increasing attention from the hydrologic community due to their ability to properly estimate model parameters and states of nonlinear and non-Gaussian systems. To facilitate a robust quantification of uncertainty in hydrologic predictions, it is necessary to explicitly examine the forward propagation and evolution of parameter uncertainties and their interactions that affect the predictive performance. This paper presents a unified probabilistic framework that merges the strengths of particle Markov chain Monte Carlo (PMCMC) and factorial polynomial chaos expansion (FPCE) algorithms to robustly quantify and reduce uncertainties in hydrologic predictions. A Gaussian anamorphosis technique is used to establish a seamless bridge between data assimilation using the PMCMC and uncertainty propagation using the FPCE through a straightforward transformation of posterior distributions of model parameters. The unified probabilistic framework is applied to the Xiangxi River watershed of the Three Gorges Reservoir (TGR) region in China to demonstrate its validity and applicability. Results reveal that the degree of spatial variability of soil moisture capacity is the most identifiable model parameter, with the fastest convergence through the streamflow assimilation process. The potential interaction between the spatial variability in soil moisture conditions and the maximum soil moisture capacity has the most significant effect on the performance of streamflow predictions. In addition, parameter sensitivities and interactions vary in magnitude and direction over time due to temporal and spatial dynamics of hydrologic processes.
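Gaussian anamorphosis itself is a generic transform. A common empirical version is a rank-based normal-score transform, sketched below on synthetic data; the lognormal sample and the plotting-position formula are illustrative assumptions, not the paper's implementation:

```python
import numpy as np
from statistics import NormalDist

def normal_scores(samples):
    """Empirical Gaussian anamorphosis: map a sample to N(0,1) scores by rank."""
    samples = np.asarray(samples, dtype=float)
    ranks = samples.argsort().argsort()          # 0 .. n-1, assumes no ties
    p = (ranks + 0.5) / len(samples)             # plotting positions in (0,1)
    nd = NormalDist()
    return np.array([nd.inv_cdf(pi) for pi in p])

rng = np.random.default_rng(42)
skewed = rng.lognormal(0.0, 1.0, size=500)       # stand-in for a skewed posterior
z = normal_scores(skewed)
print(f"transformed sample: mean {z.mean():.3f}, std {z.std():.3f}")
```

The forward map (and its inverse, via the stored rank/value pairs) lets non-Gaussian posterior samples be handled by machinery that assumes Gaussianity, which is the role the anamorphosis plays between the PMCMC and FPCE stages.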
Earth Observation, Spatial Data Quality, and Neglected Tropical Diseases.
Hamm, Nicholas A S; Soares Magalhães, Ricardo J; Clements, Archie C A
2015-12-01
Earth observation (EO) is the use of remote sensing and in situ observations to gather data on the environment. It finds increasing application in the study of environmentally modulated neglected tropical diseases (NTDs). Obtaining and assuring the quality of the relevant spatially and temporally indexed EO data remain challenges. Our objective was to review the Earth observation products currently used in studies of NTD epidemiology and to discuss fundamental issues relating to spatial data quality (SDQ), which limit the utilization of EO and pose challenges for its more effective use. We searched Web of Science and PubMed for studies related to EO and echinococcosis, leptospirosis, schistosomiasis, and soil-transmitted helminth infections. Relevant literature was also identified from the bibliographies of those papers. We found that extensive use is made of EO products in the study of NTD epidemiology; however, the quality of these products is usually given little explicit attention. We review key issues in SDQ concerning spatial and temporal scale, uncertainty, and the documentation and use of quality information. We give examples of how these issues may interact with uncertainty in NTD data to affect the output of an epidemiological analysis. We conclude that researchers should give careful attention to SDQ when designing NTD spatial-epidemiological studies. This should be used to inform uncertainty analysis in the epidemiological study. SDQ should be documented and made available to other researchers.
Ersoy, Adem; Yunsel, Tayfun Yusuf; Atici, Umit
2008-02-01
Abandoned mine workings have caused varying degrees of soil contamination with heavy metals such as lead and zinc on a global scale. Exposure to these elements may harm human health and the environment. In this study, a total of 269 soil samples were collected at 1, 5, and 10 m regular grid intervals over a 100 x 100 m area of Carsington Pasture in the UK. A cell declustering technique was applied to the data set because the sampling was not statistically representative. Directional experimental semivariograms of the elements for the transformed data showed that both geometric and zonal anisotropy exist in the data. The most evident spatial dependence structures of the directional experimental semivariograms of Pb and Zn were obtained, characterized by spherical and exponential models. This study reports the spatial distribution and uncertainty of Pb and Zn concentrations in soil at the study site using a probabilistic approach. The approach was based on geostatistical sequential Gaussian simulation (SGS), which is used to yield a series of conditional images characterized by equally probable spatial distributions of the heavy-metal concentrations across the area. Postprocessing of many simulations allowed the mapping of contaminated and uncontaminated areas, and provided a model for the uncertainty in the spatial distribution of element concentrations. Maps of the simulated Pb and Zn concentrations revealed the extent and severity of contamination. SGS was validated by statistics, histogram and variogram reproduction, and simulation errors. The resulting maps can be used in remediation studies and can help decision-makers and others involved with abandoned heavy-metal mining sites worldwide.
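The postprocessing step, turning a stack of equally probable SGS realizations into probability-of-contamination maps, reduces to counting exceedances cell by cell. The sketch below uses unconditional correlated Gaussian fields as a stand-in for real SGS output, with a purely hypothetical threshold:

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in for SGS output: 200 equally probable realizations of a normalized
# log-concentration field on a 20 x 20 grid with exponential correlation.
nx = 20
xy = np.array([(i, j) for i in range(nx) for j in range(nx)], dtype=float)
d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
L = np.linalg.cholesky(np.exp(-d / 5.0) + 1e-8 * np.eye(nx * nx))
reals = L @ rng.standard_normal((nx * nx, 200))   # cells x realizations

threshold = 1.0                                   # hypothetical action level
# Postprocessing: per-cell probability that the level exceeds the threshold.
p_exceed = (reals > threshold).mean(axis=1).reshape(nx, nx)
flagged = int((p_exceed > 0.9).sum())             # contaminated at 90% confidence
print(f"max exceedance probability: {p_exceed.max():.2f}, flagged cells: {flagged}")
```

The per-cell exceedance frequencies across realizations are exactly the "model for the uncertainty" the abstract describes: the map of p_exceed separates confidently contaminated, confidently clean, and ambiguous cells.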
NASA Astrophysics Data System (ADS)
Wang, C.; Rubin, Y.
2014-12-01
The spatial distribution of the compression modulus Es, an important geotechnical parameter, contributes considerably to understanding the underlying geological processes and to assessing the mechanical effects of Es on the differential settlement of large continuous structural foundations. These analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve such a task, the Es distribution of a stratum of silty clay in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method. This method rigorously and efficiently integrates the multiple precisions of different geotechnical investigations and sources of uncertainty. Single CPT soundings were modeled as rational probability density curves using maximum entropy theory. The spatial prior multivariate probability density function (PDF) and the likelihood PDF of the CPT positions and the potential value of the prediction point were built from borehole experiments; then, by numerical integration over the CPT probability density curves, the posterior probability density curve of the prediction point was calculated within the Bayesian inverse interpolation framework. The results were compared between sequential Gaussian simulation and the Bayesian method. The differences between single CPT samplings under a normal-distribution assumption and simulated probability density curves based on maximum entropy theory were also discussed. It is shown that estimates of the Es spatial distribution can be improved by properly incorporating CPT sampling variation into the interpolation process, and that more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculations illustrate the significance of stochastic Es characterization of a stratum and identify limitations associated with inadequate geostatistical interpolation techniques. These characterization results provide a multi-precision information assimilation method for other geotechnical parameters.
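The core Bayesian step, multiplying a prior PDF by a likelihood and normalizing by numerical integration, can be shown on a grid for one prediction point. All values below (prior and likelihood means and spreads, in MPa) are invented for illustration and do not come from the study:

```python
import numpy as np

# Grid-based Bayesian update for one prediction point: prior x likelihood,
# normalized by numerical integration. All numbers are illustrative.
es = np.linspace(2.0, 12.0, 1001)   # candidate Es values (MPa)
dx = es[1] - es[0]

def gauss(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

prior = gauss(es, 6.0, 2.0)         # broad prior, e.g. from regional boreholes
likelihood = gauss(es, 8.0, 1.0)    # sharper CPT-derived likelihood

post = prior * likelihood
post /= post.sum() * dx             # normalize: integral of posterior = 1

mean_post = (es * post).sum() * dx  # posterior mean by numerical integration
print(f"posterior mean Es = {mean_post:.2f} MPa")  # analytic answer is 7.6
```

Replacing the Gaussian likelihood with a maximum-entropy density curve for a CPT sounding leaves this machinery unchanged, which is what makes the grid-based update convenient for multi-precision data fusion.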
Linking vegetation structure, function and physiology through spectroscopic remote sensing
NASA Astrophysics Data System (ADS)
Serbin, S.; Singh, A.; Couture, J. J.; Shiklomanov, A. N.; Rogers, A.; Desai, A. R.; Kruger, E. L.; Townsend, P. A.
2015-12-01
Terrestrial ecosystem process models require detailed information on ecosystem states and canopy properties to properly simulate the fluxes of carbon (C), water, and energy from the land to the atmosphere and to assess the vulnerability of ecosystems to perturbations. Current models fail to adequately capture the magnitude, spatial variation, and seasonality of terrestrial C uptake and storage, leading to significant uncertainties in the size and fate of the terrestrial C sink. By and large, these parameter and process uncertainties arise from inadequate spatial and temporal representation of plant traits, vegetation structure, and functioning. With increases in computational power and changes to model architecture and approaches, it is now possible for models to leverage detailed, data-rich, and spatially explicit descriptions of ecosystems to inform parameter distributions and trait tradeoffs. In this regard, spectroscopy and imaging spectroscopy data have been shown to be invaluable observational datasets for capturing broad-scale spatial and, eventually, temporal dynamics in important vegetation properties. We illustrate the linkage of plant traits and spectral observations to supply key data constraints for model parameterization. These constraints can come either in the form of the raw spectroscopic data (reflectance, absorptance) or physiological traits derived from spectroscopy. In this presentation we highlight our ongoing work to build ecological scaling relationships between critical vegetation characteristics and optical properties across diverse and complex canopies, including temperate broadleaf and conifer forests, Mediterranean vegetation, Arctic systems, and agriculture. We focus on work at the leaf, stand, and landscape scales, illustrating the importance of capturing the underlying variability in a range of parameters (including vertical variation within canopies) to enable more efficient scaling of traits related to functional diversity of ecosystems.
Estimating Uncertainties in the Multi-Instrument SBUV Profile Ozone Merged Data Set
NASA Technical Reports Server (NTRS)
Frith, Stacey; Stolarski, Richard
2015-01-01
The MOD data set is uniquely qualified for use in long-term ozone analysis because of its long record, high spatial coverage, and consistent instrument design and algorithm. The estimated MOD uncertainty term significantly increases the uncertainty over the statistical error alone. Trends in the post-2000 period are generally positive in the upper stratosphere, but only significant at 1-1.6 hPa. Remaining uncertainties not yet included in the Monte Carlo model are: smoothing error (1 from 10 to 1 hPa); relative calibration uncertainty between N11 and N17; and seasonal cycle differences between SBUV records.
Climate-based archetypes for the environmental fate assessment of chemicals.
Ciuffo, Biagio; Sala, Serenella
2013-11-15
Emissions of chemicals have been on the rise for years, and their impacts are greatly influenced by spatial differentiation. Chemicals are usually emitted locally but their impact can be felt both locally and globally, due to their chemical properties and persistence. The variability of environmental parameters in the emission compartment may affect the chemicals' fate and exposure by orders of magnitude. The assessment of the environmental fate of chemicals and the inherent spatial differentiation requires the use of multimedia models at various levels of complexity (from a simple box model to complex computational and high-spatial-resolution models). The objective of these models is to support ecological and human health risk assessment by reducing the uncertainty of chemical impact assessments. The parameterisation of spatially resolved multimedia models is usually based on scenarios of evaluative environments, or on geographical resolutions related to administrative boundaries (e.g. countries/continents) or landscape areas (e.g. watersheds, eco-regions). The choice of the most appropriate scale and scenario is important from a management perspective, as a balance should be reached between a simplified approach and computationally intensive multimedia models. In this paper, which aims to go beyond the more traditional approach based on scale/resolution (cell, country, and basin), we propose and assess climate-based archetypes for the impact assessment of chemicals released in air. We define the archetypes based on the main drivers of spatial variability, which we systematically identify by adopting global sensitivity analysis techniques. A case study that uses the high-resolution multimedia model MAPPE (Multimedia Assessment of Pollutant Pathways in the Environment) is presented.
Results of the analysis showed that suitable archetypes should be both climate- and chemical-specific, as different chemicals (or groups of them) have different traits that influence their spatial variability. This hypothesis was tested by comparing the variability of the output of MAPPE for four different climatic zones on four different continents for four different chemicals (which represent different combinations of physical and chemical properties). Results showed the high suitability of climate-based archetypes in assessing the impacts of chemicals released in air. However, further research work is still necessary to test these findings. Copyright © 2013 Elsevier Ltd. All rights reserved.
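A minimal flavour of the global sensitivity analysis used to identify archetype drivers: estimate how much of an output's variance each input explains via binned conditional means (a crude first-order, Sobol-style screen). The persistence function and parameter ranges below are invented stand-ins, not MAPPE:

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented fate-model response: persistence of a chemical in air as a function
# of temperature T (degradation) and rain rate R (washout).
def persistence(T, R):
    return np.exp(-0.08 * (T - 10.0)) / (1.0 + 0.5 * R)

n = 10_000
T = rng.uniform(-10.0, 30.0, n)   # deg C, assumed climate-driver range
R = rng.uniform(0.0, 5.0, n)      # mm/h, assumed range
y = persistence(T, R)

def first_order(x, y, bins=20):
    """Crude first-order index: Var(E[y|x]) / Var(y) via quantile bins."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

s_t, s_r = first_order(T, y), first_order(R, y)
print(f"first-order sensitivity  T: {s_t:.2f}  R: {s_r:.2f}")
```

Drivers with large first-order indices are the natural axes along which to define climate archetypes; here temperature dominates by construction, mirroring how different chemicals would rank drivers differently.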
NASA Astrophysics Data System (ADS)
Babcock, C. R.; Finley, A. O.; Andersen, H. E.; Moskal, L. M.; Morton, D. C.; Cook, B.; Nelson, R.
2017-12-01
Upcoming satellite lidar missions, such as GEDI and ICESat-2, are designed to collect laser altimetry data from space along narrow bands of orbital tracks. As a result, lidar metric sets derived from these sources will not have complete spatial coverage. This lack of complete coverage, or sparsity, means that traditional regression approaches that consider lidar metrics as explanatory variables (without error) cannot be used to generate wall-to-wall maps of forest inventory variables. We implement a coregionalization framework to jointly model sparsely sampled lidar information and point-referenced forest variable measurements to create wall-to-wall maps with full probabilistic uncertainty quantification of all inputs. We inform the model with USFS Forest Inventory and Analysis (FIA) in-situ forest measurements and GLAS lidar data to spatially predict aboveground forest biomass (AGB) across the contiguous US. We cast our model within a Bayesian hierarchical framework to better model complex space-varying correlation structures among the lidar metrics and FIA data, which yields improved prediction and uncertainty assessment. To circumvent the computational difficulties that arise when fitting complex geostatistical models to massive datasets, we use a Nearest Neighbor Gaussian Process (NNGP) prior. Results indicate that a coregionalization modeling approach to leveraging sampled lidar data to improve AGB estimation is effective. Further, fitting the coregionalization model within a Bayesian mode of inference allows for AGB quantification across scales, ranging from individual pixel estimates of AGB density to total AGB for the continental US, with uncertainty. The coregionalization framework examined here is directly applicable to future spaceborne lidar acquisitions from GEDI and ICESat-2.
Pairing these lidar sources with the extensive FIA forest monitoring plot network using a joint prediction framework, such as the coregionalization model explored here, offers the potential to improve forest AGB accounting certainty and provide maps for post-model fitting analysis of the spatial distribution of AGB.
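The nearest-neighbour trick that makes NNGP-style models tractable, conditioning each prediction on only the m closest observations instead of all n, can be sketched in one dimension. The kernel, length scale, and data below are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(5)

def nn_gp_predict(x_obs, y_obs, x_new, m=10, ell=0.3, tau2=1e-4):
    """GP prediction conditioning only on the m nearest observations."""
    k = lambda a, b: np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)
    mu, var = [], []
    for x0 in x_new:
        nn = np.argsort(np.abs(x_obs - x0))[:m]          # m nearest neighbours
        K = k(x_obs[nn], x_obs[nn]) + tau2 * np.eye(m)   # local covariance
        kx = k(np.array([x0]), x_obs[nn])[0]
        w = np.linalg.solve(K, kx)                       # kriging-style weights
        mu.append(w @ y_obs[nn])
        var.append(1.0 + tau2 - kx @ w)                  # predictive variance
    return np.array(mu), np.array(var)

x_obs = rng.uniform(0.0, 1.0, 200)
y_obs = np.sin(2.0 * np.pi * x_obs) + 0.01 * rng.standard_normal(200)
x_new = np.linspace(0.1, 0.9, 9)
mu, var = nn_gp_predict(x_obs, y_obs, x_new)
rmse = float(np.sqrt(np.mean((mu - np.sin(2.0 * np.pi * x_new)) ** 2)))
print(f"RMSE vs truth: {rmse:.3f}, min predictive variance: {var.min():.6f}")
```

Each prediction costs O(m^3) instead of O(n^3), which is the scaling argument behind using an NNGP prior on continental-scale FIA/lidar datasets.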
NASA Technical Reports Server (NTRS)
Hammerling, Dorit M.; Michalak, Anna M.; Kawa, S. Randolph
2012-01-01
Satellite observations of CO2 offer new opportunities to improve our understanding of the global carbon cycle. Using such observations to infer global maps of atmospheric CO2 and their associated uncertainties can provide key information about the distribution and dynamic behavior of CO2, through comparison to atmospheric CO2 distributions predicted from biospheric, oceanic, or fossil fuel flux emissions estimates coupled with atmospheric transport models. Ideally, these maps should be at temporal resolutions that are short enough to represent and capture the synoptic dynamics of atmospheric CO2. This study presents a geostatistical method that accomplishes this goal. The method can extract information about the spatial covariance structure of the CO2 field from the available CO2 retrievals, yields full coverage (Level 3) maps at high spatial resolutions, and provides estimates of the uncertainties associated with these maps. The method does not require information about CO2 fluxes or atmospheric transport, such that the Level 3 maps are informed entirely by available retrievals. The approach is assessed by investigating its performance using synthetic OCO-2 data generated from the PCTM/GEOS-4/CASA-GFED model, for time periods ranging from 1 to 16 days and a target spatial resolution of 1° latitude x 1.25° longitude. Results show that global CO2 fields from OCO-2 observations can be predicted well at surprisingly high temporal resolutions. Even one-day Level 3 maps reproduce the large-scale features of the atmospheric CO2 distribution, and yield realistic uncertainty bounds. Temporal resolutions of two to four days result in the best performance for a wide range of investigated scenarios, providing maps at an order of magnitude higher temporal resolution relative to the monthly or seasonal Level 3 maps typically reported in the literature.
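Extracting spatial covariance structure from scattered retrievals typically starts with an empirical semivariogram: bin half the squared differences between retrieval pairs by separation distance. The synthetic "CO2-like" field below is an illustrative assumption, not OCO-2 data:

```python
import numpy as np

rng = np.random.default_rng(11)

# Scattered "retrievals": locations plus a smooth synthetic field with noise.
n = 400
pts = rng.uniform(0.0, 10.0, (n, 2))
z = np.sin(pts[:, 0]) + 0.5 * np.cos(pts[:, 1]) + 0.05 * rng.standard_normal(n)

# Semivariogram cloud: half squared difference for every pair of retrievals.
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
g = 0.5 * (z[:, None] - z[None, :]) ** 2
iu = np.triu_indices(n, 1)                 # unique pairs only
d, g = d[iu], g[iu]

# Bin by separation distance to get the empirical semivariogram.
bins = np.linspace(0.0, 5.0, 11)
idx = np.digitize(d, bins) - 1
gamma = np.array([g[idx == b].mean() for b in range(10)])
print("empirical semivariogram:", np.round(gamma, 3))
```

A covariance model fitted to this curve is what allows gap-filled (Level 3) prediction with uncertainty, using only the retrievals themselves and no flux or transport information.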
NASA Technical Reports Server (NTRS)
Horvath, Thomas; Splinter, Scott; Daryabeigi, Kamran; Wood, William; Schwartz, Richard; Ross, Martin
2008-01-01
High resolution calibrated infrared imagery of vehicles during hypervelocity atmospheric entry or sustained hypersonic cruise has the potential to provide flight data on the distribution of surface temperature and the state of the airflow over the vehicle. In the early 1980s, NASA sought to obtain high spatial resolution infrared imagery of the Shuttle during entry. Despite mission execution with a technically rigorous pre-planning capability, the single airborne optical system for this attempt was considered developmental and the scientific return was marginal. In 2005 the Space Shuttle Program again sponsored an effort to obtain imagery of the Orbiter. Imaging requirements were targeted towards Shuttle ascent; companion requirements for entry did not exist. The engineering community was allowed to define observation goals and incrementally demonstrate key elements of a quantitative, spatially resolved measurement capability over a series of flights. These imaging opportunities were extremely beneficial and clearly demonstrated the capability to capture infrared imagery with mature and operational assets of the US Navy and the Missile Defense Agency. While successful, the usefulness of the imagery was, from an engineering perspective, limited. These limitations were mainly associated with uncertainties regarding operational aspects of data acquisition. These uncertainties, in turn, came about because of limited pre-flight mission planning capability and a poor understanding of several factors, including the infrared signature of the Shuttle, optical hardware limitations, atmospheric effects, and detector response characteristics. Operational details of sensor configuration, such as detector integration time and tracking system algorithms, were carried out ad hoc (best practices), which led to a low probability of target acquisition and detector saturation.
Leveraging the qualified success during Return-to-Flight, the NASA Engineering and Safety Center sponsored an assessment study focused on increasing the probability of returning spatially resolved scientific/engineering thermal imagery. This paper provides an overview of the assessment task and the systematic approach designed to establish confidence in the ability of existing assets to reliably acquire, track, and return global quantitative surface temperatures of the Shuttle during entry. A discussion of capability demonstration in support of a potential Shuttle boundary layer transition flight test is presented. Successful demonstration of a quantitative, spatially resolved, global temperature measurement on the proposed Shuttle boundary layer transition flight test could lead to potential future applications with hypersonic flight test programs within the USAF and DARPA, along with flight test opportunities supporting NASA's Project Constellation.
Spatializing 6,000 years of global urbanization from 3700 BC to AD 2000
NASA Astrophysics Data System (ADS)
Reba, Meredith; Reitsma, Femke; Seto, Karen C.
2016-06-01
How were cities distributed globally in the past? How many people lived in these cities? How did cities influence their local and regional environments? In order to understand the current era of urbanization, we must understand long-term historical urbanization trends and patterns. However, to date there is no comprehensive record of spatially explicit, historic, city-level population data at the global scale. Here, we developed the first spatially explicit dataset of urban settlements from 3700 BC to AD 2000, by digitizing, transcribing, and geocoding historical, archaeological, and census-based urban population data previously published in tabular form by Chandler and Modelski. The dataset creation process also required data cleaning and harmonization procedures to make the data internally consistent. Additionally, we created a reliability ranking for each geocoded location to assess the geographic uncertainty of each data point. The dataset provides the first spatially explicit archive of the location and size of urban populations over the last 6,000 years and can contribute to an improved understanding of contemporary and historical urbanization trends.
Using robust Bayesian network to estimate the residuals of fluoroquinolone antibiotic in soil.
Li, Xuewen; Xie, Yunfeng; Li, Lianfa; Yang, Xunfeng; Wang, Ning; Wang, Jinfeng
2015-11-01
Prediction of antibiotic pollution and its consequences is difficult, due to the uncertainties and complexities associated with multiple related factors. This article employed domain knowledge and spatial data to construct a Bayesian network (BN) model to assess fluoroquinolone antibiotic (FQs) pollution in the soil of an intensive vegetable cultivation area. The results show: (1) The relationships between FQs pollution and contributory factors: three factors (cultivation methods, crop rotations, and chicken manure types) were consistently identified as predictors in the topological structures of the three FQs, indicating their importance in FQs pollution; as deduced from domain knowledge, the cultivation methods are determined by the crop rotations, which require different nutrients (derived from the manure) according to different plant biomass. (2) The performance of the BN model: the integrative robust Bayesian network model achieved the highest detection probability (pd) for high-risk areas and the largest receiver operating characteristic (ROC) area, since it incorporates domain knowledge and model uncertainty. Our encouraging findings have implications for the use of BN as a robust approach to assessment of FQs pollution and for informing decisions on appropriate remedial measures.
Kim, Young-Min; Zhou, Ying; Gao, Yang; ...
2014-11-16
We report that the spatial pattern of the uncertainty in air pollution-related health impacts due to climate change has rarely been studied, due to the lack of high-resolution model simulations, especially under the Representative Concentration Pathways (RCPs), the latest greenhouse gas emission pathways. We estimated future tropospheric ozone (O3) and related excess mortality and evaluated the associated uncertainties in the continental United States under RCPs. Based on dynamically downscaled climate model simulations, we calculated changes in O3 level at 12 km resolution between the future (2057 and 2059) and base years (2001–2004) under a low-to-medium emission scenario (RCP4.5) and a fossil fuel intensive emission scenario (RCP8.5). We then estimated the excess mortality attributable to changes in O3. Finally, we analyzed the sensitivity of the excess mortality estimates to the input variables and the uncertainty in the excess mortality estimation using Monte Carlo simulations. O3-related premature deaths in the continental U.S. were estimated to be 1312 deaths/year under RCP8.5 (95% confidence interval (CI): 427 to 2198) and -2118 deaths/year under RCP4.5 (95% CI: -3021 to -1216), when allowing for climate change and emissions reduction. The uncertainty of O3-related excess mortality estimates was mainly caused by RCP emissions pathways. Excess mortality estimates attributable to the combined effect of climate and emission changes on O3, as well as the associated uncertainties, vary substantially in space, and so do the most influential input variables. Spatially resolved data is crucial to develop effective community-level mitigation and adaptation policy.
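The Monte Carlo propagation step can be sketched with a standard log-linear health impact function; every number below (concentration-response coefficient, O3 change, baseline rate, population) is hypothetical, chosen only to show how input uncertainty maps to a confidence interval on excess deaths:

```python
import numpy as np

rng = np.random.default_rng(2)

nsim = 50_000
# Uncertain inputs (all values hypothetical):
beta = rng.normal(4e-4, 1e-4, nsim)     # concentration-response, per ppb
d_o3 = rng.normal(2.0, 1.0, nsim)       # projected change in O3, ppb
y0, pop = 0.008, 1_000_000              # baseline mortality rate, population

# Log-linear health impact function, evaluated per Monte Carlo draw.
excess = y0 * pop * (1.0 - np.exp(-beta * d_o3))

lo, med, hi = np.percentile(excess, [2.5, 50.0, 97.5])
print(f"excess deaths/year: median {med:.1f}, 95% CI [{lo:.1f}, {hi:.1f}]")
```

Running this per grid cell, with cell-specific input distributions, is what produces the spatial pattern of uncertainty the abstract emphasises; negative draws (mortality avoided) arise naturally when the O3 change distribution straddles zero.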
Uncertainty in Agricultural Impact Assessment
NASA Technical Reports Server (NTRS)
Wallach, Daniel; Mearns, Linda O.; Rivington, Michael; Antle, John M.; Ruane, Alexander C.
2014-01-01
This chapter considers issues concerning uncertainty associated with modeling and its use within agricultural impact assessments. Information about uncertainty is important for those who develop assessment methods, since that information indicates the need for, and the possibility of, improvement of the methods and databases. Such information also allows one to compare alternative methods. Information about the sources of uncertainties is an aid in prioritizing further work on the impact assessment method. Uncertainty information is also necessary for those who apply assessment methods, e.g., for projecting climate change impacts on agricultural production and for stakeholders who want to use the results as part of a decision-making process (e.g., for adaptation planning). For them, uncertainty information indicates the degree of confidence they can place in the simulated results. Quantification of uncertainty also provides stakeholders with an important guideline for making decisions that are robust across the known uncertainties. Thus, uncertainty information is important for any decision based on impact assessment. Ultimately, we are interested in knowledge about uncertainty so that information can be used to achieve positive outcomes from agricultural modeling and impact assessment.
NASA Astrophysics Data System (ADS)
Zorita, E.
2009-12-01
One of the objectives when comparing simulations of past climates to proxy-based climate reconstructions is to asses the skill of climate models to simulate climate change. This comparison may accomplished at large spatial scales, for instance the evolution of simulated and reconstructed Northern Hemisphere annual temperature, or at regional or point scales. In both approaches a 'fair' comparison has to take into account different aspects that affect the inevitable uncertainties and biases in the simulations and in the reconstructions. These efforts face a trade-off: climate models are believed to be more skillful at large hemispheric scales, but climate reconstructions are these scales are burdened by the spatial distribution of available proxies and by methodological issues surrounding the statistical method used to translate the proxy information into large-spatial averages. Furthermore, the internal climatic noise at large hemispheric scales is low, so that the sampling uncertainty tends to be also low. On the other hand, the skill of climate models at regional scales is limited by the coarse spatial resolution, which hinders a faithful representation of aspects important for the regional climate. At small spatial scales, the reconstruction of past climate probably faces less methodological problems if information from different proxies is available. The internal climatic variability at regional scales is, however, high. In this contribution some examples of the different issues faced when comparing simulation and reconstructions at small spatial scales in the past millennium are discussed. These examples comprise reconstructions from dendrochronological data and from historical documentary data in Europe and climate simulations with global and regional models. These examples indicate that the centennial climate variations can offer a reasonable target to assess the skill of global climate models and of proxy-based reconstructions, even at small spatial scales. 
However, as the focus shifts towards higher frequency variability, decadal or multidecadal, the need for larger simulation ensembles becomes more evident. Nevertheless, the comparison at these time scales may expose some lines of research on the origin of multidecadal regional climate variability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, D.W.; Yambert, M.W.; Kocher, D.C.
1994-12-31
A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter III. An analysis of the uncertainty incorporated into the assessment was performed which addressed the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and human exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.
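The quantitative data uncertainty described above is typically propagated by Monte Carlo sampling. A minimal numpy sketch in which the dose chain (concentration × intake × dose conversion factor) and all lognormal parameter distributions are illustrative assumptions, not values from the SWSA 6 assessment:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical lognormal parameter distributions (illustrative values only).
concentration = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=n)   # Bq/L
intake_rate   = rng.lognormal(mean=np.log(2.0), sigma=0.3, size=n)   # L/day
dose_factor   = rng.lognormal(mean=np.log(1e-8), sigma=0.4, size=n)  # Sv/Bq

# Annual dose for each Monte Carlo realization.
dose = concentration * intake_rate * dose_factor * 365.0  # Sv/yr

lo, med, hi = np.percentile(dose, [5, 50, 95])
print(f"median dose: {med:.2e} Sv/yr, 90% interval: [{lo:.2e}, {hi:.2e}]")
```

Interpreting the spread of the percentiles, rather than a single best estimate, is what lets the uncertainty analysis inform the conclusions of the assessment.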
Romañach, Stephanie; Watling, James I.; Fletcher, Robert J.; Speroterra, Carolina; Bucklin, David N.; Brandt, Laura A.; Pearlstine, Leonard G.; Escribano, Yesenia; Mazzotti, Frank J.
2014-01-01
Climate change poses new challenges for natural resource managers. Predictive modeling of species–environment relationships using climate envelope models can enhance our understanding of climate change effects on biodiversity, assist in assessment of invasion risk by exotic organisms, and inform life-history understanding of individual species. While increasing interest has focused on the role of uncertainty in future conditions on model predictions, models also may be sensitive to the initial conditions on which they are trained. Although climate envelope models are usually trained using data on contemporary climate, we lack systematic comparisons of model performance and predictions across alternative climate data sets available for model training. Here, we seek to fill that gap by comparing variability in predictions between two contemporary climate data sets to variability in spatial predictions among three alternative projections of future climate. Overall, correlations between monthly temperature and precipitation variables were very high for both contemporary and future data. Model performance varied across algorithms, but not between two alternative contemporary climate data sets. Spatial predictions varied more among alternative general-circulation models describing future climate conditions than between contemporary climate data sets. However, we did find that climate envelope models with low Cohen's kappa scores made more discrepant spatial predictions between climate data sets for the contemporary period than did models with high Cohen's kappa scores. We suggest conservation planners evaluate multiple performance metrics and be aware of the importance of differences in initial conditions for spatial predictions from climate envelope models.
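Cohen's kappa, the agreement metric used above, corrects the raw agreement between observed and predicted presence/absence for the agreement expected by chance. A small self-contained sketch (the presence/absence data are made up):

```python
import numpy as np

def cohens_kappa(obs, pred):
    """Cohen's kappa for binary presence/absence arrays."""
    obs = np.asarray(obs, dtype=int)
    pred = np.asarray(pred, dtype=int)
    po = np.mean(obs == pred)                        # observed agreement
    p_yes = np.mean(obs) * np.mean(pred)             # chance agreement on 1s
    p_no = (1 - np.mean(obs)) * (1 - np.mean(pred))  # chance agreement on 0s
    pe = p_yes + p_no
    return (po - pe) / (1 - pe)

obs  = [1, 1, 0, 0, 1, 0, 1, 0]
pred = [1, 1, 0, 0, 1, 0, 0, 1]
print(cohens_kappa(obs, pred))  # → 0.5
```

A kappa of 0 means no better than chance; values near 1 indicate strong agreement, which is why low-kappa envelope models are the ones whose spatial predictions diverge most between climate data sets.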
Assimilation of Spatially Sparse In Situ Soil Moisture Networks into a Continuous Model Domain
NASA Astrophysics Data System (ADS)
Gruber, A.; Crow, W. T.; Dorigo, W. A.
2018-02-01
Growth in the availability of near-real-time soil moisture observations from ground-based networks has spurred interest in the assimilation of these observations into land surface models via a two-dimensional data assimilation system. However, the design of such systems is currently hampered by our ignorance concerning the spatial structure of error afflicting ground and model-based soil moisture estimates. Here we apply newly developed triple collocation techniques to provide the spatial error information required to fully parameterize a two-dimensional (2-D) data assimilation system designed to assimilate spatially sparse observations acquired from existing ground-based soil moisture networks into a spatially continuous Antecedent Precipitation Index (API) model for operational agricultural drought monitoring. Over the contiguous United States (CONUS), the posterior uncertainty of surface soil moisture estimates associated with this 2-D system is compared to that obtained from the 1-D assimilation of remote sensing retrievals to assess the value of ground-based observations to constrain a surface soil moisture analysis. Results demonstrate that a fourfold increase in existing CONUS ground station density is needed for ground network observations to provide a level of skill comparable to that provided by existing satellite-based surface soil moisture retrievals.
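The classic triple collocation estimator behind this kind of error characterization recovers each product's error variance from cross-products of their mutual differences, assuming the three error series are mutually independent. A numpy sketch on synthetic soil moisture data (the error levels are assumptions):

```python
import numpy as np

def triple_collocation_errors(x, y, z):
    """Classic triple collocation: error variances of three collocated,
    independently erring estimates of the same signal (anomalies are
    assumed to share a common calibration)."""
    x, y, z = (a - np.mean(a) for a in (x, y, z))
    ex2 = np.mean((x - y) * (x - z))
    ey2 = np.mean((y - x) * (y - z))
    ez2 = np.mean((z - x) * (z - y))
    return ex2, ey2, ez2

rng = np.random.default_rng(0)
truth = rng.normal(0.25, 0.05, size=50_000)       # synthetic soil moisture signal
x = truth + rng.normal(0, 0.02, size=truth.size)  # e.g. ground network
y = truth + rng.normal(0, 0.04, size=truth.size)  # e.g. satellite retrieval
z = truth + rng.normal(0, 0.03, size=truth.size)  # e.g. API model

print([round(float(e), 5) for e in triple_collocation_errors(x, y, z)])
```

The recovered variances approximate the squared noise levels (0.0004, 0.0016, 0.0009) without any estimate ever being compared to the truth directly, which is what makes the technique usable for parameterizing observation error in a 2-D assimilation system.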
Organic carbon stock modelling for the quantification of the carbon sinks in terrestrial ecosystems
NASA Astrophysics Data System (ADS)
Durante, Pilar; Algeet, Nur; Oyonarte, Cecilio
2017-04-01
Given the recent environmental policies derived from the serious threats posed by global change, practical measures to decrease net CO2 emissions have to be put in place. In this regard, carbon sequestration is a major measure to reduce atmospheric CO2 concentrations in the short and medium term, and terrestrial ecosystems play a basic role as carbon sinks. Development of tools for the quantification, assessment and management of organic carbon in ecosystems at different scales and under different management scenarios is essential to achieve these commitments. The aim of this study is to establish a methodological framework for the modelling of such a tool, applied to sustainable land use planning and management at spatial and temporal scales. The methodology for carbon stock estimation in ecosystems is based on merging estimates of the carbon stored in soils and in aerial biomass. For this purpose, both a map of the spatial variability of soil organic carbon (SOC) and algorithms for the calculation of forest species biomass will be created. For the modelling of the SOC spatial distribution at different map scales, it is necessary to fit and screen the available information in legacy soil databases. Subsequently, SOC modelling will be based on the SCORPAN model, a quantitative model used to assess the correlation among soil-forming factors measured at the same site location. These factors will be selected from both static (terrain morphometric) and dynamic variables (climatic variables and vegetation indices such as NDVI), providing the model with its spatio-temporal character. After the predictive model, spatial inference techniques will be used to achieve the final map and to extrapolate the data to areas without information (automated random forest regression kriging). The estimated uncertainty will be calculated to assess the model performance at different scale approaches.
Organic carbon in aerial biomass will be estimated using LiDAR (Light Detection And Ranging) algorithms applied to the available LiDAR databases. LiDAR statistics (which describe the LiDAR point cloud in order to calculate forest stand parameters) will be correlated with different canopy cover variables. The regression models applied to the total area will produce a continuous geo-information map for each canopy variable. The CO2 estimate will be calculated using dry-mass conversion factors for each forest species (kg C to kg CO2 equivalent). The result is an organic carbon model at spatio-temporal scales, with different levels of uncertainty associated with the predictive models and the various levels of detail. However, one of the main expected problems is the heterogeneous spatial distribution of the soil information, which influences the prediction of the models at different spatial scales and, consequently, the SOC map scale. Besides this, the variability and mixture of forest species in the aerial biomass reduce the accuracy of the organic carbon assessment.
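The dry-mass conversion step mentioned above amounts to scaling dry biomass by a carbon fraction and by the CO2/C molecular-mass ratio 44/12. A sketch; the carbon fraction 0.47 is an illustrative IPCC-style default, not a species-specific factor from the study:

```python
# Dry biomass -> CO2 equivalent via the stoichiometric ratio CO2/C = 44/12.
# CARBON_FRACTION = 0.47 is an illustrative default; species-specific factors
# would replace it in practice.
CARBON_FRACTION = 0.47
CO2_PER_C = 44.0 / 12.0

def biomass_to_co2(dry_biomass_kg, carbon_fraction=CARBON_FRACTION):
    """CO2 equivalent (kg) sequestered in a given dry biomass (kg)."""
    return dry_biomass_kg * carbon_fraction * CO2_PER_C

print(round(biomass_to_co2(1000.0), 1))  # 1 t dry biomass → 1723.3
```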
Sources of Uncertainty in Predicting Land Surface Fluxes Using Diverse Data and Models
NASA Technical Reports Server (NTRS)
Dungan, Jennifer L.; Wang, Weile; Michaelis, Andrew; Votava, Petr; Nemani, Ramakrishna
2010-01-01
In the domain of predicting land surface fluxes, models are used to bring data from large observation networks and satellite remote sensing together to make predictions about present and future states of the Earth. Characterizing the uncertainty about such predictions is a complex process and one that is not yet fully understood. Uncertainty exists about initialization, measurement and interpolation of input variables; model parameters; model structure; and mixed spatial and temporal supports. Multiple models or structures often exist to describe the same processes. Uncertainty about structure is currently addressed by running an ensemble of different models and examining the distribution of model outputs. To illustrate structural uncertainty, a multi-model ensemble experiment we have been conducting using the Terrestrial Observation and Prediction System (TOPS) will be discussed. TOPS uses public versions of process-based ecosystem models that use satellite-derived inputs along with surface climate data and land surface characterization to produce predictions of ecosystem fluxes including gross and net primary production and net ecosystem exchange. Using the TOPS framework, we have explored the uncertainty arising from the application of models with different assumptions, structures, parameters, and variable definitions. With a small number of models, this only begins to capture the range of possible spatial fields of ecosystem fluxes. Few attempts have been made to systematically address the components of uncertainty in such a framework. We discuss the characterization of uncertainty for this approach including both quantifiable and poorly known aspects.
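The ensemble treatment of structural uncertainty described here reduces to running several models on a common grid and examining the per-cell distribution of their outputs. A toy numpy sketch with synthetic flux fields (not TOPS output):

```python
import numpy as np

# Stack of hypothetical GPP fields (g C m^-2 d^-1) from an ensemble of
# models on a common grid: shape (n_models, ny, nx).
rng = np.random.default_rng(1)
ensemble = rng.normal(5.0, 1.0, size=(4, 10, 10))

ens_mean = ensemble.mean(axis=0)    # best estimate per grid cell
ens_spread = ensemble.std(axis=0)   # structural-uncertainty proxy per cell

print(ens_mean.shape, round(float(ens_spread.mean()), 2))
```

As the abstract notes, a small number of models only begins to capture the range of possible fields, so the spread map is a lower bound on structural uncertainty rather than a full characterization.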
Estimating regional evapotranspiration from remotely sensed data by surface energy balance models
NASA Technical Reports Server (NTRS)
Asrar, Ghassem; Kanemasu, Edward; Myneni, R. B.; Lapitan, R. L.; Harris, T. R.; Killeen, J. M.; Cooper, D. I.; Hwang, C.
1987-01-01
Spatial and temporal variations of surface radiative temperatures of the burned and unburned areas of the Konza tallgrass prairie were studied. The role of management practices, topographic conditions and the uncertainties associated with in situ or airborne surface temperature measurements were assessed. Evaluation of diurnal and seasonal spectral characteristics of the burned and unburned areas of the prairie was also made. This was accomplished based on the analysis of measured spectral reflectance of the grass canopies under field conditions, and modelling their spectral behavior using a one dimensional radiative transfer model.
Assessment of CALIPSO's level 3 climatological product
NASA Astrophysics Data System (ADS)
Papagiannopoulos, Nikolaos; Mona, Lucia; Pappalardo, Gelsomina
2015-04-01
The latest CALIPSO Level 3 (CL3) monthly product, released in December 2011, has since been subject to calibration/validation studies. EARLINET, as the only European lidar network on a continental scale, is the key candidate for this kind of study. CALIPSO Level 3 data were compared against EARLINET monthly averages obtained from profiles acquired during satellite overpasses. Data from the stations of Potenza, Naples, Granada, Évora and Leipzig, equipped with advanced multi-wavelength Raman lidars, were used for this study. EARLINET monthly profiles yielded higher extinction values compared to the CALIPSO ones. In order to mitigate uncertainties due to spatial and temporal differences, we reproduced the CL3 filtering rubric onto the CALIPSO Level 2 data. Only CALIPSO grid overflights during EARLINET correlative measurements were used. From these data, monthly averages on a 2° x 5° grid were reconstructed. The CALIPSO monthly mean profiles following this new approach are called CALIPSO Level 3* (CL3*). This offers the possibility of achieving directly comparable datasets, even if it greatly reduces the number of satellite grid overflights. Moreover, the comparison of matched observations reduces uncertainties from the spatial variability that affects the sampled volumes. The agreement typically improved, in particular above areas directly affected by anthropogenic activities within the planetary boundary layer. In contrast to the CL3 product, the CL3* data offer the possibility to assess the CALIPSO performance also in terms of the backscatter coefficient, keeping the same quality assurance criteria applied to the extinction coefficient. Lastly, the typing capabilities of CALIPSO were assessed, outlining the importance of correct aerosol type assessment for the CALIPSO aerosol properties retrieval. This work is the first in-depth assessment to evaluate the aerosol optical properties reported in the CL3 data product.
The outcome will assist the establishment of independently derived uncertainty estimates that can be used to create more reliable model forecasts based on CALIPSO data. Moreover, the presented work can contribute to current and future studies that use space-based lidar data. Acknowledgments: The financial support for EARLINET provided by the European Union under grant RICA 025991 within the framework of the Sixth Framework Programme is gratefully acknowledged. Since 2011 EARLINET has been integrated in the ACTRIS Research Infrastructure Project supported by the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement no. 262254.
A Causal Inference Analysis of the Effect of Wildland Fire ...
Wildfire smoke is a major contributor to ambient air pollution levels. In this talk, we develop a spatio-temporal model to estimate the contribution of fire smoke to overall air pollution in different regions of the country. We combine numerical model output with observational data within a causal inference framework. Our methods account for aggregation and potential bias of the numerical model simulation, and address uncertainty in the causal estimates. We apply the proposed method to estimation of ozone and fine particulate matter from wildland fires and the impact on health burden assessment. We develop a causal inference framework to assess contributions of fire to ambient PM in the presence of spatial interference.
Visualizing uncertainties with the North Wyke Farm Platform Data Sets
NASA Astrophysics Data System (ADS)
Harris, Paul; Brunsdon, Chris; Lee, Michael
2016-04-01
The North Wyke Farm Platform (NWFP) is a systems-based, farm-scale experiment with the aim of addressing agricultural productivity and ecosystem responses to different management practices. The 63 ha site captures the spatial and/or temporal data necessary to develop a better understanding of the dynamic processes and underlying mechanisms that can be used to model how agricultural grassland systems respond to different management inputs. Via cattle beef and sheep production, the underlying principle is to manage each of three farmlets (each consisting of five hydrologically-isolated sub-catchments) in three contrasting ways: (i) improvement of permanent pasture through use of mineral fertilizers; (ii) improvement through use of legumes; and (iii) improvement through innovation. The connectivity between the timing and intensity of the different management operations, together with the transport of nutrients and potential pollutants from the NWFP is evaluated using numerous inter-linked data collection exercises. In this paper, we introduce some of the visualization opportunities that are possible with this rich data resource, and methods of analysis that might be applied to it, in particular with respect to data and model uncertainty operating across both temporal and spatial dimensions. An important component of the NWFP experiment is the representation of trade-offs with respect to: (a) economic profits, (b) environmental concerns, and (c) societal benefits, under the umbrella of sustainable intensification. Various visualizations exist to display such trade-offs and here we demonstrate ways to tailor them to relay key uncertainties and assessments of risk; and also consider how these visualizations can be honed to suit different audiences.
NASA Astrophysics Data System (ADS)
Wang, S.; Huang, G. H.; Baetz, B. W.; Huang, W.
2015-11-01
This paper presents a polynomial chaos ensemble hydrologic prediction system (PCEHPS) for an efficient and robust uncertainty assessment of model parameters and predictions, in which possibilistic reasoning is infused into probabilistic parameter inference with simultaneous consideration of randomness and fuzziness. The PCEHPS is developed through a two-stage factorial polynomial chaos expansion (PCE) framework, which consists of an ensemble of PCEs to approximate the behavior of the hydrologic model, significantly speeding up the exhaustive sampling of the parameter space. Multiple hypothesis testing is then conducted to construct an ensemble of reduced-dimensionality PCEs with only the most influential terms, which is meaningful for achieving uncertainty reduction and further acceleration of parameter inference. The PCEHPS is applied to the Xiangxi River watershed in China to demonstrate its validity and applicability. A detailed comparison between the HYMOD hydrologic model, the ensemble of PCEs, and the ensemble of reduced PCEs is performed in terms of accuracy and efficiency. Results reveal temporal and spatial variations in parameter sensitivities due to the dynamic behavior of hydrologic systems, and the effects (magnitude and direction) of parametric interactions depending on different hydrological metrics. The case study demonstrates that the PCEHPS is capable not only of capturing both expert knowledge and probabilistic information in the calibration process, but also of achieving a more than tenfold acceleration relative to the hydrologic model without compromising predictive accuracy.
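A polynomial chaos expansion of the kind underlying the PCEHPS replaces the expensive model with a polynomial in standardized random parameters. A one-parameter toy sketch fitted by least squares on probabilists' Hermite polynomials (the "model" function is invented, not HYMOD); by orthogonality, the zeroth coefficient approximates the output mean:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

# Toy "model": a nonlinear function of one standardized random parameter xi.
def model(xi):
    return np.exp(-xi) + 0.5 * xi**2

rng = np.random.default_rng(7)
xi_train = rng.normal(size=4000)
V = hermevander(xi_train, 5)  # degree-5 probabilists' Hermite basis
coef, *_ = np.linalg.lstsq(V, model(xi_train), rcond=None)

# Surrogate statistics come almost for free: coef[0] estimates the mean.
xi_test = rng.normal(size=4000)
rmse = np.sqrt(np.mean((hermevander(xi_test, 5) @ coef - model(xi_test))**2))
print(round(float(coef[0]), 3), round(float(rmse), 3))
```

Once fitted, evaluating the surrogate costs a handful of polynomial evaluations per sample, which is where the order-of-magnitude acceleration over the full model comes from.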
Hock, Sabrina; Hasenauer, Jan; Theis, Fabian J
2013-01-01
Diffusion is a key component of many biological processes such as chemotaxis, developmental differentiation and tissue morphogenesis. Recently, it has become possible to assess the spatial gradients caused by diffusion in vitro and in vivo using microscopy-based imaging techniques. The resulting time series of two-dimensional, high-resolution images, in combination with mechanistic models, enable the quantitative analysis of the underlying mechanisms. However, such a model-based analysis is still challenging due to measurement noise and sparse observations, which result in uncertainties in the model parameters. We introduce a likelihood function for image-based measurements with log-normally distributed noise. Based upon this likelihood function we formulate the maximum likelihood estimation problem, which is solved using PDE-constrained optimization methods. To assess the uncertainty and practical identifiability of the parameters we introduce profile likelihoods for diffusion processes. As a proof of concept, we model certain aspects of the guidance of dendritic cells towards lymphatic vessels, an example of haptotaxis. Using a realistic set of artificial measurement data, we estimate the five kinetic parameters of this model and compute profile likelihoods. Our novel approach for the estimation of model parameters from image data, as well as the proposed identifiability analysis approach, is widely applicable to diffusion processes. The profile likelihood based method provides more rigorous uncertainty bounds than local approximation methods.
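Profile likelihoods of the kind proposed here fix the parameter of interest on a grid and re-optimize the remaining parameters at each grid point. A sketch for a toy exponential-decay model with Gaussian noise, where the nuisance amplitude has a closed-form optimum; all values are illustrative, and 1.92 is the usual chi-square-based 95% cutoff on the negative log-likelihood:

```python
import numpy as np

# Synthetic decay data; we profile the likelihood over the rate b, optimizing
# the amplitude a analytically at each fixed b (Gaussian noise => least squares).
rng = np.random.default_rng(3)
t = np.linspace(0, 4, 60)
a_true, b_true, sigma = 2.0, 1.2, 0.05
y = a_true * np.exp(-b_true * t) + rng.normal(0, sigma, t.size)

def profile_negloglik(b):
    basis = np.exp(-b * t)
    a_hat = (basis @ y) / (basis @ basis)   # closed-form inner optimum for a
    resid = y - a_hat * basis
    return 0.5 * np.sum(resid**2) / sigma**2

b_grid = np.linspace(0.6, 1.8, 241)
nll = np.array([profile_negloglik(b) for b in b_grid])
b_mle = b_grid[np.argmin(nll)]

# Approximate 95% CI: grid points within 1.92 units of the minimum.
inside = b_grid[nll <= nll.min() + 1.92]
print(round(float(b_mle), 2), round(float(inside.min()), 2),
      round(float(inside.max()), 2))
```

The width of the `inside` interval is the rigorous uncertainty bound the abstract contrasts with local (curvature-based) approximations; for non-quadratic likelihoods the two can differ substantially.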
Submesoscale Sea Surface Temperature Variability from UAV and Satellite Measurements
NASA Astrophysics Data System (ADS)
Castro, S. L.; Emery, W. J.; Tandy, W., Jr.; Good, W. S.
2017-12-01
Technological advances in spatial resolution of observations have revealed the importance of short-lived ocean processes with scales of O(1km). These submesoscale processes play an important role for the transfer of energy from the meso- to small scales and for generating significant spatial and temporal intermittency in the upper ocean, critical for the mixing of the oceanic boundary layer. Submesoscales have been observed in sea surface temperatures (SST) from satellites. Satellite SST measurements are spatial averages over the footprint of the satellite. When the variance of the SST distribution within the footprint is small, the average value is representative of the SST over the whole pixel. If the variance is large, the spatial heterogeneity is a source of uncertainty in satellite derived SSTs. Here we show evidence that the submesoscale variability in SSTs at spatial scales of 1km is responsible for the spatial variability within satellite footprints. Previous studies of the spatial variability in SST, using ship-based radiometric data suggested that variability at scales smaller than 1 km is significant and affects the uncertainty of satellite-derived skin SSTs. We examine data collected by a calibrated thermal infrared radiometer, the Ball Experimental Sea Surface Temperature (BESST), flown on a UAV over the Arctic Ocean and compare them with coincident measurements from the MODIS spaceborne radiometer to assess the spatial variability of SST within 1 km pixels. By taking the standard deviation of all the BESST measurements within individual MODIS pixels we show that significant spatial variability exists within the footprints. The distribution of the surface variability measured by BESST shows a peak value of O(0.1K) with 95% of the pixels showing σ < 0.45K. More importantly, high-variability pixels are located at density fronts in the marginal ice zone, which are a primary source of submesoscale intermittency near the surface in the Arctic Ocean. 
Wavenumber spectra of the BESST SSTs indicate a spectral slope of -2, consistent with the presence of submesoscale processes. Furthermore, the BESST wavenumber spectrum not only matches the MODIS SST spectrum well, it also extends the -2 spectral slope by two decades relative to MODIS, from wavelengths of 8 km down to 0.08 km.
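The footprint-variability diagnostic used with BESST reduces to grouping fine-scale samples by the coarse pixel they fall in and taking per-pixel standard deviations. A synthetic numpy sketch in which a tenth of the pixels are given extra "frontal" variability (all numbers invented):

```python
import numpy as np

# Fine-scale (UAV-like) SST samples tagged with the coarse pixel they fall in;
# the per-pixel standard deviation measures sub-pixel variability.
rng = np.random.default_rng(5)
pixel_id = rng.integers(0, 100, size=20_000)       # which coarse pixel
sst = 271.5 + 0.1 * rng.standard_normal(20_000)    # synthetic skin SST (K)
frontal = pixel_id < 10                            # pretend these sit on fronts
sst[frontal] += 0.5 * rng.standard_normal(frontal.sum())

# Group samples by pixel (sort, then split at pixel boundaries) and take stds.
order = np.argsort(pixel_id)
splits = np.searchsorted(pixel_id[order], np.arange(1, 100))
per_pixel_sigma = np.array([g.std() for g in np.split(sst[order], splits)])

print(per_pixel_sigma.shape, round(float(np.median(per_pixel_sigma)), 2))
```

The "frontal" pixels stand out with roughly five times the background standard deviation, mirroring how high-variability MODIS pixels co-locate with density fronts in the marginal ice zone.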
Statistical and Spatial Analysis of Bathymetric Data for the St. Clair River, 1971-2007
Bennion, David
2009-01-01
To address questions concerning ongoing geomorphic processes in the St. Clair River, selected bathymetric datasets spanning 36 years were analyzed. Comparisons of recent high-resolution datasets covering the upper river indicate a highly variable, active environment. Although statistical and spatial comparisons of the datasets show that some changes to the channel size and shape have taken place during the study period, uncertainty associated with various survey methods and interpolation processes limit the statistically certain results. The methods used to spatially compare the datasets are sensitive to small variations in position and depth that are within the range of uncertainty associated with the datasets. Characteristics of the data, such as the density of measured points and the range of values surveyed, can also influence the results of spatial comparison. With due consideration of these limitations, apparently active and ongoing areas of elevation change in the river are mapped and discussed.
Constraining uncertainties in water supply reliability in a tropical data scarce basin
NASA Astrophysics Data System (ADS)
Kaune, Alexander; Werner, Micha; Rodriguez, Erasmo; de Fraiture, Charlotte
2015-04-01
Assessing the water supply reliability in river basins is essential for adequate planning and development of irrigated agriculture and urban water systems. In many cases hydrological models are applied to determine the surface water availability in river basins. However, surface water availability and variability are often not appropriately quantified due to epistemic uncertainties, leading to water supply insecurity. The objective of this research is to determine the water supply reliability in order to support planning and development of irrigated agriculture in a tropical, data-scarce environment. The approach proposed uses a simple hydrological model, but explicitly includes model parameter uncertainty. A transboundary river basin in the tropical region of Colombia and Venezuela, with an area of approximately 2100 km², was selected as a case study. The Budyko hydrological framework was extended to consider climatological input variability and model parameter uncertainty, and through this the reliability of the surface water supply in satisfying irrigation and urban demand was estimated. This provides a spatial estimate of the water supply reliability across the basin. For the middle basin the reliability was found to be less than 30% for most months when water is extracted from an upstream source. Conversely, the monthly water supply reliability was high (r>98%) in the lower basin irrigation areas when water was withdrawn from a source located further downstream. Including model parameter uncertainty provides a more complete estimate of the water supply reliability, though that estimate is itself influenced by the uncertainty in the model. Reducing the uncertainty in the model through improved data and perhaps improved model structure will improve the estimate of the water supply reliability, allowing better planning of irrigated agriculture and dependable water allocation decisions.
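A minimal version of the extended Budyko approach, using Fu's form of the curve and sampling its shape parameter to turn parameter uncertainty into a reliability estimate. All numbers below (monthly totals, demand, the omega range) are illustrative assumptions, not values from the study basin:

```python
import numpy as np

def fu_runoff(P, PET, omega):
    """Runoff from Fu's form of the Budyko curve: E/P depends on the
    aridity index PET/P and the shape parameter omega."""
    phi = PET / P
    evap_ratio = 1 + phi - (1 + phi**omega) ** (1.0 / omega)
    return P * (1 - evap_ratio)

rng = np.random.default_rng(11)
P, PET = 180.0, 120.0   # hypothetical monthly precipitation and PET (mm)
demand = 90.0           # hypothetical irrigation + urban demand (mm equivalent)

# Parameter uncertainty: sample the Budyko shape parameter omega.
omega = rng.uniform(1.5, 3.5, size=5000)
Q = fu_runoff(P, PET, omega)
reliability = float(np.mean(Q >= demand))
print(round(reliability, 2))
```

The reliability is simply the fraction of parameter samples (or, in the full approach, sample-month combinations) in which simulated supply meets demand, which makes its dependence on the assumed parameter range explicit.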
NASA Astrophysics Data System (ADS)
Wellen, Christopher; Arhonditsis, George B.; Long, Tanya; Boyd, Duncan
2014-11-01
Spatially distributed nonpoint source watershed models are essential tools to estimate the magnitude and sources of diffuse pollution. However, little work has been undertaken to understand the sources and ramifications of the uncertainty involved in their use. In this study we conduct the first Bayesian uncertainty analysis of the water quality components of the SWAT model, one of the most commonly used distributed nonpoint source models. Working in Southern Ontario, we apply three Bayesian configurations for calibrating SWAT to Redhill Creek, an urban catchment, and Grindstone Creek, an agricultural one. We answer four interrelated questions: can SWAT determine suspended sediment sources with confidence when end of basin data is used for calibration? How does uncertainty propagate from the discharge submodel to the suspended sediment submodels? Do the estimated sediment sources vary when different calibration approaches are used? Can we combine the knowledge gained from different calibration approaches? We show that: (i) despite reasonable fit at the basin outlet, the simulated sediment sources are subject to uncertainty sufficient to undermine the typical approach of reliance on a single, best fit simulation; (ii) more than a third of the uncertainty of sediment load predictions may stem from the discharge submodel; (iii) estimated sediment sources do vary significantly across the three statistical configurations of model calibration despite end-of-basin predictions being virtually identical; and (iv) Bayesian model averaging is an approach that can synthesize predictions when a number of adequate distributed models make divergent source apportionments. We conclude with recommendations for future research to reduce the uncertainty encountered when using distributed nonpoint source models for source apportionment.
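Bayesian model averaging, recommendation (iv) above, weights each calibration's source apportionment by its posterior model probability. A numpy sketch with hypothetical apportionments and log evidences (equal model priors assumed):

```python
import numpy as np

# Sediment source apportionments (fractions per source) from three
# calibration configurations, with hypothetical marginal log likelihoods.
apportionments = np.array([
    [0.60, 0.30, 0.10],   # config 1: upland, bank, urban
    [0.40, 0.45, 0.15],   # config 2
    [0.50, 0.35, 0.15],   # config 3
])
log_evidence = np.array([-120.0, -121.0, -123.0])  # assumed values

# Posterior model weights (equal priors): softmax of the log evidences,
# shifted by the max for numerical stability.
w = np.exp(log_evidence - log_evidence.max())
w /= w.sum()

bma_apportionment = w @ apportionments
print(np.round(w, 3), np.round(bma_apportionment, 3))
```

Because each row sums to one, the averaged apportionment does too; the weights also quantify how much each divergent calibration should influence the synthesis.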
Epistemic uncertainty in the location and magnitude of earthquakes in Italy from Macroseismic data
Bakun, W.H.; Gomez, Capera A.; Stucchi, M.
2011-01-01
Three independent techniques (Bakun and Wentworth, 1997; Boxer from Gasperini et al., 1999; and Macroseismic Estimation of Earthquake Parameters [MEEP; see Data and Resources section, deliverable D3] from R.M.W. Musson and M.J. Jimenez) have been proposed for estimating an earthquake location and magnitude from intensity data alone. The locations and magnitudes obtained for a given set of intensity data are almost always different, and no one technique is consistently best at matching instrumental locations and magnitudes of recent well-recorded earthquakes in Italy. Rather than attempting to select one of the three solutions as best, we use all three techniques to estimate the location and the magnitude and the epistemic uncertainties among them. The estimates are calculated using bootstrap resampled data sets with Monte Carlo sampling of a decision tree. The decision-tree branch weights are based on goodness-of-fit measures of location and magnitude for recent earthquakes. The location estimates are based on the spatial distribution of locations calculated from the bootstrap resampled data. The preferred source location is the locus of the maximum bootstrap location spatial density. The location uncertainty is obtained from contours of the bootstrap spatial density: 68% of the bootstrap locations are within the 68% confidence region, and so on. For large earthquakes, our preferred location is not associated with the epicenter but with a location on the extended rupture surface. For small earthquakes, the epicenters are generally consistent with the location uncertainties inferred from the intensity data if an epicenter inaccuracy of 2-3 km is allowed. The preferred magnitude is the median of the distribution of bootstrap magnitudes. As with location uncertainties, the uncertainties in magnitude are obtained from the distribution of bootstrap magnitudes: the bounds of the 68% uncertainty range enclose 68% of the bootstrap magnitudes, and so on. 
The instrumental magnitudes for large and small earthquakes are generally consistent with the confidence intervals inferred from the distribution of bootstrap resampled magnitudes.
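The bootstrap machinery described above can be sketched for the magnitude estimate alone: resample the data with replacement, take the median of each resample, and read the preferred value and 68% bounds off the bootstrap distribution (the magnitude values below are invented):

```python
import numpy as np

rng = np.random.default_rng(13)

# Hypothetical magnitude estimates derived from intensity data.
magnitudes = np.array([5.8, 6.0, 5.9, 6.1, 5.7, 6.2, 5.9, 6.0, 5.8, 6.1])

# Bootstrap: resample with replacement, take the median of each resample.
boot = np.array([np.median(rng.choice(magnitudes, magnitudes.size, replace=True))
                 for _ in range(5000)])

m_pref = float(np.median(boot))                 # preferred magnitude
lo, hi = np.percentile(boot, [16, 84])          # bounds enclosing 68%
print(round(m_pref, 2), round(float(lo), 2), round(float(hi), 2))
```

The same resampling scheme, applied to locations instead of magnitudes, yields the spatial density whose contours give the location confidence regions described in the abstract.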
Uncertainty visualisation in the Model Web
NASA Astrophysics Data System (ADS)
Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.
2012-04-01
Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and, furthermore, uncertainty information are included in the data, efficient visualisation methods are required. For uncertain spatial and spatio-temporal data, numerous visualisation methods have been developed and proposed, but only a few tools for visualising data in a standardised way exist. Furthermore, they are usually realised as thick clients and lack functionality for handling data coming from web services as envisaged in the Model Web. We present an interactive web tool for visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, i.e. pan and zoom in/out, to allow interactive control for the user. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations & Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. The uncertainty information considered for the tool comprises probabilistic, quantified attribute uncertainties, which can be provided as realisations or samples, full probability distribution functions, or statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Based on previously conducted usability studies, a differentiation between expert (in statistics or mapping) and non-expert users has been indicated as useful.
Therefore, two different modes are realised together in the tool: (i) adjacent maps showing data and uncertainty separately, and (ii) multidimensional mapping providing different visualisation methods in combination to explore the spatial, temporal and uncertainty distribution of the data. Adjacent maps allow a simpler visualisation by separating value and uncertainty maps for non-experts and a first overview. The multidimensional approach allows a more complex exploration of the data for experts by browsing through the different dimensions. It offers the visualisation of maps, statistic plots and time series in different windows and sliders to interactively move through time, space and uncertainty (thresholds).
NASA Astrophysics Data System (ADS)
Talebi, Hassan; Asghari, Omid; Emery, Xavier
2013-12-01
An accurate estimation of mineral grades in ore deposits with heterogeneous spatial variations requires defining geological domains that differentiate the types of mineralogy, alteration and lithology. Deterministic models define the layout of the domains based on the interpretation of the drill holes and do not take into account the uncertainty in areas with fewer data. Plurigaussian simulation (PGS) can be an alternative to generate multiple numerical models of the ore body, with the aim of assessing the uncertainty in the domain boundaries and improving the geological controls in the characterization of quantitative attributes. This study addresses the application of PGS to Sungun porphyry copper deposit (Iran), in order to simulate the layout of four hypogene alteration zones: potassic, phyllic, propylitic and argillic. The aim of this study is to construct numerical models in which the alteration structures reflect the evolution observed in the geology.
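The core idea of plurigaussian simulation, truncating underlying Gaussian fields against thresholds to assign categorical domains, can be sketched minimally as follows. The truncation rule, the thresholds, and the resulting proportions are all illustrative assumptions; a real PGS workflow uses spatially correlated Gaussian random fields conditioned on drill-hole data, not independent draws.

```python
import random

# Hypothetical truncation rule: two underlying Gaussian values (z1, z2) are
# compared against thresholds to assign one of four alteration domains.
# Rule and thresholds are illustrative, not those of the Sungun study.
def classify(z1, z2, t1=0.0, t2=0.0):
    if z1 < t1:
        return "potassic" if z2 < t2 else "phyllic"
    return "propylitic" if z2 < t2 else "argillic"

def simulate(n, seed=42):
    rng = random.Random(seed)
    # In a real PGS workflow z1 and z2 would come from spatially correlated
    # Gaussian random fields conditioned on data; here they are i.i.d. draws.
    return [classify(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(n)]

domains = simulate(10000)
props = {d: domains.count(d) / len(domains) for d in set(domains)}
```

With both thresholds at zero, each domain occupies roughly a quarter of the simulated cells; shifting the thresholds changes the domain proportions, which is how PGS honours the observed abundances of each alteration zone.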
Implications of Satellite Swath Width on Global Aerosol Optical Thickness Statistics
NASA Technical Reports Server (NTRS)
Colarco, Peter; Kahn, Ralph; Remer, Lorraine; Levy, Robert; Welton, Ellsworth
2012-01-01
We assess the impact of swath width on the statistics of aerosol optical thickness (AOT) retrieved by satellite as inferred from observations made by the Moderate Resolution Imaging Spectroradiometer (MODIS). We sub-sample the year 2009 MODIS data from both the Terra and Aqua spacecraft along several candidate swaths of various widths. We find that due to spatial sampling there is an uncertainty of approximately 0.01 in the global, annual mean AOT. The sub-sampled monthly mean gridded AOT are within +/- 0.01 of the full swath AOT about 20% of the time for the narrow swath sub-samples, about 30% of the time for the moderate width sub-samples, and about 45% of the time for the widest swath considered. These results suggest that future aerosol satellite missions with only a narrow swath view may not sample the true AOT distribution sufficiently to reduce significantly the uncertainty in aerosol direct forcing of climate.
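The sub-sampling experiment described above can be imitated with a toy calculation: generate a synthetic "full swath" AOT field, let narrower swaths see only a random subset of grid columns each day, and compare the resulting monthly means. All numbers (grid size, AOT distribution, swath fractions) are invented for illustration and are not the MODIS values.

```python
import random

# Synthetic sub-sampling experiment (illustrative, not MODIS data):
# a "full swath" samples every grid column each day; a narrower swath
# sees only a random fraction, biasing/broadening the monthly mean AOT.
rng = random.Random(0)
days, cols = 30, 100
aot = [[max(0.0, rng.gauss(0.15, 0.08)) for _ in range(cols)]
       for _ in range(days)]

def monthly_mean(frac):
    vals = []
    for day in aot:
        k = max(1, int(frac * cols))
        vals.extend(rng.sample(day, k))   # columns seen by this swath today
    return sum(vals) / len(vals)

full = sum(sum(day) for day in aot) / (days * cols)
narrow, wide = monthly_mean(0.2), monthly_mean(0.8)
```

The wider the swath fraction, the more draws enter the monthly mean and the smaller its sampling deviation from the full-swath value, mirroring the abstract's finding that wider swaths land within +/- 0.01 of the full-swath AOT more often.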
The Spatial Scaling of Global Rainfall Extremes
NASA Astrophysics Data System (ADS)
Devineni, N.; Xi, C.; Lall, U.; Rahill-Marier, B.
2013-12-01
Floods associated with severe storms are a significant source of risk for property, life and supply chains. These property losses tend to be determined as much by the duration of flooding as by the depth and velocity of inundation. Long-duration floods are typically induced by persistent rainfall (up to 30-day duration), as seen recently in Thailand, Pakistan, the Ohio and Mississippi Rivers, France, and Germany. Events related to persistent and recurrent rainfall appear to correspond to the persistence of specific global climate patterns that may be identifiable from global historical data fields, and also from climate models that project future conditions. A clear understanding of the space-time rainfall patterns for events or for a season will help in assessing the spatial distribution of areas likely to have a high or low inundation potential for each type of rainfall forcing. In this paper, we investigate the statistical properties of the spatial manifestation of rainfall exceedances. We also investigate the connection of persistent rainfall events at different latitudinal bands to large-scale climate phenomena such as ENSO. Finally, we present the scaling behaviour of contiguous flooded areas that results from the large-scale organization of long-duration rainfall events. This can be used for spatially distributed flood risk assessment conditional on a particular rainfall scenario, and statistical models for spatio-temporal loss simulation, including model uncertainty, can be developed to support regional and portfolio analysis.
Consistency assessment of rating curve data in various locations using Bidirectional Reach (BReach)
NASA Astrophysics Data System (ADS)
Van Eerdenbrugh, Katrien; Van Hoey, Stijn; Coxon, Gemma; Freer, Jim; Verhoest, Niko E. C.
2017-10-01
When estimating discharges through rating curves, temporal data consistency is a critical issue. In this research, consistency in stage-discharge data is investigated using a methodology called Bidirectional Reach (BReach), which starts from a definition of consistency commonly used in operational hydrology: a period is considered consistent if no consecutive and systematic deviations from the current situation occur that exceed observational uncertainty. Accordingly, the capability of a rating curve model to describe a subset of the (chronologically sorted) data is assessed at each observation by identifying the outermost data points for which the rating curve model still behaves satisfactorily. These points are called the maximum left or right reach, depending on the direction of the investigation. This temporal reach should not be confused with a spatial reach (a stretch of a river). Changes in these reaches throughout the data series indicate possible changes in data consistency which, if not resolved, could introduce additional errors and biases. In this research, various measurement stations in the UK, New Zealand and Belgium are selected based on their extensive historical ratings information and their specific characteristics related to data consistency. For each country, regional information is used as fully as possible to estimate observational uncertainty. Based on this uncertainty, a BReach analysis is performed and the results are subsequently validated against available knowledge about the history and behavior of each site. For all investigated cases, the methodology provides results that are consistent with this knowledge of historical changes and thus facilitates a reliable assessment of (in)consistent periods in stage-discharge measurements.
This assessment is not only useful for the analysis and determination of discharge time series, but also to enhance applications based on these data (e.g., by informing hydrological and hydraulic model evaluation design about consistent time periods to analyze).
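The reach computation can be illustrated with a simplified stand-in: given a series of rating-curve residuals and an observational tolerance, the right reach of a point is the outermost later point before the deviations become consecutive and systematic. The acceptance rule below (no run of more than `max_run` consecutive exceedances) is a deliberate simplification of the BReach criterion, not the published algorithm.

```python
def right_reach(residuals, tol, max_run=2):
    """For each index i, return the outermost index j >= i such that the
    residual series i..j contains no run of more than `max_run` consecutive
    deviations exceeding the observational tolerance `tol`.
    A simplified stand-in for the BReach maximum right reach."""
    n = len(residuals)
    reaches = []
    for i in range(n):
        j, run = i, 0
        for k in range(i, n):
            run = run + 1 if abs(residuals[k]) > tol else 0
            if run > max_run:          # consecutive, systematic deviation
                break
            j = k
        reaches.append(j)
    return reaches

# A consistency break in the middle of the series limits the reach of the
# early points; points after the break reach to the end of the series.
residuals = [0.1, 0.2, 0.1, 1.5, 1.4, 1.6, 1.7, 0.1, 0.2]
reaches = right_reach(residuals, tol=0.5)
```

Plotting how the reaches jump across the series is exactly the kind of diagnostic the methodology uses to flag candidate (in)consistent periods for validation against station history.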
Assessing the impact of radiative parameter uncertainty on plant growth simulation
NASA Astrophysics Data System (ADS)
Viskari, T.; Serbin, S.; Dietze, M.; Shiklomanov, A. N.
2015-12-01
Current Earth system models do not adequately project either the magnitude or the sign of carbon fluxes and storage associated with the terrestrial carbon cycle, resulting in significant uncertainties in their potential feedbacks on the future climate system. A primary reason for the current uncertainty in these models is the lack of observational constraints on key biomes at relevant spatial and temporal scales. An increasingly large and highly resolved body of remotely sensed observations can provide the critical model inputs. However, effectively incorporating these data requires the use of radiative transfer models and their associated assumptions. How these parameter assumptions and uncertainties affect model projections of, e.g., leaf physiology, soil temperature or growth has not been examined in depth. In this presentation we discuss the use of high spectral resolution observations at near-surface to landscape scales to inform ecosystem process modeling efforts, particularly the uncertainties related to properties describing the radiation regime within vegetation canopies and their impact on C cycle projections. We illustrate that leaf and wood radiative properties and their associated uncertainties have an important impact on projected forest carbon uptake and storage. We further show the need for a strong data constraint on these properties and discuss sources of this remotely sensed information and methods for assimilating the data into models. We present our approach as an efficient means for understanding and correcting implicit assumptions and model structural deficiencies in radiation transfer in vegetation canopies. Ultimately, a better understanding of the radiation balance of ecosystems will improve regional and global scale C and energy balance projections.
NASA Astrophysics Data System (ADS)
Rautman, C. A.; Treadway, A. H.
1991-11-01
Regulatory geologists are concerned with predicting the performance of sites proposed for waste disposal or for remediation of existing pollution problems. Geologic modeling of these sites requires large-scale expansion of knowledge obtained from very limited sampling. This expansion induces considerable uncertainty into the geologic models of rock properties that are required for modeling the predicted performance of the site. One method for assessing this uncertainty is through nonparametric geostatistical simulation. Simulation can produce a series of equiprobable models of a rock property of interest. Each model honors measured values at sampled locations, and each can be constructed to emulate both the univariate histogram and the spatial covariance structure of the measured data. Computing a performance model for a number of geologic simulations allows evaluation of the effects of geologic uncertainty. A site may be judged acceptable if the number of failures to meet a particular performance criterion produced by these computations is sufficiently low. A site that produces too many failures may be either unacceptable or simply inadequately described. The simulation approach to addressing geologic uncertainty is being applied to the potential high-level nuclear waste repository site at Yucca Mountain, Nevada, U.S.A. Preliminary geologic models of unsaturated permeability have been created that reproduce observed statistical properties reasonably well. A spread of unsaturated groundwater travel times has been computed that reflects the variability of those geologic models. Regions within the simulated models exhibiting the greatest variability among multiple runs are candidates for obtaining the greatest reduction in uncertainty through additional site characterization.
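The simulation idea, multiple equiprobable models that honor the measured values and whose spread in a derived quantity expresses geologic uncertainty, can be sketched in one dimension. Everything here is a toy: the sampled values, the way unsampled cells are filled, and the travel-time proxy are illustrative assumptions, and a real geostatistical simulation would also reproduce the histogram and spatial covariance of the data.

```python
import random
import statistics

# Toy stand-in for conditional simulation: a 1-D permeability profile in
# which sampled cells are fixed at their measured values and unsampled cells
# are drawn from the empirical distribution of the measurements. Each
# realisation is equiprobable and honours the data; the spread of a derived
# quantity (a crude travel-time proxy: the sum of 1/k over the column)
# reflects geologic uncertainty. All values are illustrative.
measured = {2: 0.8, 7: 1.5, 13: 0.6}      # cell index -> permeability

pool = list(measured.values())

def realisation(n_cells, rng):
    return [measured.get(i, rng.choice(pool) * rng.lognormvariate(0, 0.3))
            for i in range(n_cells)]

rng = random.Random(1)
travel_times = [sum(1.0 / k for k in realisation(20, rng))
                for _ in range(200)]
spread = statistics.stdev(travel_times)
```

Cells whose value varies most across the 200 realisations are the analogue of the regions the abstract identifies as the best candidates for additional site characterisation.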
Howell, J.E.; Moore, C.T.; Conroy, M.J.; Hamrick, R.G.; Cooper, R.J.; Thackston, R.E.; Carroll, J.P.
2009-01-01
Large-scale habitat enhancement programs for birds are becoming more widespread; however, most lack monitoring to resolve uncertainties and enhance program impact over time. Georgia's Bobwhite Quail Initiative (BQI) is a competitive, proposal-based system that provides incentives to landowners to establish habitat for northern bobwhites (Colinus virginianus). Using data from monitoring conducted in the program's first years (1999-2001), we developed alternative hierarchical models to predict bobwhite abundance in response to program habitat modifications at local and regional scales. Effects of habitat and habitat management on bobwhite population response varied among geographical scales, but high measurement variability rendered the specific nature of these scaled effects equivocal. Under some models, BQI had positive impact at both local and farm scales (1 and 9 km2), particularly when practice acres were clustered, whereas other credible models indicated that bird response did not depend on the spatial arrangement of practices. Thus, uncertainty about landscape-level effects of management presents a challenge to program managers who must decide which proposals to accept. We demonstrate that optimal selection decisions can be made despite this uncertainty and that uncertainty can be reduced over time, with consequent improvement in management efficacy. However, such an adaptive approach to BQI program implementation would require the reestablishment of monitoring of bobwhite abundance, an effort for which funding was discontinued in 2002. For landscape-level conservation programs generally, our approach demonstrates the value of assessing multiple scales of impact of habitat modification programs, and it reveals the utility of addressing management uncertainty through multiple decision models and system monitoring.
Parameter and uncertainty estimation for mechanistic, spatially explicit epidemiological models
NASA Astrophysics Data System (ADS)
Finger, Flavio; Schaefli, Bettina; Bertuzzo, Enrico; Mari, Lorenzo; Rinaldo, Andrea
2014-05-01
Epidemiological models can be a crucially important tool for decision-making during disease outbreaks. The range of possible applications spans from real-time forecasting and allocation of health-care resources to testing alternative intervention mechanisms such as vaccines, antibiotics or the improvement of sanitary conditions. Our spatially explicit, mechanistic models for cholera epidemics have been successfully applied to several epidemics, including the one that struck Haiti in late 2010 and is still ongoing. Calibration and parameter estimation of such models represent a major challenge because of properties unusual in traditional geoscientific domains such as hydrology. Firstly, the available epidemiological data may be subject to high uncertainties due to error-prone diagnosis as well as manual (and possibly incomplete) data collection. Secondly, long-term time series of epidemiological data are often unavailable. Finally, the spatially explicit character of the models requires the comparison of several time series of model outputs with their real-world counterparts, which calls for an appropriate weighting scheme. It follows that the usual assumption of a homoscedastic Gaussian error distribution, used in combination with classical calibration techniques based on Markov chain Monte Carlo algorithms, is likely to be violated, whereas the construction of an appropriate formal likelihood function seems close to impossible. Alternative calibration methods, which allow for accurate estimation of total model uncertainty, particularly regarding the envisaged use of the models for decision-making, are thus needed. Here we present the most recent developments regarding methods for parameter and uncertainty estimation to be used with our mechanistic, spatially explicit models for cholera epidemics, based on informal measures of goodness of fit.
Abidin, Tommy Rowel; Alexander, Neal; Brock, Paddy; Grigg, Matthew J.; Murphy, Amanda; William, Timothy; Menon, Jayaram; Drakeley, Chris J.; Cox, Jonathan
2016-01-01
The zoonotic malaria species Plasmodium knowlesi has become the main cause of human malaria in Malaysian Borneo. Deforestation and associated environmental and population changes have been hypothesized as main drivers of this apparent emergence. We gathered village-level data for P. knowlesi incidence for the districts of Kudat and Kota Marudu in Sabah state, Malaysia, for 2008–2012. We adjusted malaria records from routine reporting systems to reflect the diagnostic uncertainty of microscopy for P. knowlesi. We also developed negative binomial spatial autoregressive models to assess potential associations between P. knowlesi incidence and environmental variables derived from satellite-based remote-sensing data. Marked spatial heterogeneity in P. knowlesi incidence was observed, and village-level numbers of P. knowlesi cases were positively associated with forest cover and historical forest loss in surrounding areas. These results suggest the likelihood that deforestation and associated environmental changes are key drivers in P. knowlesi transmission in these areas. PMID:26812373
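The adjustment of routine malaria records for the diagnostic uncertainty of microscopy can be sketched as a simple misclassification correction. The sensitivity and specificity figures below, and the use of reported P. malariae counts as the confusable pool, are illustrative assumptions only, not the values or the exact procedure of the study.

```python
# Hedged sketch of adjusting routinely reported case counts for diagnostic
# uncertainty: microscopy both misses true P. knowlesi cases and mislabels
# other infections as P. knowlesi, so reported counts are corrected with an
# assumed sensitivity and specificity (illustrative values, not the study's).
def adjust(reported_pk, confusable, sens=0.6, spec=0.95):
    """Estimate true P. knowlesi cases when microscopy detects a fraction
    `sens` of true Pk cases and mislabels a fraction (1 - spec) of the
    confusable pool (e.g. cases read as P. malariae) as Pk, i.e.
    reported_pk = sens * true_pk + (1 - spec) * confusable."""
    true_pk = (reported_pk - (1 - spec) * confusable) / sens
    return max(0.0, true_pk)

# 60 reported Pk cases alongside 100 confusable cases imply ~92 true cases
# under these assumed error rates.
estimate = adjust(60, 100)
```

Village-level counts adjusted this way would then feed the spatial regression models, so diagnostic uncertainty propagates into the estimated environmental associations.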
Carbon Budget and its Dynamics over Northern Eurasia Forest Ecosystems
NASA Astrophysics Data System (ADS)
Shvidenko, Anatoly; Schepaschenko, Dmitry; Kraxner, Florian; Maksyutov, Shamil
2016-04-01
The presentation contains an overview of recent findings and results from the assessment of carbon cycling in the forest ecosystems of Northern Eurasia. From a methodological point of view, there is a clear trend towards recognising the need for a Full and Verified Carbon Account (FCA), i.e. a reliable assessment of uncertainties for all modules and all stages of the account. The FCA is considered a fuzzy (underspecified) system that presupposes a systematic integration of the major methods of carbon-cycle study: the landscape-ecosystem approach (LEA), process-based models, eddy covariance, and inverse modelling. The landscape-ecosystem approach (1) serves to accumulate all relevant knowledge of landscapes and ecosystems; (2) provides a strict systems design for the account; (3) contains all relevant spatially distributed empirical and semi-empirical data and models; and (4) is presented in the form of an Integrated Land Information System (ILIS). The ILIS includes a hybrid land cover in a spatially and temporally explicit way, together with corresponding attributive databases. The forest mask is provided by utilising multi-sensor remote sensing data, geographically weighted regression, and validation within the GEO-wiki platform. By-pixel parametrisation of forest cover is based on special optimisation algorithms using all available knowledge and information sources (forest inventory data, different surveys, in situ observations, official statistics of forest management, etc.). The major carbon fluxes within the LEA (NPP, heterotrophic respiration, disturbances, etc.) are estimated by fusing empirical data and aggregations with process-based elements through sets of regionally distributed models. Uncertainties within the LEA are assessed for each module and at each step of the account. The LEA results and their corresponding uncertainties are then harmonised and mutually constrained with independent outputs obtained by the other methods on the basis of a Bayesian approach.
The above methodology has been applied to the carbon account of Russian forests for 2000-2012. It has been shown that the Net Ecosystem Carbon Budget (NECB) of Russian forests for this period was in the range of 0.5-0.7 Pg C yr-1, with a slight negative trend over the period due to the acceleration of disturbance regimes and the negative impacts of weather extremes (heat waves, etc.). Uncertainties of the FCA for individual years were estimated at about 25% (CI 0.9). It has been shown that some models (e.g., the majority of DGVMs) do not describe some permafrost processes satisfactorily, whereas the results of applying ensembles of inverse models are, on average, close to the empirical assessments. The most important conclusion from this experience is that future improvement of our knowledge of the carbon cycling of Northern Eurasian forests requires the development of an integrated observing system as a unified information background, as well as systematic methodological improvement of all the methods used to study carbon cycling.
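Mutually constraining two independent flux estimates can be illustrated with the simplest Gaussian-Bayesian fusion, inverse-variance (precision) weighting. This is only a sketch of the harmonisation idea; the numbers are invented for illustration (in Pg C yr-1) and the study's actual Bayesian constraint is more elaborate.

```python
# Hedged sketch of mutually constraining two independent flux estimates
# (e.g., a landscape-ecosystem account vs. inverse modelling) by
# inverse-variance weighting, a minimal Gaussian-Bayesian fusion.
def fuse(est_a, sd_a, est_b, sd_b):
    wa, wb = 1.0 / sd_a**2, 1.0 / sd_b**2   # precisions as weights
    fused = (wa * est_a + wb * est_b) / (wa + wb)
    sd = (wa + wb) ** -0.5                   # fused uncertainty shrinks
    return fused, sd

# Illustrative inputs only (Pg C yr-1), not the study's actual estimates.
necb, sd = fuse(0.6, 0.15, 0.55, 0.10)
```

The fused estimate lies between the two inputs, pulled towards the more precise one, and its uncertainty is smaller than either input's, which is the sense in which independent methods "mutually constrain" the account.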
NASA Astrophysics Data System (ADS)
Xiao, X.; Cohan, D. S.
2009-12-01
Substantial uncertainties in current emission inventories have been detected by the Texas Air Quality Study 2006 (TexAQS 2006) intensive field program. These emission uncertainties have caused large inaccuracies in model simulations of air quality and of its responses to management strategies. To improve the quantitative understanding of the temporal, spatial, and categorical distributions of primary pollutant emissions by utilizing the corresponding measurements collected during TexAQS 2006, we implemented both the recursive Kalman filter and a batch matrix inversion 4-D data assimilation (FDDA) method in an iterative inverse modeling framework based on the CMAQ-DDM model. Equipped with the decoupled direct method, CMAQ-DDM enables simultaneous calculation of the sensitivity coefficients of pollutant concentrations to emissions for use in the inversions. Primary pollutant concentrations measured by multiple platforms (TCEQ ground-based stations, the NOAA WP-3D aircraft and Ronald H. Brown vessel, and the UH Moody Tower) during TexAQS 2006 have been integrated for use in the inverse modeling. First, pseudo-data analyses have been conducted to assess the two methods, taking a coarse spatial resolution emission inventory as a test case. Model base-case concentrations of isoprene and ozone at arbitrarily selected ground grid cells were perturbed to generate pseudo-measurements with different assumed Gaussian uncertainties expressed as 1-sigma standard deviations. Single-species inversions have been conducted with both methods for isoprene and NOx surface emissions from eight states in the southeastern United States, using the pseudo-measurements of isoprene and ozone, respectively. Utilization of ozone pseudo-data to invert for NOx emissions serves only the purpose of method assessment.
Both the Kalman filter and FDDA methods show good performance in tuning arbitrarily shifted a priori emissions to the base case “true” values within 3-4 iterations even for the nonlinear responses of ozone to NOx emissions. While the Kalman filter has better performance under the situation of very large observational uncertainties, the batch matrix FDDA method is better suited for incorporating temporally and spatially irregular data such as those measured by NOAA aircraft and ship. After validating the methods with the pseudo data, the inverse technique is applied to improve emission estimates of NOx from different source sectors and regions in the Houston metropolitan area by using NOx measurements during TexAQS 2006. EPA NEI2005-based and Texas-specified Emission Inventories for 2006 are used as the a priori emission estimates before optimization. The inversion results will be presented and discussed. Future work will conduct inverse modeling for additional species, and then perform a multi-species inversion for emissions consistency and reconciliation with secondary pollutants such as ozone.
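The recursive idea behind the Kalman-filter inversion, nudging an arbitrarily shifted a priori emission towards the "true" value using the concentration-to-emission sensitivity, can be shown with a scalar update. This is a minimal sketch, not the CMAQ-DDM implementation: the state is a single emission scaling factor, the sensitivity `h` and error variance `r` are invented, and the observation is noiseless.

```python
# Minimal scalar Kalman measurement update (illustrative, not CMAQ-DDM):
# state x is an emission scaling factor with variance p, observed through
# y = h * x_true + noise, where h is the concentration/emission sensitivity.
def kalman_update(x, p, y, h, r):
    k = p * h / (h * h * p + r)    # Kalman gain
    x_new = x + k * (y - h * x)    # correct state with the innovation
    p_new = (1 - k * h) * p        # posterior variance shrinks
    return x_new, p_new

# Recover a "true" scaling of 1.0 from an arbitrarily shifted prior of 2.5
# within a few iterations, echoing the 3-4 iterations in the abstract.
x, p = 2.5, 1.0                    # shifted a priori and its variance
h, r, true = 0.8, 0.01, 1.0
for _ in range(4):
    x, p = kalman_update(x, p, true * h, h, r)
```

After the first update the state already lands near 1.0, and subsequent updates tighten it further while the variance `p` shrinks, which is the convergence behaviour the pseudo-data experiments test.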
Pesticide risk assessment in free-ranging bees is weather and landscape dependent.
Henry, Mickaël; Bertrand, Colette; Le Féon, Violette; Requier, Fabrice; Odoux, Jean-François; Aupinel, Pierrick; Bretagnolle, Vincent; Decourtye, Axel
2014-07-10
The risk assessment of plant protection products on pollinators is currently based on the evaluation of lethal doses through repeatable lethal toxicity laboratory trials. Recent advances in honeybee toxicology have, however, raised interest in assessing sublethal effects in free-ranging individuals. Here, we show that the sublethal effects of a neonicotinoid pesticide are modified in magnitude by environmental interactions specific to the landscape and the time of exposure events. Field sublethal assessment is therefore context dependent and should be addressed in a temporally and spatially explicit way, especially regarding weather and landscape physiognomy. We further develop an analytical Effective Dose (ED) framework to help disentangle context-induced from treatment-induced effects and thus to alleviate uncertainty in field studies. Although the ED framework involves trials at concentrations above the expected field exposure levels, it allows explicit delineation of the climatic and landscape contexts that should be targeted for in-depth, higher-tier risk assessment.
NASA Astrophysics Data System (ADS)
Fischbach, J. R.; Johnson, D.
2017-12-01
Louisiana's Comprehensive Master Plan for a Sustainable Coast is a 50-year plan designed to reduce flood risk and minimize land loss while allowing for the continued provision of economic and ecosystem services from this critical coastal region. Conceived in 2007 in response to hurricanes Katrina and Rita in 2005, the master plan is updated on a five-year planning cycle by the state's Coastal Protection and Restoration Authority (CPRA). Under the plan's middle-of-the-road (Medium) environmental scenario, the master plan is projected to reduce expected annual damage from storm surge flooding by approximately 65% relative to a future without action: from $5.3 billion to $2.2 billion in 2040, and from $12.1 billion to $3.7 billion in 2065. The Coastal Louisiana Risk Assessment model (CLARA) is used to estimate the risk reduction impacts of projects that have been considered for implementation as part of the plan. Evaluation of projects involves estimation of cost effectiveness in multiple future time periods and under a range of environmental uncertainties (e.g., the rates of sea level rise and land subsidence, changes in future hurricane intensity and frequency), operational uncertainties (e.g., system fragility), and economic uncertainties (e.g., patterns of population change and asset exposure). Between the 2012 and 2017 planning cycles, many improvements were made to the CLARA model. These included changes to the model's spatial resolution and definition of policy-relevant spatial units, an improved treatment of parametric uncertainty and uncertainty propagation between model components, the addition of a module to consider critical infrastructure exposure, and a new population growth model. CPRA also developed new scenarios for analysis in 2017 that were responsive to new scientific literature and to accommodate a new approach to modeling coastal morphology.
In this talk, we discuss how CLARA has evolved over the 2012 and 2017 planning cycles in response to the needs of policy makers and CPRA managers. While changes will be illustrated through examples from Louisiana's 2017 Coastal Master Plan, we endeavor to provide generalizable and actionable insights about how modeling choices should be guided by the decision support process being used by planners.
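An expected annual damage (EAD) figure of the kind quoted above is, in essence, the integral of damage over annual exceedance probability. A minimal sketch, with damage estimates at a few return periods invented purely for illustration (they are not CLARA outputs), is:

```python
# Hedged sketch of an expected annual damage (EAD) calculation: damages
# estimated for events at a few return periods are integrated over annual
# exceedance probability with the trapezoidal rule. Figures are invented.
return_periods = [10, 50, 100, 500]            # years
damages = [0.5, 2.0, 4.0, 9.0]                 # $ billion per event size

probs = [1.0 / rp for rp in return_periods]    # annual exceedance probs
ead = 0.0
for i in range(len(probs) - 1):
    width = probs[i] - probs[i + 1]            # probability interval
    ead += 0.5 * (damages[i] + damages[i + 1]) * width
```

Repeating this calculation for each environmental, operational, and economic scenario, and for each candidate project, is what turns the flood-depth outputs of a risk model into the cost-effectiveness comparisons used in plan formulation.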
NASA Astrophysics Data System (ADS)
Nepal, S.
2016-12-01
The spatial transferability of the model parameters of the process-oriented distributed J2000 hydrological model was investigated in two glaciated sub-catchments of the Koshi river basin in eastern Nepal. The basins had a high degree of similarity with respect to their static landscape features. The model was first calibrated (1986-1991) and validated (1992-1997) in the Dudh Koshi sub-catchment. The calibrated and validated model parameters were then transferred to the nearby Tamor catchment (2001-2009). A sensitivity and uncertainty analysis was carried out for both sub-catchments to establish the sensitivity range of the parameters in the two catchments. The model represented the overall hydrograph well in both sub-catchments, including baseflow and medium-range flows (rising and recession limbs). The efficiency results according to both the Nash-Sutcliffe coefficient and the coefficient of determination were above 0.84 in both cases. The sensitivity analysis showed that the same parameter was most sensitive for the Nash-Sutcliffe (ENS) and Log Nash-Sutcliffe (LNS) efficiencies in both catchments. However, there were some differences in sensitivity to ENS and LNS for moderately and weakly sensitive parameters, although the majority (13 out of 16 for ENS and 16 out of 16 for LNS) had sensitivity responses in a similar range. The generalized likelihood uncertainty estimation (GLUE) results suggest that most of the time the observed runoff lies within the parameter uncertainty range, although occasionally the values fall outside it, especially during flood peaks and more often in the Tamor. This may be due to the limited input data resulting from the small number of precipitation stations and the lack of representative stations in high-altitude areas, as well as to model structural uncertainty. The results indicate that transfer of the J2000 parameters to a neighboring catchment in the Himalayan region with similar physiographic landscape characteristics is viable.
This indicates the possibility of applying the process-based J2000 model to ungauged catchments in the Himalayan region, which could provide important insights into hydrological system dynamics and much-needed information to support water resources planning and management.
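The GLUE procedure referred to above can be sketched in a few lines: sample parameter sets, retain the "behavioral" ones whose likelihood measure (here Nash-Sutcliffe efficiency) exceeds a threshold, and read the parameter and prediction uncertainty off the behavioral ensemble. The toy recession model, the threshold, and all values below are illustrative stand-ins, not the J2000 setup.

```python
import random

# Minimal GLUE-style sketch: Monte Carlo parameter sampling, a behavioral
# threshold on Nash-Sutcliffe efficiency, and the resulting parameter range.
def nse(obs, sim):
    mean = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean) ** 2 for o in obs)
    return 1 - num / den

def model(k, t):
    # Toy recession model: discharge decays by a factor (1 - k) per step.
    return [10.0 * (1 - k) ** i for i in range(t)]

rng = random.Random(0)
obs = model(0.3, 20)                 # synthetic "observations" from k = 0.3

behavioral = []
for _ in range(2000):
    k = rng.uniform(0.05, 0.6)       # sample from the prior parameter range
    sim = model(k, 20)
    if nse(obs, sim) > 0.8:          # behavioral threshold (illustrative)
        behavioral.append((k, sim))

ks = sorted(k for k, _ in behavioral)
k_lo, k_hi = ks[0], ks[-1]           # behavioral parameter range
```

Prediction bounds at each time step are then taken across the behavioral simulations; observed runoff falling outside those bounds, as during the Tamor flood peaks, signals input or structural error rather than parameter uncertainty alone.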
NASA Astrophysics Data System (ADS)
Henne, Stephan; Leuenberger, Markus; Steinbacher, Martin; Eugster, Werner; Meinhardt, Frank; Bergamaschi, Peter; Emmenegger, Lukas; Brunner, Dominik
2017-04-01
Similar to other Western European countries, agricultural sources dominate the methane (CH4) emission budget in Switzerland. 'Bottom-up' estimates of these emissions are still subject to relatively large uncertainties due to considerable variability and uncertainty in observed emission factors for the underlying processes (e.g., enteric fermentation, manure management). Here, we present a regional-scale (~300 x 200 km2) atmospheric inversion study of CH4 emissions in Switzerland making use of the recently established CarboCount-CH network of four stations on the Swiss Plateau as well as the neighbouring mountain-top sites Jungfraujoch and Schauinsland (Germany). Continuous observations from all CarboCount-CH sites have been available since 2013. We use a high-resolution (7 x 7 km2) Lagrangian particle dispersion model (FLEXPART-COSMO) in connection with two different inversion systems (Bayesian and extended Kalman filter) to estimate spatially and temporally resolved CH4 emissions for the Swiss domain in the period 2013 to 2016. An extensive set of sensitivity inversions is used to assess the overall uncertainty of our inverse approach. In general, we find good agreement between our 'top-down' estimate of total Swiss CH4 emissions and the national 'bottom-up' reporting. In addition, a robust emission seasonality, with reduced wintertime values, can be seen in all years. No significant trend or year-to-year variability was observed for the analysed four-year period, again in agreement with a very small downward trend in the national 'bottom-up' reporting. Special attention is given to the influence of boundary conditions as taken from different global-scale model simulations (TM5, FLEXPART) and remote observations. We find that uncertainties in the boundary conditions can induce large offsets in the national total emissions. However, spatial emission patterns are less sensitive to the choice of boundary condition.
Furthermore, in order to demonstrate the validity of our approach, we present a series of inversion runs using synthetic observations generated from 'true' emissions in combination with various sources of uncertainty.
NASA Astrophysics Data System (ADS)
Jalali, Mohammad; Ramazi, Hamidreza
2018-06-01
Earthquake catalogues are the main data source in statistical seismology for long-term studies of earthquake occurrence. Addressing spatiotemporal problems in these catalogues is therefore important for reducing the related uncertainties in statistical seismology studies. A statistical tool, the time normalization method, was applied to revise the time-frequency relationship in one of the most active regions of Asia, eastern Iran and western Afghanistan (a and b were calculated as approximately 8.84 and 1.99 on the exponential scale, not the logarithmic scale). A geostatistical simulation method was further utilized to reduce the uncertainties in the spatial domain, producing a representative synthetic catalogue of 5361 events. The synthetic database is classified using a Geographical Information System (GIS) on the basis of simulated magnitudes to reveal the underlying seismicity patterns. Although some regions of high seismicity correspond to known faults, significantly, as far as seismic patterns are concerned, the new method highlights possible locations of interest that have not been previously identified. It also reveals some previously unrecognized lineations and clusters of likely future strain release.
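Fitting a frequency-magnitude relation of the exponential form N(M) = a exp(-bM) can be sketched by regressing log counts on magnitude for a synthetic catalogue. The catalogue, thresholds, and the b-value of about 2 below are illustrative only and are not the study's values, and the simple least-squares fit on cumulative counts is a textbook shortcut rather than the time normalization method itself.

```python
import math
import random

# Hedged sketch: fit N(M) = a * exp(-b * M) by least squares on log N,
# using a synthetic catalogue with a known decay rate (~2) for checking.
rng = random.Random(3)
mags = [rng.expovariate(2.0) + 3.0 for _ in range(5000)]  # M >= 3.0

# Cumulative counts N(M >= m) at a set of magnitude thresholds.
thresholds = [3.0 + 0.2 * i for i in range(10)]
counts = [sum(m >= t for m in mags) for t in thresholds]

# Linear regression of log N on M: slope = -b, intercept = log a.
xs, ys = thresholds, [math.log(c) for c in counts]
n = len(xs)
sx, sy = sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = -slope
```

On the exponential scale the decay parameter b is recovered directly from the slope; on the more common logarithmic (base-10) Gutenberg-Richter scale the equivalent b-value would be this slope divided by ln 10.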
Assessment of Global Mercury Deposition through Litterfall.
Wang, Xun; Bao, Zhengduo; Lin, Che-Jen; Yuan, Wei; Feng, Xinbin
2016-08-16
There is a large uncertainty in the estimate of global dry deposition of atmospheric mercury (Hg). Hg deposition through litterfall represents an important input to terrestrial forest ecosystems via cumulative foliar uptake of atmospheric Hg (mostly Hg(0)). In this study, we estimate the quantity of global Hg deposition through litterfall using statistical modeling (Monte Carlo simulation) of published data sets of litterfall biomass production, tree density, and Hg concentration in litter samples. On the basis of the model results, the global annual Hg deposition through litterfall is estimated to be 1180 ± 710 Mg yr(-1), more than two times greater than the estimate by GEOS-Chem. The spatial distribution of Hg deposition through litterfall suggests that the deposition flux decreases from tropical to temperate and boreal regions. Approximately 70% of global Hg(0) dry deposition occurs in the tropical and subtropical regions. A major source of uncertainty in this study is the heterogeneous geospatial distribution of available data. More observational data in regions where few data sets exist (Southeast Asia, Africa, and South America) will greatly improve the accuracy of the current estimate. Given that the quantity of global Hg deposition via litterfall is typically 2-6 times higher than Hg(0) evasion from the forest floor, global forest ecosystems represent a strong Hg(0) sink.
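The Monte Carlo approach, propagating uncertainty in litterfall biomass and litter Hg concentration through to a global flux with a mean and spread, can be sketched as follows. The regional breakdown, distribution shapes, and every number below are placeholders chosen only to give a total of the right order of magnitude; they are not the study's data sets.

```python
import random
import statistics

# Monte Carlo sketch of a deposition flux as litterfall biomass x litter Hg
# concentration, summed over regions. All figures are illustrative.
regions = {                       # (biomass Tg/yr, sd), (Hg ng/g, sd)
    "tropical":  ((12000, 2000), (55, 15)),
    "temperate": ((5000, 1000), (40, 10)),
    "boreal":    ((3000, 800), (30, 8)),
}

def one_draw(rng):
    total = 0.0
    for (bm, bsd), (c, csd) in regions.values():
        biomass = max(0.0, rng.gauss(bm, bsd))   # Tg dry mass per year
        conc = max(0.0, rng.gauss(c, csd))       # ng Hg per g dry mass
        total += biomass * conc / 1e3            # Tg x ng/g -> Mg Hg
    return total

rng = random.Random(7)
draws = [one_draw(rng) for _ in range(5000)]
mean = statistics.mean(draws)
sd = statistics.stdev(draws)
```

Reporting the flux as mean ± spread of the draws is exactly how an estimate of the form "1180 ± 710 Mg yr(-1)" arises, and adding regions with sparse data widens the spread rather than shifting the mean.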
Hybel, A-M; Godskesen, B; Rygaard, M
2015-09-01
Indicators of the impact on freshwater resources are becoming increasingly important in the evaluation of urban water systems. To reveal the importance of spatial resolution, we investigated how the choice of catchment scale influenced the freshwater impact assessment. Two different indicators were used in this study: the Withdrawal-To-Availability ratio (WTA) and the Water Stress Index (WSI). Results were calculated for three groundwater-based Danish urban water supplies (Esbjerg, Aarhus, and Copenhagen). The assessment was carried out at three spatial levels: (1) the groundwater body level, (2) the river basin level, and (3) the regional level. The assessments showed that Copenhagen's water supply had the highest impact on the freshwater resource per cubic meter of water abstracted, with a WSI of 1.75 at Level 1. The WSI values were 1.64 for Aarhus's and 0.81 for Esbjerg's water supply. Spatial resolution was identified as a major factor determining the outcome of the impact assessment. For the three case studies, WTA and WSI were 27%-583% higher at Level 1 than impacts calculated for the regional scale. The results highlight that freshwater impact assessments based on regional data, rather than sub-river basin data, may dramatically underestimate the actual impact on the water resource. Furthermore, this study discusses the strengths and shortcomings of the applied indicator approaches. A sensitivity analysis demonstrates that although WSI has the highest environmental relevance, it also has the highest uncertainty, as it requires estimates of non-measurable environmental water requirements. Hence, the development of a methodology to obtain more site-specific and relevant estimates of environmental water requirements should be prioritized. Finally, the demarcation of the groundwater resource in aquifers remains a challenge for establishing a consistent method for benchmarking freshwater impacts caused by groundwater abstraction.
Copyright © 2015 Elsevier Ltd. All rights reserved.
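The two indicators can be sketched as simple ratios; all figures below are hypothetical, and the WSI form shown (withdrawal over availability minus environmental water requirements, EWR) is one common formulation, not necessarily the exact one used in the study:

```python
# Hedged sketch: hypothetical volumes (million m3/yr), not the Danish data.

def wta(withdrawal, availability):
    """Withdrawal-To-Availability ratio."""
    return withdrawal / availability

def wsi(withdrawal, availability, ewr):
    """Water Stress Index: withdrawal relative to the resource remaining
    after environmental water requirements (EWR) are reserved."""
    return withdrawal / (availability - ewr)

# The same abstraction assessed at two spatial levels:
withdrawal = 50.0
avail_groundwater_body = 80.0   # Level 1: local groundwater body
avail_regional = 400.0          # Level 3: whole region

wta_local = wta(withdrawal, avail_groundwater_body)
wta_regional = wta(withdrawal, avail_regional)
wsi_local = wsi(withdrawal, avail_groundwater_body, 0.3 * avail_groundwater_body)
```

Because the same withdrawal is divided by a much larger regional availability, the local ratio is several times the regional one, which mirrors the scale effect the study reports.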
NASA Technical Reports Server (NTRS)
Hall, Dorothy K.; Box, Jason E.; Casey, Kimberly A.; Hook, Simon J.; Shuman, Christopher A.; Steffen, Konrad
2008-01-01
The most practical way to obtain spatially broad and continuous measurements of surface temperature in the data-sparse cryosphere is satellite remote sensing. The uncertainties in satellite-derived LSTs must be understood to develop the internally consistent, decade-scale land-surface temperature (LST) records needed for climate studies. In this work we assess satellite-derived "clear-sky" LST products from the Moderate Resolution Imaging Spectroradiometer (MODIS) and the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), and LSTs derived from the Enhanced Thematic Mapper Plus (ETM+), over snow and ice on Greenland. When possible, we compare satellite-derived LSTs with in-situ air-temperature observations from Greenland Climate Network (GC-Net) automatic weather stations (AWS). We find that MODIS, ASTER and ETM+ provide reliable and consistent LSTs under clear-sky conditions and relatively flat terrain over snow and ice targets across a temperature range from -40 to 0 C. The satellite-derived LSTs agree within a relative RMS uncertainty of approximately 0.5 C. The good agreement among the LSTs derived from the various satellite instruments is especially notable because different spectral channels and different retrieval algorithms are used to calculate LST from the raw satellite data. The AWS record in-situ data at a "point," while the satellite instruments record data over areas varying in size from 57 x 57 m (ETM+) and 90 x 90 m (ASTER) to 1 x 1 km (MODIS). Surface topography and other factors contribute to variability of LST within a pixel, so the AWS measurements may not be representative of the LST of the pixel. Without more information on the local spatial patterns of LST, the AWS LST cannot be considered valid ground truth for the satellite measurements, with an RMS uncertainty of approximately 2 C. Despite the relatively large AWS-derived uncertainty, we find the satellite LST data are characterized by high relative accuracy, though their absolute accuracy remains uncertain.
Fine-grained suspended sediment source identification for the Kharaa River basin, northern Mongolia
NASA Astrophysics Data System (ADS)
Rode, Michael; Theuring, Philipp; Collins, Adrian L.
2015-04-01
Fine sediment inputs into river systems can be a major source of nutrients and heavy metals and have a strong impact on the water quality and ecosystem functions of rivers and lakes, including those in semi-arid regions. However, little is known to date about the spatial distribution of sediment sources in most large-scale river basins in Central Asia. Accordingly, a sediment source fingerprinting technique was used to assess the spatial sources of fine-grained (<10 microns) sediment in the 15 000 km2 Kharaa River basin in northern Mongolia. Five field sampling campaigns in late summer 2009, and spring and late summer in both 2010 and 2011, were conducted directly after high water flows, to collect a total of 900 sediment samples. The work used a statistical approach for sediment source discrimination with geochemical composite fingerprints based on a new Genetic Algorithm (GA)-driven Discriminant Function Analysis, the Kruskal-Wallis H-test and Principal Component Analysis. The composite fingerprints were subsequently used for numerical mass balance modelling with uncertainty analysis. The contributions of the individual sub-catchment spatial sediment sources varied from 6.4% (the headwater sub-catchment of Sugnugur Gol) to 36.2% (the Kharaa II sub-catchment in the middle reaches of the study basin), with the pattern generally showing higher contributions from the sub-catchments in the middle, rather than the upstream, portions of the study area. The importance of riverbank erosion was shown to increase from upstream to midstream tributaries. The source tracing procedure provides results in reasonable accordance with previous findings in the study region and demonstrates the general applicability, and associated uncertainties, of an approach for fine-grained sediment source investigation in large-scale semi-arid catchments.
The combined application of source fingerprinting and catchment modelling approaches can be used to assess whether tracing estimates are credible and in combination such approaches provide a basis for making sediment source apportionment more compelling to catchment stakeholders and managers.
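The mass-balance unmixing step at the heart of source fingerprinting can be sketched for a two-source case; the tracer values below are hypothetical, not the Kharaa basin geochemistry:

```python
import numpy as np

# Hypothetical tracer concentrations (mg kg^-1) for two spatial sources and
# a downstream sediment mixture (illustrative fingerprints only).
source_a = np.array([120.0, 35.0, 8.0])   # e.g. sub-catchment A fingerprint
source_b = np.array([60.0, 90.0, 20.0])   # e.g. sub-catchment B fingerprint
mixture  = np.array([96.0, 57.0, 12.8])   # sediment sampled at the outlet

# Mass-balance unmixing: find p (proportion from source A) minimising the
# squared misfit of p*source_a + (1-p)*source_b against the mixture,
# with p constrained to [0, 1] via a simple grid search.
p_grid = np.linspace(0.0, 1.0, 1001)
model = np.outer(p_grid, source_a) + np.outer(1 - p_grid, source_b)
misfit = ((model - mixture) ** 2).sum(axis=1)
p_best = p_grid[np.argmin(misfit)]
```

In practice the published studies solve a multi-source version of this problem and wrap it in Monte Carlo resampling to attach uncertainty ranges to the apportioned proportions.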
Lin, Wei-Chih; Lin, Yu-Pin; Wang, Yung-Chieh; Chang, Tsun-Kuo; Chiang, Li-Chi
2014-02-21
In this study, a deconvolution procedure was used to create a variogram of oral cancer (OC) rates. Based on the variogram, area-to-point (ATP) Poisson kriging and p-field simulation were used to downscale and simulate, respectively, the OC rate data for Taiwan from the district scale to a 1 km × 1 km grid scale. Local cluster analysis (LCA) of OC mortality rates was then performed to identify OC mortality rate hot spots based on the downscaled and the p-field-simulated OC mortality maps. The relationship between OC mortality and land use was studied by overlaying the maps of the downscaled OC mortality, the LCA results, and the land uses. One thousand simulations were performed to quantify local and spatial uncertainties in the LCA identification of OC mortality hot spots. Scatter plots and Spearman's rank correlation were used to examine the relationship between OC mortality and the concentrations of seven heavy metals in soil on the 1 km grid. The correlation analysis results for the 1 km scale revealed a weak correlation between the OC mortality rate and the concentrations of the seven studied heavy metals in soil. Accordingly, the heavy metal concentrations in soil are not major determinants of OC mortality rates at the 1 km scale at which soils were sampled. The LCA results, based on the local indicator of spatial association (LISA), revealed that the sites with a high probability of high-high (high value surrounded by high values) OC mortality at the 1 km grid scale were clustered in southern, eastern, and mid-western Taiwan. The number of such sites was also significantly higher on agricultural land and in urban regions than on land with other uses. The proposed approach can be used to downscale and evaluate uncertainty in mortality data from a coarse scale to a fine scale at which useful additional information can be obtained for assessing and managing land use and risk.
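The Spearman's rank correlation used to relate mortality to soil metal concentrations can be sketched as follows, with synthetic grid-cell values standing in for the Taiwan data:

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks
    (no tie correction; adequate for continuous grid-cell values)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return np.corrcoef(rx, ry)[0, 1]

# Hypothetical 1 km grid cells: a soil heavy-metal concentration and a
# downscaled OC mortality rate with only a weak link (illustrative values).
rng = np.random.default_rng(1)
metal = rng.lognormal(0.0, 1.0, size=500)
mortality = 0.1 * np.log(metal) + rng.normal(0.0, 1.0, size=500)
rho = spearman_rho(metal, mortality)
```

Because the synthetic link is deliberately weak, the resulting rank correlation is small in magnitude, echoing the weak association the study reports at the 1 km scale.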
Setting priorities for research on pollution reduction functions of agricultural buffers.
Dosskey, Michael G
2002-11-01
The success of buffer installation initiatives and programs to reduce nonpoint source pollution of streams on agricultural lands will depend on the ability of local planners to locate and design buffers for specific circumstances with substantial and predictable results. Current predictive capabilities are inadequate, and major sources of uncertainty remain. An assessment of these uncertainties cautions that there is a greater risk of overestimating buffer impact than of underestimating it. Priorities for future research are proposed that will lead more quickly to major advances in predictive capabilities. Highest priority is given to work on the surface runoff filtration function, which is almost universally important to the amount of pollution reduction expected from buffer installation and for which major sources of uncertainty remain in predicting the level of impact. Foremost uncertainties surround the extent and consequences of runoff flow concentration and pollutant accumulation. Other buffer functions, including filtration of groundwater nitrate and stabilization of channel erosion sources of sediments, may be important in some regions. However, uncertainty surrounds our ability to identify and quantify the extent of site conditions where buffer installation can substantially reduce stream pollution in these ways. Deficiencies in predictive models reflect gaps in experimental information as well as in technology to account for spatial heterogeneity of pollutant sources, pathways, and buffer capabilities across watersheds. Since completion of a comprehensive watershed-scale buffer model is probably far off, immediate needs call for simpler techniques to gauge the probable impacts of buffer installation at local scales.
NASA Astrophysics Data System (ADS)
Mujumdar, Pradeep P.
2014-05-01
Climate change results in regional hydrologic change. The three prominent signals of global climate change, viz., increase in global average temperatures, rise in sea levels and change in precipitation patterns, translate into signals of regional hydrologic change in terms of modifications in water availability, evaporative water demand, hydrologic extremes of floods and droughts, water quality, salinity intrusion in coastal aquifers, groundwater recharge and other related phenomena. A major research focus in hydrologic sciences in recent years has been the assessment of impacts of climate change at regional scales. An important research issue addressed in this context concerns the responses of water fluxes at the catchment scale to global climatic change. A commonly adopted methodology for assessing the regional hydrologic impacts of climate change is to use the climate projections provided by General Circulation Models (GCMs) for specified emission scenarios in conjunction with process-based hydrologic models to generate the corresponding hydrologic projections. The scaling problem, arising because of the large spatial scales at which the GCMs operate compared with those required in distributed hydrologic models, and their inability to satisfactorily simulate the variables of interest to hydrology, are addressed by downscaling the GCM simulations to hydrologic scales. Projections obtained with this procedure are burdened with a large uncertainty introduced by the choice of GCMs and emission scenarios, the small samples of historical data against which the models are calibrated, the downscaling methods used and other sources. Development of methodologies to quantify and reduce such uncertainties is a current area of research in hydrology. In this presentation, an overview of recent research carried out by the author's group on assessment of hydrologic impacts of climate change, addressing scale issues and quantification of uncertainties, is provided.
Methodologies developed with conditional random fields, Dempster-Shafer theory, possibility theory, imprecise probabilities and non-stationary extreme value theory are discussed. Specific applications on uncertainty quantification in impacts on streamflows, evaporative water demands, river water quality and urban flooding are presented. A brief discussion on detection and attribution of hydrologic change at river basin scales, contribution of landuse change and likely alterations in return levels of hydrologic extremes is also provided.
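As a concrete example of the downscaling step described above, empirical quantile mapping is one commonly used statistical technique for transferring coarse GCM output to catchment scale; the rainfall distributions below are synthetic, and the method shown is a generic sketch rather than the author's specific approach:

```python
import numpy as np

def quantile_map(gcm_hist, obs, gcm_future):
    """Empirical quantile mapping: pass each future GCM value through the
    quantile correspondence between historical GCM output and local
    observations (a minimal statistical-downscaling sketch)."""
    # Approximate CDF rank of each future value in the historical GCM record.
    q = np.interp(gcm_future,
                  np.sort(gcm_hist),
                  np.linspace(0.0, 1.0, len(gcm_hist)))
    # Map those ranks onto the observed (local-scale) distribution.
    return np.quantile(obs, q)

rng = np.random.default_rng(7)
obs = rng.gamma(2.0, 5.0, size=3000)         # observed catchment rainfall
gcm_hist = rng.gamma(2.0, 3.0, size=3000)    # biased, coarse GCM rainfall
gcm_future = rng.gamma(2.0, 3.3, size=3000)  # future scenario, ~10% wetter
corrected = quantile_map(gcm_hist, obs, gcm_future)
```

The corrected series inherits the local observed distribution while preserving the relative change signalled by the GCM, which is exactly the property that makes bias in the downscaling step a source of projection uncertainty.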
An approach for conducting PM source apportionment will be developed, tested, and applied that directly addresses limitations in current SA methods, in particular variability, biases, and intensive resource requirements. Uncertainties in SA results and sensitivities to SA inpu...
Geological maps and models: are we certain how uncertain they are?
NASA Astrophysics Data System (ADS)
Mathers, Steve; Waters, Colin; McEvoy, Fiona
2014-05-01
Geological maps and latterly 3D models provide the spatial framework for geology at diverse scales or resolutions. As demands continue to rise for sustainable use of the subsurface, use of these maps and models is informing decisions on management of natural resources, hazards and environmental change. Inaccuracies and uncertainties in geological maps and models can impact substantially on the perception, assessment and management of opportunities and the associated risks. Lithostratigraphical classification schemes predominate, and are used in most geological mapping and modelling. The definition of unit boundaries, as 2D lines or 3D surfaces, is the prime objective. The intervening area or volume is rarely described other than by its bulk attributes, those relating to the whole unit. Where sufficient data exist on the spatial and/or statistical distribution of properties, the unit can be gridded or voxelated with integrity. Here we discuss only the uncertainty involved in defining the boundary conditions. The primary uncertainty of any geological map or model is the accuracy of the geological boundaries, i.e. tops, bases, limits, fault intersections etc. Traditionally these have been depicted on BGS maps using three line styles that reflect the uncertainty of the boundary, e.g. observed, inferred, conjectural. Most geological maps tend to neglect the subsurface expression (subcrops etc.). Models could also be built with subsurface geological boundaries (as digital node strings) tagged with levels of uncertainty; initial experience suggests three levels may again be practicable. Once tagged, these values could be used to autogenerate uncertainty plots. Whilst maps are predominantly explicit, based upon evidence and the conceptual understanding of the geologist, models of this type are less common and tend to be restricted to certain software methodologies.
Many modelling packages are implicit, being driven by simple statistical interpolation or complex algorithms for building surfaces in ways that are invisible to, and so not controlled by, the working geologist. Such models have the advantage of being replicable within a software package and so can discount some interpretational differences between modellers. They can, however, create geologically implausible results unless good geological rules and control are established prior to model calculation. Comparisons of results from varied software packages yield surprisingly diverse results. This is a significant and often overlooked source of uncertainty in models. Expert elicitation is commonly employed to establish values used in statistical treatments of model uncertainty. However, this introduces another possible source of uncertainty created by the different judgements of the modellers. The pragmatic solution appears to be using panels of experienced geologists to elicit the values. Treatments of uncertainty in maps and models yield relative rather than absolute values, even though many of these are expressed numerically. This makes it extremely difficult to devise standard methodologies to determine uncertainty or to propose fixed numerical scales for expressing the results. Furthermore, these may give a misleading impression of greater certainty than actually exists. This contribution outlines general perceptions with regard to uncertainty in our maps and models and presents results from recent BGS studies.
Surface Temperature Data Analysis
NASA Technical Reports Server (NTRS)
Hansen, James; Ruedy, Reto
2012-01-01
Small global mean temperature changes may have significant, even disastrous, consequences for the Earth's climate if they persist for an extended period. Obtaining global means from local weather reports is hampered by the uneven spatial distribution of reliably reporting weather stations, and methods had to be developed that minimize, as far as possible, the impact of this uneven coverage. This software is a method of combining temperature data from individual stations to obtain a global mean trend, overcoming or estimating the uncertainty introduced by the spatial and temporal gaps in the available data. Useful estimates were obtained by introducing a special grid that subdivides the Earth's surface into 8,000 equal-area boxes, using the existing data to create virtual stations at the center of each box, and combining temperature anomalies (after assessing the radius of high correlation) rather than temperatures.
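The equal-area idea can be sketched with latitude bands whose boundaries are equally spaced in sin(latitude); this is a strong simplification of the actual 8,000-box grid, and the station anomalies below are hypothetical:

```python
import numpy as np

# Sketch only: split the sphere into latitude bands of equal area, then
# average station temperature *anomalies* within each band before combining.
n_bands = 8
sin_edges = np.linspace(-1.0, 1.0, n_bands + 1)  # equal area <=> equal sin(lat) steps

rng = np.random.default_rng(3)
station_lat = rng.uniform(-90.0, 90.0, size=1000)
station_anom = rng.normal(0.5, 1.0, size=1000)   # hypothetical anomalies, deg C

# Assign each station to its equal-area band and average within bands.
band_idx = np.digitize(np.sin(np.radians(station_lat)), sin_edges) - 1
band_idx = np.clip(band_idx, 0, n_bands - 1)
band_means = np.array([station_anom[band_idx == b].mean() for b in range(n_bands)])

# Bands have equal area, so the global mean is a plain average of band means:
global_mean = band_means.mean()
```

Averaging within equal-area cells first prevents densely instrumented regions from dominating the global mean, which is the core of the uneven-coverage problem the abstract describes.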
SRNL PARTICIPATION IN THE MULTI-SCALE ENSEMBLE EXERCISES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buckley, R
2007-10-29
Consequence assessment during emergency response often requires atmospheric transport and dispersion modeling to guide decision making. A statistical analysis of the ensemble of results from several models is a useful way of estimating the uncertainty for a given forecast. ENSEMBLE is a European Union program that utilizes an internet-based system to ingest transport results from numerous modeling agencies. A recent set of exercises required output on three distinct spatial and temporal scales. The Savannah River National Laboratory (SRNL) uses a regional prognostic model nested within a larger-scale synoptic model to generate the meteorological conditions, which are in turn used in a Lagrangian particle dispersion model. A discussion of SRNL participation in these exercises is given, with particular emphasis on requirements for provision of results in a timely manner with regard to the various spatial scales.
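A minimal sketch of the kind of multi-model ensemble statistics such exercises rely on, using synthetic concentration fields rather than the actual ENSEMBLE exercise output:

```python
import numpy as np

# Hypothetical ensemble of surface concentration fields from several
# transport models on a common grid (illustrative values only).
rng = np.random.default_rng(11)
n_models, ny, nx = 6, 20, 30
fields = rng.lognormal(mean=0.0, sigma=0.8, size=(n_models, ny, nx))

# Simple multi-model statistics used to express forecast uncertainty:
ens_median = np.median(fields, axis=0)            # central estimate per cell
ens_p90 = np.percentile(fields, 90, axis=0)       # upper bound per cell
agreement = (fields > 1.0).mean(axis=0)           # model agreement on a threshold
```

The agreement field (the fraction of models exceeding a concentration threshold in each cell) is a common way to communicate ensemble spread to decision makers during an emergency.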
Risk assessment of vector-borne diseases for public health governance.
Sedda, L; Morley, D W; Braks, M A H; De Simone, L; Benz, D; Rogers, D J
2014-12-01
In the context of public health, risk governance (or risk analysis) is a framework for the assessment and subsequent management and/or control of the danger posed by an identified disease threat. Generic frameworks in which to carry out risk assessment have been developed by various agencies. These include monitoring, data collection, statistical analysis and dissemination. Due to the inherent complexity of disease systems, however, the generic approach must be modified for individual, disease-specific risk assessment frameworks. The analysis was based on a review of the current risk assessments of vector-borne diseases adopted by the main public health organisations (OIE, WHO, ECDC, FAO, CDC, etc.), drawing on literature, legislation and statistical assessment of the risk analysis frameworks. This review outlines the need for the development of a general public health risk assessment method for vector-borne diseases, in order to guarantee that sufficient information is gathered to apply robust models of risk assessment. Stochastic (especially spatial) methods, often in Bayesian frameworks, are now gaining prominence in standard risk assessment procedures because of their ability to accurately assess model uncertainties. Risk assessment needs to be addressed quantitatively wherever possible, and submitted with its quality assessment, in order to enable successful public health measures to be adopted. In terms of current practice, often a series of different models and analyses are applied to the same problem, with results and outcomes that are difficult to compare because of the unknown model and data uncertainties. The risk assessment areas in need of further research are therefore identified in this article. Copyright © 2014 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
Propagation of radar rainfall uncertainty in urban flood simulations
NASA Astrophysics Data System (ADS)
Liguori, Sara; Rico-Ramirez, Miguel
2013-04-01
This work discusses the results of the implementation of a novel probabilistic system designed to improve ensemble sewer flow predictions for the drainage network of a small urban area in the North of England. The probabilistic system has been developed to model the uncertainty associated with radar rainfall estimates and propagate it through radar-based ensemble sewer flow predictions. The assessment of this system aims at outlining the benefits of addressing the uncertainty associated with radar rainfall estimates in a probabilistic framework, to be potentially implemented in the real-time management of the sewer network in the study area. Radar rainfall estimates are affected by uncertainty due to various factors [1-3], and quality control and correction techniques have been developed in order to improve their accuracy. However, the hydrological use of radar rainfall estimates and forecasts remains challenging. A significant effort has been devoted by the international research community to the assessment of uncertainty propagation through probabilistic hydro-meteorological forecast systems [4-5], and various approaches have been implemented for the purpose of characterizing the uncertainty in radar rainfall estimates and forecasts [6-11]. A radar-based ensemble stochastic approach, similar to the one implemented for use in the Southern Alps by the REAL system [6], has been developed for the purpose of this work. An ensemble generator has been calibrated on the basis of the spatial-temporal characteristics of the residual error in radar estimates, assessed with reference to rainfall records from around 200 rain gauges available for the year 2007, previously post-processed and corrected by the UK Met Office [12-13]. Each ensemble member is determined by summing a perturbation field to the unperturbed radar rainfall field. The perturbations are generated by imposing the radar error spatial and temporal correlation structure on purely stochastic fields.
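The ensemble-generation step (adding spatially correlated perturbations to the unperturbed radar field) can be sketched as follows; the Fourier-filtering approach and parameter values below are a minimal stand-in for the calibrated error model, not the system's actual implementation:

```python
import numpy as np

def correlated_field(n, corr_len, rng):
    """Zero-mean Gaussian random field with an imposed spatial correlation
    length, built by low-pass filtering white noise in Fourier space
    (a minimal stand-in for a calibrated radar-error correlation model)."""
    white = rng.normal(size=(n, n))
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    filt = np.exp(-(kx**2 + ky**2) * (corr_len**2) * (2 * np.pi**2))
    field = np.real(np.fft.ifft2(np.fft.fft2(white) * filt))
    return field / field.std()   # normalise to unit standard deviation

rng = np.random.default_rng(5)
radar = np.full((64, 64), 2.0)   # unperturbed radar rain field, mm/h (toy)
# Each ensemble member = radar field + scaled correlated perturbation:
members = [radar + 0.5 * correlated_field(64, 8.0, rng) for _ in range(20)]
```

In the operational system the perturbation magnitude and correlation structure come from the gauge-based residual-error analysis; here both are arbitrary assumptions chosen for illustration.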
A hydrodynamic sewer network model implemented in the Infoworks software was used to model the rainfall-runoff process in the urban area. The software calculates the flow through the sewer conduits of the urban model using rainfall as the primary input. The sewer network is covered by 25 radar pixels with a spatial resolution of 1 km2. The majority of the sewer system is combined, carrying both urban rainfall runoff as well as domestic and trade waste water [11]. The urban model was configured to receive the probabilistic radar rainfall fields. The results showed that the radar rainfall ensembles provide additional information about the uncertainty in the radar rainfall measurements that can be propagated in urban flood modelling. The peaks of the measured flow hydrographs are often bounded within the uncertainty area produced by using the radar rainfall ensembles. This is in fact one of the benefits of using radar rainfall ensembles in urban flood modelling. More work needs to be done in improving the urban models, but this is outside the scope of this research. The rainfall uncertainty cannot explain the whole uncertainty shown in the flow simulations, and additional sources of uncertainty will come from the structure of the urban models as well as the large number of parameters required by these models. Acknowledgements The authors would like to acknowledge the BADC, the UK Met Office and the UK Environment Agency for providing the various data sets. We also thank Yorkshire Water Services Ltd for providing the urban model. The authors acknowledge the support from the Engineering and Physical Sciences Research Council (EPSRC) via grant EP/I012222/1. References [1] Browning KA, 1978. Meteorological applications of radar. Reports on Progress in Physics 41 761 Doi: 10.1088/0034-4885/41/5/003 [2] Rico-Ramirez MA, Cluckie ID, Shepherd G, Pallot A, 2007. A high-resolution radar experiment on the island of Jersey. Meteorological Applications 14: 117-129.
[3] Villarini G, Krajewski WF, 2010. Review of the different sources of uncertainty in single polarization radar-based estimates of rainfall. Surveys in Geophysics 31: 107-129. [4] Rossa A, Liechti K, Zappa M, Bruen M, Germann U, Haase G, Keil C, Krahe P, 2011. The COST 731 Action: A review on uncertainty propagation in advanced hydrometeorological forecast systems. Atmospheric Research 100, 150-167. [5] Rossa A, Bruen M, Germann U, Haase G, Keil C, Krahe P, Zappa M, 2010. Overview and Main Results on the interdisciplinary effort in flood forecasting COST 731-Propagation of Uncertainty in Advanced Meteo-Hydrological Forecast Systems. Proceedings of Sixth European Conference on Radar in Meteorology and Hydrology ERAD 2010. [6] Germann U, Berenguer M, Sempere-Torres D, Zappa M, 2009. REAL - ensemble radar precipitation estimation for hydrology in a mountainous region. Quarterly Journal of the Royal Meteorological Society 135: 445-456. [8] Bowler NEH, Pierce CE, Seed AW, 2006. STEPS: a probabilistic precipitation forecasting scheme which merges an extrapolation nowcast with downscaled NWP. Quarterly Journal of the Royal Meteorological Society 132: 2127-2155. [9] Zappa M, Rotach MW, Arpagaus M, Dorninger M, Hegg C, Montani A, Ranzi R, Ament F, Germann U, Grossi G et al., 2008. MAP D-PHASE: real-time demonstration of hydrological ensemble prediction systems. Atmospheric Science Letters 9, 80-87. [10] Liguori S, Rico-Ramirez MA. Quantitative assessment of short-term rainfall forecasts from radar nowcasts and MM5 forecasts. Hydrological Processes, accepted article. DOI: 10.1002/hyp.8415 [11] Liguori S, Rico-Ramirez MA, Schellart ANA, Saul AJ, 2012. Using probabilistic radar rainfall nowcasts and NWP forecasts for flow prediction in urban catchments. Atmospheric Research 103: 80-95. [12] Harrison DL, Driscoll SJ, Kitchen M, 2000. Improving precipitation estimates from weather radar using quality control and correction techniques. Meteorological Applications 7: 135-144.
[13] Harrison DL, Scovell RW, Kitchen M, 2009. High-resolution precipitation estimates for hydrological uses. Proceedings of the Institution of Civil Engineers - Water Management 162: 125-135.
Methodologies for evaluating performance and assessing uncertainty of atmospheric dispersion models
NASA Astrophysics Data System (ADS)
Chang, Joseph C.
This thesis describes methodologies to evaluate the performance and to assess the uncertainty of atmospheric dispersion models, tools that predict the fate of gases and aerosols upon their release into the atmosphere. Because of the large economic and public-health impacts often associated with the use of dispersion model results, these models should be properly evaluated, and their uncertainty should be properly accounted for and understood. The CALPUFF, HPAC, and VLSTRACK dispersion modeling systems were applied to the Dipole Pride (DP26) field data (~20 km in scale) in order to demonstrate the evaluation and uncertainty assessment methodologies. Dispersion model performance was found to be strongly dependent on the wind models used to generate gridded wind fields from observed station data. This is because, despite the fact that the test site was a flat area, the observed surface wind fields still showed considerable spatial variability, partly because of the surrounding mountains. It was found that the two components, variability and uncertainty, were comparable for the DP26 field data, with variability more important than uncertainty closer to the source, and less important farther away from the source. Therefore, reducing data errors for input meteorology may not necessarily increase model accuracy, owing to random turbulence. DP26 was a research-grade field experiment, where the source, meteorological, and concentration data were all well-measured. Another typical application of dispersion modeling is a forensic study, where the data are usually quite scarce. An example would be the modeling of the alleged releases of chemical warfare agents during the 1991 Persian Gulf War, where the source data had to rely on intelligence reports, and where Iraq had stopped reporting weather data to the World Meteorological Organization at the start of the Iran-Iraq War in 1981.
Therefore the meteorological fields inside Iraq had to be estimated by models such as prognostic mesoscale meteorological models, based on observational data from areas outside Iraq and using the global fields simulated by global meteorological models as the initial and boundary conditions for the mesoscale models. It was found that, when comparing model predictions to observations in areas outside Iraq, the predicted surface wind directions had errors between 30 and 90 deg, but the inter-model differences (or uncertainties) in the predicted surface wind directions inside Iraq, where there were no onsite data, were fairly constant at about 70 deg. (Abstract shortened by UMI.)
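Dispersion model evaluation commonly relies on metrics such as the fractional bias and the factor-of-two fraction (FAC2); the sketch below uses made-up observed and predicted concentrations, and these two metrics are standard examples rather than the thesis's full methodology:

```python
import numpy as np

def fractional_bias(obs, pred):
    """FB = 2*(mean_obs - mean_pred) / (mean_obs + mean_pred);
    negative values indicate overprediction on average."""
    return 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())

def fac2(obs, pred):
    """Fraction of predictions within a factor of two of the observations."""
    ratio = pred / obs
    return float(np.mean((ratio >= 0.5) & (ratio <= 2.0)))

# Hypothetical paired observed/predicted concentrations (arbitrary units):
obs = np.array([10.0, 5.0, 2.0, 8.0, 1.0])
pred = np.array([12.0, 4.0, 6.0, 7.0, 0.4])

fb = fractional_bias(obs, pred)
f2 = fac2(obs, pred)
```

With these toy numbers three of the five predictions fall within a factor of two of the observations, and the mean prediction slightly exceeds the mean observation, giving a small negative fractional bias.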
Uncertainty in predictions of forest carbon dynamics: separating driver error from model error.
Spadavecchia, L; Williams, M; Law, B E
2011-07-01
We present an analysis of the relative magnitude and contribution of parameter and driver uncertainty to the confidence intervals on estimates of net carbon fluxes. Model parameters may be difficult or impractical to measure, while driver fields are rarely complete, with data gaps due to sensor failure and sparse observational networks. Parameters are generally derived through some optimization method, while driver fields may be interpolated from available data sources. For this study, we used data from a young ponderosa pine stand at Metolius, Central Oregon, and a simple daily model of coupled carbon and water fluxes (DALEC). An ensemble of acceptable parameterizations was generated using an ensemble Kalman filter and eddy covariance measurements of net C exchange. Geostatistical simulations generated an ensemble of meteorological driving variables for the site, consistent with the spatiotemporal autocorrelations inherent in the observational data from 13 local weather stations. Simulated meteorological data were propagated through the model to derive the uncertainty in the CO2 flux resulting from driver uncertainty typical of spatially extensive modeling studies. Furthermore, the model uncertainty was partitioned between temperature and precipitation. With at least one meteorological station within 25 km of the study site, driver uncertainty was relatively small (~10% of the total net flux), while parameterization uncertainty was larger, at ~50% of the total net flux. The largest source of driver uncertainty was due to temperature (8% of the total flux). The combined effect of parameter and driver uncertainty was 57% of the total net flux. However, when the nearest meteorological station was > 100 km from the study site, uncertainty in net ecosystem exchange (NEE) predictions introduced by meteorological drivers increased by 88%.
Precipitation estimates were a larger source of bias in NEE estimates than were temperature estimates, although the biases partly compensated for each other. The time scales on which precipitation errors occurred in the simulations were shorter than the temporal scales over which drought developed in the model, so drought events were reasonably simulated. The approach outlined here provides a means to assess the uncertainty and bias introduced by meteorological drivers in regional-scale ecological forecasting.
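The partitioning of predictive uncertainty between parameters and drivers can be sketched with a one-source-at-a-time ensemble experiment; the toy flux function below is NOT the DALEC model, and its sensitivities are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(9)

def toy_flux(temp_anom, precip_anom, param):
    """Toy stand-in for a daily C-flux model (NOT DALEC): a baseline flux
    scaled by arbitrary temperature and precipitation sensitivities."""
    return param * (1.0 + 0.1 * temp_anom) * (1.0 + 0.05 * precip_anom)

n = 2000
params = rng.normal(1.0, 0.2, size=n)    # ensemble of acceptable parameters
temps = rng.normal(0.0, 1.0, size=n)     # temperature-driver ensemble
precips = rng.normal(0.0, 1.0, size=n)   # precipitation-driver ensemble

# Partition: vary one uncertainty source at a time, holding the others
# at their ensemble means, and compare the resulting flux spreads.
share_param = toy_flux(0.0, 0.0, params).std()
share_temp = toy_flux(temps, 0.0, 1.0).std()
share_precip = toy_flux(0.0, precips, 1.0).std()
```

With the sensitivities assumed here, parameter spread dominates and temperature contributes more than precipitation, qualitatively mirroring the ordering reported for the near-station case in the abstract.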
Uncertainty in hydrological signatures
NASA Astrophysics Data System (ADS)
McMillan, Hilary; Westerberg, Ida
2015-04-01
Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information, derived as index values from observed data, is known as hydrological signatures; these can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), flow variability, the flow duration curve, and the runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was carried out for two data-rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. 
The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty magnitude and bias, and to test how uncertainty depended on the density of the raingauge network and flow gauging station characteristics. The uncertainties were sometimes large (i.e. typical intervals of ±10-40% relative uncertainty) and highly variable between signatures. Uncertainty in the mean discharge was around ±10% for both catchments, while signatures describing the flow variability had much higher uncertainties in the Mahurangi where there was a fast rainfall-runoff response and greater high-flow rating uncertainty. Event and total runoff ratios had uncertainties from ±10% to ±15% depending on the number of rain gauges used; precipitation uncertainty was related to interpolation rather than point uncertainty. Uncertainty distributions in these signatures were skewed, which meant that differences in signature values between these catchments were often not significant. We hope that this study encourages others to use signatures in a way that is robust to data uncertainty.
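A Monte Carlo signature-uncertainty calculation of the kind proposed above can be sketched as follows. The synthetic rainfall and flow series and the error magnitudes (a 7% areal-rainfall multiplier and a 10% rating-curve multiplier) are illustrative assumptions, not the Mahurangi or Brue data; each realisation perturbs the input data and recomputes the signature:

```python
import random
import statistics

rng = random.Random(1)

# Synthetic "observed" daily series (mm): rainfall and flow
rain = [max(0.0, rng.gauss(4.0, 3.0)) for _ in range(365)]
flow = [0.4 * r + rng.random() * 0.5 for r in rain]

def runoff_ratio(rain, flow):
    """Example signature: total flow divided by total rainfall."""
    return sum(flow) / sum(rain)

# Monte Carlo: perturb rainfall (areal-average error) and flow (rating error)
samples = []
for _ in range(2000):
    f_rain = rng.gauss(1.0, 0.07)   # assumed 7% areal-rainfall multiplier error
    f_flow = rng.gauss(1.0, 0.10)   # assumed 10% rating-curve multiplier error
    samples.append(runoff_ratio([r * f_rain for r in rain],
                                [q * f_flow for q in flow]))

samples.sort()
lo, hi = samples[int(0.05 * len(samples))], samples[int(0.95 * len(samples))]
print(f"runoff ratio 90% interval: [{lo:.3f}, {hi:.3f}]")
```

The same loop applies to any signature (mean flow, flow-duration percentiles, recession shape) by swapping the function; skewness of the resulting sample can be checked directly, as the study does.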
NASA Astrophysics Data System (ADS)
Vachula, R. S.; Huang, Y.; Russell, J. M.
2017-12-01
Lake sediment-based fire reconstructions offer paleoenvironmental context in which to assess modern fires and predict future burning. However, despite their ubiquity, many uncertainties remain regarding the taphonomy of paleofire proxies and the spatial scales for which they record variations in fire history. Here we present down-core proxy analyses of polycyclic aromatic hydrocarbons (PAHs) and three size-fractions of charcoal (63-150, >150 and >250 μm) from Swamp Lake, California, an annually laminated lacustrine archive. Using a statewide historical GIS dataset of area burned, we assess the spatial scales for which these proxies are reliable recorders of fire history. We find that the coherence of observed and proxy-recorded fire history inherently depends upon spatial scale. Contrary to conventional thinking that charcoal mainly records local fires, our results indicate that macroscopic charcoal (>150 μm) may record spatially broader (<25 km) changes in fire history, and as such, the coarsest charcoal particles (>250 μm) may be a more conservative proxy for local burning. We find that sub-macroscopic charcoal particles (63-150 μm) reliably record regional (up to 150 km) changes in fire history. These results indicate that charcoal-based fire reconstructions may represent spatially broader fire history than previously thought, which has major implications for our understanding of spatiotemporal paleofire variations. Our analyses of PAHs show that dispersal mobility is heterogeneous between compounds, but that PAH fluxes are reliable proxies of fire history within 25-50 km, which suggests PAHs may be a better spatially constrained paleofire proxy than sedimentary charcoal. Further, using a linear discriminant analysis model informed by modern emissions analyses, we show that PAH assemblages preserved in lake sediments can differentiate vegetation type burned, and are thus promising paleoecological biomarkers warranting further research and implementation. 
In sum, our analyses offer new insight into the spatial dimensions of paleofire proxies and constitute a methodology that can be applied to other locations and proxies to better inform site-specific reconstructions.
Uncertainty Assessment of Synthetic Design Hydrographs for Gauged and Ungauged Catchments
NASA Astrophysics Data System (ADS)
Brunner, Manuela I.; Sikorska, Anna E.; Furrer, Reinhard; Favre, Anne-Catherine
2018-03-01
Design hydrographs described by peak discharge, hydrograph volume, and hydrograph shape are essential for engineering tasks involving storage. Such design hydrographs are inherently uncertain, as are classical flood estimates focusing on peak discharge only. Various sources of uncertainty contribute to the total uncertainty of synthetic design hydrographs for gauged and ungauged catchments. These comprise model uncertainties, sampling uncertainty, and uncertainty due to the choice of a regionalization method. A quantification of the uncertainties associated with flood estimates is essential for reliable decision making and allows for the identification of important uncertainty sources. We therefore propose an uncertainty assessment framework for the quantification of the uncertainty associated with synthetic design hydrographs. The framework is based on bootstrap simulations and consists of three levels of complexity. On the first level, we assess the uncertainty due to individual uncertainty sources. On the second level, we quantify the total uncertainty of design hydrographs for gauged catchments and the total uncertainty of regionalizing them to ungauged catchments, independently of the construction uncertainty. On the third level, we assess the coupled uncertainty of synthetic design hydrographs in ungauged catchments, jointly considering construction and regionalization uncertainty. We find that the most important sources of uncertainty in design hydrograph construction are the record length and the choice of the flood sampling strategy. The total uncertainty of design hydrographs in ungauged catchments depends on the catchment properties and is not negligible in our case.
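The bootstrap idea underlying the framework can be sketched for the simplest case, sampling uncertainty in a flood statistic from a finite record. The record of annual maxima below is synthetic and the statistic (mean annual flood) is chosen for brevity; the same resampling applies to fitted hydrograph parameters:

```python
import random
import statistics

rng = random.Random(7)

# Hypothetical record of annual maximum discharges (m3/s), 40 years
annual_max = [rng.lognormvariate(4.0, 0.4) for _ in range(40)]

def boot_interval(data, n_boot=2000, rng=rng):
    """Bootstrap the sampling distribution of the mean annual flood:
    resample the record with replacement and recompute the statistic."""
    means = []
    for _ in range(n_boot):
        resample = [rng.choice(data) for _ in data]
        means.append(statistics.mean(resample))
    means.sort()
    return means[int(0.025 * n_boot)], means[int(0.975 * n_boot)]

lo, hi = boot_interval(annual_max)
print(f"95% bootstrap interval for mean annual flood: [{lo:.1f}, {hi:.1f}] m3/s")
```

Repeating this with shortened records shows directly why record length dominates the construction uncertainty.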
NASA Astrophysics Data System (ADS)
Dialynas, Y. G.; Arnone, E.; Noto, L. V.; Bras, R. L.
2013-12-01
Slope stability depends on geotechnical and hydrological factors that exhibit wide natural spatial variability, yet sufficient measurements of the related parameters are rarely available over entire study areas. The uncertainty associated with the inability to fully characterize hydrologic behavior has an impact on any attempt to model landslide hazards. This work suggests a way to systematically account for this uncertainty in coupled distributed hydrological-stability models for shallow landslide hazard assessment. A probabilistic approach for the prediction of rainfall-triggered landslide occurrence at basin scale was implemented in an existing distributed eco-hydrological and landslide model, tRIBS-VEGGIE-Landslide (Triangulated Irregular Network (TIN)-based Real-time Integrated Basin Simulator - VEGetation Generator for Interactive Evolution). More precisely, we upgraded tRIBS-VEGGIE-Landslide to assess the likelihood of shallow landslides by accounting for uncertainty related to geotechnical and hydrological factors that directly affect slope stability. Natural variability of geotechnical soil characteristics was considered by randomizing soil cohesion and friction angle. Hydrological uncertainty related to the estimation of matric suction was taken into account by considering soil retention parameters as correlated random variables. The probability of failure is estimated through an assumed theoretical Factor of Safety (FS) distribution, conditioned on soil moisture content. At each cell, the temporally variant FS statistics are approximated by the First Order Second Moment (FOSM) method, as a function of the parameters' statistical properties. The model was applied to the Rio Mameyes Basin, located in the Luquillo Experimental Forest in Puerto Rico, where previous landslide analyses have been carried out. 
At each time step, model outputs include the probability of landslide occurrence across the basin, and the most probable depth of failure at each soil column. The use of the proposed probabilistic approach for shallow landslide prediction is able to reveal and quantify landslide risk at slopes assessed as stable by simpler deterministic methods.
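A FOSM probability-of-failure calculation of the kind used above can be sketched for a single cell. The infinite-slope FS expression and every parameter value below are hypothetical simplifications, not the tRIBS-VEGGIE-Landslide formulation; FOSM propagates the means and variances of cohesion, friction angle, and matric suction through first-order sensitivities:

```python
import math

def fs_infinite_slope(c, phi_deg, psi, gamma=18.0, depth=2.0, slope_deg=35.0):
    """Simplified infinite-slope factor of safety with a suction term.
    c: cohesion (kPa), phi_deg: friction angle, psi: matric suction (kPa)."""
    beta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    tau = gamma * depth * math.sin(beta) * math.cos(beta)      # driving stress
    sigma_n = gamma * depth * math.cos(beta) ** 2              # normal stress
    return (c + (sigma_n - psi) * math.tan(phi)) / tau

# FOSM inputs: assumed means and variances of the random parameters
means = {"c": 8.0, "phi_deg": 30.0, "psi": -2.0}
variances = {"c": 4.0, "phi_deg": 9.0, "psi": 1.0}

fs_mean = fs_infinite_slope(**means)
var_fs = 0.0
for name in means:
    h = 1e-4 * max(1.0, abs(means[name]))
    bumped = dict(means)
    bumped[name] += h
    dfs = (fs_infinite_slope(**bumped) - fs_mean) / h  # finite-difference sensitivity
    var_fs += dfs ** 2 * variances[name]               # first-order variance term

sd_fs = math.sqrt(var_fs)
# Probability of failure assuming FS ~ Normal(fs_mean, sd_fs)
z = (1.0 - fs_mean) / sd_fs
p_fail = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
print(f"FS mean {fs_mean:.2f}, sd {sd_fs:.2f}, P(FS < 1) = {p_fail:.3f}")
```

The payoff of the probabilistic view is visible here: a cell with a mean FS above 1 (deterministically "stable") can still carry a non-trivial failure probability.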
Impact of magnitude uncertainties on seismic catalogue properties
NASA Astrophysics Data System (ADS)
Leptokaropoulos, K. M.; Adamaki, A. K.; Roberts, R. G.; Gkarlaouni, C. G.; Paradisopoulou, P. M.
2018-05-01
Catalogue-based studies are of central importance in seismological research, to investigate the temporal, spatial and size distribution of earthquakes in specified study areas. Methods for estimating the fundamental catalogue parameters like the Gutenberg-Richter (G-R) b-value and the completeness magnitude (Mc) are well established and routinely applied. However, the magnitudes reported in seismicity catalogues contain measurement uncertainties which may significantly distort the estimation of the derived parameters. In this study, we use numerical simulations of synthetic data sets to assess the reliability of different methods for determining b-value and Mc, assuming the validity of the G-R law. After contaminating the synthetic catalogues with Gaussian noise (with selected standard deviations), the analysis is performed for numerous data sets of different sample size (N). The noise introduced to the data generally leads to a systematic overestimation of magnitudes close to and above Mc. This fact causes an increase of the average number of events above Mc, which in turn leads to an apparent decrease of the b-value. This may result in a significant overestimation of seismicity rate even well above the actual completeness level. The b-value can in general be reliably estimated even for relatively small data sets (N < 1000) when only magnitudes higher than the actual completeness level are used. Nevertheless, a correction of the total number of events belonging in each magnitude class (i.e. 0.1 unit) should be considered to deal with the magnitude uncertainty effect. Because magnitude uncertainties (here with the form of Gaussian noise) are inevitable in all instrumental catalogues, this finding is fundamental for seismicity rate and seismic hazard assessment analyses. Also important is that for some data analyses significant bias cannot necessarily be avoided by choosing a high Mc value for analysis. 
In such cases, there may be a risk of severe miscalculation of seismicity rate regardless of the selected magnitude threshold, unless possible bias is properly assessed.
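The noise-induced b-value bias described above is easy to reproduce with a synthetic catalogue. The sketch below is an illustration under stated assumptions (b = 1, Mc = 2, 0.2-unit Gaussian magnitude noise), not the study's simulation design, and uses the standard Aki maximum-likelihood b-value estimator:

```python
import math
import random

rng = random.Random(3)
b_true, mc, n = 1.0, 2.0, 50000
beta = b_true * math.log(10.0)

# Synthetic G-R catalogue above completeness, then Gaussian magnitude noise
mags_true = [mc + rng.expovariate(beta) for _ in range(n)]
mags_noisy = [m + rng.gauss(0.0, 0.2) for m in mags_true]

def aki_b(mags, mc, dm=0.0):
    """Aki (1965) maximum-likelihood b-value; set dm to the bin width
    (e.g. 0.1) to apply the binning correction for binned magnitudes."""
    sample = [m for m in mags if m >= mc]
    return math.log10(math.e) / (sum(sample) / len(sample) - (mc - dm / 2.0))

b_clean = aki_b(mags_true, mc)
b_noisy = aki_b(mags_noisy, mc)
print(f"b from noise-free catalogue: {b_clean:.2f}")
print(f"b from noisy catalogue:     {b_noisy:.2f}")
```

With symmetric noise, events leak across the completeness threshold asymmetrically (the magnitude distribution decays exponentially), so the noisy catalogue yields an apparently lower b-value, which is the mechanism the abstract describes.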
Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T
2017-06-01
A means for identifying and prioritising the treatment of uncertainty (UnISERA) in environmental risk assessments (ERAs) is tested, using three risk domains where ERA is an established requirement and one in which ERA practice is emerging. UnISERA's development draws on 19 expert elicitations across genetically modified higher plants, particulate matter, and agricultural pesticide release, and is stress tested here for engineered nanomaterials (ENM). We are concerned with the severity of uncertainty, its nature, and its location across four accepted stages of ERAs. Using an established uncertainty scale, the risk characterisation stage of ERA harbours the highest severity level of uncertainty, associated with estimating, aggregating and evaluating expressions of risk. Combined epistemic and aleatory uncertainty is the dominant nature of uncertainty. The dominant location of uncertainty is associated with data in problem formulation, exposure assessment and effects assessment. Testing UnISERA produced agreements of 55%, 90%, and 80% for the severity level, nature and location dimensions of uncertainty between the combined case studies and the ENM stress test. UnISERA enables environmental risk analysts to prioritise risk assessment phases, groups of tasks, or individual ERA tasks, and it can direct them towards established methods for uncertainty treatment. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Bezombes, Lucie; Gaucherand, Stéphanie; Kerbiriou, Christian; Reinert, Marie-Eve; Spiegelberger, Thomas
2017-08-01
In many countries, biodiversity compensation is required to counterbalance negative impacts of development projects on biodiversity by carrying out ecological measures, called offsets when the goal is to reach "no net loss" of biodiversity. One main issue is to ensure that offset gains are equivalent to impact-related losses. Ecological equivalence is assessed with equivalence assessment methods that take into account a range of key considerations, which we summarize as ecological, spatial, temporal, and uncertainty. When equivalence assessment methods take into account all considerations, we call them "comprehensive". Equivalence assessment methods should also aim to be science-based and operational, which is challenging. Many equivalence assessment methods have been developed worldwide but none is fully satisfying. In the present study, we examine 13 equivalence assessment methods in order to identify (i) their general structure and (ii) the synergies and trade-offs between equivalence assessment method characteristics related to operationality, scientific basis and comprehensiveness (called "challenges" in this paper). We evaluate each equivalence assessment method on the basis of 12 criteria describing the level of achievement of each challenge. We observe that all equivalence assessment methods share a general structure, with possible improvements in the choice of target biodiversity, the indicators used, the integration of landscape context and the multipliers reflecting time lags and uncertainties. We show that no equivalence assessment method combines all challenges perfectly. There are trade-offs between and within the challenges: operationality tends to be favored, while scientific bases are integrated heterogeneously in equivalence assessment method development. One way of improving the combination of challenges would be the use of offset-dedicated databases providing scientific feedback on previous offset measures.
Dealing with uncertainties in environmental burden of disease assessment
2009-01-01
Disability Adjusted Life Years (DALYs) combine the number of people affected by disease or mortality in a population and the duration and severity of their condition into one number. The environmental burden of disease is the number of DALYs that can be attributed to environmental factors. Environmental burden of disease estimates enable policy makers to evaluate, compare and prioritize dissimilar environmental health problems or interventions. These estimates often have various uncertainties and assumptions which are not always made explicit. Besides statistical uncertainty in input data and parameters – which is commonly addressed – a variety of other types of uncertainties may substantially influence the results of the assessment. We have reviewed how different types of uncertainties affect environmental burden of disease assessments, and we give suggestions as to how researchers could address these uncertainties. We propose the use of an uncertainty typology to identify and characterize uncertainties. Finally, we argue that uncertainties need to be identified, assessed, reported and interpreted in order for assessment results to adequately support decision making. PMID:19400963
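The DALY aggregation described above is simple arithmetic: years of life lost plus years lived with disability, with an attributable fraction applied for the environmental factor. The numbers below are hypothetical, chosen only to show the shape of the calculation:

```python
def dalys(deaths, life_exp_lost, cases, disability_weight, duration):
    """DALYs = years of life lost (YLL) + years lived with disability (YLD)."""
    yll = deaths * life_exp_lost
    yld = cases * disability_weight * duration
    return yll + yld

def attributable_fraction(prevalence, relative_risk):
    """Population attributable fraction for an environmental exposure."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

# Hypothetical numbers for an environmentally related disease
total = dalys(deaths=200, life_exp_lost=10.0, cases=5000,
              disability_weight=0.2, duration=3.0)
paf = attributable_fraction(prevalence=0.3, relative_risk=1.5)
print(f"total DALYs: {total:.0f}, environmental share: {paf * total:.0f}")
```

Every input here (disability weight, relative risk, duration) carries the non-statistical uncertainties the abstract warns about; the typology it proposes is a way of making those explicit rather than burying them in point values.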
NASA Astrophysics Data System (ADS)
Bekaert, D. P.; Hamlington, B.; Buzzanga, B. A.; Jones, C. E.
2017-12-01
The rate of relative sea level rise results from a combination of land subsidence and rising seas associated with global warming on long timescales, exacerbated by shifts in ocean dynamics on shorter timescales. An understanding of the current-day magnitude of each component is needed to create accurate projections of future relative sea level rise upon which to base planning efforts. Current-day land-based subsidence rates derived from GPS often lack the spatial resolution to capture the local spatial variability needed when assessing the impact of relative sea-level rise. Interferometric Synthetic Aperture Radar (InSAR) is an attractive technique that has the potential to provide a measurement every 20-30 m where good signal coherence is maintained. In practice, coastal regions are challenging for InSAR due to variable vegetation cover and soil moisture, which can be partly mitigated by applying advanced time-series InSAR techniques. After applying time-series InSAR, derived rates need to be combined with GPS to tie relative subsidence rates into a geodetic reference frame. Given the need to make projections of relative sea-level rise, it is particularly important to propagate all uncertainties through the different processing stages. Here we provide results from ALOS and Sentinel-1 over the Hampton Roads area in the Chesapeake Bay region, which is experiencing one of the highest rates of relative sea level rise on the Atlantic coast of the United States. Although the currently derived subsidence rates have large uncertainties, it is expected that this will improve with decadal observations from Sentinel-1.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cafferty, Kara G.; Searcy, Erin M.; Nguyen, Long
To meet Energy Independence and Security Act (EISA) cellulosic biofuel mandates, the United States will require an annual domestic supply of about 242 million Mg of biomass by 2022. To improve the feedstock logistics of lignocellulosic biofuels and access available biomass resources from areas with varying yields, commodity systems have been proposed and designed to deliver on-spec biomass feedstocks at preprocessing “depots”, which densify and stabilize the biomass prior to long-distance transport and delivery to centralized biorefineries. The harvesting, preprocessing, and logistics (HPL) of biomass commodity supply chains thus could introduce spatially variable environmental impacts into the biofuel life cycle due to the need to harvest, move, and preprocess biomass from multiple distances that have variable spatial density. This study examines the uncertainty in greenhouse gas (GHG) emissions of corn stover HPL within a bio-ethanol supply chain in the state of Kansas, where sustainable biomass supply varies spatially. Two scenarios were evaluated, each having a different number of depots of varying capacity and location within Kansas relative to a central commodity-receiving biorefinery, to test GHG emissions uncertainty. Monte Carlo simulation was used to estimate the spatial uncertainty in the HPL gate-to-gate sequence. The results show that the transport of densified biomass introduces the highest variability and contribution to the carbon footprint of the HPL supply chain (0.2-13 g CO2e/MJ). Moreover, depending upon the biomass availability and its spatial density and surrounding transportation infrastructure (road and rail), HPL processes can increase the variability in life cycle environmental impacts for lignocellulosic biofuels. Within Kansas, life cycle GHG emissions could range from 24 to 41 g CO2e/MJ depending upon the location, size and number of preprocessing depots constructed. 
However, this range can be minimized through optimizing the siting of preprocessing depots where ample rail infrastructure exists to supply biomass commodity to a regional biorefinery supply system.
Cafferty, Kara G.; Searcy, Erin M.; Nguyen, Long; ...
2014-11-04
Using heat as a tracer to estimate spatially distributed mean residence times in the hyporheic zone
NASA Astrophysics Data System (ADS)
Naranjo, R. C.; Pohll, G. M.; Stone, M. C.; Niswonger, R. G.; McKay, W. A.
2013-12-01
Biogeochemical reactions that occur in the hyporheic zone are highly dependent on the time solutes are in contact with riverbed sediments. In this investigation, we developed a two-dimensional longitudinal flow and solute transport model to estimate the spatial distribution of mean residence time in the hyporheic zone along a riffle-pool sequence to gain a better understanding of nitrogen reactions. A flow and transport model was developed to estimate spatially distributed mean residence times and was calibrated using observations of temperature and pressure. The approach used in this investigation accounts for the mixing of ages given advection and dispersion. Uncertainty of flow and transport parameters was evaluated using standard Monte Carlo analysis and the generalized likelihood uncertainty estimation method. Results of parameter estimation indicate the presence of a low-permeability zone in the riffle area that induced horizontal flow at shallow depth within the riffle area. This establishes shallow and localized flow paths and limits deep vertical exchange. From the optimal model, mean residence times were found to be relatively long (9-40 days). The uncertainty of hydraulic conductivity resulted in a mean interquartile range of 13 days across all piezometers and was reduced by 24% with the inclusion of temperature and pressure observations. To a lesser extent, uncertainty in streambed porosity and dispersivity resulted in mean interquartile ranges of 2.2 and 4.7 days, respectively. Alternative conceptual models demonstrate the importance of accounting for the spatial distribution of hydraulic conductivity in simulating mean residence times in a riffle-pool sequence. It is demonstrated that spatially variable mean residence time beneath a riffle-pool system does not conform to simple conceptual models of hyporheic flow through a riffle-pool sequence. 
Rather, the mixing behavior between the river and the hyporheic flow are largely controlled by layered heterogeneity and anisotropy of the subsurface.
NASA Astrophysics Data System (ADS)
Abbaspour, K. C.; Rouholahnejad, E.; Vaghefi, S.; Srinivasan, R.; Yang, H.; Kløve, B.
2015-05-01
A combination of driving forces is increasing pressure on local, national, and regional water supplies needed for irrigation, energy production, industrial uses, domestic purposes, and the environment. In many parts of Europe groundwater quantity, and in particular quality, has come under severe degradation, and water levels have decreased, resulting in negative environmental impacts. Rapid improvements in the economy of the eastern European bloc of countries and uncertainties with regard to freshwater availability create challenges for water managers. At the same time, climate change adds a new level of uncertainty with regard to freshwater supplies. In this research we build and calibrate an integrated hydrological model of Europe using the Soil and Water Assessment Tool (SWAT) program. Different components of water resources are simulated, and crop yield and water quality are considered at the Hydrological Response Unit (HRU) level. The water resources are quantified at subbasin level with monthly time intervals. Leaching of nitrate into groundwater is also simulated at a finer spatial level (HRU). The use of large-scale, high-resolution water resources models enables consistent and comprehensive examination of integrated system behavior through physically-based, data-driven simulation. In this article we discuss issues with data availability, calibration of large-scale distributed models, and outline procedures for model calibration and uncertainty analysis. The calibrated model and results provide information support to the European Water Framework Directive and lay the basis for further assessment of the impact of climate change on water availability and quality. The approach and methods developed are general and can be applied to any large region around the world.
Temporal and Spatial Analysis of Monogenetic Volcanic Fields
NASA Astrophysics Data System (ADS)
Kiyosugi, Koji
Achieving an understanding of the nature of monogenetic volcanic fields depends on identification of the spatial and temporal patterns of volcanism in these fields, and their relationships to structures mapped in the shallow crust and inferred in the deep crust and mantle through interpretation of geochemical, radiometric and geophysical data. We investigate the spatial and temporal distributions of volcanism in the Abu Monogenetic Volcano Group, Southwest Japan. The E-W elongated volcano distribution, identified using a nonparametric kernel method, is consistent with the spatial extent of P-wave velocity anomalies in the lower crust and upper mantle, supporting the idea that the spatial density map of volcanic vents reflects the geometry of a mantle diapir. Estimated basalt supply to the lower crust is constant. This observation and the spatial distribution of volcanic vents suggest stability of magma productivity and an essentially constant two-dimensional size of the source mantle diapir. We mapped conduits, dike segments, and sills in the San Rafael sub-volcanic field, Utah, where the shallowest part of a Pliocene magmatic system is exceptionally well exposed. The distribution of conduits matches the major features of the dike distribution, including development of clusters and distribution of outliers. The comparison of the San Rafael conduit distribution and the distributions of volcanoes in several recently active volcanic fields supports the use of statistical models, such as nonparametric kernel methods, in probabilistic hazard assessment for distributed volcanism. We developed a new recurrence rate calculation method that uses a Monte Carlo procedure to better reflect and understand the impact of uncertainties of radiometric age determinations on uncertainty of recurrence rate estimates for volcanic activity in the Abu, Yucca Mountain Region, and Izu-Tobu volcanic fields. 
Results suggest that the recurrence rates of volcanic fields can change by more than one order of magnitude on time scales of several hundred thousand to several million years. This suggests that magma generation rate beneath volcanic fields may change over these time scales. Also, recurrence rate varies more than one order of magnitude between these volcanic fields, consistent with the idea that distributed volcanism may be influenced by both the rate of magma generation and the potential for dike interaction during ascent.
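A Monte Carlo recurrence-rate calculation of the kind described above can be sketched as follows. The ages and 1-sigma uncertainties below are hypothetical, not the Abu, Yucca Mountain, or Izu-Tobu data; each iteration resamples every age from its analytical uncertainty and recomputes events per unit time:

```python
import random

rng = random.Random(11)

# Hypothetical radiometric ages (ka) with 1-sigma analytical uncertainties
ages = [(120.0, 15.0), (260.0, 30.0), (310.0, 25.0), (480.0, 40.0),
        (610.0, 50.0), (790.0, 60.0)]

rates = []
for _ in range(5000):
    sampled = sorted(rng.gauss(mu, sd) for mu, sd in ages)
    span = sampled[-1] - sampled[0]
    if span > 0.0:
        # events per ka over the sampled age span (n events, n-1 intervals)
        rates.append((len(sampled) - 1) / span)

rates.sort()
lo, hi = rates[int(0.025 * len(rates))], rates[int(0.975 * len(rates))]
print(f"recurrence rate 95% interval: {lo:.4f}-{hi:.4f} events/ka")
```

Running the same procedure over sliding time windows, rather than the full span, is what reveals the order-of-magnitude recurrence-rate changes the study reports.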
NASA Astrophysics Data System (ADS)
Black, B.; Harte, M.; Goldfinger, C.
2017-12-01
Participating in a ten-year monitoring project to assess the ecological, social, and socioeconomic impacts of Oregon's Marine Protected Areas (MPAs), we have worked in partnership with the Oregon Department of Fish and Wildlife (ODFW) to develop a Bayesian geospatial method to evaluate the spatial and temporal variance in the provision of ecosystem services produced by Oregon's MPAs. Probabilistic (Bayesian) approaches to Marine Spatial Planning (MSP) show considerable potential for addressing issues such as uncertainty, cumulative effects, and the need to integrate stakeholder-held information and preferences into decision making processes. To that end, we have created a Bayesian-based geospatial approach to MSP capable of modelling the evolution of the provision of ecosystem services before and after the establishment of Oregon's MPAs. Our approach permits both planners and stakeholders to view expected impacts of differing policies, behaviors, or choices made concerning Oregon's MPAs and surrounding areas in a geospatial (map) format while simultaneously considering multiple parties' beliefs on the policies or uses in question. We quantify the influence of the MPAs as the shift in the spatial distribution of ecosystem services, both inside and outside the protected areas, over time. Once the MPAs' influence on the provision of coastal ecosystem services has been evaluated, it is possible to view these impacts through geovisualization techniques. As a specific example of model use and output, a user could investigate the effects of altering the habitat preferences of a rockfish species over a prescribed period of time (5, 10, 20 years post-harvesting restrictions, etc.) on the relative intensity of spillover from nearby reserves (please see submitted figure). 
Particular strengths of our Bayesian-based approach include its ability to integrate highly disparate input types (qualitative or quantitative), to accommodate data gaps, address uncertainty, and to investigate temporal and spatial variation. This approach conveys the modeled outcome of proposed policy changes and is also a vehicle through which stakeholders and planners can work together to compare and deliberate on the impacts of policy and management changes, a capacity of considerable utility for planners and stakeholders engaged in MSP.
On the Character and Mitigation of Atmospheric Noise in InSAR Time Series Analysis (Invited)
NASA Astrophysics Data System (ADS)
Barnhart, W. D.; Fielding, E. J.; Fishbein, E.
2013-12-01
Time series analysis of interferometric synthetic aperture radar (InSAR) data, with its broad spatial coverage and ability to image regions that are sometimes very difficult to access, is a powerful tool for characterizing continental surface deformation and its temporal variations. With the impending launch of dedicated SAR missions such as Sentinel-1, ALOS-2, and the planned NASA L-band SAR mission, large volume data sets will allow researchers to further probe ground displacement processes with increased fidelity. Unfortunately, the precision of measurements in individual interferograms is impacted by several sources of noise, notably spatially correlated signals caused by path delays through the stratified and turbulent atmosphere and ionosphere. Spatial and temporal variations in atmospheric water vapor often introduce several to tens of centimeters of apparent deformation in the radar line-of-sight, correlated over short spatial scales (<10 km). Signals resulting from atmospheric path delays are particularly problematic because, like the subsidence and uplift signals associated with tectonic deformation, they are often spatially correlated with topography. In this talk, we provide an overview of the effects of spatially correlated tropospheric noise in individual interferograms and InSAR time series analysis, and we highlight where common assumptions of the temporal and spatial characteristics of tropospheric noise fail. Next, we discuss two classes of methods for mitigating the effects of tropospheric water vapor noise in InSAR time series analysis and single interferograms: noise estimation and characterization with independent observations from multispectral sensors such as MODIS and MERIS; and noise estimation and removal with weather models, multispectral sensor observations, and GPS. 
Each of these techniques can provide independent assessments of the contribution of water vapor in interferograms, but each technique also suffers from several pitfalls that we outline. Multispectral near-infrared (NIR) sensors provide high spatial resolution (~1 km) estimates of total column tropospheric water vapor by measuring the absorption of reflected solar illumination and may provide excellent estimates of wet delay. The Online Services for Correcting Atmosphere in Radar (OSCAR) project currently provides water vapor products through web services (http://oscar.jpl.nasa.gov). Unfortunately, such sensors require daytime and cloudless observations. Global and regional numerical weather models can provide an additional estimate of both the dry and wet atmospheric delays at spatial resolutions of 3-100 km and time scales of 1-3 hours, though these models are less accurate than imaging observations and benefit from independent observations of atmospheric water vapor. Despite these issues, the integration of these techniques for InSAR correction and uncertainty estimation may contribute substantially to the reduction and rigorous characterization of uncertainty in InSAR time series analysis - helping to expand the range of tectonic displacements imaged with InSAR, to robustly constrain geophysical models, and to generate a priori assessments of satellite acquisition goals.
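To make the scale of the wet-delay problem concrete, the rule-of-thumb conversion from total-column water vapor to radar line-of-sight delay can be sketched as follows (a minimal illustration; the Pi factor of ~0.15 and the 35-degree incidence angle are assumed values, not ones given above):

```python
import math

def los_wet_delay_cm(pwv_cm, incidence_deg=35.0, pi_factor=0.15):
    # Convert total-column precipitable water vapor (cm) to a radar
    # line-of-sight wet delay (cm): ZWD ~= PWV / Pi, then map the zenith
    # delay to the line of sight with the incidence angle.
    zwd = pwv_cm / pi_factor
    return zwd / math.cos(math.radians(incidence_deg))

# 2 cm of PWV maps to roughly 16 cm of line-of-sight wet delay,
# consistent with the "tens of centimeters" noted in the abstract.
delay = los_wet_delay_cm(2.0)
```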
Application of a baseflow filter for evaluating model structure suitability of the IHACRES CMD
NASA Astrophysics Data System (ADS)
Kim, H. S.
2015-02-01
The main objective of this study was to assess the predictive uncertainty arising from the structure of a rainfall-runoff model that couples a conceptual module (non-linear module) with a metric transfer function module (linear module). The methodology was primarily based on comparing the outputs of the rainfall-runoff model with those of an alternative model approach, chosen to minimise uncertainties arising from the data and the model structure. A baseflow filter was adopted as this alternative, exposing deficiencies in the forms of the rainfall-runoff model while avoiding such uncertainties. The predictive uncertainty from the model structure was investigated for representative groups of catchments having similar hydrological response characteristics in the upper Murrumbidgee Catchment. In assessing model structure suitability, the consistency (or variability) of catchment response over time and space in model performance and parameter values was investigated to detect problems related to the temporal and spatial variability of model accuracy. The predictive error caused by model uncertainty was evaluated through analysis of the variability of the model performance and parameters. A graphical comparison of model residuals, effective rainfall estimates and hydrographs was used to reveal systematic model deviations between simulated and observed behaviours and general behavioural differences in the timing and magnitude of peak flows. The model's predictability was very sensitive to catchment response characteristics. The linear module performs reasonably well in the wetter catchments but has considerable difficulties when applied to the drier catchments, where the hydrologic response is dominated by quick flow.
The non-linear module has a potential limitation in its capacity to capture the non-linear processes that convert observed rainfall into effective rainfall in both the wetter and drier catchments. The comparative study, based on a better quantification of the accuracy and precision of hydrological modelling predictions, yields a better understanding of how these model deficiencies might be remedied.
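The baseflow filter used as the alternative model can be illustrated with a one-parameter recursive digital filter of the Lyne-Hollick type (a minimal sketch; the filter form, parameter value, and streamflow series are illustrative assumptions, not those of the Murrumbidgee study):

```python
def lyne_hollick_filter(q, alpha=0.925):
    # Single forward pass of the Lyne-Hollick recursive digital filter:
    # separate the quick-flow component from streamflow, constrain it to
    # 0 <= quickflow <= Q, and return baseflow = Q - quickflow.
    qf = [0.0]  # quick-flow component, initialised to zero
    for t in range(1, len(q)):
        f = alpha * qf[-1] + 0.5 * (1.0 + alpha) * (q[t] - q[t - 1])
        qf.append(min(max(f, 0.0), q[t]))
    return [qt - ft for qt, ft in zip(q, qf)]

# Illustrative storm hydrograph (arbitrary units)
streamflow = [1.0, 1.2, 3.5, 6.0, 4.0, 2.5, 1.8, 1.5, 1.3, 1.2]
baseflow = lyne_hollick_filter(streamflow)
```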
Ando, Amy W; Mallory, Mindy L
2012-04-24
Climate change is likely to alter the spatial distributions of species and habitat types but the nature of such change is uncertain. Thus, climate change makes it difficult to implement standard conservation planning paradigms. Previous work has suggested some approaches to cope with such uncertainty but has not harnessed all of the benefits of risk diversification. We adapt Modern Portfolio Theory (MPT) to optimal spatial targeting of conservation activity, using wetland habitat conservation in the Prairie Pothole Region (PPR) as an example. This approach finds the allocations of conservation activity among subregions of the planning area that maximize the expected conservation returns for a given level of uncertainty or minimize uncertainty for a given expected level of returns. We find that using MPT instead of simple diversification in the PPR can achieve a value of the conservation objective per dollar spent that is 15% higher for the same level of risk. MPT-based portfolios can also have 21% less uncertainty over benefits or 6% greater expected benefits than the current portfolio of PPR conservation. Total benefits from conservation investment are higher if returns are defined in terms of benefit-cost ratios rather than benefits alone. MPT-guided diversification can work to reduce the climate-change-induced uncertainty of future ecosystem-service benefits from many land policy and investment initiatives, especially when outcomes are negatively correlated between subregions of a planning area.
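The core mean-variance calculation behind an MPT-style conservation portfolio can be sketched for two subregions (all return and covariance numbers are hypothetical, not the PPR estimates):

```python
# Hypothetical expected conservation returns per dollar and return
# covariances for two subregions A and B (illustrative numbers only).
mu = [0.08, 0.05]
var = [0.04, 0.01]
cov_ab = -0.005           # negatively correlated outcomes

def portfolio_stats(w):
    # Expected return and variance when a fraction w of the budget goes
    # to subregion A and (1 - w) to subregion B.
    ret = w * mu[0] + (1 - w) * mu[1]
    v = w ** 2 * var[0] + (1 - w) ** 2 * var[1] + 2 * w * (1 - w) * cov_ab
    return ret, v

# Scan allocations to locate the minimum-variance portfolio
min_var, w_star = min((portfolio_stats(k / 100.0)[1], k / 100.0)
                      for k in range(101))
```

With negatively correlated subregions, the scan finds an interior allocation whose variance is below that of either subregion alone, which is the diversification effect the abstract describes.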
Remote sensing of ecosystem health: opportunities, challenges, and future perspectives.
Li, Zhaoqin; Xu, Dandan; Guo, Xulin
2014-11-07
Maintaining a healthy ecosystem is essential for maximizing sustainable ecological services of the best quality to human beings. Ecological and conservation research has provided a strong scientific background on identifying ecological health indicators and correspondingly making effective conservation plans. At the same time, ecologists have asserted a strong need for spatially explicit and temporally effective ecosystem health assessments based on remote sensing data. Currently, remote sensing of ecosystem health is only based on one ecosystem attribute: vigor, organization, or resilience. However, an effective ecosystem health assessment should be a comprehensive and dynamic measurement of the three attributes. This paper reviews opportunities of remote sensing, including optical, radar, and LiDAR, for directly estimating indicators of the three ecosystem attributes, discusses the main challenges to develop a remote sensing-based spatially-explicit comprehensive ecosystem health system, and provides some future perspectives. The main challenges to develop a remote sensing-based spatially-explicit comprehensive ecosystem health system are: (1) scale issue; (2) transportability issue; (3) data availability; and (4) uncertainties in health indicators estimated from remote sensing data. However, the Radarsat-2 constellation, upcoming new optical sensors on Worldview-3 and Sentinel-2 satellites, and improved technologies for the acquisition and processing of hyperspectral, multi-angle optical, radar, and LiDAR data and multi-sensoral data fusion may partly address the current challenges.
Favazza, Christopher P; Fetterly, Kenneth A; Hangiandreou, Nicholas J; Leng, Shuai; Schueler, Beth A
2015-01-01
Evaluation of flat-panel angiography equipment through conventional image quality metrics is limited by the scope of standard spatial-domain image quality metrics, such as contrast-to-noise ratio and spatial resolution, or by restricted access to the data needed to calculate Fourier-domain measurements, such as the modulation transfer function, noise power spectrum, and detective quantum efficiency. Observer models have been shown capable of overcoming these limitations and are able to comprehensively evaluate medical-imaging systems. We present a spatial-domain channelized Hotelling observer model to calculate the detectability index (DI) of disks of different sizes and compare the performance of different imaging conditions and angiography systems. When appropriate, changes in DIs were compared to expectations based on the classical Rose model of signal detection to assess the linearity of the model with quantum signal-to-noise ratio (SNR) theory. For these experiments, the estimated uncertainty of the DIs was less than 3%, allowing for precise comparison of imaging systems or conditions. For most experimental variables, DI changes were linear with expectations based on quantum SNR theory. DIs calculated for the smallest objects demonstrated nonlinearity with quantum SNR theory due to system blur. Two angiography systems with different detector element sizes were shown to perform similarly across the majority of the detection tasks.
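The classical Rose model used as a linearity benchmark reduces to a one-line formula (an illustrative sketch, not the authors' channelized Hotelling implementation; all numbers are invented):

```python
import math

def rose_snr(contrast, diameter_mm, photons_per_mm2):
    # Rose-model SNR for a uniform disk: contrast times the square root
    # of the expected photon count over the disk area.
    area = math.pi * (diameter_mm / 2.0) ** 2
    return contrast * math.sqrt(photons_per_mm2 * area)

base = rose_snr(0.10, 2.0, 1000.0)
```

Quantum SNR theory then predicts that detectability scales with the square root of dose and linearly with contrast and object diameter, the behavior the measured DIs followed except for the smallest, blur-limited objects.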
NASA Astrophysics Data System (ADS)
Camacho Suarez, V. V.; Shucksmith, J.; Schellart, A.
2016-12-01
Analytical and numerical models can be used to represent the advection-dispersion processes governing the transport of pollutants in rivers (Fan et al., 2015; Van Genuchten et al., 2013). Simplifications, assumptions and parameter estimations in these models result in various uncertainties within the modelling process and estimations of pollutant concentrations. In this study, we explore both: 1) the structural uncertainty due to the one-dimensional simplification of the Advection Dispersion Equation (ADE), and 2) the parameter uncertainty due to the semi-empirical estimation of the longitudinal dispersion coefficient. The relative significance of these uncertainties has not previously been examined. By analysing both the relative structural uncertainty of analytical solutions of the ADE and the parameter uncertainty due to the longitudinal dispersion coefficient via a Monte Carlo analysis, an evaluation of the dominant uncertainties is presented for a case study in the river Chillan, Chile, over a range of spatial scales.
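A minimal sketch of the two pieces being compared, the analytical 1-D ADE solution for an instantaneous release and Monte Carlo sampling of the dispersion coefficient, might look like this (all parameter values are illustrative, not those of the river Chillan case study):

```python
import math
import random

def ade_concentration(x, t, mass=1.0, area=10.0, u=0.5, disp=5.0):
    # Analytical solution of the 1-D ADE for an instantaneous release:
    # C(x,t) = M / (A*sqrt(4*pi*D*t)) * exp(-(x - u*t)^2 / (4*D*t))
    return (mass / (area * math.sqrt(4.0 * math.pi * disp * t))
            * math.exp(-(x - u * t) ** 2 / (4.0 * disp * t)))

# Parameter uncertainty: sample the longitudinal dispersion coefficient
# from an assumed range and propagate it to the concentration estimate.
random.seed(1)
samples = [ade_concentration(500.0, 600.0, disp=random.uniform(2.0, 8.0))
           for _ in range(1000)]
```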
Modeling the uncertainty of estimating forest carbon stocks in China
NASA Astrophysics Data System (ADS)
Yue, T. X.; Wang, Y. F.; Du, Z. P.; Zhao, M. W.; Zhang, L. L.; Zhao, N.; Lu, M.; Larocque, G. R.; Wilson, J. P.
2015-12-01
Earth surface systems are controlled by a combination of global and local factors, which cannot be understood without accounting for both the local and global components. The system dynamics cannot be recovered from the global or local controls alone. Ground forest inventory is able to accurately estimate forest carbon stocks at sample plots, but these sample plots are too sparse to support the spatial simulation of carbon stocks with the required accuracy. Satellite observation is an important source of global information for the simulation of carbon stocks. Satellite remote sensing can supply spatially continuous information about surface forest carbon stocks, which is impossible from ground-based investigations, but with considerable uncertainty in its descriptions. In this paper, we validated the Lund-Potsdam-Jena dynamic global vegetation model (LPJ), the kriging method for spatial interpolation of ground sample plots, and a satellite-observation-based approach, as well as an approach for fusing the ground sample plots with satellite observations and an assimilation method for incorporating the ground sample plots into LPJ. The validation results indicated that both the data fusion and data assimilation approaches reduced the uncertainty of estimating carbon stocks. Data fusion had the lowest uncertainty, using an existing method for high accuracy surface modeling to fuse the ground sample plots with the satellite observations (HASM-SOA). The estimates produced with HASM-SOA were 26.1% and 28.4% more accurate than the satellite-based approach and spatial interpolation of the sample plots, respectively. Forest carbon stocks of 7.08 Pg were estimated for China during the period from 2004 to 2008, an increase of 2.24 Pg from 1984 to 2008, using the preferred HASM-SOA method.
NASA Astrophysics Data System (ADS)
Balbi, S.; Villa, F.; Mojtahed, V.; Hegetschweiler, K. T.; Giupponi, C.
2015-10-01
This article presents a novel methodology to assess flood risk to people by integrating people's vulnerability and ability to cushion hazards through coping and adapting. The proposed approach extends traditional risk assessments beyond material damages; complements quantitative and semi-quantitative data with subjective and local knowledge, improving the use of commonly available information; produces estimates of model uncertainty by providing probability distributions for all of its outputs. Flood risk to people is modeled using a spatially explicit Bayesian network model calibrated on expert opinion. Risk is assessed in terms of: (1) likelihood of non-fatal physical injury; (2) likelihood of post-traumatic stress disorder; (3) likelihood of death. The study area covers the lower part of the Sihl valley (Switzerland) including the city of Zurich. The model is used to estimate the benefits of improving an existing Early Warning System, taking into account the reliability, lead-time and scope (i.e. coverage of people reached by the warning). Model results indicate that the potential benefits of an improved early warning in terms of avoided human impacts are particularly relevant in case of a major flood event: about 75 % of fatalities, 25 % of injuries and 18 % of post-traumatic stress disorders could be avoided.
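The role of the warning-system scope parameter can be illustrated with a toy calculation that marginalizes over a single binary "warned" node (the conditional probabilities are invented for illustration, not the calibrated Sihl-valley figures):

```python
# Toy version of the warning-scope node: fatality likelihood as a
# function of the fraction of people reached by the warning.
p_death_given_warned = 0.001
p_death_given_unwarned = 0.004

def fatality_likelihood(scope):
    # Marginalize over the binary "warned" node
    return scope * p_death_given_warned + (1.0 - scope) * p_death_given_unwarned

baseline = fatality_likelihood(0.60)   # existing system reaches 60% of people
improved = fatality_likelihood(0.95)   # improved early-warning scope
avoided = 1.0 - improved / baseline    # fraction of fatalities avoided
```

Even this two-node fragment shows how improving scope alone shifts the likelihood of death downward; the full model does the same over spatially explicit, probability-distributed inputs.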
Assessing species vulnerability to climate change
NASA Astrophysics Data System (ADS)
Pacifici, Michela; Foden, Wendy B.; Visconti, Piero; Watson, James E. M.; Butchart, Stuart H. M.; Kovacs, Kit M.; Scheffers, Brett R.; Hole, David G.; Martin, Tara G.; Akçakaya, H. Resit; Corlett, Richard T.; Huntley, Brian; Bickford, David; Carr, Jamie A.; Hoffmann, Ary A.; Midgley, Guy F.; Pearce-Kelly, Paul; Pearson, Richard G.; Williams, Stephen E.; Willis, Stephen G.; Young, Bruce; Rondinini, Carlo
2015-03-01
The effects of climate change on biodiversity are increasingly well documented, and many methods have been developed to assess species' vulnerability to climatic changes, both ongoing and projected in the coming decades. To minimize global biodiversity losses, conservationists need to identify those species that are likely to be most vulnerable to the impacts of climate change. In this Review, we summarize different currencies used for assessing species' climate change vulnerability. We describe three main approaches used to derive these currencies (correlative, mechanistic and trait-based), and their associated data requirements, spatial and temporal scales of application and modelling methods. We identify strengths and weaknesses of the approaches and highlight the sources of uncertainty inherent in each method that limit projection reliability. Finally, we provide guidance for conservation practitioners in selecting the most appropriate approach(es) for their planning needs and highlight priority areas for further assessments.
Towards a globally harmonized permafrost soil organic carbon stock estimate.
NASA Astrophysics Data System (ADS)
Hugelius, G.; Mishra, U.; Yang, Y.
2017-12-01
Permafrost-affected soils store a disproportionately large amount of organic carbon due to multiple cryopedogenic processes. Previous permafrost soil organic carbon (SOC) stock estimates used a variety of approaches and reported substantial uncertainty in the SOC stocks of permafrost soils. Here, we used spatially referenced data on soil-forming factors (topographic attributes, land cover types, climate, and bedrock geology) and SOC pedon description data (n = 2552) in a regression kriging approach to predict the spatial and vertical heterogeneity of SOC stocks across the Northern Circumpolar and Tibetan permafrost regions. Our approach allowed us to take into account both environmental correlation and spatial autocorrelation to separately estimate SOC stocks and their spatial uncertainties (95% CI) for three depth intervals at 250 m spatial resolution. In the Northern Circumpolar region, our results show 1278.1 (1009.33-1550.45) Pg C in the 0-3 m depth interval, with 542.09 (451.83-610.15), 422.46 (306.48-550.82), and 313.55 (251.02-389.48) Pg C in the 0-1, 1-2, and 2-3 m depth intervals, respectively. In the Tibetan region, our results show 26.68 (9.82-79.92) Pg C in the 0-3 m depth interval, with 13.98 (6.2-32.96), 6.49 (1.73-25.86), and 6.21 (1.889-20.90) Pg C in the 0-1, 1-2, and 2-3 m depth intervals, respectively. Our estimates show large spatial variability (50-100% coefficient of variation, depending upon the study region and depth interval) and a higher uncertainty range in comparison to existing estimates. We will present the observed controls of different environmental factors on SOC at the AGU meeting.
NASA Astrophysics Data System (ADS)
Uijlenhoet, R.; Brauer, C.; Overeem, A.; Sassi, M.; Rios Gaona, M. F.
2014-12-01
Several rainfall measurement techniques are available for hydrological applications, each with its own spatial and temporal resolution. We investigated the effect of these spatiotemporal resolutions on discharge simulations in lowland catchments by forcing the recently developed Wageningen Lowland Runoff Simulator (WALRUS) with rainfall data from gauges, radars and microwave links. WALRUS is a rainfall-runoff model accounting for hydrological processes relevant to areas with shallow groundwater (e.g. groundwater-surface water feedback). Here, we used WALRUS for case studies in a freely draining lowland catchment and a polder with controlled water levels. We used rain gauge networks with automatic gauges (hourly resolution but low spatial density) and manual gauges (high spatial density but daily resolution). Operational (real-time) and climatological (gauge-adjusted) C-band radar products and country-wide rainfall maps derived from microwave link data from a cellular telecommunication network were also used. Discharges simulated with these different inputs were compared to observations. We also investigated the effect of spatiotemporal resolution using a high-resolution X-band radar data set for catchments of different sizes. Uncertainty in rainfall forcing is a major source of uncertainty in discharge predictions, both with lumped and with distributed models. For lumped rainfall-runoff models, the main source of input uncertainty is associated with the way in which (effective) catchment-average rainfall is estimated. When catchments are divided into sub-catchments, rainfall spatial variability can become more important, especially during convective rainfall events, leading to spatially varying catchment wetness and spatially varying contributions of quick flow routes.
Improving rainfall measurements and their spatiotemporal resolution can improve the performance of rainfall-runoff models, indicating their potential for reducing flood damage through real-time control.
NASA Astrophysics Data System (ADS)
Yi, Yonghong; Kimball, John S.; Chen, Richard H.; Moghaddam, Mahta; Reichle, Rolf H.; Mishra, Umakant; Zona, Donatella; Oechel, Walter C.
2018-01-01
An important feature of the Arctic is large spatial heterogeneity in active layer conditions, which is generally poorly represented by global models and can lead to large uncertainties in predicting regional ecosystem responses and climate feedbacks. In this study, we developed a spatially integrated modeling and analysis framework combining field observations, local-scale (~50 m resolution) active layer thickness (ALT) and soil moisture maps derived from low-frequency (L- and P-band) airborne radar measurements, and global satellite environmental observations to investigate the ALT sensitivity to recent climate trends and landscape heterogeneity in Alaska. Modeled ALT results show good correspondence with in situ measurements in higher-permafrost-probability (PP ≥ 70 %) areas (n = 33; R = 0.60; mean bias = 1.58 cm; RMSE = 20.32 cm), but with larger uncertainty in sporadic and discontinuous permafrost areas. The model results also reveal widespread ALT deepening since 2001, with smaller ALT increases in northern Alaska (mean trend = 0.32±1.18 cm yr-1) and much larger increases (> 3 cm yr-1) across interior and southern Alaska. The positive ALT trend coincides with regional warming and a longer snow-free season (R = 0.60 ± 0.32). A spatially integrated analysis of the radar retrievals and model sensitivity simulations demonstrated that uncertainty in the spatial and vertical distribution of soil organic carbon (SOC) was the largest factor affecting modeled ALT accuracy, while soil moisture played a secondary role. Potential improvements in characterizing SOC heterogeneity, including better spatial sampling of soil conditions and advances in remote sensing of SOC and soil moisture, will enable more accurate predictions of active layer conditions and refinement of the modeling framework across a larger domain.
Propagation of neutron-reaction uncertainties through multi-physics models of novel LWR's
NASA Astrophysics Data System (ADS)
Hernandez-Solis, Augusto; Sjöstrand, Henrik; Helgesson, Petter
2017-09-01
The novel design of the renewable boiling water reactor (RBWR) allows a breeding ratio greater than unity and thus aims to provide a self-sustaining fuel cycle. The neutron reactions that underlie the different microscopic cross-sections and angular distributions are uncertain, so when they are employed in determining the spatial distribution of the neutron flux in a nuclear reactor, a methodology should be employed to account for these associated uncertainties. In this work, the Total Monte Carlo (TMC) method is used to propagate the covariances of the different neutron reactions (as well as angular distributions) that are part of the TENDL-2014 nuclear data (ND) library. The main objective is to propagate them through coupled neutronic and thermal-hydraulic models in order to assess the uncertainty of important safety parameters related to multi-physics, such as the peak cladding temperature along the axial direction of an RBWR fuel assembly. The aim is to quantify the impact that the ND covariances of important nuclides such as U-235, U-238, Pu-239 and the thermal scattering of hydrogen in H2O have on the deterministic safety analysis of novel nuclear reactor designs.
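The TMC idea, rerunning the model once per random realization of the nuclear data and reading the output spread as the propagated uncertainty, can be sketched with a toy stand-in for the coupled model (the functional form and all numbers are illustrative assumptions, not RBWR physics):

```python
import random
import statistics

random.seed(42)

def toy_multiphysics_model(sigma_f, sigma_c):
    # Stand-in for one coupled neutronic/thermal-hydraulic run: a toy
    # "peak cladding temperature" (K) driven by the sampled fission and
    # capture cross-sections. The functional form is illustrative only.
    k = sigma_f / (sigma_f + sigma_c)
    return 600.0 + 400.0 * k

# TMC loop: each iteration uses one random realisation of the nuclear data
results = []
for _ in range(500):
    sigma_f = random.gauss(1.20, 0.02)   # sampled fission cross-section
    sigma_c = random.gauss(0.30, 0.01)   # sampled capture cross-section
    results.append(toy_multiphysics_model(sigma_f, sigma_c))

pct_mean = statistics.mean(results)      # nominal peak cladding temperature
pct_sd = statistics.stdev(results)       # ND-induced uncertainty
```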
Climate change, ecosystem impacts, and management for Pacific salmon
D.E. Schindler; X. Augerot; E. Fleishman; N.J. Mantua; B. Riddell; M. Ruckelshaus; J. Seeb; M. Webster
2008-01-01
As climate change intensifies, there is increasing interest in developing models that reduce uncertainties in projections of global climate and refine these projections to finer spatial scales. Forecasts of climate impacts on ecosystems are far more challenging and their uncertainties even larger because of a limited understanding of physical controls on biological...
Uncertainty Exposed: A Field Lab Exercise Where GIS Meets the Real World
ERIC Educational Resources Information Center
Prisley, Stephen P.; Luebbering, Candice
2011-01-01
Students in natural resources programs commonly take courses in geospatial technologies. An awareness of the uncertainty of spatial data and algorithms can be an important outcome of such courses. This article describes a laboratory exercise in a graduate geographic information system (GIS) class that involves collection of data for the assessment…
NASA Astrophysics Data System (ADS)
Zhu, Liang; Wang, Youguo
2018-07-01
In this paper, a rumor diffusion model with uncertainty of human behavior is established under a spatio-temporal diffusion framework. Taking the physical significance of spatial diffusion into account, a diffusion threshold is set below which the rumor is not a trending topic and only spreads along determined physical connections. Heterogeneity of the degree distribution and distance distribution is also considered in the theoretical model. The global existence and uniqueness of the classical solution are proved with a Lyapunov function, and an approximate classical solution in the form of an infinite series is constructed with a system of eigenfunctions. Simulations and numerical solutions on both Watts-Strogatz (WS) and Barabási-Albert (BA) networks display the variation of the density of infected connections in the spatial and temporal dimensions. Relevant results show that the density of infected connections at the threshold time is dominated by network topology and the uncertainty of human behavior. As social capability increases, the rumor diffuses to the steady state at a higher speed, and the variation trends of diffusion size with uncertainty differ across artificial networks.
NASA Astrophysics Data System (ADS)
Cecinati, Francesca; Rico-Ramirez, Miguel Angel; Heuvelink, Gerard B. M.; Han, Dawei
2017-05-01
The application of radar quantitative precipitation estimation (QPE) to hydrology and water quality models can be preferred to interpolated rainfall point measurements because of the wide coverage that radars can provide, together with good spatio-temporal resolution. Nonetheless, it is often limited by the proneness of radar QPE to a multitude of errors. Although radar errors have been widely studied and techniques have been developed to correct most of them, residual errors are still intrinsic in radar QPE. An estimation of the uncertainty of radar QPE and an assessment of uncertainty propagation in modelling applications is important to quantify the relative importance of the uncertainty associated with radar rainfall input in the overall modelling uncertainty. A suitable tool for this purpose is the generation of radar rainfall ensembles. An ensemble is the representation of the rainfall field and its uncertainty through a collection of possible alternative rainfall fields, produced according to the observed errors, their spatial characteristics, and their probability distribution. The errors are derived from a comparison between radar QPE and ground point measurements. The novelty of the proposed ensemble generator is that it is based on a geostatistical approach that ensures a fast and robust generation of synthetic error fields, based on the time-variant characteristics of errors. The method is developed to meet the requirements of operational applications to large datasets. The method is applied to a case study in Northern England, using the UK Met Office NIMROD radar composites at 1 km resolution and 1 h accumulation over an area of 180 km by 180 km. The errors are estimated using a network of 199 tipping bucket rain gauges from the Environment Agency; 183 of the rain gauges are used for the error modelling, while 16 are kept apart for validation.
The validation is done by comparing the radar rainfall ensemble with the values recorded by the validation rain gauges. The validated ensemble is then tested on a hydrological case study, to show the advantage of probabilistic rainfall for uncertainty propagation. The ensemble spread only partially captures the mismatch between the modelled and the observed flow. The residual uncertainty can be attributed to other sources of uncertainty, in particular to model structural uncertainty, parameter identification uncertainty, uncertainty in other inputs, and uncertainty in the observed flow.
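The ensemble idea, perturbing the radar QPE field with sampled, spatially correlated error fields, can be sketched as follows (a toy stand-in using smoothed white noise rather than the paper's geostatistical simulation; grid size and error statistics are invented):

```python
import random

random.seed(0)
N = 20                     # toy 20x20 radar grid
radar = [[2.0 for _ in range(N)] for _ in range(N)]  # mm/h QPE field

def correlated_error_field(n, sd=0.5, half_width=2):
    # White noise smoothed with a moving average, giving short-range
    # spatial correlation (illustrative stand-in for geostatistical
    # simulation of the radar error field).
    noise = [[random.gauss(0.0, sd) for _ in range(n)] for _ in range(n)]
    field = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            window = [noise[a][b]
                      for a in range(max(0, i - half_width), min(n, i + half_width + 1))
                      for b in range(max(0, j - half_width), min(n, j + half_width + 1))]
            field[i][j] = sum(window) / len(window)
    return field

# Each ensemble member = radar QPE plus one sampled error field (kept >= 0)
ensemble = []
for _ in range(10):
    err = correlated_error_field(N)
    member = [[max(0.0, radar[i][j] + err[i][j]) for j in range(N)]
              for i in range(N)]
    ensemble.append(member)
```

Running a hydrological model once per member then yields a discharge distribution whose spread reflects the rainfall input uncertainty, the propagation step the case study evaluates.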
Estimation of Spatial Trends in LAI in Heterogeneous Semi-arid Ecosystems using Full Waveform Lidar
NASA Astrophysics Data System (ADS)
Glenn, N. F.; Ilangakoon, N.; Spaete, L.; Dashti, H.
2017-12-01
Leaf area index (LAI) is a key structural trait that is defined by the plant functional type (PFT) and controlled by prevailing climate- and human-driven ecosystem stresses. Estimates of LAI using remote sensing techniques are limited by the uncertainties of vegetation inter- and intra-gap fraction estimates; this is especially the case in sparse, low-stature vegetated ecosystems. Small footprint full waveform lidar digitizes the total return energy, with direction information, as a near-continuous waveform at a high vertical resolution (1 ns). Waveform lidar thus provides additional data metrics to capture vegetation gaps as well as PFTs, which can be used to constrain the uncertainties of LAI estimates. In this study, we calculated a radiometrically calibrated full waveform parameter called the backscatter cross section, along with other metrics from the waveform, to estimate vegetation gaps across plots (10 m x 10 m) in a semi-arid ecosystem in the western US. The LAI was then estimated using empirical relationships with directional gap fraction. Full waveform-derived gap fraction based LAI showed a high correlation with field observed shrub LAI (R2 = 0.66, RMSE = 0.24) compared to discrete return lidar based LAI (R2 = 0.01, RMSE = 0.5). The metrics derived from full waveform lidar classified a number of deciduous and evergreen tree species, shrub species, and bare ground with an overall accuracy of 89% at 10 m. A similar analysis was performed at 1 m with an overall accuracy of 80%. The next step is to use these relationships to map the LAI of PFTs at a 10 m spatial scale across the larger study regions. The results show the exciting potential of full waveform lidar to identify plant functional types and LAI in low-stature vegetation dominated semi-arid ecosystems, an ecosystem in which many other remote sensing techniques fail.
These results can be used to assess ecosystem state and habitat suitability, and to constrain model uncertainties in vegetation dynamics models in combination with other remote sensing techniques. Multi-spatial-resolution (1 m and 10 m) studies provide basic information on the applicability and detection thresholds, in semi-arid ecosystems, of future global satellite sensors designed at coarser spatial resolutions (e.g. GEDI, ICESat-2).
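The inversion from directional gap fraction to LAI referred to above can be illustrated with the classic Beer-Lambert gap-probability model. This is a generic sketch, not the empirical relationship fitted in the study; the projection coefficient G = 0.5 (spherical leaf-angle distribution) and nadir view are illustrative assumptions.

```python
import math

def lai_from_gap_fraction(gap_fraction, theta_deg=0.0, G=0.5):
    """Invert the Beer-Lambert gap-probability model
    P(theta) = exp(-G * LAI / cos(theta)) to estimate LAI.
    G = 0.5 assumes a spherical leaf-angle distribution
    (illustrative values, not the paper's fitted relationship)."""
    if not 0.0 < gap_fraction <= 1.0:
        raise ValueError("gap fraction must be in (0, 1]")
    cos_t = math.cos(math.radians(theta_deg))
    return -cos_t * math.log(gap_fraction) / G

# A 50% directional gap fraction at nadir implies LAI of about 1.39
print(round(lai_from_gap_fraction(0.5), 2))
```

A fully closed canopy view (gap fraction approaching 0) drives the estimate to infinity, which is why sparse, low-stature canopies with well-resolved gaps are favourable for this inversion.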
Hydraulic Conductivity Estimation using Bayesian Model Averaging and Generalized Parameterization
NASA Astrophysics Data System (ADS)
Tsai, F. T.; Li, X.
2006-12-01
Non-uniqueness of the parameterization scheme is an inherent problem in groundwater inverse modeling due to limited data. To cope with this non-uniqueness, we introduce a Bayesian Model Averaging (BMA) method to integrate a set of selected parameterization methods. The estimation uncertainty in BMA includes the uncertainty in individual parameterization methods, as the within-parameterization variance, and the uncertainty from using different parameterization methods, as the between-parameterization variance. Moreover, the generalized parameterization (GP) method is considered in the geostatistical framework in this study. The GP method aims at increasing the flexibility of parameterization through the combination of a zonation structure and an interpolation method. The use of BMA with GP avoids over-confidence in a single parameterization method. A normalized least-squares estimation (NLSE) is adopted to calculate the posterior probability for each GP. We employ the adjoint state method for the sensitivity analysis on the weighting coefficients in the GP method. The adjoint state method is also applied to the NLSE problem. The proposed methodology is applied to the Alamitos Barrier Project (ABP) in California, where the spatially distributed hydraulic conductivity is estimated. The optimal weighting coefficients embedded in GP are identified through maximum likelihood estimation (MLE), where the misfits between the observed and calculated groundwater heads are minimized. The conditional mean and conditional variance of the estimated hydraulic conductivity distribution using BMA are obtained to assess the estimation uncertainty.
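The BMA variance decomposition described above (total uncertainty = within-parameterization + between-parameterization variance) can be sketched as follows. The weights, means, and variances are hypothetical illustrative numbers, not estimates from the Alamitos Barrier Project.

```python
import numpy as np

# Hypothetical posterior model probabilities and per-parameterization
# estimates (mean, variance) of log-conductivity at one location.
weights = np.array([0.5, 0.3, 0.2])        # posterior P(M_k | data)
means = np.array([-4.1, -3.8, -4.5])       # conditional mean per method
variances = np.array([0.20, 0.35, 0.15])   # within-parameterization variance

bma_mean = np.sum(weights * means)
within = np.sum(weights * variances)                  # avg within-method variance
between = np.sum(weights * (means - bma_mean) ** 2)   # spread across methods
bma_var = within + between                            # total BMA variance
```

The between-parameterization term is what a single-parameterization analysis would miss, which is the over-confidence the abstract warns against.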
Uncertainty evaluation of a regional real-time system for rain-induced landslides
NASA Astrophysics Data System (ADS)
Kirschbaum, Dalia; Stanley, Thomas; Yatheendradas, Soni
2015-04-01
A new prototype regional model and evaluation framework has been developed over Central America and the Caribbean region using satellite-based information including precipitation estimates, modeled soil moisture, topography, soils, as well as regionally available datasets such as road networks and distance to fault zones. The algorithm framework incorporates three static variables: a susceptibility map; a 24-hr rainfall triggering threshold; and an antecedent soil moisture variable threshold, which have been calibrated using historic landslide events. The thresholds are regionally heterogeneous and are based on the percentile distribution of the rainfall or antecedent moisture time series. A simple decision tree algorithm framework integrates all three variables with the rainfall and soil moisture time series and generates a landslide nowcast in real-time based on the previous 24 hours over this region. This system has been evaluated using several available landslide inventories over the Central America and Caribbean region. Spatiotemporal uncertainty and evaluation metrics of the model are presented here based on available landslide reports. This work also presents a probabilistic representation of potential landslide activity over the region which can be used to further refine and improve the real-time landslide hazard assessment system as well as better identify and characterize the uncertainties inherent in this type of regional approach. The landslide algorithm provides a flexible framework to improve hazard estimation and reduce uncertainty at any spatial and temporal scale.
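One plausible reading of the decision-tree logic described above can be sketched as a simple rule: a nowcast is issued only where static susceptibility is high and both dynamic thresholds are exceeded. The cutoff values and the exact combination rule are hypothetical placeholders, not the calibrated regional thresholds.

```python
def landslide_nowcast(susceptibility, rain_24h, rain_thresh,
                      antecedent_moisture, moisture_thresh,
                      susceptibility_cutoff=0.5):
    """Sketch of a per-cell decision-tree nowcast: requires high static
    susceptibility AND 24-hr rainfall above its percentile-based
    threshold AND antecedent soil moisture above its threshold.
    All cutoffs here are illustrative assumptions."""
    if susceptibility < susceptibility_cutoff:
        return False
    return rain_24h > rain_thresh and antecedent_moisture > moisture_thresh

# Susceptible cell, heavy rain on wet soil -> nowcast issued
print(landslide_nowcast(0.8, 50.0, 30.0, 0.7, 0.5))
```

Because the thresholds are percentile-based and regionally heterogeneous, each grid cell would carry its own `rain_thresh` and `moisture_thresh` derived from its local time series.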
Determining Greenland Ice Sheet Accumulation Rates from Radar Remote Sensing
NASA Technical Reports Server (NTRS)
Jezek, Kenneth C.
2001-01-01
An important component of NASA's Program for Arctic Regional Climate Assessment (PARCA) is a mass balance investigation of the Greenland Ice Sheet. The mass balance is calculated by taking the difference between the snow accumulation and the ice discharge of the ice sheet. Uncertainties in this calculation include the snow accumulation rate, which has traditionally been determined by interpolating data from ice core samples taken throughout the ice sheet. The sparse data associated with ice cores, coupled with the high spatial and temporal resolution provided by remote sensing, have motivated scientists to investigate relationships between accumulation rate and microwave observations.
Planning for smallpox outbreaks
NASA Astrophysics Data System (ADS)
Ferguson, Neil M.; Keeling, Matt J.; John Edmunds, W.; Gani, Raymond; Grenfell, Bryan T.; Anderson, Roy M.; Leach, Steve
2003-10-01
Mathematical models of viral transmission and control are important tools for assessing the threat posed by deliberate release of the smallpox virus and the best means of containing an outbreak. Models must balance biological realism against limitations of knowledge, and uncertainties need to be accurately communicated to policy-makers. Smallpox poses the particular challenge that key biological, social and spatial factors affecting disease spread in contemporary populations must be elucidated largely from historical studies undertaken before disease eradication in 1979. We review the use of models in smallpox planning within the broader epidemiological context set by recent outbreaks of both novel and re-emerging pathogens.
When size matters: attention affects performance by contrast or response gain.
Herrmann, Katrin; Montaser-Kouhsari, Leila; Carrasco, Marisa; Heeger, David J
2010-12-01
Covert attention, the selective processing of visual information in the absence of eye movements, improves behavioral performance. We found that attention, both exogenous (involuntary) and endogenous (voluntary), can affect performance by contrast or response gain changes, depending on the stimulus size and the relative size of the attention field. These two variables were manipulated in a cueing task while stimulus contrast was varied. We observed a change in behavioral performance consonant with a change in contrast gain for small stimuli paired with spatial uncertainty and a change in response gain for large stimuli presented at one location (no uncertainty) and surrounded by irrelevant flanking distracters. A complementary neuroimaging experiment revealed that observers' attention fields were wider with than without spatial uncertainty. Our results support important predictions of the normalization model of attention and reconcile previous, seemingly contradictory findings on the effects of visual attention.
NASA Astrophysics Data System (ADS)
Oroza, C.; Zheng, Z.; Glaser, S. D.; Bales, R. C.; Conklin, M. H.
2016-12-01
We present a structured, analytical approach to optimize ground-sensor placements based on time-series remotely sensed (LiDAR) data and machine-learning algorithms. We focused on catchments within the Merced and Tuolumne river basins, covered by the JPL Airborne Snow Observatory LiDAR program. First, we used a Gaussian mixture model to identify representative sensor locations in the space of independent variables for each catchment. Multiple independent variables that govern the distribution of snow depth were used, including elevation, slope, and aspect. Second, we used a Gaussian process to estimate the areal distribution of snow depth from the initial set of measurements. This is a covariance-based model that also estimates the areal distribution of model uncertainty based on the independent variable weights and autocorrelation. The uncertainty raster was used to strategically add sensors to minimize model uncertainty. We assessed the temporal accuracy of the method using LiDAR-derived snow-depth rasters collected in water-year 2014. In each area, optimal sensor placements were determined using the first available snow raster for the year. The accuracy in the remaining LiDAR surveys was compared to 100 configurations of sensors selected at random. We found the accuracy of the model from the proposed placements to be higher and more consistent in each remaining survey than the average random configuration. We found that a relatively small number of sensors can be used to accurately reproduce the spatial patterns of snow depth across the basins, when placed using spatial snow data. Our approach also simplifies sensor placement. At present, field surveys are required to identify representative locations for such networks, a process that is labor intensive and provides limited guarantees on the networks' representation of catchment independent variables.
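The uncertainty-driven placement step described above can be sketched in one dimension: a Gaussian-process predictive variance is computed over candidate sites, and each new sensor is placed where that variance is largest. The kernel, length scale, and 1-D site grid are illustrative simplifications of the basin-scale procedure, not the study's configuration.

```python
import numpy as np

def rbf(a, b, length=0.3):
    """Squared-exponential covariance between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def posterior_variance(candidates, sensors, noise=1e-6):
    """GP predictive variance at candidate sites given sensor sites
    (unit prior variance; observed values are not needed for variance)."""
    if len(sensors) == 0:
        return np.ones(len(candidates))
    K = rbf(sensors, sensors) + noise * np.eye(len(sensors))
    k = rbf(candidates, sensors)
    return 1.0 - np.sum((k @ np.linalg.inv(K)) * k, axis=1)

def greedy_placement(candidates, n_sensors):
    """Place each new sensor at the site of maximum predictive uncertainty."""
    sensors = np.empty(0)
    for _ in range(n_sensors):
        var = posterior_variance(candidates, sensors)
        sensors = np.append(sensors, candidates[np.argmax(var)])
    return sensors

sites = np.linspace(0.0, 1.0, 101)   # 1-D stand-in for catchment terrain bins
picked = greedy_placement(sites, 4)
```

In the study the candidate space is multi-dimensional (elevation, slope, aspect), but the principle is the same: the variance raster, not the snow values themselves, drives where the next sensor goes.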
NASA Astrophysics Data System (ADS)
Yanai, R. D.; Bae, K.; Levine, C. R.; Lilly, P.; Vadeboncoeur, M. A.; Fatemi, F. R.; Blum, J. D.; Arthur, M.; Hamburg, S.
2013-12-01
Ecosystem nutrient budgets are difficult to construct and even more difficult to replicate. As a result, uncertainty in the estimates of pools and fluxes is rarely reported, and opportunities to assess confidence through replicated measurements are rare. In this study, we report nutrient concentrations and contents of soil and biomass pools in northern hardwood stands in replicate plots within replicate stands in 3 age classes (14-19 yr, 26-29 yr, and > 100 yr) at the Bartlett Experimental Forest, USA. Soils were described by quantitative soil pits in three plots per stand, excavated by depth increment to the C horizon and analyzed by a sequential extraction procedure. Variation in soil mass among pits within stands averaged 28% (coefficient of variation); variation among stands within an age class ranged from 9-25%. Variation in nutrient concentrations was higher still (averaging 38% within element, depth increment, and extraction type), perhaps because the depth increments contained varying proportions of genetic horizons. To estimate nutrient contents of aboveground biomass, we propagated model uncertainty through allometric equations and found errors ranging from 3-7%, depending on the stand. The variation in biomass among plots within stands (6-19%) was always larger than the allometric uncertainties. Measured nutrient concentrations of tree tissues were more variable than the biomass estimates. Foliage had the lowest variability (averaging 16% for Ca, Mg, K, N and P within age class and species) and wood had the highest (averaging 30%), when reported in proportion to the mean, because concentrations in wood are low. For Ca content of aboveground biomass, sampling variation was the greatest source of uncertainty. Coefficients of variation among plots within a stand averaged 16%; stands within an age class ranged from 5-25% CV, including uncertainties in tree allometry and tissue chemistry.
Uncertainty analysis can help direct research effort to areas most in need of improvement. In systems such as the one we studied, more intensive sampling would be the best approach to reducing uncertainty, as natural spatial variation was higher than model or measurement uncertainties.
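The propagation of allometric and tissue-chemistry uncertainty into a nutrient content estimate can be sketched with a Monte Carlo draw. The allometric coefficients, their uncertainties, and the Ca concentration below are illustrative numbers, not the Bartlett equations or data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical allometry: biomass = a * dbh**b, with uncertain
# coefficients, and an uncertain tissue Ca concentration.
dbh = 25.0                              # stem diameter, cm
a = rng.normal(0.10, 0.005, n)          # allometric coefficient (illustrative)
b = rng.normal(2.40, 0.03, n)           # allometric exponent (illustrative)
ca = rng.normal(1.2, 0.3, n)            # Ca concentration, mg/g (illustrative)

biomass = a * dbh ** b                  # per-draw biomass, kg
ca_content = biomass * ca               # propagated Ca content

cv_biomass = biomass.std() / biomass.mean()
cv_ca = ca_content.std() / ca_content.mean()
```

With these inputs the Ca-content CV exceeds the biomass CV, mirroring the abstract's finding that tissue chemistry, not allometry, dominates the nutrient-content uncertainty.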
NASA Astrophysics Data System (ADS)
de Lavenne, Alban; Thirel, Guillaume; Andréassian, Vazken; Perrin, Charles; Ramos, Maria-Helena
2016-04-01
Semi-distributed hydrological models aim to provide useful information to understand and manage the spatial distribution of water resources. However, their evaluation is often limited to independent and single evaluations at each sub-catchment within larger catchments. This makes it possible to qualify model performance at different points, but does not provide a coherent assessment of the overall spatial consistency of the model. To cope with these methodological deficiencies, we propose a two-step strategy. First, we apply a sequential spatial calibration procedure to define spatially consistent model parameters. Second, we evaluate the hydrological simulations using variables that involve some dependency between sub-catchments, to evaluate the overall coherence of model outputs. In this study, we particularly choose to look at the simulated Intercatchment Groundwater Flows (IGF). The idea is that the water that is lost in one place should be recovered somewhere else within the catchment to guarantee a spatially coherent water balance in time. The model used is a recently developed daily semi-distributed model, which is based on a spatial distribution of the lumped GR5J model. The model has five parameters for each sub-catchment and a streamflow velocity parameter for flow routing between them. It implements two reservoirs, one for production and one for routing, and estimates IGF according to the level of the routing reservoir, so that a catchment can release water to IGF during high flows and receive water through IGF during low flows. The calibration of the model is performed from upstream to downstream, making an efficient use of spatially distributed streamflow measurements. To take model uncertainty into account, we implemented three variants of the original model structure, each one computing the IGF in each sub-catchment in a different way. The study is applied to over 1000 catchments in France.
By exploring a wide area and a range of hydrometeorological conditions, we aim to detect IGF even between catchments that can be quite distant from one another.
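The spatial-coherence idea above, that water lost to IGF in one sub-catchment should reappear in another, suggests a simple basin-wide closure diagnostic. This is an illustrative check of the stated principle, not a function from the model described.

```python
import numpy as np

def igf_closure_error(igf):
    """Relative water-balance closure error for simulated
    intercatchment groundwater flows. `igf` has shape
    (n_subcatchments, n_timesteps), in mm (positive = gain,
    negative = loss); the basin-wide sum should be near zero
    at each step if the exchanges are spatially coherent."""
    net = igf.sum(axis=0)                   # basin net IGF per time step
    gross = np.abs(igf).sum(axis=0)         # total exchanged volume
    return np.abs(net) / np.maximum(gross, 1e-12)

# Perfectly closed two-subcatchment exchange: one loses what the
# other gains, so the closure error is zero at every step.
igf = np.array([[-1.0, -0.5, 0.2],
                [ 1.0,  0.5, -0.2]])
print(igf_closure_error(igf))   # -> [0. 0. 0.]
```

A nonzero closure error flags time steps where simulated IGF amounts to an unphysical net creation or loss of water within the basin.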
NASA Astrophysics Data System (ADS)
Arnault, Joel; Rummler, Thomas; Baur, Florian; Lerch, Sebastian; Wagner, Sven; Fersch, Benjamin; Zhang, Zhenyu; Kerandi, Noah; Keil, Christian; Kunstmann, Harald
2017-04-01
Precipitation predictability can be assessed by the spread within an ensemble of atmospheric simulations perturbed, within a range of uncertainty, in the initial conditions, lateral boundary conditions, and/or modeled processes. Surface-related processes are more likely to change precipitation when synoptic forcing is weak. This study investigates the effect of uncertainty in the representation of terrestrial water flows on precipitation predictability. The tools used for this investigation are the Weather Research and Forecasting (WRF) model and its hydrologically enhanced version WRF-Hydro, applied over Central Europe during April-October 2008. The WRF grid is that of COSMO-DE, with a resolution of 2.8 km. In WRF-Hydro, the WRF grid is coupled with a sub-grid at 280 m resolution to resolve lateral terrestrial water flows. Vertical flow uncertainty is considered by modifying the parameter controlling the partitioning between surface runoff and infiltration in WRF, and horizontal flow uncertainty is considered by comparing WRF with WRF-Hydro. Precipitation predictability is deduced from the spread of an ensemble based on three turbulence parameterizations. Model results are validated with E-OBS precipitation and surface temperature, ESA-CCI soil moisture, FLUXNET-MTE surface evaporation and GRDC discharge. It is found that the uncertainty in the representation of terrestrial water flows is more likely to significantly affect precipitation predictability when surface flux spatial variability is high. In comparison to the WRF ensemble, WRF-Hydro slightly improves the adjusted continuous ranked probability score of daily precipitation. The reproduction of observed daily discharge with Nash-Sutcliffe model efficiency coefficients up to 0.91 demonstrates the potential of WRF-Hydro for flood forecasting.
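The continuous ranked probability score mentioned above, used here to compare the WRF and WRF-Hydro ensembles, has a standard empirical estimator for a finite ensemble: the mean absolute error of the members minus half the mean absolute pairwise spread. A minimal sketch (not WRF code):

```python
import numpy as np

def crps_ensemble(members, obs):
    """Empirical CRPS for one forecast-observation pair:
    E|X - y| - 0.5 * E|X - X'|, with X, X' drawn from the
    ensemble. Lower is better; 0 for a perfect deterministic
    forecast."""
    m = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(m - obs))
    term2 = 0.5 * np.mean(np.abs(m[:, None] - m[None, :]))
    return term1 - term2

# Two members straddling the observation: CRPS = 0.5 - 0.25 = 0.25
print(crps_ensemble([0.0, 1.0], 0.5))
```

In an evaluation like the one described, this score would be averaged over grid points and days for each ensemble before comparing the two model versions.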
Tennant, David; Bánáti, Diána; Kennedy, Marc; König, Jürgen; O'Mahony, Cian; Kettler, Susanne
2017-11-01
A previous publication described methods for assessing and reporting uncertainty in dietary exposure assessments. This follow-up publication uses a case study to develop proposals for representing and communicating uncertainty to risk managers. The food ingredient aspartame is used as the case study in a simple deterministic model (the EFSA FAIM template) and with more sophisticated probabilistic exposure assessment software (FACET). Parameter and model uncertainties are identified for each modelling approach and tabulated. The relative importance of each source of uncertainty is then evaluated using a semi-quantitative scale and the results expressed using two different forms of graphical summary. The value of this approach in expressing uncertainties in a manner that is relevant to the exposure assessment and useful to risk managers is then discussed. It was observed that the majority of uncertainties are often associated with data sources rather than the model itself. However, differences in modelling methods can have the greatest impact on uncertainties overall, particularly when the underlying data are the same. It was concluded that improved methods for communicating uncertainties to risk managers should be the main focus of future research effort. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M
2014-01-01
Quantitative health impact assessment (HIA) is increasingly being used to assess the health impacts attributable to an environmental policy or intervention. As a consequence, there is a need to assess uncertainties in the assessments because of the uncertainty in the HIA models. In this paper, a framework is developed to quantify the uncertainty in the health impacts of environmental interventions and is applied to evaluate the impacts of poor housing ventilation. The paper describes the development of the framework through three steps: (i) selecting the relevant exposure metric and quantifying the evidence of potential health effects of the exposure; (ii) estimating the size of the population affected by the exposure and selecting the associated outcome measure; (iii) quantifying the health impact and its uncertainty. The framework introduces a novel application for the propagation of uncertainty in HIA, based on fuzzy set theory. Fuzzy sets are used to propagate parametric uncertainty in a non-probabilistic space and are applied to calculate the uncertainty in the morbidity burdens associated with three indoor ventilation exposure scenarios: poor, fair and adequate. The case-study example demonstrates how the framework can be used in practice, to quantify the uncertainty in health impact assessment where there is insufficient information to carry out a probabilistic uncertainty analysis. © 2013.
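The fuzzy-set propagation described above is commonly implemented with alpha-cuts: each membership level defines an interval, and intervals are pushed through the model. The triangular fuzzy exposure and the linear burden function below are hypothetical illustrations, not the paper's ventilation scenarios.

```python
def tri_alpha_cut(low, mode, high, alpha):
    """Alpha-cut interval of a triangular fuzzy number (low, mode, high)."""
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def propagate(f, cut):
    """Propagate an interval through a monotonically increasing
    function f (endpoints map to endpoints). For a non-monotone f
    one would optimise f over the interval instead."""
    lo, hi = cut
    return (f(lo), f(hi))

# Illustrative: fuzzy exposure (triangular 2/5/9 units) through a
# hypothetical linear morbidity-burden function, burden = 0.8 * exposure.
for alpha in (0.0, 0.5, 1.0):
    cut = tri_alpha_cut(2.0, 5.0, 9.0, alpha)
    print(alpha, propagate(lambda x: 0.8 * x, cut))
```

The result is a fuzzy (interval-valued) burden rather than a probability distribution, which is the point of the non-probabilistic approach when data are too sparse for a full probabilistic analysis.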
NASA Astrophysics Data System (ADS)
Müller, Benjamin; Bernhardt, Matthias; Jackisch, Conrad; Schulz, Karsten
2016-09-01
For understanding water and solute transport processes, knowledge about the respective hydraulic properties is necessary. Commonly, hydraulic parameters are estimated via pedo-transfer functions using soil texture data to avoid cost-intensive measurements of hydraulic parameters in the laboratory. However, current soil texture information is only available at a coarse spatial resolution of 250 to 1000 m. Here, a method is presented to derive high-resolution (15 m) spatial topsoil texture patterns for the meso-scale Attert catchment (Luxembourg, 288 km2) from 28 images of ASTER (advanced spaceborne thermal emission and reflection radiometer) thermal remote sensing. A principal component analysis of the images reveals the most dominant thermal patterns (principal components, PCs), which are related to 212 fractional soil texture samples. Within a multiple linear regression framework, distributed soil texture information is estimated and related uncertainties are assessed. An overall root mean squared error (RMSE) of 12.7 percentage points (pp) lies well within and even below the range of recent studies on soil texture estimation, while requiring sparser sample setups and a less diverse set of basic spatial input. This approach will improve the generation of spatially distributed topsoil maps, particularly for hydrologic modeling purposes, and will expand the usage of thermal remote sensing products.
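The PCA-plus-regression workflow above can be sketched on synthetic data: extract the dominant patterns from a stack of images via SVD, then regress the texture samples on the leading components. All data below are synthetic stand-ins (the dimensions echo the abstract, but the values are invented).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: 28 thermal "images" sampled at 212 sites, with
# sand fraction linearly related to the dominant thermal pattern.
n_sites, n_images = 212, 28
signal = rng.normal(size=n_sites)
X = (np.outer(signal, rng.normal(size=n_images))
     + 0.3 * rng.normal(size=(n_sites, n_images)))
sand = 40 + 10 * signal + 3 * rng.normal(size=n_sites)   # percent

# PCA via SVD of the centred image matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = U[:, :3] * s[:3]                  # first three PC scores as predictors

# Multiple linear regression of texture on the PCs
A = np.column_stack([np.ones(n_sites), pcs])
coef, *_ = np.linalg.lstsq(A, sand, rcond=None)
pred = A @ coef
rmse = np.sqrt(np.mean((sand - pred) ** 2))
```

The residual RMSE here converges to the injected noise level; in the real study the analogous figure (12.7 pp) bundles sampling, sensor, and model error together.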
Comparing spatial regression to random forests for large ...
Environmental data may be “large” due to number of records, number of covariates, or both. Random forests has a reputation for good predictive performance when using many covariates, whereas spatial regression, when using reduced rank methods, has a reputation for good predictive performance when using many records. In this study, we compare these two techniques using a data set containing the macroinvertebrate multimetric index (MMI) at 1859 stream sites with over 200 landscape covariates. Our primary goal is predicting MMI at over 1.1 million perennial stream reaches across the USA. For spatial regression modeling, we develop two new methods to accommodate large data: (1) a procedure that estimates optimal Box-Cox transformations to linearize covariate relationships; and (2) a computationally efficient covariate selection routine that takes into account spatial autocorrelation. We show that our new methods lead to cross-validated performance similar to random forests, but that there is an advantage for spatial regression when quantifying the uncertainty of the predictions. Simulations are used to clarify advantages for each method. This research investigates different approaches for modeling and mapping national stream condition. We use MMI data from the EPA's National Rivers and Streams Assessment and predictors from StreamCat (Hill et al., 2015). Previous studies have focused on modeling the MMI condition classes (i.e., good, fair, and poor).
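The Box-Cox step mentioned above, estimating the transformation that best linearises or normalises a covariate, can be sketched with the standard profile log-likelihood and a grid search. This is a simplified stand-in for the paper's procedure, not its implementation.

```python
import numpy as np

def boxcox_loglik(x, lam):
    """Profile log-likelihood of the Box-Cox parameter for positive
    data x: -n/2 * log(var(y)) + (lam - 1) * sum(log x), where y is
    the transformed data."""
    y = np.log(x) if lam == 0 else (x ** lam - 1.0) / lam
    return -0.5 * len(x) * np.log(y.var()) + (lam - 1.0) * np.log(x).sum()

def best_boxcox_lambda(x, grid=np.arange(-20, 21) / 10.0):
    """Pick the lambda on a coarse grid that maximises the profile
    log-likelihood (grid-search stand-in for a proper optimiser)."""
    return max(grid, key=lambda lam: boxcox_loglik(x, lam))

rng = np.random.default_rng(1)
x = rng.lognormal(mean=0.0, sigma=0.8, size=500)
lam = best_boxcox_lambda(x)   # log-normal data: best lambda near 0 (log transform)
```

In the large-data setting of the study, this per-covariate estimate would be repeated across the 200+ landscape covariates before the spatial model is fitted.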
Spatial-temporal and cancer risk assessment of selected hazardous air pollutants in Seattle.
Wu, Chang-fu; Liu, L-J Sally; Cullen, Alison; Westberg, Hal; Williamson, John
2011-01-01
In the Seattle Air Toxics Monitoring Pilot Program, we measured 15 hazardous air pollutants (HAPs) at 6 sites for more than a year between 2000 and 2002. Spatial-temporal variations were evaluated with random-effects models and principal component analyses. The potential health risks were further estimated based on the monitored data, with the incorporation of the bootstrapping technique for the uncertainty analysis. We found that the temporal variability was generally higher than the spatial variability for most air toxics. The highest temporal variability was observed for tetrachloroethylene (70% temporal vs. 34% spatial variability). Nevertheless, most air toxics still exhibited significant spatial variations, even after accounting for the temporal effects. These results suggest that it would require operating multiple air toxics monitoring sites over a significant period of time with proper monitoring frequency to better evaluate population exposure to HAPs. The median values of the estimated inhalation cancer risks ranged between 4.3 × 10⁻⁵ and 6.0 × 10⁻⁵, with the 5th and 95th percentiles both exceeding one in a million. VOCs as a whole contributed over 80% of the risk among the HAPs measured, and arsenic contributed most substantially to the overall risk associated with metals. Copyright © 2010 Elsevier Ltd. All rights reserved.
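The bootstrap uncertainty analysis mentioned above can be sketched as a percentile bootstrap for the median risk estimate. The risk values below are synthetic stand-ins on the same order of magnitude as the reported medians, not the Seattle measurements.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical per-sample lifetime inhalation cancer-risk estimates
# (unitless; synthetic, roughly centred on 5e-5 for illustration).
risks = rng.lognormal(mean=np.log(5e-5), sigma=0.4, size=60)

# Percentile bootstrap for the uncertainty in the median risk:
# resample the data with replacement, recompute the median each time.
boot_medians = np.array([
    np.median(rng.choice(risks, size=risks.size, replace=True))
    for _ in range(2000)
])
lo, hi = np.percentile(boot_medians, [2.5, 97.5])
```

The resulting interval (lo, hi) plays the role of the percentile bounds reported in the abstract, e.g. checking whether even the lower bound clears the one-in-a-million benchmark.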
Shryock, Daniel F.; Havrilla, Caroline A.; DeFalco, Lesley; Esque, Todd C.; Custer, Nathan; Wood, Troy E.
2015-01-01
Local adaptation influences plant species’ responses to climate change and their performance in ecological restoration. Fine-scale physiological or phenological adaptations that direct demographic processes may drive intraspecific variability when baseline environmental conditions change. Landscape genomics characterizes adaptive differentiation by identifying environmental drivers of adaptive genetic variability and mapping the associated landscape patterns. We applied such an approach to Sphaeralcea ambigua, an important restoration plant in the arid southwestern United States, by analyzing variation at 153 amplified fragment length polymorphism loci in the context of environmental gradients separating 47 Mojave Desert populations. We identified 37 potentially adaptive loci through a combination of genome scan approaches. We then used a generalized dissimilarity model (GDM) to relate variability in potentially adaptive loci to spatial gradients in temperature, precipitation, and topography. We identified non-linear thresholds in loci frequencies driven by summer maximum temperature and water stress, along with continuous variation corresponding to temperature seasonality. Two GDM-based approaches for mapping predicted patterns of local adaptation are compared. Additionally, we assess uncertainty in spatial interpolations through a novel spatial bootstrapping approach. Our study presents robust, accessible methods for deriving spatially-explicit models of adaptive genetic variability in non-model species that will inform climate change modelling and ecological restoration.
Watershed scale rainfall-runoff models are used for environmental management and regulatory modeling applications, but their effectiveness is limited by predictive uncertainties associated with model input data. This study evaluated the effect of temporal and spatial rainfall re...
Integrated probabilistic risk assessment for nanoparticles: the case of nanosilica in food.
Jacobs, Rianne; van der Voet, Hilko; Ter Braak, Cajo J F
Insight into risks of nanotechnology and the use of nanoparticles is an essential condition for the social acceptance and safe use of nanotechnology. One of the problems faced by the risk assessment of nanoparticles is the lack of data, resulting in uncertainty in the risk assessment. We attempt to quantify some of this uncertainty by expanding a previous deterministic study on nanosilica (5-200 nm) in food into a fully integrated probabilistic risk assessment. We use the integrated probabilistic risk assessment method, in which statistical distributions and bootstrap methods are used to quantify uncertainty and variability in the risk assessment. Due to the large amount of uncertainty present, this probabilistic method, which separates variability from uncertainty, produced a more understandable risk assessment. We found that quantifying the uncertainties did not increase the perceived risk relative to the outcome of the deterministic study. We pinpointed particular aspects of the hazard characterization that contributed most to the total uncertainty in the risk assessment, suggesting that further research would benefit most from obtaining more reliable data on those aspects.
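The separation of variability from uncertainty described above is typically implemented as a two-dimensional (nested) Monte Carlo: an outer loop samples the *uncertain* parameters, and an inner loop samples *inter-individual variability* given those parameters. The distributions and numbers below are illustrative, not the nanosilica data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Outer loop: uncertainty about the population mean log-intake.
# Inner loop: variability of intake across individuals, given that mean.
n_outer, n_inner = 200, 1000
p99 = np.empty(n_outer)
for i in range(n_outer):
    mu = rng.normal(1.0, 0.2)                    # uncertain mean (illustrative)
    intakes = rng.lognormal(mu, 0.5, n_inner)    # variability across people
    p99[i] = np.percentile(intakes, 99)          # high-end exposed individual

# Uncertainty band around the variability percentile: how uncertain
# we are about what a 99th-percentile individual is exposed to.
band = np.percentile(p99, [5, 95])
```

The width of `band` reflects uncertainty only; the 99th percentile itself reflects variability. Collapsing the two into one distribution would obscure which one further research could actually reduce.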
Spatial entanglement patterns and Einstein-Podolsky-Rosen steering in Bose-Einstein condensates
NASA Astrophysics Data System (ADS)
Fadel, Matteo; Zibold, Tilman; Décamps, Boris; Treutlein, Philipp
2018-04-01
Many-particle entanglement is a fundamental concept of quantum physics that still presents conceptual challenges. Although nonclassical states of atomic ensembles were used to enhance measurement precision in quantum metrology, the notion of entanglement in these systems was debated because the correlations among the indistinguishable atoms were witnessed by collective measurements only. Here, we use high-resolution imaging to directly measure the spin correlations between spatially separated parts of a spin-squeezed Bose-Einstein condensate. We observe entanglement that is strong enough for Einstein-Podolsky-Rosen steering: We can predict measurement outcomes for noncommuting observables in one spatial region on the basis of corresponding measurements in another region with an inferred uncertainty product below the Heisenberg uncertainty bound. This method could be exploited for entanglement-enhanced imaging of electromagnetic field distributions and quantum information tasks.
Prospects and pitfalls of occupational hazard mapping: 'between these lines there be dragons'.
Koehler, Kirsten A; Volckens, John
2011-10-01
Hazard data mapping is a promising new technique that can enhance the process of occupational exposure assessment and risk communication. Hazard maps have the potential to improve worker health by providing key input for the design of hazard intervention and control strategies. Hazard maps are developed with aid from direct-reading instruments, which can collect highly spatially and temporally resolved data in a relatively short period of time. However, quantifying spatial-temporal variability in the occupational environment is not a straightforward process, and our lack of understanding of how to ascertain and model spatial and temporal variability is a limiting factor in the use and interpretation of workplace hazard maps. We provide an example of how sources of and exposures to workplace hazards may be mischaracterized in a hazard map due to a lack of completeness and representativeness of collected measurement data. Based on this example, we believe that a major priority for research in this emerging area should focus on the development of a statistical framework to quantify uncertainty in spatially and temporally varying data. In conjunction with this need is one for the development of guidelines and procedures for the proper sampling, generation, and evaluation of workplace hazard maps.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Di Vittorio, Alan V.; Kyle, Page; Collins, William D.
Understanding potential impacts of climate change is complicated by spatially mismatched land representations between gridded datasets and models, and land use models with larger regions defined by geopolitical and/or biophysical criteria. In this study, we quantify the sensitivity of Global Change Assessment Model (GCAM) outputs to the delineation of Agro-Ecological Zones (AEZs), which are normally based on historical (1961–1990) climate. We reconstruct GCAM's land regions using projected (2071–2100) climate, and find large differences in estimated future land use that correspond with differences in agricultural commodity prices and production volumes. Importantly, historically delineated AEZs experience spatially heterogeneous climate impacts over time, and do not necessarily provide more homogeneous initial land productivity than projected AEZs. Finally, we conclude that non-climatic criteria for land use region delineation are likely preferable for modeling land use change in the context of climate change, and that uncertainty associated with land delineation needs to be quantified.
The Value of Learning about Natural History in Biodiversity Markets
Bruggeman, Douglas J.
2015-01-01
Markets for biodiversity have generated much controversy because of the often unstated and untested assumptions included in transactions rules. Simple trading rules are favored to reduce transaction costs, but others have argued that this leads to markets that favor development and erode biodiversity. Here, I describe how embracing complexity and uncertainty within a tradable credit system for the Red-cockaded Woodpecker (Picoides borealis) creates opportunities to achieve financial and conservation goals simultaneously. Reversing the effects of habitat fragmentation is one of the main reasons for developing markets. I include uncertainty in habitat fragmentation effects by evaluating market transactions using five alternative dispersal models that were able to approximate observed patterns of occupancy and movement. Further, because dispersal habitat is often not included in market transactions, I contrast how changes in breeding versus dispersal habitat affect credit values. I use an individual-based, spatially explicit population model for the Red-cockaded Woodpecker (Picoides borealis) to predict spatial and temporal influences of landscape change on species occurrence and genetic diversity. Results indicated that the probability of no net loss of abundance and genetic diversity responded differently to the transient dynamics in breeding and dispersal habitat. Trades that do not violate the abundance cap may simultaneously violate the cap for the erosion of genetic diversity. To highlight how economic incentives may help reduce uncertainty, I demonstrate tradeoffs between the value of tradable credits and the value of information needed to predict the influence of habitat trades on population viability. For the trade with the greatest uncertainty regarding the change in habitat fragmentation, I estimate that the value of using 13-years of data to reduce uncertainty in dispersal behaviors is $6.2 million.
Future guidance for biodiversity markets should at least encourage the use of spatially- and temporally-explicit techniques that include population genetic estimates and the influence of uncertainty. PMID:26675488
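The value-of-information tradeoff described above can be sketched as a simple expected value of perfect information (EVPI) calculation over candidate dispersal models. This is an illustrative sketch, not the paper's model: the model weights and payoff figures below are hypothetical placeholders.

```python
# Illustrative EVPI sketch: how much resolving uncertainty about which
# dispersal model is "true" could be worth. All numbers are hypothetical.

models = ["m1", "m2", "m3", "m4", "m5"]   # five candidate dispersal models
p = {m: 0.2 for m in models}              # equal prior weight on each model

# Net value of each trade decision under each dispersal model,
# in $ millions (made-up payoffs for illustration).
payoff = {
    "approve_trade": {"m1": 9.0, "m2": 8.0, "m3": -4.0, "m4": 7.0, "m5": -2.0},
    "reject_trade":  {"m1": 0.0, "m2": 0.0, "m3": 0.0, "m4": 0.0, "m5": 0.0},
}

# Best decision under uncertainty: maximize the model-averaged payoff.
ev_under_uncertainty = max(
    sum(p[m] * payoff[d][m] for m in models) for d in payoff
)

# With perfect information we pick the best decision per model, then average.
ev_perfect_info = sum(
    p[m] * max(payoff[d][m] for d in payoff) for m in models
)

evpi = ev_perfect_info - ev_under_uncertainty
print(f"EVPI = ${evpi:.1f}M")  # → EVPI = $1.2M
```

EVPI is an upper bound on what any monitoring program (such as the 13 years of dispersal data) could be worth, which is why it pairs naturally with the credit-value tradeoffs the abstract describes.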
Assessing dynamics, spatial scale, and uncertainty in task-related brain network analyses.
Stephen, Emily P; Lepage, Kyle Q; Eden, Uri T; Brunner, Peter; Schalk, Gerwin; Brumberg, Jonathan S; Guenther, Frank H; Kramer, Mark A
2014-01-01
The brain is a complex network of interconnected elements, whose interactions evolve dynamically in time to cooperatively perform specific functions. A common technique to probe these interactions involves multi-sensor recordings of brain activity during a repeated task. Many techniques exist to characterize the resulting task-related activity, including establishing functional networks, which represent the statistical associations between brain areas. Although functional network inference is commonly employed to analyze neural time series data, techniques to assess the uncertainty, both in the functional network edges and in the corresponding aggregate measures of network topology, are lacking. To address this, we describe a statistically principled approach for computing uncertainty in functional networks and aggregate network measures in task-related data. The approach is based on a resampling procedure that utilizes the trial structure common in experimental recordings. We show in simulations that this approach successfully identifies functional networks and associated measures of confidence emergent during a task in a variety of scenarios, including dynamically evolving networks. In addition, we describe a principled technique for establishing functional networks based on predetermined regions of interest using canonical correlation. Doing so provides additional robustness to the functional network inference. Finally, we illustrate the use of these methods on example invasive brain voltage recordings collected during an overt speech task. The general strategy described here, appropriate for static and dynamic network inference and for different statistical measures of coupling, permits the evaluation of confidence in network measures in a variety of settings common to neuroscience.
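The trial-based resampling idea behind this kind of uncertainty assessment can be sketched as a bootstrap over trials. The synthetic data, correlation-based edge definition, and percentile intervals below are illustrative assumptions, not the authors' exact pipeline:

```python
# Sketch: bootstrap over trials to put confidence intervals on
# functional-network edges (correlations between channels).
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 50, 4, 200

# Synthetic task data: channels 0 and 1 share a common signal.
common = rng.standard_normal((n_trials, 1, n_samples))
data = rng.standard_normal((n_trials, n_channels, n_samples))
data[:, :2, :] += common

def edge_matrix(trials):
    """Average time series across trials, then correlate channels."""
    mean_ts = trials.mean(axis=0)          # (channels, samples)
    return np.corrcoef(mean_ts)

n_boot = 500
boot_edges = np.empty((n_boot, n_channels, n_channels))
for b in range(n_boot):
    idx = rng.integers(0, n_trials, size=n_trials)  # resample trials with replacement
    boot_edges[b] = edge_matrix(data[idx])

lo, hi = np.percentile(boot_edges, [2.5, 97.5], axis=0)  # 95% interval per edge
significant = lo > 0                                     # edges whose interval excludes 0
print("Edge (0,1) 95% interval:", lo[0, 1], hi[0, 1])
```

Resampling whole trials (rather than individual samples) preserves the within-trial temporal structure, which is the key property that makes the trial structure of experimental recordings useful for uncertainty estimation.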
Uncertainties in mapping forest carbon in urban ecosystems.
Chen, Gang; Ozelkan, Emre; Singh, Kunwar K; Zhou, Jun; Brown, Marilyn R; Meentemeyer, Ross K
2017-02-01
Spatially explicit urban forest carbon estimation provides a baseline map for understanding the variation in forest vertical structure, informing sustainable forest management and urban planning. While high-resolution remote sensing has proven promising for carbon mapping in highly fragmented urban landscapes, data cost and availability are the major obstacles prohibiting accurate, consistent, and repeated measurement of forest carbon pools in cities. This study aims to evaluate the uncertainties of forest carbon estimation in response to the combined impacts of remote sensing data resolution and neighborhood spatial patterns in Charlotte, North Carolina. The remote sensing data for carbon mapping were resampled to a range of resolutions: LiDAR point cloud density (5.8, 4.6, 2.3, and 1.2 points/m²) and aerial optical NAIP (National Agricultural Imagery Program) imagery (1, 5, 10, and 20 m). Urban spatial patterns were extracted to represent the area, shape complexity, dispersion/interspersion, diversity, and connectivity of landscape patches across residential neighborhoods with built-up densities ranging from low, medium-low, and medium-high to high. Through statistical analyses, we found that changing remote sensing data resolution introduced noticeable uncertainties (variation) in forest carbon estimation at the neighborhood level. Higher uncertainties were caused by the change of LiDAR point density (causing 8.7-11.0% of variation) than by changing NAIP image resolution (causing 6.2-8.6% of variation). For both LiDAR and NAIP, urban neighborhoods with a higher degree of anthropogenic disturbance exhibited a higher level of uncertainty in carbon mapping. However, LiDAR-based results were more likely to be affected by landscape patch connectivity, and the NAIP-based estimation was found to be significantly influenced by the complexity of patch shape. Copyright © 2016 Elsevier Ltd. All rights reserved.
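The resolution effect the study measures can be illustrated with a toy raster experiment: coarsen a canopy-height grid by block averaging, then apply a nonlinear height-to-carbon allometry. The gamma-distributed heights and the `carbon` function below are invented for illustration and are not the study's LiDAR/NAIP workflow.

```python
# Toy demonstration of resolution-driven uncertainty: because the allometry
# is nonlinear (convex), averaging heights before converting to carbon
# biases the estimate (Jensen's inequality).
import numpy as np

rng = np.random.default_rng(1)
fine = rng.gamma(2.0, 4.0, size=(120, 120))  # synthetic canopy height (m) at 1 m cells

def coarsen(raster, factor):
    """Block-average to a coarser resolution (factor must divide the shape)."""
    h, w = raster.shape
    return raster.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def carbon(height):
    """Hypothetical nonlinear height-to-carbon allometry (kg per cell)."""
    return 0.5 * height ** 1.7

base = carbon(fine).sum()
for factor in (5, 10, 20):  # emulate 5 m, 10 m, 20 m pixels
    coarse = coarsen(fine, factor)
    est = carbon(coarse).sum() * factor ** 2  # rescale per-cell totals to the same area
    print(f"{factor} m: {100 * (est - base) / base:+.1f}% deviation")
```

Every coarsened estimate comes out below the fine-resolution baseline here, mirroring how resolution changes alone can introduce systematic variation into neighborhood-level carbon totals.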
Assessing Uncertainty in Expert Judgments About Natural Resources
David A. Cleaves
1994-01-01
Judgments are necessary in natural resources management, but uncertainty about these judgments should be assessed. When all judgments are rejected in the absence of hard data, valuable professional experience and knowledge are not utilized fully. The objective of assessing uncertainty is to get the best representation of knowledge and its bounds. Uncertainty...
Uncertainty and sensitivity assessment of flood risk assessments
NASA Astrophysics Data System (ADS)
de Moel, H.; Aerts, J. C.
2009-12-01
Floods are among the most frequent and costly natural disasters. In order to protect human lives and valuable assets from the effects of floods, many defensive structures have been built. Despite these efforts, economic losses due to catastrophic flood events have risen substantially during the past couple of decades because of continuing economic development in flood-prone areas. On top of that, climate change is expected to affect the magnitude and frequency of flood events. Because these trends are expected to continue, a transition can be observed in various countries from a protective flood management approach to a more risk-based flood management approach. In a risk-based approach, flood risk assessments play an important role in supporting decision making. Most flood risk assessments express flood risk in monetary terms (damage estimated for specific situations, or expected annual damage) in order to feed cost-benefit analyses of management measures. Such flood risk assessments contain, however, considerable uncertainties. These result from uncertainties in the many different input parameters, which propagate through the risk assessment and accumulate in the final estimate. Whilst common in some other disciplines, as with integrated assessment models, full uncertainty and sensitivity analyses of flood risk assessments are not so common. Various studies have addressed uncertainties regarding flood risk assessments, but have mainly focused on the hydrological conditions. However, uncertainties in other components of the risk assessment, like the relation between water depth and monetary damage, can be substantial as well. This research therefore tries to assess the uncertainties of all components of monetary flood risk assessments, using a Monte Carlo based approach. Furthermore, the total uncertainty is also attributed to the different input parameters using a variance-based sensitivity analysis.
Assessing and visualizing the uncertainties of the final risk estimate will help decision makers make better-informed decisions, and attributing this uncertainty to the input parameters identifies which parameters contribute most to the uncertainty in the final estimate and therefore deserve additional attention in further research.
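A Monte Carlo damage estimate combined with a variance-based sensitivity analysis of the kind described can be sketched as follows. The input distributions, the depth-damage function, and the binning estimator of the first-order sensitivity index are all illustrative assumptions, not the authors' parameterization:

```python
# Sketch: Monte Carlo flood-damage estimate plus first-order (Sobol-style)
# sensitivity indices, Var(E[Y|Xi]) / Var(Y), estimated by binning each input.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
depth = rng.lognormal(mean=0.2, sigma=0.4, size=n)    # flood depth (m), illustrative
dd_slope = rng.uniform(0.05, 0.25, size=n)            # damage fraction per m of depth
exposure = rng.normal(2.0e6, 4.0e5, size=n).clip(0)   # exposed asset value (EUR)

damage = np.minimum(dd_slope * depth, 1.0) * exposure  # damage fraction capped at 100%
print(f"expected event damage: {damage.mean():,.0f} EUR")

def first_order_index(x, y, bins=50):
    """Estimate Var(E[Y|X]) / Var(Y) by conditioning on quantile bins of X."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    which = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[which == b].mean() for b in range(bins)])
    counts = np.bincount(which, minlength=bins)
    return np.average((cond_means - y.mean()) ** 2, weights=counts) / y.var()

for name, x in [("depth", depth), ("dd_slope", dd_slope), ("exposure", exposure)]:
    print(f"S_{name} ~ {first_order_index(x, damage):.2f}")
```

The indices attribute the output variance to individual inputs, which is exactly the information the abstract argues decision makers need in order to see where further data collection pays off most.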
Efficient spatial privacy preserving scheme for sensor network
NASA Astrophysics Data System (ADS)
Debnath, Ashmita; Singaravelu, Pradheepkumar; Verma, Shekhar
2013-03-01
The privacy of sensitive events observed by a wireless sensor network (WSN) needs to be protected. Adversaries with knowledge of the sensor deployment and network protocols can infer the location of a sensed event by monitoring the communication from the sensors, even when the messages are encrypted. Encryption provides confidentiality; however, the context of the event can be used to breach the privacy of sensed objects. An adversary can track the trajectory of a moving object or determine the location of the occurrence of a critical event to breach its privacy. In this paper, we propose ring signatures to obfuscate the spatial information. First, we estimate the region within which an event of interest is located, as inferred from a sensor's communication. Then, we determine the increase in this region of spatial uncertainty due to the ring signature. We observe that ring signatures can effectively enlarge the region of location uncertainty of a sensed event. As the event of interest can be situated anywhere in the enlarged region of uncertainty, its privacy against a local or global adversary is ensured. Both analytical and simulation results show that the induced delay and throughput overhead are insignificant, with negligible impact on the performance of a WSN.
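The way a ring signature enlarges the adversary's region of location uncertainty can be illustrated with a toy geometric model: with an ordinary signature the adversary narrows the event to one sensor's sensing disk, while with a ring of k possible signers the uncertainty region grows toward the union of k disks. The field size, sensing radius, and Monte Carlo area estimate below are assumptions for illustration, not the paper's analysis.

```python
# Toy model: uncertainty region = union of the sensing disks of all
# candidate signers in the ring (area estimated by Monte Carlo).
import math
import random

random.seed(0)
field = 1000.0   # square field side (m), assumed
r = 50.0         # sensing radius (m), assumed
sensors = [(random.uniform(0, field), random.uniform(0, field)) for _ in range(100)]

def union_area(centers, radius, samples=100_000):
    """Monte Carlo estimate of the area of the union of sensing disks."""
    hits = 0
    for _ in range(samples):
        x, y = random.uniform(0, field), random.uniform(0, field)
        if any((x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2 for cx, cy in centers):
            hits += 1
    return hits / samples * field ** 2

single = math.pi * r ** 2
areas = []
for k in (1, 5, 10):
    ring = random.sample(sensors, k)  # k candidate signers
    areas.append(union_area(ring, r))
    print(f"ring size {k}: region ~ {areas[-1]:,.0f} m^2 "
          f"({areas[-1] / single:.1f}x one disk)")
```

With mostly non-overlapping sensors, the uncertainty region scales roughly linearly with ring size, which is the intuition behind trading larger rings (more signature overhead) for stronger location privacy.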