A method is presented and applied for evaluating an air quality model’s changes in pollutant concentrations stemming from changes in emissions while explicitly accounting for the uncertainties in the base emission inventory. Specifically, the Community Multiscale Air Quality (CMA...
Tsao, C-C; Campbell, J E; Mena-Carrasco, M; Spak, S N; Carmichael, G R; Chen, Y
2012-10-02
Although biofuels present an opportunity for renewable energy production, significant land-use change resulting from biofuels may contribute to negative environmental, economic, and social impacts. Here we examined non-GHG air pollution impacts from both indirect and direct land-use change caused by the anticipated expansion of Brazilian biofuels production. We synthesized information on fuel loading, combustion completeness, and emission factors, and developed a spatially explicit approach with uncertainty and sensitivity analyses to estimate air pollution emissions. The land-use change emissions, ranging from 6.7 to 26.4 Tg PM(2.5), were dominated by deforestation burning practices associated with indirect land-use change. We also found Brazilian sugar cane ethanol and soybean biodiesel including direct and indirect land-use change effects have much larger life-cycle emissions than conventional fossil fuels for six regulated air pollutants. The emissions magnitude and uncertainty decrease with longer life-cycle integration periods. Results are conditional to the single LUC scenario employed here. After LUC uncertainty, the largest source of uncertainty in LUC emissions stems from the combustion completeness during deforestation. While current biofuels cropland burning policies in Brazil seek to reduce life-cycle emissions, these policies do not address the large emissions caused by indirect land-use change.
Felton, Adam; Ranius, Thomas; Roberge, Jean-Michel; Öhman, Karin; Lämås, Tomas; Hynynen, Jari; Juutinen, Artti; Mönkkönen, Mikko; Nilsson, Urban; Lundmark, Tomas; Nordin, Annika
2017-07-15
A variety of modeling approaches can be used to project the future development of forest systems, and help to assess the implications of different management alternatives for biodiversity and ecosystem services. This diversity of approaches does however present both an opportunity and an obstacle for those trying to decide which modeling technique to apply, and interpreting the management implications of model output. Furthermore, the breadth of issues relevant to addressing key questions related to forest ecology, conservation biology, silviculture, economics, requires insights stemming from a number of distinct scientific disciplines. As forest planners, conservation ecologists, ecological economists and silviculturalists, experienced with modeling trade-offs and synergies between biodiversity and wood biomass production, we identified fifteen key considerations relevant to assessing the pros and cons of alternative modeling approaches. Specifically we identified key considerations linked to study question formulation, modeling forest dynamics, forest processes, study landscapes, spatial and temporal aspects, and the key response metrics - biodiversity and wood biomass production, as well as dealing with trade-offs and uncertainties. We also provide illustrative examples from the modeling literature stemming from the key considerations assessed. We use our findings to reiterate the need for explicitly addressing and conveying the limitations and uncertainties of any modeling approach taken, and the need for interdisciplinary research efforts when addressing the conservation of biodiversity and sustainable use of environmental resources. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Dietze, M.; Raiho, A.; Fer, I.; Dawson, A.; Heilman, K.; Hooten, M.; McLachlan, J. S.; Moore, D. J.; Paciorek, C. J.; Pederson, N.; Rollinson, C.; Tipton, J.
2017-12-01
The pre-industrial period serves as an essential baseline against which we judge anthropogenic impacts on the earth's systems. However, direct measurements of key biogeochemical processes, such as carbon, water, and nutrient cycling, are absent for this period and there is no direct way to link paleoecological proxies, such as pollen and tree rings, to these processes. Process-based terrestrial ecosystem models provide a way to make inferences about the past, but have large uncertainties and by themselves often fail to capture much of the observed variability. Here we investigate the ability to improve inferences about pre-industrial biogeochemical cycles through the formal assimilation of proxy data into multiple process-based models. A Tobit ensemble filter with explicit estimation of process error was run at five sites across the eastern US for three models (LINKAGES, ED2, LPJ-GUESS). In addition to process error, the ensemble accounted for parameter uncertainty, estimated through the assimilation of the TRY and BETY trait databases, and driver uncertainty, accommodated by probabilistically downscaling and debiasing CMIP5 GCM output then filtering based on paleoclimate reconstructions. The assimilation was informed by four PalEON data products, each of which includes an explicit Bayesian error estimate: (1) STEPPS forest composition estimated from fossil pollen; (2) REFAB aboveground biomass (AGB) estimated from fossil pollen; (3) tree ring AGB and woody net primary productivity (wNPP); and (4) public land survey composition, stem density, and AGB. By comparing ensemble runs with and without data assimilation we are able to assess the information contribution of the proxy data to constraining biogeochemical fluxes, which is driven by the combination of model uncertainty, data uncertainty, and the strength of correlation between observed and unobserved quantities in the model ensemble. To our knowledge this is the first attempt at multi-model data assimilation with terrestrial ecosystem models. Results from the data-model assimilation allow us to assess the consistency across models in post-assimilation inferences about indirectly inferred quantities, such as GPP, soil carbon, and the water budget.
Dudaniec, Rachael Y; Worthington Wilmer, Jessica; Hanson, Jeffrey O; Warren, Matthew; Bell, Sarah; Rhodes, Jonathan R
2016-01-01
Landscape genetics lacks explicit methods for dealing with the uncertainty in landscape resistance estimation, which is particularly problematic when sample sizes of individuals are small. Unless uncertainty can be quantified, valuable but small data sets may be rendered unusable for conservation purposes. We offer a method to quantify uncertainty in landscape resistance estimates using multimodel inference as an improvement over single model-based inference. We illustrate the approach empirically using co-occurring, woodland-preferring Australian marsupials within a common study area: two arboreal gliders (Petaurus breviceps, and Petaurus norfolcensis) and one ground-dwelling antechinus (Antechinus flavipes). First, we use maximum-likelihood and a bootstrap procedure to identify the best-supported isolation-by-resistance model out of 56 models defined by linear and non-linear resistance functions. We then quantify uncertainty in resistance estimates by examining parameter selection probabilities from the bootstrapped data. The selection probabilities provide estimates of uncertainty in the parameters that drive the relationships between landscape features and resistance. We then validate our method for quantifying uncertainty using simulated genetic and landscape data showing that for most parameter combinations it provides sensible estimates of uncertainty. We conclude that small data sets can be informative in landscape genetic analyses provided uncertainty can be explicitly quantified. Being explicit about uncertainty in landscape genetic models will make results more interpretable and useful for conservation decision-making, where dealing with uncertainty is critical. © 2015 John Wiley & Sons Ltd.
Explicit asymmetric bounds for robust stability of continuous and discrete-time systems
NASA Technical Reports Server (NTRS)
Gao, Zhiqiang; Antsaklis, Panos J.
1993-01-01
The problem of robust stability in linear systems with parametric uncertainties is considered. Explicit stability bounds on uncertain parameters are derived and expressed in terms of linear inequalities for continuous systems, and inequalities with quadratic terms for discrete-times systems. Cases where system parameters are nonlinear functions of an uncertainty are also examined.
Uncertainty-accounting environmental policy and management of water systems.
Baresel, Christian; Destouni, Georgia
2007-05-15
Environmental policies for water quality and ecosystem management do not commonly require explicit stochastic accounts of uncertainty and risk associated with the quantification and prediction of waterborne pollutant loads and abatement effects. In this study, we formulate and investigate a possible environmental policy that does require an explicit stochastic uncertainty account. We compare both the environmental and economic resource allocation performance of such an uncertainty-accounting environmental policy with that of deterministic, risk-prone and risk-averse environmental policies under a range of different hypothetical, yet still possible, scenarios. The comparison indicates that a stochastic uncertainty-accounting policy may perform better than deterministic policies over a range of different scenarios. Even in the absence of reliable site-specific data, reported literature values appear to be useful for such a stochastic account of uncertainty.
Watt, S; Shores, E A; Kinoshita, S
1999-07-01
Implicit and explicit memory were examined in individuals with severe traumatic brain injury (TBI) under conditions of full and divided attention. Participants included 12 individuals with severe TBI and 12 matched controls. In Experiment 1, participants carried out an implicit test of word-stem completion and an explicit test of cued recall. Results demonstrated that TBI participants exhibited impaired explicit memory but preserved implicit memory. In Experiment 2, a significant reduction in the explicit memory performance of both TBI and control participants, as well as a significant decrease in the implicit memory performance of TBI participants, was achieved by reducing attentional resources at encoding. These results indicated that performance on an implicit task of word-stem completion may require the availability of additional attentional resources that are not preserved after severe TBI.
ERIC Educational Resources Information Center
Rinke, Carol R.; Gladstone-Brown, Wendy; Kinlaw, C. Ryan; Cappiello, Jean
2016-01-01
Although science, technology, engineering, and mathematics (STEM) education sits at the center of a national conversation, comparatively little attention has been given to growing need for STEM teacher preparation, particularly at the elementary level. This study analyzes the outcomes of a novel, preservice STEM teacher education model. Building…
Averaging business cycles vs. myopia: Do we need a long term vision when developing IRP?
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDonald, C.; Gupta, P.C.
1995-05-01
Utility demand forecasting is inherently imprecise due to the number of uncertainties resulting from business cycles, policy making, technology breakthroughs, national and international political upheavals and the limitations of the forecasting tools. This implies that revisions based primarily on recent experience could lead to unstable forecasts. Moreover, new planning tools are required that provide an explicit consideration of uncertainty and lead to flexible and robust planning tools are required that provide an explicit consideration of uncertainty and lead to flexible and robust planning decisions.
Chonggang Xu; Hong S. He; Yuanman Hu; Yu Chang; Xiuzhen Li; Rencang Bu
2005-01-01
Geostatistical stochastic simulation is always combined with Monte Carlo method to quantify the uncertainty in spatial model simulations. However, due to the relatively long running time of spatially explicit forest models as a result of their complexity, it is always infeasible to generate hundreds or thousands of Monte Carlo simulations. Thus, it is of great...
Methods for handling uncertainty within pharmaceutical funding decisions
NASA Astrophysics Data System (ADS)
Stevenson, Matt; Tappenden, Paul; Squires, Hazel
2014-01-01
This article provides a position statement regarding decision making under uncertainty within the economic evaluation of pharmaceuticals, with a particular focus upon the National Institute for Health and Clinical Excellence context within England and Wales. This area is of importance as funding agencies have a finite budget from which to purchase a selection of competing health care interventions. The objective function generally used is that of maximising societal health with an explicit acknowledgement that there will be opportunity costs associated with purchasing a particular intervention. Three components of uncertainty are discussed within a pharmaceutical funding perspective: methodological uncertainty, parameter uncertainty and structural uncertainty, alongside a discussion of challenges that are particularly pertinent to health economic evaluation. The discipline has focused primarily on handling methodological and parameter uncertainty and a clear reference case has been developed for consistency across evaluations. However, uncertainties still remain. Less attention has been given to methods for handling structural uncertainty. The lack of adequate methods to explicitly incorporate this aspect of model development may result in the true uncertainty surrounding health care investment decisions being underestimated. Research in this area is ongoing as we review.
Uncertainty in spatially explicit animal dispersal models
Mooij, Wolf M.; DeAngelis, Donald L.
2003-01-01
Uncertainty in estimates of survival of dispersing animals is a vexing difficulty in conservation biology. The current notion is that this uncertainty decreases the usefulness of spatially explicit population models in particular. We examined this problem by comparing dispersal models of three levels of complexity: (1) an event-based binomial model that considers only the occurrence of mortality or arrival, (2) a temporally explicit exponential model that employs mortality and arrival rates, and (3) a spatially explicit grid-walk model that simulates the movement of animals through an artificial landscape. Each model was fitted to the same set of field data. A first objective of the paper is to illustrate how the maximum-likelihood method can be used in all three cases to estimate the means and confidence limits for the relevant model parameters, given a particular set of data on dispersal survival. Using this framework we show that the structure of the uncertainty for all three models is strikingly similar. In fact, the results of our unified approach imply that spatially explicit dispersal models, which take advantage of information on landscape details, suffer less from uncertainly than do simpler models. Moreover, we show that the proposed strategy of model development safeguards one from error propagation in these more complex models. Finally, our approach shows that all models related to animal dispersal, ranging from simple to complex, can be related in a hierarchical fashion, so that the various approaches to modeling such dispersal can be viewed from a unified perspective.
Lin, Huiyan; Liang, Jiafeng; Jin, Hua; Zhao, Dongmei
2018-07-01
Previous studies have investigated whether uncertainty influences neural responses to emotional events. The findings of such studies, particularly with respect to event-related potentials (ERPs), have been controversial due to several factors, such as the stimuli that serve as cues and the emotional content of the events. However, it is still unknown whether the effects of uncertainty on ERP responses to emotional events are influenced by anticipation patterns (e.g., explicit or implicit anticipation). To address this issue, participants in the present study were presented with anticipatory cues and then emotional (negative and neutral) pictures. The cues either did or did not signify the emotional content of the upcoming picture. In the inter-stimulus intervals between cues and pictures, participants were asked to estimate the expected probability of the occurrence of a specific emotional category of the subsequent picture based on a scale in the explicit anticipation condition, while in the implicit condition, participants were asked to indicate, using a number on a scale, which color was different from the others. The results revealed that in the explicit condition, uncertainty increased late positive potential (LPP) responses, particularly for negative pictures, whereas LPP responses were larger for certain negative pictures than for uncertain negative pictures in the implicit condition. The findings in the present study suggest that the anticipation pattern influences the effects of uncertainty when evaluation of negative events. Copyright © 2018 Elsevier B.V. All rights reserved.
An exploration of Intolerance of Uncertainty and memory bias.
Francis, Kylie; Dugas, Michel J; Ricard, Nathalie C
2016-09-01
Research suggests that individuals high in Intolerance of Uncertainty (IU) have information processing biases, which may explain the close relationship between IU and worry. Specifically, high IU individuals show an attentional bias for uncertainty, and negatively interpret uncertain information. However, evidence of a memory bias for uncertainty among high IU individuals is limited. This study therefore explored the relationship between IU and memory for uncertainty. In two separate studies, explicit and implicit memory for uncertain compared to other types of words was assessed. Cognitive avoidance and other factors that could influence information processing were also examined. IUS Factor 1 was a significant positive predictor of explicit memory for positive words, and IUS Factor 2 a significant negative predictor of implicit memory for positive words. Stimulus relevance and vocabulary were significant predictors of implicit memory for uncertain words. Cognitive avoidance was a significant predictor of both explicit and implicit memory for threat words. Female gender was a significant predictor of implicit memory for uncertain and neutral words. Word stimuli such as those used in these studies may not be the optimal way of assessing information processing biases related to IU. In addition, the predominantly female, largely student sample may limit the generalizability of the findings. Future research focusing on IU factors, stimulus relevance, and both explicit and implicit memory, was recommended. The potential role of cognitive avoidance on memory, information processing, and worry was explored. Copyright © 2016 Elsevier Ltd. All rights reserved.
Sengupta, Aritra; Foster, Scott D.; Patterson, Toby A.; Bravington, Mark
2012-01-01
Data assimilation is a crucial aspect of modern oceanography. It allows the future forecasting and backward smoothing of ocean state from the noisy observations. Statistical methods are employed to perform these tasks and are often based on or related to the Kalman filter. Typically Kalman filters assumes that the locations associated with observations are known with certainty. This is reasonable for typical oceanographic measurement methods. Recently, however an alternative and abundant source of data comes from the deployment of ocean sensors on marine animals. This source of data has some attractive properties: unlike traditional oceanographic collection platforms, it is relatively cheap to collect, plentiful, has multiple scientific uses and users, and samples areas of the ocean that are often difficult of costly to sample. However, inherent uncertainty in the location of the observations is a barrier to full utilisation of animal-borne sensor data in data-assimilation schemes. In this article we examine this issue and suggest a simple approximation to explicitly incorporate the location uncertainty, while staying in the scope of Kalman-filter-like methods. The approximation stems from a Taylor-series approximation to elements of the updating equation. PMID:22900005
USDA-ARS?s Scientific Manuscript database
The importance of measurement uncertainty in terms of calculation of model evaluation error statistics has been recently stated in the literature. The impact of measurement uncertainty on calibration results indicates the potential vague zone in the field of watershed modeling where the assumption ...
Varga, Nicole L.; Bauer, Patricia J.
2013-01-01
The present research was an investigation of the effect of delay on self-generation and retention of knowledge derived through integration by 6-year-old children. Children were presented with novel facts from passages read aloud to them (stem facts) and tested for self-generation of new knowledge through integration of the facts. In Experiment 1, children integrated the stem facts at Session 1 and retained the self-generated memory traces over 1 week. In Experiment 2, 1-week delays were imposed either between the to-be-integrated facts (between-stem delay) or after the stem facts but before the test (before-test delay). Integration performance was diminished in both conditions. Moreover, memory for individual stem facts was lower in Experiment 2 than in Experiment 1, suggesting that self-generation through integration promoted memory for explicitly taught information. The results indicate the importance of tests for promoting self-generation through integration as well as for retaining newly self-generated and explicitly taught information. PMID:23563162
Embryo futures and stem cell research: the management of informed uncertainty
Ehrich, Kathryn; Williams, Clare; Farsides, Bobbie; Scott, Rosamund
2012-01-01
In the social worlds of assisted conception and stem cell science, uncertainties proliferate and particular framings of the future may be highly strategic. In this article we explore meanings and articulations of the future using data from our study of ethical and social issues implicated by the donation of embryos to human embryonic stem cell research in three linked assisted conception units and stem cell laboratories in the UK. Framings of the future in this field inform the professional management of uncertainty and we explore some of the tensions this involves in practice. The bifurcation of choices for donating embryos into accepting informed uncertainty or not donating at all was identified through the research process of interviews and ethics discussion groups. Professional staff accounts in this study contained moral orientations that valued ideas such as engendering patient trust by offering full information, the sense of collective ownership of the National Heath Service and publicly funded science and ideas for how donors might be able to give restricted consent as a third option. PMID:21812792
Facing uncertainty in ecosystem services-based resource management.
Grêt-Regamey, Adrienne; Brunner, Sibyl H; Altwegg, Jürg; Bebi, Peter
2013-09-01
The concept of ecosystem services is increasingly used as a support for natural resource management decisions. While the science for assessing ecosystem services is improving, appropriate methods to address uncertainties in a quantitative manner are missing. Ignoring parameter uncertainties, modeling uncertainties and uncertainties related to human-environment interactions can modify decisions and lead to overlooking important management possibilities. In this contribution, we present a new approach for mapping the uncertainties in the assessment of multiple ecosystem services. The spatially explicit risk approach links Bayesian networks to a Geographic Information System for forecasting the value of a bundle of ecosystem services and quantifies the uncertainties related to the outcomes in a spatially explicit manner. We demonstrate that mapping uncertainties in ecosystem services assessments provides key information for decision-makers seeking critical areas in the delivery of ecosystem services in a case study in the Swiss Alps. The results suggest that not only the total value of the bundle of ecosystem services is highly dependent on uncertainties, but the spatial pattern of the ecosystem services values changes substantially when considering uncertainties. This is particularly important for the long-term management of mountain forest ecosystems, which have long rotation stands and are highly sensitive to pressing climate and socio-economic changes. Copyright © 2012 Elsevier Ltd. All rights reserved.
Scientific Uncertainty and Its Relevance to Science Education
ERIC Educational Resources Information Center
Ruggeri, Nancy Lee
2011-01-01
Uncertainty is inherent to scientific methods and practices, yet is it rarely explicitly discussed in science classrooms. Ironically, science is often equated with "certainty" in these contexts. Uncertainties that arise in science deserve special attention, as they are increasingly a part of public discussions and are susceptible to manipulation.…
Couvreur, Valentin; Ledder, Glenn; Manzoni, Stefano; Way, Danielle A; Muller, Erik B; Russo, Sabrina E
2018-05-08
Trees grow by vertically extending their stems, so accurate stem hydraulic models are fundamental to understanding the hydraulic challenges faced by tall trees. Using a literature survey, we showed that many tree species exhibit continuous vertical variation in hydraulic traits. To examine the effects of this variation on hydraulic function, we developed a spatially-explicit, analytical water transport model for stems. Our model allows Huber ratio, stem-saturated conductivity, pressure at 50% loss of conductivity, leaf area, and transpiration rate to vary continuously along the hydraulic path. Predictions from our model differ from a matric flux potential model parameterized with uniform traits. Analyses show that cavitation is a whole-stem emergent property resulting from nonlinear pressure-conductivity feedbacks that, with gravity, cause impaired water transport to accumulate along the path. Because of the compounding effects of vertical trait variation on hydraulic function, growing proportionally more sapwood and building tapered xylem with height, as well as reducing xylem vulnerability only at branch tips while maintaining transport capacity at the stem base, can compensate for these effects. We therefore conclude that the adaptive significance of vertical variation in stem hydraulic traits is to allow trees to grow tall and tolerate operating near their hydraulic limits. This article is protected by copyright. All rights reserved.
Uncertainty Forecasts Improve Weather-Related Decisions and Attenuate the Effects of Forecast Error
ERIC Educational Resources Information Center
Joslyn, Susan L.; LeClerc, Jared E.
2012-01-01
Although uncertainty is inherent in weather forecasts, explicit numeric uncertainty estimates are rarely included in public forecasts for fear that they will be misunderstood. Of particular concern are situations in which precautionary action is required at low probabilities, often the case with severe events. At present, a categorical weather…
ERIC Educational Resources Information Center
Wong, Sissy S.; Firestone, Jonah B.; Ronduen, Lionnel G.; Bang, EunJin
2016-01-01
Science, Technology, Engineering, and Mathematics (STEM) education has become one of the main priorities in the United States. Science education communities and researchers advocate for integration of STEM disciplines throughout the teaching curriculum. This requires teacher knowledge in STEM disciplines, as well as competence in scientific…
The characters of Palaeozoic jawed vertebrates
Brazeau, Martin D; Friedman, Matt
2014-01-01
Newly discovered fossils from the Silurian and Devonian periods are beginning to challenge embedded perceptions about the origin and early diversification of jawed vertebrates (gnathostomes). Nevertheless, an explicit cladistic framework for the relationships of these fossils relative to the principal crown lineages of the jawed vertebrates (osteichthyans: bony fishes and tetrapods; chondrichthyans: sharks, batoids, and chimaeras) remains elusive. We critically review the systematics and character distributions of early gnathostomes and provide a clearly stated hierarchy of synapomorphies covering the jaw-bearing stem gnathostomes and osteichthyan and chondrichthyan stem groups. We show that character lists, designed to support the monophyly of putative groups, tend to overstate their strength and lack cladistic corroboration. By contrast, synapomorphic hierarchies are more open to refutation and must explicitly confront conflicting evidence. Our proposed synapomorphy scheme is used to evaluate the status of the problematic fossil groups Acanthodii and Placodermi, and suggest profitable avenues for future research. We interpret placoderms as a paraphyletic array of stem-group gnathostomes, and suggest what we regard as two equally plausible placements of acanthodians: exclusively on the chondrichthyan stem, or distributed on both the chondrichthyan and osteichthyan stems. PMID:25750460
Parameter and uncertainty estimation for mechanistic, spatially explicit epidemiological models
NASA Astrophysics Data System (ADS)
Finger, Flavio; Schaefli, Bettina; Bertuzzo, Enrico; Mari, Lorenzo; Rinaldo, Andrea
2014-05-01
Epidemiological models can be a crucially important tool for decision-making during disease outbreaks. The range of possible applications spans from real-time forecasting and allocation of health-care resources to testing alternative intervention mechanisms such as vaccines, antibiotics or the improvement of sanitary conditions. Our spatially explicit, mechanistic models for cholera epidemics have been successfully applied to several epidemics including, the one that struck Haiti in late 2010 and is still ongoing. Calibration and parameter estimation of such models represents a major challenge because of properties unusual in traditional geoscientific domains such as hydrology. Firstly, the epidemiological data available might be subject to high uncertainties due to error-prone diagnosis as well as manual (and possibly incomplete) data collection. Secondly, long-term time-series of epidemiological data are often unavailable. Finally, the spatially explicit character of the models requires the comparison of several time-series of model outputs with their real-world counterparts, which calls for an appropriate weighting scheme. It follows that the usual assumption of a homoscedastic Gaussian error distribution, used in combination with classical calibration techniques based on Markov chain Monte Carlo algorithms, is likely to be violated, whereas the construction of an appropriate formal likelihood function seems close to impossible. Alternative calibration methods, which allow for accurate estimation of total model uncertainty, particularly regarding the envisaged use of the models for decision-making, are thus needed. Here we present the most recent developments regarding methods for parameter and uncertainty estimation to be used with our mechanistic, spatially explicit models for cholera epidemics, based on informal measures of goodness of fit.
Comparing the STEMS and AFIS growth models with respect to the uncertainty of predictions
Ronald E. McRoberts; Margaret R. Holdaway; Veronica C. Lessard
2000-01-01
The uncertainty in 5-, 10-, and 20-year diameter growth predictions is estimated using Monte Carlo simulations for four Lake States tree species. Two sets of diameter growth models are used: recalibrations of the STEMS models using forest inventory and analysis data, and new growth models developed as a component of an annual forest inventory system for the North...
NASA Technical Reports Server (NTRS)
Sotiropoulou, Rafaella-Eleni P.; Nenes, Athanasios; Adams, Peter J.; Seinfeld, John H.
2007-01-01
In situ observations of aerosol and cloud condensation nuclei (CCN) and the GISS GCM Model II' with an online aerosol simulation and explicit aerosol-cloud interactions are used to quantify the uncertainty in radiative forcing and autoconversion rate from application of Kohler theory. Simulations suggest that application of Koehler theory introduces a 10-20% uncertainty in global average indirect forcing and 2-11% uncertainty in autoconversion. Regionally, the uncertainty in indirect forcing ranges between 10-20%, and 5-50% for autoconversion. These results are insensitive to the range of updraft velocity and water vapor uptake coefficient considered. This study suggests that Koehler theory (as implemented in climate models) is not a significant source of uncertainty for aerosol indirect forcing but can be substantial for assessments of aerosol effects on the hydrological cycle in climatically sensitive regions of the globe. This implies that improvements in the representation of GCM subgrid processes and aerosol size distribution will mostly benefit indirect forcing assessments. Predictions of autoconversion, by nature, will be subject to considerable uncertainty; its reduction may require explicit representation of size-resolved aerosol composition and mixing state.
Embracing uncertainty in applied ecology.
Milner-Gulland, E J; Shea, K
2017-12-01
Applied ecologists often face uncertainty that hinders effective decision-making.Common traps that may catch the unwary are: ignoring uncertainty, acknowledging uncertainty but ploughing on, focussing on trivial uncertainties, believing your models, and unclear objectives.We integrate research insights and examples from a wide range of applied ecological fields to illustrate advances that are generally underused, but could facilitate ecologists' ability to plan and execute research to support management.Recommended approaches to avoid uncertainty traps are: embracing models, using decision theory, using models more effectively, thinking experimentally, and being realistic about uncertainty. Synthesis and applications . Applied ecologists can become more effective at informing management by using approaches that explicitly take account of uncertainty.
Isendahl, Nicola; Dewulf, Art; Pahl-Wostl, Claudia
2010-01-01
By now, the need for addressing uncertainty in the management of water resources is widely recognized, yet there is little expertise and experience how to effectively deal with uncertainty in practice. Uncertainties in water management practice so far are mostly dealt with intuitively or based on experience. That way decisions can be quickly taken but analytic processes of deliberate reasoning are bypassed. To meet the desire of practitioners for better guidance and tools how to deal with uncertainty more practice-oriented systematic approaches are needed. For that purpose we consider it important to understand how practitioners frame uncertainties. In this paper we present an approach where water managers developed criteria of relevance to understand and address uncertainties. The empirical research took place in the Doñana region of the Guadalquivir estuary in southern Spain making use of the method of card sorting. Through the card sorting exercise a broad range of criteria to make sense of and describe uncertainties was produced by different subgroups, which were then merged into a shared list of criteria. That way framing differences were made explicit and communication on uncertainty and on framing differences was enhanced. In that, the present approach constitutes a first step to enabling reframing and overcoming framing differences, which are important features on the way to robust decision-making. Moreover, the elaborated criteria build a basis for the development of more structured approaches to deal with uncertainties in water management practice. Copyright 2009 Elsevier Ltd. All rights reserved.
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We access the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of landslide susceptibility model is decomposed and attributed to model's criteria weights.
NASA Astrophysics Data System (ADS)
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We access the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of landslide susceptibility model is decomposed and attributed to model's criteria weights.
Oldenkamp, Rik; Huijbregts, Mark A J; Ragas, Ad M J
2016-05-01
The selection of priority APIs (Active Pharmaceutical Ingredients) can benefit from a spatially explicit approach, since an API might exceed the threshold of environmental concern in one location, while staying below that same threshold in another. However, such a spatially explicit approach is relatively data intensive and subject to parameter uncertainty due to limited data. This raises the question to what extent a spatially explicit approach for the environmental prioritisation of APIs remains worthwhile when accounting for uncertainty in parameter settings. We show here that the inclusion of spatially explicit information enables a more efficient environmental prioritisation of APIs in Europe, compared with a non-spatial EU-wide approach, also under uncertain conditions. In a case study with nine antibiotics, uncertainty distributions of the PAF (Potentially Affected Fraction) of aquatic species were calculated in 100∗100km(2) environmental grid cells throughout Europe, and used for the selection of priority APIs. Two APIs have median PAF values that exceed a threshold PAF of 1% in at least one environmental grid cell in Europe, i.e., oxytetracycline and erythromycin. At a tenfold lower threshold PAF (i.e., 0.1%), two additional APIs would be selected, i.e., cefuroxime and ciprofloxacin. However, in 94% of the environmental grid cells in Europe, no APIs exceed either of the thresholds. This illustrates the advantage of following a location-specific approach in the prioritisation of APIs. This added value remains when accounting for uncertainty in parameter settings, i.e., if the 95th percentile of the PAF instead of its median value is compared with the threshold. In 96% of the environmental grid cells, the location-specific approach still enables a reduction of the selection of priority APIs of at least 50%, compared with a EU-wide prioritisation. Copyright © 2016 Elsevier Ltd. All rights reserved.
Condition trees as a mechanism for communicating the meaning of uncertainties
NASA Astrophysics Data System (ADS)
Beven, Keith
2015-04-01
Uncertainty communication for environmental problems is fraught with difficulty for good epistemic reasons. The fact that most sources of uncertainty are subject to, and often dominated by, epistemic uncertainties means that the unthinking use of probability theory might actually be misleading and lead to false inference (even in some cases where the assumptions of a probabilistic error model might seem to be reasonably valid). This therefore creates problems in communicating the meaning of probabilistic uncertainties of model predictions to potential users (there are many examples in hydrology, hydraulics, climate change and other domains). It is suggested that one way of being more explicit about the meaning of uncertainties is to associate each type of application with a condition tree of assumptions that need to be made in producing an estimate of uncertainty. The condition tree then provides a basis for discussion and communication of assumptions about uncertainties with users. Agreement of assumptions (albeit generally at some institutional level) will provide some buy-in on the part of users, and a basis for commissioning of future studies. Even in some relatively well-defined problems, such as mapping flood risk, such a condition tree can be rather extensive, but by making each step in the tree explicit then an audit trail is established for future reference. This can act to provide focus in the exercise of agreeing more realistic assumptions.
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-01-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We access the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of landslide susceptibility model is decomposed and attributed to model's criteria weights. PMID:25843987
The properties of retrieval cues constrain the picture superiority effect.
Weldon, M S; Roediger, H L; Challis, B H
1989-01-01
In three experiments, we examined why pictures are remembered better than words on explicit memory tests like recall and recognition, whereas words produce more priming than pictures on some implicit tests, such as word-fragment and word-stem completion (e.g., completing -l-ph-nt or ele----- as elephant). One possibility is that pictures are always more accessible than words if subjects are given explicit retrieval instructions. An alternative possibility is that the properties of the retrieval cues themselves constrain the retrieval processes engaged; word fragments might induce data-driven (perceptually based) retrieval, which favors words regardless of the retrieval instructions. Experiment 1 demonstrated that words were remembered better than pictures on both the word-fragment and word-stem completion tasks under both implicit and explicit retrieval conditions. In Experiment 2, pictures were recalled better than words with semantically related extralist cues. In Experiment 3, when semantic cues were combined with word fragments, pictures and words were recalled equally well under explicit retrieval conditions, but words were superior to pictures under implicit instructions. Thus, the inherently data-limited properties of fragmented words limit their use in accessing conceptual codes. Overall, the results indicate that retrieval operations are largely determined by properties of the retrieval cues under both implicit and explicit retrieval conditions.
Implicit and Explicit Knowledge of Korean Learners in the Philippines across Contextual Shift
ERIC Educational Resources Information Center
Cruz, Selwyn A.; Pariña, Jose Cristina M.
2017-01-01
Stemming from issue of migration for education, the study explored the language learning experience of Korean university students who come to the Philippines for education. Specifically, it documented the changes in the students' implicit and explicit knowledge that occurred in the preactional until the actional phase of their learning journey in…
DEVELOPMENTS AT U.S. EPA IN ADDRESSING UNCERTAINTY IN RISK ASSESSMENT
An emerging trend in risk assessment is to be more explicit about uncertainties, both during the analytical procedures and in communicating results. In February 1 992, then-Deputy EPA Administrator Henry Habicht set out Agency goals in a memorandum stating that the Agency will "p...
Funnel Libraries for Real-Time Robust Feedback Motion Planning
2016-07-21
motion plans for a robot that are guaranteed to suc- ceed despite uncertainty in the environment, parametric model uncertainty, and disturbances...resulting funnel library is then used to sequentially compose motion plans at runtime while ensuring the safety of the robot . A major advantage of...the work presented here is that by explicitly taking into account the effect of uncertainty, the robot can evaluate motion plans based on how vulnerable
Uncertainty and risk in wildland fire management: A review
Matthew P. Thompson; Dave E. Calkin
2011-01-01
Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to...
NASA Astrophysics Data System (ADS)
Pianosi, Francesca; Lal Shrestha, Durga; Solomatine, Dimitri
2010-05-01
This research presents an extension of UNEEC (Uncertainty Estimation based on Local Errors and Clustering, Shrestha and Solomatine, 2006, 2008 & Solomatine and Shrestha, 2009) method in the direction of explicit inclusion of parameter uncertainty. UNEEC method assumes that there is an optimal model and the residuals of the model can be used to assess the uncertainty of the model prediction. It is assumed that all sources of uncertainty including input, parameter and model structure uncertainty are explicitly manifested in the model residuals. In this research, theses assumptions are relaxed, and the UNEEC method is extended to consider parameter uncertainty as well (abbreviated as UNEEC-P). In UNEEC-P, first we use Monte Carlo (MC) sampling in parameter space to generate N model realizations (each of which is a time series), estimate the prediction quantiles based on the empirical distribution functions of the model residuals considering all the residual realizations, and only then apply the standard UNEEC method that encapsulates the uncertainty of a hydrologic model (expressed by quantiles of the error distribution) in a machine learning model (e.g., ANN). UNEEC-P is applied first to a linear regression model of synthetic data, and then to a real case study of forecasting inflow to lake Lugano in northern Italy. The inflow forecasting model is a stochastic heteroscedastic model (Pianosi and Soncini-Sessa, 2009). The preliminary results show that the UNEEC-P method produces wider uncertainty bounds, which is consistent with the fact that the method considers also parameter uncertainty of the optimal model. In the future UNEEC method will be further extended to consider input and structure uncertainty which will provide more realistic estimation of model predictions.
Bell, David M; Ward, Eric J; Oishi, A Christopher; Oren, Ram; Flikkema, Paul G; Clark, James S
2015-07-01
Uncertainties in ecophysiological responses to environment, such as the impact of atmospheric and soil moisture conditions on plant water regulation, limit our ability to estimate key inputs for ecosystem models. Advanced statistical frameworks provide coherent methodologies for relating observed data, such as stem sap flux density, to unobserved processes, such as canopy conductance and transpiration. To address this need, we developed a hierarchical Bayesian State-Space Canopy Conductance (StaCC) model linking canopy conductance and transpiration to tree sap flux density from a 4-year experiment in the North Carolina Piedmont, USA. Our model builds on existing ecophysiological knowledge, but explicitly incorporates uncertainty in canopy conductance, internal tree hydraulics and observation error to improve estimation of canopy conductance responses to atmospheric drought (i.e., vapor pressure deficit), soil drought (i.e., soil moisture) and above canopy light. Our statistical framework not only predicted sap flux observations well, but it also allowed us to simultaneously gap-fill missing data as we made inference on canopy processes, marking a substantial advance over traditional methods. The predicted and observed sap flux data were highly correlated (mean sensor-level Pearson correlation coefficient = 0.88). Variations in canopy conductance and transpiration associated with environmental variation across days to years were many times greater than the variation associated with model uncertainties. Because some variables, such as vapor pressure deficit and soil moisture, were correlated at the scale of days to weeks, canopy conductance responses to individual environmental variables were difficult to interpret in isolation. Still, our results highlight the importance of accounting for uncertainty in models of ecophysiological and ecosystem function where the process of interest, canopy conductance in this case, is not observed directly. The StaCC modeling framework provides a statistically coherent approach to estimating canopy conductance and transpiration and propagating estimation uncertainty into ecosystem models, paving the way for improved prediction of water and carbon uptake responses to environmental change. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Predictive uncertainty in auditory sequence processing
Hansen, Niels Chr.; Pearce, Marcus T.
2014-01-01
Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty—a property of listeners' prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music. PMID:25295018
Fostering pre-service teachers' views about nature of science: evaluation of a new STEM curriculum
NASA Astrophysics Data System (ADS)
Krell, Moritz; Koska, Johannes; Penning, Fenna; Krüger, Dirk
2015-09-01
Background: An elaborated understanding of Nature of Science (NOS) is seen as an important part of scientific literacy. In order to enable teachers to adequately discuss NOS in their lessons, various approaches have recently been employed to improve teachers' understanding of NOS. Purpose: This study investigated the effect of participating in a newly developed Science, Technology, Engineering and Mathematics (STEM) curriculum at the Freie Universität Berlin (Germany) on pre-service teachers' NOS views. Program description: In the new STEM curriculum, two versions of explicitly teaching NOS, which are discussed in the literature, have been adopted: the pre-service teachers explicitly reflect upon nature and history of science (version one) as well as conduct own scientific investigations (version two). Sample: N = 76 pre-service teachers from different semester levels (cross-sectional study) who participated in the new STEM curriculum took part in this study (intervention group). As control groups, students who did not partake in the new curriculum participated (pre-service primary (N = 134), science (N = 198), and no-science (N = 161) teachers). Design and methods: In order to allow an economic assessment, a testing instrument with closed-item formats was developed to assess the respondents' views about six NOS aspects. Results: The intervention group shows significantly more elaborated NOS views than a relevant control group (p < .01, g = .48). Additionally, a one-way ANOVA reveals a positive effect of semester level on NOS views for the intervention group (p < .01; η² = .16) but not for the control groups. Conclusion: The findings support evidence suggesting that explicit approaches are effective when fostering an informed understanding of NOS. More specifically, a sequence of both versions of explicitly teaching NOS discussed in the literature seems to be a way to successfully promote pre-service teachers' NOS understanding.
Vuust, Peter; Pearce, Marcus
2016-01-01
Musical expertise entails meticulous stylistic specialisation and enculturation. Even so, research on musical training effects has focused on generalised comparisons between musicians and non-musicians, and cross-cultural work addressing specialised expertise has traded cultural specificity and sensitivity for other methodological limitations. This study aimed to experimentally dissociate the effects of specialised stylistic training and general musical expertise on the perception of melodies. Non-musicians and professional musicians specialising in classical music or jazz listened to sampled renditions of saxophone solos improvised by Charlie Parker in the bebop style. Ratings of explicit uncertainty and expectedness for different continuations of each melodic excerpt were collected. An information-theoretic model of expectation enabled selection of stimuli affording highly certain continuations in the bebop style, but highly uncertain continuations in the context of general tonal expectations, and vice versa. The results showed that expert musicians have acquired probabilistic characteristics of music influencing their experience of expectedness and predictive uncertainty. While classical musicians had internalised key aspects of the bebop style implicitly, only jazz musicians’ explicit uncertainty ratings reflected the computational estimates, and jazz-specific expertise modulated the relationship between explicit and inferred uncertainty data. In spite of this, there was no evidence that non-musicians and classical musicians used a stylistically irrelevant cognitive model of general tonal music providing support for the theory of cognitive firewalls between stylistic models in predictive processing of music. PMID:27732612
Hansen, Niels Chr; Vuust, Peter; Pearce, Marcus
2016-01-01
Musical expertise entails meticulous stylistic specialisation and enculturation. Even so, research on musical training effects has focused on generalised comparisons between musicians and non-musicians, and cross-cultural work addressing specialised expertise has traded cultural specificity and sensitivity for other methodological limitations. This study aimed to experimentally dissociate the effects of specialised stylistic training and general musical expertise on the perception of melodies. Non-musicians and professional musicians specialising in classical music or jazz listened to sampled renditions of saxophone solos improvised by Charlie Parker in the bebop style. Ratings of explicit uncertainty and expectedness for different continuations of each melodic excerpt were collected. An information-theoretic model of expectation enabled selection of stimuli affording highly certain continuations in the bebop style, but highly uncertain continuations in the context of general tonal expectations, and vice versa. The results showed that expert musicians have acquired probabilistic characteristics of music influencing their experience of expectedness and predictive uncertainty. While classical musicians had internalised key aspects of the bebop style implicitly, only jazz musicians' explicit uncertainty ratings reflected the computational estimates, and jazz-specific expertise modulated the relationship between explicit and inferred uncertainty data. In spite of this, there was no evidence that non-musicians and classical musicians used a stylistically irrelevant cognitive model of general tonal music providing support for the theory of cognitive firewalls between stylistic models in predictive processing of music.
Drought stress and tree size determine stem CO2 efflux in a tropical forest.
Rowland, Lucy; da Costa, Antonio C L; Oliveira, Alex A R; Oliveira, Rafael S; Bittencourt, Paulo L; Costa, Patricia B; Giles, Andre L; Sosa, Azul I; Coughlin, Ingrid; Godlee, John L; Vasconcelos, Steel S; Junior, João A S; Ferreira, Leandro V; Mencuccini, Maurizio; Meir, Patrick
2018-06-01
CO 2 efflux from stems (CO 2_stem ) accounts for a substantial fraction of tropical forest gross primary productivity, but the climate sensitivity of this flux remains poorly understood. We present a study of tropical forest CO 2_stem from 215 trees across wet and dry seasons, at the world's longest running tropical forest drought experiment site. We show a 27% increase in wet season CO 2_stem in the droughted forest relative to a control forest. This was driven by increasing CO 2_stem in trees 10-40 cm diameter. Furthermore, we show that drought increases the proportion of maintenance to growth respiration in trees > 20 cm diameter, including large increases in maintenance respiration in the largest droughted trees, > 40 cm diameter. However, we found no clear taxonomic influence on CO 2_stem and were unable to accurately predict how drought sensitivity altered ecosystem scale CO 2_stem , due to substantial uncertainty introduced by contrasting methods previously employed to scale CO 2_stem fluxes. Our findings indicate that under future scenarios of elevated drought, increases in CO 2_stem may augment carbon losses, weakening or potentially reversing the tropical forest carbon sink. However, due to substantial uncertainties in scaling CO 2_stem fluxes, stand-scale future estimates of changes in stem CO 2 emissions remain highly uncertain. © 2018 The Authors New Phytologist © 2018 New Phytologist Trust.
Forest Management Under Uncertainty for Multiple Bird Population Objectives
Clinton T. Moore; W. Todd Plummer; Michael J. Conroy
2005-01-01
We advocate adaptive programs of decision making and monitoring for the management of forest birds when responses by populations to management, and particularly management trade-offs among populations, are uncertain. Models are necessary components of adaptive management. Under this approach, uncertainty about the behavior of a managed system is explicitly captured in...
Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisy, Dennis M.
2018-01-01
Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes-of-death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause-of-death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause-of-death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results. We demonstrate that model selection results changed between the two approaches, and incorporating observer knowledge in cause-of-death increased the variability associated with parameter estimates when compared to the traditional approach. These differences between the two approaches can impact reported results, and therefore, it is critical to explicitly incorporate expert knowledge in statistical methods to ensure rigorous inference.
Uncertainty quantification of effective nuclear interactions
Pérez, R. Navarro; Amaro, J. E.; Arriola, E. Ruiz
2016-03-02
We give a brief review on the development of phenomenological NN interactions and the corresponding quanti cation of statistical uncertainties. We look into the uncertainty of effective interactions broadly used in mean eld calculations through the Skyrme parameters and effective eld theory counter-terms by estimating both statistical and systematic uncertainties stemming from the NN interaction. We also comment on the role played by different tting strategies on the light of recent developments.
Uncertainty quantification of effective nuclear interactions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pérez, R. Navarro; Amaro, J. E.; Arriola, E. Ruiz
We give a brief review on the development of phenomenological NN interactions and the corresponding quanti cation of statistical uncertainties. We look into the uncertainty of effective interactions broadly used in mean eld calculations through the Skyrme parameters and effective eld theory counter-terms by estimating both statistical and systematic uncertainties stemming from the NN interaction. We also comment on the role played by different tting strategies on the light of recent developments.
NASA Astrophysics Data System (ADS)
Beatrici, Anderson; Santos Baptista, Leandra; Mauro Granjeiro, José
2018-03-01
Regenerative Medicine comprises the Biotechnology, Tissue Engineering and Biometrology for stem cell therapy. Starting from stem cells extracted from the patient, autologous implant, these cells are cultured and differentiated into other tissues, for example, articular cartilage. These cells are reorganized into microspheres (cell spheroids). Such tissue units are recombined into functional tissues constructs that can be implanted in the injured region for regeneration. It is necessary the biomechanical characterization of these constructed to determine if their properties are similar to native tissue. In this study was carried out the modeling of the calculation of uncertainty of the surface tension of cellular spheroids with the use of the Young-Laplace equation. We obtained relative uncertainties about 10%.
Uncertainties in SOA Formation from the Photooxidation of α-pinene
NASA Astrophysics Data System (ADS)
McVay, R.; Zhang, X.; Aumont, B.; Valorso, R.; Camredon, M.; La, S.; Seinfeld, J.
2015-12-01
Explicit chemical models such as GECKO-A (the Generator for Explicit Chemistry and Kinetics of Organics in the Atmosphere) enable detailed modeling of gas-phase photooxidation and secondary organic aerosol (SOA) formation. Comparison between these explicit models and chamber experiments can provide insight into processes that are missing or unknown in these models. GECKO-A is used to model seven SOA formation experiments from α-pinene photooxidation conducted at varying seed particle concentrations with varying oxidation rates. We investigate various physical and chemical processes to evaluate the extent of agreement between the experiments and the model predictions. We examine the effect of vapor wall loss on SOA formation and how the importance of this effect changes at different oxidation rates. Proposed gas-phase autoxidation mechanisms are shown to significantly affect SOA predictions. The potential effects of particle-phase dimerization and condensed-phase photolysis are investigated. We demonstrate the extent to which SOA predictions in the α-pinene photooxidation system depend on uncertainties in the chemical mechanism.
Adaptive management for a turbulent future
Allen, Craig R.; Fontaine, J.J.; Pope, K.L.; Garmestani, A.S.
2011-01-01
The challenges that face humanity today differ from the past because as the scale of human influence has increased, our biggest challenges have become global in nature, and formerly local problems that could be addressed by shifting populations or switching resources, now aggregate (i.e., "scale up") limiting potential management options. Adaptive management is an approach to natural resource management that emphasizes learning through management based on the philosophy that knowledge is incomplete and much of what we think we know is actually wrong. Adaptive management has explicit structure, including careful elucidation of goals, identification of alternative management objectives and hypotheses of causation, and procedures for the collection of data followed by evaluation and reiteration. It is evident that adaptive management has matured, but it has also reached a crossroads. Practitioners and scientists have developed adaptive management and structured decision making techniques, and mathematicians have developed methods to reduce the uncertainties encountered in resource management, yet there continues to be misapplication of the method and misunderstanding of its purpose. Ironically, the confusion over the term "adaptive management" may stem from the flexibility inherent in the approach, which has resulted in multiple interpretations of "adaptive management" that fall along a continuum of complexity and a priori design. Adaptive management is not a panacea for the navigation of 'wicked problems' as it does not produce easy answers, and is only appropriate in a subset of natural resource management problems where both uncertainty and controllability are high. Nonetheless, the conceptual underpinnings of adaptive management are simple; there will always be inherent uncertainty and unpredictability in the dynamics and behavior of complex social-ecological systems, but management decisions must still be made, and whenever possible, we should incorporate learning into management. ?? 2010 .
Adaptive Management for a Turbulent Future
Allen, Craig R.; Fontaine, Joseph J.; Pope, Kevin L.; Garmestani, Ahjond S.
2011-01-01
The challenges that face humanity today differ from the past because as the scale of human influence has increased, our biggest challenges have become global in nature, and formerly local problems that could be addressed by shifting populations or switching resources, now aggregate (i.e., "scale up") limiting potential management options. Adaptive management is an approach to natural resource management that emphasizes learning through management based on the philosophy that knowledge is incomplete and much of what we think we know is actually wrong. Adaptive management has explicit structure, including careful elucidation of goals, identification of alternative management objectives and hypotheses of causation, and procedures for the collection of data followed by evaluation and reiteration. It is evident that adaptive management has matured, but it has also reached a crossroads. Practitioners and scientists have developed adaptive management and structured decision making techniques, and mathematicians have developed methods to reduce the uncertainties encountered in resource management, yet there continues to be misapplication of the method and misunderstanding of its purpose. Ironically, the confusion over the term "adaptive management" may stem from the flexibility inherent in the approach, which has resulted in multiple interpretations of "adaptive management" that fall along a continuum of complexity and a priori design. Adaptive management is not a panacea for the navigation of 'wicked problems' as it does not produce easy answers, and is only appropriate in a subset of natural resource management problems where both uncertainty and controllability are high. Nonetheless, the conceptual underpinnings of adaptive management are simple; there will always be inherent uncertainty and unpredictability in the dynamics and behavior of complex social-ecological systems, but management decisions must still be made, and whenever possible, we should incorporate learning into management. Published by Elsevier Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sahoo, N; Zhu, X; Zhang, X
Purpose: To quantify the impact of range and setup uncertainties on various dosimetric indices that are used to assess normal tissue toxicities of patients receiving passive scattering proton beam therapy (PSPBT). Methods: Robust analysis of sample treatment plans of six brain cancer patients treated with PSPBT at our facility for whom the maximum brain stem dose exceeded 5800 CcGE were performed. The DVH of each plan was calculated in an Eclipse treatment planning system (TPS) version 11 applying ±3.5% range uncertainty and ±3 mm shift of the isocenter in x, y and z directions to account for setup uncertainties. Worst-casemore » dose indices for brain stem and whole brain were compared to their values in the nominal plan to determine the average change in their values. For the brain stem, maximum dose to 1 cc of volume, dose to 10%, 50%, 90% of volume (D10, D50, D90) and volume receiving 6000, 5400, 5000, 4500, 4000 CcGE (V60, V54, V50, V45, V40) were evaluated. For the whole brain, maximum dose to 1 cc of volume, and volume receiving 5400, 5000, 4500, 4000, 3000 CcGE (V54, V50, V45, V40 and V30) were assessed. Results: The average change in the values of these indices in the worst scenario cases from the nominal plan were as follows. Brain stem; Maximum dose to 1 cc of volume: 1.1%, D10: 1.4%, D50: 8.0%, D90:73.3%, V60:116.9%, V54:27.7%, V50: 21.2%, V45:16.2%, V40:13.6%,Whole brain; Maximum dose to 1 cc of volume: 0.3%, V54:11.4%, V50: 13.0%, V45:13.6%, V40:14.1%, V30:13.5%. Conclusion: Large to modest changes in the dosiemtric indices for brain stem and whole brain compared to nominal plan due to range and set up uncertainties were observed. Such potential changes should be taken into account while using any dosimetric parameters for outcome evaluation of patients receiving proton therapy.« less
NASA Technical Reports Server (NTRS)
Griffin, Brian Joseph; Burken, John J.; Xargay, Enric
2010-01-01
This paper presents an L(sub 1) adaptive control augmentation system design for multi-input multi-output nonlinear systems in the presence of unmatched uncertainties which may exhibit significant cross-coupling effects. A piecewise continuous adaptive law is adopted and extended for applicability to multi-input multi-output systems that explicitly compensates for dynamic cross-coupling. In addition, explicit use of high-fidelity actuator models are added to the L1 architecture to reduce uncertainties in the system. The L(sub 1) multi-input multi-output adaptive control architecture is applied to the X-29 lateral/directional dynamics and results are evaluated against a similar single-input single-output design approach.
Meija, Juris; Chartrand, Michelle M G
2018-01-01
Isotope delta measurements are normalized against international reference standards. Although multi-point normalization is becoming a standard practice, the existing uncertainty evaluation practices are either undocumented or are incomplete. For multi-point normalization, we present errors-in-variables regression models for explicit accounting of the measurement uncertainty of the international standards along with the uncertainty that is attributed to their assigned values. This manuscript presents framework to account for the uncertainty that arises due to a small number of replicate measurements and discusses multi-laboratory data reduction while accounting for inevitable correlations between the laboratories due to the use of identical reference materials for calibration. Both frequentist and Bayesian methods of uncertainty analysis are discussed.
ERIC Educational Resources Information Center
Kezar, Adrianna; Gehrke, Sean; Elrod, Susan
2015-01-01
This study examines the role of implicit theories of change in inhibiting STEM reform and identifies a range of approaches to help change agents alter their implicit beliefs in order to develop more explicit theories of change. Through observations and interviews, we focus on the experience of reform teams on 11 campuses that were involved in a…
Reducing uncertainty about objective functions in adaptive management
Williams, B.K.
2012-01-01
This paper extends the uncertainty framework of adaptive management to include uncertainty about the objectives to be used in guiding decisions. Adaptive decision making typically assumes explicit and agreed-upon objectives for management, but allows for uncertainty as to the structure of the decision process that generates change through time. Yet it is not unusual for there to be uncertainty (or disagreement) about objectives, with different stakeholders expressing different views not only about resource responses to management but also about the appropriate management objectives. In this paper I extend the treatment of uncertainty in adaptive management, and describe a stochastic structure for the joint occurrence of uncertainty about objectives as well as models, and show how adaptive decision making and the assessment of post-decision monitoring data can be used to reduce uncertainties of both kinds. Different degrees of association between model and objective uncertainty lead to different patterns of learning about objectives. ?? 2011.
Age differences in implicit memory: conceptual, perceptual, or methodological?
Mitchell, David B; Bruss, Peter J
2003-12-01
The authors examined age differences in conceptual and perceptual implicit memory via word-fragment completion, word-stem completion, category exemplar generation, picture-fragment identification, and picture naming. Young, middle-aged, and older participants (N = 60) named pictures and words at study. Limited test exposure minimized explicit memory contamination, yielding no reliable age differences and equivalent cross-format effects. In contrast, explicit memory and neuropsychological measures produced significant age differences. In a follow-up experiment, 24 young adults were informed a priori about implicit testing. Their priming was equivalent to the main experiment, showing that test trial time restrictions limit explicit memory strategies. The authors concluded that most implicit memory processes remain stable across adulthood and suggest that explicit contamination be rigorously monitored in aging studies.
Assessment of parametric uncertainty for groundwater reactive transport modeling,
Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun
2014-01-01
The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, predictive performance of formal generalized likelihood function is superior to that of least squares regression and Bayesian methods with Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive metropolis (DREAM(zs)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and Morris- and DREAM(ZS)-based global sensitivity analysis yield almost identical ranking of parameter importance. The uncertainty analysis may help select appropriate likelihood functions, improve model calibration, and reduce predictive uncertainty in other groundwater reactive transport and environmental modeling.
Uncertainties of predictions from parton distributions II: theoretical errors
NASA Astrophysics Data System (ADS)
Martin, A. D.; Roberts, R. G.; Stirling, W. J.; Thorne, R. S.
2004-06-01
We study the uncertainties in parton distributions, determined in global fits to deep inelastic and related hard scattering data, due to so-called theoretical errors. Amongst these, we include potential errors due to the change of perturbative order (NLO to NNLO), ln(1/x) and ln(1-x) effects, absorptive corrections and higher-twist contributions. We investigate these uncertainties both by including explicit corrections to our standard global analysis and by examining the sensitivity to changes of the x, Q 2, W 2 cuts on the data that are fitted. In this way we expose those kinematic regions where the conventional DGLAP description is inadequate. As a consequence we obtain a set of NLO, and of NNLO, conservative partons where the data are fully consistent with DGLAP evolution, but over a restricted kinematic domain. We also examine the potential effects of such issues as the choice of input parametrisation, heavy target corrections, assumptions about the strange quark sea and isospin violation. Hence we are able to compare the theoretical errors with those uncertainties due to errors on the experimental measurements, which we studied previously. We use W and Higgs boson production at the Tevatron and the LHC as explicit examples of the uncertainties arising from parton distributions. For many observables the theoretical error is dominant, but for the cross section for W production at the Tevatron both the theoretical and experimental uncertainties are small, and hence the NNLO prediction may serve as a valuable luminosity monitor.
Giving back or giving up: Native American student experiences in science and engineering.
Smith, Jessi L; Cech, Erin; Metz, Anneke; Huntoon, Meghan; Moyer, Christina
2014-07-01
Native Americans are underrepresented in science, technology, engineering, and math (STEM) careers. We examine communal goal incongruence-the mismatch between students' emphasis on communal work goals and the noncommunal culture of STEM-as a possible factor in this underrepresentation. First, we surveyed 80 Native American STEM freshmen and found they more highly endorsed communal goals than individualistic work goals. Next, we surveyed 96 Native American and White American students in STEM and non-STEM majors and confirmed that both Native American men and women in STEM highly endorsed communal goals. In a third study, we conducted a follow-up survey and in-depth interviews with a subset of Native American STEM students in their second semester to assess their experiences of belonging uncertainty, intrinsic motivation, persistence intentions, and perceived performance in STEM as a function of their initial communal work goals. Results demonstrate the prominence of communal goals among incoming Native American freshman (especially compared with White male STEM majors) and the connection between communal goals and feelings of belonging uncertainty, low motivation, and perceived poor performance 1 semester later. The interview data illustrate that these issues are particularly salient for students raised within tribal communities, and that a communal goal orientation is not just a vague desire to "help others," but a commitment to helping their tribal communities. The interviews also highlight the importance of student support programs for fostering feelings of belonging. We end by discussing implications for interventions and institutional changes that may promote Native American student retention in STEM.
STEM Education for Girls of Color
NASA Astrophysics Data System (ADS)
Yee, Kam H.
Science, technology, engineering, and math (STEM) fields struggle to increase recruitment and retention of girls of color. The dominant framework in STEM education is the pipeline which assumes girls in general lack motivation and interest to persist in STEM fields. Recent public discourse shifts to address institutionalized discrimination and systemic barriers in STEM culture that filter out underrepresented populations. Informal education or complementary learning STEM programs offer alternative opportunities for students to explore outside of rigid school academic and social systems. Few articles look specifically at STEM complementary learning programs, and even fewer focus on the effects on girls of color. This research is a quantitative study to categorize existing mission statements and training behind organizations that provide STEM programs. The results will provide a better understanding of the relationship between practices of STEM education organizations and the programs they create. Diversity training and inclusive language in mission statements had weak correlations with increased cultural responsiveness in the program offerings. The results suggest organizations must be more intentional and explicit when implementing diversity goals.
The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...
Boyce, Ayesha S
2017-10-01
Evaluation must attend meaningfully and respectfully to issues of culture, race, diversity, power, and equity. This attention is especially critical within the evaluation of science, technology, engineering, and mathematics (STEM) educational programming, which has an explicit agenda of broadening participation. The purpose of this article is to report lessons learned from the implementation of a values-engaged, educative (Greene et al., 2006) evaluation within a multi-year STEM education program setting. This meta-evaluation employed a case study design using data from evaluator weekly systematic reflections, review of evaluation and program artifacts, stakeholder interviews, and peer review and assessment. The main findings from this study are (a) explicit attention to culture, diversity, and equity was initially challenged by organizational culture and under-developed evaluator-stakeholder professional relationship and (b) evidence of successful engagement of culture, diversity, and equity emerged in formal evaluation criteria and documents, and informal dialogue and discussion with stakeholders. The paper concludes with lessons learned and implications for practice. Copyright © 2017 Elsevier Ltd. All rights reserved.
Merging information from multi-model flood projections in a hierarchical Bayesian framework
NASA Astrophysics Data System (ADS)
Le Vine, Nataliya
2016-04-01
Multi-model ensembles are becoming widely accepted for flood frequency change analysis. The use of multiple models results in large uncertainty around estimates of flood magnitudes, due to both uncertainty in model selection and natural variability of river flow. The challenge is therefore to extract the most meaningful signal from the multi-model predictions, accounting for both model quality and uncertainties in individual model estimates. The study demonstrates the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach facilitates explicit treatment of shared multi-model discrepancy as well as the probabilistic nature of the flood estimates, by treating the available models as a sample from a hypothetical complete (but unobserved) set of models. The advantages of the approach are: 1) to insure an adequate 'baseline' conditions with which to compare future changes; 2) to reduce flood estimate uncertainty; 3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; 4) to adjust multi-model consistency criteria when model biases are large; and 5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.
Model structures amplify uncertainty in predicted soil carbon responses to climate change.
Shi, Zheng; Crowell, Sean; Luo, Yiqi; Moore, Berrien
2018-06-04
Large model uncertainty in projected future soil carbon (C) dynamics has been well documented. However, our understanding of the sources of this uncertainty is limited. Here we quantify the uncertainties arising from model parameters, structures and their interactions, and how those uncertainties propagate through different models to projections of future soil carbon stocks. Both the vertically resolved model and the microbial explicit model project much greater uncertainties to climate change than the conventional soil C model, with both positive and negative C-climate feedbacks, whereas the conventional model consistently predicts positive soil C-climate feedback. Our findings suggest that diverse model structures are necessary to increase confidence in soil C projection. However, the larger uncertainty in the complex models also suggests that we need to strike a balance between model complexity and the need to include diverse model structures in order to forecast soil C dynamics with high confidence and low uncertainty.
Generalization of the Time-Energy Uncertainty Relation of Anandan-Aharonov Type
NASA Technical Reports Server (NTRS)
Hirayama, Minoru; Hamada, Takeshi; Chen, Jin
1996-01-01
A new type of time-energy uncertainty relation was proposed recently by Anandan and Aharonov. Their formula, to estimate the lower bound of time-integral of the energy-fluctuation in a quantum state is generalized to the one involving a set of quantum states. This is achieved by obtaining an explicit formula for the distance between two finitely separated points in the Grassman manifold.
Upper bounds on quantum uncertainty products and complexity measures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guerrero, Angel; Sanchez-Moreno, Pablo; Dehesa, Jesus S.
The position-momentum Shannon and Renyi uncertainty products of general quantum systems are shown to be bounded not only from below (through the known uncertainty relations), but also from above in terms of the Heisenberg-Kennard product . Moreover, the Cramer-Rao, Fisher-Shannon, and Lopez-Ruiz, Mancini, and Calbet shape measures of complexity (whose lower bounds have been recently found) are also bounded from above. The improvement of these bounds for systems subject to spherically symmetric potentials is also explicitly given. Finally, applications to hydrogenic and oscillator-like systems are done.
Conjugate gradient based projection - A new explicit methodology for frictional contact
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Li, Maocheng; Sha, Desong
1993-01-01
With special attention towards the applicability to parallel computation or vectorization, a new and effective explicit approach for linear complementary formulations involving a conjugate gradient based projection methodology is proposed in this study for contact problems with Coulomb friction. The overall objectives are focussed towards providing an explicit methodology of computation for the complete contact problem with friction. In this regard, the primary idea for solving the linear complementary formulations stems from an established search direction which is projected to a feasible region determined by the non-negative constraint condition; this direction is then applied to the Fletcher-Reeves conjugate gradient method resulting in a powerful explicit methodology which possesses high accuracy, excellent convergence characteristics, fast computational speed and is relatively simple to implement for contact problems involving Coulomb friction.
The wave function and minimum uncertainty function of the bound quadratic Hamiltonian system
NASA Technical Reports Server (NTRS)
Yeon, Kyu Hwang; Um, Chung IN; George, T. F.
1994-01-01
The bound quadratic Hamiltonian system is analyzed explicitly on the basis of quantum mechanics. We have derived the invariant quantity with an auxiliary equation as the classical equation of motion. With the use of this invariant it can be determined whether or not the system is bound. In bound system we have evaluated the exact eigenfunction and minimum uncertainty function through unitary transformation.
NASA Astrophysics Data System (ADS)
Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan
2018-03-01
GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied with considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty related to criteria weights, spatial heterogeneity of preferences and the risk attitude of the analyst. The approach is applied to a pilot study for the Gucheng County, central China, heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analysis were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. Therefore, the proposed approach is an improvement over the conventional deterministic method and can provides a more rational, objective and unbiased tool for flood susceptibility evaluation.
Johnston, Iain G; Rickett, Benjamin C; Jones, Nick S
2014-12-02
Back-of-the-envelope or rule-of-thumb calculations involving rough estimates of quantities play a central scientific role in developing intuition about the structure and behavior of physical systems, for example in so-called Fermi problems in the physical sciences. Such calculations can be used to powerfully and quantitatively reason about biological systems, particularly at the interface between physics and biology. However, substantial uncertainties are often associated with values in cell biology, and performing calculations without taking this uncertainty into account may limit the extent to which results can be interpreted for a given problem. We present a means to facilitate such calculations where uncertainties are explicitly tracked through the line of reasoning, and introduce a probabilistic calculator called CALADIS, a free web tool, designed to perform this tracking. This approach allows users to perform more statistically robust calculations in cell biology despite having uncertain values, and to identify which quantities need to be measured more precisely to make confident statements, facilitating efficient experimental design. We illustrate the use of our tool for tracking uncertainty in several example biological calculations, showing that the results yield powerful and interpretable statistics on the quantities of interest. We also demonstrate that the outcomes of calculations may differ from point estimates when uncertainty is accurately tracked. An integral link between CALADIS and the BioNumbers repository of biological quantities further facilitates the straightforward location, selection, and use of a wealth of experimental data in cell biological calculations. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Dealing with uncertainties in environmental burden of disease assessment
2009-01-01
Disability Adjusted Life Years (DALYs) combine the number of people affected by disease or mortality in a population and the duration and severity of their condition into one number. The environmental burden of disease is the number of DALYs that can be attributed to environmental factors. Environmental burden of disease estimates enable policy makers to evaluate, compare and prioritize dissimilar environmental health problems or interventions. These estimates often have various uncertainties and assumptions which are not always made explicit. Besides statistical uncertainty in input data and parameters – which is commonly addressed – a variety of other types of uncertainties may substantially influence the results of the assessment. We have reviewed how different types of uncertainties affect environmental burden of disease assessments, and we give suggestions as to how researchers could address these uncertainties. We propose the use of an uncertainty typology to identify and characterize uncertainties. Finally, we argue that uncertainties need to be identified, assessed, reported and interpreted in order for assessment results to adequately support decision making. PMID:19400963
COMMUNICATING PROBABILISTIC RISK OUTCOMES TO RISK MANAGERS
Increasingly, risk assessors are moving away from simple deterministic assessments to probabilistic approaches that explicitly incorporate ecological variability, measurement imprecision, and lack of knowledge (collectively termed "uncertainty"). While the new methods provide an...
A polynomial chaos approach to the analysis of vehicle dynamics under uncertainty
NASA Astrophysics Data System (ADS)
Kewlani, Gaurav; Crawford, Justin; Iagnemma, Karl
2012-05-01
The ability of ground vehicles to quickly and accurately analyse their dynamic response to a given input is critical to their safety and efficient autonomous operation. In field conditions, significant uncertainty is associated with terrain and/or vehicle parameter estimates, and this uncertainty must be considered in the analysis of vehicle motion dynamics. Here, polynomial chaos approaches that explicitly consider parametric uncertainty during modelling of vehicle dynamics are presented. They are shown to be computationally more efficient than the standard Monte Carlo scheme, and experimental results compared with the simulation results performed on ANVEL (a vehicle simulator) indicate that the method can be utilised for efficient and accurate prediction of vehicle motion in realistic scenarios.
Modeling to Optimize Terminal Stem Cell Differentiation
Gallicano, G. Ian
2013-01-01
Embryonic stem cell (ESC), iPCs, and adult stem cells (ASCs) all are among the most promising potential treatments for heart failure, spinal cord injury, neurodegenerative diseases, and diabetes. However, considerable uncertainty in the production of ESC-derived terminally differentiated cell types has limited the efficiency of their development. To address this uncertainty, we and other investigators have begun to employ a comprehensive statistical model of ESC differentiation for determining the role of intracellular pathways (e.g., STAT3) in ESC differentiation and determination of germ layer fate. The approach discussed here applies the Baysian statistical model to cell/developmental biology combining traditional flow cytometry methodology and specific morphological observations with advanced statistical and probabilistic modeling and experimental design. The final result of this study is a unique tool and model that enhances the understanding of how and when specific cell fates are determined during differentiation. This model provides a guideline for increasing the production efficiency of therapeutically viable ESCs/iPSCs/ASC derived neurons or any other cell type and will eventually lead to advances in stem cell therapy. PMID:24278782
Valuing flexibilities in the design of urban water management systems.
Deng, Yinghan; Cardin, Michel-Alexandre; Babovic, Vladan; Santhanakrishnan, Deepak; Schmitter, Petra; Meshgi, Ali
2013-12-15
Climate change and rapid urbanization requires decision-makers to develop a long-term forward assessment on sustainable urban water management projects. This is further complicated by the difficulties of assessing sustainable designs and various design scenarios from an economic standpoint. A conventional valuation approach for urban water management projects, like Discounted Cash Flow (DCF) analysis, fails to incorporate uncertainties, such as amount of rainfall, unit cost of water, and other uncertainties associated with future changes in technological domains. Such approach also fails to include the value of flexibility, which enables managers to adapt and reconfigure systems over time as uncertainty unfolds. This work describes an integrated framework to value investments in urban water management systems under uncertainty. It also extends the conventional DCF analysis through explicit considerations of flexibility in systems design and management. The approach incorporates flexibility as intelligent decision-making mechanisms that enable systems to avoid future downside risks and increase opportunities for upside gains over a range of possible futures. A water catchment area in Singapore was chosen to assess the value of a flexible extension of standard drainage canals and a flexible deployment of a novel water catchment technology based on green roofs and porous pavements. Results show that integrating uncertainty and flexibility explicitly into the decision-making process can reduce initial capital expenditure, improve value for investment, and enable decision-makers to learn more about system requirements during the lifetime of the project. Copyright © 2013 Elsevier Ltd. All rights reserved.
Affect, Risk and Uncertainty in Decision-Marking an Integrated Computational-Empirical Approach
2009-07-26
OF ABSTRACT UU 18. NUMBER O PAGES 61 19a. NAME OF RESPONSIBLE PERSON Eva Hudlicka, Ph.D. 19b. TELEPHONE NUMBER (include area code...developed by Hudlicka (2002; 2003). MAMID was designed with the explicit purpose to model the effects of affective states and personality traits on...influenced by risk and uncertainty? • How do personality traits and affective states facilitate or prevent the expression of particular types of
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qin, A; Yan, D
2014-06-15
Purpose: To evaluate uncertainties of organ specific Deformable Image Registration (DIR) for H and N cancer Adaptive Radiation Therapy (ART). Methods: A commercial DIR evaluation tool, which includes a digital phantom library of 8 patients, and the corresponding “Ground truth Deformable Vector Field” (GT-DVF), was used in the study. Each patient in the phantom library includes the GT-DVF created from a pair of CT images acquired prior to and at the end of the treatment course. Five DIR tools, including 2 commercial tools (CMT1, CMT2), 2 in-house (IH-FFD1, IH-FFD2), and a classic DEMON algorithms, were applied on the patient images.more » The resulting DVF was compared to the GT-DVF voxel by voxel. Organ specific DVF uncertainty was calculated for 10 ROIs: Whole Body, Brain, Brain Stem, Cord, Lips, Mandible, Parotid, Esophagus and Submandibular Gland. Registration error-volume histogram was constructed for comparison. Results: The uncertainty is relatively small for brain stem, cord and lips, while large in parotid and submandibular gland. CMT1 achieved best overall accuracy (on whole body, mean vector error of 8 patients: 0.98±0.29 mm). For brain, mandible, parotid right, parotid left and submandibular glad, the classic Demon algorithm got the lowest uncertainty (0.49±0.09, 0.51±0.16, 0.46±0.11, 0.50±0.11 and 0.69±0.47 mm respectively). For brain stem, cord and lips, the DVF from CMT1 has the best accuracy (0.28±0.07, 0.22±0.08 and 0.27±0.12 mm respectively). All algorithms have largest right parotid uncertainty on patient #7, which has image artifact caused by tooth implantation. Conclusion: Uncertainty of deformable CT image registration highly depends on the registration algorithm, and organ specific. Large uncertainty most likely appears at the location of soft-tissue organs far from the bony structures. Among all 5 DIR methods, the classic DEMON and CMT1 seem to be the best to limit the uncertainty within 2mm for all OARs. Partially supported by research grant from Elekta.« less
Making it stick: chasing the optimal stem cells for cardiac regeneration
Quijada, Pearl; Sussman, Mark A
2014-01-01
Despite the increasing use of stem cells for regenerative-based cardiac therapy, the optimal stem cell population(s) remains in a cloud of uncertainty. In the past decade, the field has witnessed a surge of researchers discovering stem cell populations reported to directly and/or indirectly contribute to cardiac regeneration through processes of cardiomyogenic commitment and/or release of cardioprotective paracrine factors. This review centers upon defining basic biological characteristics of stem cells used for sustaining cardiac integrity during disease and maintenance of communication between the cardiac environment and stem cells. Given the limited successes achieved so far in regenerative therapy, the future requires development of unprecedented concepts involving combinatorial approaches to create and deliver the optimal stem cell(s) that will enhance myocardial healing. PMID:25340282
PROBABILISTIC CHARACTERIZATION OF ATMOSPHERIC TRANSPORT AND DIFFUSION
The observed scatter of observations about air quality model predictions stems from a combination of naturally occurring stochastic variations that are impossible for any model to explicitly simulate and variations arising from limitations in our knowledge and from imperfect inpu...
Incorporating rainfall uncertainty in a SWAT model: the river Zenne basin (Belgium) case study
NASA Astrophysics Data System (ADS)
Tolessa Leta, Olkeba; Nossent, Jiri; van Griensven, Ann; Bauwens, Willy
2013-04-01
The European Union Water Framework Directive (EU-WFD) called its member countries to achieve a good ecological status for all inland and coastal water bodies by 2015. According to recent studies, the river Zenne (Belgium) is far from this objective. Therefore, an interuniversity and multidisciplinary project "Towards a Good Ecological Status in the river Zenne (GESZ)" was launched to evaluate the effects of wastewater management plans on the river. In this project, different models have been developed and integrated using the Open Modelling Interface (OpenMI). The hydrologic, semi-distributed Soil and Water Assessment Tool (SWAT) is hereby used as one of the model components in the integrated modelling chain in order to model the upland catchment processes. The assessment of the uncertainty of SWAT is an essential aspect of the decision making process, in order to design robust management strategies that take the predicted uncertainties into account. Model uncertainty stems from the uncertainties on the model parameters, the input data (e.g, rainfall), the calibration data (e.g., stream flows) and on the model structure itself. The objective of this paper is to assess the first three sources of uncertainty in a SWAT model of the river Zenne basin. For the assessment of rainfall measurement uncertainty, first, we identified independent rainfall periods, based on the daily precipitation and stream flow observations and using the Water Engineering Time Series PROcessing tool (WETSPRO). Secondly, we assigned a rainfall multiplier parameter for each of the independent rainfall periods, which serves as a multiplicative input error corruption. Finally, we treated these multipliers as latent parameters in the model optimization and uncertainty analysis (UA). For parameter uncertainty assessment, due to the high number of parameters of the SWAT model, first, we screened out its most sensitive parameters using the Latin Hypercube One-factor-At-a-Time (LH-OAT) technique. Subsequently, we only considered the most sensitive parameters for parameter optimization and UA. To explicitly account for the stream flow uncertainty, we assumed that the stream flow measurement error increases linearly with the stream flow value. To assess the uncertainty and infer posterior distributions of the parameters, we used a Markov Chain Monte Carlo (MCMC) sampler - differential evolution adaptive metropolis (DREAM) that uses sampling from an archive of past states to generate candidate points in each individual chain. It is shown that the marginal posterior distributions of the rainfall multipliers vary widely between individual events, as a consequence of rainfall measurement errors and the spatial variability of the rain. Only few of the rainfall events are well defined. The marginal posterior distributions of the SWAT model parameter values are well defined and identified by DREAM, within their prior ranges. The posterior distributions of output uncertainty parameter values also show that the stream flow data is highly uncertain. The approach of using rainfall multipliers to treat rainfall uncertainty for a complex model has an impact on the model parameter marginal posterior distributions and on the model results Corresponding author: Tel.: +32 (0)2629 3027; fax: +32(0)2629 3022. E-mail: otolessa@vub.ac.be
Uncertainty Analysis via Failure Domain Characterization: Unrestricted Requirement Functions
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2011-01-01
This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are comprised of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. The methods developed herein, which are based on nonlinear constrained optimization, are applicable to requirement functions whose functional dependency on the uncertainty is arbitrary and whose explicit form may even be unknown. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the assumed uncertainty model (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.
When nano meets stem: the impact of nanotechnology in stem cell biology.
Kaur, Savneet; Singhal, Barkha
2012-01-01
Nanotechnology and biomedical treatments using stem cells are among the latest conduits of biotechnological research. Even more recently, scientists have begun finding ways to mate these two specialties of science. The advent of nanotechnology has paved the way for an explicit understanding of stem cell therapy in vivo and by recapitulation of such in vivo environments in the culture, this technology seems to accommodate a great potential in providing new vistas to stem cell research. Nanotechnology carries in its wake, the development of highly stable, efficient and specific gene delivery systems for both in vitro and in vivo genetic engineering of stem cells, use of nanoscale systems (such as microarrays) for investigation of gene expression in stem cells, creation of dynamic three-dimensional nano-environments for in vitro and in vivo maintenance and differentiation of stem cells and development of extremely sensitive in vivo detection systems to gain insights into the mechanisms of stem cell differentiation and apoptosis in different disease models. The present review presents an overview of the current applications and future prospects for the use of nanotechnology in stem cell biology. Copyright © 2011 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
Engaging in and with Research to Improve STEM Education
ERIC Educational Resources Information Center
Christodoulou, Andri
2017-01-01
The demand for students to continue studying STEM subjects at post-16 and higher education levels remains high. Since the curriculum reforms in Science and Mathematics across phases in England were initiated in 2014, uncertainty remains on the impact that these reforms will have on students and teachers as the reforms continue to be implemented…
USDA-ARS?s Scientific Manuscript database
he wheat stem sawfly, Cephus cinctus Norton (Hymenoptera: Cephidae), is a key pest of wheat in the northern Great Plains of North America, and damage by this species has recently expanded southward. Current pest management practices are not very effective and uncertainties regarding its origin and i...
The ethics of patenting human embryonic stem cells.
Chapman, Audrey R
2009-09-01
Just as human embryonic stem cell research has generated controversy about the uses of human embryos for research and therapeutic applications, human embryonic stem cell patents raise fundamental ethical issues. The United States Patent and Trademark Office has granted foundational patents, including a composition of matter (or product) patent to the Wisconsin Alumni Research Foundation (WARF), the University of Wisconsin-Madison's intellectual property office. In contrast, the European Patent Office rejected the same WARF patent application for ethical reasons. This article assesses the appropriateness of these patents placing the discussion in the context of the deontological and consequentialist ethical issues related to human embryonic stem cell patenting. It advocates for a patent system that explicitly takes ethical factors into account and explores options for new types of intellectual property arrangements consistent with ethical concerns.
Seniors' uncertainty management of direct-to-consumer prescription drug advertising usefulness.
DeLorme, Denise E; Huh, Jisu
2009-09-01
This study provides insight into seniors' perceptions of and responses to direct-to-consumer prescription drug advertising (DTCA) usefulness, examines support for DTCA regulation as a type of uncertainty management, and extends and gives empirical voice to previous survey results through methodological triangulation. In-depth interview findings revealed that, for most informants, DTCA usefulness was uncertain and this uncertainty stemmed from 4 sources. The majority had negative responses to DTCA uncertainty and relied on 2 uncertainty-management strategies: information seeking from physicians, and inferences of and support for some government regulation of DTCA. Overall, the findings demonstrate the viability of uncertainty management theory (Brashers, 2001, 2007) for mass-mediated health communication, specifically DTCA. The article concludes with practical implications and research recommendations.
Accounting for uncertainty in marine reserve design.
Halpern, Benjamin S; Regan, Helen M; Possingham, Hugh P; McCarthy, Michael A
2006-01-01
Ecosystems and the species and communities within them are highly complex systems that defy predictions with any degree of certainty. Managing and conserving these systems in the face of uncertainty remains a daunting challenge, particularly with respect to developing networks of marine reserves. Here we review several modelling frameworks that explicitly acknowledge and incorporate uncertainty, and then use these methods to evaluate reserve spacing rules given increasing levels of uncertainty about larval dispersal distances. Our approach finds similar spacing rules as have been proposed elsewhere - roughly 20-200 km - but highlights several advantages provided by uncertainty modelling over more traditional approaches to developing these estimates. In particular, we argue that uncertainty modelling can allow for (1) an evaluation of the risk associated with any decision based on the assumed uncertainty; (2) a method for quantifying the costs and benefits of reducing uncertainty; and (3) a useful tool for communicating to stakeholders the challenges in managing highly uncertain systems. We also argue that incorporating rather than avoiding uncertainty will increase the chances of successfully achieving conservation and management goals.
NASA Astrophysics Data System (ADS)
Chinn, P. W. U.
2016-12-01
Context/Purpose: The Hawaiian Islands span 1500 miles. Age, size, altitude and isolation produced diverse topographies, weather patterns, and unique ecosystems. Around 500 C.E. Polynesians arrived and developed sustainable social ecosystems, ahupua`a, extending from mountain-top to reef. Place-based ecological knowledge was key to personal identity and resource management that sustained 700,000 people at western contact. But Native Hawaiian students are persistently underrepresented in science. This two-year mixed methods study asks if professional development (PD) can transform teaching in ways that support K12 Native Hawaiian students' engagement and learning in STEM. Methods: Place-based PD informed by theories of structure and agency (Sewell, 1992) and cultural funds of knowledge (Moll, Amanti, Neff, & Gonzalez, 1992) explicitly intersected Hawaiian and western STEM knowledge and practices. NGSS and Nā Hopena A`o, general learner outcomes that reflect Hawaiian culture and values provided teachers with new schemas for designing curriculum and assessment through the lens of culture and place. Data sources include surveys, teacher and student documents, photographs. Results: Teachers' lessons on invasive species, water, soils, Hawaiian STEM, and sustainability and student work showed they learned key Hawaiian terms, understood the impact of invasive species on native plants and animals, felt stronger senses of responsibility, belonging, and place, and preferred outdoor learning. Survey results of 21 4th graders showed Native Hawaiian students (n=6) were more interested in taking STEM and Hawaiian culture/language courses, more concerned about invasive species and culturally important plant and animals, but less able to connect school and family activities than non-Hawaiian peers (n=15). Teacher agency is seen in their interest in collaborating across schools to develop ahupua`a based K12 STEM curricula. Interpretation and Conclusion: Findings suggest PD explicitly integrating Western and Hawaiian STEM systems contributes to teacher agency and place-based expertise. Future research with a new cohort of teachers will expand grades and numbers of students surveyed to validate first year findings and guide future PD oriented to STEM equity for Native Hawaiian students.
NASA Astrophysics Data System (ADS)
Hakala, Kirsti; Addor, Nans; Seibert, Jan
2017-04-01
Streamflow stemming from Switzerland's mountainous landscape will be influenced by climate change, which will pose significant challenges to the water management and policy sector. In climate change impact research, the determination of future streamflow is impeded by different sources of uncertainty, which propagate through the model chain. In this research, we explicitly considered the following sources of uncertainty: (1) climate models, (2) downscaling of the climate projections to the catchment scale, (3) bias correction method and (4) parameterization of the hydrological model. We utilize climate projections at the 0.11 degree 12.5 km resolution from the EURO-CORDEX project, which are the most recent climate projections for the European domain. EURO-CORDEX is comprised of regional climate model (RCM) simulations, which have been downscaled from global climate models (GCMs) from the CMIP5 archive, using both dynamical and statistical techniques. Uncertainties are explored by applying a modeling chain involving 14 GCM-RCMs to ten Swiss catchments. We utilize the rainfall-runoff model HBV Light, which has been widely used in operational hydrological forecasting. The Lindström measure, a combination of model efficiency and volume error, was used as an objective function to calibrate HBV Light. Ten best sets of parameters are then achieved by calibrating using the genetic algorithm and Powell optimization (GAP) method. The GAP optimization method is based on the evolution of parameter sets, which works by selecting and recombining high performing parameter sets with each other. Once HBV is calibrated, we then perform a quantitative comparison of the influence of biases inherited from climate model simulations to the biases stemming from the hydrological model. The evaluation is conducted over two time periods: i) 1980-2009 to characterize the simulation realism under the current climate and ii) 2070-2099 to identify the magnitude of the projected change of streamflow under the climate scenarios RCP4.5 and RCP8.5. We utilize two techniques for correcting biases in the climate model output: quantile mapping and a new method, frequency bias correction. The FBC method matches the frequencies between observed and GCM-RCM data. In this way, it can be used to correct for all time scales, which is a known limitation of quantile mapping. A novel approach for the evaluation of the climate simulations and bias correction methods was then applied. Streamflow can be thought of as the "great integrator" of uncertainties. The ability, or the lack thereof, to correctly simulate streamflow is a way to assess the realism of the bias-corrected climate simulations. Long-term monthly mean as well as high and low flow metrics are used to evaluate the realism of the simulations under current climate and to gauge the impacts of climate change on streamflow. Preliminary results show that under present climate, calibration of the hydrological model comprises of a much smaller band of uncertainty in the modeling chain as compared to the bias correction of the GCM-RCMs. Therefore, for future time periods, we expect the bias correction of climate model data to have a greater influence on projected changes in streamflow than the calibration of the hydrological model.
NASA Astrophysics Data System (ADS)
Wellen, Christopher; Arhonditsis, George B.; Labencki, Tanya; Boyd, Duncan
2012-10-01
Regression-type, hybrid empirical/process-based models (e.g., SPARROW, PolFlow) have assumed a prominent role in efforts to estimate the sources and transport of nutrient pollution at river basin scales. However, almost no attempts have been made to explicitly accommodate interannual nutrient loading variability in their structure, despite empirical and theoretical evidence indicating that the associated source/sink processes are quite variable at annual timescales. In this study, we present two methodological approaches to accommodate interannual variability with the Spatially Referenced Regressions on Watershed attributes (SPARROW) nonlinear regression model. The first strategy uses the SPARROW model to estimate a static baseline load and climatic variables (e.g., precipitation) to drive the interannual variability. The second approach allows the source/sink processes within the SPARROW model to vary at annual timescales using dynamic parameter estimation techniques akin to those used in dynamic linear models. Model parameterization is founded upon Bayesian inference techniques that explicitly consider calibration data and model uncertainty. Our case study is the Hamilton Harbor watershed, a mixed agricultural and urban residential area located at the western end of Lake Ontario, Canada. Our analysis suggests that dynamic parameter estimation is the more parsimonious of the two strategies tested and can offer insights into the temporal structural changes associated with watershed functioning. Consistent with empirical and theoretical work, model estimated annual in-stream attenuation rates varied inversely with annual discharge. Estimated phosphorus source areas were concentrated near the receiving water body during years of high in-stream attenuation and dispersed along the main stems of the streams during years of low attenuation, suggesting that nutrient source areas are subject to interannual variability.
Adjoint-Based Implicit Uncertainty Analysis for Figures of Merit in a Laser Inertial Fusion Engine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seifried, J E; Fratoni, M; Kramer, K J
A primary purpose of computational models is to inform design decisions and, in order to make those decisions reliably, the confidence in the results of such models must be estimated. Monte Carlo neutron transport models are common tools for reactor designers. These types of models contain several sources of uncertainty that propagate onto the model predictions. Two uncertainties worthy of note are (1) experimental and evaluation uncertainties of nuclear data that inform all neutron transport models and (2) statistical counting precision, which all results of a Monte Carlo codes contain. Adjoint-based implicit uncertainty analyses allow for the consideration of anymore » number of uncertain input quantities and their effects upon the confidence of figures of merit with only a handful of forward and adjoint transport calculations. When considering a rich set of uncertain inputs, adjoint-based methods remain hundreds of times more computationally efficient than Direct Monte-Carlo methods. The LIFE (Laser Inertial Fusion Energy) engine is a concept being developed at Lawrence Livermore National Laboratory. Various options exist for the LIFE blanket, depending on the mission of the design. The depleted uranium hybrid LIFE blanket design strives to close the fission fuel cycle without enrichment or reprocessing, while simultaneously achieving high discharge burnups with reduced proliferation concerns. Neutron transport results that are central to the operation of the design are tritium production for fusion fuel, fission of fissile isotopes for energy multiplication, and production of fissile isotopes for sustained power. In previous work, explicit cross-sectional uncertainty analyses were performed for reaction rates related to the figures of merit for the depleted uranium hybrid LIFE blanket. Counting precision was also quantified for both the figures of merit themselves and the cross-sectional uncertainty estimates to gauge the validity of the analysis. All cross-sectional uncertainties were small (0.1-0.8%), bounded counting uncertainties, and were precise with regard to counting precision. Adjoint/importance distributions were generated for the same reaction rates. The current work leverages those adjoint distributions to transition from explicit sensitivities, in which the neutron flux is constrained, to implicit sensitivities, in which the neutron flux responds to input perturbations. This treatment vastly expands the set of data that contribute to uncertainties to produce larger, more physically accurate uncertainty estimates.« less
David M. Bell; Eric J. Ward; A. Christopher Oishi; Ram Oren; Paul G. Flikkema; James S. Clark; David Whitehead
2015-01-01
Uncertainties in ecophysiological responses to environment, such as the impact of atmospheric and soil moisture conditions on plant water regulation, limit our ability to estimate key inputs for ecosystem models. Advanced statistical frameworks provide coherent methodologies for relating observed data, such as stem sap flux density, to unobserved processes, such as...
Cancer stem cells: impact, heterogeneity, and uncertainty
Magee, Jeffrey A.; Piskounova, Elena; Morrison, Sean J.
2015-01-01
The differentiation of tumorigenic cancer stem cells into non-tumorigenic cancer cells confers heterogeneity to some cancers beyond that explained by clonal evolution or environmental differences. In such cancers, functional differences between tumorigenic and non-tumorigenic cells influence response to therapy and prognosis. However, it remains uncertain whether the model applies to many, or few, cancers due to questions about the robustness of cancer stem cell markers and the extent to which existing assays underestimate the frequency of tumorigenic cells. In cancers with rapid genetic change, reversible changes in cell states, or biological variability among patients the stem cell model may not be readily testable. PMID:22439924
Adaptive management: Chapter 1
Allen, Craig R.; Garmestani, Ahjond S.; Allen, Craig R.; Garmestani, Ahjond S.
2015-01-01
Adaptive management is an approach to natural resource management that emphasizes learning through management where knowledge is incomplete, and when, despite inherent uncertainty, managers and policymakers must act. Unlike a traditional trial and error approach, adaptive management has explicit structure, including a careful elucidation of goals, identification of alternative management objectives and hypotheses of causation, and procedures for the collection of data followed by evaluation and reiteration. The process is iterative, and serves to reduce uncertainty, build knowledge and improve management over time in a goal-oriented and structured process.
Allen, Craig R.; Garmestani, Ahjond S.
2015-01-01
Adaptive management is an approach to natural resource management that emphasizes learning through management where knowledge is incomplete, and when, despite inherent uncertainty, managers and policymakers must act. Unlike a traditional trial and error approach, adaptive management has explicit structure, including a careful elucidation of goals, identification of alternative management objectives and hypotheses of causation, and procedures for the collection of data followed by evaluation and reiteration. The process is iterative, and serves to reduce uncertainty, build knowledge and improve management over time in a goal-oriented and structured process.
Contaminated salmon and the public's trust
Luoma, Samuel N.; Löfstedt, Ragnar E.
2007-01-01
Scientific uncertainties often make it difficult for environmental policy makers to determine how to communicate risks to the public. A constructive, holistic, multisectoral dialogue about an issue can improve understanding of uncertainties from different perspectives and clarify options for risk communication. Many environmental issues could benefit from explicit promotion of such a dialogue. When issues are complex, unconstructive advocacy, narrow focus, and exclusion of selected parties from decision making can erode public trust in science and lead to cynicism about the policies of government and the private sector.
NASA Astrophysics Data System (ADS)
Legget, J.; Pepper, W.; Sankovski, A.; Smith, J.; Tol, R.; Wigley, T.
2003-04-01
Potential risks of human-induced climate change are subject to a three-fold uncertainty associated with: the extent of future anthropogenic and natural GHG emissions; global and regional climatic responses to emissions; and impacts of climatic changes on economies and the biosphere. Long-term analyses are also subject to uncertainty regarding how humans will respond to actual or perceived changes, through adaptation or mitigation efforts. Explicitly addressing these uncertainties is a high priority in the scientific and policy communities Probabilistic modeling is gaining momentum as a technique to quantify uncertainties explicitly and use decision analysis techniques that take advantage of improved risk information. The Climate Change Risk Assessment Framework (CCRAF) presented here a new integrative tool that combines the probabilistic approaches developed in population, energy and economic sciences with empirical data and probabilistic results of climate and impact models. The main CCRAF objective is to assess global climate change as a risk management challenge and to provide insights regarding robust policies that address the risks, by mitigating greenhouse gas emissions and by adapting to climate change consequences. The CCRAF endogenously simulates to 2100 or beyond annual region-specific changes in population; GDP; primary (by fuel) and final energy (by type) use; a wide set of associated GHG emissions; GHG concentrations; global temperature change and sea level rise; economic, health, and biospheric impacts; costs of mitigation and adaptation measures and residual costs or benefits of climate change. Atmospheric and climate components of CCRAF are formulated based on the latest version of Wigley's and Raper's MAGICC model and impacts are simulated based on a modified version of Tol's FUND model. The CCRAF is based on series of log-linear equations with deterministic and random components and is implemented using a Monte-Carlo method with up to 5000 variants per set of fixed input parameters. The shape and coefficients of CCRAF equations are derived from regression analyses of historic data and expert assessments. There are two types of random components in CCRAF - one reflects a year-to-year fluctuations around the expected value of a given variable (e.g., standard error of the annual GDP growth) and another is fixed within each CCRAF variant and represents some essential constants within a "world" represented by that variant (e.g., the value of climate sensitivity). Both types of random components are drawn from pre-defined probability distributions functions developed based on historic data or expert assessments. Preliminary CCRAF results emphasize the relative importance of uncertainties associated with the conversion of GHG and particulate emissions into radiative forcing and quantifying climate change effects at the regional level. A separates analysis involves an "adaptive decision-making", which optimizes the expected future policy effects given the estimated probabilistic uncertainties. As uncertainty for some variables evolve over the time steps, the decisions also adapt. This modeling approach is feasible only with explicit modeling of uncertainties.
The Multi-Step CADIS method for shutdown dose rate calculations and uncertainty propagation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ibrahim, Ahmad M.; Peplow, Douglas E.; Grove, Robert E.
2015-12-01
Shutdown dose rate (SDDR) analysis requires (a) a neutron transport calculation to estimate neutron flux fields, (b) an activation calculation to compute radionuclide inventories and associated photon sources, and (c) a photon transport calculation to estimate final SDDR. In some applications, accurate full-scale Monte Carlo (MC) SDDR simulations are needed for very large systems with massive amounts of shielding materials. However, these simulations are impractical because calculation of space- and energy-dependent neutron fluxes throughout the structural materials is needed to estimate distribution of radioisotopes causing the SDDR. Biasing the neutron MC calculation using an importance function is not simple becausemore » it is difficult to explicitly express the response function, which depends on subsequent computational steps. Furthermore, the typical SDDR calculations do not consider how uncertainties in MC neutron calculation impact SDDR uncertainty, even though MC neutron calculation uncertainties usually dominate SDDR uncertainty.« less
Uncertainty in estimates of the number of extraterrestrial civilizations
NASA Technical Reports Server (NTRS)
Sturrock, P. A.
1980-01-01
An estimation of the number N of communicative civilizations is made by means of Drake's formula which involves the combination of several quantities, each of which is to some extent uncertain. It is shown that the uncertainty in any quantity may be represented by a probability distribution function, even if that quantity is itself a probability. The uncertainty of current estimates of N is derived principally from uncertainty in estimates of the lifetime of advanced civilizations. It is argued that this is due primarily to uncertainty concerning the existence of a Galactic Federation which is in turn contingent upon uncertainty about whether the limitations of present-day physics are absolute or (in the event that there exists a yet undiscovered hyperphysics) transient. It is further argued that it is advantageous to consider explicitly these underlying assumptions in order to compare the probable numbers of civilizations operating radio beacons, permitting radio leakage, dispatching probes for radio surveillance for dispatching vehicles for manned surveillance.
Ambros Berger; Thomas Gschwantner; Ronald E. McRoberts; Klemens Schadauer
2014-01-01
National forest inventories typically estimate individual tree volumes using models that rely on measurements of predictor variables such as tree height and diameter, both of which are subject to measurement error. The aim of this study was to quantify the impacts of these measurement errors on the uncertainty of the model-based tree stem volume estimates. The impacts...
Enhanced Estimation of Terrestrial Loadings for TMDLs: Normalization Approach
USDA-ARS?s Scientific Manuscript database
TMDL implementation plans to remediate pathogen-impaired streams are usually based on deterministic terrestrial fate and transport (DTFT) models. A novel protocol is proposed that can effectively, efficiently, and explicitly capture the predictive uncertainty of DTFT models used to establish terres...
Incorporating parametric uncertainty into population viability analysis models
McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.
2011-01-01
Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xuesong; Liang, Faming; Yu, Beibei
2011-11-09
Estimating uncertainty of hydrologic forecasting is valuable to water resources and other relevant decision making processes. Recently, Bayesian Neural Networks (BNNs) have been proved powerful tools for quantifying uncertainty of streamflow forecasting. In this study, we propose a Markov Chain Monte Carlo (MCMC) framework to incorporate the uncertainties associated with input, model structure, and parameter into BNNs. This framework allows the structure of the neural networks to change by removing or adding connections between neurons and enables scaling of input data by using rainfall multipliers. The results show that the new BNNs outperform the BNNs that only consider uncertainties associatedmore » with parameter and model structure. Critical evaluation of posterior distribution of neural network weights, number of effective connections, rainfall multipliers, and hyper-parameters show that the assumptions held in our BNNs are not well supported. Further understanding of characteristics of different uncertainty sources and including output error into the MCMC framework are expected to enhance the application of neural networks for uncertainty analysis of hydrologic forecasting.« less
Representations of stem cell clinics on Twitter.
Kamenova, Kalina; Reshef, Amir; Caulfield, Timothy
2014-12-01
The practice of travelling abroad to receive unproven and unregulated stem cell treatments has become an increasingly problematic global phenomenon known as 'stem cell tourism'. In this paper, we examine representations of nine major clinics and providers of such treatments on the microblogging network Twitter. We collected and conducted a content analysis of Twitter posts (n = 363) by these establishments and by other users mentioning them, focusing specifically on marketing claims about treatment procedures and outcomes, discussions of safety and efficacy of stem cell transplants, and specific representations of patients' experiences. Our analysis has shown that there were explicit claims or suggestions of benefits associated with unproven stem cell treatments in approximately one third of the tweets and that patients' experiences, whenever referenced, were presented as invariably positive and as testimonials about the efficacy of stem cell transplants. Furthermore, the results indicated that the tone of most tweets (60.2 %) was overwhelmingly positive and there were rarely critical discussions about significant health risks associated with unproven stem cell therapies. When placed in the context of past research on the problems associated with the marketing of unproven stem cell therapies, this analysis of representations on Twitter suggests that discussions in social media have also remained largely uncritical of the stem cell tourism phenomenon, with inaccurate representations of risks and benefits for patients.
Assessing Uncertainty in the Toxicology of PFOA
We use an approach known as Bayesian statistics to characterize our knowledge about the behavior of a chemical prior to an experiment and make explicit our assumptions about how we think the chemical behaves. When we then analyze the results of an experiment, we determine probab...
Maas-Hebner, Kathleen G.; Schreck, Carl B.; Hughes, Robert M.; Yeakley, Alan; Molina, Nancy
2016-01-01
We discuss the importance of addressing diffuse threats to long-term species and habitat viability in fish conservation and recovery planning. In the Pacific Northwest, USA, salmonid management plans have typically focused on degraded freshwater habitat, dams, fish passage, harvest rates, and hatchery releases. However, such plans inadequately address threats related to human population and economic growth, intra- and interspecific competition, and changes in climate, ocean, and estuarine conditions. Based on reviews conducted on eight conservation and/or recovery plans, we found that though threats resulting from such changes are difficult to model and/or predict, they are especially important for wide-ranging diadromous species. Adaptive management is also a critical but often inadequately constructed component of those plans. Adaptive management should be designed to respond to evolving knowledge about the fish and their supporting ecosystems; if done properly, it should help improve conservation efforts by decreasing uncertainty regarding known and diffuse threats. We conclude with a general call for environmental managers and planners to reinvigorate the adaptive management process in future management plans, including more explicitly identifying critical uncertainties, implementing monitoring programs to reduce those uncertainties, and explicitly stating what management actions will occur when pre-identified trigger points are reached.
NASA Astrophysics Data System (ADS)
Galbraith, D.; Levine, N. M.; Christoffersen, B. O.; Imbuzeiro, H. A.; Powell, T.; Costa, M. H.; Saleska, S. R.; Moorcroft, P. R.; Malhi, Y.
2014-12-01
The mathematical codes embedded within different vegetation models ultimately represent alternative hypotheses of biosphere functioning. While formulations for some processes (e.g. leaf-level photosynthesis) are often shared across vegetation models, other processes (e.g. carbon allocation) are much more variable in their representation across models. This creates the opportunity for equifinality - models can simulate similar values of key metrics such as NPP or biomass through very different underlying causal pathways. Intensive carbon cycle measurements allow for quantification of a comprehensive suite of carbon fluxes such as the productivity and respiration of leaves, roots and wood, allowing for in-depth assessment of carbon flows within ecosystems. Thus, they provide important information on poorly-constrained C-cycle processes such as allocation. We conducted an in-depth evaluation of the ability of four commonly used dynamic global vegetation models (CLM, ED2, IBIS, JULES) to simulate carbon cycle processes at ten lowland Amazonian rainforest sites where individual C-cycle components have been measured. The rigorous model-data comparison procedure allowed identification of biases which were specific to different models, providing clear avenues for model improvement and allowing determination of internal C-cycling pathways that were better supported by data. Furthermore, the intensive C-cycle data allowed for explicit testing of the validity of a number of assumptions made by specific models in the simulation of carbon allocation and plant respiration. For example, the ED2 model assumes that maintenance respiration of stems is negligible while JULES assumes equivalent allocation of NPP to fine roots and leaves. We argue that field studies focusing on simultaneous measurement of a large number of component fluxes are fundamentally important for reducing uncertainty in vegetation model simulations.
Dissociating word stem completion and cued recall as a function of divided attention at retrieval.
Clarke, A J Benjamin; Butler, Laurie T
2008-10-01
The aim of this study was to investigate the widely held, but largely untested, view that implicit memory (repetition priming) reflects an automatic form of retrieval. Specifically, in Experiment 1 we explored whether a secondary task (syllable monitoring), performed during retrieval, would disrupt performance on explicit (cued recall) and implicit (stem completion) memory tasks equally. Surprisingly, despite substantial memory and secondary costs to cued recall when performed with a syllable-monitoring task, the same manipulation had no effect on stem completion priming or on secondary task performance. In Experiment 2 we demonstrated that even when using a particularly demanding version of the stem completion task that incurred secondary task costs, the corresponding disruption to implicit memory performance was minimal. Collectively, the results are consistent with the view that implicit memory retrieval requires little or no processing capacity and is not seemingly susceptible to the effects of dividing attention at retrieval.
Modes of Imprinted Gene Action in Learning Disability
ERIC Educational Resources Information Center
Isles, A. R.; Humby, T.
2006-01-01
Background: It is now widely acknowledged that there may be a genetic contribution to learning disability and neuropsychiatric disorders, stemming from evidence provided by family, twin and adoption studies, and from explicit syndromic conditions. Recently it has been recognized that in some cases the presentation of genetic syndromes (or discrete…
Recollective performance advantages for implicit memory tasks.
Sheldon, Signy A M; Moscovitch, Morris
2010-10-01
A commonly held assumption is that processes underlying explicit and implicit memory are distinct. Recent evidence, however, suggests that they may interact more than previously believed. Using the remember-know procedure the current study examines the relation between recollection, a process thought to be exclusive to explicit memory, and performance on two implicit memory tasks, lexical decision and word stem completion. We found that, for both implicit tasks, words that were recollected were associated with greater priming effects than were words given a subsequent familiarity rating or words that had been studied but were not recognised (misses). Broadly, our results suggest that non-voluntary processes underlying explicit memory also benefit priming, a measure of implicit memory. More specifically, given that this benefit was due to a particular aspect of explicit memory (recollection), these results are consistent with some strength models of memory and with Moscovitch's (2008) proposal that recollection is a two-stage process, one rapid and unconscious and the other more effortful and conscious.
Implicit and Explicit Gender Beliefs in Spatial Ability: Stronger Stereotyping in Boys than Girls.
Vander Heyden, Karin M; van Atteveldt, Nienke M; Huizinga, Mariette; Jolles, Jelle
2016-01-01
Sex differences in spatial ability are a seriously debated topic, given the importance of spatial ability for success in the fields of science, technology, engineering, and mathematics (STEM) and girls' underrepresentation in these domains. In the current study we investigated the presence of stereotypic gender beliefs on spatial ability (i.e., "spatial ability is for boys") in 10- and 12-year-old children. We used both an explicit measure (i.e., a self-report questionnaire) and an implicit measure (i.e., a child IAT). Results of the explicit measure showed that both sexes associated spatial ability with boys, with boys holding more male stereotyped attitudes than girls. On the implicit measure, boys associated spatial ability with boys, while girls were gender-neutral. In addition, we examined the effects of gender beliefs on spatial performance, by experimentally activating gender beliefs within a pretest-instruction-posttest design. We compared three types of instruction: boys are better, girls are better, and no sex differences. No effects of these gender belief instructions were found on children's spatial test performance (i.e., mental rotation and paper folding). The finding that children of this age already have stereotypic beliefs about the spatial capacities of their own sex is important, as these beliefs may influence children's choices for spatial leisure activities and educational tracks in the STEM domain.
Implicit and Explicit Gender Beliefs in Spatial Ability: Stronger Stereotyping in Boys than Girls
Vander Heyden, Karin M.; van Atteveldt, Nienke M.; Huizinga, Mariette; Jolles, Jelle
2016-01-01
Sex differences in spatial ability are a seriously debated topic, given the importance of spatial ability for success in the fields of science, technology, engineering, and mathematics (STEM) and girls' underrepresentation in these domains. In the current study we investigated the presence of stereotypic gender beliefs on spatial ability (i.e., “spatial ability is for boys”) in 10- and 12-year-old children. We used both an explicit measure (i.e., a self-report questionnaire) and an implicit measure (i.e., a child IAT). Results of the explicit measure showed that both sexes associated spatial ability with boys, with boys holding more male stereotyped attitudes than girls. On the implicit measure, boys associated spatial ability with boys, while girls were gender-neutral. In addition, we examined the effects of gender beliefs on spatial performance, by experimentally activating gender beliefs within a pretest—instruction—posttest design. We compared three types of instruction: boys are better, girls are better, and no sex differences. No effects of these gender belief instructions were found on children's spatial test performance (i.e., mental rotation and paper folding). The finding that children of this age already have stereotypic beliefs about the spatial capacities of their own sex is important, as these beliefs may influence children's choices for spatial leisure activities and educational tracks in the STEM domain. PMID:27507956
The deuteron-radius puzzle is alive: A new analysis of nuclear structure uncertainties
NASA Astrophysics Data System (ADS)
Hernandez, O. J.; Ekström, A.; Nevo Dinur, N.; Ji, C.; Bacca, S.; Barnea, N.
2018-03-01
To shed light on the deuteron radius puzzle we analyze the theoretical uncertainties of the nuclear structure corrections to the Lamb shift in muonic deuterium. We find that the discrepancy between the calculated two-photon exchange correction and the corresponding experimentally inferred value by Pohl et al. [1] remain. The present result is consistent with our previous estimate, although the discrepancy is reduced from 2.6 σ to about 2 σ. The error analysis includes statistic as well as systematic uncertainties stemming from the use of nucleon-nucleon interactions derived from chiral effective field theory at various orders. We therefore conclude that nuclear theory uncertainty is more likely not the source of the discrepancy.
The Value of Learning about Natural History in Biodiversity Markets
Bruggeman, Douglas J.
2015-01-01
Markets for biodiversity have generated much controversy because of the often unstated and untested assumptions included in transactions rules. Simple trading rules are favored to reduce transaction costs, but others have argued that this leads to markets that favor development and erode biodiversity. Here, I describe how embracing complexity and uncertainty within a tradable credit system for the Red-cockaded Woodpecker (Picoides borealis) creates opportunities to achieve financial and conservation goals simultaneously. Reversing the effects of habitat fragmentation is one of the main reasons for developing markets. I include uncertainty in habitat fragmentation effects by evaluating market transactions using five alternative dispersal models that were able to approximate observed patterns of occupancy and movement. Further, because dispersal habitat is often not included in market transactions, I contrast how changes in breeding versus dispersal habitat affect credit values. I use an individually-based, spatially-explicit population model for the Red-cockaded Woodpecker (Picoides borealis) to predict spatial- and temporal- influences of landscape change on species occurrence and genetic diversity. Results indicated that the probability of no net loss of abundance and genetic diversity responded differently to the transient dynamics in breeding and dispersal habitat. Trades that do not violate the abundance cap may simultaneously violate the cap for the erosion of genetic diversity. To highlight how economic incentives may help reduce uncertainty, I demonstrate tradeoffs between the value of tradable credits and the value of information needed to predict the influence of habitat trades on population viability. For the trade with the greatest uncertainty regarding the change in habitat fragmentation, I estimate that the value of using 13-years of data to reduce uncertainty in dispersal behaviors is $6.2 million. Future guidance for biodiversity markets should at least encourage the use of spatially- and temporally-explicit techniques that include population genetic estimates and the influence of uncertainty. PMID:26675488
Uncertainty in flood forecasting: A distributed modeling approach in a sparse data catchment
NASA Astrophysics Data System (ADS)
Mendoza, Pablo A.; McPhee, James; Vargas, Ximena
2012-09-01
Data scarcity has traditionally precluded the application of advanced hydrologic techniques in developing countries. In this paper, we evaluate the performance of a flood forecasting scheme in a sparsely monitored catchment based on distributed hydrologic modeling, discharge assimilation, and numerical weather predictions with explicit validation uncertainty analysis. For the hydrologic component of our framework, we apply TopNet to the Cautin River basin, located in southern Chile, using a fully distributed a priori parameterization based on both literature-suggested values and data gathered during field campaigns. Results obtained from this step indicate that the incremental effort spent in measuring directly a set of model parameters was insufficient to represent adequately the most relevant hydrologic processes related to spatiotemporal runoff patterns. Subsequent uncertainty validation performed over a six month ensemble simulation shows that streamflow uncertainty is better represented during flood events, due to both the increase of state perturbation introduced by rainfall and the flood-oriented calibration strategy adopted here. Results from different assimilation configurations suggest that the upper part of the basin is the major source of uncertainty in hydrologic process representation and hint at the usefulness of interpreting assimilation results in terms of model input and parameterization inadequacy. Furthermore, in this case study the violation of Markovian state properties by the Ensemble Kalman filter did affect the numerical results, showing that an explicit treatment of the time delay between the generation of surface runoff and the arrival at the basin outlet is required in the assimilation scheme. Peak flow forecasting results demonstrate that there is a major problem with the Weather Research and Forecasting model outputs, which systematically overestimate precipitation over the catchment. A final analysis performed for a large flooding event that occurred in July 2006 shows that, in the absence of bias introduced by an incorrect model calibration, the updating of both model states and meteorological forecasts contributes to a better representation of streamflow uncertainty and to better hydrologic forecasts.
The Value of Learning about Natural History in Biodiversity Markets.
Bruggeman, Douglas J
2015-01-01
Markets for biodiversity have generated much controversy because of the often unstated and untested assumptions included in transactions rules. Simple trading rules are favored to reduce transaction costs, but others have argued that this leads to markets that favor development and erode biodiversity. Here, I describe how embracing complexity and uncertainty within a tradable credit system for the Red-cockaded Woodpecker (Picoides borealis) creates opportunities to achieve financial and conservation goals simultaneously. Reversing the effects of habitat fragmentation is one of the main reasons for developing markets. I include uncertainty in habitat fragmentation effects by evaluating market transactions using five alternative dispersal models that were able to approximate observed patterns of occupancy and movement. Further, because dispersal habitat is often not included in market transactions, I contrast how changes in breeding versus dispersal habitat affect credit values. I use an individually-based, spatially-explicit population model for the Red-cockaded Woodpecker (Picoides borealis) to predict spatial- and temporal- influences of landscape change on species occurrence and genetic diversity. Results indicated that the probability of no net loss of abundance and genetic diversity responded differently to the transient dynamics in breeding and dispersal habitat. Trades that do not violate the abundance cap may simultaneously violate the cap for the erosion of genetic diversity. To highlight how economic incentives may help reduce uncertainty, I demonstrate tradeoffs between the value of tradable credits and the value of information needed to predict the influence of habitat trades on population viability. For the trade with the greatest uncertainty regarding the change in habitat fragmentation, I estimate that the value of using 13-years of data to reduce uncertainty in dispersal behaviors is $6.2 million. Future guidance for biodiversity markets should at least encourage the use of spatially- and temporally-explicit techniques that include population genetic estimates and the influence of uncertainty.
Tolerance of uncertainty: Conceptual analysis, integrative model, and implications for healthcare.
Hillen, Marij A; Gutheil, Caitlin M; Strout, Tania D; Smets, Ellen M A; Han, Paul K J
2017-05-01
Uncertainty tolerance (UT) is an important, well-studied phenomenon in health care and many other important domains of life, yet its conceptualization and measurement by researchers in various disciplines have varied substantially and its essential nature remains unclear. The objectives of this study were to: 1) analyze the meaning and logical coherence of UT as conceptualized by developers of UT measures, and 2) develop an integrative conceptual model to guide future empirical research regarding the nature, causes, and effects of UT. A narrative review and conceptual analysis of 18 existing measures of Uncertainty and Ambiguity Tolerance was conducted, focusing on how measure developers in various fields have defined both the "uncertainty" and "tolerance" components of UT-both explicitly through their writings and implicitly through the items constituting their measures. Both explicit and implicit conceptual definitions of uncertainty and tolerance vary substantially and are often poorly and inconsistently specified. A logically coherent, unified understanding or theoretical model of UT is lacking. To address these gaps, we propose a new integrative definition and multidimensional conceptual model that construes UT as the set of negative and positive psychological responses-cognitive, emotional, and behavioral-provoked by the conscious awareness of ignorance about particular aspects of the world. This model synthesizes insights from various disciplines and provides an organizing framework for future research. We discuss how this model can facilitate further empirical and theoretical research to better measure and understand the nature, determinants, and outcomes of UT in health care and other domains of life. Uncertainty tolerance is an important and complex phenomenon requiring more precise and consistent definition. An integrative definition and conceptual model, intended as a tentative and flexible point of departure for future research, adds needed breadth, specificity, and precision to efforts to conceptualize and measure UT. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
NASA Astrophysics Data System (ADS)
Chen, Peng-Fei; Sun, Wen-Yang; Ming, Fei; Huang, Ai-Jun; Wang, Dong; Ye, Liu
2018-01-01
Quantum objects are susceptible to noise from their surrounding environments, interaction with which inevitably gives rise to quantum decoherence or dissipation effects. In this work, we examine how different types of local noise under an open system affect entropic uncertainty relations for two incompatible measurements. Explicitly, we observe the dynamics of the entropic uncertainty in the presence of quantum memory under two canonical categories of noisy environments: unital (phase flip) and nonunital (amplitude damping). Our study shows that the measurement uncertainty exhibits a non-monotonic dynamical behavior—that is, the amount of the uncertainty will first inflate, and subsequently decrease, with the growth of decoherence strengths in the two channels. In contrast, the uncertainty decreases monotonically with the growth of the purity of the initial state shared in prior. In order to reduce the measurement uncertainty in noisy environments, we put forward a remarkably effective strategy to steer the magnitude of uncertainty by means of a local non-unitary operation (i.e. weak measurement) on the qubit of interest. It turns out that this non-unitary operation can greatly reduce the entropic uncertainty, upon tuning the operation strength. Our investigations might thereby offer an insight into the dynamics and steering of entropic uncertainty in open systems.
Robust root clustering for linear uncertain systems using generalized Lyapunov theory
NASA Technical Reports Server (NTRS)
Yedavalli, R. K.
1993-01-01
Consideration is given to the problem of matrix root clustering in subregions of a complex plane for linear state space models with real parameter uncertainty. The nominal matrix root clustering theory of Gutman & Jury (1981) using the generalized Liapunov equation is extended to the perturbed matrix case, and bounds are derived on the perturbation to maintain root clustering inside a given region. The theory makes it possible to obtain an explicit relationship between the parameters of the root clustering region and the uncertainty range of the parameter space.
Estimating uncertainty in map intersections
Ronald E. McRoberts; Mark A. Hatfield; Susan J. Crocker
2009-01-01
Traditionally, natural resource managers have asked the question "How much?" and have received sample-based estimates of resource totals or means. Increasingly, however, the same managers are now asking the additional question "Where?" and are expecting spatially explicit answers in the form of maps. Recent development of natural resource databases...
We introduce a hierarchical optimization framework for spatially targeting green infrastructure (GI) incentive policies in order to meet objectives related to cost and environmental effectiveness. The framework explicitly simulates the interaction between multiple levels of polic...
On the Teaching of Portfolio Theory.
ERIC Educational Resources Information Center
Biederman, Daniel K.
1992-01-01
Demonstrates how a simple portfolio problem expressed explicitly as an expected utility maximization problem can be used to instruct students in portfolio theory. Discusses risk aversion, decision making under uncertainty, and the limitations of the traditional mean variance approach. Suggests students may develop a greater appreciation of general…
Fuller, Robert William; Wong, Tony E; Keller, Klaus
2017-01-01
The response of the Antarctic ice sheet (AIS) to changing global temperatures is a key component of sea-level projections. Current projections of the AIS contribution to sea-level changes are deeply uncertain. This deep uncertainty stems, in part, from (i) the inability of current models to fully resolve key processes and scales, (ii) the relatively sparse available data, and (iii) divergent expert assessments. One promising approach to characterizing the deep uncertainty stemming from divergent expert assessments is to combine expert assessments, observations, and simple models by coupling probabilistic inversion and Bayesian inversion. Here, we present a proof-of-concept study that uses probabilistic inversion to fuse a simple AIS model and diverse expert assessments. We demonstrate the ability of probabilistic inversion to infer joint prior probability distributions of model parameters that are consistent with expert assessments. We then confront these inferred expert priors with instrumental and paleoclimatic observational data in a Bayesian inversion. These additional constraints yield tighter hindcasts and projections. We use this approach to quantify how the deep uncertainty surrounding expert assessments affects the joint probability distributions of model parameters and future projections.
NASA Astrophysics Data System (ADS)
Freeman, R.; Bathon, J.; Fryar, A. E.; Lyon, E.; McGlue, M. M.
2017-12-01
As national awareness of the importance of STEM education has grown, so too has the number of high schools that specifically emphasize STEM education. Students at these schools outperform their peers and these institutions send students into the college STEM pipeline at twice the rate of the average high school or more. Another trend in secondary education is the "early college high school" (ECHS) model, which encourages students to prepare for and attend college while in high school. These high schools, particularly ECHS's that focus on STEM, represent a natural pool for recruitment into the geosciences, yet most efforts at linking high school STEM education to future careers focus on health sciences or engineering. Through the NSF GEOPATHS-IMPACT program, the University of Kentucky (UK) Department of Earth and Environmental Science and the STEAM Academy, a STEM-focused ECHS located in Lexington, KY, have partnered to expose students to geoscience content. This public ECHS admits students using a lottery system to ensure that the demographics of the high school match those of the surrounding community. The perennial problem for recruiting students into geosciences is the lack of awareness of it as a potential career, due to lack of exposure to the subject in high school. Although the STEAM Academy does not offer an explicitly-named geoscience course, students begin their first semester in 9th grade Integrated Science. This course aligns to the Next Generation Science Standards (NGSS), which include a variety of geoscience content. We are working with the teachers to build a project-based learning curriculum to include explicit mention and awareness of careers in geosciences. The second phase of our project involves taking advantage of the school's existing internship program, in which students develop professional skills and career awareness by spending either one day/week or one hour/day off campus. We hosted our second round of interns this year. Eventually we plan to enroll interested students in introductory earth science courses in our department or at a nearby community college. We hope to build a model for establishing a pipeline from an ECHS STEM high school to a geoscience department that can be implemented by other universities. Here we present the highlights and challenges of this first year of our program.
DeWeber, Jefferson T; Wagner, Tyler
2018-06-01
Predictions of the projected changes in species distributions and potential adaptation action benefits can help guide conservation actions. There is substantial uncertainty in projecting species distributions into an unknown future, however, which can undermine confidence in predictions or misdirect conservation actions if not properly considered. Recent studies have shown that the selection of alternative climate metrics describing very different climatic aspects (e.g., mean air temperature vs. mean precipitation) can be a substantial source of projection uncertainty. It is unclear, however, how much projection uncertainty might stem from selecting among highly correlated, ecologically similar climate metrics (e.g., maximum temperature in July, maximum 30-day temperature) describing the same climatic aspect (e.g., maximum temperatures) known to limit a species' distribution. It is also unclear how projection uncertainty might propagate into predictions of the potential benefits of adaptation actions that might lessen climate change effects. We provide probabilistic measures of climate change vulnerability, adaptation action benefits, and related uncertainty stemming from the selection of four maximum temperature metrics for brook trout (Salvelinus fontinalis), a cold-water salmonid of conservation concern in the eastern United States. Projected losses in suitable stream length varied by as much as 20% among alternative maximum temperature metrics for mid-century climate projections, which was similar to variation among three climate models. Similarly, the regional average predicted increase in brook trout occurrence probability under an adaptation action scenario of full riparian forest restoration varied by as much as .2 among metrics. Our use of Bayesian inference provides probabilistic measures of vulnerability and adaptation action benefits for individual stream reaches that properly address statistical uncertainty and can help guide conservation actions. Our study demonstrates that even relatively small differences in the definitions of climate metrics can result in very different projections and reveal high uncertainty in predicted climate change effects. © 2018 John Wiley & Sons Ltd.
DeWeber, Jefferson T.; Wagner, Tyler
2018-01-01
Predictions of the projected changes in species distributions and potential adaptation action benefits can help guide conservation actions. There is substantial uncertainty in projecting species distributions into an unknown future, however, which can undermine confidence in predictions or misdirect conservation actions if not properly considered. Recent studies have shown that the selection of alternative climate metrics describing very different climatic aspects (e.g., mean air temperature vs. mean precipitation) can be a substantial source of projection uncertainty. It is unclear, however, how much projection uncertainty might stem from selecting among highly correlated, ecologically similar climate metrics (e.g., maximum temperature in July, maximum 30‐day temperature) describing the same climatic aspect (e.g., maximum temperatures) known to limit a species’ distribution. It is also unclear how projection uncertainty might propagate into predictions of the potential benefits of adaptation actions that might lessen climate change effects. We provide probabilistic measures of climate change vulnerability, adaptation action benefits, and related uncertainty stemming from the selection of four maximum temperature metrics for brook trout (Salvelinus fontinalis), a cold‐water salmonid of conservation concern in the eastern United States. Projected losses in suitable stream length varied by as much as 20% among alternative maximum temperature metrics for mid‐century climate projections, which was similar to variation among three climate models. Similarly, the regional average predicted increase in brook trout occurrence probability under an adaptation action scenario of full riparian forest restoration varied by as much as .2 among metrics. Our use of Bayesian inference provides probabilistic measures of vulnerability and adaptation action benefits for individual stream reaches that properly address statistical uncertainty and can help guide conservation actions. Our study demonstrates that even relatively small differences in the definitions of climate metrics can result in very different projections and reveal high uncertainty in predicted climate change effects.
Seidl, Rupert; Lexer, Manfred J
2013-01-15
The unabated continuation of anthropogenic greenhouse gas emissions and the lack of an international consensus on a stringent climate change mitigation policy underscore the importance of adaptation for coping with the all but inevitable changes in the climate system. Adaptation measures in forestry have particularly long lead times. A timely implementation is thus crucial for reducing the considerable climate vulnerability of forest ecosystems. However, since future environmental conditions as well as future societal demands on forests are inherently uncertain, a core requirement for adaptation is robustness to a wide variety of possible futures. Here we explicitly address the roles of climatic and social uncertainty in forest management, and tackle the question of robustness of adaptation measures in the context of multi-objective sustainable forest management (SFM). We used the Austrian Federal Forests (AFF) as a case study, and employed a comprehensive vulnerability assessment framework based on ecosystem modeling, multi-criteria decision analysis, and practitioner participation. We explicitly considered climate uncertainty by means of three climate change scenarios, and accounted for uncertainty in future social demands by means of three societal preference scenarios regarding SFM indicators. We found that the effects of climatic and social uncertainty on the projected performance of management were in the same order of magnitude, underlining the notion that climate change adaptation requires an integrated social-ecological perspective. Furthermore, our analysis of adaptation measures revealed considerable trade-offs between reducing adverse impacts of climate change and facilitating adaptive capacity. This finding implies that prioritization between these two general aims of adaptation is necessary in management planning, which we suggest can draw on uncertainty analysis: Where the variation induced by social-ecological uncertainty renders measures aiming to reduce climate change impacts statistically insignificant (i.e., for approximately one third of the investigated management units of the AFF case study), fostering adaptive capacity is suggested as the preferred pathway for adaptation. We conclude that climate change adaptation needs to balance between anticipating expected future conditions and building the capacity to address unknowns and surprises. Copyright © 2012 Elsevier Ltd. All rights reserved.
Uncertainty and research needs for supplementing wild populations of anadromous Pacific salmon
Reisenbichler, R.R.
2005-01-01
Substantial disagreement and uncertainty attend the question of whether the benefits from supplementing wild populations of anadromous salmonids with hatchery fish outweigh the risks. Prudent decisions about supplementation are most likely when the suite of potential benefits and hazards and the various sources of uncertainty are explicitly identified. Models help by indicating the potential consequences of various levels of supplementation but perhaps are most valuable for showing the limitations of available data and helping design studies and monitoring to provide critical data. Information and understanding about the issue are deficient. I discuss various benefits, hazards, and associated uncertainties for supplementation, and implications for the design of monitoring and research. Several studies to reduce uncertainty and facilitate prudent supplementation are described and range from short-term reductionistic studies that help define the issue or help avoid deleterious consequences from supplementation to long-term studies (ca. 10 or more fish generations) that evaluate the net result of positive and negative genetic, behavioral, and ecological effects from supplementation.
Integrated economic and climate projections for impact assessment
We designed scenarios for impact assessment that explicitly address policy choices and uncertainty in climate response. Economic projections and the resulting greenhouse gas emissions for the “no climate policy” scenario and two stabilization scenarios: at 4.5 W/m2 and 3.7 W/m2 b...
We present a multi-faceted sensitivity analysis of a spatially explicit, individual-based model (IBM) (HexSim) of a threatened species, the Northern Spotted Owl (Strix occidentalis caurina) on a national forest in Washington, USA. Few sensitivity analyses have been conducted on ...
Known unknowns: building an ethics of uncertainty into genomic medicine.
Newson, Ainsley J; Leonard, Samantha J; Hall, Alison; Gaff, Clara L
2016-09-01
Genomic testing has reached the point where, technically at least, it can be cheaper to undertake panel-, exome- or whole genome testing than it is to sequence a single gene. An attribute of these approaches is that information gleaned will often have uncertain significance. In addition to the challenges this presents for pre-test counseling and informed consent, a further consideration emerges over how - ethically - we should conceive of and respond to this uncertainty. To date, the ethical aspects of uncertainty in genomics have remained under-explored. In this paper, we draft a conceptual and ethical response to the question of how to conceive of and respond to uncertainty in genomic medicine. After introducing the problem, we articulate a concept of 'genomic uncertainty'. Drawing on this, together with exemplar clinical cases and related empirical literature, we then critique the presumption that uncertainty is always problematic and something to be avoided, or eradicated. We conclude by outlining an 'ethics of genomic uncertainty'; describing how we might handle uncertainty in genomic medicine. This involves fostering resilience, welfare, autonomy and solidarity. Uncertainty will be an inherent aspect of clinical practice in genomics for some time to come. Genomic testing should not be offered with the explicit aim to reduce uncertainty. Rather, uncertainty should be appraised, adapted to and communicated about as part of the process of offering and providing genomic information.
A Bayesian approach to model structural error and input variability in groundwater modeling
NASA Astrophysics Data System (ADS)
Xu, T.; Valocchi, A. J.; Lin, Y. F. F.; Liang, F.
2015-12-01
Effective water resource management typically relies on numerical models to analyze groundwater flow and solute transport processes. Model structural error (due to simplification and/or misrepresentation of the "true" environmental system) and input forcing variability (which commonly arises since some inputs are uncontrolled or estimated with high uncertainty) are ubiquitous in groundwater models. Calibration that overlooks errors in model structure and input data can lead to biased parameter estimates and compromised predictions. We present a fully Bayesian approach for a complete assessment of uncertainty for spatially distributed groundwater models. The approach explicitly recognizes stochastic input and uses data-driven error models based on nonparametric kernel methods to account for model structural error. We employ exploratory data analysis to assist in specifying informative prior for error models to improve identifiability. The inference is facilitated by an efficient sampling algorithm based on DREAM-ZS and a parameter subspace multiple-try strategy to reduce the required number of forward simulations of the groundwater model. We demonstrate the Bayesian approach through a synthetic case study of surface-ground water interaction under changing pumping conditions. It is found that explicit treatment of errors in model structure and input data (groundwater pumping rate) has substantial impact on the posterior distribution of groundwater model parameters. Using error models reduces predictive bias caused by parameter compensation. In addition, input variability increases parametric and predictive uncertainty. The Bayesian approach allows for a comparison among the contributions from various error sources, which could inform future model improvement and data collection efforts on how to best direct resources towards reducing predictive uncertainty.
NASA Astrophysics Data System (ADS)
Ginn, T. R.; Scheibe, T. D.
2006-12-01
Hydrogeology is among the most data-limited of the earth sciences, so that uncertainty arises in every aspect of subsurface flow and transport modeling, from conceptual model to spatial discretization to parameter values. Thus treatment of uncertainty is unavoidable, and the literature and conference proceedings are replete with approaches, templates, paradigms and such for doing so. However, such tools remain not well used, especially those of the stochastic analytic sort, leading recently to explicit inquiries about why this is the case, in response to which entire journal issues have been dedicated. In an effort to continue this discussion in a constructive way we report on an informal yet extensive survey of hydrogeology practitioners, as the "marketplace" for techniques to deal with uncertainty. We include scientists, engineers, regulators, and others in the survey, that reports on quantitative (or not) methods for uncertainty characterization and analysis, frequency and level of usage, and reasons behind the selection or avoidance of available methods. Results shed light on fruitful directions for future research in uncertainty quantification in hydrogeology.
Returning to STEM: Gendered Factors Affecting Employability for Mature Women Students
ERIC Educational Resources Information Center
Herman, Clem
2015-01-01
This paper adds to current discourses around employability by arguing for an explicit recognition of gender, in particular in relation to women's employment in male-dominated sectors such as science, engineering and technology. This is not limited to young first-time graduates but continues and evolves throughout the life course. Mature women…
The Effects of STEM PBL on Students' Mathematical and Scientific Vocabulary Knowledge
ERIC Educational Resources Information Center
Bilgin, Ali; Boedeker, Peter; Capraro, Robert M.; Capraro, Mary M.
2015-01-01
Vocabulary is at the surface level of language usage; thus, students need to develop mathematical and scientific vocabulary to be able to explicitly communicate their mathematical and scientific reasoning with others. The National Council of Teachers of Mathematics (NCTM) and the National Science Teachers Association (NSTA) have both created…
Uncertainty in quantum mechanics: faith or fantasy?
Penrose, Roger
2011-12-13
The word 'uncertainty', in the context of quantum mechanics, usually evokes an impression of an essential unknowability of what might actually be going on at the quantum level of activity, as is made explicit in Heisenberg's uncertainty principle, and in the fact that the theory normally provides only probabilities for the results of quantum measurement. These issues limit our ultimate understanding of the behaviour of things, if we take quantum mechanics to represent an absolute truth. But they do not cause us to put that very 'truth' into question. This article addresses the issue of quantum 'uncertainty' from a different perspective, raising the question of whether this term might be applied to the theory itself, despite its unrefuted huge success over an enormously diverse range of observed phenomena. There are, indeed, seeming internal contradictions in the theory that lead us to infer that a total faith in it at all levels of scale leads us to almost fantastical implications.
Defensive zeal and the uncertain self: what makes you so sure?
McGregor, Ian; Marigold, Denise C
2003-11-01
In Studies 1-3, undergraduates with high self-esteem (HSEs) reacted to personal uncertainty-threats with compensatory conviction about unrelated issues and aspects of the self. In Study 1 HSEs reacted to salience of personal dilemmas with increased implicit conviction about self-definition. In Study 2 they reacted to the same uncertainty-threat with increased explicit conviction about social issues. In Study 3, HSEs (particularly defensive HSEs, i.e., with low implicit self-esteem; C. H. Jordan, S. J. Spencer, & M. P. Zanna, 2003) reacted to uncertainty about a personal relationship with compensatory conviction about social issues. For HSEs in Study 4, expressing convictions about social issues decreased subjective salience of dilemma-related uncertainties that were not related to the social issues. Compensatory conviction is viewed as a mode of repression, akin to reaction formation, that helps keep unwanted thoughts out of awareness.
Deng, Yue; Bao, Feng; Yang, Yang; Ji, Xiangyang; Du, Mulong; Zhang, Zhengdong
2017-01-01
Abstract The automated transcript discovery and quantification of high-throughput RNA sequencing (RNA-seq) data are important tasks of next-generation sequencing (NGS) research. However, these tasks are challenging due to the uncertainties that arise in the inference of complete splicing isoform variants from partially observed short reads. Here, we address this problem by explicitly reducing the inherent uncertainties in a biological system caused by missing information. In our approach, the RNA-seq procedure for transforming transcripts into short reads is considered an information transmission process. Consequently, the data uncertainties are substantially reduced by exploiting the information transduction capacity of information theory. The experimental results obtained from the analyses of simulated datasets and RNA-seq datasets from cell lines and tissues demonstrate the advantages of our method over state-of-the-art competitors. Our algorithm is an open-source implementation of MaxInfo. PMID:28911101
McBride, Marissa F; Wilson, Kerrie A; Bode, Michael; Possingham, Hugh P
2007-12-01
Uncertainty in the implementation and outcomes of conservation actions that is not accounted for leaves conservation plans vulnerable to potential changes in future conditions. We used a decision-theoretic approach to investigate the effects of two types of investment uncertainty on the optimal allocation of global conservation resources for land acquisition in the Mediterranean Basin. We considered uncertainty about (1) whether investment will continue and (2) whether the acquired biodiversity assets are secure, which we termed transaction uncertainty and performance uncertainty, respectively. We also developed and tested the robustness of different rules of thumb for guiding the allocation of conservation resources when these sources of uncertainty exist. In the presence of uncertainty in future investment ability (transaction uncertainty), the optimal strategy was opportunistic, meaning the investment priority should be to act where uncertainty is highest while investment remains possible. When there was a probability that investments would fail (performance uncertainty), the optimal solution became a complex trade-off between the immediate biodiversity benefits of acting in a region and the perceived longevity of the investment. In general, regions were prioritized for investment when they had the greatest performance certainty, even if an alternative region was highly threatened or had higher biodiversity value. The improved performance of rules of thumb when accounting for uncertainty highlights the importance of explicitly incorporating sources of investment uncertainty and evaluating potential conservation investments in the context of their likely long-term success.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vrugt, Jasper A; Robinson, Bruce A; Ter Braak, Cajo J F
In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. Particularly, there is strong disagreement whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov Chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented usingmore » the recently developed differential evolution adaptive metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effect of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understand and predict the flow of water through catchments.« less
Presentation of uncertainties on web platforms for climate change information
NASA Astrophysics Data System (ADS)
Nocke, Thomas; Wrobel, Markus; Reusser, Dominik
2014-05-01
Climate research has a long tradition, however there is still uncertainty about the specific effects of climate change. One of the key tasks is - beyond discussing climate change and its impacts in specialist groups - to present these to a wider audience. In that respect, decision-makers in the public sector as well as directly affected professional groups require to obtain easy-to-understand information. These groups are not made up of specialist scientists. This gives rise to the challenge that the scientific information must be presented such that it is commonly understood, however, the complexity of the science behind needs to be incorporated. In particular, this requires the explicit representation of spatial and temporal uncertainty information to lay people. Within this talk/poster we survey how climate change and climate impact uncertainty information is presented on various climate service web-based platforms. We outline how the specifics of this medium make it challenging to find adequate and readable representations of uncertainties. First, we introduce a multi-step approach in communicating the uncertainty basing on a typology of uncertainty distinguishing between epistemic, natural stochastic, and human reflexive uncertainty. Then, we compare existing concepts and representations for uncertainty communication with current practices on web-based platforms, including own solutions within our web platforms ClimateImpactsOnline and ci:grasp. Finally, we review surveys on how spatial uncertainty visualization techniques are conceived by untrainded users.
EXPECT: Explicit Representations for Flexible Acquisition
NASA Technical Reports Server (NTRS)
Swartout, BIll; Gil, Yolanda
1995-01-01
To create more powerful knowledge acquisition systems, we not only need better acquisition tools, but we need to change the architecture of the knowledge based systems we create so that their structure will provide better support for acquisition. Current acquisition tools permit users to modify factual knowledge but they provide limited support for modifying problem solving knowledge. In this paper, the authors argue that this limitation (and others) stem from the use of incomplete models of problem-solving knowledge and inflexible specification of the interdependencies between problem-solving and factual knowledge. We describe the EXPECT architecture which addresses these problems by providing an explicit representation for problem-solving knowledge and intent. Using this more explicit representation, EXPECT can automatically derive the interdependencies between problem-solving and factual knowledge. By deriving these interdependencies from the structure of the knowledge-based system itself EXPECT supports more flexible and powerful knowledge acquisition.
How measurement science can improve confidence in research results.
Plant, Anne L; Becker, Chandler A; Hanisch, Robert J; Boisvert, Ronald F; Possolo, Antonio M; Elliott, John T
2018-04-01
The current push for rigor and reproducibility is driven by a desire for confidence in research results. Here, we suggest a framework for a systematic process, based on consensus principles of measurement science, to guide researchers and reviewers in assessing, documenting, and mitigating the sources of uncertainty in a study. All study results have associated ambiguities that are not always clarified by simply establishing reproducibility. By explicitly considering sources of uncertainty, noting aspects of the experimental system that are difficult to characterize quantitatively, and proposing alternative interpretations, the researcher provides information that enhances comparability and reproducibility.
Analysis of potential trade-offs in regulation of disinfection by-products
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cromwell, J.E.; Zhang, X.; Regli, S.
1992-11-01
Executive Order 12291 requires the preparation of a Regulatory Impact Analysis (RIA) on all new major federal regulations. The goal of an RIA is to develop and organize information on benefits, costs, and economic impacts so as to clarify trade-offs among alternative regulatory options. This paper outlines explicit methodology for assessing the technical potential for risk-risk tradeoffs. The strategies used to cope with complexities and uncertainties in developing the Disinfection By-Products Regulatory Analysis Model are explained. Results are presented and discussed in light of uncertainties, and in light of the analytical requirements for regulatory impact analysis.
Memory performance in Holocaust survivors with posttraumatic stress disorder.
Golier, Julia A; Yehuda, Rachel; Lupien, Sonia J; Harvey, Philip D; Grossman, Robert; Elkin, Abbie
2002-10-01
The authors evaluated memory performance in Holocaust survivors and its association with posttraumatic stress disorder (PTSD) and age. Memory performance was measured in Holocaust survivors with PTSD (N=31) and without PTSD (N=16) and healthy Jewish adults not exposed to the Holocaust (N=35). Explicit and implicit memory were measured by paired-associate recall and word stem completion, respectively. The groups did not differ by age or gender, but the survivors with PTSD had significantly fewer years of education and had lower estimated IQs than the survivors without PTSD and the nonexposed group. There was a significant overall group effect for paired-associate recall but not word stem completion. The survivors with PTSD recalled fewer semantically unrelated words than did the survivors without PTSD and the nonexposed group and fewer semantically related words than the nonexposed group. Of the survivors with PTSD, 36% performed at a level indicative of frank cognitive impairment. Older age was significantly associated with poorer paired-associate recall in the survivors with PTSD but not in the other two groups. Markedly poorer explicit but not implicit memory was found in Holocaust survivors with PTSD, which may be a consequence of or a risk factor for chronic PTSD. Accelerated memory decline is one of several explanations for the significantly greater association of older age with poorer explicit memory in survivors with PTSD, which, if present, could increase the cognitive burden of this illness with aging.
Priming within and across modalities: exploring the nature of rCBF increases and decreases.
Badgaiyan, R D; Schacter, D L; Alpert, N M
2001-02-01
Neuroimaging studies suggest that within-modality priming is associated with reduced regional cerebral blood flow (rCBF) in the extrastriate area, whereas cross-modality priming is associated with increased rCBF in prefrontal cortex. To characterize the nature of rCBF changes in within- and cross-modality priming, we conducted two neuroimaging experiments using positron emission tomography (PET). In experiment 1, rCBF changes in within-modality auditory priming on a word stem completion task were observed under same- and different-voice conditions. Both conditions were associated with decreased rCBF in extrastriate cortex. In the different-voice condition there were additional rCBF changes in the middle temporal gyrus and prefrontal cortex. Results suggest that the extrastriate involvement in within-modality priming is sensitive to a change in sensory modality of target stimuli between study and test, but not to a change in the feature of a stimulus within the same modality. In experiment 2, we studied cross-modality priming on a visual stem completion test after encoding under full- and divided-attention conditions. Increased rCBF in the anterior prefrontal cortex was observed in the full- but not in the divided-attention condition. Because explicit retrieval is compromised after encoding under the divided-attention condition, prefrontal involvement in cross-modality priming indicates recruitment of an aspect of explicit retrieval mechanism. The aspect of explicit retrieval that is most likely to be involved in cross-modality priming is the familiarity effect. Copyright 2001 Academic Press.
NASA Astrophysics Data System (ADS)
Ciurean, R. L.; Glade, T.
2012-04-01
Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex and the "degree of loss" estimates imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulted observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, development of visualization tools, but also address basic questions like what is uncertainty and how uncertainty can be quantified or treated in a reliable and reproducible way.
The standard framework of Ecological Risk Assessment (ERA) uses organism-level assessment endpoints to qualitatively determine the risk to populations. While organism-level toxicity data provide the pathway by which a species may be affected by a chemical stressor, they neither i...
Crossing the Threshold: Bringing Biological Variation to the Foreground
ERIC Educational Resources Information Center
Batzli, Janet M.; Knight, Jennifer K.; Hartley, Laurel M.; Maskiewicz, April Cordero; Desy, Elizabeth A.
2016-01-01
Threshold concepts have been referred to as "jewels in the curriculum": concepts that are key to competency in a discipline but not taught explicitly. In biology, researchers have proposed the idea of threshold concepts that include such topics as variation, randomness, uncertainty, and scale. In this essay, we explore how the notion of…
The effects of divided attention on auditory priming.
Mulligan, Neil W; Duke, Marquinn; Cooper, Angela W
2007-09-01
Traditional theorizing stresses the importance of attentional state during encoding for later memory, based primarily on research with explicit memory. Recent research has begun to investigate the role of attention in implicit memory but has focused almost exclusively on priming in the visual modality. The present experiments examined the effect of divided attention on auditory implicit memory, using auditory perceptual identification, word-stem completion and word-fragment completion. Participants heard study words under full attention conditions or while simultaneously carrying out a distractor task (the divided attention condition). In Experiment 1, a distractor task with low response frequency failed to disrupt later auditory priming (but diminished explicit memory as assessed with auditory recognition). In Experiment 2, a distractor task with greater response frequency disrupted priming on all three of the auditory priming tasks as well as the explicit test. These results imply that although auditory priming is less reliant on attention than explicit memory, it is still greatly affected by at least some divided-attention manipulations. These results are consistent with research using visual priming tasks and have relevance for hypotheses regarding attention and auditory priming.
Young Children’s Sensitivity to Their Own Ignorance in Informing Others
Kim, Sunae; Paulus, Markus; Sodian, Beate; Proust, Joelle
2016-01-01
Prior research suggests that young children selectively inform others depending on others’ knowledge states. Yet, little is known whether children selectively inform others depending on their own knowledge states. To explore this issue, we manipulated 3- to 4-year-old children’s knowledge about the content of a box and assessed the impact on their decisions to inform another person. Moreover, we assessed the presence of uncertainty gestures while they inform another person in light of the suggestions that children's gestures reflect early developing, perhaps transient, epistemic sensitivity. Finally, we compared children’s performance in the informing context to their explicit verbal judgment of their knowledge states to further confirm the existence of a performance gap between the two tasks. In their decisions to inform, children tend to accurately assess their ignorance, whereas they tend to overestimate their own knowledge states when asked to explicitly report them. Moreover, children display different levels of uncertainty gestures depending on the varying degrees of their informational access. These findings suggest that children’s implicit awareness of their own ignorance may be facilitated in a social, communicative context. PMID:27023683
Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, J.; Polly, B.; Collis, J.
2013-09-01
This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define 'explicit' input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAEmore » 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.« less
Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
and Ben Polly, Joseph Robertson; Polly, Ben; Collis, Jon
2013-09-01
This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define "explicit" input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAEmore » 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.« less
Risk-based principles for defining and managing water security
Hall, Jim; Borgomeo, Edoardo
2013-01-01
The concept of water security implies concern about potentially harmful states of coupled human and natural water systems. Those harmful states may be associated with water scarcity (for humans and/or the environment), floods or harmful water quality. The theories and practices of risk analysis and risk management have been developed and elaborated to deal with the uncertain occurrence of harmful events. Yet despite their widespread application in public policy, theories and practices of risk management have well-known limitations, particularly in the context of severe uncertainties and contested values. Here, we seek to explore the boundaries of applicability of risk-based principles as a means of formalizing discussion of water security. Not only do risk concepts have normative appeal, but they also provide an explicit means of addressing the variability that is intrinsic to hydrological, ecological and socio-economic systems. We illustrate the nature of these interconnections with a simulation study, which demonstrates how water resources planning could take more explicit account of epistemic uncertainties, tolerability of risk and the trade-offs in risk among different actors. PMID:24080616
Achieving Optimal Quantum Acceleration of Frequency Estimation Using Adaptive Coherent Control.
Naghiloo, M; Jordan, A N; Murch, K W
2017-11-03
Precision measurements of frequency are critical to accurate time keeping and are fundamentally limited by quantum measurement uncertainties. While for time-independent quantum Hamiltonians the uncertainty of any parameter scales at best as 1/T, where T is the duration of the experiment, recent theoretical works have predicted that explicitly time-dependent Hamiltonians can yield a 1/T^{2} scaling of the uncertainty for an oscillation frequency. This quantum acceleration in precision requires coherent control, which is generally adaptive. We experimentally realize this quantum improvement in frequency sensitivity with superconducting circuits, using a single transmon qubit. With optimal control pulses, the theoretically ideal frequency precision scaling is reached for times shorter than the decoherence time. This result demonstrates a fundamental quantum advantage for frequency estimation.
Hong, Jinglan
2012-06-01
Uncertainty information is essential for the proper use of life cycle assessment and environmental assessments in decision making. To investigate the uncertainties of biodiesel and determine the level of confidence in the assertion that biodiesel is more environmentally friendly than diesel, an explicit analytical approach based on the Taylor series expansion for lognormal distribution was applied in the present study. A biodiesel case study demonstrates the probability that biodiesel has a lower global warming and non-renewable energy score than diesel, that is 92.3% and 93.1%, respectively. The results indicate the level of confidence in the assertion that biodiesel is more environmentally friendly than diesel based on the global warming and non-renewable energy scores. Copyright © 2011 Elsevier Ltd. All rights reserved.
Explicitly Teaching Critical Thinking Skills in a History Course
ERIC Educational Resources Information Center
McLaughlin, Anne Collins; McGill, Alicia Ebbitt
2017-01-01
Critical thinking skills are often assessed via student beliefs in non-scientific ways of thinking, (e.g, pseudoscience). Courses aimed at reducing such beliefs have been studied in the STEM fields with the most successful focusing on skeptical thinking. However, critical thinking is not unique to the sciences; it is crucial in the humanities and…
Sensitivity Analysis of Expected Wind Extremes over the Northwestern Sahara and High Atlas Region.
NASA Astrophysics Data System (ADS)
Garcia-Bustamante, E.; González-Rouco, F. J.; Navarro, J.
2017-12-01
A robust statistical framework in the scientific literature allows for the estimation of probabilities of occurrence of severe wind speeds and wind gusts, but does not prevent however from large uncertainties associated with the particular numerical estimates. An analysis of such uncertainties is thus required. A large portion of this uncertainty arises from the fact that historical observations are inherently shorter that the timescales of interest for the analysis of return periods. Additional uncertainties stem from the different choices of probability distributions and other aspects related to methodological issues or physical processes involved. The present study is focused on historical observations over the Ouarzazate Valley (Morocco) and in a high-resolution regional simulation of the wind in the area of interest. The aim is to provide extreme wind speed and wind gust return values and confidence ranges based on a systematic sampling of the uncertainty space for return periods up to 120 years.
Experiences of Uncertainty in Men With an Elevated PSA
Biddle, Caitlin; Brasel, Alicia; Underwood, Willie; Orom, Heather
2016-01-01
A significant proportion of men, ages 50 to 70 years, have, and continue to receive prostate specific antigen (PSA) tests to screen for prostate cancer (PCa). Approximately 70% of men with an elevated PSA level will not subsequently be diagnosed with PCa. Semistructured interviews were conducted with 13 men with an elevated PSA level who had not been diagnosed with PCa. Uncertainty was prominent in men’s reactions to the PSA results, stemming from unanswered questions about the PSA test, PCa risk, and confusion about their management plan. Uncertainty was exacerbated or reduced depending on whether health care providers communicated in lay and empathetic ways, and provided opportunities for question asking. To manage uncertainty, men engaged in information and health care seeking, self-monitoring, and defensive cognition. Results inform strategies for meeting informational needs of men with an elevated PSA and confirm the primary importance of physician communication behavior for open information exchange and uncertainty reduction. PMID:25979635
Wong, Tony E.; Keller, Klaus
2017-01-01
The response of the Antarctic ice sheet (AIS) to changing global temperatures is a key component of sea-level projections. Current projections of the AIS contribution to sea-level changes are deeply uncertain. This deep uncertainty stems, in part, from (i) the inability of current models to fully resolve key processes and scales, (ii) the relatively sparse available data, and (iii) divergent expert assessments. One promising approach to characterizing the deep uncertainty stemming from divergent expert assessments is to combine expert assessments, observations, and simple models by coupling probabilistic inversion and Bayesian inversion. Here, we present a proof-of-concept study that uses probabilistic inversion to fuse a simple AIS model and diverse expert assessments. We demonstrate the ability of probabilistic inversion to infer joint prior probability distributions of model parameters that are consistent with expert assessments. We then confront these inferred expert priors with instrumental and paleoclimatic observational data in a Bayesian inversion. These additional constraints yield tighter hindcasts and projections. We use this approach to quantify how the deep uncertainty surrounding expert assessments affects the joint probability distributions of model parameters and future projections. PMID:29287095
Verifying and Validating Simulation Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hemez, Francois M.
2015-02-23
This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of- knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statisticalmore » sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.« less
A semi-implicit level set method for multiphase flows and fluid-structure interaction problems
NASA Astrophysics Data System (ADS)
Cottet, Georges-Henri; Maitre, Emmanuel
2016-06-01
In this paper we present a novel semi-implicit time-discretization of the level set method introduced in [8] for fluid-structure interaction problems. The idea stems from a linear stability analysis derived on a simplified one-dimensional problem. The semi-implicit scheme relies on a simple filter operating as a pre-processing on the level set function. It applies to multiphase flows driven by surface tension as well as to fluid-structure interaction problems. The semi-implicit scheme avoids the stability constraints that explicit scheme need to satisfy and reduces significantly the computational cost. It is validated through comparisons with the original explicit scheme and refinement studies on two-dimensional benchmarks.
Improving our legacy: Incorporation of adaptive management into state wildlife action plans
Fontaine, J.J.
2011-01-01
The loss of biodiversity is a mounting concern, but despite numerous attempts there are few large scale conservation efforts that have proven successful in reversing current declines. Given the challenge of biodiversity conservation, there is a need to develop strategic conservation plans that address species declines even with the inherent uncertainty in managing multiple species in complex environments. In 2002, the State Wildlife Grant program was initiated to fulfill this need, and while not explicitly outlined by Congress follows the fundamental premise of adaptive management, 'Learning by doing'. When action is necessary, but basic biological information and an understanding of appropriate management strategies are lacking, adaptive management enables managers to be proactive in spite of uncertainty. However, regardless of the strengths of adaptive management, the development of an effective adaptive management framework is challenging. In a review of 53 State Wildlife Action Plans, I found a keen awareness by planners that adaptive management was an effective method for addressing biodiversity conservation, but the development and incorporation of explicit adaptive management approaches within each plan remained elusive. Only ???25% of the plans included a framework for how adaptive management would be implemented at the project level within their state. There was, however, considerable support across plans for further development and implementation of adaptive management. By furthering the incorporation of adaptive management principles in conservation plans and explicitly outlining the decision making process, states will be poised to meet the pending challenges to biodiversity conservation. ?? 2010 .
Improving our legacy: incorporation of adaptive management into state wildlife action plans.
Fontaine, Joseph J
2011-05-01
The loss of biodiversity is a mounting concern, but despite numerous attempts there are few large scale conservation efforts that have proven successful in reversing current declines. Given the challenge of biodiversity conservation, there is a need to develop strategic conservation plans that address species declines even with the inherent uncertainty in managing multiple species in complex environments. In 2002, the State Wildlife Grant program was initiated to fulfill this need, and while not explicitly outlined by Congress follows the fundamental premise of adaptive management, 'Learning by doing'. When action is necessary, but basic biological information and an understanding of appropriate management strategies are lacking, adaptive management enables managers to be proactive in spite of uncertainty. However, regardless of the strengths of adaptive management, the development of an effective adaptive management framework is challenging. In a review of 53 State Wildlife Action Plans, I found a keen awareness by planners that adaptive management was an effective method for addressing biodiversity conservation, but the development and incorporation of explicit adaptive management approaches within each plan remained elusive. Only ~25% of the plans included a framework for how adaptive management would be implemented at the project level within their state. There was, however, considerable support across plans for further development and implementation of adaptive management. By furthering the incorporation of adaptive management principles in conservation plans and explicitly outlining the decision making process, states will be poised to meet the pending challenges to biodiversity conservation. Published by Elsevier Ltd.
Collaborative decision-analytic framework to maximize resilience of tidal marshes to climate change
Thorne, Karen M.; Mattsson, Brady J.; Takekawa, John Y.; Cummings, Jonathan; Crouse, Debby; Block, Giselle; Bloom, Valary; Gerhart, Matt; Goldbeck, Steve; Huning, Beth; Sloop, Christina; Stewart, Mendel; Taylor, Karen; Valoppi, Laura
2015-01-01
Decision makers that are responsible for stewardship of natural resources face many challenges, which are complicated by uncertainty about impacts from climate change, expanding human development, and intensifying land uses. A systematic process for evaluating the social and ecological risks, trade-offs, and cobenefits associated with future changes is critical to maximize resilience and conserve ecosystem services. This is particularly true in coastal areas where human populations and landscape conversion are increasing, and where intensifying storms and sea-level rise pose unprecedented threats to coastal ecosystems. We applied collaborative decision analysis with a diverse team of stakeholders who preserve, manage, or restore tidal marshes across the San Francisco Bay estuary, California, USA, as a case study. Specifically, we followed a structured decision-making approach, and we using expert judgment developed alternative management strategies to increase the capacity and adaptability to manage tidal marsh resilience while considering uncertainties through 2050. Because sea-level rise projections are relatively confident to 2050, we focused on uncertainties regarding intensity and frequency of storms and funding. Elicitation methods allowed us to make predictions in the absence of fully compatible models and to assess short- and long-term trade-offs. Specifically we addressed two questions. (1) Can collaborative decision analysis lead to consensus among a diverse set of decision makers responsible for environmental stewardship and faced with uncertainties about climate change, funding, and stakeholder values? (2) What is an optimal strategy for the conservation of tidal marshes, and what strategy is robust to the aforementioned uncertainties? We found that when taking this approach, consensus was reached among the stakeholders about the best management strategies to maintain tidal marsh integrity. A Bayesian decision network revealed that a strategy considering sea-level rise and storms explicitly in wetland restoration planning and designs was optimal, and it was robust to uncertainties about management effectiveness and budgets. We found that strategies that avoided explicitly accounting for future climate change had the lowest expected performance based on input from the team. Our decision-analytic framework is sufficiently general to offer an adaptable template, which can be modified for use in other areas that include a diverse and engaged stakeholder group.
Calnan, Michael; Hashem, Ferhana; Brown, Patrick
2017-07-01
This article examines the "technological appraisals" carried out by the National Institute for Health and Care Excellence as it regulates the provision of expensive new drugs within the English National Health Service on cost-effectiveness grounds. Ostensibly this is a highly rational process by which the regulatory mechanisms absorb uncertainty, but in practice, decision making remains highly complex and uncertain. This article draws on ethnographic data-interviews with a range of stakeholders and decision makers (n = 41), observations of public and closed appraisal meetings, and documentary analysis-regarding the decision-making processes involving three pharmaceutical products. The study explores the various ways in which different forms of uncertainty are perceived and tackled within these Single Technology Appraisals. Difficulties of dealing with the various levels of uncertainty were manifest and often rendered straightforward decision making problematic. Uncertainties associated with epistemology, procedures, interpersonal relations, and technicality were particularly evident. The need to exercise discretion within a more formal institutional framework shaped a pragmatic combining of strategies tactics-explicit and informal, collective and individual-to navigate through the layers of complexity and uncertainty in making decisions.
Uncertainty in Ecohydrological Modeling in an Arid Region Determined with Bayesian Methods
Yang, Junjun; He, Zhibin; Du, Jun; Chen, Longfei; Zhu, Xi
2016-01-01
In arid regions, water resources are a key forcing factor in ecosystem circulation, and soil moisture is the critical link that constrains plant and animal life on the soil surface and underground. Simulation of soil moisture in arid ecosystems is inherently difficult due to high variability. We assessed the applicability of the process-oriented CoupModel for forecasting of soil water relations in arid regions. We used vertical soil moisture profiling for model calibration. We determined that model-structural uncertainty constituted the largest error; the model did not capture the extremes of low soil moisture in the desert-oasis ecotone (DOE), particularly below 40 cm soil depth. Our results showed that total uncertainty in soil moisture prediction was improved when input and output data, parameter value array, and structure errors were characterized explicitly. Bayesian analysis was applied with prior information to reduce uncertainty. The need to provide independent descriptions of uncertainty analysis (UA) in the input and output data was demonstrated. Application of soil moisture simulation in arid regions will be useful for dune-stabilization and revegetation efforts in the DOE. PMID:26963523
Bianco, Paolo; Cao, Xu; Frenette, Paul S; Mao, Jeremy J; Robey, Pamela G; Simmons, Paul J; Wang, Cun-Yu
2013-01-01
Mesenchymal stem cells (MSCs) are the focus of intensive efforts worldwide directed not only at elucidating their nature and unique properties but also developing cell-based therapies for a diverse range of diseases. More than three decades have passed since the original formulation of the concept, revolutionary at the time, that multiple connective tissues could emanate from a common progenitor or stem cell retained in the postnatal bone marrow. Despite the many important advances made since that time, substantial ambiguities still plague the field regarding the nature, identity, function, mode of isolation and experimental handling of MSCs. These uncertainties have a major impact on their envisioned therapeutic use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Odlyzko, Michael L.; Held, Jacob T.; Mkhoyan, K. Andre, E-mail: mkhoyan@umn.edu
2016-07-15
Quantitatively calibrated annular dark field scanning transmission electron microscopy (ADF-STEM) imaging experiments were compared to frozen phonon multislice simulations adapted to include chemical bonding effects. Having carefully matched simulation parameters to experimental conditions, a depth-dependent bonding effect was observed for high-angle ADF-STEM imaging of aluminum nitride. This result is explained by computational predictions, systematically examined in the preceding portion of this study, showing the propagation of the converged STEM beam to be highly sensitive to net interatomic charge transfer. Thus, although uncertainties in experimental conditions and simulation accuracy remain, the computationally predicted experimental bonding effect withstands the experimental testing reportedmore » here.« less
ERIC Educational Resources Information Center
Roderer, Thomas; Roebers, Claudia M.
2010-01-01
In the present study, primary school children's ability to give accurate confidence judgments (CJ) was addressed, with a special focus on uncertainty monitoring. In order to investigate the effects of memory retrieval processes on monitoring judgments, item difficulty in a vocabulary learning task (Japanese symbols) was manipulated. Moreover, as a…
A Methodology for Validation of High Resolution Combat Models
1988-06-01
TELEOLOGICAL PROBLEM ................................ 7 C. EPISTEMOLOGICAL PROBLEM ............................. 8 D. UNCERTAINTY PRINCIPLE...theoretical issues. "The Teleological Problem"--How a model by its nature formulates an explicit cause-and-effect relationship that excludes other...34experts" in establishing the standard for reality. Generalization from personal experience is often hampered by the parochial aspects of the
Laura Phillips-Mao; Susan M. Galatowitsch; Stephanie A. Snyder; Robert G. Haight
2016-01-01
Incorporating climate change into conservation decision-making at site and population scales is challenging due to uncertainties associated with localized climate change impacts and population responses to multiple interacting impacts and adaptation strategies. We explore the use of spatially explicit population models to facilitate scenario analysis, a conservation...
Adding uncertainty to forest inventory plot locations: effects on analyses using geospatial data
Alexia A. Sabor; Volker C. Radeloff; Ronald E. McRoberts; Murray Clayton; Susan I. Stewart
2007-01-01
The Forest Inventory and Analysis (FIA) program of the USDA Forest Service alters plot locations before releasing data to the public to ensure landowner confidentiality and sample integrity, but using data with altered plot locations in conjunction with other spatially explicit data layers produces analytical results with unknown amounts of error. We calculated the...
Adult age differences in perceptually based, but not conceptually based implicit tests of memory.
Small, B J; Hultsch, D F; Masson, M E
1995-05-01
Implicit tests of memory assess the influence of recent experience without requiring awareness of remembering. Evidence concerning age differences on implicit tests of memory suggests small age differences in favor of younger adults. However, the majority of research examining this issue has relied upon perceptually based implicit tests. Recently, a second type of implicit test, one that relies upon conceptually based processes, has been identified. The pattern of age differences on this second type of implicit test is less clear. In the present study, we examined the pattern of age differences on one conceptually based (fact completion) and one perceptually based (stem completion) implicit test of memory, as well as two explicit tests of memory (fact and word recall). Tasks were administered to 403 adults from three age groups (19-34 years, 58-73 years, 74-89 years). Significant age differences in favor of the young were found on stem completion but not fact completion. Age differences were present for both word and fast recall. Correlational analyses examining the relationship of memory performance to other cognitive variables indicated that the implicit tests were supported by different components than the explicit tests, as well as being different from each other.
Automated parton-shower variations in PYTHIA 8
Mrenna, S.; Skands, P.
2016-10-03
In the era of precision physics measurements at the LHC, efficient and exhaustive estimations of theoretical uncertainties play an increasingly crucial role. In the context of Monte Carlo (MC) event generators, the estimation of such uncertainties traditionally requires independent MC runs for each variation, for a linear increase in total run time. In this work, we report on an automated evaluation of the dominant (renormalization-scale and nonsingular) perturbative uncertainties in the pythia 8 event generator, with only a modest computational overhead. Each generated event is accompanied by a vector of alternative weights (one for each uncertainty variation), with each set separatelymore » preserving the total cross section. Explicit scale-compensating terms can be included, reflecting known coefficients of higher-order splitting terms and reducing the effect of the variations. In conclusion, the formalism also allows for the enhancement of rare partonic splittings, such as g→bb¯ and q→qγ, to obtain weighted samples enriched in these splittings while preserving the correct physical Sudakov factors.« less
Scientific reasoning abilities of nonscience majors in physics-based courses
NASA Astrophysics Data System (ADS)
Moore, J. Christopher; Rubbo, Louis J.
2012-06-01
We have found that non-STEM (science, technology, engineering, and mathematics) majors taking either a conceptual physics or astronomy course at two regional comprehensive institutions score significantly lower preinstruction on the Lawson’s Classroom Test of Scientific Reasoning (LCTSR) in comparison to national average STEM majors. Based on LCTSR score, the majority of non-STEM students can be classified as either concrete operational or transitional reasoners in Piaget’s theory of cognitive development, whereas in the STEM population formal operational reasoners are far more prevalent. In particular, non-STEM students demonstrate significant difficulty with proportional and hypothetico-deductive reasoning. Prescores on the LCTSR are correlated with normalized learning gains on various concept inventories. The correlation is strongest for content that can be categorized as mostly theoretical, meaning a lack of directly observable exemplars, and weakest for content categorized as mostly descriptive, where directly observable exemplars are abundant. Although the implementation of research-verified, interactive engagement pedagogy can lead to gains in content knowledge, significant gains in theoretical content (such as force and energy) are more difficult with non-STEM students. We also observe no significant gains on the LCTSR without explicit instruction in scientific reasoning patterns. These results further demonstrate that differences in student populations are important when comparing normalized gains on concept inventories, and the achievement of significant gains in scientific reasoning requires a reevaluation of the traditional approach to physics for non-STEM students.
NASA Astrophysics Data System (ADS)
Falugi, P.; Olaru, S.; Dumur, D.
2010-08-01
This article proposes an explicit robust predictive control solution based on linear matrix inequalities (LMIs). The considered predictive control strategy uses different local descriptions of the system dynamics and uncertainties and thus allows the handling of less conservative input constraints. The computed control law guarantees constraint satisfaction and asymptotic stability. The technique is effective for a class of nonlinear systems embedded into polytopic models. A detailed discussion of the procedures which adapt the partition of the state space is presented. For the practical implementation the construction of suitable (explicit) descriptions of the control law are described upon concrete algorithms.
Some rules for polydimensional squeezing
NASA Technical Reports Server (NTRS)
Manko, Vladimir I.
1994-01-01
The review of the following results is presented: For mixed state light of N-mode electromagnetic field described by Wigner function which has generic Gaussian form, the photon distribution function is obtained and expressed explicitly in terms of Hermite polynomials of 2N-variables. The momenta of this distribution are calculated and expressed as functions of matrix invariants of the dispersion matrix. The role of new uncertainty relation depending on photon state mixing parameter is elucidated. New sum rules for Hermite polynomials of several variables are found. The photon statistics of polymode even and odd coherent light and squeezed polymode Schroedinger cat light is given explicitly. Photon distribution for polymode squeezed number states expressed in terms of multivariable Hermite polynomials is discussed.
The effects of attention on perceptual implicit memory.
Rajaram, S; Srinivas, K; Travers, S
2001-10-01
Reports on the effects of dividing attention at study on subsequent perceptual priming suggest that perceptual priming is generally unaffected by attentional manipulations as long as word identity is processed. We tested this hypothesis in three experiments by using the implicit word fragment completion and word stem completion tasks. Division of attention was instantiated with the Stroop task in order to ensure the processing of word identity even when the participant's attention was directed to a stimulus attribute other than the word itself. Under these conditions, we found that even though perceptual priming was significant, it was significantly reduced in magnitude. A stem cued recall test in Experiment 2 confirmed a more deleterious effect of divided attention on explicit memory. Taken together, our findings delineate the relative contributions of perceptual analysis and attentional processes in mediating perceptual priming on two ubiquitously used tasks of word fragment completion and word stem completion.
Expert elicitation, uncertainty, and the value of information in controlling invasive species
Johnson, Fred A.; Smith, Brian J.; Bonneau, Mathieu; Martin, Julien; Romagosa, Christina; Mazzotti, Frank J.; Waddle, J. Hardin; Reed, Robert; Eckles, Jennifer Kettevrlin; Vitt, Laurie J.
2017-01-01
We illustrate the utility of expert elicitation, explicit recognition of uncertainty, and the value of information for directing management and research efforts for invasive species, using tegu lizards (Salvator merianae) in southern Florida as a case study. We posited a post-birth pulse, matrix model in which four age classes of tegus are recognized: hatchlings, 1 year-old, 2 year-olds, and 3 + year-olds. This matrix model was parameterized using a 3-point process to elicit estimates of tegu demographic rates in southern Florida from 10 herpetology experts. We fit statistical distributions for each parameter and for each expert, then drew and pooled a large number of replicate samples from these to form a distribution for each demographic parameter. Using these distributions, as well as the observed correlations among elicited values, we generated a large sample of matrix population models to infer how the tegu population would respond to control efforts. We used the concepts of Pareto efficiency and stochastic dominance to conclude that targeting older age classes at relatively high rates appears to have the best chance of minimizing tegu abundance and control costs. We conclude that expert opinion combined with an explicit consideration of uncertainty can be valuable in conducting an initial assessment of what control strategy, effort, and monetary resources are needed to reduce and eventually eliminate the invader. Scientists, in turn, can use the value of information to focus research in a way that not only increases the efficacy of control, but minimizes costs as well.
Tsiliyannis, Christos Aristeides
2013-09-01
Hazardous waste incinerators (HWIs) differ substantially from thermal power facilities, since instead of maximizing energy production with the minimum amount of fuel, they aim at maximizing throughput. Variations in quantity or composition of received waste loads may significantly diminish HWI throughput (the decisive profit factor), from its nominal design value. A novel formulation of combustion balance is presented, based on linear operators, which isolates the wastefeed vector from the invariant combustion stoichiometry kernel. Explicit expressions for the throughput are obtained, in terms of incinerator temperature, fluegas heat recuperation ratio and design parameters, for an arbitrary number of wastes, based on fundamental principles (mass and enthalpy balances). The impact of waste variations, of recuperation ratio and of furnace temperature is explicitly determined. It is shown that in the presence of waste uncertainty, the throughput may be a decreasing or increasing function of incinerator temperature and recuperation ratio, depending on the sign of a dimensionless parameter related only to the uncertain wastes. The dimensionless parameter is proposed as a sharp a' priori waste 'fingerprint', determining the necessary increase or decrease of manipulated variables (recuperation ratio, excess air, auxiliary fuel feed rate, auxiliary air flow) in order to balance the HWI and maximize throughput under uncertainty in received wastes. A 10-step procedure is proposed for direct application subject to process capacity constraints. The results may be useful for efficient HWI operation and for preparing hazardous waste blends. Copyright © 2013 Elsevier Ltd. All rights reserved.
Towards a more open debate about values in decision-making on agricultural biotechnology.
Devos, Yann; Sanvido, Olivier; Tait, Joyce; Raybould, Alan
2014-12-01
Regulatory decision-making over the use of products of new technology aims to be based on science-based risk assessment. In some jurisdictions, decision-making about the cultivation of genetically modified (GM) plants is blocked supposedly because of scientific uncertainty about risks to the environment. However, disagreement about the acceptability of risks is primarily a dispute over normative values, which is not resolvable through natural sciences. Natural sciences may improve the quality and relevance of the scientific information used to support environmental risk assessments and make scientific uncertainties explicit, but offer little to resolve differences about values. Decisions about cultivating GM plants will thus not necessarily be eased by performing more research to reduce scientific uncertainty in environmental risk assessments, but by clarifying the debate over values. We suggest several approaches to reveal values in decision-making: (1) clarifying policy objectives; (2) determining what constitutes environmental harm; (3) making explicit the factual and normative premises on which risk assessments are based; (4) better demarcating environmental risk assessment studies from ecological research; (5) weighing the potential for environmental benefits (i.e., opportunities) as well as the potential for environmental harms (i.e., risks); and (6) expanding participation in the risk governance of GM plants. Recognising and openly debating differences about values will not remove controversy about the cultivation of GM plants. However, by revealing what is truly in dispute, debates about values will clarify decision-making criteria.
NASA Astrophysics Data System (ADS)
Han, Feng; Zheng, Yi
2018-06-01
Significant Input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead the modeling based management decisions, if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
Uncertainty in gridded CO 2 emissions estimates
Hogue, Susannah; Marland, Eric; Andres, Robert J.; ...
2016-05-19
We are interested in the spatial distribution of fossil-fuel-related emissions of CO 2 for both geochemical and geopolitical reasons, but it is important to understand the uncertainty that exists in spatially explicit emissions estimates. Working from one of the widely used gridded data sets of CO 2 emissions, we examine the elements of uncertainty, focusing on gridded data for the United States at the scale of 1° latitude by 1° longitude. Uncertainty is introduced in the magnitude of total United States emissions, the magnitude and location of large point sources, the magnitude and distribution of non-point sources, and from themore » use of proxy data to characterize emissions. For the United States, we develop estimates of the contribution of each component of uncertainty. At 1° resolution, in most grid cells, the largest contribution to uncertainty comes from how well the distribution of the proxy (in this case population density) represents the distribution of emissions. In other grid cells, the magnitude and location of large point sources make the major contribution to uncertainty. Uncertainty in population density can be important where a large gradient in population density occurs near a grid cell boundary. Uncertainty is strongly scale-dependent with uncertainty increasing as grid size decreases. In conclusion, uncertainty for our data set with 1° grid cells for the United States is typically on the order of ±150%, but this is perhaps not excessive in a data set where emissions per grid cell vary over 8 orders of magnitude.« less
Uncertainty in hydrological signatures for gauged and ungauged catchments
NASA Astrophysics Data System (ADS)
Westerberg, Ida K.; Wagener, Thorsten; Coxon, Gemma; McMillan, Hilary K.; Castellarin, Attilio; Montanari, Alberto; Freer, Jim
2016-03-01
Reliable information about hydrological behavior is needed for water-resource management and scientific investigations. Hydrological signatures quantify catchment behavior as index values, and can be predicted for ungauged catchments using a regionalization procedure. The prediction reliability is affected by data uncertainties for the gauged catchments used in prediction and by uncertainties in the regionalization procedure. We quantified signature uncertainty stemming from discharge data uncertainty for 43 UK catchments and propagated these uncertainties in signature regionalization, while accounting for regionalization uncertainty with a weighted-pooling-group approach. Discharge uncertainty was estimated using Monte Carlo sampling of multiple feasible rating curves. For each sampled rating curve, a discharge time series was calculated and used in deriving the gauged signature uncertainty distribution. We found that the gauged uncertainty varied with signature type, local measurement conditions and catchment behavior, with the highest uncertainties (median relative uncertainty ±30-40% across all catchments) for signatures measuring high- and low-flow magnitude and dynamics. Our regionalization method allowed assessing the role and relative magnitudes of the gauged and regionalized uncertainty sources in shaping the signature uncertainty distributions predicted for catchments treated as ungauged. We found that (1) if the gauged uncertainties were neglected there was a clear risk of overconditioning the regionalization inference, e.g., by attributing catchment differences resulting from gauged uncertainty to differences in catchment behavior, and (2) uncertainty in the regionalization results was lower for signatures measuring flow distribution (e.g., mean flow) than flow dynamics (e.g., autocorrelation), and for average flows (and then high flows) compared to low flows.
NASA Astrophysics Data System (ADS)
Jordan, Michelle
Uncertainty is ubiquitous in life, and learning is an activity particularly likely to be fraught with uncertainty. Previous research suggests that students and teachers struggle in their attempts to manage the psychological experience of uncertainty and that students often fail to experience uncertainty when uncertainty may be warranted. Yet, few educational researchers have explicitly and systematically observed what students do, their behaviors and strategies, as they attempt to manage the uncertainty they experience during academic tasks. In this study I investigated how students in one fifth grade class managed uncertainty they experienced while engaged in collaborative robotics engineering projects, focusing particularly on how uncertainty management was influenced by task structure and students' interactions with their peer collaborators. The study was initiated at the beginning of instruction related to robotics engineering and preceded through the completion of several long-term collaborative robotics projects, one of which was a design project. I relied primarily on naturalistic observation of group sessions, semi-structured interviews, and collection of artifacts. My data analysis was inductive and interpretive, using qualitative discourse analysis techniques and methods of grounded theory. Three theoretical frameworks influenced the conception and design of this study: community of practice, distributed cognition, and complex adaptive systems theory. Uncertainty was a pervasive experience for the students collaborating in this instructional context. Students experienced uncertainty related to the project activity and uncertainty related to the social system as they collaborated to fulfill the requirements of their robotics engineering projects. They managed their uncertainty through a diverse set of tactics for reducing, ignoring, maintaining, and increasing uncertainty. Students experienced uncertainty from more different sources and used more and different types of uncertainty management strategies in the less structured task setting than in the more structured task setting. Peer interaction was influential because students relied on supportive social response to enact most of their uncertainty management strategies. When students could not garner socially supportive response from their peers, their options for managing uncertainty were greatly reduced.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weisz, Daniel R.; Fouesneau, Morgan; Dalcanton, Julianne J.
2013-01-10
We present a probabilistic approach for inferring the parameters of the present-day power-law stellar mass function (MF) of a resolved young star cluster. This technique (1) fully exploits the information content of a given data set; (2) can account for observational uncertainties in a straightforward way; (3) assigns meaningful uncertainties to the inferred parameters; (4) avoids the pitfalls associated with binning data; and (5) can be applied to virtually any resolved young cluster, laying the groundwork for a systematic study of the high-mass stellar MF (M {approx}> 1 M {sub Sun }). Using simulated clusters and Markov Chain Monte Carlomore » sampling of the probability distribution functions, we show that estimates of the MF slope, {alpha}, are unbiased and that the uncertainty, {Delta}{alpha}, depends primarily on the number of observed stars and on the range of stellar masses they span, assuming that the uncertainties on individual masses and the completeness are both well characterized. Using idealized mock data, we compute the theoretical precision, i.e., lower limits, on {alpha}, and provide an analytic approximation for {Delta}{alpha} as a function of the observed number of stars and mass range. Comparison with literature studies shows that {approx}3/4 of quoted uncertainties are smaller than the theoretical lower limit. By correcting these uncertainties to the theoretical lower limits, we find that the literature studies yield ({alpha}) = 2.46, with a 1{sigma} dispersion of 0.35 dex. We verify that it is impossible for a power-law MF to obtain meaningful constraints on the upper mass limit of the initial mass function, beyond the lower bound of the most massive star actually observed. We show that avoiding substantial biases in the MF slope requires (1) including the MF as a prior when deriving individual stellar mass estimates, (2) modeling the uncertainties in the individual stellar masses, and (3) fully characterizing and then explicitly modeling the completeness for stars of a given mass. The precision on MF slope recovery in this paper are lower limits, as we do not explicitly consider all possible sources of uncertainty, including dynamical effects (e.g., mass segregation), unresolved binaries, and non-coeval populations. We briefly discuss how each of these effects can be incorporated into extensions of the present framework. Finally, we emphasize that the technique and lessons learned are applicable to more general problems involving power-law fitting.« less
The Things You Do: Internal Models of Others’ Expected Behaviour Guide Action Observation
Schenke, Kimberley C.; Wyer, Natalie A.; Bach, Patric
2016-01-01
Predictions allow humans to manage uncertainties within social interactions. Here, we investigate how explicit and implicit person models–how different people behave in different situations–shape these predictions. In a novel action identification task, participants judged whether actors interacted with or withdrew from objects. In two experiments, we manipulated, unbeknownst to participants, the two actors action likelihoods across situations, such that one actor typically interacted with one object and withdrew from the other, while the other actor showed the opposite behaviour. In Experiment 2, participants additionally received explicit information about the two individuals that either matched or mismatched their actual behaviours. The data revealed direct but dissociable effects of both kinds of person information on action identification. Implicit action likelihoods affected response times, speeding up the identification of typical relative to atypical actions, irrespective of the explicit knowledge about the individual’s behaviour. Explicit person knowledge, in contrast, affected error rates, causing participants to respond according to expectations instead of observed behaviour, even when they were aware that the explicit information might not be valid. Together, the data show that internal models of others’ behaviour are routinely re-activated during action observation. They provide first evidence of a person-specific social anticipation system, which predicts forthcoming actions from both explicit information and an individuals’ prior behaviour in a situation. These data link action observation to recent models of predictive coding in the non-social domain where similar dissociations between implicit effects on stimulus identification and explicit behavioural wagers have been reported. PMID:27434265
The effect of articulatory suppression on implicit and explicit false memory in the DRM paradigm.
Van Damme, Ilse; Menten, Jan; d'Ydewalle, Gery
2010-11-01
Several studies have shown that reliable implicit false memory can be obtained in the DRM paradigm. There has been considerable debate, however, about whether or not conscious activation of critical lures during study is a necessary condition for this. Recent findings have revealed that articulatory suppression prevents subsequent false priming in an anagram task (Lovden & Johansson, 2003). The present experiment sought to replicate and extend these findings to an implicit word stem completion task, and to additionally investigate the effect of articulatory suppression on explicit false memory. Results showed an inhibitory effect of articulatory suppression on veridical memory, as well as on implicit false memory, whereas the level of explicit false memory was heightened. This suggests that articulatory suppression did not merely eliminate conscious lure activation, but had a more general capacity-delimiting effect. The drop in veridical memory can be attributed to diminished encoding of item-specific information. Superficial encoding also limited the spreading of semantic activation during study, which inhibited later false priming. In addition, the lack of item-specific and phenomenological details caused impaired source monitoring at test, resulting in heightened explicit false memory.
Anderson, Joel; Antalíková, Radka
2014-12-01
Denmark is currently experiencing the highest immigration rate in its modern history. Population surveys indicate that negative public attitudes toward immigrants actually stem from attitudes toward their (perceived) Islamic affiliation. We used a framing paradigm to investigate the explicit and implicit attitudes of Christian and Atheist Danes toward targets framed as Muslims or as immigrants. The results showed that explicit and implicit attitudes were more negative when the target was framed as a Muslim, rather than as an immigrant. Interestingly, implicit attitudes were qualified by the participants' religion. Specifically, analyses revealed that Christians demonstrated more negative implicit attitudes toward immigrants than Muslims. Conversely, Atheists demonstrated more negative implicit attitudes toward Muslims than Atheists. These results suggest a complex relationship between religion, and implicit and explicit prejudice. Both the religious affiliation of the perceiver and the perceived religious affiliation of the target are key factors in social perception. © 2014 Scandinavian Psychological Associations and John Wiley & Sons Ltd.
Cournot competition between a non-profit firm and a for-profit firm with uncertainty
NASA Astrophysics Data System (ADS)
Ferreira, Fernanda A.
2010-03-01
In this paper, we consider a Cournot competition between a nonprofit firm and a for-profit firm in a homogeneous goods market, with uncertain demand. Given an asymmetric tax schedule, we compute explicitly the Bayesian-Nash equilibrium. Furthermore, we analyze the effects of the tax rate and the degree of altruistic preference on market equilibrium outcomes.
Uncertainty estimation for map-based analyses
Ronald E. McRoberts; Mark A. Hatfield; Susan J. Crocker
2010-01-01
Traditionally, natural resource managers have asked the question, âHow much?â and have received sample-based estimates of resource totals or means. Increasingly, however, the same managers are now asking the additional question, âWhere?â and are expecting spatially explicit answers in the form of maps. Recent development of natural resource databases, access to...
Bruce G. Marcot; Peter H. Singleton; Nathan H. Schumaker
2015-01-01
Sensitivity analysisâdetermination of how prediction variables affect response variablesâof individual-based models (IBMs) are few but important to the interpretation of model output. We present sensitivity analysis of a spatially explicit IBM (HexSim) of a threatened species, the Northern Spotted Owl (NSO; Strix occidentalis caurina) in Washington...
An interactive Bayesian geostatistical inverse protocol for hydraulic tomography
Fienen, Michael N.; Clemo, Tom; Kitanidis, Peter K.
2008-01-01
Hydraulic tomography is a powerful technique for characterizing heterogeneous hydrogeologic parameters. An explicit trade-off between characterization based on measurement misfit and subjective characterization using prior information is presented. We apply a Bayesian geostatistical inverse approach that is well suited to accommodate a flexible model with the level of complexity driven by the data and explicitly considering uncertainty. Prior information is incorporated through the selection of a parameter covariance model characterizing continuity and providing stability. Often, discontinuities in the parameter field, typically caused by geologic contacts between contrasting lithologic units, necessitate subdivision into zones across which there is no correlation among hydraulic parameters. We propose an interactive protocol in which zonation candidates are implied from the data and are evaluated using cross validation and expert knowledge. Uncertainty introduced by limited knowledge of dynamic regional conditions is mitigated by using drawdown rather than native head values. An adjoint state formulation of MODFLOW-2000 is used to calculate sensitivities which are used both for the solution to the inverse problem and to guide protocol decisions. The protocol is tested using synthetic two-dimensional steady state examples in which the wells are located at the edge of the region of interest.
NASA Astrophysics Data System (ADS)
Moglia, Magnus; Sharma, Ashok K.; Maheepala, Shiroma
2012-07-01
SummaryPlanning of regional and urban water resources, and in particular with Integrated Urban Water Management approaches, often considers inter-relationships between human uses of water, the health of the natural environment as well as the cost of various management strategies. Decision makers hence typically need to consider a combination of social, environmental and economic goals. The types of strategies employed can include water efficiency measures, water sensitive urban design, stormwater management, or catchment management. Therefore, decision makers need to choose between different scenarios and to evaluate them against a number of criteria. This type of problem has a discipline devoted to it, i.e. Multi-Criteria Decision Analysis, which has often been applied in water management contexts. This paper describes the application of Subjective Logic in a basic Bayesian Network to a Multi-Criteria Decision Analysis problem. By doing this, it outlines a novel methodology that explicitly incorporates uncertainty and information reliability. The application of the methodology to a known case study context allows for exploration. By making uncertainty and reliability of assessments explicit, it allows for assessing risks of various options, and this may help in alleviating cognitive biases and move towards a well formulated risk management policy.
Robust fault-tolerant tracking control design for spacecraft under control input saturation.
Bustan, Danyal; Pariz, Naser; Sani, Seyyed Kamal Hosseini
2014-07-01
In this paper, a continuous globally stable tracking control algorithm is proposed for a spacecraft in the presence of unknown actuator failure, control input saturation, uncertainty in inertial matrix and external disturbances. The design method is based on variable structure control and has the following properties: (1) fast and accurate response in the presence of bounded disturbances; (2) robust to the partial loss of actuator effectiveness; (3) explicit consideration of control input saturation; and (4) robust to uncertainty in inertial matrix. In contrast to traditional fault-tolerant control methods, the proposed controller does not require knowledge of the actuator faults and is implemented without explicit fault detection and isolation processes. In the proposed controller a single parameter is adjusted dynamically in such a way that it is possible to prove that both attitude and angular velocity errors will tend to zero asymptotically. The stability proof is based on a Lyapunov analysis and the properties of the singularity free quaternion representation of spacecraft dynamics. Results of numerical simulations state that the proposed controller is successful in achieving high attitude performance in the presence of external disturbances, actuator failures, and control input saturation. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
Combining information from multiple flood projections in a hierarchical Bayesian framework
NASA Astrophysics Data System (ADS)
Le Vine, Nataliya
2016-04-01
This study demonstrates, in the context of flood frequency analysis, the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach explicitly accommodates shared multimodel discrepancy as well as the probabilistic nature of the flood estimates, and treats the available models as a sample from a hypothetical complete (but unobserved) set of models. The methodology is applied to flood estimates from multiple hydrological projections (the Future Flows Hydrology data set) for 135 catchments in the UK. The advantages of the approach are shown to be: (1) to ensure adequate "baseline" with which to compare future changes; (2) to reduce flood estimate uncertainty; (3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; (4) to diminish the importance of model consistency when model biases are large; and (5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.
EXPLICIT: a feasibility study of remote expert elicitation in health technology assessment.
Grigore, Bogdan; Peters, Jaime; Hyde, Christopher; Stein, Ken
2017-09-04
Expert opinion is often sought to complement available information needed to inform model-based economic evaluations in health technology assessments. In this context, we define expert elicitation as the process of encoding expert opinion on a quantity of interest, together with associated uncertainty, as a probability distribution. When availability for face-to-face expert elicitation with a facilitator is limited, elicitation can be conducted remotely, overcoming challenges of finding an appropriate time to meet the expert and allowing access to experts situated too far away for practical face-to-face sessions. However, distance elicitation is associated with reduced response rates and limited assistance for the expert during the elicitation session. The aim of this study was to inform the development of a remote elicitation tool by exploring the influence of mode of elicitation on elicited beliefs. An Excel-based tool (EXPLICIT) was developed to assist the elicitation session, including the preparation of the expert and recording of their responses. General practitioners (GPs) were invited to provide expert opinion about population alcohol consumption behaviours. They were randomised to complete the elicitation by either a face-to-face meeting or email. EXPLICIT was used in the elicitation sessions for both arms. Fifteen GPs completed the elicitation session. Those conducted by email were longer than the face-to-face sessions (13 min 30 s vs 10 min 26 s, p = 0.1) and the email-elicited estimates contained less uncertainty. However, the resulting aggregated distributions were comparable. EXPLICIT was useful in both facilitating the elicitation task and in obtaining expert opinion from experts via email. The findings support the opinion that remote, self-administered elicitation is a viable approach within the constraints of HTA to inform policy making, although poor response rates may be observed and additional time for individual sessions may be required.
Stem cell research in Brazil: the production of a new field of science.
Zorzanelli, Rafaela Teixeira; Speroni, Angela Vasconi; Menezes, Rachel Aisengart; Leibing, Annette
2017-01-01
Based on a review of the literature published in the early twenty-first century by Brazilian researchers, the article offers an overview of stem cell research in Brazil. Three central topics were detected in these papers: (1) the funding of stem cell research in Brazil; (2) preclinical and clinical trials in Brazil; and (3) social anthropological analysis focused on ethical and legal matters. Our review identifies controversial questions in the construction of this scientific field, especially issues involving the media as a disseminator of values and of certain social representations, where new kinds of hope figure large. Within this climate of uncertainty, we find patients and their families energized by the promises of the "medicine of the future."
Extended Importance Sampling for Reliability Analysis under Evidence Theory
NASA Astrophysics Data System (ADS)
Yuan, X. K.; Chen, B.; Zhang, B. Q.
2018-05-01
In early engineering practice, the lack of data and information makes uncertainty difficult to deal with. However, evidence theory has been proposed to handle uncertainty with limited information as an alternative way to traditional probability theory. In this contribution, a simulation-based approach, called ‘Extended importance sampling’, is proposed based on evidence theory to handle problems with epistemic uncertainty. The proposed approach stems from the traditional importance sampling for reliability analysis under probability theory, and is developed to handle the problem with epistemic uncertainty. It first introduces a nominal instrumental probability density function (PDF) for every epistemic uncertainty variable, and thus an ‘equivalent’ reliability problem under probability theory is obtained. Then the samples of these variables are generated in a way of importance sampling. Based on these samples, the plausibility and belief (upper and lower bounds of probability) can be estimated. It is more efficient than direct Monte Carlo simulation. Numerical and engineering examples are given to illustrate the efficiency and feasible of the proposed approach.
Experiences of Uncertainty in Men With an Elevated PSA.
Biddle, Caitlin; Brasel, Alicia; Underwood, Willie; Orom, Heather
2015-05-15
A significant proportion of men, ages 50 to 70 years, have, and continue to receive prostate specific antigen (PSA) tests to screen for prostate cancer (PCa). Approximately 70% of men with an elevated PSA level will not subsequently be diagnosed with PCa. Semistructured interviews were conducted with 13 men with an elevated PSA level who had not been diagnosed with PCa. Uncertainty was prominent in men's reactions to the PSA results, stemming from unanswered questions about the PSA test, PCa risk, and confusion about their management plan. Uncertainty was exacerbated or reduced depending on whether health care providers communicated in lay and empathetic ways, and provided opportunities for question asking. To manage uncertainty, men engaged in information and health care seeking, self-monitoring, and defensive cognition. Results inform strategies for meeting informational needs of men with an elevated PSA and confirm the primary importance of physician communication behavior for open information exchange and uncertainty reduction. © The Author(s) 2015.
Uncertainty forecasts improve weather-related decisions and attenuate the effects of forecast error.
Joslyn, Susan L; LeClerc, Jared E
2012-03-01
Although uncertainty is inherent in weather forecasts, explicit numeric uncertainty estimates are rarely included in public forecasts for fear that they will be misunderstood. Of particular concern are situations in which precautionary action is required at low probabilities, often the case with severe events. At present, a categorical weather warning system is used. The work reported here tested the relative benefits of several forecast formats, comparing decisions made with and without uncertainty forecasts. In three experiments, participants assumed the role of a manager of a road maintenance company in charge of deciding whether to pay to salt the roads and avoid a potential penalty associated with icy conditions. Participants used overnight low temperature forecasts accompanied in some conditions by uncertainty estimates and in others by decision advice comparable to categorical warnings. Results suggested that uncertainty information improved decision quality overall and increased trust in the forecast. Participants with uncertainty forecasts took appropriate precautionary action and withheld unnecessary action more often than did participants using deterministic forecasts. When error in the forecast increased, participants with conventional forecasts were reluctant to act. However, this effect was attenuated by uncertainty forecasts. Providing categorical decision advice alone did not improve decisions. However, combining decision advice with uncertainty estimates resulted in the best performance overall. The results reported here have important implications for the development of forecast formats to increase compliance with severe weather warnings as well as other domains in which one must act in the face of uncertainty. PsycINFO Database Record (c) 2012 APA, all rights reserved.
Yao, Shuai-Lei; Luo, Jing-Jia; Huang, Gang
2016-01-01
Regional climate projections are challenging because of large uncertainty particularly stemming from unpredictable, internal variability of the climate system. Here, we examine the internal variability-induced uncertainty in precipitation and surface air temperature (SAT) trends during 2005-2055 over East Asia based on 40 member ensemble projections of the Community Climate System Model Version 3 (CCSM3). The model ensembles are generated from a suite of different atmospheric initial conditions using the same SRES A1B greenhouse gas scenario. We find that projected precipitation trends are subject to considerably larger internal uncertainty and hence have lower confidence, compared to the projected SAT trends in both the boreal winter and summer. Projected SAT trends in winter have relatively higher uncertainty than those in summer. Besides, the lower-level atmospheric circulation has larger uncertainty than that in the mid-level. Based on k-means cluster analysis, we demonstrate that a substantial portion of internally-induced precipitation and SAT trends arises from internal large-scale atmospheric circulation variability. These results highlight the importance of internal climate variability in affecting regional climate projections on multi-decadal timescales.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mencuccini, Maurizio; Salmon, Yann; Mitchell, Patrick
Substantial uncertainty surrounds our knowledge of tree stem growth, with some of the most basic questions, such as when stem radial growth occurs through the daily cycle, still unanswered. Here, we employed high-resolution point dendrometers, sap flow sensors, and developed theory and statistical approaches, to devise a novel method separating irreversible radial growth from elastic tension-driven and elastic osmotically driven changes in bark water content. We tested this method using data from five case study species. Experimental manipulations, namely a field irrigation experiment on Scots pine and a stem girdling experiment on red forest gum trees, were used to validatemore » the theory. Time courses of stem radial growth following irrigation and stem girdling were consistent with a-priori predictions. Patterns of stem radial growth varied across case studies, with growth occurring during the day and/or night, consistent with the available literature. Importantly, our approach provides a valuable alternative to existing methods, as it can be approximated by a simple empirical interpolation routine that derives irreversible radial growth using standard regression techniques. In conclusion, our novel method provides an improved understanding of the relative source–sink carbon dynamics of tree stems at a sub-daily time scale.« less
Mencuccini, Maurizio; Salmon, Yann; Mitchell, Patrick; Hölttä, Teemu; Choat, Brendan; Meir, Patrick; O'Grady, Anthony; Tissue, David; Zweifel, Roman; Sevanto, Sanna; Pfautsch, Sebastian
2017-02-01
Substantial uncertainty surrounds our knowledge of tree stem growth, with some of the most basic questions, such as when stem radial growth occurs through the daily cycle, still unanswered. We employed high-resolution point dendrometers, sap flow sensors, and developed theory and statistical approaches, to devise a novel method separating irreversible radial growth from elastic tension-driven and elastic osmotically driven changes in bark water content. We tested this method using data from five case study species. Experimental manipulations, namely a field irrigation experiment on Scots pine and a stem girdling experiment on red forest gum trees, were used to validate the theory. Time courses of stem radial growth following irrigation and stem girdling were consistent with a-priori predictions. Patterns of stem radial growth varied across case studies, with growth occurring during the day and/or night, consistent with the available literature. Importantly, our approach provides a valuable alternative to existing methods, as it can be approximated by a simple empirical interpolation routine that derives irreversible radial growth using standard regression techniques. Our novel method provides an improved understanding of the relative source-sink carbon dynamics of tree stems at a sub-daily time scale. © 2016 The Authors Plant, Cell & Environment Published by John Wiley & Sons Ltd.
Mencuccini, Maurizio; Salmon, Yann; Mitchell, Patrick; ...
2017-11-12
Substantial uncertainty surrounds our knowledge of tree stem growth, with some of the most basic questions, such as when stem radial growth occurs through the daily cycle, still unanswered. Here, we employed high-resolution point dendrometers, sap flow sensors, and developed theory and statistical approaches, to devise a novel method separating irreversible radial growth from elastic tension-driven and elastic osmotically driven changes in bark water content. We tested this method using data from five case study species. Experimental manipulations, namely a field irrigation experiment on Scots pine and a stem girdling experiment on red forest gum trees, were used to validatemore » the theory. Time courses of stem radial growth following irrigation and stem girdling were consistent with a-priori predictions. Patterns of stem radial growth varied across case studies, with growth occurring during the day and/or night, consistent with the available literature. Importantly, our approach provides a valuable alternative to existing methods, as it can be approximated by a simple empirical interpolation routine that derives irreversible radial growth using standard regression techniques. In conclusion, our novel method provides an improved understanding of the relative source–sink carbon dynamics of tree stems at a sub-daily time scale.« less
NASA Astrophysics Data System (ADS)
Jacobson, R. B.; Colvin, M. E.; Marmorek, D.; Randall, M.
2017-12-01
The Missouri River Recovery Program (MRRP) seeks to revise river-management strategies to avoid jeopardizing the existence of three species: pallid sturgeon (Scaphirhynchus albus), interior least tern (Sterna antillarum)), and piping plover (Charadrius melodus). Managing the river to maintain populations of the two birds (terns and plovers) is relatively straightforward: reproductive success can be modeled with some certainty as a direct, increasing function of exposed sandbar area. In contrast, the pallid sturgeon inhabits the benthic zone of a deep, turbid river and many parts of its complex life history are not directly observable. Hence, pervasive uncertainties exist about what factors are limiting population growth and what management actions may reverse population declines. These uncertainties are being addressed by the MRRP through a multi-step process. The first step was an Effects Analysis (EA), which: documented what is known and unknown about the river and the species; documented quality and quantity of existing information; used an expert-driven process to develop conceptual ecological models and to prioritize management hypotheses; and developed quantitative models linking management actions (flows, channel reconfigurations, and stocking) to population responses. The EA led to development of a science and adaptive-management plan with prioritized allocation of investment among 4 levels of effort ranging from fundamental research to full implementation. The plan includes learning from robust, hypothesis-driven effectiveness monitoring for all actions, with statistically sound experimental designs, multiple metrics, and explicit decision criteria to guide management. Finally, the science plan has been fully integrated with a new adaptive-management structure that links science to decision makers. The reinvigorated investment in science stems from the understanding that costly river-management decisions are not socially or politically supportable without better understanding of how this endangered fish will respond. While some hypotheses can be evaluated without actually implementing management actions in the river, assessing the effectiveness of other forms of habitat restoration requires in-river implementation within a rigorous experimental design.
Watling, James I.; Brandt, Laura A.; Bucklin, David N.; Fujisaki, Ikuko; Mazzotti, Frank J.; Romañach, Stephanie; Speroterra, Carolina
2015-01-01
Species distribution models (SDMs) are widely used in basic and applied ecology, making it important to understand sources and magnitudes of uncertainty in SDM performance and predictions. We analyzed SDM performance and partitioned variance among prediction maps for 15 rare vertebrate species in the southeastern USA using all possible combinations of seven potential sources of uncertainty in SDMs: algorithms, climate datasets, model domain, species presences, variable collinearity, CO2 emissions scenarios, and general circulation models. The choice of modeling algorithm was the greatest source of uncertainty in SDM performance and prediction maps, with some additional variation in performance associated with the comprehensiveness of the species presences used for modeling. Other sources of uncertainty that have received attention in the SDM literature such as variable collinearity and model domain contributed little to differences in SDM performance or predictions in this study. Predictions from different algorithms tended to be more variable at northern range margins for species with more northern distributions, which may complicate conservation planning at the leading edge of species' geographic ranges. The clear message emerging from this work is that researchers should use multiple algorithms for modeling rather than relying on predictions from a single algorithm, invest resources in compiling a comprehensive set of species presences, and explicitly evaluate uncertainty in SDM predictions at leading range margins.
Spafford, Marlee M; Schryer, Catherine F; Lingard, Lorelei; Hrynchak, Patricia K
2006-01-01
Healthcare students learn to manage clinical uncertainty amid the tensions that emerge between clinical omniscience and the 'truth for now' realities of the knowledge explosion in healthcare. The case presentation provides a portal to viewing the practitioner's ability to manage uncertainty. We examined the communicative features of uncertainty in 31 novice optometry case presentations and considered how these features contributed to the development of professional identity in optometry students. We also reflected on how these features compared with our earlier study of medical students' case presentations. Optometry students, like their counterparts in medicine, displayed a novice rhetoric of uncertainty that focused on personal deficits in knowledge. While optometry and medical students shared aspects of this rhetoric (seeking guidance and deflecting criticism), optometry students displayed instances of owning limits while medical students displayed instances of proving competence. We found that the nature of this novice rhetoric was shaped by professional identity (a tendency to assume an attitude of moral authority or defer to a higher authority) and the clinical setting (inpatient versus outpatient settings). More explicit discussions regarding uncertainty may help the novice unlock the code of contextual forces that cue the savvy member of the community to sanctioned discursive strategies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sattison, M.B.; Blackman, H.S.; Novack, S.D.
The Office for Analysis and Evaluation of Operational Data (AEOD) has sought the assistance of the Idaho National Engineering Laboratory (INEL) to make some significant enhancements to the SAPHIRE-based Accident Sequence Precursor (ASP) models recently developed by the INEL. The challenge of this project is to provide the features of a full-scale PRA within the framework of the simplified ASP models. Some of these features include: (1) uncertainty analysis addressing the standard PRA uncertainties and the uncertainties unique to the ASP models and methods, (2) incorporation and proper quantification of individual human actions and the interaction among human actions, (3)more » enhanced treatment of common cause failures, and (4) extension of the ASP models to more closely mimic full-scale PRAs (inclusion of more initiators, explicitly modeling support system failures, etc.). This paper provides an overview of the methods being used to make the above improvements.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sattison, M.B.; Blackman, H.S.; Novack, S.D.
The Office for Analysis and Evaluation of Operational Data (AEOD) has sought the assistance of the Idaho National Engineering Laboratory (INEL) to make some significant enhancements to the SAPHIRE-based Accident Sequence Precursor (ASP) models recently developed by the INEL. The challenge of this project is to provide the features of a full-scale PRA within the framework of the simplified ASP models. Some of these features include: (1) uncertainty analysis addressing the standard PRA uncertainties and the uncertainties unique to the ASP models and methodology, (2) incorporation and proper quantification of individual human actions and the interaction among human actions, (3)more » enhanced treatment of common cause failures, and (4) extension of the ASP models to more closely mimic full-scale PRAs (inclusion of more initiators, explicitly modeling support system failures, etc.). This paper provides an overview of the methods being used to make the above improvements.« less
“Stringy” coherent states inspired by generalized uncertainty principle
NASA Astrophysics Data System (ADS)
Ghosh, Subir; Roy, Pinaki
2012-05-01
Coherent States with Fractional Revival property, that explicitly satisfy the Generalized Uncertainty Principle (GUP), have been constructed in the context of Generalized Harmonic Oscillator. The existence of such states is essential in motivating the GUP based phenomenological results present in the literature which otherwise would be of purely academic interest. The effective phase space is Non-Canonical (or Non-Commutative in popular terminology). Our results have a smooth commutative limit, equivalent to Heisenberg Uncertainty Principle. The Fractional Revival time analysis yields an independent bound on the GUP parameter. Using this and similar bounds obtained here, we derive the largest possible value of the (GUP induced) minimum length scale. Mandel parameter analysis shows that the statistics is Sub-Poissonian. Correspondence Principle is deformed in an interesting way. Our computational scheme is very simple as it requires only first order corrected energy values and undeformed basis states.
King, B
2001-11-01
The new laboratory accreditation standard, ISO/IEC 17025, reflects current thinking on good measurement practice by requiring more explicit and more demanding attention to a number of activities. These include client interactions, method validation, traceability, and measurement uncertainty. Since the publication of the standard in 1999 there has been extensive debate about its interpretation. It is the author's view that if good quality practices are already in place and if the new requirements are introduced in a manner that is fit for purpose, the additional work required to comply with the new requirements can be expected to be modest. The paper argues that the rigour required in addressing the issues should be driven by customer requirements and the factors that need to be considered in this regard are discussed. The issues addressed include the benefits, interim arrangements, specifying the analytical requirement, establishing traceability, evaluating the uncertainty and reporting the information.
Reactive approach motivation (RAM) for religion.
McGregor, Ian; Nash, Kyle; Prentice, Mike
2010-07-01
In 3 experiments, participants reacted with religious zeal to anxious uncertainty threats that have caused reactive approach motivation (RAM) in past research (see McGregor, Nash, Mann, & Phills, 2010, for implicit, explicit, and neural evidence of RAM). In Study 1, results were specific to religious ideals and did not extend to merely superstitious beliefs. Effects were most pronounced among the most anxious and uncertainty-averse participants in Study 1 and among the most approach-motivated participants in Study 2 (i.e., with high Promotion Focus, Behavioral Activation, Action Orientation, and Self-Esteem Scale scores). In Studies 2 and 3, anxious uncertainty threats amplified even the most jingoistic and extreme aspects of religious zeal. In Study 3, reactive religious zeal occurred only among participants who reported feeling disempowered in their everyday goals in life. Results support a RAM view of empowered religious idealism for anxiety management (cf. Armstrong, 2000; Inzlicht, McGregor, Hirsch, & Nash, 2009).
NASA Astrophysics Data System (ADS)
Moslehi, Mahsa; de Barros, Felipe P. J.
2017-01-01
We investigate how the uncertainty stemming from disordered porous media that display long-range correlation in the hydraulic conductivity (K) field propagates to predictions of environmental performance metrics (EPMs). In this study, the EPMs are quantities that are of relevance to risk analysis and remediation, such as peak flux-averaged concentration, early and late arrival times among others. By using stochastic simulations, we quantify the uncertainty associated with the EPMs for a given disordered spatial structure of the K-field and identify the probability distribution function (PDF) model that best captures the statistics of the EPMs of interest. Results indicate that the probabilistic distribution of the EPMs considered in this study follows lognormal PDF. Finally, through the use of information theory, we reveal how the persistent/anti-persistent correlation structure of the K-field influences the EPMs and corresponding uncertainties.
Schoel, Christiane; Bluemke, Matthias; Mueller, Patrick; Stahlberg, Dagmar
2011-09-01
We investigated the impact of uncertainty on leadership preferences and propose that the conjunction of self-esteem level and stability is an important moderator in this regard. Self-threatening uncertainty is aversive and activates the motivation to regain control. People with high and stable self-esteem should be confident of achieving this goal by self-determined amelioration of the situation and should therefore show a stronger preference for democratic leadership under conditions of uncertainty. By contrast, people with low and unstable self-esteem should place their trust and hope in the abilities of powerful others, resulting in a preference for autocratic leadership. Studies 1a and 1b validate explicit and implicit leadership measures and demonstrate a general prodemocratic default attitude under conditions of certainty. Studies 2 and 3 reveal a democratic reaction for individuals with stable high self-esteem and a submissive reaction for individuals with unstable low self-esteem under conditions of uncertainty. In Study 4, this pattern is cancelled out when individuals evaluate leadership styles from a leader instead of a follower perspective. PsycINFO Database Record (c) 2011 APA, all rights reserved.
Broadening the study of inductive reasoning: confirmation judgments with uncertain evidence.
Mastropasqua, Tommaso; Crupi, Vincenzo; Tentori, Katya
2010-10-01
Although evidence in real life is often uncertain, the psychology of inductive reasoning has, so far, been confined to certain evidence. The present study extends previous research by investigating whether people properly estimate the impact of uncertain evidence on a given hypothesis. Two experiments are reported, in which the uncertainty of evidence is explicitly (by means of numerical values) versus implicitly (by means of ambiguous pictures) manipulated. The results show that people's judgments are highly correlated with those predicted by normatively sound Bayesian measures of impact. This sensitivity to the degree of evidential uncertainty supports the centrality of inductive reasoning in cognition and opens the path to the study of this issue in more naturalistic settings.
One-jet inclusive cross section at order a(s)-cubed - Gluons only
NASA Technical Reports Server (NTRS)
Ellis, Stephen D.; Kunszt, Zoltan; Soper, Davison E.
1989-01-01
A complete calculation of the hadron jet cross-section at one order beyond the Born approximation is performed for the simplified case in which there are only gluons. The general structure of the differences from the lowest-order cross-section are described. This step allows two important improvements in the understanding of the theoretical hadron jet cross-section: first, the cross section at this order displays explicit dependence on the jet cone size, so that explicit account can be taken of the differences in jet definitions employed by different experiments; second, the magnitude of the uncertainty of the theoretical cross-section due to the arbitrary choice of the factorization scale has been reduced by a factor of two to three.
Sharpe, Kimberly; Di Pietro, Nina; Illes, Judy
2016-02-01
Stem cell research has generated considerable attention for its potential to remediate many disorders of the central nervous system including neurodevelopmental disorders such as autism spectrum disorder (ASD) and cerebral palsy (CP) that place a high burden on individual children, families and society. Here we characterized messaging about the use of stem cells for ASD and CP in news media articles and concurrent dissemination of discoveries through conventional science discourse. We searched LexisNexis and Canadian Newsstand for news articles from the US, UK, Canada and Australia in the period between 2000 and 2014, and PubMed for peer reviewed articles for the same 10 years. Using in-depth content analysis methods, we found less cautionary messaging about stem cells for ASD and CP in the resulting sample of 73 media articles than in the sample of 87 science papers, and a privileging of benefits over risk. News media also present stem cells as ready for clinical application to treat these neurodevelopmental disorders, even while the science literature calls for further research. Investigative news reports that explicitly quote researchers, however, provide the most accurate information to actual science news. The hope, hype, and promise of stem cell interventions for neurodevelopmental disorders, combined with the extreme vulnerability of these children and their families, creates a perfect storm in which journalists and stem cell scientists must commit to a continued, if not even more robust, partnership to promote balanced and accurate messaging.
Assessment of the GECKO-A modeling tool using chamber observations for C12 alkanes
NASA Astrophysics Data System (ADS)
Aumont, B.; La, S.; Ouzebidour, F.; Valorso, R.; Mouchel-Vallon, C.; Camredon, M.; Lee-Taylor, J. M.; Hodzic, A.; Madronich, S.; Yee, L. D.; Loza, C. L.; Craven, J. S.; Zhang, X.; Seinfeld, J.
2013-12-01
Secondary Organic Aerosol (SOA) production and ageing is the result of atmospheric oxidation processes leading to the progressive formation of organic species with higher oxidation state and lower volatility. Explicit chemical mechanisms reflect our understanding of these multigenerational oxidation steps. Major uncertainties remain concerning the processes leading to SOA formation and the development, assessment and improvement of such explicit schemes is therefore a key issue. The development of explicit mechanism to describe the oxidation of long chain hydrocarbons is however a challenge. Indeed, explicit oxidation schemes involve a large number of reactions and secondary organic species, far exceeding the size of chemical schemes that can be written manually. The chemical mechanism generator GECKO-A (Generator for Explicit Chemistry and Kinetics of Organics in the Atmosphere) is a computer program designed to overcome this difficulty. GECKO-A generates gas phase oxidation schemes according to a prescribed protocol assigning reaction pathways and kinetics data on the basis of experimental data and structure-activity relationships. In this study, we examine the ability of the generated schemes to explain SOA formation observed in the Caltech Environmental Chambers from various C12 alkane isomers and under high NOx and low NOx conditions. First results show that the model overestimates both the SOA yields and the O/C ratios. Various sensitivity tests are performed to explore processes that might be responsible for these disagreements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
An, Y; Liang, J; Liu, W
2015-06-15
Purpose: We propose to apply a probabilistic framework, namely chanceconstrained optimization, in the intensity-modulated proton therapy (IMPT) planning subject to range and patient setup uncertainties. The purpose is to hedge against the influence of uncertainties and improve robustness of treatment plans. Methods: IMPT plans were generated for a typical prostate patient. Nine dose distributions are computed — the nominal one and one each for ±5mm setup uncertainties along three cardinal axes and for ±3.5% range uncertainty. These nine dose distributions are supplied to the solver CPLEX as chance constraints to explicitly control plan robustness under these representative uncertainty scenarios withmore » certain probability. This probability is determined by the tolerance level. We make the chance-constrained model tractable by converting it to a mixed integer optimization problem. The quality of plans derived from this method is evaluated using dose-volume histogram (DVH) indices such as tumor dose homogeneity (D5% – D95%) and coverage (D95%) and normal tissue sparing like V70 of rectum, V65, and V40 of bladder. We also compare the results from this novel method with the conventional PTV-based method to further demonstrate its effectiveness Results: Our model can yield clinically acceptable plans within 50 seconds. The chance-constrained optimization produces IMPT plans with comparable target coverage, better target dose homogeneity, and better normal tissue sparing compared to the PTV-based optimization [D95% CTV: 67.9 vs 68.7 (Gy), D5% – D95% CTV: 11.9 vs 18 (Gy), V70 rectum: 0.0 % vs 0.33%, V65 bladder: 2.17% vs 9.33%, V40 bladder: 8.83% vs 21.83%]. It also simultaneously makes the plan more robust [Width of DVH band at D50%: 2.0 vs 10.0 (Gy)]. The tolerance level may be varied to control the tradeoff between plan robustness and quality. Conclusion: The chance-constrained optimization generates superior IMPT plan compared to the PTV-based optimization with explicit control of plan robustness. NIH/NCI K25CA168984, Eagles Cancer Research Career Development, The Lawrence W. and Marilyn W. Matteson Fund for Cancer Research, Mayo ASU Seed Grant, and The Kemper Marley Foundation.« less
[Non-randomized evaluation studies (TREND)].
Vallvé, Carles; Artés, Maite; Cobo, Erik
2005-12-01
Nonrandomized intervention trials are needed when randomized clinical trials cannot be performed. To report the results from nonrandomized intervention studies transparently, the TREND (Transparent Reporting of Evaluations with Nonrandomized Designs) checklist should be used. This implies that nonrandomized studies should follow the remaining methodological tools usually employed in randomized trials and that the uncertainty introduced by the allocation mechanism should be explicitly reported and, if possible, quantified.
Phase estimation without a priori phase knowledge in the presence of loss
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolodynski, Jan; Demkowicz-Dobrzanski, Rafal
2010-11-15
We find the optimal scheme for quantum phase estimation in the presence of loss when no a priori knowledge on the estimated phase is available. We prove analytically an explicit lower bound on estimation uncertainty, which shows that, as a function of the number of probes, quantum precision enhancement amounts at most to a constant factor improvement over classical strategies.
NASA Astrophysics Data System (ADS)
Wang, Dong; Huang, Aijun; Ming, Fei; Sun, Wenyang; Lu, Heping; Liu, Chengcheng; Ye, Liu
2017-06-01
The uncertainty principle provides a nontrivial bound to expose the precision for the outcome of the measurement on a pair of incompatible observables in a quantum system. Therefore, it is of essential importance for quantum precision measurement in the area of quantum information processing. Herein, we investigate quantum-memory-assisted entropic uncertainty relation (QMA-EUR) in a two-qubit Heisenberg \\boldsymbol{X}\\boldsymbol{Y}\\boldsymbol{Z} spin chain. Specifically, we observe the dynamics of QMA-EUR in a realistic model there are two correlated sites linked by a thermal entanglement in the spin chain with an inhomogeneous magnetic field. It turns out that the temperature, the external inhomogeneous magnetic field and the field inhomogeneity can lift the uncertainty of the measurement due to the reduction of the thermal entanglement, and explicitly higher temperature, stronger magnetic field or larger inhomogeneity of the field can result in inflation of the uncertainty. Besides, it is found that there exists distinct dynamical behaviors of the uncertainty for ferromagnetism \\boldsymbol{}≤ft(\\boldsymbol{J}<\\boldsymbol{0}\\right) and antiferromagnetism \\boldsymbol{}≤ft(\\boldsymbol{J}>\\boldsymbol{0}\\right) chains. Moreover, we also verify that the measuring uncertainty is dramatically anti-correlated with the purity of the bipartite spin system, the greater purity can result in the reduction of the measuring uncertainty, vice versa. Therefore, our observations might provide a better understanding of the dynamics of the entropic uncertainty in the Heisenberg spin chain, and thus shed light on quantum precision measurement in the framework of versatile systems, particularly solid states.
A Multidimensional Scaling Analysis of Students' Attitudes about Science Careers
NASA Astrophysics Data System (ADS)
Masnick, Amy M.; Stavros Valenti, S.; Cox, Brian D.; Osman, Christopher J.
2010-03-01
To encourage students to seek careers in Science, Technology, Engineering and Mathematics (STEM) fields, it is important to gauge students' implicit and explicit attitudes towards scientific professions. We asked high school and college students to rate the similarity of pairs of occupations, and then used multidimensional scaling (MDS) to create a spatial representation of occupational similarity. Other students confirmed the emergent MDS map by rating each of the occupations along several dimensions. We found that participants across age and sex considered scientific professions to be less creative and less people-oriented than other popular career choices. We conclude that students may be led away from STEM careers by common misperceptions that science is a difficult, uncreative, and socially isolating pursuit.
ERIC Educational Resources Information Center
Feldon, David F.; Timmerman, Briana Crotwell; Stowe, Kirk A.; Showman, Richard
2010-01-01
Poor instruction has been cited as a primary cause of attrition from STEM majors and a major obstacle to learning for those who stay [Seymour and Hewitt [1997]. Talking about leaving: Why undergraduates leave the sciences. Boulder, CO: Westview]. Using a double-blind design, this study tests the hypothesis that the lack of explicit instructions in…
NASA Astrophysics Data System (ADS)
Engeland, K.; Steinsland, I.; Petersen-Øverleir, A.; Johansen, S.
2012-04-01
The aim of this study is to assess the uncertainties in streamflow simulations when uncertainties in both observed inputs (precipitation and temperature) and streamflow observations used in the calibration of the hydrological model are explicitly accounted for. To achieve this goal we applied the elevation distributed HBV model operating on daily time steps to a small catchment in high elevation in Southern Norway where the seasonal snow cover is important. The uncertainties in precipitation inputs were quantified using conditional simulation. This procedure accounts for the uncertainty related to the density of the precipitation network, but neglects uncertainties related to measurement bias/errors and eventual elevation gradients in precipitation. The uncertainties in temperature inputs were quantified using a Bayesian temperature interpolation procedure where the temperature lapse rate is re-estimated every day. The uncertainty in the lapse rate was accounted for whereas the sampling uncertainty related to network density was neglected. For every day a random sample of precipitation and temperature inputs were drawn to be applied as inputs to the hydrologic model. The uncertainties in observed streamflow were assessed based on the uncertainties in the rating curve model. A Bayesian procedure was applied to estimate the probability for rating curve models with 1 to 3 segments and the uncertainties in their parameters. This method neglects uncertainties related to errors in observed water levels. Note that one rating curve was drawn to make one realisation of a whole time series of streamflow, thus the rating curve errors lead to a systematic bias in the streamflow observations. All these uncertainty sources were linked together in both calibration and evaluation of the hydrologic model using a DREAM based MCMC routine. Effects of having less information (e.g. missing one streamflow measurement for defining the rating curve or missing one precipitation station) was also investigated.
Bayesian analysis of input uncertainty in hydrological modeling: 2. Application
NASA Astrophysics Data System (ADS)
Kavetski, Dmitri; Kuczera, George; Franks, Stewart W.
2006-03-01
The Bayesian total error analysis (BATEA) methodology directly addresses both input and output errors in hydrological modeling, requiring the modeler to make explicit, rather than implicit, assumptions about the likely extent of data uncertainty. This study considers a BATEA assessment of two North American catchments: (1) French Broad River and (2) Potomac basins. It assesses the performance of the conceptual Variable Infiltration Capacity (VIC) model with and without accounting for input (precipitation) uncertainty. The results show the considerable effects of precipitation errors on the predicted hydrographs (especially the prediction limits) and on the calibrated parameters. In addition, the performance of BATEA in the presence of severe model errors is analyzed. While BATEA allows a very direct treatment of input uncertainty and yields some limited insight into model errors, it requires the specification of valid error models, which are currently poorly understood and require further work. Moreover, it leads to computationally challenging highly dimensional problems. For some types of models, including the VIC implemented using robust numerical methods, the computational cost of BATEA can be reduced using Newton-type methods.
NASA Astrophysics Data System (ADS)
Mel, Riccardo; Viero, Daniele Pietro; Carniello, Luca; Defina, Andrea; D'Alpaos, Luigi
2014-09-01
Providing reliable and accurate storm surge forecasts is important for a wide range of problems related to coastal environments. In order to adequately support decision-making processes, it also become increasingly important to be able to estimate the uncertainty associated with the storm surge forecast. The procedure commonly adopted to do this uses the results of a hydrodynamic model forced by a set of different meteorological forecasts; however, this approach requires a considerable, if not prohibitive, computational cost for real-time application. In the present paper we present two simplified methods for estimating the uncertainty affecting storm surge prediction with moderate computational effort. In the first approach we use a computationally fast, statistical tidal model instead of a hydrodynamic numerical model to estimate storm surge uncertainty. The second approach is based on the observation that the uncertainty in the sea level forecast mainly stems from the uncertainty affecting the meteorological fields; this has led to the idea to estimate forecast uncertainty via a linear combination of suitable meteorological variances, directly extracted from the meteorological fields. The proposed methods were applied to estimate the uncertainty in the storm surge forecast in the Venice Lagoon. The results clearly show that the uncertainty estimated through a linear combination of suitable meteorological variances nicely matches the one obtained using the deterministic approach and overcomes some intrinsic limitations in the use of a statistical tidal model.
ERIC Educational Resources Information Center
Magnussen, Leif Inge
2012-01-01
This paper explores the meaning Norwegian sea kayakers form in their engagement with nature and their reflections upon uncertainty and resistance. Findings in this paper stem from ethnographic fieldwork conducted in a sea kayak community, with the aim of describing learning processes and experiences made outdoors by recreational sea kayakers. Sea…
Tree height and tropical forest biomass estimation
M.O. Hunter; M. Keller; D. Vitoria; D.C. Morton
2013-01-01
Tropical forests account for approximately half of above-ground carbon stored in global vegetation. However, uncertainties in tropical forest carbon stocks remain high because it is costly and laborious to quantify standing carbon stocks. Carbon stocks of tropical forests are determined using allometric relations between tree stem diameter and height and biomass....
The Paleoclimate Uncertainty Cascade: Tracking Proxy Errors Via Proxy System Models.
NASA Astrophysics Data System (ADS)
Emile-Geay, J.; Dee, S. G.; Evans, M. N.; Adkins, J. F.
2014-12-01
Paleoclimatic observations are, by nature, imperfect recorders of climate variables. Empirical approaches to their calibration are challenged by the presence of multiple sources of uncertainty, which may confound the interpretation of signals and the identifiability of the noise. In this talk, I will demonstrate the utility of proxy system models (PSMs, Evans et al, 2013, 10.1016/j.quascirev.2013.05.024) to quantify the impact of all known sources of uncertainty. PSMs explicitly encode the mechanistic knowledge of the physical, chemical, biological and geological processes from which paleoclimatic observations arise. PSMs may be divided into sensor, archive and observation components, all of which may conspire to obscure climate signals in actual paleo-observations. As an example, we couple a PSM for the δ18O of speleothem calcite to an isotope-enabled climate model (Dee et al, submitted) to analyze the potential of this measurement as a proxy for precipitation amount. A simple soil/karst model (Partin et al, 2013, 10.1130/G34718.1) is used as sensor model, while a hiatus-permitting chronological model (Haslett & Parnell, 2008, 10.1111/j.1467-9876.2008.00623.x) is used as part of the observation model. This subdivision allows us to explicitly model the transformation from precipitation amount to speleothem calcite δ18O as a multi-stage process via a physical and chemical sensor model, and a stochastic archive model. By illustrating the PSM's behavior within the context of the climate simulations, we show how estimates of climate variability may be affected by each submodel's transformation of the signal. By specifying idealized climate signals(periodic vs. episodic, slow vs. fast) to the PSM, we investigate how frequency and amplitude patterns are modulated by sensor and archive submodels. To the extent that the PSM and the climate models are representative of real world processes, then the results may help us more accurately interpret existing paleodata, characterize their uncertainties, and design sampling strategies that exploit their strengths while mitigating their weaknesses.
Protocol and practice in the adaptive management of waterfowl harvests
Johnson, F.; Williams, K.
1999-01-01
Waterfowl harvest management in North America, for all its success, historically has had several shortcomings, including a lack of well-defined objectives, a failure to account for uncertain management outcomes, and inefficient use of harvest regulations to understand the effects of management. To address these and other concerns, the U.S. Fish and Wildlife Service began implementation of adaptive harvest management in 1995. Harvest policies are now developed using a Markov decision process in which there is an explicit accounting for uncontrolled environmental variation, partial controllability of harvest, and structural uncertainty in waterfowl population dynamics. Current policies are passively adaptive, in the sense that any reduction in structural uncertainty is an unplanned by-product of the regulatory process. A generalization of the Markov decision process permits the calculation of optimal actively adaptive policies, but it is not yet clear how state-specific harvest actions differ between passive and active approaches. The Markov decision process also provides managers the ability to explore optimal levels of aggregation or "management scale" for regulating harvests in a system that exhibits high temporal, spatial, and organizational variability. Progress in institutionalizing adaptive harvest management has been remarkable, but some managers still perceive the process as a panacea, while failing to appreciate the challenges presented by this more explicit and methodical approach to harvest regulation. Technical hurdles include the need to develop better linkages between population processes and the dynamics of landscapes, and to model the dynamics of structural uncertainty in a more comprehensive fashion. From an institutional perspective, agreement on how to value and allocate harvests continues to be elusive, and there is some evidence that waterfowl managers have overestimated the importance of achievement-oriented factors in setting hunting regulations. Indeed, it is these unresolved value judgements, and the lack of an effective structure for organizing debate, that present the greatest threat to adaptive harvest management as a viable means for coping with management uncertainty. Copyright ?? 1999 by The Resilience Alliance.
NASA Astrophysics Data System (ADS)
Weisz, Daniel R.; Fouesneau, Morgan; Hogg, David W.; Rix, Hans-Walter; Dolphin, Andrew E.; Dalcanton, Julianne J.; Foreman-Mackey, Daniel T.; Lang, Dustin; Johnson, L. Clifton; Beerman, Lori C.; Bell, Eric F.; Gordon, Karl D.; Gouliermis, Dimitrios; Kalirai, Jason S.; Skillman, Evan D.; Williams, Benjamin F.
2013-01-01
We present a probabilistic approach for inferring the parameters of the present-day power-law stellar mass function (MF) of a resolved young star cluster. This technique (1) fully exploits the information content of a given data set; (2) can account for observational uncertainties in a straightforward way; (3) assigns meaningful uncertainties to the inferred parameters; (4) avoids the pitfalls associated with binning data; and (5) can be applied to virtually any resolved young cluster, laying the groundwork for a systematic study of the high-mass stellar MF (M >~ 1 M ⊙). Using simulated clusters and Markov Chain Monte Carlo sampling of the probability distribution functions, we show that estimates of the MF slope, α, are unbiased and that the uncertainty, Δα, depends primarily on the number of observed stars and on the range of stellar masses they span, assuming that the uncertainties on individual masses and the completeness are both well characterized. Using idealized mock data, we compute the theoretical precision, i.e., lower limits, on α, and provide an analytic approximation for Δα as a function of the observed number of stars and mass range. Comparison with literature studies shows that ~3/4 of quoted uncertainties are smaller than the theoretical lower limit. By correcting these uncertainties to the theoretical lower limits, we find that the literature studies yield langαrang = 2.46, with a 1σ dispersion of 0.35 dex. We verify that it is impossible for a power-law MF to obtain meaningful constraints on the upper mass limit of the initial mass function, beyond the lower bound of the most massive star actually observed. We show that avoiding substantial biases in the MF slope requires (1) including the MF as a prior when deriving individual stellar mass estimates, (2) modeling the uncertainties in the individual stellar masses, and (3) fully characterizing and then explicitly modeling the completeness for stars of a given mass. The precision on MF slope recovery in this paper are lower limits, as we do not explicitly consider all possible sources of uncertainty, including dynamical effects (e.g., mass segregation), unresolved binaries, and non-coeval populations. We briefly discuss how each of these effects can be incorporated into extensions of the present framework. Finally, we emphasize that the technique and lessons learned are applicable to more general problems involving power-law fitting. Based on observations made with the NASA/ESA Hubble Space Telescope, obtained from the Data Archive at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 5-26555.
NASA Astrophysics Data System (ADS)
Wang, Dong; Ming, Fei; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu
2017-09-01
The uncertainty principle configures a low bound to the measuring precision for a pair of non-commuting observables, and hence is considerably nontrivial to quantum precision measurement in the field of quantum information theory. In this letter, we consider the entropic uncertainty relation (EUR) in the context of quantum memory in a two-qubit isotropic Heisenberg spin chain. Specifically, we explore the dynamics of EUR in a practical scenario, where two associated nodes of a one-dimensional XXX-spin chain, under an inhomogeneous magnetic field, are connected to a thermal entanglement. We show that the temperature and magnetic field effect can lead to the inflation of the measuring uncertainty, stemming from the reduction of systematic quantum correlation. Notably, we reveal that, firstly, the uncertainty is not fully dependent on the observed quantum correlation of the system; secondly, the dynamical behaviors of the measuring uncertainty are relatively distinct with respect to ferromagnetism and antiferromagnetism chains. Meanwhile, we deduce that the measuring uncertainty is dramatically correlated with the mixedness of the system, implying that smaller mixedness tends to reduce the uncertainty. Furthermore, we propose an effective strategy to control the uncertainty of interest by means of quantum weak measurement reversal. Therefore, our work may shed light on the dynamics of the measuring uncertainty in the Heisenberg spin chain, and thus be important to quantum precision measurement in various solid-state systems.
Information theoretic quantification of diagnostic uncertainty.
Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T
2012-01-01
Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
Zhang, H X
2008-01-01
An innovative approach for total maximum daily load (TMDL) allocation and implementation is the watershed-based pollutant trading. Given the inherent scientific uncertainty for the tradeoffs between point and nonpoint sources, setting of trading ratios can be a contentious issue and was already listed as an obstacle by several pollutant trading programs. One of the fundamental reasons that a trading ratio is often set higher (e.g. greater than 2) is to allow for uncertainty in the level of control needed to attain water quality standards, and to provide a buffer in case traded reductions are less effective than expected. However, most of the available studies did not provide an approach to explicitly address the determination of trading ratio. Uncertainty analysis has rarely been linked to determination of trading ratio.This paper presents a practical methodology in estimating "equivalent trading ratio (ETR)" and links uncertainty analysis with trading ratio determination from TMDL allocation process. Determination of ETR can provide a preliminary evaluation of "tradeoffs" between various combination of point and nonpoint source control strategies on ambient water quality improvement. A greater portion of NPS load reduction in overall TMDL load reduction generally correlates with greater uncertainty and thus requires greater trading ratio. The rigorous quantification of trading ratio will enhance the scientific basis and thus public perception for more informed decision in overall watershed-based pollutant trading program. (c) IWA Publishing 2008.
Visual Tracking Using 3D Data and Region-Based Active Contours
2016-09-28
adaptive control strategies which explicitly take uncertainty into account. Filtering methods ranging from the classical Kalman filters valid for...linear systems to the much more general particle filters also fit into this framework in a very natural manner. In particular, the particle filtering ...the number of samples required for accurate filtering increases with the dimension of the system noise. In our approach, we approximate curve
Zhang, Xiaoling; Huang, Kai; Zou, Rui; Liu, Yong; Yu, Yajuan
2013-01-01
The conflict of water environment protection and economic development has brought severe water pollution and restricted the sustainable development in the watershed. A risk explicit interval linear programming (REILP) method was used to solve integrated watershed environmental-economic optimization problem. Interval linear programming (ILP) and REILP models for uncertainty-based environmental economic optimization at the watershed scale were developed for the management of Lake Fuxian watershed, China. Scenario analysis was introduced into model solution process to ensure the practicality and operability of optimization schemes. Decision makers' preferences for risk levels can be expressed through inputting different discrete aspiration level values into the REILP model in three periods under two scenarios. Through balancing the optimal system returns and corresponding system risks, decision makers can develop an efficient industrial restructuring scheme based directly on the window of "low risk and high return efficiency" in the trade-off curve. The representative schemes at the turning points of two scenarios were interpreted and compared to identify a preferable planning alternative, which has the relatively low risks and nearly maximum benefits. This study provides new insights and proposes a tool, which was REILP, for decision makers to develop an effectively environmental economic optimization scheme in integrated watershed management.
Zou, Rui; Liu, Yong; Yu, Yajuan
2013-01-01
The conflict of water environment protection and economic development has brought severe water pollution and restricted the sustainable development in the watershed. A risk explicit interval linear programming (REILP) method was used to solve integrated watershed environmental-economic optimization problem. Interval linear programming (ILP) and REILP models for uncertainty-based environmental economic optimization at the watershed scale were developed for the management of Lake Fuxian watershed, China. Scenario analysis was introduced into model solution process to ensure the practicality and operability of optimization schemes. Decision makers' preferences for risk levels can be expressed through inputting different discrete aspiration level values into the REILP model in three periods under two scenarios. Through balancing the optimal system returns and corresponding system risks, decision makers can develop an efficient industrial restructuring scheme based directly on the window of “low risk and high return efficiency” in the trade-off curve. The representative schemes at the turning points of two scenarios were interpreted and compared to identify a preferable planning alternative, which has the relatively low risks and nearly maximum benefits. This study provides new insights and proposes a tool, which was REILP, for decision makers to develop an effectively environmental economic optimization scheme in integrated watershed management. PMID:24191144
NASA Astrophysics Data System (ADS)
Riegle-Crumb, Catherine
2017-01-01
Despite being the focus of decades of research as well as interventions, gender inequality in representation in many STEM fields, including physics, engineering, and computer science remains. Recent research indicates that high school is a particularly important time point to investigate regarding the roots of inequality, as this is when many young women decide that they are not interested in pursuing degrees in these STEM fields. This presentation will focus on the role of local contexts, including communities, classrooms, and peers, in contributing to such decisions. Specifically, sociological theories suggest that role models and peers within young people's immediate environment can send both implicit and explicit messages that contradict larger social stereotypes, and promote perceptions and experiences of inclusion. Alternatively, adults and peers can endorse and behave in a manner consistent with stereotypes, leading to overtly exclusionary messages and actions. Utilizing data from a large urban district in the Southwest, as well as a national sample of high school students, this presentation will examine how such factors within local contexts can work in both positive and negative ways to shape girls' interests and expectations in STEM fields.
Self-definition of women experiencing a nontraditional graduate fellowship program
NASA Astrophysics Data System (ADS)
Buck, Gayle A.; Leslie-Pelecky, Diandra L.; Lu, Yun; Plano Clark, Vicki L.; Creswell, John W.
2006-10-01
Women continue to be underrepresented in the fields of science, technology, engineering, and mathematics (STEM). One factor contributing to this underrepresentation is the graduate school experience. Graduate programs in STEM fields are constructed around assumptions that ignore the reality of women's lives; however, emerging opportunities may lead to experiences that are more compatible for women. One such opportunity is the Graduate Teaching Fellows in K-12 Education (GK-12) Program, which was introduced by the National Science Foundation in 1999. Although this nontraditional graduate program was not designed explicitly for women, it provided an unprecedented context in which to research how changing some of the basic assumptions upon which a graduate school operates may impact women in science. This exploratory case study examines the self-definition of 8 women graduate students who participated in a GK-12 program at a major research university. The findings from this case study contribute to higher education's understanding of the terrain women graduate students in the STEM areas must navigate as they participate in programs that are thought to be more conducive to their modes of self-definition while they continue to seek to be successful in the historically Eurocentric, masculine STEM fields.
Evaluating Variability and Uncertainty of Geological Strength Index at a Specific Site
NASA Astrophysics Data System (ADS)
Wang, Yu; Aladejare, Adeyemi Emman
2016-09-01
Geological Strength Index (GSI) is an important parameter for estimating rock mass properties. GSI can be estimated from quantitative GSI chart, as an alternative to the direct observational method which requires vast geological experience of rock. GSI chart was developed from past observations and engineering experience, with either empiricism or some theoretical simplifications. The GSI chart thereby contains model uncertainty which arises from its development. The presence of such model uncertainty affects the GSI estimated from GSI chart at a specific site; it is, therefore, imperative to quantify and incorporate the model uncertainty during GSI estimation from the GSI chart. A major challenge for quantifying the GSI chart model uncertainty is a lack of the original datasets that have been used to develop the GSI chart, since the GSI chart was developed from past experience without referring to specific datasets. This paper intends to tackle this problem by developing a Bayesian approach for quantifying the model uncertainty in GSI chart when using it to estimate GSI at a specific site. The model uncertainty in the GSI chart and the inherent spatial variability in GSI are modeled explicitly in the Bayesian approach. The Bayesian approach generates equivalent samples of GSI from the integrated knowledge of GSI chart, prior knowledge and observation data available from site investigation. Equations are derived for the Bayesian approach, and the proposed approach is illustrated using data from a drill and blast tunnel project. The proposed approach effectively tackles the problem of how to quantify the model uncertainty that arises from using GSI chart for characterization of site-specific GSI in a transparent manner.
Predicting long-range transport: a systematic evaluation of two multimedia transport models.
Bennett, D H; Scheringer, M; McKone, T E; Hungerbühler, K
2001-03-15
The United Nations Environment Program has recently developed criteria to identify and restrict chemicals with a potential for persistence and long-range transport (persistent organic pollutants or POPs). There are many stakeholders involved, and the issues are not only scientific but also include social, economic, and political factors. This work focuses on one aspect of the POPs debate, the criteria for determining the potential for long-range transport (LRT). Our goal is to determine if current models are reliable enough to support decisions that classify a chemical based on the LRT potential. We examine the robustness of two multimedia fate models for determining the relative ranking and absolute spatial range of various chemicals in the environment. We also consider the effect of parameter uncertainties and the model uncertainty associated with the selection of an algorithm for gas-particle partitioning on the model results. Given the same chemical properties, both models give virtually the same ranking. However, when chemical parameter uncertainties and model uncertainties such as particle partitioning are considered, the spatial range distributions obtained for the individual chemicals overlap, preventing a distinct rank order. The absolute values obtained for the predicted spatial range or travel distance differ significantly between the two models for the uncertainties evaluated. We find that to evaluate a chemical when large and unresolved uncertainties exist, it is more informative to use two or more models and include multiple types of uncertainty. Model differences and uncertainties must be explicitly confronted to determine how the limitations of scientific knowledge impact predictions in the decision-making process.
A logical foundation for representation of clinical data.
Campbell, K E; Das, A K; Musen, M A
1994-01-01
OBJECTIVE: A general framework for representation of clinical data that provides a declarative semantics of terms and that allows developers to define explicitly the relationships among both terms and combinations of terms. DESIGN: Use of conceptual graphs as a standard representation of logic and of an existing standardized vocabulary, the Systematized Nomenclature of Medicine (SNOMED International), for lexical elements. Concepts such as time, anatomy, and uncertainty must be modeled explicitly in a way that allows relation of these foundational concepts to surface-level clinical descriptions in a uniform manner. RESULTS: The proposed framework was used to model a simple radiology report, which included temporal references. CONCLUSION: Formal logic provides a framework for formalizing the representation of medical concepts. Actual implementations will be required to evaluate the practicality of this approach. PMID:7719805
A risk based approach for SSTO/TSTO comparisons
NASA Astrophysics Data System (ADS)
Greenberg, Joel S.
1996-03-01
An approach has been developed for performing early comparisons of transportation architectures explicitly taking into account quantitative measures of uncertainty and resulting risk. Risk considerations are necessary since the transportation systems are likely to have significantly different levels of risk, both because of differing degrees of freedom in achieving desired performance levels and their different states of development and utilization. The approach considers the uncertainty of achievement of technology goals, effect that the achieved technology level will have on transportation system performance and the relationship between system performance/capability and the ability to accommodate variations in payload mass. The consequences of system performance are developed in terms of nonrecurring, recurring, and the present value of transportation system life cycle costs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pramanik, Souvik, E-mail: souvick.in@gmail.com; Moussa, Mohamed, E-mail: mohamed.ibrahim@fsc.bu.edu.eg; Faizal, Mir, E-mail: f2mir@uwaterloo.ca
In this paper, the deformation of the Heisenberg algebra, consistent with both the generalized uncertainty principle and doubly special relativity, has been analyzed. It has been observed that, though this algebra can give rise to fractional derivative terms in the corresponding quantum mechanical Hamiltonian, a formal meaning can be given to them by using the theory of harmonic extensions of function. Depending on this argument, the expression of the propagator of the path integral corresponding to the deformed Heisenberg algebra, has been obtained. In particular, the consistent expression of the one dimensional free particle propagator has been evaluated explicitly. Withmore » this propagator in hand, it has been shown that, even in free particle case, normal generalized uncertainty principle and doubly special relativity show very much different result.« less
NASA Technical Reports Server (NTRS)
Yedavalli, R. K.
1992-01-01
The problem of analyzing and designing controllers for linear systems subject to real parameter uncertainty is considered. An elegant, unified theory for robust eigenvalue placement is presented for a class of D-regions defined by algebraic inequalities by extending the nominal matrix root clustering theory of Gutman and Jury (1981) to linear uncertain time systems. The author presents explicit conditions for matrix root clustering for different D-regions and establishes the relationship between the eigenvalue migration range and the parameter range. The bounds are all obtained by one-shot computation in the matrix domain and do not need any frequency sweeping or parameter gridding. The method uses the generalized Lyapunov theory for getting the bounds.
Dragons, Ladybugs, and Softballs: Girls' STEM Engagement with Human-Centered Robotics
NASA Astrophysics Data System (ADS)
Gomoll, Andrea; Hmelo-Silver, Cindy E.; Šabanović, Selma; Francisco, Matthew
2016-12-01
Early experiences in science, technology, engineering, and math (STEM) are important for getting youth interested in STEM fields, particularly for girls. Here, we explore how an after-school robotics club can provide informal STEM experiences that inspire students to engage with STEM in the future. Human-centered robotics, with its emphasis on the social aspects of science and technology, may be especially important for bringing girls into the STEM pipeline. Using a problem-based approach, we designed two robotics challenges. We focus here on the more extended second challenge, in which participants were asked to imagine and build a telepresence robot that would allow others to explore their space from a distance. This research follows four girls as they engage with human-centered telepresence robotics design. We constructed case studies of these target participants to explore their different forms of engagement and phases of interest development—considering facets of behavioral, social, cognitive, and conceptual-to-consequential engagement as well as stages of interest ranging from triggered interest to well-developed individual interest. The results demonstrated that opportunities to personalize their robots and feedback from peers and facilitators were important motivators. We found both explicit and vicarious engagement and varied interest phases in our group of four focus participants. This first iteration of our project demonstrated that human-centered robotics is a promising approach to getting girls interested and engaged in STEM practices. As we design future iterations of our robotics club environment, we must consider how to harness multiple forms of leadership and engagement without marginalizing students with different working preferences.
Quantifying the Uncertainty in Discharge Data Using Hydraulic Knowledge and Uncertain Gaugings
NASA Astrophysics Data System (ADS)
Renard, B.; Le Coz, J.; Bonnifait, L.; Branger, F.; Le Boursicaud, R.; Horner, I.; Mansanarez, V.; Lang, M.
2014-12-01
River discharge is a crucial variable for Hydrology: as the output variable of most hydrologic models, it is used for sensitivity analyses, model structure identification, parameter estimation, data assimilation, prediction, etc. A major difficulty stems from the fact that river discharge is not measured continuously. Instead, discharge time series used by hydrologists are usually based on simple stage-discharge relations (rating curves) calibrated using a set of direct stage-discharge measurements (gaugings). In this presentation, we present a Bayesian approach to build such hydrometric rating curves, to estimate the associated uncertainty and to propagate this uncertainty to discharge time series. The three main steps of this approach are described: (1) Hydraulic analysis: identification of the hydraulic controls that govern the stage-discharge relation, identification of the rating curve equation and specification of prior distributions for the rating curve parameters; (2) Rating curve estimation: Bayesian inference of the rating curve parameters, accounting for the individual uncertainties of available gaugings, which often differ according to the discharge measurement procedure and the flow conditions; (3) Uncertainty propagation: quantification of the uncertainty in discharge time series, accounting for both the rating curve uncertainties and the uncertainty of recorded stage values. In addition, we also discuss current research activities, including the treatment of non-univocal stage-discharge relationships (e.g. due to hydraulic hysteresis, vegetation growth, sudden change of the geometry of the section, etc.).
NASA Astrophysics Data System (ADS)
Croke, B. F.
2008-12-01
The role of performance indicators is to give an accurate indication of the fit between a model and the system being modelled. As all measurements have an associated uncertainty (determining the significance that should be given to the measurement), performance indicators should take into account uncertainties in the observed quantities being modelled as well as in the model predictions (due to uncertainties in inputs, model parameters and model structure). In the presence of significant uncertainty in observed and modelled output of a system, failure to adequately account for variations in the uncertainties means that the objective function only gives a measure of how well the model fits the observations, not how well the model fits the system being modelled. Since in most cases, the interest lies in fitting the system response, it is vital that the objective function(s) be designed to account for these uncertainties. Most objective functions (e.g. those based on the sum of squared residuals) assume homoscedastic uncertainties. If model contribution to the variations in residuals can be ignored, then transformations (e.g. Box-Cox) can be used to remove (or at least significantly reduce) heteroscedasticity. An alternative which is more generally applicable is to explicitly represent the uncertainties in the observed and modelled values in the objective function. Previous work on this topic addressed the modifications to standard objective functions (Nash-Sutcliffe efficiency, RMSE, chi- squared, coefficient of determination) using the optimal weighted averaging approach. This paper extends this previous work; addressing the issue of serial correlation. A form for an objective function that includes serial correlation will be presented, and the impact on model fit discussed.
NASA Astrophysics Data System (ADS)
Holmquist, J. R.; Crooks, S.; Windham-Myers, L.; Megonigal, P.; Weller, D.; Lu, M.; Bernal, B.; Byrd, K. B.; Morris, J. T.; Troxler, T.; McCombs, J.; Herold, N.
2017-12-01
Stable coastal wetlands can store substantial amounts of carbon (C) that can be released when they are degraded or eroded. The EPA recently incorporated coastal wetland net-storage and emissions within the Agricultural Forested and Other Land Uses category of the U.S. National Greenhouse Gas Inventory (NGGI). This was a seminal analysis, but its quantification of uncertainty needs improvement. We provide a value-added analysis by estimating that uncertainty, focusing initially on the most basic assumption, the area of coastal wetlands. We considered three sources: uncertainty in the areas of vegetation and salinity subclasses, uncertainty in the areas of changing or stable wetlands, and uncertainty in the inland extent of coastal wetlands. The areas of vegetation and salinity subtypes, as well as stable or changing, were estimated from 2006 and 2010 maps derived from Landsat imagery by the Coastal Change Analysis Program (C-CAP). We generated unbiased area estimates and confidence intervals for C-CAP, taking into account mapped area, proportional areas of commission and omission errors, as well as the number of observations. We defined the inland extent of wetlands as all land below the current elevation of twice monthly highest tides. We generated probabilistic inundation maps integrating wetland-specific bias and random error in light-detection and ranging elevation maps, with the spatially explicit random error in tidal surfaces generated from tide gauges. This initial uncertainty analysis will be extended to calculate total propagated uncertainty in the NGGI by including the uncertainties in the amount of C lost from eroded and degraded wetlands, stored annually in stable wetlands, and emitted in the form of methane by tidal freshwater wetlands.
Eigenspace perturbations for structural uncertainty estimation of turbulence closure models
NASA Astrophysics Data System (ADS)
Jofre, Lluis; Mishra, Aashwin; Iaccarino, Gianluca
2017-11-01
With the present state of computational resources, a purely numerical resolution of turbulent flows encountered in engineering applications is not viable. Consequently, investigations into turbulence rely on various degrees of modeling. Archetypal amongst these variable resolution approaches would be RANS models in two-equation closures, and subgrid-scale models in LES. However, owing to the simplifications introduced during model formulation, the fidelity of all such models is limited, and therefore the explicit quantification of the predictive uncertainty is essential. In such scenario, the ideal uncertainty estimation procedure must be agnostic to modeling resolution, methodology, and the nature or level of the model filter. The procedure should be able to give reliable prediction intervals for different Quantities of Interest, over varied flows and flow conditions, and at diametric levels of modeling resolution. In this talk, we present and substantiate the Eigenspace perturbation framework as an uncertainty estimation paradigm that meets these criteria. Commencing from a broad overview, we outline the details of this framework at different modeling resolution. Thence, using benchmark flows, along with engineering problems, the efficacy of this procedure is established. This research was partially supported by NNSA under the Predictive Science Academic Alliance Program (PSAAP) II, and by DARPA under the Enabling Quantification of Uncertainty in Physical Systems (EQUiPS) project (technical monitor: Dr Fariba Fahroo).
Uncertainty Analysis in Large Area Aboveground Biomass Mapping
NASA Astrophysics Data System (ADS)
Baccini, A.; Carvalho, L.; Dubayah, R.; Goetz, S. J.; Friedl, M. A.
2011-12-01
Satellite and aircraft-based remote sensing observations are being more frequently used to generate spatially explicit estimates of aboveground carbon stock of forest ecosystems. Because deforestation and forest degradation account for circa 10% of anthropogenic carbon emissions to the atmosphere, policy mechanisms are increasingly recognized as a low-cost mitigation option to reduce carbon emission. They are, however, contingent upon the capacity to accurately measures carbon stored in the forests. Here we examine the sources of uncertainty and error propagation in generating maps of aboveground biomass. We focus on characterizing uncertainties associated with maps at the pixel and spatially aggregated national scales. We pursue three strategies to describe the error and uncertainty properties of aboveground biomass maps, including: (1) model-based assessment using confidence intervals derived from linear regression methods; (2) data-mining algorithms such as regression trees and ensembles of these; (3) empirical assessments using independently collected data sets.. The latter effort explores error propagation using field data acquired within satellite-based lidar (GLAS) acquisitions versus alternative in situ methods that rely upon field measurements that have not been systematically collected for this purpose (e.g. from forest inventory data sets). A key goal of our effort is to provide multi-level characterizations that provide both pixel and biome-level estimates of uncertainties at different scales.
Probability-based hazard avoidance guidance for planetary landing
NASA Astrophysics Data System (ADS)
Yuan, Xu; Yu, Zhengshi; Cui, Pingyuan; Xu, Rui; Zhu, Shengying; Cao, Menglong; Luan, Enjie
2018-03-01
Future landing and sample return missions on planets and small bodies will seek landing sites with high scientific value, which may be located in hazardous terrains. Autonomous landing in such hazardous terrains and highly uncertain planetary environments is particularly challenging. Onboard hazard avoidance ability is indispensable, and the algorithms must be robust to uncertainties. In this paper, a novel probability-based hazard avoidance guidance method is developed for landing in hazardous terrains on planets or small bodies. By regarding the lander state as probabilistic, the proposed guidance algorithm exploits information on the uncertainty of lander position and calculates the probability of collision with each hazard. The collision probability serves as an accurate safety index, which quantifies the impact of uncertainties on the lander safety. Based on the collision probability evaluation, the state uncertainty of the lander is explicitly taken into account in the derivation of the hazard avoidance guidance law, which contributes to enhancing the robustness to the uncertain dynamics of planetary landing. The proposed probability-based method derives fully analytic expressions and does not require off-line trajectory generation. Therefore, it is appropriate for real-time implementation. The performance of the probability-based guidance law is investigated via a set of simulations, and the effectiveness and robustness under uncertainties are demonstrated.
ERIC Educational Resources Information Center
McDonald, Christopher
2012-01-01
Community colleges are experiencing many challenges that stem from political and fiscal uncertainties and these challenges are often exacerbated by high turnover in executive leadership positions (Floyd et al., 2010), which has implications for longterm strategic planning. The magnitude of the problems associated with high turnover rates for…
Zamanzadeh, Vahid; Valizadeh, Leila; Sayadi, Leila; Taleghani, Fariba; Jeddian, Alireza
2013-01-01
Background This study explored the state of hematopoietic stem cell transplantation (HSCT) recipient patients and problems experienced by them and nurse about these state and problems, in Iran. Methods Qualitative content analysis was used for analyzing semi-structured interviews with 12 HSCT recipient patients and 18 nurses. Results Three main categories described the HSCT state and problems: shadow of death, living with uncertainty, and immersion in problems. Patients treated with risk variety in continuity with probability of death. The patients lived with uncertainty. Consequently these resulted immersion in problems with four sub-categories including: (a) Physical problems, (b) money worries, (c) life disturbances, and (d) emotional strain. Conclusion HSCT patients live in a state of limbo between life and death with multidimensional problems. Establish centers for supporting and educating of patients and their families, education of health care providers, enhancement of public knowledge about HSCT along with allocating more budgets to take care of these patients can help patients for passing from this limbo. PMID:24505532
Wood, Alexander
2004-01-01
This interim report describes an alternative approach for evaluating the efficacy of using mercury (Hg) offsets to improve water quality. Hg-offset programs may allow dischargers facing higher-pollution control costs to meet their regulatory obligations by making more cost effective pollutant-reduction decisions. Efficient Hg management requires methods to translate that science and economics into a regulatory decision framework. This report documents the work in progress by the U.S. Geological Surveys Western Geographic Science Center in collaboration with Stanford University toward developing this decision framework to help managers, regulators, and other stakeholders decide whether offsets can cost effectively meet the Hg total maximum daily load (TMDL) requirements in the Sacramento River watershed. Two key approaches being considered are: (1) a probabilistic approach that explicitly incorporates scientific uncertainty, cost information, and value judgments; and (2) a quantitative approach that captures uncertainty in testing the feasibility of Hg offsets. Current fate and transport-process models commonly attempt to predict chemical transformations and transport pathways deterministically. However, the physical, chemical, and biologic processes controlling the fate and transport of Hg in aquatic environments are complex and poorly understood. Deterministic models of Hg environmental behavior contain large uncertainties, reflecting this lack of understanding. The uncertainty in these underlying physical processes may produce similarly large uncertainties in the decisionmaking process. However, decisions about control strategies are still being made despite the large uncertainties in current Hg loadings, the relations between total Hg (HgT) loading and methylmercury (MeHg) formation, and the relations between control efforts and Hg content in fish. The research presented here focuses on an alternative analytical approach to the current use of safety factors and deterministic methods for Hg TMDL decision support, one that is fully compatible with an adaptive management approach. This alternative approach uses empirical data and informed judgment to provide a scientific and technical basis for helping National Pollutant Discharge Elimination System (NPDES) permit holders make management decisions. An Hg-offset system would be an option if a wastewater-treatment plant could not achieve NPDES permit requirements for HgT reduction. We develop a probabilistic decision-analytical model consisting of three submodels for HgT loading, MeHg, and cost mitigation within a Bayesian network that integrates information of varying rigor and detail into a simple model of a complex system. Hg processes are identified and quantified by using a combination of historical data, statistical models, and expert judgment. Such an integrated approach to uncertainty analysis allows easy updating of prediction and inference when observations of model variables are made. We demonstrate our approach with data from the Cache Creek watershed (a subbasin of the Sacramento River watershed). The empirical models used to generate the needed probability distributions are based on the same empirical models currently being used by the Central Valley Regional Water Quality Control Cache Creek Hg TMDL working group. The significant difference is that input uncertainty and error are explicitly included in the model and propagated throughout its algorithms. This work demonstrates how to integrate uncertainty into the complex and highly uncertain Hg TMDL decisionmaking process. The various sources of uncertainty are propagated as decision risk that allows decisionmakers to simultaneously consider uncertainties in remediation/implementation costs while attempting to meet environmental/ecologic targets. We must note that this research is on going. As more data are collected, the HgT and cost-mitigation submodels are updated and the uncer
Cultivating Citizen Scientists in the Undergraduate Science Classroom
NASA Astrophysics Data System (ADS)
Egger, A. E.
2007-12-01
Several studies indicate a strong correlation between the number of college science courses and science literacy. It is not surprising, then, that the majority of participants in citizen science projects are college graduates who enrolled in at least two science courses. If one goal of citizen science projects is to increase civic science literacy, research suggests that most are preaching to the choir. Attracting a wider audience to citizen science is, therefore, a key challenge. One way to address this challenge is to attract students to enroll and succeed in science courses in college, even if they do not pursue a major in the science, technology, engineering, and mathematics (STEM) disciplines. In fact, only 20% of students receive a degree in STEM, yet virtually all undergraduates are required to take at least one science course. Introductory science courses are therefore critical to cultivating citizen scientists, as they include a large proportion of non- STEM majors. Indeed, a major thrust of recent undergraduate STEM educational reform has been the promotion of 'science for all'. The science for all concept goes beyond recruiting students into the STEM disciplines to promoting a level of scientific literacy necessary to make informed decisions. A clear implication of this inclusive attitude is the need to redesign introductory science courses to make them accessible and explicitly related to scientific literacy. This does not mean dumbing down courses; on the contrary, it means engaging students in real scientific investigations and incorporating explicit teaching about the process of science, thus fostering a lifelong appreciation for (and, hopefully, participation in) science. Unfortunately, many students enter college with minimal understanding of the process of science. And when they arrive in their introductory classes, science is presented to them as a system of facts to be memorized - comparable to memorizing a poem in a foreign language without understanding the vocabulary. New resources available through the Visionlearning project (http://www.visionlearning.com) provide the means to incorporate teaching about the process of science into disciplinary content, thus facilitating the reform the way that undergraduate students are taught science at the introductory level. This kind of educational reform may be a long-term approach to developing citizen scientists, but research from several different disciplines and perspectives suggests it is a critical step in building scientific literacy and lifelong participation in science.
NASA Astrophysics Data System (ADS)
Aumont, B.; Camredon, M.; Isaacman-VanWertz, G. A.; Karam, C.; Valorso, R.; Madronich, S.; Kroll, J. H.
2016-12-01
Gas phase oxidation of VOC is a gradual process leading to the formation of multifunctional organic compounds, i.e., typically species with higher oxidation state, high water solubility and low volatility. These species contribute to the formation of secondary organic aerosols (SOA) viamultiphase processes involving a myriad of organic species that evolve through thousands of reactions and gas/particle mass exchanges. Explicit chemical mechanisms reflect the understanding of these multigenerational oxidation steps. These mechanisms rely directly on elementary reactions to describe the chemical evolution and track the identity of organic carbon through various phases down to ultimate oxidation products. The development, assessment and improvement of such explicit schemes is a key issue, as major uncertainties remain on the chemical pathways involved during atmospheric oxidation of organic matter. An array of mass spectrometric techniques (CIMS, PTRMS, AMS) was recently used to track the composition of organic species during α-pinene oxidation in the MIT environmental chamber, providing an experimental database to evaluate and improve explicit mechanisms. In this study, the GECKO-A tool (Generator for Explicit Chemistry and Kinetics of Organics in the Atmosphere) is used to generate fully explicit oxidation schemes for α-pinene multiphase oxidation simulating the MIT experiment. The ability of the GECKO-A chemical scheme to explain the organic molecular composition in the gas and the condensed phases is explored. First results of this model/observation comparison at the molecular level will be presented.
NASA Astrophysics Data System (ADS)
Marlowe, Ashley E.; Singh, Abhishek; Semichaevsky, Andrey V.; Yingling, Yaroslava G.
2009-03-01
Nucleic acid nanoparticles can self-assembly through the formation of complementary loop-loop interactions or stem-stem interactions. Presence and concentration of ions can significantly affect the self-assembly process and the stability of the nanostructure. In this presentation we use explicit molecular dynamics simulations to examine the variations in cationic distributions and hydration environment around DNA and RNA helices and loop-loop interactions. Our simulations show that the potassium and sodium ionic distributions are different around RNA and DNA motifs which could be indicative of ion mediated relative stability of loop-loop complexes. Moreover in RNA loop-loop motifs ions are consistently present and exchanged through a distinct electronegative channel. We will also show how we used the specific RNA loop-loop motif to design a RNA hexagonal nanoparticle.
NASA Astrophysics Data System (ADS)
Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison
2017-11-01
Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solver and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from National Institute of Health (R01 EB018302) is greatly appreciated.
Portelli, Lucas A; Falldorf, Karsten; Thuróczy, György; Cuppen, Jan
2018-04-01
Experiments on cell cultures exposed to extremely low frequency (ELF, 3-300 Hz) magnetic fields are often subject to multiple sources of uncertainty associated with specific electric and magnetic field exposure conditions. Here we systemically quantify these uncertainties based on exposure conditions described in a group of bioelectromagnetic experimental reports for a representative sampling of the existing literature. The resulting uncertainties, stemming from insufficient, ambiguous, or erroneous description, design, implementation, or validation of the experimental methods and systems, were often substantial enough to potentially make any successful reproduction of the original experimental conditions difficult or impossible. Without making any assumption about the true biological relevance of ELF electric and magnetic fields, these findings suggest another contributing factor which may add to the overall variability and irreproducibility traditionally associated with experimental results of in vitro exposures to low-level ELF magnetic fields. Bioelectromagnetics. 39:231-243, 2018. © 2017 Wiley Periodicals, Inc. © 2017 Wiley Periodicals, Inc.
Spatializing 6,000 years of global urbanization from 3700 BC to AD 2000
NASA Astrophysics Data System (ADS)
Reba, Meredith; Reitsma, Femke; Seto, Karen C.
2016-06-01
How were cities distributed globally in the past? How many people lived in these cities? How did cities influence their local and regional environments? In order to understand the current era of urbanization, we must understand long-term historical urbanization trends and patterns. However, to date there is no comprehensive record of spatially explicit, historic, city-level population data at the global scale. Here, we developed the first spatially explicit dataset of urban settlements from 3700 BC to AD 2000, by digitizing, transcribing, and geocoding historical, archaeological, and census-based urban population data previously published in tabular form by Chandler and Modelski. The dataset creation process also required data cleaning and harmonization procedures to make the data internally consistent. Additionally, we created a reliability ranking for each geocoded location to assess the geographic uncertainty of each data point. The dataset provides the first spatially explicit archive of the location and size of urban populations over the last 6,000 years and can contribute to an improved understanding of contemporary and historical urbanization trends.
Spatializing 6,000 years of global urbanization from 3700 BC to AD 2000
Reba, Meredith; Reitsma, Femke; Seto, Karen C.
2016-01-01
How were cities distributed globally in the past? How many people lived in these cities? How did cities influence their local and regional environments? In order to understand the current era of urbanization, we must understand long-term historical urbanization trends and patterns. However, to date there is no comprehensive record of spatially explicit, historic, city-level population data at the global scale. Here, we developed the first spatially explicit dataset of urban settlements from 3700 BC to AD 2000, by digitizing, transcribing, and geocoding historical, archaeological, and census-based urban population data previously published in tabular form by Chandler and Modelski. The dataset creation process also required data cleaning and harmonization procedures to make the data internally consistent. Additionally, we created a reliability ranking for each geocoded location to assess the geographic uncertainty of each data point. The dataset provides the first spatially explicit archive of the location and size of urban populations over the last 6,000 years and can contribute to an improved understanding of contemporary and historical urbanization trends. PMID:27271481
Jones, Hayley E; Hickman, Matthew; Kasprzyk-Hordern, Barbara; Welton, Nicky J; Baker, David R; Ades, A E
2014-07-15
Concentrations of metabolites of illicit drugs in sewage water can be measured with great accuracy and precision, thanks to the development of sensitive and robust analytical methods. Based on assumptions about factors including the excretion profile of the parent drug, routes of administration and the number of individuals using the wastewater system, the level of consumption of a drug can be estimated from such measured concentrations. When presenting results from these 'back-calculations', the multiple sources of uncertainty are often discussed, but are not usually explicitly taken into account in the estimation process. In this paper we demonstrate how these calculations can be placed in a more formal statistical framework by assuming a distribution for each parameter involved, based on a review of the evidence underpinning it. Using a Monte Carlo simulations approach, it is then straightforward to propagate uncertainty in each parameter through the back-calculations, producing a distribution for instead of a single estimate of daily or average consumption. This can be summarised for example by a median and credible interval. To demonstrate this approach, we estimate cocaine consumption in a large urban UK population, using measured concentrations of two of its metabolites, benzoylecgonine and norbenzoylecgonine. We also demonstrate a more sophisticated analysis, implemented within a Bayesian statistical framework using Markov chain Monte Carlo simulation. Our model allows the two metabolites to simultaneously inform estimates of daily cocaine consumption and explicitly allows for variability between days. After accounting for this variability, the resulting credible interval for average daily consumption is appropriately wider, representing additional uncertainty. We discuss possibilities for extensions to the model, and whether analysis of wastewater samples has potential to contribute to a prevalence model for illicit drug use. Copyright © 2014. Published by Elsevier B.V.
Jones, Hayley E.; Hickman, Matthew; Kasprzyk-Hordern, Barbara; Welton, Nicky J.; Baker, David R.; Ades, A.E.
2014-01-01
Concentrations of metabolites of illicit drugs in sewage water can be measured with great accuracy and precision, thanks to the development of sensitive and robust analytical methods. Based on assumptions about factors including the excretion profile of the parent drug, routes of administration and the number of individuals using the wastewater system, the level of consumption of a drug can be estimated from such measured concentrations. When presenting results from these ‘back-calculations’, the multiple sources of uncertainty are often discussed, but are not usually explicitly taken into account in the estimation process. In this paper we demonstrate how these calculations can be placed in a more formal statistical framework by assuming a distribution for each parameter involved, based on a review of the evidence underpinning it. Using a Monte Carlo simulations approach, it is then straightforward to propagate uncertainty in each parameter through the back-calculations, producing a distribution for instead of a single estimate of daily or average consumption. This can be summarised for example by a median and credible interval. To demonstrate this approach, we estimate cocaine consumption in a large urban UK population, using measured concentrations of two of its metabolites, benzoylecgonine and norbenzoylecgonine. We also demonstrate a more sophisticated analysis, implemented within a Bayesian statistical framework using Markov chain Monte Carlo simulation. Our model allows the two metabolites to simultaneously inform estimates of daily cocaine consumption and explicitly allows for variability between days. After accounting for this variability, the resulting credible interval for average daily consumption is appropriately wider, representing additional uncertainty. We discuss possibilities for extensions to the model, and whether analysis of wastewater samples has potential to contribute to a prevalence model for illicit drug use. PMID:24636801
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morton, April M; McManamay, Ryan A; Nagle, Nicholas N
Abstract As urban areas continue to grow and evolve in a world of increasing environmental awareness, the need for high resolution spatially explicit estimates for energy and water demand has become increasingly important. Though current modeling efforts mark significant progress in the effort to better understand the spatial distribution of energy and water consumption, many are provided at a course spatial resolution or rely on techniques which depend on detailed region-specific data sources that are not publicly available for many parts of the U.S. Furthermore, many existing methods do not account for errors in input data sources and may thereforemore » not accurately reflect inherent uncertainties in model outputs. We propose an alternative and more flexible Monte-Carlo simulation approach to high-resolution residential and commercial electricity and water consumption modeling that relies primarily on publicly available data sources. The method s flexible data requirement and statistical framework ensure that the model is both applicable to a wide range of regions and reflective of uncertainties in model results. Key words: Energy Modeling, Water Modeling, Monte-Carlo Simulation, Uncertainty Quantification Acknowledgment This manuscript has been authored by employees of UT-Battelle, LLC, under contract DE-AC05-00OR22725 with the U.S. Department of Energy. Accordingly, the United States Government retains and the publisher, by accepting the article for publication, acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.« less
Coverage-based constraints for IMRT optimization
NASA Astrophysics Data System (ADS)
Mescher, H.; Ulrich, S.; Bangert, M.
2017-09-01
Radiation therapy treatment planning requires an incorporation of uncertainties in order to guarantee an adequate irradiation of the tumor volumes. In current clinical practice, uncertainties are accounted for implicitly with an expansion of the target volume according to generic margin recipes. Alternatively, it is possible to account for uncertainties by explicit minimization of objectives that describe worst-case treatment scenarios, the expectation value of the treatment or the coverage probability of the target volumes during treatment planning. In this note we show that approaches relying on objectives to induce a specific coverage of the clinical target volumes are inevitably sensitive to variation of the relative weighting of the objectives. To address this issue, we introduce coverage-based constraints for intensity-modulated radiation therapy (IMRT) treatment planning. Our implementation follows the concept of coverage-optimized planning that considers explicit error scenarios to calculate and optimize patient-specific probabilities q(\\hat{d}, \\hat{v}) of covering a specific target volume fraction \\hat{v} with a certain dose \\hat{d} . Using a constraint-based reformulation of coverage-based objectives we eliminate the trade-off between coverage and competing objectives during treatment planning. In-depth convergence tests including 324 treatment plan optimizations demonstrate the reliability of coverage-based constraints for varying levels of probability, dose and volume. General clinical applicability of coverage-based constraints is demonstrated for two cases. A sensitivity analysis regarding penalty variations within this planing study based on IMRT treatment planning using (1) coverage-based constraints, (2) coverage-based objectives, (3) probabilistic optimization, (4) robust optimization and (5) conventional margins illustrates the potential benefit of coverage-based constraints that do not require tedious adjustment of target volume objectives.
Foundations of a mathematical theory of darwinism.
Batty, Charles J K; Crewe, Paul; Grafen, Alan; Gratwick, Richard
2014-08-01
This paper pursues the 'formal darwinism' project of Grafen, whose aim is to construct formal links between dynamics of gene frequencies and optimization programmes, in very abstract settings with general implications for biologically relevant situations. A major outcome is the definition, within wide assumptions, of the ubiquitous but problematic concept of 'fitness'. This paper is the first to present the project for mathematicians. Within the framework of overlapping generations in discrete time and no social interactions, the current model shows links between fitness maximization and gene frequency change in a class-structured population, with individual-level uncertainty but no uncertainty in the class projection operator, where individuals are permitted to observe and condition their behaviour on arbitrary parts of the uncertainty. The results hold with arbitrary numbers of loci and alleles, arbitrary dominance and epistasis, and make no assumptions about linkage, linkage disequilibrium or mating system. An explicit derivation is given of Fisher's Fundamental Theorem of Natural Selection in its full generality.
Dieye, A.M.; Roy, David P.; Hanan, N.P.; Liu, S.; Hansen, M.; Toure, A.
2012-01-01
Spatially explicit land cover land use (LCLU) change information is needed to drive biogeochemical models that simulate soil organic carbon (SOC) dynamics. Such information is increasingly being mapped using remotely sensed satellite data with classification schemes and uncertainties constrained by the sensing system, classification algorithms and land cover schemes. In this study, automated LCLU classification of multi-temporal Landsat satellite data were used to assess the sensitivity of SOC modeled by the Global Ensemble Biogeochemical Modeling System (GEMS). The GEMS was run for an area of 1560 km2 in Senegal under three climate change scenarios with LCLU maps generated using different Landsat classification approaches. This research provides a method to estimate the variability of SOC, specifically the SOC uncertainty due to satellite classification errors, which we show is dependent not only on the LCLU classification errors but also on where the LCLU classes occur relative to the other GEMS model inputs.
Quadratic stabilisability of multi-agent systems under switching topologies
NASA Astrophysics Data System (ADS)
Guan, Yongqiang; Ji, Zhijian; Zhang, Lin; Wang, Long
2014-12-01
This paper addresses the stabilisability of multi-agent systems (MASs) under switching topologies. Necessary and/or sufficient conditions are presented in terms of graph topology. These conditions explicitly reveal how the intrinsic dynamics of the agents, the communication topology and the external control input affect stabilisability jointly. With the appropriate selection of some agents to which the external inputs are applied and the suitable design of neighbour-interaction rules via a switching topology, an MAS is proved to be stabilisable even if so is not for each of uncertain subsystem. In addition, a method is proposed to constructively design a switching rule for MASs with norm-bounded time-varying uncertainties. The switching rules designed via this method do not rely on uncertainties, and the switched MAS is quadratically stabilisable via decentralised external self-feedback for all uncertainties. With respect to applications of the stabilisability results, the formation control and the cooperative tracking control are addressed. Numerical simulations are presented to demonstrate the effectiveness of the proposed results.
NASA Technical Reports Server (NTRS)
Yedavalli, R. K.
1992-01-01
The aspect of controller design for improving the ride quality of aircraft in terms of damping ratio and natural frequency specifications on the short period dynamics is addressed. The controller is designed to be robust with respect to uncertainties in the real parameters of the control design model such as uncertainties in the dimensional stability derivatives, imperfections in actuator/sensor locations and possibly variations in flight conditions, etc. The design is based on a new robust root clustering theory developed by the author by extending the nominal root clustering theory of Gutman and Jury to perturbed matrices. The proposed methodology allows to get an explicit relationship between the parameters of the root clustering region and the uncertainty radius of the parameter space. The current literature available for robust stability becomes a special case of this unified theory. The bounds derived on the parameter perturbation for robust root clustering are then used in selecting the robust controller.
Sustainable water management under future uncertainty with eco-engineering decision scaling
NASA Astrophysics Data System (ADS)
Poff, N. Leroy; Brown, Casey M.; Grantham, Theodore E.; Matthews, John H.; Palmer, Margaret A.; Spence, Caitlin M.; Wilby, Robert L.; Haasnoot, Marjolijn; Mendoza, Guillermo F.; Dominique, Kathleen C.; Baeza, Andres
2016-01-01
Managing freshwater resources sustainably under future climatic and hydrological uncertainty poses novel challenges. Rehabilitation of ageing infrastructure and construction of new dams are widely viewed as solutions to diminish climate risk, but attaining the broad goal of freshwater sustainability will require expansion of the prevailing water resources management paradigm beyond narrow economic criteria to include socially valued ecosystem functions and services. We introduce a new decision framework, eco-engineering decision scaling (EEDS), that explicitly and quantitatively explores trade-offs in stakeholder-defined engineering and ecological performance metrics across a range of possible management actions under unknown future hydrological and climate states. We illustrate its potential application through a hypothetical case study of the Iowa River, USA. EEDS holds promise as a powerful framework for operationalizing freshwater sustainability under future hydrological uncertainty by fostering collaboration across historically conflicting perspectives of water resource engineering and river conservation ecology to design and operate water infrastructure for social and environmental benefits.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ulissi, Zachary W.; Medford, Andrew J.; Bligaard, Thomas
Surface reaction networks involving hydrocarbons exhibit enormous complexity with thousands of species and reactions for all but the very simplest of chemistries. We present a framework for optimization under uncertainty for heterogeneous catalysis reaction networks using surrogate models that are trained on the fly. The surrogate model is constructed by teaching a Gaussian process adsorption energies based on group additivity fingerprints, combined with transition-state scaling relations and a simple classifier for determining the rate-limiting step. The surrogate model is iteratively used to predict the most important reaction step to be calculated explicitly with computationally demanding electronic structure theory. Applying thesemore » methods to the reaction of syngas on rhodium(111), we identify the most likely reaction mechanism. Lastly, propagating uncertainty throughout this process yields the likelihood that the final mechanism is complete given measurements on only a subset of the entire network and uncertainty in the underlying density functional theory calculations.« less
Ulissi, Zachary W.; Medford, Andrew J.; Bligaard, Thomas; ...
2017-03-06
Surface reaction networks involving hydrocarbons exhibit enormous complexity with thousands of species and reactions for all but the very simplest of chemistries. We present a framework for optimization under uncertainty for heterogeneous catalysis reaction networks using surrogate models that are trained on the fly. The surrogate model is constructed by teaching a Gaussian process adsorption energies based on group additivity fingerprints, combined with transition-state scaling relations and a simple classifier for determining the rate-limiting step. The surrogate model is iteratively used to predict the most important reaction step to be calculated explicitly with computationally demanding electronic structure theory. Applying thesemore » methods to the reaction of syngas on rhodium(111), we identify the most likely reaction mechanism. Lastly, propagating uncertainty throughout this process yields the likelihood that the final mechanism is complete given measurements on only a subset of the entire network and uncertainty in the underlying density functional theory calculations.« less
NASA Astrophysics Data System (ADS)
Kim, Ho Sung
2013-12-01
A quantitative method for estimating an expected uncertainty (reliability and validity) in assessment results arising from the relativity between four variables, viz examiner's expertise, examinee's expertise achieved, assessment task difficulty and examinee's performance, was developed for the complex assessment applicable to final year project thesis assessment including peer assessment. A guide map can be generated by the method for finding expected uncertainties prior to the assessment implementation with a given set of variables. It employs a scale for visualisation of expertise levels, derivation of which is based on quantified clarities of mental images for levels of the examiner's expertise and the examinee's expertise achieved. To identify the relevant expertise areas that depend on the complexity in assessment format, a graphical continuum model was developed. The continuum model consists of assessment task, assessment standards and criterion for the transition towards the complex assessment owing to the relativity between implicitness and explicitness and is capable of identifying areas of expertise required for scale development.
Embracing uncertainty in climate change policy
NASA Astrophysics Data System (ADS)
Otto, Friederike E. L.; Frame, David J.; Otto, Alexander; Allen, Myles R.
2015-10-01
The 'pledge and review' approach to reducing greenhouse-gas emissions presents an opportunity to link mitigation goals explicitly to the evolving climate response. This seems desirable because the progression from the Intergovernmental Panel on Climate Change's fourth to fifth assessment reports has seen little reduction in uncertainty. A common reaction to persistent uncertainties is to advocate mitigation policies that are robust even under worst-case scenarios, thereby focusing attention on upper extremes of both the climate response and the costs of impacts and mitigation, all of which are highly contestable. Here we ask whether those contributing to the formation of climate policies can learn from 'adaptive management' techniques. Recognizing that long-lived greenhouse gas emissions have to be net zero by the time temperatures reach a target stabilization level, such as 2 °C above pre-industrial levels, and anchoring commitments to an agreed index of attributable anthropogenic warming would provide a transparent approach to meeting such a temperature goal without prior consensus on the climate response.
Sustainable water management under future uncertainty with eco-engineering decision scaling
Poff, N LeRoy; Brown, Casey M; Grantham, Theodore E.; Matthews, John H; Palmer, Margaret A.; Spence, Caitlin M; Wilby, Robert L.; Haasnoot, Marjolijn; Mendoza, Guillermo F; Dominique, Kathleen C; Baeza, Andres
2015-01-01
Managing freshwater resources sustainably under future climatic and hydrological uncertainty poses novel challenges. Rehabilitation of ageing infrastructure and construction of new dams are widely viewed as solutions to diminish climate risk, but attaining the broad goal of freshwater sustainability will require expansion of the prevailing water resources management paradigm beyond narrow economic criteria to include socially valued ecosystem functions and services. We introduce a new decision framework, eco-engineering decision scaling (EEDS), that explicitly and quantitatively explores trade-offs in stakeholder-defined engineering and ecological performance metrics across a range of possible management actions under unknown future hydrological and climate states. We illustrate its potential application through a hypothetical case study of the Iowa River, USA. EEDS holds promise as a powerful framework for operationalizing freshwater sustainability under future hydrological uncertainty by fostering collaboration across historically conflicting perspectives of water resource engineering and river conservation ecology to design and operate water infrastructure for social and environmental benefits.
Intrinsic uncertainty on the nature of dark energy
NASA Astrophysics Data System (ADS)
Valkenburg, Wessel; Kunz, Martin; Marra, Valerio
2013-12-01
We argue that there is an intrinsic noise on measurements of the equation of state parameter w = p/ρ from large-scale structure around us. The presence of the large-scale structure leads to an ambiguity in the definition of the background universe and thus there is a maximal precision with which we can determine the equation of state of dark energy. To study the uncertainty due to local structure, we model density perturbations stemming from a standard inflationary power spectrum by means of the exact Lemaître-Tolman-Bondi solution of Einstein’s equation, and show that the usual distribution of matter inhomogeneities in a ΛCDM cosmology causes a variation of w - as inferred from distance measures - of several percent. As we observe only one universe, or equivalently because of the cosmic variance, this uncertainty is systematic in nature.
Regulation of stem cell-based therapies in Canada: current issues and concerns.
von Tigerstrom, Barbara; Nguyen, Thu Minh; Knoppers, Bartha Maria
2012-09-01
Stem cell therapies offer enormous potential for the treatment of a wide range of diseases and conditions. Despite the excitement over such advances, regulators are faced with the challenge of determining criteria to ensure stem cells and their products are safe and effective for human use. However, stem cell-based products and therapies present unique regulatory challenges because standard drug development models do not wholly apply given the complexity and diversity of these products and therapies. As a result, regulatory requirements are often unclear and ambiguous creating unnecessary barriers for research. In order to better understand the barriers that might affect Canadian stem cell researchers, we sought feedback from stakeholders regarding areas of uncertainty or concern about existing regulatory oversight of cell therapies. A selection of Canadian researchers and clinicians working in the area of stem cell research were interviewed to assess certain key questions: 1) whether current regulatory requirements are easily accessible and well understood; 2) whether regulatory requirements create important challenges or barriers; and 3) whether there is a need for further guidance on the issue. The results of this survey are summarized and compared to issues and concerns experienced in other countries, as reported in the literature, to identify challenges which may be on the horizon and to provide possible solutions for regulatory reform.
Mackinger, Barbara; Jonas, Eva
2012-01-01
When confronted with important questions we like to rely on the advice of experts. However, uncertainty can occur regarding advisors' motivation to pursue self-interest and deceive the client. This can especially occur when the advisor has the possibility to receive an incentive by recommending a certain alternative. We investigated how the possibility to pursue self-interest led to explicit strategic behavior (bias in recommendation and transfer of information) and to implicit strategic behavior (bias in information processing: evaluation and memory). In Study 1 explicit strategic behavior could be identified: self-interested advisors recommended more often the self-serving alternative and transferred more self-interested biased information to their client compared to the advisor without specific interest. Also deception through implicit strategic behavior was identified: self-interested advisors biased the evaluation of information less in favor of the client compared to the control group. Self-interested advisors also remembered conflicting information regarding their self-interest worse compared to advisors without self-interest. In Study 2 beside self-interest we assessed accountability which interacted with self-interest and increased the bias: when accountability was high advisor's self-interest led to higher explicit strategic behavior (less transfer of conflicting information), and to higher implicit strategic behavior (devaluated and remembered less conflicting information). Both studies identified implicit strategic behavior as mediator which can explain the relation between self-interest and explicit strategic behavior. Results of both studies suggest that self-interested advisors use explicit and implicit strategic behavior to receive an incentive. Thus, advisors do not only consciously inform their clients "self-interested," but they are influenced unconsciously by biased information processing - a tendency which even increased with high accountability.
Mackinger, Barbara; Jonas, Eva
2012-01-01
When confronted with important questions we like to rely on the advice of experts. However, uncertainty can occur regarding advisors’ motivation to pursue self-interest and deceive the client. This can especially occur when the advisor has the possibility to receive an incentive by recommending a certain alternative. We investigated how the possibility to pursue self-interest led to explicit strategic behavior (bias in recommendation and transfer of information) and to implicit strategic behavior (bias in information processing: evaluation and memory). In Study 1 explicit strategic behavior could be identified: self-interested advisors recommended more often the self-serving alternative and transferred more self-interested biased information to their client compared to the advisor without specific interest. Also deception through implicit strategic behavior was identified: self-interested advisors biased the evaluation of information less in favor of the client compared to the control group. Self-interested advisors also remembered conflicting information regarding their self-interest worse compared to advisors without self-interest. In Study 2 beside self-interest we assessed accountability which interacted with self-interest and increased the bias: when accountability was high advisor’s self-interest led to higher explicit strategic behavior (less transfer of conflicting information), and to higher implicit strategic behavior (devaluated and remembered less conflicting information). Both studies identified implicit strategic behavior as mediator which can explain the relation between self-interest and explicit strategic behavior. Results of both studies suggest that self-interested advisors use explicit and implicit strategic behavior to receive an incentive. Thus, advisors do not only consciously inform their clients “self-interested,” but they are influenced unconsciously by biased information processing – a tendency which even increased with high accountability. PMID:23440297
Regulatory uncertainty and the associated business risk for emerging technologies
NASA Astrophysics Data System (ADS)
Hoerr, Robert A.
2011-04-01
An oversight system specifically concerned with nanomaterials should be flexible enough to take into account the unique aspects of individual novel materials and the settings in which they might be used, while recognizing that heretofore unrecognized safety issues may require future modifications. This article considers a question not explicitly considered by the project team: what is the risk that uncertainty over how regulatory oversight will be applied to nanomaterials will delay or block the development of this emerging technology, thereby depriving human health of potential and substantial benefits? An ambiguous regulatory environment could delay the availability of valuable new technology and therapeutics for human health by reducing access to investment capital. Venture capitalists list regulatory uncertainty as a major reason not to invest at all in certain areas. Uncertainty is far more difficult to evaluate than risk, which lends itself to quantitative models and can be factored into projections of return on possible investments. Loss of time has a large impact on investment return. An examination of regulatory case histories suggests that an increase in regulatory resting requirement, where the path is well-defined, is far less costly than a delay of a year or more in achieving product approval and market launch.
NASA Technical Reports Server (NTRS)
Cheeseman, Peter; Stutz, John
2005-01-01
A long standing mystery in using Maximum Entropy (MaxEnt) is how to deal with constraints whose values are uncertain. This situation arises when constraint values are estimated from data, because of finite sample sizes. One approach to this problem, advocated by E.T. Jaynes [1], is to ignore this uncertainty, and treat the empirically observed values as exact. We refer to this as the classic MaxEnt approach. Classic MaxEnt gives point probabilities (subject to the given constraints), rather than probability densities. We develop an alternative approach that assumes that the uncertain constraint values are represented by a probability density {e.g: a Gaussian), and this uncertainty yields a MaxEnt posterior probability density. That is, the classic MaxEnt point probabilities are regarded as a multidimensional function of the given constraint values, and uncertainty on these values is transmitted through the MaxEnt function to give uncertainty over the MaXEnt probabilities. We illustrate this approach by explicitly calculating the generalized MaxEnt density for a simple but common case, then show how this can be extended numerically to the general case. This paper expands the generalized MaxEnt concept introduced in a previous paper [3].
NASA Astrophysics Data System (ADS)
Riva, Fabio; Milanese, Lucio; Ricci, Paolo
2017-10-01
To reduce the computational cost of the uncertainty propagation analysis, which is used to study the impact of input parameter variations on the results of a simulation, a general and simple to apply methodology based on decomposing the solution to the model equations in terms of Chebyshev polynomials is discussed. This methodology, based on the work by Scheffel [Am. J. Comput. Math. 2, 173-193 (2012)], approximates the model equation solution with a semi-analytic expression that depends explicitly on time, spatial coordinates, and input parameters. By employing a weighted residual method, a set of nonlinear algebraic equations for the coefficients appearing in the Chebyshev decomposition is then obtained. The methodology is applied to a two-dimensional Braginskii model used to simulate plasma turbulence in basic plasma physics experiments and in the scrape-off layer of tokamaks, in order to study the impact on the simulation results of the input parameter that describes the parallel losses. The uncertainty that characterizes the time-averaged density gradient lengths, time-averaged densities, and fluctuation density level are evaluated. A reasonable estimate of the uncertainty of these distributions can be obtained with a single reduced-cost simulation.
NASA Astrophysics Data System (ADS)
Hermans, Thomas; Nguyen, Frédéric; Klepikova, Maria; Dassargues, Alain; Caers, Jef
2018-04-01
In theory, aquifer thermal energy storage (ATES) systems can recover in winter the heat stored in the aquifer during summer to increase the energy efficiency of the system. In practice, the energy efficiency is often lower than expected from simulations due to spatial heterogeneity of hydraulic properties or non-favorable hydrogeological conditions. A proper design of ATES systems should therefore consider the uncertainty of the prediction related to those parameters. We use a novel framework called Bayesian Evidential Learning (BEL) to estimate the heat storage capacity of an alluvial aquifer using a heat tracing experiment. BEL is based on two main stages: pre- and postfield data acquisition. Before data acquisition, Monte Carlo simulations and global sensitivity analysis are used to assess the information content of the data to reduce the uncertainty of the prediction. After data acquisition, prior falsification and machine learning based on the same Monte Carlo are used to directly assess uncertainty on key prediction variables from observations. The result is a full quantification of the posterior distribution of the prediction conditioned to observed data, without any explicit full model inversion. We demonstrate the methodology in field conditions and validate the framework using independent measurements.
An estimate of the number of tropical tree species
J. W. Ferry Slik; Victor Arroyo-Rodriguez; Shin-Ichiro and others Aiba
2015-01-01
The high species richness of tropical forests has long been recognized, yet there remains substantial uncertainty regarding the actual number of tropical tree species. Using a pantropical tree inventory database from closed canopy forests, consisting of 657,630 trees belonging to 11,371 species, we use a fitted value of Fishers alpha and an approximate pantropical stem...
Beekhuizen, Johan; Heuvelink, Gerard B M; Huss, Anke; Bürgi, Alfred; Kromhout, Hans; Vermeulen, Roel
2014-11-01
With the increased availability of spatial data and computing power, spatial prediction approaches have become a standard tool for exposure assessment in environmental epidemiology. However, such models are largely dependent on accurate input data. Uncertainties in the input data can therefore have a large effect on model predictions, but are rarely quantified. With Monte Carlo simulation we assessed the effect of input uncertainty on the prediction of radio-frequency electromagnetic fields (RF-EMF) from mobile phone base stations at 252 receptor sites in Amsterdam, The Netherlands. The impact on ranking and classification was determined by computing the Spearman correlations and weighted Cohen's Kappas (based on tertiles of the RF-EMF exposure distribution) between modelled values and RF-EMF measurements performed at the receptor sites. The uncertainty in modelled RF-EMF levels was large with a median coefficient of variation of 1.5. Uncertainty in receptor site height, building damping and building height contributed most to model output uncertainty. For exposure ranking and classification, the heights of buildings and receptor sites were the most important sources of uncertainty, followed by building damping, antenna- and site location. Uncertainty in antenna power, tilt, height and direction had a smaller impact on model performance. We quantified the effect of input data uncertainty on the prediction accuracy of an RF-EMF environmental exposure model, thereby identifying the most important sources of uncertainty and estimating the total uncertainty stemming from potential errors in the input data. This approach can be used to optimize the model and better interpret model output. Copyright © 2014 Elsevier Inc. All rights reserved.
Uncertainty estimation of Intensity-Duration-Frequency relationships: A regional analysis
NASA Astrophysics Data System (ADS)
Mélèse, Victor; Blanchet, Juliette; Molinié, Gilles
2018-03-01
We propose in this article a regional study of uncertainties in IDF curves derived from point-rainfall maxima. We develop two generalized extreme value models based on the simple scaling assumption, first in the frequentist framework and second in the Bayesian framework. Within the frequentist framework, uncertainties are obtained i) from the Gaussian density stemming from the asymptotic normality theorem of the maximum likelihood and ii) with a bootstrap procedure. Within the Bayesian framework, uncertainties are obtained from the posterior densities. We confront these two frameworks on the same database covering a large region of 100, 000 km2 in southern France with contrasted rainfall regime, in order to be able to draw conclusion that are not specific to the data. The two frameworks are applied to 405 hourly stations with data back to the 1980's, accumulated in the range 3 h-120 h. We show that i) the Bayesian framework is more robust than the frequentist one to the starting point of the estimation procedure, ii) the posterior and the bootstrap densities are able to better adjust uncertainty estimation to the data than the Gaussian density, and iii) the bootstrap density give unreasonable confidence intervals, in particular for return levels associated to large return period. Therefore our recommendation goes towards the use of the Bayesian framework to compute uncertainty.
The law (and politics) of safe injection facilities in the United States.
Beletsky, Leo; Davis, Corey S; Anderson, Evan; Burris, Scott
2008-02-01
Safe injection facilities (SIFs) have shown promise in reducing harms and social costs associated with injection drug use. Favorable evaluations elsewhere have raised the issue of their implementation in the United States. Recognizing that laws shape health interventions targeting drug users, we analyzed the legal environment for publicly authorized SIFs in the United States. Although states and some municipalities have the power to authorize SIFs under state law, federal authorities could still interfere with these facilities under the Controlled Substances Act. A state- or locally-authorized SIF could proceed free of legal uncertainty only if federal authorities explicitly authorized it or decided not to interfere. Given legal uncertainty, and the similar experience with syringe exchange programs, we recommend a process of sustained health research, strategic advocacy, and political deliberation.
THE MAYAK WORKER DOSIMETRY SYSTEM (MWDS-2013) FOR INTERNALLY DEPOSITED PLUTONIUM: AN OVERVIEW
DOE Office of Scientific and Technical Information (OSTI.GOV)
Birchall, A.; Vostrotin, V.; Puncher, M.
The Mayak Worker Dosimetry System (MWDS-2013) is a system for interpreting measurement data from Mayak workers from both internal and external sources. This paper is concerned with the calculation of annual organ doses for Mayak workers exposed to plutonium aerosols, where the measurement data consists mainly of activity of plutonium in urine samples. The system utilises the latest biokinetic and dosimetric models, and unlike its predecessors, takes explicit account of uncertainties in both the measurement data and model parameters. The aim of this paper is to describe the complete MWDS-2013 system (including model parameter values and their uncertainties) and themore » methodology used (including all the relevant equations) and the assumptions made. Where necessary, supplementary papers which justify specific assumptions are cited.« less
NASA Astrophysics Data System (ADS)
Magnoni, F.; Scognamiglio, L.; Tinti, E.; Casarotti, E.
2014-12-01
Seismic moment tensor is one of the most important source parameters defining the earthquake dimension and style of the activated fault. Moment tensor catalogues are ordinarily used by geoscientists, however, few attempts have been done to assess possible impacts of moment magnitude uncertainties upon their own analysis. The 2012 May 20 Emilia mainshock is a representative event since it is defined in literature with a moment magnitude value (Mw) spanning between 5.63 and 6.12. An uncertainty of ~0.5 units in magnitude leads to a controversial knowledge of the real size of the event. The possible uncertainty associated to this estimate could be critical for the inference of other seismological parameters, suggesting caution for seismic hazard assessment, coulomb stress transfer determination and other analyses where self-consistency is important. In this work, we focus on the variability of the moment tensor solution, highlighting the effect of four different velocity models, different types and ranges of filtering, and two different methodologies. Using a larger dataset, to better quantify the source parameter uncertainty, we also analyze the variability of the moment tensor solutions depending on the number, the epicentral distance and the azimuth of used stations. We endorse that the estimate of seismic moment from moment tensor solutions, as well as the estimate of the other kinematic source parameters, cannot be considered an absolute value and requires to come out with the related uncertainties and in a reproducible framework characterized by disclosed assumptions and explicit processing workflows.
Bayesian characterization of uncertainty in species interaction strengths.
Wolf, Christopher; Novak, Mark; Gitelman, Alix I
2017-06-01
Considerable effort has been devoted to the estimation of species interaction strengths. This effort has focused primarily on statistical significance testing and obtaining point estimates of parameters that contribute to interaction strength magnitudes, leaving the characterization of uncertainty associated with those estimates unconsidered. We consider a means of characterizing the uncertainty of a generalist predator's interaction strengths by formulating an observational method for estimating a predator's prey-specific per capita attack rates as a Bayesian statistical model. This formulation permits the explicit incorporation of multiple sources of uncertainty. A key insight is the informative nature of several so-called non-informative priors that have been used in modeling the sparse data typical of predator feeding surveys. We introduce to ecology a new neutral prior and provide evidence for its superior performance. We use a case study to consider the attack rates in a New Zealand intertidal whelk predator, and we illustrate not only that Bayesian point estimates can be made to correspond with those obtained by frequentist approaches, but also that estimation uncertainty as described by 95% intervals is more useful and biologically realistic using the Bayesian method. In particular, unlike in bootstrap confidence intervals, the lower bounds of the Bayesian posterior intervals for attack rates do not include zero when a predator-prey interaction is in fact observed. We conclude that the Bayesian framework provides a straightforward, probabilistic characterization of interaction strength uncertainty, enabling future considerations of both the deterministic and stochastic drivers of interaction strength and their impact on food webs.
Land, Charles E; Kwon, Deukwoo; Hoffman, F Owen; Moroz, Brian; Drozdovitch, Vladimir; Bouville, André; Beck, Harold; Luckyanov, Nicholas; Weinstock, Robert M; Simon, Steven L
2015-02-01
Dosimetic uncertainties, particularly those that are shared among subgroups of a study population, can bias, distort or reduce the slope or significance of a dose response. Exposure estimates in studies of health risks from environmental radiation exposures are generally highly uncertain and thus, susceptible to these methodological limitations. An analysis was published in 2008 concerning radiation-related thyroid nodule prevalence in a study population of 2,994 villagers under the age of 21 years old between August 1949 and September 1962 and who lived downwind from the Semipalatinsk Nuclear Test Site in Kazakhstan. This dose-response analysis identified a statistically significant association between thyroid nodule prevalence and reconstructed doses of fallout-related internal and external radiation to the thyroid gland; however, the effects of dosimetric uncertainty were not evaluated since the doses were simple point "best estimates". In this work, we revised the 2008 study by a comprehensive treatment of dosimetric uncertainties. Our present analysis improves upon the previous study, specifically by accounting for shared and unshared uncertainties in dose estimation and risk analysis, and differs from the 2008 analysis in the following ways: 1. The study population size was reduced from 2,994 to 2,376 subjects, removing 618 persons with uncertain residence histories; 2. Simulation of multiple population dose sets (vectors) was performed using a two-dimensional Monte Carlo dose estimation method; and 3. A Bayesian model averaging approach was employed for evaluating the dose response, explicitly accounting for large and complex uncertainty in dose estimation. The results were compared against conventional regression techniques. The Bayesian approach utilizes 5,000 independent realizations of population dose vectors, each of which corresponds to a set of conditional individual median internal and external doses for the 2,376 subjects. These 5,000 population dose vectors reflect uncertainties in dosimetric parameters, partly shared and partly independent, among individual members of the study population. Risk estimates for thyroid nodules from internal irradiation were higher than those published in 2008, which results, to the best of our knowledge, from explicitly accounting for dose uncertainty. In contrast to earlier findings, the use of Bayesian methods led to the conclusion that the biological effectiveness for internal and external dose was similar. Estimates of excess relative risk per unit dose (ERR/Gy) for males (177 thyroid nodule cases) were almost 30 times those for females (571 cases) and were similar to those reported for thyroid cancers related to childhood exposures to external and internal sources in other studies. For confirmed cases of papillary thyroid cancers (3 in males, 18 in females), the ERR/Gy was also comparable to risk estimates from other studies, but not significantly different from zero. These findings represent the first reported dose response for a radiation epidemiologic study considering all known sources of shared and unshared errors in dose estimation and using a Bayesian model averaging (BMA) method for analysis of the dose response.
Gissi, Elena; Menegon, Stefano; Sarretta, Alessandro; Appiotti, Federica; Maragno, Denis; Vianello, Andrea; Depellegrin, Daniel; Venier, Chiara; Barbanti, Andrea
2017-01-01
Maritime spatial planning (MSP) is envisaged as a tool to apply an ecosystem-based approach to the marine and coastal realms, aiming at ensuring that the collective pressure of human activities is kept within acceptable limits. Cumulative impacts (CI) assessment can support science-based MSP, in order to understand the existing and potential impacts of human uses on the marine environment. A CI assessment includes several sources of uncertainty that can hinder the correct interpretation of its results if not explicitly incorporated in the decision-making process. This study proposes a three-level methodology to perform a general uncertainty analysis integrated with the CI assessment for MSP, applied to the Adriatic and Ionian Region (AIR). We describe the nature and level of uncertainty with the help of expert judgement and elicitation to include all of the possible sources of uncertainty related to the CI model with assumptions and gaps related to the case-based MSP process in the AIR. Next, we use the results to tailor the global uncertainty analysis to spatially describe the uncertainty distribution and variations of the CI scores dependent on the CI model factors. The results show the variability of the uncertainty in the AIR, with only limited portions robustly identified as the most or the least impacted areas under multiple model factors hypothesis. The results are discussed for the level and type of reliable information and insights they provide to decision-making. The most significant uncertainty factors are identified to facilitate the adaptive MSP process and to establish research priorities to fill knowledge gaps for subsequent planning cycles. The method aims to depict the potential CI effects, as well as the extent and spatial variation of the data and scientific uncertainty; therefore, this method constitutes a suitable tool to inform the potential establishment of the precautionary principle in MSP.
NASA Astrophysics Data System (ADS)
Schmidt, Matthew; Fulton, Lori
2016-04-01
The need to prepare students with twenty-first-century skills through STEM-related teaching is strong, especially at the elementary level. However, most teacher education preparation programs do not focus on STEM education. In an attempt to provide an exemplary model of a STEM unit, we used a rapid prototyping approach to transform an inquiry-based unit on moon phases into one that integrated technology in a meaningful manner to develop technological literacy and scientific concepts for pre-service teachers (PSTs). Using qualitative case study methodology, we describe lessons learned related to the development and implementation of a STEM unit in an undergraduate elementary methods course, focusing on the impact the inquiry model had on PSTs' perceptions of inquiry-based science instruction and how the integration of technology impacted their learning experience. Using field notes and survey data, we uncovered three overarching themes. First, we found that PSTs held absolutist beliefs and had a need for instruction on inquiry-based learning and teaching. Second, we determined that explicit examples of effective and ineffective technology use are needed to help PSTs develop an understanding of meaningful technology integration. Finally, the rapid prototyping approach resulted in a successful modification of the unit, but caused the usability of our digital instructional materials to suffer. Our findings suggest that while inquiry-based STEM units can be implemented in existing programs, creating and testing these prototypes requires significant effort to meet PSTs' learning needs, and that iterating designs is essential to successful implementation.
[Stem cells and therapeutic cloning, medical perspectives under discussion].
Manuel, Catherine; Lafon, Claude; Hairion, Dominique; Antoniotti, Stéphanie
2004-03-13
Innovative biotechnical progress over the past few years regards stem cells and therapeutic cloning, which open promising medical horizons for many presently incurable diseases. THE CURRENT DEBATE: The research work in France has been stalled because of the prohibitions listed in the so-called "bioethical" laws of 1994. The ongoing revision of these laws is based on a certain number of ethical questions and launches a disputable parlementary debate. Other than reproductive cloning and research on the embryo, the possibilities provided by stem cells and therapeutic cloning should be emphasized and the different positions advanced specified, showing an evolution in the laws in France. ABUSIVE LEGISLATIVE PROHIBITIONS: The proposed law, which maintains the prohibition for research on the embryo, with a 5-Year dispensation, and which explicitly prohibits therapeutic cloning, is not in keeping with the widening of in this field expected by research teams. Many scientists and physicians, supported by patients' associations, are aware of the importance of therapeutic progress attached to such research. They should not be stalled in their studies by the prohibitions maintained in the new law.
Energetics of codon-anticodon recognition on the small ribosomal subunit.
Almlöf, Martin; Andér, Martin; Aqvist, Johan
2007-01-09
Recent crystal structures of the small ribosomal subunit have made it possible to examine the detailed energetics of codon recognition on the ribosome by computational methods. The binding of cognate and near-cognate anticodon stem loops to the ribosome decoding center, with mRNA containing the Phe UUU and UUC codons, are analyzed here using explicit solvent molecular dynamics simulations together with the linear interaction energy (LIE) method. The calculated binding free energies are in excellent agreement with experimental binding constants and reproduce the relative effects of mismatches in the first and second codon position versus a mismatch at the wobble position. The simulations further predict that the Leu2 anticodon stem loop is about 10 times more stable than the Ser stem loop in complex with the Phe UUU codon. It is also found that the ribosome significantly enhances the intrinsic stability differences of codon-anticodon complexes in aqueous solution. Structural analysis of the simulations confirms the previously suggested importance of the universally conserved nucleotides A1492, A1493, and G530 in the decoding process.
Global asymptotic stability and hopf bifurcation for a blood cell production model.
Crauste, Fabien
2006-04-01
We analyze the asymptotic stability of a nonlinear system of two differential equations with delay, describing the dynamics of blood cell produc- tion. This process takes place in the bone marrow, where stem cells differen- tiate throughout division in blood cells. Taking into account an explicit role of the total population of hematopoietic stem cells in the introduction of cells in cycle, we are led to study a characteristic equation with delay-dependent coefficients. We determine a necessary and sufficient condition for the global stability of the first steady state of our model, which describes the popula- tion's dying out, and we obtain the existence of a Hopf bifurcation for the only nontrivial positive steady state, leading to the existence of periodic solutions. These latter are related to dynamical diseases affecting blood cells known for their cyclic nature.
Stadlbauer, Petr; Krepl, Miroslav; Cheatham, Thomas E.; Koča, Jaroslav; Šponer, Jiří
2013-01-01
Explicit solvent molecular dynamics simulations have been used to complement preceding experimental and computational studies of folding of guanine quadruplexes (G-DNA). We initiate early stages of unfolding of several G-DNAs by simulating them under no-salt conditions and then try to fold them back using standard excess salt simulations. There is a significant difference between G-DNAs with all-anti parallel stranded stems and those with stems containing mixtures of syn and anti guanosines. The most natural rearrangement for all-anti stems is a vertical mutual slippage of the strands. This leads to stems with reduced numbers of tetrads during unfolding and a reduction of strand slippage during refolding. The presence of syn nucleotides prevents mutual strand slippage; therefore, the antiparallel and hybrid quadruplexes initiate unfolding via separation of the individual strands. The simulations confirm the capability of G-DNA molecules to adopt numerous stable locally and globally misfolded structures. The key point for a proper individual folding attempt appears to be correct prior distribution of syn and anti nucleotides in all four G-strands. The results suggest that at the level of individual molecules, G-DNA folding is an extremely multi-pathway process that is slowed by numerous misfolding arrangements stabilized on highly variable timescales. PMID:23700306
Gender Stereotype Susceptibility
Pavlova, Marina A.; Weber, Susanna; Simoes, Elisabeth; Sokolov, Alexander N.
2014-01-01
Gender affects performance on a variety of cognitive tasks, and this impact may stem from socio-cultural factors such as gender stereotyping. Here we systematically manipulated gender stereotype messages on a social cognition task on which no initial gender gap has been documented. The outcome reveals: (i) Stereotyping affects both females and males, with a more pronounced impact on females. Yet an explicit negative message for males elicits a striking paradoxical deterioration in performance of females. (ii) Irrespective of gender and directness of message, valence of stereotype message affects performance: negative messages have stronger influence than positive ones. (iii) Directness of stereotype message differentially impacts performance of females and males: females tend to be stronger affected by implicit than explicit negative messages, whereas in males this relationship is opposite. The data are discussed in the light of neural networks underlying gender stereotyping. The findings provide novel insights into the sources of gender related fluctuations in cognition and behavior. PMID:25517903
Gender stereotype susceptibility.
Pavlova, Marina A; Weber, Susanna; Simoes, Elisabeth; Sokolov, Alexander N
2014-01-01
Gender affects performance on a variety of cognitive tasks, and this impact may stem from socio-cultural factors such as gender stereotyping. Here we systematically manipulated gender stereotype messages on a social cognition task on which no initial gender gap has been documented. The outcome reveals: (i) Stereotyping affects both females and males, with a more pronounced impact on females. Yet an explicit negative message for males elicits a striking paradoxical deterioration in performance of females. (ii) Irrespective of gender and directness of message, valence of stereotype message affects performance: negative messages have stronger influence than positive ones. (iii) Directness of stereotype message differentially impacts performance of females and males: females tend to be stronger affected by implicit than explicit negative messages, whereas in males this relationship is opposite. The data are discussed in the light of neural networks underlying gender stereotyping. The findings provide novel insights into the sources of gender related fluctuations in cognition and behavior.
NASA Astrophysics Data System (ADS)
Orlandi, Javier G.; Casademunt, Jaume
2017-05-01
We introduce a coarse-grained stochastic model for the spontaneous activity of neuronal cultures to explain the phenomenon of noise focusing, which entails localization of the noise activity in excitable networks with metric correlations. The system is modeled as a continuum excitable medium with a state-dependent spatial coupling that accounts for the dynamics of synaptic connections. The most salient feature is the emergence at the mesoscale of a vector field V (r ) , which acts as an advective carrier of the noise. This entails an explicit symmetry breaking of isotropy and homogeneity that stems from the amplification of the quenched fluctuations of the network by the activity avalanches, concomitant with the excitable dynamics. We discuss the microscopic interpretation of V (r ) and propose an explicit construction of it. The coarse-grained model shows excellent agreement with simulations at the network level. The generic nature of the observed phenomena is discussed.
Marin E. Chambers; Paula J. Fornwalt; Sparkle L. Malone; Michael Battaglia
2016-01-01
Many recent wildfires in ponderosa pine (Pinus ponderosa Lawson & C. Lawson) - dominated forests of the western United States have burned more severely than historical ones, generating concern about forest resilience. This concern stems from uncertainty about the ability of ponderosa pine and other co-occurring conifers to regenerate in areas where no...
Gendered Expectations: Examining How Peers Shape Female Students' Intent to Pursue STEM Fields.
Riegle-Crumb, Catherine; Morton, Karisma
2017-01-01
Building on prior psychological and sociological research on the power of local environments to shape gendered outcomes in STEM fields, this study focuses on the critical stage of adolescence to explore the potential negative impact of exposure to exclusionary messages from peers within girls' science classrooms, as well as the positive potential impact of inclusionary messages. Specifically, utilizing longitudinal data from a diverse sample of adolescent youth, analyses examine how the presence of biased male peers, as well as confident female peers, shape girls' subsequent intentions to pursue different STEM fields, focusing specifically on intentions to pursue the male-dominated fields of computer science and engineering, as well as more gender equitable fields. Results reveal that exposure to a higher percentage of 8th grade male peers in the classroom who endorsed explicit gender/STEM stereotypes significantly and negatively predicted girls' later intentions to pursue a computer science/engineering (CS/E) major. Yet results also reveal that exposure to a higher percentage of confident female peers in the science classroom positively predicted such intentions. These results were specific to CS/E majors, suggesting that peers are an important source of messages regarding whether or not girls should pursue non-traditional STEM fields. This study calls attention to the importance of examining both positive and negative sources of influence within the local contexts where young people live and learn. Limitations and directions for future research are also discussed.
Gendered Expectations: Examining How Peers Shape Female Students' Intent to Pursue STEM Fields
Riegle-Crumb, Catherine; Morton, Karisma
2017-01-01
Building on prior psychological and sociological research on the power of local environments to shape gendered outcomes in STEM fields, this study focuses on the critical stage of adolescence to explore the potential negative impact of exposure to exclusionary messages from peers within girls' science classrooms, as well as the positive potential impact of inclusionary messages. Specifically, utilizing longitudinal data from a diverse sample of adolescent youth, analyses examine how the presence of biased male peers, as well as confident female peers, shape girls' subsequent intentions to pursue different STEM fields, focusing specifically on intentions to pursue the male-dominated fields of computer science and engineering, as well as more gender equitable fields. Results reveal that exposure to a higher percentage of 8th grade male peers in the classroom who endorsed explicit gender/STEM stereotypes significantly and negatively predicted girls' later intentions to pursue a computer science/engineering (CS/E) major. Yet results also reveal that exposure to a higher percentage of confident female peers in the science classroom positively predicted such intentions. These results were specific to CS/E majors, suggesting that peers are an important source of messages regarding whether or not girls should pursue non-traditional STEM fields. This study calls attention to the importance of examining both positive and negative sources of influence within the local contexts where young people live and learn. Limitations and directions for future research are also discussed. PMID:28360868
NASA Astrophysics Data System (ADS)
Arnbjerg-Nielsen, Karsten; Zhou, Qianqian
2014-05-01
There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Along with improved flood modelling technologies this has enabled development of economic assessment of climate change adaptation to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of current risk, drivers of change of risk over time, and measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from basic assumptions in the economic analysis and the hydrological model, but also from the projection of future societies to local climate change impacts and suitable adaptation options. This presents a challenge to decision makers when trying to identify robust measures. We present an integrated uncertainty analysis, which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that by means of Monte Carlo simulations of flood risk assessments incorporates climate change impacts as a key driver of risk changes over time. The overall uncertainty is then attributed to six bulk processes: climate change impact, urban rainfall-runoff processes, stage-depth functions, unit cost of repair, cost of adaptation measures, and discount rate. We apply the approach on an urban hydrological catchment in Odense, Denmark, and find that the uncertainty on the climate change impact appears to have the least influence on the net present value of the studied adaptation measures-. This does not imply that the climate change impact is not important, but that the uncertainties are not dominating when deciding on action or in-action. We then consider the uncertainty related to choosing between adaptation options given that a decision of action has been taken. In this case the major part of the uncertainty on the estimated net present values is identical for all adaptation options and will therefore not affect a comparison between adaptation measures. This makes the chose among the options easier. Furthermore, the explicit attribution of uncertainty also enables a reduction of the overall uncertainty by identifying the processes which contributes the most. This knowledge can then be used to further reduce the uncertainty related to decision making, as a substantial part of the remaining uncertainty is epistemic.
Uncertainty in projected climate change arising from uncertain fossil-fuel emission factors
NASA Astrophysics Data System (ADS)
Quilcaille, Y.; Gasser, T.; Ciais, P.; Lecocq, F.; Janssens-Maenhout, G.; Mohr, S.
2018-04-01
Emission inventories are widely used by the climate community, but their uncertainties are rarely accounted for. In this study, we evaluate the uncertainty in projected climate change induced by uncertainties in fossil-fuel emissions, accounting for non-CO2 species co-emitted with the combustion of fossil-fuels and their use in industrial processes. Using consistent historical reconstructions and three contrasted future projections of fossil-fuel extraction from Mohr et al we calculate CO2 emissions and their uncertainties stemming from estimates of fuel carbon content, net calorific value and oxidation fraction. Our historical reconstructions of fossil-fuel CO2 emissions are consistent with other inventories in terms of average and range. The uncertainties sum up to a ±15% relative uncertainty in cumulative CO2 emissions by 2300. Uncertainties in the emissions of non-CO2 species associated with the use of fossil fuels are estimated using co-emission ratios varying with time. Using these inputs, we use the compact Earth system model OSCAR v2.2 and a Monte Carlo setup, in order to attribute the uncertainty in projected global surface temperature change (ΔT) to three sources of uncertainty, namely on the Earth system’s response, on fossil-fuel CO2 emission and on non-CO2 co-emissions. Under the three future fuel extraction scenarios, we simulate the median ΔT to be 1.9, 2.7 or 4.0 °C in 2300, with an associated 90% confidence interval of about 65%, 52% and 42%. We show that virtually all of the total uncertainty is attributable to the uncertainty in the future Earth system’s response to the anthropogenic perturbation. We conclude that the uncertainty in emission estimates can be neglected for global temperature projections in the face of the large uncertainty in the Earth system response to the forcing of emissions. We show that this result does not hold for all variables of the climate system, such as the atmospheric partial pressure of CO2 and the radiative forcing of tropospheric ozone, that have an emissions-induced uncertainty representing more than 40% of the uncertainty in the Earth system’s response.
Sato, Yousuke; Goto, Daisuke; Michibata, Takuro; Suzuki, Kentaroh; Takemura, Toshihiko; Tomita, Hirofumi; Nakajima, Teruyuki
2018-03-07
Aerosols affect climate by modifying cloud properties through their role as cloud condensation nuclei or ice nuclei, called aerosol-cloud interactions. In most global climate models (GCMs), the aerosol-cloud interactions are represented by empirical parameterisations, in which the mass of cloud liquid water (LWP) is assumed to increase monotonically with increasing aerosol loading. Recent satellite observations, however, have yielded contradictory results: LWP can decrease with increasing aerosol loading. This difference implies that GCMs overestimate the aerosol effect, but the reasons for the difference are not obvious. Here, we reproduce satellite-observed LWP responses using a global simulation with explicit representations of cloud microphysics, instead of the parameterisations. Our analyses reveal that the decrease in LWP originates from the response of evaporation and condensation processes to aerosol perturbations, which are not represented in GCMs. The explicit representation of cloud microphysics in global scale modelling reduces the uncertainty of climate prediction.
Pesticide risk assessment in free-ranging bees is weather and landscape dependent.
Henry, Mickaël; Bertrand, Colette; Le Féon, Violette; Requier, Fabrice; Odoux, Jean-François; Aupinel, Pierrick; Bretagnolle, Vincent; Decourtye, Axel
2014-07-10
The risk assessment of plant protection products on pollinators is currently based on the evaluation of lethal doses through repeatable lethal toxicity laboratory trials. Recent advances in honeybee toxicology have, however, raised interest on assessing sublethal effects in free-ranging individuals. Here, we show that the sublethal effects of a neonicotinoid pesticide are modified in magnitude by environmental interactions specific to the landscape and time of exposure events. Field sublethal assessment is therefore context dependent and should be addressed in a temporally and spatially explicit way, especially regarding weather and landscape physiognomy. We further develop an analytical Effective Dose (ED) framework to help disentangle context-induced from treatment-induced effects and thus to alleviate uncertainty in field studies. Although the ED framework involves trials at concentrations above the expected field exposure levels, it allows to explicitly delineating the climatic and landscape contexts that should be targeted for in-depth higher tier risk assessment.
Supporting Handoff in Asynchronous Collaborative Sensemaking Using Knowledge-Transfer Graphs.
Zhao, Jian; Glueck, Michael; Isenberg, Petra; Chevalier, Fanny; Khan, Azam
2018-01-01
During asynchronous collaborative analysis, handoff of partial findings is challenging because externalizations produced by analysts may not adequately communicate their investigative process. To address this challenge, we developed techniques to automatically capture and help encode tacit aspects of the investigative process based on an analyst's interactions, and streamline explicit authoring of handoff annotations. We designed our techniques to mediate awareness of analysis coverage, support explicit communication of progress and uncertainty with annotation, and implicit communication through playback of investigation histories. To evaluate our techniques, we developed an interactive visual analysis system, KTGraph, that supports an asynchronous investigative document analysis task. We conducted a two-phase user study to characterize a set of handoff strategies and to compare investigative performance with and without our techniques. The results suggest that our techniques promote the use of more effective handoff strategies, help increase an awareness of prior investigative process and insights, as well as improve final investigative outcomes.
NASA Technical Reports Server (NTRS)
Oda, T.; Ott, L.; Lauvaux, T.; Feng, S.; Bun, R.; Roman, M.; Baker, D. F.; Pawson, S.
2017-01-01
Fossil fuel carbon dioxide (CO2) emissions (FFCO2) are the largest input to the global carbon cycle on a decadal time scale. Because total emissions are assumed to be reasonably well constrained by fuel statistics, FFCO2 often serves as a reference in order to deduce carbon uptake by poorly understood terrestrial and ocean sinks. Conventional atmospheric CO2 flux inversions solve for spatially explicit regional sources and sinks and estimate land and ocean fluxes by subtracting FFCO2. Thus, errors in FFCO2 can propagate into the final inferred flux estimates. Gridded emissions are often based on disaggregation of emissions estimated at national or regional level. Although national and regional total FFCO2 are well known, gridded emission fields are subject to additional uncertainties due to the emission disaggregation. Assessing such uncertainties is often challenging because of the lack of physical measurements for evaluation. We first review difficulties in assessing uncertainties associated with gridded FFCO2 emission data and present several approaches for evaluation of such uncertainties at multiple scales. Given known limitations, inter-emission data differences are often used as a proxy for the uncertainty. The popular approach allows us to characterize differences in emissions, but does not allow us to fully quantify emission disaggregation biases. Our work aims to vicariously evaluate FFCO2 emission data using atmospheric models and measurements. We show a global simulation experiment where uncertainty estimates are propagated as an atmospheric tracer (uncertainty tracer) alongside CO2 in NASA's GEOS model and discuss implications of FFCO2 uncertainties in the context of flux inversions. We also demonstrate the use of high resolution urban CO2 simulations as a tool for objectively evaluating FFCO2 data over intense emission regions. Though this study focuses on FFCO2 emission data, the outcome of this study could also help improve the knowledge of similar gridded emissions data for non-CO2 compounds with similar emission characteristics.
NASA Astrophysics Data System (ADS)
Oda, T.; Ott, L. E.; Lauvaux, T.; Feng, S.; Bun, R.; Roman, M. O.; Baker, D. F.; Pawson, S.
2017-12-01
Fossil fuel carbon dioxide (CO2) emissions (FFCO2) are the largest input to the global carbon cycle on a decadal time scale. Because total emissions are assumed to be reasonably well constrained by fuel statistics, FFCO2 often serves as a reference in order to deduce carbon uptake by poorly understood terrestrial and ocean sinks. Conventional atmospheric CO2 flux inversions solve for spatially explicit regional sources and sinks and estimate land and ocean fluxes by subtracting FFCO2. Thus, errors in FFCO2 can propagate into the final inferred flux estimates. Gridded emissions are often based on disaggregation of emissions estimated at national or regional level. Although national and regional total FFCO2 are well known, gridded emission fields are subject to additional uncertainties due to the emission disaggregation. Assessing such uncertainties is often challenging because of the lack of physical measurements for evaluation. We first review difficulties in assessing uncertainties associated with gridded FFCO2 emission data and present several approaches for evaluation of such uncertainties at multiple scales. Given known limitations, inter-emission data differences are often used as a proxy for the uncertainty. The popular approach allows us to characterize differences in emissions, but does not allow us to fully quantify emission disaggregation biases. Our work aims to vicariously evaluate FFCO2 emission data using atmospheric models and measurements. We show a global simulation experiment where uncertainty estimates are propagated as an atmospheric tracer (uncertainty tracer) alongside CO2 in NASA's GEOS model and discuss implications of FFCO2 uncertainties in the context of flux inversions. We also demonstrate the use of high resolution urban CO2 simulations as a tool for objectively evaluating FFCO2 data over intense emission regions. Though this study focuses on FFCO2 emission data, the outcome of this study could also help improve the knowledge of similar gridded emissions data for non-CO2 compounds that share emission sectors.
Cost-effective conservation of an endangered frog under uncertainty.
Rose, Lucy E; Heard, Geoffrey W; Chee, Yung En; Wintle, Brendan A
2016-04-01
How should managers choose among conservation options when resources are scarce and there is uncertainty regarding the effectiveness of actions? Well-developed tools exist for prioritizing areas for one-time and binary actions (e.g., protect vs. not protect), but methods for prioritizing incremental or ongoing actions (such as habitat creation and maintenance) remain uncommon. We devised an approach that combines metapopulation viability and cost-effectiveness analyses to select among alternative conservation actions while accounting for uncertainty. In our study, cost-effectiveness is the ratio between the benefit of an action and its economic cost, where benefit is the change in metapopulation viability. We applied the approach to the case of the endangered growling grass frog (Litoria raniformis), which is threatened by urban development. We extended a Bayesian model to predict metapopulation viability under 9 urbanization and management scenarios and incorporated the full probability distribution of possible outcomes for each scenario into the cost-effectiveness analysis. This allowed us to discern between cost-effective alternatives that were robust to uncertainty and those with a relatively high risk of failure. We found a relatively high risk of extinction following urbanization if the only action was reservation of core habitat; habitat creation actions performed better than enhancement actions; and cost-effectiveness ranking changed depending on the consideration of uncertainty. Our results suggest that creation and maintenance of wetlands dedicated to L. raniformis is the only cost-effective action likely to result in a sufficiently low risk of extinction. To our knowledge we are the first study to use Bayesian metapopulation viability analysis to explicitly incorporate parametric and demographic uncertainty into a cost-effective evaluation of conservation actions. The approach offers guidance to decision makers aiming to achieve cost-effective conservation under uncertainty. © 2015 Society for Conservation Biology.
Constraining uncertainties in water supply reliability in a tropical data scarce basin
NASA Astrophysics Data System (ADS)
Kaune, Alexander; Werner, Micha; Rodriguez, Erasmo; de Fraiture, Charlotte
2015-04-01
Assessing the water supply reliability in river basins is essential for adequate planning and development of irrigated agriculture and urban water systems. In many cases hydrological models are applied to determine the surface water availability in river basins. However, surface water availability and variability is often not appropriately quantified due to epistemic uncertainties, leading to water supply insecurity. The objective of this research is to determine the water supply reliability in order to support planning and development of irrigated agriculture in a tropical, data scarce environment. The approach proposed uses a simple hydrological model, but explicitly includes model parameter uncertainty. A transboundary river basin in the tropical region of Colombia and Venezuela with an approximately area of 2100 km² was selected as a case study. The Budyko hydrological framework was extended to consider climatological input variability and model parameter uncertainty, and through this the surface water reliability to satisfy the irrigation and urban demand was estimated. This provides a spatial estimate of the water supply reliability across the basin. For the middle basin the reliability was found to be less than 30% for most of the months when the water is extracted from an upstream source. Conversely, the monthly water supply reliability was high (r>98%) in the lower basin irrigation areas when water was withdrawn from a source located further downstream. Including model parameter uncertainty provides a complete estimate of the water supply reliability, but that estimate is influenced by the uncertainty in the model. Reducing the uncertainty in the model through improved data and perhaps improved model structure will improve the estimate of the water supply reliability allowing better planning of irrigated agriculture and dependable water allocation decisions.
NASA Astrophysics Data System (ADS)
Dale, Amy; Fant, Charles; Strzepek, Kenneth; Lickley, Megan; Solomon, Susan
2017-03-01
We present maize production in sub-Saharan Africa as a case study in the exploration of how uncertainties in global climate change, as reflected in projections from a range of climate model ensembles, influence climate impact assessments for agriculture. The crop model AquaCrop-OS (Food and Agriculture Organization of the United Nations) was modified to run on a 2° × 2° grid and coupled to 122 climate model projections from multi-model ensembles for three emission scenarios (Coupled Model Intercomparison Project Phase 3 [CMIP3] SRES A1B and CMIP5 Representative Concentration Pathway [RCP] scenarios 4.5 and 8.5) as well as two "within-model" ensembles (NCAR CCSM3 and ECHAM5/MPI-OM) designed to capture internal variability (i.e., uncertainty due to chaos in the climate system). In spite of high uncertainty, most notably in the high-producing semi-arid zones, we observed robust regional and sub-regional trends across all ensembles. In agreement with previous work, we project widespread yield losses in the Sahel region and Southern Africa, resilience in Central Africa, and sub-regional increases in East Africa and at the southern tip of the continent. Spatial patterns of yield losses corresponded with spatial patterns of aridity increases, which were explicitly evaluated. Internal variability was a major source of uncertainty in both within-model and between-model ensembles and explained the majority of the spatial distribution of uncertainty in yield projections. Projected climate change impacts on maize production in different regions and nations ranged from near-zero or positive (upper quartile estimates) to substantially negative (lower quartile estimates), highlighting a need for risk management strategies that are adaptive and robust to uncertainty.
A Geographically Explicit Genetic Model of Worldwide Human-Settlement History
Liu, Hua; Prugnolle, Franck; Manica, Andrea; Balloux, François
2006-01-01
Currently available genetic and archaeological evidence is generally interpreted as supportive of a recent single origin of modern humans in East Africa. However, this is where the near consensus on human settlement history ends, and considerable uncertainty clouds any more detailed aspect of human colonization history. Here, we present a dynamic genetic model of human settlement history coupled with explicit geographical distances from East Africa, the likely origin of modern humans. We search for the best-supported parameter space by fitting our analytical prediction to genetic data that are based on 52 human populations analyzed at 783 autosomal microsatellite markers. This framework allows us to jointly estimate the key parameters of the expansion of modern humans. Our best estimates suggest an initial expansion of modern humans ∼56,000 years ago from a small founding population of ∼1,000 effective individuals. Our model further points to high growth rates in newly colonized habitats. The general fit of the model with the data is excellent. This suggests that coupling analytical genetic models with explicit demography and geography provides a powerful tool for making inferences on human-settlement history. PMID:16826514
Resonance frequency of fluid-filled and prestressed spherical shell-A model of the human eyeball.
Shih, Po-Jen; Guo, Yi-Ren
2016-04-01
An acoustic tonometer that measures shifts in resonance frequencies associated with intraocular pressure (IOP) could provide an opportunity for a type of tonometer that can be operated at home or worn by patients. However, there is insufficient theoretical background, especially with respect to the uncertainty in operating frequency ranges and the unknown relationships between IOPs and resonance frequencies. The purpose of this paper is to develop a frequency function for application in an acoustic tonometer. A linear wave theory is used to derive an explicit frequency function, consisting of an IOP and seven other physiological parameters. In addition, impulse response experiments are performed to measure the natural frequencies of porcine eyes to validate the provided function. From a real-time detection perspective, explicitly providing a frequency function can be the best way to set up an acoustic tonometer. The theory shows that the resonance oscillation of the eyeball is mainly dominated by liquid inside the eyeball. The experimental validation demonstrates the good prediction of IOPs and resonance frequencies. The proposed explicit frequency function supports further modal analysis not only of the dynamics of eyeballs, but also of the natural frequencies, for further development of the acoustic tonometer.
Uncertainty and risk in wildland fire management: a review.
Thompson, Matthew P; Calkin, Dave E
2011-08-01
Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to treatments, and of spatiotemporal dynamics involving disturbance regimes and climate change. This work attempts to systematically align sources of uncertainty with the most appropriate decision support methodologies, in order to facilitate cost-effective, risk-based wildfire planning efforts. We review the state of wildfire risk assessment and management, with a specific focus on uncertainties challenging implementation of integrated risk assessments that consider a suite of human and ecological values. Recent advances in wildfire simulation and geospatial mapping of highly valued resources have enabled robust risk-based analyses to inform planning across a variety of scales, although improvements are needed in fire behavior and ignition occurrence models. A key remaining challenge is a better characterization of non-market resources at risk, both in terms of their response to fire and how society values those resources. Our findings echo earlier literature identifying wildfire effects analysis and value uncertainty as the primary challenges to integrated wildfire risk assessment and wildfire management. We stress the importance of identifying and characterizing uncertainties in order to better quantify and manage them. Leveraging the most appropriate decision support tools can facilitate wildfire risk assessment and ideally improve decision-making. Published by Elsevier Ltd.
Aeras: A next generation global atmosphere model
Spotz, William F.; Smith, Thomas M.; Demeshko, Irina P.; ...
2015-06-01
Sandia National Laboratories is developing a new global atmosphere model named Aeras that is performance portable and supports the quantification of uncertainties. These next-generation capabilities are enabled by building Aeras on top of Albany, a code base that supports the rapid development of scientific application codes while leveraging Sandia's foundational mathematics and computer science packages in Trilinos and Dakota. Embedded uncertainty quantification (UQ) is an original design capability of Albany, and performance portability is a recent upgrade. Other required features, such as shell-type elements, spectral elements, efficient explicit and semi-implicit time-stepping, transient sensitivity analysis, and concurrent ensembles, were not componentsmore » of Albany as the project began, and have been (or are being) added by the Aeras team. We present early UQ and performance portability results for the shallow water equations.« less
On the Timing of Glacial Terminations in the Equatorial Pacific
NASA Astrophysics Data System (ADS)
Khider, D.; Ahn, S.; Lisiecki, L. E.; Lawrence, C.; Kienast, M.
2015-12-01
Understanding the mechanisms through which the climate system responds to orbital insolation changes requires establishing the timing of events imprinted on the geological record. In this study, we investigate the relative timing of the glacial terminations across the equatorial Pacific in order to identify a possible mechanism through which the tropics may have influenced a global climate response. The relative termination timing between the eastern and western equatorial Pacific was assessed from 15 published SST records based on Globigerinoides ruber Mg/Ca or alkenone thermometry. The novelty of our study lies in the accounting of the various sources of uncertainty inherent to paleoclimate reconstruction and timing analysis. Specifically, we use a Monte-Carlo process allowing sampling of possible realizations of the time series that are functions of the uncertainty of the benthic δ18O alignment to a global benthic curve, of the SST uncertainty, and of the uncertainty in the change point, which we use as a definition for the termination timing. We find that the uncertainty on the relative timing estimates is on the order of several thousand years, and stems from age model uncertainty (60%) and the uncertainty in the change point detection (40%). Random sources of uncertainty are the main contributor, and, therefore, averaging over a large datasets and/or higher resolution records should yield more precise and accurate estimates of the relative lead-lag. However, at this time, the number of records is not sufficient to identify any significant differences in the timing of the last three glacial terminations in SST records from the Eastern and Western Tropical Pacific.
Combining Bayesian Networks and Agent Based Modeling to develop a decision-support model in Vietnam
NASA Astrophysics Data System (ADS)
Nong, Bao Anh; Ertsen, Maurits; Schoups, Gerrit
2016-04-01
Complexity and uncertainty in natural resources management have been focus themes in recent years. Within these debates, with the aim to define an approach feasible for water management practice, we are developing an integrated conceptual modeling framework for simulating decision-making processes of citizens, in our case in the Day river area, Vietnam. The model combines Bayesian Networks (BNs) and Agent-Based Modeling (ABM). BNs are able to combine both qualitative data from consultants / experts / stakeholders, and quantitative data from observations on different phenomena or outcomes from other models. Further strengths of BNs are that the relationship between variables in the system is presented in a graphical interface, and that components of uncertainty are explicitly related to their probabilistic dependencies. A disadvantage is that BNs cannot easily identify the feedback of agents in the system once changes appear. Hence, ABM was adopted to represent the reaction among stakeholders under changes. The modeling framework is developed as an attempt to gain better understanding about citizen's behavior and factors influencing their decisions in order to reduce uncertainty in the implementation of water management policy.
NASA Astrophysics Data System (ADS)
Christoffersen, B. O.; Xu, C.; Koven, C.; Fisher, R.; Knox, R. G.; Kueppers, L. M.; Chambers, J. Q.; McDowell, N.
2017-12-01
Recent syntheses of variation in woody plant traits have emphasized how hydraulic traits - those related to the acquisition, transport and retention of water across roots, stems and leaves - are coordinated along a limited set of dimensions or sequence of responses (Reich 2014, Bartlett et al. 2016). However, in many hydraulic trait-trait relationships, there is considerable residual variation, despite the fact that many bivariate relationships are statistically significant. In other instances, such as the relationship between root-stem-leaf vulnerability to embolism, data are so limited that testing the trait coordination hypothesis is not yet possible. The impacts on plant hydraulic function of competing hypotheses regarding trait coordination (or the lack thereof) and residual trait variation have not yet been comprehensively tested and thus remain unknown. We addressed this knowledge gap with a parameter sensitivity analysis using a plant hydraulics model in which all parameters are biologically-interpretable and measurable plant hydraulic traits, as embedded within a size- and demographically-structured ecosystem model, the `Functionally Assembled Terrestrial Ecosystem Simulator' (FATES). We focused on tropical forests, where co-existing species have been observed to possess large variability in their hydraulic traits. Assembling 10 distinct datasets of hydraulic traits of stomata, leaves, stems, and roots, we determined the best-fit theoretical distribution for each trait and quantified interspecific (between-species) trait-trait coordination in tropical forests as a rank correlation matrix. We imputed missing correlations with values based on competing hypotheses of trait coordination, such as coordinated shifts in embolism vulnerability from roots to shoots (the hydraulic fuse hypothesis). Based on the Fourier Amplitude Sensitivity Test and our correlation matrix, we generated thousands of parameter sets for an ensemble of hydraulics model simulations at a tropical forest site in central Amazonia. We explore the sensitivity of simulated leaf water potential and stem sap flux in the context of hypotheses of trait-trait coordination and their associated uncertainties.
The interplay between societal concerns and the regulatory frame on GM crops in the European Union.
Devos, Yann; Reheul, Dirk; De Waele, Danny; Van Speybroeck, Linda
2006-01-01
Recapitulating how genetic modification technology and its agro-food products aroused strong societal opposition in the European Union, this paper demonstrates how this opposition contributed to shape the European regulatory frame on GM crops. More specifically, it describes how this opposition contributed to a de facto moratorium on the commercialization of new GM crop events in the end of the nineties. From this period onwards, the regulatory frame has been continuously revised in order to slow down further erosion of public and market confidence. Various scientific and technical reforms were made to meet societal concerns relating to the safety of GM crops. In this context, the precautionary principle, environmental post-market monitoring and traceability were adopted as ways to cope with scientific uncertainties. Labeling, traceability, co-existence and public information were installed in an attempt to meet the general public request for more information about GM agro-food products, and the specific demand to respect the consumers' and farmers' freedom of choice. Despite these efforts, today, the explicit role of public participation and/or ethical consultation during authorization procedures is at best minimal. Moreover, no legal room was created to progress to an integral sustainability evaluation during market procedures. It remains to be seen whether the recent policy shift towards greater transparency about value judgments, plural viewpoints and scientific uncertainties will be one step forward in integrating ethical concerns more explicitly in risk analysis. As such, the regulatory frame stands open for further interpretation, reflecting in various degrees a continued interplay with societal concerns relating to GM agro-food products. In this regard, both societal concerns and diversely interpreted regulatory criteria can be inferred as signaling a request - and even a quest - to render more explicit the broader-than-scientific dimension of the actual risk analysis.
Gkionis, Konstantinos; Kruse, Holger; Šponer, Jiří
2016-04-12
Modern dispersion-corrected DFT methods have made it possible to perform reliable QM studies on complete nucleic acid (NA) building blocks having hundreds of atoms. Such calculations, although still limited to investigations of potential energy surfaces, enhance the portfolio of computational methods applicable to NAs and offer considerably more accurate intrinsic descriptions of NAs than standard MM. However, in practice such calculations are hampered by the use of implicit solvent environments and truncation of the systems. Conventional QM optimizations are spoiled by spurious intramolecular interactions and severe structural deformations. Here we compare two approaches designed to suppress such artifacts: partially restrained continuum solvent QM and explicit solvent QM/MM optimizations. We report geometry relaxations of a set of diverse double-quartet guanine quadruplex (GQ) DNA stems. Both methods provide neat structures without major artifacts. However, each one also has distinct weaknesses. In restrained optimizations, all errors in the target geometries (i.e., low-resolution X-ray and NMR structures) are transferred to the optimized geometries. In QM/MM, the initial solvent configuration causes some heterogeneity in the geometries. Nevertheless, both approaches represent a decisive step forward compared to conventional optimizations. We refine earlier computations that revealed sizable differences in the relative energies of GQ stems computed with AMBER MM and QM. We also explore the dependence of the QM/MM results on the applied computational protocol.
The effect of age on word-stem cued recall: a behavioral and electrophysiological study.
Osorio, Alexandra; Ballesteros, Soledad; Fay, Séverine; Pouthas, Viviane
2009-09-15
The present study investigated the effects of aging on behavioral cued-recall performance and on the neural correlates of explicit memory using event-related potentials (ERPs) under shallow and deep encoding conditions. At test, participants were required to complete old and new three-letter word stems using the letters as retrieval cues. The main results were as follows: (1) older participants exhibited the same level of explicit memory as young adults with the same high level of education. Moreover older adults benefited as much as young ones from deep processing at encoding; (2) brain activity at frontal sites showed that the shallow old/new effect developed and ended earlier for older than young adults. In contrast, the deep old/new effect started later for older than for young adults and was sustained up to 1000 ms in both age groups. Moreover, the results suggest that the frontal old/new effect was bilateral but greater over the right than the left electrode sites from 600 ms onward; (3) there were no differences at parietal sites between age groups: the old/new effect developed from 400 ms under both encoding conditions and was sustained up to 1000 ms under the deep condition but ended earlier (800 ms) under the shallow condition. These ERP results indicate significant age-related changes in brain activity associated with the voluntary retrieval of previously encoded information, in spite of similar behavioral performance of young and older adults.
To what extent can ecosystem services motivate protecting biodiversity?
Dee, Laura E; De Lara, Michel; Costello, Christopher; Gaines, Steven D
2017-08-01
Society increasingly focuses on managing nature for the services it provides people rather than for the existence of particular species. How much biodiversity protection would result from this modified focus? Although biodiversity contributes to ecosystem services, the details of which species are critical, and whether they will go functionally extinct in the future, are fraught with uncertainty. Explicitly considering this uncertainty, we develop an analytical framework to determine how much biodiversity protection would arise solely from optimising net value from an ecosystem service. Using stochastic dynamic programming, we find that protecting a threshold number of species is optimal, and uncertainty surrounding how biodiversity produces services makes it optimal to protect more species than are presumed critical. We define conditions under which the economically optimal protection strategy is to protect all species, no species, and cases in between. We show how the optimal number of species to protect depends upon different relationships between species and services, including considering multiple services. Our analysis provides simple criteria to evaluate when managing for particular ecosystem services could warrant protecting all species, given uncertainty. Evaluating this criterion with empirical estimates from different ecosystems suggests that optimising some services will be more likely to protect most species than others. © 2017 John Wiley & Sons Ltd/CNRS.
Forest management under uncertainty for multiple bird population objectives
Moore, C.T.; Plummer, W.T.; Conroy, M.J.; Ralph, C. John; Rich, Terrell D.
2005-01-01
We advocate adaptive programs of decision making and monitoring for the management of forest birds when responses by populations to management, and particularly management trade-offs among populations, are uncertain. Models are necessary components of adaptive management. Under this approach, uncertainty about the behavior of a managed system is explicitly captured in a set of alternative models. The models generate testable predictions about the response of populations to management, and monitoring data provide the basis for assessing these predictions and informing future management decisions. To illustrate these principles, we examine forest management at the Piedmont National Wildlife Refuge, where management attention is focused on the recovery of the Red-cockaded Woodpecker (Picoides borealis) population. However, managers are also sensitive to the habitat needs of many non-target organisms, including Wood Thrushes (Hylocichla mustelina) and other forest interior Neotropical migratory birds. By simulating several management policies on a set of-alternative forest and bird models, we found a decision policy that maximized a composite response by woodpeckers and Wood Thrushes despite our complete uncertainty regarding system behavior. Furthermore, we used monitoring data to update our measure of belief in each alternative model following one cycle of forest management. This reduction of uncertainty translates into a reallocation of model influence on the choice of optimal decision action at the next decision opportunity.
An adaptive decision framework for the conservation of a threatened plant
Moore, Clinton T.; Fonnesbeck, Christopher J.; Shea, Katriona; Lah, Kristopher J.; McKenzie, Paul M.; Ball, Lianne C.; Runge, Michael C.; Alexander, Helen M.
2011-01-01
Mead's milkweed Asclepias meadii, a long-lived perennial herb of tallgrass prairie and glade communities of the central United States, is a species designated as threatened under the U.S. Endangered Species Act. Challenges to its successful management include the facts that much about its life history is unknown, its age at reproductive maturity is very advanced, certain life stages are practically unobservable, its productivity is responsive to unpredictable environmental events, and most of the known populations occur on private lands unprotected by any legal conservation instrument. One critical source of biological uncertainty is the degree to which fire promotes growth and reproductive response in the plant. To aid in its management, we developed a prototype population-level state-dependent decision-making framework that explicitly accounts for this uncertainty and for uncertainties related to stochastic environmental effects and vital rates. To parameterize the decision model, we used estimates found in the literature, and we analyzed data from a long-term monitoring program where fates of individual plants were observed through time. We demonstrate that different optimal courses of action are followed according to how one believes that fire influences reproductive response, and we show that the action taken for certain population states is informative for resolving uncertainty about competing beliefs regarding the effect of fire. We advocate the use of a model-predictive approach for the management of rare populations, particularly when management uncertainty is profound. Over time, an adaptive management approach should reduce uncertainty and improve management performance as predictions of management outcome generated under competing models are continually informed and updated by monitoring data.
NASA Astrophysics Data System (ADS)
Pilz, Tobias; Francke, Till; Bronstert, Axel
2016-04-01
Until today a large number of competing computer models has been developed to understand hydrological processes and to simulate and predict streamflow dynamics of rivers. This is primarily the result of a lack of a unified theory in catchment hydrology due to insufficient process understanding and uncertainties related to model development and application. Therefore, the goal of this study is to analyze the uncertainty structure of a process-based hydrological catchment model employing a multiple hypotheses approach. The study focuses on three major problems that have received only little attention in previous investigations. First, to estimate the impact of model structural uncertainty by employing several alternative representations for each simulated process. Second, explore the influence of landscape discretization and parameterization from multiple datasets and user decisions. Third, employ several numerical solvers for the integration of the governing ordinary differential equations to study the effect on simulation results. The generated ensemble of model hypotheses is then analyzed and the three sources of uncertainty compared against each other. To ensure consistency and comparability all model structures and numerical solvers are implemented within a single simulation environment. First results suggest that the selection of a sophisticated numerical solver for the differential equations positively affects simulation outcomes. However, already some simple and easy to implement explicit methods perform surprisingly well and need less computational efforts than more advanced but time consuming implicit techniques. There is general evidence that ambiguous and subjective user decisions form a major source of uncertainty and can greatly influence model development and application at all stages.
Semantic processes leading to true and false memory formation in schizophrenia.
Paz-Alonso, Pedro M; Ghetti, Simona; Ramsay, Ian; Solomon, Marjorie; Yoon, Jong; Carter, Cameron S; Ragland, J Daniel
2013-07-01
Encoding semantic relationships between items on word lists (semantic processing) enhances true memories, but also increases memory distortions. Episodic memory impairments in schizophrenia (SZ) are strongly driven by failures to process semantic relations, but the exact nature of these relational semantic processing deficits is not well understood. Here, we used a false memory paradigm to investigate the impact of implicit and explicit semantic processing manipulations on episodic memory in SZ. Thirty SZ and 30 demographically matched healthy controls (HC) studied Deese/Roediger-McDermott (DRM) lists of semantically associated words. Half of the lists had strong implicit semantic associations and the remainder had low strength associations. Similarly, half of the lists were presented under "standard" instructions and the other half under explicit "relational processing" instructions. After study, participants performed recall and old/new recognition tests composed of targets, critical lures, and unrelated lures. HC exhibited higher true memories and better discriminability between true and false memory compared to SZ. High, versus low, associative strength increased false memory rates in both groups. However, explicit "relational processing" instructions positively improved true memory rates only in HC. Finally, true and false memory rates were associated with severity of disorganized and negative symptoms in SZ. These results suggest that reduced processing of semantic relationships during encoding in SZ may stem from an inability to implement explicit relational processing strategies rather than a fundamental deficit in the implicit activation and retrieval of word meanings from patients' semantic lexicon. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Kardohely, Andrew
The American economy hinges on the health and production of science, technology engineering and mathematics workforce (STEM). Although this sector of the American workforce represents a substantially fewer jobs the STEM workforce fuels job growth and sustainability in the other sectors of the American workforce. Unfortunately, over the next decade the U.S. will face an additional deficit of over a million STEM professionals, thus the need is here now to fill this deficit. STEM education should, therefore, dedicated to producing graduates. One strategy to produce more STEM graduates is through retention of student in STEM majors. Retention or persistence is highly related to student sense of belonging in academic environments. This study investigates graduate teaching assistants (GTAs) perceptions of their classrooms and the implications of those perceptions on professional development. Furthermore, correlations between classroom community and student desire to persist, as measured by Rovai's Classroom Community Index (CCI) were established (P=0.0311). The interactions are described and results are discussed. Using a framework of teaching for community, and a qualitative analytic case study with memo writing about codes and themes methodology supported several themes including passion to teach and dedication to student learning, innovation in teaching practices based on evidence, an intrinsic desire to seek a diverse set of feedback, and instructors can foster community in the classroom. Using the same methodology one emergent theme, a tacit rather than explicit understanding of reading the classroom, was also present in the current study. Based on the results and using a lens for professional development, strategies and suggestions are made regarding strategies to enhance instructors' use of feedback and professional development.
NASA Astrophysics Data System (ADS)
Schoups, G.; Vrugt, J. A.; Fenicia, F.; van de Giesen, N. C.
2010-10-01
Conceptual rainfall-runoff models have traditionally been applied without paying much attention to numerical errors induced by temporal integration of water balance dynamics. Reliance on first-order, explicit, fixed-step integration methods leads to computationally cheap simulation models that are easy to implement. Computational speed is especially desirable for estimating parameter and predictive uncertainty using Markov chain Monte Carlo (MCMC) methods. Confirming earlier work of Kavetski et al. (2003), we show here that the computational speed of first-order, explicit, fixed-step integration methods comes at a cost: for a case study with a spatially lumped conceptual rainfall-runoff model, it introduces artificial bimodality in the marginal posterior parameter distributions, which is not present in numerically accurate implementations of the same model. The resulting effects on MCMC simulation include (1) inconsistent estimates of posterior parameter and predictive distributions, (2) poor performance and slow convergence of the MCMC algorithm, and (3) unreliable convergence diagnosis using the Gelman-Rubin statistic. We studied several alternative numerical implementations to remedy these problems, including various adaptive-step finite difference schemes and an operator splitting method. Our results show that adaptive-step, second-order methods, based on either explicit finite differencing or operator splitting with analytical integration, provide the best alternative for accurate and efficient MCMC simulation. Fixed-step or adaptive-step implicit methods may also be used for increased accuracy, but they cannot match the efficiency of adaptive-step explicit finite differencing or operator splitting. Of the latter two, explicit finite differencing is more generally applicable and is preferred if the individual hydrologic flux laws cannot be integrated analytically, as the splitting method then loses its advantage.
NASA Astrophysics Data System (ADS)
Arnold, Jeffrey; Clark, Martyn; Gutmann, Ethan; Wood, Andy; Nijssen, Bart; Rasmussen, Roy
2016-04-01
The United States Army Corps of Engineers (USACE) has had primary responsibility for multi-purpose water resource operations on most of the major river systems in the U.S. for more than 200 years. In that time, the USACE projects and programs making up those operations have proved mostly robust against the range of natural climate variability encountered over their operating life spans. However, in some watersheds and for some variables, climate change now is known to be shifting the hydroclimatic baseline around which that natural variability occurs and changing the range of that variability as well. This makes historical stationarity an inappropriate basis for assessing continued project operations under climate-changed futures. That means new hydroclimatic projections are required at multiple scales to inform decisions about specific threats and impacts, and for possible adaptation responses to limit water-resource vulnerabilities and enhance operational resilience. However, projections of possible future hydroclimatologies have myriad complex uncertainties that require explicit guidance for interpreting and using them to inform those decisions about climate vulnerabilities and resilience. Moreover, many of these uncertainties overlap and interact. Recent work, for example, has shown the importance of assessing the uncertainties from multiple sources including: global model structure [Meehl et al., 2005; Knutti and Sedlacek, 2013]; internal climate variability [Deser et al., 2012; Kay et al., 2014]; climate downscaling methods [Gutmann et al., 2012; Mearns et al., 2013]; and hydrologic models [Addor et al., 2014; Vano et al., 2014; Mendoza et al., 2015]. Revealing, reducing, and representing these uncertainties is essential for defining the plausible quantitative climate change narratives required to inform water-resource decision-making. And to be useful, such quantitative narratives, or storylines, of climate change threats and hydrologic impacts must sample from the full range of uncertainties associated with all parts of the simulation chain, from global climate models with simulations of natural climate variability, through regional climate downscaling, and on to modeling of affected hydrologic processes and downstream water resources impacts. This talk will present part of the work underway now both to reveal and reduce some important uncertainties and to develop explicit guidance for future generation of quantitative hydroclimatic storylines. Topics will include: 1- model structural and parameter-set limitations of some methods widely used to quantify climate impacts to hydrologic processes [Gutmann et al., 2014; Newman et al., 2015]; 2- development and evaluation of new, spatially consistent, U.S. national-scale climate downscaling and hydrologic simulation capabilities directly relevant at the multiple scales of water-resource decision-making [Newman et al., 2015; Mizukami et al., 2015; Gutmann et al., 2016]; and 3- development and evaluation of advanced streamflow forecasting methods to reduce and represent integrated uncertainties in a tractable way [Wood et al., 2014; Wood et al., 2015]. A key focus will be areas where climatologic and hydrologic science is currently under-developed to inform decisions - or is perhaps wrongly scaled or misapplied in practice - indicating the need for additional fundamental science and interpretation.
Hoffman, F. Owen; Moroz, Brian; Drozdovitch, Vladimir; Bouville, André; Beck, Harold; Luckyanov, Nicholas; Weinstock, Robert M.; Simon, Steven L.
2015-01-01
Dosimetic uncertainties, particularly those that are shared among subgroups of a study population, can bias, distort or reduce the slope or significance of a dose response. Exposure estimates in studies of health risks from environmental radiation exposures are generally highly uncertain and thus, susceptible to these methodological limitations. An analysis was published in 2008 concerning radiation-related thyroid nodule prevalence in a study population of 2,994 villagers under the age of 21 years old between August 1949 and September 1962 and who lived downwind from the Semi-palatinsk Nuclear Test Site in Kazakhstan. This dose-response analysis identified a statistically significant association between thyroid nodule prevalence and reconstructed doses of fallout-related internal and external radiation to the thyroid gland; however, the effects of dosimetric uncertainty were not evaluated since the doses were simple point “best estimates”. In this work, we revised the 2008 study by a comprehensive treatment of dosimetric uncertainties. Our present analysis improves upon the previous study, specifically by accounting for shared and unshared uncertainties in dose estimation and risk analysis, and differs from the 2008 analysis in the following ways: 1. The study population size was reduced from 2,994 to 2,376 subjects, removing 618 persons with uncertain residence histories; 2. Simulation of multiple population dose sets (vectors) was performed using a two-dimensional Monte Carlo dose estimation method; and 3. A Bayesian model averaging approach was employed for evaluating the dose response, explicitly accounting for large and complex uncertainty in dose estimation. The results were compared against conventional regression techniques. The Bayesian approach utilizes 5,000 independent realizations of population dose vectors, each of which corresponds to a set of conditional individual median internal and external doses for the 2,376 subjects. These 5,000 population dose vectors reflect uncertainties in dosimetric parameters, partly shared and partly independent, among individual members of the study population. Risk estimates for thyroid nodules from internal irradiation were higher than those published in 2008, which results, to the best of our knowledge, from explicitly accounting for dose uncertainty. In contrast to earlier findings, the use of Bayesian methods led to the conclusion that the biological effectiveness for internal and external dose was similar. Estimates of excess relative risk per unit dose (ERR/Gy) for males (177 thyroid nodule cases) were almost 30 times those for females (571 cases) and were similar to those reported for thyroid cancers related to childhood exposures to external and internal sources in other studies. For confirmed cases of papillary thyroid cancers (3 in males, 18 in females), the ERR/Gy was also comparable to risk estimates from other studies, but not significantly different from zero. These findings represent the first reported dose response for a radiation epidemiologic study considering all known sources of shared and unshared errors in dose estimation and using a Bayesian model averaging (BMA) method for analysis of the dose response. PMID:25574587
NASA Astrophysics Data System (ADS)
Le Coz, Jérôme; Renard, Benjamin; Bonnifait, Laurent; Branger, Flora; Le Boursicaud, Raphaël; Horner, Ivan; Mansanarez, Valentin; Lang, Michel; Vigneau, Sylvain
2015-04-01
River discharge is a crucial variable for Hydrology: as the output variable of most hydrologic models, it is used for sensitivity analyses, model structure identification, parameter estimation, data assimilation, prediction, etc. A major difficulty stems from the fact that river discharge is not measured continuously. Instead, discharge time series used by hydrologists are usually based on simple stage-discharge relations (rating curves) calibrated using a set of direct stage-discharge measurements (gaugings). In this presentation, we present a Bayesian approach (cf. Le Coz et al., 2014) to build such hydrometric rating curves, to estimate the associated uncertainty and to propagate this uncertainty to discharge time series. The three main steps of this approach are described: (1) Hydraulic analysis: identification of the hydraulic controls that govern the stage-discharge relation, identification of the rating curve equation and specification of prior distributions for the rating curve parameters; (2) Rating curve estimation: Bayesian inference of the rating curve parameters, accounting for the individual uncertainties of available gaugings, which often differ according to the discharge measurement procedure and the flow conditions; (3) Uncertainty propagation: quantification of the uncertainty in discharge time series, accounting for both the rating curve uncertainties and the uncertainty of recorded stage values. The rating curve uncertainties combine the parametric uncertainties and the remnant uncertainties that reflect the limited accuracy of the mathematical model used to simulate the physical stage-discharge relation. In addition, we also discuss current research activities, including the treatment of non-univocal stage-discharge relationships (e.g. due to hydraulic hysteresis, vegetation growth, sudden change of the geometry of the section, etc.). An operational version of the BaRatin software and its graphical interface are made available free of charge on request to the authors. J. Le Coz, B. Renard, L. Bonnifait, F. Branger, R. Le Boursicaud (2014). Combining hydraulic knowledge and uncertain gaugings in the estimation of hydrometric rating curves: a Bayesian approach, Journal of Hydrology, 509, 573-587.
Zach, Alexandra; Horna, Viviana; Leuschner, Christoph
2008-01-01
Much uncertainty exists about the magnitude of woody tissue respiration and its environmental control in highly diverse tropical moist forests. In a tropical mountain rain forest in southern Ecuador, we measured the apparent diurnal gas exchange of stems and coarse roots (diameter 1-4 cm) of trees from representative families along an elevational transect with plots at 1050, 1890 and 3050 m a.s.l. Mean air temperatures were 20.8, 17.2 and 10.6 degrees C, respectively. Stem and root CO(2) efflux of 13 to 21 trees per stand from dominant families were investigated with an open gas exchange system while stand microclimate was continuously monitored. Substantial variation in respiratory activity among and within species was found at all sites. Mean daily CO(2) release rates from stems declined 6.6-fold from 1.38 micromol m(-2) s(-1) at 1050 m to 0.21 micromol m(-2) s(-1) at 3050 m. Mean daily CO(2) release from coarse roots decreased from 0.35 to 0.20 micromol m(-2) s(-1) with altitude, but the differences were not significant. There was, thus, a remarkable shift from a high ratio of stem to coarse root respiration rates at the lowest elevation to an apparent equivalence of stem and coarse root CO(2) efflux rates at the highest elevation. We conclude that stem respiration, but not root respiration, greatly decreases with elevation in this transect, coinciding with a substantial decrease in relative stem diameter increment and a large increase in fine and coarse root biomass production with elevation.
NASA Astrophysics Data System (ADS)
Perdigão, R. A. P.
2017-12-01
Predictability assessments are traditionally made on a case-by-case basis, often by running the particular model of interest with randomly perturbed initial/boundary conditions and parameters, producing computationally expensive ensembles. These approaches provide a lumped statistical view of uncertainty evolution, without eliciting the fundamental processes and interactions at play in the uncertainty dynamics. In order to address these limitations, we introduce a systematic dynamical framework for predictability assessment and forecast, by analytically deriving governing equations of predictability in terms of the fundamental architecture of dynamical systems, independent of any particular problem under consideration. The framework further relates multiple uncertainty sources along with their coevolutionary interplay, enabling a comprehensive and explicit treatment of uncertainty dynamics along time, without requiring the actual model to be run. In doing so, computational resources are freed and a quick and effective a-priori systematic dynamic evaluation is made of predictability evolution and its challenges, including aspects in the model architecture and intervening variables that may require optimization ahead of initiating any model runs. It further brings out universal dynamic features in the error dynamics elusive to any case specific treatment, ultimately shedding fundamental light on the challenging issue of predictability. The formulated approach, framed with broad mathematical physics generality in mind, is then implemented in dynamic models of nonlinear geophysical systems with various degrees of complexity, in order to evaluate their limitations and provide informed assistance on how to optimize their design and improve their predictability in fundamental dynamical terms.
The potential for meta-analysis to support decision analysis in ecology.
Mengersen, Kerrie; MacNeil, M Aaron; Caley, M Julian
2015-06-01
Meta-analysis and decision analysis are underpinned by well-developed methods that are commonly applied to a variety of problems and disciplines. While these two fields have been closely linked in some disciplines such as medicine, comparatively little attention has been paid to the potential benefits of linking them in ecology, despite reasonable expectations that benefits would be derived from doing so. Meta-analysis combines information from multiple studies to provide more accurate parameter estimates and to reduce the uncertainty surrounding them. Decision analysis involves selecting among alternative choices using statistical information that helps to shed light on the uncertainties involved. By linking meta-analysis to decision analysis, improved decisions can be made, with quantification of the costs and benefits of alternate decisions supported by a greater density of information. Here, we briefly review concepts of both meta-analysis and decision analysis, illustrating the natural linkage between them and the benefits from explicitly linking one to the other. We discuss some examples in which this linkage has been exploited in the medical arena and how improvements in precision and reduction of structural uncertainty inherent in a meta-analysis can provide substantive improvements to decision analysis outcomes by reducing uncertainty in expected loss and maximising information from across studies. We then argue that these significant benefits could be translated to ecology, in particular to the problem of making optimal ecological decisions in the face of uncertainty. Copyright © 2013 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Wang, S.; Huang, G. H.; Baetz, B. W.; Ancell, B. C.
2017-05-01
The particle filtering techniques have been receiving increasing attention from the hydrologic community due to its ability to properly estimate model parameters and states of nonlinear and non-Gaussian systems. To facilitate a robust quantification of uncertainty in hydrologic predictions, it is necessary to explicitly examine the forward propagation and evolution of parameter uncertainties and their interactions that affect the predictive performance. This paper presents a unified probabilistic framework that merges the strengths of particle Markov chain Monte Carlo (PMCMC) and factorial polynomial chaos expansion (FPCE) algorithms to robustly quantify and reduce uncertainties in hydrologic predictions. A Gaussian anamorphosis technique is used to establish a seamless bridge between the data assimilation using the PMCMC and the uncertainty propagation using the FPCE through a straightforward transformation of posterior distributions of model parameters. The unified probabilistic framework is applied to the Xiangxi River watershed of the Three Gorges Reservoir (TGR) region in China to demonstrate its validity and applicability. Results reveal that the degree of spatial variability of soil moisture capacity is the most identifiable model parameter with the fastest convergence through the streamflow assimilation process. The potential interaction between the spatial variability in soil moisture conditions and the maximum soil moisture capacity has the most significant effect on the performance of streamflow predictions. In addition, parameter sensitivities and interactions vary in magnitude and direction over time due to temporal and spatial dynamics of hydrologic processes.
Quantification of uncertainties in global grazing systems assessment
NASA Astrophysics Data System (ADS)
Fetzel, T.; Havlik, P.; Herrero, M.; Kaplan, J. O.; Kastner, T.; Kroisleitner, C.; Rolinski, S.; Searchinger, T.; Van Bodegom, P. M.; Wirsenius, S.; Erb, K.-H.
2017-07-01
Livestock systems play a key role in global sustainability challenges like food security and climate change, yet many unknowns and large uncertainties prevail. We present a systematic, spatially explicit assessment of uncertainties related to grazing intensity (GI), a key metric for assessing ecological impacts of grazing, by combining existing data sets on (a) grazing feed intake, (b) the spatial distribution of livestock, (c) the extent of grazing land, and (d) its net primary productivity (NPP). An analysis of the resulting 96 maps implies that on average 15% of the grazing land NPP is consumed by livestock. GI is low in most of the world's grazing lands, but hotspots of very high GI prevail in 1% of the total grazing area. The agreement between GI maps is good on one fifth of the world's grazing area, while on the remainder, it is low to very low. Largest uncertainties are found in global drylands and where grazing land bears trees (e.g., the Amazon basin or the Taiga belt). In some regions like India or Western Europe, massive uncertainties even result in GI > 100% estimates. Our sensitivity analysis indicates that the input data for NPP, animal distribution, and grazing area contribute about equally to the total variability in GI maps, while grazing feed intake is a less critical variable. We argue that a general improvement in quality of the available global level data sets is a precondition for improving the understanding of the role of livestock systems in the context of global environmental change or food security.
Exploiting risk-reward structures in decision making under uncertainty.
Leuker, Christina; Pachur, Thorsten; Hertwig, Ralph; Pleskac, Timothy J
2018-06-01
People often have to make decisions under uncertainty-that is, in situations where the probabilities of obtaining a payoff are unknown or at least difficult to ascertain. One solution to this problem is to infer the probability from the magnitude of the potential payoff and thus exploit the inverse relationship between payoffs and probabilities that occurs in many domains in the environment. Here, we investigated how the mind may implement such a solution: (1) Do people learn about risk-reward relationships from the environment-and if so, how? (2) How do learned risk-reward relationships impact preferences in decision-making under uncertainty? Across three experiments (N = 352), we found that participants can learn risk-reward relationships from being exposed to choice environments with a negative, positive, or uncorrelated risk-reward relationship. They were able to learn the associations both from gambles with explicitly stated payoffs and probabilities (Experiments 1 & 2) and from gambles about epistemic events (Experiment 3). In subsequent decisions under uncertainty, participants often exploited the learned association by inferring probabilities from the magnitudes of the payoffs. This inference systematically influenced their preferences under uncertainty: Participants who had been exposed to a negative risk-reward relationship tended to prefer the uncertain option over a smaller sure option for low payoffs, but not for high payoffs. This pattern reversed in the positive condition and disappeared in the uncorrelated condition. This adaptive change in preferences is consistent with the use of the risk-reward heuristic. Copyright © 2018 Elsevier B.V. All rights reserved.
Using cost-benefit concepts in design floods improves communication of uncertainty
NASA Astrophysics Data System (ADS)
Ganora, Daniele; Botto, Anna; Laio, Francesco; Claps, Pierluigi
2017-04-01
Flood frequency analysis, i.e. the study of the relationships between the magnitude and the rarity of high flows in a river, is the usual procedure adopted to assess flood hazard, preliminary to the plan/design of flood protection measures. It grounds on the fit of a probability distribution to the peak discharge values recorded in gauging stations and the final estimates over a region are thus affected by uncertainty, due to the limited sample availability and of the possible alternatives in terms of the probabilistic model and the parameter estimation methods used. In the last decade, the scientific community dealt with this issue by developing a number of methods to quantify such uncertainty components. Usually, uncertainty is visually represented through confidence bands, which are easy to understand, but are not yet demonstrated to be useful for design purposes: they usually disorient decision makers, as the design flood is no longer univocally defined, making the decision process undetermined. These considerations motivated the development of the uncertainty-compliant design flood estimator (UNCODE) procedure (Botto et al., 2014) that allows one to select meaningful flood design values accounting for the associated uncertainty by considering additional constraints based on cost-benefit criteria. This method suggests an explicit multiplication factor that corrects the traditional (without uncertainty) design flood estimates to incorporate the effects of uncertainty in the estimate at the same safety level. Even though the UNCODE method was developed for design purposes, it can represent a powerful and robust tool to help clarifying the effects of the uncertainty in statistical estimation. As the process produces increased design flood estimates, this outcome demonstrates how uncertainty leads to more expensive flood protection measures, or insufficiency of current defenses. Moreover, the UNCODE approach can be used to assess the "value" of data, as the costs of flood prevention can get down by reducing uncertainty with longer observed flood records. As the multiplication factor is dimensionless, some examples of application provided show how this approach allows simple comparisons of the effects of uncertainty in different catchments, helping to build ranking procedures for planning purposes. REFERENCES Botto, A., Ganora, D., Laio, F., and Claps, P.: Uncertainty compliant design flood estimation, Water Resources Research, 50, doi:10.1002/2013WR014981, 2014.
Generalized sediment budgets of the Lower Missouri River, 1968–2014
Heimann, David C.
2016-09-13
Sediment budgets of the Lower Missouri River were developed in a study led by the U.S. Geological Survey in cooperation with the U.S. Army Corps of Engineers. The scope of the study included the development of a long-term (post-impoundment, 1968–2014) average annual sediment budget and selected annual, monthly, and daily sediment budgets for a reach and period that adequate data were available. Included in the analyses were 31 main-stem and tributary stations of the Lower Missouri River and two Mississippi River stations—the Mississippi River below Grafton, Illinois, and the Mississippi River at St. Louis, Missouri.Long-term average annual suspended-sediment loads of Missouri River main-stem stations ranged from 0.33 million tons at the Missouri River at Yankton, South Dakota, station to 71.2 million tons at Missouri River at Hermann, Mo., station. Gaged tributary gains accounted for 9–36 percent of the local reach budgets and cumulative gaged tributary contributions accounted for 84 percent of the long-term average suspended-sediment load of the Missouri River at Hermann, Mo., station. Although the sediment budgets for seven defined main-stem reaches generally were incomplete—missing bedload, reach storage, and ungaged tributary contributions—the budget residuals (net result of sediment inputs and outputs) for six of the seven reaches ranged from -7.0 to 1.7 million tons, or from -9.2 to 4.0 percent of the reach output suspended-sediment load, and were within the 10 percent reported measurement error of annual suspended-sediment loads for large rivers. The remaining reach, downstream from Gavin’s Point Dam, extended from Yankton, S. Dak., to Sioux City, Iowa, and had a budget residual of -9.8 million tons, which was -88 percent of the suspended-sediment load at Sioux City.The Lower Missouri River reach from Omaha, Nebraska, to Nebraska City, Nebr., had periods of concurrent sediment data for each primary budget component with which to analyze and determine a suspended-sediment budget for selected annual, monthly, and daily time increments. The temporal changes in the cumulative annual budget residuals were poorly correlated with the comparatively steady 1968–2011 annual stage trends at the Missouri River at Nebraska City, Nebr., station. An accurate total sediment budget is developed by having concurrent data available for all primary suspended and bedload components for a reach of interest throughout a period. Such a complete budget, with concurrent record for suspended-sediment load and bedload components, is unavailable for any reach and period in the Lower Missouri River. The primary data gaps are in bedload data, and also in suspended-sediment gains and losses including ungaged tributary inputs and sediment storage. Bedload data gaps in the Missouri River Basin are much more prevalent than suspended-sediment data gaps, and the first step in the development of reach bedload budgets is the establishment of a standardized bedload monitoring program at main-stem stations.The temporal changes in flow-adjusted suspended-sediment concentrations analyzed at main-stem Missouri River stations indicated an overall downward change in concentrations between 1968 and 2014. Temporary declines in flow-adjusted suspended-sediment concentrations during and following large floods were evident but generally returned to near pre-flood values within about 6 months.Data uncertainties associated with the development of a sediment budget include uncertainties associated with the collection of suspended-sediment and bedload data and the computation of suspended-sediment loads. These uncertainties vary depending on the frequency of data collection, the variability of conditions being represented by the discrete samples, and the statistical approach to suspended-sediment load computations. The coefficients of variation of suspended-sediment loads of Missouri River tributary stations for 1968–2014 were greater, 75.0 percent, than the main-stem stations, 47.1 percent. The lower coefficient of variation at main-stem stations compared to tributaries, primarily is the result of the lower variability in streamflow and sediment discharge identified at main-stem stations. To obtain similar accuracy between suspended-sediment loads at main-stem and tributary stations, a longer period of record is required of the tributary stations. During 1968–2014, however, the Missouri River main-stem station record was much more complete (87 percent) than the tributary station record (28 percent).
NASA Astrophysics Data System (ADS)
Miller, Brant Gregory
Mainstream curricula have struggled to provide American Indian students with meaningful learning experiences. This research project studied a novel approach to engaging students with science, technology, engineering, and mathematics (STEM) content through a culturally-based context. The traditional American Indian game of Snow Snakes (shushumeg in Ojibwe) presented a highly engaging context for delivering STEM content. Through the engaging context of snow snakes, the designed STEM curriculum explicitly applied mathematics (scaling and data), and science (force and motion) to an engineering prototype iteration that used available materials and tools (technology) for success. It was hypothesized that by engaging students through the carefully integrated STEM curriculum, driven by the culturally based context of snow snakes, students would exhibit an increase in science agency and achievement. The overarching research question explored for this study was: How does a culturally-based and integrated STEM curriculum impact student's science agency? Associated sub-questions were: (1) What does science agency look like for 6th grade students? (2) What key experiences are involved in the development of science agency through a culturally-based STEM curriculum context? And (3) What are the impacts on the community associated with the implementation of a culturally-based STEM curriculum? A case study research design was implemented for this research. Yin (2003) defines a case study as investigating a phenomenon (e.g. science agency) which occurs within authentic contexts (e.g. snow snakes, Adventure Learning, and Eagle Soaring School) especially when the boundaries between phenomenon and context are unclear. For this case study Eagle Soaring School acted as the bounded case with students from the 6th grade class representing the embedded units. Science agency was the theoretical framework for data analysis. Major findings were categorized as science and STEM learning, agency, and community impact. Concerning agency, students displayed science agency through: connecting snow snake experiences to outside contexts; students emerging as leaders; and students commanding a facility with science. This research lays the foundation for future inquiry into the development of science agency in students using culturally-based contexts.
NASA Astrophysics Data System (ADS)
Heyman, Jeremy Benjamin
This project builds upon the author's multi-year critical ethnographic study of urban immigrant students and their trajectories into STEM (science, technology, engineering, or mathematics) from high school through their transition to college. At its core, this study investigates the paths of over three dozen newcomer immigrant English language learner students in high-poverty urban neighborhoods who are not generally considered "legitimate contenders" for Bachelor's degrees in STEM fields on the basis of such characteristics as test scores, high school and prior preparation, and age. The students are followed through their high school experiences, their transition to college, and through their current progress in college, with explicit attention paid to key mediating experiences and relationships in and especially outside of the classroom that were associated with their toward persistence and success. Thick description and analysis of the students and their experiences, among those who persisted as well as the minority who switched out of STEM majors, helps to demonstrate a proof-of-concept of these students' ability to succeed while painting a comprehensive picture of their march forward to degrees in STEM fields against a backdrop of economic, linguistic, and other barriers to entry and success. Using a framework of social and capital and resilience theories, this work has uncovered a number of themes and factors that will help educators to better understand the evolution of these traditionally marginalized students' STEM-related interests, skills, and career plans. The findings center around students' exposure to research internships and other STEM enrichment and outreach experiences, long-term mentoring and other key relationships, and integration of STEM and college access efforts in setting them up for a successful transition to college, as well as an emphasis on the importance of students' calling upon their own resilience and other strengths and prior experiences. The results provide novel insights and recommendations for improving access and persistence in STEM among students in areas of concentrated poverty who are also struggling with mastering a new language and a host of other challenges.
Linking river management to species conservation using dynamic landscape scale models
Freeman, Mary C.; Buell, Gary R.; Hay, Lauren E.; Hughes, W. Brian; Jacobson, Robert B.; Jones, John W.; Jones, S.A.; LaFontaine, Jacob H.; Odom, Kenneth R.; Peterson, James T.; Riley, Jeffrey W.; Schindler, J. Stephen; Shea, C.; Weaver, J.D.
2013-01-01
Efforts to conserve stream and river biota could benefit from tools that allow managers to evaluate landscape-scale changes in species distributions in response to water management decisions. We present a framework and methods for integrating hydrology, geographic context and metapopulation processes to simulate effects of changes in streamflow on fish occupancy dynamics across a landscape of interconnected stream segments. We illustrate this approach using a 482 km2 catchment in the southeastern US supporting 50 or more stream fish species. A spatially distributed, deterministic and physically based hydrologic model is used to simulate daily streamflow for sub-basins composing the catchment. We use geographic data to characterize stream segments with respect to channel size, confinement, position and connectedness within the stream network. Simulated streamflow dynamics are then applied to model fish metapopulation dynamics in stream segments, using hypothesized effects of streamflow magnitude and variability on population processes, conditioned by channel characteristics. The resulting time series simulate spatially explicit, annual changes in species occurrences or assemblage metrics (e.g. species richness) across the catchment as outcomes of management scenarios. Sensitivity analyses using alternative, plausible links between streamflow components and metapopulation processes, or allowing for alternative modes of fish dispersal, demonstrate large effects of ecological uncertainty on model outcomes and highlight needed research and monitoring. Nonetheless, with uncertainties explicitly acknowledged, dynamic, landscape-scale simulations may prove useful for quantitatively comparing river management alternatives with respect to species conservation.
Intermittent Stem Cell Cycling Balances Self-Renewal and Senescence of the C. elegans Germ Line.
Cinquin, Amanda; Chiang, Michael; Paz, Adrian; Hallman, Sam; Yuan, Oliver; Vysniauskaite, Indre; Fowlkes, Charless C; Cinquin, Olivier
2016-04-01
Self-renewing organs often experience a decline in function in the course of aging. It is unclear whether chronological age or external factors control this decline, or whether it is driven by stem cell self-renewal-for example, because cycling cells exhaust their replicative capacity and become senescent. Here we assay the relationship between stem cell cycling and senescence in the Caenorhabditis elegans reproductive system, defining this senescence as the progressive decline in "reproductive capacity," i.e. in the number of progeny that can be produced until cessation of reproduction. We show that stem cell cycling diminishes remaining reproductive capacity, at least in part through the DNA damage response. Paradoxically, gonads kept under conditions that preclude reproduction keep cycling and producing cells that undergo apoptosis or are laid as unfertilized gametes, thus squandering reproductive capacity. We show that continued activity is in fact beneficial inasmuch as gonads that are active when reproduction is initiated have more sustained early progeny production. Intriguingly, continued cycling is intermittent-gonads switch between active and dormant states-and in all likelihood stochastic. Other organs face tradeoffs whereby stem cell cycling has the beneficial effect of providing freshly-differentiated cells and the detrimental effect of increasing the likelihood of cancer or senescence; stochastic stem cell cycling may allow for a subset of cells to preserve proliferative potential in old age, which may implement a strategy to deal with uncertainty as to the total amount of proliferation to be undergone over an organism's lifespan.
The ends of uncertainty: Air quality science and planning in Central California
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fine, James
Air quality planning in Central California is complicated and controversial despite millions of dollars invested to improve scientific understanding. This research describes and critiques the use of photochemical air quality simulation modeling studies in planning to attain standards for ground-level ozone in the San Francisco Bay Area and the San Joaquin Valley during the 1990's. Data are gathered through documents and interviews with planners, modelers, and policy-makers at public agencies and with representatives from the regulated and environmental communities. Interactions amongst organizations are diagramed to identify significant nodes of interaction. Dominant policy coalitions are described through narratives distinguished by theirmore » uses of and responses to uncertainty, their exposures to risks, and their responses to the principles of conservatism, civil duty, and caution. Policy narratives are delineated using aggregated respondent statements to describe and understand advocacy coalitions. I found that models impacted the planning process significantly, but were used not purely for their scientific capabilities. Modeling results provided justification for decisions based on other constraints and political considerations. Uncertainties were utilized opportunistically by stakeholders instead of managed explicitly. Ultimately, the process supported the partisan views of those in control of the modeling. Based on these findings, as well as a review of model uncertainty analysis capabilities, I recommend modifying the planning process to allow for the development and incorporation of uncertainty information, while addressing the need for inclusive and meaningful public participation. By documenting an actual air quality planning process these findings provide insights about the potential for using new scientific information and understanding to achieve environmental goals, most notably the analysis of uncertainties in modeling applications. Concurrently, needed uncertainty information is identified and capabilities to produce it are assessed. Practices to facilitate incorporation of uncertainty information are suggested based on research findings, as well as theory from the literatures of the policy sciences, decision sciences, science and technology studies, consensus-based and communicative planning, and modeling.« less
Energy efficient quantum machines
NASA Astrophysics Data System (ADS)
Abah, Obinna; Lutz, Eric
2017-05-01
We investigate the performance of a quantum thermal machine operating in finite time based on shortcut-to-adiabaticity techniques. We compute efficiency and power for a paradigmatic harmonic quantum Otto engine by taking the energetic cost of the shortcut driving explicitly into account. We demonstrate that shortcut-to-adiabaticity machines outperform conventional ones for fast cycles. We further derive generic upper bounds on both quantities, valid for any heat engine cycle, using the notion of quantum speed limit for driven systems. We establish that these quantum bounds are tighter than those stemming from the second law of thermodynamics.
The effect of encoding manipulation on word-stem cued recall: an event-related potential study.
Fay, Séverine; Isingrini, Michel; Ragot, Richard; Pouthas, Viviane
2005-08-01
The purpose of the present study was to find out whether the neural correlates of explicit retrieval from episodic memory would vary according to conditions at encoding when the words were presented in separate study/test blocks. Event-related potentials (ERPs) were recorded while participants performed a word-stem cued-recall task. Deeply (semantically) studied words were associated with higher levels of recall and faster response times than shallowly (lexically) studied words. Robust ERP old/new effects were observed for each encoding condition. They varied in magnitude, being largest in the semantic condition. As expected, scalp distributions also differed: for deeply studied words, the old/new effect resembled that found in previous ERP studies of word-stem cued-recall tasks (parietal and right frontal effects, between 400-800 and 800-1100 ms post-stimulus), whereas for shallowly studied words, the parietal old/new effect was absent in the latter latency window. These results can be interpreted as reflecting access to different kinds of memory representation depending on the nature of the processing engaged during encoding. Furthermore, differences in the ERPs elicited by new items indicate that subjects adopted different processing strategies in the test blocks following each encoding condition.
Correlated evolution of stem and leaf hydraulic traits in Pereskia (Cactaceae).
Edwards, Erika J
2006-01-01
Recent studies have demonstrated significant correlations between stem and leaf hydraulic properties when comparing across species within ecological communities. This implies that these traits are co-evolving, but there have been few studies addressing plant water relations within an explicitly evolutionary framework. This study tests for correlated evolution among a suite of plant water-use traits and environmental parameters in seven species of Pereskia (Cactaceae), using phylogenetically independent contrasts. There were significant evolutionary correlations between leaf-specific xylem hydraulic conductivity, Huber Value, leaf stomatal pore index, leaf venation density and leaf size, but none of these traits appeared to be correlated with environmental water availability; only two water relations traits - mid-day leaf water potentials and photosynthetic water use efficiency - correlated with estimates of moisture regime. In Pereskia, it appears that many stem and leaf hydraulic properties thought to be critical to whole-plant water use have not evolved in response to habitat shifts in water availability. This may be because of the extremely conservative stomatal behavior and particular rooting strategy demonstrated by all Pereskia species investigated. These results highlight the need for a lineage-based approach to understand the relative roles of functional traits in ecological adaptation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, H.
The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. Themore » treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust-plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust-planning as a clinical alternative to using margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques To understand the benefits and limitations of robust planning techniques.« less
TU-AB-BRB-02: Stochastic Programming Methods for Handling Uncertainty and Motion in IMRT Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Unkelbach, J.
The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. Themore » treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust-plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust-planning as a clinical alternative to using margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques To understand the benefits and limitations of robust planning techniques.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Renaud, M; Seuntjens, J; Roberge, D
Purpose: Assessing the performance and uncertainty of a pre-calculated Monte Carlo (PMC) algorithm for proton and electron transport running on graphics processing units (GPU). While PMC methods have been described in the past, an explicit quantification of the latent uncertainty arising from recycling a limited number of tracks in the pre-generated track bank is missing from the literature. With a proper uncertainty analysis, an optimal pre-generated track bank size can be selected for a desired dose calculation uncertainty. Methods: Particle tracks were pre-generated for electrons and protons using EGSnrc and GEANT4, respectively. The PMC algorithm for track transport was implementedmore » on the CUDA programming framework. GPU-PMC dose distributions were compared to benchmark dose distributions simulated using general-purpose MC codes in the same conditions. A latent uncertainty analysis was performed by comparing GPUPMC dose values to a “ground truth” benchmark while varying the track bank size and primary particle histories. Results: GPU-PMC dose distributions and benchmark doses were within 1% of each other in voxels with dose greater than 50% of Dmax. In proton calculations, a submillimeter distance-to-agreement error was observed at the Bragg Peak. Latent uncertainty followed a Poisson distribution with the number of tracks per energy (TPE) and a track bank of 20,000 TPE produced a latent uncertainty of approximately 1%. Efficiency analysis showed a 937× and 508× gain over a single processor core running DOSXYZnrc for 16 MeV electrons in water and bone, respectively. Conclusion: The GPU-PMC method can calculate dose distributions for electrons and protons to a statistical uncertainty below 1%. The track bank size necessary to achieve an optimal efficiency can be tuned based on the desired uncertainty. Coupled with a model to calculate dose contributions from uncharged particles, GPU-PMC is a candidate for inverse planning of modulated electron radiotherapy and scanned proton beams. This work was supported in part by FRSQ-MSSS (Grant No. 22090), NSERC RG (Grant No. 432290) and CIHR MOP (Grant No. MOP-211360)« less
Marlicz, Wojciech; Yung, Diana E; Skonieczna-Żydecka, Karolina; Loniewski, Igor; van Hemert, Saskia; Loniewska, Beata; Koulaouzidis, Anastasios
2017-10-01
Over the last decade, remarkable progress has been made in the understanding of disease pathophysiology. Many new theories expound on the importance of emerging factors such as microbiome influences, genomics/omics, stem cells, innate intestinal immunity or mucosal barrier complexities. This has introduced a further dimension of uncertainty into clinical decision-making, but equally, may shed some light on less well-understood and difficult to manage conditions. Areas covered: Comprehensive review of the literature on gut barrier and microbiome relevant to small bowel pathology. A PubMed/Medline search from 1990 to April 2017 was undertaken and papers from this range were included. Expert commentary: The scenario of clinical uncertainty is well-illustrated by functional gastrointestinal disorders (FGIDs). The movement towards achieving a better understanding of FGIDs is expressed in the Rome IV guidelines. Novel diagnostic and therapeutic protocols focused on the GB and SB microbiome can facilitate diagnosis, management and improve our understanding of the underlying pathological mechanisms in FGIDs.
NASA Astrophysics Data System (ADS)
Korenaga, Jun
2011-05-01
The seismic structure of large igneous provinces provides unique constraints on the nature of their parental mantle, allowing us to investigate past mantle dynamics from present crustal structure. To exploit this crust-mantle connection, however, it is prerequisite to quantify the uncertainty of a crustal velocity model, as it could suffer from considerable velocity-depth ambiguity. In this contribution, a practical strategy is suggested to estimate the model uncertainty by explicitly exploring the degree of velocity-depth ambiguity in the model space. In addition, wide-angle seismic data collected over the Ontong Java Plateau are revisited to provide a worked example of the new approach. My analysis indicates that the crustal structure of this gigantic plateau is difficult to reconcile with the melting of a pyrolitic mantle, pointing to the possibility of large-scale compositional heterogeneity in the convecting mantle.
Selecting surrogate endpoints for estimating pesticide effects on avian reproductive success.
Bennett, Richard S; Etterson, Matthew A
2013-10-01
A Markov chain nest productivity model (MCnest) has been developed for projecting the effects of a specific pesticide-use scenario on the annual reproductive success of avian species of concern. A critical element in MCnest is the use of surrogate endpoints, defined as measured endpoints from avian toxicity tests that represent specific types of effects possible in field populations at specific phases of a nesting attempt. In this article, we discuss the attributes of surrogate endpoints and provide guidance for selecting surrogates from existing avian laboratory tests as well as other possible sources. We also discuss some of the assumptions and uncertainties related to using surrogate endpoints to represent field effects. The process of explicitly considering how toxicity test results can be used to assess effects in the field helps identify uncertainties and data gaps that could be targeted in higher-tier risk assessments. © 2013 SETAC.
Form of prior for constrained thermodynamic processes with uncertainty
NASA Astrophysics Data System (ADS)
Aneja, Preety; Johal, Ramandeep S.
2015-05-01
We consider the quasi-static thermodynamic processes with constraints, but with additional uncertainty about the control parameters. Motivated by inductive reasoning, we assign prior distribution that provides a rational guess about likely values of the uncertain parameters. The priors are derived explicitly for both the entropy-conserving and the energy-conserving processes. The proposed form is useful when the constraint equation cannot be treated analytically. The inference is performed using spin-1/2 systems as models for heat reservoirs. Analytical results are derived in the high-temperatures limit. An agreement beyond linear response is found between the estimates of thermal quantities and their optimal values obtained from extremum principles. We also seek an intuitive interpretation for the prior and the estimated value of temperature obtained therefrom. We find that the prior over temperature becomes uniform over the quantity kept conserved in the process.
Economics, ethics, and climate policy: framing the debate
NASA Astrophysics Data System (ADS)
Howarth, Richard B.; Monahan, Patricia A.
1996-04-01
This paper examines the economic and ethical dimensions of climate policy in light of existing knowledge of the impacts of global warming and the costs of greenhouse gas emissions abatement. We find that the criterion of economic efficiency, operationalized through cost-benefit analysis, is ill-equipped to cope with the pervasive uncertainties and issues of intergenerational fairness that characterize climate change. In contrast, the concept of sustainable development—that today's policies should ensure that future generations enjoy life opportunities undiminished relative to the present—is a normative criterion that explicitly addresses the uncertainties and distributional aspects of global environmental change. If one interprets the sustainability criterion to imply that it is morally wrong to impose catastrophic risks on unborn generations when reducing those risks would not noticeably diminish the quality of life of existing persons, a case can be made for significant steps to reduce greenhouse gas emissions.
Speeded induction under uncertainty: the influence of multiple categories and feature conjunctions.
Newell, Ben R; Paton, Helen; Hayes, Brett K; Griffiths, Oren
2010-12-01
When people are uncertain about the category membership of an item (e.g., Is it a dog or a dingo?), research shows that they tend to rely only on the dominant or most likely category when making inductions (e.g., How likely is it to befriend me?). An exception has been reported using speeded induction judgments where participants appeared to use information from multiple categories to make inductions (Verde, Murphy, & Ross, 2005). In two speeded induction studies, we found that participants tended to rely on the frequency with which features co-occurred when making feature predictions, independently of category membership. This pattern held whether categories were considered implicitly (Experiment 1) or explicitly (Experiment 2) prior to feature induction. The results converge with other recent work suggesting that people often rely on feature conjunction information, rather than category boundaries, when making inductions under uncertainty.
STARS: The Space Transportation Architecture Risk System
NASA Technical Reports Server (NTRS)
Greenberg, Joel S.
1997-01-01
Because of the need to perform comparisons between transportation systems that are likely to have significantly different levels of risk, both because of differing degrees of freedom in achieving desired performance levels and their different states of development and utilization, an approach has been developed for performing early comparisons of transportation architectures explicitly taking into account quantitative measures of uncertainty and resulting risk. The approach considers the uncertainty associated with the achievement of technology goals, the effect that the achieved level of technology will have on transportation system performance and the relationship between transportation system performance/capability and the ability to accommodate variations in payload mass. The consequences of system performance are developed in terms of expected values and associated standard deviations of nonrecurring, recurring and the present value of transportation system life cycle cost. Typical results are presented to illustrate the application of the methodology.
NASA Technical Reports Server (NTRS)
Otterman, J.; Brakke, T.
1986-01-01
The projections of leaf areas onto a horizontal plane and onto a vertical plane are examined for their utility in characterizing canopies for sunlight penetration (direct beam only) models. These projections exactly specify the penetration if the projections on the principal plane of the normals to the top surfaces of the leaves are in the same quadrant as the sun. Inferring the total leaf area from these projections (and therefore the penetration as a function of the total leaf area) is possible only with a large uncertainty (up to + or - 32 percent) because the projections are a specific measure of the total leaf area only if the leaf angle distribution is known. It is expected that this uncertainty could be reduced to more acceptable levels by making an approximate assessment of whether the zenith angle distribution is that of an extremophile canopy.
NASA Technical Reports Server (NTRS)
Kweon, In SO; Hebert, Martial; Kanade, Takeo
1989-01-01
A three-dimensional perception system for building a geometrical description of rugged terrain environments from range image data is presented with reference to the exploration of the rugged terrain of Mars. An intermediate representation consisting of an elevation map that includes an explicit representation of uncertainty and labeling of the occluded regions is proposed. The locus method used to convert range image to an elevation map is introduced, along with an uncertainty model based on this algorithm. Both the elevation map and the locus method are the basis of a terrain matching algorithm which does not assume any correspondences between range images. The two-stage algorithm consists of a feature-based matching algorithm to compute an initial transform and an iconic terrain matching algorithm to merge multiple range images into a uniform representation. Terrain modeling results on real range images of rugged terrain are presented. The algorithms considered are a fundamental part of the perception system for the Ambler, a legged locomotor.
Uncertainty and instream flow standards
Castleberry, D.; Cech, J.; Erman, D.; Hankin, D.; Healey, M.; Kondolf, M.; Mengel, M.; Mohr, M.; Moyle, P.; Nielsen, Jennifer L.; Speed, T.; Williams, J.
1996-01-01
Several years ago, Science published an important essay (Ludwig et al. 1993) on the need to confront the scientific uncertainty associated with managing natural resources. The essay did not discuss instream flow standards explicitly, but its arguments apply. At an April 1995 workshop in Davis, California, all 12 participants agreed that currently no scientifically defensible method exists for defining the instream flows needed to protect particular species of fish or aquatic ecosystems (Williams, in press). We also agreed that acknowledging this fact is an essential step in dealing rationally and effectively with the problem.Practical necessity and the protection of fishery resources require that new instream flow standards be established and that existing standards be revised. However, if standards cannot be defined scientifically, how can this be done? We join others in recommending the approach of adaptive management. Applied to instream flow standards, this approach involves at least three elements.
Rosenzweig, Cynthia; Elliott, Joshua; Deryng, Delphine; Ruane, Alex C.; Müller, Christoph; Arneth, Almut; Boote, Kenneth J.; Folberth, Christian; Glotter, Michael; Khabarov, Nikolay; Neumann, Kathleen; Piontek, Franziska; Pugh, Thomas A. M.; Schmid, Erwin; Stehfest, Elke; Yang, Hong; Jones, James W.
2014-01-01
Here we present the results from an intercomparison of multiple global gridded crop models (GGCMs) within the framework of the Agricultural Model Intercomparison and Improvement Project and the Inter-Sectoral Impacts Model Intercomparison Project. Results indicate strong negative effects of climate change, especially at higher levels of warming and at low latitudes; models that include explicit nitrogen stress project more severe impacts. Across seven GGCMs, five global climate models, and four representative concentration pathways, model agreement on direction of yield changes is found in many major agricultural regions at both low and high latitudes; however, reducing uncertainty in sign of response in mid-latitude regions remains a challenge. Uncertainties related to the representation of carbon dioxide, nitrogen, and high temperature effects demonstrated here show that further research is urgently needed to better understand effects of climate change on agricultural production and to devise targeted adaptation strategies. PMID:24344314
NASA Technical Reports Server (NTRS)
Rosenzweig, Cynthia E.; Elliott, Joshua; Deryng, Delphine; Ruane, Alex C.; Mueller, Christoph; Arneth, Almut; Boote, Kenneth J.; Folberth, Christian; Glotter, Michael; Khabarov, Nikolay
2014-01-01
Here we present the results from an intercomparison of multiple global gridded crop models (GGCMs) within the framework of the Agricultural Model Intercomparison and Improvement Project and the Inter-Sectoral Impacts Model Intercomparison Project. Results indicate strong negative effects of climate change, especially at higher levels of warming and at low latitudes; models that include explicit nitrogen stress project more severe impacts. Across seven GGCMs, five global climate models, and four representative concentration pathways, model agreement on direction of yield changes is found in many major agricultural regions at both low and high latitudes; however, reducing uncertainty in sign of response in mid-latitude regions remains a challenge. Uncertainties related to the representation of carbon dioxide, nitrogen, and high temperature effects demonstrated here show that further research is urgently needed to better understand effects of climate change on agricultural production and to devise targeted adaptation strategies.
IMPACT - Integrated Modeling of Perturbations in Atmospheres for Conjunction Tracking
2013-09-01
the primary source of drag acceleration uncertainty stem from inadequate knowledge of r and CD. Atmospheric mass densities are often inferred from...sophisticated GSI models are diffuse reflection with incomplete accommodation (DRIA) [18] and the Cercignani-Lampis-Lord ( CLL ) model [19]. The DRIA model has...been applied in satellite drag coefficient modeling for nearly 50 years; however, the CLL model was only recently applied to satellite drag
Carbon cycle confidence and uncertainty: Exploring variation among soil biogeochemical models
Wieder, William R.; Hartman, Melannie D.; Sulman, Benjamin N.; ...
2017-11-09
Emerging insights into factors responsible for soil organic matter stabilization and decomposition are being applied in a variety of contexts, but new tools are needed to facilitate the understanding, evaluation, and improvement of soil biogeochemical theory and models at regional to global scales. To isolate the effects of model structural uncertainty on the global distribution of soil carbon stocks and turnover times we developed a soil biogeochemical testbed that forces three different soil models with consistent climate and plant productivity inputs. The models tested here include a first-order, microbial implicit approach (CASA-CNP), and two recently developed microbially explicit models thatmore » can be run at global scales (MIMICS and CORPSE). When forced with common environmental drivers, the soil models generated similar estimates of initial soil carbon stocks (roughly 1,400 Pg C globally, 0–100 cm), but each model shows a different functional relationship between mean annual temperature and inferred turnover times. Subsequently, the models made divergent projections about the fate of these soil carbon stocks over the 20th century, with models either gaining or losing over 20 Pg C globally between 1901 and 2010. Single-forcing experiments with changed inputs, tem- perature, and moisture suggest that uncertainty associated with freeze-thaw processes as well as soil textural effects on soil carbon stabilization were larger than direct temper- ature uncertainties among models. Finally, the models generated distinct projections about the timing and magnitude of seasonal heterotrophic respiration rates, again reflecting structural uncertainties that were related to environmental sensitivities and assumptions about physicochemical stabilization of soil organic matter. Here, by providing a computationally tractable and numerically consistent framework to evaluate models we aim to better understand uncertainties among models and generate insights about fac- tors regulating the turnover of soil organic matter.« less
Carbon cycle confidence and uncertainty: Exploring variation among soil biogeochemical models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wieder, William R.; Hartman, Melannie D.; Sulman, Benjamin N.
Emerging insights into factors responsible for soil organic matter stabilization and decomposition are being applied in a variety of contexts, but new tools are needed to facilitate the understanding, evaluation, and improvement of soil biogeochemical theory and models at regional to global scales. To isolate the effects of model structural uncertainty on the global distribution of soil carbon stocks and turnover times we developed a soil biogeochemical testbed that forces three different soil models with consistent climate and plant productivity inputs. The models tested here include a first-order, microbial implicit approach (CASA-CNP), and two recently developed microbially explicit models thatmore » can be run at global scales (MIMICS and CORPSE). When forced with common environmental drivers, the soil models generated similar estimates of initial soil carbon stocks (roughly 1,400 Pg C globally, 0–100 cm), but each model shows a different functional relationship between mean annual temperature and inferred turnover times. Subsequently, the models made divergent projections about the fate of these soil carbon stocks over the 20th century, with models either gaining or losing over 20 Pg C globally between 1901 and 2010. Single-forcing experiments with changed inputs, tem- perature, and moisture suggest that uncertainty associated with freeze-thaw processes as well as soil textural effects on soil carbon stabilization were larger than direct temper- ature uncertainties among models. Finally, the models generated distinct projections about the timing and magnitude of seasonal heterotrophic respiration rates, again reflecting structural uncertainties that were related to environmental sensitivities and assumptions about physicochemical stabilization of soil organic matter. Here, by providing a computationally tractable and numerically consistent framework to evaluate models we aim to better understand uncertainties among models and generate insights about fac- tors regulating the turnover of soil organic matter.« less
NASA Astrophysics Data System (ADS)
Malard, J. J.; Rojas, M.; Adamowski, J. F.; Gálvez, J.; Tuy, H. A.; Melgar-Quiñonez, H.
2015-12-01
While cropping models represent the biophysical aspects of agricultural systems, system dynamics modelling offers the possibility of representing the socioeconomic (including social and cultural) aspects of these systems. The two types of models can then be coupled in order to include the socioeconomic dimensions of climate change adaptation in the predictions of cropping models.We develop a dynamically coupled socioeconomic-biophysical model of agricultural production and its repercussions on food security in two case studies from Guatemala (a market-based, intensive agricultural system and a low-input, subsistence crop-based system). Through the specification of the climate inputs to the cropping model, the impacts of climate change on the entire system can be analysed, and the participatory nature of the system dynamics model-building process, in which stakeholders from NGOs to local governmental extension workers were included, helps ensure local trust in and use of the model.However, the analysis of climate variability's impacts on agroecosystems includes uncertainty, especially in the case of joint physical-socioeconomic modelling, and the explicit representation of this uncertainty in the participatory development of the models is important to ensure appropriate use of the models by the end users. In addition, standard model calibration, validation, and uncertainty interval estimation techniques used for physically-based models are impractical in the case of socioeconomic modelling. We present a methodology for the calibration and uncertainty analysis of coupled biophysical (cropping) and system dynamics (socioeconomic) agricultural models, using survey data and expert input to calibrate and evaluate the uncertainty of the system dynamics as well as of the overall coupled model. This approach offers an important tool for local decision makers to evaluate the potential impacts of climate change and their feedbacks through the associated socioeconomic system.
Meyers, S.R.; Siewert, S.E.; Singer, B.S.; Sageman, B.B.; Condon, D.J.; Obradovich, J.D.; Jicha, B.R.; Sawyer, D.A.
2012-01-01
We develop an intercalibrated astrochronologic and radioisotopic time scale for the Cenomanian-Turonian boundary (CTB) interval near the Global Stratotype Section and Point in Colorado, USA, where orbitally influenced rhythmic strata host bentonites that contain sanidine and zircon suitable for 40Ar/ 39Ar and U-Pb dating. Paired 40Ar/ 39Ar and U-Pb ages are determined from four bentonites that span the Vascoceras diartianum to Pseudaspidoceras flexuosum ammonite biozones, utilizing both newly collected material and legacy sanidine samples of J. Obradovich. Comparison of the 40Ar/ 39Ar and U-Pb results underscores the strengths and limitations of each system, and supports an astronomically calibrated Fish Canyon sanidine standard age of 28.201 Ma. The radioisotopic data and published astrochronology are employed to develop a new CTB time scale, using two statistical approaches: (1) a simple integration that yields a CTB age of 93.89 ?? 0.14 Ma (2??; total radioisotopic uncertainty), and (2) a Bayesian intercalibration that explicitly accounts for orbital time scale uncertainty, and yields a CTB age of 93.90 ?? 0.15 Ma (95% credible interval; total radioisotopic and orbital time scale uncertainty). Both approaches firmly anchor the floating orbital time scale, and the Bayesian technique yields astronomically recalibrated radioisotopic ages for individual bentonites, with analytical uncertainties at the permil level of resolution, and total uncertainties below 2???. Using our new results, the duration between the Cenomanian-Turonian and the Cretaceous-Paleogene boundaries is 27.94 ?? 0.16 Ma, with an uncertainty of less than one-half of a long eccentricity cycle. ?? 2012 Geological Society of America.
Niches, models, and climate change: Assessing the assumptions and uncertainties
Wiens, John A.; Stralberg, Diana; Jongsomjit, Dennis; Howell, Christine A.; Snyder, Mark A.
2009-01-01
As the rate and magnitude of climate change accelerate, understanding the consequences becomes increasingly important. Species distribution models (SDMs) based on current ecological niche constraints are used to project future species distributions. These models contain assumptions that add to the uncertainty in model projections stemming from the structure of the models, the algorithms used to translate niche associations into distributional probabilities, the quality and quantity of data, and mismatches between the scales of modeling and data. We illustrate the application of SDMs using two climate models and two distributional algorithms, together with information on distributional shifts in vegetation types, to project fine-scale future distributions of 60 California landbird species. Most species are projected to decrease in distribution by 2070. Changes in total species richness vary over the state, with large losses of species in some “hotspots” of vulnerability. Differences in distributional shifts among species will change species co-occurrences, creating spatial variation in similarities between current and future assemblages. We use these analyses to consider how assumptions can be addressed and uncertainties reduced. SDMs can provide a useful way to incorporate future conditions into conservation and management practices and decisions, but the uncertainties of model projections must be balanced with the risks of taking the wrong actions or the costs of inaction. Doing this will require that the sources and magnitudes of uncertainty are documented, and that conservationists and resource managers be willing to act despite the uncertainties. The alternative, of ignoring the future, is not an option. PMID:19822750
Brooks, Patricia J; Kempe, Vera
2013-02-01
In this study, we sought to identify cognitive predictors of individual differences in adult foreign-language learning and to test whether metalinguistic awareness mediated the observed relationships. Using a miniature language-learning paradigm, adults (N = 77) learned Russian vocabulary and grammar (gender agreement and case marking) over six 1-h sessions, completing tasks that encouraged attention to phrases without explicitly teaching grammatical rules. The participants' ability to describe the Russian gender and case-marking patterns mediated the effects of nonverbal intelligence and auditory sequence learning on grammar learning and generalization. Hence, even under implicit-learning conditions, individual differences stemmed from explicit metalinguistic awareness of the underlying grammar, which, in turn, was linked to nonverbal intelligence and auditory sequence learning. Prior knowledge of languages with grammatical gender (predominantly Spanish) predicted learning of gender agreement. Transfer of knowledge of gender from other languages to Russian was not mediated by awareness, which suggests that transfer operates through an implicit process akin to structural priming.
Age differences in implicit memory: more apparent than real.
Russo, R; Parkin, A J
1993-01-01
Elderly subjects and a group of young subjects identified fragmented picture sequences under conditions of focused attention. Two other groups of young subjects carried out this task under divided-attention conditions. Implicit memory, as measured by item-specific savings, was found in all groups, but this effect was smaller in the elderly group. The young subjects, but not elderly subjects, performed better on new items. The divided-attention conditions equated recall and recognition by the young and the elderly, but only the young subjects showed greater savings for recalled items. The elderly subjects' reduced implicit memory therefore stemmed from their inability to facilitate implicit memory with explicit memory. A second experiment, involving only young subjects tested after delay, produced findings similar to those for the young divided-attention subjects. Implicit memory, as measured by savings in picture completion, does not show an age-related change when the role of explicit memory is considered. Age does, however, reduce skill learning.
Density estimates of monarch butterflies overwintering in central Mexico
Diffendorfer, Jay E.; López-Hoffman, Laura; Oberhauser, Karen; Pleasants, John; Semmens, Brice X.; Semmens, Darius; Taylor, Orley R.; Wiederholt, Ruscena
2017-01-01
Given the rapid population decline and recent petition for listing of the monarch butterfly (Danaus plexippus L.) under the Endangered Species Act, an accurate estimate of the Eastern, migratory population size is needed. Because of difficulty in counting individual monarchs, the number of hectares occupied by monarchs in the overwintering area is commonly used as a proxy for population size, which is then multiplied by the density of individuals per hectare to estimate population size. There is, however, considerable variation in published estimates of overwintering density, ranging from 6.9–60.9 million ha−1. We develop a probability distribution for overwinter density of monarch butterflies from six published density estimates. The mean density among the mixture of the six published estimates was ∼27.9 million butterflies ha−1 (95% CI [2.4–80.7] million ha−1); the mixture distribution is approximately log-normal, and as such is better represented by the median (21.1 million butterflies ha−1). Based upon assumptions regarding the number of milkweed needed to support monarchs, the amount of milkweed (Asclepias spp.) lost (0.86 billion stems) in the northern US plus the amount of milkweed remaining (1.34 billion stems), we estimate >1.8 billion stems is needed to return monarchs to an average population size of 6 ha. Considerable uncertainty exists in this required amount of milkweed because of the considerable uncertainty occurring in overwinter density estimates. Nevertheless, the estimate is on the same order as other published estimates. The studies included in our synthesis differ substantially by year, location, method, and measures of precision. A better understanding of the factors influencing overwintering density across space and time would be valuable for increasing the precision of conservation recommendations. PMID:28462031
Density estimates of monarch butterflies overwintering in central Mexico
Thogmartin, Wayne E.; Diffendorfer, James E.; Lopez-Hoffman, Laura; Oberhauser, Karen; Pleasants, John M.; Semmens, Brice X.; Semmens, Darius J.; Taylor, Orley R.; Wiederholt, Ruscena
2017-01-01
Given the rapid population decline and recent petition for listing of the monarch butterfly (Danaus plexippus L.) under the Endangered Species Act, an accurate estimate of the Eastern, migratory population size is needed. Because of difficulty in counting individual monarchs, the number of hectares occupied by monarchs in the overwintering area is commonly used as a proxy for population size, which is then multiplied by the density of individuals per hectare to estimate population size. There is, however, considerable variation in published estimates of overwintering density, ranging from 6.9–60.9 million ha−1. We develop a probability distribution for overwinter density of monarch butterflies from six published density estimates. The mean density among the mixture of the six published estimates was ∼27.9 million butterflies ha−1 (95% CI [2.4–80.7] million ha−1); the mixture distribution is approximately log-normal, and as such is better represented by the median (21.1 million butterflies ha−1). Based upon assumptions regarding the number of milkweed needed to support monarchs, the amount of milkweed (Asclepias spp.) lost (0.86 billion stems) in the northern US plus the amount of milkweed remaining (1.34 billion stems), we estimate >1.8 billion stems is needed to return monarchs to an average population size of 6 ha. Considerable uncertainty exists in this required amount of milkweed because of the considerable uncertainty occurring in overwinter density estimates. Nevertheless, the estimate is on the same order as other published estimates. The studies included in our synthesis differ substantially by year, location, method, and measures of precision. A better understanding of the factors influencing overwintering density across space and time would be valuable for increasing the precision of conservation recommendations.
Error Analysis of non-TLD HDR Brachytherapy Dosimetric Techniques
NASA Astrophysics Data System (ADS)
Amoush, Ahmad
The American Association of Physicists in Medicine Task Group Report43 (AAPM-TG43) and its updated version TG-43U1 rely on the LiF TLD detector to determine the experimental absolute dose rate for brachytherapy. The recommended uncertainty estimates associated with TLD experimental dosimetry include 5% for statistical errors (Type A) and 7% for systematic errors (Type B). TG-43U1 protocol does not include recommendation for other experimental dosimetric techniques to calculate the absolute dose for brachytherapy. This research used two independent experimental methods and Monte Carlo simulations to investigate and analyze uncertainties and errors associated with absolute dosimetry of HDR brachytherapy for a Tandem applicator. An A16 MicroChamber* and one dose MOSFET detectors† were selected to meet the TG-43U1 recommendations for experimental dosimetry. Statistical and systematic uncertainty analyses associated with each experimental technique were analyzed quantitatively using MCNPX 2.6‡ to evaluate source positional error, Tandem positional error, the source spectrum, phantom size effect, reproducibility, temperature and pressure effects, volume averaging, stem and wall effects, and Tandem effect. Absolute dose calculations for clinical use are based on Treatment Planning System (TPS) with no corrections for the above uncertainties. Absolute dose and uncertainties along the transverse plane were predicted for the A16 microchamber. The generated overall uncertainties are 22%, 17%, 15%, 15%, 16%, 17%, and 19% at 1cm, 2cm, 3cm, 4cm, and 5cm, respectively. Predicting the dose beyond 5cm is complicated due to low signal-to-noise ratio, cable effect, and stem effect for the A16 microchamber. Since dose beyond 5cm adds no clinical information, it has been ignored in this study. The absolute dose was predicted for the MOSFET detector from 1cm to 7cm along the transverse plane. The generated overall uncertainties are 23%, 11%, 8%, 7%, 7%, 9%, and 8% at 1cm, 2cm, 3cm, and 4cm, 5cm, 6cm, and 7cm, respectively. The Nucletron Freiburg flap applicator is used with the Nucletron remote afterloader HDR machine to deliver dose to surface cancers. Dosimetric data for the Nucletron 192Ir source were generated using Monte Carlo simulation and compared with the published data. Two dimensional dosimetric data were calculated at two source positions; at the center of the sphere of the applicator and between two adjacent spheres. Unlike the TPS dose algorithm, The Monte Carlo code developed for this research accounts for the applicator material, secondary electrons and delta particles, and the air gap between the skin and the applicator. *Standard Imaging, Inc., Middleton, Wisconsin USA † OneDose MOSFET, Sicel Technologies, Morrisville NC ‡ Los Alamos National Laboratory, NM USA
Remote sensing of ecosystem health: opportunities, challenges, and future perspectives.
Li, Zhaoqin; Xu, Dandan; Guo, Xulin
2014-11-07
Maintaining a healthy ecosystem is essential for maximizing sustainable ecological services of the best quality to human beings. Ecological and conservation research has provided a strong scientific background on identifying ecological health indicators and correspondingly making effective conservation plans. At the same time, ecologists have asserted a strong need for spatially explicit and temporally effective ecosystem health assessments based on remote sensing data. Currently, remote sensing of ecosystem health is only based on one ecosystem attribute: vigor, organization, or resilience. However, an effective ecosystem health assessment should be a comprehensive and dynamic measurement of the three attributes. This paper reviews opportunities of remote sensing, including optical, radar, and LiDAR, for directly estimating indicators of the three ecosystem attributes, discusses the main challenges to develop a remote sensing-based spatially-explicit comprehensive ecosystem health system, and provides some future perspectives. The main challenges to develop a remote sensing-based spatially-explicit comprehensive ecosystem health system are: (1) scale issue; (2) transportability issue; (3) data availability; and (4) uncertainties in health indicators estimated from remote sensing data. However, the Radarsat-2 constellation, upcoming new optical sensors on Worldview-3 and Sentinel-2 satellites, and improved technologies for the acquisition and processing of hyperspectral, multi-angle optical, radar, and LiDAR data and multi-sensoral data fusion may partly address the current challenges.
NASA Astrophysics Data System (ADS)
Maity, Arnab; Padhi, Radhakant; Mallaram, Sanjeev; Mallikarjuna Rao, G.; Manickavasagam, M.
2016-10-01
A new nonlinear optimal and explicit guidance law is presented in this paper for launch vehicles propelled by solid motors. It can ensure very high terminal precision despite not having the exact knowledge of the thrust-time curve apriori. This was motivated from using it for a carrier launch vehicle in a hypersonic mission, which demands an extremely narrow terminal accuracy window for the launch vehicle for successful initiation of operation of the hypersonic vehicle. The proposed explicit guidance scheme, which computes the optimal guidance command online, ensures the required stringent final conditions with high precision at the injection point. A key feature of the proposed guidance law is an innovative extension of the recently developed model predictive static programming guidance with flexible final time. A penalty function approach is also followed to meet the input and output inequality constraints throughout the vehicle trajectory. In this paper, the guidance law has been successfully validated from nonlinear six degree-of-freedom simulation studies by designing an inner-loop autopilot as well, which enhances confidence of its usefulness significantly. In addition to excellent nominal results, the proposed guidance has been found to have good robustness for perturbed cases as well.
The equivalence principle in a quantum world
NASA Astrophysics Data System (ADS)
Bjerrum-Bohr, N. E. J.; Donoghue, John F.; El-Menoufi, Basem Kamal; Holstein, Barry R.; Planté, Ludovic; Vanhove, Pierre
2015-09-01
We show how modern methods can be applied to quantum gravity at low energy. We test how quantum corrections challenge the classical framework behind the equivalence principle (EP), for instance through introduction of nonlocality from quantum physics, embodied in the uncertainty principle. When the energy is small, we now have the tools to address this conflict explicitly. Despite the violation of some classical concepts, the EP continues to provide the core of the quantum gravity framework through the symmetry — general coordinate invariance — that is used to organize the effective field theory (EFT).
NASA Astrophysics Data System (ADS)
Arnst, M.; Abello Álvarez, B.; Ponthot, J.-P.; Boman, R.
2017-11-01
This paper is concerned with the characterization and the propagation of errors associated with data limitations in polynomial-chaos-based stochastic methods for uncertainty quantification. Such an issue can arise in uncertainty quantification when only a limited amount of data is available. When the available information does not suffice to accurately determine the probability distributions that must be assigned to the uncertain variables, the Bayesian method for assigning these probability distributions becomes attractive because it allows the stochastic model to account explicitly for insufficiency of the available information. In previous work, such applications of the Bayesian method had already been implemented by using the Metropolis-Hastings and Gibbs Markov Chain Monte Carlo (MCMC) methods. In this paper, we present an alternative implementation, which uses an alternative MCMC method built around an Itô stochastic differential equation (SDE) that is ergodic for the Bayesian posterior. We draw together from the mathematics literature a number of formal properties of this Itô SDE that lend support to its use in the implementation of the Bayesian method, and we describe its discretization, including the choice of the free parameters, by using the implicit Euler method. We demonstrate the proposed methodology on a problem of uncertainty quantification in a complex nonlinear engineering application relevant to metal forming.
Tan, Q; Huang, G H; Cai, Y P
2010-09-01
The existing inexact optimization methods based on interval-parameter linear programming can hardly address problems where coefficients in objective functions are subject to dual uncertainties. In this study, a superiority-inferiority-based inexact fuzzy two-stage mixed-integer linear programming (SI-IFTMILP) model was developed for supporting municipal solid waste management under uncertainty. The developed SI-IFTMILP approach is capable of tackling dual uncertainties presented as fuzzy boundary intervals (FuBIs) in not only constraints, but also objective functions. Uncertainties expressed as a combination of intervals and random variables could also be explicitly reflected. An algorithm with high computational efficiency was provided to solve SI-IFTMILP. SI-IFTMILP was then applied to a long-term waste management case to demonstrate its applicability. Useful interval solutions were obtained. SI-IFTMILP could help generate dynamic facility-expansion and waste-allocation plans, as well as provide corrective actions when anticipated waste management plans are violated. It could also greatly reduce system-violation risk and enhance system robustness through examining two sets of penalties resulting from variations in fuzziness and randomness. Moreover, four possible alternative models were formulated to solve the same problem; solutions from them were then compared with those from SI-IFTMILP. The results indicate that SI-IFTMILP could provide more reliable solutions than the alternatives. 2010 Elsevier Ltd. All rights reserved.
Leedale, Joseph; Tompkins, Adrian M; Caminade, Cyril; Jones, Anne E; Nikulin, Grigory; Morse, Andrew P
2016-03-31
The effect of climate change on the spatiotemporal dynamics of malaria transmission is studied using an unprecedented ensemble of climate projections, employing three diverse bias correction and downscaling techniques, in order to partially account for uncertainty in climate- driven malaria projections. These large climate ensembles drive two dynamical and spatially explicit epidemiological malaria models to provide future hazard projections for the focus region of eastern Africa. While the two malaria models produce very distinct transmission patterns for the recent climate, their response to future climate change is similar in terms of sign and spatial distribution, with malaria transmission moving to higher altitudes in the East African Community (EAC) region, while transmission reduces in lowland, marginal transmission zones such as South Sudan. The climate model ensemble generally projects warmer and wetter conditions over EAC. The simulated malaria response appears to be driven by temperature rather than precipitation effects. This reduces the uncertainty due to the climate models, as precipitation trends in tropical regions are very diverse, projecting both drier and wetter conditions with the current state-of-the-art climate model ensemble. The magnitude of the projected changes differed considerably between the two dynamical malaria models, with one much more sensitive to climate change, highlighting that uncertainty in the malaria projections is also associated with the disease modelling approach.
Steering vaccinomics innovations with anticipatory governance and participatory foresight.
Ozdemir, Vural; Faraj, Samer A; Knoppers, Bartha M
2011-09-01
Vaccinomics is the convergence of vaccinology and population-based omics sciences. The success of knowledge-based innovations such as vaccinomics is not only contingent on access to new biotechnologies. It also requires new ways of governance of science, knowledge production, and management. This article presents a conceptual analysis of the anticipatory and adaptive approaches that are crucial for the responsible design and sustainable transition of vaccinomics to public health practice. Anticipatory governance is a new approach to manage the uncertainties embedded on an innovation trajectory with participatory foresight, in order to devise governance instruments for collective "steering" of science and technology. As a contrast to hitherto narrowly framed "downstream impact assessments" for emerging technologies, anticipatory governance adopts a broader and interventionist approach that recognizes the social construction of technology design and innovation. It includes in its process explicit mechanisms to understand the factors upstream to the innovation trajectory such as deliberation and cocultivation of the aims, motives, funding, design, and direction of science and technology, both by experts and publics. This upstream shift from a consumer "product uptake" focus to "participatory technology design" on the innovation trajectory is an appropriately radical and necessary departure in the field of technology assessment, especially given that considerable public funds are dedicated to innovations. Recent examples of demands by research funding agencies to anticipate the broad impacts of proposed research--at a very upstream stage at the time of research funding application--suggest that anticipatory governance with foresight may be one way how postgenomics scientific practice might transform in the future toward responsible innovation. Moreover, the present context of knowledge production in vaccinomics is such that policy making for vaccines of the 21st century is occurring in the face of uncertainties where the "facts are uncertain, values in dispute, stakes high and decisions urgent and where no single one of these dimensions can be managed in isolation from the rest." This article concludes, however, that uncertainty is not an accident of the scientific method, but its very substance. Anticipatory governance with participatory foresight offers a mechanism to respond to such inherent sociotechnical uncertainties in the emerging field of vaccinomics by making the coproduction of scientific knowledge by technology and the social systems explicit. Ultimately, this serves to integrate scientific and social knowledge thereby steering innovations to coproduce results and outputs that are socially robust and context sensitive.
TU-AB-BRB-01: Coverage Evaluation and Probabilistic Treatment Planning as a Margin Alternative
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siebers, J.
The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. Themore » treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust-plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust-planning as a clinical alternative to using margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques To understand the benefits and limitations of robust planning techniques.« less
TU-AB-BRB-00: New Methods to Ensure Target Coverage
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2015-06-15
The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. Themore » treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust-plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust-planning as a clinical alternative to using margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques To understand the benefits and limitations of robust planning techniques.« less
Data fusion and classification using a hybrid intrinsic cellular inference network
NASA Astrophysics Data System (ADS)
Woodley, Robert; Walenz, Brett; Seiffertt, John; Robinette, Paul; Wunsch, Donald
2010-04-01
Hybrid Intrinsic Cellular Inference Network (HICIN) is designed for battlespace decision support applications. We developed an automatic method of generating hypotheses for an entity-attribute classifier. The capability and effectiveness of a domain specific ontology was used to generate automatic categories for data classification. Heterogeneous data is clustered using an Adaptive Resonance Theory (ART) inference engine on a sample (unclassified) data set. The data set is the Lahman baseball database. The actual data is immaterial to the architecture, however, parallels in the data can be easily drawn (i.e., "Team" maps to organization, "Runs scored/allowed" to Measure of organization performance (positive/negative), "Payroll" to organization resources, etc.). Results show that HICIN classifiers create known inferences from the heterogonous data. These inferences are not explicitly stated in the ontological description of the domain and are strictly data driven. HICIN uses data uncertainty handling to reduce errors in the classification. The uncertainty handling is based on subjective logic. The belief mass allows evidence from multiple sources to be mathematically combined to increase or discount an assertion. In military operations the ability to reduce uncertainty will be vital in the data fusion operation.
Stefansson, Gunnar; Rosenberg, Andrew A
2005-01-29
We consider combinations of three types of control measures for the management of fisheries when the input information for policy decisions is uncertain. The methods considered include effort controls, catch quotas and area closures. We simulated a hypothetical fishery loosely based on the Icelandic cod fishery, using a simple spatially explicit dynamic model. We compared the performance with respect to conserving the resource and economic return for each type of control measure alone and in combination. In general, combining more than one type of primary direct control on fishing provides a greater buffer to uncertainty than any single form of fishery control alone. Combining catch quota control with a large closed area is a most effective system for reducing the risk of stock collapse and maintaining both short and long-term economic performance. Effort controls can also be improved by adding closed areas to the management scheme. We recommend that multiple control methods be used wherever possible and that closed areas should be used to buffer uncertainty. To be effective, these closed areas must be large and exclude all principal gears to provide real protection from fishing mortality.
Managing Wind Power Uncertainty Through Strategic Reserve Purchasing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Du, Ershun; Zhang, Ning; Kang, Chongqing
With the rapidly increasing penetration of wind power, wind producers are becoming increasingly responsible for the deviation of the wind power output from the forecast. Such uncertainty results in revenue losses to the wind power producers (WPPs) due to penalties in ex-post imbalance settlements. This paper explores the opportunities available for WPPs if they can purchase or schedule some reserves to offset part of their deviation rather than being fully penalized in the real time market. The revenue for WPPs under such mechanism is modeled. The optimal strategy for managing the uncertainty of wind power by purchasing reserves to maximizemore » the WPP's revenue is analytically derived with rigorous optimality conditions. The amount of energy and reserves that should be bid in the market are explicitly quantified by the probabilistic forecast and the prices of the energy and reserves. A case study using the price data from ERCOT and wind power data from NREL is performed to verify the effectiveness of the derived optimal bidding strategy and the benefits of reserve purchasing. Additionally, the proposed bidding strategy can also reduce the risk of variations on WPP's revenue.« less
Two-stage fuzzy-stochastic robust programming: a hybrid model for regional air quality management.
Li, Yongping; Huang, Guo H; Veawab, Amornvadee; Nie, Xianghui; Liu, Lei
2006-08-01
In this study, a hybrid two-stage fuzzy-stochastic robust programming (TFSRP) model is developed and applied to the planning of an air-quality management system. As an extension of existing fuzzy-robust programming and two-stage stochastic programming methods, the TFSRP can explicitly address complexities and uncertainties of the study system without unrealistic simplifications. Uncertain parameters can be expressed as probability density and/or fuzzy membership functions, such that robustness of the optimization efforts can be enhanced. Moreover, economic penalties as corrective measures against any infeasibilities arising from the uncertainties are taken into account. This method can, thus, provide a linkage to predefined policies determined by authorities that have to be respected when a modeling effort is undertaken. In its solution algorithm, the fuzzy decision space can be delimited through specification of the uncertainties using dimensional enlargement of the original fuzzy constraints. The developed model is applied to a case study of regional air quality management. The results indicate that reasonable solutions have been obtained. The solutions can be used for further generating pollution-mitigation alternatives with minimized system costs and for providing a more solid support for sound environmental decisions.
Stefansson, Gunnar; Rosenberg, Andrew A.
2005-01-01
We consider combinations of three types of control measures for the management of fisheries when the input information for policy decisions is uncertain. The methods considered include effort controls, catch quotas and area closures. We simulated a hypothetical fishery loosely based on the Icelandic cod fishery, using a simple spatially explicit dynamic model. We compared the performance with respect to conserving the resource and economic return for each type of control measure alone and in combination. In general, combining more than one type of primary direct control on fishing provides a greater buffer to uncertainty than any single form of fishery control alone. Combining catch quota control with a large closed area is a most effective system for reducing the risk of stock collapse and maintaining both short and long-term economic performance. Effort controls can also be improved by adding closed areas to the management scheme. We recommend that multiple control methods be used wherever possible and that closed areas should be used to buffer uncertainty. To be effective, these closed areas must be large and exclude all principal gears to provide real protection from fishing mortality. PMID:15713593
NASA Astrophysics Data System (ADS)
Sun, Ruochen; Yuan, Huiling; Liu, Xiaoli
2017-11-01
The heteroscedasticity treatment in residual error models directly impacts the model calibration and prediction uncertainty estimation. This study compares three methods to deal with the heteroscedasticity, including the explicit linear modeling (LM) method and nonlinear modeling (NL) method using hyperbolic tangent function, as well as the implicit Box-Cox transformation (BC). Then a combined approach (CA) combining the advantages of both LM and BC methods has been proposed. In conjunction with the first order autoregressive model and the skew exponential power (SEP) distribution, four residual error models are generated, namely LM-SEP, NL-SEP, BC-SEP and CA-SEP, and their corresponding likelihood functions are applied to the Variable Infiltration Capacity (VIC) hydrologic model over the Huaihe River basin, China. Results show that the LM-SEP yields the poorest streamflow predictions with the widest uncertainty band and unrealistic negative flows. The NL and BC methods can better deal with the heteroscedasticity and hence their corresponding predictive performances are improved, yet the negative flows cannot be avoided. The CA-SEP produces the most accurate predictions with the highest reliability and effectively avoids the negative flows, because the CA approach is capable of addressing the complicated heteroscedasticity over the study basin.
Adaptive management for ecosystem services (j/a) | Science ...
Management of natural resources for the production of ecosystem services, which are vital for human well-being, is necessary even when there is uncertainty regarding system response to management action. This uncertainty is the result of incomplete controllability, complex internal feedbacks, and non-linearity that often interferes with desired management outcomes, and insufficient understanding of nature and people. Adaptive management was developed to reduce such uncertainty. We present a framework for the application of adaptive management for ecosystem services that explicitly accounts for cross-scale tradeoffs in the production of ecosystem services. Our framework focuses on identifying key spatiotemporal scales (plot, patch, ecosystem, landscape, and region) that encompass dominant structures and processes in the system, and includes within- and cross-scale dynamics, ecosystem service tradeoffs, and management controllability within and across scales. Resilience theory recognizes that a limited set of ecological processes in a given system regulate ecosystem services, yet our understanding of these processes is poorly understood. If management actions erode or remove these processes, the system may shift into an alternative state unlikely to support the production of desired services. Adaptive management provides a process to assess the underlying within and cross-scale tradeoffs associated with production of ecosystem services while proceeding with manage
Characterizing model uncertainties in the life cycle of lignocellulose-based ethanol fuels.
Spatari, Sabrina; MacLean, Heather L
2010-11-15
Renewable and low carbon fuel standards being developed at federal and state levels require an estimation of the life cycle carbon intensity (LCCI) of candidate fuels that can substitute for gasoline, such as second generation bioethanol. Estimating the LCCI of such fuels with a high degree of confidence requires the use of probabilistic methods to account for known sources of uncertainty. We construct life cycle models for the bioconversion of agricultural residue (corn stover) and energy crops (switchgrass) and explicitly examine uncertainty using Monte Carlo simulation. Using statistical methods to identify significant model variables from public data sets and Aspen Plus chemical process models,we estimate stochastic life cycle greenhouse gas (GHG) emissions for the two feedstocks combined with two promising fuel conversion technologies. The approach can be generalized to other biofuel systems. Our results show potentially high and uncertain GHG emissions for switchgrass-ethanol due to uncertain CO₂ flux from land use change and N₂O flux from N fertilizer. However, corn stover-ethanol,with its low-in-magnitude, tight-in-spread LCCI distribution, shows considerable promise for reducing life cycle GHG emissions relative to gasoline and corn-ethanol. Coproducts are important for reducing the LCCI of all ethanol fuels we examine.
NASA Astrophysics Data System (ADS)
Klaus, Julian; Zehe, Erwin
2010-05-01
Rapid water flow along spatially connected - often biologically mediated - flow paths of minimum flow resistance is widely acknowledged to play a key role in runoff generation at the hillslope and small catchment scales but also in the transport of solutes like agro chemicals and nutrients in cohesive soils. Especially at tile drained fields site connected vertical flow structures such as worm burrows, roots or shrinkage cracks act as short cuts allowing water flow to bypass the soil matrix. In the present study we propose a spatially explicit approach to represent worm burrows as connected structures of high conductivity and low retention capacity in a 2D physically model. With this approach tile drain discharge and preferential flow patterns in soil observed during the irrigation of a tile drained hillslope in the Weiherbach catchment were modelled. The model parameters derived from measurements and are considered to be uncertain. Given this uncertainty of key factors that organise flow and transport at tile drained sites the main objectives of the present studies are to shed light on the following three questions: 1. Does a simplified approach that explicitly represents worm burrows as continuous flow paths of small flow resistance and low retention properties in a 2D physically model allow successful reproduction of event flow response at a tile drained field site in the Weiherbach catchment? 2. Does the above described uncertainty in key factors cause equifinality i.e. are there several model structural setups that reproduce event flow response in an acceptable manner without compromising our physical understanding of the system? 3. If so, what are the key factors that have to be known at high accuracy to reduce the equifinality of model structures? The issue of equifinality is usually discussed in catchment modelling to indicate that often a large set of conceptual model parameter sets allows acceptable reproduction of the behaviour of the system of interest - in many cases catchment stream flow response. Beven and Binley (1992) suggest that these model structures should be considered to be equally likely to account for predictive uncertainty. In this study we show that the above outline approach allows successful prediction of the tile drain discharge and preferential flow patterns in soil observed during the irrigation of a tile drained hillslope in the Weiherbach catchment flow event. Strikingly we a found a considerable equifinality in the model structural setup, when key parameters such as the area density of worm burrows, their hydraulic conductivity and the conductivity of the tile drains were varied within the ranges of either our measurements or measurements reported in the literature. Thirteen different model setups yielded a normalised time-shifted Nash-Sutcliffe of more than 0.9, which means that more than 90% of the flow variability is explained by the model. Also the flow volumes were in good accordance and timing errors were less or equal than 20 min (which corresponds to two simulation output time steps). It is elaborated that this uncertainty/equifinality could be reduced when more precise data on initial states of the subsurface and on the drainage area of a single drainage tube could be made available. However, such data are currently most difficult to assess even at very well investigated site as the one that is dealt with here. We thus suggest non uniqueness of process based model structures seems thus to be an important factor causing predictive uncertainty at many sites where preferential flow dominates systems response. References Beven, K.J. and Binley, A.M., 1992. The future of distributed models: model calibration and uncertainty prediction, Hydrological Processes, 6, p.279-298.
Arnold, E N
1990-05-22
Phylogenies based on morphology vary considerably in their quality: some are robust and explicit with little conflict in the data set, whereas others are far more tenuous, with much conflict and many possible alternatives. The main primary reasons for untrue or inexplicit morphological phylogenies are: not enough characters developed between branching points, uncertain character polarity, poorly differentiated character states, homoplasy caused by parallelism or reversal, and extinction, which may remove species entirely from consideration and can make originally conflicting data sets misleadingly compatible, increasing congruence at the expense of truth. Extinction differs from other confounding factors in not being apparent either in the data set or in subsequent analysis. One possibility is that variation in the quality of morphological phylogenies has resulted from exposure to different ecological situations. To investigate this, it is necessary to compare the histories of the clades concerned. In the case of explicit morphological phylogenies, ecological and behavioural data can be integrated with them and it may then be possible to decide whether morphological characters are likely to have been elicited by the environments through which the clade has passed. The credibility of such results depends not only on the phylogeny being robust but also on its detailed topology: a pectinate phylogeny will often allow more certain and more explicit statements to be made about historical events. In the case of poor phylogenies, it is not possible to produce detailed histories, but they can be compared with robust phylogenies in the range of ecological situations occupied, and whether they occupy novel situations in comparison with their outgroups. LeQuesne testing can give information about niche homoplasy, and it may also be possible to see if morphological features are functionally associated with ecological parameters, even if the direction of change is unknown. Examination of the robust and explicit phylogeny of the semaphore geckoes (Pristurus) suggests that its quality does stem from a variety of environmental factors. The group has progressed along an ecological continuum, passing through a series of increasingly severe niches that appear to have elicited many morphological changes. The fact that niches are progressively filled reduces the likelihood of species reinvading a previous one with related character reversal. Because the niches of advanced Pristurus are virtually unique within the Gekkonidae the morphological changes produced are also very rare and therefore easy to polarize. Ecological changes on the main stem of the phylogeny are abrupt and associated character states consequently well differentiated.(ABSTRACT TRUNCATED AT 400 WORDS)
Predicate calculus, artificial intelligence, and workers' compensation.
Harber, P; McCoy, J M
1989-05-01
Application of principles of predicate calculus (PC) and artificial intelligence (AI) search methods to occupational medicine can meet several goals. First, they can improve understanding of the diagnostic process and recognition of the sources of uncertainty in knowledge and in case specific information. Second, PC provides a rational means of resolving differences in conclusion based upon the same premises. Third, understanding of these principles allows separation of knowledge (facts) from the process by which they are used and therefore facilitates development of AI-based expert systems. Application of PC to recognizing causation of pulmonary fibrosis is demonstrated in this paper, providing a method that can be generalized to other problems in occupational medicine. Application of PC and understanding of AI search routines may be particularly applicable to workers' compensation where explicit statement of rational and inferential process is necessary. This approach is useful in the diagnosis of occupational lung disease and may be particularly valuable in workers' compensation considerations, wherein explicit statement of rationale is needed.
Steen, Valerie; Sofaer, Helen R.; Skagen, Susan K.; Ray, Andrea J.; Noon, Barry R
2017-01-01
Species distribution models (SDMs) are commonly used to assess potential climate change impacts on biodiversity, but several critical methodological decisions are often made arbitrarily. We compare variability arising from these decisions to the uncertainty in future climate change itself. We also test whether certain choices offer improved skill for extrapolating to a changed climate and whether internal cross-validation skill indicates extrapolative skill. We compared projected vulnerability for 29 wetland-dependent bird species breeding in the climatically dynamic Prairie Pothole Region, USA. For each species we built 1,080 SDMs to represent a unique combination of: future climate, class of climate covariates, collinearity level, and thresholding procedure. We examined the variation in projected vulnerability attributed to each uncertainty source. To assess extrapolation skill under a changed climate, we compared model predictions with observations from historic drought years. Uncertainty in projected vulnerability was substantial, and the largest source was that of future climate change. Large uncertainty was also attributed to climate covariate class with hydrological covariates projecting half the range loss of bioclimatic covariates or other summaries of temperature and precipitation. We found that choices based on performance in cross-validation improved skill in extrapolation. Qualitative rankings were also highly uncertain. Given uncertainty in projected vulnerability and resulting uncertainty in rankings used for conservation prioritization, a number of considerations appear critical for using bioclimatic SDMs to inform climate change mitigation strategies. Our results emphasize explicitly selecting climate summaries that most closely represent processes likely to underlie ecological response to climate change. For example, hydrological covariates projected substantially reduced vulnerability, highlighting the importance of considering whether water availability may be a more proximal driver than precipitation. However, because cross-validation results were correlated with extrapolation results, the use of cross-validation performance metrics to guide modeling choices where knowledge is limited was supported.
Testing the robustness of management decisions to uncertainty: Everglades restoration scenarios.
Fuller, Michael M; Gross, Louis J; Duke-Sylvester, Scott M; Palmer, Mark
2008-04-01
To effectively manage large natural reserves, resource managers must prepare for future contingencies while balancing the often conflicting priorities of different stakeholders. To deal with these issues, managers routinely employ models to project the response of ecosystems to different scenarios that represent alternative management plans or environmental forecasts. Scenario analysis is often used to rank such alternatives to aid the decision making process. However, model projections are subject to uncertainty in assumptions about model structure, parameter values, environmental inputs, and subcomponent interactions. We introduce an approach for testing the robustness of model-based management decisions to the uncertainty inherent in complex ecological models and their inputs. We use relative assessment to quantify the relative impacts of uncertainty on scenario ranking. To illustrate our approach we consider uncertainty in parameter values and uncertainty in input data, with specific examples drawn from the Florida Everglades restoration project. Our examples focus on two alternative 30-year hydrologic management plans that were ranked according to their overall impacts on wildlife habitat potential. We tested the assumption that varying the parameter settings and inputs of habitat index models does not change the rank order of the hydrologic plans. We compared the average projected index of habitat potential for four endemic species and two wading-bird guilds to rank the plans, accounting for variations in parameter settings and water level inputs associated with hypothetical future climates. Indices of habitat potential were based on projections from spatially explicit models that are closely tied to hydrology. For the American alligator, the rank order of the hydrologic plans was unaffected by substantial variation in model parameters. By contrast, simulated major shifts in water levels led to reversals in the ranks of the hydrologic plans in 24.1-30.6% of the projections for the wading bird guilds and several individual species. By exposing the differential effects of uncertainty, relative assessment can help resource managers assess the robustness of scenario choice in model-based policy decisions.
Steen, Valerie; Sofaer, Helen R; Skagen, Susan K; Ray, Andrea J; Noon, Barry R
2017-11-01
Species distribution models (SDMs) are commonly used to assess potential climate change impacts on biodiversity, but several critical methodological decisions are often made arbitrarily. We compare variability arising from these decisions to the uncertainty in future climate change itself. We also test whether certain choices offer improved skill for extrapolating to a changed climate and whether internal cross-validation skill indicates extrapolative skill. We compared projected vulnerability for 29 wetland-dependent bird species breeding in the climatically dynamic Prairie Pothole Region, USA. For each species we built 1,080 SDMs to represent a unique combination of: future climate, class of climate covariates, collinearity level, and thresholding procedure. We examined the variation in projected vulnerability attributed to each uncertainty source. To assess extrapolation skill under a changed climate, we compared model predictions with observations from historic drought years. Uncertainty in projected vulnerability was substantial, and the largest source was that of future climate change. Large uncertainty was also attributed to climate covariate class with hydrological covariates projecting half the range loss of bioclimatic covariates or other summaries of temperature and precipitation. We found that choices based on performance in cross-validation improved skill in extrapolation. Qualitative rankings were also highly uncertain. Given uncertainty in projected vulnerability and resulting uncertainty in rankings used for conservation prioritization, a number of considerations appear critical for using bioclimatic SDMs to inform climate change mitigation strategies. Our results emphasize explicitly selecting climate summaries that most closely represent processes likely to underlie ecological response to climate change. For example, hydrological covariates projected substantially reduced vulnerability, highlighting the importance of considering whether water availability may be a more proximal driver than precipitation. However, because cross-validation results were correlated with extrapolation results, the use of cross-validation performance metrics to guide modeling choices where knowledge is limited was supported.
Incorporating uncertainty of management costs in sensitivity analyses of matrix population models.
Salomon, Yacov; McCarthy, Michael A; Taylor, Peter; Wintle, Brendan A
2013-02-01
The importance of accounting for economic costs when making environmental-management decisions subject to resource constraints has been increasingly recognized in recent years. In contrast, uncertainty associated with such costs has often been ignored. We developed a method, on the basis of economic theory, that accounts for the uncertainty in population-management decisions. We considered the case where, rather than taking fixed values, model parameters are random variables that represent the situation when parameters are not precisely known. Hence, the outcome is not precisely known either. Instead of maximizing the expected outcome, we maximized the probability of obtaining an outcome above a threshold of acceptability. We derived explicit analytical expressions for the optimal allocation and its associated probability, as a function of the threshold of acceptability, where the model parameters were distributed according to normal and uniform distributions. To illustrate our approach we revisited a previous study that incorporated cost-efficiency analyses in management decisions that were based on perturbation analyses of matrix population models. Incorporating derivations from this study into our framework, we extended the model to address potential uncertainties. We then applied these results to 2 case studies: management of a Koala (Phascolarctos cinereus) population and conservation of an olive ridley sea turtle (Lepidochelys olivacea) population. For low aspirations, that is, when the threshold of acceptability is relatively low, the optimal strategy was obtained by diversifying the allocation of funds. Conversely, for high aspirations, the budget was directed toward management actions with the highest potential effect on the population. The exact optimal allocation was sensitive to the choice of uncertainty model. Our results highlight the importance of accounting for uncertainty when making decisions and suggest that more effort should be placed on understanding the distributional characteristics of such uncertainty. Our approach provides a tool to improve decision making. © 2013 Society for Conservation Biology.
ICYESS 2013: Understanding and Interpreting Uncertainty
NASA Astrophysics Data System (ADS)
Rauser, F.; Niederdrenk, L.; Schemann, V.; Schmidt, A.; Suesser, D.; Sonntag, S.
2013-12-01
We will report the outcomes and highlights of the Interdisciplinary Conference of Young Earth System Scientists (ICYESS) on Understanding and Interpreting Uncertainty in September 2013, Hamburg, Germany. This conference is aimed at early career scientists (Masters to Postdocs) from a large variety of scientific disciplines and backgrounds (natural, social and political sciences) and will enable 3 days of discussions on a variety of uncertainty-related aspects: 1) How do we deal with implicit and explicit uncertainty in our daily scientific work? What is uncertain for us, and for which reasons? 2) How can we communicate these uncertainties to other disciplines? E.g., is uncertainty in cloud parameterization and respectively equilibrium climate sensitivity a concept that is understood equally well in natural and social sciences that deal with Earth System questions? Or vice versa, is, e.g., normative uncertainty as in choosing a discount rate relevant for natural scientists? How can those uncertainties be reconciled? 3) How can science communicate this uncertainty to the public? Is it useful at all? How are the different possible measures of uncertainty understood in different realms of public discourse? Basically, we want to learn from all disciplines that work together in the broad Earth System Science community how to understand and interpret uncertainty - and then transfer this understanding to the problem of how to communicate with the public, or its different layers / agents. ICYESS is structured in a way that participation is only possible via presentation, so every participant will give their own professional input into how the respective disciplines deal with uncertainty. Additionally, a large focus is put onto communication techniques; there are no 'standard presentations' in ICYESS. Keynote lectures by renowned scientists and discussions will lead to a deeper interdisciplinary understanding of what we do not really know, and how to deal with it. Many participants have a fresh view on the scientific questions because they have been scientifically raised in interdisciplinary graduate schools and institutions and bring a mix of professional expertise into the Earth System sciences. The extraordinary conference structure and the focus on young Earth System scientists lead to a unique perspective for ICYESS, and hopefully to insights that are relevant to the broader scientific community. At the AGU fall meeting we would like to present results and questions that will come out of ICYESS and put them into the ongoing broad discussion of communicating climate science uncertainties. More information on ICYESS can be found at icyess.eu
Chisholm, Jolene; von Tigerstrom, Barbara; Bedford, Patrick; Fradette, Julie; Viswanathan, Sowmya
2017-12-01
In Canada, minimally manipulated autologous cell therapies for homologous use (MMAC-H) are either regulated under the practice of medicine, or as drugs or devices under the Food and Drugs Act, Food and Drug Regulations (F&DR) or Medical Device Regulations (MDR). Cells, Tissues and Organs (CTO) Regulations in Canada are restricted to minimally manipulated allogeneic products for homologous use. This leaves an important gap in the interpretation of existing regulations. The purposes of this workshop co-organized by the Stem Cell Network and the Centre for Commercialization of Regenerative Medicine (CCRM) were to discuss the current state of regulation of MMAC-H therapies in Canada and compare it with other regulatory jurisdictions, with the intent of providing specific policy recommendations to Health Canada. Participants came to a consensus on the need for well-defined common terminology between regulators and stakeholders, a common source of confusion and misinformation. A need for a harmonized national approach to oversight of facilities providing MMAC-H therapies based on existing standards, such as Canadian Standards Association (CSA), was also voiced. Facilities providing MMAC-H therapies should also participate in collection of long-term data to ensure patient safety and efficacy of therapies. Harmonization across provinces of the procedures and practices involving administration of MMAC-H would be preferred. Participants felt that devices used to process MMAC-H are adequately regulated under existing MDR. Overly prescriptive regulation will stifle innovation, whereas insufficient regulation might allow unsafe or ineffective therapies to be offered. Until a clear, balanced and explicit approach is articulated, regulatory uncertainty remains a barrier. Copyright © 2017 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Wellen, Christopher; Arhonditsis, George B.; Long, Tanya; Boyd, Duncan
2014-11-01
Spatially distributed nonpoint source watershed models are essential tools to estimate the magnitude and sources of diffuse pollution. However, little work has been undertaken to understand the sources and ramifications of the uncertainty involved in their use. In this study we conduct the first Bayesian uncertainty analysis of the water quality components of the SWAT model, one of the most commonly used distributed nonpoint source models. Working in Southern Ontario, we apply three Bayesian configurations for calibrating SWAT to Redhill Creek, an urban catchment, and Grindstone Creek, an agricultural one. We answer four interrelated questions: can SWAT determine suspended sediment sources with confidence when end of basin data is used for calibration? How does uncertainty propagate from the discharge submodel to the suspended sediment submodels? Do the estimated sediment sources vary when different calibration approaches are used? Can we combine the knowledge gained from different calibration approaches? We show that: (i) despite reasonable fit at the basin outlet, the simulated sediment sources are subject to uncertainty sufficient to undermine the typical approach of reliance on a single, best fit simulation; (ii) more than a third of the uncertainty of sediment load predictions may stem from the discharge submodel; (iii) estimated sediment sources do vary significantly across the three statistical configurations of model calibration despite end-of-basin predictions being virtually identical; and (iv) Bayesian model averaging is an approach that can synthesize predictions when a number of adequate distributed models make divergent source apportionments. We conclude with recommendations for future research to reduce the uncertainty encountered when using distributed nonpoint source models for source apportionment.
Conceptualizing genetic counseling as psychotherapy in the era of genomic medicine.
Austin, Jehannine; Semaka, Alicia; Hadjipavlou, George
2014-12-01
Discussions about genetic contributions to medical illness have become increasingly commonplace. Physicians and other health-care providers in all quarters of medicine, from oncology to psychiatry, routinely field questions about the genetic basis of the medical conditions they treat. Communication about genetic testing and risk also enter into these conversations, as knowledge about genetics is increasingly expected of all medical specialists. Attendant to this evolving medical landscape is some uncertainty regarding the future of the genetic counseling profession, with the potential for both increases and decreases in demand for genetic counselors being possible outcomes. This emerging uncertainty provides the opportunity to explicitly conceptualize the potentially distinct value and contributions of the genetic counselor over and above education about genetics and risk that may be provided by other health professionals. In this paper we suggest conceptualizing genetic counseling as a highly circumscribed form of psychotherapy in which effective communication of genetic information is a central therapeutic goal. While such an approach is by no means new--in 1979 Seymour Kessler explicitly described genetic counseling as a "kind of psychotherapeutic encounter," an "interaction with a psychotherapeutic potential"--we expand on his view, and provide research evidence in support of our position. We review available evidence from process and outcome studies showing that genetic counseling is a therapeutic encounter that cannot be reduced to one where the counselor performs a simple "conduit for information" function, without losing effectiveness. We then discuss potential barriers that may have impeded greater uptake of a psychotherapeutic model of practice, and close by discussing implications for practice.
Conceptualizing genetic counseling as psychotherapy in the era of genomic medicine
Austin, Jehannine; Semaka, Alicia; Hadjipavlou, George
2014-01-01
Discussions about genetic contributions to medical illness have become increasingly commonplace. Physicians and other health-care providers in all quarters of medicine, from oncology to psychiatry, routinely field questions about the genetic basis of the medical conditions they treat. Communication about genetic testing and risk also enter into these conversations, as knowledge about genetics is increasingly expected of all medical specialists. Attendant to this evolving medical landscape is some uncertainty regarding the future of the genetic counseling profession, with the potential for both increases and decreases in demand for genetic counselors being possible outcomes. This emerging uncertainty provides the opportunity to explicitly conceptualize the potentially distinct value and contributions of the genetic counselor over and above education about genetics and risk that may be provided by other health professionals. In this paper we suggest conceptualizing genetic counseling as a highly circumscribed form of psychotherapy in which effective communication of genetic information is a central therapeutic goal. While such an approach is by no means new—in 1979 Seymour Kessler explicitly described genetic counseling as a “kind of psychotherapeutic encounter,” an “interaction with a psychotherapeutic potential”—we expand on his view, and provide research evidence in support of our position. We review available evidence from process and outcome studies showing that genetic counseling is a therapeutic encounter that cannot be reduced to one where the counselor performs a simple “conduit for information” function, without losing effectiveness. We then discuss potential barriers that may have impeded greater uptake of a psychotherapeutic model of practice, and close by discussing implications for practice. PMID:24841456
Saravana Kumar, Gurunathan; George, Subin Philip
2017-02-01
This work proposes a methodology involving stiffness optimization for subject-specific cementless hip implant design based on finite element analysis for reducing stress-shielding effect. To assess the change in the stress-strain state of the femur and the resulting stress-shielding effect due to insertion of the implant, a finite element analysis of the resected femur with implant assembly is carried out for a clinically relevant loading condition. Selecting the von Mises stress as the criterion for discriminating regions for elastic modulus difference, a stiffness minimization method was employed by varying the elastic modulus distribution in custom implant stem. The stiffness minimization problem is formulated as material distribution problem without explicitly penalizing partial volume elements. This formulation enables designs that could be fabricated using additive manufacturing to make porous implant with varying levels of porosity. Stress-shielding effect, measured as difference between the von Mises stress in the intact and implanted femur, decreased as the elastic modulus distribution is optimized.
Implicit and explicit forgetting: when is gist remembered?
Dorfman, J; Mandler, G
1994-08-01
Recognition (YES/NO) and stem completion (cued: complete with a word from the list; and uncued: complete with the first word that comes to mind) were tested following either semantic or non-semantic processing of a categorized input list. Item/instance information was tested by contrasting target items from the input list with new items that were categorically related to them; gist/categorical information was tested by comparing target items semantically related to the input items with unrelated new items. For both recognition and stem completion, regardless of initial processing condition, item information decayed rapidly over a period of one week. Gist information was maintained over the same period when initial processing was semantic but only in the cued condition for completion. These results are discussed in terms of dual process theory, which postulates activation/integration of a representation as primarily relevant to implicit item information and elaboration of a representation as mainly relevant to semantic (i.e. categorical) information.
Horton, Keith D; Wilson, Daryl E; Vonk, Jennifer; Kirby, Sarah L; Nielsen, Tina
2005-07-01
Using the stem completion task, we compared estimates of automatic retrieval from an implicit memory task, the process dissociation procedure, and the speeded response procedure. Two standard manipulations were employed. In Experiment 1, a depth of processing effect was found on automatic retrieval using the speeded response procedure although this effect was substantially reduced in Experiment 2 when lexical processing was required of all words. In Experiment 3, the speeded response procedure showed an advantage of full versus divided attention at study on automatic retrieval. An implicit condition showed parallel effects in each study, suggesting that implicit stem completion may normally provide a good estimate of automatic retrieval. Also, we replicated earlier findings from the process dissociation procedure, but estimates of automatic retrieval from this procedure were consistently lower than those from the speeded response procedure, except when conscious retrieval was relatively low. We discuss several factors that may contribute to the conflicting outcomes, including the evidence for theoretical assumptions and criterial task differences between implicit and explicit tests.
NASA Astrophysics Data System (ADS)
Bittner, S.; Priesack, E.
2012-04-01
We apply a functional-structural model of tree water flow to single old-growth trees in a temperate broad-leaved forest stand. Roots, stems and branches are represented by connected porous cylinder elements further divided into the inner heartwood cylinders surrounded by xylem and phloem. Xylem water flow is simulated by applying a non-linear Darcy flow in porous media driven by the water potential gradient according to the cohesion-tension theory. The flow model is based on physiological input parameters such as the hydraulic conductivity, stomatal response to leaf water potential and root water uptake capability and, thus, can reflect the different properties of tree species. The actual root water uptake is calculated using also a non-linear Darcy law based on the gradient between root xylem water potential and rhizosphere soil water potential and by the simulation of soil water flow applying Richards equation. A leaf stomatal conductance model is combined with the hydrological tree and soil water flow model and a spatially explicit three-dimensional canopy light model. The structure of the canopy and the tree architectures are derived by applying an automatic tree skeleton extraction algorithm from point clouds obtained by use of a terrestrial laser scanner allowing an explicit representation of the water flow path in the stem and branches. The high spatial resolution of the root and branch geometry and their connectivity makes the detailed modelling of the water use of single trees possible and allows for the analysis of the interaction between single trees and the influence of the canopy light regime (including different fractions of direct sunlight and diffuse skylight) on the simulated sap flow and transpiration. The model can be applied at various sites and to different tree species, enabling the up-scaling of the water usage of single trees to the total transpiration of mixed stands. Examples are given to reveal differences between diffuse- and ring-porous tree species and to simulate the diurnal dynamics of transpiration, stem sap flux, and root water uptake observed during the vegetation period in the year 2009.
[Decision process in a multidisciplinary cancer team with limited evidence].
Lassalle, R; Marold, J; Schöbel, M; Manzey, D; Bohn, S; Dietz, A; Boehm, A
2014-04-01
The Head and Neck Cancer Tumor Board is a multispeciality comprehensive conference that brings together experts with different backgrounds to make group decisions about the appropriate treatment. Due to the complexity of the patient cases and the collaboration of different medical disciplines most of these decisions have to be made under uncertainty, i. e., with-out knowing all relevant factors and without being quite sure about the outcome. To develop effective team decision making under uncertainty, it is necessary to understand how medical experts perceive and handle uncertainties. The aim of this field study was to develop a knowledge base by exploring additionally the factors that influence group decision making processes. A structured nonparticipant observational study was employed to address the research goal. Video data were analyzed by 2 independent observers using an observation checklist. A total of 20 videotaped case discussions were studied. Observations were complemented by a questionnaire gathering subjective evaluations of board members about the process and quality of their decisions (N=15). The results show that uncertainty is recognized by board members. Reasons for uncertainty may stem from the complexity of the cases (e. g. therapy options) or the assessment from different disciplines coming together at the board. With respect to handling uncertainty and guaranteeing an optimal decision making process potential for improvement could be defined. This pertains to the handling of different levels of competence, the promotion of a positive discussion culture as well as structuring of the decision making process. © Georg Thieme Verlag KG Stuttgart · New York.
An improved state-parameter analysis of ecosystem models using data assimilation
Chen, M.; Liu, S.; Tieszen, L.L.; Hollinger, D.Y.
2008-01-01
Much of the effort spent in developing data assimilation methods for carbon dynamics analysis has focused on estimating optimal values for either model parameters or state variables. The main weakness of estimating parameter values alone (i.e., without considering state variables) is that all errors from input, output, and model structure are attributed to model parameter uncertainties. On the other hand, the accuracy of estimating state variables may be lowered if the temporal evolution of parameter values is not incorporated. This research develops a smoothed ensemble Kalman filter (SEnKF) by combining ensemble Kalman filter with kernel smoothing technique. SEnKF has following characteristics: (1) to estimate simultaneously the model states and parameters through concatenating unknown parameters and state variables into a joint state vector; (2) to mitigate dramatic, sudden changes of parameter values in parameter sampling and parameter evolution process, and control narrowing of parameter variance which results in filter divergence through adjusting smoothing factor in kernel smoothing algorithm; (3) to assimilate recursively data into the model and thus detect possible time variation of parameters; and (4) to address properly various sources of uncertainties stemming from input, output and parameter uncertainties. The SEnKF is tested by assimilating observed fluxes of carbon dioxide and environmental driving factor data from an AmeriFlux forest station located near Howland, Maine, USA, into a partition eddy flux model. Our analysis demonstrates that model parameters, such as light use efficiency, respiration coefficients, minimum and optimum temperatures for photosynthetic activity, and others, are highly constrained by eddy flux data at daily-to-seasonal time scales. The SEnKF stabilizes parameter values quickly regardless of the initial values of the parameters. Potential ecosystem light use efficiency demonstrates a strong seasonality. Results show that the simultaneous parameter estimation procedure significantly improves model predictions. Results also show that the SEnKF can dramatically reduce the variance in state variables stemming from the uncertainty of parameters and driving variables. The SEnKF is a robust and effective algorithm in evaluating and developing ecosystem models and in improving the understanding and quantification of carbon cycle parameters and processes. ?? 2008 Elsevier B.V.
Human-Induced Vegetation Degradation in a Semi-Arid Rangeland
NASA Astrophysics Data System (ADS)
Jackson, Hasan
Current assessments of anthropogenic land degradation and its impact on vegetation at regional scales are prone to large uncertainties due to the lack of an objective, transferable, spatially and temporally explicit measure of land degradation. These uncertainties have resulted in contradictory estimates of degradation extent and severity and the role of human activities. The uncertainties limit the ability to assess the effects on the biophysical environment and effectiveness of past, current, and future policies of land use. The overall objective of the dissertation is to assess degradation in a semi-arid region at a regional scale where the process of anthropogenic land degradation is evident. Net primary productivity (NPP) is used as the primary indicator to measure degradation. It is hypothesized that land degradation resulting from human factors on the landscape irreversibly reduces NPP below the potential set by environmental conditions. It is also hypothesized that resulting reductions in NPP are distinguishable from natural, spatial and temporal, variability in NPP. The specific goals of the dissertation are to (1) identify the extent and severity of degradation using productivity as the primary surrogate, (2) compare the degradation of productivity to other known mechanisms of degradation, and (3) relate the expression of degradation to components of vegetation and varying environmental conditions. This dissertation employed the Local NPP Scaling (LNS) approach to identify patterns of anthropogenic degradation of NPP in the Burdekin Dry Tropics (BDT) region of Queensland (14 million hectares), Australia from 2000 to 2013. The method started with land classification based on the environmental factors presumed to control NPP to group pixels having similar potential NPP. Then, satellite remotely sensing data were used to compare actual NPP with its potential. The difference, in units of mass of carbon fixed in NPP per unit area per monitoring interval and per year, also its percentage of the potential, were the measures of degradation. Degradation was then compared to non-green components of vegetation (e.g. wood, stems, leaf litter, dead biomass) to determine their relationship in space and time. Finally, the symptoms of degradation were compared to land management patterns and the environmental variability (e.g. drought, non-drought conditions). Nearly 20% of the region was identified as degraded and another 7% had significant negative trends. The average annual reduction in NPP due to anthropogenic degradation was -17% of the non-degraded potential, although the severity of degradation varied substantially throughout the region. Non-green vegetation cover was strongly correlated with the inter-annual and intra-annual temporal trends of degradation. The dynamics of degradation in drought and non-drought years provided evidence of multiple stables states of degradation.
Gonzalez-Meler, Miquel A.; Lynch, Douglas J.; Baltzer, Jennifer L.
2016-01-01
Plants appear to produce an excess of leaves, stems and roots beyond what would provide the most efficient harvest of available resources. One way to understand this overproduction of tissues is that excess tissue production provides a competitive advantage. Game theoretic models predict overproduction of all tissues compared with non-game theoretic models because they explicitly account for this indirect competitive benefit. Here, we present a simple game theoretic model of plants simultaneously competing to harvest carbon and nitrogen. In the model, a plant's fitness is influenced by its own leaf, stem and root production, and the tissue production of others, which produces a triple tragedy of the commons. Our model predicts (i) absolute net primary production when compared with two independent global datasets; (ii) the allocation relationships to leaf, stem and root tissues in one dataset; (iii) the global distribution of biome types and the plant functional types found within each biome; and (iv) ecosystem responses to nitrogen or carbon fertilization. Our game theoretic approach removes the need to define allocation or vegetation type a priori but instead lets these emerge from the model as evolutionarily stable strategies. We believe this to be the simplest possible model that can describe plant production. PMID:28120794
McNickle, Gordon G; Gonzalez-Meler, Miquel A; Lynch, Douglas J; Baltzer, Jennifer L; Brown, Joel S
2016-11-16
Plants appear to produce an excess of leaves, stems and roots beyond what would provide the most efficient harvest of available resources. One way to understand this overproduction of tissues is that excess tissue production provides a competitive advantage. Game theoretic models predict overproduction of all tissues compared with non-game theoretic models because they explicitly account for this indirect competitive benefit. Here, we present a simple game theoretic model of plants simultaneously competing to harvest carbon and nitrogen. In the model, a plant's fitness is influenced by its own leaf, stem and root production, and the tissue production of others, which produces a triple tragedy of the commons. Our model predicts (i) absolute net primary production when compared with two independent global datasets; (ii) the allocation relationships to leaf, stem and root tissues in one dataset; (iii) the global distribution of biome types and the plant functional types found within each biome; and (iv) ecosystem responses to nitrogen or carbon fertilization. Our game theoretic approach removes the need to define allocation or vegetation type a priori but instead lets these emerge from the model as evolutionarily stable strategies. We believe this to be the simplest possible model that can describe plant production. © 2016 The Author(s).
NASA Astrophysics Data System (ADS)
He, Hongxing; Meyer, Astrid; Jansson, Per-Erik; Svensson, Magnus; Rütting, Tobias; Klemedtsson, Leif
2018-02-01
The symbiosis between plants and Ectomycorrhizal fungi (ECM) is shown to considerably influence the carbon (C) and nitrogen (N) fluxes between the soil, rhizosphere, and plants in boreal forest ecosystems. However, ECM are either neglected or presented as an implicit, undynamic term in most ecosystem models, which can potentially reduce the predictive power of models.
In order to investigate the necessity of an explicit consideration of ECM in ecosystem models, we implement the previously developed MYCOFON model into a detailed process-based, soil-plant-atmosphere model, Coup-MYCOFON, which explicitly describes the C and N fluxes between ECM and roots. This new Coup-MYCOFON model approach (ECM explicit) is compared with two simpler model approaches: one containing ECM implicitly as a dynamic uptake of organic N considering the plant roots to represent the ECM (ECM implicit), and the other a static N approach in which plant growth is limited to a fixed N level (nonlim). Parameter uncertainties are quantified using Bayesian calibration in which the model outputs are constrained to current forest growth and soil C / N ratio for four forest sites along a climate and N deposition gradient in Sweden and simulated over a 100-year period.
The nonlim
approach could not describe the soil C / N ratio due to large overestimation of soil N sequestration but simulate the forest growth reasonably well. The ECM implicit
and explicit
approaches both describe the soil C / N ratio well but slightly underestimate the forest growth. The implicit approach simulated lower litter production and soil respiration than the explicit approach. The ECM explicit Coup-MYCOFON model provides a more detailed description of internal ecosystem fluxes and feedbacks of C and N between plants, soil, and ECM. Our modeling highlights the need to incorporate ECM and organic N uptake into ecosystem models, and the nonlim approach is not recommended for future long-term soil C and N predictions. We also provide a key set of posterior fungal parameters that can be further investigated and evaluated in future ECM studies.
NASA Astrophysics Data System (ADS)
Lahiri, B. B.; Ranoo, Surojit; Philip, John
2017-11-01
Magnetic fluid hyperthermia (MFH) is becoming a viable cancer treatment methodology where the alternating magnetic field induced heating of magnetic fluid is utilized for ablating the cancerous cells or making them more susceptible to the conventional treatments. The heating efficiency in MFH is quantified in terms of specific absorption rate (SAR), which is defined as the heating power generated per unit mass. In majority of the experimental studies, SAR is evaluated from the temperature rise curves, obtained under non-adiabatic experimental conditions, which is prone to various thermodynamic uncertainties. A proper understanding of the experimental uncertainties and its remedies is a prerequisite for obtaining accurate and reproducible SAR. Here, we study the thermodynamic uncertainties associated with peripheral heating, delayed heating, heat loss from the sample and spatial variation in the temperature profile within the sample. Using first order approximations, an adiabatic reconstruction protocol for the measured temperature rise curves is developed for SAR estimation, which is found to be in good agreement with those obtained from the computationally intense slope corrected method. Our experimental findings clearly show that the peripheral and delayed heating are due to radiation heat transfer from the heating coils and slower response time of the sensor, respectively. Our results suggest that the peripheral heating is linearly proportional to the sample area to volume ratio and coil temperature. It is also observed that peripheral heating decreases in presence of a non-magnetic insulating shielding. The delayed heating is found to contribute up to ~25% uncertainties in SAR values. As the SAR values are very sensitive to the initial slope determination method, explicit mention of the range of linear regression analysis is appropriate to reproduce the results. The effect of sample volume to area ratio on linear heat loss rate is systematically studied and the results are compared using a lumped system thermal model. The various uncertainties involved in SAR estimation are categorized as material uncertainties, thermodynamic uncertainties and parametric uncertainties. The adiabatic reconstruction is found to decrease the uncertainties in SAR measurement by approximately three times. Additionally, a set of experimental guidelines for accurate SAR estimation using adiabatic reconstruction protocol is also recommended. These results warrant a universal experimental and data analysis protocol for SAR measurements during field induced heating of magnetic fluids under non-adiabatic conditions.
NASA Astrophysics Data System (ADS)
Feng, S.; Lauvaux, T.; Keller, K.; Davis, K. J.
2016-12-01
Current estimates of biogenic carbon fluxes over North America based on top-down atmospheric inversions are subject to considerable uncertainty. This uncertainty stems to a large part from the uncertain prior fluxes estimates with the associated error covariances and approximations in the atmospheric transport models that link observed carbon dioxide mixing ratios with surface fluxes. Specifically, approximations in the representation of vertical mixing associated with atmospheric turbulence or convective transport and largely under-determined prior fluxes and their error structures significantly hamper our capacity to reliably estimate regional carbon fluxes. The Atmospheric Carbon and Transport - America (ACT-America) mission aims at reducing the uncertainties in inverse fluxes at the regional-scale by deploying airborne and ground-based platforms to characterize atmospheric GHG mixing ratios and the concurrent atmospheric dynamics. Two aircraft measure the 3-dimensional distribution of greenhouse gases at synoptic scales, focusing on the atmospheric boundary layer and the free troposphere during both fair and stormy weather conditions. Here we analyze two main questions: (i) What level of information can we expect from the currently planned observations? (ii) How might ACT-America reduce the hindcast and predictive uncertainty of carbon estimates over North America?
Perceptual priming versus explicit memory: dissociable neural correlates at encoding.
Schott, Björn; Richardson-Klavehn, Alan; Heinze, Hans-Jochen; Düzel, Emrah
2002-05-15
We addressed the hypothesis that perceptual priming and explicit memory have distinct neural correlates at encoding. Event-related potentials (ERPs) were recorded while participants studied visually presented words at deep versus shallow levels of processing (LOPs). The ERPs were sorted by whether or not participants later used studied words as completions to three-letter word stems in an intentional memory test, and by whether or not they indicated that these completions were remembered from the study list. Study trials from which words were later used and not remembered (primed trials) and study trials from which words were later used and remembered (remembered trials) were compared to study trials from which words were later not used (forgotten trials), in order to measure the ERP difference associated with later memory (DM effect). Primed trials involved an early (200-450 msec) centroparietal negative-going DM effect. Remembered trials involved a late (900-1200 msec) right frontal, positive-going DM effect regardless of LOP, as well as an earlier (600-800 msec) central, positive-going DM effect during shallow study processing only. All three DM effects differed topographically, and, in terms of their onset or duration, from the extended (600-1200 msec) fronto-central, positive-going shift for deep compared with shallow study processing. The results provide the first clear evidence that perceptual priming and explicit memory have distinct neural correlates at encoding, consistent with Tulving and Schacter's (1990) distinction between brain systems concerned with perceptual representation versus semantic and episodic memory. They also shed additional light on encoding processes associated with later explicit memory, by suggesting that brain processes influenced by LOP set the stage for other, at least partially separable, brain processes that are more directly related to encoding success.
Predicting drug hydrolysis based on moisture uptake in various packaging designs.
Naversnik, Klemen; Bohanec, Simona
2008-12-18
An attempt was made to predict the stability of a moisture sensitive drug product based on the knowledge of the dependence of the degradation rate on tablet moisture. The moisture increase inside a HDPE bottle with the drug formulation was simulated with the sorption-desorption moisture transfer model, which, in turn, allowed an accurate prediction of the drug degradation kinetics. The stability prediction, obtained by computer simulation, was made in a considerably shorter time frame and required little resources compared to a conventional stability study. The prediction was finally upgraded to a stochastic Monte Carlo simulation, which allowed quantitative incorporation of uncertainty, stemming from various sources. The resulting distribution of the outcome of interest (amount of degradation product at expiry) is a comprehensive way of communicating the result along with its uncertainty, superior to single-value results or confidence intervals.
Dealing With Uncertainty When Assessing Fish Passage Through Culvert Road Crossings
NASA Astrophysics Data System (ADS)
Anderson, Gregory B.; Freeman, Mary C.; Freeman, Byron J.; Straight, Carrie A.; Hagler, Megan M.; Peterson, James T.
2012-09-01
Assessing the passage of aquatic organisms through culvert road crossings has become increasingly common in efforts to restore stream habitat. Several federal and state agencies and local stakeholders have adopted assessment approaches based on literature-derived criteria for culvert impassability. However, criteria differ and are typically specific to larger-bodied fishes. In an analysis to prioritize culverts for remediation to benefit imperiled, small-bodied fishes in the Upper Coosa River system in the southeastern United States, we assessed the sensitivity of prioritization to the use of differing but plausible criteria for culvert impassability. Using measurements at 256 road crossings, we assessed culvert impassability using four alternative criteria sets represented in Bayesian belief networks. Two criteria sets scored culverts as either passable or impassable based on alternative thresholds of culvert characteristics (outlet elevation, baseflow water velocity). Two additional criteria sets incorporated uncertainty concerning ability of small-bodied fishes to pass through culverts and estimated a probability of culvert impassability. To prioritize culverts for remediation, we combined estimated culvert impassability with culvert position in the stream network relative to other barriers to compute prospective gain in connected stream habitat for the target fish species. Although four culverts ranked highly for remediation regardless of which criteria were used to assess impassability, other culverts differed widely in priority depending on criteria. Our results emphasize the value of explicitly incorporating uncertainty into criteria underlying remediation decisions. Comparing outcomes among alternative, plausible criteria may also help to identify research most needed to narrow management uncertainty.
Dealing with uncertainty when assessing fish passage through culvert road crossings.
Anderson, Gregory B; Freeman, Mary C; Freeman, Byron J; Straight, Carrie A; Hagler, Megan M; Peterson, James T
2012-09-01
Assessing the passage of aquatic organisms through culvert road crossings has become increasingly common in efforts to restore stream habitat. Several federal and state agencies and local stakeholders have adopted assessment approaches based on literature-derived criteria for culvert impassability. However, criteria differ and are typically specific to larger-bodied fishes. In an analysis to prioritize culverts for remediation to benefit imperiled, small-bodied fishes in the Upper Coosa River system in the southeastern United States, we assessed the sensitivity of prioritization to the use of differing but plausible criteria for culvert impassability. Using measurements at 256 road crossings, we assessed culvert impassability using four alternative criteria sets represented in Bayesian belief networks. Two criteria sets scored culverts as either passable or impassable based on alternative thresholds of culvert characteristics (outlet elevation, baseflow water velocity). Two additional criteria sets incorporated uncertainty concerning ability of small-bodied fishes to pass through culverts and estimated a probability of culvert impassability. To prioritize culverts for remediation, we combined estimated culvert impassability with culvert position in the stream network relative to other barriers to compute prospective gain in connected stream habitat for the target fish species. Although four culverts ranked highly for remediation regardless of which criteria were used to assess impassability, other culverts differed widely in priority depending on criteria. Our results emphasize the value of explicitly incorporating uncertainty into criteria underlying remediation decisions. Comparing outcomes among alternative, plausible criteria may also help to identify research most needed to narrow management uncertainty.
Vanguelova, E I; Bonifacio, E; De Vos, B; Hoosbeek, M R; Berger, T W; Vesterdal, L; Armolaitis, K; Celi, L; Dinca, L; Kjønaas, O J; Pavlenda, P; Pumpanen, J; Püttsepp, Ü; Reidy, B; Simončič, P; Tobin, B; Zhiyanski, M
2016-11-01
Spatially explicit knowledge of recent and past soil organic carbon (SOC) stocks in forests will improve our understanding of the effect of human- and non-human-induced changes on forest C fluxes. For SOC accounting, a minimum detectable difference must be defined in order to adequately determine temporal changes and spatial differences in SOC. This requires sufficiently detailed data to predict SOC stocks at appropriate scales within the required accuracy so that only significant changes are accounted for. When designing sampling campaigns, taking into account factors influencing SOC spatial and temporal distribution (such as soil type, topography, climate and vegetation) are needed to optimise sampling depths and numbers of samples, thereby ensuring that samples accurately reflect the distribution of SOC at a site. Furthermore, the appropriate scales related to the research question need to be defined: profile, plot, forests, catchment, national or wider. Scaling up SOC stocks from point sample to landscape unit is challenging, and thus requires reliable baseline data. Knowledge of the associated uncertainties related to SOC measures at each particular scale and how to reduce them is crucial for assessing SOC stocks with the highest possible accuracy at each scale. This review identifies where potential sources of errors and uncertainties related to forest SOC stock estimation occur at five different scales-sample, profile, plot, landscape/regional and European. Recommendations are also provided on how to reduce forest SOC uncertainties and increase efficiency of SOC assessment at each scale.
A Decision Support System for effective use of probability forecasts
NASA Astrophysics Data System (ADS)
De Kleermaeker, Simone; Verkade, Jan
2013-04-01
Often, water management decisions are based on hydrological forecasts. These forecasts, however, are affected by inherent uncertainties. It is increasingly common for forecasting agencies to make explicit estimates of these uncertainties and thus produce probabilistic forecasts. Associated benefits include the decision makers' increased awareness of forecasting uncertainties and the potential for risk-based decision-making. Also, a stricter separation of responsibilities between forecasters and decision maker can be made. However, simply having probabilistic forecasts available is not sufficient to realise the associated benefits. Additional effort is required in areas such as forecast visualisation and communication, decision making in uncertainty and forecast verification. Also, revised separation of responsibilities requires a shift in institutional arrangements and responsibilities. A recent study identified a number of additional issues related to the effective use of probability forecasts. When moving from deterministic to probability forecasting, a dimension is added to an already multi-dimensional problem; this makes it increasingly difficult for forecast users to extract relevant information from a forecast. A second issue is that while probability forecasts provide a necessary ingredient for risk-based decision making, other ingredients may not be present. For example, in many cases no estimates of flood damage, of costs of management measures and of damage reduction are available. This paper presents the results of the study, including some suggestions for resolving these issues and the integration of those solutions in a prototype decision support system (DSS). A pathway for further development of the DSS is outlined.
NASA Astrophysics Data System (ADS)
Wang, Jun; Wang, Yang; Zeng, Hui
2016-01-01
A key issue to address in synthesizing spatial data with variable-support in spatial analysis and modeling is the change-of-support problem. We present an approach for solving the change-of-support and variable-support data fusion problems. This approach is based on geostatistical inverse modeling that explicitly accounts for differences in spatial support. The inverse model is applied here to produce both the best predictions of a target support and prediction uncertainties, based on one or more measurements, while honoring measurements. Spatial data covering large geographic areas often exhibit spatial nonstationarity and can lead to computational challenge due to the large data size. We developed a local-window geostatistical inverse modeling approach to accommodate these issues of spatial nonstationarity and alleviate computational burden. We conducted experiments using synthetic and real-world raster data. Synthetic data were generated and aggregated to multiple supports and downscaled back to the original support to analyze the accuracy of spatial predictions and the correctness of prediction uncertainties. Similar experiments were conducted for real-world raster data. Real-world data with variable-support were statistically fused to produce single-support predictions and associated uncertainties. The modeling results demonstrate that geostatistical inverse modeling can produce accurate predictions and associated prediction uncertainties. It is shown that the local-window geostatistical inverse modeling approach suggested offers a practical way to solve the well-known change-of-support problem and variable-support data fusion problem in spatial analysis and modeling.
Dealing with uncertainty when assessing fish passage through culvert road crossings
Anderson, Gregory B.; Freeman, Mary C.; Freeman, Byron J.; Straight, Carrie A.; Hagler, Megan M.; Peterson, James T.
2012-01-01
Assessing the passage of aquatic organisms through culvert road crossings has become increasingly common in efforts to restore stream habitat. Several federal and state agencies and local stakeholders have adopted assessment approaches based on literature-derived criteria for culvert impassability. However, criteria differ and are typically specific to larger-bodied fishes. In an analysis to prioritize culverts for remediation to benefit imperiled, small-bodied fishes in the Upper Coosa River system in the southeastern United States, we assessed the sensitivity of prioritization to the use of differing but plausible criteria for culvert impassability. Using measurements at 256 road crossings, we assessed culvert impassability using four alternative criteria sets represented in Bayesian belief networks. Two criteria sets scored culverts as either passable or impassable based on alternative thresholds of culvert characteristics (outlet elevation, baseflow water velocity). Two additional criteria sets incorporated uncertainty concerning ability of small-bodied fishes to pass through culverts and estimated a probability of culvert impassability. To prioritize culverts for remediation, we combined estimated culvert impassability with culvert position in the stream network relative to other barriers to compute prospective gain in connected stream habitat for the target fish species. Although four culverts ranked highly for remediation regardless of which criteria were used to assess impassability, other culverts differed widely in priority depending on criteria. Our results emphasize the value of explicitly incorporating uncertainty into criteria underlying remediation decisions. Comparing outcomes among alternative, plausible criteria may also help to identify research most needed to narrow management uncertainty.
Asymptotic formulae for likelihood-based tests of new physics
NASA Astrophysics Data System (ADS)
Cowan, Glen; Cranmer, Kyle; Gross, Eilam; Vitells, Ofer
2011-02-01
We describe likelihood-based statistical tests for use in high energy physics for the discovery of new phenomena and for construction of confidence intervals on model parameters. We focus on the properties of the test procedures that allow one to account for systematic uncertainties. Explicit formulae for the asymptotic distributions of test statistics are derived using results of Wilks and Wald. We motivate and justify the use of a representative data set, called the "Asimov data set", which provides a simple method to obtain the median experimental sensitivity of a search or measurement as well as fluctuations about this expectation.
Baushev, A. N.; Federici, S.; Pohl, M.
2012-09-20
The indirect detection of dark matter requires that dark matter annihilation products be discriminated from conventional astrophysical backgrounds. We re-analyze GeV-band gamma-ray observations of the prominent Milky Way dwarf satellite galaxy Segue 1, for which the expected astrophysical background is minimal. Here, we explicitly account for the angular extent of the conservatively expected gamma-ray signal and keep the uncertainty in the dark-matter profile external to the likelihood analysis of the gamma-ray data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mehrez, Loujaine; Ghanem, Roger; McAuliffe, Colin
multiscale framework to construct stochastic macroscopic constitutive material models is proposed. A spectral projection approach, specifically polynomial chaos expansion, has been used to construct explicit functional relationships between the homogenized properties and input parameters from finer scales. A homogenization engine embedded in Multiscale Designer, software for composite materials, has been used for the upscaling process. The framework is demonstrated using non-crimp fabric composite materials by constructing probabilistic models of the homogenized properties of a non-crimp fabric laminate in terms of the input parameters together with the homogenized properties from finer scales.
Lattice QCD and the timelike pion form factor.
Meyer, Harvey B
2011-08-12
We present a formula that allows one to calculate the pion form factor in the timelike region 2m(π) ≤ √(s) ≤ 4m(π) in lattice QCD. The form factor quantifies the contribution of two-pion states to the vacuum polarization. It must be known very accurately in order to reduce the theoretical uncertainty on the anomalous magnetic moment of the muon. At the same time, the formula constitutes a rare example where, in a restricted kinematic regime, the spectral function of a conserved current can be determined from Euclidean observables without an explicit analytic continuation.
NASA Technical Reports Server (NTRS)
Haddad, Wassim M.; Bernstein, Dennis S.
1991-01-01
Lyapunov function proofs of sufficient conditions for asymptotic stability are given for feedback interconnections of bounded real and positive real transfer functions. Two cases are considered: (1) a proper bounded real (resp., positive real) transfer function with a bounded real (resp., positive real) time-varying memoryless nonlinearity; and (2) two strictly proper bounded real (resp., positive real) transfer functions. A similar treatment is given for the circle and Popov theorems. Application of these results to robust stability with time-varying bounded real, positive real, and sector-bounded uncertainty is discussed.
Uncertainty in mixing models: a blessing in disguise?
NASA Astrophysics Data System (ADS)
Delsman, J. R.; Oude Essink, G. H. P.
2012-04-01
Despite the abundance of tracer-based studies in catchment hydrology over the past decades, relatively few studies have addressed the uncertainty associated with these studies in much detail. This uncertainty stems from analytical error, spatial and temporal variance in end-member composition, and from not incorporating all relevant processes in the necessarily simplistic mixing models. Instead of applying standard EMMA methodology, we used end-member mixing model analysis within a Monte Carlo framework to quantify the uncertainty surrounding our analysis. Borrowing from the well-known GLUE methodology, we discarded mixing models that could not satisfactorily explain sample concentrations and analyzed the posterior parameter set. This use of environmental tracers aided in disentangling hydrological pathways in a Dutch polder catchment. This 10 km2 agricultural catchment is situated in the coastal region of the Netherlands. Brackish groundwater seepage, originating from Holocene marine transgressions, adversely affects water quality in this catchment. Current water management practice is aimed at improving water quality by flushing the catchment with fresh water from the river Rhine. Climate change is projected to decrease future fresh water availability, signifying the need for a more sustainable water management practice and a better understanding of the functioning of the catchment. The end-member mixing analysis increased our understanding of the hydrology of the studied catchment. The use of a GLUE-like framework for applying the end-member mixing analysis not only quantified the uncertainty associated with the analysis, the analysis of the posterior parameter set also identified the existence of catchment processes otherwise overlooked.
Bayesian Model Averaging of Artificial Intelligence Models for Hydraulic Conductivity Estimation
NASA Astrophysics Data System (ADS)
Nadiri, A.; Chitsazan, N.; Tsai, F. T.; Asghari Moghaddam, A.
2012-12-01
This research presents a Bayesian artificial intelligence model averaging (BAIMA) method that incorporates multiple artificial intelligence (AI) models to estimate hydraulic conductivity and evaluate estimation uncertainties. Uncertainty in the AI model outputs stems from error in model input as well as non-uniqueness in selecting different AI methods. Using one single AI model tends to bias the estimation and underestimate uncertainty. BAIMA employs Bayesian model averaging (BMA) technique to address the issue of using one single AI model for estimation. BAIMA estimates hydraulic conductivity by averaging the outputs of AI models according to their model weights. In this study, the model weights were determined using the Bayesian information criterion (BIC) that follows the parsimony principle. BAIMA calculates the within-model variances to account for uncertainty propagation from input data to AI model output. Between-model variances are evaluated to account for uncertainty due to model non-uniqueness. We employed Takagi-Sugeno fuzzy logic (TS-FL), artificial neural network (ANN) and neurofuzzy (NF) to estimate hydraulic conductivity for the Tasuj plain aquifer, Iran. BAIMA combined three AI models and produced better fitting than individual models. While NF was expected to be the best AI model owing to its utilization of both TS-FL and ANN models, the NF model is nearly discarded by the parsimony principle. The TS-FL model and the ANN model showed equal importance although their hydraulic conductivity estimates were quite different. This resulted in significant between-model variances that are normally ignored by using one AI model.
Remote Sensing of Ecosystem Health: Opportunities, Challenges, and Future Perspectives
Li, Zhaoqin; Xu, Dandan; Guo, Xulin
2014-01-01
Maintaining a healthy ecosystem is essential for maximizing sustainable ecological services of the best quality to human beings. Ecological and conservation research has provided a strong scientific background on identifying ecological health indicators and correspondingly making effective conservation plans. At the same time, ecologists have asserted a strong need for spatially explicit and temporally effective ecosystem health assessments based on remote sensing data. Currently, remote sensing of ecosystem health is only based on one ecosystem attribute: vigor, organization, or resilience. However, an effective ecosystem health assessment should be a comprehensive and dynamic measurement of the three attributes. This paper reviews opportunities of remote sensing, including optical, radar, and LiDAR, for directly estimating indicators of the three ecosystem attributes, discusses the main challenges to develop a remote sensing-based spatially-explicit comprehensive ecosystem health system, and provides some future perspectives. The main challenges to develop a remote sensing-based spatially-explicit comprehensive ecosystem health system are: (1) scale issue; (2) transportability issue; (3) data availability; and (4) uncertainties in health indicators estimated from remote sensing data. However, the Radarsat-2 constellation, upcoming new optical sensors on Worldview-3 and Sentinel-2 satellites, and improved technologies for the acquisition and processing of hyperspectral, multi-angle optical, radar, and LiDAR data and multi-sensoral data fusion may partly address the current challenges. PMID:25386759
Fung, Ronald K F; Kerridge, Ian H
2013-02-01
The discovery of induced pluripotent stem (iPS) cells in 2006 was heralded as a major breakthrough in stem cell research. Since then, progress in iPS cell technology has paved the way towards clinical application, particularly cell replacement therapy, which has refueled debate on the ethics of stem cell research. However, much of the discourse has focused on questions of moral status and potentiality, overlooking the ethical issues which are introduced by the clinical testing of iPS cell replacement therapy. First-in-human trials, in particular, raise a number of ethical concerns including informed consent, subject recruitment and harm minimisation as well as the inherent uncertainty and risks which are involved in testing medical procedures on humans for the first time. These issues, while a feature of any human research, become more complex in the case of iPS cell therapy, given the seriousness of the potential risks, the unreliability of available animal models, the vulnerability of the target patient group, and the high stakes of such an intensely public area of science. Our paper will present a detailed case study of iPS cell replacement therapy for Parkinson's disease to highlight these broader ethical and epistemological concerns. If we accept that iPS cell technology is fraught with challenges which go far beyond merely refuting the potentiality of the stem cell line, we conclude that iPS cell research should not replace, but proceed alongside embryonic and adult somatic stem cell research to promote cross-fertilisation of knowledge and better clinical outcomes. © 2011 Blackwell Publishing Ltd.
Hypersonic vehicle model and control law development using H(infinity) and micron synthesis
NASA Astrophysics Data System (ADS)
Gregory, Irene M.; Chowdhry, Rajiv S.; McMinn, John D.; Shaughnessy, John D.
1994-10-01
The control system design for a Single Stage To Orbit (SSTO) air breathing vehicle will be central to a successful mission because a precise ascent trajectory will preserve narrow payload margins. The air breathing propulsion system requires the vehicle to fly roughly halfway around the Earth through atmospheric turbulence. The turbulence, the high sensitivity of the propulsion system to inlet flow conditions, the relatively large uncertainty of the parameters characterizing the vehicle, and continuous acceleration make the problem especially challenging. Adequate stability margins must be provided without sacrificing payload mass since payload margins are critical. Therefore, a multivariable control theory capable of explicitly including both uncertainty and performance is needed. The H(infinity) controller in general provides good robustness but can result in conservative solutions for practical problems involving structured uncertainty. Structured singular value mu framework for analysis and synthesis is potentially much less conservative and hence more appropriate for problems with tight margins. An SSTO control system requires: highly accurate tracking of velocity and altitude commands while limiting angle-of-attack oscillations, minimized control power usage, and a stabilized vehicle when atmospheric turbulence and system uncertainty are present. The controller designs using H(infinity) and mu-synthesis procedures were compared. An integrated flight/propulsion dynamic mathematical model of a conical accelerator vehicle was linearized as the vehicle accelerated through Mach 8. Vehicle acceleration through the selected flight condition gives rise to parametric variation that was modeled as a structured uncertainty. The mu-analysis approach was used in the frequency domain to conduct controller analysis and was confirmed by time history plots. Results demonstrate the inherent advantages of the mu framework for this class of problems.
Fast integration-based prediction bands for ordinary differential equation models.
Hass, Helge; Kreutz, Clemens; Timmer, Jens; Kaschek, Daniel
2016-04-15
To gain a deeper understanding of biological processes and their relevance in disease, mathematical models are built upon experimental data. Uncertainty in the data leads to uncertainties of the model's parameters and in turn to uncertainties of predictions. Mechanistic dynamic models of biochemical networks are frequently based on nonlinear differential equation systems and feature a large number of parameters, sparse observations of the model components and lack of information in the available data. Due to the curse of dimensionality, classical and sampling approaches propagating parameter uncertainties to predictions are hardly feasible and insufficient. However, for experimental design and to discriminate between competing models, prediction and confidence bands are essential. To circumvent the hurdles of the former methods, an approach to calculate a profile likelihood on arbitrary observations for a specific time point has been introduced, which provides accurate confidence and prediction intervals for nonlinear models and is computationally feasible for high-dimensional models. In this article, reliable and smooth point-wise prediction and confidence bands to assess the model's uncertainty on the whole time-course are achieved via explicit integration with elaborate correction mechanisms. The corresponding system of ordinary differential equations is derived and tested on three established models for cellular signalling. An efficiency analysis is performed to illustrate the computational benefit compared with repeated profile likelihood calculations at multiple time points. The integration framework and the examples used in this article are provided with the software package Data2Dynamics, which is based on MATLAB and freely available at http://www.data2dynamics.org helge.hass@fdm.uni-freiburg.de Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Uncertainties in mapping forest carbon in urban ecosystems.
Chen, Gang; Ozelkan, Emre; Singh, Kunwar K; Zhou, Jun; Brown, Marilyn R; Meentemeyer, Ross K
2017-02-01
Spatially explicit urban forest carbon estimation provides a baseline map for understanding the variation in forest vertical structure, informing sustainable forest management and urban planning. While high-resolution remote sensing has proven promising for carbon mapping in highly fragmented urban landscapes, data cost and availability are the major obstacle prohibiting accurate, consistent, and repeated measurement of forest carbon pools in cities. This study aims to evaluate the uncertainties of forest carbon estimation in response to the combined impacts of remote sensing data resolution and neighborhood spatial patterns in Charlotte, North Carolina. The remote sensing data for carbon mapping were resampled to a range of resolutions, i.e., LiDAR point cloud density - 5.8, 4.6, 2.3, and 1.2 pt s/m 2 , aerial optical NAIP (National Agricultural Imagery Program) imagery - 1, 5, 10, and 20 m. Urban spatial patterns were extracted to represent area, shape complexity, dispersion/interspersion, diversity, and connectivity of landscape patches across the residential neighborhoods with built-up densities from low, medium-low, medium-high, to high. Through statistical analyses, we found that changing remote sensing data resolution introduced noticeable uncertainties (variation) in forest carbon estimation at the neighborhood level. Higher uncertainties were caused by the change of LiDAR point density (causing 8.7-11.0% of variation) than changing NAIP image resolution (causing 6.2-8.6% of variation). For both LiDAR and NAIP, urban neighborhoods with a higher degree of anthropogenic disturbance unveiled a higher level of uncertainty in carbon mapping. However, LiDAR-based results were more likely to be affected by landscape patch connectivity, and the NAIP-based estimation was found to be significantly influenced by the complexity of patch shape. Copyright © 2016 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard Strydom; Su-Jong Yoon
2014-04-01
Computational Fluid Dynamics (CFD) evaluation of homogeneous and heterogeneous fuel models was performed as part of the Phase I calculations of the International Atomic Energy Agency (IAEA) Coordinate Research Program (CRP) on High Temperature Reactor (HTR) Uncertainties in Modeling (UAM). This study was focused on the nominal localized stand-alone fuel thermal response, as defined in Ex. I-3 and I-4 of the HTR UAM. The aim of the stand-alone thermal unit-cell simulation is to isolate the effect of material and boundary input uncertainties on a very simplified problem, before propagation of these uncertainties are performed in subsequent coupled neutronics/thermal fluids phasesmore » on the benchmark. In many of the previous studies for high temperature gas cooled reactors, the volume-averaged homogeneous mixture model of a single fuel compact has been applied. In the homogeneous model, the Tristructural Isotropic (TRISO) fuel particles in the fuel compact were not modeled directly and an effective thermal conductivity was employed for the thermo-physical properties of the fuel compact. On the contrary, in the heterogeneous model, the uranium carbide (UCO), inner and outer pyrolytic carbon (IPyC/OPyC) and silicon carbide (SiC) layers of the TRISO fuel particles are explicitly modeled. The fuel compact is modeled as a heterogeneous mixture of TRISO fuel kernels embedded in H-451 matrix graphite. In this study, a steady-state and transient CFD simulations were performed with both homogeneous and heterogeneous models to compare the thermal characteristics. The nominal values of the input parameters are used for this CFD analysis. In a future study, the effects of input uncertainties in the material properties and boundary parameters will be investigated and reported.« less
Estimating trends in the global mean temperature record
NASA Astrophysics Data System (ADS)
Poppick, Andrew; Moyer, Elisabeth J.; Stein, Michael L.
2017-06-01
Given uncertainties in physical theory and numerical climate simulations, the historical temperature record is often used as a source of empirical information about climate change. Many historical trend analyses appear to de-emphasize physical and statistical assumptions: examples include regression models that treat time rather than radiative forcing as the relevant covariate, and time series methods that account for internal variability in nonparametric rather than parametric ways. However, given a limited data record and the presence of internal variability, estimating radiatively forced temperature trends in the historical record necessarily requires some assumptions. Ostensibly empirical methods can also involve an inherent conflict in assumptions: they require data records that are short enough for naive trend models to be applicable, but long enough for long-timescale internal variability to be accounted for. In the context of global mean temperatures, empirical methods that appear to de-emphasize assumptions can therefore produce misleading inferences, because the trend over the twentieth century is complex and the scale of temporal correlation is long relative to the length of the data record. We illustrate here how a simple but physically motivated trend model can provide better-fitting and more broadly applicable trend estimates and can allow for a wider array of questions to be addressed. In particular, the model allows one to distinguish, within a single statistical framework, between uncertainties in the shorter-term vs. longer-term response to radiative forcing, with implications not only on historical trends but also on uncertainties in future projections. We also investigate the consequence on inferred uncertainties of the choice of a statistical description of internal variability. While nonparametric methods may seem to avoid making explicit assumptions, we demonstrate how even misspecified parametric statistical methods, if attuned to the important characteristics of internal variability, can result in more accurate uncertainty statements about trends.
Hypersonic vehicle model and control law development using H(infinity) and micron synthesis
NASA Technical Reports Server (NTRS)
Gregory, Irene M.; Chowdhry, Rajiv S.; Mcminn, John D.; Shaughnessy, John D.
1994-01-01
The control system design for a Single Stage To Orbit (SSTO) air breathing vehicle will be central to a successful mission because a precise ascent trajectory will preserve narrow payload margins. The air breathing propulsion system requires the vehicle to fly roughly halfway around the Earth through atmospheric turbulence. The turbulence, the high sensitivity of the propulsion system to inlet flow conditions, the relatively large uncertainty of the parameters characterizing the vehicle, and continuous acceleration make the problem especially challenging. Adequate stability margins must be provided without sacrificing payload mass since payload margins are critical. Therefore, a multivariable control theory capable of explicitly including both uncertainty and performance is needed. The H(infinity) controller in general provides good robustness but can result in conservative solutions for practical problems involving structured uncertainty. Structured singular value mu framework for analysis and synthesis is potentially much less conservative and hence more appropriate for problems with tight margins. An SSTO control system requires: highly accurate tracking of velocity and altitude commands while limiting angle-of-attack oscillations, minimized control power usage, and a stabilized vehicle when atmospheric turbulence and system uncertainty are present. The controller designs using H(infinity) and mu-synthesis procedures were compared. An integrated flight/propulsion dynamic mathematical model of a conical accelerator vehicle was linearized as the vehicle accelerated through Mach 8. Vehicle acceleration through the selected flight condition gives rise to parametric variation that was modeled as a structured uncertainty. The mu-analysis approach was used in the frequency domain to conduct controller analysis and was confirmed by time history plots. Results demonstrate the inherent advantages of the mu framework for this class of problems.
Physicians’ Anxiety Due to Uncertainty and Use of Race in Medical Decision-Making
Cunningham, Brooke A.; Bonham, Vence L.; Sellers, Sherrill L.; Yeh, Hsin-Chieh; Cooper, Lisa A.
2014-01-01
Background The explicit use of race in medical decision-making is contested. Researchers have hypothesized that physicians use race in care when they are uncertain. Objectives To investigate whether physician anxiety due to uncertainty is associated with a higher propensity to use race in medical decision-making. Research Design A national cross-sectional survey of general internists Subjects A national sample of 1738 clinically active general internists drawn from the SK&A physician database Measures Anxiety Due to Uncertainty (ADU) is a 5-item measure of emotional reactions to clinical uncertainty. Bonham and Sellers Racial Attributes in Clinical Evaluation (RACE) scale includes 7 items that measure self-reported use of race in medical decision-making. We used bivariate regression to test for associations between physician characteristics, ADU and RACE. Multivariate linear regression was performed to test for associations between ADU and RACE while adjusting for potential confounders. Results The mean score on ADU was 19.9 (SD=5.6). Mean score on RACE was 13.5 (SD=5.6). After adjusting for physician demographics, physicians with higher levels of ADU scored higher on RACE (+β=0.08 in RACE, p=0.04, for each 1-point increase in ADU), as did physicians who understand “race” to mean biological or genetic ancestral, rather than sociocultural, group. Physicians who graduated from a US medical school, completed fellowship, and had more white patients, scored lower on RACE. Conclusions This study demonstrates positive associations between physicians’ anxiety due to uncertainty, meanings attributed to race, and self-reported use of race in medical decision-making. Future research should examine the potential impact of these associations on patient outcomes and healthcare disparities. PMID:25025871
On the role of budget sufficiency, cost efficiency, and uncertainty in species management
van der Burg, Max Post; Bly, Bartholomew B.; Vercauteren, Tammy; Grand, James B.; Tyre, Andrew J.
2014-01-01
Many conservation planning frameworks rely on the assumption that one should prioritize locations for management actions based on the highest predicted conservation value (i.e., abundance, occupancy). This strategy may underperform relative to the expected outcome if one is working with a limited budget or the predicted responses are uncertain. Yet, cost and tolerance to uncertainty rarely become part of species management plans. We used field data and predictive models to simulate a decision problem involving western burrowing owls (Athene cunicularia hypugaea) using prairie dog colonies (Cynomys ludovicianus) in western Nebraska. We considered 2 species management strategies: one maximized abundance and the other maximized abundance in a cost-efficient way. We then used heuristic decision algorithms to compare the 2 strategies in terms of how well they met a hypothetical conservation objective. Finally, we performed an info-gap decision analysis to determine how these strategies performed under different budget constraints and uncertainty about owl response. Our results suggested that when budgets were sufficient to manage all sites, the maximizing strategy was optimal and suggested investing more in expensive actions. This pattern persisted for restricted budgets up to approximately 50% of the sufficient budget. Below this budget, the cost-efficient strategy was optimal and suggested investing in cheaper actions. When uncertainty in the expected responses was introduced, the strategy that maximized abundance remained robust under a sufficient budget. Reducing the budget induced a slight trade-off between expected performance and robustness, which suggested that the most robust strategy depended both on one's budget and tolerance to uncertainty. Our results suggest that wildlife managers should explicitly account for budget limitations and be realistic about their expected levels of performance.
Validity of Willingness to Pay Measures under Preference Uncertainty.
Braun, Carola; Rehdanz, Katrin; Schmidt, Ulrich
2016-01-01
Recent studies in the marketing literature developed a new method for eliciting willingness to pay (WTP) with an open-ended elicitation format: the Range-WTP method. In contrast to the traditional approach of eliciting WTP as a single value (Point-WTP), Range-WTP explicitly allows for preference uncertainty in responses. The aim of this paper is to apply Range-WTP to the domain of contingent valuation and to test for its theoretical validity and robustness in comparison to the Point-WTP. Using data from two novel large-scale surveys on the perception of solar radiation management (SRM), a little-known technique for counteracting climate change, we compare the performance of both methods in the field. In addition to the theoretical validity (i.e. the degree to which WTP values are consistent with theoretical expectations), we analyse the test-retest reliability and stability of our results over time. Our evidence suggests that the Range-WTP method clearly outperforms the Point-WTP method.
Aligning Spinoza with Descartes: An informed Cartesian account of the truth bias.
Street, Chris N H; Kingstone, Alan
2017-08-01
There is a bias towards believing information is true rather than false. The Spinozan account claims there is an early, automatic bias towards believing. Only afterwards can people engage in an effortful re-evaluation and disbelieve the information. Supporting this account, there is a greater bias towards believing information is true when under cognitive load. However, developing on the Adaptive Lie Detector (ALIED) theory, the informed Cartesian can equally explain this data. The account claims the bias under load is not evidence of automatic belief; rather, people are undecided, but if forced to guess they can rely on context information to make an informed judgement. The account predicts, and we found, that if people can explicitly indicate their uncertainty, there should be no bias towards believing because they are no longer required to guess. Thus, we conclude that belief formation can be better explained by an informed Cartesian account - an attempt to make an informed judgment under uncertainty. © 2016 The British Psychological Society.
Stochastic Energy Deployment System
DOE Office of Scientific and Technical Information (OSTI.GOV)
2011-11-30
SEDS is an economy-wide energy model of the U.S. The model captures dynamics between supply, demand, and pricing of the major energy types consumed and produced within the U.S. These dynamics are captured by including: the effects of macroeconomics; the resources and costs of primary energy types such as oil, natural gas, coal, and biomass; the conversion of primary fuels into energy products like petroleum products, electricity, biofuels, and hydrogen; and lastly the end- use consumption attributable to residential and commercial buildings, light and heavy transportation, and industry. Projections from SEDS extend to the year 2050 by one-year time stepsmore » and are generally projected at the national level. SEDS differs from other economy-wide energy models in that it explicitly accounts for uncertainty in technology, markets, and policy. SEDS has been specifically developed to avoid the computational burden, and sometimes fruitless labor, that comes from modeling significantly low-level details. Instead, SEDS focuses on the major drivers within the energy economy and evaluates the impact of uncertainty around those drivers.« less
Korun, M; Vodenik, B; Zorko, B
2018-03-01
A new method for calculating the detection limits of gamma-ray spectrometry measurements is presented. The method is applicable for gamma-ray emitters, irrespective of the influences of the peaked background, the origin of the background and the overlap with other peaks. It offers the opportunity for multi-gamma-ray emitters to calculate the common detection limit, corresponding to more peaks. The detection limit is calculated by approximating the dependence of the uncertainty in the indication on its value with a second-order polynomial. In this approach the relation between the input quantities and the detection limit are described by an explicit expression and can be easy investigated. The detection limit is calculated from the data usually provided by the reports of peak-analyzing programs: the peak areas and their uncertainties. As a result, the need to use individual channel contents for calculating the detection limit is bypassed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Stelzenmüller, V; Lee, J; Garnacho, E; Rogers, S I
2010-10-01
For the UK continental shelf we developed a Bayesian Belief Network-GIS framework to visualise relationships between cumulative human pressures, sensitive marine landscapes and landscape vulnerability, to assess the consequences of potential marine planning objectives, and to map uncertainty-related changes in management measures. Results revealed that the spatial assessment of footprints and intensities of human activities had more influence on landscape vulnerabilities than the type of landscape sensitivity measure used. We addressed questions regarding consequences of potential planning targets, and necessary management measures with spatially-explicit assessment of their consequences. We conclude that the BN-GIS framework is a practical tool allowing for the visualisation of relationships, the spatial assessment of uncertainty related to spatial management scenarios, the engagement of different stakeholder views, and enables a quick update of new spatial data and relationships. Ultimately, such BN-GIS based tools can support the decision-making process used in adaptive marine management. Copyright © 2010 Elsevier Ltd. All rights reserved.
Validity of Willingness to Pay Measures under Preference Uncertainty
Braun, Carola; Rehdanz, Katrin; Schmidt, Ulrich
2016-01-01
Recent studies in the marketing literature developed a new method for eliciting willingness to pay (WTP) with an open-ended elicitation format: the Range-WTP method. In contrast to the traditional approach of eliciting WTP as a single value (Point-WTP), Range-WTP explicitly allows for preference uncertainty in responses. The aim of this paper is to apply Range-WTP to the domain of contingent valuation and to test for its theoretical validity and robustness in comparison to the Point-WTP. Using data from two novel large-scale surveys on the perception of solar radiation management (SRM), a little-known technique for counteracting climate change, we compare the performance of both methods in the field. In addition to the theoretical validity (i.e. the degree to which WTP values are consistent with theoretical expectations), we analyse the test-retest reliability and stability of our results over time. Our evidence suggests that the Range-WTP method clearly outperforms the Point-WTP method. PMID:27096163
Adaptive management for soil ecosystem services
Birge, Hannah E.; Bevans, Rebecca A.; Allen, Craig R.; Angeler, David G.; Baer, Sara G.; Wall, Diana H.
2016-01-01
Ecosystem services provided by soil include regulation of the atmosphere and climate, primary (including agricultural) production, waste processing, decomposition, nutrient conservation, water purification, erosion control, medical resources, pest control, and disease mitigation. The simultaneous production of these multiple services arises from complex interactions among diverse aboveground and belowground communities across multiple scales. When a system is mismanaged, non-linear and persistent losses in ecosystem services can arise. Adaptive management is an approach to management designed to reduce uncertainty as management proceeds. By developing alternative hypotheses, testing these hypotheses and adjusting management in response to outcomes, managers can probe dynamic mechanistic relationships among aboveground and belowground soil system components. In doing so, soil ecosystem services can be preserved and critical ecological thresholds avoided. Here, we present an adaptive management framework designed to reduce uncertainty surrounding the soil system, even when soil ecosystem services production is not the explicit management objective, so that managers can reach their management goals without undermining soil multifunctionality or contributing to an irreversible loss of soil ecosystem services.
NASA Astrophysics Data System (ADS)
Menichetti, Lorenzo; Kätterer, Thomas; Leifeld, Jens
2016-05-01
Soil organic carbon (SOC) dynamics result from different interacting processes and controls on spatial scales from sub-aggregate to pedon to the whole ecosystem. These complex dynamics are translated into models as abundant degrees of freedom. This high number of not directly measurable variables and, on the other hand, very limited data at disposal result in equifinality and parameter uncertainty. Carbon radioisotope measurements are a proxy for SOC age both at annual to decadal (bomb peak based) and centennial to millennial timescales (radio decay based), and thus can be used in addition to total organic C for constraining SOC models. By considering this additional information, uncertainties in model structure and parameters may be reduced. To test this hypothesis we studied SOC dynamics and their defining kinetic parameters in the Zürich Organic Fertilization Experiment (ZOFE) experiment, a > 60-year-old controlled cropland experiment in Switzerland, by utilizing SOC and SO14C time series. To represent different processes we applied five model structures, all stemming from a simple mother model (Introductory Carbon Balance Model - ICBM): (I) two decomposing pools, (II) an inert pool added, (III) three decomposing pools, (IV) two decomposing pools with a substrate control feedback on decomposition, (V) as IV but with also an inert pool. These structures were extended to explicitly represent total SOC and 14C pools. The use of different model structures allowed us to explore model structural uncertainty and the impact of 14C on kinetic parameters. We considered parameter uncertainty by calibrating in a formal Bayesian framework. By varying the relative importance of total SOC and SO14C data in the calibration, we could quantify the effect of the information from these two data streams on estimated model parameters. The weighing of the two data streams was crucial for determining model outcomes, and we suggest including it in future modeling efforts whenever SO14C data are available. The measurements and all model structures indicated a dramatic decline in SOC in the ZOFE experiment after an initial land use change in 1949 from grass- to cropland, followed by a constant but smaller decline. According to all structures, the three treatments (control, mineral fertilizer, farmyard manure) we considered were still far from equilibrium. The estimates of mean residence time (MRT) of the C pools defined by our models were sensitive to the consideration of the SO14C data stream. Model structure had a smaller effect on estimated MRT, which ranged between 5.9 ± 0.1 and 4.2 ± 0.1 years and 78.9 ± 0.1 and 98.9 ± 0.1 years for young and old pools, respectively, for structures without substrate interactions. The simplest model structure performed the best according to information criteria, validating the idea that we still lack data for mechanistic SOC models. Although we could not exclude any of the considered processes possibly involved in SOC decomposition, it was not possible to discriminate their relative importance.
Vertical variations in wood CO2 efflux for live emergent trees in a Bornean tropical rainforest.
Katayama, Ayumi; Kume, Tomonori; Komatsu, Hikaru; Ohashi, Mizue; Matsumoto, Kazuho; Ichihashi, Ryuji; Kumagai, Tomo'omi; Otsuki, Kyoichi
2014-05-01
Difficult access to 40-m-tall emergent trees in tropical rainforests has resulted in a lack of data related to vertical variations in wood CO2 efflux, even though significant variations in wood CO2 efflux are an important source of errors when estimating whole-tree total wood CO2 efflux. This study aimed to clarify vertical variations in wood CO2 efflux for emergent trees and to document the impact of the variations on the whole-tree estimates of stem and branch CO2 efflux. First, we measured wood CO2 efflux and factors related to tree morphology and environment for seven live emergent trees of two dipterocarp species at four to seven heights of up to ∼ 40 m for each tree using ladders and a crane. No systematic tendencies in vertical variations were observed for all the trees. Wood CO2 efflux was not affected by stem and air temperature, stem diameter, stem height or stem growth. The ratios of wood CO2 efflux at the treetop to that at breast height were larger in emergent trees with relatively smaller diameters at breast height. Second, we compared whole-tree stem CO2 efflux estimates using vertical measurements with those based on solely breast height measurements. We found similar whole-tree stem CO2 efflux estimates regardless of the patterns of vertical variations in CO2 efflux because the surface area in the canopy, where wood CO2 efflux often differed from that at breast height, was very small compared with that at low stem heights, resulting in little effect of the vertical variations on the estimate. Additionally, whole-tree branch CO2 efflux estimates using measured wood CO2 efflux in the canopy were considerably different from those measured using only breast height measurements. Uncertainties in wood CO2 efflux in the canopy did not cause any bias in stem CO2 efflux scaling, but affected branch CO2 efflux. © The Author 2014. Published by Oxford University Press. All rights reserved.
Lorenz, Marco; Fürst, Christine; Thiel, Enrico
2013-09-01
Regarding increasing pressures by global societal and climate change, the assessment of the impact of land use and land management practices on land degradation and the related decrease in sustainable provision of ecosystem services gains increasing interest. Existing approaches to assess agricultural practices focus on the assessment of single crops or statistical data because spatially explicit information on practically applied crop rotations is mostly not available. This provokes considerable uncertainties in crop production models as regional specifics have to be neglected or cannot be considered in an appropriate way. In a case study in Saxony, we developed an approach to (i) derive representative regional crop rotations by combining different data sources and expert knowledge. This includes the integration of innovative crop sequences related to bio-energy production or organic farming and different soil tillage, soil management and soil protection techniques. Furthermore, (ii) we developed a regionalization approach for transferring crop rotations and related soil management strategies on the basis of statistical data and spatially explicit data taken from so called field blocks. These field blocks are the smallest spatial entity for which agricultural practices must be reported to apply for agricultural funding within the frame of the European Agricultural Fund for Rural Development (EAFRD) program. The information was finally integrated into the spatial decision support tool GISCAME to assess and visualize in spatially explicit manner the impact of alternative agricultural land use strategies on soil erosion risk and ecosystem services provision. Objective of this paper is to present the approach how to create spatially explicit information on agricultural management practices for a study area around Dresden, the capital of the German Federal State Saxony. Copyright © 2013 Elsevier Ltd. All rights reserved.
Black holes, hidden symmetries, and complete integrability
NASA Astrophysics Data System (ADS)
Frolov, Valeri P.; Krtouš, Pavel; Kubizňák, David
2017-11-01
The study of higher-dimensional black holes is a subject which has recently attracted vast interest. Perhaps one of the most surprising discoveries is a realization that the properties of higher-dimensional black holes with the spherical horizon topology and described by the Kerr-NUT-(A)dS metrics are very similar to the properties of the well known four-dimensional Kerr metric. This remarkable result stems from the existence of a single object called the principal tensor. In our review we discuss explicit and hidden symmetries of higher-dimensional Kerr-NUT-(A)dS black hole spacetimes. We start with discussion of the Killing and Killing-Yano objects representing explicit and hidden symmetries. We demonstrate that the principal tensor can be used as a "seed object" which generates all these symmetries. It determines the form of the geometry, as well as guarantees its remarkable properties, such as special algebraic type of the spacetime, complete integrability of geodesic motion, and separability of the Hamilton-Jacobi, Klein-Gordon, and Dirac equations. The review also contains a discussion of different applications of the developed formalism and its possible generalizations.
Black holes, hidden symmetries, and complete integrability.
Frolov, Valeri P; Krtouš, Pavel; Kubizňák, David
2017-01-01
The study of higher-dimensional black holes is a subject which has recently attracted vast interest. Perhaps one of the most surprising discoveries is a realization that the properties of higher-dimensional black holes with the spherical horizon topology and described by the Kerr-NUT-(A)dS metrics are very similar to the properties of the well known four-dimensional Kerr metric. This remarkable result stems from the existence of a single object called the principal tensor. In our review we discuss explicit and hidden symmetries of higher-dimensional Kerr-NUT-(A)dS black hole spacetimes. We start with discussion of the Killing and Killing-Yano objects representing explicit and hidden symmetries. We demonstrate that the principal tensor can be used as a "seed object" which generates all these symmetries. It determines the form of the geometry, as well as guarantees its remarkable properties, such as special algebraic type of the spacetime, complete integrability of geodesic motion, and separability of the Hamilton-Jacobi, Klein-Gordon, and Dirac equations. The review also contains a discussion of different applications of the developed formalism and its possible generalizations.
Matsuyama, Ryota; Lee, Hyojung; Yamaguchi, Takayuki; Tsuzuki, Shinya
2018-01-01
Background A Rohingya refugee camp in Cox’s Bazar, Bangladesh experienced a large-scale diphtheria epidemic in 2017. The background information of previously immune fraction among refugees cannot be explicitly estimated, and thus we conducted an uncertainty analysis of the basic reproduction number, R0. Methods A renewal process model was devised to estimate the R0 and ascertainment rate of cases, and loss of susceptible individuals was modeled as one minus the sum of initially immune fraction and the fraction naturally infected during the epidemic. To account for the uncertainty of initially immune fraction, we employed a Latin Hypercube sampling (LHS) method. Results R0 ranged from 4.7 to 14.8 with the median estimate at 7.2. R0 was positively correlated with ascertainment rates. Sensitivity analysis indicated that R0 would become smaller with greater variance of the generation time. Discussion Estimated R0 was broadly consistent with published estimate from endemic data, indicating that the vaccination coverage of 86% has to be satisfied to prevent the epidemic by means of mass vaccination. LHS was particularly useful in the setting of a refugee camp in which the background health status is poorly quantified. PMID:29629244
NICE technology appraisals: working with multiple levels of uncertainty and the potential for bias.
Brown, Patrick; Calnan, Michael
2013-05-01
One of the key roles of the English National Institute for Health and Clinical Excellence (NICE) is technology appraisal. This essentially involves evaluating the cost effectiveness of pharmaceutical products and other technologies for use within the National Health Service. Based on a content analysis of key documents which shed light on the nature of appraisals, this paper draws attention to the multiple layers of uncertainty and complexity which are latent within the appraisal process, and the often socially constructed mechanisms for tackling these. Epistemic assumptions, bounded rationality and more explicitly relational forms of managing knowledge are applied to this end. These findings are discussed in the context of the literature highlighting the inherently social process of regulation. A framework is developed which posits the various forms of uncertainty, and responses to these, as potential conduits of regulatory bias-in need of further research. That NICE's authority is itself regulated by other actors within the regulatory regime, particularly the pharmaceutical industry, exposes it to the threat of regulatory capture. Following Lehoux, it is concluded that a more transparent and reflexive format for technological appraisals is necessary. This would enable a more robust, defensible form of decision-making and moreover enable NICE to preserve its legitimacy in the midst of pressures which threaten this.
NASA Astrophysics Data System (ADS)
Dykema, J. A.; Anderson, J. G.
2014-12-01
Measuring water vapor at the highest spatial and temporal at all vertical levels and at arbitrary times requires strategic utilization of disparate observations from satellites, ground-based remote sensing, and in situ measurements. These different measurement types have different response times and very different spatial averaging properties, both horizontally and vertically. Accounting for these different measurement properties and explicit propagation of associated uncertainties is necessary to test particular scientific hypotheses, especially in cases of detection of weak signals in the presence of natural fluctuations, and for process studies with small ensembles. This is also true where ancillary data from meteorological analyses are required, which have their own sampling limitations and uncertainties. This study will review two investigations pertaining to measurements of water vapor in the mid-troposphere and lower stratosphere that mix satellite observations with observations from other sources. The focus of the mid-troposphere analysis is to obtain improved estimates of water vapor at the instant of a sounding satellite overpass. The lower stratosphere work examines the uncertainty inherent in a small ensemble of anomalously elevated lower stratospheric water vapor observations when meteorological analysis products and aircraft in situ observations are required for interpretation.
Adaptive management for ecosystem services.
Birgé, Hannah E; Allen, Craig R; Garmestani, Ahjond S; Pope, Kevin L
2016-12-01
Management of natural resources for the production of ecosystem services, which are vital for human well-being, is necessary even when there is uncertainty regarding system response to management action. This uncertainty is the result of incomplete controllability, complex internal feedbacks, and non-linearity that often interferes with desired management outcomes, and insufficient understanding of nature and people. Adaptive management was developed to reduce such uncertainty. We present a framework for the application of adaptive management for ecosystem services that explicitly accounts for cross-scale tradeoffs in the production of ecosystem services. Our framework focuses on identifying key spatiotemporal scales (plot, patch, ecosystem, landscape, and region) that encompass dominant structures and processes in the system, and includes within- and cross-scale dynamics, ecosystem service tradeoffs, and management controllability within and across scales. Resilience theory recognizes that a limited set of ecological processes in a given system regulate ecosystem services, yet our understanding of these processes is poorly understood. If management actions erode or remove these processes, the system may shift into an alternative state unlikely to support the production of desired services. Adaptive management provides a process to assess the underlying within and cross-scale tradeoffs associated with production of ecosystem services while proceeding with management designed to meet the demands of a growing human population. Copyright © 2016 Elsevier Ltd. All rights reserved.
Matsuyama, Ryota; Akhmetzhanov, Andrei R; Endo, Akira; Lee, Hyojung; Yamaguchi, Takayuki; Tsuzuki, Shinya; Nishiura, Hiroshi
2018-01-01
A Rohingya refugee camp in Cox's Bazar, Bangladesh experienced a large-scale diphtheria epidemic in 2017. The background information of previously immune fraction among refugees cannot be explicitly estimated, and thus we conducted an uncertainty analysis of the basic reproduction number, R 0 . A renewal process model was devised to estimate the R 0 and ascertainment rate of cases, and loss of susceptible individuals was modeled as one minus the sum of initially immune fraction and the fraction naturally infected during the epidemic. To account for the uncertainty of initially immune fraction, we employed a Latin Hypercube sampling (LHS) method. R 0 ranged from 4.7 to 14.8 with the median estimate at 7.2. R 0 was positively correlated with ascertainment rates. Sensitivity analysis indicated that R 0 would become smaller with greater variance of the generation time. Estimated R 0 was broadly consistent with published estimate from endemic data, indicating that the vaccination coverage of 86% has to be satisfied to prevent the epidemic by means of mass vaccination. LHS was particularly useful in the setting of a refugee camp in which the background health status is poorly quantified.
Achcar, Fiona; Barrett, Michael P; Breitling, Rainer
2013-09-01
Previous models of glycolysis in the sleeping sickness parasite Trypanosoma brucei assumed that the core part of glycolysis in this unicellular parasite is tightly compartimentalized within an organelle, the glycosome, which had previously been shown to contain most of the glycolytic enzymes. The glycosomes were assumed to be largely impermeable, and exchange of metabolites between the cytosol and the glycosome was assumed to be regulated by specific transporters in the glycosomal membrane. This tight compartmentalization was considered to be essential for parasite viability. Recently, size-specific metabolite pores were discovered in the membrane of glycosomes. These channels are proposed to allow smaller metabolites to diffuse across the membrane but not larger ones. In light of this new finding, we re-analyzed the model taking into account uncertainty about the topology of the metabolic system in T. brucei, as well as uncertainty about the values of all parameters of individual enzymatic reactions. Our analysis shows that these newly-discovered nonspecific pores are not necessarily incompatible with our current knowledge of the glycosomal metabolic system, provided that the known cytosolic activities of the glycosomal enzymes play an important role in the regulation of glycolytic fluxes and the concentration of metabolic intermediates of the pathway. © 2013 FEBS.
Acoustic holography as a metrological tool for characterizing medical ultrasound sources and fields
Sapozhnikov, Oleg A.; Tsysar, Sergey A.; Khokhlova, Vera A.; Kreider, Wayne
2015-01-01
Acoustic holography is a powerful technique for characterizing ultrasound sources and the fields they radiate, with the ability to quantify source vibrations and reduce the number of required measurements. These capabilities are increasingly appealing for meeting measurement standards in medical ultrasound; however, associated uncertainties have not been investigated systematically. Here errors associated with holographic representations of a linear, continuous-wave ultrasound field are studied. To facilitate the analysis, error metrics are defined explicitly, and a detailed description of a holography formulation based on the Rayleigh integral is provided. Errors are evaluated both for simulations of a typical therapeutic ultrasound source and for physical experiments with three different ultrasound sources. Simulated experiments explore sampling errors introduced by the use of a finite number of measurements, geometric uncertainties in the actual positions of acquired measurements, and uncertainties in the properties of the propagation medium. Results demonstrate the theoretical feasibility of keeping errors less than about 1%. Typical errors in physical experiments were somewhat larger, on the order of a few percent; comparison with simulations provides specific guidelines for improving the experimental implementation to reduce these errors. Overall, results suggest that holography can be implemented successfully as a metrological tool with small, quantifiable errors. PMID:26428789
Probabilistic framework for assessing the ice sheet contribution to sea level change.
Little, Christopher M; Urban, Nathan M; Oppenheimer, Michael
2013-02-26
Previous sea level rise (SLR) assessments have excluded the potential for dynamic ice loss over much of Greenland and Antarctica, and recently proposed "upper bounds" on Antarctica's 21st-century SLR contribution are derived principally from regions where present-day mass loss is concentrated (basin 15, or B15, drained largely by Pine Island, Thwaites, and Smith glaciers). Here, we present a probabilistic framework for assessing the ice sheet contribution to sea level change that explicitly accounts for mass balance uncertainty over an entire ice sheet. Applying this framework to Antarctica, we find that ongoing mass imbalances in non-B15 basins give an SLR contribution by 2100 that: (i) is comparable to projected changes in B15 discharge and Antarctica's surface mass balance, and (ii) varies widely depending on the subset of basins and observational dataset used in projections. Increases in discharge uncertainty, or decreases in the exceedance probability used to define an upper bound, increase the fractional contribution of non-B15 basins; even weak spatial correlations in future discharge growth rates markedly enhance this sensitivity. Although these projections rely on poorly constrained statistical parameters, they may be updated with observations and/or models at many spatial scales, facilitating a more comprehensive account of uncertainty that, if implemented, will improve future assessments.
Li, Zhijun; Su, Chun-Yi
2013-09-01
In this paper, adaptive neural network control is investigated for single-master-multiple-slaves teleoperation in consideration of time delays and input dead-zone uncertainties for multiple mobile manipulators carrying a common object in a cooperative manner. Firstly, concise dynamics of teleoperation systems consisting of a single master robot, multiple coordinated slave robots, and the object are developed in the task space. To handle asymmetric time-varying delays in communication channels and unknown asymmetric input dead zones, the nonlinear dynamics of the teleoperation system are transformed into two subsystems through feedback linearization: local master or slave dynamics including the unknown input dead zones and delayed dynamics for the purpose of synchronization. Then, a model reference neural network control strategy based on linear matrix inequalities (LMI) and adaptive techniques is proposed. The developed control approach ensures that the defined tracking errors converge to zero whereas the coordination internal force errors remain bounded and can be made arbitrarily small. Throughout this paper, stability analysis is performed via explicit Lyapunov techniques under specific LMI conditions. The proposed adaptive neural network control scheme is robust against motion disturbances, parametric uncertainties, time-varying delays, and input dead zones, which is validated by simulation studies.
Adaptive restoration of river terrace vegetation through iterative experiments
Dela Cruz, Michelle P.; Beauchamp, Vanessa B.; Shafroth, Patrick B.; Decker, Cheryl E.; O’Neil, Aviva
2014-01-01
Restoration projects can involve a high degree of uncertainty and risk, which can ultimately result in failure. An adaptive restoration approach can reduce uncertainty through controlled, replicated experiments designed to test specific hypotheses and alternative management approaches. Key components of adaptive restoration include willingness of project managers to accept the risk inherent in experimentation, interest of researchers, availability of funding for experimentation and monitoring, and ability to restore sites as iterative experiments where results from early efforts can inform the design of later phases. This paper highlights an ongoing adaptive restoration project at Zion National Park (ZNP), aimed at reducing the cover of exotic annual Bromus on riparian terraces, and revegetating these areas with native plant species. Rather than using a trial-and-error approach, ZNP staff partnered with academic, government, and private-sector collaborators to conduct small-scale experiments to explicitly address uncertainties concerning biomass removal of annual bromes, herbicide application rates and timing, and effective seeding methods for native species. Adaptive restoration has succeeded at ZNP because managers accept the risk inherent in experimentation and ZNP personnel are committed to continue these projects over a several-year period. Techniques that result in exotic annual Bromus removal and restoration of native plant species at ZNP can be used as a starting point for adaptive restoration projects elsewhere in the region.
RUIZ-RAMOS, MARGARITA; MÍNGUEZ, M. INÉS
2006-01-01
• Background Plant structural (i.e. architectural) models explicitly describe plant morphology by providing detailed descriptions of the display of leaf and stem surfaces within heterogeneous canopies and thus provide the opportunity for modelling the functioning of plant organs in their microenvironments. The outcome is a class of structural–functional crop models that combines advantages of current structural and process approaches to crop modelling. ALAMEDA is such a model. • Methods The formalism of Lindenmayer systems (L-systems) was chosen for the development of a structural model of the faba bean canopy, providing both numerical and dynamic graphical outputs. It was parameterized according to the results obtained through detailed morphological and phenological descriptions that capture the detailed geometry and topology of the crop. The analysis distinguishes between relationships of general application for all sowing dates and stem ranks and others valid only for all stems of a single crop cycle. • Results and Conclusions The results reveal that in faba bean, structural parameterization valid for the entire plant may be drawn from a single stem. ALAMEDA was formed by linking the structural model to the growth model ‘Simulation d'Allongement des Feuilles’ (SAF) with the ability to simulate approx. 3500 crop organs and components of a group of nine plants. Model performance was verified for organ length, plant height and leaf area. The L-system formalism was able to capture the complex architecture of canopy leaf area of this indeterminate crop and, with the growth relationships, generate a 3D dynamic crop simulation. Future development and improvement of the model are discussed. PMID:16390842
NASA Astrophysics Data System (ADS)
Subramanian, Aneesh C.; Palmer, Tim N.
2017-06-01
Stochastic schemes to represent model uncertainty in the European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble prediction system has helped improve its probabilistic forecast skill over the past decade by both improving its reliability and reducing the ensemble mean error. The largest uncertainties in the model arise from the model physics parameterizations. In the tropics, the parameterization of moist convection presents a major challenge for the accurate prediction of weather and climate. Superparameterization is a promising alternative strategy for including the effects of moist convection through explicit turbulent fluxes calculated from a cloud-resolving model (CRM) embedded within a global climate model (GCM). In this paper, we compare the impact of initial random perturbations in embedded CRMs, within the ECMWF ensemble prediction system, with stochastically perturbed physical tendency (SPPT) scheme as a way to represent model uncertainty in medium-range tropical weather forecasts. We especially focus on forecasts of tropical convection and dynamics during MJO events in October-November 2011. These are well-studied events for MJO dynamics as they were also heavily observed during the DYNAMO field campaign. We show that a multiscale ensemble modeling approach helps improve forecasts of certain aspects of tropical convection during the MJO events, while it also tends to deteriorate certain large-scale dynamic fields with respect to stochastically perturbed physical tendencies approach that is used operationally at ECMWF.
Are models, uncertainty, and dispute resolution compatible?
NASA Astrophysics Data System (ADS)
Anderson, J. D.; Wilson, J. L.
2013-12-01
Models and their uncertainty often move from an objective use in planning and decision making into the regulatory environment, then sometimes on to dispute resolution through litigation or other legal forums. Through this last transition whatever objectivity the models and uncertainty assessment may have once possessed becomes biased (or more biased) as each party chooses to exaggerate either the goodness of a model, or its worthlessness, depending on which view is in its best interest. If worthlessness is desired, then what was uncertain becomes unknown, or even unknowable. If goodness is desired, then precision and accuracy are often exaggerated and uncertainty, if it is explicitly recognized, encompasses only some parameters or conceptual issues, ignores others, and may minimize the uncertainty that it accounts for. In dispute resolution, how well is the adversarial process able to deal with these biases? The challenge is that they are often cloaked in computer graphics and animations that appear to lend realism to what could be mostly fancy, or even a manufactured outcome. While junk science can be challenged through appropriate motions in federal court, and in most state courts, it not unusual for biased or even incorrect modeling results, or conclusions based on incorrect results, to be permitted to be presented at trial. Courts allow opinions that are based on a "reasonable degree of scientific certainty," but when that 'certainty' is grossly exaggerated by an expert, one way or the other, how well do the courts determine that someone has stepped over the line? Trials are based on the adversary system of justice, so opposing and often irreconcilable views are commonly allowed, leaving it to the judge or jury to sort out the truth. Can advances in scientific theory and engineering practice, related to both modeling and uncertainty, help address this situation and better ensure that juries and judges see more objective modeling results, or at least see measures of those results that help to expose biases?
Bio-physical vs. Economic Uncertainty in the Analysis of Climate Change Impacts on World Agriculture
NASA Astrophysics Data System (ADS)
Hertel, T. W.; Lobell, D. B.
2010-12-01
Accumulating evidence suggests that agricultural production could be greatly affected by climate change, but there remains little quantitative understanding of how these agricultural impacts would affect economic livelihoods in poor countries. The recent paper by Hertel, Burke and Lobell (GEC, 2010) considers three scenarios of agricultural impacts of climate change, corresponding to the fifth, fiftieth, and ninety fifth percentiles of projected yield distributions for the world’s crops in 2030. They evaluate the resulting changes in global commodity prices, national economic welfare, and the incidence of poverty in a set of 15 developing countries. Although the small price changes under the medium scenario are consistent with previous findings, their low productivity scenario reveals the potential for much larger food price changes than reported in recent studies which have hitherto focused on the most likely outcomes. The poverty impacts of price changes under the extremely adverse scenario are quite heterogeneous and very significant in some population strata. They conclude that it is critical to look beyond central case climate shocks and beyond a simple focus on yields and highly aggregated poverty impacts. In this paper, we conduct a more formal, systematic sensitivity analysis (SSA) with respect to uncertainty in the biophysical impacts of climate change on agriculture, by explicitly specifying joint distributions for global yield changes - this time focusing on 2050. This permits us to place confidence intervals on the resulting price impacts and poverty results which reflect the uncertainty inherited from the biophysical side of the analysis. We contrast this with the economic uncertainty inherited from the global general equilibrium model (GTAP), by undertaking SSA with respect to the behavioral parameters in that model. This permits us to assess which type of uncertainty is more important for regional price and poverty outcomes. Finally, we undertake a combined SSA, wherein climate change-induced productivity shocks are permitted to interact with the uncertain economic parameters. This permits us to examine potential interactions between the two sources of uncertainty.
Hess, Jeremy J.; Ebi, Kristie L.; Markandya, Anil; Balbus, John M.; Wilkinson, Paul; Haines, Andy; Chalabi, Zaid
2014-01-01
Background: Policy decisions regarding climate change mitigation are increasingly incorporating the beneficial and adverse health impacts of greenhouse gas emission reduction strategies. Studies of such co-benefits and co-harms involve modeling approaches requiring a range of analytic decisions that affect the model output. Objective: Our objective was to assess analytic decisions regarding model framework, structure, choice of parameters, and handling of uncertainty when modeling health co-benefits, and to make recommendations for improvements that could increase policy uptake. Methods: We describe the assumptions and analytic decisions underlying models of mitigation co-benefits, examining their effects on modeling outputs, and consider tools for quantifying uncertainty. Discussion: There is considerable variation in approaches to valuation metrics, discounting methods, uncertainty characterization and propagation, and assessment of low-probability/high-impact events. There is also variable inclusion of adverse impacts of mitigation policies, and limited extension of modeling domains to include implementation considerations. Going forward, co-benefits modeling efforts should be carried out in collaboration with policy makers; these efforts should include the full range of positive and negative impacts and critical uncertainties, as well as a range of discount rates, and should explicitly characterize uncertainty. We make recommendations to improve the rigor and consistency of modeling of health co-benefits. Conclusion: Modeling health co-benefits requires systematic consideration of the suitability of model assumptions, of what should be included and excluded from the model framework, and how uncertainty should be treated. Increased attention to these and other analytic decisions has the potential to increase the policy relevance and application of co-benefits modeling studies, potentially helping policy makers to maximize mitigation potential while simultaneously improving health. Citation: Remais JV, Hess JJ, Ebi KL, Markandya A, Balbus JM, Wilkinson P, Haines A, Chalabi Z. 2014. Estimating the health effects of greenhouse gas mitigation strategies: addressing parametric, model, and valuation challenges. Environ Health Perspect 122:447–455; http://dx.doi.org/10.1289/ehp.1306744 PMID:24583270
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bixler, Nathan E.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie
2014-02-01
This paper describes the convergence of MELCOR Accident Consequence Code System, Version 2 (MACCS2) probabilistic results of offsite consequences for the uncertainty analysis of the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout scenario at the Peach Bottom Atomic Power Station. The consequence metrics evaluated are individual latent-cancer fatality (LCF) risk and individual early fatality risk. Consequence results are presented as conditional risk (i.e., assuming the accident occurs, risk per event) to individuals of the public as a result of the accident. In order to verify convergence for this uncertainty analysis, as recommended by the Nuclear Regulatory Commission’s Advisorymore » Committee on Reactor Safeguards, a ‘high’ source term from the original population of Monte Carlo runs has been selected to be used for: (1) a study of the distribution of consequence results stemming solely from epistemic uncertainty in the MACCS2 parameters (i.e., separating the effect from the source term uncertainty), and (2) a comparison between Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) in order to validate the original results obtained with LHS. Three replicates (each using a different random seed) of size 1,000 each using LHS and another set of three replicates of size 1,000 using SRS are analyzed. The results show that the LCF risk results are well converged with either LHS or SRS sampling. The early fatality risk results are less well converged at radial distances beyond 2 miles, and this is expected due to the sparse data (predominance of “zero” results).« less
Adaptive management of large aquatic ecosystem recovery programs in the United States.
Thom, Ronald; St Clair, Tom; Burns, Rebecca; Anderson, Michael
2016-12-01
Adaptive management (AM) is being employed in a number of programs in the United States to guide actions to restore aquatic ecosystems because these programs are both expensive and are faced with significant uncertainties. Many of these uncertainties are associated with prioritizing when, where, and what kind of actions are needed to meet the objectives of enhancing ecosystem services and recovering threatened and endangered species. We interviewed nine large-scale aquatic ecosystem restoration programs across the United States to document the lessons learned from implementing AM. In addition, we recorded information on ecological drivers (e.g., endangered fish species) for the program, and inferred how these drivers reflected more generic ecosystem services. Ecosystem services (e.g., genetic diversity, cultural heritage), albeit not explicit drivers, were either important to the recovery or enhancement of the drivers, or were additional benefits associated with actions to recover or enhance the program drivers. Implementing programs using AM lessons learned has apparently helped achieve better results regarding enhancing ecosystem services and restoring target species populations. The interviews yielded several recommendations. The science and AM program must be integrated into how the overall restoration program operates in order to gain understanding and support, and effectively inform management decision-making. Governance and decision-making varied based on its particular circumstances. Open communication within and among agency and stakeholder groups and extensive vetting lead up to decisions. It was important to have an internal agency staff member to implement the AM plan, and a clear designation of roles and responsibilities, and long-term commitment of other involved parties. The most important management questions and information needs must be identified up front. It was imperative to clearly identify, link and continually reinforce the essential components of an AM plan, including objectives, constraints, uncertainties, hypotheses, management actions, decision criteria and triggers, monitoring, and research. Some employed predictive models and the results of research on uncertainties to vet options for actions. Many relied on best available science and professional judgment to decide if adjustments to actions were needed. All programs emphasized the need to be nimble enough to be responsive to new information and make necessary adjustments to management action implementation. We recommend that ecosystem services be explicit drivers of restoration programs to facilitate needed funding and communicate to the general public and with the global efforts on restoring and conserving ecosystems. Copyright © 2016 Elsevier Ltd. All rights reserved.
A Bayesian Hierarchical Modeling Approach to Predicting Flow in Ungauged Basins
NASA Astrophysics Data System (ADS)
Gronewold, A.; Alameddine, I.; Anderson, R. M.
2009-12-01
Recent innovative approaches to identifying and applying regression-based relationships between land use patterns (such as increasing impervious surface area and decreasing vegetative cover) and rainfall-runoff model parameters represent novel and promising improvements to predicting flow from ungauged basins. In particular, these approaches allow for predicting flows under uncertain and potentially variable future conditions due to rapid land cover changes, variable climate conditions, and other factors. Despite the broad range of literature on estimating rainfall-runoff model parameters, however, the absence of a robust set of modeling tools for identifying and quantifying uncertainties in (and correlation between) rainfall-runoff model parameters represents a significant gap in current hydrological modeling research. Here, we build upon a series of recent publications promoting novel Bayesian and probabilistic modeling strategies for quantifying rainfall-runoff model parameter estimation uncertainty. Our approach applies alternative measures of rainfall-runoff model parameter joint likelihood (including Nash-Sutcliffe efficiency, among others) to simulate samples from the joint parameter posterior probability density function. We then use these correlated samples as response variables in a Bayesian hierarchical model with land use coverage data as predictor variables in order to develop a robust land use-based tool for forecasting flow in ungauged basins while accounting for, and explicitly acknowledging, parameter estimation uncertainty. We apply this modeling strategy to low-relief coastal watersheds of Eastern North Carolina, an area representative of coastal resource waters throughout the world because of its sensitive embayments and because of the abundant (but currently threatened) natural resources it hosts. Consequently, this area is the subject of several ongoing studies and large-scale planning initiatives, including those conducted through the United States Environmental Protection Agency (USEPA) total maximum daily load (TMDL) program, as well as those addressing coastal population dynamics and sea level rise. Our approach has several advantages, including the propagation of parameter uncertainty through a nonparametric probability distribution which avoids common pitfalls of fitting parameters and model error structure to a predetermined parametric distribution function. In addition, by explicitly acknowledging correlation between model parameters (and reflecting those correlations in our predictive model) our model yields relatively efficient prediction intervals (unlike those in the current literature which are often unnecessarily large, and may lead to overly-conservative management actions). Finally, our model helps improve understanding of the rainfall-runoff process by identifying model parameters (and associated catchment attributes) which are most sensitive to current and future land use change patterns. Disclaimer: Although this work was reviewed by EPA and approved for publication, it may not necessarily reflect official Agency policy.
NASA Astrophysics Data System (ADS)
Warner, Thomas T.; Sheu, Rong-Shyang; Bowers, James F.; Sykes, R. Ian; Dodd, Gregory C.; Henn, Douglas S.
2002-05-01
Ensemble simulations made using a coupled atmospheric dynamic model and a probabilistic Lagrangian puff dispersion model were employed in a forensic analysis of the transport and dispersion of a toxic gas that may have been released near Al Muthanna, Iraq, during the Gulf War. The ensemble study had two objectives, the first of which was to determine the sensitivity of the calculated dosage fields to the choices that must be made about the configuration of the atmospheric dynamic model. In this test, various choices were used for model physics representations and for the large-scale analyses that were used to construct the model initial and boundary conditions. The second study objective was to examine the dispersion model's ability to use ensemble inputs to predict dosage probability distributions. Here, the dispersion model was used with the ensemble mean fields from the individual atmospheric dynamic model runs, including the variability in the individual wind fields, to generate dosage probabilities. These are compared with the explicit dosage probabilities derived from the individual runs of the coupled modeling system. The results demonstrate that the specific choices made about the dynamic-model configuration and the large-scale analyses can have a large impact on the simulated dosages. For example, the area near the source that is exposed to a selected dosage threshold varies by up to a factor of 4 among members of the ensemble. The agreement between the explicit and ensemble dosage probabilities is relatively good for both low and high dosage levels. Although only one ensemble was considered in this study, the encouraging results suggest that a probabilistic dispersion model may be of value in quantifying the effects of uncertainties in a dynamic-model ensemble on dispersion model predictions of atmospheric transport and dispersion.
A Machine Learning Approach to Predicted Bathymetry
NASA Astrophysics Data System (ADS)
Wood, W. T.; Elmore, P. A.; Petry, F.
2017-12-01
Recent and on-going efforts have shown how machine learning (ML) techniques, incorporating more, and more disparate data than can be interpreted manually, can predict seafloor properties, with uncertainty, where they have not been measured directly. We examine here a ML approach to predicted bathymetry. Our approach employs a paradigm of global bathymetry as an integral component of global geology. From a marine geology and geophysics perspective the bathymetry is the thickness of one layer in an ensemble of layers that inter-relate to varying extents vertically and geospatially. The nature of the multidimensional relationships in these layers between bathymetry, gravity, magnetic field, age, and many other global measures is typically geospatially dependent and non-linear. The advantage of using ML is that these relationships need not be stated explicitly, nor do they need to be approximated with a transfer function - the machine learns them via the data. Fundamentally, ML operates by brute-force searching for multidimensional correlations between desired, but sparsely known data values (in this case water depth), and a multitude of (geologic) predictors. Predictors include quantities known extensively such as remotely sensed measurements (i.e. gravity and magnetics), distance from spreading ridge, trench etc., (and spatial statistics based on these quantities). Estimating bathymetry from an approximate transfer function is inherently model, as well as data limited - complex relationships are explicitly ruled out. The ML is a purely data-driven approach, so only the extent and quality of the available observations limit prediction accuracy. This allows for a system in which new data, of a wide variety of types, can be quickly and easily assimilated into updated bathymetry predictions with quantitative posterior uncertainties.
Zipkin, Elise F; Grant, Evan H Campbell; Fagan, William F
2012-10-01
The ability to accurately predict patterns of species' occurrences is fundamental to the successful management of animal communities. To determine optimal management strategies, it is essential to understand species-habitat relationships and how species habitat use is related to natural or human-induced environmental changes. Using five years of monitoring data in the Chesapeake and Ohio Canal National Historical Park, Maryland, USA, we developed four multispecies hierarchical models for estimating amphibian wetland use that account for imperfect detection during sampling. The models were designed to determine which factors (wetland habitat characteristics, annual trend effects, spring/summer precipitation, and previous wetland occupancy) were most important for predicting future habitat use. We used the models to make predictions about species occurrences in sampled and unsampled wetlands and evaluated model projections using additional data. Using a Bayesian approach, we calculated a posterior distribution of receiver operating characteristic area under the curve (ROC AUC) values, which allowed us to explicitly quantify the uncertainty in the quality of our predictions and to account for false negatives in the evaluation data set. We found that wetland hydroperiod (the length of time that a wetland holds water), as well as the occurrence state in the prior year, were generally the most important factors in determining occupancy. The model with habitat-only covariates predicted species occurrences well; however, knowledge of wetland use in the previous year significantly improved predictive ability at the community level and for two of 12 species/species complexes. Our results demonstrate the utility of multispecies models for understanding which factors affect species habitat use of an entire community (of species) and provide an improved methodology using AUC that is helpful for quantifying the uncertainty in model predictions while explicitly accounting for detection biases.
Zipkin, Elise F.; Grant, Evan H. Campbell; Fagan, William F.
2012-01-01
The ability to accurately predict patterns of species' occurrences is fundamental to the successful management of animal communities. To determine optimal management strategies, it is essential to understand species-habitat relationships and how species habitat use is related to natural or human-induced environmental changes. Using five years of monitoring data in the Chesapeake and Ohio Canal National Historical Park, Maryland, USA, we developed four multi-species hierarchical models for estimating amphibian wetland use that account for imperfect detection during sampling. The models were designed to determine which factors (wetland habitat characteristics, annual trend effects, spring/summer precipitation, and previous wetland occupancy) were most important for predicting future habitat use. We used the models to make predictions of species occurrences in sampled and unsampled wetlands and evaluated model projections using additional data. Using a Bayesian approach, we calculated a posterior distribution of receiver operating characteristic area under the curve (ROC AUC) values, which allowed us to explicitly quantify the uncertainty in the quality of our predictions and to account for false negatives in the evaluation dataset. We found that wetland hydroperiod (the length of time that a wetland holds water) as well as the occurrence state in the prior year were generally the most important factors in determining occupancy. The model with only habitat covariates predicted species occurrences well; however, knowledge of wetland use in the previous year significantly improved predictive ability at the community level and for two of 12 species/species complexes. Our results demonstrate the utility of multi-species models for understanding which factors affect species habitat use of an entire community (of species) and provide an improved methodology using AUC that is helpful for quantifying the uncertainty in model predictions while explicitly accounting for detection biases.
On the nature of implicit soul beliefs: when the past weighs more than the present.
Anglin, Stephanie M
2015-06-01
Intuitive childhood beliefs in dualism may lay the foundation for implicit soul and afterlife beliefs, which may diverge from explicit beliefs formed later in adulthood. Brief Implicit Association Tests were developed to investigate the relation of implicit soul and afterlife beliefs to childhood and current beliefs. Early but not current beliefs covaried with implicit beliefs. Results demonstrated greater discrepancies in current than in childhood soul and afterlife beliefs among religious groups, and no differences in implicit beliefs. These findings suggest that implicit soul and afterlife beliefs diverge from current self-reported beliefs, stemming instead from childhood beliefs. © 2014 The British Psychological Society.
Wirth, Christian; Schumacher, Jens; Schulze, Ernst-Detlef
2004-02-01
To facilitate future carbon and nutrient inventories, we used mixed-effect linear models to develop new generic biomass functions for Norway spruce (Picea abies (L.) Karst.) in Central Europe. We present both the functions and their respective variance-covariance matrices and illustrate their application for biomass prediction and uncertainty estimation for Norway spruce trees ranging widely in size, age, competitive status and site. We collected biomass data for 688 trees sampled in 102 stands by 19 authors. The total number of trees in the "base" model data sets containing the predictor variables diameter at breast height (D), height (H), age (A), site index (SI) and site elevation (HSL) varied according to compartment (roots: n = 114, stem: n = 235, dry branches: n = 207, live branches: n = 429 and needles: n = 551). "Core" data sets with about 40% fewer trees could be extracted containing the additional predictor variables crown length and social class. A set of 43 candidate models representing combinations of lnD, lnH, lnA, SI and HSL, including second-order polynomials and interactions, was established. The categorical variable "author" subsuming mainly methodological differences was included as a random effect in a mixed linear model. The Akaike Information Criterion was used for model selection. The best models for stem, root and branch biomass contained only combinations of D, H and A as predictors. More complex models that included site-related variables resulted for needle biomass. Adding crown length as a predictor for needles, branches and roots reduced both the bias and the confidence interval of predictions substantially. Applying the best models to a test data set of 17 stands ranging in age from 16 to 172 years produced realistic allocation patterns at the tree and stand levels. The 95% confidence intervals (% of mean prediction) were highest for crown compartments (approximately +/- 12%) and lowest for stem biomass (approximately +/- 5%), and within each compartment, they were highest for the youngest and oldest stands, respectively.
Implicitly Coordinated Detect and Avoid Capability for Safe Autonomous Operation of Small UAS
NASA Technical Reports Server (NTRS)
Balachandran, Swee; Munoz, Cesar A.; Consiglio, Maria C.
2017-01-01
As the airspace becomes increasingly shared by autonomous small Unmanned Aerial Systems (UAS), there would be a pressing need for coordination strategies so that aircraft can safely and independently maneuver around obstacles, geofences, and traffic aircraft. Explicitly coordinating resolution strategies for small UAS would require additional components such as a reliable vehicle-to-vehicle communication infrastructure and standardized protocols for information exchange that could significantly increase the cost of deploying small UAS in a shared airspace. This paper explores a novel approach that enables multiple aircraft to implicitly coordinate their resolution maneuvers. By requiring all aircraft to execute the proposed approach deterministically, it is possible for all of them to implicitly agree on the region of airspace each will be occupying in a given time interval. The proposed approach lends itself to the construction of a suitable feedback mechanism that enables the real-time execution of an implicitly conflict-free path in a closed-loop manner dealing with uncertainties in aircraft speed. If a network infrastructure is available, the proposed approach can also exploit the benefits of explicit information.
Improved Satellite-based Crop Yield Mapping by Spatially Explicit Parameterization of Crop Phenology
NASA Astrophysics Data System (ADS)
Jin, Z.; Azzari, G.; Lobell, D. B.
2016-12-01
Field-scale mapping of crop yields with satellite data often relies on the use of crop simulation models. However, these approaches can be hampered by inaccuracies in the simulation of crop phenology. Here we present and test an approach to use dense time series of Landsat 7 and 8 acquisitions data to calibrate various parameters related to crop phenology simulation, such as leaf number and leaf appearance rates. These parameters are then mapped across the Midwestern United States for maize and soybean, and for two different simulation models. We then implement our recently developed Scalable satellite-based Crop Yield Mapper (SCYM) with simulations reflecting the improved phenology parameterizations, and compare to prior estimates based on default phenology routines. Our preliminary results show that the proposed method can effectively alleviate the underestimation of early-season LAI by the default Agricultural Production Systems sIMulator (APSIM), and that spatially explicit parameterization for the phenology model substantially improves the SCYM performance in capturing the spatiotemporal variation in maize and soybean yield. The scheme presented in our study thus preserves the scalability of SCYM, while significantly reducing its uncertainty.
Efficient Robust Optimization of Metal Forming Processes using a Sequential Metamodel Based Strategy
NASA Astrophysics Data System (ADS)
Wiebenga, J. H.; Klaseboer, G.; van den Boogaard, A. H.
2011-08-01
The coupling of Finite Element (FE) simulations to mathematical optimization techniques has contributed significantly to product improvements and cost reductions in the metal forming industries. The next challenge is to bridge the gap between deterministic optimization techniques and the industrial need for robustness. This paper introduces a new and generally applicable structured methodology for modeling and solving robust optimization problems. Stochastic design variables or noise variables are taken into account explicitly in the optimization procedure. The metamodel-based strategy is combined with a sequential improvement algorithm to efficiently increase the accuracy of the objective function prediction. This is only done at regions of interest containing the optimal robust design. Application of the methodology to an industrial V-bending process resulted in valuable process insights and an improved robust process design. Moreover, a significant improvement of the robustness (>2σ) was obtained by minimizing the deteriorating effects of several noise variables. The robust optimization results demonstrate the general applicability of the robust optimization strategy and underline the importance of including uncertainty and robustness explicitly in the numerical optimization procedure.
The Impact of Aerosol Microphysical Representation in Models on the Direct Radiative Effect
NASA Astrophysics Data System (ADS)
Ridley, D. A.; Heald, C. L.
2017-12-01
Aerosol impacts the radiative balance of the atmosphere both directly and indirectly. There is considerable uncertainty remaining in the aerosol direct radiative effect (DRE), hampering understanding of the present magnitude of anthropogenic aerosol forcing and how future changes in aerosol loading will influence climate. Computationally expensive explicit aerosol microphysics are usually reserved for modelling of the aerosol indirect radiative effects that depend upon aerosol particle number. However, the direct radiative effects of aerosol are also strongly dependent upon the aerosol size distribution, especially particles between 0.2µm - 2µm diameter. In this work, we use a consistent model framework and consistent emissions to explore the impact of prescribed size distributions (bulk scheme) relative to explicit microphysics (sectional scheme) on the aerosol radiative properties. We consider the difference in aerosol burden, water uptake, and extinction efficiency resulting from the two representations, highlighting when and where the bulk and sectional schemes diverge significantly in their estimates of the DRE. Finally, we evaluate the modelled size distributions using in-situ measurements over a range of regimes to provide constraints on both the accumulation and coarse aerosol sizes.
NASA Astrophysics Data System (ADS)
Ragon, T.; Sladen, A.; Bletery, Q.; Simons, M.; Magnoni, F.; Avallone, A.; Cavalié, O.; Vergnolle, M.
2016-12-01
Despite the diversity of available data for the Mw 6.1 2009 earthquake in L'Aquila, Italy, published finite fault slip models are surprisingly different. For instance, the amplitude of the maximum coseismic slip patch varies from 80cm to 225cm, and its depth oscillates between 5 and 15km. Discrepancies between proposed source parameters are believed to result from three sources: observational uncertainties, epistemic uncertainties, and the inherent non-uniqueness of inverse problems. We explore the whole solution space of fault-slip models compatible with the data within the range of both observational and epistemic uncertainties by performing a fully Bayesian analysis. In this initial stage, we restrict our analysis to the static problem.In terms of observation uncertainty, we must take into account the difference in time span associated with the different data types: InSAR images provide excellent spatial coverage but usually correspond to a period of a few days to weeks after the mainshock and can thus be potentially biased by significant afterslip. Continuous GPS stations do not have the same shortcoming, but in contrast do not have the desired spatial coverage near the fault. In the case of the L'Aquila earthquake, InSAR images include a minimum of 6 days of afterslip. Here, we explicitly account for these different time windows in the inversion by jointly inverting for coseismic and post-seismic fault slip. Regarding epistemic or modeling uncertainties, we focus on the impact of uncertain fault geometry and elastic structure. Modeling errors, which result from inaccurate model predictions and are generally neglected, are estimated for both earth model and fault geometry as non-diagonal covariance matrices. The L'Aquila earthquake is particularly suited to investigation of these effects given the availability of a detailed aftershock catalog and 3D velocity models. This work aims at improving our knowledge of the L'Aquila earthquake as well as at providing a more general perspective on which uncertainties are the most critical in finite-fault source studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bogen, K T
2007-01-30
As reflected in the 2005 USEPA Guidelines for Cancer Risk Assessment, some chemical carcinogens may have a site-specific mode of action (MOA) that is dual, involving mutation in addition to cell-killing induced hyperplasia. Although genotoxicity may contribute to increased risk at all doses, the Guidelines imply that for dual MOA (DMOA) carcinogens, judgment be used to compare and assess results obtained using separate ''linear'' (genotoxic) vs. ''nonlinear'' (nongenotoxic) approaches to low-level risk extrapolation. However, the Guidelines allow the latter approach to be used only when evidence is sufficient to parameterize a biologically based model that reliably extrapolates risk to lowmore » levels of concern. The Guidelines thus effectively prevent MOA uncertainty from being characterized and addressed when data are insufficient to parameterize such a model, but otherwise clearly support a DMOA. A bounding factor approach--similar to that used in reference dose procedures for classic toxicity endpoints--can address MOA uncertainty in a way that avoids explicit modeling of low-dose risk as a function of administered or internal dose. Even when a ''nonlinear'' toxicokinetic model cannot be fully validated, implications of DMOA uncertainty on low-dose risk may be bounded with reasonable confidence when target tumor types happen to be extremely rare. This concept was illustrated for the rodent carcinogen naphthalene. Bioassay data, supplemental toxicokinetic data, and related physiologically based pharmacokinetic and 2-stage stochastic carcinogenesis modeling results all clearly indicate that naphthalene is a DMOA carcinogen. Plausibility bounds on rat-tumor-type specific DMOA-related uncertainty were obtained using a 2-stage model adapted to reflect the empirical link between genotoxic and cytotoxic effects of the most potent identified genotoxic naphthalene metabolites, 1,2- and 1,4-naphthoquinone. Resulting bounds each provided the basis for a corresponding ''uncertainty'' factor <1 appropriate to apply to estimates of naphthalene risk obtained by linear extrapolation under a default genotoxic MOA assumption. This procedure is proposed as scientifically credible method to address MOA uncertainty for DMOA carcinogens.« less
NASA Technical Reports Server (NTRS)
Lee, Eunjee; Koster, Randal D.; Ott, Lesley E.; Weir, Brad; Mahanama, Sarith; Chang, Yehui; Zeng, Fan-Wei
2017-01-01
Understanding the underlying processes that control the carbon cycle is key to predicting future global change. Much of the uncertainty in the magnitude and variability of the atmospheric carbon dioxide (CO2) stems from uncertainty in terrestrial carbon fluxes, and the relative impacts of temperature and moisture variations on regional and global scales are poorly understood. Here we investigate the impact of a regional drought on terrestrial carbon fluxes and CO2 mixing ratios over North America using the NASA Goddard Earth Observing System (GEOS) Model. Results show a sequence of changes in carbon fluxes and atmospheric CO2, induced by the drought. The relative contributions of meteorological changes to the neighboring carbon dynamics are also presented. The coupled modeling approach allows a direct quantification of the impact of the regional drought on local and proximate carbon exchange at the land surface via the carbon-water feedback processes.
Tidal disruption of Periodic Comet Shoemaker-Levy 9 and a constraint on its mean density
NASA Technical Reports Server (NTRS)
Boss, Alan P.
1994-01-01
The apparent tidal disruption of Periodic Comet Shoemaker-Levy 9 (1993e) during a close encounter within approximately 1.62 planetary radii of Jupiter can be used along with theoretical models of tidal disruption to place an upper bound on the density of the predisruption body. Depending on the theoretical model used, these upper bounds range from rho(sub c) less than 0.702 +/- 0.080 g/cu cm for a simple analytical model calibrated by numerical smoothed particle hydrodynamics (SPH) simulations to rho(sub c) less than 1.50 +/- 0.17 g/cu cm for a detailed semianalytical model. The quoted uncertainties stem from an assumed uncertainty in the perijove radius. However, the uncertainty introduced by the different theoretical models is the major source of error; this uncertainty could be eliminated by future SPH simulations specialized to cometary disruptions, including the effects of initially prolate, spinning comets. If the SPH-based upper bound turns out to be most appropriate, it would be consistent with the predisruption body being a comet with a relatively low density and porous structure, as has been asserted previously based on observations of cometary outgassing. Regardless of which upper bound is preferable, the models all agree that the predisruption body could not have been a relatively high-density body, such as an asteroid with rho approximately = 2 g/cu cm.
Global estimation of CO emissions using three sets of satellite data for burned area
NASA Astrophysics Data System (ADS)
Jain, Atul K.
Using three sets of satellite data for burned areas together with the tree cover imagery and a biogeochemical component of the Integrated Science Assessment Model (ISAM) the global emissions of CO and associated uncertainties are estimated for the year 2000. The available fuel load (AFL) is calculated using the ISAM biogeochemical model, which accounts for the aboveground and surface fuel removed by land clearing for croplands and pasturelands, as well as the influence on fuel load of various ecosystem processes (such as stomatal conductance, evapotranspiration, plant photosynthesis and respiration, litter production, and soil organic carbon decomposition) and important feedback mechanisms (such as climate and fertilization feedback mechanism). The ISAM estimated global total AFL in the year 2000 was about 687 Pg AFL. All forest ecosystems account for about 90% of the global total AFL. The estimated global CO emissions based on three global burned area satellite data sets (GLOBSCAR, GBA, and Global Fire Emissions Database version 2 (GFEDv2)) for the year 2000 ranges between 320 and 390 Tg CO. Emissions from open fires are highest in tropical Africa, primarily due to forest cutting and burning. The estimated overall uncertainty in global CO emission is about ±65%, with the highest uncertainty occurring in North Africa and Middle East region (±99%). The results of this study suggest that the uncertainties in the calculated emissions stem primarily from the area burned data.
Casado, María; de Lecuona, Itziar
2013-01-01
This paper identifies problems and analyzes those conflicts posed by the evaluation of research projects involving the collection and use of human induced pluripotent stem cells (iPS) in Spain. Current legislation is causing problems of interpretation, circular and unnecessary referrals, legal uncertainty and undue delays. Actually, this situation may cause a lack of control and monitoring, and even some paralysis in regenerative medicine and cell therapy research, that is a priority nowadays. The analysis of the current legislation and its bioethical implications, led us to conclude that the review of iPS research projects cannot be assimilated to the evaluation of research projects that involve human embryonic stem cell (hESC). In this context, our proposal is based on the review by the Research Ethics Committees and the checkout by the Spanish Comission of Guarantees for Donation and Use of Human Cells and Tissues (CGDUCTH) of human iPS cells research projects. Moreover, this article claims for a more transparent research system, by effectively articulating the Registry on Research Projects. Finally, a model of verification protocol (checklist) for checking out biomedical research projects involving human iPS cells is suggested.
Kelin, Hu; Qin, Chen; Wang, Hongqing
2014-01-01
Coastal wetlands play a unique role in extreme hurricane events. The impact of wetlands on storm surge depends on multiple factors including vegetation, landscape, and storm characteristics. The Delft3D model, in which vegetation effects on flow and turbulence are explicitly incorporated, was applied to the semi-enclosed Breton Sound (BS) estuary in coastal Louisiana to investigate the wetland impact. Guided by extensive field observations, a series of numerical experiments were conducted based on variations of actual vegetation properties and storm parameters from Hurricane Isaac in 2012. Both the vegetation-induced maximum surge reduction (MSR) and maximum surge reduction rate (MSRR) increased with stem height and stem density, and were more sensitive to stem height. The MSR and MSRR decreased significantly with increasing wind intensity. The MSRR was the highest with a fast-moving weak storm. It was also found that the MSRR varied proportionally to the expression involving the maximum bulk velocity and surge over the area of interest, and was more dependent on the maximum bulk surge. Both MSR and MSRR appeared to increase when the area of interest decreased from the whole BS estuary to the upper estuary. Within the range of the numerical experiments, the maximum simulated MSR and MSRR over the upper estuary were 0.7 m and 37%, respectively.
Rumman, Mohammad; Majumder, Abhijit; Harkness, Linda; Venugopal, Balu; Vinay, M B; Pillai, Malini S; Kassem, Moustapha; Dhawan, Jyotsna
2018-05-17
Several studies have suggested that bone marrow stromal steam cells (BMSC) exist in a quiescent state (G0) within the in vivo niche; however, an explicit analysis of the biology of G0 state-BMSC has not been reported. We hypothesized that induction of G0 in BMSC might enhance their stem cell properties. Thus, we induced quiescence in BMSC in vitro by (a) suspension culture in a viscous medium or (b) culture on soft polyacrylamide substrate; and examined their molecular and functional phenotype. Induction of G0 was confirmed by bromo-deoxyuridine (BrdU) labelling and analysis of cell cycle gene expression. Upon reactivation and re-entry into cell cycle, G0 state-BMSC exhibited enhanced clonogenic self-renewal, preferential differentiation into osteoblastic rather than adipocytic cells and increased ectopic bone formation when implanted subcutaneously in vivo in immune-deficient mice, compared to asynchronous proliferating (pre-G0) BMSC. Global gene expression profiling revealed reprogramming of the transcriptome during G0 state including significant alterations in relevant pathways and expression of secreted factors, suggesting altered autocrine and paracrine signaling by G0 state-BMSC and a possible mechanism for enhanced bone formation. G0 state-BMSC might provide a clinically relevant model for understanding the in vivo biology of BMSC. Copyright © 2018. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Ragon, Théa; Sladen, Anthony; Simons, Mark
2018-05-01
The ill-posed nature of earthquake source estimation derives from several factors including the quality and quantity of available observations and the fidelity of our forward theory. Observational errors are usually accounted for in the inversion process. Epistemic errors, which stem from our simplified description of the forward problem, are rarely dealt with despite their potential to bias the estimate of a source model. In this study, we explore the impact of uncertainties related to the choice of a fault geometry in source inversion problems. The geometry of a fault structure is generally reduced to a set of parameters, such as position, strike and dip, for one or a few planar fault segments. While some of these parameters can be solved for, more often they are fixed to an uncertain value. We propose a practical framework to address this limitation by following a previously implemented method exploring the impact of uncertainties on the elastic properties of our models. We develop a sensitivity analysis to small perturbations of fault dip and position. The uncertainties in fault geometry are included in the inverse problem under the formulation of the misfit covariance matrix that combines both prediction and observation uncertainties. We validate this approach with the simplified case of a fault that extends infinitely along strike, using both Bayesian and optimization formulations of a static inversion. If epistemic errors are ignored, predictions are overconfident in the data and source parameters are not reliably estimated. In contrast, inclusion of uncertainties in fault geometry allows us to infer a robust posterior source model. Epistemic uncertainties can be many orders of magnitude larger than observational errors for great earthquakes (Mw > 8). Not accounting for uncertainties in fault geometry may partly explain observed shallow slip deficits for continental earthquakes. Similarly, ignoring the impact of epistemic errors can also bias estimates of near surface slip and predictions of tsunamis induced by megathrust earthquakes. (Mw > 8)
Chakraborty, Rajshekhar; Savani, Bipin N; Litzow, Mark; Mohty, Mohamad; Hashmi, Shahrukh
2015-07-15
The widespread use of complementary and alternative medicine (CAM) in cancer survivors is well known despite a paucity of scientific evidence to support its use. The number of survivors of hematopoietic stem cell transplant (HCT) is growing rapidly and HCT clinicians are aware that many of their patients use CAM therapies consistently. However, due to a paucity of data regarding the benefits and harms of CAM therapies in these survivors, clinicians are reluctant to provide specific recommendations for or against particular CAM therapies. A systematic literature review was conducted with a search using PubMed, the Cochrane Database of Systematic Reviews, and Ovid online for each CAM therapy as defined by the National Center of Complementary and Alternative Medicine. The search generated 462 references, of which 26 articles were deemed to be relevant for the review. Due to extensive heterogeneity in data and limited randomized trials, a meta-analysis could not be performed but a comprehensive systematic review was conducted with specified outcomes for each CAM therapy. In randomized controlled trials, certain mind and body interventions such as relaxation were observed to be effective in alleviating psychological symptoms in patients undergoing HCT, whereas the majority of the other CAM treatments were found to have mixed results. CAM use is an understudied area in HCT survivorship and clinicians should convey the benefits and uncertainties concerning the role of CAM therapies to their patients. © 2015 American Cancer Society.