Comparisons of Wilks’ and Monte Carlo Methods in Response to the 10CFR50.46(c) Proposed Rulemaking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Hongbin; Szilard, Ronaldo; Zou, Ling
The Nuclear Regulatory Commission (NRC) is proposing a new rulemaking on emergency core cooling system/loss-of-coolant accident (LOCA) performance analysis. In the proposed rulemaking, designated as 10CFR50.46(c), the US NRC put forward an equivalent cladding oxidation criterion as a function of cladding pre-transient hydrogen content. The proposed rulemaking imposes more restrictive and burnup-dependent cladding embrittlement criteria; consequently, nearly all the fuel rods in a reactor core need to be analyzed under LOCA conditions to demonstrate compliance with the safety limits. New analysis methods are required to provide a thorough characterization of the reactor core in order to identify the locations of the limiting rods as well as to quantify the safety margins under LOCA conditions. With the new analysis method presented in this work, the limiting transient case and the limiting rods can be easily identified to quantify the safety margins in response to the proposed new rulemaking. In this work, the best-estimate plus uncertainty (BEPU) analysis capability for large break LOCA with the new cladding embrittlement criteria using the RELAP5-3D code is established and demonstrated with a reduced set of uncertainty parameters. Both the direct Monte Carlo method and Wilks’ nonparametric statistical method can be used to perform uncertainty quantification. Wilks’ method has become the de facto industry standard for uncertainty quantification in BEPU LOCA analyses. Despite its widespread adoption by the industry, the use of small sample sizes to infer compliance with the existing 10CFR50.46 rule has been a major cause of unrealized operational margin in today’s BEPU methods. Moreover, the debate on the proper interpretation of Wilks’ theorem in the context of safety analyses is not fully resolved, even more than two decades after its introduction to safety analyses in the nuclear industry. This represents both a regulatory and an application risk in rolling out new methods. With the 10CFR50.46(c) proposed rulemaking, the deficiencies of the Wilks’ approach are further exacerbated. The direct Monte Carlo approach offers a robust alternative for uncertainty quantification within the context of BEPU analyses. In this work, the Monte Carlo method is compared with the Wilks’ method in response to the NRC 10CFR50.46(c) proposed rulemaking.
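To make the comparison concrete, here is a minimal sketch of the two estimators; the response function, input distributions, and numbers are invented stand-ins, not the authors' RELAP5-3D workflow:

```python
import numpy as np

rng = np.random.default_rng(42)

def peak_clad_temp(x):
    # Stand-in for one expensive LOCA code run; any scalar response works here.
    return 1200.0 + 150.0 * x[0] + 80.0 * x[1] ** 2

# Wilks' first-order 95/95: with n = 59 runs, the sample maximum bounds the
# 95th percentile with >= 95% confidence, since 1 - 0.95**59 ~= 0.952.
n_wilks = 59
wilks_bound = max(peak_clad_temp(x) for x in rng.standard_normal((n_wilks, 2)))

# Direct Monte Carlo: estimate the 95th percentile itself from many runs.
mc_runs = rng.standard_normal((100_000, 2))
mc_p95 = np.quantile([peak_clad_temp(x) for x in mc_runs], 0.95)

print(f"Wilks 95/95 bound: {wilks_bound:.1f} K")
print(f"Monte Carlo P95:   {mc_p95:.1f} K")
```

The Wilks bound typically sits well above the percentile the Monte Carlo estimate converges to; that gap is one way to see the unrealized operational margin the abstract describes.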
Probabilistic margin evaluation on accidental transients for the ASTRID reactor project
NASA Astrophysics Data System (ADS)
Marquès, Michel
2014-06-01
ASTRID is a technological demonstrator of the Sodium-cooled Fast Reactor (SFR) under development. The conceptual design studies are being conducted in accordance with the Generation IV reactor objectives, particularly in terms of improving safety. For the hypothetical events belonging to the accidental category "severe accident prevention situations", which have a very low frequency of occurrence, the safety demonstration is no longer based on a deterministic demonstration with conservative assumptions on models and parameters but on a "Best-Estimate Plus Uncertainty" (BEPU) approach. This BEPU approach is presented in this paper for an Unprotected Loss-of-Flow (ULOF) event. The Best-Estimate (BE) analysis of this ULOF transient is performed with the CATHARE2 code, which is the French reference system code for SFR applications. The objective of the BEPU analysis is twofold: first, to evaluate the safety margin to sodium boiling, taking into account the uncertainties on the input parameters of the CATHARE2 code (twenty-two uncertain input parameters have been identified, which can be classified into five groups: reactor power, accident management, pump characteristics, reactivity coefficients, and thermal parameters and head losses); second, to quantify the contribution of each input uncertainty to the overall uncertainty of the safety margins, in order to refocus R&D efforts on the most influential factors. This paper focuses on the methodological aspects of the evaluation of the safety margin. At least for the preliminary phase of the project (conceptual design), a probabilistic criterion has been fixed in the context of this BEPU analysis; this criterion is the value of the margin to sodium boiling which has a probability of 95% to be exceeded, obtained with a confidence level of 95% (i.e. the M5,95 percentile of the margin distribution). This paper presents two methods used to assess this percentile: the Wilks method and the bootstrap method; the effectiveness of the two methods is compared on the basis of 500 simulations performed with the CATHARE2 code. We conclude that, with only 100 simulations performed with the CATHARE2 code, which is a workable number of simulations in the conceptual design phase of the ASTRID project where the models and the hypotheses are often modified, it is best to use the bootstrap method to evaluate the M5,95 percentile of the margin to sodium boiling, which will provide a slightly conservative result. On the other hand, in order to obtain an accurate estimation of the percentile M5,95, for the safety report for example, it will be necessary to perform at least 300 simulations with the CATHARE2 code. In this case, both methods (Wilks and bootstrap) would give equivalent results.
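A minimal sketch of the two percentile estimators on a synthetic margin sample; the margin values and distribution are invented for illustration, and only the M5,95 definition follows the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical margins to sodium boiling (kelvin) from 100 CATHARE2-like runs.
margins = rng.normal(loc=180.0, scale=25.0, size=100)

# Bootstrap: resample the margins, take the 5th percentile of each resample,
# then take a lower confidence bound over those estimates.
boot_p5 = np.array([
    np.quantile(rng.choice(margins, size=margins.size, replace=True), 0.05)
    for _ in range(10_000)
])
m5_95_bootstrap = np.quantile(boot_p5, 0.05)  # conservative lower bound

# Wilks alternative: for n = 100, the 2nd-smallest sample is a 5%/95% lower
# tolerance bound, since P[Binomial(100, 0.05) >= 2] ~= 0.96 >= 0.95.
m5_95_wilks = np.sort(margins)[1]

print(f"Bootstrap M5,95: {m5_95_bootstrap:.1f} K")
print(f"Wilks     M5,95: {m5_95_wilks:.1f} K")
```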
Brown, Cameron S.; Zhang, Hongbin; Kucukboyaci, Vefa; ...
2016-09-07
VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics subchannel code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). VERA-CS was used to simulate a typical pressurized water reactor (PWR) full core response with 17x17 fuel assemblies for a main steam line break (MSLB) accident scenario with the most reactive rod cluster control assembly stuck out of the core. The accident scenario was initiated at hot zero power (HZP) at the end of the first fuel cycle, with return-to-power state points determined by a system analysis code; the most limiting state point was chosen for core analysis. The best estimate plus uncertainty (BEPU) analysis method was applied using Wilks’ nonparametric statistical approach. In this way, 59 full core simulations were performed to provide the minimum departure from nucleate boiling ratio (MDNBR) at the 95/95 (95% probability with 95% confidence level) tolerance limit. The results show that this typical PWR core remains within MDNBR safety limits for the MSLB accident.
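The 59-run figure quoted here is the smallest sample size satisfying Wilks' one-sided, first-order formula; as a worked check (this is the standard result, not a derivation specific to this paper):

```latex
% Confidence that the sample maximum of n runs exceeds the beta-quantile:
1 - \beta^{\,n} \ge \gamma
\;\Longrightarrow\;
n \ge \frac{\ln(1 - \gamma)}{\ln \beta}
  = \frac{\ln 0.05}{\ln 0.95} \approx 58.4
\;\Longrightarrow\; n = 59
\qquad (\beta = \gamma = 0.95)
```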
The NASA Langley Multidisciplinary Uncertainty Quantification Challenge
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2014-01-01
This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.
Remaining Useful Life Estimation in Prognosis: An Uncertainty Propagation Problem
NASA Technical Reports Server (NTRS)
Sankararaman, Shankar; Goebel, Kai
2013-01-01
The estimation of remaining useful life is significant in the context of prognostics and health monitoring, and the prediction of remaining useful life is essential for online operations and decision-making. However, it is challenging to accurately predict the remaining useful life in practical aerospace applications due to the presence of various uncertainties that affect prognostic calculations, and in turn, render the remaining useful life prediction uncertain. It is challenging to identify and characterize the various sources of uncertainty in prognosis, understand how each of these sources of uncertainty affect the uncertainty in the remaining useful life prediction, and thereby compute the overall uncertainty in the remaining useful life prediction. In order to achieve these goals, this paper proposes that the task of estimating the remaining useful life must be approached as an uncertainty propagation problem. In this context, uncertainty propagation methods which are available in the literature are reviewed, and their applicability to prognostics and health monitoring are discussed.
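A minimal sketch of the "RUL as uncertainty propagation" framing: sample the uncertain state and degradation parameters, propagate each draw through a degradation model until a failure threshold is crossed, and summarize the resulting RUL distribution. The linear model, distributions, and numbers are invented, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def remaining_useful_life(state, rate, threshold=1.0, dt=1.0):
    """Step a toy linear degradation model forward until the failure threshold."""
    t = 0.0
    while state < threshold:
        state += rate * dt
        t += dt
    return t

# Uncertain current health state and degradation rate (invented distributions).
n = 10_000
states = rng.normal(0.4, 0.05, n).clip(0.0, 0.99)
rates = rng.lognormal(mean=np.log(0.01), sigma=0.3, size=n)

rul = np.array([remaining_useful_life(s, r) for s, r in zip(states, rates)])
print(f"RUL median: {np.median(rul):.0f} cycles; "
      f"90% interval: [{np.quantile(rul, 0.05):.0f}, {np.quantile(rul, 0.95):.0f}]")
```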
Accounting for uncertainty in marine reserve design.
Halpern, Benjamin S; Regan, Helen M; Possingham, Hugh P; McCarthy, Michael A
2006-01-01
Ecosystems and the species and communities within them are highly complex systems that defy predictions with any degree of certainty. Managing and conserving these systems in the face of uncertainty remains a daunting challenge, particularly with respect to developing networks of marine reserves. Here we review several modelling frameworks that explicitly acknowledge and incorporate uncertainty, and then use these methods to evaluate reserve spacing rules given increasing levels of uncertainty about larval dispersal distances. Our approach finds similar spacing rules as have been proposed elsewhere - roughly 20-200 km - but highlights several advantages provided by uncertainty modelling over more traditional approaches to developing these estimates. In particular, we argue that uncertainty modelling can allow for (1) an evaluation of the risk associated with any decision based on the assumed uncertainty; (2) a method for quantifying the costs and benefits of reducing uncertainty; and (3) a useful tool for communicating to stakeholders the challenges in managing highly uncertain systems. We also argue that incorporating rather than avoiding uncertainty will increase the chances of successfully achieving conservation and management goals.
Using a Meniscus to Teach Uncertainty in Measurement
NASA Astrophysics Data System (ADS)
Backman, Philip
2008-02-01
I have found that students easily understand that a measurement cannot be exact, but they often seem to lack an understanding of why it is important to know something about the magnitude of the uncertainty. This tends to promote an attitude that almost any uncertainty value will do. Such indifference may exist because once an uncertainty is determined or calculated, it remains as only a number without a concrete physical connection back to the experiment. For the activity described here—presented as a challenge—groups of students are given a container and asked to make certain measurements and to estimate the uncertainty in each of those measurements. They are then challenged to complete a particular task involving the container and a volume of water. Whether the assigned task is actually achievable, however, slowly comes into question once the magnitude of the uncertainties in the original measurements is compared to the specific requirements of the challenge.
Introducing Risk Analysis and Calculation of Profitability under Uncertainty in Engineering Design
ERIC Educational Resources Information Center
Kosmopoulou, Georgia; Freeman, Margaret; Papavassiliou, Dimitrios V.
2011-01-01
A major challenge that chemical engineering graduates face at the modern workplace is the management and operation of plants under conditions of uncertainty. Developments in the fields of industrial organization and microeconomics offer tools to address this challenge with rather well developed concepts, such as decision theory and financial risk…
Holbrook, Colin; Sousa, Paulo; Hahn-Holbrook, Jennifer
2012-01-01
Individuals subtly reminded of death, coalitional challenges, or feelings of uncertainty display exaggerated preferences for affirmations and against criticisms of their cultural in-groups. Terror management, coalitional psychology, and uncertainty management theories postulate this “worldview defense” effect as the output of mechanisms evolved either to allay the fear of death, foster social support, or reduce anxiety by increasing adherence to cultural values. In 4 studies, we report evidence for an alternative perspective. We argue that worldview defense owes to unconscious vigilance, a state of accentuated reactivity to affective targets (which need not relate to cultural worldviews) that follows detection of subtle alarm cues (which need not pertain to death, coalitional challenges, or uncertainty). In Studies 1 and 2, death-primed participants produced exaggerated ratings of worldview-neutral affective targets. In Studies 3 and 4, subliminal threat manipulations unrelated to death, coalitional challenges, or uncertainty evoked worldview defense. These results are discussed as they inform evolutionary interpretations of worldview defense and future investigations of the influence of unconscious alarm on judgment. PMID:21644809
Introducing Decision Making under Uncertainty and Strategic Considerations in Engineering Design
ERIC Educational Resources Information Center
Kosmopoulou, Georgia; Jog, Chintamani; Freeman, Margaret; Papavassiliou, Dimitrios V.
2010-01-01
Chemical Engineering graduates will face challenges at the workplace that even their peers who graduated a few years ago were not expected to face. One such major challenge is the management and operation of companies and plants under conditions of uncertainty and the need to make decisions in competitive situations. Modern developments in…
Wong, Alfred Ka-Shing; Ong, Shu Fen; Matchar, David Bruce; Lie, Desiree; Ng, Reuben; Yoon, Kirsten Eom; Wong, Chek Hooi
2017-10-01
Studies are needed to inform the preparation of community nurses to address patient behavioral and social factors contributing to unnecessary readmissions to hospital. This study uses nurses' input to understand challenges faced during home care and to derive a framework to address those challenges. Semistructured interviews were conducted to saturation with 16 community nurses in Singapore. Interviews were transcribed verbatim and transcripts independently coded for emergent themes. Themes were interpreted using grounded theory. Seven major themes emerged from the 16 interviews: strained social relationships, complex care decision-making processes within families, communication barriers, patient or caregiver neglect of health issues, building and maintaining trust, the trial-and-error nature of the work, and dealing with uncertainty. Community nurses identified uncertainty arising from complexities in social-relational, personal, and organizational factors as a central challenge. Nursing education should focus on navigating and managing uncertainty at the personal, patient, and family levels.
Modeling sustainability in renewable energy supply chain systems
NASA Astrophysics Data System (ADS)
Xie, Fei
This dissertation aims at modeling the sustainability of renewable fuel supply chain systems against emerging challenges. In particular, the dissertation focuses on biofuel supply chain system design, and develops an advanced modeling framework and corresponding solution methods for tackling challenges in sustaining biofuel supply chain systems. These challenges include: (1) integrating "environmental thinking" into long-term biofuel supply chain planning; (2) adopting multimodal transportation to mitigate seasonality in biofuel supply chain operations; (3) providing strategies for hedging against uncertainty in conversion technology; and (4) developing methodologies for long-term sequential planning of the biofuel supply chain under uncertainties. All models are mixed integer programs, which also involve multi-objective programming and two-stage/multistage stochastic programming methods. In particular, for the long-term sequential planning under uncertainties, to reduce the computational challenges due to the exponential expansion of the scenario tree, I also developed an efficient ND-Max method which is more efficient than CPLEX and the Nested Decomposition method. Through result analysis of four independent studies, it is found that the proposed modeling frameworks can effectively improve economic performance, enhance environmental benefits and reduce risks due to system uncertainties for biofuel supply chain systems.
Uncertainty and risk in wildland fire management: a review.
Thompson, Matthew P; Calkin, Dave E
2011-08-01
Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to treatments, and of spatiotemporal dynamics involving disturbance regimes and climate change. This work attempts to systematically align sources of uncertainty with the most appropriate decision support methodologies, in order to facilitate cost-effective, risk-based wildfire planning efforts. We review the state of wildfire risk assessment and management, with a specific focus on uncertainties challenging implementation of integrated risk assessments that consider a suite of human and ecological values. Recent advances in wildfire simulation and geospatial mapping of highly valued resources have enabled robust risk-based analyses to inform planning across a variety of scales, although improvements are needed in fire behavior and ignition occurrence models. A key remaining challenge is a better characterization of non-market resources at risk, both in terms of their response to fire and how society values those resources. Our findings echo earlier literature identifying wildfire effects analysis and value uncertainty as the primary challenges to integrated wildfire risk assessment and wildfire management. We stress the importance of identifying and characterizing uncertainties in order to better quantify and manage them. Leveraging the most appropriate decision support tools can facilitate wildfire risk assessment and ideally improve decision-making. Published by Elsevier Ltd.
NASA Technical Reports Server (NTRS)
Sankararaman, Shankar
2016-01-01
This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all-at-once. Instead, only one variable is first identified, and then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all four variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and then applied to the NASA Langley Uncertainty Quantification Challenge problem.
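A minimal sketch of the variance-based ranking step that drives such a refinement loop; the response function is an invented stand-in, while the pick-freeze estimator itself is a standard construction:

```python
import numpy as np

rng = np.random.default_rng(2)

def response(x):
    # Invented system response with deliberately unequal variable importance.
    return x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 2] * x[:, 3]

def first_order_sobol(f, dim, n=50_000):
    """Pick-freeze estimate of first-order Sobol indices."""
    a = rng.uniform(-1.0, 1.0, (n, dim))
    b = rng.uniform(-1.0, 1.0, (n, dim))
    ya, yb = f(a), f(b)
    var_y = ya.var()
    s = np.empty(dim)
    for i in range(dim):
        ab_i = b.copy()
        ab_i[:, i] = a[:, i]        # freeze variable i at the A-sample values
        s[i] = np.mean(ya * (f(ab_i) - yb)) / var_y
    return s

s = first_order_sobol(response, dim=4)
print("First-order indices:", np.round(s, 3))
print("Refinement order (most influential first):", np.argsort(s)[::-1])
```

In a sequential scheme like the paper's, only the top-ranked variable would be refined before the inference and sensitivity steps are repeated.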
Uncertainty in macroeconomic policy-making: art or science?
Aikman, David; Barrett, Philip; Kapadia, Sujit; King, Mervyn; Proudman, James; Taylor, Tim; de Weymarn, Iain; Yates, Tony
2011-12-13
Uncertainty is pervasive in economic policy-making. Modern economies share similarities with other complex systems in their unpredictability. But economic systems also differ from those in the natural sciences because outcomes are affected by the state of beliefs of the systems' participants. The dynamics of beliefs and how they interact with economic outcomes can be rich and unpredictable. This paper relates these ideas to the recent crisis, which has reminded us that we need a financial system that is resilient in the face of the unpredictable and extreme. It also highlights how such uncertainty puts a premium on sound communication strategies by policy-makers. This creates challenges in informing others about the uncertainties in the economy, and how policy is set in the face of those uncertainties. We show how the Bank of England tries to deal with some of these challenges in its communications about monetary policy.
Presentation of uncertainties on web platforms for climate change information
NASA Astrophysics Data System (ADS)
Nocke, Thomas; Wrobel, Markus; Reusser, Dominik
2014-05-01
Climate research has a long tradition; however, there is still uncertainty about the specific effects of climate change. One of the key tasks is - beyond discussing climate change and its impacts in specialist groups - to present these to a wider audience. In that respect, decision-makers in the public sector as well as directly affected professional groups need easy-to-understand information. These groups are not made up of specialist scientists. This gives rise to the challenge that the scientific information must be presented such that it is commonly understood, while the complexity of the underlying science is still conveyed. In particular, this requires the explicit representation of spatial and temporal uncertainty information for lay people. Within this talk/poster we survey how climate change and climate impact uncertainty information is presented on various web-based climate service platforms. We outline how the specifics of this medium make it challenging to find adequate and readable representations of uncertainties. First, we introduce a multi-step approach to communicating uncertainty based on a typology distinguishing between epistemic, natural stochastic, and human reflexive uncertainty. Then, we compare existing concepts and representations for uncertainty communication with current practices on web-based platforms, including our own solutions within the web platforms ClimateImpactsOnline and ci:grasp. Finally, we review surveys on how spatial uncertainty visualization techniques are perceived by untrained users.
Research strategies for addressing uncertainties
Busch, David E.; Brekke, Levi D.; Averyt, Kristen; Jardine, Angela; Welling, Leigh; Garfin, Gregg; Jardine, Angela; Merideth, Robert; Black, Mary; LeRoy, Sarah
2013-01-01
Research Strategies for Addressing Uncertainties builds on descriptions of research needs presented elsewhere in the book; describes current research efforts and the challenges and opportunities to reduce the uncertainties of climate change; explores ways to improve the understanding of changes in climate and hydrology; and emphasizes the use of research to inform decision making.
Inference for Distributions over the Permutation Group
2008-05-01
world problems, such as voting, ranking, and data association. Representing uncertainty over permutations is challenging, since there are n! possibilities... the Kronecker (or Tensor) Product Representation. In general, the Kronecker product representation is reducible, and so it can be decomposed into a direct...
Uncertainty Management in Remote Sensing of Climate Data. Summary of A Workshop
NASA Technical Reports Server (NTRS)
McConnell, M.; Weidman, S.
2009-01-01
Great advances have been made in our understanding of the climate system over the past few decades, and remotely sensed data have played a key role in supporting many of these advances. Improvements in satellites and in computational and data-handling techniques have yielded high quality, readily accessible data. However, rapid increases in data volume have also led to large and complex datasets that pose significant challenges in data analysis (NRC, 2007). Uncertainty characterization is needed for every satellite mission and scientists continue to be challenged by the need to reduce the uncertainty in remotely sensed climate records and projections. The approaches currently used to quantify the uncertainty in remotely sensed data, including statistical methods used to calibrate and validate satellite instruments, lack an overall mathematically based framework.
"I Don't Want to Be an Ostrich": Managing Mothers' Uncertainty during BRCA1/2 Genetic Counseling.
Fisher, Carla L; Roccotagliata, Thomas; Rising, Camella J; Kissane, David W; Glogowski, Emily A; Bylund, Carma L
2017-06-01
Families who face genetic disease risk must learn how to grapple with complicated uncertainties about their health and future on a long-term basis. Women who undergo BRCA1/2 genetic testing describe uncertainty related to their personal risk as well as their loved ones', particularly daughters', risk. The genetic counseling setting is a prime opportunity for practitioners to help mothers manage uncertainty in the moment but also once they leave a session. Uncertainty Management Theory (UMT) helps to illuminate the various types of uncertainty women encounter and the important role of communication in uncertainty management. Informed by UMT, we conducted a thematic analysis of 16 genetic counseling sessions between practitioners and mothers at risk for, or carriers of, a BRCA1/2 mutation. Five themes emerged that represent communication strategies used to manage uncertainty: 1) addressing myths, misunderstandings, or misconceptions; 2) introducing uncertainty related to science; 3) encouraging information seeking or sharing about family medical history; 4) reaffirming or validating previous behavior or decisions; and 5) minimizing the probability of personal risk or family members' risk. Findings illustrate the critical role of genetic counseling for families in managing emotionally challenging risk-related uncertainty. The analysis may prove beneficial not only to genetic counseling practice but to generations of families at high risk for cancer who must learn strategic approaches to managing a complex web of uncertainty that can challenge them for a lifetime.
Introduction to the Special Issue on Climate Ethics: Uncertainty, Values and Policy.
Roeser, Sabine
2017-10-01
Climate change is a pressing phenomenon with huge potential ethical, legal and social policy implications. Climate change gives rise to intricate moral and policy issues as it involves contested science, uncertainty and risk. In order to come to scientifically and morally justified, as well as feasible, policies, targeting climate change requires an interdisciplinary approach. This special issue will identify the main challenges that climate change poses from social, economic, methodological and ethical perspectives by focusing on the complex interrelations between uncertainty, values and policy in this context. This special issue brings together scholars from economics, social sciences and philosophy in order to address these challenges.
Bayesian-information-gap decision theory with an application to CO2 sequestration
O'Malley, D.; Vesselinov, V. V.
2015-09-04
Decisions related to subsurface engineering problems such as groundwater management, fossil fuel production, and geologic carbon sequestration are frequently challenging because of an overabundance of uncertainties (related to conceptualizations, parameters, observations, etc.). Because of the importance of these problems to agriculture, energy, and the climate (respectively), good decisions that are scientifically defensible must be made despite the uncertainties. We describe a general approach to making decisions for challenging problems such as these in the presence of severe uncertainties that combines probabilistic and non-probabilistic methods. The approach uses Bayesian sampling to assess parametric uncertainty and Information-Gap Decision Theory (IGDT) to address model inadequacy. The combined approach also resolves an issue that frequently arises when applying Bayesian methods to real-world engineering problems related to the enumeration of possible outcomes. In the case of zero non-probabilistic uncertainty, the method reduces to a Bayesian method. Lastly, to illustrate the approach, we apply it to a site-selection decision for geologic CO2 sequestration.
NASA Astrophysics Data System (ADS)
Darch, Peter T.; Sands, Ashley E.
2016-06-01
Sky surveys, such as the Sloan Digital Sky Survey (SDSS) and the Large Synoptic Survey Telescope (LSST), generate data on an unprecedented scale. While many scientific projects span a few years from conception to completion, sky surveys are typically on the scale of decades. This paper focuses on critical challenges arising from long timescales, and how sky surveys address these challenges. We present findings from a study of LSST, comprising interviews (n=58) and observation. Conceived in the 1990s, the LSST Corporation was formed in 2003, and construction began in 2014. LSST will commence data collection operations in 2022 for ten years. One challenge arising from this long timescale is uncertainty about future needs of the astronomers who will use these data many years hence. Sources of uncertainty include the scientific questions to be posed, the astronomical phenomena to be studied, and the tools and practices these astronomers will have at their disposal. These uncertainties are magnified by the rapid technological and scientific developments anticipated between now and the start of LSST operations. LSST is implementing a range of strategies to address these challenges. Some strategies involve delaying resolution of uncertainty, placing this resolution in the hands of future data users. Other strategies aim to reduce uncertainty by shaping astronomers’ data analysis practices so that these practices will integrate well with LSST once operations begin. One approach that exemplifies both types of strategy is the decision to make LSST data management software open source, even now as it is being developed. This policy will enable future data users to adapt this software to evolving needs. In addition, LSST intends for astronomers to start using this software well in advance of 2022, thereby embedding LSST software and data analysis approaches in the practices of astronomers. These findings strengthen arguments for making the software supporting sky surveys available as open source. Such arguments usually focus on the reuse potential of software and enhancing replicability of analyses. In this case, however, open source software also promises to mitigate the critical challenge of anticipating the needs of future data users.
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
Model Uncertainty Quantification Methods In Data Assimilation
NASA Astrophysics Data System (ADS)
Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.
2017-12-01
Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real world scenarios. These include methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatterjee, Samrat; Tipireddy, Ramakrishna; Oster, Matthew R.
Securing cyber-systems on a continual basis against a multitude of adverse events is a challenging undertaking. Game-theoretic approaches, which model the actions of strategic decision-makers, are increasingly being applied to address cybersecurity resource allocation challenges. Such game-based models account for multiple player actions and represent cyber attacker payoffs mostly as point utility estimates. Since a cyber-attacker’s payoff generation mechanism is largely unknown, appropriate representation and propagation of uncertainty is a critical task. In this paper we expand on prior work and focus on operationalizing the probabilistic uncertainty quantification framework, for a notional cyber system, through: 1) representation of uncertain attacker and system-related modeling variables as probability distributions and mathematical intervals, and 2) exploration of uncertainty propagation techniques including two-phase Monte Carlo sampling and probability bounds analysis.
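A minimal sketch of the two-phase idea on a toy payoff model; the payoff function, interval, and distribution are invented, and only the epistemic-outer/aleatory-inner structure follows the abstract:

```python
import numpy as np

rng = np.random.default_rng(3)

# Epistemic quantity: only an interval is known for the attacker's reward.
reward_lo, reward_hi = 50.0, 150.0
# Aleatory quantity: attack success probability, modeled with a distribution.
n_outer, n_inner = 200, 5_000

expected_payoffs = []
for reward in np.linspace(reward_lo, reward_hi, n_outer):  # outer: epistemic sweep
    p_success = rng.beta(2.0, 5.0, n_inner)                # inner: aleatory sampling
    expected_payoffs.append(np.mean(p_success * reward))

print(f"Bounds on expected attacker payoff: "
      f"[{min(expected_payoffs):.1f}, {max(expected_payoffs):.1f}]")
```

Sweeping (rather than averaging over) the interval is what yields bounds instead of a single distribution, which is the essence of probability bounds analysis.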
Nagle, Samuel M; Sundar, Guru; Schafer, Mark E; Harris, Gerald R; Vaezy, Shahram; Gessert, James M; Howard, Samuel M; Moore, Mary K; Eaton, Richard M
2013-11-01
This article examines the challenges associated with making acoustic output measurements at high ultrasound frequencies (>20 MHz) in the context of regulatory considerations contained in the US Food and Drug Administration industry guidance document for diagnostic ultrasound devices. Error sources in the acoustic measurement, including hydrophone calibration and spatial averaging, nonlinear distortion, and mechanical alignment, are evaluated, and the limitations of currently available acoustic measurement instruments are discussed. An uncertainty analysis of acoustic intensity and power measurements is presented, and an example uncertainty calculation is done on a hypothetical 30-MHz high-frequency ultrasound system. This analysis concludes that the estimated measurement uncertainty of the acoustic intensity is +73%/-86%, and the uncertainty in the mechanical index is +37%/-43%. These values exceed the levels in the Food and Drug Administration guidance document of 30% and 15%, respectively, which are more representative of the measurement uncertainty associated with characterizing lower-frequency ultrasound systems. Recommendations made for minimizing the measurement uncertainty include implementing a mechanical positioning system that has sufficient repeatability and precision, reconstructing the time-pressure waveform via deconvolution using the hydrophone frequency response, and correcting for hydrophone spatial averaging.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sung, Yixing; Adams, Brian M.; Secker, Jeffrey R.
2011-12-01
The CASL Level 1 Milestone CASL.P4.01, successfully completed in December 2011, aimed to 'conduct, using methodologies integrated into VERA, a detailed sensitivity analysis and uncertainty quantification of a crud-relevant problem with baseline VERA capabilities (ANC/VIPRE-W/BOA).' The VUQ focus area led this effort, in partnership with AMA, and with support from VRI. DAKOTA was coupled to existing VIPRE-W thermal-hydraulics and BOA crud/boron deposit simulations representing a pressurized water reactor (PWR) that previously experienced crud-induced power shift (CIPS). This work supports understanding of CIPS by exploring the sensitivity and uncertainty in BOA outputs with respect to uncertain operating and model parameters. This report summarizes work coupling the software tools, characterizing uncertainties, and analyzing the results of iterative sensitivity and uncertainty studies. These studies focused on the sensitivity and uncertainty of CIPS indicators calculated by the current version of the BOA code used in the industry. Challenges with this kind of analysis are identified to inform follow-on research goals and VERA development targeting crud-related challenge problems.
Probabilistic methods for sensitivity analysis and calibration in the NASA challenge problem
Safta, Cosmin; Sargsyan, Khachik; Najm, Habib N.; ...
2015-01-01
In this study, a series of algorithms are proposed to address the problems in the NASA Langley Research Center Multidisciplinary Uncertainty Quantification Challenge. A Bayesian approach is employed to characterize and calibrate the epistemic parameters based on the available data, whereas a variance-based global sensitivity analysis is used to rank the epistemic and aleatory model parameters. A nested sampling of the aleatory–epistemic space is proposed to propagate uncertainties from model parameters to output quantities of interest.
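A minimal sketch of the Bayesian calibration step for a single epistemic parameter; the model, data, and grid are invented, and the challenge problem's actual parameter space is higher-dimensional:

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented calibration setup: noisy data from y = theta * x, known noise sigma.
x_obs = np.linspace(0.0, 1.0, 20)
theta_true, sigma = 0.7, 0.05
y_obs = theta_true * x_obs + rng.normal(0.0, sigma, x_obs.size)

# Grid posterior under a flat prior: proportional to the Gaussian likelihood.
theta = np.linspace(0.0, 1.5, 301)
log_like = np.array([-0.5 * np.sum((y_obs - t * x_obs) ** 2) / sigma**2
                     for t in theta])
post = np.exp(log_like - log_like.max())
post /= post.sum() * (theta[1] - theta[0])  # normalize on the grid

post_mean = np.sum(theta * post) * (theta[1] - theta[0])
print(f"Posterior mean: {post_mean:.3f} (truth: {theta_true})")
```

The calibrated posterior would then feed the outer (epistemic) loop of the nested aleatory-epistemic sampling the abstract mentions.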
NASA Astrophysics Data System (ADS)
Liu, Y.; Gupta, H.; Wagener, T.; Stewart, S.; Mahmoud, M.; Hartmann, H.; Springer, E.
2007-12-01
Some of the most challenging issues facing contemporary water resources management are those typified by complex coupled human-environmental systems with poorly characterized uncertainties. In other words, major decisions regarding water resources have to be made in the face of substantial uncertainty and complexity. It has been suggested that integrated models can be used to coherently assemble information from a broad set of domains, and can therefore serve as an effective means for tackling the complexity of environmental systems. Further, well-conceived scenarios can effectively inform decision making, particularly when high complexity and poorly characterized uncertainties make the problem intractable via traditional uncertainty analysis methods. This presentation discusses the integrated modeling framework adopted by SAHRA, an NSF Science & Technology Center, to investigate stakeholder-driven water sustainability issues within the semi-arid southwestern US. The multi-disciplinary, multi-resolution modeling framework incorporates a formal scenario approach to analyze the impacts of plausible (albeit uncertain) alternative futures to support adaptive management of water resources systems. Some of the major challenges involved in, and lessons learned from, this effort will be discussed.
Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Biros, George
Uncertainty quantification (UQ)—that is, quantifying uncertainties in complex mathematical models and their large-scale computational implementations—is widely viewed as one of the outstanding challenges facing the field of CS&E over the coming decade. The EUREKA project set out to address the most difficult class of UQ problems: those for which both the underlying PDE model and the uncertain parameters are of extreme scale. In the project we worked on these extreme-scale challenges in the following four areas: 1. Scalable parallel algorithms for sampling and characterizing the posterior distribution that exploit the structure of the underlying PDEs and parameter-to-observable map. These include structure-exploiting versions of the randomized maximum likelihood method, which aims to overcome the intractability of employing conventional MCMC methods for solving extreme-scale Bayesian inversion problems by appealing to and adapting ideas from large-scale PDE-constrained optimization, which have been very successful at exploring high-dimensional spaces. 2. Scalable parallel algorithms for construction of prior and likelihood functions based on learning methods and non-parametric density estimation. Constructing problem-specific priors remains a critical challenge in Bayesian inference, and more so in high dimensions. Another challenge is construction of likelihood functions that capture unmodeled couplings between observations and parameters. We created parallel algorithms for non-parametric density estimation using high dimensional N-body methods and combined them with supervised learning techniques for the construction of priors and likelihood functions. 3. Bayesian inadequacy models, which augment physics models with stochastic models that represent their imperfections. The success of the Bayesian inference framework depends on the ability to represent the uncertainty due to imperfections of the mathematical model of the phenomena of interest. This is a central challenge in UQ, especially for large-scale models. We developed the mathematical tools to address these challenges in the context of extreme-scale problems. 4. Parallel scalable algorithms for Bayesian optimal experimental design (OED). Bayesian inversion yields quantified uncertainties in the model parameters, which can be propagated forward through the model to yield uncertainty in outputs of interest. This opens the way for designing new experiments to reduce the uncertainties in the model parameters and model predictions. Such experimental design problems have been intractable for large-scale problems using conventional methods; we created OED algorithms that exploit the structure of the PDE model and the parameter-to-output map to overcome these challenges. Parallel algorithms for these four problems were created, analyzed, prototyped, implemented, tuned, and scaled up for leading-edge supercomputers, including UT-Austin’s own 10 petaflops Stampede system, ANL’s Mira system, and ORNL’s Titan system. While our focus was on fundamental mathematical/computational methods and algorithms, we assessed our methods on model problems derived from several DOE mission applications, including multiscale mechanics and ice sheet dynamics.
Using high-throughput literature mining to support read-across predictions of toxicity (SOT)
Building scientific confidence in the development and evaluation of read-across remains an ongoing challenge. Approaches include establishing systematic frameworks to identify sources of uncertainty and ways to address them. One source of uncertainty is related to characterizing ...
High-throughput literature mining to support read-across predictions of toxicity (ASCCT meeting)
Building scientific confidence in the development and evaluation of read-across remains an ongoing challenge. Approaches include establishing systematic frameworks to identify sources of uncertainty and ways to address them. One source of uncertainty is related to characterizing ...
Quantifying uncertainty in read-across assessment – an algorithmic approach - (SOT)
Read-across is a popular data gap filling technique within category and analogue approaches for regulatory purposes. Acceptance of read-across remains an ongoing challenge with several efforts underway for identifying and addressing uncertainties. Here we demonstrate an algorithm...
Health care providers under pressure: making the most of challenging times.
Davis, Scott B; Robinson, Phillip J
2010-01-01
Whether it is the slowing economic recovery, tight credit markets, increasing costs, or the uncertainty surrounding health care reform, the health care industry faces some sizeable challenges. These factors have put considerable strain on the traditional financing options the industry has relied on in the past--bonds, banks, finance companies, private equity, venture capital, real estate investment trusts, private philanthropy, and grants. At the same time, providers are dealing with rising costs, lower reimbursement rates, shrinking demand for elective procedures, higher levels of charitable care and bad debt, and increased scrutiny of tax-exempt hospitals. Providers face these challenges against a background of uncertainty created by health care reform.
The role of future scenarios to understand deep uncertainty for air quality management.
The environment and its interaction with human systems (economic, social and political) is complex and dynamic. Key drivers may disrupt system dynamics in unforeseen ways, making it difficult to predict future conditions precisely. This kind of deep uncertainty presents a challeng...
Understanding impacts of climate change on hydrodynamic processes and ecosystem response within the Great Lakes is an important and challenging task. Variability in future climate conditions, uncertainty in rainfall-runoff model forecasts, the potential for land use change, and t...
Risk intelligence: making profit from uncertainty in data processing system.
Zheng, Si; Liao, Xiangke; Liu, Xiaodong
2014-01-01
In extreme scale data processing systems, fault tolerance is an essential and indispensable part. Proactive fault tolerance schemes (such as the speculative execution in the MapReduce framework) are introduced to dramatically improve the response time of job executions when failure becomes a norm rather than an exception. Efficient proactive fault tolerance schemes require precise knowledge of task executions, which has been an open challenge for decades. To address this issue, in this paper we design and implement RiskI, a profile-based prediction algorithm in conjunction with a risk-aware task assignment algorithm, to accelerate task executions, taking the uncertain nature of tasks into account. Our design demonstrates that this inherent uncertainty brings not only great challenges but also new opportunities. With a careful design, we can benefit from such uncertainties. We implement the idea in Hadoop 0.21.0 systems and the experimental results show that, compared with the traditional LATE algorithm, the response time can be improved by 46% with the same system throughput.
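A minimal sketch of a risk-aware speculation decision of the kind the abstract describes; the distributions, overhead value, and function names are invented, and RiskI's actual profiling and assignment logic is more involved:

```python
import numpy as np

rng = np.random.default_rng(6)

def should_speculate(remaining_time_samples, fresh_copy_samples, overhead=1.0):
    """Launch a backup copy only when its expected finish beats the straggler's."""
    return fresh_copy_samples.mean() + overhead < remaining_time_samples.mean()

# Profile-based predictions (invented): remaining time of a suspected straggler
# vs. the run-time distribution of a fresh copy on another node, in seconds.
straggler = rng.lognormal(np.log(60.0), 0.5, 1_000)
fresh_copy = rng.lognormal(np.log(30.0), 0.3, 1_000)

print("Launch speculative copy:", should_speculate(straggler, fresh_copy))
```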
NASA Astrophysics Data System (ADS)
Bulthuis, Kevin; Arnst, Maarten; Pattyn, Frank; Favier, Lionel
2017-04-01
Uncertainties in sea-level rise projections are mostly due to uncertainties in Antarctic ice-sheet predictions (IPCC AR5 report, 2013), because key parameters related to the current state of the Antarctic ice sheet (e.g. sub-ice-shelf melting) and future climate forcing are poorly constrained. Here, we propose to improve the predictions of Antarctic ice-sheet behaviour using new uncertainty quantification methods. As opposed to ensemble modelling (Bindschadler et al., 2013), which provides a rather limited view of input and output dispersion, new stochastic methods (Le Maître and Knio, 2010) can provide deeper insight into the impact of uncertainties on complex system behaviour. Such stochastic methods usually begin by deducing a probabilistic description of input parameter uncertainties from the available data. Then, the impact of these input parameter uncertainties on output quantities is assessed by estimating the probability distribution of the outputs by means of uncertainty propagation methods such as Monte Carlo methods or stochastic expansion methods. The use of such uncertainty propagation methods in glaciology may be computationally costly because of the high computational complexity of ice-sheet models. This challenge emphasises the importance of developing reliable and computationally efficient ice-sheet models such as the f.ETISh ice-sheet model (Pattyn, 2015), a new fast thermomechanically coupled ice sheet/ice shelf model capable of handling complex and critical processes such as the marine ice-sheet instability mechanism. Here, we apply these methods to investigate the role of uncertainties in sub-ice-shelf melting, calving rates and climate projections in assessing the Antarctic contribution to sea-level rise over the coming centuries using the f.ETISh model. We detail the methods and show results that provide nominal values and uncertainty bounds for future sea-level rise as a reflection of the impact of the input parameter uncertainties under consideration, as well as a ranking of the input parameter uncertainties in order of the significance of their contribution to uncertainty in future sea-level rise. In addition, we discuss how limitations posed by the available information (poorly constrained data) pose challenges that motivate our current research.
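A minimal sketch of the stochastic expansion idea the abstract contrasts with Monte Carlo: fit a polynomial chaos surrogate to a handful of model runs, then read statistics off the coefficients. The one-dimensional response function is an invented stand-in for an ice-sheet code:

```python
import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as He

rng = np.random.default_rng(7)

def melt_response(x):
    # Invented smooth response to one standardized uncertain input.
    return np.exp(0.3 * x)

# Degree-4 probabilists'-Hermite chaos expansion fitted by least squares:
# a cheap surrogate, so propagation no longer needs the full model.
x = rng.standard_normal(2000)
V = He.hermevander(x, 4)                      # columns He_0(x) ... He_4(x)
coef, *_ = np.linalg.lstsq(V, melt_response(x), rcond=None)

# Mean and variance follow directly from the coefficients (E[He_k^2] = k!).
mean = coef[0]
var = sum(coef[k] ** 2 * factorial(k) for k in range(1, 5))
print(f"PCE mean {mean:.4f} (exact 1.0460), variance {var:.4f} (exact 0.1031)")
```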
Uncertainty quantification in volumetric Particle Image Velocimetry
NASA Astrophysics Data System (ADS)
Bhattacharya, Sayantan; Charonko, John; Vlachos, Pavlos
2016-11-01
Particle Image Velocimetry (PIV) uncertainty quantification is challenging due to coupled sources of elemental uncertainty and complex data reduction procedures in the measurement chain. Recent developments in this field have led to uncertainty estimation methods for planar PIV. However, no framework exists for three-dimensional volumetric PIV. In volumetric PIV the measurement uncertainty is a function of reconstructed three-dimensional particle location that in turn is very sensitive to the accuracy of the calibration mapping function. Furthermore, the iterative correction to the camera mapping function using triangulated particle locations in space (volumetric self-calibration) has its own associated uncertainty due to image noise and ghost particle reconstructions. Here we first quantify the uncertainty in the triangulated particle position which is a function of particle detection and mapping function uncertainty. The location uncertainty is then combined with the three-dimensional cross-correlation uncertainty that is estimated as an extension of the 2D PIV uncertainty framework. Finally the overall measurement uncertainty is quantified using an uncertainty propagation equation. The framework is tested with both simulated and experimental cases. For the simulated cases the variation of estimated uncertainty with the elemental volumetric PIV error sources are also evaluated. The results show reasonable prediction of standard uncertainty with good coverage.
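A minimal sketch of the final combination step such a propagation framework ends in; the error magnitudes, names, and the simple sum-of-squares form are illustrative assumptions, not the authors' full propagation equation:

```python
import numpy as np

def combined_velocity_uncertainty(u_position, u_correlation, dt, magnification):
    """
    First-order (quadrature) combination of independent uncertainty sources
    for a displacement-based velocity estimate, in pixel units converted
    to physical units via the magnification.
    """
    u_displacement = np.sqrt(u_position**2 + u_correlation**2) * magnification
    return u_displacement / dt

# Hypothetical numbers: 0.1 px particle-position uncertainty, 0.05 px
# cross-correlation uncertainty, 1 ms between exposures, 50 um per pixel.
u_v = combined_velocity_uncertainty(0.1, 0.05, dt=1e-3, magnification=50e-6)
print(f"Velocity uncertainty: {u_v * 1e3:.2f} mm/s")
```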
A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules
NASA Astrophysics Data System (ADS)
Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.
2012-08-01
Integrating economic and groundwater models for groundwater management can help improve understanding of the trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and of decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used, including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
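A minimal sketch of the block bootstrap step on a synthetic autocorrelated series; the AR(1) "groundwater level" data and block length are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

def moving_block_bootstrap(series, block_len, n_resamples):
    """Resample a time series in contiguous blocks to preserve autocorrelation."""
    n = len(series)
    n_blocks = int(np.ceil(n / block_len))
    resamples = []
    for _ in range(n_resamples):
        starts = rng.integers(0, n - block_len + 1, n_blocks)
        blocks = [series[s:s + block_len] for s in starts]
        resamples.append(np.concatenate(blocks)[:n])
    return np.array(resamples)

# Invented AR(1) series standing in for an observed groundwater record.
eps = rng.normal(0.0, 1.0, 500)
series = np.zeros(500)
for t in range(1, 500):
    series[t] = 0.8 * series[t - 1] + eps[t]

boot = moving_block_bootstrap(series, block_len=25, n_resamples=1000)
means = boot.mean(axis=1)
print(f"95% interval for the mean: [{np.quantile(means, 0.025):.2f}, "
      f"{np.quantile(means, 0.975):.2f}]")
```

Resampling blocks rather than single points is what keeps the serial dependence of the original record intact in each bootstrap replicate.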
Advanced Modeling and Uncertainty Quantification for Flight Dynamics; Interim Results and Challenges
NASA Technical Reports Server (NTRS)
Hyde, David C.; Shweyk, Kamal M.; Brown, Frank; Shah, Gautam
2014-01-01
As part of the NASA Vehicle Systems Safety Technologies (VSST), Assuring Safe and Effective Aircraft Control Under Hazardous Conditions (Technical Challenge #3), an effort is underway within Boeing Research and Technology (BR&T) to address Advanced Modeling and Uncertainty Quantification for Flight Dynamics (VSST1-7). The scope of the effort is to develop and evaluate advanced multidisciplinary flight dynamics modeling techniques, including integrated uncertainties, to facilitate higher fidelity response characterization of current and future aircraft configurations approaching and during loss-of-control conditions. This approach is to incorporate multiple flight dynamics modeling methods for aerodynamics, structures, and propulsion, including experimental, computational, and analytical. Also to be included are techniques for data integration and uncertainty characterization and quantification. This research shall introduce new and updated multidisciplinary modeling and simulation technologies designed to improve the ability to characterize airplane response in off-nominal flight conditions. The research shall also introduce new techniques for uncertainty modeling that will provide a unified database model comprised of multiple sources, as well as an uncertainty bounds database for each data source such that a full vehicle uncertainty analysis is possible even when approaching or beyond Loss of Control boundaries. Methodologies developed as part of this research shall be instrumental in predicting and mitigating loss of control precursors and events directly linked to causal and contributing factors, such as stall, failures, damage, or icing. The tasks will include utilizing the BR&T Water Tunnel to collect static and dynamic data to be compared to the GTM extended WT database, characterizing flight dynamics in off-nominal conditions, developing tools for structural load estimation under dynamic conditions, devising methods for integrating various modeling elements into a real-time simulation capability, generating techniques for uncertainty modeling that draw data from multiple modeling sources, and providing a unified database model that includes nominal plus increments for each flight condition. This paper presents status of testing in the BR&T water tunnel and analysis of the resulting data and efforts to characterize these data using alternative modeling methods. Program challenges and issues are also presented.
Sufficiently elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-media constructs driven by a unique set of site-specific data. The ensuing challenge of examining ever more complex, integrated, higher-ord...
Role of future scenarios in understanding deep uncertainty in long-term air quality management
The environment and its interactions with human systems, whether economic, social or political, are complex. Relevant drivers may disrupt system dynamics in unforeseen ways, making it difficult to predict future conditions. This kind of deep uncertainty presents a challenge to ...
2016-02-01
In addition, the parser updates some parameters based on uncertainties. For example, Analytica was very slow to update Pk values based on...moderate range. The additional security environments helped to fill gaps in lower severity. Weapons Effectiveness Pk values were modified to account for two...project is to help improve the value and character of defense resource planning in an era of growing uncertainty and complex strategic challenges
Supporting Middle School Students Whose Parents Are Deployed: Challenges and Strategies for Schools
ERIC Educational Resources Information Center
Williams, Brenda
2013-01-01
Middle school students from military families face unique challenges, especially when their parents are deployed. Among the challenges they experience are frequent relocations; issues that affect academic achievement; uncertainty; and changes in roles, responsibilities, and relationships at home. Reunification involves issues of the returning…
ERIC Educational Resources Information Center
Jameson, Jill
2012-01-01
The complex leadership attribute of "negative capability" in managing uncertainty and engendering trust may be amongst the qualities enabling institutions to cope with multiple recent government policy challenges affecting English higher education, including significant increases in student fees. Research findings are reported on changes…
Coping with Economic Uncertainty: Focus on Key Risks Essential
ERIC Educational Resources Information Center
Sander, Laura
2009-01-01
During this period of continued economic uncertainty, higher-education institutions are facing a variety of challenges that by now are very familiar to governing boards and institutional leaders, including poor investment returns, reduced liquidity, limited choices in how they structure debt issues, and threats to flexibility in tuition pricing.…
Uncertainty in Citizen Science observations: from measurement to user perception
NASA Astrophysics Data System (ADS)
Lahoz, William; Schneider, Philipp; Castell, Nuria
2016-04-01
Citizen Science concerns the engagement of the general public in scientific research activities, with citizens actively contributing to science through their intellectual effort, their local knowledge, or their tools and resources. The advent of technologies such as the Internet and smartphones, and the growth in their usage, has significantly increased the potential benefits from Citizen Science activities. Citizen Science observations from low-cost sensors, smartphones, and Citizen Observatories are a novel and recent development in platforms for observing the Earth System, with the opportunity to extend the range of observational platforms available to society to spatio-temporal scales (10-100s m; 1 hr or less) highly relevant to citizen needs. The potential value of Citizen Science is high, with applications in science, education, social aspects, and policy aspects, but this potential, particularly for citizens and policymakers, remains largely untapped. Key areas where Citizen Science data are starting to show demonstrable benefits include GEOSS Societal Benefit Areas such as Health and Weather. Citizen Science observations raise many challenges, including the simulation of smaller spatial scales, noisy data, combination with traditional observational methods (satellite and in situ data), and the assessment, representation, and visualization of uncertainty. Among these challenges, the assessment and representation of uncertainty and its communication to users is fundamental, as it provides the qualitative and/or quantitative information that shapes the confidence users place in environmental information. This presentation will discuss the challenges in the assessment and representation of uncertainty in Citizen Science observations, its communication to users, including the use of visualization, and the perception of this uncertainty information by users of Citizen Science observations.
Uncertainty quantification in downscaling procedures for effective decisions in energy systems
NASA Astrophysics Data System (ADS)
Constantinescu, E. M.
2010-12-01
Weather is a major driver of both energy supply and demand, and with the massive adoption of renewable energy sources and changing economic and producer-consumer paradigms, the management of next-generation energy systems is becoming ever more challenging. The operational and planning decisions in energy systems are guided by efficiency and reliability, and therefore a central role in these decisions will be played by the ability to obtain weather forecasts with accurate uncertainty estimates. The appropriate temporal and spatial resolutions needed for effective decision-making, be it operational or planning, are not clear. It is arguably certain, however, that temporal scales such as hourly variations of temperature or wind conditions, and ramp events, are essential in this process. Planning activities involve decade-long or multi-decade projections of weather. One sensible way to achieve this is to embed regional weather models in a global climate system; this strategy acts as a downscaling procedure. Uncertainty modeling techniques must be developed in order to quantify and minimize forecast errors, as well as to target the variables that impact the decision-making process the most. We discuss the challenges of obtaining realistic uncertainty quantification estimates using mathematical algorithms based on scalable matrix-free computations and physics-based statistical models. The process of making decisions for energy management systems based on future weather scenarios is a very complex problem. We focus on the challenges in generating wind power predictions based on regional weather predictions, and discuss the implications of making the common assumptions about the uncertainty models.
NASA Astrophysics Data System (ADS)
Pu, Zhiqiang; Tan, Xiangmin; Fan, Guoliang; Yi, Jianqiang
2014-08-01
Flexible air-breathing hypersonic vehicles feature significant uncertainties that pose major challenges to robust controller design. In this paper, four major categories of uncertainties are analyzed: uncertainties associated with flexible effects, aerodynamic parameter variations, external environmental disturbances, and control-oriented modeling errors. A uniform nonlinear uncertainty model is explored for the first three uncertainties, which lumps all uncertainties together and is consequently beneficial for controller synthesis. The fourth uncertainty is additionally considered in stability analysis. Based on these analyses, the starting point of the control design is to decompose the vehicle dynamics into five functional subsystems. A robust trajectory linearization control (TLC) scheme consisting of five robust subsystem controllers is then proposed. In each subsystem controller, TLC is combined with the extended state observer (ESO) technique for uncertainty compensation. The stability of the overall closed-loop system with the four aforementioned uncertainties and additional singular perturbations is analyzed. In particular, the stability of the nonlinear ESO is also discussed from a Liénard system perspective. Finally, simulations demonstrate the control performance and the uncertainty rejection ability of the robust scheme.
Medical Humanities: The Rx for Uncertainty?
Ofri, Danielle
2017-12-01
While medical students often fear the avalanche of knowledge they are required to learn during training, it is learning to translate that knowledge into wisdom that is the greatest challenge of becoming a doctor. Part of that challenge is learning to tolerate ambiguity and uncertainty, a difficult feat for doctors who are taught to question anything that is not evidence based or peer reviewed. The medical humanities specialize in this ambiguity and uncertainty, which are hallmarks of actual clinical practice but rarely addressed in medical education. The humanities also force reflection and contemplation, skills that are crucial to thoughtful decision making and to personal wellness. Beyond that, the humanities add a dose of joy and beauty to a training process that is notoriously frugal in these departments. Well integrated, the humanities can be the key to transforming medical knowledge into clinical wisdom.
Design Optimization of Composite Structures under Uncertainty
NASA Technical Reports Server (NTRS)
Haftka, Raphael T.
2003-01-01
Design optimization under uncertainty is computationally expensive and is also challenging in terms of alternative formulation. The work under the grant focused on developing methods for design against uncertainty that are applicable to composite structural design with emphasis on response surface techniques. Applications included design of stiffened composite plates for improved damage tolerance, the use of response surfaces for fitting weights obtained by structural optimization, and simultaneous design of structure and inspection periods for fail-safe structures.
Information theoretic quantification of diagnostic uncertainty.
Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T
2012-01-01
Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes' theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
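As a concrete anchor for the Bayes' rule step described above, here is a minimal Python sketch; the sensitivity, specificity, and pre-test values are illustrative placeholders, not figures from the essay.

    def post_test_probability(pretest_p, sensitivity, specificity, positive=True):
        """Update disease probability with a test result via Bayes' rule,
        using likelihood ratios: posterior odds = prior odds * LR."""
        prior_odds = pretest_p / (1 - pretest_p)
        lr = sensitivity / (1 - specificity) if positive else (1 - sensitivity) / specificity
        post_odds = prior_odds * lr
        return post_odds / (1 + post_odds)

    # Illustrative values: a test with 90% sensitivity and 80% specificity.
    p = post_test_probability(0.30, 0.90, 0.80, positive=True)
    print(f"pre-test 30%, positive result -> post-test {p:.0%}")   # ~66%
    p = post_test_probability(0.30, 0.90, 0.80, positive=False)
    print(f"pre-test 30%, negative result -> post-test {p:.0%}")   # ~5%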
Incorporating uncertainty into medical decision making: an approach to unexpected test results.
Bianchi, Matt T; Alexander, Brian M; Cash, Sydney S
2009-01-01
The utility of diagnostic tests derives from the ability to translate the population concepts of sensitivity and specificity into information that will be useful for the individual patient: the predictive value of the result. As the array of available diagnostic testing broadens, there is a temptation to de-emphasize history and physical findings and defer to the objective rigor of technology. However, diagnostic test interpretation is not always straightforward. One significant barrier to routine use of probability-based test interpretation is the uncertainty inherent in pretest probability estimation, the critical first step of Bayesian reasoning. The context in which this uncertainty presents the greatest challenge is when test results oppose clinical judgment. It is in this situation that decision support would be most helpful. The authors propose a simple graphical approach that incorporates uncertainty in pretest probability and has specific application to the interpretation of unexpected results. This method quantitatively demonstrates how uncertainty in disease probability may be amplified when test results are unexpected (opposing clinical judgment), even for tests with high sensitivity and specificity. The authors provide a simple nomogram for determining whether an unexpected test result suggests that one should "switch diagnostic sides." This graphical framework overcomes the limitation of pretest probability uncertainty in Bayesian analysis and guides decision making when it is most challenging: interpretation of unexpected test results.
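A hedged sketch of the idea, with invented numbers: propagating a pre-test probability range, rather than a point estimate, through Bayes' rule shows how an unexpected negative result interacts with clinical judgment. This illustrates the general principle, not the authors' nomogram.

    def post_test(p, sens, spec, positive=True):
        # Posterior probability from prior odds times the likelihood ratio.
        odds = p / (1 - p)
        lr = sens / (1 - spec) if positive else (1 - sens) / spec
        return odds * lr / (1 + odds * lr)

    # Clinical judgment says disease is likely (60-90% pre-test), but the test
    # (95% sensitive, 90% specific; invented values) comes back negative.
    lo, hi = (post_test(p, 0.95, 0.90, positive=False) for p in (0.60, 0.90))
    print(f"post-test range after unexpected negative: {lo:.0%} - {hi:.0%}")
    # If the whole interval falls below 50%, the result suggests "switching
    # diagnostic sides"; if it straddles 50%, pre-test uncertainty dominates.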
Perceptual uncertainty and line-call challenges in professional tennis
Mather, George
2008-01-01
Fast-moving sports such as tennis require both players and match officials to make rapid, accurate perceptual decisions about dynamic events in the visual world. Disagreements arise regularly, leading to disputes about decisions such as line calls. A number of factors must contribute to these disputes, including lapses in concentration, bias and gamesmanship. Fundamental uncertainty or variability in the sensory information supporting decisions must also play a role. Modern technological innovations now provide detailed and accurate physical information that can be compared against the decisions of players and officials. The present paper uses this psychophysical data to assess the significance of perceptual limitations as a contributor to real-world decisions in professional tennis. A detailed analysis is presented of a large body of data on line-call challenges in professional tennis tournaments over the last 2 years. Results reveal that the vast majority of challenges can be explained in a direct, highly predictable manner by a simple model of uncertainty in perceptual information processing. Both players and line judges are remarkably accurate at judging ball bounce position, with a positional uncertainty of less than 40 mm. Line judges are more reliable than players. Judgements are more difficult for balls bouncing near base and service lines than for those bouncing near side and centre lines. There is no evidence for significant errors in localization due to image motion. PMID:18426755
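The simple perceptual model described can be made concrete: if the perceived bounce position carries zero-mean Gaussian noise with standard deviation sigma, the probability of perceiving the ball on the wrong side of the line is the normal tail probability at the true distance from the line. The sketch below uses sigma = 40 mm, the upper bound reported above; the distances are illustrative.

    from math import erf, sqrt

    def wrong_side_probability(distance_mm, sigma_mm=40.0):
        """Probability that an observer with Gaussian positional noise (std sigma)
        perceives a ball on the wrong side of the line, for a bounce landing
        distance_mm from the line edge: Phi(-d/sigma)."""
        return 0.5 * (1 - erf(distance_mm / (sigma_mm * sqrt(2))))

    for d in (10, 20, 40, 80):
        print(f"bounce {d:3d} mm from line: P(wrong-side percept) = "
              f"{wrong_side_probability(d):.2f}")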
Changing Global Risk Landscape - Challenges for Risk Management (Invited)
NASA Astrophysics Data System (ADS)
Wenzel, F.
2009-12-01
The exponentially growing losses related to natural disasters on a global scale reflect a changing risk landscape that is characterized by the influence of climate change and a growing population, particularly in urban agglomerations and coastal zones. In consequence of these trends we witness (a) new hazards, such as landslides due to dwindling permafrost, new patterns of strong precipitation and related floods, potential for tropical cyclones in the Mediterranean, sea level rise, and others; (b) new risks related to large numbers of people in very dense urban areas, and risks related to the vulnerability of infrastructure such as energy supply, water supply, transportation, and communication; and (c) extreme events of unprecedented size and implications. An appropriate answer to these challenges goes beyond classical views of risk assessment and protection. It must include an understanding of risk as changing with time, so that risk assessment needs to be supplemented by risk monitoring. It requires decision making under high uncertainty. The risks (i.e., potentials for future losses) of extreme events are not only high but also very difficult to quantify, as they are characterized by high levels of uncertainty. Uncertainties relate to the frequency, time of occurrence, strength, and impact of extreme events, but also to the coping capacities of society in response to them. The characterization, quantification, and, to the extent possible, reduction of these uncertainties is an inherent topic of extreme event research. However, the uncertainties will not disappear, so a rational approach to extreme events must go beyond reducing them: it requires us to assess and rate the irreducible uncertainties, to evaluate options for mitigation under large uncertainties, and to communicate these to societal sectors. Scientists therefore need to develop methodologies that aim at a rational approach to extreme events associated with high levels of uncertainty.
Reddington, C. L.; Carslaw, K. S.; Stier, P.; ...
2017-09-01
The largest uncertainty in the historical radiative forcing of climate is caused by changes in aerosol particles due to anthropogenic activity. Sophisticated aerosol microphysics processes have been included in many climate models in an effort to reduce the uncertainty. However, the models are very challenging to evaluate and constrain because they require extensive in situ measurements of the particle size distribution, number concentration, and chemical composition that are not available from global satellite observations. The Global Aerosol Synthesis and Science Project (GASSP) aims to improve the robustness of global aerosol models by combining new methodologies for quantifying model uncertainty, to create an extensive global dataset of aerosol in situ microphysical and chemical measurements, and to develop new ways to assess the uncertainty associated with comparing sparse point measurements with low-resolution models. GASSP has assembled over 45,000 hours of measurements from ships and aircraft as well as data from over 350 ground stations. The measurements have been harmonized into a standardized format that is easily used by modelers and nonspecialist users. Available measurements are extensive, but they are biased to polluted regions of the Northern Hemisphere, leaving large pristine regions and many continental areas poorly sampled. The aerosol radiative forcing uncertainty can be reduced using a rigorous model–data synthesis approach. Nevertheless, our research highlights significant remaining challenges because of the difficulty of constraining many interwoven model uncertainties simultaneously. Although the physical realism of global aerosol models still needs to be improved, the uncertainty in aerosol radiative forcing will be reduced most effectively by systematically and rigorously constraining the models using extensive syntheses of measurements.
NASA Astrophysics Data System (ADS)
Rajabi, Mohammad Mahdi; Ketabchi, Hamed
2017-12-01
Combined simulation-optimization (S/O) schemes have long been recognized as a valuable tool in coastal groundwater management (CGM). However, previous applications have mostly relied on deterministic seawater intrusion (SWI) simulations. This is a questionable simplification, knowing that SWI models are inevitably prone to epistemic and aleatory uncertainty, and hence a management strategy obtained through S/O without consideration of uncertainty may result in significantly different real-world outcomes than expected. However, two key issues have hindered the use of uncertainty-based S/O schemes in CGM, and both are addressed in this paper. The first issue is how to solve the computational challenges resulting from the need to perform massive numbers of simulations. The second issue is how the management problem is formulated in the presence of uncertainty. We propose the use of Gaussian process (GP) emulation as a valuable tool in solving the computational challenges of uncertainty-based S/O in CGM. We apply GP emulation to the case study of Kish Island (located in the Persian Gulf) using an uncertainty-based S/O algorithm which relies on continuous ant colony optimization and Monte Carlo simulation. In doing so, we show that GP emulation can provide an acceptable level of accuracy, with no bias and low statistical dispersion, while tremendously reducing the computational time. Moreover, five new formulations for uncertainty-based S/O are presented based on concepts such as energy distances, prediction intervals, and probabilities of SWI occurrence. We analyze the proposed formulations with respect to their resulting optimized solutions, the sensitivity of the solutions to the intended reliability levels, and the variations resulting from repeated optimization runs.
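The emulation idea generalizes readily. Below is a minimal sketch using scikit-learn's Gaussian process regressor as a surrogate for an expensive simulator inside a Monte Carlo loop; the one-dimensional stand-in objective is invented and bears no relation to the authors' Kish Island model.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    def expensive_simulator(x):
        # Stand-in for a SWI simulation: maps pumping rate to an intrusion metric.
        return np.sin(3 * x) + 0.5 * x

    # Train the emulator on a small design of "expensive" runs.
    X_train = np.linspace(0, 2, 12).reshape(-1, 1)
    y_train = expensive_simulator(X_train).ravel()
    gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
    gp.fit(X_train, y_train)

    # Monte Carlo over uncertain pumping rates now costs microseconds per sample.
    rng = np.random.default_rng(0)
    samples = rng.uniform(0, 2, size=(10000, 1))
    mean, std = gp.predict(samples, return_std=True)
    print(f"MC mean of intrusion metric: {mean.mean():.3f}; "
          f"mean emulator std (accuracy check): {std.mean():.3f}")

In practice the emulator's predictive standard deviation, returned alongside the mean, can be checked against held-out simulator runs before the Monte Carlo results are trusted.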
Methods for handling uncertainty within pharmaceutical funding decisions
NASA Astrophysics Data System (ADS)
Stevenson, Matt; Tappenden, Paul; Squires, Hazel
2014-01-01
This article provides a position statement regarding decision making under uncertainty within the economic evaluation of pharmaceuticals, with a particular focus upon the National Institute for Health and Clinical Excellence context within England and Wales. This area is of importance as funding agencies have a finite budget from which to purchase a selection of competing health care interventions. The objective function generally used is that of maximising societal health, with an explicit acknowledgement that there will be opportunity costs associated with purchasing a particular intervention. Three components of uncertainty are discussed within a pharmaceutical funding perspective: methodological uncertainty, parameter uncertainty, and structural uncertainty, alongside a discussion of challenges that are particularly pertinent to health economic evaluation. The discipline has focused primarily on handling methodological and parameter uncertainty, and a clear reference case has been developed for consistency across evaluations. However, uncertainties still remain. Less attention has been given to methods for handling structural uncertainty. The lack of adequate methods to explicitly incorporate this aspect of model development may result in the true uncertainty surrounding health care investment decisions being underestimated. Research in this area is ongoing at the time of this review.
Defining the measurand in radius of curvature measurements
NASA Astrophysics Data System (ADS)
Davies, Angela; Schmitz, Tony L.
2003-11-01
Traceable radius of curvature measurements are critical for precision optics manufacture. An optical bench measurement of radius is very repeatable and is the preferred method for low-uncertainty applications. On an optical bench, the displacement of the optic is measured as it is moved between the cat's eye and confocal positions, each identified using a figure measuring interferometer. Traceability requires connection to a basic unit (the meter, here) in addition to a defensible uncertainty analysis, and the identification and proper propagation of all uncertainty sources in this measurement is challenging. Recent work has focused on identifying all uncertainty contributions; measurement biases have been approximately taken into account and uncertainties combined in an RSS sense for a final measurement estimate and uncertainty. In this paper we report on a new mathematical definition of the radius measurand, which is a single function that depends on all uncertainty sources, such as error motions, alignment uncertainty, displacement gauge uncertainty, etc. The method is based on a homogeneous transformation matrix (HTM) formalism, and intrinsically defines an unbiased estimate for radius, providing a single mathematical expression for uncertainty propagation through a Taylor-series expansion.
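The Taylor-series propagation mentioned above has a standard first-order form, u_y^2 = sum_i (df/dx_i)^2 u_i^2. The generic sketch below evaluates it numerically; the toy measurand (a displacement corrected for a small misalignment angle) and all values are placeholders, not the paper's HTM expression.

    import numpy as np

    def propagate_uncertainty(f, x, u, h=1e-6):
        """First-order (GUM-style) uncertainty propagation:
        u_y^2 = sum_i (df/dx_i)^2 * u_i^2, derivatives by central differences."""
        x = np.asarray(x, dtype=float)
        grads = np.empty_like(x)
        for i in range(len(x)):
            dx = np.zeros_like(x)
            dx[i] = h
            grads[i] = (f(x + dx) - f(x - dx)) / (2 * h)
        return np.sqrt(np.sum((grads * np.asarray(u)) ** 2))

    # Toy measurand: radius as measured displacement d (mm) corrected for a small
    # misalignment angle theta (rad), R = d / cos(theta). Values are illustrative.
    f = lambda v: v[0] / np.cos(v[1])
    R_u = propagate_uncertainty(f, x=[250.0, 0.001], u=[0.0005, 0.0002])
    print(f"combined standard uncertainty in R: {R_u * 1000:.2f} um")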
Climate change, ecosystem impacts, and management for Pacific salmon
D.E. Schindler; X. Augerot; E. Fleishman; N.J. Mantua; B. Riddell; M. Ruckelshaus; J. Seeb; M. Webster
2008-01-01
As climate change intensifies, there is increasing interest in developing models that reduce uncertainties in projections of global climate and refine these projections to finer spatial scales. Forecasts of climate impacts on ecosystems are far more challenging and their uncertainties even larger because of a limited understanding of physical controls on biological...
Examining the effects of transportation governance on infrastructure adaptation to climate change.
DOT National Transportation Integrated Search
2015-05-01
Transportation agencies across the United States are faced with the challenge of effectively adapting infrastructure to withstand the predicted effects of climate change. This challenge is magnified by a nationwide funding shortage, uncertainty a...
Making sense of genetic uncertainty: the role of religion and spirituality.
White, Mary T
2009-02-15
This article argues that to the extent that religious and spiritual beliefs can help people cope with genetic uncertainty, a limited spiritual assessment may be appropriate in genetic counseling. The article opens by establishing why genetic information is inherently uncertain and why this uncertainty can be medically, morally, and spiritually problematic. This is followed by a review of the range of factors that can contribute to risk assessments, including a few heuristics commonly used in responses to uncertainty. The next two sections summarize recent research on the diverse roles of religious and spiritual beliefs in genetic decisions and challenges to conducting spiritual assessments in genetic counseling. Based on these findings, religious and spiritual beliefs are posited as serving essentially as a heuristic that some people will utilize in responding to their genetic risks. In the interests of helping such clients make informed decisions, a limited spiritual assessment is recommended and described. Some of the challenges and risks associated with this limited assessment are discussed. Since some religious and spiritual beliefs can conflict with the values of medicine, some decisions will remain problematic. (c) 2009 Wiley-Liss, Inc.
Integrating uncertainty into public energy research and development decisions
NASA Astrophysics Data System (ADS)
Anadón, Laura Díaz; Baker, Erin; Bosetti, Valentina
2017-05-01
Public energy research and development (R&D) is recognized as a key policy tool for transforming the world's energy system in a cost-effective way. However, managing the uncertainty surrounding technological change is a critical challenge for designing robust and cost-effective energy policies. The design of such policies is particularly important if countries are going to both meet the ambitious greenhouse-gas emissions reductions goals set by the Paris Agreement and achieve the required harmonization with the broader set of objectives dictated by the Sustainable Development Goals. The complexity of informing energy technology policy requires, and is producing, a growing collaboration between different academic disciplines and practitioners. Three analytical components have emerged to support the integration of technological uncertainty into energy policy: expert elicitations, integrated assessment models, and decision frameworks. Here we review efforts to incorporate all three approaches to facilitate public energy R&D decision-making under uncertainty. We highlight emerging insights that are robust across elicitations, models, and frameworks, relating to the allocation of public R&D investments, and identify gaps and challenges that remain.
Beever, Erik A; Mattsson, Brady J; Germino, Matthew J; Burg, Max Post Van Der; Bradford, John B; Brunson, Mark W
2014-04-01
Integration of conservation partnerships across geographic, biological, and administrative boundaries is increasingly relevant because drivers of change, such as climate shifts, transcend these boundaries. We explored successes and challenges of established conservation programs that span multiple watersheds and consider both social and ecological concerns. We asked representatives from a diverse set of 11 broad-extent conservation partnerships, spanning 29 countries, 17 questions pertaining to launching and maintaining partnerships for broad-extent conservation, specifying ultimate management objectives, and implementation and learning. Partnerships invested more funds in implementing conservation actions than in any other aspect of conservation, and a program's context (geographic extent, United States vs. other countries, developed vs. developing nation) appeared to substantially affect program approach. Despite early successes of these organizations and the benefits of broad-extent conservation, specific challenges related to uncertainties in scaling up information and to coordination in the face of diverse partner governance structures, conflicting objectives, and vast uncertainties regarding future system dynamics hindered long-term success, as demonstrated by the focal organizations. Engaging stakeholders, developing conservation measures, and implementing adaptive management were dominant challenges. To inform future research on broad-extent conservation, we considered several challenges when we developed detailed questions, such as: what qualities of broad-extent partnerships ensure that they complement, integrate, and strengthen, rather than replace, local conservation efforts? And which adaptive management processes yield actionable conservation strategies that account explicitly for dynamics and uncertainties regarding multiscale governance, environmental conditions, and knowledge of the system? © 2014 Society for Conservation Biology.
Mittal, Chiraag; Griskevicius, Vladas
2014-10-01
Past research found that environmental uncertainty leads people to behave differently depending on their childhood environment. For example, economic uncertainty leads people from poor childhoods to become more impulsive while leading people from wealthy childhoods to become less impulsive. Drawing on life history theory, we examine the psychological mechanism driving such diverging responses to uncertainty. Five experiments show that uncertainty alters people's sense of control over the environment. Exposure to uncertainty led people from poorer childhoods to have a significantly lower sense of control than those from wealthier childhoods. In addition, perceptions of control statistically mediated the effect of uncertainty on impulsive behavior. These studies contribute by demonstrating that sense of control is a psychological driver of behaviors associated with fast and slow life history strategies. We discuss the implications of this for theory and future research, including that environmental uncertainty might lead people who grew up poor to quit challenging tasks sooner than people who grew up wealthy. 2014 APA, all rights reserved
Zhang, X.; McGuire, A.D.; Ruess, Roger W.
2006-01-01
A major challenge confronting the scientific community is to understand both patterns of and controls over spatial and temporal variability of carbon exchange between boreal forest ecosystems and the atmosphere. An understanding of the sources of variability of carbon processes at fine scales and how these contribute to uncertainties in estimating carbon fluxes is relevant to representing these processes at coarse scales. To explore some of the challenges and uncertainties in estimating carbon fluxes at fine to coarse scales, we conducted a modeling analysis of canopy foliar maintenance respiration for black spruce ecosystems of Alaska by scaling empirical hourly models of foliar maintenance respiration (Rm) to estimate canopy foliar Rm for individual stands. We used variation in foliar N concentration among stands to develop hourly stand-specific models and then developed an hourly pooled model. An uncertainty analysis identified that the most important parameter affecting estimates of canopy foliar Rm was one that describes Rm at 0 °C per g N, which explained more than 55% of variance in annual estimates of canopy foliar Rm. The comparison of simulated annual canopy foliar Rm identified significant differences between stand-specific and pooled models for each stand. This result indicates that control over foliar N concentration should be considered in models that estimate canopy foliar Rm of black spruce stands across the landscape. In this study, we also temporally scaled the hourly stand-level models to estimate canopy foliar Rm of black spruce stands using mean monthly temperature data. Comparisons of monthly Rm between the hourly and monthly versions of the models indicated that there was very little difference between the estimates of hourly and monthly models, suggesting that hourly models can be aggregated to use monthly input data with little loss of precision. We conclude that uncertainties in the use of a coarse-scale model for estimating canopy foliar Rm at regional scales depend on uncertainties in representing needle-level respiration and on uncertainties in representing the spatial variability of canopy foliar N across a region. The development of spatial data sets of canopy foliar N represents a major challenge in estimating canopy foliar maintenance respiration at regional scales. © Springer 2006.
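The temporal-scaling comparison can be illustrated with a standard exponential temperature-response model in which the key parameter is respiration at 0 °C per g N. All coefficients below are placeholders, not the paper's fitted values; the synthetic month of hourly temperatures simply demonstrates the aggregation test.

    import numpy as np

    def foliar_rm(temp_c, n_conc, r0=0.5, k=0.069):
        """Foliar maintenance respiration: r0 plays the role of Rm at 0 C per g N
        (the dominant parameter in the uncertainty analysis); k ~ ln(Q10)/10
        with an assumed Q10 of 2. All values are illustrative."""
        return r0 * n_conc * np.exp(k * temp_c)

    # Compare hourly-driven vs monthly-mean-driven estimates for one month
    # (30 days of synthetic hourly temperatures with a diurnal cycle plus noise).
    rng = np.random.default_rng(3)
    hourly_t = 12 + 6 * np.sin(np.linspace(0, 30 * 2 * np.pi, 720)) \
               + rng.normal(0, 2, 720)
    n = 1.1  # g N, illustrative canopy value
    hourly_est = foliar_rm(hourly_t, n).sum()
    monthly_est = foliar_rm(hourly_t.mean(), n) * 720
    print(f"hourly sum: {hourly_est:.0f}; monthly-mean x hours: {monthly_est:.0f}; "
          f"bias: {100 * (monthly_est / hourly_est - 1):.1f}%")

Because the response is convex in temperature, driving the model with the monthly mean gives a small low bias (Jensen's inequality), consistent with the paper's finding that the aggregation loses little precision.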
Gregersen, I B; Arnbjerg-Nielsen, K
2012-01-01
Several extraordinary rainfall events have occurred in Denmark within the last few years. For each event, problems occurred in urban areas as the capacity of the existing drainage systems was exceeded. Adaptation to climate change is necessary but also very challenging, as urban drainage systems are characterized by long technical lifetimes and high, unrecoverable construction costs. One of the most important barriers to the initiation and implementation of adaptation strategies is therefore the uncertainty in predicting the magnitude of extreme rainfall in the future. This challenge is explored through the application and discussion of three different theoretical decision support strategies: the precautionary principle, the minimax strategy, and Bayesian decision support. The reviewed decision support strategies all proved valuable for addressing the identified uncertainties, and they are best applied together, as each yields information that improves decision making and thus enables more robust decisions.
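Two of the three strategies lend themselves to a toy numerical contrast (the precautionary principle is qualitative). In the sketch below, the cost matrix and scenario probabilities are invented for illustration only.

    import numpy as np

    # Rows: drainage design options (capacity upgrade levels); columns: future
    # rainfall scenarios. Entries: total cost (construction + expected flood
    # damage) in arbitrary units, all invented.
    costs = np.array([[ 9, 30, 80],    # no upgrade
                      [14, 18, 40],    # moderate upgrade
                      [25, 26, 28]])   # large upgrade
    p_scenarios = np.array([0.5, 0.3, 0.2])  # subjective probabilities (Bayesian view)

    minimax_choice = np.argmin(costs.max(axis=1))   # best worst case
    bayes_choice = np.argmin(costs @ p_scenarios)   # lowest expected cost
    print(f"minimax picks option {minimax_choice}, Bayesian picks option {bayes_choice}")

The two criteria can recommend different designs, as here: minimax guards the worst case at a higher expected cost, which is precisely the trade-off a decision maker must weigh under deep rainfall uncertainty.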
Uncertainty in weather and climate prediction
Slingo, Julia; Palmer, Tim
2011-01-01
Following Lorenz's seminal work on chaos theory in the 1960s, probabilistic approaches to prediction have come to dominate the science of weather and climate forecasting. This paper gives a perspective on Lorenz's work and how it has influenced the ways in which we seek to represent uncertainty in forecasts on all lead times from hours to decades. It looks at how model uncertainty has been represented in probabilistic prediction systems and considers the challenges posed by a changing climate. Finally, the paper considers how the uncertainty in projections of climate change can be addressed to deliver more reliable and confident assessments that support decision-making on adaptation and mitigation. PMID:22042896
Raben, Jaime S; Hariharan, Prasanna; Robinson, Ronald; Malinauskas, Richard; Vlachos, Pavlos P
2016-03-01
We present advanced particle image velocimetry (PIV) processing, post-processing, and uncertainty estimation techniques to support the validation of computational fluid dynamics analyses of medical devices. This work is an extension of a previous FDA-sponsored multi-laboratory study, which used a medical device mimicking geometry referred to as the FDA benchmark nozzle model. Experimental measurements were performed using time-resolved PIV at five overlapping regions of the model for Reynolds numbers in the nozzle throat of 500, 2000, 5000, and 8000. Images included a twofold increase in spatial resolution in comparison to the previous study. Data was processed using ensemble correlation, dynamic range enhancement, and phase correlations to increase signal-to-noise ratios and measurement accuracy, and to resolve flow regions with large velocity ranges and gradients, which is typical of many blood-contacting medical devices. Parameters relevant to device safety, including shear stress at the wall and in bulk flow, were computed using radial basis functions. In addition, in-field spatially resolved pressure distributions, Reynolds stresses, and energy dissipation rates were computed from PIV measurements. Velocity measurement uncertainty was estimated directly from the PIV correlation plane, and uncertainty analysis for wall shear stress at each measurement location was performed using a Monte Carlo model. Local velocity uncertainty varied greatly and depended largely on local conditions such as particle seeding, velocity gradients, and particle displacements. Uncertainty in low velocity regions in the sudden expansion section of the nozzle was greatly reduced by over an order of magnitude when dynamic range enhancement was applied. Wall shear stress uncertainty was dominated by uncertainty contributions from velocity estimations, which were shown to account for 90-99% of the total uncertainty. This study provides advancements in the PIV processing methodologies over the previous work through increased PIV image resolution, use of robust image processing algorithms for near-wall velocity measurements and wall shear stress calculations, and uncertainty analyses for both velocity and wall shear stress measurements. The velocity and shear stress analysis, with spatially distributed uncertainty estimates, highlights the challenges of flow quantification in medical devices and provides potential methods to overcome such challenges.
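A minimal sketch of the Monte Carlo idea for wall shear stress: perturb near-wall velocities by their estimated PIV uncertainties, refit the wall-normal velocity gradient, and collect the resulting stress distribution. All values below are invented; the study's actual model draws uncertainties from the PIV correlation plane.

    import numpy as np

    rng = np.random.default_rng(7)
    mu = 1.0e-3          # Pa*s, fluid viscosity (assumed known)
    y = np.array([1.0e-4, 2.0e-4, 3.0e-4])        # m, wall-normal point locations
    u = np.array([0.020, 0.041, 0.059])           # m/s, measured velocities
    u_sigma = np.array([0.002, 0.002, 0.003])     # m/s, per-point PIV uncertainty

    # Monte Carlo: perturb velocities, fit du/dy through the wall (u = 0 at y = 0),
    # and collect the resulting wall-shear-stress distribution.
    n = 20000
    taus = np.empty(n)
    for i in range(n):
        u_i = u + rng.normal(0, u_sigma)
        dudy = np.linalg.lstsq(y.reshape(-1, 1), u_i, rcond=None)[0][0]
        taus[i] = mu * dudy
    print(f"tau_w = {taus.mean():.3f} +/- {taus.std():.3f} Pa "
          f"(95% CI: {np.percentile(taus, 2.5):.3f} - {np.percentile(taus, 97.5):.3f})")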
ERIC Educational Resources Information Center
Bond, Clare Elizabeth; Philo, Chris; Shipton, Zoe Kai
2011-01-01
A key challenge in university geoscience teaching is to give students the skills to cope with uncertainty. Professional geoscientists can rarely be certain of the "right answer" to problems posed by most geological datasets, and reasoning through this uncertainty, being intelligently flexible in interpreting data which are limited in resolution…
Karin Riley; Matthew Thompson; Peter Webley; Kevin D. Hyde
2017-01-01
Modeling has been used to characterize and map natural hazards and hazard susceptibility for decades. Uncertainties are pervasive in natural hazards analysis, including a limited ability to predict where and when extreme events will occur, with what consequences, and driven by what contributing factors. Modeling efforts are challenged by the intrinsic...
A Statistician's View of Upcoming Grand Challenges
NASA Astrophysics Data System (ADS)
Meng, Xiao Li
2010-01-01
In this session we have seen some snapshots of the broad spectrum of challenges, in this age of huge, complex, computer-intensive models, data, instruments, and questions. These challenges bridge astronomy at many wavelengths, basic physics, machine learning, and statistics. At one end of our spectrum, we think of 'compressing' the data with non-parametric methods. This raises the question of creating 'pseudo-replicas' of the data for uncertainty estimates. What would be involved in, e.g., bootstrap and related methods? Somewhere in the middle are these non-parametric methods for encapsulating the uncertainty information. At the far end, we find more model-based approaches, with the physics model embedded in the likelihood and analysis. The other distinctive problem is really the 'black-box' problem, where one has a complicated (e.g., fundamental physics-based) computer code, or 'black box', and one needs to know how changing the parameters at input -- due to uncertainties of any kind -- will map to changes in the output. All of these connect to challenges in complexity of data and computation speed. Dr. Meng will highlight ways to 'cut corners' with advanced computational techniques, such as Parallel Tempering and Equal Energy methods. As well, there are cautionary tales of running automated analysis on real data -- where "30 sigma" outliers due to data artifacts can be more common than the astrophysical event of interest.
Markov decision processes in natural resources management: observability and uncertainty
Williams, Byron K.
2015-01-01
The breadth and complexity of stochastic decision processes in natural resources presents a challenge to analysts who need to understand and use these approaches. The objective of this paper is to describe a class of decision processes that are germane to natural resources conservation and management, namely Markov decision processes, and to discuss applications and computing algorithms under different conditions of observability and uncertainty. A number of important similarities are developed in the framing and evaluation of different decision processes, which can be useful in their applications in natural resources management. The challenges attendant to partial observability are highlighted, and possible approaches for dealing with it are discussed.
Mishra, U.; Jastrow, J.D.; Matamala, R.; Hugelius, G.; Koven, C.D.; Harden, Jennifer W.; Ping, S.L.; Michaelson, G.J.; Fan, Z.; Miller, R.M.; McGuire, A.D.; Tarnocai, C.; Kuhry, P.; Riley, W.J.; Schaefer, K.; Schuur, E.A.G.; Jorgenson, M.T.; Hinzman, L.D.
2013-01-01
The vast amount of organic carbon (OC) stored in soils of the northern circumpolar permafrost region is a potentially vulnerable component of the global carbon cycle. However, estimates of the quantity, decomposability, and combustibility of OC contained in permafrost-region soils remain highly uncertain, thereby limiting our ability to predict the release of greenhouse gases due to permafrost thawing. Substantial differences exist between empirical and modeling estimates of the quantity and distribution of permafrost-region soil OC, which contribute to large uncertainties in predictions of carbon–climate feedbacks under future warming. Here, we identify research challenges that constrain current assessments of the distribution and potential decomposability of soil OC stocks in the northern permafrost region and suggest priorities for future empirical and modeling studies to address these challenges.
NASA Astrophysics Data System (ADS)
Brekke, L. D.; Clark, M. P.; Gutmann, E. D.; Wood, A.; Mizukami, N.; Mendoza, P. A.; Rasmussen, R.; Ikeda, K.; Pruitt, T.; Arnold, J. R.; Rajagopalan, B.
2015-12-01
Adaptation planning assessments often rely on single methods for climate projection downscaling and hydrologic analysis, do not reveal uncertainties from the associated method choices, and thus likely produce overly confident decision-support information. Recent work by the authors has highlighted this issue by identifying strengths and weaknesses of widely applied methods for downscaling climate projections and assessing hydrologic impacts. This work has shown that many of the methodological choices made can alter the magnitude, and even the sign, of the climate change signal. Such results motivate consideration of both sources of method uncertainty within an impacts assessment. Consequently, the authors have pursued development of improved downscaling techniques spanning a range of method classes (quasi-dynamical and circulation-based statistical methods) and developed approaches to better account for hydrologic analysis uncertainty (multi-model; regional parameter estimation under forcing uncertainty). This presentation summarizes progress in the development of these methods, as well as the implications of pursuing these developments. First, having access to these methods creates an opportunity to better reveal impacts uncertainty through multi-method ensembles, expanding on present-practice ensembles, which are often based only on emissions scenarios and GCM choices. Second, such expansion of uncertainty treatment, combined with an ever-expanding wealth of global climate projection information, creates the challenge of how to use such a large ensemble for local adaptation planning. To address this challenge, the authors are evaluating methods for ensemble selection (considering the principles of fidelity, diversity, and sensitivity) that are compatible with present-practice approaches for abstracting change scenarios from any "ensemble of opportunity". Early examples from this development will also be presented.
Cross-species extrapolation of chemical effects: Challenges and new insights
One of the greatest uncertainties in chemical risk assessment is extrapolation of effects from tested to untested species. While this undoubtedly is a challenge in the human health arena, species extrapolation is a particularly daunting task in ecological assessments, where it is...
DOT National Transportation Integrated Search
2010-01-14
Due to the volatility of current highway construction commodity prices, owners, contractors, and designers are facing serious challenges in both short-term estimating and long-term planning. Among these challenges is significant uncertainty about the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sig Drellack, Lance Prothro
2007-12-01
The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.
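The convolution step can be sketched generically: the in situ concentration is the convolution of a source release history with a unit-response (travel-time) function, which the study derives from particle tracking. Everything below (release curve, response shape, time base) is an invented stand-in; in a Monte Carlo loop, the response function would be regenerated for each draw of the transport parameters.

    import numpy as np

    t = np.arange(0, 200.0, 1.0)                  # years
    # Hypothetical source release history (decaying release, arbitrary units).
    release = np.exp(-t / 30.0)
    # Hypothetical unit-response function: travel-time distribution to a
    # downgradient point (lognormal-shaped arrival curve, peak near 50 years).
    g = np.exp(-(np.log(np.where(t > 0, t, 1e-9)) - np.log(50)) ** 2 / 0.5)
    g[0] = 0.0
    g /= g.sum()

    # Convolution integral: C(t) = integral_0^t release(s) * g(t - s) ds
    conc = np.convolve(release, g)[:len(t)]
    print(f"peak concentration {conc.max():.3f} at t = {t[conc.argmax()]:.0f} years")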
Uncertainty Quantification in Climate Modeling and Projection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qian, Yun; Jackson, Charles; Giorgi, Filippo
The projection of future climate is one of the most complex problems undertaken by the scientific community. Although scientists have been striving to better understand the physical basis of the climate system and to improve climate models, the overall uncertainty in projections of future climate has not been significantly reduced (e.g., from the IPCC AR4 to AR5). With the rapid increase of complexity in Earth system models, reducing uncertainties in climate projections becomes extremely challenging. Since uncertainties always exist in climate models, interpreting the strengths and limitations of future climate projections is key to evaluating risks, and climate change information for use in Vulnerability, Impact, and Adaptation (VIA) studies should be provided with both well-characterized and well-quantified uncertainty. The workshop aimed at providing participants, many of them from developing countries, information on strategies to quantify the uncertainty in climate model projections and assess the reliability of climate change information for decision-making. The program included a mixture of lectures on fundamental concepts in Bayesian inference and sampling, applications, and hands-on computer laboratory exercises employing software packages for Bayesian inference, Markov Chain Monte Carlo methods, and global sensitivity analyses. The lectures covered a range of scientific issues underlying the evaluation of uncertainties in climate projections, such as the effects of uncertain initial and boundary conditions, uncertain physics, and limitations of observational records. Progress in quantitatively estimating uncertainties in hydrologic, land surface, and atmospheric models at both regional and global scales was also reviewed. The application of Uncertainty Quantification (UQ) concepts to coupled climate system models is still in its infancy. The Coupled Model Intercomparison Project (CMIP) multi-model ensemble currently represents the primary data for assessing reliability and uncertainties of climate change information. An alternative approach is to generate similar ensembles by perturbing parameters within a single-model framework. One of the workshop's objectives was to give participants a deeper understanding of these approaches within a Bayesian statistical framework. However, significant challenges remain to be resolved before UQ can be applied in a convincing way to climate models and their projections.
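As a pointer to the kind of hands-on material described, here is a bare-bones Metropolis sampler for a single model parameter; the synthetic observations, flat prior, and proposal width are stand-ins, not workshop content.

    import numpy as np

    rng = np.random.default_rng(0)
    obs = rng.normal(2.0, 0.5, size=30)   # synthetic "observations"

    def log_post(theta):
        """Log posterior for one model parameter: flat prior on [0, 5],
        Gaussian likelihood with known observational error (sigma = 0.5)."""
        if not 0.0 <= theta <= 5.0:
            return -np.inf
        return -0.5 * np.sum((obs - theta) ** 2) / 0.5 ** 2

    # Metropolis random walk.
    chain = np.empty(5000)
    theta, lp = 1.0, log_post(1.0)
    for i in range(len(chain)):
        prop = theta + rng.normal(0, 0.2)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain[i] = theta
    burn = chain[1000:]   # discard burn-in
    print(f"posterior mean {burn.mean():.2f} +/- {burn.std():.2f} (truth ~2.0)")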
ERIC Educational Resources Information Center
Taylor, Maureen
2000-01-01
Explores cultural variability, especially uncertainty avoidance and power distance, and examines how it affects public response to crisis. Presents an analysis of the national cultures of six European countries that showed that publics who live in nations that are high in uncertainty avoidance and power distance tend to react more strongly, and…
Thermospheric mass density model error variance as a function of time scale
NASA Astrophysics Data System (ADS)
Emmert, J. T.; Sutton, E. K.
2017-12-01
In the increasingly crowded low-Earth orbit environment, accurate estimation of orbit prediction uncertainties is essential for collision avoidance. Poor characterization of such uncertainty can result in unnecessary and costly avoidance maneuvers (false positives) or disregard of a collision risk (false negatives). Atmospheric drag is a major source of orbit prediction uncertainty, and is particularly challenging to account for because it exerts a cumulative influence on orbital trajectories and is therefore not amenable to representation by a single uncertainty parameter. To address this challenge, we examine the variance of measured accelerometer-derived and orbit-derived mass densities with respect to predictions by thermospheric empirical models, using the data-minus-model variance as a proxy for model uncertainty. Our analysis focuses mainly on the power spectrum of the residuals, and we construct an empirical model of the variance as a function of time scale (from 1 hour to 10 years), altitude, and solar activity. We find that the power spectral density approximately follows a power-law process but with an enhancement near the 27-day solar rotation period. The residual variance increases monotonically with altitude between 250 and 550 km. There are two components to the variance dependence on solar activity: one component is 180 degrees out of phase (largest variance at solar minimum), and the other component lags 2 years behind solar maximum (largest variance in the descending phase of the solar cycle).
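A sketch of the kind of residual-spectrum analysis described: compute the Welch power spectral density of a data-minus-model residual series and fit a power-law slope in log-log space. The surrogate residuals (red noise plus a 27-day periodicity) are fabricated for illustration, not the study's accelerometer-derived data.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(1)

# Hypothetical hourly density residuals (data minus model) over 10 years:
# a red-noise walk plus a 27-day solar-rotation-like periodicity
n_hours = 10 * 365 * 24
t = np.arange(n_hours)
red = np.cumsum(rng.normal(0, 1, n_hours))
red -= red.mean()
residual = red / red.std() + 0.5 * np.sin(2 * np.pi * t / (27 * 24))

# Welch periodogram; fs = 1 sample/hour, so f is in cycles per hour
f, psd = signal.welch(residual, fs=1.0, nperseg=2**16)

# Slope of the log-log PSD (power-law exponent), excluding zero frequency
mask = f > 0
slope = np.polyfit(np.log(f[mask]), np.log(psd[mask]), 1)[0]
print(f"approximate power-law exponent: {slope:.2f}")
# A 27-day enhancement would appear near f = 1/(27*24) cycles/hour
```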
NASA Astrophysics Data System (ADS)
Alarcon, T.; Garcia, M. E.; Small, D. L.; Portney, K.; Islam, S.
2013-12-01
Providing water to the expanding population of megacities, which have over 10 million people, with a stressed and aging water infrastructure creates unprecedented challenges. These challenges are exacerbated by dwindling supply and competing demands, altered precipitation and runoff patterns in a changing climate, fragmented water utility business models, and changing consumer behavior. While there is an extensive literature on the effects of climate change on water resources, the uncertainty of climate change predictions continues to be high. This hinders the value of these predictions for municipal water supply planning. The ability of water utilities to meet future water needs will largely depend on their capacity to make decisions under uncertainty. Water stressors, like changes in demographics, climate, and socioeconomic patterns, have varying degrees of uncertainty. Identifying which stressors will have a greater impact on water resources may reduce the level of future uncertainty for planning and managing water utilities. Within this context, we analyze historical and projected changes in population and climate to quantify the relative impacts of these two stressors on water resources. We focus on megacities that rely primarily on surface water resources to evaluate (a) population growth patterns for 1950-2010 and projected population for 2010-2060; (b) climate change impacts under projected scenarios for 2010-2060; and (c) water access for 1950-2010 and projected needs for 2010-2060.
The doctor-patient relationship as a toolkit for uncertain clinical decisions.
Diamond-Brown, Lauren
2016-06-01
Medical uncertainty is a well-recognized problem in healthcare, yet how doctors make decisions in the face of uncertainty remains to be understood. This article draws on interdisciplinary literature on uncertainty and physician decision-making to examine a specific physician response to uncertainty: using the doctor-patient relationship as a toolkit. Additionally, I ask what happens to this process when the doctor-patient relationship becomes fragmented. I answer these questions by examining obstetrician-gynecologists' narratives regarding how they make decisions when faced with uncertainty in childbirth. Between 2013 and 2014, I performed 21 semi-structured interviews with obstetricians in the United States. Obstetricians were selected to maximize variation in relevant physician, hospital, and practice characteristics. I began with grounded theory and moved to analytical coding of themes in relation to relevant literature. My analysis renders it evident that some physicians use the doctor-patient relationship as a toolkit for dealing with uncertainty. I analyze how this process varies for physicians in different models of care by comparing doctors' experiences in models with continuous versus fragmented doctor-patient relationships. My key findings are that obstetricians in both models appealed to the ideal of patient-centered decision-making to cope with uncertain decisions, but in practice physicians in fragmented care faced a number of challenges to using the doctor-patient relationship as a toolkit for decision-making. These challenges led to additional uncertainties and in some cases to poor outcomes for doctors and/or patients; they also raised concerns about the reproduction of inequality. Thus organization of care delivery mitigates the efficacy of doctors' use of the doctor-patient relationship toolkit for uncertain decisions. These findings have implications for theorizing about decision-making under conditions of medical uncertainty, for understanding how the doctor-patient relationship and model of care affect physician decision-making, and for forming policy on the optimal structure of medical work. Copyright © 2016 Elsevier Ltd. All rights reserved.
A taxonomy of medical uncertainties in clinical genome sequencing.
Han, Paul K J; Umstead, Kendall L; Bernhardt, Barbara A; Green, Robert C; Joffe, Steven; Koenig, Barbara; Krantz, Ian; Waterston, Leo B; Biesecker, Leslie G; Biesecker, Barbara B
2017-08-01
Clinical next-generation sequencing (CNGS) is introducing new opportunities and challenges into the practice of medicine. Simultaneously, these technologies are generating uncertainties of an unprecedented scale that laboratories, clinicians, and patients are required to address and manage. We describe in this report the conceptual design of a new taxonomy of uncertainties around the use of CNGS in health care. Interviews to delineate the dimensions of uncertainty in CNGS were conducted with genomics experts and themes were extracted in order to expand on a previously published three-dimensional taxonomy of medical uncertainty. In parallel, we developed an interactive website to disseminate the CNGS taxonomy to researchers and engage them in its continued refinement. The proposed taxonomy divides uncertainty along three axes - source, issue, and locus - and further discriminates the uncertainties into five layers with multiple domains. Using a hypothetical clinical example, we illustrate how the taxonomy can be applied to findings from CNGS and used to guide stakeholders through interpretation and implementation of variant results. The utility of the proposed taxonomy lies in promoting consistency in describing dimensions of uncertainty in publications and presentations, to facilitate research design and management of the uncertainties inherent in the implementation of CNGS. Genet Med advance online publication 19 January 2017.
A Taxonomy of Medical Uncertainties in Clinical Genome Sequencing
Han, Paul K. J.; Umstead, Kendall L.; Bernhardt, Barbara A.; Green, Robert C.; Joffe, Steven; Koenig, Barbara; Krantz, Ian; Waterston, Leo B.; Biesecker, Leslie G.; Biesecker, Barbara B.
2017-01-01
Purpose Clinical next generation sequencing (CNGS) is introducing new opportunities and challenges into the practice of medicine. Simultaneously, these technologies are generating uncertainties of unprecedented scale that laboratories, clinicians, and patients are required to address and manage. We describe in this report the conceptual design of a new taxonomy of uncertainties around the use of CNGS in health care. Methods Interviews to delineate the dimensions of uncertainty in CNGS were conducted with genomics experts, and themes were extracted in order to expand upon a previously published three-dimensional taxonomy of medical uncertainty. In parallel we developed an interactive website to disseminate the CNGS taxonomy to researchers and engage them in its continued refinement. Results The proposed taxonomy divides uncertainty along three axes: source, issue, and locus, and further discriminates the uncertainties into five layers with multiple domains. Using a hypothetical clinical example, we illustrate how the taxonomy can be applied to findings from CNGS and used to guide stakeholders through interpretation and implementation of variant results. Conclusion The utility of the proposed taxonomy lies in promoting consistency in describing dimensions of uncertainty in publications and presentations, to facilitate research design and management of the uncertainties inherent in the implementation of CNGS. PMID:28102863
Visualizing uncertainty about the future.
Spiegelhalter, David; Pearson, Mike; Short, Ian
2011-09-09
We are all faced with uncertainty about the future, but we can get the measure of some uncertainties in terms of probabilities. Probabilities are notoriously difficult to communicate effectively to lay audiences, and in this review we examine current practice for communicating uncertainties visually, using examples drawn from sport, weather, climate, health, economics, and politics. Despite the burgeoning interest in infographics, there is limited experimental evidence on how different types of visualizations are processed and understood, although the effectiveness of some graphics clearly depends on the relative numeracy of an audience. Fortunately, it is increasingly easy to present data in the form of interactive visualizations and in multiple types of representation that can be adjusted to user needs and capabilities. Nonetheless, communicating deeper uncertainties resulting from incomplete or disputed knowledge--or from essential indeterminacy about the future--remains a challenge.
Transmission expansion with smart switching under demand uncertainty and line failures
Schumacher, Kathryn M.; Chen, Richard Li-Yang; Cohn, Amy E. M.
2016-06-07
One of the major challenges in deciding where to build new transmission lines is that there is uncertainty regarding future loads, renewable generation output, and equipment failures. We propose a robust optimization model whose transmission expansion solutions ensure that demand can be met over a wide range of conditions. Specifically, we require feasible operation for all loads and renewable generation levels within given ranges, and for all single transmission line failures. Furthermore, we consider transmission switching as an allowable recovery action. This relatively inexpensive method of redirecting power flows improves resiliency, but introduces computational challenges. Lastly, we present a novel algorithm to solve this model. Computational results are discussed.
Engagement and Uncertainty: Emerging Technologies Challenge the Work of Engagement
ERIC Educational Resources Information Center
Eaton, Weston; Wright, Wynne; Whyte, Kyle; Gasteyer, Stephen P.; Gehrke, Pat J.
2014-01-01
Universities' increasing applications of science and technology to address a wide array of societal problems may serve to thwart democratic engagement strategies. For emerging technologies, such challenges are particularly salient, as knowledge is incomplete and application and impact are uncertain or contested. Insights from science and…
Learning That Makes a Difference: Pedagogy and Practice for Learning Abroad
ERIC Educational Resources Information Center
Benham Rennick, Joanne
2015-01-01
Society faces significant new challenges surrounding issues in human health; global security; environmental devastation; human rights violations; economic uncertainty; population explosion and regression; recognition of diversity, difference and special populations at home and abroad. In light of these challenges, there is a great opportunity, and…
Mindful Organizing as a Paradigm to Develop Managers
ERIC Educational Resources Information Center
Gebauer, Annette
2013-01-01
How can managers prepare for extreme but exceptional events and for the challenge of managing complexity and uncertainty in their daily business? Confronted with the challenge of achieving high and reliable performance in risk-prone, fast-paced, and unpredictable environments, managers and management scholars can learn a lot from the organizing…
The Impact of Model Uncertainty on Spatial Compensation in Active Structural Acoustic Control
NASA Technical Reports Server (NTRS)
Cabell, Randolph H.; Gibbs, Gary P.; Sprofera, Joseph D.; Clark, Robert L.
2004-01-01
Turbulent boundary layer (TBL) noise is considered a primary factor in the interior noise experienced by passengers aboard commercial airliners. There have been numerous investigations of interior noise control devoted to aircraft panels; however, practical realization is a challenge since the physical boundary conditions are uncertain at best. In most prior studies, pinned or clamped boundary conditions have been assumed; however, realistic panels likely display a range of varying boundary conditions between these two limits. Uncertainty in boundary conditions is a challenge for control system designers, both in terms of the compensator implemented and the location of actuators and sensors required to achieve the desired control. The impact of model uncertainties, uncertain boundary conditions in particular, on the selection of actuator and sensor locations for structural acoustic control are considered herein. Results from this research effort indicate that it is possible to optimize the design of actuator and sensor location and aperture, which minimizes the impact of boundary conditions on the desired structural acoustic control.
Assessing nanoparticle risk poses prodigious challenges.
MacPhail, Robert C; Grulke, Eric A; Yokel, Robert A
2013-01-01
Risk assessment is used both formally and informally to estimate the likelihood of an adverse event occurring, for example, as a consequence of exposure to a hazardous chemical, drug, or other agent. Formal risk assessments in government regulatory agencies have a long history of practice. The precision with which risk can be estimated is inevitably constrained, however, by uncertainties arising from the lack of pertinent data. Developing accurate risk assessments for nanoparticles and nanoparticle-containing products may present further challenges because of the unique properties of the particles, uncertainties about their composition and the populations exposed to them, and how these may change throughout the particle's life cycle. This review introduces the evolving practice of risk assessment followed by some of the uncertainties that need to be addressed to improve our understanding of nanoparticle risks. Given the clarion call for life-cycle assessments of nanoparticles, an unprecedented degree of national and international coordination between scientific organizations, regulatory agencies, and stakeholders will be required to achieve this goal. Copyright © 2013 Wiley Periodicals, Inc.
Visualizing Uncertainty of Point Phenomena by Redesigned Error Ellipses
NASA Astrophysics Data System (ADS)
Murphy, Christian E.
2018-05-01
Visualizing uncertainty remains one of the great challenges in modern cartography. There is no overarching strategy to display the nature of uncertainty, as an effective and efficient visualization depends, besides on the spatial data feature type, heavily on the type of uncertainty. This work presents a design strategy to visualize uncertainty connected to point features. The error ellipse, well-known from mathematical statistics, is adapted to display the uncertainty of point information originating from spatial generalization. Modified designs of the error ellipse show the potential of quantitative and qualitative symbolization and simultaneous point-based uncertainty symbolization. The user can intuitively depict the centers of gravity, the major orientation of the point arrays as well as estimate the extents and possible spatial distributions of multiple point phenomena. The error ellipse represents uncertainty in an intuitive way, particularly suitable for laymen. Furthermore, it is shown how applicable an adapted design of the error ellipse is to display the uncertainty of point features originating from incomplete data. The suitability of the error ellipse to display the uncertainty of point information is demonstrated within two showcases: (1) the analysis of formations of association football players, and (2) uncertain positioning of events on maps for the media.
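The geometry behind the error ellipse can be made concrete. A minimal sketch, assuming a 2-D Gaussian position error with a hypothetical covariance matrix: the ellipse semi-axes and orientation follow from the eigendecomposition of the covariance, scaled by a chi-square quantile for the desired confidence level.

```python
import numpy as np
from scipy.stats import chi2

def error_ellipse(cov, confidence=0.95):
    """Semi-axes and orientation of the confidence ellipse of a 2-D
    Gaussian with covariance `cov`."""
    s = chi2.ppf(confidence, df=2)             # 5.991 for 95%, 2 dof
    eigval, eigvec = np.linalg.eigh(cov)       # eigenvalues in ascending order
    semi_minor, semi_major = np.sqrt(s * eigval)
    vx, vy = eigvec[:, 1]                      # major-axis eigenvector
    angle = np.degrees(np.arctan2(vy, vx))     # major-axis orientation
    return semi_major, semi_minor, angle

# Hypothetical positional covariance (m^2) for a generalized point feature
cov = np.array([[4.0, 1.5],
                [1.5, 1.0]])
a, b, theta = error_ellipse(cov)
print(f"95% ellipse: a = {a:.2f} m, b = {b:.2f} m, orientation = {theta:.1f} deg")
```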
A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campos, E; Sisterson, DL
The Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed.
A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report: Updated in 2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sisterson, Douglas
The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. ARM currently provides data and supporting metadata (information about the data or data quality) to its users through several sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, ARM relies on Instrument Mentors and the ARM Data Quality Office to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed. This report is a continuation of the work presented by Campos and Sisterson (2015) and provides additional uncertainty information from instruments not available in their report. As before, a total measurement uncertainty has been calculated as a function of the instrument uncertainty (calibration factors), the field uncertainty (environmental factors), and the retrieval uncertainty (algorithm factors). This study will not expand on methods for computing these uncertainties. As before, it will focus on the practical identification, characterization, and inventory of the measurement uncertainties already available to the ARM community through the ARM Instrument Mentors and their ARM instrument handbooks. This study continues the first steps towards reporting ARM measurement uncertainty as: (1) identifying how the uncertainty of individual ARM measurements is currently expressed, (2) identifying a consistent approach to measurement uncertainty, and then (3) reclassifying ARM instrument measurement uncertainties in a common framework.
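A minimal sketch of the kind of combination the report describes, assuming the instrument, field, and retrieval components are independent 1-sigma uncertainties combined in quadrature; the actual per-instrument combination rules in the ARM handbooks may differ.

```python
import numpy as np

def total_uncertainty(instrument, field, retrieval):
    """Combine independent 1-sigma components in quadrature.

    A common simplification; a given ARM instrument handbook may
    prescribe a different combination rule."""
    return float(np.sqrt(instrument**2 + field**2 + retrieval**2))

# Hypothetical 1-sigma components for a radiometric measurement (W/m^2)
print(total_uncertainty(instrument=2.0, field=1.5, retrieval=3.0))  # ~4.03
```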
Fostering climate dialogue by introducing students to uncertainty in decision-making
NASA Astrophysics Data System (ADS)
Addor, N.; Ewen, T.; Johnson, L.; Coltekin, A.; Derungs, C.; Muccione, V.
2014-12-01
Uncertainty is present in all fields of climate research, spanning from climate projections, to assessing regional impacts and vulnerabilities to adaptation policy and decision-making. The complex and interdisciplinary nature of climate information, however, makes the decision-making process challenging. This process is further hindered by a lack of institutionalized dialogue between climate researchers, decision-makers and user groups. Forums that facilitate such dialogue would allow these groups to actively engage with each other to improve decisions. In parallel, introducing students to these challenges is one way to foster such climate dialogue. We present the design and outcome of an innovative workshop-seminar series we convened at the University of Zurich to demonstrate the pedagogical importance of such forums. An initial two-day workshop brought together 50 participants, including bachelor, master and PhD students and academic staff, and nine speakers from academia, industry, government, and philanthropy. The main objectives were to provide participants with tools to communicate uncertainty in their current or future research projects, to foster exchange between practitioners, students and scientists from different backgrounds and finally to expose students to multidisciplinary collaborations and real-world problems involving decisions under uncertainty. An opinion survey conducted before and after the workshop enabled us to observe changes in participants' perspectives on what information and tools should be exchanged between researchers and decision-makers to better address uncertainty. Responses demonstrated a marked shift from a pre-workshop vertical conceptualization of researcher-user group interaction to a post-workshop horizontal mode: in the former, researchers were portrayed as bestowing data-based products to decision-makers, while in the latter, both sets of actors engaged in frequent communication, exchanging their needs and expertise. Drawing on examples from the course evaluation, we seek to encourage the organization of similar events, introducing students to these challenges at an early stage of their education and career as a first step towards improving future dialogue.
Known unknowns: building an ethics of uncertainty into genomic medicine.
Newson, Ainsley J; Leonard, Samantha J; Hall, Alison; Gaff, Clara L
2016-09-01
Genomic testing has reached the point where, technically at least, it can be cheaper to undertake panel-, exome- or whole genome testing than it is to sequence a single gene. An attribute of these approaches is that information gleaned will often have uncertain significance. In addition to the challenges this presents for pre-test counseling and informed consent, a further consideration emerges over how - ethically - we should conceive of and respond to this uncertainty. To date, the ethical aspects of uncertainty in genomics have remained under-explored. In this paper, we draft a conceptual and ethical response to the question of how to conceive of and respond to uncertainty in genomic medicine. After introducing the problem, we articulate a concept of 'genomic uncertainty'. Drawing on this, together with exemplar clinical cases and related empirical literature, we then critique the presumption that uncertainty is always problematic and something to be avoided, or eradicated. We conclude by outlining an 'ethics of genomic uncertainty'; describing how we might handle uncertainty in genomic medicine. This involves fostering resilience, welfare, autonomy and solidarity. Uncertainty will be an inherent aspect of clinical practice in genomics for some time to come. Genomic testing should not be offered with the explicit aim to reduce uncertainty. Rather, uncertainty should be appraised, adapted to and communicated about as part of the process of offering and providing genomic information.
Lie, Nataskja-Elena Kersting; Larsen, Torill Marie Bogsnes; Hauken, May Aasebø
2017-07-31
Young adult cancer patients (YACPs), aged 18-35 years when diagnosed with cancer, are in a vulnerable transition period from adolescence to adulthood, where cancer adds a tremendous burden. However, YACPs' challenges and coping strategies are under-researched. The objective of this study was to explore what challenges YACPs experience during their treatment, and what coping strategies they apply to them. We conducted a qualitative study with a phenomenological-hermeneutic design, including retrospective, semi-structured interviews of 16 YACPs who had undergone cancer treatment. Data were analysed using thematic analysis and interpreted by applying the Cognitive Activation Theory of Stress (CATS). We found "coping with changes and uncertainty" to be the overarching topic for YACPs' challenges, particularly related to five themes: (1) receiving the diagnosis, (2) encountering the healthcare system, (3) living with cancer, (4) dealing with the impact of the treatment and (5) reactions from the social network. YACPs' coping strategies varied broadly and ranged from maladaptive strategies, such as neglecting the situation, to conducive emotional or instrumental approaches to managing their challenges. The findings call for age-specific needs assessments, information and support for YACPs and their families in order to facilitate YACPs' coping during their treatment. © 2017 John Wiley & Sons Ltd.
Weather uncertainty versus climate change uncertainty in a short television weather broadcast
NASA Astrophysics Data System (ADS)
Witte, J.; Ward, B.; Maibach, E.
2011-12-01
For TV meteorologists, talking about uncertainty in a two-minute forecast can be a real challenge. It can quickly open the way to viewer confusion. TV meteorologists understand the uncertainties of short-term weather models and have different methods to convey degrees of confidence to the viewing public. Visual examples are seen in the 7-day forecasts and the hurricane track forecasts. But does the public really understand a 60 percent chance of rain or the hurricane cone? Communication of climate model uncertainty is even more daunting. The viewing public can quickly switch to denial of solid science. A short review of the latest national survey of TV meteorologists by George Mason University, and lessons learned from a series of climate change workshops with TV broadcasters, provide valuable insights into effectively using visualizations and invoking multimedia-learning theories in weather forecasts to improve public understanding of climate change.
Aeroservoelastic Uncertainty Model Identification from Flight Data
NASA Technical Reports Server (NTRS)
Brenner, Martin J.
2001-01-01
Uncertainty modeling is a critical element in the estimation of robust stability margins for stability boundary prediction and robust flight control system development. There has been a serious deficiency to date in aeroservoelastic data analysis with attention to uncertainty modeling. Uncertainty can be estimated from flight data using both parametric and nonparametric identification techniques. The model validation problem addressed in this paper is to identify aeroservoelastic models with associated uncertainty structures from a limited amount of controlled excitation inputs over an extensive flight envelope. The challenge to this problem is to update analytical models from flight data estimates while also deriving non-conservative uncertainty descriptions consistent with the flight data. Multisine control surface command inputs and control system feedbacks are used as signals in a wavelet-based modal parameter estimation procedure for model updates. Transfer function estimates are incorporated in a robust minimax estimation scheme to get input-output parameters and error bounds consistent with the data and model structure. Uncertainty estimates derived from the data in this manner provide an appropriate and relevant representation for model development and robust stability analysis. This model-plus-uncertainty identification procedure is applied to aeroservoelastic flight data from the NASA Dryden Flight Research Center F-18 Systems Research Aircraft.
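As a simplified stand-in for the transfer function estimates mentioned above (the paper's own procedure is wavelet-based and adds minimax error bounds), the sketch below forms the classical H1 = Pxy/Pxx estimate from a broadband, multisine-like input and the response of a hypothetical lightly damped structural mode.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(7)
fs = 200.0                              # sample rate (Hz), illustrative
t = np.arange(0, 60, 1 / fs)

# Multisine-like command plus broadband dither so the input has power
# at all frequencies (keeps the estimate well-conditioned)
u = sum(np.sin(2 * np.pi * f0 * t) for f0 in (5, 12, 20, 33))
u = u + 0.5 * rng.normal(size=t.size)

# Hypothetical "plant": a lightly damped 12 Hz resonant mode
b, a = signal.iirpeak(12, Q=15, fs=fs)
y = signal.lfilter(b, a, u) + 0.05 * rng.normal(size=t.size)

# Empirical transfer function estimate H1 = Pxy / Pxx
f, Pxy = signal.csd(u, y, fs=fs, nperseg=2048)
_, Pxx = signal.welch(u, fs=fs, nperseg=2048)
H = Pxy / Pxx
print(f"peak response near {f[np.argmax(np.abs(H))]:.1f} Hz")
```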
NASA Astrophysics Data System (ADS)
Speciale, A.; Kenney, M. A.; Gerst, M.; Baer, A. E.; DeWitt, D.; Gottschalk, J.; Handel, S.
2017-12-01
The uncertainty of future weather and climate conditions is important for many decisions made in communities and economic sectors. One tool that decision-makers use in gauging this uncertainty is forecasts, especially maps (or visualizations) of probabilistic forecast results. However, visualizing geospatial uncertainty is challenging because including probability introduces an extra variable to represent, and probability is often poorly understood by users. Using focus group and survey methods, this study seeks to understand the barriers to using probabilistic temperature and precipitation visualizations for specific decisions in the agriculture, energy, emergency management, and water resource sectors. Preliminary results shown here focus on the needs of emergency managers. Our experimental design uses the National Oceanic and Atmospheric Administration (NOAA) Climate Prediction Center (CPC) climate outlooks, which produce probabilistic temperature and precipitation forecast visualizations at the 6-10 day, 8-14 day, 3-4 week, and 1 and 3 month timeframes. Users were asked to complete questions related to how they use weather information, how uncertainty is represented, and design elements (e.g., color, contour lines) of the visualizations. Preliminary results from the emergency management sector indicate there is significant confusion about how "normal" weather is defined, the boundaries between probability ranges, and the meaning of the contour lines. After a complete understandability diagnosis is made using results from all sectors, we will collaborate with CPC to suggest modifications to the climate outlook visualizations. These modifications will then be retested in similar focus groups and web-based surveys to confirm they better meet the needs of users.
NASA Astrophysics Data System (ADS)
Perdigão, R. A. P.
2017-12-01
Predictability assessments are traditionally made on a case-by-case basis, often by running the particular model of interest with randomly perturbed initial/boundary conditions and parameters, producing computationally expensive ensembles. These approaches provide a lumped statistical view of uncertainty evolution, without eliciting the fundamental processes and interactions at play in the uncertainty dynamics. In order to address these limitations, we introduce a systematic dynamical framework for predictability assessment and forecasting, analytically deriving governing equations of predictability in terms of the fundamental architecture of dynamical systems, independent of any particular problem under consideration. The framework further relates multiple uncertainty sources along with their coevolutionary interplay, enabling a comprehensive and explicit treatment of uncertainty dynamics over time, without requiring the actual model to be run. In doing so, computational resources are freed and a quick and effective a priori systematic dynamic evaluation is made of predictability evolution and its challenges, including aspects of the model architecture and intervening variables that may require optimization ahead of initiating any model runs. It further brings out universal dynamic features in the error dynamics elusive to any case-specific treatment, ultimately shedding fundamental light on the challenging issue of predictability. The formulated approach, framed with broad mathematical-physics generality in mind, is then implemented in dynamic models of nonlinear geophysical systems with various degrees of complexity, in order to evaluate their limitations and provide informed assistance on how to optimize their design and improve their predictability in fundamental dynamical terms.
ERIC Educational Resources Information Center
McDonald, Christopher
2012-01-01
Community colleges are experiencing many challenges that stem from political and fiscal uncertainties and these challenges are often exacerbated by high turnover in executive leadership positions (Floyd et al., 2010), which has implications for longterm strategic planning. The magnitude of the problems associated with high turnover rates for…
Leadership Development in Governments of the United Arab Emirates: Re-Framing a Wicked Problem
ERIC Educational Resources Information Center
Mathias, Megan
2017-01-01
Developing the next generation of leaders in government is seen as a strategic challenge of national importance in the United Arab Emirates (UAE). This article examines the wicked nature of the UAE's leadership development challenge, identifying patterns of complexity, uncertainty, and divergence in the strategic intentions underlying current…
USDA-ARS's Scientific Manuscript database
The progressive improvement of computer science and the development of auto-calibration techniques mean that calibration of simulation models is no longer a major challenge for watershed planning and management. Modelers now increasingly focus on challenges such as improved representation of watershed...
ERIC Educational Resources Information Center
Battin, James Vernon
2017-01-01
Today's academic and social environment creates uncertainties, new roles, frequent changes, and challenging situations for student affairs academic leaders. The purpose of this study was to explore how student affairs academic leaders described their recent challenging experiences in addressing student drug abuse in higher education. The…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Zhijie; Lai, Canhai; Marcy, Peter William
2017-05-01
A challenging problem in designing pilot-scale carbon capture systems is to predict, with uncertainty, the adsorber performance and capture efficiency under various operating conditions where no direct experimental data exist. Motivated by this challenge, we previously proposed a hierarchical framework in which relevant parameters of physical models were sequentially calibrated from different laboratory-scale carbon capture unit (C2U) experiments. Specifically, three models of increasing complexity were identified based on the fundamental physical and chemical processes of the sorbent-based carbon capture technology. Results from the corresponding laboratory experiments were used to statistically calibrate the physical model parameters while quantifying some of their inherent uncertainty. The parameter distributions obtained from laboratory-scale C2U calibration runs are used in this study to facilitate prediction at a larger scale where no corresponding experimental results are available. In this paper, we first describe the multiphase reactive flow model for a sorbent-based 1-MW carbon capture system, then analyze results from an ensemble of simulations with the upscaled model. The simulation results are used to quantify uncertainty regarding the design’s predicted efficiency in carbon capture. In particular, we determine the minimum gas flow rate necessary to achieve 90% capture efficiency with 95% confidence.
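A toy sketch of the final confidence statement: draw calibrated-parameter samples, push them through a surrogate efficiency model, and find the smallest flow rate whose 5th-percentile ensemble efficiency still exceeds 90% (i.e., 90% capture with 95% confidence). The surrogate and the parameter distribution are invented for illustration; the study used a full multiphase reactive flow model.

```python
import numpy as np

rng = np.random.default_rng(3)
n_ensemble = 500

# Calibrated-parameter draws, standing in for the posterior from the
# laboratory-scale C2U calibration (hypothetical distribution)
k_ads = rng.lognormal(mean=0.0, sigma=0.2, size=n_ensemble)

def capture_efficiency(flow_rate, k):
    # Toy surrogate in which efficiency improves with gas flow; the real
    # dependence comes from the multiphase reactive flow simulations
    return 1.0 - np.exp(-k * flow_rate / 40.0)

# Scan candidate flow rates; require the 5th percentile of ensemble
# efficiency to exceed 0.90 (90% capture with 95% confidence)
for q in np.arange(50.0, 200.0, 1.0):
    eff5 = np.percentile(capture_efficiency(q, k_ads), 5)
    if eff5 >= 0.90:
        print(f"minimum flow rate ~ {q:.0f} (arbitrary units)")
        break
```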
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muller, L; Soldner, A; Kirk, M
Purpose: The beam range uncertainty presents a special challenge for proton therapy. Novel technologies currently under development offer strategies to reduce the range uncertainty [1,2]. This work quantifies the potential advantages that could be realized by such a reduction for dosimetrically challenging chordomas at the base of skull. Therapeutic improvement was assessed by evaluating tumor control probabilities (TCP) and normal tissue complication probabilities (NTCP). Methods: Treatment plans were made for a modulated-scanned proton delivery technique using the Eclipse treatment planning system. The prescription dose was 7920 cGy to the CTV. Three different range uncertainty scenarios were considered: 5 mm (3.5% of the beam range + 1 mm, representing current clinical practice, "Curr"), 2 mm (1.3%), and 1 mm (0.7%). For each of 4 patients, 3 different PTVs were defined via uniform expansion of the CTV by the value of the range uncertainty. Tumor control probability (TCP) and normal tissue complication probabilities (NTCPs) for organs-at-risk (OARs) were calculated using the Lyman-Kutcher-Burman formalism [3] and published model parameters (Terahara et al. [4]; QUANTEC; Burman et al., Int J Radiat Oncol Biol Phys 21:123). Our plan optimization strategy was to achieve PTV coverage close to prescription while maintaining OAR NTCP values at or better than those of the Curr plan. Results: The average TCP values for the 5, 2, and 1 mm range uncertainty scenarios are 51%, 55% and 65%. The improvement in TCP for patients was between 4 and 30%, depending primarily on the proximity of the GTV to OARs. The average NTCPs for the brainstem and cord were about 4% and 1%, respectively, for all target margins. Conclusion: For base of skull chordomas, reduced target margins can substantially increase the TCP without increasing the NTCP. This work demonstrates the potential significance of a reduction in the range uncertainty for proton beams.
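The Lyman-Kutcher-Burman calculation itself is compact. A minimal sketch with illustrative parameter values (not the study's published ones): the generalized equivalent uniform dose (gEUD) is computed from a differential dose-volume histogram, then mapped to NTCP through the probit function.

```python
import numpy as np
from scipy.stats import norm

def gEUD(dose_bins, vol_fracs, n):
    """Generalized equivalent uniform dose from a differential DVH."""
    return (np.sum(vol_fracs * dose_bins ** (1.0 / n))) ** n

def ntcp_lkb(dose_bins, vol_fracs, TD50, m, n):
    """Lyman-Kutcher-Burman NTCP: probit of the normalized gEUD."""
    t = (gEUD(dose_bins, vol_fracs, n) - TD50) / (m * TD50)
    return norm.cdf(t)

# Hypothetical brainstem DVH (doses in cGy, fractional volumes sum to 1)
dose = np.array([1000.0, 3000.0, 5000.0, 6000.0])
vol = np.array([0.4, 0.3, 0.2, 0.1])
# TD50, m, n below are illustrative, not the published parameters used
print(f"NTCP = {ntcp_lkb(dose, vol, TD50=6500.0, m=0.14, n=0.16):.3f}")
```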
NASA Astrophysics Data System (ADS)
Pianosi, Francesca
2015-04-01
Sustainable water resource management in a quickly changing world poses new challenges to hydrology and the decision sciences. Systems analysis can contribute to promoting sustainable practices by providing the theoretical background and the operational tools for an objective and transparent appraisal of policy options for water resource systems (WRS) management. Traditionally, limited availability of data and computing resources forced the use of oversimplified WRS models, with little consideration of modeling uncertainties or of the non-stationarity and feedbacks between WRS drivers, and with a priori aggregation of costs and benefits. Nowadays we increasingly recognize the inadequacy of these simplifications, and consider them among the reasons for the limited use of model-generated information in actual decision-making processes. On the other hand, the fast-growing availability of data and computing resources is opening up unprecedented possibilities in the way we build and apply numerical models. In this talk I will discuss my experiences and ideas on how we can exploit this potential to improve model-informed decision-making while facing the challenges of uncertainty, non-stationarity, feedbacks and conflicting objectives. In particular, through practical examples of WRS design and operation problems, my talk will aim at stimulating discussion about the impact of uncertainty on decisions: can inaccurate and imprecise predictions still carry valuable information for decision-making? Does uncertainty in predictions necessarily limit our ability to make 'good' decisions? Or can uncertainty even help decision-making, for instance by reducing the projected conflict between competing water uses? Finally, I will also discuss how the traditionally separate disciplines of numerical modelling, optimization, and uncertainty and sensitivity analysis have in my experience been just different facets of the same 'systems approach'.
Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment
NASA Astrophysics Data System (ADS)
Taner, M. U.; Wi, S.; Brown, C.
2017-12-01
The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, the hydrologic response and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis, with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin, located on the border between the United States and Canada.
Values and uncertainties in climate prediction, revisited.
Parker, Wendy
2014-06-01
Philosophers continue to debate both the actual and the ideal roles of values in science. Recently, Eric Winsberg has offered a novel, model-based challenge to those who argue that the internal workings of science can and should be kept free from the influence of social values. He contends that model-based assignments of probability to hypotheses about future climate change are unavoidably influenced by social values. I raise two objections to Winsberg's argument, neither of which can wholly undermine its conclusion but each of which suggests that his argument exaggerates the influence of social values on estimates of uncertainty in climate prediction. I then show how a more traditional challenge to the value-free ideal seems tailor-made for the climate context.
Modelling impacts of climate change on arable crop diseases: progress, challenges and applications.
Newbery, Fay; Qi, Aiming; Fitt, Bruce Dl
2016-08-01
Combining climate change, crop growth and crop disease models to predict impacts of climate change on crop diseases can guide planning of climate change adaptation strategies to ensure future food security. This review summarises recent developments in modelling climate change impacts on crop diseases, emphasises some major challenges and highlights recent trends. The use of multi-model ensembles in climate change modelling and crop modelling is contributing towards measures of uncertainty in climate change impact projections, but other aspects of uncertainty remain largely unexplored. Impact assessments are still concentrated on few crops and few diseases but are beginning to investigate arable crop disease dynamics at the landscape level. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Communicating uncertainty in hydrological forecasts: mission impossible?
NASA Astrophysics Data System (ADS)
Ramos, Maria-Helena; Mathevet, Thibault; Thielen, Jutta; Pappenberger, Florian
2010-05-01
Cascading uncertainty through meteo-hydrological modelling chains for forecasting and integrated flood risk assessment is an essential step in improving the quality of hydrological forecasts. Although the best methodology to quantify the total predictive uncertainty in hydrology is still debated, there is a common agreement that one must avoid uncertainty misrepresentation and miscommunication, as well as misinterpretation of information by users. Several recent studies point out that uncertainty, when properly explained and defined, is no longer unwelcome among emergency response organizations, users of flood risk information and the general public. However, efficient communication of uncertain hydro-meteorological forecasts is far from being a resolved issue. This study focuses on the interpretation and communication of uncertain hydrological forecasts based on (uncertain) meteorological forecasts and (uncertain) rainfall-runoff modelling approaches to decision-makers such as operational hydrologists and water managers in charge of flood warning and scenario-based reservoir operation. An overview of the typical flow of uncertainties and risk-based decisions in hydrological forecasting systems is presented. The challenges related to the extraction of meaningful information from probabilistic forecasts and the test of its usefulness in assisting operational flood forecasting are illustrated with the help of two case studies: 1) a study on the use and communication of probabilistic flood forecasting within the European Flood Alert System; 2) a case study on the use of probabilistic forecasts by operational forecasters from the hydroelectricity company EDF in France. These examples show that attention must be paid to initiatives that promote or reinforce the active participation of expert forecasters in the forecasting chain. The practice of face-to-face forecast briefings, focusing on sharing how forecasters interpret, describe and perceive the forecast scenarios output by the models, is essential. We believe that the efficient communication of uncertainty in hydro-meteorological forecasts is not a mission impossible. Questions remaining unanswered in probabilistic hydrological forecasting should not neutralize the goal of such a mission, and the remaining suspense should instead act as a catalyst for overcoming the remaining challenges.
Assessing Uncertainties in Surface Water Security: A Probabilistic Multi-model Resampling approach
NASA Astrophysics Data System (ADS)
Rodrigues, D. B. B.
2015-12-01
Various uncertainties are involved in the representation of processes that characterize interactions between societal needs, ecosystem functioning, and hydrological conditions. Here, we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multi-model and resampling framework. We consider several uncertainty sources, including those related to: (i) observed streamflow data; (ii) hydrological model structure; (iii) residual analysis; (iv) the method for defining the Environmental Flow Requirement; (v) the definition of critical conditions for water provision; and (vi) the critical demand imposed by human activities. We estimate the overall uncertainty coming from the hydrological model by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multi-model framework and those provided by each model uncertainty estimation approach. The method is general and can be easily extended, forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.
Uncertainty As a Trigger for a Paradigm Change in Science Communication
NASA Astrophysics Data System (ADS)
Schneider, S.
2014-12-01
Over the last decade, the need to communicate uncertainty has increased. The climate and environmental sciences have faced massive propaganda campaigns by global industry and astroturf organizations. These organizations exploit deep societal mistrust of uncertainty to allege unethical and intentional deception of decision-makers and the public by scientists in their advisory role. Scientists who openly communicate the uncertainty of climate model calculations, earthquake occurrence frequencies, or possible side effects of genetically modified seeds face massive campaigns against their research, and sometimes against themselves personally. Hence, new strategies to communicate uncertainty have to address the societal roots of the misunderstanding of the concept of uncertainty itself. Evolutionary biology has shown that the human mind, through its sensory structures, is well suited for practical decision-making. Therefore, many irrational conceptions about uncertainty are mitigated if data are presented in formats the brain is adapted to understand. In the end, the impact of uncertainty on the decision-making process is dominated by preconceptions about terms such as uncertainty, vagueness or probability. In parallel to the increasing role of scientific uncertainty in strategic communication, science communicators, for example at the Research and Development Program GEOTECHNOLOGIEN, have developed a number of techniques to master the challenge of putting uncertainty in focus. By raising awareness of scientific uncertainty as a driving force for scientific development and evolution, the public perspective on uncertainty is changing. While the first steps to implement this process are under way, the value of uncertainty is still underestimated by the public and in politics. Therefore, science communicators need new and innovative ways to talk about scientific uncertainty.
Scientist-Practitioner Engagement to Inform Regional Hydroclimate Model Evaluation
NASA Astrophysics Data System (ADS)
Jones, A. D.; Jagannathan, K. A.; Ullrich, P. A.
2017-12-01
Water managers face significant challenges in planning for the coming decades as previously stationary aspects of the regional hydroclimate shift in response to global climate change. Providing scientific insights that enable appropriate use of regional hydroclimate projections for planning is a non-trivial problem. The system of data, models, and methods used to produce regional hydroclimate projections is subject to multiple interacting uncertainties and biases, including uncertainties that arise from general circulation models, re-analysis data products, regional climate models, hydrologic models, and statistical downscaling methods. Moreover, many components of this system were not designed with the information needs of water managers in mind. To address this problem and provide actionable insights into the sources of uncertainty present in regional hydroclimate data products, Project Hyperion has undertaken a stakeholder engagement process in four case study water basins across the US. Teams of water managers and scientists are interacting in a structured manner to identify decision-relevant metrics of model performance. These metrics are in turn being used to drive scientific investigations to uncover the sources of uncertainty in these quantities. Thus far, we have found that identification of climate phenomena of interest to stakeholders is relatively easy, but translating these into specific quantifiable metrics and prioritizing metrics is more challenging. Iterative feedback among scientists and stakeholders has proven critical in resolving these challenges, as have the roles played by boundary spanners who understand and can speak to the perspectives of multiple professional communities. Here we describe the structured format of our engagement process and the lessons learned so far, as we aim to improve the decision-relevance of hydroclimate projections through a collaborative process.
Assessing uncertainties in surface water security: An empirical multimodel approach
NASA Astrophysics Data System (ADS)
Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo M.; Oliveira, Paulo Tarso S.
2015-11-01
Various uncertainties are involved in the representation of processes that characterize interactions among societal needs, ecosystem functioning, and hydrological conditions. Here we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multimodel and resampling framework. We consider several uncertainty sources including those related to (i) observed streamflow data; (ii) hydrological model structure; (iii) residual analysis; (iv) the method for defining Environmental Flow Requirement; (v) the definition of critical conditions for water provision; and (vi) the critical demand imposed by human activities. We estimate the overall hydrological model uncertainty by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km2 agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multimodel framework and the uncertainty estimates provided by each model uncertainty estimation approach. The range of values obtained for the water security indicators suggests that the models/methods are robust and perform well in a range of plausible situations. The method is general and can be easily extended, thereby forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.
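A minimal sketch of the block-bootstrap step, under stated assumptions: synthetic daily flows and Gaussian stand-in residuals replace the real simulated series, and the 30-day block length is arbitrary. Resampling in contiguous blocks preserves residual autocorrelation that simple pointwise resampling would destroy.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical daily simulated flow and model residuals (observed - simulated)
n_days, block = 3650, 30                      # 10 years, 30-day blocks
q_sim = 10 + 5 * np.sin(2 * np.pi * np.arange(n_days) / 365)
residuals = rng.normal(0, 1.0, n_days)        # stand-in for real residuals

def block_bootstrap(res, block_len, rng):
    """Resample a residual series in contiguous blocks."""
    n_blocks = len(res) // block_len + 1
    starts = rng.integers(0, len(res) - block_len, size=n_blocks)
    out = np.concatenate([res[s:s + block_len] for s in starts])
    return out[:len(res)]

# Ensemble of plausible flow realizations -> 95% interval on, e.g., mean flow
means = [np.mean(q_sim + block_bootstrap(residuals, block, rng))
         for _ in range(1000)]
print("95% CI on mean flow:", np.percentile(means, [2.5, 97.5]).round(3))
```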
Andrew D. Richardson; David Y. Hollinger
2007-01-01
Missing values in any data set create problems for researchers. The process by which missing values are replaced, and the data set is made complete, is generally referred to as imputation. Within the eddy flux community, the term "gap filling" is more commonly applied. A major challenge is that random errors in measured data result in uncertainty in the gap-...
Raising the Cash: A Study of the Role of Leadership in a Capital Campaign
ERIC Educational Resources Information Center
Diaz, Sandra L.
2013-01-01
Institutions of higher education are facing a number of challenges as they compete in the new global economy. Many of these institutions face increasing internal and external challenges, all of which involve additional expenses at a time when resources are diminishing. With growing financial uncertainties caused by fluctuating financial markets…
The Challenge of Managing a Large University in Conditions of Uncertainty
ERIC Educational Resources Information Center
Priest, Ann
2012-01-01
This paper is written in the context of practice at Nottingham Trent University, using elements of its management as example or counterbalance to the literature. The challenge at the heart of the paper is that of practically balancing a managerial and business-focused culture and processes to support the aspirations and motivations of the…
2015-04-29
are being conducted for the SpaceX Falcon 9 v1.1 launch system. In addition, in its fiscal year 2016 President’s Budget request, DOD requested funding...DOD expects SpaceX to be certified by June 2015. Additionally, the department has faced unexpected complications, such as challenges to its
NASA Astrophysics Data System (ADS)
Puechberty, Rachel; Bechon, Pierre-Marie; Le Coz, Jérôme; Renard, Benjamin
2015-04-01
The French national hydrological services (NHS) manage the production of streamflow time series throughout the national territory. The hydrological data are made available to end-users through different web applications and the national hydrological archive (Banque Hydro). Providing end-users with qualitative and quantitative information on the uncertainty of the hydrological data is key to allowing them to draw relevant conclusions and make appropriate decisions. Due to technical and organisational issues that are specific to the field of hydrometry, quantifying the uncertainty of hydrological measurements is still challenging and not yet standardized. The French NHS have made progress on building a consistent strategy to assess the uncertainty of their streamflow data. The strategy consists of addressing the uncertainties produced and propagated at each step of the data production with uncertainty analysis tools that are compatible with each other and compliant with international uncertainty guidance and standards. Beyond the necessary research and methodological developments, operational software tools and procedures are indispensable for data management and uncertainty analysis by field hydrologists. A first challenge is to assess, and if possible reduce, the uncertainty of streamgauging data, i.e. direct stage-discharge measurements. Interlaboratory experiments proved to be a very efficient way to empirically measure the uncertainty of a given streamgauging technique in given measurement conditions. The Q+ method (Le Coz et al., 2012) was developed to improve the uncertainty propagation method proposed in the ISO 748 standard for velocity-area gaugings. Both empirical and computed (with Q+) uncertainty values can now be assigned in BAREME, the software used by the French NHS for managing streamgauging measurements. A second pivotal step is to quantify the uncertainty related to stage-discharge rating curves and their application to water level records to produce continuous discharge time series. The management of rating curves is also done using BAREME. The BaRatin method (Le Coz et al., 2014) was developed as a Bayesian approach to rating-curve development and uncertainty analysis. Since BaRatin accounts for the individual uncertainties of the gauging data used to build the rating curve, it was coupled with BAREME. The BaRatin method is still undergoing development and research, in particular to address non-univocal or time-varying stage-discharge relations due to hysteresis, variable backwater, rating shifts, etc. A new interface including new options is under development. The next steps are to propagate the uncertainties of water level records, through uncertain rating curves, up to discharge time series and derived variables (e.g. annual mean flow) and statistics (e.g. flood quantiles). Bayesian tools are already available for both tasks, but further validation and development are necessary for their integration into the operational data workflow of the French NHS. References: Le Coz, J., Camenen, B., Peyrard, X., Dramais, G., 2012. Uncertainty in open-channel discharges measured with the velocity-area method. Flow Measurement and Instrumentation 26, 18-29. Le Coz, J., Renard, B., Bonnifait, L., Branger, F., Le Boursicaud, R., 2014. Combining hydraulic knowledge and uncertain gaugings in the estimation of hydrometric rating curves: a Bayesian approach. Journal of Hydrology 509, 573-587.
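To make the propagation step concrete, here is a sketch of Monte Carlo propagation of stage and rating-curve uncertainty to discharge through a power-law rating Q = a(h - b)^c. The parameter draws stand in for a BaRatin-style posterior; all numbers are hypothetical, and BaRatin's full Bayesian machinery is considerably more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(5)
n_draws = 2000

# Hypothetical rating-curve parameter draws, standing in for a posterior
# over Q = a * (h - b)^c
a = rng.normal(30.0, 2.0, n_draws)    # scale coefficient
b = rng.normal(0.20, 0.02, n_draws)   # cease-to-flow stage offset (m)
c = rng.normal(1.60, 0.05, n_draws)   # exponent

# A short stage record (m) with its own measurement uncertainty
h_obs = np.array([0.8, 1.1, 1.5, 2.3])
h = h_obs + rng.normal(0, 0.01, (n_draws, h_obs.size))

# Propagate all draws to discharge; clip keeps the base non-negative
Q = a[:, None] * np.clip(h - b[:, None], 0, None) ** c[:, None]
lo, hi = np.percentile(Q, [2.5, 97.5], axis=0)
for hh, l, u in zip(h_obs, lo, hi):
    print(f"h = {hh:.1f} m -> Q in [{l:7.1f}, {u:7.1f}] m3/s (95%)")
```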
NASA Astrophysics Data System (ADS)
Rehfeld, Kira; Goswami, Bedartha; Marwan, Norbert; Breitenbach, Sebastian; Kurths, Jürgen
2013-04-01
Statistical analysis of dependencies amongst paleoclimate data helps to infer the climatic processes they reflect. Three key challenges have to be addressed, however: (i) the datasets are heterogeneous in time, (ii) they are heterogeneous in space, and (iii) time itself is a variable that needs to be reconstructed, which introduces additional uncertainties. To address these issues in a flexible way we developed the paleoclimate network framework, inspired by the increasing application of complex networks in climate research. Each node in the paleoclimate network represents a paleoclimate archive and its associated time series. Links between nodes are assigned if their time series are significantly similar. The basis of the paleoclimate network is therefore formed by linear and nonlinear estimators of Pearson correlation, mutual information and event synchronization that quantify similarity from irregularly sampled time series. Age uncertainties are propagated into the final network analysis using time series ensembles that reflect the uncertainty. We discuss how spatial heterogeneity influences the results obtained from network measures, and demonstrate the power of the approach by inferring teleconnection variability of the Asian summer monsoon for the past 1000 years.
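As an illustration of similarity estimation from irregularly sampled series, the following minimal sketch computes a Gaussian-kernel-weighted Pearson correlation without interpolating either record; the kernel-width rule and the synthetic series are illustrative assumptions, not the estimators used in the paper.

# Minimal sketch: kernel-weighted correlation for irregular sampling.
import numpy as np

def kernel_correlation(tx, x, ty, y, h=None):
    """Gaussian-kernel-weighted Pearson correlation of two irregular series."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    dt = np.abs(tx[:, None] - ty[None, :])          # all pairwise time lags
    if h is None:                                   # illustrative width rule
        h = 0.25 * (np.mean(np.diff(tx)) + np.mean(np.diff(ty)))
    w = np.exp(-0.5 * (dt / h) ** 2)                # weight pairs near lag zero
    return np.sum(w * x[:, None] * y[None, :]) / np.sum(w)

rng = np.random.default_rng(0)
tx = np.sort(rng.uniform(0, 100, 80))               # two irregular sampling grids
ty = np.sort(rng.uniform(0, 100, 60))
common = lambda t: np.sin(2 * np.pi * t / 20.0)     # shared "climate" signal
x = common(tx) + 0.3 * rng.normal(size=tx.size)
y = common(ty) + 0.3 * rng.normal(size=ty.size)
print(kernel_correlation(tx, x, ty, y))             # high for strongly linked records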
A solution to the static frame validation challenge problem using Bayesian model selection
Grigoriu, M. D.; Field, R. V.
2007-12-23
Within this paper, we provide a solution to the static frame validation challenge problem (see this issue) in a manner that is consistent with the guidelines provided by the Validation Challenge Workshop tasking document. The static frame problem is constructed such that variability in material properties is known to be the only source of uncertainty in the system description, but there is ignorance about the type of model that best describes this variability. Hence both types of uncertainty, aleatory and epistemic, are present and must be addressed. Our approach is to consider a collection of competing probabilistic models for the material properties and calibrate these models to the information provided; models of different levels of complexity and numerical efficiency are included in the analysis. A Bayesian formulation is used to select the optimal model from the collection, which is then used for the regulatory assessment. Lastly, Bayesian credible intervals are used to provide a measure of confidence in our regulatory assessment.
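The following minimal sketch illustrates Bayesian model selection of the kind described: the marginal likelihood (evidence) of two competing probabilistic models for a material property is estimated by Monte Carlo sampling from their priors, and their ratio gives a Bayes' factor. The data, priors and candidate distributions are illustrative, not those of the paper.

# Minimal sketch: Bayes' factor between two candidate property models.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
data = rng.lognormal(mean=3.0, sigma=0.1, size=20)     # "measured" stiffness values

def evidence_lognormal(n=20000):
    mu = rng.normal(3.0, 0.5, n)                       # prior on log-mean
    sig = rng.uniform(0.05, 0.5, n)                    # prior on log-std
    ll = np.array([stats.lognorm.logpdf(data, s, scale=np.exp(m)).sum()
                   for m, s in zip(mu, sig)])
    return np.exp(ll).mean()                           # Monte Carlo evidence

def evidence_normal(n=20000):
    mu = rng.normal(20.0, 5.0, n)
    sig = rng.uniform(0.5, 5.0, n)
    ll = np.array([stats.norm.logpdf(data, m, s).sum() for m, s in zip(mu, sig)])
    return np.exp(ll).mean()

B = evidence_lognormal() / evidence_normal()           # Bayes' factor
print("Bayes factor (lognormal vs normal):", B)        # >1 favors the lognormal model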
A Survey of Recent Advances in Particle Filters and Remaining Challenges for Multitarget Tracking
Wang, Xuedong; Sun, Shudong; Corchado, Juan M.
2017-01-01
We review advances in the particle filtering (PF) algorithm achieved over the last decade in the context of target tracking, with regard to either a single target or multiple targets in the presence of false or missing data. The first part of our review covers remarkable achievements for the single-target PF from several aspects, including importance proposals, computing efficiency, particle degeneracy/impoverishment and constrained/multi-modal systems. The second part analyzes the intractable challenges raised within general multitarget (multi-sensor) tracking due to random target birth and termination, false alarms, misdetection, measurement-to-track (M2T) uncertainty and track uncertainty. The mainstream multitarget PF approaches fall into two main classes: those based on M2T association and those, such as the finite-set-statistics-based PF, that avoid explicit association. In either case, significant challenges remain due to unknown tracking scenarios and integrated tracking management. PMID:29168772
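For readers unfamiliar with the basic algorithm the survey builds on, here is a minimal sketch of a bootstrap particle filter for a toy 1-D nonlinear state-space model, showing the predict/weight/resample cycle; the model and noise levels are illustrative assumptions, not from the paper.

# Minimal sketch: bootstrap particle filter on a toy nonlinear model.
import numpy as np

rng = np.random.default_rng(3)
T, N = 50, 500                                    # time steps, particles

# Simulate x_t = 0.5 x + 8 cos(1.2 t) + noise, observed as y = x^2/20 + noise
x_true, ys = 0.0, []
for t in range(T):
    x_true = 0.5 * x_true + 8 * np.cos(1.2 * t) + rng.normal(0, 1.0)
    ys.append(x_true**2 / 20 + rng.normal(0, 1.0))

particles = rng.normal(0, 2.0, N)
estimates = []
for t, y in enumerate(ys):
    # Predict: propagate each particle through the dynamics
    particles = 0.5 * particles + 8 * np.cos(1.2 * t) + rng.normal(0, 1.0, N)
    # Weight: likelihood of the observation under each particle
    w = np.exp(-0.5 * (y - particles**2 / 20) ** 2) + 1e-300
    w /= w.sum()
    # Resample: multinomial resampling to combat degeneracy
    particles = particles[rng.choice(N, N, p=w)]
    estimates.append(particles.mean())
print(estimates[-5:])                             # filtered state estimates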
Validation of the thermal challenge problem using Bayesian Belief Networks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McFarland, John; Swiler, Laura Painton
The thermal challenge problem has been developed at Sandia National Laboratories as a testbed for demonstrating various types of validation approaches and prediction methods. This report discusses one particular methodology to assess the validity of a computational model given experimental data. This methodology is based on Bayesian Belief Networks (BBNs) and can incorporate uncertainty in experimental measurements, in physical quantities, and in the model itself. The approach uses the prior and posterior distributions of model output to compute a validation metric based on Bayesian hypothesis testing (a Bayes' factor). This report discusses various aspects of the BBN, specifically in the context of the thermal challenge problem. A BBN is developed for a given set of experimental data in a particular experimental configuration. The development of the BBN, and the method for "solving" the BBN to develop the posterior distribution of model output through Markov chain Monte Carlo sampling, are discussed in detail. The use of the BBN to compute a Bayes' factor is demonstrated.
Indonesia Country Analysis Brief
2015-01-01
Indonesia is reorienting energy production from serving primarily export markets to meeting its growing domestic consumption. Indonesia's energy industry has faced challenges in recent years from regulatory uncertainty and inadequate investment.
Wildfire Decision Making Under Uncertainty
NASA Astrophysics Data System (ADS)
Thompson, M.
2013-12-01
Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards the identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately improved wildfire management outcomes.
NASA Technical Reports Server (NTRS)
Owens, Andrew; De Weck, Olivier L.; Stromgren, Chel; Goodliff, Kandyce; Cirillo, William
2017-01-01
Future crewed missions to Mars present a maintenance logistics challenge that is unprecedented in human spaceflight. Mission endurance – defined as the time between resupply opportunities – will be significantly longer than previous missions, and therefore logistics planning horizons are longer and the impact of uncertainty is magnified. Maintenance logistics forecasting typically assumes that component failure rates are deterministically known and uses them to represent aleatory uncertainty, or uncertainty that is inherent to the process being examined. However, failure rates cannot be directly measured; rather, they are estimated based on similarity to other components or statistical analysis of observed failures. As a result, epistemic uncertainty – that is, uncertainty in knowledge of the process – exists in failure rate estimates that must be accounted for. Analyses that neglect epistemic uncertainty tend to significantly underestimate risk. Epistemic uncertainty can be reduced via operational experience; for example, the International Space Station (ISS) failure rate estimates are refined using a Bayesian update process. However, design changes may re-introduce epistemic uncertainty. Thus, there is a tradeoff between changing a design to reduce failure rates and operating a fixed design to reduce uncertainty. This paper examines the impact of epistemic uncertainty on maintenance logistics requirements for future Mars missions, using data from the ISS Environmental Control and Life Support System (ECLS) as a baseline for a case study. Sensitivity analyses are performed to investigate the impact of variations in failure rate estimates and epistemic uncertainty on spares mass. The results of these analyses and their implications for future system design and mission planning are discussed.
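A minimal sketch of the kind of Bayesian failure-rate update mentioned for the ISS estimates follows: a conjugate gamma prior on a Poisson failure rate is updated with observed failures over operating time, so the epistemic uncertainty (interval width) shrinks as experience accumulates. All numbers are illustrative, not ISS ECLS data.

# Minimal sketch: gamma-Poisson update of a component failure rate.
import numpy as np
from scipy import stats

# Prior: rate of order 1e-4 per hour, but highly uncertain (epistemic)
alpha0, beta0 = 0.5, 5000.0                           # gamma shape / rate parameters

# Operational experience: 2 failures observed over 30,000 unit-hours
failures, exposure = 2, 30000.0
alpha1, beta1 = alpha0 + failures, beta0 + exposure   # conjugate gamma posterior

prior = stats.gamma(alpha0, scale=1 / beta0)
post = stats.gamma(alpha1, scale=1 / beta1)
print("prior 95% interval:    ", prior.ppf([0.025, 0.975]))
print("posterior 95% interval:", post.ppf([0.025, 0.975]))
# The interval narrows with operating experience; a design change would widen
# it again, which is the design-vs-operate trade-off the paper examines.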
Yao, Shuai-Lei; Luo, Jing-Jia; Huang, Gang
2016-01-01
Regional climate projections are challenging because of large uncertainty, particularly that stemming from unpredictable internal variability of the climate system. Here, we examine the internal-variability-induced uncertainty in precipitation and surface air temperature (SAT) trends during 2005-2055 over East Asia based on 40-member ensemble projections of the Community Climate System Model Version 3 (CCSM3). The model ensembles are generated from a suite of different atmospheric initial conditions using the same SRES A1B greenhouse gas scenario. We find that projected precipitation trends are subject to considerably larger internal uncertainty, and hence have lower confidence, than the projected SAT trends in both boreal winter and summer. Projected SAT trends in winter have relatively higher uncertainty than those in summer. In addition, the lower-level atmospheric circulation has larger uncertainty than the mid-level circulation. Based on k-means cluster analysis, we demonstrate that a substantial portion of the internally induced precipitation and SAT trends arises from internal large-scale atmospheric circulation variability. These results highlight the importance of internal climate variability in regional climate projections on multi-decadal timescales.
Hoffmann, Sabine; Rage, Estelle; Laurier, Dominique; Laroche, Pierre; Guihenneuc, Chantal; Ancelet, Sophie
2017-02-01
Many occupational cohort studies on underground miners have demonstrated that radon exposure is associated with an increased risk of lung cancer mortality. However, despite the deleterious consequences of exposure measurement error on statistical inference, these analyses traditionally do not account for exposure uncertainty. This might be due to the challenging nature of measurement error resulting from imperfect surrogate measures of radon exposure. Indeed, we are typically faced with exposure uncertainty in a time-varying exposure variable where both the type and the magnitude of error may depend on period of exposure. To address the challenge of accounting for multiplicative and heteroscedastic measurement error that may be of Berkson or classical nature, depending on the year of exposure, we opted for a Bayesian structural approach, which is arguably the most flexible method to account for uncertainty in exposure assessment. We assessed the association between occupational radon exposure and lung cancer mortality in the French cohort of uranium miners and found the impact of uncorrelated multiplicative measurement error to be of marginal importance. However, our findings indicate that the retrospective nature of exposure assessment that occurred in the earliest years of mining of this cohort as well as many other cohorts of underground miners might lead to an attenuation of the exposure-risk relationship. More research is needed to address further uncertainties in the calculation of lung dose, since this step will likely introduce important sources of shared uncertainty.
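The distinction between classical and Berkson error that drives this analysis can be illustrated with a small simulation: under multiplicative classical error a naive regression slope is attenuated, while under multiplicative Berkson error with mean-one error it remains approximately unbiased. The toy linear response and error magnitudes below are illustrative assumptions, not the cohort's dosimetry.

# Minimal sketch: classical vs Berkson multiplicative measurement error.
import numpy as np

rng = np.random.default_rng(4)
n, beta = 200000, 0.02
x_true = rng.lognormal(2.0, 0.8, n)                    # true cumulative exposures

# Multiplicative error factor with mean 1: u = lognormal(-s^2/2, s)
u = lambda s: rng.lognormal(-0.5 * s**2, s, n)

# Classical error: the surrogate scatters around the truth (z = x * u)
z_classical = x_true * u(0.5)
y = beta * x_true + rng.normal(0, 1, n)                # linear response, for simplicity

# Berkson error: the truth scatters around the assigned value (x = z * u)
z_berkson = rng.lognormal(2.0, 0.6, n)                 # assigned (e.g. group mean) exposures
x_berkson = z_berkson * u(0.5)
y_berkson = beta * x_berkson + rng.normal(0, 1, n)

slope = lambda z, yy: np.polyfit(z, yy, 1)[0]
print("true slope:", beta)
print("naive fit under classical error:", slope(z_classical, y))        # attenuated
print("naive fit under Berkson error:  ", slope(z_berkson, y_berkson))  # ~unbiased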
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mueller, Don; Rearden, Bradley T; Reed, Davis Allan
2010-01-01
One of the challenges associated with implementation of burnup credit is the validation of criticality calculations used in the safety evaluation, in particular the availability and use of applicable critical experiment data. The purpose of the validation is to quantify the relationship between reality and calculated results. Validation and determination of bias and bias uncertainty require the identification of sets of critical experiments that are similar to the criticality safety models. A principal challenge for crediting fission products (FP) in a burnup credit safety evaluation is the limited availability of relevant FP critical experiments for bias and bias uncertainty determination. This paper provides an evaluation of the available critical experiments that include FPs, along with bounding, burnup-dependent estimates of FP biases generated by combining energy-dependent sensitivity data for a typical burnup credit application with the nuclear data uncertainty information distributed with SCALE 6. A method for determining separate bias and bias uncertainty values for individual FPs is presented, along with illustrative results. Finally, a FP bias calculation method based on data adjustment techniques and reactivity sensitivity coefficients calculated with the SCALE sensitivity/uncertainty tools is presented, with some typical results. Using the methods described in this paper, the cross-section bias for a representative high-capacity spent fuel cask associated with the ENDF/B-VII nuclear data for the 16 most important stable or near-stable FPs is predicted to be no greater than 2% of the total worth of the 16 FPs, or less than 0.13% Δk/k.
Shah, Kavita R.; Sarma, Karthik V.; Mahajan, Anish P.
2013-01-01
Despite the HIV “test-and-treat” strategy’s promise, questions about its clinical rationale, operational feasibility, and ethical appropriateness have led to vigorous debate in the global HIV community. We performed a systematic review of the literature published between January 2009 and May 2012 using PubMed, SCOPUS, Global Health, Web of Science, BIOSIS, Cochrane CENTRAL, EBSCO Africa-Wide Information, and EBSCO CINAHL Plus databases to summarize clinical uncertainties, health service challenges, and ethical complexities that may affect the test-and-treat strategy’s success. A thoughtful approach to research and implementation to address clinical and health service questions and meaningful community engagement regarding ethical complexities may bring us closer to safe, feasible, and effective test-and-treat implementation. PMID:23597344
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sung, Yixing; Adams, Brian M.; Witkowski, Walter R.
2011-04-01
The CASL Level 2 Milestone VUQ.Y1.03, 'Enable statistical sensitivity and UQ demonstrations for VERA,' was successfully completed in March 2011. The VUQ focus area led this effort, in close partnership with AMA and with support from VRI. DAKOTA was coupled to VIPRE-W thermal-hydraulics simulations representing reactors of interest to address crud-related challenge problems, in order to understand the sensitivity and uncertainty in simulation outputs with respect to uncertain operating and model form parameters. This report summarizes work coupling the software tools, characterizing uncertainties, selecting sensitivity and uncertainty quantification algorithms, and analyzing the results of iterative studies. These demonstration studies focused on the sensitivity and uncertainty of the mass evaporation rate calculated by VIPRE-W, a key predictor for crud-induced power shift (CIPS).
Planning Under Continuous Time and Resource Uncertainty: A Challenge for AI
NASA Technical Reports Server (NTRS)
Bresina, John; Dearden, Richard; Meuleau, Nicolas; Smith, David; Washington, Rich; Clancy, Daniel (Technical Monitor)
2002-01-01
There has been considerable work in AI on decision-theoretic planning and planning under uncertainty. Unfortunately, all of this work suffers from one or more of the following limitations: 1) it relies on very simple models of actions and time, 2) it assumes that uncertainty is manifested in discrete action outcomes, and 3) it is only practical for very small problems. For many real-world problems, these assumptions fail to hold. A case in point is planning the activities for a Mars rover. For this domain none of the above assumptions is valid: 1) actions can be concurrent and have differing durations, 2) there is uncertainty concerning action durations and consumption of continuous resources like power, and 3) typical daily plans involve on the order of a hundred actions. We describe the rover problem, discuss previous work on planning under uncertainty, and present a detailed, but very small, example illustrating some of the difficulties of finding good plans.
Woolacott, Nerys; Corbett, Mark; Jones-Diette, Julie; Hodgson, Robert
2017-10-01
Regulatory authorities are approving innovative therapies with limited evidence. Although this level of data is sufficient for the regulator to establish an acceptable risk-benefit balance, it is problematic for downstream health technology assessment, where assessment of cost-effectiveness requires reliable estimates of effectiveness relative to existing clinical practice. Key issues associated with a limited evidence base include using data from nonrandomized studies, from small single-arm trials, or from single-center trials, and using surrogate end points. We examined these methodological challenges through a pragmatic review of the available literature. Methods to adjust nonrandomized studies for confounding are imperfect. The relative treatment effect generated from single-arm trials is uncertain and may be optimistic. Single-center trial results may not be generalizable. Surrogate end points, on average, overestimate treatment effects. Current methods for analyzing such data are limited, and effectiveness claims based on these suboptimal forms of evidence are likely to be subject to significant uncertainty. Assessments of cost-effectiveness based on the modeling of such data are likewise subject to considerable uncertainty. This uncertainty must not be underestimated by decision makers: methods for its quantification are required, and schemes to protect payers from the cost of uncertainty should be implemented.
NASA Astrophysics Data System (ADS)
Touhidul Mustafa, Syed Md.; Nossent, Jiri; Ghysels, Gert; Huysmans, Marijke
2017-04-01
Transient numerical groundwater flow models have been used to understand and forecast groundwater flow systems under anthropogenic and climatic effects, but the reliability of the predictions is strongly influenced by different sources of uncertainty. Hence, researchers in hydrological sciences are developing and applying methods for uncertainty quantification. Nevertheless, spatially distributed flow models pose significant challenges for parameter and spatially distributed input estimation and uncertainty quantification. In this study, we present a general and flexible approach for input and parameter estimation and uncertainty analysis of groundwater models. The proposed approach combines a fully distributed groundwater flow model (MODFLOW) with the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm. To avoid over-parameterization, the uncertainty of the spatially distributed model input has been represented by multipliers. The posterior distributions of these multipliers and the regular model parameters were estimated using DREAM. The proposed methodology has been applied in an overexploited aquifer in Bangladesh where groundwater pumping and recharge data are highly uncertain. The results confirm that input uncertainty does have a considerable effect on the model predictions and parameter distributions. Additionally, our approach also provides a new way to optimize the spatially distributed recharge and pumping data along with the parameter values under uncertain input conditions. It can be concluded from our approach that considering model input uncertainty along with parameter uncertainty is important for obtaining realistic model predictions and a correct estimation of the uncertainty bounds.
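A minimal sketch of the two key ingredients, input multipliers and a DREAM-like differential-evolution proposal, is given below; the toy scalar "groundwater model", priors and tuning constants are illustrative stand-ins for MODFLOW and the full DREAM algorithm.

# Minimal sketch: input multipliers inferred with differential-evolution Metropolis.
import numpy as np

rng = np.random.default_rng(5)
recharge_field = rng.uniform(0.5, 1.5, (3, 100))   # prior recharge estimate, 3 zones

def model(theta):
    # Toy stand-in for MODFLOW: a scalar head response that rises with
    # zone-scaled recharge and falls with hydraulic conductivity
    k, m = theta[0], theta[1:]
    return (recharge_field * m[:, None]).sum() / k

truth = np.array([2.0, 1.2, 0.8, 1.0])             # true k and zone multipliers
head_obs = model(truth) + rng.normal(0, 1.0)

def log_post(theta):
    if np.any(theta <= 0):
        return -np.inf                             # positivity prior
    return -0.5 * ((head_obs - model(theta)) / 1.0) ** 2

# Propose by jumping along the difference of two other chains, the proposal
# idea at the core of DREAM
nchains, ndim = 8, 4
gamma = 2.38 / np.sqrt(2 * ndim)
X = np.abs(rng.normal(1.0, 0.3, (nchains, ndim)))
lp = np.array([log_post(x) for x in X])
for _ in range(3000):
    for i in range(nchains):
        r1, r2 = rng.choice([j for j in range(nchains) if j != i], 2, replace=False)
        prop = X[i] + gamma * (X[r1] - X[r2]) + rng.normal(0, 1e-3, ndim)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp[i]:
            X[i], lp[i] = prop, lp_prop
print(X.mean(axis=0))   # posterior means of conductivity and recharge multipliers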
Challenges in Incorporating Climate Change Adaptation into Integrated Water Resources Management
NASA Astrophysics Data System (ADS)
Kirshen, P. H.; Cardwell, H.; Kartez, J.; Merrill, S.
2011-12-01
Over the last few decades, integrated water resources management (IWRM), under various names, has become the accepted philosophy for water management in the USA. While much is still to be learned about how to actually carry it out, implementation is slowly moving forward, spurred by both legislation and the demands of stakeholders. New challenges to IWRM have arisen because of climate change. Climate change has placed increased demands on the creativity of planners and engineers because they must now design systems that will function over decades of hydrologic uncertainties that dwarf any previous hydrologic or other uncertainties. Climate and socio-economic monitoring systems must also now be established to determine when the future climate has changed sufficiently to warrant undertaking adaptation. The requirements for taking some actions now and preserving options for future actions, as well as the increased risk of social inequities in climate change impacts and adaptation, are challenging experts in stakeholder participation. To meet these challenges, an integrated methodology is essential that builds upon scenario analysis, risk assessment, statistical decision theory, participatory planning, and consensus building. This integration will require these disciplines to work across their traditional boundaries.
Zizzo, Natalie; Racine, Eric
2017-11-09
Fetal alcohol spectrum disorder (FASD) is a leading form of neurodevelopmental delay in Canada, affecting an estimated 3000 babies per year. FASD involves a range of disabilities that entail significant costs to affected individuals, families, and society. Exposure to alcohol in utero is a necessary factor for FASD development, and this has led to FASD being described as "completely preventable". However, there are significant ethical challenges associated with FASD prevention. These challenges revolve around 1) what should be communicated about the risks of alcohol consumption during pregnancy, given some ongoing scientific uncertainty about the effects of prenatal alcohol exposure, and 2) how to communicate these risks, given the potential for stigma against women who give birth to children with FASD as well as against children and adults with FASD. In this paper, we share initial thoughts on how primary care physicians can tackle this complex challenge. First, we recommend honest disclosure of scientific evidence to women and the tailoring of information offered to pregnant women. Second, we propose a contextualized, patient-centred, compassionate approach to ensure that appropriate advice is given to patients in a supportive, non-stigmatizing way.
A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campos, E; Sisterson, Douglas
The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed. Note that some of the instrument observations require mathematical algorithms (retrievals) to convert a measured engineering variable into a useful geophysical measurement. While those types of retrieval measurements are identified, this study does not address particular methods for retrieval uncertainty. The ARM Facility also provides engineered data products, or value-added products (VAPs), based on multiple instrument measurements; this study does not include uncertainty estimates for those data products. We propose here that a total measurement uncertainty should be calculated as a function of the instrument uncertainty (calibration factors), the field uncertainty (environmental factors), and the retrieval uncertainty (algorithm factors). The study will not expand on methods for computing these uncertainties. Instead, it will focus on the practical identification, characterization, and inventory of the measurement uncertainties already available in the ARM community through the ARM instrument mentors and their ARM instrument handbooks. As a result, this study addresses the first steps towards reporting ARM measurement uncertainty: 1) identifying how the uncertainty of individual ARM measurements is currently expressed, 2) identifying a consistent approach to measurement uncertainty, and 3) reclassifying ARM instrument measurement uncertainties in a common framework.
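Assuming the three proposed components are independent standard uncertainties, the total-uncertainty composition reduces to a GUM-style quadrature sum, as in this minimal sketch; the numbers are illustrative, not ARM values.

# Minimal sketch: combining independent standard uncertainties in quadrature.
import math

def total_uncertainty(u_cal, u_field, u_retrieval=0.0):
    """Instrument (calibration), field (environmental) and retrieval terms."""
    return math.sqrt(u_cal**2 + u_field**2 + u_retrieval**2)

# Illustrative numbers for a radiometric measurement in W/m^2
print(total_uncertainty(u_cal=2.0, u_field=1.5, u_retrieval=3.0))  # -> ~3.9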
Aerospace Applications of Optimization under Uncertainty
NASA Technical Reports Server (NTRS)
Padula, Sharon; Gumbert, Clyde; Li, Wu
2003-01-01
The Multidisciplinary Optimization (MDO) Branch at NASA Langley Research Center develops new methods and investigates opportunities for applying optimization to aerospace vehicle design. This paper describes MDO Branch experiences with three applications of optimization under uncertainty: (1) improved impact dynamics for airframes, (2) transonic airfoil optimization for low drag, and (3) coupled aerodynamic/structures optimization of a 3-D wing. For each case, a brief overview of the problem and references to previous publications are provided. The three cases are aerospace examples of the challenges and opportunities presented by optimization under uncertainty. The present paper will illustrate a variety of needs for this technology, summarize promising methods, and uncover fruitful areas for new research.
NASA Astrophysics Data System (ADS)
Han, Feng; Zheng, Yi
2018-06-01
Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling, and it remains challenging to address this uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived for a lag-1 autocorrelated, heteroscedastic, Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings are the following: (1) BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) BAIPU accounts for the interaction between input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
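A minimal sketch of the residual error model class underlying such a formal likelihood follows, using the Gaussian special case of the SEP family (skewness xi = 1, kurtosis beta = 0) with lag-1 autocorrelation and a standard deviation that grows linearly with the prediction; all constants are illustrative, not the paper's calibrated values.

# Minimal sketch: lag-1 autocorrelated, heteroscedastic error log-likelihood.
import numpy as np

def log_likelihood(y_obs, y_sim, phi, s0, s1):
    """phi: lag-1 autocorrelation; sigma_t = s0 + s1 * y_sim (heteroscedastic)."""
    e = y_obs - y_sim                        # raw residuals
    sigma = s0 + s1 * y_sim                  # error sd grows with the prediction
    a = e / sigma                            # standardized residuals
    eta = a[1:] - phi * a[:-1]               # decorrelated innovations
    ll = -0.5 * np.sum(eta**2) - np.sum(np.log(sigma[1:]))
    ll += -0.5 * a[0]**2 * (1 - phi**2) - np.log(sigma[0]) + 0.5 * np.log(1 - phi**2)
    return ll - 0.5 * len(y_obs) * np.log(2 * np.pi)

rng = np.random.default_rng(6)
y_sim = 10 + 5 * np.sin(np.linspace(0, 6, 200))         # toy nitrate series
y_obs = y_sim + (0.5 + 0.1 * y_sim) * rng.normal(size=200)
print(log_likelihood(y_obs, y_sim, phi=0.4, s0=0.5, s1=0.1))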
Cloud Ice: A Climate Model Challenge With Signs and Expectations of Progress
NASA Astrophysics Data System (ADS)
Li, F.; Waliser, D.; Bacmeister, J.; Chern, J.; Del Genio, T.; Jiang, J.; Kharitondov, M.; Liou, K.; Meng, H.; Minnis, P.; Rossow, B.; Stephens, G.; Sun-Mack, S.; Tao, W.; Vane, D.; Woods, C.; Tompkins, A.; Wu, D.
2007-12-01
Global climate models (GCMs), including those assessed in the IPCC AR4, exhibit considerable disagreement in the amount of cloud ice, both in terms of the annual global mean and its spatial variability. Global measurements of cloud ice have been difficult due to the challenges involved in remotely sensing ice water content (IWC) and its vertical profile, including complications associated with multi-level clouds, mixed phases and multiple hydrometeor types, the uncertainty in classifying ice particle size and shape for remote retrievals, and the relatively small time and space scales associated with deep convection. Together, these measurement difficulties make it a challenge to characterize and understand the mechanisms of ice cloud formation and dissipation. Fortunately, recently established observational resources can be expected to lead to a considerable reduction in the observational uncertainties of cloud ice and, in turn, improve the fidelity of model representations. Specifically, these include the Microwave Limb Sounder (MLS) on the Earth Observing System (EOS) Aura satellite, and the CloudSat and Calipso satellite missions, all of which fly in formation in what is referred to as the A-Train. Based on radar and limb-sounding techniques, these new satellite measurements provide a considerable leap forward in the information gathered on upper-tropospheric cloud IWC as well as other macrophysical and microphysical properties. In this presentation, we describe the current state of GCM representations of cloud ice and their associated uncertainties, the nature of the new observational resources for constraining cloud ice values in GCMs, the challenges in making model-data comparisons with these data resources, and prospects for near-term improvements in model representations.
Significant Figures in Measurements with Uncertainty: A Working Criterion
NASA Astrophysics Data System (ADS)
Vilchis, Abraham
2017-03-01
Generally speaking, students have difficulty reporting measurements and estimates of quantities used in the laboratory, and with handling the significant figures associated with them. When required to make calculations involving quantities with different numbers of significant figures, they have difficulty assigning the corresponding digits to the final result. When, in addition, the quantities have uncertainty, the operations involved pose an even greater challenge. The article advocates some working rules for students (and teachers) in an effort to combat this problem.
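One common working rule of the kind the article's topic suggests can be written down directly: round the uncertainty to a fixed number of significant figures, then round the value to the same decimal place. The helper below is an illustrative sketch of that rule, not the article's specific criterion.

# Minimal sketch: rounding a value to match its uncertainty.
import math

def round_with_uncertainty(value, uncertainty, sig_figs=2):
    if uncertainty <= 0:
        return value, uncertainty
    exponent = math.floor(math.log10(uncertainty))
    decimals = sig_figs - 1 - exponent            # decimal places to keep
    return round(value, decimals), round(uncertainty, decimals)

print(round_with_uncertainty(9.8196, 0.0437))     # -> (9.82, 0.044)
print(round_with_uncertainty(1234.5, 27.0, 1))    # -> (1230.0, 30.0)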
Uncertainty estimation of simulated water levels for the Mitch flood event in Tegucigalpa
NASA Astrophysics Data System (ADS)
Fuentes Andino, Diana Carolina; Halldin, Sven; Beven, Keith; Xu, Chong-Yu
2013-04-01
Hurricane Mitch in 1998 caused a devastating flood in Tegucigalpa, the capital city of Honduras. Because of the extremely large magnitude of the Mitch flood, hydrometric measurements were not taken during the event. However, post-event indirect measurements of the discharge were obtained by the U.S. Geological Survey (USGS), and post-event surveyed high water marks were obtained by the Japan International Cooperation Agency (JICA). This work proposes a methodology to simulate the water level during the Mitch event when the available data are associated with large uncertainty. The results of the two-dimensional hydrodynamic model LISFLOOD-FP will be evaluated using the Generalized Likelihood Uncertainty Estimation (GLUE) framework. The main challenge in the proposed methodology is to formulate an approach to evaluate the model results when there are large uncertainties coming from both the model parameters and the evaluation data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santamarina, A.; Bernard, D.; Dos Santos, N.
This paper describes a method to define relevant targeted integral measurements that allow the improvement of nuclear data evaluations and the determination of corresponding reliable covariances. ²³⁵U and ⁵⁶Fe examples are given for the improvement of JEFF3 data. Uses of these covariances are shown for sensitivity and representativity studies, uncertainty calculations, and transposition of experimental results to industrial applications. S/U studies are increasingly used in reactor physics and criticality safety; however, the reliability of study results relies strongly on the relevancy of the nuclear data covariances. Our method derives the real uncertainty associated with each evaluation from calibration on targeted integral measurements. These realistic covariance matrices allow reliable JEFF3.1.1 calculation of the prior uncertainty due to nuclear data, as well as uncertainty reduction based on representative integral experiments, in challenging design calculations such as the GEN3 and RJH reactors.
Uncertainty Quantification in Aeroelasticity
NASA Astrophysics Data System (ADS)
Beran, Philip; Stanford, Bret; Schrock, Christopher
2017-01-01
Physical interactions between a fluid and structure, potentially manifested as self-sustained or divergent oscillations, can be sensitive to many parameters whose values are uncertain. Of interest here are aircraft aeroelastic interactions, which must be accounted for in aircraft certification and design. Deterministic prediction of these aeroelastic behaviors can be difficult owing to physical and computational complexity. New challenges are introduced when physical parameters and elements of the modeling process are uncertain. By viewing aeroelasticity through a nondeterministic prism, where key quantities are assumed stochastic, one may gain insights into how to reduce system uncertainty, increase system robustness, and maintain aeroelastic safety. This article reviews uncertainty quantification in aeroelasticity using traditional analytical techniques not reliant on computational fluid dynamics; compares and contrasts this work with emerging methods based on computational fluid dynamics, which target richer physics; and reviews the state of the art in aeroelastic optimization under uncertainty. Barriers to continued progress, for example, the so-called curse of dimensionality, are discussed.
Deng, Yue; Bao, Feng; Yang, Yang; Ji, Xiangyang; Du, Mulong; Zhang, Zhengdong
2017-01-01
The automated transcript discovery and quantification of high-throughput RNA sequencing (RNA-seq) data are important tasks in next-generation sequencing (NGS) research. However, these tasks are challenging due to the uncertainties that arise in inferring complete splicing isoform variants from partially observed short reads. Here, we address this problem by explicitly reducing the inherent uncertainties in a biological system caused by missing information. In our approach, the RNA-seq procedure for transforming transcripts into short reads is considered an information transmission process. Consequently, the data uncertainties are substantially reduced by exploiting the information transduction capacity of information theory. Experimental results obtained from analyses of simulated datasets and RNA-seq datasets from cell lines and tissues demonstrate the advantages of our method, MaxInfo, over state-of-the-art competitors; an open-source implementation is available. PMID:28911101
NASA Astrophysics Data System (ADS)
Subramanian, Aneesh C.; Palmer, Tim N.
2017-06-01
Stochastic schemes to represent model uncertainty in the European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble prediction system have helped improve its probabilistic forecast skill over the past decade, both by improving its reliability and by reducing the ensemble mean error. The largest uncertainties in the model arise from the physics parameterizations. In the tropics, the parameterization of moist convection presents a major challenge for the accurate prediction of weather and climate. Superparameterization is a promising alternative strategy for including the effects of moist convection through explicit turbulent fluxes calculated from a cloud-resolving model (CRM) embedded within a global climate model (GCM). In this paper, we compare the impact of initial random perturbations in embedded CRMs, within the ECMWF ensemble prediction system, with the stochastically perturbed physical tendency (SPPT) scheme as a way to represent model uncertainty in medium-range tropical weather forecasts. We focus especially on forecasts of tropical convection and dynamics during the MJO events of October-November 2011. These are well-studied events for MJO dynamics, as they were also heavily observed during the DYNAMO field campaign. We show that the multiscale ensemble modeling approach improves forecasts of certain aspects of tropical convection during the MJO events, while it tends to deteriorate certain large-scale dynamic fields relative to the SPPT approach used operationally at ECMWF.
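For reference, the SPPT idea itself is compact: the net physics tendency is multiplied by (1 + r), where r is a smooth random pattern evolving as an AR(1) process. The following minimal 1-D sketch uses illustrative grid size, decorrelation time and amplitude, not the operational ECMWF settings.

# Minimal sketch: stochastically perturbed physics tendencies in one dimension.
import numpy as np

rng = np.random.default_rng(7)
nx, nt, dt = 64, 100, 900.0            # grid points, steps, step length (s)
tau, sigma = 6 * 3600.0, 0.5           # 6-h decorrelation, pattern std
phi = np.exp(-dt / tau)                # AR(1) coefficient per time step

T = np.full(nx, 300.0)                 # temperature field (K)
r = np.zeros(nx)                       # perturbation pattern
for _ in range(nt):
    tendency = -0.001 * (T - 300.0) + 1e-4        # stand-in for net physics tendency
    r = phi * r + np.sqrt(1 - phi**2) * sigma * rng.normal(size=nx)
    r = np.clip(r, -1.0, 1.0)          # keep the multiplier positive
    T += dt * tendency * (1 + r)       # stochastically perturbed tendency
print(T.mean(), T.std())               # spread develops from the stochastic pattern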
Are models, uncertainty, and dispute resolution compatible?
NASA Astrophysics Data System (ADS)
Anderson, J. D.; Wilson, J. L.
2013-12-01
Models and their uncertainty often move from an objective use in planning and decision making into the regulatory environment, and then sometimes on to dispute resolution through litigation or other legal forums. Through this last transition, whatever objectivity the models and uncertainty assessment may have once possessed becomes biased (or more biased) as each party chooses to exaggerate either the goodness of a model or its worthlessness, depending on which view is in its best interest. If worthlessness is desired, then what was uncertain becomes unknown, or even unknowable. If goodness is desired, then precision and accuracy are often exaggerated, and uncertainty, if it is explicitly recognized, encompasses only some parameters or conceptual issues, ignores others, and may minimize the uncertainty that it accounts for. In dispute resolution, how well is the adversarial process able to deal with these biases? The challenge is that they are often cloaked in computer graphics and animations that appear to lend realism to what could be mostly fancy, or even a manufactured outcome. While junk science can be challenged through appropriate motions in federal court, and in most state courts, it is not unusual for biased or even incorrect modeling results, or conclusions based on incorrect results, to be presented at trial. Courts allow opinions that are based on a "reasonable degree of scientific certainty," but when that certainty is grossly exaggerated by an expert, one way or the other, how well do the courts determine that someone has stepped over the line? Trials are based on the adversarial system of justice, so opposing and often irreconcilable views are commonly allowed, leaving it to the judge or jury to sort out the truth. Can advances in scientific theory and engineering practice, related to both modeling and uncertainty, help address this situation and better ensure that juries and judges see more objective modeling results, or at least measures of those results that help to expose biases?
Hess, Jeremy J.; Ebi, Kristie L.; Markandya, Anil; Balbus, John M.; Wilkinson, Paul; Haines, Andy; Chalabi, Zaid
2014-01-01
Background: Policy decisions regarding climate change mitigation are increasingly incorporating the beneficial and adverse health impacts of greenhouse gas emission reduction strategies. Studies of such co-benefits and co-harms involve modeling approaches requiring a range of analytic decisions that affect the model output. Objective: Our objective was to assess analytic decisions regarding model framework, structure, choice of parameters, and handling of uncertainty when modeling health co-benefits, and to make recommendations for improvements that could increase policy uptake. Methods: We describe the assumptions and analytic decisions underlying models of mitigation co-benefits, examining their effects on modeling outputs, and consider tools for quantifying uncertainty. Discussion: There is considerable variation in approaches to valuation metrics, discounting methods, uncertainty characterization and propagation, and assessment of low-probability/high-impact events. There is also variable inclusion of adverse impacts of mitigation policies, and limited extension of modeling domains to include implementation considerations. Going forward, co-benefits modeling efforts should be carried out in collaboration with policy makers; these efforts should include the full range of positive and negative impacts and critical uncertainties, as well as a range of discount rates, and should explicitly characterize uncertainty. We make recommendations to improve the rigor and consistency of modeling of health co-benefits. Conclusion: Modeling health co-benefits requires systematic consideration of the suitability of model assumptions, of what should be included and excluded from the model framework, and how uncertainty should be treated. Increased attention to these and other analytic decisions has the potential to increase the policy relevance and application of co-benefits modeling studies, potentially helping policy makers to maximize mitigation potential while simultaneously improving health. Citation: Remais JV, Hess JJ, Ebi KL, Markandya A, Balbus JM, Wilkinson P, Haines A, Chalabi Z. 2014. Estimating the health effects of greenhouse gas mitigation strategies: addressing parametric, model, and valuation challenges. Environ Health Perspect 122:447–455; http://dx.doi.org/10.1289/ehp.1306744 PMID:24583270
NASA Astrophysics Data System (ADS)
Gurney, K. R.; Chandrasekaran, V.; Mendoza, D. L.; Geethakumar, S.
2010-12-01
The Vulcan Project has estimated United States fossil fuel CO2 emissions at the hourly time scale and at spatial scales below the county level for the year 2002. Vulcan is built from a wide variety of observational data streams including regulated air pollutant emissions reporting, traffic monitoring, energy statistics, and US census data. In addition to these data sets, Vulcan relies on a series of modeling assumptions and constructs to interpolate in space and time and to transform non-CO2 reporting into an estimate of CO2 combustion emissions. The recent version 2.0 of the Vulcan inventory has produced advances in a number of categories, with particular emphasis on improved temporal structure. Onroad transportation emissions now draw on roughly 5,000 automated traffic count monitors, allowing much improved diurnal and weekly time structure. Though the inventory shows excellent agreement with independent national-level CO2 emissions estimates, uncertainty quantification has been a challenging task given the large number of data sources and numerous modeling assumptions. However, we have now completed an uncertainty estimate across all the Vulcan economic sectors and will present uncertainty estimates as a function of space, time, sector and fuel. We find that, like the underlying distribution of CO2 emissions itself, the uncertainty is strongly lognormal, with high uncertainty associated with a relatively small number of locations. These are typically locations reliant upon coal combustion as the dominant CO2 source. We will also compare and contrast Vulcan fossil fuel CO2 emissions estimates against estimates built from DOE fuel-based surveys at the state level. We conclude that much of the difference between the Vulcan inventory and DOE statistics is due not to biased estimation but to mechanistic differences between supply and demand/combustion in space and time.
A Cascade Approach to Uncertainty Estimation for the Hydrological Simulation of Droughts
NASA Astrophysics Data System (ADS)
Smith, Katie; Tanguy, Maliko; Parry, Simon; Prudhomme, Christel
2016-04-01
Uncertainty poses a significant challenge in environmental research, and the characterisation and quantification of uncertainty has become a research priority over the past decade. Studies of extreme events are particularly affected by issues of uncertainty. This study focusses on the sources of uncertainty in the modelling of streamflow droughts in the United Kingdom. Droughts are a poorly understood natural hazard with no universally accepted definition. Meteorological, hydrological and agricultural droughts have different meanings and vary both spatially and temporally, yet each is inextricably linked to the others. The work presented here is part of two extensive interdisciplinary projects investigating drought reconstruction and drought forecasting capabilities in the UK. Lumped catchment models are applied to simulate streamflow drought, and uncertainties from five different sources are investigated: climate input data, potential evapotranspiration (PET) method, hydrological model, within-model structure, and model parameterisation. Latin hypercube sampling is applied to develop large parameter ensembles for each model structure, which are run using parallel computing on a high-performance computer cluster. Parameterisations are assessed using multi-objective evaluation criteria that include both general and drought performance metrics. The effect of different climate input data and PET methods on model output is then considered using the accepted model parameterisations. The uncertainty from each of the sources forms a cascade, and when presented as such, the relative importance of each aspect of uncertainty can be determined.
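A minimal sketch of the Latin hypercube sampling step follows: each parameter range is divided into N equal-probability strata, sampled once per stratum, and the strata are shuffled independently across parameters. The parameter names and ranges are illustrative stand-ins for the lumped-model parameters.

# Minimal sketch: Latin hypercube sampling for a parameter ensemble.
import numpy as np

rng = np.random.default_rng(8)

def latin_hypercube(n_samples, bounds):
    """bounds: list of (low, high) tuples, one per parameter."""
    d = len(bounds)
    # One jittered point per stratum, per parameter
    u = (np.arange(n_samples)[:, None] + rng.uniform(size=(n_samples, d))) / n_samples
    for j in range(d):                        # independent shuffle per parameter
        rng.shuffle(u[:, j])
    lows = np.array([b[0] for b in bounds])
    highs = np.array([b[1] for b in bounds])
    return lows + u * (highs - lows)

# e.g. storage constant (days), routing delay (days), runoff exponent (-)
ensemble = latin_hypercube(1000, [(1.0, 100.0), (0.5, 10.0), (1.0, 3.0)])
print(ensemble.shape, ensemble.min(axis=0), ensemble.max(axis=0))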
Nuclear data uncertainty propagation by the XSUSA method in the HELIOS2 lattice code
NASA Astrophysics Data System (ADS)
Wemple, Charles; Zwermann, Winfried
2017-09-01
Uncertainty quantification has been extensively applied to nuclear criticality analyses for many years and has recently begun to be applied to depletion calculations. However, regulatory bodies worldwide are trending toward requiring such analyses for reactor fuel cycle calculations, which also requires uncertainty propagation for isotopics and nuclear reaction rates. XSUSA is a proven methodology for cross section uncertainty propagation based on random sampling of the nuclear data according to covariance data in multi-group representation; HELIOS2 is a lattice code widely used for commercial and research reactor fuel cycle calculations. This work describes a technique to automatically propagate the nuclear data uncertainties via the XSUSA approach through fuel lattice calculations in HELIOS2. Application of the XSUSA methodology in HELIOS2 presented some unusual challenges because of the highly-processed multi-group cross section data used in commercial lattice codes. Currently, uncertainties based on the SCALE 6.1 covariance data file are being used, but the implementation can be adapted to other covariance data in multi-group structure. Pin-cell and assembly depletion calculations, based on models described in the UAM-LWR Phase I and II benchmarks, are performed and uncertainties in multiplication factor, reaction rates, isotope concentrations, and delayed-neutron data are calculated. With this extension, it will be possible for HELIOS2 users to propagate nuclear data uncertainties directly from the microscopic cross sections to subsequent core simulations.
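The random-sampling idea behind such an approach can be sketched compactly: correlated relative perturbation factors for multi-group cross sections are drawn from a multivariate normal defined by the covariance data and applied to the nominal library before each lattice run. The group structure and covariance values below are illustrative, not SCALE 6.1 data.

# Minimal sketch: sampling correlated cross-section perturbations from covariance data.
import numpy as np

rng = np.random.default_rng(9)
groups = 8
xs_nominal = np.linspace(2.0, 0.5, groups)           # nominal capture xs (barns)

# Illustrative relative covariance: 3% std with inter-group correlation 0.6
std = np.full(groups, 0.03)
corr = 0.6 + 0.4 * np.eye(groups)
cov = np.outer(std, std) * corr

L = np.linalg.cholesky(cov)                          # factor once, sample many
samples = []
for _ in range(500):
    factors = 1.0 + L @ rng.normal(size=groups)      # correlated perturbations
    samples.append(xs_nominal * factors)             # perturbed library -> run lattice code
samples = np.array(samples)
print(samples.std(axis=0) / xs_nominal)              # recovers ~3% relative std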
NASA Astrophysics Data System (ADS)
Wang, Z.
2015-12-01
For decades, distributed and lumped hydrological models have furthered our understanding of hydrological systems. The development of hydrological simulation at large scale and high precision has elaborated the spatial descriptions and hydrological behaviors. Meanwhile, this trend is accompanied by increasing model complexity and numbers of parameters, which brings new challenges for uncertainty quantification. Generalized Likelihood Uncertainty Estimation (GLUE), which combines Monte Carlo sampling with Bayesian estimation, has been widely used in uncertainty analysis for hydrological models. However, the stochastic sampling of prior parameters adopted by GLUE is inefficient, especially in high-dimensional parameter spaces. Heuristic optimization algorithms, which use iterative evolution, show better convergence speed and optimality-searching performance. In light of these features, this study adopted a genetic algorithm, differential evolution, and the shuffled complex evolution algorithm to search the parameter space and obtain parameter sets with large likelihoods. Based on this multi-algorithm sampling, hydrological model uncertainty analysis is conducted within the typical GLUE framework. To demonstrate the superiority of the new method, two hydrological models of different complexity are examined. The results show that the adaptive method tends to be efficient in sampling and effective in uncertainty analysis, providing an alternative path for uncertainty quantification.
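For concreteness, here is a minimal sketch of the GLUE procedure itself: Monte Carlo sampling of a prior parameter range, an informal likelihood (here Nash-Sutcliffe efficiency), a behavioral threshold, and likelihood-weighted prediction bounds; the one-parameter toy model and the threshold are illustrative choices.

# Minimal sketch: GLUE with a one-parameter recession model.
import numpy as np

rng = np.random.default_rng(10)
t = np.linspace(0, 10, 100)
q_obs = np.exp(-0.3 * t) + 0.02 * rng.normal(size=t.size)   # observed recession

def model(k):
    return np.exp(-k * t)                          # toy hydrograph

k_samples = rng.uniform(0.05, 1.0, 5000)           # prior (Monte Carlo) sampling
nse = np.array([1 - np.sum((q_obs - model(k))**2) / np.sum((q_obs - q_obs.mean())**2)
                for k in k_samples])
behavioral = nse > 0.8                             # GLUE acceptance threshold
w = nse[behavioral] - 0.8                          # informal likelihood weights
w /= w.sum()

preds = np.array([model(k) for k in k_samples[behavioral]])
# Weighted 5% / 95% bounds at one time step (repeat per step for full bands)
j = 50
v = preds[:, j]
idx = np.argsort(v)
cdf = np.cumsum(w[idx])
print(v[idx][np.searchsorted(cdf, 0.05)], v[idx][np.searchsorted(cdf, 0.95)])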
NASA Technical Reports Server (NTRS)
Hubert, Daan; Lambert, Jean-Christopher; Verhoelst, Tijl; Granville, Jose; Keppens, Arno; Baray, Jean-Luc; Cortesi, Ugo; Degenstein, D. A.; Froidevaux, Lucien; Godin-Beekmann, Sophie;
2015-01-01
Most recent assessments of long-term changes in the vertical distribution of ozone (by e.g. WMO and SI2N) rely on data sets that integrate observations by multiple instruments. Several merged satellite ozone profile records have been developed over the past few years; each considers a particular set of instruments and adopts a particular merging strategy. Their intercomparison by Tummon et al. revealed that the current merging schemes are not sufficiently refined to correct for all major differences between the limb/occultation records. This shortcoming introduces uncertainties that need to be known to obtain a sound interpretation of the different satellite-based trend studies. In practice, however, producing realistic uncertainty estimates is an intricate task that depends on a sufficiently detailed understanding of the characteristics of each contributing data record and on the subsequent interplay and propagation of these through the merging scheme. Our presentation discusses these challenges in the context of limb/occultation ozone profile records, but they are equally relevant for other instruments and atmospheric measurements. We start by showing how the NDACC and GAW-affiliated ground-based networks of ozonesonde and lidar instruments allowed us to characterize fourteen limb/occultation ozone profile records, together providing a global view over the last three decades. Our prime focus is on techniques to estimate long-term drift, since our results suggest this is the main driver of the major trend differences between the merged data sets. The single-instrument drift estimates are then used for a tentative estimate of the systematic uncertainty in the profile trends from merged data records. We conclude by reflecting on possible further steps needed to improve the merging algorithms and to obtain a better characterization of the uncertainties involved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Z.; Liu, C.; Botterud, A.
Renewable energy resources have been rapidly integrated into power systems in many parts of the world, contributing to a cleaner and more sustainable supply of electricity. Wind and solar resources also introduce new challenges for system operations and planning, in terms of economics and reliability, because of their variability and uncertainty. Operational strategies based on stochastic optimization have been developed recently to address these challenges. In general terms, these stochastic strategies either embed uncertainties into the scheduling formulations (e.g., the unit commitment [UC] problem) in probabilistic forms or develop more appropriate operating reserve strategies to take advantage of advanced forecasting techniques. Other approaches to address uncertainty have also been proposed, in which operational feasibility is ensured within an uncertainty set of forecasting intervals. In this report, a comprehensive review is conducted to present the state of the art through Spring 2015 in the area of stochastic methods applied to power system operations with high penetration of renewable energy. Chapters 1 and 2 give a brief introduction and overview of power system and electricity market operations, as well as the impact of renewable energy and how this impact is typically considered in modeling tools. Chapter 3 reviews relevant literature on operating reserves, and specifically probabilistic methods to estimate system reserve requirements. Chapter 4 looks at stochastic programming formulations of the UC and economic dispatch (ED) problems, highlighting benefits reported in the literature as well as recent industry developments. Chapter 5 briefly introduces alternative formulations of UC under uncertainty, such as robust, chance-constrained, and interval programming. Finally, in Chapter 6, we conclude with the main observations from our review and important directions for future work.
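The core trade-off captured by scenario-based stochastic unit commitment can be shown in miniature: commit a cheap-but-slow unit before the wind is known, or rely on expensive recourse generation afterwards, choosing the decision with the lowest probability-weighted cost. All capacities, prices and scenarios below are illustrative.

# Minimal sketch: two-stage stochastic commitment over wind scenarios.
import numpy as np

demand = 100.0
wind = np.array([20.0, 50.0, 80.0])          # wind scenarios (MW)
prob = np.array([0.3, 0.5, 0.2])

slow_cap, slow_price, commit_cost = 60.0, 20.0, 500.0   # cheap but committed ahead
fast_price = 80.0                                       # expensive recourse unit

def expected_cost(commit_slow):
    cost = commit_cost if commit_slow else 0.0
    for w, p in zip(wind, prob):
        residual = max(demand - w, 0.0)                 # net load in this scenario
        slow = min(residual, slow_cap) if commit_slow else 0.0
        fast = residual - slow                          # recourse covers the rest
        cost += p * (slow * slow_price + fast * fast_price)
    return cost

for decision in (False, True):
    print("commit slow unit:", decision, "expected cost:", expected_cost(decision))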
The fact of uncertainty, the uncertainty of facts and the cultural resonance of doubt.
Oreskes, Naomi
2015-11-28
Sixty years after industry executives first decided to fight the facts of tobacco, the exploitation of doubt and uncertainty as a defensive tactic has spread to a diverse set of industries and issues with an interest in challenging scientific evidence. However, one can find examples of doubt-mongering before tobacco. One involves the early history of electricity generation in the USA. In the 1920s, the American National Electric Light Association ran a major propaganda campaign against public-sector electricity generation, focused on the insistence that privately generated electricity was cheaper and that public power generation was socialistic and therefore un-American. This campaign included advertisements, editorials (generally ghost-written), the rewriting of textbooks and the development of high school and college curricula designed to cast doubt on the cost-effectiveness of public electricity generation and extol the virtues of laissez-faire capitalism. It worked in large part by finding, cultivating and paying experts to endorse the industry's claims in the mass media and the public debate, and to legitimize the alterations to textbooks and curricula. The similarities between the electric industry strategy and the defence of tobacco, lead paint and fossil fuels suggest that these strategies work for reasons that are not specific to the particular technical claims under consideration. This paper argues that a reason for the cultural persistence of doubt is what we may label the 'fact of uncertainty'. Uncertainty is intrinsic to science, and this creates vulnerabilities that interested parties may, and commonly do, exploit, both by attempting to challenge the specific conclusions of technical experts and by implying that those conclusions threaten other social values.
Niemansburg, Sophie L; Habets, Michelle G J L; Dhert, Wouter J A; van Delden, Johannes J M; Bredenoord, Annelien L
2015-11-01
The innovative field of Regenerative Medicine (RM) is expected to extend the possibilities of prevention or early treatment in healthcare. Increasingly, clinical trials will be developed for people at risk of disease to investigate these RM interventions. These individuals at risk are characterised by their susceptibility to developing clinically manifest disease in future due to the existence of degenerative abnormalities. So far, there has been little debate about the ethical appropriateness of including such individuals at risk in clinical trials. We discuss three main challenges of selecting this participant model for testing RM interventions: the challenge of achieving a proportional risk-benefit balance; complexities in the trial design in terms of follow-up and sample size; and the difficulty of obtaining informed consent due to the many uncertainties. We conclude that selecting the model is not ethically justifiable for first-in-man trials with RM interventions due to the high risks and uncertainties. However, the model can be ethically appropriate for testing the efficacy of RM interventions under the following conditions: interventions should be low risk; the degenerative abnormalities (and other risk factors) should be strongly related with disease within a short time frame; robust preclinical evidence of efficacy needs to be present; and the informed consent procedure should contain extra safeguards with regard to communication on uncertainties. Published by the BMJ Publishing Group Limited.
Pathology and failure in the design and implementation of adaptive management
Allen, Craig R.; Gunderson, Lance H.
2011-01-01
The conceptual underpinnings of adaptive management are simple; there will always be inherent uncertainty and unpredictability in the dynamics and behavior of complex ecological systems as a result of non-linear interactions among components and emergence, yet management decisions must still be made. The strength of adaptive management is in the recognition and confrontation of such uncertainty. Rather than ignore uncertainty, or use it to preclude management actions, adaptive management can foster resilience and flexibility to cope with an uncertain future, and develop safe-to-fail management approaches that acknowledge inevitable changes and surprises. Since its initial introduction, adaptive management has been hailed as a solution to endless trial-and-error approaches to complex natural resource management challenges. However, its implementation has failed more often than not. It does not produce easy answers, and it is appropriate in only a subset of natural resource management problems. Clearly adaptive management has great potential when applied appropriately. Just as clearly, adaptive management has seemingly failed to live up to its high expectations. Why? We outline nine pathologies and challenges that can lead to failure in adaptive management programs. We focus on general sources of failure in adaptive management, so that others can avoid these pitfalls in the future. Adaptive management can be a powerful and beneficial tool when applied correctly to appropriate management problems; the challenge is to keep the concept of adaptive management from being hijacked for inappropriate use.
Uncertainty information in climate data records from Earth observation
NASA Astrophysics Data System (ADS)
Merchant, C. J.
2017-12-01
How to derive and present uncertainty in climate data records (CDRs) has been debated within the European Space Agency Climate Change Initiative, in search of common principles applicable across a range of essential climate variables. Various points of consensus have been reached, including the importance of improving provision of uncertainty information and the benefit of adopting international norms of metrology for language around the distinct concepts of uncertainty and error. Providing an estimate of standard uncertainty per datum (or the means to readily calculate it) emerged as baseline good practice, and should be highly relevant to users of CDRs when the uncertainty in data is variable (the usual case). Given this baseline, the role of quality flags is clarified as being complementary to and not repetitive of uncertainty information. Data with high uncertainty are not poor quality if a valid estimate of the uncertainty is available. For CDRs and their applications, the error correlation properties across spatio-temporal scales present important challenges that are not fully solved. Error effects that are negligible in the uncertainty of a single pixel may dominate uncertainty in the large-scale and long-term. A further principle is that uncertainty estimates should themselves be validated. The concepts of estimating and propagating uncertainty are generally acknowledged in geophysical sciences, but less widely practised in Earth observation and development of CDRs. Uncertainty in a CDR depends in part (and usually significantly) on the error covariance of the radiances and auxiliary data used in the retrieval. Typically, error covariance information is not available in the fundamental CDR (FCDR) (i.e., with the level-1 radiances), since provision of adequate level-1 uncertainty information is not yet standard practice. Those deriving CDRs thus cannot propagate the radiance uncertainty to their geophysical products. The FIDUCEO project (www.fiduceo.eu) is demonstrating metrologically sound methodologies addressing this problem for four key historical CDRs. FIDUCEO methods of uncertainty analysis (which also tend to lead to improved FCDRs and CDRs) could support coherent treatment of uncertainty across FCDRs to CDRs and higher level products for a wide range of essential climate variables.
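The point that correlated error effects do not average out at large scales can be made concrete with a short sketch; the per-pixel uncertainty and correlation values below are illustrative assumptions.

```python
import numpy as np

# Sketch: uncertainty of a large-area mean when per-pixel errors are
# correlated. With n pixels of standard uncertainty sigma and pairwise
# error correlation rho, the variance of the mean is
#   var = (sigma**2 / n) * (1 + (n - 1) * rho),
# so correlated effects set a floor that no amount of averaging
# removes. Numbers below are illustrative.
sigma, rho = 0.3, 0.05  # per-pixel uncertainty (K) and error correlation
for n in (1, 100, 10_000, 1_000_000):
    u_mean = np.sqrt((sigma**2 / n) * (1 + (n - 1) * rho))
    print(f"n={n:>9,d}  u(mean) = {u_mean:.4f} K")
# The uncorrelated limit would shrink as sigma/sqrt(n); here the
# uncertainty plateaus near sigma*sqrt(rho), about 0.067 K.
```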
Confronting dynamics and uncertainty in optimal decision making for conservation
Williams, Byron K.; Johnson, Fred A.
2013-01-01
The effectiveness of conservation efforts ultimately depends on the recognition that decision making, and the systems that it is designed to affect, are inherently dynamic and characterized by multiple sources of uncertainty. To cope with these challenges, conservation planners are increasingly turning to the tools of decision analysis, especially dynamic optimization methods. Here we provide a general framework for optimal, dynamic conservation and then explore its capacity for coping with various sources and degrees of uncertainty. In broadest terms, the dynamic optimization problem in conservation is choosing among a set of decision options at periodic intervals so as to maximize some conservation objective over the planning horizon. Planners must account for immediate objective returns, as well as the effect of current decisions on future resource conditions and, thus, on future decisions. Undermining the effectiveness of such a planning process are uncertainties concerning extant resource conditions (partial observability), the immediate consequences of decision choices (partial controllability), the outcomes of uncontrolled, environmental drivers (environmental variation), and the processes structuring resource dynamics (structural uncertainty). Where outcomes from these sources of uncertainty can be described in terms of probability distributions, a focus on maximizing the expected objective return, while taking state-specific actions, is an effective mechanism for coping with uncertainty. When such probability distributions are unavailable or deemed unreliable, a focus on maximizing robustness is likely to be the preferred approach. Here the idea is to choose an action (or state-dependent policy) that achieves at least some minimum level of performance regardless of the (uncertain) outcomes. We provide some examples of how the dynamic optimization problem can be framed for problems involving management of habitat for an imperiled species, conservation of a critically endangered population through captive breeding, control of invasive species, construction of biodiversity reserves, design of landscapes to increase habitat connectivity, and resource exploitation. Although these decision making problems and their solutions present significant challenges, we suggest that a systematic and effective approach to dynamic decision making in conservation need not be an onerous undertaking. The requirements are shared with any systematic approach to decision making--a careful consideration of values, actions, and outcomes.
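A minimal sketch of the dynamic optimization framing described above, as a small Markov decision process solved by value iteration; the states, actions, transition probabilities, and rewards are all invented for illustration.

```python
import numpy as np

# Tiny MDP: habitat states (poor, fair, good), two actions
# (do nothing, restore), and value iteration to find a state-dependent
# policy maximizing expected discounted conservation return.
P = np.array([  # P[a, s, s'] transition probabilities (invented)
    [[0.8, 0.2, 0.0], [0.3, 0.6, 0.1], [0.1, 0.3, 0.6]],  # do nothing
    [[0.4, 0.5, 0.1], [0.1, 0.5, 0.4], [0.0, 0.2, 0.8]],  # restore
])
R = np.array([[0.0, 1.0, 2.0],    # reward by (action, state); restoring
              [-0.5, 0.5, 1.5]])  # costs 0.5 units in every state
gamma, V = 0.95, np.zeros(3)
for _ in range(500):
    Q = R + gamma * (P @ V)   # Q[a, s]: action value in each state
    V = Q.max(axis=0)
policy = Q.argmax(axis=0)
print("state-dependent policy (0=do nothing, 1=restore):", policy)
```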
Nicod, Elena; Annemans, Lieven; Bucsics, Anna; Lee, Anne; Upadhyaya, Sheela; Facey, Karen
2017-03-28
Challenges commonly encountered in HTA of orphan medicinal products (OMPs) were identified in Advance-HTA. Since then, new initiatives have been developed to specifically address issues related to HTA of OMPs. This study aimed to understand why these new HTA initiatives in England, Scotland and at European-level were established and whether they resolve the challenges of OMPs. The work of Advance-HTA was updated with a literature review and a conceptual framework of clinical, regulatory and economic challenges for OMPs was developed. The new HTA programmes were critiqued against the conceptual framework and outstanding challenges identified. The new programmes in England and Scotland recognise the challenges identified in demonstrating the value of ultra-OMPs (and OMPs) and that they require a different process to standard HTA approaches. Wider considerations of disease and treatment experiences from a multi-stakeholder standpoint are needed, combined with other measures to deal with uncertainty (e.g. managed entry agreements). While approaches to assessing this new view of value of OMPs, extending beyond cost/QALY frameworks, differ, their criteria are similar. These are complemented by a European initiative that fosters multi-stakeholder dialogue and consensus about value determinants throughout the life-cycle of an OMP. New HTA programmes specific to OMPs have been developed but questions remain about whether they sufficiently capture value and manage uncertainty in clinical practice. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.
2018-03-01
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
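A hedged sketch of the first step, variance-based global sensitivity analysis: the toy function below stands in for an expensive flow simulation, and the pick-freeze Monte Carlo estimator shown is a standard way to compute first-order Sobol indices, not the specific algorithm of the paper.

```python
import numpy as np

# First-order Sobol indices via the standard Saltelli/Jansen
# pick-freeze Monte Carlo scheme, on a toy stand-in model in which
# input 0 dominates and input 2 is nearly inert.
rng = np.random.default_rng(1)

def f(x):
    return np.sin(x[:, 0]) + 0.3 * x[:, 1] + 0.01 * x[:, 2]

n, d = 50_000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))
B = rng.uniform(-np.pi, np.pi, (n, d))
fA, fB = f(A), f(B)
var = np.var(np.concatenate([fA, fB]))
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]   # swap ("freeze") only input i
    S_i = np.mean(fB * (f(ABi) - fA)) / var
    print(f"input {i}: first-order Sobol index ~ {S_i:.3f}")
```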
NASA Astrophysics Data System (ADS)
Ranger, N.; Millner, A.; Niehoerster, F.
2010-12-01
Traditionally, climate change risk assessments have followed a roughly four-stage linear 'chain', moving from socioeconomic projections, to climate projections, to primary impacts, and finally to economic and social impact assessment. Adaptation decisions are then made on the basis of these outputs. The escalation of uncertainty through this chain is well known, resulting in an 'explosion' of uncertainties in the final risk and adaptation assessment. The space of plausible future risk scenarios is growing ever wider with the application of new techniques which aim to explore uncertainty ever more deeply, such as those used in the recent 'probabilistic' UK Climate Projections 2009 and the stochastic integrated assessment models, for example PAGE2002. This explosion of uncertainty can make decision-making problematic, particularly given that the uncertainty information communicated cannot be treated as strictly probabilistic and therefore is not an easy fit with standard approaches to decision-making under uncertainty. Additional problems can arise from the fact that the uncertainty estimated for different components of the 'chain' is rarely directly comparable or combinable. Here, we explore the challenges and limitations of using current projections for adaptation decision-making. We report the findings of a recent report completed for the UK Adaptation Sub-Committee on approaches to deal with these challenges and make robust adaptation decisions today. To illustrate these approaches, we take a number of case studies, including adaptation to hurricane risk on the US Gulf Coast. This is a particularly interesting case as it involves urgent adaptation of long-lived infrastructure but requires interpreting highly uncertain climate change science and modelling, i.e. projections of Atlantic basin hurricane activity. An approach we outline is reversing the linear chain of assessments to put the economics and decision-making first. Such an approach forces one to focus on the information of greatest value for the specific decision. We suggest that such an approach will help to accommodate the uncertainties in the chain and facilitate robust decision-making. Initial findings of these case studies will be presented with the aim of raising open questions and promoting discussion of the methodology. Finally, we reflect on the implications for the design of climate model experiments.
To be or not to be: How do we speak about uncertainty in public?
NASA Astrophysics Data System (ADS)
Todesco, Micol; Lolli, Barbara; Sheldrake, Tom; Odbert, Henry
2016-04-01
One of the challenges related to hazard communication concerns the public perception and understanding of scientific uncertainties, and of their implications in terms of hazard assessment and mitigation. Often science is perceived as an effective dispenser of resolving answers to the main issues posed by the complexities of life and nature. In this perspective, uncertainty is seen as a pernicious lack of knowledge that hinders our ability to face complex problems. From a scientific perspective, however, the definition of uncertainty is the only valuable tool we have to handle errors affecting our data and propagating through the increasingly complex models we develop to describe reality. Through uncertainty, scientists acknowledge the great variability that characterises natural systems and account for it in their assessment of possible scenarios. From this point of view, uncertainty is not ignorance; rather, it provides a great deal of information that is needed to inform decision making. To find effective ways to bridge the gap between these different meanings of uncertainty, we asked high-school students for assistance. With their help, we gathered definitions of the term 'uncertainty' by interviewing different categories of people, including schoolmates and professors, neighbours, families and friends. These definitions will be compared with those provided by scientists, to find differences and similarities. To understand the role of uncertainty in judgment, a hands-on experiment is performed where students have to estimate the exact time of explosion of party poppers subjected to a variable degree of pull. At the end of the project, the students will express their own understanding of uncertainty in a video, which will be made available for sharing. Materials collected during all the activities will contribute to our understanding of how uncertainty is portrayed and can be better expressed to improve our hazard communication.
Mathematics applied to the climate system: outstanding challenges and recent progress
Williams, Paul D.; Cullen, Michael J. P.; Davey, Michael K.; Huthnance, John M.
2013-01-01
The societal need for reliable climate predictions and a proper assessment of their uncertainties is pressing. Uncertainties arise not only from initial conditions and forcing scenarios, but also from model formulation. Here, we identify and document three broad classes of problems, each representing what we regard to be an outstanding challenge in the area of mathematics applied to the climate system. First, there is the problem of the development and evaluation of simple physically based models of the global climate. Second, there is the problem of the development and evaluation of the components of complex models such as general circulation models. Third, there is the problem of the development and evaluation of appropriate statistical frameworks. We discuss these problems in turn, emphasizing the recent progress made by the papers presented in this Theme Issue. Many pressing challenges in climate science require closer collaboration between climate scientists, mathematicians and statisticians. We hope the papers contained in this Theme Issue will act as inspiration for such collaborations and for setting future research directions. PMID:23588054
Li, Zhaoying; Zhou, Wenjie; Liu, Hao
2016-09-01
This paper addresses the nonlinear robust tracking controller design problem for hypersonic vehicles. This problem is challenging due to the strong coupling between the aerodynamics and the propulsion system, and the uncertainties involved in the vehicle dynamics, including parametric uncertainties, unmodeled uncertainties, and external disturbances. By utilizing the feedback linearization technique, a linear tracking error system is established with prescribed references. For the linear model, a robust controller is proposed based on signal compensation theory to guarantee that the tracking error dynamics is robustly stable. Numerical simulation results are given to show the advantages of the proposed nonlinear robust control method, compared to the robust loop-shaping control approach. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
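A minimal sketch of the feedback-linearization step on a scalar stand-in plant (the actual vehicle model is multivariable); the drift, input gain, and reference below are invented for illustration.

```python
import numpy as np

# Scalar plant xdot = f(x) + g(x)*u. Choosing
#   u = (-f(x) + rdot - k*e) / g(x)
# cancels the nonlinearity and leaves linear tracking-error dynamics
# edot = -k*e, to which a robust compensator could then be added.
f = lambda x: -x + x**3          # invented nonlinear drift
g = lambda x: 2.0 + np.cos(x)    # invented input gain (never zero)
r, rdot = lambda t: np.sin(t), lambda t: np.cos(t)  # reference signal

k, dt, x, t = 5.0, 1e-3, 0.5, 0.0
for _ in range(10_000):          # simple Euler simulation for 10 s
    e = x - r(t)
    u = (-f(x) + rdot(t) - k * e) / g(x)
    x += dt * (f(x) + g(x) * u)
    t += dt
print(f"tracking error after {t:.0f} s: {x - r(t):.2e}")
```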
NWS Operational Requirements for Ensemble-Based Hydrologic Forecasts
NASA Astrophysics Data System (ADS)
Hartman, R. K.
2008-12-01
Ensemble-based hydrologic forecasts have been developed and issued by National Weather Service (NWS) staff at River Forecast Centers (RFCs) for many years. Used principally for long-range water supply forecasts, these forecasts have traditionally considered only the uncertainty associated with weather and climate. As technology and the societal expectations of resource managers increase, the use of and desire for risk-based decision support tools has also increased. These tools require forecast information that includes reliable uncertainty estimates across all time and space domains. The development of reliable uncertainty estimates associated with hydrologic forecasts is being actively pursued within the United States and internationally. This presentation will describe the challenges, components, and requirements for operational hydrologic ensemble-based forecasts from the perspective of a NOAA/NWS River Forecast Center.
NASA Astrophysics Data System (ADS)
de Barros, Felipe P. J.; Rubin, Yoram; Maxwell, Reed M.
2009-06-01
Defining rational and effective hydrogeological data acquisition strategies is of crucial importance as such efforts are always resource limited. Usually, strategies are developed with the goal of reducing uncertainty, but less often are they developed in the context of their impacts on risk. This paper presents an approach for determining site characterization needs on the basis of human health risk. The main challenge is in striking a balance between reduction in uncertainty in hydrogeological, behavioral, and physiological parameters. Striking this balance can provide clear guidance on setting priorities for data acquisition and for better estimating adverse health effects in humans. This paper addresses this challenge through theoretical developments and numerical simulation. A wide range of factors that affect site characterization needs are investigated, including the dimensions of the contaminant plume and additional length scales that characterize the transport problem, as well as the model of human health risk. The concept of comparative information yield curves is used for investigating the relative impact of hydrogeological and physiological parameters on risk. Results show that characterization needs are dependent on the ratios between flow and transport scales within a risk-driven approach. Additionally, the results indicate that human health risk becomes less sensitive to hydrogeological measurements for large plumes. This indicates that under near-ergodic conditions, uncertainty reduction in human health risk may benefit from better understanding of the physiological component as opposed to a more detailed hydrogeological characterization.
NASA Astrophysics Data System (ADS)
Hill Clarvis, M.; Allan, A.; Hannah, D. M.
2013-12-01
Climate change has significant ramifications for water law and governance, yet there is strong evidence that legal regulations have often failed to protect environments or promote sustainable development. Scholars have increasingly suggested that the preservation and restoration paradigms of legislation and regulation are no longer adequate for climate change related challenges in complex and cross-scale social-ecological systems, namely because of past assumptions of stationarity and uniformitarianism and the perception of ecosystem change as predictable and reversible. This paper reviews the literature on law and resilience and then presents and discusses a set of practical examples of legal mechanisms from the water resources management sector, identified according to a set of guiding principles from the literature on adaptive capacity, adaptive governance, and adaptive and integrated water resources management. It then assesses the aptness of these different measures against scientific evidence of increased uncertainty and changing ecological baselines. The review demonstrates a number of best practice examples that integrate adaptive elements of flexibility, iterativity, connectivity and subsidiarity into a variety of legislative mechanisms, suggesting that the tension between resilience and the law is not as significant as many scholars have suggested. However, while many of the mechanisms may indeed be suitable for addressing challenges relating to current levels of change and uncertainty, analysis across a broader range of uncertainty highlights challenges relating to the more irreversible changes associated with greater levels of warming. Furthermore, the paper identifies a set of prerequisites that are fundamental to the successful implementation of such mechanisms, namely monitoring and data sharing, and financial and technical capacity, particularly in the nations most at risk with the least data infrastructure. The article aims to contribute to both theory and practice, enabling policy makers to translate resilience-based terminology and adaptive governance principles into clear instructions for incorporating uncertainty into legislation and policy design.
Focused Belief Measures for Uncertainty Quantification in High Performance Semantic Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joslyn, Cliff A.; Weaver, Jesse R.
In web-scale semantic data analytics there is a great need for methods which aggregate uncertainty claims, on the one hand respecting the information provided as accurately as possible, while on the other still being tractable. Traditional statistical methods are more robust, but only represent distributional, additive uncertainty. Generalized information theory methods, including fuzzy systems and Dempster-Shafer (DS) evidence theory, represent multiple forms of uncertainty, but are computationally and methodologically difficult. We require methods that balance a complete representation of the complexity of interacting uncertainty claims against the demands of computational tractability and human cognition. Here we build on Jøsang's subjective logic to posit methods in focused belief measures (FBMs), where a full DS structure is focused to a single event. The resulting ternary logical structure is posited to be able to capture the minimal amount of generalized complexity needed at a maximum of computational efficiency. We demonstrate the efficacy of this approach in a web ingest experiment over the 2012 Billion Triple dataset from the Semantic Web Challenge.
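A small sketch of what focusing a Dempster-Shafer structure to a single event can look like, following Jøsang's standard mapping from belief masses to a (belief, disbelief, uncertainty) opinion; the frame and mass assignment are invented, and this is an interpretation of the FBM idea, not the paper's implementation.

```python
# Focusing a DS mass assignment on an event A: belief b = Bel(A),
# disbelief d = Bel(not A), and the mass on sets overlapping both
# becomes uncertainty u, so that b + d + u = 1.
frame = frozenset({"x", "y", "z"})
mass = {  # DS mass function over subsets of the frame (sums to 1)
    frozenset({"x"}): 0.4,
    frozenset({"y"}): 0.2,
    frozenset({"x", "y"}): 0.1,
    frame: 0.3,
}

def focus(mass, event):
    complement = frame - event
    b = sum(m for s, m in mass.items() if s <= event)       # Bel(A)
    d = sum(m for s, m in mass.items() if s <= complement)  # Bel(not A)
    return b, d, 1.0 - b - d  # (belief, disbelief, uncertainty)

print(focus(mass, frozenset({"x"})))  # -> (0.4, 0.2, 0.4)
```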
A robust approach to chance constrained optimal power flow with renewable generation
Lubin, Miles; Dvorkin, Yury; Backhaus, Scott N.
2016-09-01
Optimal Power Flow (OPF) dispatches controllable generation at minimum cost subject to operational constraints on generation and transmission assets. The uncertainty and variability of intermittent renewable generation is challenging current deterministic OPF approaches. Recent formulations of OPF use chance constraints to limit the risk from renewable generation uncertainty; however, these new approaches typically assume the probability distributions which characterize the uncertainty and variability are known exactly. We formulate a robust chance constrained (RCC) OPF that accounts for uncertainty in the parameters of these probability distributions by allowing them to be within an uncertainty set. The RCC OPF is solved using a cutting-plane algorithm that scales to large power systems. We demonstrate the RCC OPF on a modified model of the Bonneville Power Administration network, which includes 2209 buses and 176 controllable generators. Finally, deterministic, chance constrained (CC), and RCC OPF formulations are compared using several metrics, including cost of generation, area control error, ramping of controllable generators, and occurrence of transmission line overloads, as well as the respective computational performance.
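A sketch of the standard Gaussian reformulation that underlies such chance constraints (not the paper's cutting-plane algorithm); the line limit, violation level, and flow statistics are illustrative.

```python
from scipy.stats import norm

# Chance constraint on a transmission line: require
#   P(flow <= limit) >= 1 - eps.
# If the uncertain flow is N(mu, sigma^2), this becomes the
# deterministic constraint mu + z_{1-eps} * sigma <= limit. A robust
# chance constraint then enforces this for every (mu, sigma) in an
# uncertainty set.
limit, eps = 400.0, 0.05           # MW line limit, allowed violation
z = norm.ppf(1 - eps)              # ~1.645
for mu, sigma in [(350.0, 20.0), (350.0, 40.0)]:
    margin = limit - mu - z * sigma
    print(f"mu={mu}, sigma={sigma}: margin={margin:6.1f} MW,"
          f" feasible={margin >= 0}")
```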
Automated Generation of Tabular Equations of State with Uncertainty Information
NASA Astrophysics Data System (ADS)
Carpenter, John H.; Robinson, Allen C.; Debusschere, Bert J.; Mattsson, Ann E.
2015-06-01
As computational science pushes toward higher fidelity prediction, understanding the uncertainty associated with closure models, such as the equation of state (EOS), has become a key focus. Traditional EOS development often involves a fair amount of art, where expert modelers may appear as magicians, providing what is felt to be the closest possible representation of the truth. Automation of the development process gives a means by which one may demystify the art of EOS, while simultaneously obtaining uncertainty information in a manner that is both quantifiable and reproducible. We describe our progress on the implementation of such a system to provide tabular EOS tables with uncertainty information to hydrocodes. Key challenges include encoding the artistic expert opinion into an algorithmic form and preserving the analytic models and uncertainty information in a manner that is both accurate and computationally efficient. Results are demonstrated on a multi-phase aluminum model. *Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
NASA Astrophysics Data System (ADS)
Khademian, Amir; Abdollahipour, Hamed; Bagherpour, Raheb; Faramarzi, Lohrasb
2017-10-01
In addition to numerous planning and executive challenges, underground excavation in urban areas is always accompanied by certain destructive effects, especially at the ground surface. Ground settlement is the most important of these effects, and different empirical, analytical and numerical methods exist for its estimation. Since geotechnical models are associated with considerable model uncertainty, this study characterized the model uncertainty of settlement estimation models through a systematic comparison between model predictions and past performance data derived from instrumentation. To do so, the amount of surface settlement induced by excavation of the Qom subway tunnel was estimated via empirical (Peck), analytical (Loganathan and Poulos) and numerical (FDM) methods; the resulting maximum settlement values from these models were 1.86, 2.02 and 1.52 cm, respectively. The comparison of these predicted amounts with the actual data from instrumentation was employed to specify the uncertainty of each model. The numerical model outcomes, with a relative error of 3.8%, best matched reality, and the analytical method, with a relative error of 27.8%, yielded the highest level of model uncertainty.
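For the empirical route, surface settlement above a tunnel is commonly described by a Peck-type Gaussian trough; the sketch below uses the 1.86 cm maximum quoted above, while the trough-width parameter is an assumed value for illustration.

```python
import numpy as np

# Peck-type Gaussian settlement trough:
#   S(x) = S_max * exp(-x**2 / (2 * i**2)),
# where x is the horizontal offset from the tunnel axis and i the
# trough-width parameter (assumed here; site-specific in practice).
S_max_cm = 1.86   # maximum settlement from the empirical model above
i_m = 12.0        # assumed trough-width parameter (m)
for x in (0.0, 6.0, 12.0, 24.0):
    s = S_max_cm * np.exp(-x**2 / (2 * i_m**2))
    print(f"offset {x:5.1f} m: settlement {s:.2f} cm")
```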
Bringing nanomedicines to market: regulatory challenges, opportunities, and uncertainties.
Nijhara, Ruchika; Balakrishnan, Krishna
2006-06-01
Scientists and entrepreneurs who contemplate developing nanomedicine products face several unique challenges in addition to many of the traditional hurdles of product development. In this review we analyze the major physicochemical, biologic and functional characteristics of several nanomedicine products on the market and explore what made them unique and successful. We also focus on the regulatory challenges faced by nanomedicine product developers. Based on these analyses, we propose the factors that are most likely to contribute to the success of nanomedicine products.
The Challenges of Credible Thermal Protection System Reliability Quantification
NASA Technical Reports Server (NTRS)
Green, Lawrence L.
2013-01-01
The paper discusses several of the challenges associated with developing a credible reliability estimate for a human-rated crew capsule thermal protection system. The process of developing such a credible estimate is subject to the quantification, modeling and propagation of numerous uncertainties within a probabilistic analysis. The development of specific investment recommendations, to improve the reliability prediction, among various potential testing and programmatic options is then accomplished through Bayesian analysis.
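A minimal sketch of the kind of Bayesian update involved, using a conjugate Beta prior on failure probability; the prior and test outcomes are invented, not program data.

```python
from scipy.stats import beta

# Beta-Binomial update: a Beta(a, b) prior on the probability of TPS
# failure, updated with hypothetical test outcomes. Such posteriors can
# then feed investment comparisons among testing options.
a, b = 1.0, 99.0          # prior: failure probability around 1%
n_tests, n_failures = 20, 0
post = beta(a + n_failures, b + n_tests - n_failures)
print(f"posterior mean failure prob: {post.mean():.4f}")
print(f"95% credible upper bound:    {post.ppf(0.95):.4f}")
```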
Model-based nonlinear control of hydraulic servo systems: Challenges, developments and perspectives
NASA Astrophysics Data System (ADS)
Yao, Jianyong
2018-06-01
Hydraulic servo systems play a significant role in industry, usually acting as the core of control and power transmission. Although linear theory-based control methods are well established, advanced controller design for hydraulic servo systems to achieve high performance remains an ongoing pursuit along with the development of modern industry. Essential nonlinearity is a unique feature and makes model-based nonlinear control more attractive, as it benefits from prior knowledge of the servo-valve-controlled hydraulic system. In this paper, challenges in model-based nonlinear control, the latest developments, and brief perspectives on hydraulic servo systems are discussed. Modelling uncertainty in hydraulic systems is a major challenge, comprising parametric uncertainty and time-varying disturbance; specific requirements also raise ad hoc difficulties, such as nonlinear friction during low-velocity tracking, severe disturbance, periodic disturbance, etc. To handle these challenges, nonlinear solutions including parameter adaptation, nonlinear robust control, state and disturbance observation, and backstepping design are proposed and integrated; theoretical analysis and numerous applications reveal their powerful capability to solve pertinent problems. At the end, some perspectives and associated research topics (measurement noise, constraints, inner valve dynamics, input nonlinearity, etc.) in nonlinear hydraulic servo control are briefly explored and discussed.
On the role of model-based monitoring for adaptive planning under uncertainty
NASA Astrophysics Data System (ADS)
Raso, Luciano; Kwakkel, Jan; Timmermans, Jos; Haasnoot, Mariolijn
2016-04-01
Adaptive plans, designed to anticipate and respond to an unfolding uncertain future, have found a fertile application domain in the planning of deltas that are exposed to rapid socioeconomic development and climate change. Adaptive planning, under the moniker of adaptive delta management, is used in the Dutch Delta Program for developing a nation-wide plan to prepare for uncertain climate change and socio-economic developments. Scientifically, adaptive delta management relies heavily on Dynamic Adaptive Policy Pathways. Currently, in the Netherlands the focus is shifting towards implementing the adaptive delta plan. This shift is especially relevant because the efficacy of adaptive plans hinges on monitoring on-going developments and ensuring that actions are indeed taken if and when necessary. In the design of an effective monitoring system for an adaptive plan, three challenges have to be confronted: • Shadow of the past: The development of adaptive plans and the design of their monitoring system rely heavily on current knowledge of the system and current beliefs about plausible future developments. A static monitoring system is therefore exposed to the exact same uncertainties one tries to address through adaptive planning. • Inhibition of learning: Recent applications of adaptive planning tend to overlook the importance of learning and new information, and fail to account for this explicitly in the design of adaptive plans. • Challenge of surprise: Adaptive policies are designed in light of the currently foreseen uncertainties. However, developments that are not considered plausible during the design phase could still substantially affect the performance of adaptive policies. The shadow of the past, the inhibition of learning, and the challenge of surprise taken together suggest that there is a need to redesign the concepts of monitoring and evaluation to support the implementation of adaptive plans. Innovations from control theory, triggered by the challenge of uncertainty in operational control, may offer solutions from which monitoring for adaptive planning can benefit. Specifically: (i) in control, observations are incorporated into the model through data assimilation, updating the present state, boundary conditions, and parameters based on new observations, diminishing the shadow of the past; (ii) adaptive control is a way to modify the characteristics of the internal model, incorporating new knowledge of the system, countervailing the inhibition of learning; and (iii) in closed-loop control, a continuous system update equips the controller with "inherent robustness", i.e. the capacity to adapt to new conditions even when these were not initially considered. We aim to explore how inherent robustness addresses the challenge of surprise. Innovations in model-based control might help to improve and adapt the models used to support adaptive delta management to new information (reducing uncertainty). Moreover, this would offer a starting point for using these models not only in the design of adaptive plans, but also as part of the monitoring. The proposed research requires multidisciplinary cooperation between control theory, the policy sciences, and integrated assessment modeling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, P.; Eurek, K.; Margolis, R.
2014-07-01
Because solar power is a rapidly growing component of the electricity system, robust representations of solar technologies should be included in capacity-expansion models. This is a challenge because modeling the electricity system--and, in particular, modeling solar integration within that system--is a complex endeavor. This report highlights the major challenges of incorporating solar technologies into capacity-expansion models and shows examples of how specific models address those challenges. These challenges include modeling non-dispatchable technologies, determining which solar technologies to model, choosing a spatial resolution, incorporating a solar resource assessment, and accounting for solar generation variability and uncertainty.
Lack of power enhances visual perceptual discrimination.
Weick, Mario; Guinote, Ana; Wilkinson, David
2011-09-01
Powerless individuals face much challenge and uncertainty. As a consequence, they are highly vigilant and closely scrutinize their social environments. The aim of the present research was to determine whether these qualities enhance performance in more basic cognitive tasks involving simple visual feature discrimination. To test this hypothesis, participants performed a series of perceptual matching and search tasks involving colour, texture, and size discrimination. As predicted, those primed with powerlessness generated shorter reaction times and made fewer eye movements than either powerful or control participants. The results indicate that the heightened vigilance shown by powerless individuals is associated with an advantage in performing simple types of psychophysical discrimination. These findings highlight, for the first time, an underlying competency in perceptual cognition that sets powerless individuals above their powerful counterparts, an advantage that may reflect functional adaptation to the environmental challenge and uncertainty that they face. © 2011 Canadian Psychological Association
Tasker, Fiona; Wood, Sally
2016-10-01
Our prospective study investigated couples' expectations of adoptive parenthood and explored how these changed with their actual experience of parenthood. Six heterosexual couples were interviewed just before placement began and 6 months after the children had arrived. Interpretative Phenomenological Analysis (IPA) was used to analyse both sets of interview data. Expectations of adoptive parenthood mostly transformed smoothly into adoption experience for couples, but challenges were experienced when family scripts collided and a continued feeling of unsafe uncertainty then prevailed within these newly formed family systems. Family script collision seemed a particular problem for couples adopting sibling pairs. To further professional practice in working with families over the transition to adoptive parenting, we suggest that professionals keep in mind a framework that includes the following: Internal and external world influences on family members, Intergenerational issues, Family scripts and the Structural challenges of adoption (IIFS). © The Author(s) 2016.
The NIST Simple Guide for Evaluating and Expressing Measurement Uncertainty
NASA Astrophysics Data System (ADS)
Possolo, Antonio
2016-11-01
NIST has recently published guidance on the evaluation and expression of the uncertainty of NIST measurement results [1, 2], supplementing but not replacing B. N. Taylor and C. E. Kuyatt's (1994) Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results (NIST Technical Note 1297) [3], which tracks closely the Guide to the expression of uncertainty in measurement (GUM) [4], originally published in 1995 by the Joint Committee for Guides in Metrology of the International Bureau of Weights and Measures (BIPM). The scope of this Simple Guide, however, is much broader than the scope of both NIST Technical Note 1297 and the GUM, because it attempts to address several of the uncertainty evaluation challenges that have arisen at NIST since the 1990s, for example to include molecular biology, greenhouse gases and climate science measurements, and forensic science. The Simple Guide also expands the scope of those two other guidance documents by recognizing observation equations (that is, statistical models) as bona fide measurement models. These models are indispensable to reduce data from interlaboratory studies, to combine measurement results for the same measurand obtained by different methods, and to characterize the uncertainty of calibration and analysis functions used in the measurement of force, temperature, or composition of gas mixtures. This presentation reviews the salient aspects of the Simple Guide, illustrates the use of models and methods for uncertainty evaluation not contemplated in the GUM, and also demonstrates the NIST Uncertainty Machine [5] and the NIST Consensus Builder, which are web-based applications accessible worldwide that facilitate evaluations of measurement uncertainty and the characterization of consensus values in interlaboratory studies.
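A short worked example of the GUM law of propagation of uncertainty for a simple measurement model (an example of our own, not one drawn from the Simple Guide):

```python
import numpy as np

# Measurement model R = V / I. With uncorrelated inputs, the GUM law
# of propagation of uncertainty gives
#   u_c(R)**2 = (dR/dV)**2 * u(V)**2 + (dR/dI)**2 * u(I)**2.
V, u_V = 10.00, 0.02   # volts and standard uncertainty (illustrative)
I, u_I = 2.000, 0.005  # amperes and standard uncertainty (illustrative)
R = V / I
c_V, c_I = 1 / I, -V / I**2                 # sensitivity coefficients
u_R = np.sqrt((c_V * u_V)**2 + (c_I * u_I)**2)
print(f"R = {R:.3f} ohm, combined standard uncertainty = {u_R:.4f} ohm")
```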
Prioritizing Chemicals and Data Requirements for Screening-Level Exposure and Risk Assessment
Brown, Trevor N.; Wania, Frank; Breivik, Knut; McLachlan, Michael S.
2012-01-01
Background: Scientists and regulatory agencies strive to identify chemicals that may cause harmful effects to humans and the environment; however, prioritization is challenging because of the large number of chemicals requiring evaluation and limited data and resources. Objectives: We aimed to prioritize chemicals for exposure and exposure potential and obtain a quantitative perspective on research needs to better address uncertainty in screening assessments. Methods: We used a multimedia mass balance model to prioritize > 12,000 organic chemicals using four far-field human exposure metrics. The propagation of variance (uncertainty) in key chemical information used as model input for calculating exposure metrics was quantified. Results: Modeled human concentrations and intake rates span approximately 17 and 15 orders of magnitude, respectively. Estimates of exposure potential using human concentrations and a unit emission rate span approximately 13 orders of magnitude, and intake fractions span 7 orders of magnitude. The actual chemical emission rate contributes the greatest variance (uncertainty) in exposure estimates. The human biotransformation half-life is the second greatest source of uncertainty in estimated concentrations. In general, biotransformation and biodegradation half-lives are greater sources of uncertainty in modeled exposure and exposure potential than chemical partition coefficients. Conclusions: Mechanistic exposure modeling is suitable for screening and prioritizing large numbers of chemicals. By including uncertainty analysis and uncertainty in chemical information in the exposure estimates, these methods can help identify and address the important sources of uncertainty in human exposure and risk assessment in a systematic manner. PMID:23008278
PIV Uncertainty Methodologies for CFD Code Validation at the MIR Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sabharwall, Piyush; Skifton, Richard; Stoots, Carl
2013-12-01
Currently, computational fluid dynamics (CFD) is widely used in the nuclear thermal hydraulics field for design and safety analyses. To validate CFD codes, high-quality, multi-dimensional flow field data are essential. The Matched Index of Refraction (MIR) Flow Facility at Idaho National Laboratory has a unique capability to contribute to the development of validated CFD codes through the use of Particle Image Velocimetry (PIV). The significance of the MIR facility is that it permits non-intrusive velocity measurement techniques, such as PIV, through complex models without requiring probes and other instrumentation that disturb the flow. At the heart of any PIV calculation is the cross-correlation, which is used to estimate the displacement of particles in some small part of the image over the time span between two images. This image displacement is indicated by the location of the largest peak. In the MIR facility, uncertainty quantification is a challenging task due to the use of optical measurement techniques. This study is developing a reliable method to analyze the uncertainty and sensitivity of the measured data, together with a computer code to automate that analysis. The main objective of this study is to develop a well established uncertainty quantification method for the MIR Flow Facility, which involves many complicated uncertainty factors. The uncertainty sources are resolved in depth by categorizing them into uncertainties from the MIR flow loop and from the PIV system (including particle motion, image distortion, and data processing). Each uncertainty source is then mathematically modeled or adequately defined. Finally, this study will provide a method and procedure to quantify the experimental uncertainty in the MIR Flow Facility with sample test results.
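A minimal sketch of the cross-correlation step described above; the synthetic interrogation windows and the imposed pixel shift are invented, and mean removal is shown as the usual preprocessing step.

```python
import numpy as np
from scipy.signal import correlate2d

# Estimate the particle-image displacement between two interrogation
# windows from the location of the cross-correlation peak.
rng = np.random.default_rng(2)
win1 = rng.random((32, 32))
win1 -= win1.mean()                           # standard mean removal
win2 = np.roll(win1, (3, -2), axis=(0, 1))    # second exposure, shifted

corr = correlate2d(win2, win1, mode="full")
peak = np.unravel_index(np.argmax(corr), corr.shape)
shift = (peak[0] - (win1.shape[0] - 1), peak[1] - (win1.shape[1] - 1))
print(f"estimated displacement: {shift}")     # expect (3, -2)
```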
NASA Astrophysics Data System (ADS)
Badawy, B.; Fletcher, C. G.
2017-12-01
The parameterization of snow processes in land surface models is an important source of uncertainty in climate simulations. Quantifying the importance of snow-related parameters, and their uncertainties, may therefore lead to better understanding and quantification of uncertainty within integrated earth system models. However, quantifying the uncertainty arising from parameterized snow processes is challenging due to the high-dimensional parameter space, poor observational constraints, and parameter interaction. In this study, we investigate the sensitivity of the land simulation to uncertainty in snow microphysical parameters in the Canadian LAnd Surface Scheme (CLASS) using an uncertainty quantification (UQ) approach. A set of training cases (n=400) from CLASS is used to sample each parameter across its full range of empirical uncertainty, as determined from available observations and expert elicitation. A statistical learning model using support vector regression (SVR) is then constructed from the training data (CLASS output variables) to efficiently emulate the dynamical CLASS simulations over a much larger (n=2^20) set of cases. This approach is used to constrain the plausible range for each parameter using a skill score, and to identify the parameters with the largest influence on the land simulation in CLASS at global and regional scales, using a random forest (RF) permutation importance algorithm. Preliminary sensitivity tests indicate that the snow albedo refreshment threshold and the limiting snow depth, below which bare patches begin to appear, have the highest impact on snow output variables. The results also show a considerable narrowing of the plausible parameter ranges, which can lead to a significant reduction in model uncertainty. The implementation and results of this study will be presented and discussed in detail.
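A hedged sketch of the emulate-then-rank workflow: scikit-learn's model-agnostic permutation importance stands in here for the random-forest-based ranking used in the study, and the training data are a synthetic stand-in for CLASS runs.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Fit an SVR emulator to a modest number of "model runs", then rank
# parameter influence by permutation importance. The response y is an
# invented function of 4 hypothetical snow parameters.
rng = np.random.default_rng(3)
X = rng.uniform(0, 1, size=(400, 4))   # 400 runs, 4 parameters
y = 2.0 * X[:, 0] + 0.5 * np.sin(3 * X[:, 1]) + 0.05 * rng.normal(size=400)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
emulator = SVR(C=10.0, gamma="scale").fit(X_tr, y_tr)
imp = permutation_importance(emulator, X_te, y_te,
                             n_repeats=20, random_state=0)
ranking = np.argsort(imp.importances_mean)[::-1]
print("parameter influence ranking (most to least):", ranking)
```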
A complete representation of uncertainties in layer-counted paleoclimatic archives
NASA Astrophysics Data System (ADS)
Boers, Niklas; Goswami, Bedartha; Ghil, Michael
2017-09-01
Accurate time series representation of paleoclimatic proxy records is challenging because such records involve dating errors in addition to proxy measurement errors. Rigorous attention is rarely given to age uncertainties in paleoclimatic research, although the latter can severely bias the results of proxy record analysis. Here, we introduce a Bayesian approach to represent layer-counted proxy records - such as ice cores, sediments, corals, or tree rings - as sequences of probability distributions on absolute, error-free time axes. The method accounts for both proxy measurement errors and uncertainties arising from layer-counting-based dating of the records. An application to oxygen isotope ratios from the North Greenland Ice Core Project (NGRIP) record reveals that the counting errors, although seemingly small, lead to substantial uncertainties in the final representation of the oxygen isotope ratios. In particular, for the older parts of the NGRIP record, our results show that the total uncertainty originating from dating errors has been seriously underestimated. Our method is next applied to deriving the overall uncertainties of the Suigetsu radiocarbon comparison curve, which was recently obtained from varved sediment cores at Lake Suigetsu, Japan. This curve provides the only terrestrial radiocarbon comparison for the time interval 12.5-52.8 kyr BP. The uncertainties derived here can be readily employed to obtain complete error estimates for arbitrary radiometrically dated proxy records of this recent part of the last glacial interval.
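Why layer-counting errors accumulate can be shown with a short random-walk sketch; the miscounting probabilities below are illustrative, not NGRIP values.

```python
import numpy as np

# If each counted layer is independently missed or double-counted with
# small probability, the net age error grows with depth like a random
# walk, roughly as sqrt(n * (p_miss + p_double)).
rng = np.random.default_rng(4)
n_layers, p_miss, p_double = 20_000, 0.01, 0.01
# per-layer counting error: -1 (missed), +1 (double-counted), else 0
err = rng.choice([-1, 0, 1], size=(1_000, n_layers),
                 p=[p_miss, 1 - p_miss - p_double, p_double])
age_error = err.sum(axis=1)          # net error after n_layers, per trial
print(f"std of age error after {n_layers} layers: {age_error.std():.0f} yr")
# expected value here: sqrt(20000 * 0.02) = 20 yr, growing with depth
```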
NASA Astrophysics Data System (ADS)
Hampel, B.; Liu, B.; Nording, F.; Ostermann, J.; Struszewski, P.; Langfahl-Klabes, J.; Bieler, M.; Bosse, H.; Güttler, B.; Lemmens, P.; Schilling, M.; Tutsch, R.
2018-03-01
In many cases, the determination of the measurement uncertainty of complex nanosystems provides unexpected challenges. This is particularly true for complex systems with many degrees of freedom, i.e. nanosystems with multiparametric dependencies and multivariate output quantities. The aim of this paper is to address specific questions arising during the uncertainty calculation of such systems. This includes the division of the measurement system into subsystems and the distinction between systematic and statistical influences. We demonstrate that, even if the physical systems under investigation are very different, the corresponding uncertainty calculation can always be realized in a similar manner. This is exemplarily shown in detail for two experiments, namely magnetic nanosensors and ultrafast electro-optical sampling of complex time-domain signals. For these examples the approach for uncertainty calculation following the guide to the expression of uncertainty in measurement (GUM) is explained, in which correlations between multivariate output quantities are captured. To illustrate the versatility of the proposed approach, its application to other experiments, namely nanometrological instruments for terahertz microscopy, dimensional scanning probe microscopy, and measurement of concentration of molecules using surface enhanced Raman scattering, is shortly discussed in the appendix. We believe that the proposed approach provides a simple but comprehensive orientation for uncertainty calculation in the discussed measurement scenarios and can also be applied to similar or related situations.
Liu, Shuguang; Bond-Lamberty, Ben; Hicke, Jeffrey A.; Vargas, Rodrigo; Zhao, Shuqing; Chen, Jing; Edburg, Steven L.; Hu, Yueming; Liu, Jinxun; McGuire, A. David; Xiao, Jingfeng; Keane, Robert; Yuan, Wenping; Tang, Jianwu; Luo, Yiqi; Potter, Christopher; Oeding, Jennifer
2011-01-01
Forest disturbances greatly alter the carbon cycle at various spatial and temporal scales. It is critical to understand disturbance regimes and their impacts to better quantify regional and global carbon dynamics. This review of the status and major challenges in representing the impacts of disturbances in modeling the carbon dynamics across North America revealed some major advances and challenges. First, significant advances have been made in representation, scaling, and characterization of disturbances that should be included in regional modeling efforts. Second, there is a need to develop effective and comprehensive process-based procedures and algorithms to quantify the immediate and long-term impacts of disturbances on ecosystem succession, soils, microclimate, and cycles of carbon, water, and nutrients. Third, our capability to simulate the occurrences and severity of disturbances is very limited. Fourth, scaling issues have rarely been addressed in continental scale model applications. It is not fully understood which finer scale processes and properties need to be scaled to coarser spatial and temporal scales. Fifth, there are inadequate databases on disturbances at the continental scale to support the quantification of their effects on the carbon balance in North America. Finally, procedures are needed to quantify the uncertainty of model inputs, model parameters, and model structures, and thus to estimate their impacts on overall model uncertainty.
Blind prediction of cyclohexane-water distribution coefficients from the SAMPL5 challenge.
Bannan, Caitlin C; Burley, Kalistyn H; Chiu, Michael; Shirts, Michael R; Gilson, Michael K; Mobley, David L
2016-11-01
In the recent SAMPL5 challenge, participants submitted predictions for cyclohexane/water distribution coefficients for a set of 53 small molecules. Distribution coefficients (log D) replace the hydration free energies that were a central part of the past five SAMPL challenges. A wide variety of computational methods were represented by the 76 submissions from 18 participating groups. Here, we analyze submissions by a variety of error metrics and provide details for a number of reference calculations we performed. As in the SAMPL4 challenge, we assessed the ability of participants to evaluate not just their statistical uncertainty, but their model uncertainty, that is, how well they can predict the magnitude of their model or force field error for specific predictions. Unfortunately, this remains an area where prediction and analysis need improvement. In SAMPL4 the top performing submissions achieved a root-mean-squared error (RMSE) around 1.5 kcal/mol. If we anticipate accuracy in log D predictions to be similar to the hydration free energy predictions in SAMPL4, the expected error here would be around 1.54 log units. Only a few submissions had an RMSE below 2.5 log units in their predicted log D values. However, distribution coefficients introduced complexities not present in past SAMPL challenges, including tautomer enumeration, that are likely to be important in predicting biomolecular properties of interest to drug discovery; therefore, some decrease in accuracy was to be expected. Overall, the SAMPL5 distribution coefficient challenge provided great insight into the importance of modeling a variety of physical effects. We believe these types of measurements will be a promising source of data for future blind challenges, especially in view of the relatively straightforward nature of the experiments and the level of insight provided.
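A sketch of the kind of error-metric analysis used to compare such submissions: RMSE over the molecule set, with a bootstrap confidence interval obtained by resampling molecules. The measured and predicted values below are synthetic stand-ins, not SAMPL5 data:

    # RMSE of predicted vs measured log D with a bootstrap 95% CI.
    import numpy as np

    rng = np.random.default_rng(1)
    measured = rng.normal(0.0, 2.0, size=53)               # hypothetical log D
    predicted = measured + rng.normal(0.0, 2.5, size=53)   # model with ~2.5 LU error

    def rmse(a, b):
        return np.sqrt(np.mean((a - b) ** 2))

    boot = []
    for _ in range(2000):
        idx = rng.integers(0, 53, size=53)   # resample molecules with replacement
        boot.append(rmse(predicted[idx], measured[idx]))
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print("RMSE = %.2f log units, 95%% CI [%.2f, %.2f]" %
          (rmse(predicted, measured), lo, hi))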
Managing uncertainty in flood protection planning with climate projections
NASA Astrophysics Data System (ADS)
Dittes, Beatrice; Špačková, Olga; Schoppa, Lukas; Straub, Daniel
2018-04-01
Technical flood protection is a necessary part of integrated strategies to protect riverine settlements from extreme floods. Many technical flood protection measures, such as dikes and protection walls, are costly to adapt after their initial construction. This poses a challenge to decision makers as there is large uncertainty in how the required protection level will change during the measure lifetime, which is typically many decades long. Flood protection requirements should account for multiple future uncertain factors: socioeconomic, e.g., whether the population and with it the damage potential grows or falls; technological, e.g., possible advancements in flood protection; and climatic, e.g., whether extreme discharge will become more frequent or not. This paper focuses on climatic uncertainty. Specifically, we devise methodology to account for uncertainty associated with the use of discharge projections, ultimately leading to planning implications. For planning purposes, we categorize uncertainties as either visible, if they can be quantified from available catchment data, or hidden, if they cannot be quantified from catchment data and must be estimated, e.g., from the literature. It is vital to consider the hidden uncertainty, since in practical applications only a limited amount of information (e.g., a finite projection ensemble) is available. We use a Bayesian approach to quantify the visible uncertainties and combine them with an estimate of the hidden uncertainties to learn a joint probability distribution of the parameters of extreme discharge. The methodology is integrated into an optimization framework and applied to a pre-alpine case study to give a quantitative, cost-optimal recommendation on the required amount of flood protection. The results show that hidden uncertainty ought to be considered in planning, but the larger the uncertainty already present, the smaller the impact of adding more. The recommended planning is robust to moderate changes in uncertainty as well as in trend. In contrast, planning without consideration of bias and dependencies in and between uncertainty components leads to strongly suboptimal planning recommendations.
Li, Zhengpeng; Liu, Shuguang; Zhang, Xuesong; West, Tristram O.; Ogle, Stephen M.; Zhou, Naijun
2016-01-01
Quantifying spatial and temporal patterns of carbon sources and sinks and their uncertainties across agriculture-dominated areas remains challenging for understanding regional carbon cycles. Characteristics of local land cover inputs could impact regional carbon estimates, but the effect has not been fully evaluated in the past. Within the North American Carbon Program Mid-Continent Intensive (MCI) Campaign, three models were developed to estimate carbon fluxes on croplands: an inventory-based model, the Environmental Policy Integrated Climate (EPIC) model, and the General Ensemble biogeochemical Modeling System (GEMS) model. They all provided estimates of three major carbon fluxes on cropland: net primary production (NPP), net ecosystem production (NEP), and soil organic carbon (SOC) change. Using data mining and spatial statistics, we studied the spatial distribution of the carbon flux uncertainties and the relationships between the uncertainties and land cover characteristics. Results indicated that uncertainties for all three carbon fluxes were not randomly distributed, but instead formed multiple clusters within the MCI region. We investigated the impacts of three land cover characteristics on the flux uncertainties: cropland percentage, cropland richness, and cropland diversity. The results indicated that cropland percentage significantly influenced the uncertainties of NPP and NEP, but not the uncertainties of SOC change. Greater uncertainties of NPP and NEP were found in counties with small cropland percentage than in counties with large cropland percentage. Cropland species richness and diversity also showed negative correlations with the model uncertainties. Our study demonstrated that land cover characteristics contributed to the uncertainties of regional carbon flux estimates. The approaches we used in this study can be applied to other ecosystem models to identify areas with high uncertainties and where models can be improved to reduce overall uncertainties in regional carbon flux estimates.
Creating dialogue: a workshop on "Uncertainty in Decision Making in a Changing Climate"
NASA Astrophysics Data System (ADS)
Ewen, Tracy; Addor, Nans; Johnson, Leigh; Coltekin, Arzu; Derungs, Curdin; Muccione, Veruska
2014-05-01
Uncertainty is present in all fields of climate research, spanning from projections of future climate change, to assessing regional impacts and vulnerabilities, to adaptation policy and decision-making. In addition to uncertainties, managers and planners in many sectors are often confronted with large amounts of information from climate change research whose complex and interdisciplinary nature make it challenging to incorporate into the decision-making process. An overarching issue in tackling this problem is the lack of institutionalized dialogue between climate researchers, decision-makers and user groups. Forums that facilitate such dialogue would allow climate researchers to actively engage with end-users and researchers in different disciplines to better characterize uncertainties and ultimately understand which ones are critically considered and incorporated into decisions made. We propose that the introduction of students to these challenges at an early stage of their education and career is a first step towards improving future dialogue between climate researchers, decision-makers and user groups. To this end, we organized a workshop at the University of Zurich, Switzerland, entitled "Uncertainty in Decision Making in a Changing Climate". It brought together 50 participants, including Bachelor, Master and PhD students and academic staff, and nine selected speakers from academia, industry, government, and philanthropy. Speakers introduced participants to topics ranging from uncertainties in climate model scenarios to managing uncertainties in development and aid agencies. The workshop consisted of experts' presentations, a panel discussion and student group work on case studies. Pedagogical goals included i) providing participants with an overview of the current research on uncertainty and on how uncertainty is dealt with by decision-makers, ii) fostering exchange between practitioners, students, and scientists from different backgrounds, iii) exposing students, at an early stage of their professional life, to multidisciplinary collaborations and real-world problems involving decisions under uncertainty. An opinion survey conducted before and after the workshop enabled us to observe changes in participants' perspectives on what information and tools should be exchanged between researchers and decision-makers to better address uncertainty. Responses demonstrated a marked shift from a pre-workshop vertical conceptualization of researcher-user group interaction to a post-workshop horizontal mode: in the former, researchers were portrayed as bestowing data-based products to decision-makers, while in the latter, both sets of actors engaged in institutionalized dialogues and frequent communication, exchanging their needs, expertise, and personnel. In addition to the survey, we will draw on examples from the course evaluation to illustrate the strengths and weaknesses of our approach. By doing so, we seek to encourage the organization of similar events by other universities, with the mid-term goal of improving future dialogue. From a pedagogical perspective, introducing students to these ideas at a very early stage in their research careers is an ideal opportunity to establish new modes of communication with an interdisciplinary perspective and strengthen dialogue between climate researchers, decision-makers and user groups.
Scaling Up Decision Theoretic Planning to Planetary Rover Problems
NASA Technical Reports Server (NTRS)
Meuleau, Nicolas; Dearden, Richard; Washington, Rich
2004-01-01
Because of communication limits, planetary rovers must operate autonomously for extended periods. The ability to plan under uncertainty is one of the main components of autonomy. Previous approaches to planning under uncertainty in NASA applications are not able to address the challenges of future missions because of several apparent limits. On the other hand, decision theory provides a solid, principled framework for reasoning about uncertainty and rewards. Unfortunately, there are several obstacles to a direct application of decision-theoretic techniques to the rover domain. This paper focuses on the issues of structure and concurrency, and continuous state variables. We describe two techniques currently under development that specifically address these issues and allow decision-theoretic solution techniques to scale up to planetary rover planning problems involving a small number of goals.
Data management to enhance long-term watershed research capacity
USDA-ARS?s Scientific Manuscript database
Water resources are under growing pressure globally, and in the face of projected climate change, uncertainty about precipitation frequency and intensity, evapotranspiration, runoff, and snowmelt poses severe societal challenges. Interdisciplinary environmental research across natural and social sc...
Quantifying measurement uncertainty and spatial variability in the context of model evaluation
NASA Astrophysics Data System (ADS)
Choukulkar, A.; Brewer, A.; Pichugina, Y. L.; Bonin, T.; Banta, R. M.; Sandberg, S.; Weickmann, A. M.; Djalalova, I.; McCaffrey, K.; Bianco, L.; Wilczak, J. M.; Newman, J. F.; Draxl, C.; Lundquist, J. K.; Wharton, S.; Olson, J.; Kenyon, J.; Marquis, M.
2017-12-01
In an effort to improve wind forecasts for the wind energy sector, the Department of Energy and the NOAA funded the second Wind Forecast Improvement Project (WFIP2). As part of the WFIP2 field campaign, a large suite of in-situ and remote sensing instrumentation was deployed to the Columbia River Gorge in Oregon and Washington from October 2015 to March 2017. The array of instrumentation deployed included 915-MHz wind profiling radars, sodars, wind-profiling lidars, and scanning lidars. The role of these instruments was to provide wind measurements at high spatial and temporal resolution for model evaluation and improvement of model physics. To properly determine model errors, the uncertainties in instrument-model comparisons need to be quantified accurately. These uncertainties arise from several factors such as measurement uncertainty, spatial variability, and interpolation of model output to instrument locations, to name a few. In this presentation, we will introduce a formalism to quantify measurement uncertainty and spatial variability. The accuracy of this formalism will be tested using existing datasets such as the eXperimental Planetary boundary layer Instrumentation Assessment (XPIA) campaign. Finally, the uncertainties in wind measurement and the spatial variability estimates from the WFIP2 field campaign will be discussed to understand the challenges involved in model evaluation.
Uncertainty Quantification in High Throughput Screening ...
Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of biochemical and cellular processes, including endocrine disruption, cytotoxicity, and zebrafish development. Over 2.6 million concentration response curves are fit to models to extract parameters related to potency and efficacy. Models built on ToxCast results are being used to rank and prioritize the toxicological risk of tested chemicals and to predict the toxicity of tens of thousands of chemicals not yet tested in vivo. However, the data size also presents challenges. When fitting the data, the choice of models, model selection strategy, and hit call criteria must reflect the need for computational efficiency and robustness, requiring hard and somewhat arbitrary cutoffs. When coupled with unavoidable noise in the experimental concentration response data, these hard cutoffs cause uncertainty in model parameters and the hit call itself. The uncertainty will then propagate through all of the models built on the data. Left unquantified, this uncertainty makes it difficult to fully interpret the data for risk assessment. We used bootstrap resampling methods to quantify the uncertainty in fitting models to the concentration response data. Bootstrap resampling determines confidence intervals for
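A minimal sketch of bootstrap resampling around a concentration-response fit, in the spirit described above; the Hill function, data, and noise level are illustrative stand-ins, not the ToxCast pipeline:

    # Residual-bootstrap confidence intervals for Hill-model parameters.
    import numpy as np
    from scipy.optimize import curve_fit

    def hill(c, top, ac50):
        # simple two-parameter Hill curve (unit slope), illustrative only
        return top / (1.0 + ac50 / c)

    rng = np.random.default_rng(2)
    conc = np.logspace(-2, 2, 8)                           # uM
    resp = hill(conc, top=80.0, ac50=1.0) + rng.normal(0, 5, conc.size)

    popt, _ = curve_fit(hill, conc, resp, p0=[100.0, 1.0])
    resid = resp - hill(conc, *popt)

    tops, ac50s = [], []
    for _ in range(1000):
        resp_b = hill(conc, *popt) + rng.choice(resid, resid.size, replace=True)
        try:
            pb, _ = curve_fit(hill, conc, resp_b, p0=popt)
            tops.append(pb[0]); ac50s.append(pb[1])
        except RuntimeError:
            continue                                       # skip non-converged fits

    print("top  95%% CI: [%.1f, %.1f]" % tuple(np.percentile(tops, [2.5, 97.5])))
    print("AC50 95%% CI: [%.2f, %.2f] uM" % tuple(np.percentile(ac50s, [2.5, 97.5])))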
Uncertainty and Sensitivity Analysis of Afterbody Radiative Heating Predictions for Earth Entry
NASA Technical Reports Server (NTRS)
West, Thomas K., IV; Johnston, Christopher O.; Hosder, Serhat
2016-01-01
The objective of this work was to perform sensitivity analysis and uncertainty quantification for afterbody radiative heating predictions of the Stardust capsule during Earth entry at peak afterbody radiation conditions. The radiation environment in the afterbody region poses significant challenges for accurate uncertainty quantification and sensitivity analysis due to the complexity of the flow physics, computational cost, and large number of uncertain variables. In this study, first a sparse collocation non-intrusive polynomial chaos approach along with global non-linear sensitivity analysis was used to identify the most significant uncertain variables and reduce the dimensions of the stochastic problem. Then, a total order stochastic expansion was constructed over only the important parameters for an efficient and accurate estimate of the uncertainty in radiation. Based on previous work, 388 uncertain parameters were considered in the radiation model, which came from the thermodynamics, flow field chemistry, and radiation modeling. The sensitivity analysis showed that only four of these variables contributed significantly to afterbody radiation uncertainty, accounting for almost 95% of the uncertainty. These included the electronic-impact excitation rate for N between level 2 and level 5 and rates of three chemical reactions influencing N, N(+), O, and O(+) number densities in the flow field.
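The screening step, ranking inputs by their share of output variance, can be illustrated with a plain Monte Carlo estimator of total-order Sobol indices (the Jansen pick-freeze formula) on a toy model. The paper itself uses non-intrusive polynomial chaos; the four-input function below is invented:

    # Total-order Sobol indices via the pick-freeze estimator:
    # S_Ti = mean((f(A) - f(A with column i from B))^2) / (2 Var[f]).
    import numpy as np

    def model(x):
        # hypothetical response: one dominant input, one weak, two inert
        return 5.0 * x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.01 * x[:, 2]

    rng = np.random.default_rng(3)
    N, d = 100000, 4
    A = rng.standard_normal((N, d))
    B = rng.standard_normal((N, d))
    fA = model(A)
    var = fA.var()

    for i in range(d):
        AB = A.copy()
        AB[:, i] = B[:, i]                  # resample only input i
        S_total = 0.5 * np.mean((fA - model(AB)) ** 2) / var
        print("input %d: total Sobol index ~ %.3f" % (i, S_total))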
Kriston, Levente; Meister, Ramona
2014-03-01
Judging applicability (relevance) of meta-analytical findings to particular clinical decision-making situations remains challenging. We aimed to describe an evidence synthesis method that accounts for possible uncertainty regarding applicability of the evidence. We conceptualized uncertainty regarding applicability of the meta-analytical estimates to a decision-making situation as the result of uncertainty regarding applicability of the findings of the trials that were included in the meta-analysis. This trial-level applicability uncertainty can be directly assessed by the decision maker and allows for the definition of trial inclusion probabilities, which can be used to perform a probabilistic meta-analysis with unequal probability resampling of trials (adaptive meta-analysis). A case study with several fictitious decision-making scenarios was performed to demonstrate the method in practice. We present options to elicit trial inclusion probabilities and perform the calculations. The result of an adaptive meta-analysis is a frequency distribution of the estimated parameters from traditional meta-analysis that provides individually tailored information according to the specific needs and uncertainty of the decision maker. The proposed method offers a direct and formalized combination of research evidence with individual clinical expertise and may aid clinicians in specific decision-making situations.
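A sketch of the unequal-probability resampling idea: each bootstrap replicate includes a trial with a probability reflecting its judged applicability, and the pooled inverse-variance estimate is recomputed each time. Effect sizes, variances, and inclusion probabilities below are fictitious:

    # Adaptive meta-analysis sketch: Bernoulli trial inclusion, then
    # fixed-effect (inverse-variance) pooling per replicate.
    import numpy as np

    rng = np.random.default_rng(4)
    effect = np.array([0.30, 0.10, 0.45, 0.25, -0.05])   # trial estimates
    var = np.array([0.02, 0.01, 0.05, 0.03, 0.02])       # within-trial variances
    p_incl = np.array([0.9, 0.5, 0.8, 0.3, 0.6])         # applicability (assumed)

    pooled = []
    for _ in range(5000):
        keep = rng.random(effect.size) < p_incl          # Bernoulli inclusion
        if not keep.any():
            continue
        w = 1.0 / var[keep]                              # inverse-variance weights
        pooled.append(np.sum(w * effect[keep]) / np.sum(w))

    print("pooled effect: median %.3f, 95%% interval [%.3f, %.3f]" %
          (np.median(pooled), *np.percentile(pooled, [2.5, 97.5])))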
Adamson, M W; Morozov, A Y; Kuzenkov, O A
2016-09-01
Mathematical models in biology are highly simplified representations of a complex underlying reality, and there is always a high degree of uncertainty with regard to model function specification. This uncertainty becomes critical for models in which the use of different functions fitting the same dataset can yield substantially different predictions, a property known as structural sensitivity. Thus, even if the model is purely deterministic, the uncertainty in the model functions carries through into uncertainty in model predictions, and new frameworks are required to tackle this fundamental problem. Here, we consider a framework that uses partially specified models in which some functions are not represented by a specific form. The main idea is to project the infinite-dimensional function space into a low-dimensional space taking into account biological constraints. The key question of how to carry out this projection has so far remained a serious mathematical challenge and hindered the use of partially specified models. Here, we propose and demonstrate a potentially powerful technique to perform such a projection by using optimal control theory to construct functions with the specified global properties. This approach opens up the prospect of a flexible and easy-to-use method to carry out uncertainty analysis of biological models.
NASA Astrophysics Data System (ADS)
Chen, Cheng; Xu, Weijie; Guo, Tong; Chen, Kai
2017-10-01
Uncertainties in structural properties can result in different responses in hybrid simulations. Quantification of the effect of these uncertainties would enable researchers to estimate the variances of structural responses observed in experiments. This poses challenges for real-time hybrid simulation (RTHS) due to the existence of actuator delay. Polynomial chaos expansion (PCE) projects the model outputs on a basis of orthogonal stochastic polynomials to account for influences of model uncertainties. In this paper, PCE is utilized to evaluate the effect of actuator delay on the maximum displacement from real-time hybrid simulation of a single-degree-of-freedom (SDOF) structure when accounting for uncertainties in structural properties. The PCE is first applied to RTHS without delay to determine the order of the PCE, the number of sample points, and the method for coefficient calculation. The PCE is then applied to RTHS with actuator delay. The mean, variance, and Sobol indices are compared and discussed to evaluate the effects of actuator delay on uncertainty quantification for RTHS. Results show that the mean and the variance of the maximum displacement increase linearly and exponentially, respectively, with respect to actuator delay. Sensitivity analysis through Sobol indices also indicates that the influence of a single random variable decreases while the coupling effect increases as actuator delay increases.
NASA Astrophysics Data System (ADS)
Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison
2017-11-01
Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solvers and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from the National Institutes of Health (R01 EB018302) is greatly appreciated.
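The kind of solver comparison described above can be sketched with SciPy on a generic sparse symmetric positive-definite system; the 2-D Laplacian stand-in and the ILU preconditioner below are assumptions, not the actual flow-solver operators:

    # Conjugate gradient with and without an ILU preconditioner.
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n = 50
    lap1d = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
    A = sp.kronsum(lap1d, lap1d).tocsc()        # 2-D Laplacian, SPD
    b = np.ones(A.shape[0])

    counts = {"plain": 0, "ilu": 0}
    def make_cb(key):
        def cb(xk):
            counts[key] += 1                    # one call per CG iteration
        return cb

    x, info = spla.cg(A, b, callback=make_cb("plain"))

    ilu = spla.spilu(A, drop_tol=1e-4)          # incomplete LU factorization
    M = spla.LinearOperator(A.shape, matvec=ilu.solve)
    x, info = spla.cg(A, b, M=M, callback=make_cb("ilu"))

    print("CG iterations: %(plain)d plain vs %(ilu)d with ILU" % counts)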
The impacts of uncertainty and variability in groundwater-driven health risk assessment. (Invited)
NASA Astrophysics Data System (ADS)
Maxwell, R. M.
2010-12-01
Potential human health risk from contaminated groundwater is becoming an important, quantitative measure used in management decisions in a range of applications from Superfund to CO2 sequestration. Quantitatively assessing the potential human health risks from contaminated groundwater is challenging due to the many coupled processes, uncertainty in transport parameters and the variability in individual physiology and behavior. Perspective on human health risk assessment techniques will be presented and a framework used to predict potential, increased human health risk from contaminated groundwater will be discussed. This framework incorporates transport of contaminants through the subsurface from source to receptor and health risks to individuals via household exposure pathways. The subsurface is shown subject to both physical and chemical heterogeneity which affects downstream concentrations at receptors. Cases are presented where hydraulic conductivity can exhibit both uncertainty and spatial variability in addition to situations where hydraulic conductivity is the dominant source of uncertainty in risk assessment. Management implications, such as characterization and remediation will also be discussed.
Huang, Guowen; Lee, Duncan; Scott, E Marian
2018-03-30
The long-term health effects of air pollution are often estimated using a spatio-temporal ecological areal unit study, but this design leads to the following statistical challenges: (1) how to estimate spatially representative pollution concentrations for each areal unit; (2) how to allow for the uncertainty in these estimated concentrations when estimating their health effects; and (3) how to simultaneously estimate the joint effects of multiple correlated pollutants. This article proposes a novel 2-stage Bayesian hierarchical model for addressing these 3 challenges, with inference based on Markov chain Monte Carlo simulation. The first stage is a multivariate spatio-temporal fusion model for predicting areal level average concentrations of multiple pollutants from both monitored and modelled pollution data. The second stage is a spatio-temporal model for estimating the health impact of multiple correlated pollutants simultaneously, which accounts for the uncertainty in the estimated pollution concentrations. The novel methodology is motivated by a new study of the impact of both particulate matter and nitrogen dioxide concentrations on respiratory hospital admissions in Scotland between 2007 and 2011, and the results suggest that both pollutants exhibit substantial and independent health effects.
NASA Astrophysics Data System (ADS)
Sedlmeier, Katrin; Gubler, Stefanie; Spierig, Christoph; Flubacher, Moritz; Maurer, Felix; Quevedo, Karim; Escajadillo, Yury; Avalos, Griña; Liniger, Mark A.; Schwierz, Cornelia
2017-04-01
Seasonal climate forecast products potentially have a high value for users of different sectors. During the first phase (2012-2015) of the project CLIMANDES (a pilot project of the Global Framework for Climate Services led by WMO [http://www.wmo.int/gfcs/climandes]), a demand study conducted with Peruvian farmers indicated a large interest in seasonal climate information for agriculture. The study further showed that the required information should be precise, timely, and understandable. In addition to the actual forecast, two complex measures are essential to understand seasonal climate predictions and their limitations correctly: forecast uncertainty and forecast skill. The former can be sampled by using an ensemble of climate simulations, the latter derived by comparing forecasts of past time periods to observations. Including uncertainty and skill information in an understandable way for end-users (who are often not technically educated) poses a great challenge. However, neglecting this information would lead to a false sense of determinism which could prove fatal to the credibility of climate information. Within the second phase (2016-2018) of the project CLIMANDES, one goal is to develop a prototype of a user-tailored seasonal forecast for the agricultural sector in Peru. In this local context, the basic education level of the rural farming community presents a major challenge for the communication of seasonal climate predictions. This contribution proposes different graphical presentations of climate forecasts along with possible approaches to visualize and communicate the associated skill and uncertainties, considering end users with varying levels of technical knowledge.
Uncertainty information in climate data records from Earth observation
NASA Astrophysics Data System (ADS)
Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang
2017-07-01
The question of how to derive and present uncertainty information in climate data records (CDRs) has received sustained attention within the European Space Agency Climate Change Initiative (CCI), a programme to generate CDRs addressing a range of essential climate variables (ECVs) from satellite data. Here, we review the nature, mathematics, practicalities, and communication of uncertainty information in CDRs from Earth observations. This review paper argues that CDRs derived from satellite-based Earth observation (EO) should include rigorous uncertainty information to support the application of the data in contexts such as policy, climate modelling, and numerical weather prediction reanalysis. Uncertainty, error, and quality are distinct concepts, and the case is made that CDR products should follow international metrological norms for presenting quantified uncertainty. As a baseline for good practice, total standard uncertainty should be quantified per datum in a CDR, meaning that uncertainty estimates should clearly discriminate more and less certain data. In this case, flags for data quality should not duplicate uncertainty information, but instead describe complementary information (such as the confidence in the uncertainty estimate provided or indicators of conditions violating the retrieval assumptions). The paper discusses the many sources of error in CDRs, noting that different errors may be correlated across a wide range of timescales and space scales. Error effects that contribute negligibly to the total uncertainty in a single-satellite measurement can be the dominant sources of uncertainty in a CDR on the large space scales and long timescales that are highly relevant for some climate applications. For this reason, identifying and characterizing the relevant sources of uncertainty for CDRs is particularly challenging. The characterization of uncertainty caused by a given error effect involves assessing the magnitude of the effect, the shape of the error distribution, and the propagation of the uncertainty to the geophysical variable in the CDR accounting for its error correlation properties. Uncertainty estimates can and should be validated as part of CDR validation when possible. These principles are quite general, but the approach to providing uncertainty information appropriate to different ECVs is varied, as confirmed by a brief review across different ECVs in the CCI. User requirements for uncertainty information can conflict with each other, and a variety of solutions and compromises are possible. The concept of an ensemble CDR as a simple means of communicating rigorous uncertainty information to users is discussed. Our review concludes by providing eight concrete recommendations for good practice in providing and communicating uncertainty in EO-based climate data records.
NASA Astrophysics Data System (ADS)
Clough, B.; Russell, M.; Domke, G. M.; Woodall, C. W.
2016-12-01
Uncertainty estimates are needed to establish confidence in national forest carbon stocks and to verify changes reported to the United Nations Framework Convention on Climate Change. Good practice guidance from the Intergovernmental Panel on Climate Change stipulates that uncertainty assessments should neither exaggerate nor underestimate the actual error within carbon stocks, yet methodological guidance for forests has been hampered by limited understanding of how complex dynamics give rise to errors across spatial scales (i.e., individuals to continents). This talk highlights efforts to develop a multi-scale, data-driven framework for assessing uncertainty within the United States (US) forest carbon inventory, and focuses on challenges and opportunities for improving the precision of national forest carbon stock estimates. Central to our approach is the calibration of allometric models with a newly established legacy biomass database for North American tree species, and the use of hierarchical models to link these data with the Forest Inventory and Analysis (FIA) database as well as remote sensing datasets. Our work suggests substantial risk for misestimating key sources of uncertainty including: (1) attributing more confidence in allometric models than what is warranted by the best available data; (2) failing to capture heterogeneity in biomass stocks due to environmental variation at regional scales; and (3) ignoring spatial autocorrelation and other random effects that are characteristic of national forest inventory data. Our results suggest these sources of error may be much higher than is generally assumed, though these results must be understood with the limited scope and availability of appropriate calibration data in mind. In addition to reporting on important sources of uncertainty, this talk will discuss opportunities to improve the precision of national forest carbon stocks that are motivated by our use of data-driven forecasting including: (1) improving the taxonomic and geographic scope of available biomass data; (2) direct attribution of landscape-level heterogeneity in biomass stocks to specific ecological processes; and (3) integration of expert opinion and meta-analysis to lessen the influence of often highly variable datasets on biomass stock forecasts.
Managing the Risks of Climate Change and Terrorism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosa, Eugene; Dietz, Tom; Moss, Richard H.
2012-04-07
The article describes challenges to comparative risk assessment, a key approach for managing uncertainty in decision making, across diverse threats such as terrorism and climate change, and argues that new approaches will be particularly important in addressing decisions related to sustainability.
REDD+ emissions estimation and reporting: dealing with uncertainty
NASA Astrophysics Data System (ADS)
Pelletier, Johanne; Martin, Davy; Potvin, Catherine
2013-09-01
The United Nations Framework Convention on Climate Change (UNFCCC) defined the technical and financial modalities of policy approaches and incentives to reduce emissions from deforestation and forest degradation in developing countries (REDD+). Substantial technical challenges hinder precise and accurate estimation of forest-related emissions and removals, as well as the setting and assessment of reference levels. These challenges could limit country participation in REDD+, especially if REDD+ emission reductions were to meet quality standards required to serve as compliance grade offsets for developed countries' emissions. Using Panama as a case study, we tested the matrix approach proposed by Bucki et al (2012 Environ. Res. Lett. 7 024005) to perform sensitivity and uncertainty analysis distinguishing between 'modelling sources' of uncertainty, which refers to model-specific parameters and assumptions, and 'recurring sources' of uncertainty, which refers to random and systematic errors in emission factors and activity data. The sensitivity analysis estimated differences in the resulting fluxes ranging from 4.2% to 262.2% of the reference emission level. The classification of fallows and the carbon stock increment or carbon accumulation of intact forest lands were the two key parameters showing the largest sensitivity. The highest error propagated using Monte Carlo simulations was caused by modelling sources of uncertainty, which calls for special attention to ensure consistency in REDD+ reporting, essential for securing environmental integrity. Due to the role of these modelling sources of uncertainty, the adoption of strict rules for estimation and reporting would favour comparability of emission reductions between countries. We believe that a reduction of the bias in emission factors will arise, among other things, from a globally concerted effort to improve allometric equations for tropical forests. Public access to datasets and methodology used to evaluate reference level and emission reductions would strengthen the credibility of the system by promoting accountability and transparency. To secure conservativeness and deal with uncertainty, we consider the need for further research using real data available to developing countries to test the applicability of conservative discounts, including the trend uncertainty, and other possible options that would allow real incentives and stimulate improvements over time. Finally, we argue that REDD+ result-based actions assessed on the basis of a dashboard of performance indicators, not only in 'tonnes CO2 equ. per year', might provide a more holistic approach, at least until better accuracy and certainty of forest carbon stocks emission and removal estimates to support a REDD+ policy can be reached.
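The Monte Carlo treatment of 'recurring' uncertainty amounts to propagating activity-data and emission-factor errors through the flux product; a minimal sketch with invented numbers, not the Panama data:

    # Monte Carlo propagation of a REDD+-style flux estimate:
    # emissions = activity data x emission factor, both uncertain.
    import numpy as np

    rng = np.random.default_rng(5)
    n = 100000
    area = rng.normal(12000.0, 1500.0, n)        # deforested ha/yr (assumed)
    ef = rng.lognormal(np.log(150.0), 0.25, n)   # emission factor, tC/ha (assumed)

    emissions = area * ef * 44.0 / 12.0          # convert tC to tCO2
    print("median %.2e tCO2/yr, 90%% interval [%.2e, %.2e]" %
          (np.median(emissions), *np.percentile(emissions, [5, 95])))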
NASA Astrophysics Data System (ADS)
Shafii, M.; Tolson, B.; Matott, L. S.
2012-04-01
Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in building of higher levels of complexity into hydrologic models, which eventually makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementation of multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically-based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on the use of formal likelihood functions based on statistical assumptions, and moreover, the Bayesian inference structured on MCMC samplers requires a considerably large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study aims at exploring a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality which quantifies the parameter uncertainty using the Pareto solutions, (ii) DDS-AU which uses the weighted sum of objective functions to derive the prediction limits, and (iii) GLUE which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden, and predictive capacity, which are evaluated based on multiple comparative measures. The measures for comparison are calculated both for calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model, called HYMOD.
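A compact sketch of the GLUE procedure mentioned above, on a toy one-parameter linear reservoir standing in for the 5-parameter HYMOD: sample parameters from a prior range, retain the "behavioral" sets passing a Nash-Sutcliffe threshold, and form prediction limits from the retained runs. The forcing, synthetic observations, and the 0.7 threshold are all assumptions:

    # GLUE sketch: behavioral sampling and prediction limits.
    import numpy as np

    rng = np.random.default_rng(6)
    rain = rng.gamma(2.0, 2.0, 100)               # synthetic forcing

    def reservoir(k):
        q, s = np.zeros(rain.size), 0.0
        for t, p in enumerate(rain):
            s += p
            q[t] = k * s                          # outflow fraction k
            s -= q[t]
        return q

    q_obs = reservoir(0.3) + rng.normal(0, 0.3, rain.size)   # "observations"

    def nse(sim, obs):
        return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

    ks = rng.uniform(0.05, 0.95, 2000)            # prior parameter samples
    sims = np.array([reservoir(k) for k in ks])
    behavioral = np.array([nse(s, q_obs) for s in sims]) > 0.7

    limits = np.percentile(sims[behavioral], [5, 95], axis=0)  # prediction bands
    print("behavioral sets: %d of %d" % (behavioral.sum(), ks.size))
    print("mean 90%% band width: %.2f" % (limits[1] - limits[0]).mean())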
Quantifying Variation in Gait Features from Wearable Inertial Sensors Using Mixed Effects Models
Cresswell, Kellen Garrison; Shin, Yongyun; Chen, Shanshan
2017-01-01
The emerging technology of wearable inertial sensors has shown its advantages in collecting continuous longitudinal gait data outside laboratories. This freedom also presents challenges in collecting high-fidelity gait data. In the free-living environment, without constant supervision from researchers, sensor-based gait features are susceptible to variation from confounding factors such as gait speed and mounting uncertainty, which are challenging to control or estimate. This paper is one of the first attempts in the field to tackle such challenges using statistical modeling. By accepting the uncertainties and variation associated with wearable sensor-based gait data, we shift our efforts from detecting and correcting those variations to modeling them statistically. From gait data collected on one healthy, non-elderly subject during 48 full-factorial trials, we identified four major sources of variation, and quantified their impact on one gait outcome—range per cycle—using a random effects model and a fixed effects model. The methodology developed in this paper lays the groundwork for a statistical framework to account for sources of variation in wearable gait data, thus facilitating informative statistical inference for free-living gait analysis. PMID:28245602
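A sketch of a random-intercept mixed model for a sensor-derived gait outcome, with trial as the grouping factor, in the spirit of the analysis above; the simulated data frame, covariate, and effect sizes are placeholders for the paper's 48-trial design:

    # Random-intercept model: y ~ speed with a per-trial random effect.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(7)
    trials = np.repeat(np.arange(48), 20)                 # 48 trials, 20 cycles each
    speed = rng.uniform(0.8, 1.6, trials.size)            # gait speed, m/s (assumed)
    trial_fx = rng.normal(0, 0.05, 48)[trials]            # random trial effect
    y = 0.5 + 0.3 * speed + trial_fx + rng.normal(0, 0.02, trials.size)

    df = pd.DataFrame({"y": y, "speed": speed, "trial": trials})
    fit = smf.mixedlm("y ~ speed", df, groups=df["trial"]).fit()
    print(fit.summary())          # fixed slope plus between-trial variance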
Family physicians and dementia in Canada: Part 2. Understanding the challenges of dementia care.
Pimlott, Nicholas J G; Persaud, Malini; Drummond, Neil; Cohen, Carole A; Silvius, James L; Seigel, Karen; Hollingworth, Gary R; Dalziel, William B
2009-05-01
To explore the challenges Canadian family physicians face in providing dementia care. Qualitative study using focus groups. Academic family practice clinics in Calgary, Alta, Ottawa, Ont, and Toronto, Ont. Eighteen family physicians. We conducted 4 qualitative focus groups of 4 to 6 family physicians whose practices we had audited in a previous study. Focus group transcripts were analyzed using the principles of thematic analysis. Five major themes related to the provision of dementia care by family physicians emerged: 1) diagnostic uncertainty; 2) the complexity of dementia; 3) time as a paradox in the provision of dementia care; 4) the importance of patients' families; and 5) familiarity with patients. Participants expressed uncertainty about diagnosing dementia and a strong need for expert verification of diagnoses owing to the complexity of dementia. Time, patients' family members, and familiarity with patients were seen as both barriers and enablers in the provision of dementia care. Family physicians face many challenges in providing dementia care. The results of this study and the views of family physicians should be considered in the development and dissemination of future dementia guidelines, as well as by specialist colleagues, policy makers, and those involved in developing continuing physician education about dementia.
Climate Change Mitigation Challenge for Wood Utilization-The Case of Finland.
Soimakallio, Sampo; Saikku, Laura; Valsta, Lauri; Pingoud, Kim
2016-05-17
The urgent need to mitigate climate change presents both opportunities and challenges for forest biomass utilization. Fossil fuels can be substituted by using wood products in place of alternative materials and energy, but wood harvesting reduces the forest carbon sink and processing of wood products requires material and energy inputs. We assessed the extended life cycle carbon emissions, considering substitution impacts, for various wood utilization scenarios over 100 years from 2010 onward for Finland. The scenarios were based on various but constant wood utilization structures reflecting the current and anticipated mix of wood utilization activities. We applied stochastic simulation to deal with the uncertainty in a number of required input variables. According to our analysis, wood utilization decreases net carbon emissions with a probability lower than 40% in each of the studied scenarios. Furthermore, large emission reductions were exceptionally unlikely. The uncertainty of the results was influenced most strongly by the reduction in the forest carbon sink. There is a significant trade-off between avoiding emissions through fossil fuel substitution and the reduction in the forest carbon sink due to wood harvesting. This creates a major challenge for forest management practices and wood utilization activities in responding to ambitious climate change mitigation targets.
NASA Astrophysics Data System (ADS)
Javed, Kamran; Gouriveau, Rafael; Zerhouni, Noureddine
2017-09-01
Integrating prognostics into a real application requires a certain maturity level, and for this reason there is a lack of success stories about the development of a complete Prognostics and Health Management system. In fact, the maturity of prognostics is closely linked to data and domain-specific entities like modeling. Basically, the prognostics task aims at predicting the degradation of engineering assets. In practice, however, it is not possible to predict an impending failure precisely, which requires a thorough understanding of the different sources of uncertainty that affect prognostics. Therefore, different aspects crucial to the prognostics framework, i.e., from monitoring data to the remaining useful life of equipment, need to be addressed. To this aim, the paper contributes a state of the art and taxonomy of prognostics approaches and their application perspectives. In addition, factors for prognostics approach selection are identified, and new case studies from component to system level are discussed. Moreover, open challenges toward maturity of prognostics under uncertainty are highlighted and a scheme for an efficient prognostics approach is presented. Finally, the existing challenges for verification and validation of prognostics at different technology readiness levels are discussed.
When 1+1 can be >2: Uncertainties compound when simulating climate, fisheries and marine ecosystems
NASA Astrophysics Data System (ADS)
Evans, Karen; Brown, Jaclyn N.; Sen Gupta, Alex; Nicol, Simon J.; Hoyle, Simon; Matear, Richard; Arrizabalaga, Haritz
2015-03-01
Multi-disciplinary approaches that combine oceanographic, biogeochemical, ecosystem, fisheries population and socio-economic models are vital tools for modelling whole ecosystems. Interpreting the outputs from such complex models requires an appreciation of the many different types of modelling frameworks being used and their associated limitations and uncertainties. Both users and developers of particular model components will often have little involvement or understanding of other components within such modelling frameworks. Failure to recognise limitations and uncertainties associated with components and how these uncertainties might propagate throughout modelling frameworks can potentially result in poor advice for resource management. Unfortunately, many of the current integrative frameworks do not propagate the uncertainties of their constituent parts. In this review, we outline the major components of a generic whole of ecosystem modelling framework incorporating the external pressures of climate and fishing. We discuss the limitations and uncertainties associated with each component of such a modelling system, along with key research gaps. Major uncertainties in modelling frameworks are broadly categorised into those associated with (i) deficient knowledge in the interactions of climate and ocean dynamics with marine organisms and ecosystems; (ii) lack of observations to assess and advance modelling efforts and (iii) an inability to predict with confidence natural ecosystem variability and longer term changes as a result of external drivers (e.g. greenhouse gases, fishing effort) and the consequences for marine ecosystems. As a result of these uncertainties and intrinsic differences in the structure and parameterisation of models, users are faced with considerable challenges associated with making appropriate choices on which models to use. We suggest research directions required to address these uncertainties, and caution against overconfident predictions. Understanding the full impact of uncertainty makes it clear that full comprehension and robust certainty about the systems themselves are not feasible. A key research direction is the development of management systems that are robust to this unavoidable uncertainty.
Forensic Uncertainty Quantification of Explosive Dispersal of Particles
NASA Astrophysics Data System (ADS)
Hughes, Kyle; Park, Chanyoung; Haftka, Raphael; Kim, Nam-Ho
2017-06-01
In addition to the numerical challenges of simulating the explosive dispersal of particles, validation of the simulation is often plagued by poor knowledge of the experimental conditions. The level of experimental detail required for validation is beyond what is usually included in the literature. This presentation proposes the use of forensic uncertainty quantification (UQ) to investigate validation-quality experiments to discover possible sources of uncertainty that may have been missed in the initial design of experiments or under-reported. The experience of the authors has been that making an analogy to crime scene investigation when looking at validation experiments can yield valuable insights. One examines all the data and documentation provided by the validation experimentalists, corroborates evidence, and quantifies large sources of uncertainty a posteriori with empirical measurements. In addition, it is proposed that forensic UQ may benefit from an independent investigator, to help remove possible implicit biases and increase the likelihood of discovering unrecognized uncertainty. Forensic UQ concepts will be discussed and then applied to a set of validation experiments performed at Eglin Air Force Base. This work was supported in part by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program.
Sahaf, Robab; Sadat Ilali, Ehteram; Peyrovi, Hamid; Akbari Kamrani, Ahmad Ali; Spahbodi, Fatemeh
2017-01-01
Chronic kidney disease is a major health concern, and the number of elderly people with chronic renal failure has increased across the world. Dialysis is an appropriate therapy for the elderly, but it involves certain challenges. The present paper reports uncertainty as part of the elderly's experience of living with hemodialysis. This qualitative study applied Max van Manen's interpretative phenomenological analysis to explain and explore the experiences of the elderly with hemodialysis. Given the study inclusion criteria, data were collected using in-depth unstructured interviews with nine elderly people undergoing hemodialysis, and then analyzed according to van Manen's six-stage methodological approach. One of the most important findings emerging from the study was "uncertainty", which is particularly noteworthy given other aspects of elderly life (loneliness, despair, comorbidity of diseases, disability, and mental and psychosocial problems). Uncertainty about the future is among the foremost psychological concerns of people undergoing hemodialysis. The results underline the importance of paying attention to a major aspect of the life of the elderly undergoing hemodialysis, uncertainty. A positive outlook can be created in the elderly through education and increased knowledge about the disease, its treatment, and its complications.
Hydrologic and geochemical data assimilation at the Hanford 300 Area
NASA Astrophysics Data System (ADS)
Chen, X.; Hammond, G. E.; Murray, C. J.; Zachara, J. M.
2012-12-01
In modeling the uranium migration within the Integrated Field Research Challenge (IFRC) site at the Hanford 300 Area, uncertainties arise from both hydrologic and geochemical sources. The hydrologic uncertainty includes the transient flow boundary conditions induced by dynamic variations in Columbia River stage and the underlying heterogeneous hydraulic conductivity field, while the geochemical uncertainty is a result of limited knowledge of the geochemical reaction processes and parameters, as well as heterogeneity in uranium source terms. In this work, multiple types of data, including the results from constant-injection tests, borehole flowmeter profiling, and conservative tracer tests, are sequentially assimilated across scales within a Bayesian framework to reduce the hydrologic uncertainty. The hydrologic data assimilation is then followed by geochemical data assimilation, where the goal is to infer the heterogeneous distribution of uranium sources using uranium breakthrough curves from a desorption test that took place at high spring water table. We demonstrate in our study that Ensemble-based data assimilation techniques (Ensemble Kalman filter and smoother) are efficient in integrating multiple types of data sequentially for uncertainty reduction. The computational demand is managed by using the multi-realization capability within the parallel PFLOTRAN simulator.
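The ensemble update at the core of such assimilation can be sketched as a stochastic (perturbed-observation) ensemble Kalman filter step on a generic state vector; the observation operator, error covariance, and ensemble below are placeholders, not the PFLOTRAN configuration:

    # One stochastic EnKF analysis step on a generic state ensemble.
    import numpy as np

    rng = np.random.default_rng(8)
    n_state, n_obs, n_ens = 20, 5, 50
    X = rng.standard_normal((n_state, n_ens))    # prior ensemble (e.g. log K)
    H = np.eye(n_obs, n_state)                   # observation operator (assumed)
    R = 0.1 * np.eye(n_obs)                      # observation error covariance
    y = rng.standard_normal(n_obs)               # observations (placeholder)

    A = X - X.mean(axis=1, keepdims=True)        # ensemble anomalies
    P_HT = A @ (H @ A).T / (n_ens - 1)           # cross-covariance P H^T
    S = H @ P_HT + R                             # innovation covariance
    K = P_HT @ np.linalg.inv(S)                  # Kalman gain

    # perturbed-observation update, one column per ensemble member
    Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
    X_post = X + K @ (Y - H @ X)
    print("prior spread %.3f -> posterior %.3f" % (X.std(), X_post.std()))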
Ocean state and uncertainty forecasts using HYCOM with the Local Ensemble Transform Kalman Filter (LETKF)
NASA Astrophysics Data System (ADS)
Wei, Mozheng; Hogan, Pat; Rowley, Clark; Smedstad, Ole-Martin; Wallcraft, Alan; Penny, Steve
2017-04-01
An ensemble forecast system based on the US Navy's operational HYCOM using Local Ensemble Transform Kalman Filter (LETKF) technology has been developed for ocean state and uncertainty forecasts. One of the advantages is that the best possible initial analysis states for the HYCOM forecasts are provided by the LETKF, which assimilates the operational observations using an ensemble method. The background covariance during this assimilation process is supplied by the ensemble, which avoids the difficulty of developing tangent linear and adjoint models for 4D-VAR from the complicated hybrid isopycnal vertical coordinate in HYCOM. Another advantage is that the ensemble system provides a valuable uncertainty estimate corresponding to every state forecast from HYCOM. Uncertainty forecasts have proven to be critical for downstream users and managers making scientifically sound decisions in the numerical prediction community. In addition, the ensemble mean is generally more accurate and skilful than a single traditional deterministic forecast at the same resolution. We will introduce the ensemble system design and setup, present some results from a 30-member ensemble experiment, and discuss scientific, technical, and computational issues and challenges, such as covariance localization, inflation, model-related uncertainties, and sensitivity to the ensemble size.
Evaluating Variability and Uncertainty of Geological Strength Index at a Specific Site
NASA Astrophysics Data System (ADS)
Wang, Yu; Aladejare, Adeyemi Emman
2016-09-01
Geological Strength Index (GSI) is an important parameter for estimating rock mass properties. GSI can be estimated from the quantitative GSI chart, as an alternative to the direct observational method, which requires vast geological experience with rock. The GSI chart was developed from past observations and engineering experience, with either empiricism or some theoretical simplifications. The GSI chart thereby contains model uncertainty arising from its development. The presence of such model uncertainty affects the GSI estimated from the chart at a specific site; it is, therefore, imperative to quantify and incorporate the model uncertainty when estimating GSI from the GSI chart. A major challenge in quantifying the GSI chart model uncertainty is the lack of the original datasets used to develop the chart, since the GSI chart was developed from past experience without reference to specific datasets. This paper tackles this problem by developing a Bayesian approach for quantifying the model uncertainty in the GSI chart when using it to estimate GSI at a specific site. The model uncertainty in the GSI chart and the inherent spatial variability in GSI are modeled explicitly in the Bayesian approach. The Bayesian approach generates equivalent samples of GSI from the integrated knowledge of the GSI chart, prior knowledge and observation data available from site investigation. Equations are derived for the Bayesian approach, and the proposed approach is illustrated using data from a drill and blast tunnel project. The proposed approach effectively tackles the problem of how to quantify, in a transparent manner, the model uncertainty that arises from using the GSI chart to characterize site-specific GSI.
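As a toy illustration of the underlying idea, one can treat chart-based GSI readings as noisy measurements of the site GSI, with the chart model uncertainty as measurement error, and combine them with prior knowledge. The conjugate normal update below is a heavily simplified, hypothetical stand-in for the paper's approach, which additionally models inherent spatial variability; all numbers are invented.

```python
import numpy as np
rng = np.random.default_rng(5)

chart_gsi = np.array([55.0, 60.0, 52.0, 58.0])  # chart estimates at the site
model_sd = 5.0                 # chart model uncertainty (assumed std. dev.)
prior_mean, prior_sd = 50.0, 10.0               # prior knowledge of site GSI

# Conjugate normal-normal update for the site-mean GSI
n = len(chart_gsi)
post_var = 1.0 / (1.0 / prior_sd**2 + n / model_sd**2)
post_mean = post_var * (prior_mean / prior_sd**2 + chart_gsi.sum() / model_sd**2)

samples = rng.normal(post_mean, np.sqrt(post_var), 10_000)  # "equivalent samples"
print(f"site GSI ~ {post_mean:.1f} +/- {np.sqrt(post_var):.1f}")
```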
Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion
NASA Astrophysics Data System (ADS)
Li, Z.; Ghaith, M.
2017-12-01
Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with regard to standard normal random variables. Subsequently, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration, etc.) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov-Chain Monte-Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy for complicated hydrological models. It can provide probabilistic forecasting in a more computationally efficient manner than the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
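A minimal example of the collocation idea, assuming a single uncertain parameter and a stand-in model (the study's PCE formulation and hydrological simulator are more elaborate): the model output is projected onto probabilists' Hermite polynomials in a standard normal variable, after which the output mean and variance follow directly from the coefficients.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He   # probabilists' Hermite basis

# Uncertain parameter k = mu + sigma*xi with xi ~ N(0,1); toy model output
mu, sigma, order = 2.0, 0.3, 4
model = lambda xi: np.exp(-(mu + sigma * xi)) + 0.5 * (mu + sigma * xi) ** 2

nodes, weights = He.hermegauss(order + 1)      # Gauss quadrature for N(0,1)
y = model(nodes)

# c_n = E[y He_n] / E[He_n^2], with E[He_n^2] = n! and sum(weights) = sqrt(2*pi)
coeffs = np.array([
    np.sum(weights * y * He.hermeval(nodes, [0] * n + [1]))
    / (np.sqrt(2 * np.pi) * math.factorial(n))
    for n in range(order + 1)
])

mean = coeffs[0]                               # E[y]
var = sum(coeffs[n] ** 2 * math.factorial(n) for n in range(1, order + 1))
print(f"mean = {mean:.4f}, variance = {var:.5f}")
```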
NASA Astrophysics Data System (ADS)
Delottier, H.; Pryet, A.; Lemieux, J. M.; Dupuy, A.
2018-05-01
Specific yield and groundwater recharge of unconfined aquifers are both essential parameters for groundwater modeling and sustainable groundwater development, yet the collection of reliable estimates of these parameters remains challenging. Here, a joint approach combining an aquifer test with application of the water-table fluctuation (WTF) method is presented to estimate these parameters and quantify their uncertainty. The approach requires two wells: an observation well instrumented with a pressure probe for long-term monitoring and a pumping well, located in the vicinity, for the aquifer test. The derivative of observed drawdown levels highlights the necessity to represent delayed drainage from the unsaturated zone when interpreting the aquifer test results. Groundwater recharge is estimated with an event-based WTF method in order to minimize the transient effects of flow dynamics in the unsaturated zone. The uncertainty on groundwater recharge is obtained by the propagation of the uncertainties on specific yield (Bayesian inference) and groundwater recession dynamics (regression analysis) through the WTF equation. A major portion of the uncertainty on groundwater recharge originates from the uncertainty on the specific yield. The approach was applied to a site in Bordeaux (France). Groundwater recharge was estimated to be 335 mm with an associated uncertainty of 86.6 mm at 2σ. By the use of cost-effective instrumentation and parsimonious methods of interpretation, the replication of such a joint approach should be encouraged to provide reliable estimates of specific yield and groundwater recharge over a region of interest. This is necessary to reduce the predictive uncertainty of groundwater management models.
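A back-of-the-envelope version of the uncertainty propagation described above, using Monte Carlo sampling and invented numbers (the Bordeaux values and the paper's inference machinery are not reproduced): uncertain specific yield and recession rate are pushed through the event-based WTF equation R = Sy * dH, where dH is the observed rise corrected for the antecedent recession trend.

```python
import numpy as np
rng = np.random.default_rng(1)

n = 10_000
Sy = rng.normal(0.08, 0.015, n)          # specific yield posterior (assumed)
recession = rng.normal(0.010, 0.002, n)  # recession rate, m/day (regression)
observed_rise, event_days = 0.35, 2.0    # rise in m, duration of the event

dH = observed_rise + recession * event_days  # extrapolate the recession
R = Sy * dH                                  # recharge per event, in m
print(f"R = {R.mean()*1000:.0f} mm +/- {2*R.std()*1000:.0f} mm (2 sigma)")
```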
NASA Astrophysics Data System (ADS)
Wasklewicz, Thad; Zhu, Zhen; Gares, Paul
2017-12-01
Rapid technological advances, sustained funding, and a greater recognition of the value of topographic data have helped develop an increasing archive of topographic data sources. Advances in basic and applied research related to Earth surface changes require researchers to integrate recent high-resolution topography (HRT) data with legacy datasets. Several technical challenges and data uncertainty issues persist when integrating legacy datasets with more recent HRT data. The disparate data sources required to extend the topographic record back in time are often stored in formats that are not readily compatible with more recent HRT data. Legacy data may also contain unknown or unreported error that makes accounting for data uncertainty difficult. There are also cases of known deficiencies in legacy datasets, which can significantly bias results. Finally, scientists are faced with the daunting challenge of definitively deriving the extent to which a landform or landscape has changed, or will continue to change, in response to natural and/or anthropogenic processes. Here, we examine the question: how do we evaluate and portray data uncertainty from the varied topographic legacy sources and combine this uncertainty with current spatial data collection techniques to detect meaningful topographic changes? We view topographic uncertainty as a stochastic process that takes into consideration spatial and temporal variations from a numerical simulation and a physical modeling experiment. The numerical simulation incorporates numerous topographic data sources typically found across a range of legacy to present high-resolution data, while the physical model focuses on more recent HRT data acquisition techniques. Elevation uncertainties observed at anchor points in the digital terrain models are modeled using "states" in a stochastic estimator. Stochastic estimators trace the temporal evolution of the uncertainties and are natively capable of incorporating sensor measurements observed at various times in history. The geometric relationship between an anchor point and a sensor measurement can be approximated via spatial correlation even when a sensor does not directly observe the anchor point. Findings from the numerical simulation indicate that the estimated error coincides with the actual error for certain sensors (kinematic GNSS, ALS, TLS, and SfM-MVS). Data from 2D imagery and static GNSS did not perform as well at the time the sensor is integrated into the estimator, largely as a result of the low density of data added from these sources. The estimator provides a history of DEM estimation as well as the uncertainties and cross correlations observed at anchor points. Our work provides preliminary evidence that our approach is valid for integrating legacy data with HRT and warrants further exploration and field validation.
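The anchor-point idea can be illustrated with a one-dimensional Kalman-style update, in which each new sensor measurement tightens the elevation uncertainty of a state; the numbers below are illustrative, not from the study.

```python
# Scalar Kalman-style update for one anchor-point elevation "state"
z_est, var_est = 102.40, 0.25 ** 2     # prior elevation (m) and variance
z_meas, var_meas = 102.55, 0.10 ** 2   # e.g., a TLS observation and its variance

K = var_est / (var_est + var_meas)     # Kalman gain
z_est = z_est + K * (z_meas - z_est)   # updated elevation estimate
var_est = (1 - K) * var_est            # reduced elevation uncertainty
print(z_est, var_est ** 0.5)
```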
Wu, Y.; Liu, S.
2012-01-01
Parameter optimization and uncertainty issues are a great challenge for the application of large environmental models like the Soil and Water Assessment Tool (SWAT), a physically based hydrological model for simulating water and nutrient cycles at the watershed scale. In this study, we present a comprehensive modeling environment for SWAT, including automated calibration and sensitivity and uncertainty analysis capabilities, through integration with the R package Flexible Modeling Environment (FME). To address challenges (e.g., calling the model in R and transferring variables between Fortran and R) in developing such a two-language coupling framework, 1) we converted the Fortran-based SWAT model to an R function (R-SWAT) using the RFortran platform, and alternatively 2) we compiled SWAT as a Dynamic Link Library (DLL). We then wrapped SWAT (via R-SWAT) with FME to perform complex applications including parameter identifiability, inverse modeling, and sensitivity and uncertainty analysis in the R environment. The final R-SWAT-FME framework has the following key functionalities: automatic initialization of R, running Fortran-based SWAT and R commands in parallel, transferring parameters and model output between SWAT and R, and inverse modeling with visualization. To examine this framework and demonstrate how it works, a case study simulating streamflow in the Cedar River Basin in Iowa in the United States was used, and we compared it with the built-in auto-calibration tool of SWAT for parameter optimization. Results indicate that both methods performed well and similarly in searching for a set of optimal parameters. Nonetheless, R-SWAT-FME is more attractive due to its instant visualization and its potential to take advantage of other R packages (e.g., for inverse modeling and statistical graphics). The methods presented in the paper are readily adaptable to other model applications that require capabilities for automated calibration and sensitivity and uncertainty analysis.
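The coupling pattern generalizes beyond R: expose the compiled model as a plain function, then hand that function to a generic calibration or uncertainty toolbox. The sketch below transposes the idea to Python/scipy with a mocked model; in practice run_model would write a parameter file, invoke the compiled executable (e.g., via subprocess), and read back its output.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic "observations" of streamflow-like data
t = np.linspace(0, 10, 50)
obs = 3.0 * np.exp(-0.4 * t) + 1.0 + np.random.default_rng(6).normal(0, 0.05, t.size)

def run_model(params):
    # Stand-in for the external watershed model; a real wrapper would do
    # file I/O and call the compiled executable here.
    a, b, c = params
    return a * np.exp(-b * t) + c

fit = least_squares(lambda p: run_model(p) - obs, x0=[1.0, 0.1, 0.0])
print(fit.x)   # calibrated parameters, recovered via inverse modeling
```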
Assessing climate change and socio-economic uncertainties in long term management of water resources
NASA Astrophysics Data System (ADS)
Jahanshahi, Golnaz; Dawson, Richard; Walsh, Claire; Birkinshaw, Stephen; Glenis, Vassilis
2015-04-01
Long term management of water resources is challenging for decision makers given the range of uncertainties that exist. Such uncertainties are a function of long term drivers of change, such as climate, environmental loadings, demography, land use and other socio-economic drivers. Impacts of climate change on the frequency of extreme events such as drought make it a serious threat to water resources and water security. The release of probabilistic climate information, such as the UKCP09 scenarios, provides improved understanding of some uncertainties in climate models. This has motivated a more rigorous approach to dealing with other uncertainties in order to understand the sensitivity of investment decisions to future uncertainty and to identify adaptation options that are as far as possible robust. We have developed and coupled a system of models that includes a weather generator, simulations of catchment hydrology, demand for water and the water resource system. This integrated model has been applied in the Thames catchment, which supplies the city of London, UK. This region is one of the driest in the UK and hence sensitive to water availability. In addition, it is one of the fastest growing parts of the UK and plays an important economic role. Key uncertainties in long term water resources in the Thames catchment, many of which result from earth system processes, are identified and quantified. The implications of these uncertainties are explored using a combination of uncertainty analysis and sensitivity testing. The analysis shows considerable uncertainty in future rainfall, river flow and consequently water resources. For example, results indicate that by the 2050s, low flow (Q95) in the Thames catchment will range from -44 to +9% compared with the control scenario (1970s). Consequently, by the 2050s the average number of drought days is expected to increase 4-6 times relative to the 1970s. Uncertainties associated with urban growth increase these risks further. Adaptation measures, such as new reservoirs, can manage these risks to a certain extent, but our sensitivity testing demonstrates that they are less robust to future uncertainties than measures taken to reduce water demand. Keywords: Climate change, Uncertainty, Decision making, Drought, Risk, Water resources management.
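For readers unfamiliar with the Q95 metric quoted above: it is the flow exceeded 95% of the time, i.e., the 5th percentile of the daily flow distribution. A short illustration with synthetic flows:

```python
import numpy as np

# Synthetic 30-year daily flow record (lognormal is a common rough stand-in)
flows = np.random.default_rng(7).lognormal(mean=3.0, sigma=0.6, size=365 * 30)

q95 = np.percentile(flows, 5)                    # flow exceeded 95% of the time
days_below = (flows < q95).mean() * 365          # ~18 days/year by construction
print(f"Q95 = {q95:.1f}, days below Q95 per year = {days_below:.0f}")
```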
NASA Astrophysics Data System (ADS)
Tsai, F. T.; Elshall, A. S.; Hanor, J. S.
2012-12-01
Subsurface modeling is challenging because of many possible competing propositions for each uncertain model component. How can we judge that we are selecting the correct proposition for an uncertain model component out of numerous competing propositions? How can we bridge the gap between synthetic mental constructs, such as mathematical expressions, on one hand, and empirical observations, such as observation data, on the other, when uncertainty exists on both sides? In this study, we introduce hierarchical Bayesian model averaging (HBMA) as a multi-model (multi-proposition) framework to represent our current state of knowledge and decision for hydrogeological structure modeling. The HBMA framework allows for segregating and prioritizing different sources of uncertainty, and for comparative evaluation of competing propositions for each source of uncertainty. We applied HBMA to a study of hydrostratigraphy and uncertainty propagation of the Southern Hills aquifer system in the Baton Rouge area, Louisiana. We used geophysical data for hydrogeological structure construction through the indicator hydrostratigraphy method and used lithologic data from drillers' logs for model structure calibration. However, due to uncertainty in model data, structure and parameters, multiple possible hydrostratigraphic models were produced and calibrated. The study considered four sources of uncertainty. To evaluate mathematical structure uncertainty, the study considered three different variogram models and two geological stationarity assumptions. With respect to geological structure uncertainty, the study considered two alternative structures for the Denham Springs-Scotlandville fault. With respect to data uncertainty, the study considered two calibration data sets. These four sources of uncertainty with their corresponding competing modeling propositions resulted in 24 calibrated models. The results showed that by segregating different sources of uncertainty, HBMA analysis provided insights on uncertainty priorities and propagation. In addition, it assisted in evaluating the relative importance of competing modeling propositions for each uncertain model component. By being able to dissect the uncertain model components and provide a weighted representation of the competing propositions for each uncertain model component based on the background knowledge, HBMA functions as an epistemic framework for advancing knowledge about the system under study.
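A minimal sketch of how competing calibrated models can be weighted and averaged, using an information-criterion approximation to model evidence with invented values; the HBMA framework arranges such weights hierarchically across the uncertainty sources rather than in the single flat layer shown here.

```python
import numpy as np

bic = np.array([120.3, 118.9, 125.1])       # one BIC per calibrated model
w = np.exp(-0.5 * (bic - bic.min()))        # relative model evidence
w /= w.sum()                                # posterior model weights

preds = np.array([2.1, 2.4, 1.8])           # each model's prediction
bma_mean = w @ preds                        # model-averaged prediction
bma_var = w @ (preds - bma_mean) ** 2       # between-model variance only;
                                            # within-model variance omitted here
print(w, bma_mean, bma_var)
```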
Predicting ecological responses in a changing ocean: the effects of future climate uncertainty.
Freer, Jennifer J; Partridge, Julian C; Tarling, Geraint A; Collins, Martin A; Genner, Martin J
2018-01-01
Predicting how species will respond to climate change is a growing field in marine ecology, yet knowledge of how to incorporate the uncertainty from future climate data into these predictions remains a significant challenge. To help overcome it, this review separates climate uncertainty into its three components (scenario uncertainty, model uncertainty, and internal model variability) and identifies four criteria that constitute a thorough interpretation of an ecological response to climate change in relation to these parts (awareness, access, incorporation, communication). Through a literature review, the extent to which the marine ecology community has addressed these criteria in its predictions was assessed. Despite a high awareness of climate uncertainty, articles favoured the most severe emission scenario, and only a subset of climate models were used as input into ecological analyses. In the case of sea surface temperature, these models can have projections that are unrepresentative of a larger ensemble mean. Moreover, 91% of studies failed to incorporate the internal variability of a climate model into their results. We explored the influence that the choice of emission scenario, climate model, and model realisation can have when predicting the future distribution of the pelagic fish Electrona antarctica. Future distributions were highly influenced by the choice of climate model, and in some cases internal variability was important in determining the direction and severity of the distribution change. Increased clarity and availability of processed climate data would facilitate more comprehensive explorations of climate uncertainty and increase the quality and standard of marine prediction studies.
How to deal with climate change uncertainty in the planning of engineering systems
NASA Astrophysics Data System (ADS)
Spackova, Olga; Dittes, Beatrice; Straub, Daniel
2016-04-01
The effect of extreme events such as floods on infrastructure and the built environment is associated with significant uncertainties: these include the uncertain effect of climate change, uncertainty in extreme event frequency estimation due to limited historic data and imperfect models, and, not least, uncertainty about future socio-economic developments, which determine the damage potential. One option for dealing with these uncertainties is the use of adaptable (flexible) infrastructure that can easily be adjusted in the future without excessive costs. The challenge is in quantifying the value of adaptability and in finding the optimal sequence of decisions. Is it worthwhile to build a (potentially more expensive) adaptable system that can be adjusted in the future depending on the conditions encountered? Or is it more cost-effective to make a conservative design without accounting for possible future changes to the system? What is the optimal timing of the decision to build/adjust the system? We develop a quantitative decision-support framework for evaluation of alternative infrastructure designs under uncertainty, which: • probabilistically models the uncertain future (through a Bayesian approach) • includes the adaptability of the systems (the costs of future changes) • takes into account the fact that future decisions will be made under uncertainty as well (using pre-posterior decision analysis) • allows identification of the optimal capacity and optimal timing to build/adjust the infrastructure. Application of the decision framework will be demonstrated on an example of flood mitigation planning in Bavaria.
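The flavor of the trade-off can be shown with a toy expected-cost comparison under invented numbers; a full pre-posterior analysis would additionally model the information that becomes available before the second-stage decision.

```python
# Toy comparison: adaptable design (cheap base + possible later upgrade)
# versus conservative design built robustly up front. All values invented.
p_severe = 0.4                          # P(climate turns out severe)
cost_adaptable = 10 + p_severe * 4      # base cost + expected upgrade cost
cost_conservative = 13                  # robust design, no later adjustment

best = "adaptable" if cost_adaptable < cost_conservative else "conservative"
print(best, cost_adaptable, cost_conservative)   # adaptable: 11.6 < 13
```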
Final Report---Optimization Under Nonconvexity and Uncertainty: Algorithms and Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeff Linderoth
2011-11-06
The goal of this work was to develop new algorithmic techniques for solving large-scale numerical optimization problems, focusing on problem classes that have proven to be among the most challenging for practitioners: those involving uncertainty and those involving nonconvexity. This research advanced the state of the art in solving mixed integer linear programs containing symmetry, mixed integer nonlinear programs, and stochastic optimization problems. The focus of the continuation work was on Mixed Integer Nonlinear Programs (MINLPs) and Mixed Integer Linear Programs (MILPs), especially those containing a great deal of symmetry.
Self-organization, embodiment, and biologically inspired robotics.
Pfeifer, Rolf; Lungarella, Max; Iida, Fumiya
2007-11-16
Robotics researchers increasingly agree that ideas from biology and self-organization can strongly benefit the design of autonomous robots. Biological organisms have evolved to perform and survive in a world characterized by rapid changes, high uncertainty, indefinite richness, and limited availability of information. Industrial robots, in contrast, operate in highly controlled environments with no or very little uncertainty. Although many challenges remain, concepts from biologically inspired (bio-inspired) robotics will eventually enable researchers to engineer machines for the real world that possess at least some of the desirable properties of biological organisms, such as adaptivity, robustness, versatility, and agility.
Generalized uncertainty principle and quantum gravity phenomenology
NASA Astrophysics Data System (ADS)
Bosso, Pasquale
The fundamental physical description of Nature is based on two mutually incompatible theories: Quantum Mechanics and General Relativity. Their unification in a theory of Quantum Gravity (QG) remains one of the main challenges of theoretical physics. Quantum Gravity Phenomenology (QGP) studies QG effects in low-energy systems. The basis of one such phenomenological model is the Generalized Uncertainty Principle (GUP), which is a modified Heisenberg uncertainty relation and predicts a deformed canonical commutator. In this thesis, we compute Planck-scale corrections to angular momentum eigenvalues, the hydrogen atom spectrum, the Stern-Gerlach experiment, and the Clebsch-Gordan coefficients. We then rigorously analyze the GUP-perturbed harmonic oscillator and study new coherent and squeezed states. Furthermore, we introduce a scheme for increasing the sensitivity of optomechanical experiments for testing QG effects. Finally, we suggest future projects that may potentially test QG effects in the laboratory.
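For reference, one widely used form of the GUP and its deformed commutator is shown below; conventions and the magnitude of the deformation parameter vary across the literature, so this is a representative rather than unique choice.

```latex
% Generalized uncertainty principle and the deformed canonical commutator;
% \beta is often written \beta_0/(M_{\mathrm{Pl}} c)^2 with \beta_0 of order unity.
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\Bigl[\,1+\beta\,(\Delta p)^{2}\Bigr],
\qquad
[\hat{x},\hat{p}] \;=\; i\hbar\,\bigl(1+\beta\,\hat{p}^{\,2}\bigr).
```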
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsao, Jeffrey Y.; Trucano, Timothy G.; Kleban, Stephen D.
This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016 on “Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification,” as well as of pre-work that fed into the workshop. The workshop’s intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community, and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real-time, by decision makers?
Active Learning Using Hint Information.
Li, Chun-Liang; Ferng, Chun-Sung; Lin, Hsuan-Tien
2015-08-01
The abundance of real-world data and limited labeling budgets call for active learning, an important learning paradigm for reducing human labeling effort. Many recently developed active learning algorithms consider both uncertainty and representativeness when making querying decisions. However, exploiting representativeness together with uncertainty usually requires tackling sophisticated and challenging learning tasks, such as clustering. In this letter, we propose a new active learning framework, called hinted sampling, which takes both uncertainty and representativeness into account in a simpler way. We design a novel active learning algorithm within the hinted sampling framework with an extended support vector machine. Experimental results validate that the novel active learning algorithm achieves better and more stable performance than state-of-the-art algorithms. We also show that the hinted sampling framework can improve another active learning algorithm designed from the transductive support vector machine.
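As a baseline for contrast with hinted sampling, plain margin-based uncertainty sampling with an SVM looks as follows; this is the standard baseline, not the hinted-sampling algorithm itself, and the data are synthetic.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
labeled = np.arange(20)                       # small initial labeled pool
unlabeled = np.arange(20, 200)                # pool of query candidates

clf = SVC(kernel="linear").fit(X[labeled], y[labeled])
margins = np.abs(clf.decision_function(X[unlabeled]))
query = unlabeled[np.argmin(margins)]         # point nearest the boundary,
print(query)                                  # i.e., the most uncertain one
```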
Creating Impact Functions to Estimate the Domestic Effects of Global Climate Action
Quantifying and monetizing the impacts of climate change can be challenging due to the complexity of impacts, availability of data, variability across geographic and temporal time scales, sources of uncertainty, and computational constraints. Recent advancements in using consist...
Data management to enhance long-term watershed research capacity: context and STEWARDS case study
USDA-ARS?s Scientific Manuscript database
Water resources are under growing pressure globally, and in the face of projected climate change, uncertainty about precipitation frequency and intensity, evapotranspiration, runoff, and snowmelt poses severe societal challenges. Interdisciplinary environmental research across natural and social sc...
ENVIRONMENTAL SYSTEMS MANAGEMENT, SUSTAINABILITY THEORY, AND THE CHALLENGE OF UNCERTAINTY
Environmental Systems Management is the management of environmental problems at the systems level, fully accounting for the multi-dimensional nature of the environment. This includes socio-economic dimensions as well as the usual physical and life science aspects. This is important...
Bezombes, Lucie; Gaucherand, Stéphanie; Kerbiriou, Christian; Reinert, Marie-Eve; Spiegelberger, Thomas
2017-08-01
In many countries, biodiversity compensation is required to counterbalance negative impacts of development projects on biodiversity by carrying out ecological measures, called offsets when the goal is to reach "no net loss" of biodiversity. One main issue is to ensure that offset gains are equivalent to impact-related losses. Ecological equivalence is assessed with equivalence assessment methods that take into account a range of key considerations, which we summarize as ecological, spatial, temporal, and uncertainty-related. When equivalence assessment methods take all considerations into account, we call them "comprehensive". Equivalence assessment methods should also aim to be science-based and operational, which is challenging. Many equivalence assessment methods have been developed worldwide, but none is fully satisfying. In the present study, we examine 13 equivalence assessment methods in order to identify (i) their general structure and (ii) the synergies and trade-offs between characteristics related to operationality, scientific basis and comprehensiveness (called "challenges" in this paper). We evaluate each equivalence assessment method on the basis of 12 criteria describing the level of achievement of each challenge. We observe that all equivalence assessment methods share a general structure, with possible improvements in the choice of target biodiversity, the indicators used, the integration of landscape context, and the multipliers reflecting time lags and uncertainties. We show that no equivalence assessment method combines all challenges perfectly. There are trade-offs between and within the challenges: operationality tends to be favored, while scientific bases are integrated heterogeneously in equivalence assessment method development. One way of improving the combination of challenges would be the use of offset-dedicated databases providing scientific feedback on previous offset measures.
NASA Astrophysics Data System (ADS)
Rodysill, J. R.
2017-12-01
Proxy-based reconstructions provide vital information for developing histories of environmental and climate changes. Networks of spatiotemporal paleoclimate information are powerful tools for understanding dynamical processes within the global climate system and improving model-based predictions of the patterns and magnitudes of climate changes at local- to global-scales. Compiling individual paleoclimate records and integrating reconstructed climate information in the context of an ensemble of multi-proxy records, which are fundamental for developing a spatiotemporal climate data network, are hindered by challenges related to data and information accessibility, chronological uncertainty, sampling resolution, climate proxy type, and differences between depositional environments. The U.S. Geological Survey (USGS) North American Holocene Climate Synthesis Working Group has been compiling and integrating multi-proxy paleoclimate data as part of an ongoing effort to synthesize Holocene climate records from North America. The USGS North American Holocene Climate Synthesis Working Group recently completed a late Holocene hydroclimate synthesis for the North American continent using several proxy types from a range of depositional environments, including lakes, wetlands, coastal marine, and cave speleothems. Using new age-depth relationships derived from the Bacon software package, we identified century-scale patterns of wetness and dryness for the past 2000 years with an age uncertainty-based confidence rating for each proxy record. Additionally, for highly-resolved North American lake sediment records, we computed average late Holocene sediment deposition rates and identified temporal trends in age uncertainty that are common to multiple lakes. This presentation addresses strengths and challenges of compiling and integrating data from different paleoclimate archives, with a particular focus on lake sediments, which may inform and guide future paleolimnological studies.
Weiss, David; Freund, Alexandra M; Wiese, Bettina S
2012-11-01
The present research focuses on two factors that might help or hinder women in coping with the uncertainties associated with developmental transitions in modern societies (i.e., starting one's first job, graduating from high school, reentering work after parental leave). We investigate (a) the role of openness to experience in coping with challenging transitions and (b) the (mal)adaptive consequences of adopting a traditional gender ideology. Starting with the assumption that transitional uncertainty has different consequences for women high or low in openness to experience, a first experiment (N = 61; 18-30 years) demonstrated that self-efficacy and well-being decreased among women low in openness after confrontation with transitional uncertainty. Two longitudinal studies investigated the (mal)adaptive consequences of adopting a traditional gender ideology for women high or low in openness in dealing with challenging transitions. Study 2 examined whether endorsing or rejecting traditional gender role beliefs might help female (but not male) students to maintain a sense of self-efficacy and subjective well-being during the transition of graduating from high school (N = 520, 17-22 years). Study 3 (N = 297; 20-53 years) tested the same model for women in middle adulthood during the transition from parental leave to reentry into work life. For both studies, latent growth analyses showed that endorsing traditional gender role beliefs contributed to self-efficacy and subjective well-being among women low in openness. By contrast, for women high in openness, rejecting traditional gender role beliefs had a positive effect on their relative level of self-efficacy and subjective well-being. Functions of ideologies in the context of challenging transitions are discussed.
Modelling Freshwater Resources at the Global Scale: Challenges and Prospects
NASA Technical Reports Server (NTRS)
Doll, Petra; Douville, Herve; Guntner, Andreas; Schmied, Hannes Muller; Wada, Yoshihide
2015-01-01
Quantification of spatially and temporally resolved water flows and water storage variations for all land areas of the globe is required to assess water resources, water scarcity and flood hazards, and to understand the Earth system. This quantification is done with the help of global hydrological models (GHMs). What are the challenges and prospects in the development and application of GHMs? Seven important challenges are presented. (1) Data scarcity makes quantification of human water use difficult even though significant progress has been achieved in the last decade. (2) Uncertainty of meteorological input data strongly affects model outputs. (3) The reaction of vegetation to changing climate and CO2 concentrations is uncertain and not taken into account in most GHMs that serve to estimate climate change impacts. (4) Reasons for discrepant responses of GHMs to changing climate have yet to be identified. (5) More accurate estimates of monthly time series of water availability and use are needed to provide good indicators of water scarcity. (6) Integration of gradient-based groundwater modelling into GHMs is necessary for a better simulation of groundwater-surface water interactions and capillary rise. (7) Detection and attribution of human interference with freshwater systems by using GHMs are constrained by data of insufficient quality but also GHM uncertainty itself. Regarding prospects for progress, we propose to decrease the uncertainty of GHM output by making better use of in situ and remotely sensed observations of output variables such as river discharge or total water storage variations by multi-criteria validation, calibration or data assimilation. Finally, we present an initiative that works towards the vision of hyper resolution global hydrological modelling where GHM outputs would be provided at a 1-km resolution with reasonable accuracy.
NASA Astrophysics Data System (ADS)
Arnbjerg-Nielsen, Karsten; Zhou, Qianqian
2014-05-01
There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Along with improved flood modelling technologies, this has enabled the development of economic assessments of climate change adaptation to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of current risk, drivers of change of risk over time, and measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from basic assumptions in the economic analysis and the hydrological model, but also from the projection of future societies onto local climate change impacts and suitable adaptation options. This presents a challenge to decision makers when trying to identify robust measures. We present an integrated uncertainty analysis, which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that, by means of Monte Carlo simulations of flood risk assessments, incorporates climate change impacts as a key driver of risk changes over time. The overall uncertainty is then attributed to six bulk processes: climate change impact, urban rainfall-runoff processes, stage-depth functions, unit cost of repair, cost of adaptation measures, and discount rate. We apply the approach to an urban hydrological catchment in Odense, Denmark, and find that the uncertainty on the climate change impact appears to have the least influence on the net present value of the studied adaptation measures. This does not imply that the climate change impact is not important, but that its uncertainties are not dominating when deciding on action or inaction. We then consider the uncertainty related to choosing between adaptation options given that a decision to act has been taken. In this case the major part of the uncertainty on the estimated net present values is identical for all adaptation options and will therefore not affect a comparison between adaptation measures. This makes the choice among the options easier. Furthermore, the explicit attribution of uncertainty also enables a reduction of the overall uncertainty by identifying the processes that contribute the most. This knowledge can then be used to further reduce the uncertainty related to decision making, as a substantial part of the remaining uncertainty is epistemic.
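A minimal sketch of such an uncertainty cascade, with invented distributions standing in for the bulk processes: each Monte Carlo draw samples the uncertain inputs and propagates them to the net present value of a hypothetical adaptation measure.

```python
import numpy as np
rng = np.random.default_rng(1)

n, horizon = 20_000, 50
climate_factor = rng.lognormal(np.log(1.3), 0.2, n)   # rainfall intensification
damage_now = rng.normal(2.0, 0.5, n)                  # M EUR/yr expected damage
cost = rng.normal(12.0, 2.0, n)                       # M EUR capital cost
rate = rng.uniform(0.01, 0.04, n)                     # discount rate

years = np.arange(1, horizon + 1)
avoided = damage_now[:, None] * (climate_factor[:, None] - 1)  # avoided damage/yr
npv = (avoided / (1 + rate[:, None]) ** years).sum(axis=1) - cost
print(f"P(NPV > 0) ~ {(npv > 0).mean():.2f}")
```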
NASA Astrophysics Data System (ADS)
Gogu, C.; Yin, W.; Haftka, R.; Ifju, P.; Molimard, J.; Le Riche, R.; Vautrin, A.
2010-06-01
A major challenge in the identification of material properties is handling different sources of uncertainty in the experiment and in the modelling of the experiment, and estimating the resulting uncertainty in the identified properties. Numerous improvements in identification methods have provided increasingly accurate estimates of various material properties. However, characterizing the uncertainty in the identified properties is still relatively crude. Different material properties obtained from a single test are not obtained with the same confidence. Typically the highest uncertainty is associated with the properties to which the experiment is the least sensitive. In addition, the uncertainty in different properties can be strongly correlated, so that obtaining only variance estimates may be misleading. A possible approach for handling the different sources of uncertainty and estimating the uncertainty in the identified properties is the Bayesian method. This method was introduced in the late 1970s in the context of identification [1] and has since been applied to different problems, notably the identification of elastic constants from plate vibration experiments [2]-[4]. The applications of the method to these classical pointwise tests involved only a small number of measurements (typically ten natural frequencies in the previously cited vibration tests), which facilitated the application of the Bayesian approach. For identifying elastic constants, full field strain or displacement measurements provide a high number of measured quantities (one measurement per image pixel) and hence a promise of smaller uncertainties in the properties. However, the high number of measurements also represents a major computational challenge in applying the Bayesian approach to full field measurements. To address this challenge we propose an approach based on the proper orthogonal decomposition (POD) of the full fields in order to drastically reduce their dimensionality. POD is based on projecting the full field images onto a modal basis, constructed from sample simulations, which can account for the variations of the full field as the elastic constants and other parameters of interest are varied. The fidelity of the decomposition depends on the number of basis vectors used. Typically even complex fields can be accurately represented with no more than a few dozen modes, and for our problem we showed that only four or five modes are sufficient [5]. To further reduce the computational cost of the Bayesian approach we use response surface approximations of the POD coefficients of the fields. We show that 3rd degree polynomial response surface approximations provide satisfying accuracy. The combination of POD decomposition and response surface methodology brings the computational time of the Bayesian identification down to a few days. The proposed approach is applied to Moiré interferometry full field displacement measurements from a traction experiment on a plate with a hole. The laminate, with a layup of [45,-45,0]s, is made of a Toray® T800/3631 graphite/epoxy prepreg. The measured displacement maps are provided in Figure 1. The mean values of the identified properties' joint probability density function are in agreement with previous identifications carried out on the same material. Furthermore, the probability density function also provides the coefficients of variation with which the properties are identified, as well as the correlations between the various properties.
We find that while the longitudinal Young's modulus is identified with good accuracy (low standard deviation), the Poisson's ratio is identified with much higher uncertainty. Several of the properties are also found to be correlated. The identified uncertainty structure of the elastic constants (i.e., the variance-covariance matrix) has potential benefits for reliability analyses, by allowing a more accurate description of the input uncertainty. An additional advantage of the Bayesian approach is that it provides a natural way (in the form of the prior probability density function) of accounting for prior information that may be available on the material properties. This is of great interest for reducing the uncertainty on properties that can only be determined with low confidence from the plate-with-a-hole experiment, such as the Poisson's ratio or the transverse Young's modulus in our case.
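The dimensionality-reduction step can be sketched compactly: build the POD basis from the singular value decomposition of a snapshot matrix of sample simulations, then represent any field by a handful of basis coefficients. Shapes and data below are synthetic stand-ins for simulated displacement fields.

```python
import numpy as np
rng = np.random.default_rng(8)

# Synthetic snapshots with low-rank structure: each "field" mixes a few
# smooth spatial modes (as material parameters vary), plus small noise.
npix, nsim = 10_000, 50
x = np.linspace(0, 1, npix)
modes = np.stack([np.sin((i + 1) * np.pi * x) for i in range(5)], axis=1)
snapshots = modes @ rng.normal(0, 1, (5, nsim)) \
            + 0.01 * rng.normal(0, 1, (npix, nsim))

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :5]                     # truncated POD basis (5 modes here)

field = snapshots[:, 0]              # a "measured" field
alpha = basis.T @ field              # its 5 POD coefficients
recon = basis @ alpha                # low-dimensional reconstruction
err = np.linalg.norm(field - recon) / np.linalg.norm(field)
print(f"relative reconstruction error: {err:.1e}")
```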
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yishen; Zhou, Zhi; Liu, Cong
2016-08-01
As more wind power and other renewable resources are integrated into the electric power grid, forecast uncertainty brings operational challenges for power system operators. In this report, different operational strategies for uncertainty management are presented and evaluated. A comprehensive and consistent simulation framework is developed to analyze the performance of different reserve policies and scheduling techniques under uncertainty in wind power. Numerical simulations are conducted on a modified version of the IEEE 118-bus system with a 20% wind penetration level, comparing deterministic, interval, and stochastic unit commitment strategies. The results show that stochastic unit commitment provides a reliable schedule without large increases in operational costs. Moreover, decomposition techniques, such as load shift factors and Benders decomposition, can help in overcoming the computational obstacles to stochastic unit commitment and enable the use of a larger scenario set to represent forecast uncertainty. In contrast, deterministic and interval unit commitment tend to give higher system costs as more reserves are scheduled to address forecast uncertainty; however, these approaches require a much lower computational effort. Choosing a proper lower bound for the forecast uncertainty is important for balancing reliability and system operational cost in deterministic and interval unit commitment. Finally, we find that the introduction of zonal reserve requirements improves reliability, but at the expense of higher operational costs.
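A toy contrast between deterministic and stochastic commitment conveys why the stochastic schedule tends to be cheaper in expectation; all loads, costs, and scenarios below are invented, and the real problem is a large mixed-integer program rather than this one-variable search.

```python
import numpy as np

scenarios = np.array([0.8, 1.0, 1.2])        # wind outturn vs. point forecast
probs = np.array([0.3, 0.4, 0.3])
load, wind_fcst = 100.0, 30.0                # MW

def dispatch_cost(committed, wind_mult):
    shortfall = max(0.0, load - wind_fcst * wind_mult - committed)
    return 20.0 * committed + 200.0 * shortfall   # fuel + load-shed penalty

candidates = range(0, 101, 5)
det = min(candidates, key=lambda c: dispatch_cost(c, 1.0))      # forecast only
sto = min(candidates, key=lambda c: probs @ np.array(
    [dispatch_cost(c, w) for w in scenarios]))                  # expected cost

for name, c in [("deterministic", det), ("stochastic", sto)]:
    exp_cost = probs @ np.array([dispatch_cost(c, w) for w in scenarios])
    print(name, c, round(exp_cost, 1))   # stochastic commits more, costs less
```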
NASA Astrophysics Data System (ADS)
Frieler, K.; Levermann, A.; Elliott, J.; Heinke, J.; Arneth, A.; Bierkens, M. F. P.; Ciais, P.; Clark, D. B.; Deryng, D.; Döll, P.; Falloon, P.; Fekete, B.; Folberth, C.; Friend, A. D.; Gellhorn, C.; Gosling, S. N.; Haddeland, I.; Khabarov, N.; Lomas, M.; Masaki, Y.; Nishina, K.; Neumann, K.; Oki, T.; Pavlick, R.; Ruane, A. C.; Schmid, E.; Schmitz, C.; Stacke, T.; Stehfest, E.; Tang, Q.; Wisser, D.; Huber, V.; Piontek, F.; Warszawski, L.; Schewe, J.; Lotze-Campen, H.; Schellnhuber, H. J.
2015-07-01
Climate change and its impacts already pose considerable challenges for societies that will further increase with global warming (IPCC, 2014a, b). Uncertainties of the climatic response to greenhouse gas emissions include the potential passing of large-scale tipping points (e.g. Lenton et al., 2008; Levermann et al., 2012; Schellnhuber, 2010) and changes in extreme meteorological events (Field et al., 2012) with complex impacts on societies (Hallegatte et al., 2013). Thus climate change mitigation is considered a necessary societal response for avoiding uncontrollable impacts (Conference of the Parties, 2010). On the other hand, large-scale climate change mitigation itself implies fundamental changes in, for example, the global energy system. The associated challenges come on top of others that derive from equally important ethical imperatives like the fulfilment of increasing food demand that may draw on the same resources. For example, ensuring food security for a growing population may require an expansion of cropland, thereby reducing natural carbon sinks or the area available for bio-energy production. So far, available studies addressing this problem have relied on individual impact models, ignoring uncertainty in crop model and biome model projections. Here, we propose a probabilistic decision framework that allows for an evaluation of agricultural management and mitigation options in a multi-impact-model setting. Based on simulations generated within the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP), we outline how cross-sectorally consistent multi-model impact simulations could be used to generate the information required for robust decision making. Using an illustrative future land use pattern, we discuss the trade-off between potential gains in crop production and associated losses in natural carbon sinks in the new multiple crop- and biome-model setting. In addition, crop and water model simulations are combined to explore irrigation increases as one possible measure of agricultural intensification that could limit the expansion of cropland required in response to climate change and growing food demand. This example shows that current impact model uncertainties pose an important challenge to long-term mitigation planning and must not be ignored in long-term strategic decision making.
NASA Astrophysics Data System (ADS)
Vicuna, S.; Scott, C. A.; Bonelli, S.; Bustos, E.; Meza, F. J.
2014-12-01
The Maipo basin holds 40% of Chile's total population and almost half of the country's Gross Domestic Product. The basin is located in the semiarid central region of the country and, aside from the typical pressures of growth in developing country basins, the Maipo river faces climate change impacts associated with a reduction in total runoff and changes in its seasonality. Surface water is the main water source for human settlements and economic activities including agriculture. In 2012 we started a research project to create a climate variability and climate change adaptation plan for the basin. The pillars of the plan are co-produced by researchers and a Scenario Building Team (SBT) composed of relevant water and land use stakeholders (from civil society and the public and private sectors) in the basin. Following similar experiences in other regions of the world that have faced the challenges of long term planning under uncertainty, the project has divided the task of developing the plan into a series of interconnected elements. A critical first component is to work on the desired vision(s) of the basin for the future. In this regard, the "water security" concept has been chosen as a framework that accommodates all objectives of the SBT members. Understanding and quantifying the uncertainties that could affect the future water security of the basin is another critical aspect of the plan. Near and long term climate scenarios are one dimension of these uncertainties, combined with baseline development uncertainties such as urban growth scenarios. A third component constructs the models/tools that allow the assessment of impacts on water security that could arise under these scenarios. The final critical component relates to the development of the adaptation measures that could avoid the negative impacts and/or capture the potential opportunities. After two years of developing the adaptation plan, a series of results has been achieved in all critical components; these are presented here. The success of the process now poses a series of new challenges, most importantly how to implement and monitor the evolution of the adaptation process.
Evaluation of the Uncertainty in JP-7 Kinetics Models Applied to Scramjets
NASA Technical Reports Server (NTRS)
Norris, A. T.
2017-01-01
One of the challenges of designing and flying a scramjet-powered vehicle is the difficulty of preflight testing. Ground tests at realistic flight conditions introduce several sources of uncertainty to the flow that must be addressed. For example, the scales of the available facilities limit the size of vehicles that can be tested and so performance metrics for larger flight vehicles must be extrapolated from ground tests at smaller scales. To create the correct flow enthalpy for higher Mach number flows, most tunnels use a heater that introduces vitiates into the flow. At these conditions, the effects of the vitiates on the combustion process is of particular interest to the engine designer, where the ground test results must be extrapolated to flight conditions. In this paper, the uncertainty of the cracked JP-7 chemical kinetics used in the modeling of a hydrocarbon-fueled scramjet was investigated. The factors that were identified as contributing to uncertainty in the combustion process were the level of flow vitiation, the uncertainty of the kinetic model coefficients and the variation of flow properties between ground testing and flight. The method employed was to run simulations of small, unit problems and identify which variables were the principal sources of uncertainty for the mixture temperature. Then using this resulting subset of all the variables, the effects of the uncertainty caused by the chemical kinetics on a representative scramjet flow-path for both vitiated (ground) and nonvitiated (flight) flows were investigated. The simulations showed that only a few of the kinetic rate equations contribute to the uncertainty in the unit problem results, and when applied to the representative scramjet flowpath, the resulting temperature variability was on the order of 100 K. Both the vitiated and clean air results showed very similar levels of uncertainty, and the difference between the mean properties were generally within the range of uncertainty predicted.
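The screening strategy described above can be mimicked with a Monte Carlo sweep over perturbed rate coefficients; the surrogate below is a hypothetical stand-in for the reacting-flow simulation, constructed so that only a few reactions drive the output spread, and all numbers are invented.

```python
import numpy as np
rng = np.random.default_rng(2)

n_rxn, n_samp = 20, 500
log_sd = 0.3                           # assumed log-scale uncertainty per rate
perturb = rng.normal(0.0, log_sd, (n_samp, n_rxn))

sensitivity = np.zeros(n_rxn)          # surrogate sensitivities (K per unit
sensitivity[[0, 3, 7]] = [60.0, 30.0, 10.0]  # log-rate); few rates dominate

T = 2400.0 + perturb @ sensitivity     # surrogate "mixture temperature", K
print(f"T = {T.mean():.0f} K +/- {T.std():.0f} K")
```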
A review of uncertainty visualization within the IPCC reports
NASA Astrophysics Data System (ADS)
Nocke, Thomas; Reusser, Dominik; Wrobel, Markus
2015-04-01
Results derived from climate model simulations confront non-expert users with a variety of uncertainties. This gives rise to the challenge that the scientific information must be communicated such that it can be easily understood, while the complexity of the science behind it is still conveyed. With respect to the assessment reports of the IPCC, the situation is even more complicated, because heterogeneous sources and multiple types of uncertainty need to be compiled together. Within this work, we systematically (1) analyzed the visual representation of uncertainties in the IPCC AR4 and AR5 reports, and (2) administered a questionnaire to evaluate how different user groups, such as decision-makers and teachers, understand these uncertainty visualizations. In the first step, we classified visual uncertainty metaphors for spatial, temporal and abstract representations. As a result, we identified a high complexity of the IPCC visualizations compared to standard presentation graphics, sometimes even integrating two or more uncertainty classes/measures together with the "certain" (mean) information. Furthermore, we identified complex written uncertainty explanations within image captions, even within the summary reports for policy makers. In the second step, based on these observations, we designed a questionnaire to investigate how non-climate experts understand these visual representations of uncertainties, how visual uncertainty coding might hinder the perception of the "non-uncertain" data, and whether alternatives to certain IPCC visualizations exist. In the talk/poster, we will present first results from this questionnaire. Summarizing, we identified a clear trend towards complex images within the latest IPCC reports, with a tendency to incorporate as much information as possible into the visual representations, resulting in proprietary, non-standard graphic representations that are not necessarily easy to comprehend at a glance. We conclude that further translation is required to (visually) present the IPCC results to non-experts, providing tailored static and interactive visualization solutions for different user groups.
NASA Astrophysics Data System (ADS)
Schwarz, Jakob; Kirchengast, Gottfried; Schwaerz, Marc
2018-05-01
Global Navigation Satellite System (GNSS) radio occultation (RO) observations are highly accurate, long-term stable data sets and are globally available as a continuous record from 2001. Essential climate variables for the thermodynamic state of the free atmosphere - such as pressure, temperature, and tropospheric water vapor profiles (involving background information) - can be derived from these records, which therefore have the potential to serve as climate benchmark data. However, to exploit this potential, atmospheric profile retrievals need to be very accurate and the remaining uncertainties quantified and traced throughout the retrieval chain from raw observations to essential climate variables. The new Reference Occultation Processing System (rOPS) at the Wegener Center aims to deliver such an accurate RO retrieval chain with integrated uncertainty propagation. Here we introduce and demonstrate the algorithms implemented in the rOPS for uncertainty propagation from excess phase to atmospheric bending angle profiles, for estimated systematic and random uncertainties, including vertical error correlations and resolution estimates. We estimated systematic uncertainty profiles with the same operators as used for the basic state profiles retrieval. The random uncertainty is traced through covariance propagation and validated using Monte Carlo ensemble methods. The algorithm performance is demonstrated using test day ensembles of simulated data as well as real RO event data from the satellite missions CHAllenging Minisatellite Payload (CHAMP); Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC); and Meteorological Operational Satellite A (MetOp). The results of the Monte Carlo validation show that our covariance propagation delivers correct uncertainty quantification from excess phase to bending angle profiles. The results from the real RO event ensembles demonstrate that the new uncertainty estimation chain performs robustly. Together with the other parts of the rOPS processing chain this part is thus ready to provide integrated uncertainty propagation through the whole RO retrieval chain for the benefit of climate monitoring and other applications.
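The core of such random-uncertainty tracing is linear covariance propagation, C_out = A C_in A^T, cross-checked against a Monte Carlo ensemble as in the validation described above. The sketch below uses a stand-in smoothing operator rather than the actual retrieval operators.

```python
import numpy as np
rng = np.random.default_rng(3)

m = 50
A = np.eye(m) + 0.5 * np.eye(m, k=1) + 0.5 * np.eye(m, k=-1)
A /= A.sum(axis=1, keepdims=True)          # stand-in retrieval/smoothing operator

idx = np.arange(m)                         # correlated input uncertainty
C_in = 0.04 * np.exp(-np.abs(np.subtract.outer(idx, idx)) / 5.0)
C_out = A @ C_in @ A.T                     # propagated covariance

samples = rng.multivariate_normal(np.zeros(m), C_in, size=5000)
C_mc = np.cov(samples @ A.T, rowvar=False) # Monte Carlo cross-check
print(np.abs(C_out - C_mc).max())          # should be small
```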
Uncertainty and inference in the world of paleoecological data
NASA Astrophysics Data System (ADS)
McLachlan, J. S.; Dawson, A.; Dietze, M.; Finley, M.; Hooten, M.; Itter, M.; Jackson, S. T.; Marlon, J. R.; Raiho, A.; Tipton, J.; Williams, J.
2017-12-01
Proxy data in paleoecology and paleoclimatology share a common set of biases and uncertainties: spatiotemporal error associated with the taphonomic processes of deposition, preservation, and dating; calibration error between proxy data and the ecosystem states of interest; and error in the interpolation of calibrated estimates across space and time. Researchers often account for this daunting suite of challenges by applying qualitative expert judgment: inferring the past states of ecosystems and assessing the level of uncertainty in those states subjectively. The effectiveness of this approach can be seen by the extent to which future observations confirm previous assertions. Hierarchical Bayesian (HB) statistical approaches allow an alternative approach to accounting for multiple uncertainties in paleo data. HB estimates of ecosystem state formally account for each of the common uncertainties listed above. HB approaches can readily incorporate additional data, and data of different types, into estimates of ecosystem state. Moreover, HB estimates of ecosystem state, with associated uncertainty, can be used to constrain forecasts of ecosystem dynamics based on mechanistic ecosystem models using data assimilation. Decisions about how to structure an HB model are also subjective, which creates a parallel framework for deciding how to interpret data from the deep past. Our group, the Paleoecological Observatory Network (PalEON), has applied hierarchical Bayesian statistics to formally account for uncertainties in proxy based estimates of past climate, fire, primary productivity, biomass, and vegetation composition. Our estimates often reveal new patterns of past ecosystem change, which is an unambiguously good thing, but we also often estimate a level of uncertainty that is uncomfortably high for many researchers. High levels of uncertainty are due to several features of the HB approach: spatiotemporal smoothing, the formal aggregation of multiple types of uncertainty, and a coarseness in statistical models of taphonomic process. Each of these features provides useful opportunities for statisticians and data-generating researchers to assess what we know about the signal and the noise in paleo data and to improve inference about past changes in ecosystem state.
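As a concrete illustration of the hierarchical idea, the minimal Gibbs sampler below pools noisy site-level "proxy" observations through a two-level normal model. The model, variances and data are invented for illustration and are far simpler than PalEON's actual models:

    import numpy as np

    rng = np.random.default_rng(1)
    # Synthetic proxy data: n_rep noisy observations of a state at each site.
    n_sites, n_rep = 8, 5
    true_mu, tau, sigma = 10.0, 2.0, 1.5      # hyper-mean, between/within sd
    theta_true = rng.normal(true_mu, tau, n_sites)
    y = rng.normal(theta_true[:, None], sigma, (n_sites, n_rep))

    # Gibbs sampler for theta_i | rest and mu | rest (sigma and tau assumed
    # known for brevity; real applications sample the variance components too).
    n_iter = 4000
    mu = y.mean()
    theta_draws = np.empty((n_iter, n_sites))
    for t in range(n_iter):
        prec = n_rep / sigma**2 + 1.0 / tau**2
        mean = (y.sum(axis=1) / sigma**2 + mu / tau**2) / prec
        theta = rng.normal(mean, 1.0 / np.sqrt(prec))
        mu = rng.normal(theta.mean(), tau / np.sqrt(n_sites))
        theta_draws[t] = theta

    post = theta_draws[500:]                   # drop burn-in
    print("site 0: posterior mean %.2f, 95%% CI (%.2f, %.2f), truth %.2f"
          % (post[:, 0].mean(), *np.quantile(post[:, 0], [0.025, 0.975]),
             theta_true[0]))

The posterior interval aggregates within-site (calibration-like) and between-site uncertainty formally, which is exactly why HB intervals can look uncomfortably wide compared with subjective expert assessments.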
FROM ASSESSMENT TO POLICY--LESSONS LEARNED FROM THE U.S. NATIONAL ASSESSMENT (Journal Article)
The process of translating scientific information into timely and useful insights that inform policy and resource management decisions, despite the existence of uncertainties, is a difficult and challenging task. Policy-focused assessment is one approach to achieving this end. ...
EU Water Governance: Striking the Right Balance between Regulatory Flexibility and Enforcement?
Considering the challenges and threats currently facing water management and the exacerbation of uncertainty by climate change, the need for flexible yet robust and legitimate environmental regulation is evident. The European Union took a novel approach toward sustainable water r...
Thriving in Complexity: A Framework for Leadership Education
ERIC Educational Resources Information Center
Watkins, Daryl; Earnhardt, Matthew; Pittenger, Linda; Roberts, Robin; Rietsema, Kees; Cosman-Ross, Janet
2017-01-01
Technological advances, globalization, network complexity, and social complexity complicate almost every aspect of our organizations and environments. Leadership educators are challenged with developing leaders who can sense environmental cues, adapt to rapidly changing contexts, and thrive in uncertainty while adhering to their values systems. In…
DOT National Transportation Integrated Search
2015-01-01
The growing uncertainty about oil prices and availability has made long-range transportation planning more challenging. Rather than relying on trend extrapolation, this study uses market mechanisms to evaluate key long-range transportation planni...
Generalized Read-Across (GenRA) prediction using chemical and biological information (BOSC)
Read-across is a popular data gap filling technique within category and analogue approaches for regulatory purposes. Acceptance of read-across remains a challenge with several efforts underway for identifying and addressing uncertainties. To date, these approaches have been quali...
Risk management consideration in the bioeconomy
Camilla Abbati de Assis; Ronalds Gonzalez; Stephen Kelley; Hasan Jameel; Ted Bilek; Jesse Daystar; Robert Handfield; Jay Golden; Jeff Prestemon; Damien Singh
2017-01-01
In investing in a new venture, companies aim to increase their competitiveness and generate value in scenarios where volatile markets, geopolitical instabilities, and disruptive technologies create uncertainty and risk. The biobased industry poses additional challenges as it competes in a mature, highly efficient market, dominated by...
Madhusoodhanan, C G; Sreeja, K G; Eldho, T I
2016-10-01
Climate change is a major concern in the twenty-first century and its assessments are associated with multiple uncertainties, exacerbated and confounded in the regions where human interventions are prevalent. The present study explores the challenges for climate change impact assessment on the water resources of India, one of the world's largest human-modified systems. The extensive human interventions in the Energy-Land-Water-Climate (ELWC) nexus significantly impact the water resources of the country. The direct human interventions in the landscape may surpass, amplify, or mask the impacts of climate change and in the process also affect climate change itself. Uncertainties in climate and resource assessments add to the challenge. Formulating coherent resource and climate change policies in India would therefore require an integrated approach that would assess the multiple interlinkages in the ELWC nexus and distinguish the impacts of global climate change from those of regional human interventions. Concerted research efforts are also needed to incorporate the prominent linkages in the ELWC nexus in climate/earth system modelling.
Calibration under uncertainty for finite element models of masonry monuments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Atamturktur, Sezer; Hemez, Francois; Unal, Cetin
2010-02-01
Historical unreinforced masonry buildings often include features such as load bearing unreinforced masonry vaults and their supporting framework of piers, fill, buttresses, and walls. The masonry vaults of such buildings are among the most vulnerable structural components and certainly among the most challenging to analyze. The versatility of finite element (FE) analyses in incorporating various constitutive laws, as well as practically all geometric configurations, has resulted in the widespread use of the FE method for the analysis of complex unreinforced masonry structures over the last three decades. However, an FE model is only as accurate as its input parameters, and there are two fundamental challenges in defining FE model input parameters: (1) material properties and (2) support conditions. The difficulties in defining these two aspects of the FE model arise from the lack of knowledge in the common engineering understanding of masonry behavior. As a result, engineers are unable to define these FE model input parameters with certainty, and, inevitably, uncertainties are introduced to the FE model.
[HTA-Perspective: Challenges in the early assessment of new oncological drugs].
Wild, Claudia; Nachtnebel, Anna
2013-01-01
Oncologic drug therapies have gained wide attention in the context of health policy priority setting for serious and socially significant diseases with high human and monetary costs. Due to uncertainties and scepticism about the actual therapeutic importance of newly approved oncology products, an early assessment programme was established in Austria as early as 2007. The assessment of new oncology products faces special challenges, since study populations are frequently not representative or the study design is laid out in such a manner that a definitive assessment of patient-relevant endpoints is not possible (cross-overs after interim analyses, surrogate parameters as primary endpoints, uncontrolled studies or those with unrealistic comparators, non-validated post-hoc identified biomarkers). On account of these major uncertainties, even the European Medicines Agency (EMA) is already contemplating multi-stage, "adaptive" approvals, and national reimbursement institutions are increasingly working with outcome-oriented, conditional reimbursement. (As supplied by publisher). Copyright © 2013. Published by Elsevier GmbH.
Challenges, uncertainties, and issues facing gas production from gas-hydrate deposits
Moridis, G.J.; Collett, T.S.; Pooladi-Darvish, M.; Hancock, S.; Santamarina, C.; Boswell, R.; Kneafsey, T.; Rutqvist, J.; Kowalsky, M.B.; Reagan, M.T.; Sloan, E.D.; Sum, A.K.; Koh, C.A.
2011-01-01
The current paper complements the Moridis et al. (2009) review of the status of the effort toward commercial gas production from hydrates. We aim to describe the concept of the gas-hydrate (GH) petroleum system; to discuss advances, requirements, and suggested practices in GH prospecting and GH deposit characterization; and to review the associated technical, economic, and environmental challenges and uncertainties, which include the following: accurate assessment of producible fractions of the GH resource; development of methods for identifying suitable production targets; sampling of hydrate-bearing sediments (HBS) and sample analysis; analysis and interpretation of geophysical surveys of GH reservoirs; well-testing methods; interpretation of well-testing results; geomechanical and reservoir/well stability concerns; well design, operation, and installation; field operations and extending production beyond sand-dominated GH reservoirs; monitoring production and geomechanical stability; laboratory investigations; fundamental knowledge of hydrate behavior; the economics of commercial gas production from hydrates; and associated environmental concerns. © 2011 Society of Petroleum Engineers.
A Framework for Reliability and Safety Analysis of Complex Space Missions
NASA Technical Reports Server (NTRS)
Evans, John W.; Groen, Frank; Wang, Lui; Austin, Rebekah; Witulski, Art; Mahadevan, Nagabhushan; Cornford, Steven L.; Feather, Martin S.; Lindsey, Nancy
2017-01-01
Long duration and complex mission scenarios are characteristics of NASA's human exploration of Mars, and will provide unprecedented challenges. Systems reliability and safety will become increasingly demanding and management of uncertainty will be increasingly important. NASA's current pioneering strategy recognizes and relies upon assurance of crew and asset safety. In this regard, flexibility to develop and innovate in the emergence of new design environments and methodologies, encompassing modeling of complex systems, is essential to meet the challenges.
Alegría, C Aramburu
2010-12-01
• Transsexual persons are increasing their visibility in society, and health care providers and others (such as social workers) will be called upon to help with issues that transsexual persons face. Challenges that face transsexual persons often include issues involving relationships. Psychiatric and mental health nurses and other caregivers can increase their therapeutic skills in working with couples that include transsexual persons by becoming aware of these challenges and subsequent activities that can help with them. • This research study looks at couple relationships in which one partner reveals male-to-female transsexual identity. These are relationships that were established as man-woman and now will transition into relationships that include a male-to-female person and a female partner. • Common challenges for these couples include issues related to: (1) sexual identity and relationship uncertainty; (2) male-to-female transition decision making; and (3) presenting in public. • Relationship maintenance activities that helped the couples in the study maintain and strengthen their relationships through these challenges include: (1) communication; (2) self-talk (for example, putting the situation in perspective); (3) social networks; (4) positive interactions; (5) impression management (for example, managing displays of affection in public); and (6) social activism. This qualitative study describes the relational dynamics that help sustain relationships of couples that include male-to-female transsexual persons (MTF) and their natal female partners (NF) following disclosure of transsexualism. Relationship challenges and relationship maintenance activities are identified. Each partner in 17 MTF-NF couples participated in individual surveys and interviews. The data were coded for themes related to relationship challenges and activities. MTF-NF couples experience challenges within the contexts of their relationships and of society. These challenges include: (1) sexual identity and relationship uncertainty; (2) male-to-female transition decision making; and (3) public presentation. Relationship maintenance activities enabled the study couples to maintain and strengthen their relationships through these challenges. These activities include: (1) communication; (2) self-talk; (3) social networks; (4) positivity; (5) impression management; and (6) social activism. Via this report, psychiatric and mental health nurses can increase their therapeutic skills in working with MTF-NF couples. © 2010 Blackwell Publishing.
Joint Knowledge Generation Between Climate Science and Infrastructure Engineering
NASA Astrophysics Data System (ADS)
Stoner, A. M. K.; Hayhoe, K.; Jacobs, J. M.
2015-12-01
Over the past decade the engineering community has become increasingly aware of the need to incorporate climate projections into the planning and design of sensitive infrastructure. However, this is a task that is easier said than done. This presentation will discuss some of the successes and hurdles experienced over the past year, from a climate scientist's perspective, working with engineers in infrastructure research and applied engineering through the Infrastructure & Climate Network (ICNet). Engineers rely on strict building codes and ordinances, and can be the subject of lawsuits if those codes are not followed. Matters are further complicated by the uncertainty inherent in climate projections, which includes short-term natural variability as well as the influence of scientific uncertainty and even human behavior on the rate and magnitude of change. Climate scientists typically address uncertainty by creating projections based on multiple models following different future scenarios. This uncertainty is difficult to incorporate into engineering projects, however, because engineers cannot build two different bridges, one allowing for a lower amount of change and another for a higher amount. More often than not there is a considerable difference between the costs of two such designs, which means that available funds often become the deciding factor. Discussions of climate science are often well received by engineers who work in infrastructure research; going a step further and implementing climate science in applied engineering projects, however, can be challenging. This presentation will discuss some of the challenges and opportunities inherent in collaborations between climate scientists and transportation engineers, drawing from a range of studies including truck weight restrictions on roads during the spring thaw and bridge deck performance under environmental forcings.
NASA Astrophysics Data System (ADS)
Moss, R. H.
2017-12-01
Assessments of potential impacts of, and adaptations to, global environmental change evaluate the continuously evolving state of science through the lens of relevance to challenges such as planning long-lived infrastructure and managing risks to property, ecosystems, public health, and other valued assets or objectives. These planning and decision contexts present varied challenges, including: multiple attributes at risk from interacting environmental and socioeconomic trends; uncertainties (scientific and otherwise); partial solutions with indefinite costs and benefits; and tradeoffs across stakeholder groups. Research and evaluation of assessments indicate they convey information that is more usable and relevant to decision makers if they are designed as sustained interactions of pertinent scientific and user communities and result in products beyond written reports. This talk will report on the work of a Federal Advisory Committee for the Sustained National Climate Assessment (SNCA) to develop recommendations to increase the SNCA's relevance and usability. The recommendations build on the conclusions of a 2013 report by the predecessor SNCA advisory committee and suggest next steps for (1) engagement, (2) provision of core scientific products, (3) tailoring of information and tools to provide insights under uncertainty, and (4) evaluation of products and outcomes. The recommended process focuses on providing insights relevant to consideration of risks and solutions. While resulting in a wide range of products and outcomes on an ongoing basis, aggregation and assessment of emerging insights and good practice for supporting decision making under uncertainty would recur over a four-year adaptive management cycle in the context of the preparation of the US national assessment report mandated under the Global Change Research Act. Uncertainty about the future role of Federal agencies in the assessment process and opportunities for increased engagement by non-Federal actors will be considered.
NASA Astrophysics Data System (ADS)
Li, Xiaojun; Li, Yandong; Chang, Ching-Fu; Tan, Benjamin; Chen, Ziyang; Sege, Jon; Wang, Changhong; Rubin, Yoram
2018-01-01
Modeling of uncertainty associated with subsurface dynamics has long been a major research topic, and its significance is widely recognized for real-life applications. Despite the huge effort invested in the area, major obstacles still remain on the way from theory to applications. Particularly problematic is the confusion between modeling uncertainty and modeling spatial variability, which translates into a misconception, in fact an inconsistency: it suggests that modeling of uncertainty and modeling of spatial variability are equivalent and, as such, require a lot of data. This paper investigates this challenge against the backdrop of a 7 km long, deep underground tunnel in China, where environmental impacts are of major concern. We approach the data challenge by pursuing a new concept for Rapid Impact Modeling (RIM), which bypasses altogether the need to estimate posterior distributions of model parameters, focusing instead on detailed stochastic modeling of impacts, conditional on all information available, including prior, ex-situ information as well as in-situ measurements. A foundational element of RIM is the construction of informative priors for target parameters using ex-situ data, relying on ensembles of well-documented sites pre-screened for geological and hydrological similarity to the target site. The ensembles are built around two sets of similarity criteria: a physically based set of criteria and an additional set covering epistemic criteria. In another variation from common Bayesian practice, we update the priors to obtain conditional distributions of the target (environmental impact) dependent variables rather than the hydrological variables. This recognizes that goal-oriented site characterization is in many cases more useful in applications than parameter-oriented characterization.
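The prior-from-similar-sites idea can be sketched compactly. In the toy example below, an informative prior on log-conductivity is built from a hypothetical ensemble of geologically similar sites, updated with a few in-situ measurements, and pushed forward to an impact variable; the site values, noise level and impact relation are all invented for illustration:

    import numpy as np

    rng = np.random.default_rng(2)
    # Ex-situ data: log10 hydraulic conductivity from sites pre-screened as
    # geologically similar to the target site (values are hypothetical).
    logK_exsitu = np.array([-6.1, -5.8, -6.4, -5.9, -6.2, -6.0, -5.7, -6.3])
    m0, s0 = logK_exsitu.mean(), logK_exsitu.std(ddof=1)  # normal prior

    # Sparse in-situ measurements at the target site, assumed noise sd.
    logK_insitu = np.array([-6.25, -6.05, -6.35])
    s_meas = 0.15
    n = len(logK_insitu)

    # Conjugate normal update: posterior for logK at the target site.
    prec = 1 / s0**2 + n / s_meas**2
    m_post = (m0 / s0**2 + logK_insitu.sum() / s_meas**2) / prec
    s_post = 1 / np.sqrt(prec)

    # Push the posterior forward to the impact variable directly (the RIM
    # idea: characterize the impact, not the parameter). Toy impact model:
    # tunnel inflow per unit length q = K * i * w (gradient i, width w assumed).
    K = 10 ** rng.normal(m_post, s_post, 20000)
    q = K * 50.0 * 10.0               # i and w: illustrative constants
    print("posterior logK: %.2f +/- %.2f" % (m_post, s_post))
    print("impact q, 95%% interval: (%.2e, %.2e) m2/s"
          % tuple(np.quantile(q, [0.025, 0.975])))

The point of the sketch is the workflow, not the numbers: the ensemble of similar sites supplies the informative prior, and the final product is a conditional distribution of the impact variable rather than of the hydrological parameter.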
Traffic Flow Management Wrap-Up
NASA Technical Reports Server (NTRS)
Grabbe, Shon
2011-01-01
Traffic Flow Management involves the scheduling and routing of air traffic subject to airport and airspace capacity constraints, and the efficient use of available airspace. Significant challenges in this area include: (1) weather integration and forecasting, (2) accounting for user preferences in the Traffic Flow Management decision making process, and (3) understanding and mitigating the environmental impacts of air traffic. To address these challenges, researchers in the Traffic Flow Management area are developing modeling, simulation and optimization techniques to route and schedule air traffic flights and flows while accommodating user preferences, accounting for system uncertainties and considering the environmental impacts of aviation. This presentation will highlight some of the major challenges facing researchers in this domain, while also showcasing recent innovations designed to address these challenges.
Estimation of Uncertainties in Stage-Discharge Curve for an Experimental Himalayan Watershed
NASA Astrophysics Data System (ADS)
Kumar, V.; Sen, S.
2016-12-01
Various water resource projects developed on rivers originating from the Himalayan region, the "Water Tower of Asia", play an important role in downstream development. Flow measurements at the desired river site are critical for river engineers and hydrologists for water resources planning and management, flood forecasting, reservoir operation and flood inundation studies. However, accurate discharge assessment of these mountainous rivers is costly, tedious and frequently dangerous to operators during flood events. Currently, in India, discharge estimation is based on the stage-discharge relationship known as the rating curve. This relationship is affected by a high degree of uncertainty. Estimating the uncertainty of the rating curve remains a relevant challenge because it is not easy to parameterize. The main sources of rating curve uncertainty are errors in discharge measurement, variation in hydraulic conditions, and errors in depth measurement. In this study our objective is to obtain the rating curve parameters that best fit the limited record of observations and to estimate the uncertainties at different depths obtained from the rating curve. The rating curve parameters of the standard power law are estimated for three different streams of the Aglar watershed, located in the lesser Himalayas, by a maximum-likelihood estimator. Quantification of uncertainties in the developed rating curves is obtained from the estimated variances and covariances of the rating curve parameters. Results showed that the uncertainties varied with catchment behavior, with errors between 0.006 and 1.831 m3/s. Discharge uncertainty in the Aglar watershed streams depends significantly on the extent of extrapolation outside the range of observed water levels. The extrapolation analysis confirmed that extrapolation beyond about 15% above the maximum and 5% below the minimum observed discharges is not recommended for these mountainous gauging sites.
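A minimal sketch of such a fit is shown below, using the standard power-law form Q = a(h - h0)^b with synthetic gaugings (all numbers invented). The delta-method variance J C J^T illustrates how the estimated parameter variances and covariances translate into discharge uncertainty, and how that uncertainty grows under extrapolation:

    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(3)

    def rating(h, a, h0, b):
        # Q = a (h - h0)^b; clip keeps (h - h0) positive during fitting.
        return a * np.clip(h - h0, 1e-6, None) ** b

    # Synthetic stage-discharge gaugings (stage in m, discharge in m3/s).
    h = np.linspace(0.4, 2.0, 25)
    Q = rating(h, 5.0, 0.2, 1.6) * rng.lognormal(0.0, 0.05, h.size)

    popt, pcov = curve_fit(rating, h, Q, p0=[4.0, 0.1, 1.5])
    print("a, h0, b     =", popt.round(3))
    print("parameter sd =", np.sqrt(np.diag(pcov)).round(3))

    # Delta method: var(Q*) ~= J pcov J^T with J the gradient at stage h*.
    def q_sd(hstar, eps=1e-6):
        J = np.empty(3)
        for i in range(3):
            p1, p2 = popt.copy(), popt.copy()
            p1[i] += eps; p2[i] -= eps
            J[i] = (rating(hstar, *p1) - rating(hstar, *p2)) / (2 * eps)
        return float(np.sqrt(J @ pcov @ J))

    for hstar in (1.0, 2.5):            # 2.5 m extrapolates beyond the data
        print("h=%.1f m: Q=%.2f +/- %.2f m3/s"
              % (hstar, rating(hstar, *popt), q_sd(hstar)))

Running the sketch shows the predictive standard deviation widening at stages beyond the gauged range, consistent with the caution against extrapolation above.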
NASA Astrophysics Data System (ADS)
Schwabe, O.; Shehab, E.; Erkoyuncu, J.
2015-08-01
The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates, based on a literature review, an evaluation of publicly funded projects such as those within the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, the ESA, and various commercial companies. The metrics are categorized based on their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). The review shows that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of the available relevant information, as defined by technical and cost readiness level, and on the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and the number of life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various families of uncertainty quantification metrics, with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of theoretically grounded guidance for the selection of uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis for future work in this field.
Barazzetti Barbieri, Cristina; de Souza Sarkis, Jorge Eduardo
2018-07-01
The forensic interpretation of environmental analytical data is usually challenging due to the high geospatial variability of these data. Measurement uncertainty includes contributions from sampling and from sample handling and preparation, contributions that are often disregarded in the quality assurance of analytical results. A pollution crime investigation case was used to develop a methodology able to address these uncertainties in two different environmental compartments, freshwater sediments and landfill leachate. The uncertainty was estimated with the duplicate method (which replicates predefined steps of the measurement procedure in order to assess its precision), and the parameters used to investigate the pollution were metals (Cr, Cu, Ni, and Zn) in the leachate, the suspected source, and in the sediment, the possible sink. The metal analysis results were compared to statutory limits, and it was demonstrated that Cr and Ni concentrations in sediment samples exceeded the threshold levels at all sites downstream of the pollution sources, considering the expanded uncertainty U of the measurements and a probability of contamination >0.975 at most sites. Cu and Zn concentrations were above the statutory limits at two sites, but the classification was inconclusive considering the uncertainties of the measurements. Metal analyses in leachate revealed that Cr concentrations were above the statutory limits with a probability of contamination >0.975 in all leachate ponds, while the Cu, Ni and Zn probability of contamination was below 0.025. The results demonstrated that the estimation of the sampling uncertainty, which was the dominant component of the combined uncertainty, is required for a comprehensive interpretation of environmental analysis results, particularly in forensic cases. Copyright © 2018 Elsevier B.V. All rights reserved.
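A minimal sketch of the duplicate method's variance decomposition, assuming a balanced design with two field duplicates per site and two analyses per sample; the concentrations below are invented, not the case values:

    import numpy as np

    # Duplicate-method data (hypothetical): Cr in sediment, mg/kg.
    # Shape: (site, field duplicate, analytical duplicate).
    x = np.array([
        [[92.0, 95.0], [101.0, 98.0]],
        [[110.0, 113.0], [104.0, 108.0]],
        [[88.0, 86.0], [95.0, 93.0]],
        [[120.0, 117.0], [111.0, 115.0]],
    ])

    # Analytical variance: pooled variance of the analysis pairs.
    s2_anal = x.var(axis=2, ddof=1).mean()

    # Sampling variance: the spread of field-duplicate means within a site
    # carries sampling variance plus half the analytical variance
    # (each mean averages two analyses).
    sample_means = x.mean(axis=2)                 # (site, duplicate)
    s2_samp = max(sample_means.var(axis=1, ddof=1).mean() - s2_anal / 2, 0.0)

    s_meas = np.sqrt(s2_samp + s2_anal)           # combined measurement sd
    U = 2 * s_meas                                # expanded uncertainty, k=2
    print("s_sampling=%.2f  s_analytical=%.2f  U(k=2)=%.2f mg/kg"
          % (np.sqrt(s2_samp), np.sqrt(s2_anal), U))
    print("sampling share of measurement variance: %.0f%%"
          % (100 * s2_samp / (s2_samp + s2_anal)))

With these invented numbers, as in the study, the sampling component dominates the combined uncertainty, which is precisely the contribution that conventional analytical quality assurance leaves out.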
Chasing Perfection: Should We Reduce Model Uncertainty in Carbon Cycle-Climate Feedbacks
NASA Astrophysics Data System (ADS)
Bonan, G. B.; Lombardozzi, D.; Wieder, W. R.; Lindsay, K. T.; Thomas, R. Q.
2015-12-01
Earth system model simulations of the terrestrial carbon (C) cycle show large multi-model spread in the carbon-concentration and carbon-climate feedback parameters. Large differences among models are also seen in their simulation of global vegetation and soil C stocks and other aspects of the C cycle, prompting concern about model uncertainty and our ability to faithfully represent fundamental aspects of the terrestrial C cycle in Earth system models. Benchmarking analyses that compare model simulations with common datasets have been proposed as a means to assess model fidelity with observations, and various model-data fusion techniques have been used to reduce model biases. While such efforts will reduce multi-model spread, they may not help reduce uncertainty (and increase confidence) in projections of the C cycle over the twenty-first century. Many ecological and biogeochemical processes represented in Earth system models are poorly understood at both the site scale and across large regions, where biotic and edaphic heterogeneity are important. Our experience with the Community Land Model (CLM) suggests that large uncertainty in the terrestrial C cycle and its feedback with climate change is an inherent property of biological systems. The challenge of representing life in Earth system models, with the rich diversity of lifeforms and complexity of biological systems, may necessitate a multitude of modeling approaches to capture the range of possible outcomes. Such models should encompass a range of plausible model structures. We distinguish between model parameter uncertainty and model structural uncertainty. Focusing on improved parameter estimates may, in fact, limit progress in assessing model structural uncertainty associated with realistically representing biological processes. Moreover, higher confidence may be achieved through better process representation, but this does not necessarily reduce uncertainty.
Communicating spatial uncertainty to non-experts using R
NASA Astrophysics Data System (ADS)
Luzzi, Damiano; Sawicka, Kasia; Heuvelink, Gerard; de Bruin, Sytze
2016-04-01
Effective visualisation methods are important for the efficient use of uncertainty information by various groups of users. Uncertainty propagation analysis is often used with spatial environmental models to quantify the uncertainty within the information. A challenge arises when trying to effectively communicate the uncertainty information to non-experts (not statisticians) in a wide range of cases. Due to the growing popularity and applicability of the open source programming language R, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. The package implements Monte Carlo algorithms for uncertainty propagation, the output of which is represented by an ensemble of model outputs (i.e. a sample from a probability distribution). Numerous visualisation methods exist that aim to present such spatial uncertainty information statically, dynamically and interactively. To provide the most universal visualisation tools for non-experts, we surveyed a group of 20 university students and assessed the effectiveness of selected static and interactive methods for visualising uncertainty in spatial variables such as DEM and land cover. The static methods included adjacent maps and glyphs for continuous variables. Both allow for displaying maps with information about the ensemble mean, variance/standard deviation and prediction intervals. Adjacent maps were also used for categorical data, displaying maps of the most probable class, as well as its associated probability. The interactive methods included a graphical user interface, which in addition to displaying the previously mentioned variables also allowed for comparison of joint uncertainties at multiple locations. The survey indicated that users could understand the basics of the uncertainty information displayed in the static maps, with the interactive interface allowing for more in-depth information. Subsequently, the R package includes a collation of the plotting functions that were evaluated in the survey. The static visualisations are implemented via calls to the 'ggplot2' package, giving the user control over the content, legend, colours, axes and titles. The interactive methods are implemented using the 'shiny' package, allowing users to activate the visualisation of statistical descriptions of uncertainty through interaction with a plotted map of means. This research brings uncertainty visualisation to a broader audience through the development of tools for visualising uncertainty using open source software.
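The package itself is written in R; the following Python sketch shows the generic Monte Carlo propagation pattern it implements and the ensemble summaries (mean, standard deviation, prediction interval) that feed such uncertainty maps. The input raster and toy model are invented:

    import numpy as np

    rng = np.random.default_rng(4)
    ny, nx, n_mc = 40, 60, 500

    # Uncertain input raster: a mean map plus a per-cell sd map (a real
    # analysis would sample spatially correlated error fields).
    dem_mean = 100 + 20 * rng.random((ny, nx))
    dem_sd = 2.0 + 1.0 * rng.random((ny, nx))

    # Monte Carlo propagation through a toy model (slope-like operator).
    out = np.empty((n_mc, ny, nx))
    for k in range(n_mc):
        dem = rng.normal(dem_mean, dem_sd)
        out[k] = np.abs(np.gradient(dem)[0])   # "model output" per realization

    # Ensemble summaries used for the uncertainty maps.
    m = out.mean(axis=0)
    s = out.std(axis=0)
    lo, hi = np.quantile(out, [0.05, 0.95], axis=0)
    print("output mean over map: %.2f" % m.mean())
    print("mean width of 90%% prediction interval: %.2f" % (hi - lo).mean())

The mean map m and the spread maps s and (lo, hi) are exactly the layers that adjacent-map and glyph visualisations juxtapose for non-expert audiences.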
Bayesian analysis of input uncertainty in hydrological modeling: 2. Application
NASA Astrophysics Data System (ADS)
Kavetski, Dmitri; Kuczera, George; Franks, Stewart W.
2006-03-01
The Bayesian total error analysis (BATEA) methodology directly addresses both input and output errors in hydrological modeling, requiring the modeler to make explicit, rather than implicit, assumptions about the likely extent of data uncertainty. This study considers a BATEA assessment of two North American catchments: (1) French Broad River and (2) Potomac basins. It assesses the performance of the conceptual Variable Infiltration Capacity (VIC) model with and without accounting for input (precipitation) uncertainty. The results show the considerable effects of precipitation errors on the predicted hydrographs (especially the prediction limits) and on the calibrated parameters. In addition, the performance of BATEA in the presence of severe model errors is analyzed. While BATEA allows a very direct treatment of input uncertainty and yields some limited insight into model errors, it requires the specification of valid error models, which are currently poorly understood and require further work. Moreover, it leads to computationally challenging, high-dimensional problems. For some types of models, including the VIC implemented using robust numerical methods, the computational cost of BATEA can be reduced using Newton-type methods.
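The following toy sketch (not BATEA itself) illustrates its central move: storm rainfall multipliers are treated as latent variables and marginalized in the likelihood, instead of treating observed rainfall as exact. The one-parameter "model" and all numbers are invented:

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(5)
    n_storms = 40
    phi_true, sig_mult, sig_out = 0.6, 0.2, 0.5

    # Truth: Q = phi * P_true; we only observe P_obs = P_true * multiplier.
    P_true = rng.gamma(4.0, 5.0, n_storms)
    P_obs = P_true * rng.lognormal(0.0, sig_mult, n_storms)
    Q_obs = phi_true * P_true + rng.normal(0.0, sig_out, n_storms)

    phis = np.linspace(0.3, 0.9, 121)

    # (a) Ignore input error: likelihood treats P_obs as exact.
    ll_naive = np.array([norm.logpdf(Q_obs, p * P_obs, sig_out).sum()
                         for p in phis])

    # (b) BATEA-style: marginalize latent multipliers m ~ lognormal(0, sig)
    # by Monte Carlo, so rainfall uncertainty enters the likelihood.
    M = rng.lognormal(0.0, sig_mult, (2000, 1))
    def ll_latent(p):
        like = norm.pdf(Q_obs, p * P_obs / M, sig_out).mean(axis=0)
        return np.log(like + 1e-300).sum()
    ll_batea = np.array([ll_latent(p) for p in phis])

    for name, ll in (("naive", ll_naive), ("latent-input", ll_batea)):
        w = np.exp(ll - ll.max()); w /= w.sum()
        mean = (w * phis).sum()
        sd = np.sqrt((w * (phis - mean) ** 2).sum())
        print("%12s: E[phi]=%.3f  sd=%.3f" % (name, mean, sd))
    print("true phi =", phi_true)

With the latent multipliers included, the posterior for the parameter is typically wider and less biased than the naive fit, mirroring the effects on prediction limits and calibrated parameters reported above.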
Uncertainty, ensembles and air quality dispersion modeling: applications and challenges
NASA Astrophysics Data System (ADS)
Dabberdt, Walter F.; Miller, Erik
The past two decades have seen significant advances in mesoscale meteorological modeling research and applications, such as the development of sophisticated and now widely used advanced mesoscale prognostic models, large eddy simulation models, four-dimensional data assimilation, adjoint models, adaptive and targeted observational strategies, and ensemble and probabilistic forecasts. Some of these advances are now being applied to urban air quality modeling and applications. Looking forward, it is anticipated that the high-priority air quality issues for the near-to-intermediate future will likely include: (1) routine operational forecasting of adverse air quality episodes; (2) real-time high-level support to emergency response activities; and (3) quantification of model uncertainty. Special attention is focused here on the quantification of model uncertainty through the use of ensemble simulations. Application to emergency-response dispersion modeling is illustrated using an actual event that involved the accidental release of the toxic chemical oleum. Both surface footprints of mass concentration and the associated probability distributions at individual receptors are seen to provide valuable quantitative indicators of the range of expected concentrations and their associated uncertainty.
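The ensemble-to-probability step described above can be sketched as follows, using a textbook Gaussian plume with perturbed wind speed and direction; the source strength, dispersion coefficients, receptors and threshold are illustrative assumptions, not values from the oleum event:

    import numpy as np

    rng = np.random.default_rng(6)

    def plume(xp, yp, q, u, sy, sz):
        # Ground-level concentration, ground-level continuous point source.
        return q / (np.pi * u * sy * sz) * np.exp(-yp**2 / (2 * sy**2))

    n_ens = 1000
    q = 2.0                                    # emission rate (kg/s), assumed
    receptors = np.array([[500., 0.], [1000., 100.], [2000., 250.]])  # m
    threshold = 1e-4                           # kg/m3, illustrative

    conc = np.empty((n_ens, len(receptors)))
    for k in range(n_ens):
        u = max(rng.normal(4.0, 1.0), 0.5)         # wind speed member
        theta = rng.normal(0.0, np.deg2rad(15.0))  # wind direction member
        for j, (xr, yr) in enumerate(receptors):
            # Rotate the receptor into this member's plume frame.
            xp = xr * np.cos(theta) + yr * np.sin(theta)
            yp = -xr * np.sin(theta) + yr * np.cos(theta)
            if xp <= 0:
                conc[k, j] = 0.0
                continue
            sy, sz = 0.08 * xp**0.9, 0.06 * xp**0.9  # dispersion, assumed
            conc[k, j] = plume(xp, yp, q, u, sy, sz)

    for j, (xr, yr) in enumerate(receptors):
        p = (conc[:, j] > threshold).mean()
        print("receptor (%4.0f,%4.0f) m: median %.2e, P(C > %.0e) = %.2f"
              % (xr, yr, np.median(conc[:, j]), threshold, p))

The per-receptor exceedance probabilities are the quantitative indicators of expected concentration range and uncertainty that the abstract describes for emergency-response use.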
Oliveira, José Egídio; Mendonça, Marina; Coimbra, Susana; Fontaine, Anne Marie
2014-12-01
In a familistic southern European society such as the Portuguese, the family has historically played a prominent role in supporting the negotiation of transition pathways into adulthood. The present study aimed at capturing (1) the relative weight of parental financial support and autonomy support in contributing to the youngsters' psychological well-being (PWB), and (2) the mediating role of identity capital and uncertainty management in this relationship. A total of 620 participants completed measures of parental support, identity capital, uncertainty management and PWB. Autonomy support was found to be the strongest predictor of PWB, both directly and indirectly through its effects on identity capital and the use of target focused uncertainty management strategies. Conversely, financial support evidenced only a minor indirect impact through the mediation of tangible identity capital. Autonomy stimulation may constitute one of the most developmentally determinant family challenges in assisting the process of coming of age in Portugal. Copyright © 2014 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.
Reduced Uncertainties in the Flutter Analysis of the Aerostructures Test Wing
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Lung, Shun-fat
2010-01-01
Tuning the finite element model using measured data to minimize the model uncertainties is a challenging task in the area of structural dynamics. A test-validated finite element model can provide a reliable flutter analysis to define the flutter placard speed to which the aircraft can be flown prior to flight flutter testing. Minimizing the difference between numerical and experimental results is a type of optimization problem. Through the use of the National Aeronautics and Space Administration Dryden Flight Research Center's (Edwards, California, USA) multidisciplinary design, analysis, and optimization tool to optimize the objective function and constraints, the mass properties, the natural frequencies, and the mode shapes are matched to the target data and the mass matrix orthogonality is retained. The approach in this study has been applied to minimize the model uncertainties for the structural dynamic model of the aerostructures test wing, which was designed, built, and tested at the National Aeronautics and Space Administration Dryden Flight Research Center. A 25-percent change in flutter speed has been shown after reducing the uncertainties.
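The underlying optimization problem can be illustrated with a toy stand-in: tune the stiffness parameters of a small spring-mass model so its natural frequencies match measured targets. This is a hedged sketch of the idea only; the actual tool also matches mode shapes and mass properties and enforces mass-matrix orthogonality:

    import numpy as np
    from scipy.optimize import minimize

    # Toy stand-in for an FE model: 3-DOF spring-mass chain with unit masses,
    # so natural frequencies come from the eigenvalues of the stiffness matrix.
    def freqs(k):
        k1, k2, k3 = k
        K = np.array([[k1 + k2, -k2, 0.0],
                      [-k2, k2 + k3, -k3],
                      [0.0, -k3, k3]])
        return np.sqrt(np.abs(np.linalg.eigvalsh(K))) / (2 * np.pi)

    f_target = np.array([0.14, 0.40, 0.58])      # "measured" frequencies, Hz

    def objective(k):                            # squared frequency mismatch
        return float(np.sum((freqs(k) - f_target) ** 2))

    res = minimize(objective, x0=[1.0, 1.0, 1.0],
                   bounds=[(0.1, 10.0)] * 3, method="L-BFGS-B")
    print("tuned stiffnesses :", res.x.round(3))
    print("model frequencies :", freqs(res.x).round(3))
    print("target frequencies:", f_target)

The real problem adds constraints (e.g., mass matrix orthogonality) and many more design variables, but the structure, a squared mismatch between model outputs and test data minimized over model parameters, is the same.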
Apostol, Izydor; Kelner, Drew; Jiang, Xinzhao Grace; Huang, Gang; Wypych, Jette; Zhang, Xin; Gastwirt, Jessica; Chen, Kenneth; Fodor, Szilan; Hapuarachchi, Suminda; Meriage, Dave; Ye, Frank; Poppe, Leszek; Szpankowski, Wojciech
2012-12-01
The objective of this work is to predict precision and other performance characteristics of chromatographic purity methods, which represent the most widely used form of analysis in the biopharmaceutical industry. We have conducted a comprehensive survey of purity methods and show that all performance characteristics fall within narrow measurement ranges. This observation was used to develop a model called Uncertainty Based on Current Information (UBCI), which expresses these performance characteristics as a function of the signal and noise levels, hardware specifications, and software settings. We applied the UBCI model to assess the uncertainty of purity measurements, and compared the results to those from conventional qualification. We demonstrated that the UBCI model is suitable for dynamically assessing method performance characteristics based on information extracted from individual chromatograms. The model provides an opportunity for streamlining qualification and validation studies by implementing a "live validation" of test results, utilizing UBCI as a concurrent assessment of measurement uncertainty. UBCI can therefore potentially mitigate the challenges associated with laborious conventional method validation and facilitate the introduction of more advanced analytical technologies during the method lifecycle.
Addressing uncertainty in atomistic machine learning.
Peterson, Andrew A; Christensen, Rune; Khorshidi, Alireza
2017-05-10
Machine-learning regression has been demonstrated to precisely emulate the potential energy and forces that are output from more expensive electronic-structure calculations. However, to predict new regions of the potential energy surface, an assessment must be made of the credibility of the predictions. In this perspective, we address the types of errors that might arise in atomistic machine learning, describe the unique aspects of atomistic simulations that make machine learning challenging, and highlight how uncertainty analysis can be used to assess the validity of machine-learning predictions. We suggest this will allow researchers to more fully use machine learning for the routine acceleration of large, high-accuracy, or extended-time simulations. In our demonstrations, we use a bootstrap ensemble of neural network-based calculators, and show that the width of the ensemble can provide an estimate of the uncertainty when the width is comparable to that in the training data. Intriguingly, we also show that the uncertainty can be localized to specific atoms in the simulation, which may offer hints for the generation of training data to strategically improve the machine-learned representation.
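A minimal sketch of the bootstrap-ensemble idea, using scikit-learn MLP regressors on a toy one-dimensional potential rather than the authors' actual neural-network calculators:

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.utils import resample

    rng = np.random.default_rng(7)
    # Toy "training images": energies along one coordinate (a stand-in for
    # atomic-environment descriptors).
    X = rng.uniform(-2.0, 2.0, (60, 1))
    y = (X[:, 0] ** 2 + 0.1 * np.sin(5 * X[:, 0])
         + rng.normal(0.0, 0.02, 60))

    # Bootstrap ensemble: each member is trained on a resampled training set.
    members = []
    for b in range(10):
        Xb, yb = resample(X, y, random_state=b)
        net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                           random_state=b).fit(Xb, yb)
        members.append(net)

    # Ensemble mean is the prediction; the spread is the uncertainty estimate.
    X_test = np.linspace(-3.0, 3.0, 7).reshape(-1, 1)  # extends beyond data
    preds = np.array([m.predict(X_test) for m in members])
    for xv, mu, sd in zip(X_test[:, 0], preds.mean(0), preds.std(0)):
        flag = " <- extrapolation" if abs(xv) > 2 else ""
        print("x=%5.2f  E=%6.3f +/- %.3f%s" % (xv, mu, sd, flag))

On points beyond the training range, the ensemble spread grows, flagging predictions that should not be trusted, which is the behavior the abstract exploits.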
Creation and Validation of Sintered PTFE BRDF Targets & Standards
Durell, Christopher; Scharpf, Dan; McKee, Greg; L’Heureux, Michelle; Georgiev, Georgi; Obein, Gael; Cooksey, Catherine
2016-01-01
Sintered polytetrafluoroethylene (PTFE) is an extremely stable, near-perfect Lambertian reflecting diffuser and calibration standard material that has been used by national labs, space, aerospace and commercial sectors for over two decades. New uncertainty targets of 2 % on-orbit absolute validation in the Earth Observing Systems community have challenged the industry to improve its characterization and knowledge of almost every aspect of radiometric performance (space and ground). Assuming “near perfect” reflectance for angular dependent measurements is no longer going to suffice for many program needs. The total hemispherical spectral reflectance provides a good mark of general performance; but, without the angular characterization of bidirectional reflectance distribution function (BRDF) measurements, critical data is missing from many applications and uncertainty budgets. Therefore, traceable BRDF measurement capability is needed to characterize sintered PTFE’s angular response and provide a full uncertainty profile to users. This paper presents preliminary comparison measurements of the BRDF of sintered PTFE from several laboratories to better quantify the BRDF of sintered PTFE, assess the BRDF measurement comparability between laboratories, and improve estimates of measurement uncertainties under laboratory conditions. PMID:26900206
Earth Observation, Spatial Data Quality, and Neglected Tropical Diseases.
Hamm, Nicholas A S; Soares Magalhães, Ricardo J; Clements, Archie C A
2015-12-01
Earth observation (EO) is the use of remote sensing and in situ observations to gather data on the environment. It finds increasing application in the study of environmentally modulated neglected tropical diseases (NTDs). Obtaining and assuring the quality of the relevant spatially and temporally indexed EO data remain challenges. Our objective was to review the Earth observation products currently used in studies of NTD epidemiology and to discuss fundamental issues relating to spatial data quality (SDQ), which limit the utilization of EO and pose challenges for its more effective use. We searched Web of Science and PubMed for studies related to EO and echinococcosis, leptospirosis, schistosomiasis, and soil-transmitted helminth infections. Relevant literature was also identified from the bibliographies of those papers. We found that extensive use is made of EO products in the study of NTD epidemiology; however, the quality of these products is usually given little explicit attention. We review key issues in SDQ concerning spatial and temporal scale, uncertainty, and the documentation and use of quality information. We give examples of how these issues may interact with uncertainty in NTD data to affect the output of an epidemiological analysis. We conclude that researchers should give careful attention to SDQ when designing NTD spatial-epidemiological studies. This should be used to inform uncertainty analysis in the epidemiological study. SDQ should be documented and made available to other researchers.
Lightning Impacts on Airports - Challenges of Balancing Safety & Efficiency
NASA Astrophysics Data System (ADS)
Steiner, Matthias; Deierling, Wiebke; Nelson, Eric; Stone, Ken
2013-04-01
Thunderstorms and lightning pose a safety risk to personnel working outdoors, such as people maintaining airport grounds (e.g., mowing grass or repairing runway lighting) or servicing aircraft on ramps (handling baggage, food service, refueling, tugging and guiding aircraft from/to gates, etc.). Since lightning strikes can cause serious injuries or death, it is important to provide timely alerts to airport personnel so that they can get to safety when lightning is imminent. This presentation discusses the challenges and uncertainties involved in using lightning information and stakeholder procedures to ensure safety of outdoor personnel while keeping ramp operations as efficient as possible considering thunderstorm impacts. The findings presented are based on extensive observations of airline operators under thunderstorm impacts. These observations reveal a complex picture with substantial uncertainties related to the (1) source of lightning information (e.g., sensor type, network, data processing) used to base ramp closure decisions on, (2) uncertainties involved in the safety procedures employed by various stakeholders across the aviation industry (yielding notably different rules being applied by multiple airlines even at a single airport), and (3) human factors issues related to the use of decision support tools and the implementation of safety procedures. This research is supported by the United States Federal Aviation Administration (FAA). The views expressed are those of the authors and do not necessarily represent the official policy or position of the FAA.
NASA Astrophysics Data System (ADS)
Bloom, A. Anthony; Lauvaux, Thomas; Worden, John; Yadav, Vineet; Duren, Riley; Sander, Stanley P.; Schimel, David S.
2016-12-01
Understanding the processes controlling terrestrial carbon fluxes is one of the grand challenges of climate science. Carbon cycle process controls are readily studied at local scales, but integrating local knowledge across extremely heterogeneous biota, landforms and climate space has proven to be extraordinarily challenging. Consequently, top-down or integral flux constraints at process-relevant scales are essential to reducing process uncertainty. Future satellite-based estimates of greenhouse gas fluxes - such as CO2 and CH4 - could potentially provide the constraints needed to resolve biogeochemical process controls at the required scales. Our analysis is focused on Amazon wetland CH4 emissions, which amount to a scientifically crucial and methodologically challenging case study. We quantitatively derive the observing system (OS) requirements for testing wetland CH4 emission hypotheses at a process-relevant scale. To distinguish between hypothesized hydrological and carbon controls on Amazon wetland CH4 production, a satellite mission will need to resolve monthly CH4 fluxes at a ˜ 333 km resolution and with a ≤ 10 mg CH4 m-2 day-1 flux precision. We simulate a range of low-earth orbit (LEO) and geostationary orbit (GEO) CH4 OS configurations to evaluate the ability of these approaches to meet the CH4 flux requirements. Conventional LEO and GEO missions resolve monthly ˜ 333 km Amazon wetland fluxes at a 17.0 and 2.7 mg CH4 m-2 day-1 median uncertainty level. Improving LEO CH4 measurement precision by
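A back-of-envelope calculation shows why measurement density drives such requirements: if single-sounding flux-equivalent errors are random and independent, they average down as 1/sqrt(N) per grid cell and month. The per-sounding error below is a hypothetical placeholder, and real observing-system simulations must also account for correlated errors and the transport inversion, so 1/sqrt(N) is optimistic:

    import numpy as np

    # Requirement from the study: monthly fluxes at ~333 km resolution with
    # <= 10 mg CH4 m-2 day-1 precision. Per-sounding number below is assumed.
    req = 10.0                    # mg CH4 m-2 day-1
    sigma_single = 120.0          # flux-equivalent error of one sounding

    n_needed = int(np.ceil((sigma_single / req) ** 2))
    print("soundings needed per cell-month (pure 1/sqrt(N)):", n_needed)

    for n_per_month in (10, 50, 144, 500):   # candidate sampling densities
        print("N=%4d -> precision %.1f mg m-2 day-1"
              % (n_per_month, sigma_single / np.sqrt(n_per_month)))

This scaling is why the LEO and GEO configurations, with their very different revisit and sounding densities, deliver such different median flux uncertainties.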
Bayesian network learning for natural hazard assessments
NASA Astrophysics Data System (ADS)
Vogel, Kristin
2016-04-01
Even though quite different in occurrence and consequences, from a modelling perspective many natural hazards share similar properties and challenges. Their complex nature as well as lacking knowledge about their driving forces and potential effects make their analysis demanding. On top of the uncertainty about the modelling framework, inaccurate or incomplete event observations and the intrinsic randomness of the natural phenomenon add up to different interacting layers of uncertainty, which require careful handling. Thus, for reliable natural hazard assessments it is crucial not only to capture and quantify the involved uncertainties, but also to express and communicate them in an intuitive way. Decision-makers, who often find it difficult to deal with uncertainties, might otherwise return to familiar (mostly deterministic) proceedings. In the scope of the DFG research training group "NatRiskChange" we apply the probabilistic framework of Bayesian networks for diverse natural hazard and vulnerability studies. The great potential of Bayesian networks was already shown in previous natural hazard assessments. Treating each model component as a random variable, Bayesian networks aim at capturing the joint distribution of all considered variables. Hence, each conditional distribution of interest (e.g. the effect of precautionary measures on damage reduction) can be inferred. The (in-)dependencies between the considered variables can be learned purely from data or be given by experts. Even a combination of both is possible. By translating the (in-)dependencies into a graph structure, Bayesian networks provide direct insights into the workings of the system and allow us to learn about the underlying processes. Despite numerous studies on the topic, learning Bayesian networks from real-world data remains challenging. In previous studies, e.g. on earthquake-induced ground motion and flood damage assessments, we tackled the problems arising with continuous variables and incomplete observations. Further studies raise the challenge of relying on very small data sets. Since parameter estimates for complex models based on few observations are unreliable, it is necessary to focus on simplified, yet still meaningful models. A so-called Markov blanket approach is developed to identify the most relevant model components and to construct a simple Bayesian network based on those findings. Since the procedure is completely data driven, it can easily be transferred to various applications in natural hazard domains. This study is funded by the Deutsche Forschungsgemeinschaft (DFG) within the research training programme GRK 2043/1 "NatRiskChange - Natural hazards and risks in a changing world" at Potsdam University.
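A Bayesian network in this spirit can be written down in a few lines. The toy model below (structure and probability tables invented) performs exact inference by enumeration for a flood-damage question of the kind mentioned above:

    import itertools

    # Variables (all binary): HeavyRain -> Flood -> Damage <- Precaution.
    # Conditional probability tables are invented for illustration.
    p_rain = {1: 0.3, 0: 0.7}
    p_flood = {(1,): {1: 0.6, 0: 0.4}, (0,): {1: 0.05, 0: 0.95}}  # | rain
    p_prec = {1: 0.5, 0: 0.5}
    p_damage = {                                 # | (flood, precaution)
        (1, 1): {1: 0.3, 0: 0.7}, (1, 0): {1: 0.8, 0: 0.2},
        (0, 1): {1: 0.02, 0: 0.98}, (0, 0): {1: 0.05, 0: 0.95},
    }

    def joint(r, f, pr, d):
        return (p_rain[r] * p_flood[(r,)][f] * p_prec[pr]
                * p_damage[(f, pr)][d])

    def prob_damage(given_prec):
        # P(Damage=1 | Precaution) by enumerating the hidden variables.
        num = sum(joint(r, f, given_prec, 1)
                  for r, f in itertools.product((0, 1), repeat=2))
        den = sum(joint(r, f, given_prec, d)
                  for r, f, d in itertools.product((0, 1), repeat=3))
        return num / den

    print("P(damage | precaution)    = %.3f" % prob_damage(1))
    print("P(damage | no precaution) = %.3f" % prob_damage(0))

In the studies described, the structure and tables are instead learned from data or elicited from experts, and a Markov blanket (a node's parents, children and co-parents) screens out all other variables, which is what makes it a natural basis for the simplified models mentioned above.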
Neo-Positivist Intrusions, Post-Qualitative Challenges, and PAR's Generative Indeterminacies
ERIC Educational Resources Information Center
Miller, Janet L.
2017-01-01
Although committed to PAR's overarching aspirations, many advocates also have noted myriad complexities of engaging in PAR, where ambiguities and disarrays--all kinds of inconclusive evidence--can proliferate. Uncertainties especially can erupt if PAR education-focused projects are positioned, oxymoronically, as expected to produce "high…
The Role of Adaptability in Tackling Climate and Environmental Challenges
ERIC Educational Resources Information Center
Martin, Andrew J.; Liem, Gregory Arief D.
2015-01-01
Adaptability is our capacity to respond to change, uncertainty, and variability. We report on recent research investigating how young people's adaptability is related to their environmental awareness, environmental concerns, and pro-environmental attitudes that support the need for policy and action to sustain the environment.
Model Uncertainty and Bayesian Model Averaged Benchmark Dose Estimation for Continuous Data
The benchmark dose (BMD) approach has gained acceptance as a valuable risk assessment tool, but risk assessors still face significant challenges associated with selecting an appropriate BMD/BMDL estimate from the results of a set of acceptable dose-response models. Current approa...
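A hedged sketch of the model-averaging idea (not the specific approach of this work): fit two candidate dose-response models, weight them by AIC, and average the resulting benchmark doses. The models, benchmark response and data are invented, and a full Bayesian treatment would average posterior distributions and report a BMDL rather than a point estimate:

    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(8)
    dose = np.repeat([0.0, 1.0, 3.0, 10.0, 30.0], 6)
    resp = 10.0 * (1 + 0.04 * dose ** 0.8) + rng.normal(0.0, 0.6, dose.size)

    # Two candidate mean models (invented), relative to background level a.
    models = {
        "linear": (lambda d, a, b: a * (1 + b * d),
                   [10.0, 0.05], ([0.0, 1e-6], [100.0, 1.0])),
        "power":  (lambda d, a, b, c: a * (1 + b * d ** c),
                   [10.0, 0.05, 1.0], ([0.0, 1e-6, 0.1], [100.0, 1.0, 4.0])),
    }
    bmr = 0.10                    # benchmark response: 10% relative change

    results = {}
    for name, (f, p0, bnds) in models.items():
        p, _ = curve_fit(f, dose, resp, p0=p0, bounds=bnds)
        rss = float(np.sum((resp - f(dose, *p)) ** 2))
        n, k = dose.size, len(p) + 1             # +1 for error variance
        aic = n * np.log(rss / n) + 2 * k
        # BMD solves f(BMD) = a*(1 + bmr): analytic for both forms.
        bmd = bmr / p[1] if name == "linear" else (bmr / p[1]) ** (1.0 / p[2])
        results[name] = (aic, bmd)

    aics = np.array([a for a, _ in results.values()])
    w = np.exp(-(aics - aics.min()) / 2.0)
    w /= w.sum()
    for (name, (aic, bmd)), wi in zip(results.items(), w):
        print("%6s: AIC=%8.2f  weight=%.2f  BMD=%.2f" % (name, aic, wi, bmd))
    print("model-averaged BMD = %.2f"
          % sum(wi * b for wi, (_, b) in zip(w, results.values())))

Averaging over models in this way sidesteps the single-model selection problem the abstract points to, at the cost of having to defend the candidate set and the weighting scheme.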
Extant process-based hydrologic and water quality models are indispensable to water resources planning and environmental management. However, models are only approximations of real systems and often calibrated with incomplete and uncertain data. Reliable estimates, or perhaps f...
Measurement of Emissions from Produced Water Ponds: Upstream Oil and Gas Study #1; Final Report
Significant uncertainty exists regarding air pollutant emissions from upstream oil and gas production operations. Oil and gas operations present unique and challenging emission testing issues due to the large variety and quantity of potential emissions sources. This report summ...
Research, Practice, Uncertainty and Responsibility
ERIC Educational Resources Information Center
Skovsmose, Ole
2006-01-01
Three issues concerning the relationship between research and practice are addressed. (1) A certain "prototype mathematics classroom" seems to dominate the research field, which in many cases seems selective with respect to what practices to address. I suggest challenging the dominance of the discourse created around the prototype mathematics…
ERIC Educational Resources Information Center
Hayes, Dianne
2012-01-01
Higher education institutions are in the battle of a lifetime as they are coping with political and economic uncertainties, threats to federal aid, declining state support, higher tuition rates and increased competition from for-profit institutions. Amid all these challenges, these institutions are pressed to keep up with technological demands,…
Managing Curriculum Change and "Ontological Uncertainty" in Tertiary Education
ERIC Educational Resources Information Center
Keesing-Styles, Linda; Nash, Simon; Ayres, Robert
2014-01-01
Curriculum reform at institutional level is a challenging endeavour. Those charged with leading this process will encounter both enthusiasm and multiple obstacles to teacher engagement including the particularly complex issue of confronting existing teacher identities. At Unitec Institute of Technology (Unitec), the "Living Curriculum"…
ERIC Educational Resources Information Center
Lewis, Gary
2012-01-01
Novice secondary mathematics teachers attempting teaching consonant with NCTM (1991) Professional Standards for Teaching Mathematics experience stresses related to those attempts. Foremost among those stresses are challenges while orchestrating student-centred, whole-class discussions. Such discussions can create uncertainty and stress as novices…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sattison, M.B.; Blackman, H.S.; Novack, S.D.
The Office for Analysis and Evaluation of Operational Data (AEOD) has sought the assistance of the Idaho National Engineering Laboratory (INEL) to make some significant enhancements to the SAPHIRE-based Accident Sequence Precursor (ASP) models recently developed by the INEL. The challenge of this project is to provide the features of a full-scale PRA within the framework of the simplified ASP models. Some of these features include: (1) uncertainty analysis addressing the standard PRA uncertainties and the uncertainties unique to the ASP models and methods, (2) incorporation and proper quantification of individual human actions and the interaction among human actions, (3) enhanced treatment of common cause failures, and (4) extension of the ASP models to more closely mimic full-scale PRAs (inclusion of more initiators, explicitly modeling support system failures, etc.). This paper provides an overview of the methods being used to make the above improvements.
Some challenges with statistical inference in adaptive designs.
Hung, H M James; Wang, Sue-Jane; Yang, Peiling
2014-01-01
Adaptive designs have generated a great deal of attention in clinical trial communities. The literature contains many statistical methods to deal with added statistical uncertainties concerning the adaptations. Increasingly encountered in regulatory applications are adaptive statistical information designs that allow modification of sample size or related statistical information and adaptive selection designs that allow selection of doses or patient populations during the course of a clinical trial. For adaptive statistical information designs, a few statistical testing methods are mathematically equivalent, as a number of articles have stipulated, but arguably there are large differences in their practical ramifications. We pinpoint some undesirable features of these methods in this work. For adaptive selection designs, the selection based on biomarker data for testing the correlated clinical endpoints may increase statistical uncertainty in terms of type I error probability, and most importantly the increased statistical uncertainty may be impossible to assess.
Adaptive management of rangeland systems
Allen, Craig R.; Angeler, David G.; Fontaine, Joseph J.; Garmestani, Ahjond S.; Hart, Noelle M.; Pope, Kevin L.; Twidwell, Dirac
2017-01-01
Adaptive management is an approach to natural resource management that uses structured learning to reduce uncertainties for the improvement of management over time. The origins of adaptive management are linked to ideas of resilience theory and complex systems. Rangeland management is particularly well suited for the application of adaptive management, having sufficient controllability and reducible uncertainties. Adaptive management applies the tools of structured decision making and requires monitoring, evaluation, and adjustment of management. Adaptive governance, involving sharing of power and knowledge among relevant stakeholders, is often required to address conflict situations. Natural resource laws and regulations can present a barrier to adaptive management when requirements for legal certainty are met with environmental uncertainty. However, adaptive management is possible, as illustrated by two cases presented in this chapter. Despite challenges and limitations, when applied appropriately adaptive management leads to improved management through structured learning, and rangeland management is an area in which adaptive management shows promise and should be further explored.
Pryce, Laura; Tweed, Alison; Hilton, Amanda; Priest, Helena M
2017-01-01
Improved life expectancy means that more adults with intellectual disabilities are now living with ageing parents. This study explored older families' perceptions of the future. Semi-structured interviews were conducted with nine older parents and three adults with intellectual disabilities and analysed to produce an explanatory thematic framework. 'Tolerating uncertainty' was the major theme in participants' attempts to manage anxieties about the future, encompassing sub-themes of 'accepting the parenting role', 'facing challenges', 'being supported/isolated', 'positive meaning making', 're-evaluating as time moves on' and 'managing future thinking'. Some participants expressed preferences for their future which were in contrast to their parents' views, and provide a unique perspective that has often been neglected in prior research. This research has found commonalities in how families tolerate the uncertainty of the future, but also unique differences that require tailored interventions and prospective action by services. © 2015 John Wiley & Sons Ltd.
Dating Tips for Divergence-Time Estimation.
O'Reilly, Joseph E; Dos Reis, Mario; Donoghue, Philip C J
2015-11-01
The molecular clock is the only viable means of establishing an accurate timescale for Life on Earth, but it remains reliant on a capricious fossil record for calibration. 'Tip-dating' promises a conceptual advance, integrating fossil species among their living relatives using molecular/morphological datasets and evolutionary models. Fossil species of known age establish calibration directly, and their phylogenetic uncertainty is accommodated through the co-estimation of time and topology. However, challenges remain, including a dearth of effective models of morphological evolution, rate correlation, the non-random nature of missing characters in fossil data, and, most importantly, accommodating uncertainty in fossil age. We show uncertainty in fossil-dating propagates to divergence-time estimates, yielding estimates that are older and less precise than those based on traditional node calibration. Ultimately, node and tip calibrations are not mutually incompatible and may be integrated to achieve more accurate and precise evolutionary timescales. Copyright © 2015 Elsevier Ltd. All rights reserved.
Linear Mixed Models: GUM and Beyond
NASA Astrophysics Data System (ADS)
Arendacká, Barbora; Täubner, Angelika; Eichstädt, Sascha; Bruns, Thomas; Elster, Clemens
2014-04-01
In Annex H.5, the Guide to the Expression of Uncertainty in Measurement (GUM) [1] recognizes the necessity to analyze certain types of experiments by applying random effects ANOVA models. These belong to the more general family of linear mixed models that we focus on in the current paper. Extending the short introduction provided by the GUM, our aim is to show that the more general linear mixed models cover a wider range of situations occurring in practice and can be beneficial when employed in data analysis of long-term repeated experiments. Namely, we point out their potential as an aid in establishing an uncertainty budget and as a means of gaining more insight into the measurement process. We also comment on computational issues and, to make the explanations less abstract, we illustrate all the concepts with a measurement campaign conducted to challenge the uncertainty budget in the calibration of accelerometers.
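A minimal sketch of such a variance-components analysis (not the GUM annex example): a random-effects model separates between-day variability from repeatability in repeated accelerometer calibrations. The data, column names, and effect structure below are invented for illustration; statsmodels provides the mixed-model fit.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
days = np.repeat(np.arange(10), 5)                 # 10 days, 5 repeats per day
day_effect = rng.normal(0.0, 0.02, 10)[days]       # hypothetical between-day effect
y = 9.81 + day_effect + rng.normal(0.0, 0.01, 50)  # measurand + repeatability noise

data = pd.DataFrame({"y": y, "day": days})
fit = smf.mixedlm("y ~ 1", data, groups=data["day"]).fit()
print("measurand estimate:", fit.params["Intercept"])
print("between-day variance:", float(fit.cov_re.iloc[0, 0]))
print("repeatability variance:", fit.scale)
```

Both variance components enter the uncertainty budget; the between-day term is the one a single-day analysis would miss.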
NASA Astrophysics Data System (ADS)
Moser, S. C.
2011-12-01
As adaptation planning is rising rapidly on the agenda of decision-makers, the need for adequate information to inform those decisions is growing. Locally relevant climate change (as well as related impacts and vulnerability) information, however, is difficult to obtain, and that which can be obtained carries the burden of significant scientific uncertainty. This paper aims to assess how important such uncertainty is in adaptation planning, decision-making, and related stakeholder engagement. Does uncertainty actually hinder adaptation planning? Is scientific uncertainty used to postpone decisions, reflecting ideological agendas? Or is it a convenient defense against cognitive and affective engagement with the emerging and projected - and in some cases daunting - climate change risks? To whom does such uncertainty matter, and how important is it relative to other challenges decision-makers and stakeholders face? The paper draws on four sources of information to answer these questions: (1) a statewide survey of California coastal managers conducted in summer 2011, (2) years of continual engagement with, and observation of, decision-makers in local adaptation efforts, (3) findings from focus groups with lay individuals in coastal California, and (4) a review of relevant adaptation literature to guide and contextualize the empirical research. The findings entail some "inconvenient truths" for those who claim that uncertainty is of critical technical or political importance. Rather, the insights suggest that some uncertainties matter more than others; that they matter at certain times, but not at others; and that they matter to some decision-makers, but not to others. Implications for scientists communicating and engaging with communities are discussed.
The visualization of spatial uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Srivastava, R.M.
1994-12-31
Geostatistical conditional simulation is gaining acceptance as a numerical modeling tool in the petroleum industry. Unfortunately, many of the new users of conditional simulation work with only one outcome or "realization" and ignore the many other outcomes that could be produced by their conditional simulation tools; 3-D visualization tools then allow them to present very realistic images of this single outcome as reality. There are many methods currently available for presenting the uncertainty information from a family of possible outcomes; most of these, however, use static displays, and many present uncertainty in a format that is not intuitive. This paper explores the visualization of uncertainty through dynamic displays that exploit the intuitive link between uncertainty and change by presenting the user with a constantly evolving model. The key technical challenge to such a dynamic presentation is the ability to create numerical models that honor the available well data and geophysical information and yet are incrementally different, so that successive frames can be viewed rapidly as an animated cartoon. An example of volumetric uncertainty from a Gulf Coast reservoir is used to demonstrate that such animation is possible and that such dynamic displays can be an effective tool in risk analysis for the petroleum industry.
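The incremental-realization idea can be illustrated with a toy example. The sketch below animates unconditional Gaussian random fields only (honoring well data, i.e., true conditional simulation, is not attempted); the grid size, correlation length, and blending weight are assumptions.

```python
import numpy as np

def smooth_field(n, rng, sigma=4.0):
    """White noise smoothed in Fourier space -> spatially correlated field."""
    white = rng.standard_normal((n, n))
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    filt = np.exp(-(kx**2 + ky**2) * (sigma * n / 4) ** 2)
    f = np.real(np.fft.ifft2(np.fft.fft2(white) * filt))
    return (f - f.mean()) / f.std()

rng = np.random.default_rng(0)
n, eps = 64, 0.15                  # eps controls frame-to-frame change
frame = smooth_field(n, rng)
frames = [frame]
for _ in range(99):                # AR(1) blend keeps marginal statistics fixed
    frame = np.sqrt(1 - eps**2) * frame + eps * smooth_field(n, rng)
    frames.append(frame)
# Played as an animation, pixels with high temporal variance mark the
# locations of greatest spatial uncertainty.
```

The AR(1) blend is what makes successive frames "incrementally different": each frame keeps most of the previous frame's randomness and replaces only a small fraction.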
Propagating uncertainty from hydrology into human health risk assessment
NASA Astrophysics Data System (ADS)
Siirila, E. R.; Maxwell, R. M.
2013-12-01
Hydro-geologic modeling and uncertainty assessment of flow and transport parameters can be incorporated into human health risk (both cancer and non-cancer) assessment to better understand the associated uncertainties. This interdisciplinary approach is needed now more than ever, as societal problems concerning water quality are increasingly interdisciplinary as well. For example, uncertainty can originate from environmental conditions such as a lack of information or measurement error, or can manifest as variability, such as differences in physiological and exposure parameters between individuals. To complicate the matter, traditional risk assessment methodologies are independent of time, virtually neglecting any temporal dependence. Here we present not only how uncertainty and variability can be incorporated into a risk assessment, but also how time dependent risk assessment (TDRA) allows for the calculation of risk as a function of time. The development of TDRA and the inclusion of quantitative risk analysis in this research provide a means to inform decision makers faced with water quality issues and challenges. The stochastic nature of this work also provides a means to address the question of uncertainty in management decisions, a component that is frequently difficult to quantify. To illustrate this new formulation and to investigate hydraulic mechanisms for sensitivity, an example of varying environmental concentration signals resulting from rate dependencies in geochemical reactions is used. Cancer risk is computed and compared using environmental concentration ensembles modeled with sorption as 1) a linear equilibrium assumption and 2) first-order kinetics. Results show that the upscaling of these small-scale processes controls the distribution, magnitude, and associated uncertainty of cancer risk.
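The time-dependent risk idea can be sketched as follows: a synthetic concentration ensemble stands in for the transport-model output, and an EPA-style chronic-daily-intake dose equation (an assumption; the paper's exact formulation is not reproduced) converts each realization into a risk-versus-time curve.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(365 * 30)                        # days
# Synthetic ensembles standing in for transport-model output (mg/L):
c_equilibrium = 0.05 * np.exp(-t / 8000)[None, :] * rng.lognormal(0, 0.3, (500, 1))
c_kinetic = 0.05 * (t / 4000) * np.exp(-t / 4000)[None, :] * rng.lognormal(0, 0.3, (500, 1))

def risk(conc, sf=0.1, ir=2.0, bw=70.0):
    """Incremental lifetime cancer risk per realization and time step (one-hit model)."""
    cdi = conc * ir / bw                       # chronic daily intake, mg/kg-day
    return 1.0 - np.exp(-sf * cdi)

for name, c in [("equilibrium", c_equilibrium), ("kinetic", c_kinetic)]:
    r = risk(c)
    peak = r.mean(axis=0).argmax()
    print(f"{name}: peak mean risk {r.mean(axis=0)[peak]:.2e} at day {peak}")
```

Because risk is kept as a function of time, the two sorption assumptions yield not only different risk magnitudes but different timing of the peak, which a time-independent assessment would conflate.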
Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende
2014-01-01
Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used for investigating carbon dynamics under global changes from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY, which considers the carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct the parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our developed EDCM-Auto, which incorporated a comprehensive R package—Flexible Modeling Framework (FME)—and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the model cost function is most sensitive to the plant production-related parameters (e.g., PPDF1 and PRDX). Both SCE and FME are comparable and performed well in deriving the optimal parameter set with satisfactory simulations of target variables. Global sensitivity and uncertainty analysis indicate that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on variables and seasons. This study also demonstrates that using cutting-edge R functions such as FME can be feasible and attractive for conducting comprehensive parameter analysis for ecosystem modeling.
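As a rough Python analog of the local sensitivity screening (the study itself uses the R package FME), the sketch below perturbs each parameter of a toy light-response model by 1% and reports dimensionless sensitivities of the mean output; the model form and parameter values are assumptions, not EDCM.

```python
import numpy as np

def nee_model(params, par):
    """Toy net ecosystem exchange: saturating light-response GPP minus respiration."""
    ppdf1, prdx, resp = params
    return prdx * par / (par + ppdf1) - resp

def local_sensitivity(f, params, par, rel_step=0.01):
    """Relative change of the mean model output per relative parameter change."""
    base = np.mean(f(params, par))
    sens = []
    for i in range(len(params)):
        pert = params.copy()
        pert[i] *= 1 + rel_step
        sens.append((np.mean(f(pert, par)) - base) / (abs(base) * rel_step))
    return sens

par = np.linspace(50, 1500, 100)           # photosynthetically active radiation
params = np.array([300.0, 20.0, 5.0])      # hypothetical [PPDF1, PRDX, respiration]
for name, s in zip(["PPDF1", "PRDX", "resp"], local_sensitivity(nee_model, params, par)):
    print(f"{name}: {s:+.3f}")
```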
Robinson, Mike J F; Anselme, Patrick; Suchomel, Kristen; Berridge, Kent C
2015-08-01
Amphetamine and stress can sensitize mesolimbic dopamine-related systems. In Pavlovian autoshaping, repeated exposure to uncertainty of reward prediction can enhance motivated sign-tracking or attraction to a discrete reward-predicting cue (lever-conditioned stimulus; CS+), as well as produce cross-sensitization to amphetamine. However, it remains unknown how amphetamine sensitization or repeated restraint stress interact with uncertainty in controlling CS+ incentive salience attribution reflected in sign-tracking. Here rats were tested in 3 successive phases. First, different groups underwent either induction of amphetamine sensitization or repeated restraint stress, or else were not sensitized or stressed as control groups (either saline injections only, or no stress or injection at all). All next received Pavlovian autoshaping training under either certainty conditions (100% CS-UCS association) or uncertainty conditions (50% CS-UCS association and uncertain reward magnitude). During training, rats were assessed for sign-tracking to the CS+ lever versus goal-tracking to the sucrose dish. Finally, all groups were tested for psychomotor sensitization of locomotion revealed by an amphetamine challenge. Our results confirm that reward uncertainty enhanced sign-tracking attraction toward the predictive CS+ lever, at the expense of goal-tracking. We also report that amphetamine sensitization promoted sign-tracking even in rats trained under CS-UCS certainty conditions, raising them to sign-tracking levels equivalent to the uncertainty group. Combining amphetamine sensitization and uncertainty conditions did not additively elevate sign-tracking further above the relatively high levels induced by either manipulation alone. In contrast, repeated restraint stress enhanced subsequent amphetamine-elicited locomotion, but did not enhance CS+ attraction. (c) 2015 APA, all rights reserved.
Robinson, Mike J.F.; Anselme, Patrick; Suchomel, Kristen; Berridge, Kent C.
2015-01-01
Amphetamine and stress can sensitize mesolimbic dopamine-related systems. In Pavlovian autoshaping, repeated exposure to uncertainty of reward prediction can enhance motivated sign-tracking or attraction to a discrete reward-predicting cue (lever CS+), as well as produce cross-sensitization to amphetamine. However, it remains unknown how amphetamine sensitization or repeated restraint stress interact with uncertainty in controlling CS+ incentive salience attribution reflected in sign-tracking. Here rats were tested in three successive phases. First, different groups underwent either induction of amphetamine sensitization or repeated restraint stress, or else were not sensitized or stressed as control groups (either saline injections only, or no stress or injection at all). All next received Pavlovian autoshaping training under either certainty conditions (100% CS-UCS association) or uncertainty conditions (50% CS-UCS association and uncertain reward magnitude). During training, rats were assessed for sign-tracking to the lever CS+ versus goal-tracking to the sucrose dish. Finally, all groups were tested for psychomotor sensitization of locomotion revealed by an amphetamine challenge. Our results confirm that reward uncertainty enhanced sign-tracking attraction toward the predictive CS+ lever, at the expense of goal-tracking. We also report that amphetamine sensitization promoted sign-tracking even in rats trained under CS-UCS certainty conditions, raising them to sign-tracking levels equivalent to the uncertainty group. Combining amphetamine sensitization and uncertainty conditions did not additively elevate sign-tracking further above the relatively high levels induced by either manipulation alone. In contrast, repeated restraint stress enhanced subsequent amphetamine-elicited locomotion, but did not enhance CS+ attraction. PMID:26076340
Uncertainty Estimation using Bootstrapped Kriging Predictions for Precipitation Isoscapes
NASA Astrophysics Data System (ADS)
Ma, C.; Bowen, G. J.; Vander Zanden, H.; Wunder, M.
2017-12-01
Isoscapes are spatial models representing the distribution of stable isotope values across landscapes. Isoscapes of hydrogen and oxygen in precipitation are now widely used in a diversity of fields, including geology, biology, hydrology, and atmospheric science. To generate isoscapes, geostatistical methods are typically applied to extend predictions from limited data measurements. Kriging is a popular method in isoscape modeling, but quantifying the uncertainty associated with the resulting isoscapes is challenging. Applications that use precipitation isoscapes to determine sample origin require estimation of uncertainty. Here we present a simple bootstrap method (SBM) to estimate the mean and uncertainty of the kriged isoscape and compare these results with a generalized bootstrap method (GBM) applied in previous studies. We used hydrogen isotopic data from IsoMAP to explore these two approaches for estimating uncertainty. We conducted 10 simulations for each bootstrap method and found that SBM produced kriging predictions in more of the simulations (9/10) than GBM (4/10). The SBM prediction was closer to the original prediction generated without bootstrapping and had less variance than GBM. SBM was tested on different datasets from IsoMAP with different numbers of observation sites. We determined that predictions from the datasets with fewer than 40 observation sites using SBM were more variable than the original prediction. The approaches we used for estimating uncertainty will be compiled in an R package that is under development. We expect that these robust estimates of precipitation isoscape uncertainty can be applied in diagnosing the origin of samples ranging from various types of water to migratory animals, food products, and humans.
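A minimal sketch of the simple bootstrap idea: resample stations with replacement, refit a kriging-like model each time, and summarize the spread of the resulting predictions. Synthetic station data replace the IsoMAP hydrogen isotope measurements, and scikit-learn's Gaussian-process regressor stands in for the kriging implementation (both assumptions).

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
n_sites = 60
X = rng.uniform(0, 10, (n_sites, 2))                  # station coordinates
y = -8.0 * X[:, 0] - 30 + rng.normal(0, 5, n_sites)   # synthetic delta-2H values

grid = np.array([[2.5, 5.0], [7.5, 5.0]])             # prediction locations
kernel = 1.0 * RBF(length_scale=3.0) + WhiteKernel(noise_level=5.0)

preds = []
for _ in range(200):                                  # bootstrap replicates
    idx = rng.integers(0, n_sites, n_sites)           # resample with replacement
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(X[idx], y[idx])
    preds.append(gp.predict(grid))

preds = np.array(preds)
print("bootstrap mean:", preds.mean(axis=0))
print("bootstrap std :", preds.std(axis=0))
```

The bootstrap standard deviation at each grid point is the uncertainty surface that an origin-assignment application would consume.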
A Statistics-Based Material Property Analysis to Support TPS Characterization
NASA Technical Reports Server (NTRS)
Copeland, Sean R.; Cozmuta, Ioana; Alonso, Juan J.
2012-01-01
Accurate characterization of entry capsule heat shield material properties is a critical component in modeling and simulating Thermal Protection System (TPS) response in a prescribed aerothermal environment. The thermal decomposition of the TPS material during the pyrolysis and charring processes is poorly characterized and typically results in large uncertainties in material properties as inputs for ablation models. These material property uncertainties contribute to large design margins on flight systems and cloud reconstruction efforts for data collected during flight and ground testing, making revision to existing models for entry systems more challenging. The analysis presented in this work quantifies how material property uncertainties propagate through an ablation model and guides an experimental test regimen aimed at reducing these uncertainties and characterizing the dependencies between properties in the virgin and charred states for a Phenolic Impregnated Carbon Ablator (PICA) based TPS. A sensitivity analysis identifies how the high-fidelity model behaves in the expected flight environment, while a Monte Carlo based uncertainty propagation strategy is used to quantify the expected spread in the in-depth temperature response of the TPS. An examination of how perturbations to the input probability density functions affect output temperature statistics is accomplished using a Kriging response surface of the high-fidelity model. Simulations are based on capsule configuration and aerothermal environments expected during the Mars Science Laboratory (MSL) entry sequence. We identify and rank primary sources of uncertainty from material properties in a flight-relevant environment, show the dependence on spatial orientation and in-depth location on those uncertainty contributors, and quantify how sensitive the expected results are.
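The propagation strategy can be illustrated with a toy stand-in for the ablation model: Monte Carlo sampling of material properties through a 1-D transient conduction solve, with the spread of the in-depth (back-face) temperature as output. All material values and the boundary heating below are assumptions, not PICA properties.

```python
import numpy as np

def back_face_temperature(k, rho_cp, t_end=900.0, nx=30, depth=0.03, q=5e4):
    """Explicit 1-D conduction with a constant surface heat flux; returns T at depth."""
    dx = depth / nx
    alpha = k / rho_cp
    dt = 0.4 * dx**2 / alpha                    # explicit stability limit
    T = np.full(nx, 300.0)
    for _ in range(int(t_end / dt)):
        Tn = T.copy()
        Tn[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
        Tn[0] += dt / rho_cp * (q / dx) + alpha * dt / dx**2 * (T[1] - T[0])  # heated face
        Tn[-1] = Tn[-2]                          # adiabatic back face
        T = Tn
    return T[-1]

rng = np.random.default_rng(4)
samples = [back_face_temperature(k=rng.normal(0.5, 0.08),        # W/m-K (assumed)
                                 rho_cp=rng.normal(5e5, 5e4))    # J/m^3-K (assumed)
           for _ in range(200)]
print(f"back-face T after 900 s: {np.mean(samples):.1f} +/- {np.std(samples):.1f} K")
```

In the full analysis a Kriging response surface replaces the direct solve so that perturbed input distributions can be re-propagated cheaply.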
Quantified Uncertainties in Comparative Life Cycle Assessment: What Can Be Concluded?
2018-01-01
Interpretation of comparative Life Cycle Assessment (LCA) results can be challenging in the presence of uncertainty. To aid in interpreting such results under the goal of any comparative LCA, we aim to provide guidance to practitioners by gaining insights into uncertainty-statistics methods (USMs). We review five USMs—discernibility analysis, impact category relevance, overlap area of probability distributions, null hypothesis significance testing (NHST), and modified NHST—and provide a common notation, terminology, and calculation platform. We further cross-compare all USMs by applying them to a case study on electric cars. USMs belong to either a confirmatory or an exploratory branch of statistics, each serving different purposes for practitioners. Results highlight that common uncertainties and the magnitude of differences per impact are key to offering reliable insights. Common uncertainties are particularly important, as disregarding them can lead to incorrect recommendations. On the basis of these considerations, we recommend the modified NHST as a confirmatory USM. We also recommend discernibility analysis as an exploratory USM, along with recommendations for its improvement, as it disregards the magnitude of the differences. While further research is necessary to support our conclusions, the results and supporting material provided can help LCA practitioners deliver a more robust basis for decision-making. PMID:29406730
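Discernibility analysis, the recommended exploratory USM, reduces to a paired Monte Carlo comparison: draw shared (common) uncertainties once per iteration so both alternatives see the same value, then count how often one alternative scores below the other. The inventory numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10_000
# Common uncertainty: both alternatives share the same electricity-mix factor.
elec_factor = rng.lognormal(mean=np.log(0.5), sigma=0.2, size=n)  # kg CO2e/kWh
use_a = rng.normal(150, 10, n)    # kWh over the life cycle, alternative A
use_b = rng.normal(170, 15, n)    # kWh over the life cycle, alternative B

gwp_a = use_a * elec_factor       # paired draws preserve the correlation
gwp_b = use_b * elec_factor
p_a_better = np.mean(gwp_a < gwp_b)
print(f"A < B in {p_a_better:.1%} of paired draws")   # discernibility score
```

Treating the electricity factor as independent for A and B would discard the common uncertainty and understate how often the alternatives are discernible, which is exactly the failure mode the abstract warns about.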
Quantifying and managing uncertainty in operational modal analysis
NASA Astrophysics Data System (ADS)
Au, Siu-Kui; Brownjohn, James M. W.; Mottershead, John E.
2018-03-01
Operational modal analysis aims at identifying the modal properties (natural frequency, damping, etc.) of a structure using only the (output) vibration response measured under ambient conditions. Highly economical and feasible, it is becoming a common practice in full-scale vibration testing. In the absence of (input) loading information, however, the modal properties have significantly higher uncertainty than their counterparts identified from free or forced vibration (known input) tests. Mastering the relationship between identification uncertainty and test configuration is of great interest to both scientists and engineers, e.g., for achievable precision limits and test planning/budgeting. Addressing this challenge beyond the current state of the art, which is mostly concerned with identification algorithms, this work obtains closed-form analytical expressions for the identification uncertainty (variance) of modal parameters that fundamentally explain the effect of test configuration. Collectively referred to as 'uncertainty laws', these expressions are asymptotically correct for well-separated modes, small damping and long data, and are applicable under non-asymptotic situations. They provide a scientific basis for planning and standardization of ambient vibration tests, where factors such as channel noise, sensor number and location can be quantitatively accounted for. The work is reported comprehensively with verification through synthetic and experimental data (laboratory and field), scientific implications and practical guidelines for planning ambient vibration tests.
Hou, Xianlong; Hodges, Ben R; Feng, Dongyu; Liu, Qixiao
2017-03-15
As oil transport increases in the Texas bays, the greater risk of ship collisions becomes a challenge, with oil spill accidents as a consequence. To minimize the ecological damage and optimize rapid response, emergency managers need to be informed of how fast and where oil will spread as soon as possible after a spill. State-of-the-art operational oil spill forecast modeling systems have advanced oil spill response considerably. However, uncertainty in the predicted data inputs often compromises the reliability of the forecast results, leading to misdirection in contingency planning; understanding forecast uncertainty and reliability therefore becomes significant. In this paper, Monte Carlo simulation is implemented to provide parameters to generate forecast probability maps. The oil spill forecast uncertainty is thus quantified by comparing the forecast probability map with the associated hindcast simulation. A HyosPy-based simple statistical model is developed to assess the reliability of an oil spill forecast in terms of belief degree. The technologies developed in this study create a prototype for uncertainty and reliability analysis in numerical oil spill forecast modeling systems, enabling emergency managers to improve the capability of real-time operational oil spill response and impact assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
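A minimal sketch of how a Monte Carlo trajectory ensemble becomes a forecast probability map: perturb the forcing per ensemble member, advect particles, and report the fraction of members that oil each grid cell. The random-walk transport below is a stand-in for the hydrodynamic forecast model (an assumption; HyosPy is not used).

```python
import numpy as np

rng = np.random.default_rng(6)
n_members, n_steps, n_particles = 100, 48, 200
grid = np.zeros((50, 50))

for _ in range(n_members):
    u = 0.2 + rng.normal(0, 0.05)            # member-specific current (cells/h)
    v = 0.1 + rng.normal(0, 0.05)
    x = np.full(n_particles, 10.0)           # spill site
    y = np.full(n_particles, 10.0)
    hit = np.zeros_like(grid, dtype=bool)
    for _ in range(n_steps):                 # hourly advection + diffusion
        x += u + rng.normal(0, 0.1, n_particles)
        y += v + rng.normal(0, 0.1, n_particles)
        ix = np.clip(x.astype(int), 0, 49)
        iy = np.clip(y.astype(int), 0, 49)
        hit[iy, ix] = True
    grid += hit

prob_map = grid / n_members                  # P(cell is oiled within 48 h)
print("cells with P > 0.5:", int((prob_map > 0.5).sum()))
```

Comparing such a map against the hindcast trajectory (the verifying "truth") is the quantification step the abstract describes.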
Moving across scales: Challenges and opportunities in upscaling carbon fluxes
NASA Astrophysics Data System (ADS)
Naithani, K. J.
2016-12-01
Light use efficiency (LUE) type models are commonly used to upscale terrestrial C fluxes and estimate regional and global C budgets. Model parameters are often estimated for each land cover type (LCT) using flux observations from one or more eddy covariance towers, and then spatially extrapolated by integrating land cover, meteorological, and remotely sensed data. Decisions regarding the type of input data (spatial resolution of land cover data, spatial and temporal length of flux data), the representation of landscape structure (land use vs. disturbance regime), and the type of modeling framework (common risk vs. hierarchical) all influence the estimates of CO2 fluxes and the associated uncertainties, but are rarely considered together. This work presents a synthesis of past and present efforts for upscaling CO2 fluxes and associated uncertainties in the ChEAS (Chequamegon Ecosystem Atmosphere Study) region in northern Wisconsin and the Upper Peninsula of Michigan. It highlights two key future research needs. First, the characterization of uncertainties due to all of the abovementioned factors reflects only a (hopefully relevant) subset of the overall uncertainties. Second, interactions among these factors are likely critical, but are poorly represented by the tower network at landscape scales. Yet, results indicate significant spatial and temporal heterogeneity of uncertainty in CO2 fluxes, which can inform carbon management efforts and prioritize data needs.
Probability-based hazard avoidance guidance for planetary landing
NASA Astrophysics Data System (ADS)
Yuan, Xu; Yu, Zhengshi; Cui, Pingyuan; Xu, Rui; Zhu, Shengying; Cao, Menglong; Luan, Enjie
2018-03-01
Future landing and sample return missions on planets and small bodies will seek landing sites with high scientific value, which may be located in hazardous terrains. Autonomous landing in such hazardous terrains and highly uncertain planetary environments is particularly challenging. Onboard hazard avoidance ability is indispensable, and the algorithms must be robust to uncertainties. In this paper, a novel probability-based hazard avoidance guidance method is developed for landing in hazardous terrains on planets or small bodies. By regarding the lander state as probabilistic, the proposed guidance algorithm exploits information on the uncertainty of lander position and calculates the probability of collision with each hazard. The collision probability serves as an accurate safety index, which quantifies the impact of uncertainties on the lander safety. Based on the collision probability evaluation, the state uncertainty of the lander is explicitly taken into account in the derivation of the hazard avoidance guidance law, which contributes to enhancing the robustness to the uncertain dynamics of planetary landing. The proposed probability-based method derives fully analytic expressions and does not require off-line trajectory generation. Therefore, it is appropriate for real-time implementation. The performance of the probability-based guidance law is investigated via a set of simulations, and the effectiveness and robustness under uncertainties are demonstrated.
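The collision-probability safety index can be sketched directly: given a Gaussian touchdown-point uncertainty and a circular hazard, estimate the probability that the landing point falls inside the hazard. The numbers below are illustrative; the paper's fully analytic guidance law is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(7)
mean = np.array([12.0, 5.0])                 # predicted touchdown point (m)
cov = np.array([[9.0, 2.0], [2.0, 4.0]])     # touchdown covariance (m^2)
hazard_center = np.array([15.0, 6.0])
hazard_radius = 3.0

pts = rng.multivariate_normal(mean, cov, size=200_000)
dist = np.linalg.norm(pts - hazard_center, axis=1)
p_collision = np.mean(dist < hazard_radius)
print(f"collision probability: {p_collision:.3f}")
# A probability-based guidance law penalizes candidate trajectories in
# proportion to this index, steering the nominal state away from hazards.
```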
Challenges of Iranian Adolescents for Preventing Dental Caries
Fallahi, Arezoo; Ghofranipour, Fazlollah; Ahmadi, Fazlollah; Malekafzali, Beheshteh; Hajizadeh, Ebrahim
2014-01-01
Background: Oral health plays a vital role in people’s general health and well-being. With regard to the costly treatments of oral diseases, preventive programs for dental caries need to be designed based on children’s perspectives. Objectives: The purpose of this study was to describe and explore challenges in caring for dental health based on children’s perspectives. Patients and Methods: A qualitative design with a content analysis approach was applied to collect and analyze the perspectives of students about factors influencing oral and dental care. Eighteen Iranian students in 8 guidance schools were chosen through purposive sampling. Semi-structured interviews were held for data gathering. In order to support the validity and rigor of the data, different criteria such as acceptability, confirmability, and transferability were utilized. Results: During data analysis, four main themes developed: “barriers to dental health,” “maintaining dental health,” “uncertainty in decision-making” and “supportive factors”. “Uncertainty in decision-making” and “barriers to dental health” were the main challenges for preventing dental caries among adolescents. Conclusions: “Certainty in decision-making” to maintain dental health depends on overcoming the barriers to dental health. Further research is needed to confirm the findings of this study. PMID:25593720
Pyrometer with tracking balancing
NASA Astrophysics Data System (ADS)
Ponomarev, D. B.; Zakharenko, V. A.; Shkaev, A. G.
2018-04-01
Currently, one of the main metrological challenges in noncontact temperature measurement is emissivity uncertainty. This paper describes a pyrometer that diminishes the effect of emissivity by using a measuring scheme with tracking balancing, in which the radiation receiver acts as a null-indicator. The results of an absolute-error study of the prototype pyrometer in measuring the surface temperature of aluminum and nickel samples are presented. Absolute errors calculated from tabulated emissivity values are compared with the errors obtained experimentally with the proposed method. The practical implementation of the proposed technical solution has halved the error due to emissivity uncertainty.
Smith, Quentin W; Street, Richard L; Volk, Robert J; Fordis, Michael
2013-02-01
The near ubiquitous access to information is transforming the roles and relationships among clinical professionals, patients, and their care givers in nearly all aspects of healthcare. Informed patients engage their physicians in conversations about their conditions, options and the tradeoffs among diagnostic and therapeutic benefits and harms. The processes of care today increasingly and explicitly integrate exploration of patient values and preferences as patients and clinicians jointly engage in reaching decisions about care. The informed patient of today who can understand and use scientific information can participate as an equal partner with her clinician. Others with beliefs or educational, cultural, or literacy backgrounds that pose challenges to comprehending and applying evidence may face disenfranchisement. These barriers are significant enough, even in the face of certainty of evidence, that clinicians and investigators have given much thought to how best to engage all patients in decision making. However, barriers remain, as most decision making must occur in settings where uncertainty, if not considerable uncertainty, accompanies any statement of what we know. In September 2011, health care and health communication experts came together in Rockville, Maryland under the auspices of the Agency for Healthcare Research and Quality (AHRQ) John M. Eisenberg Center for Clinical Decisions and Communications Science Annual Meeting to explore the challenges of differing levels of evidence in promoting shared decisions and to propose strategies for going forward in addressing these challenges. Eight scholarly papers emerged, and with this introductory article, comprise this special issue of Medical Care Research and Review.
Adaptive Pathways: Possible Next Steps for Payers in Preparation for Their Potential Implementation.
Vella Bonanno, Patricia; Ermisch, Michael; Godman, Brian; Martin, Antony P; Van Den Bergh, Jesper; Bezmelnitsyna, Liudmila; Bucsics, Anna; Arickx, Francis; Bybau, Alexander; Bochenek, Tomasz; van de Casteele, Marc; Diogene, Eduardo; Eriksson, Irene; Fürst, Jurij; Gad, Mohamed; Greičiūtė-Kuprijanov, Ieva; van der Graaff, Martin; Gulbinovic, Jolanta; Jones, Jan; Joppi, Roberta; Kalaba, Marija; Laius, Ott; Langner, Irene; Mardare, Ileana; Markovic-Pekovic, Vanda; Magnusson, Einar; Melien, Oyvind; Meshkov, Dmitry O; Petrova, Guenka I; Selke, Gisbert; Sermet, Catherine; Simoens, Steven; Schuurman, Ad; Ramos, Ricardo; Rodrigues, Jorge; Zara, Corinne; Zebedin-Brandl, Eva; Haycox, Alan
2017-01-01
Medicines receiving a conditional marketing authorization through Medicines Adaptive Pathways to Patients (MAPPs) will be a challenge for payers. The "introduction" of MAPPs is already seen by the European Medicines Agency (EMA) as a fait accompli, with payers not consulted or involved. However, once medicines are approved through MAPPs, they will be evaluated for funding by payers through different activities. These include Health Technology Assessment (HTA) with often immature clinical data and high uncertainty, financial considerations, and negotiations through different types of agreements, which can require monitoring post launch. Payers have experience with new medicines approved through conditional approval, and the fact that MAPPs present additional challenges is a concern from their perspective. There may be some activities where payers can collaborate. The final decisions on whether to reimburse a new medicine via MAPPs will have more variation than for medicines licensed via conventional processes. This is due not only to increasing uncertainty associated with medicines authorized through MAPPs but also differences in legal frameworks between member states. Moreover, if the financial and side-effect burden from the period of conditional approval until granting full marketing authorization is shifted to the post-authorization phase, payers may have to bear such burdens. Collection of robust data during routine clinical use is challenging along with high prices for new medicines during data collection. This paper presents the concept of MAPPs and possible challenges. Concerns and potential ways forward are discussed and a number of recommendations are presented from the perspective of payers.
Integrating uncertainties for climate change mitigation
NASA Astrophysics Data System (ADS)
Rogelj, Joeri; McCollum, David; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan
2013-04-01
The target of keeping global average temperature increase to below 2°C emerged in the international climate debate more than a decade ago. In response, the scientific community has tried to estimate the costs of reaching such a target through modelling and scenario analysis. Producing such estimates remains a challenge, particularly because of relatively well-known, but ill-quantified uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on one side, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other side, has worked on achieving an increasingly better understanding of the geophysical response of the Earth system to emissions of greenhouse gases (GHG). This geophysical response remains a key uncertainty for the cost of mitigation scenarios but has only been integrated with assessments of other uncertainties in a rudimentary manner, i.e., for equilibrium conditions. To bridge this gap between the two research communities, we generate distributions of the costs associated with limiting transient global temperature increase to below specific temperature limits, taking into account uncertainties in multiple dimensions: geophysical, technological, social and political. In other words, we account for uncertainties resulting from our incomplete knowledge about how the climate system precisely reacts to GHG emissions (geophysical uncertainties), about how society will develop (social uncertainties and choices), about which technologies will be available (technological uncertainty and choices), about when we choose to start acting globally on climate change (political choices), and about how much money we are willing to spend to achieve climate change mitigation. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by geophysical, future energy demand, and mitigation technology uncertainties. These results provide central information for policy making, since they help to clarify the relationship between mitigation costs and their potential to reduce the risk of exceeding 2°C, or other temperature limits such as 3°C or 1.5°C, under a wide range of scenarios.
Dewey and Schon: An Analysis of Reflective Thinking.
ERIC Educational Resources Information Center
Bauer, Norman J.
The challenge to the dominance of rationality in educational philosophy presented by John Dewey and Donald Schon is examined in this paper. The paper identifies basic assumptions of their perspective and explains concepts of reflective thinking, which include biography, context of uncertainty, and "not-yet." A model of reflective thought…
Show Me the Money: Impact of County Funding on Retention Rates for Extension Educators
ERIC Educational Resources Information Center
Feldhues, Katherine; Tanner, Timothy
2017-01-01
Extension administrators contemplating the challenge of employee turnover should consider potential motivation factors. Through the lens of Herzberg's motivation-hygiene theory, we explored the relationship between financial uncertainty and employee turnover in Ohio State University Extension. The Human Resources department and Business Office of…
Read-across is a popular data gap filling technique within category and analogue approaches for regulatory purposes. Acceptance of read-across remains an ongoing challenge with several efforts underway for identifying and addressing uncertainties. Here we demonstrate an algorithm...
What the Willow Teaches: Sustainability Learning as Craft
ERIC Educational Resources Information Center
Cato, Molly Scott
2014-01-01
Whilst the importance of mainstreaming sustainability in higher education curricula is now widely acknowledged, the challenge for educators at university level is to develop and maintain authority and confidence in an area dominated by limited knowledge and uncertainty. This article suggests that the most empowering and authentic response is to…
Dispersion modeling tools have traditionally provided critical information for air quality management decisions, but have been used recently to provide exposure estimates to support health studies. However, these models can be challenging to implement, particularly in near-road s...
Fungal biodegradation of lignocelluloses
Annele Hatakka; Kenneth E. Hammel
2010-01-01
Uncertainties in the basic structures of lignin in particular, but also of other components in lignocellulose, make fungal biodegradation studies a challenging task. The following properties are important in terms of microbial or enzymatic attack: (1) lignin polymers have compact structures that are insoluble in water and difficult to penetrate by microbes or enzymes, (2) the...
Action Learning, Performativity and Negative Capability
ERIC Educational Resources Information Center
Edmonstone, John
2016-01-01
The paper examines the concept of negative capability as a human capacity for containment and contrasts it with well-valued positive capability as expressed through performativity in organisations and society. It identifies the problem of dispersal--the complex ways we behave in order to avoid the emotional challenges of living with uncertainty.…
Establishing Quality Assurance in the South African Context
ERIC Educational Resources Information Center
Strydom, A. H.; Strydom, J. F.
2004-01-01
This paper provides perspectives on the unique challenges and opportunities facing the national auditing and accreditation system in South African higher education. In doing so, the quality assurance contexts of developed countries, Africa and South Africa are considered and the issues of uncertainty and conformity are highlighted. This is…
Atkinson, A.J.; Trenham, P.C.; Fisher, R.N.; Hathaway, S.A.; Johnson, B.S.; Torres, S.G.; Moore, Y.C.
2004-01-01
critical management uncertainties; and 3) implementing long-term monitoring and adaptive management. Ultimately, the success of regional conservation planning depends on the ability of monitoring programs to confront the challenges of adaptively managing and monitoring complex ecosystems and diverse arrays of sensitive species.
Modeling wildfire incident complexity dynamics
Matthew P. Thompson
2013-01-01
Wildfire management in the United States and elsewhere is challenged by substantial uncertainty regarding the location and timing of fire events, the socioeconomic and ecological consequences of these events, and the costs of suppression. Escalating U.S. Forest Service suppression expenditures are of particular concern at a time of fiscal austerity as swelling fire...
The Nature of Pedagogic Teacher-Student Interactions: A Phenomenographic Study
ERIC Educational Resources Information Center
Beutel, Denise
2010-01-01
Globally, teaching has become more complex and more challenging over recent years, with new and increased demands being placed on teachers by students, their families, governments and wider society. Teachers work with more diverse communities in times characterised by volatility, uncertainty and moral ambiguity. Societal, political, economic and…
Justin M. Louen; Christopher G. Surfleet
2017-01-01
Stream temperature impacts have resulted in increased restrictions on land management, such as timber harvest and riparian restoration, creating considerable uncertainty for future planning and management of redwood (Sequoia sempervirens (D.Don) Endl.) forestlands. Challenges remain in the assessment of downstream cumulative stream...
While aerosol radiative effects have been recognized as some of the largest sources of uncertainty among the forcers of climate change, the verification of the spatial and temporal variability of aerosol radiative forcing has remained challenging. Anthropogenic emissions of prima...
USDA-ARS?s Scientific Manuscript database
The complexity of the hydrologic system challenges the development of models. One issue faced during the model development stage is the uncertainty involved in model parameterization. Using a single optimized set of parameters (one snapshot) to represent baseline conditions of the system limits the ...
ERIC Educational Resources Information Center
Malm, James R.
2008-01-01
A total of six Maryland community college presidents were guided through conversations to identify the organizational challenges and uncertainties that have forced organizational changes in their respective colleges. Another thrust of the research was to discover what organizational change processes these presidents have implemented to overcome…
Future socio-economic impacts and vulnerabilities
Balgis Osman-Elasha; Neil Adger; Maria Brockhaus; Carol J. Pierce Colfer; Brent Sohngen; Tallaat Dafalla; Linda A. Joyce; Nkem Johnson; Carmenza Robledo
2009-01-01
The projected impacts of climate change are significant, and despite the uncertainties associated with current climate and ecosystem model projections, the associated changes in the provision of forest ecosystem services are expected to be substantial in many parts of the world. These impacts will present significant social and economic challenges for affected...
Long-Range Planning: Finding Fiscal Certainty in a Time of Uncertainty
ERIC Educational Resources Information Center
Malinowski, Matthew J.
2012-01-01
To navigate today's fiscal challenges successfully, school districts must constantly examine the long-term fiscal implications of policy, programmatic, and human resource decisions on their organization. They must look at the effect of such items as bargaining agreements, contracted services, placement costs, transportation costs, benefits,…
Assessment of SFR Wire Wrap Simulation Uncertainties
DOE Office of Scientific and Technical Information (OSTI.GOV)
Delchini, Marc-Olivier G.; Popov, Emilian L.; Pointer, William David
Predictive modeling and simulation of nuclear reactor performance and fuel are challenging due to the large number of coupled physical phenomena that must be addressed. Models that will be used for design or operational decisions must be analyzed for uncertainty to ascertain impacts to safety or performance. Rigorous, structured uncertainty analyses are performed by characterizing the model’s input uncertainties and then propagating the uncertainties through the model to estimate output uncertainty. This project is part of the ongoing effort to assess modeling uncertainty in Nek5000 simulations of flow configurations relevant to the advanced reactor applications of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. Three geometries are under investigation in these preliminary assessments: a 3-D pipe, a 3-D 7-pin bundle, and a single pin from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) facility. Initial efforts have focused on gaining an understanding of Nek5000 modeling options and integrating Nek5000 with Dakota. These tasks are being accomplished by demonstrating the use of Dakota to assess parametric uncertainties in a simple pipe flow problem. This problem is used to optimize performance of the uncertainty quantification strategy and to estimate computational requirements for assessments of complex geometries. A sensitivity analysis with respect to three turbulence models was conducted for turbulent flow in a single wire-wrapped pin (THORS) geometry. Section 2 briefly describes the software tools used in this study and provides appropriate references. Section 3 presents the coupling interface between Dakota and a computational fluid dynamics (CFD) code (Nek5000 or STARCCM+), with details on the workflow, the scripts used for setting up the run, and the scripts used for post-processing the output files. In Section 4, the meshing methods used to generate the THORS and 7-pin bundle meshes are explained. Sections 5, 6 and 7 present numerical results for the 3-D pipe, the single pin THORS mesh, and the 7-pin bundle mesh, respectively.
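As a minimal stand-in for the Dakota-driven pipe study, the sketch below propagates parametric uncertainty through a Blasius pressure-drop correlation instead of a Nek5000 solve; the parameter distributions are invented, and only the sampling workflow is the point.

```python
import numpy as np

def pressure_drop(velocity, density, viscosity, diameter=0.05, length=2.0):
    """Darcy-Weisbach with the Blasius friction factor (smooth turbulent pipe)."""
    re = density * velocity * diameter / viscosity
    f = 0.316 * re ** -0.25
    return f * (length / diameter) * 0.5 * density * velocity**2

rng = np.random.default_rng(8)
n = 5000
dp = pressure_drop(velocity=rng.normal(3.0, 0.15, n),     # +/-5% inflow velocity
                   density=rng.normal(1000.0, 10.0, n),
                   viscosity=rng.normal(1.0e-3, 5.0e-5, n))
print(f"dP = {dp.mean():.0f} Pa +/- {dp.std():.0f} Pa "
      f"(95%: {np.percentile(dp, 2.5):.0f}-{np.percentile(dp, 97.5):.0f} Pa)")
```

In the actual workflow, Dakota generates the samples and a script substitutes each parameter set into a CFD input file; the correlation here simply replaces the expensive forward solve.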
NASA Technical Reports Server (NTRS)
Oda, T.; Ott, L.; Lauvaux, T.; Feng, S.; Bun, R.; Roman, M.; Baker, D. F.; Pawson, S.
2017-01-01
Fossil fuel carbon dioxide (CO2) emissions (FFCO2) are the largest input to the global carbon cycle on a decadal time scale. Because total emissions are assumed to be reasonably well constrained by fuel statistics, FFCO2 often serves as a reference in order to deduce carbon uptake by poorly understood terrestrial and ocean sinks. Conventional atmospheric CO2 flux inversions solve for spatially explicit regional sources and sinks and estimate land and ocean fluxes by subtracting FFCO2. Thus, errors in FFCO2 can propagate into the final inferred flux estimates. Gridded emissions are often based on disaggregation of emissions estimated at national or regional level. Although national and regional total FFCO2 are well known, gridded emission fields are subject to additional uncertainties due to the emission disaggregation. Assessing such uncertainties is often challenging because of the lack of physical measurements for evaluation. We first review difficulties in assessing uncertainties associated with gridded FFCO2 emission data and present several approaches for evaluation of such uncertainties at multiple scales. Given known limitations, inter-emission data differences are often used as a proxy for the uncertainty. The popular approach allows us to characterize differences in emissions, but does not allow us to fully quantify emission disaggregation biases. Our work aims to vicariously evaluate FFCO2 emission data using atmospheric models and measurements. We show a global simulation experiment where uncertainty estimates are propagated as an atmospheric tracer (uncertainty tracer) alongside CO2 in NASA's GEOS model and discuss implications of FFCO2 uncertainties in the context of flux inversions. We also demonstrate the use of high resolution urban CO2 simulations as a tool for objectively evaluating FFCO2 data over intense emission regions. Though this study focuses on FFCO2 emission data, the outcome of this study could also help improve the knowledge of similar gridded emissions data for non-CO2 compounds with similar emission characteristics.
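The uncertainty-tracer idea can be sketched in one dimension: emit an uncertainty field alongside CO2 into the same (toy upwind) transport, so that transported uncertainty can be compared with transported CO2 at a downwind receptor. Emission magnitudes and the flow are invented; the GEOS model is not involved.

```python
import numpy as np

nx, dx, u, dt, n_steps = 200, 1.0, 1.0, 0.5, 400
co2 = np.zeros(nx)
unc = np.zeros(nx)
emis = np.zeros(nx); emis[50] = 10.0         # point source (e.g., a city)
emis_unc = 0.3 * emis                         # assumed 30% emission uncertainty

for _ in range(n_steps):
    # first-order upwind advection, applied identically to both tracers
    co2[1:] -= u * dt / dx * (co2[1:] - co2[:-1])
    unc[1:] -= u * dt / dx * (unc[1:] - unc[:-1])
    co2 += emis * dt
    unc += emis_unc * dt

i = 120                                       # downwind receptor cell
print(f"receptor CO2 enhancement: {co2[i]:.1f}, "
      f"fractional uncertainty: {unc[i] / co2[i]:.2f}")
```

Because the transport is linear, the uncertainty tracer arrives at the receptor with the same spatial pattern as the CO2 enhancement, which is what lets a flux inversion discount observations dominated by uncertain emissions.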
NASA Astrophysics Data System (ADS)
Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten
2016-04-01
Landslides have many negative economic and societal impacts, including the potential for significant loss of life and damage to infrastructure. Slope stability assessment can be used to guide decisions about the management of landslide risk, but its usefulness can be challenged by high levels of uncertainty in predicting landslide occurrence. Prediction uncertainty may be associated with the choice of model that is used to assess slope stability, the quality of the available input data, or a lack of knowledge of how future climatic and socio-economic changes may affect future landslide risk. While some of these uncertainties can be characterised by relatively well-defined probability distributions, for other uncertainties, such as those linked to climate change, no probability distribution is available to characterise them. This latter type of uncertainty, often referred to as deep uncertainty, means that robust policies need to be developed that are expected to perform acceptably well over a wide range of future conditions. In our study the impact of deep uncertainty on slope stability predictions is assessed in a quantitative and structured manner using Global Sensitivity Analysis (GSA) and the Combined Hydrology and Stability Model (CHASM). In particular, we use several GSA methods including the Method of Morris, Regional Sensitivity Analysis and Classification and Regression Trees (CART), as well as advanced visualization tools, to assess the combination of conditions that may lead to slope failure. Our example application is a slope in the Caribbean, an area that is naturally susceptible to landslides due to a combination of high rainfall rates during the hurricane season, steep slopes, and highly weathered residual soils. Rapid unplanned urbanisation and changing climate may further exacerbate landslide risk in the future. Our example shows how we can gain useful information in the presence of deep uncertainty by combining physically based models with GSA in a scenario discovery framework.
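A minimal sketch of the Morris screening step, with the classical infinite-slope factor of safety standing in for CHASM (an assumption; CHASM itself is not reproduced here). The SALib package supplies the Morris sampler and analyzer; parameter bounds are illustrative.

```python
import numpy as np
from SALib.sample.morris import sample as morris_sample
from SALib.analyze.morris import analyze as morris_analyze

problem = {
    "num_vars": 4,
    "names": ["cohesion_kPa", "friction_deg", "sat_ratio", "depth_m"],
    "bounds": [[2.0, 15.0], [20.0, 35.0], [0.0, 1.0], [1.0, 4.0]],
}

def factor_of_safety(x, slope_deg=35.0, gamma=19.0, gamma_w=9.81):
    """Infinite-slope stability with partial saturation (toy stand-in for CHASM)."""
    c, phi_deg, m, z = x
    beta, phi = np.radians(slope_deg), np.radians(phi_deg)
    resist = (c + (gamma - gamma_w * m) * z * np.cos(beta) ** 2 * np.tan(phi))
    drive = gamma * z * np.sin(beta) * np.cos(beta)
    return resist / drive

X = morris_sample(problem, N=100, num_levels=4)
Y = np.apply_along_axis(factor_of_safety, 1, X)
Si = morris_analyze(problem, X, Y, num_levels=4)
for name, mu_star, sigma in zip(Si["names"], Si["mu_star"], Si["sigma"]):
    print(f"{name:14s} mu* = {mu_star:.2f}  sigma = {sigma:.2f}")
```

Large mu* flags influential factors; large sigma relative to mu* flags interactions or nonlinearity, which motivates the follow-up with Regional Sensitivity Analysis and CART described above.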
Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D. A.; ...
2016-05-02
Model-based global projections of future land use and land cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socio-economic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g. boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process as well as improving the allocation mechanisms of LULC change models remain important challenges. Furthermore, current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity.
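The attribution step can be sketched with a linear model and ANOVA on a synthetic ensemble: factor levels for model, scenario, and input dataset are crossed, and each factor's share of the total sum of squares approximates its uncertainty contribution, with the remainder as the residual term. All numbers below are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(9)
models = [f"M{i}" for i in range(6)]
scenarios = {"SSP1": -5.0, "SSP2": 0.0, "SSP3": 8.0}    # scenario storyline effect
datasets = {"D1": 0.0, "D2": 2.0}                       # input-data effect
model_eff = {m: rng.normal(0.0, 3.0) for m in models}   # model-structure effect

rows = [{"model": m, "scenario": s, "data": d,
         "change": model_eff[m] + s_eff + d_eff + rng.normal(0.0, 1.0)}
        for m in models
        for s, s_eff in scenarios.items()
        for d, d_eff in datasets.items()]

df = pd.DataFrame(rows)
fit = smf.ols("change ~ C(model) + C(scenario) + C(data)", data=df).fit()
table = anova_lm(fit)
print((table["sum_sq"] / table["sum_sq"].sum()).round(2))  # variance share per factor
```

Applied per grid cell, these variance shares are what produce the maps of uncertainty hotspots and their attributed sources.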
NASA Astrophysics Data System (ADS)
Oda, T.; Ott, L. E.; Lauvaux, T.; Feng, S.; Bun, R.; Roman, M. O.; Baker, D. F.; Pawson, S.
2017-12-01
Fossil fuel carbon dioxide (CO2) emissions (FFCO2) are the largest input to the global carbon cycle on a decadal time scale. Because total emissions are assumed to be reasonably well constrained by fuel statistics, FFCO2 often serves as a reference in order to deduce carbon uptake by poorly understood terrestrial and ocean sinks. Conventional atmospheric CO2 flux inversions solve for spatially explicit regional sources and sinks and estimate land and ocean fluxes by subtracting FFCO2. Thus, errors in FFCO2 can propagate into the final inferred flux estimates. Gridded emissions are often based on disaggregation of emissions estimated at the national or regional level. Although national and regional total FFCO2 are well known, gridded emission fields are subject to additional uncertainties due to the emission disaggregation. Assessing such uncertainties is often challenging because of the lack of physical measurements for evaluation. We first review difficulties in assessing uncertainties associated with gridded FFCO2 emission data and present several approaches for evaluation of such uncertainties at multiple scales. Given known limitations, inter-emission data differences are often used as a proxy for the uncertainty. This popular approach allows us to characterize differences in emissions, but does not allow us to fully quantify emission disaggregation biases. Our work aims to vicariously evaluate FFCO2 emission data using atmospheric models and measurements. We present a global simulation experiment in which uncertainty estimates are propagated as an atmospheric tracer (uncertainty tracer) alongside CO2 in NASA's GEOS model and discuss implications of FFCO2 uncertainties in the context of flux inversions. We also demonstrate the use of high-resolution urban CO2 simulations as a tool for objectively evaluating FFCO2 data over intense emission regions. Though this study focuses on FFCO2 emission data, the outcome of this study could also help improve the knowledge of similar gridded emissions data for non-CO2 compounds that share emission sectors.
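As a deliberately minimal illustration of the uncertainty-tracer idea, the sketch below advects an emission-uncertainty field alongside CO2 in a 1-D toy domain; because linear transport acts identically on both fields, the uncertainty can ride along as a second passive tracer. The grid, wind, and 30% emission uncertainty are hypothetical and unrelated to GEOS.

```python
# Toy illustration of the "uncertainty tracer" idea: linear transport acts
# on an emission-uncertainty field exactly as on the emissions themselves,
# so the uncertainty can be advected as a second passive tracer.
# 1-D upwind advection; all fields are synthetic.
import numpy as np

nx, dx, dt, u = 200, 1.0, 0.5, 1.0   # grid and wind (CFL = u*dt/dx = 0.5)
co2 = np.zeros(nx)
unc = np.zeros(nx)

emis = np.zeros(nx); emis[40:60] = 1.0            # emission "city"
emis_unc = np.zeros(nx); emis_unc[40:60] = 0.3    # assumed 30% emission uncertainty

for _ in range(300):
    for f, src in ((co2, emis), (unc, emis_unc)):
        f += dt * src                              # inject (uncertain) emissions
        f[1:] -= u * dt / dx * (f[1:] - f[:-1])    # upwind advection step
        f[0] = 0.0                                 # clean inflow boundary

peak = co2.argmax()
print(f"downwind CO2 {co2[peak]:.1f} +/- {unc[peak]:.1f} (model units)")
```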
Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D A; Arneth, Almut; Calvin, Katherine; Doelman, Jonathan; Eitelberg, David A; Engström, Kerstin; Fujimori, Shinichiro; Hasegawa, Tomoko; Havlik, Petr; Humpenöder, Florian; Jain, Atul K; Krisztin, Tamás; Kyle, Page; Meiyappan, Prasanth; Popp, Alexander; Sands, Ronald D; Schaldach, Rüdiger; Schüngel, Jan; Stehfest, Elke; Tabeau, Andrzej; Van Meijl, Hans; Van Vliet, Jasper; Verburg, Peter H
2016-12-01
Model-based global projections of future land-use and land-cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socioeconomic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios, we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g., boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process and improving the allocation mechanisms of LULC change models remain important challenges. Current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches, and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity. © 2016 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
Scott, Finlay; Jardim, Ernesto; Millar, Colin P; Cerviño, Santiago
2016-01-01
Estimating fish stock status is very challenging given the many sources and high levels of uncertainty surrounding the biological processes (e.g. natural variability in the demographic rates), model selection (e.g. choosing growth or stock assessment models) and parameter estimation. Incorporating multiple sources of uncertainty in a stock assessment allows advice to better account for the risks associated with proposed management options, promoting decisions that are more robust to such uncertainty. However, a typical assessment only reports the model fit and variance of estimated parameters, thereby underreporting the overall uncertainty. Additionally, although multiple candidate models may be considered, only one is selected as the 'best' result, effectively rejecting the plausible assumptions behind the other models. We present an applied framework to integrate multiple sources of uncertainty in the stock assessment process. The first step is the generation and conditioning of a suite of stock assessment models that contain different assumptions about the stock and the fishery. The second step is the estimation of parameters, including fitting of the stock assessment models. The final step integrates across all of the results to reconcile the multi-model outcome. The framework is flexible enough to be tailored to particular stocks and fisheries and can draw on information from multiple sources to implement a broad variety of assumptions, making it applicable to stocks with varying levels of data availability. The Iberian hake stock in International Council for the Exploration of the Sea (ICES) Divisions VIIIc and IXa is used to demonstrate the framework, starting from length-based stock and indices data. Process and model uncertainty are considered through the growth, natural mortality, fishing mortality, survey catchability and stock-recruitment relationship. Estimation uncertainty is included as part of the fitting process. Simple model averaging is used to integrate across the results and produce a single assessment that considers the multiple sources of uncertainty.
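A minimal sketch of the final integration step, assuming simple (equal-weight) model averaging over already-fitted candidates; the per-model estimates and standard errors are synthetic placeholders for real stock assessment outputs such as spawning stock biomass trajectories.

```python
# Hedged sketch of integrating a multi-model ensemble by simple model
# averaging. The candidate "fits" below are placeholders; a real
# application would use fitted stock assessment model outputs.
import numpy as np

rng = np.random.default_rng(1)
n_years, n_models = 20, 6

# Placeholder: each model's point estimate and standard error per year.
estimates = rng.normal(100, 10, (n_models, n_years)).cumsum(axis=1) / 5 + 100
std_errs = rng.uniform(5, 15, (n_models, n_years))

# Equal weights implement "simple model averaging"; unequal weights
# (e.g. AIC- or cross-validation-based) would drop in here instead.
w = np.full(n_models, 1.0 / n_models)

mean = w @ estimates
# Total variance = within-model variance + between-model variance, so the
# averaged result retains structural (model) uncertainty, not only the
# estimation uncertainty reported by any single fit.
total_var = w @ std_errs**2 + w @ (estimates - mean)**2
print(mean[-1], np.sqrt(total_var[-1]))
```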
NASA Astrophysics Data System (ADS)
Klein, R.; Woodward, C. S.; Johannesson, G.; Domyancic, D.; Covey, C. C.; Lucas, D. D.
2012-12-01
Uncertainty Quantification (UQ) is a critical field within 21st century simulation science that resides at the very center of the web of emerging predictive capabilities. The science of UQ holds the promise of giving much greater meaning to the results of complex large-scale simulations, allowing uncertainties to be quantified and bounded. This powerful capability will yield new insights into scientific predictions (e.g. Climate) of great impact on both national and international arenas, allow informed decisions on the design of critical experiments (e.g. ICF capsule design, MFE, NE) in many scientific fields, and assign confidence bounds to scientifically predictable outcomes (e.g. nuclear weapons design). In this talk I will discuss a major new strategic initiative (SI) we have developed at Lawrence Livermore National Laboratory to advance the science of Uncertainty Quantification at LLNL, focusing in particular on (a) the research and development of new algorithms and methodologies of UQ as applied to multi-physics multi-scale codes, (b) incorporation of these advancements into a global UQ Pipeline (i.e. a computational superstructure) that will simplify user access to sophisticated tools for UQ studies as well as act as a self-guided, self-adapting UQ engine for UQ studies on extreme computing platforms, and (c) the use of laboratory applications as a test bed for new algorithms and methodologies. The initial SI focus has been on applications for the quantification of uncertainty associated with Climate prediction, but the validated UQ methodologies we have developed are now being fed back into Science Based Stockpile Stewardship (SSS) and ICF UQ efforts. To make advancements in several of these UQ grand challenges, I will focus in this talk on the following three research areas in our Strategic Initiative: error estimation in multi-physics and multi-scale codes; tackling the "Curse of High Dimensionality"; and the development of an advanced UQ Computational Pipeline to enable complete UQ workflow and analysis for ensemble runs at the extreme scale (e.g. exascale) with self-guiding adaptation in the UQ Pipeline engine. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).
NASA Astrophysics Data System (ADS)
Vandergoes, Marcus J.; Howarth, Jamie D.; Dunbar, Gavin B.; Turnbull, Jocelyn C.; Roop, Heidi A.; Levy, Richard H.; Li, Xun; Prior, Christine; Norris, Margaret; Keller, Liz D.; Baisden, W. Troy; Ditchburn, Robert; Fitzsimons, Sean J.; Bronk Ramsey, Christopher
2018-05-01
Annually resolved (varved) lake sequences are important palaeoenvironmental archives as they offer a direct incremental dating technique for high-frequency reconstruction of environmental and climate change. Despite the importance of these records, establishing a robust chronology and quantifying its precision and accuracy (estimations of error) remains an essential but challenging component of their development. We outline an approach for building reliable independent chronologies, testing the accuracy of layer counts and integrating all chronological uncertainties to provide quantitative age and error estimates for varved lake sequences. The approach incorporates (1) layer counts and estimates of counting precision; (2) radiometric and biostratigraphic dating techniques to derive an independent chronology; and (3) the application of Bayesian age modelling to produce an integrated age model. This approach is applied to a case study of an annually resolved sediment record from Lake Ohau, New Zealand. The most robust age model provides an average error of 72 years across the whole depth range. This represents a fractional uncertainty of ∼5%, higher than the <3% quoted for most published varve records. However, the age model and reported uncertainty represent the best fit between layer counts and independent chronology and the uncertainties account for both layer counting precision and the chronological accuracy of the layer counts. This integrated approach provides a more representative estimate of age uncertainty and therefore represents a statistically more robust chronology.
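The integration logic can be caricatured as a Bayesian update at a single depth, with the layer count supplying the prior and a radiometric date the likelihood; the grid-based sketch below uses invented ages and error terms and is far simpler than the full age modelling applied in the study.

```python
# Toy sketch of integrating a varve count with an independent radiometric
# date at the same depth via a grid-based Bayesian update. All numbers
# (ages, 3% counting error, dating error) are illustrative assumptions.
import numpy as np
from scipy import stats

ages = np.arange(800, 1400)  # candidate ages (yr BP) at one depth

# Prior from layer counting: 1100 counted layers with ~3% counting error.
prior = stats.norm(1100, 0.03 * 1100).pdf(ages)

# Likelihood from a radiometric date at the same depth: 1150 +/- 40 yr.
like = stats.norm(1150, 40).pdf(ages)

post = prior * like
post /= post.sum()

mean = (ages * post).sum()
sd = np.sqrt(((ages - mean)**2 * post).sum())
print(f"integrated age: {mean:.0f} +/- {sd:.0f} yr BP")
```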
Gao, Xueping; Liu, Yinzhu; Sun, Bowen
2018-06-05
The risk of water shortage caused by uncertainties, such as frequent drought, varied precipitation, multiple water resources, and differing water demands, brings new challenges to water transfer projects. Uncertainties exist in both transferred water and local surface water; the relationship between them should therefore be thoroughly studied to prevent water shortage. For more effective water management, an uncertainty-based water shortage risk assessment model (UWSRAM) is developed to study the combined effect of multiple water resources and to analyze the degree of shortage under uncertainty. The UWSRAM combines copula-based Monte Carlo stochastic simulation with a chance-constrained programming-stochastic multiobjective optimization model, using the Lunan water-receiving area in China as an example. Statistical copula functions are employed to estimate the joint probability distribution of available transferred water and local surface water, and samples from this multivariate distribution are used as inputs to the optimization model. The approach reveals the distribution of water shortage, emphasizes the importance of improving and updating the management of transferred water and local surface water, and examines their combined influence on water shortage risk. Applying the UWSRAM yields the possible available water and shortages, together with the corresponding allocation measures under different water availability levels and violation probabilities. The UWSRAM is valuable for understanding the overall multi-source water supply and the degree of water shortage, adapting to the uncertainty surrounding water resources, establishing effective water resource planning policies for managers, and achieving sustainable development.
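A minimal sketch of the copula-based sampling step, assuming a Gaussian copula with illustrative gamma and lognormal marginals for the two water sources; the correlation, demand, and units are hypothetical.

```python
# Minimal sketch of copula-based Monte Carlo sampling of two dependent
# water sources, in the spirit of the UWSRAM input generation. Marginal
# distributions and the dependence parameter are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 10_000
rho = 0.6  # assumed dependence between transferred and local surface water

# 1) Sample a Gaussian copula: correlated standard normals -> uniforms.
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
u = stats.norm.cdf(z)

# 2) Map uniforms through assumed marginals (gamma for transferred water,
#    lognormal for local surface water), giving joint availability samples.
transfer = stats.gamma(a=4.0, scale=25.0).ppf(u[:, 0])   # 10^6 m^3
local = stats.lognorm(s=0.4, scale=80.0).ppf(u[:, 1])    # 10^6 m^3

# 3) Feed the joint samples to a (placeholder) shortage calculation.
demand = 220.0
shortage = np.maximum(demand - (transfer + local), 0.0)
print("P(shortage) =", (shortage > 0).mean(), "mean shortage =", shortage.mean())
```

In a full application the joint samples would drive the chance-constrained optimization rather than a fixed demand, but the copula step shown here is what lets the dependence between the two sources enter the risk assessment.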
Treatment planning optimisation in proton therapy
McGowan, S E; Burnet, N G; Lomax, A J
2013-01-01
The goal of radiotherapy is to achieve uniform target coverage while sparing normal tissue. In proton therapy, the same sources of geometric uncertainty are present as in conventional radiotherapy. However, an important and fundamental difference in proton therapy is that protons have a finite range, highly dependent on the electron density of the material they are traversing, resulting in a steep dose gradient at the distal edge of the Bragg peak. Therefore, an accurate knowledge of the sources and magnitudes of the uncertainties affecting the proton range is essential for producing plans which are robust to these uncertainties. This review describes the current knowledge of the geometric uncertainties and discusses their impact on proton dose plans. Patient-specific validation is essential, and in cases of complex intensity-modulated proton therapy plans the use of a planning target volume (PTV) may fail to ensure coverage of the target. In cases where a PTV cannot be used, other methods of quantifying plan quality have been investigated. A promising option is to incorporate uncertainties directly into the optimisation algorithm. A further development is the inclusion of robustness into a multicriteria optimisation framework, allowing a multi-objective Pareto optimisation function to balance robustness and conformity. The question remains as to whether adaptive therapy can become an integral part of proton therapy, to allow re-optimisation during the course of a patient's treatment. The challenge of ensuring that plans are robust to range uncertainties in proton therapy remains, although these methods can provide practical solutions. PMID:23255545
The rationale for intensity-modulated proton therapy in geometrically challenging cases
NASA Astrophysics Data System (ADS)
Safai, S.; Trofimov, A.; Adams, J. A.; Engelsman, M.; Bortfeld, T.
2013-09-01
Intensity-modulated proton therapy (IMPT) delivered with beam scanning is currently available at a limited number of proton centers. However, a simplified form of IMPT, the technique of field ‘patching’, has long been a standard practice in proton therapy centers. In field patching, different parts of the target volume are treated from different directions, i.e., a part of the tumor gets either full dose from a radiation field, or almost no dose. Thus, patching represents a form of binary intensity modulation. This study explores the limitations of the standard binary field patching technique, and evaluates possible dosimetric advantages of continuous dose modulations in IMPT. Specifics of the beam delivery technology, i.e., pencil beam scanning versus passive scattering and modulation, are not investigated. We have identified two geometries of target volumes and organs at risk (OAR) in which the use of field patching is severely challenged. We focused our investigations on two patient cases that exhibit these geometries: a paraspinal tumor case and a skull-base case. For those cases we performed treatment planning comparisons of three-dimensional conformal proton therapy (3DCPT) with field patching versus IMPT, using commercial and in-house software, respectively. We also analyzed the robustness of the resulting plans with respect to systematic setup errors of ±1 mm and range errors of ±2.5 mm. IMPT is able to better spare OAR while providing superior dose coverage for the challenging cases identified above. Both 3DCPT and IMPT are sensitive to setup errors and range uncertainties, with IMPT showing the largest effect. Nevertheless, when delivery uncertainties are taken into account IMPT plans remain superior regarding target coverage and OAR sparing. On the other hand, some clinical goals, such as the maximum dose to OAR, are more likely to be unmet with IMPT under large range errors. IMPT can potentially improve target coverage and OAR sparing in challenging cases, even when compared with the relatively complicated and time consuming field patching technique. While IMPT plans tend to be more sensitive to delivery uncertainties, their dosimetric advantage generally holds. Robust treatment planning techniques may further reduce the sensitivity of IMPT plans.
Uncertainty characterization of HOAPS 3.3 latent heat-flux-related parameters
NASA Astrophysics Data System (ADS)
Liman, Julian; Schröder, Marc; Fennig, Karsten; Andersson, Axel; Hollmann, Rainer
2018-03-01
Latent heat flux (LHF) is one of the main contributors to the global energy budget. As the density of in situ LHF measurements over the global oceans is generally poor, the potential of remotely sensed LHF for meteorological applications is enormous. However, to date none of the available satellite products have included estimates of systematic, random, and sampling uncertainties, all of which are essential for assessing their quality. Here, the challenge is taken on by matching LHF-related pixel-level data of the Hamburg Ocean Atmosphere Parameters and Fluxes from Satellite (HOAPS) climatology (version 3.3) to in situ measurements originating from a high-quality data archive of buoys and selected ships. Assuming the ground reference to be bias-free, this allows for deriving instantaneous systematic uncertainties as a function of four atmospheric predictor variables. The approach is regionally independent and therefore overcomes the issue of sparse in situ data densities over large oceanic areas. Likewise, random uncertainties are derived, which include not only a retrieval component but also contributions from in situ measurement noise and the collocation procedure. A recently published random uncertainty decomposition approach is applied to isolate the random retrieval uncertainty of all LHF-related HOAPS parameters. It makes use of two combinations of independent data triplets of both satellite and in situ data, which are analysed in terms of their pairwise variances of differences. Instantaneous uncertainties are finally aggregated, allowing for uncertainty characterizations on monthly to multi-annual timescales. Results show that systematic LHF uncertainties range between 15 and 50 W m⁻² with a global mean of 25 W m⁻². Local maxima are mainly found over the subtropical ocean basins as well as along the western boundary currents. Investigations indicate that contributions from q_a (U) to the overall LHF uncertainty are on the order of 60 % (25 %). From an instantaneous point of view, random retrieval uncertainties are specifically large over the subtropics with a global average of 37 W m⁻². In a climatological sense, their magnitudes become negligible, as do respective sampling uncertainties. Regional and seasonal analyses suggest that the largest total LHF uncertainties are seen over the Gulf Stream and the Indian monsoon region during boreal winter. In light of the uncertainty measures, the observed continuous global mean LHF increase up to 2009 needs to be treated with caution. The demonstrated approach can easily be transferred to other satellite retrievals, which increases the significance of the present work.
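The triplet-based decomposition can be illustrated with a triple-collocation-style covariance identity; the sketch below uses synthetic data and assumes error cross-correlations vanish across the three collocated estimates, the standard assumption behind such decompositions.

```python
# Sketch of a triple-collocation-style estimate of random retrieval
# uncertainty, assuming three collocated LHF estimates with mutually
# uncorrelated errors. Data are synthetic; the key identity is
# err_var(x) = cov(x - y, x - z) when errors are independent.
import numpy as np

rng = np.random.default_rng(0)
truth = rng.normal(110, 40, 5000)             # "true" LHF, W m^-2 (synthetic)
sat = truth + rng.normal(0, 25, truth.size)   # satellite retrieval
buoy = truth + rng.normal(0, 8, truth.size)   # in situ reference
other = truth + rng.normal(0, 15, truth.size) # second independent estimate

def error_std(x, y, z):
    # x - y and x - z share only x's error, so their covariance isolates it.
    return np.sqrt(np.cov(x - y, x - z)[0, 1])

print("satellite random uncertainty ~", round(error_std(sat, buoy, other), 1), "W m^-2")
print("buoy random uncertainty      ~", round(error_std(buoy, sat, other), 1), "W m^-2")
```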
Goutzamanis, Stelliana; Doyle, Joseph S; Thompson, Alexander; Dietze, Paul; Hellard, Margaret; Higgs, Peter
2018-04-02
People who inject drugs (PWID) are most at risk of hepatitis C virus infection in Australia. The introduction of transient elastography (TE) (measuring hepatitis fibrosis) and direct acting antiviral medications will likely alter the experience of living with hepatitis C. We aimed to explore positive and negative influences on wellbeing and stress among PWID with hepatitis C. The Treatment and Prevention (TAP) study examines the feasibility of treating hepatitis C mono-infected PWID in community settings. Semi-structured interviews were conducted with 16 purposively recruited TAP participants. Participants were aware of their hepatitis C seropositive status and had received fibrosis assessment (measured by TE) prior to interview. Questions were open-ended, focusing on the impact of health status on wellbeing and self-reported stress. Interviews were voice recorded, transcribed verbatim and thematically analysed, guided by Mishel's (1988) theory of Uncertainty in Illness. In line with Mishel's theory of Uncertainty in Illness all participants reported hepatitis C-related uncertainty, particularly mis-information or a lack of knowledge surrounding liver health and the meaning of TE results. Those with greater fibrosis experienced an extra layer of prognostic uncertainty. Experiences of uncertainty were a key motivation to seek treatment, which was seen as a way to regain some stability in life. Treatment completion alleviated hepatitis C-related stress, and promoted feelings of empowerment and confidence in addressing other life challenges. TE scores seemingly provide some certainty. However, when paired with limited knowledge, particularly among people with severe fibrosis, TE may be a source of uncertainty and increased personal stress. This suggests the need for simple education programs and resources on liver health to minimise stress.
Sources of uncertainty as a basis to fill the information gap in a response to flood
NASA Astrophysics Data System (ADS)
Kekez, Toni; Knezic, Snjezana
2016-04-01
Taking uncertainties into account in flood risk management remains a challenge due to difficulties in choosing adequate structural and/or non-structural risk management options. Despite such measures, wrong decisions are often made when a flood occurs. Parameter and structural uncertainties, which include model and observation errors as well as a lack of knowledge about system characteristics, are the main considerations. Real-time flood risk assessment methods are predominantly based on measured water level values and on the vulnerability and other relevant characteristics of the flood-affected area. The goal of this research is to identify sources of uncertainty and to minimize the information gap between the point where the water level is measured and the affected area, taking into consideration the main uncertainties that can affect the risk value at the observed point or section of the river. Sources of uncertainty are identified and determined using a system analysis approach, and the relevant uncertainties are included in the risk assessment model. With such a methodological approach it is possible to gain response time through more effective risk assessment that includes an uncertainty propagation model. The response phase could be better planned with adequate early warning systems, resulting in more time and lower costs to help affected areas and save human lives. Reliable and precise information is necessary to raise the level of emergency operability, enhancing the safety of citizens and reducing possible damage. The results of the EPISECC (EU-funded FP7) project are used to validate the potential benefits of this research for improving flood risk management and response methods. EPISECC aims at developing the concept of a common European Information Space for disaster response which, among other disasters, considers floods.
Probabilistic cost estimates for climate change mitigation.
Rogelj, Joeri; McCollum, David L; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan
2013-01-03
For more than a decade, the target of keeping global warming below 2 °C has been a key focus of the international climate debate. In response, the scientific community has published a number of scenario studies that estimate the costs of achieving such a target. Producing these estimates remains a challenge, particularly because of relatively well known, but poorly quantified, uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on the one hand, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other hand, has spent years improving its understanding of the geophysical response of the Earth system to emissions of greenhouse gases. This geophysical response remains a key uncertainty in the cost of mitigation scenarios but has been integrated with assessments of other uncertainties in only a rudimentary manner, that is, for equilibrium conditions. Here we bridge this gap between the two research communities by generating distributions of the costs associated with limiting transient global temperature increase to below specific values, taking into account uncertainties in four factors: geophysical, technological, social and political. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by geophysical uncertainties, social factors influencing future energy demand and, lastly, technological uncertainties surrounding the availability of greenhouse gas mitigation options. Our information on temperature risk and mitigation costs provides crucial information for policy-making, because it clarifies the relative importance of mitigation costs, energy demand and the timing of global action in reducing the risk of exceeding a global temperature increase of 2 °C, or other limits such as 3 °C or 1.5 °C, across a wide range of scenarios.
NASA Astrophysics Data System (ADS)
Huda, J.; Kauneckis, D. L.
2013-12-01
Climate change adaptation represents a number of unique policy-making challenges. Foremost among these is dealing with the range of future climate impacts on a wide scope of inter-related natural systems, their interaction with social and economic systems, and uncertainty resulting from the variety of downscaled climate model scenarios and climate science projections. These cascades of uncertainty have led to a number of new approaches as well as a reexamination of traditional methods for evaluating risk and uncertainty in policy-making. Policy makers are required to make decisions and formulate policy irrespective of the level of uncertainty involved; while the debate continues regarding the level of scientific certainty required to make a decision, incremental change in climate policy continues at multiple governance levels. This project conducts a comparative analysis of the range of methodological approaches that are evolving to address uncertainty in climate change policy. It defines 'methodologies' to include a variety of quantitative and qualitative approaches involving both top-down and bottom-up policy processes that attempt to enable policymakers to synthesize climate information into the policy process. The analysis examines methodological approaches to decision-making in climate policy based on criteria such as sources of policy choice information, sectors to which the methodology has been applied, sources from which climate projections were derived, quantitative and qualitative methods used to deal with uncertainty, and the benefits and limitations of each. A typology is developed to better categorize the variety of approaches and methods, examine the scope of policy activities they are best suited for, and highlight areas for future research and development.
The Certainty of Uncertainty: Potential Sources of Bias and Imprecision in Disease Ecology Studies.
Lachish, Shelly; Murray, Kris A
2018-01-01
Wildlife diseases have important implications for wildlife and human health, the preservation of biodiversity and the resilience of ecosystems. However, understanding disease dynamics and the impacts of pathogens in wild populations is challenging because these complex systems can rarely, if ever, be observed without error. Uncertainty in disease ecology studies is commonly defined in terms of either heterogeneity in detectability (due to variation in the probability of encountering, capturing, or detecting individuals in their natural habitat) or uncertainty in disease state assignment (due to misclassification errors or incomplete information). In reality, however, uncertainty in disease ecology studies extends beyond these components of observation error and can arise from multiple varied processes, each of which can lead to bias and a lack of precision in parameter estimates. Here, we present an inventory of the sources of potential uncertainty in studies that attempt to quantify disease-relevant parameters from wild populations (e.g., prevalence, incidence, transmission rates, force of infection, risk of infection, persistence times, and disease-induced impacts). We show that uncertainty can arise via processes pertaining to aspects of the disease system, the study design, the methods used to study the system, and the state of knowledge of the system, and that uncertainties generated via one process can propagate through to others because of interactions between the numerous biological, methodological and environmental factors at play. We show that many of these sources of uncertainty may not be immediately apparent to researchers (for example, unidentified crypticity among vectors, hosts or pathogens, a mismatch between the temporal scale of sampling and disease dynamics, demographic or social misclassification), and thus have received comparatively little consideration in the literature to date. Finally, we discuss the type of bias or imprecision introduced by these varied sources of uncertainty and briefly present appropriate sampling and analytical methods to account for, or minimise, their influence on estimates of disease-relevant parameters. This review should assist researchers and practitioners to navigate the pitfalls of uncertainty in wildlife disease ecology studies.
Rainfall or parameter uncertainty? The power of sensitivity analysis on grouped factors
NASA Astrophysics Data System (ADS)
Nossent, Jiri; Pereira, Fernando; Bauwens, Willy
2017-04-01
Hydrological models are typically used to study and represent (a part of) the hydrological cycle. In general, the output of these models mostly depends on their input rainfall and parameter values. Both model parameters and input precipitation, however, are characterized by uncertainties and, therefore, lead to uncertainty on the model output. Sensitivity analysis (SA) makes it possible to assess and compare the importance of the different factors for this output uncertainty. To this end, the rainfall uncertainty can be incorporated in the SA by representing it as a probabilistic multiplier. Such a multiplier can be defined for the entire time series, or several of these factors can be determined for every recorded rainfall pulse or for hydrologically independent storm events. As a consequence, the number of parameters included in the SA related to the rainfall uncertainty can be (much) lower or (much) higher than the number of model parameters. Although such analyses can yield interesting results, it remains challenging to determine which type of uncertainty will affect the model output most, due to the different weight both types will have within the SA. In this study, we apply the variance-based Sobol' sensitivity analysis method to two different hydrological simulators (NAM and HyMod) for four diverse watersheds. Besides the different number of model parameters (NAM: 11 parameters; HyMod: 5 parameters), the setup of our combined sensitivity and uncertainty analysis is also varied by defining a variety of scenarios with different numbers of rainfall multipliers. To overcome the issue of the different number of factors and, thus, the different weights of the two types of uncertainty, we build on one of the advantageous properties of the Sobol' SA, i.e. treating grouped parameters as a single parameter. The latter results in a setup with a single factor for each uncertainty type and allows for a straightforward comparison of their importance. In general, the results show a clear influence of the weights in the different SA scenarios. However, working with grouped factors resolves this issue and leads to clear importance results.
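A minimal pick-freeze sketch of first-order Sobol' indices for two grouped factors, all rainfall multipliers versus all model parameters, with a toy response standing in for NAM or HyMod; sample sizes and ranges are illustrative.

```python
# Minimal pick-freeze sketch of first-order Sobol' indices for grouped
# factors: all rainfall multipliers form one group, all model parameters
# the other. A toy algebraic response stands in for NAM/HyMod.
import numpy as np

rng = np.random.default_rng(7)
n, n_mult, n_par = 50_000, 30, 5

def model(mult, par):
    # Toy response mixing rainfall-multiplier and parameter effects.
    return mult.mean(axis=1) * (1 + par[:, 0]) + 0.5 * par[:, 1:].sum(axis=1)

A_m, B_m = rng.uniform(0.5, 1.5, (2, n, n_mult))   # rainfall multiplier group
A_p, B_p = rng.uniform(0.0, 1.0, (2, n, n_par))    # model parameter group

yA = model(A_m, A_p)
var = yA.var()

def first_order(y_mixed):
    # S_G = cov(f(A), f(group G from A, rest from B)) / V(Y)
    return np.cov(yA, y_mixed)[0, 1] / var

S_multipliers = first_order(model(A_m, B_p))  # multiplier group held from A
S_parameters = first_order(model(B_m, A_p))   # parameter group held from A
print(round(S_multipliers, 2), round(S_parameters, 2))
```

Treating each group as a single factor in this way yields exactly one importance measure per uncertainty type, regardless of how many multipliers the scenario defines, which is the property the study exploits.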
Back pain and the resolution of diagnostic uncertainty in illness narratives.
Lillrank, Annika
2003-09-01
In this paper I consider 30 Finnish women's written narratives about the process of getting back pain diagnosed. From the beginning of the early discomfort of back pain, the women were sure of its bodily and subjective reality. They struggled repeatedly to be taken seriously, and only after years of medical disparagement did they encounter medical professionals who were able to solve the riddle and give it a name, a diagnosis. Since back pain is a baffling problem that challenges the central biomedical epistemology (objective knowledge and measurable findings separate from subjective experience), it allowed the doctors to show a disrespectful attitude toward back pain sufferers. The moral essence of the women's common story was the stigmatizing experience when doctors did not take subjective pain seriously. Instead, doctors' neglectful attitudes became part of the prolonged problem. During the long-lasting uncertainty, women tried multiple coping strategies to ease their lives and developed mental attitudes to endure the pain. Since the protagonists did not give up the lived certainty of back pain, they were gradually able to challenge medical uncertainty and to demand a thorough medical examination, and/or through random circumstance they encountered doctors who were willing to take their symptoms seriously. This triggered turning points that immediately or very soon resulted in solving the riddle of the puzzling pain. To be finally diagnosed was a great relief. However, to be taken seriously as a person was considered to be the greatest relief.
Shretta, Rima; Yadav, Prashant
2012-12-02
The global demand for artemisinin-based combination therapy (ACT) has grown sharply since its recommendation by the World Health Organization in 2002. However, a combination of financing and programmatic uncertainties, limited suppliers of finished products, information opacity across the different tiers in the supply chain, and widespread fluctuations in raw material prices have together contributed to a market fraught with demand and supply uncertainties and price volatility. Various short-term solutions have been deployed to alleviate supply shortages caused by these challenges; however, new mechanisms are required to build resilience into the supply chain. This review concludes that a mix of strategies is required to stabilize the artemisinin and ACT market. First, better and more effective pooling of demand and supply risks and better contracting to allow risk sharing among the stakeholders are needed. Second, physical and financial buffer stocks will enable better matching of demand and supply in the short and medium term: physical buffers will allow stable supplies when there are procurement and supply management challenges, while financial buffer funds will address issues around funding disruptions. Finally, in the medium to long term, significant investments in country-level system strengthening will be required to minimize national-level demand uncertainties. In addition, a voluntary standard for extractors to ensure appropriate purchasing and sales practices as well as minimum quality and ethical standards could help stabilize the artemisinin market in the long term.
The Impact of Model Uncertainty on Spatial Compensation in Structural Acoustic Control
NASA Technical Reports Server (NTRS)
Clark, Robert L.
2005-01-01
Turbulent boundary layer (TBL) noise is considered a primary contribution to the interior noise present in commercial airliners. There are numerous investigations of interior noise control devoted to aircraft panels; however, practical realization is a potential challenge since physical boundary conditions are uncertain at best. In most prior studies, pinned or clamped boundary conditions were assumed; however, realistic panels likely display a range of boundary conditions between these two limits. Uncertainty in boundary conditions is a challenge for control system designers, both in terms of the compensator implemented and the location of transducers required to achieve the desired control. The impact of model uncertainties, specifically uncertain boundaries, on the selection of transducer locations for structural acoustic control is considered herein. The final goal of this work is the design of an aircraft panel structure that can reduce TBL noise transmission through the use of a completely adaptive, single-input, single-output control system. The feasibility of this goal is demonstrated through the creation of a detailed analytical solution, followed by the implementation of a test model in a transmission loss apparatus. Successfully realizing a control system robust to variations in boundary conditions can lead to the design and implementation of practical adaptive structures that could be used to control the transmission of sound to the interior of aircraft. Results from this research effort indicate it is possible to optimize the design of actuator and sensor location and aperture, minimizing the impact of boundary conditions on the desired structural acoustic control.
Management strategies in hospitals: scenario planning.
Ghanem, Mohamed; Schnoor, Jörg; Heyde, Christoph-Eckhard; Kuwatsch, Sandra; Bohn, Marco; Josten, Christoph
2015-01-01
Instead of waiting for challenges to confront hospital management, doctors and managers should act in advance to optimize and sustain value-based health. This work highlights the importance of scenario planning in hospitals, proposes an elaborated definition of the stakeholders of a hospital and defines the influence factors to which hospitals are exposed. Based on literature analysis as well as on personal interviews with stakeholders, we propose an elaborated definition of stakeholders and designed a questionnaire that integrated the following influence factors, which have relevant impact on hospital management: political/legal, economic, social, technological and environmental forces. These influence factors are examined to develop the so-called critical uncertainties. Thorough identification of uncertainties was based on a "Stakeholder Feedback". Two key uncertainties were identified and considered in this study: the development of the workload for the medical staff, and the profit-oriented performance of the medical staff. According to the developed scenarios, complementary education of the medical staff as well as of non-medical top executives and managers of hospitals was the recommended core strategy. Complementary scenario-specific strategic options should be considered whenever needed to optimize dealing with a specific future development of the health care environment. Strategic planning in hospitals is essential to ensure sustainable success. It considers multiple situations and integrates internal and external insights and perspectives, in addition to identifying weak signals and "blind spots". This flows into sound planning for multiple strategic options. It is a state-of-the-art tool that allows dealing with the increasing challenges facing hospital management.
Cognitive influences on risk-seeking by rhesus macaques
Hayden, Benjamin Y.; Heilbronner, Sarah R.; Nair, Amrita C.; Platt, Michael L.
2009-01-01
Humans and other animals are idiosyncratically sensitive to risk, either preferring or avoiding options having the same value but differing in uncertainty. Many explanations for risk sensitivity rely on the non-linear shape of a hypothesized utility curve. Because such models do not place any importance on uncertainty per se, utility curve-based accounts predict indifference between risky and riskless options that offer the same distribution of rewards. Here we show that monkeys strongly prefer uncertain gambles to alternating rewards with the same payoffs, demonstrating that uncertainty itself contributes to the appeal of risky options. Based on prior observations, we hypothesized that the appeal of the risky option is enhanced by the salience of the potential jackpot. To test this, we subtly manipulated payoffs in a second gambling task. We found that monkeys are more sensitive to small changes in the size of the large reward than to equivalent changes in the size of the small reward, indicating that they attend preferentially to the jackpots. Together, these results challenge utility curve-based accounts of risk sensitivity, and suggest that psychological factors, such as outcome salience and uncertainty itself, contribute to risky decision-making. PMID:19844596
Wildhaber, Mark L.; Wikle, Christopher K.; Anderson, Christopher J.; Franz, Kristie J.; Moran, Edward H.; Dey, Rima; Mader, Helmut; Kraml, Julia
2012-01-01
Climate change operates over a broad range of spatial and temporal scales. Understanding its effects on ecosystems requires multi-scale models. For understanding effects on fish populations of riverine ecosystems, climate predicted by coarse-resolution Global Climate Models must be downscaled to Regional Climate Models to watersheds to river hydrology to population response. An additional challenge is quantifying sources of uncertainty given the highly nonlinear nature of interactions between climate variables and community level processes. We present a modeling approach for understanding and accommodating uncertainty by applying multi-scale climate models and a hierarchical Bayesian modeling framework to Midwest fish population dynamics and by linking models for system components together by formal rules of probability. The proposed hierarchical modeling approach will account for sources of uncertainty in forecasts of community or population response. The goal is to evaluate the potential distributional changes in an ecological system, given distributional changes implied by a series of linked climate and system models under various emissions/use scenarios. This understanding will aid evaluation of management options for coping with global climate change. In our initial analyses, we found that predicted pallid sturgeon population responses were dependent on the climate scenario considered.
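A toy sketch of linking component models by rules of probability: sampling forward through a chain of conditionals so that climate uncertainty propagates into hydrology and then into a population response. All distributions and coefficients are invented for illustration and are not from the study.

```python
# Toy forward chain: p(population | hydrology) p(hydrology | climate)
# p(climate), sampled so that uncertainty propagates through the links.
import numpy as np

rng = np.random.default_rng(3)
n = 20_000

# Climate scenario: summer temperature anomaly (degC) under one emissions path.
d_temp = rng.normal(2.0, 0.8, n)

# Hydrology given climate: a low-flow index declines with warming, plus model error.
low_flow = 1.0 - 0.12 * d_temp + rng.normal(0, 0.08, n)

# Population given hydrology: recruitment success of a river fish (logistic link).
logit_p = -0.5 + 3.0 * (low_flow - 0.7) + rng.normal(0, 0.4, n)
recruit = 1 / (1 + np.exp(-logit_p))

print("median recruitment:", np.round(np.median(recruit), 2))
print("90% interval:", np.round(np.percentile(recruit, [5, 95]), 2))
```

The resulting spread in recruitment is a distributional forecast: each upstream source of uncertainty widens the downstream interval, which is precisely what the hierarchical Bayesian framework is designed to track formally.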
Characterizing Uncertainty and Variability in PBPK Models ...
Mode-of-action based risk and safety assessments can rely upon tissue dosimetry estimates in animals and humans obtained from physiologically-based pharmacokinetic (PBPK) modeling. However, risk assessment also increasingly requires characterization of uncertainty and variability; such characterization for PBPK model predictions represents a continuing challenge to both modelers and users. Current practices show significant progress in specifying deterministic biological models and the non-deterministic (often statistical) models, estimating their parameters using diverse data sets from multiple sources, and using them to make predictions and characterize uncertainty and variability. The International Workshop on Uncertainty and Variability in PBPK Models, held Oct 31-Nov 2, 2006, sought to identify the state-of-the-science in this area and recommend priorities for research and changes in practice and implementation. For the short term, these include: (1) multidisciplinary teams to integrate deterministic and non-deterministic/statistical models; (2) broader use of sensitivity analyses, including for structural and global (rather than local) parameter changes; and (3) enhanced transparency and reproducibility through more complete documentation of the model structure(s) and parameter values, the results of sensitivity and other analyses, and supporting, discrepant, or excluded data. Longer-term needs include: (1) theoretical and practical methodological improvements…
Efficient Characterization of Parametric Uncertainty of Complex (Bio)chemical Networks.
Schillings, Claudia; Sunnåker, Mikael; Stelling, Jörg; Schwab, Christoph
2015-08-01
Parametric uncertainty is a particularly challenging and relevant aspect of systems analysis in domains such as systems biology where, both for inference and for assessing prediction uncertainties, it is essential to characterize the system behavior globally in the parameter space. However, current methods based on local approximations or on Monte-Carlo sampling cope only insufficiently with high-dimensional parameter spaces associated with complex network models. Here, we propose an alternative deterministic methodology that relies on sparse polynomial approximations. We propose a deterministic computational interpolation scheme which identifies the most significant expansion coefficients adaptively. We present its performance on kinetic model equations from computational systems biology with several hundred parameters and state variables, leading to numerical approximations of the parametric solution on the entire parameter space. The scheme is based on adaptive Smolyak interpolation of the parametric solution at judiciously and adaptively chosen points in parameter space. Like Monte-Carlo sampling, it is "non-intrusive" and well-suited for massively parallel implementation, but affords higher convergence rates. This opens up new avenues for large-scale dynamic network analysis by enabling scaling for many applications, including parameter estimation, uncertainty quantification, and systems design.
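The adaptive Smolyak scheme itself is too involved for a short example, but the underlying non-intrusive idea can be sketched with a simpler stand-in: fit a global polynomial surrogate to solver outputs on a two-parameter kinetic model, then evaluate it cheaply anywhere in parameter space.

```python
# Much-simplified sketch of the "non-intrusive" idea: a total-degree
# polynomial surrogate fitted by least squares stands in for the paper's
# adaptive Smolyak interpolation. The 2-parameter kinetic model and all
# ranges are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

def solver_output(k1, k2):
    # Tiny kinetic model: A -(k1)-> B -(k2)-> 0; report B at t = 5.
    rhs = lambda t, y: [-k1 * y[0], k1 * y[0] - k2 * y[1]]
    return solve_ivp(rhs, (0, 5), [1.0, 0.0], rtol=1e-8).y[1, -1]

# Sample the 2-D parameter box and run the "expensive" solver there.
rng = np.random.default_rng(5)
k = rng.uniform(0.1, 2.0, (200, 2))
y = np.array([solver_output(k1, k2) for k1, k2 in k])

# Total-degree-4 polynomial basis in (k1, k2), fitted by least squares.
deg = 4
terms = [(i, j) for i in range(deg + 1) for j in range(deg + 1 - i)]
Phi = np.column_stack([k[:, 0]**i * k[:, 1]**j for i, j in terms])
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# The surrogate is now essentially free to evaluate across parameter space.
test = np.array([[0.5, 1.5]])
pred = sum(c * test[:, 0]**i * test[:, 1]**j for c, (i, j) in zip(coef, terms))
print(pred[0], solver_output(0.5, 1.5))
```

The adaptive scheme in the paper improves on this in two ways: it chooses evaluation points hierarchically rather than at random, and it keeps only the significant expansion coefficients, which is what makes hundreds of parameters tractable.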
NASA Astrophysics Data System (ADS)
Jie, M.; Zhang, J.; Guo, B. B.
2017-12-01
As a typical distributed hydrological model, the SWAT model poses challenges in calibrating its parameters and analyzing their uncertainty. This paper takes the Chaohe River Basin, China, as the study area. After the SWAT model is set up and the DEM data of the basin are loaded, the watershed is automatically divided into several sub-basins. Land use, soil and slope are then analyzed at the sub-basin level to delineate the hydrological response units (HRUs) of the study area, and running the SWAT model yields simulated runoff for the watershed. On this basis, weather data and the known daily runoff of three hydrological stations are combined with the SWAT-CUP automatic calibration program and manual adjustment to perform a multi-site calibration of the model parameters. Furthermore, the GLUE algorithm is used to analyze the parameter uncertainty of the SWAT model. The sensitivity analysis, calibration and uncertainty study indicate that this parameterization of the hydrological characteristics of the Chaohe river is successful and feasible, and can be used to simulate the Chaohe river basin.
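A minimal GLUE sketch under common choices (Monte Carlo parameter sampling, Nash-Sutcliffe efficiency as the informal likelihood, a behavioural threshold, likelihood-weighted bounds); a one-bucket runoff model and synthetic observations stand in for SWAT.

```python
# Minimal GLUE sketch: Monte Carlo sampling, an informal Nash-Sutcliffe
# likelihood, a behavioural threshold, and likelihood-weighted bounds.
# A trivial linear-reservoir model replaces SWAT; data are synthetic.
import numpy as np

rng = np.random.default_rng(11)
rain = rng.gamma(0.6, 8.0, 365)

def simulate(k, frac):
    # One-bucket runoff model: store s, runoff = s / k, recharge = frac * rain.
    s, q = 0.0, np.empty(rain.size)
    for t, r in enumerate(rain):
        s += frac * r
        q[t] = s / k
        s -= q[t]
    return q

q_obs = simulate(12.0, 0.35) * (1 + rng.normal(0, 0.1, rain.size))  # synthetic "truth"

# 1) Sample parameters and score each run with NSE.
k_s, f_s = rng.uniform(2, 40, 5000), rng.uniform(0.05, 0.9, 5000)
sims = np.array([simulate(k, f) for k, f in zip(k_s, f_s)])
nse = 1 - ((sims - q_obs)**2).sum(1) / ((q_obs - q_obs.mean())**2).sum()

# 2) Keep behavioural runs and turn NSE into likelihood weights.
behavioural = nse > 0.6
w = nse[behavioural] / nse[behavioural].sum()

# 3) Likelihood-weighted 5-95% prediction bounds for each day.
order = sims[behavioural].argsort(axis=0)
sorted_sims = np.take_along_axis(sims[behavioural], order, axis=0)
cum_w = np.cumsum(w[order], axis=0)
cols = np.arange(rain.size)
lower = sorted_sims[np.argmax(cum_w >= 0.05, axis=0), cols]
upper = sorted_sims[np.argmax(cum_w >= 0.95, axis=0), cols]
print("mean bracketing rate:", ((q_obs >= lower) & (q_obs <= upper)).mean())
```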
NASA Astrophysics Data System (ADS)
Pathiraja, S. D.; van Leeuwen, P. J.
2017-12-01
Model Uncertainty Quantification remains one of the central challenges of effective Data Assimilation (DA) in complex, partially observed non-linear systems. Stochastic parameterization methods have been proposed in recent years as a means of capturing the uncertainty associated with unresolved sub-grid scale processes. Such approaches generally require some knowledge of the true sub-grid scale process or rely on full observations of the larger scale resolved process. We present a methodology for estimating the statistics of sub-grid scale processes using only partial observations of the resolved process. It finds model error realisations over a training period by minimizing their conditional variance, constrained by available observations. A distinctive feature is that these realisations are binned conditional on the previous model state during the minimization process, allowing complex error structures to be recovered. The efficacy of the approach is demonstrated through numerical experiments on the multi-scale Lorenz '96 model. We consider different parameterizations of the model with both small and large time scale separations between slow and fast variables. Results are compared to two existing methods for accounting for model uncertainty in DA and shown to provide improved analyses and forecasts.
Ter Horst, Mechteld M S; Koelmans, Albert A
2016-10-04
The assessment of chemical degradation rates from water-sediment experiments such as OECD 308 is challenging due to the parallel occurrence of processes such as degradation, sorption and diffusive transport, at different rates in water and sediment or at their interface. To systematically and quantitatively analyze this limitation, we generated artificial experimental data sets using model simulations and then used these data sets in an inverse modeling exercise to estimate degradation half-lives in water and sediment (DegT50,wat and DegT50,sed), which were then evaluated against their true values. Results were visualized by chemical space diagrams that identified those substance property combinations for which the OECD 308 test is fundamentally inappropriate. We show that the uncertainty in estimated degradation half-lives in water increases as the process of diffusion to the sediment becomes dominant over degradation in the water. We show that, in theory, the uncertainty in the estimated DegT50,sed is smaller than the uncertainty in DegT50,wat. The predictive value of our chemical space diagrams was validated using literature transformation rates and their uncertainties that were inferred from real water-sediment experiments.
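A sketch of the forward model and inverse step behind such an exercise, assuming first-order degradation in each compartment plus first-order water-sediment exchange; the rate constants, back-transfer factor, and noise level are illustrative.

```python
# Sketch of the inverse step for a water-sediment study: a two-compartment
# model with first-order degradation in each compartment and first-order
# exchange, fitted to synthetic data to recover DegT50,wat and DegT50,sed.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import curve_fit

def concentrations(t, kw, ks, kex):
    # y = [water, sediment]; kex moves substance into sediment, with an
    # assumed 0.2*kex back-transfer to water (illustrative choice).
    rhs = lambda _, y: [-(kw + kex) * y[0] + 0.2 * kex * y[1],
                        kex * y[0] - (ks + 0.2 * kex) * y[1]]
    sol = solve_ivp(rhs, (t.min(), t.max()), [100.0, 0.0], t_eval=t, rtol=1e-8)
    return np.concatenate(sol.y)  # stacked water series, then sediment series

t = np.linspace(0, 100, 15)
true = (0.04, 0.01, 0.06)  # kw, ks, kex per day (synthetic "truth")
rng = np.random.default_rng(8)
data = concentrations(t, *true) + rng.normal(0, 1.5, 2 * t.size)

popt, pcov = curve_fit(concentrations, t, data, p0=(0.02, 0.02, 0.02),
                       bounds=(1e-4, 1.0))
kw, ks, kex = popt
print("DegT50,wat ~", round(np.log(2) / kw, 1), "d,",
      "DegT50,sed ~", round(np.log(2) / ks, 1), "d")
print("relative SD:", np.round(np.sqrt(np.diag(pcov)) / popt, 2))
```

Comparing the fitted parameter spreads across many substance-property combinations is what populates the chemical space diagrams: where the relative standard deviations blow up, the test design cannot separate degradation from transport.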
ERIC Educational Resources Information Center
Goodnough, Karen
2008-01-01
This article reports on the experiences and perceptions of K-12 teachers as they engaged in a participatory action research (PAR) project, "Science Across the Curriculum." Although the experiences and professional learning of two of the project participants are highlighted, the challenges that all participants experienced as they conceptualized…
A major uncertainty in many aquatic risk assessments for toxic chemicals is the aggregate effect of the physicochemical characteristics of exposure media on toxicity, and how this affects extrapolation of laboratory test results to natural systems. A notable example of this is h...
The Way of Openness: Moral Sphere Theory, Education, Ethics, and Classroom Management
ERIC Educational Resources Information Center
Bullough, Robert V., Jr.
2014-01-01
Noting the challenges of radical pluralism and uncertainty to ethics and education, the author describes, then explores Moral Sphere Theory (MST) developed by the philosopher Robert Kane and in relationship to insights drawn from American pragmatism. The argument is that MST offers fresh ways for thinking about education and the profound…
Flourishing Creativity: Education in an Age of Wonder
ERIC Educational Resources Information Center
Tan, Oon Seng
2015-01-01
The twenty-first century is often described as an age of uncertainty and ambiguity with unprecedented challenges. Those with a creative mind-set however might call this millennium an age of wonder. New technologies and digital media are facilitating imagination and inventiveness. How are we innovating education? Are schools and classrooms fostering…
Integrating climate change considerations into forest management tools and training
Linda M. Nagel; Christopher W. Swanston; Maria K. Janowiak
2010-01-01
Silviculturists are currently facing the challenge of developing management strategies that meet broad ecological and social considerations in spite of a high degree of uncertainty in future climatic conditions. Forest managers need state-of-the-art knowledge about climate change and potential impacts to facilitate development of silvicultural objectives and...
ERIC Educational Resources Information Center
Butler, Rose
2015-01-01
This paper examines forms of boundary work undertaken by parents in a regional Australian city to negotiate social processes around the school market amidst rising economic insecurity. It outlines structural changes, which have increased economic inequality in Australia and impacted on educational reform, and the specific challenges faced by…
Variability Is Not the Villain: Finding Patterns in Complex Natural Images
ERIC Educational Resources Information Center
Brinton, Brigette Adair; Curran, Mary Carla
2015-01-01
Everyone needs strong observational skills to solve challenging problems and make informed decisions. However, many students expect to find exact answers to their questions by using the internet and do not understand the role of uncertainty, especially in decision making and scientific research. Humans and other animals choose among many options…
Wildlife Conservation Planning Using Stochastic Optimization and Importance Sampling
Robert G. Haight; Laurel E. Travis
1997-01-01
Formulations for determining conservation plans for sensitive wildlife species must account for economic costs of habitat protection and uncertainties about how wildlife populations will respond. This paper describes such a formulation and addresses the computational challenge of solving it. The problem is to determine the cost-efficient level of habitat protection...
National Wildlife Refuges: Portals to conservation
Joseph F. McCauley
2014-01-01
Scientific uncertainty regarding the potential effects of climate change on natural ecosystems will make it increasingly challenging for the National Wildlife Refuge System to fulfill its mission to conserve wildlife and fish habitat across the diverse ecosystems of the United States. This is especially true in the contiguous 48 states, where 70 percent of the land and...
Enhancing the Reflexivity of System Innovation Projects with System Analyses
ERIC Educational Resources Information Center
van Mierlo, Barbara; Arkesteijn, Marlen; Leeuwis, Cees
2010-01-01
Networks aiming for fundamental changes bring together a variety of actors who are part and parcel of a problematic context. These system innovation projects need to be accompanied by a monitoring and evaluation approach that supports and maintains reflexivity to be able to deal with uncertainties and conflicts while challenging current practices…
Matthew P. Thompson; Bruce G. Marcot; Frank R. Thompson; Steven McNulty; Larry A. Fisher; Michael C. Runge; David Cleaves; Monica Tomosy
2013-01-01
Sustainable management of national forests and grasslands within the National Forest System (NFS) often requires managers to make tough decisions under considerable uncertainty, complexity, and potential conflict. Resource decisionmakers must weigh a variety of risks, stressors, and challenges to sustainable management, including climate change, wildland fire, invasive...
Migrating Legacy Systems in the Global Merger & Acquisition Environment
ERIC Educational Resources Information Center
Katerattanakul, Pairin; Kam, Hwee-Joo; Lee, James J.; Hong, Soongoo
2009-01-01
The MetaFrame system migration project at WorldPharma, while driven by merger and acquisition, had faced complexities caused by both technical challenges and organizational issues in the climate of uncertainties. However, WorldPharma still insisted on instigating this post-merger system migration project. This project served to (1) consolidate the…
Harnessing Social Media Collaborative Intelligence to Champion Enterprise Innovation
ERIC Educational Resources Information Center
Juntiwasarakij, Suwan
2012-01-01
Innovation is a survival tool for the corporate world to play in the ever competitive free global market in the 21st century. The innovation process, especially at the front end, is the most challenging phase because of the inextricably intertwined fuzziness of high uncertainty and the deficiency of information available. Although the uncertainty…
ERIC Educational Resources Information Center
Hills, Thomas
2007-01-01
Constructivism in practice is a challenging endeavor that invites teachers and students to engage in problems that elicit uncertainty. This article investigates the relationship between preferences for constructivist approaches and other classroom behaviors that influence the development of future teachers. The theoretical premise for this…
Acceptance of read-across is an ongoing challenge and several efforts are underway to identify and address major uncertainties associated with read-across. Several approaches have been proposed but to date few case studies if any have evaluated how Tox21 approaches may be instruc...
To Have and to Hold: Reflections of an Interim Director
ERIC Educational Resources Information Center
Hibl, Lisa M.
2018-01-01
Leading a long-established LLC as an Interim Director poses particular challenges and rewards. Uncertainties abound for the program and for the individual. Professionally speaking, taking on the hybrid role of faculty/ administrator can be both difficult and exciting. Ultimately, the solutions are in the details. Listening carefully to students,…
Managing Risk and Uncertainty in Large-Scale University Research Projects
ERIC Educational Resources Information Center
Moore, Sharlissa; Shangraw, R. F., Jr.
2011-01-01
Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…
NASA Technical Reports Server (NTRS)
Morton, Douglas; Souza, Carlos, Jr.; Keller, Michael
2012-01-01
Large-scale tropical forest monitoring efforts in support of REDD+ (Reducing Emissions from Deforestation and forest Degradation plus enhancing forest carbon stocks) confront a range of challenges. REDD+ activities typically have short reporting time scales, diverse data needs, and low tolerance for uncertainties. Meeting these challenges will require innovative use of remote sensing data, including integrating data at different spatial and temporal resolutions. The global scientific community is engaged in developing, evaluating, and applying new methods for regional to global scale forest monitoring. Pilot REDD+ activities are underway across the tropics with support from a range of national and international groups, including SilvaCarbon, an interagency effort to coordinate US expertise on forest monitoring and resource management. Early actions on REDD+ have exposed some of the inherent tradeoffs that arise from the use of incomplete or inaccurate data to quantify forest area changes and related carbon emissions. Here, we summarize recent advances in forest monitoring to identify and target the main sources of uncertainty in estimates of forest area changes, aboveground carbon stocks, and Amazon forest carbon emissions.
NASA Astrophysics Data System (ADS)
Deng, Yongxin
2016-07-01
This paper examines complications in neighborhood mapping and corresponding challenges for the GIS community, taking both a conceptual and a methodological perspective. It focuses on the social and spatial dimensions of the neighborhood concept and highlights their relationship in neighborhood mapping. Following a brief summary of neighborhood definitions, five interwoven factors are identified as origins of neighborhood mapping difficulties: conceptual vagueness, uncertainty of various sources, GIS representation, scale, and neighborhood homogeneity or continuity. Existing neighborhood mapping methods are grouped into six categories to be assessed: perception based, physically based, inference based, preexisting, aggregated, and automated. Mapping practices in various neighborhood-related disciplines and applications are cited as examples to demonstrate how the methods work, as well as how they should be evaluated. A few mapping strategies for the improvement of neighborhood mapping are prescribed from a GIS perspective: documenting simplifications employed in the mapping procedure, addressing uncertainty sources, developing new data solutions, and integrating complementary mapping methods. Incorporation of high-resolution data and introduction of more GIS ideas and methods (such as fuzzy logic) are identified as future opportunities.
Optimization in Cardiovascular Modeling
NASA Astrophysics Data System (ADS)
Marsden, Alison L.
2014-01-01
Fluid mechanics plays a key role in the development, progression, and treatment of cardiovascular disease. Advances in imaging methods and patient-specific modeling now reveal increasingly detailed information about blood flow patterns in health and disease. Building on these tools, there is now an opportunity to couple blood flow simulation with optimization algorithms to improve the design of surgeries and devices, incorporating more information about the flow physics in the design process to augment current medical knowledge. In doing so, a major challenge is the need for efficient optimization tools that are appropriate for unsteady fluid mechanics problems, particularly for the optimization of complex patient-specific models in the presence of uncertainty. This article reviews the state of the art in optimization tools for virtual surgery, device design, and model parameter identification in cardiovascular flow and mechanobiology applications. In particular, it reviews trade-offs between traditional gradient-based methods and derivative-free approaches, as well as the need to incorporate uncertainties. Key future challenges are outlined, which extend to the incorporation of biological response and the customization of surgeries and devices for individual patients.
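To make the gradient-based versus derivative-free trade-off concrete: unsteady flow solvers typically return objectives contaminated by small non-smooth "noise" (mesh, time-stepping), which corrupts finite-difference gradients but is tolerated by simplex and pattern-search methods. The sketch below runs Nelder-Mead on a hypothetical noisy cost; the cost function and its parameters are invented for illustration and do not represent any particular cardiovascular solver.

```python
import numpy as np
from scipy.optimize import minimize

def simulated_energy_loss(shape_params):
    """Stand-in for an expensive unsteady flow simulation returning an
    energy-dissipation cost for a candidate graft geometry (hypothetical)."""
    x, y = shape_params
    smooth = (x - 1.2) ** 2 + 0.5 * (y + 0.4) ** 2     # underlying trend
    noise = 0.01 * np.sin(40 * x) * np.cos(35 * y)     # solver "noise"
    return smooth + noise

x0 = np.array([0.0, 0.0])
# A derivative-free simplex search tolerates the non-smooth solver noise
# that would corrupt finite-difference gradient estimates.
res = minimize(simulated_energy_loss, x0, method="Nelder-Mead",
               options={"xatol": 1e-3, "fatol": 1e-3, "maxiter": 500})
print("optimum:", np.round(res.x, 3), "cost:", round(res.fun, 4))
```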
NASA Astrophysics Data System (ADS)
Chen, Yaning; Li, Weihong; Fang, Gonghuan; Li, Zhi
2017-02-01
Meltwater from glacierized catchments is one of the most important water supplies in central Asia. Therefore, the effects of climate change on glaciers and snow cover will have increasingly significant consequences for runoff. Hydrological modeling has become an indispensable research approach to water resources management in large glacierized river basins, but there is a lack of focus in the modeling of glacial discharge. This paper reviews the status of hydrological modeling in glacierized catchments of central Asia, discussing the limitations of the available models and extrapolating these to future challenges and directions. After reviewing recent efforts, we conclude that the main sources of uncertainty in assessing the regional hydrological impacts of climate change are the unreliable and incomplete data sets and the lack of understanding of the hydrological regimes of glacierized catchments of central Asia. Runoff trends indicate a complex response to changes in climate. For future variation of water resources, it is essential to quantify the responses of hydrologic processes to both climate change and shrinking glaciers in glacierized catchments, and scientific focus should be on reducing uncertainties linked to these processes.
Cloud Feedbacks on Climate: A Challenging Scientific Problem
Norris, Joe
2017-12-22
One reason it has been difficult to develop suitable social and economic policies to address global climate change is that projected global warming during the coming century has a large uncertainty range. The primary physical cause of this large uncertainty range is lack of understanding of the magnitude and even sign of cloud feedbacks on the climate system. If Earth's cloudiness responded to global warming by reflecting more solar radiation back to space or allowing more terrestrial radiation to be emitted to space, this would mitigate the warming produced by increased anthropogenic greenhouse gases. Contrastingly, a cloud response that reduced solar reflection or terrestrial emission would exacerbate anthropogenic greenhouse warming. It is likely that a mixture of responses will occur depending on cloud type and meteorological regime, and at present, we do not know what the net effect will be. This presentation will explain why cloud feedbacks have been a challenging scientific problem from the perspective of theory, modeling, and observations. Recent research results on observed multidecadal cloud-atmosphere-ocean variability over the Pacific Ocean will also be shown, along with suggestions for future research.
NASA Astrophysics Data System (ADS)
Carmichael, G. R.; Saide, P. E.; Gao, M.; Streets, D. G.; Kim, J.; Woo, J. H.
2017-12-01
Ambient aerosols are important air pollutants with direct impacts on human health and on the Earth's weather and climate systems through their interactions with radiation and clouds. Their role is dependent on their distributions of size, number, phase and composition, which vary significantly in space and time. There remain large uncertainties in simulated aerosol distributions due to uncertainties in emission estimates and in chemical and physical processes associated with their formation and removal. These uncertainties lead to large uncertainties in weather and air quality predictions and in estimates of health and climate change impacts. Despite these uncertainties and challenges, regional-scale coupled chemistry-meteorological models such as WRF-Chem have significant capabilities in predicting aerosol distributions and explaining aerosol-weather interactions. We explore the hypothesis that new advances in on-line, coupled atmospheric chemistry/meteorological models, and new emission inversion and data assimilation techniques applicable to such coupled models, can be applied in innovative ways using current and evolving observation systems to improve predictions of aerosol distributions at regional scales. We investigate the impacts of assimilating AOD from geostationary satellite (GOCI) and surface PM2.5 measurements on predictions of AOD and PM in Korea during KORUS-AQ through a series of experiments. The results suggest assimilating datasets from multiple platforms can improve the predictions of aerosol temporal and spatial distributions.
NASA Astrophysics Data System (ADS)
Wiandt, T. J.
2008-06-01
The Hart Scientific Division of the Fluke Corporation operates two accredited standard platinum resistance thermometer (SPRT) calibration facilities, one at the Hart Scientific factory in Utah, USA, and the other at a service facility in Norwich, UK. The US facility is accredited through National Voluntary Laboratory Accreditation Program (NVLAP), and the UK facility is accredited through UKAS. Both provide SPRT calibrations using similar equipment and procedures, and at similar levels of uncertainty. These uncertainties are among the lowest available commercially. To achieve and maintain low uncertainties, it is required that the calibration procedures be thorough and optimized. However, to minimize customer downtime, it is also important that the instruments be calibrated in a timely manner and returned to the customer. Consequently, subjecting the instrument to repeated calibrations or extensive repeated measurements is not a viable approach. Additionally, these laboratories provide SPRT calibration services involving a wide variety of SPRT designs. These designs behave differently, yet predictably, when subjected to calibration measurements. To this end, an evaluation strategy involving both statistical process control and internal consistency measures is utilized to provide confidence in both the instrument calibration and the calibration process. This article describes the calibration facilities, procedure, uncertainty analysis, and internal quality assurance measures employed in the calibration of SPRTs. Data will be reviewed and generalities will be presented. Finally, challenges and considerations for future improvements will be discussed.
Merging information from multi-model flood projections in a hierarchical Bayesian framework
NASA Astrophysics Data System (ADS)
Le Vine, Nataliya
2016-04-01
Multi-model ensembles are becoming widely accepted for flood frequency change analysis. The use of multiple models results in large uncertainty around estimates of flood magnitudes, due to both uncertainty in model selection and natural variability of river flow. The challenge is therefore to extract the most meaningful signal from the multi-model predictions, accounting for both model quality and uncertainties in individual model estimates. The study demonstrates the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach facilitates explicit treatment of shared multi-model discrepancy as well as the probabilistic nature of the flood estimates, by treating the available models as a sample from a hypothetical complete (but unobserved) set of models. The advantages of the approach are: 1) to ensure adequate 'baseline' conditions with which to compare future changes; 2) to reduce flood estimate uncertainty; 3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; 4) to adjust multi-model consistency criteria when model biases are large; and 5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.
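A minimal version of such multi-model combination is the normal hierarchical ("random effects") model: each model m reports a flood-quantile estimate y_m with standard error s_m, the model-specific truths theta_m scatter around a consensus mu with a shared discrepancy variance tau^2, and a Gibbs sampler delivers the posterior. The sketch below is this textbook model, not the paper's full framework; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 100-year flood estimates (m^3/s) from 6 models,
# with their individual standard errors.
y = np.array([820., 955., 760., 1010., 880., 930.])
s = np.array([ 60.,  80.,  70.,  120.,  90.,  75.])
M = len(y)

n_iter, burn = 5000, 1000
mu, tau2 = y.mean(), y.var()
a0, b0 = 2.0, 1e4                 # weak inverse-gamma prior on tau^2
trace_mu = np.empty(n_iter)

for it in range(n_iter):
    # 1) model-specific "true" values shrink toward the consensus mu
    prec = 1 / s**2 + 1 / tau2
    theta = rng.normal((y / s**2 + mu / tau2) / prec, np.sqrt(1 / prec))
    # 2) consensus flood magnitude given the thetas (flat prior on mu)
    mu = rng.normal(theta.mean(), np.sqrt(tau2 / M))
    # 3) shared model-discrepancy variance (inverse-gamma full conditional)
    tau2 = 1 / rng.gamma(a0 + M / 2, 1 / (b0 + 0.5 * np.sum((theta - mu)**2)))
    trace_mu[it] = mu

post = trace_mu[burn:]
print(f"posterior flood estimate: {post.mean():.0f} "
      f"[{np.percentile(post, 5):.0f}, {np.percentile(post, 95):.0f}] m^3/s")
```

The shared tau^2 plays the role of the multi-model discrepancy discussed above: the larger it is, the less the ensemble members shrink toward each other.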
What motivates researchers in times of economic uncertainty.
NASA Technical Reports Server (NTRS)
Bucher, G. C.; Reece, J. E.
1972-01-01
Results of a study initiated late in 1970 to obtain both a measure of on-and-around-the-job factors which were 'motivating' to engineers and scientists, and to obtain an indication of how the relative importance of these factors changes as a result of the uncertain economic environment. A questionnaire, 'The Jackman Job Satisfaction Schedule,' was used to satisfy the needs of the study. It is concluded that managers can enhance the feeling of motivation by making individual job assignments interesting and challenging, by formulating significant milestones and end points into job content, and by assigning ample rewards with corresponding responsibility. In times of economic uncertainty increased emphasis should be given to security-related aspects of employment.
Risk communication: Uncertainties and the numbers game
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ortigara, M.
1995-08-30
The science of risk assessment seeks to characterize the potential risk in situations that may pose hazards to human health or the environment. However, the conclusions reached by the scientists and engineers are not an end in themselves - they are passed on to the involved companies, government agencies, legislators, and the public. All interested parties must then decide what to do with the information. Risk communication is a type of technical communication that involves some unique challenges. This paper first defines the relationships between risk assessment, risk management, and risk communication and then explores two issues in risk communication: addressing uncertainty and putting risk numbers into perspective.
Sanvido, Olivier; Romeis, Jörg; Bigler, Franz
2011-12-01
The ability to decide what kind of environmental changes observed during post-market environmental monitoring of genetically modified (GM) crops represent environmental harm is an essential part of most legal frameworks regulating the commercial release of GM crops into the environment. Among others, such decisions are necessary to initiate remedial measures or to sustain claims of redress linked to environmental liability. Given that consensus on criteria to evaluate 'environmental harm' has not yet been found, there are a number of challenges for risk managers when interpreting GM crop monitoring data for environmental decision-making. In the present paper, we argue that the challenges in decision-making have four main causes. The first three causes relate to scientific data collection and analysis, which have methodological limits. The fourth cause concerns scientific data evaluation, which is controversial among the different stakeholders involved in the debate on potential impacts of GM crops on the environment. This results in controversy over how the effects of GM crops should be valued and over what constitutes environmental harm. This controversy may influence decision-making about triggering corrective actions by regulators. We analyse all four challenges and propose potential strategies for addressing them. We conclude that environmental monitoring has its limits in reducing uncertainties remaining from the environmental risk assessment prior to market approval. We argue that remaining uncertainties related to adverse environmental effects of GM crops would probably be assessed in a more efficient and rigorous way during pre-market risk assessment. Risk managers should acknowledge the limits of environmental monitoring programmes as a tool for decision-making.
Adaptive Pathways: Possible Next Steps for Payers in Preparation for Their Potential Implementation
Vella Bonanno, Patricia; Ermisch, Michael; Godman, Brian; Martin, Antony P.; Van Den Bergh, Jesper; Bezmelnitsyna, Liudmila; Bucsics, Anna; Arickx, Francis; Bybau, Alexander; Bochenek, Tomasz; van de Casteele, Marc; Diogene, Eduardo; Eriksson, Irene; Fürst, Jurij; Gad, Mohamed; Greičiūtė-Kuprijanov, Ieva; van der Graaff, Martin; Gulbinovic, Jolanta; Jones, Jan; Joppi, Roberta; Kalaba, Marija; Laius, Ott; Langner, Irene; Mardare, Ileana; Markovic-Pekovic, Vanda; Magnusson, Einar; Melien, Oyvind; Meshkov, Dmitry O.; Petrova, Guenka I.; Selke, Gisbert; Sermet, Catherine; Simoens, Steven; Schuurman, Ad; Ramos, Ricardo; Rodrigues, Jorge; Zara, Corinne; Zebedin-Brandl, Eva; Haycox, Alan
2017-01-01
Medicines receiving a conditional marketing authorization through Medicines Adaptive Pathways to Patients (MAPPs) will be a challenge for payers. The “introduction” of MAPPs is already seen by the European Medicines Agency (EMA) as a fait accompli, with payers not consulted or involved. However, once medicines are approved through MAPPs, they will be evaluated for funding by payers through different activities. These include Health Technology Assessment (HTA) with often immature clinical data and high uncertainty, financial considerations, and negotiations through different types of agreements, which can require monitoring post launch. Payers have experience with new medicines approved through conditional approval, and the fact that MAPPs present additional challenges is a concern from their perspective. There may be some activities where payers can collaborate. The final decisions on whether to reimburse a new medicine via MAPPs will have more variation than for medicines licensed via conventional processes. This is due not only to the increased uncertainty associated with medicines authorized through MAPPs but also to differences in legal frameworks between member states. Moreover, if the financial and side-effect burden from the period of conditional approval until granting of full marketing authorization is shifted to the post-authorization phase, payers may have to bear such burdens. Collecting robust data during routine clinical use is challenging, as are the high prices of new medicines during the data-collection period. This paper presents the concept of MAPPs and possible challenges. Concerns and potential ways forward are discussed and a number of recommendations are presented from the perspective of payers. PMID:28878667
Lessons and challenges from adaptation pathways planning applications
NASA Astrophysics Data System (ADS)
Haasnoot, M.; Lawrence, J.; Kwakkel, J. H.; Walker, W.; Timmermans, J.; Bloemen, P.; Thissen, W.
2015-12-01
Planning for adaptation to dynamic risks (e.g., because of climate change) is a critical need. The concept of 'adaptive policies' is receiving increasing attention as a way of performing strategic planning that is able to address many of the inherent challenges of uncertainty and dynamic change. Several approaches for developing adaptive policies are available in the literature. One approach, for which several applications already exist, is Dynamic Adaptive Policy Pathways (DAPP). Pathway maps enable policy analysts, decision makers, and stakeholders to recognize potential 'locked-in' situations and to assess the flexibility, robustness, and efficacy of decision alternatives. Most of the applications of DAPP have been in deltas, coastal cities, or floodplains, often within the context of climate change adaptation. In this talk, we describe the DAPP approach and present a framework for designing signposts as adaptation signals, together with an illustrative application for the Rhine River in the Netherlands. We also draw lessons and challenges from pathways applications that differ in environment, culture, and institutional context. For example, the Dutch Delta Programme has used pathways to identify short-term decisions and long-term policy options. In Bangladesh, an application is in its early phase. Steps before generating pathways - such as long-term thinking in multiple possible futures and acknowledging uncertainties - are already a big challenge there. In New Zealand, the 'Sustainable Delta Game' has been used as the catalyst for pathways thinking by two local councils. This has led to its application in decision making for coastal and flood risk management and economic analysis of policy options.
NASA Astrophysics Data System (ADS)
Wolf, A.; Gaitan, C. F.; Thomas, T.; Watts, D.; Bollinger, J.
2017-12-01
Food and agriculture is the largest global industry, at $7.8Tn annual value, and is also the least digitized industry. As a consequence, the inefficiencies in this industry are staggering: yield gaps below potential are 20-70% worldwide, and of the crops that are produced, 20-50% are lost from the time of harvest up to consumption. Where some frame the challenges in agriculture as "grow more with less," a more useful analysis is around risk and uncertainty. In emerging markets, lack of geospatial data makes it difficult to recommend improved seeds or fertilizers for particular locales, therefore risky to make operating loans, impossible to accurately price crop insurance, and ultimately poses challenges in making contracts for delivery to processors that bring ag products into the food system. In developed markets, the ever increasing demands around immediacy, transparency, quality, crop novelty and food safety are straining the capacity of growers and processors to keep up. We have come to see this as a challenge in developing predictions joining both buyers and sellers around a shared set of facts on harvest timing, total yield, and post harvest quality. While these challenges have been met historically from government agencies and marketing boards reporting seasonal and regional forecasts, in many instances these are insufficient for making critical operational decisions on short timescales. In this talk, we will present a new set of measurements and analytical tools that enable unprecedented granularity in predictions to reduce risk and uncertainty in the food and ag supply chain, with special attention to applications that have potential to be economically self-sustaining.
Predicting uncertainty in future marine ice sheet volume using Bayesian statistical methods
NASA Astrophysics Data System (ADS)
Davis, A. D.
2015-12-01
The marine ice instability can trigger rapid retreat of marine ice streams. Recent observations suggest that marine ice systems in West Antarctica have begun retreating. However, unknown ice dynamics, computationally intensive mathematical models, and uncertain parameters in these models make predicting retreat rate and ice volume difficult. In this work, we fuse current observational data with ice stream/shelf models to develop probabilistic predictions of future grounded ice sheet volume. Given observational data (e.g., thickness, surface elevation, and velocity) and a forward model that relates uncertain parameters (e.g., basal friction and basal topography) to these observations, we use a Bayesian framework to define a posterior distribution over the parameters. A stochastic predictive model then propagates uncertainties in these parameters to uncertainty in a particular quantity of interest (QoI)---here, the volume of grounded ice at a specified future time. While the Bayesian approach can in principle characterize the posterior predictive distribution of the QoI, the computational cost of both the forward and predictive models makes this effort prohibitively expensive. To tackle this challenge, we introduce a new Markov chain Monte Carlo method that constructs convergent approximations of the QoI target density in an online fashion, yielding accurate characterizations of future ice sheet volume at significantly reduced computational cost. Our second goal is to attribute uncertainty in these Bayesian predictions to uncertainties in particular parameters. Doing so can help target data collection, for the purpose of constraining the parameters that contribute most strongly to uncertainty in the future volume of grounded ice. For instance, smaller uncertainties in parameters to which the QoI is highly sensitive may account for more variability in the prediction than larger uncertainties in parameters to which the QoI is less sensitive. We use global sensitivity analysis to help answer this question, and make the computation of sensitivity indices computationally tractable using a combination of polynomial chaos and Monte Carlo techniques.
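The Bayesian core of such a study can be illustrated in a few lines: a random-walk Metropolis sampler characterizes the posterior of an uncertain parameter given noisy observations, and the chain is then pushed through a predictive functional to get a distribution over the quantity of interest. Everything below (the exponential "forward model", the friction parameter, the volume functional) is a made-up stand-in for the expensive ice-stream model, chosen only so the sketch runs instantly.

```python
import numpy as np

rng = np.random.default_rng(3)

def forward(beta):
    """Toy stand-in for the ice-stream model: maps a basal-friction
    parameter to observable surface velocities (hypothetical)."""
    x = np.linspace(0, 1, 20)
    return 100 * np.exp(-beta * x)

beta_true = 2.0
obs = forward(beta_true) + rng.normal(0, 2.0, 20)   # noisy "observations"

def log_post(beta):
    if not 0 < beta < 10:                 # uniform prior bounds
        return -np.inf
    resid = obs - forward(beta)
    return -0.5 * np.sum(resid**2) / 2.0**2

# Random-walk Metropolis over the friction parameter
n = 20000
beta = 1.0
lp = log_post(beta)
chain = np.empty(n)
for i in range(n):
    prop = beta + rng.normal(0, 0.1)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        beta, lp = prop, lp_prop
    chain[i] = beta

# Propagate posterior parameter samples to a quantity of interest:
# here a made-up grounded-ice "volume" functional of beta.
qoi = 1000 / (1 + chain[5000:])
print(f"QoI mean {qoi.mean():.1f}, 90% interval "
      f"[{np.percentile(qoi, 5):.1f}, {np.percentile(qoi, 95):.1f}]")
```

The online density approximations described in the abstract exist precisely to avoid running chains like this against the full forward model at every step.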
NASA Astrophysics Data System (ADS)
Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C. J.
2016-11-01
Despite their well-known limitations, Reynolds-Averaged Navier-Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes.
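The paper's assimilation step is an iterative ensemble Kalman method; the sketch below shows a single, non-iterative stochastic EnKF analysis step to convey the mechanics (anomaly covariances, Kalman gain, perturbed observations). Dimensions and noise levels are arbitrary toy values, not the RANS setting.

```python
import numpy as np

rng = np.random.default_rng(4)

def enkf_update(X, H, y, r):
    """One stochastic EnKF analysis step.
    X: (n_state, n_ens) prior ensemble; H: (n_obs, n_state) observation
    operator; y: (n_obs,) observations; r: observation error variance."""
    n_obs, n_ens = H.shape[0], X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)        # ensemble anomalies
    HA = H @ A
    P_yy = HA @ HA.T / (n_ens - 1) + r * np.eye(n_obs)
    P_xy = A @ HA.T / (n_ens - 1)
    K = P_xy @ np.linalg.inv(P_yy)               # Kalman gain
    Y = y[:, None] + rng.normal(0, np.sqrt(r), (n_obs, n_ens))  # perturbed obs
    return X + K @ (Y - H @ X)

# Tiny demo: 5-dimensional state, 3 observations, 50 members
n_state, n_obs, n_ens = 5, 3, 50
truth = rng.normal(0, 1, n_state)
H = rng.normal(0, 1, (n_obs, n_state))
y = H @ truth + rng.normal(0, 0.1, n_obs)
X_prior = truth[:, None] + rng.normal(0, 1, (n_state, n_ens))
X_post = enkf_update(X_prior, H, y, 0.1**2)
print("prior RMSE    :", np.sqrt(np.mean((X_prior.mean(1) - truth)**2)))
print("posterior RMSE:", np.sqrt(np.mean((X_post.mean(1) - truth)**2)))
```

In the RANS application the state would hold the parameterized Reynolds-stress discrepancies and the observation operator the (nonlinear) flow solver, with the update applied iteratively.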
NASA Astrophysics Data System (ADS)
Verardo, E.; Atteia, O.; Rouvreau, L.
2015-12-01
In-situ bioremediation is a commonly used remediation technology to clean up the subsurface of petroleum-contaminated sites. Forecasting remedial performance (in terms of flux and mass reduction) is a challenge due to uncertainties associated with source properties and with the contribution and efficiency of concentration-reducing mechanisms. In this study, predictive uncertainty analysis of bio-remediation system efficiency is carried out with the null-space Monte Carlo (NSMC) method, which combines the calibration solution-space parameters with the ensemble of null-space parameters, creating sets of calibration-constrained parameters for input to follow-on analysis of remedial efficiency. The first step in the NSMC methodology for uncertainty analysis is model calibration. The model calibration was conducted by matching simulated BTEX concentrations to a total of 48 observations from historical data before implementation of treatment. Two different bio-remediation designs were then implemented in the calibrated model. The first consists of pumping/injection wells and the second of a permeable barrier coupled with infiltration across slotted piping. The NSMC method was used to calculate 1000 calibration-constrained parameter sets for the two different models. Several variants of the method were implemented to investigate their effect on the efficiency of the NSMC method. The first variant of the NSMC implementation is based on a single calibrated model. In the second variant, models were calibrated from different initial parameter sets, and NSMC calibration-constrained parameter sets were sampled from these different calibrated models. We demonstrate that, in the context of a nonlinear model, the second variant avoids underestimating parameter uncertainty, which may otherwise lead to a poor quantification of predictive uncertainty. Application of the proposed approach to manage bioremediation of groundwater at a real site shows that it effectively supports management of in-situ bioremediation systems. Moreover, this study demonstrates that the NSMC method provides a computationally efficient and practical methodology for applying model predictive uncertainty methods in environmental management.
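In linearised form, the null-space projection at the heart of NSMC is a few lines of linear algebra: take the SVD of the observation sensitivity (Jacobian) matrix at the calibrated parameter set, keep the right singular vectors with near-zero singular values as the null-space basis, and add only the null-space component of random prior draws to the calibrated parameters, so that simulated observations are, to first order, unchanged. The sketch below demonstrates this on a random toy Jacobian; real NSMC follows the projection with a cheap re-calibration polish, omitted here.

```python
import numpy as np

rng = np.random.default_rng(5)

def nsmc_parameter_sets(jacobian, p_cal, p_sampler, n_sets=1000, sv_cutoff=1e-6):
    """Null-space Monte Carlo sketch: project random prior draws onto the
    null space of the (linearised) observation sensitivity matrix, so each
    generated set leaves the simulated observations ~unchanged."""
    U, svals, Vt = np.linalg.svd(jacobian)
    n_sol = np.sum(svals > sv_cutoff * svals[0])   # solution-space dimension
    V_null = Vt[n_sol:].T                          # null-space basis vectors
    sets = []
    for _ in range(n_sets):
        dp = p_sampler() - p_cal                   # stochastic deviation
        sets.append(p_cal + V_null @ (V_null.T @ dp))   # keep null-space part
    return np.array(sets)

# Demo: 10 parameters, only 4 informative observation combinations
n_par, n_obs = 10, 4
J = rng.normal(0, 1, (n_obs, n_par))
p_cal = rng.normal(0, 1, n_par)
sets = nsmc_parameter_sets(J, p_cal, lambda: rng.normal(0, 1, n_par))
# All sets should reproduce the calibrated observations to first order:
print("max |J (p - p_cal)| over sets:",
      np.abs(J @ (sets - p_cal).T).max())
```

The "second variant" described above amounts to repeating this construction around several calibrated points rather than one, which widens the sampled parameter space when the model is nonlinear.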
How uncertain are climate model projections of water availability indicators across the Middle East?
Hemming, Debbie; Buontempo, Carlo; Burke, Eleanor; Collins, Mat; Kaye, Neil
2010-11-28
The projection of robust regional climate changes over the next 50 years presents a considerable challenge for the current generation of climate models. Water cycle changes are particularly difficult to model in this area because major uncertainties exist in the representation of processes such as large-scale and convective rainfall and their feedback with surface conditions. We present climate model projections and uncertainties in water availability indicators (precipitation, run-off and drought index) for the 1961-1990 and 2021-2050 periods. Ensembles from two global climate models (GCMs) and one regional climate model (RCM) are used to examine different elements of uncertainty. Although all three ensembles capture the general distribution of observed annual precipitation across the Middle East, the RCM is consistently wetter than observations, especially over the mountainous areas. All future projections show decreasing precipitation (ensemble median between -5 and -25%) in coastal Turkey and parts of Lebanon, Syria and Israel and consistent run-off and drought index changes. The Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) GCM ensemble exhibits drying across the north of the region, whereas the Met Office Hadley Centre Quantifying Uncertainties in Model Projections-Atmospheric (QUMP-A) GCM and RCM ensembles show slight drying in the north and significant wetting in the south. RCM projections also show greater sensitivity (both wetter and drier) and a wider uncertainty range than QUMP-A. The nature of these uncertainties suggests that both large-scale circulation patterns, which influence region-wide drying/wetting patterns, and regional-scale processes, which affect localized water availability, are important sources of uncertainty in these projections. To reduce large uncertainties in water availability projections, it is suggested that efforts would be well placed to focus on the understanding and modelling of both large-scale processes and their teleconnections with Middle East climate and localized processes involved in orographic precipitation.
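The headline numbers in such studies (an ensemble-median change with an uncertainty range) reduce to a simple computation once the ensemble is in hand. The sketch below computes a per-member relative change in 30-year mean precipitation and its 10-90% ensemble range; the synthetic numbers are invented and do not reproduce the paper's ensembles.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical annual precipitation (mm) for one grid cell:
# 25 ensemble members x 30 years for each period.
base = rng.normal(480, 60, (25, 30))      # stands in for 1961-1990
future = rng.normal(430, 70, (25, 30))    # stands in for 2021-2050

# Per-member relative change in the 30-year mean, then ensemble spread
change = 100 * (future.mean(axis=1) - base.mean(axis=1)) / base.mean(axis=1)
lo, med, hi = np.percentile(change, [10, 50, 90])
print(f"precipitation change: median {med:.1f}% "
      f"(10-90% ensemble range {lo:.1f}% to {hi:.1f}%)")
```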
NASA Astrophysics Data System (ADS)
Anderson, C. J.; Wildhaber, M. L.; Wikle, C. K.; Moran, E. H.; Franz, K. J.; Dey, R.
2012-12-01
Climate change operates over a broad range of spatial and temporal scales. Understanding the effects of change on ecosystems requires accounting for the propagation of information and uncertainty across these scales. For example, to understand potential climate change effects on fish populations in riverine ecosystems, climate conditions predicted by coarse-resolution atmosphere-ocean global climate models must first be translated to the regional climate scale. In turn, this regional information is used to force watershed models, which are used to force river condition models, which impact the population response. A critical challenge in such a multiscale modeling environment is to quantify sources of uncertainty given the highly nonlinear nature of interactions between climate variables and the individual organism. We use a hierarchical modeling approach for accommodating uncertainty in multiscale ecological impact studies. This framework allows for uncertainty due to system models, model parameter settings, and stochastic parameterizations. This approach is a hybrid between physical (deterministic) downscaling and statistical downscaling, recognizing that there is uncertainty in both. We use NARCCAP data to determine confidence in the capability of climate models to simulate relevant processes and to quantify regional climate variability within the context of the hierarchical model of uncertainty quantification. By confidence, we mean the ability of the regional climate model to replicate observed mechanisms. We use the NCEP-driven simulations for this analysis. This provides a base from which regional change can be categorized as either a modification of previously observed mechanisms or the emergence of new processes. The management implications for these categories of change are significantly different, in that procedures to address impacts from existing processes may already be known and need only adjustment, whereas an emergent process may require new management strategies. The results from the hierarchical analysis of uncertainty are used to study the relative change in weights of the endangered Missouri River pallid sturgeon (Scaphirhynchus albus) under a 21st century climate scenario.
A probabilistic seismic model for the European Arctic
NASA Astrophysics Data System (ADS)
Hauser, Juerg; Dyer, Kathleen M.; Pasyanos, Michael E.; Bungum, Hilmar; Faleide, Jan I.; Clark, Stephen A.; Schweitzer, Johannes
2011-01-01
The development of three-dimensional seismic models for the crust and upper mantle has traditionally focused on finding one model that provides the best fit to the data while observing some regularization constraints. In contrast to this, the inversion employed here fits the data in a probabilistic sense and thus provides a quantitative measure of model uncertainty. Our probabilistic model is based on two sources of information: (1) prior information, which is independent from the data, and (2) different geophysical data sets, including thickness constraints, velocity profiles, gravity data, surface wave group velocities, and regional body wave traveltimes. We use a Markov chain Monte Carlo (MCMC) algorithm to sample models from the prior distribution, the set of plausible models, and test them against the data to generate the posterior distribution, the ensemble of models that fit the data with assigned uncertainties. While being computationally more expensive, such a probabilistic inversion provides a more complete picture of solution space and allows us to combine various data sets. The complex geology of the European Arctic, encompassing oceanic crust, continental shelf regions, rift basins and old cratonic crust, as well as the nonuniform coverage of the region by data with varying degrees of uncertainty, makes it a challenging setting for any imaging technique and, therefore, an ideal environment for demonstrating the practical advantages of a probabilistic approach. Maps of depth to basement and depth to Moho derived from the posterior distribution are in good agreement with previously published maps and interpretations of the regional tectonic setting. The predicted uncertainties, which are as important as the absolute values, correlate well with the variations in data coverage and quality in the region. A practical advantage of our probabilistic model is that it can provide estimates for the uncertainties of observables due to model uncertainties. We will demonstrate how this can be used for the formulation of earthquake location algorithms that take model uncertainties into account when estimating location uncertainties.
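The closing point (that a posterior model ensemble yields uncertainties on predicted observables, usable in earthquake location) can be sketched directly: push every ensemble member through the forward operator and read off the spread. Below, a one-layer-crust head-wave traveltime plays the role of the forward operator; the Moho-depth and velocity distributions are invented stand-ins for the real posterior samples.

```python
import numpy as np

rng = np.random.default_rng(7)

# Posterior-ensemble sketch: 2000 crustal models, each a (depth-to-Moho,
# mean crustal Vp) pair as sampled by the MCMC (values illustrative).
moho = rng.normal(38.0, 2.5, 2000)         # km
vp = rng.normal(6.4, 0.15, 2000)           # km/s

def pn_traveltime(dist_km, moho_km, vp_crust, vp_mantle=8.0):
    """Crude head-wave (Pn) traveltime through a one-layer crust."""
    theta = np.arcsin(vp_crust / vp_mantle)            # critical angle
    t_crust = 2 * moho_km / (vp_crust * np.cos(theta)) # legs through crust
    x_crust = 2 * moho_km * np.tan(theta)              # horizontal offset
    return t_crust + (dist_km - x_crust) / vp_mantle

# Propagate the model ensemble to the predicted observable
tt = pn_traveltime(400.0, moho, vp)
print(f"Pn at 400 km: {tt.mean():.2f} s +/- {tt.std():.2f} s "
      "(model-driven spread usable in location uncertainty budgets)")
```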
Is carbon farming an effective climate mitigation option?
NASA Astrophysics Data System (ADS)
Zelikova, T. J.; Funk, J.; Deich, N.; Amador, G.; Jacobson, R.
2017-12-01
"Carbon farming" refers to agricultural and land management practices that store carbon in soils and biomass. Carbon-farming techniques can include crop rotation, cover crops, no-till practices, and the application of compost to build up soil organic matter. Carbon farming also improves agricultural production and sustainability, while mitigating climate change. Despite well-documented benefits of carbon farming, these practices continue to be underutilized outside of experimental settings. One barrier to the widespread use of carbon farming is the challenge of fitting these practices into ongoing commercial operations, while managing the consequent market uncertainties across the value chain. To help address this barrier, we are working with landowners and local groups to establish demonstration "test beds" that can build experience among land managers and help resolve market uncertainties. We specifically focus on demonstrating the commercial viability of management practices that can enhance soil health, catalyzing economic and environmental synergies that come from healthy soils. Each test bed has a commercial agricultural operation at its center, and we bring together researchers, local groups, corporate partners, and key policymakers who can support wider adoption of these agricultural techniques. Early challenges have included finding commercial farms willing to shift their practices and face uncertain outcomes. A transition to new practices usually involves changes in equipment, scheduling, activities, and monitoring that have implications for the entire farm operation, its resources, and its bottom line. At the same time, practitioners have difficulty quantifying the carbon benefits they provide, due to persistent uncertainties, even with the benefit of decades of experimental research. We are building a network of farmers who are implementing carbon farming practices and addressing these challenges, step by step. We envision our test beds becoming hubs that support a community of practitioners who can show the value of this work in real-world commercial operations, supported by rigorous science. By bringing together the necessary elements of each test bed, we aim to facilitate widespread integration of carbon storage activities into the agricultural sector.
Towards a Multi-Resolution Model of Seismic Risk in Central Asia. Challenge and perspectives
NASA Astrophysics Data System (ADS)
Pittore, M.; Wieland, M.; Bindi, D.; Parolai, S.
2011-12-01
Assessing seismic risk, defined as the probability of occurrence of economic and social losses as a consequence of an earthquake, both at regional and at local scale is a challenging, multi-disciplinary task. In order to provide a reliable estimate, diverse information must be gathered by seismologists, geologists, engineers and civil authorities, and carefully integrated, taking into account the different levels of uncertainty. The research towards an integrated methodology, able to seamlessly describe seismic risk at different spatial scales, is challenging, but discloses new application perspectives, particularly in those countries which face significant seismic hazard but do not have resources for a standard assessment. Central Asian countries in particular, which exhibit some of the highest seismic hazard in the world, are experiencing steady demographic growth, often accompanied by informal settlement and urban sprawl. A reliable evaluation of how these factors affect the seismic risk, together with a realistic assessment of the assets exposed to seismic hazard and their structural vulnerability, is of particular importance in order to undertake proper mitigation actions and to promptly and efficiently react to a catastrophic event. New strategies are needed to efficiently cope with systematic lack of information and uncertainties. An original approach is presented to assess seismic risk based on the integration of information coming from remote-sensing and ground-based panoramic imaging, in situ measurements, expert knowledge and already available data. Efficient sampling strategies based on freely available medium-resolution multi-spectral satellite images are adopted to optimize data collection and validation, in a multi-scale approach. Panoramic imaging is also considered a valuable ground-based visual data collection technique, suitable both for manual and automatic analysis. A fully probabilistic framework based on Bayes networks is proposed to integrate the available information, taking into account both aleatory and epistemic uncertainties. An improved risk model for the capital of the Kyrgyz Republic, Bishkek, has been developed following this approach and tested on different earthquake scenarios. Preliminary results will be presented and discussed.
NASA Astrophysics Data System (ADS)
Clark, Martyn P.; Bierkens, Marc F. P.; Samaniego, Luis; Woods, Ross A.; Uijlenhoet, Remko; Bennett, Katrina E.; Pauwels, Valentijn R. N.; Cai, Xitian; Wood, Andrew W.; Peters-Lidard, Christa D.
2017-07-01
The diversity in hydrologic models has historically led to great controversy on the correct approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this paper, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.
NASA Astrophysics Data System (ADS)
Clark, M. P.; Nijssen, B.; Wood, A.; Mizukami, N.; Newman, A. J.
2017-12-01
The diversity in hydrologic models has historically led to great controversy on the "correct" approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this paper, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.
An adaptive decision framework for the conservation of a threatened plant
Moore, Clinton T.; Fonnesbeck, Christopher J.; Shea, Katriona; Lah, Kristopher J.; McKenzie, Paul M.; Ball, Lianne C.; Runge, Michael C.; Alexander, Helen M.
2011-01-01
Mead's milkweed Asclepias meadii, a long-lived perennial herb of tallgrass prairie and glade communities of the central United States, is a species designated as threatened under the U.S. Endangered Species Act. Challenges to its successful management include the facts that much about its life history is unknown, its age at reproductive maturity is very advanced, certain life stages are practically unobservable, its productivity is responsive to unpredictable environmental events, and most of the known populations occur on private lands unprotected by any legal conservation instrument. One critical source of biological uncertainty is the degree to which fire promotes growth and reproductive response in the plant. To aid in its management, we developed a prototype population-level state-dependent decision-making framework that explicitly accounts for this uncertainty and for uncertainties related to stochastic environmental effects and vital rates. To parameterize the decision model, we used estimates found in the literature, and we analyzed data from a long-term monitoring program where fates of individual plants were observed through time. We demonstrate that different optimal courses of action are followed according to how one believes that fire influences reproductive response, and we show that the action taken for certain population states is informative for resolving uncertainty about competing beliefs regarding the effect of fire. We advocate the use of a model-predictive approach for the management of rare populations, particularly when management uncertainty is profound. Over time, an adaptive management approach should reduce uncertainty and improve management performance as predictions of management outcome generated under competing models are continually informed and updated by monitoring data.
NASA Astrophysics Data System (ADS)
Millar, R.; Ingram, W.; Allen, M. R.; Lowe, J.
2013-12-01
Temperature and precipitation patterns are the climate variables with the greatest impacts on both natural and human systems. Due to the small spatial scales and the many interactions involved in the global hydrological cycle, general circulation model (GCM) representations of precipitation changes are subject to considerable uncertainty. Quantifying and understanding the causes of uncertainty (and identifying robust features of predictions) in both global and local precipitation change is an essential challenge of climate science. We have used the huge distributed computing capacity of the climateprediction.net citizen science project to examine parametric uncertainty in an ensemble of 20,000 perturbed-physics versions of the HadCM3 general circulation model. The ensemble has been selected to have a control climate in top-of-atmosphere energy balance [Yamazaki et al. 2013, J.G.R.]. We force this ensemble with several idealised climate-forcing scenarios, including carbon dioxide step and transient profiles, solar radiation management geoengineering experiments with stratospheric aerosols, and short-lived climate forcing agents. We will present the results from several of these forcing scenarios under GCM parametric uncertainty. We examine the global mean precipitation energy budget to understand the robustness of a simple non-linear global precipitation model [Good et al. 2012, Clim. Dyn.] as a better explanation of precipitation changes in transient climate projections under GCM parametric uncertainty than a simple linear tropospheric energy balance model. We will also present work investigating robust conclusions about precipitation changes in a balanced ensemble of idealised solar radiation management scenarios [Kravitz et al. 2011, Atmos. Sci. Let.].
NASA Astrophysics Data System (ADS)
Almeida, Susana; Holcombe, Elizabeth Ann; Pianosi, Francesca; Wagener, Thorsten
2017-02-01
Landslides have large negative economic and societal impacts, including loss of life and damage to infrastructure. Slope stability assessment is a vital tool for landslide risk management, but high levels of uncertainty often challenge its usefulness. Uncertainties are associated with the numerical model used to assess slope stability and its parameters, with the data characterizing the geometric, geotechnical and hydrologic properties of the slope, and with hazard triggers (e.g. rainfall). Uncertainties associated with many of these factors are also likely to be exacerbated further by future climatic and socio-economic changes, such as increased urbanization and resultant land use change. In this study, we illustrate how numerical models can be used to explore the uncertain factors that influence potential future landslide hazard using a bottom-up strategy. Specifically, we link the Combined Hydrology And Stability Model (CHASM) with sensitivity analysis and Classification And Regression Trees (CART) to identify critical thresholds in slope properties and climatic (rainfall) drivers that lead to slope failure. We apply our approach to a slope in the Caribbean, an area that is naturally susceptible to landslides due to a combination of high rainfall rates, steep slopes, and highly weathered residual soils. For this particular slope, we find that uncertainties regarding some slope properties (namely thickness and effective cohesion of topsoil) are as important as the uncertainties related to future rainfall conditions. Furthermore, we show that 89% of the expected behaviour of the studied slope can be characterized based on only two variables - the ratio of topsoil thickness to cohesion and the ratio of rainfall intensity to duration.
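The CART step (finding critical thresholds in the two ratio variables identified above) maps directly onto a shallow decision tree. The sketch below trains a depth-2 tree on synthetic slope/rainfall samples labelled by an invented stability rule; the rule, variables, and thresholds are illustrative stand-ins for CHASM output, not the paper's fitted values.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(8)
n = 5000

# Hypothetical CHASM-style inputs: sampled slope properties and rainfall
soil_depth = rng.uniform(0.5, 5.0, n)          # topsoil thickness (m)
cohesion = rng.uniform(2.0, 20.0, n)           # effective cohesion (kPa)
intensity = rng.uniform(5.0, 100.0, n)         # rainfall intensity (mm/h)
duration = rng.uniform(1.0, 48.0, n)           # rainfall duration (h)

# Stand-in stability rule producing failure labels (illustrative only):
# failure when thick, weak topsoil meets intense, short-duration bursts.
fail = ((soil_depth / cohesion > 0.35) &
        (intensity / duration > 2.0)).astype(int)

# Shallow CART on the two ratio variables recovers the thresholds
X = np.column_stack([soil_depth / cohesion, intensity / duration])
tree = DecisionTreeClassifier(max_depth=2).fit(X, fail)
print(export_text(tree, feature_names=["depth/cohesion", "intensity/duration"]))
```

In practice the labels come from many CHASM runs rather than a closed-form rule, and the recovered split points are the "critical thresholds" reported in the study.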
Incorporating climate change into ecosystem service assessments and decisions: a review.
Runting, Rebecca K; Bryan, Brett A; Dee, Laura E; Maseyk, Fleur J F; Mandle, Lisa; Hamel, Perrine; Wilson, Kerrie A; Yetka, Kathleen; Possingham, Hugh P; Rhodes, Jonathan R
2017-01-01
Climate change is having a significant impact on ecosystem services and is likely to become increasingly important as this phenomenon intensifies. Future impacts can be difficult to assess as they often involve long timescales, dynamic systems with high uncertainties, and are typically confounded by other drivers of change. Despite a growing literature on climate change impacts on ecosystem services, no quantitative syntheses exist. Hence, we lack an overarching understanding of the impacts of climate change, how they are being assessed, and the extent to which other drivers, uncertainties, and decision making are incorporated. To address this, we systematically reviewed the peer-reviewed literature that assesses climate change impacts on ecosystem services at subglobal scales. We found that the impact of climate change on most types of services was predominantly negative (59% negative, 24% mixed, 4% neutral, 13% positive), but varied across services, drivers, and assessment methods. Although uncertainty was usually incorporated, there were substantial gaps in the sources of uncertainty included, along with the methods used to incorporate them. We found that relatively few studies integrated decision making, and even fewer studies aimed to identify solutions that were robust to uncertainty. For management or policy to ensure the delivery of ecosystem services, integrated approaches that incorporate multiple drivers of change and account for multiple sources of uncertainty are needed. This is undoubtedly a challenging task, but ignoring these complexities can result in misleading assessments of the impacts of climate change, suboptimal management outcomes, and the inefficient allocation of resources for climate adaptation. © 2016 John Wiley & Sons Ltd.
Howell, J.E.; Moore, C.T.; Conroy, M.J.; Hamrick, R.G.; Cooper, R.J.; Thackston, R.E.; Carroll, J.P.
2009-01-01
Large-scale habitat enhancement programs for birds are becoming more widespread; however, most lack monitoring to resolve uncertainties and enhance program impact over time. Georgia's Bobwhite Quail Initiative (BQI) is a competitive, proposal-based system that provides incentives to landowners to establish habitat for northern bobwhites (Colinus virginianus). Using data from monitoring conducted in the program's first years (1999-2001), we developed alternative hierarchical models to predict bobwhite abundance in response to program habitat modifications on local and regional scales. Effects of habitat and habitat management on bobwhite population response varied among geographical scales, but high measurement variability rendered the specific nature of these scaled effects equivocal. Under some models, BQI had positive impact at both local farm scales (1, 9 km2), particularly when practice acres were clustered, whereas other credible models indicated that bird response did not depend on spatial arrangement of practices. Thus, uncertainty about landscape-level effects of management presents a challenge to program managers who must decide which proposals to accept. We demonstrate that optimal selection decisions can be made despite this uncertainty and that uncertainty can be reduced over time, with consequent improvement in management efficacy. However, such an adaptive approach to BQI program implementation would require the reestablishment of monitoring of bobwhite abundance, an effort for which funding was discontinued in 2002. For landscape-level conservation programs generally, our approach demonstrates the value in assessing multiple scales of impact of habitat modification programs, and it reveals the utility of addressing management uncertainty through multiple decision models and system monitoring.
NASA Astrophysics Data System (ADS)
Keating, Elizabeth H.; Doherty, John; Vrugt, Jasper A.; Kang, Qinjun
2010-10-01
Highly parameterized and CPU-intensive groundwater models are increasingly being used to understand and predict flow and transport through aquifers. Despite their frequent use, these models pose significant challenges for parameter estimation and predictive uncertainty analysis algorithms, particularly global methods which usually require very large numbers of forward runs. Here we present a general methodology for parameter estimation and uncertainty analysis that can be utilized in these situations. Our proposed method includes extraction of a surrogate model that mimics key characteristics of a full process model, followed by testing and implementation of a pragmatic uncertainty analysis technique, called null-space Monte Carlo (NSMC), that merges the strengths of gradient-based search and parameter dimensionality reduction. As part of the surrogate model analysis, the results of NSMC are compared with a formal Bayesian approach using the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm. Such a comparison has never been accomplished before, especially in the context of high parameter dimensionality. Despite the highly nonlinear nature of the inverse problem, the existence of multiple local minima, and the relatively large parameter dimensionality, both methods performed well and results compare favorably with each other. Experiences gained from the surrogate model analysis are then transferred to calibrate the full highly parameterized and CPU intensive groundwater model and to explore predictive uncertainty of predictions made by that model. The methodology presented here is generally applicable to any highly parameterized and CPU-intensive environmental model, where efficient methods such as NSMC provide the only practical means for conducting predictive uncertainty analysis.
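As a much simplified stand-in for the workflow described above, the sketch below pairs a cheap surrogate with Monte Carlo sampling to bound predictive uncertainty under a limited budget of full-model runs. It is not the NSMC algorithm itself; the toy "full model", surrogate, observation, and acceptance rule are all invented for illustration.

```python
# Sketch: surrogate-assisted Monte Carlo predictive uncertainty (illustrative,
# not PEST's null-space Monte Carlo). Toy functions stand in for simulators.
import numpy as np

def full_model(k):                 # stand-in for a CPU-intensive simulator
    return np.log(k) + 0.1 * k

def surrogate(k):                  # cheap approximation of the full model
    return np.log(k)

rng = np.random.default_rng(1)
obs, sigma = 2.0, 0.1              # a single "observation" and its error

# Sample parameters cheaply; keep those whose surrogate output fits the data
k_samples = rng.lognormal(mean=0.7, sigma=0.5, size=100_000)
behavioural = k_samples[np.abs(surrogate(k_samples) - obs) < 2 * sigma]

# Re-check a subsample with the expensive model, respecting a limited run
# budget, then summarize the predictive spread
subset = rng.choice(behavioural, size=min(500, behavioural.size), replace=False)
pred = full_model(subset)
print(f"prediction: {pred.mean():.3f} +/- {pred.std():.3f}")
```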
Rivera-Rodriguez, Claudia L; Resch, Stephen; Haneuse, Sebastien
2018-01-01
In many low- and middle-income countries, the costs of delivering public health programs such as for HIV/AIDS, nutrition, and immunization are not routinely tracked. A number of recent studies have sought to estimate program costs on the basis of detailed information collected on a subsample of facilities. While unbiased estimates can be obtained via accurate measurement and appropriate analyses, they are subject to statistical uncertainty. Quantification of this uncertainty, for example, via standard errors and/or 95% confidence intervals, provides important contextual information for decision-makers and for the design of future costing studies. While other forms of uncertainty, such as that due to model misspecification, are considered and can be investigated through sensitivity analyses, statistical uncertainty is often not reported in studies estimating the total program costs. This may be due to a lack of awareness/understanding of (1) the technical details regarding uncertainty estimation and (2) the availability of software with which to calculate uncertainty for estimators resulting from complex surveys. We provide an overview of statistical uncertainty in the context of complex costing surveys, emphasizing the various potential specific sources that contribute to overall uncertainty. We describe how analysts can compute measures of uncertainty, either via appropriately derived formulae or through resampling techniques such as the bootstrap. We also provide an overview of calibration as a means of using additional auxiliary information that is readily available for the entire program, such as the total number of doses administered, to decrease uncertainty and thereby improve decision-making and the planning of future studies. A recent study of the national program for routine immunization in Honduras shows that uncertainty can be reduced by using information available prior to the study. This method can not only be used when estimating the total cost of delivering established health programs but also to decrease uncertainty when the interest lies in assessing the incremental effect of an intervention. Measures of statistical uncertainty associated with survey-based estimates of program costs, such as standard errors and 95% confidence intervals, provide important contextual information for health policy decision-making and key inputs for the design of future costing studies. Such measures are often not reported, possibly because of technical challenges associated with their calculation and a lack of awareness of appropriate software. Modern statistical analysis methods for survey data, such as calibration, provide a means to exploit additional information that is readily available but was not used in the design of the study to significantly improve the estimation of total cost through the reduction of statistical uncertainty.
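The bootstrap mentioned above can be sketched in a few lines for a total-cost estimate expanded from a facility subsample; the costs, sample sizes, and simple expansion estimator below are hypothetical, not from the Honduras study.

```python
# Sketch: bootstrap SE and 95% CI for a total program cost estimated from a
# subsample of facilities. Costs and sampling fractions are synthetic.
import numpy as np

rng = np.random.default_rng(2)
costs = rng.gamma(shape=2.0, scale=5_000.0, size=40)   # sampled facility costs
weight = 250 / 40                                       # 40 of 250 facilities sampled

def total_cost(sample):
    return weight * sample.sum()                        # simple expansion estimator

boot = np.array([
    total_cost(rng.choice(costs, size=costs.size, replace=True))
    for _ in range(5000)
])
est = total_cost(costs)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"total: {est:,.0f}  SE: {boot.std():,.0f}  95% CI: [{lo:,.0f}, {hi:,.0f}]")
```

Calibration would go further by adjusting the weights so that known program totals (e.g., doses administered) are reproduced exactly, which typically tightens the interval.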
NASA Astrophysics Data System (ADS)
Lall, U.
2010-12-01
To honor the passing this year of eminent hydrologists, Dooge, Klemes and Shiklomanov, I offer an irreverent look at the issues of uncertainty and stationarity as the hydrologic industry prepares climate change products. In an AGU keynote, Dooge said that the principle of mass balance was the only hydrologic law. It was not clear how one should apply it. Klemes observed that Rippl's 1872 mass curve analyses could essentially subsume many of the advances in stochastic modeling and reservoir optimization. Shiklomanov tackled data challenges to present a comprehensive view of the world's water supply and demand, highlighting the imbalance and sustainability challenge we face. He did not characterize the associated uncertainties. It is remarkable how little data can provide insights, while at times much information from models and data highlights uncertainty. Hydrologists have focused on parameter uncertainties in hydrologic models. The indeterminacy of the typical situation offered Beven the opportunity to coin the term equifinality. However, this ignores the fact that the traditional continuum model fails us across scales if we don't re-derive the correct averaged equations accounting for subscale heterogeneity. Nevertheless, the operating paradigm here has been a stimulus-response model y = f(x,P), where y are the observations of the state variables, x are observations of hydrologic drivers, P are model parameters, and f(.,.) is an appropriate differential or integral transform. The uncertainty analysis then focuses on P, such that the resulting field of y is approximately unbiased and has minimum variance or maximum likelihood. The parameters P are usually time invariant, and x and/or f(.,.) are expected to account for changes in the boundary conditions. Thus the dynamics are stationary, while the time series of either x or y may not be. Given the lack of clarity as to whether the dynamical system or the trajectory is stationary, it is amusing that the paper "Stationarity is Dead", which implicitly uses changes in time series properties and boundary conditions as its basis, gets much press. To avoid the stationarity dilemma, hydrologists are willing to take climate model outputs, rather than an analysis based on historical climate. Uncertainty analysis is viewed as the appropriate shrinkage of the spread across models and ensembles by clever averaging after bias corrections of the model output - a process I liken to transforming elephants into mice. Since it is someone else's model, we abandon the seemingly good sense of seeking the best parameters P that reproduce the data y. We now seek to fit a model y = T{f1(x,P1),f2(x,P2)...}, where we don't question the parameters or models but simply fudge the outputs to match what was observed. Clearly, we can't become climate modelers and must work with what we are dealt. By the way, doesn't this uncertainty analysis and reduction process involve an assumption of stationarity? So, how should hydrologists navigate this muddle of uncertainty and stationarity? I offer some ideas tied to modeling purpose, and advocate a greater effort on diagnostic analyses that provide insights into how hydrologic dynamics co-evolve with climate at a variety of space and time scales. Are there natural bounds or structure to systemic uncertainty and predictability, and what are the key carriers of hydrologic information?
NASA Astrophysics Data System (ADS)
Lawley, Russell; Barron, Mark; Lee, Katy
2014-05-01
Uncertainty in geological linework: communicating the expert's tacit model to the data user(s) by expert elicitation. R. Lawley, M. Barron and K. Lee. NERC - British Geological Survey, Environmental Science Centre, Keyworth, Nottingham, UK, NG12 5GG The boundaries mapped in traditional field geological survey are subject to a wide range of inherent uncertainties. A map at a survey-scale of 1:10,000 is created by a combination of terrain interpretation, direct observations from boreholes and exposures (often sparsely distributed), and indirect interpretation of proxy variables such as soil properties, vegetation and remotely sensed images. A critical factor influencing the quality of the final map is the skill and experience of the surveyor to bring this information together in a coherent conceptual model. The users of geological data comprising or based on mapped boundaries are increasingly aware of these uncertainties, and want to know how to manage them. The growth of 3D modelling, which takes 2D surveys as a starting point, adds urgency to the need for a better understanding of survey uncertainties; particularly where 2D mapping of variable vintage has been compiled into a national coverage. Previous attempts to apply confidence on the basis of metrics such as data density, survey age or survey techniques have proved useful for isolating single, critical, factors but do not generally succeed in evaluating geological mapping 'in the round', because they cannot account for the 'conceptual' skill set of the surveyor. The British Geological Survey (BGS) is using expert elicitation methods to gain a better understanding of uncertainties within the national geological map of Great Britain. The expert elicitation approach starts with the assumption that experienced surveyors have an intuitive sense of the uncertainty of the boundaries that they map, based on a tacit model of geology and its complexity and the nature of the surveying process. The objective of elicitation is to extract this model in a useable, quantitative, form by a robust and transparent procedure. At BGS expert elicitation is being used to evaluate the uncertainty of mapped boundaries in different common mapping scenarios, with a view to building a 'collective' understanding of the challenges each scenario presents. For example, a 'sharp contact (at surface) between highly contrasting sedimentary rocks' represents one level of survey challenge that should be accurately met by all surveyors, even novices. In contrast, a 'transitional boundary defined by localised facies-variation' may require much more experience to resolve (without recourse to significantly more sampling). We will describe the initial phase of this exercise in which uncertainty models were elicited for mapped boundaries in six contrasting scenarios. Each scenario was presented to a panel of experts with varied expertise and career history. In five cases it was possible to arrive at a consensus model, in a sixth case experts with different experience took different views of the nature of the mapping problem. We will discuss our experience of the use of elicitation methodology and the implications of our results for further work at the BGS to quantify uncertainty in map products. In particular we will consider the value of elicitation as a means to capture the expertise of individuals as they retire, and as the composition of the organization's staff changes in response to the management and policy decisions.
Agent-Centric Approach for Cybersecurity Decision-Support with Partial Observability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tipireddy, Ramakrishna; Chatterjee, Samrat; Paulson, Patrick R.
Generating automated cyber resilience policies for real-world settings is a challenging research problem that must account for uncertainties in system state over time and dynamics between attackers and defenders. In addition to understanding attacker and defender motives and tools, and identifying “relevant” system and attack data, it is also critical to develop rigorous mathematical formulations representing the defender’s decision-support problem under uncertainty. Game-theoretic approaches involving cyber resource allocation optimization with Markov decision processes (MDP) have been previously proposed in the literature. Moreover, advancements in reinforcement learning approaches have motivated the development of partially observable stochastic games (POSGs) in various multi-agent problem domains with partial information. Recent advances in cyber-system state space modeling have also generated interest in potential applicability of POSGs for cybersecurity. However, as is the case in strategic card games such as poker, research challenges using game-theoretic approaches for practical cyber defense applications include: 1) solving for equilibrium and designing efficient algorithms for large-scale, general problems; 2) establishing mathematical guarantees that equilibrium exists; 3) handling possible existence of multiple equilibria; and 4) exploitation of opponent weaknesses. Inspired by advances in solving strategic card games while acknowledging practical challenges associated with the use of game-theoretic approaches in cyber settings, this paper proposes an agent-centric approach for cybersecurity decision-support with partial system state observability.
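To make the MDP framing concrete, here is a minimal value-iteration sketch for a three-state defender problem; the states, transition probabilities, and rewards are invented for illustration, not drawn from the paper, and partial observability is deliberately left out.

```python
# Sketch: value iteration for a tiny (fully observed) defender MDP.
# States, actions, probabilities and rewards are illustrative only.
import numpy as np

n_states, gamma = 3, 0.95                   # e.g. {secure, degraded, breached}
# P[a, s, t]: probability of moving from state s to t under action a
P = np.array([
    [[0.9, 0.1, 0.0], [0.5, 0.4, 0.1], [0.2, 0.3, 0.5]],   # action 0: patch
    [[0.7, 0.2, 0.1], [0.3, 0.5, 0.2], [0.1, 0.2, 0.7]],   # action 1: monitor
])
R = np.array([[0.0, -0.5], [-2.0, -1.0], [-10.0, -8.0]])    # R[s, a], negative = damage

V = np.zeros(n_states)
for _ in range(500):
    Q = R + gamma * np.einsum("ast,t->sa", P, V)  # Bellman backup per (s, a)
    V_new = Q.max(axis=1)
    if np.abs(V_new - V).max() < 1e-8:
        break
    V = V_new
print("value:", V.round(2), "policy:", Q.argmax(axis=1))
```

Moving from this MDP to a POSG replaces the known state with a belief distribution and adds a strategic attacker, which is precisely where the scalability challenges listed above arise.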
Sequential Design of Experiments to Maximize Learning from Carbon Capture Pilot Plant Testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soepyan, Frits B.; Morgan, Joshua C.; Omell, Benjamin P.
Pilot plant test campaigns can be expensive and time-consuming. Therefore, it is of interest to maximize the amount of learning and the efficiency of the test campaign given the limited number of experiments that can be conducted. This work investigates the use of sequential design of experiments (SDOE) to overcome these challenges by demonstrating its usefulness for a recent solvent-based CO2 capture plant test campaign. Unlike traditional design of experiments methods, SDOE regularly uses information from ongoing experiments to determine the optimum locations in the design space for subsequent runs within the same experiment. However, there are challenges that need to be addressed, including reducing the high computational burden to efficiently update the model, and the need to incorporate the methodology into a computational tool. We address these challenges by applying SDOE in combination with a software tool, the Framework for Optimization, Quantification of Uncertainty and Surrogates (FOQUS) (Miller et al., 2014a, 2016, 2017). The results of applying SDOE on a pilot plant test campaign for CO2 capture suggest that relative to traditional design of experiments methods, SDOE can more effectively reduce the uncertainty of the model, thus decreasing technical risk. Future work includes integrating SDOE into FOQUS and using SDOE to support additional large-scale pilot plant test campaigns.
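A rough sketch of one sequential-design step is given below: fit a Gaussian process to the runs completed so far and place the next run where predictive uncertainty is largest. This is a generic variance-reduction heuristic on an invented one-dimensional objective, not the FOQUS SDOE implementation.

```python
# Sketch: one variance-driven SDOE step with a Gaussian process surrogate.
# The response function and design bounds are illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(3)

def pilot_response(x):                       # stand-in for a pilot-plant output
    return np.sin(3 * x) + 0.05 * rng.standard_normal(x.shape)

X = rng.uniform(0, 2, size=(6, 1))           # runs completed so far
y = pilot_response(X).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=0.05**2)
gp.fit(X, y)

# Pick the next run where the fitted model is least certain
candidates = np.linspace(0, 2, 200).reshape(-1, 1)
_, std = gp.predict(candidates, return_std=True)
x_next = candidates[np.argmax(std)]
print(f"next run at x = {x_next[0]:.3f} (pred. std {std.max():.3f})")
```

Repeating fit-then-select after each completed run is what distinguishes SDOE from a fixed, up-front design.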
NASA Technical Reports Server (NTRS)
Tao, Gang; Joshi, Suresh M.
2008-01-01
In this paper, the problem of controlling systems with failures and faults is introduced, and an overview of recent work on direct adaptive control for compensation of uncertain actuator failures is presented. Actuator failures may be characterized by some unknown system inputs being stuck at some unknown (fixed or varying) values at unknown time instants, that cannot be influenced by the control signals. The key task of adaptive compensation is to design the control signals in such a manner that the remaining actuators can automatically and seamlessly take over for the failed ones, and achieve desired stability and asymptotic tracking. A certain degree of redundancy is necessary to accomplish failure compensation. The objective of adaptive control design is to effectively use the available actuation redundancy to handle failures without the knowledge of the failure patterns, parameters, and time of occurrence. This is a challenging problem because failures introduce large uncertainties in the dynamic structure of the system, in addition to parametric uncertainties and unknown disturbances. The paper addresses some theoretical issues in adaptive actuator failure compensation: actuator failure modeling, redundant actuation requirements, plant-model matching, error system dynamics, adaptation laws, and stability, tracking, and performance analysis. Adaptive control designs can be shown to effectively handle uncertain actuator failures without explicit failure detection. Some open technical challenges and research problems in this important research area are discussed.
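A minimal scalar sketch of the failure-compensation idea follows: when one actuator sticks at an unknown value, the remaining actuator adapts an estimate of the stuck input and cancels it. For clarity the sketch assumes the failure time is known, unlike the detection-free designs discussed above, and all plant values and gains are illustrative.

```python
# Sketch: adaptive compensation of a stuck actuator on the scalar plant
# xdot = a*x + u1 + u2. Not the paper's multivariable design; numbers invented.
import numpy as np

dt, T = 1e-3, 10.0
a, k, gamma = 1.0, 4.0, 20.0          # plant pole, feedback gain, adaptation gain
x, d_hat = 1.0, 0.0                   # state and estimate of the stuck input
u_stuck = 0.8                         # unknown value actuator 2 sticks at

for step in range(int(T / dt)):
    t = step * dt
    u2 = -0.5 * k * x if t < 5.0 else u_stuck     # actuator 2 fails at t = 5 s
    u1 = (-0.5 * k * x if t < 5.0                 # healthy: share the load
          else -k * x - d_hat)                    # failed: take over and cancel
    x += dt * (a * x + u1 + u2)                   # Euler step of the plant
    if t >= 5.0:
        d_hat += dt * gamma * x                   # gradient law: d_hat -> u_stuck

print(f"x(T) = {x:.4f}, d_hat = {d_hat:.3f} (true stuck value {u_stuck})")
```

With the Lyapunov function V = x^2/2 + (u_stuck - d_hat)^2/(2*gamma), the update d_hat' = gamma*x gives V' = (a - k)*x^2 <= 0 for k > a, so the state converges and the estimate tracks the stuck value.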
NASA Astrophysics Data System (ADS)
Kirchengast, Gottfried; Li, Ying; Scherllin-Pirscher, Barbara; Schwärz, Marc; Schwarz, Jakob; Nielsen, Johannes K.
2017-04-01
The GNSS radio occultation (RO) technique is an important remote sensing technique for obtaining thermodynamic profiles of temperature, humidity, and pressure in the Earth's troposphere. However, due to refraction effects of both dry ambient air and water vapor in the troposphere, retrieval of accurate thermodynamic profiles at these lower altitudes is challenging and requires suitable background information in addition to the RO refractivity information. Here we introduce a new moist air retrieval algorithm aiming to improve the quality and robustness of retrieving temperature, humidity and pressure profiles in moist air tropospheric conditions. The new algorithm consists of four steps: (1) use of prescribed specific humidity and its uncertainty to retrieve temperature and its associated uncertainty; (2) use of prescribed temperature and its uncertainty to retrieve specific humidity and its associated uncertainty; (3) use of the previous results to estimate final temperature and specific humidity profiles through optimal estimation; (4) determination of air pressure and density profiles from the results obtained before. The new algorithm does not require elaborated matrix inversions which are otherwise widely used in 1D-Var retrieval algorithms, and it allows a transparent uncertainty propagation, whereby the uncertainties of prescribed variables are dynamically estimated accounting for their spatial and temporal variations. Estimated random uncertainties are calculated by constructing error covariance matrices from co-located ECMWF short-range forecast and corresponding analysis profiles. Systematic uncertainties are estimated by empirical modeling. The influence of regarding or disregarding vertical error correlations is quantified. The new scheme is implemented with static input uncertainty profiles in WEGC's current OPSv5.6 processing system and with full scope in WEGC's next-generation system, the Reference Occultation Processing System (rOPS). Results from both WEGC systems, current OPSv5.6 and next-generation rOPS, are shown and discussed, based on both insights from individual profiles and statistical ensembles, and compared to moist air retrieval results from the UCAR Boulder and ROM-SAF Copenhagen centers. The results show that the new algorithmic scheme improves the temperature, humidity and pressure retrieval performance, in particular also the robustness including for integrated uncertainty estimation for large-scale applications, over the previous algorithms. The new rOPS-implemented algorithm will therefore be used in the first large-scale reprocessing towards a tropospheric climate data record 2001-2016 by the rOPS, including its integrated uncertainty propagation.
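Step (3) above amounts to an inverse-variance weighted combination of the two retrievals. The sketch below shows that combination for a three-level temperature profile, assuming uncorrelated (diagonal) errors and made-up values; it therefore ignores the vertical error correlations whose influence the paper quantifies.

```python
# Sketch: inverse-variance (optimal) combination of two estimates per level.
# Profile values and uncertainties are illustrative, not RO retrieval output.
import numpy as np

# Temperature retrieved with prescribed humidity (step 1 above) ...
T1 = np.array([285.0, 270.0, 250.0]);  s1 = np.array([1.5, 1.0, 0.8])
# ... and a background temperature profile with its own uncertainty
T2 = np.array([284.0, 271.0, 249.0]);  s2 = np.array([2.0, 2.0, 2.0])

w1 = 1 / s1**2
w2 = 1 / s2**2
T_opt = (w1 * T1 + w2 * T2) / (w1 + w2)       # optimal estimate per level
s_opt = np.sqrt(1 / (w1 + w2))                # propagated uncertainty

print("T_opt:", T_opt.round(2), "sigma:", s_opt.round(2))
```

The combined uncertainty is always smaller than either input's, which is why dynamically estimated background uncertainties matter: overconfident inputs produce overconfident retrievals.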
Scheduling Future Water Supply Investments Under Uncertainty
NASA Astrophysics Data System (ADS)
Huskova, I.; Matrosov, E. S.; Harou, J. J.; Kasprzyk, J. R.; Reed, P. M.
2014-12-01
Uncertain hydrological impacts of climate change, population growth and institutional changes pose a major challenge to the planning of water supply systems. Planners seek optimal portfolios of supply and demand management schemes, but must also decide when to activate assets whilst considering many system goals and plausible futures. Incorporating scheduling into the planning-under-uncertainty problem strongly increases its complexity. We investigate some approaches to scheduling with many-objective heuristic search. We apply a multi-scenario many-objective scheduling approach to the Thames River basin water supply system planning problem in the UK. Decisions include which new supply and demand schemes to implement, at what capacity and when. The impact of different system uncertainties on scheme implementation schedules is explored, i.e. how the choice of future scenarios affects the search process and its outcomes. The activation of schemes is influenced by the occurrence of extreme hydrological events in the ensemble of plausible scenarios and other factors. The approach and results are compared with a previous study where only the portfolio problem was addressed (without scheduling).
Sheppard, Maria K
2013-06-01
The patient mobility case law of the Court of Justice of the European Union created legal uncertainty for the healthcare systems of EU Member States. The Patient Mobility Directive setting out patients' cross-border rights was adopted to end this uncertainty. With the Directive to be transposed into national law by October 2013 this article discusses whether the Directive achieves this objective for the English NHS. It contrasts the legal position of the NHS patient under case law and under the Directive regarding the need for prior authorisation of cross-border treatment, the level of reimbursement and the ambit of the healthcare benefits basket. It is argued that the risk of legal challenge may persist under the Directive, specifically regarding treatments which are classified by health authorities as low priority, namely treatments which are either not 'generally' available or only available subject to certain clinical criteria or access thresholds.
Panaceas, uncertainty, and the robust control framework in sustainability science
Anderies, John M.; Rodriguez, Armando A.; Janssen, Marco A.; Cifdaloz, Oguzhan
2007-01-01
A critical challenge faced by sustainability science is to develop strategies to cope with highly uncertain social and ecological dynamics. This article explores the use of the robust control framework toward this end. After briefly outlining the robust control framework, we apply it to the traditional Gordon–Schaefer fishery model to explore fundamental performance–robustness and robustness–vulnerability trade-offs in natural resource management. We find that the classic optimal control policy can be very sensitive to parametric uncertainty. By exploring a large class of alternative strategies, we show that there are no panaceas: even mild robustness properties are difficult to achieve, and increasing robustness to some parameters (e.g., biological parameters) results in decreased robustness with respect to others (e.g., economic parameters). On the basis of this example, we extract some broader themes for better management of resources under uncertainty and for sustainability science in general. Specifically, we focus attention on the importance of a continual learning process and the use of robust control to inform this process. PMID:17881574
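The sensitivity reported above can be sketched in a few lines: the constant-effort policy that maximizes steady-state rent for an assumed growth rate r is applied to a fishery whose true r is lower. The parameter values and the closed-form effort rule below are illustrative assumptions for the standard Gordon-Schaefer model, not the paper's analysis.

```python
# Sketch: fragility of the "optimal" constant-effort policy in the
# Gordon-Schaefer model dx/dt = r x (1 - x/K) - q E x. Numbers illustrative.
import numpy as np

K, q, p, c = 1.0, 1.0, 10.0, 2.0          # capacity, catchability, price, cost

def optimal_effort(r):
    # Maximizes steady-state rent E*(p q K (1 - qE/r) - c) for the assumed r
    return r * (p * q * K - c) / (2 * p * q**2 * K)

def simulate(r_true, E, T=200, dt=0.05):   # stock under constant effort E
    x = K
    for _ in range(int(T / dt)):
        x += dt * (r_true * x * (1 - x / K) - q * E * x)
        x = max(x, 0.0)
    return x, p * q * E * x - c * E        # final stock and rent

E_star = optimal_effort(r=0.5)             # policy tuned to the assumed r = 0.5
for r_true in (0.5, 0.3, 0.2):
    x_eq, rent = simulate(r_true, E_star)
    print(f"r_true={r_true}: stock={x_eq:.3f}, rent={rent:.3f}")
```

When the true r falls to the assumed harvest pressure q*E, the stock collapses and rent turns negative, which is the kind of parametric fragility that motivates trading peak performance for robustness.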
NASA Technical Reports Server (NTRS)
Chan, David T.; Pinier, Jeremy T.; Wilcox, Floyd J., Jr.; Dalle, Derek J.; Rogers, Stuart E.; Gomez, Reynaldo J.
2016-01-01
The development of the aerodynamic database for the Space Launch System (SLS) booster separation environment has presented many challenges because of the complex physics of the flow around three independent bodies due to proximity effects and jet interactions from the booster separation motors and the core stage engines. This aerodynamic environment is difficult to simulate in a wind tunnel experiment and also difficult to simulate with computational fluid dynamics. The database is further complicated by the high dimensionality of the independent variable space, which includes the orientation of the core stage, the relative positions and orientations of the solid rocket boosters, and the thrust levels of the various engines. Moreover, the clearance between the core stage and the boosters during the separation event is sensitive to the aerodynamic uncertainties of the database. This paper will present the development process for Version 3 of the SLS booster separation aerodynamic database and the statistics-based uncertainty quantification process for the database.
Sustainable water management under future uncertainty with eco-engineering decision scaling
NASA Astrophysics Data System (ADS)
Poff, N. Leroy; Brown, Casey M.; Grantham, Theodore E.; Matthews, John H.; Palmer, Margaret A.; Spence, Caitlin M.; Wilby, Robert L.; Haasnoot, Marjolijn; Mendoza, Guillermo F.; Dominique, Kathleen C.; Baeza, Andres
2016-01-01
Managing freshwater resources sustainably under future climatic and hydrological uncertainty poses novel challenges. Rehabilitation of ageing infrastructure and construction of new dams are widely viewed as solutions to diminish climate risk, but attaining the broad goal of freshwater sustainability will require expansion of the prevailing water resources management paradigm beyond narrow economic criteria to include socially valued ecosystem functions and services. We introduce a new decision framework, eco-engineering decision scaling (EEDS), that explicitly and quantitatively explores trade-offs in stakeholder-defined engineering and ecological performance metrics across a range of possible management actions under unknown future hydrological and climate states. We illustrate its potential application through a hypothetical case study of the Iowa River, USA. EEDS holds promise as a powerful framework for operationalizing freshwater sustainability under future hydrological uncertainty by fostering collaboration across historically conflicting perspectives of water resource engineering and river conservation ecology to design and operate water infrastructure for social and environmental benefits.
Properties of Extreme Precipitation and Their Uncertainties in 3-year GPM Precipitation Radar Data
NASA Astrophysics Data System (ADS)
Liu, N.; Liu, C.
2017-12-01
Extremely high precipitation rates are often related to flash floods and have devastating impacts on human society and the environment. To better understand these rare events, 3-year Precipitation Features (PFs) are defined by grouping the contiguous areas with nonzero near-surface precipitation derived using the Global Precipitation Measurement (GPM) Ku-band Precipitation Radar (KuPR). The properties of PFs with extreme precipitation rates greater than 20, 50, and 100 mm/hr, such as the geographical distribution, volumetric precipitation contribution, and seasonal and diurnal variations, are examined. In addition to the large seasonal and regional variations, these rare extreme precipitation rates often make a large contribution to the local total precipitation. Extreme precipitation rates occur more often over land than over ocean. The challenges in the retrieval of extreme precipitation likely stem from the attenuation correction and from large uncertainties in the Z-R relationships used to convert near-surface radar reflectivity to precipitation rates. These potential uncertainties are examined by using collocated ground-based radar reflectivity and precipitation retrievals.
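To see why Z-R uncertainty matters at extreme rates, the sketch below inverts Z = aR^b for a single strong echo under a few plausible coefficient pairs; the (a, b) values are common textbook choices (e.g., Marshall-Palmer a=200, b=1.6), not GPM KuPR coefficients.

```python
# Sketch: spread in retrieved rain rate from coefficient uncertainty in the
# Z-R relationship Z = a * R**b. Coefficient pairs are illustrative.
import numpy as np

def rain_rate(Z_dbz, a, b):
    Z = 10 ** (Z_dbz / 10.0)          # dBZ -> linear reflectivity [mm^6 m^-3]
    return (Z / a) ** (1.0 / b)       # invert Z = a R^b  -> R [mm/hr]

Z_dbz = 50.0                           # a strong convective echo
for a, b in [(200, 1.6), (300, 1.4), (250, 1.2)]:
    print(f"a={a}, b={b}: R = {rain_rate(Z_dbz, a, b):7.1f} mm/hr")
```

For the same 50 dBZ echo, these coefficient choices span roughly 50 to 150 mm/hr, i.e., exactly the range where the 20/50/100 mm/hr thresholds above are applied.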
Probability bounds analysis for nonlinear population ecology models.
Enszer, Joshua A; Andrei Măceș, D; Stadtherr, Mark A
2015-09-01
Mathematical models in population ecology often involve parameters that are empirically determined and inherently uncertain, with probability distributions for the uncertainties not known precisely. Propagating such imprecise uncertainties rigorously through a model to determine their effect on model outputs can be a challenging problem. We illustrate here a method for the direct propagation of uncertainties represented by probability bounds through nonlinear, continuous-time, dynamic models in population ecology. This makes it possible to determine rigorous bounds on the probability that some specified outcome for a population is achieved, which can be a core problem in ecosystem modeling for risk assessment and management. Results can be obtained at a computational cost that is considerably less than that required by statistical sampling methods such as Monte Carlo analysis. The method is demonstrated using three example systems, with focus on a model of an experimental aquatic food web subject to the effects of contamination by ionic liquids, a new class of potentially important industrial chemicals. Copyright © 2015. Published by Elsevier Inc.
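A crude Monte Carlo illustration of the probability-bounds idea follows, for a logistic model whose growth rate is known only as an interval while the carrying capacity has a known distribution. The paper's method computes rigorous bounds without sampling; the monotonicity shortcut and all numbers here are assumptions made for the sketch.

```python
# Sketch: bounding P(outcome) when one parameter is an interval (no
# distribution) and another has a known distribution. Illustrative only.
import numpy as np

rng = np.random.default_rng(4)
K_samples = rng.normal(100.0, 10.0, 20_000)   # carrying capacity: known dist.
r_lo, r_hi = 0.4, 0.6                          # growth rate: interval only

def x_final(r, K, x0=10.0, T=10.0, dt=0.01):
    """Integrate dx/dt = r x (1 - x/K); returns x(T), vectorized over K."""
    x = np.full_like(K, x0)
    for _ in range(int(T / dt)):
        x += dt * r * x * (1 - x / K)
    return x

threshold = 90.0
# For x0 < K, x(T) increases with r, so the interval endpoints bound P
p_lo = np.mean(x_final(r_lo, K_samples) > threshold)
p_hi = np.mean(x_final(r_hi, K_samples) > threshold)
print(f"P(x(T) > {threshold}) lies in [{p_lo:.3f}, {p_hi:.3f}]")
```

The output is an interval of probabilities rather than a single number, which is the defining feature of a probability-bounds (p-box) result.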
Contingency Planning for Planetary Rovers
NASA Technical Reports Server (NTRS)
Dearden, Richard; Meuleau, Nicolas; Ramakrishnan, Sailesh; Smith, David; Washington, Rich; Clancy, Daniel (Technical Monitor)
2002-01-01
There has been considerable work in AI on planning under uncertainty. But this work generally assumes an extremely simple model of action that does not consider continuous time and resources. These assumptions are not reasonable for a Mars rover, which must cope with uncertainty about the duration of tasks, the power required, the data storage necessary, along with its position and orientation. In this paper, we outline an approach to generating contingency plans when the sources of uncertainty involve continuous quantities such as time and resources. The approach involves first constructing a "seed" plan, and then incrementally adding contingent branches to this plan in order to improve utility. The challenge is to figure out the best places to insert contingency branches. This requires an estimate of how much utility could be gained by building a contingent branch at any given place in the seed plan. Computing this utility exactly is intractable, but we outline an approximation method that back propagates utility distributions through a graph structure similar to that of a plan graph.
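A minimal Monte Carlo sketch of the core question, how much utility one contingent branch adds to a seed plan, is given below. The paper back-propagates full utility distributions through a plan graph; this toy instead simulates a three-task plan with an invented switch rule, durations, and utilities.

```python
# Sketch: estimate the expected-utility gain of one contingent branch by
# simulation. All durations, thresholds, and utilities are illustrative.
import numpy as np

rng = np.random.default_rng(5)
N = 100_000
deadline, u_goal, u_safe, u_fail = 10.0, 100.0, 30.0, -50.0

# Seed plan: three tasks with uncertain durations
d1, d2, d3 = (rng.gamma(2.0, 1.5, N) for _ in range(3))
total = d1 + d2 + d3

# Seed plan utility: reach the goal by the deadline or incur a failure cost
u_seed = np.where(total <= deadline, u_goal, u_fail)

# One contingent branch after task 2: if too little time remains, switch to
# a short safe action worth u_safe instead of risking task 3
switch = d1 + d2 > 6.0
u_branch = np.where(switch, u_safe, np.where(total <= deadline, u_goal, u_fail))

print(f"seed plan EU: {u_seed.mean():.1f}, with branch: {u_branch.mean():.1f}")
```

Evaluating this gain at every candidate branch point is what the distribution back-propagation approximates far more cheaply than brute-force simulation.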
Incremental Contingency Planning
NASA Technical Reports Server (NTRS)
Dearden, Richard; Meuleau, Nicolas; Ramakrishnan, Sailesh; Smith, David E.; Washington, Rich
2003-01-01
There has been considerable work in AI on planning under uncertainty. However, this work generally assumes an extremely simple model of action that does not consider continuous time and resources. These assumptions are not reasonable for a Mars rover, which must cope with uncertainty about the duration of tasks, the energy required, the data storage necessary, and its current position and orientation. In this paper, we outline an approach to generating contingency plans when the sources of uncertainty involve continuous quantities such as time and resources. The approach involves first constructing a "seed" plan, and then incrementally adding contingent branches to this plan in order to improve utility. The challenge is to figure out the best places to insert contingency branches. This requires an estimate of how much utility could be gained by building a contingent branch at any given place in the seed plan. Computing this utility exactly is intractable, but we outline an approximation method that back propagates utility distributions through a graph structure similar to that of a plan graph.
Computational sciences in the upstream oil and gas industry
Halsey, Thomas C.
2016-01-01
The predominant technical challenge of the upstream oil and gas industry has always been the fundamental uncertainty of the subsurface from which it produces hydrocarbon fluids. The subsurface can be detected remotely by, for example, seismic waves, or it can be penetrated and studied in the extremely limited vicinity of wells. Inevitably, a great deal of uncertainty remains. Computational sciences have been a key avenue to reduce and manage this uncertainty. In this review, we discuss at a relatively non-technical level the current state of three applications of computational sciences in the industry. The first of these is seismic imaging, which is currently being revolutionized by the emergence of full wavefield inversion, enabled by algorithmic advances and petascale computing. The second is reservoir simulation, also being advanced through the use of modern highly parallel computing architectures. Finally, we comment on the role of data analytics in the upstream industry. This article is part of the themed issue ‘Energy and the subsurface’. PMID:27597785
Levine, Lyle E.; Okoro, Chukwudi A.; Xu, Ruqing
2015-09-30
We report non-destructive measurements of the full elastic strain and stress tensors from individual dislocation cells distributed along the full extent of a 50 µm-long polycrystalline copper via in Si. Determining all of the components of these tensors from sub-micrometre regions within deformed metals presents considerable challenges. The primary issues are ensuring that different diffraction peaks originate from the same sample volume and that accurate determination is made of the peak positions from plastically deformed samples. For these measurements, three widely separated reflections were examined from selected, individual grains along the via. The lattice spacings and peak positions were measured for multiple dislocation cell interiors within each grain and the cell-interior peaks were sorted out using the measured included angles. A comprehensive uncertainty analysis using a Monte Carlo uncertainty algorithm provided uncertainties for the elastic strain tensor and stress tensor components.
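A scalar sketch of Monte Carlo uncertainty propagation from peak position to lattice strain via Bragg's law is shown below; the wavelength, angles, and uncertainty values are illustrative, and the actual analysis above operates on full strain and stress tensors from multiple reflections.

```python
# Sketch: Monte Carlo propagation of a diffraction peak-position uncertainty
# to a lattice strain. All numerical values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(6)
lam = 1.0e-10                 # X-ray wavelength [m]
two_theta = np.radians(30.0)  # measured peak position
sigma_tt = np.radians(0.005)  # peak-position uncertainty (std. dev.)
d0 = 1.9310e-10               # assumed strain-free lattice spacing [m]

# Sample peak positions, convert each to a lattice spacing and a strain
tt = rng.normal(two_theta, sigma_tt, 100_000)
d = lam / (2 * np.sin(tt / 2))            # Bragg's law: lambda = 2 d sin(theta)
strain = (d - d0) / d0

print(f"strain = {strain.mean():.3e} +/- {strain.std():.3e}")
```

Repeating such sampling jointly over all measured reflections, with their correlations, is what yields uncertainties on every tensor component rather than a single scalar.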
Assessment of adaptation measures to high-mountain risks in Switzerland under climate uncertainties
NASA Astrophysics Data System (ADS)
Muccione, Veruska; Lontzek, Thomas; Huggel, Christian; Ott, Philipp; Salzmann, Nadine
2015-04-01
The economic evaluation of different adaptation options is important to support policy-makers who need to set priorities in the decision-making process. However, the decision-making process faces considerable uncertainties regarding current and projected climate impacts. First, physical climate and related impact systems are highly complex and not fully understood. Second, the further we look into the future, the more important the emission pathways become, with effects on the frequency and severity of climate impacts. Decisions on adaptation measures taken today and in the future must adequately consider the uncertainties originating from these different sources. Decisions are not taken in a vacuum but always in the context of specific social, economic, institutional and political conditions. Decision-finding processes strongly depend on the socio-political system and usually have evolved over some time. Finding and taking decisions in the respective socio-political and economic context multiplies the uncertainty challenge. Our presumption is that a sound assessment of the different adaptation options in Switzerland under uncertainty necessitates formulating and solving a dynamic, stochastic optimization problem. Economic optimization models in the field of climate change are not new. Typically, such models are applied to global-scale studies but rarely to local-scale problems. In this analysis, we considered the case of the Guttannen-Grimsel Valley, situated in the Swiss Bernese Alps. The alpine community has been affected by high-magnitude, high-frequency debris flows that started in 2009 and were historically unprecedented. They were related to thawing permafrost in the rock slopes of the Ritzlihorn and repeated rockfall events that accumulated on the debris fan, forming a sediment source for debris flows that were transported downvalley. An important transit road, a trans-European gas pipeline and settlements were severely affected and partly destroyed. Several adaptation measures were discussed by the responsible authorities, but decision making is particularly challenging under multiple uncertainties. For this area, we developed a stochastic optimization model for concrete, real-case adaptation options and measures and used dynamic programming to explore optimal adaptation decisions in the face of uncertain climate change impacts on debris flows and flooding. Even though simplifications had to be made, the results produced were concrete and tangible, indicating that, under our assumptions and modeling, excavation is a preferable adaptation option in comparison to building a dam or relocation. This is not necessarily intuitive and adds an additional perspective to what has so far been sketched and evaluated by the cantonal and communal authorities for Guttannen. Moreover, building an alternative cantonal road appears to be more expensive than the costs incurred by road closure.
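A toy backward-induction sketch of choosing among adaptation measures under a two-state hazard process is given below; the transition probabilities, annualized costs, and mitigation factors are invented for illustration and are not the study's model.

```python
# Sketch: stochastic dynamic programming over adaptation measures under an
# uncertain hazard state. All states, costs and probabilities illustrative.
import numpy as np

# Hazard states: 0 = calm, 1 = active; annual transition probabilities
P = np.array([[0.8, 0.2],
              [0.3, 0.7]])
damage = np.array([0.0, 5.0])             # expected annual damage per state [M CHF]

annual_cost = {"wait": 0.0, "excavate": 0.3, "dam": 1.2, "relocate": 2.0}
mitigation = {"wait": 1.0, "excavate": 0.4, "dam": 0.1, "relocate": 0.0}

beta, T = 0.97, 30                         # discount factor, planning horizon [yr]
V = np.zeros(2)
for _ in range(T):                         # backward induction over the horizon
    Q = {a: annual_cost[a] + mitigation[a] * damage + beta * (P @ V)
         for a in annual_cost}
    V = np.min(np.array(list(Q.values())), axis=0)

best = {s: min(Q, key=lambda a: Q[a][s]) for s in (0, 1)}
print("expected cost-to-go:", V.round(2), "policy:", best)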
[The challenge of clinical complexity in the 21st century: Could frailty indexes be the answer?]
Amblàs-Novellas, Jordi; Espaulella-Panicot, Joan; Inzitari, Marco; Rexach, Lourdes; Fontecha, Benito; Romero-Ortuno, Roman
The number of older people with complex clinical conditions and complex care needs continues to increase in the population. This is presenting many challenges to healthcare professionals and healthcare systems. In the face of these challenges, approaches are required that are practical and feasible. The frailty paradigm may be an excellent opportunity to review and establish some of the principles of comprehensive Geriatric Assessment in specialties outside Geriatric Medicine. The assessment of frailty using Frailty Indexes provides an aid to the 'situational diagnosis' of complex clinical situations, and may help in tackling uncertainty in a person-centred approach. Copyright © 2016 SEGG. Publicado por Elsevier España, S.L.U. All rights reserved.
Anderies, John M
2015-02-01
I present a general mathematical modeling framework that can provide a foundation for the study of sustainability in social-ecological systems (SESs). Using basic principles from feedback control and a sequence of specific models from bioeconomics and economic growth, I outline several mathematical and empirical challenges associated with the study of sustainability of SESs. These challenges are categorized into three classes: (1) the social choice of performance measures, (2) uncertainty, and (3) collective action. Finally, I present some opportunities for combining stylized dynamical systems models with empirical data on human behavior and biophysical systems to address practical challenges for the design of effective governance regimes (policy feedbacks) for highly uncertain natural resource systems.
Time-lapse seismic - repeatability versus usefulness and 2D versus 3D
NASA Astrophysics Data System (ADS)
Landro, M.
2017-12-01
Time-lapse seismic has developed rapidly over the past decades, especially for monitoring of oil and gas reservoirs and subsurface storage of CO2. I will review and discuss some of the critical enabling factors for the commercial success of this technology. It was realized early on that how well we are able to repeat our seismic experiment is crucial. However, it is always a question of detectability versus repeatability. For marine seismic, there are several factors limiting the repeatability: weather conditions, positioning of sources and receivers, and so on. I will discuss recent improvements in both acquisition and processing methods over the last decade. It is well known that repeated 3D seismic data is the most accurate tool for reservoir monitoring purposes. However, several examples show that 2D seismic data may be used for monitoring purposes despite lower repeatability. I will use examples from an underground blowout in the North Sea, and repeated 2D seismic lines acquired before and after the Tohoku earthquake in 2011, to illustrate this. A major challenge when using repeated 2D seismic for subsurface monitoring purposes is the lack of 3D calibration points and the significantly smaller amount of data. For marine seismic acquisition, feathering issues and crossline dip effects become more critical compared to 3D seismic acquisition. Furthermore, the uncertainties arising from a non-ideal 2D seismic acquisition are hard to assess, since the 3D subsurface geometry has not been mapped. One way to shed more light on this challenge is to use 3D time-lapse seismic modeling, testing various crossline dips or geometries. Other ways are to use alternative data sources, such as bathymetry, time-lapse gravity or electromagnetic data. The end result of all time-lapse monitoring projects is an interpretation associated with uncertainties, and for the 2D case these uncertainties are often large. The purpose of this talk is to discuss how to reduce and control these uncertainties as much as possible.
Examining the Challenging Hindrances facing in the Construction Projects: South India’s Perspective
NASA Astrophysics Data System (ADS)
Subramanyam, K.; Haridharan, M. K.
2017-07-01
Developing countries like India require huge infrastructure investments to meet the needs of their people, and the construction industry provides many opportunities to individuals. A construction manager's work is to supervise and organize the activities in construction projects, and today construction managers face many challenges. This paper studies the challenges faced by construction managers as perceived by construction professionals. Thirty-nine variables found in the literature review to have a severe impact on construction managers' performance were selected. Construction managers, project managers and site engineers were the respondents for the survey. Using SPSS, regression analysis was performed to identify the significant challenges, which were classified into five domains. Among management challenges, resource availability and allocation, risks and uncertainties existing in the project on site, top management support and cost constraints are the most significant variables. Among the skill-requirement challenges for a construction manager, the technical skills needed to learn and adopt new technology in the project, and decision making and planning according to the on-site situation, are the most significant variables. Among performance challenges, implementing tasks according to the plan is the key variable, whereas among on-site challenges, managing project risks and developing project policies and procedures are the most important.
ERIC Educational Resources Information Center
Dawson, Michelle; Pooley, Julie Ann
2013-01-01
Throughout our lifespan we face many challenges which are often referred to as transitions. The move to university is one such transition which may place individuals at risk of suffering ongoing significant life stress, anxiety and uncertainty. Optimism, promotion of independent functioning (PIF), promotion of volitional functioning (PVF) and…
Asian Security Challenges-Planning in the Face of Strategic Uncertainties. Volume 2. Appendices
1994-10-01
suited to the company’s own strengths." In addition, the authors argue, U.S. firms should emulate the Japanese and South Koreans by setting up global ... marketing franchises and sharing core competencies where it makes sense to do so. Finally, they take a swipe at traditional strategic planning, criticizing
Carina Wyborn; Laurie Yung; Daniel Murphy; Daniel R. Williams
2015-01-01
Adaptation is situated within multiple, interacting social, political, and economic forces. Adaptation pathways envision adaptation as a continual pathway of change and response embedded within this broader sociopolitical context. Pathways emphasize that current decisions are both informed by past actions and shape the landscape of future options. This research...
ERIC Educational Resources Information Center
Hsu, Huei-Lien
2012-01-01
By centralizing the issue of test fairness in language proficiency assessments, this study responds to a call by researchers for developing greater social responsibility in the language testing agenda. As inquiries into language attitude and psychology indicate, there is an underlying uncertainty pertaining to the validity of test use and score…
"Where Is the Post-Modern Truth We Have Lost in Reductionist Knowledge?" A Curriculum's Epitaph
ERIC Educational Resources Information Center
Slabbert, Johannes A.; Hattingh, Annemarie
2006-01-01
This essay suggests a way for creating a curriculum for the future amidst the challenges of post-modern uncertainty. Curriculum discourse in the past has been dominated by widely-accepted key questions, which produce and maintain curricula that are essentially fragmented and reductionistic, and directly opposed to the essential demands of the…
Evaluating uncertainty and parameter sensitivity in environmental models can be a difficult task, even for low-order, single-media constructs driven by a unique set of site-specific data. The challenge of examining ever more complex, integrated, higher-order models is a formidab...
A complex case of congenital cystic renal disease
Cordiner, David S; Evans, Clair A; Brundler, Marie-Anne; McPhillips, Maeve; Murio, Enric; Darling, Mark; Taheri, Sepideh
2012-01-01
This case outlines the potential complexity of autosomal recessive polycystic kidney disease (ARPKD). It highlights the challenges involved in managing this condition, some of the complications faced and areas of uncertainty in the decision making process. With a paucity of published paediatric cases on this subject, this should add to the pool of information currently available. PMID:22605879
An Inquiry into Action Research: Teaching and Doing Action Research for the First-Time
ERIC Educational Resources Information Center
Palak, Deniz
2013-01-01
I undertook this inquiry into action research while teaching research methods within a graduate degree teacher education program. This inquiry represents my initial encounter with action research and describes the tools, challenges, and uncertainties that I encountered while teaching and doing action research for the first-time. The main purpose…
ERIC Educational Resources Information Center
Campbell, Merle Wayne
2013-01-01
Intelligent decision systems have the potential to support and greatly amplify human decision-making across a number of industries and domains. However, despite the rapid improvement in the underlying capabilities of these "intelligent" systems, increasing their acceptance as decision aids in industry has remained a formidable challenge.…
The Big Picture: Understanding Learning and Meta-Learning Challenges
ERIC Educational Resources Information Center
Carneiro, Roberto
2007-01-01
The future learning agenda is fraught with uncertainty. This opening article attempts to pose broad societal questions that determine the learning agenda-setting as well as the emergence of new knowledge paradigms. It begins with a quick overview of the futures of learning agenda. Then, it moves on to depict the value chain that departs from…
Delving into Key Dimensions of ESD through Analyses of a Middle School Science Textbook
ERIC Educational Resources Information Center
Sahin, Elvan
2016-01-01
Uncertainties and debates regarding the term of sustainable development are still going on, and similarly, the notion of education for sustainable development (ESD) is open to debate. There has been an attempt to make the concept of ESD evident, which is quite challenging. Palmer (1998) stated the appropriateness of ESD within environmental…
Moving Cultures: The Perilous Problems of Cultural Dichotomies in a Globalizing Society.
ERIC Educational Resources Information Center
Hermans, Hubert J. M.; Kempen, Harry J. G.
1998-01-01
Cultural differences in the form of dichotomous distinctions do not meet the challenge of globalization and its implications for a psychology of culture and self. An alternative approach that is sensitive to the process of cultural interchange, the cultural complexity of self and identity, and the experience of uncertainty is described. Contains…
Addressing uncertainty: how to conserve and manage rare or little-known fungi
Randy Molina; Thomas R. Horton; James M. Trappe; Bruce G. Marcot
2011-01-01
One of the greater challenges in conserving fungi comes from our incomplete knowledge of degree of rarity, risk status, and habitat requirements of most fungal species. We discuss approaches to immediately begin closing knowledge gaps, including: (1) harnessing collective expert knowledge so that data from professional experiences (e.g., personal collection and...
America's Young Adults: Special Issue, 2014
ERIC Educational Resources Information Center
Cook, Traci; Kappeler, Evelyn; Ellis, Renee; Kominski, Robert; Cooper, Alexia; Smith, Erica; Donoghue, Brecht; Whitestone, Yuko; Snyder, Tom; Aud, Susan; Williamson, Lisa; Henderson, Steve; Steffen, Barry; Madans, Jennifer; Lukacs, Susan; Pastor, Patricia; Goldstrom, Ingrid; Han, Beth; Bures, Regina; Chamberlain, Seth; Despain, Jason; Chadwick, Laura; Park, Jennifer
2014-01-01
The well-being of young adults in the United States today remains an area of key interest to the public and policy-makers alike. This age group faces the well-known challenges of achieving financial and social independence while forming their own households at a time of greater economic uncertainty than in the past. Better understanding of the…
Adapting to climate change at Olympic National Forest and Olympic National Park
Jessica E. Halofsky; David L. Peterson; Kathy A. O’Halloran; Catherine Hawkins Hoffman
2011-01-01
Climate change presents a major challenge to natural resource managers both because of the magnitude of potential effects of climate change on ecosystem structure, processes, and function, and because of the uncertainty associated with those potential ecological effects. Concrete ways to adapt to climate change are needed to help natural resource managers take the...
ERIC Educational Resources Information Center
Hornos, Eduardo H.; Pleguezuelos, Eduardo M.; Brailovsky, Carlos A.; Harillo, Leandro D.; Dory, Valerie; Charlin, Bernard
2013-01-01
Introduction: Judgment in the face of uncertainty is an important dimension of expertise and clinical competence. However, it is challenging to conceive continuing professional development (CPD) initiatives aimed at helping physicians enhance their clinical judgment skills in ill-defined situations. We present an online script concordance-based…
Climate change and vulnerability of bull trout (Salvelinus confluentus) in a fire-prone landscape
Jeffrey A. Falke; Rebecca L. Flitcroft; Jason B. Dunham; Kristina M. McNyset; Paul F. Hessburg; Gordon H. Reeves; C. Tara Marshall
2015-01-01
Linked atmospheric and wildfire changes will complicate future management of native coldwater fishes in fire-prone landscapes, and new approaches to management that incorporate uncertainty are needed to address this challenge. We used a Bayesian network (BN) approach to evaluate population vulnerability of bull trout (Salvelinus confluentus) in the Wenatchee River...
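The Bayesian network (BN) approach named in this abstract works by propagating probabilities through conditional probability tables and marginalizing over the uncertain inputs. As a minimal sketch of that mechanic only, the node names, structure, and probabilities below are invented for illustration and are not taken from the bull trout study:

```python
# A toy two-parent Bayesian network node, standing in for the kind of
# conditional-probability reasoning a vulnerability BN performs.
# All probabilities here are hypothetical placeholders.
p_warm = 0.6                      # P(stream warming)
p_fire = 0.3                      # P(severe wildfire)
# P(population vulnerable | warming, fire):
cpt = {(True, True): 0.9, (True, False): 0.6,
       (False, True): 0.5, (False, False): 0.1}

# Marginalize over the parent states to get P(vulnerable).
p_vulnerable = sum(
    cpt[(w, f)]
    * (p_warm if w else 1 - p_warm)
    * (p_fire if f else 1 - p_fire)
    for w in (True, False) for f in (True, False)
)
print(f"marginal P(vulnerable) = {p_vulnerable:.2f}")
```

A real vulnerability BN chains many such nodes across habitat, climate, and disturbance variables, but each node's marginalization step is the same.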
Laura Phillips-Mao; Susan M. Galatowitsch; Stephanie A. Snyder; Robert G. Haight
2016-01-01
Incorporating climate change into conservation decision-making at site and population scales is challenging due to uncertainties associated with localized climate change impacts and population responses to multiple interacting impacts and adaptation strategies. We explore the use of spatially explicit population models to facilitate scenario analysis, a conservation...
Maria K. Janowiak; Christopher W. Swanston; Linda M. Nagel; Christopher R. Webster; Brian J. Palik; Mark J. Twery; John B. Bradford; Linda R. Parker; Andrea T. Hille; Sheela M. Johnson
2011-01-01
Land managers across the country face the immense challenge of developing and applying appropriate management strategies as forests respond to climate change. We hosted a workshop to explore silvicultural strategies for addressing the uncertainties surrounding climate change and forest response in the northeastern and north-central United States. Outcomes of this...
ERIC Educational Resources Information Center
Larson, Jan M.; Fay, Martha
2016-01-01
This study is based on an international immersion service-learning/research experience in a remote village in Moldova that provided faculty and students an opportunity to teach journalism and help local students and community representatives create their own online news outlet. Students' existing conceptions were challenged; they experienced…
Professional Identity and Engagement among Newly Qualified Teachers in Times of Uncertainty
ERIC Educational Resources Information Center
Correa Gorospe, José Miguel; Martínez-Arbelaiz, Asunción; Fernández-Olaskoaga, Lorea
2018-01-01
Social, political and economic conditions shape a context of permanent flux where early childhood education teachers have to join the labour market and build their professional identity while facing numerous challenges. The aim of this study is to investigate the effects that a changing world and precarious job conditions can have on newly…
Ethics, Ricoeur And Philosophy: Ethical Teacher Workshops
ERIC Educational Resources Information Center
Scott-Baumann, Alison
2006-01-01
This work is about the ethics of education, and about philosophy as a discipline that can help us to help children look at ethics afresh. The study and practice of ethics is about morals and uncertainties and, as such, poses problems for the research community. The philosopher Ricoeur challenges research as only one way to find meaning in the…
21st Century Skills, Education & Competitiveness: A Resource and Policy Guide
ERIC Educational Resources Information Center
Partnership for 21st Century Skills, 2008
2008-01-01
Americans are deeply concerned about their present and future prospects in a time of economic uncertainty. Policymakers have a make-or-break opening--and an obligation--to chart a new path for public education that will secure the nation's economic competitiveness. This guide summarizes the challenges and opportunities that, if left unaddressed,…
U.S. Offshore Wind Manufacturing and Supply Chain Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamilton, Bruce
2013-02-22
This report seeks to provide an organized, analytical approach to identifying and bounding uncertainties around offshore wind manufacturing and supply chain capabilities; projecting potential component-level supply chain needs under three demand scenarios; and identifying key supply chain challenges and opportunities facing the future U.S. market and current suppliers of the nation’s land-based wind market.
Though aerosol radiative effects have been recognized as some of the largest sources of uncertainty among the forcers of climate change, the verification of the spatial and temporal variability of the magnitude and directionality of aerosol radiative forcing has remained challeng...
Implementing GoodWork Programs: Helping Students to become Ethical Workers
ERIC Educational Resources Information Center
Fischman, Wendy; Gardner, Howard
2009-01-01
Today, young people entering the job market face challenges, as well as uncertainty. The influx of new technologies and powerful market forces have changed the ways in which people work in their own offices, as well as with others around the globe. Alongside the excitement of new technologies and the financial benefits, they also confront new…
Career Guidance and Therapeutic Counselling: Sharing "What Works" in Practice with Young People
ERIC Educational Resources Information Center
Westergaard, Jane
2012-01-01
Many young people in the UK and across the world, where austerity measures are biting deep, find themselves at a time of crisis and uncertainty in their lives. The assumptions previously held of clear and straightforward career paths are being challenged and "career" has come to mean more than simply "work" or…
An integrated science plan for the Lake Tahoe basin: conceptual framework and research strategies
Zachary P. Hymanson; Michael W. Collopy
2010-01-01
An integrated science plan was developed to identify and refine contemporary science information needs for the Lake Tahoe basin ecosystem. The main objectives were to describe a conceptual framework for an integrated science program, and to develop research strategies addressing key uncertainties and information gaps that challenge government agencies in the theme...
Quantitative assessment of human exposures and health effects due to air pollution involve detailed characterization of impacts of air quality on exposure and dose. A key challenge is to integrate these three components on a consistent spatial and temporal basis taking into acco...
Understanding the wicked nature of "unmanaged recreation" in Colorado's Front Range
Jeffrey J. Brooks; Patricia A. Champ
2006-01-01
Unmanaged recreation presents a challenge to both researchers and managers of outdoor recreation in the United States because it is shrouded in uncertainty resulting from disagreement over the definition of the problem, the strategies for resolving the problem, and the outcomes of management. Incomplete knowledge about recreation visitors' values and relationships with...
Read-across remains a popular data gap filling technique within category and analogue approaches for regulatory purposes. Acceptance of read-across is an ongoing challenge with several efforts underway for identifying and addressing uncertainties. Here we demonstrate an algorithm...
A Framework for Representing and Jointly Reasoning over Linguistic and Non-Linguistic Knowledge
ERIC Educational Resources Information Center
Murugesan, Arthi
2009-01-01
Natural language poses several challenges to developing computational systems for modeling it. Natural language is not a precise problem but is riddled with uncertainties in the form of alternative words or interpretations. Furthermore, natural language is a generative system where the problem size is potentially infinite.…
Ekwunife, Obinna I; Grote, Andreas Gerber; Mosch, Christoph; O'Mahony, James F; Lhachimi, Stefan K
2015-05-12
Cervical cancer poses a huge health burden to both developed and developing nations, making prevention and control strategies necessary. However, the challenges of designing and implementing prevention strategies differ for low- and middle-income countries (LMICs) as compared to countries with fully developed health care systems. Moreover, for many LMICs, much of the data needed for decision analytic modelling, such as prevalence, will most likely be only partly available or measured with much larger uncertainty. Lastly, imperfect implementation of human papillomavirus (HPV) vaccination may influence the effectiveness of cervical cancer prevention in unpredictable ways. This systematic review aims to assess how decision analytic modelling studies of HPV cost-effectiveness in LMICs accounted for the particular challenges faced in such countries. Specifically, the study will assess the following: (1) whether the existing literature on cost-effectiveness modelling of HPV vaccines acknowledges the distinct challenges of LMICs, (2) how these challenges were accommodated in the models, (3) whether certain parameters systematically exhibited large degrees of uncertainty due to lack of data and how influential these parameters were on model-based recommendations, and (4) whether the choice of modelling herd immunity influences model-based recommendations, especially when coverage of an HPV vaccination program is not optimal. We will conduct a systematic review to identify suitable studies from MEDLINE (via PubMed), EMBASE, NHS Economic Evaluation Database (NHS EED), EconLit, Web of Science, and CEA Registry. Searches will be conducted for studies of interest published since 2006. The searches will be supplemented by hand searching of the most relevant papers found in the search. Studies will be critically appraised using the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement checklist. We will undertake a descriptive, narrative, and interpretative synthesis of data to address the study objectives. The proposed systematic review will assess how the cost-effectiveness studies of HPV vaccines accounted for the distinct challenges of LMICs. The gaps identified will expose areas for additional research as well as challenges that need to be accounted for in future modelling studies. PROSPERO CRD42015017870.
Valuating Privacy with Option Pricing Theory
NASA Astrophysics Data System (ADS)
Berthold, Stefan; Böhme, Rainer
One of the key challenges in the information society is responsible handling of personal data. An often-cited reason why people fail to make rational decisions regarding their own informational privacy is the high uncertainty about future consequences of information disclosures today. This chapter builds an analogy to financial options and draws on principles of option pricing to account for this uncertainty in the valuation of privacy. For this purpose, the development of a data subject's personal attributes over time and the development of the attribute distribution in the population are modeled as two stochastic processes, which fit into the Binomial Option Pricing Model (BOPM). Possible applications of such valuation methods to guide decision support in future privacy-enhancing technologies (PETs) are sketched.
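The chapter's central device is the Binomial Option Pricing Model. As a hedged illustration of the lattice mechanics it builds on, here is a minimal Cox-Ross-Rubinstein pricer; the numeric parameters, and the reading of the underlying as the economic value of a disclosed attribute, are assumptions for this sketch rather than the chapter's exact formulation:

```python
import math

def crr_binomial_value(s0, strike, r, sigma, t, steps, payoff=None):
    """Value an option on a Cox-Ross-Rubinstein binomial lattice.

    s0 is the current value of the underlying; in the privacy analogy
    this would proxy the value of a disclosed attribute (an assumption
    of this sketch, not the chapter's formulation).
    """
    dt = t / steps
    u = math.exp(sigma * math.sqrt(dt))    # up factor per step
    d = 1.0 / u                            # down factor per step
    q = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    disc = math.exp(-r * dt)               # one-step discount factor
    payoff = payoff or (lambda s: max(s - strike, 0.0))  # call by default

    # Payoffs at the terminal nodes (j = number of up moves).
    values = [payoff(s0 * u**j * d**(steps - j)) for j in range(steps + 1)]
    # Backward induction to time zero.
    for _ in range(steps):
        values = [disc * (q * values[j + 1] + (1 - q) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

print(crr_binomial_value(s0=100, strike=100, r=0.02, sigma=0.3, t=1.0, steps=200))
```

With more steps the lattice value converges toward the continuous-time (Black-Scholes) price, which is why the BOPM is a convenient discrete stand-in for stochastic processes like those the chapter describes.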
Landscape change in the southern Piedmont: challenges, solutions, and uncertainty across scales
Conroy, M.J.; Allen, Craig R.; Peterson, J.T.; Pritchard, L.J.; Moore, C.T.
2003-01-01
The southern Piedmont of the southeastern United States epitomizes the complex and seemingly intractable problems and hard decisions that result from uncontrolled urban and suburban sprawl. Here we consider three recurrent themes in complicated problems involving complex systems: (1) scale dependencies and cross-scale, often nonlinear relationships; (2) resilience, in particular the potential for complex systems to move to alternate stable states with decreased ecological and/or economic value; and (3) uncertainty in the ability to understand and predict outcomes, perhaps particularly those that occur as a result of human impacts. We consider these issues in the context of landscape-level decision making, using as an example water resources and lotic systems in the Piedmont region of the southeastern United States.
Parameter Uncertainty Analysis Using Monte Carlo Simulations for a Regional-Scale Groundwater Model
NASA Astrophysics Data System (ADS)
Zhang, Y.; Pohlmann, K.
2016-12-01
Regional-scale grid-based groundwater models for flow and transport often contain multiple types of parameters that can intensify the challenge of parameter uncertainty analysis. We propose a Monte Carlo approach to systematically quantify the influence of various types of model parameters on groundwater flux and contaminant travel times. The Monte Carlo simulations were conducted based on the steady-state conversion of the original transient model, which was then combined with the PEST sensitivity analysis tool SENSAN and particle tracking software MODPATH. Results identified hydrogeologic units whose hydraulic conductivity can significantly affect groundwater flux, and thirteen out of 173 model parameters that can cause large variation in travel times for contaminant particles originating from given source zones.
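The abstract's full workflow (steady-state model conversion, SENSAN, MODPATH) is too heavy to reproduce here, but the Monte Carlo mechanic itself is simple. Below is a toy one-path surrogate with made-up input distributions standing in for the model's uncertain parameters; it is a sketch of the sampling-and-propagation idea, not the regional model:

```python
import numpy as np

rng = np.random.default_rng(42)
n_runs = 1000

# Hypothetical stand-ins for uncertain inputs: log-normal hydraulic
# conductivity K (m/d) and uniform effective porosity.
log_k = rng.normal(loc=np.log(1.0), scale=0.8, size=n_runs)
porosity = rng.uniform(0.05, 0.25, size=n_runs)
gradient = 1e-3                    # fixed hydraulic gradient (assumed)

k = np.exp(log_k)
velocity = k * gradient / porosity           # seepage velocity (m/d)
travel_time = 5000.0 / velocity / 365.25     # years to traverse a 5 km path

print(f"median travel time: {np.median(travel_time):.0f} yr")
print(f"5th-95th percentile: {np.percentile(travel_time, [5, 95])}")
```

Ranking parameters by how much they widen the travel-time distribution is, in miniature, how a study like this one isolates the handful of influential parameters from the rest.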
Model Predictions and Observed Performance of JWST's Cryogenic Position Metrology System
NASA Technical Reports Server (NTRS)
Lunt, Sharon R.; Rhodes, David; DiAntonio, Andrew; Boland, John; Wells, Conrad; Gigliotti, Trevis; Johanning, Gary
2016-01-01
The James Webb Space Telescope cryogenic testing requires measurement systems that both achieve a very high degree of accuracy and can function in that environment. Close-range photogrammetry was identified as meeting those criteria. Testing the capability of a close-range photogrammetric system prior to its existence is a challenging problem. Computer simulation was chosen over building a scaled mock-up to allow for increased flexibility in testing various configurations. Extensive validation work was done to ensure that the actual as-built system met accuracy and repeatability requirements. The simulated image data predicted the uncertainty in measurement to be within specification, and this prediction was borne out experimentally. Uncertainty at all levels was verified experimentally to be less than 0.1 millimeters.
Structural Health Monitoring for Impact Damage in Composite Structures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roach, Dennis P.; Raymond Bond; Doug Adams
Composite structures are increasing in prevalence throughout the aerospace, wind, defense, and transportation industries, but the many advantages of these materials come with unique challenges, particularly in inspecting and repairing these structures. Because composites often undergo sub-surface damage mechanisms which compromise the structure without a clear visual indication, inspection of these components is critical to safely deploying composite replacements to traditionally metallic structures. Impact damage to composites presents one of the most significant challenges because the area which is vulnerable to impact damage is generally large and sometimes very difficult to access. This work seeks to further evolve identification technology by developing a system which can detect the impact load location and magnitude in real time, while giving an assessment of the confidence in that estimate. Furthermore, we identify ways by which impact damage could be more effectively identified by leveraging impact load identification information to better characterize damage. The impact load identification algorithm was applied to a commercial-scale wind turbine blade, and results show the capability to detect impact magnitude and location using a single accelerometer, regardless of sensor location. A technique for better evaluating the uncertainty of the impact estimates was developed by quantifying how well the impact force estimate meets the assumptions underlying the force estimation technique. This uncertainty quantification technique was found to reduce the 95% confidence interval by more than a factor of two for the impact force estimates showing the least uncertainty, and to widen the 95% confidence interval by a factor of two for the most uncertain force estimates, avoiding the possibility of understating the uncertainty associated with these estimates. Linear vibration-based damage detection techniques were investigated in the context of structural stiffness reductions and impact damage. A method by which the sensitivity to damage could be increased for simple structures was presented, and the challenges of applying that technique to a more complex structure were identified. The structural dynamic changes in a weak adhesive bond were investigated, and the results showed promise for identifying weak bonds that show little or no static reduction in stiffness. To address these challenges in identifying highly localized impact damage, the possibility of detecting damage through nonlinear dynamic characteristics was also identified, with a proposed technique which would leverage impact location estimates to enable the detection of impact damage. This nonlinear damage identification concept was evaluated on a composite panel with a substructure disbond, and the results showed that the nonlinear dynamics at the damage site could be observed without a baseline healthy reference. By further developing impact load identification technology and combining load and damage estimation techniques into an integrated solution, the challenges associated with impact detection in composite structures can be effectively solved, thereby reducing costs, improving safety, and enhancing the operational readiness and availability of high-value assets.
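The abstract describes quantifying how well a force estimate meets its underlying assumptions, but does not spell out the algorithm. A generic stand-in for that idea is frequency-domain force deconvolution with a normalized re-prediction residual as the assumption check; everything below (the known FRF, the regularization weight, the synthetic signal) is a hypothetical sketch, not the report's method:

```python
import numpy as np

def estimate_impact_force(accel, frf, reg=1e-3):
    """Frequency-domain deconvolution of an impact force from a single
    acceleration record, with a residual-based check on the estimate.

    accel: measured acceleration time history
    frf:   force-to-acceleration frequency response at the sensor,
           one value per rfft bin (assumed known from a model or test)
    reg:   Tikhonov regularization weight (a hypothetical tuning knob)
    """
    a_f = np.fft.rfft(accel)
    # Regularized inverse filter: F = conj(H) * A / (|H|^2 + reg)
    f_f = np.conj(frf) * a_f / (np.abs(frf) ** 2 + reg)
    force = np.fft.irfft(f_f, n=len(accel))

    # Re-predict the response implied by the force estimate; a large
    # normalized residual signals that the linear, known-FRF assumptions
    # behind the estimate are being violated.
    a_pred = np.fft.irfft(frf * f_f, n=len(accel))
    residual = np.linalg.norm(accel - a_pred) / np.linalg.norm(accel)
    return force, residual

# Demo with a synthetic ring-down and a trivial (identity) FRF.
t = np.arange(0, 1.0, 1e-3)
accel = np.sin(2 * np.pi * 40 * t) * np.exp(-5 * t)
frf = np.ones(len(np.fft.rfft(accel)), dtype=complex)
force, score = estimate_impact_force(accel, frf)
print(f"normalized residual: {score:.4f}")
```

The residual plays the role the abstract assigns to its assumption check: estimates with small residuals earn tight confidence intervals, while estimates that violate the assumptions get their intervals widened rather than understated.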
NASA Astrophysics Data System (ADS)
Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco
2016-04-01
The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling: what elements of the fault source model impact most upon the hazard at a site, and when does this matter? Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including hanging wall and directivity effects) within modern ground motion prediction equations, can have an influence on the seismic hazard at a site. Yet we also illustrate the conditions under which these effects may be partially tempered when considering the full uncertainty in rupture behaviour within the fault system. The third challenge is the development of efficient means for representing both aleatory and epistemic uncertainties from active fault models in PSHA. In implementing state-of-the-art seismic hazard models into OpenQuake, such as those recently undertaken in California and Japan, new modeling techniques are needed that redefine how we treat interdependence of ruptures within the model (such as mutual exclusivity), and the propagation of uncertainties emerging from geology. Finally, we illustrate how OpenQuake, and GEM's additional toolkits for model preparation, can be applied to address long-standing issues in active fault modeling in PSHA. These include constraining the seismogenic coupling of a fault and the partitioning of seismic moment between the active fault surfaces and the surrounding seismogenic crust. We illustrate some of the possible roles that geodesy can play in the process, but highlight where this may introduce new uncertainties and potential biases into the seismic hazard process, and how these can be addressed.
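None of OpenQuake's actual API appears in the abstract, so the sketch below deliberately avoids it. It is a generic single-source hazard-curve integration (truncated Gutenberg-Richter recurrence plus an invented ground motion prediction equation with lognormal variability) that shows the kind of calculation a PSHA engine performs; every coefficient is a placeholder:

```python
import numpy as np
from scipy.stats import norm

def hazard_curve(pga_levels, dist_km, a=4.0, b=1.0, m_min=5.0, m_max=7.5,
                 gmpe_sigma=0.6, n_m=100):
    """Annual exceedance rates for one fault source.

    Uses a truncated Gutenberg-Richter recurrence and an illustrative
    GMPE, ln(PGA) = -1.0 + 1.2*M - 1.3*ln(R + 10); both the GMPE
    coefficients and the source parameters are hypothetical.
    """
    mags = np.linspace(m_min, m_max, n_m)
    beta = b * np.log(10.0)
    # Truncated-exponential magnitude density, normalized on [m_min, m_max].
    pdf = beta * np.exp(-beta * (mags - m_min)) / (1 - np.exp(-beta * (m_max - m_min)))
    rate_mmin = 10 ** (a - b * m_min)          # annual rate of M >= m_min
    dm = mags[1] - mags[0]
    rates = rate_mmin * pdf * dm               # annual rate per magnitude bin

    ln_median = -1.0 + 1.2 * mags - 1.3 * np.log(dist_km + 10.0)
    # Sum over magnitudes: rate(M) * P(PGA > level | M), lognormal scatter.
    return np.array([
        np.sum(rates * (1 - norm.cdf((np.log(level) - ln_median) / gmpe_sigma)))
        for level in pga_levels
    ])

levels = np.logspace(-2, 0, 20)    # PGA in g
print(hazard_curve(levels, dist_km=15.0))
```

The sensitivity studies the abstract describes amount to re-running such an integration while perturbing the fault geometry, recurrence parameters, and near-fault terms, then comparing the resulting curves.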
The 2014 Sandia Verification and Validation Challenge: Problem statement
Hu, Kenneth; Orient, George
2016-01-18
This paper presents a case study in utilizing information from experiments, models, and verification and validation (V&V) to support a decision. It consists of a simple system with data and models provided, plus a safety requirement to assess. The goal is to pose a problem that is flexible enough to allow challengers to demonstrate a variety of approaches, but constrained enough to focus attention on a theme. This was accomplished by providing a good deal of background information in addition to the data, models, and code, but directing the participants' activities with specific deliverables. In this challenge, the theme is how to gather and present evidence about the quality of model predictions, in order to support a decision. This case study formed the basis of the 2014 Sandia V&V Challenge Workshop and this resulting special edition of the ASME Journal of Verification, Validation, and Uncertainty Quantification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, Martyn P.; Bierkens, Marc F. P.; Samaniego, Luis
The diversity in hydrologic models has historically led to great controversy on the correct approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. Here, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We also illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.
The computational challenges of Earth-system science.
O'Neill, Alan; Steenman-Clark, Lois
2002-06-15
The Earth system--comprising atmosphere, ocean, land, cryosphere and biosphere--is an immensely complex system, involving processes and interactions on a wide range of space- and time-scales. To understand and predict the evolution of the Earth system is one of the greatest challenges of modern science, with success likely to bring enormous societal benefits. High-performance computing, along with the wealth of new observational data, is revolutionizing our ability to simulate the Earth system with computer models that link the different components of the system together. There are, however, considerable scientific and technical challenges to be overcome. This paper will consider four of them: complexity, spatial resolution, inherent uncertainty and time-scales. Meeting these challenges requires a significant increase in the power of high-performance computers. The benefits of being able to make reliable predictions about the evolution of the Earth system should, on their own, amply repay this investment.